Building a micro-SaaS has to be fast. Package the repetitive foundational pieces once so you can reuse them every time: connecting to large language models, vector databases, calling third-party APIs, guaranteeing JSON responses, WordPress integration, and so on.
Introducing the code package: https://github.com/hassancs91/SimplerLLM
Creating an LLM Instance
```python
from SimplerLLM.language.llm import LLM, LLMProvider

# For OpenAI
llm_instance = LLM.create(provider=LLMProvider.OPENAI, model_name="gpt-3.5-turbo")

# For Google Gemini
#llm_instance = LLM.create(provider=LLMProvider.GEMINI, model_name="gemini-pro")

# For Anthropic Claude
#llm_instance = LLM.create(LLMProvider.ANTHROPIC, model_name="claude-3-opus-20240229")

response = llm_instance.generate_response(prompt="generate a 5 words sentence")
```
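The intro above also mentions guaranteeing JSON responses. Below is a minimal sketch of one way to do that with the `generate_response` call shown here: prompt for strict JSON and validate it with the standard `json` module, retrying on failure. The `generate_json` helper is my own illustration, not a documented SimplerLLM API; the library may ship its own structured-output helpers.

```python
import json

# Hedged sketch: force a JSON reply via the prompt, then validate it with
# json.loads, retrying a few times if the model returns invalid JSON.
# Relies only on the llm_instance.generate_response call shown above.
def generate_json(llm_instance, instruction, max_retries=3):
    prompt = (
        f"{instruction}\n"
        "Respond with valid JSON only, no markdown fences and no extra text."
    )
    for _ in range(max_retries):
        raw = llm_instance.generate_response(prompt=prompt)
        try:
            return json.loads(raw)
        except json.JSONDecodeError:
            continue  # the reply was not valid JSON, ask again
    raise ValueError("Model did not return valid JSON")

data = generate_json(llm_instance, "Give me 3 blog title ideas about SEO as a JSON object with a 'titles' list.")
print(data)
```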
Using Tools
SERP
```python
from SimplerLLM.tools.serp import search_with_serper_api

search_results = search_with_serper_api("your search query", num_results=3)

# use the search results the way you want!
```
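One way to use the results is to feed them back into the LLM instance from the first example. A small self-contained sketch, assuming your API keys are configured; `str()` is used on each result because its exact fields are not shown in this post:

```python
# Hedged sketch: summarize SERP results with the LLM instance shown earlier.
from SimplerLLM.language.llm import LLM, LLMProvider
from SimplerLLM.tools.serp import search_with_serper_api

llm_instance = LLM.create(provider=LLMProvider.OPENAI, model_name="gpt-3.5-turbo")
search_results = search_with_serper_api("latest SEO trends", num_results=3)

# Turn the results into plain text, since their exact structure is library-defined.
results_text = "\n".join(str(result) for result in search_results)

prompt = "Summarize these search results in 3 bullet points:\n" + results_text
response = llm_instance.generate_response(prompt=prompt)
print(response)
```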
Generic Text Loader
```python
from SimplerLLM.tools.generic_loader import load_content

text_file = load_content("file.txt")

print(text_file.content)
```
Calling any RapidAPI API
```python
from SimplerLLM.tools.rapid_api import RapidAPIClient

api_url = "https://domain-authority1.p.rapidapi.com/seo/get-domain-info"
api_params = {
    'domain': 'learnwithhasan.com',
}

api_client = RapidAPIClient()  # API key read from environment variable
response = api_client.call_api(api_url, method='GET', params=api_params)
```
Prompt Template Builder
```python
from SimplerLLM.prompts.prompt_builder import create_multi_value_prompts, create_prompt_template

basic_prompt = "Generate 5 titles for a blog about {topic} and {style}"

prompt_template = create_prompt_template(basic_prompt)
prompt_template.assign_parms(topic="marketing", style="catchy")

print(prompt_template.content)

## working with multiple value prompts
multi_value_prompt_template = """Hello {name}, your next meeting is on {date},
and bring a {object} with you."""

params_list = [
    {"name": "Alice", "date": "January 10th", "object": "dog"},
    {"name": "Bob", "date": "January 12th", "object": "bag"},
    {"name": "Charlie", "date": "January 15th", "object": "pen"}
]

multi_value_prompt = create_multi_value_prompts(multi_value_prompt_template)
generated_prompts = multi_value_prompt.generate_prompts(params_list)

print(generated_prompts[0])
```
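Once a template is filled in, its `content` attribute (printed above) is just a string, so it can be passed straight to an LLM. A brief usage sketch, assuming the `llm_instance` from the first example is available:

```python
# Hedged usage sketch: feed the rendered prompt template into the LLM
# instance created in the first example.
titles = llm_instance.generate_response(prompt=prompt_template.content)
print(titles)
```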
Chunking Functions
We have introduced new functions to help you split texts into manageable chunks based on different criteria. These functions are part of the chunker tool.
chunk_by_max_chunk_size
This function splits text into chunks with a maximum size, optionally preserving sentence structure.
chunk_by_sentences
This function splits the text into chunks based on sentences.
chunk_by_paragraphs
This function splits text into chunks based on paragraphs.
chunk_by_semantics
This function splits text into chunks based on semantics. A sketch of calling the sentence- and paragraph-based variants follows the example below.
Example
```python
from SimplerLLM.tools import text_chunker as chunker
from SimplerLLM.tools import generic_loader as loader

blog_url = "https://www.semrush.com/blog/digital-marketing/"
blog_post = loader.load_content(blog_url)
text = blog_post.content

chunks = chunker.chunk_by_max_chunk_size(text, 100, True)
```
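The other chunkers described above can be called on the same loaded text. A minimal sketch follows; the single-argument signatures are an assumption on my part, and `chunk_by_semantics` likely needs an embeddings setup, so check the repository for the exact calls:

```python
# Hedged sketch: the sentence- and paragraph-based chunkers on the text
# loaded above. Signatures assumed; chunk_by_semantics is omitted because
# it presumably requires an embeddings provider.
sentence_chunks = chunker.chunk_by_sentences(text)
paragraph_chunks = chunker.chunk_by_paragraphs(text)

print(sentence_chunks)
print(paragraph_chunks)
```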
Next Updates
- Adding More Tools
- Interacting With Local LLMs
- Prompt Optimization
- Response Evaluation
- GPT Trainer
- Document Chunker
- Advanced Document Loader
- Integration With More Providers
- Simple RAG With SimplerVectors
- Integration with Vector Databases
- Agent Builder
- LLM Server