allms: One Rust Library to rule them aLLMs
This Rust library provides type-safe interactions with the APIs of the following LLM providers: OpenAI, Anthropic, Mistral, and Google Gemini. (More providers will be added in the future.) It is designed to simplify experimenting with different models and to de-risk migrating between providers, reducing vendor lock-in. It also standardizes serialization of requests to LLM APIs and deserialization of their responses, ensuring that JSON data is handled in a type-safe manner. With allms you can focus on creating effective prompts and providing the LLM with the right context, instead of worrying about differences between API implementations.
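To try the library in your own project, add it to your manifest. This is a minimal sketch; the version numbers below are illustrative, so check crates.io for the current releases. An async runtime such as Tokio is assumed, since the API calls are awaited.

```toml
[dependencies]
# Versions shown are illustrative; check crates.io for the latest releases.
allms = "0.1"
tokio = { version = "1", features = ["macros", "rt-multi-thread"] }
```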
Supported LLM providers:
- OpenAI (incl. Azure OpenAI via the `Custom` variant)
- Anthropic
- Mistral
- Google Vertex AI / AI Studio
To use Azure OpenAI, set `OPENAI_API_URL` to your Azure OpenAI resource endpoint. Model deployment names in Azure OpenAI Studio need to match `OpenAIModels::as_str()`.
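For example, pointing the library at an Azure resource could look like the following. The resource name here is a placeholder, not a real endpoint:

```shell
# Placeholder resource name; substitute your own Azure OpenAI endpoint.
export OPENAI_API_URL="https://my-resource.openai.azure.com"
```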
Explore the examples directory to see more use cases and learn how to use different LLM providers and endpoint types.
Using the Completions API with different foundational models:
```rust
let openai_answer = Completions::new(OpenAIModels::Gpt4o, &API_KEY, None, None)
    .get_answer::<T>(instructions)
    .await?;

let anthropic_answer = Completions::new(AnthropicModels::Claude2, &API_KEY, None, None)
    .get_answer::<T>(instructions)
    .await?;

let mistral_answer = Completions::new(MistralModels::MistralSmall, &API_KEY, None, None)
    .get_answer::<T>(instructions)
    .await?;

let google_answer = Completions::new(GoogleModels::GeminiPro, &API_KEY, None, None)
    .get_answer::<T>(instructions)
    .await?;
```
Example:

```shell
RUST_LOG=info RUST_BACKTRACE=1 cargo run --example use_completions
```
Using the Assistant API to analyze your files with File and VectorStore capabilities:
```rust
// Create a File
let openai_file = OpenAIFile::new(None, &API_KEY)
    .upload(&file_name, bytes)
    .await?;

// Create a Vector Store
let openai_vector_store = OpenAIVectorStore::new(None, "Name", &API_KEY)
    .upload(&[openai_file.id.clone().unwrap_or_default()])
    .await?;

// Extract data using Assistant
let openai_answer = OpenAIAssistant::new(OpenAIModels::Gpt4o, &API_KEY)
    .version(OpenAIAssistantVersion::V2)
    .vector_store(openai_vector_store.clone())
    .await?
    .get_answer::<T>(instructions, &[])
    .await?;
```
Example:

```shell
RUST_LOG=info RUST_BACKTRACE=1 cargo run --example use_openai_assistant
```
This project is licensed under a dual MIT/Apache-2.0 license. See the LICENSE-MIT and LICENSE-APACHE files for details.