LLMs Gallery
- AnthropicLLM: Anthropic LLM implementation running the async API client.
- OpenAILLM: OpenAI LLM implementation running the async API client.
- AnyscaleLLM: Anyscale LLM implementation running the async API client of OpenAI.
- AzureOpenAILLM: Azure OpenAI LLM implementation running the async API client.
- TogetherLLM: Together AI LLM implementation running the async API client of OpenAI.
- CohereLLM: Cohere API implementation using the async client for concurrent text generation.
- GroqLLM: Groq API implementation using the async client for concurrent text generation.
- InferenceEndpointsLLM: Hugging Face Inference Endpoints LLM implementation running the async API client.
- LiteLLM: LiteLLM implementation running the async API client.
- MistralLLM: Mistral LLM implementation running the async API client.
- MixtureOfAgentsLLM: Mixture-of-Agents implementation.
- OllamaLLM: Ollama LLM implementation running the async API client.
- VertexAILLM: VertexAI LLM implementation running the async API client for Gemini.
- TransformersLLM: Hugging Face transformers library LLM implementation using the text generation pipeline.
- LlamaCppLLM: llama.cpp LLM implementation running the Python bindings for the C++ code.
- vLLM: vLLM library LLM implementation.
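
The classes listed above expose a common LLM interface, so one implementation can be swapped for another without changing the surrounding code. Below is a minimal usage sketch assuming the distilabel-style interface (a `distilabel.llms` module, `load()` to initialize the underlying client, and `generate()` taking conversations in the OpenAI chat format); the module path and method names may differ in your installed version, and the model name and prompt are placeholder values.

```python
# Minimal sketch; assumes a distilabel-style `llms` module and the
# `load()` / `generate()` interface. Adjust imports to your version.
from distilabel.llms import OpenAILLM

# Any class from the gallery above can be used in place of OpenAILLM
# (e.g. AnthropicLLM, MistralLLM, TransformersLLM), each with its own
# constructor parameters.
llm = OpenAILLM(model="gpt-4")  # placeholder model; API key read from the environment
llm.load()  # initializes the underlying async API client

# `generate` expects a list of conversations in the OpenAI chat format
# and returns the generations produced for each conversation.
outputs = llm.generate(
    inputs=[[{"role": "user", "content": "Write a haiku about synthetic data."}]],
)
print(outputs)
```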