LLMs Gallery

  • AnthropicLLM


    Anthropic LLM implementation running the async API client.

  • OpenAILLM


    OpenAI LLM implementation running the async API client (see the usage sketch after this list).

  • AnyscaleLLM


    Anyscale LLM implementation running the async API client of OpenAI.

  • AzureOpenAILLM


    Azure OpenAI LLM implementation running the async API client.

  • TogetherLLM


    Together AI LLM implementation running the async API client of OpenAI.

  • ClientvLLM


    A client for the vLLM server implementing the OpenAI API specification.

  • CohereLLM


    Cohere API implementation using the async client for concurrent text generation.

  • GroqLLM


    Groq API implementation using the async client for concurrent text generation.

  • 🤗 InferenceEndpointsLLM


    Hugging Face Inference Endpoints LLM implementation running the async API client.

  • LiteLLM


    LiteLLM implementation running the async API client.

  • MistralLLM


    Mistral LLM implementation running the async API client.

  • MixtureOfAgentsLLM


    Mixture-of-Agents implementation that combines the outputs of several proposer LLMs through an aggregator LLM.

  • OllamaLLM


    Ollama LLM implementation running the async API client.

  • VertexAILLM


    Vertex AI LLM implementation running the async API client for Gemini.

  • 🤗 TransformersLLM


    Hugging Face transformers library LLM implementation using the text generation pipeline (see the local-model sketch after this list).

  • LlamaCppLLM


    llama.cpp LLM implementation running the Python bindings for the C++ code.

  • vLLM


    vLLM library LLM implementation.

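The hosted back ends listed above (OpenAILLM, AnthropicLLM, MistralLLM, GroqLLM, and so on) are thin wrappers over each provider's async client and are used in the same way. The sketch below shows the general pattern, assuming a distilabel-style interface in which the LLM is constructed with a model name, load() builds the underlying client, and generate() takes chat-formatted conversations; the import path, parameter names, and output shape are assumptions, not taken from this page.

```python
# Minimal usage sketch for a hosted async LLM client.
# The import path, constructor arguments, and generate() signature below are
# assumptions based on a typical distilabel-style interface; check the API
# reference of each class for the exact details.
from distilabel.llms import OpenAILLM  # assumed import path

# Assumes the provider API key is available in the environment (e.g. OPENAI_API_KEY).
llm = OpenAILLM(model="gpt-4o-mini")  # model name is illustrative only
llm.load()  # sets up the underlying async API client

# generate() is assumed to take a batch of chat-formatted conversations and
# return one list of generations per conversation.
outputs = llm.generate(
    inputs=[
        [{"role": "user", "content": "Write a haiku about asynchronous APIs."}],
    ],
    num_generations=1,
)
print(outputs)
```

Swapping OpenAILLM for any of the other hosted classes should only change the constructor arguments (model name, credentials, endpoint), not the call pattern.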
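The local back ends (TransformersLLM, LlamaCppLLM, vLLM) load model weights in-process instead of calling a hosted API, but appear to expose the same load()/generate() interface. A minimal sketch for TransformersLLM follows; the import path, model name, and call signatures are again assumptions.

```python
# Minimal local-model sketch -- import path, model name, and call signatures
# are assumptions, not taken from this page.
from distilabel.llms import TransformersLLM  # assumed import path

# Any chat-capable model from the Hugging Face Hub; this name is illustrative.
llm = TransformersLLM(model="HuggingFaceTB/SmolLM2-1.7B-Instruct")
llm.load()  # downloads the weights and builds the text generation pipeline

outputs = llm.generate(
    inputs=[[{"role": "user", "content": "Summarize what a text generation pipeline does."}]],
)
print(outputs)
```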