LLMs Gallery

  • OpenAILLM

    OpenAI LLM implementation running the async API client (a usage sketch of the shared LLM interface follows this list).

  • ClientvLLM

    A client for the vLLM server implementing the OpenAI API specification.

  • AnyscaleLLM

    Anyscale LLM implementation running the OpenAI async API client.

  • AzureOpenAILLM

    Azure OpenAI LLM implementation running the async API client.

  • TogetherLLM

    Together AI LLM implementation running the OpenAI async API client.

  • AnthropicLLM

    Anthropic LLM implementation running the async API client.

  • CohereLLM

    Cohere API implementation using the async client for concurrent text generation.

  • GroqLLM

    Groq API implementation using the async client for concurrent text generation.

  • 🤗 InferenceEndpointsLLM

    InferenceEndpoints LLM implementation running the async API client.

  • LiteLLM

    LiteLLM implementation running the async API client.

  • MistralLLM

    Mistral LLM implementation running the async API client.

  • MixtureOfAgentsLLM

    Mixture-of-Agents implementation that combines the outputs of several LLMs to generate a response.

  • OllamaLLM

    Ollama LLM implementation running the async API client.

  • VertexAILLM

    VertexAI LLM implementation running the async API clients for Gemini.

  • vLLM

    vLLM library LLM implementation.

  • 🤗 TransformersLLM

    Hugging Face transformers library LLM implementation using the text generation pipeline.

  • LlamaCppLLM

    llama.cpp LLM implementation running the Python bindings for the C++ code.
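
All of the classes above expose the same LLM interface, so once instantiated they can be used interchangeably. Below is a minimal sketch of that shared pattern using OpenAILLM as the example; it assumes a recent distilabel release where LLMs are imported from distilabel.llms and provide load() and generate() methods, and the model name shown is only illustrative, not taken from this page.

```python
# Minimal sketch of the shared LLM interface (assumed API surface):
# instantiate an LLM, load it, then generate from chat-formatted conversations.
from distilabel.llms import OpenAILLM  # import path may differ in newer releases

llm = OpenAILLM(model="gpt-4o-mini")  # illustrative model id; API key read from OPENAI_API_KEY
llm.load()  # creates the underlying async client

outputs = llm.generate(
    inputs=[
        [{"role": "user", "content": "Write a haiku about synthetic data."}],
    ],
    num_generations=1,
)
print(outputs)  # one entry per input conversation
```

Swapping OpenAILLM for another class from the gallery (for example AnthropicLLM or OllamaLLM) should only require changing the import and the constructor arguments, since they all implement the same interface.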