
llama

Llama2TextGenerationTask

Bases: TextGenerationTask

A TextGenerationTask for the Llama2 model.

Parameters:

    system_prompt (str, optional): the system prompt to be used. Defaults to None.

    principles (Dict[str, List[str]], optional): the principles to be used for the system prompt. Defaults to None.

    principles_distribution (Union[Dict[str, float], Literal["balanced"], None], optional): the distribution of principles to be used for the system prompt. Defaults to None.
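
For context, a minimal sketch of how these parameters might be passed when instantiating the task. The principle group names and texts below are illustrative assumptions, not values shipped with distilabel:

    from distilabel.tasks.text_generation import Llama2TextGenerationTask

    # Hypothetical principles dict: group name -> list of principle strings.
    task = Llama2TextGenerationTask(
        system_prompt="You are a helpful assistant.",
        principles={
            "harmlessness": ["Avoid unsafe or harmful suggestions."],
            "helpfulness": ["Give accurate, actionable answers."],
        },
        # "balanced" presumably samples principle groups with equal weight
        # (inferred from the type hint; exact sampling behaviour is an assumption).
        principles_distribution="balanced",
    )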
Source code in src/distilabel/tasks/text_generation/llama.py
class Llama2TextGenerationTask(TextGenerationTask):
    """A `TextGenerationTask` for the Llama2 model.

    Args:
        system_prompt (str, optional): the system prompt to be used. Defaults to `None`.
        principles (Dict[str, List[str]], optional): the principles to be used for the system prompt.
            Defaults to `None`.
        principles_distribution (Union[Dict[str, float], Literal["balanced"], None], optional): the
            distribution of principles to be used for the system prompt. Defaults to `None`.
    """

    def generate_prompt(self, input: str) -> str:
        """Generates a prompt for the Llama2 model.

        Args:
            input (str): the input to be used for the prompt.

        Returns:
            str: the generated prompt.

        Examples:
            >>> from distilabel.tasks.text_generation import Llama2TextGenerationTask
            >>> task = Llama2TextGenerationTask(system_prompt="You are a helpful assistant.")
            >>> task.generate_prompt("What are the first 5 Fibonacci numbers?")
            '<s>[INST] <<SYS>>\nYou are a helpful assistant.<</SYS>>\n\nWhat are the first 5 Fibonacci numbers? [/INST]'
        """
        return Prompt(
            system_prompt=self.system_prompt,
            formatted_prompt=input,
        ).format_as("llama2")  # type: ignore

generate_prompt(input)

Generates a prompt for the Llama2 model.

    Args:
        input (str): the input to be used for the prompt.

    Returns:
        str: the generated prompt.

    Examples:
        >>> from distilabel.tasks.text_generation import Llama2TextGenerationTask
        >>> task = Llama2TextGenerationTask(system_prompt="You are a helpful assistant.")
        >>> task.generate_prompt("What are the first 5 Fibonacci numbers?")
        '<s>[INST] <<SYS>>\nYou are a helpful assistant.<</SYS>>\n\nWhat are the first 5 Fibonacci numbers? [/INST]'

Source code in src/distilabel/tasks/text_generation/llama.py
def generate_prompt(self, input: str) -> str:
    """Generates a prompt for the Llama2 model.

    Args:
        input (str): the input to be used for the prompt.

    Returns:
        str: the generated prompt.

    Examples:
        >>> from distilabel.tasks.text_generation import Llama2TextGenerationTask
        >>> task = Llama2TextGenerationTask(system_prompt="You are a helpful assistant.")
        >>> task.generate_prompt("What are the first 5 Fibonacci numbers?")
        '<s>[INST] <<SYS>>\nYou are a helpful assistant.<</SYS>>\n\nWhat are the first 5 Fibonacci numbers? [/INST]'
    """
    return Prompt(
        system_prompt=self.system_prompt,
        formatted_prompt=input,
    ).format_as("llama2")  # type: ignore
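
As the source above shows, the method simply delegates to Prompt.format_as("llama2"). A small sketch of that underlying call, assuming Prompt is importable from distilabel.tasks.prompt:

    from distilabel.tasks.prompt import Prompt  # import path is an assumption

    prompt = Prompt(
        system_prompt="You are a helpful assistant.",
        formatted_prompt="What are the first 5 Fibonacci numbers?",
    )
    # Produces the same Llama 2 chat-formatted string shown in the docstring example.
    print(prompt.format_as("llama2"))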