# Issue dashboard

## Most engaging open issues

| Rank | Issue | Reactions | Comments |
|------|-------|-----------|----------|
| 1 | 1041 - [FEATURE] Add Offline batch generation for open models with EXXA API | 👍 2 | 💬 1 |
| 2 | 737 - [FEATURE] Allow FormatTextGenerationSFT to include tools/function calls in the formatted messages. | 👍 2 | 💬 0 |
| 3 | 1030 - [FEATURE] Trim inputs | 👍 1 | 💬 2 |
| 4 | 797 - [FEATURE] synthetic data generation for predictive NLP tasks | 👍 1 | 💬 1 |
| 5 | 914 - [FEATURE] Use Step.resources to set tensor_parallel_size and pipeline_parallel_size in vLLM | 👍 1 | 💬 0 |
| 6 | 588 - [FEATURE] Single request caching | 👍 1 | 💬 0 |
| 7 | 953 - [EXAMPLE] Add CRAFT Your Dataset: Task-Specific Synthetic Dataset Generation Through Corpus Retrieval and Augmentation example | 👍 0 | 💬 6 |
| 8 | 972 - [BUG] Input data size != output data size when task batch size < batch size of predecessor | 👍 0 | 💬 5 |
| 9 | 859 - [FEATURE] Update PushToHub to stream data to the Hub | 👍 0 | 💬 5 |
| 10 | 722 - [FEATURE] move model_name to distilabel_metadata dictionary with step name suffix | 👍 0 | 💬 4 |

## Latest issues opened by the community

| Rank | Issue | Author |
|------|-------|--------|
| 1 | 🟢 1150 - AttributeError: 'OllamaLLM' object has no attribute 'pydantic_private' | by 0xD4rky |
| 2 | 🟣 1140 - [BUG] Failed to load step 'text_generation_0': Step load failed: 'InferenceClient' object has no attribute '_resolve_url' | by Galaxy-Husky |
| 3 | 🟣 1139 - [FEATURE] add seed parameter to OpenAILLM | by weiminw |
| 4 | 🟢 1137 - [BUG] Parameters of GeneratorStep are ignored during dry_run | by sung1-kang |
| 5 | 🟢 1136 - Token classification example | by drewskidang |
| 6 | 🟢 1135 - [BUG] Structured Output with InferenceEndpointsLLM and TGI | by joaomsimoes |
| 7 | 🟣 1133 - [BUG] maybe this error has to do with outlines? | by makrse |
| 8 | 🟢 1132 - [BUG] TextGeneration always process with fixed interval, not match the throughput of LLM | by observerw |
| 9 | 🟣 1131 - [BUG] Bad request: Not allowed to GET status/meta-llama/Llama-3.2-1B-Instruct for provider hf-inference | by kurosse |
| 10 | 🟣 1130 - [BUG] ❌ Failed to load step 'text_generation_0': Step load failed: No module named 'distilabel.models.openai' | by Tavish9 |

## Planned issues for upcoming releases

| Rank | Issue | Milestone |
|------|-------|-----------|
| 1 | 🟢 889 - [FEATURE] Replace extra_sampling_params for normal arguments in vLLM | 1.4.0 |
| 2 | 🟢 880 - [FEATURE] Add exclude_from_signature attribute | 1.4.0 |
| 3 | 🟢 802 - [FEATURE] Add defaults to Steps and Tasks so they can be more easily connected | 1.4.0 |
| 4 | 🟢 773 - [DOCS] Include section/guide describing pipeline patterns | 1.4.0 |
| 5 | 🟢 771 - [FEATURE] Allow passing path to YAML file containing pipeline runtime parameters in distilabel run | 1.4.0 |
| 6 | 🟢 662 - [FEATURE] Allow passing self to steps created with step decorator | 1.4.0 |
| 7 | 🟢 579 - [FEATURE] Sequential execution for local pipeline | 1.4.0 |
| 8 | 🟢 1091 - [BUG] distiset.push_to_hub() seems to have cache pathing issue | 1.6.0 |
| 9 | 🟢 1070 - [BUG] Pipeline serialization/caching issue when including RoutingBatchFunction | 1.6.0 |
| 10 | 🟢 1035 - [FEATURE] Do not pass rows that contains Step.inputs with None values | 1.6.0 |

Last update: 2025-06-01