Issue dashboard
Rank | Issue | Author
---|---|---
1 | 🟢 1137 - [BUG] Parameters of GeneratorStep are ignored during dry_run | sung1-kang
2 | 🟢 1136 - Token classification example | drewskidang
3 | 🟢 1135 - [BUG] Structured Output with InferenceEndpointsLLM and TGI | joaomsimoes
4 | 🟣 1133 - [BUG] maybe this error has to do with outlines? | makrse
5 | 🟢 1132 - [BUG] TextGeneration always process with fixed interval, not match the throughput of LLM | observerw
6 | 🟢 1131 - [BUG] Bad request: Not allowed to GET status/meta-llama/Llama-3.2-1B-Instruct for provider hf-inference | ytan101
7 | 🟣 1130 - [BUG] ❌ Failed to load step 'text_generation_0': Step load failed: No module named 'distilabel.models.openai' | Tavish9
8 | 🟢 1125 - [BUG] AttributeError in AzureOpenAILLM.load: Missing attribute openai in distilabel.models | FarrelRamdhani
9 | 🟣 1120 - [FEATURE] Add Task for multi-turn dialogue distillation | AndreasMadsen
10 | 🟢 1119 - [DOCS] Fix in example - Using a DummyLLM to avoid loading one | kgdrathan
Last update: 2025-04-07