auto_rl example uses #641
When trying to run the auto_rl example here, the call to acompletion causes OpenRouter errors:

```python
response = await acompletion(
    model=INPUT_GENERATION_MODEL,
    messages=messages,
    response_format=TrainingDataset,
    temperature=1.0,
)
```
This returns:

```
/usr/local/lib/python3.12/dist-packages/litellm/llms/custom_httpx/llm_http_handler.py in _handle_error(self, e, provider_config)
   2402
-> 2403     raise provider_config.get_error_class(
   2404         error_message=error_text,
OpenRouterException: {"error":{"message":"Provider returned error","code":400,"metadata":{"raw":"{\"code\":400, \"reason\":\"INVALID_REQUEST_BODY\", \"message\":\"model features structured outputs not support\", \"metadata\":{}}","provider_name":"Novita","is_byok":false}},"user_id":"user_xxx"}
```
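The error metadata suggests OpenRouter routed the request to Novita, which rejects structured outputs. A possible workaround, sketched below, is to ask OpenRouter to route only to providers that support every parameter in the request. `require_parameters` is OpenRouter's provider-routing preference; whether litellm forwards `extra_body` to OpenRouter unchanged is an assumption, and the model name and messages here are placeholders, not values from the example:

```python
# Sketch: restrict OpenRouter routing to providers that honour all request
# parameters (including structured outputs), so providers like Novita that
# reject response_format are skipped.
# NOTE: INPUT_GENERATION_MODEL and messages are hypothetical placeholders.
INPUT_GENERATION_MODEL = "openrouter/deepseek/deepseek-chat"  # placeholder
messages = [{"role": "user", "content": "Generate a training dataset."}]

request_kwargs = dict(
    model=INPUT_GENERATION_MODEL,
    messages=messages,
    temperature=1.0,
    # OpenRouter provider-routing preference; assumes litellm passes
    # extra_body through to the OpenRouter request body as-is.
    extra_body={"provider": {"require_parameters": True}},
)
# response = await acompletion(response_format=TrainingDataset, **request_kwargs)
```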
I suspect this might be a litellm issue; however, it appears to have been fixed there, so I'm unsure what's happening. So far I haven't been able to run any of the examples on Colab. It may make sense to add instructions that lead to a happy path with the examples so that folks can get started successfully.
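In the meantime, one fallback that sidesteps the structured-outputs requirement entirely is to drop `response_format` and parse the model's JSON reply manually. This is only a sketch: the `TrainingDataset` stand-in and its `prompts` field are hypothetical, not the example's real schema:

```python
import json
from dataclasses import dataclass


@dataclass
class TrainingDataset:
    # Hypothetical stand-in for the example's real response model.
    prompts: list


def parse_dataset(raw: str) -> TrainingDataset:
    """Parse a JSON reply from a model that lacks structured-output support."""
    text = raw.strip()
    # Models often wrap JSON in a markdown code fence; strip it if present.
    if text.startswith("```"):
        text = text.split("\n", 1)[1].rsplit("```", 1)[0]
    data = json.loads(text)
    return TrainingDataset(prompts=data["prompts"])


# usage (after calling acompletion without response_format):
# reply = response.choices[0].message.content
# dataset = parse_dataset(reply)
```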