
Commit 3db6265

creaumond and crmne authored
Preserves assume_model_exists in to_llm for custom models (#564)
## What this does

When a user passes the `assume_model_exists` flag through a custom Rails model, the flag is dropped and they cannot use a custom model. This change preserves the flag in `to_llm`, defaulting it to `false` when not provided.

## Type of change

- [x] Bug fix
- [ ] New feature
- [ ] Breaking change
- [ ] Documentation
- [ ] Performance improvement

## Scope check

- [x] I read the [Contributing Guide](https://github.com/crmne/ruby_llm/blob/main/CONTRIBUTING.md)
- [x] This aligns with RubyLLM's focus on **LLM communication**
- [x] This isn't application-specific logic that belongs in user code
- [x] This benefits most users, not just my specific use case

## Quality check

- [x] I ran `overcommit --install` and all hooks pass
- [x] I tested my changes thoroughly
- [ ] For provider changes: Re-recorded VCR cassettes with `bundle exec rake vcr:record[provider_name]`
- [x] All tests pass: `bundle exec rspec`
- [ ] I updated documentation if needed
- [x] I didn't modify auto-generated files manually (`models.json`, `aliases.json`)

## API changes

- [ ] Breaking change
- [ ] New public methods/classes
- [ ] Changed method signatures
- [x] No API changes

## Related issues

Fixes #555

Co-authored-by: Carmine Paolino <carmine@paolino.me>
1 parent bdef162 · commit 3db6265

3 files changed · 14 additions & 3 deletions

lib/ruby_llm/active_record/chat_methods.rb

Lines changed: 2 additions & 1 deletion

```diff
@@ -79,7 +79,8 @@ def to_llm
       model_record = model_association
       @chat ||= (context || RubyLLM).chat(
         model: model_record.model_id,
-        provider: model_record.provider.to_sym
+        provider: model_record.provider.to_sym,
+        assume_model_exists: assume_model_exists || false
       )
       @chat.reset_messages!
```
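The `|| false` coalescing matters because an unset flag read from the Rails record would otherwise be forwarded as `nil`. A minimal plain-Ruby sketch of that defaulting behavior (the class and method names here are hypothetical, not part of RubyLLM):

```ruby
# Hypothetical stand-in for the ActiveRecord-backed chat record.
class FakeChatRecord
  attr_accessor :assume_model_exists

  # Mirrors the fixed to_llm: forward the flag, defaulting nil to false
  # so the keyword is always an explicit boolean.
  def llm_options
    { assume_model_exists: assume_model_exists || false }
  end
end

record = FakeChatRecord.new
record.llm_options # => { assume_model_exists: false }

record.assume_model_exists = true
record.llm_options # => { assume_model_exists: true }
```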

spec/ruby_llm/active_record/acts_as_model_spec.rb

Lines changed: 4 additions & 2 deletions

```diff
@@ -255,7 +255,8 @@ def messages
       # Mock the chat creation to verify parameters
       expect(RubyLLM).to receive(:chat).with( # rubocop:disable RSpec/MessageSpies,RSpec/StubbedMock
         model: 'test-gpt',
-        provider: :openai
+        provider: :openai,
+        assume_model_exists: false
       ).and_return(
         instance_double(RubyLLM::Chat, reset_messages!: nil, add_message: nil,
                         instance_variable_get: {}, on_new_message: nil, on_end_message: nil,
@@ -272,7 +273,8 @@ def messages

       expect(RubyLLM).to receive(:chat).with( # rubocop:disable RSpec/MessageSpies,RSpec/StubbedMock
         model: 'test-claude',
-        provider: :anthropic
+        provider: :anthropic,
+        assume_model_exists: false
       ).and_return(
         instance_double(RubyLLM::Chat, reset_messages!: nil, add_message: nil,
                         instance_variable_get: {}, on_new_message: nil, on_end_message: nil,
```

spec/ruby_llm/active_record/acts_as_spec.rb

Lines changed: 8 additions & 0 deletions

```diff
@@ -535,6 +535,14 @@ class ToolCall < ActiveRecord::Base # rubocop:disable RSpec/LeakyConstantDeclara
       expect(llm_tool_call.name).to eq('calculator')
       expect(llm_tool_call.arguments).to eq({ 'expression' => '2 + 2' })
     end
+
+    it 'correctly preserves custom model' do
+      custom_model = 'my-custom-model'
+      bot_chat = Assistants::BotChat.create!(model: custom_model, assume_model_exists: true, provider: 'openrouter')
+      bot_chat.save!
+      llm_chat = bot_chat.to_llm
+      expect(llm_chat.model.id).to eq(custom_model)
+    end
   end
 end
```
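The new spec exercises the failure from #555: with the flag preserved, a model id that is absent from the registry still reaches the chat. A rough plain-Ruby sketch of the lookup the flag bypasses (the registry contents and `resolve_model` helper are illustrative assumptions, not RubyLLM internals):

```ruby
# Illustrative model registry; the real library resolves against models.json.
KNOWN_MODELS = %w[gpt-4o claude-3-5-sonnet].freeze

# Hypothetical resolver: assume_model_exists skips the registry check,
# letting custom ids (e.g. OpenRouter models) pass through unchanged.
def resolve_model(id, assume_model_exists: false)
  return id if assume_model_exists || KNOWN_MODELS.include?(id)

  raise ArgumentError, "unknown model: #{id}"
end

resolve_model('gpt-4o')                                     # registry hit
resolve_model('my-custom-model', assume_model_exists: true) # check bypassed
```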
