Add Ollama Cloud provider with native Ollama API support #603

Closed

gs-deliverists-io wants to merge 1 commit into crmne:main from deliverists-io:feature/ollama-cloud-provider

Conversation


@gs-deliverists-io gs-deliverists-io commented Feb 12, 2026

 - Implements Ollama Cloud provider using native Ollama API
 - API base: https://ollama.com (matching Python client)
 - Uses OLLAMA_API_KEY environment variable
 - Supports cloud models with -cloud suffix
 - Implements /api/tags, /api/chat, /api/embed endpoints
 - Adds assume_models_exist? to allow unregistered cloud models
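As a sketch of the `assume_models_exist?` behavior described above (the class and method shapes here are illustrative assumptions, not the PR's actual code), a provider that assumes models exist can accept cloud model IDs that were never registered locally:

```ruby
# Illustrative provider sketch (hypothetical names, not the PR's code):
# assume_models_exist? lets callers use model IDs absent from the registry.
class OllamaCloudProvider
  def assume_models_exist?
    true
  end

  # Accept any model ID when the provider assumes models exist;
  # otherwise require the ID to be present in the registry.
  def resolve_model(model_id, registry)
    return model_id if assume_models_exist?

    registry.include?(model_id) ? model_id : raise(ArgumentError, "unknown model #{model_id}")
  end
end

provider = OllamaCloudProvider.new
puts provider.resolve_model("gpt-oss:120b-cloud", []) # => "gpt-oss:120b-cloud"
```

This is why unregistered cloud models such as `gpt-oss:120b-cloud` can be used directly without appearing in `models.json`.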

 ## What this does

 This PR adds an `:ollama_cloud` provider so RubyLLM can chat with Ollama Cloud models.

 ## Type of change

 - [x] New feature
 - [ ] Documentation

 ## Scope check

 - [x] I read the [Contributing Guide](https://github.com/crmne/ruby_llm/blob/main/CONTRIBUTING.md)
 - [x] This aligns with RubyLLM's focus on **LLM communication**
 - [x] This isn't application-specific logic that belongs in user code
 - [x] This benefits most users, not just my specific use case

 ## Required for new features

 - [ ] I opened an issue **before** writing code and received maintainer approval
 - [ ] Linked issue: #___

 **PRs for new features or enhancements without a prior approved issue will be closed.**

 ## Quality check

 - [ ] I ran `overcommit --install` and all hooks pass
 - [ ] I tested my changes thoroughly
   - [ ] For provider changes: Re-recorded VCR cassettes with `bundle exec rake 'vcr:record[provider_name]'`
   - [ ] All tests pass: `bundle exec rspec`
 - [ ] I updated documentation if needed
 - [ ] I didn't modify auto-generated files manually (`models.json`, `aliases.json`)

 ## AI-generated code

 - [ ] I used AI tools to help write this code
 - [ ] I have reviewed and understand all generated code (required if above is checked)

 ## API changes

 - [ ] Breaking change
 - [ ] New public methods/classes
 - [ ] Changed method signatures
 - [ ] No API changes

## Example of usage

```ruby
require "ruby_llm"

RubyLLM.configure do |config|
  config.ollama_api_key = "your-api-key"
end

# Use with cloud models (note the -cloud suffix)
chat = RubyLLM.chat model: "gpt-oss:120b-cloud", provider: :ollama_cloud
response = chat.ask "What is Ruby?"
puts response.content
```
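The `-cloud` suffix convention from the usage example could be checked with a small helper (the name `cloud_model?` is hypothetical, for illustration only, and not part of this PR):

```ruby
# Hypothetical helper: Ollama Cloud model IDs carry a "-cloud" suffix
# on the tag, e.g. "gpt-oss:120b-cloud", while local models do not.
def cloud_model?(model_id)
  model_id.end_with?("-cloud")
end

puts cloud_model?("gpt-oss:120b-cloud") # => true
puts cloud_model?("llama3:8b")          # => false
```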

@gs-deliverists-io gs-deliverists-io marked this pull request as draft February 12, 2026 19:03
Owner

crmne commented Feb 28, 2026

We already have support for different API bases, and #612 will add support for API keys.

Also, this violates the guidelines:

Required for new features

  • I opened an issue before writing code and received maintainer approval
  • Linked issue: #___

PRs for new features or enhancements without a prior approved issue will be closed.

Closing.

@crmne crmne closed this Feb 28, 2026