diff --git a/Configuration/TCA/tx_aim_configuration.php b/Configuration/TCA/tx_aim_configuration.php
index 2dc9c8c..4a46e06 100644
--- a/Configuration/TCA/tx_aim_configuration.php
+++ b/Configuration/TCA/tx_aim_configuration.php
@@ -96,6 +96,7 @@
],
'api_key' => [
'label' => 'LLL:EXT:aim/Resources/Private/Language/locallang_tca.xlf:tx_aim_configuration.columns.api_key.label',
+ 'description' => 'LLL:EXT:aim/Resources/Private/Language/locallang_tca.xlf:tx_aim_configuration.columns.api_key.description',
'onChange' => 'reload',
'config' => [
'type' => 'input',
diff --git a/README.md b/README.md
index 52d2a2e..ed4bd1d 100644
--- a/README.md
+++ b/README.md
@@ -75,6 +75,8 @@ Any installed `symfony/ai-*-platform` package is **auto-discovered** at containe
 
 After installation, create a provider configuration in the backend (Admin Tools > AiM > Providers) with your API key and preferred model.
 
+> **Local providers (Ollama, LM Studio):** The *API Key* field doubles as the endpoint URL. Enter `http://localhost:11434` (Ollama) or `http://localhost:1234` (LM Studio) instead of a key. The available models are then fetched live from that endpoint.
+
 ## Usage
 
 ### Tier 1: Proxy (recommended)
diff --git a/Resources/Private/Language/locallang_tca.xlf b/Resources/Private/Language/locallang_tca.xlf
index 267fad1..703044f 100644
--- a/Resources/Private/Language/locallang_tca.xlf
+++ b/Resources/Private/Language/locallang_tca.xlf
@@ -31,7 +31,10 @@
 				<source>Default</source>
 			</trans-unit>
 			<trans-unit id="tx_aim_configuration.columns.api_key.label">
-				<source>API Key</source>
+				<source>API Key / Endpoint URL</source>
+			</trans-unit>
+			<trans-unit id="tx_aim_configuration.columns.api_key.description">
+				<source>For local/self-hosted providers like Ollama or LM Studio, enter the endpoint URL (e.g. http://localhost:11434) instead of an API key.</source>
 			</trans-unit>
 			<trans-unit id="tx_aim_configuration.columns.model.label">
 				<source>Model</source>
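---

Note for reviewers: the README addition assumes the extension can tell a pasted endpoint URL apart from a real API key in the single `api_key` field. The sketch below only illustrates that distinction under that assumption; the helper name and return shape are hypothetical and not part of this patch.

```php
<?php

/**
 * Hypothetical helper (illustration only, not code from this patch):
 * interpret the single api_key value either as a local endpoint URL
 * (Ollama, LM Studio) or as an ordinary API key for a hosted provider.
 */
function resolveApiKeyOrEndpoint(string $value): array
{
    if (str_starts_with($value, 'http://') || str_starts_with($value, 'https://')) {
        // URL-shaped value: treat it as the base URL of a local/self-hosted
        // provider, e.g. http://localhost:11434 for Ollama.
        return ['baseUrl' => rtrim($value, '/'), 'apiKey' => null];
    }

    // Anything else is treated as a secret API key for a hosted provider.
    return ['baseUrl' => null, 'apiKey' => $value];
}

// Example: resolveApiKeyOrEndpoint('http://localhost:11434')
// → ['baseUrl' => 'http://localhost:11434', 'apiKey' => null]
```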