
Commit a906445 (1 parent: 4b822cd)

Update support/troubleshooting.mdx

Co-Authored-By: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com>

1 file changed: support/troubleshooting.mdx (20 additions, 0 deletions)
```diff
@@ -60,6 +60,26 @@ description: "Solve common issues with CodinIT AI IDE, LLM providers, code gener
 Network issues typically resolve themselves - try again in a few minutes.
 </Info>
 </Accordion>
+
+<Accordion title="Ollama models not appearing" icon="server">
+**Problem**: Ollama models don't show up in the model selector dropdown.
+
+**Solutions**:
+
+1. **Install models first**: Ensure you've installed Ollama models on your device before using them
+2. **Configure base URL**: Go to Settings → Local Providers and set the Ollama base URL (e.g., `http://127.0.0.1:11434`)
+3. **Enable Ollama**: Make sure Ollama provider is enabled in settings
+4. **Check Ollama service**: Verify Ollama is running on your system
+5. **Refresh model list**: Return to chat and open the provider/model dropdown to see available models
+
+**Docker users**:
+- Use `host.docker.internal` instead of `localhost` or `127.0.0.1` for the base URL
+- Ensure Docker has network access to your host machine
+
+<Tip>
+After configuring Ollama settings, models should appear automatically in the provider dropdown.
+</Tip>
+</Accordion>
 </AccordionGroup>
 
 ## MacOS Specific Issues
```
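Steps 2 and 4 of the added troubleshooting entry can be verified programmatically: Ollama exposes an HTTP API, and a `GET` on `/api/tags` at the configured base URL returns the installed models. A minimal sketch, assuming the default base URL from step 2 (the `list_ollama_models` helper name is ours, not part of the docs):

```python
import json
import urllib.request
import urllib.error


def list_ollama_models(base_url="http://127.0.0.1:11434"):
    """Return installed Ollama model names, or None if the server is unreachable.

    base_url should match Settings → Local Providers (Docker users:
    swap in host.docker.internal as the docs note).
    """
    try:
        # Ollama's /api/tags endpoint lists locally installed models.
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        # Connection refused / timeout: the service isn't reachable.
        return None
```

A `None` result points at step 4 (Ollama not running, or a wrong base URL); an empty list points at step 1 (no models installed yet, e.g. nothing pulled via `ollama pull`).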
