
Commit 4547f72

fix broken lm studio links in local model setup guide
Parent: 4165ffd

1 file changed: running-models-locally/local-model-setup.mdx

Lines changed: 3 additions & 3 deletions
@@ -12,7 +12,7 @@ Local models have reached a turning point where they're now practical for real d
 ## Quick Start
 
 1. **Check your hardware** - 32GB+ RAM minimum
-2. **Choose your runtime** - [LM Studio](/running-models-locally/lm-studio) or [Ollama](/providers/ollama)
+2. **Choose your runtime** - [LM Studio](/providers/lmstudio) or [Ollama](/providers/ollama)
 3. **Download Qwen3 Coder 30B** - The recommended model
 4. **Configure settings** - Enable compact prompts, set max context
 5. **Start coding** - Completely offline
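As a side note, Quick Start step 1 in the patched guide ("32GB+ RAM minimum") can be checked with a short shell snippet. This is only an illustrative sketch, not part of the guide's diff; it is Linux-specific, reading `/proc/meminfo` (which reports memory in kB):

```shell
# Quick Start step 1: verify the machine meets the 32GB+ RAM minimum
# Linux-only sketch; /proc/meminfo reports MemTotal in kB
total_kb=$(awk '/MemTotal/ {print $2}' /proc/meminfo)
total_gb=$((total_kb / 1024 / 1024))
echo "Total RAM: ${total_gb} GB"
if [ "$total_gb" -ge 32 ]; then
  echo "Meets the recommended minimum for local 30B models"
else
  echo "Below the recommended 32 GB minimum"
fi
```

On macOS the equivalent figure comes from `sysctl -n hw.memsize` (bytes) instead of `/proc/meminfo`.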
@@ -60,7 +60,7 @@ LM Studio
 * **Pros**: User-friendly GUI, easy model management, built-in server
 * **Cons**: Memory overhead from UI, limited to single model at a time
 * **Best for**: Desktop users who want simplicity
-* [Setup Guide →](/running-models-locally/lm-studio)
+* [Setup Guide →](/providers/lmstudio)
 
 ### Ollama
 
@@ -229,7 +229,7 @@ Note: These may require additional configuration and testing.
 Ready to get started? Choose your path:
 
 <CardGroup cols={2}>
-  <Card title="LM Studio Setup" icon="desktop" href="/running-models-locally/lm-studio">
+  <Card title="LM Studio Setup" icon="desktop" href="/providers/lmstudio">
   User-friendly GUI approach with detailed configuration guide
 </Card>
 
Comments (0)