@@ -12,7 +12,7 @@ Local models have reached a turning point where they're now practical for real d
 ## Quick Start
 
 1. **Check your hardware** - 32GB+ RAM minimum
-2. **Choose your runtime** - [LM Studio](/running-models-locally/lm-studio) or [Ollama](/providers/ollama)
+2. **Choose your runtime** - [LM Studio](/providers/lmstudio) or [Ollama](/providers/ollama)
 3. **Download Qwen3 Coder 30B** - The recommended model
 4. **Configure settings** - Enable compact prompts, set max context
 5. **Start coding** - Completely offline
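The Quick Start steps above can be sketched with Ollama's CLI; the model tag used here is an assumption — check the Ollama model library for the exact name your version exposes.

```shell
# Step 3: download the model (tag is an assumption; verify with the Ollama library)
ollama pull qwen3-coder:30b

# Step 5: start an interactive session -- runs completely offline once downloaded
ollama run qwen3-coder:30b
```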
@@ -60,7 +60,7 @@ LM Studio
 * **Pros**: User-friendly GUI, easy model management, built-in server
 * **Cons**: Memory overhead from UI, limited to single model at a time
 * **Best for**: Desktop users who want simplicity
-* [Setup Guide →](/running-models-locally/lm-studio)
+* [Setup Guide →](/providers/lmstudio)
 
 ### Ollama
 
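Whichever runtime you pick, both LM Studio (default port 1234) and Ollama (default port 11434) expose an OpenAI-compatible HTTP API, so client code stays the same. A minimal sketch of building such a request — the model identifier is an assumption; use whatever name your runtime reports:

```python
import json

def chat_request(prompt, base_url="http://localhost:1234/v1", model="qwen3-coder-30b"):
    """Build an OpenAI-compatible chat-completions request for a local runtime.

    LM Studio defaults to port 1234; for Ollama, pass
    base_url="http://localhost:11434/v1". The model name is an assumption --
    substitute the identifier your runtime actually lists.
    """
    return {
        "url": f"{base_url}/chat/completions",
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
            "stream": False,
        }),
    }

req = chat_request("Write a haiku about offline coding")
print(req["url"])  # http://localhost:1234/v1/chat/completions
```

POST the `body` to the `url` with any HTTP client once your runtime's server is running.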
@@ -229,7 +229,7 @@ Note: These may require additional configuration and testing.
 Ready to get started? Choose your path:
 
 <CardGroup cols={2}>
-  <Card title="LM Studio Setup" icon="desktop" href="/running-models-locally/lm-studio">
+  <Card title="LM Studio Setup" icon="desktop" href="/providers/lmstudio">
   User-friendly GUI approach with detailed configuration guide
   </Card>
 