Commit fcb6e3b

Update model-config/context-windows.mdx
Co-Authored-By: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com>
1 parent b3ede3a commit fcb6e3b

1 file changed

Lines changed: 5 additions & 5 deletions

File tree

model-config/context-windows.mdx

```diff
@@ -1,18 +1,18 @@
 ---
 title: "Context Window Guide"
-description: "Understand and manage AI model context windows to optimize performance, control costs, and work effectively with large codebases."
+description: "How much the AI can remember at once"
 ---

-## What is a context window?
+## What is a Context Window?

-A context window is the maximum amount of text an AI model can process at once. Think of it as the model's "working memory" - it determines how much of your conversation and code the model can consider when generating responses.
+A context window is how much text the AI can look at and remember at one time. Think of it like the AI's short-term memory.

 <Note>
-**What are tokens?** Tokens are small chunks of text that AI models read. A token is roughly ¾ of a word. For example, the word "hamburger" becomes two tokens: "ham" and "burger". When we talk about context window sizes like "128K tokens," that means the AI can read about 96,000 words at once.
+**What are tokens?** Tokens are small pieces of text. About 4 letters = 1 token. So "hamburger" is 2 tokens: "ham" and "burger". When we say "128K tokens," that means the AI can remember about 96,000 words at once.
 </Note>

 <Tip>
-**Key point**: Larger context windows allow the model to understand more of your codebase at once, but may increase costs and response times.
+**Important**: Bigger context windows let the AI see more of your project, but they cost more money and take longer.
 </Tip>

 ### Quick reference
```
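The token arithmetic in the note above (a token is roughly ¾ of a word, so "128K tokens" comes out to about 96,000 words) can be sketched as a quick back-of-the-envelope estimate. This is a hypothetical helper, not part of the docs or any tokenizer API; the 0.75 ratio is the doc's rule of thumb and real tokenizers will vary:

```python
def tokens_to_words(tokens: int, words_per_token: float = 0.75) -> int:
    """Rough word capacity of a context window.

    words_per_token=0.75 is the rule-of-thumb ratio from the doc
    (a token is roughly 3/4 of a word); actual tokenizers differ.
    """
    return int(tokens * words_per_token)


def words_to_tokens(words: int, words_per_token: float = 0.75) -> int:
    """Inverse estimate: roughly how many tokens a word count consumes."""
    return int(words / words_per_token)


print(tokens_to_words(128_000))  # 96000, matching the "128K tokens ~ 96,000 words" note
print(words_to_tokens(96_000))   # 128000
```

Under this rule of thumb the two functions are inverses, which makes it easy to sanity-check whether a document will fit in a given window before sending it.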
