Data Science & ML

Context Window

/ˈkɒntekst ˈwɪndəʊ/

Definition

The maximum number of tokens an LLM can process at once — input + output combined must fit within this limit.

Example in context

"GPT-4 Turbo has a 128k-token context window — you can include long documents in the prompt, but the output must fit in the same budget and cost scales with tokens."
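Because input and output share one token budget, callers typically reserve room for the reply before sending a prompt. A minimal sketch of that bookkeeping (the 128k limit and the function name are illustrative assumptions, not tied to any specific API):

```python
# Context-window budgeting sketch: prompt tokens + completion tokens
# must fit inside one fixed limit. The 128_000 figure is illustrative.
CONTEXT_WINDOW = 128_000

def max_completion_tokens(prompt_tokens: int,
                          context_window: int = CONTEXT_WINDOW) -> int:
    """Return how many tokens remain for the model's reply.

    Raises ValueError if the prompt alone already fills the window.
    """
    remaining = context_window - prompt_tokens
    if remaining <= 0:
        raise ValueError(
            f"prompt ({prompt_tokens} tokens) exceeds the "
            f"{context_window}-token context window"
        )
    return remaining

# A 120k-token prompt leaves only 8k tokens for the answer.
print(max_completion_tokens(120_000))  # → 8000
```

In practice the prompt's token count comes from the model's own tokenizer (counting words or characters is only a rough proxy), and requests whose prompt nears the limit either truncate input or fail outright.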

Practice this term

Master Context Window by working through exercises in the Data Science & ML module. You'll see the term used in real engineering scenarios with multiple-choice, fill-in-the-blank, and matching drills.