LLM Knowledge Base

Context Window

The context window refers to the span of text, measured in tokens, that an AI model considers when predicting its next output. It is a crucial aspect of language models: the context window determines how much surrounding text the model can use to capture the semantic relationships between words and phrases. Its size strongly influences performance, since a larger window gives the model more information to base its predictions on, but it also increases computational cost.
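As a rough illustration, a tokenizer library such as tiktoken (assumed to be installed here) can show how a prompt maps to tokens and how much of a given window it consumes; the window size below is just an illustrative value.

```python
# Sketch: counting how many tokens of a prompt a context window can hold.
# Assumes the `tiktoken` package is installed; the window size is illustrative.
import tiktoken

CONTEXT_WINDOW = 2048  # e.g., the GPT-3 era window size mentioned below

enc = tiktoken.get_encoding("cl100k_base")
prompt = "The context window limits how much text the model can attend to."
tokens = enc.encode(prompt)

print(f"Prompt uses {len(tokens)} tokens of a {CONTEXT_WINDOW}-token window.")
```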

The size of the context window is often a limiting factor for non-trivial LLM applications.
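To see why, consider a chat-style application that has to keep its entire conversation history inside the window. The sketch below is one common workaround, not a prescribed method: it assumes the same tiktoken tokenizer, an illustrative token budget, and a simple list-of-strings message format, and drops the oldest messages until the prompt fits.

```python
# Sketch: trimming conversation history so it fits the context window.
# The tokenizer, budget, and message format are illustrative assumptions.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")
TOKEN_BUDGET = 2048  # illustrative window size; real limits depend on the model


def count_tokens(messages: list[str]) -> int:
    """Total tokens across all messages (ignoring any per-message overhead)."""
    return sum(len(enc.encode(m)) for m in messages)


def trim_to_budget(messages: list[str], budget: int = TOKEN_BUDGET) -> list[str]:
    """Drop the oldest messages until the remaining ones fit the budget."""
    trimmed = list(messages)
    while trimmed and count_tokens(trimmed) > budget:
        trimmed.pop(0)  # discard the oldest message first
    return trimmed


history = ["You are a helpful assistant."] + [f"User message {i}" for i in range(500)]
fitted = trim_to_budget(history)
print(f"Kept {len(fitted)} of {len(history)} messages within {TOKEN_BUDGET} tokens.")
```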

Over time, as LLMs become more capable, context window sizes will likely keep increasing. They have already grown from 2,048 tokens (GPT-3) to 100,000 (Claude 2) and, only a few months later, to 200,000 (Claude 3).
