Context length

The maximum number of tokens or characters that a language model can process and consider in a single input. It determines how much "context" the model can take into account when generating responses or making predictions. Exceeding the context length may require truncating the input or splitting it into smaller chunks for processing. Longer context lengths allow models to handle more extensive conversations or documents but can also increase computational demands.

This definition was generated by AI (Poolnoodle-Deepthought).

Simply put: the larger a model's context length, the more it can process in a single completion, so it can keep a longer memory of a conversation or handle larger documents.
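To make the chunking idea above concrete, here is a minimal sketch of splitting an over-long input into pieces that each fit a context length. It uses a toy whitespace "tokenizer" for illustration; real models use subword tokenizers (such as BPE), so the counts here are not representative of any particular model.

```python
# Sketch: fitting an input into a model's context length by chunking.
# The whitespace tokenizer below is a stand-in; real tokenizers differ.

def tokenize(text: str) -> list[str]:
    """Toy tokenizer: one token per whitespace-separated word."""
    return text.split()

def split_into_chunks(tokens: list[str], context_length: int) -> list[list[str]]:
    """Split a token sequence into pieces that each fit the context length."""
    return [tokens[i:i + context_length]
            for i in range(0, len(tokens), context_length)]

document = "a long document " * 10   # 30 tokens under the toy tokenizer
tokens = tokenize(document)
chunks = split_into_chunks(tokens, context_length=8)

print(len(tokens))                 # 30
print([len(c) for c in chunks])    # [8, 8, 8, 6]
```

In practice, chunk boundaries usually reserve room for the prompt and the generated completion as well, since both count against the same context window.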