Token
A token, in the context of large language models (LLMs), is a chunk of text (a word, a subword piece, or even a single character or punctuation mark) that the model maps to a numeric ID for efficient processing. Tokens are the basic units of text that LLMs such as GPT-3 and ChatGPT use to understand and generate language.
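To make this concrete, here is a minimal sketch of tokenization using OpenAI's open-source tiktoken library, which implements the byte-pair-encoding (BPE) tokenizers used by GPT models. The sample string and printed output are illustrative; exact token IDs depend on the encoding chosen.

```python
# pip install tiktoken
import tiktoken

# Load the BPE encoding used by GPT-3.5/GPT-4 era models.
enc = tiktoken.get_encoding("cl100k_base")

text = "Tokens are the basic units of text."
token_ids = enc.encode(text)  # text -> list of integer token IDs
print(token_ids)

# Inspect the split: each ID maps back to a chunk of bytes/text.
for tid in token_ids:
    print(tid, enc.decode_single_token_bytes(tid))

# Round-trip: decoding the IDs recovers the original string.
assert enc.decode(token_ids) == text
```

Note that common words typically become a single token, while rarer words are split into several subword tokens, which is why token counts rather than word counts govern model context limits and API pricing.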