
Token

The basic unit of text that AI language models process, typically representing about 4 characters or 0.75 words.

Definition

In AI language models, a token is the basic unit of text that the model processes. Tokens are typically pieces of words: roughly 4 characters or about 0.75 words on average in English. For example, the word "understanding" might be split into "under" and "standing" as two separate tokens.
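
You can see tokenization in action with OpenAI's open-source tiktoken library. Below is a minimal sketch, assuming tiktoken is installed; it uses the cl100k_base encoding (the one used by GPT-4) to split a sentence into tokens and count them. The exact splits vary by encoding, so the fragments shown in the comments are illustrative.

```python
# pip install tiktoken
import tiktoken

# cl100k_base is the encoding used by GPT-4 and GPT-3.5-turbo
enc = tiktoken.get_encoding("cl100k_base")

text = "Tokenization splits understanding into smaller pieces."
token_ids = enc.encode(text)

# Decode each token id back to its text fragment to see where the splits fall
fragments = [enc.decode([tid]) for tid in token_ids]
print(fragments)       # e.g. ['Token', 'ization', ' splits', ' understanding', ...]
print(len(token_ids))  # total token count for the sentence
```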

Tokens matter because AI models have context limits measured in tokens (e.g., 8K, 32K, 128K tokens), and API pricing is typically per-token. Understanding tokenization helps in managing costs and working within model constraints.

Why It Matters

Token limits affect how much context you can provide to AI models and how long responses can be. Marketers using AI tools need to understand tokens to manage costs, work within limits, and optimize their prompts.

Longer context windows (more tokens) enable AI to process entire documents, maintain longer conversations, and handle more complex tasks.

Examples in Practice

A 2,000-word article is approximately 2,700 tokens (at about 0.75 words per token). If your AI tool has a 4,000-token limit, that one article consumes most of it, so you need to be strategic about how much of the article to include in your prompt.
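
As a rough illustration, the sketch below estimates whether a prompt fits a context window using the 0.75-words-per-token heuristic from above. The 4,000-token limit comes from the example; the 500-token reserve for the model's reply is an assumed value you would tune for your use case.

```python
WORDS_PER_TOKEN = 0.75    # rough English average cited above
CONTEXT_LIMIT = 4_000     # example context limit from the text
RESPONSE_RESERVE = 500    # tokens held back for the reply (assumed)

def estimate_tokens(text: str) -> int:
    """Heuristic token estimate from word count; exact counts need a tokenizer."""
    return round(len(text.split()) / WORDS_PER_TOKEN)

article = "word " * 2_000  # stand-in for a 2,000-word article
needed = estimate_tokens(article)
budget = CONTEXT_LIMIT - RESPONSE_RESERVE

print(f"Estimated prompt tokens: {needed}")  # ~2,667
print("Fits" if needed <= budget else f"Trim about {needed - budget} tokens")
```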

GPT-4 pricing is based on tokens processed. At $0.03 per 1,000 input tokens and $0.06 per 1,000 output tokens, a 5,000-word analysis (roughly 6,700 tokens) costs about $0.20 to send as input, or about $0.40 if the model generates it as output.
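
For concreteness, here is a minimal cost calculation at those published GPT-4 rates; the 6,700-token figure is the word-count estimate from above.

```python
# GPT-4 per-1,000-token rates quoted above
INPUT_RATE = 0.03 / 1_000   # dollars per input token
OUTPUT_RATE = 0.06 / 1_000  # dollars per output token

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Dollar cost of one API call at the rates above."""
    return input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE

# A 5,000-word analysis is ~6,700 tokens at 0.75 words per token
print(f"${estimate_cost(input_tokens=6_700, output_tokens=0):.2f}")  # $0.20 as input
print(f"${estimate_cost(input_tokens=0, output_tokens=6_700):.2f}")  # $0.40 as output
```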
