Adding “think step by step” to prompts can yield better outputs. Anthropic notes that giving the chatbot a role such as “data scientist” produces different results, because a data scientist might notice different things in the data. Under the hood, models do not process language directly but a numerical representation: tokens. 17509, 656, 5983 is “step by step” tokenized for ChatGPT 4o. Magic tokens! But if you tell a model not to think of pink elephants, you may dilute its focus and waste context. Beware of “poison tokens.”
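A minimal sketch of the two prompting tricks above: a role in the system message and a “think step by step” cue appended to the user message. The helper name `build_messages` and the OpenAI-style chat message format are assumptions for illustration, not code from the article.

```python
# Hypothetical helper combining role prompting with a step-by-step cue.
# The function name and message format are assumptions (OpenAI-style chat),
# not part of the original article.

def build_messages(role: str, task: str) -> list[dict]:
    """Assemble chat messages with a persona and a step-by-step cue."""
    return [
        # Role prompting: the persona shapes what the model attends to.
        {"role": "system", "content": f"You are a {role}."},
        # Appending the cue nudges the model toward explicit reasoning.
        {"role": "user", "content": f"{task}\n\nLet's think step by step."},
    ]

messages = build_messages("data scientist", "Explain the outliers in this sales data.")
for m in messages:
    print(f"{m['role']}: {m['content']}")
```

Note the cue is phrased positively; per the “pink elephants” caveat, telling the model what *not* to do can push the unwanted tokens into context anyway.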

Brownlee, Jason (2025): Magical Tokens for LLMs