The bridge between a chatbot like ChatGPT and another piece of software. Want your chatbot to access your WordPress blog? MCP. Let your chatbot write in a Notion database? MCP. Let a chatbot delete your whole codebase in a glitch? MCP.
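The core idea can be sketched in a few lines: a server exposes named tools, and the chatbot invokes them by name with arguments. This is a toy illustration only, with an invented `delete_post` tool and invented function names; the real protocol standardizes this exchange over JSON-RPC with capability negotiation.

```python
# Toy sketch of the pattern MCP standardizes: a registry of named tools
# that a chatbot can invoke. Tool names and payloads here are invented.
TOOLS = {}

def tool(name):
    """Decorator that registers a function as a callable tool."""
    def register(fn):
        TOOLS[name] = fn
        return fn
    return register

@tool("delete_post")
def delete_post(post_id):
    # Whatever the tool does, it really does -- hence the glitch risk.
    return f"deleted post {post_id}"

def handle_call(name, **kwargs):
    """Dispatch a model's tool request to the registered function."""
    return TOOLS[name](**kwargs)

print(handle_call("delete_post", post_id=42))  # deleted post 42
```

The point of the sketch: once a tool is wired up, the model's request is executed for real, which is exactly why both the WordPress use case and the codebase-deleting glitch flow through the same mechanism.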
Overemployment
When you secretly juggle multiple remote jobs at once. This trend started in software development during the Covid years—a mixture of hustle culture and strategic deadline management. Bosses are not happy. Those AI efficiency gains? Not for the plebs.
Magic Tokens
Adding “think step by step” to prompts can yield better outputs. Anthropic says that giving the chatbot a role like “data scientist” makes for different results, because a data scientist might see different things in data. Under the hood, models don’t process language directly but a numerical representation: tokens. 17509, 656, 5983 is “step by step” tokenized...
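A toy version of that text-to-numbers step, to make the idea concrete. Real models use learned subword vocabularies (byte-pair encoding and friends), so the IDs below are invented for illustration and won’t match any actual model’s tokenizer.

```python
def tokenize(text, vocab):
    """Map each word to an integer ID, adding unseen words to the
    vocabulary as we go. A crude stand-in for real subword tokenizers."""
    ids = []
    for word in text.lower().split():
        if word not in vocab:
            vocab[word] = len(vocab)
        ids.append(vocab[word])
    return ids

vocab = {}
print(tokenize("think step by step", vocab))  # [0, 1, 2, 1]
```

Note how the repeated word “step” reuses its ID: to the model, the prompt is just that sequence of numbers.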
Luddites
The Rebellion against the Empire of AI. Named after a British labor movement from 1811-1816 that—with style and swagger—disrupted textile factories. Not opposed to machines per se, they wanted a say in how technology was used to preserve their livelihood, their craft, their community. (The government eventually suppressed the Luddites, requiring 14,000 troops.) The Luddites...
Informational Logistics
The reduction of journalism into filling content pipelines. News gets fragmented into AI-digestible units, automatically repackaged for different audiences, and fed directly into chatbots and recommendation systems. Klingebiel, Johannes (2025): Informational logistics
Liquid content
It’s the death of the article. “With generative AI, we’re entering a new era: content is becoming dynamic again, flowing like water to adapt to you—your time, space, and interactions,” says Google’s Matthieu Lorrain. Text-to-speech is just a tiny step in that direction. “What if news media were to let go of the artifact as...
Grounding
LLMs generate text by predicting word patterns, but they don’t know what words actually mean to humans. That’s why they sometimes hallucinate. But when you provide outside information—like documents or websites—their answers can be anchored to real, meaningful data. That’s grounding. Harnad, Stevan (1990): The symbol grounding problem
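A minimal sketch of the mechanics, assuming the simplest possible setup: pick the supplied document that overlaps most with the question and prepend it to the prompt. The function name, prompt template, and word-overlap scoring are all invented for illustration; real systems use embeddings and vector search.

```python
def ground_prompt(question, documents):
    """Pick the document sharing the most words with the question and
    prepend it, so the model's answer can be anchored to supplied text
    rather than to pattern-matched guesses."""
    q_words = set(question.lower().split())
    best = max(documents,
               key=lambda d: len(q_words & set(d.lower().split())))
    return f"Use only this source:\n{best}\n\nQuestion: {question}"

docs = [
    "The city council approved the budget on Tuesday.",
    "The new stadium opens next spring.",
]
prompt = ground_prompt("When did the council approve the budget?", docs)
```

The resulting prompt carries the council document along with the question, which is the whole trick: the model now has real text to lean on instead of free-associating.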
Inference
When you input something to an AI model and it responds, that process is called inference. The model answers questions it’s never seen before by inferring from its training data.
Fakecast
When two AI voices try really hard to mimic your favorite podcast. It started with Google’s NotebookLM. Now it’s everywhere. Reissmann, Ole (2025): Rise of the Fakecast: The Uncanny Valley of AI-Generated Podcasts
Deep Tailoring
Not a product you can buy—it’s a thought experiment on weaponizing behavioral science. An AI system maps your entire psychological profile—your core beliefs, moral values, identity markers—then uses that data to predict your behavior and design messages to influence you without detection. Scale this process across millions of users and automate it completely. Luttrell, Andrew,...
Context Engineering
The art of designing, testing, and optimizing prompts remains underappreciated yet crucial. But it matters less for newer “reasoning” models like o3, which attempt to figure out tasks independently. And suddenly prompt engineering feels very 2023—nowadays it’s all about context: giving chatbots access to the right tools and the right knowledge at the right moment....
Alignment
You could argue that chatbots shouldn’t give out detailed instructions on how to kill someone or call themselves MechaHitler. Most AI companies align their models this way, showing them during training what to generate and what not to generate. It’s a bit of an art, really, as we haven’t found a good way to calculate...
Brainrot
Internet slang for mindless digital consumption. In late 2024 and early 2025, “Italian brainrot” became a TikTok and Instagram meme—AI-generated videos featuring characters like Chimpanzini Bananini speaking in fake Italian gibberish. It’s not slop, it’s… art? Gen Z only
AIO or AI Overviews
Google’s AI summaries that appear above search results, leading—to no one’s surprise—to a massive drop in search clicks. Then there’s Google’s version of Perplexity, called AI Mode.
AGI or Artificial General Intelligence
It’s mostly a marketing ploy to raise money and avoid regulation. Nobody really knows what it actually means. Avoid for now