“A year ago, custom GPTs were the thing. Fast forward to today… and it almost feels vintage. But here’s the truth: many of us use custom AI agents in the newsroom, they are real and USEFUL.” (Alba Mora Roca, LinkedIn)
I Hate My Friend: The chatbot-enabled Friend necklace eavesdrops on your life and provides a running commentary that’s snarky and unhelpful. Worse, it can also make the people around you uneasy. (Kylie Robison, Wired)
How thousands of ‘overworked, underpaid’ humans train Google’s AI to seem smart: Contracted AI raters describe grueling deadlines, poor pay and opacity around work to make chatbots intelligent. (Varsha Bansal, The Guardian)
How to get fewer hallucinations: “What often is deemed a ‘wrong’ response is often merely a first pass at describing the beliefs out there. And the solution is the same: iterate the process.” (Mike Caulfield, The End(s) of Argument)
AI bots endlessly scrape publisher sites, causing costly downtime and meager traffic. (Charlotte Tobitt, PressGazette)
AI Search, Users, and News: A trove of data from LM Arena offers a glimpse into user search behavior. A few sources garnered the majority of impressions. (Nick Diakopoulos, Generative AI in the Newsroom)
How Elon Musk Is Remaking Grok in His Image: “Grok’s rightward shift has occurred alongside Mr. Musk’s own frustrations with the chatbot’s replies. He wrote in July that ‘all AIs are trained on a mountain of woke’ information that is very difficult to remove after training.” (New York Times)
What happens to carefully crafted journalism when readers expect AI-generated, personalized stories created instantly? Semafor’s Gina Chua on how AI will upend the news.
Werewolf leaderboard: GPT-5 is the best at bluffing and manipulating the other AIs in Werewolf. (Foaster Labs)
From Star Wars insult to TikTok meme: “Clanker has become a go-to slur against A.I. on social media, led by Gen Z and Gen Alpha posters.” (Eli Tan, New York Times)
“It’s the return of the shoe-leather reporter, empowered by an AI partner. One gathers trust, information and relationships. The other assembles.” (David Cohn, LinkedIn)
A short clip of a Will Smith concert looks like a crappy AI fake – but it’s not: “The crowds were real, but the videos were manipulated: first by Will Smith’s team, and then without asking, by YouTube.” (Andy Baio, waxy.org)
Users of the German news site Süddeutsche who were shown a difficult quiz about AI-generated images afterwards trusted media less but visited the news site slightly more often – and now everyone’s hoping that quality journalism still has a chance. (Sarah Scire, Nieman Journalism Lab)
Less than nine seconds of watching TV: That’s the energy consumption Google reports for the “median Gemini Apps text prompt” in May 2025, which includes “all LLM models serving the Gemini app, including all supporting models for scoring, ranking, classification, and other prompt routing tasks” and accounts for idle machines and overhead.
Stories too good to be true, payment via PayPal: At least six publications have taken down AI-generated articles published under the name Margaux Blanchard. (Maya Yang, The Guardian)