This gets a little technical, but with AI’s help it’s straightforward. Here’s how it works: a Chrome plugin grabs content from the CMS text fields, sends it to ChatGPT, Gemini, or Claude with various preset prompts, and displays the response right in the browser.
For example: a check against your house style, a plausibility check, or suggestions for headlines, SEO lines, and social cards.
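In code, such preset prompts are just a map from a menu label to an instruction. A minimal sketch — the prompt texts, model name, and OpenAI-style endpoint are illustrative placeholders, not anything from an actual newsroom plugin:

```javascript
// Preset prompts the plugin offers in its menu (illustrative examples).
const PRESET_PROMPTS = {
  houseStyle: "Check the following text against our house style and list violations.",
  plausibility: "Check the following text for implausible claims, numbers, and dates.",
  headlines: "Suggest five headlines, an SEO line, and a social media teaser for this text.",
};

// Build the request body for an OpenAI-style chat completions API.
// The model name is an assumption; swap in whatever your provider offers.
function buildChatRequest(presetKey, articleText) {
  return {
    model: "gpt-4o-mini",
    messages: [
      { role: "system", content: PRESET_PROMPTS[presetKey] },
      { role: "user", content: articleText },
    ],
  };
}

// In the plugin, the content script would send this with fetch, roughly:
// fetch("https://api.openai.com/v1/chat/completions", {
//   method: "POST",
//   headers: { "Content-Type": "application/json", Authorization: `Bearer ${apiKey}` },
//   body: JSON.stringify(buildChatRequest("houseStyle", cmsText)),
// }).then(r => r.json()).then(showResultInOverlay);
```

Keeping the prompts in one map like this also makes the later question of where prompts live easy to answer in code.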
A Chrome plugin is basically just a few scripts distributed as a ZIP file. I built one for SPIEGEL, my AI colleague David Bauer built one for Republik. David’s Sidekick is available to try in English if you have an API key.
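That ZIP is mostly a manifest plus the scripts it references. A minimal Manifest V3 sketch — the CMS URL pattern, API host, and file names are placeholders you'd adapt:

```json
{
  "manifest_version": 3,
  "name": "CMS Copilot",
  "version": "0.1",
  "permissions": ["storage"],
  "host_permissions": ["https://api.openai.com/*"],
  "content_scripts": [
    {
      "matches": ["https://cms.example.com/*"],
      "js": ["content.js"]
    }
  ],
  "options_page": "options.html"
}
```

Load the folder via "Load unpacked" in Chrome's developer mode, or zip it up for the Web Store.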

If you want to build your own CMS copilot, just ask an AI and let it walk you through step by step. Things to consider:
- Who will use the plugin, and on Mac or Windows? How does it get into Chrome? Normally you submit the plugin to Google, it gets reviewed, and it's added to the official directory, the Chrome Web Store. You can also keep your plugin unlisted there.
  For testing and for a handful of users, that's too much effort. Instead, you can switch Chrome into "developer mode" at chrome://extensions/ (toggle in the top right corner) and load a plugin directly from your own computer, no review required.
  My experience: on a company Mac, developer mode works. On a Windows machine, corporate IT can block custom plugins. In that case, the solution is to upload the plugin to the Chrome Web Store and have IT manually whitelist it.
- How do you distribute the API keys? The Chrome plugin sends requests to an LLM via an API and needs to authenticate there, with an API key. Don't build that key directly into the plugin: anyone who extracts it could run up costs on your account.
  So: the plugin needs a settings window where each user enters their own API key. You create these keys with the LLM providers, ideally one per person, with an expiration date and a cost limit. That reduces the risk.
- Where do the prompts live? Should plugin users be allowed to view and edit the prompts? Should they be able to use their own?
  If you upload the plugin to the Chrome Web Store, the default prompts are stored inside it. Do they contain internal information that shouldn't end up semi-public?
  Also consider whether the API call goes directly to the LLM provider or through an intermediate step. You can set up Assistants / CustomGPTs, for example, so the plugin sends its request to an assistant and the prompt lives there instead of in the plugin.
- How angry will IT be? We can not only read text fields from the CMS; we can also have the plugin automatically overwrite those fields with results, or even remote-control the CMS and change additional settings.
  That can cause trouble, because we're then interfering with the CMS using JavaScript, and the CMS may have its own scripts monitoring and validating text inputs. It helps to talk to IT early and, when in doubt, to stay defensive.
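For the write-back case, one defensive habit helps: when the plugin sets a field's value, also fire the events the CMS's own scripts typically listen for, so its validation sees the change instead of being silently bypassed. A sketch — the selector is a hypothetical example, not a real CMS field:

```javascript
// Write a result back into a CMS text field without silently bypassing
// the CMS's own listeners: set the value, then dispatch the events that
// frameworks and validation scripts usually subscribe to.
function setFieldValue(field, text) {
  field.value = text;
  field.dispatchEvent(new Event("input", { bubbles: true }));
  field.dispatchEvent(new Event("change", { bubbles: true }));
}

// Usage in a content script (selector is an assumption about your CMS):
// const teaserField = document.querySelector("#teaser-text");
// setFieldValue(teaserField, llmResponse);
```

Whether this is enough depends on the CMS; some frameworks track state internally, which is exactly why the conversation with IT comes first.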
Before you distribute your own software within a company, you should have IT on your side. Obviously.
If you want to add anything, there’s a LinkedIn post.