
AI doesn’t need to write the news when it dictates it

Text generators aren't coming for most newsroom jobs - yet. But the news media's dependency on search trends means AI already has significant sway over what journalists get assigned, and who gets to read it.

December 10 2022, 15:50

The robots are at it again. Like clockwork, a new leap in AI technology makes waves on social media. In the actual media, experts are called in to opine on whether this means the imminent extinction of this or that trade. Column inches are spent. Morning show airtime is reliably filled up. A few people are inspired to learn coding, and far fewer decide to go off-grid. And so the world goes.

This week, the two crafts on the chopping block were visual art and non-fiction writing - from academic essays to journalism. But while the threat to humans in the former is still just about hypothetical, the robotic grip on the latter has been tightening for some time.

Art has been dragged into the spotlight by the proliferation of frighteningly responsive engines like DALL-E and Midjourney, which can create illustrations from textual prompts. Some examples are genuinely arresting. Others are merely goofy: what if Harry Potter were written by Dostoevsky, what if Wes Anderson directed Game of Thrones or The Shining, and so on.

Much of the conversation this week has focused on the question of copyright. After all, AI can't create anything entirely original: it draws on an immense and ever-growing pool of imagery, remixing it to meet the user's request. (The degree to which human artists, too, "remix" the subtlest of influences is a different question.) These sources are usually neither acknowledged nor remunerated; surely this constitutes plagiarism. Should copyright law be redrawn to secure intellectual property? More importantly - cue the jingle - will the robots take over? Is the next Lucian Freud going to be a webcam? Can anyone - as advocates of the technology claim - now be an artist?

Leaving aside such trifling questions as "what is art" and "what makes a masterpiece, the artist or their audience", one material aspect of the change is clear. Far from democratising art, AI will knock the bottom out of the market by hitting the younger artists who are still refining their craft but already managing to make a living. It is infinitely cheaper and faster to ask an AI engine to produce an illustration for an article, for instance, than to trawl through sundry online portfolios, negotiate a price, and then discover that the image you see in your mind's eye can be very far from what the artist sees in theirs when you describe it verbally. But if no one will pay artists to produce the run-of-the-mill works that sustain them, most will never get to craft inspired masterpieces, either.

In journalism, if anything, the situation is considerably worse. In the past week, examples proliferated on Twitter of school essays, academic essays, and even news articles written by AI - mostly by ChatGPT. But here, the question of whether AI will replace journalists like yours truly is both outdated and misplaced. Outdated, because news organisations are already using AI to write some of their more routine coverage - take MittMedia in Sweden, who use robots to write their real estate and sports news. And misplaced, because even today, when we human journalists still form a majority in our profession, AI has settled further up the food chain: in many organisations, it’s already telling our editors to tell us what to write, and deciding who will read us. 

From my own experience, and from that of colleagues in too many other media to count, here is what a shift looked like in many newsrooms already half a decade ago. An editor with a particular knack for spotting rising search trends would be the first to sign on. They would check what people were posting and reposting on social media, but more importantly, they would look at what people were searching for on Google - the search engine in Search Engine Optimization, or SEO. Sometimes, they would make an educated guess as to what people would be searching for that morning: lottery results, for instance, or sports scores. More often, they would go on Google Trends to see whether the handy little graph showed searches rising or falling, and how steeply; then, guesstimate whether a positive trend would continue for another hour or two - enough for the duty news editor to sign off and assign the piece to a human journalist.
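That guesstimate - is the graph rising, how steeply, and is it worth a story - can be caricatured in a few lines of code. This is purely an illustrative sketch, not any newsroom's actual tooling: the readings, the threshold and the function names are all hypothetical, standing in for the eyeballing an editor does on a Google Trends chart.

```python
# Illustrative sketch (not any real newsroom's tooling): decide whether a
# search trend is worth assigning, given a short series of hourly
# interest readings on a 0-100 scale, as shown on Google Trends.

def trend_slope(readings):
    """Least-squares slope of interest readings over time (points per hour)."""
    n = len(readings)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(readings) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, readings))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var

def worth_assigning(readings, min_slope=5.0):
    """Guesstimate: assign a piece only if interest is still rising steeply.

    min_slope is an arbitrary, hypothetical editorial threshold.
    """
    return trend_slope(readings) >= min_slope

# A steeply rising trend gets a story; a fading one does not.
rising = [20, 35, 55, 80]   # hypothetical hourly readings
fading = [80, 60, 40, 30]
print(worth_assigning(rising))  # True
print(worth_assigning(fading))  # False
```

The real decision is fuzzier, of course - editors weigh the topic, the competition and the time of day - but the underlying logic is this crude: a number goes up fast enough, and a human gets an assignment.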

Other than breaking news and pre-planned features, most pieces put out that day would at least be examined through, if not instigated by, SEO - which is, of course, AI at every level: from Google itself to the third-party platforms we used to try and master it. The organisation I worked for is far from unique in this regard, and the process described above is hardly a trade secret (or I wouldn't be divulging it); most of the media websites you see at the top of your Google News results operate in a similar way, which is why you see them there.

Obviously, this isn’t quite the AI journalism that critics (and profiteers) like to imagine. Human editors make the crucial call, and it’s human journalists who write the stories, not text generators. Except in the occasional nadirs of click-baiting frenzy, even my old organisation encouraged journalists to riff off the trending searches and find another angle that would make their story stand out from the crowd. Nevertheless, there is a robot at the beginning of the process and a robot at the end: algorithms to tell you what other algorithms are showing people, and algorithms deciding whether your algorithmically driven writing deserves to be amplified above everything else.

An upside of the recent slump in online advertising revenues is that more and more publications are coming to see heavy dependency on clicks as unpredictable, unsustainable and damaging to the brand. They are now coming full circle back to subscription models, which necessarily require more thoughtful, less knee-jerk journalism - including my old employers, once roundly castigated for clickbaiting. But the algorithms are still there, and still not remotely as transparent as they ought to be. The recent Musk-instigated release of the so-called Twitter Files puts heavy emphasis on the biases and prejudices of human intervention. But letting algorithms rule the news is just as perilous. We need far more discussion, and perhaps even regulation, of how these algorithms work.