What would have happened if the UK went to war with Russia in the 1990s? Or if the Queen died in 2001? TikTokers are imagining the answers with the help of AI
It’s 1999. You’re living in the UK county of Morthamshire, where nine children, aged between six and 13, have been abducted from the Rosewell estate in west Castlegate. In response, the government has launched its ‘Say No To Those You Don’t’ campaign, complete with an accompanying public information film. The police say they’re struggling to find any evidence relating to the disappearances at all.
Except, none of this happened. The ‘Rosewell Court Nine’ don’t exist. ‘Morthamshire’ isn’t even a real place. But you’d be forgiven for thinking so, after watching some of the videos created by 19-year-old Arran Clark, the brains behind the @morthamshirecounty TikTok account.
Using a mixture of video editing software and AI, Clark has created a series of clips which at first glance appear to be television broadcasts from the late 1990s and early 2000s. They primarily focus on big, fictional news events in the made-up county of Morthamshire: there’s a clip of the BBC reporting on a plane crash in ‘west Halesford’, for example, or an advert for a Channel 4 documentary on the ongoing spread of the mysterious OCS infection. There’s impressive attention to detail: many of the clips cross-reference one another, with Clark crafting an entire alternate universe with a coherent timeline.
It’s clear that people love these alternate reality and ‘analogue horror’ videos. Clark has amassed over 23,000 followers on TikTok, and his most popular video (the ‘Say No To Those You Don’t’ public information film) has nearly one million views. Similar TikTok accounts are also popular, like @MocksEAS, which has 30,000 followers. Using TikTok’s carousel feature, @MocksEAS posts doctored TV stills from the 90s and 00s depicting fake historical events. One imagines Tony Blair announcing that the UK is at war with Russia; another shows the BBC announcing the death of the Queen in 2001.
Most viewers of this type of content are Gen Z, and Clark thinks that his audience’s nostalgia for this half-remembered period of time is a large driver behind the genre’s popularity. “It could be that their interest stems from a sentimentality for media from the past,” he suggests. Young people’s fascination with media and aesthetics from the time of their childhoods is well documented: take the interest in Frutiger Aero, the ongoing revival of digital cameras, vinyl, and dumb phones, or the way compilations of 2000s adverts frequently go viral on TikTok. With this in mind, it’s unsurprising that Clark’s videos, which draw on 90s and 00s media, have captured people’s attention.
Clark adds that nostalgia is part of the reason why he’s interested in creating these videos in the first place, too. “Being 19, the ‘look’ of 2000s UK TV is one of my earliest memories,” he says. “Unsettling public information films or big events that were broadcast to the nation had a huge impact on me, and it’s likely the same for many others who take an interest in the alternate history or analogue horror genres.”
Specifically, Clark drew inspiration from the 2003 BBC pseudo-documentary The Day Britain Stopped, which depicted a fictional disaster in which the UK’s transport system failed with catastrophic consequences. “It wasn’t real; however, its sheer believability was fascinating and made me think: what if a nationwide crisis of that scale genuinely happened? How would it unfold from our perspective as a TV viewer and member of the general public?”, Clark explains. “This is why I tend to focus a lot of my efforts on authenticity and believability, as well as the narrative.”
Clark’s viewers are often quick to point out in the comments when his videos don’t seem ‘real’ enough. One person points out that Clark has used the modern iterations of the government and NHS logos in a fake public service announcement which is meant to be from 2004; another notes that he has misspelt ‘brief’ on a mocked-up Ceefax page. But a handful of viewers genuinely believe the videos are real. “I’m gonna pretend I didn’t just google the case and found nothing,” reads one comment under one of Clark’s videos about the Rosewell estate child abductions. “wait what THIS ISNT REAL??”, says another. “I have already had a multitude of comments questioning whether what I make is real or not and there is an inherent risk in that, especially in a world where ‘fake news’ is so problematic,” Clark says.
To some, this could seem like an alarming example of the issues that deepfakes and AI pose to our society. TikTok has proved to be a breeding ground for misinformation, with users often flicking through content too quickly to pause and question whether what they’re watching or reading is real. And with tech getting slicker and more intelligent with each passing second, could we end up in a position where edited videos and images are indistinguishable from authentic ones anyway? Could we get to a stage where historians in the year 3023 are left scratching their heads, struggling to figure out if Britain did go to war with Russia in the 90s or if the millennium bug did actually happen?
Well, given that part of a historian’s job is to put information in context and assess the validity of a source’s provenance, it’s incredibly unlikely. “I don’t think we ought to be too worried about these videos being taken too seriously. It’s pretty clear from context and hashtags that these videos are Y2K creepypasta, not historical records,” says Dr Joshua Habgood-Coote, a philosopher at the University of Leeds. “There’s something approaching a moral panic going on at the moment around the possibility that ultra-realistic deepfakes will make us lose contact with reality.”
Plus, according to Dr Habgood-Coote, this kind of “playful and creative use” of photo and video editing tools has a long history anyway. In the 19th century, photographer William Mumler claimed to be able to capture ghosts on camera, before a trial in 1869 outed him as a fraud. “Who, henceforth, can trust the accuracy of a photograph?”, wrote one alarmed newspaper columnist at the time. There was also the Cottingley Fairies hoax of 1917, when two young girls in West Yorkshire used cardboard cutouts copied from children’s books to make it seem as though they had definitive proof that fairies were real.
Essentially, as manipulating media is nothing new, the level of panic surrounding pretty benign instances of photo and video editing (like these alternate reality TikTok videos) is unjustified. “This isn’t to say that there aren’t some uses of AI-assisted manipulation which we shouldn’t be concerned about,” Dr Habgood-Coote adds, citing deepfake porn as one example, “but that we’ve got the scope and nature of the problems wrong.”
Clark’s videos certainly are believable, but he’s only ever trying to get people to ‘suspend their disbelief’, rather than hoodwink them completely. “I make it clear the series is a work of fiction and a pastiche of British broadcasting, and I completely avoid any narrative that may disrepute any individual or company,” he says. Similarly, the @MocksEAS account adds a disclaimer to every single video to let viewers know it’s all “fake”. Clearly, accounts like @morthamshirecounty and @MocksEAS aren’t worth panicking about. It would be far more worthwhile to treat their creations as satirical art – not entirely dissimilar to the work of Charlie Brooker. It’s fine to suspend your disbelief for 30 seconds, briefly imagine that you live in a world where the OCS virus is spreading or aliens have invaded Manchester or Eurovision was cancelled due to a cyber attack, and scroll on.