In the UK, MPs have discussed concerns that Russian deepfakes could influence local elections in May.
“We have seen them used extensively in elections around the world, so there is no reason to assume Britain would be an exception,” Vijay Rangarajan, the chief executive of the UK Electoral Commission, told lawmakers.
Britain’s Online Safety Act does not explicitly classify disinformation as a harm. It does oblige platforms to remove material they can prove to be foreign interference – but that process often takes too long in an online environment where videos can go viral within hours.
The posts are hard to trace to their origin, but Western researchers say many share common traits – from stylistic cues to distribution patterns – that link them to organised disinformation units aligned with the Kremlin.
One campaign, dubbed Matryoshka or Operation Overload, is believed to have orchestrated a wave of synthetic videos discrediting the Moldovan president, Maia Sandu, in the run-up to the country’s 2025 elections.
NewsGuard, an organisation that tracks online disinformation, said it spotted common patterns suggesting the same network was likely behind the video featuring Dr Read.
The operation’s name – “matryoshkas” are Russian nesting dolls – mirrors its method, which encases an original false claim in layer upon layer of re-posts from old or hacked social media accounts.
Unlike traditional Russian propaganda outlets such as RT and Sputnik, which the West swiftly sanctioned at the start of the full-scale invasion of Ukraine, such campaigns “allow for a level of… plausible deniability that complicates counter-influence efforts”, said Sophie Williams-Dunning, a cyber and tech researcher at the Royal United Services Institute think tank.
Researchers at Clemson University linked a separate network, branded Storm-1516 by Microsoft’s Threat Analysis Centre, to veterans of the Kremlin “troll factory” run by Yevgeny Prigozhin, the leader of the paramilitary Wagner group, before his death in 2023.
In an upcoming study seen by the BBC, the academics shared an example of the speed at which fake news travels on social media.
Each time they saw the Storm-1516 campaign put out a false narrative about Volodymyr Zelensky being “corrupt”, for example, that narrative came to account for roughly 7.5% of all discussion about the Ukrainian president on X over the following week.
“That is something any marketing company would be proud of,” said Darren L. Linvill, one of the paper’s authors.