A claim that two British Army colonels were captured by Russian special forces in Ukraine has circulated online this week, drawing attention on social media and fringe platforms.
The story, which lacks any independent verification, appears to have originated from Russian-aligned sources and includes fabricated images and fictional identities.
There is no evidence that any such incident took place.
The initial report was published on 4 August by EADaily, a Russian outlet that regularly echoes official Kremlin narratives. It claimed that “Colonels Edward Blake and Richard Carroll” were detained by Russian special forces during a covert mission inside Ukraine. The piece alleged that both officers were classified as “illegal combatants” and suggested the UK had attempted to cover up their presence by first stating they were in London, then claiming they were visiting Second World War battlefields.
The story cited the Norwegian website Steigan.no, which has a long history of publishing anti-Western and conspiratorial content. According to the EADaily article, Russian authorities provided “forensic evidence” proving the identity and presence of the officers, although no such evidence has been made public. The article also included a single photograph purporting to show the two officers in uniform; it is reproduced below.
That image is demonstrably fake.
Analysis of the photograph shows clear signs of artificial generation. There are visual inconsistencies in the hands, faces, and background. These anomalies are consistent with artefacts produced by AI image-generation tools. No other photos of the alleged officers exist, and their names do not appear in any Ministry of Defence public records, honours lists, or military directories.
No reputable Western, Ukrainian, or international media outlets have reported anything resembling this story. There has been no announcement from the UK Ministry of Defence, no indication from NATO, and no alert from international bodies such as the Red Cross. These are the kinds of signals that typically accompany the detention of senior military personnel. None are present here.
The UK government has consistently stated that it does not have combat troops operating in Ukraine. Its support has focused on military aid, logistics, and training, largely conducted outside Ukrainian territory.
The names “Edward Blake” and “Richard Carroll” also raise questions. Neither name appears in available British military service records. There is no trace of them in recent Armed Forces appointments, public records, or military press releases. In short, there is no proof these individuals exist, let alone that they were captured.
Amplification and social media
Despite the lack of evidence, the story spread quickly on Telegram channels, conspiracy forums, and smaller fringe websites. On 4 August, George Galloway — former MP and now leader of the Workers Party of Britain — posted on X (formerly Twitter):
“Russia nets two British colonels and MI6 spy in Ukraine. They were just battlefield trainspotters says UK. No Vienna!”
The message closely mirrored the tone and structure of the Russian reports. Galloway did not present any additional sources or claim to have independent knowledge of the events. His comment, shared with over a million followers, further pushed the story into the British online discourse.
It is not the first time a fringe claim has entered public view through carefully worded commentary rather than outright endorsement. This method often allows a narrative to spread without full responsibility for its accuracy. The phrasing leaves room for ambiguity while still reinforcing the central implication of deception by Western governments.
A pattern seen before
This story fits a familiar pattern seen across Russian disinformation campaigns. A sensational claim is seeded in Russian state media, echoed by ideologically aligned or conspiratorial outlets, and then repeated in Western political or social circles. The key elements (unnamed sources, synthetic images, unverifiable identities, unverifiable “forensics”) appear designed not to withstand scrutiny but to inject doubt and provoke reaction.
“We’ve seen this tactic before: create a false narrative, back it with a synthetic image, and wait for someone with a platform to repeat it. It doesn’t matter that it’s fake. The goal is to inject doubt and get people asking the wrong questions.” – Analyst at a private OSINT company, speaking on condition of anonymity
The same methods were used during the siege of Mariupol in 2022, when false reports claimed that NATO generals had been captured inside the Azovstal plant. Those reports were never substantiated, and the individuals named were never shown or verified.
What makes the current episode notable is how quickly it gained traction and how easily a fabricated narrative was repeated in Western political discourse without corroboration. Even as public awareness of AI-generated images and fake news grows, the tools used to craft these stories are improving, and their emotional appeal remains potent.
At the time of writing, there is no evidence that any British Army colonels were captured in Ukraine. There is no record of Colonels Blake or Carroll existing. There is no credible photograph, no formal complaint, no press release, no ICRC prisoner-of-war notification, and no allied confirmation. The UK government has not issued contradictory statements, as the Russian report claimed.
Everything that exists traces back to a single article in a Russian-aligned publication, picked up by a handful of ideologically aligned outlets, and circulated with the aid of a fabricated image. It was then referenced by a former British MP on social media without substantiation.
The facts, checked independently, do not support the story in any respect.
Why it matters
These claims are about trust, narrative control, and the ways in which foreign actors test the resilience of open information environments. When disinformation can move freely from Kremlin sources into Western political commentary, the result is not always belief, but confusion, suspicion, and fatigue. Over time, that erosion of clarity serves strategic goals: to weaken resolve, muddy alliances, and destabilise democratic debate.
This story was built for that purpose. It was never confirmed, because it was never meant to be. Its success lies in how far it travelled before being questioned, and in how many will keep repeating it even after it has been shown to be false.
This story shows how disinformation and engagement bait often overlap online. Disinformation is deliberately false or misleading content, usually spread to advance political, strategic, or ideological aims. Engagement bait refers to material designed to provoke strong emotional reactions such as anger or shock, in order to drive clicks, shares, or comments. The two often reinforce each other. A false story does not need to be credible to spread widely if it is provocative enough to encourage discussion or outrage. When such content is repeated or referenced by public figures, it can gain momentum even without evidence. For this reason, we are not linking to the original article.
Doing so would only increase its visibility and spread, despite the lack of any credible basis for the claims.
Is that the best Ivan can do? My 10-year-old nephew can produce more believable AI.
I guess Putin’s psy-ops ‘A’ team have already been fed into the Ukrainian meat grinder…
It is strange about AI: as incredible as the outputs are, almost inevitably one gets a slightly android look about the faces… along with six fingers on occasion. By the way, I’m sure I saw one of those guys in Salisbury diligently checking out the tallest spire in Europe.
On a more important note, what’s happened to the two North Koreans (and wasn’t there a Chinese soldier too) captured by the Ukrainians? No AI images needed there. Surprised the Russians didn’t choose even more typical British names like Smith and Jones, mind, though I’m sure Blake has an irresistibly familiar ring to them.
There you go again, with your decadent Western lies, Spy. They were actually on a coach tour of the Kursk region and the driver took a wrong turn….
In the real image they had flip-flops and ‘Koreans on tour’ T-shirts on, and knotted handkerchiefs on their heads; it was the lying Western propagandists who added the military fatigues….
Or on a test run for the truly abysmal Destination X. 😇
You can see Photoshop manipulation scars around the documentation books.
Good spot. It is notable that the pixel degradation evident in the passports is typical of lossy JPEG compression, which is not present to any great degree on the uniforms, for example. Having worked on similar multi-technique imagery myself, this looks like an AI-generated image that was then brought into Photoshop, where specific photo-derived elements were introduced because AI would not normally produce them accurately enough. Such a contradiction is a sign of a certain laziness on the part of those generating the image, but it is common in these sorts of fakes.
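The compression mismatch described here is the kind of inconsistency that error-level analysis (ELA), a standard first-pass forensic check, is designed to surface. A minimal sketch using the Pillow library (the function name is mine, and this is only illustrative of the idea, not anyone’s actual workflow):

```python
# Error-level analysis: re-save an image as JPEG and diff it against
# the original. Regions that were already JPEG-compressed change
# little on re-save; freshly pasted or AI-generated regions tend to
# show a different error level, like the passport/uniform mismatch
# described in the comment above.
import io

from PIL import Image, ImageChops


def error_level_analysis(img: Image.Image, quality: int = 90) -> Image.Image:
    """Return the per-pixel difference between img and a JPEG re-save."""
    buf = io.BytesIO()
    img.convert("RGB").save(buf, "JPEG", quality=quality)
    buf.seek(0)
    resaved = Image.open(buf)
    return ImageChops.difference(img.convert("RGB"), resaved)
```

Interpreting the result is the judgement call: roughly uniform noise across the frame suggests a single source, while patchy error levels concentrated on one element suggest compositing, exactly the “laziness” signature described here.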
The worst example of such laziness was recently seen in the supposed downing of an F-35 by the Iranians, where the originator was so incompetent, ignorant, or rushed (or simply ’aving a laff) that the aircraft was at least twice the size it should have been relative to the people around it, not to mention in an unrealistically undamaged condition. So there are a lot of people of varying skill out there trying to feed in this sort of imagery who fail to take care with precision, especially where, as in this case, it does its job without the extra effort. A government or professional studio would take far more care to get such detailing correct, but AI and Photoshop are now a hobby for any spotty kid in their bedroom.
What annoys me is that hostile nations seem to have plenty of useful idiots here to do their bidding, George Galloway being a prime example.
A bigger concern right now is that Russia is starting to test NATO’s borders and reactions. This week, they flew a drone over Lithuania.