It’s almost too good to be true. A doctor you’ve seen on TV for decades advocating on social media for a revolutionary new product that big pharma prays you don’t find out about and that could cure your ailments.

But all is not as it seems. Scammers are increasingly using AI technology to fake videos of famous TV doctors like Hilary Jones, Michael Mosley and Rangan Chatterjee to push their products to unsuspecting members of the public on social media.

A new report, published in the prestigious British Medical Journal (BMJ), has warned of the growing rise of so-called ‘deepfakes’. 

Deepfaking uses AI to map a digital likeness of a real-life human being onto a video of a body that isn’t theirs. 

They’ve been used to create videos of politicians to make them seem inept and even for corporate heists and now they’re being used to sell you dodgy ‘cures’. 

Some of the clips unearthed in the BMJ investigation include videos of the late Dr Michael Mosley, who died last month in Greece, appearing to promote a diabetes cure on Facebook.

Another ‘reel’ video shared on the social network features a robotically voiced Dr Hilary Jones on ITV’s Lorraine as he appears to claim a blood pressure medication cures issues in eight out of 10 cases.

The videos are, of course, fake, and are not endorsed by anyone whose appearances and voices have been appropriated by fraudsters to sell dodgy counterfeit drugs.

The voice and likeness of the late Dr Michael Mosley has been cruelly exploited by scammers trying to sell fake diabetes drugs through deepfake videos

A modified clip of Dr Hilary Jones appearing on Lorraine appears to show him advocating a new type of blood pressure drug - but the clip is completely fake, made using AI

Dr Jones is just one TV physician caught up in the trend, with a deepfake video of him endorsing a blood pressure cure spreading on Facebook earlier this year.

And as Dr Jones himself knows, it’s far from the only example.

‘Some of the products that are currently being promoted using my name include those that claim to fix blood pressure and diabetes, along with hemp gummies with names like Via Hemp Gummies, Bouncy Nutrition, and Eco Health,’ he said.

The likenesses of Mail health guru Dr Mosley and Dr Chatterjee, of Doctor In The House fame, have also been used to generate such clips.

While the technology used to create deepfakes has been around for years, early versions were flawed, often making mistakes with ears and fingers or failing to match audio to a subject’s lip movements, which alerted people to their fraudulent nature.

But it has since made massive strides, and though research is limited, data suggests that up to half of people struggle to tell deepfakes apart from the real thing.

Retired medic John Cormack, who worked with the BMJ on the report, described scammers latching onto the reputation of respected doctors to hawk their products as ‘printing money’. 

‘The bottom line is, it’s much cheaper to spend your cash on making videos than it is on doing research and coming up with new products and getting them to market in the conventional way,’ he said. 

Regulators also seem powerless to stop the trend.

Practising doctors in the UK must be registered with the General Medical Council, which can suspend a medic from working, or even strike them off entirely, if they are found to have breached the standards expected of medical professionals.

But the regulator has no power to act on fake videos of doctors, and while impersonating a doctor is a crime in the UK, the murky world of the internet makes tracking down who to hold to account almost impossible, especially if they are based overseas.

Instead, medics like Dr Jones say it’s the social media giants that host this content, and ultimately make money by doing so, that need to take action.

‘It’s down to the likes of Meta, the company that owns Facebook and Instagram, to stop this happening,’ he said.

‘But they’ve got no interest in doing so while they’re making money.’

Dr Rangan Chatterjee of BBC documentary Doctor In The House fame has also been caught up in the trend

Responding to the BMJ report, a Meta spokesperson said: ‘We will be investigating the examples highlighted by The BMJ.

‘We don’t permit content that intentionally deceives or seeks to defraud others, and we’re constantly working to improve detection and enforcement. 

‘We encourage anyone who sees content that might violate our policies to report it so we can investigate and act.’

At the moment, medics like Dr Jones have to take matters into their own hands.

Dr Jones, a frequent guest on shows like ITV’s Lorraine, employs a company to track deepfakes featuring him and try to purge them from the internet.

But he added the scale of the problem only appears to be getting worse.  

‘There’s been a big increase in this kind of activity,’ he said.

‘Even if they’re taken down, they just pop up the next day under a different name.’ 

The report concludes that what makes deepfakes so insidious is that they play on people’s trust, tapping into a familiar face that has offered good, and sometimes life-changing, health advice in the past to con worried patients.

Outside of the world of medicine, budgeting expert Martin Lewis’ likeness has been used by scammers to advocate for dodgy investments – prompting Mr Lewis himself to tell people not to be taken in. 

People who see a video they suspect of being a deepfake are advised to first examine it carefully to avoid any ‘boy who cried wolf’ scenarios. 

If they are still suspicious, they should try to contact the person the video claims to feature independently, through a verified account, for example.

If doubts remain, they can consider leaving a comment questioning its veracity, in the hope of prompting others to take that same extra step of analysis.

People can also use social media’s in-built reporting tools to flag both the video and the person who posted it in a bid to get it removed.
