EXCLUSIVE
Artificial intelligence

Deepfake scams have arrived: Fake videos spread on Facebook, TikTok and YouTube

Deepfakes have circulated online for years, mostly as warnings. Now, the proliferation of advanced video manipulation technology has made them easy to produce.

Long feared, the deepfake scam has finally arrived on social media. 

Fake videos of celebrities hawking phony services have begun to gain some traction on major social media platforms like Facebook, TikTok and YouTube.

Last week, NBC News viewed more than 50 videos posted to those sites that featured computer-manipulated images and audio of well-known people, all of which appeared to have been created to scam viewers out of money.

Almost all of them centered on Elon Musk, with manipulated videos of several news and television personalities — including CBS News anchor Gayle King, former Fox News host Tucker Carlson and HBO host Bill Maher — falsely claiming Musk had invented a technologically advanced investment platform.

Most of the videos continued with a similarly deepfaked Musk, who encouraged viewers to invest their money in the nonexistent platform. Musk, the owner of X, formerly known as Twitter, has promoted some cryptocurrencies in the past, making him extremely popular with scammers who use his image for their own gain. There is no evidence Musk had anything to do with the videos.

In response to emailed questions about the videos, Musk responded: “Ugh, I can’t believe you sent me Facebook links.”

Deepfakes have circulated online for years, with one of the first going viral in 2018 when actor and director Jordan Peele teamed with BuzzFeed News to make a viral public service announcement in which Peele impersonated former President Barack Obama. At the time, the trickery required working with two computer programs, and it took 56 hours to finish processing.

[Image: A 3x3 grid of zoomed-in portions of Elon Musk's face and hands. Leila Register / NBC News]

Experts warn that such videos are no longer even the cutting edge of these scams: scammers are now using real-time deepfake programs to mimic celebrities on live video calls with potential victims.

It wasn’t fully clear how the cryptocurrency scams pitched in the videos were intended to work, or whether they were posted by the people who created the deepfakes. Some videos included links to now-defunct websites. One led to a typo-ridden website that made no mention of Musk but invited visitors to join a cryptocurrency investing club.

On TikTok, a user appeared to have reposted one of the deepfake interviews and, in keeping with the platform’s culture of remixing others’ videos, superimposed their own fake Elon Musk Gmail address on top of it. A TikTok spokesperson pointed to a policy requiring that realistic synthetic media be disclosed as such and said the company had removed the videos NBC News asked about.

Contacted by NBC News, the owner of that Gmail address claimed on Google Chat to be Musk himself. Following a common scammer tactic, he claimed that if he were sent cryptocurrency, he would turn around and send back twice as much later. Asked for evidence of his identity, the scammer sent a picture of a trading card of Musk.

Many of the videos had tens of thousands of views. Most were on Facebook, where some were removed after NBC News inquired about them. Others remain live but are now preceded with fact-checks rating them as false.

A Facebook spokesperson said that the company was tracking trends in AI-generated content and that it was against company policy to deceive users for money. 

A YouTube spokesperson said the videos NBC News inquired about violated the company’s policies. The videos have been removed.

Deepfakes — videos that use artificial intelligence to create believable but fake depictions of real people — have become significantly more common online in recent months. Some easily accessible websites specialize in deepfake pornography of celebrities, often without the consent of the people depicted. In May, Donald Trump Jr. went viral by tweeting a fan-made video that deepfaked the face of Florida Gov. Ron DeSantis onto the body of Michael Scott, Steve Carell’s character from the TV show “The Office.” 

The unchecked rise of deepfakes has led some experts to warn that the first “deepfake election” will arrive next year, when a substantial number of voters will see political disinformation videos online and not be able to tell with certainty whether they’re real.

The technology has become far more accessible to everyday users. Apps that can create moderately convincing deepfakes, often in real time, are available to anyone with a computer or a smartphone, said Subbarao Kambhampati, a professor of computer science at Arizona State University who has studied deepfake technology.

Because deepfake models need to be trained on footage of real people to successfully mimic them, it can be particularly easy to make deepfakes of someone famous, since the internet is already full of videos and audio of them.

“They basically worked on [the Obama deepfake] for, like, a couple of months,” Kambhampati said. “Now, we can use public domain tools with which you can do a reasonable job with much less resources, so it’s become a lot more feasible.” 

The videos also hint at the increasingly lucrative world of internet scams. Cybercrime has risen steadily in recent years, and it exploded during the pandemic. Last year, victims reported a record $10.2 billion in money lost to scams and other online crime to the FBI, up from $6.9 billion the year before.

Amy Nofziger, who leads fraud victim support for AARP’s Fraud Watch Network, said scammers have learned to use deepfakes to make established scams more convincing.

Fraudsters have long impersonated celebrities on social media, to the point that some of country music’s biggest stars recorded a public service announcement for Facebook warning people not to fall for fake accounts. Not only does that trend persist, but AARP believes many of the scammers are also using deepfakes in short recorded videos and live video calls with victims, using the voice or face of the celebrity they’re impersonating.

“We’ve had celebrity impostor scams for a while, but what the AI can do is really accelerate the level of sophistication and help the criminals pivot really quickly,” Nofziger said. “So if the victim is like, ‘I don’t really think it’s you; I need you to read me the headlines today,’ or something like that to try to prove some validity, the scammers can adapt quickly with their technology.”

In recent months, AARP members have fallen victim to fake versions of Tom Brady, Alicia Keys, Kevin Costner, Brendan Fraser, Carrie Underwood, Andrea Bocelli and members of the Korean pop band BTS, she said.

David Maimon, the director of Georgia State University’s Evidence-Based Cybersecurity Research Group, said he and his researchers have seen online tutorials for creating real-time deepfakes.

The tutorials teach would-be scammers to use two smartphones: one pointed at their own faces while they run live deepfake apps and another, held a few inches away, to video chat with victims.

Nofziger said that while it may seem implausible to most people that a celebrity would want to strike up a relationship with them on the internet, it can be hard for victims to fully contextualize what’s happening.

“I understand that it is very hard for someone who’s not in the moment of being victimized to think that this is basically the craziest thing I’ve ever heard,” Nofziger said. “But when you’re the victim, and you’re in the moment of it, and your celebrity crush is talking to you and wanting to be a part of your life, all of your cognitive thinking goes out the window.”