I Tried Deepfake App Wombo (So You Don't Have To)
Let’s Talk About The Deepfake App Wombo
There's a lot of talk about deepfakes lately, especially since a series of Tom Cruise deepfakes appeared on TikTok that were visually indistinguishable from real Tom Cruise content.
Deepfakes are, according to Wikipedia, "synthetic media in which a person in an existing image or video is replaced with someone else's likeness ... leverag[ing] powerful techniques from machine learning and artificial intelligence to manipulate or generate visual and audio content with a high potential to deceive."
The creator, Chris Ume, a Belgian visual effects specialist, used a Tom Cruise impersonator and is said to have spent weeks working on the technology that allowed this impersonator to not just look like Tom Cruise, but to truly embody his image.
But deepfakes aren't just for celebs, and they don't necessarily require in-depth tech knowledge. In fact, there's a new app on the market that allows you to create your own singing deepfake.
Wombo, whose tagline is "make your selfies sing", is a fun app you can use to create and send weird, funny memes. But I can't help but wonder whether its uses in the future will be much darker — or if the worry over deepfakes is overblown.
I decided to try the Wombo deepfake app to see if it works and to learn about its potential.
After all, if you listened to conspiracy theories, you might think Wombo is about to destroy the very fabric of society.
What I discovered is that the issue is complicated ... but Wombo really isn't.
Wombo lets you take a photo, or use one from your camera roll, to make it appear like the subject is lip-syncing.
If you're using the free version, like I did, you can pick from a selection of songs such as "What is Love?" by Haddaway, "Never Gonna Give You Up" by Rick Astley, "Numa Numa" by O-Zone, and "Thriller" by Michael Jackson.
The premium plan unlocks more content on Wombo for $4.49 a month or $29.99 a year.
An article in The Independent explains that tools like Wombo "use machine learning to spot the parts of a face that need to be animated and move them in time with the music." It goes on to explain that the company isn't currently selling user data to make money, but instead charges for access to additional content on the app so people can make funny videos. Funny for now, at least.
When I first used the app, I immediately texted my best friend a ridiculous video of "her" lip-syncing to a song. We both laughed, but nobody would've thought it was actually a video of her. Unfortunately, it did not work with my cat, which is a real bummer because I was excited to see him sing.
Overall, the app was very fun to play around with. It took a hilarious and more than slightly creepy baby picture of me crying and made it sing along to "Witch Doctor."
However, questions are already being raised about the deepfakes this app could enable and the harm they might cause.
Deepfakes are, after all, already being used to commit crimes. In one noteworthy case in the UK, a scammer created a highly convincing deepfake of a CEO's voice, which compelled an employee to wire funds amounting to the US equivalent of almost $250,000 to the scammer.
So the concerns around deepfake tech are certainly justified.
Currently, the app only puts songs on a face, and the videos are often shared on TikTok or other social media platforms, which is innocent enough. The results aren't very high quality and look obviously edited and stretched, so there is little chance that anyone could mistake them for anything resembling reality.
But with the app's growing popularity and the speed at which technology is advancing, people may want a more realistic-looking video and it could potentially happen sooner than we realize.
There is also potential for using someone's image against them, such as making it sing along to an inappropriate song or, if the app's creators ever allow users to upload their own audio, making it appear that someone said something they never did.
This technology is not new. The same type of tech can be used to add motion to photos of people who have passed away, as with the Deep Nostalgia feature from the MyHeritage app, or in sports video games, so players can see their favorite athletes while gaming.
As the technology has become cheaper and more widely available, people are creating more and more deepfakes. There have already been scams in which companies lost money to audio deepfakes.
In a New York Times op-ed video (below), social media researcher Claire Wardle argues that deepfakes themselves aren’t the issue. In fact, she says, "The alarmist hype is possibly more dangerous than the technology itself."
This alarmist hype causes people to doubt what they see and disbelieve their own eyes.
Yes, a good deepfake may make it hard to determine whether what you're seeing is real. But the bigger threat right now is politicians and others using deepfakes as an excuse to deny things they have done, such as former president Trump denying that he ever said he grabbed women in that controversial Access Hollywood video.
We all saw the video and heard him say it with our own ears, but because deepfakes exist, his denial spreads uncertainty among his supporters over whether it was ever real, even though we know it was; Trump himself acknowledged it and apologized for it in 2016.
Wardle says that this technology is only getting cheaper, so more and more people are going to use it, and I can't help but wonder if apps like Wombo will exacerbate both problems: the deepfakes themselves and the very real paranoia around them.
Should apps like this become hyper-realistic, they would put deepfake technology in the hands of people who otherwise wouldn't have access to it.
But maybe that wouldn't be such a bad thing. We would all have to start looking more closely at the content we consume and think about whether it is authentic, and be forced to examine whether or not it could be propaganda for nefarious purposes.
Given that hyper-realistic face-changing apps, commonly used to "photoshop" people's selfies, are already available, this isn't a new problem. But it is a problem that is likely to keep presenting itself in new and varying forms.
Leeann Reed is a textile artist, poet, and writer who covers news, entertainment, and lifestyle.