In the same amount of time it would take to toast a slice of bread, you could clone the voice of US President Joe Biden and share it on social media. You could have him mutter in his slow and gravelly voice: “I’ve always known COVID-19 was a hoax, it’s just useful to pretend it’s real,” then superimpose the audio on a photo of the president grinning, upload it to TikTok, YouTube and Facebook and wait.
A funny thing would happen: The first two sites would take your clip down. However, the biggest platform — the one with more than 3 billion users — would not. Facebook would slap a warning label on the clip, but leave it up for people to click through, listen to and share with others. That antiquated policy could prove disastrous in a divisive election year.
Several recent examples show how plausible that scenario is. In September last year, faked audio of a Slovak political leader “discussing” ways to buy votes was shared on Facebook within days of a closely fought national election. Parent company Meta Platforms Inc does not ban fake audio clips the way it takes down fake videos, so Facebook let the clip remain with a label saying it had been manipulated. Two days later, that party leader lost the election. It is impossible to know whether the clip swayed votes, but the country also had a 48-hour media blackout before the election, which meant there was no one to debunk the forgery before voting began.
Illustration: Kevin Sheu
In a world of misinformation, fake audio could have a more sinister effect than fake video. While fake “photos” of former US president Donald Trump have a glossy, plastic look that betrays the artificial intelligence (AI) machinery behind them, fake versions of his voice are harder to scrutinize and to distinguish from the real thing. AI-generated voices can also sound hyper-realistic thanks to a passel of new tools originally designed to help podcasters and marketers.
Companies such as Eleven Labs, Voice AI and Respeecher sell services that can synthesize actors’ voices so they can, for instance, read audiobooks in different languages; some require only a couple of minutes of recorded speech to clone a voice. Voice AI startups raised about US$1.6 billion from venture capital investors last year, market research firm Pitchbook’s data show. Growth in overall investment in these companies has slowed in the past two years, however, in part because larger companies such as Amazon.com Inc and OpenAI are taking in more of the business.
Some companies, such as Respeecher, have safeguards in place to prevent misuse or require permission from the people whose voices are cloned. However, that does not stop others from exploiting such tools anyway. For instance, someone recently cloned the voice of London Mayor Sadiq Khan and posted the faked audio clip to TikTok. In the clip, Khan’s “voice” could be heard saying that Armistice Day should be canceled in favor of a protest in support of Palestinians.
“Why don’t they have Remembrance Weekend next weekend?” his “voice” asks.
The audio caused outrage among Britons who believed the country’s remembrance commemorations should be respected, but Khan’s office said that the clip was being “circulated and amplified by a far-right group.” To his office’s likely dismay, it was reposted on Facebook and remains on the site, in at least one case without a warning label.
Another person generated a fake clip of UK Labour Party Leader Keir Starmer supposedly calling one of his team members a “bloody moron,” while a second forged clip had Starmer saying that he “hated Liverpool.” The posts were seen thousands of times on TikTok before being taken down. A rival Conservative politician encouraged the public to “ignore it.”
TikTok removed the London mayor’s clip, and a company spokeswoman said similar deceptive audio involving politicians would normally be taken down as it violates policy. YouTube also removed postings of the faked mayor’s voice; a company spokeswoman said that the site takes down “technically manipulated” content that could cause harm. X, formerly Twitter, has a similar rule, although it does not seem to enforce it — it has kept the mayor’s forgery up, for instance.
However, the stakes are higher with Facebook, which has roughly eight times as many monthly active users as X, making its leniency toward forged audio all the more bizarre.
A spokesman for Facebook said it labels fake audio of politicians and leaves it up on the platform “so people have accurate information when they encounter similar content across the Internet.” It is better to leave a clip up with a warning label, Facebook argues, so that when people see it on other sites such as X or Telegram, they already know it is inauthentic.
However, Facebook relies on stretched teams of fact-checkers to do such labeling.
“These things are spreading in real time over the Internet,” says Steve Nowottny, editor of the independent fact-checking charity Full Fact, which worked with Facebook to debunk the Khan and Starmer audio clips. It took them two days to check the Labour Party leader’s clip, he says.
One problem is that there are still no reliable technical tools for detecting fake AI audio, so Full Fact uses old-fashioned investigative techniques. In the case of the Starmer clip, Full Fact spoke to people in both the Labour and Conservative parties to confirm that the audio was fake.
However, its fact-checking team is made up of only 13 people. More broadly, there has also been a decline in the number of people at social media companies working on misinformation. Alphabet Inc, Meta and X have all pared back their trust and safety teams in the past two years to cut costs, and Meta also recently shuttered a project to build a fact-checking tool, CNBC said.
“I talked to a large group of fact-checkers and journalists from across Asia in November [last year], and almost everyone was seeing manipulated audio and wasn’t sure how to detect it,” says Sam Gregory, executive director of Witness, a human rights group focused on technology.
Even labeled misinformation could spread rapidly before the warning is properly understood. In moments of fast-paced information sharing, when emotions are running high, not every Facebook or Instagram user might fully comprehend the meaning of a label — or believe it.
Facebook’s policy of taking down only faked videos, and not faked audio, is outdated. As we head into what could be tumultuous national elections in the UK, India, the US and elsewhere, made all the messier by AI tools generating all kinds of media and information, the platform should start taking down deceptive audio as well.
Parmy Olson is a Bloomberg Opinion columnist covering technology. A former reporter for the Wall Street Journal and Forbes magazine, she is author of We Are Anonymous. This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.