US Presidential Candidate Impersonators

As generative AI adoption accelerates, combating information manipulation is paramount for election security.
Kevin Tian
September 25, 2023

As generative AI adoption accelerates, combating information manipulation is paramount for election security. The 2024 US Presidential Election is still over a year away, but Doppel has already detected 1,178 impersonator personas of the top candidates from both the Republican and Democratic parties. These personas span major social media platforms, including TikTok, YouTube, Instagram, Facebook, X (formerly Twitter), and Telegram. We’re publishing this blog post to raise awareness among voters and the security community of emerging election threats.


Since the last presidential election cycle in 2020, information manipulation attack vectors have increased considerably:

  • Generative AI enables bad actors to produce misleading content through deep fakes or orchestrate social engineering campaigns through malicious ChatGPT clones.
  • Account verification for sale makes it easier for bad actors to appear credible, blurring the line between authentic and fraudulent accounts.
  • New channels like TikTok and Telegram have emerged (TikTok crossed 1B MAU in 2021), where there is less security coverage by legacy solutions.
  • Quick consumption content, e.g. “reels” and “stories,” is fed to audiences by algorithmic, auto-scroll feeds. Consumers spend seconds on each piece without much discernment for who's behind it.

Prominent tech executives such as Sam Altman and Eric Schmidt have voiced deep concern about AI-driven misinformation affecting next year’s elections.

In response to these threats, the FBI and CISA released a PSA about foreign actors using information manipulation tactics for the 2022 midterm elections.

“These actors use publicly available and dark web media channels, online journals, messaging applications, spoofed websites, emails, text messages, and fake online personas on U.S. and foreign social media platforms to spread and amplify these false claims.”

Though this report does not verify whether these accounts were created by foreign actors (and likely not all were), that does not lessen their potential to influence electorate opinion.

Current Observations

Our systems have detected 1,178 fake online personas across social media platforms impersonating prominent candidates, including Joe Biden, Robert F. Kennedy, Donald Trump, Ron DeSantis, Vivek Ramaswamy, Nikki Haley, Mike Pence, Chris Christie, and Tim Scott. What’s most interesting is not the mere existence of these fake accounts, but the tactics they employ:

Squatting on Official Handles. Several flagged accounts squatted on handles commonly used by official campaign teams. For instance, this month Vivek Ramaswamy announced he’d establish a channel on TikTok to connect with younger audiences. However, someone had already claimed his standard handle, “vivekgramaswamy,” posting legitimate Vivek content and consistently appearing as the top search result for “Vivek Ramaswamy.” The account was taken down a day after his announcement.
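Handle squatting of this kind can often be surfaced automatically by comparing newly registered handles against known official ones. The sketch below is a minimal illustration using string similarity from Python’s standard library; the handle list, threshold, and function names are illustrative assumptions, not Doppel’s actual detection logic:

```python
from difflib import SequenceMatcher

# Illustrative watchlist of official campaign handles (assumption, not exhaustive)
OFFICIAL_HANDLES = ["vivekgramaswamy", "joebiden", "nikkihaley"]

def similarity(a: str, b: str) -> float:
    """Return a ratio in [0, 1] of how closely two handles match."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def flag_lookalikes(candidate: str, threshold: float = 0.8) -> list[str]:
    """Return the official handles a new account name closely resembles."""
    return [h for h in OFFICIAL_HANDLES if similarity(candidate, h) >= threshold]

# A squatter handle one character off the official one gets flagged:
print(flag_lookalikes("vivekgramaswamy1"))
```

In practice a production system would also check homoglyph substitutions (e.g. `0` for `o`) and run continuously as accounts are created, but the core idea is the same fuzzy comparison.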

Duplication of Official Profiles. Several flagged accounts copied official profiles, duplicating profile pictures and descriptions and re-posting official content. For example, this Mike Pence impersonator is almost indistinguishable from the official account to the human eye. A blue checkmark, purchased for $8/month, makes these accounts look even more legitimate.

Generative AI Content. The rise of generative AI tools has made it easier for bad actors to create realistic fake content, such as deepfake videos and images. For example, one Donald Trump account is circulating an AI-generated image of a 1981-era Donald Trump handing a young Tom Brady a football. AI-generated content can be used to paint narratives and influence opinions.

Positive to Negative Tone Shift. Many personas in our system initially represent themselves as official accounts, re-posting official communications in the voice of the candidate they’re imitating and building an audience dedicated to the candidate. However, we see a number of these accounts suddenly shift their tone to disparaging the candidate. One example is a Joe Biden impersonator pivoting from official press content to negative content about the candidate.

Duplicate Accounts with Copycat Content. Impersonators spawn duplicate accounts whose copycat content mirrors each other, flooding major platforms and squatting on handles relevant to the candidates. This theme is rampant across candidates and platforms, including for Nikki Haley on X.
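Copycat content across duplicate accounts tends to be near-identical, so simple text-similarity checks can cluster it. Here is a hedged sketch using Jaccard similarity over word trigrams; the threshold and tokenization are illustrative choices, not a description of Doppel’s system:

```python
def shingles(text: str, n: int = 3) -> set:
    """Return the set of lowercased word n-grams in a post."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a: str, b: str) -> float:
    """Jaccard similarity of two posts' shingle sets (0 = disjoint, 1 = identical)."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

def is_copycat(post_a: str, post_b: str, threshold: float = 0.6) -> bool:
    """Flag two posts as near-duplicates when their shingle overlap is high."""
    return jaccard(post_a, post_b) >= threshold
```

Pairwise comparison like this scales poorly across millions of posts; real pipelines typically use MinHash or similar sketching to find candidate pairs first, then confirm with a check like the one above.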

Phishing Scams. Some impersonators focus on profiting off their engagement, promoting sites that claim to fundraise for the official candidate. As in this Ron DeSantis example, these sites use the candidate’s name and likeness, and it’s difficult for the electorate to verify whether the funds will actually reach the candidate’s official team.

Call to Action

In light of these emerging threats, there are takeaways for both constituents and the security professionals combating them.

  • For constituents, be thoughtful about the information you consume online – verify its authenticity and consider the creator’s objective.
  • For election security professionals, monitoring these personas is critical to preventing manipulation of constituents, media, and internal team members. Accounts can shift activity quickly, and proactive measures can mitigate these risks.

As we move closer to the 2024 US Presidential Election, we believe impersonation activity will continue to grow. Preventing these threats from escalating is critical to preserving civic integrity.
