Millions Followed This Pro-Trump Influencer — “She” Was AI Selling Adult Content


A glamorous pro-Trump influencer posing with world leaders, military aircraft and even Lionel Messi quickly captured the attention of social media. Within months, the account had attracted nearly one million followers. The problem? The influencer never existed.

Behind the viral persona known as Jessica Foster was not a political activist or a soldier — but a carefully crafted AI-generated character designed to funnel followers toward adult subscription content.

The story highlights how generative AI is rapidly reshaping online influence, blurring the line between authentic personalities and algorithmically created personas built purely for monetization.

The rise of an AI influencer that looked real enough to fool millions

The account appeared online in December 2025 and quickly gained traction across Instagram and X. The profile portrayed a patriotic young woman supporting U.S. President Donald Trump, often dressed in a military uniform and appearing alongside famous political or sports figures.

Images posted by the account showed the supposed influencer in striking scenarios: standing on airport runways beside fighter jets, attending diplomatic meetings, or appearing in photos alongside well-known figures such as Cristiano Ronaldo, Vladimir Putin, Volodymyr Zelensky, and even members of the Trump family.

One image that circulated widely online even showed her among guests during a White House reception for Lionel Messi and the Inter Miami team, who had recently celebrated their MLS championship.

For most viewers, the account looked authentic. The persona was polished, visually consistent, and aligned with a specific political identity — a formula that often performs well on social platforms.

The small clues that revealed the account was artificial

Not everyone was convinced. Some military veterans noticed subtle inconsistencies in the images posted by the account.

One recurring detail stood out: the uniform's name tape read “Jessica” rather than a surname, whereas real U.S. military name tapes display the wearer's last name. Other viewers pointed to slightly unnatural textures and lighting artifacts typical of AI-generated images.

These clues eventually revealed the truth: Jessica Foster was entirely generated by artificial intelligence.

The photos, the persona, and the narrative were all part of a synthetic identity built to appear credible enough to attract followers.

The real objective: driving traffic to an adult subscription page

The social media accounts were not engaged in political activism or satire. They were part of a monetization funnel.

The Instagram profile — along with a similar account on X — directed followers toward an OnlyFans page operating under the username @jessicanextdoor.

There, the fictional influencer sold fetish content to subscribers. Some followers reportedly paid more than $100 for individual posts, believing they were interacting with a real person.

The content strategy followed a common attention-economy playbook:

  • Create a highly shareable persona
  • Build credibility using political identity and patriotic imagery
  • Attract a large male audience
  • Redirect traffic toward paid adult content

Some of the social media posts were themselves suggestive enough to tease the type of material available on the subscription page, effectively serving as a marketing preview.

Why AI influencers are becoming a powerful monetization tool

The case illustrates a growing trend: AI-generated personas can scale influence faster than real creators.

With generative image models and AI-assisted storytelling tools, it has become relatively easy to fabricate a consistent online identity. These synthetic influencers can produce unlimited content without aging, scheduling constraints, or reputational risks.

For creators looking to monetize attention, the incentives are clear. AI allows them to:

  • Generate endless visual content
  • Adapt the persona to trending narratives
  • Maintain constant engagement
  • Run multiple identities simultaneously

In many cases, the audience never realizes the account is fictional.

Platform rules technically prohibit this strategy

Ironically, the monetization model used by the Jessica Foster account appears to violate several platform policies.

Subscription platforms like OnlyFans require profiles to be tied to a real, identifiable person, and AI-generated content is supposed to be clearly disclosed.

The profile itself even contained a playful line that now reads differently in hindsight:

“Government employee by day, troublemaker by night. I’m new here, please be nice. I answer every message — but be patient, I’m not a robot haha.”

The joke, it turns out, was unintentionally accurate.

A glimpse into the future of synthetic influence

The discovery of the fake influencer sparked intense debate online about the growing role of artificial intelligence in shaping digital culture.

Some critics argue that synthetic personas risk manipulating audiences and exploiting emotional or political identities for profit. Others warn that AI-generated characters could increasingly be used for misinformation, political influence, or large-scale marketing schemes.

What is certain is that the technology enabling these synthetic influencers is improving rapidly.

As generative AI tools continue to evolve, distinguishing between real creators and algorithmically generated personalities may become one of the defining challenges of the social internet.

In the emerging economy of attention, authenticity itself is becoming harder to verify — and far easier to manufacture.

Alex Morgan
I write about artificial intelligence as it shows up in real life — not in demos or press releases. I focus on how AI changes work, habits, and decision-making once it’s actually used inside tools, teams, and everyday workflows. Most of my reporting looks at second-order effects: what people stop doing, what gets automated quietly, and how responsibility shifts when software starts making decisions for us.