He Stole $8M With AI Music No One Ever Heard — And Got Away With It for 7 Years


For seven years, a man from North Carolina quietly ran one of the most elaborate streaming fraud operations ever uncovered — using AI-generated music and thousands of bots to siphon royalties away from legitimate artists.

When Michael Smith, 52, pleaded guilty on March 20, 2026, in federal court in New York, the case became one of the first criminal convictions tied to AI-driven fraud in the music industry.

The numbers are staggering. Over 661,000 fake streams generated per day. More than $1.2 million in royalties diverted annually. A total of $8,091,843 stolen across the scheme’s seven-year run — money that, under normal circumstances, would have flowed to real musicians.
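Those figures roughly reconcile under a blended payout of about half a cent per stream, a commonly cited ballpark rather than any platform's published rate:

```python
# Back-of-the-envelope check of the reported figures. The per-stream
# rate is an assumption for illustration; platforms do not publish a
# single per-stream price.
STREAMS_PER_DAY = 661_000
ASSUMED_RATE_PER_STREAM = 0.005  # USD, illustrative blended rate

annual_streams = STREAMS_PER_DAY * 365
annual_royalties = annual_streams * ASSUMED_RATE_PER_STREAM

print(f"{annual_streams:,} streams/year")   # 241,265,000 streams/year
print(f"~${annual_royalties:,.0f}/year")    # ~$1,206,325/year
```

At that assumed rate, 661,000 daily streams yield just over $1.2 million a year, which is consistent with the annual figure reported in the case.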

💡 Key Insight

This case marks a new threshold in AI misuse: not creative plagiarism or deepfake impersonation, but the systematic weaponization of generative AI against the economic infrastructure of music itself.

How the Scheme Worked

Smith’s operation was methodical. Between 2017 and 2024, he generated hundreds of thousands of tracks using generative AI tools and uploaded them to all major platforms — Spotify, Apple Music, Amazon Music, and YouTube Music. He then deployed thousands of bots across 1,040 accounts to simulate real listening behavior, carefully designed to evade the fraud detection systems each platform had in place.

The mechanics of royalty distribution made this attack possible. Streaming platforms pay per play, pooling subscription revenue and dividing it proportionally based on total streams. By flooding that pool with fake plays, Smith effectively redirected a share of every legitimate artist’s earnings to himself — without any of those artists ever knowing.
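The pro-rata mechanic can be sketched in a few lines (illustrative numbers, not any platform's actual formula): the pool is fixed, so every fake stream added to the denominator shrinks each legitimate artist's share.

```python
# Minimal sketch of pro-rata royalty pooling. The pool size and
# catalogs are invented for illustration.
def payout(pool_usd: float, streams: dict[str, int]) -> dict[str, float]:
    """Divide a fixed royalty pool proportionally by stream count."""
    total = sum(streams.values())
    return {artist: pool_usd * n / total for artist, n in streams.items()}

pool = 1_000_000.0
honest = {"artist_a": 600_000, "artist_b": 400_000}
with_fraud = {**honest, "bot_farm": 250_000}  # fake plays join the pool

print(payout(pool, honest))      # artist_a: 600000.0, artist_b: 400000.0
print(payout(pool, with_fraud))  # artist_a: 480000.0, bot_farm: 200000.0
```

In the fraudulent scenario the pool itself never grows; the bot farm's $200,000 comes entirely out of the two real artists' shares, which is why no individual victim ever sees a line item for the loss.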

As federal prosecutor Jay Clayton put it plainly: the songs were fake, the listeners were fake, but the millions of dollars taken were entirely real.

An Industry Already Under Pressure

Smith’s case isn’t happening in a vacuum. The broader streaming ecosystem has been struggling with AI-generated content at scale for years — and the numbers suggest the problem is far larger than any single bad actor.

| Platform | Scale of the Problem |
| --- | --- |
| Spotify | Removed 75 million spam tracks in a single year |
| Deezer | Estimates that 70% of AI-generated music streams on its platform are fraudulent |
| Apple Music | Pursuing systematic labeling of all AI-generated content |
| Suno / Udio | Facing active US lawsuits over large-scale AI music generation |

These figures point to a structural vulnerability: royalty pools were designed for a world where each track represented genuine human creative effort. AI drives the cost of production to near zero, which makes the economic incentive for manipulation enormous and the barriers to entry minimal.

→ What this means

The real harm in cases like this is diffuse and invisible. No single artist loses a catastrophic amount — but across millions of tracks and billions of plays, every fraudulent stream represents a fraction of a cent stolen from real musicians. The aggregate damage is massive; the individual injury is nearly impossible to detect or quantify.

What Comes Next for Smith — and the Industry

Smith faces sentencing on July 29, 2026. The charges carry a maximum penalty of five years in prison, three years of supervised release, and a $250,000 fine. He has agreed to repay the full amount taken.

For the music industry, this conviction is a signal rather than a solution. The platforms most exposed, those paying per stream to a long tail of micro-artists, will likely face mounting pressure to redesign their royalty models. Several major platforms are already exploring minimum stream thresholds a track must clear before it becomes eligible for payment, specifically to reduce the return on bot-driven schemes.
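The logic of such a floor can be sketched with invented catalogs (the 1,000-stream figure echoes the annual threshold Spotify announced for 2024; everything else here is hypothetical). A bot operation that spreads plays thinly across many AI tracks to evade detection loses payout eligibility, while a real catalog with concentrated listening keeps it:

```python
# Hypothetical minimum-stream payout floor. Catalogs are invented;
# the 1,000-stream floor mirrors Spotify's announced policy but is
# used here purely for illustration.
FLOOR = 1_000

def eligible(streams_per_track: list[int]) -> int:
    """Count only streams on tracks that clear the payout floor."""
    return sum(n for n in streams_per_track if n >= FLOOR)

bot_catalog = [700] * 950            # 665,000 plays spread thinly
real_catalog = [50_000, 9_000, 400]  # concentrated human listening

print(eligible(bot_catalog))   # 0 payout-eligible streams
print(eligible(real_catalog))  # 59,000 of 59,400 still count
```

The trade-off is visible even in the toy version: the floor zeroes out the thinly spread bot catalog, but it also clips the smallest legitimate tracks, which is why threshold proposals remain contested.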

The deeper challenge is detection. Smith’s operation ran uninterrupted for seven years before law enforcement caught up. The bots were sophisticated enough to mimic real listener behavior at scale. As the tools for generating music and simulating listeners both become more accessible, the gap between fraud and detection is likely to widen — unless platforms invest heavily in behavioral analysis that can distinguish synthetic attention from genuine engagement.
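One toy version of that kind of behavioral analysis (an illustrative heuristic, not any platform's real detection pipeline): bot farms tend to trigger plays at near-uniform intervals around the clock, while human listening is bursty and irregular, so the regularity of gaps between plays is one possible signal.

```python
# Illustrative heuristic: flag accounts whose gaps between plays are
# suspiciously regular. Threshold and sample data are invented.
import statistics

def looks_synthetic(play_gaps_sec: list[float],
                    cv_threshold: float = 0.2) -> bool:
    """Flag if the coefficient of variation of inter-play gaps is very low."""
    mean = statistics.mean(play_gaps_sec)
    cv = statistics.stdev(play_gaps_sec) / mean
    return cv < cv_threshold

bot = [181.0, 180.5, 180.2, 181.3, 180.8]       # metronomic ~3-minute gaps
human = [200.0, 45.0, 3600.0, 240.0, 12_000.0]  # bursty, irregular gaps

print(looks_synthetic(bot))    # True
print(looks_synthetic(human))  # False
```

Real systems would combine many such signals (device fingerprints, skip rates, session shapes) precisely because any single heuristic like this one is easy for a sophisticated operator to game, as Smith's seven-year run demonstrates.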

The music industry thought its biggest piracy battles were behind it. This case suggests the next wave won’t be about copying — it’ll be about fabrication.

Sarah
I cover enterprise technology, cloud infrastructure, and cybersecurity for UCStrategies. My focus is on how organizations adopt and integrate SaaS platforms, manage cloud migrations, and navigate the evolving threat landscape. Before joining UCStrategies, I spent six years reporting on enterprise IT transformations across Fortune 500 companies. I track the gap between what vendors promise and what actually ships — and what that means for the teams deploying it. Expertise: Enterprise Software, Cloud Computing, SaaS Platforms, Cybersecurity, IT Infrastructure, Digital Transformation.