Title: The ugly memes driving crypto sales
Author: Adam Aleksic, Financial Times
Translation: Peggy, BlockBeats
Reprint: Mars Finance
Editor’s note: When AI, algorithmic recommendations, and crypto speculation combine, online memes are being systematically “manufactured” to capture attention and money.
This article begins with a series of provocative posts that have gone viral on social platforms, revealing how these seemingly absurd trends serve the dissemination logic of crypto scams. It reminds us: when popularity is no longer naturally generated but deliberately engineered for profit, the internet becomes more chaotic and dangerous.
Below is the original text:
The author is known online as Etymology Nerd and is the author of “Algospeak: How Social Media Is Changing the Future of Language.”
This year, a dark and unsettling new side has emerged on Instagram Reels: aggressive memes are being systematically created to promote cryptocurrency scams—yet almost no one is seriously trying to remove them.
Since January this year, a series of bizarre, distorted characters has begun to spread on the platform. The phenomenon is closely tied to the widespread availability of AI tools and the loosening of hate-speech moderation on Meta's platforms.
Among them is "George Droyd," a robotic "reincarnation" of George Floyd created in April this year to promote a cryptocurrency called $FLOYDAI; and "Kirkinator," born in September, following the death of political commentator Charlie Kirk, to hype the $KIRKINATOR token. There is also a cast of recurring "supporting" characters, such as "Epstron" and "Diddytron," which parody Jeffrey Epstein and the rapper Sean Combs (also known as Diddy), respectively.
These accounts exist within the same narrative universe, often gaining traffic by catering to racist and anti-Semitic stereotypes, accumulating millions of views. Discriminatory language frequently appears in short videos, which repeatedly revolve around so-called “racial purification” plots.
The purpose of this disturbing content is singular: to generate engagement. The ultimate goal is to direct public attention toward so-called "meme coins," a type of cryptocurrency whose price theoretically rises as the associated meme spreads. Early meme coins (like $DOGE) leveraged existing internet culture, while characters like George Droyd are fabricated from scratch by crypto speculators.
The scheme usually begins on pump.fun, a platform that lets users easily register and trade digital tokens. Once developers create a token, they share it in trusted Telegram groups or X communities, where investors brainstorm ways to manufacture attention for the associated meme—what they call "mindshare." They then use AI to generate provocative videos, hoping the meme spreads virally and draws in "normies": retail investors unfamiliar with meme-coin culture. When the price rises, the core early investors often execute a "rug pull," selling off and cashing out.
In reality, only a few thousand people typically buy these tokens. But because the barrier to creating cryptocurrencies and posting AI-generated slop is so low, creators can easily repeat the cycle, profiting from "manufacturing cultural phenomena."
Meanwhile, these memes often take on a life of their own. Once other creators recognize their viral potential, they imitate and reproduce them for money or online fame. The "Kirkinator" and "George Droyd" characters have already been repeatedly used by internet celebrities with no connection to the original token creators.
But each reinterpretation continues to benefit the token promoters. In October, for example, a tweet about Kirkinator garnered 8 million views, causing the $KIRKINATOR token price to spike fivefold before falling back within days. For investors who sold at the peak, that profit was built on millions of X users watching a video claiming "George Droyd was killed by Kirkinator after stealing the Epstein files."
Unfortunately, the more sensational the video, the more easily it spreads. Violent and offensive imagery generates more comments and longer watch times, both of which algorithms reward. Token creators have learned to exploit this mechanism for profit. Even Instagram or X users who know nothing of these cryptocurrencies may find themselves repeatedly exposed to this deeply disturbing clickbait content.
We are caught in a vortex: lax regulation of crypto platforms, easily accessible AI tools, and social platforms that let aggressive memes proliferate—these elements compound one another.
As a scholar studying the evolution of internet language, I feel deeply uneasy: online trends are being artificially manufactured, with the sole purpose of manipulating us. We can no longer be sure that memes are “naturally generated”—they could be part of a profit chain at any moment.
Even when a meme is not directly created by token promoters, it is almost immediately co-opted by them. Every new cultural reference is quickly registered as a token on pump.fun and artificially pumped, solely to enrich a few individuals.
The ultimate result is that our connection to reality grows ever more tenuous. As more memes are invented or artificially amplified, internet users are forced to constantly question what they can still believe; and prolonged exposure to this repugnant discourse gradually normalizes it. The only way out is to fight to reclaim the internet and stop those who seek to poison it.