In the world of social media, “payment” doesn’t always mean a bank transfer. For an influencer whose views align with the narratives of the Internet Research Agency (IRA) or other Russian-linked troll-farm operations, the reward can instead come as algorithmic amplification that drives eyeballs, and with them ad revenue, merchandise sales, and affiliate income. In effect, amplification functions as indirect compensation. Several pillars make this possible: first, the outrage- and contrarian-friendly nature of ranking algorithms; second, the role of bot and troll farms in seeding engagement; third, the way increased viewership monetizes the creator; fourth, the institutional disincentives for platforms to crack down fully; and finally, the flip side: the sabotage of opposing voices.
1. Outrage and contrarian opinions are algorithm-friendly.
Social-media platforms increasingly operate on engagement-based ranking: content that provokes reactions, especially strong emotions such as anger or outrage, tends to get prioritized. A Tulane University study found that users are more likely to engage (comment, react) with content that challenges their views, what the authors call the “confrontation effect.” Meanwhile, Yale researchers observed that users whose posts drew more “likes” and “shares” when expressing outrage went on to express more outrage over time. More generally, commentary on the “algorithm of outrage” describes how social-media systems favor content designed to trigger immediate reactions, not calm reflection. Thus, an influencer who consistently takes contrarian, provocative stances, especially ones aligned with pro-Russian or anti-Ukrainian narratives, has an inherent algorithmic advantage: their posts are more likely to evoke comments, shares, and quick reactions, helping them ride the “virality wave.”
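To make that advantage concrete, here is a minimal Python sketch of engagement-weighted ranking. The scoring weights and field names are hypothetical, not any platform’s actual formula; the point is only that when shares and comments count most, content engineered to provoke them outranks calmer material.

```python
# Toy model of engagement-based ranking. Weights and fields are
# hypothetical; real platform formulas are proprietary and more complex.

from dataclasses import dataclass

@dataclass
class Post:
    views: int
    likes: int
    shares: int
    comments: int  # angry replies count the same as supportive ones

def engagement_score(post: Post) -> float:
    """Weighted interactions per view; shares and comments, the
    reactions outrage provokes most reliably, weigh the heaviest."""
    interactions = 1.0 * post.likes + 3.0 * post.shares + 4.0 * post.comments
    return interactions / max(post.views, 1)

calm = Post(views=10_000, likes=400, shares=20, comments=30)
contrarian = Post(views=10_000, likes=250, shares=90, comments=400)

print(f"calm:       {engagement_score(calm):.3f}")        # 0.058
print(f"contrarian: {engagement_score(contrarian):.3f}")  # 0.212
```

In this toy model the contrarian post scores roughly four times higher per view, purely because it provokes replies and shares.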
2. Bots and troll farms reward pro-Russian views.
This algorithmic predisposition is then amplified by coordinated efforts from troll farms and bot networks. The IRA is the most infamous example: its mandate included creating mass comments and posts to drive traffic, generate clicks, and manipulate discussion. For instance, a ProPublica investigation found a Russian-linked troll farm operating dozens of English-language accounts that spread anti-Ukraine propaganda. These networks produce artificial engagement (likes, shares, comments) designed to provoke algorithmic promotion of target content. The effect: when an influencer echoes pro-Russian viewpoints, even implicitly, the troll farm can boost their content at the critical early stage, triggering platform ranking systems to expose it to wider audiences. In effect, the “payment” is access to amplification.
3. This method drives actual eyeballs to the influencer’s content.
Once the content gets algorithmic lift, it reaches real, organic audiences. Because social platforms reward velocity (how fast a post gains traction) and volume of engagement, artificial early metrics supply the initial stimulus. The algorithm then propagates the content into feeds, recommendations, and suggested videos or posts. Studies show that popularity-based ranking amplifies emotionally charged content not merely because the engagement is artificial, but because the system is designed to reward it. As more real users engage (whether positively or via outrage and argument), the influencer’s visibility spikes. That increased visibility translates into larger subscriber counts, higher view counts, more clicks, more “watch time,” and more opportunity to monetize via ad revenue, sponsorships, affiliate links, and merchandise.
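The compounding effect of early seeding can be illustrated with a toy simulation. Everything below is assumed for illustration: the thresholds, growth rates, and doubling behavior stand in for recommendation logic that platforms do not disclose.

```python
# Sketch of a velocity-based promotion loop seeded with artificial
# engagement. Every number here (thresholds, rates, boost factor) is
# invented for illustration and stands in for opaque platform logic.

def simulate_reach(seed_engagement: int, hours: int = 6,
                   organic_rate: float = 0.05,
                   promote_threshold: int = 100,
                   boost_factor: float = 2.0) -> int:
    reach = 1_000                                  # initial feed exposure
    engagement = seed_engagement                   # bot-supplied head start
    for _ in range(hours):
        engagement += int(reach * organic_rate)    # real users reacting
        if engagement > promote_threshold:         # the "velocity" check
            reach = int(reach * boost_factor)      # pushed into more feeds
            promote_threshold *= 2                 # each tier is harder
    return reach

print("organic start:", simulate_reach(seed_engagement=0))    # 16000
print("bot-seeded:   ", simulate_reach(seed_engagement=500))  # 64000
```

With identical content and identical organic response, the bot-seeded post ends the window with four times the reach, because it clears each promotion tier one cycle earlier.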
4. The influencer monetizes: ad revenue, affiliate sales, merch.
So what does “payment” look like for the influencer? First, standard platform monetization: higher views = more ads shown = higher earnings; a constantly growing audience allows the influencer to command higher rates for sponsored content or brand deals. Second, affiliate marketing or merchandise: the traffic spike gives them a bigger “funnel” of followers or viewers to convert into buyers. Third, cross-platform growth: amplified posts bring attention on YouTube, X/Twitter, Instagram, TikTok, each with its own monetization model. The troll farm doesn’t hand the influencer a bag of cash, but it artificially boosts their key performance indicators (KPIs), which the influencer then converts into revenue. In other words, the troll-farm infrastructure is a covert subsidy of visibility and thus a form of payment.
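A back-of-the-envelope calculation shows how that KPI boost becomes money. The RPM, affiliate conversion rate, and commission below are invented placeholders; actual figures vary widely by platform and niche.

```python
# Back-of-the-envelope conversion of boosted views into creator income.
# RPM, conversion rate, and commission are illustrative placeholders,
# not figures from any real platform, network, or creator.

def creator_revenue(views: int, rpm_usd: float = 3.0,
                    conversion_rate: float = 0.002,
                    commission_usd: float = 8.0) -> float:
    ad_revenue = views / 1_000 * rpm_usd               # RPM: $ per 1,000 views
    affiliate = views * conversion_rate * commission_usd
    return ad_revenue + affiliate

baseline = creator_revenue(50_000)    # a typical organic month
boosted = creator_revenue(400_000)    # after coordinated amplification
print(f"baseline: ${baseline:,.2f}   boosted: ${boosted:,.2f}")
# baseline: $950.00   boosted: $7,600.00
```

The point is structural: because every revenue channel scales with views, whoever controls the early amplification effectively controls the size of the paycheck.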
5. Platforms like Meta Platforms and Alphabet Inc. have little incentive to root out this transfer.
(a) The platforms’ business models reward engagement above all else: more clicks, more ad impressions, more revenue. Whether content is organic or boosted via troll networks, the platform collects ad dollars. Content that goes viral, even if manipulation aided it, is still monetized. For many platforms, high engagement may even obscure the manipulation behind it.

(b) Advertisers often focus on sales and performance rather than underlying narrative alignment. If an influencer’s traffic numbers are good and conversions occur (sales, subscriptions, etc.), the advertiser may not care whether the audience was grown organically or assisted algorithmically by manipulative networks. Together these dynamics give platforms a weak incentive to crack down hard and a strong incentive to keep engagement high.
6. Trolls and bots also sabotage opposing voices.
The same machinery can work in reverse: if a content creator takes a pro-Ukrainian line, Russian-linked troll networks may flag, spam, mass-report, or drown out their comment sections, abuse their posts, and engage in coordinated trolling. This reduces visibility, discourages engagement (creators sometimes turn off comments altogether), and damages monetization potential. The creators’ KPIs stagnate, reducing their incentive to continue, while pro-Russian influencers benefit by comparison.
In sum: for a pro-Russian or anti-Ukrainian influencer, the payment is subtle but real. Bot and troll farms grant them algorithmic visibility, which fundamentally shifts their monetization curve. Platforms reward high-engagement content; influencers convert that engagement into dollars. Troll farms supply the artificial engagement needed to tilt algorithmic rankings; platforms continue business as usual. Meanwhile, opposing voices are suppressed or drowned out. The result is a “pay-for-visibility” model that avoids overt checks or cash but delivers the same effect.
Benjamin Cook continues to travel to, often lives in, and works in Ukraine, a connection spanning more than 14 years. He holds an MA in International Security and Conflict Studies from Dublin City University and has consulted with journalists and intelligence professionals on AI in drones, U.S. military technology, and open-source intelligence (OSINT) related to the war in Ukraine. He is co-founder of the nonprofit UAO, working in southern Ukraine. You can find Mr. Cook between Odesa, Ukraine; Charleston, South Carolina; and Tucson, Arizona.
This text is published with the permission of the author. First published here.