The New York Times reported on the Internet Research Agency (IRA), a firm believed to be funded by the Russian government, hiring “trolls” to post divisive content about American political issues to social media sites like Facebook, Twitter, YouTube, Reddit, and Tumblr in order to sow political discord. These efforts were also detailed in the Senate Intelligence Committee's bipartisan report. In 2018, Twitter published an extensive dataset of content posted by and associated with the IRA's activity on the platform, allowing researchers to study its behavior (Linvill and Warren; Im et al.; Cleary and Symantec).
While Twitter and Facebook most often draw media attention for these troll campaigns, in this post we describe how our tools helped us find a new, sophisticated YouTube political troll campaign with 8M+ views, one year out from the 2020 U.S. presidential election. Our tools aim to help detect similar campaigns run by networks of bot, troll, and propaganda accounts for OSINT (open source intelligence) purposes.
Plasticity is currently contracted to work with the U.S. Air Force to conduct research and build solutions for military intelligence analysts. The software described in this post is connected to that effort, but the data, analysis, and any conclusions reached are our own and do not represent the official views of the U.S. Air Force, the Department of Defense, or the U.S. federal government.
Upon discovering the disinformation campaign described in this post, we made a responsible disclosure to Google/YouTube so that they could review and suspend the accounts in accordance with their policies and put countermeasures in place to stop the campaign.
CNN broke the story on our report, covering it both online and on their TV network.
On October 30, 2017, YouTube released a statement on its findings from the 2016 election.
We have found evidence of a more significant campaign, with accounts linked to the Russia and Ukraine region, that is much larger than the one YouTube previously reported, drives higher engagement with users, and is thriving on the platform. Below, we show some of the data our intelligence tools collected.
Video thumbnails are doctored to show Democratic and Republican leadership in outrageous and violent scenarios, such as being locked in a jail cell or hanging from a noose. The headlines are also edited to say things like “HANG HIM UP!”, phrasing a reputable news organization would never use.
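As an illustration of how doctored thumbnails like these could be flagged automatically, here is a minimal sketch, not our actual tooling, that compares a suspect thumbnail against the known original using a perceptual hash from the third-party Pillow and imagehash libraries. The file names are hypothetical:

```python
# Minimal sketch: flag a thumbnail as a likely edited copy of an original.
# Assumes both images have already been downloaded locally.
from PIL import Image
import imagehash

def thumbnail_distance(original_path: str, suspect_path: str) -> int:
    """Hamming distance between perceptual hashes of two images.

    Near-identical images score ~0; an edited copy of an original
    usually scores a small nonzero distance, while unrelated images
    score much higher.
    """
    original_hash = imagehash.phash(Image.open(original_path))
    suspect_hash = imagehash.phash(Image.open(suspect_path))
    return original_hash - suspect_hash  # imagehash overloads '-' as Hamming distance

# Hypothetical usage (file names are placeholders):
# if 0 < thumbnail_distance("cnn_original.jpg", "troll_upload.jpg") <= 12:
#     print("Suspect thumbnail appears to be an edited copy of the original")
```

The threshold of 12 is an assumption that would need tuning against real data.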
YouTube Auto-generated Topic Channels are channels that are automatically created by YouTube's algorithms to collect videos on certain topics (TV series, people, events, etc.).
For example, even though CNN posts videos of all of their anchors to a single YouTube channel, YouTube automatically creates a separate, official-looking channel for each anchor's show, like The Lead with Jake Tapper, which aggregates videos of Jake Tapper.
According to YouTube, these channels help “boost your channel's search and discovery potential on YouTube.” We found videos from this campaign promoted on some of these topic channels. Anyone who subscribes to them will be pushed content from the campaign, effectively increasing its audience and reach.
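To illustrate how a topic channel can be watched for new campaign uploads, here is a minimal sketch against the public YouTube Data API v3 search endpoint; the API key and channel ID are placeholders, not values from our investigation, and this is not our actual tooling:

```python
# Minimal sketch: list the most recent videos surfaced on a YouTube channel
# (including an auto-generated topic channel) via the Data API v3.
import requests

API_KEY = "YOUR_API_KEY"          # placeholder; issued via the Google Cloud Console
CHANNEL_ID = "UC_EXAMPLE_TOPIC"   # placeholder topic-channel ID

def recent_uploads(channel_id: str, max_results: int = 25) -> list:
    """Fetch a channel's most recent videos, newest first."""
    resp = requests.get(
        "https://www.googleapis.com/youtube/v3/search",
        params={
            "part": "snippet",
            "channelId": channel_id,
            "order": "date",
            "type": "video",
            "maxResults": max_results,
            "key": API_KEY,
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("items", [])

for item in recent_uploads(CHANNEL_ID):
    snippet = item["snippet"]
    print(snippet["publishedAt"], snippet["channelTitle"], snippet["title"])
```

Polling a list of such channels on a schedule and diffing the results would be enough to see new campaign videos as they appear.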
While this content might appear obviously edited to savvy viewers, a proportion of users commenting on these videos do not question the content's source, and some even believe the videos are being uploaded by sources like CNN:
While the content posted by this campaign is certainly troubling and its outrageous tone suggests malicious intent, the anonymous nature of the internet makes it difficult to say anything definitive about the actors behind the campaign without access to the private metadata behind these accounts.
It appears to be a “professional” operation, given the number of hacked accounts the operators have access to, the amount of human effort involved, the automated tools likely developed to help create the doctored images, and the frequency at which content is uploaded (every hour or every few hours). Moreover, most of the channels do not appear to be monetized or to have any financial incentive for driving views.
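Upload cadence is one of the easier signals to check. As a minimal sketch, assuming a list of ISO-8601 publish timestamps such as the publishedAt fields collected above, the median gap between consecutive uploads shows whether a channel posts with the hourly regularity described here; the timestamps shown are hypothetical:

```python
# Minimal sketch: measure how regularly a channel uploads, given ISO-8601
# publish timestamps (e.g. the "publishedAt" fields from the Data API).
from datetime import datetime, timedelta

def median_upload_gap(published_at: list) -> timedelta:
    """Median time between consecutive uploads."""
    times = sorted(
        datetime.fromisoformat(ts.replace("Z", "+00:00")) for ts in published_at
    )
    gaps = sorted(later - earlier for earlier, later in zip(times, times[1:]))
    return gaps[len(gaps) // 2]

timestamps = [  # hypothetical data for illustration
    "2019-10-01T00:05:00Z", "2019-10-01T01:10:00Z",
    "2019-10-01T02:02:00Z", "2019-10-01T03:15:00Z",
]
gap = median_upload_gap(timestamps)
if gap <= timedelta(hours=2):
    print(f"Suspiciously regular cadence: one upload every {gap}")
```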
Regardless of the source, the content should be taken down as it is disinformation, and the campaign should be blocked from creating further accounts.