Boosted Content and Electoral Risk in Moldova
By Caroline Nichols, Nicholas Shen, and other Integrity Institute Members
Integrity professionals concerned about Russian election interference have been closely watching Moldova and have observed a concerning trend. Moldova, which shares a border with Ukraine, is often a test-bed for Russian interference tactics. In Moldova this year, regional tensions over the Ukraine war and Russia's legacy, combined with a vote on EU accession, created a strong motivator for political and election interference. Examining these developments in Moldova can give us a preview of tactics that will likely be deployed in other democracies around the world, including the upcoming snap election in Germany.
Members of our community saw a marked increase in election manipulation strategies in Moldova that leveraged boosted content, a type of advertising that tends to be less monitored than other forms of political ads. This strategy was used to interfere in Moldova's presidential and EU accession elections: crucial moments for the nation's democratic future and its potential to align with European structures. This tactic is not confined to Moldova, but reflects a growing trend in which malicious actors manipulate the social internet to destabilize democracies and fair electoral processes worldwide.
We are also concerned about the lack of transparency in the paid influencer space and believe that there are opportunities to better understand monetary influence ahead of civic events. Note, this piece focuses on boosted/monetized content on social media platforms, not messaging services like WhatsApp or Telegram.
Moving forward, there are both immediate fixes and long-term solutions that tech companies and regulators can deploy to reduce external manipulation of the information environment.
WHAT IS BOOSTED CONTENT?
Boosted content on social media is organic, user-created content that has been promoted using paid advertising to increase its reach and visibility. Unlike traditional ads, boosted content typically originates as a standard post on a social media platform and is later "boosted" to appear more prominently in users' feeds, reaching a wider or more targeted audience.
While platforms characterize and govern boosted content differently, across the board ‘boosting’ a post increases its reach. Generally speaking, boosted content is less targeted than an advertisement, and it is simpler and less expensive to set up than a paid ad campaign.
Boosted content is offered by many platforms, including Facebook, Instagram, X, TikTok, YouTube, and LinkedIn. While labeling varies across platforms, boosted content is often marked as ‘sponsored’ or ‘promoted’ in a user’s feed. Because of these differing labels, users may find it difficult to distinguish boosted content from a paid advertisement, which also links to the organization or page that placed the ad. Boosted content is growing in popularity because it provides an easy, cost-effective way for individuals, organizations, and businesses to increase the visibility of their posts without needing advanced advertising knowledge.
WHY IS IT PROBLEMATIC?
TRANSPARENCY: Boosted content often falls into a less governed space between political advertisements and traditional user-generated content. Because boosting requires an original organic post, that post has already passed through a set of content moderation reviews. Once a post is boosted, however, platforms should also subject it to the additional review applied to paid advertisements, since the content is now being paid to reach a wider audience. Boosted content that is not political or issue-based is not included in Meta's Ad Library or in Meta's Content Library. Political or issue-based boosted content is accessible through Meta’s Ad Library, which is open to the public. Meta’s Content Library is only available to academics and civil society (i.e. not to journalists) through an application process.
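Where political or issue-based content does appear in the Ad Library, it can be pulled programmatically. The sketch below follows the endpoint and parameter names in Meta's public Ad Library API documentation, but the API version, field list, and token are illustrative: actually running a query requires an access token from a verified account.

```python
# Sketch: building a Meta Ad Library API query for political/issue ads
# reaching Moldova. Parameter names follow Meta's public documentation;
# the version string and field selection are illustrative assumptions.
from urllib.parse import urlencode

AD_ARCHIVE_ENDPOINT = "https://graph.facebook.com/v19.0/ads_archive"

def build_ad_library_query(country_code: str, search_terms: str, access_token: str) -> str:
    """Build a request URL for political/issue ads delivered in a country."""
    params = {
        "access_token": access_token,
        "ad_type": "POLITICAL_AND_ISSUE_ADS",  # restrict to the public political archive
        "ad_reached_countries": f"['{country_code}']",
        "search_terms": search_terms,
        "fields": "page_name,ad_creation_time,spend,currency,ad_delivery_start_time",
    }
    return f"{AD_ARCHIVE_ENDPOINT}?{urlencode(params)}"

# Example: ads mentioning 'election' that reached users in Moldova (MD).
url = build_ad_library_query("MD", "election", "YOUR_ACCESS_TOKEN")
print(url.split("?")[0])  # the ads_archive endpoint
```

Note that this archive only surfaces content the platform has classified as political or issue-based; boosted posts that escape that classification remain invisible to such queries, which is the gap this piece describes.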
FORCE MULTIPLICATION: LinkedIn demonstrates strong governance in this space, both moderating boosted content and limiting each post to a single boost. While platforms have guardrails that prevent inauthentic boosting activity (e.g. shared posts can’t be boosted on Facebook), publicly accessible content can be copied into new posts and boosted without the permission or knowledge of the original poster. This primes content for virality and creates opportunities for malign, coordinated boosting of a particular viewpoint. Without full transparency into where boosted content comes from, content may appear more popular, or ‘go viral,’ without proportionate user support. In the absence of public insight tools such as CrowdTangle and algorithmic transparency, and in the context of diminished content moderation around mis- and disinformation, concealing the origin of boosted content is a major contributor to the distortion of the information ecosystem.
TESTING THE WATERS: The Moldovan elections, like other elections in smaller markets, often serve as authoritarian testing grounds for new tactics. Many of the tactics Russia used in the 2014 Ukrainian elections reappeared in the 2016 US elections: both operations involved disinformation campaigns, the use of social media, and cyberattacks, though the specifics and scale differed. This approach is likely to be scaled to other, larger markets, and poses a serious content moderation problem for platforms.
SHORT TERM THINGS REGULATORS CAN DO
Audit compliance with established paid content regulations. Establish consequences for non-compliance moving forward.
Insist on limitations to boosting frequency during voting periods.
Require human review of all political ads and boosted content. This is expensive, yes, but could be made more cost-effective through seasonal, local staff.
NORTH STAR
The online space will continue to be creatively manipulated by agenda-driven actors. While we applaud efforts by social media companies to be less influential in democracy, by the nature of their services this is, to some degree, beyond their control. There is an opportunity for Regulators to push social media companies to improve electoral safeguards by insisting on greater transparency.
Transparency asks can often feel like a death-by-a-thousand-cuts quest for social media platforms. Transparency in this context means making publicly accessible the details of the source, content, and targets of paid content.
This includes releasing datasets to enable external validation of platform claims about their systems, detailing processes for platform changes during significant events (e.g., election periods), and clearly outlining the requirements or verification steps necessary to achieve broad audience reach, including boosted content and paid advertising.
Big asks for Regulators to consider in support of free and fair elections:
Agree on better definitions of political advertising, to include boosted organic content and paid influencers.
There are opportunities to improve how political advertising is defined, which could in turn create opportunities for greater transparency. For instance, a major takeaway from the year of elections is that paid influencers played a larger role and are positioned for more impact in the years ahead. While there is not as much transparency as many would like on political advertising, there is little to no transparency on the flow of funds to influencers. It’s an opportune moment to get creative.
More transparency around paid and boosted content.
Three steps towards improved transparency and public accountability in paid content could include:
A: Transparency around political backing of paid influencers, such as a requirement to disclose when influencers are paid by politicians, campaign finance groups, election influence groups, and lobbyists.
B: Geolocation of advertising sources. Typically the focus is on the geolocation of the target audience, not the location of those paying for the advertising. Regulators could mandate location verification of advertisers, content creators, and users paying to boost content; most current requirements for boosting content ask only for an existing account and payment information. Location verification methods such as government-issued IDs, IP address authentication, and banking address information are a few ways to confirm an advertiser's location. While bad actors will often find work-arounds, such a measure could give civil society, governments, and journalists significantly greater insight into the flow of advertising funds across influential platforms.
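To make this concrete, the cross-check proposed above can be sketched as a simple rule: flag a boost for review when the payer's verified signals (ID country, IP country, banking country) disagree with each other or with the electoral market being targeted. All field names here are illustrative, not any platform's real schema.

```python
# Hypothetical sketch of an advertiser-location cross-check for boosted content.
# Field names are illustrative assumptions, not a real platform schema.
from dataclasses import dataclass

@dataclass
class BoostPayment:
    id_country: str       # country on the government-issued ID
    ip_country: str       # country inferred from IP at time of payment
    bank_country: str     # country of the billing/banking address
    target_country: str   # electoral market the boost is aimed at

def flag_for_review(p: BoostPayment) -> bool:
    """Flag when verification signals disagree, or when a payer outside the
    target market is funding boosts into it."""
    signals = {p.id_country, p.ip_country, p.bank_country}
    return len(signals) > 1 or p.target_country not in signals

# A payer consistently verified in one country boosting into another market:
print(flag_for_review(BoostPayment("RU", "RU", "RU", "MD")))  # True
# A domestic payer boosting into their own market:
print(flag_for_review(BoostPayment("MD", "MD", "MD", "MD")))  # False
```

Even a crude rule like this would only be possible if platforms collected and verified those signals in the first place, which is the point of the mandate proposed above.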
C: Relaunch and improve publicly accessible resources for users to ‘do their research,’ that can analyze content within - and ideally across - platforms.
If the operating principle platforms take is that users should know what is and is not misinformation, then users should be empowered to understand how the information in their feeds is evaluated, the sources it comes from, and the methods used to verify its accuracy, particularly when content is amplified through monetary means.
Consider a regional approach to regulation around paid political advertising black-out windows
While global consistency on political advertising black-outs may be a non-starter, there could be an opportunity to consider a regional approach that aligns with existing campaign silent periods before voting days. The benefits of such an approach include reducing the likelihood of voter interference, last-minute misinformation campaigns, and promoting local campaign engagement. Political advertising black-out windows are developed through a series of negotiations between platforms and governments. Platforms are inconsistent in their application of black-out windows, arguing that black-out windows could disproportionately impact smaller campaigns or late-breaking political developments.
Social media advertising black-outs, such as those implemented by platforms like Facebook and Google, are less frequently implemented in the Global South compared to regions like North America and Europe. Political advertising in the Global South often faces less stringent enforcement or fewer pauses. This can be attributed to inconsistent regulatory frameworks or a focus on addressing other pressing challenges, such as the spread of misinformation through messaging platforms. While this discrepancy can be attributed to a combination of factors, the potential for harm is no less.
Implementing an advertising black-out creates both a loss in revenue for platforms and a cost to implement the black-out itself. Given the rise of mis- and disinformation in advertising, it is imperative that regulators create a consistent structure to reduce the influence of advertising in the immediate run-up to an election, particularly in the absence of greater transparency around platform advertising in general.
WHY THIS IS SUCH A BIG PROBLEM
These fixes are most important for Facebook, Instagram, X, Pinterest, and YouTube to implement, but all platforms should consider them.
As a sample of 2024 election year social media traffic:
Moldova: Facebook at 87.81%, Instagram at 4.06%, Pinterest at 3.37%, while YouTube, X, and others had much lower rates.
Georgia: Facebook remained leading at 70.56%, YouTube at 10.79%, Instagram at 8.15%, Pinterest at 5.05%, while X and others saw much lower traffic.
Germany: Facebook at 61.77%, Instagram at 16.97%, Pinterest at 7.41%, X at 5.29%, YouTube at 4.04%, while Reddit and others had much lower rates.
Healthy information ecosystems are essential for ensuring that human interests are authentically represented within democratic systems of governance. They enable citizens to access accurate, diverse, and relevant information, which is crucial for informed decision-making and meaningful participation in democratic processes.
Given the rising volume of information, ranging from misinformation and disinformation in advertising to influencer-driven narratives and AI-generated manipulated content, transparency in paid and sponsored content is more critical than ever.