
DSA, election integrity and democracy: CEE orgs speak up

Four organisations from Central & Eastern Europe are speaking up on the EU’s impact on the integrity of electoral processes. We expect more precision on local-specific risks, the demonetisation of disinformation, and synthetic media. Read the input by Instrat Foundation, TrollWall AI, Basta Foundation, Make.org and us below:

Warsaw, 5th March 2024

Regarding: Consultation on Draft Guidelines for Providers of Very Large Online Platforms and Very Large Online Search Engines on the Mitigation of Systemic Risks for Electoral Processes

At CEE Digital Democracy Watch, Instrat Foundation, Basta Foundation and TrollWall, we welcome the Commission’s further work on building the framework for securing free and safe online spaces during election periods. As mostly Central & Eastern Europe-based organisations, we have witnessed both the positive and negative impacts of emerging tech on local electoral processes in 2023.

The positives included multiple GOTV (get out the vote) campaigns led by grassroots NGOs. Online campaigning proved to be a tool that mobilises minority groups and offers a low barrier to entry to political campaigning for emerging candidates and non-governmental organisations. A good example is Poland, where online campaigning contributed to record turnout and the mobilisation of young people who decided to cast their ballots.

We also witnessed examples of risky behaviours. Some came from internal actors, such as the misuse of synthetic media in the Polish and Slovak elections. Others came from foreign actors engaged in FIMI (foreign information manipulation and interference), such as Russian-led networks misusing linguistic and cultural closeness to interfere through mass-scale disinformation.

Please see below our perspective on how to improve the draft Guidelines:

Local-specific risks should be clearly identified

We welcome the expectation of providing “adequate content moderation resources with local language capacity and knowledge of the national and/or regional contexts and specificities” as indicated in Paragraph 3.1 (12). It should also be clearly indicated in the guidelines, especially in Paragraph 3.4, that VLOPs will coordinate their actions only with apolitical government agencies, such as national election committees, administrative bodies, and courts, and should not seek advice on election governance and content moderation decisions from bodies with political interests, such as ministries or MPs.

We appreciate the Commission’s effort to ensure that global communications platforms comply with national election laws. This should include both compliance with silent periods as indicated in national electoral calendars and undisrupted access to political advertising during other periods of the campaign. Arbitrary, forced, selective, or global-scale silent periods imposed by VLOPs should be avoided.

While the inclusion of cross-border advertising within the EU is appreciated, compliance with national laws on foreign electoral funding should also be encouraged.

It is good that providing access to official information on the electoral process is encouraged, as in Paragraph 3.2 (16a). The guidelines should make sure that this information is shown to all users at the same time and the same number of times across accounts to ensure equal reach.

Let’s bear in mind that sensitivity to national election regulations may undermine the intended harmonisation of political content moderation standards. Conflicting standards should be mitigated not only against national regulations, but also against existing EU-level instruments, such as the Code of Practice on Disinformation, the AI Act, and the Regulation on the Transparency and Targeting of Political Advertising.

More clarity needed on the demonetisation of disinformation

There are multiple risks relating to the very brief mention of the demonetisation of disinformation content in Paragraph 3.2 (16g). The Guidelines should provide a more precise definition and a realistic framework for differentiating between disinformation content and political opinion. Strong safeguards should be proposed to avoid such calls being made based on the quality of content or its ideological perspective. The risk of both VLOPs and traditional media being pushed away from political coverage to avoid demonetisation should be mitigated.

Oversight teams for political content need more transparency

We encourage VLOPs to provide transparency into the “clearly identifiable internal team” working on the distribution of political content, as referred to in Paragraph 3.2 (13). The Guidelines should stipulate that those hired in the above-mentioned roles be provided with stable working conditions to ensure the independence and quality of leadership.

We would advise separating those functions from government relations and public policy roles, as combining them might lead to conflicts of interest.

It should also be clear that the teams should take a systemic approach to moderation standards instead of a pick-and-choose approach that singles out specific cases to be escalated to moderation teams or external oversight bodies. This standardised approach should also apply to the mechanisms for moderating virality proposed in 3.2 (16d). Engaging external experts in the teams should be strongly encouraged, to provide external validation of political neutrality and to avoid favouring specific political groups or candidates.

There are different shades to synthetic media and AI use in political content

It is good to see that the Commission refrains from a blanket ban on the use of artificial intelligence tools in political campaigning and advertising. We call upon the Commission to recognise that there are varying risk levels among the different uses of so-called artificial intelligence, ranging from AI tools for text edits and copywriting to realistic deepfakes aimed at misinforming citizens.

The majority of VLOPs have proposed their own initial self-regulation on the moderation of synthetic content and the use of AI-enabled mechanisms in political advertising. These should be monitored and taken into consideration when creating this framework, and realistic methods of enforcement should be proposed. For example, for the watermarking proposal in Paragraph 3.3 (27a) to be realistic, a more precise standard of declaration and detection is needed.

Positive political online spaces should be encouraged 

The proposed Guidelines are heavily risk-based, and the language used in Paragraph 1.1 (3) is quite negative towards online political discourse in general. We expect to see encouragement to build positive and healthy online spaces for good-faith users and campaigns. More focus on the protection of freedom of expression and information is needed. This is especially important for groups that have historically considered online spaces a place of free expression.

We advise proposing more dynamic incentives to avoid over-moderation of political content and discrimination against organic political content, get-out-the-vote campaigns, and information and news on elections.

***

We are happy to participate further in the process of finalising the guidelines and in monitoring developments for annual updates. Please note that CEE Digital Democracy Watch is also a signatory of the Democratic Shield pact, which proposed solutions similar to those in the draft Guidelines, such as codes of conduct for political parties, media, and large influencers.

On behalf of CEE Digital Democracy Watch: Jakub Szymik
On behalf of Instrat Foundation: Blanka Wawrzyniak
On behalf of Basta Foundation: Bart Staszewski
On behalf of TrollWall: Tomáš Halász