In 2025, GRU-related disinformation targeted European public opinion: deepfakes, fake media, hacking, and political relays. Here are the methods and their effects.
Summary
In 2025, Russia treated the European information space as a theater of operations. The GRU’s role is most evident when its actions combine intrusion, data collection, and political exploitation. In Germany, before the February elections, networks spread false terrorist alerts and fake videos, with hundreds of accounts and millions of interactions observed. In France, the public attribution of activities linked to APT28 serves as a reminder that cyber also serves to exert influence: steal, choose the right moment, then contaminate the debate. The VIGINUM report on Storm-1516 describes a multi-stage dissemination chain: initial dissemination, laundering, amplification, opportunistic reposts. The major shift is industrial: AI lowers the cost of content production and accelerates variations. The goal becomes attrition: polarizing, saturating, and instilling the idea that everything is manipulated. In the end, the most consistent impact is not the vote, but mistrust and fatigue.
The real scope of Russian military intelligence in the information war
Talking about “the GRU” as a single brain is convenient, but misleading. Russia conducts influence operations through an ecosystem. This includes services, media outlets, parastatal structures, and subcontractors. In 2025, the relevant question is therefore not “does the GRU do everything?” but “in which segments of the cycle is the GRU’s footprint plausible and documented?”
The GRU stands out in two main areas. First, its ability to produce intelligence through intrusion: email accounts, calendars, working documents, address books. Second, its ability to combine this intelligence with active measures: selective dissemination, context manipulation, and staging of “evidence.” In this scheme, influence is not an add-on. It is a mode of operation.
Caution is needed when it comes to attribution. Many pro-Russian campaigns in Europe in 2025 are attributed to actors “linked to Russia” without specifying which service. But there are stronger signals when a state authority formally attributes, or when an agency documents consistent technical chains. This is where analysis becomes useful, because it separates political conviction from observable elements.
Visible operations in Europe in 2025
The German sequence, a full-scale test before and after the vote
The February 2025 campaign in Germany attracted attention because it ticked all the boxes: a sensitive target, electoral timing, anxiety-inducing narratives, and high-speed dissemination. Networks disseminated videos dressed up as content from media outlets or intelligence services. The content highlighted threats of attacks and fictitious warnings. The logic is not subtle: an anxious society becomes more receptive to authoritarian narratives and more inclined to doubt democratic procedures.
The figures give an idea of the scale. One analyzed message database accumulated around 2.5 million interactions. Researchers also identified a network of more than 700 inauthentic accounts that appeared or were reactivated late in the campaign. The authorities publicly warned of a campaign associated with Storm-1516, involving pseudo-media and "dormant" accounts activated at the opportune moment.
The most important thing is not a technical detail. It is the social mechanics. The initial narrative does not need to be believed. It just needs to trigger a chain of responses: screenshots, outraged reactions, debunking, counter-debunking. In the end, collective attention is consumed and the media agenda is distorted. It is a way of making democracy "pay" in time and energy, which is exactly what a hostile power wants when its aim is not to win, but to wear down.
Continued pressure on France via cyber-influence
France is an interesting case because, in 2025, it chose to publicly name the threat.
An official statement attributed to Russia, and specifically to the GRU, the activity of the attack group known as APT28, used to target or compromise a dozen French entities since 2021. This point is essential: this is not an isolated incident, but a long-term campaign with a clear objective of gathering information and exerting pressure.
The link with disinformation is rarely “direct” in the narrative sense. It is functional. When an intrusion affects an organization, it can fuel several effects of influence: stealing information and disseminating it at a chosen moment, fabricating fakes from real documents, or simply creating a climate of insecurity conducive to rumors. It is the chain reaction that counts: cyber to access, influence to exploit.
In 2025, this “hybrid” approach is better understood by European authorities. A cyberattack is no longer just theft. It is a political lever when it targets visible institutions, sensitive infrastructure, or periods when society is already under strain.
Poland, Portugal, and Moldova: areas of friction with high political value
In Poland, 2025 was marked by attention to attempts to influence the presidential election. Analyses described Doppelgänger-type operations seeking to imitate legitimate media outlets and push narratives on the war, Ukraine, immigration, or "submission" to Brussels. In the Polish context, these narratives meet strong resistance on defense issues, which limits their direct effect on the vote. But they can increase internal tensions, harden mistrust of the European Union, and fuel a polarization that complicates governance.
In Portugal, a numerical example illustrates the use of inauthenticity at scale. An institutional brief relayed an analysis finding that 58% of accounts commenting on X around Chega were considered fake, with significant levels for other parties as well. The key point is less "who controls" each account than this: simulating a majority has become cheap. And on sensitive issues, a simulated majority can influence the media, which shapes perceptions, which in turn shapes behavior.
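As a rough order-of-magnitude illustration, the minimal sketch below shows why a few hundred high-cadence coordinated accounts can dominate what readers actually see in a comment thread. The numbers are assumptions chosen for illustration, not figures from the cited brief.

```python
# Illustrative sketch (hypothetical numbers): how a small coordinated
# operation can look like a majority in a comment thread.

def visible_share(organic_users: int, organic_posts_each: float,
                  fake_accounts: int, fake_posts_each: float) -> float:
    """Share of visible comments produced by coordinated accounts."""
    organic = organic_users * organic_posts_each
    coordinated = fake_accounts * fake_posts_each
    return coordinated / (organic + coordinated)

# 2,000 real users posting once each vs. 300 coordinated accounts posting 10 times each:
share = visible_share(2000, 1.0, 300, 10.0)
print(f"Coordinated accounts produce {share:.0%} of visible comments")  # ~60%
```

Under these assumed volumes, 300 accounts are enough to outweigh 2,000 genuine commenters, which is the whole point of simulated majorities.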
Finally, in 2025, Moldova remains a recurring target of pro-Russian campaigns, particularly during election periods, with the use of fake videos and narratives aimed at weakening pro-European forces. Here again, the issue is not just a vote. It is the strategic orientation of a pivotal country and the signal sent to all its neighbors: “Europe does not protect its cognitive borders.”
The techniques that really changed in 2025
Industrial production and the decline in marginal cost
The turning point is not the lie. It has always existed. The turning point is the ability to produce, test, correct, and relaunch, almost like a content chain. The campaigns associated with Matryoshka / Operation Overload illustrated this dynamic. Researchers counted 587 unique pieces of AI-generated or AI-assisted content between September 2024 and May 2025, compared to 230 in the previous comparable period. Some content reached several million views before being moderated on certain platforms.
This logic favors content that is “sufficiently plausible” rather than perfect. A video can be crude and still useful if it circulates in communities that are already convinced, or if it forces the media to talk about it in order to refute it. The system is based on a cynical trade-off: every refutation consumes human time, and human time is a scarce resource.
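One way to see why this trade-off favors the attacker is to put rough numbers on it. The time costs below are assumptions for illustration, not measured values; only the volume of 587 variants comes from the reporting cited above.

```python
# Minimal sketch of the attrition trade-off: generating a variant is assumed
# to take minutes, while verifying and refuting one is assumed to take hours
# of analyst and journalist time. Both figures are illustrative assumptions.

GENERATION_MINUTES = 15       # assumed cost to produce one AI-assisted variant
REFUTATION_MINUTES = 4 * 60   # assumed cost to verify, debunk, and publish

def defender_to_attacker_ratio(variants: int) -> float:
    """Hours of defender time consumed per hour of attacker time."""
    return (variants * REFUTATION_MINUTES) / (variants * GENERATION_MINUTES)

# Applied to the 587 variants reported for September 2024 - May 2025,
# these assumptions imply roughly 16 defender hours per attacker hour.
print(defender_to_attacker_ratio(587))  # 16.0
```

The exact ratio matters less than its direction: as long as refutation costs more than generation, volume alone is a weapon.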
The multi-stage dissemination chain, a practiced craft
Operations are no longer limited to posting and hoping. They follow a sequence. French authorities have described a mature system based on distinct phases: initial dissemination via disposable accounts, laundering via relays, amplification via networks, then opportunistic reposts. This pattern explains why disinformation can “break out” of a marginal circle and end up being quoted by public figures.
This pattern also makes attribution difficult. The initial source is ephemeral. The relay is sometimes a third-party media outlet. Amplification occurs through influencers who are not necessarily aware of the origin. At the end of the chain, the “evidence” becomes social: everyone is talking about it, so “there must be something to it.”
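A schematic way to read this sequence is sketched below. The account labels and the "traceable origin" flags are interpretive assumptions, not categories taken from the French report, but they show why the attribution trail fades after the first phase.

```python
# A minimal model of the four-phase dissemination pattern described above,
# as an ordered pipeline. Labels and flags are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Stage:
    name: str
    actors: str         # who carries the content at this stage
    attributable: bool   # whether the stage still exposes a traceable origin

PIPELINE = [
    Stage("initial dissemination", "disposable, short-lived accounts", True),
    Stage("laundering", "pseudo-media and third-party relays", False),
    Stage("amplification", "inauthentic networks and influencers", False),
    Stage("opportunistic reposts", "organic users and public figures", False),
]

for i, stage in enumerate(PIPELINE, 1):
    trace = "traceable origin" if stage.attributable else "origin obscured"
    print(f"{i}. {stage.name}: {stage.actors} ({trace})")
```

By the time content reaches the last stage, the only "source" most people see is other people.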
This architecture is accompanied by a simple tactic: shift the cost to the opponent. A public service, editorial team, or platform must mobilize teams to verify, counter, and report. The attacker responds with a variation. It is a strategy of attrition.
The confusion of labels and the creation of a gray area
In 2025, the debate was also muddied by the proliferation of labels: a particular group was named one way by a government agency, another way by a private company, and yet another way by the media. This is not a minor detail. This diversity serves the attacker, because it fragments the response. It also fuels accusations of “propaganda” as soon as an attribution is put forward.
A French technical report on Storm-1516 goes further: it mentions elements that suggest coordination by a Russian service and refers to an individual presented as potentially linked to GRU Unit 29155, publicly accused of financing and coordinating the operation. Here, we touch on a sensitive point: influence is not just a matter for communicators. It can be organized as a clandestine operation, with funding, coordination, and compartmentalization.

Observable effects on public opinion
Polarization without electoral “magic”
Let’s be honest: most disinformation campaigns do not sway an election on their own. The analyses available on Germany in 2025 suggest that they are more likely to disrupt, polarize, and pollute the cognitive environment than to convert voters on a massive scale. This reality undermines the doomsday rhetoric, but it should not be mistaken for reassurance.
The main effect is cumulative. Each campaign adds another layer of suspicion. After a while, part of the public no longer seeks the truth. It seeks a narrative that reinforces its camp. Democracy then becomes a tribal war, and truth becomes an adjustment variable.
Fear as a lever and the temptation of civic withdrawal
Campaigns around threats of attacks in Germany illustrate a particular mechanism: the activation of fear. This is not just sensationalism. It is a way of encouraging certain behaviors: avoiding gatherings, mistrusting polling stations, favoring postal voting, or believing in the idea of “premeditated” fraud. This fear effect is a multiplier because it mobilizes reflexes, not arguments.
When this type of narrative is repeated, the result is not only anger. It is fatigue. Citizens withdraw. They share less. They get their information in fragments. And they become more dependent on informal authority figures: influencers, “whistleblowers,” community channels.
The war in Ukraine as a lasting narrative matrix
In 2025, a large part of operations remain structured around the war in Ukraine. Recurring narratives seek to portray Western aid as futile, corrupt, or dangerous. Ukrainian leaders are targeted, but the real objective is European: to weaken the social acceptability of the support effort. French authorities have documented 77 operations up to March 5, 2025, including 35 targeting Ukraine’s image and 42 targeting Western interests, often around election periods.
This framing is effective because it exploits existing tensions: inflation, war fatigue, fears of migration, divisions over European enlargement. The attacker does not invent everything. They aggregate, accentuate, and stage.
European responses that have shown results
Exposure diplomacy, useful but insufficient
In 2025, France and Germany sent a clear signal: they agreed to name certain actors and take political responsibility. This strategy has an advantage. It reduces the gray area. It informs the public. It also supports platforms and researchers who would otherwise bear the legal and reputational risk alone.
But attribution is not a solution in itself. It only works if it is accompanied by action: sanctions, judicial cooperation, support for the media, and protection of targets. Without this, it becomes just another press release in an already saturated stream.
Technical defense and reducing operational efficiency
The most effective response is often the most thankless: breaking the chain of dissemination. This requires trust and safety teams, clone-detection systems, and rapid responses to "dormant" accounts. Reports show that some platforms are responding while others are doing far less, and this discrepancy creates free zones.
At the institutional level, cooperation is progressing. European structures and national agencies are sharing signals. Joint advisories target known tactics. The goal is simple: reduce returns. If a tactic costs more and lasts less, it loses its appeal.
A culture of evidence, without moralizing or lecturing
The most delicate point is the public. The answer cannot be “trust us.” It must be “here’s how we know.” A democracy that wants to resist must learn to think in terms of evidence, not impressions. This involves simple tools: researching the origin of a video, verifying an image, identifying a recent account, and spotting pseudo-media. This hygiene does not prevent manipulation. But it reduces exposure.
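For the "recent account" reflex mentioned above, here is a minimal sketch of the kind of heuristic fact-checkers apply by hand. The thresholds are assumptions chosen for illustration, not an established standard, and no single signal is proof on its own.

```python
# Illustrative heuristic (thresholds are assumptions, not an official method):
# flag accounts whose age, cadence, and audience look like "disposable"
# amplification profiles. Real verification still requires human judgment.

from datetime import date

def looks_disposable(created: date, today: date,
                     posts: int, followers: int) -> bool:
    age_days = max((today - created).days, 1)
    posts_per_day = posts / age_days
    young = age_days < 90             # assumed threshold: under ~3 months old
    hyperactive = posts_per_day > 50  # assumed threshold: >50 posts per day
    hollow = followers < 25           # assumed threshold: almost no audience
    return young and hyperactive and hollow

# An account created weeks ago, posting thousands of times to almost no followers:
print(looks_disposable(date(2025, 1, 20), date(2025, 2, 15), 2400, 8))  # True
```

The value of such checks is not certainty. It is that they are cheap enough for ordinary readers and editors to run before sharing.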
Above all, we must accept an uncomfortable idea: the battle is also being fought for attention. When the public space is saturated, even true facts no longer reach their target. This is information fog: a situation where information overload becomes a weapon.
The likely trajectory after 2025, if nothing really changes
There are two underlying trends. On the one hand, tools for creation and dissemination are becoming more accessible. Deepfakes will improve in quality and personalization. On the other hand, European societies remain divided on the response, between freedom of expression, digital sovereignty, and internal political competition.
The most serious risk is not an isolated lie. It is the establishment of a norm: “everything is manipulated.” From there, it becomes rational to believe in nothing, or to believe only in one’s own camp. For a hostile power, this is a strategic success, even without a visible electoral victory.
The best news is paradoxical: these operations have their limits. They often fail to convince beyond their already won-over audiences. They leave technical traces. They force the attacker to recycle narratives. And they can be countered when states, platforms, media, and civil society stop working in silos.
But the condition is clear: treat disinformation as a security issue, without using it as a pretext for censorship. It is a difficult balance to strike. It is also the only one that is worthwhile if Europe wants to remain a democracy that defends itself without denying its values.
Sources
- VIGINUM (SGDSN), Storm-1516 Technical Report, May 7, 2025.
- Ministry for Europe and Foreign Affairs, Attribution of APT28 to Russian military intelligence, April 29, 2025.
- CERT-FR / ANSSI, CTI Note on APT28 (CERTFR-2025-CTI-007), April 29, 2025.
- Reuters, Bot campaign and false alerts ahead of German vote, February 12, 2025.
- Reuters, Official German alert on Storm-1516 and fake videos, February 21, 2025.
- CERT-EU, Cyber Brief 25-06 (references to campaigns and APT28), 2025.
- EDMO, Analysis of Russian attempts to influence the 2025 Polish election, June 19, 2025.
- Wired, Intensification of Operation Overload/Matryoshka via AI tools, July 1, 2025.
- RUSI, Russia, AI and the Future of Disinformation Warfare, June 19, 2025.
War Wings Daily is an independent magazine.