Cyber-Militias and the Struggle for Primacy in the Information Battlespace

C. Constantin Poindexter

Keywords: information warfare, cyber-militias, cyberwar, warfighter, intelligence, counterintelligence

I came of age in an intelligence community that still treated the “front line” as a place one could step onto, map, and secure. That world is gone. Today, non-military adversaries, loosely coordinated “cyber-militias” of propagandists, patriotic hackers, influence entrepreneurs, and paid or volunteer amplifiers, contest the initiative not with armor or artillery but by colonizing attention, bending perception, and accelerating social division at scale. Our doctrine has begun to recognize this shift. In 2017, the U.S. Department of Defense elevated information to a joint function, formalizing what operators have seen for years: modern campaigns hinge on creating and exploiting information advantage. The 2023 Department of Defense Strategy for Operations in the Information Environment makes the point explicitly: the Joint Force must be organized, trained, and resourced to integrate information effects alongside fires and maneuver (Department of Defense 2023).

By cyber-militias I mean non-uniformed actors (sometimes state-directed, often state-tolerated or “crowd-sourced”) who blend cyber actions with narrative warfare on social platforms. They recruit and radicalize; swarm, harass, and dox; seed deepfakes and conspiracies; and flood the zone with emotionally sticky memes. Their command and control is typically flat and improvisational, their logistics are cloud-based, and their operational tempo is set by platform algorithms and news cycles. We have seen the military effects of such formations in diverse theaters. The so-called Internet Research Agency (IRA) exemplified a state-linked influence militia that scaled persuasion attempts and offline mobilization through U.S. social platforms. Rigorous research has since complicated maximalist claims about measurable attitude change, but the operational fact remains: adversaries can reach millions of targets, at negligible marginal cost, with tailored narratives synchronized to geopolitical aims (Eady et al. 2023).

On the other end of the spectrum, the IT Army of Ukraine offers a case of defensive cyber-mobilization: a volunteer formation conducting DDoS attacks, bug-hunting, and psychological operations in parallel with state efforts. The case illustrates both the potency and the legal and ethical ambiguities that arise when civilians become combatants in the information domain (Munk 2025).

Terrorist organizations have long understood the leverage of social media. ISIS paired battlefield brutality with a meticulously engineered online propaganda machine, optimized for recruitment, intimidation, and agenda-setting across multiple languages and platforms. Peer-reviewed analyses detail how ISIS exploited platform affordances to sustain reach even as accounts were removed (Done 2022). The current flood of Palestinian claims of war-theater victory is similarly instructive.

Why Social Media Can Rival Physical Force

The simple answer is scale and speed. Computational propaganda leverages automation, amplification, and microtargeting to saturate feeds faster than fact-checking or deliberation can catch up. Global inventories of organized manipulation now frame this as an evolving socio-technical ecosystem rather than a one-off tactic (Bradshaw and Howard 2019).

Asymmetry comes a close second. Bots and coordinated inauthentic behavior give small groups and even individual operators outsized influence, particularly in the first minutes of a narrative’s life cycle, when early engagement signals can tip platform ranking systems. Studies show automated accounts disproportionately amplify low-credibility content at those critical early stages (Shao et al. 2018). A toy model of this compounding dynamic follows below.
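To make the claim concrete, the minimal simulation below models a feed that ranks posts in proportion to accumulated engagement. Every number in it (boost size, pickup probability, window length) is an illustrative assumption of mine, not a measurement from Shao et al.; the point is only the shape of the dynamic.

    # Toy model of early-amplification asymmetry in an engagement-ranked
    # feed. All parameters are illustrative assumptions, not measurements
    # from the cited studies.
    import random

    def simulate(bot_boost: int, ticks: int = 60, seed: int = 7) -> int:
        """Total engagements on one post after `ticks` ranking cycles.

        bot_boost: automated engagements injected during the first five
        ticks, i.e., the narrative's "first minutes."
        """
        rng = random.Random(seed)
        engagements = 1  # the post itself is the first signal
        for t in range(ticks):
            if t < 5:
                engagements += bot_boost  # early inauthentic burst
            # Visibility is assumed proportional to accumulated
            # engagement; each impression has a small chance of
            # converting into a new organic engagement.
            impressions = engagements * 10
            engagements += sum(
                1 for _ in range(impressions) if rng.random() < 0.01
            )
        return engagements

    print("no boost:   ", simulate(bot_boost=0))   # stays small
    print("small boost:", simulate(bot_boost=5))   # compounds sharply

Under this model, an otherwise identical post that receives a small inauthentic burst in its opening minutes ends with orders of magnitude more reach, which is why the first minutes, not the eventual fact-check, decide the engagement race.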

Human terrain effects must also be contemplated. Even when direct persuasion is modest, the harms in conflict zones are very real. Doxing, stigmatization, displacement, and cultural desecration have all been linked to online incitement during armed conflict. This is not just “online chatter”; it is operational preparation of the environment, with human consequences (Ulbricht 2024).

Integration with kinetic operations is another essential ingredient. In Ukraine, Russian forces coupled physical systems (e.g., Orlan-10/Leer-3) with mass text and social media campaigns to trigger panic and erode cohesion, a reminder that “information fires” can bracket the battlespace as surely as artillery (GAO 2022).

Memetic maneuver is a final consideration. In contemporary conflict, meme-based narratives are not mere ephemera; they are maneuver in the cognitive domain. Scholarship on memetic warfare argues that these artifacts structure attention, encode complex frames, and accelerate recruitment into “participatory propaganda” at scale (Prier 2017).

A Note on Evidence and Caution

Brutal intellectual honesty must be front and center. A Nature Communications study linking U.S. Twitter feeds to survey data found no meaningful changes in respondents’ attitudes or vote choice attributable to IRA exposure during 2016. We should neither ignore this finding nor overgeneralize from it. The study does not absolve adversary campaigns; it refines our theory of effect. Many operations seek agenda control, polarization, intimidation, and time-on-target distraction rather than simple vote-switching, and in war even small shifts in participation, risk perception, or unit morale can be decisive (Eady et al. 2023).

The Imperative: Treat Adversarial Propaganda as a Campaign Target

NATO now frames “cognitive warfare” as a cross-domain challenge: the human mind is “contested terrain” where actors seek to modify perceptions and behavior (Claverie and du Cluzel 2021). That is not inflammatory rhetoric. It is operational reality in every theater I have observed. Our response must leave the era of ad hoc rebuttals and move toward integrated operations in the information environment (OIE) with explicit objectives, authorities, and measures of performance and effect.

What Intelligence and Warfighters Must Do

1) Build a fused intelligence picture of the narrative battlespace.
We need social media intelligence (SOCMINT) and open-source intelligence (OSINT) cells that map not just “what is trending” but why: the network topologies, amplification pathways, and cross-platform migration patterns by which malign content metastasizes. Computational propaganda research offers a starting taxonomy; we must operationalize it into collection requirements and analytic standards (Bradshaw and Howard 2019). A sketch of what such pathway mapping can look like follows below.
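As a concrete illustration, the sketch below maps amplification pathways as a directed diffusion graph using standard network analysis. The account names, record format, and metric choices are hypothetical assumptions for the example, not a reference implementation of the cited taxonomy.

    # Sketch: mapping amplification pathways from reshare records.
    # Accounts and records are hypothetical; metrics are illustrative.
    import networkx as nx

    # Hypothetical flow records: (upstream_account, downstream_account),
    # meaning the downstream account reshared the upstream's content.
    flows = [
        ("seed_A", "acct_17"), ("seed_A", "acct_22"),
        ("acct_17", "acct_31"), ("acct_22", "acct_31"),
        ("acct_31", "fringe_1"), ("acct_31", "fringe_2"),
    ]
    G = nx.DiGraph(flows)

    # Who originates the most onward spread (candidate seeders)?
    seeders = sorted(G.nodes, key=G.out_degree, reverse=True)
    print("top seeders:", seeders[:2])

    # Betweenness centrality flags brokers that bridge otherwise
    # separate audiences: candidate cross-community or cross-platform
    # conduits worth prioritizing in collection requirements.
    brokers = nx.betweenness_centrality(G)
    print("bridge accounts:",
          sorted(brokers, key=brokers.get, reverse=True)[:2])

Even this toy graph surfaces the two analytic objects the text names: origination hubs and the broker accounts through which content migrates between communities.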

2) Normalize OIE alongside fires and maneuver.
Commanders should plan narrative lines of effort the way they plan suppression of enemy air defenses: with target systems, timing, sequencing, and joint enablers. The 2023 SOIE calls for exactly this: education, resourcing, and integration, so that information effects are not an afterthought but are embedded in campaign design (Department of Defense 2023).

3) Contest the initiative through pre-bunking and resilience, not just takedowns.
Content moderation is necessary but insufficient. The strongest evidence for population-level resilience points to psychological inoculation: brief interventions that teach people to spot manipulation techniques before exposure pay outsized dividends. Field experiments on YouTube and cross-platform studies show significant gains in users’ ability to recognize manipulation, though effects attenuate without reinforcement (Roozenbeek and van der Linden 2019; Maertens et al. 2021). A sketch of how such gains can be scored appears below.
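For flavor, here is a minimal sketch of scoring an inoculation intervention in a simple pre/post design. The measures and data are my illustrative assumptions; the cited studies use more elaborate instruments and longitudinal follow-ups.

    # Sketch: scoring an inoculation intervention in a pre/post design.
    # Measures and data are illustrative assumptions, not the cited
    # studies' instruments.
    from statistics import mean, stdev

    def paired_effect_size(pre: list[float], post: list[float]) -> float:
        """Cohen's d for paired pre/post scores."""
        diffs = [b - a for a, b in zip(pre, post)]
        sd = stdev(diffs)
        return mean(diffs) / sd if sd else 0.0

    # Per-participant "manipulation discernment" scores: mean rating gap
    # between manipulative and neutral test items, before and after the
    # intervention (hypothetical data).
    pre = [0.4, 0.1, 0.6, 0.3, 0.5]
    post = [0.9, 0.5, 1.1, 0.6, 0.8]
    print("mean gain:    ", round(mean(post) - mean(pre), 2))
    print("effect size d:", round(paired_effect_size(pre, post), 2))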

4) Impose friction on hostile cyber-militias.
Joint and interagency teams should target the infrastructure of amplification: botnet command and control, SIM farms, and the payment rails of “influence mercenaries.” Early-cycle disruption pays outsized dividends given bots’ role in initial virality (Shao et al. 2018). The tripwire sketched below illustrates one way to spot that early window.
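One concrete form friction can take is an early-warning tripwire that flags a narrative whose opening window of engagement is dominated by crude bot-like account profiles. All thresholds and field names below are illustrative assumptions; a production system would rely on platform signals and trained classifiers rather than two hand-set heuristics.

    # Sketch of a "golden hour" tripwire: alert when the first minutes of
    # a narrative's engagement are dominated by bot-like accounts. All
    # thresholds and fields are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class Engagement:
        account_age_days: int      # very young accounts are weakly bot-indicative
        posts_per_day: float       # extreme posting rates likewise
        seconds_after_post: float  # position in the narrative's life cycle

    def looks_automated(e: Engagement) -> bool:
        # Crude stand-in for a trained bot classifier.
        return e.account_age_days < 30 and e.posts_per_day > 100

    def golden_hour_alert(engagements: list[Engagement],
                          window_s: float = 600.0,
                          ratio_threshold: float = 0.5) -> bool:
        """Alert when a majority of first-window engagement looks automated."""
        early = [e for e in engagements if e.seconds_after_post <= window_s]
        if len(early) < 20:  # too little data to judge
            return False
        automated = sum(looks_automated(e) for e in early)
        return automated / len(early) >= ratio_threshold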

5) Clarify authorities and align with the law of armed conflict.
Volunteer cyber formations raise attribution and combatant-status questions. Scholars have argued for pragmatic frameworks that harness civic energy while mitigating escalation and the blurring of civilian and combatant roles (Munk 2025).

6) Train for the cognitive domain.
Treat cognitive security as tradecraft, not lip service. This includes red-teaming our own narratives, pre-mission media terrain analysis, and SOPs for rumor control when adversaries seed panic. NATO-sponsored analyses emphasize that cognitive effects require skilled practitioners, clear objectives, and ethical guardrails (Claverie and du Cluzel 2021).

7) Measure what matters.
Intelligence and warfighter analysts must avoid over-indexing on vanity metrics. We need dashboards built around operational indicators: time to adversary narrative saturation, percentage of priority audiences inoculated, and suppression of inauthentic behavior during the “golden hour.” The ICRC’s typology linking online dynamics to offline harm provides a framework (Ulbricht 2024). The sketch below shows how two such indicators might be computed.
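As an illustration, the sketch below computes two of those indicators from timestamped logs. The operational definitions it uses (saturation as a threshold share of monitored channels, the golden hour as hour one) are my assumptions for the example, not doctrinal standards.

    # Sketch: computing two indicators from timestamped logs. Operational
    # definitions here are illustrative assumptions, not doctrine.
    from datetime import datetime, timedelta

    def time_to_saturation(first_seen: dict[str, datetime],
                           monitored_channels: int,
                           threshold: float = 0.5) -> timedelta | None:
        """Elapsed time from first sighting until the narrative appears
        on a threshold share of monitored channels; None if never."""
        needed = int(threshold * monitored_channels)
        times = sorted(first_seen.values())
        if needed == 0 or len(times) < needed:
            return None
        return times[needed - 1] - times[0]

    def golden_hour_suppression(detected: int, removed_first_hour: int) -> float:
        """Share of detected inauthentic engagements neutralized in hour one."""
        return removed_first_hour / detected if detected else 0.0

    # Hypothetical example: narrative reaches 3 of 6 monitored channels.
    t0 = datetime(2024, 5, 1, 12, 0)
    sightings = {"ch_a": t0,
                 "ch_b": t0 + timedelta(minutes=18),
                 "ch_c": t0 + timedelta(minutes=41)}
    print("time to 50% saturation:", time_to_saturation(sightings, 6))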

The Strategic Bottom Line

In conventional war, advantage is cumulative: logistics, training, and combined-arms competence pay off steadily. In the information fight, advantage is compounding. The side that gets inside the adversary’s decision cycle sets the frame for everything that follows. Our adversaries are playing that compounding game. They field cyber-militias that operate at machine speed but speak in human idiom, exploiting platform incentives and cognitive biases that are as old as persuasion itself and as new as generative AI.

Our task as intelligence professionals and warfighters is not merely to rebut lies after the damage is done. It is to deny adversarial initiative in the information environment, to map and preempt adversary campaigns, to harden our populations, and to integrate narrative effects with maneuver. Doing all of this under the rule of law and democratic accountability will be a challenge. The intelligence community and armed forces are not ignoring this, thankfully. The Joint Force now names information as a core function; doctrine without resourcing and practice, however, is just paper. We must build the teams, authorities, and habits to fight and win where people live now, in feeds and group chats as much as in physical space. If we fail, we cede the decisive ground of modern conflict to non-military adversaries who understand that primacy is no longer measured only in meters seized, but in minds held.

A crucial recommendation: counterintelligence (CI) is particularly well suited to this mission. CI tradecraft, long dedicated to identifying, deceiving, and neutralizing hostile influence operations, translates directly into the fight against cyber-militias. CI operators bring expertise in adversary attribution, double-agent operations, disinformation detection, and the manipulation of clandestine networks, precisely the skills needed to unmask coordinated inauthentic behavior online. I firmly believe that integrating CI into information warfare provides unique advantages. It blends technical signals analysis with human-source validation and can “exploit, disrupt, or co-opt” adversary influence operations in ways that exceed mere content moderation (Hunker 2010; Rid 2020). To leave cyber-militias solely to public diplomacy or platform governance is to fight with one arm tied. Incorporating counterintelligence into the core of our information campaigns ensures that the United States can not only defend against adversarial propaganda but actively contest and dismantle the networks that drive it.

~ C. Constantin Poindexter, MA in Intelligence, Graduate Certificate in Counterintelligence, JD, CISA/NCISS OSINT certification, DoD/DoS BFFOC Certification

References

Bradshaw, Samantha, and Philip N. Howard. 2019. The Global Disinformation Order: 2019 Global Inventory of Organised Social Media Manipulation. Oxford: Oxford Internet Institute.

Claverie, Bernard, and François du Cluzel. 2021. “Cognitive Warfare.” NATO Allied Command Transformation, Innovation Hub. Norfolk, VA.

Department of Defense. 2023. Strategy for Operations in the Information Environment. Washington, DC.

Done, Alasdair. 2022. “ISIS Propaganda and Online Radicalization.” Journal of Strategic Security 15 (3): 27–49.

Eady, Gregory, Tom Paskhalis, Jan Zilinsky, Richard Bonneau, Jonathan Nagler, and Joshua A. Tucker. 2023. “Exposure to the Russian Internet Research Agency Foreign Influence Campaign on Twitter in the 2016 U.S. Election and Its Relationship to Attitudes and Voting Behavior.” Nature Communications 14 (1): 62.

GAO (U.S. Government Accountability Office). 2022. Information Environment: DOD Should Take Steps to Expand Its Assessments of Information Operations. Washington, DC.

Hunker, Jeffrey. 2010. “Cyber War and Cyber Power: Issues for NATO Doctrine.” NATO Defense College Research Paper, no. 62. Rome: NATO Defense College.

Maertens, Rakoen, Jon Roozenbeek, Melisa Basol, and Sander van der Linden. 2021. “Long-Term Effectiveness of Inoculation Against Misinformation: Three Longitudinal Experiments.” Journal of Experimental Psychology: Applied 27 (1): 1–16.

Munk, Tine. 2025. “The IT Army of Ukraine: Digital Civilian Resistance and International Law.” Crime, Law and Social Change 83 (1): 55–74.

Prier, Jarred. 2017. “Commanding the Trend: Social Media as Information Warfare.” Strategic Studies Quarterly 11 (4): 50–85.

Rid, Thomas. 2020. Active Measures: The Secret History of Disinformation and Political Warfare. New York: Farrar, Straus and Giroux.

Roozenbeek, Jon, and Sander van der Linden. 2019. “Fake News Game Confers Psychological Resistance Against Online Misinformation.” Palgrave Communications 5 (1): 65.

Shao, Chengcheng, Giovanni Luca Ciampaglia, Onur Varol, Alessandro Flammini, and Filippo Menczer. 2018. “The Spread of Low-Credibility Content by Social Bots.” Nature Communications 9 (1): 4787.

Ulbricht, Moritz. 2024. “Online Propaganda and Civilian Harm in Armed Conflicts.” International Review of the Red Cross 106 (1): 67–94.