THREAT DETECTION FLOW
From Origin to Capture — A Framework for Diplomatic Leaders
Threat Types
Actor Tiers
Frameworks
- A: Actors — Who is behind it?
- B: Behaviour — How do they operate?
- C: Content — What are they saying?
- D: Degree — How significant?
- E: Effect — What is the impact?
- 1: Planning — Objectives, targets
- 2: Preparation — Content, assets, infrastructure
- 3: Delivery — Initial distribution
- 4: Amplification — Networks, bots, proxies
- 5: Exploitation — Mainstream pickup
- 6: Effect — Impact achieved
- D1: Detection — Early warning, monitoring
- D2: Defensive Comms — Inoculation, rapid response
- D3: Digital Shielding — Protect presence, accounts
- D4: Development — Training, simulations
- 1: Understand — Monitor, identify, analyse
- 2: Prevent — Pre-bunk, inoculate, build resilience
- 3: Contain — Rapid response, limit spread
- 4: Recover — Restore trust, learn, strengthen
Principles
"Don't chase every falsehood. Strengthen resilience and produce authentic content that builds trust."
0. CORE PRINCIPLES
On Detection
Tools process data. Humans understand meaning. Neither is sufficient alone.
Focus on behaviour over content. Narratives change; tactics reveal the actor.
The tell is in the deviation. Prioritise what's different or unexpected.
What appears widespread may be manufactured. Consensus can be faked.
On Intelligence
Collectors are also targets.
If it's not written, it didn't happen for the institution.
What seems memorable now becomes uncertain within 48 hours.
Fact is what was observed. Comment is your interpretation. Flag the difference.
The picture emerges from connections. Isolated fragments become meaningful when combined.
Without priorities, intelligence gathering is random.
Cultures that punish uncertainty suppress valuable information.
On Response
Don't chase every falsehood. Strengthen resilience and produce authentic content that builds trust.
You can't stop every lie. You can prevent it from achieving its purpose.
Prevention is more effective than cure. Pre-bunking works like a vaccine.
Speed is decisive. Narratives set quickly. Delayed response allows false information to become established.
Corrections in friendly media don't reach audiences consuming disinformation.
Anonymous attacks have no consequences. Attribution creates accountability.
Covert operations depend on remaining hidden. Exposure is disruption.
On Strategy
Accept imperfect victory. Complete elimination of hostile narratives is unrealistic.
Win the story or lose the war. Factual corrections fail if the narrative frame remains hostile.
No single lever is sufficient. Effective counter-disinformation requires a portfolio approach.
Follow the money. Funding trails enable attribution and reveal operational scale.
Know where they're aiming. Deployment domains cluster around predictable targets.
Not all threats are equal. Assess to prioritise.
On Adversaries
The most effective manipulation is invisible. Question whether "independent" conclusions may have been engineered.
Foreign operations succeed by triggering domestic amplification.
The strategy isn't to create divisions — it's to amplify existing ones.
On Memory & Capture
The more vivid and absurd the mental image, the better it sticks.
Same-day capture is critical.
Without feedback, collectors operate blind.
0.1 US POLITICAL STAKEHOLDERS & INFLUENCERS
EXECUTIVE BRANCH (Foreign Policy)
State Department
Secretary of State: Marco Rubio (also acting National Security Advisor)
Deputy Secretary: Christopher Landau
Deputy Secretary for Management: Michael Rigas
Under Secretary (Economic): Jacob Helberg
Under Secretary (Arms Control): Thomas DiNanno
Under Secretary (Political Affairs): Allison Hooker
Under Secretary (Public Diplomacy): Sarah Rogers
National Security Council
Homeland Security Advisor: Stephen Miller (also Deputy Chief of Staff)
NSC was significantly reduced in Trump's second term; policy was pushed to agencies
CONGRESS — FOREIGN AFFAIRS
Senate Foreign Relations Committee
Chairman: Jim Risch (R-ID)
Ranking Member: Jeanne Shaheen (D-NH)
Key Republicans: Pete Ricketts (NE), Dave McCormick (PA), Steve Daines (MT), Bill Hagerty (TN), John Barrasso (WY), Mike Lee (UT), Rand Paul (KY), Ted Cruz (TX), Rick Scott (FL), John Curtis (UT), John Cornyn (TX)
Key Democrats: Chris Coons (DE), Chris Murphy (CT), Tim Kaine (VA), Jeff Merkley (OR), Cory Booker (NJ), Brian Schatz (HI), Chris Van Hollen (MD), Tammy Duckworth (IL), Jacky Rosen (NV)
House Foreign Affairs Committee
Chairman: Brian Mast (R-FL)
Ranking Member: Gregory Meeks (D-NY)
Key Members: Chris Smith (NJ), Joe Wilson (SC), Michael McCaul (TX), Maria Elvira Salazar (FL), Young Kim (CA)
THINK TANKS
Centrist/Establishment
Brookings Institution — consistently ranked #1 globally for foreign policy
Council on Foreign Relations (CFR) — publishes Foreign Affairs, hosts heads of state
Carnegie Endowment for International Peace — oldest, global reach, Beijing/Moscow centers
RAND Corporation — defence and security research
Wilson Center — congressional charter, broad policy
Center-Right/Atlanticist
Center for Strategic and International Studies (CSIS) — bipartisan, Georgetown origins
Atlantic Council — NATO/transatlantic focus, regional centers (including Adrienne Arsht Latin America Center)
German Marshall Fund — hosts Hamilton 2.0 disinformation tracking
Conservative
Heritage Foundation — significant influence in Trump administrations
Hudson Institute — hawkish foreign policy
American Enterprise Institute (AEI) — conservative domestic/foreign policy
Hoover Institution (Stanford) — conservative academic
Progressive/Libertarian
Cato Institute — libertarian, skeptical of intervention
Economic Policy Institute — labor-aligned
Regional/Specialist
American Foreign Policy Council (AFPC) — active on Eurasia
Foreign Policy Research Institute (FPRI) — Philadelphia-based
Foundation for Defense of Democracies (FDD) — national security focus
TOP LOBBYING FIRMS (by revenue, 2024-2025)
Ballard Partners — $88.3M in 2025, strong Trump administration ties (alumni include Pam Bondi, Susie Wiles)
Brownstein Hyatt Farber Schreck — highest revenue, multi-sector
Akin Gump Strauss Hauer & Feld — international trade, financial services
BGR Group — bipartisan, Fortune 500 clients
Holland & Knight — transportation, healthcare, defense
Cornerstone Government Affairs — bipartisan, employee-owned
Invariant (Heather Podesta) — tech, financial services, healthcare
Capitol Counsel — legislative strategy
Continental Strategy — rapid growth, Rubio/Wiles connections
MEDIA & JOURNALISTS (Foreign Policy Focus)
Print/Digital
Foreign Affairs (CFR publication) — Daniel Kurtz-Phelan, Editor
Foreign Policy — Ravi Agrawal, Editor-in-Chief
Washington Post — Yasmeen Abutaleb (White House)
New York Times — Steven Erlanger (Chief Diplomatic Correspondent, Europe)
Financial Times — Peter Spiegel (US Managing Editor), Ed Luce
Wall Street Journal — David Luhnow (UK Bureau Chief)
Politico — extensive DC coverage
The Atlantic — Kim Ghattas (contributing writer)
Broadcast
BBC — Lyse Doucet (Chief International Correspondent)
CNN — Elise Labott (formerly; now American University/Zivvy Media)
NPR — Michel Martin (Morning Edition), political team
MSNBC — Rachel Maddow
Fact-Checkers
Glenn Kessler (Washington Post Fact Checker)
Full Fact (UK, increasingly international)
PODCASTS & COMMENTATORS
Foreign Policy Focus
Pod Save the World — Tommy Vietor, Ben Rhodes (Crooked Media)
Deep State Radio — David Rothkopf, Rosa Brooks (Georgetown), Kori Schake (AEI), Ed Luce (FT)
FP Live — Ravi Agrawal (Foreign Policy)
FDD podcasts
General Political
Pod Save America — Crooked Media (progressive)
The NPR Politics Podcast
The Joe Rogan Experience — massive reach, frequent political guests
Countdown with Keith Olbermann
Breitbart News Daily (conservative)
Academic/Analytical
Ian Bremmer — Eurasia Group founder, frequent TV commentator
KEY INFLUENCER CATEGORIES TO TRACK
By Access Type:
Administration insiders and alumni
Congressional staffers (especially committee staff directors)
Former ambassadors and diplomats
Lobbyists with revolving-door connections
Think tank fellows who move in/out of government
By Issue Area:
China hawks/engagement advocates
Russia/Ukraine policy voices
Middle East specialists
Trade and economic policy
Defense and intelligence
KNOWLEDGE LOG
Threat Detection & Response Framework
Last Updated: January 2026
Contents
Welcome
The Threat Landscape
Detection Methods
The Human Layer
Defensive Strategies
Regional Focus: Middle East & US
Countering Deliberate Smearing
Glossary (37 Terms)
Tool Library (23 Tools)
Key Frameworks
1. WELCOME
Purpose
This Knowledge Log provides a practical, evolving reference for understanding and responding to information threats in contested environments. It is designed for rapid lookup during discussions and strategic planning. Many of the practices it captures are well established; some are emerging; all are intended to stimulate discussion at this stage.
Who This Is For
Senior diplomatic personnel operating in the US and the Middle East who face daily threat detection challenges in unstructured settings—social events, meetings with politicians, lobbyists, and media figures.
How to Use
During meetings: Quick-reference the Glossary for term definitions
For strategic planning: Review Context sections for methodology and best practices
For tool selection: Consult the Tool Library for detection and monitoring platforms
For workshop preparation: Study the Frameworks section for analytical models
Security Reminder: This document contains operational guidance. Handle in accordance with our organisation's information security protocols.
2. THE THREAT LANDSCAPE
What We're Facing
An increasingly digital and autonomous world.
The information environment has fundamentally changed. State and non-state actors now wage sophisticated campaigns to manipulate perception, erode trust in institutions, and influence policy decisions. These operations are cheaper, faster, and more effective than ever before.
Key Reality: 86% of Europeans recognise disinformation as a grave threat to democracy. The World Economic Forum's 2024 Global Risks Report ranked misinformation and disinformation as the most severe short-term global risk.
Types of Information Threats
Disinformation: False information deliberately created and spread to deceive. Intent is the key differentiator—this is a weaponised falsehood designed to achieve strategic objectives.
Misinformation: False information spread without malicious intent. Creates fertile ground for disinformation by establishing incorrect beliefs that can later be exploited.
Foreign Information Manipulation and Interference (FIMI): The EU's framework for state-sponsored manipulation. Focuses on the pattern of behaviour rather than just content—coordinated, intentional activity designed to negatively impact values, procedures, and political processes.
Key Threat Actors
Russia
The most sophisticated and aggressive state actor. Russia's 2025 budget allocates a record $1.4 billion to state propaganda—a 13% increase from 2024. Operations include RT and state media, the Social Design Agency troll farm, front-company financing (such as the $10M to US influencers via Tenet Media), and extensive use of proxies and deniable assets.
Primary Objectives: Undermine Western unity, erode Ukraine support, amplify social divisions
Key Tactics: Doppelganger operations, reflexive control, coordinated inauthentic behaviour
China
Increasingly active, with a different approach focused on promoting positive narratives about China and suppressing criticism. The "50 Cent Army" produces an estimated 448 million social media posts annually.
Primary Objectives: Shape global perception of China, suppress dissent, influence Taiwan narrative
Key Tactics: State media amplification, diaspora engagement, economic leverage
Iran
The IRGC conducts sophisticated cyber and influence operations through entities such as Emennet Pasargad. Active in both hack-and-leak operations and social media manipulation.
Primary Objectives: Counter US influence in the Middle East, support proxy networks, target Israel
Key Tactics: Hack-and-leak, fake personas, coordinated social media campaigns
Gulf States
Gulf countries wage information warfare against one another; the 2017 Qatar crisis was dubbed the "first social media cold war." Operations target both regional rivals and Western audiences.
Why This Matters for Us
We operate at the intersection of these threat streams. Every social interaction, policy discussion, and media engagement takes place in a contested information environment. Understanding the landscape is the first step to operating effectively within it.
3. DETECTION METHODS
The Detection Challenge
Detection requires identifying coordinated, deceptive behaviour amid the noise of genuine public discourse. The most sophisticated operations are designed to appear organic and authentic.
Critical Limitation: Tools can detect patterns and scale. Humans provide judgment and context. Neither is sufficient alone.
What to Look For
Behavioural Indicators
Coordination Patterns: Multiple accounts posting identical or near-identical content
Timing Anomalies: Activity concentrated during working hours in specific time zones
Network Structures: Unusual amplification patterns, bot-like behaviour
Account Characteristics: Recently created accounts, limited personal content, suspicious follower ratios
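The first two indicators above lend themselves to simple automation. The sketch below is illustrative only: the post fields, thresholds, and sample data are invented for this example, not drawn from any particular platform or tool.

```python
from collections import defaultdict, Counter

def normalise(text):
    """Lowercase and collapse whitespace so near-identical posts group together."""
    return " ".join(text.lower().split())

def coordination_clusters(posts, min_accounts=3):
    """Group accounts that published the same normalised text.
    `posts` is a list of dicts with 'account', 'text', and 'hour' keys
    (illustrative field names, not a real platform schema)."""
    by_text = defaultdict(set)
    for p in posts:
        by_text[normalise(p["text"])].add(p["account"])
    return {t: accts for t, accts in by_text.items() if len(accts) >= min_accounts}

def timing_concentration(posts, threshold=0.5):
    """True if at least `threshold` of all posts fall inside one contiguous
    8-hour window: a crude proxy for 'working hours in one time zone'."""
    hours = Counter(p["hour"] for p in posts)
    total = sum(hours.values())
    best = max(sum(hours[(h + i) % 24] for i in range(8)) for h in range(24))
    return best / total >= threshold

posts = [
    {"account": "a1", "text": "Sanctions have FAILED, share this!", "hour": 9},
    {"account": "a2", "text": "Sanctions have failed, share this!", "hour": 9},
    {"account": "a3", "text": "Sanctions  have failed, share this!", "hour": 10},
    {"account": "b1", "text": "Lovely weather in Geneva today.", "hour": 22},
]
clusters = coordination_clusters(posts)
```

Real detection pipelines use fuzzier text matching (shingling, embeddings) and statistical baselines rather than exact duplicates, but the shape of the logic is the same: find shared content across distinct accounts, then check whether the timing looks organic.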
Content Indicators
Narrative Alignment: Content that mirrors known state narratives
Emotional Manipulation: Designed to provoke outrage, fear, or division
Strategic Timing: Campaigns timed to elections, policy decisions, or crises
Professional Production: High-quality content from apparently grassroots sources
Detection Tools
Key categories:
Monitoring: Hamilton 2.0, Meltwater, Babel Street—platforms for tracking state narratives and media activity
Analysis: OSoMeNet, Botometer—network analysis and coordination detection
Verification: InVID/WeVerify, Truepic—forensic verification of media authenticity
Frameworks: DISARM, STIX—standardised taxonomies for sharing threat intelligence
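To make the last category concrete: STIX 2.1 is a published OASIS standard for exchanging threat intelligence as structured objects. The sketch below hand-builds a minimal Indicator object flagging a suspected impersonation domain. The name, description, and domain are invented placeholders, and production systems would normally use a dedicated library such as the stix2 Python package rather than raw dictionaries.

```python
import json
import uuid
from datetime import datetime, timezone

# Timestamps in the RFC 3339 form STIX expects.
now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.000Z")

# Minimal STIX 2.1 Indicator. Domain and descriptive text are placeholders.
indicator = {
    "type": "indicator",
    "spec_version": "2.1",
    "id": f"indicator--{uuid.uuid4()}",
    "created": now,
    "modified": now,
    "name": "Suspected doppelganger news domain",
    "description": "Clone of a legitimate outlet used to launder propaganda.",
    "indicator_types": ["malicious-activity"],
    "pattern": "[domain-name:value = 'example-news.co']",
    "pattern_type": "stix",
    "valid_from": now,
}

print(json.dumps(indicator, indent=2))
```

The value of the standard is that a partner organisation can ingest this object without bilateral negotiation over field names: the taxonomy does the translation.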
The OSINT Approach
Open Source Intelligence (OSINT) has become central to detection. The US State Department released its inaugural OSINT Strategy in May 2024, recognising that demand for unclassified assessments is growing.
Key Challenge: Ensuring open-source data is not itself tainted by disinformation. New guidance is forthcoming on tradecraft for reviewing information reliability and credibility.
AI in Detection
AI tools are now essential for processing volume, but come with caveats:
Strengths: Pattern recognition at scale, anomaly detection, real-time monitoring
Limitations: Context interpretation, novel tactic recognition, false positives
Risk: Adversaries also use AI—2,089 undisclosed AI-generated news sites identified across 16 languages
4. THE HUMAN LAYER
Why Humans Matter
Tools process data. Humans understand meaning. The most critical intelligence often comes from conversations, observations, and relationships that no algorithm can replicate.
Diplomatic Advantage: Ambassadors and diplomats have access that no journalist possesses. You are positioned to gather intelligence that cannot be obtained through open sources.
Human Intelligence (HUMINT)
Intelligence gathered through direct human contact. In the context of information threats, this means understanding intent, relationships, and context that cannot be derived solely from data.
Key HUMINT Skills
Elicitation: Extracting information through conversation without raising suspicion
Observation: Reading body language, social dynamics, and environmental cues
Relationship Building: Developing sources who can provide ongoing insight
Verification: Cross-referencing human intelligence with other sources
Elicitation Awareness
The FBI defines elicitation as "a technique to collect information not readily available without raising suspicion." It is subtle, non-threatening, easy to disguise, deniable, and effective. You are both a practitioner and a target.
Common Elicitation Techniques
Flattery: Praise that compels elaboration or sharing to justify the compliment
Deliberate False Statements: Overstating facts to provoke correction with accurate information
Quid Pro Quo: Sharing information to encourage reciprocation
Feigned Ignorance: Pretending to know nothing, exploiting the need to educate
Active Listening: Full engagement that encourages the speaker to continue
Counter-Intelligence Awareness: Remember: collectors are also targets. The same techniques we might use are being used on us at every reception, dinner, and meeting.
Fact vs. Comment
Rigorous separation of observation from interpretation is essential for reliable intelligence. The tell is in the discipline.
Fact: "The minister said X" (what was observed)
Comment: "This appears to indicate Y" (our interpretation)
Flag analysis explicitly. Confusion between fact and assessment degrades intelligence value.
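One lightweight way to enforce that discipline is to capture every item in a structure that forces the writer to label fact and comment separately. The record below is an illustrative sketch, not an institutional standard; the field names are invented.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class IntelItem:
    """A single captured observation, with fact and comment kept apart."""
    source: str            # who or where it came from
    fact: str              # what was actually observed or said
    comment: str = ""      # analyst interpretation, always flagged explicitly
    captured: date = field(default_factory=date.today)

    def render(self):
        """Emit the item with FACT and COMMENT visibly separated."""
        lines = [f"FACT ({self.source}, {self.captured}): {self.fact}"]
        if self.comment:
            lines.append(f"COMMENT: {self.comment}")
        return "\n".join(lines)

item = IntelItem(
    source="Reception, 12 Jan",
    fact="The minister said the oil quota decision is delayed.",
    comment="May indicate internal disagreement over pricing.",
)
```

Because the comment field is optional and labelled, a bare observation can be filed without interpretation, and any interpretation that is filed cannot masquerade as fact.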
Memory Palace Technique
Ancient mnemonic method validated by modern research. CIA analysts trained for just one hour showed 45% higher recall, retained 57% more after one week, and were 5x more likely to achieve perfect recall.
How It Works
Pre-select a mental location: A familiar building, route, or space with 10-15 distinct spots
During conversation: Mentally place key information at specific locations using vivid, absurd associations
Immediately after: Walk through your mental palace in a private moment to retrieve information
Same day: Formal debrief while memory is fresh
Making It Stick: The more vivid and absurd the mental image, the better it sticks. Place the minister's comment about oil prices next to a giant barrel flooding your childhood kitchen. Strange = memorable.
Anomaly Hunting
Prioritise what's different or unexpected over routine information. The tell is in the deviation.
What surprised you?
What contradicted expectations?
What changed from previous interactions?
What was conspicuously absent?
5. DEFENSIVE STRATEGIES
The Four Functions
NATO's framework for addressing information threats encompasses four key functions:
Understand: Monitor the environment, identify threats, analyse tactics and objectives
Prevent: Pre-bunking, stakeholder inoculation, building resilience before attacks
Contain/Mitigate: Rapid response, message injection, limiting spread and impact
Recover: Restoring trust, learning from incidents, strengthening defences
Pre-Bunking vs. Debunking
Pre-Bunking (Prevention)
Warning about manipulation techniques before exposure. Research shows this is more durable than correction—it functions like a vaccine against deception.
Inoculate audiences against specific tactics
Build critical thinking before exposure
More effective than post-hoc correction
Debunking (Response)
Correcting false information after it has spread. Necessary but slower and less effective than prevention.
Speed matters—correct quickly, or the narrative sets
"Truth sandwich" technique: lead with truth, mention a lie, end with truth
Repetition of a false claim can inadvertently reinforce it
Stakeholder Inoculation
Pre-brief key audiences on expected manipulation tactics before they encounter them. This dramatically reduces the impact when attacks come.
Identify key stakeholders and opinion leaders
Brief on likely attack vectors and narratives
Provide context and counter-arguments in advance
Build trusted channels for rapid communication
Strategic Communications
Coordinated messaging to protect and advance interests. Not PR—integrated with policy, intelligence, and operations.
Ukraine Lesson: Don't chase every falsehood. Focus on strengthening resilience and producing authentic content that builds trust. The goal is narrative denial—denying the adversary's strategic objective—not winning every factual argument.
Speed and Pre-Positioning
Authoritarian adversaries act swiftly and in unison. Democratic responses require pre-positioning:
Know our vulnerabilities in advance
Have counter-narratives prepared
Brief allies before crises hit
Build relationships with third-party validators
Maintain rapid response kits
6. REGIONAL FOCUS: MIDDLE EAST & US
Middle East
Information Ecosystem
Historically closed and tightly controlled. Social media became a cheap expansion tool for influence operations. Different actors pursue different goals:
Egypt: Focuses primarily on domestic information control
Gulf States: Use influence operations for regional hegemony
Iran: Targets regional enemies, diaspora, and Western audiences
The 2017 Qatar Crisis
Dubbed the "first social media cold war," coordinated campaigns between Gulf states demonstrated the weaponisation of information in regional disputes. Among 15 platform takedowns from Egypt, Saudi Arabia, and the UAE, at least 10 portrayed Qatar, Turkey, and Iran as terrorism sponsors.
Iran's Strategy
Discredit domestic and foreign enemies
Pacify the Iranian population
Strengthen follower loyalty
Recruit supporters internationally
International targets include radical anti-American/anti-Israel groups, marginalised minorities, Iranian diaspora, and Muslim communities globally.
Al-Ahli Hospital Case Study
An example of contested narrative warfare: competing versions of events were planted online immediately, while restricted ground access prevented proper investigation, demonstrating how information voids are exploited.
United States
Russian Approach
Moscow exploits existing divisions: gun control, ethnic rivalries, police-community tensions, and abortion. The strategy isn't to create divisions—it's to amplify existing ones.
Tactical Evolution
2016: Released hacked information directly
2020: Laundered narratives through prominent Americans and the US media
Post-2022: Adopted harder-to-detect techniques, including fake versions of legitimate Western news sites
2024: Covert funding of American influencers ($10M via Tenet Media)
2024 Election Focus
Russian operations particularly focused on undermining Ukraine's support. The intelligence community assessed that Russia sought to influence the election outcome, with a specific interest in the impact on US foreign policy.
Domestic Amplification: "The scale and scope of domestic disinformation is far greater than anything a foreign adversary could do to us." Foreign operations succeed by triggering domestic amplification.
7. COUNTERING DELIBERATE SMEARING
The 4D Framework
D1 - Detection: Predictive analytics, early warning indicators, continuous monitoring for emerging threats
D2 - Defensive Communication: Message injection into hostile ecosystems, stakeholder inoculation, rapid response kits
D3 - Digital Shielding: Protecting institutional presence, securing accounts, monitoring for impersonation
D4 - Development: Training, simulations, crisis playbooks, continuous capability building
Key Principles
Accept Imperfect Victory
Complete elimination of hostile narratives is unrealistic. Focus on denying the adversary's strategic objective regardless of their tactics. You can't stop every lie—you can prevent it from achieving its purpose.
Speed is Decisive
Authoritarian adversaries act swiftly and in unison. Democratic responses must be pre-positioned:
Know our vulnerabilities before they're exploited
Have counter-narratives ready
Brief allies in advance
Build third-party validator relationships
Narrative Denial
Focus on denying the adversary's strategic objective rather than winning every factual argument. Sometimes the best response is not to engage with false claims but to advance your own authentic narrative.
Response Options
Attribution and Exposure: Publicly identifying the source of attacks imposes reputational costs and enables further action. Exposure degrades operational effectiveness.
Stakeholder Communication: Direct outreach to key audiences with accurate information and context. Pre-positioned relationships pay off in crisis moments.
Third-Party Validators: Credible independent voices carry more weight than self-defence. Cultivate relationships with fact-checkers, journalists, and respected commentators.
Legal Action: Where defamation is clear and jurisdiction permits. High-profile cases such as Dominion Voting Systems' $787.5M settlement with Fox News demonstrate the potential for accountability.
The Ukraine Approach: Don't chase falsehoods. Strengthen resilience and produce authentic content to build trust. The goal is maintaining credibility, not correcting every lie.
8. GLOSSARY (37 Terms)
Threat Landscape Terms
NEW GLOSSARY ENTRIES
NAVIGATIONAL PLAN
A strategic blueprint combining mission objective, situational chart, and operational specifics. Not aspirational—practical guidance for where the organization is headed and how it will get there.
The navigational plan answers three questions: What is the mission objective? What does the terrain look like? What specifically do we do?
Why It Matters: Strategy without navigation is wishful thinking. Vision statements that lack a chart and specifics produce drift, not direction.
DOUBLE DOWN DYNAMIC
The escalation pattern where both defender and attacker intensify efforts following failed attacks. The attacker, having invested resources and failed, increases commitment rather than withdrawing. The defender, having succeeded, capitalizes on momentum.
This creates a ratcheting effect: each cycle raises the stakes for both sides.
Why It Matters: Victory doesn't end the contest. Successful defense triggers intensified attack. Plan for the next wave before the current one ends.
ENERGY DISPLACEMENT
Strategic approach where investing in positive opportunity-building diverts adversary resources away from attacks, without directly engaging the attack itself.
The logic: strength built in one area creates capital that weakens the adversary's ability to attack in another. You don't fight where they want you to fight—you build where they can't follow.
Why It Matters: You don't have to win every fight. Playing your game instead of theirs shifts the contest to ground you choose.
EXTRACTION
The deliberate process of capturing knowledge held in individuals' heads and converting it into institutional intelligence. Particularly critical when expertise is concentrated in few people.
Extraction requires structured conversation, not passive hope that knowledge will trickle down. What isn't extracted has no institutional value.
Why It Matters: Knowledge that exists only in one person's head is an organizational vulnerability. If they leave, get sick, or simply aren't in the room, that intelligence is unavailable.
ABSORPTION FAILURE
The gap between intelligence produced and intelligence internalized by the organization. Reports generated but not absorbed create an illusion of awareness without actual understanding.
Fifty reports that no one reads provide no more protection than zero reports. Production is not absorption.
Why It Matters: Organizations often believe they know what their reports contain. They don't. Repetition and simplification are not failures of sophistication—they're requirements for actual understanding.
FLEX DELIVERABLE
A strategic communication product designed for dual purpose: internal alignment and external impression. Must be substantively credible while also being visually impactful.
The same work serves multiple audiences with appropriate presentation. Internal rigor provides the foundation; external polish provides the impact.
Why It Matters: Internal strategy documents that can't flex externally waste opportunity. Work that only impresses internally or only impresses externally is half-built.
EAGER BEAVER THREAT
Internal actors who undermine strategy because they believe they know better, often pursuing independent action that fragments organizational response.
Not malicious—often highly motivated. The damage comes from uncoordinated action that contradicts or complicates the agreed approach.
Why It Matters: Not all threats are external. Internal freelancing can be as damaging as coordinated attack. Strategy requires managing internal alignment as seriously as external defense.
LONE WARRIOR PROBLEM
When a highly capable operator prefers independent action over institutional coordination, creating both dependency and knowledge silos.
The lone warrior often delivers results. The problem is that no one else can replicate what they do, and when they're unavailable, the capability disappears.
Why It Matters: Capability concentrated in individuals who won't systematize creates institutional fragility. The organization depends on people it cannot replace or even fully understand.
THREAT PRIORITY ASSESSMENT
The structured ranking of threats by significance, enabling focused response rather than scattered reaction. Establishes shared understanding of what matters most and why.
Without prioritization, organizations treat all threats equally—which means treating serious threats inadequately and wasting resources on minor ones.
Why It Matters: Finite resources require choices. Priority assessment makes those choices explicit and shared rather than implicit and fragmented.
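A priority assessment does not need elaborate tooling. As a minimal sketch, score each threat on likelihood and impact and rank by the product; the 1-5 scales and example threats below are purely illustrative.

```python
def prioritise(threats):
    """Rank threats by likelihood x impact (each scored 1-5).
    Highest priority first; ties broken alphabetically for a stable order."""
    return sorted(threats, key=lambda t: (-t["likelihood"] * t["impact"], t["name"]))

# Invented example entries, not an actual threat register.
threats = [
    {"name": "Impersonation of official accounts", "likelihood": 4, "impact": 4},
    {"name": "Hack-and-leak targeting the mission", "likelihood": 2, "impact": 5},
    {"name": "Low-grade troll harassment", "likelihood": 5, "impact": 2},
]
ranked = prioritise(threats)
```

The point is not the arithmetic but the forcing function: scoring makes the organisation's implicit priorities explicit, visible, and arguable.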
THREAT RESPONSE MATURITY
The progression from reactive surprise through tracking and anticipation to proactive posture. Organizations move through stages as they learn from experience.
Stage 1 - Surprised: Clueless about the sophistication of attacks.
Stage 2 - Tracking: Able to keep track of what's happening.
Stage 3 - Anticipating: Forecasting, reading patterns.
Stage 4 - Proactive: Shaping the environment before attacks materialize.
Why It Matters: Knowing where you are on the maturity curve tells you what capabilities to build next. Organizations stuck at early stages keep getting surprised by threats they could have anticipated.
STRATEGIC CENTER OF GRAVITY
The relationship, capability, or asset that, if severed, collapses the defender's position. All serious attacks ultimately target the center of gravity, whether directly or by weakening its supports.
Identifying your center of gravity clarifies what must be protected at all costs. Identifying the adversary's center of gravity clarifies where pressure is most effective.
Why It Matters: Defending everything equally defends nothing adequately. Know what you cannot afford to lose.
VECTOR SHIFT
When adversaries move attacks to new domains or methods after failing in existing ones. Successful defense in one area triggers probing in others.
Attackers don't stop when blocked—they redirect. A defended front creates pressure to find undefended flanks.
Why It Matters: Closing one vulnerability opens focus on the next. Defense is not a single problem to solve but a continuous adaptation to shifting pressure.
SITUATIONAL AWARENESS
The shared map that enables coordinated action. Without it, individuals navigate by their own partial view, creating fragmented and contradictory responses.
Situational awareness must be compiled and visible to all. The map exists to be looked at together, not filed and forgotten.
Why It Matters: Organizations that lack shared situational awareness have multiple people operating on multiple assumptions. Coordination becomes impossible when everyone sees different terrain.
DISINFORMATION: False information deliberately created and spread to deceive. Intent is the key differentiator—weaponised falsehood for strategic objectives. Why It Matters: Intent determines response. Disinformation requires attribution and counter-strategy; misinformation may only need correction.
MISINFORMATION: False information spread without harmful intent. People share it believing it's true. Creates fertile ground for exploitation. Why It Matters: Requires education rather than confrontation.
FIMI (Foreign Information Manipulation and Interference): Coordinated foreign efforts to manipulate information environments. EU framework focusing on behaviour rather than content—coordinated, intentional, manipulative activity. Why It Matters: Common vocabulary with European partners. Shifts focus from content to behaviour.
NARRATIVE WARFARE: The contest to control how events are interpreted and remembered. Beyond individual false claims to the battle over meaning. Why It Matters: Win the story or lose the war. Factual corrections fail if the narrative frame remains hostile.
COGNITIVE WARFARE: Operations targeting perception, judgement, and decision-making to alter behaviour. The battlespace is inside people's heads. Why It Matters: Understanding cognitive vulnerabilities is essential for both defence and recognising when you're being targeted.
COORDINATED INAUTHENTIC BEHAVIOUR (CIB): Multiple actors working together while hiding their coordination. Fake grassroots—the appearance of organic support through coordinated fake accounts. 38,000+ accounts detected in EU FIMI operations. Why It Matters: Consensus can be faked. What appears widespread may be manufactured.
DOPPELGANGER: Operations impersonating legitimate news outlets to launder propaganda. Russian operations create fake versions of Western news sites. Why It Matters: Brand trust is being stolen. Verify sources—familiar sites may be hostile impersonation.
REFLEXIVE CONTROL: Manipulating perception so adversaries make decisions serving your interests while believing the choice was theirs. Russian doctrine—target-led to predetermined decision through controlled information. Why It Matters: Most effective manipulation is invisible. Question whether "independent" conclusions may have been engineered.
Tradecraft Terms
OSINT (Open Source Intelligence): Intelligence derived from publicly available sources. Media, social platforms, public records. Fast and shareable, but can be poisoned. Why It Matters: Increasingly central to intelligence work. Quality control is essential; open sources can be manipulated.
HUMINT (Human Intelligence): Intelligence gathered through direct human contact. The only reliable way to understand intent. Cannot be replaced by technical means. Why It Matters: Tools process data. Humans understand meaning. HUMINT provides context no algorithm can replicate.
ELICITATION: Extracting information through conversation without the target realising. Subtle, non-threatening, easy to disguise, deniable, and effective. Why It Matters: It's happening at every reception. You are both practitioner and target.
TTPs (Tactics, Techniques, Procedures): The patterns of how threat actors operate. Tactics are goals, techniques are methods, procedures are combinations. Harder to change than content. Why It Matters: Focus on behaviour over content. Narratives change; tactics reveal the actor.
PERSONA: A constructed identity used to establish false credibility. Fake identities appearing as authentic voices: journalists, experts, concerned citizens. Why It Matters: The messenger is the weapon. Verify the identity and credentials of new contacts.
MEMORY PALACE (Method of Loci): Ancient technique of placing information mentally in familiar locations for retrieval. CIA validated: 45% better recall, 57% more retention, 5x perfect recall rate. Why It Matters: Enables information capture where notes are impossible. One hour training produces measurable improvement.
ANOMALY HUNTING: Prioritising what's different or unexpected over routine information. The tell is in the deviation. Why It Matters: Routine information has limited value. Anomalies signal change, opportunity, or threat.
FACT vs. COMMENT: Rigorous separation of observation from interpretation. Fact: "The minister said X." Comment: "This appears to indicate Y." Why It Matters: Confusion between fact and assessment degrades intelligence value.
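The anomaly-hunting idea can be sketched as a simple statistical filter over activity data: routine days set the baseline, and sharp deviations are flagged for human review. The function name, input shape, and threshold below are illustrative assumptions, not part of any standard tool.

```python
from statistics import mean, stdev

def flag_anomalies(daily_counts, threshold=3.0):
    """Flag days whose activity deviates sharply from the baseline.

    daily_counts: list of (day_label, count) pairs; illustrative input.
    threshold: z-score above which a day is flagged (assumed value).
    """
    counts = [c for _, c in daily_counts]
    mu, sigma = mean(counts), stdev(counts)
    if sigma == 0:
        return []  # perfectly flat baseline: nothing deviates
    return [day for day, c in daily_counts if abs(c - mu) / sigma > threshold]

# A flat baseline with one manufactured spike:
series = [("Mon", 100), ("Tue", 104), ("Wed", 98), ("Thu", 1020), ("Fri", 101)]
print(flag_anomalies(series, threshold=1.5))  # → ['Thu']
```

A real pipeline would use a rolling baseline and account for weekly cycles; the point here is only that "the tell is in the deviation" translates directly into a measurable rule.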
Detection Terms
KILL CHAIN: The sequence of steps required to execute an attack. Breaking operations into phases enables intervention at multiple points. Why It Matters: Understand the process and identify intervention points. Early detection enables prevention.
PRE-BUNKING: Warning audiences about manipulation techniques before they encounter them. More durable than post-hoc correction; functions like vaccination. Why It Matters: Prevention is more effective than cure. Builds lasting resilience.
DEBUNKING: Correcting false information after it has spread. Necessary but slow. "Truth sandwich": lead with the truth, mention the lie, end with the truth. Why It Matters: Speed matters. Delayed response allows false information to become established.
COLLECTION PRIORITIES: Pre-defined intelligence requirements that focus attention during encounters. Why It Matters: Without priorities, intelligence gathering is random. Defined requirements focus attention on what matters.
Response Terms
STRATEGIC COMMUNICATIONS (StratCom): Coordinated messaging to protect and advance interests. Not PR; integrated with policy, intelligence, and operations. Why It Matters: Fragmented communication creates vulnerabilities. Coordinated StratCom presents unified messaging.
RAPID RESPONSE: Pre-positioned capacity to counter hostile narratives within hours. Speed is decisive. Why It Matters: Narratives set quickly. Delayed response allows false information to become established.
STAKEHOLDER INOCULATION: Pre-briefing key audiences on manipulation tactics before the encounter. Reduces impact dramatically. Why It Matters: Inoculated stakeholders resist manipulation and become defenders.
MESSAGE INJECTION: Introducing accurate counter-messages into hostile channels. Meeting disinformation where it circulates. Why It Matters: Corrections in friendly media don't reach audiences consuming disinformation.
NARRATIVE DENIAL: Denying the adversary's strategic objective rather than winning every factual argument. Why It Matters: You can't stop every lie. You can prevent lies from achieving their purpose.
ATTRIBUTION: Publicly identifying the source of disinformation operations. Naming the actor changes the game. Why It Matters: Anonymous attacks have no consequences. Attribution creates accountability.
EXPOSURE: Making hostile operations visible to degrade effectiveness. Shining light on operations. Why It Matters: Covert operations depend on remaining hidden. Exposure is disruption.
Institutional Terms
DEBRIEF DISCIPLINE: Structured immediate capture after encounters, separating observation from interpretation. Same-day capture is critical. Why It Matters: What seems memorable now becomes uncertain within 48 hours.
COLLECTION DISCIPLINE: Institutional expectation that significant encounters generate structured reports. If it's not written, it didn't happen for the institution. Why It Matters: Intelligence that isn't recorded has no institutional value.
FUSION: Combining fragments from multiple sources into a coherent intelligence picture. Patterns emerge from aggregation. Why It Matters: The picture emerges from connections. Isolated fragments become meaningful when combined.
TASKING: Directing collection based on identified gaps and priorities. Analysis reveals gaps; tasking fills them. Why It Matters: Random collection is inefficient. Tasking focuses effort on what's needed.
FEEDBACK LOOPS: Telling collectors when their reporting was useful, reinforcing good practice. Why It Matters: Without feedback, collectors operate in the dark. Closed loops improve quality.
SAFE-TO-SHARE CULTURE: An environment where uncertain or incomplete information can be reported without penalty. Why It Matters: Cultures that punish uncertainty suppress valuable information.
COUNTER-INTELLIGENCE AWARENESS: Understanding that collectors are also targets for elicitation. Why It Matters: You are a target. Every interaction that might yield intelligence can also extract it from you.
SITUATION ROOM: The physical and procedural hub where collection, fusion, analysis, and dissemination converge. Not just a room; a discipline. Why It Matters: Without a hub, intelligence fragments. Creates institutional capacity for threat awareness.
FUNDING STREAMS: How disinformation operations are financed, whether through state budgets, proxies, or commercial operators.
Russia's 2025 budget allocates $1.4 billion to state propaganda. Front companies like Tenet Media funneled $10M to US influencers. Commercial disinformation-for-hire operates in 48+ countries with services starting at $8 per post. China's "50 Cent Army" produces 488 million posts annually.
Why It Matters: Follow the money. Funding trails enable attribution, provide targets for sanctions, and reveal operational scale and priorities.
THREAT MAPPING: Systematic charting of actors, capabilities, networks, narratives, and channels.
The ABCDE framework (Actors, Behaviour, Content, Degree, Effect) provides structure. Mapping dimensions include: Actor tiers (official to aligned), Capability (production, distribution, technical), Network (relationships, command, funding), Narrative (strategic and tactical), Channel (platforms, traditional media).
Why It Matters: The picture emerges from connections. Map it to see it. Individual incidents become meaningful when charted as part of a larger ecosystem.
DEPLOYMENT DOMAINS: The sectors and contexts where information operations concentrate.
Primary domains: Elections (most heavily targeted), Defence and national security, Energy and economic policy, Social cohesion and identity, Public health, Bilateral relationships. Research shows attacks intensify around elections and political unrest.
Why It Matters: Know where they're aiming. Deployment domains cluster around predictable targets. Harden defences where vulnerabilities are greatest.
IMPACT ASSESSMENT: Evaluating the ramifications of information operations across multiple categories.
Categories: Democratic processes (86% of Europeans see threat to democracy), Reputational damage (personal, organisational, national), Policy disruption, Social cohesion (71% frequently encounter disinformation), Security implications. Assessment factors: reach, resonance, persistence, amplification, real-world consequences.
Why It Matters: Not all threats are equal. Assess to prioritise. Impact assessment enables proportionate response—neither ignoring threats nor overreacting to noise.
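The assessment factors above (reach, resonance, persistence, amplification, real-world consequences) lend themselves to a simple weighted score for triage. The weights and ratings below are illustrative assumptions, not values from this framework.

```python
# Illustrative weights: the framework lists the factors but does not
# prescribe how to weight them, so these numbers are assumptions.
FACTORS = {
    "reach": 0.25,
    "resonance": 0.25,
    "persistence": 0.15,
    "amplification": 0.15,
    "real_world_consequences": 0.20,
}

def impact_score(ratings):
    """Combine 0-10 analyst ratings per factor into one weighted score."""
    return sum(FACTORS[f] * ratings[f] for f in FACTORS)

# Hypothetical incident: wide reach and heavy amplification,
# but little persistence or real-world effect so far.
incident = {"reach": 8, "resonance": 6, "persistence": 3,
            "amplification": 9, "real_world_consequences": 2}
print(round(impact_score(incident), 2))  # → 5.7
```

Even a crude score like this supports the principle of proportionate response: incidents can be ranked and thresholds set, rather than treating every piece of noise as a crisis.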
LEVERAGE POINTS: Pressure and response options available for countering information threats.
Toolkit: Attribution and exposure (naming actors imposes cost), Sanctions (Treasury designated RT executives, IRGC actors), Legal action (indictments, FARA, civil litigation—Dominion won $787M), Platform action (Meta banned RT globally), Allied coordination (G7 RRM, NATO, EEAS), Counter-narrative and StratCom, Support for independent media.
Why It Matters: Know your tools. Match response to threat. No single lever is sufficient—effective counter-disinformation requires a portfolio approach.
9. TOOL LIBRARY (23 Tools)
Monitoring & Tracking
Hamilton 2.0 Dashboard Alliance for Securing Democracy / German Marshall Fund Tracks themes and messaging from Russian, Chinese, and Iranian state media and diplomatic accounts. Essential for understanding state narrative priorities.
Meltwater Commercial AI-powered media monitoring across news, social, and broadcast. Enterprise-grade for tracking coverage and emerging narratives.
Babel Street Commercial AI-enabled OSINT platform with multilingual social media monitoring. Strong capabilities for tracking cross-platform activity.
CrowdTangle Meta (Limited Access) Facebook/Instagram public content tracking. Access is increasingly restricted but valuable for understanding platform dynamics.
EUvsDisinfo Database EEAS East StratCom Task Force Searchable database of pro-Kremlin disinformation cases. Essential reference for Russian narrative patterns.
Network Analysis & Bot Detection
OSoMeNet (Observatory on Social Media) Indiana University Network analysis and visualisation for social media. Maps information diffusion and coordination patterns.
Botometer Indiana University Bot likelihood scoring for Twitter accounts. Useful for initial screening of suspicious amplification.
Hoaxy Indiana University Visualises the spread of claims and fact-checks on Twitter. Shows how information propagates through networks.
Graphika Commercial Network mapping and influence operation detection. Used by platforms and researchers for CIB investigations.
Verification & Forensics
InVID / WeVerify EU-funded Video and image verification toolkit. Browser plugin for reverse image search, metadata analysis, and keyframe extraction.
Truepic Commercial Authenticated media capture and verification. Establishes provenance for images and video at the point of creation.
FotoForensics Free Error level analysis for detecting image manipulation. Useful for identifying edited regions.
TinEye Commercial/Free tier Reverse image search. Traces image origins and identifies modifications across the web.
Hive Moderation Commercial AI-generated content detection including deepfakes. API-based for integration into verification workflows.
Frameworks & Standards
DISARM Framework DISARM Foundation Open-source framework for analysing disinformation TTPs. The standard taxonomy adopted by EEAS, NATO, and researchers.
STIX (Structured Threat Information Expression) OASIS Standardised language for sharing cyber threat intelligence. Increasingly used for FIMI incident data exchange.
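As a rough illustration of the kind of object STIX defines, the sketch below builds a minimal STIX 2.1 indicator as a Python dictionary for a suspected impersonation domain. The domain value and labels are invented, and the OASIS STIX 2.1 specification remains the authority on required properties and pattern syntax.

```python
import json
import uuid
from datetime import datetime, timezone

# Minimal STIX 2.1 indicator sketch. All values are illustrative;
# real exchanges should be validated against the STIX 2.1 spec.
now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.000Z")
indicator = {
    "type": "indicator",
    "spec_version": "2.1",
    "id": f"indicator--{uuid.uuid4()}",       # STIX IDs are type--UUID
    "created": now,
    "modified": now,
    "name": "Suspected doppelganger domain",
    "pattern": "[domain-name:value = 'examp1e-news.example']",  # invented
    "pattern_type": "stix",
    "valid_from": now,
    "labels": ["fimi", "impersonation"],
}
print(json.dumps(indicator, indent=2))
```

Packaging FIMI observations in a standard shape like this is what makes machine-to-machine sharing between partners (EEAS, platforms, CERTs) practical.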
Fact-Checking Networks
IFCN (International Fact-Checking Network) Poynter Institute Global network of verified fact-checkers. The Code of Principles ensures methodology standards.
ClaimBuster University of Texas Arlington AI-powered claim detection and fact-check matching. Identifies check-worthy statements automatically.
Full Fact UK Charity Independent UK fact-checker. Also develops automated fact-checking tools and AI verification systems.
Research & Intelligence
DFRLab (Digital Forensic Research Lab) Atlantic Council Research and rapid-response analysis of disinformation. Publishes investigations and methodology guidance.
Stanford Internet Observatory Stanford University Academic research on internet abuse, including disinformation. Produces detailed platform takedown analyses.
Bellingcat Independent Open-source investigation collective. Pioneered OSINT methodology and publishes detailed investigative guides.
EU DisinfoLab Brussels-based NGO Research and investigations into disinformation. Produces detailed reports on influence networks.
Government Resources
GEC (Global Engagement Center) US State Department US government coordination for counter-disinformation. Produces reports on state propaganda ecosystems.
Tool Selection Guidance: No single tool is sufficient. Effective detection requires combining monitoring platforms for early warning, network analysis for coordination detection, verification tools for content authentication, and human judgement for context and interpretation.
10. KEY FRAMEWORKS
ABCDE Framework
The EEAS standard for FIMI analysis provides a structured approach to incident assessment.
Element Focus Key Questions
A - Actors: Who is behind it? State/non-state? Proxies? Affiliations? History?
B - Behaviour: How do they operate? TTPs used? Coordination? Bot involvement? Targeting?
C - Content: What are they saying? Narratives? Key terms? Disinformation type?
D - Degree: How significant? Scale? Audience reached? Level of coordination?
E - Effect: What is the impact? Protection risks? Reputational damage? Security implications?
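As a data structure, an ABCDE assessment maps naturally onto a simple incident record: one field per element. The schema and example values below are an illustrative sketch, not an EEAS-defined format.

```python
from dataclasses import dataclass

# Sketch of an ABCDE incident record; field names mirror the framework,
# but this schema and the sample values are assumptions for illustration.
@dataclass
class FimiIncident:
    actors: str        # A: who is behind it (state/non-state, proxies)
    behaviour: list    # B: TTPs observed (coordination, bots, targeting)
    content: str       # C: narratives and key terms
    degree: str        # D: scale, audience reached, coordination level
    effect: str        # E: observed impact (reputational, security)

incident = FimiIncident(
    actors="Tier 3 state-linked troll farm (suspected)",
    behaviour=["coordinated posting", "bot amplification"],
    content="Narrative undermining election integrity",
    degree="~500 accounts, national reach",
    effect="Picked up by two mainstream outlets",
)
print(incident.actors)
```

Recording incidents in a fixed shape like this is what makes later fusion possible: fields can be compared across incidents, and gaps (an empty `actors` field, say) become explicit tasking requirements.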
Kill Chain Model
Adapted from cybersecurity, breaking information operations into phases for intervention.
Planning - Adversary identifies objectives, targets, and develops an operational approach
Preparation - Content creation, asset development, infrastructure setup
Delivery - Initial distribution through controlled channels and platforms
Amplification - Coordinated boosting through networks, bots, and proxies
Exploitation - Mainstream pickup, target audience engagement, narrative spread
Effect - Achieved impact on perceptions, decisions, or behaviours
Intervention Points: Each phase offers different disruption opportunities. Early detection enables prevention. Later phases require response and mitigation. The goal is to "kill" the attack at any stage before achieving its objective.
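The six phases above form an ordered sequence, which can be sketched as an enumeration with a phase-to-countermeasure mapping. The phase names come from the model; the per-phase countermeasures listed in the mapping are illustrative assumptions drawn loosely from elsewhere in this document.

```python
from enum import IntEnum

class KillChainPhase(IntEnum):
    PLANNING = 1
    PREPARATION = 2
    DELIVERY = 3
    AMPLIFICATION = 4
    EXPLOITATION = 5
    EFFECT = 6

# Illustrative countermeasures per phase (assumptions, not the model itself).
INTERVENTIONS = {
    KillChainPhase.PLANNING: "early warning, threat mapping",
    KillChainPhase.PREPARATION: "infrastructure exposure, platform reports",
    KillChainPhase.DELIVERY: "pre-bunking target audiences",
    KillChainPhase.AMPLIFICATION: "bot detection, network takedowns",
    KillChainPhase.EXPLOITATION: "rapid response, debunking",
    KillChainPhase.EFFECT: "attribution, recovery, lessons learned",
}

def remaining_interventions(detected_phase):
    """List the disruption options still open from the detected phase on.

    The earlier the detection, the more phases remain to break the chain.
    """
    return [(p.name, INTERVENTIONS[p]) for p in KillChainPhase
            if p >= detected_phase]

print(len(remaining_interventions(KillChainPhase.DELIVERY)))  # → 4
```

The ordering property is the point: detection at Delivery still leaves four phases at which the operation can be disrupted, while detection at Effect leaves only mitigation.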
Actor Tier Classification
Framework for categorising threat actors by their relationship to state sponsors.
Tier Description Examples
Tier 1: Official Government departments, embassies, official spokespeople. Examples: Russian MFA, Chinese embassy accounts
Tier 2: State-Controlled State media and state-owned entities. Examples: RT, Sputnik, CCTV, CGTN
Tier 3: State-Linked Covert Front companies, troll farms, intelligence operations. Examples: IRA, Social Design Agency, Emennet Pasargad
Tier 4: State-Aligned Sympathetic media, useful amplifiers, unwitting proxies. Examples: aligned commentators, amplifying accounts
The 4D Response Framework
A structured approach for when your organisation becomes the target.
D1 - Detection: Predictive analytics, early warning indicators, continuous monitoring. Identify threats before they reach critical mass.
D2 - Defensive Communication: Message injection, stakeholder inoculation, rapid response. Counter hostile narratives where they circulate.
D3 - Digital Shielding: Protect institutional presence, secure accounts, monitor for impersonation. Defend your digital perimeter.
D4 - Development: Training, simulations, crisis playbooks, capability building. Continuous improvement of response capacity.
G7 Rapid Response Mechanism
Established in 2018 to coordinate responses to state-sponsored disinformation among G7 nations. The mechanism provides:
Information sharing on detected operations
Coordinated exposure and attribution
Aligned sanctions and response measures
Best practice exchange
EU FIMI Toolbox
The EU's whole-of-society approach operates on four pillars:
Situational Awareness: Monitoring, detection, and analysis
Resilience Building: Civil society support, independent media, capacity building
Disruption & Regulation: Platform accountability, sanctions, legal measures
External Action: Diplomatic coordination, partner support, exposure
