The question of what makes psychological warfare successful has never been addressed systematically in the existing PSYWAR literature. To identify these conditions, this study surveyed the key works on psychological warfare, supported by sources from military history, strategy, psychology, and communication science. Eight conditions are presented, ranging from the need for military victories to the targeted use of violence, and from control of the information space to credibility. In the second part of the study, to illustrate how these conditions operate in real military operations, the eight conditions are used to assess the performance of Russian online psychological operations against Ukraine and of the Hamas online psychological campaign against Israel. The findings reveal how the model’s eight conditions explain the success or failure of these two psychological campaigns.
This study examines algorithmic patterns in Harris County, Texas (U.S.) voter registration data, revealing a sophisticated base-8 modular algorithm controlling ID assignment for 2.3 million voters. Analysis of 18 million state records shows this algorithm employs bifurcated distribution patterns that deviate from standard practices and mirror patterns identified in Ohio. Comparative analysis with Tarrant County confirms Harris County’s patterns represent deliberate implementation rather than natural database behavior. The algorithm enables covert record attribution while providing no legitimate benefits in public databases. These findings raise concerns under the National Voter Registration Act of 1993’s transparency requirements and meet SEC materiality standards, affecting more than 5% of records with engineered modifications that would alter integrity assessments. This research shows how information warfare may target democratic institutions through seemingly benign database management practices.
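The base-8 claim above rests on residue-class analysis of ID numbers. The following is a minimal sketch of that kind of test using only the Python standard library; the synthetic IDs, the chosen residue classes, and the "bifurcated" pattern below are hypothetical illustrations, not the paper's actual data or findings:

```python
import random
from collections import Counter

def residue_distribution(ids, base=8):
    """Count how many IDs fall into each residue class mod `base`."""
    counts = Counter(i % base for i in ids)
    return [counts.get(r, 0) for r in range(base)]

def chi_squared_uniform(counts):
    """Chi-squared statistic against a uniform expectation across classes."""
    total = sum(counts)
    expected = total / len(counts)
    return sum((c - expected) ** 2 / expected for c in counts)

random.seed(0)
# Naturally (sequentially) assigned IDs spread evenly across residue classes...
natural = random.sample(range(10_000_000), 50_000)
# ...while a scheme that only issues IDs from two residue classes
# (a hypothetical "bifurcated" pattern) departs sharply from uniform.
engineered = [i for i in natural if i % 8 in (1, 5)]

print(chi_squared_uniform(residue_distribution(natural)))     # small
print(chi_squared_uniform(residue_distribution(engineered)))  # very large
```

A large chi-squared value against the uniform expectation is the kind of statistical signature that would distinguish deliberate modular assignment from ordinary sequential numbering.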
This article explores Ukraine’s use of memes as a tool of operations in the information environment (OIEs) during the Russo-Ukrainian War. Employed by a decentralized, civic-driven network, Ukrainian memes boosted morale, countered Russian propaganda, and shaped international narratives. Divided into categories of popularization of Ukrainian heroism, mockery of Russians, and self-reflective irony, these memes exemplify the strategic utility of memetic warfare. Despite challenges in control and predictability, their viral success underscores their efficiency as an OIE tool. The Ukrainian case highlights the growing importance of memes in modern conflicts and calls for integrating memetic warfare into broader information operations strategies.
This paper proposes a multidisciplinary approach to combating adversarial information operations, drawing on theoretical frameworks and practical applications. It recommends proactive countermeasures, including an Early Warning and Control System for early detection and intervention. The study also highlights media literacy and institutional cooperation as essential strategies for building societal and individual resilience against disinformation, while emphasizing the need to uphold ethical standards. In conclusion, the paper advocates converging theory and practice in addressing adversarial information operations. The proposed approach integrates cybersecurity theory with practical applications, providing a holistic framework for countering disinformation in today’s information environments.
The Chinese Communist Party (CCP) has made public opinion warfare a key plank of its grand ‘three warfares’ strategy: achieving a soft-power victory over adversaries before it needs to commit hard combat power. Information warriors from nations opposed to the expansion of the CCP’s tyrannical political control must aggressively counterattack this public opinion warfare. China’s state-controlled media is a major asset for waging public opinion warfare in the international media environment. An essential question is what public image or narratives about itself the CCP is trying to sell to foreign audiences. This study attempts to understand how China frames itself to foreign audiences. It finds that the international broadcasters emphasized economic goals while systematically avoiding both the problems facing China and its population and the causes of those problems. Counter-frame and counter-narrative strategies are offered for information warriors.
With the development of science and technology, warfare has become a multi-domain operation spanning land, sea, air, space, cyber, the electromagnetic spectrum, and human cognition. Existing research, however, has not examined the relationship between each of these domains and the cognitive domain. Hence, this paper explores how cognitive influence can be exerted on adversaries from multiple domains, analysing the case of the war in Ukraine, in which the latest science and technology were used. The paper finds that attacks on human cognition are mounted from all domains and provides a comprehensive model of cognitive influence on the adversary.
In an epoch where cyberspace is the emerging nexus of geopolitical contention, the intersection of Information Operations and Large Language Models (LLMs) heralds a paradigm shift, replete with immense opportunities and difficult challenges. This paper puts forth a framework for navigating this brave new world using the “ClausewitzGPT” set of equations for measuring AI-augmented information operations. By breaking a typical digital information operation down into variables, these novel formulae not only seek to quantify the risks inherent in machine-speed, LLM-augmented operations but also highlight the vital role of autonomous AI agents in addressing the technical shortcomings and architectural issues of LLMs. These agents, embodying ethical considerations, emerge as indispensable components, ensuring that the human race, as it goes forward, does not lose sight of its moral compass and societal imperatives.
This paper investigates the role of generative Artificial Intelligence (AI) tools in the production of synthetic moving images—specifically, how these images could be used in online disinformation campaigns and could profoundly affect historical footage archives. AI-manipulated content, especially moving images, will have an impact far beyond the current information warfare (IW) environment and will bleed into the unconsidered terrain of visual historical archives with unknown consequences. The paper will also consider IW scenarios in which new types of long-term disinformation campaigns may emerge and will conclude with potential verification and containment strategies.
Defense and civilian planners have struggled to place disinformation as a discrete weapon in the cognitive domain because disinformation is inadequately and ambiguously defined for military and civilian components. Comparing the cognitive terrain to other forms of geography makes evident why it is contested and relevant to national security. This paper analyzes the reasons for the ambiguity and explains why national security professionals must develop a framework for identifying disinformation. Because disinformation is an element of cognitive warfare, it can be defined using a set of three criteria. These criteria fix disinformation in the cognitive domain, enabling warfighters and homeland defenders to counter it and use it effectively.
When nation-state actors weaponize information to harm individuals, communities, and societies, they erode civilian confidence in legitimate authorities, institutions, and defences, with consequences for national security. This paper proposes new conceptual models and a methodology, the Privacy Incident Response Plan (PIRP). The methodology is designed to prepare for and mitigate privacy-related harms, covering the tactics, techniques, and mitigation strategies needed to counter sophisticated threat actors. Using this methodology, contingency planners and incident responders can develop strategies to defend against the privacy harms of information warfare.
In the era of labels, distinguishing between constructs is becoming increasingly difficult. Covert action and hybrid warfare are two constructs suffering from this predicament. The question is whether covert action is hybrid warfare, or vice versa, or whether one construct has eclipsed the other. In an era where covert action has become problematic from an international-relations perspective, is this predicament being resolved by relabelling covert action as hybrid warfare? This article explores the semantics and nuances of these two constructs to clarify their relative utility. The paper argues that covert action is subordinate to hybrid warfare: when planned effectively against a target and target audience(s), covert action forms one synchronized line of effort within a broader hybrid warfare campaign.
Foreign information manipulation and interference (FIMI) on social media is a fast-evolving threat to democracies, and there is a growing need to conceptualise the phenomenon systematically. General Morphological Analysis seeks to explore the totality of a complex problem but is restricted by simplification. Using and modifying the method expands the morphological space, and this expansion, combined with statistical calculation, exposes the internal interdependencies of the phenomenon. Operation design depends largely on five parameters: ‘spread strategy’, ‘information channelling’, ‘market targeting’, ‘presented source’, and ‘operational openness’. These parameters are the most likely to affect other parameters and thereby define significant aspects of a FIMI operation.
This essay aims to identify the vulnerabilities and means of exploitation necessary to use destabilization in support of a military, and ultimately political, objective in a potential conflict between China and the governments supporting a liberal rules-based order. Japanese efforts during the Russo-Japanese War of 1904-1905 showed that destabilizing a regime is a credible way to support military objectives during a conflict and provided key insights into how destabilization efforts function. Based on the historical case and contemporary analysis of China, this essay offers recommendations to decision makers on how best to execute and support destabilization efforts in a conflict.
Voters in New York State are identified by two identification numbers. This study has discovered strong evidence that both numbers have been algorithmically manipulated to produce steganographically concealed record attribute information. One of the several algorithms discovered has been solved. It first utilizes a mechanism nearly identical to the simple ‘Caesar Cipher’ to change the order of a group of ID numbers. Then, it interlaces them the way a deck of cards is arranged to create a ‘stacked deck’. The algorithmic modifications create hidden structure within voter ID numbers. The structure can be used to covertly tag fraudulent records for later use.
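The two-step mechanism described above, a Caesar-style rotation of a block of sequential IDs followed by card-deck interlacing, can be sketched as follows. This is an illustrative reconstruction from the description only, with made-up IDs and an arbitrary shift value; it is not the paper's solved algorithm:

```python
def caesar_rotate(block, shift):
    """Rotate the block's ordering by `shift` positions, the way a
    Caesar cipher shifts the alphabet."""
    shift %= len(block)
    return block[shift:] + block[:shift]

def interlace(block):
    """Riffle the two halves together, like arranging a deck of cards
    into a 'stacked deck'. Assumes an even-length block for simplicity."""
    half = len(block) // 2
    return [x for pair in zip(block[:half], block[half:]) for x in pair]

block = list(range(1000, 1008))          # eight sequential (hypothetical) IDs
stacked = interlace(caesar_rotate(block, 3))
print(stacked)  # [1003, 1007, 1004, 1000, 1005, 1001, 1006, 1002]
```

Because both steps are deterministic and reversible, anyone who knows the shift can recover each ID's original position in the block, which is what would make such hidden structure usable for covertly tagging records.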
In January 2022, Russian forces began building up on the Ukrainian border prior to entering Ukraine in what was termed a ‘special military operation’ in support of ethnic Russians. In the first ten months of the conflict, a range of information warfare tactics was deployed, most notably disinformation and cyber operations. Ukraine is a particularly useful case study because of the ongoing tensions and low-intensity conflict that have persisted since the social media-led uprisings and the annexation of Crimea in 2014. This article analyses the information warfare in the Russo-Ukraine conflict and contrasts it with prior operations to illustrate the evolution, limitations, and possible future of information warfare during a kinetic conflict.
Russia has historically employed deception, misinformation and disinformation, propaganda, active measures, and information operations to dissuade and limit state actors from pursuing courses of action that challenge the Kremlin’s political and military objectives. Misinformation is non-kinetic and both informs and assists Russia’s military strategy. Communication platforms with global reach spread state-sponsored misinformation to influence, shape, and limit Western political and military responses to Russia’s war in Ukraine. The Kremlin’s stated willingness to deploy tactical and strategic nuclear weapons against Ukraine and the West follows narratives that generate doubt and uncertainty regarding the true intentions of Russian state behaviour.
Critical Infrastructure (CI) is an area that has historically been rife with vulnerabilities, open to foreign and domestic threats. Recent events such as the ransomware attacks on Colonial Pipeline and the food provider JBS highlight the need for better security and resiliency against cyber threats. However, within the Information Warfare (IW) constructs that peer adversaries like China and Russia have increasingly refined, the areas of Electromagnetic Warfare (EW); Intelligence, Surveillance, and Reconnaissance (ISR); and Information Operations (IO) have become equally important to consider in the panoply of IW. This raises the important question of whether CI assets are adequately protected from the full complement of IW threats. Each IW area is discussed from a threat perspective, with examples showing how these threats can be combined to disrupt, deny, and destroy CI and CI assets, and with special attention given to peer and non-peer adversaries and the asymmetric advantages of each.
National Instruments of power are means for a nation to exert influence on other nations to achieve certain ends. This paper examines how Ukraine used its national instruments of power during the first months of the Russo-Ukrainian War of 2022 to conduct information influence operations. First, information influencing as a concept is described and then the framework of the paper is constructed by describing instruments of power and the basic elements of a strategy. This framework is then used to analyse how Ukraine used its instruments of power. Finally, this paper sums up the results with discussion and conclusions.
The importance of information influence operations in international conflicts has increased. New technologies and tools like the Internet and social media have enabled influence operations to shift to new channels with wider audiences. But what will happen to information influence operations in the future? Using futures studies methods such as scenarios and science fiction, it is possible to imagine the various possibilities for information influence operations. This article presents a method for creating science-fiction stories based on scenarios as a way to think about the future of information influence operations and their counteractions.
With the ubiquitous nature of the Internet and social media, and their continued exponential growth across society, these platforms must be comprehensively understood in order to engage threat networks at home and abroad. Undergirding all web-based actions, however, is human behaviour. Understanding human behaviour, and the dynamic range of characteristics, actions, and attributes influenced by culture and context, is therefore an ever-evolving niche skill for web-based offensive and defensive actions. As such, non-kinetic activities and change efforts, especially in the cyber domain, require cross-cultural competence and experience in addition to any cyber capability.
The definitive publication for the best and latest research and analysis on information warfare, information operations, and cyber crime. Available in traditional hard copy or online.