Entropy – a new AI-based tool SOF operators will be able to use when executing psychological operations
Daniel Ilie

Did you know that, out of the 7,796,615,719 people on the planet, 4,833,521,806 are Internet users – roughly 62% of the world's population? The statistic comes from a July 2020 update published by Internet World Stats. These are impressive numbers indeed, given that the Internet initially developed without legal regulation and has reached more than 90% penetration in regions such as North America and Europe. Roughly 90% of its infrastructure is held by the private sector, and no government institution has supreme authority over the Internet, even though the world's governments, more or less democratic, constantly monitor it.

In recent years, the number of users has grown rapidly in regions such as Africa (10,000%), the Middle East (4,893%), Latin America (2,318%) and Asia (1,670%). Furthermore, given the current pandemic, Internet networks are under additional stress, as an increasing number of people have started to work remotely.
This is the "ground" on which social media has developed and continues to develop – a process that turns users from mere receivers into creators of information, easing social interaction and communication and eliminating, if only virtually, the distances and transnational borders between them.
Internet use on such a scale has many effects. One of them is uncontrolled information dissemination: state and non-state actors fight for the attention of their online targets, trying to influence and even win their minds and souls in pursuit of objectives that are more or less strategic and more or less tendentious or malicious.
It is quite clear, and recent history has proven it, that in such an operational environment, defined by a social component with highly dynamic information elements, subversive, influence, propaganda, manipulation or disinformation campaigns can be developed with various objectives, up to and including regime change.
Such actions become threats to states' security and defence, to their stability and integrity, especially when their people lack a minimum of national ideals, do not believe in democratic values, and the state institutions are weak and deeply affected by corruption. These campaigns aim at negatively influencing public opinion, undermining the conduct of foreign policy, changing internal policy, and trying to erode government institutions and reduce people's trust in their capacity to protect national interests, the population and the national territory.
A recent example close to our territory is Ukraine, which in recent years has lost control over important territories, such as the Crimean Peninsula and Donbas, after fake news and propaganda attacks initiated by the Russian Federation. Spread through Russian mass media before the Euromaidan protests, which started on November 21, 2013, in Kyiv, these attacks discredited the Ukrainian authorities and their ambitions to join the EU.
Indeed, the target of the information campaign developed by the Russian Federation was the majority Russian population in the two territories, whose support was easily obtained through propaganda and disinformation. These actions aimed at spreading the so-called theory of the "Russian world" while demonizing the European Union.
Propaganda, disinformation, manipulation, diversion and even sabotage are nothing new, only adaptations to the new, expanded operational environment, which now includes the social component. They were and still are used in attempts to neutralize what military theoreticians call the enemy's "center of gravity".
The famous Prussian military theoretician Carl von Clausewitz wrote, in his work "On War", that victory lies in finding and neutralizing the enemy's "center of gravity", a concept he described as "the hub of all power and movement, on which everything depends. That is the point against which all our energies should be directed."
He also thought that one of the most effective ways to defeat an enemy is by attacking the moral elements, which are "among the most important in war, being the spirit that permeates war... These establish a connection with the will that moves and leads the entire force". In other words, it is enough to find a way to influence, change or even destroy the enemy's morale (the national will, public opinion, political purposes, alliance cohesion etc.) to win the war without actually confronting the enemy army. And this objective of annihilating the enemy's will, as an abstract center of gravity, has become a permanent one for military operations, thanks to the reduced cost of reaching strategic objectives. "All military action is intertwined with psychological forces and effects," Clausewitz stated.
In the past, influence operations were conducted through leaflets, telegraph, radio and TV. In the 21st century, executing information operations effectively and efficiently involves the use of social networks; new techniques, tactics and procedures; the expansion of communication infrastructures and methods into denied areas; and the development of hardware, technologies and software solutions to quickly analyze the huge volumes of data circulating on social media networks, regardless of the language in which the information is spread.
As we see nowadays, neutralizing the enemy's center of gravity does not necessarily require kinetic military actions such as missile strikes, air strikes, or heavy artillery and tank attacks from land, air or sea; it can be achieved by winning people's minds and souls through subversive, influence, propaganda, manipulation or disinformation campaigns. The enemy can apply the "divide et impera" principle from a distance, accomplishing its mission and reaching its strategic objectives without actually pulling the trigger.
In the current operational environment, information is a weapon that spreads with amazing speed, has an unlimited range of action, reaches any corner of the planet and is easily accessible to anyone. Unfortunately, it can be distorted and manipulated for malicious or subversive use, as we have already witnessed. Combating such actions requires changes in how we think about and develop political campaigns, but also in how we strategically plan and prepare military actions.
A research report recently published by RAND Corporation ("Research ANd Development"), the American global policy think tank, called "Detecting Malign or Subversive Information Efforts over Social Media – Scalable Analytics for Early Warning", says that the United States has a capability gap in detecting malign or subversive information campaigns before they substantially influence the attitudes and behaviors of large audiences. It is a critical vulnerability in the so-called Global War on Terrorism, and it has become even more urgent when it comes to responding to the actions of states the report calls malign.
The research proposes a new method to help identify such campaigns in their full complexity, rather than just parts of them, such as fake news or compromised social media accounts.
As a proof of concept for detecting malign or subversive information campaigns on social media, an existing social media analysis method was adapted. It combines network analysis and text analysis to locate, visualize and understand the communities interacting on social networks. The method works and allows analysts to look for disinformation patterns in data sets that are too big for qualitative analysis by humans, reducing a big data set to smaller, denser ones in which a weak disinformation signal can be detected.
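The general idea of that two-step reduction can be illustrated with a toy sketch (this is not RAND's actual code; the accounts, edges and posts below are invented): a network-analysis step groups accounts into communities based on who interacts with whom, and a lexical step summarizes the terms each community repeats, so an analyst inspects a few dense clusters instead of the whole feed.

```python
# Illustrative sketch: shrink a large interaction graph into communities,
# then summarize each community's dominant vocabulary. All data is mock.
from collections import Counter, defaultdict

# Mock "who retweets whom" edges harvested from a social platform.
edges = [("a1", "a2"), ("a2", "a3"), ("a1", "a3"),   # dense cluster 1
         ("b1", "b2"),                                # small pair
         ("c1", "c2"), ("c2", "c3"), ("c1", "c3")]    # dense cluster 2

posts = {"a1": "crisis hoax crisis", "a2": "crisis hoax", "a3": "hoax",
         "b1": "weather report", "b2": "sports score",
         "c1": "vote fraud", "c2": "fraud vote", "c3": "vote fraud vote"}

def communities(edges):
    """Group accounts into connected components via union-find
    (a minimal stand-in for the report's community-detection step)."""
    parent = {}
    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x
    for u, v in edges:
        parent[find(u)] = find(v)
    groups = defaultdict(set)
    for node in parent:
        groups[find(node)].add(node)
    return list(groups.values())

def narrative_signature(group, posts, top=2):
    """Lexical step: the most repeated terms inside one community."""
    words = Counter(w for acct in group for w in posts[acct].split())
    return [w for w, _ in words.most_common(top)]

for group in communities(edges):
    print(sorted(group), "->", narrative_signature(group, posts))
```

A community that is both densely connected and lexically repetitive (like the "vote fraud" cluster here) is exactly the kind of small, dense subset in which a weak disinformation signal becomes visible.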
AI-based technologies and ML methods built on topic modeling of the debated themes are used to detect narrative topics and possible disinformation. Other research focuses on dissemination operations and methods. This includes, for example, the detection of bots and of bot-network structures, as well as of authentic social media accounts that can act as distribution vectors.
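One cheap first-pass signal in the bot-detection literature is posting cadence: automated accounts often post at suspiciously regular intervals. The toy heuristic below (not any specific production detector; the timestamps and jitter threshold are invented) flags an account whose gaps between posts barely vary.

```python
# Toy bot heuristic: near-zero variance in inter-post gaps suggests
# automation. Timestamps are seconds and entirely made up.
from statistics import pstdev

def looks_automated(timestamps, max_jitter=2.0):
    """Flag an account whose inter-post gaps (in seconds) barely vary."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return len(gaps) >= 3 and pstdev(gaps) < max_jitter

bot_like = [0, 60, 120, 180, 240]       # posts exactly every minute
human_like = [0, 45, 400, 410, 2000]    # irregular bursts

print(looks_automated(bot_like))    # True
print(looks_automated(human_like))  # False
```

Real detectors combine many such features (account age, follower ratios, content similarity), but the principle of scoring behavioral regularity is the same.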
As general advice, the report recommends that the US and its partners and allies consider using this method and complementary ones, along with the necessary technologies, professional expertise and technical means.
As for the US SOF responsible for executing special operations, the Pentagon is about to deliver to operators a new AI tool to help them execute psychological operations, in due time, in the information environment.
The information system, called Entropy, which is still in development, will have a passive component able to ingest streams of data, both text and video, from the information environment, most often the Internet, and, after processing the data, to offer operators an overview of the trends and main development directions of the targeted phenomena. The active component will be built on top of the passive one, taking the identified narrative themes and topics, feeding them into a linguistic model and selecting messages for the team of operators to analyze and classify. This would help train the algorithms through machine learning. The active component would then reintroduce such messages into the information environment.
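Since Entropy's actual design is not public, the workflow described above can only be sketched speculatively: a passive stage surfaces trending terms from an ingested text feed, and an active stage selects the messages carrying those terms for human operators to label, the human-in-the-loop step that would feed a machine-learning classifier. Every message, function name and threshold below is invented for illustration.

```python
# Speculative sketch of a passive/active pipeline like the one described
# for Entropy. All feed content and labels are mock data.
from collections import Counter

def passive_trends(messages, top=2):
    """Passive component: surface the dominant terms in the ingested feed."""
    counts = Counter(w for m in messages for w in m.lower().split())
    return [w for w, _ in counts.most_common(top)]

def select_for_review(messages, trend_terms):
    """Active component: queue messages carrying trending terms for the
    operator team to classify (the human-in-the-loop labeling step)."""
    return [m for m in messages
            if any(t in m.lower().split() for t in trend_terms)]

feed = ["Crisis looms over the border", "Border crisis deepens daily",
        "Local team wins match", "Crisis talks fail at border"]

trends = passive_trends(feed)
queue = select_for_review(feed, trends)
labels = {m: "needs-response" for m in queue}  # stand-in for operator labels
print(trends)                                   # ['crisis', 'border']
print(len(queue), "messages queued for analyst labeling")
```

In a real system the labeled queue would retrain a proper language model rather than a keyword filter, but the loop structure, ingest, summarize, label, feed back, is the part the article describes.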
The system's architecture will also allow third-party integration with some commercial or government platforms.
I wrote more than a year ago about how American SOF would adapt to the requirements of the modern security environment, starting in 2020.
I emphasized then that among USSOCOM's future priorities is the full operationalization, by 2025, of a trans-regional military information support operations (MISO) capability able to address the opportunities and risks of the global information space.
This structure will be tasked with supporting combatant commanders with improved assessment capabilities, disseminating situational awareness of enemy influence activities, transmitting messages promoted by information operations, and coordinating MISO globally via the Internet.
Most likely, these capabilities will include the AI system called Entropy, which will be tasked with helping US SOF compete in today's information environment, where operators must be able to ingest content and respond quickly, with counter-messages or by elaborating and transmitting their own messages, in almost real time.
American military doctrine says that psychological warfare involves the planned use of propaganda and other psychological operations to influence the views, emotions, attitudes and behavior of opposition groups, neutral groups, friendly groups or foreign enemies, in the manner desired to achieve national strategic goals, but also military missions. Within the Pentagon, psychological operations units are subordinated to USSOCOM, and these capabilities are generally prohibited from attempting to change the views of US citizens and residents, wherever they may be.
During the first virtual Special Operations Forces Industry Conference, which took place this spring, the USSOCOM commander said: "When we talk about the ability to influence and shape in this (informational) environment, we will need to have AI and machine learning tools, especially dedicated to information operations that cover a very wide portfolio. We will have to understand how the opponent thinks, how the population thinks and work in this space in a timely manner. If you're not fast, you won't be relevant. What we need is to adapt data technologies that really work in this space and that we can use in our organization".
Translated by Andreea Soare
