The Nexus of OSINT & Misinformation

The terms misinformation, disinformation and propaganda have become commonplace, especially following the rise of social and digital media. Individuals, communities, organizations and governments are all affected by their spread. Within the Intelligence Community (IC), it is important to identify, monitor, assess and counter narratives that have negative impacts. OSINT allows organizations and governments to remain vigilant and resilient, ensuring they can mitigate the consequences posed by misinformation, disinformation and propaganda and remain ahead in the information environment.

What is OSINT?

Before examining how OSINT can be applied to combat misinformation, OSINT itself must be understood.

Open-Source Intelligence, or OSINT for short, refers to publicly available information that has been discovered, collected and assessed to be of actionable intelligence value. The information is often collected from public sources on the internet; however, OSINT is not limited to the internet and covers all publicly available information.

The key element to understand about OSINT is that the information itself is paramount: it must be of value and actionable.

Misinformation, Disinformation and Propaganda

The three terms misinformation, disinformation and propaganda are often intertwined or confused, so it is imperative to understand their differences and scope of purpose. All three are often focused on the spreading of false or misleading information presented in the form of informative content.

Misinformation can be considered the encompassing umbrella term: it covers information that is false, whether or not it is spread intentionally.

Disinformation is a subset of misinformation that is deliberate and intended to deceive. Whilst misinformation may be unintentional, disinformation is a deliberate attempt to deceive by political actors, domestic or foreign.

Propaganda, another subset, refers to information that may be true but is used for malign purposes and to encourage opposing views. It can also include deliberately false information used in campaigns to shift viewpoints, and it is typically employed to persuade audiences toward a particular viewpoint.

Key aspects to consider regarding misinformation, disinformation and propaganda are the motives behind them, the nature of the organizational structures involved, the methods of dissemination and the intended audiences.

Growth of Technology

The growth of technology, the information environment and social media have allowed digital bridges to be established between people, communities, organizations and governments. These bridges are tools that can be used for good but also for malicious purposes.

The impact of deliberate disinformation by state and non-state actors can have major ramifications for a nation. It can diminish the nation's prestige in the international arena, instigate an “us vs them” mentality within society, limit the ability of its embassies and missions to properly represent it abroad, and overall reduce the effectiveness of its foreign policy and relations with other nations.

The growth of social media has empowered efforts to disseminate information and manipulate public opinion well beyond what traditional means allowed. Social media gives operators greater means to conceal their identities, lets them target vulnerable publics with greater effectiveness, and allows them to evaluate and optimise narratives and strategies, because platforms offer programmatic interfaces (APIs) that can be exploited for malicious purposes.

The Role of OSINT

When operating from an OSINT perspective to locate, monitor and counter misinformation, the analyst must adopt the mindset that everyone online leaves a digital trace. OSINT tools allow for mass data analysis, including extracting and checking content posted by a user on a social or digital platform, conducting a deep analysis of a profile to determine actionable information, locating links to external resources and parties, and creating a timeline of sentiment based on information embedded within the content. OSINT also enables analysts to identify keywords, map bot networks and cross-reference data. Coupled with other forms of intelligence gathering, these tools allow detailed reports to be created that can assist governments in remaining resilient in the information space.
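As a rough illustration, the minimal Python sketch below turns a small batch of already-collected posts into a keyword frequency count and a daily sentiment timeline. The post records, the field names ("text", "created_at") and the use of NLTK's VADER sentiment scorer are illustrative assumptions, not a prescribed OSINT pipeline.

```python
# Minimal sketch: keyword counts and a daily sentiment timeline from
# already-collected posts. Field names and the example posts are
# invented placeholders for illustration only.
from collections import Counter
from datetime import date, datetime

from nltk.sentiment import SentimentIntensityAnalyzer  # nltk.download("vader_lexicon") may be required

posts = [
    {"text": "Breaking: they are hiding the real numbers", "created_at": "2021-03-01T09:15:00"},
    {"text": "Official figures confirm the programme is on track", "created_at": "2021-03-01T11:40:00"},
    {"text": "They are lying about the numbers again", "created_at": "2021-03-02T08:05:00"},
]

STOPWORDS = {"the", "they", "are", "about", "again"}

def keywords(texts: list[str]) -> Counter:
    """Count lower-cased tokens, dropping short words and stopwords."""
    counts: Counter = Counter()
    for text in texts:
        for token in text.lower().split():
            token = token.strip(".,:;!?\"'")
            if len(token) > 3 and token not in STOPWORDS:
                counts[token] += 1
    return counts

def sentiment_timeline(items: list[dict]) -> dict[date, float]:
    """Average VADER compound score per day, giving a rough sentiment timeline."""
    sia = SentimentIntensityAnalyzer()
    per_day: dict[date, list[float]] = {}
    for item in items:
        day = datetime.fromisoformat(item["created_at"]).date()
        per_day.setdefault(day, []).append(sia.polarity_scores(item["text"])["compound"])
    return {day: sum(scores) / len(scores) for day, scores in per_day.items()}

print(keywords([p["text"] for p in posts]).most_common(5))
print(sentiment_timeline(posts))
```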

Governments need to build resilient teams dedicated to identifying, monitoring and countering misinformation, with OSINT as the primary driver of this capability, particularly in the cyber, diplomatic and political domains. The spread of misinformation must be identified so that malicious actors cannot tarnish the reputation of a nation and its institutions, which could fracture the societal fabric and foreign relations. Trends, sentiments and sources must be monitored on an ongoing basis so that governments can stay ahead and put practices in place to counter these narratives rapidly, before they spread amongst large publics.

Analysis of Digital Content

Analysis of misinformation collected from online sources, such as Twitter or Facebook, can be carried out with OSINT in various ways. The simplest is to verify whether the content is truthful through research, checking and cross-referencing. Tools can also be used to determine whether content is new, truthful or relevant, as opposed to fabricated, altered or old content that has been reposted for ulterior motives.
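One such check for recycled imagery is comparing the perceptual hash of a newly collected image against an archive of previously seen material, so that old pictures reposted under a new claim can be flagged. The sketch below uses the Pillow and imagehash Python libraries; the file paths and the distance threshold are placeholder assumptions.

```python
# Minimal sketch: flag a newly collected image as a likely repost of
# previously archived content by comparing perceptual hashes.
# File paths and the threshold are illustrative assumptions.
from PIL import Image
import imagehash

ARCHIVE = {
    "flood_2018.jpg": imagehash.phash(Image.open("archive/flood_2018.jpg")),
    "protest_2019.jpg": imagehash.phash(Image.open("archive/protest_2019.jpg")),
}

def likely_repost(path: str, threshold: int = 8) -> list[str]:
    """Return archived images whose perceptual hash is within `threshold`
    bits of the candidate, i.e. probable duplicates or lightly edited copies."""
    candidate = imagehash.phash(Image.open(path))
    return [name for name, h in ARCHIVE.items() if candidate - h <= threshold]

matches = likely_repost("collected/viral_claim.jpg")
if matches:
    print("Possible recycled imagery, compare against:", matches)
```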

More in-depth analysis can track the locations of accounts posting misinformation, the sentiments shared by similar accounts, and the networks of accounts that share misinformation and are aligned in one form or another.

Two examples below illustrate methods of analysis that can be used.

The first shows how clusters can be mapped across various social channels. In this case the clusters comprise accounts, spread across platforms, focused on “hate”. The purpose of such mapping is to determine which social channels these accounts operate on, which themes are common amongst them, which networks exist and how they interact. This matters for nations and governments because fostering a diverse and cohesive society is of paramount importance: understanding what misinformation is being portrayed online allows efforts to be directed at mitigating digital attempts to divide society.

Network map of “hate” clusters on social channels. The clusters contained public posts expressing a range of extremist ideologies, such as neo-Nazism and Islamophobia. Each node represents a cluster of social media users posting hateful content and misinformation online; edges represent hyperlinks posted in one cluster linking to another. Data from June 1 – Dec 30, 2019. Source: https://tinyurl.com/kf27pnsw
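As a rough illustration of how such a cluster map might be assembled, the Python sketch below uses the networkx library to record clusters as nodes and cross-cluster hyperlinks as directed edges, then asks which clusters are linked to most often and which platforms carry the most links. The link records are invented placeholders, not data from the cited study.

```python
# Minimal sketch: nodes are clusters of accounts; a directed edge records a
# hyperlink posted in one cluster pointing to another. Records are placeholders.
import networkx as nx

# (source cluster, target cluster, platform the link was posted on)
link_records = [
    ("cluster_A", "cluster_B", "facebook"),
    ("cluster_A", "cluster_C", "vk"),
    ("cluster_B", "cluster_C", "telegram"),
    ("cluster_D", "cluster_A", "facebook"),
]

G = nx.DiGraph()
for src, dst, platform in link_records:
    G.add_edge(src, dst, platform=platform)

# Which clusters act as hubs that many others link into?
in_degree = sorted(G.in_degree(), key=lambda pair: pair[1], reverse=True)
print("Most linked-to clusters:", in_degree)

# Which platforms carry the most cross-cluster links?
platform_counts: dict[str, int] = {}
for _, _, data in G.edges(data=True):
    platform_counts[data["platform"]] = platform_counts.get(data["platform"], 0) + 1
print("Links per platform:", platform_counts)
```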

The second example demonstrates MST semantic network analysis. The analysis shows troll accounts operating on Twitter forming connections between the hashtags ISIS and Nauru; the accounts forming these connections share and/or create misinformation. This form of analysis is important for governments, such as Australia's, that have strategic defence and foreign policy interests in the Pacific region, because it reveals the sentiments being generated by troll accounts, the keywords being used in misinformation campaigns and the keywords associated with them. Such analysis helps decision-makers and policymakers put practices in place to monitor misinformation attempts and respond rapidly, so that correct information is put forward, risks are mitigated and strategic interests are not heavily impacted.

MST semantic network analysis showing IRA troll accounts connecting (via tweets) #nauru and #isis, a connection not made by other accounts. Such analysis and mapping is important for nations, including Australia, with strategic interests in the region. Source: https://tinyurl.com/w4sm8ymb
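A minimal sketch of the MST step in such an analysis is shown below: hashtag co-occurrences in tweets are turned into a weighted graph, counts are converted into distances, and a minimum spanning tree keeps only the strongest associations, so that an unusual pairing such as #nauru and #isis stands out. The tweets and hashtags are invented placeholders, and networkx is assumed for the graph work.

```python
# Minimal sketch: hashtag co-occurrence graph reduced to a minimum spanning
# tree so the tightest hashtag associations remain. Tweets are placeholders.
from itertools import combinations
import networkx as nx

tweets = [
    ["#nauru", "#isis", "#auspol"],
    ["#nauru", "#isis"],
    ["#auspol", "#refugees"],
    ["#nauru", "#refugees"],
]

G = nx.Graph()
for tags in tweets:
    for a, b in combinations(sorted(set(tags)), 2):
        if G.has_edge(a, b):
            G[a][b]["count"] += 1
        else:
            G.add_edge(a, b, count=1)

# Stronger co-occurrence = shorter distance, so the MST keeps the
# strongest associations between hashtags.
for a, b, data in G.edges(data=True):
    data["distance"] = 1.0 / data["count"]

mst = nx.minimum_spanning_tree(G, weight="distance")
for a, b, data in mst.edges(data=True):
    print(f"{a} -- {b} (co-occurrences: {data['count']})")
```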

Tools

Information can be considered one of the most important forms of currency. As individuals spend increasing amounts of time online and on social channels, it is important that they remain resilient and are not unknowingly subjected to misinformation. The following tools have been compiled to help improve literacy around online misinformation, verify content and conduct analysis. The list is not exhaustive, but it serves as a starting point.

Bad News

A tool designed to build users' understanding, via an interactive game, of the techniques involved in disseminating disinformation.

Bot Sentinel

Free platform developed to track troll bots and untrustworthy Twitter accounts.

Bot Slayer

Application that helps track and detect potential manipulation of information spreading on Twitter.

Checkology

Media literacy curriculum with offline and online components designed to teach how to read and interpret the news.

Disinformation Index

Web-based tool that rates news outlets on the “probability of disinformation on a specific media outlet”.

Emergent

Web-based tool that tracks, verifies or debunks rumours and conspiracies online.

Information Operations Archive

Archive of publicly available data from known online information operations attributed to Russian and Iranian actors, updated on an ongoing basis.

Twitter Trails

Web-based tool that uses an algorithm to analyse how a story spreads and how users react to it, measuring both the story's propagation and the scepticism expressed by users.