About this report

In this research, the authors emphasise the role of cyber activities within information warfare, with a particular focus on hybrid-warfare tactics and trolling in internet media. The primary task of this research is therefore to measure how, and to what extent, certain cyber activities influence public opinion. The results provide an approach to evaluating the risk potential of trolling and outline recommendations on how to protect the state and society when trolling is used as an instrument of hybrid warfare. To understand the significance of trolling, the authors use a multidisciplinary approach: theorisation of the trolling phenomenon is undertaken by communication-science experts, while the impact assessment of trolling on public discussion is carried out by political scientists. 

The two parts apply different methodologies and scientific approaches, but both lead to practical results: 

  1. A method for identifying trolling in the internet-media environment;
  2. An impact assessment of identified trolling on public opinion and public discussion. 

Thus, the first part of this research outlines and develops a theoretical framework for analysing social and internet media as a weapon for achieving political and military goals amid new geopolitical challenges. The second part covers the collection of empirical data from Latvia’s most popular web news portals – delfi.lv, tvnet.lv and apollo.lv – in both the Latvian and Russian languages, and the evaluation of the results from both quantitative and qualitative perspectives. 

Consequently, the study examines the following issues: 

  1. Trolling and Russia’s military strategy: theoretical and legislative perspectives; 
  2. Trolling in Latvia: the media landscape and quantitative measures for the recognition and identification of trolls; 
  3. The impact of trolling: the ‘potential to reshape’ public opinion;
    1. Qualitative assessment of trolling as perceived by Latvia’s Latvian-speaking society; 
    2. Qualitative assessment of trolling as perceived by Latvia’s Russian-speaking society; 
  4. A tutorial for average internet users on how to react to trolling. 

Hence, the first part of this report sets out the theoretical background and defines key terms such as hybrid warfare, information warfare, trolling and hybrid trolling as they are applied in the research. It then analyses Russia’s official military strategy on information warfare, assessing whether cyber defence and trolling are a defined, integral part of the country’s strategy. Particular attention is paid to discrepancies between Russia’s official strategy and its practical implementation, scrutinising examples of pro-Kremlin trolling as experienced by countries including Ukraine, Poland and Finland. 

The second part of the research turns its attention to the media landscape in Latvia and its potential for the utilisation of pro-Kremlin propaganda tools and trolling. It also sets out the quantitative measures required for the recognition and identification of trolls, and of pro-Kremlin trolling in particular. By employing a methodologically critical approach, this analysis is designed to test whether there actually are identifiable, paid pro-Kremlin trolls and, if so, to determine what share of online comments they are responsible for, how they actually behave, and what their potential impact is. 

The third part of the research is a qualitative analysis of the impact of trolling. By setting up a number of focus groups on the basis of socio-economic and linguistic criteria, the researchers test actual comments that had been identified as posted by pro-Kremlin trolls. As well as labelling several types of trolling message, the research estimates the effectiveness of each type. The researchers then assess which societal groups are most vulnerable and which most resistant to trolling and to particular types of troll message, as well as to the influence of online media in general. In the final part of this research, the authors provide a general evaluation of trolling as an information-warfare strategy and set out workable strategies for counteracting pro-Russian trolling. 

Conclusions and recommendations


The weaponisation of online media is an increasingly common strategy in information warfare. Although the weaponisation of information itself is by no means a new phenomenon, several trends have emerged alongside the increasing access to information through online media and social networks. Although the danger of Russia’s propaganda war is often blown out of proportion, there is evidence that the Kremlin does use regime-funded online trolls to disseminate misinformation and project a pro-Russian stance in online-media comment sections. Russia’s official strategy is based on a defensive approach to information warfare and defines Russia as a victim of Western and US propaganda and trolling. However, leaked policy documents and statements from high-level authority figures, as well as empirical evidence gathered by analysts and investigative journalists, indicate that, under the cover of its defensive stance, Russia is waging information warfare against its adversaries in order to sway international opinion in its favour and to create confusion and mistrust in public information as such. 

For the purposes of this research, the authors have labelled the suspected pro-Russian, regime-sponsored trolls hybrid trolls. Hybrid trolls are thereby distinguished from classic trolls also operating online; the latter act only in their own interests, solely with the aim of sowing disagreement and inciting conflict in the online sphere. Apart from this difference, which is in practice quite hard to prove, several other features make hybrid trolls stand out. Firstly, these trolls, suspected to be paid on the basis of quantity, can be identified by the following factors: intensively reposted messages, repeated messages posted from different IP addresses and/or nicknames, and republished information and links. Typically, hybrid trolls strongly support a particular political stance and are more likely to comment on topics linked to specific areas of politics than on other subjects. Interestingly, when it comes to pro-Russian hybrid trolls, one important and rather straightforward identifier is their frequently poor language skills when posting comments in languages other than Russian, implying that the original Russian-language message has been translated using, for example, Google Translate and then disseminated through the media of a particular country. 
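To make the repetition-based identifiers concrete, the sketch below shows how they could be screened for programmatically. It is a minimal illustration, not the screening method actually used in this study; the thresholds, record layout and sample comments are assumptions.

```python
from collections import defaultdict

# Illustrative thresholds - not values used in the study.
MIN_REPEATS = 3      # same message reposted at least this many times
MIN_IDENTITIES = 2   # ...under at least this many nicknames or IP addresses

def normalize(text):
    """Collapse whitespace and lowercase so trivial edits still match."""
    return " ".join(text.lower().split())

def flag_repetition_patterns(comments):
    """Flag message texts matching the repetition-based identifiers:
    intensive reposting, and identical text posted from different
    nicknames and/or IP addresses."""
    by_text = defaultdict(list)
    for text, nickname, ip in comments:
        by_text[normalize(text)].append((nickname, ip))
    flagged = {}
    for text, posters in by_text.items():
        nicknames = {n for n, _ in posters}
        ips = {ip for _, ip in posters}
        if len(posters) >= MIN_REPEATS and (
            len(nicknames) >= MIN_IDENTITIES or len(ips) >= MIN_IDENTITIES
        ):
            flagged[text] = {
                "reposts": len(posters),
                "nicknames": len(nicknames),
                "ips": len(ips),
            }
    return flagged

# Each comment is a (text, nickname, ip_address) tuple - hypothetical data.
comments = [
    ("Sanctions only hurt Europe!", "anna_r", "10.0.0.1"),
    ("Sanctions only hurt Europe!", "john99", "10.0.0.2"),
    ("Sanctions  only hurt  Europe!", "visitor7", "10.0.0.3"),
    ("Interesting article, thanks.", "reader1", "10.0.0.4"),
]
print(flag_repetition_patterns(comments))
# {'sanctions only hurt europe!': {'reposts': 3, 'nicknames': 3, 'ips': 3}}
```

Note that such heuristics only surface candidates for human review; the language-skill identifier described above still requires qualitative judgement.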

The first part of this research focused on the quantitative analysis of comments posted on three major online Latvian-language news portals – apollo.lv, delfi.lv and tvnet.lv – and their Russian-language counterparts between 29 July and 5 August 2014. It was established that only 1.45% of the total number of comments on the three portals, across both languages, potentially came from hybrid trolls. However, this figure rose to 3.72% when only the articles subject to trolling activity were taken into consideration. 

A slight difference was discovered between Latvia’s Russian- and Latvian-language news portals: Russian-language portals experienced slightly higher troll activity, reaching 3.99% of comments in the affected articles, while in the affected Latvian-language articles hybrid trolls accounted for about 3.55% of comments. Further evidence of hybrid-troll activity is that, of all the articles affected by hybrid trolls, almost one third related to events in Ukraine, while the shooting down of Malaysia Airlines flight MH17 over eastern Ukraine also attracted a considerable proportion of hybrid-troll comments. Together these two topics accounted for 37% of all messages suspected to have been posted by hybrid trolls, while another 27% of affected articles related to the Western sanctions against Russia and Russia’s countermeasures. 
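For readers wishing to reproduce such figures, the shares are simple ratios of suspected hybrid-troll comments to comment totals. The counts in the sketch below are hypothetical, chosen only so that the ratios match the reported percentages; the study’s raw totals are not reproduced here.

```python
# Hypothetical counts - chosen only so the ratios match the reported
# percentages; the study's actual raw totals are not reproduced here.
total_comments = 100_000                 # all comments in the sample period
troll_comments = 1_450                   # comments flagged as hybrid-troll
comments_in_affected_articles = 39_000   # comments in articles with troll activity

share_overall = troll_comments / total_comments                   # 1.45%
share_affected = troll_comments / comments_in_affected_articles   # ~3.72%
print(f"overall: {share_overall:.2%}, affected articles: {share_affected:.2%}")
```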

This evidence is already sufficient to prove that pro-Russian trolling is present in Latvia’s news portals, in both Latvian- and Russian-language versions. 

Importantly, a detailed analysis of communication models and content demonstrated that the impact of hybrid trolling is decreased by a number of circumstances. Firstly, hateful and xenophobic hybrid-troll comments are often automatically deleted immediately after being posted. Secondly, users’ negative ratings of these comments result in their being hidden from other users. Thirdly, and most importantly, other users unmask troll messages by labelling them hateful and unacceptable, excluding them from further communication. The detailed analysis of quantitative data thus demonstrates that the actual exposure of online news users is weakened by these factors. Furthermore, because of the relatively short time that readers spend online – ranging from around six to thirteen minutes – users are unlikely to engage deeply with comment sections. Qualitative analysis of the online news audience demonstrates that comment sections are used only by more active users. At the same time, these more active audience members are also the least likely to be susceptible to the influence of hybrid-troll messages: their higher level of activity in accumulating a variety of information enables them to develop more critical evaluation skills regarding online-media content, including information posted by other internet commenters and hybrid trolls.
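Taken together, the first two mitigating mechanisms amount to a simple comment-visibility policy, sketched below. The keyword list, rating threshold and function interface are assumptions for illustration, not the portals’ actual moderation rules.

```python
# Illustrative moderation rules - not the portals' actual configuration.
HATE_KEYWORDS = {"keyword1", "keyword2"}   # stand-ins for a curated blocklist
HIDE_RATING_THRESHOLD = -5                 # net rating at which a comment is hidden

def comment_visibility(text, net_rating):
    """Apply the two automatic mitigating mechanisms described above:
    delete hateful comments on posting, hide heavily downvoted ones."""
    lowered = text.lower()
    if any(word in lowered for word in HATE_KEYWORDS):
        return "deleted"   # removed immediately after being posted
    if net_rating <= HIDE_RATING_THRESHOLD:
        return "hidden"    # collapsed so other users do not see it
    return "visible"

print(comment_visibility("a comment containing keyword1", 0))    # deleted
print(comment_visibility("an unpopular but clean comment", -8))  # hidden
print(comment_visibility("an ordinary comment", 3))              # visible
```

The third mechanism, unmasking by other users, is social rather than automatic and is therefore not modelled here.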

However, in-depth qualitative analysis uncovered several important trends that should be taken into account when developing counter-propaganda measures. In the first stage, various types of hybrid-troll messages were identified and labelled. ‘Blame-the-US’ conspiracy trolls disseminate information based on conspiracy theories and blame the US for creating international turmoil. The Bikini troll refers to commenters who post rather naïve, anti-US comments, typically accompanied by a profile picture of an attractive young girl. Aggressive trolls typically post emotion-laden, highly opinionated comments intended to stir up emotional responses from general users. Wikipedia trolls, seemingly among the most dangerous, tend to post factual information that is out of context and thus unlikely to be discredited, even by more experienced users. The final hybrid-troll type has been labelled the Attachment troll; also rather dangerous, it posts only short messages with links to other news articles or videos containing value-laden information. Although each troll type targets particular audience segments, the final two are considered the most influential, as they can affect even more internet-savvy users. 
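For reference, this typology can be restated as a small lookup table. The entries below simply paraphrase the descriptions above; the influence field encodes only the authors’ assessment that the final two types are the most influential.

```python
# The five hybrid-troll types, restated as a lookup table.
# Descriptions paraphrase the text above; no additional data is implied.
TROLL_TYPES = {
    "blame-the-US conspiracy": {
        "signature": "conspiracy theories blaming the US for international turmoil",
        "influence": "segment-specific",
    },
    "bikini": {
        "signature": "naive anti-US comments; attractive-young-girl profile picture",
        "influence": "segment-specific",
    },
    "aggressive": {
        "signature": "emotion-laden, highly opinionated comments provoking responses",
        "influence": "segment-specific",
    },
    "wikipedia": {
        "signature": "out-of-context factual information, hard to discredit",
        "influence": "high - can affect even internet-savvy users",
    },
    "attachment": {
        "signature": "short links to value-laden articles or videos",
        "influence": "high - can affect even internet-savvy users",
    },
}
```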

By setting up focus groups to assess the influence of hybrid trolling on various social segments, the authors established that the most vulnerable group is the Settled group (older people), which has the lowest awareness of internet-security risks. In Latvia, 42% of this age group (55-74) use the internet, which makes them highly susceptible to more aggressive trolling. The most effective measure to decrease this vulnerability would be raising their awareness of online security risks, for which purpose the authors drew up a sample tutorial on recognising hybrid trolls. Another major risk group is labelled Homebodies (family men in their forties). This group is susceptible to conspiracy theories and highly likely to respond to Bikini trolls’ comments, among others, because it is the group most likely to engage in commenting per se. 

Homebodies themselves typically form a large proportion of anonymous online commenters, and therefore the most efficient protection mechanism in this case would be to decrease anonymity in internet media. The remaining societal groups – the Open-minded, Demanding, Dreamers, Adventurers, Rational Realists and Organics – were found to be highly resistant to hybrid-trolling efforts, albeit to different extents. The reasons for such resistance range from highly critical approaches to publicly available information and high internet literacy to complete disinterest in political processes. 

Long-term hybrid trolling does have an influence, shifting values; here the central role is played by the emotional tone of the message rather than its content. Focus-group interviews showed that, even though some members initially demonstrated complete resistance to troll messages, their perception changed after longer exposure. When accompanied by other information sources, hybrid trolling has some potential to reshape personal values and beliefs. Hence, hybrid trolling cannot be evaluated in isolation from other media sources and their impact on society, and can actually act as a catalyst for messages in other media. 

Furthermore, an in-depth analysis of all segments of the Latvian- and Russian-speaking societies demonstrated that, although both societies are quite resistant to hybrid trolling in terms of how they perceive the information it carries, they are susceptible to emotion-laden attacks. An important threat here is the creation of a false perception that hybrid trolls are real Russian people, leading to mutual mistrust between members of the two linguistic groups. 

RECOMMENDATIONS 

The findings of this study did not confirm the extensive presence of trolling comments in Latvia’s web portals that had been assumed when the study was undertaken. Furthermore, media-consumption habits lead to the conclusion that trolling should not be perceived as the most influential tool for changing the opinion of Latvian society. This information tool can, however, induce certain effects in the longer run. Its strength lies not in manipulating the limited group of people who read web comments or actively post in social media, but in its ability to reinforce Russia’s narrative, which is already being communicated via other information channels – TV, blogs, propaganda websites run by pro-Kremlin activists, etc. Thus trolling, despite the direct evidence of its limited effects when seen in isolation, is still a small but important part of a larger machinery aimed at influencing the public in NATO member and partner countries. 

Based on the focus-group discussions, the authors have designed an outline for a hybrid-troll-recognition tutorial (see p.77), which can be used by average internet users with no access to advanced screening methods. The authors would also like to offer several recommendations to the mass media and to government institutions on countering hybrid-trolling activities. 

What the mass media can do: 

  • Check facts before publishing them – do not become a participant in a disinformation campaign. News production should follow high journalistic standards. Analysing information and checking facts before disseminating information further is of the utmost importance in building credibility within society. Because social media is a highly trusted environment, it provides ample opportunity to disseminate misinformation and hoax messages. For these reasons, the mass media should exercise its ‘gate-keeping’ role so as to separate facts from rumours, rather than becoming another unwitting participant in disinformation campaigns. This requires critical thinking and more thorough appraisal of sources. 
  • Enhance general media literacy. The mass media and opinion leaders can play important roles in educating the public about misinformation activities in online media – by providing analysis of trolling tactics and manipulation techniques, and by suggesting criteria for identifying organised trolling. Putting trolling in the headlines and encouraging people to share their experiences of being attacked or harassed by trolls would facilitate discussion on how to identify the malicious use of social media and on ways to counter it. 
  • Develop filtering tools. News-portal editors are already making great efforts to filter and automatically delete comments expressing hatred, rudeness and aggression (according to the study, 60-70% of trolling comments are already deleted by portal editors), thus minimising the influence of trolling. However, given that these filters can be bypassed by amending messages, continuous improvement of bot/troll-detection capability is needed; one possible approach is sketched after this list. 
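The sketch below illustrates one such improvement: near-duplicate detection that catches comments amended just enough to slip past an exact-match filter. The normalisation steps and similarity threshold are assumptions for illustration, not the portals’ actual filters.

```python
import difflib

SIMILARITY_THRESHOLD = 0.85  # illustrative; would need tuning on real moderation data

def normalize(text):
    """Strip punctuation and case so cosmetic edits do not defeat matching."""
    return "".join(ch for ch in text.lower() if ch.isalnum() or ch.isspace())

def is_near_duplicate(candidate, removed_messages):
    """Flag a comment that closely resembles an already-removed message,
    even if it was lightly amended to bypass an exact-match filter."""
    cleaned = normalize(candidate)
    return any(
        difflib.SequenceMatcher(None, cleaned, normalize(known)).ratio()
        >= SIMILARITY_THRESHOLD
        for known in removed_messages
    )

removed = ["Sanctions only hurt Europe, wake up people!"]
print(is_near_duplicate("Sanctions only hurt Europe!! Wake up, people", removed))  # True
print(is_near_duplicate("Interesting article, thanks.", removed))                  # False
```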

What government institutions can do: 

  • Identify and unmask sources of disinformation (trolls). Greater focus should be put on analysing the information environment so as to identify disinformation efforts as well as their effects on public discussion. Online-media and social-media analysis should become an integral part of every analysis of the information environment. Examples from citizen journalism have shown that identifying and revealing ‘false’ facts to the public is an effective approach to mitigating the effects of disinformation. Governments should learn from these cases and integrate such efforts into their operations. 
  • Develop unifying narratives. The manipulation efforts of trolls can only be successful if there are no alternative stories to offer. Consequently, the development of unifying strategic narratives would play a central role in countering disinformation activities. This not only means unified messaging by government representatives, but also involving a wide range of actors, from academics to private business figures, in efforts to defend the national information space. 
  • Make jokes rather than argue. Efforts to fight propaganda in social media by developing counter-messages and official statements will only fuel the atmosphere of information war rather than bringing positive effects. Humour may be more successful in countering aggressive propaganda, as it hampers the latter’s ability to achieve its objective – subduing the society of the target country. The informal nature of the online environment is well suited to humour and jokes as communication tools, which have the capacity to attract large numbers of social-media users. 
  • Enhance the public’s critical thinking and media literacy. Long-term efforts are required to enhance the public’s critical thinking and to educate it about the weaponisation of the media, particularly online media. Providing simple user guides for the general public (for example, when opening comment sections) on how to identify trolls would be a first and simple step towards raising society’s awareness of the manipulation techniques used on the internet. Another solution might be introducing media literacy and social-media source appraisal into school curricula. 
  • Learn from other countries’ experience. Hybrid trolling is not a unique phenomenon restricted to Latvia. Furthermore, trolling is never conducted as a standalone hybrid-warfare tool, but rather as a mechanism supporting the messages promoted by other information channels. Hence, the Baltic States, Finland, Poland, Ukraine and other countries already affected by Russia’s information activities should cooperate in their efforts to counter disinformation and learn from each other’s experience.