12 October 2019

What Translation Troubles Can Tell Us About Russian Information Warfare

Joe Cheravitch

Moscow’s form of information warfare targeting the West has attracted significant international attention since 2014, especially the campaigns run by its reinvigorated military intelligence branch. Nonetheless, little research has focused on these campaigns’ apparent shortcomings. Most notable among operational errors are the confusing translation mistakes that undermine attempts at covert influence, such as the flawed Arabic used during Russia’s ‘Cyber Caliphate’ campaign or Guccifer 2.0’s poor Romanian during the election meddling. More recent efforts featured similar translation errors typical of native Russian speakers, to say nothing of the misspelled surname of a foreign minister in a forgery attempt. The fact that these operations are likely approved and scrutinized at senior levels, and often supported by impressive hacking skills, yet falter on seemingly trivial blunders only adds to the perplexity.

Several factors probably contribute to these bizarre lapses in tradecraft. Perhaps there is too little oversight and objective evaluation: Russian operators, for instance, put their early work in Ukraine in the “best possible light” for their superiors despite the inherent difficulty of gauging the true impact of their operations. The same operators might place too much confidence in ‘automated’ information warfare. Sometime in the wake of the 2008 war in Georgia, the Russian military’s psychological warfare schoolhouse began offering classes in “machine translation” of foreign texts alongside coursework in traditional disinformation tactics. Or perhaps the translation errors are simply a matter of quantity over quality.

But the Russian, and before it Soviet, information warfare apparatus has long been lost in translation. In the late 1930s, the Red Army’s fledgling special propaganda units frequently botched Finnish- and Japanese-language leaflets and broadcasts during combat operations. During World War II, the Red Army press-ganged prisoners of war into its propaganda machine to make up for an acute shortage of qualified linguists and cultural specialists. Despite its status as the KGB’s favorite Warsaw Pact disinformation partner, Czechoslovak intelligence used awful German and outdated letterheads to stoke antisemitism in an attempt to stifle the protests of 1968.

At certain points during the Cold War, learning an adversary’s language and culture could even get educated Russians into trouble. Now foreign-language and IT skills are universally marketable, and Russians are willing and able to take them to the private sector. The culture clash between educated Russians and military life likely adds to the state’s recruiting woes. After all, the same year Defense Minister Sergey Shoygu inaugurated his “big hunt” for military programmers, especially English-speaking ones, he also decreed that commanders play the Russian anthem every morning and compile a list of mandatory patriotic books for subordinates to read.

A culture clash between military and civilian life, though, is certainly not unique to Russia. But the country’s exodus of talent is a distinct challenge for a state striving to build a peer-worthy force; several officials, for instance, have labeled ‘brain drain’ a critical national problem. Russia’s military exacerbates the problem by alienating some recruits, especially with recent efforts to ‘spiritualize’ the ranks and bolster loyalty. Moscow has leaned on a thriving underground hacker network to shore up the technical side of information warfare, but there is no similar avenue for incorporating linguistic, cultural, and social-science expertise.

A Russian defense analyst suggested that Russia’s information operations force would fight according to General Alexander Suvorov’s principle: not by numbers, but by skill. But skill is expensive, especially since Russia’s information warfare budget is probably significantly smaller than it once was. Graduates of top universities earn more in the private sector than they would as junior military researchers. A current opening in a psychological warfare unit that involves “computer work” and knowledge of English offers pay well below the regional average and carries a below-average rating on the job site listing it. A former military intelligence programmer with “intermediate” English skills, brought into the ranks by Shoygu’s hunt, left his unit after only a year and is now seeking civilian employment.

In 2020, U.S. internet users are likely to see the same awkward grammar in criticism of U.S. foreign policy that marked social media cutout Alice Donovan’s online oeuvre during the last presidential election, along with the continued syntactical errors of the St. Petersburg millennials staffing the “troll farm.” But disinformation remains a threat as significant as the malware used to probe digital voting infrastructure or break into sensitive networks. Just as government and private organizations have begun to cooperate more closely on the technical side of these operations, they might also work alongside one another when investigating the softer skills that support influence campaigns. Tipped off partly by linguistic mistakes, researchers with the Atlantic Council’s Digital Forensics Lab were able to piece together a distinct influence campaign attributed to Russian military intelligence in the wake of the 2016 election meddling. This sort of work could have obvious benefits for policymakers, who can respond more appropriately to this activity with a better understanding of the actors behind it.
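To give a sense of what even a crude first pass at that kind of linguistic signal might look like, here is a minimal Python sketch. It is purely illustrative and makes no claim to reflect the Digital Forensics Lab’s actual methods; the threshold and sample text are invented. It flags English text with unusually few articles, a classic artifact of native Russian speakers, since Russian has no articles. Real attribution work, of course, rests on trained analysts and far richer evidence.

import re

# Illustrative only: a crude heuristic for one well-known tell of native
# Russian speakers writing English -- dropped articles ("a"/"an"/"the"),
# since Russian has no articles. Threshold and sample text are invented.

ARTICLES = {"a", "an", "the"}

def article_rate(text: str) -> float:
    """Return the share of words in the text that are English articles."""
    words = re.findall(r"[A-Za-z']+", text.lower())
    if not words:
        return 0.0
    return sum(w in ARTICLES for w in words) / len(words)

def flag_possible_nonnative(text: str, threshold: float = 0.05) -> bool:
    """Flag text whose article usage falls below a hypothetical threshold."""
    return article_rate(text) < threshold

sample = "Government must answer for decision. People are tired of such policy."
print(f"article rate: {article_rate(sample):.3f}",
      "-> flagged" if flag_possible_nonnative(sample) else "-> not flagged")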
