6 June 2017

CYBER TERRORISM AND BIOTECHNOLOGY: WHEN ISIS MEETS CRISPR

by RC Porter

For years, the international community has grappled with the threat of chemical, biological, radiological, and nuclear terrorism. And although al Qaeda and the Islamic State (ISIS) [1] have demonstrated interest in and some capability to develop and use such weapons, there have been no successful mass casualty terrorist attacks involving them. Attempted attacks involving radiological dispersal devices or chemical and biological means have either failed or had a very limited impact. Experts such as John Parachini [2], Jeffrey Bale and Gary Ackerman [3], Adam Dolnik [4], and Rajesh Basrur and Mallika Joseph [5] argue that the reason is terrorists’ inability to weaponize chemical, biological, radiological, or nuclear material. Others, including Brian Michael Jenkins, believe that the lack of mass casualty attacks [6] also has to do with self-restraint: perpetrators might not be able to control the consequences of such an attack. It could end up harming the members of the communities that the terrorists are purportedly fighting for and could therefore be counterproductive [7].

The recent WannaCry ransomware attack [8], however, could force the expert community to rethink such positions. Available information suggests that North Korean hackers were behind the attack, in which the perpetrators took control of about 300,000 computers in over 150 countries and held victims’ data hostage in exchange for a payment of $300 in bitcoin. But there is reason to believe that terrorist groups such as al Qaeda and ISIS could copy the tactic. In doing so, they could cause as much damage (loss of data and equipment) and chaos (in hospitals and other public utilities) as possible, comparable to the chaos and panic that a chemical or biological attack could cause.

Terrorists could use cyber capabilities to target any sector. But the most vulnerable industries are those with high proportions of old infrastructure onto which new technology has been grafted. According to a report from the U.S. Bureau of Economic Analysis, in 2015 the average age of all fixed assets in the United States stood at 22.8 years [9], with hospitals and utilities among the oldest. That average is the oldest reported since 1925 for government assets and since 1955 for private assets.

To understand the risk, it is worth examining the consequences of a major cyberattack in agriculture, an industry where the pace of digitization has traditionally been slower than elsewhere. In fact, a recent McKinsey study that ranked 22 U.S. industries from most to least digitized placed agriculture at the bottom of the list [10]. The sectors that fared only slightly better included construction, hospitality, healthcare, and government. Besides sitting at the bottom of the digitization index, these industries share a vulnerability to cyberterrorism. Although the recent WannaCry attack largely bypassed the United States, hospitals and public services elsewhere were among the hardest hit: entire hospitals were shut down, as were some railway systems in Europe, India, Russia, and beyond.

Imagine a scenario in which hackers take control of large farms and, in one click, destroy a year’s harvest by flooding fields with excess water, or steal sensitive data on soil moisture, yields, or the plant germplasm being grown and hold it for ransom, much as in the WannaCry attacks. What was once science fiction is now closer to reality as cyberterrorists grow bolder and look for unconventional ways to spread chaos.

DIGITAL REVOLUTION IN FARMING

Agriculture is becoming smarter every day, with farmers relying on data-driven decision making through sensors planted in the ground, satellites guiding tractor movements, and other new practices. A recent United Nations study showed that, to meet growing demand, world food production must double by 2050 [11]. The potential to expand production by increasing farm acreage is limited; in fact, some studies have shown that the amount of land used for agriculture [12] in the United States and globally is declining. Instead, existing farmland needs to be made more productive, and precision farming, which relies on information technology to let farmers make more direct and targeted decisions, could be key.

Many established companies and startups are innovating in this area. Blue River Technology, for one, uses advanced computer analysis to automate decisions on when to apply chemical sprays for lettuce thinning. John Deere plans to introduce self-driving tractors, and Taranis hopes to automate growing decisions and farm disease management through remote sensing. A recent report by Business Insider Intelligence estimates that, by 2030, the average U.S. farm is likely to generate two million data points per day from satellite imagery, sensors, and other devices planted in the fields, up from just 200,000 today [13]. Global tech companies such as IBM, Cisco, and GE are already gearing up for this new market, with their investments in the so-called Internet of Things likely to grow by 16 percent every year to $250 billion by 2021 [14]. These companies have all identified agriculture as one of the top opportunity markets [15].

Although such investments have exciting implications for farm yields (some studies, including one by the Accenture consulting group, have shown improvements of up to 30 percent [16]), there are also risks, especially if connected farms fall prey to cybercriminals. In fact, in 2016, the FBI and the Department of Agriculture (USDA) urged farmers to start paying attention to digital security [17]. In a report, the FBI warned that the increased adoption of “precision farming” technology threatens to expose the nation’s agriculture sector to hacking and data theft. Ransomware similar to WannaCry has been identified as a particular threat to farming. Entities that are morally opposed to the use of GMOs or pesticides, for example, may deliberately target specific farms that use these products. This would not be dissimilar to attacks by animal rights and environmental extremists [18], which, according to the Animal Agriculture Alliance, have increased more than 40 percent worldwide.

HACKING BIOTECHNOLOGY

Beyond digitization, the other big technology innovation in agriculture is gene editing tools such as CRISPR-Cas9 (CRISPR is an abbreviation for clustered regularly interspaced short palindromic repeats; Cas9 is the associated enzyme), a new technique that permits scientists to quickly and precisely alter, delete, and rearrange the DNA of nearly any living organism.

The goal is to cure genetic diseases in humans and, in the case of agriculture, develop the next generation of genetically modified crops. Research in this space is advancing at breakneck speed; only a few years after Jennifer Doudna co-wrote her seminal paper on gene editing, China has already begun using CRISPR-modified white blood cells [19] in living human beings to try to cure cancer. Big agriculture companies such as Dow, Monsanto, and Syngenta are investing heavily in CRISPR research, with Monsanto recently announcing a partnership with the Broad Institute of MIT and Harvard to develop modified seeds [20].

It is easy to see why it would be dangerous for gene editing technology to fall into the wrong hands. CRISPR makes it inexpensive to intentionally misuse DNA sequencing data and editing software. Furthermore, advances in genome sequencing are allowing scientists to quickly and cheaply generate the DNA sequences of entire organisms and then digitize and store that data for research use. Terrorists could use that information to design bioweapons.

Prior to CRISPR, editing DNA required sophisticated labs, years of experience, a PhD, and many thousands of dollars. Today, simple do-it-yourself CRISPR kits are available commercially for less than $150. In the wrong hands, these simple but powerful tools are a cause for alarm. Terrorists targeting the food supply chain could alter the avian influenza genome and engineer a large bird flu epidemic, similar to the avian influenza outbreaks in Asia that affected not only poultry but also other mammals, including human beings.

So far, Washington has focused its biodefense efforts on a list of known pathogens, such as anthrax, smallpox, and Ebola, but the exponential growth of software-based biotechnology makes this approach outdated. Governments around the world must prepare both for known biological threats and for a future in which genetically modified viruses and other agents can be introduced with as much frequency and ease as software viruses. Governments could one day be held ransom by bioterrorists threatening to unleash a new pest that could destroy acres of farmland, perhaps even irreversibly, as entomologist Jeffrey Lockwood warned in his 2009 book "Six-Legged Soldiers: Using Insects as Weapons of War" [21].

At least nine countries have had documented agricultural and biological weapons programs, and four ran clandestine ones. During World War I, Germany used glanders (an infectious disease) against the horses and mules of the Allied forces. Japan used anthrax and rinderpest during World War II. And the Soviet Union used glanders in Afghanistan [22]. Non-state actors have also used biological weapons against the agriculture and food sector: examples include the 1978 poisoning of Israeli oranges [23] with mercury by the Arab Revolutionary Army-Palestinian Command, the 1984 salmonella attack on salad bars [24] at Oregon restaurants by the Rajneeshee cult, and the 1997 spraying of pesticides on grapevines in two Palestinian villages [25], which destroyed up to 17,000 metric tons of grapes.

The U.S. agricultural industry is at a heightened risk of cyberattack compared with other industries for several reasons. First, the average age of a U.S. farmer is 58 and climbing [26], and many farmers have a blind spot when it comes to technology. A 2014 survey by the American Farm Bureau Federation revealed that, although more than 50 percent of respondents were planning to invest in precision agriculture tools [27] within the next few years, an alarming 87 percent of farmers indicated that they had no plans for handling cyberthreats [28] such as data breaches. That puts them at a higher risk of hacking and ransomware attacks.

Second, like other industries, agriculture is rife with legacy, pre-Internet technologies with which modern technology must integrate. Legacy devices were designed without consideration of modern cybersecurity threats and pose a big risk. For example, a company called HydroBio (acquired by Monsanto in 2017) integrates satellite imagery with legacy irrigation systems [29] to automate irrigation decisions. If such systems are hacked, they could be made to flood entire fields and destroy harvests.
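To make that attack surface concrete, the following minimal Python sketch shows the kind of naive decision logic an automated irrigation controller might apply when it blindly trusts remotely reported soil-moisture data. The class names, threshold, and watering rule are hypothetical illustrations, not HydroBio's or Monsanto's actual software.

```python
from dataclasses import dataclass


# Hypothetical soil-moisture reading reported by a remote sensor or a
# satellite-derived data feed (moisture as percent volumetric water content).
@dataclass
class MoistureReading:
    field_id: str
    moisture_pct: float


class IrrigationController:
    """Toy controller that waters a field when reported moisture is low.

    It trusts whatever readings it receives: there is no authentication,
    plausibility check, or daily cap, mirroring the weakness of legacy
    irrigation hardware retrofitted with networked data feeds.
    """

    DRY_THRESHOLD_PCT = 20.0  # irrigate when reported moisture falls below this

    def decide_watering_minutes(self, reading: MoistureReading) -> int:
        """Return how many minutes to run the valves for this field."""
        if reading.moisture_pct >= self.DRY_THRESHOLD_PCT:
            return 0
        # Naive rule: the drier the reported soil, the longer the watering run.
        deficit = self.DRY_THRESHOLD_PCT - reading.moisture_pct
        return int(deficit * 10)


controller = IrrigationController()

# Legitimate reading: a moderately dry field gets a modest watering run.
print(controller.decide_watering_minutes(MoistureReading("field-7", 15.0)))  # 50

# Spoofed reading injected by an attacker: the controller schedules hours of
# irrigation, enough to waterlog or flood the field.
print(controller.decide_watering_minutes(MoistureReading("field-7", 0.0)))   # 200
```

A hardened design would authenticate the data feed, sanity-check readings against recent weather and neighboring sensors, and cap total watering time; the point of the sketch is only that spoofed inputs translate directly into physical damage when no such checks exist.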

Finally, the sheer scale of the agricultural sector makes the logistics of policing every piece of farmland almost impossible unless done through drones and satellites, which have their own vulnerabilities. In 2016, Mo Hailong, a U.S. permanent resident, was convicted for his role in attempting to steal corn seeds from poorly monitored Monsanto and DuPont production fields and pass them to a China-based seed company, of which he was an employee. Of course, there was no terrorism involved in this case, but it is not hard to imagine a scenario in which there could be.

A network is only as strong as its weakest point, as the 2013 hack of Target Corporation’s customer data through an air conditioning vendor [30] should make clear. Agriculture companies such as Dow, Monsanto, and John Deere have slowly started paying attention. “As an industry, we’re still new to it,” said Robert Fraley, Monsanto’s chief technology officer, in an interview with the Wall Street Journal a few years ago. Climate Corp, a digital agriculture startup acquired by Monsanto for about $1 billion, was hacked in 2014, exposing some credit card and employee information. Today, through Climate Corp’s FieldView platform, Monsanto is trying to occupy center stage in the world of connected farm sensors; the platform creates a mesh network into which all other sensors (both new and legacy) can integrate, creating an “Internet of farms.” Such a network would potentially make it easier to push new software patches and bug fixes to farms across the country, but it would also create a single point of failure, which, if hacked, could compromise the entire ecosystem.
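The single-point-of-failure concern can be illustrated with a deliberately simplified sketch. The hub-and-spoke update scheme below is an invented example, not a description of FieldView's actual design: every device trusts updates signed with one central key, so compromising the hub (or stealing that key) compromises the whole fleet at once.

```python
import hashlib
import hmac

# Hypothetical hub-and-spoke update scheme: a single shared secret signs
# every firmware update for every device on the network.
HUB_SIGNING_KEY = b"central-platform-secret"  # the single point of failure


def sign_update(firmware: bytes, key: bytes = HUB_SIGNING_KEY) -> str:
    """The hub signs a firmware image before broadcasting it to the fleet."""
    return hmac.new(key, firmware, hashlib.sha256).hexdigest()


class FieldSensor:
    """Toy sensor that installs any update carrying a valid hub signature."""

    def __init__(self, sensor_id: str):
        self.sensor_id = sensor_id
        self.firmware = b"factory-firmware-v1"

    def apply_update(self, firmware: bytes, signature: str) -> bool:
        expected = hmac.new(HUB_SIGNING_KEY, firmware, hashlib.sha256).hexdigest()
        if hmac.compare_digest(expected, signature):
            self.firmware = firmware
            return True
        return False


fleet = [FieldSensor(f"sensor-{i}") for i in range(3)]

# Routine patch pushed by the hub: every sensor accepts it.
patch = b"firmware-v2-bugfix"
print([sensor.apply_update(patch, sign_update(patch)) for sensor in fleet])
# -> [True, True, True]

# If the hub or its key is compromised, a malicious image is indistinguishable
# from a legitimate one, and the entire fleet installs it in a single push.
malicious = b"firmware-v2-sabotaged"
print([sensor.apply_update(malicious, sign_update(malicious)) for sensor in fleet])
# -> [True, True, True]
```

Real platforms can reduce this risk with per-device credentials, key rotation, and staged rollouts, but the underlying trade-off described above, the convenience of central patching versus the concentration of trust, remains.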

So what if terrorists decide to use such tactics? Terrorists are not as interested in body counts as they are in causing fear, chaos, and inconvenience. If ransomware accomplishes that, it may be preferable to other options that lead to the loss of human life. The WannaCry incident should not be brushed off as just another standalone hacking episode, the kind of thing people get used to. In the United States, agro-terrorism “with the goal of generating fear, causing economic losses, and undermining social stability [22]” is increasingly being recognized as a national security threat. More and more, disaster preparedness and homeland security training includes courses on the threat of terrorist attacks on food and agriculture [31]. It is therefore time for the affected sectors and the government to invest in technology and public education, not only to prevent such attacks but also to mitigate their consequences should terrorists target agriculture.
