9/11, A Decade Later -- Connecting the ‘big data’ dots: A decade of lessons learned
In recognition of the tenth anniversary of the 9/11 terrorist attacks, many are looking back at how a revitalized concern for national security spurred numerous initiatives to strengthen the nation's defenses. A decade of technological innovation and policy evolution later, the U.S. has made great strides in securing its borders. In retrospect, many of these changes are byproducts of lessons learned from the official 9/11 investigation.
The final 9/11 Commission Report identified one glaring pain point: the U.S. Government had been unable to connect the dots between available pieces of data that could have revealed the plot before the attacks occurred. To help ensure this never happens again, the U.S. Government has deployed innovative new technologies and improved inter- and intra-agency information sharing.
Making sense of big data
In today’s environment, government entities have exponentially more data than they did on September 10, 2001. Yet it was only three to four years ago that government came to grasp the vast amount of data in its possession -- and how to analyze it. To address this challenge, agencies now use complex, efficient IT systems that collect and analyze large amounts of data from hundreds of different sources in order to detect threats.
Not only does big data analytics integrate information from varying sources, it also filters through the noise of large data sets. Complex analytics systems develop profiles over time to help determine which information various agencies should be concerned about.
For example, data collected and analyzed today may suddenly make a seemingly innocuous phone call from six months ago relevant. That connection between current information and earlier data points makes it critical to analyze the content of that call and to expand the analysis to whom the participants communicated with before and after it. The investigation quickly fans out across several degrees of separation between relevant data points, which in turn requires intense analytics to see through the data clutter.
The U.S. Customs and Border Protection (CBP) Automated Targeting System (ATS) is an excellent example of a system that efficiently analyzes raw data and provides critical information to the government. ATS collects, analyzes and disseminates data to target and identify potential terrorists and to prevent terrorist weapons from entering the U.S.
For example, the system could flag a shipment of ink cartridges from a country, such as Yemen, which does not manufacture cartridges and which has been connected with potentially harmful cargo in the past. CBP security analysts are empowered to conduct complex searches and uncover the full picture of a potential terrorist attack before it occurs.
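The ink-cartridge scenario above boils down to a screening rule: flag cargo whose declared commodity is not produced in its country of origin when that origin also has prior incidents on record. The following is a minimal, hypothetical sketch of such a rule; the data tables and function are invented for illustration and do not reflect ATS's actual logic.

```python
# Illustrative lookup tables (assumptions, not real trade data):
# which countries manufacture a commodity, and which origins have
# prior harmful-cargo incidents on record.
MANUFACTURING_COUNTRIES = {
    "ink cartridges": {"JP", "CN", "US"},
}
PRIOR_INCIDENT_ORIGINS = {"YE"}

def flag_shipment(commodity, origin):
    """Flag a shipment when the commodity is not made in the origin
    country AND the origin has prior incidents on record."""
    produced_locally = origin in MANUFACTURING_COUNTRIES.get(commodity, set())
    risky_origin = origin in PRIOR_INCIDENT_ORIGINS
    return (not produced_locally) and risky_origin

# Cartridges declared from Yemen: not a producer, prior incidents.
print(flag_shipment("ink cartridges", "YE"))
```

Production systems layer hundreds of such rules with statistical scoring, but the anomaly-plus-history pattern is the same.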
Information sharing trumps bureaucracy
A large component of being able to “connect the dots” is transparent information sharing among agencies. Prior to 9/11, bureaucratic procedures “siloed” data between governing bodies. Agencies operated within the confines of their own departments, and most information was shared via individually written reports -- a process that was both slow and inefficient.
Since then, the government has set new policies to maximize secure data sharing. Most notably, the Department of Homeland Security was created to oversee more than 20 agencies and manage the protection of the country. To build transparency and increase efficiency, agencies also gained access to one another's databases. For example, hybrid cloud environments allow agencies to store and manage data in secure private clouds and data centers, as well as in secure shared cloud environments.
The Patriot Act, President Obama’s National Strategy for Counterterrorism and efforts to ramp up personnel along U.S. borders also contributed to national security efforts at large. With every new policy to improve federal security processes, government has called on the assistance of modern technology to accompany these efforts.
Computing in a mobile world
Mobile technology is a key enabler for real-time information sharing. Field agents -- from those on the border to those in airports -- are equipped with mobile devices that receive and disseminate tactical information at a moment’s notice. For example, if the FBI learns that a restricted individual may be attempting to board a plane, agents can immediately notify an onsite Transportation Security Administration (TSA) agent to halt and detain the individual.
Tactical mobile devices carry the security measures necessary to ensure communications networks remain uncompromised. In one example, government officers have used mobile devices to capture license plate and ID credential information and to run real-time queries on the collected data.
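A field query of this kind typically normalizes the captured plate and checks it against a back-end watchlist. The sketch below uses a toy in-memory set as a stand-in for that back end; the watchlist contents, function names, and normalization rule are assumptions for illustration -- real systems query secured databases over encrypted channels.

```python
# Hypothetical watchlist standing in for a secured back-end database.
WATCHLIST = {"ABC1234", "XYZ9876"}

def normalize(plate):
    """Canonicalize a captured plate: uppercase, strip separators."""
    return "".join(ch for ch in plate.upper() if ch.isalnum())

def query_plate(raw_plate):
    """Return the normalized plate and whether it hits the watchlist."""
    plate = normalize(raw_plate)
    return {"plate": plate, "hit": plate in WATCHLIST}

# OCR from a camera capture often includes separators and mixed case.
print(query_plate("abc-1234"))
```

Normalizing before the lookup matters because optical capture rarely matches the canonical form stored in the database.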
A decade later, now what?