
The Current

What Will You Sacrifice in the Name of Security?

Cameron Nath

Should our thoughts be considered personal, or are they open to collection, even for what one might consider a ‘noble’ purpose?


The three parallel towers of Marina Bay Sands in Singapore create one of the most breathtaking views in the world. The resort has a 500-foot-long infinity pool, the world’s largest, set atop the world’s largest cantilevered platform some 55 stories in the air. In June of 2013, this was the site of the RSA Conference Asia Pacific. “Where the world talks security” was the headline. With Stuxnet three years in the rearview and the Target hack that compromised 40 million card details looming in the future, the tagline resonated powerfully. Some of the largest players in the room, HP, AirWatch, and Integralis among them, paid over $150,000 to take part. Pillars of the industry descended on Singapore to hear and be heard.

Between lectures such as “Big Data Calls for Big Security” and “Preventing Zero-Day Attacks in Mobile Devices,” one opportunity stood out. It didn’t have a catchy title or a star-studded speaker; it simply had five letters: DARPA. The Defense Advanced Research Projects Agency was in attendance, and if prior decades of research and innovation have taught us anything, it is that when DARPA speaks, the industry listens.


The lecture was co-presented by Dr. Neil Costigan and Mr. Ingo Deutschmann, founder and vice president of BehavioSec respectively. The subject was “DARPA Active Authentication Program: Behavioral Biometrics.” Dr. Costigan and Mr. Deutschmann presented the framework for what has since become the new horizon of the Information Security world: User Behavior Analytics.

Around 2013, the public had just begun familiarizing itself with older, legacy biometric authentication methods such as fingerprint scanners. Meanwhile, private industry was looking ahead and taking a page from traditional User Behavior Analytics: companies had begun applying models built to predict consumer behavior to the Information Security space. But DARPA was yet again looking toward the future. The speakers reviewed the traditional range of biometrics and began steering us toward a true active authentication model.

Since that 2013 conference, non-cooperative behavioral biometrics have made it possible to validate a user’s identity passively, while the user simply behaves normally; authentication no longer has to interrupt what the user is doing. Still, much of the potential of behavioral biometrics remains untapped in the InfoSec arena. As Dr. Costigan reflected, we’ve only begun to explore more complex behavioral biometrics such as Computational Linguistics (how you use language) and Structural Semantic Analysis (how you construct sentences). Bringing these additional metrics to bear will produce an ever more detailed user security profile and, in theory, a safer, more secure digital landscape.
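To make the idea of passive, non-cooperative authentication concrete, here is a minimal sketch built around keystroke timing, one of the simplest behavioral biometrics. It is an illustration only, not BehavioSec’s or DARPA’s implementation; the feature set (key hold and flight times), the statistical profile, and the threshold are assumptions chosen for clarity.

```python
# Illustrative sketch only: continuous (passive) authentication from keystroke
# timing. Features, profile format, and threshold are assumptions for this
# example, not any vendor's actual model.
from statistics import mean, stdev

def extract_features(events):
    """events: list of (key, press_time, release_time) tuples in seconds.
    Returns hold times (release - press) and flight times between keys."""
    holds = [release - press for _, press, release in events]
    flights = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    return holds, flights

def enroll(sessions):
    """Build a simple statistical profile (mean/stdev) from several typing sessions."""
    all_holds, all_flights = [], []
    for events in sessions:
        holds, flights = extract_features(events)
        all_holds += holds
        all_flights += flights
    return {"hold": (mean(all_holds), stdev(all_holds)),
            "flight": (mean(all_flights), stdev(all_flights))}

def anomaly_score(profile, events):
    """Average z-score distance of a new typing sample from the enrolled profile.
    Lower scores mean the sample looks like the enrolled user."""
    holds, flights = extract_features(events)
    def distance(values, stats):
        mu, sigma = stats
        return mean(abs(v - mu) / sigma for v in values) if sigma else 0.0
    return (distance(holds, profile["hold"]) + distance(flights, profile["flight"])) / 2

def still_authenticated(profile, events, threshold=2.0):
    """Passive check run in the background: no challenge, no interruption."""
    return anomaly_score(profile, events) < threshold
```

A production system would fuse many more modalities (mouse dynamics, touch pressure, and eventually the linguistic and semantic features mentioned above) and replace the single threshold with a probabilistic risk score, but the shape is the same: the user is validated continuously, just by behaving normally.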

I’d like to look beyond even this eventuality and peer forward to what Dr. Costigan and Mr. Deutschmann refer to as ‘new biometric modalities’. More specifically, modalities in the neurocognitive space.

Whether DARPA truly understood at the time the depths to which neurocognitive modalities could be harnessed is up for debate. However, what we now know is that these ideas are more than just theoretically possible. They have since been put into practice and demonstrated in spectacular fashion.

At the Enigma Conference in January of 2017, University of Washington researcher Dr. Tamara Bonaci provided the bridge from theory to the reality DARPA had predicted in Singapore. The implications for the future of User Behavior Analytics in the security space are potentially far-reaching and may even force us to question: what are we willing to sacrifice in the name of security?


Dr. Bonaci describes an experiment in which a simple interactive UI might be used to harvest neural responses to intermittently depicted subliminal images. Her game, called FlappyWhale, measures subjects’ reactions to seemingly innocuous images by reading electroencephalography (EEG) signals in real time, thereby creating a profile of the individual’s thoughts and feelings in relation to what’s being depicted. After the Enigma conference, Bonaci told Ars Technica, “Electrical signals produced by our body might contain sensitive information about us that we might not be willing to share with the world. On top of that, we may be giving that information away without even being aware of it.”
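For readers curious about the mechanics, the general technique behind such an experiment is event-related potential (ERP) analysis: the EEG stream is segmented around the moment each image appears, and the segments are averaged per stimulus category so that the brain’s stereotyped response to each category emerges from the noise. The sketch below is a simplified illustration of that idea under assumed parameters (sampling rate, window length, data layout); it is not Dr. Bonaci’s FlappyWhale code.

```python
# Simplified illustration of ERP-style analysis of EEG responses to image
# stimuli. Sampling rate, window length, and data layout are assumptions made
# for this sketch; this is not the FlappyWhale implementation.
import numpy as np

FS = 256                # assumed EEG sampling rate in Hz
WINDOW = int(FS * 0.8)  # keep 0.8 s of signal after each stimulus onset

def epoch(eeg, onsets):
    """eeg: array of shape (channels, samples); onsets: stimulus onset indices.
    Returns an array of shape (n_epochs, channels, WINDOW)."""
    segments = [eeg[:, t:t + WINDOW] for t in onsets if t + WINDOW <= eeg.shape[1]]
    return np.stack(segments)

def erp_by_category(eeg, events):
    """events: list of (onset_sample, category_label) pairs.
    Averages the epochs for each category, yielding that category's ERP."""
    erps = {}
    for category in {label for _, label in events}:
        onsets = [t for t, label in events if label == category]
        erps[category] = epoch(eeg, onsets).mean(axis=0)
    return erps

def strongest_responses(erps, baseline_category):
    """Rank stimulus categories by how far their ERP deviates from a neutral
    baseline; a consistently large deviation hints at a differential reaction
    to that category of images, the kind of signal a profiling system could exploit."""
    baseline = erps[baseline_category]
    deviation = {c: float(np.abs(e - baseline).mean())
                 for c, e in erps.items() if c != baseline_category}
    return sorted(deviation.items(), key=lambda kv: kv[1], reverse=True)
```

The unsettling part, and the point of Dr. Bonaci’s demonstration, is that none of this requires the subject’s cooperation: the signal is produced simply by looking at the screen, which is exactly why she warns that we may be giving this information away without being aware of it.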

It is here that we see the new moral questions of security take shape. Should our thoughts be considered personal or, much like their more physical behavioral biometric counterparts, are they open to collection? What if the collection serves what one might consider a ‘noble’ purpose? This is the space where philosophy meets physics. These questions are no longer the stuff of science fiction.

I will share a closing thought: forty years ago, the US Privacy Protection Study Commission published a study in which the authors noted, “The real danger is the gradual erosion of individual liberties through automation, integration, and interconnection of many small, separate record-keeping systems, each of which alone may seem innocuous, even benevolent, and wholly justifiable.”

I leave it to each reader to draw their own conclusions.
