Seal of the National Security Agency. (SEAL: PUBLIC DOMAIN)

A Security Scholar Talks About the NSA Scandal’s Private Side

• June 12, 2013 • 4:24 PM

Yesterday we posted part one of an interview with Torin Monahan, co-author of SuperVision: An Introduction to the Surveillance Society, on the NSA data mining scandal. Monahan described some of his research into data sharing among intelligence and law enforcement, and argued that we are seeing a weakening not only of the laws designed to prevent abuse of private data, but a cultural change in favor of surveillance.

In part two, Monahan, a professor of communications at the University of North Carolina at Chapel Hill, speaks to the private-sector side of the scandal. How is it that some information stays private and some doesn’t? Why?

This interview has been slightly condensed for length and edited for clarity.

I’d like to ask about the Silicon Valley companies named in the scandal. Lots of people have enormous incentives to dig digital dirt. Yet we don’t generally see digital profiles becoming public, though the companies implicated in the NSA scandal have that information. You don’t hear that some public figure just downloaded Fifty Shades of Grey, or has an Internet porn habit or a bad credit rating.
I don’t know that that’s quite true. I think the kinds of scandals that the private sector has been involved in have to do with breaches of their own data security, releasing all kinds of confidential customer information because of inadequately secured data.

They get hacked.
Well, sometimes they get hacked. Sometimes they leave laptops in the wrong place. And also there has been a lot of corporate espionage, and corporations spying on their own. I’m thinking about the Hewlett-Packard case from a few years back, hiring private investigators to spy on board members and digging through phone records. And you think about the phone-hacking scandal in the U.K. too. There have been a lot of privacy breaches that have made news, but maybe not of the dirty-laundry sort that one might expect from political campaigns.

Still, if you think of a Lee Atwater type. I’m bringing this up because it suggests the private companies are hewing to some sort of privacy standard. They have a ton of dirt on a lot of people.
My read on that would be that the motivations of those private companies are often different, and possibly at odds with the government security apparatus. If Instagram or Facebook or these other sites were openly sharing your information without any legal mandate to do so, then that could negatively affect their customer base and their brand image and other things that companies care deeply about, because profits are tied to it.

They’re perfectly OK with sharing information, and they do so constantly. But they need some kind of alibi to do so. They need a scapegoat like “the government made us do it” or “we did our best to anonymize your data and someone hacked us, and it wasn’t quite as anonymous as we thought it to be.”

So it could just be business models and the different operating norms of those organizations.

Whether it’s private or public data, the NSA scandal seems likely to provoke a regulatory battle. To alter the program, you’d need Congress to recognize the growing impossibility of remaining anonymous, and regulate digital data within an inch of its life.
There are a couple of issues here. One has to do with the degrees of transparency we have in society. For the most part we have asymmetrical transparency, where the major organizations, whether government or industry, are relatively opaque, and their practices are relatively opaque, and therefore not very accountable. And that’s true with these NSA programs too. You discover this has been going on for some time and we didn’t know about it. It took a leak because we don’t have transparency on how these organizations that govern our lives are behaving.

On the other hand we have almost total transparency when it comes to individual behaviors and actions, and beliefs even. So that’s an issue here. Not simply “technology’s advancing too quickly and we can’t do anything about it.” But “what kinds of arrangements do we want to have in place, in which we can have some invisibility.” And, maybe, organizations have to be more transparent and accountable. Regardless of what the practices are.

How does that transparency work in practice?
If you look at other countries, you see that they do things differently than the United States. You can have programs where you have to opt in to share your information, instead of having that be the default. It’s not destroying business. It’s not destroying companies. They find innovative ways to comply with those regulations. So we can do the same thing here, and I think those are the kinds of questions we need to be asking.

Is there a technical solution, rather than a legal or political one, for enhancing privacy?
Technically it is possible to develop systems—and there are a number of good ones out there—that embody privacy-enhancing technologies.

But for me the interesting part of the articulation is the legitimacy: “If they have a legitimate reason for listening in, they could.” Well, that is interpretable. What is legitimate? How are they going to know something is legitimate when they hear it, when they see it? How are they going to prove it or substantiate it in court?

So it’s the issue of legitimacy that is at the heart of the current debate here. Do you have a legitimate reason to listen in to people’s phone calls or look into the patterns of their travel? I suspect that most people are feeling that those activities are illegitimate. And that’s why there’s so much controversy right now over the NSA programs.

So tools like Tor, for example, aren’t the solution?
The technological community has been pretty good about developing applications to try to mask some of the personal or private activities of users. But it is a spy versus spy dynamic, where as soon as one application is developed, something else is invented to circumvent it. That postpones the conversation about what we want our information systems to look like and what kind of governance we want to have.

If someone were recording this conversation, and a system was in place that flagged it for review, sent it up a chain of command to a court or an officer who evaluated it and pressed “delete,” what is the problem there? At some point we either need to trust a national security system, or get rid of it, right?
A few responses. Because of the relative opacity, we may never know what we are being subjected to. Maybe I’m being singled out for enhanced screening much more than the next person. In the absence of any evidence, I don’t know whether it’s because of this conversation or because of something else.

The second issue has to do with how all these data are converging and how they can be fused together, not just by government systems but by private systems. You can imagine the argument “I have nothing to hide if I’m not doing anything wrong.” But we’re all doing something wrong at some points in our lives from the perspective of someone. And that could be your employer who doesn’t like your political leanings or doesn’t like your lifestyle. That could be your insurance company that feels you live in an environmentally dangerous neighborhood.

But in theory they don’t have access to this.
What I think is revealed by this NSA program is the easy translation and information exchange across domains. Homeland Security fusion centers are collecting information from data aggregators who are tracking our credit card purchases and our driving records and everything else, and that information exchange is becoming a two-way street. You can have companies like Intel or Boeing, or hotels even, or the owners of utilities, that could say those are critical infrastructures. So now you’re having an exchange of data that’s going from the law enforcement community to the private sector, because it’s deemed to be pertinent information for covering their own risks.

We saw this with Occupy Wall Street, where there was a real synergy between the banking industry and the police. They were exchanging information and it was a two-way street. We have precedents that were established. These programs aren’t going to remain siloed. Instead, they represent a very easy flow of personal data across those silos.

What’s the solution to this? To create obstacles to the information flow, or create a kind of oversight of the flows that’s reliable?
There are a number of solutions. One could entail what some of my colleagues have called maintaining the “contextual integrity” of our data. That data collected in one place for one purpose shouldn’t be transportable to other places for other purposes without the consent of the person involved. There are schemes to do this. It could be a data exchange scheme where you could sell your data or give other people rights to your data for marketing or other purposes.

Other possibilities are to obfuscate our data. Make it less specific, more crude, so that it’s less revealing of all our activities. We can dumb it down.

What’s an example of that?
One example could be marketing. Instead of linking your frequent shopper card to your credit card to your online searches and bundling all those together to create a profile of you as a specific individual, what if some of those were de-linked or they couldn’t identify your demographics specifically? It could be “someone who got a high school or a college education,” but it wouldn’t be “you attended this college and you got these grades and here was your major.” To say it’s OK to traffic in crude categories but not fine-grain data unless you have public safety reasons to do so.
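The idea of trafficking in crude categories rather than fine-grained data can be sketched in a few lines of code. This is a minimal, hypothetical illustration of that kind of coarsening; the field names, bucket boundaries, and `coarsen_profile` function are assumptions for the example, not any real system’s schema.

```python
def coarsen_profile(record):
    """Return a generalized copy of a consumer record.

    Specific identifiers are dropped, and continuous values are
    bucketed, so the result describes a broad category of people
    rather than a specific individual.
    """
    age = record["age"]
    if age < 18:
        age_band = "under 18"
    elif age < 40:
        age_band = "18-39"
    else:
        age_band = "40+"

    # Degree level only -- not which college, grades, or major.
    education = "college" if record["has_degree"] else "high school"

    return {
        "age_band": age_band,        # a crude bucket, not an exact age
        "education": education,
        "region": record["zip"][:1], # first ZIP digit: a broad region
    }

profile = coarsen_profile({"age": 34, "has_degree": True, "zip": "27514"})
```

A marketer receiving `profile` could still target broad demographics, but could no longer re-identify the individual from the shared record alone.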

Wouldn’t that just be defeated by the tendency among consumers to share everything? Facebook asks me where I went to college every few days, and most people tell them. Amazon’s business model is based on granular data, and they actually do recommend books I end up liking, so I’m inclined to let them learn more about me.
Absolutely. This is one reason I don’t put much faith in the various technological solutions. If you’re given the choice between convenience against the promise of some kind of anonymity that is vague and uncertain, then most people are going to choose convenience.

But the other thing is that people are often coerced into disclosure. Take Facebook. The social cost of not being a part of social media could actually be so detrimental that it’s no longer viable to stay unplugged, or to forgo a smartphone or any of the other gadgets in our lives. So there’s a coercive element. And that’s completely intentional, of course, because it’s great for these platforms.

You won’t get a job if you’re not on LinkedIn.
I wouldn’t say that, as someone who has a job and is not on LinkedIn.

When workplace surveillance happens, when people’s keystrokes are being monitored or their telephone calls are being listened to, if they’re unaware that that’s happening, and they find out, they experience a great sense of violation. They’re angry about it. If there’s full disclosure—“we are monitoring your phone calls and we’re tracking your keystrokes, and you should just know that this is happening”—people tend to be OK with it, because they can self-regulate and they don’t feel they’re being trapped.

So there’s a psychological aspect to this and we see it in the NSA scandal as well. That this was completely hidden. It was secretive. It was without public approval.

Isn’t a certain amount of secrecy necessary in the national security realm? If people know they’re being monitored, and they’re planning something nefarious, they change their behavior.
I’ve heard that argument, and, in this case, I don’t think it holds much water. What are the alternative forms of communication that people are going to use? A carrier pigeon? You’re going to not use a phone, not use the Internet?

Meet in a café.
Sure. And there are clever ways to try to evade scrutiny. People communicating in online games is one method the intelligence community is really worried about. It looks like you’re playing a game, but really you’re using it as a communications platform.

On the other hand, this is a wholesale collection of all of our data that can be analyzed at a later point for any kinds of connection that one is curious about. And that’s a different matter entirely from a targeted investigation because there’s reasonable suspicion that someone is involved in criminal or terrorist activity. That’s the legal threshold. We have the federal regulations to back that up.

What these programs are doing is effectively circumventing that legal restriction, and then massaging it after the fact to say “well, we collected the data but we’re not really going to act on it unless we think we have to.”

I don’t think that’s legal.

Marc Herman



Copyright © 2014 by Pacific Standard and The Miller-McCune Center for Research, Media, and Public Policy. All Rights Reserved.