Seal of the National Security Agency. (SEAL: PUBLIC DOMAIN)

A Security Scholar Talks About the NSA Scandal’s Private Side

• June 12, 2013 • 4:24 PM


“If there’s full disclosure—we are monitoring your phone calls and we’re tracking your keystrokes, and you should just know that this is happening—people tend to be OK with it, because they can self-regulate.”

Yesterday we posted part one of an interview with Torin Monahan, co-author of SuperVision: An Introduction to the Surveillance Society, on the NSA data mining scandal. Monahan described some of his research into data sharing among intelligence and law enforcement, and argued that we are seeing a weakening not only of the laws designed to prevent abuse of private data, but a cultural change in favor of surveillance.

In part two, Monahan, a professor of communications at the University of North Carolina-Chapel Hill, speaks to the private sector side of the scandal. How is it that some information stays private and some doesn’t? Why?

This interview has been slightly condensed for length and edited for clarity.

I’d like to ask about the Silicon Valley companies named in the scandal. Lots of people have enormous incentives to dig digital dirt. Yet we don’t generally see digital profiles becoming public, though the companies implicated in the NSA scandal have that information. You don’t hear that some public figure just downloaded Fifty Shades of Grey, or has an Internet porn habit or a bad credit rating.
I don’t know that that’s quite true. I think the kinds of scandals that the private sector has been involved in have to do with breaches of their own data security—releasing all kinds of confidential customer information because the data were inadequately secured.

If you’re given the choice between convenience against the promise of some kind of anonymity that is vague and uncertain, then most people are going to choose convenience.

They get hacked.
Well, sometimes they get hacked. Sometimes they leave laptops in the wrong place. And also there has been a lot of corporate espionage, and corporations spying on their own. I’m thinking about the Hewlett Packard case from a few years back, hiring private investigators to spy on board members, and digging through phone records. And you think about the phone records scandal in the U.K. too. There have been a lot of privacy breaches that have made news but maybe not of the dirty laundry sort that one might expect from political campaigns.

Still, if you think of a Lee Atwater type. I’m bringing this up because it suggests the private companies are hewing to some sort of privacy standard. They have a ton of dirt on a lot of people.
My read on that would be that the motivations of those private companies are often different, and possibly at odds with the government security apparatus. If Instagram or Facebook or these other sites were openly sharing your information without any legal mandate to do so, then that could negatively affect their customer base and their brand image and other things that companies care deeply about, because profits are tied to it.

They’re perfectly OK with sharing information, and they do so constantly. But they need some kind of alibi to do so. They need a scapegoat like “the government made us do it” or “we did our best to anonymize your data and someone hacked us, and it wasn’t quite as anonymous as we thought it to be.”

So it could just be business models and the different operating norms of those organizations.

Whether it’s private or public data, the NSA scandal seems likely to provoke a regulatory battle. To alter the program, you’d need Congress to recognize the growing impossibility of remaining anonymous, and regulate digital data within an inch of its life.
There are a couple of issues here. One has to do with the degrees of transparency we have in society. For the most part we have asymmetrical transparency, where the major organizations, whether government or industry, are relatively opaque, and their practices are relatively opaque, and therefore not very accountable. And that’s true with these NSA programs too. You discover this has been going on for some time and we didn’t know about it. It took a leak because we don’t have transparency on how these organizations that govern our lives are behaving.

On the other hand we have almost total transparency when it comes to individual behaviors and actions, and beliefs even. So that’s an issue here. Not simply “technology’s advancing too quickly and we can’t do anything about it.” But “what kinds of arrangements do we want to have in place, in which we can have some invisibility.” And, maybe, organizations have to be more transparent and accountable. Regardless of what the practices are.

How does that transparency work in practice?
If you look at other countries you see that they do things differently than the United States. You can have programs where you have to opt in in order to share your information, instead of having that be the default. It’s not destroying business. It’s not destroying companies. They find innovative ways to comply with those regulations. So we can do the same thing here, and I think those are the kinds of questions we need to be asking.

Is there a technical solution, rather than a legal or political one, for enhancing privacy?
Technically it is possible to develop systems—and there are a number of good ones out there—that embody privacy-enhancing technologies.

But for me the interesting part of the articulation is the legitimacy: “If they have a legitimate reason for listening in, they could.” Well, that is interpretable. What is legitimate? How are they going to know something is legitimate when they hear it, when they see it? How are they going to prove it or substantiate it in court?

So it’s the issue of legitimacy that is at the heart of the current debate here. Do you have a legitimate reason to listen in to people’s phone calls or look into the patterns of their travel? I suspect that most people are feeling that those activities are illegitimate. And that’s why there’s so much controversy right now over the NSA programs.

So tools like Tor, for example, aren’t the solution?
The technological community has been pretty good about developing applications to try to mask some of the personal or private activities of users. But it is a spy versus spy dynamic, where as soon as one application is developed, something else is invented to circumvent it. That postpones the conversation about what we want our information systems to look like and what kind of governance we want to have.

If someone were recording this conversation, and a system was in place that flagged it for review, sent it up a chain of command to a court or an officer who evaluated it and pressed “delete,” what is the problem there? At some point we either need to trust a national security system, or get rid of it, right?
A few responses. Because of the relative opacity, we may not ever know what implications we are being subjected to. Maybe I’m being singled out for enhanced screening much more than the next person. In the absence of any evidence, I don’t know that it’s because of this conversation, or it’s because of something else.

The second issue has to do with how all these data are converging and how they can be fused together, not just by government systems but by private systems. You can imagine the argument “I have nothing to hide if I’m not doing anything wrong.” But we’re all doing something wrong at some points in our lives from the perspective of someone. And that could be your employer who doesn’t like your political leanings or doesn’t like your lifestyle. That could be your insurance company that feels you live in an environmentally dangerous neighborhood.

But in theory they don’t have access to this.
What I think is revealed by this NSA program is the easy translation and information exchange across domains. Homeland Security fusion centers are collecting information from data aggregators who are tracking our credit card purchases and our driving records and everything else, and that information exchange is becoming a two-way street, where you can have companies like Intel or Boeing or hotels even, or the owners of utilities that could say those are critical infrastructures. So now you’re having an exchange of data that’s going from the law enforcement community to the private sector, because it’s deemed to be pertinent information for covering their own risks.

We saw this with Occupy Wall Street, where there was a real synergy between the banking industry and the police. They were exchanging information and it was a two-way street. We have precedents that were established. These programs aren’t going to remain siloed. Instead, they represent a very easy flow of personal data across those silos.

What’s the solution to this? To create obstacles to the information flow, or create a kind of oversight of the flows that’s reliable?
There are a number of solutions. One could entail what some of my colleagues have called maintaining the “contextual integrity” of our data. That data collected in one place for one purpose shouldn’t be transportable to other places for other purposes without the consent of the person involved. There are schemes to do this. It could be a data exchange scheme where you could sell your data or give other people rights to your data for marketing or other purposes.

Other possibilities are to obfuscate our data. Make it less specific, more crude, so that it’s less revealing of all our activities. We can dumb it down.

What’s an example of that?
One example could be marketing. Instead of linking your frequent shopper card to your credit card to your online searches and bundling all those together to create a profile of you as a specific individual, what if some of those were de-linked or they couldn’t identify your demographics specifically? It could be “someone who got a high school or a college education,” but it wouldn’t be “you attended this college and you got these grades and here was your major.” To say it’s OK to traffic in crude categories but not fine-grain data unless you have public safety reasons to do so.

Wouldn’t that just be defeated by the tendency among consumers to share everything? Facebook asks me where I went to college every few days, and most people tell them. Amazon’s business model is based on granular data, and they actually do recommend books I end up liking, so I’m inclined to let them learn more about me.
Absolutely. This is one reason I don’t put much faith in the various technological solutions. If you’re given the choice between convenience against the promise of some kind of anonymity that is vague and uncertain, then most people are going to choose convenience.

But the other thing is that people are often coerced into disclosure. Take Facebook. The social cost of not being a part of social media could actually be so detrimental that it’s no longer viable to stay unplugged, or use a smartphone or any of the other gadgets in our lives. So there’s a coercive element. And that’s completely intentional, of course, because it’s great for these platforms.

You won’t get a job if you’re not on LinkedIn.
I wouldn’t say that, as someone who has a job and is not on LinkedIn.

When workplace surveillance happens, when people’s keystrokes are being monitored or their telephone calls are being listened to, if they’re unaware that that’s happening, and they find out, they experience a great sense of violation. They’re angry about it. If there’s full disclosure, “we are monitoring your phone calls and we’re tracking your keystrokes, and you should just know that this is happening,” people tend to be OK with it, because they can self-regulate and they don’t feel they’re being trapped.

So there’s a psychological aspect to this and we see it in the NSA scandal as well. That this was completely hidden. It was secretive. It was without public approval.

Isn’t a certain amount of secrecy necessary in the national security realm? If people know they’re being monitored, and they’re planning something nefarious, they change their behavior.
I’ve heard that argument, and, in this case, I don’t think it holds much water. What are the alternative forms of communication that people are going to use? A carrier pigeon? You’re going to not use a phone, not use the Internet?

Meet in a café.
Sure. And there are clever ways to try to evade scrutiny. People communicating through online games is one method the intelligence community is really worried about. It looks like you’re playing a game, but really you’re using it as a communications platform.

On the other hand, this is a wholesale collection of all of our data that can be analyzed at a later point for any kinds of connection that one is curious about. And that’s a different matter entirely from a targeted investigation because there’s reasonable suspicion that someone is involved in criminal or terrorist activity. That’s the legal threshold. We have the federal regulations to back that up.

What these programs are doing is effectively circumventing that legal restriction, and then massaging it after the fact to say “well, we collected the data but we’re not really going to act on it unless we think we have to.”

I don’t think that’s legal.

Marc Herman




Copyright © 2014 by Pacific Standard and The Miller-McCune Center for Research, Media, and Public Policy. All Rights Reserved.