Seal of the National Security Agency. (SEAL: PUBLIC DOMAIN)

A Security Scholar Talks About the NSA Scandal’s Private Side

June 12, 2013 • 4:24 PM


“If there’s full disclosure—we are monitoring your phone calls and we’re tracking your keystrokes, and you should just know that this is happening—people tend to be OK with it, because they can self-regulate.”

Yesterday we posted part one of an interview with Torin Monahan, co-author of SuperVision: An Introduction to the Surveillance Society, on the NSA data mining scandal. Monahan described some of his research into data sharing among intelligence and law enforcement, and argued that we are seeing a weakening not only of the laws designed to prevent abuse of private data, but a cultural change in favor of surveillance.

In part two, Monahan, a professor of communications at the University of North Carolina at Chapel Hill, speaks to the private sector side of the scandal. How is it that some information stays private and some doesn’t? Why?

This interview has been slightly condensed for length and edited for clarity.

I’d like to ask about the Silicon Valley companies named in the scandal. Lots of people have enormous incentives to dig digital dirt. Yet we don’t generally see digital profiles becoming public, though the companies implicated in the NSA scandal have that information. You don’t hear that some public figure just downloaded Fifty Shades of Grey, or has an Internet porn habit or a bad credit rating.
I don’t know that that’s quite true. I think the kinds of scandals the private sector has been involved in have to do with breaches of their own data security: releasing all kinds of confidential customer information because of inadequately secured data.

If you’re given the choice between convenience against the promise of some kind of anonymity that is vague and uncertain, then most people are going to choose convenience.

They get hacked.
Well, sometimes they get hacked. Sometimes they leave laptops in the wrong place. And there has also been a lot of corporate espionage, and corporations spying on their own people. I’m thinking about the Hewlett-Packard case from a few years back: hiring private investigators to spy on board members and digging through phone records. And you think about the phone records scandal in the U.K. too. There have been a lot of privacy breaches that have made news, but maybe not of the dirty-laundry sort that one might expect from political campaigns.

Still, if you think of a Lee Atwater type. I’m bringing this up because it suggests the private companies are hewing to some sort of privacy standard. They have a ton of dirt on a lot of people.
My read on that would be that the motivations of those private companies are often different, and possibly at odds with the government security apparatus. If Instagram or Facebook or these other sites were openly sharing your information without any legal mandate to do so, then that could negatively affect their customer base and their brand image and other things that companies care deeply about, because profits are tied to it.

They’re perfectly OK with sharing information, and they do so constantly. But they need some kind of alibi to do so. They need a scapegoat like “the government made us do it” or “we did our best to anonymize your data and someone hacked us, and it wasn’t quite as anonymous as we thought it to be.”

So it could just be business models and the different operating norms of those organizations.

Whether it’s private or public data, the NSA scandal seems likely to provoke a regulatory battle. To alter the program, you’d need Congress to recognize the growing impossibility of remaining anonymous, and regulate digital data within an inch of its life.
There are a couple of issues here. One has to do with the degrees of transparency we have in society. For the most part we have asymmetrical transparency, where the major organizations, whether government or industry, are relatively opaque, and their practices are relatively opaque, and therefore not very accountable. And that’s true with these NSA programs too. You discover this has been going on for some time and we didn’t know about it. It took a leak because we don’t have transparency on how these organizations that govern our lives are behaving.

On the other hand we have almost total transparency when it comes to individual behaviors and actions, and beliefs even. So that’s an issue here. Not simply “technology’s advancing too quickly and we can’t do anything about it.” But “what kinds of arrangements do we want to have in place, in which we can have some invisibility.” And, maybe, organizations have to be more transparent and accountable. Regardless of what the practices are.

How does that transparency work in practice?
If you look at other countries, you see that they do things differently than the United States. You can have programs where you have to opt in to share your information, instead of having sharing be the default. It’s not destroying business. It’s not destroying companies. They find innovative ways to comply with those regulations. So we can do the same thing here, and I think those are the kinds of questions we need to be asking.

Is there a technical solution, rather than a legal or political one, for enhancing privacy?
Technically it is possible to develop systems—and there are a number of good ones out there—that embody privacy-enhancing technologies.

But for me the interesting part of the articulation is the legitimacy: “If they have a legitimate reason for listening in, they could.” Well, that is interpretable. What is legitimate? How are they going to know something is legitimate when they hear it, when they see it? How are they going to prove it or substantiate it in court?

So it’s the issue of legitimacy that is at the heart of the current debate here. Do you have a legitimate reason to listen in to people’s phone calls or look into the patterns of their travel? I suspect that most people are feeling that those activities are illegitimate. And that’s why there’s so much controversy right now over the NSA programs.

So tools like Tor, for example, aren’t the solution?
The technological community has been pretty good about developing applications to try to mask some of the personal or private activities of users. But it is a spy versus spy dynamic, where as soon as one application is developed, something else is invented to circumvent it. That postpones the conversation about what we want our information systems to look like and what kind of governance we want to have.

If someone were recording this conversation, and a system was in place that flagged it for review, sent it up a chain of command to a court or an officer who evaluated it and pressed “delete,” what is the problem there? At some point we either need to trust a national security system, or get rid of it, right?
A few responses. Because of the relative opacity, we may never know what consequences we are being subjected to. Maybe I’m being singled out for enhanced screening much more than the next person. In the absence of any evidence, I don’t know whether it’s because of this conversation or because of something else.

The second issue has to do with how all these data are converging and how they can be fused together, not just by government systems but by private systems. You can imagine the argument “I have nothing to hide if I’m not doing anything wrong.” But we’re all doing something wrong at some points in our lives from the perspective of someone. And that could be your employer who doesn’t like your political leanings or doesn’t like your lifestyle. That could be your insurance company that feels you live in an environmentally dangerous neighborhood.

But in theory they don’t have access to this.
What I think is revealed by this NSA program is the easy translation and information exchange across domains. Homeland Security fusion centers are collecting information from data aggregators who are tracking our credit card purchases and our driving records and everything else, and that information exchange is becoming a two-way street. Companies like Intel or Boeing, or even hotels and the owners of utilities, can say they run critical infrastructure. So now you’re having an exchange of data that goes from the law enforcement community to the private sector, because it’s deemed pertinent information for covering their own risks.

We saw this with Occupy Wall Street, where there was a real synergy between the banking industry and the police. They were exchanging information and it was a two-way street. We have precedents that were established. These programs aren’t going to remain siloed. Instead, they represent a very easy flow of personal data across those silos.

What’s the solution to this? To create obstacles to the information flow, or create a kind of oversight of the flows that’s reliable?
There are a number of solutions. One could entail what some of my colleagues have called maintaining the “contextual integrity” of our data. That data collected in one place for one purpose shouldn’t be transportable to other places for other purposes without the consent of the person involved. There are schemes to do this. It could be a data exchange scheme where you could sell your data or give other people rights to your data for marketing or other purposes.

Other possibilities are to obfuscate our data. Make it less specific, more crude, so that it’s less revealing of all our activities. We can dumb it down.

What’s an example of that?
One example could be marketing. Instead of linking your frequent shopper card to your credit card to your online searches and bundling all those together to create a profile of you as a specific individual, what if some of those were de-linked or they couldn’t identify your demographics specifically? It could be “someone who got a high school or a college education,” but it wouldn’t be “you attended this college and you got these grades and here was your major.” To say it’s OK to traffic in crude categories but not fine-grain data unless you have public safety reasons to do so.

Wouldn’t that just be defeated by the tendency among consumers to share everything? Facebook asks me where I went to college every few days, and most people tell them. Amazon’s business model is based on granular data, and they actually do recommend books I end up liking, so I’m inclined to let them learn more about me.
Absolutely. This is one reason I don’t put much faith in the various technological solutions. If you’re given the choice between convenience against the promise of some kind of anonymity that is vague and uncertain, then most people are going to choose convenience.

But the other thing is that people are often coerced into disclosure. Take Facebook. The social cost of not being a part of social media can be so detrimental that it’s no longer viable to stay unplugged, or to forgo a smartphone or any of the other gadgets in our lives. So there’s a coercive element. And that’s completely intentional, of course, because it’s great for these platforms.

You won’t get a job if you’re not on LinkedIn.
I wouldn’t say that, as someone who has a job and is not on LinkedIn.

When workplace surveillance happens, when people’s keystrokes are being monitored or their telephone calls are being listened to, if they’re unaware that it’s happening and then find out, they experience a great sense of violation. They’re angry about it. If there’s full disclosure, “we are monitoring your phone calls and we’re tracking your keystrokes, and you should just know that this is happening,” people tend to be OK with it, because they can self-regulate and they don’t feel they’re being trapped.

So there’s a psychological aspect to this and we see it in the NSA scandal as well. That this was completely hidden. It was secretive. It was without public approval.

Isn’t a certain amount of secrecy necessary in the national security realm? If people know they’re being monitored, and they’re planning something nefarious, they change their behavior.
I’ve heard that argument, and, in this case, I don’t think it holds much water. What are the alternative forms of communication that people are going to use? A carrier pigeon? You’re going to not use a phone, not use the Internet?

Meet in a café.
Sure. And there are clever ways to try to evade scrutiny. People communicating in online games is one method the intelligence community is really worried about. It looks like you’re playing a game, but really you’re using it as a communications platform.

On the other hand, this is a wholesale collection of all of our data that can be analyzed at a later point for any kind of connection one is curious about. And that’s a different matter entirely from a targeted investigation based on reasonable suspicion that someone is involved in criminal or terrorist activity. That’s the legal threshold. We have the federal regulations to back that up.

What these programs are doing is effectively circumventing that legal restriction, and then massaging it after the fact to say “well, we collected the data but we’re not really going to act on it unless we think we have to.”

I don’t think that’s legal.

Marc Herman


Copyright © 2014 by Pacific Standard and The Miller-McCune Center for Research, Media, and Public Policy. All Rights Reserved.