NORTH STAR INTERVIEW: Byron Tau (Author, 'Means of Control: How the Hidden Alliance of Tech and Government Is Creating a New American Surveillance State')
Byron Tau is one of the sharpest investigative journalists working today, with a distinguished career spanning some of the most influential media outlets in America. Currently with the Associated Press, Tau specialises in uncovering the complexities of government surveillance, national security, and the intersection of politics and technology. His new book ‘Means of Control: How the Hidden Alliance of Tech and Government Is Creating a New American Surveillance State’ offers a deep dive into the intricate and unsettling dynamics of this hidden power structure. HIGHLY RECOMMENDED.
Before joining AP, Tau was with The Wall Street Journal and Politico, where he covered national politics and the White House during a formative period in modern American politics, including the 2012 presidential campaign. He’s also well known as one of the foremost chroniclers of the Russia investigation and Robert Mueller's probe into election interference. All in all, a truth teller and a north star when it comes to surveillance, big tech and power.
North Star interviews are always free [archive], but with a paid subscription to C_NCENTRATE you get around 50 issues of analysis, links, cutting-edge thinking, and inspiration a year. Readers say it’s “the first good decision” they make each week. Subscribe now for £1.25 a week. Cancel anytime.
THE INTERVIEW
Thanks for chatting, Byron. Let’s start up top. Investigative journalism is adapting to new tools. What advice do you have for people who are interested in getting into it? It’s not a fast process, is it?
Earlier this year, the CEO of the blogging site Medium explained in an interview with Semafor why his company failed as a platform for journalists but was successfully able to pivot to being a blogging platform. According to him, it boiled down to the fact that “It’s expensive to write about things that you don’t already know.”
That’s the challenge of journalism. It’s really time-consuming and expensive to find out information that isn’t already known. I did more than 300 interviews over the course of 5 years as part of writing Means of Control. I travelled to a half-dozen states. I hired lawyers and filed a lawsuit against the U.S. government to get them to turn over documents, and I had a private investigator scouring databases and documents for me. I spent hundreds of dollars to access court documents from trials across the country. And then, more importantly, I needed the time and space to review all that and turn it into a narrative. That kind of work is slow, laborious, painstaking and expensive.
There is a big debate in journalism about what AI is going to do to our industry. Maybe I’m naive, but I don’t think it will change that much. AI will be able to do basic stories like who won the baseball game or that the market dropped 15 points. But it probably won’t ever be able to build relationships and credibility with real human beings over time and get them to tell it new information. It won’t be able to show up at a potential source’s doorstep and convince them to tell their story. Long term, it might mean fewer entry-level jobs doing things like writing up the sports game or basic market movements, but those aren’t the kind of stories journalists wake up wanting to tell anyway. While the economics of the profession remain uncertain, I’m not convinced that the fundamentals of whatever survives will change all that much.
The book focuses on data brokers, a very murky area that most consumers know little about. When it comes to our data, consumers don’t know how it works or who is holding what on us. What are the most concerning unintended consequences of the current surveillance state that you’ve uncovered? Are we seeing impacts now that the original architects of these systems could never have foreseen, particularly concerning international relationships and diplomatic trust?
I think what I find most concerning is that the technologies we use every day, technologies deeply woven into our digital and physical lives, are designed with almost no thought to privacy, security or the social consequences of data collection and storage.
Digital advertising is the perfect example. This is a system that collects vast amounts of information about what people do on the web and connects nearly every device on Earth to a few centralised servers to deliver them personalised advertising. Of course governments were going to want to get access to that kind of information and it was naive from the beginning to believe such a system could be created for purely commercial purposes. It’s one of the richest repositories of information ever assembled on the global population — with everything from detailed technical information about the devices people use to information about their browsing habits and consumer preferences.
Or take car tires. Did you know that there are little antennas in your car tires that constantly transmit the pressure back to your car’s central computer to warn you if it gets low? Clever government agencies have figured out that they can place sensors in certain locations and monitor the comings and goings of cars through their tires.
Ok, that’s slightly terrifying! You describe in the book how the U.S. government partnered with private companies to build a sprawling surveillance apparatus. To what extent do you think this surveillance is driven by legitimate security needs versus a deeper political or psychological desire for control over citizens?
I think governments have legitimate security needs. And they need access to good, high quality information for those security needs. But historically, democracies have tried to strike a balance between giving governments the information they need and protecting the privacy, dignity and civil liberties of citizens by requiring some showing of need and putting barriers between the government and the information. Those barriers are meant as a protection for the population against an overly intrusive state.
But things today have gotten out of whack because of all the information we give away to third parties like our cell phone carrier, our email provider and the hundreds of apps we load up our phone with. Each of those relationships is governed by a mind-numbingly complicated terms of service that, generally speaking, gives all these corporate entities wide latitude over how they handle our information.
Here’s an analogue example that I think illustrates the point well. The U.S. government needs a search warrant to open mail sent via the U.S. Postal Service. And to get that search warrant, they need to show that they believe that there is probable cause of criminal activity involving the letter or parcel. But those rules don’t apply to packages or envelopes shipped through companies like UPS, FedEx, or DHL. It’s up to those companies to set policies about when they can open your mail — or let the government open your mail.
Basically, over time, our lives are governed by a series of these kinds of relationships with corporations. And under the terms of those relationships, those corporations get to make their own determinations about how they handle our information. And then of course, many of those corporations are poor stewards of your information and it gets hacked or leaked or sold or turned over to the police or the intelligence agencies. And so governments don’t need pesky things like search warrants or probable cause — they just need a good working relationship with giant tech companies. It has upended the entire social bargain between citizens and government.
How do you see mass surveillance impacting democratic processes, especially in terms of activism, freedom of assembly, and political dissent? Is there a risk that surveillance will erode the space necessary for dissenting voices to operate freely?
It’s a good question. I don’t think we know what the social consequences are in a democracy of persistent surveillance. We know what authoritarian societies look like when states control too much power and information. What’s less clear is what surveillance-heavy democracies look like. I really don’t know but I think it’s enough of a concern that it’s a central question in Means of Control.
Has anything surprised you about how the book has been received or the questions it has brought up? What questions have shocked you (if any!)?
One nice thing is that the book has gotten a positive reception from people on both the political left and the political right. Even in a very divided country like the United States, the coalition of people who care about privacy knows no political party or ideology. Second, it was nice to hear from a number of people who were quite sceptical and believed I had an agenda and refused to speak to me during my reporting who ultimately read the finished book and told me that they thought it was fair-minded and even-handed, even if they didn’t agree with everything in it. As a reporter, you’re always trying to be thorough and open-minded. And you’re also trying to help educate people and tell them a little bit more about the world that they didn’t already know so they can make informed choices. It was nice to hear that people got that from the book.
On a recent episode of Tracey Follows’ ‘Future of You’ podcast you said: “Good expertise is hard to find in the halls of government.” Is this intentional? What structural aspects of government bureaucracies do you think actively dissuade or prevent “good expertise” from developing and flourishing within these institutions? Is this scarcity of expertise due to recruitment failures, the nature of bureaucracy itself, or perhaps an intentional disregard for subject matter experts? A “the less we know, the better” kind of thing?
America prides itself on values of freedom and liberty, yet mass surveillance runs directly counter to these ideals. How do you reconcile this contradiction, and how do we get the American public (and beyond) to genuinely care about the trade-off they are accepting?
America contains multitudes and contradictions — we are, after all, a nation that proclaimed all men are created equal while keeping millions in slavery. It’s no different on the question of liberty versus surveillance. America has always had a strong civil libertarian tradition — with manifestations on both the political left and the political right. That tradition has always been sceptical of government power and authority. And it has had another political tradition that believes public safety and ordered liberty stem from an empowered but accountable state acting in the best interest of its citizens. We’ve been tussling over these questions since 1776 and I don’t expect it to stop any time soon.
Is the current direction of surveillance inevitably leading us towards a “Minority Report” scenario, where predictive analytics start determining an individual’s guilt before they commit any crime? How do you see this emerging with the kind of data infrastructure we already have in place?
Maybe. We are already living in that world in some respects. Even in democracies, governments and corporations run risk analyses on citizens. Corporations make data-driven decisions on whether to grant you credit or what to charge you on your auto insurance premium. Governments decide whether you can go through the quicker security line — or in extreme cases, whether you can board an aeroplane at all. That kind of risk analysis is spreading into things like whether you can remain free on bail pending trial or whether you qualify for probation. In other cases, governments put people on watchlists over their social media rhetoric or their associations with extremists — and in many instances, for good reasons. I think our legal tradition wouldn’t allow us to truly convict and lock people up before they commit a crime but we are already making criminal justice decisions based on data-driven risk modelling.
Meta recently announced Orion, Apple has the chonky Vision Pro, Snap has its Spectacles: the tech bros clearly think, or want, the future to be on our faces. How do you foresee AR glasses being integrated into the current surveillance infrastructure? Could AR’s real-time data collection capabilities be the next evolution in mass surveillance, and what privacy concerns do you think this would raise? Did the MIT student doxing example surprise you?
When it comes to AI are we seeing a new layer of control forming? Is it formed? Are we getting new masters or are these companies already in “the system”? Or are they the new system?
I think the privacy and surveillance issues with AI are more mundane. First of all, these systems need high quality data for training, so corporations are going to be ever-hungrier to collect information and feed it into next-generation AI systems. Second, AI and machine learning systems make it easier for government agencies and corporations to sort through large volumes of information and identify patterns and outliers. Right now, government agencies already collect more information than they can reasonably exploit and analyse. AI will tip the balance towards better use of huge volumes of data.
The end result will simply be more hunger for personal data — for both training and operational uses — by everyone: governments and corporations alike.
Considering the trajectory of technological advancement, do you believe there is any viable path to reclaiming privacy, or is the concept of individual privacy itself becoming obsolete? How will future generations define or even recognise privacy?
I think it’s clear that social norms around privacy are changing dramatically. People are trading away a lot of their privacy in exchange for conveniences or peace of mind. But I think they are slowly realising that there are trade-offs. The big question is where will this all settle?
Some of this is probably never going away entirely. Young people today are growing up with completely different norms and expectations of privacy. They are now growing up with persistent GPS trackers installed on their phones that allow their parents to see their locations 24/7 and smart doorbells that track when they come and go. It’s hard to foresee a world where we put that genie back in the bottle.
At the same time, I think consumers are increasingly aware of privacy and, if given real, meaningful choices, often choose the more privacy-preserving option. Apple has long offered users the opportunity to opt out of advertisers building targeted profiles of them, but originally the option was buried deep in a settings menu that few users knew existed, and few used it. Then in 2021, it made opting out much easier and more visible to the average consumer. And it changed the language it used to describe the technology. Instead of offering an option to “limit ad tracking,” the iPhone started asking users if they wanted to opt in to cross-app tracking. And guess what? Most people didn’t like the sound of that and declined. That simple change helped decimate a lot of third-party data collection on the iPhone.
Polls show most consumers do not approve of all the data that’s collected about them but feel powerless to do anything about it. When companies make it easy to do something to reclaim privacy, people will usually choose that option.
What are your top two tips for people when it comes to their smartphone use? What do they need to do right now? Turn things off? Go on a course?
Be willing to pay money for digital tools and services you find valuable. Free is what got us into this situation. When people won’t pay journalists to deliver them high quality news, or when they won’t pay app developers to code useful apps, or they won’t pay services for the bandwidth and data costs, that’s when the business turns to surveillance and exploitation of personal data. While I’m not going to tell you every paid service is privacy-respecting, a paid service is at least trying to build a business model that doesn’t rely on selling data or tracking users.
Buy ‘Means of Control: How the Hidden Alliance of Tech and Government Is Creating a New American Surveillance State’ on Amazon and in most other bookstores.
For more information and to engage with Byron, head over to X.