Surveillance, silence, and reclaiming privacy
Naomi Brockwell covers the erosion of digital privacy, the infrastructure of mass surveillance, and practical tools everyone can use to reclaim their right to privacy.
Date published: November 15, 2024
A keynote by Naomi Brockwell at EthBoulder 2026 on the erosion of digital privacy, the infrastructure of mass surveillance, and practical tools everyone can use to build a more private digital life, from VPNs and encrypted email to GrapheneOS and decentralized mixnets.
This transcript is an accessible copy of the original video transcript published by EthBoulder. It has been lightly edited for readability.
Two-part talk: warning + solution (00:00)
Naomi: Wonderful. Welcome, everyone. Thank you for being here. So, this is going to be a two-part talk. In the first part, it might get a little intense. I have been known to get pretty intense when it comes to privacy and surveillance talks. The second part is going to be a little bit of a different tone. So in the first part I'm going to tell you a story, and in the second part we're going to save the world. So if you guys are all cool with that, we can get started.
The emperor's new clothes (00:44)
So act one. Let's start with act one of our story: the spell. There once was an emperor who was sold a magnificent set of new clothes. The salesman told him, "These garments are special. Only the intelligent and virtuous can see them, and anyone unfit for their position will see nothing at all." The emperor couldn't see any clothes, but he was afraid to admit it. Saying so would mean confessing that he wasn't worthy to rule, so he said nothing. And when the emperor appeared in public, parading through the streets, no one else could see the clothes either.
But again, no one spoke up. Admitting the truth would mean admitting their own supposed ignorance or moral failing. And each person assumed that well, if the king were actually naked, someone else would have said something already. In reality, there were no clothes. The king was walking around naked and everyone could see it, but no one said a word. Now, does everyone know this story? Right. The emperor's new clothes.
So, at its core, let's go back here. At its core, this story isn't about clothes, obviously. It's about not trusting your own judgment. It's about mainstream consensus overriding our own common sense. It's about going along with the crowd even when we think that the crowd might actually be wrong and outsourcing the responsibility for truth.
Now, in our story, each person thought that maybe they were the problem. So, they deferred to their neighbors and presumed that if something were truly wrong, someone else would have spoken up first. Everyone submitted to the wisdom of the crowd. And this was a bad strategy because the crowd was wrong. The emperor was naked. And because everyone stayed silent, consensus replaced truth. The silence of the crowd became proof that everything was fine. Now, this is exactly how privacy collapses in the modern world.
The machine (02:46)
So, act two: the machine. Here's the modern version of the story. We live under the largest data-collection regime in human history. We have built, and we continue to feed, an infrastructure of surveillance unlike anything the world has ever seen before. And the trajectory of where we're headed is a recipe for disaster. Now, why is that? Because right now, the only way this ends well is if we can guarantee that someone bad will never gain control of this system. But if someone bad does eventually get control, this is an atomic weapon of potential energy, filled with all of the ingredients needed to completely control a population.
Now, obviously, no one can ever guarantee that this surveillance apparatus being built that all of us are fueling won't be weaponized. We just can't make that sort of a guarantee. And so, it's a ticking time bomb. We are driving full speed towards the edge of a cliff and no one seems worried about it. People sense that something's wrong. Who here thinks that there's something going on, that maybe there's an issue with privacy and surveillance, data collection? They feel something.
It feels uneasy, right? And we feel it when an app asks for access that doesn't really make much sense. Why does this calculator need my location data, you know? Or when a device listens, or when a company accidentally says the quiet part out loud on a podcast: that yes, they are turning on your mic and flagging keywords and selling that to advertisers. You know, the number of podcasts I get interviewed on where they're like, "Is my phone actually listening to me?" It's like, yeah, yeah, it is. Your phone is actually listening to you, and you were the one who granted most of those permissions.
We feel that something is wrong when we skim a privacy policy and we know that we should read it carefully. It probably has some important stuff in there. And it actually says right there in black and white that yes, they are absolutely going to be sharing this intimate data, and we don't even know with whom, but we click accept anyway, because after all, it can't be that bad, or accepting these things wouldn't be the status quo, right? If it were that bad, everyone wouldn't just be doing this.
How bad is the status quo? (05:12)
So how bad is the status quo? How bad is this stuff really? Maybe we should talk about this, because some people think that this is just a matter of companies trying to sell us a better pair of shoes. This is just a consumer thing, right? Or maybe it's just about social media companies learning about us to create a more finely tuned algorithm, right? That doesn't seem so dire. What's the problem with all of that?
But right now, we are all consenting to a pervasive surveillance machine that is quietly invading every private area of our lives. Now, in some places, like authoritarian regimes, this machine is used to control dissent before it happens, by flagging potentially problematic people as more likely to join a protest movement and then targeting those people. Sometimes this machine is used to shape public sentiment, or influence opinions, or sway elections, or get entire populations to hate certain types of people by convincing them that those groups hate them. And then there are countries that publicly broadcast information about citizens whose social credit scores have dropped, and then use those scores to restrict travel, to limit their employment, to block their children from certain schools, or to cut them off from opportunity entirely.
Data collection at trillion-dollar scale (06:26)
And now with the AI revolution, the machine stops being just a record of your life and becomes a prediction engine. This is why it matters. I want to make this concrete for you, so I'll just go over the state of surveillance right now.
I'm going to break the machine down into three parts. The first: collected. How this information is used varies from country to country. Maybe it's an authoritarian regime using it for one thing. Maybe it's a country that's just using it to sway popular opinion, feeding algorithms that show you certain types of content. But the raw data is actually the same everywhere. And it's dangerously easy to abuse. Now, every day, a trillion-dollar industry harvests information about where you go, whom you talk to, what you read, what you buy, how long you linger on a screen, what scares you, what persuades you. And this data is packaged, it's analyzed, it's inferred from, and it's sold. And it's not just sold to advertisers. It's sold to contractors. It's sold to basically anyone willing to pay. You don't get control over who gets access to this data. And some of the largest clients are governments all over the world, who use this information to target their own populations. And maybe you will never be targeted. I don't know. It's unlikely. I would presume you're all already targeted in ways you don't know.
But let's say that you're really lucky and you avoid the targeting of this system. But your children probably won't avoid it, and you have no idea whether they will or not. And this machinery that you're consenting to today, it doesn't go away. You don't know who will be in charge tomorrow.
How this information is leaked (08:03)
So bucket two is how this information is then leaked. Every year, the number of data breaches hits a new all-time high, leaking all kinds of information that companies should never have collected in the first place: location histories, medical records, financial data, private messages. And this information gets dumped into the wild, and it all ends up on the dark web to be used by organized cartels, by criminal gangs, and by nation-state hackers.
Again, you don't actually get to control who gets access to this once it's out there in the wild. And companies know it can't be protected, right? Centralized databases are constant targets and breaches are inevitable.
So, there's a great quote from the former CEO of Cisco, who said there are two types of companies: those that have been hacked, and those that don't yet know that they've been hacked. Right? So it's inevitable that anything you're feeding to these companies will end up out there. It's just a matter of who then gets access to it, and who chooses to weaponize it.
And yet companies still decide to collect all of this unnecessary data, mountains of unnecessary data, just in case. And all of us continue to hand it over anyway, trusting in these systems that have never earned our trust.
So this is the crowd clapping at the parade, right? It's not because we're certain that every click and every accept is safe. It's because speaking up and opting out or switching tools feels harder than going along.
Backdoors & government interception (09:33)
So now let's talk about the third bucket: weaponized. Hostile foreign intelligence operations have already infiltrated core communication infrastructure. I was just chatting with people earlier about Salt Typhoon, right? China, for example, has been intercepting our calls and messages at scale.
But what else should we have expected from a system that mandates lawful access requirements? Our own government has mandated back doors in these telecommunication systems and then we all act surprised when they get used by people who do not have our best interests at heart.
We know that it isn't possible for governments to make sure that they're the only ones who access these back doors. And yet we all just kind of went along with this, because surely, if keeping this gaping hole in the system were really that bad, we wouldn't all be complicit and consenting to it. It's not until someone actually decides to look that we discover that we've all been made more vulnerable, and that people have been intercepting all of our calls and messages. And who knows how many hostile entities have been collecting this?
We know about one of them, Salt Typhoon, but we have no idea who's been collecting our sensitive, intimate communications in this very infrastructure that we rely on.
Why oversight is more rare than you think (10:51)
So, the emperor is naked and the only reason all of this persists is because the crowd keeps clapping. But there is another reason why the crowd keeps clapping.
So, let's talk about that. I mean, one of the reasons is that people are afraid, right? You're in a crowd, the emperor's there, you don't want to speak up. You might get in trouble. But it's not just that people are afraid. They're also comforted by presumed diligence. They assume some expert checks the clothes. And what about in our modern-day story? How does that transfer?
Well, oversight is much rarer than you think. People auditing this stuff is much rarer than you think. Like I run a grants program. I'm trying to find researchers who are willing to reverse engineer everyday technology to find hidden surveillance. It's hard to pay people to do this. People aren't just doing this in their free time. They've all got jobs. So, no one is looking into this stuff.
So, this silence we take as proof of safety, and we keep using these tools because everyone's using them. And surely if this were a problem, someone would have spoken up.
It's not proof of safety. It's proof of neglect of an entire system, right? The crowd assumes that an army of auditors has been making sure that the emperor isn't naked. But in the privacy world, no one's checking this stuff at all. And that's got to change. And maybe it's because surveillance crept in slowly, and by the time we realized what was going on, it was too late.
But whatever the reason, no one is really looking into this stuff and we keep going along with the crowd and pretending that everything is fine.
So there are some thread-pullers. There are some people who are not pretending that everything is fine. There's a great book by Byron Tau called Means of Control. Highly recommend it. He talks about how our own devices are riddled with surveillance, and he's shown us this through multiple FOIA requests. He's sued the government many times to try to get access to data that everyone wants to keep covered up.
It's not like the information is just sitting there. There are entire industries, and entire governments, with a vested interest in keeping this stuff quiet, right? So it requires FOIA requests, and actual digging, and suing them. But it turns out that so-called analytics companies are quietly inserting SDKs into our apps, with hidden code that turns these apps into surveillance tools. And he goes into a bunch of examples where this has been found out. It turns out that sometimes it's actually governments, spying on their own populations, who are behind these SDKs and tools. So I highly recommend you read that. It's quite enlightening, and also a little bit terrifying.
Okay. So, how many of your apps are actually doing these things that no one realizes? And you have to keep in mind: sometimes the developers themselves don't even know this stuff is going on, right?
I sometimes give the anecdote that if you're a developer, and you have a side project, and you make a compass app, and then you're like, "This is just me learning how to make an app and I made this in my spare time." You know, developers do that all the time. But then it gets a million downloads because people really like compass apps. They're cool.
And then suddenly, inevitably, you're going to get a call or an email from someone who says, "Hey, we're an analytics company. If you just put this SDK in your app, we'll give you a couple of thousand dollars a month. We just do analytics." You're a developer who created a side project, and now you can potentially monetize it. Of course you're going to say yes.
Now, you don't know what that code does, but, you know, why would the analytics company lie? So you say yes, you get paid, and the next thing you know, you're handing off all of the data from this app. It has now become a vector for siphoning information from a million people to some shell company that no one's ever heard of. You'd be surprised how often that is going on with the apps on your phone, because who has actually bothered to look at the code in these apps? No one's looking at it.
So, I also recently interviewed someone who gave a presentation at Devcon last year. He was just tinkering with his device and noticed some weird things going on when he used Siri. Now, he did a bunch of technical wizardry to bypass Apple's protections so that he could undo certificate pinning and all of that. But what he discovered was that when you use Siri dictation, your iMessages are no longer end-to-end encrypted.
The contents of your messages are being sent to Apple servers, where they can read them. Who knew that? It turns out Apple didn't even know. It took this one developer who just happened to tinker, because he saw something weird going on with his machine and thought, "I want to figure that out."
So, hundreds of millions of people are using Apple products, and one guy decided to take a look at what's actually going on? That's the current state of surveillance, and that's the current state of privacy right now.
There's another presentation where someone's dad brought home one of those home hubs, right? She decided to do some probing. Sometimes she plays with the different devices in the house, and she wanted to figure out how it works. And it turns out that this popular consumer device, one that anyone could buy, was being used as a hub in a massive Chinese botnet. So then the FBI sees her presentation. They end up taking the presentation offline and classifying the investigation. They didn't know this was going on; she was the one who brought it to their attention, just by giving this presentation: "Hey guys, I found this weird stuff going on." And that's how we found out that a massive Chinese botnet was invading all of our homes through this one specific device. What about all the other devices in our homes that no one has even bothered to look at yet?
The silence problem & false consensus (16:30)
So, that's where we're currently at. Surveillance today is pervasive, and it's invisible, and it's normalized, and it's justified, and we consider it industry standard.
That's why it's so effective, because people sense that something is wrong, but they assume someone else has already checked, and they assume someone smarter than them has audited the system, and they assume that someone braver would have warned them if there was something going on that they should be worried about. So they stop trusting their own judgment. They stop investigating. They don't question. They don't push back. And they tell themselves, "Well, I'm the problem. I must not understand this, or I'm probably overreacting, or if this were really bad surely someone smarter would have already raised the alarm about this."
Everyone privately doubts what they're seeing, but they assume that they're the problem. But here's the thing. If we actually stay silent, we become the problem.
So people presume from silence that everyone is in consensus. And that's the most dangerous part of this whole story. No one actually checks whether there is a consensus. They just assume that, because no one is openly objecting, the system must be fine. The product is popular, so it must be safe. This has 100 million downloads; there's no way 100 million people would be stupid enough to download spyware onto their phones. Am I right?
So consensus is never verified. It's assumed. And the silence about how bad the state of privacy is becomes interpreted as proof of legitimacy. If the surveillance were truly invasive, someone would have stopped it. If the data collection were abusive, there would have been consequences. If this were unconstitutional, surely it wouldn't be allowed to continue.
Now, when we see something that feels wrong and we don't say anything or push back or question the standard, our silence actually validates what's going on. That's a really big issue.
Then there's the complexity of these systems, which amplifies the effect. These systems are opaque by design. We talked about that. They're designed not to show you what's going on, because governments don't want you to know what's going on, and companies don't want you to know what's going on. So it's wrapped in technical language, hidden behind legal documents, framed as too complicated for normal people to understand.
So when governments or corporations or experts say this is fine, people defer. Authority fills the gap where understanding should be, just like with the emperor's advisers, just like with the crowd. But the real genius of the scammers in the emperor's story was the moral trap. The salesman didn't just say, "These clothes are hard to see." He said that only the virtuous could see them. And we use the same shaming language when we ask people things like, "What have you got to hide?" We have turned surveillance into righteousness.
Eric Schmidt of Google has this famously atrocious quote where he says that if you have something you don't want anyone to know, maybe you shouldn't be doing it in the first place. As if privacy isn't our right; as if it's something we have to justify, and we're bad people for wanting it. I mean, it's insane that we have flipped the tables so completely on privacy and surveillance.
So, notice what's happening. Privacy has become framed as guilt and compliance has become framed as virtue. The good people are the ones who surrender to the access and the suspicious people are the ones who ask the questions. And now pushing back becomes socially costly. Once you attach that moral label to silence, the parade just runs itself.
How do we stop the emperor's parade? (20:23)
Let's go back to our story, the emperor's new clothes. As the emperor marches through the street in his new clothes, the crowd applauds. They admire the craftsmanship. They praise the elegance. They comment on the cut and the fabric and the way that the garments catch the light. And they compete to sound the most impressed. Courtiers lean forward, eager to be seen agreeing; officials nod solemnly; advisers add flourishes of technical praise and invent details to prove that they understand what they're looking at. Some speak loudly, hoping to be overheard; others smile and say nothing, careful not to look confused. And no one wants to be the first to hesitate, no one wants to be the one who asks the obvious questions, and with every compliment the lie becomes harder to undo.
Because once enough people publicly pretend to see the clothes, admitting the truth would no longer just be embarrassing. It would be destabilizing. It would mean confessing that the emperor was naked and that everyone else had helped pretend otherwise. So the performance continues, and the applause grows louder, and the praise more elaborate, and the certainty more confident. And the more absurd it becomes, the more everyone doubles down.
Until a child spoke up. And this child didn't have status to protect. He didn't have a reputation to lose. He didn't know the rules. The child wasn't afraid to speak the obvious truth, and he stated it clearly: "The emperor has no clothes, guys." And once this was said out loud, the illusion collapsed instantly. The crowd froze. People giggled and then whispered, because the spell was broken. But they had all been complicit, so they tried to stay hushed, hoping that the attention wouldn't turn to them. And the emperor heard the child, and he too now knew that the lie was no longer private. It was public. And the crowd knew, and he knew that they knew, and they knew that he knew.
But here is the most important part of the story. The emperor kept walking. He didn't stop the parade. He didn't cover himself up. He didn't correct the lie. He walked on, naked, because stopping would have meant admitting the truth out loud. The illusion collapsed, but the system didn't correct itself.
This is a real warning. Like, of course, people can be fooled. But what's scary is that even after truth is spoken, the system continues as if nothing has changed. Power tries to carry on as if nothing has changed. And so the crowd stays where they are and they continue to play the game because the emperor is still playing the game, and they just go along with everyone else.
Now, we have a society with self-correcting mechanisms, right? We have whistleblowers telling us about this stuff. We have people speaking out and doing research on what's going on. We have researchers exposing hidden surveillance. We have journalists publishing reports about it. And yet the surveillance continues. The emperor has no clothes, and people finally say it out loud. And the parade keeps moving anyway.
So, how do we stop the parade? What do we do when truth alone isn't enough? If whistleblowers speak and nothing changes, if researchers publish and nothing reverses, if journalists expose things and the parade keeps moving, then the problem isn't lack of information. The problem is that the cost of stopping still feels higher than the cost of continuing.
The emperor doesn't stop because he knows the truth. He only stops when the crowd makes it impossible to continue pretending. One child speaking breaks the illusion, but it doesn't break the system. Systems don't change when truth is spoken. They change when participation is withdrawn. Now, if the crowd had laughed openly, if they had stopped applauding, if they had refused to play along, the parade would have stopped. Not because the emperor suddenly grew honest, but because the performance would no longer function.
That's the real lesson here. The solution is not just more people speaking up. It's the refusal to consent. Refusing to normalize this, refusing to comply quietly, refusing to outsource judgment to authority. Privacy doesn't collapse because no one knows what's happening. It collapses because people keep showing up, clapping, playing their assigned role, using these systems that everyone else is using because it's expected of them.
So, the way this changes is not by waiting for the emperor to stop. It's by the crowd changing its behavior. By people choosing tools that don't depend on surveillance, by pulling consent away from systems that rely on passive participation to survive.
Now, when enough people stop applauding, the parade can't continue. And that's the part of the story that we're still writing here. So, the question is not whether the emperor has no clothes. We all know that he doesn't. The only question left is whether we keep walking alongside him, pretending that it's fine.
Part two: let's save the world (25:22)
So on that kind of dire note, part two: let's save the world. Who wants a better future? Who wants a better future for future generations, for their children? Who wants to change things? Because we're completely empowered to make a difference.
So if we need to stop feeding the surveillance economy and start supporting competitors in order to shift the system, let's talk about how to do that. You know, this means stopping giving our business to companies that are trying to exploit us and starting to give our business to companies that are trying to protect us.
Let's go over some of the ways that we can opt out. And just so you know, I'll be hosting a deep-dive phone privacy workshop straight after this, at 3. If anyone wants to come, we'll go through specific steps you can take to really lock down your devices, all the types of tracking going on, and how to mitigate it all. So if you guys want to come to that, please feel free.
But right now, I want to hear from you guys. What are some of the ways that people here are opting out? You guys are all tech-forward, industrious, agentic people. So what are the choices? Is anyone here making choices where you're choosing a better system instead of just feeding this one?
Yes, gentleman in the back.
Audience member: No notifications on my phone.
Naomi: Oh, I like that. You have reclaimed control of your attention. Instead of being reactive to every person who wants to get in contact with you, you decide, on your terms, when you want to get in contact with other people. I do the same thing. I haven't had notifications on my phone for years, and it's been so wonderful for mental bandwidth. I get to control the focus of my day and my attention. And let's be honest, we're all picking up our phones every 10 minutes and unlocking them anyway. So the difference between getting a message as an instant ping on my phone and seeing it 10 minutes later, when I eventually open my phone, is nothing. I love having a zero-notification device. So, kudos to you.
Anyone else doing things to opt out? Yeah.
Audience member: Not enough, but I quit and deleted my Facebook account.
Naomi: Oh, yes. That's very, very good. And how does that feel? Because some people feel that they become siloed, or that they lose contact with their friends and family. What's your coping strategy for that?
Audience member: Well, it's really nice because somebody actually tried to extract my tokens out of me by finding personal information about me and my family. So, it's one less attack vector.
Naomi: I love that. Yeah. I mean, this is a crypto conference, right? So we have to realize what's going on right now: organized cartels all over the world are identifying people involved with crypto and using all of the information we're putting about ourselves online to make it easier to target us, to make spear phishing easy. Because they know that your sister's name is Susie, and she went to this school, and this is her best friend Peter. All of that information is public. We're just feeding this giant system, and anyone can scrape it.
So, Facebook, it's so interesting. Like when Facebook first came around, it was exciting, right? It was this idea of connection across the globe in a way that we couldn't connect previously. It was kind of revolutionary, and no one told us when we signed up that this was a data harvesting machine, that this was a giant advertising model.
And I would probably have paid for it. Like, I would pay a certain number of dollars a month to use it and not have the advertising. But no one really thought about the monetization. How do they keep these servers up and running? Why is it free?
So I love that. Now that we do know, I think that there are ways that we can go about creating those connections with our friends and family that don't revolve around a system that makes everyone more vulnerable. One of the suggestions I gave online, people were saying, "I can't get off Facebook because that's where all my friends and family are." I have my banner on Facebook saying, "Hey, here's my signal username. If you want to contact me, it's here."
You know what? That's a great filtering mechanism for who's actually your friend. Because if it's such an effort for them to ping you on Signal to get in contact, if they're only pinging you on Facebook because it's easy and convenient, what does that say about how much you mean to them? And it's actually been really nice to see how many people have been willing to go out and use a different platform to connect. They actually do want to connect. So that could be an interesting filtering system, if anyone wants to try it.
Anyone else doing things?
Audience member: Yeah, I send letters through the postal service.
Naomi: Letters through the postal service. Well, yeah. All right, I give you half a point for that. You understand that digital communication is mainly a massive surveillance network, and easily intercepted. But I'm not convinced that the USPS isn't also a massive surveillance network. I mean, they're scanning every envelope these days. So yeah, half a point. It's the right idea, but let's go even further.
You know, for me personally, maybe it's because I'm very tech-forward: I run a privacy channel, and a lot of the people who like my content tend to be anti-tech. I'm the complete opposite. I'm a total technophile, and I think the only way we survive this is by leaning into technology. Some people want to throw out their devices, and that's how they think they're going to win.
Okay, but what about Flock cameras? How do you avoid those by throwing out your devices, right? Are you going to throw out your car as well? Are you going to wear a mask everywhere? Surveillance isn't just on the devices in our lives. Surveillance is now pervasive all through our lives. And we need a different toolbox.
We can't just throw out our devices and think we're going to be safe. We need to lean into the technology that is going to give us back our privacy. Things like zero-knowledge proofs, things like homomorphic encryption, all of the amazing cutting-edge privacy tooling out there that is waiting for us, begging for us to implement it in our lives, to incorporate it into the tools we're building. I really would love to see people leaning into privacy tech and understanding that.
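To make that family of tools a little more concrete, here is a toy sketch of a hash commitment scheme, one of the simplest building blocks in privacy-preserving cryptography: you publish a commitment to a value now and can prove later what it was, without revealing it in the meantime. This is an illustrative example only, not a reference to any specific product mentioned in the talk.

```python
import hashlib
import secrets

def commit(value: bytes):
    """Commit to a value: publish the digest now, keep nonce + value secret."""
    nonce = secrets.token_bytes(16)  # random salt so the value can't be brute-forced
    digest = hashlib.sha256(nonce + value).hexdigest()
    return digest, nonce

def verify(digest: str, nonce: bytes, value: bytes) -> bool:
    """Later, reveal nonce + value; anyone can check they match the digest."""
    return hashlib.sha256(nonce + value).hexdigest() == digest
```

Real zero-knowledge systems go much further, letting you prove properties of the hidden value without ever revealing it at all, but the hiding-plus-binding idea is the same.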
Even AI, right? So many people hate it, right? And that's because it's been overwhelmingly hijacked for surveillance in so many ways. AI at the end of the day is what? Powerful compute. So, don't we want powerful compute on our side if we want to be building cool privacy tools? Anything that supercharges us and helps us get where we're going faster, I think we should be leaning into. And I don't think we should be throwing out things because they're new or scary, or because most people are using them for nefarious means.
We should be figuring out how we can harness the power of this to create a more private world. I can think of a million ways we could use AI for privacy. You could be creating white noise about yourself and using AI agents to propagate it through the internet, so that data brokers become obsolete and can no longer sell verifiable profiles about us, because there's so much noise out there. Or we could have a system on our computer that analyzes every bit of telemetry that leaves our device: figuring out what kind of data is being exfiltrated, who's doing it, what the IP addresses tell us about the companies collecting this, and how to lock it down.
These are all things AI agents could be doing. Be careful of AI agents, though; they're really, really insecure right now. But you could be using AI in general. You don't need to give it privileged access to your machine; you could be using local AI. There are all kinds of ways you could harness this powerful compute to build a more private world. So we should not be throwing out tech. I think we should really be embracing it.
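As one hypothetical illustration of the telemetry triage described above, a small local script (no cloud AI needed for this part) could take the outbound IP addresses logged by your firewall and group them by the domain their reverse-DNS records point at, giving a rough picture of who your device is talking to. The function name and approach here are illustrative assumptions, not any specific tool:

```python
import socket
from collections import Counter

def identify_endpoints(ips, resolve=None):
    """Group outbound-connection IPs by the domain their reverse-DNS
    record points at: a rough hint at who operates each endpoint."""
    # Default resolver does a live reverse-DNS lookup; pass your own
    # callable (e.g. a dict lookup) for offline analysis or testing.
    resolve = resolve or (lambda ip: socket.gethostbyaddr(ip)[0])
    owners = Counter()
    for ip in ips:
        try:
            host = resolve(ip) or "unresolved"
        except OSError:
            host = "unresolved"
        # Keep the last two labels: "edge7.tracker.example.net" -> "example.net".
        owners[".".join(host.split(".")[-2:])] += 1
    return owners
```

Feeding the resulting counts (plus WHOIS data) to a local model is where an assistant could actually help, for example summarizing which endpoints look like analytics SDKs versus functional traffic.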
What else are people doing? yeah.
Audience member: Thank you for being here.
Naomi: Thank you for being here.
Audience member: You bet. And I'll just tell you that, for better or worse, I know our congressional delegation, and whenever I see one of those guys or gals, I make sure I tell them one single piece of information about why privacy needs to be better supported.
Naomi: You are doing amazing work. Can everyone give this man a round of applause?
Thank you for your participation. The fact is that educating elected officials is probably the most important thing you could spend your time doing.
Audience member: Unfortunately.
Naomi: Unfortunately. For sure.
Yeah. No, I agree, and thank you for doing that right now. You're absolutely right. I wish it weren't this way, because it feels so distasteful to have to kowtow to politicians to beg for rights that should be mine. So I hate it.
But at the same time, when you have asymmetric power in society and you have people pulling levers, it actually pays off to try to influence the people who are controlling those levers. And if those people are currently undermining your privacy and trying to ban end-to-end encryption and all this other stuff, then yeah, it is a battlefront that people need to be fighting as well. Our institute does a lot of work mainly on individual empowerment. We try to say: okay, regardless of what the politicians are doing, here's how you can reclaim your privacy yourself.
You know, empower yourself. These are the tools you can be using. You don't have to ask for permission. But I really applaud the people who are doing the work to educate those who have an asymmetric amount of power and can make a difference, because if we can win them over, that's some area of the battlefield we can take. So thank you.
Who else is doing things?
Audience member: So speaking of AI, I highly recommend Venice. Not only can you use it as a user for private conversations, but if you're building an app, you can use their API to protect your users' information as well.
Naomi: Yeah. Who has been trying out Venice or any other AI privacy tools? It's really cool, and better in a lot of ways. It's funny, I was just telling someone this story earlier. I wrote this newsletter, and I use AI a lot for all different areas. We have a kind of spectrum in our organization of what's the most private AI to use: local on your home system is the most private, then you've got more private cloud providers, and then you've got account-based data harvesters on the other end. We teach people what information is allowed to be put into each, depending on how sensitive it is. Anyway, I was putting together a newsletter, about to publish it, checking for typos before hitting publish. And this was in ChatGPT. I mentioned things like SMSPool.net as a place where you can buy burner numbers if you don't have a cell number. I don't have a cell number. I don't have a SIM in my phone. So whenever a platform says it needs a real SIM cell number, I'm like, I don't have one.
So I wrote a tutorial on what I do in a situation like that and listed all these services. ChatGPT censored them. It didn't do a typo check; it changed little sentences. I'm reading through it, and where I'd listed specific services, it now says things like, "I'm sorry, I can't provide any names of services, but there are things out there." And I was like, "GPT, you censored me. Why did you do that?"
It said that because these are tools that can potentially be used by bad people for nefarious means, it cannot provide examples. And I was like, privacy is not a crime, and this is clearly a tutorial for normal people to teach them how to reclaim their privacy in the digital world. And it was like, I understand, and it is clearly just a tutorial, but I can't help make a tutorial that teaches people how to do things where those things could potentially be dangerous. And I was like, this is really dystopian, that these things are starting to get filtered out. And then I mentioned cryptocurrency, and I said, yeah, you can use Bitrefill to buy prepaid SIMs and top them up. It deleted my reference to cryptocurrency entirely.
And I was like, you censored me again. What are you doing? Put my newsletter back the way it was. It said, "I'm sorry. Cryptocurrency is used by criminals to circumvent things. So, we cannot add this to the tutorial. I cannot mention it."
This is ridiculous. So, Venice is a great alternative: Venice.ai. I really like Brave's Leo, too. It's great for browsing; I ask it questions there and it's pretty comprehensive. There are a lot of cool platforms out there you could be trying instead of these non-privacy-preserving systems, so give them a go. Image generation, too: this image was made by Venice, and it was way quicker than any of the other platforms I was trying. So there are some real benefits to using some of these tools.
And they have uncensored models which is kind of nice too because I don't like a single company being the arbiter of truth and determining what people are and aren't allowed to say in their tutorials and newsletters.
Who else is doing things?
Audience member: Moxie just started a new one, Confer, which is doing some really interesting things around privacy. And just to build on the point this gentleman was making about educating politicians: there's a project in Argentina that is running a reverse technology accelerator to educate policymakers on technology, which is a really cool way to take that to scale. Many different experts in our industry could be educating policymakers in narrow channels to really significant effect.
Naomi: I love that. Do they have some sort of guide saying, this is how we've set up this reverse accelerator, that other people could follow? Because that would be great to spread around. If you know of something, ping me. I would love to share it in our newsletter in case other people want to do similar work.
But yes, Confer. It's confer.to, I think. That's another one. They have a feature where you can ingest your entire chat history directly into Confer and just go from there. So if you were like, listen, ChatGPT was the first one I used, there was sunk cost, and now I just continue to use it out of habit, you can ingest all your history into Confer. And Moxie, if you don't know him, is a really cool cypherpunk. He built Signal, and now he's doing private AI, so give it a go. I've had a really favorable impression of it; it's new, but really cool so far.
Anyone else doing things to reclaim? Yeah.
Audience member: I think where I live and sleep is probably the most private place that I have in my world. I don't want people knowing my location. So, I use a PMB to ship things to, and sometimes ship things to friends and pick it up from there. But I do not tell the internet where I live.
Naomi: I love that. So let's talk about all the different ways the internet can find out where you live. A major vector is going to be your credit card. Every time you buy something from some unknown vendor, one of the thousands of vendors you interact with, you give them your home address. You give them your billing address. They now have your real name and your billing address.
It's insane that that's just standard practice. The emperor has no clothes, guys, and we're all going along with it, as if it's fine to tell everyone, "This is my home address. My name is Naomi Brockwell and I live in this place." It's insane. So you could use a masked credit card service. Privacy.com is a great one. Obviously it's part of the TradFi world, so it's all KYC, but privacy.com takes precautions to really protect your data and encrypt it at rest, and they let you create burner credit cards. You can put any name on them. You can put any billing address and the payment will still go through, which is great. You can do one-time use. You can set limits. You can have recurring payments. And this way you will never have to give anyone your billing address ever again. Highly recommend it.
A PMB, a private mailbox, is another underutilized thing. It's like a PO box, but PO boxes can't receive things from carriers like FedEx. A PMB is generally going to be a local mom-and-pop provider. There are some chains, but I recommend going smaller; they tend to be easier to work with. You can send a lot of your stuff to these places instead of your home address.
Or if you're sending something to your home address, use a fake name, you know. Especially if you're using privacy.com, you can just change your name to any alias. It's a great way to try and protect yourself.
There are still ways your data is going to be leaked. Utility companies, for example, are among the most notorious for selling data. Your bank is among the most notorious for selling data. All of these places demand your real address, and then they'll share it. So there are other methods you can use to protect it. You could buy a house in a trust. You could rent a house through an LLC. There are different barriers you can put up against people getting that information.
With your bank, you know, you could enroll in something like an address confidentiality program. Every state in America has one. You should probably check it out. They're heavily underutilized, and mainly intended for victims of stalking.
If you're in this room and you're involved with crypto, I hereby authorize all of you to apply for these programs, because I can guarantee there are people targeting crypto people all over the world. So feel free to use these programs to protect yourself. It's much better to do these things in advance than for something bad to happen and for it to be too late.
What else are people doing? Yes.
Audience member: ZK MixNet.
Naomi: A ZK mixnet. That's awesome. So you're using a mixnet as a kind of proxy or VPN. What is your ZK mixnet called?
Audience member: ZKNet.
Naomi: Okay. Very cool. And how has the experience been using it? Like latency? Is it functional?
Audience member: Early alpha.
Naomi: Early alpha. See, that's the future, guys. I think we're all going to be moving on to this stuff. Did you want to add something?
Audience member: Yeah, there is latency by design, because it's strong anonymity, and if you want to protect the sender, or anything else in that way, that is the trade-off. Without compromise, it gives the highest privacy for the highest-value transactions. So crypto transactions are a great example, or AI API requests. That's different from streaming your Netflix; that's outside the scope of it.
Naomi: That's really, really cool. So you have all these tools for private browsing, for navigating the web privately. Obviously Tor is going to be very slow, but it's something you should all be trying out and using. And then things that use secure enclaves and TEEs to protect data, so that whoever's running a node can't see it, are really, really exciting. You've got a lot of those kinds of mixnets popping up right now. And then a general VPN for everything: put it on your home router, put it on every device. The function of this is really so that every website you visit doesn't get your IP address and use it as a tracking tool and a fingerprinting tool.
So this is really great. It gives you a spectrum. You can level up and start to use ZK mixnets if you want to do things that are more sensitive than just general browsing.
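The core idea behind Tor and mixnets, layered or "onion" encryption where each hop can peel exactly one layer and learns nothing else, can be sketched in a few lines. This toy uses an XOR keystream purely for illustration; it is NOT secure cryptography, and real networks use authenticated public-key encryption per hop:

```python
import hashlib

def _stream(key: bytes, n: int) -> bytes:
    """Deterministic keystream from SHA-256 in counter mode (toy, NOT secure)."""
    out = b""
    ctr = 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def xor_layer(data: bytes, key: bytes) -> bytes:
    """XOR is its own inverse, so the same call adds or peels one layer."""
    return bytes(a ^ b for a, b in zip(data, _stream(key, len(data))))

def wrap(message: bytes, hop_keys) -> bytes:
    """Layer the message so hop 1 peels the outermost layer, hop 2 the next, etc."""
    for key in reversed(hop_keys):
        message = xor_layer(message, key)
    return message
```

Because each relay only ever removes its own layer, no single hop sees both who sent the message and what it says; that separation, plus the added latency and batching in mixnets, is exactly the trade-off the audience member described.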
What else are people doing? Yeah.
Audience member: I pay for a Proton subscription.
Naomi: Thank you for paying. So, there are a lot of premium services out there. I love the idea of everyone getting access to privacy; I don't want people getting priced out of something that is really important. But that means that if you can afford to pay, you should, because these places are not going to be sustainable unless we support them. They usually have free tiers, which is great if you just want to try something out without commitment. But if you find that you're using something valuable, even if it's a free tool, write to the developers and find a way to donate to them. If you're using a GrapheneOS phone, see if you can donate something to them. These teams work real hard for your benefit. So I love that you're paying for a subscription there.
Proton is a great ecosystem. They're trying to be a Google competitor, in that they offer drive storage, collaborative docs and spreadsheets, a VPN, a calendar, and all of these different things as well as email. So it can be a really nice ecosystem. We use it for our company; all of our emails are within the Proton ecosystem. Now, obviously some of these tools might not be as polished as Google's, because Google has, like, 85 billion people working on the emoji feature, right? Proton is not going to have the same number of people. But most of the people at Google are really focused on the ad side, and a lot of bad, exploitative behavior. And you kind of have a choice, right?
We can continue to use the products we've always used, like the same thing with Facebook that I said before. A lot of us probably signed up for Gmail not understanding that Google is an advertising company. That's their business model. We just thought this was a free thing on the internet, and it's free because it's in the ether. Why would you need to pay for something that's just ones and zeros? Why would that have any cost?
So we all just signed up, and then inertia got us, and out of habit we've built up all of our contacts and everything in the Google ecosystem. But we have tools we can replace that with now. And I really encourage you: don't feel you need to switch immediately. Just set up an account. Just create it, and it's there. Just take the first step to moving over.
Because you have a choice. You can fuel the ecosystem that is exploiting people, the one creating tons of data that governments are absolutely getting without a warrant all of the time, because under the third-party doctrine they do not need a warrant to get access to email contents and all of this stuff. Or you can support the companies that are trying real hard to protect you. They're trying to make better privacy tools. They're trying to do things that help give individuals back their right to privacy and protect them.
So every time you're at that juncture, just see if it's something you can incorporate into your life. We need to support the people who are building this stuff. We need to use their tools. If we don't, these things will disappear. If they are not sustainable, they will disappear. If their developers can't afford to work on this stuff full-time, because they can't even afford to keep the servers going, this stuff will disappear.
On top of that, if legislators ban this stuff out of existence because no one is fighting for it, because we're all saying, "Well, I have nothing to hide," this stuff will disappear.
The choices we have to make about the future we want to see (46:56)
So, I'll leave you on this note, because I think we're running out of time. Right now, we are at that juncture in the road where we have to make some choices about the world we want to see. And I know there are a lot of people who are inconvenienced by these systems, and it seems like a lot of work to move things over.
But I think we need to be really cognizant of the future we're currently writing and the direction we're going in. If the people in this room are not the trailblazers, I can bet you the mainstream are not doing this. You are the ones who will create that quorum, who will create the new norm that moves people over. So there's a lot of responsibility on your shoulders right now. And a lot of you may be thinking that you have nothing to hide, that this stuff isn't important, that you don't really care, and maybe that the cost of switching is too much for you.
So I want to just kind of put this question to you. Do you want to live in a world where whistleblowers can no longer exist? Do you want to live in a world where investigative journalists can no longer do their job safely? Do you want to live in a world where opposition parties can no longer form? Do you want to live in a world where dissent is no longer possible?
Because that is the current world that we're building. It is not about you, actually. It's not about whether you personally have something to hide. It's whether you want to live in a world where none of that stuff is possible anymore. That's the future we're currently building. That's the infrastructure of surveillance that has taken hold.
And so we have to think about what world we are building for future generations. Are we fueling a world where we can no longer undo this stuff? Where it becomes embedded and we can no longer walk it back, because governments have outlawed it, because no one spoke up for it? Where businesses have gone under because no one supported their tools, and we just kept fueling their competitors, the competitors that are harvesting all of us?
So have a think about that when you leave today. Think about what future you want to write, even if it's a small change, even if it's one tiny choice you make differently. Someone says, "Hey, let's DM. Are you on Telegram?" and you say, "Actually, let's connect on Signal." Or they say, "Hey, I'm on WhatsApp," or, I mean, there are so many bad options, like SMS. Try to think about the tiny little choices you can make that help build a more private future, and support the tools that are trying to support us.
So, I'll leave you on that note. I really appreciate all of you being here. As I said, I am hosting a privacy deep dive. We'll be talking a lot about GrapheneOS and specific settings. We'll be talking about Wi-Fi beacons, about your apps and SDKs and all of those things, and we'll go through how to really lock down a device. If any of you want to join, it'll be at Regen Hub at 310.
So, thank you so much for being here and I believe in all of you. We've got this. We can build a better future.