Jerod Santo:

So we're here with Bruce Schneier, who's a cryptographer and computer security professional. If you haven't heard of Bruce, you need to. He's been around a long time, and he's been keeping a lot of us up to date with what's going on in cybersecurity, infosec, etc. Bruce, thanks for joining us.

Bruce Schneier:

Thanks for having me.

Jerod Santo:

Happy to have you. First of all, you call yourself a public interest technologist. This is a term that many of us probably haven't heard. What does that mean?

Bruce Schneier:

Yeah, and I want you all to have heard of it, because I think this is an important term and way of thinking. Public interest tech is kind of an umbrella term for people who somehow marry tech and policy. And traditionally, these are very different worlds. Technologists deal in computers and numbers and algorithms, and true/false, yes/no, and policy people deal in consensus, and just different ways of thinking and problem-solving. And we know what it looks like when it goes badly wrong, if you watch the tech hearings in Congress, or attempts to reform various tech laws... And I try to occupy this space between tech and policy. So I am teaching cybersecurity policy at the Harvard Kennedy School. I'm teaching cybersecurity to students who deliberately never took math in college. Or imagine a technologist working on a lawmaker's staff, at a federal agency, in the military, for an NGO, trying to figure out the policy of tech, the tech of policy... You know, all the ways that tech and policy have to work together and can be at cross-purposes - I think this is important. Like I said, I'm here at Harvard, and there's a field called public interest law. 20% of Harvard Law School graduates go into public interest law. They don't work for a big law firm, they don't work for a corporation. They work on immigration law, and housing law, and discrimination law. And all of these things aren't actually paid very well, but they make the world better.

Adam Stacoviak:

Yeah.

Bruce Schneier:

The number of computer scientists that do that kind of thing here is like zero.

Jerod Santo:

Right.

Bruce Schneier:

They all go work for big tech. But we need this career path of people who want to do good with a tech degree, or go to law school after a tech degree, or who have a law degree and learn how to program. So all of this kind of bridging between ways of thinking is, I think, really important. The fundamental problems of our society in this century are tech problems. And if you don't understand tech, how could you deal with (I don't know) the future of employment, let alone algorithmic discrimination? So that's what I'm really trying to preach and model and push for. And it's not just me. The Ford Foundation is trying to fund public interest tech programs in universities around the country. They invented public interest law in the mid-'70s, so they're kind of good at this. And there's this notion that we need to train people to bridge that gap, to have one foot in both camps.

Jerod Santo:

Yeah. What do you think is a more viable path - taking lawyer types and teaching them computer science and security, or taking computer science types, teaching them law, getting them interested, and having them forgo the lucrative salaries and very relaxed work environments at many of these big tech companies? What's the way to get it done?

Bruce Schneier:

You need both. You need all the ways to get it done. The ACLU pays one-third to one-tenth the money you can make as a lawyer at a big corporate law firm, and when they put out an application for an attorney, they get a hundred resumes. So there are lots of people who are not just pursuing money, if there's a viable career path. \[08:06\] You work for the ACLU as an attorney, you feel good about your life when you come home. You're not working for some horrible corporate interest, you're not doing something you don't believe in. And I don't think the problem is gonna be supply. I think the problem is going to be...

Jerod Santo:

Path.

Bruce Schneier:

...demand and path. And I want both. I want there to be a path for an attorney or a policy student to learn enough tech to bridge the gap. I want a path for a CS student to learn enough law or policy to bridge that gap. And I'm teaching cybersecurity and policy, and I will get CS students, and that's fantastic. I'll get business school students. I want that mix.

Adam Stacoviak:

What was your path then? So if path is important, what was your path to -- you kind of mentioned why this is important, but how did you get there? What was your path to make this a thing for you, or to even care so much?

Bruce Schneier:

My path was becoming more general... And path stories are interesting, right, but every one of us is an exception, with an exceptional and unique path. So it's not something that can be replicated.

Adam Stacoviak:

Right.

Bruce Schneier:

Because here I am, in the early '90s, doing tech; I get fired from AT&T - I get laid off - and I write a book about cryptography, which becomes a bestseller, because no one knew about cryptography; the internet is taking off... And really suddenly I am in the middle of this tech renaissance. I'm good at explaining, I'm good at making tech accessible, and I naturally get drawn into policy. And as I start generalizing, I write about the mathematics of security, then I write about computer security, network security, security policy, the economics, the psychology of security... And then my latest books are about the public policy of security. So I'm coming at it from tech, but I'm making it up as I go along. Being a bestselling author is not a viable career path.

Jerod Santo:

\[laughs\] We can't all do that, yeah.

Bruce Schneier:

It is a fun thing to do, and I recommend it, but if that's the only way, we're not getting anywhere.

Jerod Santo:

So you mentioned your books -- and I do have to thank you... When I was back in college, I was studying computer science and information security, and I was knee-deep in Diffie–Hellman key exchanges, one-way hashing algorithms, really staring right at the trees. And I was assigned to read Secrets and Lies, which you wrote - the second edition; I think you wrote the original one pre-9/11, and this was the post-9/11 update. And in that book you really made it clear to me how these technical nuances and details I was studying tangibly affect the real world, and it was very useful. So I appreciate you writing that one; of course, you've written many books since then.

Bruce Schneier:

It's interesting - your story's got a few holes in it. Secrets and Lies is my second book, and it came out in (I think) 2000. I'd actually have to pull it off the shelf and check. I never updated it. In 2003 I wrote a book called Beyond Fear, and that's where I actually talk about the terrorist attacks of 9/11. You're holding up the paperback, which might have been issued post-9/11, even though the book was published before.

Jerod Santo:

Okay. So the copy I have - chapter one, the introduction - is copyright 2004...

Bruce Schneier:

Oh, so for the paperback I wrote a new intro. Yeah, they make you do that.

Jerod Santo:

Okay, so that's the one that I've got.

Bruce Schneier:

Right. Because people think it's a new book, but you just wrote like four new pages... So fooled you.

Jerod Santo:

Good play. Good play, yeah.

Adam Stacoviak:

\[laughs\]

Jerod Santo:

What was interesting, and the reason why I was kind of flabbergasted when you said that, is because in the intro you do say that when you were making this update, one thing that surprised you was how little had changed in between the two, from the 2000 edition to the 2004 one.

Bruce Schneier:

Interesting.

Jerod Santo:

\[12:02\] And that's interesting, because now we're like 20 years from there, and I wonder, would you still say that, or has so much changed since then? In the world of security specifically.

Bruce Schneier:

It's interesting... I mean, a lot has changed and not a lot has changed. People still read Secrets and Lies and get a lot out of it, because a lot of it is still true. But the threat landscape is now weirdly different. We're worrying about nation-states in ways we weren't. Ransomware didn't exist back then, business email compromise wasn't anything I wrote about... I mean, a lot of the business of cybercrime, and almost like the business of cyber-espionage - it's become institutionalized in a way that we didn't really think about 20 years ago.

Jerod Santo:

Right.

Bruce Schneier:

But it's surprising how much is the same. The stuff on passwords is the same, the stuff on firewalls and IDSes and network security is the same. So it's both the same and different... Which I think is interesting.

Jerod Santo:

Yeah, it's almost like the foundations are still there, only everything's just kind of escalated, gotten more mature... You said it's been business-ified. I remember the early worms and stuff - people would do them as jokes, or by accident, and they caused major harm. And at a certain point it seemed like people realized, "Well, if I have this virus, or if I have this attack - actually, if I keep it secret and don't let anybody know about it, I can do a lot better for myself and make a lot more money."

Bruce Schneier:

And now you've got the Russians, who do this for espionage purposes. I mean, SolarWinds was -- not a worm in the same way, but think about WannaCry and NotPetya, and a lot of these nation-state attacks... I don't know if we really thought about the internet as a battlefield in the same way. We were still believing John Perry Barlow's Declaration of the Independence of Cyberspace - that nations couldn't touch us there.

Jerod Santo:

What's the most sophisticated or impressive current hack or technique that you've seen in the modern era? What's really impressed you? Sometimes these things are so clever and interesting, the way that people actually go about them.

Bruce Schneier:

Yeah, SolarWinds was pretty clever. Subverting the update mechanism of a random piece of network management software you didn't even know you had, as a way to subvert 14,000 networks worldwide, and then picking and choosing who you wanna actually attack, and going in and laying whatever groundwork you need so they can't possibly ever kick you out, unless they burn that network to the ground, which nobody ever does... That was pretty impressive. Now, what's interesting, I think, is to think back to the NSA documents that we saw because of Snowden. This is 2013, so it's almost a decade old now. And a lot of that was really impressive. They had exploits that would survive reinstalling the operating system - wiping the computer and rebuilding it from scratch. That was ten years ago.

Adam Stacoviak:

Yeah, right?

Jerod Santo:

It hasn't advanced since then, surely.

Bruce Schneier:

And it's not like they've done nothing in the past ten years. So I think the impressive exploits are the ones we don't see. And you never use them where they can be exposed. If you are an intelligence organization - Russians, Chinese, Americans, Brits, whoever - you never use a more sophisticated attack than you absolutely have to.

Jerod Santo:

You hold on to the best stuff for later.

Bruce Schneier:

You hold on to the best stuff until you really need it. If you've got a ten, and a three will get you in, you're gonna use a three; or maybe use a four, to make sure. You save the ten for when you need a ten. You don't waste it. So the sophistication - you almost don't need it, really. The fact that there are now business models for ransomware - it's organizational sophistication, as opposed to technical sophistication.

Adam Stacoviak:

What's the actual business model of ransomware these days? What is the business model?

Bruce Schneier:

Oh, the business model is to ransom and to get money.

Adam Stacoviak:

Okay, that's easy.

Bruce Schneier:

\[16:11\] But there are organizations that do this, ransomware as a service. You can rent ransomware capability. There are criminal organizations that specialize in getting in, there are ones specialized in getting money, there are ones that specialize in turning the Bitcoin into actual cash you can spend. There's a whole supply chain, international criminal supply chain. That's incredibly sophisticated. That all is in the service of ransomware.

Jerod Santo:

One thing I heard you say about ransomware which is interesting to me - and I would love for you to elaborate on it - is that it takes advantage of the fact that most people's data actually isn't all that interesting to anybody except themselves.

Bruce Schneier:

And this is, I think, the fundamental insight. There are really two, and that's the first one. If I steal your data, what do I do with it? I can sell it. The only freakin' person who wants to buy it is you. Nobody else wants your photos, nobody else wants your email. If you were an important celebrity, then yes, I could sell your stuff to somebody else. But for the average person, the average company, no one else cares. So that's insight one - not to steal your data, but to block you from having it, and then sell you your access back. I think that is an enormous insight, and whoever thought of it was being incredibly creative. The second thing that makes ransomware possible is Bitcoin. Criminals can't use the banking system. You have two problems: criminals are prohibited from using the real banking system, and suitcases full of hundred dollar bills are really heavy. The only way for me to pay a ransom is through a cryptocurrency. And I'm not making this up. Go to your bank and try to wire $50,000 to a Russian account. I mean, just try. You can't. It's not like it's hard - it's impossible.

Jerod Santo:

What if you say "But they kidnapped my daughter. I have to do this."

Bruce Schneier:

You can't do it. You will not be able to do it. The banking system will not let you wire money that way. It's not going to a reputable business, so it can't move. There are a lot of banking regs to stop you from doing that. So Bitcoin makes ransomware work.

Adam Stacoviak:

How do you feel about that? Does that make you negative on Bitcoin? How does that make you feel about Bitcoin?

Bruce Schneier:

It does not make me negative on Bitcoin. Bitcoin is completely stupid and useless for all sorts of other reasons. This is just an ancillary bad thing. If ransomware didn't exist, Bitcoin would still be stupid and useless and idiotic, and we hope it dies in a fire as soon as possible.

Adam Stacoviak:

Who's "we"?

Bruce Schneier:

Everybody who does security, basically.

Adam Stacoviak:

Okay... Why is that? Why do they have that feeling?

Bruce Schneier:

Because it doesn't solve any actual problems anybody has; it isn't decentralized, it isn't secure, it isn't anything. It causes all sorts of other problems, and has absolutely no value. At all. Plus, it's a speculative bubble, and people are losing lots of money.

Jerod Santo:

\[laughs\]

Adam Stacoviak:

Gotcha.

Jerod Santo:

So what about censorship-resistant money exchange? The concept of Bitcoin, with the peer-to-peer exchange of money. Do you think there's value in that concept of not having an intermediary between the two of us?

Bruce Schneier:

No - intermediaries have value. There's no value in a system where if you're exchanging money with somebody and they're a millisecond slower than you, you lose all your money, and there's no recourse. That is not valuable.

Jerod Santo:

Sure.

Bruce Schneier:

There's no value in a system where if you forget your password, you lose your life savings. That's just dumb. Intermediaries have value, that's why they exist. They're not there because they hate us. You wanna exchange money - use Venmo. It works great. Why don't you like it?

Jerod Santo:

\[20:01\] I like Venmo.

Adam Stacoviak:

Yeah. I use Venmo.

Bruce Schneier:

Right. We all like it. And most people who think they have Bitcoin don't actually have Bitcoin. The blockchain does seven transactions per second. Most people on Coinbase don't actually own their Bitcoin. Coinbase has a database, just like Venmo, that allocates ownership. The whole blockchain is largely a myth for most users.
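
For a concrete sense of what "a database that allocates ownership" means, here's a minimal sketch of a custodial exchange ledger - illustrative only, not Coinbase's actual design; all names are invented. Trades between customers are just row updates in the exchange's own database; the chain is only touched when coins move in or out of the custodian's wallet.

```python
# Minimal sketch of a custodial exchange ledger (illustrative, hypothetical).
# Customer-to-customer "Bitcoin" trades are plain database updates, like
# Venmo; the seven-transactions-per-second chain is never touched until
# someone actually withdraws.

class CustodialLedger:
    def __init__(self):
        self.balances = {}  # customer id -> BTC balance on the exchange's books

    def deposit(self, customer: str, amount: float):
        # An on-chain transfer lands in the *exchange's* wallet; the customer
        # just gets a bigger number in this table. The exchange holds the keys.
        self.balances[customer] = self.balances.get(customer, 0.0) + amount

    def trade(self, seller: str, buyer: str, amount: float):
        # Off-chain: two rows change, and no blockchain transaction happens.
        if self.balances.get(seller, 0.0) < amount:
            raise ValueError("insufficient balance")
        self.balances[seller] -= amount
        self.balances[buyer] = self.balances.get(buyer, 0.0) + amount

    def withdraw(self, customer: str, amount: float) -> str:
        # Only here would a real on-chain transaction be broadcast.
        if self.balances.get(customer, 0.0) < amount:
            raise ValueError("insufficient balance")
        self.balances[customer] -= amount
        return f"broadcast on-chain tx: {amount} BTC from the exchange wallet"

ledger = CustodialLedger()
ledger.deposit("alice", 1.0)
ledger.trade("alice", "bob", 0.25)  # "owning Bitcoin" without touching the chain
print(ledger.balances)              # {'alice': 0.75, 'bob': 0.25}
```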

Jerod Santo:

So one aspect of the idea of using cash - I'm talking about actual, physical cash - is privacy. You talk a lot about government espionage, spying etc., and of course, digital currencies make it easy to spy on what your citizens are doing. Now, Bitcoin, with its public ledger - of course, that's easy to spy on and track as well.

Bruce Schneier:

Right. Easy to spy on. So the notion that it's private isn't true... I mean, Bitcoin's built out of a whole lot of lies.

Jerod Santo:

Right. Well, I was wondering what you think of other privacy coins - people doing things with Monero, and Zcash - and whether you think there's any value in those.

Bruce Schneier:

Not really. I mean, yes, they facilitate ransomware, they facilitate a whole lot of crime... And Know Your Customer rules, anti-money laundering, anti-terrorist financing - I think these are things that are valuable for society. I wouldn't toss them out the window. If you talk to people who deal with child exploitation - the fact that you can move that money around without a government stopping you is not good. It harms people.

Adam Stacoviak:

It's like -- this might be a little left field, but The Tinder Swindler on Netflix... Have you seen that?

Bruce Schneier:

I don't even know what this is. It doesn't sound good...

Adam Stacoviak:

Well, I'll give you the TL;DR. It's about somebody who was able to con people out of large amounts of money, across various bank accounts; he would con one person, then con another person... Credit cards, bank accounts that didn't match his name. So when it comes to Know Your Customer and attaching a bank account to a cash app account, or to a Venmo account, that doesn't match your name - that's anti-money laundering; that's what you're speaking of. I mention that because this person never really used a bank account or a credit card that was in his name. It was always somebody else's. So when it comes to these intermediaries - I don't know how he was able to bypass this stuff, but that's what they do. They say, "Okay, your account is in this name, Adam Stacoviak. Does the bank account you're attaching, money coming in or money going out, match that same name?" If not, they're gonna flag it for anti-money laundering, and you have to prove you own it by way of a W-2, or some sort of tax form, or -- well, a W-2 wouldn't fit there, but some sort of tax form that you filled out that says this account is yours, or whatever. Something. Right? So that's where that comes into play, this intermediary benefit.

Bruce Schneier:

Yeah. I think there's real value in governance. We need governance, and you saw this in a lot of the hacks... The notion that blockchain money is secure - the math is never hacked. The exchanges are hacked, the wallets are hacked... Everything else is hacked. All the time. And there's nothing you can do about it.

Adam Stacoviak:

Or it's a complete con.

Bruce Schneier:

It's a total, complete con. In a lot of ways, all of this blockchain-based finance is speed-running 500 years of financial fraud. You've got wildcat banks, you've got Ponzi schemes, you've got unregulated securities, you've got pump and dump... It's all there. Front-running... And it's all illegal in the normal banking world, because it's bad, and it should be illegal... But because nobody's regulating these blockchain-based systems yet, a lot of people are losing money. Another fun experiment - go on to Twitter and type "I'm having trouble with my Bitcoin wallet. Can anybody help me?"

Jerod Santo:

\[laughs\] No, thanks.

Adam Stacoviak:

Oh, gosh...

Bruce Schneier:

You will get a lot of responses from people offering to help you, and if you follow their advice, you will unwittingly give them control of your account.

Adam Stacoviak:

Wow.

Bruce Schneier:

That is the way that fraud works. And it's Bitcoin - there's nothing you can do about it. Period. Done. This is not great.

Break:

\[24:10\]

Jerod Santo:

So fraud leads us to one of the problems we have in computer security, which is social engineering. Fraud is just sophisticated social engineering - you're tricking somebody into doing something that benefits you and doesn't benefit them. It seems like for those kinds of things, education is really the only solution to that particular problem... Is that what you think?

Bruce Schneier:

You know, not really... Education, to me, is a lot of victim-blaming. "If you were smarter, you wouldn't have fallen for that." I think that's a convenient crutch to hide bad design.

Jerod Santo:

Okay.

Bruce Schneier:

Think about some of the security advice that we're giving. "Don't click on a random URL." It's a URL, what am I supposed to do with it?!

Jerod Santo:

Right.

Bruce Schneier:

"Don't stick a USB stick into your computer." Like, what dumb advice is that? It's a USB stick. The real problem to me is how can we design systems so that clicking on a URL isn't dangerous? That's a design problem. Anytime I think you see the user did something wrong and a bad thing happened, or educate the user, go a little and look at the design. What is the design that forces us to throw this on the user? We don't talk about various salmonella in chickens and say "Well, the user has to check." No. We have health codes.

Jerod Santo:

Right.

Bruce Schneier:

You got sick at a restaurant. "You should have gone in the kitchen and done an inspection. Why didn't you?"

Adam Stacoviak:

\[laughs\]

Bruce Schneier:

We don't do that.

Adam Stacoviak:

I'm doing that next time. I'm just kidding... \[laughter\]

Bruce Schneier:

I think we need to design systems so that naive and uneducated users can be safe. I'm flying tomorrow; first time in a while - kind of exciting... I'm gonna get on the airplane, and I'm not going to inspect the engine, I'm not going to look at the flight logs, I'm not gonna check the pilot's training record, or whether he had his mandatory rest period... I'm not gonna do any of that. I'm gonna get on the plane and not even think about it. I don't even have to know what the safety looks like. It's magically done for me by a government. We need computers to be more like that. It can't be that you need to be an intelligent user to safely use the internet. That's not gonna fly, so to speak.

Jerod Santo:

So then the next question logically is "Well, how do we get there?" And it sounds like the answer is "Policy."

Bruce Schneier:

And \[unintelligible 00:27:38.22\] Because the companies wanna blame the user. The companies love that we blame the user for security issues, because then they don't have to fix anything. So I think there are answers in regulation, liability... Markets don't reward safety and security, pretty much ever. And if you wanna get that - if you want restaurants that won't poison you, or drugs that are safe, or cars that won't blow up on impact - that is always government intervention. That's the way we do it. Pajamas that don't catch on fire... whatever it is we as a society like.

Adam Stacoviak:

Blankets, yeah. Gosh.

Jerod Santo:

\[28:18\] Well, what's crazy is how data breaches are becoming normalized.

Bruce Schneier:

Right. And they are normal.

Jerod Santo:

Right.

Bruce Schneier:

And the question is whose fault it is. So if we're talking about Harvard Law School - they deal a lot in partial liability. There's a car crash - it's this driver, it's that driver, it's this car, it's that car, it's road conditions, it's the signs and the way the road is designed, the weather... And they figure out who's at fault, and by how much. We don't do that in the computer world. We don't really have that notion of liability. But you know, some of it is gonna be the fault of the vendor. SolarWinds - they had a faulty product that allowed the Russians to break into their update system and send a hacked, backdoored update to 14,000 customers. You'd think they'd have some liability here... I mean, it wasn't my fault...

Adam Stacoviak:

I had this conversation actually for an upcoming episode of another show we have called Founders Talk.

Bruce Schneier:

You have more than one episode, more than one show? Is that allowed?

Jerod Santo:

\[laughs\]

Adam Stacoviak:

Yeah, we have six different shows, and maybe more in the future, yeah.

Jerod Santo:

There's no regulation, so we just do what we want.

Adam Stacoviak:

Yeah, we do what we want.

Bruce Schneier:

It keeps you busy, I guess...

Jerod Santo:

Yeah.

Adam Stacoviak:

The conversation was really around incident management, but also the opposite of that, which is reliability... And this idea that as part of incident management and this pursuit of reliable software, part of good design in the hierarchy of an organization is this idea of service ownership. So when you speak to SolarWinds and who's at fault, some organizational things can happen to sort of showcase service ownership. So if you have unreliable software and you get called for pager duty, that's one way to say who's - not so much at fault, but who sort of owns it. Could that kind of stuff - maturing engineering departments, essentially - begin to help? More information, more evidence to showcase who's at fault, and to what degree, when it comes to these kinds of hacks.

Bruce Schneier:

I think that makes some sense... The way to really think about liability as an improvement tool is to look at who can fix the problem. In general, in society, you want whoever has the ability to fix the problem to be in charge of the problem. Credit cards are probably a decent example. In the old days, in the early '70s, you were liable for credit card fraud on your card. Someone stole your card, charged a bunch of stuff - you were liable. Now, you couldn't fix the problem. Congress passes the Fair Credit Reporting Act in 1978, and now the maximum liability for the customer is $50. So now the credit card companies are suddenly losing money due to fraud. So they do all sorts of things.

Adam Stacoviak:

They're fixing the problem, right?

Bruce Schneier:

They fix it. They have to start doing real-time verification of card numbers with these terminals. They start doing better anti-counterfeiting protection - holograms, and micro-printing on the cards. They have the card and the PIN. They mail you the card and the activation separately. All of these things. And the biggest thing is they have these giant expert systems in the backend, looking at your spending patterns for patterns of fraud. And none of that could the customer do. So pushing the liability onto the companies was better for society, because society could fix it. So if you think about SolarWinds - if I'm a SolarWinds customer, I get an update, I install it. You want me to do that. You want people to install updates. If we want the update to be safe, that has to be SolarWinds' problem. No one else can fix that. So from a societal perspective, I want them liable for defects in the update, because only they can improve the process. The customer can't.
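
As a toy illustration of the backend pattern-matching described here - real issuer systems are vastly more sophisticated, and every rule and threshold below is invented for the example - fraud flagging on spending patterns can be sketched like this:

```python
# Toy sketch of rule-based fraud flagging on spending patterns.
# All rules and thresholds are invented for illustration; real systems
# are far more elaborate (and increasingly machine-learned).
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Transaction:
    card: str
    amount: float
    country: str
    timestamp: datetime

def flag_suspicious(history: list[Transaction], tx: Transaction) -> list[str]:
    reasons = []
    recent = [t for t in history
              if t.card == tx.card and tx.timestamp - t.timestamp < timedelta(hours=1)]
    # Rule 1: physically impossible travel - two countries within an hour.
    if any(t.country != tx.country for t in recent):
        reasons.append("country change within one hour")
    # Rule 2: amount far outside this card's usual spending pattern.
    past = [t.amount for t in history if t.card == tx.card]
    if past and tx.amount > 10 * (sum(past) / len(past)):
        reasons.append("amount is 10x the card's average")
    # Rule 3: rapid-fire charges, a common card-testing pattern.
    if len(recent) > 5:
        reasons.append("more than 5 charges in an hour")
    return reasons

now = datetime(2022, 3, 1, 12, 0)
history = [Transaction("4111", 40.0, "US", now - timedelta(minutes=30))]
print(flag_suspicious(history, Transaction("4111", 1200.0, "RO", now)))
# -> ['country change within one hour', "amount is 10x the card's average"]
```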

Adam Stacoviak:

\[32:17\] And then it becomes a thing you can leverage in terms of competition. It's like, "Well, who's better at keeping their software safe? Who's better at keeping their software reliable? Well, this company - so I give them my business." It becomes a competitive advantage.

Bruce Schneier:

Yeah, somewhat... That tends not to work. It tends not to be a market driver. Think about it - no airline advertises themselves as "We have fewer crashes than the other guys." Nobody.

Adam Stacoviak:

Sure. They don't want you thinking about crashing. They're like, "Don't mention the word 'crash'. Don't say 'bomb'."

Bruce Schneier:

Right? Cars don't. The exception was Saab in the '80s. They would advertise "We're a safer car." But pretty much nobody does. Restaurants, supermarkets... They do not compete on these things.

Adam Stacoviak:

"No salmonella here."

Bruce Schneier:

Right. "No salmonella here." Big sign. "No salmonella here." You never see that. And you're right, they don't want you to think about salmonella when you're buying your chicken.

Adam Stacoviak:

Truth.

Jerod Santo:

Yeah.

Bruce Schneier:

So it isn't something the market can solve. It is rare that you see market solutions for safety and security, because they tend not to be salient when someone makes a purchasing decision. It's price and features.

Jerod Santo:

Yeah. And convenience. We've seen it over and over again.

Bruce Schneier:

Convenience is a feature, yeah.

Jerod Santo:

Yeah, it is. But we'll trade our security or our privacy for convenience.

Bruce Schneier:

All the time.

Jerod Santo:

We do it all the time.

Bruce Schneier:

And it makes perfect sense.

Jerod Santo:

Yeah. On the margins, for sure. So have any of these big breaches or cases been litigated in a way that has brought the liability back to the vendors? Or is that just not the case?

Bruce Schneier:

Not liability... There has been litigation. I'm not up on the current state of litigation. But there are class action lawsuits, there are some regulatory fines... They tend to be rounding errors. The exception is going to be Europe and GDPR, and privacy violations. Europe is the regulatory superpower on the planet. They do issue fines that companies notice, and don't say "Oh yeah, that was cheaper than the attorney fees. We'll take it", which the U.S. tends to do. But not enough... One of the problems with litigation as a driver of social change is that almost all cases never get to courts where a judge decides.

Jerod Santo:

It gets settled.

Bruce Schneier:

They're almost always settled in private, with nobody admitting any wrongdoing. Even the ones the FTC brings against companies. So they tend not to be good models for others going forward. I'm not sure how to fix that, but that seems to be a problem we're having.

Jerod Santo:

What's your take on GDPR? Are you happy with it? Do you think it's worked out the way they wanted it to? It seems to me, in a practical sense, there's just a whole bunch of cookie banners now that weren't there before. And it's like, was that the intent --

Bruce Schneier:

Yeah, in a sense, GDPR was medication to stop the pain, rather than medication to fix the illness. It was a good start. It probably did what the people who wrote it thought it would, but there are too many loopholes, too many ways to get around it, too many things it doesn't do... So we can't stop there. But it is doing something. It is not completely useless.

Adam Stacoviak:

This only puts more pressure on your point, which is policy. This idea of tech and policy.

Jerod Santo:

Right.

Adam Stacoviak:

Like, we need more people to have an understanding of technology, to be involved in policy-making, so that -- this is an iteration, like you said; it's a beginning. I think if we're in software, we have to believe in iteration. So we have to believe in, I would imagine, iteration at the policy level as well. So while GDPR may be a start, it's gotta be something that begins an evolution. And that begins with more and more people - as you had said, this vacuum that's there - and demand for people involved in tech and policy.

Bruce Schneier:

\[36:09\] I think that's right. And these are not easy problems we're talking about. Now, the public policy of tech - look at the current battles over Section 230, and free speech on Twitter, all of these... These are not tech problems; these are policy problems, these are human value problems. These are "what kind of society do we wanna live in" problems. They're informed by tech. You have to understand tech to think about the problems, but you're not gonna solve them with tech. Tech is gonna be a part of the solution. So yes, very much so.

Adam Stacoviak:

One thing that's come up for me though, with policy -- and not so much to go back to Bitcoin, but more so this idea that I think people believe in, or wanna believe in, with decentralized currency, and crypto, and Bitcoin -- is this lack of trust in government.

Bruce Schneier:

I don't know, I mean -- I think if you don't trust government, you've got way bigger problems.

Adam Stacoviak:

Well, isn't that what they do? They're trying to hedge their bets against fiat currency, that's controlled by government?

Bruce Schneier:

Well, you know, it's just a bunch of libertarian crypto bros. It's not actually a legit, sensical philosophy.

Adam Stacoviak:

Sure.

Bruce Schneier:

I don't buy it for a second.

Adam Stacoviak:

Well, there's some out there that believe that. Even if it's not the majority, right?

Bruce Schneier:

I mean, a lot of people believe it; it doesn't mean it makes sense.

Adam Stacoviak:

Right. The point I'm trying to get to is less that, and more this idea that if we wanna believe in policy change and policy updates - which we do want - I think we have to begin to trust our government more. Or the people that trust it less need to have that faith in it. And then you mentioned Snowden, and spying on folks... That kind of stuff doesn't make you trust your government more; it makes you trust it less. So what are your thoughts on government trust?

Bruce Schneier:

Yeah, it doesn't... And also, there's the far-right paranoia that government can't do good - there's a lot of anti-government fear being stoked by people who have ulterior motives. The people who want you to mistrust government are the people who wanna poison your water supply, and don't want anybody to stop them from doing it. So you know... I mean, yes, I did a book on trust. You have no choice but to trust your government. And government actually does a lot of good in our world. But yeah, I think you are right that mistrust in government is a problem here. And it's a bigger problem than this -- and you're right, it is one that we do have to solve, to figure out how to get back to the notion of good government doing good things.

Jerod Santo:

Right. Well, it doesn't help when, as technologists, we see these Congresspeople in hearings, talking about technologies, and they're completely out of their depth. They have no idea what they're talking about. It's hard to trust that person, yeah.

Bruce Schneier:

Right. Remember -- who asked Mark Zuckerberg, "How does Facebook make money?" A legit question asked at a Senate hearing.

Jerod Santo:

\[laughs\] Yeah...

Bruce Schneier:

Like, you people are trying to govern this and you have no idea that Facebook makes money by selling ads?

Jerod Santo:

Right.

Adam Stacoviak:

We sell ads.

Jerod Santo:

Which is why I think skepticism of government regulation in that circumstance is well founded. Having said that, you're trying to change that...

Bruce Schneier:

But no government regulation is worse. That's the problem.

Jerod Santo:

Sure. I guess the point I'm trying to drive at is you're trying to change that by having a more well-informed policy-making body. You're trying to instruct... I'm sure -- do you advise policymakers as an expert?

Bruce Schneier:

I have. It is not something -- I know people who do that full-time, who work on Congressional staffs and committee staffs, and they do really good work. I do some of it, but it's not the main thing I do. What I'm trying to do here is teach a generation of people going into public policy - teach them how to listen to technologists, how to figure out what they're saying... I'm really trying here.

Jerod Santo:

\[40:01\] Yeah. So you're talking to an audience of software developers and technologists... What would you teach us or instruct us to do? What can we do in our little part of the world, whether we're an individual contributor on a large codebase, or maybe we're starting a new software-as-a-service business... We're building these things, building the future... What are the kinds of things we can be doing now to push things in the right direction, versus the wrong one?

Bruce Schneier:

I want us to think about the policy implications of what we do. This is actually interesting. A few years ago Google invented a new job title. I think it's called "product counsel." So here's the idea - in the old way of doing things, engineers would build the thing, and at the end they'd show it to the attorneys and say, "Will we go to jail if we do this? Is this good? Is this bad?" And the attorneys would give an opinion. And Google realized it's way better to embed the attorneys into the design team from the beginning, where the changes are cheaper. The attorney can say, "If you did it this way and not that way, it's better." And that's what Google does. It's a great idea. I think we need staff policy people. I want a policy person on the design team of these systems from the beginning, to do the same thing. To say, "If you did it this way, your thing won't be racist." Isn't that better?

Jerod Santo:

Yeah.

Bruce Schneier:

Instead of at the end, when it's too late, and suddenly your system is racist and everyone hates you. So I want us as developers, as techies, to be more open to non-tech input into our designs and development from the beginning. I think that is incredibly valuable. And if we can take into account human flourishing, the environment, lots of policy things, I think that would be better.

Adam Stacoviak:

What's the path then to get -- so if this is something you think should be at the up-and-coming SaaS, for example, or the up-and-coming next thing - you know, maybe a well-funded company, a 15-million-dollar Series A, a half-billion-dollar valuation, which is pretty common for a SaaS business... How do they find that kind of person? Are they going through your course? Where is the --

Bruce Schneier:

Yeah, so this is the hard part. We started with this - what's the career path?

Adam Stacoviak:

Right. We're back to the beginning.

Bruce Schneier:

And these jobs are out there. My students are getting hired by tech companies to do tech policy. But there's no good job board; there's no way I can say, "Here. You wanna do this - here's where you go." We're working on it. The Ford Foundation is trying to build these paths, these systems, but it's not there yet. So I don't have a good answer, and that's bad. I mean, I wanna have a good -- I wanna have an easy answer to your question. "You wanna do this? Go do this thing."

Adam Stacoviak:

Yeah.

Bruce Schneier:

"And there's a career path for you."

Adam Stacoviak:

If you just go on Twitter and say, "I need some policy help", and hope you don't get hacked, or swindled, or whatever it might be, right?

Jerod Santo:

\[laughs\] Yeah.

Bruce Schneier:

Maybe...

Adam Stacoviak:

That could be one path. Just kidding.

Jerod Santo:

Enter your Bitcoin wallet address here for policy help...

Bruce Schneier:

You know, everyone I know who finds these jobs - they're all exceptions. And I do try to pay attention, because a lot of students ask me, "I'm looking for a job. What do I do?" Certainly Facebook and Google and the big guys hire them. But a lot of my students don't wanna work for them, because they're evil. They wanna work for some smaller, more interesting company that's doing some social good.

Jerod Santo:

\[43:54\] So you mentioned this good idea inside of Google, you mentioned that we should have policy decisions coming in at the beginning, when we're starting software projects... And it's making me think of idea-sharing; like, "This is a good policy." It makes me think of open source. And we talked about how cybersecurity has kind of grown up over the last 20 years - it's gotten way more serious, ratcheted up the stakes... Open source has also matured during that time, and gotten corporate, and everything...

Bruce Schneier:

Both good and bad, yeah.

Jerod Santo:

Yeah, and I'm just curious, how does open source weave into this story, if at all, and what do you think is good and bad about it?

Bruce Schneier:

I don't think it weaves into the story. Open source is a thing... You know, there's a myth that open source is more secure than closed source. That's not true. Software that's more secure is software that's been looked at, and there are sort of two ways to make that happen: one, you can be Microsoft and hire people to look at your software; two, you can be Linux and put it out there, and lots of people look at it. But you could also be a company, like most software companies, that doesn't hire anybody to look at their software, and you could be like most open source projects, where nobody looks at it anyway. So open source is another path, but it is not a magical elixir. So I don't think open source versus closed source really matters here in any important way.

Jerod Santo:

Right. I was thinking more like open source ideas applied to policies.

Bruce Schneier:

Now here it gets interesting. And it's open source ideas, it's agile development ideas... Like, how do we make policy at the speed of tech? That's actually hard. The story I'll tell in class is about drones. If you remember the history of drones - drones started appearing on the consumer market, and everyone said you can't regulate drones, it's too early; you'll destroy the nascent industry. And then one year, everyone gets one for Christmas. And then you can't regulate drones - it's too late; everybody has them.

Jerod Santo:

We're already flying them.

Bruce Schneier:

Right. There was never a moment when it was right to regulate drones. Now, this is, I think, a microcosm of the problem we have. In the beginning you don't know what to do, and it's too early to do it; at the end there are too many (I don't know) rich lobbyists preventing you from doing anything. So how do we navigate that? This is actually a very big problem of regulation in the 21st century; way bigger than the security issues we're talking about. And it's something that we really need to think about. You know, can we use the ideas of open source, or agile software development, and apply them to legislation, apply them to policy? I think the answer is yes; I don't know how... But we need to figure it out.

Adam Stacoviak:

What about the flipside of that on open source, in terms of an attack vector? What are your thoughts as a security person?

Bruce Schneier:

You know, again, open source and closed source both have attack vectors. We have seen open source attacked; a lot of open source projects are very poorly maintained, or maintained by a hobbyist who doesn't have a lot of security expertise... We've seen open source projects taken over by malicious actors, and subverted... But you see a lot of this in proprietary software as well. I'm not sure it's a difference that makes a difference.

Adam Stacoviak:

It does some interesting things to open source, too. Because to make open source more secure, in some ways you have to get money involved, organizations involved, potentially more people involved - eyeballs, or just more watchers... Which essentially turns it into a mini-organization. It isn't proprietary software - it's still open, it's still open source, it's still permissively licensed, all that good stuff that makes up the virtues of open source - but it does create a lot of complexity around the idea of open source.

Bruce Schneier:

And there's also a tragedy of the commons. If everyone's using this open source project in their software, everyone assumes somebody else is evaluating it, and then nobody evaluates it. We see this a lot. Log4j was an example of that. Everyone thought somebody else was paying attention to Log4j, and it was actually just this one guy, and suddenly there's this huge vulnerability. \[48:10\] So there is a fix happening now. I think the Open Source Security Foundation has set up a program, and they're getting the big tech companies to put in money... I think Google and Microsoft each put in five million - we're all going to evaluate these open source projects. So this third party is going to do the work, the big companies that benefit are going to put in money, and everyone benefits. And it's called (I think) the Alpha-Omega Project. The idea is they're gonna look at the most popular and critical open source projects really carefully - that's the Alpha - and then run automatic vulnerability scanning tools against the top 10,000 libraries. That's the Omega. And can we bypass the tragedy of the commons, and get some real evaluation of these things that it turns out we're relying on, even though we don't realize it?
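
For a concrete sense of what automated scanning of a dependency can look like, here is a small sketch that queries the public OSV vulnerability database (osv.dev) - one of the data sources this kind of tooling draws on. The package coordinates below are just the Log4j example from the conversation:

```python
# Sketch: ask the public OSV database (https://osv.dev) for known
# advisories against one dependency. The package/version below are the
# Log4j example from the conversation; any coordinates would work.
import json
import urllib.request

def known_vulns(ecosystem: str, name: str, version: str) -> list[str]:
    query = json.dumps({
        "version": version,
        "package": {"name": name, "ecosystem": ecosystem},
    }).encode()
    req = urllib.request.Request(
        "https://api.osv.dev/v1/query",
        data=query,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        result = json.load(resp)
    return [v["id"] for v in result.get("vulns", [])]

# log4j-core 2.14.1 predates the Log4Shell fix, so advisories show up here.
print(known_vulns("Maven", "org.apache.logging.log4j:log4j-core", "2.14.1"))
```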

Break:

\[49:07\]

Jerod Santo:

So one thing that's amazing to me about you, Bruce, is just how long you've been going at it.

Bruce Schneier:

Stop telling me I'm old! It's the second time. The first time was about the book you had when you were in college. I'm getting tired of this! \[laughter\]

Jerod Santo:

Longevity. I'm speaking to your longevity, not to your age.

Bruce Schneier:

\[laughs\]

Jerod Santo:

So you've been doing this monthly newsletter, Crypto-Gram - I think I subscribed to it after reading the book, and I've been subscribed to it pretty much my whole adult life now... And this has been going since 1998.

Bruce Schneier:

That's three...

Adam Stacoviak:

"That's three." Sorry...

Bruce Schneier:

\[laughs\]

Jerod Santo:

My question is, what drives you? How do you stay so on top of this every month? And what I find is that a lot of times I just read the headlines, because there's so much in there. I mean, you're writing a lot, you're blogging a lot... How do you keep it going, man?

Bruce Schneier:

It's an interesting story... So Crypto-Gram - I started it as a monthly newsletter in like 1998. That was back when email newsletters were cool the first time... Before they got uncool, and now they're cool again.

Adam Stacoviak:

I like that. That's a good one.

Jerod Santo:

They're cool again now, yeah. I like that.

Bruce Schneier:

Right? And then I turned that into a blog in 2004. And that's like the second wave of blogs, when blogs were cool before they were uncool... And now I guess something else is cool; I don't know what's cool now. So the monthly Crypto-Gram is now a compilation of the daily blog.

Jerod Santo:

Right.

Bruce Schneier:

So some people see it in email, some people see it on my website... And I have been doing it pretty much every day - daily, on weekdays - since 2004. That's, alright, a long time.

Jerod Santo:

That's impressive.

Bruce Schneier:

And a lot of it is -- it forces me to stay current. It forces me to read around and see what's happening, see what's being talked about... And that's good for me. I get a lot of my entries and news items from readers. I get a lot of email, which is really useful to me. \[54:11\] Some people send me links all the time. And that is something I use to stay current, so I really appreciate it. To any listeners who send me emails when they see a good crypto story - thank you; keep doing that. And then, to me, writing is how I understand things. It's how I process the news. It's how I process all of this. So you're seeing my process of processing; I just do it a little bit in public.

Adam Stacoviak:

Yeah. Super-cool. What would you say -- you said you've been doing it daily since 2004, but it's been longer than that overall... Maybe give us a trip down memory lane: what were some of the biggest, most surprising things you saw in security?

Bruce Schneier:

Oh, man, I don't even know. I'm terrible at memory lane. I really am.

Adam Stacoviak:

Let's say the last five years. The last couple of years. What were some of the biggest deals?

Bruce Schneier:

I mean, I remember writing about the September 11th terrorist attacks... It was the first time I ever did an issue out of sequence. I wrote a bunch of articles -- and I go back and read them; this is September 30th, 2001 -- and I'm writing about, I think, a lot of things that became part of the debate years later. I thought that was really kind of interesting.

Jerod Santo:

Didn't you coin the term "security theater"?

Bruce Schneier:

Security theater - I invented that term. I think that's my contribution to popular culture, if that's what you wanna call it - the notion of security theater. The other thing I was gonna call it was Potemkin security. But it turns out that surprisingly few people younger than me recognize the term "Potemkin village". It's a Cold War term that people don't know anymore.

Jerod Santo:

Did that term come out of the post-9/11 PATRIOT Act?

Bruce Schneier:

No, Potemkin village is from communist Russia.

Jerod Santo:

No, I mean the security theater. When were you thinking about it?

Bruce Schneier:

Security theater - yes. I coined the phrase soon after 9/11. Wikipedia has the actual origin...

Adam Stacoviak:

What does it mean?

Jerod Santo:

What's it mean? It means people are acting like it's secure, but it's just for show.

Bruce Schneier:

Yeah. Security theater -- so the example I would use is right after 9/11... I don't know if you remember, there were National Guard troops stationed in airports. They were just inside security, off to the side, in uniform, holding a big gun... Those guns had no bullets. Because - my God, a 22-year-old with a gun in an airport; what could possibly go wrong? You do not want to give him ammunition. But it was there to make people feel better. It actually was theater, to make people feel safer about flying.

Jerod Santo:

Have you got any modern examples of security theater, things that are going on today maybe?

Bruce Schneier:

Yeah, there's a lot of Covid theater. There are a lot of health measures that make no sense. There are people wiping down their mail...

Adam Stacoviak:

It's amazing what FUD will do to you.

Bruce Schneier:

It is amazing.

Jerod Santo:

Wear the mask on the way into the restaurant, but once you sit down you're safe...

Bruce Schneier:

Right, and then you take it off... What is it -- what are we doing here? And you know, some of that is valuable, because if people are more afraid than they should be, then a little theater is good... But some of it just makes no sense.

Jerod Santo:

Yeah.

Adam Stacoviak:

It's perception, too. It's like a perceived threat.

Bruce Schneier:

It's all about perception. Because fear is the perception.

Adam Stacoviak:

Yeah. Even to yourself.

Bruce Schneier:

Security is a feeling and a reality. It's both. And they are different.

Adam Stacoviak:

Yeah.

Bruce Schneier:

You can feel secure when you're not, and you can be secure and not feel it.

Jerod Santo:

Yeah. There's an old saying, "Just because you're not paranoid doesn't mean someone's not out to get you."

Bruce Schneier:

And just because you are paranoid doesn't mean people are out to get you. It works both ways.

Jerod Santo:

Exactly. You can say it either way. It's both true.

Adam Stacoviak:

Precisely.

Jerod Santo:

Yeah, that's why I like it. That's funny...

Adam Stacoviak:

\[57:56\] Jerod asked the question before about developers and what they could do, building systems tomorrow... And you kind of mentioned some of the things they could do, which was essentially to find somebody in policy and hire them - though the supply of them is challenging, because the path is challenging... What else would you share with today's technologists that they need to know? Things you're preaching that software devs, engineers, leaders of engineering departments, people building products should know about the state of security today, that they don't know.

Bruce Schneier:

What does it say about the state of the world that we're used to thinking what we do ends at the keyboard and screen? It turns out that's not true. The stuff we write affects the world, affects society, affects people, affects human flourishing. And we need to think that way. We really do. We need to think that when we build a software product, we're building the world. This is an old story, but I think it's a good one... I don't know if you remember Friendster. Friendster was a social network before MySpace.

Jerod Santo:

Right.

Adam Stacoviak:

Yup.

Jerod Santo:

Really old.

Bruce Schneier:

And they had something called the Top 8. You could have as many friends as you want, like on any social network, but the Top 8 would appear on your home screen. It was not six, it was not ten - it was eight. For whatever reason, some programmer decided on 8. Power of 2; we're good. And in high schools all across the country, who your top 8 friends were suddenly mattered.

Jerod Santo:

Right.

Bruce Schneier:

Now, the engineers just picked a number. But wouldn't it be great if there was a teen psychologist who said "No, no, no, if you make eight, it's gonna be a disaster. Make it 12. You must make it 12." The engineer would say "Okay, it's 12."

Jerod Santo:

Unintended consequences.

Bruce Schneier:

Right. And it used to be that the unintended consequences didn't matter. Nobody cared how Usenet worked, because Usenet wasn't important, ever. No one really cared in the beginning how email worked. But now it matters. Now the unintended consequences can affect democracy. And maybe we should pay a little more attention to that.

Jerod Santo:

Yeah.

Bruce Schneier:

So my advice is that your tech system is not only a tech system, it is a human system fundamentally, and you need people who understand that on your design and development teams.

Jerod Santo:

So you're saying "Don't move fast and break things."

Bruce Schneier:

Move deliberately, and fix things.

Jerod Santo:

\[laughs\]

Adam Stacoviak:

There you go.

Jerod Santo:

There you go. Bruce, one thing I've noticed, and you've confessed it here today, is that you live in your email. I guess in a sense it's kind of your primary social network. That's how you communicate...

Bruce Schneier:

Yeah, and I'm not on any social network. Email is my life. I come from the generation where email is my life. And of course, kids these days don't use email. They use text. They'll send me a text, and I'm like, "What are you doing?! Just send me an email."

Jerod Santo:

I'm sending you a calendar invite, and you're like "Please, don't. Just email me."

Bruce Schneier:

Don't send me a calendar -- just send me an email. Stop it.

Jerod Santo:

\[laughs\]

Adam Stacoviak:

Keep it simple.

Jerod Santo:

That's made me wonder about some of your personal practices, whether that's privacy, or security best practices... What do you do in your life of technology that may be different or unique, or at least notable for people who wanna be like you?

Bruce Schneier:

You know, I wouldn't recommend what I do to anybody, because a lot of the stuff I do that's unique is not using normal technology, like calendar invites.

Jerod Santo:

Okay...

Bruce Schneier:

I don't use the cloud, for anything. That makes me a weirdo. I don't keep my email in the cloud.

Jerod Santo:

Is that because you know better?

Bruce Schneier:

No, because I've always done it my way, and that means I can do my stuff without having an internet connection.

Jerod Santo:

Okay.

Bruce Schneier:

I hate Google Docs. And it does make me a freak and hard to get along with. So I am hard pressed to give you my advice as something to follow. I think I'm a cautionary tale of something to avoid.

Adam Stacoviak:

But yet you do it.

Bruce Schneier:

But yet I do it. And I can get away with it, because I can be ornery and you'd still want me on your show. But if I was someone less important, you'd say, "Who is this idiot? We're not gonna interview him. He doesn't even use calendar invites."

Jerod Santo:

Well said, well said. You don't use the cloud... Are you involved -- at least there's a tie to the Solid project. I wanted to ask you about that project: Tim Berners-Lee, Solid. We talked about decentralized networks with cryptocurrencies, but here's one that's decentralized storage, and it's got, of course, Tim Berners-Lee attached to it... So it sounds interesting. Are you attached to that somehow? Are you working on that?

Bruce Schneier:

\[01:02:27.04\] I am. So I'm a big fan of decentralization that doesn't use a blockchain.

Jerod Santo:

Okay.

Bruce Schneier:

Email is decentralized - I can send an email to anybody, regardless of what they're using, which is different from, like, a Facebook message. SMS is decentralized. Web pages are decentralized. Decentralization is great. And Solid is a vision of decentralized data... The idea being - right now your data is siloed. Fitbit has your health data, and your phone has your location data, and someone else has your photographs... And on and on and on. Wouldn't it be great if all your data was in one place, and you got to decide? And then you could do things with your data that you couldn't do otherwise. I don't know - my airline has a lot of my data in its frequent flier program. So does the hotel program I like. They don't actually want my data, they just want access to it. My data would be in my - and the term Solid uses is "pod". I have control over it, I can see who accesses it, I can give permissions... If my address changes, I change it in my pod and it propagates everywhere... I had to download an app - because I'm going to Spain - to type in my health information, to get a QR code, so when I land in Spain I can show it and get into the country... So I'm entering my data, again and again and again. That doesn't make sense. And once this app has it -- I don't even know what they're gonna do with it. I have no idea what Spain is gonna do with my data. For all I know, they're gonna sell it to Cambridge Analytica. They could. Who knows. And this is a way of thinking about data that puts people in control. It's a way that actually solves the problems GDPR tried to solve. So yeah, I'm involved, and I think it's a big deal, I think it's really important, and I think it's valuable. The fact that it's Tim Berners-Lee - it could conceivably change the world; he has a track record of doing that... So yeah, I'm super-excited about it.
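
Conceptually - and this is a generic sketch of the pod idea, not Solid's actual API (Solid itself is a set of W3C web standards, with client libraries from Inrupt and others) - the model is one authoritative copy of your data, permissioned per agent, with every access visible to the owner:

```python
# Conceptual sketch of the "pod" idea - not Solid's actual API.
# One authoritative copy of the data, per-agent permissions, and an
# access log the owner can audit. All names here are illustrative.
from datetime import datetime, timezone

class Pod:
    def __init__(self, owner: str):
        self.owner = owner
        self.data = {}        # field -> value: the single authoritative copy
        self.grants = {}      # agent -> set of fields it may read
        self.access_log = []  # (time, agent, field): who looked, and when

    def grant(self, agent: str, fields: set[str]):
        self.grants.setdefault(agent, set()).update(fields)

    def revoke(self, agent: str):
        self.grants.pop(agent, None)

    def read(self, agent: str, field: str):
        if field not in self.grants.get(agent, set()):
            raise PermissionError(f"{agent} has no access to {field}")
        self.access_log.append((datetime.now(timezone.utc), agent, field))
        return self.data[field]

pod = Pod("bruce")
pod.data["address"] = "123 Main St"            # change it once, here...
pod.grant("airline.example", {"address"})
pod.grant("hotel.example", {"address"})
print(pod.read("airline.example", "address"))  # ...everyone reads the same copy
pod.revoke("hotel.example")                    # and access can be cut off
print(pod.access_log)                          # the owner sees who looked, when
```

Revoking a grant cuts off access immediately - the opposite of every silo keeping its own stale copy of your address.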

Jerod Santo:

What's the status? Is it usable, is it private?

Bruce Schneier:

It's a couple of things. It is a W3C standard, so it's a web standard. There's also a company - I'm actually involved in a company called Inrupt, which is basically the Red Hat of Solid. They are making a commercial server and system for the public standard. So there's a free server, and all kinds of free tools, but there's also this commercial series of tools.

Jerod Santo:

Can you use it today?

Bruce Schneier:

Yes, you can use it today. You can get your pod, you can do it... You kind of have to be a techie to use it today. It's like the early days of the web; you had to write HTML to use it. The early browsers were not at all intuitive. So it's early for regular people. But for techies it works great.
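
For the techies he mentions, here's roughly what "using it today" can look like at the lowest level. A Solid pod exposes data as ordinary HTTP resources, so a public profile document can be read with a plain GET. The account in this sketch is hypothetical (solidcommunity.net is a community-run pod host, but "alice" is made up), and anything private or writable would additionally need an authenticated fetch, for example via Inrupt's client libraries.

```typescript
// Minimal sketch: read a public Solid profile document over plain HTTP.
// Runs as-is in Node 18+ or any modern browser (global fetch).
// The pod account is hypothetical -- substitute a real pod URL.

const profileUrl = "https://alice.solidcommunity.net/profile/card";

async function readPublicProfile(url: string): Promise<string> {
  const response = await fetch(url, {
    headers: { Accept: "text/turtle" }, // pods serve RDF; Turtle is the usual format
  });
  if (!response.ok) {
    throw new Error(`Pod returned HTTP ${response.status} for ${url}`);
  }
  return response.text(); // Turtle triples describing the profile
}

readPublicProfile(profileUrl)
  .then((turtle) => console.log(turtle))
  .catch((err) => console.error(err));
```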

Adam Stacoviak:

I think it's got good premises, though. It's like: here's my data, and I have integrations to that data, and I can give them permission, I can fine-tune those permissions... But the rest of the world has to begin to accept it; like a Hilton Honors club, or airlines...

Jerod Santo:

Right.

Bruce Schneier:

And it's gonna happen slowly. But think about it from a liability perspective. Marriott was hacked by the Chinese, and they lost everybody's data. So having everybody's data is a huge liability for Marriott hotels. What they actually want is access to your data when they need it. If they knew they had that access, they wouldn't need to store a copy locally, because that is just dangerous.

Jerod Santo:

Right.

Bruce Schneier:

But you can't guarantee access, so they need to store a copy locally. Fixing that I think is important.

Adam Stacoviak:

\[01:06:05.11\] Have you ever had to write a contract and have somebody sign it?

Bruce Schneier:

Yeah.

Adam Stacoviak:

It's challenging though, right?

Bruce Schneier:

Oh, yeah. Because contracts are very human.

Adam Stacoviak:

Right. So the reason why they're five years and not one year is just because every time you've gotta go back, you're reminding them. So I think maybe the challenge with Solid, however, might be "Okay, Hilton wants access, but man, they're accessing it quite often. Way more than I want." Whereas if they actually had it, they could do whatever they wanted with it, and access it whenever they wanted.

Bruce Schneier:

Which we don't want, actually. We don't want to have that kind of --

Jerod Santo:

Right.

Bruce Schneier:

I would like to know when they're using it, and what they're using it for. That seems fair.

Jerod Santo:

Yeah. And maybe you could even build payment layers on top, so now instead of Facebook selling my data, I can --

Bruce Schneier:

Sure they can. I've just changed banks, and I had to give them a whole lot of data. Why can't I just say "Here. Here's my pod. You now have access to all that data. Done"?

Jerod Santo:

Right. Or "For a dollar, you can have access to my data."

Bruce Schneier:

Well, you know, but I'm opening a bank account, so I kind of want them to have it.

Jerod Santo:

Oh, I know. I'm just thinking in general.

Bruce Schneier:

So there's a transaction here. I want to give them the data.

Jerod Santo:

Sure.

Bruce Schneier:

I just don't wanna type all the damn stuff in again.

Jerod Santo:

Right. That reminds me - you have this great quote from the Data and Goliath book. You said "Data is the pollution problem of the information age, and protecting privacy is the environmental challenge." I like that casting of it. That plays well into this whole Solid idea, doesn't it?

Bruce Schneier:

I do, too. And it's actually a pretty rich metaphor.

Jerod Santo:

Yeah.

Bruce Schneier:

Because if you think about it, all computer processes produce data. It stays around, kind of festering... We spend a lot of time talking about its reuse, how it's recycled, how it's disposed of, what its secondary characteristics are. And if you think back to the early decades of the industrial age, we as a society kind of ignored pollution in our rush to build the industrial age. Today, we are ignoring data in our rush to build the information age. And just as we look back at those people a hundred years ago and say "How could you have been so short-sighted?", we will be judged a couple of generations from now for being just as short-sighted. So I actually think the metaphor is really robust. Data is the pollution problem of the information age.

Jerod Santo:

Well, I'll be fascinated to see how Solid goes. Hopefully it gets adoption, because I do think, from what I've read about it and what you're telling me about it, I think it has a lot of fundamental things done well.

Bruce Schneier:

There's a huge chicken-and-egg problem in all of these... But we're getting traction with governments, oddly enough. The notion of a country giving every citizen a pod... Because governments also don't want their citizens to have to type the same stuff in again and again. And they want them to be able to share data among different government agencies. So mostly in Europe - but governments seem to be the early adopters here... Which is weird, because "government as early adopter" - it's like an insane thing I just said.

Adam Stacoviak:

Yeah. That was surprising, actually, too.

Jerod Santo:

Well, on the note of policy meets technology meets repetition, I would just like to take a moment to say to the Stack Overflow folks - I've already accepted your cookie policy, okay? I don't wanna accept it every single time I come to your website.

Adam Stacoviak:

Oh my gosh, yes...

Jerod Santo:

Every single time. It's like, you should remember I just said you can keep my cookies. So put that information in a cookie and store it, so I don't have to accept your cookie policy every time.

Adam Stacoviak:

Or just put it into the browser. Bake it into the browser.

Bruce Schneier:

They might be required by the regulation to ask you every time. I don't know the answer to that. But it's interesting.

Jerod Santo:

Yeah.

Bruce Schneier:

Now, can we solve that by you having your cookie policy in your browser, so it would check "What is this person's cookie policy? Did he change his mind?" We need to give you the ability to change your mind, so how do we do that? Right now we're solving it the dumb way, by asking you every time: "Do you consent? Do you consent?"

Jerod Santo:

\[01:10:20.21\] Right.

Bruce Schneier:

Maybe you can put your consents in some kind of accessible document that they can look at. But here again, this is the problem: we have to ask you every time instead of saying "We asked you already."
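
There is, in fact, an early mechanism in exactly this spirit: Global Privacy Control, a draft browser signal that some browsers and extensions already send. Below is a sketch of how a site might combine that signal with a remembered answer; the storage key and the overall flow are hypothetical illustrations, though navigator.globalPrivacyControl and localStorage are real browser features.

```typescript
// Sketch: honor a browser-level privacy signal and a remembered answer
// before asking the user yet again. The storage key is hypothetical;
// navigator.globalPrivacyControl (Global Privacy Control, a draft spec)
// and localStorage are real browser features.

const CONSENT_KEY = "cookie-consent"; // hypothetical storage key

type Consent = "accepted" | "declined";

function storeConsent(choice: Consent): void {
  // Remember the answer -- the very information a consent cookie should hold.
  localStorage.setItem(CONSENT_KEY, choice);
}

function resolveConsent(): Consent | "ask" {
  // 1. A browser-level "do not sell/share" signal answers for the user.
  if ((navigator as any).globalPrivacyControl === true) {
    return "declined";
  }
  // 2. A previously stored answer means we already asked.
  const stored = localStorage.getItem(CONSENT_KEY) as Consent | null;
  if (stored !== null) {
    return stored;
  }
  // 3. Only now do we actually have to show the banner.
  return "ask";
}
```

Whether a regulator would accept a remembered answer instead of a fresh prompt is, of course, exactly the open question Schneier raises above.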

Jerod Santo:

It sounds like something that they might build into the Google Chrome browser. And I know from our previous conversations that you refuse to use such things. I'm curious --

Bruce Schneier:

Yeah, but if they build it in the Chrome browser, it would default spy on you.

Jerod Santo:

\[laughs\]

Adam Stacoviak:

Default spy on you... \[laughs\] Yeah.

Jerod Santo:

Firefox fan over here.

Bruce Schneier:

I am a Firefox user, yes.

Jerod Santo:

Fair enough. Fair enough. Alright, Bruce, we've used lots of your time; I really appreciate this conversation. Adam, any other questions before we let him go?

Adam Stacoviak:

I'm clear.

Bruce Schneier:

This was fun!

Adam Stacoviak:

This was a lot of fun, Bruce. I appreciate you. I wanna catch up on one of your books that was mentioned at the top of the show... That's cool.

Jerod Santo:

Which is the one that we should read?

Bruce Schneier:

I love them all, for different reasons... The newish books that are worth reading - I'm staring at my shelf... Data and Goliath is about data and privacy. After that I wrote a book called "Click Here to Kill Everybody" - my favorite title - which is really about the internet of things and safety. And before that I wrote a book called Liars and Outliers, which is about trust, and how systems enable trust. So those are my three most recent. I'm coming out with a book next year - which is due in two weeks, so I'm kind of panicky about it - which is really about hacking society, broader social systems.

Adam Stacoviak:

Wow.

Jerod Santo:

What's that one gonna be called? Can you tease us?

Bruce Schneier:

Probably "A hacker's mind." We're still finalizing a title.

Jerod Santo:

Nice.

Adam Stacoviak:

Very cool.

Jerod Santo:

Yeah.

Adam Stacoviak:

Bruce, thank you so much for all your wisdom. And honestly, the book-writing - while you may not become rich and famous because of it, you will be rich in terms of helping other people.

Bruce Schneier:

Nobody writes books to make money, with the exception of the top New York Times bestseller thriller writers. Nobody writes books to make money.

Adam Stacoviak:

Your wealth is in the appreciation of the knowledge you're sharing, so that's my point.

Bruce Schneier:

You know, for someone to say "I read your book in college and it changed my life" - that's like the best compliment you can get.

Adam Stacoviak:

Yeah. On that note, that's the point - thank you for sharing your wisdom. We appreciate that.

Bruce Schneier:

Hey, thank you for having me.