Cory is a science fiction author, activist, journalist, co-editor of Boing Boing and the author of many books. We talked to Cory about open source, the open web, internet freedom, his involvement with the EFF, where he began his career, the details he'll be covering in his keynote at OSCON, and his thoughts on open source today and where developers should be focusing their efforts.
Code School – Learn to program by doing with hands-on courses. There’s a path for everyone at Code School. It’s the best place to start learning new technologies.
Toptal – Scale your team and hire the top 3% of developers and designers at Toptal. Email Adam at
firstname.lastname@example.org for a personal introduction to Toptal.
Linode – Our cloud server of choice! We host everything we do on Linode servers. Use the code
changelog20 to get 2 months free!
This episode was produced in partnership with O'Reilly Media and OSCON. Use the code
changelog20 to get 20% off your registration.
Welcome back everyone! This is the Changelog and I'm your host Adam Stacoviak. This is episode 221 and today Jerod and I are talking to Cory Doctorow, a science fiction author, activist, journalist, co-editor of Boing Boing and author of many books. We're producing this show in partnership with O'Reilly Media. We'll be at OSCON next month in London. Use the code "changelog30" if you want to get 30% off your registration; head to OSCON.com/uk to learn more and register.
We talked to Cory today about his involvement with the EFF and where he began his career. He shared some details he'll be covering in his keynote at OSCON about open source licenses and the potentially dark side of open source software if we don't do it right.
Let's say you wanna learn React - you can start Level One of Code School's React course, which begins with a quick video lesson on React components. After the video, you get hands-on practice building with components using in-browser coding challenges. There's no hassle of setup, just learning. There's a path for everyone at Code School. Head to CodeSchool.com to get started and learn by doing. Now on to the show.
Alright, we're back. We've got a fun show and, Jerod, this is another show we are doing in partnership with O'Reilly as part of OSCON London. Big show today, the keynote speaker is here, Cory Doctorow. He doesn't need much of an introduction, but if we were giving him one, Jerod, what would it be?
Well, many people know him as the fiction author, science fiction; he's an activist with the EFF, a journalist - many things; blogger... Cory, first of all, welcome to the show, and then maybe tell us how you'd like to introduce yourself to people.
You know I usually say I'm a science fiction writer and activist, and then if pressed I say a few things more. When I lived in the UK before I became a citizen, it was always a problem because when you'd land there, if you're not a citizen, you'd get these landing cards and they've got this 3 cm long blank in which you were supposed to write occupation, and I always wanted to say "See attached" and then have my Wikipedia entry.
Well, there you go. [laughs] Well, we'll just say see attached and we'll include your Wikipedia entry in our show notes. Surely, our audience is probably well versed with you and your work over the years, especially as co-editor of Boing Boing and the many books you've written. In fact, Adam, I believe you first found out about Cory through a book that he wrote.
I'm glad you brought that up because, Cory, I don't know if you know this, but I'm sure that you get this often - people know you by what you've written before. That's one of the ways.
For me, it goes back to an early thing you've done, which was a commissioned piece you'd actually written, a science fiction story about Google, the day they became evil. I found Scroogled, read it end-to-end, which was a short read anyways, but it changed my life and that was like the earmark of my life of knowing Cory Doctorow. So that was a cool moment for me.
You know, there's a super cool - going back to audiobooks - there's a super cool reading of that, that Wil Wheaton did in my short story collection with a little help; that's a free MP3 download or pay-what-you-like MP3 download.
[00:03:55.00] We'll definitely share that because I like reading it, but if it's read like audiobooks are read, not just "and here's the story", it's actually got some reading behind it, I love it.
Wil Wheaton would definitely do it.
Wil is a killer reader.
For those who aren't familiar with that story, it's the day Google became evil. You wrote this short story - I guess I learned today, as part of the research to doing this call with you, that it was actually a commissioned piece, so somebody commissioned you to write that for them. But it was basically about Google becoming evil, INS, somebody goes out of country, gets locked out, passport, this and that, you know your story... But for listeners, can you give a quick, brief synopsis of that story, just so they know?
Sure. Well, you know, I'm always reluctant to think of science fiction as a predictive literature particularly, but the one thing that story did that I think really paid off is that it's about Google making a compromise on its metadata. Google's always had this position that they are not gonna let Uncle Sam spy on their users, but in the story they say "I'll tell you what, US Customs authority, we will show you the same ads that this user sees when they look at their email, and through that you can make up your mind about whether or not you trust them when they enter the country", and we've actually had proposals like that since. I mean, obviously the thing we learned from Snowden is that metadata is considered fair game and has been spied on at great length and in great depth, but also that since then, the US Customs and Immigration Service has proposed - or had a notice of proposed rulemaking to see whether or not they should be allowed to tell people who enter the country that they're required to give their social media handles, and then allow them to do data mining of their social media presence as a condition of entry. That's a motif we see in lots of other places; landlords are now doing it in a lot of places, employers have periodically made the news for requiring that their employees give them not just their logins, but also their passwords, so their employers can log into Facebook and see what's posted in private. That sort of thing.
It's so funny that you wrote that so long ago... Later on we'll ask you about things you've written as a science fiction author and how inventions have become real, but basically you kind of teed up this idea of this world we could live in, and then some of it, not the exact truth of what you wrote, but some of it is actually playing out. Jerod, he just made me think about that recent show we did with Ben Bixby, TensorFlow...
Jerod: Eli Bixby.
...and like just passing somebody's social handle into a deep learning algorithm and seeing what comes out of it.
Yeah. I think that what science fiction does, it's like what a doctor does when you go in and say that you've got a sore throat and she'll swab the back of your throat and rub it on a Petri dish and leave it for the weekend, and then come back and look through a microscope to see what's going on. She's not making an accurate model of your body, she's making a usefully inaccurate model of your body; a science fiction writer reaches in and plucks out a single technology or idea and builds a world around it that's not meant to be a model of the world as it is or can be, but a model that's useful because it's not like the world is going to be, it's an exaggeration, or it's like that one fact becomes reified as the most important fact in that world.
It also can be a somewhat self-fulfilling prophecy as well, as you're often influencing or inspiring the engineers and the scientists and the people who are creating the things of the future in either direct or indirect ways.
Yeah, I actually call that the opposite of a prophecy, I think that's inspiration. The whole problem with the idea of predicting the future and with fortune telling is that it's this intrinsically fatalistic idea that the future is knowable and fixed. The significant thing about the future, the only reason to care about it, is that we can change it, and so what science fiction can do at its very best, is it can make different futures than the ones we are headed towards manifest, which is a very exciting idea.
[00:08:12.24] Well, it's funny, because I was on Netflix last night, and I was like "What!? Star Trek episode one is on here? My dad would love it." He's not alive, but he'd be like "That's awesome, let me go back and watch this", but I was thinking the iPhone is kind of reminiscent of "Beam me up, Scotty", that whole device they had to do different things and this magical handheld device, and while it may not have been prescriptive, it was sort of hinting at what might come, and life sort of evolves from the art and art evolves from life.
Well, and even more than that, those Motorola flip phones that looked exactly like Star Trek communicators; they didn't look exactly like Star Trek communicators because Gene Roddenberry had a crystal ball that showed him what would happen at Motorola, it was something way more awesome than that - Motorola engineers grew up wanting to live Gene Roddenberry's future. Give me the choice between those two, like knowing what's coming and changing what's coming, and I'll take changing any day.
Yeah. So let's pause there for a second... I know we riffed quite a bit on some fun stuff, but before we go much deeper, for those who don’t really know Cory, how do you describe yourself? Take us back, introduce us to who you are, but maybe even take us back to where you got your start, the paths you've taken around activism, around EFF, and freedoms, and things like that.
Well, I grew up in Toronto and in the mid-90s I started commuting to Silicon Valley to do systems integration for an IRIX shop - Silicon Graphics UNIX shop - and through that became more and more interested in open-source software and network administration. Then, as the web took off, I became the CIO of a web services company and then started a DotCom that did free and open-source peer-to-peer search, that we raised some money on and that we had an acquisition offer for. That made our investors see dollar signs, so they took all of the founders' equity and in the ensuing chaos, the acquisition deal fell through and the company just imploded.
While we were working on that, because it was doing peer-to-peer and file sharing, it was really involved with the legal fights of the day, which revolved around Napster, and our best legal advice came from Electronic Frontier Foundation. A lot of our programmers were also members of the Cult of the Dead Cow, which was this amazing hacker group, and CDC hooked us up with the Electronic Frontier Foundation and I got more and more involved with them.
As the company started to implode, I quit my job, quit the company I'd started and went to work for Electronic Frontier Foundation. All this time, I'd been writing novels and stories and they started to sell around this time, as well. I took a job with the EFF overseas, I went to London to be their European Director and worked on digital standards and treaties at the United Nations and then in Brussels, mostly on killing DRM. I was also all this time writing this weblog called Boing Boing, that started off as a hobby that some friends and I put together. It was founded by my friend Mark Frauenfelder and his wife Carla Sinclair, and then it built into a fairly big, significant commercial concern, with 9 million unique readers a month and still going strong. So I was doing that as well, and that was also a useful platform for talking about the political work I was doing and my writing.
[00:11:38.26] When my writing got to the point where it was occupying too much of my time to do a proper job at EFF, I did what I always told them I was gonna do - I quit to write full-time, but like literally within seconds of announcing that I was gonna do that, I got offered a Fulbright at the University of Southern California to go and teach about DRM in LA, and my wife at the time was working for the BBC. She's British and she was able to transfer to BBC America, so we came and lived in LA for a year and moved back to the UK in 2007 with my wife pregnant at the time. She delivered our daughter in our flat in East London, and we've lived there ever since.
I did some contract work for Disney Imagineering and wrote books and went and spoke to people, and did other bits and pieces, but I became more and more alarmed about the proliferation of DRM. It started off as a harmless folly that was used to lock up game consoles and DVD players, but because it has this law, the DMCA, that says that breaking DRM is illegal even for legal purposes, companies started adding it to cars and tractors and insulin pumps and cat litter trays, and arranging the DRM so that you had to break the DRM to do things like put your own detergent in your cat litter tray, or to broadcast your own seed using the soil density data that your John Deere tractor had gathered while you drove it around your field and the torque sensors on your wheels gathered centimeter-accurate soil density data. And things that are DRM'ed are off-limits to security researchers, because knowing about a defect in a product helps you break the DRM. And DRM can't be implemented as free and open-source software, because it's obvious on its face that if you're designing a program that treats its users as its adversary, then making that program modifiable by that user is not a good idea from a security model perspective. If there is a flag in the source that says "DRM On = 1" and the user doesn't want the DRM - which I think universally users don't want DRM; no one woke up this morning and said, "I wish there was a way I could do less with my music" - then you know someone is gonna just turn that one into a zero and recompile your program.
Free and open-source software is antithetical to DRM, so we have this existential threat to free and open-source software and this existential threat to security at the same moment that software was metastasizing and invading the world and moving into our light bulbs and our baby monitors, and I thought this was a terrible thing.
I came up with an idea for fixing it that was built around first launching a lawsuit to invalidate section 1201 of the DMCA, which protects DRM, and then going around the world and getting other countries to drop their own versions of the DMCA, and also doing some activism with standards bodies - the W3C, the World Wide Web Consortium, is adding DRM to the core suite of web standards, and I'm trying to get them to abandon that... So I came up with this plan and I pitched it to EFF, and they said "That's an awesome idea, but we don't think anyone here has the bandwidth for it, but it sure sounds like something you'd be good at... Hint-hint-hint!" and so I said I was gonna take a couple of years off from writing books full-time. I'm writing slowly now, a page a day on the third Little Brother book, and then I would go back and work half-time for EFF on this, which I started doing.
The Director of the MIT Media Lab, Joi Ito, gave us a grant. He made me his Activist in Residence, and so that pays my way there, and this is my gig, I'm gonna kill all the DRM in the world within a decade, that's my project.
And I continue to write novels and I continue to do some contract work for Disney, which is always ironic, but I'm a giant theme park nut and their Imagineering Organization's amazing.
Yeah, to hear the Disney Imagineering part of your story was surprising, honestly.
Well, I get to do super cool work, and Disney being who they are, I can't tell you about any of it, which is hilarious...
It's like a human side of DRM, right? You can't say something that's a secret. They've locked you.
[00:15:55.07] Well, it's just confidentiality. I don't think you can get a privacy advocate to say that confidentiality is bad. I'm okay with that; I just think it's over the top. I don't think there is any rational reason for me not to tell you this. I think they've just decided that evaluating when it is and isn't in their interest to allow people to speak on the record about the work that they do would cost them more than any gains they would get from allowing the people who aren't gonna reveal anything sensitive to talk about it in public. So I think they've just made this self-interested, totally rational decision, that rather than figuring out when it's okay for people to talk, they're just gonna tell everyone they can't talk. It's just a pain in the ass. It's like Indiana Jones, that last scene - you make this amazing thing and then they stick it in a vault and they lock the door.
It definitely levels up the level of intrigue, though. People now are so curious what it is you imagineered while you were working with them.
I did some really cool things that I am happy to have done. I'll leave it there. So I did that, and I've done lots of other cool stuff. I continue to do other cool stuff with Boing Boing, and what not. I'm a visiting professor of Computer Science at the Open University in the UK and I still do some stuff with them, and I co-founded the Open Rights Group in the UK, which is a kind of analog to EFF there, and I sit on their Advisory Board.
It's a lot of work on the freedom front to have the ideas you've had, and then also be trusted by these people that you've befriended over the years, and like, "Hey, just go overseas and do this job"... Maybe you were qualified, maybe you weren't - it's hard to say for sure based on a story, but obviously you were, because you've done the job. But to get that kind of authority so early and so easily - maybe not so easily, maybe so quickly...
Well, it wasn't easy. I'd worked for the EFF for a couple of years at that point, and I'd been doing that work already in the US, and I started going overseas to do that work, but based in America, and that was not super efficient; it was expensive and it was exhausting, and so I moved overseas to do it full-time. So I was in 31 countries in three years representing EFF and doing its work.
So we've got these notes here obviously, and you are keynoting this conference called OSCON. It's a pretty popular conference that people have heard of, I'm sure, over in London, back in your original stomping grounds, or at least your wife's, and you, by happenstance...
Yeah, I'm gonna stay in our apartment there, we rent it out. We're gonna stay in our apartment there, saving O'Reilly on the hotel.
There you go. So the keynote is titled "How You Got Here" and I think just the opening to this show, I really cannot wait to hear this keynote because someone like you sharing about the open source landscape, especially around DRM and that whole philosophy of DRM being at odds with the ideas of open source, I can only imagine what you are gonna cover. Why don't you share a brief bit about that keynote?
Sure. As I mentioned, I've got an eight-year-old and she was born in London in our flat, in a pool in the living room. She likes to hear the story of her birth, right? So we would tell her the story of her birth... My wife told her that if she came and yanked on my arm and shouted "Story arm!", I would tell her a story. So she runs up whenever she is bored and she yanks at my arm and yells "Story arm!" And being a writer, I like to iterate when I tell stories, so I'd tell it a little differently every time. I'd start a little further back or go a little bit further forward.
What I realized was that the interesting part of her birth story was the stuff that led up to her birth, and not like the stork stuff, but the stuff about how my wife and I met and became best friends and lovers and a couple, and got married and decided to have a baby. You know, all of that stuff is unique; the stork stuff is the same for everybody. What your parents did to make you is almost certainly something I can guess at with a pretty high degree of accuracy, right? But how your parents came to make the decision to do that and make you - everyone has a different version of that.
[00:20:08.14] The open-source version of how we got here, we talk about the licenses and we talk about the packages and milestones, but there's like this really strong social component to how we got here, because around the same time that the open source movement was starting, it was also around the same time that the open web movement was starting, that we were sunsetting these proprietary network architectures - whether those were the ones the phone companies ran... You know, AT&T's circuit-switched, services-centric network, or the big commercial services like CompuServe and AOL... They both kicked off around the same time, and yet the open web has collapsed, the open web is almost dead.
We are in a desperate and dire moment for the open web, and the free and open-source software movement has soared; everything, including the things that are closing down the open web is built on free and open source software. So that is an amazing thing, and the speech kind of interrogates what the difference is, and how one soared and the other sank, and what we can learn from the free and open-source movement to keep the web open as we try to open it up again.
I think the thing that the free and open source software movement had going for it is this thing called the Ulysses Pact. The story of Ulysses goes that Ulysses was gonna sail into siren-infested waters and anyone who heard the song of the sirens would be tempted irresistibly to jump into the sea and the sirens would drown them. So normally, when sailors sailed into the siren sea, they would fill their ears with wax. But Ulysses was a hacker and he wanted to hear what the sirens' song sounded like, so he had his men lash him to the mast so that he could hear it, but he couldn't get loose. So what he used was his strong self, the moment at which he was strong, to predict that in a future moment he would be weak and to take countermeasures to prevent himself from giving into that weakness. We use Ulysses pacts all the time - if you go on a diet, you should throw your Oreos away on night one; not because you're like incapable of resisting temptation, but because everyone sometimes has moments of weakness, and the strongest thing you can do is to recognize that you will have a moment of weakness in the future and take a countermeasure against it.
In the free and open-source world, our Ulysses pact is the irrevocable license, because the failure mode of free and open source software, having founded a free and open source software company, I can tell you is that there are moments in which it feels like your survival turns on being able to close the code that you had opened when you were idealistic. There are moments of desperation when that happens.
Of course, it's ridiculous, because if you're making anything substantial under free and open source software, you're building it on other things that other people have opened and can't close, and if they were to close off their code, your project would collapse. So every one of us wants to be the only one who can revoke a free and open source software license, while all the plumbing that we built on top of stays open. Because the licenses are irrevocable, because you can't close it once you've opened it, you generally don't even get the pressure from your investors or from potential acquisition suitors or from other parties who can otherwise lean on you and put a gun to your head - they don't even bother, because there's no point in shouting at you to close the code if they know that it's not a course of action that's even open to you.
[00:23:44.01] So even though the same desperation that led us to close the web is present for everyone who's ever made an open source project that succeeded, that desperation can't express itself in the same failure mode that the web has had. So my talk is about how we can build a Ulysses pact for a newly opened web around two principles that will keep the web open even in the desperation of its founders, even when the pirates who founded it become admirals.
The first principle is that any time a technology or computer gets an order from its owner that conflicts with an order that's been given to it by a remote party, the owner should 100% of the time, without exception, win. The owner always gets to override remote policy.
The second one is that any true fact about the security of a system that you rely on should always be legal to disclose, under every circumstance. My pitch is that these two principles should be the principles that we become zealots for; that if they're not calling you an unrealistic idealist about your adherence to these principles, then you're probably not trying hard enough. So my pitch is that the people who care about building an open web to be the nervous system of the 21st century, to have an Internet of Things that's not an Internet of things on fire that spy on you and ruin your life, is that we need to like take these principles and cherish them as much as we cherish the core principles of free and open source software, and weave them into our licenses, into our professional codes of conduct, into our membership agreements, into every single piece of what we do, so that there's never any question that this will come about.
There have been lots of times when the governments have tried to pass laws that say "In order to make software, it has to be closed" and the fact that there's all of this critical, open software has meant that those laws died every time. Because you're going back to them and saying, "Well okay, but what you're talking about is throwing away all the infrastructure on which the digital world is built. What are you planning to replace it with when you pass your dumb law?” You know, reality asserts itself, and so if we can create a reality on the ground to assert itself when governments contemplate stupid laws that say that remote parties can override local parties, whether those are crypto backdoors, or DRM, or lawful interception overrides, or any of the other things that have been the parade of horribles of the 21st-century, then we can make a difference.
Fascinating! I think we need to drill down on these principles a little bit more. I also wanna ask you about licenses, and specifically copyright versus copyleft, the more liberal licenses, the more GPL-style licensing... We're heading up against our first break. This is gonna be a great keynote, I'm fascinated already. Adam, you're gonna have to wait until they put the video online.
I know, I'm not going. I'm so sad.
Because I will be there with Cory at OSCON London; you can be there too, we have a discount code in the show notes. Check that out for 20% off. See that, see Eli Bixby talk about TensorFlow, come hang out with at least one-half of the Changelog. We are gonna talk to Cory more on the other side of this short break and we'll be right back.
Alright, we are back with Cory Doctorow, talking about saving the open web, the future, and this proposal he has. He'll be giving a keynote at OSCON London upcoming in October; all about how this Ulysses pact that we have with open source licensing and software has really saved us in times of weakness, and how the web is in this time of weakness, in dire straits, and we need to save it. Cory has two principles he's trying to impart as ways that we can protect ourselves against failure. Cory, give us a little bit of a rehash; we have these two principles. The first one is that the first party should always be able to override remote parties.
If you own a device, you should always be able to tell it what to do, even if someone else who's not you gives it an order.
And second one, any true fact should be legal to disclose.
Any true fact about the security of a computer that someone relies on should always be legal to disclose.
Okay, very good. So my first question - and maybe some of this is the work you're doing with the EFF and can only be enforced -- we have to write laws for these things, but how do you convince everybody else that these two principles are the way to go?
Well, there's a bunch of different things. Starting with the question of whether or not people who own computers should be able to decide how they act, there's different appeals. There's a pure property appeal, which is like when you own stuff, it should do what you tell it to do - that's what ownership means. If it’s your insulin pump or your car, or whatever... Cars can be steered into people, and we make it illegal to do that, but we don't try and design cars that are incapable of being steered into people or incapable of being driven over the speed limit, or anything else. A car is just a computer you put your body into and then pray that its software is accurate, while it hurtles down the road at 100 kilometers an hour.
Think about that Jeep hack last summer, where it turned out that 1.4 million Jeep Cherokees could be driven over the Internet - steering, brakes, everything. It's not that that car is just a computer, but the single most salient fact about that car is that it's a computer; take the software out of that car and it ceases to work, just as thoroughly as if you take the gasoline or the engine out of the car. A voting machine is a computer we put democracy inside of... And so the idea that you should be able to tell the stuff you own to do what is in your interest, that is a no-brainer.
If you don't accept that, let's think what the consequences might be. Let's assume for the moment there are times when it's legitimate to let other people give your device orders that you disagree with, and have the device choose them instead of you. So first of all, we have an authentication problem, because anything that the manufacturer can order your device to do, or that law enforcement can order your device to do, is also a thing that anyone who can steal the credentials of, or successfully impersonate, the manufacturer or law enforcement can order it to do; it's also a thing that manufacturers and law enforcement who are operating in territories that we don't think of as being in accord with the rule of law get to do.
[00:31:39.09] We talk a lot about self-driving cars and whether or not the police will have a way to just send an instruction to a self-driving car, causing it to pull over, or whether you could ever have an OJ Simpson car chase in an era of self-driving cars, or whether the cops would just email your car and tell it to pull over, and it would just disobey you. The thing is that there's probably a one in five chance that in the next fifteen years ISIS will form the government of a country in the territory currently occupied by Syria and Iraq. So there'll be a government, so they will have a credential that allows them to lawfully intercept cars.
Any power that we create in a technology is replicated around the world - because it's not like we have Syrian cars and American cars... We just have cars, and they have firmware loads. So if you are gonna create something in a firmware load in an American vehicle, you should expect it to show up in Russian, Syrian and potentially caliphate vehicles, so that's another problem.
There's a similar problem with manufacturers where, well, you might trust a manufacturer today, but what happens tomorrow? I was once very fond of a little company called Flickr; in fact, Flickr started because Stewart Butterfield and the woman who's now my wife and I all met at a conference. The woman who is now my wife and I fell in love and carried on a long-distance relationship, and Stewart at the time was making a game called Game Never Ending, that we were both alpha testing, and he asked me how's it going with Alice, and I said it's great, but we are having a hard time sharing photos. He said, "Oh, well we have a photo sharing thing for Game Never Ending, I'll just bring it forward in the product roadmap." They launched it the next week and it was so successful they shut the game down and re-titled the company Flickr, and sold it to Yahoo! for $30 million. Even then it wasn't so terrible, because Stewart was working at Flickr and so was Caterina, his wife at the time, and it was all great. And now, Yahoo! is a dumpster fire and Flickr is terrible.
Even if you trust the manufacturer to the point where you are personal friends with the people who founded it, and it was founded for your benefit, it's still not a good idea to let the manufacturer decide what you're gonna do with your product. Steve Wozniak got locked out of his iPhone; these are real, no-fooling problems that people have. Bob Frankston, who created the first commercial spreadsheet - not Lotus, but VisiCalc - well, they put DRM in VisiCalc. It checked for a physical defect introduced into the floppy disk at manufacture time, and if the defect wasn't present, the thing wouldn't run. Then he wanted to extract his old spreadsheets from an emulated 8086 running VisiCalc, except it wouldn't run VisiCalc, because he couldn't emulate the physical defect of the floppy disk in the emulator's floppy drive. So even if you founded the company, you may not be able to trust the company in the future. You and the company may not have the same interests in the future.
You should always be able to override - so this is the second argument: even if you think that today there is a government or a company that you trust, if you think that Tim Cook would never block an app unreasonably, you are creating and arming the weapon that you're handing to all of Tim Cook's successors. As the guy who was the CIO of a tech shop buying Apple hardware from John Sculley, I'm here to tell you that Apple is perfectly capable of hiring some absolute clowns to run that company, and someday you may be trying to reconcile your trust in Tim Cook with a future John Sculley. When Martin Shkreli becomes CEO of Apple in 2072 and is in charge of the App Store, you're gonna have to live with the decision you made when you trusted Tim Cook. That's another argument.
Isn't that just the nature of the beast, though? We can never prevent ourselves from having future versions of somebody that we may or may not trust. That's a human issue we've had all along.
[00:35:46.21] Yeah, that's why we shouldn't design things that require us to trust people in the future, absolutely. It's never been the case that if you bought a GM car and then you didn't like GM in the future, GM could reach into your car and brick it. But that is the case with your Nintendo 3DS: every time you fire one of those up, it checks to see if there are any new firmware updates, and if it finds one, it installs it without user intervention. The first thing the new firmware does when it launches is checksum the previous firmware load, and if it detects any tampering, it permanently bricks the 3DS. It's never been the case that if KitchenAid detected in the future that you're using your blender to mix paint, they could brick your blender - but now, if you do anything that the manufacturer doesn't like, the manufacturer can reach in and brick your device, so that's the other thing.
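The tamper check Cory describes is essentially a firmware integrity check: hash the installed image, compare it against the fingerprint recorded at install time, and refuse to boot (or brick) on a mismatch. A minimal sketch of that idea (function names and the bricking policy are hypothetical, not Nintendo's actual implementation):

```python
import hashlib

def firmware_fingerprint(firmware: bytes) -> str:
    """Compute a fingerprint of a firmware image."""
    return hashlib.sha256(firmware).hexdigest()

def boot_check(firmware: bytes, expected: str) -> bool:
    # The device compares the installed image's hash against the one
    # recorded when the update was applied; any tampering - even a
    # single flipped byte - changes the hash and fails the check.
    return firmware_fingerprint(firmware) == expected
```

The point of the anecdote is who holds the `expected` value: when only the manufacturer does, the same mechanism that detects an attacker's tampering also detects, and punishes, the owner's modifications.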
But we don't have ownership of anything these days. We have licenses to use.
Yeah, it's a kind of feudalism... Except, when you actually look at the terms of those licenses, they are rubbish, and they are not really copyright licenses. For example, in music, if you buy a downloadable song, that song is characterized as being licensed to you, which is why they can place restrictions that they wouldn't be able to place on the sale of a copyrighted work. But the way the standard record deal goes is that if you have a record deal and your song is licensed to someone, you get 50% of the revenue; if your song is sold to someone, you get 7%. So you get seven times more money if your song is licensed than if it's sold - but iTunes revenue and mp3 store revenue is characterized by the labels as a sale in their bookkeeping with artists, and it's only with you that it's characterized as a license. It's just bullshit - I know you said to keep it clean - it's rubbish. [laughter]
These copyright licenses are not particularly well crafted, and they're silly; if they're enforceable at all, they are not enforceable in toto. They are not in keeping with the constitutionality of copyright. One of the things that our lawsuit to invalidate Section 1201 of the DMCA turns on is that the US Supreme Court handed down two rulings in the last ten years about copyright and fair use - the Eldred and Golan v. Holder cases - and they said that a copyright law is only constitutional if it respects the traditional contours of copyright. So copyright historically only applies to creative works and not to functional things, and only if it allows for fair use, which is use without permission for critical purposes and other reasons. Otherwise the copyright law is not valid; it doesn't pass Constitutional muster, it conflicts with the First Amendment.
Well, the traditional contours of copyright do not allow for copyrightable dishwashers, copyrightable doorknobs and copyrightable light sockets. The fact that the software in the light socket is copyrighted, and that you can then use copyright law to tell people whose plates they can put in the dishwasher or whose light bulbs they can use with the light socket, violates this traditional-contours test that the Supreme Court set out. So while there's this fiction that we can't own anything because of copyright law, the legal reality - which is yet to be litigated, but which EFF has begun the long, arduous process of litigating out - is that that's not how copyright law works.
I'm definitely with you on all this, but my question is this: it seems like the focus is on devices and almost all things... Let me say it this way - how does this specifically apply to the web as we know it today? Not the devices using the web, but the DNS, the HTTP, the markup... It seems like in that case the website owner is the local party and we're all the remote parties just visiting this other person's website. How does this help us in those circumstances?
[00:39:55.14] There are two parts to that. The first one is a very practical thing that's going on right now. As apps gained ascendancy, browsers lost some of their power; they became less significant to technology ecosystems, and this made browser vendors and the World Wide Web Consortium (which standardizes browsers) pretty desperate. In 2013, the W3C decided to add DRM to the core set of standards for HTML5, in something called Encrypted Media Extensions. What these mean is that for the first time ever, the person who runs a website will be allowed to tell your user agent - your browser - how it must perform, how it must render content and whether it can render content. That's never been the case; this is why ad blockers and pop-up blockers work, but it's also why - I have bad low-contrast vision, and when I get to gray-on-white type, I turn on a thing that turns the gray type black. All of that stuff only exists because users are able to configure their user agents to display the web in the way that's convenient to them, and so what this is doing is setting up a regime where it's a felony to change the way your browser is configured, if that conflicts with the interests of the people who serve the content to your browser.
We have proposed to the W3C that it should take its existing policies and extend them to cover DRM. Right now at the W3C, if you join, you have to promise not to use your software patents to attack people who want to implement W3C standards. The W3C's position on patents is that standards are more open if you don't need to license a patent to implement them, and so they have this policy. We're saying, "Okay, well there's this new right - the right to tell people how their browsers must work - that you're creating by doing this DRM standardization. Surely a standard is more open if you don't need someone's permission to implement it because of DRM, just as surely as it is with patents. You should have the same policy for DRM as you have for patents." So that's coming up for a vote very shortly.
We've been joined by some pretty significant parties there: the browser vendor Brave is in with us, so are the Royal National Institute for the Blind and Oxford University. They've all signed on, along with most of the cryptocurrency and blockchain companies that are W3C members, because as cryptographers, they're like, "Yeah, of course people should be allowed to report vulnerabilities in browsers." It's a terrible policy to say that companies get to decide who can report vulnerabilities. That's a thing that's live and underway, and if you work for a W3C member, you should talk to your rep about supporting us there, because that is coming up any day now.
What's been the stance of the other browser vendors, notably Google, Apple, Microsoft?
And Mozilla is the other big one. They're all backing it. I think Mozilla believes that DRM is a foregone conclusion - that they're gonna have to put DRM into the web, and that if they negotiate at the W3C, they'll be able to negotiate the deal that Google gets, whereas if they negotiate away from the W3C, they'll be the smallest of the major browser vendors, negotiating on their own. So even though they're champions of the open web, they're not doing the right thing on this.
It's true that companies will try to make DRM without the W3C whether or not the W3C standardizes it, but there's no way they could collaborate to the extent that they're collaborating now without doing so at the W3C, because of the antitrust implications: all the major players in an industry gathering in a closed room to decide what features their products will and won't have - that's totally illegal. The only way they can do this is with the W3C abetting them.
So who's actually pushing for this, who's pushing for DRM in the open web?
Do you remember we started off talking about Audible, and you were like "Damn, I feel so bad for supporting Audible."
The listening audience didn't get to hear that part. We had a pre-call, we talked about Audible, DRM... Cory went on a rant, we loved it, but it didn't make it into the show. We might actually release it as a teaser, but feel free to share what you want, Cory.
[00:44:07.07] Well, then afterwards you started talking about watching things on Netflix, and I was like, "Do I tell him?" Because Netflix are the major advocates of this. Netflix, Comcast, CableLabs, the MPAA, the RIAA - those are the major people pushing for it, as well as Microsoft, Apple, Google and Mozilla. Those are the proponents.
What's their motivation? I mean, we know who they are, but is it greed, is it control?
Netflix wants to be able to assure the parties that it licenses from that they can exert controls beyond those which copyright allows them. For example, in 1984 the Supreme Court ruled that you are allowed to record TV shows for personal use. That was the Betamax ruling.
Right, you can take a VCR and record whatever you want, Days of Our Lives.
Yeah, so in theory there's no reason you can't record a streamed video from Netflix, except that there's DRM and it's against the law to break the DRM. That's hard to make work in browsers, because browsers are under user control to an extent that apps are not; it's a much more open platform. So what Netflix gets by adding DRM that's supported across all the browsers is that they're able to go back to the people they license from and say, "This commercial preference that you have, that people not be able to record their shows - we've just converted that, at the stroke of a pen, into a legal obligation. We made up a private law, and without ever having to ask Congress, that law was enacted for the web." So that's what they get out of it. Google and Apple and Microsoft get to tick a box that makes Netflix happy, and their browser divisions, who are worried that they're gonna lose ground to their app divisions, get to assure themselves that Netflix isn't gonna boycott the web - which is a thing they're all super worried about, because Netflix has more or less said that they would boycott the web if they didn't get DRM.
What?! So if this is where it begins, then where does it end up? Like, if this is the breaking ground of this issue, where do we...
I'll tell you where it ends up. The W3C is currently enacting a merger with an e-book standardization group that wants to make DRM for all formatted text. If that were built into browsers, wouldn't the New York Times and the Washington Post and everyone else who has a paywall start to put the so-called 'premium web' behind a DRM wall as well, so that you couldn't save and print? Then what will we do the next time there's a Gulf War and we want to prove that the New York Times lied in making the case for it, if we're not allowed to save the text that's in the news? And what do we do as more and more of the web disappears into silos that are off-limits to free and open source software? What's the future of desktop Linux, what's the future of free and open browsers? Where do we end up if user modifiability is antithetical to using the network?
This is a big deal.
I think it's an existential threat to the future of the human race. Think about the idea that we are going to move the control surface... Because remember, HTML5 is the control surface for the future of the Internet of Things. The idea is that we'll get rid of native apps and we'll use in-browser apps to control your pacemaker, your car, your thermostat. When we take those and make them off-limits to security research, and we invite the worst, most monopolistic practices with no check against them in competition - when we make competing with them a felony - then we would be insanely naive to expect anything but the worst kind of abuse. Entertainment technology has the potential to usher in a future of absolute censorship and control. I call it being 'Huxleyed into the full Orwell', and it is an absolute disaster; it terrifies the hell out of me.
[00:47:55.18] So we've talked about DRM, we've talked about how the owner of a thing should be able to override its manufacturer at the point of use, so to speak - "No updating firmware without me approving it", those kinds of things. But with regards to security, point number two you mentioned was that any true fact with regards to security should be legal to disclose. How does this play out? Can you give us some examples of that?
There are a couple of these things. Back to DRM - disclosing defects in products that have DRM has in one case sent a security researcher to jail; the Copyright Office has heard testimony from security researchers - some of the most famous, best respected in the world, including Ed Felten, who is now Deputy CTO of the White House - who've said they've found defects in things like voting machines and medical implants, and that they weren't able to come forward with them because they felt they would face too much liability under the DMCA. So that's part of it.
The other part, though, is the Computer Fraud and Abuse Act. In the 1980s we didn't have any specific anti-hacking statutes, and it was kind of a problem, because people would break into computers and raid their databases, and they'd have to be charged with the theft of one microwatt of electricity; it was kind of embarrassing, and it was not a sustainable thing. So Congress decided to make an anti-hacking law, but it's hard to make a really effective anti-hacking law, because hacking changes over time. Technology is a fast-moving target, so rather than spelling out a set of things you were and were not allowed to do, they said that any time you exceeded your authorization on a computer that didn't belong to you, you were committing a felony.
This has been a real problem, because it's allowed companies to spell out your authorization by creating these ridiculous Terms of Service - thousands of words of boilerplate - and then any time someone does something they don't like, they can threaten them, actually sue them, or have them arrested for violating the Computer Fraud and Abuse Act. This has also been really problematic for security researchers, and other kinds of researchers, too. Your listeners will probably know about Aaron Swartz, who was this amazing open source and freedom activist, who was allowed to download scientific articles using MIT's network - but the Terms of Service said that using a script to do it was not allowed. And because he wrote a Python script to access files that he was allowed to access, he was charged with 13 felonies and faced 35 years in prison, and he hanged himself.
But you know, other researchers have fallen afoul of the Computer Fraud and Abuse Act. One researcher was looking at his AT&T customer record, which had all of his financial details, and he altered the URL - he incremented the number at the end of it by one - and found himself looking at someone else's financial details. All told, he was able to look at hundreds of thousands of people's financial details, which he then went public with; he didn't publish their financial details, but he went public with AT&T's sloppiness, and AT&T had him thrown in jail for changing the URL in his browser, because their user terms said you couldn't do that. So right now the American Civil Liberties Union is actually suing on behalf of a bunch of different kinds of researchers and news gatherers to invalidate the Computer Fraud and Abuse Act, to address this question - to make sure that these true facts about the security of the computers we rely on are legal to discover and disclose, because companies are very poor trustees of their own embarrassing truths. They can't be relied on to tell you when something that could potentially cost them a lot of money and face is true.
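The AT&T flaw Cory describes is what security researchers call an insecure direct object reference (IDOR): the server hands back whatever record ID the client asks for, without checking who is asking. A minimal sketch of the broken pattern and the fix - the record store, IDs and field names here are invented for illustration, not AT&T's actual system:

```python
# Hypothetical records keyed by sequential IDs - the kind of scheme
# where changing /account/1001 to /account/1002 in the URL walks
# straight into someone else's data.
RECORDS = {
    1001: {"owner": "alice", "billing": "alice's card details"},
    1002: {"owner": "bob", "billing": "bob's card details"},
}

def get_record_insecure(record_id):
    # Broken: returns whatever ID the client supplied.
    return RECORDS.get(record_id)

def get_record_secure(record_id, requesting_user):
    # Fixed: the server verifies the record belongs to the requester.
    record = RECORDS.get(record_id)
    if record is None or record["owner"] != requesting_user:
        return None  # deny, regardless of what ID was asked for
    return record
```

The irony of the case is that the defect lived entirely on the server side; the "hack" was typing a different number into a browser's address bar.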
Here's kind of a silly question, but you tell me whether it's silly or not - what if none of us ever agreed to those things? The end user license agreements - aren't they so much rubbish that... The reason they can actually do that is because you've agreed to use the thing based on the terms. What if none of us ever agreed? We just used the products without agreeing... Would that be a legal loophole?
[00:52:17.24] It would be great to have some limits on those Terms of Service. There was an effort at one point to pass a law called UCITA that would have limited what could go into terms of service. We haven't had a lot of luck with that, and we haven't had a lot of luck with courts limiting what they can do. Courts have held so far that just being in the vicinity of Terms of Service - clicking through a website which says at the bottom "You agree"...
That your usage is your agreement, basically...
It's like an implicit agreement, even if you didn't explicitly agree to it.
By running away, shouting "No, no, no, I don't agree!" you agree. [laughter] So that's another problem. And frankly, it's getting hard to function in society without Facebook. I'm a Facebook vegan, and I'm here to tell you that there's a lot of stuff I don't get to do because I don't use Facebook. The only reason I can be a Facebook vegan is because I'm a relatively well-off, well-known, white, privileged, English-speaking dude, and there are a lot of people who don't have that option.
I think that not agreeing is not enough. There's a hilarious photo in my Flickr stream of my newborn daughter's hand pressing the 'A' button with the Nintendo Wii agreement in the background - I used her one-day-old hand to agree to the Terms of Service, because she couldn't form a contract. But the lawyers I know are like, "Yeah, that doesn't work. You pushed her hand."
That's what I was thinking - what if we just hire some guy in Zimbabwe and he clicks the button for all of us? Like, none of us ever agree, and he agrees for all of us.
He then acts as your agent. I'm not a lawyer, but trust me, this is a thing people have thought of; it doesn't work. We need other things. Larry Lessig talks about the four ways we can fix these problems. One is with code: we can make things that don't have these agreements. One is with norms: we can make companies that force these agreements down your throat into social pariahs, and characterize them as having done something profoundly immoral. One is with law: we can use law to limit what those agreements can do. And one is with markets: we can buy things that respect our freedom. But no one of those is enough, and all four of them work together. The things that are technologically possible are things you can create markets for; you can't create markets for things that can't be done, and so all of these things together work well.
Just thinking about those four things - and I agree that you can't have them all - if we look at the way humans are going, even in this conversation about Netflix and Audible... I'd be hard-pressed to believe that Adam - even though you're probably outraged at this point, Adam - like, you get back to your regular life and it's hard. People aren't as resolute as you are, Cory. We don't stick to our convictions; even you said Facebook itself is not all that attractive to you. I'm also a Facebook vegan, as you call it, but that's because I don't care as much...
Me as well.
...but if you took my Netflix away, that would actually hurt me in my everyday life. It seems like the legal front is probably the best one if we had to put our efforts behind one thing, because the social norms thing doesn't seem to be working out that well.
I don't know. The social norms thing, Jerod... As Cory was saying that, it reminded me of this idea of free speech. Everybody has free speech, but if you say something that's free for you to say and society at large doesn't agree with it, they're gonna come down on you. So you may have the freedom to say it, but society may not agree with you doing it. To me, I feel like in today's society, where internet rage comes up very easily, it would be pretty easy if we could band together - and it does happen, and it's happening more and more in networks and communities - where if someone doesn't play by the rules society sets, or societal norms, they get ousted in some way, shape or form.
[00:56:20.13] I think there's something to that. I want to caution against the paralysis of purity. I try really hard to spend money with companies that are trying to make the future I want to live in and not destroy the future I want to live in, but I'm not purely successful. At the end of the day, it's pretty hard not to buy your phone service from a company that's a monopolist-in-waiting, that wants to destroy network neutrality. Even though I buy laptops, throw the hard drive away, put a new one in and then install Ubuntu on them, I'm still buying those laptops from Lenovo, who have shipped four models in a row with spyware on them out of the box. Every vegetarian eventually meets a vegan, and if your test for whether you can do anything is whether you can be as pure as the purest person you can think of, then you will do nothing. There's another way to do this, and I got it from Danese Cooper, who is one of the great doyennes of the free and open source movement. She says that every month she adds up how much money she has given to companies that are working to destroy the future she wants to live in, and she gives that much money to organizations that are working to save it. She's at least carbon-offsetting the harm that she does.
I obviously brief for EFF, because they're an organization that I work for and love, and I've seen how effective they can be - I've never seen an organization be more effective with less - but they're not the only ones. Obviously there's the Free Software Foundation, but there's also the Software Freedom Law Center, and there's Creative Commons, and there are so many other organizations fighting for the future, defending progress - so many organizations that will take the money you give them and try to fix the structural problem that has trapped you into subsidizing a future you're horrified to be approaching, and try to fix it from the other edge of things. That's another thing you can do - don't fall prey to this argument that goes, "Well, how can you be in favor of doing something about climate change when one time you got on an airplane?" People get on airplanes and they can care about climate change, and if you say to people that you're not allowed to care about climate change if you fly, the world will go up in flames.
That's a good place to pause - we've got one more break before the tail end of the show, so let's pause here. Cory, on the other side we're gonna talk a bit about the future. We figured that with the mind you have as a science fiction writer who dreams up some really cool stuff, you must have really interesting ideas - or at least science fiction ideas - about the future. So when we come back, we'll talk about the great or bleak future we might have. We'll be right back.
Alright, we're back with Cory Doctorow, and it's definitely been a good conversation. Cory, you think about and are so passionate about things I never even knew I should care so much about, and I feel like the general public basically has some blinders on, and that you've got a lot of ideas and a lot of passion around internet freedom and other such things. But coming back to our grassroots - the developers who listen to this show, people who are really passionate about open source, who are getting involved with communities, going to conferences, giving talks, leading the way in all sorts of ways - what advice do you give to people like that, people who build software every single day, people who care about the future of software, and more importantly open source software? Where should they be focusing their efforts, so as not to just subscribe to this potentially bleak future we're driving towards, but to shape the future of the open world?
Well, you know, I'm a science fiction writer, so I know exactly how badly qualified I am to predict the future, because science fiction writers suck at predicting the future. We make a lot of predictions and our success rate is very low. We have this hindsight bias where we trumpet our successes, but if you take our overall hit rate, it's pretty poor. And besides that, like I said earlier, knowing what the future is gonna be is pretty depressing, because it suggests that the future can't be changed. So rather than briefing for optimism or pessimism, I'm a great fan of hope. Hope is why you tread water when your ship sinks, even though in most cases you have no chance of being picked up - everyone who has ever been picked up was treading water until rescue arrived - and so it's this necessary but insufficient precondition for a better future. Hope doesn't require that you know how to get from A to Z; hope only requires that you know what your next step is. The first casualty of any battle is the plan of attack, so if you think you've got a plot that can take you from here right to a kind of free and open source utopia, the hours you spend on that critical path are gonna be completely wasted when the first exogenous shock comes along and blows you off the path.
So instead, I'm a great believer in iterative hill-climbing: you check to see whether there's a course of action that takes you closer to the future you wanna live in, and you take that one incremental step, because as you ascend the problem landscape, you get a view of new parts of the territory that were off-limits to you before, because you were too low down. And yes, you can reach a local maximum, which is why sometimes you've gotta try veering off into left field and trying something you've never tried before - but I don't believe in grand plans. I believe in incremental, iterated, slow, steady, continuous progress. So if you think you can do a thing - one thing, doesn't matter what, a single thing to make things better - go do that.
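Cory's metaphor maps directly onto a classic optimization technique: greedy hill-climbing, plus random restarts to escape local maxima (the "veering off into left field"). A toy sketch, assuming a one-dimensional problem landscape and made-up step sizes:

```python
import random

def hill_climb(f, x, step=0.01, max_steps=10_000):
    """Greedy ascent: keep taking whichever neighboring step
    improves f; stop at the first point where neither does."""
    for _ in range(max_steps):
        nxt = max((x - step, x + step), key=f)
        if f(nxt) <= f(x):
            return x  # local maximum reached
        x = nxt
    return x

def hill_climb_with_restarts(f, lo, hi, restarts=20, seed=0):
    """Restart the climb from random points and keep the best
    summit found, so one low peak doesn't strand the search."""
    rng = random.Random(seed)
    best = hill_climb(f, rng.uniform(lo, hi))
    for _ in range(restarts - 1):
        candidate = hill_climb(f, rng.uniform(lo, hi))
        if f(candidate) > f(best):
            best = candidate
    return best
```

On a landscape with one peak, a single climb suffices; on a bumpy one, the restarts are what keep "incremental, iterated" progress from stalling on the first small summit.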
What is that thing for you and the EFF? We can back off of developers, and talk about specifically what you're up to.
I'm gonna kill all of the DRM in the world in a decade.
That sounds like a big plan!
It is a big plan, but that's where I want to be, and my next steps toward it are all these little projects: we're getting the W3C to protect security researchers, innovators and accessibility in web standards; we're suing the US Government to get rid of Section 1201 of the DMCA; we're coordinating with activist groups around the world to launch their own campaigns on the basis that this is going forward; and I'm talking to investors and entrepreneur groups about the economic opportunities in breaking DRM.
We've just petitioned the Federal Trade Commission to require electronics retailers to notify people when products have DRM on them, and we asked the FCC to do this for set-top boxes - to say that for the new set-top boxes in their unlock-the-box order, all the manufacturers should promise never to invoke the DMCA against people who unlock those boxes for legal reasons.
[01:04:08.11] These are all projects that take us a little further up the hill. I've got some stuff on the drawing board that I'm talking about with my colleagues right now. I want to do a one-stop shop where you can go and complain about DRM in a product, and have that complaint sent to the FTC, your state attorney general, your congressman and the Better Business Bureau, so that they all get a complaint every time someone buys a thing that has DRM in it and it bites them in the ass - so we can start building the evidentiary record and making this normative shift as well. Those are all the little pieces that I'm doing that take me one step up the hill, and I know what's at the top of the hill: killing all the DRM in the world.
Right, that's your end goal. It makes sense.
I like the idea of the disclosure... I like the label - because how many things can you imagine right around you, in your office, in Jerod's office and my office, that have DRM that you're just not even aware of, or whose DRM was never disclosed to you?
Sure. I mean, it's not enough, but it's good. In the Amazon self-publication market - the 99-cent, short, self-published novels - the people who buy those books are very prolific readers; they tend to be book-a-day readers, so they're very familiar with Amazon's very cryptic interface, and they're able to figure out which books do and don't have DRM. The DRM-free books on Amazon don't usually say 'DRM-free'; they say things like "Can be used on unlimited devices." In those marketplaces, where you have very knowledgeable consumers, the DRM-free products outsell the DRM ones two-to-one. So that's pretty cool.
I think in some marketplaces it will make a difference. It's not enough, but if we're hoping that people will differentiate themselves from the competition by being DRM-free, there has to be a way to tell which things have DRM and which don't in the marketplaces where they compete.
I like that you have a big master plan and a bunch of small tactical moves - slowly up the hill, or up the mountain; what works, you follow up on; what doesn't work, maybe you try something new...
I'm like a one-man Scrum. [laughter]
We're very familiar with Scrum, many of us, and I think you have the ear of an audience that's very open to your cause. Speaking personally, I very much believe in many of the things you're saying right now. So I'm starting to think, beyond limiting my Netflix and Audible usage, what would be a tactical next step for me - a guy who has different skills than you, a software developer who's day-to-day writing code, working for people, building websites and so on? What are some ways we can get involved, some small tactical things we could do to push for the same things you're pushing for?
Well, EFF has a ton of projects on GitHub, where we have open issues. You could always address some of those and submit a pull request. We've got Privacy Badger, we've got CertBot, which is part of Let's Encrypt; we gave away a million certificates in the first 90 days of CertBot running, and that number is probably now two million or more. All of those have open bugs against them, and they could all use your contributions. Joining EFF and giving EFF money actually does something, and again, I know this sounds very self-interested... I'll point out that at the very least EFF doesn't give me any money. I get my money from MIT for being an Activist in Residence in the Media Lab, so it's not like you'd be paying my rent if you give EFF money. But EFF's an amazing organization, and that's a thing you can do right now.
[01:07:56.26] Even just joining EFF's mailing lists. I know it feels useless to send a petition to your congressman or whatever, and there've been lots of times when it was useless, but the way that we killed SOPA was by eight million people putting phone calls through to Congress within 72 hours. The reason we were able to do that was because there were so many people who had joined these mailing lists, and we were able to coordinate them in a big, consolidated effort that did something that politically no one thought was possible, and the reverberations are still being felt in DC. So joining those mailing lists...
If you are a security researcher and you wanna join my petition to the W3C to protect security researchers in DRM, send me an email. My email is cory@EFF.org (Electronic Frontier Foundation). I need to know what institutional affiliation you'd like listed (if any) and also what country you are in. We're trying to give them a sense of how diverse this is. So send me that at cory@EFF.org - your name, your institutional affiliation, and your country if you're a security researcher. If you work for a W3C member company, you really can make a difference by going to your boss and saying "There's this thing brewing at the W3C that has the potential to make free and open source software off limits for large parts of the web. We need to do something about it, and we can. There's this EFF initiative coming up. Can we ask our rep to contact Cory at Cory@EFF.org?" and I'd be happy to take it from there.
Those are all things you can do. I wish that there was more, I wish there was something like Wikipedia, where it's just "Go find an entry that you're interested in. If anything seems wrong, fix it." We haven't gotten there yet, but we are gonna get there. We are finding what Tim O'Reilly calls "the new architectures of participation" for this all the time, and we're trying to work them out.
We have this tool out for reporting DRM to the FTC and your congressman, and so on. We're going to need people to contact their friends and go tell them about it. And then there's one last thing that everybody can do - it's to explain this stuff to other nerds. Because like you said, there are lots and lots of people who are really deep, technologically savvy nerds for whom this stuff just doesn't cross the radar in any meaningful way. They work with technology all day, and they would get it faster than anyone else you could possibly explain this to.
There are tens of thousands of EFF members, there are millions of Hacker News readers, and so if you and everyone listening to this were to go and explain this to two nerds that you know, people who are not technologically naïve, who are savvy, who do this all day long, and say "Sit down, I need to explain this to you. We need to build the future with these two principles: computers should obey their owners, and you should always be able to tell people about defects in the products that they rely on. We're gonna build that future with EFF; here's the podcast to listen to, or a video of a speech, or EFF's homepage" - and then go back to them in a week and say, "That conversation we had last week - I want to follow up with you and see if you did it, and whether you'll go tell two other people."
That is a big ask. Going and talking to two people is a huge ask, but if you want something that every technologically savvy person can do to make a difference... If we could go from tens of thousands to hundreds of thousands of people who are involved in every one of these campaigns, that could be the critical mass that takes us to a better future. So that's my other final big ask to people: two people, one week follow-up, ask them to contact two people.
That's a nice list, if I do say so myself, very well played.
[01:11:44.26] It's a good social proof thing too, because everybody knows at least two people - at least most people - and that's like the MLM way of doing things. The only way you grow is by telling two friends. I hate to pigeonhole the EFF with MLMs - sorry about that.
Hah! Well, they do it because it works. I once had this hilarious lunch; do you know a book called ‘Getting Things Done’?
It's an amazing book, it totally revolutionized my life. So I had this lunch with the guy who wrote it, because he wanted advice on what kind of web stuff he could do. I said, "I have to ask you, where did you get the cool stuff that you put into Getting Things Done?" and he said, "Oh, well I just stole all the good stuff from Dianetics." If you're going to convince people to join your weird cult, you need to give them something that works at first. Just because a bad person has done it, that doesn't make it a bad thing, right? You wanna steal the best tactics, regardless of where you find them.
I totally agree with that. So switching gears just a tiny little bit, this goes back to how I originally came to know you through your writing, which was Scroogled. You came up in the blogging era, the weblog era, Boing Boing and all that, pre-Flickr, which was the game - so you come from an era of the internet that most don't touch.
I was country before country was cool. [laughter]
Right, exactly. A lot of people aren't doing personal blogging on self-owned networks anymore. We used to blog on our own WordPress installation, an open source installation, our own whatever, and now most people do it on Facebook, Medium, or Twitter if they're tweetstorming or whatever... But someone like you who's outspoken about DRM, intellectual property, privacy, security - all of these fun things - you must have some pretty deep feelings about how the collective conversations are taking place on networks not owned by ourselves, basically, and how that impacts our privacy. It's a little bit out of left field, but it goes back to the beginning: I'm just curious what your thoughts are on this proliferation of writing on networks that aren't owned by us.
It's this question of how you re-decentralize. Normally, in the history of the web, when there's been a lot of centralization, when there's been a big winner, what's happened is that the people whose content or social graph it was were able to use a rival's tool to bring whatever they were getting from Service A into Service B. That's why Web 2.0 was so exciting - it was these mashups where a company that had achieved success could be commodified by another company that did something even cooler.
So you had this anti-lock-in effect through both the technological underpinnings, the code, but also through the norms. Why would you use a service that didn't let you mash it up with other services? The services were better together. The silos, the walled gardens, are where the problem is.
I can easily see a technological way to fix Facebook; I don't know if it would work, but at least I can come up with a plausible one, which is that you have a bot that logs into Facebook for you, scrapes all the stuff on Facebook that you value every day, and puts it in another context that belongs to you - one that can also merge with LinkedIn and Twitter and wherever else your friends are, so you're looking at your own dashboard. I'd sign up for that service. And then when you replied, it would put the reply in the right context for other people. Basically, Usenet-style federation for Facebook. There's no reason you couldn't write a specialist browser that logged into Facebook as a person, that ran on that person's computer or on a cloud instance tasked to that person, or to multiple people, and did that for them. That would be completely awesome - it would be pro-competitive, it would be pro-market, it would let you be in control of your data and your social services, it would make it hard for surveillance mechanisms and Facebook to be so effective, and the only thing stopping us is the enforceability of Facebook's Terms of Service.
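The personal-aggregator idea sketched here can be outlined in a few lines of code. This is a purely hypothetical illustration: the `Post` fields, the service names, and the stand-in scrapers are all invented, and a real scraper would have to log in as the user and parse each service's pages (and face the Terms of Service question raised above).

```python
# Sketch of a user-owned dashboard that merges posts from several
# services into one timeline. Everything here is illustrative.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass(order=True)
class Post:
    timestamp: int                       # only field used for ordering
    source: str = field(compare=False)   # which service it came from
    author: str = field(compare=False)
    text: str = field(compare=False)

def merge_timelines(scrapers: Dict[str, Callable[[], List[Post]]]) -> List[Post]:
    """Run each service's scraper and merge the results, newest first."""
    merged: List[Post] = []
    for _name, scrape in scrapers.items():
        merged.extend(scrape())
    return sorted(merged, reverse=True)  # newest first by timestamp

# Stand-in scrapers; a real one would authenticate as the user
# and extract posts from the service's pages or API.
def fake_facebook() -> List[Post]:
    return [Post(100, "facebook", "alice", "hello from facebook")]

def fake_twitter() -> List[Post]:
    return [Post(200, "twitter", "bob", "hello from twitter")]

dashboard = merge_timelines({"facebook": fake_facebook,
                             "twitter": fake_twitter})
```

The interesting engineering lives in the scrapers and the reply path (posting a reply back into the right service's context); the merge itself is the easy part.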
[01:15:57.17] What we need to do is challenge that enforceability. That's one of the reasons I'm very excited about the ACLU's lawsuit, which is opening the door to invalidating Terms of Service as enforceable legal contracts, and to saying that users have the right to take their own data, their social graphs, and their interactions, and use them in ways that are best suited to them, not to a giant corporation. If we can do that, then I think Facebook's days are numbered.
I think that this is an area in which we could harness code, law, technology, and norms to make a better world. When you're thinking about what organizations you're going to tithe to, to hedge against the fact that you're giving money to corporations that are destroying the future, the ACLU should be one of them - and not just for that: it's also election season, and no one is doing better work on ending voter suppression than the ACLU.
One way to close this show, Cory, is to offer it back. Obviously we've enjoyed this conversation with you, and we think you have a unique perspective - one, as a writer; two, as a father; and then as someone who cares about the future of where we're all trying to go. You have some really interesting perspectives, obviously, but is there anything else we haven't covered well enough? If you were in front of a room of hackers, as you will be soon at OSCON giving the keynote, this is a chance to share something we didn't ask you directly. What do you wanna share? What would you like to close with for the hackers, the open source people out there doing all the awesome stuff they're doing on GitHub and Bitbucket and everywhere else to move open source forward?
Well, I guess the last thing I'd like to say is that the issue here is not whether information wants to be free, or whether the internet should or shouldn't be free, or whether that's the most important issue. I know for sure that there are things that are way more important than any of those. There are fundamental issues of economic justice, there's climate change, there are questions of race and gender and gender orientation that are a lot more urgent than the future of the internet. But the thing is that every one of those fights is going to be won or lost on the internet. If you think we can fight climate change without having a networked public that coordinates its efforts, that holds companies and governments to account, that does citizen science, then you're nuts.
[01:18:24.13] The reason I fight to keep the internet free and open is not because information wants to be free - information doesn’t want anything, it’s an abstraction - but because people wanna be free. The internet is the nervous system of the 21st century and the way you make people free in the 21st century is by seizing the means of information, by having a free, fair and open information infrastructure, the battleground on which all those other fights will be won or lost.
Well said. Well, Cory, as I mentioned, I found you through Scroogled, but have loved your books between then and now, and obviously I'm gonna miss part of your keynote, face-to-face at least, but I'll hit the playback. I can't wait to hear that. I certainly appreciated you sharing your time here today. As we've mentioned before to the listeners, you can go to OSCON too - we have a code, "PCCL20", which will get you 20% off registration. Go to OSCON.com/uk.
We did this show in partnership with O'Reilly, so thanks to O'Reilly for working with us and for getting people like Cory and Eli Bixby on the show. Earlier I said Ben Bixby because I know a Ben Bixby, and that's the name that popped into my head - didn't mean it, sorry, Eli; I loved the conversation on TensorFlow. We will be at OSCON London - and when I say 'we' I mean Jerod, because I don't cross the ocean that easily. Long story short, I'm not going, but if you're going, meet Jerod. Make sure you say hi. Fellas, that's it for this show today, so let's call it done and say goodbye.
Goodbye! Thanks again, Cory.
Our transcripts are open source on GitHub. Improvements are welcome. 💚