
OpenSSL maintainer and Google cryptographer Ben Laurie and I collaborated on an article for Nature magazine on technical systems for finding untrustworthy Certificate Authorities. We focus on Certificate Transparency, the solution that will shortly be integrated into Chrome, and also discuss Sovereign Keys, a related proposal from the Electronic Frontier Foundation. Both make clever use of cryptographic hashes, arranged in Merkle trees, to produce “untrusted, provable logs.”
In 2011, a fake Adobe Flash updater was discovered on the Internet. To any user it looked authentic. The software’s cryptographic certificates, which securely verify the authenticity and integrity of Internet connections, bore an authorized signature. Internet users who thought they were applying a legitimate patch unwittingly turned their computers into spies. An unknown master had access to all of their data. The keys used to sign the certificates had been stolen from a ‘certificate authority’ (CA), a trusted body (in this case, the Malaysian Agricultural Research and Development Institute) whose encrypted signature on a website or piece of software tells a browser program that the destination is bona fide. Until the breach was found and the certificate revoked, the keys could be used to impersonate virtually any site on the Internet.
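For readers curious what an “untrusted, provable log” looks like in practice, here’s a toy Merkle-root computation in Python. It’s a deliberately simplified sketch, not the exact tree construction Certificate Transparency (RFC 6962) specifies — in particular, it pads odd levels by duplicating the last node, where CT splits at the largest power of two — but it demonstrates the key property: a single short hash commits to the entire log, and changing any entry anywhere in the log changes the root.

```python
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    """Compute a Merkle root over a list of log entries.

    Leaves and interior nodes get distinct prefixes (0x00 / 0x01),
    the same domain-separation trick RFC 6962 uses, so a leaf can
    never be confused with an interior node.
    """
    if not leaves:
        return sha256(b"")
    # Hash each log entry into a leaf node.
    level = [sha256(b"\x00" + leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:
            # Simplification: duplicate the last node on odd levels.
            level.append(level[-1])
        # Pair up adjacent nodes and hash each pair into a parent.
        level = [sha256(b"\x01" + level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

log = [b"cert-for-example.com", b"cert-for-nature.com", b"cert-for-eff.org"]
root = merkle_root(log)

# Tampering with any single entry produces a different root,
# which is what lets auditors catch a log that rewrites history.
tampered = [b"cert-for-evil.example", b"cert-for-nature.com", b"cert-for-eff.org"]
assert merkle_root(tampered) != root
```

Because the root is only 32 bytes, a browser can check that a certificate really appears in the public log without downloading the whole log — it only needs the handful of sibling hashes along the path from that leaf to the root.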
Hey, Londoners! I’m speaking at one of the Open Rights Group’s meetings on the Snooper’s Charter (the proposed new mass-scale network spying bill) in London on Nov 24. It’s free, but they’d like you to register so they know how many to plan for.
Here’s an interview I did with the ITSM podcast, about information technology, IT policy, and the implications of corporate IT.
I did an interview with The Geek’s Guide to the Galaxy, which they’ve published in both text and MP3 form. We talked about Pirate Cinema, Rapture of the Nerds, the Humble Ebook Bundle, the future of publishing, the Disney/Star Wars merger, and lots more:
Wired: Do you ever get letters from kids who have been inspired by your books to become hacker anarchists?
Doctorow: Yeah, all the time — at least to become hackers, and political activists. My first young-adult novel Little Brother had an afterword with a bibliography for kids who want to get involved in learning how security works, learning how computers work, learning how to program them, learning how to take them apart, learning how to solve their problems with technology as well as with politics. And the number of kids who have written to me and said that they became programmers after reading that, I couldn’t even count them. I’ve had similar responses to my second young-adult novel, For the Win, and I’ve also heard from kids who’ve read Pirate Cinema. In fact, we published an editorial by one of them on Boing Boing — an anonymous reader who makes her own movies out of Japanese anime, and who talked about what drives her and how the book resonated with her.
With Pirate Cinema, Cory Doctorow Grows His Young Hacker Army
Here’s a recording of a debate I participated in on Monday at Denmark’s Fagfestival (yes, really — Danish has weird English cognates) 2012, the largest gathering of journalists in the country. I debated Peter Schønning, a prominent Danish copyright lawyer, in an event hosted by Henrik Føhns.
I recently recorded an interview with the BBC’s Digital Human programme, which was recording an episode on death. It came out very well.
My latest Guardian column is “There’s no way to stop children viewing porn in Starbucks,” a postmortem analysis of the terrible debate in the Lords last week over a proposed mandatory opt-out pornography censorship system for the UK’s Internet service providers.
In order to filter out adult content on the internet, a company has to either look at all the pages on the internet and find the bad ones, or write a piece of software that can examine a page on the wire and decide, algorithmically, whether it is inappropriate for children.
Neither of these strategies is even remotely feasible. To filter content automatically and accurately would require software capable of making human judgments – working artificial intelligence, the province of science fiction.
As for human filtering: there simply aren’t enough people of sound judgment in all the world to examine all the web pages that have been created and continue to be created around the clock, and determine whether they are good pages or bad pages. Even if you could marshal such a vast army of censors, they would have to attain an inhuman degree of precision and accuracy, or would be responsible for a system of censorship on a scale never before seen in the world, because they would be sitting in judgment on a medium whose scale was beyond any in human history.
Think, for a moment, of what it means to have a 99% accuracy rate when it comes to judging a medium that carries billions of publications.
Consider a hypothetical internet of a mere 20bn documents, composed of one half “adult” content and one half “child-safe” content. A 1% misclassification rate applied to 20bn documents means 200m documents will be misclassified. That’s 100m legitimate documents that would be blocked by the government because of human error, and 100m adult documents that the filter does not touch and that any schoolkid can find.
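The column’s arithmetic is easy to verify; here is the same calculation as a quick Python sketch (the 20bn-document web and the 50/50 content split are the column’s hypothetical figures, not measurements):

```python
# The column's hypothetical: a 20bn-document web, half "adult" and
# half "child-safe", filtered by a (generously) 99%-accurate classifier.
TOTAL_DOCS = 20_000_000_000
safe_docs = TOTAL_DOCS // 2
adult_docs = TOTAL_DOCS // 2
ERROR_RATE = 0.01  # 1% misclassification

overblocked = int(safe_docs * ERROR_RATE)    # legitimate pages wrongly blocked
underblocked = int(adult_docs * ERROR_RATE)  # adult pages the filter misses

print(f"Legitimate pages blocked: {overblocked:,}")   # 100,000,000
print(f"Adult pages missed:       {underblocked:,}")  # 100,000,000
print(f"Total misclassified:      {overblocked + underblocked:,}")  # 200,000,000
```

The point of the sketch is that even an accuracy rate that would be extraordinary for any real classifier still leaves nine-figure error counts at web scale.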