
I did an interview (MP3) with the O’Reilly Radar podcast at the Solid conference last month; we talked about the Apollo 1201 project I’m doing with EFF.

In the absence of any other confounding factors, obnoxious stuff that vendors do tends to self-correct, but there’s an important confounding factor, which is that in 1998, Congress passed the Digital Millennium Copyright Act. In order to try to contain unauthorized copying, it made it a felony to break a lock that protects access to a copyrighted work, or to tell people information that they could use to break that lock.

I’m way more worried about the fact that the [DMCA] law also criminalizes disclosing information about vulnerabilities in these systems.

Lawrence Lessig, who was on our board for many years and is a great friend and fellow of the Electronic Frontier Foundation, talks about how there are four factors that regulate our society. There’s code: what’s technologically possible. There’s law: what’s allowed. There’s norms: what’s socially acceptable. And then there are markets: what’s profitable. In many cases, the right thing is profitable and also socially acceptable and legal and also technologically possible. Every now and again you run up against areas where one or more of those factors just aren’t in harmony.

This summer, the EFF is launching its own certificate authority called ‘Let’s Encrypt’ to try and overcome the fact that in order to have secure Web sessions, you effectively need permission from a big corporation that issues you a certificate. We’re going to issue free certificates to all comers starting this summer.

If you had a mobile device that was yours and that you trusted and that didn’t give your information to other people, it could amass an enormous amount of both explicit and implicit information about you. … Then, as that device moved through space, the things around it could advertise what kinds of services, opportunities, availabilities they had to the device without the device ever acknowledging that it received them, without the device telling them a single thing about you. Because your device knows a lot about you, more than you would ever willingly give out to a third party, it could actually make better inferences about what you should be doing at this time in this place than you would get if it were the other way around, if you were the thing being sensed instead of you being the thing that’s doing the sensing. I quite like that model. I think that’s a very exciting way of thinking about human beings as entities with agency and dignity and not just ambulatory wallets.
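The inverted model described above — nearby services broadcast what they offer, while your device listens silently and matches those broadcasts against a profile that never leaves the device — can be sketched in a few lines. This is an illustrative toy, not any real protocol; all names (`Beacon`, `Device`, `interests`, etc.) are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Beacon:
    """A nearby service advertising itself. It only transmits;
    it learns nothing about who, if anyone, is listening."""
    service: str
    tags: set

@dataclass
class Device:
    """Your device: holds a private interest profile, receives
    broadcasts passively, and never replies or acknowledges."""
    interests: set
    received: list = field(default_factory=list)

    def hear(self, beacon: Beacon) -> None:
        # Passive receive: nothing is sent back to the beacon.
        self.received.append(beacon)

    def suggestions(self) -> list:
        # Inference happens locally, against data that stays on the device.
        return [b.service for b in self.received if b.tags & self.interests]

device = Device(interests={"coffee", "books"})
for b in [Beacon("espresso bar", {"coffee"}),
          Beacon("shoe store", {"shoes"}),
          Beacon("library", {"books", "quiet"})]:
    device.hear(b)

print(device.suggestions())  # → ['espresso bar', 'library']
```

The point of the design is the direction of information flow: the beacons reveal everything and learn nothing, which is the reverse of the usual tracking model where you are the thing being sensed.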

I think we’re already in a world where markets don’t solve all of our problems, but markets actually do discipline firms.
