
Cory Doctorow's Little Brother


Yesterday I finished the awesome book "Little Brother" by Cory Doctorow (the book is also available as a PDF download).

After the jump, some thoughts on the book and some quotations (warning: it's a long read!)

Cory Doctorow is also an editor at Boing Boing, and a look at the topics of his posts there will give you a good idea of what this book is about.

Liberty, civil rights, privacy, hacking, security, and how some (mostly governments) have lost focus on what security actually means.

We already live (especially Cory, who currently lives in London, afaik) in a world where we are, in one way or another, constantly observed, recorded and possibly tracked.

I'll post a citation from the book at the end of this post: I think it makes a good point for everyone (like my parents, or some of my friends) who argues: "what's the problem with being recorded by CCTV around the city? I've nothing to hide, I'm not worried about that, and maybe, thanks to these cameras, they'll catch some bad guy!"

It's not a matter of having something to hide: the point is that there are things I don't want to show.

I don't have anything to hide, but I've things I don't want to show [Page 24]

[after someone makes the main character reveal his passwords]
I wish I could say that I'd anticipated this possibility in advance and created a fake password that unlocked a completely innocuous partition on my phone, but I wasn't nearly that paranoid/clever. You might be wondering at this point what dark secrets I had locked away on my phone and memory sticks and email. I'm just a kid, after all.
The truth is that I had everything to hide, and nothing. Between my phone and my memory sticks, you could get a pretty good idea of who my friends were, what I thought of them, all the goofy things we'd done. You could read the transcripts of the electronic arguments we'd carried out and the electronic reconciliations we'd arrived at.
You see, I don't delete stuff. Why would I? Storage is cheap, and you never know when you're going to want to go back to that stuff. Especially the stupid stuff. You know that feeling you get sometimes where you're sitting on the subway and there's no one to talk to and you suddenly remember some bitter fight you had, some terrible thing you said? Well, it's usually never as bad as you remember. Being able to go back and see it again is a great way to remind yourself that you're not as horrible a person as you think you are. Darryl and I have gotten over more fights that way than I can count.
And even that's not it. I know my phone is private. I know my memory sticks are private. That's because of cryptography -- message scrambling. The math behind crypto is good and solid, and you and me get access to the same crypto that banks and the National Security Agency use. There's only one kind of crypto that anyone uses: crypto that's public, open and can be deployed by anyone. That's how you know it works.
There's something really liberating about having some corner of your life that's yours, that no one gets to see except you. It's a little like nudity or taking a dump. Everyone gets naked every once in a while. Everyone has to squat on the toilet. There's nothing shameful, deviant or weird about either of them. But what if I decreed that from now on, every time you went to evacuate some solid waste, you'd have to do it in a glass room perched in the middle of Times Square, and you'd be buck naked?
Even if you've got nothing wrong or weird with your body -- and how many of us can say that? -- you'd have to be pretty strange to like that idea. Most of us would run screaming. Most of us would hold it in until we exploded.
It's not about doing something shameful. It's about doing something private. It's about your life belonging to you. 

The issue with false positives [Page 52]

If you ever decide to do something as stupid as build an automatic terrorism detector, here's a math lesson you need to learn first. It's called "the paradox of the false positive," and it's a doozy.
Say you have a new disease, called SuperAIDS.
Only one in a million people gets SuperAIDS. You develop a test for SuperAIDS that's 99 percent accurate. I mean, 99 percent of the time, it gives the correct result -- true if the subject is infected, and false if the subject is healthy. You give the test to a million people.
One in a million people have SuperAIDS. One in a hundred people that you test will generate a "false positive" -- the test will say he has SuperAIDS even though he doesn't. That's what "99 percent accurate" means: one percent wrong.
What's one percent of one million?
1,000,000/100 = 10,000
One in a million people has SuperAIDS.
If you test a million random people, you'll probably only find one case of real SuperAIDS.
But your test won't identify one person as having SuperAIDS. It will identify 10,000 people as having it. Your 99 percent accurate test will perform with 99.99 percent inaccuracy.
That's the paradox of the false positive. When you try to find something really rare, your test's accuracy has to match the rarity of the thing you're looking for. If you're trying to point at a single pixel on your screen, a sharp pencil is a good pointer: the pencil tip is a lot smaller (more accurate) than the pixels. But a pencil tip is no good at pointing at a single atom in your screen. For that, you need a pointer -- a test -- that's one atom wide or less at the tip.
This is the paradox of the false positive, and here's how it applies to terrorism: Terrorists are really rare. In a city of twenty million like New York, there might be one or two terrorists. Maybe ten of them at the outside. 10/20,000,000 = 0.00005 percent. One twenty-thousandth of a percent.

That's pretty rare all right. Now, say you've got some software that can sift through all the bank records, or toll-pass records, or public transit records, or phone-call records in the city and catch terrorists 99 percent of the time.
In a pool of twenty million people, a 99 percent accurate test will identify two hundred thousand people as being terrorists. But only ten of them are terrorists. To catch ten bad guys, you have to haul in and investigate two hundred thousand innocent people.
Guess what? Terrorism tests aren't anywhere close to 99 percent accurate. More like 60 percent accurate. Even 40 percent accurate, sometimes.
What this all meant was that the Department of Homeland Security had set itself up to fail badly. They were trying to spot incredibly rare events -- a person is a terrorist -- with inaccurate systems.
Is it any wonder we were able to make such a mess?
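The arithmetic in this passage is easy to verify yourself. Here's a quick Python sketch of the same back-of-the-envelope calculation (the population sizes, base rates and accuracy figures are the ones used in the book; the helper function name is mine):

```python
def flagged(population, base_rate, accuracy):
    """Count who a screening test flags, given its accuracy.

    Returns (true positives, false positives). `accuracy` is the
    probability of a correct result for any single subject, so a
    fraction (1 - accuracy) of the healthy/innocent majority gets
    wrongly flagged.
    """
    actual = round(population * base_rate)
    false_pos = round((population - actual) * (1 - accuracy))
    return actual, false_pos

# SuperAIDS: 1 case in a million, a 99% accurate test, a million subjects
print(flagged(1_000_000, 1 / 1_000_000, 0.99))      # (1, 10000)

# Terrorism: ~10 terrorists in a city of twenty million, same 99% "detector"
print(flagged(20_000_000, 10 / 20_000_000, 0.99))   # (10, 200000)
```

Even a "99 percent accurate" test drowns the ten real hits in two hundred thousand false alarms, exactly as the passage says.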

[the last two pieces are the very end of the book... stay cool, no spoilers!]

Afterword by Bruce Schneier

I'm a security technologist. My job is making people secure. I think about security systems and how to break them. Then, how to make them more secure. Computer security systems.
Surveillance systems. Airplane security systems and voting machines and RFID chips and everything else.
Cory invited me into the last few pages of his book because he wanted me to tell you that security is fun. It's incredibly fun. It's cat and mouse, who can outsmart whom, hunter versus hunted fun. I think it's the most fun job you can possibly have. If you thought it was fun to read about Marcus outsmarting the gait-recognition cameras with rocks in his shoes, think of how much more fun it would be if you were the first person in the world to think of that.
Working in security means knowing a lot about technology. It might mean knowing about computers and networks, or cameras and how they work, or the chemistry of bomb detection. But
really, security is a mindset. It's a way of thinking. Marcus is a great example of that way of thinking. He's always looking for ways a security system fails. I'll bet he couldn't walk into a store without figuring out a way to shoplift. Not that he'd do it -- there's a difference between knowing how to defeat a security system and actually defeating it -- but he'd know he could.
It's how security people think. We're constantly looking at security systems and how to get around them; we can't help it. This kind of thinking is important no matter what side of
security you're on. If you've been hired to build a shoplift-proof store, you'd better know how to shoplift. If you're designing a camera system that detects individual gaits, you'd better plan for people putting rocks in their shoes. Because if you don't, you're not going to design anything good.
So when you're wandering through your day, take a moment to look at the security systems around you. Look at the cameras in the stores you shop at. (Do they prevent crime, or just move it next door?) See how a restaurant operates. (If you pay after you eat, why don't more people just leave without paying?) Pay attention at airport security. (How could you get a weapon onto an airplane?) Watch what the teller does at a bank. (Bank security is designed to prevent tellers from stealing just as much as it is to prevent you from stealing.) Stare at an anthill. (Insects are all about security.) Read the Constitution, and notice all the ways it provides people with security against government. Look at traffic lights and door locks and all the security systems on television and in the movies. Figure out how they work, what threats they protect against and what threats they don't, how they fail, and how they can be exploited.
Spend enough time doing this, and you'll find yourself thinking differently about the world. You'll start noticing that many of the security systems out there don't actually do what they claim to, and that much of our national security is a waste of money. You'll understand privacy as essential to security, not in opposition. You'll stop worrying about things other people worry about, and start worrying about things other people don't even think about.
Sometimes you'll notice something about security that no one has ever thought about before. And maybe you'll figure out a new way to break a security system.
It was only a few years ago that someone invented phishing.
I'm frequently amazed how easy it is to break some pretty big-name security systems. There are a lot of reasons for this, but the big one is that it's impossible to prove that something is secure. All you can do is try to break it -- if you fail, you know that it's secure enough to keep you out, but what about someone who's smarter than you?
Anyone can design a security system so strong he himself can't break it.
Think about that for a second, because it's not obvious. No one is qualified to analyze their own security designs, because the designer and the analyzer will be the same person, with the same limits. Someone else has to analyze the security, because it has to be secure against things the designers didn't think of.
This means that all of us have to analyze the security that other people design. And surprisingly often, one of us breaks it.
Marcus's [the main character of the book] exploits aren't far-fetched; that kind of thing happens all the time. Go onto the net and look up "bump key" or "Bic pen Kryptonite lock"; you'll find a couple of really interesting stories about seemingly strong security defeated by pretty basic technology.
And when that happens, be sure to publish it on the Internet somewhere. Secrecy and security aren't the same, even though it may seem that way. Only bad security relies on secrecy; good security works even if all the details of it are public.
And publishing vulnerabilities forces security designers to design better security, and makes us all better consumers of security. If you buy a Kryptonite bike lock and it can be defeated
with a Bic pen, you're not getting very good security for your money. And, likewise, if a bunch of smart kids can defeat the DHS's antiterrorist technologies, then it's not going to do a very good job against real terrorists.
Trading privacy for security is stupid enough; not getting any actual security in the bargain is even stupider.
So close the book and go. The world is full of security systems.
Hack one of them.

Bruce Schneier
http://www.schneier.com

Afterword by Andrew "bunnie" Huang, Xbox Hacker

Hackers are explorers, digital pioneers. It's in a hacker's nature to question conventions and be tempted by intricate problems.
Any complex system is sport for a hacker; a side effect of this is the hacker's natural affinity for problems involving security.
Society is a large and complex system, and is certainly not off limits to a little hacking. As a result, hackers are often stereotyped as iconoclasts and social misfits, people who defy social norms for the sake of defiance. When I hacked the Xbox in 2002 while at MIT, I wasn't doing it to rebel or to cause harm; I was just following a natural impulse, the same impulse that leads to fixing a broken iPod or exploring the roofs and tunnels at MIT. Unfortunately, the combination of not complying with social norms and knowing "threatening" things like how to read the arphid on your credit card or how to pick locks causes some people to fear hackers. However, the motivations of a hacker are typically as simple as "I'm an engineer because I like to design things."
People often ask me, "Why did you hack the Xbox security system?" And my answer is simple:
First, I own the things that I buy. If someone can tell me what I can and can't run on my hardware, then I don't own it.
Second, because it's there. It's a system of sufficient complexity to make good sport. It was a great diversion from the late nights working on my PhD.
I was lucky. The fact that I was a graduate student at MIT when I hacked the Xbox legitimized the activity in the eyes of the right people. However, the right to hack shouldn't only be extended to academics.
I got my start on hacking when I was just a boy in elementary school, taking apart every electronic appliance I could get my hands on, much to my parents' chagrin. My reading collection included books on model rocketry, artillery, nuclear weaponry and explosives manufacture -- books that I borrowed from my school library (I think the Cold War influenced the reading selection in public schools). I also played with my fair share of ad-hoc fireworks and roamed the open construction sites of houses being raised in my Midwestern neighborhood. While not the wisest of things to do, these were important experiences in my coming of age and I grew up to be a free thinker because of the social tolerance and trust of my community.
Current events have not been so kind to aspiring hackers. Little Brother shows how we can get from where we are today to a world where social tolerance for new and different thoughts dies altogether.
A recent event highlights exactly how close we are to crossing the line into the world of Little Brother. I had the fortune of reading an early draft of Little Brother back in November 2006.
Fast forward two months to the end of January 2007, when Boston police found suspected explosive devices and shut down the city for a day. These devices turned out to be nothing more than circuit boards with flashing LEDs, promoting a show for the Cartoon Network. The artists who placed this urban graffiti were taken in
as suspected terrorists and ultimately charged with a felony; the network producers had to shell out a $2 million settlement, and the head of the Cartoon Network resigned over the fallout.
Have the terrorists already won? Have we given in to fear, such that artists, hobbyists, hackers, iconoclasts, or perhaps an unassuming group of kids playing Harajuku Fun Madness, could
be so trivially implicated as terrorists?
There is a term for this dysfunction -- it is called an autoimmune disease, where an organism's defense system goes into overdrive so much that it fails to recognize itself and attacks its own cells. Ultimately, the organism self-destructs.
Right now, America is on the verge of going into anaphylactic shock over its own freedoms, and we need to inoculate ourselves against this.
Technology is no cure for this paranoia; in fact, it may enhance the paranoia: it turns us into prisoners of our own device.
Coercing millions of people to strip off their outer garments and walk barefoot through metal detectors every day is no solution either. It only serves to remind the population every day that they have a reason to be afraid, while in practice providing only a flimsy barrier to a determined adversary.
The truth is that we can't count on someone else to make us feel free, and M1k3y [nickname of a character in the book] won't come and save us the day our freedoms are lost to paranoia. That's because M1k3y is in you and in me. Little Brother is a reminder that no matter how unpredictable the future may be, we don't win freedom through security systems, cryptography, interrogations and spot searches.
We win freedom by having the courage and the conviction to live every day freely and to act as a free society, no matter how great the threats are on the horizon.
Be like M1k3y: step out the door and dare to be free.

Written by orangeek

Friday 01 August 2008 at 11:35 am

Posted in pop

