Back in 2008, New York Times best-selling author and Boing Boing alum Cory Doctorow introduced Marcus “w1n5t0n” Yallow to the world in the original Little Brother (which you can still read for free right here). The story follows the talented teenage computer prodigy’s exploits after he and his friends find themselves caught in the aftermath of a terrorist bombing of the Bay Bridge. They must outwit and out-hack the DHS, which has turned San Francisco into a police state. Its sequel, Homeland, catches up with Yallow a few years down the line as he faces an impossible choice between being the heroic hacker his friends believe him to be and toeing the company line.
The third installment, Attack Surface, is a standalone story set in the Little Brother universe. It follows Yallow’s archrival, Masha Maximow, an equally talented hacker who finds herself working as a counterterrorism expert for a multinational security firm. By day, she enables tin-pot dictators around the world to repress and surveil their citizens. By night, she helps those same citizens resist the very systems she has put in place to avoid the government’s intruding gaze. But when the repression hits too close to home — in this case, Oakland — Masha must use her skills to protect the people she cares for the most with as little collateral damage as possible.
In the excerpt below, Masha finds herself in a squalid Slovstakian hotel, recently fired by her employer Xoth and no longer able to assist her favored band of street resisters, led by Kriztina. Then, things go sideways.
Excerpt from ATTACK SURFACE by Cory Doctorow. Copyright © 2020 Cory Doctorow.
I woke to find myself in the dark room, with the sense that there had just been a loud noise. I sat up, looking around, reaching for my bag, shucking swiftly out of the sleep-sack, trying to remember where the light was, where I’d left my shoes.
Then I heard a scream from the street below, and a car horn, and then more screams, and then a terrible, rending crash. I stopped feeling for the light switch and went to the window, opening the blinds from the edge, looking down.
It was a bad crash, one of the city’s Finecab subcompact autonomous taxis bent around an empty planter, and I reflexively snorted: the self-driving vehicles were an absurd source of national pride for Slovstakia, and if you’ve heard of Slovstakia, there’s a pretty good chance that this is literally the only thing you know about it: “Oh, that’s the country that was stupid enough to buy gen-one automatic taxis.” The Finecabs were notorious for getting into fender benders, and had become a symbol of how easy it was for foreign companies to sell garbage tech to the country’s ruling elite (see also: Xoth).
But this wasn’t one of the customary comedy-crashes. From the sounds filtering up from the road, someone had been hurt. I saw someone in hotel livery rush to the car and decided it wasn’t my problem anymore. I went back to bed. I was just drifting off when I heard another crash, farther away, accompanied by blaring horns, then another, almost immediately after, and screams that didn’t stop. I looked out the window and saw that others were doing the same, some of them holding their phones, and then they were shouting excitedly at each other in Boris. I retreated to my bed and got out my phone, tunneled out to the free world, and started looking for Slovstakia in the feeds. Even though it was all in Cyrillic, it wasn’t hard to figure out the night’s news from the pictures: first the massive protests in the central square, then a baton charge from the cops and a countercharge, blood and tear gas, and then more gas, pepper spray, and the crowd broke and ran for it. That much I’d seen before, but what came next was anything but the usual.
At first, it was just photos of car wrecks, all involving Finecabs, many with injuries. Judging from the clothing of the injured, they were all protesters. I started to get a bad feeling. I kept scrolling. More injuries, more crashes— then, a shakicam video, racking up views like a broken odometer: an autonomous taxi speeding toward a crowd of protesters who were standing on an empty street corner. The protesters noticed the cab as it drew near to them and broke and ran, and then—the cab chased one of them. It was a woman, in a puffa jacket and snow boots, and as she ran, her friends screamed in horror. She turned a corner and the view from the camera started to jerk as whoever was holding it raced after it, rounding the corner just as the car sped off. The woman was lying motionless in the street.
That’s the video you probably saw, if you saw any of them, but for me, it wasn’t the worst. Compared to the videos taken from inside the taxis, by passengers who were hammering at the emergency stop buttons, that video was relatively benign. The screams from inside the cars as their victims’ heads starred the windshields and left behind streaks of blood and hair were a thousand times more terrifying.
I knew I wouldn’t be going back to bed that night. I logged in to Aeroflot and booked a ticket on the next flight out, to Moscow the next morning. It wasn’t Berlin, but it didn’t have to be. I could get to Berlin from there. I could get anywhere from there.
Where should I go? I felt alone and small, and ashamed to have been fired. I was good at being alone, and scared could go into a compartment, easy.
Apparently I wasn’t the kind of person who worked for Xoth anymore. I didn’t want to be that kind of person. Chances were pretty high that Xoth had sold Litvinchuk and pals the exploits to take over those cars. I’d been complicit in some pretty terrible shit before, sure, but what if Kriztina had been thrown over one of those little subcompacts, or crushed against a building by one, or run down and driven over?
I messaged her, just a quick encrypted check-in, and then, because I was going to be leaving soon, I packed my bag and synched my sensitive files to an encrypted cloud store, then securely erased them off my laptop. Now I could comply with an order to log in to my laptop and enter my hard drive’s passphrase without turning over my most sensitive data.
Doing that took my mind off Kriztina, but it also focused my attention on what I was going to do after my flight landed in Moscow in a few hours. Reflexively, I looked at my calendar, though of course all my appointments related to a job I’d just been fired from with extreme prejudice. But looking also reminded me that it was Tanisha’s birthday, or it was in Europe and would be shortly in San Francisco. The reminder was smart enough to include my address book entry for her and that was smart enough to include her last social post, a selfie of her in afro-puffs, grinning in front of a huge crowd of protesters somewhere else — Oakland, of course.
Seeing her smiling out of my laptop weaponized my loneliness, making it physical, an elephant on my chest, so that I gasped and gasped before my breath came back. Tanisha was a remnant of another life of mine, one without so many compartments and so many contradictions to stuff into them. It had been years since we’d been in regular contact, but still, she was one of the few people whose birthday was still in my calendar, and I never missed sending her a note.
Happy birthday, Neesh! Thinking of you
That was truer than I meant it to be.
hope you have a killer day. Stay safe, stay weird, stay you. XO Masha
That was all, a message whose mere existence — still thinking of you — carried as much meaning as the words inside it. I sent it and went back to looking at connections from Sheremetyevo.
Then my phone rang.
My screen showed TANISHA, and an older pic, which dated back to the last time I’d seen her, which was at Burning Man, with her in a silver bathing suit and her afro all crazy around her head, playing an upright bass in a jam band that we’d wandered into.
Tanisha was calling my old number—I mean, my OG number, the cell number I’d gotten at twelve—which forwarded to a cloud Asterisk call-server that had a ton of rules that allowed a very small number of people to forward onto whatever phone I was using at the moment. I was religious about updating the forward, even though (or because) it meant my mom could reach me whenever she wanted to, which was both more often than I wanted to speak to her and less often than I wanted her to want to speak to me.
“Hey, Neesh. Uh, happy birthday.”
“That’s tomorrow.”
“Not where I am.”
“Oh. Shit. Is it like three a.m. where you are or something?”
“Two a.m. Don’t worry about it, I was up.”
“Masha, tell me you’re not still partying. You’re too old for that.”
I laughed. “I’m not too old for it, but no, I’m not partying.” I looked around the terrible Soviet-era hotel room. “Packing for a flight.” Then I wished I hadn’t said that.
“Where are you flying?”
Maybe some part of me wanted to have this discussion with her. Otherwise, why would I have raised the subject?
“I’m still deciding that.”
There was a pause on the other end. “Uh, okay. You must be hella far away, though, the call sounds terrible.”
“I am, but I’m also putting the call through a bridge. Makes the logs harder to fingerprint.”
She sang a few bars of the Mission: Impossible theme, which was her traditional way of telling me that she wasn’t impressed with my paranoia. But she trailed off weakly. “Sorry, I’m in no position to be mocking you.”
Oh. I tried not to pay much attention to US politics—after all, most of what I hated about present-day America was stuff I helped to invent. But of course a call out of the blue from Tanisha was more likely to be soliciting professional advice than catching up on gossip.
“Tell me about it.”
The long silence spoke volumes. I was sure she was thinking something like, Can I even trust this phone connection?
“Neesh, if you want to talk more privately, I can call you back. You still have that app?” We used to use Signal for phone conversations when I was in-country, and Tanisha said she was going to try to get her pals to use it too, but I knew that without active reminders of the threat model most people would default back to the standard way of talking.
“Uh,” she said.
“Thought so. Reinstall it, and I’ll call you in five.”
“Can you hear me?” Signal calls were a lot more jittery than regular voice or even Skype, prone to drop into Dalek-sounding interference and voice-in-a-box-fan juddering, but my roaming SIM was pretty good and Tanisha had found a spot with good reception, so it was almost as good as a regular call — for now.
“I hear you.” She sounded exhausted and it was only late afternoon on the West Coast.
“What’s going on, Neesh?” I thought maybe the connection had been cut. “Neesh?”
“Sorry. Let me get my head straight. Just a sec.”
This wasn’t like her. Tanisha had the straightest head I knew—the Tanisha I knew was an iron woman.
“Okay, it’s like this: I’ve been going out for the Black-Brown Alliance meetings and rallies, the big ones in Oakland. I took precautions, we all did — phones locked and in airplane mode when we were on-site, no fingerprint unlocking, all our cards in Faraday pouches. We only talk in person with phones off or using encrypted disappearing chat. But I always remembered what you told me —”
“There’s a difference between mass surveillance and targeted surveillance.”
“Right. So I’ve been extra careful. I use a burner for all that stuff, and I wear dazzle to the demos, watch out for kettles and get out fast when they start to form. But —”
“Come on, spit it out.”
“You’ll think I’m being paranoid.”
“Neesh, trust me, I will never, ever think you’re being too paranoid.”
I heard her sigh and waited. With Neesh, sighs always came in pairs; it was something we used to tease her about. I hadn’t thought of it in years, but my subconscious remembered. There it was.
“You were the one who taught me about binary transparency, right?”
“Yeah.”
Binary transparency was an exciting idea, but also a complicated one that almost no one could actually understand. First, you had to understand what a hashing function is: that’s a cryptographic algorithm that takes a long file (say, a computer program or an email or a software update) and generates a short “fingerprint” number from it that a human being can easily read aloud and compare with other fingerprints (for certain values of “easily”). If the hashing function is working well, it should be basically impossible to deliberately create two different files that have the same fingerprint, and likewise basically impossible to figure out what was in the original file just by looking at the fingerprint (for “basically impossible” think of all the hydrogen atoms being turned into computers that worked until the universe’s heat-death to guess the answer, and still running out of both space and time).
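The fingerprinting step described above can be sketched in a few lines of Python. SHA-256 is just one common choice of hashing function, and the update contents here are invented for illustration:

```python
import hashlib

# Two updates that differ by a single character produce completely
# different fingerprints; that is what makes a hash useful as a fingerprint.
update_a = b"security update v1.2.3 -- fixes buffer overflow"
update_b = b"security update v1.2.4 -- fixes buffer overflow"

fp_a = hashlib.sha256(update_a).hexdigest()
fp_b = hashlib.sha256(update_b).hexdigest()

print(fp_a)           # 64 hex characters: short enough to read aloud, sort of
print(fp_a == fp_b)   # False: different files, different fingerprints
```

Hashing the same bytes always yields the same fingerprint, which is what lets two strangers compare notes about a file without exchanging the file itself.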
Next you have to understand public-private cryptographic keypairs. The short explanation: whatever a public key scrambles, only the private key can unscramble, and vice versa. So everyone shares their public keys as widely as possible and guards the secrecy of their private keys with their lives. If you get something you can decrypt with my public key, you know it was encrypted with my private key (and only my private key). If you encrypt something with my public key, you know that only someone with my private key can decrypt it. If you want to send me something that only you and I can read, you encrypt it with your private key and my public key, and then I use my private key and your public key to decrypt it—and now I can be sure that only people with my private key can read the message, and only people with your private key could have sent it.
When you combine hashing and keypairs, it gets cool: you can first hash a file, then encrypt the hash with your private key, and I can use that hash to check whether you sent the file, and whether the file was changed between you and me.
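Putting the two together, here is a minimal hash-then-sign sketch using a toy RSA keypair; the tiny primes and bare-bones scheme are for illustration only and would be hopelessly insecure in practice:

```python
import hashlib

# Toy RSA keypair with absurdly small primes. Real keys are thousands of
# bits long and use padding schemes; this only shows the mechanics.
p, q = 61, 53
n = p * q                           # public modulus (part of the public key)
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (keep secret!)

def sign(message: bytes) -> int:
    """Hash the message, then scramble the hash with the private key."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)

def verify(message: bytes, signature: int) -> bool:
    """Unscramble the signature with the public key; compare to a fresh hash."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == h

patch = b"contents of patch-1.2.3.bin"
sig = sign(patch)
print(verify(patch, sig))   # True: the file is exactly what the signer signed
# Changing even one byte changes the hash, so verification fails.
```

Anyone holding the public key (e, n) can run `verify`, but only the holder of d could have produced a signature that checks out.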
Got all that? No? Join the club. Almost no one understands this stuff, which is a pity, because now we’re about to get to binary transparency, which is awesome af, as the kids say.
Stay with me: hashing lets you create a short “fingerprint” of a file. If you have your own copy of the file, you can hash it again and make sure it matches the fingerprint. If it doesn’t, someone has altered the file since it was hashed. Keypairs let you scramble a file — or a fingerprint — so that you can be sure who sent it, and also make sure it wasn’t changed, and even make sure no one else can see what the file is.
Now let’s talk about software updates and backdoors: all the software running on all the computers you rely on is, approximately speaking, total shit. That’s because humans are imperfect, so they make errors, which is why every book you’ve ever read has typos in it. The difference is that you can usually figure out what the writer meant even if there are a few typos sprinkled around, while tiny mistakes made by computer programmers lead to crashes, data-loss, and, of course, the possibility that other computer programmers — let’s call them “hackers” — break into the program, take over the computer, and destroy your life.
So we say “security is a process and not a product” — meaning that we’re going to be discovering bugs in the software you’re depending on forever, and we need a way to fix those bugs when we find them. That’s why every computer you use bugs you all the time to update it with a patch from the people who made it.
Now, cryptography works. If a programmer does her job right and doesn’t make mistakes, the messages that her program scrambles will resist brute-force attacks until the end of time and space (see above). When a government wants to access someone’s secrets, they need to find a way to get at them without directly attacking the crypto. I mean, why burn resources and time attacking the part of the lock mathematically proven to be secure? There are so many other angles for a government to use.
Like, they could send someone to your house and put a tiny camera, the size of a pinhead, in a position that lets them see your screen. Or they could wait until you leave your laptop in a hotel room and send someone to break the — inevitably shitty — hotel-room door locks and take over your computer, with BadUSB or by sticking a hardware keylogger in it or some other method. But physical intrusion is so pre-digital; it lacks the elegance of a software-based attack.
Which brings us back to “security is a process.” For software to be secure, it has to have a way to receive updates from the people who made it, because they’re always finding bugs, and will always find bugs, and so security is a process and not a product.
What about forcing a company to update its software with something that introduces a bug, rather than fixing one? Companies are not happy about doing this, but maybe you can bribe a low-level employee, or you can get your attorney general to threaten to put the CEO in jail unless he orders a flunky to write some spyware and ship it to the target(s) in the guise of a security update. As a bonus, paranoid people worried about government surveillance are also the people who are most diligent about applying security patches.
That’s where binary transparency comes in. Even if a company is willing to push spyware disguised as security, they probably don’t want to send it to all their users, not least because the wider things are spread, the more likely it is that someone will spot the switcheroo and blow the whistle. The best way to ship a targeted backdoor is to target it — at a user, a city, a region, possibly a country, but ideally not everyone, because “everyone” includes “bored, obsessive grad students who decompile every update from every company looking for a thesis subject.”
Which means that one way to spot a backdoor in your security update is to compare every update you receive with all the updates that everyone else receives. That’s binary transparency: programs ship with binary transparency modules that automatically take a fingerprint of every update they receive, and send that to one or more transparency servers, possibly with a fingerprint of the program before and after the update’s installation — sometimes there are different versions of programs based on language, so the English patch might not be the same as the Chinese patch because their error messages are in different languages. But when two Chinese users get two different patches, something might be going on.
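The comparison the narrator describes can be sketched like this; the log contents and update names are all invented for illustration:

```python
import hashlib

# A minimal sketch of a binary transparency check: fingerprint the update
# I received and compare it against what everyone else reported receiving.

def fingerprint(update_bytes: bytes) -> str:
    return hashlib.sha256(update_bytes).hexdigest()

# Fingerprints a transparency server might have collected from other
# users running the same program version (one per language build).
transparency_log = {
    fingerprint(b"browser-update-87.0 (en)"),
    fingerprint(b"browser-update-87.0 (zh)"),
}

def check_update(update_bytes: bytes) -> str:
    """Flag any update whose fingerprint nobody else has reported."""
    if fingerprint(update_bytes) in transparency_log:
        return "ok: matches what everyone else received"
    return "ALERT: nobody else got this update"

print(check_update(b"browser-update-87.0 (en)"))
print(check_update(b"browser-update-87.0 (en) + extra payload"))
```

A targeted backdoor shipped to one user shows up as a fingerprint that appears nowhere else in the log, which is exactly the alarm the scheme is built to raise.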
Binary transparency is elegant and cool. It gets turned on before companies get deputized to spy on their users, which means that it’s already in place when the G-men show up at your door. If they force you to push out a backdoor, binary transparency will reveal it. If they force you to push out an update for everyone that turns off binary transparency, everyone will notice and their paranoid targets will stop using it.
This means that a rational government agency won’t even bother to ask for backdoors, because they’ll never work. Because binary transparency takes backdoors off the table, it takes asking for backdoors off the table too.
That’s the theory. But binary transparency is one of those things that’s exciting in theory and really messy in practice. First of all, nearly every binary transparency alert is a false alarm: maybe the company sends different updates to different customers as a way of live-testing an experimental feature, or the update or its fingerprint gets changed in some minor way by an ISP that’s doing deep packet inspection or some other dumb thing. Neither of these things happens very often, but they both happen a lot more often than binary transparency catching a real backdoor (in part because companies known to have binary transparency turned on understandably don’t get as many backdoor demands from spies). So almost no one knows what binary transparency is, and if you do, chances are that all you know about it is that it’s a thing that you can safely ignore because it only ever throws false alarms.
Which wouldn’t be so bad if government agencies were rational, but spies are by definition total weirdos. Think for a second about the kid you knew growing up who always wanted to be a spy someday — the combination of grandiosity, authoritarianism, and paranoia. In the 1960s, the CIA tried surgically implanting cats with listening devices — and training them to spy on America’s enemies. (This is real. Google “acoustic kitty.”) Think about this for a second: not only did the CIA think the veterinarians who insisted you couldn’t implant huge battery-operated recording devices in live cats were just not trying hard enough—they also thought you could train cats. Because when you give paranoid, grandiose authoritarians an unlimited budget and no oversight, things get fucked up.
So any assumption that the spies won’t come knocking on a binary transparency shop because it’ll only waste their time and yours drastically overestimates the extent to which spies are averse to wasting their time and yours.
Which means some of those alerts from binary transparency checks aren’t false alarms. They’re just spooks betting on their ability to bull their way through stupid, uncooperative reality. Binary transparency is still used, because it shows up on checklists of “things companies should do to resist spying,” but in practice, everyone ignores it. Except Tanisha.