Tuesday, December 31, 2013

One of my 2014 resolutions is ...

... to read more things that Don Marti writes.

If you're not already reading Don Marti, you might not know about the fascinating topics he covers on his irregularly-updated blog.

He spends most of his time writing deeply about the online advertising industry, an important subject, given that advertising is probably the dominant business of the high-tech industry nowadays, but he occasionally posts on other topics, too. His essays often pursue the man behind the curtain: the lesser-known ways in which big money and large organizations manipulate the world in the search for Better Advertising.

Here are a few recent samples of his work:

  • Surveillance marketing meets sales norms
    I had an interesting conversation with a California resident a few days ago. A door-to-door sales rep had just come by, and as soon as he left, she called the police. The non-emergency police number, but still.

    It turns out that other people in the neighborhood had also called. We've gone from a society in which door-to-door sales was totally normal, even the subject of underground NSFW comics, to something that regular people call the police about.

  • Privacy snake oil
    Schneier's snake oilers were always trying to re-use one-time pads. You can't do that. Likewise, you can't collect and store PII—and it's all PII—and not have it come back to bite the people that it's about.
  • An argument for targeted advertising
    Non-creepy advertising isn't perfect, and doesn't solve all the customer/vendor match-up problems in the world. We have a lot of non-advertising tools for that. But it's a fallacy to say that just because non-creepy ads have a problem doing something, creepy ads are any better.
  • Hijacking the Internet
    It's an example of the Internet making an industry less efficient. Which should be in an Economics paper somewhere, but really, we need to find honest work for software developers now stuck in adtech. And for real advertisers to quit the adtech-captured IAB, but you knew that.
  • Internet trend: unexplained value of print ads
    Print continues to command an unreasonably large share of advertising budgets. Spending is down, but proportionally not as much as time.

    With the trendiness and bubblyness of digital, we'd expect it to go the other way.

    Something deeper than click fraud is going on here. Print is inherently more valuable because it's less trackable, and carries a better signal, and we keep seeing that in these Internet Trends reports.

If you like Don Marti, you might find Bob Hoffman's writings interesting, too.

Monday, December 30, 2013

And then there were four

In all the hustle and bustle of the holiday season, I forgot to call attention to a substantial event in local Civil Engineering: Caldecott Tunnel fourth bore opens

Ever since the third bore beneath the Oakland hills opened in 1964, Caltrans has shifted the traffic direction of the center bore to accommodate the heaviest traffic. That has forced drivers heading in the opposite direction to funnel from four lanes into two to squeeze through a single bore.

I think the writers for the Chronicle, like perhaps many of us, thought that this massive project would never be completed, as they kept referring to the three-tunnel arrangements in the present tense.

But that time is over! The fourth bore is here!

The new tunnel is, frankly, gorgeous. There's something about those overhanging ventilation fans that reminds me, say, of NCC-1701, or maybe the Warthog.

You can only get to the new tunnel, natch, coming East-to-West. Simply stay in the right lanes as you climb up toward the hilltop and you'll find yourself smoothly guided through this beautiful tunnel.

When we came through it the first time, I was a bit startled, because it is so much larger than the older tunnels, including an entire emergency shoulder lane inside the tunnel. A lot of work has gone into these safety systems.

Of course, there is good reason for that.

We don't often give credit to quiet successes; barely 200 people showed up when the Metropolitan Transportation Commission announced that the tunnel would open slightly earlier than expected.

For now, it's all good news. The delays through the tunnels are eased; the aggravating lane-switching protocols are no longer needed; our tax monies went to some needed infrastructure and it's serving its purpose.

I guess it's time for me to start nattering on about the next great public construction project in my area, which has been underway since (yes, you read that link correctly) 1970.

Castles of Burgundy: a very short review

My new board game this fall was Stefan Feld's The Castles of Burgundy.

This is our second Stefan Feld game; about a year ago I got his Trajan, which I thought was nice but rather busy.

Castles of Burgundy is, I think, quite enjoyable.

Castles of Burgundy has a fair amount of randomness (i.e., luck) in it. Each of your overall turns is controlled by a roll of the dice, and the selection and timing of the pieces on the board is also a matter of luck. In any particular game, not all the pieces will be used, and they will not all arrive on the board at the same time.

A further element of luck is that you may choose one of a variety of possible game boards at the start of the game, and the particular board you choose may or may not fit well to the pieces that actually appear as possible selections when it's your turn.

What this means is that Castles of Burgundy is a game where tactical optimization is quite important. Rarely can you successfully plan many moves in advance.

You may start the game with a lucky opportunity to fetch two Sheep tiles, and think that your winning strategy will be to fill your animal area with Sheep. And then, you may go three rounds without ever seeing another Sheep tile! The game will punish your stubborn inflexibility and refusal to adjust your course.

Another interesting aspect of this game is that, to be successful, you must, must, must pay attention to the strategies your opponents are using and make the occasional defensive play, which generally means compromising your own Grand Strategy to deny your opponent a particularly crucial tile. If they are working to fill their 5-tile Buildings region, and are desperate for that Watchtower, grab it! If they're favoring animals, and you can grab those Chickens, make it so!

Another nice thing about Castles of Burgundy is that it isn't terribly fatiguing to learn. 15 minutes with the rulebook, and one rather slow practice game, and you'll have it down, at which point you can switch from learning mode into playing mode, which is much better.

And Castles of Burgundy has plenty of mental challenge, even given the dice and board randomness. You'll find that there's just enough opportunity for thinking and planning to keep you entertained, without the deep multi-turn planning and organization issues that can turn a game like Agricola or Le Havre into a 4 hour event. Castles of Burgundy cleverly foils those approaches without making you feel like it's dumbed down to the point of dullness.

Heck, you can even play it at the beach!

Based on just a few weeks with the game, I'd say Castles of Burgundy is a winner, and if it looks like your cup of tea, give it a whirl.

Saturday, December 28, 2013

Why Does The World Exist: a very short review

Over the (all-too-brief) holiday, I flew (all-too-rapidly) through Jim Holt's intriguing book: Why Does the World Exist?: An Existential Detective Story.

Holt, a journalist (the New Yorker, the New York Times) and amateur philosopher, takes as the subject for his book the following question: "Why is there something rather than nothing?"

One day I went to the local college library and checked out some impressive-looking tomes: Sartre's Being and Nothingness and Heidegger's Introduction to Metaphysics. It was in the opening pages of the latter book, with its promising title, that I was first confronted by the question Why is there something rather than nothing at all? I can still recall being bowled over by its starkness, its purity, its sheer power. Here was the super-ultimate why question, the one that loomed behind all the others that mankind had ever asked.

Holt undertakes to study this question from a variety of perspectives: physical, astronomical, theological, mathematical, linguistic, philosophical, literary. And hence he begins a rather meandering journey around the globe, tracking down mathematicians, astronomers, philosophers, linguists, physicists, religious thinkers, and authors to ask their opinion of The Question.

Interspersed with his relating of the conversations he has with all these deep thinkers, Holt digs into various historical figures who considered these topics: Leibniz, Plato, Hegel, Sartre, Wittgenstein, and more.

Such an omnibus approach necessarily provides so varied a buffet that every reader of Holt's book will, at some point, throw up their hands and say: "what rubbish! Why did he bother to include that?"

But, on the other hand, there is such a collection of fascinating characters, with such intriguing and well-presented theories and proposals, that every reader will also, at some point, gasp in recognition and say: "yes! Yes, of course! How clear is that!"

Still, at times, Holt's book becomes somewhat of a catalogue:

My purpose, though, is a serious one. What I am struggling to do is to see the world in the most abstract way possible. That, it seems to me, is the best remaining hope for puzzling out why the world exists at all. All of the thinkers I had already spoken to fell short of complete ontological generality. They saw the world under some limited aspect. To Richard Swinburne, it was a manifestation of divine will. To Alex Vilenkin, it was a runaway fluctuation in a quantum vacuum. To Roger Penrose, it was the expression of a Platonic mathematical essence. To John Leslie, it was an outcropping of timeless value. Each of these ways of seeing the world purported to yield the answer to why it exists. But none of these answers struck me as satisfactory. They didn't penetrate to the root of the existential mystery -- to what Aristotle, in his Metaphysics, called "being qua being." What does it mean to be?

Surely, you cannot criticize Holt for being over-timid. If you are going to take on a Big Question, how much bigger could a question get? Holt's book has no easy answers, but it excites the mind and motivates the reader to think and consider and contemplate ideas that are big and complex and worthy of study.

What more can you hope for from a book than that?

I enjoyed this book a lot, and happily passed it on to the next reader, who I hope will enjoy it as well.

Thursday, December 26, 2013

Weather Report

On December 26th, 2013, it was 72 degrees (Fahrenheit) on the coast, clear and dry.

So we did the only sensible thing: packed up all 11 of us, and made the short drive to Francis Beach.

The weather was glorious, and a good time was had by all.

Wednesday, December 25, 2013

The PC is alive

Here's an absolutely spectacular photo-essay: This Insane Case Mod Looks Like a World War III Fever Dream.

Ikeuchi spent the better part of the last year building this incredible machine, a creation that isn’t so much a case mod as full-blown diorama. It’s a deliriously detailed little world that just happens to take place in and around a functioning computer. It also redefines the idea of what it means to have a cluttered desk.

Ikeuchi, a designer by trade, likes to call it his “secret base.” Inspired by mecha anime like Gundam and Macross, every surface is packed with something to discover. Soldiers tend to intricate, forbidding machinery. Mechs await repair. The work seamlessly blends plastic toys, gizmo components, and scraps of other materials with the computer itself.

Ikeuchi is quoted as saying: "The most important thing is to think of the computer as a living thing."

My aunt names all her inanimate objects: her computer, her car, her coffee maker, her knife sharpener.

And certainly I'm known to have conversations with my 20-year-old mini-van with some regularity: "please start! please start! please!!"

We invest much of our own selves in the objects we create, both hardware and software, and Ikeuchi's magnificent artwork makes that vividly clear.

Monday, December 23, 2013

As cuddly as a cactus, as charming as an eel

I'm not sure if it's the time of year, or just the Grinch in me coming out, but I'm feeling a bit contrarian, I guess.

I'm not really the only one, though; it seems to be something going around...

  • A Mathematician’s Lament
    For many years there has been a growing awareness that something is rotten in the state of mathematics education. Studies have been commissioned, conferences assembled, and countless committees of teachers, textbook publishers, and educators (whatever they are) have been formed to "fix the problem." Quite apart from the self-serving interest paid to reform by the textbook industry (which profits from any minute political fluctuation by offering up "new" editions of their unreadable monstrosities), the entire reform movement has always missed the point. The mathematics curriculum doesn’t need to be reformed, it needs to be scrapped.
  • Chuck Moore's Creations
    There's no return (;) between them. You can have multiple entry points and exit points in definitions! There is a certain simplicity and power that comes from not having a compiling vs. immediate mode as in regular Forth. Immediate words are just yellow and can be anywhere. Red words can also be anywhere. They simply give names to the current address in the instruction stream. There is no colon (:) word and definitions don't necessarily end with a return (;). It's also common to have an early return rather than an else. It's a different way to program, even for a Forth.
  • Why I want Bitcoin to die in a fire
    Like all currency systems, Bitcoin comes with an implicit political agenda attached. Decisions we take about how to manage money, taxation, and the economy have consequences: by its consequences you may judge a finance system. Our current global system is pretty crap, but I submit that Bitcoin is worst.
  • NSA Spying: Whom Do You Believe?
    This sort of thing can destroy our country. Trust is essential in our society. And if we can't trust either our government or the corporations that have intimate access into so much of our lives, society suffers. Study after study demonstrates the value of living in a high-trust society and the costs of living in a low-trust one.
  • RSA doesn’t quite deny undermining customers’ crypto
    So RSA’s defense is essentially that they didn’t undermine their customers’ security deliberately but only through bad judgment. That’s cold comfort for RSA customers—good security judgment is one of the main things one is looking for in a security company.
  • Aaugh rather than Arragh: Windows Piracy
    The final interesting finding is that IP enforcement does not seem to have any impact on piracy rates. This is no surprise to anyone who understands how the internet works but may be a surprise to policy-makers and corporations lobbying for anti-piracy measures. So when the UK banned the Pirate Bay in 2012 here is what happened to piracy rates:

    See that? Nothing. Zip. Nada. In other words, anti-piracy measures are possibly the most ineffective policy instrument ever. Why? Because the Internet.

  • Enigma codebreaker Alan Turing receives royal pardon
    A pardon is normally granted only when the person is innocent of the offence and where a request has been made by someone with a vested interest, such as a family member. On this occasion, a pardon has been issued without either requirement being met.
  • Oracle Buys Responsys
    As a part of the Oracle Marketing Cloud, we’ll be able to accelerate our vision of giving marketers across all industries the most advanced platform for orchestrating customer experiences over time and across channels. We couldn’t be more thrilled about what this means for our customers and employees.

Really, now: have you ever seen a more perfect juxtaposition of three words than:

Oracle Marketing Cloud

Saturday, December 21, 2013

Our world at the end of 2013

Happy holidays! If you know where the rain is, PLEASE SEND IT TO CALIFORNIA!

  • 'The Long Walk Home'
    Despite this, the people of Qunu were undeterred. They were welcoming their beloved "Tata" home. Everywhere I went I was greeted with a smile, a handshake, and the words, "Molo, Sisi" or "Hello, Sister." There was a spirit of South African ubuntu -- or unity -- just as President Obama noted in his moving memorial speech, and I knew all would come together to honor a great man.
  • The Birth of Standard Error
    "One afternoon several of us had the same experience -- typesetting something, feeding the paper through the developer, only to find a single, beautifully typeset line: "cannot open file foobar" The grumbles were loud enough and in the presence of the right people, and a couple of days later the standard error file was born..."
  • Into the Bitcoin Mines
    There are Bitcoin mining installations in Hong Kong and Washington State, among other places, but Mr. Abiodun chose Iceland, where geothermal and hydroelectric energy are plentiful and cheap. And the arctic air is free and piped in to cool the machines, which often overheat when they are pushed to the outer limits of their computing capacity.
  • 5 predictions on the future of databases (from a guy who knows databases)
    “I think the biggest NoSQL proponent of non-ACID has been historically a guy named Jeff Dean at Google, who’s responsible for, essentially, most to all of their database offerings. And he recently … wrote a system called Spanner,” Stonebraker explained. “Spanner is a pure ACID system. So Google is moving to ACID and I think the NoSQL market will move away from eventual consistency and toward ACID.”
  • Cards Stolen in Target Breach Flood Underground Markets
    But this store has earned a special reputation for selling quality “dumps,” data stolen from the magnetic stripe on the backs of credit and debit cards. Armed with that information, thieves can effectively clone the cards and use them in stores. If the dumps are from debit cards and the thieves also have access to the PINs for those cards, they can use the cloned cards at ATMs to pull cash out of the victim’s bank account.
  • The Google Technical Interview: How to Get Your Dream Job
    Each interviewer has a limited amount of time to convince themselves that you will be a great hire, and they want to spend that time in the most efficient way. Therefore once you are in a technical interview, our interviewers will mostly focus on programming problems, not the resume, which we find to be the best use of your time.
  • Pond
    So Pond is not email. Pond is forward secure, asynchronous messaging for the discerning. Pond messages are asynchronous, but are not a record; they expire automatically a week after they are received. Pond seeks to prevent leaking traffic information against everyone except a global passive attacker.
  • A Crypto Challenge For The Telegram Developers
    Let’s do this right and build a real Open Source secure asynchronous messaging solution that is more than snake oil and marketing gimmicks. TextSecure, the Open Source app we’ve been developing at Open WhisperSystems, uses the Axolotl ratchet, which we believe should represent the core of any secure asynchronous messaging solution today. We’ve worked with Cyanogen to transparently integrate the TextSecure protocol into CyanogenMod
  • The Taxonomy of Terrible Programmers
    The Hoarder is a cautious creature, perpetually unsure of itself. The Hoarder lives in a world of perpetual cognitive dissonance: extremely proud of his work, but so unsure of himself that he won’t let anyone see it if it can be helped.

    So he hides his code. Carefully avoiding check-ins until the last possible minute, when he crams it all into one monolithic commit and hopes no one can trace the changes back to him. His greatest fear is the dreaded merge conflict, where the risk of exposure is greatest.

  • The Cinematography of "The Incredibles" Part 1
    See all the crazy angles in the following shots. Nothing is by accident, the perspective they chose was purposefully done to help visually tell the story. Either to see a character's point of view, or to help show the dominance of a character with a certain interplay. Close-ups show what a character is thinking or feeling, over-the-shoulder shots place the audience right into the conversation, and the whole time there are shapes and lines in the foreground and background that aid in leading the viewer's eyes to where they need to look.
  • The Google Test and Development Environment - Pt. 1: Office and Equipment
    Google is a highly collaborative workplace, so the open floor plan suits our engineering process. Project teams composed of Software Engineers (SWEs), Software Engineers in Test (SETs), and Test Engineers (TEs) all sit near each other or in large rooms together. The test-focused engineers are involved in every step of the development process, so it’s critical for them to sit with the product developers. This keeps the lines of communication open.

    The office space is far from rigid, and teams often rearrange desks to suit their preferences. The facilities team recently finished renovating a new floor in the New York City office, and after a day of engineering debates on optimal arrangements and white board diagrams, the floor was completely transformed.

    Besides the main office areas, there are lounge areas to which Googlers go for a change of scenery or a little peace and quiet. If you are trying to avoid becoming a casualty of The Great Foam Dart War, lounges are a great place to hide.

  • Scott Hanselman's 2014 Ultimate Developer and Power Users Tool List for Windows
    Everyone collects utilities, and most folks have a list of a few that they feel are indispensable. Here's mine. Each has a distinct purpose, and I probably touch each at least a few times a week. For me, "util" means utilitarian and it means don't clutter my tray. If it saves me time, and seamlessly integrates with my life, it's the bomb.
  • An appeal for security for the ordinary developer
    This style of communication, which I see quite often in security topics, makes it very easy for newcomers to feel utterly helpless. You are told that a particular practice is bad security-wise, but do not know how to improve, and it’s too abstract to figure it out on your own. I happen to know that strncat is safer than strcat because the latter can cause buffer overflows, and otherwise the manpage of strcat is quite vocal in explaining this. I have only a vague idea of how the block size of MACs influences timing channels. In other words, if we help developers discover what is wrong, it is most vital that we show them a clear path towards improvement.
  • Sex in Title, and Other Stories
    It seems perverse that in our digital society, where we are freer than ever to work where we like, with whom we like, on what we like, that our communities are more gender biased than ever. The most egalitarian societies, in the gender sense, are totalitarian states. This is surely a sign that we're doing something profoundly wrong when it comes to large-scale organization. The long-standing accusation of sexism may be accurate data yet it's done nothing to improve things, and instead widens the disagreeable, and enduring, split between the genders.
  • The Real Purpose of Oakland's Surveillance Center
    So what is the real purpose of the massive $10.9 million surveillance system? The records we examined show that the DAC is an open-ended project that would create a surveillance system that could watch the entire city and is designed to easily incorporate new high-tech features in the future. And one of the uses that has piqued the interest of city staffers is the deployment of the DAC to track political protesters and monitor large demonstrations.

I've only scratched the surface. There is too much to learn, and not enough time.

Take care, stay warm, sleep late.

Bowl season is here!

As always, get prepared and plan your schedule with Pat Forde's omnibus article: Dashing through the college football bowl season

Forde's survey is packed with all you need:

  • analysis of motivation:
    AdvoCare V100 Bowl (22): Arizona vs. Boston College, Dec. 31.

    Motivation Meter: Who doesn't want to spend New Year's in Shreveport? OK, so neither team spent the summer running gassers with the goal of playing in this game – but it beats being home watching other teams play. Both teams should be moderately motivated.

  • assessment of watchability:
    Sun Bowl (24): UCLA vs. Virginia Tech, Dec. 31.

    Watchability (scale of 1-5): 3.5. Much like the Advocare/Independence/Weed Eater Bowl above, you have to respect the longevity of a mid-level bowl like this in an out-of-the-way locale like El Paso. It has carved out a niche in the college football landscape. Long live the Sun Bowl. Take time to watch it.

    Belk Bowl (15): North Carolina vs. Cincinnati, Dec. 28.

    Watchability (scale of 1-5): 2.5. This game gets the garden-spot, mid-afternoon time slot on a Saturday. During the regular season that means a major SEC showdown on CBS; during bowl season that means the fifth-place team in the ACC Coastal Division against the third-place team in the AAC. Oh well.

  • and, with a straight face this time, games not to be missed:
    Rose Bowl (31): Stanford vs. Michigan State, Jan. 1.

    Watchability (scale of 1-5): 5. In terms of pure aesthetics, it's the most watchable game of the year. Every year. It wins in terms of tradition, too. And this year's matchup is the second-best bowl game, for The Dash's money.

    Orange Bowl (35): Ohio State vs. Clemson, Jan. 3.

    Watchability (scale of 1-5): 5. Urban Meyer coaches again in the state of Florida. How have he and his team handled the devastating loss to Michigan State in the Big Ten title game? And Clemson returns to the scene of its January 2012 bowl crime, where it surrendered 70 to West Virginia.

    Motivation Meter: Buckeyes talked a good game after the loss to the Spartans about being ready and finishing the season right in Miami, but that’s far easier said than done given what was lost. For Clemson, the motivation should be very high – especially after another damning loss to South Carolina.

    Best Orange Bowl Ever: Miami 31, Nebraska 30 in 1984. A massive upset that spawned the Miami dynasty and derailed one of the greatest teams in college football history. Cornhuskers scored late and Tom Osborne gamely opted for a two-point conversion, when playing to tie (in the pre-overtime days) likely would have cemented the national title. When Turner Gill’s pass fell incomplete, we had a new world order in college football for the next decade.

Well, at least it's something to do when you suddenly find yourself with three hours while everyone else is at the mall, on airport runs, or asleep.

Friday, December 20, 2013

Learning to age gracefully

I've always felt "young at heart." I stubbornly continue doing things like playing soccer, riding my bike to work, going backpacking and hiking, and so forth.

One of the things that's paramount in my mind, as I age, is to retain mobility. I've got close family members who aren't mobile, and I can see that it really impacts their quality of life. So being able to get around freely is extremely important to me.

Recently, our beautiful 6 1/2-year-old Labrador, the best companion a family could ever ask for, has suddenly developed a severe case of arthritis or some other sort of joint damage. In hindsight, this had been coming on for a while, but to us it seemed very sudden. She had recovered from all her previous injuries quite quickly, but this time it's been a stubborn several weeks with no improvement.

She's always been a fetch-and-retrieve dog, and so the restrictions on carrying balls, sticks, pine cones and other toys have come as a big change in her lifestyle. If you had seen her 5 years ago, racing through a meadow at top speed, eyes focused on a flying frisbee, snatching it out of the air at the last moment, and then watched her now, limping down the hall, barely able to make it 100 yards from the house, it would break your heart.

Besides the mobility, the other thing that terrifies me is losing my eyesight, as I'm so incredibly dependent on my vision, not only for my employment, but also for my general lifestyle.

So I recently made my regular trip to the eye doctor, worrying about the vision problems that have lately been affecting family members.

Happily, after a thorough, if exhausting, two hours at the doctor (did you know they make an MRI of your eyeball now?!), he pronounced himself thoroughly pleased: "perfect visual field, no blind spots, no evidence of glaucoma, no retinal tears or scars, no evidence of tumors, no cataracts, optic nerve looks healthy and strong, eyesight remains stable and prescription has changed just slightly."

I've got lots of things to do, lots left on my list, so I'm trying to keep healthy and active and stay busy.

And so far, so good.

Thursday, December 19, 2013

Familiarity Breeds Contempt

Following an interwebs link, I recently ran across a paper by Sandy Clark, Stefan Frei, Matt Blaze, and Jonathan Smith titled: Familiarity Breeds Contempt: The Honeymoon Effect and the Role of Legacy Code in Zero-Day Vulnerabilities.

This is the sort of academic research that there should be more of: with bold eyes, they take a fresh look at some precepts that were held to be Truth, run them through the maw of Hard Data, turn them on their heads, offer some suggestions as to why the surprising results might actually hold, and point to areas that have been under-considered.

So, what is the Received Wisdom that they investigate? Well, they consider the 40-year-old notion of Software Reliability Models (SRM), and the younger, but still widely known, notion of Vulnerability Discovery Models (VDM), both of which make statements about the expected behavior of a body of software over time.

The implications of such a VDM are significant for software security. It would suggest, for example, that once the rate of vulnerability discovery was sufficiently small, that the software is "safe" and needs little attention. It also suggests that software modules or components that have stood the "test of time" are appropriate candidates for reuse in other software systems. If this VDM model is wrong, these implications will be false and may have undesirable consequences for software security.

In other words, how do we assess risk when we are building software? If we reuse software that has a long pedigree, should we trust that reused code more, less, or about the same as we trust new code?

For other measures, such as the cost and speed of development, the reuse of existing code has many well-known benefits, but here the authors are specifically considering the implications for security. As they say:

It seems reasonable, then, to presume that users of software are at their most vulnerable, with software suffering from the most serious latent vulnerabilities, immediately after a new release. That is, we would expect attackers (and legitimate security researchers) who are looking for bugs to exploit to have the easiest time of it early in the life cycle.

But after crunching lots of numbers, they find out that this presumption does not hold:

In fact, new software overwhelmingly enjoys a honeymoon from attack for a period after it is released. The time between release and the first 0-day vulnerability in a given software release tends to be markedly longer than the interval between the first and the second vulnerability discovered, which in turn tends to be longer than the time between the second and the third.

Furthermore, this effect seems pervasive:

Remarkably, positive honeymoons occur across our entire dataset for all classes of software and across the entire period under analysis. The honeymoon effect is strong whether the software is open- or closed- source, whether it is an OS, web client, server, text processor, or something else, and regardless of the year in which the release occurred.

So, why might this be?

The researchers have several ideas:

One possibility is that a second vulnerability might be of similar type of the first, so that finding it is facilitated by knowledge derived from finding the first one. A second possibility is that the methodology or tools developed to find the first vulnerability lowers the effort required to find a subsequent one. A third possible cause might be that a discovered vulnerability would signal weakness to other attackers (i.e., blood in the water), causing them to focus more attention on that area.

Basically, time is (somewhat) more on the side of the attackers than the defenders here.

From my own experience, I'd like to offer a few additional observations that are generally in agreement with the authors' findings, although from a slightly different perspective.

  • Firstly, bug fixers often fail, when fixing a particular bug, to consider whether the same (or similar) bugs might exist elsewhere in the code base. I often refer to this as "widening the bug", and it's an important step that only the most expert and experienced engineers will take. Plus, it takes time, which is all too often scarce. Attackers, though, are well known users of this technique. When an attack of a certain type is known, it is common to see attackers attempt the same attack with slight adjustments over and over, looking for other places where the same mistake was made.
  • Secondly, fixing old code is just plain scary. In practice, you don't have as many test suites for the old code; you are unsure of how subtle changes in the behavior of old code will affect all of the various places where it's used; the original authors of that old code may have departed, or may have forgotten how it worked or why they did it that way. Developers may well be aware of bugs in older code, but simply decide it's too expensive or too risky to fix it, and hence allow a bug to remain present in older code even though they know it exists.
  • Lastly, that "blood in the water" sentiment is real, even though it's hard to interpret. The reluctance by the manufacturer to advertise information about known vulnerabilities, often labeled "security by obscurity," is in many ways an easy emotion to comprehend: "If we tell people there's a security bug in that old release, aren't we just inviting the bad guys to attack it?" Although security researchers have done excellent work to educate us all about the problems with vulnerability secrecy, it's hard to educate away an instinct.

Overall, this is a fascinating paper, and I'm glad I stumbled across it, as there's a lot to think about here. I'm certainly not going to give up my decades-long approach of reusing software; it has served me well. And I'm far from persuaded by some of the alternative solutions proposed by the authors:

research into alternative architectures or execution models which focuses on properties extrinsic to software, such as automated diversity, redundant execution, software design diversity might be used to extend the honeymoon period of newly released software, or even give old software a second honeymoon.

Some of these ideas seem quite valid to me. For example, Address Space Layout Randomization is quite clever, and indeed is a very good technique for making the life of the attacker much harder. But "software design diversity"? Harumph, I say.

I suspect that, in this area, there is no easy answer. Software security is one of the most challenging intellectual efforts of our time, with many complexities to consider.

For now, I'm pleased to have had my eyes opened by the paper, and that's a good thing to say about a research project. Thanks much to the authors for sharing their intriguing work.

Wednesday, December 18, 2013

Up comes the Respect

Just as planned, the D B General got right to work today, fastening slings around the Respect and bringing her back from Davy Jones's locker.

My good friend Cal got a nice picture of the D B General, hard at work:

As it rises slowly from the murky depths, it's hard to imagine what the Respect was once like.

Happily, Hamish, a colleague of mine on and off for many years, has made a habit of photographing this part of the world, and has preserved several of his pictures online:

  • Around Jingletown -- Respect (2006). If you look very closely, you can see my office building in the background of this picture of the Respect from 2006.
  • Around Jingletown -- Respect (2007). This is the view, more or less, from my office. Just off the port (left) side of the Respect, you can see, canted at about a 20 degree angle, the jack-up legs of one of the sunken barges. This barge still remains to be recovered; hopefully now that the Respect is free of the estuary floor, the barge will follow suit soon.

It's been a busy several months here on the Oakland-Alameda estuary, with a lot of successful cleanup performed.

I can't wait to see the results when they're all done!

Robert Read's How To Be A Programmer

I hadn't seen Robert Read's How to be a Programmer: A Short, Comprehensive, and Personal Summary until recently, when it was linked to by some random web newsletter that I follow.

Although it suffers somewhat from being overly long, there is still a lot of wisdom in this essay, and it's certainly worth the time to read.

I suppose I was hooked when he started with perhaps my favorite topic: debugging.

Debugging is the cornerstone of being a programmer. The first meaning of the verb to debug is to remove errors, but the meaning that really matters is to see into the execution of a program by examining it. A programmer that cannot debug effectively is blind.

I like that much of Read's advice is very practical. For example, when talking about debugging, he notes that you can use a debugger, but that there are many other ways to debug: reading the code, reading log files and traces, inserting print statements, adding assertions to the code, etc.

Since debugging is one of those activities where the goal is not to win a beauty prize but to get the job done, a practical approach is the only sensible way, and it told me right away that Read was speaking from experience.

It's also interesting to see how Read approaches a problem I've seen many a time: fear of making things worse:

Some beginners fear debugging when it requires modifying code. This is understandable---it is a little like exploratory surgery. But you have to learn to poke at the code and make it jump; you have to learn to experiment on it, and understand that nothing that you temporarily do to it will make it worse. If you feel this fear, seek out a mentor---we lose a lot of good programmers at the delicate onset of their learning to this fear.

Or, as my old friend Tom always says, "We call it software because you can change it."

Of course, some bugs are simply hard, and it's a measure of Read's real-world experience that he's seen those before, too:

If you can't reproduce it, set a trap for it by building a logging system, a special one if you have to, that can log what you guess you need when it really does occur. Resign yourself to the fact that if the bug only occurs in production and not at your whim, this may be a long process. The hints that you get from the log may not provide the solution but may give you enough information to improve the logging. The improved logging system may take a long time to be put into production. Then, you have to wait for the bug to reoccur to get more information. This cycle can go on for some time.

How true this is! I have several bugs, important ones that I'm enormously concerned about, that I'm deep into this "cycle" with, myself. A few of them have been plaguing me for years. Sadly, one of the only ways you know you're working on an important body of software is when you encounter situations like these.

If it were an easy bug, we'd have fixed it by now. The hard bugs are the only ones left.
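
For what it's worth, the "trap" Read describes doesn't need to be elaborate to be useful. Here is a minimal sketch, in Python, of the kind of targeted, long-running logging you might leave in place while waiting for a production-only bug to reoccur; the order object and its fields are purely hypothetical stand-ins for whatever state you currently suspect is involved:

    import json
    import logging
    import logging.handlers

    # A rotating log so the trap can sit in production for weeks without
    # filling the disk while we wait for the bug to show up again.
    trap_log = logging.getLogger("repro_trap")
    trap_log.setLevel(logging.INFO)
    trap_log.addHandler(logging.handlers.RotatingFileHandler(
        "repro_trap.log", maxBytes=10_000_000, backupCount=5))

    def check_order_invariant(order):
        """Snapshot state whenever the suspected bad condition appears."""
        if order.total < 0:  # the condition we suspect but cannot reproduce
            trap_log.info("negative total: %s", json.dumps({
                "id": order.id,
                "item_count": len(order.items),
                "total": order.total,
            }))

Each time the trap fires you learn a little more, refine what it records, redeploy, and wait again; that is exactly the slow cycle Read warns about.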

Read's essay, like any deeply personal essay, is uneven. He spends a lot of time covering topics such as:

  • How to Recognize When to Go Home
  • How to Stay Motivated
  • How to Disagree Honestly and Get Away with It
  • How to Tell People Things They Don't Want to Hear
  • How to Get a Promotion
  • and How to Deal with Organizational Chaos
which indicate that he has spent quite a bit of time in the trenches of the modern software development world, with its pressure, emotion, and chaos.

But Read will get no friends for making a suggestion such as:

commonly, some stressed-out person who does not have the magic power will come into your cube and tell you to do something stupid. If you are really sure that it is stupid, it is best to smile and nod until they go away and then carry on doing what you know is best for the company.

This, of course, is exactly why software engineers have the prima-donna reputation that they do.

And it's not clear to me if Read even realizes what he's saying. There's no hint that this is tongue-in-cheek, or that he understands the implications of his suggested approach to the cacophony of opinions that characterizes software development, and how it risks branding the practitioner as one of Those Guys You Never Want To Work With Ever Again.

Yet it is also valuable advice, if you understand how and when to use it properly.

I suspect that Read and I would find ourselves to Disagree Honestly on many more topics. For example, I don't think he gives anywhere near enough credit to the writing of tests, which is the one technique that I've spent the most time developing in my own repertoire over the last decade, and which I think remains the most underused technique among the stellar programmers that I know.

And he almost completely overlooks another crucial topic, in my opinion: acquire tools, learn to use them, and practice using them well. Here, I'm thinking about tools like editors, code analyzers, bug trackers, test harnesses, performance monitors, shells and scripting tools, code coverage reports, memory leak detectors, SCM systems, and the like.

Read is at least honest about his weaknesses in this area, noting, for example, that

I was late to appreciate the benefits of source code control systems but now I wouldn't live without one even on a one-person project. Generally they are necessary when you have a team working on the same code base. However, they have another great advantage: they encourage thinking about the code as a growing, organic system. Since each change is marked as a new revision with a new name or number, one begins to think of the software as a visibly progressive series of improvements. I think this is especially useful for beginners.

This is all well and good, but it only begins to touch the power of what you can do with a source code control system. Code archaeology, project branching, configuration tracking, bug bisection, system organization, code security, build automation; all these techniques and many more become possible once you start to learn how to really use your source code control tools effectively.
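
Bug bisection, in particular, is something a modern source code control tool will practically do for you. Here's a minimal sketch, in Python, of the kind of repro script you could hand to git bisect run; the build and test commands are placeholders, so substitute whatever your project really uses:

    #!/usr/bin/env python3
    """Repro script for `git bisect run`: exit 0 = good, 1 = bad, 125 = skip."""
    import subprocess
    import sys

    # Build the checked-out revision; skip revisions that don't even build.
    if subprocess.call(["make", "-s"]) != 0:
        sys.exit(125)

    # Run the smallest reproduction of the bug that you have.
    result = subprocess.run(["./run-tests", "--only", "regression-1234"])
    sys.exit(0 if result.returncode == 0 else 1)

With that in hand, git bisect start, one known-good and one known-bad revision, and git bisect run python3 repro.py will walk the history and point at the change that introduced the problem.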

Similarly, one of the breakthrough moments for me came when I was, frankly, stumped on a subtle and elusive problem, battering my head against the same walls. A wise programmer of my acquaintance, when I approached him for advice, took a holistic view, asking questions like:

  • From the historical records of your build automation system, can you tell when the problem began to appear? what platforms and configurations it appears most commonly on? what patterns appear in the problem occurrences?
  • Can you correlate the problem to performance behaviors using your performance tools?
  • How has the configuration of your test systems changed since the code was written?
  • What shows up when you search the bug tracking database for that message?
  • What is the history of that bit of code? When was it last modified, why, and what bugs were being fixed? Do you have tests for those bugs? Can your coverage tools tell you if they're exercising that code?

A few days later, after writing some tools to crunch the data, we spotted the trigger. In retrospect, of course, it was obvious what the problem was, but in the heat of the moment it was a revelation to me to understand how this programmer, more expert than I, had developed ways to break through roadblocks by continually searching for new evidence, developing and disproving theories, and using as many tools as possible to improve the power and accuracy of his evaluation. There's a great Sherlock Holmes quote about this use of tools:

Now the skillful workman is very careful indeed as to what he takes into his brain-attic. He will have nothing but the tools which may help him in doing his work, but of these he has a large assortment, and all in the most perfect order.
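
In that spirit, here is a minimal sketch, in Python, of the sort of throwaway data-crunching tool I have in mind. The nightly-results CSV layout is entirely hypothetical; the point is simply that tallying failures by platform and month is often the first real clue about when and where a problem began:

    import collections
    import csv
    import glob

    # Hypothetical layout: one CSV per nightly run, with columns
    # date,platform,test,status -- adjust to whatever your CI actually emits.
    failures = collections.Counter()
    for path in glob.glob("nightly-results/*.csv"):
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                if row["status"] == "FAIL":
                    failures[(row["platform"], row["date"][:7])] += 1

    # A crude platform-by-month failure table; clusters jump right out.
    for (platform, month), count in sorted(failures.items()):
        print(f"{month}  {platform:12}  {count}")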

But Read's essay is over a decade old; I'm sure that his views have changed during the years, and perhaps if he and I were to meet now, we'd find that we agree on more things, and disagree on fewer.

If you're considering a career as a programmer, if you know a programmer and find that you must spend a lot of time with her or him, or if you're just curious about what goes through the mind of a programmer, Read's essay is well worth your time.

And if you're already a programmer, and want to become a better one, I'm confident that you'll find lots of ideas to chew on in Read's essay; self-improvement never stops, and there is always more to learn.

Tuesday, December 17, 2013

D B General

I don't want you to get too jealous, but this baby's moored about 50 yards from my office right now: D.B. General: 700 Ton Floating Crane

The D B General is no ordinary crane, having been recently used in the Bay Bridge construction project, as well as other local large-scale projects.

You can tell it's a special crane, because it's listed in Wild Crane Photos, no. 16:

This is the DB General picking a Manitowoc 222 from a barge and setting it on the deck of the new Benicia Bridge in California. The General is a Clyde 52

That's nothing for the D B General, though, once known as The largest derrick barge on the West Coast

It was a surreal scene for onlookers standing on the shore at Shine Tidelands State Park, many who waited all day to see the lift that, by engineering standards, was no big deal considering the DB General's 700-ton lift capacity.

Once the 280-foot-long, 70-foot-wide and 40-foot-tall truss was in the air -- almost a million pounds of steel hanging there like a toy -- tugs gently pulled the DB General directly backward. Two other tugs then nudged another barge into perfect alignment under the truss.

Although the General was once the largest barge crane on the West Coast, it is now of course dwarfed by the Left Coast Lifter.

Here at home, we didn't need the Left Coast Lifter today; the trusty D B General is onsite to complete the task of raising the sunken tugboat "Respect".

Sunday, December 15, 2013

Where'd You Go, Bernadette? : a very short review

Maria Semple's Where'd You Go, Bernadette? is sort of the fiction equivalent of Beaujolais Nouveau, Strawberries at Wimbledon, or a bouquet of roses: it is not meant to be placed on a shelf, stored for later enjoyment; it needs to be consumed and enjoyed now, or it will spoil.

This is not to say that Where'd You Go, Bernadette is an inferior pleasure. It is a comic delight, bubbly and energetic, vivid and captivating. It's just to note that a book which is so thoroughly stylish and au courant runs the risk of being simply baffling to readers just a few years later.

After all, Semple weaves references such as Daniel's Broiler, TED talks, the Microsoft Connector, the Tuba Man, and even Cliff Mass, the blogging weatherman, into the book, which is told primarily as a series of emails, tweets, and blog posts.

But underneath all the pop culture references is an engaging story populated with a delightful cast of characters, told with a light touch, a wink, and a smile.

I suspect this story will particularly appeal to the bit jockeys of the world, those who can instantly appreciate a passage such as

Mr. Branch's administrator knocked and asked if Mr. Branch had reviewed a code fix. Mr. Branch looked at his cell phone and shuddered. Apparently, forty-five emails had come in while we were talking. He said, "If Bernadette doesn't kill me, Reply All will." He scrolled through the emails and barked some code talk about submitting a change list, which his administrator furiously copied down before dashing out.

But even if you've never submitted a change list in your life, I think you'll enjoy Where'd You Go, Bernadette?. The holidays are busy times for everyone, but if you have a bit of downtime, and you're looking for something to brighten up your afternoon, and drive away those winter grays, give Where'd You Go, Bernadette? a try.

Saturday, December 14, 2013

Europa Universalis IV: a very short review

Have you played Europa Universalis IV?

No?

Well, let me tell you how it's going to go.

First, you'll download the game and install it. When you start it up, you'll gape at the beauty of the screen.

You'll work your way through the tutorials, getting an understanding for how the interface works, and what the various screens and dialogs are trying to tell you.

All too soon, you'll be finished with the tutorial, and you'll be invited to play a game. "Pick a country!", the computer says, "and let's get going."

So you do.

And you are immediately lost, baffled, overwhelmed.

But you'll persevere. You'll take some time to read the manual, which Paradox have so thoughtfully made available for all.

You'll spend time reading about the game on the Internet, considering the various advice, strategies, and tips that people discuss.

And you'll keep trying to play.

After a while, you'll learn how to pause the game, and how to slow the game speed way down, so that you can see what's happening.

And at some point, when you least expect it, you'll find that you suddenly can't stop thinking about the game.

"France needs to ally with Aragon," you'll think to yourself, "so what would make that happen?" Should you investigate a Royal Marriage? Send a diplomat to improve relations? Offer a bribe? Eventually, you learn enough about the game to discern that Aragon's reluctance is due to the active war that France is prosecuting with England.

And so you wonder: will suing for peace with England, to your north, actually aid relations elsewhere on the continent? How will you manage to unite the various dukedoms, and form a unified France?

Meanwhile, you've got troops to lead, an economy to run, diplomatic enquiries from all fronts, and a trading empire to build.

Each time you play, you'll realize how crude your previous attempts were, and how horribly you've mis-managed affairs, and so you'll start anew again, and again, and again.

Before you know it, Steam will tell you that you've been playing a total of 74 hours so far, and you'll still feel like such a rank amateur.

Will you enjoy it? I don't know. This is a vast and complex game, like nothing you've played before.

But if you think you might enjoy it, I encourage you to give it a try, as the game is truly a work of art, and enormously repays the time you devote to it.

Tuesday, December 10, 2013

Some good long-form writing

I'm not sure if it's because the days are cold and dark, so everyone is staying inside and reading and writing, or because there is some sort of harmonic convergence, or perhaps I just got lucky, but I've been reading some very interesting longer-form CS writing recently.

Here are a few examples:

  • Carlos Bueno: The Mature Optimization Handbook

    Bueno's e-book is a clear, compact, well-organized treatment of performance optimization. The title is a riff on Don Knuth's 40-year-old tongue-in-cheek sound bite, which sadly is all that many computer professionals ever learn about performance observation.

    If you want to go further, start with Bueno's superb book: he points you in the right direction, saves you from several basic pitfalls, arms you with a collection of useful tools and techniques, and points at resources to help you move further once you've grown comfortable with the basics. I particularly like the fact that Bueno links to some of Richard Cook's work, as I think Cook has some fascinating ideas and deserves more attention.

    And I just love this hard-won advice:

    Your instrumentation should cover the important use cases in production. Make all the measurements you want in the lab, but nothing substitutes continuous real-world data. Think about it this way: optimizing based on measurements you take in a lab environment is itself a falsifiable theory, ie, that lab conditions are sufficiently similar to production. The only way to test that theory is to collect measurements in production too.
  • Michael Nielsen: How the Bitcoin protocol actually works.

    I've read, oh, approximately 8 trillion articles about Bitcoin; these days, writing a "here, let me explain Bitcoin to you" article seems to be one of the rites of passage.

    Most of them are rubbish.

    But Nielsen's exposition is clear, nicely paced, compactly worded without being dense, and somehow hits just the right level of explanation for me. As I read it, I was reminded of another great document that, while wildly different, is still very similar in approach and technique: Bill Bryant's Designing an Authentication System: a Dialogue in Four Scenes. In both documents, the approach is to start with a solution that seems like it should work, identify the problems with that solution, and evolve from there (if you want a hands-on taste of the key ingredient in Nielsen's later iterations, the proof-of-work puzzle, there's a tiny sketch at the end of this list):

    My strategy in the post is to build Bitcoin up in stages. I’ll begin by explaining a very simple digital currency, based on ideas that are almost obvious. We’ll call that currency Infocoin, to distinguish it from Bitcoin. Of course, our first version of Infocoin will have many deficiencies, and so we’ll go through several iterations of Infocoin, with each iteration introducing just one or two simple new ideas. After several such iterations, we’ll arrive at the full Bitcoin protocol. We will have reinvented Bitcoin!

    This strategy is slower than if I explained the entire Bitcoin protocol in one shot. But while you can understand the mechanics of Bitcoin through such a one-shot explanation, it would be difficult to understand why Bitcoin is designed the way it is. The advantage of the slower iterative explanation is that it gives us a much sharper understanding of each element of Bitcoin.

  • Ralph Langner: To Kill a Centrifuge : A Technical Analysis of What Stuxnet’s Creators Tried to Achieve.

    Langner has devoted 6 years of his life to studying, analyzing, deconstructing, and, most importantly, explaining Stuxnet, the most sophisticated and fascinating piece of malware yet unleashed upon the world.

    Although Langner is not the most natural of writers (in his defense, I suspect English is not his first language), he more than makes up for his dry prose with the amazing depth of detail and knowledge that he includes in this work.

    The attack continues until the attackers decide that enough is enough, based on monitoring centrifuge status, most likely vibration sensors, which suggests a mission abort before the matter hits the fan. If the idea was catastrophic destruction, one would simply have to sit and wait. But causing a solidification of process gas would have resulted in simultaneous destruction of hundreds of centrifuges per infected controller. While at first glance this may sound like a goal worthwhile achieving, it would also have blown cover since its cause would have been detected fairly easily by Iranian engineers in post mortem analysis. The implementation of the attack with its extremely close monitoring of pressures and centrifuge status suggests that the attackers instead took great care to avoid catastrophic damage. The intent of the overpressure attack was more likely to increase rotor stress, thereby causing rotors to break early – but not necessarily during the attack run.

    If there is something you want to know about Stuxnet, you will find it here.

    Much of what Langner writes remains controversial, and certainly this story is far from complete. But Langner has done the world a tremendous service by sharing his deep and broad knowledge of Stuxnet widely and openly. Read it. Think about it. Understand just that little bit more about the strange new world we occupy.
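
To make the first item above, about measuring in production, a bit more concrete: here's a minimal sketch of the kind of thing it's describing: time an operation where it actually runs, and hand the result to whatever monitoring you already have. The record_metric callback is purely hypothetical; substitute your own statsd, log line, or telemetry API.

    import time
    from contextlib import contextmanager

    @contextmanager
    def measured(operation, record_metric):
        """Time a block of code and report the elapsed milliseconds.

        record_metric stands in for whatever production monitoring
        system you already have (statsd, a log line, a metrics API).
        """
        start = time.monotonic()
        try:
            yield
        finally:
            elapsed_ms = (time.monotonic() - start) * 1000.0
            record_metric(operation, elapsed_ms)

    # Wrap the real production code path, not just the lab benchmark.
    def print_metric(name, value):
        print(f"{name}: {value:.1f} ms")

    with measured("checkout.total", print_metric):
        time.sleep(0.05)   # placeholder for the real work

Collected continuously, those numbers are the test of the "lab conditions resemble production" theory; a one-off lab benchmark never is.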

So if, wherever you should be, the days are short and the nights are cold, pull up a comfy chair, grab your reading device, and sink your brain into some deep thoughts. Enjoy!

Sunday, December 8, 2013

Inoreader looks like a good alternative to Feedly

I've been using Feedly for six months, ever since Google Reader vanished, and overall I'm quite happy with it.

For some reason, the other day, I decided to give Inoreader a try: http://inoreader.com.

After only a little bit of use, my early reaction is that Inoreader is quite good, and is a worthy alternative to Feedly.

It's nice to have two solid alternatives out there.

Friday, December 6, 2013

World Cup 2014: may the complaining begin!

The draw is complete!

Here's how the groups shake out:

  • Group A:
    • Brazil
    • Croatia
    • Mexico
    • Cameroon
  • Group B:
    • Spain
    • Netherlands
    • Chile
    • Australia
  • Group C:
    • Colombia
    • Greece
    • Ivory Coast
    • Japan
  • Group D:
    • Uruguay
    • Costa Rica
    • England
    • Italy
  • Group E:
    • Switzerland
    • Ecuador
    • France
    • Honduras
  • Group F:
    • Argentina
    • Bosnia-Herzegovina
    • Iran
    • Nigeria
  • Group G:
    • Germany
    • Ghana
    • United States
    • Portugal
  • Group H:
    • Belgium
    • Russia
    • Algeria
    • South Korea

I'm no expert, but it seems to me that Brazil, Argentina, and Colombia must count themselves pleased, while Germany, Spain, and Uruguay have to steel themselves for stiff competition. Switzerland and Belgium fall somewhere in the middle, I think.

Both England and the U.S. were destined to have hard schedules, and certainly both now have their work cut out for them.

Can't wait for next summer!

Tuesday, December 3, 2013

Masters of Doom: a very short review

Once again, I'm late to the party, in this case an entire decade late, having only recently stumbled across Masters of Doom: How Two Guys Created an Empire and Transformed Pop Culture.

Masters of Doom is the well-written and engaging story of two guys named John (John Carmack and John Romero), who came together and in a few brief and intense months completely re-invented the entire computer gaming world, inspiring all sorts of critical innovations along the way, and then burst apart, in a collision event that captivated the (relatively) large world centered around them.

I knew the broad outlines of the story of Carmack and Romero, but there were just oodles and oodles of details that I had never been aware of, and Masters of Doom was full of surprises and things that fascinated me:

  • I had absolutely no idea that Carmack and Romero first met and started working together in Shreveport, Louisiana. Masters of Doom is polite and generous about this, but, really: Shreveport? Even in 1989 there were identifiable centers of software activity: Boston; San Francisco; North Carolina's Research Triangle; Redmond, Washington. But Shreveport? How unlikely it was that even one of the Johns would end up in Shreveport, and how spectacularly unlikely it was that the two of them ended up there at the same time. The description of this completely accidental happenstance was, by itself, worth reading the book for. Of course, once they all got together, it wasn't long before they created the classic hacker's house, in a way that could happen anywhere, whether it was in Shreveport, Louisiana or on the face of the moon:
    Carmack, Lane, Jay, and an Apple II programmer at Softdisk named Jason Blochowiak had scored an enviable coup not long before when they found a four-bedroom house for rent right along these shores. Jay had bought a cheap boat, which they docked there and used for frequent outings of kneeboarding and skiing. In the large backyard was a swimming pool and a barbecue, with which Jay, a cooking enthusiast, grilled up Flintstonian slabs of ribs. The house itself had plenty of windows looking out on the scene, a large living room, even a big tiled bathroom with a deep earth-tone-tiled Jacuzzi tub. Jay had installed a beer keg in the fridge. It was a perfect place to make games.
  • I never knew the origin of the name "id Software". I thought it was some sort of psychology reference, but in fact it came from shortening the name of the company originally formed by John Romero and Lane Roathe:
    While in New Hampshire, the two even decided to merge their one-man-band companies -- Romero's Capitol Ideas and Lane's Blue Mountain Micro -- under one roof as Ideas from the Deep.

    ...

    When the guys christened their company, they shortened the Ideas from the Deep initialism and simply called themselves id, for "in demand". They also didn't mind that, as Tom pointed out, id has another meaning: "the part of the brain that behaves by the pleasure principle."

  • I was fascinated by the description of the special synergy that Romero and Carmack found, each one's strengths complementing the other's. You can't plan for this sort of thing; you can't cause it to be; it just happens, or it doesn't.
    Romero and Carmack were now in a perfect groove, with Carmack improving the new Keen engine -- the code that made the graphics -- while Romero worked on the editor and tools -- the software used to create the game elements. Nothing could distract them.

    ...

    Carmack and Romero had developed another aspect of their collaboration. Though Carmack was gifted at creating game graphics, he had little interest in keeping up with the gaming world. He was never a player, really, he only made the games, just as he was the Dungeon Master but not a player of D&D. Romero, by contrast, kept up with everything, all the new games and developers.

    ...

    Romero immediately saw the potential in Carmack's technology, potential that Carmack was, by his own admission, not capable of envisioning himself. And because Romero was a programmer, he could speak to Carmack in a language he understood, translating his own artistic vision into the code Carmack would employ to help bring it to life.

    ...

    He played around with rooms that flashed strobe light, with walls that soared and receded at different heights. Every decision he made was based on how he could best show off Carmack's technology. Carmack couldn't have been happier; what more could someone want, after all, than to be both appreciated and celebrated? Romero was just as energized; with Carmack's innovations, he too could reach new heights.

  • I loved the bit of back-story about how Carmack and Romero found themselves inventing the approach of having an extensible gaming engine, with level editing tools that allowed others to create new levels and build entire new games (there's a little sketch of the idea after this list):
    The Right Thing was programming Doom in such a way that willing players could more easily create something like this: StarDoom, a modification, or mod, of their original game.

    ...

    For Doom, Carmack organized the data so players could replace sound and graphics in a nondestructive manner. He created a subsystem that separated the media data, called WADs (an acronym suggested by Tom Hall, it stood for Where's All the Data?), from the main program. Every time someone booted up the game, the program would look for the WAD file of sounds and images to load in. This way, someone could simply point the main program to a different WAD without damaging the original contents. Carmack would also upload the source code for the Doom level-editing and utilities program so that the hackers could have the proper tools with which to create new stuff for the game.

  • The best, and most important, part of the book, in my opinion, is the long and detailed retelling of the split of the Johns, as the intense worldwide pressure to follow up Doom with Quake drove immense tension between them, culminating in the days following the release of Quake to the world.
    The chasm between Carmack and Romero was too wide. Both of them had their views of what it meant to make games and how games should be made. Carmack thought Romero had lost touch with being a programmer. Romero thought Carmack had lost touch as a gamer. Carmack wanted to stay small, Romero wanted to get big. The two visions that had once forged this company were irreparably tearing it apart.
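
About that WAD design a couple of items up: here's my own toy sketch of the underlying idea, which is simply to keep the replaceable content in a separate data file that the program looks up at startup, so that someone can point the program at different data without touching the original. This is emphatically not id's actual WAD format; the file name and JSON layout below are made up for illustration.

    import json
    import sys

    def load_assets(asset_path):
        """Load sound and graphics definitions from an external data file.

        The program hard-codes no content; it reads whatever file it is
        pointed at, so a modder can swap in a different one without
        damaging the original program or data.
        """
        with open(asset_path) as f:
            return json.load(f)

    if __name__ == "__main__":
        # The default data ships with the game; a mod is just another file.
        path = sys.argv[1] if len(sys.argv) > 1 else "original_assets.json"
        assets = load_assets(path)
        print(f"Loaded {len(assets)} assets from {path}")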

There are many other fine parts to this book. I was amazed how the hours just flew by, reading about all the vivid personalities, wild escapades, and spurts of undeniably brilliant creativity.

Perhaps surprisingly, the world of software engineering has been fairly free from the "celebrity biography" genre of reporting. Most of the external coverage of the software industry has focused on the CEOs and entrepreneurs (Bill Gates, Larry Ellison, Michael Dell, Steve Jobs, etc.). I suppose these people are interesting to many readers, although to me their stories have always felt rather dull.

And when there is a book about the people in the trenches, it often involves coverage of an entire project, built by a team, like Tracy Kidder's The Soul of A New Machine, or Katie Hafner's Where Wizards Stay Up Late: The Origins Of The Internet, or Pascal Zachary's Show Stopper!: The Breakneck Race to Create Windows NT and the Next Generation at Microsoft, or Andy Hertzfeld's Revolution in The Valley: The Insanely Great Story of How the Mac Was Made, or any of a number of Steven Levy's books.

I think there is an important reason for this: nearly all world-changing software isn't written by a single person; it's written by a team of people, working together, bringing all sorts of different talents, personalities, and approaches to the project.

Things were different 50 years ago, when you could be Tony Hoare and, with a single burst of inspiration, completely alter an entire area of the industry; nowadays, the scale is something else entirely: GTA 5 dev team size more than 1000, manpower dependent on game detail.

They were casually talking about how a large team is required in the range of 1000 to create games with minute details, and when the question popped up about GTA 5 team’s size, Benzies revealed that it’s much, much more than 1000.

Yes, you read that right: more than 1000 people worked together on GTA V. According to Masters of Doom, the Wolfenstein 3D team was rather smaller:

They had finally fired Jason, narrowing the group to Carmack, Romero, Adrian, and Tom.

Perhaps there's no going back to the days when Carmack and Romero could sit side-by-side in a single room, building breakthrough games and tools like never before.

But just because things are different now doesn't mean they are worse. I wasn't programming computers in Tony Hoare's day, but after 3 1/2 decades of programming, I've seen lots of approaches, lots of techniques, and lots of personalities.

I've been part of small teams, and part of immense efforts. I've seen solitary geniuses, and engaging, extroverted, and inspirational team builders.

And it's all great.

May there be many more years of programming ahead of us all.

And may there be more books like Masters of Doom, to share the excitement, thrills, and drama with us all.

The pots are out!

I don't know how anybody can remain confused after the details are released: The heat is on as FIFA announces draw procedure.

The group you are in determines where you play, which involves both travel and climate considerations:

The seeded team in Group H will have a relatively easy first round schedule with matches in the milder conditions of Belo Horizonte, Rio de Janeiro and Sao Paulo.

But the seeds in Group G will play in the intense heat of northeastern cities Fortaleza, Natal, Salvador or Recife.

The travel issues are substantial, and add to the controversy:

In such a vast country, there was an early plan to revert to the arrangements once used in World Cups where teams were based in one region instead of travelling all over the country but Brazilian organisers did not want one region to stage all of Brazil's first round games.

As that was not politically expedient, FIFA agreed every team had to travel all over, resulting in the huge distances covered, apart from those in Group H where the venues are relatively close.

The locations are relevant because the game times are set on a perhaps unexpected basis:

From June 12 until June 22 when there are three matches a day -- the programme switches to four a day from June 23 to June 26 for the last round of group games -- matches are due to start at 1pm, 4pm and 7pm local time which is 1600GMT, 1900GMT and 2200GMT to maximise European television audiences.

However, the early kickoff time has sparked some unease as it will be very hot in the northeast at that time of day.

In my time zone, I think that means that the games are at 8 AM, 11 AM, and 2 PM.

Hmmm... those 11 AM games should make for fine lunchtime viewing!

It's important not to get confused about what time it is:

From June 23 until June 26 a pair of games will kick off at 1pm and the other pair at the same time later in the afternoon, although the clock will show 4pm in one stadium and 5pm at the other because they are in different time zones.

So there you go:

  • The games won't be played at a convenient location for your team
  • The games won't be played at a convenient time for your team
  • Your team will have to travel extensively throughout the fifth largest country in the world
  • Your team will have to face the very best teams as opponents
  • And all of the above is true, no matter which team you back

It's hard to imagine a better outcome, really!

Sunday, December 1, 2013

Stuff I'm reading, December 1st edition

White Rabbit!

  • Turbo Boost and the New Mac Pro’s CPUs
    It looks like you’re paying a lot for slower clock speeds as the cores increase, but that’s not the entire story. Those weird Turbo Boost numbers, which are easy to pull from here and here, are worth understanding before choosing a modern Intel processor.

    They indicate the number of extra 100 MHz increments by which the CPU may ramp up its speed with a given number of cores in an active, high-power state. The sequence begins with all cores active, then counts down to just one core active.

    (There's a quick worked example of that arithmetic, with made-up numbers, after this list.)

  • Inside the Race to Rescue a Health Care Site, and Obama
    Out of that tense Oval Office meeting grew a frantic effort aimed at rescuing not only the insurance portal and Mr. Obama’s credibility, but also the Democratic philosophy that an activist government can solve big, complex social problems.
  • Datacenter Renewable Power Done Right
    What I like about this approach is: 1) no clear cutting was required to prepare the land for generation and the land remains multi-use, 2) it’s not fossil fuel powered, 3) the facility will be run by a major power generation operator rather than as a sideline by the datacenter operator, and 4) far more clean power is being produced than will be actually used by the datacenter so they are actually adding more clean power to the grid than they are consuming by a fairly significant margin.
  • Bash Pitfalls
    This page shows common errors that Bash programmers make. The following examples are all flawed in some way.
  • Getting Started with Sublime Text
    Sublime Text is a text editor. A very fast, efficient, cross-platform text editor written explicitly for editing code. It is not an IDE, debugger, or builder. It’s made to be super kick ass at editing text and not much else.
  • Expanding the Cloud: Enabling Globally Distributed Applications and Disaster Recovery
    Cross Region Read Replicas are available for MySQL 5.6 and enable you to maintain a nearly up-to-date copy of your master database in a different AWS Region. In case of a regional disaster, you can simply promote your read replica in a different region to a master and point your application to it to resume operations. Cross Region Read Replicas also enable you to serve read traffic for your global customer base from regions that are nearest to them.
  • It’s the little things, Pt. 3: Time to eliminate geek strongman competitions
    when you make software with a UI that's hard to use and confusing in its design, you create a situation where error is inevitable. In networking software, errors are really, really bad. When a misplaced semicolon can kill Internet routing to a large part of the country, your software design has an issue. And I don't believe that these situations are solvable by "knowing what you're doing." If that was the case, why bother with anything past binary? Why make anything easy to use?
  • Decision Quality, part 12: Queues
    there's a core truth which is near perfectly suited for the mind to remain more or less in balance
  • What this book is about
    Neural networks are one of the most beautiful programming paradigms ever invented. In the conventional approach to programming, we tell the computer what to do, breaking big problems up into many small, precisely defined tasks that the computer can easily perform. By contrast, in a neural network we don't tell the computer how to solve our problem. Instead, it learns from observational data, figuring out its own solution to the problem at hand.
  • Europa Universalis IV Manual
    Did Hungary, under your guidance, drive the Turks from Europe, unifying the Balkans along the way? Did your Aztecs hold off the Spanish, English, and French and maintain an Empire in Central America? Did your England win the Two Hundred Years War, conquering the French and building the mightiest Empire in Europe? Did your Iroquois launch a reverse colonial war, overwhelming the stunned nations of Europe after centuries of bitter warfare?
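
Back to that first item, about Turbo Boost: here's a quick worked example of the increments it describes. The base clock and per-core bin counts below are invented for illustration; the real numbers come from Intel's spec tables for the particular part.

    # Hypothetical 4-core part: 3.5 GHz base clock, Turbo Boost bins "4/4/8/10".
    # As described above, the sequence starts with all cores active and counts
    # down to one active core; each bin is an extra 100 MHz.

    BASE_GHZ = 3.5
    TURBO_BINS = [4, 4, 8, 10]   # 4 active, 3 active, 2 active, 1 active (made up)

    def max_turbo_ghz(active_cores, base_ghz=BASE_GHZ, bins=TURBO_BINS):
        """Maximum clock speed when active_cores cores are in a high-power state."""
        index = len(bins) - active_cores      # bins[0] is the all-cores-active entry
        return base_ghz + bins[index] * 0.1   # each bin is 100 MHz = 0.1 GHz

    for n in range(len(TURBO_BINS), 0, -1):
        print(f"{n} active core(s): up to {max_turbo_ghz(n):.1f} GHz")
    # 4 cores: 3.9 GHz, 3: 3.9 GHz, 2: 4.3 GHz, 1: 4.5 GHz

So fewer active cores means more turbo headroom, which is why the many-core parts aren't as slow in practice as their lower base clocks make them look.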

Wednesday, November 27, 2013

The Long Overdue Library Book: a very short review

I've been enjoying reading The Long Overdue Library Book.

This is the kind of book that deserves the cliche "a labor of love". Written by two professional librarians, reflecting on their decades of experience, drawing on colleagues near and far, the book contains 50 short essays about every aspect of libraries.

My favorite story, for whatever reason, was Number 27, "Walnut Man", with its delightful ending:

"Thank you, all of you," he said, "for your gracious hospitality. We have all appreciated it very much."

My favorite excerpt from the overall book, though, is in Number 50, "Reflections", with its wonderful description of what a library is:

A safe place for the strange, a welcoming refuge warm in winter and cool in summer, the library accepts everybody who wants to come in. Unlike schools, libraries let you come and go as you want. The staff will work hard to answer your questions, and the best of them will help you tell them what you really want to know. They will protect your right to read what you wish, as long as you respect the ultimate Library Law: Return Your Books On Time.

May they survive. We need our libraries more than ever now, to protect the unpopular theories and the forgotten poets, to introduce our children to Mr. Toad and Narnia, and to remind our politicians that there is a need for civility that is as basic as sunlight, as necessary to the human spirit as music or truth or love.

I'm not sure if there will ever be a second edition; I'm sure this book was a monumental effort to produce as it was.

But, should there be, let me offer a (possibly apocryphal) Library Story of my own:

Nearly 75 years ago, the University of Chicago was a very different place. Back then, the school was an athletics powerhouse. Known as "the Monsters of the Midway", they were a founding member of the Big 10 Conference (then known as the Western Conference), and a University of Chicago athlete was the first recipient of the now-famous Heisman Trophy. With their great coach, Amos Alonzo Stagg, the University won trophy after trophy.

During the Second World War, the University underwent a monumental shift in focus, perhaps the greatest re-alignment that any human organization has ever undergone. Led by university president Robert Maynard Hutchins, the school undertook a complete re-dedication of purpose toward the life of the mind.

Against vast protest and discord, Hutchins dismantled the athletics program, withdrew the school from the Big Ten Conference, and re-focused the institution on the goals of knowledge and research, creating the world's foremost intellectual institution, a position it still holds today.

Dramatically, during the latter days of World War II, Hutchins gave the command to tear down the football stadium (beside which Enrico Fermi had just demonstrated the first self-sustaining controlled nuclear chain reaction) and commanded that a library be built in its place.

That library, the Joseph Regenstein Library, remains one of the greatest libraries on the planet, now holding an astonishing 11 million volumes.

At the time, though, Hutchins's decision was met with more than dismay: it provoked controversy, discord, and outright defiance. The alumni, justifiably proud of their heritage, struggled to cope with the transition.

In this, Hutchins was steadfast and sure, and, when questioned about the appropriateness of destroying one of the world's top athletics programs to build (gasp!) a library, responded with one of the greatest sentiments ever delivered by a man of letters:

Whenever I get the urge to exercise, I sit down and read a book until it passes.

There you go.

Stuff I'm reading, Thanksgiving edition

Gearing up for all that time I'm gonna spend on the couch watching football, I gotta find something good to read...

  • The Mature Optimization Handbook
    Knuth’s famous quote about premature optimization was never meant to be a stick to beat people over the head with. It’s a witty remark he tossed off in the middle of a keen observation about leverage, which itself is embedded in a nuanced, evenhanded passage about, of all things, using gotos for fast and readable code. The final irony is that the whole paper was an earnest attempt to caution against taking Edsger Dijkstra’s infamous remark about gotos too seriously. It’s a wonder we risk saying anything at all about this stuff.
  • UDT: UDP-based Data Transfer
    UDT is a reliable UDP based application level data transport protocol for distributed data intensive applications over wide area high-speed networks. UDT uses UDP to transfer bulk data with its own reliability control and congestion control mechanisms. The new protocol can transfer data at a much higher speed than TCP does. UDT is also a highly configurable framework that can accommodate various congestion control algorithms.
  • Solar at Scale: How Big is a Solar Array of 9MW Average Output?
    The real challenge for most people is in trying to understand the practicality of solar to power datacenters is to get a reasonable feel for how big the land requirements actually would be. They sound big but data centers are big and everything associated with them is big. Large numbers aren’t remarkable. One approach to calibrating the “how big is it?” question is to go with a ratio. Each square foot of data center would require approximately 362 square feet of solar array, is one way to get calibration of the true size requirements.
  • DEFLATE performance improvements
    This patch series introduces a number of deflate performance improvements. These improvements include two new deflate strategies, quick and medium, as well as various improvements such as a faster hash function, PCLMULQDQ-optimized CRC folding, and SSE2 hash shifting.
  • How long do disk drives last?
    The chart below shows the failure rate of drives in each quarter of their life. For the first 18 months, the failure rate hovers around 5%, then it drops for a while, and then goes up substantially at about the 3-year mark. We are not seeing that much “infant mortality”, but it does look like 3 years is the point where drives start wearing out.
  • Farming hard drives: 2 years and $1M later
    In the last 30 years the cost for a gigabyte of storage has decreased from over $1 million in 1981 to less than $0.05 in 2011. This is evidenced by the work of Matthew Komorowski. In addition, the cost per gigabyte also declined in an amazingly predictable fashion over that time.

    Beginning in October 2011 those 30-years of history went out the window.

  • How to be a Programmer: A Short, Comprehensive, and Personal Summary
    To be a good programmer is difficult and noble. The hardest part of making real a collective vision of a software project is dealing with one's coworkers and customers. Writing computer programs is important and takes great intelligence and skill. But it is really child's play compared to everything else that a good programmer must do to make a software system that succeeds for both the customer and myriad colleagues for whom she is partially responsible. In this essay I attempt to summarize as concisely as possible those things that I wish someone had explained to me when I was twenty-one.
  • What It's Like to Fail
    During the nearly 18 months I spent homeless off and on, and during the ensuing years, I learned that I am more resourceful than I ever imagined, less respectable than I ever figured, and, ultimately, braver and more resilient than I ever dreamed. An important tool in my return to life has been Craigslist. It was through Craigslist that I found odd jobs -- gigs, they often are called -- doing everything from ghost-writing a memoir for a retired Caltech professor who had aphasia to web content writing jobs to actual real jobs with actual real startups.

Tuesday, November 26, 2013

Time to learn a new game

Uhm, yes, but learning this one may be a bit more complex than I had anticipated...

  • A Newbie's First Steps into Europa Universalis IV
    the first couple of hours were spent not so much playing the game as opening menus, hovering the cursor over icons and reading text, trying to figure out what the heck I was supposed to do.

    ...

    Toward the end of the second day, things in the game got pretty crazy as countries went bankrupt, monarchs were excommunicated left and right and people without the resources to do it started declaring war just to see what would happen. My army was humiliated by the stave-wielding natives of Sierra Leon, the market was dominated by fish, England fell, dogs and cats were living together—it was Armageddon.

  • Europa Universalis IV and the Border Between Complex and Complicated
    Europa Universalis games tend to be a little, shall we say, complex. It's a series that speaks in terms like cassus belli and papal curia, featuring a map crammed full of long-forgotten nations where every last political maneuver is an opportunity to broker a deal that might someday come back to haunt you.

    ...

    "We want complex, but not complicated," says Johansson. "A complex feature has a lot of factors that influence it, and you will discover that when you thought you mastered it, there is a sudden a shift in dynamics that forces you to reevaluate your strategy. Complex features are what make games fun in the long run."

  • Europe Universalis IV Post Spanish Tutorial Map - Episode 1
    Oh, Europa Universalis IV, you saucy minx you. This is the most confusing game I have played recently, but I still have fun with it.
  • Re: Europa Universalis 4
    I'm pretty much King of Europe at this point thanks to abusing personal unions and the only other large Christian realms are Poland and Lithuania, who I've got royal marriages with so I can force personal unions on them too if they ever lack an heir. I'm trying to unify the Holy Roman Empire as I've been Emperor for about 60 years now and I just passed the fifth reform (the one that disables internal HRE wars) and I've kind of hit a brick wall with authority, since there are no heretics to convert and no one will declare war on HRE members.
  • Beginner's guide
    There are no specific victory conditions in Europa Universalis IV, Although there is a score visible throughout and at the end of the game at the top right corner of the interface. The player is free to take history in whatever direction they desire. They may take a small nation with a single province and turn it into a powerhouse to rule the world, take control of a historically powerful nation and cause it to crumble and anything in between.
  • Top 5 Tips to getting started in Europa Universalis IV
    Don't Play as Scotland.

    In fact, don’t even be friends with Scotland. At least not until you are familiar with the game and want a challenge. Nobody likes Scotland at this stage of history, and you are only a couple hundred years from the Glorious Revolution. So unless you think you can do better than the Jacobites, best to avoid Scotland for now. Every start I have made so far has seen Scotland either nearly or completely wiped off the map at an early stage by England.

Waterfall vs Agile

Clay Shirky, who is not a software developer but who is both a very smart guy and a very good writer, has written a quite-worth-reading essay about the healthcare.gov development process: Healthcare.gov and the Gulf Between Planning and Reality.

I'm not sure how plugged-in Shirky was to the actual healthcare.gov development effort, so his specific comments on that endeavor are perhaps inaccurate, but he has some fascinating observations about the overall software development process.

Shirky starts by describing the challenges that arise when senior management have to oversee a project whose technology they don't understand, and draws an analogy to the historical changes that occurred when technology changed the media industry:

In the early days of print, you had to understand the tech to run the organization. (Ben Franklin, the man who made America a media hothouse, called himself Printer.) But in the 19th century, the printing press became domesticated. Printers were no longer senior figures — they became blue-collar workers. And the executive suite no longer interacted with them much, except during contract negotiations.

It's certainly a problem when technology executives don't understand the technology in the projects they oversee. However, Shirky has another point to make, which is about the choice of development processes that can be used in a software development project.

The preferred method for implementing large technology projects in Washington is to write the plans up front, break them into increasingly detailed specifications, then build what the specifications call for. It’s often called the waterfall method, because on a timeline the project cascades from planning, at the top left of the chart, down to implementation, on the bottom right.

As Shirky observes, in a wonderfully-pithy sound bite:

By putting the most serious planning at the beginning, with subsequent work derived from the plan, the waterfall method amounts to a pledge by all parties not to learn anything while doing the actual work. Instead, waterfall insists that the participants will understand best how things should work before accumulating any real-world experience, and that planners will always know more than workers.

This is just a brilliant point, so true, and so well stated. The great breakthrough of agile techniques is to realize that each step you take helps you comprehend what the next step should be, so allowing feedback and change into the overall cycle is critical.

Shirky then spends the remainder of his wonderful essay discussing policy-related matters such as the federal government's procurement policies, the implications of civil service bureaucracy, etc., which are all well and good, but not things I really feel I have an informed opinion about.

Where I wish to slightly object to Shirky's formulation, though, is in the black-and-white way that he portrays the role of planning in a software development project:

the tradeoff is likely to mean sacrificing quality by default. That just happened to this administration’s signature policy goal. It will happen again, as long politicians can be allowed to imagine that if you just plan hard enough, you can ignore reality. It will happen again, as long as department heads imagine that complex technology can be procured like pencils. It will happen again as long as management regards listening to the people who understand the technology as a distasteful act.

This is, I think, a common misstatement of the so-called "agile" approach to software development: Agile development processes do NOT eliminate planning! Shirky worsens the problem, in my opinion, by setting up a dichotomy between "planning" and "reality", beginning with the title of his essay and continuing throughout its content.

To someone not deeply immersed in the world of software development process, Shirky's essay makes it sound like:

  • Waterfall processes involve complete up-front planning
  • That typically fails with software projects, because we're trying to do something new that's never been done before, and hence cannot be fully planned out ahead of time
  • Therefore we should replace all that futile planning with lots of testing ("reality")
It's that last step where I object.

Agile approaches, properly executed, solve this up-front planning problem by decomposing the overall project into smaller and smaller and smaller sub-projects, and decomposing the overall schedule into smaller and smaller and smaller incremental milestones. HOWEVER, we also decompose the planning into smaller and smaller and smaller plans (in one common formulation, captured on individual 3x5 index cards on a team bulletin board or wall), so that each little sub-project and each incremental milestone is still planned and described before it is executed.

That is, we're not just winging it.

Rather, we're endeavoring to make the units of work small enough so that:

  • Everyone on the team can understand the task being undertaken, and the result we expect it to have.
  • Regularly and frequently, everyone on the team can reflect on the work done so far, and incorporate lessons learned into the planning for the next steps
Shirky does a good job of conveying the value of the latter point, but I think he fails to understand the importance of the former point.

You can't settle for a situation in which management doesn't understand the work you're doing. Shirky is clearly aware of this, but perhaps he's never been close enough to a project run using agile approaches to see the techniques such teams use to ensure that all members are able to understand the work being done. (Or perhaps he just despairs of the possibility of changing the behavior of politicians and bureaucrats.)

Regardless, don't just listen to my rambling; go read Shirky's essay and keep it in mind the next time you're involved in a large software development project.