Three Management Methods (Introduction)

If you want to lead a team, a company, an army, or a country, the primary problem you face is getting everyone moving in the same direction, which is really just a polite way of saying “getting people to do what you want.”

Think of it this way. As soon as your team consists of more than one person, you’re going to have different people with different agendas. They want different things than you want. If you’re a startup founder, you might want to make a lot of money quickly so you can retire early and spend the next couple of decades going to conferences for women bloggers. So you might spend most of your time driving around Sand Hill Road talking to VCs who might buy the company and flip it to Yahoo!. But Janice the Programmer, one of your employees, doesn’t care about selling out to Yahoo!, because she’s not going to make any money that way. What she cares about is writing code in the latest coolest new programming language, because it’s fun to learn a new thing. Meanwhile your CFO is entirely driven by the need to get out of the same cubicle he has been sharing with the system administrator, Trekkie Monster, and so he’s working up a new budget proposal that shows just how much money you would save by moving to larger office space that’s two minutes from his house, what a coincidence!

The problem of getting people to move in your direction (or, at least, the same direction) is not unique to startups, of course. It’s the same fundamental problem that a political leader faces when they get elected after promising to eliminate waste, corruption, and fraud in government. The mayor wants to make sure that it’s easy to get city approval of a new building project. The city building inspectors want to keep getting the bribes they have grown accustomed to.

And it’s the same problem that a military leader faces. They might want a team of soldiers to charge at the enemy, even when every individual soldier would really just rather cower behind a rock and let the others do the charging.

Here are three common approaches you might take:

  • The Command and Control Method
  • The Econ 101 Method
  • The Identity Method

You will certainly find other methods of management in the wild (there’s the exotic “Devil Wears Prada” Method, the Jihad Method, the Charismatic Cult Method, and the Lurch From One Method To Another Method) but over the next three days, I’m going to examine these three popular methods and explore their pros and cons.

Can Your Programming Language Do This?

One day, you’re browsing through your code, and you notice two big blocks that look almost exactly the same. In fact, they’re exactly the same, except that one block refers to “Spaghetti” and one block refers to “Chocolate Moose.”

 // A trivial example: 

alert("I'd like some Spaghetti!"); 
alert("I'd like some Chocolate Moose!");

These examples happen to be in JavaScript, but even if you don’t know JavaScript, you should be able to follow along.

The repeated code looks wrong, of course, so you create a function:

function SwedishChef( food ) 
{
    alert("I'd like some " + food + "!"); 
} 

SwedishChef("Spaghetti"); 
SwedishChef("Chocolate Moose");


OK, it’s a trivial example, but you can imagine a more substantial example. This is better code for many reasons, all of which you’ve heard a million times. Maintainability, Readability, Abstraction = Good!

Now you notice two other blocks of code which look almost the same, except that one of them keeps calling this function called BoomBoom and the other one keeps calling this function called PutInPot. Other than that, the code is pretty much the same.

alert("get the lobster"); 
PutInPot("lobster"); 
PutInPot("water"); 

alert("get the chicken"); 
BoomBoom("chicken"); 
BoomBoom("coconut");

Now you need a way to pass an argument to the function which itself is a function. This is an important capability, because it increases the chances that you’ll be able to find common code that can be stashed away in a function.

function Cook( i1, i2, f ) 
{ 
   alert("get the " + i1); 
   f(i1); 
   f(i2); 
} 

Cook( "lobster", "water", PutInPot ); 
Cook( "chicken", "coconut", BoomBoom );

Look! We’re passing in a function as an argument.

Can your language do this?

Wait… suppose you haven’t already defined the functions PutInPot or BoomBoom. Wouldn’t it be nice if you could just write them inline instead of declaring them elsewhere?

 Cook("lobster",
      "water",
      function(x) { alert("pot " + x); }  
     ); 

Cook("chicken",
     "coconut",  
     function(x) { alert("boom " + x); } 
    );

Jeez, that is handy. Notice that I’m creating a function there on the fly, not even bothering to name it, just picking it up by its ears and tossing it into a function.

As soon as you start thinking in terms of anonymous functions as arguments, you might notice code all over the place that, say, does something to every element of an array.

var a = [1,2,3]; 

for (var i = 0; i < a.length; i++) 
{ 
    a[i] = a[i] * 2; 
} 

for (var i = 0; i < a.length; i++) 
{ 
    alert(a[i]); 
}

Doing something to every element of an array is pretty common, and you can write a function that does it for you:

function map(fn, a) 
{ 
    for (var i = 0; i < a.length; i++) 
    { 
        a[i] = fn(a[i]); 
    } 
}

Now you can rewrite the code above as:

map( function(x){return x*2;}, a ); 
map( alert, a );

Another common thing with arrays is to combine all the values of the array in some way.

function sum(a) 
{ 
    var s = 0; 
    for (var i = 0; i < a.length; i++) 
        s += a[i]; 

    return s; 
} 

function join(a) 
{ 
    var s = ""; 
    for (var i = 0; i < a.length; i++) 
        s += a[i]; 

    return s; 
} 

alert(sum([1,2,3])); 
alert(join(["a","b","c"]));

sum and join look so similar, you might want to abstract out their essence into a generic function that combines elements of an array into a single value:

function reduce(fn, a, init) 
{ 
    var s = init; 
    for (var i = 0; i < a.length; i++) 
        s = fn( s, a[i] ); 

    return s; 
} 

function sum(a) 
{ 
    return reduce( function(a, b){ return a + b; },  
                   a, 0 ); 
} 

function join(a) 
{ 
    return reduce( function(a, b){ return a + b; }, 
                   a, "" ); 
}
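Once you have reduce, other combining operations fall out the same way. For instance, a max function (not in the original examples, just a sketch of the same pattern):

```javascript
function reduce(fn, a, init) 
{ 
    var s = init; 
    for (var i = 0; i < a.length; i++) 
        s = fn( s, a[i] ); 

    return s; 
} 

function max(a) 
{ 
    // combine by keeping whichever of the two values is bigger
    return reduce( function(a, b){ return a > b ? a : b; }, 
                   a, -Infinity ); 
}

max([3, 1, 4, 1, 5]); // 5
```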

Many older languages simply had no way to do this kind of stuff. Other languages let you do it, but it’s hard (for example, C has function pointers, but you have to declare and define the function somewhere else). Object-oriented programming languages aren’t completely convinced that you should be allowed to do anything with functions.

Java required you to create a whole object with a single method called a functor if you wanted to treat a function like a first class object. Combine that with the fact that many OO languages want you to create a whole file for each class, and it gets really klunky fast. If your programming language requires you to use functors, you’re not getting all the benefits of a modern programming environment. See if you can get some of your money back.
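To see how klunky that is, here's the functor pattern sketched in JavaScript, where you don't actually need it (the names PutInPotFunctor and invoke are invented for illustration):

```javascript
// The functor workaround: instead of passing a function, you wrap
// it in an object whose only job is to hold one method.
function PutInPotFunctor() {}
PutInPotFunctor.prototype.invoke = function (x) {
    return "pot " + x;
};

function Cook(i1, i2, functor) {
    // same Cook as before, except every call has to go
    // through the wrapper object's single method
    return [functor.invoke(i1), functor.invoke(i2)];
}

Cook("lobster", "water", new PutInPotFunctor());
// -> ["pot lobster", "pot water"]
```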

How much benefit do you really get out of writing itty bitty functions that do nothing more than iterate through an array doing something to each element?

Well, let’s go back to that map function. When you need to do something to every element in an array in turn, the truth is, it probably doesn’t matter what order you do them in. You can run through the array forward or backwards and get the same result, right? In fact, if you have two CPUs handy, maybe you could write some code to have each CPU do half of the elements, and suddenly map is twice as fast.

Or maybe, just hypothetically, you have hundreds of thousands of servers in several data centers around the world, and you have a really big array, containing, let’s say, again, just hypothetically, the entire contents of the internet. Now you can run map on thousands of computers, each of which will attack a tiny part of the problem.

So now, for example, writing some really fast code to search the entire contents of the internet is as simple as calling the map function with a basic string searcher as an argument.

The really interesting thing I want you to notice, here, is that as soon as you think of map and reduce as functions that everybody can use, and they use them, you only have to get one supergenius to write the hard code to run map and reduce on a global massively parallel array of computers, and all the old code that used to work fine when you just ran a loop still works only it’s a zillion times faster which means it can be used to tackle huge problems in an instant.

Lemme repeat that. By abstracting away the very concept of looping, you can implement looping any way you want, including implementing it in a way that scales nicely with extra hardware.
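To make the order-independence concrete, here's a sketch: mapping the two halves of an array separately, the way two CPUs might, gives exactly the same answer as one sequential pass.

```javascript
function map(fn, a) {
    for (var i = 0; i < a.length; i++)
        a[i] = fn(a[i]);
    return a;
}

var double = function (x) { return x * 2; };

// one sequential pass over the whole array
var sequential = map(double, [1, 2, 3, 4]);

// split the work in half, "parallel" in spirit
var left  = map(double, [1, 2]);
var right = map(double, [3, 4]);
var combined = left.concat(right);

// sequential and combined are both [2, 4, 6, 8]
```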

And now you understand something I wrote a while ago where I complained about CS students who are never taught anything but Java:

Without understanding functional programming, you can’t invent MapReduce, the algorithm that makes Google so massively scalable. The terms Map and Reduce come from Lisp and functional programming. MapReduce is, in retrospect, obvious to anyone who remembers from their 6.001-equivalent programming class that purely functional programs have no side effects and are thus trivially parallelizable. The very fact that Google invented MapReduce, and Microsoft didn’t, says something about why Microsoft is still playing catch up trying to get basic search features to work, while Google has moved on to the next problem: building Skynet^H^H^H^H^H^H the world’s largest massively parallel supercomputer. I don’t think Microsoft completely understands just how far behind they are on that wave.

Ok. I hope you’re convinced, by now, that programming languages with first-class functions let you find more opportunities for abstraction, which means your code is smaller, tighter, more reusable, and more scalable. Lots of Google applications use MapReduce and they all benefit whenever someone optimizes it or fixes bugs.

And now I’m going to get a little bit mushy, and argue that the most productive programming environments are the ones that let you work at different levels of abstraction. Crappy old FORTRAN really didn’t even let you write functions. C had function pointers, but they were ugleeeeee and not anonymous and had to be implemented somewhere else than where you were using them. Java made you use functors, which is even uglier. As Steve Yegge points out, Java is the Kingdom of Nouns.

Correction: The last time I used FORTRAN was 27 years ago. Apparently it got functions. I must have been thinking about GW-BASIC.

My First BillG Review

In the olden days, Excel had a very awkward programming language without a name. “Excel Macros,” we called it. It was a severely dysfunctional programming language without variables (you had to store values in cells on a worksheet), without locals, without subroutine calls: in short it was almost completely unmaintainable. It had advanced features like “Goto” but the labels were actually physically invisible.

The only thing that made it look reasonable was that it looked great compared to Lotus macros, which were nothing more than a sequence of keystrokes entered as a long string into a worksheet cell.

On June 17, 1991, I started working for Microsoft on the Excel team. My title was “Program Manager.” I was supposed to come up with a solution to this problem. The implication was that the solution would have something to do with the Basic programming language.

Basic? Yech!

I spent some time negotiating with various development groups. Visual Basic 1.0 had just come out, and it was pretty friggin’ cool. There was a misguided effort underway with the code name MacroMan, and another effort to make Object-Oriented Basic code-named “Silver.” The Silver team was told that they had one client for their product: Excel. The marketing manager for Silver, Bob Wyman, yes that Bob Wyman, had only one person he had to sell his technology to: me.

MacroMan was, as I said, misguided, and it took some persuading, but it was eventually shut down. The Excel team convinced the Basic team that what we really needed was some kind of Visual Basic for Excel. I managed to get four pet features added to Basic. I got them to add Variants, a union data type that could hold any other type, because otherwise you couldn’t store the contents of  a spreadsheet cell in a variable without a switch statement. I got them to add late binding, which became known as IDispatch, a.k.a. COM Automation, because the original design for Silver required a deep understanding of type systems that the kinds of people who program macros don’t care about. And I got two pet syntactic features into the language: For Each, stolen from csh, and With, stolen from Pascal.

Then I sat down to write the Excel Basic spec, a huge document that grew to hundreds of pages. I think it was 500 pages by the time it was done. (“Waterfall,” you snicker; yeah yeah shut up.)

In those days we used to have these things called BillG reviews. Basically every major important feature got reviewed by Bill Gates. I was told to send a copy of my spec to his office in preparation for the review. It was basically one ream of laser-printed paper.

I rushed to get the spec printed and sent it over to his office.

Later that day, I had some time, so I started working on figuring out if Basic had enough date and time functions to do all the things you could do in Excel.

In most modern programming environments, dates are stored as real numbers. The integer part of the number is the number of days since some agreed-upon date in the past, called the epoch. In Excel, today’s date, June 16, 2006, is stored as 38884, counting days where January 1st, 1900 is 1.
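That scheme is easy to sketch in code. This hypothetical excelSerial function (the name and the +1 adjustment after February 28, 1900 are explained by the story that follows) reproduces Excel's 1900 date system:

```javascript
// Convert a calendar date to an Excel 1900-system serial number.
// Day 1 is January 1, 1900. The extra +1 for dates after
// February 28, 1900 accounts for Excel's phantom February 29, 1900.
function excelSerial(year, month, day) {
    var MS_PER_DAY = 24 * 60 * 60 * 1000;
    var epoch = Date.UTC(1900, 0, 1);
    var days = (Date.UTC(year, month - 1, day) - epoch) / MS_PER_DAY;
    return days + 1 + (days >= 59 ? 1 : 0);
}

excelSerial(2006, 6, 16);  // 38884, matching the date in the text
```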

I started working through the various date and time functions in Basic and the date and time functions in Excel, trying things out, when I noticed something strange in the Visual Basic documentation: Basic uses December 31, 1899 as the epoch instead of January 1, 1900, but for some reason, today’s date was the same in Excel as it was in Basic.

Huh?

I went to find an Excel developer who was old enough to remember why. Ed Fries seemed to know the answer.

“Oh,” he told me. “Check out February 28th, 1900.”

“It’s 59,” I said.

“Now try March 1st.”

“It’s 61!”

“What happened to 60?” Ed asked.

“February 29th. 1900 was a leap year! It’s divisible by 4!”

“Good guess, but no cigar,” Ed said, and left me wondering for a while.

Oops. I did some research. Years that are divisible by 100 are not leap years, unless they’re also divisible by 400.

1900 wasn’t a leap year.

“It’s a bug in Excel!” I exclaimed.

“Well, not really,” said Ed. “We had to do it that way because we need to be able to import Lotus 123 worksheets.”

“So, it’s a bug in Lotus 123?”

“Yeah, but probably an intentional one. Lotus had to fit in 640K. That’s not a lot of memory. If you ignore 1900, you can figure out if a given year is a leap year just by looking to see if the rightmost two bits are zero. That’s really fast and easy. The Lotus guys probably figured it didn’t matter to be wrong for those two months way in the past. It looks like the Basic guys wanted to be anal about those two months, so they moved the epoch one day back.”
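Ed's two-bit shortcut is easy to sketch (JavaScript here, purely for illustration):

```javascript
// The Lotus 1-2-3 shortcut: call a year a leap year whenever its
// two low bits are zero, i.e. whenever it's divisible by 4.
function lotusIsLeap(year) {
    return (year & 3) === 0;
}

// The real Gregorian rule: divisible by 4, except century years,
// unless they're also divisible by 400.
function isLeap(year) {
    return (year % 4 === 0 && year % 100 !== 0) || year % 400 === 0;
}

lotusIsLeap(1900);  // true  -- the bug
isLeap(1900);       // false -- 1900 was not a leap year
```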

“Aargh!” I said, and went off to study why there was a checkbox in the options dialog called 1904 Date System.

The next day was the big BillG review.

June 30, 1992.

In those days, Microsoft was a lot less bureaucratic. Instead of the 11 or 12 layers of management they have today, I reported to Mike Conte who reported to Chris Graham who reported to Pete Higgins, who reported to Mike Maples, who reported to Bill. About 6 layers from top to bottom. We made fun of companies like General Motors with their eight layers of management or whatever it was.

In my BillG review meeting, the whole reporting hierarchy was there, along with their cousins, sisters, and aunts, and a person who came along from my team whose whole job during the meeting was to keep an accurate count of how many times Bill said the F word. The lower the f***-count, the better.

Bill came in.

I thought about how strange it was that he had two legs, two arms, one head, etc., almost exactly like a regular human being.

He had my spec in his hand.

He had my spec in his hand!

He sat down and exchanged witty banter with an executive I did not know that made no sense to me. A few people laughed.

Bill turned to me.

I noticed that there were comments in the margins of my spec. He had read the first page!

He had read the first page of my spec and written little notes in the margin!

Considering that we only got him the spec about 24 hours earlier, he must have read it the night before.

He was asking questions. I was answering them. They were pretty easy, but I can’t for the life of me remember what they were, because I couldn’t stop noticing that he was flipping through the spec…

He was flipping through the spec! [Calm down, what are you a little girl?]

… and THERE WERE NOTES IN ALL THE MARGINS. ON EVERY PAGE OF THE SPEC. HE HAD READ THE WHOLE GODDAMNED THING AND WRITTEN NOTES IN THE MARGINS.

He Read The Whole Thing! [OMG SQUEEE!]

The questions got harder and more detailed.

They seemed a little bit random. By now I was used to thinking of Bill as my buddy. He’s a nice guy! He read my spec! He probably just wants to ask me a few questions about the comments in the margins! I’ll open a bug in the bug tracker for each of his comments and make sure it gets addressed, pronto!

Finally the killer question.

“I don’t know, you guys,” Bill said, “Is anyone really looking into all the details of how to do this? Like, all those date and time functions. Excel has so many date and time functions. Is Basic going to have the same functions? Will they all work the same way?”

“Yes,” I said, “except for January and February, 1900.”

Silence.

The f*** counter and my boss exchanged astonished glances. How did I know that? January and February WHAT?

“OK. Well, good work,” said Bill. He took his marked up copy of the spec

wait! I wanted that

and left.

“Four,” announced the f*** counter, and everyone said, “wow, that’s the lowest I can remember. Bill is getting mellow in his old age.” He was, you know, 36.

Later I had it explained to me. “Bill doesn’t really want to review your spec, he just wants to make sure you’ve got it under control. His standard M.O. is to ask harder and harder questions until you admit that you don’t know, and then he can yell at you for being unprepared. Nobody was really sure what happens if you answer the hardest question he can come up with because it’s never happened before.”

“Can you imagine if Jim Manzi had been in that meeting?” someone asked. “‘What’s a date function?’ Manzi would have asked.”

Jim Manzi was the MBA-type running Lotus into the ground.

It was a good point. Bill Gates was amazingly technical. He understood Variants, and COM objects, and IDispatch and why Automation is different than vtables and why this might lead to dual interfaces. He worried about date functions. He didn’t meddle in software if he trusted the people who were working on it, but you couldn’t bullshit him for a minute because he was a programmer. A real, actual, programmer.

Watching non-programmers trying to run software companies is like watching someone who doesn’t know how to surf trying to surf.

“It’s ok! I have great advisors standing on the shore telling me what to do!” they say, and then fall off the board, again and again. The standard cry of the MBA who believes that management is a generic function. Is Ballmer going to be another John Sculley, who nearly drove Apple into extinction because the board of directors thought that selling Pepsi was good preparation for running a computer company? The cult of the MBA likes to believe that you can run organizations that do things that you don’t understand.

Over the years, Microsoft got big, Bill got overextended, and some shady ethical decisions made it necessary to devote way too much management attention to fighting the US government. Steve took over the CEO role on the theory that this would allow Bill to spend more time doing what he does best, running the software development organization, but that didn’t seem to fix the endemic problems caused by those 11 layers of management, a culture of perpetual, permanent meetings, a stubborn insistence on creating every possible product no matter what (how many billions of dollars has Microsoft lost, in R&D, legal fees, and damage to reputation, because they decided that not only do they have to make a web browser, but they have to give it away free?), and a couple of decades of sloppy, rapid hiring that ensured the brainpower of the median Microsoft employee went way down. (Douglas Coupland, in Microserfs: “They hired 3,100 people in 1992 alone, and you know not all of them were gems.”)

Oh well. The party has moved elsewhere. Excel Basic became Microsoft Visual Basic for Applications for Microsoft Excel, with so many (TM)’s and (R)’s I don’t know where to put them all. I left the company in 1994, assuming Bill had completely forgotten me, until I noticed a short interview with Bill Gates in the Wall Street Journal, in which he mentioned, almost in passing, something along the lines of how hard it was to recruit, say, a good program manager for Excel. They don’t just grow on trees, or something.

Could he have been talking about me? Naw, it was probably someone else.

Still.

FogBugz 4½ and Subjective Well-Being

Let me tell you the story of how we came to be shipping FogBugz 5.0 about six months earlier than expected.

It’s kind of a long story.

It turns out that students in Canada like to organize their own conferences, and a couple of years back they organized one and invited me to speak.

College students make great audiences. They’ll laugh at anything. I gathered together a bunch of random ideas and some funny slides I downloaded from the Internet (one of which is shown at right, proving that there’s life on Mars) and they were rolling in the aisles.

One theme from that speech was based on the most important thing that I learned in Psych 110, the idea that when people are successful at controlling their environment they become happier, and when they can’t control their environment, they get grumpy.

(Actually, using words like “happiness” and “grumpy” tends to inhibit tenure, so a real psychologist would say something like “repeated failure inhibits the experience of agency, decreasing subjective well-being.”)

Put people in direct control of the stuff around them and they will, more or less, on average, be happier. It explains why some people like stick shifts, it explains why lethargic user interfaces make you frustrated and depressed, and it explains why people get so goddamn mad when Sony decides to install viruses on their computers just because they tried to listen to a CD.

If you’re a software designer, this is it. This is your big chance to do something meaningful to improve the world. Design software that puts the user in control and you’ll increase happiness, even if your product is the most boring accounting software imaginable. You can do this at the most microscopic levels:

  • The bookkeeping software I’ve been using for the last six years makes a beep when you record a transaction.
  • The Apple iPod includes a tiny internal speaker so that the thumbwheel sounds like it’s clicking when you rotate it.
  • The Sonos digital music system has a handheld controller with a motion sensor built in. The instant you pick it up, the screen lights up.

You can also screw it up:

  • Most modern cell phones have mushy on/off buttons that take forever to turn on. It’s hard to tell if you didn’t press hard enough because the phone acts like it’s ignoring you.
  • The people who make DVD movies seem to think that it’s OK to disable the Menu and Fast Forward buttons while they’re showing you advertisements and ridiculous FBI warnings.
  • Web browsers deal with the security problem by displaying a seemingly endless series of modal popups asking you to confirm that you really want to have three NFL linebackers come into your home and force-feed you a football.
  • That Sonos controller has a thumbwheel that’s too sensitive to choose menu items without a lot of futzing around. Or maybe it’s just because I have fat thumbs.

In the last year or so a lot of web developers have been working hard on improving their applications using techniques now known as Ajax. These applications use JavaScript code so that when you click on something, you get immediate feedback, rather than waiting for the web server to send you a new page at its own leisurely pace. When they do need more information from the server, they often download the small fragment they need, rather than waiting for the server to build a whole new page. The net result is faster, crisper feedback that makes you feel in control and creates “subjective well-being,” a.k.a. happiness, a feeling that is biochemically NO DIFFERENT THAN EATING LARGE QUANTITIES OF CHOCOLATE.

Just a minute… I have to pause for some fact checking …

… ok, I’m back. To summarize, Ajax = Chocolate = Happiness, and so we knew, when we started planning FogBugz 5.0, that Ajax features would be an important part of this release.

The two places FogBugz users spend most of their time are the single case page, where you view and edit cases, and the list page, where you browse, sort, slice and dice cases. With 5.0 we basically took the approach that we would go crazy with those two pages, improving everything we could about the experience using JavaScript and Ajax.

On the list page, for example, Ben added the ability to drag and drop columns, lots of intuitive new ways to select multiple bugs, the ability to resize columns and add arbitrary columns of data. It’s all done on the client and it’s all very fast.

On the single case page, where you’re looking at a single bug or email, Brett made it so that commands like Edit or Reply happen instantaneously, on the client side, in the browser, without a round trip to the server. The net result is that when you’re working through a lot of cases, you need about half as many round-trips to the server making the whole experience feel much, much more responsive. You feel in control, and you are happier. It works!

Brett also snuck in a feature he’s been itching for: lots and lots and lots of keyboard shortcuts. There’s only one keyboard shortcut you have to memorize, though: Ctrl+; switches FogBugz into keyboard mode and little letters light up reminding you what the shortcuts are for various commands around the screen. It’s really pretty cool to be able to work through a bunch of cases, assigning, editing, and reprioritizing, without ever reaching for the mouse. Combined with the speed and responsiveness from Ajax, FogBugz has almost reached the level of speed and fluidity of my dry cleaner’s DOS 2.0 character mode database application. And that’s pretty darn responsive for a web app.

Anyway, because FogBugz is not a hosted product—we sell the software to our customers, who install it on their own servers—we try not to have too many releases, and we try to make each release really, really solid. But we do have our own FogBugz server which runs the company—it sorts incoming email, tracks bugs and features under development, serves as our recruiting database and resume file, routes incoming faxes, and manages purchase orders; I’m even using FogBugz to edit the next edition of Best Software Writing.

In a rather extreme form of eating our own dogfood, the developers put their latest build up every few days so we can all bang on it.

The more we played with the new Ajax features the more we fell in love, and the more we realized that this was the single greatest thing we had done in FogBugz in a looooong time. So we decided to ship the new features as soon as possible. We would take a few months going through a complete beta cycle, and get this stuff out to our customers right away rather than waiting for the other planned 5.0 features.

And that’s where we are today. What’s shipping today is really something like FogBugz 4½, but we’re calling it 5.0 anyway, because life is confusing enough without fractions. We’re only on year six of the “great software takes ten years” rule, but I’d say we’re more than 60% there. Check out the FogBugz homepage; there’s an online demo at try.fogbugz.com.

 

The Development Abstraction Layer

A young man comes to town. He is reasonably good looking, has a little money in his pocket. He finds it easy to talk to women.

He doesn’t speak much about his past, but it is clear that he spent a lot of time in a soulless big company.

He is naturally friendly and outgoing, and quietly confident without being arrogant. So he finds it easy to pick up small gigs from the job board at the local Programmer’s Cafe. But he rapidly loses interest in insurance database projects, vanity web pages for housewives, and financial calculation engines.

After a year, he calculates that he has saved up enough money to pay his modest expenses for a year. So, after consulting with his faithful Alsatian, he sets up a computer in a sunfilled room in his rented apartment above the grocery store and installs a carefully-chosen selection of tools.

One by one, he calls his friends and warns them that if he seems remote over the next months, it is only because he is hard at work.

And he sits down to spin code.

And what code it is. Flawless, artistic, elegant, bug free. The user interface so perfectly mimics a user’s thought process that the people he shows it to at the Programmer’s Cafe hardly notice that there is a user interface. It’s a brilliant piece of work.

Encouraged by the feedback of his peers, he sets up in business and prepares to take orders.

His modesty precludes any pretensions, but after a month, the situation in his bank account is not looking encouraging. So far only three orders have been taken: one from his mother, one from an anonymous benefactor at the Programmer’s Cafe, and the one he submitted himself to test the commerce system.

In the second month, no more orders come in.

This surprises him and leaves him feeling melancholy. At the big company, new products were created on a regular basis, and even if they were inelegant and homely, they still sold in reasonable quantities. One product he worked on there went on to be a big hit.

After a few more months pass, his financial situation starts to look a little bit precarious. His dog looks at him sadly, not quite certain what is wrong, but aware that his face is looking a little bit gaunter than usual, and he seems to be unable to get up the energy to go out with friends, or go shopping to restock the dangerously low larder, or even to bathe.

One Tuesday morning, the local grocer has refused to extend him any more credit, and his banker has long since refused to return his calls.

The big company is not vindictive. They recognize talent, and are happy to hire him back, at a higher salary. Soon he is looking better, he has some new clothes, and he’s got his old confidence back. But something, somewhere, is missing. A spark in his eye. The hope that he might become the master of his own destiny is gone.

Why did he fail? He’s pretty sure he knows. “Marketing,” he says. Like many young technicians, he is apt to say things like, “Microsoft has worse products but better marketing.”

When uttered by a software developer, the term “marketing” simply stands in for all that business stuff: everything they don’t actually understand about creating software and selling it.

This, actually, is not really what “marketing” means. Actually Microsoft has pretty terrible marketing. Can you imagine those dinosaur ads actually making someone want to buy Microsoft Office?

Software is a conversation, between the software developer and the user. But for that conversation to happen requires a lot of work beyond the software development. It takes marketing, yes, but also sales, and public relations, and an office, and a network, and infrastructure, and air conditioning in the office, and customer service, and accounting, and a bunch of other support tasks.

But what do software developers do? They design and write code, they lay out screens, they debug, they integrate, and they check things into the source code control repository.

The level a programmer works at (say, Emacs) is too abstract to support a business. Developers working at the developer abstraction layer need an implementation layer — an organization that takes their code and turns it into products. Dolly Parton, working at the “singing a nice song” layer, needs a huge implementation layer too, to make the records and book the concert halls and take the tickets and set up the audio gear and promote the records and collect the royalties.

Any successful software company is going to consist of a thin layer of developers, creating software, spread across the top of a big abstract administrative organization.

The abstraction exists solely to create the illusion that the daily activities of a programmer (designing and writing code, checking in code, debugging, etc.) are all that it takes to create software products and bring them to market. Which gets me to the most important point of this essay:

Your first priority as the manager of a software team is building the development abstraction layer.

Most new software managers miss this point. They keep thinking of the traditional, Command-and-Conquer model of management that they learned from Hollywood movies.

According to Command-and-Conquer, managers-slash-leaders figure out where the business is going to go, and then issue the appropriate orders to their lieutenants to move the business in that direction. Their lieutenants in turn divide up the tasks into smaller chunks and command their reports to implement them. This continues down the org-chart until eventually someone at the bottom actually does some work. In this model, a programmer is a cog in the machine: a typist who carries out one part of management’s orders.

Some businesses actually run this way. You can always tell when you are dealing with such a business, because the person you are talking to is doing something infuriating and senseless, and they know it, and they might even care, but there’s nothing they can do about it. It’s the airline that loses a million-mile customer forever because they refuse to change his non-refundable ticket so he can fly home for a family emergency. It’s the ISP whose service is down more often than it’s up, and when you cancel your account, they keep billing you, and billing you, and billing you, but when you call to complain, you have to call a toll number and wait on hold for an hour, and then they still refuse to refund you, until you start a blog about how badly they suck. It’s the Detroit automaker that long since forgot how to design cars that people might want to buy and instead lurches from marketing strategy to marketing strategy, as if the only reason we don’t buy their crappy cars is because the rebate wasn’t big enough.

Enough.

Forget it. The command-hierarchy system of management has been tried, and it seemed to work for a while in the 1920s, competing against peddlers pushing carts, but it’s not good enough for the 21st century. For software companies, you need to use a different model.

With a software company, the first priority of management needs to be creating that abstraction for the programmers.

If a programmer somewhere is worrying about a broken chair, or waiting on hold with Dell to order a new computer, the abstraction has sprung a leak.

Think of your development abstraction layer as a big, beautiful yacht with insanely powerful motors. It’s impeccably maintained. Gourmet meals are served like clockwork. The staterooms have twice-daily maid service. The navigation maps are always up to date. The GPS and the radar always work and if they break there’s a spare below deck. Standing on the bridge, you have programmers who really only think about speed, direction, and whether to have Tuna or Salmon for lunch. Meanwhile a large team of professionals in starched white uniforms tiptoes around quietly below deck, keeping everything running, filling the gas tanks, scraping off barnacles, ironing the napkins for lunch. The support staff knows what to do but they take their cues from a salty old fart who nods ever so slightly in certain directions to coordinate the whole symphony so that the programmers can abstract away everything about the yacht except speed, direction, and what they want for lunch.

Management, in a software company, is primarily responsible for creating abstractions for programmers. We build the yacht, we service the yacht, we are the yacht, but we don’t steer the yacht. Everything we do comes down to providing a non-leaky abstraction for the programmers so that they can create great code and that code can get into the hands of customers who benefit from it.

Programmers need a Subversion repository. Getting a Subversion repository means you need a network, and a server, which has to be bought, installed, backed up, and provisioned with uninterruptible power, and that server generates a lot of heat, which means it needs to be in a room with an extra air conditioner, and that air conditioner needs access to the outside of the building, which means installing an 80 pound fan unit on the wall outside the building, which makes the building owners nervous, so they need to bring their engineer around, to negotiate where the air conditioner unit will go (decision: on the outside wall, up here on the 18th floor, at the most inconvenient place possible), and the building gets their lawyers involved, because we’re going to have to sign away our firstborn to be allowed to do this, and then the air conditioning installer guys show up with rigging gear that wouldn’t be out of place in a Barbie play-set, which makes our construction foreman nervous, and he doesn’t allow them to climb out of the 18th floor window in a Mattel harness made out of 1/2″ pink plastic, I swear to God it could be Disco Barbie’s belt, and somebody has to call the building agent again and see why the hell they suddenly realized, 12 weeks into a construction project, that another contract amendment is going to be needed for this goddamned air conditioner that they knew about before Christmas and they only just figured it out, and if your programmers even spend one minute thinking about this that’s one minute too many.

To the software developers on your team, this all needs to be abstracted away as typing svn commit on the command line.

That’s why you have management.

It’s for the kind of stuff that no company can avoid, but if you have your programmers worrying about it, well, management has failed, the same way a 100-foot yacht has failed if the millionaire owner has to go down into the engine room and, um, build the engine.

You’ve got your typical company started by ex-software salesmen, where everything is Sales Sales Sales and we all exist to drive more sales. These companies can be identified in the wild because they build version 1.0 of the software (somehow) and then completely lose interest in developing new software. Their development team is starved or nonexistent because it never occurred to anyone to build version 2.0… all that management knows how to do is drive more sales.

On the other extreme you have typical software companies built by ex-programmers. These companies are harder to find because in most circumstances they keep quietly to themselves, polishing code in a garret somewhere, which nobody ever finds, and so they fade quietly into oblivion right after the Great Ruby Rewrite, their earth-changing refactored code somehow unappreciated by The People.

Both of these companies can easily be wiped out by a company that’s driven by programmers and organized to put programmers in the driver’s seat, but which has an excellent abstraction that does all the hard work to convert code into products below the decks.

A programmer is most productive with a quiet private office, a great computer, unlimited beverages, an ambient temperature between 68 and 72 degrees (F), no glare on the screen, a chair that’s so comfortable you don’t feel it, an administrator that brings them their mail and orders manuals and books, a system administrator who makes the Internet as available as oxygen, a tester to find the bugs they just can’t see, a graphic designer to make their screens beautiful, a team of marketing people to make the masses want their products, a team of sales people to make sure the masses can get these products, some patient tech support saints who help customers get the product working and help the programmers understand what problems are generating the tech support calls, and about a dozen other support and administrative functions which, in a typical company, add up to about 80% of the payroll. It is not a coincidence that the Roman army had a ratio of four servants for every soldier. This was not decadence. Modern armies probably run 7:1. (Here’s something Pradeep Singh taught me today: if only 20% of your staff is programmers, and you can save 50% on salary by outsourcing programmers to India, well, how much of a competitive advantage are you really going to get out of that 10% savings?)
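Pradeep Singh’s observation is easy to verify with back-of-the-envelope arithmetic. The 20% and 50% figures come from the text above; the payroll total is invented purely for illustration:

```python
# Back-of-the-envelope check of the outsourcing argument above.
# The 20% and 50% figures are from the essay; the payroll total is hypothetical.
total_payroll = 1_000_000      # hypothetical annual payroll, in dollars
programmer_share = 0.20        # programmers are only ~20% of staff costs
salary_savings = 0.50          # outsourcing cuts programmer salaries in half

savings = total_payroll * programmer_share * salary_savings
fraction_saved = savings / total_payroll

print(f"${savings:,.0f} saved, i.e. {fraction_saved:.0%} of total payroll")
# -> $100,000 saved, i.e. 10% of total payroll
```

Halving the cost of a fifth of your payroll shaves off only a tenth of the total, which is the whole point: when programmers are a thin layer, programmer salaries are not where the money goes.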

Management’s primary responsibility is to create the illusion that a software company can be run by writing code, because that’s what programmers do. And while it would be great to have programmers who are also great at sales, graphic design, system administration, and cooking, it’s unrealistic. Like teaching a pig to sing, it wastes your time and it annoys the pig.

Microsoft does such a good job at creating this abstraction that Microsoft alumni have a notoriously hard time starting companies. They simply can’t believe how much went on below decks and they have no idea how to reproduce it.

Nobody expects Dolly Parton to know how to plug in a microphone. There’s an incredible infrastructure of managers, musicians, recording technicians, record companies, roadies, hairdressers, and publicists behind her who exist to create the abstraction that when she sings, that’s all it takes for millions of people to hear her song. All the support staff and management that make Dolly Parton possible can do their jobs best by providing the most perfect abstraction: the most perfect illusion that Dolly sings for us. It is her song. When you’re listening to her on your iPod, there’s a huge infrastructure that makes that possible, but the very best thing that infrastructure can do is disappear completely. Provide a leakproof abstraction that Dolly Parton is singing, privately, to us.

Foreword to “Eric Sink on the Business of Software”

Eric Sink has been hanging around Joel on Software since the early days. He was one of the creators of the Spyglass web browser, he created the AbiWord open-source word processor, and now he’s a developer at SourceGear, which produces source code control software.

But most of us around here know him from his contributions as host of The Business of Software, a discussion group that has become the hub for the software startup crowd. He coined the term micro-ISV, he’s been writing about the business of software on his blog for several years, and he wrote an influential series of articles for MSDN. He just published a full-fledged, dead-trees paper book called Eric Sink on the Business of Software, and he asked me to write the foreword, which appears here.

 

Did I ever tell you the story of my first business?

Let me see if I can remember the whole thing. I was fourteen, I think. They were running some kind of a TESOL summer institute at the University of New Mexico, and I was hired to sit behind a desk and make copies of articles from journals if anybody wanted them.

There was a big urn full of coffee next to the desk, and if you wanted coffee, you helped yourself and left a quarter in a little cup. I didn’t drink coffee, myself, but I did like donuts and thought some nice donuts would go well with the coffee.

There were no donut stores within walking distance of my little world, so, being too young to drive, I was pretty much cut off from donuts in Albuquerque. Somehow, I persuaded a graduate student to buy a couple of dozen every day and bring them in. I put up a handwritten sign that said “Donuts: 25¢ (Cheap!)” and watched the money flow in.

Every day, people walked by, saw the little sign, dropped some money in the cup, and took a donut. We started to get regulars. The daily donut consumption was going up and up. People who didn’t even need to be in the institute lounge veered off of their daily routes to get one of our donuts.

I was, of course, entitled to free samples, but that barely made a dent in the profits. Donuts cost, maybe, a dollar a dozen. Some people would even pay a dollar for a donut just because they couldn’t be bothered to fish around in the money cup for change. I couldn’t believe it!

By the end of the summer, I was selling two big trays a day… maybe 100 donuts. Quite a lot of money had piled up… I don’t remember the exact amount, but it was hundreds of dollars. This was 1979, you know. In those days, that was enough money to buy, like, every donut in the world, although by then I was sick of donuts and starting to prefer really, really spicy cheese enchiladas.

So, what did I do with the money? Nothing. The chairman of the linguistics department took it all. He decided that the money should be used to hold a really big party for all the institute staff. I wasn’t allowed to come to the party because I was too young.

The moral of the story?

Um, there is no moral.

But there is something incredibly exciting about watching a new business grow. It’s the joy of watching the organic growth that every healthy business goes through. By “organic,” I mean, literally, “of or designating carbon compounds.” No, wait, that’s not what I mean. I mean plant-like, gradual growth. Last week you made $24. This week you made $26. By this time next year you might be making $100.

People love growing businesses for the same reason they love gardening. It’s really fun to plant a little seed in the ground, water it every day, remove the weeds, and watch a tiny sprout grow into a big bushy plant full of gorgeous hardy mums (if you’re lucky) or stinging nettles (if you got confused about what was a weed, but don’t lose hope, you can make tea out of the nettles, just be careful not to touch ‘em).

As you look at the revenues from your business, you’ll say, “gosh, it’s only 3:00, and we’ve already had nine customers! This is going to be the best day ever!” And the next year nine customers will seem like a joke, and a couple of years later you’ll realize that that intranet report listing all the sales from the last week is unmanageably large.

One day, you’ll turn off the feature that emails you every time someone buys your software. That’s a huge milestone.

Eventually, you’ll notice that one of the summer interns you hired is bringing in donuts on Friday morning and selling them for a buck. And I can only hope that you won’t take his profits and use them for a party he’s not invited to.

What Makes It Great? (First Draft)

Now that we’ve more-or-less defined “design,” since the working title of this series is Great Design, I’d better come up with a working definition of “great.”

Just about every product category has its blue-chip, gold-plated stars. Movie stars? Brad Pitt. Best rock song of all time? Sweet Home Alabama, of course. Office chairs? The Herman Miller Aeron. Portable MP3 players? Clearly the Apple iPod.

What do these products have in common?

Brad Pitt can attract millions of people to the box office. He’s very good looking, and very charismatic, and it’s not even clear if he can act, but who cares?

Sweet Home Alabama is one of the catchiest songs of all time. It’s extremely popular despite the fact that it’s impossible to sing or hum (the refrain requires harmony), the melody is awkward, and the lyrics include a couple of reprehensible lines defending Alabama’s racist and segregationist governor George Wallace, but few people really notice the flaws; they just enjoy the song.

The Aeron chair became the symbol of high end office chairs. It’s expensive and looks like a giant cockroach, but when the directors of 24 need to show the canonical “super luxury office chair” for the White House, they use an Aeron.

And finally, the iPod. Ah, the iPod. It’s way more expensive than any competitive MP3 player. It has fewer features than the competition. The iPod nano, the tiny one that everybody’s raving about, is the only product I’ve ever seen that can be scratched beyond all recognition just by touching it lightly with your finger, and the shiny mirror back will be permanently covered in greasy fingerprint smudges from the moment you take it out of the elegant package until the battery wears out and you have to throw away the whole thing and buy another. But who cares?

The blue chip product in every category can usually be thought of as being popular despite obvious design flaws. Weird.

As the design gets better and better, as the product becomes more and more suitable to its users’ needs, it becomes more likely to be chosen by customers. So the 40GB MP3 player, all else being equal, will outsell the 20GB MP3 player. The easy-to-use phone will outsell the hard-to-use phone. All else being equal. That part is not weird.

But that only gets you so far, as Creative, makers of the unloved ZEN MP3 players, are learning the hard way. Despite having products that are better than the iPod by just about every reasonable metric, they are unable to even come close to Apple iPod’s dominant market share. They’re cheaper. They have more memory. They support more file formats. Etc. Doesn’t matter: they still have single-digit market share while iPod is probably in the 80s somewhere.

That’s because good design can only take you so far. Getting every aspect of the design perfect, making a usable product, making the right tradeoffs between price and functionality, between flexibility and ease of use, between weight and battery life, etc., etc., etc., is all really important, but the most it can possibly get you is to #2.

It’s like beauty. A wannabe model can be tall, with a perfectly symmetrical face, beautiful skin, lovely eyes, and perfectly straight white teeth, and still be considered unattractive. On the other hand, you can have a gigantic broken nose, or be completely lacking in eyebrows, or have a giant gap between your two front teeth, and still be People Magazine’s Sexiest Whatever of the Year.

How do you get to be #1? That’s the mystery here. And since certain markets (graphical operating systems, online auctions, and apparently MP3 players) seem to be winner-take-all markets, being #2 or #3 may not be good enough.

So this is what I’m talking about when I say “Great Design.” It’s that ineffable quality that certain incredibly successful products have that makes people fall in love with them despite their flaws. It’s extremely hard to pull off. I sure as heck can’t do it. But, if you bear with me, I think I have some theories as to what’s happening. While these theories do not exactly add up to a recipe for making good products into great products, they may give you a clue as to what’s going on when people go crazy about the Aeron chair or Julia Roberts.

Here’s the overall plan for this series of articles. First, I’m going to go through good design, namely, all the things you should know to get your design adequate given the current state of the art. Ease of use is a fundamental part of that so I’ll spend a lot of time on usability.

Later, once I’ve got all the obvious things taken care of, you’ll have a really usable design and one which meets your customers’ needs, and in fact, if you pay more attention to these usability things than your competitors, you may have the best design, but that’s not going to get you to #1.

“Every time I read Jakob Nielsen,” I wrote in 2000, “I get this feeling that he really doesn’t appreciate that usability is not the most important thing on earth. Sure, usability is important (I wrote a whole book about it). But it is simply not everyone’s number one priority, nor should it be. You get the feeling that if Mr. Nielsen designed a singles bar, it would be well lit, clean, with giant menus printed in Arial 14 point, and you’d never have to wait to get a drink. But nobody would go there; they would all be at Coyote Ugly Saloon pouring beer on each other.”

So in the final articles, roughly the last third of the series, I’ll peek under the covers at the black magic of great design. You may not be able to pull it off. It takes real talent, not just hard work. But at least I hope you’ll recognize some of the things that are going on that make certain gadgets, software, songs, movie stars, and office chairs make that leap from merely thoroughly good to truly and significantly great.

 

Great Design: What is Design? (First Draft)

OK, buckle down, we’ve got a lot of ground to cover.

In our last episode, I introduced the tentative title “Great Design” for this series of articles. I have something very specific in mind when I use the words “great” and “design,” and it’s worth spending some time defining it.

First, “design.”

You know those gorgeous old brownstones in New York City? With the elaborate carvings, gargoyles, and beautiful iron fences? Well, if you dig up the old architectural plans, the architect would often just write something like “beautiful fretwork” on the drawing, and leave it up to the artisan, the old craftsman from Italy, to come up with something, fully expecting that it will be beautiful.

That’s not design. That’s decoration. What we, in the software industry, collectively refer to as Lipstick on a Chicken. If you have been thinking that there is anything whatsoever in design that requires artistic skill, well, banish the thought. Immediately, swiftly, and promptly. Art can enhance design but the design itself is strictly an engineering problem. (But don’t lose hope — I’ll talk more about beauty in future articles).

Design, for my purposes, is about making tradeoffs.

Let’s design a trashcan for a city street corner, shall we?

Let me give you some design constraints.

It has to be pretty light, because the dustboys, er, sanitation engineers come by and they have to pick it up to dump the trash in the garbage truck.

Oh, and it has to be heavy, or it will blow away in the wind or get knocked over. (True story: I once got in an accident because a trash can blew in front of our car. Nobody was hurt, not even the trashcan.)

It has to be really big. People throw away a lot of trash throughout the day and at a busy intersection if you don’t make it big enough, it overflows and garbage goes everywhere. When that happens, one of the little six-pack plastic ringy-dingies will get in the ocean, and a cute little birdy will get ensnared in it, and choke to death. YOU DON’T WANT TO KILL BIRDIES, DO YOU?

Oh, also, it needs to be pretty small, because otherwise it’s going to take up room on the sidewalk, forcing the pedestrians to squeeze past each other, which, possibly, when the Effete Yuppie Listening to His iPod gets distracted by a really funny joke on the Ricky Gervais podcast and accidentally brushes against the Strangely Haunted Vietnam-Era Veteran, can result in an altercation of historic proportions.

OK: light, heavy, big, and small. What else? It should be closed on the top, so rubbish doesn’t fly away in the wind. It should be open on the top, so it’s easy to throw things away.

It should be really, really, really cheap.

Notice a trend? When you’re designing something, you often have a lot of conflicting constraints.

In fact, that’s a key part of design: resolving all of these conflicting goals.

The only goal that usually doesn’t conflict is the requirement that whatever you design be really, really cheap.

Every design decision involves tradeoffs, whether it’s finding space for all your icons on the toolbar, picking the optimal balance of font size and information density, or deciding how best to use the limited space for buttons on a cellphone.

Every new feature is a tradeoff between the people who could really use such a feature and the people who are just going to get overwhelmed by all the options. The reason 1950s-era telephones were so much easier to use than modern office phones is that they just didn’t do much. Without voicemail, conference calling, three-way calling, and Java games, all you need is a way to dial numbers and hang up on the man claiming to be selling police benevolence.

By which I mean to say: even if you think your new feature is all good and can’t hurt because “people who don’t care can just ignore it,” you’re forgetting that the people who allegedly don’t care are still forced to look at your feature and figure out if they need it.

“How could a mute button on a sound system hurt?” After all, if you don’t want to waste time learning about the mute button, you can just ignore it completely, right?

No. Because at some point, someone will hit it by mistake, and no sound will come out of the speakers, and if they don’t know about “mute,” they’ll start trying to turn up the volume knob all the way, so when they do finally unmute the thing, the speakers will blow out with an ear-shattering boom that creates permanent, concave warps in each of the walls of the room where the sound system was installed (and a matching hump in the floor of the apartment upstairs).

And since the mute button takes up space on the control panel, now all the other control panel buttons have to be a bit smaller, making them harder to read, and there are more buttons, so the whole interface looks scarier. I swear, it’s gotten to the point where I don’t dare try to use the clock radio in a hotel room to wake me up. With all the options they have I can never quite tell if I’m setting the alarm clock to wake me up in time for my Very Important Meeting, or programming the damn thing to download the latest news from Mongolia on the half-hour.

“So,” you think, “simplicity, is that it?” No! I wish it was that easy!

Because without conference calling, you’re just not going to sell any office telephones.

If the nifty graphics application you developed doesn’t give users 16777216 choices for colors, you’re not going to sell a copy to Yale, which needs Yale Blue (Pantone 289) for all their documents.
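That oddly precise number, 16777216, is simply the count of 24-bit RGB colors: 8 bits for each of three channels.

```python
# 16777216 is not arbitrary: it is every color expressible in 24-bit RGB,
# i.e. 256 possible values for each of the red, green, and blue channels.
colors = 256 ** 3
print(colors)               # 16777216
assert colors == 2 ** 24    # same thing: 24 bits of color
```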

You see? There are two requirements: lots of features and few features. Ah! And that is where the zen-like mystery of design comes in. When you’re designing, you’re satisfying lots of difficult constraints. One false move, and you fall into the abyss. It’s frigging hard to get this right. You think I know how to solve the Motorola RAZR power-switch problem? Heck no! I’m sure that the design team over there spent weeks working on this. I’m sure that some engineer or team of engineers went to absolutely heroic lengths, staying up late and coming in on weekends, to make the RAZR keyboard light up right away when you press the ON button, even though you don’t notice this in daylight, because they know about the problem I whined about in the introduction and just couldn’t do anything about it, because turning on a modern cellphone requires booting up a computer, and not a very fast computer, for that matter, before you can get things on the main screen to light up.

Design Adds Value Faster Than It Adds Cost

The Motorola RAZR is now selling at a rate of about four million units each month — 1.5 per second. If Motorola spends another million dollars or two improving the design, they can make it back in a day.

Design is something you only have to pay for once for your product. It’s a part of the fixed costs in the equation, not the variable costs. But it adds value to every unit sold. That’s what Thomas C. Gale, the famous Chrysler automobile designer who retired in 2001, meant when he said that “Good design adds value faster than it adds cost.”

(Footnote: “Autos on Friday/Design; He Put a New Face on Chrysler,” The New York Times, February 9, 2001, by Jim McCraw, Late Edition – Final, Section F, Page 1, Column 1.)
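Gale’s line is just fixed-cost amortization. Using the RAZR figures quoted above (the two-million-dollar design budget is a hypothetical for illustration, not Motorola’s actual number):

```python
# Amortizing a one-time design cost over the RAZR's quoted sales volume.
design_cost = 2_000_000         # hypothetical one-time design investment, dollars
units_per_month = 4_000_000     # sales rate quoted in the text

# Design is a fixed cost: its per-unit burden shrinks as volume rises.
cost_per_unit = design_cost / units_per_month
print(f"${cost_per_unit:.2f} of design cost per unit")   # $0.50 of design cost per unit

# Sanity-check the "1.5 per second" figure, assuming a 30-day month.
units_per_second = units_per_month / (30 * 24 * 3600)
print(f"{units_per_second:.1f} units per second")        # 1.5 units per second
```

At that volume, a single day moves about 133,000 units, so recovering the whole budget in a day takes only about $15 of added value per unit sold that day; meanwhile the amortized cost over a month of sales is fifty cents a unit and falling.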

That’s what makes it so important. By the time I’ve finished this series of articles, I think you’ll be utterly convinced that nothing matters more to a product’s success — whether it’s a software product, website, cell phone, or garbage can — than good design. And as for great design? Well, that’s coming up in the next installment. Stay tuned.

 

Introduction to Great Design (Second Draft, In Progress)

Confession: I’m afraid to turn off my cell phone.

Not because I’m afraid of being out of touch, mind you. Heck, I couldn’t care less if people can reach me. If you have something to tell me that’s so important it would be worth interrupting Will and Grace, well, I think I’d rather have another 30 minutes of ignorant bliss before I find out about it. That’s my motto: Will and Grace First, Earthquakes and Floods Later.

Here’s why I’m afraid to turn off my cell phone: because I can’t always seem to muster the brain cells necessary to turn it back on.

It has two buttons on it, a happy green button and a scary red button. They have funny icons on them that don’t mean very much to me.

You might think that the green button turns it on. Green means go, right?

Wrong.

I tried that. Nothing doing. I tried pressing and holding the green button, because sometimes these phones want you to press-and-hold so that you won’t accidentally take a picture of your ear, or disconnect the phone call in the middle of an important job interview, or whatnot.

It turns out it’s the red button that turns it on.

When you press the red button, usually, nothing actually happens, so you suspect you might have done something wrong.

It turns out that you have, actually, turned on the phone, and if you’re in a dark room, you would have noticed that the keyboard flashed when you turned it on. In a bright room, nothing happens for six seconds. That’s usually long enough to think that you’ve done something wrong. So that’s when I start trying the other buttons, like the happy green button. In any case, I wind up feeling frustrated and not in control of my life.

Once you do learn that the red button turns the phone on, and you don’t have to hold it, you start to get frustrated that the time it takes the phone to boot up and load the pretty background picture and get on the network is something like 30 seconds. That’s frustrating, too. It seems like in the Olden Days you didn’t have to wait for half a minute to turn on an appliance. There was a switch, up was on (unless you lived in Europe, where they had a terrible war and couldn’t afford appliances), you switched it, the thing went on and started spinning or shining or whatever it is that the thing was supposed to do. Instantly. End of story.

Indeed, it’s surprising just how many of today’s devices and gadgets and remote controls have actually made TVs, stoves, and telephones harder to use. Suddenly, bad computer user interface design is seeping into the entire world.

Six years ago, with the total dominance of the consistent graphical interface of Mac and Windows, it seemed like the state of the art in software UI design was getting pretty good. Nothing fabulous, mind you, but pretty good. You could sit down with a new Windows app that you’d never seen before and have a pretty good chance of being able to operate it correctly.

That's when I wrote a book called User Interface Design for Programmers, thinking, great! It's time to get everybody on the same page, here, about how we design user interfaces, and then life will be wonderful.

Unfortunately, that was about the same time as there was a huge wave of new consumer gadgets, and, of course, that web thing hit us.

The web didn’t really have a standard UI. You could make anything be a link. We didn’t have dropdown menus, so we made do with all kinds of differently-behaved simulations of dropdown menus.

Gadgets? Gadgets were even worse. They had tiny keyboards and tinier screens. Combined with rampant featuritis, these damn devices did more and more things, but just figuring out how to do them took a degree in engineering (or a bright 12-year-old, but slavery has been abolished, especially for 12-year-olds).

Maybe nobody told the people who design gadgets and gizmos and websites (and even software) that they need to work on their user interface skills.

So, this is their wake-up call.

While most products were becoming increasingly incomprehensible, like the typical home entertainment remote control, with dozens of mushy little buttons marked “MTS” or “SURR” or “PTY” that nobody has any hope of understanding, something else was happening: a very few, very good designers were, somehow, coming up with truly great designs that were beautiful, easy to understand, fun, and which made people happy. You know who they are because those products became bestsellers. The Apple iPod. TiVo. Google. Even the Motorola RAZR, which is so hard to turn on, is, in most ways, a great design.

Over the next weeks and months, if all goes well, I’m going to write a series of articles right here, on this website, on UI design for the modern age. The whole series will be, tentatively, named Great Design.

If all goes well, we’re going to look at some of the original principles of good UI design, much of which I covered in the first book, and revisit them and see how they apply to today’s world of miniature gadgets, websites, and street-corner garbage cans.

Then, if we’re really lucky, we’re going even further. We’re going to look at what it takes to make the leap from a serviceable, decent product design to a Mindbogglingly Great, Earth-Shaking, History-Changing product design. I have some theories about that, too.

 

Micro-ISV: From Vision to Reality

This is my foreword to Bob Walsh’s new book, Micro-ISV: From Vision to Reality.

How the heck did I become the poster child for the MicroISV movement?

Of all people. Sheesh.

When I started Fog Creek Software there was gonna be nothing “micro” about it. The plan was to build a big, multinational software company with offices in 120 countries and a skyscraper headquarters in Manhattan, complete with a heliport on the roof for quick access to the Hamptons. It might take a few decades–after all, we were going to be bootstrapped and we always planned to grow slowly and carefully–but our ambitions were anything but small.

Heck, I don’t even like the term MicroISV. The “ISV” part stands for Independent Software Vendor. It’s a made-up word, made up by Microsoft, to mean “software company that is not Microsoft,” or, more specifically, “software company that for some reason we have not yet bought or eliminated, probably because they are in some charming, twee line of business, like wedding table arrangements, the quaintness of which we are just way too cool to stoop down to, but you little people feel free to enjoy yourselves. Just remember to use .NET!”

It’s like that other term, legacy, that Microsoft uses to refer to all non-Microsoft software. So when they refer to Google, say, as a “legacy search engine” they are trying to imply that Google is merely “an old, crappy search engine that you’re still using by historical accident, until you bow to the inevitable and switch to MSN.” Whatever.

I prefer “software company,” and there’s nothing wrong with being a startup. Startup software company, that’s how we describe ourselves, and we don’t see any need to define ourselves in relation to Microsoft.

I suppose you’re reading this book because you want to start a small software company, and it’s a good book to read for that purpose, so let me use my pulpit here to provide you with my personal checklist of three things you should have before you start your Micro… ahem, startup software company. There are some other things you should do; Bob covers them pretty well in the rest of the book, but before you get started, here’s my contribution.

Number One. Don’t start a business if you can’t explain what pain it solves, for whom, why your product will eliminate this pain, and how the customer will pay you to solve it. The other day I went to a presentation of six high tech startups and not one of them had a clear idea of what pain they were proposing to solve. I saw a startup that was building a way to set a time to meet your friends for coffee, a startup that wanted you to install a plug-in in your browser to track your every movement online in exchange for being able to delete things from that history, and a startup that wanted you to be able to leave text messages for your friend that were tied to a particular location (so if they ever walked past the same bar they could get a message you had left for them there). What they all had in common was that none of them solved a problem, and all of them were as doomed as a long-tailed cat in a room full of rocking chairs.

Number Two. Don’t start a business by yourself. I know, there are lots of successful one-person startups, but there are even more failed one-person startups. If you can’t even convince one friend that your idea has merit, um, maybe it doesn’t? Besides, it’s lonely and depressing and you won’t have anyone to bounce ideas off of. And when the going gets tough, which it will, as a one-person operation, you’ll just fold up shop. With two people, you’ll feel an obligation to your partner to push on through. P.S., cats do not count.

Number Three. Don’t expect much at first. People never know how much money they’re going to make in the first month when their product goes on sale. I remember five years ago, when we started selling FogBugz, we had no idea if the first month of sales would be $0 or $50,000. Both figures seemed just as likely to me. I have talked to enough entrepreneurs and have enough data now to give you a definitive answer for your startup.

That’s right, I have a crystal ball, and can now tell you the one fact that you need to know more than anything else: exactly how much money you’re going to make during the first month after your product goes live.

Ready?

OK.

In the first month, you are going to make,

about,

$364, if you do everything right. If you charge too little, you’re going to make $40. If you charge too much, you’re going to make $0. If you expect to make any more than that, you’re going to be really disappointed and you’re going to give up and get a job working for The Man and referring to us people in startup-land as “Legacy MicroISVs.”

That $364 sounds depressing, but it’s not, because you’ll soon discover the one fatal flaw that’s keeping 50% of your potential customers from whipping out their wallets, and then tada! you’ll be making $728 a month. And then you’ll work really hard and you’ll get some publicity and you’ll figure out how to use AdWords effectively and there will be a story about your company in the local wedding planner newsletter and tada! You’ll be making $1456 a month. And you’ll ship version 2.0, with spam filtering and a Common Lisp interpreter built in, and your customers will chat amongst themselves, and tada! You’ll be making $2912 a month. And you’ll tweak the pricing, add support contracts, ship version 3.0, and get mentioned by Jon Stewart on The Daily Show and tada! $5824 a month.

Now we’re cooking with fire. Project out a few years, and if you plug away at it, there’s no reason you can’t double your revenues every 12-18 months, so no matter how small you start (detailed math formula omitted – Ed.), you’ll soon be building your own skyscraper in Manhattan with a heliport so you can get to that 20-acre Southampton spread in 30 minutes flat.
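The back-of-the-envelope math above is just compound growth, and it's easy to check for yourself. Here's a minimal sketch: the $364 starting figure and the doubling pattern come from the essay, but the fixed 18-month doubling period (the slow end of the 12-18 month range) and the function name are illustrative assumptions:

```python
# Illustrative sketch of the doubling model described above.
# The $364/month starting figure is from the essay; the fixed 18-month
# doubling period is an assumption (the essay says 12-18 months).

def projected_monthly_revenue(start=364.0, months=72, doubling_period=18):
    """Month-by-month revenue, doubling every `doubling_period` months."""
    return [start * 2 ** (m / doubling_period) for m in range(months + 1)]

revenue = projected_monthly_revenue()
# Four doublings in six years: 364 -> 728 -> 1456 -> 2912 -> 5824,
# matching the "tada!" sequence above.
print(round(revenue[0]), round(revenue[72]))
```

Of course, real revenue doesn't climb a smooth exponential curve; the point is only that a modest start plus steady doubling compounds into a real business surprisingly fast.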

And that, I think, is the real joy of starting a company: creating something, all by yourself, and nurturing it and working on it and investing in it and watching it grow, and watching the investments pay off. It’s a hell of a journey, and I wouldn’t miss it for the world.