We made a world populated with objects that reacted to what humans did, but they didn’t interact very strongly. But that isn’t enough; pure discovery learning took us 100,000 years to get to science. So you actually need learning that’s facilitated. And if you can’t make 1,000 good teachers in a year to save yourself, you have to have a user interface that can do that.
July 19, 2010
July 16, 2010
Q: Will you apologize for investors?
A: Steve: We are apologizing to our customers. We want investors for the long haul. To those investors who bought the stock and are down $5, I have no apology.
–Steve Jobs, taking questions at a press conference on antenna issues with the iPhone 4 design, July 16th, 2010.
That’s a CEO with his priorities straight.
June 17, 2010
And so continues one of the biggest constants in software development: the unerring sense among developers that the level of abstraction they’re currently working at is exactly the right one for the task at hand. Anything lower-level is seen as barbaric, and anything higher-level is a bloated, slow waste of resources. This remains true even as the overall level of abstraction across the industry marches ever higher.
And in the short term, in the moment, they’re often right. But this arrow points only one way, and that’s in the direction of ever-higher abstraction. To judge how much time remains before the next leap forward, look at the leading edge of the industry.
—John Siracusa (emphasis mine)
Here’s my two cents on the future of abstraction: systems are clearly getting wider (parallel), not faster; technologies like Grand Central Dispatch help us deal with concurrency today, but longer term, I think a functional programming abstraction is the answer.
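To make the contrast concrete: GCD asks you to manage queues and dispatch explicitly, whereas a functional abstraction lets you say *what* to compute over a collection and leaves the *how* of spreading work across cores to the runtime. Since GCD itself is a C API, here is a minimal sketch of that functional style in Python; the names `work` and `parallel_map` are my own illustrative choices, not part of any real framework.

```python
from concurrent.futures import ThreadPoolExecutor


def work(n: int) -> int:
    """A stand-in for some task we want to run on many inputs."""
    return sum(i * i for i in range(n))


def parallel_map(fn, items):
    # The caller expresses only *what* to compute for each item;
    # the executor decides how to distribute the calls across workers.
    # Results come back in input order, just like the sequential map().
    with ThreadPoolExecutor() as pool:
        return list(pool.map(fn, items))


if __name__ == "__main__":
    print(parallel_map(work, [100, 200, 300]))
```

Because `parallel_map` has the same shape as a plain `map`, the parallelism is an implementation detail: the same code could be backed by threads, processes, or (in GCD’s world) a concurrent dispatch queue, without the caller changing.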
June 9, 2010
…one particularly outrageous moment stuck out for me. At about three minutes into the video, senior vice president for iPhone software Scott Forstall extols the virtues of the Retina Display by declaring that “The text… is just perfect!” Meanwhile, the central image in the video at just that moment is this little typographic calamity:
I urge you to fast-forward the time code to 3:02 to hear this for yourself. Forstall is quite literally claiming perfection while a hand model holds up this terrible example of everything that’s wrong with Apple’s commitment to typography. While the letterforms on that virtual page may look gorgeous, it’s apparent to any designer that the text is far from perfectly typeset. It’s hideous, scarred as it is by unsightly “rivers” of bad spacing within the text. No self-respecting typographer would dare call that perfect.
The unrelenting drive for perfection was a quality I always admired in Apple. I hope this is just bullshit spin, and an unfortunate choice of sample frames.
June 7, 2010
The truth is that an iPad app is neither easier nor harder to make than an iPhone app (or a Mac or Windows app), in any general, reasonable, defensible way. Software doesn’t work like that; we don’t have to work twice as hard to cover twice as many pixels on screen. It’s all about the elusive quality factor.
May 24, 2010
Another counter-intuitive finding is that scam victims often have better than average background knowledge in the area of the scam content. For example, it seems that people with experience of playing legitimate prize draws and lotteries are more likely to fall for a scam in this area than people with less knowledge and experience in this field. This also applies to those with some knowledge of investments. Such knowledge can increase rather than decrease the risk of becoming a victim.
March 29, 2010
…But in practice, nearly all the great analytical designs have come from those possessed by the content; people who have learned something important and want to tell the world about what they have learned. That is, content-driven and thinking-driven, and not at all driven by bureaucratic externalities of marketing, human factors, commercial art, focus groups, or ISO standards.
In working on 4 books on analytical design, I have often turned to the human factors literature, and then left in despair, finding few examples or ideas (beyond common-sensical) that were useful in my own work. This contrasts to the work of scientists, artists, art historians, and architects–work overflowing with ideas about evidence, seeing, and the craft of making analytical displays.
I believe that work about analytical displays should be self-exemplifying; that is, the work should show us amazing displays of evidence. My despair about human factors began many years ago upon going through volumes and volumes of the journal Human Factors, where evidence was reported using statistical graphics of wretched quality, with thinner data and worse designs than even in corporate annual reports.
Also the methodological quality of the research was poor, and so nothing was credible. The findings seemed entirely context-dependent, univariate (design and seeing are profoundly multivariate), and without scope: what did it matter if some students in freshman psychology in Iowa preferred one clunky font compared to another clunky font in an experiment conducted by a teaching assistant? Later, while consulting, I saw this naive dust-bowl empiricism fail again and again for nearly a decade in trying to design a competent PC OS interface. (And with the Mac interface sitting there, smiling, all the time. Apple’s superb interface guidelines seemed to me to be a retrospective account of the beautiful hands-on craft of a few brilliant designers, not a reason to have experimental psychologists attempt to design OS/2 and Windows.)
At any rate, if this was the scientific practice and the design craft of applied psychology, I concluded the field did not have much to contribute to my own work on analytical design.
I happily fled to the classics of science, art, and architecture.
— Edward Tufte, November 27, 2002 (emphasis mine).
It’s still pretty bleak.
March 25, 2010
…She bent forward to put a white hand on my knee. “There is wealth in that cellar beneath the garage. You may have whatever you ask.”
I shook my head.
“You aren’t a fool!” she protested. “You know-”
“Let me straighten this out for you,” I interrupted. “We’ll disregard whatever honesty I happen to have, sense of loyalty to employers, and so on. You might doubt them, so we’ll throw them out. Now I’m a detective because I happen to like the work. It pays me a fair salary, but I could find other jobs that would pay more. Even a hundred dollars more a month would be twelve hundred a year. Say twenty-five or thirty thousand dollars in the years between now and my sixtieth birthday.
“Now I pass up about twenty-five or thirty thousand of honest gain because I like being a detective, like the work. And liking work makes you want to do it as well as you can. Otherwise there’d be no sense to it. That’s the fix I am in. I don’t know anything else, don’t enjoy anything else, don’t want to know or enjoy anything else. You can’t weigh that against any sum of money. Money’s good stuff. I haven’t anything against it. But in the past eighteen years I’ve been getting my fun out of chasing crooks and solving riddles. It’s the only kind of sport I know anything about, and I can’t imagine a pleasanter future than twenty-some years more of it. I’m not going to blow that up.”
Excerpt from The Gutting of Couffignal by Dashiell Hammett. All monies are in 1927 US dollars; that sum would buy a couple of nice houses today.
March 23, 2010
Next time you want to gloat about how seminal Cocoa is:
Erich Gamma: Yes, and it is funny that you mention the iPhone. The iPhone SDK is based on the NeXTStep object-oriented frameworks like the AppKit. It already existed when we wrote Design Patterns 15 years ago and was one source of inspiration. We actually refer to this framework in several of our patterns: Adapter, Bridge, Proxy, and Chain of Responsibility.
Richard: Which is a great example of the enduring nature of good design, and how it survives different technical manifestations.
Erich: Just as an aside, it is also easy to forget that we wrote design patterns before there was Java or C#.
March 18, 2010
In a way ideas only count for a little in computing, because you kind of have to implement this stuff. This is the part of the story that really makes me clutch at my throat, because every time you implement something, 5 years go away.