We made a world populated with objects that reacted to what humans did, but they didn’t interact very strongly. But that isn’t enough: pure discovery learning took us 100,000 years to get to science. So you actually need learning that’s facilitated. And if you can’t make 1,000 good teachers in a year to save yourself, you have to have a user interface that can do that.
July 19, 2010
It’s a small thing, but it breeds deep suspicion. Mac OS dialogs always had “OK” buttons (capital O, capital K). Windows dialogs had “Ok” buttons (capital O, lowercase k). “Ok” buttons in Mac/iOS software are a sign of a half-assed port, by someone who doesn’t really know the platform.
March 29, 2010
…But in practice, nearly all the great analytical designs have come from those possessed by the content; people who have learned something important and want to tell the world about what they have learned. That is, content-driven and thinking-driven, and not at all driven by bureaucratic externalities of marketing, human factors, commercial art, focus groups, or ISO standards.
In working on 4 books on analytical design, I have often turned to the human factors literature, and then left in despair, finding few examples or ideas (beyond common-sensical) that were useful in my own work. This contrasts to the work of scientists, artists, art historians, and architects–work overflowing with ideas about evidence, seeing, and the craft of making analytical displays.
I believe that work about analytical displays should be self-exemplifying; that is, the work should show us amazing displays of evidence. My despair about human factors began many years ago upon going through volumes and volumes of the journal, Human Factors, where evidence was reported using statistical graphics of wretched quality, with thinner data and worse designs than even in corporate annual reports.
Also the methodological quality of the research was poor, and so nothing was credible. The findings seemed entirely context-dependent, univariate (design and seeing are profoundly multivariate), and without scope: what did it matter if some students in freshman psychology in Iowa preferred one clunky font compared to another clunky font in an experiment conducted by a teaching assistant? Later, while consulting, I saw this naive dust-bowl empiricism fail again and again for nearly a decade in trying to design a competent PC OS interface. (And with the Mac interface sitting there, smiling, all the time. Apple’s superb interface guidelines seemed to me to be a retrospective account of the beautiful hands-on craft of a few brilliant designers, not a reason to have experimental psychologists attempt to design OS/2 and Windows.)
At any rate, if this was the scientific practice and the design craft of applied psychology, I concluded the field did not have much to contribute to my own work on analytical design.
I happily fled to the classics of science, art, and architecture.
— Edward Tufte, November 27, 2002 (emphasis mine).
It’s still pretty bleak.
February 15, 2010
Obstacles to getting real feedback are now mainly cultural, not technological; any business that isn’t learning from its users doesn’t want to learn from its users.
February 9, 2010
January 13, 2010
Splash screens are evil. While branding is important, the proper place for it is in the iconography, optional “About” or “Info” screens, and App Store profiles. The most common interaction pattern with iPhone applications is to launch them frequently, close them quickly, and treat them as part of a set of tools that interact to comprise a single user experience. Splash screens break the perception of seamlessness.
The HIG offers a very useful suggestion for managing launch states, which may be quite slow, depending on the needs of your application. The suggestion is to provide a PNG image file in your application bundle that acts as a visual stand-in for the initial screen of your application. For example, if the main screen for your application is a table full of data, provide an image of a table without data to act as a stand-in. When your data is ready to be displayed, the image will be flushed from the screen, and the user experience will feel more responsive.
In this book, we will explore extensions of this, including a pattern for loading application state lazily.
— Toby Boudreaux, iPhone User Experience, page 15; emphasis mine.
I’ve always hated splash screens, from the first time I turned on a computer. They get in the way of what I want to do. I want to write, or draw, or play — but if I launch Word, or Photoshop, or any game, I have to sit through a splash screen before I can get to it.
Branding a splash screen is putting your name on a purely negative experience. Nobody wants to wait for their computer, and splash screens, by definition, force you to wait. It’s hard for me to imagine why anyone would want to associate their brand with a computer not doing what customers want.
iPhone 4 Update
Fast App Switching, introduced in iOS 4, makes splash screens a much worse idea. They won’t display consistently, because sometimes the app will really be resuming, not launching for the first time, when the user “launches” it. Forcing a splash screen to appear on resume as well means breaking the “multitasking” experience.
December 7, 2009
Surprisingly, the addition of dot syntax to Objective-C 2.0 has been a major source of controversy. I wonder if there’s some kind of Bike Shed effect at work here: the problem dot syntax causes is trivial¹, while the clarity it brings to code is minor. So it essentially boils down to aesthetics. (For the record, I like the dot, even with its current flaws, but I don’t believe it’s worth fighting for.)
The Actual Problem
The problem is that when you see a.b = c; you don’t know if it’s:

- Assigning the b field of a struct to c. This basically compiles down to one move instruction.
- Invoking the -setB: method on an Objective-C object. By convention -setB: should update the b property of the object to hold the value c, and nothing else. But it might have side effects, or be really slow.
Using a different symbol to access Objective-C properties would remove all ambiguity. Nobody would mistake a@b = c; for assigning to a C struct. It’s clearly an Objective-C construct.
But personally, I’m not a big fan of the @ character. It’s ugly; it’s noisy; there are just too many lines in it. I think U+25B8 ‘BLACK RIGHT-POINTING SMALL TRIANGLE’ would make a better choice:

obj▸property = value;
And since ‘▸’ can’t be part of a valid C identifier, you can basically preprocess your code with s/▸/./, then compile it with existing tools.
Of course, it doesn’t matter which character is picked, so long as it’s clearly different from existing C syntax, and you have a way of replacing it with a . before building.
¹ I’ve heard experienced developers complain that dot syntax means a steeper learning curve for newbies, and that it can be confusing, but I haven’t actually seen one come out and say “I spent X hours debugging a problem that I couldn’t see because of it.” The fact is, any situation that dot syntax would obscure is already pathological. In the end I just can’t see dot syntax mattering much.
November 14, 2009
I did a project years ago called the “Dollar Dudes”, where we got on the subway with a bucket of dollar bills and announced that we were in the lucky “Dollar Train” and that everyone gets a dollar. Most everyone was delighted (at both the dollar and the ridiculousness of it all) but one guy refused to take the money and snapped at me. I was bummed out to get that reaction, but at the end of the day I didn’t feel that one guy getting irritated made the whole project a failure. The other 40 people had fun. I imagine the type of person who gets mad when offered a dollar by a stranger probably gets mad quite a bit throughout his day. I’m not trying or pretending to please every single person we encounter.
Yes, handing out a bucket of money really does upset someone. You have no chance of pleasing everyone. Make tradeoffs accordingly.
October 26, 2009
I’m taking the following as an axiom: Exposing real pre-emptive threading with shared mutable data structures to application programmers is wrong. …It gets very hard to find humans who can actually reason about threads well enough to be usefully productive.
When I give talks about this stuff, I assert that threads are a recipe for deadlocks, race conditions, horrible non-reproducible bugs that take endless pain to find, and hard-to-diagnose performance problems. Nobody ever pushes back.
October 23, 2009
The term GUI, Graphical User Interface, pronounced “Gooey,” is laughably anachronistic. All interfaces meant for people on modern computers are graphical. The right abbreviation to use today is simply UI, for User Interface, pronounced “You I.”
Believe me, I understand that a command line interface is still useful today. I use them. I’m a programmer. I get the whole UNIX thing. Even without a pipe, a command line is the highest-bandwidth input mechanism we have today.
But all command lines live inside a graphical OS. That’s how computers work in the 21st century.
Whenever I see “GUI” written I can’t help but wonder if the author is dangerously out of touch. Do they still think graphical interfaces are a novelty that needs to be called out?