Vincent Gable’s Blog

November 14, 2009

You Can’t Please Everyone

Filed under: Design, Programming, Quotes, Tips, Usability
― Vincent Gable on November 14, 2009

I did a project years ago called the “Dollar Dudes”, where we got on the subway with a bucket of dollar bills and announced that we were in the lucky “Dollar Train” and that everyone gets a dollar. Most everyone was delighted (at both the dollar and the ridiculousness of it all) but one guy refused to take the money and snapped at me. I was bummed out to get that reaction, but at the end of the day I didn’t feel that one guy getting irritated made the whole project a failure. The other 40 people had fun. I imagine the type of person who gets mad when offered a dollar by a stranger probably gets mad quite a bit throughout his day. I’m not trying or pretending to please every single person we encounter.

Charlie Todd (of Improv Everywhere fame)

Yes, handing out a bucket of money really does upset someone. You have no chance of pleasing everyone. Make tradeoffs accordingly.

November 11, 2009

Just Look at it, Man!

Filed under: Bug Bite, Programming
― Vincent Gable on November 11, 2009

You’re looking at Anscombe’s quartet: four datasets with identical simple statistical properties (mean, variance, correlation, linear regression) but obvious differences when graphed.

[Figure: Anscombe’s quartet, the four datasets graphed as scatterplots]


(via Best of Wikipedia)

Graphs aren’t a substitute for numerical analysis, and they’re not a panacea. But they’re excellent for discovering patterns, spotting outliers, and building intuition about a dataset. If you never graph your data, then you’ve never really looked at it.
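
If you want to check the claim, here’s a quick C sketch (mine, using the data values from Anscombe’s 1973 paper) that computes the shared summary statistics of all four datasets:

```c
/* A sketch: verify that Anscombe's four datasets share the same
   summary statistics. Data values are from Anscombe (1973). */
#include <math.h>
#include <stdio.h>

#define N 11

static void summarize(const double *x, const double *y) {
    double mx = 0, my = 0;
    for (int i = 0; i < N; i++) { mx += x[i]; my += y[i]; }
    mx /= N; my /= N;

    double sxx = 0, syy = 0, sxy = 0;   /* centered sums of squares */
    for (int i = 0; i < N; i++) {
        sxx += (x[i] - mx) * (x[i] - mx);
        syy += (y[i] - my) * (y[i] - my);
        sxy += (x[i] - mx) * (y[i] - my);
    }
    double slope = sxy / sxx;           /* least-squares regression */
    double intercept = my - slope * mx;
    double r = sxy / sqrt(sxx * syy);   /* correlation coefficient */

    printf("mean x = %.2f, mean y = %.2f, var x = %.2f, r = %.3f, "
           "fit: y = %.2f + %.3fx\n",
           mx, my, sxx / (N - 1), r, intercept, slope);
}

int main(void) {
    const double x123[N] = {10, 8, 13, 9, 11, 14, 6, 4, 12, 7, 5};
    const double x4[N]   = {8, 8, 8, 8, 8, 8, 8, 19, 8, 8, 8};
    const double y1[N] = {8.04, 6.95, 7.58, 8.81, 8.33, 9.96,
                          7.24, 4.26, 10.84, 4.82, 5.68};
    const double y2[N] = {9.14, 8.14, 8.74, 8.77, 9.26, 8.10,
                          6.13, 3.10, 9.13, 7.26, 4.74};
    const double y3[N] = {7.46, 6.77, 12.74, 7.11, 7.81, 8.84,
                          6.08, 5.39, 8.15, 6.42, 5.73};
    const double y4[N] = {6.58, 5.76, 7.71, 8.84, 8.47, 7.04,
                          5.25, 12.50, 5.56, 7.91, 6.89};

    summarize(x123, y1);  /* all four lines print the same numbers... */
    summarize(x123, y2);
    summarize(x123, y3);
    summarize(x4, y4);    /* ...yet the graphs could hardly differ more */
    return 0;
}
```

Every call prints the same summary: mean x = 9.00, mean y ≈ 7.50, r ≈ 0.816, and the fit y ≈ 3.00 + 0.500x. Only the graphs betray how different the datasets really are.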

War Story

I was working on optimizing color correction, using SSE (high-performance x86 instructions). One operation required division, which is expensive for a computer. The hardware had a divide instruction, but sometimes it’s faster to do the division in software, using the Newton-Raphson method. You never know until you measure.

While doing the measurement, I somehow got the crazy idea to try both: I’d already unrolled the inner loop, so instead of doing the divide (or Newton’s Method) twice, I’d do a hardware divide for one value and then use Newton’s Method for the next. Strangely enough, this hybrid was faster on the hardware I was benchmarking than either method alone. Modern hardware is a complex and scary beast.
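
Here’s a sketch of the idea in C with SSE intrinsics; it’s a reconstruction, not the original shipping code. _mm_rcp_ps produces a reciprocal estimate good to roughly 12 bits, and one Newton-Raphson step, r1 = r0(2 − d·r0), refines it to nearly full single precision:

```c
/* A sketch of the idea, not the original shipping code. */
#include <xmmintrin.h>  /* SSE intrinsics */

/* Reciprocal via the hardware estimate plus one Newton-Raphson step:
   _mm_rcp_ps is good to ~12 bits; r1 = r0*(2 - d*r0) refines it. */
static inline __m128 reciprocal_nr(__m128 d) {
    __m128 r = _mm_rcp_ps(d);
    return _mm_mul_ps(r, _mm_sub_ps(_mm_set1_ps(2.0f),
                                    _mm_mul_ps(d, r)));
}

/* The hybrid unrolled loop: one vector of 4 quotients from the divide
   instruction, the next 4 via Newton-Raphson, alternating. n is
   assumed to be a multiple of 8 to keep the sketch short. */
void divide_hybrid(const float *num, const float *den, float *out, int n) {
    for (int i = 0; i < n; i += 8) {
        __m128 a = _mm_loadu_ps(num + i);
        __m128 b = _mm_loadu_ps(den + i);
        _mm_storeu_ps(out + i, _mm_div_ps(a, b));          /* hardware */

        __m128 c = _mm_loadu_ps(num + i + 4);
        __m128 d = _mm_loadu_ps(den + i + 4);
        _mm_storeu_ps(out + i + 4,
                      _mm_mul_ps(c, reciprocal_nr(d)));    /* software */
    }
}
```

Note that the two halves of the loop round slightly differently. File that away; it matters in a moment.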

I was fortunate enough to have a suite of very good unit tests to run against my optimized code. But there was a caveat to testing correctness. Because computers don’t have infinitely precise arithmetic, two correct algorithms might give different answers, but if the numbers they gave were close enough to the infinitely precise answer (say, a couple of ulps apart, an ulp being a unit in the last place: the gap between adjacent floating-point numbers) it was good enough. (We can only be exact within some Tolerance!) The tests cleared my hybrid divide/Newton-Raphson function. But we couldn’t use it, because it was fundamentally broken.
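
Here’s roughly what an ulps-based comparison looks like; this is a reconstruction (assuming IEEE-754 single precision), not our actual test harness:

```c
/* A sketch of an ulps-based comparison, assuming IEEE-754 floats. */
#include <stdint.h>
#include <string.h>

/* Map a float's bit pattern onto a scale where consecutive integers
   are consecutive representable floats. */
static int32_t ordered_bits(float f) {
    int32_t i;
    memcpy(&i, &f, sizeof i);   /* bit-exact copy, no aliasing UB */
    /* Negative floats sort backwards as raw bits; flip them so the
       whole number line is monotonic. */
    return (i < 0) ? (int32_t)0x80000000 - i : i;
}

/* True if a and b are within max_ulps representable floats of each
   other. (Doesn't handle NaN or infinity; a real harness should.) */
static int nearly_equal(float a, float b, int64_t max_ulps) {
    int64_t d = (int64_t)ordered_bits(a) - ordered_bits(b);
    if (d < 0) d = -d;
    return d <= max_ulps;
}
```

A test then asserts something like nearly_equal(optimized, reference, 2) instead of exact equality.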

Even though the error was acceptably small, it had a nasty distribution. Using the divide instruction gave color values that were a bit too light. Doing the divide in software gave values that were a bit too dark. Individually these errors were fine. Randomly spread over the image, they would have been fine. But processing every other pixel differently had the effect of adding alternating light/dark stripes! We see contrast, not absolute color, so the numerically insignificant error was quite visible. Worse still, bands of 1-pixel stripes combined to form a shimmering Moiré pattern. It was totally busted. Unusable.

This was all immediately obvious when the results of the color correction were “graphed”. Actually looking at the answer caught a subtle error that our suite of unit tests missed.

To be clear, graphical analysis is more subjective, and it’s not a substitute for numerical analysis and data mining. But I believe in actually looking at your data at least once. A graph is a kind of end-to-end visualization of everything, and that has value. Graphs are a cheap sanity check: does everything look right? And sometimes, they can give you real insight into a problem.

November 9, 2009

Spurious

What’s a spurious relationship?

Here’s one: People who eat ice cream are more likely to drown. Both incidence of ice cream eating and rates of drowning are related to summertime. The relationship between ice cream and drowning is spurious. That is, there is no relationship. Yet they appear related because they are both related to a third variable.

Lisa Wade

(Image via the amazing Superdickery)

November 4, 2009

Tolerance

Filed under: Announcement, Quotes
― Vincent Gable on November 4, 2009

The Principle of Uncertainty is a bad name. In science or outside of it we are not uncertain; our knowledge is merely confined, within a certain tolerance. We should call it the Principle of Tolerance.

And I propose that name in two senses: First, in the engineering sense, science has progressed, step by step, the most successful enterprise in the ascent of man, because it has understood that the exchange of information between man and nature, and man and man, can only take place with a certain tolerance.

But second, I also use the word, passionately, about the real world. All knowledge, all information between human beings, can only be exchanged within a play of tolerance. And that is true whether the exchange is in science, or in literature, or in religion, or in politics, or in any form of thought that aspires to dogma.

It’s a major tragedy of my lifetime and yours that scientists were refining, to the most exquisite precision, the Principle of Tolerance, and turning their backs on the fact that all around them, tolerance was crashing to the ground beyond repair.

The Principle of Uncertainty or, in my phrase, the Principle of Tolerance, fixed once for all the realization that all knowledge is limited. It is an irony of history that at the very time when this was being worked out there should rise, under Hitler in Germany and other tyrants elsewhere, a counter-conception: a principle of monstrous certainty.

When the future looks back on the 1930s, it will think of them as a crucial confrontation of culture as I have been expounding it, the ascent of man, against the throwback to the despots’ belief that they have absolute certainty.

It is said that science will dehumanize people and turn them into numbers.

That is false: tragically false. Look for yourself.

This is the concentration camp and crematorium at Auschwitz. This is where people were turned into numbers.

Into this pond were flushed the ashes of four million people. And that was not done by gas. It was done by arrogance. It was done by dogma. It was done by ignorance.

When people believe that they have absolute knowledge, with no test in reality, this is how they behave. This is what men do when they aspire to the knowledge of gods.

Science is a very human form of knowledge. We are always at the brink of the known; we always feel forward for what is to be hoped. Every judgment in science stands on the edge of error, and is personal.

Science is a tribute to what we can know although we are fallible. In the end, the words were said by Oliver Cromwell: ‘I beseech you, in the bowels of Christ: Think it possible you may be mistaken.’ We have to cure ourselves of the itch for absolute knowledge and power. We have to close the distance between the push-button order and the human act. We have to touch people.

–Jacob Bronowski, “Knowledge or Certainty” episode of the 1973 BBC series The Ascent of Man
(Transcription source, formatting and emphasis mine).

November 3, 2009

Magnetoception Will Be Our First Superhuman Sense

Filed under: Uncategorized
― Vincent Gable on November 3, 2009

Magnetoception, the ability to sense magnetic fields like a compass, is my prediction for the first widely implemented super-sense, because it’s so simple.

I’m no biologist, but it certainly seems that only a little wetware is necessary to implement magnetoception, since even bacteria have it. On the mechanical front, tiny magnetometers have been built into millions of devices already. I have no idea what the state of the art is, but the first 3-axis digital compass chip I found on Google measures 4x4x1.3mm. They’re only getting smaller and more efficient. We already have the technology to build it into belts and clothing.
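
To illustrate how little software is involved, here’s a minimal C sketch of a compass belt’s inner loop. It assumes a level sensor and 8 motors; read_magnetometer() and buzz_motor() are hypothetical driver calls, not any real API:

```c
/* A minimal sketch, assuming a level 3-axis magnetometer and a belt
   of 8 vibration motors. The driver functions are hypothetical. */
#include <math.h>

void read_magnetometer(float *mx, float *my, float *mz); /* hypothetical */
void buzz_motor(int index);                              /* hypothetical */

void point_toward_north(void) {
    float mx, my, mz;
    read_magnetometer(&mx, &my, &mz);
    (void)mz;  /* vertical component unused in this level-only sketch */

    /* Heading in radians, 0 = magnetic north. The sign/axis convention
       depends on the sensor; a tilted sensor also needs accelerometer
       tilt compensation, omitted here. */
    float heading = atan2f(my, mx);
    if (heading < 0.0f) heading += 2.0f * 3.14159265f;

    /* Buzz whichever of the 8 motors currently faces north. */
    int motor = (int)(heading / (2.0f * 3.14159265f) * 8.0f) % 8;
    buzz_motor(motor);
}
```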

But I hope I’m wrong. Certainly, the future promises more than better compasses.

October 26, 2009

Threading is Wrong

Filed under: Programming, Quotes, Usability
― Vincent Gable on October 26, 2009

I’m taking the following as an axiom: Exposing real pre-emptive threading with shared mutable data structures to application programmers is wrong. …It gets very hard to find humans who can actually reason about threads well enough to be usefully productive.

When I give talks about this stuff, I assert that threads are a recipe for deadlocks, race conditions, horrible non-reproducible bugs that take endless pain to find, and hard-to-diagnose performance problems. Nobody ever pushes back.

Tim Bray
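
He’s right, and it takes almost no code to demonstrate. A minimal sketch (my example, not Bray’s) of the simplest race condition, two threads incrementing a shared counter with no lock:

```c
/* Two threads bump a shared counter a million times each with no
   synchronization. The "right" answer is 2000000; run it a few times
   and you'll usually get less, and a different wrong number each run.
   (Compile without optimization if your compiler collapses the loop.) */
#include <pthread.h>
#include <stdio.h>

static long counter = 0;            /* shared mutable state, unguarded */

static void *bump(void *arg) {
    (void)arg;
    for (int i = 0; i < 1000000; i++)
        counter++;                  /* load, add, store: not atomic */
    return NULL;
}

int main(void) {
    pthread_t a, b;
    pthread_create(&a, NULL, bump, NULL);
    pthread_create(&b, NULL, bump, NULL);
    pthread_join(a, NULL);
    pthread_join(b, NULL);
    printf("counter = %ld (expected 2000000)\n", counter);
    return 0;
}
```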

October 24, 2009

Ignorance is Moral Strength

Filed under: Design, Quotes, Security
― Vincent Gable on October 24, 2009

I have long been impressed with the casino industry’s ability to, in the case of blackjack, convince the gambling public that using strategy equals cheating.

Bruce Schneier

October 23, 2009

GUI is Dead, Long Live UI

Filed under: Design, Programming, Usability
― Vincent Gable on October 23, 2009

The term GUI, Graphical User Interface, pronounced “Gooey,” is laughably anachronistic. All interfaces meant for people on modern computers are graphical. The right abbreviation to use today is simply UI, for User Interface, pronounced “You I”.

Believe me, I understand that a command-line interface is still useful today. I use them. I’m a programmer. I get the whole UNIX thing. Even without a pipe, the command line is the highest-bandwidth input mechanism we have today.

But all command lines live inside a graphical OS. That’s how computers work in the 21st century.

Whenever I see “GUI” written, I can’t help but wonder if the author is dangerously out of touch. Do they still think graphical interfaces are a novelty that needs to be called out?

October 22, 2009

iPhone Shows the Irrelevance of the Programmer User

Filed under: iPhone, Usability
― Vincent Gable on October 22, 2009

There’s a lot of discord over Apple’s draconian “closed” handling of the iPhone and App Store. And rightly so. But there are a few interesting lessons in the current situation. The one I want to discuss now is this:

Being able to program your own computer isn’t enough to make it open

As things stand today, Apple can’t stop you from installing any damn iPhone app, if you build it yourself.

To do that, you have to join the iPhone Developer Program, of course, and pay a $99/year fee. That’s inconvenient, but it’s really just a subscription-based way of selling iPhone OS: Developer Edition.

That’s the kind of dirty money-grabbing scheme I’d expect from Microsoft. It’s a bit shady, because it’s not how most OSes are sold. But it’s not without precedent. And unless you are against ever charging money for software, I don’t think there’s an argument that it’s actually depriving people of freedom.

Yes, it’s an unaffordably high price for many. But the iPhone is a premium good that costs real money to build — it’s inherently beyond many people’s means, even when subsidized.

Observation: Only Binaries Matter

If you have a great iPhone app that Apple won’t allow into the store, you can still give it to me in source code form, and since I have iPhone OS: Developer Edition, I can run it on my iPhone.

But clearly that’s not good enough.

In fact, I’m not aware of any substantive iPhone App that’s distributed as source. By “substantive” I mean an app with a lot of users — say as many as the 100th most downloaded App Store app — or an app that does something that makes people jealous, like tethering (See update!), which we know is possible using the SDK. I realize this is a wishy-washy definition — what I’m trying to say is that distributed-as-source iPhone Apps seem to be totally irrelevant.

“It’s not open until I can put Linux on it”

I believe it’s technically possible to run Linux on an iPhone without jail-breaking it. (Although it’s not terribly practical.) Just build Linux (or an emulator that runs Linux) as an iPhone app, and leave it running all the time to get around the limitations on background processes.

Apple won’t allow such a thing into the App Store, of course. But how does that stop you from distributing the source for it? As best I can tell, it doesn’t.

So as things stand today, yes you can distribute source code that lets any iPhone OS: Developer Edition user run Linux. It’s technically challenging, but it’s doable.

Conclusion

It’s possible to build open systems on top of closed systems. We’ve done it before when we built the internet on Ma Bell’s back.

But the iPhone remains a closed device. User-compiled applications have zero momentum. And I think that clearly shows the irrelevance of the rare “programmer user” who is comfortable dealing with the source code for the programs he uses.

UPDATE 2010-01-21: iProxy is an open-source project to enable tethering! Maybe the programmer-user will have their day after all.

October 20, 2009

Knuth can be Out of Touch

Filed under: Accessibility, Programming, Quotes
― Vincent Gable on October 20, 2009

…Knuth has a terrible track record, bringing us TeX, which is a great typesetting language, but impossible to read, and a three-volume set of great algorithms written in some of the most impenetrable, quirky pseudocode you’re ever likely to see.

lingpipe

There, it’s been said. But let the posse note I wasn’t technically the one to do it!
