When programming, never start an integer with 0. Most programming languages treat a decimal number that starts with 0 as octal (base-8). So x = 013; does not set x to 13. Instead, x is 11, because 013 is interpreted as 13₈ (base 8), not 13₁₀ (base 10).
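For concreteness, here is a minimal C sketch of the gotcha (C chosen as a representative of the C-like languages listed below):

```c
#include <stdio.h>

int main(void) {
    int x = 013;           /* leading zero: octal literal, value 11 */
    int y = 13;            /* ordinary decimal literal */
    printf("x = %d\n", x); /* prints: x = 11 */
    printf("y = %d\n", y); /* prints: y = 13 */
    return 0;
}
```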
Languages with this quirk include C, C++, Objective-C, Java, JavaScript, Perl, Python 2, and Ruby. (Python 3 goes a step further: it rejects a literal like 013 with a SyntaxError and makes you write 0o13 instead.) If you add up the “market share” of these languages, it comes out to above 50%, which is why I say most languages.
“But I use {Smalltalk, Haskell, Lisp, etc.}”
I’m jealous that you get to use such a nice language. However, it’s bad programming hygiene to pick up habits that are dangerous in common languages.
Now, I assume you wouldn’t write 7 as 007 unless the leading zero(s) carried some extra meaning (for example, lining up a column of constants). There are cases where that clarity outweighs “cleanliness” (unless the code is meant to be ported to a C-like language), as the sketch at the end of this section illustrates.
But you should at least be aware of this inter-lingual gotcha.
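To make the trade-off concrete, here is a hypothetical C sketch where zero-padding a column of constants for alignment silently changes their values:

```c
#include <stdio.h>

/* Hypothetical table of codes, zero-padded so the column lines up.
 * In C, the leading zeros silently turn these into octal literals. */
static const int codes[] = {
    007,  /* still 7: single digits below 8 happen to match */
    010,  /* 8, not 10 */
    017,  /* 15, not 17 */
    100,  /* 100: no leading zero, plain decimal */
};

int main(void) {
    for (size_t i = 0; i < sizeof codes / sizeof codes[0]; i++)
        printf("%d\n", codes[i]);  /* prints 7, 8, 15, 100 */
    return 0;
}
```

A literal like 099 would not even compile in C, since 9 is not a valid octal digit, which at least makes that particular mistake visible.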