I once encountered a time billing application that billed hours to a single decimal place. That seemed innocuous enough at first, until I realized that the culture we live in naturally deals with fractional hours in 5-minute increments. How common do these examples sound to you?
• I was only 10 minutes late.
• Let’s start the meeting at quarter after nine.
• It was just a five minute discussion.
• The accident occurred at 11:35 pm.
Now, 5-minute increments do not correspond exactly to tenths of an hour. So what did the users of the application do? They approximated, keeping their natural 5-minute habits. For example:
• 5 minutes ~ 0.1 hours
• 10 minutes ~ 0.2 hours
• 15 minutes ~ 0.2 or 0.3 hours
• 20 minutes ~ 0.3 hours as well
• 25 minutes ~ 0.4 hours
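The approximations above amount to rounding each 5-minute duration to the nearest tenth of an hour. A minimal Python sketch (the function name `round_to_tenths` is my own, not from the application):

```python
def round_to_tenths(minutes):
    """Approximate a duration in minutes as tenths of an hour."""
    return round(minutes / 60, 1)

# 15 minutes is exactly 0.25 hours, halfway between 0.2 and 0.3,
# which is why users could not agree on which tenth to record.
for m in (5, 10, 15, 20, 25):
    print(m, "minutes ->", round_to_tenths(m), "hours")
```

Note that 10 and 15 minutes can collapse to the same value (0.2), as can 15 and 20 (when 15 is rounded up to 0.3), so the mapping loses information either way.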
When I interacted with this application, I soon became frustrated with the approximations. Then I discovered that the hours the application could actually bill were based on 6-minute increments, not 5. That is not natural, but it is workable. So I went all the way back to my grade 4 times tables. (I think it was grade 4; it was so long ago.)
• 6 x 1 = 6
• 6 x 2 = 12
• 6 x 3 = 18
• 6 x 4 = 24
Armed with this, I started measuring my time in 6-minute increments. There was no more approximation, and the math became very easy:
• 6 minutes = 0.1 hours
• 12 minutes = 0.2 hours
• 18 minutes = 0.3 hours
• 24 minutes = 0.4 hours
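The exact scheme above can be sketched as a tiny helper (`bill_hours` is a hypothetical name for illustration, not anything from the actual application):

```python
def bill_hours(minutes):
    """Convert a duration to billable hours in 6-minute increments.

    Each 6-minute block is exactly one tenth of an hour, so the
    conversion is exact -- no approximation needed.
    """
    tenths = round(minutes / 6)  # snap to the nearest 6-minute block
    return tenths / 10

print(bill_hours(18))  # 0.3
```

Because 6 divides 60 evenly into ten blocks, any time measured in 6-minute increments maps to exactly one tenth of an hour, which is the whole trick.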
I compared this to the situation in software development, where the whole world works in the decimal system yet developers have to understand binary and hexadecimal. The gap between base 10 and base 2 or 16 is similar to the gap between 5-minute and 6-minute increments.
Strangely enough, none of the other users used the application this way. Was it too complicated? I don't know. Maybe I'm just too scientific or something like that. What would you do?