Drew W got called in to track down a bug. Their application needed to take a customer's location and measure the distance to the nearest National Weather Service radar station. It knew the latitude and longitude of each, and needed to find the distance between those points, but the distances it computed were wrong. They could be off by hundreds or even thousands of miles, especially in more remote locations.
This was the code in question:
from math import sqrt
dist = sqrt((abs(latdiff) * abs(latdiff)) + (abs(londiff) * abs(londiff)))
Now, there's an obvious problem here, and a number of nitpicks. I'm going to start with the nitpicks. First, when you multiply a number by itself, the result is always positive, so you don't need the abs, making the line sqrt(latdiff*latdiff + londiff*londiff). Of course, Python also has an exponent operator, allowing you to write the easier-to-read sqrt(latdiff**2 + londiff**2). But now that we bring it up, the math package in Python also includes a hypot function, which implements the distance formula for you, meaning that whole thing could have been written thus:
from math import hypot
dist = hypot(latdiff, londiff)
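For illustration, all three versions compute the same value; the numbers below are hypothetical, purely to show the equivalence:

from math import sqrt, hypot

latdiff, londiff = 3.0, -4.0  # hypothetical differences, for demonstration only

print(sqrt(abs(latdiff) * abs(latdiff) + abs(londiff) * abs(londiff)))  # 5.0
print(sqrt(latdiff**2 + londiff**2))  # 5.0
print(hypot(latdiff, londiff))  # 5.0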
Now, if your only criterion is, "which solution is more 'pythonic'?", then it's clear that the latter solution is superior. Of course, you should still get your fingers whacked with a mechanical keyboard if you tried to check that solution in, because it still has one major problem: it's completely and utterly wrong.
If you’re not sure why… think of it as a special kind of rounding error.
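(Latitude and longitude are angles on a sphere, not coordinates on a flat grid: a degree of longitude spans about 69 miles at the equator and shrinks to nothing at the poles, and the code above never even converts degrees to miles. The standard fix is a great-circle calculation such as the haversine formula. Here's a minimal sketch, assuming a spherical Earth with a mean radius of 3,959 miles; the function and variable names are my own, not from Drew's codebase:

from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_MILES = 3959  # mean radius; assumes a spherical Earth

def haversine(lat1, lon1, lat2, lon2):
    # Great-circle distance between two (lat, lon) points, in miles.
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat = lat2 - lat1
    dlon = lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_MILES * asin(sqrt(a))

# Seattle to Spokane, roughly: haversine gives about 228 miles. The flat-grid
# version yields about 4.9 unitless "degrees", and even naively scaling that
# by 69 miles per degree overshoots by more than a hundred miles.
print(haversine(47.61, -122.33, 47.66, -117.43))

The gap only grows at higher latitudes and over longer distances, which is consistent with the errors Drew saw in remote locations.)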