Admin
The sources I've seen all say that the presence / absence of underlines, the decimal point, and trailing zeroes after all denote precision.
But apparently we've all been led astray by "liars", so please enlighten us to your legitimate sources on the subject.
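For what it's worth, the rules those sources describe can be pinned down in a few lines of Python (a hypothetical helper of my own, just to make the convention concrete):

```python
def sig_figs(numeral: str) -> int:
    """Count significant figures of a decimal numeral, per the
    usual textbook rules: nonzero digits count, zeros between
    them count, leading zeros don't, and trailing zeros count
    only when a decimal point is written."""
    s = numeral.lstrip('+-')
    if '.' in s:
        # with a decimal point, every digit after the leading
        # zeros is significant
        return len(s.replace('.', '').lstrip('0'))
    # without a decimal point, trailing zeros are ambiguous and
    # conventionally not counted
    return len(s.lstrip('0').rstrip('0'))

print(sig_figs("100"))     # 1
print(sig_figs("100."))    # 3
print(sig_figs("0.0010"))  # 2
```

Note that under these rules a bare "0" comes out with no significant figures at all, which is exactly the ambiguity being argued about here.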
Admin
I put you in the same group as the people who tell me that 1e9 + 1 is anything other than 1,000,000,001. You're bullshit. You may not CARE that it's 1,000,000,001, and your computer may not have the capacity to represent a number with that many significant figures, but it's NOT still 1e9 and when your computer says it is, your computer is WRONG, albeit predictably and in a way that you don't care about.
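And whether the machine keeps that +1 is purely a question of float width; a quick sketch (round-tripping through 32-bit precision via struct, just to illustrate):

```python
import struct

def as_float32(x: float) -> float:
    """Round-trip a Python float (64-bit) through IEEE 754
    single precision."""
    return struct.unpack('f', struct.pack('f', x))[0]

# a 64-bit double has 53 mantissa bits, plenty for 1e9 + 1
print(1e9 + 1 == 1000000001.0)     # True
# a 32-bit float has only 24, so the +1 is rounded away
print(as_float32(1e9 + 1) == 1e9)  # True
```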
I never met a book that told me I should care about significant digits until I came to the chapter specifically about significant digits. And every professor I had in college, without exception, SIMPLY DID NOT CARE. They cared only that, when I rounded, I rounded to a reasonable number of digits. That chapter on significant digits - and people like you who have an unnatural hardon for them (and I'm grateful none of my college professors did) - is the only time I've given the topic any serious consideration AT ALL.
In any serious maths discussion, "100" just means 100, and we don't try to figure out how precise you meant to be, because if we should care then you would have written 150±50 or used interval notation such as 100 ≤ x < 200 or [100, 200).
Admin
C-c-c-c-c-CUNTSTACK!
Admin
WTF is this? Look, there is a significant difference between saying that:
A: the concept of significant digits is insignificant to you;
B: our definition of the concept is incorrect.
B was what you were arguing in the first place, claiming "zero is never a significant digit", which was patently false. But rather than graciously acknowledge your error, you've now moved the goalposts to argument A, which NOBODY once disagreed with. Nobody once claimed that they have a place in pure math, and nobody once argued about the relative merits of them versus ±.
You fucked up on a simple point, wasted 3 posts trying to support it, and now are trying to save face with an irrelevant rant.
Admin
Zero times zero is zero. Zero times zero times zero more times is still zero.
Any power of 10 has exactly 1 significant digit, including negative powers of 10. They can't have more significant digits than 10 because 10 has only 1 significant digit.
The number "0" has no significant digits. It is a zero, followed by a decimal point, followed by infinitely many zeros. It is smaller than any negative power of 10, no matter how many zeros come before that final 1. Zero has zeros as far out as you go, all the way to infinity.
And no, 0 is never a significant figure as far as computing is concerned. Then some guy brought up maths to try to chase rabbits by saying that 0 can be a significant digit because if the 0s aren't significant then 100 might actually be 101. In that case let's all go full stupid and write ...00000000100.0000000... just to emphasize that there aren't any other digits anywhere. All those zeros are significant, amirite?
Admin
Actually, lemme just go back even further and say that my original point was:
100 is 1 order of magnitude more than 10
100 is 2 orders of magnitude more than 1
100 is 3 orders of magnitude more than 0.1
100 is 4 orders of magnitude more than 0.01
100 is 5 orders of magnitude more than 0.001
etc...
100 is infinitely many orders of magnitude more than 0.
THAT is the original point that someone wanted to contend.
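That comparison is just a difference of base-10 logarithms, and log10(0) is undefined, which is why the last step is the contentious one; a sketch in Python (the helper name is my own, purely illustrative):

```python
import math

def magnitude_gap(a: float, b: float) -> float:
    """Orders of magnitude separating a and b."""
    return math.log10(a) - math.log10(b)

print(magnitude_gap(100, 10))               # 1.0
print(round(magnitude_gap(100, 0.001), 6))  # 5.0
# math.log10(0) raises ValueError ("math domain error"):
# 0 has no finite order of magnitude to compare against
```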
Admin
You're conflating another meaning of a term (in this case, the word "significant") with the one that was actually being discussed, and having a pointless argument trying to convince people of things they already know. What you're doing here is the equivalent of a non-programmer looking at this line of code:

    x = x + 1
and loudly remarking, "LOL, x can never be equal to x + 1! You're all so full of shit!"
You're making a fool of yourself. We know that 0 == 0.0. We're not talking about that. We know that "1e9 + 1 != 1000000001" is machine error, not real math. We're not talking about that. And the fact that your college professors think the notation is lame is also irrelevant.
That was a separate point, though. I was the one who contested it, and I still do. The order of magnitude of 0 can't be described comparatively; it's undefined. Calling it infinite is incorrect because 0.0...1 != 0.
Admin
0.9 + 0.1 = 1
0.99 + 0.01 = 1
0.999 + 0.001 = 1
0.9999 + 0.0001 = 1
Generally, 0.9...9 + 0.0...1 = 1
And, for an infinite number of 9s and 0s,
0.9...9 = 1 (the mathematical proof for this is well known)
0.9...9 + 0.0...1 = 1
0.0...1 = 0
QED
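The "well known" proof alluded to here is the standard geometric-series derivation (textbook material, not anything new to this thread):

```latex
0.\overline{9}
  = \sum_{k=1}^{\infty} \frac{9}{10^{k}}
  = 9 \cdot \frac{1/10}{1 - 1/10}
  = 9 \cdot \frac{1}{9}
  = 1
```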
Admin
No, it doesn't. This is similar to the argument that 1 / 0 = infinity, and is wrong for the same reason.
Admin
0.9...0 != 1
0.0...1 != 0
Both of the above are true, for the same reason. Neither of these can be "equal to" anything because they don't represent actual values, whereas 0.9... does.
Now you're getting close. But be careful. The order of magnitude of 1/x can move infinitely toward the negative side -- true. But remember, 1/x is never 0. Thus you can never define the order of magnitude of 0.

Think about what you've said: "1 is infinitely many times larger than 0". That is, "1 = inf * 0". Which means 1 / 0 = inf. That is a direct contradiction to what you said immediately prior in the sentence.
Admin
It is also obviously the limit of 10^x as x goes to negative infinity. That sequence goes 1, 0.1, 0.01, 0.001, etc. and converges to zero.
Are you even familiar with the concept of limits? Because this is exactly what they do. They define the order of magnitude where plain algebra would say it's undefined.

No. Infinity is not a number. It doesn't work in algebraic expressions. It only makes sense when you're talking about limits. Again, do you even know what a limit is?

It's colloquially accepted that when you say something is infinitely larger than zero, you're speaking in the sense of that limit, rather than making a simple statement like "six is two times as large as three", which is clearly just describing the algebraic expression 6 = 2*3. I.e. "1 is infinitely many times larger than 0" can mean no more or less than "the limit of the ratio 1/x as x goes to zero is infinite".
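Numerically the limit behaves exactly as described; a quick sketch (purely illustrative):

```python
# successive negative powers of 10 march toward 0 but every
# term stays strictly positive
for n in (1, 5, 10, 100):
    print(10.0 ** -n)

# in 64-bit floats the literal eventually underflows to
# exactly 0.0 (below about 5e-324), even though the
# mathematical limit is only ever approached
print(1e-400 == 0.0)     # True
print(10.0 ** -100 > 0)  # True
```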
Admin
Again you waste your time arguing something I already know. Yes, it's a limit, not an actual value in an equation. That is, and always has been, the point.
Once again you are moving the goalposts; redefining the terms of the discussion so you can "win". From the beginning, posters to this comment thread were discussing zero itself, not some "colloquially accepted" concept that you now claim to have been arguing against.
Further, you contradict yourself. Although you later admit that infinity cannot be reasoned about through regular arithmetic, you attempt to do that very thing in your opening sentence. Again: "Neither of these can be 'equal to' anything because they don't represent actual values."
Admin
To summarize what we were actually saying:
- The value that was added to timeOut, in the source code in the article, has no order of magnitude.
- The digit 0 is significant when a decimal point is written.
Your initial objections to these were knee-jerk and not actually applicable. Your subsequent insistence, that you were right and we were wrong, was misguided. I hope you now realize this.
Admin
To summarise what I'm actually saying:
100 is infinitely many times larger than 0.
If you disagree, you're wrong.
Admin
That's not all you said. You also said that zero was equal to an undefined value, and that the rules of significant digits are perpetuated by liars, along with some other unrelated nonsense about your professors.
Except no one here actually did disagree with that (particularly not in terms of its "colloquial meaning"), just like they never said anything about how programming languages parse the "0" character, or the precision of a computer's representation of large or small values, or their preferred mathematical notations, and so on. But I guess as long as you're okay with the fact that all your arguments here have been hypothetical (i.e. against critics who don't actually exist), that's fine.
Admin
I see. Someone on the internet is wrong.
Alex -- I would say you could safely delete page 3 of these comments, but there was a comment about Cuntstack that might be worth keeping.