• anonymous (unregistered) in reply to chubertdev
    chubertdev:
    "100" != "100."

    "100" means that only the 1 is significant, so it could be 101. "100." means that all three are significant digits, so it can't be 101.

    Oh yes, the magic "and I really mean it" dot. Now you're just using made-up rules that liars invented to help them figure. That has nothing to do with real maths. There is no difference between 100 and 100. Or should I add another dot there to make sure you know it's the magic I-mean-it dot and not just the period at the end of my sentence?
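
    The convention chubertdev invokes can be made concrete. Below is a minimal Python sketch of the usual sig-fig reading (the helper and its name are illustrative, not anyone's posted code): the written form implies an uncertainty of half a unit in the last significant place.

        def implied_half_width(text: str) -> float:
            """Half-width of the interval implied by a number as written."""
            if "." in text:
                digits_after = len(text.split(".")[1])
                return 0.5 * 10 ** -digits_after  # "100." -> 0.5, "100.0" -> 0.05
            # No decimal point: trailing zeros are conventionally not
            # significant, so the last nonzero digit sets the precision.
            stripped = text.rstrip("0")
            return 0.5 * 10 ** (len(text) - len(stripped))  # "100" -> 50.0

        print(implied_half_width("100"))   # 50.0 -> 100 +/- 50, so it "could be 101"
        print(implied_half_width("100."))  # 0.5  -> 100 +/- 0.5, so it can't be 101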

  • nmclean (unregistered) in reply to anonymous
    anonymous:
    Oh yes, the magic "and I really mean it" dot. Now you're just using made-up rules that liars invented to help them figure. That has nothing to do with real maths. There is no difference between 100 and 100. Or should I add another dot there to make sure you know it's the magic I-mean-it dot and not just the period at the end of my sentence?
    Um, yes. The whole point of "significant digits" is to indicate "I really mean it", i.e. precision.

    The sources I've seen all say that the presence or absence of underlines, the decimal point, and trailing zeroes all denote precision.

    But apparently we've all been led astray by "liars", so please enlighten us to your legitimate sources on the subject.

  • anonymous (unregistered) in reply to nmclean
    nmclean:
    Um, yes. The whole point of "significant digits" is to indicate "I really mean it", i.e. precision.

    The sources I've seen all say that the presence or absence of underlines, the decimal point, and trailing zeroes all denote precision.

    But apparently we've all been led astray by "liars", so please enlighten us to your legitimate sources on the subject.

    Hi, I'd like to introduce you to this guy: ±

    I put you in the same group as the people who tell me that 1e9 + 1 is anything other than 1,000,000,001. You're bullshit. You may not CARE that it's 1,000,000,001, and your computer may not have the capacity to represent a number with that many significant figures, but it's NOT still 1e9 and when your computer says it is, your computer is WRONG, albeit predictably and in a way that you don't care about.

    I never met a book that told me I should care about significant digits until I came to the chapter specifically about significant digits. And every professor I had in college, without exception, SIMPLY DID NOT CARE. They cared only that, when I rounded, I rounded to a reasonable number of digits. That chapter on significant digits - and people like you who have an unnatural hardon for them (I'm grateful none of my college professors did) - is the only time I've given any serious consideration to the topic AT ALL.

    In any serious maths discussion, "100" just means 100, and we don't try to figure out how precise you meant to be, because if we should care then you would have written 150±50 or used interval notation such as 100 ≤ x < 200 or [100, 200).
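
    The 1e9 + 1 point is checkable. A minimal sketch follows (Python; NumPy is assumed only to supply a single-precision type, since Python's own floats are IEEE-754 doubles and hold this sum exactly):

        import numpy as np

        # Double precision (53-bit significand): 1e9 + 1 is represented exactly.
        print(int(1e9 + 1))                            # 1000000001

        # Single precision (24-bit significand): floats near 1e9 are spaced
        # 64 apart, so adding 1 changes nothing -- predictably "wrong".
        print(int(np.float32(1e9) + np.float32(1.0)))  # 1000000000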

  • The amazing Mr Cuntstack (unregistered)

    C-c-c-c-c-CUNTSTACK!

  • nmclean (unregistered) in reply to anonymous
    anonymous:
    Hi, I'd like to introduce you to this guy: ±

    I put you in the same group as the people who tell me that 1e9 + 1 is anything other than 1,000,000,001. You're bullshit. You may not CARE that it's 1,000,000,001, and your computer may not have the capacity to represent a number with that many significant figures, but it's NOT still 1e9 and when your computer says it is, your computer is WRONG, albeit predictably and in a way that you don't care about.

    I never met a book that told me I should care about significant digits until I came to the chapter specifically about significant digits. And every professor I had in college, without exception, SIMPLY DID NOT CARE. They cared only that, when I rounded, I rounded to a reasonable number of digits. That chapter on significant digits - and people like you who have an unnatural hardon for them (I'm grateful none of my college professors did) - is the only time I've given any serious consideration to the topic AT ALL.

    In any serious maths discussion, "100" just means 100, and we don't try to figure out how precise you meant to be, because if we should care then you would have written 150±50 or used interval notation such as 100 ≤ x < 200 or [100, 200).

    WTF is this? Look, there is a significant difference between saying that:

    A: the concept of significant digits is insignificant to you;

    B: our definition of the concept is incorrect.

    B was what you were arguing in the first place, claiming "zero is never a significant digit", which was patently false. But rather than graciously acknowledge your error, you've now moved the goalposts to argument A, which NOBODY once disagreed with. Nobody once claimed that they have a place in pure math, and nobody once argued about the relative merits of them versus ±.

    You fucked up on a simple point, wasted 3 posts trying to support it, and now are trying to save face with an irrelevant rant.

  • anonymous (unregistered) in reply to nmclean
    nmclean:
    WTF is this? Look, there is a significant difference between saying that:

    A: the concept of significant digits is insignificant to you;

    B: our definition of the concept is incorrect.

    B was what you were arguing in the first place, claiming "zero is never a significant digit", which was patently false. But rather than graciously acknowledge your error, you've now moved the goalposts to argument A, which NOBODY once disagreed with. Nobody once claimed that they have a place in pure math, and nobody once argued about the relative merits of them versus ±.

    You fucked up on a simple point, wasted 3 posts trying to support it, and now are trying to save face with an irrelevant rant.

    Go back further and read and quit being such a dick. My original point was thus:

    1. Zero times zero is zero. Zero times zero times zero more times is still zero.

    2. Any power of 10 has exactly 1 significant digit, including negative powers of 10. They can't have more significant digits than 10 because 10 has only 1 significant digit.

    3. The number "0" has no significant digits. It is zero followed by a decimal point followed by infinitely many zeros. It is smaller than the smallest power of 10, which has approaching infinity zeros followed by a 1. Zero has zeros as far out as you go, all the way to infinity.

    And no, 0 is never a significant figure as far as computing is concerned. Then some guy brought up maths to try to chase rabbits by saying that 0 can be a significant digit because if the 0s aren't significant then 100 might actually be 101. In that case let's all go full stupid and write ...00000000100.0000000... just to emphasize that there aren't any other digits anywhere. All those zeros are significant, amirite?

  • anonymous (unregistered) in reply to anonymous
    anonymous:
    Go back further and read and quit being such a dick. My original point was thus:

    1. Zero times zero is zero. Zero times zero times zero more times is still zero.

    2. Any power of 10 has exactly 1 significant digit, including negative powers of 10. They can't have more significant digits than 10 because 10 has only 1 significant digit.

    3. The number "0" has no significant digits. It is zero followed by a decimal point followed by infinitely many zeros. It is smaller than the smallest power of 10, which has approaching infinity zeros followed by a 1. Zero has zeros as far out as you go, all the way to infinity.

    And no, 0 is never a significant figure as far as computing is concerned. Then some guy brought up maths to try to chase rabbits by saying that 0 can be a significant digit because if the 0s aren't significant then 100 might actually be 101. In that case let's all go full stupid and write ...00000000100.0000000... just to emphasize that there aren't any other digits anywhere. All those zeros are significant, amirite?

    Actually, lemme just go back even further and say that my original point was:

    100 is 1 order of magnitude more than 10
    100 is 2 orders of magnitude more than 1
    100 is 3 orders of magnitude more than 0.1
    100 is 4 orders of magnitude more than 0.01
    100 is 5 orders of magnitude more than 0.001
    etc...
    100 is infinitely many orders of magnitude more than 0.

    THAT is the original point that someone wanted to contend.
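
    That ladder is just log10(100 / x). An illustrative sketch (not anyone's posted code) shows it growing without bound as x shrinks, while being undefined at x = 0 itself:

        import math

        # "Orders of magnitude between 100 and x" for a shrinking x:
        for x in [10, 1, 0.1, 0.01, 0.001, 1e-100]:
            print(x, math.log10(100 / x))  # roughly 1, 2, 3, 4, 5, ..., 102

        # At x = 0 there is nothing to print: 100 / 0 raises ZeroDivisionError.
        # Only the limit as x -> 0+ diverges.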

  • nmclean (unregistered) in reply to anonymous
    anonymous:
    Go back further and read and quit being such a dick. My original point was thus:
    1. Zero times zero is zero. Zero times zero times zero more times is still zero.

    2. Any power of 10 has exactly 1 significant digit, including negative powers of 10. They can't have more significant digits than 10 because 10 has only 1 significant digit.

    3. The number "0" has no significant digits. It is zero followed by a decimal point followed by infinitely many zeros. It is smaller than the smallest power of 10, which has approaching infinity zeros followed by a 1. Zero has zeros as far out as you go, all the way to infinity.

    And no, 0 is never a significant figure as far as computing is concerned. Then some guy brought up maths to try to chase rabbits by saying that 0 can be a significant digit because if the 0s aren't significant then 100 might actually be 101. In that case let's all go full stupid and write ...00000000100.0000000... just to emphasize that there aren't any other digits anywhere. All those zeros are significant, amirite?

    Yes, that was your original point. And yes, that was exactly what I was referring to. And yes, you got it wrong. Yes, you are wasting your time arguing a point that no one else ever disagreed with in the first place.

    You're conflating another meaning of a term (in this case, the word "significant") with the one that was actually being discussed, and having a pointless argument trying to convince people of things they already know. What you're doing here is the equivalent of a non-programmer looking at this line of code:

    x = x + 1

    and loudly remarking, "LOL, x can never be equal to x + 1! You're all so full of shit!"

    You're making a fool of yourself. We know that 0 == 0.0. We're not talking about that. We know that "1e9 + 1 != 1000000001" is machine error, not real math. We're not talking about that. And the fact that your college professors think the notation is lame is also irrelevant.

    anonymous:
    Actually, lemme just go back even further and say that my original point was:

    100 is 1 order of magnitude more than 10
    100 is 2 orders of magnitude more than 1
    100 is 3 orders of magnitude more than 0.1
    100 is 4 orders of magnitude more than 0.01
    100 is 5 orders of magnitude more than 0.001
    etc...
    100 is infinitely many orders of magnitude more than 0.

    THAT is the original point that someone wanted to contend.

    That was a separate point, though. I was the one who contested it, and I still do. The order of magnitude of 0 can't be described comparatively; it's undefined. Calling it infinite is incorrect because 0.0...1 != 0.

  • anonymous (unregistered) in reply to nmclean
    nmclean:
    That was a separate point, though. I was the one who contested it, and I still do. The order of magnitude of 0 can't be described comparatively; it's undefined. Calling it infinite is incorrect because 0.0...1 != 0.
    Wrong. 0.0...1 does equal 0 if the number of 0s in between is infinite.

    0.9 + 0.1 = 1
    0.99 + 0.01 = 1
    0.999 + 0.001 = 1
    0.9999 + 0.0001 = 1

    Generally, 0.9...9 + 0.0...1 = 1

    And, for an infinite number of 9s and 0s:
    0.9...9 = 1 (the mathematical proof for this is well known)
    0.9...9 + 0.0...1 = 1
    0.0...1 = 0
    QED
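
    For reference, the "well known" proof is the geometric series (standard textbook material, stated here in LaTeX rather than any poster's notation):

        0.\overline{9} \;=\; \sum_{k=1}^{\infty} \frac{9}{10^k}
                       \;=\; \frac{9/10}{1 - 1/10} \;=\; 1,
        \qquad\text{hence}\qquad 1 - 0.\overline{9} \;=\; 0.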

  • anonymous (unregistered) in reply to anonymous
    anonymous:
    0.0...1 does equal 0 if the number of 0s in between is infinite.
    To be more accurate, "0.0...1 with an infinite number of 0s before the 1" doesn't really exist since it doesn't actually have a 1 at the end - it has no end; there are infinitely many zeros. It's like the pot of gold at the end of the rainbow - it just keeps going and going and you'll never find that 1.
  • nmclean (unregistered) in reply to anonymous
    anonymous:
    0.0...1 does equal 0 if the number of 0s in between is infinite.

    No, it doesn't. This is similar to the argument that 1 / 0 = infinity, and is wrong for the same reason.

  • anonymous (unregistered) in reply to nmclean
    nmclean:
    anonymous:
    0.0...1 does equal 0 if the number of 0s in between is infinite.

    No, it doesn't. This is similar to the argument that 1 / 0 = infinity, and is wrong for the same reason.

    Yes it does. It is similar to the argument that 0.999... = 1, and is correct for the same reason. In fact it is the exact same argument.

  • anonymous (unregistered) in reply to nmclean
    nmclean:
    anonymous:
    0.0...1 does equal 0 if the number of 0s in between is infinite.

    No, it doesn't. This is similar to the argument that 1 / 0 = infinity, and is wrong for the same reason.

    Also, you fail at limits. 1/0 is undefined, but it is still an undeniable fact that 1 is infinitely many times larger than 0, as you can see from the limit of 1/x as x goes to zero.
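
    One hedge worth attaching to that limit claim (standard analysis, not a correction either poster made): the limit is one-sided,

        \lim_{x \to 0^{+}} \frac{1}{x} = +\infty,
        \qquad
        \lim_{x \to 0^{-}} \frac{1}{x} = -\infty,

    so the two-sided limit of 1/x at 0 does not exist, and 1/0 itself stays undefined either way.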

  • nmclean (unregistered) in reply to anonymous
    anonymous:
    it does. It is similar to the argument that 0.999... = 1, and is correct for the same reason. In fact it is the exact same argument.
    It is not the same argument. To make the arguments "compatible", we could say:

    0.9...0 != 1
    0.0...1 != 0

    Both of the above are true, for the same reason. Neither of these can be "equal to" anything because they don't represent actual values, whereas 0.9... does.

    anonymous:
    1/0 is undefined, but it is still an undeniable fact that 1 is infinitely many times larger than 0, as you can see from the limit of 1/x as x goes to zero.
    Now you're getting close. But be careful. The order of magnitude of 1/x can move infinitely toward the negative side -- true. But remember, 1/x is never 0. Thus you can never define the order of magnitude of 0.

    Think about what you've said: "1 is infinitely many times larger than 0". That is, "1 = inf * 0". Which means 1 / 0 = inf. That is a direct contradiction to what you said immediately prior in the sentence.
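
    In symbols, the point the two sides are circling (framing the order of magnitude of x as log10|x| is my shorthand here, not either poster's):

        \lim_{x \to 0^{+}} \log_{10} x \;=\; -\infty,
        \qquad\text{but}\qquad \log_{10} 0 \text{ is undefined.}

    The limit diverges, yet no order of magnitude is ever assigned to 0 itself: "infinitely many orders of magnitude" works as a limit statement, "undefined" as a statement about the value.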

  • anonymous (unregistered) in reply to nmclean
    nmclean:
    To make the arguments "compatible", we could say:

    0.9...0 != 1
    0.0...1 != 0

    Both of the above are true, for the same reason. Neither of these can be "equal to" anything because they don't represent actual values, whereas 0.9... does.

    As any idiot who knows subtraction could tell you, 0.0...1 is the result of subtracting 0.9... from 1. There should be a 1 digit at the end... but we know that it doesn't ever end, of course. So, as I already stated, 0.0...1 doesn't really ever have a 1 at the end; it's just zeros as far as you go. So it's zero.

    It is also obviously the limit of 10^x as x goes to negative infinity. That series goes 1, 0.1, 0.01, 0.001, etc. and converges to zero.

    nmclean:
    The order of magnitude of 1/x can move infinitely toward the negative side -- true. But remember, 1/x is never 0. Thus you can never define the order of magnitude of 0.
    Are you even familiar with the concept of limits? Because this is exactly what they do. They define the order of magnitude when algebra would say that it's undefined.
    nmclean:
    Think about what you've said: "1 is infinitely many times larger than 0". That is, "1 = inf * 0". Which means 1 / 0 = inf. That is a direct contradiction to what you said immediately prior in the sentence.
    No. Infinity is not a number. It doesn't work in algebraic expressions. It only makes sense when you're talking about limits. Again, do you even know what a limit is? It's colloquially accepted that when you say something is infinitely larger than zero, you're talking about the sense of that limit, rather than a simple statement like "six is two times as large as three" which is clearly just describing the algebraic expression 6 = 2*3.

    I.e. "1 is infinitely many times larger than 0" can mean no more or less than, "the limit of the ratio of 1/x as x goes to zero is infinite".

  • nmclean (unregistered) in reply to anonymous

    Again you waste your time arguing something I already know. Yes, it's a limit, not an actual value in an equation. That is, and always has been, the point.

    Once again you are moving the goalposts; redefining the terms of the discussion so you can "win". From the beginning, posters to this comment thread were discussing zero itself, not some "colloquially accepted" concept that you now claim to have been arguing against.

    Further, you contradict yourself. Although you later admit that infinity cannot be reasoned about through regular arithmetic, you attempt to do that very thing in your opening sentence. Again: neither of these can be "equal to" anything, because they don't represent actual values.

  • nmclean (unregistered) in reply to nmclean

    To summarize what we were actually saying:

    1. The value that was added to timeOut, in the source code in the article, has no order of magnitude.

    2. The digit, 0, is significant when a decimal point is written.

    Your initial objections to these were knee-jerk and not actually applicable. Your subsequent insistence, that you were right and we were wrong, was misguided. I hope you now realize this.

  • anonymous (unregistered) in reply to nmclean
    nmclean:
    To summarize what we were actually saying:
    1. The value that was added to timeOut, in the source code in the article, has no order of magnitude.

    2. The digit, 0, is significant when a decimal point is written.

    Your initial objections to these were knee-jerk and not actually applicable. Your subsequent insistence, that you were right and we were wrong, was misguided. I hope you now realize this.

    To summarise what I'm actually saying:

    1. 100 is infinitely many times larger than 0.

    2. If you disagree, you're wrong.

  • nmclean (unregistered) in reply to anonymous
    anonymous:
    To summarise what I'm actually saying:
    1. 100 is infinitely many times larger than 0.

    2. If you disagree, you're wrong.

    • That's not all you said. You also said that zero was equal to an undefined value, and that the rules of significant digits are perpetuated by liars, along with some other unrelated nonsense about your professors.

    • Except no one here actually did disagree with that (particularly not in terms of its "colloquial meaning"), just like they never said anything about how programming languages parse the "0" character, or the precision of a computer's representation of large or small values, or their preferred mathematical notations, and so on. But I guess as long as you're okay with the fact that all your arguments here have been hypothetical (i.e. against critics who don't actually exist), that's fine.

  • anonymous (unregistered) in reply to nmclean
    nmclean:
    • That's not all you said. You also said that zero was equal to an undefined value, and that the rules of significant digits are perpetuated by liars, along with some other unrelated nonsense about your professors.

    • Except no one here actually did disagree with that (particularly not in terms of its "colloquial meaning"), just like they never said anything about how programming languages parse the "0" character, or the precision of a computer's representation of large or small values, or their preferred mathematical notations, and so on. But I guess as long as you're okay with the fact that all your arguments here have been hypothetical (i.e. against critics who don't actually exist), that's fine.

    No, I said that 100 is infinitely many orders of magnitude larger than 0. That is exactly the same as saying that 100 is infinitely many times (times ten or any other finite, nonzero number) larger than 0.

  • The Crunger (unregistered) in reply to anonymous
    anonymous:
    2. If you disagree, you're wrong.

    I see. Someone on the internet is wrong.

    Alex -- I would say you could safely delete page 3 of these comments, but there was a comment about Cuntstack that might be worth keeping.

