Admin
And there was me thinking that a hash was supposed to somehow relate to the item being hashed. This will make it much easier to implement hashing algorithms.
Admin
Frist! And this is to convince Akismet that I'm not a spammer.
Admin
Quite odd that the amount of duplicate data was still reduced with this hash function. :)
Admin
'he just implemented a quick SHA-1'
That's the real WTF, right there.
Admin
Well then it's a good thing that this was only my first attempt at a "first" post; i.o.w. the word "every" doesn't apply, therefore logically I am not a loser.
∀ p ∊ P etc... ∎ (qed)
Admin
any_number % 1 == 0
So this function always returns a string of zeroes.
Admin
I guess, if you're the kind of person who likes to introduce concurrency bottlenecks into arbitrary nonconcurrent functions.
Admin
It's data entry by users. By hand. One wouldn't even need a really long hash. (globalCounter++).toString(16) only once would be more than enough. OTOH 10^48 random numbers is also more than enough to avoid a hash collision in most cases of manual data entry, provided that the random generator is properly seeded. It's a really stupid implementation, but it will probably work provided that you never have to regenerate the same hash from the same source. And it's fewer lines of code than a complete SHA implementation.
So yeah, in theory it's a WTF and I would never write something like this myself, but in practice it works well enough.
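The counter idea mentioned above can be sketched as follows (the function name is mine, not from the article). Unlike `Math.random()`, an incrementing counter can never collide with itself within a session:

```javascript
// Sketch of the global-counter alternative described above.
// Each call yields a distinct hex string: "0", "1", "2", ...
let globalCounter = 0;

function nextToken() {
  return (globalCounter++).toString(16);
}

const a = nextToken();
const b = nextToken();
console.log(a, b, a !== b); // "0" "1" true -- tokens are always distinct
```

Of course, as later comments point out, this only guarantees uniqueness per page/process; it still tells you nothing about whether two pieces of *data* are duplicates.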
Admin
You keep telling yourself that, maybe you'll start believing it - but we won't.
Admin
The previous two commenters, talking about how incrementing a global counter fails due to concurrency, or whatever nonsense Sir Robin-the-Not-So-Brave is on about, are completely missing the point of a hash.
The point of a hash is not, in this case, to assign a globally unique identifier to each new submission. It is to detect and identify duplicate (i.e. non-unique) submissions. Therefore not only is a "hashing function" which doesn't generate the same output when given the same input not a hashing function, it doesn't even come close to being applicable to the problem.
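To make the point concrete, here is a minimal deterministic string hash (a djb2-style sketch of my own, not the article's code). The defining property is exactly the one demanded above: equal inputs always produce equal outputs, so duplicates collide on purpose:

```javascript
// Minimal deterministic string hash (djb2 variant, illustrative only).
function simpleHash(text) {
  let h = 5381;
  for (let i = 0; i < text.length; i++) {
    h = ((h * 33) ^ text.charCodeAt(i)) >>> 0; // keep it a 32-bit unsigned int
  }
  return h.toString(16);
}

// Duplicate submissions now hash identically:
console.log(simpleHash("same entry") === simpleHash("same entry")); // true
```

A `createHash` built on `Math.random()` has the opposite property: the *same* input yields *different* outputs, which is why it cannot detect anything.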
Admin
Sir Robin ran away. The entire point of the exercise is to generate hash collisions so you can see if the data is duplicated.
Admin
If only there were a Math.seed() function that could take arbitrary input. Then you could feed everything you wanted hashed into it, and this function would do something approximately correct.
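There is no Math.seed() in JavaScript, but the idea is easy to sketch with a small seedable PRNG (mulberry32 here; all function names are mine). Deriving the seed from the input text makes the "random" number deterministic, i.e. a usable hash:

```javascript
// Hypothetical sketch of the "Math.seed()" idea above:
// derive a numeric seed from arbitrary input text...
function seedFrom(text) {
  let h = 0;
  for (let i = 0; i < text.length; i++) {
    h = (h * 31 + text.charCodeAt(i)) >>> 0;
  }
  return h;
}

// ...then run one step of a small deterministic PRNG (mulberry32).
function seededRandom(seed) {
  let t = (seed + 0x6d2b79f5) >>> 0;
  t = Math.imul(t ^ (t >>> 15), t | 1);
  t ^= t + Math.imul(t ^ (t >>> 7), t | 61);
  return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
}

// Equal inputs now produce equal "random" hashes:
console.log(seededRandom(seedFrom("entry")) === seededRandom(seedFrom("entry"))); // true
```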
Admin
Ehrm...
I feel kinda stupid. Can anyone tell me how that "hash" helped reduce duplicate entries? Because I really don't get how it could do that.
Admin
Bad function though. He forgot to seed the random number generator...
Admin
Without proper context, I could agree that the createHash function is a WTF. However, imagine that createHash is called when the page is loaded and then passed along during form submission to guard against a user mistakenly hitting submit more than once (which could happen if the submission was taking a while and an impatient user kept pressing submit, thinking that would make things go faster). Granted, there are better ways to guard against that sort of WTFry: simply disabling the submit button when it is pressed, or adding an interstitial page, would certainly be better. So this is still a WTF, but not for the reason other posters have stated.
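The "disable the submit button" guard mentioned above can be sketched DOM-free as a handler wrapper (a minimal sketch with hypothetical names, not the article's code): repeated clicks are simply ignored after the first one.

```javascript
// Wrap a submit handler so only the first invocation goes through.
function once(handler) {
  let submitted = false;
  return function (...args) {
    if (submitted) return false; // swallow the impatient extra clicks
    submitted = true;
    return handler.apply(this, args);
  };
}

let count = 0;
const submit = once(() => { count++; return true; });
submit(); submit(); submit();
console.log(count); // 1 -- only the first click is processed
```

In a real page you would attach `once(handler)` to the form's submit event, or just set `button.disabled = true` inside the handler.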
Admin
It didn't, but once in a while, a new submission would (at random) be assigned the same "hash" as an existing submission. That, and a healthy dose of placebo effect.
Admin
It wasn't a hash to check the data; it was a hash to check that the data was only posted once.
Admin
The objective is to remove duplicates. A hash of the text provides a quick way to be sure two texts are different. To make sure they are equal (after you get equal hashes) you still have to compare the texts. Using a "hash" that does not depend on the text will allow duplicates to slip through, but it will not lose texts.
If they are doing the comparison...
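The two-step check described above can be sketched like this (function names are mine): compare cheap hashes first, and only on a hash match compare the full texts, so even a colliding hash can never discard a genuinely new entry.

```javascript
// 32-bit djb2-style hash, illustrative only.
function simpleHash(text) {
  let h = 5381;
  for (let i = 0; i < text.length; i++) {
    h = ((h * 33) ^ text.charCodeAt(i)) >>> 0;
  }
  return h;
}

function isDuplicate(existingTexts, candidate) {
  const h = simpleHash(candidate);
  // Full-text comparison runs only when the hashes already match.
  return existingTexts.some(t => simpleHash(t) === h && t === candidate);
}

const seen = ["alpha entry", "beta entry"];
console.log(isDuplicate(seen, "alpha entry")); // true
console.log(isDuplicate(seen, "gamma entry")); // false
```

With the article's random "hash" the first test fails for real duplicates, so the final comparison never even runs.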
Admin
http://s260.photobucket.com/albums/ii12/REexpert44/%3Faction%3Dview%26current%3Dthats_the_joke.jpg
Admin
Further to that, they would be losing data because of said placebo effect.
I'm curious if there was an actual measurable improvement or, as you stated, simply a placebo effect... Hmm, fewer records means it's working, right? Ahh, I should've looked closer: it was management that noticed the reduction in duplicate data; who better to vet this type of metric than someone who would pay someone like Nagesh to write their apps for them...
Admin
I recognise that code - it's part of the knobworks v11.78 patch - so SamF's predecessor was that small shaven poodle called Earsmus Pink. Man, that blitch's code output was legendary: 10 LOC per day, and all of it based on her canine logic that even our experts could not argue with...
Admin
This function would make a good hash of the current time, provided the random number generator is reseeded with the time for every use.
Admin
If this thread has taught me anything, it's not to post too soon (or often).
It's interesting how the term "hash" automatically biases us to think "lookup". The code in the article works fine and meets the requirement. Using a UUID or even a single 32-bit random number would have been slightly more elegant.
I would like to see how long they cache the "hash" values.
Admin
It would have the effect of reducing the amount of "duplicate" data, and it would occasionally tell a data entry operator that "duplicate" data was found.
And yeah, used this way, it's still a WTF.
Admin
So...you're saying...that the code example in the original post isn't very good?
Have the admins been informed of this?
Admin
OK. That's great and all, but what if you want to support multiversalization?
Admin
No, that isn't the joke. The article clearly says that duplication was reduced. This has yet to be explained.
Admin
How does this code validate that data entered by different users at different times is not equal? Each piece of data is assigned a random number and then the random numbers are checked against each other - how does that fulfil the requirement for checking that no two pieces of data are the same?
Admin
Wow, someone bit! Good job frits... I always thought you were too high brow to troll.
Admin
Ahem.
Admin
Sorry, but where in the article does it say this? I realize it doesn't explicitly say that it's being used to compare what is on the form with the DB, but that's implied, since the hash was used to alert the user if they were entering duplicate data. Nowhere does it imply that it was used to prevent moron users from being click-happy...
Aw shit, I just got trolled... DAMNIT!
Admin
"Math.floor(9999999999999 * (Math.random() % 1));"
Why bother? With a simple Math.random() he could have achieved the same level of FAIL!
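The point about `% 1` being pointless is easy to verify: `Math.random()` already returns a value in [0, 1), and JavaScript's remainder operator leaves such a value unchanged when the divisor is 1.

```javascript
// "% 1" is a no-op on Math.random()'s output:
const r = Math.random();          // r is in [0, 1)
console.log(r % 1 === r);         // true: remainder of x/1 for 0 <= x < 1 is x
console.log(0.75 % 1);            // 0.75

// So the quoted line is equivalent to:
const hash = Math.floor(9999999999999 * Math.random());
console.log(Number.isInteger(hash)); // true
```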
Admin
Go big or go home, maybe?
Admin
After smoking all that hash... a random number generated by the client's browser seems a perfectly sensible way of detecting whether Input A == Input B, even if the two are separated by time, space, and, thanks guys, the re-instantiation of the multiverse.
Admin
Wait, did I just get double-trolled?
Admin
Might be more enterprisey to find the supposed duplicate ratio and block every "n"th insert; 100% success rate!
Admin
Wouldn't that be a business practice WTF? Why would different users from all over the world have a data-entry race condition?
I'm not trolling, but I think the OP may be.