• Katrina (unregistered)

    Hey Alex,

    I just wanted to say Hi and your blog is awesome!! Love you!

  • Dave Mays (unregistered)

    From the department of redundancy department....

    And doesn't this guy realise he's using up twice as many GUIDs this way? It's really going to suck when we all run out early because of him. ;-)

  • Patrick (unregistered)

    Also importing a DLL and calling unmanaged code, from managed code, simply to do something that was already done!

  • cicorias (unregistered)

    This is comparable to, but far worse than, code I saw where the developer called out to SQL Server to generate a new GUID with

    select newid()

    WTF!!! I flipped

  • icelava (unregistered)

    Unmanaged GUID > managed GUID

    is what the programmer wants to say.

  • Derick Bailey (unregistered)

    oo! oo! i'll play "defender of the WTF" for this round! :D

    Does anyone know if the Guid.NewGuid() method actually returns a truly unique GUID based on the network card's MAC address and all that jazz, the same way CoCreateGUID does? If NewGuid doesn't do the external call for us, or at least run the same algorithm to create the GUID, then the API call may be justified here.

  • Derick Bailey (unregistered)

    but either way... this is still stupid code since he wouldn't need to call Guid.NewGuid... .NET's Guid is a STRUCTURE which does not need to be explicitly instantiated.

  • Rojohn (unregistered)

    Yes, NewGuid in fact calls CoCreateGuid. But CoCreateGuid calls UuidCreate, which does not use the MAC address these days.
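
Rojohn's distinction can be sketched with Python's standard `uuid` module, used here purely as an illustrative stand-in for the Windows APIs he names: `uuid1` is the timestamp-plus-node scheme (historically tied to the MAC address), while `uuid4` is purely random, which matches the behaviour he describes for modern `UuidCreate`.

```python
import uuid

# uuid1: timestamp + node id (historically the MAC address)
# uuid4: purely random, matching how modern UuidCreate behaves
mac_style = uuid.uuid1()
random_style = uuid.uuid4()

print(mac_style.version)     # 1
print(random_style.version)  # 4
```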

  • Rojohn (unregistered)

    Oh and I say blame the code on MSDN. The doc is a bit vague when it says: "Initializes a new instance of the Guid class." What the heck does 'initialize' mean? So this guy probably just didn't want to use any undocumented features. ;-)

  • Hassan Voyeau (unregistered)

    I agree, the only way I would defend this is if the ole32 GUID algorithm is somehow different from the .NET GUID algorithm.

  • Hassan Voyeau (unregistered)

    [Append to my last comment] : Can anyone say for certain that this is not the case?

  • Dave Mays (unregistered)

    And even if the algorithms were different between the Ole32 and .NET Guid creator, what would possibly be the justification for caring how your GLOBALLY UNIQUE identifier was created? Who cares if it's based (partially) on a MAC address??

  • Dave Mays (unregistered)

    And, from MSDN:

    The full .NET Framework Guid.NewGuid method calls the Windows API function CoCreateGuid that calls UuidCreate to generate globally unique 128-bit numbers.

    ( http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dnnetcomp/html/PPCGuidGen.asp )

  • Jon Galloway (unregistered)

    Dave, you're just not thinking Enterprise Development. Sure, Globally Unique is fine for now, but what about when our applications are running on other planets or stars or whatever? That's why we need Universally Unique IDs (UUIDs), and there's no UUID class in .NET.

  • Hassan Voyeau (unregistered)

    Using two different algorithms makes it possible to create a GUID twice. However, as shown in your quote from MSDN, this is not the case, making this a WTF indeed.

  • Scott (unregistered)

    No, using two different algorithms doesn't make it any more possible to create two of the same globally unique identifiers.

  • Scott (unregistered)

    Sorry for the double post, but the algorithm is encoded into the GUID, making it actually impossible to create two of the same GUIDs with two different algorithms.

  • Patrick (unregistered)

    Scott, sorry but that's nonsense. A million monkeys pounding at a keyboard could duplicate your "unique" GUID, given enough time.

    It's just that the chances of a GUID clash are VERY, VERY remote, given the number of digits in a GUID.

  • Ludvig A. Norin (unregistered)

    Come to think of it, I'd say Scott is correct. A proper GUID (or UUID for that matter) must have an algorithm identifier. If it doesn't, it's not a GUID by definition - rather it's just a (possibly) random 128-bit number. Just making up a 128-bit number won't get you a GUID, right? I still wonder if there's a difference between UUIDs and GUIDs though (by specification, that is). GUIDs are obviously UUIDs, but is the reverse true as well?

  • Scott (unregistered)

    UUID is actually another term for a GUID.

    Guids are made up of different fields to help guarantee uniqueness.

    Typically they are: the algorithm, a spatial identifier (on the MAC-address-based algorithms this would be something involving the MAC address), and another section combining the spatial identifier with the time or something involving clock cycles.

    So, really, if a million monkeys were generating GUIDs, any clash they produced would most likely be with their own output - and they would probably die before that happened, making GUIDs REASONABLY unique. 128 bits is a HUGE space to deal with.
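
Scott's claim that the algorithm is encoded into the GUID is easy to check with Python's `uuid` module (again just a convenient stand-in for any RFC 4122 implementation): the version number sits in the first hex digit of the third dash-separated group of the canonical string form.

```python
import uuid

u = uuid.uuid4()
# The version ("algorithm") nibble is the first hex digit of the
# third dash-separated group of the canonical string form.
version_digit = str(u).split('-')[2][0]
print(version_digit)  # '4' for a randomly generated UUID
print(u.version)      # 4, the same field parsed for us
```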

  • Mike Dimmick (unregistered)

    You do need to call Guid.NewGuid(), otherwise you're using an uninitialised, blank GUID structure, i.e. {00000000-0000-0000-0000-000000000000}. Because it's a structure, you can't define an explicit parameterless constructor.

    Digging around with Reflector and the Shared Source CLI code, you can see that NewGuid calls a private constructor which takes a boolean. The constructor zeros the fields, then, if the argument is true, it calls CompleteGuid. This method is an internal call, i.e. implemented in the virtual machine itself. Searching ecall.cpp shows that the implementation is called GuidNative::CompleteGuid, which lives in comutilnative.cpp. This function does a bit of frame setup then calls CoCreateGuid. I presume that this was done for consistency with how the OS generates GUIDs (using the MAC address for NT 4.0 and earlier, randomly for Windows 2000 and newer).
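
Mike's point about the uninitialised value can be illustrated outside of C# (Python here, purely for demonstration): a default Guid struct in .NET holds the all-zero "nil" value shown below.

```python
import uuid

# The "nil" GUID -- the value an uninitialised .NET Guid struct holds.
nil = uuid.UUID(int=0)
print(nil)  # 00000000-0000-0000-0000-000000000000
```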

  • Robert (unregistered)

    There is nothing random about a GUID. It's a timestamp, based on your MAC address.

  • Fubarer (unregistered)

    Is that XML in the comments?

  • Cobus (unregistered)

    It's because of developers like this that we're gonna run out of GUIDs one day soon... ;-)

  • Jeff S (unregistered)

    >>Is that XML in the comments?

    Yes. If you use XML comments with proper tags in .NET, the code can generate documentation automatically.

  • Hassan Voyeau (unregistered)

    If the algorithm does include an algorithm identifier then my point (http://thedailywtf.com/archive/2004/10/01/2249.aspx#2267) is null and void. And I guess it would have to, to be Globally Unique. My bad.

  • Anon (unregistered)


    >>Scott, sorry but that's nonsense. A million monkeys pounding at a keyboard could duplicate your "unique" GUID, given enough time.

    Read Scott's comments in context. He was responding to an incorrect statement by Hassan.

  • Anon (unregistered)


    >>There is nothing random about a GUID. It's a timestamp, based on your MAC address.

    And your evidence is?

  • Cain (unregistered)

    This guy is wasting his GUIDs - which is perfectly fine by me, because I will sell him some more when he runs out.

  • ML (unregistered)

    With 3.4028236692093846346337460743177e+38 combinations it really makes no sense questioning its usability.
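
ML's figure is simply 2**128, the total size of the 128-bit GUID space, which is easy to verify:

```python
# The number quoted above is 2**128, the size of a 128-bit space.
combinations = 2 ** 128
print(combinations)           # 340282366920938463463374607431768211456
print(f"{combinations:.4e}")  # 3.4028e+38
```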

  • (cs)

    For anyone stumbling across this thread, the comments regarding use of MAC addresses, timestamps, etc. are all obsolete. Due to "privacy concerns" [this is NOT a joke!] GUIDs are now just random numbers with the same issues of random repeats as any other random number generator. Granted, the size [128 bits] minimizes this, but does not eliminate it.

    A few years ago, I had a client who was getting occasional database corruption/errors. They had based their entire design on the presumption that GUIDs are unique. About 3-8 times per week (this system generated over 10K records per second when aggregated across the enterprise) they would in fact get a duplicate.
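
As a rough sanity check on that anecdote, the standard birthday approximation p ≈ n²/(2N) gives the expected collision odds for ideal version-4 GUIDs, which carry 122 random bits. The 10K-records-per-second rate is taken from the comment above; the one-week window and the assumption of a truly uniform generator are mine.

```python
# Birthday approximation: probability of at least one collision among
# n uniform random draws from a space of size N is roughly n**2 / (2*N).
n = 10_000 * 60 * 60 * 24 * 7  # ids generated in one week at 10K/sec
N = 2 ** 122                   # random bits in an ideal version-4 GUID
p = n * n / (2 * N)
print(f"{p:.1e}")  # 3.4e-18
```

That probability is vanishingly small, which puts several real duplicates per week well outside what a truly random source would be expected to produce.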


Leave a comment on “When once just isn't enough ...”
