Admin
Who's Joe? And what did he do with Matteo?
Admin
I've seen (and, as a clueless noob eons ago, done) a couple of similar things.
A new revision of the language adds some new hotness, such as .NET's nullable value types, which at first glance seems to have omitted a few basic use cases. Of course they are covered; you just don't know it yet.
So, before you have any real experience using the new hotness, and supremely confident you're a more observant and complete thinker than the language designers, you whip up your own extensions for the stuff they "forgot", and proceed willy-nilly to use them all over your code base when there is already better-thought-out functionality available in the language.
Your own ego makes it ~~hard~~ impossible to remove your stuff once you learn of the built-in right way, if indeed you ever do. And of course your homebrew tools have subtly different behavior, often around corner cases or error handling, which makes it hard for follow-on devs to remove your stale trash and substitute the built-in stuff. At least not without a lot of deep test coverage & bravery. Hah! What test coverage?
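To make that "subtly different behavior" concrete, here's a minimal sketch of the kind of homebrew wrapper I mean, next to the built-in Nullable<int>. The names (MaybeInt, GetValueOrZero) are made up for illustration, not pulled from any real code base.

```csharp
using System;

// Hypothetical homebrew "optional int", written before its author
// discovered that int? (Nullable<int>) already exists.
public struct MaybeInt
{
    private int value;
    private bool isSet;

    public static MaybeInt None => default;
    public static MaybeInt Of(int v) => new MaybeInt { value = v, isSet = true };

    // Corner-case mismatch: silently hands back 0 when no value was set...
    public int GetValueOrZero() => isSet ? value : 0;
}

public static class Demo
{
    public static void Main()
    {
        // ...whereas the built-in nullable makes "no value" explicit and lets
        // the caller pick the fallback (or get an exception if they forget).
        int? builtIn = null;
        Console.WriteLine(builtIn.HasValue);  // False
        Console.WriteLine(builtIn ?? 0);      // caller-chosen fallback: 0
        // builtIn.Value would throw InvalidOperationException here,
        // instead of quietly pretending the value was 0.
    }
}
```

The homebrew version compiles and mostly works, which is exactly why it survives; the difference only bites whoever later tries to swap it for the real thing.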
As to Joe vs. Matteo, I suspect Matteo quit in disgust halfway through the stench-scrubbing, and poor Joe was hired to finish the half-done job. Which was even worse: now he had two competing stenches to scrub off.
Admin
I wouldn't call nullable value types new, or hotness... First, they've been around since .NET Framework 2.0, which is, what, nearly 20 years now? That pretty much makes them ancient in IT terms, 200 years in human years. Secondly, there's nothing hot about nullable value types: for the last 20 years they've just been the standardized way to represent, well, nullable value types without boxing them, because boxing is incredibly inefficient and people who do it are actively killing the planet :-)
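For anyone who hasn't run into the difference, it looks roughly like this; just a sketch of the two approaches, nothing specific to the article's code.

```csharp
using System;

public static class BoxingDemo
{
    public static void Main()
    {
        // The pre-2.0 way to fake a "nullable" int: box it into an object.
        // Every such assignment allocates the value on the heap.
        object boxed = 42;            // boxing allocation
        object missing = null;        // "no value"
        int unboxed = (int)boxed;     // unboxing; would throw if boxed were null
        Console.WriteLine(missing == null);   // True

        // Nullable<int> (int?) is just a struct holding a bool flag plus the
        // value, so no heap allocation is involved.
        int? maybe = 42;
        int? nothing = null;
        Console.WriteLine(unboxed);           // 42
        Console.WriteLine(maybe.HasValue);    // True
        Console.WriteLine(nothing ?? -1);     // -1
    }
}
```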
Admin
I remember when HPC still meant "High Performance Computing", a world I've had the luck to see a bit of. Today I read this thing and I'm somehow not surprised.
Admin
And yet when 2.0 came out they were new...