Admin
Min isn't that bad - she admitted fairly quickly that she'd lied on the resume. The ones I like to keep notes on are the ones who fail even the most basic language questions, yet insist that they're masters of it. Like the 'unix guru' who supposedly doesn't know any unix commands because he's aliased them to their DOS equivalents and promptly forgotten about it - but he also doesn't know 'fdisk'. (Of course, if the guy otherwise knew how unix worked, he could recover even from that - I tend to be fairly thorough before I decide to try to ban someone from working in my company.)
However, that having been said, the real problem is that the people who conduct interviews assuming all of the assertions on the resume are true (or, worse, assuming that HR approving the interview means that the candidate's resume matches the requirements one sent to HR) are the same people who don't check the list of candidates never to hire.
That having been said, I'm still chuckling about the time one of these managers hired a guy who had put '20 years experience with Java' on his resume (and this was years ago, even!) For the curious, said individual apparently did not know Java - unless there's a dialect that looks strangely like poorly written BASIC. I know because I'd asked him to write a Java function to do some silly task. Oh, and for what it's worth, when I interviewed him, his resume said '25 years experience with Java', but I'd advised him that Java wasn't quite that old.
Note: chuckling was all I ever had to do about it, because the guy managed to totally wash out, very quickly, in a manner that made said manager look rather incompetent, and created no work for myself.
Admin
Re: "never put critical business transaction code in finally blocks."
Well, yes... I would suggest that really important code is better placed in the "try" block. And really important business code should probably be within the scope of a database transaction too.
(As a practical matter, exceptions thrown /from/ a finally block can be problematic. Be Careful!)
Admin
There.
Admin
public void setConnection(String dataSourceName) throws Exception { ... }
so even if I know the inner exception might be an SQLException, I still have to check for the general java.lang.Exception case (ok, I'm also answering which language I'm using). Given the state of other code I've found, I almost expect the "re-throw" to be something like this:
catch (Exception e) { throw new Exception("Couldn't connect to the DB! :("); }
Some calls are nice enough to include e.getMessage(), but no stack trace at all. My solution? I give that function its own try/catch block, then re-throw it as a new SQLException.
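For what it's worth, a rethrow that chains the original exception as the cause keeps the full stack trace. A sketch (connect() and the inner failure here are made up for illustration, not taken from the code being discussed):

```java
import java.sql.SQLException;

public class Rethrow {
    // Wrapping the original as the *cause* preserves its stack trace --
    // unlike `throw new Exception(e.getMessage())`, which discards it.
    static void connect() throws SQLException {
        try {
            // Stand-in for whatever actually fails inside setConnection().
            throw new IllegalStateException("driver not loaded");
        } catch (Exception e) {
            throw new SQLException("Couldn't connect to the DB", e);
        }
    }
}
```

Callers can then catch the specific SQLException and still see the root failure via getCause().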
Admin
But if there's no circular reference isn't that an infinite loop?
Uhhh. Forget me. I'm on beer #3 so I'm probably talking out my rear end.
Admin
Actually, the reason it doesn't work in Java is that there are no guarantees about when finalize() will run. It could take quite a long time, and for all that time the database connection, file, or whatever is still open.
Finalizers are still useful as a failsafe, but the preferred way should be explicit methods for closing (or rollback/commit, etc.) in a finally block.
You can solve the problem you mentioned by calling System.runFinalizersOnExit(true) someplace in the program (though note that method has since been deprecated as unsafe).
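A minimal sketch of the explicit-close idea, using a toy Resource class rather than a real connection (names are made up for the demo):

```java
public class FinallyDemo {
    static final class Resource {
        boolean open = true;
        void close() { open = false; }
    }

    // Returns true if the resource ended up closed. Explicit close() in the
    // finally block runs on both the normal and the exceptional path, with
    // no waiting on the garbage collector to get around to finalize().
    static boolean useAndClose(boolean fail) {
        Resource r = new Resource();
        try {
            if (fail) throw new RuntimeException("work blew up");
            // ... normal work with r ...
        } catch (RuntimeException e) {
            // swallowed just for this demo
        } finally {
            r.close();
        }
        return !r.open;
    }
}
```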
Admin
Maybe she finally fixed her resume (no more Microsoft Solaris). And that might be the reason why she was hired by the other company XD
Admin
I was asked this same question at the interview for my current job. The answer I came up with was to compare the number of elements traversed with the known element count; if you can traverse more elements than the list holds, something is obviously wrong.
They found that acceptable, but then described their own answer (which was pretty obvious as soon as they said it - I guess that's the pressure of interviews), which is to have a flag indicating whether an element has been visited or not; if you are traversing the list and come to one which has already been visited, then it has a circular reference. (This technique is known as colouring.)
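The colouring answer can be sketched like this (Node and the flag name are made up; note the check mutates the list, so it's a one-shot test):

```java
public class ColouringCheck {
    static final class Node {
        Node next;
        boolean visited;   // the interviewer's "colouring" flag
    }

    // Marks nodes as it walks; seeing a marked node again means a cycle.
    static boolean hasCycle(Node head) {
        for (Node n = head; n != null; n = n.next) {
            if (n.visited) return true;
            n.visited = true;
        }
        return false;
    }
}
```

The trade-off versus the counting answer: this needs a spare bit in every node, but it doesn't need to know the element count up front.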
Admin
[quote user="bcharr2"][quote user="shepd"] There is no hard and fast rule to the "Who is better, the college taught or self taught programmer?", because no 2 job candidates are exactly alike.
I've seen individuals with degrees and without who could program. I've seen individuals with degrees and without who couldn't program. [/quote]
Stupid people will be bad programmers whether they are self taught or have gone through thorough training, and likewise smart people will be good whether they have gone through training or are self taught.
In my experience, there are a lot of people who fall somewhere in between the stupid and smart boundaries, and where these people are self taught they usually know all the different libraries and APIs well, they know technical details to the most intricate level, yet they will still use bubble sort to organise a list of 1 million elements.
Languages and technologies are very easy to learn; theory, not so much.
Admin
Fortran is not object oriented. This doesn't make me think that Fortran is an inherently better language than any other (far from it) -- but it does make me wonder what the fuck other languages really need to be object oriented for.
Q.E.D.
See.. here's how the proof works. If an old language doesn't have the exact same features as a newer language, then the newer language is wrong. Right? Right?
Admin
Process.Kill() came with .NET 1.1, afaict. It also requires full trust, which by design you should never give to code you haven't written yourself (and probably shouldn't even give to all your own code, either).
While it does invalidate the assertion that the finally block always executes, it does so on the same level as all the variations on 'pulling the plug' do. Actually, it's an even weaker statement: an outage (if you're not using a UPS on a critical system, which is a WTF in and of itself) is at least unpredictable, but killing a process through code has to be done explicitly.
Btw. for all the people discussing what the proper 'form' of a try-catch-finally is, in C# it's done with a using statement:
Dispose() will be called on the transaction object as soon as the running program leaves the scope of the using clause (which includes situations where an exception occurs). Dispose() then takes care of rolling back the transaction, etc. if it wasn't committed.
The IDisposable interface in combination with the using() statement is C#'s version of RAII. Use it!
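Java 7 and later have the same idiom as try-with-resources; here's a toy sketch, with Transaction as a stand-in rather than any real database API:

```java
public class TxDemo {
    // Toy transaction: close() rolls back unless commit() already ran,
    // mirroring the Dispose() behaviour described for the C# version.
    static final class Transaction implements AutoCloseable {
        boolean committed, rolledBack;
        void commit() { committed = true; }
        @Override public void close() {
            if (!committed) rolledBack = true;  // auto-rollback on any exit
        }
    }

    static Transaction run(boolean fail) {
        Transaction tx = new Transaction();
        try (Transaction t = tx) {
            if (fail) throw new RuntimeException("business logic failed");
            t.commit();
        } catch (RuntimeException ignored) {
            // close() has already rolled back by the time we get here
        }
        return tx;
    }
}
```

Because close() runs on every exit from the try block, the rollback happens even when the exception propagates - the same guarantee the using clause gives in C#.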
Admin
The real WTF is people using Java for anything critical enough where you would even need a finally to ensure something was done.
Admin
For the linked list question, I would just iterate forward through the list, checking whether the back pointer really points at the node we came from. If it doesn't, we've got a loop. It is an O(n) algorithm that is guaranteed to stop.
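A sketch of that back-pointer walk (toy Node type; as later replies point out, it really only detects *inconsistent* links - a cycle whose prev pointers are all consistent would make this walk forever):

```java
public class BackPointerCheck {
    static final class Node { Node prev, next; }

    // Walk forward; a successor that doesn't point back at the node we
    // came from means the doubly-linked structure is corrupt.
    static boolean linksConsistent(Node head) {
        for (Node n = head; n != null && n.next != null; n = n.next) {
            if (n.next.prev != n) return false;
        }
        return true;
    }
}
```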
Admin
In summary, "finally" is just a regular construct to ensure some piece of code always runs in the presence of exceptions. RAII has its place, but in some languages you have no control over when a destructor runs, which may leave you tying up a resource for too long.
Disclaimer: I'm well aware of the fact that using Delphi is TRWTF. Get over it.
Admin
Any object-oriented language which lacks destructors is pure FAIL. Use it, and you FAIL, sooner or later.
The reason for this is that object cleanup is fundamentally necessary for certain types of objects. One must ensure that they get cleaned up when they go out of scope, or the program may not be able to run twice in succession.
When the program crashes, these sorts of errors are expected - but very annoying, so it's preferred if the solution can do something to minimize the amount of trauma caused by a crash. However, normal executions should never have that issue.
'Finally' blocks may work most of the time, but there are normal situations where they don't run. Also, using 'finally' blocks to simulate a destructor requires one to remember them for every block in which an object requiring a destructor is used. That's basically the exact same situation which motivated nearly all programmers to seek languages with some form of automatic garbage collection: people couldn't remember to free all the memory they allocated.
Incidentally, in my experience, languages which push exception handling rather than error handling also have a certain amount of fundamental fail. None have shown this to me more clearly than Java and Python - supposedly the languages with the best exception handling (at least, according to their adherents), yet over 95% of the Java and Python programs I've run have an incredible propensity to dump with a stack trace on any unexpected input. That stack trace may be useful to a developer, but when a normal user sees it, there's a good chance you're losing a user.
Note: I have not run any Db programs directly, so I do not know if they have this propensity or not.
(Just for comparison: using a combination of options intended for developer use, one can get the same behavior out of perl. But it's a simple matter to turn those off before shipping the product, which causes perl to go back to its normal unhandled exception response: attempt to do what the programmer most likely intended. (And it's generally quite successful at doing that.))
I'm uncertain what RAII means, but the Wikipedia entry to me sounds like it's "programming". If you're using any programming language which doesn't have programming, I think you fail.
Admin
Quite some years ago, we interviewed a guy for a development position in my group. Despite a good-looking resume, we didn't feel he was all that qualified, and so didn't take him. He eventually was hired in another department, and eventually, I wound up inheriting his position. Looking at his code confirmed that our first group assessment was correct.
Admin
And yet... In this situation HR had already filtered out everybody else; presumably that included at least a few people who were sufficiently competent for the job. It appears that the only way to get past that filter is to brazenly lie about everything, and it worked for Min.
Admin
That will only work for a doubly-linked list.
Admin
Anyone that accesses the backend files for a real database over NFS is just asking for trouble. And I'm not talking about the performance hit you are going to take.
In terms of the UNIX buffer cache: well, mount that file system with the correct options. I also know that Oracle, when it opens at least its redo log files, uses the O_DSYNC option, which forces the writes to be flushed rather than hang around forever.
Over the years, going back to version 5.1, I've seen people do tons of stupid things to machines running Oracle. Killing processes at random, hard power fails, disk storage going down, etc. I can count only two cases where the normal startup recovery in the database didn't fix things. One was when someone moved files around from E: to F: while the database was running. The other was multiple disk failures, not noticing that the RAID set had been degraded, and also no backups of any kind, ever.
Admin
If the transaction log indicates an incomplete transaction, roll it back. If the transaction log is damaged, roll it back. If there's anything wrong at all, roll back. Only if everything is peachy do you not do anything. This means that the data could be consistent and complete, but the transaction log still says it's incomplete, or it's damaged and so it gets rolled back, but that's ok. The only scenario where I can see this fail is where the hard drive spreads random bits all over the place just before losing power.
Admin
Wow, you're an arrogant S.O.B.
RAII is nearly 100% equivalent to try/finally. The difference is that RAII relies on rules of the C++ language to implicitly release the resources IN THE DEFAULT CASE, while try/finally requires explicit code. That's nice and all, but hardly a necessity, and certainly not enough to make RAII so vastly superior as to warrant a "try/finally considered harmful" proclamation.
Worse, you seem to not understand your own beloved C++ and RAII. Destructors are NOT guaranteed to run. I can list numerous scenarios in which they won't. Yes, these are mostly the same scenarios in which finally blocks won't run either, but you sure seemed to think destructors were superior in this arena ("'Finally' blocks may work most of the time, but there's normal situations where they don't run.").
You also go on and on about "RAII" just being programming, and how every language has it. This shows your ignorance. "RAII" isn't "just programming", it's a specific pattern. And like all patterns, it has different implications in different languages. In this case, the pattern CAN'T be applied in all languages. If you think that RAII just means (from Wikipedia) "The technique combines acquisition and release of resources with initialization and uninitialization of objects", then what I just said sounds ludicrous. But that's only PART of RAII. RAII relies upon the scoped nature of the initialization and uninitialization, and not all languages have such support. That's precisely WHY finally (and using, which is syntactic sugar for try/finally) exists in those languages. Looked at from that perspective (the correct one), try/finally IS RAII by your definition, and thus you're arguing in circles (try/finally considered harmful, RAII is the superior solution that proves that, but try/finally is RAII, making the argument fail... my head hurts).
OK, you've got a preference for C++. Good for you. Don't disguise this religious debate with a mask of superior technical knowledge, because in doing so you FAIL.
Admin
Sounds like you're blaming the language for short-comings in the programmer. Unexpected input is not an exceptional case and should not be treated as such. If your program is dumping a stack trace because the user entered a letter where a number was expected (for example), the program was written poorly regardless of the language used.
Admin
This would fail if the back pointer happens to be incorrectly set.
Admin
I agree that the programmer is more at fault than the language. However, when 95% of the applications in use using one language are this sort of crap, and only 50% of the applications in use using another language are this sort of crap, I'll prefer the second language. It's not the language's fault, but that of the people the language attracts. (I realize 95% of everything is crap. But generally, the crap doesn't get used nearly as much as the 5% that's not necessarily crap.)
Incidentally, an example of one of these python programs was the Gentoo emerge (it could be fixed now - it's been three years since I used it) program: give it certain combinations of bad options, and it'd stack trace, instead of giving proper usage. Clearly a Gentoo problem - but when you can't find an example to point the said maintainer to of how one does it correctly in Python, because the first 50 examples found by Google all have the same problem, I start wanting to avoid that language.
Admin
ITYM a doubly-linked list which has a null back pointer on the first link.
Admin
Nothing. Sure, have you programmed in LISP all these years? Try simplifying code like this in C++ and go back to your code in 6 months and tell me if you understand what it's doing! (and no comments whatsoever)
Admin
Nope, because eventually fast->next->next will throw a null pointer exception (NPE), which will be caught, and false will be returned. Note, however, that using exceptions in this way is a bad idea, and considerably slower than just checking for null at each step.
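The null-checked version of the tortoise-and-hare loop looks something like this (toy Node type, not from the original post):

```java
public class FloydCycle {
    static final class Node { Node next; }

    // Null-checked fast/slow pointers: no reliance on catching
    // NullPointerException, and O(1) extra space.
    static boolean hasCycle(Node head) {
        Node slow = head, fast = head;
        while (fast != null && fast.next != null) {
            slow = slow.next;
            fast = fast.next.next;
            if (slow == fast) return true;  // hare lapped the tortoise
        }
        return false;                       // hare ran off the end
    }
}
```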
Admin
Addendum (2008-07-31 13:24): EDIT: And works on a singly linked list, obviously. For a doubly linked list, just check if ( this->next->prev == this )
Addendum (2008-07-31 13:33): Obviously testing that this->next isn't NULL first, to avoid crashing the poor thing.
Admin
Addendum (2008-07-31 13:14 PST): You should read the comments before posting your own, this solution has been posted MANY times already.
Admin
Google didn't make that one up; it's Floyd's cycle-finding algorithm, a classic. See [url]http://en.wikipedia.org/wiki/Cycle_detection[/url].
But the basic detection loop can't find the beginning of the cycle (though a standard extension of the algorithm can).
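For the record, a standard second phase does recover the cycle's start: once the two pointers meet, reset one to the head and advance both one step at a time until they meet again. A sketch with a toy Node type:

```java
public class FloydCycleStart {
    static final class Node { Node next; }

    // Phase 1: detect a meeting point with fast/slow pointers.
    // Phase 2: reset one pointer to head; stepping both one node at a
    // time, they meet exactly at the first node of the cycle.
    static Node cycleStart(Node head) {
        Node slow = head, fast = head;
        while (fast != null && fast.next != null) {
            slow = slow.next;
            fast = fast.next.next;
            if (slow == fast) {
                slow = head;
                while (slow != fast) {
                    slow = slow.next;
                    fast = fast.next;
                }
                return slow;          // first node of the cycle
            }
        }
        return null;                  // no cycle
    }
}
```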
Admin
Just because you can't guarantee that automatic destruction happens the moment that an object goes out of scope doesn't mean that you can't apply RAII. (I'd love to know how the use of "finally" helps in this regard.) Perl has DESTROY (it also has Scope::Guard, which I haven't looked into); Python has __del__; I can't, off-hand, think of another OO language that doesn't have a hook into the underlying mechanism for object destruction.
It was, once again, tgape who made the tentative assertion that "RAII" <is> "just programming," although I seem to recall that he claimed that he'd never heard of it, had just looked it up, and it "seemed" to be that way.
Good for you, anon. You're arrogant and guilty both of misattribution and misquotation. I make no comment on your parents, however.
I've got a preference for C++, yes, but it's hardly religious. I'd be far happier spending the rest of my life programming in Python. (Better libraries, and better organised, too, required; and with a bit of luck the GIL will disappear. I'm looking forward to 3.0 for the first two and checking out alternatives to the third.) Scheme looks fun, too. I'm pretty language-agnostic, with the exception of those that I would expect to act as personal ball-and-chains, such as Java and VB.Net ... but I've got no objection whatsoever to other people using those languages. Just so long as they don't abuse ill-advised language constructs such as "finally."

Did I mention religion? I might have done once, but I think I got away with it. I suspect that you're confusing my juxtaposition of goto and finally with (non-existent) evidence that I regard both as equivalently evil, or that I have dogmatic views on the subject.
I don't. Not with regard to either, as it happens.
I've seen too many projects burned by goto (thankfully in the dim, distant past), and, judging by many comments defending the use of finally, I'm going to see a -- probably smaller -- number of projects burned by the use of finally. Either one is a deceptively powerful tool, and therefore dangerous in the hands of a depressingly large number of programmers.
Not you, obviously, because you are anonymous and not arrogant and therefore know what you're doing, even if what you do is by definition not attributable.
Here I stand; I can do no other. Finally is the modern goto. Just an observation. Not a gospel. (Sorry, Martin. I'll get around to the other 94 when I have some spare time.)
Admin
A (language C++) does not contain WTF' (finally). A (language C++) is not better than a member s of the set S (other languages). A (language C++) is Turing complete. For each s in S (other languages), s is Turing complete. s' (some member of set S) contains WTF' (finally).
Since A does not contain WTF', what benefit does s' derive from containing WTF'?
Let O represent <the set of> "object oriented" <languages>. Note, for the purposes of this proof, that neither O nor the term "object oriented" has yet been used or defined. Nor will be.
Your argument therefore goes as follows:
F (language Fortran) is not a member of set O. F (language Fortran) is not better than a member o of the set O ("object oriented" languages). I don't need to bother explaining what o' means, because I've just sprinkled pixie dust and used the magic abbreviation "Q.E.D."
quod, I think, non est demonstrandum.
Neither "old" nor "new" feature in this abstraction. Your (pitifully cretinous) argument would make C++ the superior language had Stroustrup invented it yesterday, complete with RAII.
Your (pitifully cretinous) argument also introduces the entirely irrelevant issue of whether a language is "object oriented" or not. As you know, Fortran is not. It is not, therefore, amenable to the RAII pattern. I'm not au courant with the latest Fortran standard, but I'd assume that it's also lacking in support for exceptions -- which makes the use of a "finally" clause a bit problematic.
We were talking about "finally," weren't we?
As opposed to some completely random and possibly massively destructive language construct that hasn't been invented yet, but will be in the near future?
Admin
A better example of never executing a finally is when calling System.exit().
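A tiny illustration (toy logging; the exit path is left commented out because it kills the process):

```java
public class ExitDemo {
    static final StringBuilder log = new StringBuilder();

    static void run(boolean exit) {
        try {
            log.append("try;");
            if (exit) System.exit(1);   // JVM halts here: finally never runs
        } finally {
            log.append("finally;");
        }
    }

    public static void main(String[] args) {
        run(false);
        System.out.println(log);        // try;finally;
        // run(true) would terminate the process inside the try block,
        // and "finally;" would never be appended.
    }
}
```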
Admin
See http://en.wikipedia.org/wiki/Cycle_detection#Tortoise_and_hare
Admin
There's a reason it never reaches the catch statement. StackOverflowError is obviously an Error not an Exception.
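A quick demonstration of the hierarchy (classify() is made up for illustration): StackOverflowError extends Error, which sits on a separate branch of Throwable from Exception, so catch (Exception e) never sees it.

```java
public class ErrorVsException {
    // Throwable -> Error -> StackOverflowError is a sibling branch of
    // Throwable -> Exception, so the first catch clause is skipped.
    static String classify(Throwable t) {
        try {
            throw t;
        } catch (Exception e) {
            return "Exception";
        } catch (Error e) {
            return "Error";
        } catch (Throwable other) {
            return "Throwable";
        }
    }
}
```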
Admin
You could still hire him/her as a clown, or as someone who keeps company morale up by showing that there is always a person doing worse than you.
Admin
Except that won't work if a=10^20, b=10^-20.
The result is 10^40, which is out of the roughly ±1 × 10^±38 range of single-precision floating point numbers.
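The overflow is easy to see in code; assuming the operation in question is the quotient a/b (the original comparison being discussed isn't quoted here):

```java
public class FloatRange {
    public static void main(String[] args) {
        float a = 1e20f;
        float b = 1e-20f;
        // 10^40 doesn't fit in a float (max is about 3.4e38), so the
        // quotient overflows to Infinity instead of a usable number.
        System.out.println(a / b);
        System.out.println(Float.MAX_VALUE);
    }
}
```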
Admin
In contrast to other readers, I can see a correlation between goto and finally:
Isn't this a poor man's finally?
I did this often in my first C programs, and I've seen it in programs written by other programmers. And that's the only excusable use of goto I know. Anyway, I prefer other ways of error handling (in C) today.
BTW, what about a finally for C++?
I assume this would be a source of many WTFs ;)
... if it works at all; I haven't tried it!
Admin
It was the only question that I outright failed on my interview for a Dot Bomb. "Well, you'll remember the answer next time, won't you?" said Michael, the Jewish Ukrainian American who turned out to be an utterly excellent boss.
Yes, I will. I will also remember to spit poison in the face of some retarded quasi-superior prick of an interviewer who actually takes puzzle-solving seriously, without being prepared to have the same crap thrown back in their face.
Addendum (2008-08-02 15:59):
Oops, sorry. An interesting Zen question, however: What is the beginning of the cycle?
Admin
Res ipsa loquitur.
Admin
It feels awkward when you realize that you know more than your interviewer.
Admin
The code in the following finally is unlikely to ever execute:
try { while ( true ) ; } finally { veryImportant(); }
Admin
Guys, seriously. Be careful with finally blocks. This shows right here, in a very contrived case, why. This caught a lot of very intelligent developers in various groups here at my company.
Just because finally executes last, doesn't mean that it's doing what you may intuitively think it should do.
Admin
The "exceptions" aren't thrown by C++, they are thrown by the CPU, and on most OSen there are APIs to enable those "exceptions". This applies, then, to almost any language (assembly, C, C++). Some language runtimes (Java, .net) enable those by default.
It is actually very useful to enable math exceptions in C++ when you write numerical code. This lets you avoid hand-coding wrappers (at a HUGE performance penalty) for almost every operator and math function out there, just to handle those cases. I know of at least one open source (and somewhat abandoned now) project, very numerically intensive (it solves multibody dynamics), where the author had gone that route. You could probably remove 10-20% of the code just by enabling some more exceptions and doing exception handling properly.
Cheers, Kuba
Admin
A more complete answer (in addition to this) is that proper database servers have batteries slathered all over them (including the drive controllers), and they pick up where they left off when power is restored.
Admin
I can't imagine why... after all, employers lie to employees all the time. All's fair in love and war...