• BobboB (unregistered)

    DECODE(a, 1, Frist, "???")

  • LCrawford (unregistered)

    AT&T undoubtedly refers to SQL internally as a 5G Evolution language.

  • (nodebb)

    Hey look, it's Oracle.


  • Mr Bits (unregistered) in reply to BobboB

    DECODE(poster, 'Mr Bits', 'Frist', 'Sorry, you lose')


  • zero g (unregistered)

    4GLs have been re-invented as 'frameworks'. 5GLs were supposed to be systems where you described the problem, not the solution: where you described the question, not the program for giving the answer.

    The world headed off in a different direction: Visual Basic, a system where you programmed visually. That, at least, was adopted widely enough, with enough improvement in productivity, that it could truly be described as a fifth generation. And then.... the world went back to 3G languages (C and C++), then crept forward to something like 4G again: Angular, .NET, etc.

  • (nodebb)

    DECODE() is an Oracleism; actual SQL uses CASE ... WHEN ... THEN ... ELSE ... END. I don't think the additional verbosity would make this easier to understand though.
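A minimal sketch of that rewrite, run here against SQLite (which, like most non-Oracle engines, has no DECODE); the table name and values are invented for illustration:

```python
import sqlite3

# Oracle's DECODE(expr, search1, result1, ..., default) expressed as the
# standard CASE form mentioned above. SQLite stands in for "actual SQL".
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (a INTEGER)")
conn.executemany("INSERT INTO t VALUES (?)", [(1,), (2,), (3,)])

# Oracle:       DECODE(a, 1, 'one', 2, 'two', '???')
# Standard SQL: CASE a WHEN 1 THEN 'one' WHEN 2 THEN 'two' ELSE '???' END
rows = conn.execute(
    "SELECT a, CASE a WHEN 1 THEN 'one' WHEN 2 THEN 'two' ELSE '???' END"
    " FROM t ORDER BY a"
).fetchall()
print(rows)  # [(1, 'one'), (2, 'two'), (3, '???')]
```

Each DECODE search/result pair maps to one WHEN/THEN clause, and the trailing default maps to ELSE, which is why the CASE version is more verbose but mechanically equivalent.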

  • Church (unregistered) in reply to TwelveBaud

    "[A]ctual SQL"

    I LOL'd

  • Andrew (unregistered)

    Who could forget The Last One? (Well, of those of us who were programming in those distant days.)


  • OldGuy (unregistered)

    This article should be titled: How to recognize SQL Programmers who have been around a long time.

    Anyone who worked on Oracle prior to the mid-2000s not only recognizes that code but probably wrote things like it. If you don't have a CASE statement available, you work with what's there, and that's nested DECODEs.

  • (nodebb) in reply to TwelveBaud

    DECODE() is an Oracleism; actual SQL uses CASE ... WHEN ... THEN ... ELSE ... END. I don't think the additional verbosity would make this easier to understand though.

    Yes, DECODE() is, indeed, an Oracleism. Whether you love Oracle or hate it, chances are good that if you use it you need to learn how to use DECODE() ... and this is not how it's done.

    I would have expected to see this done with a stored procedure called DECODEWTF() - since that's what it is, in any language.

    Addendum 2019-06-20 09:56:

    EDIT: Also, "actual SQL" does not exist, since every database implements its own query language, many of which are "actually" structured.

  • RLB (unregistered) in reply to zero g

    5GLs were fun. Completely useless unless you had exactly the right problem, but fun to tinker around with. I've always wanted to write a text adventure in Prolog. IYAM, it ought to be possible to code the world in facts and the logic in rules. Backtracking is exactly what you need to parse something like PLANT THE POT PLANT IN THE PLANT POT WITH THE TROWEL. Never got round to it. Ah well.
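The backtracking idea can be sketched without Prolog; here is a toy recursive-descent parser in Python with an invented grammar and word list (verbs, nouns, and the sentence shape are all assumptions for illustration), which backs up and retries when an ambiguous noun reading fails:

```python
# Toy backtracking parser for VERB NP "IN" NP "WITH" NP, where NP is
# "THE" followed by a one- or two-word noun. The generators yield every
# viable reading so the caller can backtrack, Prolog-style.
VERBS = {"PLANT"}
NOUNS = {"POT PLANT", "PLANT POT", "TROWEL", "PLANT"}

def parse_np(words, i):
    # NP -> "THE" noun; try two-word nouns first, then one-word
    if i < len(words) and words[i] == "THE":
        for n in (2, 1):
            seg = words[i + 1:i + 1 + n]
            if len(seg) == n and " ".join(seg) in NOUNS:
                yield " ".join(seg), i + 1 + n

def parse(sentence):
    # S -> verb NP "IN" NP "WITH" NP
    words = sentence.split()
    if not words or words[0] not in VERBS:
        return None
    for obj, i in parse_np(words, 1):
        if i < len(words) and words[i] == "IN":
            for place, j in parse_np(words, i + 1):
                if j < len(words) and words[j] == "WITH":
                    for tool, k in parse_np(words, j + 1):
                        if k == len(words):
                            return {"verb": words[0], "object": obj,
                                    "in": place, "with": tool}
    return None

print(parse("PLANT THE POT PLANT IN THE PLANT POT WITH THE TROWEL"))
```

On that sentence the final noun phrase first tries a two-word noun, fails to consume the whole input, and backtracks to the one-word reading "TROWEL", which is exactly the behavior Prolog's resolution would give for free.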

    4GLs, by contrast, are aggravating, annoying know-it-betters, but I couldn't do my job without SQL, nor one of my previous jobs without Clipper. Alas.

  • John (unregistered) in reply to Andrew

    The Daily WTF remembers... https://thedailywtf.com/articles/What-the-Ad--Big-Promises

    Comments are archaeological gold...

  • ooOOooGa (unregistered)

    The problem with 4G languages:

    PointyHairedBoss: I want to cut costs in the company. I pay you a ton of money, so you look like a good candidate. I want you to write a program that will do your job for you.

    Developer: You want me to engineer myself out of a job?

    PHB: ... Yes.

    Developer: ... OK. What do you have in mind?

    PHB: You should create a program that takes a design specification that the clients and sales department create and turns it into a program that the computer can run.

    Developer: Hmm... Well, the clients and sales people are going to have to write the design spec in something other than English (or any other human language), because human language is too ambiguous. People have miscommunications when talking to each other all the time: problems with synonyms, slang terms, idioms, dialects, things like that. They will need to use a formal language.

    PHB: Well, I suppose the cost of training the sales team to use a formal language could be amortized. What language do you recommend?

    Developer: C. It is a nice stable formal language. Easy to learn at least the basics of, and powerful enough to describe any program that you want to build. So, you convince the clients and sales department that they need to start writing their design documents in C, and I will start working on a program that will turn the design document into something that the computer can run.

    Developer: * Starts writing resume to the sales department highlighting extensive knowledge and practical experience using C *

  • Just a Dev (unregistered) in reply to Watson

    Could be Netezza as well...

  • Argle (unregistered)

    [Sets up soapbox and stands on it]

    I've heard the whole "this will eliminate the need for programmers" claim my whole career. The real truth is that I will always have a programming job because there are preceding programmers. Of course, now they tell me that AI will change this. Maybe it's just me, but does anyone else think people would be less impressed by AI if it had the more truthful name of SI: Simulated Intelligence?

    [Gets off soapbox]

  • (nodebb) in reply to zero g

    Let's see. Where did I put that list? Oh, the C textbook I wrote. Hmm, need to cut the paragraphs down to single sentences.

    1GL: Machine code. Numeric codes used to tell the computer what to do.

    2GL: Assembly code. Mnemonic codes that map one-to-one to machine code. Compiles to machine code.

    3GL: Procedural languages, including COBOL, FORTRAN, C, C++, C#, Java, JavaScript, Pascal, Lisp, Visual Basic (but with some 4GL elements), etc. Most can be compiled or interpreted, or a combination of the two (compiled to intermediate code which is then interpreted).

    4GL: Descriptive languages, where you describe the expected outcome rather than the method to accomplish the task. Common ones are SQL (relational databases), HTML (web pages), and PostScript (print jobs), among several more.

    5GL: Problem- and constraint-based languages, where you describe the problem space and the kind of expected result, often some form of optimization, and let the computer figure out the answer.

  • Ross Presser (google) in reply to Bananafish

    "Actual SQL" may not exist, but "Standard SQL" certainly does (aka SQL-92, SQL:2011 or SQL:2016), and it does not contain any DECODE (in any version). DECODE is strictly Oracle, always Oracle, and it's a fucking disgrace that Remy is claiming that DECODE is SQL at all.

  • Church (unregistered) in reply to Ross Presser

    I'm thinking that "strictly Oracle" may not be entirely accurate. Seems like DB2 used it, at least. https://www.ibm.com/support/knowledgecenter/en/SSEPEK_10.0.0/sqlref/src/tpc/db2z_bif_decode.html

  • Appalled (unregistered) in reply to Ross Presser

    Total agreement. I started with Oracle (on IBM mainframe) and was later elated when virtually all projects transitioned to MSSQL and Windows servers. MUCH MUCH Easier. I don't think Remy ever DID any work in Oracle at all.

  • Ross Presser (google) in reply to Church

    I stand corrected. Apparently INFORMIX has it too -- not surprising I guess since DB2 and INFORMIX are both IBM products.

  • I can be a robot if you want me to be (unregistered) in reply to ooOOooGa

    Isn't that describing COBOL?

  • (nodebb) in reply to Argle

    Depends how you define "need." When I need something done, I need it done right. Employers, though, seem satisfied with just doing something, whether or not it actually works. So, sure, at this point, I would say they don't need programmers and some monkeys banging on typewriters will do.

  • (nodebb) in reply to Argle

    Of course, now they tell me that AI will change this.

    Yeah. No. Working in an area that might be next-generation AI (or might be the generation after that; it's pretty seriously cutting-edge), I believe I can say that neither current AI nor anything likely to replace it soon is going to do programmers out of jobs. Current systems are incredibly expensive and difficult to train correctly, and extremely inflexible too. It's much easier, cheaper, and more effective to hire a human programmer. Really. The hardware requirements to actually replace a single programmer (assuming everything is built on the absolute latest generation of chips) currently look disturbingly close to a single computer the size of the very largest datacenter. Or maybe more, much more. (We don't know exactly what is needed, which is one reason why it's a fun research area.)

  • Mark (unregistered)

    Stringly Enterprisey might make a good name for a detective in a children's novel. Or, better yet, the villain.

  • fun_time_perro (unregistered)

    What I suggest is this:

    Start from a perspective where the only thing invented comes to you from ... ... ... the Navy.

    Do you have 22 Aircraft Carriers?


    Okay then.

  • Little Bobby Tables (unregistered) in reply to Mark

    "Stringly Enterprisey" -- the ba$tard love child of Lemony Snicket and Truly Scrumptious?

  • ooOOooGa (unregistered) in reply to I can be a robot if you want me to be

    COBOL would work too, I suppose.

    Do you want to convince the clients and sales team to start writing their ideas down in COBOL?

  • Argle (unregistered) in reply to Zenith

    You couldn't be more right. I remember once being told that they could hire a few college students instead of me. I told them "good luck with that."

  • Barf4Eva (unregistered) in reply to dkf

    True today, but maybe not true in 10-20 years... Cell phones went from large bricks to devices more powerful than the desktop computers of two years prior in a remarkably short period. In an even shorter period, we saw the rise of highly connected social sites. Flying drones, along with other marvels in robotics, are likely to become commonplace enough that we will see the face of delivery systems change completely. Perhaps I'm being a bit of a futurist, but I truly believe that "software which writes itself" could be a reality within our lifetime. It might seem implausible, much like Bill Gates thought it implausible that people would need more than 10 meg or whatever the figure was. Now we chew up more than that every second from the interwebs.

    AND... Even believing all of this, I STILL believe developers will be in high demand for decades to come after this new marvel presents itself... Because, you know, maintenance of those legacy C# applications. ;)

  • sizer99 (google) in reply to Argle

    When talking replacement of 'programmers' it helps to distinguish between code pigs and engineering flavors. Of course there's a spectrum, but in general the code pigs (who take a spec and provide the code) may eventually be automated. Some of this already happens.

    But you can't replace real problem-solving (engineering type) programmers because it's not the programming that's so hard, it's the problem-solving. If you take away some of the tedium (the code pig work) from these people, well, there are always more problems that need solving. Every tool of this sort just increases the value of those who can use it. And if AI reaches the point where it can generally problem solve then there's no need for humanity at all, much less programmers.

  • Barf4Eva (unregistered) in reply to sizer99

    "And if AI reaches the point where it can generally problem solve then there's no need for humanity at all, much less programmers."

    Not sure I agree with this statement without a few extra conditions. Humanity is simply dictating to highly sophisticated AIs, well beyond what we have today, to get a job done. This doesn't remove the need for humanity, unless of course we are talking the "Robot Uprising" and self-aware, self-conscious, machines -- Machines that have become aware of "grave injustices" done to them. Until then, they are simply machines serving a function, FOR humanity. Oddly, this makes me think of a scene from this show on Hulu, FutureMan, where in the future, people smash any technology for fear of it uprising against them... Such as toasters. :)

  • (nodebb) in reply to Barf4Eva

    True today, but maybe not true in 10-20 years...

    Working with systems that are at least 5 years ahead of where the market is… no. Not likely. The hardware is many generations off what is required. The software is even further off (current mainstream machine learning approaches won't work; they require far too much data to train). We need multiple major breakthroughs to get AIs to human levels, yet all we really know is that we're doing it wrong at the moment.

  • (nodebb) in reply to Barf4Eva

    Yes, I'm sure most of the humans wouldn't agree with that statement! But humans are like cockroaches, even Skynet couldn't eliminate them all.

  • (nodebb) in reply to Zenith

    There are two ways to do a task: correctly, and again. -- Jake from Dude, You're Screwed. (probably sourced from somewhere else by him).

  • (nodebb) in reply to RLB

    Sounds dodgy to me. What if you have more than one plant pot? You need to be able to tell it to plant the pot plant in the pot plant plant pot.

  • (nodebb)

    My father (who is a now retired industry journalist) once told me: "One day, computers will be able to program themselves."

    I replied that, one day, computers will also be able to write magazine articles.

    My father said: "Hummph. I don't believe that – writing an article is just too complex."

    I did not respond.

  • Some Ed (unregistered) in reply to Argle

    If you call various flailings at making computers behave in an intelligent fashion "simulated intelligence", then in the future, when we finally have "synthetic intelligence", how will we distinguish them? They would both be abbreviated SI, and we'd be so busy trying to figure out which one we were talking about (or, for that matter, whether we meant the swimsuit issue, or scientific measurements, or various other things...) that we'd never get anything done.

    With the name "artificial intelligence", it's clear we're talking about asset investments, so there's no confusion.

Leave a comment on “Get Out Your Decoder Ring”
