• LCrawford (unregistered)

    The proper language for a newbie's first generator code? C only if the newbie is a skilled assembly language programmer. C++ if they graduated from law school as a language lawyer - Prata is up to 1440 pages already. Otherwise use something more realistic.

  • RLB (unregistered)

    Yeah, this sounds like the kind of thing which should have been written in C, but by someone who spoke proper C already. Someone who, for example, knows the difference between binary and text file modes; and who knows how to setvbuf().

    Or Lua.
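    RLB's two points can be sketched in a few lines of C. The device path here is a stand-in (a real serial port would be something like "/dev/ttyUSB0" or "COM3"); the sketch just shows the "b" mode flag and the setvbuf() call he's alluding to:

    ```c
    #include <stdio.h>
    #include <string.h>

    int main(void) {
        /* Stand-in path for illustration -- a real serial device would
           be e.g. "/dev/ttyUSB0" or "COM3". */
        FILE *port = fopen("device.bin", "wb");  /* "b" = binary: no newline translation */
        if (!port)
            return 1;

        /* Turn off stdio buffering so each command reaches the device
           immediately instead of pooling in a multi-KB buffer. */
        setvbuf(port, NULL, _IONBF, 0);

        /* Instruments typically expect CR LF termination; in binary mode
           these two bytes go out exactly as written, on every OS. */
        const char cmd[] = "*IDN?\r\n";
        size_t n = fwrite(cmd, 1, strlen(cmd), port);
        printf("%zu\n", n);   /* prints 7 */

        fclose(port);
        return 0;
    }
    ```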

  • my name is missing (unregistered)

    When your project involves cutting metals with lasers, your best bet is to hire someone with actual experience.

  • (nodebb) in reply to my name is missing

    Experience? We've got a lab full of people with PhDs, that's all the experience we need!

  • Brian (unregistered)

    Oh, LabView, those were the days... I always thought the idea of a graphically-driven programming language was kinda cool, but every serious LabView program I ever saw quickly devolved into a horribly complicated mess of spaghetti. Not unlike most text-based programming, come to think of it.

  • (nodebb) in reply to my name is missing

    In engineering environments it is quite common to conflate programming and other engineering tasks. The end result is engineers from other fields becoming self-taught software engineers, with nobody in the pipeline having the experience to steer them towards good practices - never mind management having the patience to wait for code quality when they can have results NOW. The long-term issues of technical debt aren't really known in these environments, and by the time they become too painful, the project is too big to risk a rewrite. :(

  • Brian Boorman (google)

    At first I wondered why not use GPIB control. Ah, SCPI is a protocol on top of GPIB. I would also have thought that there would already be a .llb library in existence to control a piece of standard T&M equipment.
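    SCPI itself is just plain text layered over whatever transport (GPIB, serial, LAN). For instance, every compliant instrument answers the mandatory *IDN? query with four comma-separated fields, which is trivial to parse; the response string below is only an illustrative example, not from any particular setup:

    ```c
    #include <stdio.h>
    #include <string.h>

    /* Split a SCPI *IDN? response ("maker,model,serial,firmware")
       into its four comma-separated fields, in place. */
    static int parse_idn(char *resp, char *fields[4]) {
        int i = 0;
        for (char *tok = strtok(resp, ","); tok && i < 4; tok = strtok(NULL, ","))
            fields[i++] = tok;
        return i;  /* number of fields found */
    }

    int main(void) {
        /* Example response string, made up for illustration. */
        char resp[] = "Example Instruments,Model-123,SN0001,FW1.0";
        char *f[4];
        int n = parse_idn(resp, f);
        printf("%d %s\n", n, f[1]);   /* prints "4 Model-123" */
        return 0;
    }
    ```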

  • Greg (unregistered)

    Reminds me of university, where the department of nuclear physics had a program written by one of the professors to convert raw sensor output of one of the instruments into something one could actually use to create a graph (it also calculated some parameters from the data). It was a command line program that wasn't particularly user friendly, but it was used on a regular basis by only a handful of people (and roughly 20 students once a year for a lab project) and did its job. At some point, they decided to have an informatics student create a version with a GUI as his master's project. By the end of the academic year, he had a rather fancy GUI, but the program didn't do what it needed to, so everyone kept using the old one. I'm assuming said student failed his master's project, but I wasn't around at the time, so I don't know...

  • Industrial Automation Engineer (unregistered)

    Can anyone shed any light why this wasn't implemented in a dedicated industrial logic controller? (PLC) It amazes me that these kinds of control systems are running on general purpose computers.

  • (nodebb)

    Can anyone shed any light why a lab would hire someone who knows only Perl, Octave, and R?

    Can anyone explain how you could finish your undergraduate education knowing only those three computer languages? I can imagine how you could get out of school knowing only FORTRAN, or only C, or even only Java. But those three and no others??

  • Sigako (unregistered) in reply to Ross_Presser

    Actually, this I can answer. Most likely said ex-newbie wasn't a programmer at all, but rather graduated in chemistry or something, and never learned to code in school besides several obligatory Pascal lessons. Then (s)he got some tasks which were (best) resolved by learning said languages. At least, I can see the logic: Perl is for quick scripting, file/string juggling and fixing (and creating) messes, R is for all kinds of statistics, Octave is for most other numerical and data processing stuff (also, it has some niche but very useful libraries not available in R, and vice versa). Another point: these languages are easy to learn, since you can follow them step-by-step and immediately see everything happening. Also, that's exactly my story, just replace Octave with Matlab (we didn't know about Octave at the time (we didn't buy Matlab though)), Perl with Tcl (I'm a pervert), and chemistry with biology.

    TRWTF is tasking this person with real low-level programming.

  • (nodebb) in reply to Sigako

    Thank you, both for the explanation and for the humbling kneecapping reminding me how provincial I am. :)

  • Sigako (unregistered) in reply to Ross_Presser

    And the reason these people were tasked with this stuff is most likely because the head of the lab is some old fart who got his PhD before computers had GUIs, and all he knows about coding and different languages is "You're a wizard, Harry! Now, witch me up something, or else". I had to fix an RS-232 port SNAFU once this way. Only I couldn't do C, it was a problem with firmware (it was a home-made shit (not by me)), and I found a Matlab function that resets the port, clearing everything stuck during the initialization (thankfully, it doesn't clog during actual work). After a while I got fed up and just commissioned a proper replacement for my own monthly salary when the opportunity showed up.

  • Is That Really A Crosswalk? (unregistered)

    I was unemployed a long time ago and to get benefits you had to show that you'd been applying for suitable jobs - a reasonable thing to do, but it did mean having to explain weekly why being a "computer programmer" didn't mean you knew every different language there was out there. And then you'd have to explain why there were different languages, and how knowing some could actually hinder being able to use others.

    That's something so many companies still do - you know the languages you needed for the job you have, and someone getting 10x your salary decides it means you'll know all the other ones.

  • Brian Boorman (google) in reply to Industrial Automation Engineer

    This isn't industrial control, that's why. I've never seen an electronics/physics lab where the equipment is test and measurement being controlled by a PLC. You can keep your ladder diagrams. Does anyone even make a PLC with GPIB/IEEE-488 interfaces?

  • My Name (unregistered) in reply to RLB

    Text mode wouldn't have helped anyway (open it in text mode and now your program produces different newlines on different OSes while the device only expects \r\n), and even setvbuf isn't the biggest problem. On Windows, the Microsoft CRT just plain refuses to fopen() the virtual COM port no matter what you do, apparently because virtual COM ports are FILE_TYPE_UNKNOWN.
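    The usual escape hatch when the CRT's fopen() won't cooperate is to drop below stdio to the OS handle API (CreateFile/WriteFile on Windows, which do accept virtual COM ports). A POSIX sketch of the same idea, with a plain file standing in for the port path:

    ```c
    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>

    int main(void) {
        /* Stand-in for a port path. On Windows the analogous calls are
           CreateFile/WriteFile, which accept the virtual COM ports that
           the CRT's fopen() rejects. */
        int fd = open("port.bin", O_WRONLY | O_CREAT | O_TRUNC, 0644);
        if (fd < 0)
            return 1;

        /* Raw write: no stdio buffer and no newline translation -- the
           CR LF pair is sent byte-for-byte regardless of OS. */
        ssize_t n = write(fd, "*IDN?\r\n", 7);
        printf("%zd\n", n);   /* prints 7 */

        close(fd);
        return 0;
    }
    ```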

  • 516052 (unregistered)

    I've been saying this for decades now and I will say it again, because this case proves my already-proven-to-death point. Software engineering needs to become real engineering. We need a bar and exams and qualifications and licensing and accountability and all the other stuff mechanical and electrical and other engineers get.

    For as long as programming is treated as just another profession as opposed to serious engineering, crap like this is going to be the standard.

  • Prime Mover (unregistered) in reply to Is That Really A Crosswalk?

    Mind you, once you've got a few (of the right kinds of) languages under your belt, you can fairly quickly pick up enough of whatever other languages you are tasked with (faking that you've been) learning. Once you understand the underlying structure of the how-to-write-instructions discipline, everything else just boils down to syntax. And it's very, very rare that you'll be expected to be fluent in anything esoteric. If you've got C (of whatever flavour), Java, JavaScript and HTML (oh, and apparently this beast known as "XML" is also highly rated by big-boss-man types) you can fake your way into anything. If you can put Cobol and/or Fortran on your CV as well, you can also corner some of the higher-paying niches in which actually having to do any work is not necessarily mandatory.

  • Some Ed (unregistered) in reply to 516052

    I feel there's a hurdle to pass before we can get that licensing stuff sorted out. Maybe two.

    1. Most programming languages I've worked with have odd corners where dragons lurk. Sometimes, this happens because we're not always at our best, but other times it happens because those people get involved in writing programming languages.

    2. A fair number of programming languages are accessible enough that anyone can use them, and those people don't die from it. (Unlike, for example, electrical engineering, where most of the stuff is intentionally not all that accessible, and some of the people who access it anyway die from it. Because 10,000 volts is a lot of volts, and usually comes with enough amps to do serious damage.) The deaths that do happen aren't generally blamed on the people who died. (The father of one of my high school classmates had his right hand, right foot, and everything between fried because a coworker disregarded a downtime notice and tagout on a 10,000-volt system he was working on. That coworker was held responsible.)

    3. So much stuff in IT is built by these people that it's difficult for people who actually have the training to do stuff reliably to not accidentally base their work on some flawed crap. I can be careful about most of the libraries I directly use, but what about the ones they use? Or the ones five layers down? What about when the library I've been using takes on a new dependency? I can't just substitute a new base library, unless there's a good drop-in replacement, because my stuff depends on that functionality.

    4. Our bosses don't necessarily want stable and reliable. They want software to be developed fast and cheap. Many have accepted those consequences.

    The last is probably the most important: the people who make the decisions about what software to buy and who to pay to run it have not, in general, accepted that quality software is a matter of life and death and software quality is really important. The others just make it hard to convince them.

  • 516052 (unregistered) in reply to Some Ed
    1. That's not as big a deal as it sounds, for two reasons. Firstly, as you said, nobody actually dies, so it's not like having a bug is going to automatically mean being disbarred. And secondly, that sort of thing actually happens in all sorts of engineering. I mean, it's not like physics and materials science are 100% understood fields either. People are always trying to skirt the corners, and sometimes they build a race car out of magnesium alloys or a bridge out of "high strength" steel. It's only really when millions are lost or people die that you hear about the engineer actually getting punished.

    2. This one is a non-issue as well. Again, it's not like people die from this stuff, so unlike with construction or high-tension wiring there is no reason to stop Joe Random from hacking together a VBasic macro for himself at home. There is, after all, no harm done. What certification is for is to ensure that the people who enter the business world and are employed for their programming skills are vetted, so that Joe Random can't get into a position where harm can actually be done.

    3. I don't see a problem here. If so much harm happens from a bug that there is enough harm done to trigger an actual court case all you need to do is dig up where the bug originates and punish that person. Even if it is 10 libraries deep in the import list.

    4. So did the people buying bridges and railroads, slumlord landlords and other cheapo tycoons of our past. It is exactly because of people like this, and NOT because of bad engineers, that we have all our modern engineering regulations. Laws are made not to stop bad engineers tricking good customers, but to stop greedy capitalists from cutting every corner they conceivably can - which includes shopping for an engineer stupid, bad or corrupt enough to do it for them, or just cheap enough to not know better.

    The only defense an engineer has against being asked to cut corners, or to do something he does not know how to do, is to fall back on a law prohibiting it.

  • A N Onymous (unregistered)

    LabView -- ugh!

    I work for a company that assists folks with prototype development. We often get designs with LabView as the programming language. It's nice and friendly for people who just want to get the design working, and have limited programming experience. But, oh, my, is it a bear to deal with when you're trying to take that PC, a bench full of test equipment and some custom hardware and reduce it to a 3x5 circuit board controlled by a PIC or a compute module.

    LabView code can't even be examined without a copy of LabView. You can't so much as look at the source. And figuring out what it does, so you can rewrite the logic in C or whatever, is challenging, to say the least. We are not big fans of LabView-powered designs.

  • Yazeran (unregistered) in reply to A N Onymous

    Oh Man, I couldn't agree more.

    Labview is fine as long as you only have one or two instruments and want to get a nice graph out, but try to do something which relies on timing actions across more than one physical instrument, and man, you are going to face problems.

    I always say that you can't get a screen big enough to do anything serious in Labview, as debugging a typical Labview application is like trying to navigate by map while viewing the map through a toilet paper tube. You can never see enough to get the big picture.

    Yours Yazeran

  • (nodebb) in reply to 516052

    I don't see a problem here. If so much harm happens from a bug that there is enough harm done to trigger an actual court case all you need to do is dig up where the bug originates and punish that person. Even if it is 10 libraries deep in the import list.

    Software generally isn't written for "bug will kill people" environments, so the responsibility lies at the point of use: it would be the responsibility of the person choosing libraries to use only libraries that are vetted for this kind of use, just as you can't use just any library on systems with limited RAM or hard real-time constraints.

    But even then, punishment is only likely if negligence was involved, and then it will be hard to pin it on the engineer, who may have been under orders to "get it done fast". Which may or may not be verifiable by the time the bug kills someone.

    Though that's probably something best commented on by someone with experience in software engineering for aviation. As I understand they actually do have rigorous regulations, and neglecting them is how the Boeing 737 Max happened.

  • jay (unregistered)

    There's a 4th type of software development work: When you work for the government, the primary output is paperwork and working software is a side issue. I once worked at an agency where, I am not making this up, a team got an award for producing the most paperwork. They never produced any working software. The system was supposed to support 20,000 users, but their first prototype died with 5 users, and they never got past that.

  • (nodebb)

    In my Programming Languages and Translators class there was a brief survey of languages. One for manufacturing was mentioned, and all the professor said about it was, "Imagine what your bugs look like."

Leave a comment on “Reinventing the Wheel”
