Amen!
I have also heard a lot of people saying LabVIEW is so good: 'you can just take your instrument and within a few minutes get a graph of what it is measuring'. All well and good, but add some 20 different devices (of 5 to 10 different types) and all of a sudden you end up with spaghetti like the one shown. My beef is that you still can't get a screen big enough to properly debug any LabVIEW program doing anything remotely interesting. Add some interesting race conditions to that and you are set for disaster....
I would prefer a program written in C over LabVIEW any time, as in C at least you know exactly in which order things get executed, and in my experience, 99% of the time you never need to do things asynchronously anyway. If your application/problem requires that, then it gets more complicated, but then LabVIEW may not be your choice regardless....
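A minimal sketch of that determinism point in C, assuming two made-up instrument-read functions (read_temperature and read_pressure are placeholders, not any real driver API):

    #include <stdio.h>

    /* Hypothetical instrument reads: placeholder names, not a real driver API. */
    static double read_temperature(void) { return 21.5; }
    static double read_pressure(void)    { return 101.3; }

    int main(void)
    {
        /* In C the order is explicit: temperature first, pressure second,
         * and the printf runs only after both calls have returned. Two
         * LabVIEW nodes with no wire between them may fire in either order. */
        double t = read_temperature();
        double p = read_pressure();
        printf("T = %.1f C, P = %.1f kPa\n", t, p);
        return 0;
    }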
Yours Yazeran
Plan: To go to Mars one day with a hammer.
So you've filched Microsoft's plans for the Zune replacement, impressive.
Itis "Bolta" and not "Boltey", dumbass. Get better kwality of Indian person to translete.
TRWTF is the absence of stable kernel driver APIs in Linux.
About 15 years ago (during college) I worked for National Instruments as a tech support telephone jockey.
The real WTF is not LabVIEW; it's academics who think that because they have spent ages learning about something really obscure which no one else knows (or cares) about, they are really smart and have the right to patronise everyone else.
Also, because they are so smart, the problem must be with whatever they are using; it couldn't possibly be their fault, because they're so smart.
filed under: The O/S is broken.
On the subject of user interfaces, I give you National Instruments' own User Interface Gallery. These are the examples that NI thinks are beautiful enough to deserve special recognition.
[image] Hmmmm... Windows 95 style.
Yes, they really did use a 100% oxygen atmosphere! But not at 14 PSI (about 1 atm). Instead, it was at 5 PSI. That's a bit above the partial pressure of oxygen at sea level; you can do just fine without all the nitrogen that normally buffers it, and the fire hazard at 5 PSI of oxygen is not really any worse than at 14 PSI of air. It saves a hell of a lot of mass, though, and for Apollo, that was a very big deal. The idea was they'd start out at 14 PSI, so nobody would get the bends or anything, and then bleed off cabin atmosphere until it was down to 5 PSI, which they'd stick with for the rest of the mission.
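For scale, a quick check of the numbers (assuming the standard 21% oxygen fraction of air):

    p_{O_2} = 0.21 \times 14.7\ \mathrm{PSI} \approx 3.1\ \mathrm{PSI}

so a 5 PSI pure-oxygen cabin actually delivers somewhat more oxygen than sea-level air does, while saving all the mass of the nitrogen.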
Then Apollo 1 happened. On the ground, with only oxygen available to pressurize, they went up to more than sea level pressure in order to fully test all of the systems and simulate the pressure differential the capsule would experience in space against a vacuum. Basically, it was a hyperbaric oxygen chamber, and also a massive fire risk. After that, they changed so that when pressurizing to sea level on the ground, the oxygen was buffered with nitrogen, before dropping down to 5 PSI pure oxygen for the bulk of the mission.
The Space Shuttle is, I believe, the first American spacecraft to have a nitrogen/oxygen atmosphere at 14 PSI for the entire mission, though they used to drop to 10 PSI during spacewalks to make it easier for the spacewalkers to purge nitrogen from their blood. That's the other upshot of a 5 PSI pure-oxygen environment: it makes EVA prep a lot easier, because your body has already gotten rid of all that nitrogen. (Suits are still at 5 PSI pure oxygen; any more pressure and you can't bend them.)
My GUIs don't look like that. If you know what you're doing, you can create much better GUIs, but that is indeed one of the areas where LabVIEW needs considerable improvement.
Yes, it's proprietary. If NI disappears (which currently seems unlikely based on their financial reports), the LabVIEW source code is in escrow and will go to another body. Existing LabVIEW versions will continue to work. That's no different than using Visual Studio and .NET.
I'm sure upgrading from VB6 to VB.NET was very easy for you? Or from SQL Server 2000 to 2008? Some upgrades go smoothly, others do not. Mine were generally smooth - open the code, find new bugs (if any), build the application.
As I said earlier, those were some advantages and disadvantages, not all.
No, they aren't. That's a document listing some of the control types you can find in LabVIEW (the majority of which I personally don't like and don't use as-is).
I used to program LabVIEW for a living. While I was stuck doing that I made this demotivator: [image]
It's possible to write structured code in LabVIEW, but the IDE makes it difficult. The simple act of defining a new function (in LabVIEW parlance, creating a VI) is kind of a pain in the ass, especially compared to every other programming language.
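For comparison, here is the entire ceremony of defining and calling a new function in C; a minimal sketch, where square is an invented stand-in for whatever a small sub-VI would compute:

    #include <stdio.h>

    /* A new "VI" in a textual language: one definition. No icon editor,
     * no connector pane, no separate file-save dialog. */
    static double square(double x)
    {
        return x * x;
    }

    int main(void)
    {
        printf("%f\n", square(3.0));
        return 0;
    }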
VLADIMIR IS NOT SOUNDING LIKE REAL NAME.
Russia pepole are children born of much raping from mongol warlords. So don't talk about ancestory to me, madarchod.
This. Absolutely this. I'm working with a LabVIEW program that I inherited from my predecessor. The block diagram only has a few blocks and is fairly clean... until you notice that each block is a sub-VI with over 35 blocks in each (and some of those are sub-VIs as well). All told, there are just shy of 400 sub-VIs, all nested within each other. A few of the sub-VIs are on a network drive that's physically across the nation, so the program even manages to create quasi-race-condition failures, which is a really difficult way to mess up. I'd almost prefer the spaghetti, because it looks messy, so it's easier to argue for a chance to switch. Sigh.
NI called it a "gallery," and the whole point of a gallery is to show what you can do.
I would expect to find these on some l33t winamp sk1nz; the fact that NI has them at all is embarrassing.
But hey, while you're defending the indefensible, it's time for the classic UI game: "are these switches on or off?"
[image] [image]
I thought LabVIEW was useful for some tasks, but this is garbage. I mean, I'll listen to reasoned support, but don't put your balls in my mouth and tell me it's tea-time.
Stop making annoy me, madarchod!
You do realize that splitting your code into small, manageable and cohesive chunks is generally considered a good thing, right? That small functions that do specific things are better than a single big function that does everything? That splitting things into classes can make your code safer and easier to read?
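A minimal sketch of that principle in C (the function names are invented for illustration, not taken from any real codebase):

    #include <stdio.h>

    /* Each helper does one small, cohesive thing. */
    static double read_sensor(void)       { return 42.0; }  /* stand-in for real I/O */
    static double calibrate(double raw)   { return raw * 1.05 - 0.5; }
    static void   log_value(double value) { printf("value = %f\n", value); }

    /* The top level then reads like an outline, the same way a clean
     * diagram of a few well-named sub-VIs would. */
    int main(void)
    {
        log_value(calibrate(read_sensor()));
        return 0;
    }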
From your basic description, it sounds like the app you're working on at least doesn't have the problem described in the original post.
Personally, most of my apps have considerably more than 400 VIs. What's the problem with that?
Congratulations, you've pushed me past the point of "this is as much time as I'm willing to spend on this today". You seem not to have noticed that I said I don't like these controls either, and that my LabVIEW GUIs don't usually look like this.
Sounds like every "developer" I have ever met.
[quote user="Yair"][quote user="Anon"] Yes, it's proprietary. If NI disappears (which currently seems unlikely based on their financial reports), the LabVIEW source code is in escrow and will go to another body. Existing LabVIEW versions will continue to work. That's no different that using visual studio and .NET. [/quote]
Except I can open my .cs files in Notepad. Try that with your .vi documents. Even if Microsoft exploded tomorrow and every copy of Visual Studio disappeared from the face of the earth, I'd still be able to look at my source code and see what it did. Even if it was impossible to compile it again, I could still look at it and piece together the functionality.
Also, I'm sure people working at Enron thought it was highly unlikely that they'd go out-of-business. Stuff happens, companies collapse, sometimes very suddenly.
Also, what is "appelatio"? GINMF :(
No one has yet touched on the reason idiots like me buy NI stuff: a single, direct seller.
If I start off with a simple data acquisition task, I know that the acquisition hardware, drivers and development environment all came from NI, so it's just one ass to kick if there's a problem. The sales rep can access the actual engineers to sort problems.
This is very different from many resellers, who have 2000 Chinese-made boards to sell and cannot access any in-depth engineering expertise.
If I need to add RS-422 comms or vision acquisition, boards from NI play nicely with each other and with the drivers. I do not want to go back 20 years, to when adding a new board would require a compiler version incompatible with an existing board from a different vendor.
I agree 100% about the horribleness of LabVIEW, though.
It was in at least some of the early tests. That was reportedly one of the reasons for the Apollo 1 capsule fire.
Congratulations, your reply was ridiculous enough for me to rise back past the point I previously passed.
So, to counter: Even if NI exploded tomorrow and all backup copies of the LabVIEW source code were stolen and eaten by alien zombies and whoever holds the LabVIEW source code in escrow would turn out to be a Bond villain, my existing copies of LabVIEW would still allow me to look at my source code and see what it does. And I would even be able to compile it. So yay, I'm basically prepared for that unlikely scenario.
Until your HD crashes, and you find you can't reinstall LabVIEW because it requires online activation and the servers aren't running post-apocalypse.
Of course it isn't likely, but why even take the risk? You can write your programs in BASIC, C, C++, C#, Java... just about any other language, and none of them have this problem. You can pick almost any other language in the world and not have your source code, the single most important thing a programmer produces, held hostage.
Man, it's alarming how much the LabVIEW fan boys are in denial about this.
If you were really Russian you'd spell it Vladimir, before or after transliteration.
You are in a maze of twisty little macros, all alike
Am I the only one who at first sight read his name as BigFaggot?
Sorry, I'm done adding absolutely nothing to this thread...
Because the risk is so negligible it's irrelevant, and I prefer programming in LabVIEW. I like it better. It works better for me, and I'm not going to stop using it just because it's proprietary and there's some far-fetched scenario in which everything goes to pieces and yet I really, really, REALLY need my source code and can't get at it. If the situation is that bad, I assure you that access to my code would be the least of my issues. So, yes, I am in denial about that. It's a "problem" I have no problem having.
A much more likely danger (although, like I said, even that seems quite unlikely currently) is that for some reason NI stops producing LabVIEW, locks down the activation servers, etc., and no one else picks it up. Then I won't get any future versions, and at some point it won't run on current platforms. I'm willing to take that risk, because to me it seems irrelevant. Can it happen? Yes. Is it likely? Not currently. If it does happen, the code will have to be rewritten at some point in the future, but that applies to almost any language. Code rarely stays alive for a very long time.
Oh, and since all my LabVIEW installations are on virtual machines and are already activated and backed up, I don't need to reactivate them. So there, solved that theoretical hurdle as well. Man, it's alarming how far these anonymous anti-LabVIEW boys would go to find excuses to diss it.
Finally, somebody just admitting the real reason they choose a particular language/environment, instead of rattling off a bunch of arbitrary, subjectively chosen metrics in order to "prove" that their way of doing things is the best.
It's so rare to see this kind of honesty among programmers.
I don't really see the problem. Visual tools can often make us realise how complex code actually is.
Granted 'Hello World' wouldn't look this complex (no matter how badly it were implemented), but we should keep in mind that 'Hello World' doesn't actually do anything....
This apparently comes from a test suite (or rather, test equipment), and given that some of the labels include words like Yaw, Mach and Pitch, I'm guessing it's something related to aircraft (or maybe a wind tunnel). Perhaps I'm weird, but it doesn't surprise me that something like that might appear very complex. Assuming the WTF here is the complexity (which I assume from 'Spaghetti' being in the title), I really don't see why complexity is a WTF.
Granted, LabVIEW may claim to simplify things (as do many systems that use pictures instead of words), but the reality is that pictures don't simplify everything for everyone; they merely provide an alternate representation for different minds. I'm guessing engineers (other than Software Engineers) and the like are more comfortable using pictures and diagrams because they use such things all the time. Software Engineers and Computer Scientists, on the other hand, are often comfortable looking at the pretty patterns that well-formatted code makes. Presumably this software does actually manage to automate the testing (as it claims), and although it may be tedious, automation often is....
Summary: I'm no expert (in LabVIEW, or anything else significant, for that matter), but apparent complexity != WTF. Could someone please explain it to me?
"Alex" was the author...
I'm just sayin'....
I say, English, I'm no great scholar of the language, but I really don't understand how he's misused the word literally. Perhaps you could explain, old chap?