• (disco)

    Bruce B., a recent high school graduate

    "normalization", "indexing"

    Either the story is being polished and Bruce is a college graduate too dumb to admit it or he was a high school graduate and a virgin.

    Regarding visual programming, the closest I ever got to that was Access 97. Never again.

  • (disco)

    I wonder how many people actually used Helix (talking about those posting here). We did a solid evaluation on it, and it lost to another system (RDM by ITI) for various technical reasons (primarily the host computer it ran on). Given the mind-set and technology of 30+ years ago, it was not really a bad product and seems suited for the level of complexity that was deemed "light to moderate" back in those days.

  • (disco) in reply to TheCPUWizard

    The problem here is not Helix or others. The problem is the boss. I worked in a one-man company for such a "My way of thinking is the only right way of thinking" boss, 3 years ago ... It was a 10-month hell.

    I had to justify every class, function and line of code I wrote in nearly endless discussions. And everything I wrote had to be redesigned, in a way he thought it should be ... even if the redesigned code did exactly the same. After 6 months he started to call me on vacation and told me to fix issues within 30 minutes.

  • (disco) in reply to TheWayne
    TheWayne:
    I worked in a one-man company for such a "My way of thinking is the only right way of thinking" boss

    It must've been more than one-man while you were there…

  • (disco) in reply to dkf
    dkf:
    It must've been more than one-man while you were there…

    Now that you mention it ... Thanks for making it clear ;)

  • (disco) in reply to HardwareGeek
    HardwareGeek:
    Visual HDL

    Never heard of that one, but I had to use Renoir (I think it's just called Mentor HDL Designer now) at one job. Never used flow charts for generating code, but did use block diagrams to generate module definitions (or entity/architecture pairs, if you prefer), and to connect the modules together. Saved a lot of typing and prevented my usual copy/paste errors. And seeing the modules connected graphically was actually helpful.

  • (disco) in reply to NedFodder
    NedFodder:
    Mentor HDL Designer

    If that's what I think it is, I know the developers of it…

  • (disco)
    Matt_Westwood:
    Programming languages that make it easy for non-programmers to write programs result in badly-written programs.

    I think the real problem is treating this as something to be used by non-programmers, when it should be treated as a higher-level language. I cannot work with another autistic person who

    Matt_Westwood:
    I had the opportunity to play with a process flow tool called Pentaho once. That was fun.

    I worked in a company that used Pentaho. I recall the data storage interface. It wasn't too bad for basic grouping queries, although I was very aware of how often I wanted to select some summed values and some keys, group by those exact same keys, and still had to write out all the keys each time.

    Admittedly part of that problem could have been the total absence of documentation when it came to writing queries to investigate the data, which I had to do all the time.

    martin:
    Programmers always get "good money". Using obsolete technology ruins their knowledge and career. Why would they do that?

    I don't understand how this works economically either. Then again, I got better money by switching from SugarCRM to AngularJS (I realise those are completely different technologies. Not the point).

    accalia:
    IT BURNS US! IT BURNS US! GODDESS HAVE MERCY!!!!!!!

    That seems as though it could be helped with moar hierarchical encapsulation.

    TheWayne:
    After 6 months he started to call me on vacation and told me to fix issues within 30 minutes.

    This is Not Cool™.

  • (disco) in reply to Shoreline
    Shoreline:
    That seems as though it could be helped with moar hierarchical encapsulation.

    it could have been helped by [image]

  • (disco) in reply to LB_

    I am not surprised to see this LabVIEW bashing again, but I am disappointed, primarily because one of the main attributes I expect my programmers to have is the ability to assess things correctly and without hyperbole. I will grant that you at least recognize that the problem was probably that you were doing it incorrectly, not something inherent to LV itself.

    Here's a 4 button calculator in LV (with a keypad and LCD-green, although I didn't bother finding a 7seg font) and it took a lot less than a week:

    [image]

    Some comments:

    1. I never actually wrote one before and didn't even check the behavior before I started, so a lot of the time was actually spent going down a completely wrong track before I realized it.
    2. I can't record the screen at the moment, so you'll just have to believe me that it works and responds in real time and works correctly, including filtering multiple periods (or drag the snippet into LV and play with it yourself if you have LV).
    3. I didn't bother with order of operations, as can be seen from the incorrect result above, but that's usually the case with these.
    4. It's not what I would consider good or very clean code, but that wasn't the point. The point was to write it quickly and show that even so it doesn't have to be a mess.

    As I pointed out in a long thread here a few years back, it's certainly possible to write bad code in LV and many of the people using it have zero training in software engineering. It also tends to be very easy to see how terrible bad code can be just by glancing at it (which is actually a good thing, if you ask me).

    It is also possible to write good code in LV, but if your project has even a moderate level of complexity, you'd better know what you're doing if you don't want to end up with a rat's nest of code. In that respect, LV is not different from other languages (with the possible exception that more people are likely to get a project to this state and still have it more or less work).

  • (disco) in reply to Yen
    Yen:
    order of operations, as can be seen from the incorrect result above

    What did you expect to get from 14*2+36 ?
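
    For that particular expression the two readings agree, which is the joke; a quick Python sketch (`left_to_right` is a made-up helper, not anything from LV) of an expression where they wouldn't:

    ```python
    # Evaluate a flat token list strictly left to right, the way a
    # naive immediate-execution calculator does (no operator precedence).
    def left_to_right(tokens):
        ops = {'+': lambda a, b: a + b, '-': lambda a, b: a - b,
               '*': lambda a, b: a * b, '/': lambda a, b: a / b}
        acc = tokens[0]
        for op, val in zip(tokens[1::2], tokens[2::2]):
            acc = ops[op](acc, val)
        return acc

    print(left_to_right([14, '*', 2, '+', 36]))  # 64, same as precedence-aware 14*2+36
    print(left_to_right([36, '+', 14, '*', 2]))  # 100, but 36+14*2 is 64 with precedence
    ```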

  • (disco) in reply to Tsaukpaetra
    Tsaukpaetra:
    To be fair, LabView can indeed make a pretty good emulated virtual calculator, if you have enough time and energy to spend on it.

    A team of elephants, three miles of heavy rope, some giant wooden boxes and the entire population of the island of Apraphul could also make a pretty good calculator, but I wouldn't recommend it.

  • (disco) in reply to accalia
    accalia:
    Shoreline:
    That seems as though it could be helped with moar hierarchical encapsulation.

    it could have been helped by [image]

    Fire solves many problems.

  • (disco) in reply to abarker
    abarker:
    Fire solves many problems.

    and the ones that fire can't solve can be solved by: [image]

  • (disco) in reply to obeselymorbid

    Good point. I actually had a different calculation there which did show that it doesn't respect order of operations, but I must have cleared it before taking the screenshot.

  • (disco)

    LabVIEW isn't bad for simple stuff, like connecting to hardware instruments. It would be worlds better if they got rid of the bitmap interface and used a proper vector graphics interface that could zoom in and out.

    Most of our LabVIEW functionality is actually written as C code, then we create a VI wrapper to call the underlying C DLL. It's much quicker for programmer types than trying to implement everything as sub-VIs.

  • (disco) in reply to Yen
    Yen:
    I must have cleared it before taking the screenshot.
    QFT

    I'm under the impression that if you kept it in single-value (with hidden register) mode, nobody would assume it could do order of operations and all would be good in the world.


    Filed under: Why did Discourse put a spurious carriage-return in my selection-quote? It's clearly not there in the quoted post...

  • (disco) in reply to mott555
    mott555:
    It's much quicker for programmer types than trying to implement everything as sub-VIs.
    That touches on the point I'm trying to make. It might be quicker for *your* programmers to write the code in C, but there are people who feel the other way and can be much more productive in LV.

    This is the type of thing where it's hard to have proper info, but I have heard stories from people who said they had the opportunity to implement the same project in parallel with a C or .NET team and finished considerably ahead (I seem to remember something like 4 months to be done with the LV code vs. an 18-month estimate for the other team, which had a home team advantage, but wasn't done yet at the time I heard that particular story). Of course, these are anecdotal and lack detail, but they show you that the story is not what some people think it is.

    As for the graphics, yeah, that's one of the disadvantages of having a system which was created almost 30 years ago (for some sense of scale, the first version of LV came out three years before Prince of Persia) and is developed by a relatively small company.

    FWIW, I know that NI is aware of it and working on it and if you look at some of the more recent products, like the LV web UI builder or the LV communication system design suite, they do use vector graphics for both the UI and the diagram. Presumably that's on its way to making it into LV once all the pieces are in place.

  • (disco) in reply to Tsaukpaetra
    Tsaukpaetra:
    if you kept it in single-value (with hidden register) mode, nobody would assume it could do order of operations and all would be good in the world.

    Like I said, I never did one of these before; I just started coding with zero prep (and zero thought, evidently) and somehow assumed that when I pressed one of the operators, I would magically have two values to operate on. It was only after coding and running a sample op that I realized I didn't have two values.

    It was only at this point that I went and opened an actual calculator (the Windows 10 one) to play with it a bit, and since that one shows your entire sequence (probably also without respecting operator precedence), I just blindly copied that behavior (and inadvertently made the = button unneeded, because the calculations are being done with each click, at least in this quick implementation).

  • (disco) in reply to Yen

    No worries, just falling into Design Review mode. :stuck_out_tongue: AFAICT it does do OOO correctly (well, at least the Windows 7 version). [image] Note: Pressed Add instead of Equals to keep the intermediate result.

  • (disco) in reply to Tsaukpaetra

    Design? What's "design"? I do not know this thing you speak of.

    Actually, I'll throw the gauntlet down to anyone who's interested (maybe @LB_ who brought it up originally) - write a 4-function calculator in the language and IDE of your choice. It doesn't have to be functionally equivalent to the one I posted, but it does have to have a keypad and display and it does have to be able to do more than just two values. See how long it takes you (including planning) and how much code you end up with.

    In my case, I would say it probably took around 30 minutes, including detours and cleanup. I expect I could have shaved at least 10 minutes off if I actually stopped to think a bit ahead of time and probably more if I went with a simple design (I actually just started the simpler design now and convinced myself that I could finish it in about 5 minutes).

    The code you see in my example is basically everything. The only stuff that's not visible is the code for the C button, which just clears everything; the code for the digit keys, which adds the digit to the current value; and the code for the period, which does the same and makes sure there isn't more than one period.
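
    For anyone without LV who wants to compare, here's my rough guess at the same immediate-execution logic in plain Python (a sketch of the behavior described above, not a port of the diagram):

    ```python
    class Calculator:
        """Immediate-execution four-function calculator: each operator key
        applies the pending operation right away, so '=' is nearly redundant."""

        OPS = {'+': lambda a, b: a + b, '-': lambda a, b: a - b,
               '*': lambda a, b: a * b, '/': lambda a, b: a / b}

        def __init__(self):
            self.clear()

        def clear(self):                      # the C button
            self.acc = None                   # running result
            self.op = None                    # pending operator
            self.entry = ''                   # digits typed so far

        def digit(self, d):                   # digit keys
            self.entry += d

        def period(self):                     # filter multiple periods
            if '.' not in self.entry:
                self.entry += '.'

        def _commit(self):
            val = float(self.entry or '0')
            self.acc = val if self.op is None else self.OPS[self.op](self.acc, val)
            self.entry = ''

        def operator(self, op):               # + - * / : compute immediately
            self._commit()
            self.op = op

        def equals(self):
            self._commit()
            self.op = None
            return self.acc
    ```

    Keying 1 4 * 2 + 3 6 = gives 64.0, while 3 6 + 1 4 * 2 = gives 100.0, i.e. strictly left to right, no precedence.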

  • (disco) in reply to Yen

    Wasn't this the premise of OMGWTF1? Or am I thinking of something else?

  • (disco) in reply to Yen

    LabView made the whole GUI thing really easy because it was a WYSIWYG editor. It would take me a lot longer to make a GUI in any other language. The code part, however, would be really easy compared to LabView. By the way, two value support is all I had in the LabView implementation.

  • (disco) in reply to mott555

    Maybe something else? Sorry, Discosearch isn't being exactly helpful at all...


    Filed under: But that's Normal

  • (disco) in reply to mott555

    Yes, I remember that too, although I'm pretty sure the object there was to create a WTFy calculator (which I was actually kind of on the way to doing before I realized I was being an idiot, but not before spending a considerable part of those ~30 minutes on it).

  • (disco) in reply to LB_
    LB_:
    LabView made the whole GUI thing really easy because it was a WYSIWYG editor.

    WinForms MOCKS you.

  • (disco)

    One thing I haven't seen mentioned is that the data integration space has a lot of stuff in the (semi) graphical mode. Oracle Data Integrator, Informatica, and JDeveloper are products that I've used that operate in this manner. JDeveloper process overview: [image]

    JDeveloper assignment step detail: [image]

    Informatica workflow (which governs how to perform a specific set of tasks): [image]

    Informatica mapping (the detailed logic of linking source data to the target, and however you want to process it along the way): [image]

    Of course, like anything, it's possible to take it way too far. If your mapping overview looks like this, you may need to rethink your ~~life choices~~ design. (I'm pleased to say this particular one has been replaced.) [image]

    I don't have a working ODI repository and I can't be bothered looking for screenshots of it on the web, so hunt down your own pictures if you care. Basically it's similar to Informatica but a lot more awkward to work with (though there are some things it does better). The equivalent to Informatica workflows doesn't support parallel execution well (there are hacks you can do to get it); and the equivalent to mappings has very limited data transformation capabilities and only supports a single target, so a single non-trivial Informatica mapping is the equivalent of many ODI interfaces, which then have to be glued together.

    In all of these products you have a graphical expression of the main points, but you still use some sort of formula or code editor to do things that are more complex than just "copy a field from here to there". Using them allows you to work at a higher level of abstraction; in Informatica I can say "run these three queries, then combine the data in this way" without having to think about the mechanics of database connections, concurrency, etc. I just have to concentrate on the data flow and transformations, and Informatica handles the low-level stuff. I've been doing a lot of webservice stuff lately, and it's nice that I can just feed Informatica a WSDL and let it worry about all the details; it generates a Web Services Consumer transformation and I just have to shove data in one end and collect it at the other [1].

    Informatica does also have a Java transformation which allows you to execute arbitrary Java code on your data. It has some pretty neat ways of interacting with the native Informatica stuff. We've been using this a lot for report processing: the report webservice returns the entire report as a single base-64 encoded element, and we use the Java transformation to decode it, split it into data rows, and split the rows into fields. (We don't actually need the Java transformation to do most of that stuff, but we need it to work around some Informatica limitations, and once we're already using it it's easier just to do the rest of the splitting up in one place.)

    [1] Unless the WSDL comes from Oracle, in which case you're potentially in a world of pain. But that's a rant for another time.
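
    The decode-and-split part is nothing exotic, for what it's worth; a rough Python equivalent of what that Java transformation does (the pipe delimiter and field layout here are invented for illustration, not our actual report format):

    ```python
    import base64

    def report_to_rows(encoded, field_sep='|'):
        """Decode a base64-encoded report payload, split it into data rows,
        and split each row into fields."""
        text = base64.b64decode(encoded).decode('utf-8')
        return [line.split(field_sep)
                for line in text.splitlines() if line.strip()]

    payload = base64.b64encode(b'id|name|qty\n1|widget|3\n2|gadget|7')
    for row in report_to_rows(payload):
        print(row)  # ['id', 'name', 'qty'], then the two data rows
    ```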

  • (disco) in reply to Scarlet_Manuka
    Scarlet_Manuka:
    One thing I haven't seen mentioned is that the data integration space has a lot of stuff in the (semi) graphical mode. Oracle Data Integrator, Informatica, and JDeveloper are products that I've used that operate in this manner.

    I know of similar products for doing that sort of thing with scientific data. There are differences (e.g., the way you handle errors is utterly different between business processing and scientific processing) but there are also a lot of points where you can compare.

    Scarlet_Manuka:
    Unless the WSDL comes from Oracle, in which case you're potentially in a world of pain.

    You just wait until you start trying to interact with some service put up by a PhD student in another country. Or until you try to do almost anything with REST services (which are usually operated by people who think that you shouldn't document things or make them actually discoverable at all; “you can browse to the website and use our Flash applet!” :rage:).

    The good thing about these (semi-)graphical programming languages is that you can omit a vast amount of low-level crap and focus on what's actually going on with the data/information. That helps a lot when that's quite enough complexity in itself thankyouverymuch…

  • (disco) in reply to dkf
    dkf:
    The good thing about these (semi-)graphical programming languages is that you can omit a vast amount of low-level crap and focus on what's actually going on with the data/information. That helps a lot when that's quite enough complexity in itself thankyouverymuch…

    You don't need a graphical language to be able to skip most of the rigging in your business language - we reach the same goal with a text-based language. Basically each file defines a functional property, with a name, data storage type, target stream it must be included in (can be multiple or none) and a ruling which describes how to derive that property from other properties (including inputs). All the rigging, including reading inputs, writing outputs, determining property lookup order, and cross-record correlation steps, is handled by the backing C code.
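
    A toy version of the arrangement, to make it concrete (the names and structure are my own invention, not our actual format, and the engine here is Python rather than C):

    ```python
    # Each entry stands in for one rule file: a property name, its storage
    # type, the target streams it belongs to, and a ruling (derivation).
    PROPERTIES = {
        'gross': {'type': float, 'streams': [],
                  'rule': lambda p: p['qty'] * p['unit_price']},
        'net':   {'type': float, 'streams': ['billing'],
                  'rule': lambda p: p['gross'] - p['discount']},
    }

    def evaluate(record):
        """Generic 'rigging': resolve properties on demand, falling back
        to the raw input record for fields with no ruling."""
        cache = {}

        class Lookup:
            def __getitem__(self, name):
                if name not in cache:
                    if name in PROPERTIES:
                        spec = PROPERTIES[name]
                        cache[name] = spec['type'](spec['rule'](self))
                    else:
                        cache[name] = record[name]  # plain input field
                return cache[name]

        lookup = Lookup()
        return {name: lookup[name] for name in PROPERTIES}

    print(evaluate({'qty': 3, 'unit_price': 10.0, 'discount': 5.0}))
    # {'gross': 30.0, 'net': 25.0}
    ```

    The property files only declare *what* each value is; lookup order falls out of the on-demand resolution.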

  • (disco) in reply to PleegWat
    PleegWat:
    You don't need a graphical language

    That's true, but you try telling that to our users.

  • (disco) in reply to PleegWat
    PleegWat:
    You don't need a graphical language to be able to skip most of the rigging in your business language - we reach the same goal with a text-based language.
    You don't need a GUI for a lot of stuff either, you can use a text-based command line. But most people are more productive with the GUI, and I wouldn't be surprised if the same was true in this context.
  • (disco) in reply to Scarlet_Manuka
    Scarlet_Manuka:
    I wouldn't be surprised if the same was true in this context.

    It's utterly true.

    Some people can really master the sorts of abstract thought necessary to write computer programs using text. I suspect that includes most people on this site. We are not a majority. There's a larger set of people who handle complex programs with a sufficiently assistive GUI. They won't get the best out of the GUI, but they'll get their stuff done and have a pretty good experience of it.

    They're still not the majority. The majority will not ever program computers; they're just not good at organising their thoughts sufficiently to be able to describe to a highly logical system what they actually want to do. Or rather by the time we've figured out how to let these people write programs, we've probably also cracked strong AI since the computer's going to have to do one hell of a lot of interpolation, guessing and outright lying in order to get stuff done.

  • (disco) in reply to dkf
    dkf:
    There's a larger set of people who handle complex programs with a sufficiently assistive GUI.

    I wonder if it's really the case that it's a larger set for LabVIEW and other graphical languages. It certainly seems like conventional wisdom that this is easier to learn and use than C-like languages, but I have seen many people who use LV badly and not an insignificant number of those had experience with other languages (although, to be fair, it's possible they also wrote bad code there too), so it's certainly not a pyramid of Real Programmers > C > .NET > Python > Those people who program with Paint > Civilians, but rather the different capabilities of each person.

    If I was feeling snarky I might categorize these two groups as "linear thinkers" and "visual thinkers", but I don't think there's anything wrong with not grokking graphical programming or even with not liking it, and I certainly don't think text based programming is bad, so I won't.

  • (disco) in reply to Scarlet_Manuka
    Scarlet_Manuka:
    PleegWat:
    You don't need a graphical language to be able to skip most of the rigging in your business language - we reach the same goal with a text-based language.
    You don't need a GUI for a lot of stuff either, you can use a text-based command line. But most people are more productive with the GUI, and I wouldn't be surprised if the same was true in this context.

    Probably. But we've got dedicated developers working on the business rules side; we're not shipping the language as a product. I do believe there's some graphviz-based dependency visualization floating around somewhere.

    I was mainly intending to give a counterexample to the idea that this separation of concerns is a property of graphical languages.

  • (disco) in reply to Yen
    Yen:
    I have seen many people who use LV badly

    Sure, but are they happy with their graphical lashed-together creaking pile of shit? If they're better off than they were before, you're still arguably making the world a bit better place since you're giving some folks a bit more control over their destinies.

    I don't want to have to ever fix any of their code. :smile:

  • (disco) in reply to dkf
    dkf:
    Yen:
    I have seen many people who use LV badly

    Sure, but are they happy with their graphical lashed-together creaking pile of shit?

    Some are perfectly happy with it and some (those "linear thinkers" I did not refer to) are "WTF is this language? This is terrible" (and there's no shortage of examples of those types of people, although not so much on this thread). My point is always that the code produced by these people is bad because they produced bad code (if only because they don't get the paradigm and techniques), not because you can't write good code in LV.

    My opinion as to whether or not it's a good thing that people with no programming background get to use it to produce more or less functional code isn't really relevant, as that's the situation regardless of what I think, so I don't think about it much. As you said, as long as I don't have to deal with it, I don't care. In the cases where I did deal with other people's code, sometimes it's bad, sometimes it's decent. It's unusual to have to deal with good code, because that tends to stay with whoever wrote it.

  • (disco) in reply to Yen
    Yen:
    My point is always that the code produced by these people is bad because they produced bad code (if only because they don't get the paradigm and techniques), not because you can't write good code in LV.

    And my point is about people in the hinterlands of programming. They're intelligent enough and might be able to learn if they really applied themselves to it, but they've got other stuff to do. Appropriate tools can help these people do things that are useful for them; learning a true programming language would help them more, but would require a much greater investment of time (particularly as it would require them to learn quite a few more conceptual frameworks; that's the sort of thing that doesn't sink in overnight).

  • (disco) in reply to dkf
    dkf:
    Yen:
    My point is always that the code produced by these people is bad because they produced bad code (if only because they don't get the paradigm and techniques), not because you can't write good code in LV.

    They're intelligent enough and might be able to learn if they really applied themselves to it, but they've got other stuff to do.

    By "these people" I was referring mainly to people who do know other languages and still manage to create bad code in LV. It's part of the same "Real Programmers" mentality which is too common IMO, and which caused you to refer to a "true" language, as if LV isn't one (which it is; I find it ironic how "these people" tend to also miss out on the interesting CS aspects of it).

    Different languages and IDEs have different strengths and weaknesses, and LV is no exception to that rule, but it's important to always remember that these are just tools and you pick the one which you prefer to work with for any given job (unless, of course, your choice of tools is limited, as is the case with the non-programmer types you were talking about, and then you just use a hammer, because hammers are brilliant, of course you would use one).

  • (disco) in reply to Yen
    Yen:
    Different languages and IDEs have different strengths and weaknesses, and LV is no exception to that rule, but it's important to always remember that these are just tools and you pick the one which you prefer to work with for any given job

    I have seen LV mostly in EE schools, because NI, which makes the lab tools, wants to make an extra buck by selling this crap. You get this fancy board with 4 radio channels, only you have to run the examples that are in LV. This mentality is carried on, so EE grads who are not lucky enough to see linear programming (as you call it) crystallize and carry the madness on to their jobs in new companies.

    dkf:
    learning a true programming language would help them more, but would require a much greater investment of time
    Not necessarily; LV comes with lists and queues and many toolboxes to do all sorts of complex algorithms that take months to master. It is just more laborious to work with.
    Yen:
    My opinion as to whether or not it's a good thing that people with no programming background get to use it to produce more or less functional code isn't really relevant

    People with any programming background hated it so much that NI had to come up with m-blocks (blocks to run MATLAB), because it is insane to wire up so many multiplier-adder blocks for a simple filter, only to realize you have to restructure the blocks again because you want to implement it in direct-form-II.
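
    For comparison, the same structure in text is a handful of lines; a direct-form-II biquad section sketched in Python (the coefficients below are arbitrary placeholders, not a designed filter):

    ```python
    def df2_biquad(x, b, a):
        """Direct-form-II biquad: one shared delay line w[n] instead of
        separate delays on the input and output sides (a0 assumed 1)."""
        b0, b1, b2 = b
        a1, a2 = a
        w1 = w2 = 0.0
        y = []
        for xn in x:
            wn = xn - a1 * w1 - a2 * w2            # feedback half
            y.append(b0 * wn + b1 * w1 + b2 * w2)  # feedforward half
            w1, w2 = wn, w1
        return y

    # Sanity checks: b=(1,0,0), a=(0,0) is the identity; b=(0,1,0) delays by one sample.
    print(df2_biquad([1.0, 2.0, 3.0], b=(1, 0, 0), a=(0, 0)))  # [1.0, 2.0, 3.0]
    print(df2_biquad([1.0, 2.0, 3.0], b=(0, 1, 0), a=(0, 0)))  # [0.0, 1.0, 2.0]
    ```

    Restructuring from direct-form-I here means editing two lines, not rewiring a diagram.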

  • (disco) in reply to Tsaukpaetra
    Tsaukpaetra:
    Why did Discourse put a spurious carriage-return in my selection-quote? It's clearly not there in the quoted post...

    This is just to say

    I have reflowed the words that were in your selection

    and which you were probably saving for quoting

    Forgive me you were doing it so wrong and so bold

  • (disco) in reply to dse
    dse:
    LV comes with lists and queues and many toolboxes to do all sorts of complex algorithms that take months to master

    But the people I'm thinking about aren't necessarily using that advanced stuff. They're getting their shit done despite doing it in a way that in our eyes barely qualifies for that name.

  • (disco) in reply to dkf
    dkf:
    They're getting their shit done despite doing it in a way that in our eyes barely qualifies for that name.
    What are you talking about? I'm sure we'd all be happy to concur that it easily qualifies as shit.
  • (disco) in reply to dse
    dse:
    I have seen LV mostly in EE schools

    I haven't, but I have no experience with EE schools, so that would explain why. You presumably have less contact with other areas where it's used, which is why you saw it there. It's used in different places.

    dse:
    Yen:
    Different languages and IDEs have different strengths and weaknesses, and LV is no exception to that rule, but it's important to always remember that these are just tools and you pick the one which you prefer to work with for any given job
    ...because NI makes the lab tools wants to make an extra buck by selling this crap. ...and continue the madness to their jobs in new companies. ...It is just more laborious to work with. ...People with any programming background hated it so much NI had to come up with m-blocks (blocks to run MATLAB) because it is insane to sort up so many multiplier-adder blocks...

    And there it is. That right there is the type of mentality I'm talking about (although you at least do seem to have some experience with it). Touching on just the last point, I have no experience with the Mathscript nodes in LV and don't know the exact background that led to their development, but I think it illustrates my point - if M code is a better tool for math, then it's a good thing that LV allows you to incorporate the scripts directly into your code - it allows you to select the appropriate tool for you.

    My point was never that LV is the best language that ever was and ever will be, but rather that it's a perfectly legitimate tool with the same trade-offs that other tools have and that if you write bad code, that's mostly on you. I know many people with a programming background (myself included) who don't hate it, so there.

  • (disco) in reply to Yen
    Yen:
    You presumably have less contact with other areas where it's used, which is why you saw it there.
    I had to use it extensively in EE school, but later saw it also in bio engineering. I am very curious, which other area are you talking about?
    Yen:
    That right there is the type of mentality I'm talking about (although you at least do seem to have some experience with it).

    Agreed, I hate it so much I do not mention it in my resume, not to give wrong ideas. I do not want to touch it even if my livelihood depends on it.

    As I said before, LabView should die in a Fiendfyre; then your soul will be free and you go to the next level, unless you are NI marketing or an application field engineer :wink:

    Yen:
    My point was never that LV is the best language that ever was and ever will be, but rather that it's a perfectly legitimate tool with the same trade-offs that other tools have and that if you write bad code, that's mostly on you.
    It is an interesting idea gone too far.
  • (disco) in reply to dse
    dse:
    Yen:
    You presumably have less contact with other areas where it's used, which is why you saw it there.
    I had to use it extensively in EE school, but later saw it also in bio engineering. I am very curious, which other area are you talking about?

    I'll tell you after you tell me which areas C or Python are used in. Seriously, it's used across the board in very different fields and industries. In some (like testing) it's popular. In others (like pure software desktop apps) it's rare. The poster children people like to put up are some of the control and data management systems for the Large Hadron Collider at CERN and a bunch of stuff at SpaceX, such as at least some of the stuff in their control center, but that's probably because it's both public and "cool".

    Frankly, I wouldn't be able to tell you anything about most of the applications, because they're completely not my field. Here's a fairly concise list of areas, but it's NI marketing, so you can treat it as such - http://www.ni.com/solutions/

    dse:
    Agreed, I hate it so much I do not mention it in my resume, not to give wrong ideas. I do not want to touch it even if my livelihood depends on it.

    I understand that you hate it. I'm just trying to understand why. I have used quite a few tools in my life. I didn't like all of them. I thought some were badly made. There were certainly some I would not want to use again. I don't think I hate any of them. I recognize that for some of them I probably just failed to learn how to use them in a productive manner.

    If I take your example of the Mathscript node, I don't use a lot of math in my code, but I can certainly understand how someone would find needing to string together multiple primitives to be frustrating. Even with that understanding, I don't see how that leads to "I hate this language" any more than I see how the fact that C requires you to pass pointers into functions if you want multiple return values makes it "a terrible language".
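
    For readers less familiar with C, the pointer-based "multiple return values" pattern mentioned above looks roughly like this (a minimal sketch; the `divmod`-style function is hypothetical, not from any library):

    ```c
    #include <stdio.h>

    /* C functions return one value, so extra outputs are passed back
       through pointer parameters supplied by the caller. */
    static void divmod(int a, int b, int *quot, int *rem) {
        *quot = a / b;
        *rem  = a % b;
    }

    int main(void) {
        int q, r;
        divmod(17, 5, &q, &r);   /* q and r are "returned" via pointers */
        printf("%d %d\n", q, r); /* prints: 3 2 */
        return 0;
    }
    ```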

    Is your problem that code was hard to write? Hard to read? What?

  • (disco) in reply to Yen
    Yen:
    I'll tell you after you tell me which areas C or Python are used in

    Well, I use C everywhere. And I have used both C and Python in GNU Radio just fine; there was no need for LV. I would not be surprised if LV is compiled internally into some sort of proper language (I would do that if I wanted to make that crap). I even managed to use C with NI devices after jumping through hoops and linking to their libraries.

    Yen:
    In some (like testing) it's popular.
    Yes, mostly because of NI boards that come with LV examples. NI basically does most of the common tasks, and the test engineer can do his job with minimal changes. They do not write the original LV; they are just customers.
    Yen:
    In others (like pure software desktop apps) it's rare.
    I have seen it in electrical plant automation, and in one chemical factory as a desktop application!
    Yen:
    because it's both public and "cool"
    It is interesting to show to a control/chemical engineer who knows blocks and thus tries to translate things into block diagrams. It is still interesting for me only if some other poor soul has done it, but it has gone too far into a pure-block-diagram way of thinking. Why not have it both ways? Like a block diagram that translates to code and back to a diagram again; that would be "cool". But I guess NI has spent too much effort on the lock-in to let you escape it, and source code would bring in open and free variants.
    Yen:
    Is your problem that code was hard to write? Hard to read? What?
    It is laborious even for simple tasks. It would be understandable if something complex required time and effort, but something as simple as a for-loop should not take so much time and so many clicks.
    Yen:
    I understand that you hate it. I'm just trying to understand why.

    Let me explain my exact experience: usually the problems I work on are not "implement this algorithm based on this variant published in this paper and do not do anything else". I do not know about others, but for me programming is an evolving exercise. I do not write the flow chart and then faithfully reproduce it. I start with some dumb for-loop, then may change it to a while-loop two diffs later, then change it altogether and make it a lambda expression. With LV you have to have a fixed design plan (yes, like what you may expect for the safety-critical parts of CERN or a chemical plant) in front of you. That is not fun; there is no room to explore or let the software evolve. My style of programming is painful with LV.

  • (disco) in reply to dse
    dse:
    Yen:
    I'll tell you after you tell me which areas C or Python are used in

    Well, I use C everywhere. And I have used both C and Python in GNU Radio just fine; there was no need for LV.

    Of course there was no need. There's no need for C either. You choose a tool which can do the job. My point was that just like C is used widely, so is LV.

    dse:
    I would not be surprised if LV is compiled internally into some sort of proper language

    I have no idea what you mean by that. LV code is compiled to machine code (which a CPU considers proper, I guess). This compilation is done at edit time, which gives you various advantages, like being able to press run and have the program running a second later. I happen to know that in recent versions NI transitioned to using an open source compiler called LLVM for the actual code generation, but that's trivia, as far as I'm concerned, as it's behind the scenes. For all I care LV code could be interpreted, as long as it works.

    dse:
    I even managed to use C with NI devices after jumping hoops and linking to their libraries.

    Don't care about NI hardware. I don't even use it that much. I'm not sure why working with their hardware would be more difficult in C than working with any other hardware, but I have no experience with that.

    dse:
    Yen:
    In some (like testing) it's popular.
    Yes, mostly because of NI boards that come with LV examples. NI basically does most of the common tasks, and the test engineer can do his job with minimal changes. They do not write the original LV; they are just customers.
    Yen:
    In others (like pure software desktop apps) it's rare.
    I have seen it in electrical plant automation, and in one chemical factory as a desktop application!

    Unlike you, I can't claim that I know why it's popular in that area, as I don't have much experience there. Maybe it's history, maybe it's their hardware, maybe LV is more suitable for it, maybe something else. Maybe it's not even as popular as I think it is.

    I have seen it used in many places as desktop applications, because some of the stuff we write is desktop apps. The point is that it's used in different ways by different people, like any language.

    dse:
    Yen:
    because it's both public and "cool"
    It is interesting to show to a control/chemical engineer who knows blocks and thus tries to translate things into block diagrams. It is still interesting for me only if some other poor soul has done it, but it has gone too far into a pure-block-diagram way of thinking. Why not have it both ways? Like a block diagram that translates to code and back to a diagram again; that would be "cool". But I guess NI has spent too much effort on the lock-in to let you escape it, and source code would bring in open and free variants.

    I have absolutely no idea what you mean by that. My point was that people commonly use the LHC and SpaceX as examples of where LV is used because people can relate to them and they sound impressive.

    dse:
    Yen:
    Is your problem that code was hard to write? Hard to read? What?
    It is laborious even for simple tasks. It would be understandable if something complex required time and effort, but something as simple as a for-loop should not take so much time and so many clicks.

    I can place a fully functional for loop in my code in about a second or two. I can easily set it to iterate over multiple arrays or run in parallel. Doesn't seem hard to me. In fact, I would say it's mostly easier than C. I can replace it with a while loop in somewhere between 2 and 5 seconds.

    dse:
    Yen:
    I understand that you hate it. I'm just trying to understand why.

    Let me explain my exact experience: usually the problems I work on are not "implement this algorithm based on this variant published in this paper and do not do anything else". I do not know about others, but for me programming is an evolving exercise. I do not write the flow chart and then faithfully reproduce it. I start with some dumb for-loop, then may change it to a while-loop two diffs later, then change it altogether and make it a lambda expression. With LV you have to have a fixed design plan (yes, like what you may expect for the safety-critical parts of CERN or a chemical plant) in front of you. That is not fun; there is no room to explore or let the software evolve. My style of programming is painful with LV.

    I'll grant that modifying code can sometimes be harder in LV, particularly if you want to cut and paste whole blocks in or if your code is bad, but I find it very easy to develop code, play around with it, run it, test it, modify it as needed, add code, remove code, etc.

    My calculator example from above is a decent example of this - I started with zero planning and only realized my error after I wrote some code and ran it. I then had to modify the code to get it to actually work. The reason I could do that is that in some ways LV actually encourages you to write code without planning. That's why you end up with some of the terrible examples I like so much - no planning and no structure.

    I don't know why your experience is different, but I would certainly not agree about the design needing to be fixed or LV not being fun. I find that it's quite the opposite - it's more fun, not less.

  • (disco) in reply to dse
    dse:
    People with any programming background hated it so much that NI had to come up with m-blocks (blocks to run MATLAB), because it is insane to wire up so many multiplier-adder blocks for a simple filter, only to realize you have to restructure the blocks again because you want to implement it in direct-form II.

    Just remembered something interesting about this. Here's a link to a utility written a few years ago by a very capable user of LV - https://decibel.ni.com/content/docs/DOC-13859 . There is a full description of it there, as well as links to demonstration videos at the end of the document. This rightly won first place in the example contest it was submitted to, as it was easily the most interesting entry.

    This utility allows you to take math input from multiple sources (the Windows math input control, typed LaTeX, MathML, embedded equations from Wikipedia, etc.) and convert it to LV code. The most obvious "wow" feature he added was the ability to display the equation in the code as it would be typeset:

    https://decibel.ni.com/content/servlet/JiveServlet/downloadImage/102-13859-9-15368/620-256/Math+Node+VI.png

    Now do that in C.
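
    As an aside on the direct-form II point from the quoted comment earlier in the thread: in text code, the same multiplier-adder structure that takes many graphical blocks is a few lines, and switching filter topologies is an edit rather than a rewiring. A minimal sketch in C (the coefficients are illustrative, not from any real filter design):

    ```c
    #include <stdio.h>

    /* One direct-form II biquad section:
       w[n] = x[n] - a1*w[n-1] - a2*w[n-2]
       y[n] = b0*w[n] + b1*w[n-1] + b2*w[n-2] */
    typedef struct { double b0, b1, b2, a1, a2, w1, w2; } biquad;

    static double biquad_step(biquad *f, double x) {
        double w = x - f->a1 * f->w1 - f->a2 * f->w2;          /* feedback path */
        double y = f->b0 * w + f->b1 * f->w1 + f->b2 * f->w2;  /* feedforward path */
        f->w2 = f->w1;                                         /* shift delay line */
        f->w1 = w;
        return y;
    }

    int main(void) {
        /* Two-tap average as a degenerate biquad: y[n] = 0.5*x[n] + 0.5*x[n-1] */
        biquad f = { .b0 = 0.5, .b1 = 0.5, .b2 = 0.0,
                     .a1 = 0.0, .a2 = 0.0, .w1 = 0.0, .w2 = 0.0 };
        double y0 = biquad_step(&f, 1.0);  /* impulse in */
        double y1 = biquad_step(&f, 0.0);
        double y2 = biquad_step(&f, 0.0);
        printf("%.1f %.1f %.1f\n", y0, y1, y2);  /* prints: 0.5 0.5 0.0 */
        return 0;
    }
    ```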

  • (disco) in reply to Yen
    Yen:
    Now do that in C.
    Which is likely what LV is implemented in.
  • (disco) in reply to Yen
    Yen:
    This utility allows you to take math input from multiple sources (the Windows math input control, typed LaTeX, MathML, embedded equations from Wikipedia, etc.) and convert it to LV code.

    okay, now open up that control's vi. I want to see how it's built. because i expect i have some image memes that would apply

    Yen:
    Now do that in C.

    Can i do it? absolutely.

    will i? nah. but i'll knock you up something that'll do that in C# if you insist; it'll waste less of my time that way.

Leave a comment on “Dual Helix”

