• trtrwtf (unregistered) in reply to 3Doubloons
    3Doubloons:

    Somewhat related, my old boss had a ban on freeware that included open source. The reasoning was that if you didn't buy it, you had no guarantee that the program would be maintained and that you could get the changes you needed (in his mind, if you paid for something, you had the right to pester the developer to add the features you wanted).

    Same thing happens where I am. They'd rather spend $30,000 on a system than use an equivalent open-source product. Then they spend another wad of cash on a support contract. The face, the palm, the me shutting up and getting back to work.

  • ¯\(°_o)/¯ I DUNNO LOL (unregistered) in reply to SamC
    SamC:
    Thankfully, it was only needed for a few minutes, and we never spoke of said setup again.
    Rule number one of A/V Club is don't talk about A/V Club. Rule number two of A/V Club is don't talk about A/V Club.
  • OldPeter (unregistered)

    Nobody yet mentioned the parallel to the Wooden Table Approach. And voila, I seem to recognize a wooden deck involved in the setup!

  • (cs) in reply to Foo
    Foo:
    Don:
    “We’re still running Windows XP here,” Walter replied. “It needs at least Windows 7.”

    “And we can’t upgrade because of our legacy plant software,” she replied for the umpteenth time.

    VMWare vPlayer? XP Mode?

    At least a dozen solutions for that if XP vs 7 is the only limitation. Or hell, put in a requirement for everything including the plant software to be upgraded so as to ensure security...

    Cant manipulate most hardware (such as PCI cards) from within Virtual Machines....That's why they are virtual...

    There are ways, but they are ALOT of work...

    VMware Player probably (!) isn't going to allow a hardware interface to the plant equipment via a custom and overpriced PCI board or whatever.

  • Herp (unregistered)

    TRWTF is not turning that meeting into the exit interview immediately upon being threatened with termination.

  • sh oe (unregistered) in reply to Tim Phillips
    Tim Phillips:
    tldr:
    Although I do not understand the reasoning why being open source would make a program less secure.
    About all I can come up with is there is nobody to sue if you get hacked.
    I think people who don't understand open source think that only evil geniuses have anything to do with it, and that because it's open source everyone will add their own little malicious stuff in there. Of course they forget that (sort of like Wikipedia), for any evil geeks out there who want to put in bad things (and have sufficient access/trust to even be working on the source of any of those projects), there are many, many good geeks questioning every line....

    I sort of also think that some people think that Open Source means anyone can modify the copy you have or something. Have to keep in mind 2 things about managers:

    1. They're often not overly technical (which is why they're in management positions, not technical positions)
    2. They fear stuff that is new because it's untried - Open Source (to them at least) is still a very new way of dealing with stuff. Plus making stuff public must make it less secure, right?
  • Paolo (unregistered)

    there's a lot of "why do IT Sec people always think Open Source is (more) insecure (than proprietary software)?"

    The answer is simple - IT Security people are too often hired for their ability to come up with complex password rules, and experience shutting down massive systems in an attempt to keep everyone out. They are (generally) NON-TECHNICAL. They are good at POLICY and that is all.

    This is how IT Security decisions on Open Source are made: We need a policy on open source. I once read an article on some random blog about the dangers of some Open Source product. This had a big discussion in some forum I once visited, and although I didn't actually read all of it, skimming over it I noticed there were people commenting who claimed years of experience in security.
    Some of them seemed to endorse the idea that some elements of Open Source might be problematic. Open Source is evil, so our policy will say it's not allowed. At All.

    QED.

  • Meep (unregistered) in reply to Someone
    Someone:
    You have a week to install a closed source replacement with the same protection
    How could she know? It's closed source, so you can only guess the protection level.

    Oh, right, because she's going to read the source code of the open source software and determine how secure it is.

    She should have just made an S-corp, downloaded the software and sold it back to the company.

  • Meep (unregistered) in reply to eViLegion
    eViLegion:
    tldr:
    Although I do not understand the reasoning why being open source would make a program less secure.

    That's because you understand that proper security software relies upon mathematically and computationally difficult problems, and not on "you just haven't figured out my convoluted secret yet, and I'm not telling you what it is".

    All security is through obscurity. A key is literally a secret, typically not so convoluted as poorly written code, but there's no mathematical reason why a crypto algorithm couldn't be based on poetry or interpretive dance.

    And whether or not my security systems are well written, it's still additional work for an attacker to reverse engineer them over simply reading the source. And in a non-trivial system, the security features will have weaknesses and you are somewhat better off if you can avoid advertising them.

    That said... popular open source projects do benefit from third parties reviewing their code, and though it doesn't happen nearly as often as people think, in practice that process seems to outweigh the benefits of hiding the source. And even if you're not a security professional, if you can read source, a quick perusal can tell you if it's just complete crap, and if it doesn't seem to be buggy as shit, it's probably not too, too bad.

  • David (unregistered) in reply to Meep

    For six years, closed-source Borland Interbase* had a backdoor account in it. It took six months to find it after they threw the code into open source. If no blackhat decompiled the code and found «if account = "bob" and password = "bob" then valid_account_found();», it's because none of them really tried.

    * http://lwn.net/2001/0118/security.php3
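
    A purely illustrative sketch in Python (not the actual Interbase code; the "bob"/"bob" pair above is presumably the commenter's stand-in) of the kind of hardcoded backdoor check being described. Once the source is published, a single grep for the string literal finds it; in a shipped binary it only hides until someone inspects the strings table or bothers to decompile it:

        # Hypothetical example of a hardcoded backdoor check, for illustration only.
        def authenticate(account: str, password: str, user_db: dict) -> bool:
            # The backdoor: a literal account/password pair baked into the code.
            if account == "bob" and password == "bob":
                return True
            # The legitimate path: look the account up and compare credentials.
            stored = user_db.get(account)
            return stored is not None and stored == password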
  • (cs) in reply to 3Doubloons
    3Doubloons:
    Somewhat related, my old boss had a ban on freeware that included open source. The reasoning was that if you didn't buy it, you had no guarantee that the program would be maintained

    Ugh. Been there. Done that. Only in our case, it wasn't the boss. It wasn't even someone with authority to do anything. It was just some idiot from another campus kicking up a stink and accusing us of having pirated Linux (there was no invoice, therefore no license; that was his logic).

    We decided to ignore the idiot and keep using it. Pretty sure our boss went to his boss over it too and told him to never contact our campus directly again... I certainly never heard from them again, anyway.

  • trololo (unregistered)

    I've seen plenty of WTFs in my life, but this, sir, this is ...

    THE MOTHER OF ALL WTF's

  • (cs) in reply to Meep

    Yes, on a very technical level you are right: it's all obscurity.

    Yes a key is literally a secret, but it's pretty fatuous to consider the obscurity of a key in a proper security system to be equivalent to the obscurity of some convoluted (and probably badly conceived and written) code.

    Some (i.e. almost all proper) security software relies on mathematically intractable problems which are WELL KNOWN. Their security works despite this.

    The algorithms are NOT obscure.

    Knowing anything about those algorithms doesn't actually help you crack them (unless mistakes were made in implementation).

    Sure, the KEY is obscure, but it is simple arbitrary data... there is no rhyme or reason to a key, it simply is what it is, and crucially it still works within a publicly documented system.

    That kind of obscurity is in no way similar to having a system of just semi-random code that is confusingly difficult to navigate (but not mathematically intractable). If you get hold of the code (which may or may not be easy), such a system becomes trivial to crack (though a little time consuming to navigate all the spaghetti).

    So, sure... the key is obscure, but you simply cannot compare the two systems and say "oh, they're both just security via obscurity" because to do so is to willfully misrepresent the critical differences between the technologies.
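
    A minimal sketch of that distinction, using Python's standard hmac and hashlib modules (the key value here is only a placeholder): the algorithm, HMAC-SHA256, is publicly documented, and publishing this source gives an attacker nothing useful, because the only secret is the key.

        import hmac
        import hashlib

        # The algorithm is public; only the key is secret. Revealing this source
        # does not help an attacker forge tags without the key.
        SECRET_KEY = b"replace-with-a-randomly-generated-key"  # placeholder for the sketch

        def sign(message: bytes) -> str:
            # Produce an authentication tag for the message under the secret key.
            return hmac.new(SECRET_KEY, message, hashlib.sha256).hexdigest()

        def verify(message: bytes, tag: str) -> bool:
            # Constant-time comparison; forging a valid tag requires the key, not the source.
            return hmac.compare_digest(sign(message), tag)

    Contrast that with a system whose only protection is that nobody has read the convoluted code yet: there, leaking the source is leaking the secret.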

  • drake (unregistered) in reply to jay
    jay:
    Please disregard the ppppppppppppppppppprrrrrrrrr on my last post. I was scraping some piece of random gunk off my keyboard. :-)

    I thought you were just farting, personally. Rude, but hey, it's natural; shit happens.

  • ForFoxSake (unregistered)

    Fake, joke, blaah and booo

  • (cs) in reply to TheCPUWizard
    TheCPUWizard:

    Cant manipulate most hardware (such as PCI cards) from within Virtual Machines....That's why they are virtual...

    There are ways, but they are ALOT of work...

    You meant "a lot" Fool Akismet!
  • Your Name (unregistered) in reply to trtrwtf
    trtrwtf:
    Same thing happens where I am. They'd rather spend $30,000 on a system than use an equivalent open-source product. Then they spend another wad of cash on a support contract. The face, the palm, the me shutting up and getting back to work.

    And continuing to be horridly underpaid. I worked at a company that loved buying overpriced toys but not paying its employees. It was fun for a few months having a $4000 workstation on my desk for simple Windows programming work. Soon I realized that I would be much happier making a real salary.

  • Cogo the Barbarian (unregistered) in reply to eViLegion
    eViLegion:
    If you get hold of the code (which may or may not be easy), such a system becomes trivial to crack.

    And if you get hold of the key, your cryptographic system becomes trivial to crack, too.

    I'm not saying there aren't good reasons which support Kerckhoffs' Principle, but that ain't one of them.

  • Roby McAndrew (unregistered) in reply to drobnox
    drobnox:
    TheCPUWizard:

    Cant manipulate most hardware (such as PCI cards) from within Virtual Machines....That's why they are virtual...

    There are ways, but they are ALOT of work...

    You meant "a lot" Fool Akismet!

    Or he's a manager and he meant "allot"

  • jay (unregistered) in reply to eViLegion
    eViLegion:
    Yes, on a very technical level you are right: it's all obscurity.

    Yes a key is literally a secret, but it's pretty fatuous to consider the obscurity of a key in a proper security system to be equivalent to the obscurity of some convoluted (and probably badly conceived and written) code.

    Well, I think the real difference is this: The same security software is used by many, many people. If someone buys (or steals) a copy of the software and finds a security hole, he can then exploit that hole on any computer he can reach that is running the same software.

    A password or private key is known only to you. The whole point of a password is that you DON'T share it with anyone else. So if someone discovers the password on computer A, that doesn't help them break into computer B.

    Some (i.e. almost all proper) security software relies on mathematically intractable problems which are WELL KNOWN. Their security works despite this.

    The algorithms are NOT obscure.

    Knowing anything about those algorithms doesn't actually help you crack them (unless mistakes were made in implementation).

    Well, that's the goal. Whether a particular algorithm achieves that goal is another question. You may think that your algorithm is secure and unbreakable, but then someone figures out a way to beat it.

  • Yanis Hristov (unregistered)

    Hmmm, I sent this picture without this novel. The real story behind it is that an engineer wanted to find a fast glitch in one of the displayed parameters, but the logs are only recorded every second. So, with the help of the camera (recording at 25 frames per second from a 60 Hz monitor), he was trying to capture it.

  • Kasper (unregistered) in reply to jay
    jay:
    Please disregard the ppppppppppppppppppprrrrrrrrr on my last post. I was scraping some piece of random gunk off my keyboard. :-)
    If only that was a featured comment.
  • urza9814 (unregistered) in reply to David
    David:
    For six years, closed-source Borland Interbase* had a backdoor account in it. It took six months to find it after they threw the code into open source. If no blackhat decompiled the code and found «if account = "bob" and password = "bob" then valid_account_found();», it's because none of them really tried.
    • http://lwn.net/2001/0118/security.php3

    You don't necessarily know that nobody found it for six years; only that nobody reported it....

  • The Crunger (unregistered) in reply to Paolo
    Paolo:
    there's a lot of "why do IT Sec people always think Open Source is (more) insecure (than proprietary software)?"

    The answer is simple - IT Security people are too often hired for their ability to come up with complex password rules, and experience shutting down massive systems in an attempt to keep everyone out. They are (generally) NON-TECHNICAL. They are good at POLICY and that is all.

    "These aren't the drooling idiots you're looking for. We can move along."

    Let's think of some reasons the optimistic view of Open Source might not be the best choice:

    • If people could die because of decisions you make
    • If your industry is already under constant attack from people who would love to do for you what Stuxnet did for Iran
    • If you could go to jail for failing to show an appropriate standard of caution

    Having an OSL does not make code more secure. Yes, people can see security holes, but they may not care, or actually may hope to create and/or exploit them. Some "Open Source" projects have little or no barriers to commit, and the quality really shows. Some would probably need to be completely rewritten to reach the Microsoft (i.e. bare minimum) level of security.

    Then there's another scenario. Say your vendor uses Open Source to reduce costs, and for their next release pulls in something with some stealth GPL code. No one notices until your system is deployed, and then some Assange-iot sues your vendor for specific performance to expose the entire secure-through-obscure code base for all your competitors and enemies to see.

    So, when one of our business units decided to eliminate all Open Source from their product, it was one of many sound business decisions, because the day was coming where no one in their industry was going to buy anything with these unpredictable liabilities.

    So, yes, be amazed at how foolish those security droids are, as you light your natural gas stove, and the next time you drive over a towering bridge.

  • Catprog (unregistered) in reply to The Crunger
    The Crunger:
    Then there's another scenario. Say your vendor uses Open Source to reduce costs, and for their next release pulls in something with some stealth GPL code. No one notices until your system is deployed, and then some Assange-iot sues your vendor for specific performance to expose the entire secure-through-obscure code base for all your competitors and enemies to see.

    Say your vendor takes some code it does not have the rights to, say some Windows code they reverse engineered. The exact same situation will occur.

  • The Crunger (unregistered) in reply to Catprog
    The Crunger:
    Then there's another scenario. Say your vendor uses Open Source to reduce costs, and for their next release pulls in something with some stealth GPL code. No one notices until your system is deployed, and then some Assange-iot sues your vendor for specific performance to expose the entire secure-through-obscure code base for all your competitors and enemies to see.
    Catprog:
    Say your vendor takes some code it does not have the rights to, say some Windows code they reverse engineered. The exact opposite situation will occur.
    FTFY

    If your vendor infringes upon a closed source copyright, the last thing the owner will ask for is public disclosure of the offending source code.

    If your vendor infringes upon an open source copyright, this is the first thing that will be demanded. And, if the vendor has certified "no open source", they have now defrauded their customers, too.

    That, in a nutshell, is the difference between Open and Closed source.

  • craftworkgames (unregistered)

    I'm in favor of using open source software for security BUT I can see a reason in this case that open source might be a bad thing:

    In theory, an employee of the company could modify and recompile the open source software to remove the security and gain access that way. Potentially a disgruntled employee sabotage thing. Perhaps.

  • luptatum (unregistered) in reply to The Crunger
    The Crunger:
    If your vendor infringes upon a closed source copyright, the last thing the owner will ask for is public disclosure of the offending source code.

    If your vendor infringes upon an open source copyright, this is the first thing that will be demanded.

    No, they can ask that the vendor stops distributing the code in violation of the licence, and possibly ask for damages, but they can't demand that the vendor releases any of its own code. The vendor may choose to do that in order to come into compliance (although it probably won't exempt them from the damages if any), but only if they consider it a better option than removing the open source code from the product.

    And, if the vendor has certified "no open source", they have now defrauded their customers, too.
    Exactly the same as if they certified "no proprietary code belonging to someone else" and violated that.
  • usitas (unregistered) in reply to The Crunger
    The Crunger:
    Let's think of some reasons the optimistic view of Open Source might not be the best choice:
    • If people could die because of decisions you make
    • If your industry is already under constant attack from people who would love to do for you what Stuxnet did for Iran
    • If you could go to jail for failing to show an appropriate standard of caution
    These are all reasons to carefully evaluate a piece of software before choosing it. None of them are in any way different for open source than for closed source.
  • opto (unregistered) in reply to craftworkgames
    craftworkgames:
    In theory, an employee of the company could modify and recompile the open source software to remove the security and gain access that way. Potentially a disgruntled employee sabotage thing. Perhaps.
    If they can replace the software you're using with their own code then you have a security hole. It doesn't matter whether the original was open source or not - there are plenty of instances of people modifying closed source code to remove security checks (see: every game DRM crack), and who's to say the "hacked" version has to be derived from the original in the first place, just as long as it behaves sufficiently similarly that people don't notice?
  • nimis (unregistered) in reply to The Crunger
    The Crunger:
    Then there's another scenario. Say your vendor uses Open Source to reduce costs, and for their next release pulls in something with some stealth GPL code. No one notices until your system is deployed, and then some Assange-iot sues your vendor for specific performance to expose the entire secure-through-obscure code base for all your competitors and enemies to see.
    Actually, if this were true, it would be a good reason not to choose a closed source product in the first place - you're saying that they could be forced to reveal their critical secrets at any time, whereas a (reputable) open source program won't have any such secrets in the first place.
  • illum (unregistered) in reply to The Crunger
    The Crunger:
    And, if the vendor has certified "no open source", they have now defrauded their customers, too.
    So you're trying to convince people that they shouldn't use open source, and to support your argument, you've described a case where a strict policy against open source could get a company into legal trouble. You're not very good at this, are you?
  • Anonymous (unregistered) in reply to Slapout
    Slapout:
    Windows is closed source while Linux is open source. People find security flaws in both.
    There are ~8000 Linux kernel developers around the world with the freedom to find and fix those vulnerabilities whenever they want. Meanwhile, Microsoft asks that the person who discovered the vulnerability kindly keep it a secret for six months or a year because they don't plan to fix it until then, while they steer their programmers toward developing something stupid and useless that looks different so they can market it for new sales (e.g., Windows 8).

    Indeed, the turnaround for security vulnerability fixes in open source is typically much shorter than for proprietary software. Never mind the bosses who think closed source is more secure because the source is withheld; proprietary vendors consider their software secure if the person who discovered a serious vulnerability doesn't disclose it to anyone else.

    Not only are there more eyes looking at popular open source code, but there's also freedom to improve it without the limitations that a profit-seeking entity enforces. ANYBODY can fix open source. Only a select few can fix closed source, and only if it appeals to the business' bottom line. Open source developers are also more forthcoming with vulnerabilities. They're more likely to warn the users about it, whereas a proprietary vendor is more likely to sweep it under the rug and feign ignorance.

  • Anonymoous coward (unregistered)

    Secourity through... insanity?

  • S-x (unregistered) in reply to Someone

    That security guy doesn't seem smart, so changing the shortcut's icon and name would probably do. If not, you can always replace the icon inside the binary with a resource hacking program, or recompile it (hey, open source!) with a different name. Bam, you're the only owner of the only copy of a closed open source program! And you can still get some updates!

  • VolodyA! V Anarhist (unregistered)

    "we can’t rely on open source software for protection"

    File under the same category as "Java is too slow".
