• boog (unregistered)
    TFA:
    "Well, what did they put in its place?" Bonnie asked.

    As it turned out? Nothing.

    Is "nothing" the standard? I'm not quite sure the credit card companies would agree.

    This is probably the result of some IT manager going through a "standards" phase, where he forces the whole department to focus all their time on becoming standards-compliant, though he doesn't really know anything about standards. Never mind that the company risks breaking all of its currently functioning code with drastic changes in the name of "standards." That's okay, they have source control, right? Right? Oh dear.

    Once they've broken enough processes, worked enough nights/weekends/holidays, lost enough data, and wasted enough money, the IT manager is likely to go through a "backups" phase. Everyone will forget about standards and focus on backups. He'll buy massive amounts of space to back up the data (since it wasn't backed up before), the code, the backups, the backups' backups, the backups' backups' backups, ad nauseam. But nobody will test the backups to see if they can recover from them.

    Wait until he goes through a "security" phase. Not only will it be mandated that all data are encrypted using ROT26, but interns won't be allowed within 30 feet of any computer that has (or ever had) access to the data.

  • Helix (unregistered)

    Furrfu

  • Mr CVV2 (unregistered)

    I'm pretty sure it's actually illegal to store the three-digit verification code "anywhere", even if you encrypt it (properly). Well, maybe not illegal as in go-to-prison illegal, but against the credit card vendor's rules, which let them suspend your account for doing so.

  • Kowell (unregistered)

    A Fortune 500 company would probably be public and therefore subject to yearly security audits under SAS 70 and maybe also SSAE 16. No company with this level of grossly incompetent "security" would pass such an audit.

  • Anonymous (unregistered) in reply to Kowell
    Kowell:
    A Fortune 500 company would probably be public and therefore subject to yearly security audits under SAS 70 and maybe also SSAE 16. No company with this level of grossly incompetent "security" would pass such an audit.

    Don't you see? That's where the genius lies! All of the plaintext is in the DEV environment.

  • Wheeldog (unregistered)

    I want magical unicorn and rainbow ROT26 encryption on all my sites too now. :(

  • (cs) in reply to Kowell
    Kowell:
    A Fortune 500 company would probably be public and therefore subject to yearly security audits under SAS 70 and maybe also SSAE 16. No company with this level of grossly incompetent "security" would pass such an audit.
    Are you sure? We're good about security, but I'm always amazed at the questions that aren't asked during these audits. One time, I got grilled because an application that uses Windows Authentication wasn't deleting the database rows referencing a deleted account (we keep them to provide metadata for reporting), even though the user couldn't log in after the Windows account was deleted. The same audit didn't even bother asking about the commercial application running next to it (and of the same importance) that puts application passwords in a table unencrypted and has a maximum password length of five characters. Both applications were explicitly in scope for this audit.

  • (cs) in reply to pabraham
    pabraham:
    sechma?

    No, it's not my captcha.

    Could be worse... the person at my desk read that as "smegma", snorted and said "WTF?"

  • Ken (unregistered)

    Not a wtf. How else am I supposed to steal credit card data from users?

    I hope everyone realizes I'm kidding.

  • a (unregistered) in reply to David
    David:
    This sentence was originally encrypted using ROT26. If you can read it, you have successfully decrypted it.

    Oooo, my brain's been decrypting on the fly!

  • Kowell (unregistered) in reply to Jaime
    Jaime:
    Kowell:
    A Fortune 500 company would probably be public and therefore subject to yearly security audits under SAS 70 and maybe also SSAE 16. No company with this level of grossly incompetent "security" would pass such an audit.
    Are you sure? We're good about security, but I'm always amazed at the questions that aren't asked during these audits. One time, I got grilled because an application that uses Windows Authentication wasn't deleting the database rows referencing a deleted account (we keep them to provide metadata for reporting), even though the user couldn't log in after the Windows account was deleted. The same audit didn't even bother asking about the commercial application running next to it (and of the same importance) that puts application passwords in a table unencrypted and has a maximum password length of five characters. Both applications were explicitly in scope for this audit.

    Guess our auditors were more "motivated" in their jobs than yours... Just finished another round of SAS 70 audits two weeks ago and they were a major pain in the ass... An incompetent auditor would be nice for a change ;)

  • uuang (unregistered)

    Remy, I want more secret comments next time.

  • Bill's Kid (unregistered) in reply to mott555
    mott555:
    lol I had to look up ROT26. Having never heard of it I assumed it was a real encryption method and I was confused about why the story mentioned storing data in plaintext later.

    I did too. The unintended benefit of that: I finally figured out the rainbows and unicorns. I had always thought that bored people clicked random words until they saw them. Now I think they are punishment for people who have to look something up that the author has deemed common knowledge.

    I guess I can be somewhat happy that it took me 6 months of constant reading to find them on my own.

  • Bonnie (unregistered) in reply to Jaime
    Jaime:
    Kowell:
    A Fortune 500 company would probably be public and therefore subject to yearly security audits under SAS 70 and maybe also SSAE 16. No company with this level of grossly incompetent "security" would pass such an audit.
    Are you sure? We're good about security, but I'm always amazed at the questions that aren't asked during these audits. One time, I got grilled because an application that uses Windows Authentication wasn't deleting the database rows referencing a deleted account (we keep them to provide metadata for reporting), even though the user couldn't log in after the Windows account was deleted. The same audit didn't even bother asking about the commercial application running next to it (and of the same importance) that puts application passwords in a table unencrypted and has a maximum password length of five characters. Both applications were explicitly in scope for this audit.

    I only sat in on one audit (of the application's error log) and it was pretty much a scenario like that. They audited the strangest stuff and went right past the stuff that should have been audited, reported on, and gotten someone's ass thrown in the fire.

    Working there was the biggest mistake I've made.

  • (cs) in reply to uuang
    uuang:
    Remy, I want more secret comments next time.

    It's not something I can force. When I feel it, I pile 'em on. When I don't… well, I always make sure to include a little something.

  • Andomar (unregistered)

    It works like that everywhere. Security is just talk. Companies that succeed invest time in more useful stuff.

    Security is a good place to steer people that would fark up real work.

  • (cs)

    Hahaha, I love that cornify script. I was trying to copy/paste "ROT26 encryption" into Google to make sure I wasn't crazy. Glad I'm not.

  • ShatteredArm (unregistered) in reply to Jaime
    Jaime:
    Are you sure? We're good about security, but I'm always amazed at the questions that aren't asked during these audits.

    Not just that, but it's entirely possible for a developer to slip a wrong configuration value in there, changing the behavior to something other than what is expected (an incorrect service endpoint, for example). I haven't seen anything as blatant as this story describes, but I have seen a dev endpoint inadvertently turn up in a production config file.

    We had a situation a while back at one of my jobs where we were logging web service requests and responses from our third-party data provider, and it turned out that they were sending unmasked credit card numbers back in a free-form text field which we weren't using... One of the DBAs discovered this while running a script on our non-PCI-compliant database.

    In short, I wouldn't be surprised if there are PCI violations at every large company, but the story does seem a little too flagrant.
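
    For illustration, masking anything PAN-shaped before it hits a log file doesn't take much. Here's a rough Python sketch (the pattern and names are made up, not from any real system, and a production filter would need to be more careful):

        import re

        # Deliberately broad: 13-16 digits, optionally separated by
        # spaces or dashes, that could plausibly be a card number.
        PAN_PATTERN = re.compile(r"\b(?:\d[ -]?){12,15}\d\b")

        def mask_pans(text):
            """Keep only the last four digits of anything PAN-shaped."""
            def _mask(match):
                digits = re.sub(r"[ -]", "", match.group(0))
                return "*" * (len(digits) - 4) + digits[-4:]
            return PAN_PATTERN.sub(_mask, text)

        # e.g. a third-party response with a card number in a free-form field
        print(mask_pans("ok|AUTH|note=card 4111 1111 1111 1111 approved"))
        # ok|AUTH|note=card ************1111 approved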

  • (cs) in reply to Andomar
    Andomar:
    It works like that everywhere. Security is just talk. Companies that succeed invest time in more useful stuff.

    Security is a good place to steer people that would fark up real work.

    Security also affords those people the perfect opportunity to fark up everybody else's real work, though. I'd rather just fire them.

  • (cs) in reply to luis.espinal
    luis.espinal:
    Ouch!:
    Oh well, I guess it works, and by the "If it ain't broke, don't fix it" rule, they did the right thing.

    But it was broken. You simply do not store credit card information in the clear. And that's just for starters, since IIRC they aren't supposed to store credit card verification codes at all.

    running code != working code

    Luis, I think you should work on your ability to detect when others are being facetious. Especially on this site, that is something you have to expect.

  • Wilford Brimley (unregistered) in reply to Bob
    Bob:
    pabraham:
    sechma?

    I noticed that, too. I think it is a misspelling of "sperm".

    FTFY

    -diabeetus

  • CoderDan (unregistered) in reply to PCI DSS
    PCI DSS:
    That's why we need PCI DSS.

    That is correct; PA-DSS enforcement by the PCI council has been less than stellar. They are relying upon the merchant service reps, the processors (when they pay attention), and to a lesser extent the merchant banks.

    Having worked in global payments and payment security, I can say the code that is out there is atrocious at best. Of course, I still run into people who are storing track 2 data and the CVV/CVC.

    Captcha: praesent, a European farmer

  • (cs) in reply to Kowell
    Kowell:
    A Fortune 500 company would probably be public and therefore subject to yearly security audits under SAS 70 and maybe also SSAE 16. No company with this level of grossly incompetent "security" should pass such an audit.
    FTFY

  • adam (unregistered)

    Um, that violates PCA - they can (and arguably should) lose their ability to process credit card transactions.

  • adam (unregistered) in reply to adam

    Crap, I meant PCI. Duh.

  • (cs)

    Just about the same thing happened to me about 10 years ago, when I worked for a Fortune 500 company that was also one of the top 10 software companies. I found the client order database on a public network share. It also had the CC info for each order. It was, however, an MS Access database, and didn't even have a password.

  • Harrow (unregistered)

    ...gwana gwana double ROT39 gwana...

    -Harrow.

  • Sylver (unregistered)

    I have spent a good part of the day reading the BOFH archives. Then I decided to read some on TDWTF for a break.

    Why does it sound like it's about the same company?

    The worst part is, it's that special kind of unbelievable story that I have no trouble believing.

  • Buffled (unregistered) in reply to Anonymous
    Anonymous:
    Sometimes it's not enough to simply highlight the security flaw - it requires a practical lesson to reinforce the point. Several thousand customer CC numbers turning up on P2P should do the trick, for example. Let's see how long the flaw remains unpatched after that.

    But cover your tracks, kids - us software developer types are too fragile for prison.

    And who cares about the customers whose lives you just made hell, right? There's this thing that us adults have awareness of - it's called "consequences". And it extends far beyond "will I get caught?"

  • Max (unregistered) in reply to David
    David:
    This sentence was originally encrypted using ROT26. If you can read it, you have successfully decrypted it.
    Does anyone know what this says? I can't seem to figure it out.

  • (cs) in reply to Buffled
    Buffled:
    Anonymous:
    Sometimes it's not enough to simply highlight the security flaw - it requires a practical lesson to reinforce the point. Several thousand customer CC numbers turning up on P2P should do the trick, for example. Let's see how long the flaw remains unpatched after that.

    But cover your tracks, kids - us software developer types are too fragile for prison.

    And who cares about the customers whose lives you just made hell, right? There's this thing that us adults have awareness of - it's called "consequences". And it extends far beyond "will I get caught?"

    Do you "adults" have awareness of the presence of bullshit?

  • Dan (unregistered) in reply to whiskeyjack
    whiskeyjack:
    fennec:
    It's true; an internship always looks good on a resume.

    Except if it's with the White House, during the Clinton administration.

    What do you mean? It got one intern a book deal, a fashion line, and 15 minutes of fame.

  • Icelander (unregistered)

    I had a similar experience. An application I was working on was storing passwords in the clear. Now, it's not credit card data, but storing passwords in the clear is just bad practice. So I wrote an encryption function, a decryption function, and a function to encrypt the passwords already in the database. I tested it, checked it in, and moved on to another task.

    A month later, I heard from my coworker that our boss had logged into the production database (why he has access to it I'll never know) and FREAKED OUT because the passwords were all garbled. He then restored the database from his own backup copy (why he has this I'll never know) and then freaked out because he couldn't log in.

    Moral of the story: Don't bother encrypting anything when your boss doesn't know what encryption means.
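
    For the curious, the migration was conceptually something like the sketch below (rough Python, not my actual code; the table schema and key handling are made up). In hindsight, hashing the passwords (bcrypt, PBKDF2) would have been better than reversible encryption, since passwords only ever need to be verified, never read back:

        import sqlite3
        from cryptography.fernet import Fernet  # pip install cryptography

        key = Fernet.generate_key()  # in real life: store/manage this safely
        fernet = Fernet(key)

        # Hypothetical schema: users(id, password) with plaintext passwords.
        conn = sqlite3.connect("app.db")
        rows = conn.execute("SELECT id, password FROM users").fetchall()
        for user_id, plaintext in rows:
            # Encrypt each plaintext password and write the token back.
            token = fernet.encrypt(plaintext.encode()).decode()
            conn.execute("UPDATE users SET password = ? WHERE id = ?",
                         (token, user_id))
        conn.commit()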

  • Sylver (unregistered) in reply to Henryk Plötz
    Henryk Plötz:
    "… containing the name, address, credit-card number, verification code and expiration date …"

    Correct me if I'm wrong, but isn't storing the verification code a breach of contract with your credit card clearing center, one that should lead to the company losing the ability to process card payments?

    It is, but when has that stopped a large company? It's a "standard" practice for a lot of companies over here, and a lot of cashiers don't see anything wrong with it.

    Frankly, it is amazing credit card fraud doesn't happen more often than it does.

  • MaindotC (unregistered)

    Your assessment that a Fortune 500 company would have these types of things under close guard is erroneous. The larger a company gets, the more insecure it is. I took a network security class, and in the first week we studied social engineering; our assignment was to gather as much data (and as many attack vectors) as possible on a Fortune 500 company. I was assigned Sprint, who just happened to be going through a merger with Nextel (which usually adds an additional layer of insecurity). You can dumpster-dive loads of customer and employee information from any kiosk's trash, grab information on their IT systems by browsing LinkedIn, or find forums where employees vent information vital to social engineering (in this case howardforums.com).

    It's a shame that data was so easily available, but it's common in a large organization, especially due to the use of resource roles; typically those roles focus more on accomplishing an unending list of tasks than on the general security of information.

  • Anonymous (unregistered) in reply to Buffled
    Buffled:
    Anonymous:
    Sometimes it's not enough to simply highlight the security flaw - it requires a practical lesson to reinforce the point. Several thousand customer CC numbers turning up on P2P should do the trick, for example. Let's see how long the flaw remains unpatched after that.

    But cover your tracks, kids - us software developer types are too fragile for prison.

    And who cares about the customers whose lives you just made hell, right? There's this thing that us adults have awareness of - it's called "consequences". And it extends far beyond "will I get caught?"
    You have an interesting definition of "hell":

    Customer: "Oh hello there, is this Visa? My card details have been compromised and I now have a fraudulent charge on my bill."

    Visa: "I'm sorry to hear that sir but don't worry, we will cancel your current card and issue you a new one immediately. It should take 3-5 working days to arrive. As soon as our fraud investigation department has cleared up the details, you will be credited for the fraudulent transaction."

    Customer: "Sweet, cheers dude."

    Visa: "Only too happy to help sir!"

    This has happened to me before and the above transcript is pretty much the exact conversation I had with my card issuer. It was HELL!!! Oh wait, no it wasn't, it was a trivial inconvenience.

  • Sylver (unregistered) in reply to Warmasta
    Warmasta:
    Credit cards have NUMBERS, so ROT26 jokes are just plain stupid.
    1. It's a joke.
    2. Just a credit card number doesn't get you anywhere if you don't know the name of the credit card holder. And that precious piece of information is safely stored in the database thanks to a redundant ROT13 encryption layer.

    Perhaps you should reconsider all aspects before calling someone fghcvq.
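
    For anyone who had to look it up, here's a quick Python sketch (purely illustrative, nothing from the article): ROT26 maps every letter to itself, and no ROT-N touches digits at all.

        import string

        def rot(text, n):
            """Rotate letters by n places; everything else passes through."""
            lo, up = string.ascii_lowercase, string.ascii_uppercase
            k = n % 26
            table = str.maketrans(lo + up, lo[k:] + lo[:k] + up[k:] + up[:k])
            return text.translate(table)

        print(rot("fghcvq", 13))            # -> stupid
        print(rot("Attack at dawn", 26))    # -> Attack at dawn (identity)
        print(rot("4111111111111111", 13))  # -> unchanged: digits don't rotate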

  • (cs)

    I once had a conversation that went something like this (sarcasm added):

    me: You know that extra private personal data that we transfer over the internet from the client machines to the server? Turns out it is actually transferred in plain text.

    Original Developer, now manager: No, it's not.

    me: Sure, it has our proprietary delimiters and tags on it, but I am sure people could figure out what the tag SSN followed by 9 numeric digits means - and there is no encryption on it at all.

    That dev/man: But it is encoded - it is sent across in Unicode. Nothing to worry about.

    me: FACEPALM

    Yes. He was serious. I left when he was promoted from manager of his own little world to being manager again of our team.

  • (cs) in reply to Remy Porter
    Remy Porter:
    uuang:
    Remy, I want more secret comments next time.

    It's not something I can force. When I feel it, I pile 'em on. When I don't… well, I always make sure to include a little something.

    I clicked all over the end of that sentence expecting unicorns to come up :(

  • (cs) in reply to Sutherlands

    Production data is required for dev in some products, but then again, we don't store private user data... just stock/bond/derivative/etc. data.

  • Ed Von Emacs, VI (unregistered) in reply to Remy Porter
    Remy Porter:
    uuang:
    Remy, I want more secret comments next time.

    It's not something I can force. When I feel it, I pile 'em on. When I don't… well, I always make sure to include a little something.

    If you're taking requests, I'd like the HTML footer to give free foot massages, as it should. What else is a footer good for?

    And don't get me started on the header.

  • SecTech (unregistered) in reply to Buffled
    Buffled:
    Anonymous:
    Sometimes it's not enough to simply highlight the security flaw - it requires a practical lesson to reinforce the point. Several thousand customer CC numbers turning up on P2P should do the trick, for example. Let's see how long the flaw remains unpatched after that.

    But cover your tracks, kids - us software developer types are too fragile for prison.

    And who cares about the customers whose lives you just made hell, right? There's this thing that us adults have awareness of - it's called "consequences". And it extends far beyond "will I get caught?"
    With all due respect, this is a bullshit argument. As security researchers, if we find an exploit in a piece of commercial software we give the vendor 14 days to respond to our findings. If we don't have a satisfactory response after 14 days, we will publish the exploit. We understand full well that customers of said vendor may suffer as a result - some may even fall foul of the exploit that we designed and published - but what is the alternative? We keep our mouths shut and hope that security through obscurity does the job? I don't think so. If a vendor is not willing to accept responsibility for the faults in their software, then we cannot be held responsible for the fallout.

    The situation in today's WTF is no different - if the company knows the problem exists but refuses to fix it, the only way to force their hand is to expose the problem. A small number of people may be inconvenienced as a result, but the long-term benefits far outweigh the inconvenience to a few customers.

  • Lurch (unregistered) in reply to uuang
    uuang:
    Remy, I want more secret comments next time.

    uuang?

  • Jay Jay (unregistered)

    While working as a consultant for the Welfare department of a State government, I was a developer in the Day Care Subsidy program. According to requirements handed down by the state, all production user data had to be encrypted in any environment lower than prod. Why? Because the application stored all user data deemed necessary by the state: date of birth, social security number, full address, and several other fun pieces of information. Was it encrypted in lower environments? Nope. I know I had the ethics not to copy the personal information of 250K people; I can't vouch for some of the other guys who worked there, especially the foreign nationals...

  • uuang (unregistered) in reply to Lurch
    Lurch:
    uuang:
    Remy, I want more secret comments next time.

    uuang?

    You rang?

  • Sechma (unregistered) in reply to whiskeyjack
    whiskeyjack:
    fennec:
    It's true; an internship always looks good on a resume.

    Except if it's with the White House, during the Clinton administration.

    Then it looks good on a dress.

  • NoAstronomer (unregistered) in reply to luis.espinal
    luis.espinal:

    running code != working code

    It's astounding how many people, both developers and non-technical folks, simply do not understand that. Even if you explain it to them.

  • My Name Is Missing (unregistered)

    I worked at a health care claims processing company that stored everything that came through as plain text, in databases accessed with one password (that everyone knew), on prod systems with one password (that everyone knew), and with no auditing features. A trove of blackmail opportunities. Yet they passed a security audit (which I guess assumed no one could be so stupid as to have a single public password for production).

  • Peter (unregistered) in reply to Jay Jay
    Jay Jay:
    I know I had the ethics not to copy the personal information of 250K people; I can't vouch for some of the other guys who worked there, especially the foreign nationals...
    And naturally, foreign nationals are likely to have lower ethical standards than you do.

  • ÃÆâ€â„ (unregistered)

    Seriously, I want to know who these people are. There is no way in hell I'm doing business with them. Any dev can give themselves a "raise" in this company.
