• Andrew (unregistered) in reply to somechick

    I'm stealing "God's gift to bits" and adding it to my lexicon. It too perfectly describes the arrogance and impersonality of these sorts. Thanks.

  • (cs) in reply to ThePants999
    ThePants999:
    How many DB calls does it take to change a lightbulb?
    Would that be a Metric or Imperial lightbulb?
  • (cs) in reply to BentFranklin
    BentFranklin:
    Am I the only one who thinks firstSpaceLastName is Gagarin?
    Finally, something worthy of being a featured comment. A++
  • (cs) in reply to Ru
    Ru:
    Sylver:
    Sure, the real world applications that actually do something are not our business. Our business should be high flown architecture. Got ya.

    Damn straight. Design? Waste of time. Git'r dun. Bish bash bosh, ship it.

    Sylver:
    The nonsense you just spouted is one of the reasons why some college grads suck big time. Because you think your job is to come up with pretty architectures and discuss their relative merits.

    If you are a programmer working almost anywhere in the real world(TM), sorry, but that's not your job. Your job is to find solutions for people's problems and implement these solutions in code.

    And no matter how good you are, until you understand what your job is, you are a liability to your company.

    A company engaged in any sort of complex work that fosters this sort of attitude is making a very grievous mistake.

    If you're writing yet another database-molesting application, maybe hiring adequate programmers who need show no imagination or understanding is a great idea. Trying to get those same people to write complex systems that need to scale significantly, or do a sophisticated job efficiently, or worst of all, not be a colossal kludge of poorly understood concepts that have been shoehorned into passing a test case but will never be maintainable? Big mistake.

    I'm currently engaged in porting a quite complex system which uses various fancy bits of machine learning and computer vision, wrapped up in a godawful architecture by merely adequate programmers of the sort you are approving of. Sure, they got'r dun, and met the ship date. Now we're having to spend hundreds of thousands of dollars on fixing the mess they've made, employing people who understand why you might want a functional style of coding rather than an object oriented one.

    I've worked with your type before, hell, I even once thought this way as well... The fact of the matter is that unless money and/or time is of no concern, then yes, spend hundreds of man-days theorizing, researching, and POCing different architectures and designs. But the rest of us here are mired in something I like to call reality: we have budgets and timelines to stay within and stakeholders that we have to answer to.

    What I'm trying to say is that there is a fine balance to be struck between design and git-'r-done, and where that line falls varies from project to project, of course. These fucking holier-than-thou, ivory-tower attitudes can stay in academia, where you can continue to circle-jerk each other with your "perfect" architectures...

    Just my $0.02...

  • Meep (unregistered) in reply to NoAstronomer
    NoAstronomer:
    Meep:
    Doesn't matter if it is. A smart API would memoize (sic) redundant DB calls.

    +1. Because everyone knows that executing the same DB call twice has no additional effects.

    Obviously your API should be smart enough to know when a DB call has side effects.

    BTW, just because you don't know the word doesn't mean it's spelled or used incorrectly.

  • (cs) in reply to Bryan the K
    Bryan the K:
    Good thing people rarely name their son Jr these days, right?

    I just got done working with code that scans text for names. It's amazing how many rules you have to take into account:
    Suffixes (Jr, Jr., Sr, II, III, IV, M.D., PhD., etc.)
    Prefixes (Dr., Reverend)
    Last names with more than one word (John De Silva)
    First names with more than one word (Mary Beth Johnson)
    Middle initials (John Q. Public)
    Middle names, including multiple middle names (John Stephen Jay Smith)
    Hyphens, apostrophes, and other special characters (n with ~, e with accent, etc.)
    Names with words that don't begin with capital letters (von Beethoven)
    People with just a single name (Prince)
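    A toy Python sketch handling just the prefix and suffix rules shows how quickly the cases pile up (the sets and names are illustrative, nowhere near exhaustive):

```python
# Illustrative only: real name parsing needs many more rules, and still
# fails on edge cases like mononyms or lowercase particles ("von").
SUFFIXES = {"jr", "jr.", "sr", "ii", "iii", "iv", "m.d.", "phd."}
PREFIXES = {"dr.", "reverend"}

def split_name(raw: str) -> dict:
    tokens = raw.split()
    prefix = tokens.pop(0) if tokens and tokens[0].lower() in PREFIXES else None
    suffix = tokens.pop() if tokens and tokens[-1].lower() in SUFFIXES else None
    first = tokens[0] if tokens else None
    last = " ".join(tokens[1:]) if len(tokens) > 1 else None
    return {"prefix": prefix, "first": first, "last": last, "suffix": suffix}

print(split_name("Dr. John Q. Public Jr."))
# prefix and suffix are stripped; everything after the first remaining
# token is (naively) treated as the last name, middle initial included.
```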

  • Mark Bowytz (unregistered)

    Guys, I have some really sad news: Alex tragically passed in a travel-related incident last night around 3:23am. Please keep his family in your prayers as we try to sort this thing out. There will be no more articles until further notice.

  • Stephen Cleary (unregistered)

    Not to mention

    [image]

    It's a bugger to parse!

  • (cs) in reply to boog
    boog:
    Coyne:
    Rumen is just making sure sure the user is really authorized: Twice the security!
    You're going to need the added security, since there's no password involved here.

    Details, details!

    We've already proved he can type first and last name correctly. Twice! That should be enough security for anyone, particularly if your name is something like Fleischaker Novotnych.

  • Stephen Cleary (unregistered) in reply to somechick
    somechick:
    One of the other reasons I don't think there are a lot of women is because they just aren't as interested in taking things apart and learning how they work as men are.

    I think that's most of it, actually. Women (in general) just aren't as interested in the field.

    As a CS major, I had to take a discussion-based "Computers and Society" course; the male/female ratio in our university at the time was almost 6:1. The teacher was a female with rather - shall we say "harsh" - liberal views. One day the subject for discussion was women in CS. After some discussion noting the lack of women, our teacher concluded that it was due to discrimination and that women had too many barriers to becoming CS students.

    I then continued the discussion by pointing out:

    1. Women are not discriminated against - at least not in our university, not in the CS department. In fact, I speculated that most of the guys want more women in CS. Amusingly, most of the men in the class nodded silently in agreement at that point.
    2. Women had fewer barriers than men in the CS department. At that time, any woman with a CS (or math) major at our university had a full-ride scholarship (assuming they maintained a 2.0), simply because they were female.
    3. Therefore, the most likely reason there are so few women is that few women want to be in CS. Every woman in that class came up to me later and voiced their agreement with me.

    Unfortunately, the teacher was petty and vindictive; that day I earned myself a "C-" for the entire term (the lowest passing grade). TRWTF is the year after I graduated, that same teacher became head of the CS department...

  • woliveirajr (unregistered) in reply to Mark Bowytz

    really? further notices?

  • (cs) in reply to HaskellTroll
    HaskellTroll:
    Really? You think there are few women in IT because there are few female CS students? (I'll draw the conclusion that because there are many males in CS there are few females in CS for you.)

    I guess you wanted to make a different argument, namely that there are few female CS students because there are so many male CS students. Can you see the circular logic?

    Can you try to explain that again in a way that is coherent, and free of non-sequiturs?

  • (cs) in reply to Mark Bowytz
    Mark Bowytz:
    Guys, I have some really sad news: Alex tragically passed in a travel-related incident last night around 3:23am. Please keep his family in your prayers as we try to sort this thing out. There will be no more articles until further notice.

    geting old

  • (cs) in reply to Coyne
    Coyne:
    boog:
    Coyne:
    Rumen is just making sure sure the user is really authorized: Twice the security!
    You're going to need the added security, since there's no password involved here.

    Details, details!

    We've already proved he can type first and last name correctly. Twice! That should be enough security for anyone, particularly if your name is something like Fleischaker Novotnych.

    That is loking like some made up name.

  • (cs) in reply to Nagesh
    Nagesh:
    geting old
    Nagesh:
    That is loking like some made up name.

    Talk about getting old, when can you stop misppeelign words on purpose, mandarin?!

  • (cs) in reply to Sylver
    Sylver:
    Captain Oblivious:

    ...

    I'm constantly amazed by how utterly ignorant of their craft most CS graduates are. These people are convinced that programming is "about" telling a computer how to do things, when it is merely a constructive fragment of mathematics -- in other words, a constructive approach to organization and proof. ...

    Sure, the real world applications that actually do something are not our business. Our business should be high flown architecture. Got ya.

    Now, take your pompousness back to whatever planet you think you are on.

    Programming IS about telling computers what to do. That's what the word "programming" means. Pro- (before) and -graphein (written). Written in advance. That's all programming is: writing stuff to be done, a to-do list for your computer.

    Organization and proof? Both fine, but not part of programming.

    The nonsense you just spouted is one of the reasons why some college grads suck big time. Because you think your job is to come up with pretty architectures and discuss their relative merits.

    This is the opposite of my point. You can 'git 'r done' faster if you cut through the crap that a 40-year-old tradition of imperative programming has led to, in order to mimic mathematical abstraction and quantification. Don't forget that mathematicians have been computing things for thousands of years. We do happen to know a thing or two about organizing computations for clarity and efficiency.

    The factory pattern, with its dozens of lines of boilerplate spread out over multiple classes and files, can be done away with in two lines of code through functorial programming. Indeed, a "factory" is a functor on the algebra of classes in an OO language.

    Object orientation is a bad model for quantification and abstraction -- you always need an ad hoc layer of quantification to abstract over the last. So you get to use a design pattern. Finally, somebody decides they're sick of the pattern, and they introduce a keyword to their language to mimic it. Then the process begins again. Sadly, until that new keyword comes out, you're stuck threading a proof through a bad representation.

    This problem has been solved for a long time.

    Organization and proof? Both fine, but not part of programming.

    Don't tell me -- you're a CS grad.

    You are exactly and provably wrong. The Curry-Howard isomorphism establishes that every function in a typed language is a /proof/ of the /theorem/ that its type represents. This is very useful, since a language with an expressive type system can express things like "<x> is an even number", or "a <user> must be logged in to view a <secure> <resource>", and enforce such a constraint statically, with the force of logical proof.

    This was a further part of my point. Microsoft is slowly introducing and recommending these expressive typing constructs over OO, specifically because they eliminate (certain, common classes of) bugs with logical force.
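    A rough Python approximation of such a statically enforced constraint (far weaker than what Curry-Howard enables in a dependently typed language, and with entirely hypothetical names) might look like:

```python
from typing import NewType

# Encode "user must be logged in" in the types: only log_in() can mint
# an AuthenticatedUser, so a checker like mypy rejects unauthenticated
# access at compile time rather than at runtime.
User = NewType("User", str)
AuthenticatedUser = NewType("AuthenticatedUser", str)

def log_in(user: User, password: str) -> AuthenticatedUser:
    # (credential check elided in this sketch)
    return AuthenticatedUser(user)

def view_secure_resource(user: AuthenticatedUser) -> str:
    return f"secret report for {user}"

u = User("rumen")
view_secure_resource(log_in(u, "hunter2"))  # type-checks
# view_secure_resource(u)                   # rejected statically by mypy
```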

    Ask yourself this: how hard is it to divide Roman numerals? You would have to memorize dozens, if not hundreds, of rules. Compare this to Babylonian or Arabic numerals, which used fixed bases. You merely need to learn the division algorithm to work with these.

    The representation of a problem matters. If you represent a problem poorly, your solution will have to deal with the poor representation.
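    The representation argument can be made concrete in a short Python sketch: rather than dividing Roman numerals directly, convert out of the awkward representation, compute, and convert back.

```python
# Positional representations make arithmetic easy; Roman numerals don't.
ROMAN = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"), (100, "C"),
         (90, "XC"), (50, "L"), (40, "XL"), (10, "X"), (9, "IX"),
         (5, "V"), (4, "IV"), (1, "I")]

def from_roman(s: str) -> int:
    vals = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}
    total = 0
    for a, b in zip(s, s[1:] + " "):  # pad so the last char has a neighbor
        v = vals[a]
        # subtractive notation: a smaller digit before a larger one negates
        total += -v if b != " " and vals[b] > v else v
    return total

def to_roman(n: int) -> str:
    out = []
    for value, digit in ROMAN:
        while n >= value:
            out.append(digit)
            n -= value
    return "".join(out)

# "Divide MMXII by IV" is trivial once re-represented:
assert from_roman("MMXII") == 2012
assert to_roman(from_roman("MMXII") // from_roman("IV")) == "DIII"
```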

  • yes but (unregistered) in reply to Captain Oblivious
    Captain Oblivious:
    Sylver:
    Captain Oblivious:

    ...

    I'm constantly amazed by how utterly ignorant of their craft most CS graduates are. These people are convinced that programming is "about" telling a computer how to do things, when it is merely a constructive fragment of mathematics -- in other words, a constructive approach to organization and proof. ...

    Sure, the real world applications that actually do something are not our business. Our business should be high flown architecture. Got ya.

    Now, take your pompousness back to whatever planet you think you are on.

    Programming IS about telling computers what to do. That's what the word "programming" means. Pro- (before) and -graphein (written). Written in advance. That's all programming is: writing stuff to be done, a to-do list for your computer.

    Organization and proof? Both fine, but not part of programming.

    The nonsense you just spouted is one of the reasons why some college grads suck big time. Because you think your job is to come up with pretty architectures and discuss their relative merits.

    This is the opposite of my point. You can 'git 'r done' faster if you cut through the crap that a 40-year-old tradition of imperative programming has led to, in order to mimic mathematical abstraction and quantification. Don't forget that mathematicians have been computing things for thousands of years. We do happen to know a thing or two about organizing computations for clarity and efficiency.

    The factory pattern, with its dozens of lines of boilerplate spread out over multiple classes and files, can be done away with in two lines of code through functorial programming. Indeed, a "factory" is a functor on the algebra of classes in an OO language.

    Object orientation is a bad model for quantification and abstraction -- you always need an ad hoc layer of quantification to abstract over the last. So you get to use a design pattern. Finally, somebody decides they're sick of the pattern, and they introduce a keyword to their language to mimic it. Then the process begins again. Sadly, until that new keyword comes out, you're stuck threading a proof through a bad representation.

    This problem has been solved for a long time.

    Organization and proof? Both fine, but not part of programming.

    Don't tell me -- you're a CS grad.

    You are exactly and provably wrong. The Curry-Howard isomorphism establishes that every function in a typed language is a /proof/ of the /theorem/ that its type represents. This is very useful, since a language with an expressive type system can express things like "<x> is an even number", or "a <user> must be logged in to view a <secure> <resource>", and enforce such a constraint statically, with the force of logical proof.

    This was a further part of my point. Microsoft is slowly introducing and recommending these expressive typing constructs over OO, specifically because they eliminate (certain, common classes of) bugs with logical force.

    Ask yourself this: how hard is it to divide Roman numerals? You would have to memorize dozens, if not hundreds, of rules. Compare this to Babylonian or Arabic numerals, which used fixed bases. You merely need to learn the division algorithm to work with these.

    The representation of a problem matters. If you represent a problem poorly, your solution will have to deal with the poor representation.

    Factory pattern? What's that? (think about your answer, please)

  • (cs) in reply to yes but
    yes but:
    Factory pattern? What's that? (think about your answer, please)

    The factory pattern allows class based dispatch on runtime tokens. A factory is a class or object which creates objects of distinct types based on the type of its input parameter. In other words, it's a functor on the algebra of classes.

    I'm happy to think about my answer, but I'm not sure what you're trying to get at.

    One thing to note is that it is a custom control structure, and its expression is limited by how an OO class hierarchy is organized. The generalized functorial approach does not share in those limitations.
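    A loose Python sketch of that "two lines" claim, dispatching on a runtime token through a plain mapping rather than a class hierarchy (the class names are purely illustrative):

```python
# Minimal stand-ins for the product classes a factory would construct.
class Circle:
    pass

class Square:
    pass

# The whole "factory" is a mapping from tokens to constructors, plus one
# lookup-and-call; no abstract base class or subclass-per-product needed.
SHAPES = {"circle": Circle, "square": Square}
make_shape = lambda token: SHAPES[token]()

assert isinstance(make_shape("circle"), Circle)
```

    Registering a new product type is just another dictionary entry, which is the flexibility the class-hierarchy version buys with far more ceremony.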

  • non intellectual (unregistered) in reply to Captain Oblivious
    Captain Oblivious:
    Sylver:
    Captain Oblivious:

    ...

    I'm constantly amazed by how utterly ignorant of their craft most CS graduates are. These people are convinced that programming is "about" telling a computer how to do things, when it is merely a constructive fragment of mathematics -- in other words, a constructive approach to organization and proof. ...

    Sure, the real world applications that actually do something are not our business. Our business should be high flown architecture. Got ya.

    Now, take your pompousness back to whatever planet you think you are on.

    Programming IS about telling computers what to do. That's what the word "programming" means. Pro- (before) and -graphein (written). Written in advance. That's all programming is: writing stuff to be done, a to-do list for your computer.

    Organization and proof? Both fine, but not part of programming.

    The nonsense you just spouted is one of the reasons why some college grads suck big time. Because you think your job is to come up with pretty architectures and discuss their relative merits.

    This is the opposite of my point. You can 'git 'r done' faster if you cut through the crap that a 40-year-old tradition of imperative programming has led to, in order to mimic mathematical abstraction and quantification. Don't forget that mathematicians have been computing things for thousands of years. We do happen to know a thing or two about organizing computations for clarity and efficiency.

    The factory pattern, with its dozens of lines of boilerplate spread out over multiple classes and files, can be done away with in two lines of code through functorial programming. Indeed, a "factory" is a functor on the algebra of classes in an OO language.

    Object orientation is a bad model for quantification and abstraction -- you always need an ad hoc layer of quantification to abstract over the last. So you get to use a design pattern. Finally, somebody decides they're sick of the pattern, and they introduce a keyword to their language to mimic it. Then the process begins again. Sadly, until that new keyword comes out, you're stuck threading a proof through a bad representation.

    This problem has been solved for a long time.

    Organization and proof? Both fine, but not part of programming.

    Don't tell me -- you're a CS grad.

    You are exactly and provably wrong. The Curry-Howard isomorphism establishes that every function in a typed language is a /proof/ of the /theorem/ that its type represents. This is very useful, since a language with an expressive type system can express things like "<x> is an even number", or "a <user> must be logged in to view a <secure> <resource>", and enforce such a constraint statically, with the force of logical proof.

    This was a further part of my point. Microsoft is slowly introducing and recommending these expressive typing constructs over OO, specifically because they eliminate (certain, common classes of) bugs with logical force.

    Ask yourself this: how hard is it to divide Roman numerals? You would have to memorize dozens, if not hundreds, of rules. Compare this to Babylonian or Arabic numerals, which used fixed bases. You merely need to learn the division algorithm to work with these.

    The representation of a problem matters. If you represent a problem poorly, your solution will have to deal with the poor representation.

    -1 for intellectual masturbation

  • (cs) in reply to non intellectual
    non intellectual:
    -1 for intellectual masturbation

    -1 for wagh math is hard

  • Jeff Grigg (unregistered)

    That code only deserves a "What?" Not a full "WTF?!?"

  • (cs) in reply to Captain Oblivious
    Captain Oblivious:
    non intellectual:
    -1 for intellectual masturbation

    -1 for wagh math is hard

    -1 for thinking your area of expertise matters to the rest of us.

  • some dude (unregistered) in reply to Stephen Cleary
    Stephen Cleary:
    1) Women are not discriminated against - at least not in our university, not in the CS department. In fact, I speculated that most of the guys want more women in CS. Amusingly, most of the men in the class nodded silently in agreement at that point.
    Suggestion: quit being pompous assholes then.
  • Some Dood (unregistered) in reply to Stephen Cleary
    Stephen Cleary:
    somechick:
    One of the other reasons I don't think there are a lot of women is because they just aren't as interested in taking things apart and learning how they work as men are.

    I think that's most of it, actually. Women (in general) just aren't as interested in the field.

    As a CS major, I had to take a discussion-based "Computers and Society" course; the male/female ratio in our university at the time was almost 6:1. The teacher was a female with rather - shall we say "harsh" - liberal views. One day the subject for discussion was women in CS. After some discussion noting the lack of women, our teacher concluded that it was due to discrimination and that women had too many barriers to becoming CS students.

    I then continued the discussion by pointing out:

    1. Women are not discriminated against - at least not in our university, not in the CS department. In fact, I speculated that most of the guys want more women in CS. Amusingly, most of the men in the class nodded silently in agreement at that point.
    2. Women had fewer barriers than men in the CS department. At that time, any woman with a CS (or math) major at our university had a full-ride scholarship (assuming they maintained a 2.0), simply because they were female.
    3. Therefore, the most likely reason there are so few women is that few women want to be in CS. Every woman in that class came up to me later and voiced their agreement with me.

    Unfortunately, the teacher was petty and vindictive; that day I earned myself a "C-" for the entire term (the lowest passing grade). TRWTF is the year after I graduated, that same teacher became head of the CS department...

    Mate, I think that's pretty much what everyone here's been saying... but somechick seems to think that because she wants to be in IT, all girls do, and that they only didn't make it because it's one big boys' club. I think ye be preachin' to the choir for the most part...

    Not surprised the teacher would have been appointed head of the department - she sounds like the type to kick up a stink if she doesn't get promoted. Unfortunately, society always yields to such wankers, finding there's less resistance in promoting them than in standing up to them and telling them they are radical fucking lunatics!

  • Jocelyn (unregistered) in reply to Captain Oblivious
    Captain Oblivious:
    Sylver:
    Captain Oblivious:

    ...

    I'm constantly amazed by how utterly ignorant of their craft most CS graduates are. These people are convinced that programming is "about" telling a computer how to do things, when it is merely a constructive fragment of mathematics -- in other words, a constructive approach to organization and proof. ...

    Sure, the real world applications that actually do something are not our business. Our business should be high flown architecture. Got ya.

    Now, take your pompousness back to whatever planet you think you are on.

    Programming IS about telling computers what to do. That's what the word "programming" means. Pro- (before) and -graphein (written). Written in advance. That's all programming is: writing stuff to be done, a to-do list for your computer.

    Organization and proof? Both fine, but not part of programming.

    The nonsense you just spouted is one of the reasons why some college grads suck big time. Because you think your job is to come up with pretty architectures and discuss their relative merits.

    This is the opposite of my point. You can 'git 'r done' faster if you cut through the crap that a 40-year-old tradition of imperative programming has led to, in order to mimic mathematical abstraction and quantification. Don't forget that mathematicians have been computing things for thousands of years. We do happen to know a thing or two about organizing computations for clarity and efficiency.

    The factory pattern, with its dozens of lines of boilerplate spread out over multiple classes and files, can be done away with in two lines of code through functorial programming. Indeed, a "factory" is a functor on the algebra of classes in an OO language.

    Object orientation is a bad model for quantification and abstraction -- you always need an ad hoc layer of quantification to abstract over the last. So you get to use a design pattern. Finally, somebody decides they're sick of the pattern, and they introduce a keyword to their language to mimic it. Then the process begins again. Sadly, until that new keyword comes out, you're stuck threading a proof through a bad representation.

    This problem has been solved for a long time.

    Organization and proof? Both fine, but not part of programming.

    Don't tell me -- you're a CS grad.

    You are exactly and provably wrong. The Curry-Howard isomorphism establishes that every function in a typed language is a /proof/ of the /theorem/ that its type represents. This is very useful, since a language with an expressive type system can express things like "<x> is an even number", or "a <user> must be logged in to view a <secure> <resource>", and enforce such a constraint statically, with the force of logical proof.

    This was a further part of my point. Microsoft is slowly introducing and recommending these expressive typing constructs over OO, specifically because they eliminate (certain, common classes of) bugs with logical force.

    Ask yourself this: how hard is it to divide Roman numerals? You would have to memorize dozens, if not hundreds, of rules. Compare this to Babylonian or Arabic numerals, which used fixed bases. You merely need to learn the division algorithm to work with these.

    The representation of a problem matters. If you represent a problem poorly, your solution will have to deal with the poor representation.

    So Romans were dumb Knuts?

  • Brad (unregistered)

    Duh! It's just in case the user's first name and last name change before he returns!

    It's data integrity; he's a genius!

  • Lolocopter (unregistered) in reply to Captain Oblivious

    As a high-school freshman, I feel proud that I know what an isomorphism is and am somewhat familiar with that theorem (and have played around with Haskell a good deal, too, enough to gain a solid grasp of its type system and curried functions).

    And BLUH, I have to take "Intro to Comp Sci" next year, followed by "AP Computer Science" in Junior year...

  • (cs) in reply to Stephen Cleary
    Stephen Cleary:
    TRWTF is the year after I graduated, that same teacher became head of the CS department...

    If she was able to convince the person or group assigning the CS department head - who would, almost by definition, not be as familiar with the overall issue - that she could increase the recruitment of women into the school, it's not surprising she was promoted. Plus, as someone else said, that particular type of person tends to be really hard to block - you have to refute every claim of discrimination, and the wave generally keeps coming. You can do it for a year or two, maybe, but you can't get rid of the problem by pointing out that she's raised 713 claims of discrimination and only two of them had merit, and that she's wasting everybody's time crying wolf. Especially since there are enough misogynists out there that the ratio won't be that lopsided.

    Stephen Cleary:
    1) Women are not discriminated against - at least not in our university, not in the CS department. In fact, I speculated that most of the guys want more women in CS.

    In my freshman class, there were 4 women, and 106 men. The vast majority of the guys were complimentary and encouraging towards the women. There were about a score of guys who didn't get involved, and there were three misogynists. One of those four women left the department, claiming she was discriminated against too much. It doesn't take many bad apples to ruin the bunch for some people.

    (Note: none of our professors expressed misogynist views. The most common sentiment they expressed was best said the first day, in response to one of the misogynists wondering "what are these girls doing here? This is a man's class!": "Statistically speaking, approximately 45 people in this class will be graduating with computer science degrees. Four of them will be women." The women CS majors were, in general, given much more respect by our professors than the men, especially early on. Male CS freshmen were expected to drop out, flunk out, or change majors, more often than not. Female CS freshmen were expected to graduate - the school's average at the time I graduated was over 90%. The professors were more interested in giving time to the students they "knew" would make it.)

Leave a comment on “Ask Rumen Again”
