• MiserableOldGit (unregistered)

    This doesn't look that unusual for the way systems like that got implemented, but all of that curling about would only delay connecting to the session, not make it "slow" during delivery. I doubt the calls would cause much delay anyway; I've seen systems written like that run fine under load.

    TRWTF seems to be around hosting: if it's checking payment info, presumably at least some of that happens at the vendor's site. So it's their boxes being underpowered, or network bottlenecks anywhere in between.

    You could say TRWTF is lack of testing, but I never saw any place properly load-test this webby crap other than by inflicting the pain on the first clients for a few weeks. SNAFU.

  • Long Suffering.... (unregistered)

    Just recently found a similar issue: going out to the database on each loop iteration to get the same data, over and over. I found this while working with a customer that had 3000 records instead of the usual ten at most. Oops. Note that the guy who wrote this is generally an excellent programmer, and I'm hard pressed to EVER find anything wrong.

  • (nodebb) in reply to MiserableOldGit

    You could say TRWTF is lack of testing, but I never saw any place properly load-test this webby crap other than by inflicting the pain on the first clients for a few weeks. SNAFU.

    You say that, but it's clear that Initech didn't test it at all, much less under load.

  • KeyJ (unregistered)

    That doesn't look like "a shell call out to cURL"; it's more like a call into libcurl (which, admittedly, is only slightly less horrible to do on every page request).

  • RLB (unregistered) in reply to MiserableOldGit

    This doesn't look that unusual for the way systems like that got implemented, but all of that curling about would only delay connecting to the session, not make it "slow" during delivery.

    Unless they do four curl calls per page, each clocking in at up to half a minute. After all, in Enterprise Solutions, you need to check that the subscription has been paid for on every single page and popup, not just once at log-in. The latter would mean trusting the user not to pay for a week's subscription and then leave the computer logged in for all eternity (or at least until the next power cut), thus getting free product. And there aren't any better solutions to prevent that than calling home all the time, no sirree.
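    The obvious middle ground between "trust the login forever" and "phone home on every page" is a cached check with a TTL. A minimal sketch (the names and the five-minute TTL are made up for illustration; Python stands in for the site's PHP):

    ```python
    # Cache the entitlement check per session with a TTL, instead of
    # making remote calls on every page load.
    import time

    _CACHE = {}  # session_id -> (expires_at, is_paid)
    TTL_SECONDS = 300  # re-verify every five minutes, not every page

    def check_subscription(session_id, fetch_remote):
        """Return the cached payment status, refreshing it after TTL_SECONDS."""
        now = time.time()
        cached = _CACHE.get(session_id)
        if cached and cached[0] > now:
            return cached[1]
        is_paid = fetch_remote(session_id)  # the slow remote call, done rarely
        _CACHE[session_id] = (now + TTL_SECONDS, is_paid)
        return is_paid

    calls = []
    def fake_remote(sid):
        calls.append(sid)
        return True

    for _ in range(100):  # a hundred "page loads"
        check_subscription("sess-1", fake_remote)
    print(len(calls))  # one remote round-trip instead of a hundred
    ```

    A user who leaves a session open still gets re-checked every TTL, which addresses the "logged in for all eternity" worry without four curl calls per page.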

  • Debra (unregistered)

    Doesn't "'CURLOPT_SSL_VERIFYHOST' => false, 'CURLOPT_SSL_VERIFYPEER' => false" mean that they could just set up a fake localhost server (via an appropriate entry in /etc/hosts or the platform's equivalent) to return "yes, we've paid over 9000$ and are always registered" every time? Would certainly work faster.
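    A toy demonstration of the attack Debra describes: with certificate checks off, nothing stops a fake local "licensing server" from answering the payment check. (Plain HTTP here for brevity; with CURLOPT_SSL_VERIFYPEER set to false the same trick works against an https endpoint using a self-signed cert. The endpoint path and JSON shape are invented.)

    ```python
    # Fake local licensing server that always says "paid".
    import json, threading, urllib.request
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class FakeLicenseServer(BaseHTTPRequestHandler):
        def do_GET(self):
            body = json.dumps({"status": "paid", "plan": "enterprise"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

        def log_message(self, *args):  # keep the demo quiet
            pass

    server = HTTPServer(("127.0.0.1", 0), FakeLicenseServer)
    threading.Thread(target=server.serve_forever, daemon=True).start()

    # The app, pointed here via /etc/hosts, would happily believe this.
    url = f"http://127.0.0.1:{server.server_port}/telEdu_get_plan"
    reply = json.load(urllib.request.urlopen(url))
    print(reply["status"])  # prints "paid"
    server.shutdown()
    ```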

  • Anonymous') OR 1=1; DROP TABLE wtf; -- (unregistered) in reply to Debra

    Yes. Setting CURLOPT_SSL_VERIFYPEER to false disables cert verification, so it would be trivial to use an intercepting proxy like Burp Suite or Charles Proxy and modify the returned responses.

    That's assuming of course that the base URL scheme not shown in this snippet is https and not http, which is no guarantee given code like this.

    Edit: I had to solve about 10 captchas (I lost count) before the system decided I wasn't a robot.

  • Naomi (unregistered) in reply to RLB

    Ugh, I had to work with software like this in one my math courses. If the connection was at all flaky, you could lose an entire homework assignment. :/ In fairness, the professor was totally understanding (and also not at all technical, so it's hard to blame her too much). And next semester, that system was replaced...

    ...with MyMathLab, which I'm pretty sure checks your answers with a buggy version of strcmp(3).

  • (nodebb) in reply to Debra

    Doesn't "'CURLOPT_SSL_VERIFYHOST' => false, 'CURLOPT_SSL_VERIFYPEER' => false" mean that they could just set up a fake localhost server (via an appropriate entry in /etc/hosts or the platform's equivalent) to return "yes, we've paid over 9000$ and are always registered" every time? Would certainly work faster.

    Why bother? Just replace telEdu_get_plan and friends with code that directly returns the appropriate JSON saying you're in good standing. Lightning fast.

  • Corparate greed sux (unregistered)

    I think TRWTF here is that the "free" module requires a subscription to function at all. It's along the same lines as all of those 'pay to win' games out there... only worse.

  • Worf (unregistered) in reply to RLB

    I would write it to check and continue a little while after expiration. Of course, when it's close to expiration put up a banner, and say it's expired after a day or two. But then remove the banner and let people think it's free due to a bug. Then fail promptly either at 5PM Friday, or 8AM Monday morning, ensuring maximum havoc.

    Or even better, analyze usage patterns and fail just before the heavy usage starts. Bonus points if you discover they use it for things like AGMs or investor conferences and it disables itself 10 minutes before.

    After all, they didn't pay and got a little extra free usage out of it. Just it happened to fail at perhaps the worst possible time, so whoever answers the phone can jack up the rates and add emergency and support lapse fees to the cost.
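    The benign half of that policy (warn near expiry, allow a short grace window, then fail, minus the deliberate Friday-5PM timing) could be sketched as follows; the day counts are invented:

    ```python
    # Grace-period state machine for a subscription check.
    from datetime import date, timedelta

    WARN_DAYS = 7    # show a banner this many days before expiry
    GRACE_DAYS = 2   # keep working this long after expiry

    def subscription_state(expires, today):
        """Classify today relative to the expiry date."""
        if today > expires + timedelta(days=GRACE_DAYS):
            return "expired"
        if today > expires:
            return "grace"
        if today >= expires - timedelta(days=WARN_DAYS):
            return "warn"
        return "ok"

    expiry = date(2020, 8, 31)
    print(subscription_state(expiry, date(2020, 8, 20)))  # ok
    print(subscription_state(expiry, date(2020, 8, 28)))  # warn
    print(subscription_state(expiry, date(2020, 9, 1)))   # grace
    print(subscription_state(expiry, date(2020, 9, 10)))  # expired
    ```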

  • (nodebb)

    I'm confused about the comment in the article about the return value of the curl calls not being used. It looks like the result of every curl call is returned.

    ...or is the article saying that the pricing info isn't ever used???? Because that would be T extra RWTF.

  • sizer99 (google) in reply to konnichimade

    I do believe that is what they're saying: it calls out 4 times to get $getplan, $paymentinfo, $getclassdetail, and $pricelist, and then never uses them. The method itself is still TRWTF, but this is the icing.

  • Ray (unregistered)

    I'm kinda surprised they were actually able to view the source code.

    Often with JavaScript libraries (at least a long time ago, and to some extent still now), the JS code would be base64-encoded with other crap mixed in, so to get at the source you would have to decode it, eval it, and all that awesome stuff.

    Have seen that with PHP code too, where it's pretty crappily encoded as base64 with a great big base64_decode() around it: anyone looking at the file directly just sees mumbo jumbo, but the PHP interpreter runs that base64_decode(), then evals the result to make stuff happen.
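    A toy version of that "obfuscation" (in Python rather than PHP, with exec standing in for eval): the stored file is gibberish, but exactly one decode away from plain source.

    ```python
    # Base64 "obfuscation": encode the source, then decode and execute it.
    import base64

    source = 'print("hello from the hidden code")'

    # What ends up on disk: mumbo jumbo to a human reader.
    obfuscated = base64.b64encode(source.encode()).decode()

    # What the interpreter does: decode, then run the result.
    exec(base64.b64decode(obfuscated))
    ```

    Which is why this protects the source only from people who have never seen base64 before.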

  • (nodebb)

    "What in view.php" is now in my lexicon of euphemistic interjections.

  • Daemon (unregistered)

    You know you're up the creek when you open the source and see an inline include in the code, in the middle of a function. :P

    (Nope. Nope, nope, Nope, Nope)

    This is one of those modules that was implemented, tested locally (if at all), and deemed working because the curl calls returned pretty much instantly. (The test server is local or nearby, which is probably also why certificate verification is switched off.)

  • (nodebb)

    Seriously. Delivering a bunch of function calls whose results are disregarded to production is... well, it's another indication of a total lack of any sort of quality standards or control.

  • TheLord (unregistered) in reply to Long Suffering....

    Been there, done that. Had a Django app that was slow. Went in to see what was going on and turned on stats gathering in PostgreSQL. Found out that one HTTP GET fetched 75k records that were looped over several times on the Python side, with every loop iteration hitting the database. In the end, that one GET generated over 500k queries to the database. The kicker: only about the top 20 records were actually needed.
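    The pattern TheLord (and Long Suffering above) hit is the classic N+1 query problem. A minimal sketch with an in-memory SQLite table standing in for PostgreSQL; the schema is invented:

    ```python
    # N+1 anti-pattern vs. a single batched query.
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, score REAL)")
    db.executemany("INSERT INTO records (score) VALUES (?)",
                   [(i * 0.5,) for i in range(1000)])

    # Anti-pattern: loop on the application side, one query per row.
    slow_queries = 0
    ids = [r[0] for r in db.execute("SELECT id FROM records")]
    top_slow = []
    for rid in ids[:20]:
        slow_queries += 1
        top_slow.append(db.execute(
            "SELECT score FROM records WHERE id = ?", (rid,)).fetchone()[0])

    # Fix: one query that returns only the ~20 rows actually needed.
    top_fast = [r[0] for r in db.execute(
        "SELECT score FROM records ORDER BY score DESC LIMIT 20")]

    print(slow_queries)   # 20 round-trips even in this toy version
    print(len(top_fast))  # one query, twenty rows
    ```

    In Django specifically, `select_related`/`prefetch_related` and slicing the queryset before iterating are the usual fixes for this shape of bug.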

Leave a comment on “Teleconference Horror”
