Admin
"This IP range has been banned (DDoS attacks)."
This reminds me of how my friends and I got our Uni IP-range-banned from a forum. All that happened was a few double posts, just because the F5 key got stuck and the browser kept reloading the "post successful" page... it was an accident, I swear!
After 2 years or so, the Uni got unbanned.
Admin
And it was running both Perl and PHP?
I call bullshit on this one.
Admin
If this were a production app with many users, I could see querying the ultimate data source periodically and caching the results; given how static the data is, once a day would surely be often enough. But for a class project that seemed to me to add more work than the requirements called for.
I guess a lot depends on how much work it really had to do each time it hit CityEats. I hadn't heard of the site before and assumed the name was part of the anonymization, but now I see it's real. So okay, it displays a list of all the restaurants meeting the search criteria. If he just maps the user's location to a "neighborhood" and picks restaurants from that, he doesn't need to touch the detail pages, so there's only one page to hit and scrape (assuming he can figure out which neighborhood to use without hitting CityEats). If he wants a more precise location, I see that the little Google Maps image on each restaurant detail page actually embeds the lat/long to select the display area, so he could scrape that for location info; in that case he'd have to hit every detail page. Still, it depends on the neighborhood: how many restaurants show up? Typically half a dozen to a dozen, it looks like. So each scraping visit would hit the search results page plus maybe a dozen detail pages. That doesn't seem very heavy.
Of course it's possible that he didn't try to narrow the search, but on each scraping visit he hit every restaurant on the site. I'm not sure how many total they have, probably at least hundreds. I could see that getting heavy.
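For what it's worth, here's a rough Python sketch of the "hit the search page once a day and cache it" idea from above. The URL, the neighborhood query parameter, and the "restaurant-name" markup pattern are all made up for illustration; you'd have to look at the actual CityEats pages to fill those in.

# A minimal sketch of the once-a-day caching approach described above.
# SEARCH_URL, the query parameter, and the "restaurant-name" markup are
# hypothetical placeholders, not CityEats' real structure.
import json
import os
import re
import time
import urllib.parse
import urllib.request

CACHE_FILE = "restaurants_cache.json"        # hypothetical cache location
CACHE_TTL = 24 * 60 * 60                     # refresh at most once a day
SEARCH_URL = "https://www.example.com/search?neighborhood={}"  # placeholder URL


def fetch_restaurants(neighborhood):
    """Return cached results when fresh; otherwise scrape the search page once."""
    cache = {"fetched_at": 0, "data": {}}
    if os.path.exists(CACHE_FILE):
        with open(CACHE_FILE) as f:
            cache = json.load(f)
        fresh = time.time() - cache.get("fetched_at", 0) < CACHE_TTL
        if fresh and neighborhood in cache.get("data", {}):
            return cache["data"][neighborhood]

    # One hit on the search-results page per neighborhood per day.
    url = SEARCH_URL.format(urllib.parse.quote(neighborhood))
    html = urllib.request.urlopen(url).read().decode("utf-8")

    # Assumed pattern: restaurant names inside <a class="restaurant-name">...</a>.
    names = re.findall(r'<a class="restaurant-name"[^>]*>([^<]+)</a>', html)

    cache["data"][neighborhood] = names
    cache["fetched_at"] = time.time()
    with open(CACHE_FILE, "w") as f:
        json.dump(cache, f)
    return names

Grabbing the lat/long from the detail pages would work the same way, just with one extra request per restaurant, which is why caching the results instead of re-scraping on every user request matters.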
Admin
Please, try to show a little sensitivity. Right or wrong, the president's daughter was chained in the comments dungeon, and I assure you it was no laughing matter. Fortunately Paula Bean came up with a brillant way to get her out. Okay, that's not true, but it's not false either. It's file_not_found.
Admin
I love this story.
Admin
Brian fails... for responding, "Yeah, sure, I'll leave the project in the hands of That Guy."
You deserve what ya get, sorry man. :)
Admin
Do any of you bitches know the specs?
Admin
Exactly my thought!!
Admin
Forget about 'That Guy' for a moment. What about the others in the group? What were they doing for three weeks?!
Admin
I was totally That Guy, except I managed to get my entire university banned from all nih.gov subdomains (including PubMed).