
As many of you know, outside of being a writer, I also work on open-source projects. The most famous of these is Sockbot, a chatbot platform that interfaces with various forum and message clients. If you're reading this, my boss didn't object to pointing out that my work has never made this particular mistake—but maybe we'd be more famous if we had.

This particular chatbot was designed for internal use. The company, like many others, had moved to Slack as the IM program for its development teams. Like every other team that found itself with a Slack instance, the developers quickly set about writing bots to interact with the platform and assist with common tasks: linking to an issue in the company Jira, pushing a Docker container out to the QA environment, and ordering lunch.

Today's submitter, Suzie, wanted to add Google search to her local bot, Alabaster. Often, when people started talking about something, they'd grab the first link off Google and dump it into the chat for context. With a search command, Alabaster could do that for them. Sure, it wasn't the most useful thing in the world, but it was simple and fun, and really, isn't that why we program in the first place?

Of course, Google puts a rate limit of 100 queries per day on its search API, and you need an API key. Bing, on the other hand, had a much higher limit at the time, 5,000 queries a month, and was much easier to integrate with. So Suzie made the executive decision to settle for Bing and eat the gentle razzing she'd get in response.

If you've never used a chatbot before, there are two ways to design one. The harder but much-nicer-to-use way is to build a natural language processor that can tell when you're talking to the bot and respond accordingly. The easier and thus far more common way is to have a trigger word and a terse command syntax, like !ban user to ban a user from the chat room. Command words are typically prefixed with a bit of punctuation so they can't accidentally start an ordinary sentence. In the example I gave, the ! tells the bot that you're talking to it, ban is the command, and user is the argument passed to the command handler.
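To make that concrete, here's a minimal sketch of the trigger-and-command style in C# (the language Suzie's snippet later in the article appears to be). The class, method, and handler names here are mine, purely for illustration, not anything from Alabaster:

class CommandBot
{
    const string Trigger = "!";

    // Dispatch one chat line: "!ban user" -> command "ban", argument "user".
    public void OnMessage(string text)
    {
        if (!text.StartsWith(Trigger))
            return; // the bot isn't being addressed

        var parts = text.Substring(Trigger.Length).Split(new[] { ' ' }, 2);
        var command = parts[0].ToLower();
        var argument = parts.Length > 1 ? parts[1] : "";

        if (command == "ban")
            BanUser(argument);
    }

    void BanUser(string user) { /* hypothetical handler */ }
}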

Alabaster was using !! as his trigger, so Suzie settled on !!bing as the bot command to start a search, passing in the rest of the line. Since she wasn't using Sockbot, she had to handle the line parsing herself. She went the easy way and replaced the first occurrence of the word "bing" with an empty string before handing the rest of the line to the Bing API.

Or ... she tried to. In reality, she wrote var searchText = message.TargetedText.ToLower().ReplaceFirst("big ", "").Trim();, missing the all-important "n" in "bing".
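For the record, the fix she intended is a one-letter change, though .NET's string type has no built-in ReplaceFirst, so the sketch below assumes an extension method along the lines her code implies:

static class StringExtensions
{
    // Hypothetical helper matching the ReplaceFirst call in Suzie's code;
    // the standard string class only offers Replace, which swaps every occurrence.
    public static string ReplaceFirst(this string text, string search, string replacement)
    {
        int index = text.IndexOf(search);
        if (index < 0)
            return text;
        return text.Substring(0, index) + replacement + text.Substring(index + search.Length);
    }
}

// What Suzie meant to write: strip "bing ", not "big ".
var searchText = message.TargetedText.ToLower().ReplaceFirst("bing ", "").Trim();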

This wouldn't usually be a big deal. If you start your Google search phrase with the word "google", it's usually ignored while Google searches for the rest of the phrase. But because of the typo, "bing" was never stripped, so every query sent to the API still led with the word. Bing, however, is also part of several people's names, such as Bing Crosby, Chandler Bing ... or porn star Carmella Bing, the most searched of the three. Somehow, Suzie had also left NSFW results enabled. Every time Alabaster was asked to search, it came back with X-rated results about the porn star.

Suzie scrapped the module and started over. This time, she decided Google's rate limit was probably just fine ... just fine indeed.
