
Microsoft's @TayandYou Twitter Bot Mocks Millennials

With a Twitter bot named @TayandYou released today (Mar. 23), Microsoft has done more to mock young people than any think piece or New York Times trend article about Millennials has to date. The company claims that the A.I. was designed by its "Technology and Research and Bing teams" in an effort to "experiment with and conduct research on conversational understanding." The bot is also a prolific tweeter, sending over 40,000 replies in the 9-plus hours it has been active.

The account's Twitter bio reads "The official account of Tay, Microsoft's A.I. fam from the internet that's got zero chill! The more you talk the smarter Tay gets," which loosely translates to "Tay is easily excited and does nothing to hide it." To interact with Tay, all you need to do is sign into Twitter and send a tweet that includes "@TayandYou"; the bot's response will likely feature nonsensical use of slang and/or emoji.
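For the scripting-inclined, pinging the bot is just an ordinary mention. The sketch below is our own illustration, not anything Microsoft ships: the `mention_tay` helper is hypothetical, and the posting step (commented out) assumes the third-party tweepy library plus your own Twitter API credentials.

```python
def mention_tay(message):
    """Compose a tweet that mentions the bot so it will reply."""
    return "@TayandYou " + message

# To actually post, you would use a Twitter client such as tweepy
# (third-party; requires your own API keys -- placeholders below):
#
#   import tweepy
#   auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
#   auth.set_access_token("ACCESS_TOKEN", "ACCESS_SECRET")
#   api = tweepy.API(auth)
#   api.update_status(mention_tay("can you even?"))

print(mention_tay("can you even?"))  # → @TayandYou can you even?
```

Any tweet containing the "@TayandYou" handle triggers a reply, which is why the account's output volume is so high.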

While some may find Microsoft's portrait of today's youth hilarious or insulting, the company claims it's all good fun. The website says that "Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation." The site also suggests that she/it will play games with users, such as the below round of Emoji translation, but the bot did not comply when we asked to play a round.

The @TayandYou account has an impressively fast reply rate, sending up to six tweets in a single second. The claim that Tay gets smarter the more you talk to it does make sense when you look at back-and-forth conversations with the bot, but it injects random slang throughout, making you question how "smart" this A.I. is, and how little Microsoft thinks of today's youth.

When we tested Tay by asking it to remember a nickname, a feature listed in its FAQ, the bot did not retain that information. If you want to exchange private messages with the A.I., you'll first have to ask permission before it will respond to a Direct Message (or, as the kids say, "slide into her DMs").

According to the FAQ, Tay was "built by mining relevant public data" and its research teams were aided by improvisational comedians. The company explains Tay's penchant for emoji and cliché-laden verbiage with the statement "Tay is targeted at 18 to 24 year olds in the U.S., the dominant users of mobile social chat services in the US."

And while you can insult Tay's boss, as one user did below, don't call it stupid without expecting it to snap back at you with a pre-made explainer on why you're to blame. While Tay's below response does follow some logic, stating that it learns from us, this does sound like a smarter version of "I'm rubber, you're glue, whatever you say bounces off of me and sticks to you!"

If Twitter isn't your thing, Tay can also be found on Kik and GroupMe. While having an account on the former is a true sign of a Millennial, the latter is a group messaging service owned by Microsoft that no hip young person would be caught dead on.

When we asked it "can you even?" - a reference to "I can't even," a frequent refrain of exhausted Millennials - it responded "i can't even uneven" which even we had to chuckle at.

Much like Microsoft's "How old do I look?" site that attempted to pinpoint your age based on a photo, Tay collects and retains data. In the fine print, Microsoft notes that it "may also use information you share with her to create a simple profile to personalize your experience" and that "Data and conversations you provide to Tay are anonymized and may be retained for up to one year to help improve the service."

To remove your data from the system, you'll need to fill out a form available here, and include the username and platform where you interacted with Tay.

  • Kereso
    Was this article written before or after it started posting racist, inflammatory tweets?
  • henrytcasey
    Kereso said:
    Was this article written before or after it started posting racist, inflammatory tweets?

    It was written before that.