ChatGPT has already come to Bing and several Microsoft mobile apps. Now, its reach in phones is growing even further.
In a press release, Snap, the parent company of Snapchat, announced “My AI,” an AI chatbot that will be integrated into the popular messaging app. The chatbot will only be available to Snapchat Plus subscribers and will begin rolling out to U.S. users this week.
As with Bing with ChatGPT, My AI is not technically ChatGPT. Instead, it is its own AI chatbot built on the latest version of the GPT language model that powers ChatGPT and Bing with ChatGPT. As with the new Bing, Snap did not specify which version of the GPT model My AI uses, saying only that it is “built with the latest ChatGPT.” So it is unclear what similarities it may or may not have with “Sydney,” the dark alter ego of Bing’s chatbot that went off the deep end and professed its love for a New York Times reporter.
SnapChatGPT: What can My AI do?
The Snapchat Support page for My AI says the chatbot is designed to be a personal sidekick for users. It can answer trivia questions, offer advice and help plan a trip — hopefully with better results than when we tried to get Bing with ChatGPT to plan a trip to Amsterdam.
However, some features of the new AI chatbot raise eyebrows. Snapchat says that “You can give My AI a nickname and tell it about your likes (and dislikes!).” Given even the little we know about these relatively novel AI chatbots, this could be a disaster waiting to happen.
Feeding a language model like ChatGPT this kind of personal information feels like a recipe for repeating the events of Bing becoming Sydney, which caused Microsoft to initially set new limits on the new Bing before ultimately expanding access greatly by adding it to the Bing, Edge and Skype mobile apps. Clearly, Snap doesn’t agree with AI experts that “digital health warnings” are needed for chatbot AI.
SnapChatGPT: Proceed with caution
But users should be vigilant. Even though Snap says, “Don’t use My AI to generate political, sexual, harassing, or deceptive content, spam, malware, or content that promotes violence, self-harm, human-trafficking, or that would violate our Community Guidelines,” it is incredibly likely that the My AI chatbot will be used to do exactly that.
My AI will also store your personal data, though Snap does give instructions on how to delete that data. Data privacy is becoming an increasingly prominent issue with AI chatbots. The Telegraph recently reported that Microsoft staff review conversations with the new Bing, and users should assume that any data (text or voice) provided to these AI chatbots is being stored by the companies behind them.
For its part, Snap says that “Your interactions with My AI and city-level location will be used by My AI. Your data will be used to improve My AI and any other Snap products, including Ads, and to make them more personal and relevant to you.” Between these privacy concerns and the risk of a repeat of Bing’s chatbot getting confused to dramatic effect, definitely use My AI with caution.
We will try My AI out for ourselves and will report back with our findings.