5 things I’d never ask a chatbot — and what you should be asking instead
Not every problem should be solved by a chatbot
As someone who writes about AI for a living, I’ve found chatbots have become an extension of me, like a third arm or an extra toe that somehow makes me more effective day to day. And yet, while I’m happy to share my deepest thoughts and feelings with an AI, there are still plenty of topics I don’t use them for.
This is for a variety of reasons. In some cases, it’s a matter of safety: avoiding any risk of my information being leaked, used for training, or otherwise misused by accident. In other cases, it’s more a question of how well a chatbot can actually answer on certain subjects.
However, this isn’t to say I avoid these topics entirely. In some cases, I simply rethink the question, asking something related to the topic I’m after with a slight variation on the overall theme.
Here are five examples and, where possible, the ways I would rephrase the queries to get a better response.
Breaking news
At one point, chatbots like ChatGPT lagged behind on their knowledge, unable to answer questions about recent events. This has since changed: thanks to the ability to search the web for answers, most chatbots can now answer questions right up to the minute.
However, this doesn’t mean those answers are always accurate. When a story is still breaking, chatbots can struggle to get the facts and details right, since they are analyzing information that is still coming out.
While they often get the basics right, they can get caught up in the confusion of an unfolding event, offering up details that are unconfirmed or mixing up conspiracy theories with the truth.
A good way around this is to ask a chatbot for information on the news along with the sources it drew its facts from, allowing you to dig into the reporting in your own time.
Equally, chatbots can be useful for context on a breaking topic, giving the background that explains what is happening, rather than trying to report the breaking news itself.
Legal advice
The likes of ChatGPT or Gemini can be useful for understanding complicated concepts or deciphering legalese where it crops up, but no chatbot should ever be relied on for actual legal advice.
Laws vary by region, and a general answer isn’t right for every situation. If you give a chatbot incomplete information about your legal situation, it will fill in the gaps and make assumptions that could lead to an entirely wrong answer.
Instead, use chatbots to explain legal concepts or to decipher what a phrase or word in a legal document means, so you can understand for yourself what you are reading.
Moral or emotional advice
Chatbots have come a long way in their understanding of emotional and moral situations, and they can offer solid advice where needed. However, the complexity of these kinds of problems can trip them up.
ChatGPT doesn’t know your personal history and past experiences, and, as in other situations, it needs to fill in the gaps to answer a query. This can result in it reaching a conclusion that is wrong for you, or giving advice that is unhelpful.
Some chatbots are also tilted towards positive responses, which can make them overly supportive when what you really need is honest criticism of your decisions.
Anything involving personal data (health data, finances)
I have told chatbots huge amounts of personal information about myself. While that is obviously a personal choice, it is important to be wary of what information you end up giving.
Health data, passwords, and financial information are highly sensitive, and while most chatbots are reasonably safe to use, there are always risks: your data could end up in training sets, or the information could be misused.
Instead, try describing a hypothetical situation similar to your own, or ask broader questions around the topic rather than sharing the specifics.
Making decisions for you
When there is a hard decision to make, it can be tempting to hand it over to AI and let it settle your fate for you.
Unsurprisingly, the issue here is that chatbots, once again, can’t understand the intricacies of your life; they miss specific details and make a lot of assumptions based on their wider knowledge base.
Instead of asking AI to make a big decision for you, give it a full explanation of what you are trying to decide and ask it to generate a list of positives and negatives, or have it ask you a set of questions on the subject. Either approach can give you a bit more perspective.
