News India Live, Digital Desk: Today, you and I are becoming dependent on Artificial Intelligence (AI) for even our smallest needs. Be it writing an office mail, coding, or finding a recipe – chatbots have become our new ‘Google’. But wait! Google CEO Sundar Pichai has said something that every internet user should listen to carefully.
Pichai recently spoke about a dangerous aspect of AI known as “hallucinations”. In simple language, this is a situation where AI gives you wrong information, but with so much confidence that you feel it must be the ultimate truth.
What is the danger of ‘Hyper-Confident’ AI?
Sundar Pichai has made it clear that today’s modern AI tools (be they Google’s or anyone else’s) have become extremely smart. But the problem is that when they don’t know the answer to a question, instead of saying “I don’t know”, they make up a story. And they tell this story with such hyper-confidence that an ordinary person gets deceived.
Imagine you ask an AI about a health problem and it confidently recommends the wrong medicine. How risky that could be!
Secret of the “Black Box”
Pichai also described this as the “black box” problem. Meaning, sometimes even the developers themselves don’t know for sure how and why the AI gave a specific answer. Although the technology is improving, the problem of “hallucinations” has not been completely eliminated yet. This is a challenge that all the big companies, like Google, Microsoft and OpenAI, are struggling with.
What should we do as a common user?
This news is not meant to scare us, but to wake us up. The simple meaning of Sundar Pichai’s words is this:
- Trust, but verify: Whatever information AI gives you, do not treat it as set in stone. Always cross-check it (fact check).
- Keep the judgment human: Take AI’s help in making decisions, but let your own wisdom have the final say.
- Use it wisely: AI is great for creative work, but be careful when it comes to facts.
So the next time your chatbot tells you a fact that sounds “too good to be true,” remember – it may just be speaking with confidence, not accuracy!