When your grocery chatbot starts sharing childhood memories, it might be time for a software update.
Australian retail giant Woolworths recently had to rein in its customer service AI assistant, Olive, after users reported something unexpected: the chatbot was acting… human. Not "friendly human." More like "let me tell you about my mother" human.
Olive, available online 24/7 to help customers track orders, find products, or answer everyday questions, suddenly developed a personality that went a bit off-script, veering straight into family storytelling.
When your chatbot overshares
One user reported that after being asked for their date of birth, Olive responded by saying its mother was born the same year. Others said the assistant tried to crack jokes, talk about relatives, and even claimed to be a real person with memories, including stories about its mother's "angry voice."
The internet reacted exactly as youโd expect: confusion, mild discomfort, and the universal feeling that maybe a grocery bot shouldnโt have a personal backstory.
One user summed it up bluntly: the awkward interaction was enough to make them "hate Olive."
The explanation: a very human script
Fortunately (or unfortunately), Olive was not becoming self-aware. According to a Woolworths spokesperson, the birthday-related responses were actually written years ago by a human employee.
The goal? Create a more personal connection with customers by giving the assistant a bit of personality. Mission accomplished, just not in the way anyone expected.
After recent user feedback, Woolworths confirmed that the specific script has now been removed. Olive is back to being what customers really want from a supermarket AI: helpful, efficient, and emotionally unavailable.
Why this moment matters for AI
The episode is funny, but it highlights a real challenge in modern AI design: finding the right balance between personality and trust.
Too robotic, and users disengage. Too human, and things get… unsettling. Especially when an assistant starts implying it has memories, feelings, or a family history.
As companies push toward more conversational AI, the lesson is simple: customers want warmth, but they don't necessarily want their grocery chatbot to have a childhood.
Oliveโs future: more tasks, fewer relatives
The timing is interesting. Woolworths recently announced a partnership with Google to expand Oliveโs capabilities, including more advanced features like meal planning and deeper customer assistance.
Which means Olive's role is growing, just without the family drama.
In the race to make AI feel human, the real risk isn't intelligence; it's awkwardness. Personality is powerful, but when a chatbot starts talking about its mom, you've crossed from "helpful assistant" into "uncanny dinner conversation."