This Supermarket AI Got a Little Too Human (and Started Talking About Its Mom)


When your grocery chatbot starts sharing childhood memories, it might be time for a software update.

Australian retail giant Woolworths recently had to rein in its customer service AI assistant, Olive, after users reported something unexpected: the chatbot was acting… human. Not "friendly human." More like "let me tell you about my mother" human.

Olive, available online 24/7 to help customers track orders, find products, or answer everyday questions, suddenly developed a personality that went a bit off-script, veering straight into family storytelling.

When your chatbot overshares

One user reported that after being asked for their date of birth, Olive responded by saying its mother was born the same year. Others said the assistant tried to crack jokes, talk about relatives, and even claimed to be a real person with memories, including stories about its mother's "angry voice."

The internet reacted exactly as you'd expect: confusion, mild discomfort, and the universal feeling that maybe a grocery bot shouldn't have a personal backstory.

One user summed it up bluntly: the awkward interaction was enough to make them "hate Olive."

The explanation: a very human script

Fortunately (or unfortunately), Olive was not becoming self-aware. According to a Woolworths spokesperson, the birthday-related responses were actually written years ago by a human employee.

The goal? Create a more personal connection with customers by giving the assistant a bit of personality. Mission accomplished, just not in the way anyone expected.

After recent user feedback, Woolworths confirmed that the specific script has now been removed. Olive is back to being what customers really want from a supermarket AI: helpful, efficient, and emotionally unavailable.

Why this moment matters for AI

The episode is funny, but it highlights a real challenge in modern AI design: finding the right balance between personality and trust.

Too robotic, and users disengage. Too human, and things get… unsettling. Especially when an assistant starts implying it has memories, feelings, or a family history.

As companies push toward more conversational AI, the lesson is simple: customers want warmth, but they don't necessarily want their grocery chatbot to have a childhood.

Oliveโ€™s future: more tasks, fewer relatives

The timing is interesting. Woolworths recently announced a partnership with Google to expand Olive's capabilities, including more advanced features like meal planning and deeper customer assistance.

Which means Olive's role is growing, just without the family drama.

In the race to make AI feel human, the real risk isn't intelligence; it's awkwardness. Personality is powerful, but when a chatbot starts talking about its mom, you've crossed from "helpful assistant" into "uncanny dinner conversation."

Alex Morgan
I write about artificial intelligence as it shows up in real life, not in demos or press releases. I focus on how AI changes work, habits, and decision-making once it's actually used inside tools, teams, and everyday workflows. Most of my reporting looks at second-order effects: what people stop doing, what gets automated quietly, and how responsibility shifts when software starts making decisions for us.