An Australian travel website was forced to pull content and issue a warning after tourists began arriving in a remote Tasmanian town in search of hot springs that were never there: a destination invented entirely by artificial intelligence and listed online as a must-see for 2026.

The fictional location, called Weldborough Hot Springs, was featured in a Tasmania Tours article titled “The 7 best hot spring experiences in Tasmania in 2026.” The article described a “secluded forest retreat” offering “a peaceful escape” and “an authentic connection with nature,” and claimed that walkers would find pools of water “rich in healing minerals.” The problem, as bemused locals soon had to explain, was that Weldborough has no hot springs at all.
Confused tourists, confused locals
Weldborough is a small settlement in north-east Tasmania, known for its pub and its proximity to forests and rivers. After the article was published in July 2025, visitors began asking for directions to the advertised pools, often at the Weldborough Hotel, the town’s best-known landmark.

Local publican Christie Probert told the Australian Broadcasting Corporation that the inquiries soon became a daily occurrence. “Just two days ago, I met a group of 24 drivers from the mainland who had actually taken a detour to come to the hot springs,” she said. Instead, Probert found herself explaining that the only nearby waterway was the Weld River, which she described as “icy cold” and “definitely not a hot spring.” “Honestly, they’re more likely to find a sapphire than… to find a hot spring,” she said, adding that she had jokingly promised a free drink to anyone who managed to find the mysterious pools. “If you find the hot springs, come back and tell me and I’ll give you beer all night long, but no one has come back yet.”
“Our AI is completely screwed up”
Tasmania Tours is operated by Australian Tours and Cruises, a New South Wales-based company that operates several travel booking websites. Its owner, Scott Hennessy, acknowledged the error and said the article, along with other AI-generated posts, had been removed.

“Our AI is completely screwed up,” Hennessy said, explaining that the company had outsourced some marketing content to third parties that use AI. While posts are usually reviewed before publication, some went live while he was overseas.

“We’re trying to compete with the big guys, and part of that is keeping our content fresh,” he said. “We don’t have the capacity to write enough content ourselves, which is why we outsource some of it. Sometimes it’s perfect, really good, and does what you want it to do, and sometimes it’s completely wrong.”

Hennessy stressed that Tasmania Tours was a legitimate business selling genuine tours, not a scam website. “We’re not a scam, we’re a married couple trying to do the right thing for people… We’re legitimate, we’re real people and we employ sales people,” he said. He added that all AI-generated blog posts were now being reviewed.
AI travel advice faces growing problems
Weldborough’s case is not isolated. Travel experts say so-called “AI hallucinations,” in which systems confidently invent facts, are increasingly sending people to the wrong places or offering unsafe advice. Tourism Southern Tasmania’s Anne Hardy said research showed about 90 per cent of AI-generated itineraries contained at least one error, while more than a third of travellers now relied on AI to plan their trips. Common errors include incorrect opening times, inaccurate descriptions and, as in this case, destinations that simply do not exist. Similar incidents have been reported internationally, including tourists trying to visit a non-existent canyon in Peru and visitors in Malaysia searching for an AI-generated cable car attraction.

For Weldborough, the incident brought unwanted attention and a steady stream of disappointed visitors. For travellers more broadly, it is a stark reminder that, no matter how convincing the language or imagery, not everything generated online reflects reality on the ground.


