ChatGPT recommends unsafe mountain hiking route to tourists in Poland
Occurred: January 2025
A route suggested by ChatGPT to three tourists hiking in the mountains in Poland proved dangerous and resulted in the hikers having to be rescued.
Looking to get to Dolina Pięciu Stawów from Hala Gąsienicowa in the Tatra Mountains in southern Poland, three hikers asked ChatGPT to show them the way.
The bot showed them the shortest route but failed to take into account the treacherous winter conditions, including icy paths and a blizzard.
Finding themselves in serious danger, the tourists had to call mountain rescuers, who evacuated them from the mountain.
A general-purpose tool, ChatGPT can be used in almost any situation. But it is also known to be inaccurate and unreliable, and to pose many known risks.
OpenAI warns users at the point of use that the system can make mistakes, but does not explain in an accessible or visible manner what those mistakes are or what they may mean. As a result, users such as the hikers in Poland can easily come to believe the system is the answer to all their questions.
Users of ChatGPT and equivalent systems should not rely on information generated by them, especially in potentially life-or-death situations.
Meanwhile, OpenAI should disclose known limitations and risks in more detail, closer to the point of use.
Operator:
Developer: OpenAI
Country: Poland
Sector: Media/entertainment/sports/arts
Purpose: Generate hiking route
Technology: Generative AI; Machine learning
Issue: Accuracy/reliability; Mis/disinformation; Safety
Page info
Type: Incident
Published: January 2025