Hacker News

> The risks involving hallucinations are too damn high still, and may always be.

Yes, but in the limited realm of people who otherwise wouldn't get any advice at all, I think LLMs could play a useful role. American healthcare is so prohibitively expensive that many people with potential medical issues will avoid seeing a doctor until it is too late to do anything. Checking in with an LLM could help people at least identify red flags that really can't be ignored, and it would be more helpful than WebMD telling you that everything is cancer.



Not getting advice at all goes well beyond healthcare being too expensive: it could be that you can't get an appointment, or simply that you don't have the time or energy.


I think we may see society settle on being comfortable with their doctor using an AI, but not with their doctor being an AI.



