Hacker News

Your concept of cheating is simply how LLMs work.


It is not. LLMs do not just memorize; they also extrapolate, otherwise they would be useless. The same is true of any ML model.
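The memorize-vs-extrapolate point holds even for the simplest models. A minimal sketch (plain least squares, nothing LLM-specific, all names illustrative): the model is fit on a handful of points and then queried far outside the training range, where it has nothing to "recall".

```python
# Minimal sketch (not an LLM): a least-squares line fit as an example of a
# model generalizing beyond its training points instead of memorizing them.
# All names here are illustrative.

def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# Training data: points from y = 2x + 1, only for x in 0..4.
xs = [0, 1, 2, 3, 4]
ys = [2 * x + 1 for x in xs]

a, b = fit_line(xs, ys)

# The model was never shown x = 100, yet it extrapolates correctly:
print(a * 100 + b)  # 201.0
```

A lookup table over the five training pairs would answer nothing at x = 100; the fitted parameters do, which is the (much weaker, but analogous) sense in which any ML model goes beyond its training set.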



