
Not quite; there are still trillions of dollars to burn through. We'll probably get hardware that can accelerate LLM training and inference a millionfold, but it still won't be anywhere close to AGI.

It's interesting to think about what emotions/desires an AI would need in order to improve itself.



The actual business model is in local, offline commodity consumer LLM devices. (Think something the size and price of a Wi-Fi router.)

This won't happen until Chinese manufacturers have the capacity to make these cheaply.

I.e., not in this bubble; you'll have to wait a decade or more.




