
And robots won't do that either. What if the employee uses hearing to determine whether there's a hazard (another moving vehicle nearby) before jumping in to pick a pallet? How would the robot know by just "looking"? How should it prioritise vision, audio, and its other senses?


There's no reason to expect models won't be able to handle this even better than humans.
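For what it's worth, "prioritising" senses is something a model can learn rather than something hand-coded. Here's a minimal sketch (in PyTorch) of attention-based fusion over per-sensor embeddings, where the model learns how much weight to give each stream. The modality names and embedding size are just illustrative assumptions, not any specific robot stack:

    # Minimal sketch: learned weighting over sensor streams.
    # Assumes each sensor (vision, audio, force) is already encoded
    # into a fixed-size embedding; all names here are hypothetical.
    import torch
    import torch.nn as nn

    class ModalityFusion(nn.Module):
        """Fuse per-sensor embeddings with learned attention weights,
        so the model itself decides how much to trust each sense."""
        def __init__(self, embed_dim: int):
            super().__init__()
            # One shared scoring head; softmax turns scores into weights.
            self.score = nn.Linear(embed_dim, 1)

        def forward(self, embeddings: torch.Tensor) -> torch.Tensor:
            # embeddings: (batch, num_modalities, embed_dim)
            scores = self.score(embeddings)           # (batch, M, 1)
            weights = torch.softmax(scores, dim=1)    # attention over senses
            return (weights * embeddings).sum(dim=1)  # (batch, embed_dim)

    # Hypothetical usage: vision, audio, and force-torque embeddings.
    vision = torch.randn(1, 256)
    audio = torch.randn(1, 256)  # e.g. an approaching forklift's engine noise
    force = torch.randn(1, 256)
    fused = ModalityFusion(256)(torch.stack([vision, audio, force], dim=1))

Trained end to end on task outcomes, a setup like this can upweight audio exactly in the situations where sound is the earliest warning, which is the behaviour the parent comment is asking about.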



