Mixture of Experts isn't using multiple models with different specialties; it's more of a sparsity technique, where you massively increase the number of parameters but activate only a small subset of the weights (chosen by a learned router, per token) in each forward pass.
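
To make the sparsity concrete, here's a minimal sketch of a single MoE layer with top-k routing. All names and dimensions (moe_forward, router, the tiny 16-dim experts) are illustrative, not taken from any particular model:

    import numpy as np

    rng = np.random.default_rng(0)
    d_model, n_experts, top_k = 16, 8, 2

    # Each "expert" is a small feed-forward matrix. Together they hold
    # n_experts * d_model * d_model parameters, but each token only
    # ever touches top_k of them.
    experts = rng.standard_normal((n_experts, d_model, d_model)) * 0.02
    router = rng.standard_normal((d_model, n_experts)) * 0.02  # gating weights

    def moe_forward(x):
        """Route one token vector through its top_k highest-scoring experts."""
        logits = x @ router                   # one score per expert
        top = np.argsort(logits)[-top_k:]     # indices of the chosen experts
        gates = np.exp(logits[top])
        gates /= gates.sum()                  # softmax over the selected experts
        # Only top_k expert matrices participate; the rest are skipped entirely.
        return sum(g * (x @ experts[i]) for g, i in zip(gates, top))

    token = rng.standard_normal(d_model)
    print(moe_forward(token).shape)  # (16,) -- same shape as a dense layer's output

The point of the sketch: total parameter count scales with n_experts, while per-token compute scales only with top_k, which is why MoE models can be huge without a proportional increase in FLOPs per forward pass.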

