
How is this any faster than ChatGPT? Unless you have faster GPUs than OpenAI, the only way to be "faster" is to use a smaller model.


The UX is much snappier:

> We didn't do anything interesting with the AI here. What's interesting is the experience we built around it.

> I'm trying to take my expertise in building really good, really performant applications in a local-first way for the web and apply it to make other apps that deserve this type of treatment feel better.

> Chat is saying that this is pure UX. Absolutely! The goal here is to make the user experience as good as possible, not because those 5 seconds matter a whole lot but because the actual experience matters a lot. You should feel good when you're navigating around these apps and doing things.

Much more detail in the video announcement: https://youloop.leftium.com/?v=bIr7NtNRDmE&a=519&b=551


We do: we're using a really powerful hosted cluster on Azure (which has an exclusive licensing deal with OpenAI).

Excluding the client-side performance wins, we're up to 2x faster than ChatGPT.com using the same model: https://x.com/ryandavogel/status/1878647963507163431



