Hacker News | eoskx's comments

Interesting, but it cannot run CUDA or, more to the point, `nvidia-smi`.


Well, to be fair, the whole shebang is from a completely different company that has its own ML library and such, so that isn't too surprising. Although I agree that some CUDA shim or similar would be a lot more interesting, getting to the point of running inference and training with your very own library is pretty dope already.


Also, not surprising that LiteLLM's SOC2 auditor was Delve. The story writes itself.


Would a proper SOC2 audit have prevented this?

I've been through SOC2 certifications at a few jobs and I'm not sure it makes you bulletproof, although maybe there's something I'm missing?


SOC2 is just "the process we say we have is what we do in practice." The process can be almost anything. Some auditors will push on stuff as "required," but they're often wrong.

But all it means in the end is you can read up on how a company works and have some level of trust that they're not lying (too much).

It makes absolutely zero guarantees about security practices, unless the documented process makes those guarantees.


Yeah, that was my understanding as well, so I fail to see how a proper SOC2 would have prevented this.

I mean, ideally a proper SOC2 would mean there are processes in place to reduce the likelihood of this happening, and also processes to recover from it if it did end up happening.

But the end result could've been essentially the same.


It wouldn't have. lol.


Just so long as it was a proper SOC2 audit, and not a copy-pasted job:

https://news.ycombinator.com/item?id=47481729


Valid, but for all the crap that LangChain gets it at least has its own layer for upstream LLM provider calls, which means it isn't affected by this supply chain compromise (unless you're using the optional langchain-litellm package). DSPy uses LiteLLM as its primary way to call OpenAI, etc. and CrewAI imports it, too, but I believe it prefers the vendor libraries directly before it falls back to LiteLLM.


Not just as a gateway in a lot of cases; CrewAI and DSPy use it directly. DSPy uses it as its only way to call upstream LLM providers, and CrewAI falls back to it if the OpenAI, Anthropic, etc. SDKs aren't available.


Yep, DSPy and CrewAI have direct dependencies on it. DSPy uses it as its primary library for calling upstream LLM providers, and CrewAI, I believe, falls back to it if the OpenAI, Anthropic, etc. SDKs aren't available.


This is bad, especially from a downstream-dependency perspective. DSPy and CrewAI also import LiteLLM, so you might not be using LiteLLM as a gateway yourself but could still be pulling it in via those libraries for agents, etc.
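If you want to check whether LiteLLM is riding along as a transitive dependency in your environment, here's a minimal stdlib sketch (assumes Python 3.8+; the name-parsing is a rough approximation of requirement-string syntax, not a full PEP 508 parser):

```python
import re
from importlib import metadata

def packages_requiring(target: str) -> list:
    """Return installed distributions that declare `target` as a dependency."""
    dependents = []
    for dist in metadata.distributions():
        for req in dist.requires or []:
            # Requirement strings look like "litellm>=1.0 ; extra == 'proxy'";
            # grab just the project name at the front.
            match = re.match(r"[A-Za-z0-9._-]+", req)
            if match and match.group(0).lower() == target.lower():
                dependents.append(dist.metadata["Name"])
                break
    return sorted(dependents)

print(packages_requiring("litellm"))
```

If this prints anything, you're importing LiteLLM whether you meant to or not.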


Wow, the postmortem for this is going to be brutal. I wonder just how many people/orgs have been affected.


Yep, I think the worst impact is going to be from libraries that were using LiteLLM just as an upstream LLM provider library rather than as a model gateway. Hopefully, CrewAI and DSPy can get on top of it soon.


I'm surprised to see nanobot uses LiteLLM: https://github.com/HKUDS/nanobot

LiteLLM wouldn't be my top choice, because it installs a lot of extra stuff. https://news.ycombinator.com/item?id=43646438 But it's quite popular.


I completely removed nanobot after I found that. Luckily, I only used it a few times and inside a Docker container. litellm 1.82.6 was the latest version I could find installed; not sure if it was affected.
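For anyone doing the same audit, a quick stdlib sketch for checking whether litellm is present and which version you've got (this only reports what's installed; it doesn't know which versions were actually compromised, so compare against the advisory yourself):

```python
from importlib import metadata

def installed_version(pkg: str):
    """Return the installed version of `pkg`, or None if it isn't installed."""
    try:
        return metadata.version(pkg)
    except metadata.PackageNotFoundError:
        return None

ver = installed_version("litellm")
if ver is None:
    print("litellm is not installed in this environment.")
else:
    print(f"litellm {ver} is installed; check it against the advisory.")
```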


Very similar experience to my own. I was a 1P customer for years, but the product declined after the VC purchase, and I trust Apple more to get privacy & security right. Apple Passwords accomplishes the minimum of what I want a password manager to do. Yes, 1Password has some nice add-ons like SSH agent integration, secure notes, etc., but some of these aren't necessary or have workarounds, as outlined in this post.

I really do wish there were some way to integrate Apple Passwords with Linux, but I don't see that happening. FWIW, iCloud on Windows isn't horrible and has decent Apple Passwords support; it even works with iCloud Advanced Data Protection now.


I've asked multiple OpenAI employees on X who have been posting about the issue whether they will be processing bulk unclassified Americans' data, and what they will do when asked, since I think it's fair to assume they have received or will receive the same ask that was made of Anthropic. No response yet. The Head of National Security Partnerships at OpenAI seems focused on stating that the NSA is not able to use the contract. Whether or not that's true, it doesn't address the unclassified bulk data processing concern, which is a form of mass surveillance of Americans. Also not great: at least one OpenAI employee posted that the DoD "does not conduct domestic surveillance" and only issued a correction, after quite a backlash, stating that he was only quoting the Under Secretary of Defense.


> the NSA is not able to use the contract

They already have a "contract" with every FAANG. OpenAI is Microsoft.


Isn't OpenAI still under a court order not to delete any data, that is, incoming prompts and responses?


"Even as Mr. Trump published the post at 3:47 p.m., the two sides kept talking. Mr. Michael, who was on a call with Anthropic executives at the time, said the Pentagon wanted the company to allow for the collection and analysis of unclassified, commercial bulk data on Americans, such as geolocation and web browsing data, people briefed on the negotiations said. Anthropic told the Pentagon that it was willing to let its technology be used by the National Security Agency for classified material collected under the Foreign Intelligence Surveillance Act. But the company wanted a legally binding promise from the Pentagon not to use its technology on unclassified commercial data."

