I had this moment recently with implementing Facebook OAuth. I don’t need to spend mental cycles figuring that out, doing the back and forth with their API, pulling my hair out at their docs, etc. I just want it to work so I can build my app. AI just did that part for me, and I could move on.
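To be concrete about what got delegated: a minimal sketch of the two-step login/token-exchange flow, assuming the standard Graph API endpoints. The version string, scopes, and function names here are placeholders of mine, not Facebook's canonical example; check their current docs before relying on any of it.

```typescript
// Hedged sketch of Facebook's OAuth flow, server side.
// "v19.0" is a placeholder API version; scopes and names are illustrative.
const FB_VERSION = "v19.0";

// Step 1: send the user to Facebook's login dialog.
function loginUrl(clientId: string, redirectUri: string, state: string): string {
  const params = new URLSearchParams({
    client_id: clientId,
    redirect_uri: redirectUri,
    state, // CSRF token you generated and stored server-side
    scope: "public_profile,email",
  });
  return `https://www.facebook.com/${FB_VERSION}/dialog/oauth?${params}`;
}

// Step 2: on the redirect back, exchange the short-lived code for an access token.
async function exchangeCode(
  code: string,
  clientId: string,
  clientSecret: string,
  redirectUri: string,
): Promise<string> {
  const params = new URLSearchParams({
    client_id: clientId,
    client_secret: clientSecret,
    redirect_uri: redirectUri, // must match the URI used in step 1
    code,
  });
  const res = await fetch(
    `https://graph.facebook.com/${FB_VERSION}/oauth/access_token?${params}`,
  );
  if (!res.ok) throw new Error(`token exchange failed: ${res.status}`);
  const body = await res.json();
  return body.access_token;
}
```

None of it is hard, it's just the kind of tedious glue where the docs, the redirect dance, and the error cases eat an afternoon.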
> no amount of AI is going to help you debug why your auth flow suddenly broke.
What? Coding agents are very capable of helping fix bugs in specific domains. Your examples are, like, the exact place where AI can add value.
You do an update and things randomly break: tell Claude to figure it out, and it can go look up the breaking changes in the new versions, read your code, tell you what happened, and fix it for you.
We're repeating 2008, when nobody had a good sense of who actually owed what to whom. This is how Martin Casado can say this isn't like dot-com, because "there isn't as much debt". The debt is there, just shoved behind the couch.
I actually started writing a very similar essay, but the hyperbole got too out of hand – open source isn't dying anytime soon.
I do think that SDKs and utility-focused libraries are going to mostly go away, though, and that's less flashy but does have interesting implications imo.
I'm inclined to agree somewhat about libraries. I'm not entirely certain that it is a bad thing.
Perhaps it would be more accurate to say libraries will change in form. There is a very broad spectrum of what libraries do. Some of the very small ones may just become purpose-written inline code. Some of the large, hated-but-necessary libraries might get reduced to manageable chunks if the people who use them can get AI to strip them down to the necessary components. Stripping a library down like that is a lot of work for an individual, which makes it easier to just bite the bullet and use the bloated mass library. Being able to make an AI do that drudge work might lower the threshold enough that some of those things actually get improved.
I also wonder about the idea of skills as libraries. I have already found that I am starting to put code into skills for the AI to use as templates for output. Developing code in this way would let you add the specific abilities of a library to any skill-supporting AI.
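As a hypothetical example of what I mean, this is the sort of template snippet that might live in a skill file: a small retry helper the agent copies and adapts inline instead of pulling in a dependency. The names and defaults are mine, not from any real skill.

```typescript
// Hypothetical template stored in a skill, standing in for what would
// otherwise be a tiny utility library. The agent pastes and adapts this
// inline rather than adding a package to the project.
async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 200,
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Exponential backoff between attempts: 200ms, 400ms, 800ms, ...
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** i));
    }
  }
  throw lastError;
}
```

The appeal is that you get the library's ability without the library: the result is plain, readable code sitting in your repo, specialised to the one case you needed.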
For me, Apple and their vision shaped the current state of technology in society the most. You could definitely debate between Apple and Google, but I don't think another company is really in the conversation.
It’s better for them if you don’t know how long you’ve been talking to the LLM. Timestamps can remind you that it’s been 5 hours; without them you’ll think less about timing and just keep going.
Wang is a networking machine and has connected with everyone in the industry. He was likely brought in as a recruiting leader. Mark being Mark, though, he doesn’t understand the value of vision and figured getting big names in the same room was better than actually having a plan.
Your last sentence suggests that he deliberately chose not to create a vision and a plan.
If, for whatever reason, you don't have a vision and a plan, hiring big names to help kickstart that process seems like a way better next step than "do nothing".
Wang was not Zuck's first choice. Zuck couldn't get the top talent he wanted, so he got Wang. Unfortunately Wang is not technical; he excels at managing the labeling company and being the top provider of such services.
That's why I also think the hiring angle makes sense. It would actually be astonishing if he could turn technical and compete with the leaders at OAI/Anthropic.
You’re right – the way I phrased it assumes “having a plan” was a possibility for him. It isn’t. The best he was ever going to do was get talent in the room, put out a Thinking Machines knockoff blog post with some hand-wavy word salad, and stand around until they did something useful.
The case against this EO is not about “banning new technology”. It’s about not allowing the federal government to ban any state regulation. And states having the power to make their own rules is maybe the most American value there is.