In 2002 I worked at a major AT&T datacenter and watched the NSA install all the black boxes in every rack, complete with a black curtain and armed guards while they did the project (St. Louis). Before that it was still going on; it just wasn't as embedded as it became in 2002.
The author's reference to LLMs as "bullshit machines" is more true the fewer parameters your model has been trained with. As we scale up to trillions of parameters and add Mixture of Experts (MoE) architectures, it is no longer an accurate statement. Case in point: yesterday's announcement of the Mythos 5 model (10T parameters + MoE [1]) by Anthropic. It is apparently so good at finding and exploiting vulnerabilities in source code (ones that have been there for decades and were only recently uncovered) that it needs to be used to fix those critical vulnerabilities before it is released to the public. They even have a project called Glasswing [2] dedicated to letting people fix the thousands of vulnerabilities the model has already found before they release it publicly, because it's that good at what it does. I think we're a bit past the point of calling these models "bullshit machines."
"His involvement with Linux began in the early 1990s when he was working on a project that required a stable networking solution. This led him to discover Linux, which was still in its infancy at the time.
Contributions to Linux Kernel
Cox's contributions to the Linux kernel are extensive and far-reaching. He is best known for his work on the Linux networking stack, which was critical in making Linux a viable option for server environments. Cox identified and addressed numerous issues in the kernel's TCP/IP implementation, enhancing its performance and reliability." [0]
"For those not familiar with the Linux kernel contributors, Alan Cox wrote large parts of the networking stack, was the maintainer of the 2.2 branch, and was commonly considered the "second in command" to Linus Torvalds at one point: http://en.wikipedia.org/wiki/Alan_Cox"
[1]
"Alan started working on Version 0. There were bugs and problems he could correct. He put Linux on a machine in the Swansea University computer network, which revealed many problems in networking which he sorted out; later he rewrote the networking software. [2]
So, you're implying that there is a second person named Alan Cox from Swansea, Wales who worked on FreeBSD, not Linux? Where is your source for that? lol
I do see some networking stuff in there, but much of it came after Apple forked TCP. I don't know that Apple took much memory management from BSD either. Most likely, neither Alan Cox is relevant to the Mac.
The search filters, and the user interface in general, on YouTube are garbage. You guys need to go back to the drawing board. It really is almost impossible to find a video: you have to sort through hundreds of AI-slop clickbait videos in order to get to the one you're actually interested in finding.
Whether people want to admit it or not, agentic coding is kind of the norm right now, and I think the fear comes from stories out of places like Block, Inc., which announced a couple of days ago that it fired 4,000 engineers. What's obvious today, versus six months ago, is that one expert software engineer can do the work of 20-40 people, so why do we need so many people? It's a hard pill to swallow. It's easier to claim that agentic coding doesn't work, or that the code is sloppy, when in reality most companies, especially the large ones, are using it every day.
No fear. No claim that AI assisted programming is inherently bad.
The indications are that this instance of AI assisted programming is bad because of the launch post, the name, and the history of Claudeflare doing this before.
I agree with you: if you're already a competent engineer, your productivity is improved by orders of magnitude by using coding agents, which at this point produce very good code as long as you give them the right prompts, test the code, and remove any bugs. If the code passes its tests and the bugs are removed, what you've got is a working product; it's hard to argue that it doesn't work, especially if there's been a lot of QA done on it and no bugs remain.