

I’m not so sure the “open source” part is working either when you think about how AI tools were trained.
It’s really sad, because the accessibility of software development and the collaborative nature of the open source community are a big part of what drew me to software engineering as a career, and they’ve always been among the first things I mention when explaining why I love it. But, of course, these fucking evil companies found a way to take every individual part of something good and twist it into something awful.
The pattern of demoing stuff like this and then responding to the glaring issues and failures with “but just wait until the next model bro, eventually this will work bro, just need a larger context window, maybe some recursive prompting, we’re basically already there bro” is so embarrassing. If I claimed to have written this myself and presented it at a tech conference (or hell, even turned it in as an assignment in a CS course), I’d be laughed out of the room the moment it failed to compile hello world. The fact that people see this and get excited is so bizarre.
If it can’t even compile hello world, can you imagine all the easily exploitable security issues, the unhandled edge cases, the major performance problems, etc. buried in that dumpster fire of source code? That this is what passes for software “engineering” now is so sad. Too bad they won’t start using this compiler at Anthropic.