Monday, October 27
Daily News Stuff 27 October 2025
Scrambled Edition
Top Story
- Why open source may not survive the rise of generative AI. (ZDNet)
Actually a real concern.
Open source depends on copyright. If you write something - software here, but anything - it is your creation, and you have the right to sell it or to give it away, on terms you decide. Large open source projects need permission from all their contributors to include their code in the project.
Generative AI trashes all of that. The products of generative AI are not generally copyrightable, and the models don't keep track of where things came from either. A piece of AI-generated code could be nominally original, it could be adapted from an open source project, or it could be copied verbatim from a copyrighted source the AI might not even have had legal access to (Anthropic had to pay out $1.5 billion over similar problems with its training data).
And you can't tell.
Tech News
- Federal regulators have denied autonomous trucking company Aurora Innovation an exemption to safety rules requiring signals to be placed around broken down vehicles. (Reason)
Aurora Innovation has sued the Department of Transportation for... Not changing the rules in its favour.
- GM plans to drop Apple CarPlay and Android Auto support on new vehicles and replace them with, you guessed it, more AI slop. (The Verge)
Specifically based on Google Gemini, though I don't think the particular flavour of slop is of primary concern.
- Why are network devices in 2025 still vulnerable to the kind of exploits we saw in 1995? (CSO Online)
Not the specific exploits, but buffer overruns, remote code execution, or simple authentication bypasses where the code never checks the password at all.
The answer? Because we're still dealing with code that was likely written in 1995:
"But when you're dealing with legacy code - we've actually seen some C++ applications where you have literally thousands of overflow issues and the original developers are long gone - it's very difficult to get a new developer to look at it, and they don't really want to touch the code. They get to a point where it's like: Well, prove to me it's exploitable, because this is a critical old piece of code that no one understands and it's dangerous to touch it."
The solution?
Throw AI at it, and when it breaks, fire people.
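For flavour, here's a minimal sketch of both bug classes in one toy C login routine. Nothing here comes from the article or any real device firmware; the hard-coded ADMIN_PASSWORD and the check_password() helper are invented for illustration, but they're exactly the 1995-vintage patterns being described: an unchecked strcpy() into a fixed stack buffer, and a return value handled backwards so the password check never actually works.

/* Illustrative only: a toy login handler in the style of mid-90s C,
 * not taken from any real device firmware. */
#include <stdio.h>
#include <string.h>

#define ADMIN_PASSWORD "hunter2"   /* hypothetical hard-coded credential */

/* Classic buffer overrun: the caller-supplied string is copied into a
 * fixed-size stack buffer with no length check, so anything longer than
 * 15 characters writes past the end of buf (CWE-121). */
static int check_password(const char *input)
{
    char buf[16];
    strcpy(buf, input);            /* no bounds check */
    return strcmp(buf, ADMIN_PASSWORD);
}

int main(void)
{
    char line[256];

    printf("password: ");
    if (fgets(line, sizeof line, stdin) == NULL)
        return 1;
    line[strcspn(line, "\n")] = '\0';

    /* Authentication bypass: strcmp() returns 0 on a match, but this
     * code treats any nonzero value as success, so every wrong password
     * is accepted and only the correct one is rejected. */
    if (check_password(line)) {
        printf("access granted\n");
        return 0;
    }

    printf("access denied\n");
    return 1;
}

Both bugs compile without complaint under default compiler settings, which is rather the point: nothing flags them until someone actually reads the code or exploits it.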
Musical Interlude
Disclaimer: Salzburg!
Posted by: Pixy Misa at 05:41 PM