Sunday, March 15
Daily News Stuff 15 March 2026
Wiffly Waffly Edition
Top Story
- Montana's Governor Greg Gianforte has signed into law the state's Right to Compute Act, the first legislation of its kind. (Western MT News)
Before you celebrate, the bill is the worst kind of waffly bullshit, forbidding the government from infringing on fundamental rights unless it really wants to.
I think they've spent too much time next door to Canada and the government needs to institute a hundred-mile decommunised zone.
Still doing better than Australia, where our government at one point threatened to legislate against inconvenient arithmetic.
- Whether you have a right or not, you can't do much computing on a MacBook Pro 14 with an M5 Max CPU. (Notebook Check)
It looks great on paper, but it throttles down to less than half power within two seconds. If you want the high-end processor, the only viable option is the 16-inch model.
Tech News
- Studies show that productivity gains from AI for typical office workers amount to as little as 16 minutes per week. (Nerds.XYZ)
I'm not a typical office worker, but I get more out of it just from using it as a better search engine. Watching Grok and ChatGPT discard dozens of useless results from Google before finding the right answer is... I don't know. Satisfying, in a strange way.
- Latency numbers every programmer should know but most don't. (GitHub)
Interesting that from the original version in 2012 to this 2020 update, the quoted SSD latency improved by a factor of 10 - from 150 microseconds to 16. And disk seeks from 10 milliseconds to 2, which has got to be measuring cache effects, because that would mean the disk would be spinning at 30,000 rpm.
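Quick sanity check on that (a Python back-of-envelope; the rpm values here are mine, not from the page):

# Average rotational latency is half a revolution; a real access also pays
# head-seek time on top of this, so these are optimistic lower bounds.
def avg_rotational_latency_ms(rpm):
    ms_per_revolution = 60_000 / rpm
    return ms_per_revolution / 2

for rpm in (5400, 7200, 10_000, 15_000):
    print(f"{rpm:>6} rpm -> {avg_rotational_latency_ms(rpm):.2f} ms")
#   5400 rpm -> 5.56 ms
#   7200 rpm -> 4.17 ms
#  10000 rpm -> 3.00 ms
#  15000 rpm -> 2.00 ms

Even with zero seek time you need 15,000 rpm just to get average rotational latency down to 2 ms; add a realistic seek on top and the implied spindle speed gets silly, so cached reads it is.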
- Just how much responsibility do AI chatbots and the companies that create them hold for psychotic people acting psychotically? (Tech Crunch)
I'm torn between bankrupting the companies and bankrupting the lawyers bringing these suits.
Both is good.
- Elon Musk plans to launch the Terafab project - his own chip manufacturing facility - next week. (WCCFTech)
Of course it's going to take years before anything can come online, even if he can secure the required tools from ASML, the sole source of top-end chip manufacturing equipment.
He has said that he plans to reach production in excess of 100 billion chips per year, though exactly what chips and in what time frame he has not yet specified.
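For scale, here's a back-of-envelope sketch of what 100 billion chips a year would mean in wafer starts. The dies-per-wafer and yield numbers are illustrative guesses on my part, since nobody has said what would actually be fabbed:

# Wafer starts needed per year = chips / (dies per wafer * yield)
def wafer_starts_per_year(chips, dies_per_wafer, yield_rate):
    return chips / (dies_per_wafer * yield_rate)

target = 100e9  # the claimed annual output
scenarios = [                              # all numbers assumed for illustration
    ("large AI accelerator",   60, 0.7),
    ("mid-size SoC",          300, 0.8),
    ("tiny microcontroller", 5000, 0.9),
]
for name, dpw, y in scenarios:
    millions = wafer_starts_per_year(target, dpw, y) / 1e6
    print(f"{name}: {millions:,.0f} million wafers/year")
# large AI accelerator: 2,381 million wafers/year
# mid-size SoC: 417 million wafers/year
# tiny microcontroller: 22 million wafers/year

Even the friendliest scenario there is somewhere around TSMC's entire annual wafer output, so either "chips" means something very small indeed or the time frame is doing a lot of work.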
- ASRock's new H610M Combo II motherboard is just like the earlier H610M Combo, only worse. A lot worse. (WCCFTech)
The H610M Combo featured two slots for DDR4 memory and four slots for DDR5, though you could only have one or the other.
The Combo II cuts that down to one slot for DDR4 and two slots for DDR5, which is kind of broken. If you have DDR5 you don't need the DDR4 slot, and if you don't you're stuck with a quarter of the bandwidth.
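Rough numbers on where that bandwidth gap comes from, assuming single-channel DDR4-3200 against dual-channel DDR5-5600 (the speed grades are my assumption, not from the article):

# Peak theoretical bandwidth: transfers/s * 8 bytes per 64-bit transfer * channels
def peak_bandwidth_gb_s(mt_per_s, channels):
    return mt_per_s * 1e6 * 8 * channels / 1e9

ddr4_single = peak_bandwidth_gb_s(3200, channels=1)  # one DDR4 slot populated
ddr5_dual   = peak_bandwidth_gb_s(5600, channels=2)  # both DDR5 slots populated
print(f"DDR4-3200 single channel: {ddr4_single:.1f} GB/s")  # 25.6 GB/s
print(f"DDR5-5600 dual channel:   {ddr5_dual:.1f} GB/s")    # 89.6 GB/s
print(f"Ratio: {ddr4_single / ddr5_dual:.0%}")              # 29%

Call it a bit under a third in theory; with a faster DDR5 kit it slides toward the quarter mark.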
Musical Interlude
Disclaimer: There's a pill for that now.
Posted by: Pixy Misa at 05:47 PM | Comments (2) | Add Comment | Trackbacks (Suck)
Post contains 501 words, total size 5 kb.
1
I am stunned, stunned to learn that Apple has shipped a laptop with inadequate power and cooling. Again.
Given the boat-anchor size of the 16-inch MacBook Pro (I have an older one for work), it sounds like the best place for the M5 Max is in a Mini or a Studio, which might have enough power and cooling to fully utilize it.
-j
Posted by: J Greely at Monday, March 16 2026 12:43 AM (oJgNG)
2
I think the claims around the AI space create some liability for the companies.
If I have no choice but to use AI or be unemployable, and I might already be unemployable, then if I have non-zero mental health risks it is better for me to manage them by not using it at all. The more ambiguous case is if I am potentially vulnerable, but can also function as an employee if I manage my mental health risks.
Telling people that they should go to university, for example, is also something that can be very bad for them. And some of the people who got the most As for repeating back what the teacher said are some of the most vulnerable.
Existing metrics may serve me well enough on AI (not that I was doing my filtering on the clearest, most objective, most long-established, most theoretically justified rules of thumb). Yudkowsky was clearly not showing fruits of the spirit where Christ is concerned, so anything that Yudkowsky gets magical or spiritual hits off of is potentially dangerous. This basically describes ninety percent of experts excited in public over 'intelligence on a computer'. All the real evidence had narrower interpretations available, so the people leaping directly to the sexy interpretations were often getting hits off of their own magical thinking.
Sane followers of Christ do not go 'whelp, this theoretical idea is really important, and the rituals we must do for the idea naturally lead to a lot of mass murder.' This rule of thumb can be applied to a lot of academic theorists. (Which leads to a conversation about it not being scholastically admissible, and the conversations we are not having about whether academia can really represent anyone outside academia.)
Note, I absolutely was not using this approach at all until very recently. I've long been very aligned to some technocratic 'spiritual errors'; those are pretty close to my starting points.
I'm unwise and immature enough to actually /need/ an explanation of why we don't just fix academia by shooting all of the obvious communists. On the one hand, it better equips me to meet communist academics where they are coming from, and to try to reach those alien lunatics. If I were wise and mature enough to have something healthy to say to them.
Anyway, if American-style safety-warning law for tools and dogs is correct, then by failing to fully disclose 'this is bad for crazy people' and 'we are obviously promoting discrimination in employment against crazy people', the AI companies and the tech bros are fucked on liability law.
But this is downstream of 'everything should be curated for accuracy' statism which is objectively incorrect.
Also, by that logic we could sue universities for making the communists ill, so they would have liability for the current bunch of mentally ill communist murderers; that nobody is pursuing this looks like collusion by lawyers to shield their alma maters from liability.
The correct law should be that free speech is always better.
There is broad liability for telling crazy people that they should use AI, or for telling crazy people that they will be unemployed. But, really, we should instead be more permissive when it comes to publicly telling academics that they don't know anything, and can frustrate themselves and the horse they rode in on.
So, yes, also the lawyers are trying to profit from law that is pretty bad.
Posted by: PatBuckman at Monday, March 16 2026 02:17 AM (s6adZ)
56kb generated in CPU 0.0597, elapsed 0.2096 seconds.
58 queries taking 0.147 seconds, 364 records returned.
Powered by Minx 1.1.6c-pink.