Monday, January 19
Daily News Stuff 19 January 2026
Explainers Edition
Top Story
- The reason why RAM has become four times more expensive is that a huge amount of RAM that has not yet been produced was purchased with non-existent money to be installed in GPUs that also have not yet been produced, in order to place them in data centers that have not yet been built, powered by infrastructure that may never appear, to satisfy demand that does not actually exist and to obtain profit that is mathematically impossible.
— Unknown
- The mythology of conscious AI. (Noema Mag)
This article gets one important thing right: LLMs are not conscious.

In a 2022 interview with The Washington Post, Google engineer Blake Lemoine made a startling claim about the AI system he was working on, a chatbot called LaMDA. He claimed it was conscious, that it had feelings, and was, in an important sense, like a real person. Despite a flurry of media coverage, Lemoine wasn’t taken all that seriously. Google dismissed him for violating its confidentiality policies, and the AI bandwagon rolled on.
I commented on that story at the time. Lemoine is as crazy as a sack of rats on crazy pills. He was also completely and very obviously wrong, which is not the same thing.

As AI technologies continue to improve, questions about machine consciousness are increasingly being raised. David Chalmers, one of the foremost thinkers in this area, has suggested that conscious machines may be possible in the not-too-distant future. Geoffrey Hinton, a true AI pioneer and recent Nobel Prize winner, thinks they exist already.
Wait. David Chalmers said what?
Huh. He did say that. A mostly sensible article summarising the evidence on both sides, concluding that pure feed-forward LLMs are not conscious but extended LLMs with recurrent processing - feedback loops - could be. (Boston Review)
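For readers wondering what the feed-forward vs. recurrent distinction actually amounts to, here's a toy NumPy sketch. The weights, sizes, and inputs are arbitrary illustrations, not anything from either article: a feed-forward net maps input to output in one pass and keeps no state, while a recurrent net feeds its own hidden state back in at each step.

```python
import numpy as np

rng = np.random.default_rng(0)
# Arbitrary small weight matrices: 3 inputs, 4 hidden units, 1 output.
W_in = rng.normal(size=(4, 3))
W_h = rng.normal(size=(4, 4))
W_out = rng.normal(size=(1, 4))

def feed_forward(x):
    # One pass, input straight to output; no state survives the call.
    return W_out @ np.tanh(W_in @ x)

def recurrent(xs):
    # The hidden state h is fed back into the next step: a feedback loop.
    h = np.zeros(4)
    for x in xs:
        h = np.tanh(W_in @ x + W_h @ h)
    return W_out @ h
```

The recurrent version's output depends on everything it has seen so far, not just the current input, which is the property the Boston Review argument turns on.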
Back to Noema:

Taken together, these biases [anthropocentrism, which is irrelevant, human exceptionalism, which is irrelevant, and anthropomorphism, which is actually the key here - Pixy] explain why it’s hardly surprising that when things exhibit abilities we think of as distinctively human, such as intelligence, we naturally imbue them with other qualities we feel are characteristically or even distinctively human: understanding, mindedness and consciousness, too.
A little bit of nonsense thrown in at the start but an accurate description of the problem in the end.
But then it all falls apart:

The very idea of conscious AI rests on the assumption that consciousness is a matter of computation.
Which is rather like assuming that water is a molecule made of one oxygen atom and two hydrogen atoms.

More specifically, that implementing the right kind of computation, or information processing, is sufficient for consciousness to arise.
Because it is.

This assumption, which philosophers call computational functionalism, is so deeply ingrained that it can be difficult to recognize it as an assumption at all.
As much as the molecular structure of water is an assumption.

But that is what it is.
Nope.

And if it’s wrong, as I think it may be, then real artificial consciousness is fully off the table, at least for the kinds of AI we're familiar with.
"Kinds of AI we're familiar with"? Do you mean feed-forward models, which are definitely not conscious, or enhanced systems with feedback loops?

Challenging computational functionalism means diving into some deep waters about what computation means and what it means to say that a physical system, like a computer or a brain, computes at all. I'll summarize four related arguments that undermine the idea that computation, at least of the sort implemented in standard digital computers, is sufficient for consciousness.
And we're dead.
First, and most important, brains are not computers.
Brains are obviously computers and it is trivially easy to prove this.
Take a line of BASIC code, like:
10 PRINT 3+7
What does that do?
It prints 10.
How do you know?
Because you can execute that code in your head.
How can you do that?
Because your brain is a computer.
It may be more than a computer - though nobody has produced a coherent, let alone convincing, argument for this - but it is unquestionably a computer.
There follow dozens of paragraphs of irrelevancies I won't get into, but suffice to say that it all goes downhill from there.
Tech News
- Quick reminder that Intel's B570 is still available at $200. It's not the fastest graphics card - it's comparable with Nvidia's RTX 3060, a midrange card from five years ago - but it's cheap, in stock, and works. And it has 10GB of RAM, a small upgrade over common 8GB cards.
- An Altair 8800 that has been broken since its owner made a mistake assembling it in 1974 has finally been fixed. (Tom's Hardware)
Never too late.
- Reasons to stop using MySQL. (Optimized by Otto)
Oh, yeah? What would you recommend?
MariaDB.
Oh. Yeah. Good call. Carry on.
- The Slimbook Executive ticks all my boxes. Pretty much. (Liliputing)
I'm not really in the market for a new laptop right now, but this seems to get a lot right.
It's a 14" model with an Intel 255H CPU (6P/8E/2LP cores), a 2880x1800 120Hz screen - LCD rather than OLED, but it covers 100% of sRGB so it should be fine unless you're a professional artist or video editor - and a 99Wh battery despite weighing a modest 1.2kg.
It has two SODIMM slots and two M.2 2280 slots - unusual and welcome in a 14" laptop - and one USB4 port, a 5Gb USB-C port with DP and PD (DisplayPort and Power Delivery), three 5Gb USB-A ports, HDMI, wired gigabit Ethernet, an audio jack, and a full-size SD card slot.
And the Four Essential Keys.
No dedicated graphics, but the CPU includes Intel's Arc 140T graphics which are quite competent and generally comparable to AMD's 780M.
Oh, and the keyboard is backlit and there's a physical privacy shutter for the camera.
They don't ship to Antarctica though.
- Why Silicon Valley is really talking about fleeing California (it's not the 5% theft wealth tax). (Tech Crunch)
Oh, do tell.

Take Larry Page, who [owns] about 3% of Google but controls roughly 30% of its voting power through dual-class stock. Under this proposal, he’d owe taxes on that 30%. For a company valued in the hundreds of billions, that’s a lot more than a rounding error. The Post reports that one SpaceX alumni founder building grid technology would face a tax bill at the Series B stage of the company that would wipe out his entire holdings.
Oh, right. It's not the 5% wealth tax. It's the 100%+ wealth tax. I can see how that would be a problem.

David Gamage, the University of Missouri communist kleptocrat retard law professor who helped craft the proposal, thinks Silicon Valley is overreacting. "I don’t understand why the billionaires just aren’t calling good tax lawyers," he told The San Francisco Standard this week.
That's just it.
They did.
Their tax lawyers advised them to flee. Immediately.
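The arithmetic behind the "100%+ wealth tax" line is simple enough to sketch. Using the article's Page figures (3% economic ownership, 30% voting control, 5% annual rate) and a placeholder valuation that is purely hypothetical:

```python
# Rough arithmetic behind the "100%+ wealth tax" complaint.
# The valuation is a placeholder; the percentages come from the article.
company_value = 100e9   # hypothetical $100B valuation
economic_stake = 0.03   # Page owns ~3% of Google economically
voting_stake = 0.30     # but controls ~30% of its voting power
tax_rate = 0.05         # proposed 5% annual wealth tax

holdings = economic_stake * company_value           # what he actually owns
tax_bill = tax_rate * voting_stake * company_value  # taxed on voting control

effective_rate = tax_bill / holdings  # tax as a share of real holdings
print(f"annual tax: ${tax_bill / 1e9:.1f}B on ${holdings / 1e9:.1f}B of stock")
print(f"effective annual rate on actual holdings: {effective_rate:.0%}")
```

Taxing voting control rather than economic ownership means the effective rate scales with the voting-to-ownership ratio (here 10x), so a nominal 5% becomes 50% of the actual holdings every year: the stake is gone in a couple of years, and a founder with thin economic ownership of a richly valued Series B company can owe more than everything he holds.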
Musical Interlude
Disclaimer: Let's pebble!
Posted by: Pixy Misa at
06:53 PM
| Comments (2)
| Add Comment
| Trackbacks (Suck)
Post contains 1134 words, total size 10 kb.
1
I can't agree with your assessment that computation is sufficient for consciousness. And a lot of very smart engineers and theorists are on my side of the argument. There is no theory that even begins to conceptualize how more, faster, more complex logic and programming suddenly goes from calculating machine to consciousness.
We don't even really understand what consciousness is, and whether animals have it, and where the threshold might be if the bigger and more advanced mammals do, for example, but say ants don't.
There is some recent research that is suggesting that quantum entanglement is involved, but even that's just kind of hand-waving and there's more than a few crackpots pushing it.
Posted by: David Eastman at Tuesday, January 20 2026 03:28 AM (J69gL)
2
a) progression income taxation might not be. If it is merely tax farming, then cannibalistic theories of taking from the richest would screw everyone. Whether or not they are taking 95% or 100% to redistribute to graft hungry cronies. b) 'engineers' don't cause me to discard the hypothesis that they are blindly self-interested lunatics. 2020, for academia, was a very big and notable industrial accident, and explaining it and correcting 'should' be a bread and butter sort of engineering theory problem. c) Theorists and engineers aside: Yeah, I'm not seeing that a mathematical mapping exists between computer science and, say, theology. Some of these areas, maybe I am not well educated enough to have an opinion.
Posted by: PatBuckman at Tuesday, January 20 2026 04:16 AM (rcPLc)
59kb generated in CPU 0.0155, elapsed 0.1528 seconds.
58 queries taking 0.1419 seconds, 362 records returned.
Powered by Minx 1.1.6c-pink.