The Power Of Cool Vs. The Speed Of Lite
Long-time readers of this blog will be familiar with the fact that Moore's Law is dying - and in some respects has been dead for a decade already.
This is of concern to everyone, as the ever-increasing availability of computing power has been a key driver of economic growth for decades. It's of more immediate concern to some of us because it means we're no longer going to be tempted by shiny new video cards every Christmas.
Maybe.
Video card makers have been enjoying a free ride from chip fabs for twenty years, rolling out new, improved cards on new, improved chip processes roughly every two years. But this is the third year in a row where Nvidia and AMD (the two remaining video card makers) have been stuck at 28nm. This makes it difficult to make any significant advances; whatever you do, if you make a card significantly faster, it will be significantly more expensive and use significantly more power.
Nvidia had some headroom available because they'd already released their high-end professional Tesla GPU in a consumer model. At $999 it wasn't ever going to find many customers, but as a flagship for the benchmark charts it was very useful.
So, recently AMD announced their R9 280 and R9 290 families. The 280 is simply a rebadged Radeon 7970; 2048 shaders running at 1GHz, with a 384-bit memory bus.
The 290 is a new chip, though. The full chip, labelled the 290X, has 2816 shaders (768 / 37.5% more than the 7970), running at "up to 1GHz". Its smaller sibling, the 290, has 2560 shaders (512 / 25% more than the 7970) running at "up to" 947MHz. That's a fairly significant increase for a chip based on the same fabrication process, and the new chip is 25% larger and has nearly 50% more transistors than the 7970. (Which is pretty interesting in itself, as they've achieved 20% better density without a change in the underlying technology.)
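A quick sanity check on those numbers - just a sketch, using nothing but the relative figures quoted above (roughly 25% more die area, nearly 50% more transistors), not the exact die sizes or transistor counts:

```python
# Back-of-the-envelope check, using only the relative figures from the post.
shaders_7970 = 2048
shaders_290x = 2816
shaders_290  = 2560

print((shaders_290x - shaders_7970) / shaders_7970)  # 0.375 -> 37.5% more shaders
print((shaders_290  - shaders_7970) / shaders_7970)  # 0.25  -> 25% more shaders

# Density: roughly 50% more transistors on a die roughly 25% larger.
area_ratio       = 1.25   # new die ~25% larger
transistor_ratio = 1.50   # ~50% more transistors
density_gain = transistor_ratio / area_ratio - 1
print(density_gain)       # 0.2 -> ~20% better density on the same 28nm process
```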
The cards rolled out to reviewers, and they found three things:
- They're fast.
- They're really good value for money.
- They're noisy as hell.
With a larger, denser chip, AMD needed to step up the cooling capacity of the new cards. For the reference design, they went for a blower fan - that is, one that blows the hot air out through a vent at the back of your PC - and it's just not very good. The fan problems reduced a potential home run for AMD to a single, and the recommendation from reviewers was to wait for AMD's partners to release cards with custom coolers.
In fact, Tom's Hardware tested the possibilities themselves by prying loose the reference cooler and installing their own, and the results were dramatic. But that's not something the average user would want to do, and the rest of us were left to wait.
And wait.
But the custom cards are now showing up in reviews, and the results are everything you could ask for: the redesigned cards run 20°C cooler, 20% faster, and 6dB quieter, and cost just $20 more than the original, while consuming no more power (even a little less).
So, time to buy a new video card?
Not so fast.
There's another wrinkle here. AMD's cards are generally better than Nvidia's consumer cards for computation. Nvidia has an edge in some applications thanks to their CUDA compute library, and the difference doesn't necessarily hold for the respective professional cards, but for some applications the gap is huge. A $549 AMD R9 290X outruns a $999 Nvidia Titan at Litecoin mining by 2:1.
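A rough hashes-per-dollar sketch, using nothing but those two list prices and the 2:1 ratio above (actual hash rates will vary with settings and drivers):

```python
# Rough price/performance sketch; the only inputs are the list prices and the
# 2:1 Litecoin mining ratio quoted above.
price_290x  = 549.0   # USD
price_titan = 999.0   # USD
relative_hashrate_290x  = 2.0   # R9 290X, relative to Titan = 1.0
relative_hashrate_titan = 1.0

perf_per_dollar_290x  = relative_hashrate_290x  / price_290x
perf_per_dollar_titan = relative_hashrate_titan / price_titan

print(perf_per_dollar_290x / perf_per_dollar_titan)  # ~3.6x the mining throughput per dollar
```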
With the recent meteoric rise of Bitcoin (ignoring the even more recent partial crash) and growing interest in cryptocurrencies in general (of which Litecoin is one), cheap compute power has gained a whole new market. And if you happen to want a new graphics card for some more prosaic task - let us say, graphics - you may find yourself seriously outbid, if you can find a card at all. Newegg are showing AMD's R9 280 and 290 cards at 30-50% above MSRP, or simply out of stock entirely.
There's also another reason I'll be sticking with my Radeon 7950 for a little while longer, apart from the fact that, having just switched over from my old 4850, I'm not feeling any need to upgrade.
Dell have recently announced affordable 4K monitors. Their 32" model uses the same panel sold by Sharp and Asus, at the same price, $3500. But the 24" model is $1399, and the 28" semi-pro model is under $1000.
Meanwhile, LG have announced a new lineup of ultra-wide 21:9 displays. Their existing 21:9 monitors have a resolution of 2560x1080, which is not that interesting when 2560x1440 monitors are readily available. The new models bump the resolution up to 3440x1440, making for a clear upgrade. This is great for tasks where you would otherwise use multiple monitors, such as programming.
I commonly find myself switching back and forth between my IDE, a terminal session, a web browser showing the app, and another web browser showing documentation pages of whatever library I'm working with at that moment. Never mind the windows for the app documentation in Microsoft Word, the database monitoring utilities, scratch areas in Notepad++ and so on. I need all the screen area I can get, but monitor boundaries are a pain, so the fewer of those I have, the better.
But.
The only display interface currently capable of driving those new high-resolution monitors at full speed is DisplayPort. The reference R9 290 cards, all the custom R9 290 cards announced so far, and all of Nvidia's consumer cards have exactly one DisplayPort output. My Sapphire 7950, despite being 18 months old, has two.
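For the curious, here's a rough sketch of why DisplayPort is the only interface that cuts it for 4K at 60Hz; the blanking overhead and the usable link rates are approximate figures from memory, not exact spec numbers:

```python
# Why 4K at 60Hz needs DisplayPort: a rough bandwidth sketch.
width, height, refresh, bits_per_pixel = 3840, 2160, 60, 24
blanking_overhead = 1.08   # roughly, with CVT reduced blanking

required = width * height * refresh * bits_per_pixel * blanking_overhead / 1e9
print(f"4K@60Hz needs roughly {required:.1f} Gbit/s of video data")   # ~12.9 Gbit/s

hdmi_1_4_video = 8.16     # Gbit/s usable video data (340MHz TMDS, 8b/10b) - approximate
dp_1_2_video   = 17.28    # Gbit/s usable video data (HBR2, four lanes) - approximate
print("HDMI 1.4 enough?", hdmi_1_4_video >= required)   # False - 4K at 30Hz only
print("DP 1.2 enough?  ", dp_1_2_video   >= required)   # True
```

Which is why HDMI-only cards top out at 30Hz on a 4K panel, and why a multi-monitor 4K setup really wants one DisplayPort output per screen.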
So whichever of those fancy new monitors I end up with, I can run two of them on my existing hardware, but only one if I upgrade. Asus offer an overclocked R9 280X with four DisplayPort outputs, but it offers none of the advances of the R9 290 family, is a huge 3-slot card, and as far as I can tell, is completely out of stock everywhere.
Posted by: Pixy Misa at 04:43 PM | Comments (5)
1
What kind of idiotic design only puts a single DP on a high-end card? I'm assuming this isn't one of those ports that can take a splitter so you can use two monitors.
I've been thinking lately--my company makes payroll software. I wonder if payroll calculations are amenable to GPGPU. That'd be interesting because it would probably mean a huge speedup.
Posted by: RickC at Monday, December 23 2013 12:59 PM (swpgw)
2
The usual kind of idiotic design, I guess. At least there are some AMD cards with multiple DisplayPort outputs, sometimes as many as six; I haven't seen a single Nvidia consumer card with more than one.
Posted by: Pixy Misa at Monday, December 23 2013 06:55 PM (PiXy!)
3
And yes, you can split or daisy-chain DisplayPort...
Unless you're running a 4K monitor, in which case it's only got enough bandwidth to run a single screen.
Posted by: Pixy Misa at Monday, December 23 2013 06:58 PM (PiXy!)
4
I wonder if you could run dual-4K monitors via SLI/Crossfire. That seems kind of like an overreaction, though.
Posted by: RickC at Wednesday, December 25 2013 09:26 AM (swpgw)
5
You can do it with two cards that aren't SLI/Crossfire (I think). The way both SLI and Crossfire work, all the monitors have to be attached to the primary card. But Windows 7 and 8 both support multiple video cards running independently, each with their own monitors.
Posted by: Pixy Misa at Thursday, December 26 2013 04:40 PM (PiXy!)
The Company
I'm reading Kage Baker's Company series at the moment. I thought I'd stopped mid-way through Mendoza in Hollywood (book 3) long ago, just before the story arc that connects all the books together kicks in. But I just finished The Graveyard Game (book 4), and as I was getting to the end, I was overwhelmed with deja vu. I'd definitely got that far before.
The series is about Dr. Zeus Inc., a.k.a. The Company, a business that controls the secrets of time travel and immortality and is naturally immensely rich and run by idiots. (Because if a corporation controls the secrets of time travel and immortality and isn't run by idiots, there's not going to be much of a story.)
The reason I bring this up is, first, because the stories are quite good and readily available on Kindle (back in the days of paper the middle volumes seemed perpetually out of print, and when I first got my Nexus 7 last year the middle volumes were virtually out of print as well), and second, because of pajama boy.
The background of Baker's books posits a decline in human moral fibre from the 21st century onwards (the books cover events in eras from around 150,000 BC through to at least the 24th century) to the point that everything remotely worth doing has been banned. Which seemed a bit far-fetched to me until pb* popped into the public consciousness. He's the poster child for the achordate 24th-century society of The Company.
Except for the fact that their list of banned substances includes chocolate. For now, I suspect even our insufferable man-children of left-wing propaganda would consider that a step too far.
* You have to earn capital letters.
Posted by: Pixy Misa at 11:57 PM | No Comments