wise-wistful
24.02.2008, 04:29
Nvidia and Intel have a mutually beneficial relationship. One company is a graphics card giant; the other, a premier CPU producer. What's interesting is that, despite a healthy respect for each other, the two butt heads more and more often in the enthusiast motherboard space these days (enthusiasts refer to 'em as "mobos").
A problem that the two hardware-makers need to deal with, though, is the poor integrated graphics "solutions" on motherboards. In short, you buy a new PC, then find out that you cannot play games without buying and installing some sort of discrete graphics card (that runs you another couple hundred dollars). Then you go break something.
With its new "a" series mobos coming out mid-March, Nvidia fires a shot over Intel's bow.
For starters, these boards boast a decent integrated graphics solution: the GeForce 8200. OK, so this won't play the computer-killing Crysis at 2560x1600-pixel resolution. What is interesting is that it can run some games on its own or boost a secondary board with some sort of miniature Scalable Link Interface (SLI).
Let's say you grab a cheap GeForce 8400 or 8500 series card (roughly $50 to $60). Working with the integrated graphics, you can get a 40-percent performance boost over the discrete board alone.
It won't do a thing for high-end cards like the 8800GT, but having that small SLI boost on the motherboard provides an affordable option for mainstream gamers that don't need to make a choice between buying a new car or a new computer.
Of course, Nvidia won't stop you from buying its high-end chipsets, either. The 780a ($250) will allow for three-way SLI. The 750a ($120) permits two cards to pair up in SLI. The 730a ($80) will be perfect for greenhorn gamers who want to dip their toes in the water with a single graphics card.
Another thing all these motherboards will offer is an interesting take on power conservation--Hybrid SLI (first unveiled last month, now in full demo form). You aren't playing PC games at all hours of the day, so why should graphics cards fire on all cylinders at all hours?
The first generation of its Hybrid SLI technology is a manual transmission throttle-down of your GPU. This reduces wattage, fan noise, and PC wear-and-tear. The second generation--in the works--should automatically shift those virtual gears for you. It'll spin down a discrete graphics card to a completely dormant state until really needed.
In the demo I saw at GDC, the system ground along, sucking up 250 watts of juice one minute, then whispered along at around 100 watts the next. This news is especially important because, according to Nvidia product manager Matt Wuebbling, "[Both technologies] will find their way into notebook motherboards."
Intel, of course, also has gaming enthusiast solutions up its silicon sleeves: namely, the recently unveiled Dual Socket Extreme boards. Formerly codenamed "SkullTrail," these big, bad mobos can support up to four graphics cards running in either SLI or AMD's multiple-GPU CrossFire technology. What's interesting to note is that while Intel's motherboard easily handles one, two, three, or four cards at a time, it isn't as cut-and-dried with Nvidia.
Intel spokesperson Dan Snyder explains that "mechanically and electrically, SkullTrail motherboards will fully support one to four cards--as you can see with Radeon cards. Problem is, Nvidia's only provided drivers to support SLI and Quad-SLI."
Nvidia insists that it boils down to engineering; Intel says it's a driver issue--it's the corporate version of he said/she said. While nobody was willing to go on record, it could also be a financial decision to differentiate between the two chipset-makers.
And what of the pesky problem of integrated graphics? Intel insists that it is on the case as well.
Intel's G45 integrated graphics chip will ship in the middle of 2008. While Snyder says it's a little too early to talk performance numbers yet on that front, he did let slip that some versions of Intel's upcoming Nehalem motherboards will have a graphics processor inside the CPU. The big question remains: how much impact will it have on performance? We should be able to find out by Q3-Q4 2008.
PC World (http://blogs.pcworld.com/staffblog/archives/006536.html)