µATX Part 2: Intel G33 Performance Review
by Gary Key on September 27, 2007 3:00 AM EST - Posted in Motherboards
Intel is the largest provider of graphics solutions in the PC market. Repeat that three times, and then marvel at how that can be after trying to play the latest game you or your child just brought home, or after trying to play that new HD-DVD or Blu-ray title that just arrived in the mail on your Intel integrated graphics equipped system.
Of course, Intel is the market leader by volume rather than performance. They ship more integrated graphics chipsets than anyone else in the world. Virtually every business PC sold uses an Intel IGP, along with many entry-level home systems where the thought of playing the latest game or watching a high definition movie on a PC is but a mere twinkle in the eye of the user. After all, there are a lot more important reasons for using a PC than entertainment, and in these markets the Intel graphics solutions are more than sufficient.
We tend to side with the crowd that wants more than the basic features needed for earning Windows Vista Aero certification or running Office 2007 at lightning fast speeds. While we would always buy a dedicated GPU for gaming, we still expect enough performance out of an IGP solution to at least play the top edutainment titles or mass market favorites like The Sims 2. We also expect an IGP to provide decent video acceleration of the latest video formats when Aunt Harriet wants to view the newest sensation on YouTube or watch a home movie from last Christmas. Finally, we have been known to use HTPC boxes, and the thought of spending more for an HD decode/playback capable video card than for the motherboard, CPU, and potentially memory combined is silly for such a system.
Innovation in the IGP market has been lagging for some time but has picked up in recent months with the introduction of the AMD 690G with its Radeon X1250 graphics and now the NVIDIA MCP73 series. All of these solutions offer native DVI/HDMI output, HD decode and playback features, and gaming performance that is decent enough for those just starting out or in a pinch until they can afford a better solution. Intel made some headway with the G965 chipset, and they hope to keep pace with or even surpass their competitors with the upcoming G35 chipset. Until then, we are left with a mid-year release known as the G33, aka the GMA 3100.
The G33 chipset is basically the P35 chipset with an integrated GMA 3100 graphics engine. As such, the chipset brings support for 1333MHz FSB CPUs, the upcoming 45nm CPUs, and potentially DDR3. The GMA 3100 graphics engine is essentially an update to the GMA 950 found in the 945G chipset. The main highlights of the GMA 3100 are support for OpenGL 1.4 and Shader Model 2.0 operations (DirectX 9.0c compliant), with Vertex Shader 2.0 handled in software via CPU host processing. It has a 400MHz core clock, four pixel pipelines, a maximum resolution of 2048x1536, Dynamic Video Memory Technology, the Clear Video processing engine, and MPEG-2 hardware decode acceleration.
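That hardware/software split is easy to see from an application's point of view. The sketch below is our own illustration rather than anything from Intel's documentation: a minimal Direct3D 9 capability check (assuming the DirectX 9 SDK headers and d3d9.lib) that reports whether the driver exposes Pixel Shader 2.0 and Vertex Shader 2.0 in hardware. On a part that leans on CPU host processing for vertex shaders, games typically end up creating their device with software vertex processing, which is one reason IGP gaming performance is so sensitive to the CPU.

// Minimal sketch (our illustration, not Intel code): query Direct3D 9 caps
// to see which shader stages the graphics driver exposes in hardware.
#include <d3d9.h>
#include <stdio.h>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // DX9-class IGPs such as the GMA 3100 should report Pixel Shader 2.0 here.
    printf("Hardware PS 2.0: %s\n",
           caps.PixelShaderVersion >= D3DPS_VERSION(2, 0) ? "yes" : "no");

    // If the driver reports no hardware VS 2.0, applications fall back to
    // creating the device with D3DCREATE_SOFTWARE_VERTEXPROCESSING and the
    // CPU handles vertex shading.
    printf("Hardware VS 2.0: %s\n",
           caps.VertexShaderVersion >= D3DVS_VERSION(2, 0) ? "yes" : "no");

    d3d->Release();
    return 0;
}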
Of these features, Dynamic Video Memory Technology adds a twist to UMA operation. DVMT allows the GMA 3100 to utilize anywhere from 4MB to 256MB of system memory for graphics purposes depending upon the type of application running. In 2D operation the chipset reserves 4MB of system memory at 800x600 and up to 16MB at 2048x1536, dynamically adjusting the allocation when a 3D application is running. HDMI, DVI, and HDCP support are provided... but only if the board manufacturer utilizes an SDVO chip for the interface to the GMA 3100 or a separate ADD2 card is purchased. Otherwise, the user is left with the standard analog VGA output that our G33 review boards utilize today.
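To put those 2D reservation numbers in perspective, a quick back-of-the-envelope calculation (our own, assuming a single 32-bit desktop surface, not anything from Intel's documentation) shows why the reservation scales with resolution: one 800x600 surface is under 2MB, while one 2048x1536 surface is roughly 12MB, which fit comfortably inside the 4MB and 16MB allocations respectively.

// Back-of-the-envelope sketch: size of one 32bpp desktop surface at the two
// resolutions cited above. Figures are illustrative only.
#include <stdio.h>

static double surface_mb(int width, int height, int bytes_per_pixel)
{
    // Raw surface size in MiB, ignoring alignment and auxiliary surfaces.
    return (double)width * height * bytes_per_pixel / (1024.0 * 1024.0);
}

int main()
{
    printf("800x600   at 32bpp: %.1f MB\n", surface_mb(800, 600, 4));    // ~1.8 MB
    printf("2048x1536 at 32bpp: %.1f MB\n", surface_mb(2048, 1536, 4));  // ~12.0 MB
    return 0;
}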
Even though the G33 is a "new" chipset, it still offers only minimal graphics functionality and seems intent on just getting by in the market. Of course, it is not all about gaming with these platforms, but even in video playback and general application performance we see better solutions from Intel's competitors. Hopefully this will change with the upcoming G35 chipset, but until that product actually ships we are left wanting.
Let's take a look at the G33 solutions today and see how well each board performs.
26 Comments
tooter2 - Sunday, September 30, 2007
Hi all. I had just ordered the DS2R board when I read your review and saw how poorly this board overclocked, exceeding an FSB of 400, contrary to what I had read elsewhere. I was a bit concerned to say the least. Well, I just spent an hour running the newest memtest86 using this board with an E6750 at 7 x 500 = 3.50GHz at default vcore, using 2 x 1GB of DDR2 6400 G.Skill at 5-5-5-15 with vdimm at +0.2, and all other settings at default except for the power management settings, so as to be sure that I was running at the high speeds. This was with the Intel stock cooler. I've also run memtest at 8 x 463 = 3.70GHz, default vcore. CPU temp never exceeded 38C. And I've used an older Antec NeoPower 480 for my PSU. I should add that this is with onboard video in a bare-bones setup, i.e., no case, no HDDs, IDE optical drive. This board appears to be an overclocking monster, not at all like your results. And I plan to use a video card in this board, but I bought it for its mATX size plus the fact that I can get a video card later. I want to see how the new AMD cards pan out, plus what NVIDIA comes back with. This will be used in an HTPC setup, but a setup where I can play games as well. Hence, the E6750.
tooter2 - Sunday, September 30, 2007
In the above post, I meant to say "not exceeding 400 fsb".
jonp - Saturday, September 29, 2007
The Asus P5K-VM feature set chart shows only 1 PCI when it should say 2.
overzealot - Saturday, September 29, 2007
The G33M-DS2R comes with an eSATA expansion slot bracket. It also makes it quite clear that it's supported on the product's site and on the box it comes in.
falacy - Friday, September 28, 2007
Something I have noticed over the years is that this site doesn't really take an objective look at "low end" hardware from the perspective of those of us who would purposely purchase these items - even though we're "tech savvy". For instance, though I do agree that the absence of a DVI port isn't great, I find it hard to believe that I'm the only person who is still happily using a 17" CRT monitor at 1024x768, and it's pretty insulting to hear that anything without a DVI port isn't worth looking at. Did everyone forget that CRT monitors have better visual quality than LCDs - unless you're able to shell out far, far more money? I digress... Here is my path to the P5K-VM:
When I moved in 2003, after losing my great job, I had to sell my good computer, and when I finally got settled I was far too poor to replace it. That was 2004. Anyhow, I needed a computer so I bought a Dell desktop (P4 2.8) and used it until 2005 with an ATI 9550SE graphics card. This was good enough to play Star Wars Galaxies, EverQuest, and a whack of other games that I played at the time. Of course, it ran everything office-like too. Later on I was given an ATI 9800XT video card, which was very expensive when it first came out. Anyhow, in 2006 I upgraded to an ASRock board that could handle a Core 2 yet still had an AGP slot so I could make use of the 9800XT. At the time there weren't any cheap Core 2 processors, so I bought a P4 531 and it was a decent upgrade from the Dell. All this was awesome (and for the games I played I was happy), until recently when I bought a Pentium Dual-Core 2160 and then was lucky enough to have the fan on my 9800XT fail, which awesomely fried the GPU. Yay. I was back to using the i865G graphics, as I had given away my 9550SE and the only other cards in my collection of junk weren't any better than the onboard video.
And this brings me to yesterday, when I set up my new system.
I bought 1GB of RAM and a P5K-VM, and after testing it out I found that the graphics capabilities trounce the i865G onboard video, both in the practical test of playing World of Warcraft and in 3DMark2000 scores; the G33 is smooth and playable in WoW at 1024x768, whereas the i865G was somewhat choppy at 800x600. Also, apart from playing at 1x AA rather than 4x AA, the G33 scored the same in 3DMark2000 as the old ATI 9550SE that I used to have. And finally, it really isn't that much of a downgrade from the ATI 9800XT (which sucked up so much power even at idle that the air from my PSU went from HOT to cool when I stopped using it!) in World of Warcraft (the only game I really play now). Sure, AA is nice, but I like the electricity/heat/noise savings better. Down the road, I may purchase a fanless PCI-E graphics card if the NEED arises.
Altogether, I believe a lot more credit should be given to the value of these motherboards. In fact, I have felt since the first onboard video chipsets to offer full AGP support that, so long as you're not giving up any important features, it's pretty stupid for the average person to buy a motherboard without onboard video - you never know when you're going to need it! There is a huge list of fun gaming titles that the onboard graphics can play with PlayStation 2 quality (or better) graphics, and I think that this information is lost on the AnandTech crowd. Also, these systems can run with Windows XP and 1GB of RAM and be completely amazing in comparison to what was available just two short years ago!
The P5K-VM is a perfect motherboard for a person like me, who has some disposable income to build a computer over time and enough patience to make that happen. Eventually, I can add 8GB of RAM, or a wicked graphics card (if ever I feel like playing more than WoW or Neverwinter Nights), or 4 SATA drives to run software RAID5 (or 4 IDE drives to use with my Promise controller), or a camcorder to use the 1394, or a super-mega quad-core, low power consumption CPU.
Seems to me that anyone with a 17" CRT monitor, which often has better visual quality than the crappy LCDs people peddle these days, would be very wise to buy one of these boards now and upgrade as "the itch" and their budget allow!
lopri - Saturday, September 29, 2007
Dunno whether I should laugh or cry over your post. Are you being serious or sarcastic? Sorry, it was a long day and I'm not that sharp a person.
falacy - Sunday, September 30, 2007
That's exactly what I am talking about: the inability of so many people on AnandTech to see from the "Average Person"'s perspective. Funny enough, it just happens to be that the "Average Person" makes up the majority of computer purchasers in North America. As a person who has managed a "ma & pa" computer store in a small town, I can tell you that even the most inept of "boony noobs" out there has some computer knowledge these days. And many people still have some decent hardware kicking around that, considering the things they actually USE a computer for, they can squeeze a bit more value out of. Heck, it was just two years ago that I replaced an AMSTRAD 200 portable computer with a laptop that ended up frustrating the hell out of the customer, because it didn't do all the things her ancient computer did (such as print to her equally ancient printer). In fact, my computer is housed in a modified 486 AT server tower that we took in on trade, which was being used as an office server until we replaced it in 2005. For "Average People" doing average things with average expectations, it's amazing how long a computer can last (personally, I used my Celeron 300A Malay @ 450MHz & Abit BH6 Rev 2 for over 3 years). Look at it this way: if all you're doing is crunching numbers and typing, my 486SX 25MHz laptop with WordPerfect 5.1 and Lotus 1-2-3 will still get the job done (and it will boot faster than anything else out there today).
Anyhow, it may come as a surprise that not everyone has enough money to just buy whatever the heck they want, whenever they feel like it. No, most of us have to set priorities in life - I believe that has something to do with being an adult and/or a parent. Consequently, "Average People" like me (in wealth, as opposed to computer knowledge) have to weigh the pros and cons a little more carefully, and for someone like myself, I'd rather throw some spare money at more storage space for my movies, or a camcorder, or a better TV to watch said movies on than at an uber graphics card.
The plain truth of the matter is that the G33 under Windows XP can play fun games like Quake 3, Neverwinter Nights, World of Warcraft, and many other great 1999-2004 titles. All the while, it can do so using that CRT you probably already own, which likely still looks just as good as new and will give you a sharper image at 1024x768 than the low-end LCD you'd likely buy. Finally, a board like the P5K-VM is amazing, because should a person strike it rich they could upgrade the hell out of their computer without ever needing to consider buying a new motherboard - DDR3, 45nm CPU, Gigabit LAN, 8 channel audio? Boy, that seems pretty "bleeding edge" from my vantage point on ye ol' interweb!
There's a lot of potential (for the "Average Person") in these boards.
strikeback03 - Tuesday, October 2, 2007
I don't think a CRT can ever give a "sharper" image than an LCD - kinda the nature of the beast with discrete pixels vs a scanning electron beam. Now your CRT probably has better colors and almost certainly has better viewing angles than the average cheap LCD, but is almost certainly not sharper.
also, 1024x768 is REALLY small. Even using the internet is cramped, and forget the average programming environment or photo editing program.
finally, it does not appear the P5K-VM supports DDR3. The chipset can, but most motherboard makers are choosing either DDR2 or DDR3, as the slots are different, and they cannot be used simultaneously.
ltcommanderdata - Friday, September 28, 2007
Can we please get a review of the 14.31.1 XP driver for the GMA X3000 that enables hardware DX9.0c SM3.0 acceleration? I know you've switched over to Vista, but the 15.6 driver release notes don't mention that they added hardware acceleration so it looks like only the 14.31 and the newer 14.31.1 XP drivers have it. I would love to see a comparison between the GMA X3000, Xpress X1250, GeForce 7150, and a discrete X1300HM and 8500GT.
You're probably saving the new drivers for an IGP review when the G35 GMA X3500 comes out (October 21?), but it would be nice to have numbers for the GMA X3000 too for comparison.
IntelUser2000 - Tuesday, October 2, 2007
I agree, they should run XP driver tests. Better yet, they should test out the G965 to get a taste of the G35.
Here's my results:
From Gary-"We set our quality settings to medium or low where applicable except for the first two are set to high and the sliders are set in the middle spot."
With that in mind, I did a test. However, I wasn't sure whether Object Scarring and Post Processing were on or off, so I did both tests.
AT settings+Object Scarring/Post Processing Off-12.5
AT settings+Object Scarring/Post Processing On-11.6
I also use dual channel DDR2-800 RAM at 5-5-5-15 and an E6600. I found out that in Company of Heroes, performance increased by 10% going from 5-6-6-18 to 5-5-5-15.
Supreme Commander: 8.381
I am using 14.31.1 driver and XP SP2.