Radeon X1300/X1500 vs Intel X4500 integrated
-
I'm building a computer at work. It has a card identified as an ATI Radeon X1300/X1500. The motherboard I'm going to put in it has integrated Intel X4500 graphics. Which one performs better? I generally won't be running games on it, but if they perform the same for HD video, the Windows desktop and little else... is it worth installing the dedicated card, or should I set it aside?
-
I can tell you from first-hand experience that the X4500 handles the desktop and 1080p HD video perfectly.
-
OK, then I'll test it before installing the card. Right now HD video stutters, but that could be down to the processor, which is an Athlon64. I'm about to swap in an old Core2Duo I had lying around; hopefully that will improve things.
-
Today I tried the integrated graphics, with its drivers, the monitor's native resolution, etc., and something strange happened: the image on the monitor looked less sharp than before. I thought maybe my eyes were just blurry in the morning, or that I was imagining it, but I opened the computer, put the ATI card back in, and after installing its drivers and setting the native resolution I saw the difference: the image went back to how I remembered it. How can there be such a difference with the same monitor, the same resolution, the same cable, the same mode and refresh rate...?
-
What you describe used to be fairly common on low-end cards with VGA output, although I hadn't seen it in quite some time.
Nothing in particular comes to mind to suggest, other than lowering the refresh rate as far as it will go.