When I ordered my computer a year and a half ago I painstakingly selected every component I wanted to put into it. One thing that I knew for certain was that I wanted to play Blu-ray movies, and as such I knew I needed to put in a fairly beefy video card. I decided that the best bang for my buck would be the ATI Radeon HD 4350 (512 MB). At that time I didn’t see myself running anything other than Windows, and I certainly wasn’t thinking about how the card would behave in an open-source world.
Fast forward to today, where no Windows installations are to be found on my machines and I’m rockin’ it open-source. It amazes me how terrible graphics drivers can be in the Linux world. The open-source drivers that ship with Ubuntu for ATI and NVIDIA cards are decent enough, but they lack the 3D support needed to run many games, and video playback can be a little jittery for me. When I throw on the binary drivers (including Catalyst Control Center, a software experience that should just be avoided like the plague), I can play games, but I have overscan and underscan problems (an issue specific to my graphics card, caused by ATI’s drivers), and video is so jagged and jerky that I cannot watch it (with or without the anti-tearing options enabled).
So on Saturday, my first day off in nearly a year, I got up and yanked the graphics card right out of my machine, and connected my monitor via the HDMI port integrated on my motherboard. Intel open-sources all of its integrated graphics drivers. Video playback is smooth and gorgeous, and most games run better than before.
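If you want to confirm that your system really is running on the open-source Intel driver after a swap like this, a quick sketch of how I'd check it on Ubuntu is below. The exact output will vary by machine; "Kernel driver in use: i915" is what you'd expect to see for Intel integrated graphics, and `glxinfo` comes from the `mesa-utils` package, which may not be installed by default.

```shell
#!/bin/sh
# Hedged sketch: verify which kernel driver the GPU is using.
# On Intel integrated graphics, the open-source i915 driver should
# be listed; no binary blob involved.
if command -v lspci >/dev/null 2>&1; then
    lspci -nnk | grep -iA3 'vga' || echo "no VGA entry found"
fi

# The OpenGL renderer string should name the Intel chipset rather
# than a software rasterizer (requires the mesa-utils package):
if command -v glxinfo >/dev/null 2>&1; then
    glxinfo | grep 'OpenGL renderer' || echo "no renderer string found"
fi
```

Both checks are guarded so the script is harmless to run on a box that doesn't have `lspci` or `glxinfo` available.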
My point in writing this post is this: NVIDIA and AMD need to step up their graphics game. The open-source drivers don’t cut it for many users’ needs, and the proprietary drivers are a terrible experience. If you want a great experience, one that is completely open-source and problem-free, give the integrated Intel chipsets a go. I was surprised, and I think you will be, too.