Graphics Cards in the Future – What Should We Expect?


Graphics cards are the hardware components that allow our desktop PCs, laptops and other devices to interpret the visual information in a piece of software or video and display it as brilliant, sharp images. The more sophisticated the graphics card, the more realistic the images – and as current 3D graphics make everything look almost like the real thing (and sometimes even better), what should we expect from this technology in the future?


There are two major players in the graphics card world – AMD and Nvidia. Both companies have been the leading innovators in this field for decades, and their fame doesn't seem to be fading, so this looks likely to remain the case for many years to come. Of course, both are expected to release new products in 2013, but (much like with the iPhone 'S' trend) their newest creations are expected to be only slightly better than the last and not in any way revolutionary.




And how could they come up with something revolutionary, when a graphics card depends on all the other hardware components? A graphics card needs a fast processor to help it process data, a good cooling system to keep it at a normal temperature, plenty of RAM for displaying more complicated images, and so on. As none of these are taking a significant leap into the future, neither can graphics cards. Because a computer is an integrated system of several components working together, and because a system is only as good and fast as its worst and slowest component, innovations have to be made in all areas for graphics cards to evolve significantly.


The biggest issue faced by graphics card manufacturers is the size of the unit. It is determined by the size of the chips – in other words, by how small a piece of silicon you can fit all the necessary components onto. Until recently the standard was 40 nanometers, but that has thankfully changed to 28 nm. The next big thing in the field is expected to be a move to an even smaller process – 20 nm – but we won't be seeing that for at least another year.


By making the unit smaller, manufacturers allow the graphics card to perform better while consuming fewer resources. In other words, it is a way for graphics card developers to improve their products without waiting for processor, RAM and other manufacturers to catch up. Tech geeks can only hope that this technology leaves its research and development stage and enters manufacturing sooner rather than later, because it seems that video game developers are already making games for cards that don't yet exist.


Another aspect of the future of graphics cards that should be mentioned is price. These small yet crucial units have always been quite expensive, and anyone who wants the newest and best has to dig deep into their pockets to get it. Whether graphics cards will become more affordable over time is very questionable – no advances are currently being made towards using cheaper materials, which are the biggest factor in price.


Technology has always managed to surprise us. There are sci-fi ideas we thought we would never see in our lifetimes that are already a reality today. So we probably can't truly imagine what the future of graphics cards looks like – all we can do is speculate and hope.

Rita Rova is the resident technology writer for a voucher website, where the latest technology deals are listed from top online stores serving the UK.
