Intel Graphics Cards?


#1

I have heard many people saying that Intel graphics cards are very substandard.

Is this true? If we compare the performance of Intel graphics cards with that of NVIDIA and ATI, what is your opinion?

Are they really substandard compared to those cards?


#2

I guess you are talking about onboard Intel graphics. If so, have a look for yourself:

http://en.wikipedia.org/wiki/Intel_GMA

They all perform poorly. They are only suitable for playing three-year-old games, and even then they run badly. NVIDIA and ATI onboard graphics, on the other hand, are on par with standalone PCI Express graphics cards.

Even the NVIDIA 7100 graphics I have on my motherboard are better than a lot of the onboard Intel graphics out there.

Intel used to make standalone graphics cards, but they don't anymore. You can still find second-hand Intel AGP graphics cards with 4 to 16 MB of memory on the market.


#3

Intel doesn't make discrete graphics cards. If you are talking about their onboard graphics chips, they perform poorly. They are nowhere near NVIDIA or ATI.


#4

It's not even a fair comparison. :|


#5

Yeah, but it's honest.


#6

To give you an idea: the Intel GMA X3000 has performance comparable to the GeForce 6150. The integrated ATI Radeon 1250 was better than the X3000 and all other integrated graphics in 2007.

Reference: http://en.wikipedia.org/wiki/Intel_GMA#Microsoft_Windows_performance_reviews


#7

Are graphics cards only required for gaming?


#8

^No. Apart from gaming, they are also used for graphics work, video editing, etc.


#9

I guess normal graphics cards would serve the above-mentioned purposes.

Advanced cards are required for gaming only.


#10

The Intel GMA X3100 is not that bad after all.


#11

Currently, the integrated GPUs in Intel motherboards are a joke compared to even low-end graphics cards.

But Intel is working on a new graphics card and architecture called Larrabee. It's supposed to be a mix of existing CPU and GPU architectures, and it seems promising from their performance data:

[quote]

Preliminary performance data

Intel's SIGGRAPH 2008 paper describes simulations of Larrabee's projected performance. Graphs show how many 1 GHz Larrabee cores are required to maintain 60 FPS at 1600x1200 resolution in several popular games. Roughly 25 cores are required for Gears of War with no antialiasing, 25 cores for F.E.A.R with 4x antialiasing, and 10 cores for Half-Life 2: Episode 2 with 4x antialiasing. It is likely that Larrabee will run faster than 1 GHz, so these numbers are conservative. Another graph shows that performance on these games scales nearly linearly with the number of cores up to 32 cores. At 48 cores the performance scaling is roughly 90% of linear.

[/quote]

^

http://en.wikipedia.org/wiki/Larrabee_(GPU)


#12

[quote]

I guess normal graphics cards would serve the above-mentioned purposes.

Advanced cards are required for gaming only.

[/quote]

No man, you are wrong. You need special effects and so on during video editing, and the worst part of editing is rendering. A discrete graphics card also reduces rendering time. I first tried to edit a 20-minute video on a built-in S3 graphics chip with 256 MB of display memory, and rendering took 4 to 5 hours. Now brace yourself for the next experiment: I tried the same video on an 8600GT with 512 MB of DDR2, and rendering took just 20 minutes. :)


#13

^Rendering time also depends on which format you are rendering to.


#14

I do professional video editing using software from Ulead, Roxio and CyberLink, and yes, digital video editing depends pretty heavily on 3D cards. Built-in graphics, whether from Intel, VIA or SiS, can't do the job for you. They only work with converters and small free tools, not with professional software. So 3D cards aren't only for games.