AMD A6 3400M vs A6 3410MX

Jeteroll

Distinguished
Sep 11, 2010
31
0
18,580
How much better will the 3410MX be for gaming than the A6 3400M? Maybe an FPS estimate? Pl0x??? :sol:

P.S. I selected an upgrade to a 1GB 6750M with dual graphics on this laptop: http://www.shopping.hp.com/webapp/shopping/computer_series.do?storeName=computer_store&category=notebooks&series_name=dv6zqe_series&a1=Category&v1=High%20performance
with an A6 so this can do dual graphics. THIS MEANS THAT THEY ADD A SEPARATE 6750M SO THERE ARE TWO SEPARATE CARDS (one in the CPU and one discrete card) RIGHT? TWO CARDS?
 

Jeteroll

Distinguished
Sep 11, 2010
31
0
18,580


I've checked notebookcheck and they have two benches to compare them, and I was wondering if I could get some GAMING benches/estimates instead of some Super Pi bench. So no, I'm not lazy, thank you very much. :kaola:
 

wintermint

Distinguished
Sep 30, 2009
165
0
18,640
Well, I think the 3410MX runs at a higher clock CPU-wise, but everything else should be the same. You should be able to check on the website regarding your other question. See if there's a link that says "Help me choose".
 
The A6-3410MX's CPU is clocked at 1.6GHz while the A6-3400M is clocked at 1.4GHz. The A6-3410MX's HD 6520G graphics core uses DDR3 RAM @ 1666MHz, while in the A6-3400M the HD 6520G graphics core uses DDR3 @ 1333MHz.

Gaming performance will only be marginally better in the A6-3410MX. Generally speaking, the HD 6520G is somewhat close to the performance of a desktop Radeon HD 5550. The HD 6620G graphics core in the A8 Llano series is somewhat close to the performance of a desktop Radeon HD 5570. As a comparison, the Intel HD 3000 graphics core is slightly better than the desktop Radeon HD 5450.
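
To put the memory-speed difference in numbers, here is a rough back-of-the-envelope sketch (not an official AMD figure): it assumes the usual 64-bit dual-channel DDR3 setup and treats the two quoted speeds as DDR3-1333 vs DDR3-1600. The graphics core shares this bandwidth with the CPU, so the in-game difference is smaller than the raw peak numbers suggest.

```python
# Rough theoretical peak bandwidth of the DDR3 the Llano graphics core shares
# with the CPU. Assumes a 64-bit (8-byte) channel in dual-channel mode; actual
# gaming gains will be well below the difference in these peak numbers.
CHANNEL_WIDTH_BYTES = 8   # one 64-bit DDR3 channel
CHANNELS = 2              # dual channel, if both SO-DIMM slots are populated

for rate_mts in (1333, 1600):  # A6-3400M (DDR3-1333) vs A6-3410MX (DDR3-1600)
    peak_gbs = rate_mts * CHANNEL_WIDTH_BYTES * CHANNELS / 1000
    print(f"DDR3-{rate_mts}: ~{peak_gbs:.1f} GB/s peak")
# DDR3-1333: ~21.3 GB/s peak
# DDR3-1600: ~25.6 GB/s peak
```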
 
Solution

Jeteroll

Distinguished
Sep 11, 2010
31
0
18,580


Thanks, so if I upgrade and add the 6750 there will be the gfx card on the CPU and a separate discrete 6750, right? So it can do dual gfx?
 
I am not sure if the HD 6620G can do Hybrid CrossFire with an HD 6750. I'll assume that the HD 6750 will be used on its own.

If the laptop is capable of "GPU switching", then on light loads it will use the HD 6520G to save power and keep the laptop cooler. When playing a game, the laptop will switch over to the HD 6750. If there is no "GPU switching" ability, then the laptop will always use the HD 6750.
 

Jeteroll

Distinguished
Sep 11, 2010
31
0
18,580


Hmm, notebookcheck says there is a dual gfx config for these two: http://www.notebookcheck.net/AMD-Radeon-HD-6755G2.57278.0.html
But tell me one thing please: there will be the gfx card in the CPU PLUS ONE DISCRETE SEPARATE CARD (6750), so two separate cards, right? Cuz HP kept saying there's "only one card"
 


Okay, they would know more than I would, so it's a dual graphics solution.



Well, HP is correct. There is only one graphics card; the HD 6750.

The HD 6620G or HD 6520G is a graphics core inside the Llano APU, much like the Intel HD 2000 / 3000 graphics core inside the Sandy Bridge i3/i5/i7 CPUs. Therefore, it is not a "video card".
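
If you want to see how Windows reports it, here is a minimal sketch (assuming a Windows laptop with the built-in wmic tool) that lists every display adapter. On a Llano + HD 6750M dual-graphics machine it should print two entries, the APU's Radeon graphics core and the discrete HD 6750M, even though only the 6750M is a separate physical card.

```python
# Minimal sketch: list the display adapters Windows knows about using the
# built-in wmic tool (Windows only). Expect two entries on a dual-graphics
# Llano laptop: the APU's graphics core and the discrete HD 6750M.
import subprocess

result = subprocess.run(
    ["wmic", "path", "win32_VideoController", "get", "name"],
    capture_output=True, text=True, check=True,
)
for line in result.stdout.splitlines():
    name = line.strip()
    if name and name.lower() != "name":   # skip the header row and blank lines
        print(name)
```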
 

Jeteroll

Distinguished
Sep 11, 2010
31
0
18,580


Oh ok thanks a bunch! :sol:
 

Captain_Kickass

Distinguished
Sep 18, 2011
2
0
18,510
Oddly enough, PassMark (www.passmark.com ... and warning: arbitrary numbers based on computations) ranks the 3400M much higher than the 3410MX, even though the 3410MX is clocked slightly higher (1.6 GHz vs 1.4 GHz for the 3400M). My "guess" is that laptop manufacturers are pairing the 3410MX with 1333 MHz system RAM (and not 1866 MHz), and that bottleneck is possibly killing the benchmark. Someone said the APU's 6520 GPU ran off of 1666 MHz RAM, but I believe that's a typo and it's 1866 MHz RAM... could be wrong though, that's just what I remember. I may be confusing it with the A8-3850 Llano desktop chip. Either way, it's been a long day and I'm too tired to check. :kaola:

 

mauser1891

Distinguished
Sep 11, 2011
1
0
18,510
The AMD A6-3400M APU with the HD 6520G uses whichever RAM is currently installed.
I upgraded my "stock" 4GB of DDR3 with an 8GB Crucial kit (DDR3 PC3-10600 • CL=9 • Unbuffered • NON-ECC • DDR3-1333 • 1.5V • 512Meg x 64), which increases my HD 6520G performance by allowing the system to use more RAM dynamically, with a faster transfer rate than the "stock" modules.
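
For anyone wanting to confirm what the graphics core is actually working with after an upgrade like this, here is a small sketch (Windows only, assumes the built-in wmic tool) that prints the size and rated speed of each installed module:

```python
# Quick sketch: report the capacity and rated speed of each installed memory
# module via the built-in wmic tool (Windows only), so you can tell whether
# the APU's shared memory is running at 1333, 1600 or 1866 MT/s.
import subprocess

out = subprocess.run(
    ["wmic", "memorychip", "get", "Capacity,Speed"],
    capture_output=True, text=True, check=True,
).stdout

for line in out.splitlines()[1:]:        # skip the "Capacity  Speed" header
    if line.strip():
        capacity_bytes, speed_mhz = line.split()
        print(f"{int(capacity_bytes) / 2**30:.0f} GB @ {speed_mhz} MHz")
```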

Comparing the Intel HD 2000/3000 series graphics solutions to the ATI HD 6520G is comparing an apple to an orange; not to start any flames, just a simple statement.
"The Radeon HD 6520G, is without any doubt better than the Intel HD 3000 graphics. In the two tables below will will compare some of the features and the benchmarking that will establish the superiority of the Radeon HD 6520G." @ http://compare-processors.com/amd-radeon-hd-6520g-vsintel-hd-graphics-3000/1063/
 

Captain_Kickass

Distinguished
Sep 18, 2011
2
0
18,510



Technically yes, but the on-chip video processor for the A-series chips can run at a maximum 1866 MHz memory clock for the GPU, and from what I hear, to get the best system performance it's best to match it with 1866 MHz system/board memory. Several posts out there mention this and some claim up to a 40% increase in performance, but I can't seem to find that review now and I'm not sure if that's correct. I did find this review, though, confirming that 1866 MHz system memory works fastest with the A8-3850.
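
For what it's worth, that 40% figure at least lines up with the raw bandwidth ratio between DDR3-1866 and DDR3-1333; a one-line sanity check (theoretical peak only, real-world gains depend on the game):

```python
# Sanity check on the "up to 40%" claim: DDR3-1866 offers roughly 40% more
# theoretical bandwidth than DDR3-1333 (1866 / 1333 is about 1.40).
print(f"{(1866 / 1333 - 1) * 100:.0f}% more peak bandwidth")  # prints: 40% more peak bandwidth
```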

PS. Agreed, Intel HD 3000 graphics blow chunks. The A-series on-die GPUs destroy Intel's on-chip graphics. Intel should either partner with Nvidia (lol, that's not likely after the war they had over PCI-E) or just stop making on-chip graphics altogether. lol

http://www.legitreviews.com/article/1652/1/