(laptop) PCIe runs at 2x 2.0 instead of 8x 2.0, please help

Kestis

Prominent
Mar 30, 2017
6
0
510
[attached GPU-Z screenshot]


Lenovo IdeaPad 300-15IBR

4GB RAM
Intel HD Graphics 405 & GeForce 920M
Intel® Pentium® Processor N3710 (2M Cache, up to 2.56 GHz), quad-core

Hello, I've searched the whole internet and have been trying fixes for literally a month, and I have no idea what else could be done. The laptop is still under warranty, but I don't think they'll help me; they'll just take money for a diagnostic and say they didn't find any problem.
 

Kestis

Prominent
Mar 30, 2017


I re-uploaded the picture; as you can see, even when it's under load it doesn't go above x2.
 

BFG-9000

Respectable
Sep 17, 2016
167
0
2,010
Ah, I noticed you have a Braswell SoC. Those only have a few PCIe lanes to begin with, so x2 is right.

Your 920M GPU is a Kepler GK208 variant, one of which was sold as an x1 card (with half as many shaders) as the GT 710, so I assume x2 would not be limiting for yours even if the rest of the system were decent.

Unfortunately it's not, as you are running a 6-watt Pentium with only single-channel memory, so it's not like it would be usable anyway if the card ever ran out of VRAM and started doing PCIe transfers. Good thing your GPU has 1GB of local memory. If anything, that GPU is too powerful for that system, so PCIe lanes should be the least of your worries.
 

Kestis

Prominent
Mar 30, 2017


I appreciate your help, but I'd be happy if you could explain the same thing in simpler words. (I'm not much into all of this, just an amateur.)
 

Kestis

Prominent
Mar 30, 2017


Damn. I'm not a video gamer, but I wanted to buy a laptop that could play at least some games, like World of Warcraft: Wrath of the Lich King, with nice graphics and FPS. Well, thank you for the effort, guys, I appreciate it.

Currently I've overclocked it to 1030 MHz; what if I overclock more?
 

Kestis

Prominent
Mar 30, 2017
Although, you said it should be fine even at x1, and currently it's x2 (which you said is okay), but why does GPU-Z say my video card supports x8 2.0 while it's currently working at x2 2.0?
 

BFG-9000

Respectable
Sep 17, 2016
x2 should not slow that card at all, especially not in that system.

The primary need for high PCIe bandwidth is for when your graphics card runs out of memory and has to substitute system memory instead. Think of it like when you run out of system memory and the OS starts swapping to the hard disk, which is so much slower that it makes everything pause. In your case the system memory is so slow that everything would freeze up anyway if it were ever used to replace video memory. So both x8 and x2 would effectively perform the same: both are unusable, and the moral is never run out of memory.

The main problem with that system is that the processor is only about as fast per clock as a Pentium 4 in single-threaded workloads. So in 2017 you are using the equivalent of four Pentium 4 2.5GHz chips from 2002 (although using only 6 W instead of 244 W). A 2.53GHz Core 2 from 2008 is 3x faster per core in Java, and today's 2.5GHz Kaby Lake is 7.5x faster.

So while the video card is no faster than today's fastest integrated graphics, the CPU is like something from 15 years ago. Without hardware-acceleration assist from the video card, even web browsing would be terrible. BTW, that Pentium 4 was likely paired with AGP 4x, which has about the same bandwidth as PCIe 2.0 x2, so x2 is even well matched to your processor.
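For reference, the link speeds being compared here can be checked with quick arithmetic. A rough sketch using the commonly quoted effective per-lane rates (the helper names are made up for illustration):

```python
# Rough per-lane PCIe bandwidth sketch (effective, one direction).
# Transfer rate in GT/s and line-code efficiency per generation.
PCIE_GENS = {
    "1.1": (2.5, 8 / 10),     # 8b/10b encoding
    "2.0": (5.0, 8 / 10),     # 8b/10b encoding
    "3.0": (8.0, 128 / 130),  # 128b/130b encoding
}

def lane_gbps(gen: str) -> float:
    """Effective GB/s for a single lane of the given PCIe generation."""
    rate, eff = PCIE_GENS[gen]
    return rate * eff / 8  # bits -> bytes

def link_gbps(gen: str, lanes: int) -> float:
    """Effective GB/s for a link of the given generation and width."""
    return lane_gbps(gen) * lanes

print(f"PCIe 2.0 x2: {link_gbps('2.0', 2):.2f} GB/s")  # 1.00 GB/s
print(f"PCIe 1.1 x4: {link_gbps('1.1', 4):.2f} GB/s")  # same 1.00 GB/s
print(f"PCIe 2.0 x8: {link_gbps('2.0', 8):.2f} GB/s")  # 4.00 GB/s
print("AGP 4x:      ~1.07 GB/s (266 MHz x 32-bit)")    # close to PCIe 2.0 x2
```

So PCIe 2.0 x2 and PCIe 1.1 x4 really are the same ~1 GB/s, and AGP 4x lands right next to them.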
 

Dragos Manea

Estimable
Mar 30, 2015
139
1
4,660


This is not true. You want me to believe that a GTX 1080 with 8 GB of RAM can run on x2? That's what I can conclude from your logic. The bandwidth is there so data can be transferred to the card to be processed; the more processing power the card has, the more bandwidth is necessary. That's why a GTX 1080 would run at 10% of its true power on x2: it would not get data fast enough to process it (this is an exaggeration, but you get the idea). Also, frame latency will increase a lot due to the lower bandwidth.

 

BFG-9000

Respectable
Sep 17, 2016

But somebody actually tested the scaling of a GTX 1080 and found that dropping from PCIe 3.0 x16 to x4 results in only a 4% loss of performance despite cutting bandwidth by 3/4. Cutting bandwidth by roughly 94% down to PCIe 1.1 x4 (which has exactly the same bandwidth as PCIe 2.0 x2) resulted in at most a 30% loss of performance, not the 90% you claim. That drop is relatively minor mostly because textures need to be transferred over PCIe as well.

They would not have been able to test at x2 because the 1080 draws 75 W from the PCIe slot, and the PCIe specification only allows 25 W from an x1 or x2 slot.

Instructions are small and simply don't require a lot of bandwidth. Bandwidth and latency are very different things.
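The bandwidth cuts in that test follow from the usual effective per-lane rates; a quick sketch (the figures are the standard approximations, the variable names are illustrative):

```python
# Checking the bandwidth cuts discussed above, using the usual effective
# one-direction per-lane rates in GB/s.
LANE_GBPS = {"1.1": 0.25, "2.0": 0.50, "3.0": 8.0 * 128 / 130 / 8}  # 3.0 ~ 0.985

baseline = LANE_GBPS["3.0"] * 16  # PCIe 3.0 x16, the test's starting point
x4_gen3  = LANE_GBPS["3.0"] * 4   # PCIe 3.0 x4
x4_gen1  = LANE_GBPS["1.1"] * 4   # PCIe 1.1 x4 -- same 1 GB/s as 2.0 x2

print(f"3.0 x16 -> 3.0 x4 cut: {1 - x4_gen3 / baseline:.0%}")  # 75%
print(f"3.0 x16 -> 1.1 x4 cut: {1 - x4_gen1 / baseline:.1%}")  # ~93.7%
```

Even after a ~94% bandwidth cut the test only lost up to 30% FPS, which is the whole point about instructions being small.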
 
Solution

Dragos Manea

Estimable
Mar 30, 2015
I think I mentioned that was an exaggeration, but still, 30% is quite a loss; that way you'd get GTX 1050 Ti or 1060 performance out of a 1080, which is like a 2/3 price drop.
Anyway, on topic: it does have an influence, but you should be fine on x2 because that video card cannot use the full speed of x2; x1 would have been enough. And again, the x8 you see is just the maximum the card supports.
 

BFG-9000

Respectable
Sep 17, 2016
167
0
2,010
Well, context is everything, and the GTX 1080 is nearly the worst-case scenario, while the OP has the equivalent of a GT 730 on a Pentium 4. I think we can agree that a card that's 1/10 as fast as a 1060 is not going to have its performance affected by x8, x4 or even x2 on that processor.

I'll even predict that a GTX 1080 would probably have the same FPS at any PCIe bus width, even the equivalent of x2, if installed on a Pentium 4.

The OP can probably play any vintage game (that will run on Windows 10) at maxed settings up until about 2004. GTA San Andreas, FEAR and Crysis will run well with slightly reduced settings, and something newer like World of Warcraft: Legion would probably require turning things down rather a lot, as the CPU is so far below the minimum recommended one. I should point out that back in 2004 Warcraft only required an 800 MHz CPU, and Lich King came out in 2008, but the minimum requirements have changed a lot after 13 years of patches.
 

Dragos Manea

Estimable
Mar 30, 2015
Yes, we agree on that. If the OP wants to game, I personally think he should make an investment: you can get a PC that runs games at mid-high settings for $400, and at high to very high at 1080p for $600. As for notebooks, I don't think you can get a decent gaming notebook for under $1300-1500.