My problem is that even though I have set the FPS limit to unlimited in Minecraft, I only get around 150 FPS. My graphics card sits at about 15% load and my CPU at about 10%.
On my old PC with a GeForce GTX 1050, both were sometimes at up to 90% load and I sometimes got up to 1200 FPS.
Could it be that older hardware can simply handle OpenGL better, or am I doing something wrong?
Thank you for your answers.
CPU: AMD Ryzen 7 2700X
GPU: AMD Radeon RX 5700 XT
PS: The low FPS wouldn't bother me if the game at least ran without stuttering. And with games like Far Cry 5, SW Battlefront, etc., I have no problems.
Your new card simply knows that it's pointless to give you more FPS, because it wouldn't help and you couldn't see any difference anyway.
I don't really know the reason, but I guess it would make NO SENSE to load the PC more. You would just use more electricity for nothing. I think the PC simply recognizes that more power isn't required…
But where do these little lags come from?
How many CPU cores are being used, and at what percentage? (A quick way to check is sketched below.)
My guess would be that in Minecraft your graphics card is too fast for the processor, so in this case the processor is the bottleneck.
One workaround would be a higher resolution, so the graphics card has to do more of the work.
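To check the core-usage question, here is a minimal sketch that prints the per-core load while the game is running. It assumes Python 3 with the psutil package installed; any per-core monitor (or the logical-processor view in Task Manager) works just as well.

```python
# Minimal sketch: print per-core CPU load while Minecraft is running.
# Assumes Python 3 with the psutil package installed (pip install psutil).
import psutil

# Sample the utilization of every logical core over one second.
per_core = psutil.cpu_percent(interval=1, percpu=True)

for core, load in enumerate(per_core):
    print(f"Core {core}: {load:.1f}%")

# Average across all cores, roughly the single "CPU" figure Task Manager shows.
print(f"Average: {sum(per_core) / len(per_core):.1f}%")
```

If one core sits near 100% while the rest are mostly idle, that would support the bottleneck guess above, since Minecraft's rendering runs largely on a single thread.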
My Minecraft has also been full of small stutters since the last driver update.
I have an AMD Ryzen 5 1600X and an RX 580 8 GB.
I do not want the PC to be fully loaded. I'm just wondering why Minecraft doesn't use the resources it has at its disposal. Answers like yours don't help me. And no, good PC components don't break at 70% utilization.
It's clear to me that Minecraft isn't exactly a masterpiece of programming, but the other games don't have these lags either.