I would like to record Minecraft clips with OBS, but it only works at a bitrate of 20,000 kbps and 60 fps. Especially with Minecraft it would be an advantage to record at 120 fps, because that makes for better slow-motion shots. And when I use the x264 encoder, the game stutters too much in-game.
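To show why the frame rate matters for slow motion, here is a rough sketch of the arithmetic (the numbers are just my target settings, nothing OBS-specific):

```python
# Rough sketch: why a higher recording frame rate helps with slow motion,
# and what it costs in bits per frame at a fixed bitrate.

def slowest_smooth_playback(recording_fps: int, timeline_fps: int) -> float:
    """Slowest playback speed that still gives one unique frame per timeline frame."""
    return timeline_fps / recording_fps

def bitrate_per_frame(bitrate_kbps: int, fps: int) -> float:
    """Average kilobits available for each recorded frame."""
    return bitrate_kbps / fps

for fps in (60, 120):
    speed = slowest_smooth_playback(fps, timeline_fps=60)
    kbits = bitrate_per_frame(20_000, fps)
    print(f"{fps} fps: smooth down to {speed:.0%} speed, ~{kbits:.0f} kbit per frame")
```

At 120 fps you can slow footage down to 50% on a 60 fps timeline without duplicated frames, but each frame only gets about half the bits at the same 20,000 kbps.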
It shouldn't really be down to my PC, since I have a Ryzen 5 2600 and a Vega 56.
I recommend activating hardware acceleration in OBS.
It's already activated.
And when I use the x264 encoder, the game stutters too much in-game.
Then your CPU is simply too weak, or you're trying to save the recording to the same disk the game is running from.
Yes, my CPU is too weak for x264, which is why I want to record with hardware H.264 or H.265, but then the encoder gets overloaded. And I do save my clips and recordings on another hard drive.
Maybe the graphics card is simply too weak as well. It can have many causes.
It can't be the graphics card either, especially not with Minecraft.
What did you choose as the CPU usage preset for x264?
Use your graphics card's encoder. If your graphics card can't handle the preset, increase the number of B-frames.
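I don't have OBS open right now, but as a rough sketch of what those two options mean, this is roughly the same choice expressed as ffmpeg calls driven from Python (the file names, and whether your build actually ships the AMD `h264_amf` encoder, are assumptions on my part; OBS exposes the same knobs through its own UI):

```python
import subprocess

# CPU encoding with x264: the preset controls how much CPU time is spent.
subprocess.run([
    "ffmpeg", "-i", "input.mkv",
    "-c:v", "libx264", "-preset", "veryfast",  # lighter preset = less CPU load
    "-b:v", "20000k", "-r", "60",
    "out_x264.mp4",
])

# Hardware encoding on an AMD card (AMF), which barely touches the CPU.
subprocess.run([
    "ffmpeg", "-i", "input.mkv",
    "-c:v", "h264_amf",
    "-b:v", "20000k", "-r", "60",
    "out_amf.mp4",
])
```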
Anyway, I can only use the graphics card encoder, and it really should be able to cope, since the load is always very low. And what are B-frames, or where can I increase them? Maybe that will help after all.
Veryfast.
In the encoder settings in your OBS. You can think of B-frames like this: part of the information from a previously encoded image is reused to calculate the new image. This reduces the required computing power, but in games with a lot of fast movement and scene changes it leads to blurry frames in between.
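Very roughly, and only as a toy illustration of the general idea rather than how a real codec works (real B-frames also look at later frames, this sketch only uses forward differences):

```python
import numpy as np

# Toy illustration of inter-frame prediction (not a real codec):
# instead of storing every frame in full, store only the change from the
# previous frame. Static scenes compress well; fast motion leaves little to reuse.

def encode_sequence(frames: list[np.ndarray]) -> list[np.ndarray]:
    """Keep the first frame in full, then only the per-pixel differences."""
    encoded = [frames[0]]
    for prev, cur in zip(frames, frames[1:]):
        encoded.append(cur - prev)  # mostly zeros when little has changed
    return encoded

def decode_sequence(encoded: list[np.ndarray]) -> list[np.ndarray]:
    """Rebuild each frame by adding the stored difference to the previous one."""
    frames = [encoded[0]]
    for delta in encoded[1:]:
        frames.append(frames[-1] + delta)
    return frames

# A static 4x4 "video" of three frames: the deltas are almost entirely zero,
# which is exactly the saving that inter-frame prediction buys you.
frames = [np.zeros((4, 4), dtype=np.int16) for _ in range(3)]
frames[2][0, 0] = 255  # one pixel changes in the last frame
roundtrip = decode_sequence(encode_sequence(frames))
assert all(np.array_equal(a, b) for a, b in zip(frames, roundtrip))
```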
The thing about graphics card encoding is that the card uses dedicated stream processors for the encoding (the Nvidia counterpart being CUDA cores), and those can be overloaded even while the game itself still runs smoothly on the GPU.
Okay, thanks for the explanation, but I still can't find the setting for the B-frames. Could you maybe send me a screenshot or something like that?
Seems like that option only exists for the Nvidia encoder.
Hmm, lousy