I played Minecraft a few days ago and capped the FPS at 60 because I only have a 60 Hz monitor (the default was 120).
It looked almost like a slideshow, even though the FPS stayed constant. After setting it back to 120 FPS, it ran smoothly again.
What's the reason, if my monitor can't display 120 FPS at all?
Because the delay is shorter. For example: if you move the mouse right after a frame change, at 60 fps it takes up to 16.7 ms until a new picture arrives; at 120 fps it's just 8.3 ms, so the gap simply fills in sooner.
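The arithmetic behind that can be sketched in a few lines. This is just the frame-interval math from the post above; real input-to-photon latency also includes game logic, driver, and display delays, which are ignored here.

```python
def frame_interval_ms(fps: float) -> float:
    """Time between two rendered frames, in milliseconds."""
    return 1000.0 / fps

for fps in (60, 120):
    interval = frame_interval_ms(fps)
    # If the mouse moves right after a frame was presented, the change
    # can only show up one full interval later, at the earliest.
    print(f"{fps} fps: {interval:.1f} ms between frames")
```

So even on a 60 Hz monitor, rendering at 120 fps halves the worst-case wait before the next frame picks up your input.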
That's funny. I'm playing Star Citizen right now, where I get 30-50 FPS on the station, and it feels more fluid at 30 FPS than MC did at 60 FPS. Why is it different there?
60 fps in Minecraft shouldn't stutter noticeably either.
Depending on how well the game is optimized, what kind of game it is, and how the frame times look, a lower FPS can indeed feel more fluid.
Could it be because I'm playing a big modpack (Revelation, with around 200 mods)? The rest of the PC is fine: R5 3600X @ 4.2 GHz with 16 GB DDR4-3200.
The FPS is by no means constant there and varies a lot.
Keyword: frame times
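To illustrate why frame times matter more than the FPS counter: two runs can show the same average FPS while one stutters badly. The frame-time samples below are made up purely for illustration.

```python
def stats(frame_times_ms):
    """Average FPS and worst single frame time for a list of samples."""
    avg_frame_time = sum(frame_times_ms) / len(frame_times_ms)
    return 1000.0 / avg_frame_time, max(frame_times_ms)

steady = [16.7] * 60                 # constant ~60 fps, feels smooth
spiky = [10.0] * 54 + [77.0] * 6     # same average, but with big spikes

for name, ft in (("steady", steady), ("spiky", spiky)):
    avg_fps, worst = stats(ft)
    print(f"{name}: avg {avg_fps:.0f} fps, worst frame {worst:.0f} ms")
```

Both lists average out to roughly 60 fps, but the spiky one has individual frames taking 77 ms (momentarily ~13 fps), which is exactly the kind of stutter an average FPS readout hides.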