GPU 100% without VSYNC

I have a fairly mainstream ASUS GTX 970 STRIX.
It handles the new DOOM at above-average settings without losing a frame, and in general it rarely goes more than 10°C above room temperature.
Why does it jump to 100% power level and full-throttle GPU usage for this game?
The graphics are not even 10% as demanding as DOOM's, and there is no VSYNC option.
I cannot see the framerate because ALT+~ does not work on non-US keyboards!
Move the key to F10 or whatever and add a VSYNC option; it cannot be that hard!

Until then I refuse to play this game, because it totally fries my graphics card. I've never seen anything like this, and I have 280+ games on Steam; not one of them heats up like that! Even when I reduce the graphics quality down to the minimum setting, nothing changes at all. That last point suggests to me that it's a framerate issue: it's probably running uncapped, jumping from something like 80 FPS to 150 FPS. But I want it to run COOL at 30 FPS if needed...

This is a real game-stopper! Solve this low-level stuff before you even think about adding content, or the impression is very, very unprofessional!

Comments

  • Same problem here. FRAPS shows 100 FPS, which is what I set in the NVIDIA settings, but the card is running as if it were doing 200+ FPS. Resolution is 2560x1440. No difference between the 'beautiful' and 'fastest' settings as far as GPU workload is concerned. Any ideas?
  • pid Italy
    I can understand that Unity3D uses C#, which is notoriously cumbersome compared to pure C/C++ code, but I don't understand why there's no framerate throttle. More than VSYNC, I'd like to see a "maximum framerate" setting. It's not really difficult to implement; there's even boilerplate code on the internet for that.
  • pid Italy
    Here. A five-second Google search. If I were implementing a client, I'd start with that. It's a one-liner (rough sketch below the link): just take the value from an external config file that can be edited, and everybody is happy.

    https://docs.unity3d.com/ScriptReference/Application-targetFrameRate.html
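
    A minimal sketch of what I mean, untested and only illustrative: the "framerate.cfg" file name, its location and the 60 FPS fallback are assumptions on my part, but the Unity call that actually matters is just Application.targetFrameRate.

        using System.IO;
        using UnityEngine;

        // Sketch only: cap the framerate from a user-editable config file.
        public class FrameRateCap : MonoBehaviour
        {
            void Awake()
            {
                int cap = 60; // fallback if the config file is missing or unreadable
                string path = Path.Combine(Application.persistentDataPath, "framerate.cfg");
                if (File.Exists(path) && int.TryParse(File.ReadAllText(path).Trim(), out int value))
                    cap = value;

                QualitySettings.vSyncCount = 0;    // targetFrameRate is ignored while vsync is on
                Application.targetFrameRate = cap; // e.g. 30 to keep the GPU cool
            }
        }

    Attach it to any object in the first scene and the cap applies for the whole session.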
  • @pid It seems GPU overheating (what is "too hot", by the way?) is a very rare issue with LoA; maybe no one bothered to ask for a frame rate limiter before... No matter how easy something is, it can be forgotten :)
  • asawnoffshotgun
    Same here, can't even play it. My card's fan is screaming.
  • edited February 2018
    There are many good reasons NOT to use frame rate limiting, and there are ways to work around it ;)
  • asawnoffshotgun
    feel free to continue
  • If you are concerned about overheating, just Google RivaTuner and use that program to limit the FPS to 30 in LoA. This will cut GPU usage in half.
  • Miphon_CS Citadel Team Administrator
    We're aware of these issues and will be addressing them in an upcoming client update.
  • Miphon_CS said:

    We're aware of these issues and will be addressing them in an upcoming client update.

    Still high GPU usage.
  • And still high GPU usage, a month later.
  • HAJZE Sweden
    edited December 2018
    1080 and still high GPU usage even with locked FPS, 6 months later.

    Edit: it got better when I changed from Ultra to High graphics settings; getting 52°C instead of 71°C now...

    But should it really be a problem to play on Ultra with an i7-8700K / MSI 1080? :s

  • On my rig (see OP) at minimum settings it runs nicely at 25%, and there's even an option to set the framerate to 60 or 30 Hz! Nice job, devs! That's indeed excellent news!!
  • It's also worth noting that Unity itself is written in C++. C# is just used as a scripting interface, and it can even be compiled ahead of time to C++ using IL2CPP, which is very effective. Every game engine worth its salt has a scripting interface, whether it's Lua, C#, Blueprints, custom, etc. This has nothing to do with Unity; Unity is an extremely well put together engine.
  • Sadly, that's not necessarily true. IL2CPP does its job, but it can't do magic: the C# code must already handle memory efficiently. There are several techniques for avoiding heap allocation and the excessive garbage collection that is known to knock down games on Xbox. Frank Savage put out good tutorials on this topic back in the day for XNA. That problem won't change, ever. So IL2CPP is just a last step after memory allocation optimization, and rather secondary to it. I doubt that's what we see here. My suspicion is that they optimized the shaders, especially the stock splatmap shader, which has really bad performance. A tiny illustration of the allocation pattern I mean is below.
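
    Illustration only, not taken from the game's code; the class and field names here are made up. The point is simply that allocating a fresh collection every frame produces garbage, while a buffer allocated once and reused does not.

        using System.Collections.Generic;
        using UnityEngine;

        // Sketch of the allocation pattern: reuse one buffer instead of allocating per frame.
        public class AllocationExample : MonoBehaviour
        {
            readonly List<Vector3> points = new List<Vector3>(256); // allocated once, reused every frame

            void Update()
            {
                // Bad: "var points = new List<Vector3>();" here would create garbage every frame
                // and eventually trigger a garbage collection spike.
                points.Clear(); // reuse the existing buffer instead
                for (int i = 0; i < 256; i++)
                    points.Add(new Vector3(i, 0f, 0f)); // Vector3 is a struct, so no heap allocation
            }
        }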