
Allow developer to limit engine resources #5109

Open
nonunknown opened this issue Aug 8, 2022 · 7 comments

Comments

@nonunknown

Describe the project you are working on

A kart racing game

Describe the problem or limitation you are having in your project

I would like to limit the engine resources so I can make sure it will run on low-end machines.

Describe the feature / enhancement and how it helps to overcome the problem or limitation

Imagine that you want your game to run on a 10-year-old machine or an old mobile device; having the ability to choose which limits to apply to your project would be awesome.

Also, this limitation is not merely superficial (i.e., it is not just meant to notify the user that a limit has been reached); the objective of this implementation is to actually constrain the running game to the configured amount of resources.
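As an OS-level analogue of such a hard cap (a sketch only, not a proposed engine API), Unix rlimits already let a process constrain itself at runtime; here is an illustration with the Python standard library, capping the number of open file descriptors:

```python
import resource

# Sketch: Unix rlimits as an OS-level analogue of engine-enforced caps.
# RLIMIT_NOFILE bounds how many file descriptors the process may hold open;
# the same mechanism (setrlimit) covers address space, CPU time, etc.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
new_soft = min(256, hard) if hard != resource.RLIM_INFINITY else 256
resource.setrlimit(resource.RLIMIT_NOFILE, (new_soft, hard))
print(resource.getrlimit(resource.RLIMIT_NOFILE)[0])
```

An in-engine limit would presumably work similarly in spirit (rejecting allocations past a budget), but enforced by the engine rather than the kernel.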

Describe how your proposal will work, with code, pseudo-code, mock-ups, and/or diagrams

[Mock-up image attachment]

If this enhancement will not be used often, can it be worked around with a few lines of script?

Not sure.

Is there a reason why this should be core and not an add-on in the asset library?

Not possible, since it needs low-level information.

@Calinou
Member

Calinou commented Aug 8, 2022

Related to #2096.

I think this is best done with external tools, as it requires a lot of OS-specific wrangling such as setting CPU affinities. GPU throttling is even worse as it also requires driver-specific wrangling.
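To give a flavor of the OS-specific wrangling mentioned above: on Linux, a process can restrict itself to a subset of CPU cores via the scheduler-affinity syscalls (exposed in Python as `os.sched_setaffinity`; this is a sketch of the mechanism, not anything Godot exposes):

```python
import os

# Linux-only sketch: pin the current process to a single CPU core,
# the kind of affinity wrangling an emulated "low-end machine" would need.
if hasattr(os, "sched_setaffinity"):  # absent on Windows/macOS
    original = os.sched_getaffinity(0)          # cores we may run on now
    os.sched_setaffinity(0, {min(original)})    # restrict to one core
    print(os.sched_getaffinity(0))
    os.sched_setaffinity(0, original)           # restore the full set
```

External tools like `taskset` wrap the same syscall, which is why this is easier to do outside the engine.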

Either way, I am not 100% convinced that trying to emulate a low-end machine is worth the trouble. It will never be entirely accurate and can end up being misleading for a whole lot of reasons.

If you want to be mostly sure that your game will run well on slow hardware, target a much higher resolution and framerate on your own high-end device. For instance, if you can reasonably achieve a stable 4K @ 120 FPS on your own device, then a much slower machine should be able to achieve a stable 1080p @ 60 FPS (8× lower pixel throughput per second).
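The 8× figure follows directly from the pixel-throughput arithmetic, which can be checked in a couple of lines:

```python
def pixel_throughput(width: int, height: int, fps: int) -> int:
    """Pixels that must be rendered per second at a given resolution and framerate."""
    return width * height * fps

high_end = pixel_throughput(3840, 2160, 120)  # 4K @ 120 FPS on your own device
low_end = pixel_throughput(1920, 1080, 60)    # 1080p @ 60 FPS on slow hardware
print(high_end / low_end)  # 8.0
```

2× the width, 2× the height, and 2× the framerate multiply out to the 8× headroom claimed above (for fill-rate-bound workloads; CPU-bound costs don't scale with resolution).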

@alfredbaudisch

alfredbaudisch commented Aug 8, 2022

> I think this is best done with external tools, as it requires a lot of OS-specific wrangling such as setting CPU affinities.

Couldn't something like the Sandbox mode also help with this #5010?

But I guess the Sandbox is not a virtualization-like one, more of a file/resource-access one. So forcing hardware resource thresholds would be way out of scope for what Godot does [in the sense of implementation complexity, I mean], even though the idea is interesting (in any case, my knowledge of Godot internals is limited).

@nonunknown
Author

> Related to #2096.
>
> I think this is best done with external tools, as it requires a lot of OS-specific wrangling such as setting CPU affinities. GPU throttling is even worse as it also requires driver-specific wrangling.
>
> Either way, I am not 100% convinced that trying to emulate a low-end machine is worth the trouble. It will never be entirely accurate and can end up being misleading for a whole lot of reasons.
>
> If you want to be mostly sure that your game will run well on slow hardware, target a much higher resolution and framerate on your own high-end device. For instance, if you can reasonably achieve a stable 4K @ 120 FPS on your own device, then a much slower machine should be able to achieve a stable 1080p @ 60 FPS (8× lower pixel throughput per second).

This seems like an interesting approach, but it doesn't give you 100% certainty that the game will work.

> I think this is best done with external tools, as it requires a lot of OS-specific wrangling such as setting CPU affinities.
>
> Couldn't something like the Sandbox mode also help with this #5010?
>
> But I guess the Sandbox is not a virtualization-like one, more of a file/resource-access one. So forcing hardware resource thresholds would be way out of scope for what Godot does [in the sense of implementation complexity, I mean], even though the idea is interesting (in any case, my knowledge of Godot internals is limited).

Yeah, having it in sandbox mode would be super interesting and would fit well with the proposal, at least from my POV.

@KoBeWi
Member

KoBeWi commented Aug 8, 2022

This can be sort of achieved by using a virtual machine 🤔

@Calinou
Member

Calinou commented Aug 8, 2022

> This can be sort of achieved by using a virtual machine 🤔

Unfortunately, graphics acceleration for modern graphics APIs is difficult to get working in virtual machines (if even possible, when you only have a single GPU).

@TheDuriel

> Unfortunately, graphics acceleration for modern graphics APIs is difficult to get working in virtual machines (if even possible, when you only have a single GPU).

This is purely an artificial limitation in NVIDIA's drivers. Assuming your GPU has virtualization unlocked, you can pass it through into a VM with no problem; this is what every enterprise game streaming service does. (AMD does not lock away virtualization.)

This NVIDIA lock is actually being lifted from consumer cards in the new generations (and successfully circumvented in other cases). So it's entirely feasible for someone to set up a VM with true and full access to the GPU, and with a fixed amount of memory and sectioned-off CPU resources.

That said, it's a hassle to set up, and you will always get better and more useful results from simply running the game on another machine. Neither VMs nor artificial resource limiting can account for other software and hardware environments in a sufficient manner.

@willnationsdev
Contributor

> Related to #2096.
>
> I think this is best done with external tools, as it requires a lot of OS-specific wrangling such as setting CPU affinities. GPU throttling is even worse as it also requires driver-specific wrangling.
>
> Either way, I am not 100% convinced that trying to emulate a low-end machine is worth the trouble. It will never be entirely accurate and can end up being misleading for a whole lot of reasons.
>
> If you want to be mostly sure that your game will run well on slow hardware, target a much higher resolution and framerate on your own high-end device. For instance, if you can reasonably achieve a stable 4K @ 120 FPS on your own device, then a much slower machine should be able to achieve a stable 1080p @ 60 FPS (8× lower pixel throughput per second).

> Seems this is an interesting approach, but it doesn't give you 100% accuracy that it will work or not.

Even if it isn't 100% accurate, it's likely to produce a more accurate estimate than forcibly downgrading the hardware through emulation in the engine. Still, I'm not sure how much it would hurt to let the engine be configured with a limit on the system resources it is allowed to use; I could see that being useful regardless.

Projects
None yet
Development

No branches or pull requests

6 participants