Idea: fixing the FPS cost of 3D scopes

After playing a few rounds of Insurgency and watching videos of Insurgency: Sandstorm, I've become concerned about the FPS cost of 3D scopes. So I have this idea... (I've never made anything with UE4, but please bear with me.)

When you use a 3D scope, anything outside of the scope could be rendered at a lower resolution or frame rate. I've read about a certain game engine rendering characters at a higher frame rate than the rest of the scene.

There could be new options in the video settings for 3D scopes:
*Scope resolution (low, med, high, adaptive)

"Outside the scope" settings:
*Reduce FPS (limits the outside-the-scope frame rate to 30 and uses a blur effect to make transitions between frames look smoother, like Far Cry 4 does on consoles)

*Resolution reduction (instead of just blurring what's outside your scope, the scene would be rendered at a lower resolution for as long as you're using your 3D scope; a rough sketch of how that might work in UE4 is below this list)
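
Since I don't know the engine well, here's only a minimal sketch of the resolution-reduction option, assuming the scope is a picture-in-picture setup that renders into its own target. The hook names and the 70% value are made up for illustration, not anything from Sandstorm itself.

```cpp
// Minimal sketch (not how NWI necessarily does it): drop the main view's
// render resolution while the player is aiming through a 3D (PiP) scope,
// then restore it when they stop aiming. The 70% value and the two hook
// functions are hypothetical.

#include "CoreMinimal.h"
#include "HAL/IConsoleManager.h"

static void SetMainViewScreenPercentage(float Percentage)
{
    // r.ScreenPercentage scales the main 3D scene's render resolution.
    // A PiP scope draws into its own render target, so the scope image
    // keeps whatever resolution it was given.
    if (IConsoleVariable* CVar =
            IConsoleManager::Get().FindConsoleVariable(TEXT("r.ScreenPercentage")))
    {
        CVar->Set(Percentage);
    }
}

// Hypothetical hooks; call them from whatever drives aim-down-sights.
void OnStartAimingWithPiPScope() { SetMainViewScreenPercentage(70.f); }
void OnStopAiming()              { SetMainViewScreenPercentage(100.f); }
```

The nice part, as far as I understand it, is that r.ScreenPercentage only touches the main scene, so the scope picture should stay sharp while everything around it gets cheaper to render.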

I hope something can be done about the 3D scopes being such performance hogs.

Some are experimenting with foveated rendering in VR (with eye-tracking software). This works by rendering whatever the player is not looking at at a lower resolution. Something similar could be done with objects outside of the scope, if such a thing is possible.

Reducing the FPS would be awful, and lowering the resolution wouldn't be a bad idea, but I'm not sure it can be done without looking bad as well. In general, foveated rendering works best in cases where the downscaled area of the image goes unseen, like in VR with eye-tracking capabilities.

I'm sure the devs have thought of this. If it doesn't end up being used as a solution, then I'm sure they have their reasons.

A win-win scenario would be for the devs to experiment with this and, if it turns out to be plausible, consider adding it as an option.

Reducing FPS to anything under 60 and introducing or forcing motion blur is a terrible idea in my opinion. Many people with newer monitor technology like G-Sync, FreeSync and ULMB can't stand being dropped below 60 FPS, and motion blur looks disgusting because it's never done correctly unless it's a heavily scripted scene that's more like an action movie than a video game.

I for one would rather have more performance-hungry technology that looks great and is silky smooth. It's 2018; not every game should be designed to run on rice cookers powered by potatoes that were designed in 1990, and developers shouldn't focus on making their games accessible to people who refuse to update their computer hardware or haven't upgraded to current-gen systems within the last five years. Sure, Sandstorm is now going multi-platform, so they will have more of a focus on making it run fine on consoles, yet (forgive me if this is elitist, but) due to the origins of the game, PC should receive preferential treatment and heavy optimisation. The current gen of computer hardware DESTROYS the current gen of console hardware, yet not a lot of devs have the time or resources to optimise their titles for the latest computer hardware, which is a huge shame in my humble opinion.

From what I know of the 3D scopes, it's "picture in picture" technology, and as I've not seen or played Sandstorm in its current state, I can't say how its graphics settings operate. What some games I've played do is take your resolution, say 1080p, as the base resolution for rendering; if the game can run at a silky 60+ while stressed and under heavy load, then your picture-in-picture/texture resolution can run beyond that base, say at 4K, if your hardware can handle it without hiccups, and the results are glorious. It's a night-and-day difference.

In Arma 3, for instance, my system runs silky smooth at 60+ on high/ultra at 1080p, yet my texture/picture resolution setting goes beyond my monitor's resolution, so textures and models look better. Of course it comes at a performance cost for the user.
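
For what it's worth, here's a rough sketch of the kind of thing I mean in UE4 terms, assuming the scope image comes from a USceneCaptureComponent2D writing into a UTextureRenderTarget2D (the usual picture-in-picture approach). The function name and the ScopeQualityScale setting are hypothetical, not taken from Sandstorm.

```cpp
// Rough sketch: size the scope's render target relative to the viewport.
// ScopeQualityScale is a hypothetical user setting: 0.5 = scope rendered
// with a quarter of the pixels, 1.0 = match the viewport, 2.0 = supersampled
// scope, at a performance cost (the Arma-style option).

#include "CoreMinimal.h"
#include "Components/SceneCaptureComponent2D.h"
#include "Engine/TextureRenderTarget2D.h"
#include "Engine/Engine.h"
#include "Engine/GameViewportClient.h"
#include "UnrealClient.h"

void ApplyScopeQuality(USceneCaptureComponent2D* ScopeCapture, float ScopeQualityScale)
{
    if (!ScopeCapture || !ScopeCapture->TextureTarget ||
        !GEngine || !GEngine->GameViewport || !GEngine->GameViewport->Viewport)
    {
        return;
    }

    // Base the render target size on the current viewport resolution.
    const FIntPoint ViewSize = GEngine->GameViewport->Viewport->GetSizeXY();
    const uint32 TargetX = FMath::Max(64u, (uint32)(ViewSize.X * ScopeQualityScale));
    const uint32 TargetY = FMath::Max(64u, (uint32)(ViewSize.Y * ScopeQualityScale));

    // Smaller target = cheaper scope; larger than the viewport = supersampling.
    ScopeCapture->TextureTarget->ResizeTarget(TargetX, TargetY);
}
```

A setting like that would let people on weak hardware shrink the scope picture and let people with god-tier rigs supersample it, which is the same trade-off I make in Arma 3.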

I for one will be interested to see exactly how and what the developers do. For all I know, the picture-in-picture technology in its current state is a huge performance hog even for the most god-tier computer hardware, but that could be due to a lack of optimisation, or memory leaks, or a multitude of other things that deteriorate performance to such a degree that it's as noticeable as hell.