Performance tip for vsync on proprietary nvidia drivers #227
I'm a new user of compton who was also looking for a way to get rid of tearing with the proprietary nvidia drivers. I tried the option you mentioned, and it completely worked. Really glad I happened to find your post; I've never seen that mentioned anywhere in Google searches. Compton fixed tearing for me. My only problem was that it caused the color and detail of video to be slightly off, but that's another discussion. Thank you.
Hi, bucaneer and japanese-bird. Sorry for the late reply, firstly. Thanks for the new tip! I've added it to the VSync guide, and I suppose it will be greatly helpful for the nvidia-driver users struggling with tearing issues. :-) As a side note, it's been reported that the option causes a huge (~30%) performance loss in some OpenGL applications: https://devtalk.nvidia.com/default/topic/602831/linux/unrecognized-flatpanelproperties-property-quot-scaling-quot-/post/3941157/#3941157 (I've not tested it myself because I don't have any tearing issues.)
Hmm...
@richardgv I can't replicate the massive OpenGL performance loss on my system. There is some loss, but it's largely negligible: Unigine Heaven loses 0.8% (20.16 fps vs. 20.00 fps), Unigine Valley 1.9% (30.08 fps vs. 29.49 fps), and Portal 4.7% (193.24 fps vs. 184.19 fps). All tests were run through the Phoronix Test Suite, with results averaged over three runs. In any case, this is still better than what I got with the previous compton-based solution (too lazy to benchmark for hard numbers, but it's intuitively obvious).
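For anyone who wants to reproduce numbers like these, here is a rough sketch of such a run with the Phoronix Test Suite; the test profile names are assumptions (check `phoronix-test-suite list-available-tests` for the exact names on your install):

```sh
# Run the same benchmarks twice: once with ForceFullCompositionPipeline off,
# once with it on (toggle it in xorg.conf and restart X between passes).
# PTS prompts for and averages multiple runs by itself.
phoronix-test-suite benchmark unigine-heaven unigine-valley
```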
@bucaneer Thanks for the tip, works perfectly on a GT 240 with driver 331.89, with both compton glx and xrender backends (KWin, Compiz, Mutter).
Confirming on a Dell E6400 (NVIDIA® Quadro® NVS 160M), driver version 340.32.
Update on performance testing: this clashes nastily with OpenGL VSync if that is enabled in the driver configuration or in specific application settings, causing heavy stuttering and low FPS. Perhaps that is what the poster on the nvidia forums saw?
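If the clash comes from driver-level OpenGL vsync, one possible per-application workaround (a sketch, not something tested in this thread) is the NVIDIA driver's __GL_SYNC_TO_VBLANK environment variable; the application name here is purely illustrative:

```sh
# Disable the driver's OpenGL "Sync to VBlank" for a single application,
# leaving ForceFullCompositionPipeline to handle tearing on its own.
__GL_SYNC_TO_VBLANK=0 ./some-opengl-game   # hypothetical app name
```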
Well, I can't reproduce the ~30% drop in performance, either...

(My setup: single monitor; GTX 670; nvidia-drivers-343.22; compton with ...)

The effect on the FPS of glxgears is... interesting. And I didn't notice the clash with "Sync to VBlank" in nvidia-settings.
I'm curious to know how to use this within my xorg.conf when my metamode doesn't look like your example. My entire "Device" section looks like this:

```
Section "Device"
    ...
EndSection
```

I have a dual monitor setup using a GTX 760 and Nvidia binary 343.22. I've had a hell of a time getting my rightmost monitor registered as the primary monitor so that games launch on it, rather than launching on the leftmost monitor, which is not what I want. My window manager is XFCE; I'm using Xubuntu 14.04 but with kernel 3.16.

On a side note, when I view nvidia-settings, the checkbox for "make this the primary monitor for the x screen" is always on the leftmost monitor when I boot into the system. Running nvidia-settings with gksudo or sudo, changing the checkbox to the rightmost monitor, and saving it to my xorg.conf still results in the wrong monitor being labeled as the primary after I reboot. It also changes the way my xorg.conf looks: it changes the device section to look more like yours, BUT then games launch on the wrong monitor (this is the xorg.conf that launches games on the wrong monitor: http://pastebin.com/eHAcQnY4). So to get them to launch on the rightmost monitor I made my xorg.conf look like this: http://pastebin.com/uSA3La2h

To recap, I want a tear-free desktop experience, and I should be able to get one with a GTX 760 and two very capable monitors. I'm using compton as well, by the way; I launch it using ... and its .conf file is here: http://pastebin.com/AUv8FZ8Z

Any help to get a tear-free desktop would be much appreciated.
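The launch command above was elided; a plausible sketch of starting compton with an explicit config file is shown below. The flags are real compton options, but the exact command and the config path are assumptions:

```sh
# Start compton with the xrender backend, a custom config file,
# and fork to the background after initialization.
compton --backend xrender --config ~/.config/compton.conf -b
```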
@ubuntuaddicted You should consult the driver manual about this, but if I'm reading it right, I think you can make it work by changing this line:

...

to this:

...
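The exact before/after lines above were elided; going by the tip that opened this thread, a dual-monitor metamodes line with the option enabled on both displays might look roughly like the sketch below. The display names (DVI-I-1, HDMI-0) and the +1920+0 offset for a side-by-side 1920x1080 pair are assumptions, not the poster's actual values:

```
Option "metamodes" "DVI-I-1: nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }, HDMI-0: nvidia-auto-select +1920+0 { ForceFullCompositionPipeline = On }"
```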
The ...

I don't understand what you were trying to express by describing how you managed to make the games run on the correct monitor, unfortunately. If you still have VSync problems after enabling ...

By the way, doesn't TwinView create a single X screen, so that it should no longer be necessary to launch multiple compton processes (one for each screen)?
I wish I could get this to work for me. I'm experiencing tearing, for instance, in Firefox when scrolling vertically (with smooth scrolling enabled). I have an Nvidia GeForce GT 640M (Optimus) in my laptop, and I have no VSync option in the NVIDIA X Server Settings. So I gave the above a shot, and it didn't work. Luckily, I can just run my Intel graphics with compton and solve the tearing issue, but I was hoping I could get it to work on both graphics chips.
@mmortal03 If you're using the Nvidia proprietary drivers, which software are you using: nvidia-prime or bumblebee? If you're using nvidia-prime, you're out of luck; nvidia-prime will cause tearing. If you're using bumblebee, just use the Primus backend, as it uses the Intel GPU to vsync everything the Nvidia GPU outputs.
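For reference, running an application through the Primus backend under bumblebee typically looks like this (glxgears is just an illustrative test program):

```sh
# Render on the Nvidia GPU, display and vsync through the Intel GPU.
primusrun glxgears
# Equivalent form via optirun, selecting the primus bridge explicitly:
optirun -b primus glxgears
```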
This does nothing for me on a GTX 770 in Debian Testing with the latest 352.30 driver. Tearing is still rampant, and checking "Sync to VBlank" in nvidia-settings does nothing. Ideas?
I'm using the latest version of compton (Arch Linux compton-git from the AUR) and everything is okay until I have a window with OpenGL content (e.g. glxgears). Moving windows around on the screen, previously fine, becomes very slow and choppy. Any ideas what I should do to fix this?
@ElTimablo I appear to have the same problem. Using the XRender backend with a GTX 770, this metamodes option still yields screen tearing.
@ioquatix Interesting. I'm getting this on one monitor out of two while running GNOME. The other monitor is fine, and the only real difference between them is that one is HDMI and the other is DVI. I'm not getting it on my 970, which has both monitors on DVI, so I wonder if the different cables have anything to do with it.
Is it possible to sync one video board to more than one display?
@actionless No, but I'm thinking that there might be some kind of sync between the two DVI ports that just isn't there between the DVI and HDMI ports. I'm getting another DVI cable in the mail on Monday, so I'll try it out then.
So I have both screens of my dual-screen setup connected via DVI, and the tearing that was previously present on only one monitor is now gone. Just an FYI for anyone out there using an HDMI cable and a DVI cable together. With this and the ForceFullCompositionPipeline option set to on, tearing is pretty much gone for me.
Are the displays the same model?
@actionless They're from the same manufacturer, but not the same model. One is an older Acer LCD, and the other is an Acer LED. They are both the same resolution and of similar size, however. It was also the case on my personal desktop (the one I was testing on is my girlfriend's), which has two identical displays. TL;DR: if you have two displays, hook them both up the same way.
@bucaneer, you're an absolute champion for sharing this information. It worked perfectly with the Nvidia GT 730 on my system. Thanks.
GT 570 reporting in, works like a charm, thanks!
The ForceFullCompositionPipeline option regresses applications that synchronize to vsync on their own. E.g. try running a single Chrome/Chromium window on vsynctester.com with and without that option and you'll see what I'm talking about. It's probably OK for games, benchmark applications, and non-hardware-accelerated media players, which usually try not to do vsync on their own.
I've been messing around with screen tearing and will write down some information.

compton with ... Using ...

Today, after upgrading the nvidia drivers (see below) and using ..., ... Not sure if the nvidia update is related or not, but when I remove ...

This is also what I use for ...

So, to conclude: ...
Edit: Actually, the part where ...

Oh, by the way, Nvidia's "Sync to VBlank" in nvidia-settings does literally nothing to solve screen tearing in OpenGL windows. It does not work.
Update on the above... Using ... However, this command works, and if I compare the metamodes using ..., the results differ.

Starting X after the X config file:

```
Attribute 'CurrentMetaMode' (noname:0.0): id=50, switchable=yes, source=xconfig :: DPY-1: nvidia-auto-select @1920x1080 +0+0 {ViewPortIn=1920x1080, ViewPortOut=1920x1080+0+0, ForceCompositionPipeline=On, ForceFullCompositionPipeline=On}
```

After running the command:

```
Attribute 'CurrentMetaMode' (noname:0.0): id=50, switchable=no, source=nv-control :: DPY-1: nvidia-auto-select @1920x1080 +0+0 {ViewPortIn=1920x1080, ViewPortOut=1920x1080+0+0, ForceCompositionPipeline=On, ForceFullCompositionPipeline=On}
```

For some reason ...
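The elided commands above can be sketched from the outputs shown: the metamode is queried and assigned at runtime with nvidia-settings. The metamode string below mirrors the single-display one used elsewhere in this thread:

```sh
# Query the current metamode; prints lines like the two blocks above.
nvidia-settings -q CurrentMetaMode

# Assign a metamode at runtime (no xorg.conf edit); note it then reports
# source=nv-control instead of source=xconfig.
nvidia-settings --assign CurrentMetaMode="nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }"
```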
Now, what is the difference between the command and the X config? I made an image to show you: http://i.imgur.com/DqRjixY.png

I simply tried to copy-paste the same metamode into my xorg.conf:

```
Section "Screen"
    Identifier "Screen0"
    Option "metamodes" "nvidia-auto-select @1920x1080 +0+0 {ViewPortIn=1920x1080, ViewPortOut=1920x1080+0+0, ForceFullCompositionPipeline=On}"
    Option "AllowIndirectGLXProtocol" "off"
    Option "TripleBuffer" "on"
EndSection
```

For what it's worth, in ...

Also, it appears that the issue where ForceFullCompositionPipeline would make games stutter doesn't happen in 370.23.
Seems that I celebrated a little too soon; today, after a reboot, the same issue (slowdowns, stuttery terminal) still seems to happen. It appears to just be random: sometimes when I start X it's fine, sometimes I get glitchy stuff happening as already described. The difference between the X config and the command is that the former reports switchable=yes, source=xconfig while the latter reports switchable=no, source=nv-control. Anyone have any idea what this switchable attribute means?
Yeah, but I can find no info about how to set that via ...
I'll update this with my info. In all the videos with the option set to "On" you can see a bad framerate compared to the same FPS-counter framerate in the videos with it set to "Off". As a result, my understanding is that the option drops all frames with tearing, so when the FPS in your browser, games, or video player goes below the monitor's physical refresh rate, you get many torn frames and they get dropped.

In games without vsync, all frames are rendered with tearing, so when this option is set to "On" you can see 90 FPS reported in-game but only 10-20 "visual FPS". You can see it in this video (option On): https://www.youtube.com/watch?v=KiC2X1C1hZA&list=PLzDEnfuEGFHvqKPwXFUi_DPsDvSleldx6&index=9 compared to "Off": https://www.youtube.com/watch?v=8-Fy91tKHWY&list=PLzDEnfuEGFHvqKPwXFUi_DPsDvSleldx6&index=10

As a side effect, I also get input freezes (keyboard/mouse) during the frame drops when the option is On. I've tested this option for more than a year (I even used it for a week straight; the very bad GUI/xorg reaction to input is the main problem for me), and it stays the same. It's not a problem with my hardware, kernel, or xorg: I can see many users on the internet posting the same results I did, and even this thread on GitHub confirms it. So I still play games with vsync and ForceCompositionPipeline = Off; even with tearing (which doesn't always happen, it's random), that is better and smoother than the option "On". Yes, compositors like compton with vsync make it better, but tearing still exists.

I've also added a "perfect test" for tearing without games.
I don't want to support XRender, so let's just cut all the shit. Welcome to NeoComp.
I ended up putting just ... at the end of ...
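The elided snippet is unknown; one common pattern it could be is assigning the metamode from a startup file. Here is a sketch assuming ~/.xinitrc and an XFCE session (both assumptions):

```sh
# ~/.xinitrc (sketch): force the composition pipeline at login,
# before the window manager starts.
nvidia-settings --assign CurrentMetaMode="nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }"
exec startxfce4   # session command is an assumption
```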
FWIW, this xorg line addition resolved the issue for me: NVIDIA GeForce 1050 on Debian 9 using MATE/Compton (using the GPU compositor). Symptoms were the browser being much slower, typing sometimes stalling in the browser, and the computer generally running hotter.
I struggled for a while trying to get a perfectly tearing-free display with the help of compton (#168), but just now I found a way to do that with the nvidia drivers alone. It is a single line to be added to xorg.conf under Section "Screen":

```
Option "metamodes" "nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }"
```

This removes all tearing even when no compositor is running, at virtually no performance cost. It means that compton itself can go back to simply handling transparency and shadows on its lean --backend xrender without wasting resources on vsync.

The source claims the option is only available on 600-series and newer Nvidia cards, but I can't see anything to that effect in the driver manual, so I assume it should work on all cards covered by drivers since version 319.23 (basically, the 8000 series and newer).
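For context, a minimal complete Section "Screen" that this line slots into might look like the sketch below; the Identifier is illustrative, and the two extra options are taken from a later comment in this thread rather than from the original tip:

```
Section "Screen"
    Identifier "Screen0"
    Option "metamodes" "nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }"
    Option "AllowIndirectGLXProtocol" "off"
    Option "TripleBuffer" "on"
EndSection
```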
I did some benchmarking with gtkperf. The table shows total time (in seconds) for a 100-round test:

...

where the old compton config is this:

...

and the new config is this:

...
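The original configs were elided; as a minimal sketch, a lean "new" config of the kind described (shadows and transparency only, no compton-side vsync) might look like this. The specific option values are assumptions, not bucaneer's actual config:

```
# compton.conf (sketch): keep shadows/transparency, let the driver handle vsync.
backend = "xrender";
vsync = "none";
shadow = true;
```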
Only the old config is tear-free when ForceFullCompositionPipeline is off, but all three are when it is on. ForceFullCompositionPipeline itself does not really have any performance impact beyond the margin of error, but getting rid of the newly redundant compton options provides a significant speed-up.

I think it would be good to mention this in the vsync and/or performance guide. (Yes, the vsync guide does say that compton can't perform better than the drivers, but this particular option is hidden from regular users, and the "Sync to VBlank" option in nvidia-settings is limited to OpenGL applications, and actually doesn't perform well with compton --backend glx.)