builtin.conf: modernize internal profiles #12384
Conversation
To be honest the manual is full of, imo, misleading statements in the scale section. If we do change this profile, it shouldn't be because of what the manual says. As for the actual change itself, I personally use ewa_lanczos, but spline36 is arguably better depending on your preferences. It's not as if this change is going from a definitely inferior filter to a better one. It's an opinion.
What is the goal of gpu-hq actually? To provide reasonably high quality settings, or to put everything at maximum? It is also my personal opinion that anything except
Arguably, we can use For
I agree, if there were one clearly better filter, we wouldn't even need so many options. Still, I think polar Lanczos is the better choice for scale.
I would say, a reasonable high quality profile, to set mpv to the best output it can produce with built-in processing. Is the Frankly, I would like Spline36 to be the default mpv scaler, with maybe a gpu-fast option that defaults to native sampling. Although it is probably good that the default config can work on a potato PC.
I would say, anything higher than
Nah, it is too much. It also forces antiringing at 0.8, and produces nasty artifacts in certain cases because of that. IMHO, not suitable for a general recommendation, but good if someone decides they want sharper results. But at that point, maybe go with a custom shader instead.
I've been using Anything beyond bilinear for I'm sure there are a lot of people using We could point out in the documentation that changing it from
Since it's the gpu-hq profile, I don't think leaving cscale at bilinear makes a lot of sense, even though I can't tell the difference half the time anyway. Leaving it at spline36/orthogonal lanczos should be good enough in every case though. Also, I missed that this changed dscale to catrom too; I strongly disagree with this as well.
This is a very good question to ask. I feel like any more and this just goes into personal preference territory, which is not the goal of a preset IMO.
Spline36 is not a good scaler, and is only really popular because it was, well, popular in the past. It doesn't enjoy any good theoretical properties, and depending on your goal, various other scalers will perform better: If you care about spatial properties like preserving linear gradients (which is equivalent to quadratic convergence; in practice this amounts to having little aliasing and ringing), you should just use a
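For anyone who wants to compare the candidates being discussed, they map to mpv options roughly as follows. This is just a sketch for side-by-side experimentation (only one `scale=` line takes effect at a time), not a recommendation:

```
# Orthogonal (separable) scalers
scale=spline36          # the long-time gpu-hq choice under discussion
scale=lanczos           # orthogonal Lanczos

# Polar (EWA) scalers
scale=ewa_lanczos       # polar Lanczos
scale=ewa_lanczossharp  # sharpened variant
```

Switching between these at runtime (e.g. via keybindings that set the `scale` property) makes the differences on line-art vs. live-action content much easier to judge.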
While I think that modernising the defaults is probably a good idea, deciding which filter to pick is problematic as there isn't a clear winner.
On top of this being very subjective, you also have to take into account that it highly depends on content. If you're mostly watching live-action content full of high-frequency information/noise, you can usually get away with a sharper filter and never notice any of its shortcomings. If you're mostly watching clean line-art, however, all the problems become much easier to see. For downsampling,
On "real content", Hermite is "cleaner" (no negative lobe) than both, and considerably sharper than Mitchell. If any alternative is to be used over Mitchell, it's Hermite.
I could even vouch for polar hermite if that's an option.
The only option that needs to be added to It empirically, measurably and objectively harms the visual quality of my videos, and should be disabled for high quality video rendering. -- In all seriousness, my interpretation of In my opinion, the most important options in
…and everything else is more-or-less just subjective, or completely tertiary to the question of what “high quality” is. As for the discussion about whether anything other than
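For context in this discussion, my reading of builtin.conf is that the gpu-hq profile being replaced expanded to roughly the following (exact contents may differ between mpv versions, so treat this as a sketch, not an authoritative listing):

```
[gpu-hq]
scale=spline36
cscale=spline36
dscale=mitchell
dither-depth=auto
correct-downscaling=yes
linear-downscaling=yes
sigmoid-upscaling=yes
deband=yes
```

That is, the debate above is essentially about which of these lines still deserve to be called “high quality” defaults.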
I don't really think changing from Mitchell to Catmull-Rom is a good idea; without good anti-ringing (such as using ravu-zoom-ar-r3), it's notably worse imo. Also, I've noticed Catmull-Rom is bad when downscaling content that originally had a small resolution, but better on 1080p+. Is it possible to decide based on resolution? ewa_lanczossharp doesn't seem to work too great at lower resolutions either (too much aliasing). ewa_lanczos4sharpest is useless in all scenarios and looks nasty imo. Currently I'm doing this with small modifications to the shaders:
I haven't tried hermite yet, but I'm looking for something in between mitchell and catmull_rom. EDIT: Tried hermite on anime and it's worse than both; so much aliasing that any low-res stuff becomes unwatchable, and it can still be noticed with a 1080p source. It probably does better on real content where aliasing isn't as noticeable. EDIT: NVM, I wasn't testing hermite correctly...
Low-res stuff? ...Are you using it to upscale? It's well known Hermite is terrible for upscaling, though aliasing isn't quite the right word. The filter creates blocking when upscaling.
That wasn't my example, I was talking about older 480p content, upscaled to 4k with my shaders and then downscaled to my native 1440p with either mitchell or hermite. Anime I was checking it on was "Get Backers" on HiDive and some others.
Although it turns out I was incorrect: it doesn't create aliasing, that's the result of sharpening with CAS, but Mitchell does a better job of masking it.
I think you should cool it with the meme shaders.
Makes a difference of 6W for me on a 1080p 60fps YouTube video on a 1080p 60Hz monitor when comparing bilinear with ewa_lanczossharp. @Obegg You're confusing the osd-bar with the osc.
Why? I'm yet to find a better set of shaders that helps get rid of noise and sharpens the image without destroying too much image quality. With mitchell it's fine for me at 480p, although I haven't experimented too much with using a sharper downscaler and turning down CAS; I only ever tried mitchell, lanczos (all versions) and robidouxsharp thoroughly before. EDIT: HAHA, yeah, turns out I wasn't correctly testing hermite...
🤦‍♂️
@christoph-heinrich I have a similarish power draw difference on a 6600 XT, so it has nothing to do with your GPU being old. Still though, while I think taking performance into consideration has its merits,
I think That's why, in libplacebo, I opted for three preset levels (fast, default and highquality). I have no opinion on mitchell vs catrom, but I'd like to at least see some justification for the claim. (What about downscaling HDR sources?)
Very interesting, on my setup the difference between Edit: Just noticed the original post said
But libplacebo default preset also uses
We could make two more profiles for mpv and mirror the ones from libplacebo as closely as possible.
If you ask me, we should make mpv defaults match Other shit like dithering being disabled by default is also just objectively wrong and Honestly, I would like to overhaul the entire options system to make rendering options directly map to their pl_render_params analogs (ideally via |
It was a typo, I meant
Keep it sharp; let users opt in to a blurrier result.
Minor bikeshed, but I'd prefer the use of And in the commit message
This is not a valid value for the option In the future, we could rename
But why? With the new default, dither depth will be auto-detected. Unless auto-detection doesn't work, in which case, shouldn't we just default to
We don't know which platforms are affected. I have to manually set dither-depth=8 on my hardware on Windows/X11/Wayland. You're also on AMD, but you don't. We don't really know what causes it to work for some people and not for others. Dither depth being auto-detected is fine for the people for whom it works, and if it's broken you can simply set one option to change it to a specific bit depth.
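In other words, for affected hardware the workaround is a single line in mpv.conf. The value 8 here is an example matching the comment above; use your display's actual bit depth:

```
# Force dithering to the panel's true bit depth instead of the
# (sometimes misreported) auto-detected backbuffer depth
dither-depth=8
```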
I don't follow. The new defaults are Or maybe we should set
On my end, dither=ordered is 5256.41 fps vs dither=no at 5890.83 fps. For comparison, dither=fruit is 4635.11 fps. (All numbers on
Ah, I missed this since it's part of the same commit, my bad. This is fine then. I guess the point still stands if somebody wanted dithering on the fast profile in environments where
The goal is to provide simple-to-understand quality/performance level profiles for users. Instead of the default and gpu-hq profiles, three main profiles were added:

- fast: can run on any hardware
- default: balanced profile between quality and performance
- high-quality: out-of-the-box high quality experience. Intended mostly for dGPUs.

Summary of the three profiles, including the default one:

```
[fast]
scale=bilinear
cscale=bilinear (implicit)
dscale=bilinear
dither=no
correct-downscaling=no
linear-downscaling=no
sigmoid-upscaling=no
hdr-compute-peak=no

[default] (implicit mpv defaults)
scale=lanczos
cscale=lanczos
dscale=mitchell
dither-depth=auto
correct-downscaling=yes
linear-downscaling=yes
sigmoid-upscaling=yes
hdr-compute-peak=yes

[high-quality] (inherits default options)
scale=ewa_lanczossharp
cscale=ewa_lanczossharp (implicit)
hdr-peak-percentile=99.995
hdr-contrast-recovery=0.30
allow-delayed-peak-detect=no
deband=yes
scaler-lut-size=8
```
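With profiles structured like this, users pick a level either per-invocation on the command line or persistently in mpv.conf, using mpv's standard profile mechanism:

```
# One-off, from the command line:
#   mpv --profile=fast video.mkv
#   mpv --profile=high-quality video.mkv

# Or persistently, in ~/.config/mpv/mpv.conf:
profile=high-quality
```

`mpv --profile=help` lists the available profiles, which is a quick way to check what a given build ships.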
I don't think auto ever worked for me on gpu-next; it has always detected 10-bit on my 8-bit monitor playing 8-bit content. It did detect 12-bit or something weird like that before, so at least it's improved. That said, I could never tell the difference anyway, so not sure what was going on.
Please read #11862 for why this matters. mpv shouldn't need to dither content at all if gpu drivers worked correctly. |
You always need to dither to the backbuffer depth. |
That's an incorrect statement. If you are using an 8-bit backbuffer, we have to dither to 8 bits from whatever internal precision we have. Not dithering is an error, and in practice there shouldn't even be an option to disable it, given how essentially free dithering is.
Yes, but mpv shouldn't need to dither content to the display bit depth. If mpv is offered a 10 bit backbuffer on an 8 bit display then mpv should be able to pass 10 bit video and just assume it'll work. Instead on my system, mpv is offered 16 bit backbuffer and I see banding on my 8 bit display unless I explicitly set dither-depth=8 |
How is this related to this PR again?
I made an ambiguously worded statement that was interpreted as factually incorrect in response to an off topic comment, so I felt the need to correct what I meant. It's not related. I'd flag it as off topic if I could for my own comments. |
No worries, I'm tired and haven't really read everything carefully. Let's focus on things that are relevant to the changes; discussion about specific options/changes can also be moved to follow-up issues to keep better focus. And let's not spam all the people who subscribed to this thread.
Parity with mpv. See-Also: mpv-player/mpv#12384
It's confusing that the mid-quality option is in the gpu-hq profile while the recommended filter is not. Also, prefer the sharper catmull-rom for dscale, as it produces better results.
It's time to retire Spline36.