Re: New HDR Settings
Where exactly in the INI settings is this new option available?
I've updated to the latest version and I have the new options like disable autofire but cannot find the HDR settings.
Thanks in advance.
It's not part of the main update yet. There's a build buried in the #settings-workshop channel of the MiSTer Discord, posted by Natrox.
Glad people are enjoying it! I am also glad I tried out Mathieulh's idea.
Sorg merged it into the main branch today, so it should see release relatively soon. I am looking forward to more people trying it out.
I am considering changing the order of the HDR options, so that 1 is HLG and 2 and 3 are the old ones.
We could get rid of them, but some displays do not support HLG, so I will keep these HDR options in for the time being.
It was possible to get a pretty close result to HLG with a lot of tweaking, but I am glad HLG makes that unnecessary for the most part.
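For reference, the HLG transfer function mentioned here is defined in ITU-R BT.2100. A minimal sketch of the OETF (my own code, not from the MiSTer source) shows why HLG degrades gracefully on SDR displays: the lower segment is a plain square root, i.e. roughly gamma-like.

```python
import math

# BT.2100 HLG OETF constants
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # 0.55991073

def hlg_oetf(e):
    """Map linear scene light e in [0, 1] to an HLG signal value in [0, 1].

    Below 1/12 the curve is a simple square root, which is close to a
    conventional SDR gamma; only the highlights use the log segment.
    """
    if e <= 1 / 12:
        return math.sqrt(3 * e)
    return A * math.log(12 * e - B) + C
```

The two segments meet continuously at e = 1/12, where both evaluate to 0.5, and the curve reaches approximately 1.0 at e = 1.0.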
FoxbatStargazer wrote: ↑Tue Jan 03, 2023 7:53 pm
Its not part of the main update yet. There's a build buried in the #settings-workshop channel of Mister discord posted by Natrox.
That's right, it isn't included in the stable release yet. The unstable nightly has picked it up and you can try it out if you'd like:
https://github.com/MiSTer-unstable-nigh ... 104_1602a0
It is identical to my build, but official instead. As for the next stable release, it should arrive sometime this month.
There is no fixed release schedule but there has been at least one update every month up till now.
HDR HLG mode works great with LG BX OLED, thank you!
The brightness increase using HDR helps when enabling BFI
Natrox wrote: ↑Wed Jan 04, 2023 6:51 pm
I am considering changing the order of the HDR options, so that 1 is HLG and 2 and 3 are the old ones.
We could get rid of them but there are some displays that do not support HLG. Thus, I will keep these HDR options in for the time.
It was possible to get a pretty close result to HLG with a lot of tweaking, but I am glad HLG makes that unnecessary for the most part.
Yeah, definitely keep the BT2020 and DCI-P3 options. I feel we're barely scratching the surface of what those options can do. I got great results with HLG on my C2, but I would like to mess around with those some more.
Could you please explain video_gain_offset more? There is no real-time adjustment menu, so it's hard for me to wrap my head around.
we're entering levels of simulation that shouldn't be possible!!!
i'm liking it!
Sure. You have six numbers, two for each color channel. Each number can be between -2 and 2.
The first number (gain) is a multiplier. So say you have "120" in the red channel (8-bit, 0-255), and you use a value of "1.1", the final result will be "132" (1.1 * 120).
The second number (offset) is either additive or subtractive (depending on sign). With the same "120" value, if you use an offset of 0.1, the final result would be "146" (120 + 255 * 0.1, rounded).
Gain being a multiplier means that it can be used to change color balance. For example, if you notice the screen has a green tint to it, you can tweak green gain and set it to 0.9 or 0.8 depending on the intensity you are trying to correct. Keep in mind that this also means that the green component of any pixel can never reach 255.
Offset on the other hand is a fixed value you add or subtract. It is a little less obvious what to use it for. I personally like to use it to compensate for reduced gain. So for instance, if I use a gain of 0.9 on green to reduce green tint, I like to use an offset of 0.05 to slightly raise the brightness of green so I don't darken the green parts of the image too much.
Another way to see gain and offset is as contrast and brightness, but per channel. Mathematically they are equivalent. You can do some interesting stuff with it, however. One example in the MiSTer ini is for inverting colors. This uses a gain of -1 and an offset of 1 for each channel. So for value "100", you get "155" (-1 * 100 + 255, i.e. 255 - 100). Whether it is useful to invert colors, well, that is up to you of course. Anyway, that is just an example of what you could do by combining the two.
The order is always: Rgain, Roffset, Ggain, Goffset, Bgain, Boffset.
The default of course is: 1, 0, 1, 0, 1, 0
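The description above boils down to a per-channel linear transform. A minimal sketch (the function name is mine, not from the MiSTer source), reproducing the worked examples from the post:

```python
def apply_gain_offset(value, gain, offset):
    """Apply one channel's gain/offset pair to an 8-bit value (0-255).

    gain is a multiplier; offset is a fraction of full scale (255),
    matching the examples above. The result is clamped to 0-255.
    """
    result = gain * value + offset * 255
    return max(0, min(255, round(result)))

apply_gain_offset(120, 1.1, 0)    # gain only: 1.1 * 120 = 132
apply_gain_offset(120, 1.0, 0.1)  # offset only: 120 + 25.5, rounded = 146
apply_gain_offset(100, -1, 1)     # color inversion: 255 - 100 = 155
```

Note the clamp: any combination that pushes a channel past 255 (or below 0) simply saturates, which is why a gain below 1.0 means the channel can never reach full intensity.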
Afternoon all! First, to introduce myself: I added HDR support to RetroArch and am the author of the Sony Megatron shader, which aims to harness the benefits of HDR support. See my thread here:
https://forums.libretro.com/t/sony-mega ... nthecactus
So I'm here to learn about HDR on MiSTer and to offer any knowledge I have from my experience on more traditional platforms.
The main aim of HDR for RetroArch was to get a more accurate CRT experience by increasing brightness and simplifying the shaders. A lot of SDR shaders blur the phosphor triads in order to turn on more subpixels and therefore gain more brightness. This comes at a cost in accuracy, so HDR opens the door to 100% phosphor masks, backlight strobing, etc. Brightness is king when simulating CRTs, as has been described above.
To touch on a few points I've read above. Firstly, ABL: as has been said, HDR is traditionally about highlights, making those specular glints really ping, so ABL kicks in if you have the whole screen at max brightness. In our case, however, we're normally turning off the vast majority of subpixels (typically all but 3 out of 16, so less than 25% lit); add in the horizontal blanks between scanlines and you're down to lighting near 10% of the screen at full brightness, and far less when viewing dark content. Then add in BFI or backlight strobing and you're down even lower. Whether ABL kicks in at all is up to the manufacturer's algorithm, but it's probably always within the power limits of the TV.
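The back-of-the-envelope numbers in that paragraph can be written out explicitly (a sketch only; the 50% scanline fill and 50% BFI duty figures are illustrative assumptions, not measurements of any particular shader or display):

```python
subpixels_lit = 3 / 16   # typical 100% phosphor mask: 3 of 16 subpixels on (~19%)
scanline_fill = 0.5      # assume half of each scanline period is dark/blanked
bfi_duty = 0.5           # BFI / backlight strobing halves the on-time again

screen_lit = subpixels_lit * scanline_fill  # fraction lit at full brightness
with_bfi = screen_lit * bfi_duty            # and with BFI on top

print(f"{screen_lit:.1%} lit, {with_bfi:.1%} with BFI")
# prints: 9.4% lit, 4.7% with BFI
```

Under these assumptions only about a tenth of the panel is ever driven at full brightness, which supports the point that ABL is unlikely to engage.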
Also, I heard people talk about HDR metadata (MaxCLL etc.): this is something you should now avoid setting, for consistency across TVs. Microsoft in particular has gone back on the idea, as display manufacturers weren't interpreting the values consistently, so one TV looks fine and another terrible. See here for more info:
https://learn.microsoft.com/en-us/windo ... drmetadata
Of course, if the metadata is needed because the MiSTer has to rely on the TV doing the inverse tonemapping, then that's probably something we'll have to live with, i.e. only certain displays will support this.
One thing I wondered: could we do the inverse tonemapping in hardware (instead of a shader)? It'd be relatively simple to implement and would be done on the 240p output image of the core. This is where my knowledge bumps up against the MiSTer's hardware and HDR implementation, though, so I'd like to understand more.
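For readers unfamiliar with the term, here is a minimal sketch of what "inverse tonemapping" can mean: expanding an SDR-range signal back into linear light. I've used the inverse of the simple Reinhard operator L/(1+L) purely for illustration; this is my choice of example, not what RetroArch or any MiSTer build actually implements.

```python
def inverse_reinhard(l):
    """Expand an SDR-range value l in [0, 1) back to linear light.

    Inverts the Reinhard tonemap l = L / (1 + L). The output is
    unbounded as l approaches 1, so a real implementation would
    clamp the result to the display's peak luminance.
    """
    if not 0.0 <= l < 1.0:
        raise ValueError("expected l in [0, 1)")
    return l / (1.0 - l)
```

For example, a mid-level input of 0.5 maps back to linear 1.0, while values near 1.0 expand steeply toward very bright highlights, which is exactly the behavior that makes precision a concern on an 8-bit pipeline.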
Can also confirm that HLG is backwards-compatible, looked perfectly normal on an SDR monitor.
FoxbatStargazer wrote: ↑Thu Jan 05, 2023 3:36 pm
Can also confirm that HLG is backwards-compatible, looked perfectly normal on an SDR monitor.
Did you confirm with a color calibration? I will believe HDR works only when I see a side-by-side of a calibration done in SDR, then flipped to HDR, that maintains near-perfect tone mapping... The RetroTink 5X's HDR mode is way off, at least in HDR10 mode; magenta is purple. I don't want to see that on the MiSTer, as it makes HDR mode pointless imo; brightness with incorrect colors is not a good compromise. Looking forward to testing this myself though, and I will try to post some color accuracy charts.
I'm more confident that the SDR back-compat works exactly as if HLG was off than that the HDR side is perfect. But it is visually a big and noticeable improvement in color accuracy over the other HDR modes.
I did have an issue on my C1 where the white text on the menu looked a bit red at default settings. Pulling the MiSTer's contrast/brightness down to 45 fixed it. It's possible something is wonky, but I'm more inclined to blame my C1 config at the moment. I was told to max color warmth, so that might be related somehow.
HDR10 is incorrect on both the MiSTer and the RetroTink 5X Pro: magenta is violet, reds are too cold, etc. HLG, though, is perfectly fine color-wise in both implementations. But I think the MiSTer has a slight advantage in HLG mode; the increased luminance pops a bit more. Don't know why.
Thank you for the explanation. It's still pretty hard to work with without a menu for dialing it in, heh...
On my C1 I get the best results with Tone Mapping = On; it gets too dark otherwise. Still not sure it's perfect, but I think the theory is that with HLG you should rely on auto tone mapping? Color Gamut is on auto as well.
FoxbatStargazer wrote: ↑Sat Jan 07, 2023 4:07 am
On C1 I get best results with Tone Mapping = On, gets too dark otherwise. Still not sure its perfect but I think the theory is that with HLG you should rely on auto tone mapping? Color Gamut is auto as well.
Yeah I have Dynamic Tone Mapping = on for my C2. It's definitely the best setting for HLG.
I tried the CRT Simulation gamma, which helped certain tones, especially Sonic's skin tone, so I think we could get better color with a good gamma curve. I also got much, much better color by using a really bright shadow mask like Retrotink AG-2, darker scanlines like SLA_Dk_030_Br_050, and then a bit of gamma down like Poly gamma 2.4. I made an even brighter shadow mask to whack it up even more, heh heh. One of the big reasons for HDR is to have high brightness with scanlines and masks on anyway, so I think I found a combo that's working great for me.
I tried "Contrast=55" in the MiSTer.ini settings too. It gives you a little bit more pop without clipping. Well, on my C9 at least. I assume HLG already looks a bit better on the newer and brighter LG Evo panels.
KennyL wrote: ↑Sat Jan 07, 2023 2:08 am
Thank you for the explanation. It's still pretty hard to work with without menu for dialing heh....
MajorPainTheCactus wrote: ↑Thu Jan 05, 2023 2:12 pm
Afternoon all, first to introduce myself: I added HDR support to RetroArch and am the author of the Sony Megatron shader that aims to harness the benefits of HDR support.
This is really cool. I haven't used RetroArch for a while, so I didn't know it got HDR last year. Really cool to have another comparison point. Your Megatron shader is very, very dark on my setup, not like your pictures. I'll dig around for a fix on the internet.
Going back to MiSTer HLG: I haven't done an extensive comparison yet, but Genesis Sonic is looking dark and off on my C2. I would like to get a sanity check here.
- Mister HLG with default ini setting
- Mister SDR
- Retroarch (Peak Luminance 800x, Paper White Luminance 300, Contrast 6x)
mister_hdr_sonic.jpg
With regards to the Megatron shader being dark: you probably need to change the paper-white value in the shader parameters. Whack it right up to peak and see if it still looks right; if so, you're good. Peak should be set to your TV's peak luminance as stated on RTINGS.
MajorPainTheCactus wrote: ↑Thu Jan 05, 2023 2:12 pm
Afternoon all, first to introduce myself: I added HDR support to RetroArch and am the author of the Sony Megatron shader that aims to harness the benefits of HDR support.
It would be amazing if you could assist with the HDR implementation on MiSTer. Your RetroArch shaders are easily the best and the first that truly wowed me when it comes to CRT simulation.
It would be great if mathieulh or any of the other devs working on the HDR implementation could pick up on this!
Natrox wrote: ↑Sun Jan 01, 2023 1:47 am
The MiSTer could do something similar actually. Dynamically setting luminance information based on the current frame is possible.
To do this you could repeat this procedure every second or so (to avoid killing the CPU):
- Access the scaler buffer which has the raw video output.
- Linearly read it while keeping track of maximum luminance (in 0.0 - 1.0 space), minimum luminance and average luminance.
- Scale the maximum luminance with the display luminance configured. E.g. if the peak in the image is 0.5, you would set a max luminance of 500 (0.5x1000). This will be used for MaxCLL as well.
- Use the average luminance to calculate MaxFALL. This variable is the average luminance of the scene, so we can calculate it. Just as with MaxCLL, scale the average with the display luminance.
- Send adjusted HDR metadata infoframe.
This should, in theory, result in HDR that adjusts itself to the content on screen. I have not tried it yet as it did not seem necessary at this point.
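The quoted procedure could look something like this (a sketch only; the frame layout, the Rec.709 luma weights, and the 1000-nit configured peak are my assumptions, not details of the MiSTer scaler):

```python
import numpy as np

def frame_hdr_metadata(frame, display_peak_nits=1000.0):
    """Compute (MaxCLL, MaxFALL) for one frame of the scaler buffer.

    frame: HxWx3 float array, linear RGB in 0.0-1.0 (assumed layout).
    Per-pixel luminance is approximated with Rec.709 luma weights, then
    the peak and average are scaled by the configured display luminance,
    as described in the quoted steps.
    """
    luminance = frame @ np.array([0.2126, 0.7152, 0.0722])
    max_cll = float(luminance.max()) * display_peak_nits   # brightest pixel
    max_fall = float(luminance.mean()) * display_peak_nits  # frame average
    return max_cll, max_fall
```

A uniform mid-grey frame of 0.5 would report MaxCLL = MaxFALL = 500 nits, matching the 0.5 × 1000 example in the quote; the resulting pair would then be sent in the HDR metadata infoframe.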
Interesting; I guess it is somewhat possible, although I wonder whether a once-a-second update would be too great a compromise. In any case, why would you want functionality like this? AutoHDR makes sense for film, since recordings of real life (such as the sun) obviously have far higher luminance in reality than what a 200-nit SDR display can show. AutoHDR can arguably output a more realistic image in that case.
I don't see why you would aim for the same thing for 16-bit, let alone 8-bit, old video games. HDR or large luminance ranges did not exist on the consoles the MiSTer is emulating. The only concern is to display the 256 or 65536 colors as accurately and as brightly as ideally necessary. It is a static thing; I don't see a need to dynamically adjust the luminance. That was not a thing on CRTs, so why should it be on the MiSTer?
That is what I meant when I said the Mister should not attempt such functionality; it does not contribute to a more realistic CRT-like image.
Asteld wrote: ↑Thu Jan 12, 2023 10:10 am
Interesting, I guess it is somewhat possible, although I am curious if a once a second update would not be too great of a compromise. Though in any case, why would you want to have functionality like this? AutoHDR makes sense for film, since recordings of real life (such as the sun) obviously have way higher luminance in reality than what a 200nits SDR display can show. AutoHDR can arguably output a more realistic image in this case.
I don't see why you would aim for the same thing for 16bit, let alone 8bit old video games. HDR or large luminance ranges did not exist on the consoles the MiSter is emulating. The only concern is to display the 256 or 65536 colors as accurate and bright as ideally necessary. It is a static thing, I don't see how there is a need to dynamically adjust the luminance. That was is not a thing on CRTs, so why should it be on the mister?
That is what I meant when I said the Mister should not attempt such functionality; it does not contribute to a more realistic CRT-like image.
AutoHDR is good for 3D video games even when they were not intended for HDR. It works pretty well; have you tried it?
The reason for HDR on MiSTer is to compensate for (artificial) scan lines and shadow masks making the image much darker than it should be.
MajorPainTheCactus wrote: ↑Thu Jan 05, 2023 2:12 pm
One thing I thought is can we do the inverse tonemapping in hardware (instead of a shader)? Itd be relatively simple to implement and would be done on the 240p output image of the core. This is where my knowledge bumps up against the hardware of mister and hdr implementation though so Id like to understand more.
It may be possible, but probably not, given that we are not outputting in 10-bit mode; we would necessarily lose more precision by attempting this. Beyond that, it would cost FPGA fabric, which is hard to justify considering the niche and the fact that the current implementation of HDR is "good enough" for most.
Right now, the HDR implementation is no more than setting the metadata. Omitting the luminance levels results in an unchanged image on HLG, on my end anyway. I'm not extremely well versed in the HDMI standard, but we don't have much more available than this. Strictly speaking, setting HDR metadata is not even supported by the version of the standard we use (v1.4), so I'm not sure there is more to be done with what we have.
Speaking generally, I think having shader support would be awesome, although it's out of the realm of possibility with the MiSTer as-is. Might be an interesting hobby project for someone to turn a second DE10 into a dedicated scaler with HLSL support. It would be hard but certainly possible. This way, you could emulate some of the more complex CRT quirks.
I'm liking this new feature! The darker low midtones with the DCI setting seem reminiscent of Wells-Gardner arcade CRT monitors, whereas HLG seems to evoke more of a Sony CRT look. Probably just a coincidence, but I think if we could properly characterize a raw CRT, with its gamma S-curve, and map that onto HDR, we might just be onto something: something perhaps more arcadish than strictly accurate to sRGB. It's also similar to what digital cameras do to screenshots, which in certain cases seem to be going for a film look. Perhaps a film EOTF is something we could also experiment with, using creator color spaces in HDR as opposed to a strictly neutral consumer color space like SDR. Great work!
thisisamigaspeaking wrote: ↑Thu Jan 12, 2023 1:36 pm
AutoHDR is good for 3d video games even when they were not intended for HDR. It works pretty well, have you tried it?
Dunno about the Xbox version, but I haven't been too impressed on Windows. In most games, if I dare to turn it up, I get blinded by solid white sections.
This is not to say it's necessarily a bad idea for the MiSTer... although I have similar struggles, honestly! I've settled on SDR and just turning up my OLED for now; it's still not even at max brightness.
FoxbatStargazer wrote: ↑Fri Jan 06, 2023 12:00 am
I did have an issue on my C1 where the white text on the menu looked a bit red at default settings. pulling the mister's contrast/brightness down to 45 fixed it. It's possible there is something wonky but I'm more liable to blame my C1 config at the moment. Was told to max color warmth so that might be related somehow.
My menu is a bit warmer after turning on HDR too. I'm using a C8.
Just thought I'd share my settings on my C1, which give me something quite similar to my Panasonic Quintrix CRT.
Mister:
HDR = 1
Contrast = 55
Horz filter:
GS_Sharpness_55
Vert filter:
SLA_Dk_70_Br_80
Gamma Correction:
Poly 2.3
Shadow Mask:
Consumer Tv (generic) (rgb)
C1-settings:
Oled motion PRO/BFI (high)
Colour Gamut:
Native
Tint:
R10 (otherwise I think the yellow is too green)
HDR Tone Mapping:
On
Peak Brightness:
High
What video filter settings are you using to closely resemble a PVM?
@Natrox I don't know what happened to the HLG option, but now it's dim and colorless. No amount of tone mapping is brightening it up. I guess the critics won, because it doesn't even look 'just' like SDR at this point; it's bland. Please don't do that to the DCI setting; leave it the way it is. It's arcadish, and I had been trying to get that look for years. If there's a way to get it back to the way it was before, please tell me exactly how. If people wanted HDR to look precisely like SDR, they should have just stuck with SDR. Otherwise, what's the point? I came here originally to praise this addition because I had foreseen something like this, but I'm only one voice.
You can try raising the color on either your TV or in the MiSTer.ini, but yeah, I did notice a change there vs. when HLG was mode 3.
FoxbatStargazer wrote: ↑Fri Jan 27, 2023 11:44 pm
You can try raising the color on either your TV or the Mister.ini but yeah I did notice a change there vs. when HLG was mode 3.
Thanks. I can't, because it's 4:4:4 with greyed-out options on my set, so the only adjustable parameters for HLG are brightness, contrast, tone mapping, gamma, and local dimming level. All do very little, and it seems the brightest I can get is almost as bright as SDR and slightly less colorful, as a higher peak level also seems to add to color saturation in HDR. Contrast and tone mapping do very little, and the controls in MiSTer.ini seem to max out at the default settings. It looks just like SDR inside an HDR container on my PC desktop at default settings: slightly dimmer and with a touch less color. It looks as if the white point is permanently set to about 200-400 nits, with no option to change it. If only we could control the peak level directly, as contrast seems to max out at the default in this new configuration, unless I'm missing something. I like the system RetroArch uses, with a peak level and a paper-white level. I just wish we had direct control over the peak level, even if it was set the way it currently is by default.