forked from Mirrorlandia_minetest/minetest
094a5a73d3
Remake the light_decode_table.

The table no longer starts out with pre-filled values, since those are apparently always discarded by the code. We calculate a pseudo curve with a gamma power function and then apply a new adjustment table. The adjustment table is set up to make the default gamma of 2.2 look decent: not too dark at light level 3 or so, but too dark to be playable at level 1 and below. The curve is much smoother than before and looks reasonable across the whole range, offering a pleasant decay of light levels away from light sources.

The `display_gamma` setting now actually does something logical: the game is darker at values below 2.2 and brighter at values above 2.2. At 3.0 the game is very bright, but still has a good light scale. At 1.1 or so the bottom 5 light levels are virtually black, but you can still see enough detail at light levels 7-8, so the range and spread are adequate.

I must add that my monitor is somewhat dark to begin with, since I have an `hc` screen that doesn't apply dynamic-range tricks to colors or try to pull up `black` pixels for me (it is tuned for accurate color and light levels), so this should look even better on more dynamic display tunings.
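A minimal sketch of how such a table might be built, assuming Minetest's 15 light levels (0..14) and an 8-bit brightness range; the `build_light_table` helper and the adjustment values below are hypothetical illustrations, not the ones introduced by this commit:

```cpp
#include <cmath>
#include <cstdint>

// Sketch: gamma power curve plus a per-level adjustment table.
// Assumes 15 light levels (0..14) and 8-bit output, as in Minetest.
static const int LIGHT_LEVELS = 15;
static uint8_t light_decode_table[LIGHT_LEVELS];

void build_light_table(float display_gamma)
{
	// Illustrative placeholder adjustments (NOT the values from this commit):
	// small boosts in the middle of the range so that gamma 2.2 stays playable.
	static const float adjustments[LIGHT_LEVELS] = {
		0.000f, 0.000f, 0.010f, 0.025f, 0.035f, 0.045f, 0.050f, 0.050f,
		0.045f, 0.035f, 0.025f, 0.015f, 0.008f, 0.000f, 0.000f
	};

	for (int i = 0; i < LIGHT_LEVELS; i++) {
		// Pseudo gamma curve: normalize the level, raise to 1/gamma,
		// then add the adjustment and scale back to 0..255.
		float x = (float)i / (LIGHT_LEVELS - 1);
		float c = std::pow(x, 1.0f / display_gamma) + adjustments[i];
		if (c < 0.0f) c = 0.0f;
		if (c > 1.0f) c = 1.0f;
		light_decode_table[i] = (uint8_t)(255.0f * c + 0.5f);
	}
}
```

With this kind of curve, calling `build_light_table(2.2f)` produces the default behaviour described above, while lower gamma values compress the bottom of the table toward black and higher values brighten the whole range.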
Directory contents: async/, common/, fstk/, game/, mainmenu/, profiler/, init.lua, settingtypes.txt