Remake the light_decode_table.
The table now starts out without pre-filled values, since the code
apparently always discards them anyway. We calculate a pseudo curve
with a gamma power function, and then apply a new adjustment table.
The adjustment table is set up to make the default gamma of 2.2
look decent: not too dark at light level 3 or so, though at 1 and
below it is still too dark to be playable. The curve is much smoother
than before and looks reasonable over the whole range, offering a
pleasant decay of light levels away from lights.
The `display_gamma` setting now actually does something logical:
the game is darker at values below 2.2, and brighter at values
above 2.2. At 3.0, the game is very bright, but still has a good
light scale. At 1.1 or so, the bottom 5 light levels are virtually
black, but you can still see enough detail at light levels 7-8,
so the range and spread are adequate.
I must add that my monitor is somewhat dark to begin with, since
I have a `hc` screen that doesn't dynamically expand color range or
try to pull up `black` pixels for me (it is tuned for accurate color
and light levels), so this should look even better on more dynamic
display tunings.
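To make the idea concrete, here is a rough standalone sketch of the approach
(not the engine code): a pseudo curve from a gamma power function, pulled
through a small hand-tuned adjustment table. The light range and the
adjustment values below are illustrative assumptions, not the numbers
actually shipped.

```cpp
#include <cmath>
#include <cstdio>

int main()
{
    const int light_max = 14;          // assumed maximum in-game light level
    const double display_gamma = 2.2;  // the default discussed above

    // Hypothetical hand-tuned adjustment table sampled across the curve.
    const double adjust[3] = {0.8, 1.0, 1.1};

    for (int level = 0; level <= light_max; level++) {
        double x = (double)level / light_max;

        // Pseudo curve: a plain gamma power function. A larger display_gamma
        // pushes mid levels up (brighter), a smaller one pulls them down.
        double brightness = std::pow(x, 1.0 / display_gamma);

        // Apply the adjustment table by linear interpolation over the range.
        double pos = x * 2.0;          // position within the 3-entry table
        int i = (int)pos;
        if (i > 1)
            i = 1;
        double frac = pos - i;
        double adj = adjust[i] * (1.0 - frac) + adjust[i + 1] * frac;

        double out = brightness * adj;
        if (out > 1.0)
            out = 1.0;
        printf("light %2d -> %.3f\n", level, out);
    }
    return 0;
}
```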
As mgv7 is now the default mapgen, I re-checked its tunnel width on
request, discovered the tunnels needed to be wider, and made this change.
This commit widens the identical 3D noise tunnels in the other mapgens in
exactly the same way.
Move static object storage force-delete message from errorstream to
warningstream.
Increase 'max objects per block' setting to 64.
Add missing spaces in warning code.
Re-add documentation of noise parameter formats.
Re-add 'mgv5_np_ground' noise parameters in group format.
Both of these were deleted by the auto-generation of conf.example.
Add note to builtin/mainmenu/dlg_settings_advanced.lua that this
documentation must be preserved.
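For reference, the two noise parameter formats look roughly like this in
minetest.conf (the numeric values here are purely illustrative, not
necessarily the shipped mgv5 defaults):

```
# Positional format:
# offset, scale, (spread_x, spread_y, spread_z), seed, octaves, persistence, lacunarity
mgv5_np_ground = -10, 40, (80, 80, 80), 983240, 4, 0.55, 2.0

# Group format:
mgv5_np_ground = {
    offset      = -10
    scale       = 40
    spread      = (80, 80, 80)
    seed        = 983240
    octaves     = 4
    persistence = 0.55
    lacunarity  = 2.0
}
```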
Disable the ability to connect to old servers by default to
improve password security.
If people still want to connect to old (0.4.12 and earlier)
servers, they can flip the send_pre_v25_init setting.
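That is, in minetest.conf:

```
# Re-enable the legacy handshake for old (0.4.12 and earlier) servers
send_pre_v25_init = true
```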
Add the ability to detect if we've tried to connect
to a server which only supports the pre v25 init protocol,
and show an appropriate error message. Most of the time the error
will already be caught at the serverlist level; the detection
mechanism only acts as a last resort, because the "Connection timed
out" error message that would otherwise be shown would be very
confusing.
Automatic "fixing" of this condition is not desired,
as it would allow for downgrade attacks.
As 161 of the 167 servers on the serverlist (> 96%) already support
the new SRP based auth protocol, the breakage should be minimal.
Follow-up of commit
af30183124 "Add option to not send pre v25 init packet"
Also change the pessimistic assumption about masterlist server
versions to an optimistic one, in order to avoid buggy behaviour
(connections to favourites not in the serverlist would be denied,
etc).
Reduce spread from 96 to primes 61 and 67 (either side of 64)
Prime spreads help to keep 3D noise periodic features unaligned
Set 'cave width' to 0.2 to preserve tunnel width
Reduce octaves to 3 to improve network structure
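For illustration, a rough standalone sketch of what these numbers do. This
is not the engine's cave code: trivial sine stand-ins replace the fractal
noise and the exact carving condition in the mapgens differs, but the shape
of the idea is the same. Two independent 3D noises carve a tunnel where both
are near zero, 'cave width' sets how near counts, and prime spreads keep the
two periodic patterns from lining up.

```cpp
#include <cmath>
#include <cstdio>

// Stand-in "noises" using the prime spreads 61 and 67; with primes the two
// periodic patterns take a very long time to line up, so repeating features
// stay unaligned.
static double noise_a(double x, double y, double z)
{
    return std::sin(x / 61.0) * std::cos(y / 61.0) * std::sin(z / 61.0);
}

static double noise_b(double x, double y, double z)
{
    return std::cos(x / 67.0) * std::sin(y / 67.0) * std::cos(z / 67.0);
}

// A node is part of a tunnel where both noises are close to zero; raising
// cave_width widens the tunnels, lowering it narrows them.
static bool is_tunnel(double x, double y, double z, double cave_width)
{
    return std::fabs(noise_a(x, y, z)) < cave_width &&
           std::fabs(noise_b(x, y, z)) < cave_width;
}

int main()
{
    // Count carved nodes in a small volume for two different widths.
    const double widths[2] = {0.1, 0.2};
    for (double w : widths) {
        long carved = 0;
        for (int x = 0; x < 80; x++)
            for (int y = 0; y < 80; y++)
                for (int z = 0; z < 80; z++)
                    if (is_tunnel(x, y, z, w))
                        carved++;
        printf("cave_width %.1f -> %ld of %d nodes carved\n",
               w, carved, 80 * 80 * 80);
    }
    return 0;
}
```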
ABMs are hardcoded to run every 1.0s, NodeTimers are hardcoded to
run every 1.0s, and block management runs every 2.0s.
However, server owners could usefully tune these timers to both higher
and lower values. Some server owners want to, and have the resources
to, send more packets per second to clients, and so may wish to send
smaller updates sooner. Right now all ABMs are coalesced into 1.0
second intervals, resulting in large send queues to all clients. By
reducing these intervals, updates go out in smaller batches, giving a
far better response rate and lowering the perception of lag.
On the other side, some servers may want to increase these values,
which again isn't easily doable.
The global settings abm_interval and nodetimer_interval default to
the current values. I've tested with 0.2/0.5 type values and noticed
greatly improved response and better scattering of NodeTimers, as
well as enjoying not faceplanting into doors with pressure plates
anymore.
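For example, in minetest.conf (which interval gets 0.2 and which gets 0.5 is
a matter of taste and testing; these are just the kind of sub-second values
referred to above):

```
# Run ABMs and NodeTimers more often than the 1.0s defaults
abm_interval = 0.5
nodetimer_interval = 0.2
```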
The legacy init packet (pre v25) sends information about the client's
password that a server could use to log in to other servers if the
username and password are the same. The other benefits of protocol
v25's SRP are also lost if the legacy init packet is still sent during
connection creation.
This patch adds an option to not send the pre v25 init packet. Not
sending that packet means breaking compat with pre v25 servers, but as the
option is not enabled by default, no servers are affected unless the
user explicitly flips the switch. More than 90% of the servers on the
serverlist support post v25 protocols.
The patch also fixes a bug where the greying out of non-compliant
servers was done wrongly: the min and max params were mixed up.
Replace simple caves with V5 caves, adding unpredictable water and lava
settings and massive caves based on subterrain. Remove fast terrain mode
and accompanying settings. Remove superfluous temperature/humidity
settings. Remove lava/water height setting. Fix errors in humidity
handling and remove humidity_break_point setting. Move cave noises to
generateCaves. Fix minor formatting/naming issues and use
MYMAX/MYMIN/myround.
The pageflip mode requires a stereo quadbuffer and a modern graphics
card. Patch tested with NVidia 3D Vision.
The mini-map is not drawn, but that's what is done for topbottom and
sidebyside modes as well.
Also, most of the time the user would prefer the HUD to be off.
That's for the user to decide though, by toggling it manually.
Finally, the interocular distance (aka eye separation) is twice the
"3d_paralax_strength" setting. I find this a strange design decision.
I didn't want to change this though, since it's how the other 3d modes
interpret this setting.
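Presumably each eye is offset by the full strength in opposite directions;
a tiny sketch of that arithmetic (the value is illustrative, not a real
default):

```cpp
#include <cstdio>

int main()
{
    // Each eye offset by the full parallax strength, in opposite directions,
    // makes the eye separation twice the setting's value.
    const float paralax_strength = 0.025f;   // illustrative value
    const float left_eye_x  = -paralax_strength;
    const float right_eye_x = +paralax_strength;
    printf("interocular distance = %g\n", right_eye_x - left_eye_x);  // 0.05
    return 0;
}
```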