BUG/MINOR: ssl/lua: prevent lua from affecting automatic maxconn computation

Since commit 36d1374 ("BUG/MINOR: lua: Fix SSL initialisation") in 1.6, the
Lua code always initializes an SSL server. This causes a small but visible
side effect: calling ssl_sock_prepare_srv_ctx() forces
global.ssl_used_backend to 1, making the initialization code believe that
some backends contain SSL servers. This detection is used to figure out
how to set the global maxconn value when only the memory usage is limited.
As a result, even a configuration with no SSL at all ends up with a very
conservative maxconn.
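
For illustration, the principle looks like the sketch below. The names
(struct global_cfg, SSL_SESSION_COST, ideal_maxconn) are made up for this
example and do not match HAProxy's internals; the point is only that each
side believed to use SSL raises the estimated per-connection memory cost,
hence a lower maxconn for the same memory budget:

   #include <stddef.h>

   #define BUFSIZE          16384   /* tune.bufsize */
   #define SSL_SESSION_COST 16384   /* assumed per-side SSL overhead */

   struct global_cfg {
           int ssl_used_frontend;   /* an SSL "bind" line was seen */
           int ssl_used_backend;    /* an SSL "server" line was seen */
   };

   /* simplified sketch of an automatic maxconn computation */
   static int ideal_maxconn(const struct global_cfg *g, size_t mem_bytes)
   {
           size_t per_conn = 2 * BUFSIZE;  /* request + response buffers */

           if (g->ssl_used_frontend)
                   per_conn += SSL_SESSION_COST; /* SSL towards clients */
           if (g->ssl_used_backend)
                   per_conn += SSL_SESSION_COST; /* SSL towards servers */

           return (int)(mem_bytes / per_conn);
   }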

The configuration below exhibits this:

   global
        ssl-server-verify none
        stats socket /tmp/sock1 mode 666 level admin
        tune.bufsize 16384

   listen  px
        timeout client  5s
        timeout server  5s
        timeout connect 5s
        bind :4445
        #bind :4443 ssl crt rsa+dh2048.pem
        #server s1 127.0.0.1:8003 ssl

Starting it with "-m 200" to limit it to 200 MB of RAM reports a maxconn
of 1500. The value stays at 1500 when uncommenting the "server" line, and
drops to 1300 when uncommenting the "bind" line, regardless of the
"server" line's status.

In practice it doesn't make sense to consider that Lua's server template
counts as one regular SSL server, because even if used for SSL, it will
not carry large connection counts, unlike a backend relaying traffic.
Thus the solution consists in resetting ssl_used_backend to its previous
value after creating the server_ctx from the Lua code. With the fix, the
same config with the same parameters now shows:
  - maxconn=5700 when neither side uses SSL
  - maxconn=1500 when only one side uses SSL
  - maxconn=1300 when both sides use SSL
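
In code form, the idea is the save/restore pattern below. This is only a
sketch: the actual change lives in the Lua socket initialization, and
"socket_ssl" is a placeholder name for the Lua server template described
above:

   #ifdef USE_OPENSSL
           {
                   /* Preparing the SSL ctx for Lua's server template sets
                    * global.ssl_used_backend as a side effect. Save the
                    * flag and restore it once the ctx has been created so
                    * that the maxconn computation is not affected.
                    */
                   int saved_used_backend = global.ssl_used_backend;

                   ssl_sock_prepare_srv_ctx(&socket_ssl);
                   global.ssl_used_backend = saved_used_backend;
           }
   #endif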

This fix can be backported to versions 1.6 and beyond.