commit | 8a1027aa45a8f79c3a1cb5ddaa3c691ae4caf6ad | [log] [tgz] |
---|---|---|
author | Thierry FOURNIER / OZON.IO <thierry.fournier@ozon.io> | Thu Nov 24 20:48:38 2016 +0100 |
committer | Willy Tarreau <w@1wt.eu> | Thu Nov 24 21:35:34 2016 +0100 |
tree | d62ccf37b5fac7ad5eadb6b41dcec712650de614 | |
parent | 7f3aa8b62f645fb2e60158fa5e4129ed5e7a8ef4 [diff] |
MINOR: lua: Add tokenize function.

For tokenizing a string, standard Lua recommends using regexes. The following example splits words:

    for i in string.gmatch(example, "%S+") do
        print(i)
    end

This is a little bit overkill for simply splitting words. This patch adds a tokenize function which is quick and does not use regexes.
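For illustration, a sketch of how the new tokenizer might be called from an HAProxy Lua script. The name and signature used here, `core.tokenize(str, separators [, noblank])`, are assumptions based on the commit summary and HAProxy's Lua API conventions, not confirmed by this commit page:

    -- Hypothetical usage sketch: assumes the patch exposes the tokenizer
    -- as core.tokenize(str, separators [, noblank]) in HAProxy's Lua API.
    core.register_action("split_path", { "http-req" }, function(txn)
        -- Split the request path on "/" and "?", skipping empty tokens.
        local parts = core.tokenize(txn.sf:path(), "/?", true)
        for i, tok in ipairs(parts) do
            txn:Info("token " .. i .. ": " .. tok)
        end
    end)

Compared with the `string.gmatch` pattern shown in the commit message, a plain separator-set scan avoids compiling and matching a pattern on every call, which matters in a per-request hot path.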