efi_loader: MAX_UTF8_PER_UTF16 = 3

The constant MAX_UTF8_PER_UTF16 is used to calculate the required
memory when converting from UTF-16 to UTF-8. If this constant is
too big, we waste memory.
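A minimal sketch, not taken from this patch, of how such a constant
typically bounds the destination buffer (assuming <stddef.h> and
charset.h are included; the helper name is hypothetical):

    /*
     * Illustrative sketch: worst-case UTF-8 buffer size for a
     * UTF-16 string of 'len' code units, plus a trailing NUL.
     */
    static size_t utf8_buffer_size(size_t len)
    {
            return len * MAX_UTF8_PER_UTF16 + 1;
    }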

A code point encoded by a single UTF-16 symbol is converted to at
most three UTF-8 symbols, e.g. 0xffff is encoded as 0xef 0xbf 0xbf:
the first byte carries four bits of the code point, the second and
third byte carry six bits each.
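For illustration only (not code from this patch), the three-byte
encoding of a BMP code point c in the range 0x0800..0xffff can be
written as follows; the function name is hypothetical:

    /* Illustrative sketch: c = 0xffff yields 0xef 0xbf 0xbf. */
    static void utf8_encode_bmp(unsigned char buf[3], unsigned int c)
    {
            buf[0] = 0xe0 | (c >> 12);          /* 1110xxxx: top 4 bits    */
            buf[1] = 0x80 | ((c >> 6) & 0x3f);  /* 10xxxxxx: middle 6 bits */
            buf[2] = 0x80 | (c & 0x3f);         /* 10xxxxxx: low 6 bits    */
    }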

A code point encoded by two UTF-16 symbols (a surrogate pair) is
converted to four UTF-8 symbols.

So in this case we need at most two UTF-8 symbols per UTF-16
symbol.
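As an illustration (again not code from this patch), the surrogate
pair 0xd800 0xdc00 represents code point 0x10000, which UTF-8
encodes as the four bytes 0xf0 0x90 0x80 0x80; the function name
is hypothetical:

    /*
     * Illustrative sketch: combine a surrogate pair (hi, lo) into a
     * code point and emit its four-byte UTF-8 sequence.
     */
    static void utf8_encode_pair(unsigned char buf[4],
                                 unsigned int hi, unsigned int lo)
    {
            unsigned int c = 0x10000 +
                    ((hi - 0xd800) << 10) + (lo - 0xdc00);

            buf[0] = 0xf0 | (c >> 18);          /* 11110xxx */
            buf[1] = 0x80 | ((c >> 12) & 0x3f); /* 10xxxxxx */
            buf[2] = 0x80 | ((c >> 6) & 0x3f);  /* 10xxxxxx */
            buf[3] = 0x80 | (c & 0x3f);         /* 10xxxxxx */
    }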

As the overall maximum is three UTF-8 symbols per UTF-16 symbol
we need MAX_UTF8_PER_UTF16 = 3.

Fixes: 78178bb0c9d ("lib: add some utf16 handling helpers")
Signed-off-by: Heinrich Schuchardt <xypron.glpk@gmx.de>
Signed-off-by: Alexander Graf <agraf@suse.de>
diff --git a/include/charset.h b/include/charset.h
index 39279f7..37a3278 100644
--- a/include/charset.h
+++ b/include/charset.h
@@ -9,7 +9,7 @@
 #ifndef __CHARSET_H_
 #define __CHARSET_H_
 
-#define MAX_UTF8_PER_UTF16 4
+#define MAX_UTF8_PER_UTF16 3
 
 /**
  * utf16_strlen() - Get the length of an utf16 string
@@ -52,7 +52,7 @@
  * Converts 'size' characters of the utf16 string 'src' to utf8
  * written to the 'dest' buffer.
  *
- * NOTE that a single utf16 character can generate up to 4 utf8
+ * NOTE that a single utf16 character can generate up to 3 utf8
  * characters.  See MAX_UTF8_PER_UTF16.
  *
  * @dest   the destination buffer to write the utf8 characters