Terminus Font is a clean, fixed-width bitmap font, designed for long
(8 or more hours per day) work with computers.
The font has very good script coverage, and subsets in other
encodings can be imported in the future.
This commit imports IBM437-encoded (WSDISPLAY_FONTENC_IBM) subsets
converted to WSF format. The wsfont metadata is stored in the files, so
you can load them without any additional arguments to wsfontload(8).
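For example (the file name here is hypothetical; the embedded metadata
supplies the encoding and glyph size):

# wsfontload /usr/share/wsfonts/ter-116n.wsf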
This unbreaks miniroot.fs.
Thanks to Björn Johannesson for originally pointing this out and to
mrg@ for pointing out the actual issue and suggesting a fix.
The layout is configurable using sysctl machdep.adbkbdX.emulate_usb:
0 = no emulation
1 = ANSI
2 = ISO (new with this change)
3 = JIS (new with this change)
The default value is detected from the ADB keyboard handler ID. The
JIS default is disabled until it has been tested.
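For example, to switch the first keyboard to the ISO layout (the unit
number 0 is assumed here):

# sysctl -w machdep.adbkbd0.emulate_usb=2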
There is no unit test for this edge case, since all other unit tests
are platform-independent.
To reproduce:
$ make clean
$ make -s PROG=s-make NOMAN=yes USER_CFLAGS=-fsigned-char
$ make clean
$ make -s PROG=u-make NOMAN=yes USER_CFLAGS=-funsigned-char
$ make clean
$ range=$(lua -e 'print(("[%c-%c]"):format(0xe4, 0x61))')
$ ./s-make -V "\${:UM:M$range}\${:UN:N$range}"
M
$ ./u-make -V "\${:UM:M$range}\${:UN:N$range}"
N
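The difference comes down to how the compiler's char signedness affects
the range comparison; a minimal standalone sketch of the misbehaving
comparison (not make's actual code):

#include <stdio.h>

int
main(void)
{
	char lo = (char)0xe4, hi = (char)0x61, c = 'M';

	/*
	 * With -fsigned-char, lo is -28, so lo <= 'M' (77) <= 97 holds
	 * and the range matches.  With -funsigned-char, lo is 228, the
	 * range is empty, and nothing matches.
	 */
	printf("%s\n", lo <= c && c <= hi ? "match" : "no match");
	return 0;
}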
This code was intended to check whether the two 4-word halves of an
8-word, 32-byte, 256-bit sample were repeated.
Instead, it accidentally checked whether the first 4 _bytes_ of the
two halves were repeated.
The effect was a false alarm rate of 1/2^32, instead of a false alarm
rate of 1/2^128, with no change in the true alarm rate in the event
of a wedged RNG producing all-zero or all-one bits. 1/2^128 is an
acceptable false alarm rate; 1/2^32, not so much.
(The false alarm rate might be higher if the samples are not
perfectly uniformly distributed, which they most likely aren't,
although the documentation doesn't give any details other than
suggesting it's a ring oscillator under the hood, which provides
entropy from jitter induced by thermal noise. This driver records
half a bit of entropy per bit of sample to be reasonably
conservative.)
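A sketch of the mistake (illustrative, not the driver's exact code):
memcmp's size argument counted 4 bytes where 4 words were intended:

#include <stdint.h>
#include <string.h>

/* sample is 8 32-bit words, i.e. two 16-byte halves */
uint32_t sample[8];

/* buggy: compares only the first 4 bytes of each half */
int
repeated_buggy(void)
{
	return memcmp(sample, sample + 4, 4) == 0;
}

/* fixed: compares the full 4-word (16-byte) halves */
int
repeated_fixed(void)
{
	return memcmp(sample, sample + 4, 4 * sizeof(sample[0])) == 0;
}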
With SHA-256, NIST Hash_DRBG takes a preferred 440-bit/55-byte seed.
It's a weird number, and I'm not sure where it comes from (a quick
skim of SP800-90A doesn't turn anything up), but it's certainly
sufficient (a 256-bit/32-byte seed is almost certainly enough), so it's
not a problem to use something larger; Hash_DRBG can absorb seeds of
arbitrary lengths and larger seeds can't really hurt security (with
minor caveats like HMAC RO quirks that don't apply here).
Except -- owing to a typo, we actually used a 1760-bit/220-byte seed,
because I wrote `uint32_t seed[...]' instead of `uint8_t seed[...]'.
Again: not a problem to use a seed larger than needed. But let's
draw no more than we need out of the entropy pool!
Verified with CTASSERT(sizeof(seed) == 55). (Assertion omitted from
this commit because we might swap out Hash_DRBG for something else
with a different seed size like 32 bytes.)
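A minimal sketch of the typo's effect (SEEDLEN here is a stand-in for
the driver's actual seed-length constant):

#include <stdint.h>

#define SEEDLEN 55

uint32_t seed_typo[SEEDLEN];	/* the typo: draws 220 bytes */
uint8_t seed_fixed[SEEDLEN];	/* intended: draws 55 bytes */

_Static_assert(sizeof(seed_typo) == 220, "typo drew 220 bytes");
_Static_assert(sizeof(seed_fixed) == 55, "intended 55 bytes");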
The entropy pool algorithm is NOT designed to provide backtracking
resistance on its own -- it MUST be combined with a PRNG/DRBG that
provides that.
The only reason we use entropy_extract here is that cprng(9) is not
available yet (which in turn is because kmem and other basic kernel
facilities aren't available yet), but nist_hash_drbg doesn't have any
initialization order requirements, so we'll just use it directly.
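A sketch of the early-boot seeding path described above (the
nist_hash_drbg_instantiate signature is assumed from memory; treat it
as illustrative):

/* draw exactly the seed the DRBG needs from the entropy pool */
uint8_t seed[55];
entropy_extract(seed, sizeof(seed), 0);

/* instantiate the DRBG directly, bypassing the not-yet-ready cprng(9) */
struct nist_hash_drbg drbg;
nist_hash_drbg_instantiate(&drbg, seed, sizeof(seed), NULL, 0, NULL, 0);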
This allows expressions like '__alignof__(ptr)->member', just as with
'sizeof'.
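For example, the following now parses the same way the analogous
sizeof expression does (illustrative):

struct s {
	double member;
};

/*
 * Parses as __alignof__((ptr)->member), i.e. the alignment of
 * 'double', not the alignment of the pointer.
 */
unsigned long
alignment(struct s *ptr)
{
	return __alignof__(ptr)->member;
}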
The upper rule in the grammar was preferred over the lower rule since
it shifted the T_LPAREN instead of reducing unary_expression. Its
action invoked undefined behavior if the expression was NULL, since it
didn't assign anything to $$.
The change in lex.c 1.129 attempted to add support for __alignof, in
addition to the existing support for __alignof__. It failed: it
removed support for __alignof__ and instead allowed plain 'alignof'.