STACKSTRNUL() bug to eventually be found. None of the test cases found it
directly - the shell running the tests (the same /bin/sh) managed to
build one of the f_variable_syntax sub-tests incorrectly, and that was
enough to eventually allow the bug to be located and squashed.
Like all good promotions, this one comes with increased work, and no extra pay.
We currently parse var expansions like "${x-"a b c"}" incorrectly
according to POSIX (and ancient tradition) though in a way that is
generally unexpected (there are 2 quoted strings there, according
to the standard, "${x-" and "}" and 'a b c' is unquoted.) This has
never been fixed in our sh, as like in many other shells, it is just
a little unbelievable, most people expect that inside the {} is a
whole new ballpark, and everything starts again, and it does in cases
like "${x%"a b c"}" (the newer matching operators.) There have been
comments in this test explaining how the test is actually testing for
incorrect behaviour for a while now.
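To illustrate the difference, here is a made-up snippet (not one of the test
cases), assuming x is unset:

	unset x
	set -- "${x-"a b c"}"
	echo "$#"
	# parsed the way our sh (and bash) does it, the inner quotes nest: 1 field
	# parsed the way the standard (as then written) says, 'a b c' is unquoted,
	# so field splitting would presumably give 3 fields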
But I have since learned of POSIX bug #221 (2010), the resolution of which is
to make the problem case unspecified. This will not actually become part
of POSIX until the next major version (POSIX 8) whenever that appears (still
years away I expect) - but at least we now can feel happier about the
way our code works and not worry quite so much about testing that incorrect
code keeps working incorrectly....
to 1000 to make the ATF test suite as a whole take less time. Before
the change, this single test case could take more than two hours to
run on a qemu emulated ARM.
It was created to copy FreeBSD, but the cache isn't actually
necessary. Remove it to simplify the code and reduce the cost of
maintaining it (e.g., keeping it consistent with the corresponding local
route).
produces output with a very large number of consecutive embedded \n
characters.
This test is currently expected to fail (as of commit date) but is not
marked as atf_expect_fail as the shell should be fixed to avoid the
problem quite soon. Until then almost anything might happen to the
sh that runs this test (from just plain wrong results, to core dumps,
even possibly the right results, though that's unlikely).
While doing this, get rid of the duplicate implementation of the
nested_cmdsubs_in_heredoc test (which was achieving nothing). I know
how I managed to do that, but on advice of counsel, I ... (it was
just a harmless waste of a tiny amount of CPU time "compiling" the test,
just like writing "x=0;" on consecutive lines....)
These should detect if the errors that caused MAKEDEV to fail and
pkgsrc/pkgtools/cwrappers to fail to build (problem detected in
libnbcompat/configure) ever return.
Also fixed one of the other sub-tests so that it actually does what
it should - no idea how this one has been passing, it did not for me
when I was checking the new ones (but perhaps in the interim I have
fixed something else in sh, the problem was in the area I have been
playing, and I originally debugged the new tests using a newer version
of /bin/sh than has yet been committed.)
by IFS just the same as any other expansion (split when they should be,
and not when they shouldn't).
Good thing I did ... this discovered a regression in the new expand
code (from an hour or three ago) where quoted arith expansions are
being split, and shouldn't be. (on the other hand, when they should
be split, they are being split correctly, so that's good...)
No need to worry too much about this, in real life (as opposed to
torture tests) no-one ever puts digits in IFS, and a $(( )) expansion
only ever contains digits, so in practice, splitting never happens,
whether because it is quoted, or just because there is nothing to split.
The FreeBSD shell does it correctly (they do word splitting using a
totally different mechanism than we do) - I will find where I missed
testing for quoted expansions later tonight. Until that happens,
expect a test failure from t_fsplit:split_arith
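A tiny illustration of the intended behaviour (not one of the committed
tests), using a digit in IFS so there is something to split on:

	IFS=5
	set -- $((123 + 333))     # unquoted: the result 456 is field split at the 5
	echo "$#"                 # 2
	set -- "$((123 + 333))"   # quoted: no field splitting happens
	echo "$#"                 # 1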
While here add more comments in the other test cases (one had a comment
already) that our shell parses incorrectly (this is a parsing problem,
not an expansion problem, and the bug has existed forever .. and is
shared by bash).
That is, in an expression like "${var:-word}" the whole thing (including
word) is quoted by the double quotes, and in "${var:-"word"}" the
expansion is quoted, but 'word' is not, the 2nd " (the one before 'w')
ends the opening quote, and the third (before }) turns quoting on
again (it is required that there be an even number of unquoted quotes inside
a ${} expression, so things like "${var:-"word} are simply invalid.)
Another way of saying this is that if the { is quoted (for which the
only way is using " to get to this kind of expansion at all) the }
must also be quoted (" quoted). There is no quote nesting here.
This also means that in "${var:-'word'}" the single quotes are just
characters, they are quoted by the "" that surround them, just as in
"a 'string' containing quotes", but "${var:-"'word'"}} is valid and
contains a single quoted word.
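Summarising the above with a few made-up fragments:

	"${var:-word}"       # everything, including word, is inside the outer quotes
	"${var:-"word"}"     # two quoted parts, "${var:-" and "}"; word itself is unquoted
	"${var:-"word}       # invalid: an odd number of unquoted " characters
	"${var:-'word'}"     # the single quotes are just quoted characters
	"${var:-"'word'"}"   # valid: contains a single (single-quoted) word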
Note that this is different than the pattern-match/trim expansions
( ${var%pattern} etc) where any surrounding quotes do not affect the
pattern, and all forms of quoting can be used in it - but it always
parses as a pattern, and is never subject to filename expansion, so
a '*' in it that is to mean "match any string" does not need special
quoting to avoid it expanding to file names, regardless of whether
the ${var%*} is quoted or not, but a * that is to be matched as a
character literally does need to be quoted.
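For example (a hypothetical fragment, not taken from the tests):

	var='a*b.c'
	echo "${var%.*}"       # the unquoted * is a pattern wildcard: prints a*b
	echo "${var%'*'b.c}"   # the quoted * matches a literal '*': prints a

In neither case is the * subject to filename expansion, even though the
whole expansion is double quoted.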
I will probably file a PR about this bug sometime, but as it is
very old, and doesn't ever seem to bother anyone (and is shared by
bash) there isn't any big hurry to open yet another very messy can
of excrement to fix it...
This does mean that the FreeBSD shell "fails" some of our tests.
(The test is wrong, their shell is correct.)
as to why ${011} is ${11} and not ${9} (that is, why we interpret it
that way, the "why could it not be the other way?" is just "because
that is not how it was ever implemented".)
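That is, with that interpretation (a contrived demonstration):

	set -- one two three four five six seven eight nine ten eleven
	echo "${011}"      # treated as ${11}: prints eleven, not nine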
peace of mind (to verify I was not breaking anything here.)
Also added some commentary better explaining why one of the tests of splitting
quoted variable expansions is almost certainly incorrect (both the tests,
and what sh does) ... though bash does the same as us, so all is not lost!
whose meaning is defined entirely by context.
For those who read commit messages, and want a (small) challenge,
work out where (and what) to insert as punctuation/operator chars
in the following to produce 3 lines of output, and what those will be:
for in in in do in do case in in in echo do do echo in esac done
(There are no comments, quotes of any kind, or any kind of sub-shell,
including cmd substitutions.) With the correct non-alphanumeric chars added,
it works.
digittoint(3).
The digittoint(3) test is skipped since we don't provide that function yet.
One of the test cases for btowc(3) is also skipped, since it tests conversion
to Unicode---whereas our wchar_t representation is locale-dependent.
but with \ newline line continuations inserted at strange places.
For the shell as it is today, since strip passes, wrap_strip should
automatically pass as well (and does), as the \ newline combination
is simply removed, producing identical input to that of strip.
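For instance, an invented fragment (not taken from the test data) such as:

	ec\
	ho hel\
	lo world

tokenises exactly the same as "echo hello world" once the \ newline pairs
are removed.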
However, for accurate line counting ($LINENO: coming soon) newlines (no
matter the context) cannot simply "go away" - we have to know where they
occur(ed) (perhaps long after the text was read) so we know what line
number we are actually processing. This new test case is (perhaps just
part of) the future-proofing, checking that the modified code does not break anything.
CAN stands for Controller Area Network, a broadcast network used
in automation and automotive fields. For example, the NMEA2000 standard
developed for marine devices uses a CAN network as the link layer.
This is an implementation of the Linux socketcan API:
https://www.kernel.org/doc/Documentation/networking/can.txt
See also can(4).
This adds a new socket family (AF_CAN) and protocol (PF_CAN),
as well as the canconfig(8) utility, used to set the timing parameters of
CAN hardware. Also included is a driver for the CAN controller
found in the Allwinner A20 SoC (I tested it with an Olimex lime2 board,
connected with PIC18-based CAN devices).
There is also the canloop(4) pseudo-device, which allows the socketcan
API to be used without CAN hardware.
At this time the CANFD part of the Linux socketcan API is not implemented.
Error frames are not implemented either. But I could get the cansend and
canreceive utilities from the canutils package to build and run with minimal
changes. tcpdump(8) can also be used to record frames, which can be
decoded with Ethereal.