* This gives us plenty of space for source packages.
* A Mini-DVD is 1.4 GiB, and 2 GiB USB sticks are at
the sweet spot of low price vs. size.
* Unused space will be compressed in the release zip.
* We blew by 700 MiB long ago. Sorry, CD-R folks.
Change-Id: I3bbe4508777027f6fe7c0ee2992637541feeb88f
When HEAD is tagged, the output is identical to what it was before
(the latest hrev tag and nothing else). When HEAD is not tagged and
the most recent hrev tag is some commits back, we now use a format
like this:
hrevXXXXX+N(+dirty)
... where N is the number of commits since hrevXXXXX, and +dirty is
appended if the working tree is dirty. This is significantly shorter
than the previous format, as it no longer includes the Git revision.
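A minimal sketch of the described logic, as C++ for illustration (this
is not the actual build script; the function and parameter names are
made up, and the tag name, commit count, and dirty flag are assumed to
have been gathered from git already):

    #include <sstream>
    #include <string>

    std::string
    MakeRevisionString(const std::string& latestHrevTag,
        int commitsSinceTag, bool workingTreeDirty)
    {
        // HEAD is tagged: the latest hrev tag and nothing else.
        if (commitsSinceTag == 0)
            return latestHrevTag;

        // Otherwise: hrevXXXXX+N, plus "+dirty" for a dirty tree.
        std::ostringstream result;
        result << latestHrevTag << "+" << commitsSinceTag;
        if (workingTreeDirty)
            result << "+dirty";
        return result.str();
    }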
Fixes #14445.
Change-Id: Ide7f66cf0ac1c1f05402afc52b6be3b68b66d6dc
Reviewed-on: https://review.haiku-os.org/566
Reviewed-by: waddlesplash <waddlesplash@gmail.com>
* PRE_BETA_2 is now the default in master.
* For libbe: R1/alpha4 used internal=8, but nobody bumped master
at the same time, so now we are on internal=9.
It is just straight-up broken on most systems without this, as the
same "cast specifies signature type" error appears there too.
Exactly why this fixes the problem is no better understood now than
it was when the workaround was first put in place in 2012.
The latter is not just a symlink to the former, but is a small pseudo-
library that tells the linker to use the .so.1 version instead. As we
do not pass a -L flag for this directory in the linker invocation,
the linker thus cannot find it and so errors out.
We rightly do not want the linker doing "magic" things for us that
we don't expect, and so even if this one case is fine, we shouldn't
allow the linker to take care of this automatically for us when
it comes to libroot and other core system functionality, especially
as going forward we may indeed add a second libgcc version due to ABI
breaks. Instead, link against .so.1 directly.
Fixes the build breakage caused by the GCC 7 bump.
We now build libicns against it. It seems that it is better-maintained than
JasPer, so we should probably consider switching the JPEG2000Translator to
use it also.
ffmpeg_devel pulls in some other devel packages we don't really need,
and very few things are built against it anyway, so whoever needs it
can install it manually. Same goes for freetype and fontconfig.
* Store pointers in an addr_t instead of an int32, for 64-bit's sake (see the sketch below)
* Use DebugSupport.h instead of userlandfs Debug.h and remove extra parentheses
* Create a header-only String class based on the userlandfs String and use it
* Use RecursiveLock instead of Locker.
* Jamfile cleanups and other misc. changes.
It isn't yet adapted to the new VFS API, so the build is still somewhat
broken.
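A minimal sketch of the 64-bit issue behind the first bullet, for
illustration only (the struct and field names are made up; addr_t and
int32 come from SupportDefs.h):

    #include <SupportDefs.h>  // addr_t, int32

    struct NodeCookie {
        // An int32 cannot hold a 64-bit pointer; the casts below would
        // truncate the upper 32 bits on x86_64.
        //int32 privateData;

        // addr_t is always pointer-sized, so the round trip is lossless.
        addr_t privateData;
    };

    static void
    SetCookie(NodeCookie& cookie, void* data)
    {
        cookie.privateData = (addr_t)data;
    }

    static void*
    GetCookie(const NodeCookie& cookie)
    {
        return (void*)cookie.privateData;
    }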
This contains Haiku's sources, which need to be included in
"with source" builds for proper (L)GPL compliance, mostly because
we have GPL code in the tree.
"cpp" is the system C preprocessor, not the one from our cross-compiler,
and in the case of my system which does not have GCC installed at all,
it doesn't even exist.
With this, Clang-ARM builds successfully create a "haiku-arm.mmc".
I couldn't get it to produce any output, even after blessing it with
"rune", but that may just be my fault...
Now that HOST_CC is actually passed in, we need to default
everything to it; otherwise, it's up to the Jambase as to
what CC we are actually using.
Found while trying to build Haiku on a system that has no "cc"
executable; Jam tried to use it anyway (as all three of CC, C++,
and LINK).
The former is passed to the compiler when it is used for linking;
the latter is passed to ld when it is invoked directly.
Also modify ArchitectureRules to not overwrite this setting.
This rule processes all of the target's source files at once, and so
whoever wrote this rule in the first place (PulkoMandy?) probably
assumed, without even testing it, that "cc -E" would create multiple
outputs for multiple inputs.
It doesn't, though: it just outputs them one after another, in
command-line order, the same way it does when the files are piped in
through "cat". Passing the files to the preprocessor directly also has
other advantages: preprocess errors caused by the compiler assuming
the code was C rather than C++ (and so not defining __cplusplus) are
gone, local includes are now resolved properly, etc.
Doing it this way does expose other problems, like the one fixed in
the previous commit (headers with no context defined, which worked
previously only because they used the context of the preceding
`cat`'ed file).
We now also remove the .pre file after collecting the catkeys.
Otherwise, Clang warns on ARM that we haven't set an architecture,
which is set via CCFLAGS. Since CCFLAGS might also contain other
flags that affect the preprocessor, there isn't any good reason not
to pass them through, so do that.
It seems that, on some platforms at least, Clang uses @define instead
of #define, but with functionally identical syntax, so use sed to
process it as such.
We lost these tunings when I moved us away from board-focused builds.
I feel like most of our ARM interest is around ARMv7+.
Change-Id: Ie301d275a74d48ee3d0c4c7dc7d6cdd635288a7b
This requires a trunk build of Clang (the flag was only implemented
& introduced 12 days ago), but at present, full builds will fail
due to an unrelated Clang bug: https://bugs.llvm.org/show_bug.cgi?id=38356
* __NO_INLINE__ fixes the cross-build on some glibc-based systems with
newer compilers, as it prevents glibc from declaring inline versions
of functions that we override in libroot_build (see the sketch below).
* We can now enable tree-vrp as long as no-delete-null-pointer-checks
goes where it used to.
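A minimal sketch of the glibc interaction behind the first bullet
(simplified and hypothetical; this is not actual glibc header code,
and some_overridden_call is a made-up name):

    // With optimization enabled, glibc headers provide extern inline
    // bodies for some functions unless __NO_INLINE__ is defined.
    #if defined(__OPTIMIZE__) && !defined(__NO_INLINE__)
    extern __inline int
    some_overridden_call(int fd)
    {
        return fd;  // inline body from the host's headers
    }
    #else
    extern int some_overridden_call(int fd);
    #endif
    // libroot_build supplies its own definition of some_overridden_call;
    // without -D__NO_INLINE__, the inline body above either clashes with
    // that definition or is inlined at call sites, bypassing the override.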