By reference to the MDN and to Duktape's CLI, expose everything
we possibly can on the global object (Window).
Signed-off-by: Daniel Silverstone <dsilvers@digital-scurf.org>
Add a polyfill for Array.from(), and fix the console formatter so that
it won't keep exploding. This should improve matters in the tests.
Signed-off-by: Daniel Silverstone <dsilvers@digital-scurf.org>
In order to cope when an entire document is `visibility: hidden`,
we default to the <HTML> node when interacting with the document,
to ensure we don't drop off the end of the box model without
identifying at least one node to fire events at.
This resolves #2658
Signed-off-by: Daniel Silverstone <dsilvers@digital-scurf.org>
We were using integer multiplication rather than fixed-point
multiplication when calculating point sizes relative to the viewport.
This fixes that.
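For illustration, here is a minimal sketch of the difference, assuming
a 16.16 fixed-point representation like libcss's css_fixed (the names
are illustrative, not the actual NetSurf code):

    #include <stdint.h>

    typedef int32_t fixed; /* assumed 16.16 fixed-point value */

    /* Wrong: treats the two 16.16 values as plain integers, so the
     * result carries a spurious extra factor of 2^16. */
    static fixed scale_wrong(fixed size, fixed factor)
    {
        return size * factor;
    }

    /* Right: multiply in a wider type, then shift the sixteen extra
     * fraction bits back out. */
    static fixed scale_fixed(fixed size, fixed factor)
    {
        return (fixed)(((int64_t)size * factor) >> 16);
    }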
Signed-off-by: Daniel Silverstone <dsilvers@digital-scurf.org>
In order to help us debug shutting down with active fetches, this
will abort the process cleanly if we get a callback to an "active"
llcache handle after the abort process has actually killed them
all. This can happen with deferred fetcher aborts in the cURL
fetcher.
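Roughly, the guard looks like the following sketch (the names and the
shape of the check are assumptions for illustration, not the actual
llcache code):

    #include <stdbool.h>
    #include <stdlib.h>

    static bool llcache_finalised; /* set once shutdown has aborted all handles */

    static void guard_stray_callback(bool handle_still_active)
    {
        if (llcache_finalised && handle_still_active) {
            /* A fetch callback arrived for a handle that should already
             * have been killed; fail loudly so the stray callback is
             * easy to catch in a debugger. */
            abort();
        }
    }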
Signed-off-by: Daniel Silverstone <dsilvers@digital-scurf.org>
During the process of finalising the hlcache, there won't be
any more fetching going on. As such, we can abort, error, and
then destroy any contents still in the process of loading. This
should reduce our leaks during shutdown.
Signed-off-by: Daniel Silverstone <dsilvers@digital-scurf.org>
Sometimes callbacks may be cancelled from within themselves. In
that case, if the callback was set to repeat, we must clear that
flag so that once the callback has completed we do not attempt to
reschedule something which has already been deleted.
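A minimal sketch of the idea (the structure and function names are
assumed, not NetSurf's actual scheduler):

    #include <stdbool.h>

    struct callback_entry {
        void (*fn)(void *ctx);
        void *ctx;
        bool repeat;    /* reschedule once the callback returns */
        bool cancelled; /* set if cancelled, possibly from inside fn */
    };

    /* Cancelling from inside the callback only marks the entry; the
     * runner below decides what happens after the callback returns. */
    static void callback_cancel(struct callback_entry *cb)
    {
        cb->cancelled = true;
        cb->repeat = false;
    }

    static void callback_run(struct callback_entry *cb)
    {
        cb->fn(cb->ctx); /* may call callback_cancel(cb) on itself */

        if (cb->repeat && !cb->cancelled) {
            /* safe to put the entry back on the schedule */
        } else {
            /* the entry is finished and can be freed here */
        }
    }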
Signed-off-by: Daniel Silverstone <dsilvers@digital-scurf.org>
In order to cope better with modern cURL, which prevents making
cURL calls from inside one of its own callbacks, defer the fetch
start when we are processing in `fetch_curl_data()`.
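Schematically (all names here are assumptions for illustration, not
the actual fetcher code):

    #include <stdbool.h>
    #include <stddef.h>

    struct pending_fetch {
        struct pending_fetch *next;
        /* ... handle details ... */
    };

    static bool inside_curl_callback;      /* set around fetch_curl_data() */
    static struct pending_fetch *deferred; /* fetches waiting to start */

    static void fetch_start(struct pending_fetch *f)
    {
        if (inside_curl_callback) {
            /* cURL forbids driving the multi handle from inside one of
             * its own callbacks, so park the fetch for later. */
            f->next = deferred;
            deferred = f;
            return;
        }
        /* ... add the easy handle to the multi handle here ... */
    }

    static void fetch_poll(void)
    {
        /* Back in the normal poll loop it is safe to start anything
         * that was deferred while inside the callback. */
        while (deferred != NULL) {
            struct pending_fetch *f = deferred;
            deferred = f->next;
            fetch_start(f);
        }
        /* ... curl_multi_perform() and friends ... */
    }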
Signed-off-by: Daniel Silverstone <dsilvers@digital-scurf.org>
The pushed fragment node holds the reference, so unref it at
the end of createDocumentFragment().
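The pattern is roughly the following sketch; the binding signature and
helper names are assumptions for illustration (dom_node_unref being
libdom's usual reference release):

    /* Sketch only; headers and error reporting omitted. */
    static duk_ret_t create_document_fragment(duk_context *ctx, dom_document *doc)
    {
        dom_document_fragment *frag = NULL;
        dom_exception exc;

        exc = dom_document_create_document_fragment(doc, &frag);
        if (exc != DOM_NO_ERR)
            return 0; /* nothing pushed */

        dukky_push_node(ctx, (dom_node *)frag); /* pushed value takes its own ref */
        dom_node_unref(frag);                   /* so drop the creation ref here */
        return 1; /* the fragment is left on the duktape stack */
    }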
Signed-off-by: Daniel Silverstone <dsilvers@digital-scurf.org>
Since executing a script can cause more scripts to be appended
to the script array, and that can cause a reallocation which might
move the script array, reacquire the script pointer after running
the script so that we don't wander off into the reeds.
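A minimal sketch of the pattern (the structure and field names are
assumed for illustration, not the actual html content handler):

    #include <stdbool.h>
    #include <stddef.h>

    struct script {
        bool already_started;
        /* ... */
    };

    struct html_content {
        struct script *scripts; /* grown with realloc() */
        size_t n_scripts;
    };

    /* Runs scripts[i]; may append new scripts, realloc()ing the array. */
    static void run_script(struct html_content *c, size_t i);

    static void run_pending_scripts(struct html_content *c)
    {
        for (size_t i = 0; i != c->n_scripts; i++) {
            run_script(c, i);

            /* The array may have moved, so only take the element
             * pointer after the script has run. */
            struct script *s = &c->scripts[i];
            s->already_started = true;
        }
    }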
Signed-off-by: Daniel Silverstone <dsilvers@digital-scurf.org>
Stop the curl fetcher logging being a special case sent to standard
error and use the fetch category at DEBUG level instead.
The special suppress_curl_debug option is currently still obeyed.
Until we can determine *how* the compartment isn't cleaning
up properly in the duktape context, this will at least mean
we don't get unpleasant callback-related issues when compartments
are reset during browsing.
Signed-off-by: Daniel Silverstone <dsilvers@digital-scurf.org>
On Ubuntu 19.04, curl is built with HTTP/2 support, and we
segfault.
==18174== Invalid read of size 1
==18174== at 0x4ACCE7D: ??? (in /usr/lib/x86_64-linux-gnu/libcurl.so.4.5.0)
==18174== by 0x4B054B1: ??? (in /usr/lib/x86_64-linux-gnu/libcurl.so.4.5.0)
==18174== by 0x4AD398A: ??? (in /usr/lib/x86_64-linux-gnu/libcurl.so.4.5.0)
==18174== by 0x4AD7A0B: ??? (in /usr/lib/x86_64-linux-gnu/libcurl.so.4.5.0)
==18174== by 0x4AE93EE: ??? (in /usr/lib/x86_64-linux-gnu/libcurl.so.4.5.0)
==18174== by 0x4AEA8A8: curl_multi_perform (in /usr/lib/x86_64-linux-gnu/libcurl.so.4.5.0)
==18174== by 0x1F2EF7: fetch_curl_poll (curl.c:1209)
==18174== by 0x1EEC5C: fetcher_poll (fetch.c:271)
==18174== by 0x2A1ED4: schedule_run (schedule.c:160)
==18174== by 0x15F941: framebuffer_run (gui.c:596)
==18174== by 0x15F941: main (gui.c:2206)
==18174== Address 0x9de95a8 is 3,224 bytes inside a block of size 6,304 free'd
==18174== at 0x483997B: free (in /usr/lib/x86_64-linux-gnu/valgrind/vgpreload_memcheck-amd64-linux.so)
==18174== by 0x4AD497B: ??? (in /usr/lib/x86_64-linux-gnu/libcurl.so.4.5.0)
==18174== by 0x4AE158C: curl_easy_cleanup (in /usr/lib/x86_64-linux-gnu/libcurl.so.4.5.0)
==18174== by 0x1F30DE: fetch_curl_cache_handle (curl.c:761)
==18174== by 0x1F30DE: fetch_curl_stop (curl.c:840)
==18174== by 0x1F30DE: fetch_curl_done (curl.c:1122)
==18174== by 0x1F30DE: fetch_curl_poll (curl.c:1223)
==18174== by 0x1EEC5C: fetcher_poll (fetch.c:271)
==18174== by 0x2A1ED4: schedule_run (schedule.c:160)
==18174== by 0x15F941: framebuffer_run (gui.c:596)
==18174== by 0x15F941: main (gui.c:2206)
==18174== Block was alloc'd at
==18174== at 0x483AB35: calloc (in /usr/lib/x86_64-linux-gnu/valgrind/vgpreload_memcheck-amd64-linux.so)
==18174== by 0x4AE165F: curl_easy_duphandle (in /usr/lib/x86_64-linux-gnu/libcurl.so.4.5.0)
==18174== by 0x1F15EB: fetch_curl_get_handle (curl.c:738)
==18174== by 0x1F15EB: fetch_curl_start (curl.c:750)
==18174== by 0x1EEB22: fetch_dispatch_job (fetch.c:156)
==18174== by 0x1EEB22: fetch_choose_and_dispatch (fetch.c:187)
==18174== by 0x1EEB22: fetch_dispatch_jobs (fetch.c:247)
==18174== by 0x1EF1BB: fetch_start (fetch.c:573)
==18174== by 0x26C779: llcache_object_refetch (llcache.c:916)
==18174== by 0x26D5E4: llcache_object_fetch (llcache.c:979)
==18174== by 0x26D5E4: llcache_object_retrieve_from_cache (llcache.c:1767)
==18174== by 0x26D5E4: llcache_object_retrieve (llcache.c:1865)
==18174== by 0x26E42C: llcache_fetch_redirect (llcache.c:2110)
==18174== by 0x26E42C: llcache_fetch_callback (llcache.c:2810)
==18174== by 0x1F1295: fetch_curl_process_headers (curl.c:922)
==18174== by 0x1F13A0: fetch_curl_data (curl.c:1324)
==18174== by 0x4ACD4C3: ??? (in /usr/lib/x86_64-linux-gnu/libcurl.so.4.5.0)
==18174== by 0x4AE00DA: ??? (in /usr/lib/x86_64-linux-gnu/libcurl.so.4.5.0)
In order to support the console logging formatting specification,
as per https://console.spec.whatwg.org/#logger, we need to implement
the Formatter(...) algorithm, which is more easily done within
JavaScript.
Signed-off-by: Daniel Silverstone <dsilvers@digital-scurf.org>
Instead of specifically having to extract each generic by name,
such as makeListProxy, support the entire generics table and use
`dukky_push_generics()` to gain access to it.
Signed-off-by: Daniel Silverstone <dsilvers@digital-scurf.org>
Migrate the console enums into netsurf/console.h and add
support so that contents can raise a message to log to
the console.
Signed-off-by: Daniel Silverstone <dsilvers@digital-scurf.org>
When appending stylesheets to the selection context, we now pass
the media query string associated with the sheet, rather than
the type bitfield.
TODO:
We need to pass all the sheets in, with their full media
query strings, rather than filtering them ourselves and setting
the ones we pass in to "screen".
Signed-off-by: Michael Drake <michael.drake@codethink.co.uk>
LibCSS now takes the client media spec, rather than just the
media type we're selecting for.
Signed-off-by: Michael Drake <michael.drake@codethink.co.uk>
In 8332bf6b2a, when the stroke width was moved from a parameter to
the plot style field, it accidentally used the `stroke` field of
the svgtiny shape (the color) instead of `stroke_width`.
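Schematically, the mix-up was of this shape (the variable names are
illustrative only):

    /* before: the shape's stroke colour was used as the width */
    pstyle.stroke_width = shape->stroke;
    /* after: */
    pstyle.stroke_width = shape->stroke_width;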
Signed-off-by: Michael Drake <michael.drake@codethink.co.uk>
Update the logging levels, change to NSLOG across the board,
and ensure that we use the `dukky` category now added for us.
Signed-off-by: Daniel Silverstone <dsilvers@digital-scurf.org>
The HTML content handler was always aware of the viewport width,
but since adding support for vh CSS units it also needs the
viewport height. Previously we correctly used the viewport width,
but we were using the document height instead of the viewport
height when an HTML child content triggered a reformat of the
parent HTML document.
Signed-off-by: Michael Drake <michael.drake@codethink.co.uk>
Trying to reason about error propagation and resource leakage within
the form submission code was impossible because of the
form_successful_controls_dom function.
This function was over six hundred lines long, had twenty-six
top-level local variables, and had six levels of indentation in
places.
This commit splits it out into thirteen shorter and more obvious
functions. The resulting operation is identical, except that errors
are now properly propagated (previously all failures were reported
as out of memory) and resource management can be reasoned about.
The compiler appears to inline the entirety of the code from
form_submit() down, excepting a handful of leaf functions. This
results in a similar output code size to the previous
implementation.
The new implementation passes a greater number of variables to
sub-functions than is desirable, because multiple character sets
are required to encode names and values in the multipart data
list. However, as noted, the compiler effectively inlines all of
these functions, so this does not become a major problem.
This reduces the log file size for startup and a single visit
to https://www.bbc.co.uk/news from 266133 to 178777 bytes,
by not dumping big data URLs over and over into the log.