da836a732c
* Android: Better track touch input returned from IsMouse*()

  Switch to actually tracking touch input to use for "mouse" input rather than the gestures system. The gesture system, as an abstraction on top of raw touch input, loses some information needed to map to "mouse" input.

  Before:
  - IsMouseButtonReleased() triggers immediately after the initial touch (because GESTURE_TAP activates immediately on touch) instead of waiting for the touch to be released.
  - IsMouseButtonUp() returns false, when it should simply be the opposite of IsMouseButtonDown().
  - IsMouseButtonDown() returns true only after GESTURE_HOLD (which activates some period of time after GESTURE_TAP), when instead it should be true whenever there is touch input, i.e. gesture != GESTURE_NONE, or alternatively whenever any input is received on the screen.

  After this PR, touches map closer to mouse input (see the sketch after this list):
  - IsMouseButtonReleased() triggers when the touch is released (last frame was touched, this frame not touched).
  - IsMouseButtonUp() returns the opposite of IsMouseButtonDown().
  - IsMouseButtonDown() is true on (AMOTION_EVENT_ACTION_DOWN || AMOTION_EVENT_ACTION_MOVE) and false on (AMOTION_EVENT_ACTION_UP).

* RPI: Include index check for RPI in GetTouchPosition()
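A minimal sketch of the touch-to-mouse mapping described above, not the actual core.c change: the state variables (touchDown, previousTouchDown) and helpers (PollTouchInput(), ProcessTouchEvent()) are hypothetical names used only to illustrate deriving the IsMouseButton*() answers from tracked touch state.

```c
// Sketch only: derive mouse-button semantics from raw Android touch events.
#include <stdbool.h>
#include <android/input.h>      // AMOTION_EVENT_ACTION_* constants (Android NDK)

static bool touchDown = false;          // touch state for the current frame
static bool previousTouchDown = false;  // touch state for the previous frame

// Call once per frame, before handling new input events, so the
// previous-frame state is available for Pressed/Released checks
static void PollTouchInput(void)
{
    previousTouchDown = touchDown;
}

// Call from the Android motion-event handler with the event's action
static void ProcessTouchEvent(int32_t action)
{
    if ((action == AMOTION_EVENT_ACTION_DOWN) || (action == AMOTION_EVENT_ACTION_MOVE)) touchDown = true;
    else if (action == AMOTION_EVENT_ACTION_UP) touchDown = false;
}

// Mouse-button queries mapped onto the tracked touch state
// (raylib's real API takes a button index; here any touch maps to the same state)
bool IsMouseButtonDown(int button)     { return touchDown; }
bool IsMouseButtonUp(int button)       { return !touchDown; }
bool IsMouseButtonPressed(int button)  { return touchDown && !previousTouchDown; }
bool IsMouseButtonReleased(int button) { return !touchDown && previousTouchDown; }
```

With this scheme IsMouseButtonUp() is always the negation of IsMouseButtonDown(), and Pressed/Released are simple edge detections between the previous and current frame, which matches the behavior listed above.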
external
camera.h
CMakeLists.txt
CMakeOptions.txt
config.h
config.h.in
core.c
easings.h
gestures.h
Makefile
models.c
physac.h
raudio.c
raudio.h
raylib.dll.rc
raylib.dll.rc.data
raylib.h
raylib.ico
raylib.rc
raylib.rc.data
raymath.h
rglfw.c
rlgl.h
rmem.h
rnet.h
shapes.c
shell.html
text.c
textures.c
utils.c
utils.h