matt | 879bc2893e | Use __section(".test.startup") for the init routines | 2013-08-21 03:00:56 +00:00
matt | afb7a3b050 | write of final NUL in strlcpy doesn't need to be post-incremented | 2013-08-20 21:37:39 +00:00
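The afb7a3b050 entry is about the terminating NUL in strlcpy: it is the last byte the function ever stores, so the destination pointer need not be advanced afterwards. A reference C sketch of strlcpy(3) semantics makes this visible (this is not NetBSD's ARM assembly; the name `my_strlcpy` is mine to avoid shadowing the libc symbol):

```c
#include <stddef.h>

/* C sketch of strlcpy(3): copy up to dsize-1 bytes, always NUL-terminate
 * when dsize > 0, and return strlen(src).  The final NUL store is the
 * last write through dst, so it needs no post-increment. */
static size_t my_strlcpy(char *dst, const char *src, size_t dsize)
{
    const char *s = src;

    if (dsize != 0) {
        while (--dsize != 0) {
            if ((*dst++ = *s++) == '\0')
                return (size_t)(s - src - 1);
        }
        *dst = '\0';    /* plain *dst, not *dst++: nothing follows it */
    }
    while (*s++ != '\0')    /* keep scanning src to compute its length */
        continue;
    return (size_t)(s - src - 1);
}
```

In the assembly version the same observation lets a `strb rX, [r0], #1` become a plain `strb rX, [r0]`, which is what the commit message records.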
matt | 61e4023f77 | If compiling standalone with Thumb, use the thumb version instead of the naive version. | 2013-08-20 21:35:24 +00:00
matt | bb45e513c6 | Thumb versions of strcpy/strlcpy/strncpy | 2013-08-20 21:32:50 +00:00
matt | e61510c225 | strlcat_arm.S is smaller than strlcat_naive.S so always use it. | 2013-08-20 21:08:54 +00:00
matt | 1a52a42b91 | Add a missing "it gt" before movgt for thumb | 2013-08-20 16:34:47 +00:00
matt | 59eed4d06e | Use the arm version of strnlen if compiling thumb2 | 2013-08-20 08:08:59 +00:00
matt | 04d357d985 | Use the arm versions of strlen/strchr/strrchr if compiling thumb2 | 2013-08-20 08:07:30 +00:00
matt | 7477ad6ac7 | thumbify (part 2) | 2013-08-20 08:06:30 +00:00
matt | b628072a9d | Swap r1 & ip; use adds, eors, etc.; teq -> cmp | 2013-08-20 08:05:49 +00:00
matt | 7eb7e9aa67 | Push two registers to keep the stack aligned. | 2013-08-20 07:52:31 +00:00
matt | 3444a5b35b | Unless we are using an XSCALE, default to the normal arm version of memcpy. | 2013-08-20 07:25:52 +00:00
matt | 4ca1b3cb84 | Add two thumb2 bits. | 2013-08-19 17:50:04 +00:00
matt | c2b4a072b1 | Missing one teq -> cmp | 2013-08-19 17:41:47 +00:00
matt | 5df59dfb3f | Swap use of r1 and ip; teq -> cmp; add s to a few instructions (thumbify part 1) | 2013-08-19 17:38:47 +00:00
matt | ddea6e9586 | cbnz/cbz cannot branch backwards, so nuke 'em; use the same register usage in strlen as in strnlen | 2013-08-19 17:02:25 +00:00
matt | 3d4b320a67 | Add END() | 2013-08-19 06:23:59 +00:00
matt | 5958fbf1be | fix cfi_register -> cfi_offset | 2013-08-19 06:11:20 +00:00
matt | 1045cebc5a | Rework to allow thumb armv7 compilation; add atomic_simplelock.c for thumb | 2013-08-19 03:55:12 +00:00
matt | 578c0a5078 | Thumbify (and use .cfi ops) | 2013-08-19 03:54:15 +00:00
matt | 556b3d5814 | Thumbify; add .cfi ops (for thumb) | 2013-08-19 03:51:04 +00:00
matt | 3302b194a2 | This is ARM only | 2013-08-19 03:47:06 +00:00
matt | 14ba29f9a7 | Add thumb version; use STRONG_ALIAS | 2013-08-19 03:44:47 +00:00
matt | a546978def | Use STRONG_ALIAS; add thumb variation | 2013-08-19 03:44:18 +00:00
matt | d2610c9da2 | Add .cfi ops; Thumbify | 2013-08-19 03:43:07 +00:00
matt | eeddcf15a5 | Add cfi ops; Thumbify | 2013-08-19 03:27:34 +00:00
matt | 4f50540e20 | Add END() and clarify thumb/arm | 2013-08-19 02:55:19 +00:00
matt | d0a4289725 | Thumbify | 2013-08-19 02:54:02 +00:00
matt | 70e4c7e047 | Add END() | 2013-08-19 02:37:12 +00:00
matt | de1d51c4ce | Thumbify | 2013-08-19 02:36:27 +00:00
matt | f66bbc4e80 | ip -> r2; teq -> cmp | 2013-08-19 02:24:09 +00:00
matt | 7ed75e50a6 | Thumbify | 2013-08-19 02:22:25 +00:00
matt | d3ce3f542d | ip -> r2; teq -> cmp | 2013-08-19 02:20:06 +00:00
matt | 8ba242506b | Thumbify | 2013-08-19 02:13:13 +00:00
matt | 414db3a8ba | Use ip as a temporary | 2013-08-19 02:11:03 +00:00
matt | 0b070291c7 | Change previous use of r2 to r3 | 2013-08-19 02:08:41 +00:00
matt | ad93040ca6 | teq -> cmp; ip -> r2; add/sub -> adds/subs (thumbify part 1) | 2013-08-19 02:07:22 +00:00
matt | 4373b1a3d8 | For EABI, add .cfi ops | 2013-08-19 01:17:32 +00:00
matt | 9dae71ed91 | Add .cfi for __ARM_EABI__; Thumbify | 2013-08-19 01:12:08 +00:00
matt | b4e387d260 | Add END(memcpy) | 2013-08-19 01:08:53 +00:00
matt | c2438985ca | For Thumb, use naive version | 2013-08-19 01:08:29 +00:00
matt | f37c7d04a2 | Thumbify | 2013-08-19 00:56:12 +00:00
matt | 4640cf4296 | Add .cfi ops if EABI; Thumbify | 2013-08-19 00:36:29 +00:00
matt | dcdd476662 | Thumbify | 2013-08-19 00:35:06 +00:00
matt | 034de21d59 | Add a hidden version for libpthread. | 2013-08-16 01:47:41 +00:00
matt | 48a3833972 | When compiling for Thumb1, the swp instruction is not available, so this stub is added to provide __cpu_simple_lock and __cpu_simple_lock_try via thumb interwork support. It is compiled with -mno-thumb. | 2013-08-15 22:42:50 +00:00
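The 48a3833972 entry is the one place in this log that explains its own design: Thumb-1 code cannot encode `swp`, so the lock primitives live in a separately compiled ARM-mode file and Thumb callers reach them through an ordinary (interworking) call. A hedged C sketch of that split, using GCC's `__atomic` builtins as a stand-in for the swp-based assembly (the type and function names below are adapted from the `__cpu_simple_lock` names in the message, not copied from NetBSD source):

```c
/* Sketch: keep the lock primitive behind out-of-line functions so that
 * callers built for Thumb-1 never need the swp instruction themselves.
 * In the real stub these bodies are ARM-mode assembly; here GCC's
 * __atomic builtins stand in for the atomic exchange swp provides. */
typedef volatile unsigned char cpu_simple_lock_t;
#define SIMPLELOCK_UNLOCKED 0
#define SIMPLELOCK_LOCKED   1

static void cpu_simple_lock(cpu_simple_lock_t *l)
{
    /* Spin: atomically swap in LOCKED until the old value was UNLOCKED. */
    while (__atomic_exchange_n(l, (unsigned char)SIMPLELOCK_LOCKED,
                               __ATOMIC_ACQUIRE) == SIMPLELOCK_LOCKED)
        continue;
}

static int cpu_simple_lock_try(cpu_simple_lock_t *l)
{
    /* One attempt: success iff we observed it unlocked. */
    return __atomic_exchange_n(l, (unsigned char)SIMPLELOCK_LOCKED,
                               __ATOMIC_ACQUIRE) == SIMPLELOCK_UNLOCKED;
}

static void cpu_simple_unlock(cpu_simple_lock_t *l)
{
    __atomic_store_n(l, (unsigned char)SIMPLELOCK_UNLOCKED, __ATOMIC_RELEASE);
}
```

Compiling the stub with -mno-thumb (as the message says) keeps the `swp` in ARM state while the rest of the library builds as Thumb.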
matt | f376e753f4 | Only assemble if !__ARM_EABI__ | 2013-08-15 21:40:11 +00:00
matt | c4948bca36 | Use mvnge AHI, #0x80000000 instead of mvnge AHI, ALO, lsr #1 | 2013-08-13 15:52:00 +00:00
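Reading c4948bca36: `mvn` computes a bitwise NOT of its operand, so `mvn AHI, #0x80000000` loads ~0x80000000 = 0x7fffffff into the high word, which is INT32_MAX; presumably this is the saturated high word for a positive overflow, replacing a shift-derived value with a constant. A quick C check of that identity (the helper name `mvn_imm` is mine, purely illustrative):

```c
#include <stdint.h>

/* mvn Rd, #imm writes the bitwise NOT of the immediate into Rd.
 * NOT 0x80000000 is 0x7fffffff, i.e. INT32_MAX. */
static uint32_t mvn_imm(uint32_t imm)
{
    return ~imm;
}
```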
matt | 1271d298bd | 0x800000000 -> 0x80000000 (one too many zeroes) | 2013-08-13 15:46:31 +00:00
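The bug fixed in 1271d298bd is easy to reproduce in C: one extra zero turns the 32-bit sign-bit mask 0x80000000 into the 36-bit value 0x800000000, whose low 32 bits are all zero, so a 32-bit register ends up holding 0 instead of the mask. A small check (the helper name `fits_in_32_bits` is illustrative):

```c
#include <stdint.h>

/* The typo'd constant has nine zeroes and needs 36 bits; truncated to
 * 32 bits it becomes 0 rather than the intended mask 0x80000000. */
static int fits_in_32_bits(unsigned long long v)
{
    return v <= 0xffffffffULL;
}
```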
matt | 11f1fec6ba | Fix movlt | 2013-08-13 15:45:30 +00:00