author	H.J. Lu <hjl.tools@gmail.com>	2019-01-21 11:25:56 -0800
committer	H.J. Lu <hjl.tools@gmail.com>	2019-01-21 11:26:07 -0800
commit	b304fc201d2f6baf52ea790df8643e99772243cd
tree	502227813e790ba46c8f6e4ee826cb3f36ffb9bb /sysdeps/x86_64/multiarch/memcmp-sse4.S
parent	97700a34f36721b11a754cf37a1cc40695ece1fd
x86-64 memcmp/wmemcmp: Properly handle the length parameter [BZ# 24097]
On x32, the size_t parameter may be passed in the lower 32 bits of a
64-bit register with non-zero upper 32 bits.  The string/memory
functions written in assembly can only use the lower 32 bits of a
64-bit register as length or must clear the upper 32 bits before using
the full 64-bit register for length.

This patch fixes memcmp/wmemcmp for x32.  Tested on x86-64 and x32.  On
x86-64, libc.so is the same with and without the fix.
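
As an illustration, the C sketch below mirrors what the assembly change
does on x32: keep only the low 32 bits of the incoming length before it
is used for 64-bit address arithmetic.  The helper name effective_length
and the idea of receiving a raw 64-bit register value are hypothetical,
for exposition only; they are not part of the patch.

#include <stdint.h>

/* Hypothetical helper: model the length register (%rdx) as a raw
   64-bit value whose upper half may contain garbage on x32.  */
static inline uint64_t
effective_length (uint64_t raw_rdx)
{
#ifdef __ILP32__
  /* C analogue of `mov %edx, %edx': on x86-64, writing a 32-bit
     register zero-extends into the full 64-bit register, so this
     clears the upper 32 bits before the value is used.  */
  return (uint32_t) raw_rdx;
#else
  /* On LP64, size_t already occupies the whole register.  */
  return raw_rdx;
#endif
}

(In glibc's x86-64 sysdep headers, RDX_LP expands to %edx when
__ILP32__ is defined and to %rdx otherwise, so the shl and cmp in the
diff below operate only on the meaningful low bits on x32.)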

	[BZ# 24097]
	CVE-2019-6488
	* sysdeps/x86_64/multiarch/memcmp-avx2-movbe.S: Use RDX_LP for
	length.  Clear the upper 32 bits of RDX register.
	* sysdeps/x86_64/multiarch/memcmp-sse4.S: Likewise.
	* sysdeps/x86_64/multiarch/memcmp-ssse3.S: Likewise.
	* sysdeps/x86_64/x32/Makefile (tests): Add tst-size_t-memcmp and
	tst-size_t-wmemcmp.
	* sysdeps/x86_64/x32/tst-size_t-memcmp.c: New file.
	* sysdeps/x86_64/x32/tst-size_t-wmemcmp.c: Likewise.
Diffstat (limited to 'sysdeps/x86_64/multiarch/memcmp-sse4.S')
-rw-r--r--	sysdeps/x86_64/multiarch/memcmp-sse4.S	9
1 file changed, 6 insertions(+), 3 deletions(-)
diff --git a/sysdeps/x86_64/multiarch/memcmp-sse4.S b/sysdeps/x86_64/multiarch/memcmp-sse4.S
index 789ff15e2c..c8f67318a4 100644
--- a/sysdeps/x86_64/multiarch/memcmp-sse4.S
+++ b/sysdeps/x86_64/multiarch/memcmp-sse4.S
@@ -42,13 +42,16 @@
 	.section .text.sse4.1,"ax",@progbits
 ENTRY (MEMCMP)
 # ifdef USE_AS_WMEMCMP
-	shl	$2, %rdx
+	shl	$2, %RDX_LP
+# elif defined __ILP32__
+	/* Clear the upper 32 bits.  */
+	mov	%edx, %edx
 # endif
 	pxor	%xmm0, %xmm0
-	cmp	$79, %rdx
+	cmp	$79, %RDX_LP
 	ja	L(79bytesormore)
 # ifndef USE_AS_WMEMCMP
-	cmp	$1, %rdx
+	cmp	$1, %RDX_LP
 	je	L(firstbyte)
 # endif
 	add	%rdx, %rsi