author | Noah Goldstein <goldstein.w.n@gmail.com> | 2024-08-13 23:29:14 +0800
committer | H.J. Lu <hjl.tools@gmail.com> | 2024-08-15 08:11:33 -0700
commit | 7da08862471dfec6fdae731c2a5f351ad485c71f
tree | fc099b40c8a707ae7ba653477035aa0ffa3120fb /sysdeps/x86_64
parent | 207d64feb26279e152c50744e3c37e68491aca99
x86: Fix bug in strchrnul-evex512 [BZ #32078]
The issue was that we were expecting no matches with CHAR before the start of the string in the page-cross case.

The check code in the page-cross case:

```
	and	$0xffffffffffffffc0,%rax
	vmovdqa64 (%rax),%zmm17
	vpcmpneqb %zmm17,%zmm16,%k1
	vptestmb %zmm17,%zmm17,%k0{%k1}
	kmovq	%k0,%rax
	inc	%rax
	shr	%cl,%rax
	je	L(continue)
```

expects that all characters that match neither null nor CHAR will be 1s in `rax` prior to the `inc`. The `inc` will then overflow all of the 1s when no relevant match was found.

This is incorrect in the page-cross case, as the `vmovdqa64 (%rax),%zmm17` loads from before the start of the input string. If there are matches with CHAR before the start of the string, `rax` won't properly overflow.

The fix is quite simple. Just replace:

```
	inc	%rax
	shr	%cl,%rax
```

with:

```
	sar	%cl,%rax
	inc	%rax
```

The arithmetic shift will clear any matches prior to the start of the string while maintaining the sign bit, so the 1s can properly overflow to zero in the case of no matches.

Reviewed-by: H.J. Lu <hjl.tools@gmail.com>
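To make the mask arithmetic above concrete, here is a small C model of the page-cross check (a sketch, not glibc code): `mask` stands in for the KMOV result over the 64-byte aligned block (bit i set when byte i is neither null nor CHAR), and `off` stands in for `%cl`, the string's offset within that block. All names here are illustrative only.

```c
#include <assert.h>
#include <stdint.h>

/* Pre-fix order: inc, then logical shift.  A clear bit below `off' (CHAR
   before the string start) stops the carry, so the 1s above it never
   overflow even when no in-bounds byte matches.  */
static uint64_t
check_buggy (uint64_t mask, unsigned off)
{
  mask += 1;                    /* inc %rax */
  return mask >> off;           /* shr %cl, %rax -- zero means "continue" */
}

/* Fixed order: arithmetic shift, then inc.  The sar discards the
   out-of-bounds low bits and replicates the sign bit into the top, so the
   value is all 1s exactly when no in-bounds byte matches, and the inc then
   overflows it to zero.  Note: >> on a negative signed value is
   implementation-defined in C; GCC/Clang use an arithmetic shift, which is
   what models `sar' here.  */
static uint64_t
check_fixed (uint64_t mask, unsigned off)
{
  mask = (uint64_t) ((int64_t) mask >> off);   /* sar %cl, %rax */
  return mask + 1;                             /* inc %rax */
}

int
main (void)
{
  /* The string starts at byte 56 of the block: bytes 56..62 are ordinary
     characters, byte 63 is the terminating null, and byte 55, just before
     the string, happens to equal CHAR.  */
  unsigned off = 56;
  uint64_t mask = 0;
  for (unsigned i = 0; i < 55; i++)
    mask |= 1ULL << i;          /* unrelated non-matching bytes */
  /* bit 55 stays clear: CHAR just before the string start */
  for (unsigned i = 56; i < 63; i++)
    mask |= 1ULL << i;          /* the in-bounds characters */
  /* bit 63 stays clear: the null terminator */

  /* The fixed sequence reports the first in-bounds match at offset 7 (the
     null); the buggy sequence misreports offset 0.  */
  assert (__builtin_ctzll (check_fixed (mask, off)) == 7);
  assert (__builtin_ctzll (check_buggy (mask, off)) == 0);

  /* With no null/CHAR in bounds at all, the fixed sequence overflows to
     zero ("continue to the aligned loop"); the buggy one is still fooled
     by the stray CHAR bit and stays nonzero.  */
  mask |= 1ULL << 63;
  assert (check_fixed (mask, off) == 0);
  assert (check_buggy (mask, off) != 0);
  return 0;
}
```

The second scenario in `main` is the overflow property the commit message relies on: after the fix, "no in-bounds match" is exactly the case where the increment wraps the value to zero.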
Diffstat (limited to 'sysdeps/x86_64')
-rw-r--r-- | sysdeps/x86_64/multiarch/strchr-evex-base.S | 8
1 file changed, 4 insertions, 4 deletions
diff --git a/sysdeps/x86_64/multiarch/strchr-evex-base.S b/sysdeps/x86_64/multiarch/strchr-evex-base.S
index 04e2c0e79e..3a0b7c9d64 100644
--- a/sysdeps/x86_64/multiarch/strchr-evex-base.S
+++ b/sysdeps/x86_64/multiarch/strchr-evex-base.S
@@ -124,13 +124,13 @@ L(page_cross):
 	VPCMPNE	%VMM(1), %VMM(0), %k1
 	VPTEST	%VMM(1), %VMM(1), %k0{%k1}
 	KMOV	%k0, %VRAX
-# ifdef USE_AS_WCSCHR
+	sar	%cl, %VRAX
+#ifdef USE_AS_WCSCHR
 	sub	$VEC_MATCH_MASK, %VRAX
-# else
+#else
 	inc	%VRAX
-# endif
+#endif
 	/* Ignore number of character for alignment adjustment. */
-	shr	%cl, %VRAX
 	jz	L(align_more)
 	bsf	%VRAX, %VRAX
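Below is a hypothetical standalone reproducer sketch for the symptom (it is not the glibc test added for BZ #32078, and the layout choices are assumptions): it parks a short string at the very end of a page, plants the searched-for character in the byte just before it, and checks that strchrnul still reports the terminating null. It only exercises the code path fixed here on AVX-512 hardware where glibc dispatches strchrnul to the EVEX512 implementation.

```c
#define _GNU_SOURCE             /* strchrnul */
#include <assert.h>
#include <string.h>
#include <sys/mman.h>
#include <unistd.h>

int
main (void)
{
  size_t page = (size_t) sysconf (_SC_PAGESIZE);

  /* Two pages; the second is made inaccessible so any read past the first
     page would fault.  */
  char *map = mmap (NULL, 2 * page, PROT_READ | PROT_WRITE,
		    MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
  if (map == MAP_FAILED || mprotect (map + page, page, PROT_NONE) != 0)
    return 1;

  /* Fill the page with a harmless character so that, apart from the byte
     planted below, nothing before the string matches null or CHAR.  */
  memset (map, 'x', page);

  /* "abcdefg\0" ends exactly at the page boundary, so an unaligned 64-byte
     load from `s' would cross into the protected page; that is what sends
     the implementation down the page-cross path.  */
  char *s = map + page - 8;
  memcpy (s, "abcdefg", 8);

  /* Plant CHAR just before the string start, inside the same 64-byte
     aligned block that the page-cross path loads.  */
  s[-1] = 'z';

  /* 'z' does not occur in the string, so strchrnul must return a pointer
     to the terminating null; the pre-fix code could return `s' instead.  */
  assert (strchrnul (s, 'z') == s + 7);

  munmap (map, 2 * page);
  return 0;
}
```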