[v8,03/17] Add string-maskoff.h generic header

Message ID 20230113182733.1268668-4-richard.henderson@linaro.org
State Superseded
Series Improve generic string routines

Checks

Context Check Description
dj/TryBot-apply_patch success Patch applied to master at the time it was sent

Commit Message

Richard Henderson Jan. 13, 2023, 6:27 p.m. UTC
  From: Adhemerval Zanella Netto <adhemerval.zanella@linaro.org>

Macros to operate on unaligned accesses for string operations:

  - create_mask: create a mask based on the pointer alignment that sets
    up non-zero bytes before the beginning of the word, so a following
    operation (such as finding a zero byte) can ignore these bytes.

  - repeat_bytes: set up a word with each byte equal to c_in.

  - highbit_mask: from a mask created by create_mask, clear the high
    bit of each byte, leaving 0x7f in each selected byte.

  - word_containing: return the address of the op_t word containing the
    address.

These macros are meant to be used in optimized vectorized string
implementations.
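
As an illustration only (not code from this patch), a minimal sketch of
how these helpers can combine to inspect the first, possibly unaligned,
word of a string; the has_zero predicate below is the classic zero-byte
bit trick, standing in for the one the series provides in string-fzb.h:

  #include <stdbool.h>
  #include <stdint.h>
  #include <string-maskoff.h>

  /* Classic zero-byte test, stand-in for string-fzb.h's has_zero.  */
  static bool
  has_zero (op_t x)
  {
    return ((x - repeat_bytes (0x01)) & ~x & repeat_bytes (0x80)) != 0;
  }

  /* True if the op_t word containing S has a NUL at or after S.  */
  static bool
  first_word_has_nul (const char *s)
  {
    op_t word = *word_containing (s);       /* aligned read */
    word |= create_mask ((uintptr_t) s);    /* force head bytes non-zero */
    return has_zero (word);
  }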
Message-Id: <20230111204558.2402155-4-adhemerval.zanella@linaro.org>
---
 sysdeps/generic/string-maskoff.h | 73 ++++++++++++++++++++++++++++++++
 1 file changed, 73 insertions(+)
 create mode 100644 sysdeps/generic/string-maskoff.h
  

Comments

Richard Henderson Jan. 16, 2023, 8:58 p.m. UTC | #1
On 1/13/23 08:27, Richard Henderson wrote:
> +/* Provide a mask based on the pointer alignment that sets up non-zero
> +   bytes before the beginning of the word.  It is used to mask off
> +   undesirable bits from an aligned read from an unaligned pointer.
> +   For instance, on a 64 bits machine with a pointer alignment of
> +   3 the function returns 0x0000000000ffffff for LE and 0xffffff0000000000
> +   (meaning to mask off the initial 3 bytes).  */
> +static __always_inline op_t
> +create_mask (uintptr_t i)
> +{
> +  i = i % sizeof (op_t);
> +  if (__BYTE_ORDER == __LITTLE_ENDIAN)
> +    return ~(((op_t)-1) << (i * CHAR_BIT));
> +  else
> +    return ~(((op_t)-1) >> (i * CHAR_BIT));
> +}

This has exactly one use, in strlen, which could now use shift_find, like strchrnul.
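
(Sketch only, not the actual code from this series: assuming the
find_zero_all, shift_find and index_first helpers, and the find_t type,
from the other patches behave as their names suggest, the strlen head
could look roughly like this.)

  size_t
  strlen_sketch (const char *str)
  {
    const uintptr_t s_int = (uintptr_t) str;
    const op_t *word_ptr = word_containing (str);

    /* Aligned load; shift_find drops the match bits for the bytes that
       precede STR, so no explicit create_mask is needed.  */
    op_t word = *word_ptr;
    find_t mask = shift_find (find_zero_all (word), s_int);
    if (mask != 0)
      return index_first (mask);

    do
      word = *++word_ptr;
    while (! (mask = find_zero_all (word)));

    return ((const char *) word_ptr - str) + index_first (mask);
  }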

> +/* Based on mask created by 'create_mask', mask off the high bit of each
> +   byte in the mask.  It is used to mask off undesirable bits from an
> +   aligned read from an unaligned pointer, and also taking care to avoid
> +   match possible bytes meant to be matched.  For instance, on a 64 bits
> +   machine with a mask created from a pointer with an alignment of 3
> +   (0x0000000000ffffff) the function returns 0x7f7f7f0000000000 for BE
> +   and 0x00000000007f7f7f for LE.  */
> +static __always_inline op_t
> +highbit_mask (op_t m)
> +{
> +  return m & repeat_bytes (0x7f);
> +}

This isn't used at all anymore.

If we remove those two, we're left with repeat_bytes and word_containing, which are used,
but no longer seem to fit the filename of string-maskoff.h, though I can't immediately
think of something better to use.


r~
  
Adhemerval Zanella Netto Jan. 17, 2023, 6:49 p.m. UTC | #2
On 16/01/23 17:58, Richard Henderson wrote:
> On 1/13/23 08:27, Richard Henderson wrote:
>> +/* Provide a mask based on the pointer alignment that sets up non-zero
>> +   bytes before the beginning of the word.  It is used to mask off
>> +   undesirable bits from an aligned read from an unaligned pointer.
>> +   For instance, on a 64 bits machine with a pointer alignment of
>> +   3 the function returns 0x0000000000ffffff for LE and 0xffffff0000000000
>> +   (meaning to mask off the initial 3 bytes).  */
>> +static __always_inline op_t
>> +create_mask (uintptr_t i)
>> +{
>> +  i = i % sizeof (op_t);
>> +  if (__BYTE_ORDER == __LITTLE_ENDIAN)
>> +    return ~(((op_t)-1) << (i * CHAR_BIT));
>> +  else
>> +    return ~(((op_t)-1) >> (i * CHAR_BIT));
>> +}
> 
> This has exactly one use, in strlen, which could now use shift_find, like strchrnul.

Right, I'll remove the strlen usage then.

> 
>> +/* Based on mask created by 'create_mask', mask off the high bit of each
>> +   byte in the mask.  It is used to mask off undesirable bits from an
>> +   aligned read from an unaligned pointer, and also taking care to avoid
>> +   match possible bytes meant to be matched.  For instance, on a 64 bits
>> +   machine with a mask created from a pointer with an alignment of 3
>> +   (0x0000000000ffffff) the function returns 0x7f7f7f0000000000 for BE
>> +   and 0x00000000007f7f7f for LE.  */
>> +static __always_inline op_t
>> +highbit_mask (op_t m)
>> +{
>> +  return m & repeat_bytes (0x7f);
>> +}
> 
> This isn't used at all anymore.
> 
> If we remove those two, we're left with repeat_bytes and word_containing, which are used, but no longer seem to fit the filename of string-maskoff.h, though I can't immediately think of something better to use.

The word_containing can be removed as well in favor of PTR_ALIGN macros and
I will rename the header to string-repeat_bytes.h.
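
Roughly (a sketch of the replacement, using PTR_ALIGN_DOWN from
<libc-pointer-arith.h>):

  #include <libc-pointer-arith.h>

  /* Equivalent to word_containing (p).  */
  const op_t *w = (const op_t *) PTR_ALIGN_DOWN (p, sizeof (op_t));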
  
Richard Henderson Jan. 18, 2023, 1:33 a.m. UTC | #3
On 1/17/23 08:49, Adhemerval Zanella Netto wrote:
> 
> 
> On 16/01/23 17:58, Richard Henderson wrote:
>> On 1/13/23 08:27, Richard Henderson wrote:
>>> +/* Provide a mask based on the pointer alignment that sets up non-zero
>>> +   bytes before the beginning of the word.  It is used to mask off
>>> +   undesirable bits from an aligned read from an unaligned pointer.
>>> +   For instance, on a 64 bits machine with a pointer alignment of
>>> +   3 the function returns 0x0000000000ffffff for LE and 0xffffff0000000000
>>> +   (meaning to mask off the initial 3 bytes).  */
>>> +static __always_inline op_t
>>> +create_mask (uintptr_t i)
>>> +{
>>> +  i = i % sizeof (op_t);
>>> +  if (__BYTE_ORDER == __LITTLE_ENDIAN)
>>> +    return ~(((op_t)-1) << (i * CHAR_BIT));
>>> +  else
>>> +    return ~(((op_t)-1) >> (i * CHAR_BIT));
>>> +}
>>
>> This has exactly one use, in strlen, which could now use shift_find, like strchrnul.
> 
> Right, I remove the strlen usage then.

The thing is... leaving things as they are is probably better for HPPA and SH, which get 
to use their boolean byte comparison instructions.

The prior code in strchr(nul) was trying to prevent matches vs both 0 and C, consuming at 
least 4 extra insns (possibly more with the constant loads).  The new code in strchrnul 
may be better (or at least close) everywhere.

But perhaps simplicity demands removal in strlen as well?
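
(Illustrative only, not necessarily what the earlier revision did: the
"prevent matches" idea amounts to forcing the head bytes to a value that
is neither NUL nor C, e.g. 0x7f, via the masks from this header; the
extra mask setup is where those insns go.)

  static op_t
  load_head_masked (const char *s)
  {
    op_t before = create_mask ((uintptr_t) s);   /* ones over head bytes */
    op_t word = *word_containing (s);            /* aligned load */
    /* Head bytes become 0x7f: non-zero and, as long as C != 0x7f, not C.  */
    return (word & ~before) | highbit_mask (before);
  }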

> The word_containing can be removed as well in favor of PTR_ALIGN macros and

Good idea.

> I will rename the header to string-repeat_bytes.h.

Or merge with string-extbyte.h and name string-misc.h?


r~
  
Adhemerval Zanella Netto Jan. 18, 2023, 12:45 p.m. UTC | #4
On 17/01/23 22:33, Richard Henderson wrote:
> On 1/17/23 08:49, Adhemerval Zanella Netto wrote:
>>
>>
>> On 16/01/23 17:58, Richard Henderson wrote:
>>> On 1/13/23 08:27, Richard Henderson wrote:
>>>> +/* Provide a mask based on the pointer alignment that sets up non-zero
>>>> +   bytes before the beginning of the word.  It is used to mask off
>>>> +   undesirable bits from an aligned read from an unaligned pointer.
>>>> +   For instance, on a 64 bits machine with a pointer alignment of
>>>> +   3 the function returns 0x0000000000ffffff for LE and 0xffffff0000000000
>>>> +   (meaning to mask off the initial 3 bytes).  */
>>>> +static __always_inline op_t
>>>> +create_mask (uintptr_t i)
>>>> +{
>>>> +  i = i % sizeof (op_t);
>>>> +  if (__BYTE_ORDER == __LITTLE_ENDIAN)
>>>> +    return ~(((op_t)-1) << (i * CHAR_BIT));
>>>> +  else
>>>> +    return ~(((op_t)-1) >> (i * CHAR_BIT));
>>>> +}
>>>
>>> This has exactly one use, in strlen, which could now use shift_find, like strchrnul.
>>
>> Right, I remove the strlen usage then.
> 
> The thing is... leaving things as they are is probably better for HPPA and SH, which get to use their boolean byte comparison instructions.
> 
> The prior code in strchr(nul) was trying to prevent matches vs both 0 and C, consuming at least 4 extra insns (possibly more with the constant loads).  The new code in strchrnul may be better (or at least close) everywhere.
> 
> But perhaps simplicity demands removal in strlen as well?

I mimicked the strchr code in strlen, so I hope it will produce better
instructions as well.

> 
>> The word_containing can be removed as well in favor of PTR_ALIGN macros and
> 
> Good idea.
> 
>> I will rename the header to string-repeat_bytes.h.
> 
> Or merge with string-extbyte.h and name string-misc.h?

I think it makes sense; I will update it.
  

Patch

diff --git a/sysdeps/generic/string-maskoff.h b/sysdeps/generic/string-maskoff.h
new file mode 100644
index 0000000000..73edd5ad0f
--- /dev/null
+++ b/sysdeps/generic/string-maskoff.h
@@ -0,0 +1,73 @@ 
+/* Mask off bits.  Generic C version.
+   Copyright (C) 2023 Free Software Foundation, Inc.
+   This file is part of the GNU C Library.
+
+   The GNU C Library is free software; you can redistribute it and/or
+   modify it under the terms of the GNU Lesser General Public
+   License as published by the Free Software Foundation; either
+   version 2.1 of the License, or (at your option) any later version.
+
+   The GNU C Library is distributed in the hope that it will be useful,
+   but WITHOUT ANY WARRANTY; without even the implied warranty of
+   MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
+   Lesser General Public License for more details.
+
+   You should have received a copy of the GNU Lesser General Public
+   License along with the GNU C Library; if not, see
+   <http://www.gnu.org/licenses/>.  */
+
+#ifndef _STRING_MASKOFF_H
+#define _STRING_MASKOFF_H 1
+
+#include <endian.h>
+#include <limits.h>
+#include <stdint.h>
+#include <string-optype.h>
+
+/* Provide a mask based on the pointer alignment that sets up non-zero
+   bytes before the beginning of the word.  It is used to mask off
+   undesirable bits from an aligned read from an unaligned pointer.
+   For instance, on a 64-bit machine with a pointer alignment of 3 the
+   function returns 0x0000000000ffffff for LE and 0xffffff0000000000 for
+   BE (meaning the initial 3 bytes are masked off).  */
+static __always_inline op_t
+create_mask (uintptr_t i)
+{
+  i = i % sizeof (op_t);
+  if (__BYTE_ORDER == __LITTLE_ENDIAN)
+    return ~(((op_t)-1) << (i * CHAR_BIT));
+  else
+    return ~(((op_t)-1) >> (i * CHAR_BIT));
+}
+
+/* Set up a word with each byte equal to c_in.  For instance, on a 64-bit
+   machine with an input of 0xce the function returns 0xcececececececece.  */
+static __always_inline op_t
+repeat_bytes (unsigned char c_in)
+{
+  return ((op_t)-1 / 0xff) * c_in;
+}
+
+/* Based on a mask created by 'create_mask', mask off the high bit of
+   each byte in the mask.  It is used to mask off undesirable bits from
+   an aligned read from an unaligned pointer, while ensuring the
+   masked-off bytes do not produce spurious matches.  For instance, on a
+   64-bit machine with a mask created from a pointer with an alignment
+   of 3 the function returns 0x00000000007f7f7f for LE (mask
+   0x0000000000ffffff) and 0x7f7f7f0000000000 for BE (mask 0xffffff0000000000).  */
+static __always_inline op_t
+highbit_mask (op_t m)
+{
+  return m & repeat_bytes (0x7f);
+}
+
+/* Return the address of the op_t word containing the address P.  For
+   instance, for the address 0x0011223344556677 and an op_t of size 8,
+   it returns 0x0011223344556670.  */
+static __always_inline op_t *
+word_containing (char const *p)
+{
+  return (op_t *) ((uintptr_t) p & -sizeof (op_t));
+}
+
+#endif /* _STRING_MASKOFF_H  */
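
Not part of the patch: a standalone sanity check of the example values in
the comments above, with op_t assumed to be a 64-bit unsigned type on a
little-endian machine (string-optype.h normally provides op_t).

  #include <assert.h>
  #include <limits.h>
  #include <stdint.h>

  typedef uint64_t op_t;

  static op_t
  create_mask (uintptr_t i)      /* little-endian variant only */
  {
    return ~(((op_t)-1) << ((i % sizeof (op_t)) * CHAR_BIT));
  }

  static op_t
  repeat_bytes (unsigned char c_in)
  {
    return ((op_t)-1 / 0xff) * c_in;
  }

  static op_t
  highbit_mask (op_t m)
  {
    return m & repeat_bytes (0x7f);
  }

  int
  main (void)
  {
    assert (create_mask (3) == 0x0000000000ffffffULL);
    assert (repeat_bytes (0xce) == 0xcecececececececeULL);
    assert (highbit_mask (create_mask (3)) == 0x00000000007f7f7fULL);
    return 0;
  }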