From: Jason Merrill <jason@redhat.com>
To: Jakub Jelinek <jakub@redhat.com>
Cc: Jonathan Wakely <jwakely@redhat.com>, gcc-patches@gcc.gnu.org
Subject: Re: [PATCH] c++: v2: Add __builtin_bit_cast to implement std::bit_cast [PR93121]
Date: Wed, 25 Nov 2020 12:26:17 -0500
Message-ID: <5126c253-6ee3-a475-3324-38049d0e2e3f@redhat.com>
In-Reply-To: <20201102192107.GH3788@tucnak>
On 11/2/20 2:21 PM, Jakub Jelinek wrote:
> On Thu, Jul 30, 2020 at 10:16:46AM -0400, Jason Merrill via Gcc-patches wrote:
>>> The following patch adds a __builtin_bit_cast builtin, similarly to
>>> clang or MSVC, which implement std::bit_cast using such a builtin too.
>>
>> Great!
>
> Sorry for the long delay.
>
> The following patch implements what you've requested in the review.
> It doesn't deal with the padding bits being (sometimes) well defined, but
> if we e.g. add a new CONSTRUCTOR bit requesting that (and a corresponding
> gimplifier change to pre-initialize to all zeros), it could easily be
> adjusted: for such CONSTRUCTORs, just memset the whole mask portion
> (total_bytes) and set mask to NULL.  If there are other cases where such
> handling is desirable (e.g. I wonder about bit_cast being called on
> non-automatic const variables, whose CONSTRUCTOR, at least
> implementation-wise, would end up in pre-zeroed .rodata (or .data)
> memory), that can be handled too.
>
> In the previous version I was using next_initializable_field, which isn't
> available in the middle-end, so all I do right now is skip non-FIELD_DECLs,
> unnamed bit-fields and fields with zero size.
>
> Bootstrapped/regtested on x86_64-linux and i686-linux, ok for trunk?
>
> 2020-11-02 Jakub Jelinek <jakub@redhat.com>
>
> PR libstdc++/93121
> * fold-const.h (native_encode_initializer): Add mask argument
> defaulted to nullptr.
> (find_bitfield_repr_type): Declare.
> * fold-const.c (find_bitfield_repr_type): New function.
> (native_encode_initializer): Add mask argument and support for
> filling it. Handle also some bitfields without integral
> DECL_BIT_FIELD_REPRESENTATIVE.
>
> * c-common.h (enum rid): Add RID_BUILTIN_BIT_CAST.
> * c-common.c (c_common_reswords): Add __builtin_bit_cast.
>
> * cp-tree.h (cp_build_bit_cast): Declare.
> * cp-tree.def (BIT_CAST_EXPR): New tree code.
> * cp-objcp-common.c (names_builtin_p): Handle RID_BUILTIN_BIT_CAST.
> (cp_common_init_ts): Handle BIT_CAST_EXPR.
> * cxx-pretty-print.c (cxx_pretty_printer::postfix_expression):
> Likewise.
> * parser.c (cp_parser_postfix_expression): Handle
> RID_BUILTIN_BIT_CAST.
> * semantics.c (cp_build_bit_cast): New function.
> * tree.c (cp_tree_equal): Handle BIT_CAST_EXPR.
> (cp_walk_subtrees): Likewise.
> * pt.c (tsubst_copy): Likewise.
> * constexpr.c (check_bit_cast_type, cxx_native_interpret_aggregate,
> cxx_eval_bit_cast): New functions.
> (cxx_eval_constant_expression): Handle BIT_CAST_EXPR.
> (potential_constant_expression_1): Likewise.
> * cp-gimplify.c (cp_genericize_r): Likewise.
>
> * g++.dg/cpp2a/bit-cast1.C: New test.
> * g++.dg/cpp2a/bit-cast2.C: New test.
> * g++.dg/cpp2a/bit-cast3.C: New test.
> * g++.dg/cpp2a/bit-cast4.C: New test.
> * g++.dg/cpp2a/bit-cast5.C: New test.
>
> --- gcc/fold-const.h.jj 2020-08-18 07:50:18.716920202 +0200
> +++ gcc/fold-const.h 2020-11-02 15:53:59.859477063 +0100
> @@ -27,9 +27,10 @@ extern int folding_initializer;
> /* Convert between trees and native memory representation. */
> extern int native_encode_expr (const_tree, unsigned char *, int, int off = -1);
> extern int native_encode_initializer (tree, unsigned char *, int,
> - int off = -1);
> + int off = -1, unsigned char * = nullptr);
> extern tree native_interpret_expr (tree, const unsigned char *, int);
> extern bool can_native_interpret_type_p (tree);
> +extern tree find_bitfield_repr_type (int, int);
> extern void shift_bytes_in_array_left (unsigned char *, unsigned int,
> unsigned int);
> extern void shift_bytes_in_array_right (unsigned char *, unsigned int,
> --- gcc/fold-const.c.jj 2020-10-15 15:09:49.079725120 +0200
> +++ gcc/fold-const.c 2020-11-02 17:20:43.784633491 +0100
> @@ -7915,25 +7915,78 @@ native_encode_expr (const_tree expr, uns
> }
> }
>
> +/* Try to find a type whose byte size is greater than or equal to FIELDSIZE
> + bytes and less than or equal to LEN bytes, with underlying mode
> + precision/size a multiple of BITS_PER_UNIT.  As native_{interpret,encode}_int
> + work in terms of machine modes, we can't just use
> + build_nonstandard_integer_type.  */
> +
> +tree
> +find_bitfield_repr_type (int fieldsize, int len)
> +{
> + machine_mode mode;
> + for (int pass = 0; pass < 2; pass++)
> + {
> + enum mode_class mclass = pass ? MODE_PARTIAL_INT : MODE_INT;
> + FOR_EACH_MODE_IN_CLASS (mode, mclass)
> + if (known_ge (GET_MODE_SIZE (mode), fieldsize)
> + && known_eq (GET_MODE_PRECISION (mode),
> + GET_MODE_BITSIZE (mode))
> + && known_le (GET_MODE_SIZE (mode), len))
> + {
> + tree ret = lang_hooks.types.type_for_mode (mode, 1);
> + if (ret && TYPE_MODE (ret) == mode)
> + return ret;
> + }
> + }
> +
> + for (int i = 0; i < NUM_INT_N_ENTS; i ++)
> + if (int_n_enabled_p[i]
> + && int_n_data[i].bitsize >= (unsigned) (BITS_PER_UNIT * fieldsize)
> + && int_n_trees[i].unsigned_type)
> + {
> + tree ret = int_n_trees[i].unsigned_type;
> + mode = TYPE_MODE (ret);
> + if (known_ge (GET_MODE_SIZE (mode), fieldsize)
> + && known_eq (GET_MODE_PRECISION (mode),
> + GET_MODE_BITSIZE (mode))
> + && known_le (GET_MODE_SIZE (mode), len))
> + return ret;
> + }
> +
> + return NULL_TREE;
> +}
> +
> /* Similar to native_encode_expr, but also handle CONSTRUCTORs, VCEs,
> - NON_LVALUE_EXPRs and nops. */
> + NON_LVALUE_EXPRs and nops.  If MASK is non-NULL (in which case PTR
> + must be non-NULL and OFF zero), then in addition to filling the
> + bytes pointed to by PTR with the value, also clear any bits in MASK
> + that correspond to bits known to be initialized; bits for e.g.
> + uninitialized padding bits or uninitialized fields stay set.  */
>
> int
> native_encode_initializer (tree init, unsigned char *ptr, int len,
> - int off)
> + int off, unsigned char *mask)
> {
> + int r;
> +
> /* We don't support starting at negative offset and -1 is special. */
> if (off < -1 || init == NULL_TREE)
> return 0;
>
> + gcc_assert (mask == NULL || (off == 0 && ptr));
> +
> STRIP_NOPS (init);
> switch (TREE_CODE (init))
> {
> case VIEW_CONVERT_EXPR:
> case NON_LVALUE_EXPR:
> - return native_encode_initializer (TREE_OPERAND (init, 0), ptr, len, off);
> + return native_encode_initializer (TREE_OPERAND (init, 0), ptr, len, off,
> + mask);
> default:
> - return native_encode_expr (init, ptr, len, off);
> + r = native_encode_expr (init, ptr, len, off);
> + if (mask)
> + memset (mask, 0, r);
> + return r;
> case CONSTRUCTOR:
> tree type = TREE_TYPE (init);
> HOST_WIDE_INT total_bytes = int_size_in_bytes (type);
> @@ -7946,7 +7999,7 @@ native_encode_initializer (tree init, un
> {
> HOST_WIDE_INT min_index;
> unsigned HOST_WIDE_INT cnt;
> - HOST_WIDE_INT curpos = 0, fieldsize;
> + HOST_WIDE_INT curpos = 0, fieldsize, valueinit = -1;
> constructor_elt *ce;
>
> if (TYPE_DOMAIN (type) == NULL_TREE
> @@ -7961,12 +8014,22 @@ native_encode_initializer (tree init, un
> if (ptr != NULL)
> memset (ptr, '\0', MIN (total_bytes - off, len));
>
> - FOR_EACH_VEC_SAFE_ELT (CONSTRUCTOR_ELTS (init), cnt, ce)
> + for (cnt = 0; ; cnt++)
> {
> - tree val = ce->value;
> - tree index = ce->index;
> + tree val = NULL_TREE, index = NULL_TREE;
> HOST_WIDE_INT pos = curpos, count = 0;
> bool full = false;
> + if (vec_safe_iterate (CONSTRUCTOR_ELTS (init), cnt, &ce))
> + {
> + val = ce->value;
> + index = ce->index;
> + }
> + else if (mask == NULL
> + || CONSTRUCTOR_NO_CLEARING (init)
> + || curpos >= total_bytes)
> + break;
> + else
> + pos = total_bytes;
> if (index && TREE_CODE (index) == RANGE_EXPR)
> {
> if (!tree_fits_shwi_p (TREE_OPERAND (index, 0))
> @@ -7984,6 +8047,28 @@ native_encode_initializer (tree init, un
> pos = (tree_to_shwi (index) - min_index) * fieldsize;
> }
>
> + if (mask && !CONSTRUCTOR_NO_CLEARING (init) && curpos != pos)
> + {
> + if (valueinit == -1)
> + {
> + tree zero = build_constructor (TREE_TYPE (type), NULL);
> + r = native_encode_initializer (zero, ptr + curpos,
> + fieldsize, 0,
> + mask + curpos);
> + ggc_free (zero);
> + if (!r)
> + return 0;
> + valueinit = curpos;
> + curpos += fieldsize;
> + }
> + while (curpos != pos)
> + {
> + memcpy (ptr + curpos, ptr + valueinit, fieldsize);
> + memcpy (mask + curpos, mask + valueinit, fieldsize);
> + curpos += fieldsize;
> + }
> + }
> +
> curpos = pos;
> if (val)
> do
> @@ -7998,6 +8083,8 @@ native_encode_initializer (tree init, un
> if (ptr)
> memcpy (ptr + (curpos - o), ptr + (pos - o),
> fieldsize);
> + if (mask)
> + memcpy (mask + curpos, mask + pos, fieldsize);
> }
> else if (!native_encode_initializer (val,
> ptr
> @@ -8005,7 +8092,10 @@ native_encode_initializer (tree init, un
> : NULL,
> fieldsize,
> off == -1 ? -1
> - : 0))
> + : 0,
> + mask
> + ? mask + curpos
> + : NULL))
> return 0;
> else
> {
> @@ -8020,6 +8110,7 @@ native_encode_initializer (tree init, un
> unsigned char *p = NULL;
> int no = 0;
> int l;
> + gcc_assert (mask == NULL);
> if (curpos >= off)
> {
> if (ptr)
> @@ -8033,7 +8124,7 @@ native_encode_initializer (tree init, un
> no = off - curpos;
> l = len;
> }
> - if (!native_encode_initializer (val, p, l, no))
> + if (!native_encode_initializer (val, p, l, no, NULL))
> return 0;
> }
> curpos += fieldsize;
> @@ -8047,22 +8138,75 @@ native_encode_initializer (tree init, un
> {
> unsigned HOST_WIDE_INT cnt;
> constructor_elt *ce;
> + tree fld_base = TYPE_FIELDS (type);
> + tree to_free = NULL_TREE;
>
> + gcc_assert (TREE_CODE (type) == RECORD_TYPE || mask == NULL);
> if (ptr != NULL)
> memset (ptr, '\0', MIN (total_bytes - off, len));
> - FOR_EACH_VEC_SAFE_ELT (CONSTRUCTOR_ELTS (init), cnt, ce)
> + for (cnt = 0; ; cnt++)
> {
> - tree field = ce->index;
> - tree val = ce->value;
> - HOST_WIDE_INT pos, fieldsize;
> + tree val = NULL_TREE, field = NULL_TREE;
> + HOST_WIDE_INT pos = 0, fieldsize;
> unsigned HOST_WIDE_INT bpos = 0, epos = 0;
>
> - if (field == NULL_TREE)
> - return 0;
> + if (to_free)
> + {
> + ggc_free (to_free);
> + to_free = NULL_TREE;
> + }
>
> - pos = int_byte_position (field);
> - if (off != -1 && (HOST_WIDE_INT) off + len <= pos)
> - continue;
> + if (vec_safe_iterate (CONSTRUCTOR_ELTS (init), cnt, &ce))
> + {
> + val = ce->value;
> + field = ce->index;
> + if (field == NULL_TREE)
> + return 0;
> +
> + pos = int_byte_position (field);
> + if (off != -1 && (HOST_WIDE_INT) off + len <= pos)
> + continue;
> + }
> + else if (mask == NULL
> + || CONSTRUCTOR_NO_CLEARING (init))
> + break;
> + else
> + pos = total_bytes;
> +
> + if (mask && !CONSTRUCTOR_NO_CLEARING (init))
> + {
> + tree fld;
> + for (fld = fld_base; fld; fld = DECL_CHAIN (fld))
> + {
> + if (TREE_CODE (fld) != FIELD_DECL)
> + continue;
> + if (fld == field)
> + break;
> + if (DECL_BIT_FIELD (fld)
> + && DECL_NAME (fld) == NULL_TREE)
> + continue;
I think you want to check DECL_PADDING_P here; the C and C++ front ends
set it on unnamed bit-fields, and that's what is_empty_type looks at.
> + if (DECL_SIZE_UNIT (fld) == NULL_TREE
> + || !tree_fits_shwi_p (DECL_SIZE_UNIT (fld)))
> + return 0;
> + if (integer_zerop (DECL_SIZE_UNIT (fld)))
> + continue;
> + break;
> + }
> + if (fld == NULL_TREE)
> + {
> + if (ce == NULL)
> + break;
> + return 0;
> + }
> + fld_base = DECL_CHAIN (fld);
> + if (fld != field)
> + {
> + cnt--;
> + field = fld;
> + val = build_constructor (TREE_TYPE (fld), NULL);
> + to_free = val;
> + }
> + }
>
> if (TREE_CODE (TREE_TYPE (field)) == ARRAY_TYPE
> && TYPE_DOMAIN (TREE_TYPE (field))
> @@ -8103,30 +8247,49 @@ native_encode_initializer (tree init, un
> if (BYTES_BIG_ENDIAN != WORDS_BIG_ENDIAN)
> return 0;
>
> - tree repr = DECL_BIT_FIELD_REPRESENTATIVE (field);
> - if (repr == NULL_TREE
> - || TREE_CODE (val) != INTEGER_CST
> - || !INTEGRAL_TYPE_P (TREE_TYPE (repr)))
> + if (TREE_CODE (val) != INTEGER_CST)
> return 0;
>
> - HOST_WIDE_INT rpos = int_byte_position (repr);
> + tree repr = DECL_BIT_FIELD_REPRESENTATIVE (field);
> + tree repr_type = NULL_TREE;
> + HOST_WIDE_INT rpos = 0;
> + if (repr && INTEGRAL_TYPE_P (TREE_TYPE (repr)))
> + {
> + rpos = int_byte_position (repr);
> + repr_type = TREE_TYPE (repr);
> + }
> + else
> + {
> + repr_type = find_bitfield_repr_type (fieldsize, len);
> + if (repr_type == NULL_TREE)
> + return 0;
> + HOST_WIDE_INT repr_size = int_size_in_bytes (repr_type);
> + gcc_assert (repr_size > 0 && repr_size <= len);
> + if (pos + repr_size <= len)
> + rpos = pos;
> + else
> + {
> + rpos = len - repr_size;
> + gcc_assert (rpos <= pos);
> + }
> + }
> +
> if (rpos > pos)
> return 0;
> - wide_int w = wi::to_wide (val,
> - TYPE_PRECISION (TREE_TYPE (repr)));
> - int diff = (TYPE_PRECISION (TREE_TYPE (repr))
> + wide_int w = wi::to_wide (val, TYPE_PRECISION (repr_type));
> + int diff = (TYPE_PRECISION (repr_type)
> - TYPE_PRECISION (TREE_TYPE (field)));
> HOST_WIDE_INT bitoff = (pos - rpos) * BITS_PER_UNIT + bpos;
> if (!BYTES_BIG_ENDIAN)
> w = wi::lshift (w, bitoff);
> else
> w = wi::lshift (w, diff - bitoff);
> - val = wide_int_to_tree (TREE_TYPE (repr), w);
> + val = wide_int_to_tree (repr_type, w);
>
> unsigned char buf[MAX_BITSIZE_MODE_ANY_INT
> / BITS_PER_UNIT + 1];
> int l = native_encode_int (val, buf, sizeof buf, 0);
> - if (l * BITS_PER_UNIT != TYPE_PRECISION (TREE_TYPE (repr)))
> + if (l * BITS_PER_UNIT != TYPE_PRECISION (repr_type))
> return 0;
>
> if (ptr == NULL)
> @@ -8139,15 +8302,31 @@ native_encode_initializer (tree init, un
> {
> if (!BYTES_BIG_ENDIAN)
> {
> - int mask = (1 << bpos) - 1;
> - buf[pos - rpos] &= ~mask;
> - buf[pos - rpos] |= ptr[pos - o] & mask;
> + int msk = (1 << bpos) - 1;
> + buf[pos - rpos] &= ~msk;
> + buf[pos - rpos] |= ptr[pos - o] & msk;
> + if (mask)
> + {
> + if (fieldsize > 1 || epos == 0)
> + mask[pos] &= msk;
> + else
> + mask[pos] &= (msk | ~((1 << epos) - 1));
> + }
> }
> else
> {
> - int mask = (1 << (BITS_PER_UNIT - bpos)) - 1;
> - buf[pos - rpos] &= mask;
> - buf[pos - rpos] |= ptr[pos - o] & ~mask;
> + int msk = (1 << (BITS_PER_UNIT - bpos)) - 1;
> + buf[pos - rpos] &= msk;
> + buf[pos - rpos] |= ptr[pos - o] & ~msk;
> + if (mask)
> + {
> + if (fieldsize > 1 || epos == 0)
> + mask[pos] &= ~msk;
> + else
> + mask[pos] &= (~msk
> + | ((1 << (BITS_PER_UNIT - epos))
> + - 1));
> + }
> }
> }
> /* If the bitfield does not end at byte boundary, handle
> @@ -8158,27 +8337,37 @@ native_encode_initializer (tree init, un
> {
> if (!BYTES_BIG_ENDIAN)
> {
> - int mask = (1 << epos) - 1;
> - buf[pos - rpos + fieldsize - 1] &= mask;
> + int msk = (1 << epos) - 1;
> + buf[pos - rpos + fieldsize - 1] &= msk;
> buf[pos - rpos + fieldsize - 1]
> - |= ptr[pos + fieldsize - 1 - o] & ~mask;
> + |= ptr[pos + fieldsize - 1 - o] & ~msk;
> + if (mask && (fieldsize > 1 || bpos == 0))
> + mask[pos + fieldsize - 1] &= ~msk;
> }
> else
> {
> - int mask = (1 << (BITS_PER_UNIT - epos)) - 1;
> - buf[pos - rpos + fieldsize - 1] &= ~mask;
> + int msk = (1 << (BITS_PER_UNIT - epos)) - 1;
> + buf[pos - rpos + fieldsize - 1] &= ~msk;
> buf[pos - rpos + fieldsize - 1]
> - |= ptr[pos + fieldsize - 1 - o] & mask;
> + |= ptr[pos + fieldsize - 1 - o] & msk;
> + if (mask && (fieldsize > 1 || bpos == 0))
> + mask[pos + fieldsize - 1] &= msk;
> }
> }
> if (off == -1
> || (pos >= off
> && (pos + fieldsize <= (HOST_WIDE_INT) off + len)))
> - memcpy (ptr + pos - o, buf + (pos - rpos), fieldsize);
> + {
> + memcpy (ptr + pos - o, buf + (pos - rpos), fieldsize);
> + if (mask && (fieldsize > (bpos != 0) + (epos != 0)))
> + memset (mask + pos + (bpos != 0), 0,
> + fieldsize - (bpos != 0) - (epos != 0));
> + }
> else
> {
> /* Partial overlap. */
> HOST_WIDE_INT fsz = fieldsize;
> + gcc_assert (mask == NULL);
> if (pos < off)
> {
> fsz -= (off - pos);
> @@ -8198,7 +8387,8 @@ native_encode_initializer (tree init, un
> if (!native_encode_initializer (val, ptr ? ptr + pos - o
> : NULL,
> fieldsize,
> - off == -1 ? -1 : 0))
> + off == -1 ? -1 : 0,
> + mask ? mask + pos : NULL))
> return 0;
> }
> else
> @@ -8207,6 +8397,7 @@ native_encode_initializer (tree init, un
> unsigned char *p = NULL;
> int no = 0;
> int l;
> + gcc_assert (mask == NULL);
> if (pos >= off)
> {
> if (ptr)
> @@ -8220,7 +8411,7 @@ native_encode_initializer (tree init, un
> no = off - pos;
> l = len;
> }
> - if (!native_encode_initializer (val, p, l, no))
> + if (!native_encode_initializer (val, p, l, no, NULL))
> return 0;
> }
> }
> --- gcc/c-family/c-common.h.jj 2020-10-27 14:36:42.182246025 +0100
> +++ gcc/c-family/c-common.h 2020-11-02 13:42:18.699819747 +0100
> @@ -164,7 +164,7 @@ enum rid
> RID_HAS_NOTHROW_COPY, RID_HAS_TRIVIAL_ASSIGN,
> RID_HAS_TRIVIAL_CONSTRUCTOR, RID_HAS_TRIVIAL_COPY,
> RID_HAS_TRIVIAL_DESTRUCTOR, RID_HAS_UNIQUE_OBJ_REPRESENTATIONS,
> - RID_HAS_VIRTUAL_DESTRUCTOR,
> + RID_HAS_VIRTUAL_DESTRUCTOR, RID_BUILTIN_BIT_CAST,
> RID_IS_ABSTRACT, RID_IS_AGGREGATE,
> RID_IS_BASE_OF, RID_IS_CLASS,
> RID_IS_EMPTY, RID_IS_ENUM,
> --- gcc/c-family/c-common.c.jj 2020-10-27 14:36:42.182246025 +0100
> +++ gcc/c-family/c-common.c 2020-11-02 13:42:18.700819736 +0100
> @@ -374,6 +374,7 @@ const struct c_common_resword c_common_r
> { "__auto_type", RID_AUTO_TYPE, D_CONLY },
> { "__bases", RID_BASES, D_CXXONLY },
> { "__builtin_addressof", RID_ADDRESSOF, D_CXXONLY },
> + { "__builtin_bit_cast", RID_BUILTIN_BIT_CAST, D_CXXONLY },
> { "__builtin_call_with_static_chain",
> RID_BUILTIN_CALL_WITH_STATIC_CHAIN, D_CONLY },
> { "__builtin_choose_expr", RID_CHOOSE_EXPR, D_CONLY },
> --- gcc/cp/cp-tree.h.jj 2020-10-30 08:59:57.041496709 +0100
> +++ gcc/cp/cp-tree.h 2020-11-02 13:42:18.701819725 +0100
> @@ -7273,6 +7273,8 @@ extern tree finish_builtin_launder (loc
> tsubst_flags_t);
> extern tree cp_build_vec_convert (tree, location_t, tree,
> tsubst_flags_t);
> +extern tree cp_build_bit_cast (location_t, tree, tree,
> + tsubst_flags_t);
> extern void start_lambda_scope (tree);
> extern void record_lambda_scope (tree);
> extern void record_null_lambda_scope (tree);
> --- gcc/cp/cp-tree.def.jj 2020-09-21 11:15:53.715518437 +0200
> +++ gcc/cp/cp-tree.def 2020-11-02 13:42:18.701819725 +0100
> @@ -437,6 +437,9 @@ DEFTREECODE (UNARY_RIGHT_FOLD_EXPR, "una
> DEFTREECODE (BINARY_LEFT_FOLD_EXPR, "binary_left_fold_expr", tcc_expression, 3)
> DEFTREECODE (BINARY_RIGHT_FOLD_EXPR, "binary_right_fold_expr", tcc_expression, 3)
>
> +/* Represents the __builtin_bit_cast (type, expr) expression.
> + The type is in TREE_TYPE, expression in TREE_OPERAND (bitcast, 0). */
> +DEFTREECODE (BIT_CAST_EXPR, "bit_cast_expr", tcc_expression, 1)
>
> /** C++ extensions. */
>
> --- gcc/cp/cp-objcp-common.c.jj 2020-09-21 11:15:53.715518437 +0200
> +++ gcc/cp/cp-objcp-common.c 2020-11-02 13:42:18.701819725 +0100
> @@ -390,6 +390,7 @@ names_builtin_p (const char *name)
> case RID_BUILTIN_HAS_ATTRIBUTE:
> case RID_BUILTIN_SHUFFLE:
> case RID_BUILTIN_LAUNDER:
> + case RID_BUILTIN_BIT_CAST:
> case RID_OFFSETOF:
> case RID_HAS_NOTHROW_ASSIGN:
> case RID_HAS_NOTHROW_CONSTRUCTOR:
> @@ -488,6 +489,7 @@ cp_common_init_ts (void)
> MARK_TS_EXP (ALIGNOF_EXPR);
> MARK_TS_EXP (ARROW_EXPR);
> MARK_TS_EXP (AT_ENCODE_EXPR);
> + MARK_TS_EXP (BIT_CAST_EXPR);
> MARK_TS_EXP (CAST_EXPR);
> MARK_TS_EXP (CONST_CAST_EXPR);
> MARK_TS_EXP (CTOR_INITIALIZER);
> --- gcc/cp/cxx-pretty-print.c.jj 2020-10-19 09:32:35.503914847 +0200
> +++ gcc/cp/cxx-pretty-print.c 2020-11-02 13:42:18.702819714 +0100
> @@ -655,6 +655,15 @@ cxx_pretty_printer::postfix_expression (
> pp_right_paren (this);
> break;
>
> + case BIT_CAST_EXPR:
> + pp_cxx_ws_string (this, "__builtin_bit_cast");
> + pp_left_paren (this);
> + type_id (TREE_TYPE (t));
> + pp_comma (this);
> + expression (TREE_OPERAND (t, 0));
> + pp_right_paren (this);
> + break;
> +
> case EMPTY_CLASS_EXPR:
> type_id (TREE_TYPE (t));
> pp_left_paren (this);
> --- gcc/cp/parser.c.jj 2020-11-02 13:33:46.039434035 +0100
> +++ gcc/cp/parser.c 2020-11-02 13:42:18.705819681 +0100
> @@ -7234,6 +7234,32 @@ cp_parser_postfix_expression (cp_parser
> tf_warning_or_error);
> }
>
> + case RID_BUILTIN_BIT_CAST:
> + {
> + tree expression;
> + tree type;
> + /* Consume the `__builtin_bit_cast' token. */
> + cp_lexer_consume_token (parser->lexer);
> + /* Look for the opening `('. */
> + matching_parens parens;
> + parens.require_open (parser);
> + location_t type_location
> + = cp_lexer_peek_token (parser->lexer)->location;
> + /* Parse the type-id. */
> + {
> + type_id_in_expr_sentinel s (parser);
> + type = cp_parser_type_id (parser);
> + }
> + /* Look for the `,'. */
> + cp_parser_require (parser, CPP_COMMA, RT_COMMA);
> + /* Now, parse the assignment-expression. */
> + expression = cp_parser_assignment_expression (parser);
> + /* Look for the closing `)'. */
> + parens.require_close (parser);
> + return cp_build_bit_cast (type_location, type, expression,
> + tf_warning_or_error);
> + }
> +
> default:
> {
> tree type;
> --- gcc/cp/semantics.c.jj 2020-10-30 09:16:30.816279461 +0100
> +++ gcc/cp/semantics.c 2020-11-02 13:42:18.706819671 +0100
> @@ -10591,4 +10591,76 @@ cp_build_vec_convert (tree arg, location
> return build_call_expr_internal_loc (loc, IFN_VEC_CONVERT, type, 1, arg);
> }
>
> +/* Finish __builtin_bit_cast (type, arg). */
> +
> +tree
> +cp_build_bit_cast (location_t loc, tree type, tree arg,
> + tsubst_flags_t complain)
> +{
> + if (error_operand_p (type))
> + return error_mark_node;
> + if (!dependent_type_p (type))
> + {
> + if (!complete_type_or_maybe_complain (type, NULL_TREE, complain))
> + return error_mark_node;
> + if (TREE_CODE (type) == ARRAY_TYPE)
> + {
> + /* std::bit_cast for destination ARRAY_TYPE is not possible,
> + as functions may not return an array, so don't bother trying
> + to support this (and then deal with VLAs etc.). */
> + error_at (loc, "%<__builtin_bit_cast%> destination type %qT "
> + "is an array type", type);
> + return error_mark_node;
> + }
> + if (!trivially_copyable_p (type))
> + {
> + error_at (loc, "%<__builtin_bit_cast%> destination type %qT "
> + "is not trivially copyable", type);
> + return error_mark_node;
> + }
> + }
> +
> + if (error_operand_p (arg))
> + return error_mark_node;
> +
> + if (!type_dependent_expression_p (arg))
> + {
> + if (TREE_CODE (TREE_TYPE (arg)) == ARRAY_TYPE)
> + {
> + /* Don't perform array-to-pointer conversion. */
> + arg = mark_rvalue_use (arg, loc, true);
> + if (!complete_type_or_maybe_complain (TREE_TYPE (arg), arg, complain))
> + return error_mark_node;
> + }
> + else
> + arg = decay_conversion (arg, complain);
bit_cast operates on an lvalue argument, so I don't think we want
decay_conversion at all here.
> + if (error_operand_p (arg))
> + return error_mark_node;
> +
> + arg = convert_from_reference (arg);
This shouldn't be necessary; the argument should already be converted
from reference. Generally we call convert_from_reference on the result
of some processing, not on an incoming argument.
> + if (!trivially_copyable_p (TREE_TYPE (arg)))
> + {
> + error_at (cp_expr_loc_or_loc (arg, loc),
> + "%<__builtin_bit_cast%> source type %qT "
> + "is not trivially copyable", TREE_TYPE (arg));
> + return error_mark_node;
> + }
> + if (!dependent_type_p (type)
> + && !cp_tree_equal (TYPE_SIZE_UNIT (type),
> + TYPE_SIZE_UNIT (TREE_TYPE (arg))))
> + {
> + error_at (loc, "%<__builtin_bit_cast%> source size %qE "
> + "not equal to destination type size %qE",
> + TYPE_SIZE_UNIT (TREE_TYPE (arg)),
> + TYPE_SIZE_UNIT (type));
> + return error_mark_node;
> + }
> + }
> +
> + tree ret = build_min (BIT_CAST_EXPR, type, arg);
> + SET_EXPR_LOCATION (ret, loc);
> + return ret;
> +}
> +
> #include "gt-cp-semantics.h"
> --- gcc/cp/tree.c.jj 2020-10-08 10:58:22.023037307 +0200
> +++ gcc/cp/tree.c 2020-11-02 13:42:18.707819660 +0100
> @@ -3868,6 +3868,7 @@ cp_tree_equal (tree t1, tree t2)
> CASE_CONVERT:
> case NON_LVALUE_EXPR:
> case VIEW_CONVERT_EXPR:
> + case BIT_CAST_EXPR:
> if (!same_type_p (TREE_TYPE (t1), TREE_TYPE (t2)))
> return false;
> /* Now compare operands as usual. */
> @@ -5112,6 +5113,7 @@ cp_walk_subtrees (tree *tp, int *walk_su
> case CONST_CAST_EXPR:
> case DYNAMIC_CAST_EXPR:
> case IMPLICIT_CONV_EXPR:
> + case BIT_CAST_EXPR:
> if (TREE_TYPE (*tp))
> WALK_SUBTREE (TREE_TYPE (*tp));
>
> --- gcc/cp/pt.c.jj 2020-10-30 09:16:30.818279438 +0100
> +++ gcc/cp/pt.c 2020-11-02 13:42:18.709819638 +0100
> @@ -16741,6 +16741,13 @@ tsubst_copy (tree t, tree args, tsubst_f
> return build1 (code, type, op0);
> }
>
> + case BIT_CAST_EXPR:
> + {
> + tree type = tsubst (TREE_TYPE (t), args, complain, in_decl);
> + tree op0 = tsubst_copy (TREE_OPERAND (t, 0), args, complain, in_decl);
> + return cp_build_bit_cast (EXPR_LOCATION (t), type, op0, complain);
> + }
> +
> case SIZEOF_EXPR:
> if (PACK_EXPANSION_P (TREE_OPERAND (t, 0))
> || ARGUMENT_PACK_P (TREE_OPERAND (t, 0)))
> --- gcc/cp/constexpr.c.jj 2020-10-30 08:59:57.039496732 +0100
> +++ gcc/cp/constexpr.c 2020-11-02 15:55:14.259664820 +0100
> @@ -3912,6 +3912,442 @@ cxx_eval_bit_field_ref (const constexpr_
> return error_mark_node;
> }
>
> +/* Helper for cxx_eval_bit_cast.
> + Check the [bit.cast]/3 rules: bit_cast is constexpr only if the To and
> + From types and the types of all their subobjects have is_union_v<T>,
> + is_pointer_v<T>, is_member_pointer_v<T> and is_volatile_v<T> all false
> + and have no non-static data members of reference type. */
> +
> +static bool
> +check_bit_cast_type (const constexpr_ctx *ctx, location_t loc, tree type,
> + tree orig_type)
> +{
> + if (TREE_CODE (type) == UNION_TYPE)
> + {
> + if (!ctx->quiet)
> + {
> + if (type == orig_type)
> + error_at (loc, "%qs is not a constant expression because %qT is "
> + "a union type", "__builtin_bit_cast", type);
> + else
> + error_at (loc, "%qs is not a constant expression because %qT "
> + "contains a union type", "__builtin_bit_cast",
> + orig_type);
> + }
> + return true;
> + }
> + if (TREE_CODE (type) == POINTER_TYPE)
> + {
> + if (!ctx->quiet)
> + {
> + if (type == orig_type)
> + error_at (loc, "%qs is not a constant expression because %qT is "
> + "a pointer type", "__builtin_bit_cast", type);
> + else
> + error_at (loc, "%qs is not a constant expression because %qT "
> + "contains a pointer type", "__builtin_bit_cast",
> + orig_type);
> + }
> + return true;
> + }
> + if (TREE_CODE (type) == REFERENCE_TYPE)
> + {
> + if (!ctx->quiet)
> + {
> + if (type == orig_type)
> + error_at (loc, "%qs is not a constant expression because %qT is "
> + "a reference type", "__builtin_bit_cast", type);
> + else
> + error_at (loc, "%qs is not a constant expression because %qT "
> + "contains a reference type", "__builtin_bit_cast",
> + orig_type);
> + }
> + return true;
> + }
> + if (TYPE_PTRMEM_P (type))
> + {
> + if (!ctx->quiet)
> + {
> + if (type == orig_type)
> + error_at (loc, "%qs is not a constant expression because %qT is "
> + "a pointer to member type", "__builtin_bit_cast",
> + type);
> + else
> + error_at (loc, "%qs is not a constant expression because %qT "
> + "contains a pointer to member type",
> + "__builtin_bit_cast", orig_type);
> + }
> + return true;
> + }
> + if (TYPE_VOLATILE (type))
> + {
> + if (!ctx->quiet)
> + {
> + if (type == orig_type)
> + error_at (loc, "%qs is not a constant expression because %qT is "
> + "volatile", "__builtin_bit_cast", type);
> + else
> + error_at (loc, "%qs is not a constant expression because %qT "
> + "contains a volatile subobject",
> + "__builtin_bit_cast", orig_type);
> + }
> + return true;
> + }
> + if (TREE_CODE (type) == RECORD_TYPE)
> + for (tree field = TYPE_FIELDS (type); field; field = DECL_CHAIN (field))
> + if (TREE_CODE (field) == FIELD_DECL
> + && check_bit_cast_type (ctx, loc, TREE_TYPE (field), orig_type))
> + return true;
> + return false;
> +}
> +
> +/* Attempt to interpret aggregate of TYPE from bytes encoded in target
> + byte order at PTR + OFF with LEN bytes.  Bits set in MASK mark bits of
> + the value that are indeterminate.  */
> +
> +static tree
> +cxx_native_interpret_aggregate (tree type, const unsigned char *ptr, int off,
> + int len, unsigned char *mask,
> + const constexpr_ctx *ctx, bool *non_constant_p,
> + location_t loc)
Can this be, say, native_interpret_initializer in fold-const? It
doesn't seem closely tied to the front end other than diagnostics that
could move to the caller, like you've already done for the non-aggregate
case.
> +{
> + vec<constructor_elt, va_gc> *elts = NULL;
> + if (TREE_CODE (type) == ARRAY_TYPE)
> + {
> + HOST_WIDE_INT eltsz = int_size_in_bytes (TREE_TYPE (type));
> + if (eltsz < 0 || eltsz > len || TYPE_DOMAIN (type) == NULL_TREE)
> + return error_mark_node;
> +
> + HOST_WIDE_INT cnt = 0;
> + if (TYPE_MAX_VALUE (TYPE_DOMAIN (type)))
> + {
> + if (!tree_fits_shwi_p (TYPE_MAX_VALUE (TYPE_DOMAIN (type))))
> + return error_mark_node;
> + cnt = tree_to_shwi (TYPE_MAX_VALUE (TYPE_DOMAIN (type))) + 1;
> + }
> + if (eltsz == 0)
> + cnt = 0;
> + HOST_WIDE_INT pos = 0;
> + for (HOST_WIDE_INT i = 0; i < cnt; i++, pos += eltsz)
> + {
> + tree v = error_mark_node;
> + if (pos >= len || pos + eltsz > len)
> + return error_mark_node;
> + if (can_native_interpret_type_p (TREE_TYPE (type)))
> + {
> + v = native_interpret_expr (TREE_TYPE (type),
> + ptr + off + pos, eltsz);
> + if (v == NULL_TREE)
> + return error_mark_node;
> + for (int i = 0; i < eltsz; i++)
> + if (mask[off + pos + i])
> + {
> + if (!ctx->quiet)
> + error_at (loc, "%qs accessing uninitialized byte at "
> + "offset %d",
> + "__builtin_bit_cast", (int) (off + pos + i));
> + *non_constant_p = true;
> + return error_mark_node;
> + }
> + }
> + else if (TREE_CODE (TREE_TYPE (type)) == RECORD_TYPE
> + || TREE_CODE (TREE_TYPE (type)) == ARRAY_TYPE)
> + v = cxx_native_interpret_aggregate (TREE_TYPE (type),
> + ptr, off + pos, eltsz,
> + mask, ctx,
> + non_constant_p, loc);
> + if (v == error_mark_node)
> + return error_mark_node;
> + CONSTRUCTOR_APPEND_ELT (elts, size_int (i), v);
> + }
> + return build_constructor (type, elts);
> + }
> + gcc_assert (TREE_CODE (type) == RECORD_TYPE);
> + for (tree field = next_initializable_field (TYPE_FIELDS (type));
> + field; field = next_initializable_field (DECL_CHAIN (field)))
> + {
> + tree fld = field;
> + HOST_WIDE_INT bitoff = 0, pos = 0, sz = 0;
> + int diff = 0;
> + tree v = error_mark_node;
> + if (DECL_BIT_FIELD (field))
> + {
> + fld = DECL_BIT_FIELD_REPRESENTATIVE (field);
> + if (fld && INTEGRAL_TYPE_P (TREE_TYPE (fld)))
> + {
> + poly_int64 bitoffset;
> + poly_uint64 field_offset, fld_offset;
> + if (poly_int_tree_p (DECL_FIELD_OFFSET (field), &field_offset)
> + && poly_int_tree_p (DECL_FIELD_OFFSET (fld), &fld_offset))
> + bitoffset = (field_offset - fld_offset) * BITS_PER_UNIT;
> + else
> + bitoffset = 0;
> + bitoffset += (tree_to_uhwi (DECL_FIELD_BIT_OFFSET (field))
> + - tree_to_uhwi (DECL_FIELD_BIT_OFFSET (fld)));
> + diff = (TYPE_PRECISION (TREE_TYPE (fld))
> + - TYPE_PRECISION (TREE_TYPE (field)));
> + if (!bitoffset.is_constant (&bitoff)
> + || bitoff < 0
> + || bitoff > diff)
> + return error_mark_node;
> + }
> + else
> + {
> + if (!tree_fits_uhwi_p (DECL_FIELD_BIT_OFFSET (field)))
> + return error_mark_node;
> + int fieldsize = TYPE_PRECISION (TREE_TYPE (field));
> + int bpos = tree_to_uhwi (DECL_FIELD_BIT_OFFSET (field));
> + bpos %= BITS_PER_UNIT;
> + fieldsize += bpos;
> + fieldsize += BITS_PER_UNIT - 1;
> + fieldsize /= BITS_PER_UNIT;
> + tree repr_type = find_bitfield_repr_type (fieldsize, len);
> + if (repr_type == NULL_TREE)
> + return error_mark_node;
> + sz = int_size_in_bytes (repr_type);
> + if (sz < 0 || sz > len)
> + return error_mark_node;
> + pos = int_byte_position (field);
> + if (pos < 0 || pos > len || pos + fieldsize > len)
> + return error_mark_node;
> + HOST_WIDE_INT rpos;
> + if (pos + sz <= len)
> + rpos = pos;
> + else
> + {
> + rpos = len - sz;
> + gcc_assert (rpos <= pos);
> + }
> + bitoff = (HOST_WIDE_INT) (pos - rpos) * BITS_PER_UNIT + bpos;
> + pos = rpos;
> + diff = (TYPE_PRECISION (repr_type)
> + - TYPE_PRECISION (TREE_TYPE (field)));
> + v = native_interpret_expr (repr_type, ptr + off + pos, sz);
> + if (v == NULL_TREE)
> + return error_mark_node;
> + fld = NULL_TREE;
> + }
> + }
> +
> + if (fld)
> + {
> + sz = int_size_in_bytes (TREE_TYPE (fld));
> + if (sz < 0 || sz > len)
> + return error_mark_node;
> + tree byte_pos = byte_position (fld);
> + if (!tree_fits_shwi_p (byte_pos))
> + return error_mark_node;
> + pos = tree_to_shwi (byte_pos);
> + if (pos < 0 || pos > len || pos + sz > len)
> + return error_mark_node;
> + }
> + if (fld == NULL_TREE)
> + /* Already handled above. */;
> + else if (can_native_interpret_type_p (TREE_TYPE (fld)))
> + {
> + v = native_interpret_expr (TREE_TYPE (fld),
> + ptr + off + pos, sz);
> + if (v == NULL_TREE)
> + return error_mark_node;
> + if (fld == field)
> + for (int i = 0; i < sz; i++)
> + if (mask[off + pos + i])
> + {
> + if (!ctx->quiet)
> + error_at (loc,
> + "%qs accessing uninitialized byte at offset %d",
> + "__builtin_bit_cast", (int) (off + pos + i));
> + *non_constant_p = true;
> + return error_mark_node;
> + }
> + }
> + else if (TREE_CODE (TREE_TYPE (fld)) == RECORD_TYPE
> + || TREE_CODE (TREE_TYPE (fld)) == ARRAY_TYPE)
> + v = cxx_native_interpret_aggregate (TREE_TYPE (fld),
> + ptr, off + pos, sz, mask,
> + ctx, non_constant_p, loc);
> + if (v == error_mark_node)
> + return error_mark_node;
> + if (fld != field)
> + {
> + if (TREE_CODE (v) != INTEGER_CST)
> + return error_mark_node;
> +
> + /* FIXME: Figure out how to handle PDP endian bitfields. */
> + if (BYTES_BIG_ENDIAN != WORDS_BIG_ENDIAN)
> + return error_mark_node;
> + if (!BYTES_BIG_ENDIAN)
> + v = wide_int_to_tree (TREE_TYPE (field),
> + wi::lrshift (wi::to_wide (v), bitoff));
> + else
> + v = wide_int_to_tree (TREE_TYPE (field),
> + wi::lrshift (wi::to_wide (v),
> + diff - bitoff));
> + int bpos = bitoff % BITS_PER_UNIT;
> + int bpos_byte = bitoff / BITS_PER_UNIT;
> + int fieldsize = TYPE_PRECISION (TREE_TYPE (field));
> + fieldsize += bpos;
> + int epos = fieldsize % BITS_PER_UNIT;
> + fieldsize += BITS_PER_UNIT - 1;
> + fieldsize /= BITS_PER_UNIT;
> + int bad = -1;
> + if (bpos)
> + {
> + int msk;
> + if (!BYTES_BIG_ENDIAN)
> + {
> + msk = (1 << bpos) - 1;
> + if (fieldsize == 1 && epos != 0)
> + msk |= ~((1 << epos) - 1);
> + }
> + else
> + {
> + msk = ~((1 << (BITS_PER_UNIT - bpos)) - 1);
> + if (fieldsize == 1 && epos != 0)
> + msk |= (1 << (BITS_PER_UNIT - epos)) - 1;
> + }
> + if (mask[off + pos + bpos_byte] & ~msk)
> + bad = off + pos + bpos_byte;
> + }
> + if (epos && (fieldsize > 1 || bpos == 0))
> + {
> + int msk;
> + if (!BYTES_BIG_ENDIAN)
> + msk = (1 << epos) - 1;
> + else
> + msk = ~((1 << (BITS_PER_UNIT - epos)) - 1);
> + if (mask[off + pos + bpos_byte + fieldsize - 1] & msk)
> + bad = off + pos + bpos_byte + fieldsize - 1;
> + }
> + for (int i = 0; bad < 0 && i < fieldsize - (bpos != 0) - (epos != 0);
> + i++)
> + if (mask[off + pos + bpos_byte + (bpos != 0) + i])
> + bad = off + pos + bpos_byte + (bpos != 0) + i;
> + if (bad >= 0)
> + {
> + if (!ctx->quiet)
> + error_at (loc, "%qs accessing uninitialized byte at offset %d",
> + "__builtin_bit_cast", bad);
> + *non_constant_p = true;
> + return error_mark_node;
> + }
> + }
> + CONSTRUCTOR_APPEND_ELT (elts, field, v);
> + }
> + return build_constructor (type, elts);
> +}
> +
> +/* Subroutine of cxx_eval_constant_expression.
> + Attempt to evaluate a BIT_CAST_EXPR. */
> +
> +static tree
> +cxx_eval_bit_cast (const constexpr_ctx *ctx, tree t, bool *non_constant_p,
> + bool *overflow_p)
> +{
> + if (check_bit_cast_type (ctx, EXPR_LOCATION (t), TREE_TYPE (t),
> + TREE_TYPE (t))
> + || check_bit_cast_type (ctx, cp_expr_loc_or_loc (TREE_OPERAND (t, 0),
> + EXPR_LOCATION (t)),
> + TREE_TYPE (TREE_OPERAND (t, 0)),
> + TREE_TYPE (TREE_OPERAND (t, 0))))
> + {
> + *non_constant_p = true;
> + return t;
> + }
> +
> + tree op = cxx_eval_constant_expression (ctx, TREE_OPERAND (t, 0), false,
> + non_constant_p, overflow_p);
> + if (*non_constant_p)
> + return t;
> +
> + location_t loc = EXPR_LOCATION (t);
> + if (BITS_PER_UNIT != 8 || CHAR_BIT != 8)
> + {
> + if (!ctx->quiet)
> + sorry_at (loc, "%qs cannot be constant evaluated on the target",
> + "__builtin_bit_cast");
> + *non_constant_p = true;
> + return t;
> + }
> +
> + if (!tree_fits_shwi_p (TYPE_SIZE_UNIT (TREE_TYPE (t))))
> + {
> + if (!ctx->quiet)
> + sorry_at (loc, "%qs cannot be constant evaluated because the "
> + "type is too large", "__builtin_bit_cast");
> + *non_constant_p = true;
> + return t;
> + }
> +
> + HOST_WIDE_INT len = tree_to_shwi (TYPE_SIZE_UNIT (TREE_TYPE (t)));
> + if (len < 0 || (int) len != len)
> + {
> + if (!ctx->quiet)
> + sorry_at (loc, "%qs cannot be constant evaluated because the "
> + "type is too large", "__builtin_bit_cast");
> + *non_constant_p = true;
> + return t;
> + }
> +
> + unsigned char buf[64];
> + unsigned char *ptr, *mask;
> + size_t alen = (size_t) len * 2;
> + if (alen <= sizeof (buf))
> + ptr = buf;
> + else
> + ptr = XNEWVEC (unsigned char, alen);
> + mask = ptr + (size_t) len;
> + /* At the beginning consider everything indeterminate. */
> + memset (mask, ~0, (size_t) len);
> +
> + if (native_encode_initializer (op, ptr, len, 0, mask) != len)
> + {
> + if (!ctx->quiet)
> + sorry_at (loc, "%qs cannot be constant evaluated because the "
> + "argument cannot be encoded", "__builtin_bit_cast");
> + *non_constant_p = true;
> + return t;
> + }
> +
> + if (can_native_interpret_type_p (TREE_TYPE (t)))
> + if (tree r = native_interpret_expr (TREE_TYPE (t), ptr, len))
> + {
> + for (int i = 0; i < len; i++)
> + if (mask[i])
> + {
> + if (!ctx->quiet)
> + error_at (loc, "%qs accessing uninitialized byte at offset %d",
> + "__builtin_bit_cast", i);
> + *non_constant_p = true;
> + r = t;
> + break;
> + }
> + if (ptr != buf)
> + XDELETE (ptr);
> + return r;
> + }
> +
> + if (TREE_CODE (TREE_TYPE (t)) == RECORD_TYPE)
> + {
> + tree r = cxx_native_interpret_aggregate (TREE_TYPE (t), ptr, 0, len,
> + mask, ctx, non_constant_p, loc);
> + if (r != error_mark_node)
> + {
> + if (ptr != buf)
> + XDELETE (ptr);
> + return r;
> + }
> + if (*non_constant_p)
> + return t;
> + }
> +
> + if (!ctx->quiet)
> + sorry_at (loc, "%qs cannot be constant evaluated because the "
> + "argument cannot be interpreted", "__builtin_bit_cast");
> + *non_constant_p = true;
> + return t;
> +}
> +
> /* Subroutine of cxx_eval_constant_expression.
> Evaluate a short-circuited logical expression T in the context
> of a given constexpr CALL. BAILOUT_VALUE is the value for
> @@ -6625,6 +7061,10 @@ cxx_eval_constant_expression (const cons
> *non_constant_p = true;
> return t;
>
> + case BIT_CAST_EXPR:
> + r = cxx_eval_bit_cast (ctx, t, non_constant_p, overflow_p);
> + break;
> +
> default:
> if (STATEMENT_CODE_P (TREE_CODE (t)))
> {
> @@ -8403,6 +8843,9 @@ potential_constant_expression_1 (tree t,
> case ANNOTATE_EXPR:
> return RECUR (TREE_OPERAND (t, 0), rval);
>
> + case BIT_CAST_EXPR:
> + return RECUR (TREE_OPERAND (t, 0), rval);
> +
> /* Coroutine await, yield and return expressions are not. */
> case CO_AWAIT_EXPR:
> case CO_YIELD_EXPR:
> --- gcc/cp/cp-gimplify.c.jj 2020-10-08 10:58:21.939038527 +0200
> +++ gcc/cp/cp-gimplify.c 2020-11-02 13:42:18.711819616 +0100
> @@ -1568,6 +1568,11 @@ cp_genericize_r (tree *stmt_p, int *walk
> cp_genericize_r, cp_walk_subtrees);
> break;
>
> + case BIT_CAST_EXPR:
> + *stmt_p = build1_loc (EXPR_LOCATION (stmt), VIEW_CONVERT_EXPR,
> + TREE_TYPE (stmt), TREE_OPERAND (stmt, 0));
> + break;
> +
> default:
> if (IS_TYPE_OR_DECL_P (stmt))
> *walk_subtrees = 0;
> --- gcc/testsuite/g++.dg/cpp2a/bit-cast1.C.jj 2020-11-02 13:42:18.711819616 +0100
> +++ gcc/testsuite/g++.dg/cpp2a/bit-cast1.C 2020-11-02 13:42:18.711819616 +0100
> @@ -0,0 +1,47 @@
> +// { dg-do compile }
> +
> +struct S { short a, b; };
> +struct T { float a[16]; };
> +struct U { int b[16]; };
> +
> +#if __SIZEOF_FLOAT__ == __SIZEOF_INT__
> +int
> +f1 (float x)
> +{
> + return __builtin_bit_cast (int, x);
> +}
> +#endif
> +
> +#if 2 * __SIZEOF_SHORT__ == __SIZEOF_INT__
> +S
> +f2 (int x)
> +{
> + return __builtin_bit_cast (S, x);
> +}
> +
> +int
> +f3 (S x)
> +{
> + return __builtin_bit_cast (int, x);
> +}
> +#endif
> +
> +#if __SIZEOF_FLOAT__ == __SIZEOF_INT__
> +U
> +f4 (T &x)
> +{
> + return __builtin_bit_cast (U, x);
> +}
> +
> +T
> +f5 (int (&x)[16])
> +{
> + return __builtin_bit_cast (T, x);
> +}
> +#endif
> +
> +int
> +f6 ()
> +{
> + return __builtin_bit_cast (unsigned char, (signed char) 0);
> +}
> --- gcc/testsuite/g++.dg/cpp2a/bit-cast2.C.jj 2020-11-02 13:42:18.711819616 +0100
> +++ gcc/testsuite/g++.dg/cpp2a/bit-cast2.C 2020-11-02 13:42:18.711819616 +0100
> @@ -0,0 +1,57 @@
> +// { dg-do compile }
> +
> +struct S { ~S (); int s; };
> +S s;
> +struct V; // { dg-message "forward declaration of 'struct V'" }
> +extern V v; // { dg-error "'v' has incomplete type" }
> +extern V *p;
> +struct U { int a, b; };
> +U u;
> +
> +void
> +foo (int *q)
> +{
> + __builtin_bit_cast (int, s); // { dg-error "'__builtin_bit_cast' source type 'S' is not trivially copyable" }
> + __builtin_bit_cast (S, 0); // { dg-error "'__builtin_bit_cast' destination type 'S' is not trivially copyable" }
> + __builtin_bit_cast (int &, q); // { dg-error "'__builtin_bit_cast' destination type 'int&' is not trivially copyable" }
> + __builtin_bit_cast (int [1], 0); // { dg-error "'__builtin_bit_cast' destination type \[^\n\r]* is an array type" }
> + __builtin_bit_cast (V, 0); // { dg-error "invalid use of incomplete type 'struct V'" }
> + __builtin_bit_cast (int, v);
> + __builtin_bit_cast (int, *p); // { dg-error "invalid use of incomplete type 'struct V'" }
> + __builtin_bit_cast (U, 0); // { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
> + __builtin_bit_cast (int, u); // { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
> +}
> +
> +template <int N>
> +void
> +bar (int *q)
> +{
> + __builtin_bit_cast (int, s); // { dg-error "'__builtin_bit_cast' source type 'S' is not trivially copyable" }
> + __builtin_bit_cast (S, 0); // { dg-error "'__builtin_bit_cast' destination type 'S' is not trivially copyable" }
> + __builtin_bit_cast (int &, q); // { dg-error "'__builtin_bit_cast' destination type 'int&' is not trivially copyable" }
> + __builtin_bit_cast (int [1], 0); // { dg-error "'__builtin_bit_cast' destination type \[^\n\r]* is an array type" }
> + __builtin_bit_cast (V, 0); // { dg-error "invalid use of incomplete type 'struct V'" }
> + __builtin_bit_cast (int, *p); // { dg-error "invalid use of incomplete type 'struct V'" }
> + __builtin_bit_cast (U, 0); // { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
> + __builtin_bit_cast (int, u); // { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
> +}
> +
> +template <typename T1, typename T2, typename T3, typename T4>
> +void
> +baz (T3 s, T4 *p, T1 *q)
> +{
> + __builtin_bit_cast (int, s); // { dg-error "'__builtin_bit_cast' source type 'S' is not trivially copyable" }
> + __builtin_bit_cast (T3, 0); // { dg-error "'__builtin_bit_cast' destination type 'S' is not trivially copyable" }
> + __builtin_bit_cast (T1 &, q); // { dg-error "'__builtin_bit_cast' destination type 'int&' is not trivially copyable" }
> + __builtin_bit_cast (T2, 0); // { dg-error "'__builtin_bit_cast' destination type \[^\n\r]* is an array type" }
> + __builtin_bit_cast (T4, 0); // { dg-error "invalid use of incomplete type 'struct V'" }
> + __builtin_bit_cast (int, *p); // { dg-error "invalid use of incomplete type 'struct V'" }
> + __builtin_bit_cast (U, (T1) 0); // { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
> + __builtin_bit_cast (T1, u); // { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
> +}
> +
> +void
> +qux (int *q)
> +{
> + baz <int, int [1], S, V> (s, p, q);
> +}
> --- gcc/testsuite/g++.dg/cpp2a/bit-cast3.C.jj 2020-11-02 13:42:18.711819616 +0100
> +++ gcc/testsuite/g++.dg/cpp2a/bit-cast3.C 2020-11-02 13:42:18.711819616 +0100
> @@ -0,0 +1,229 @@
> +// { dg-do compile { target c++11 } }
> +
> +template <typename To, typename From>
> +constexpr To
> +bit_cast (const From &from)
> +{
> + return __builtin_bit_cast (To, from);
> +}
> +
> +template <typename To, typename From>
> +constexpr bool
> +check (const From &from)
> +{
> + return bit_cast <From> (bit_cast <To> (from)) == from;
> +}
> +
> +struct A
> +{
> + int a, b, c;
> + constexpr bool operator == (const A &x) const
> + {
> + return x.a == a && x.b == b && x.c == c;
> + }
> +};
> +
> +struct B
> +{
> + unsigned a[3];
> + constexpr bool operator == (const B &x) const
> + {
> + return x.a[0] == a[0] && x.a[1] == a[1] && x.a[2] == a[2];
> + }
> +};
> +
> +struct C
> +{
> + char a[2][3][2];
> + constexpr bool operator == (const C &x) const
> + {
> + return x.a[0][0][0] == a[0][0][0]
> + && x.a[0][0][1] == a[0][0][1]
> + && x.a[0][1][0] == a[0][1][0]
> + && x.a[0][1][1] == a[0][1][1]
> + && x.a[0][2][0] == a[0][2][0]
> + && x.a[0][2][1] == a[0][2][1]
> + && x.a[1][0][0] == a[1][0][0]
> + && x.a[1][0][1] == a[1][0][1]
> + && x.a[1][1][0] == a[1][1][0]
> + && x.a[1][1][1] == a[1][1][1]
> + && x.a[1][2][0] == a[1][2][0]
> + && x.a[1][2][1] == a[1][2][1];
> + }
> +};
> +
> +struct D
> +{
> + int a, b;
> + constexpr bool operator == (const D &x) const
> + {
> + return x.a == a && x.b == b;
> + }
> +};
> +
> +struct E {};
> +struct F { char c, d, e, f; };
> +struct G : public D, E, F
> +{
> + int g;
> + constexpr bool operator == (const G &x) const
> + {
> + return x.a == a && x.b == b && x.c == c && x.d == d
> + && x.e == e && x.f == f && x.g == g;
> + }
> +};
> +
> +struct H
> +{
> + int a, b[2], c;
> + constexpr bool operator == (const H &x) const
> + {
> + return x.a == a && x.b[0] == b[0] && x.b[1] == b[1] && x.c == c;
> + }
> +};
> +
> +#if __SIZEOF_INT__ == 4
> +struct I
> +{
> + int a;
> + int b : 3;
> + int c : 24;
> + int d : 5;
> + int e;
> + constexpr bool operator == (const I &x) const
> + {
> + return x.a == a && x.b == b && x.c == c && x.d == d && x.e == e;
> + }
> +};
> +#endif
> +
> +#if __SIZEOF_INT__ == 4 && __SIZEOF_LONG_LONG__ == 8
> +struct J
> +{
> + long long int a, b : 11, c : 3, d : 37, e : 1, f : 10, g : 2, h;
> + constexpr bool operator == (const J &x) const
> + {
> + return x.a == a && x.b == b && x.c == c && x.d == d && x.e == e
> + && x.f == f && x.g == g && x.h == h;
> + }
> +};
> +
> +struct K
> +{
> + long long int a, b, c;
> + constexpr bool operator == (const K &x) const
> + {
> + return x.a == a && x.b == b && x.c == c;
> + }
> +};
> +
> +struct M
> +{
> + signed a : 6, b : 7, c : 6, d : 5;
> + unsigned char e;
> + unsigned int f;
> + long long int g;
> + constexpr bool operator == (const M &x) const
> + {
> + return x.a == a && x.b == b && x.c == c && x.d == d && x.e == e
> + && x.f == f && x.g == g;
> + }
> +};
> +
> +struct N
> +{
> + unsigned long long int a, b;
> + constexpr bool operator == (const N &x) const
> + {
> + return x.a == a && x.b == b;
> + }
> +};
> +#endif
> +
> +static_assert (check <unsigned int> (0), "");
> +static_assert (check <long long int> (0xdeadbeeffeedbac1ULL), "");
> +static_assert (check <signed char> ((unsigned char) 42), "");
> +static_assert (check <char> ((unsigned char) 42), "");
> +static_assert (check <unsigned char> ((unsigned char) 42), "");
> +static_assert (check <signed char> ((signed char) 42), "");
> +static_assert (check <char> ((signed char) 42), "");
> +static_assert (check <unsigned char> ((signed char) 42), "");
> +static_assert (check <signed char> ((char) 42), "");
> +static_assert (check <char> ((char) 42), "");
> +static_assert (check <unsigned char> ((char) 42), "");
> +#if __SIZEOF_INT__ == __SIZEOF_FLOAT__
> +static_assert (check <int> (2.5f), "");
> +static_assert (check <unsigned int> (136.5f), "");
> +#endif
> +#if __SIZEOF_LONG_LONG__ == __SIZEOF_DOUBLE__
> +static_assert (check <long long> (2.5), "");
> +static_assert (check <long long unsigned> (123456.75), "");
> +#endif
> +
> +static_assert (check <B> (A{ 1, 2, 3 }), "");
> +static_assert (check <A> (B{ 4, 5, 6 }), "");
> +
> +#if __SIZEOF_INT__ == 4
> +static_assert (check <C> (A{ 7, 8, 9 }), "");
> +static_assert (check <C> (B{ 10, 11, 12 }), "");
> +static_assert (check <A> (C{ { { { 13, 14 }, { 15, 16 }, { 17, 18 } },
> + { { 19, 20 }, { 21, 22 }, { 23, 24 } } } }), "");
> +constexpr unsigned char c[] = { 1, 2, 3, 4 };
> +#if __BYTE_ORDER__ == __ORDER_LITTLE_ENDIAN__
> +static_assert (bit_cast <unsigned int> (c) == 0x04030201U, "");
> +#elif __BYTE_ORDER__ == __ORDER_BIG_ENDIAN__
> +static_assert (bit_cast <unsigned int> (c) == 0x01020304U, "");
> +#endif
> +
> +#if __cplusplus >= 201703L
> +static_assert (check <G> (H { 0x12345678, { 0x23456789, 0x5a876543 }, 0x3ba78654 }), "");
> +#endif
> +constexpr int d[] = { 0x12345678, 0x23456789, 0x5a876543, 0x3ba78654 };
> +static_assert (bit_cast <G> (d) == bit_cast <G> (H { 0x12345678, { 0x23456789, 0x5a876543 }, 0x3ba78654 }), "");
> +
> +#if __BYTE_ORDER__ == __ORDER_LITTLE_ENDIAN__
> +static_assert (bit_cast <I> (A { 0x7efa3412, 0x5a876543, 0x1eeffeed })
> + == I { 0x7efa3412, 3, 0x50eca8, 0xb, 0x1eeffeed }, "");
> +static_assert (bit_cast <A> (I { 0x7efa3412, 3, 0x50eca8, 0xb, 0x1eeffeed })
> + == A { 0x7efa3412, 0x5a876543, 0x1eeffeed }, "");
> +#elif __BYTE_ORDER__ == __ORDER_BIG_ENDIAN__
> +static_assert (bit_cast <I> (A { 0x7efa3412, 0x5a876543, 0x1eeffeed })
> + == I { 0x7efa3412, 2, -0x2bc4d6, 0x3, 0x1eeffeed }, "");
> +static_assert (bit_cast <A> (I { 0x7efa3412, 2, -0x2bc4d6, 0x3, 0x1eeffeed })
> + == A { 0x7efa3412, 0x5a876543, 0x1eeffeed }, "");
> +#endif
> +#endif
> +
> +#if 2 * __SIZEOF_INT__ == __SIZEOF_LONG_LONG__ && __SIZEOF_INT__ >= 4
> +constexpr unsigned long long a = 0xdeadbeeffee1deadULL;
> +constexpr unsigned b[] = { 0xfeedbacU, 0xbeeffeedU };
> +#if __BYTE_ORDER__ == __ORDER_LITTLE_ENDIAN__
> +static_assert (bit_cast <D> (a) == D { int (0xfee1deadU), int (0xdeadbeefU) }, "");
> +static_assert (bit_cast <unsigned long long> (b) == 0xbeeffeed0feedbacULL, "");
> +#elif __BYTE_ORDER__ == __ORDER_BIG_ENDIAN__
> +static_assert (bit_cast <D> (a) == D { int (0xdeadbeefU), int (0xfee1deadU) }, "");
> +static_assert (bit_cast <unsigned long long> (b) == 0x0feedbacbeeffeedULL, "");
> +#endif
> +#endif
> +
> +#if __SIZEOF_INT__ == 4 && __SIZEOF_LONG_LONG__ == 8
> +#if __BYTE_ORDER__ == __ORDER_LITTLE_ENDIAN__
> +static_assert (bit_cast <J> (K { 0x0feedbacdeadbeefLL, 7862463375103529997LL, 0x0feedbacdeadbeefLL })
> + == J { 0x0feedbacdeadbeefLL, -1011, 2, -0xbacdeadbeLL, -1, -303, 1, 0x0feedbacdeadbeefLL }, "");
> +static_assert (bit_cast <K> (J { 0x0feedbacdeadbeefLL, -1011, 2, -0xbacdeadbeLL, -1, -303, 1, 0x0feedbacdeadbeefLL })
> + == K { 0x0feedbacdeadbeefLL, 7862463375103529997LL, 0x0feedbacdeadbeefLL }, "");
> +static_assert (bit_cast <M> (N { 0xfeedbacdeadbeef8ULL, 0x123456789abcde42ULL })
> + == M { -8, 59, 31, -5, 234, 0xfeedbacdU, 0x123456789abcde42ULL }, "");
> +static_assert (bit_cast <N> (M { -8, 59, 31, -5, 234, 0xfeedbacdU, 0x123456789abcde42ULL })
> + == N { 0xfeedbacdeadbeef8ULL, 0x123456789abcde42ULL }, "");
> +#elif __BYTE_ORDER__ == __ORDER_BIG_ENDIAN__
> +static_assert (bit_cast <J> (K { 0x0feedbacdeadbeefLL, -9103311533965288635LL, 0x0feedbacdeadbeefLL })
> + == J { 0x0feedbacdeadbeefLL, -1011, 2, -0xbacdeadbeLL, -1, -303, 1, 0x0feedbacdeadbeefLL }, "");
> +static_assert (bit_cast <K> (J { 0x0feedbacdeadbeefLL, -1011, 2, -0xbacdeadbeLL, -1, -303, 1, 0x0feedbacdeadbeefLL })
> + == K { 0x0feedbacdeadbeefLL, -9103311533965288635LL, 0x0feedbacdeadbeefLL }, "");
> +static_assert (bit_cast <M> (N { 0xfeedbacdeadbeef8ULL, 0x123456789abcde42ULL })
> + == M { -1, -35, -19, -6, 205, 0xeadbeef8U, 0x123456789abcde42ULL }, "");
> +static_assert (bit_cast <N> (M { -1, -35, -19, -6, 205, 0xeadbeef8U, 0x123456789abcde42ULL })
> + == N { 0xfeedbacdeadbeef8ULL, 0x123456789abcde42ULL }, "");
> +#endif
> +#endif
> --- gcc/testsuite/g++.dg/cpp2a/bit-cast4.C.jj 2020-11-02 13:42:18.711819616 +0100
> +++ gcc/testsuite/g++.dg/cpp2a/bit-cast4.C 2020-11-02 13:42:18.711819616 +0100
> @@ -0,0 +1,44 @@
> +// { dg-do compile { target c++11 } }
> +
> +template <typename To, typename From>
> +constexpr To
> +bit_cast (const From &from)
> +{
> + return __builtin_bit_cast (To, from);
> +}
> +// { dg-error "'__builtin_bit_cast' is not a constant expression because 'U' is a union type" "U" { target *-*-* } 7 }
> +// { dg-error "'__builtin_bit_cast' is not a constant expression because 'const U' is a union type" "const U" { target *-*-* } 7 }
> +// { dg-error "'__builtin_bit_cast' is not a constant expression because 'B' contains a union type" "B" { target *-*-* } 7 }
> +// { dg-error "'__builtin_bit_cast' is not a constant expression because 'char\\\*' is a pointer type" "char ptr" { target *-*-* } 7 }
> +// { dg-error "'__builtin_bit_cast' is not a constant expression because 'const int\\\*' is a pointer type" "const int ptr" { target *-*-* } 7 }
> +// { dg-error "'__builtin_bit_cast' is not a constant expression because 'C' contains a pointer type" "C" { target *-*-* } 7 }
> +// { dg-error "'__builtin_bit_cast' is not a constant expression because 'const C' contains a pointer type" "const C" { target *-*-* } 7 }
> +// { dg-error "'__builtin_bit_cast' is not a constant expression because 'int D::\\\*' is a pointer to member type" "ptrmem 1" { target *-*-* } 7 }
> +// { dg-error "'__builtin_bit_cast' is not a constant expression because 'int \\\(D::\\\*\\\)\\\(\\\) const' is a pointer to member type" "ptrmem 2" { target *-*-* } 7 }
> +// { dg-error "'__builtin_bit_cast' is not a constant expression because 'int \\\(D::\\\*\\\)\\\(\\\)' is a pointer to member type" "ptrmem 3" { target *-*-* } 7 }
> +
> +union U { int u; };
> +struct A { int a; U b; };
> +struct B : public A { int c; };
> +struct C { const int *p; };
> +constexpr int a[] = { 1, 2, 3 };
> +constexpr const int *b = &a[0];
> +constexpr C c = { b };
> +struct D { int d; constexpr int foo () const { return 1; } };
> +constexpr int D::*d = &D::d;
> +constexpr int (D::*e) () const = &D::foo;
> +struct E { __INTPTR_TYPE__ e, f; };
> +constexpr E f = { 1, 2 };
> +constexpr U g { 0 };
> +
> +constexpr auto z = bit_cast <U> (0);
> +constexpr auto y = bit_cast <int> (g);
> +constexpr auto x = bit_cast <B> (a);
> +constexpr auto w = bit_cast <char *> ((__INTPTR_TYPE__) 0);
> +constexpr auto v = bit_cast <__UINTPTR_TYPE__> (b);
> +constexpr auto u = bit_cast <C> ((__INTPTR_TYPE__) 0);
> +constexpr auto t = bit_cast <__INTPTR_TYPE__> (c);
> +constexpr auto s = bit_cast <__INTPTR_TYPE__> (d);
> +constexpr auto r = bit_cast <E> (e);
> +constexpr auto q = bit_cast <int D::*> ((__INTPTR_TYPE__) 0);
> +constexpr auto p = bit_cast <int (D::*) ()> (f);
> --- gcc/testsuite/g++.dg/cpp2a/bit-cast5.C.jj 2020-11-02 13:42:18.711819616 +0100
> +++ gcc/testsuite/g++.dg/cpp2a/bit-cast5.C 2020-11-02 13:42:18.711819616 +0100
> @@ -0,0 +1,69 @@
> +// { dg-do compile { target { c++20 && { ilp32 || lp64 } } } }
> +
> +struct A { signed char a, b, c, d, e, f; };
> +struct B {};
> +struct C { B a, b; short c; B d; };
> +struct D { int a : 4, b : 24, c : 4; };
> +struct E { B a, b; short c; };
> +struct F { B a; signed char b, c; B d; };
> +
> +constexpr bool
> +f1 ()
> +{
> + A a;
> + a.c = 23; a.d = 42;
> + C b = __builtin_bit_cast (C, a); // OK
> + return false;
> +}
> +
> +constexpr bool
> +f2 ()
> +{
> + A a;
> + a.a = 1; a.b = 2; a.c = 3; a.e = 4; a.f = 5;
> + C b = __builtin_bit_cast (C, a); // { dg-error "'__builtin_bit_cast' accessing uninitialized byte at offset 3" }
> + return false;
> +}
> +
> +constexpr bool
> +f3 ()
> +{
> + D a;
> + a.b = 1;
> + F b = __builtin_bit_cast (F, a); // OK
> + return false;
> +}
> +
> +constexpr bool
> +f4 ()
> +{
> + D a;
> + a.b = 1; a.c = 2;
> + E b = __builtin_bit_cast (E, a); // OK
> + return false;
> +}
> +
> +constexpr bool
> +f5 ()
> +{
> + D a;
> + a.b = 1;
> + E b = __builtin_bit_cast (E, a); // { dg-error "'__builtin_bit_cast' accessing uninitialized byte at offset 3" }
> + return false;
> +}
> +
> +constexpr bool
> +f6 ()
> +{
> + D a;
> + a.c = 1;
> + E b = __builtin_bit_cast (E, a); // { dg-error "'__builtin_bit_cast' accessing uninitialized byte at offset 2" }
> + return false;
> +}
> +
> +constexpr bool a = f1 ();
> +constexpr bool b = f2 ();
> +constexpr bool c = f3 ();
> +constexpr bool d = f4 ();
> +constexpr bool e = f5 ();
> +constexpr bool f = f6 ();
>
>
> Jakub
>
Thread overview: 29+ messages
2020-07-18 18:50 [PATCH] c++: " Jakub Jelinek
2020-07-22 14:03 ` Paul Koning
2020-07-30 14:16 ` Jason Merrill
2020-07-30 14:57 ` Jakub Jelinek
2020-07-30 21:59 ` Jason Merrill
2020-07-31 8:19 ` Jakub Jelinek
2020-07-31 9:54 ` Jonathan Wakely
2020-07-31 10:06 ` Jakub Jelinek
2020-07-31 20:28 ` Jason Merrill
2020-08-27 10:06 ` Jakub Jelinek
2020-08-27 10:19 ` Richard Biener
2020-09-02 21:52 ` Jason Merrill
2020-08-27 10:46 ` Jakub Jelinek
2020-08-27 11:06 ` Jonathan Wakely
2020-08-27 11:17 ` Jakub Jelinek
2020-08-27 11:22 ` Jonathan Wakely
2020-08-27 10:59 ` Jonathan Wakely
2020-08-27 20:43 ` Iain Buclaw
2020-11-02 19:21 ` [PATCH] c++: v2: " Jakub Jelinek
2020-11-25 0:31 ` Jeff Law
2020-11-25 9:23 ` Jakub Jelinek
2020-11-25 10:31 ` Jonathan Wakely
2020-11-25 16:24 ` Jakub Jelinek
2020-11-25 16:28 ` Jonathan Wakely
2020-11-25 17:26 ` Jason Merrill [this message]
2020-11-25 18:50 ` Jakub Jelinek
2020-11-26 0:52 ` Jason Merrill
2020-11-26 15:09 ` [PATCH] c++: v3: " Jakub Jelinek
2020-12-03 14:24 ` Jason Merrill