public inbox for gcc-patches@gcc.gnu.org
* [PATCH] c++: Add __builtin_bit_cast to implement std::bit_cast [PR93121]
@ 2020-07-18 18:50 Jakub Jelinek
  2020-07-22 14:03 ` Paul Koning
  2020-07-30 14:16 ` Jason Merrill
  0 siblings, 2 replies; 29+ messages in thread
From: Jakub Jelinek @ 2020-07-18 18:50 UTC (permalink / raw)
  To: Jason Merrill; +Cc: gcc-patches, Jonathan Wakely

Hi!

The following patch adds a __builtin_bit_cast builtin, similarly to
clang or MSVC, which implement std::bit_cast using such a builtin too.
It checks the various std::bit_cast requirements, and when not constexpr
evaluated it acts pretty much like a VIEW_CONVERT_EXPR of the source
argument to the destination type.  The hardest part is obviously the
constexpr evaluation.  I couldn't use the middle-end
native_encode_initializer, because it needs to handle the C++
CONSTRUCTOR_NO_CLEARING semantics vs. the negation of that, value
initialization of missing members if there are any, etc., and needs to
handle bitfields even if they don't have an integral representative
(I've left out PDP11 handling of those; I couldn't figure out how
exactly bitfields are laid out there).

Bootstrapped/regtested on x86_64-linux and i686-linux, ok for trunk?

2020-07-18  Jakub Jelinek  <jakub@redhat.com>

	PR libstdc++/93121
	* c-common.h (enum rid): Add RID_BUILTIN_BIT_CAST.
	* c-common.c (c_common_reswords): Add __builtin_bit_cast.

	* cp-tree.h (cp_build_bit_cast): Declare.
	* cp-tree.def (BIT_CAST_EXPR): New tree code.
	* cp-objcp-common.c (names_builtin_p): Handle RID_BUILTIN_BIT_CAST.
	(cp_common_init_ts): Handle BIT_CAST_EXPR.
	* cxx-pretty-print.c (cxx_pretty_printer::postfix_expression):
	Likewise.
	* parser.c (cp_parser_postfix_expression): Handle
	RID_BUILTIN_BIT_CAST.
	* semantics.c (cp_build_bit_cast): New function.
	* tree.c (cp_tree_equal): Handle BIT_CAST_EXPR.
	(cp_walk_subtrees): Likewise.
	* pt.c (tsubst_copy): Likewise.
	* constexpr.c (check_bit_cast_type, cxx_find_bitfield_repr_type,
	cxx_native_interpret_aggregate, cxx_native_encode_aggregate,
	cxx_eval_bit_cast): New functions.
	(cxx_eval_constant_expression): Handle BIT_CAST_EXPR.
	(potential_constant_expression_1): Likewise.
	* cp-gimplify.c (cp_genericize_r): Likewise.

	* g++.dg/cpp2a/bit-cast1.C: New test.
	* g++.dg/cpp2a/bit-cast2.C: New test.
	* g++.dg/cpp2a/bit-cast3.C: New test.
	* g++.dg/cpp2a/bit-cast4.C: New test.
	* g++.dg/cpp2a/bit-cast5.C: New test.

--- gcc/c-family/c-common.h.jj	2020-07-18 10:48:51.442493640 +0200
+++ gcc/c-family/c-common.h	2020-07-18 10:52:46.175021398 +0200
@@ -164,7 +164,7 @@ enum rid
   RID_HAS_NOTHROW_COPY,        RID_HAS_TRIVIAL_ASSIGN,
   RID_HAS_TRIVIAL_CONSTRUCTOR, RID_HAS_TRIVIAL_COPY,
   RID_HAS_TRIVIAL_DESTRUCTOR,  RID_HAS_UNIQUE_OBJ_REPRESENTATIONS,
-  RID_HAS_VIRTUAL_DESTRUCTOR,
+  RID_HAS_VIRTUAL_DESTRUCTOR,  RID_BUILTIN_BIT_CAST,
   RID_IS_ABSTRACT,             RID_IS_AGGREGATE,
   RID_IS_BASE_OF,              RID_IS_CLASS,
   RID_IS_EMPTY,                RID_IS_ENUM,
--- gcc/c-family/c-common.c.jj	2020-07-18 10:48:51.381494543 +0200
+++ gcc/c-family/c-common.c	2020-07-18 10:52:46.178021354 +0200
@@ -373,6 +373,7 @@ const struct c_common_resword c_common_r
   { "__auto_type",	RID_AUTO_TYPE,	D_CONLY },
   { "__bases",          RID_BASES, D_CXXONLY },
   { "__builtin_addressof", RID_ADDRESSOF, D_CXXONLY },
+  { "__builtin_bit_cast", RID_BUILTIN_BIT_CAST, D_CXXONLY },
   { "__builtin_call_with_static_chain",
     RID_BUILTIN_CALL_WITH_STATIC_CHAIN, D_CONLY },
   { "__builtin_choose_expr", RID_CHOOSE_EXPR, D_CONLY },
--- gcc/cp/cp-tree.h.jj	2020-07-18 10:48:51.573491701 +0200
+++ gcc/cp/cp-tree.h	2020-07-18 10:52:46.180021324 +0200
@@ -7312,6 +7312,8 @@ extern tree finish_builtin_launder		(loc
 						 tsubst_flags_t);
 extern tree cp_build_vec_convert		(tree, location_t, tree,
 						 tsubst_flags_t);
+extern tree cp_build_bit_cast			(location_t, tree, tree,
+						 tsubst_flags_t);
 extern void start_lambda_scope			(tree);
 extern void record_lambda_scope			(tree);
 extern void record_null_lambda_scope		(tree);
--- gcc/cp/cp-tree.def.jj	2020-07-18 10:48:51.569491760 +0200
+++ gcc/cp/cp-tree.def	2020-07-18 10:52:46.180021324 +0200
@@ -460,6 +460,9 @@ DEFTREECODE (UNARY_RIGHT_FOLD_EXPR, "una
 DEFTREECODE (BINARY_LEFT_FOLD_EXPR, "binary_left_fold_expr", tcc_expression, 3)
 DEFTREECODE (BINARY_RIGHT_FOLD_EXPR, "binary_right_fold_expr", tcc_expression, 3)
 
+/* Represents the __builtin_bit_cast (type, expr) expression.
+   The type is in TREE_TYPE, expression in TREE_OPERAND (bitcast, 0).  */
+DEFTREECODE (BIT_CAST_EXPR, "bit_cast_expr", tcc_expression, 1)
 
 /** C++ extensions. */
 
--- gcc/cp/cp-objcp-common.c.jj	2020-07-18 10:48:51.515492559 +0200
+++ gcc/cp/cp-objcp-common.c	2020-07-18 10:52:46.181021310 +0200
@@ -394,6 +394,7 @@ names_builtin_p (const char *name)
     case RID_BUILTIN_HAS_ATTRIBUTE:
     case RID_BUILTIN_SHUFFLE:
     case RID_BUILTIN_LAUNDER:
+    case RID_BUILTIN_BIT_CAST:
     case RID_OFFSETOF:
     case RID_HAS_NOTHROW_ASSIGN:
     case RID_HAS_NOTHROW_CONSTRUCTOR:
@@ -498,6 +499,7 @@ cp_common_init_ts (void)
   MARK_TS_EXP (ALIGNOF_EXPR);
   MARK_TS_EXP (ARROW_EXPR);
   MARK_TS_EXP (AT_ENCODE_EXPR);
+  MARK_TS_EXP (BIT_CAST_EXPR);
   MARK_TS_EXP (CAST_EXPR);
   MARK_TS_EXP (CONST_CAST_EXPR);
   MARK_TS_EXP (CTOR_INITIALIZER);
--- gcc/cp/cxx-pretty-print.c.jj	2020-07-18 10:48:51.601491287 +0200
+++ gcc/cp/cxx-pretty-print.c	2020-07-18 10:52:46.181021310 +0200
@@ -655,6 +655,15 @@ cxx_pretty_printer::postfix_expression (
       pp_right_paren (this);
       break;
 
+    case BIT_CAST_EXPR:
+      pp_cxx_ws_string (this, "__builtin_bit_cast");
+      pp_left_paren (this);
+      type_id (TREE_TYPE (t));
+      pp_comma (this);
+      expression (TREE_OPERAND (t, 0));
+      pp_right_paren (this);
+      break;
+
     case EMPTY_CLASS_EXPR:
       type_id (TREE_TYPE (t));
       pp_left_paren (this);
--- gcc/cp/parser.c.jj	2020-07-18 10:49:16.446123759 +0200
+++ gcc/cp/parser.c	2020-07-18 10:52:46.186021236 +0200
@@ -7217,6 +7217,32 @@ cp_parser_postfix_expression (cp_parser
 				     tf_warning_or_error);
       }
 
+    case RID_BUILTIN_BIT_CAST:
+      {
+	tree expression;
+	tree type;
+	/* Consume the `__builtin_bit_cast' token.  */
+	cp_lexer_consume_token (parser->lexer);
+	/* Look for the opening `('.  */
+	matching_parens parens;
+	parens.require_open (parser);
+	location_t type_location
+	  = cp_lexer_peek_token (parser->lexer)->location;
+	/* Parse the type-id.  */
+	{
+	  type_id_in_expr_sentinel s (parser);
+	  type = cp_parser_type_id (parser);
+	}
+	/* Look for the `,'.  */
+	cp_parser_require (parser, CPP_COMMA, RT_COMMA);
+	/* Now, parse the assignment-expression.  */
+	expression = cp_parser_assignment_expression (parser);
+	/* Look for the closing `)'.  */
+	parens.require_close (parser);
+	return cp_build_bit_cast (type_location, type, expression,
+				  tf_warning_or_error);
+      }
+
     default:
       {
 	tree type;
--- gcc/cp/semantics.c.jj	2020-07-18 10:48:51.737489274 +0200
+++ gcc/cp/semantics.c	2020-07-18 10:52:46.187021221 +0200
@@ -10469,4 +10469,82 @@ cp_build_vec_convert (tree arg, location
   return build_call_expr_internal_loc (loc, IFN_VEC_CONVERT, type, 1, arg);
 }
 
+/* Finish __builtin_bit_cast (type, arg).  */
+
+tree
+cp_build_bit_cast (location_t loc, tree type, tree arg,
+		   tsubst_flags_t complain)
+{
+  if (error_operand_p (type))
+    return error_mark_node;
+  if (!dependent_type_p (type))
+    {
+      if (!complete_type_or_maybe_complain (type, NULL_TREE, complain))
+	return error_mark_node;
+      if (TREE_CODE (type) == ARRAY_TYPE)
+	{
+	  /* std::bit_cast for destination ARRAY_TYPE is not possible,
+	     as functions may not return an array, so don't bother trying
+	     to support this (and then deal with VLAs etc.).  */
+	  error_at (loc, "%<__builtin_bit_cast%> destination type %qT "
+			 "is an array type", type);
+	  return error_mark_node;
+	}
+      if (!trivially_copyable_p (type))
+	{
+	  error_at (loc, "%<__builtin_bit_cast%> destination type %qT "
+			 "is not trivially copyable", type);
+	  return error_mark_node;
+	}
+    }
+
+  if (error_operand_p (arg))
+    return error_mark_node;
+
+  if (REFERENCE_REF_P (arg))
+    arg = TREE_OPERAND (arg, 0);
+
+  if (error_operand_p (arg))
+    return error_mark_node;
+
+  if (!type_dependent_expression_p (arg))
+    {
+      if (TREE_CODE (TREE_TYPE (arg)) == ARRAY_TYPE)
+	{
+	  /* Don't perform array-to-pointer conversion.  */
+	  arg = mark_rvalue_use (arg, loc, true);
+	  if (!complete_type_or_maybe_complain (TREE_TYPE (arg), arg, complain))
+	    return error_mark_node;
+	}
+      else
+	arg = decay_conversion (arg, complain);
+
+      if (error_operand_p (arg))
+	return error_mark_node;
+
+      arg = convert_from_reference (arg);
+      if (!trivially_copyable_p (TREE_TYPE (arg)))
+	{
+	  error_at (cp_expr_loc_or_loc (arg, loc),
+		    "%<__builtin_bit_cast%> source type %qT "
+		    "is not trivially copyable", TREE_TYPE (arg));
+	  return error_mark_node;
+	}
+      if (!dependent_type_p (type)
+	  && !cp_tree_equal (TYPE_SIZE_UNIT (type),
+			     TYPE_SIZE_UNIT (TREE_TYPE (arg))))
+	{
+	  error_at (loc, "%<__builtin_bit_cast%> source size %qE "
+			 "not equal to destination type size %qE",
+			 TYPE_SIZE_UNIT (TREE_TYPE (arg)),
+			 TYPE_SIZE_UNIT (type));
+	  return error_mark_node;
+	}
+    }
+
+  tree ret = build_min (BIT_CAST_EXPR, type, arg);
+  SET_EXPR_LOCATION (ret, loc);
+  return ret;
+}
+
 #include "gt-cp-semantics.h"
--- gcc/cp/tree.c.jj	2020-07-18 10:48:51.738489259 +0200
+++ gcc/cp/tree.c	2020-07-18 10:52:46.188021206 +0200
@@ -3908,6 +3908,11 @@ cp_tree_equal (tree t1, tree t2)
 	return false;
       return true;
 
+    case BIT_CAST_EXPR:
+      if (!same_type_p (TREE_TYPE (t1), TREE_TYPE (t2)))
+	return false;
+      break;
+
     default:
       break;
     }
@@ -5112,6 +5117,7 @@ cp_walk_subtrees (tree *tp, int *walk_su
     case CONST_CAST_EXPR:
     case DYNAMIC_CAST_EXPR:
     case IMPLICIT_CONV_EXPR:
+    case BIT_CAST_EXPR:
       if (TREE_TYPE (*tp))
 	WALK_SUBTREE (TREE_TYPE (*tp));
 
--- gcc/cp/pt.c.jj	2020-07-18 10:49:16.451123685 +0200
+++ gcc/cp/pt.c	2020-07-18 10:52:46.190021177 +0200
@@ -16592,6 +16592,13 @@ tsubst_copy (tree t, tree args, tsubst_f
 	return build1 (code, type, op0);
       }
 
+    case BIT_CAST_EXPR:
+      {
+	tree type = tsubst (TREE_TYPE (t), args, complain, in_decl);
+	tree op0 = tsubst_copy (TREE_OPERAND (t, 0), args, complain, in_decl);
+	return cp_build_bit_cast (EXPR_LOCATION (t), type, op0, complain);
+      }
+
     case SIZEOF_EXPR:
       if (PACK_EXPANSION_P (TREE_OPERAND (t, 0))
 	  || ARGUMENT_PACK_P (TREE_OPERAND (t, 0)))
--- gcc/cp/constexpr.c.jj	2020-07-18 10:48:51.493492885 +0200
+++ gcc/cp/constexpr.c	2020-07-18 13:26:57.711003134 +0200
@@ -3853,6 +3853,808 @@ cxx_eval_bit_field_ref (const constexpr_
   return error_mark_node;
 }
 
+/* Helper for cxx_eval_bit_cast.
+   Check the [bit.cast]/3 rules: bit_cast is constexpr only if the To and
+   From types and the types of all their subobjects have is_union_v<T>,
+   is_pointer_v<T>, is_member_pointer_v<T> and is_volatile_v<T> all false
+   and have no non-static data members of reference type.  */
+
+static bool
+check_bit_cast_type (const constexpr_ctx *ctx, location_t loc, tree type,
+		     tree orig_type)
+{
+  if (TREE_CODE (type) == UNION_TYPE)
+    {
+      if (!ctx->quiet)
+	{
+	  if (type == orig_type)
+	    error_at (loc, "%qs is not a constant expression because %qT is "
+			   "a union type", "__builtin_bit_cast", type);
+	  else
+	    error_at (loc, "%qs is not a constant expression because %qT "
+			   "contains a union type", "__builtin_bit_cast",
+		      orig_type);
+	}
+      return true;
+    }
+  if (TREE_CODE (type) == POINTER_TYPE)
+    {
+      if (!ctx->quiet)
+	{
+	  if (type == orig_type)
+	    error_at (loc, "%qs is not a constant expression because %qT is "
+			   "a pointer type", "__builtin_bit_cast", type);
+	  else
+	    error_at (loc, "%qs is not a constant expression because %qT "
+			   "contains a pointer type", "__builtin_bit_cast",
+		      orig_type);
+	}
+      return true;
+    }
+  if (TREE_CODE (type) == REFERENCE_TYPE)
+    {
+      if (!ctx->quiet)
+	{
+	  if (type == orig_type)
+	    error_at (loc, "%qs is not a constant expression because %qT is "
+			   "a reference type", "__builtin_bit_cast", type);
+	  else
+	    error_at (loc, "%qs is not a constant expression because %qT "
+			   "contains a reference type", "__builtin_bit_cast",
+		      orig_type);
+	}
+      return true;
+    }
+  if (TYPE_PTRMEM_P (type))
+    {
+      if (!ctx->quiet)
+	{
+	  if (type == orig_type)
+	    error_at (loc, "%qs is not a constant expression because %qT is "
+			   "a pointer to member type", "__builtin_bit_cast",
+		      type);
+	  else
+	    error_at (loc, "%qs is not a constant expression because %qT "
+			   "contains a pointer to member type",
+		      "__builtin_bit_cast", orig_type);
+	}
+      return true;
+    }
+  if (TYPE_VOLATILE (type))
+    {
+      if (!ctx->quiet)
+	{
+	  if (type == orig_type)
+	    error_at (loc, "%qs is not a constant expression because %qT is "
+			   "volatile", "__builtin_bit_cast", type);
+	  else
+	    error_at (loc, "%qs is not a constant expression because %qT "
+			   "contains a volatile subobject",
+		      "__builtin_bit_cast", orig_type);
+	}
+      return true;
+    }
+  if (TREE_CODE (type) == RECORD_TYPE)
+    for (tree field = TYPE_FIELDS (type); field; field = DECL_CHAIN (field))
+      if (TREE_CODE (field) == FIELD_DECL
+	  && check_bit_cast_type (ctx, loc, TREE_TYPE (field), orig_type))
+	return true;
+  return false;
+}
+
+/* Try to find a type whose byte size is smaller than or equal to LEN bytes
+   and larger than or equal to FIELDSIZE bytes, with underlying mode
+   precision/size a multiple of BITS_PER_UNIT.  As native_{interpret,encode}_int
+   works in terms of machine modes, we can't just use build_nonstandard_integer_type.  */
+
+static tree
+cxx_find_bitfield_repr_type (int fieldsize, int len)
+{
+  machine_mode mode;
+  for (int pass = 0; pass < 2; pass++)
+    {
+      enum mode_class mclass = pass ? MODE_PARTIAL_INT : MODE_INT;
+      FOR_EACH_MODE_IN_CLASS (mode, mclass)
+	if (known_ge (GET_MODE_SIZE (mode), fieldsize)
+	    && known_eq (GET_MODE_PRECISION (mode),
+			 GET_MODE_BITSIZE (mode))
+	    && known_le (GET_MODE_SIZE (mode), len))
+	  {
+	    tree ret = c_common_type_for_mode (mode, 1);
+	    if (ret && TYPE_MODE (ret) == mode)
+	      return ret;
+	  }
+    }
+
+  for (int i = 0; i < NUM_INT_N_ENTS; i ++)
+    if (int_n_enabled_p[i]
+	&& int_n_data[i].bitsize >= (unsigned) (BITS_PER_UNIT * fieldsize)
+	&& int_n_trees[i].unsigned_type)
+      {
+	tree ret = int_n_trees[i].unsigned_type;
+	mode = TYPE_MODE (ret);
+	if (known_ge (GET_MODE_SIZE (mode), fieldsize)
+	    && known_eq (GET_MODE_PRECISION (mode),
+			 GET_MODE_BITSIZE (mode))
+	    && known_le (GET_MODE_SIZE (mode), len))
+	  return ret;
+      }
+
+  return NULL_TREE;
+}
+
+/* Attempt to interpret an aggregate of TYPE from LEN bytes encoded in
+   target byte order starting at PTR + OFF.  Bits set in MASK indicate
+   that the corresponding value bits are indeterminate.  */
+
+static tree
+cxx_native_interpret_aggregate (tree type, const unsigned char *ptr, int off,
+				int len, unsigned char *mask,
+				const constexpr_ctx *ctx, bool *non_constant_p,
+				location_t loc)
+{
+  vec<constructor_elt, va_gc> *elts = NULL;
+  if (TREE_CODE (type) == ARRAY_TYPE)
+    {
+      HOST_WIDE_INT eltsz = int_size_in_bytes (TREE_TYPE (type));
+      if (eltsz < 0 || eltsz > len || TYPE_DOMAIN (type) == NULL_TREE)
+	return error_mark_node;
+
+      HOST_WIDE_INT cnt = 0;
+      if (TYPE_MAX_VALUE (TYPE_DOMAIN (type)))
+	{
+	  if (!tree_fits_shwi_p (TYPE_MAX_VALUE (TYPE_DOMAIN (type))))
+	    return error_mark_node;
+	  cnt = tree_to_shwi (TYPE_MAX_VALUE (TYPE_DOMAIN (type))) + 1;
+	}
+      if (eltsz == 0)
+	cnt = 0;
+      HOST_WIDE_INT pos = 0;
+      for (HOST_WIDE_INT i = 0; i < cnt; i++, pos += eltsz)
+	{
+	  tree v = error_mark_node;
+	  if (pos >= len || pos + eltsz > len)
+	    return error_mark_node;
+	  if (can_native_interpret_type_p (TREE_TYPE (type)))
+	    {
+	      v = native_interpret_expr (TREE_TYPE (type),
+					 ptr + off + pos, eltsz);
+	      if (v == NULL_TREE)
+		return error_mark_node;
+	      for (int i = 0; i < eltsz; i++)
+		if (mask[off + pos + i])
+		  {
+		    if (!ctx->quiet)
+		      error_at (loc, "%qs accessing uninitialized byte at "
+				     "offset %d",
+				"__builtin_bit_cast", (int) (off + pos + i));
+		    *non_constant_p = true;
+		    return error_mark_node;
+		  }
+	    }
+	  else if (TREE_CODE (TREE_TYPE (type)) == RECORD_TYPE
+		   || TREE_CODE (TREE_TYPE (type)) == ARRAY_TYPE)
+	    v = cxx_native_interpret_aggregate (TREE_TYPE (type),
+						ptr, off + pos, eltsz,
+						mask, ctx,
+						non_constant_p, loc);
+	  if (v == error_mark_node)
+	    return error_mark_node;
+	  CONSTRUCTOR_APPEND_ELT (elts, size_int (i), v);
+	}
+      return build_constructor (type, elts);
+    }
+  gcc_assert (TREE_CODE (type) == RECORD_TYPE);
+  for (tree field = next_initializable_field (TYPE_FIELDS (type));
+       field; field = next_initializable_field (DECL_CHAIN (field)))
+    {
+      tree fld = field;
+      HOST_WIDE_INT bitoff = 0, pos = 0, sz = 0;
+      int diff = 0;
+      tree v = error_mark_node;
+      if (DECL_BIT_FIELD (field))
+	{
+	  fld = DECL_BIT_FIELD_REPRESENTATIVE (field);
+	  if (fld && INTEGRAL_TYPE_P (TREE_TYPE (fld)))
+	    {
+	      poly_int64 bitoffset;
+	      poly_uint64 field_offset, fld_offset;
+	      if (poly_int_tree_p (DECL_FIELD_OFFSET (field), &field_offset)
+		  && poly_int_tree_p (DECL_FIELD_OFFSET (fld), &fld_offset))
+		bitoffset = (field_offset - fld_offset) * BITS_PER_UNIT;
+	      else
+		bitoffset = 0;
+	      bitoffset += (tree_to_uhwi (DECL_FIELD_BIT_OFFSET (field))
+			    - tree_to_uhwi (DECL_FIELD_BIT_OFFSET (fld)));
+	      diff = (TYPE_PRECISION (TREE_TYPE (fld))
+		      - TYPE_PRECISION (TREE_TYPE (field)));
+	      if (!bitoffset.is_constant (&bitoff)
+		  || bitoff < 0
+		  || bitoff > diff)
+		return error_mark_node;
+	    }
+	  else
+	    {
+	      if (!tree_fits_uhwi_p (DECL_FIELD_BIT_OFFSET (field)))
+		return error_mark_node;
+	      int fieldsize = TYPE_PRECISION (TREE_TYPE (field));
+	      int bpos = tree_to_uhwi (DECL_FIELD_BIT_OFFSET (field));
+	      bpos %= BITS_PER_UNIT;
+	      fieldsize += bpos;
+	      fieldsize += BITS_PER_UNIT - 1;
+	      fieldsize /= BITS_PER_UNIT;
+	      tree repr_type = cxx_find_bitfield_repr_type (fieldsize, len);
+	      if (repr_type == NULL_TREE)
+		return error_mark_node;
+	      sz = int_size_in_bytes (repr_type);
+	      if (sz < 0 || sz > len)
+		return error_mark_node;
+	      pos = int_byte_position (field);
+	      if (pos < 0 || pos > len || pos + fieldsize > len)
+		return error_mark_node;
+	      HOST_WIDE_INT rpos;
+	      if (pos + sz <= len)
+		rpos = pos;
+	      else
+		{
+		  rpos = len - sz;
+		  gcc_assert (rpos <= pos);
+		}
+	      bitoff = (HOST_WIDE_INT) (pos - rpos) * BITS_PER_UNIT + bpos;
+	      pos = rpos;
+	      diff = (TYPE_PRECISION (repr_type)
+		      - TYPE_PRECISION (TREE_TYPE (field)));
+	      v = native_interpret_expr (repr_type, ptr + off + pos, sz);
+	      if (v == NULL_TREE)
+		return error_mark_node;
+	      fld = NULL_TREE;
+	    }
+	}
+
+      if (fld)
+	{
+	  sz = int_size_in_bytes (TREE_TYPE (fld));
+	  if (sz < 0 || sz > len)
+	    return error_mark_node;
+	  tree byte_pos = byte_position (fld);
+	  if (!tree_fits_shwi_p (byte_pos))
+	    return error_mark_node;
+	  pos = tree_to_shwi (byte_pos);
+	  if (pos < 0 || pos > len || pos + sz > len)
+	    return error_mark_node;
+	}
+      if (fld == NULL_TREE)
+	/* Already handled above.  */;
+      else if (can_native_interpret_type_p (TREE_TYPE (fld)))
+	{
+	  v = native_interpret_expr (TREE_TYPE (fld),
+				     ptr + off + pos, sz);
+	  if (v == NULL_TREE)
+	    return error_mark_node;
+	  if (fld == field)
+	    for (int i = 0; i < sz; i++)
+	      if (mask[off + pos + i])
+		{
+		  if (!ctx->quiet)
+		    error_at (loc,
+			      "%qs accessing uninitialized byte at offset %d",
+			      "__builtin_bit_cast", (int) (off + pos + i));
+		  *non_constant_p = true;
+		  return error_mark_node;
+		}
+	}
+      else if (TREE_CODE (TREE_TYPE (fld)) == RECORD_TYPE
+	       || TREE_CODE (TREE_TYPE (fld)) == ARRAY_TYPE)
+	v = cxx_native_interpret_aggregate (TREE_TYPE (fld),
+					    ptr, off + pos, sz, mask,
+					    ctx, non_constant_p, loc);
+      if (v == error_mark_node)
+	return error_mark_node;
+      if (fld != field)
+	{
+	  if (TREE_CODE (v) != INTEGER_CST)
+	    return error_mark_node;
+
+	  /* FIXME: Figure out how to handle PDP endian bitfields.  */
+	  if (BYTES_BIG_ENDIAN != WORDS_BIG_ENDIAN)
+	    return error_mark_node;
+	  if (!BYTES_BIG_ENDIAN)
+	    v = wide_int_to_tree (TREE_TYPE (field),
+				  wi::lrshift (wi::to_wide (v), bitoff));
+	  else
+	    v = wide_int_to_tree (TREE_TYPE (field),
+				  wi::lrshift (wi::to_wide (v),
+					       diff - bitoff));
+	  int bpos = bitoff % BITS_PER_UNIT;
+	  int bpos_byte = bitoff / BITS_PER_UNIT;
+	  int fieldsize = TYPE_PRECISION (TREE_TYPE (field));
+	  fieldsize += bpos;
+	  int epos = fieldsize % BITS_PER_UNIT;
+	  fieldsize += BITS_PER_UNIT - 1;
+	  fieldsize /= BITS_PER_UNIT;
+	  int bad = -1;
+	  if (bpos)
+	    {
+	      int msk;
+	      if (!BYTES_BIG_ENDIAN)
+		{
+		  msk = (1 << bpos) - 1;
+		  if (fieldsize == 1 && epos != 0)
+		    msk |= ~((1 << epos) - 1);
+		}
+	      else
+		{
+		  msk = ~((1 << (BITS_PER_UNIT - bpos)) - 1);
+		  if (fieldsize == 1 && epos != 0)
+		    msk |= (1 << (BITS_PER_UNIT - epos)) - 1;
+		}
+	      if (mask[off + pos + bpos_byte] & ~msk)
+		bad = off + pos + bpos_byte;
+	    }
+	  if (epos && (fieldsize > 1 || bpos == 0))
+	    {
+	      int msk;
+	      if (!BYTES_BIG_ENDIAN)
+		msk = (1 << epos) - 1;
+	      else
+		msk = ~((1 << (BITS_PER_UNIT - epos)) - 1);
+	      if (mask[off + pos + bpos_byte + fieldsize - 1] & msk)
+		bad = off + pos + bpos_byte + fieldsize - 1;
+	    }
+	  for (int i = 0; bad < 0 && i < fieldsize - (bpos != 0) - (epos != 0);
+	       i++)
+	    if (mask[off + pos + bpos_byte + (bpos != 0) + i])
+	      bad = off + pos + bpos_byte + (bpos != 0) + i;
+	  if (bad >= 0)
+	    {
+	      if (!ctx->quiet)
+		error_at (loc, "%qs accessing uninitialized byte at offset %d",
+			  "__builtin_bit_cast", bad);
+	      *non_constant_p = true;
+	      return error_mark_node;
+	    }
+	}
+      CONSTRUCTOR_APPEND_ELT (elts, field, v);
+    }
+  return build_constructor (type, elts);
+}
+
+/* Similar to native_encode_initializer, but handle value-initialized members
+   without initializers in !CONSTRUCTOR_NO_CLEARING CONSTRUCTORs, handle VCEs,
+   NON_LVALUE_EXPRs and nops, and in addition to filling PTR, also fill MASK
+   with 1 bits wherever the value is indeterminate.  On the other hand, it is
+   simplified by OFF not being present (it is always assumed to be 0) and by
+   never extracting anything partial later, always the whole object.  */
+
+int
+cxx_native_encode_aggregate (tree init, unsigned char *ptr, int len,
+			     unsigned char *mask, const constexpr_ctx *ctx,
+			     bool *non_constant_p, bool *overflow_p)
+{
+  if (init == NULL_TREE)
+    return 0;
+
+  STRIP_NOPS (init);
+  switch (TREE_CODE (init))
+    {
+    case VIEW_CONVERT_EXPR:
+    case NON_LVALUE_EXPR:
+      return cxx_native_encode_aggregate (TREE_OPERAND (init, 0), ptr, len,
+					  mask, ctx, non_constant_p,
+					  overflow_p);
+    default:
+      int r;
+      r = native_encode_expr (init, ptr, len, 0);
+      memset (mask, 0, r);
+      return r;
+    case CONSTRUCTOR:
+      tree type = TREE_TYPE (init);
+      HOST_WIDE_INT total_bytes = int_size_in_bytes (type);
+      if (total_bytes < 0)
+	return 0;
+      if (TREE_CODE (type) == ARRAY_TYPE)
+	{
+	  HOST_WIDE_INT min_index;
+	  unsigned HOST_WIDE_INT cnt;
+	  HOST_WIDE_INT curpos = 0, fieldsize, valueinit = -1;
+	  constructor_elt *ce;
+
+	  if (TYPE_DOMAIN (type) == NULL_TREE
+	      || !tree_fits_shwi_p (TYPE_MIN_VALUE (TYPE_DOMAIN (type))))
+	    return 0;
+
+	  fieldsize = int_size_in_bytes (TREE_TYPE (type));
+	  if (fieldsize <= 0)
+	    return 0;
+
+	  min_index = tree_to_shwi (TYPE_MIN_VALUE (TYPE_DOMAIN (type)));
+	  memset (ptr, '\0', MIN (total_bytes, len));
+
+	  FOR_EACH_VEC_SAFE_ELT (CONSTRUCTOR_ELTS (init), cnt, ce)
+	    {
+	      tree val = ce->value;
+	      tree index = ce->index;
+	      HOST_WIDE_INT pos = curpos, count = 0;
+	      bool full = false;
+	      if (index && TREE_CODE (index) == RANGE_EXPR)
+		{
+		  if (!tree_fits_shwi_p (TREE_OPERAND (index, 0))
+		      || !tree_fits_shwi_p (TREE_OPERAND (index, 1)))
+		    return 0;
+		  pos = (tree_to_shwi (TREE_OPERAND (index, 0)) - min_index)
+			* fieldsize;
+		  count = (tree_to_shwi (TREE_OPERAND (index, 1))
+			   - tree_to_shwi (TREE_OPERAND (index, 0)));
+		}
+	      else if (index)
+		{
+		  if (!tree_fits_shwi_p (index))
+		    return 0;
+		  pos = (tree_to_shwi (index) - min_index) * fieldsize;
+		}
+
+	      if (!CONSTRUCTOR_NO_CLEARING (init) && curpos != pos)
+		{
+		  if (valueinit == -1)
+		    {
+		      tree t = build_value_init (TREE_TYPE (type),
+						 tf_warning_or_error);
+		      t = cxx_eval_constant_expression (ctx, t, false,
+							non_constant_p,
+							overflow_p);
+		      if (!cxx_native_encode_aggregate (t, ptr + curpos,
+							fieldsize,
+							mask + curpos,
+							ctx, non_constant_p,
+							overflow_p))
+			return 0;
+		      valueinit = curpos;
+		      curpos += fieldsize;
+		    }
+		  while (curpos != pos)
+		    {
+		      memcpy (ptr + curpos, ptr + valueinit, fieldsize);
+		      memcpy (mask + curpos, mask + valueinit, fieldsize);
+		      curpos += fieldsize;
+		    }
+		}
+
+	      curpos = pos;
+	      if (val)
+		do
+		  {
+		    gcc_assert (curpos >= 0
+				&& (curpos + fieldsize
+				    <= (HOST_WIDE_INT) len));
+		    if (full)
+		      {
+			memcpy (ptr + curpos, ptr + pos, fieldsize);
+			memcpy (mask + curpos, mask + pos, fieldsize);
+		      }
+		    else if (!cxx_native_encode_aggregate (val,
+							   ptr + curpos,
+							   fieldsize,
+							   mask + curpos,
+							   ctx, non_constant_p,
+							   overflow_p))
+		      return 0;
+		    else
+		      {
+			full = true;
+			pos = curpos;
+		      }
+		    curpos += fieldsize;
+		  }
+		while (count-- != 0);
+	    }
+	  return MIN (total_bytes, len);
+	}
+      else if (TREE_CODE (type) == RECORD_TYPE)
+	{
+	  unsigned HOST_WIDE_INT cnt;
+	  constructor_elt *ce;
+	  tree fld_base = TYPE_FIELDS (type);
+
+	  memset (ptr, '\0', MIN (total_bytes, len));
+	  FOR_EACH_VEC_SAFE_ELT (CONSTRUCTOR_ELTS (init), cnt, ce)
+	    {
+	      tree field = ce->index;
+	      tree val = ce->value;
+	      HOST_WIDE_INT pos, fieldsize;
+	      unsigned HOST_WIDE_INT bpos = 0, epos = 0;
+
+	      if (field == NULL_TREE)
+		return 0;
+
+	      if (!CONSTRUCTOR_NO_CLEARING (init))
+		{
+		  tree fld = next_initializable_field (fld_base);
+		  fld_base = DECL_CHAIN (fld);
+		  if (fld == NULL_TREE)
+		    return 0;
+		  if (fld != field)
+		    {
+		      cnt--;
+		      field = fld;
+		      val = build_value_init (TREE_TYPE (fld),
+					      tf_warning_or_error);
+		      val = cxx_eval_constant_expression (ctx, val, false,
+							  non_constant_p,
+							  overflow_p);
+		    }
+		}
+
+	      pos = int_byte_position (field);
+	      gcc_assert ((HOST_WIDE_INT) len >= pos);
+
+	      if (TREE_CODE (TREE_TYPE (field)) == ARRAY_TYPE
+		  && TYPE_DOMAIN (TREE_TYPE (field))
+		  && ! TYPE_MAX_VALUE (TYPE_DOMAIN (TREE_TYPE (field))))
+		return 0;
+	      if (DECL_SIZE_UNIT (field) == NULL_TREE
+		  || !tree_fits_shwi_p (DECL_SIZE_UNIT (field)))
+		return 0;
+	      fieldsize = tree_to_shwi (DECL_SIZE_UNIT (field));
+	      if (fieldsize == 0)
+		continue;
+
+	      if (DECL_BIT_FIELD (field))
+		{
+		  if (!tree_fits_uhwi_p (DECL_FIELD_BIT_OFFSET (field)))
+		    return 0;
+		  fieldsize = TYPE_PRECISION (TREE_TYPE (field));
+		  bpos = tree_to_uhwi (DECL_FIELD_BIT_OFFSET (field));
+		  bpos %= BITS_PER_UNIT;
+		  fieldsize += bpos;
+		  epos = fieldsize % BITS_PER_UNIT;
+		  fieldsize += BITS_PER_UNIT - 1;
+		  fieldsize /= BITS_PER_UNIT;
+		}
+
+	      gcc_assert (pos + fieldsize > 0);
+
+	      if (val == NULL_TREE)
+		continue;
+
+	      if (DECL_BIT_FIELD (field))
+		{
+		  /* FIXME: Handle PDP endian.  */
+		  if (BYTES_BIG_ENDIAN != WORDS_BIG_ENDIAN)
+		    return 0;
+
+		  if (TREE_CODE (val) != INTEGER_CST)
+		    return 0;
+
+		  tree repr = DECL_BIT_FIELD_REPRESENTATIVE (field);
+		  tree repr_type = NULL_TREE;
+		  HOST_WIDE_INT rpos = 0;
+		  if (repr && INTEGRAL_TYPE_P (TREE_TYPE (repr)))
+		    {
+		      rpos = int_byte_position (repr);
+		      repr_type = TREE_TYPE (repr);
+		    }
+		  else
+		    {
+		      repr_type = cxx_find_bitfield_repr_type (fieldsize, len);
+		      if (repr_type == NULL_TREE)
+			return 0;
+		      HOST_WIDE_INT repr_size = int_size_in_bytes (repr_type);
+		      gcc_assert (repr_size > 0 && repr_size <= len);
+		      if (pos + repr_size <= len)
+			rpos = pos;
+		      else
+			{
+			  rpos = len - repr_size;
+			  gcc_assert (rpos <= pos);
+			}
+		    }
+
+		  if (rpos > pos)
+		    return 0;
+		  wide_int w = wi::to_wide (val, TYPE_PRECISION (repr_type));
+		  int diff = (TYPE_PRECISION (repr_type)
+			      - TYPE_PRECISION (TREE_TYPE (field)));
+		  HOST_WIDE_INT bitoff = (pos - rpos) * BITS_PER_UNIT + bpos;
+		  if (!BYTES_BIG_ENDIAN)
+		    w = wi::lshift (w, bitoff);
+		  else
+		    w = wi::lshift (w, diff - bitoff);
+		  val = wide_int_to_tree (repr_type, w);
+
+		  unsigned char buf[MAX_BITSIZE_MODE_ANY_INT
+				    / BITS_PER_UNIT + 1];
+		  int l = native_encode_expr (val, buf, sizeof buf, 0);
+		  if (l * BITS_PER_UNIT != TYPE_PRECISION (repr_type))
+		    return 0;
+
+		  /* If the bitfield does not start at byte boundary, handle
+		     the partial byte at the start.  */
+		  if (bpos)
+		    {
+		      gcc_assert (pos >= 0 && len >= 1);
+		      if (!BYTES_BIG_ENDIAN)
+			{
+			  int msk = (1 << bpos) - 1;
+			  buf[pos - rpos] &= ~msk;
+			  buf[pos - rpos] |= ptr[pos] & msk;
+			  if (fieldsize > 1 || epos == 0)
+			    mask[pos] &= msk;
+			  else
+			    mask[pos] &= (msk | ~((1 << epos) - 1));
+			}
+		      else
+			{
+			  int msk = (1 << (BITS_PER_UNIT - bpos)) - 1;
+			  buf[pos - rpos] &= msk;
+			  buf[pos - rpos] |= ptr[pos] & ~msk;
+			  if (fieldsize > 1 || epos == 0)
+			    mask[pos] &= ~msk;
+			  else
+			    mask[pos] &= (~msk
+					  | ((1 << (BITS_PER_UNIT - epos))
+					     - 1));
+			}
+		    }
+		  /* If the bitfield does not end at byte boundary, handle
+		     the partial byte at the end.  */
+		  if (epos)
+		    {
+		      gcc_assert (pos + fieldsize <= (HOST_WIDE_INT) len);
+		      if (!BYTES_BIG_ENDIAN)
+			{
+			  int msk = (1 << epos) - 1;
+			  buf[pos - rpos + fieldsize - 1] &= msk;
+			  buf[pos - rpos + fieldsize - 1]
+			    |= ptr[pos + fieldsize - 1] & ~msk;
+			  if (fieldsize > 1 || bpos == 0)
+			    mask[pos + fieldsize - 1] &= ~msk;
+			}
+		       else
+			{
+			  int msk = (1 << (BITS_PER_UNIT - epos)) - 1;
+			  buf[pos - rpos + fieldsize - 1] &= ~msk;
+			  buf[pos - rpos + fieldsize - 1]
+			    |= ptr[pos + fieldsize - 1] & msk;
+			  if (fieldsize > 1 || bpos == 0)
+			    mask[pos + fieldsize - 1] &= msk;
+			}
+		    }
+		  gcc_assert (pos >= 0
+			      && (pos + fieldsize
+				  <= (HOST_WIDE_INT) len));
+		  memcpy (ptr + pos, buf + (pos - rpos), fieldsize);
+		  if (fieldsize > (bpos != 0) + (epos != 0))
+		    memset (mask + pos + (bpos != 0), 0,
+			    fieldsize - (bpos != 0) - (epos != 0));
+		  continue;
+		}
+
+	      gcc_assert (pos >= 0
+			  && (pos + fieldsize <= (HOST_WIDE_INT) len));
+	      if (!cxx_native_encode_aggregate (val, ptr + pos,
+						fieldsize, mask + pos,
+						ctx, non_constant_p,
+						overflow_p))
+		return 0;
+	    }
+	  return MIN (total_bytes, len);
+	}
+      return 0;
+    }
+}
+
+/* Subroutine of cxx_eval_constant_expression.
+   Attempt to evaluate a BIT_CAST_EXPR.  */
+
+static tree
+cxx_eval_bit_cast (const constexpr_ctx *ctx, tree t, bool *non_constant_p,
+		   bool *overflow_p)
+{
+  if (check_bit_cast_type (ctx, EXPR_LOCATION (t), TREE_TYPE (t),
+			   TREE_TYPE (t))
+      || check_bit_cast_type (ctx, cp_expr_loc_or_loc (TREE_OPERAND (t, 0),
+						       EXPR_LOCATION (t)),
+			      TREE_TYPE (TREE_OPERAND (t, 0)),
+			      TREE_TYPE (TREE_OPERAND (t, 0))))
+    {
+      *non_constant_p = true;
+      return t;
+    }
+
+  tree op = cxx_eval_constant_expression (ctx, TREE_OPERAND (t, 0), false,
+					  non_constant_p, overflow_p);
+  if (*non_constant_p)
+    return t;
+
+  location_t loc = EXPR_LOCATION (t);
+  if (BITS_PER_UNIT != 8 || CHAR_BIT != 8)
+    {
+      if (!ctx->quiet)
+	sorry_at (loc, "%qs cannot be constant evaluated on the target",
+		       "__builtin_bit_cast");
+      *non_constant_p = true;
+      return t;
+    }
+
+  if (!tree_fits_shwi_p (TYPE_SIZE_UNIT (TREE_TYPE (t))))
+    {
+      if (!ctx->quiet)
+	sorry_at (loc, "%qs cannot be constant evaluated because the "
+		       "type is too large", "__builtin_bit_cast");
+      *non_constant_p = true;
+      return t;
+    }
+
+  HOST_WIDE_INT len = tree_to_shwi (TYPE_SIZE_UNIT (TREE_TYPE (t)));
+  if (len < 0 || (int) len != len)
+    {
+      if (!ctx->quiet)
+	sorry_at (loc, "%qs cannot be constant evaluated because the "
+		       "type is too large", "__builtin_bit_cast");
+      *non_constant_p = true;
+      return t;
+    }
+
+  unsigned char buf[64];
+  unsigned char *ptr, *mask;
+  size_t alen = (size_t) len * 2;
+  if (alen <= sizeof (buf))
+    ptr = buf;
+  else
+    ptr = XNEWVEC (unsigned char, alen);
+  mask = ptr + (size_t) len;
+  /* At the beginning consider everything indeterminate.  */
+  memset (mask, ~0, (size_t) len);
+
+  if (cxx_native_encode_aggregate (op, ptr, len, mask, ctx, non_constant_p,
+				   overflow_p) != len)
+    {
+      if (!ctx->quiet)
+	sorry_at (loc, "%qs cannot be constant evaluated because the "
+		       "argument cannot be encoded", "__builtin_bit_cast");
+      *non_constant_p = true;
+      return t;
+    }
+
+  if (can_native_interpret_type_p (TREE_TYPE (t)))
+    if (tree r = native_interpret_expr (TREE_TYPE (t), ptr, len))
+      {
+	for (int i = 0; i < len; i++)
+	  if (mask[i])
+	    {
+	      if (!ctx->quiet)
+		error_at (loc, "%qs accessing uninitialized byte at offset %d",
+			  "__builtin_bit_cast", i);
+	      *non_constant_p = true;
+	      r = t;
+	      break;
+	    }
+	if (ptr != buf)
+	  XDELETE (ptr);
+	return r;
+      }
+
+  if (TREE_CODE (TREE_TYPE (t)) == RECORD_TYPE)
+    {
+      tree r = cxx_native_interpret_aggregate (TREE_TYPE (t), ptr, 0, len,
+					       mask, ctx, non_constant_p, loc);
+      if (r != error_mark_node)
+	{
+	  if (ptr != buf)
+	    XDELETE (ptr);
+	  return r;
+	}
+      if (*non_constant_p)
+	return t;
+    }
+
+  if (!ctx->quiet)
+    sorry_at (loc, "%qs cannot be constant evaluated because the "
+		   "argument cannot be interpreted", "__builtin_bit_cast");
+  *non_constant_p = true;
+  return t;
+}
+
 /* Subroutine of cxx_eval_constant_expression.
    Evaluate a short-circuited logical expression T in the context
    of a given constexpr CALL.  BAILOUT_VALUE is the value for
@@ -6537,6 +7339,10 @@ cxx_eval_constant_expression (const cons
       *non_constant_p = true;
       return t;
 
+    case BIT_CAST_EXPR:
+      r = cxx_eval_bit_cast (ctx, t, non_constant_p, overflow_p);
+      break;
+
     default:
       if (STATEMENT_CODE_P (TREE_CODE (t)))
 	{
@@ -8315,6 +9121,9 @@ potential_constant_expression_1 (tree t,
     case ANNOTATE_EXPR:
       return RECUR (TREE_OPERAND (t, 0), rval);
 
+    case BIT_CAST_EXPR:
+      return RECUR (TREE_OPERAND (t, 0), rval);
+
     /* Coroutine await, yield and return expressions are not.  */
     case CO_AWAIT_EXPR:
     case CO_YIELD_EXPR:
--- gcc/cp/cp-gimplify.c.jj	2020-07-18 10:48:51.514492573 +0200
+++ gcc/cp/cp-gimplify.c	2020-07-18 10:52:46.191021162 +0200
@@ -1867,6 +1867,11 @@ cp_genericize_r (tree *stmt_p, int *walk
 	}
       break;
 
+    case BIT_CAST_EXPR:
+      *stmt_p = build1_loc (EXPR_LOCATION (stmt), VIEW_CONVERT_EXPR,
+			    TREE_TYPE (stmt), TREE_OPERAND (stmt, 0));
+      break;
+
     default:
       if (IS_TYPE_OR_DECL_P (stmt))
 	*walk_subtrees = 0;
--- gcc/testsuite/g++.dg/cpp2a/bit-cast1.C.jj	2020-07-18 10:52:46.191021162 +0200
+++ gcc/testsuite/g++.dg/cpp2a/bit-cast1.C	2020-07-18 10:52:46.191021162 +0200
@@ -0,0 +1,47 @@
+// { dg-do compile }
+
+struct S { short a, b; };
+struct T { float a[16]; };
+struct U { int b[16]; };
+
+#if __SIZEOF_FLOAT__ == __SIZEOF_INT__
+int
+f1 (float x)
+{
+  return __builtin_bit_cast (int, x);
+}
+#endif
+
+#if 2 * __SIZEOF_SHORT__ == __SIZEOF_INT__
+S
+f2 (int x)
+{
+  return __builtin_bit_cast (S, x);
+}
+
+int
+f3 (S x)
+{
+  return __builtin_bit_cast (int, x);
+}
+#endif
+
+#if __SIZEOF_FLOAT__ == __SIZEOF_INT__
+U
+f4 (T &x)
+{
+  return __builtin_bit_cast (U, x);
+}
+
+T
+f5 (int (&x)[16])
+{
+  return __builtin_bit_cast (T, x);
+}
+#endif
+
+int
+f6 ()
+{
+  return __builtin_bit_cast (unsigned char, (signed char) 0);
+}
--- gcc/testsuite/g++.dg/cpp2a/bit-cast2.C.jj	2020-07-18 10:52:46.191021162 +0200
+++ gcc/testsuite/g++.dg/cpp2a/bit-cast2.C	2020-07-18 10:52:46.191021162 +0200
@@ -0,0 +1,57 @@
+// { dg-do compile }
+
+struct S { ~S (); int s; };
+S s;
+struct V;				// { dg-message "forward declaration of 'struct V'" }
+extern V v;				// { dg-error "'v' has incomplete type" }
+extern V *p;
+struct U { int a, b; };
+U u;
+
+void
+foo (int *q)
+{
+  __builtin_bit_cast (int, s);		// { dg-error "'__builtin_bit_cast' source type 'S' is not trivially copyable" }
+  __builtin_bit_cast (S, 0);		// { dg-error "'__builtin_bit_cast' destination type 'S' is not trivially copyable" }
+  __builtin_bit_cast (int &, q);	// { dg-error "'__builtin_bit_cast' destination type 'int&' is not trivially copyable" }
+  __builtin_bit_cast (int [1], 0);	// { dg-error "'__builtin_bit_cast' destination type \[^\n\r]* is an array type" }
+  __builtin_bit_cast (V, 0);		// { dg-error "invalid use of incomplete type 'struct V'" }
+  __builtin_bit_cast (int, v);
+  __builtin_bit_cast (int, *p);		// { dg-error "invalid use of incomplete type 'struct V'" }
+  __builtin_bit_cast (U, 0);		// { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
+  __builtin_bit_cast (int, u);		// { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
+}
+
+template <int N>
+void
+bar (int *q)
+{
+  __builtin_bit_cast (int, s);		// { dg-error "'__builtin_bit_cast' source type 'S' is not trivially copyable" }
+  __builtin_bit_cast (S, 0);		// { dg-error "'__builtin_bit_cast' destination type 'S' is not trivially copyable" }
+  __builtin_bit_cast (int &, q);	// { dg-error "'__builtin_bit_cast' destination type 'int&' is not trivially copyable" }
+  __builtin_bit_cast (int [1], 0);	// { dg-error "'__builtin_bit_cast' destination type \[^\n\r]* is an array type" }
+  __builtin_bit_cast (V, 0);		// { dg-error "invalid use of incomplete type 'struct V'" }
+  __builtin_bit_cast (int, *p);		// { dg-error "invalid use of incomplete type 'struct V'" }
+  __builtin_bit_cast (U, 0);		// { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
+  __builtin_bit_cast (int, u);		// { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
+}
+
+template <typename T1, typename T2, typename T3, typename T4>
+void
+baz (T3 s, T4 *p, T1 *q)
+{
+  __builtin_bit_cast (int, s);		// { dg-error "'__builtin_bit_cast' source type 'S' is not trivially copyable" }
+  __builtin_bit_cast (T3, 0);		// { dg-error "'__builtin_bit_cast' destination type 'S' is not trivially copyable" }
+  __builtin_bit_cast (T1 &, q);		// { dg-error "'__builtin_bit_cast' destination type 'int&' is not trivially copyable" }
+  __builtin_bit_cast (T2, 0);		// { dg-error "'__builtin_bit_cast' destination type \[^\n\r]* is an array type" }
+  __builtin_bit_cast (T4, 0);		// { dg-error "invalid use of incomplete type 'struct V'" }
+  __builtin_bit_cast (int, *p);		// { dg-error "invalid use of incomplete type 'struct V'" }
+  __builtin_bit_cast (U, (T1) 0);	// { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
+  __builtin_bit_cast (T1, u);		// { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
+}
+
+void
+qux (int *q)
+{
+  baz <int, int [1], S, V> (s, p, q);
+}
--- gcc/testsuite/g++.dg/cpp2a/bit-cast3.C.jj	2020-07-18 10:52:46.191021162 +0200
+++ gcc/testsuite/g++.dg/cpp2a/bit-cast3.C	2020-07-18 11:16:31.003976285 +0200
@@ -0,0 +1,229 @@
+// { dg-do compile { target c++11 } }
+
+template <typename To, typename From>
+constexpr To
+bit_cast (const From &from)
+{
+  return __builtin_bit_cast (To, from);
+}
+
+template <typename To, typename From>
+constexpr bool
+check (const From &from)
+{
+  return bit_cast <From> (bit_cast <To> (from)) == from;
+}
+
+struct A
+{
+  int a, b, c;
+  constexpr bool operator == (const A &x) const
+  {
+    return x.a == a && x.b == b && x.c == c;
+  }
+};
+
+struct B
+{
+  unsigned a[3];
+  constexpr bool operator == (const B &x) const
+  {
+    return x.a[0] == a[0] && x.a[1] == a[1] && x.a[2] == a[2];
+  }
+};
+
+struct C
+{
+  char a[2][3][2];
+  constexpr bool operator == (const C &x) const
+  {
+    return x.a[0][0][0] == a[0][0][0]
+	   && x.a[0][0][1] == a[0][0][1]
+	   && x.a[0][1][0] == a[0][1][0]
+	   && x.a[0][1][1] == a[0][1][1]
+	   && x.a[0][2][0] == a[0][2][0]
+	   && x.a[0][2][1] == a[0][2][1]
+	   && x.a[1][0][0] == a[1][0][0]
+	   && x.a[1][0][1] == a[1][0][1]
+	   && x.a[1][1][0] == a[1][1][0]
+	   && x.a[1][1][1] == a[1][1][1]
+	   && x.a[1][2][0] == a[1][2][0]
+	   && x.a[1][2][1] == a[1][2][1];
+  }
+};
+
+struct D
+{
+  int a, b;
+  constexpr bool operator == (const D &x) const
+  {
+    return x.a == a && x.b == b;
+  }
+};
+
+struct E {};
+struct F { char c, d, e, f; };
+struct G : public D, E, F
+{
+  int g;
+  constexpr bool operator == (const G &x) const
+  {
+    return x.a == a && x.b == b && x.c == c && x.d == d
+	   && x.e == e && x.f == f && x.g == g;
+  }
+};
+
+struct H
+{
+  int a, b[2], c;
+  constexpr bool operator == (const H &x) const
+  {
+    return x.a == a && x.b[0] == b[0] && x.b[1] == b[1] && x.c == c;
+  }
+};
+
+#if __SIZEOF_INT__ == 4
+struct I
+{
+  int a;
+  int b : 3;
+  int c : 24;
+  int d : 5;
+  int e;
+  constexpr bool operator == (const I &x) const
+  {
+    return x.a == a && x.b == b && x.c == c && x.d == d && x.e == e;
+  }
+};
+#endif
+
+#if __SIZEOF_INT__ == 4 && __SIZEOF_LONG_LONG__ == 8
+struct J
+{
+  long long int a, b : 11, c : 3, d : 37, e : 1, f : 10, g : 2, h;
+  constexpr bool operator == (const J &x) const
+  {
+    return x.a == a && x.b == b && x.c == c && x.d == d && x.e == e
+	   && x.f == f && x.g == g && x.h == h;
+  }
+};
+
+struct K
+{
+  long long int a, b, c;
+  constexpr bool operator == (const K &x) const
+  {
+    return x.a == a && x.b == b && x.c == c;
+  }
+};
+
+struct M
+{
+  signed a : 6, b : 7, c : 6, d : 5;
+  unsigned char e;
+  unsigned int f;
+  long long int g;
+  constexpr bool operator == (const M &x) const
+  {
+    return x.a == a && x.b == b && x.c == c && x.d == d && x.e == e
+	   && x.f == f && x.g == g;
+  }
+};
+
+struct N
+{
+  unsigned long long int a, b;
+  constexpr bool operator == (const N &x) const
+  {
+    return x.a == a && x.b == b;
+  }
+};
+#endif
+
+static_assert (check <unsigned int> (0), "");
+static_assert (check <long long int> (0xdeadbeeffeedbac1ULL), "");
+static_assert (check <signed char> ((unsigned char) 42), "");
+static_assert (check <char> ((unsigned char) 42), "");
+static_assert (check <unsigned char> ((unsigned char) 42), "");
+static_assert (check <signed char> ((signed char) 42), "");
+static_assert (check <char> ((signed char) 42), "");
+static_assert (check <unsigned char> ((signed char) 42), "");
+static_assert (check <signed char> ((char) 42), "");
+static_assert (check <char> ((char) 42), "");
+static_assert (check <unsigned char> ((char) 42), "");
+#if __SIZEOF_INT__ == __SIZEOF_FLOAT__
+static_assert (check <int> (2.5f), "");
+static_assert (check <unsigned int> (136.5f), "");
+#endif
+#if __SIZEOF_LONG_LONG__ == __SIZEOF_DOUBLE__
+static_assert (check <long long> (2.5), "");
+static_assert (check <long long unsigned> (123456.75), "");
+#endif
+
+static_assert (check <B> (A{ 1, 2, 3 }), "");
+static_assert (check <A> (B{ 4, 5, 6 }), "");
+
+#if __SIZEOF_INT__ == 4
+static_assert (check <C> (A{ 7, 8, 9 }), "");
+static_assert (check <C> (B{ 10, 11, 12 }), "");
+static_assert (check <A> (C{ { { { 13, 14 }, { 15, 16 }, { 17, 18 } },
+			       { { 19, 20 }, { 21, 22 }, { 23, 24 } } } }), "");
+constexpr unsigned char c[] = { 1, 2, 3, 4 };
+#if __BYTE_ORDER__ == __ORDER_LITTLE_ENDIAN__
+static_assert (bit_cast <unsigned int> (c) == 0x04030201U, "");
+#elif __BYTE_ORDER__ == __ORDER_BIG_ENDIAN__
+static_assert (bit_cast <unsigned int> (c) == 0x01020304U, "");
+#endif
+
+#if __cplusplus >= 201703L
+static_assert (check <G> (H { 0x12345678, { 0x23456789, 0x5a876543 }, 0x3ba78654 }), "");
+#endif
+constexpr int d[] = { 0x12345678, 0x23456789, 0x5a876543, 0x3ba78654 };
+static_assert (bit_cast <G> (d) == bit_cast <G> (H { 0x12345678, { 0x23456789, 0x5a876543 }, 0x3ba78654 }), "");
+
+#if __BYTE_ORDER__ == __ORDER_LITTLE_ENDIAN__
+static_assert (bit_cast <I> (A { 0x7efa3412, 0x5a876543, 0x1eeffeed })
+	       == I { 0x7efa3412, 3, 0x50eca8, 0xb, 0x1eeffeed }, "");
+static_assert (bit_cast <A> (I { 0x7efa3412, 3, 0x50eca8, 0xb, 0x1eeffeed })
+	       == A { 0x7efa3412, 0x5a876543, 0x1eeffeed }, "");
+#elif __BYTE_ORDER__ == __ORDER_BIG_ENDIAN__
+static_assert (bit_cast <I> (A { 0x7efa3412, 0x5a876543, 0x1eeffeed })
+	       == I { 0x7efa3412, 2, -0x2bc4d6, 0x3, 0x1eeffeed }, "");
+static_assert (bit_cast <A> (I { 0x7efa3412, 2, -0x2bc4d6, 0x3, 0x1eeffeed })
+	       == A { 0x7efa3412, 0x5a876543, 0x1eeffeed }, "");
+#endif
+#endif
+
+#if 2 * __SIZEOF_INT__ == __SIZEOF_LONG_LONG__ && __SIZEOF_INT__ >= 4
+constexpr unsigned long long a = 0xdeadbeeffee1deadULL;
+constexpr unsigned b[] = { 0xfeedbacU, 0xbeeffeedU };
+#if __BYTE_ORDER__ == __ORDER_LITTLE_ENDIAN__
+static_assert (bit_cast <D> (a) == D { int (0xfee1deadU), int (0xdeadbeefU) }, "");
+static_assert (bit_cast <unsigned long long> (b) == 0xbeeffeed0feedbacULL, "");
+#elif __BYTE_ORDER__ == __ORDER_BIG_ENDIAN__
+static_assert (bit_cast <D> (a) == D { int (0xdeadbeefU), int (0xfee1deadU) }, "");
+static_assert (bit_cast <unsigned long long> (b) == 0x0feedbacbeeffeedULL, "");
+#endif
+#endif
+
+#if __SIZEOF_INT__ == 4 && __SIZEOF_LONG_LONG__ == 8
+#if __BYTE_ORDER__ == __ORDER_LITTLE_ENDIAN__
+static_assert (bit_cast <J> (K { 0x0feedbacdeadbeefLL, 7862463375103529997LL, 0x0feedbacdeadbeefLL })
+	       == J { 0x0feedbacdeadbeefLL, -1011, 2, -0xbacdeadbeLL, -1, -303, 1, 0x0feedbacdeadbeefLL }, "");
+static_assert (bit_cast <K> (J { 0x0feedbacdeadbeefLL, -1011, 2, -0xbacdeadbeLL, -1, -303, 1, 0x0feedbacdeadbeefLL })
+	       == K { 0x0feedbacdeadbeefLL, 7862463375103529997LL, 0x0feedbacdeadbeefLL }, "");
+static_assert (bit_cast <M> (N { 0xfeedbacdeadbeef8ULL, 0x123456789abcde42ULL })
+	       == M { -8, 59, 31, -5, 234, 0xfeedbacdU, 0x123456789abcde42ULL }, "");
+static_assert (bit_cast <N> (M { -8, 59, 31, -5, 234, 0xfeedbacdU, 0x123456789abcde42ULL })
+	       == N { 0xfeedbacdeadbeef8ULL, 0x123456789abcde42ULL }, "");
+#elif __BYTE_ORDER__ == __ORDER_BIG_ENDIAN__
+static_assert (bit_cast <J> (K { 0x0feedbacdeadbeefLL, -9103311533965288635LL, 0x0feedbacdeadbeefLL })
+	       == J { 0x0feedbacdeadbeefLL, -1011, 2, -0xbacdeadbeLL, -1, -303, 1, 0x0feedbacdeadbeefLL }, "");
+static_assert (bit_cast <K> (J { 0x0feedbacdeadbeefLL, -1011, 2, -0xbacdeadbeLL, -1, -303, 1, 0x0feedbacdeadbeefLL })
+	       == K { 0x0feedbacdeadbeefLL, -9103311533965288635LL, 0x0feedbacdeadbeefLL }, "");
+static_assert (bit_cast <M> (N { 0xfeedbacdeadbeef8ULL, 0x123456789abcde42ULL })
+	       == M { -1, -35, -19, -6, 205, 0xeadbeef8U, 0x123456789abcde42ULL }, "");
+static_assert (bit_cast <N> (M { -1, -35, -19, -6, 205, 0xeadbeef8U, 0x123456789abcde42ULL })
+	       == N { 0xfeedbacdeadbeef8ULL, 0x123456789abcde42ULL }, "");
+#endif
+#endif
--- gcc/testsuite/g++.dg/cpp2a/bit-cast4.C.jj	2020-07-18 10:52:46.191021162 +0200
+++ gcc/testsuite/g++.dg/cpp2a/bit-cast4.C	2020-07-18 10:52:46.191021162 +0200
@@ -0,0 +1,45 @@
+// { dg-do compile { target c++11 } }
+
+template <typename To, typename From>
+constexpr To
+bit_cast (const From &from)
+{
+  return __builtin_bit_cast (To, from);
+}
+// { dg-error "'__builtin_bit_cast' is not constant expression because 'U' is a union type" "" { target *-*-* } 7 }
+// { dg-error "'__builtin_bit_cast' is not constant expression because 'const U' is a union type" "" { target *-*-* } 7 }
+// { dg-error "'__builtin_bit_cast' is not constant expression because 'B' contains a union type" "" { target *-*-* } 7 }
+// { dg-error "'__builtin_bit_cast' is not constant expression because 'char\\\*' is a pointer type" "" { target *-*-* } 7 }
+// { dg-error "'__builtin_bit_cast' is not constant expression because 'const int\\\* const' is a pointer type" "" { target *-*-* } 7 }
+// { dg-error "'__builtin_bit_cast' is not constant expression because 'C' contains a pointer type" "" { target *-*-* } 7 }
+// { dg-error "'__builtin_bit_cast' is not constant expression because 'const C' contains a pointer type" "" { target *-*-* } 7 }
+// { dg-error "'__builtin_bit_cast' is not constant expression because 'int D::\\\* const' is a pointer to member type" "" { target *-*-* } 7 }
+// { dg-error "'__builtin_bit_cast' is not constant expression because 'int \\\(D::\\\* const\\\)\\\(\\\) const' is a pointer to member type" "" { target *-*-* } 7 }
+// { dg-error "'__builtin_bit_cast' is not constant expression because 'int D::\\\*' is a pointer to member type" "" { target *-*-* } 7 }
+// { dg-error "'__builtin_bit_cast' is not constant expression because 'int \\\(D::\\\*\\\)\\\(\\\)' is a pointer to member type" "" { target *-*-* } 7 }
+
+union U { int u; };
+struct A { int a; U b; };
+struct B : public A { int c; };
+struct C { const int *p; };
+constexpr int a[] = { 1, 2, 3 };
+constexpr const int *b = &a[0];
+constexpr C c = { b };
+struct D { int d; constexpr int foo () const { return 1; } };
+constexpr int D::*d = &D::d;
+constexpr int (D::*e) () const = &D::foo;
+struct E { __INTPTR_TYPE__ e, f; };
+constexpr E f = { 1, 2 };
+constexpr U g { 0 };
+
+constexpr auto z = bit_cast <U> (0);
+constexpr auto y = bit_cast <int> (g);
+constexpr auto x = bit_cast <B> (a);
+constexpr auto w = bit_cast <char *> ((__INTPTR_TYPE__) 0);
+constexpr auto v = bit_cast <__UINTPTR_TYPE__> (b);
+constexpr auto u = bit_cast <C> ((__INTPTR_TYPE__) 0);
+constexpr auto t = bit_cast <__INTPTR_TYPE__> (c);
+constexpr auto s = bit_cast <__INTPTR_TYPE__> (d);
+constexpr auto r = bit_cast <E> (e);
+constexpr auto q = bit_cast <int D::*> ((__INTPTR_TYPE__) 0);
+constexpr auto p = bit_cast <int (D::*) ()> (f);
--- gcc/testsuite/g++.dg/cpp2a/bit-cast5.C.jj	2020-07-18 10:52:46.192021147 +0200
+++ gcc/testsuite/g++.dg/cpp2a/bit-cast5.C	2020-07-18 10:52:46.191021162 +0200
@@ -0,0 +1,69 @@
+// { dg-do compile { target { c++20 && { ilp32 || lp64 } } } }
+
+struct A { signed char a, b, c, d, e, f; };
+struct B {};
+struct C { B a, b; short c; B d; };
+struct D { int a : 4, b : 24, c : 4; };
+struct E { B a, b; short c; };
+struct F { B a; signed char b, c; B d; };
+
+constexpr bool
+f1 ()
+{
+  A a;
+  a.c = 23; a.d = 42;
+  C b = __builtin_bit_cast (C, a); // OK
+  return false;
+}
+
+constexpr bool
+f2 ()
+{
+  A a;
+  a.a = 1; a.b = 2; a.c = 3; a.e = 4; a.f = 5;
+  C b = __builtin_bit_cast (C, a);	// { dg-error "'__builtin_bit_cast' accessing uninitialized byte at offset 3" }
+  return false;
+}
+
+constexpr bool
+f3 ()
+{
+  D a;
+  a.b = 1;
+  F b = __builtin_bit_cast (F, a); // OK
+  return false;
+}
+
+constexpr bool
+f4 ()
+{
+  D a;
+  a.b = 1; a.c = 2;
+  E b = __builtin_bit_cast (E, a); // OK
+  return false;
+}
+
+constexpr bool
+f5 ()
+{
+  D a;
+  a.b = 1;
+  E b = __builtin_bit_cast (E, a);	// { dg-error "'__builtin_bit_cast' accessing uninitialized byte at offset 3" }
+  return false;
+}
+
+constexpr bool
+f6 ()
+{
+  D a;
+  a.c = 1;
+  E b = __builtin_bit_cast (E, a);	// { dg-error "'__builtin_bit_cast' accessing uninitialized byte at offset 2" }
+  return false;
+}
+
+constexpr bool a = f1 ();
+constexpr bool b = f2 ();
+constexpr bool c = f3 ();
+constexpr bool d = f4 ();
+constexpr bool e = f5 ();
+constexpr bool f = f6 ();

	Jakub


^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: [PATCH] c++: Add __builtin_bit_cast to implement std::bit_cast [PR93121]
  2020-07-18 18:50 [PATCH] c++: Add __builtin_bit_cast to implement std::bit_cast [PR93121] Jakub Jelinek
@ 2020-07-22 14:03 ` Paul Koning
  2020-07-30 14:16 ` Jason Merrill
  1 sibling, 0 replies; 29+ messages in thread
From: Paul Koning @ 2020-07-22 14:03 UTC (permalink / raw)
  To: Jakub Jelinek; +Cc: Jason Merrill, Jonathan Wakely, gcc-patches



> On Jul 18, 2020, at 2:50 PM, Jakub Jelinek via Gcc-patches <gcc-patches@gcc.gnu.org> wrote:
> 
> Hi!
> 
> The following patch adds a __builtin_bit_cast builtin, similarly to
> clang or MSVC, which implement std::bit_cast using such a builtin too.
> It checks the various std::bit_cast requirements; when not constexpr
> evaluated it acts pretty much like VIEW_CONVERT_EXPR of the source argument
> to the destination type, and the hardest part is obviously the constexpr
> evaluation.  I couldn't use the middle-end native_encode_initializer,
> because it needs to handle the C++ CONSTRUCTOR_NO_CLEARING vs. negation of
> that, value initialization of missing members if there are any etc., and
> needs to handle bitfields even if they don't have an integral representative
> (I've left out PDP11 handling of those; I couldn't figure out how exactly
> bitfields are laid out there).

It seems to be spelled out in builtins.c function c_readstr.

	paul



^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: [PATCH] c++: Add __builtin_bit_cast to implement std::bit_cast [PR93121]
  2020-07-18 18:50 [PATCH] c++: Add __builtin_bit_cast to implement std::bit_cast [PR93121] Jakub Jelinek
  2020-07-22 14:03 ` Paul Koning
@ 2020-07-30 14:16 ` Jason Merrill
  2020-07-30 14:57   ` Jakub Jelinek
  2020-11-02 19:21   ` [PATCH] c++: v2: " Jakub Jelinek
  1 sibling, 2 replies; 29+ messages in thread
From: Jason Merrill @ 2020-07-30 14:16 UTC (permalink / raw)
  To: Jakub Jelinek; +Cc: gcc-patches, Jonathan Wakely

On 7/18/20 2:50 PM, Jakub Jelinek wrote:
> Hi!
> 
> The following patch adds a __builtin_bit_cast builtin, similarly to
> clang or MSVC, which implement std::bit_cast using such a builtin too.

Great!

> It checks the various std::bit_cast requirements; when not constexpr
> evaluated it acts pretty much like VIEW_CONVERT_EXPR of the source argument
> to the destination type, and the hardest part is obviously the constexpr
> evaluation.  I couldn't use the middle-end native_encode_initializer,
> because it needs to handle the C++ CONSTRUCTOR_NO_CLEARING vs. negation of
> that

CONSTRUCTOR_NO_CLEARING isn't specific to C++.

> value initialization of missing members if there are any etc.

Doesn't native_encode_initializer handle omitted initializers already? 
Any elements for which the usual zero-initialization is wrong should 
already have explicit initializers by the time we get here.

> needs to handle bitfields even if they don't have an integral representative

Can we use TYPE_MODE for this?

These all sound like things that could be improved in
native_encode_initializer rather than duplicating a complex function.

> Bootstrapped/regtested on x86_64-linux and i686-linux, ok for trunk?
> 
> 2020-07-18  Jakub Jelinek  <jakub@redhat.com>
> 
> 	PR libstdc++/93121
> 	* c-common.h (enum rid): Add RID_BUILTIN_BIT_CAST.
> 	* c-common.c (c_common_reswords): Add __builtin_bit_cast.
> 
> 	* cp-tree.h (cp_build_bit_cast): Declare.
> 	* cp-tree.def (BIT_CAST_EXPR): New tree code.
> 	* cp-objcp-common.c (names_builtin_p): Handle RID_BUILTIN_BIT_CAST.
> 	(cp_common_init_ts): Handle BIT_CAST_EXPR.
> 	* cxx-pretty-print.c (cxx_pretty_printer::postfix_expression):
> 	Likewise.
> 	* parser.c (cp_parser_postfix_expression): Handle
> 	RID_BUILTIN_BIT_CAST.
> 	* semantics.c (cp_build_bit_cast): New function.
> 	* tree.c (cp_tree_equal): Handle BIT_CAST_EXPR.
> 	(cp_walk_subtrees): Likewise.
> 	* pt.c (tsubst_copy): Likewise.
> 	* constexpr.c (check_bit_cast_type, cxx_find_bitfield_repr_type,
> 	cxx_native_interpret_aggregate, cxx_native_encode_aggregate,
> 	cxx_eval_bit_cast): New functions.
> 	(cxx_eval_constant_expression): Handle BIT_CAST_EXPR.
> 	(potential_constant_expression_1): Likewise.
> 	* cp-gimplify.c (cp_genericize_r): Likewise.
> 
> 	* g++.dg/cpp2a/bit-cast1.C: New test.
> 	* g++.dg/cpp2a/bit-cast2.C: New test.
> 	* g++.dg/cpp2a/bit-cast3.C: New test.
> 	* g++.dg/cpp2a/bit-cast4.C: New test.
> 	* g++.dg/cpp2a/bit-cast5.C: New test.
> 
> --- gcc/c-family/c-common.h.jj	2020-07-18 10:48:51.442493640 +0200
> +++ gcc/c-family/c-common.h	2020-07-18 10:52:46.175021398 +0200
> @@ -164,7 +164,7 @@ enum rid
>     RID_HAS_NOTHROW_COPY,        RID_HAS_TRIVIAL_ASSIGN,
>     RID_HAS_TRIVIAL_CONSTRUCTOR, RID_HAS_TRIVIAL_COPY,
>     RID_HAS_TRIVIAL_DESTRUCTOR,  RID_HAS_UNIQUE_OBJ_REPRESENTATIONS,
> -  RID_HAS_VIRTUAL_DESTRUCTOR,
> +  RID_HAS_VIRTUAL_DESTRUCTOR,  RID_BUILTIN_BIT_CAST,
>     RID_IS_ABSTRACT,             RID_IS_AGGREGATE,
>     RID_IS_BASE_OF,              RID_IS_CLASS,
>     RID_IS_EMPTY,                RID_IS_ENUM,
> --- gcc/c-family/c-common.c.jj	2020-07-18 10:48:51.381494543 +0200
> +++ gcc/c-family/c-common.c	2020-07-18 10:52:46.178021354 +0200
> @@ -373,6 +373,7 @@ const struct c_common_resword c_common_r
>     { "__auto_type",	RID_AUTO_TYPE,	D_CONLY },
>     { "__bases",          RID_BASES, D_CXXONLY },
>     { "__builtin_addressof", RID_ADDRESSOF, D_CXXONLY },
> +  { "__builtin_bit_cast", RID_BUILTIN_BIT_CAST, D_CXXONLY },
>     { "__builtin_call_with_static_chain",
>       RID_BUILTIN_CALL_WITH_STATIC_CHAIN, D_CONLY },
>     { "__builtin_choose_expr", RID_CHOOSE_EXPR, D_CONLY },
> --- gcc/cp/cp-tree.h.jj	2020-07-18 10:48:51.573491701 +0200
> +++ gcc/cp/cp-tree.h	2020-07-18 10:52:46.180021324 +0200
> @@ -7312,6 +7312,8 @@ extern tree finish_builtin_launder		(loc
>   						 tsubst_flags_t);
>   extern tree cp_build_vec_convert		(tree, location_t, tree,
>   						 tsubst_flags_t);
> +extern tree cp_build_bit_cast			(location_t, tree, tree,
> +						 tsubst_flags_t);
>   extern void start_lambda_scope			(tree);
>   extern void record_lambda_scope			(tree);
>   extern void record_null_lambda_scope		(tree);
> --- gcc/cp/cp-tree.def.jj	2020-07-18 10:48:51.569491760 +0200
> +++ gcc/cp/cp-tree.def	2020-07-18 10:52:46.180021324 +0200
> @@ -460,6 +460,9 @@ DEFTREECODE (UNARY_RIGHT_FOLD_EXPR, "una
>   DEFTREECODE (BINARY_LEFT_FOLD_EXPR, "binary_left_fold_expr", tcc_expression, 3)
>   DEFTREECODE (BINARY_RIGHT_FOLD_EXPR, "binary_right_fold_expr", tcc_expression, 3)
>   
> +/* Represents the __builtin_bit_cast (type, expr) expression.
> +   The type is in TREE_TYPE, expression in TREE_OPERAND (bitcast, 0).  */
> +DEFTREECODE (BIT_CAST_EXPR, "bit_cast_expr", tcc_expression, 1)
>   
>   /** C++ extensions. */
>   
> --- gcc/cp/cp-objcp-common.c.jj	2020-07-18 10:48:51.515492559 +0200
> +++ gcc/cp/cp-objcp-common.c	2020-07-18 10:52:46.181021310 +0200
> @@ -394,6 +394,7 @@ names_builtin_p (const char *name)
>       case RID_BUILTIN_HAS_ATTRIBUTE:
>       case RID_BUILTIN_SHUFFLE:
>       case RID_BUILTIN_LAUNDER:
> +    case RID_BUILTIN_BIT_CAST:
>       case RID_OFFSETOF:
>       case RID_HAS_NOTHROW_ASSIGN:
>       case RID_HAS_NOTHROW_CONSTRUCTOR:
> @@ -498,6 +499,7 @@ cp_common_init_ts (void)
>     MARK_TS_EXP (ALIGNOF_EXPR);
>     MARK_TS_EXP (ARROW_EXPR);
>     MARK_TS_EXP (AT_ENCODE_EXPR);
> +  MARK_TS_EXP (BIT_CAST_EXPR);
>     MARK_TS_EXP (CAST_EXPR);
>     MARK_TS_EXP (CONST_CAST_EXPR);
>     MARK_TS_EXP (CTOR_INITIALIZER);
> --- gcc/cp/cxx-pretty-print.c.jj	2020-07-18 10:48:51.601491287 +0200
> +++ gcc/cp/cxx-pretty-print.c	2020-07-18 10:52:46.181021310 +0200
> @@ -655,6 +655,15 @@ cxx_pretty_printer::postfix_expression (
>         pp_right_paren (this);
>         break;
>   
> +    case BIT_CAST_EXPR:
> +      pp_cxx_ws_string (this, "__builtin_bit_cast");
> +      pp_left_paren (this);
> +      type_id (TREE_TYPE (t));
> +      pp_comma (this);
> +      expression (TREE_OPERAND (t, 0));
> +      pp_right_paren (this);
> +      break;
> +
>       case EMPTY_CLASS_EXPR:
>         type_id (TREE_TYPE (t));
>         pp_left_paren (this);
> --- gcc/cp/parser.c.jj	2020-07-18 10:49:16.446123759 +0200
> +++ gcc/cp/parser.c	2020-07-18 10:52:46.186021236 +0200
> @@ -7217,6 +7217,32 @@ cp_parser_postfix_expression (cp_parser
>   				     tf_warning_or_error);
>         }
>   
> +    case RID_BUILTIN_BIT_CAST:
> +      {
> +	tree expression;
> +	tree type;
> +	/* Consume the `__builtin_bit_cast' token.  */
> +	cp_lexer_consume_token (parser->lexer);
> +	/* Look for the opening `('.  */
> +	matching_parens parens;
> +	parens.require_open (parser);
> +	location_t type_location
> +	  = cp_lexer_peek_token (parser->lexer)->location;
> +	/* Parse the type-id.  */
> +	{
> +	  type_id_in_expr_sentinel s (parser);
> +	  type = cp_parser_type_id (parser);
> +	}
> +	/* Look for the `,'.  */
> +	cp_parser_require (parser, CPP_COMMA, RT_COMMA);
> +	/* Now, parse the assignment-expression.  */
> +	expression = cp_parser_assignment_expression (parser);
> +	/* Look for the closing `)'.  */
> +	parens.require_close (parser);
> +	return cp_build_bit_cast (type_location, type, expression,
> +				  tf_warning_or_error);
> +      }
> +
>       default:
>         {
>   	tree type;
> --- gcc/cp/semantics.c.jj	2020-07-18 10:48:51.737489274 +0200
> +++ gcc/cp/semantics.c	2020-07-18 10:52:46.187021221 +0200
> @@ -10469,4 +10469,82 @@ cp_build_vec_convert (tree arg, location
>     return build_call_expr_internal_loc (loc, IFN_VEC_CONVERT, type, 1, arg);
>   }
>   
> +/* Finish __builtin_bit_cast (type, arg).  */
> +
> +tree
> +cp_build_bit_cast (location_t loc, tree type, tree arg,
> +		   tsubst_flags_t complain)
> +{
> +  if (error_operand_p (type))
> +    return error_mark_node;
> +  if (!dependent_type_p (type))
> +    {
> +      if (!complete_type_or_maybe_complain (type, NULL_TREE, complain))
> +	return error_mark_node;
> +      if (TREE_CODE (type) == ARRAY_TYPE)
> +	{
> +	  /* std::bit_cast for destination ARRAY_TYPE is not possible,
> +	     as functions may not return an array, so don't bother trying
> +	     to support this (and then deal with VLAs etc.).  */
> +	  error_at (loc, "%<__builtin_bit_cast%> destination type %qT "
> +			 "is an array type", type);
> +	  return error_mark_node;
> +	}
> +      if (!trivially_copyable_p (type))
> +	{
> +	  error_at (loc, "%<__builtin_bit_cast%> destination type %qT "
> +			 "is not trivially copyable", type);
> +	  return error_mark_node;
> +	}
> +    }
> +
> +  if (error_operand_p (arg))
> +    return error_mark_node;
> +
> +  if (REFERENCE_REF_P (arg))
> +    arg = TREE_OPERAND (arg, 0);

Why?

> +
> +  if (error_operand_p (arg))
> +    return error_mark_node;
> +
> +  if (!type_dependent_expression_p (arg))
> +    {
> +      if (TREE_CODE (TREE_TYPE (arg)) == ARRAY_TYPE)
> +	{
> +	  /* Don't perform array-to-pointer conversion.  */
> +	  arg = mark_rvalue_use (arg, loc, true);
> +	  if (!complete_type_or_maybe_complain (TREE_TYPE (arg), arg, complain))
> +	    return error_mark_node;
> +	}
> +      else
> +	arg = decay_conversion (arg, complain);
> +
> +      if (error_operand_p (arg))
> +	return error_mark_node;
> +
> +      arg = convert_from_reference (arg);
> +      if (!trivially_copyable_p (TREE_TYPE (arg)))
> +	{
> +	  error_at (cp_expr_loc_or_loc (arg, loc),
> +		    "%<__builtin_bit_cast%> source type %qT "
> +		    "is not trivially copyable", TREE_TYPE (arg));
> +	  return error_mark_node;
> +	}
> +      if (!dependent_type_p (type)
> +	  && !cp_tree_equal (TYPE_SIZE_UNIT (type),
> +			     TYPE_SIZE_UNIT (TREE_TYPE (arg))))
> +	{
> +	  error_at (loc, "%<__builtin_bit_cast%> source size %qE "
> +			 "not equal to destination type size %qE",
> +			 TYPE_SIZE_UNIT (TREE_TYPE (arg)),
> +			 TYPE_SIZE_UNIT (type));
> +	  return error_mark_node;
> +	}
> +    }
> +
> +  tree ret = build_min (BIT_CAST_EXPR, type, arg);
> +  SET_EXPR_LOCATION (ret, loc);
> +  return ret;
> +}
> +
>   #include "gt-cp-semantics.h"
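
Not part of the patch, just to make the intended semantics concrete for the thread: outside constant evaluation, BIT_CAST_EXPR is lowered to a VIEW_CONVERT_EXPR of the source operand, which for trivially copyable types of equal size is observably the same as a memcpy-based cast. A minimal user-level sketch (the template name is made up for illustration, this is not the GCC implementation):

```cpp
#include <cstring>
#include <cstdint>

// memcpy-based equivalent of what __builtin_bit_cast computes at runtime:
// reinterpret the object representation of From as To.  Requires
// sizeof (To) == sizeof (From) and both types trivially copyable, which is
// exactly what cp_build_bit_cast checks above.
template <typename To, typename From>
To bit_cast_sketch (const From &from)
{
  static_assert (sizeof (To) == sizeof (From), "sizes must match");
  To to;
  std::memcpy (&to, &from, sizeof (To));
  return to;
}
```
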
> --- gcc/cp/tree.c.jj	2020-07-18 10:48:51.738489259 +0200
> +++ gcc/cp/tree.c	2020-07-18 10:52:46.188021206 +0200
> @@ -3908,6 +3908,11 @@ cp_tree_equal (tree t1, tree t2)
>   	return false;
>         return true;
>   
> +    case BIT_CAST_EXPR:
> +      if (!same_type_p (TREE_TYPE (t1), TREE_TYPE (t2)))
> +	return false;
> +      break;

Add this to the *_CAST_EXPR cases like you do in cp_walk_subtrees?

> +
>       default:
>         break;
>       }
> @@ -5112,6 +5117,7 @@ cp_walk_subtrees (tree *tp, int *walk_su
>       case CONST_CAST_EXPR:
>       case DYNAMIC_CAST_EXPR:
>       case IMPLICIT_CONV_EXPR:
> +    case BIT_CAST_EXPR:
>         if (TREE_TYPE (*tp))
>   	WALK_SUBTREE (TREE_TYPE (*tp));
>   
> --- gcc/cp/pt.c.jj	2020-07-18 10:49:16.451123685 +0200
> +++ gcc/cp/pt.c	2020-07-18 10:52:46.190021177 +0200
> @@ -16592,6 +16592,13 @@ tsubst_copy (tree t, tree args, tsubst_f
>   	return build1 (code, type, op0);
>         }
>   
> +    case BIT_CAST_EXPR:
> +      {
> +	tree type = tsubst (TREE_TYPE (t), args, complain, in_decl);
> +	tree op0 = tsubst_copy (TREE_OPERAND (t, 0), args, complain, in_decl);
> +	return cp_build_bit_cast (EXPR_LOCATION (t), type, op0, complain);
> +      }
> +
>       case SIZEOF_EXPR:
>         if (PACK_EXPANSION_P (TREE_OPERAND (t, 0))
>   	  || ARGUMENT_PACK_P (TREE_OPERAND (t, 0)))
> --- gcc/cp/constexpr.c.jj	2020-07-18 10:48:51.493492885 +0200
> +++ gcc/cp/constexpr.c	2020-07-18 13:26:57.711003134 +0200
> @@ -3853,6 +3853,808 @@ cxx_eval_bit_field_ref (const constexpr_
>     return error_mark_node;
>   }
>   
> +/* Helper for cxx_eval_bit_cast.
> +   Check [bit.cast]/3 rules: bit_cast is constexpr only if the To and From
> +   types and the types of all subobjects have is_union_v<T>, is_pointer_v<T>,
> +   is_member_pointer_v<T> and is_volatile_v<T> false and have no non-static
> +   data members of reference type.  */
> +
> +static bool
> +check_bit_cast_type (const constexpr_ctx *ctx, location_t loc, tree type,
> +		     tree orig_type)
> +{
> +  if (TREE_CODE (type) == UNION_TYPE)
> +    {
> +      if (!ctx->quiet)
> +	{
> +	  if (type == orig_type)
> +	    error_at (loc, "%qs is not constant expression because %qT is "
> +			   "a union type", "__builtin_bit_cast", type);

"not a constant expression"; you're missing the "a" in all of these.

> +	  else
> +	    error_at (loc, "%qs is not constant expression because %qT "
> +			   "contains a union type", "__builtin_bit_cast",
> +		      orig_type);
> +	}
> +      return true;
> +    }
> +  if (TREE_CODE (type) == POINTER_TYPE)
> +    {
> +      if (!ctx->quiet)
> +	{
> +	  if (type == orig_type)
> +	    error_at (loc, "%qs is not constant expression because %qT is "
> +			   "a pointer type", "__builtin_bit_cast", type);
> +	  else
> +	    error_at (loc, "%qs is not constant expression because %qT "
> +			   "contains a pointer type", "__builtin_bit_cast",
> +		      orig_type);
> +	}
> +      return true;
> +    }
> +  if (TREE_CODE (type) == REFERENCE_TYPE)
> +    {
> +      if (!ctx->quiet)
> +	{
> +	  if (type == orig_type)
> +	    error_at (loc, "%qs is not constant expression because %qT is "
> +			   "a reference type", "__builtin_bit_cast", type);
> +	  else
> +	    error_at (loc, "%qs is not constant expression because %qT "
> +			   "contains a reference type", "__builtin_bit_cast",
> +		      orig_type);
> +	}
> +      return true;
> +    }
> +  if (TYPE_PTRMEM_P (type))
> +    {
> +      if (!ctx->quiet)
> +	{
> +	  if (type == orig_type)
> +	    error_at (loc, "%qs is not constant expression because %qT is "
> +			   "a pointer to member type", "__builtin_bit_cast",
> +		      type);
> +	  else
> +	    error_at (loc, "%qs is not constant expression because %qT "
> +			   "contains a pointer to member type",
> +		      "__builtin_bit_cast", orig_type);
> +	}
> +      return true;
> +    }
> +  if (TYPE_VOLATILE (type))
> +    {
> +      if (!ctx->quiet)
> +	{
> +	  if (type == orig_type)
> +	    error_at (loc, "%qs is not constant expression because %qT is "
> +			   "volatile", "__builtin_bit_cast", type);
> +	  else
> +	    error_at (loc, "%qs is not constant expression because %qT "
> +			   "contains a volatile subobject",
> +		      "__builtin_bit_cast", orig_type);
> +	}
> +      return true;
> +    }
> +  if (TREE_CODE (type) == RECORD_TYPE)
> +    for (tree field = TYPE_FIELDS (type); field; field = DECL_CHAIN (field))
> +      if (TREE_CODE (field) == FIELD_DECL
> +	  && check_bit_cast_type (ctx, loc, TREE_TYPE (field), orig_type))
> +	return true;
> +  return false;
> +}
> +
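
To illustrate the [bit.cast]/3 restriction being enforced here, a user-level approximation of the top-level part of the check (names invented for illustration; GCC's check_bit_cast_type additionally recurses into non-static data members, which plain type traits cannot express):

```cpp
#include <type_traits>

// Top-level approximation of the [bit.cast]/3 check: a type may appear in a
// constant-evaluated bit_cast only if it is not a union, pointer, pointer to
// member, or volatile-qualified.  The recursion into members done by
// check_bit_cast_type is omitted here.
template <typename T>
constexpr bool bit_cast_constexpr_ok_toplevel ()
{
  return !std::is_union_v<T>
	 && !std::is_pointer_v<T>
	 && !std::is_member_pointer_v<T>
	 && !std::is_volatile_v<T>;
}

// Sample types for exercising the predicate.
union UTest { int a; };
struct SDemo { };
```
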
> +/* Try to find a type whose byte size is smaller than or equal to LEN bytes
> +   and larger than or equal to FIELDSIZE bytes, with underlying mode
> +   precision/size a multiple of BITS_PER_UNIT.  As
> +   native_{interpret,encode}_int work in terms of machine modes, we can't
> +   just use build_nonstandard_integer_type.  */
> +
> +static tree
> +cxx_find_bitfield_repr_type (int fieldsize, int len)
> +{
> +  machine_mode mode;
> +  for (int pass = 0; pass < 2; pass++)
> +    {
> +      enum mode_class mclass = pass ? MODE_PARTIAL_INT : MODE_INT;
> +      FOR_EACH_MODE_IN_CLASS (mode, mclass)
> +	if (known_ge (GET_MODE_SIZE (mode), fieldsize)
> +	    && known_eq (GET_MODE_PRECISION (mode),
> +			 GET_MODE_BITSIZE (mode))
> +	    && known_le (GET_MODE_SIZE (mode), len))
> +	  {
> +	    tree ret = c_common_type_for_mode (mode, 1);
> +	    if (ret && TYPE_MODE (ret) == mode)
> +	      return ret;
> +	  }
> +    }
> +
> +  for (int i = 0; i < NUM_INT_N_ENTS; i ++)
> +    if (int_n_enabled_p[i]
> +	&& int_n_data[i].bitsize >= (unsigned) (BITS_PER_UNIT * fieldsize)
> +	&& int_n_trees[i].unsigned_type)
> +      {
> +	tree ret = int_n_trees[i].unsigned_type;
> +	mode = TYPE_MODE (ret);
> +	if (known_ge (GET_MODE_SIZE (mode), fieldsize)
> +	    && known_eq (GET_MODE_PRECISION (mode),
> +			 GET_MODE_BITSIZE (mode))
> +	    && known_le (GET_MODE_SIZE (mode), len))
> +	  return ret;
> +      }
> +
> +  return NULL_TREE;
> +}
> +
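
For the thread, the shape of the search above can be sketched without machine modes: pick the smallest power-of-two byte size that covers the field but still fits in the available buffer. This is only a rough model (the function name is invented; the real code iterates MODE_INT/MODE_PARTIAL_INT modes and the int_n types because native_{interpret,encode}_int require a mode-backed type):

```cpp
// Sketch of the repr-type search: return the byte size of the smallest
// power-of-two-sized integer that is at least FIELDSIZE bytes wide without
// exceeding the LEN bytes available, or 0 if none fits.
int repr_size_sketch (int fieldsize, int len)
{
  for (int sz = 1; sz <= 16; sz *= 2)
    if (sz >= fieldsize && sz <= len)
      return sz;
  return 0;
}
```
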
> +/* Attempt to interpret an aggregate of TYPE from LEN bytes encoded in
> +   target byte order at PTR + OFF.  MASK has bits set where the value
> +   is indeterminate.  */
> +
> +static tree
> +cxx_native_interpret_aggregate (tree type, const unsigned char *ptr, int off,
> +				int len, unsigned char *mask,
> +				const constexpr_ctx *ctx, bool *non_constant_p,
> +				location_t loc)
> +{
> +  vec<constructor_elt, va_gc> *elts = NULL;
> +  if (TREE_CODE (type) == ARRAY_TYPE)
> +    {
> +      HOST_WIDE_INT eltsz = int_size_in_bytes (TREE_TYPE (type));
> +      if (eltsz < 0 || eltsz > len || TYPE_DOMAIN (type) == NULL_TREE)
> +	return error_mark_node;
> +
> +      HOST_WIDE_INT cnt = 0;
> +      if (TYPE_MAX_VALUE (TYPE_DOMAIN (type)))
> +	{
> +	  if (!tree_fits_shwi_p (TYPE_MAX_VALUE (TYPE_DOMAIN (type))))
> +	    return error_mark_node;
> +	  cnt = tree_to_shwi (TYPE_MAX_VALUE (TYPE_DOMAIN (type))) + 1;
> +	}
> +      if (eltsz == 0)
> +	cnt = 0;
> +      HOST_WIDE_INT pos = 0;
> +      for (HOST_WIDE_INT i = 0; i < cnt; i++, pos += eltsz)
> +	{
> +	  tree v = error_mark_node;
> +	  if (pos >= len || pos + eltsz > len)
> +	    return error_mark_node;
> +	  if (can_native_interpret_type_p (TREE_TYPE (type)))
> +	    {
> +	      v = native_interpret_expr (TREE_TYPE (type),
> +					 ptr + off + pos, eltsz);
> +	      if (v == NULL_TREE)
> +		return error_mark_node;
> +	      for (int i = 0; i < eltsz; i++)
> +		if (mask[off + pos + i])
> +		  {
> +		    if (!ctx->quiet)
> +		      error_at (loc, "%qs accessing uninitialized byte at "
> +				     "offset %d",
> +				"__builtin_bit_cast", (int) (off + pos + i));
> +		    *non_constant_p = true;
> +		    return error_mark_node;
> +		  }
> +	    }
> +	  else if (TREE_CODE (TREE_TYPE (type)) == RECORD_TYPE
> +		   || TREE_CODE (TREE_TYPE (type)) == ARRAY_TYPE)
> +	    v = cxx_native_interpret_aggregate (TREE_TYPE (type),
> +						ptr, off + pos, eltsz,
> +						mask, ctx,
> +						non_constant_p, loc);
> +	  if (v == error_mark_node)
> +	    return error_mark_node;
> +	  CONSTRUCTOR_APPEND_ELT (elts, size_int (i), v);
> +	}
> +      return build_constructor (type, elts);
> +    }
> +  gcc_assert (TREE_CODE (type) == RECORD_TYPE);
> +  for (tree field = next_initializable_field (TYPE_FIELDS (type));
> +       field; field = next_initializable_field (DECL_CHAIN (field)))
> +    {
> +      tree fld = field;
> +      HOST_WIDE_INT bitoff = 0, pos = 0, sz = 0;
> +      int diff = 0;
> +      tree v = error_mark_node;
> +      if (DECL_BIT_FIELD (field))
> +	{
> +	  fld = DECL_BIT_FIELD_REPRESENTATIVE (field);
> +	  if (fld && INTEGRAL_TYPE_P (TREE_TYPE (fld)))
> +	    {
> +	      poly_int64 bitoffset;
> +	      poly_uint64 field_offset, fld_offset;
> +	      if (poly_int_tree_p (DECL_FIELD_OFFSET (field), &field_offset)
> +		  && poly_int_tree_p (DECL_FIELD_OFFSET (fld), &fld_offset))
> +		bitoffset = (field_offset - fld_offset) * BITS_PER_UNIT;
> +	      else
> +		bitoffset = 0;
> +	      bitoffset += (tree_to_uhwi (DECL_FIELD_BIT_OFFSET (field))
> +			    - tree_to_uhwi (DECL_FIELD_BIT_OFFSET (fld)));
> +	      diff = (TYPE_PRECISION (TREE_TYPE (fld))
> +		      - TYPE_PRECISION (TREE_TYPE (field)));
> +	      if (!bitoffset.is_constant (&bitoff)
> +		  || bitoff < 0
> +		  || bitoff > diff)
> +		return error_mark_node;
> +	    }
> +	  else
> +	    {
> +	      if (!tree_fits_uhwi_p (DECL_FIELD_BIT_OFFSET (field)))
> +		return error_mark_node;
> +	      int fieldsize = TYPE_PRECISION (TREE_TYPE (field));
> +	      int bpos = tree_to_uhwi (DECL_FIELD_BIT_OFFSET (field));
> +	      bpos %= BITS_PER_UNIT;
> +	      fieldsize += bpos;
> +	      fieldsize += BITS_PER_UNIT - 1;
> +	      fieldsize /= BITS_PER_UNIT;
> +	      tree repr_type = cxx_find_bitfield_repr_type (fieldsize, len);
> +	      if (repr_type == NULL_TREE)
> +		return error_mark_node;
> +	      sz = int_size_in_bytes (repr_type);
> +	      if (sz < 0 || sz > len)
> +		return error_mark_node;
> +	      pos = int_byte_position (field);
> +	      if (pos < 0 || pos > len || pos + fieldsize > len)
> +		return error_mark_node;
> +	      HOST_WIDE_INT rpos;
> +	      if (pos + sz <= len)
> +		rpos = pos;
> +	      else
> +		{
> +		  rpos = len - sz;
> +		  gcc_assert (rpos <= pos);
> +		}
> +	      bitoff = (HOST_WIDE_INT) (pos - rpos) * BITS_PER_UNIT + bpos;
> +	      pos = rpos;
> +	      diff = (TYPE_PRECISION (repr_type)
> +		      - TYPE_PRECISION (TREE_TYPE (field)));
> +	      v = native_interpret_expr (repr_type, ptr + off + pos, sz);
> +	      if (v == NULL_TREE)
> +		return error_mark_node;
> +	      fld = NULL_TREE;
> +	    }
> +	}
> +
> +      if (fld)
> +	{
> +	  sz = int_size_in_bytes (TREE_TYPE (fld));
> +	  if (sz < 0 || sz > len)
> +	    return error_mark_node;
> +	  tree byte_pos = byte_position (fld);
> +	  if (!tree_fits_shwi_p (byte_pos))
> +	    return error_mark_node;
> +	  pos = tree_to_shwi (byte_pos);
> +	  if (pos < 0 || pos > len || pos + sz > len)
> +	    return error_mark_node;
> +	}
> +      if (fld == NULL_TREE)
> +	/* Already handled above.  */;
> +      else if (can_native_interpret_type_p (TREE_TYPE (fld)))
> +	{
> +	  v = native_interpret_expr (TREE_TYPE (fld),
> +				     ptr + off + pos, sz);
> +	  if (v == NULL_TREE)
> +	    return error_mark_node;
> +	  if (fld == field)
> +	    for (int i = 0; i < sz; i++)
> +	      if (mask[off + pos + i])
> +		{
> +		  if (!ctx->quiet)
> +		    error_at (loc,
> +			      "%qs accessing uninitialized byte at offset %d",
> +			      "__builtin_bit_cast", (int) (off + pos + i));
> +		  *non_constant_p = true;
> +		  return error_mark_node;
> +		}
> +	}
> +      else if (TREE_CODE (TREE_TYPE (fld)) == RECORD_TYPE
> +	       || TREE_CODE (TREE_TYPE (fld)) == ARRAY_TYPE)
> +	v = cxx_native_interpret_aggregate (TREE_TYPE (fld),
> +					    ptr, off + pos, sz, mask,
> +					    ctx, non_constant_p, loc);
> +      if (v == error_mark_node)
> +	return error_mark_node;
> +      if (fld != field)
> +	{
> +	  if (TREE_CODE (v) != INTEGER_CST)
> +	    return error_mark_node;
> +
> +	  /* FIXME: Figure out how to handle PDP endian bitfields.  */
> +	  if (BYTES_BIG_ENDIAN != WORDS_BIG_ENDIAN)
> +	    return error_mark_node;
> +	  if (!BYTES_BIG_ENDIAN)
> +	    v = wide_int_to_tree (TREE_TYPE (field),
> +				  wi::lrshift (wi::to_wide (v), bitoff));
> +	  else
> +	    v = wide_int_to_tree (TREE_TYPE (field),
> +				  wi::lrshift (wi::to_wide (v),
> +					       diff - bitoff));
> +	  int bpos = bitoff % BITS_PER_UNIT;
> +	  int bpos_byte = bitoff / BITS_PER_UNIT;
> +	  int fieldsize = TYPE_PRECISION (TREE_TYPE (field));
> +	  fieldsize += bpos;
> +	  int epos = fieldsize % BITS_PER_UNIT;
> +	  fieldsize += BITS_PER_UNIT - 1;
> +	  fieldsize /= BITS_PER_UNIT;
> +	  int bad = -1;
> +	  if (bpos)
> +	    {
> +	      int msk;
> +	      if (!BYTES_BIG_ENDIAN)
> +		{
> +		  msk = (1 << bpos) - 1;
> +		  if (fieldsize == 1 && epos != 0)
> +		    msk |= ~((1 << epos) - 1);
> +		}
> +	      else
> +		{
> +		  msk = ~((1 << (BITS_PER_UNIT - bpos)) - 1);
> +		  if (fieldsize == 1 && epos != 0)
> +		    msk |= (1 << (BITS_PER_UNIT - epos)) - 1;
> +		}
> +	      if (mask[off + pos + bpos_byte] & ~msk)
> +		bad = off + pos + bpos_byte;
> +	    }
> +	  if (epos && (fieldsize > 1 || bpos == 0))
> +	    {
> +	      int msk;
> +	      if (!BYTES_BIG_ENDIAN)
> +		msk = (1 << epos) - 1;
> +	      else
> +		msk = ~((1 << (BITS_PER_UNIT - epos)) - 1);
> +	      if (mask[off + pos + bpos_byte + fieldsize - 1] & msk)
> +		bad = off + pos + bpos_byte + fieldsize - 1;
> +	    }
> +	  for (int i = 0; bad < 0 && i < fieldsize - (bpos != 0) - (epos != 0);
> +	       i++)
> +	    if (mask[off + pos + bpos_byte + (bpos != 0) + i])
> +	      bad = off + pos + bpos_byte + (bpos != 0) + i;
> +	  if (bad >= 0)
> +	    {
> +	      if (!ctx->quiet)
> +		error_at (loc, "%qs accessing uninitialized byte at offset %d",
> +			  "__builtin_bit_cast", bad);
> +	      *non_constant_p = true;
> +	      return error_mark_node;
> +	    }
> +	}
> +      CONSTRUCTOR_APPEND_ELT (elts, field, v);
> +    }
> +  return build_constructor (type, elts);
> +}
> +
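
The little-endian half of the bitfield extraction above (right-shift the representative's value by the bit offset, then keep only the field's precision) can be modeled in isolation like this (function name invented for illustration; the big-endian branch shifts by `diff - bitoff` instead):

```cpp
#include <cstdint>

// Little-endian sketch of the bitfield extraction done in
// cxx_native_interpret_aggregate: given the integer value REPR of the
// representative, the bit offset BITOFF of the field within it, and the
// field's PRECISION in bits, recover the field's value.
std::uint64_t extract_bitfield (std::uint64_t repr, int bitoff, int precision)
{
  return (repr >> bitoff) & ((std::uint64_t (1) << precision) - 1);
}
```

For example, for `struct { unsigned a:3; unsigned b:5; }` with representative byte 0xb5, `a` sits at bit offset 0 with precision 3 and `b` at bit offset 3 with precision 5.
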
> +/* Similar to native_encode_initializer, but handle value-initialized members
> +   without initializers in !CONSTRUCTOR_NO_CLEARING CONSTRUCTORs, handle VCEs,
> +   NON_LVALUE_EXPRs and nops, and in addition to filling up PTR, also fill
> +   MASK with 1 in bits that have indeterminate value.  On the other hand,
> +   it is simplified by OFF not being present (always assumed to be 0) and by
> +   never trying to extract anything partial later, always the whole object.  */
> +
> +int
> +cxx_native_encode_aggregate (tree init, unsigned char *ptr, int len,
> +			     unsigned char *mask, const constexpr_ctx *ctx,
> +			     bool *non_constant_p, bool *overflow_p)
> +{
> +  if (init == NULL_TREE)
> +    return 0;
> +
> +  STRIP_NOPS (init);
> +  switch (TREE_CODE (init))
> +    {
> +    case VIEW_CONVERT_EXPR:
> +    case NON_LVALUE_EXPR:
> +      return cxx_native_encode_aggregate (TREE_OPERAND (init, 0), ptr, len,
> +					  mask, ctx, non_constant_p,
> +					  overflow_p);
> +    default:
> +      int r;
> +      r = native_encode_expr (init, ptr, len, 0);
> +      memset (mask, 0, r);
> +      return r;
> +    case CONSTRUCTOR:
> +      tree type = TREE_TYPE (init);
> +      HOST_WIDE_INT total_bytes = int_size_in_bytes (type);
> +      if (total_bytes < 0)
> +	return 0;
> +      if (TREE_CODE (type) == ARRAY_TYPE)
> +	{
> +	  HOST_WIDE_INT min_index;
> +	  unsigned HOST_WIDE_INT cnt;
> +	  HOST_WIDE_INT curpos = 0, fieldsize, valueinit = -1;
> +	  constructor_elt *ce;
> +
> +	  if (TYPE_DOMAIN (type) == NULL_TREE
> +	      || !tree_fits_shwi_p (TYPE_MIN_VALUE (TYPE_DOMAIN (type))))
> +	    return 0;
> +
> +	  fieldsize = int_size_in_bytes (TREE_TYPE (type));
> +	  if (fieldsize <= 0)
> +	    return 0;
> +
> +	  min_index = tree_to_shwi (TYPE_MIN_VALUE (TYPE_DOMAIN (type)));
> +	  memset (ptr, '\0', MIN (total_bytes, len));
> +
> +	  FOR_EACH_VEC_SAFE_ELT (CONSTRUCTOR_ELTS (init), cnt, ce)
> +	    {
> +	      tree val = ce->value;
> +	      tree index = ce->index;
> +	      HOST_WIDE_INT pos = curpos, count = 0;
> +	      bool full = false;
> +	      if (index && TREE_CODE (index) == RANGE_EXPR)
> +		{
> +		  if (!tree_fits_shwi_p (TREE_OPERAND (index, 0))
> +		      || !tree_fits_shwi_p (TREE_OPERAND (index, 1)))
> +		    return 0;
> +		  pos = (tree_to_shwi (TREE_OPERAND (index, 0)) - min_index)
> +			* fieldsize;
> +		  count = (tree_to_shwi (TREE_OPERAND (index, 1))
> +			   - tree_to_shwi (TREE_OPERAND (index, 0)));
> +		}
> +	      else if (index)
> +		{
> +		  if (!tree_fits_shwi_p (index))
> +		    return 0;
> +		  pos = (tree_to_shwi (index) - min_index) * fieldsize;
> +		}
> +
> +	      if (!CONSTRUCTOR_NO_CLEARING (init) && curpos != pos)
> +		{
> +		  if (valueinit == -1)
> +		    {
> +		      tree t = build_value_init (TREE_TYPE (type),
> +						 tf_warning_or_error);
> +		      t = cxx_eval_constant_expression (ctx, t, false,
> +							non_constant_p,
> +							overflow_p);
> +		      if (!cxx_native_encode_aggregate (t, ptr + curpos,
> +							fieldsize,
> +							mask + curpos,
> +							ctx, non_constant_p,
> +							overflow_p))
> +			return 0;
> +		      valueinit = curpos;
> +		      curpos += fieldsize;
> +		    }
> +		  while (curpos != pos)
> +		    {
> +		      memcpy (ptr + curpos, ptr + valueinit, fieldsize);
> +		      memcpy (mask + curpos, mask + valueinit, fieldsize);
> +		      curpos += fieldsize;
> +		    }
> +		}
> +
> +	      curpos = pos;
> +	      if (val)
> +		do
> +		  {
> +		    gcc_assert (curpos >= 0
> +				&& (curpos + fieldsize
> +				    <= (HOST_WIDE_INT) len));
> +		    if (full)
> +		      {
> +			memcpy (ptr + curpos, ptr + pos, fieldsize);
> +			memcpy (mask + curpos, mask + pos, fieldsize);
> +		      }
> +		    else if (!cxx_native_encode_aggregate (val,
> +							   ptr + curpos,
> +							   fieldsize,
> +							   mask + curpos,
> +							   ctx, non_constant_p,
> +							   overflow_p))
> +		      return 0;
> +		    else
> +		      {
> +			full = true;
> +			pos = curpos;
> +		      }
> +		    curpos += fieldsize;
> +		  }
> +		while (count-- != 0);
> +	    }
> +	  return MIN (total_bytes, len);
> +	}
> +      else if (TREE_CODE (type) == RECORD_TYPE)
> +	{
> +	  unsigned HOST_WIDE_INT cnt;
> +	  constructor_elt *ce;
> +	  tree fld_base = TYPE_FIELDS (type);
> +
> +	  memset (ptr, '\0', MIN (total_bytes, len));
> +	  FOR_EACH_VEC_SAFE_ELT (CONSTRUCTOR_ELTS (init), cnt, ce)
> +	    {
> +	      tree field = ce->index;
> +	      tree val = ce->value;
> +	      HOST_WIDE_INT pos, fieldsize;
> +	      unsigned HOST_WIDE_INT bpos = 0, epos = 0;
> +
> +	      if (field == NULL_TREE)
> +		return 0;
> +
> +	      if (!CONSTRUCTOR_NO_CLEARING (init))
> +		{
> +		  tree fld = next_initializable_field (fld_base);
> +		  fld_base = DECL_CHAIN (fld);
> +		  if (fld == NULL_TREE)
> +		    return 0;
> +		  if (fld != field)
> +		    {
> +		      cnt--;
> +		      field = fld;
> +		      val = build_value_init (TREE_TYPE (fld),
> +					      tf_warning_or_error);
> +		      val = cxx_eval_constant_expression (ctx, val, false,
> +							  non_constant_p,
> +							  overflow_p);
> +		    }
> +		}
> +
> +	      pos = int_byte_position (field);
> +	      gcc_assert ((HOST_WIDE_INT) len >= pos);
> +
> +	      if (TREE_CODE (TREE_TYPE (field)) == ARRAY_TYPE
> +		  && TYPE_DOMAIN (TREE_TYPE (field))
> +		  && ! TYPE_MAX_VALUE (TYPE_DOMAIN (TREE_TYPE (field))))
> +		return 0;
> +	      if (DECL_SIZE_UNIT (field) == NULL_TREE
> +		  || !tree_fits_shwi_p (DECL_SIZE_UNIT (field)))
> +		return 0;
> +	      fieldsize = tree_to_shwi (DECL_SIZE_UNIT (field));
> +	      if (fieldsize == 0)
> +		continue;
> +
> +	      if (DECL_BIT_FIELD (field))
> +		{
> +		  if (!tree_fits_uhwi_p (DECL_FIELD_BIT_OFFSET (field)))
> +		    return 0;
> +		  fieldsize = TYPE_PRECISION (TREE_TYPE (field));
> +		  bpos = tree_to_uhwi (DECL_FIELD_BIT_OFFSET (field));
> +		  bpos %= BITS_PER_UNIT;
> +		  fieldsize += bpos;
> +		  epos = fieldsize % BITS_PER_UNIT;
> +		  fieldsize += BITS_PER_UNIT - 1;
> +		  fieldsize /= BITS_PER_UNIT;
> +		}
> +
> +	      gcc_assert (pos + fieldsize > 0);
> +
> +	      if (val == NULL_TREE)
> +		continue;
> +
> +	      if (DECL_BIT_FIELD (field))
> +		{
> +		  /* FIXME: Handle PDP endian.  */
> +		  if (BYTES_BIG_ENDIAN != WORDS_BIG_ENDIAN)
> +		    return 0;
> +
> +		  if (TREE_CODE (val) != INTEGER_CST)
> +		    return 0;
> +
> +		  tree repr = DECL_BIT_FIELD_REPRESENTATIVE (field);
> +		  tree repr_type = NULL_TREE;
> +		  HOST_WIDE_INT rpos = 0;
> +		  if (repr && INTEGRAL_TYPE_P (TREE_TYPE (repr)))
> +		    {
> +		      rpos = int_byte_position (repr);
> +		      repr_type = TREE_TYPE (repr);
> +		    }
> +		  else
> +		    {
> +		      repr_type = cxx_find_bitfield_repr_type (fieldsize, len);
> +		      if (repr_type == NULL_TREE)
> +			return 0;
> +		      HOST_WIDE_INT repr_size = int_size_in_bytes (repr_type);
> +		      gcc_assert (repr_size > 0 && repr_size <= len);
> +		      if (pos + repr_size <= len)
> +			rpos = pos;
> +		      else
> +			{
> +			  rpos = len - repr_size;
> +			  gcc_assert (rpos <= pos);
> +			}
> +		    }
> +
> +		  if (rpos > pos)
> +		    return 0;
> +		  wide_int w = wi::to_wide (val, TYPE_PRECISION (repr_type));
> +		  int diff = (TYPE_PRECISION (repr_type)
> +			      - TYPE_PRECISION (TREE_TYPE (field)));
> +		  HOST_WIDE_INT bitoff = (pos - rpos) * BITS_PER_UNIT + bpos;
> +		  if (!BYTES_BIG_ENDIAN)
> +		    w = wi::lshift (w, bitoff);
> +		  else
> +		    w = wi::lshift (w, diff - bitoff);
> +		  val = wide_int_to_tree (repr_type, w);
> +
> +		  unsigned char buf[MAX_BITSIZE_MODE_ANY_INT
> +				    / BITS_PER_UNIT + 1];
> +		  int l = native_encode_expr (val, buf, sizeof buf, 0);
> +		  if (l * BITS_PER_UNIT != TYPE_PRECISION (repr_type))
> +		    return 0;
> +
> +		  /* If the bitfield does not start at byte boundary, handle
> +		     the partial byte at the start.  */
> +		  if (bpos)
> +		    {
> +		      gcc_assert (pos >= 0 && len >= 1);
> +		      if (!BYTES_BIG_ENDIAN)
> +			{
> +			  int msk = (1 << bpos) - 1;
> +			  buf[pos - rpos] &= ~msk;
> +			  buf[pos - rpos] |= ptr[pos] & msk;
> +			  if (fieldsize > 1 || epos == 0)
> +			    mask[pos] &= msk;
> +			  else
> +			    mask[pos] &= (msk | ~((1 << epos) - 1));
> +			}
> +		      else
> +			{
> +			  int msk = (1 << (BITS_PER_UNIT - bpos)) - 1;
> +			  buf[pos - rpos] &= msk;
> +			  buf[pos - rpos] |= ptr[pos] & ~msk;
> +			  if (fieldsize > 1 || epos == 0)
> +			    mask[pos] &= ~msk;
> +			  else
> +			    mask[pos] &= (~msk
> +					  | ((1 << (BITS_PER_UNIT - epos))
> +					     - 1));
> +			}
> +		    }
> +		  /* If the bitfield does not end at byte boundary, handle
> +		     the partial byte at the end.  */
> +		  if (epos)
> +		    {
> +		      gcc_assert (pos + fieldsize <= (HOST_WIDE_INT) len);
> +		      if (!BYTES_BIG_ENDIAN)
> +			{
> +			  int msk = (1 << epos) - 1;
> +			  buf[pos - rpos + fieldsize - 1] &= msk;
> +			  buf[pos - rpos + fieldsize - 1]
> +			    |= ptr[pos + fieldsize - 1] & ~msk;
> +			  if (fieldsize > 1 || bpos == 0)
> +			    mask[pos + fieldsize - 1] &= ~msk;
> +			}
> +		       else
> +			{
> +			  int msk = (1 << (BITS_PER_UNIT - epos)) - 1;
> +			  buf[pos - rpos + fieldsize - 1] &= ~msk;
> +			  buf[pos - rpos + fieldsize - 1]
> +			    |= ptr[pos + fieldsize - 1] & msk;
> +			  if (fieldsize > 1 || bpos == 0)
> +			    mask[pos + fieldsize - 1] &= msk;
> +			}
> +		    }
> +		  gcc_assert (pos >= 0
> +			      && (pos + fieldsize
> +				  <= (HOST_WIDE_INT) len));
> +		  memcpy (ptr + pos, buf + (pos - rpos), fieldsize);
> +		  if (fieldsize > (bpos != 0) + (epos != 0))
> +		    memset (mask + pos + (bpos != 0), 0,
> +			    fieldsize - (bpos != 0) - (epos != 0));
> +		  continue;
> +		}
> +
> +	      gcc_assert (pos >= 0
> +			  && (pos + fieldsize <= (HOST_WIDE_INT) len));
> +	      if (!cxx_native_encode_aggregate (val, ptr + pos,
> +						fieldsize, mask + pos,
> +						ctx, non_constant_p,
> +						overflow_p))
> +		return 0;
> +	    }
> +	  return MIN (total_bytes, len);
> +	}
> +      return 0;
> +    }
> +}
> +
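
A self-contained model of the partial-byte handling in the encoder above (little-endian only; the name is invented): writing a field that does not start or end on a byte boundary must preserve the surrounding bits of the shared bytes, which is what the `msk` computations on ptr/buf accomplish.

```cpp
#include <cstdint>

// Little-endian sketch of the bitfield store: write a PRECISION-bit VALUE at
// bit offset BITOFF into BUF, preserving all bits outside the field.  The
// patch does this with byte-wide masks on the boundary bytes rather than a
// per-bit loop, but the observable effect is the same.
void insert_bitfield (std::uint8_t *buf, int bitoff, int precision,
		      std::uint64_t value)
{
  for (int i = 0; i < precision; i++)
    {
      int bit = bitoff + i;
      std::uint8_t m = std::uint8_t (1) << (bit % 8);
      if ((value >> i) & 1)
	buf[bit / 8] |= m;
      else
	buf[bit / 8] &= ~m;
    }
}
```
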
> +/* Subroutine of cxx_eval_constant_expression.
> +   Attempt to evaluate a BIT_CAST_EXPR.  */
> +
> +static tree
> +cxx_eval_bit_cast (const constexpr_ctx *ctx, tree t, bool *non_constant_p,
> +		   bool *overflow_p)
> +{
> +  if (check_bit_cast_type (ctx, EXPR_LOCATION (t), TREE_TYPE (t),
> +			   TREE_TYPE (t))
> +      || check_bit_cast_type (ctx, cp_expr_loc_or_loc (TREE_OPERAND (t, 0),
> +						       EXPR_LOCATION (t)),
> +			      TREE_TYPE (TREE_OPERAND (t, 0)),
> +			      TREE_TYPE (TREE_OPERAND (t, 0))))
> +    {
> +      *non_constant_p = true;
> +      return t;
> +    }
> +
> +  tree op = cxx_eval_constant_expression (ctx, TREE_OPERAND (t, 0), false,
> +					  non_constant_p, overflow_p);
> +  if (*non_constant_p)
> +    return t;
> +
> +  location_t loc = EXPR_LOCATION (t);
> +  if (BITS_PER_UNIT != 8 || CHAR_BIT != 8)
> +    {
> +      if (!ctx->quiet)
> +	sorry_at (loc, "%qs cannot be constant evaluated on the target",
> +		       "__builtin_bit_cast");
> +      *non_constant_p = true;
> +      return t;
> +    }
> +
> +  if (!tree_fits_shwi_p (TYPE_SIZE_UNIT (TREE_TYPE (t))))
> +    {
> +      if (!ctx->quiet)
> +	sorry_at (loc, "%qs cannot be constant evaluated because the "
> +		       "type is too large", "__builtin_bit_cast");
> +      *non_constant_p = true;
> +      return t;
> +    }
> +
> +  HOST_WIDE_INT len = tree_to_shwi (TYPE_SIZE_UNIT (TREE_TYPE (t)));
> +  if (len < 0 || (int) len != len)
> +    {
> +      if (!ctx->quiet)
> +	sorry_at (loc, "%qs cannot be constant evaluated because the "
> +		       "type is too large", "__builtin_bit_cast");
> +      *non_constant_p = true;
> +      return t;
> +    }
> +
> +  unsigned char buf[64];
> +  unsigned char *ptr, *mask;
> +  size_t alen = (size_t) len * 2;
> +  if (alen <= sizeof (buf))
> +    ptr = buf;
> +  else
> +    ptr = XNEWVEC (unsigned char, alen);
> +  mask = ptr + (size_t) len;
> +  /* At the beginning consider everything indeterminate.  */
> +  memset (mask, ~0, (size_t) len);
> +
> +  if (cxx_native_encode_aggregate (op, ptr, len, mask, ctx, non_constant_p,
> +				   overflow_p) != len)
> +    {
> +      if (!ctx->quiet)
> +	sorry_at (loc, "%qs cannot be constant evaluated because the "
> +		       "argument cannot be encoded", "__builtin_bit_cast");
> +      *non_constant_p = true;
> +      return t;
> +    }
> +
> +  if (can_native_interpret_type_p (TREE_TYPE (t)))
> +    if (tree r = native_interpret_expr (TREE_TYPE (t), ptr, len))
> +      {
> +	for (int i = 0; i < len; i++)
> +	  if (mask[i])
> +	    {
> +	      if (!ctx->quiet)
> +		error_at (loc, "%qs accessing uninitialized byte at offset %d",
> +			  "__builtin_bit_cast", i);
> +	      *non_constant_p = true;
> +	      r = t;
> +	      break;
> +	    }
> +	if (ptr != buf)
> +	  XDELETE (ptr);
> +	return r;
> +      }
> +
> +  if (TREE_CODE (TREE_TYPE (t)) == RECORD_TYPE)
> +    {
> +      tree r = cxx_native_interpret_aggregate (TREE_TYPE (t), ptr, 0, len,
> +					       mask, ctx, non_constant_p, loc);
> +      if (r != error_mark_node)
> +	{
> +	  if (ptr != buf)
> +	    XDELETE (ptr);
> +	  return r;
> +	}
> +      if (*non_constant_p)
> +	return t;
> +    }
> +
> +  if (!ctx->quiet)
> +    sorry_at (loc, "%qs cannot be constant evaluated because the "
> +		   "argument cannot be interpreted", "__builtin_bit_cast");
> +  *non_constant_p = true;
> +  return t;
> +}
> +
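
The overall strategy of cxx_eval_bit_cast — encode into a byte buffer plus a mask that starts all-ones ("everything indeterminate"), clear mask bytes as they are written, then refuse to interpret anything still marked — can be sketched at user level like this (all names invented; fixed 8-byte buffer and a u32 payload just for illustration):

```cpp
#include <cstring>
#include <cstdint>

struct encoded { std::uint8_t bytes[8]; std::uint8_t mask[8]; };

// Encode a uint32_t at byte offset OFF, mirroring the mask discipline above:
// mask starts all-ones, and only the bytes actually written become 0.
encoded encode_u32_at (std::uint32_t v, int off)
{
  encoded e {};
  std::memset (e.mask, 0xff, sizeof e.mask);	// all indeterminate initially
  std::memcpy (e.bytes + off, &v, sizeof v);	// write 4 bytes...
  std::memset (e.mask + off, 0, sizeof v);	// ...and mark them determinate
  return e;
}

// Interpretation is allowed only if every byte in [off, off + 4) is
// determinate, matching the "accessing uninitialized byte" diagnostic.
bool can_interpret_u32 (const encoded &e, int off)
{
  for (int i = 0; i < 4; i++)
    if (e.mask[off + i])
      return false;
  return true;
}
```
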
>   /* Subroutine of cxx_eval_constant_expression.
>      Evaluate a short-circuited logical expression T in the context
>      of a given constexpr CALL.  BAILOUT_VALUE is the value for
> @@ -6537,6 +7339,10 @@ cxx_eval_constant_expression (const cons
>         *non_constant_p = true;
>         return t;
>   
> +    case BIT_CAST_EXPR:
> +      r = cxx_eval_bit_cast (ctx, t, non_constant_p, overflow_p);
> +      break;
> +
>       default:
>         if (STATEMENT_CODE_P (TREE_CODE (t)))
>   	{
> @@ -8315,6 +9121,9 @@ potential_constant_expression_1 (tree t,
>       case ANNOTATE_EXPR:
>         return RECUR (TREE_OPERAND (t, 0), rval);
>   
> +    case BIT_CAST_EXPR:
> +      return RECUR (TREE_OPERAND (t, 0), rval);
> +
>       /* Coroutine await, yield and return expressions are not.  */
>       case CO_AWAIT_EXPR:
>       case CO_YIELD_EXPR:
> --- gcc/cp/cp-gimplify.c.jj	2020-07-18 10:48:51.514492573 +0200
> +++ gcc/cp/cp-gimplify.c	2020-07-18 10:52:46.191021162 +0200
> @@ -1867,6 +1867,11 @@ cp_genericize_r (tree *stmt_p, int *walk
>   	}
>         break;
>   
> +    case BIT_CAST_EXPR:
> +      *stmt_p = build1_loc (EXPR_LOCATION (stmt), VIEW_CONVERT_EXPR,
> +			    TREE_TYPE (stmt), TREE_OPERAND (stmt, 0));
> +      break;
> +
>       default:
>         if (IS_TYPE_OR_DECL_P (stmt))
>   	*walk_subtrees = 0;
> --- gcc/testsuite/g++.dg/cpp2a/bit-cast1.C.jj	2020-07-18 10:52:46.191021162 +0200
> +++ gcc/testsuite/g++.dg/cpp2a/bit-cast1.C	2020-07-18 10:52:46.191021162 +0200
> @@ -0,0 +1,47 @@
> +// { dg-do compile }
> +
> +struct S { short a, b; };
> +struct T { float a[16]; };
> +struct U { int b[16]; };
> +
> +#if __SIZEOF_FLOAT__ == __SIZEOF_INT__
> +int
> +f1 (float x)
> +{
> +  return __builtin_bit_cast (int, x);
> +}
> +#endif
> +
> +#if 2 * __SIZEOF_SHORT__ == __SIZEOF_INT__
> +S
> +f2 (int x)
> +{
> +  return __builtin_bit_cast (S, x);
> +}
> +
> +int
> +f3 (S x)
> +{
> +  return __builtin_bit_cast (int, x);
> +}
> +#endif
> +
> +#if __SIZEOF_FLOAT__ == __SIZEOF_INT__
> +U
> +f4 (T &x)
> +{
> +  return __builtin_bit_cast (U, x);
> +}
> +
> +T
> +f5 (int (&x)[16])
> +{
> +  return __builtin_bit_cast (T, x);
> +}
> +#endif
> +
> +int
> +f6 ()
> +{
> +  return __builtin_bit_cast (unsigned char, (signed char) 0);
> +}
> --- gcc/testsuite/g++.dg/cpp2a/bit-cast2.C.jj	2020-07-18 10:52:46.191021162 +0200
> +++ gcc/testsuite/g++.dg/cpp2a/bit-cast2.C	2020-07-18 10:52:46.191021162 +0200
> @@ -0,0 +1,57 @@
> +// { dg-do compile }
> +
> +struct S { ~S (); int s; };
> +S s;
> +struct V;				// { dg-message "forward declaration of 'struct V'" }
> +extern V v;				// { dg-error "'v' has incomplete type" }
> +extern V *p;
> +struct U { int a, b; };
> +U u;
> +
> +void
> +foo (int *q)
> +{
> +  __builtin_bit_cast (int, s);		// { dg-error "'__builtin_bit_cast' source type 'S' is not trivially copyable" }
> +  __builtin_bit_cast (S, 0);		// { dg-error "'__builtin_bit_cast' destination type 'S' is not trivially copyable" }
> +  __builtin_bit_cast (int &, q);	// { dg-error "'__builtin_bit_cast' destination type 'int&' is not trivially copyable" }
> +  __builtin_bit_cast (int [1], 0);	// { dg-error "'__builtin_bit_cast' destination type \[^\n\r]* is an array type" }
> +  __builtin_bit_cast (V, 0);		// { dg-error "invalid use of incomplete type 'struct V'" }
> +  __builtin_bit_cast (int, v);
> +  __builtin_bit_cast (int, *p);		// { dg-error "invalid use of incomplete type 'struct V'" }
> +  __builtin_bit_cast (U, 0);		// { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
> +  __builtin_bit_cast (int, u);		// { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
> +}
> +
> +template <int N>
> +void
> +bar (int *q)
> +{
> +  __builtin_bit_cast (int, s);		// { dg-error "'__builtin_bit_cast' source type 'S' is not trivially copyable" }
> +  __builtin_bit_cast (S, 0);		// { dg-error "'__builtin_bit_cast' destination type 'S' is not trivially copyable" }
> +  __builtin_bit_cast (int &, q);	// { dg-error "'__builtin_bit_cast' destination type 'int&' is not trivially copyable" }
> +  __builtin_bit_cast (int [1], 0);	// { dg-error "'__builtin_bit_cast' destination type \[^\n\r]* is an array type" }
> +  __builtin_bit_cast (V, 0);		// { dg-error "invalid use of incomplete type 'struct V'" }
> +  __builtin_bit_cast (int, *p);		// { dg-error "invalid use of incomplete type 'struct V'" }
> +  __builtin_bit_cast (U, 0);		// { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
> +  __builtin_bit_cast (int, u);		// { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
> +}
> +
> +template <typename T1, typename T2, typename T3, typename T4>
> +void
> +baz (T3 s, T4 *p, T1 *q)
> +{
> +  __builtin_bit_cast (int, s);		// { dg-error "'__builtin_bit_cast' source type 'S' is not trivially copyable" }
> +  __builtin_bit_cast (T3, 0);		// { dg-error "'__builtin_bit_cast' destination type 'S' is not trivially copyable" }
> +  __builtin_bit_cast (T1 &, q);		// { dg-error "'__builtin_bit_cast' destination type 'int&' is not trivially copyable" }
> +  __builtin_bit_cast (T2, 0);		// { dg-error "'__builtin_bit_cast' destination type \[^\n\r]* is an array type" }
> +  __builtin_bit_cast (T4, 0);		// { dg-error "invalid use of incomplete type 'struct V'" }
> +  __builtin_bit_cast (int, *p);		// { dg-error "invalid use of incomplete type 'struct V'" }
> +  __builtin_bit_cast (U, (T1) 0);	// { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
> +  __builtin_bit_cast (T1, u);		// { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
> +}
> +
> +void
> +qux (int *q)
> +{
> +  baz <int, int [1], S, V> (s, p, q);
> +}
> --- gcc/testsuite/g++.dg/cpp2a/bit-cast3.C.jj	2020-07-18 10:52:46.191021162 +0200
> +++ gcc/testsuite/g++.dg/cpp2a/bit-cast3.C	2020-07-18 11:16:31.003976285 +0200
> @@ -0,0 +1,229 @@
> +// { dg-do compile { target c++11 } }
> +
> +template <typename To, typename From>
> +constexpr To
> +bit_cast (const From &from)
> +{
> +  return __builtin_bit_cast (To, from);
> +}
> +
> +template <typename To, typename From>
> +constexpr bool
> +check (const From &from)
> +{
> +  return bit_cast <From> (bit_cast <To> (from)) == from;
> +}
> +
> +struct A
> +{
> +  int a, b, c;
> +  constexpr bool operator == (const A &x) const
> +  {
> +    return x.a == a && x.b == b && x.c == c;
> +  }
> +};
> +
> +struct B
> +{
> +  unsigned a[3];
> +  constexpr bool operator == (const B &x) const
> +  {
> +    return x.a[0] == a[0] && x.a[1] == a[1] && x.a[2] == a[2];
> +  }
> +};
> +
> +struct C
> +{
> +  char a[2][3][2];
> +  constexpr bool operator == (const C &x) const
> +  {
> +    return x.a[0][0][0] == a[0][0][0]
> +	   && x.a[0][0][1] == a[0][0][1]
> +	   && x.a[0][1][0] == a[0][1][0]
> +	   && x.a[0][1][1] == a[0][1][1]
> +	   && x.a[0][2][0] == a[0][2][0]
> +	   && x.a[0][2][1] == a[0][2][1]
> +	   && x.a[1][0][0] == a[1][0][0]
> +	   && x.a[1][0][1] == a[1][0][1]
> +	   && x.a[1][1][0] == a[1][1][0]
> +	   && x.a[1][1][1] == a[1][1][1]
> +	   && x.a[1][2][0] == a[1][2][0]
> +	   && x.a[1][2][1] == a[1][2][1];
> +  }
> +};
> +
> +struct D
> +{
> +  int a, b;
> +  constexpr bool operator == (const D &x) const
> +  {
> +    return x.a == a && x.b == b;
> +  }
> +};
> +
> +struct E {};
> +struct F { char c, d, e, f; };
> +struct G : public D, E, F
> +{
> +  int g;
> +  constexpr bool operator == (const G &x) const
> +  {
> +    return x.a == a && x.b == b && x.c == c && x.d == d
> +	   && x.e == e && x.f == f && x.g == g;
> +  }
> +};
> +
> +struct H
> +{
> +  int a, b[2], c;
> +  constexpr bool operator == (const H &x) const
> +  {
> +    return x.a == a && x.b[0] == b[0] && x.b[1] == b[1] && x.c == c;
> +  }
> +};
> +
> +#if __SIZEOF_INT__ == 4
> +struct I
> +{
> +  int a;
> +  int b : 3;
> +  int c : 24;
> +  int d : 5;
> +  int e;
> +  constexpr bool operator == (const I &x) const
> +  {
> +    return x.a == a && x.b == b && x.c == c && x.d == d && x.e == e;
> +  }
> +};
> +#endif
> +
> +#if __SIZEOF_INT__ == 4 && __SIZEOF_LONG_LONG__ == 8
> +struct J
> +{
> +  long long int a, b : 11, c : 3, d : 37, e : 1, f : 10, g : 2, h;
> +  constexpr bool operator == (const J &x) const
> +  {
> +    return x.a == a && x.b == b && x.c == c && x.d == d && x.e == e
> +	   && x.f == f && x.g == g && x.h == h;
> +  }
> +};
> +
> +struct K
> +{
> +  long long int a, b, c;
> +  constexpr bool operator == (const K &x) const
> +  {
> +    return x.a == a && x.b == b && x.c == c;
> +  }
> +};
> +
> +struct M
> +{
> +  signed a : 6, b : 7, c : 6, d : 5;
> +  unsigned char e;
> +  unsigned int f;
> +  long long int g;
> +  constexpr bool operator == (const M &x) const
> +  {
> +    return x.a == a && x.b == b && x.c == c && x.d == d && x.e == e
> +	   && x.f == f && x.g == g;
> +  }
> +};
> +
> +struct N
> +{
> +  unsigned long long int a, b;
> +  constexpr bool operator == (const N &x) const
> +  {
> +    return x.a == a && x.b == b;
> +  }
> +};
> +#endif
> +
> +static_assert (check <unsigned int> (0), "");
> +static_assert (check <long long int> (0xdeadbeeffeedbac1ULL), "");
> +static_assert (check <signed char> ((unsigned char) 42), "");
> +static_assert (check <char> ((unsigned char) 42), "");
> +static_assert (check <unsigned char> ((unsigned char) 42), "");
> +static_assert (check <signed char> ((signed char) 42), "");
> +static_assert (check <char> ((signed char) 42), "");
> +static_assert (check <unsigned char> ((signed char) 42), "");
> +static_assert (check <signed char> ((char) 42), "");
> +static_assert (check <char> ((char) 42), "");
> +static_assert (check <unsigned char> ((char) 42), "");
> +#if __SIZEOF_INT__ == __SIZEOF_FLOAT__
> +static_assert (check <int> (2.5f), "");
> +static_assert (check <unsigned int> (136.5f), "");
> +#endif
> +#if __SIZEOF_LONG_LONG__ == __SIZEOF_DOUBLE__
> +static_assert (check <long long> (2.5), "");
> +static_assert (check <long long unsigned> (123456.75), "");
> +#endif
> +
> +static_assert (check <B> (A{ 1, 2, 3 }), "");
> +static_assert (check <A> (B{ 4, 5, 6 }), "");
> +
> +#if __SIZEOF_INT__ == 4
> +static_assert (check <C> (A{ 7, 8, 9 }), "");
> +static_assert (check <C> (B{ 10, 11, 12 }), "");
> +static_assert (check <A> (C{ { { { 13, 14 }, { 15, 16 }, { 17, 18 } },
> +			       { { 19, 20 }, { 21, 22 }, { 23, 24 } } } }), "");
> +constexpr unsigned char c[] = { 1, 2, 3, 4 };
> +#if __BYTE_ORDER__ == __ORDER_LITTLE_ENDIAN__
> +static_assert (bit_cast <unsigned int> (c) == 0x04030201U, "");
> +#elif __BYTE_ORDER__ == __ORDER_BIG_ENDIAN__
> +static_assert (bit_cast <unsigned int> (c) == 0x01020304U, "");
> +#endif
> +
> +#if __cplusplus >= 201703L
> +static_assert (check <G> (H { 0x12345678, { 0x23456789, 0x5a876543 }, 0x3ba78654 }), "");
> +#endif
> +constexpr int d[] = { 0x12345678, 0x23456789, 0x5a876543, 0x3ba78654 };
> +static_assert (bit_cast <G> (d) == bit_cast <G> (H { 0x12345678, { 0x23456789, 0x5a876543 }, 0x3ba78654 }), "");
> +
> +#if __BYTE_ORDER__ == __ORDER_LITTLE_ENDIAN__
> +static_assert (bit_cast <I> (A { 0x7efa3412, 0x5a876543, 0x1eeffeed })
> +	       == I { 0x7efa3412, 3, 0x50eca8, 0xb, 0x1eeffeed }, "");
> +static_assert (bit_cast <A> (I { 0x7efa3412, 3, 0x50eca8, 0xb, 0x1eeffeed })
> +	       == A { 0x7efa3412, 0x5a876543, 0x1eeffeed }, "");
> +#elif __BYTE_ORDER__ == __ORDER_BIG_ENDIAN__
> +static_assert (bit_cast <I> (A { 0x7efa3412, 0x5a876543, 0x1eeffeed })
> +	       == I { 0x7efa3412, 2, -0x2bc4d6, 0x3, 0x1eeffeed }, "");
> +static_assert (bit_cast <A> (I { 0x7efa3412, 2, -0x2bc4d6, 0x3, 0x1eeffeed })
> +	       == A { 0x7efa3412, 0x5a876543, 0x1eeffeed }, "");
> +#endif
> +#endif
> +
> +#if 2 * __SIZEOF_INT__ == __SIZEOF_LONG_LONG__ && __SIZEOF_INT__ >= 4
> +constexpr unsigned long long a = 0xdeadbeeffee1deadULL;
> +constexpr unsigned b[] = { 0xfeedbacU, 0xbeeffeedU };
> +#if __BYTE_ORDER__ == __ORDER_LITTLE_ENDIAN__
> +static_assert (bit_cast <D> (a) == D { int (0xfee1deadU), int (0xdeadbeefU) }, "");
> +static_assert (bit_cast <unsigned long long> (b) == 0xbeeffeed0feedbacULL, "");
> +#elif __BYTE_ORDER__ == __ORDER_BIG_ENDIAN__
> +static_assert (bit_cast <D> (a) == D { int (0xdeadbeefU), int (0xfee1deadU) }, "");
> +static_assert (bit_cast <unsigned long long> (b) == 0x0feedbacbeeffeedULL, "");
> +#endif
> +#endif
> +
> +#if __SIZEOF_INT__ == 4 && __SIZEOF_LONG_LONG__ == 8
> +#if __BYTE_ORDER__ == __ORDER_LITTLE_ENDIAN__
> +static_assert (bit_cast <J> (K { 0x0feedbacdeadbeefLL, 7862463375103529997LL, 0x0feedbacdeadbeefLL })
> +	       == J { 0x0feedbacdeadbeefLL, -1011, 2, -0xbacdeadbeLL, -1, -303, 1, 0x0feedbacdeadbeefLL }, "");
> +static_assert (bit_cast <K> (J { 0x0feedbacdeadbeefLL, -1011, 2, -0xbacdeadbeLL, -1, -303, 1, 0x0feedbacdeadbeefLL })
> +	       == K { 0x0feedbacdeadbeefLL, 7862463375103529997LL, 0x0feedbacdeadbeefLL }, "");
> +static_assert (bit_cast <M> (N { 0xfeedbacdeadbeef8ULL, 0x123456789abcde42ULL })
> +	       == M { -8, 59, 31, -5, 234, 0xfeedbacdU, 0x123456789abcde42ULL }, "");
> +static_assert (bit_cast <N> (M { -8, 59, 31, -5, 234, 0xfeedbacdU, 0x123456789abcde42ULL })
> +	       == N { 0xfeedbacdeadbeef8ULL, 0x123456789abcde42ULL }, "");
> +#elif __BYTE_ORDER__ == __ORDER_BIG_ENDIAN__
> +static_assert (bit_cast <J> (K { 0x0feedbacdeadbeefLL, -9103311533965288635LL, 0x0feedbacdeadbeefLL })
> +	       == J { 0x0feedbacdeadbeefLL, -1011, 2, -0xbacdeadbeLL, -1, -303, 1, 0x0feedbacdeadbeefLL }, "");
> +static_assert (bit_cast <K> (J { 0x0feedbacdeadbeefLL, -1011, 2, -0xbacdeadbeLL, -1, -303, 1, 0x0feedbacdeadbeefLL })
> +	       == K { 0x0feedbacdeadbeefLL, -9103311533965288635LL, 0x0feedbacdeadbeefLL }, "");
> +static_assert (bit_cast <M> (N { 0xfeedbacdeadbeef8ULL, 0x123456789abcde42ULL })
> +	       == M { -1, -35, -19, -6, 205, 0xeadbeef8U, 0x123456789abcde42ULL }, "");
> +static_assert (bit_cast <N> (M { -1, -35, -19, -6, 205, 0xeadbeef8U, 0x123456789abcde42ULL })
> +	       == N { 0xfeedbacdeadbeef8ULL, 0x123456789abcde42ULL }, "");
> +#endif
> +#endif
> --- gcc/testsuite/g++.dg/cpp2a/bit-cast4.C.jj	2020-07-18 10:52:46.191021162 +0200
> +++ gcc/testsuite/g++.dg/cpp2a/bit-cast4.C	2020-07-18 10:52:46.191021162 +0200
> @@ -0,0 +1,45 @@
> +// { dg-do compile { target c++11 } }
> +
> +template <typename To, typename From>
> +constexpr To
> +bit_cast (const From &from)
> +{
> +  return __builtin_bit_cast (To, from);
> +}
> +// { dg-error "'__builtin_bit_cast' is not constant expression because 'U' is a union type" "" { target *-*-* } 7 }
> +// { dg-error "'__builtin_bit_cast' is not constant expression because 'const U' is a union type" "" { target *-*-* } 7 }
> +// { dg-error "'__builtin_bit_cast' is not constant expression because 'B' contains a union type" "" { target *-*-* } 7 }
> +// { dg-error "'__builtin_bit_cast' is not constant expression because 'char\\\*' is a pointer type" "" { target *-*-* } 7 }
> +// { dg-error "'__builtin_bit_cast' is not constant expression because 'const int\\\* const' is a pointer type" "" { target *-*-* } 7 }
> +// { dg-error "'__builtin_bit_cast' is not constant expression because 'C' contains a pointer type" "" { target *-*-* } 7 }
> +// { dg-error "'__builtin_bit_cast' is not constant expression because 'const C' contains a pointer type" "" { target *-*-* } 7 }
> +// { dg-error "'__builtin_bit_cast' is not constant expression because 'int D::\\\* const' is a pointer to member type" "" { target *-*-* } 7 }
> +// { dg-error "'__builtin_bit_cast' is not constant expression because 'int \\\(D::\\\* const\\\)\\\(\\\) const' is a pointer to member type" "" { target *-*-* } 7 }
> +// { dg-error "'__builtin_bit_cast' is not constant expression because 'int D::\\\*' is a pointer to member type" "" { target *-*-* } 7 }
> +// { dg-error "'__builtin_bit_cast' is not constant expression because 'int \\\(D::\\\*\\\)\\\(\\\)' is a pointer to member type" "" { target *-*-* } 7 }
> +
> +union U { int u; };
> +struct A { int a; U b; };
> +struct B : public A { int c; };
> +struct C { const int *p; };
> +constexpr int a[] = { 1, 2, 3 };
> +constexpr const int *b = &a[0];
> +constexpr C c = { b };
> +struct D { int d; constexpr int foo () const { return 1; } };
> +constexpr int D::*d = &D::d;
> +constexpr int (D::*e) () const = &D::foo;
> +struct E { __INTPTR_TYPE__ e, f; };
> +constexpr E f = { 1, 2 };
> +constexpr U g { 0 };
> +
> +constexpr auto z = bit_cast <U> (0);
> +constexpr auto y = bit_cast <int> (g);
> +constexpr auto x = bit_cast <B> (a);
> +constexpr auto w = bit_cast <char *> ((__INTPTR_TYPE__) 0);
> +constexpr auto v = bit_cast <__UINTPTR_TYPE__> (b);
> +constexpr auto u = bit_cast <C> ((__INTPTR_TYPE__) 0);
> +constexpr auto t = bit_cast <__INTPTR_TYPE__> (c);
> +constexpr auto s = bit_cast <__INTPTR_TYPE__> (d);
> +constexpr auto r = bit_cast <E> (e);
> +constexpr auto q = bit_cast <int D::*> ((__INTPTR_TYPE__) 0);
> +constexpr auto p = bit_cast <int (D::*) ()> (f);
> --- gcc/testsuite/g++.dg/cpp2a/bit-cast5.C.jj	2020-07-18 10:52:46.192021147 +0200
> +++ gcc/testsuite/g++.dg/cpp2a/bit-cast5.C	2020-07-18 10:52:46.191021162 +0200
> @@ -0,0 +1,69 @@
> +// { dg-do compile { target { c++20 && { ilp32 || lp64 } } } }
> +
> +struct A { signed char a, b, c, d, e, f; };
> +struct B {};
> +struct C { B a, b; short c; B d; };
> +struct D { int a : 4, b : 24, c : 4; };
> +struct E { B a, b; short c; };
> +struct F { B a; signed char b, c; B d; };
> +
> +constexpr bool
> +f1 ()
> +{
> +  A a;
> +  a.c = 23; a.d = 42;
> +  C b = __builtin_bit_cast (C, a); // OK
> +  return false;
> +}
> +
> +constexpr bool
> +f2 ()
> +{
> +  A a;
> +  a.a = 1; a.b = 2; a.c = 3; a.e = 4; a.f = 5;
> +  C b = __builtin_bit_cast (C, a);	// { dg-error "'__builtin_bit_cast' accessing uninitialized byte at offset 3" }
> +  return false;
> +}
> +
> +constexpr bool
> +f3 ()
> +{
> +  D a;
> +  a.b = 1;
> +  F b = __builtin_bit_cast (F, a); // OK
> +  return false;
> +}
> +
> +constexpr bool
> +f4 ()
> +{
> +  D a;
> +  a.b = 1; a.c = 2;
> +  E b = __builtin_bit_cast (E, a); // OK
> +  return false;
> +}
> +
> +constexpr bool
> +f5 ()
> +{
> +  D a;
> +  a.b = 1;
> +  E b = __builtin_bit_cast (E, a);	// { dg-error "'__builtin_bit_cast' accessing uninitialized byte at offset 3" }
> +  return false;
> +}
> +
> +constexpr bool
> +f6 ()
> +{
> +  D a;
> +  a.c = 1;
> +  E b = __builtin_bit_cast (E, a);	// { dg-error "'__builtin_bit_cast' accessing uninitialized byte at offset 2" }
> +  return false;
> +}
> +
> +constexpr bool a = f1 ();
> +constexpr bool b = f2 ();
> +constexpr bool c = f3 ();
> +constexpr bool d = f4 ();
> +constexpr bool e = f5 ();
> +constexpr bool f = f6 ();
> 
> 	Jakub
> 


^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: [PATCH] c++: Add __builtin_bit_cast to implement std::bit_cast [PR93121]
  2020-07-30 14:16 ` Jason Merrill
@ 2020-07-30 14:57   ` Jakub Jelinek
  2020-07-30 21:59     ` Jason Merrill
  2020-11-02 19:21   ` [PATCH] c++: v2: " Jakub Jelinek
  1 sibling, 1 reply; 29+ messages in thread
From: Jakub Jelinek @ 2020-07-30 14:57 UTC (permalink / raw)
  To: Jason Merrill; +Cc: gcc-patches, Jonathan Wakely

On Thu, Jul 30, 2020 at 10:16:46AM -0400, Jason Merrill wrote:
> > It checks the various std::bit_cast requirements, when not constexpr
> > evaluated acts pretty much like VIEW_CONVERT_EXPR of the source argument
> > to the destination type and the hardest part is obviously the constexpr
> > evaluation.  I couldn't use the middle-end native_encode_initializer,
> > because it needs to handle the C++ CONSTRUCTOR_NO_CLEARING vs. negation of
> > that
> 
> CONSTRUCTOR_NO_CLEARING isn't specific to C++.

That is true, but the middle-end doesn't really use that much (except one
spot in the gimplifier and when deciding if two constructors are equal).

> > value initialization of missing members if there are any etc.
> 
> Doesn't native_encode_initializer handle omitted initializers already? Any
> elements for which the usual zero-initialization is wrong should already
> have explicit initializers by the time we get here.

It assumes zero initialization for everything that is not explicitly initialized.
When it is used in the middle-end, CONSTRUCTORs are either used in
initializers of non-automatic vars and then they are zero initialized, or
for automatic variables only empty CONSTRUCTORs are allowed.
So, what it does is memset the memory to zero before trying to fill in the
individual initializers for RECORD_TYPE/ARRAY_TYPE.

Even if we are guaranteed that (what guarantees it?) when __builtin_bit_cast
is constexpr evaluated no initializer will be omitted if it may be value-initialized
to something other than all zeros, we still need to track what bits are well
defined and what are not (e.g. any padding in there).
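
To illustrate the kind of tracking meant here (just a sketch with made-up
names, not the actual constexpr machinery), one can pair the encoded byte
buffer with a parallel mask recording which bytes were actually written:

```cpp
#include <cassert>
#include <cstddef>
#include <cstring>
#include <vector>

// Hypothetical sketch: track which bytes of an encoded object hold
// well-defined values, alongside the bytes themselves.
struct encoded_bytes {
  std::vector<unsigned char> buf;  // the encoded representation
  std::vector<bool> defined;       // true where buf[i] is well defined

  explicit encoded_bytes (std::size_t n) : buf (n, 0), defined (n, false) {}

  // Record a field's bytes at a given offset and mark them defined.
  void write (std::size_t off, const void *p, std::size_t len) {
    std::memcpy (&buf[off], p, len);
    for (std::size_t i = 0; i < len; ++i)
      defined[off + i] = true;
  }

  // A bit_cast-style read is only valid if every byte it touches was
  // written; padding bytes that were never written stay undefined.
  bool all_defined (std::size_t off, std::size_t len) const {
    for (std::size_t i = 0; i < len; ++i)
      if (!defined[off + i])
	return false;
    return true;
  }
};
```

With a struct { char a; int b; }, writing only a (offset 0, 1 byte) and
b (offset 4, 4 bytes) leaves bytes 1-3 unmarked, which is the information
needed to report something like "accessing uninitialized byte at offset 1".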

Thinking about it now, maybe the mask handling for !CONSTRUCTOR_NO_CLEARING
is incorrect though if there are missing initializers, because those omitted
initializers could still have padding with unspecified content, right?

> > needs to handle bitfields even if they don't have an integral representative
> 
> Can we use TYPE_MODE for this?

When the bitfield representative is an array, it will have BLKmode
TYPE_MODE, and we need to find some suitable other integral type.
The bit-field handling for BLKmode representatives can be copied to the middle-end
implementation.

> These all sound like things that could be improved in
> native_encode_initializer rather than duplicate a complex function.

I think in the end only small parts of it are actually duplicated.
It is true that both native_encode_initializer and the new C++ FE
counterpart are roughly 310 lines long, but the mask handling the latter
performs is unlikely useful for the middle-end (and if added there it would
need to allow it to be NULL and not filled); on the other hand, the very
complex handling the middle-end version needs to do when it can be asked
for only a small portion of the memory complicates things a lot.
Maybe we could say that non-NULL mask is only supported for the case where
one asks for everything and have some assertions for that, but I fear we'd
still end up with ~ 450-500 lines of code in the middle-end compared to the
current 310.

I guess I can try to merge the two back into one with an optional mask and see
how it looks.  But most likely the !CONSTRUCTOR_NO_CLEARING
handling etc. would need to be conditionalized on this special C++ mode
(mask non-NULL) etc.

> > +  if (REFERENCE_REF_P (arg))
> > +    arg = TREE_OPERAND (arg, 0);
> 
> Why?

I've added it so that the decay_conversion later on didn't mishandle
references to arrays.

> > +  if (error_operand_p (arg))
> > +    return error_mark_node;
> > +
> > +  if (!type_dependent_expression_p (arg))
> > +    {
> > +      if (TREE_CODE (TREE_TYPE (arg)) == ARRAY_TYPE)
> > +	{
> > +	  /* Don't perform array-to-pointer conversion.  */
> > +	  arg = mark_rvalue_use (arg, loc, true);
> > +	  if (!complete_type_or_maybe_complain (TREE_TYPE (arg), arg, complain))
> > +	    return error_mark_node;
> > +	}

But then I've added the above to better match the __builtin_bit_cast behavior
in clang, so maybe the if (REFERENCE_REF_P (arg)) is not needed anymore.
For the std::bit_cast uses, the argument will always be a reference, never
an array directly, and I'm not sure we want to encourage people to use the
builtin directly in their code.

> > +      else
> > +	arg = decay_conversion (arg, complain);
> > +
> > +      if (error_operand_p (arg))
> > +	return error_mark_node;
> > +
> > +      arg = convert_from_reference (arg);
> > +      if (!trivially_copyable_p (TREE_TYPE (arg)))
> > +	{
> > +	  error_at (cp_expr_loc_or_loc (arg, loc),
> > +		    "%<__builtin_bit_cast%> source type %qT "
> > +		    "is not trivially copyable", TREE_TYPE (arg));
> > +	  return error_mark_node;
> > +	}
> > +      if (!dependent_type_p (type)
> > +	  && !cp_tree_equal (TYPE_SIZE_UNIT (type),
> > +			     TYPE_SIZE_UNIT (TREE_TYPE (arg))))
> > +	{
> > +	  error_at (loc, "%<__builtin_bit_cast%> source size %qE "
> > +			 "not equal to destination type size %qE",
> > +			 TYPE_SIZE_UNIT (TREE_TYPE (arg)),
> > +			 TYPE_SIZE_UNIT (type));
> > +	  return error_mark_node;
> > +	}
> > +    }
> > +
> > +  tree ret = build_min (BIT_CAST_EXPR, type, arg);
> > +  SET_EXPR_LOCATION (ret, loc);
> > +  return ret;
> > +}
> > +
> >   #include "gt-cp-semantics.h"
> > --- gcc/cp/tree.c.jj	2020-07-18 10:48:51.738489259 +0200
> > +++ gcc/cp/tree.c	2020-07-18 10:52:46.188021206 +0200
> > @@ -3908,6 +3908,11 @@ cp_tree_equal (tree t1, tree t2)
> >   	return false;
> >         return true;
> > +    case BIT_CAST_EXPR:
> > +      if (!same_type_p (TREE_TYPE (t1), TREE_TYPE (t2)))
> > +	return false;
> > +      break;
> 
> Add this to the *_CAST_EXPR cases like you do in cp_walk_subtrees?

Ok.

> > +	{
> > +	  if (type == orig_type)
> > +	    error_at (loc, "%qs is not constant expression because %qT is "
> > +			   "a union type", "__builtin_bit_cast", type);
> 
> "not a constant expression"; you're missing the "a" in all of these.

Thanks, will change.

	Jakub



* Re: [PATCH] c++: Add __builtin_bit_cast to implement std::bit_cast [PR93121]
  2020-07-30 14:57   ` Jakub Jelinek
@ 2020-07-30 21:59     ` Jason Merrill
  2020-07-31  8:19       ` Jakub Jelinek
  0 siblings, 1 reply; 29+ messages in thread
From: Jason Merrill @ 2020-07-30 21:59 UTC (permalink / raw)
  To: Jakub Jelinek; +Cc: gcc-patches, Jonathan Wakely

On 7/30/20 10:57 AM, Jakub Jelinek wrote:
> On Thu, Jul 30, 2020 at 10:16:46AM -0400, Jason Merrill wrote:
>>> It checks the various std::bit_cast requirements, when not constexpr
>>> evaluated acts pretty much like VIEW_CONVERT_EXPR of the source argument
>>> to the destination type and the hardest part is obviously the constexpr
>>> evaluation.  I couldn't use the middle-end native_encode_initializer,
>>> because it needs to handle the C++ CONSTRUCTOR_NO_CLEARING vs. negation of
>>> that
>>
>> CONSTRUCTOR_NO_CLEARING isn't specific to C++.
> 
> That is true, but the middle-end doesn't really use that much (except one
> spot in the gimplifier and when deciding if two constructors are equal).
>>> value initialization of missing members if there are any etc.
>>
>> Doesn't native_encode_initializer handle omitted initializers already? Any
>> elements for which the usual zero-initialization is wrong should already
>> have explicit initializers by the time we get here.
> 
> It assumes zero initialization for everything that is not explicitly initialized.
> When it is used in the middle-end, CONSTRUCTORs are either used in
> initializers of non-automatic vars and then they are zero initialized, or
> for automatic variables only empty CONSTRUCTORs are allowed.
> So, what it does is memset the memory to zero before trying to fill in the
> individual initializers for RECORD_TYPE/ARRAY_TYPE.
> 
> Even if we are guaranteed that (what guarantees it?) when __builtin_bit_cast
> is constexpr evaluated no initializer will be omitted if it may be value-initialized
> to something other than all zeros,

This is established by digest_init/process_init_constructor_record, 
which replace implicit value-initialization with explicit values when 
it's not simple zero-initialization.
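
From the language side, the case digest_init has to make explicit can be
seen with a default member initializer, where value-initialization of an
omitted aggregate element is not plain zero-initialization (a hypothetical
illustration, not code from the patch):

```cpp
#include <cassert>

// An omitted aggregate element is initialized from its default member
// initializer, not zeroed, so the front end must materialize that value
// explicitly in the CONSTRUCTOR rather than rely on zero-clearing.
struct with_nsdmi {
  int a;
  int b = 42;   // value-initialization of b is not all-zeros
};

constexpr with_nsdmi w = { 7 };   // b omitted -> takes the NSDMI

static_assert (w.a == 7, "explicit initializer");
static_assert (w.b == 42, "omitted element takes the NSDMI, not zero");
```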

> we still need to track what bits are well
> defined and what are not (e.g. any padding in there).

> Thinking about it now, maybe the mask handling for !CONSTRUCTOR_NO_CLEARING
> is incorrect though if there are missing initializers, because those omitted
> > initializers could still have padding with unspecified content, right?

Zero-initialization (and thus trivial value-initialization) of a class 
also zeroes out padding bits, so when !CONSTRUCTOR_NO_CLEARING all bits 
should be well defined.
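
For an object with static storage duration this is observable today: the
object has a constant (all-zero) initializer, so every byte, padding
included, is zero in the emitted image (a small runtime check; the layout
with padding between the members is assumed, as on typical ABIs):

```cpp
#include <cassert>
#include <cstring>

// On typical ABIs there are three bytes of padding between c and i.
struct padded { char c; int i; };

// Value-initialized object with static storage duration: zero-initialized,
// padding bytes included.
padded g = padded ();

bool padding_is_zero ()
{
  unsigned char bytes[sizeof (padded)];
  std::memcpy (bytes, &g, sizeof g);
  for (unsigned char b : bytes)
    if (b != 0)
      return false;
  return true;
}
```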

>>> needs to handle bitfields even if they don't have an integral representative
>>
>> Can we use TYPE_MODE for this?
> 
> When the bitfield representative is an array, it will have BLKmode
> TYPE_MODE, and we need to find some suitable other integral type.
> The bit-field handling for BLKmode representatives can be copied to the middle-end
> implementation.
> 
>> These all sound like things that could be improved in
>> native_encode_initializer rather than duplicate a complex function.
> 
> I think in the end only small parts of it are actually duplicated.
> It is true that both native_encode_initializer and the new C++ FE
> counterpart are roughly 310 lines long, but the mask handling the latter
> performs is unlikely useful for the middle-end (and if added there it would
> need to allow it to be NULL and not filled); on the other hand, the very
> complex handling the middle-end version needs to do when it can be asked
> for only a small portion of the memory complicates things a lot.
> Maybe we could say that non-NULL mask is only supported for the case where
> one asks for everything and have some assertions for that, but I fear we'd
> still end up with ~ 450-500 lines of code in the middle-end compared to the
> current 310.
> 
> I guess I can try to merge the two back into one with an optional mask and see
> how it looks.  But most likely the !CONSTRUCTOR_NO_CLEARING
> handling etc. would need to be conditionalized on this special C++ mode
> (mask non-NULL) etc.
> 
>>> +  if (REFERENCE_REF_P (arg))
>>> +    arg = TREE_OPERAND (arg, 0);
>>
>> Why?
> 
> I've added it so that the decay_conversion later on didn't mishandle
> references to arrays.
> 
>>> +  if (error_operand_p (arg))
>>> +    return error_mark_node;
>>> +
>>> +  if (!type_dependent_expression_p (arg))
>>> +    {
>>> +      if (TREE_CODE (TREE_TYPE (arg)) == ARRAY_TYPE)
>>> +	{
>>> +	  /* Don't perform array-to-pointer conversion.  */
>>> +	  arg = mark_rvalue_use (arg, loc, true);
>>> +	  if (!complete_type_or_maybe_complain (TREE_TYPE (arg), arg, complain))
>>> +	    return error_mark_node;
>>> +	}
> 
> But then I've added the above to better match the __builtin_bit_cast behavior
> in clang, so maybe the if (REFERENCE_REF_P (arg)) is not needed anymore.
> For the std::bit_cast uses, the argument will always be a reference, never
> an array directly, and I'm not sure we want to encourage people to use the
> builtin directly in their code.
> 
>>> +      else
>>> +	arg = decay_conversion (arg, complain);
>>> +
>>> +      if (error_operand_p (arg))
>>> +	return error_mark_node;
>>> +
>>> +      arg = convert_from_reference (arg);
>>> +      if (!trivially_copyable_p (TREE_TYPE (arg)))
>>> +	{
>>> +	  error_at (cp_expr_loc_or_loc (arg, loc),
>>> +		    "%<__builtin_bit_cast%> source type %qT "
>>> +		    "is not trivially copyable", TREE_TYPE (arg));
>>> +	  return error_mark_node;
>>> +	}
>>> +      if (!dependent_type_p (type)
>>> +	  && !cp_tree_equal (TYPE_SIZE_UNIT (type),
>>> +			     TYPE_SIZE_UNIT (TREE_TYPE (arg))))
>>> +	{
>>> +	  error_at (loc, "%<__builtin_bit_cast%> source size %qE "
>>> +			 "not equal to destination type size %qE",
>>> +			 TYPE_SIZE_UNIT (TREE_TYPE (arg)),
>>> +			 TYPE_SIZE_UNIT (type));
>>> +	  return error_mark_node;
>>> +	}
>>> +    }
>>> +
>>> +  tree ret = build_min (BIT_CAST_EXPR, type, arg);
>>> +  SET_EXPR_LOCATION (ret, loc);
>>> +  return ret;
>>> +}
>>> +
>>>    #include "gt-cp-semantics.h"
>>> --- gcc/cp/tree.c.jj	2020-07-18 10:48:51.738489259 +0200
>>> +++ gcc/cp/tree.c	2020-07-18 10:52:46.188021206 +0200
>>> @@ -3908,6 +3908,11 @@ cp_tree_equal (tree t1, tree t2)
>>>    	return false;
>>>          return true;
>>> +    case BIT_CAST_EXPR:
>>> +      if (!same_type_p (TREE_TYPE (t1), TREE_TYPE (t2)))
>>> +	return false;
>>> +      break;
>>
>> Add this to the *_CAST_EXPR cases like you do in cp_walk_subtrees?
> 
> Ok.
> 
>>> +	{
>>> +	  if (type == orig_type)
>>> +	    error_at (loc, "%qs is not constant expression because %qT is "
>>> +			   "a union type", "__builtin_bit_cast", type);
>>
>> "not a constant expression"; you're missing the "a" in all of these.
> 
> Thanks, will change.
> 
> 	Jakub
> 



* Re: [PATCH] c++: Add __builtin_bit_cast to implement std::bit_cast [PR93121]
  2020-07-30 21:59     ` Jason Merrill
@ 2020-07-31  8:19       ` Jakub Jelinek
  2020-07-31  9:54         ` Jonathan Wakely
  0 siblings, 1 reply; 29+ messages in thread
From: Jakub Jelinek @ 2020-07-31  8:19 UTC (permalink / raw)
  To: Jason Merrill; +Cc: Jonathan Wakely, gcc-patches

On Thu, Jul 30, 2020 at 05:59:18PM -0400, Jason Merrill via Gcc-patches wrote:
> > Even if we are guaranteed that (what guarantees it?) when __builtin_bit_cast
> > is constexpr evaluated no initializer will be omitted if it may be value-initialized
> > to something other than all zeros,
> 
> This is established by digest_init/process_init_constructor_record, which
> replace implicit value-initialization with explicit values when it's not
> simple zero-initialization.

Ok, I see.

> > we still need to track what bits are well
> > defined and what are not (e.g. any padding in there).
> 
> > Thinking about it now, maybe the mask handling for !CONSTRUCTOR_NO_CLEARING
> > is incorrect though if there are missing initializers, because those omitted
>> > initializers could still have padding with unspecified content, right?
> 
> Zero-initialization (and thus trivial value-initialization) of a class also
> zeroes out padding bits, so when !CONSTRUCTOR_NO_CLEARING all bits should be
> well defined.

Does the standard require that somewhere?  Because that is not what the
compiler implements right now.
If I try:
extern "C" void *memcpy (void *, const void *, decltype (sizeof 0)) noexcept;
struct S { int a, b : 31, c; short d; signed char e; };
S a = S ();
constexpr S d = S ();

S
foo ()
{
  return S ();
}

void
bar (int *p)
{
  S b = foo ();
  S c = S ();
  memcpy (p, &a, sizeof (S));
  memcpy (p + 4, &b, sizeof (S));
  memcpy (p + 8, &c, sizeof (S));
  memcpy (p + 12, &d, sizeof (S));
}
then a will have the padding bit initialized iff it has a constant
initializer (as .data/.rodata initializers have all bits well defined),
but foo itself, the copying of foo's result to b, and the initialization of c
all make the padding bit unspecified.

The gimplifier indeed has code that fully clears !CONSTRUCTOR_NO_CLEARING
CONSTRUCTORs first, but that only triggers if
the CONSTRUCTOR is characterized as incomplete, which is checked by whether
all the initializable fields have a corresponding initializer element,
so it doesn't count padding bits in the middle or at the end of structures.

So, should std::bit_cast be well defined when reading the padding bits
in these cases even though at runtime they would be undefined?  Or do we
need to change the gimplifier (and expansion etc.) for C++?

As for the cases where the types in the destination aren't unsigned
char/std::byte, those have some special rules and Marek has filed a PR for
them.

	Jakub



* Re: [PATCH] c++: Add __builtin_bit_cast to implement std::bit_cast [PR93121]
  2020-07-31  8:19       ` Jakub Jelinek
@ 2020-07-31  9:54         ` Jonathan Wakely
  2020-07-31 10:06           ` Jakub Jelinek
  0 siblings, 1 reply; 29+ messages in thread
From: Jonathan Wakely @ 2020-07-31  9:54 UTC (permalink / raw)
  To: Jakub Jelinek; +Cc: Jason Merrill, gcc-patches

On 31/07/20 10:19 +0200, Jakub Jelinek wrote:
>On Thu, Jul 30, 2020 at 05:59:18PM -0400, Jason Merrill via Gcc-patches wrote:
>> > Even if we are guaranteed that (what guarantees it?) when __builtin_bit_cast
>> > is constexpr evaluated no initializer will be omitted if it may be value initialized
>> > to something other than all zeros,
>>
>> This is established by digest_init/process_init_constructor_record, which
>> replace implicit value-initialization with explicit values when it's not
>> simple zero-initialization.
>
>Ok, I see.
>
>> > we still need to track what bits are well
>> > defined and what are not (e.g. any padding in there).
>>
>> > Thinking about it now, maybe the mask handling for !CONSTRUCTOR_NO_CLEARING
>> > is incorrect though if there are missing initializers, because those omitted
>> > initializers still could have paddings with unspecified content, right?
>>
>> Zero-initialization (and thus trivial value-initialization) of a class also
>> zeroes out padding bits, so when !CONSTRUCTOR_NO_CLEARING all bits should be
>> well defined.
>
>Does the standard require that somewhere?  Because that is not what the
>compiler implements right now.

https://gcc.gnu.org/bugzilla/show_bug.cgi?id=78620

>If I try:
>extern "C" void *memcpy (void *, const void *, decltype (sizeof 0)) noexcept;
>struct S { int a, b : 31, c; short d; signed char e; };
>S a = S ();
>constexpr S d = S ();
>
>S
>foo ()
>{
>  return S ();
>}
>
>void
>bar (int *p)
>{
>  S b = foo ();
>  S c = S ();
>  memcpy (p, &a, sizeof (S));
>  memcpy (p + 4, &b, sizeof (S));
>  memcpy (p + 8, &c, sizeof (S));
>  memcpy (p + 12, &d, sizeof (S));
>}
>then a will have the padding bit initialized iff it has a constant
>initializer (as .data/.rodata initializers have all bits well defined),
>but both foo, the copying of foo result to b and the initialization of c
>all make the padding bit unspecified.
>
>The gimplifier indeed has code that for !CONSTRUCTOR_NO_CLEARING
>CONSTRUCTORs it will clear them fully first, but that only triggers if
>the CONSTRUCTOR is characterized as incomplete, which is checked by whether
>all the initializable fields have a corresponding initializer element,
>so it doesn't count padding bits, in the middle or at the end of structures.
>
>So, shall std::bit_cast be well defined if reading the padding bits
>from these cases even when at runtime they would be undefined?  Or do we
>need to change the gimplifier (and expansion etc.) for C++?
>
>Talking about the cases where the types in the destination aren't unsigned
>char/std::byte, that has some special rules and Marek has filed a PR for
>those.
>
>	Jakub



* Re: [PATCH] c++: Add __builtin_bit_cast to implement std::bit_cast [PR93121]
  2020-07-31  9:54         ` Jonathan Wakely
@ 2020-07-31 10:06           ` Jakub Jelinek
  2020-07-31 20:28             ` Jason Merrill
  0 siblings, 1 reply; 29+ messages in thread
From: Jakub Jelinek @ 2020-07-31 10:06 UTC (permalink / raw)
  To: Jonathan Wakely; +Cc: Jason Merrill, gcc-patches

On Fri, Jul 31, 2020 at 10:54:46AM +0100, Jonathan Wakely wrote:
> > Does the standard require that somewhere?  Because that is not what the
> > compiler implements right now.
> 
> https://gcc.gnu.org/bugzilla/show_bug.cgi?id=78620

Ah, ok.
But does that imply that all CONSTRUCTORs without CONSTRUCTOR_NO_CLEARING
need to be treated that way?  I mean, aren't such CONSTRUCTORs used also for
other initializations?
And, are the default copy constructors or assignment operators supposed to
also copy the padding bits, or do they become unspecified again through
that?

	Jakub



* Re: [PATCH] c++: Add __builtin_bit_cast to implement std::bit_cast [PR93121]
  2020-07-31 10:06           ` Jakub Jelinek
@ 2020-07-31 20:28             ` Jason Merrill
  2020-08-27 10:06               ` Jakub Jelinek
  0 siblings, 1 reply; 29+ messages in thread
From: Jason Merrill @ 2020-07-31 20:28 UTC (permalink / raw)
  To: Jakub Jelinek, Jonathan Wakely; +Cc: gcc-patches

On 7/31/20 6:06 AM, Jakub Jelinek wrote:
> On Fri, Jul 31, 2020 at 10:54:46AM +0100, Jonathan Wakely wrote:
>>> Does the standard require that somewhere?  Because that is not what the
>>> compiler implements right now.
>>
>> https://gcc.gnu.org/bugzilla/show_bug.cgi?id=78620
> 
> But does that imply that all CONSTRUCTORs without CONSTRUCTOR_NO_CLEARING
> need to be treated that way?  I mean, aren't such CONSTRUCTORs used also for
> other initializations?

Yes, they are also used to represent constant values of classes that are 
initialized by constexpr constructor.

> And, are the default copy constructors or assignment operators supposed to
> also copy the padding bits, or do they become unspecified again through
> that?

For a non-union class, a defaulted copy is defined as memberwise copy, 
not a copy of the entire object representation.  So I guess strictly 
speaking the padding bits do become unspecified.  But I think if the 
copy is trivial, in practice all implementations do copy the object 
representation; perhaps the specification should adjust accordingly.

Jason



* Re: [PATCH] c++: Add __builtin_bit_cast to implement std::bit_cast [PR93121]
  2020-07-31 20:28             ` Jason Merrill
@ 2020-08-27 10:06               ` Jakub Jelinek
  2020-08-27 10:19                 ` Richard Biener
                                   ` (3 more replies)
  0 siblings, 4 replies; 29+ messages in thread
From: Jakub Jelinek @ 2020-08-27 10:06 UTC (permalink / raw)
  To: Jason Merrill, Martin Jambor, Richard Biener, Iain Buclaw
  Cc: Jonathan Wakely, gcc-patches

On Fri, Jul 31, 2020 at 04:28:05PM -0400, Jason Merrill via Gcc-patches wrote:
> On 7/31/20 6:06 AM, Jakub Jelinek wrote:
> > On Fri, Jul 31, 2020 at 10:54:46AM +0100, Jonathan Wakely wrote:
> > > > Does the standard require that somewhere?  Because that is not what the
> > > > compiler implements right now.
> > > 
> > > https://gcc.gnu.org/bugzilla/show_bug.cgi?id=78620
> > 
> > But does that imply that all CONSTRUCTORs without CONSTRUCTOR_NO_CLEARING
> > need to be treated that way?  I mean, aren't such CONSTRUCTORs used also for
> > other initializations?
> 
> Yes, they are also used to represent constant values of classes that are
> initialized by constexpr constructor.
> 
> > And, are the default copy constructors or assignment operators supposed to
> > also copy the padding bits, or do they become unspecified again through
> > that?
> 
> For a non-union class, a defaulted copy is defined as memberwise copy, not a
> copy of the entire object representation.  So I guess strictly speaking the
> padding bits do become unspecified.  But I think if the copy is trivial, in
> practice all implementations do copy the object representation; perhaps the
> specification should adjust accordingly.

Sorry for not responding earlier.  I think at least in GCC there is no
guarantee the copying is copying the object representation rather than
memberwise copy, both are possible, depending e.g. whether SRA happens or
not.

So, shouldn't we have a new CONSTRUCTOR flag that will represent whether
padding bits are cleared or not and then use it e.g. in the gimplifier?
Right now the gimplifier only adds first zero initialization if
CONSTRUCTOR_NO_CLEARING is not set and some initializers are not present,
so if there is a new flag, we'd need to in that case find out if there are
any padding bits and do the zero initialization in that case.
A question is if GIMPLE var = {}; statement (empty CONSTRUCTOR) is handled
as zero initialization of also the padding bits, or if we should treat it
that way only if the CONSTRUCTOR on the rhs has the new bit set and e.g.
when lowering memset 0 into var = {}; set the bit too.
From what I understood on IRC, D has similar need for zero initialization of
padding.

In the testcase below, what is and what is not UB?

#include <bit>

struct S { int a : 31; int b; };
struct T { int a, b; };

constexpr int
foo ()
{
  S a = S ();
  S b = { 0, 0 };
  S c = a;
  S d;
  S e;
  d = a;
  e = S ();
  int u = std::bit_cast (T, a).a; // Is this well defined due to value initialization of a?
  int v = std::bit_cast (T, b).a; // But this is invalid, right?  There is no difference in the IL though.
  int w = std::bit_cast (T, c).a; // And this is also invalid, or are default copy ctors required to copy padding bits?
  int x = std::bit_cast (T, d).a; // Similarly for default copy assignment operators...
  int y = std::bit_cast (T, e).a; // And this too?
  int z = std::bit_cast (T, S ()).a; // This one is well defined?
  return u + v + w + x + y + z;
}

constexpr int x = foo ();

	Jakub



* Re: [PATCH] c++: Add __builtin_bit_cast to implement std::bit_cast [PR93121]
  2020-08-27 10:06               ` Jakub Jelinek
@ 2020-08-27 10:19                 ` Richard Biener
  2020-09-02 21:52                   ` Jason Merrill
  2020-08-27 10:46                 ` Jakub Jelinek
                                   ` (2 subsequent siblings)
  3 siblings, 1 reply; 29+ messages in thread
From: Richard Biener @ 2020-08-27 10:19 UTC (permalink / raw)
  To: Jakub Jelinek
  Cc: Jason Merrill, Martin Jambor, Iain Buclaw, Jonathan Wakely, gcc-patches

On Thu, 27 Aug 2020, Jakub Jelinek wrote:

> On Fri, Jul 31, 2020 at 04:28:05PM -0400, Jason Merrill via Gcc-patches wrote:
> > On 7/31/20 6:06 AM, Jakub Jelinek wrote:
> > > On Fri, Jul 31, 2020 at 10:54:46AM +0100, Jonathan Wakely wrote:
> > > > > Does the standard require that somewhere?  Because that is not what the
> > > > > compiler implements right now.
> > > > 
> > > > https://gcc.gnu.org/bugzilla/show_bug.cgi?id=78620
> > > 
> > > But does that imply that all CONSTRUCTORs without CONSTRUCTOR_NO_CLEARING
> > > need to be treated that way?  I mean, aren't such CONSTRUCTORs used also for
> > > other initializations?
> > 
> > Yes, they are also used to represent constant values of classes that are
> > initialized by constexpr constructor.
> > 
> > > And, are the default copy constructors or assignment operators supposed to
> > > also copy the padding bits, or do they become unspecified again through
> > > that?
> > 
> > For a non-union class, a defaulted copy is defined as memberwise copy, not a
> > copy of the entire object representation.  So I guess strictly speaking the
> > padding bits do become unspecified.  But I think if the copy is trivial, in
> > practice all implementations do copy the object representation; perhaps the
> > specification should adjust accordingly.
> 
> Sorry for not responding earlier.  I think at least in GCC there is no
> guarantee the copying is copying the object representation rather than
> memberwise copy, both are possible, depending e.g. whether SRA happens or
> not.

Note we've basically settled on that SRA needs to copy padding and that
GIMPLE copies all bytes for aggregate copies and thus

  x = {}

is equivalent to a memset.

> So, shouldn't we have a new CONSTRUCTOR flag that will represent whether
> padding bits are cleared or not and then use it e.g. in the gimplifier?
> Right now the gimplifier only adds first zero initialization if
> CONSTRUCTOR_NO_CLEARING is not set and some initializers are not present,
> so if there is a new flag, we'd need to in that case find out if there are
> any padding bits and do the zero initialization in that case.
> A question is if GIMPLE var = {}; statement (empty CONSTRUCTOR) is handled
> as zero initialization of also the padding bits, or if we should treat it
> that way only if the CONSTRUCTOR on the rhs has the new bit set and e.g.
> when lowering memset 0 into var = {}; set the bit too.
> From what I understood on IRC, D has similar need for zero initialization of
> padding.

Now indeed the gimplifier will turn an aggregate CTOR initialization
to memberwise init without caring for padding.  Which means GENERIC
has the less strict semantics and we indeed would need some CTOR flag
to tell whether padding is implicitly zero or undefined?

> In the testcase below, what is and what is not UB?
> 
> #include <bit>
> 
> struct S { int a : 31; int b; };
> struct T { int a, b; };
> 
> constexpr int
> foo ()
> {
>   S a = S ();
>   S b = { 0, 0 };
>   S c = a;
>   S d;
>   S e;
>   d = a;
>   e = S ();
>   int u = std::bit_cast (T, a).a; // Is this well defined due to value initialization of a?
>   int v = std::bit_cast (T, b).a; // But this is invalid, right?  There is no difference in the IL though.
>   int w = std::bit_cast (T, c).a; // And this is also invalid, or are default copy ctors required to copy padding bits?
>   int x = std::bit_cast (T, d).a; // Similarly for default copy assignment operators...
>   int y = std::bit_cast (T, e).a; // And this too?
>   int z = std::bit_cast (T, S ()).a; // This one is well defined?
>   return u + v + w + x + y + z;
> }
> 
> constexpr int x = foo ();
> 
> 	Jakub
> 
> 

-- 
Richard Biener <rguenther@suse.de>
SUSE Software Solutions Germany GmbH, Maxfeldstrasse 5, 90409 Nuernberg,
Germany; GF: Felix Imendörffer; HRB 36809 (AG Nuernberg)


* Re: [PATCH] c++: Add __builtin_bit_cast to implement std::bit_cast [PR93121]
  2020-08-27 10:06               ` Jakub Jelinek
  2020-08-27 10:19                 ` Richard Biener
@ 2020-08-27 10:46                 ` Jakub Jelinek
  2020-08-27 11:06                   ` Jonathan Wakely
  2020-08-27 10:59                 ` Jonathan Wakely
  2020-08-27 20:43                 ` Iain Buclaw
  3 siblings, 1 reply; 29+ messages in thread
From: Jakub Jelinek @ 2020-08-27 10:46 UTC (permalink / raw)
  To: Jason Merrill, Martin Jambor, Richard Biener, Iain Buclaw,
	Jonathan Wakely, gcc-patches

On Thu, Aug 27, 2020 at 12:06:13PM +0200, Jakub Jelinek via Gcc-patches wrote:

Oops, rewrote the testcase from __builtin_bit_cast to std::bit_cast without
adjusting the syntax properly.
Also, let's not use bitfields in there, as clang doesn't support those.
So, adjusted testcase below.  clang++ rejects all 6 of those, but from what
you said, I'd expect that u and z should be well defined.

#include <bit>

struct S { short a; int b; };
struct T { int a, b; };

constexpr int
foo ()
{
  S a = S ();
  S b = { 0, 0 };
  S c = a;
  S d;
  S e;
  d = a;
  e = S ();
  int u = std::bit_cast<T> (a).a; // Is this well defined due to value initialization of a?
  int v = std::bit_cast<T> (b).a; // But this is invalid, right?  There is no difference in the IL though.
  int w = std::bit_cast<T> (c).a; // And this is also invalid, or are default copy ctors required to copy padding bits?
  int x = std::bit_cast<T> (d).a; // Similarly for default copy assignment operators...
  int y = std::bit_cast<T> (e).a; // And this too?
  int z = std::bit_cast<T> (S ()).a; // This one is well defined?
  return u + v + w + x + y + z;
}

constexpr int x = foo ();

	Jakub



* Re: [PATCH] c++: Add __builtin_bit_cast to implement std::bit_cast [PR93121]
  2020-08-27 10:06               ` Jakub Jelinek
  2020-08-27 10:19                 ` Richard Biener
  2020-08-27 10:46                 ` Jakub Jelinek
@ 2020-08-27 10:59                 ` Jonathan Wakely
  2020-08-27 20:43                 ` Iain Buclaw
  3 siblings, 0 replies; 29+ messages in thread
From: Jonathan Wakely @ 2020-08-27 10:59 UTC (permalink / raw)
  To: Jakub Jelinek
  Cc: Jason Merrill, Martin Jambor, Richard Biener, Iain Buclaw, gcc-patches

On 27/08/20 12:06 +0200, Jakub Jelinek wrote:
>On Fri, Jul 31, 2020 at 04:28:05PM -0400, Jason Merrill via Gcc-patches wrote:
>> On 7/31/20 6:06 AM, Jakub Jelinek wrote:
>> > On Fri, Jul 31, 2020 at 10:54:46AM +0100, Jonathan Wakely wrote:
>> > > > Does the standard require that somewhere?  Because that is not what the
>> > > > compiler implements right now.
>> > >
>> > > https://gcc.gnu.org/bugzilla/show_bug.cgi?id=78620
>> >
>> > But does that imply that all CONSTRUCTORs without CONSTRUCTOR_NO_CLEARING
>> > need to be treated that way?  I mean, aren't such CONSTRUCTORs used also for
>> > other initializations?
>>
>> Yes, they are also used to represent constant values of classes that are
>> initialized by constexpr constructor.
>>
>> > And, are the default copy constructors or assignment operators supposed to
>> > also copy the padding bits, or do they become unspecified again through
>> > that?
>>
>> For a non-union class, a defaulted copy is defined as memberwise copy, not a
>> copy of the entire object representation.  So I guess strictly speaking the
>> padding bits do become unspecified.  But I think if the copy is trivial, in
>> practice all implementations do copy the object representation; perhaps the
>> specification should adjust accordingly.
>
>Sorry for not responding earlier.  I think at least in GCC there is no
>guarantee the copying is copying the object representation rather than
>memberwise copy, both are possible, depending e.g. whether SRA happens or
>not.
>
>So, shouldn't we have a new CONSTRUCTOR flag that will represent whether
>padding bits are cleared or not and then use it e.g. in the gimplifier?
>Right now the gimplifier only adds first zero initialization if
>CONSTRUCTOR_NO_CLEARING is not set and some initializers are not present,
>so if there is a new flag, we'd need to in that case find out if there are
>any padding bits and do the zero initialization in that case.
>A question is if GIMPLE var = {}; statement (empty CONSTRUCTOR) is handled
>as zero initialization of also the padding bits, or if we should treat it
>that way only if the CONSTRUCTOR on the rhs has the new bit set and e.g.
>when lowering memset 0 into var = {}; set the bit too.
>From what I understood on IRC, D has similar need for zero initialization of
>padding.
>
>In the testcase below, what is and what is not UB?
>
>#include <bit>
>
>struct S { int a : 31; int b; };
>struct T { int a, b; };
>
>constexpr int
>foo ()
>{
>  S a = S ();
>  S b = { 0, 0 };
>  S c = a;
>  S d;
>  S e;
>  d = a;
>  e = S ();
>  int u = std::bit_cast (T, a).a; // Is this well defined due to value initialization of a?
>  int v = std::bit_cast (T, b).a; // But this is invalid, right?  There is no difference in the IL though.
>  int w = std::bit_cast (T, c).a; // And this is also invalid, or are default copy ctors required to copy padding bits?

They are not required to. It does a memberwise copy of the bases and
members. Padding is not copied.

>  int x = std::bit_cast (T, d).a; // Similarly for default copy assignment operators...

Same again, memberwise assignment of the bases and members.

I'm not sure whether the previous values of padding bits have to be
preserved though. If the LHS was zero-initialized (so its padding bits
were zeroed) and you assign to it, I don't see anything that allows
the padding bits to change. But I don't think the intention is to
forbid using memcpy for trivial assignments, and that would copy
padding bits.

>  int y = std::bit_cast (T, e).a; // And this too?

Yes. I don't think e = S() is required to change the padding bits of
e to be copies of the zero bits in S(), so they are unspecified after
that assignment, and after the bit_cast.

>  int z = std::bit_cast (T, S ()).a; // This one is well defined?

Yes, I think so.

>  return u + v + w + x + y + z;
>}
>
>constexpr int x = foo ();
>
>	Jakub



* Re: [PATCH] c++: Add __builtin_bit_cast to implement std::bit_cast [PR93121]
  2020-08-27 10:46                 ` Jakub Jelinek
@ 2020-08-27 11:06                   ` Jonathan Wakely
  2020-08-27 11:17                     ` Jakub Jelinek
  0 siblings, 1 reply; 29+ messages in thread
From: Jonathan Wakely @ 2020-08-27 11:06 UTC (permalink / raw)
  To: Jakub Jelinek
  Cc: Jason Merrill, Martin Jambor, Richard Biener, Iain Buclaw, gcc-patches

On 27/08/20 12:46 +0200, Jakub Jelinek wrote:
>On Thu, Aug 27, 2020 at 12:06:13PM +0200, Jakub Jelinek via Gcc-patches wrote:
>
>Oops, rewrote the testcase from __builtin_bit_cast to std::bit_cast without
>adjusting the syntax properly.
>Also, let's not use bitfields in there, as clang doesn't support those.
>So, adjusted testcase below.  clang++ rejects all 6 of those, but from what
>you said, I'd expect that u and z should be well defined.
>
>#include <bit>
>
>struct S { short a; int b; };
>struct T { int a, b; };
>
>constexpr int
>foo ()
>{
>  S a = S ();
>  S b = { 0, 0 };
>  S c = a;
>  S d;
>  S e;
>  d = a;
>  e = S ();
>  int u = std::bit_cast<T> (a).a; // Is this well defined due to value initialization of a?

The standard says that padding bits in the bit_cast result are
unspecified, so I don't think they have to be copied even if the
source object was zero-initialized and has all-zero padding bits.



* Re: [PATCH] c++: Add __builtin_bit_cast to implement std::bit_cast [PR93121]
  2020-08-27 11:06                   ` Jonathan Wakely
@ 2020-08-27 11:17                     ` Jakub Jelinek
  2020-08-27 11:22                       ` Jonathan Wakely
  0 siblings, 1 reply; 29+ messages in thread
From: Jakub Jelinek @ 2020-08-27 11:17 UTC (permalink / raw)
  To: Jonathan Wakely
  Cc: Jason Merrill, Martin Jambor, Richard Biener, Iain Buclaw, gcc-patches

On Thu, Aug 27, 2020 at 12:06:59PM +0100, Jonathan Wakely wrote:
> On 27/08/20 12:46 +0200, Jakub Jelinek wrote:
> > On Thu, Aug 27, 2020 at 12:06:13PM +0200, Jakub Jelinek via Gcc-patches wrote:
> > 
> > Oops, rewrote the testcase from __builtin_bit_cast to std::bit_cast without
> > adjusting the syntax properly.
> > Also, let's not use bitfields in there, as clang doesn't support those.
> > So, adjusted testcase below.  clang++ rejects all 6 of those, but from what
> > you said, I'd expect that u and z should be well defined.
> > 
> > #include <bit>
> > 
> > struct S { short a; int b; };
> > struct T { int a, b; };
> > 
> > constexpr int
> > foo ()
> > {
> >  S a = S ();
> >  S b = { 0, 0 };
> >  S c = a;
> >  S d;
> >  S e;
> >  d = a;
> >  e = S ();
> >  int u = std::bit_cast<T> (a).a; // Is this well defined due to value initialization of a?
> 
> The standard says that padding bits in the bit_cast result are
> unspecified, so I don't think they have to be copied even if the
> source object was zero-initialized and has all-zero padding bits.

My understanding of
"Padding bits of the To object are unspecified."
is that one shouldn't treat the result of std::bit_cast as if it was e.g.
value initialization followed by member-wise assignment.
But it doesn't say anything about padding bits in the From object.
In the above testcase, T has no padding bits, only S has them.
I think the "Padding bits of the To object are unspecified." should be about:
  T t = { 0, 0 };
  int s = std::bit_cast<T> (std::bit_cast<S> (t)).a;
being UB, that one can't expect the padding bits in S to have a particular
value.

	Jakub



* Re: [PATCH] c++: Add __builtin_bit_cast to implement std::bit_cast [PR93121]
  2020-08-27 11:17                     ` Jakub Jelinek
@ 2020-08-27 11:22                       ` Jonathan Wakely
  0 siblings, 0 replies; 29+ messages in thread
From: Jonathan Wakely @ 2020-08-27 11:22 UTC (permalink / raw)
  To: Jakub Jelinek
  Cc: Jason Merrill, Martin Jambor, Richard Biener, Iain Buclaw, gcc-patches

On 27/08/20 13:17 +0200, Jakub Jelinek wrote:
>On Thu, Aug 27, 2020 at 12:06:59PM +0100, Jonathan Wakely wrote:
>> On 27/08/20 12:46 +0200, Jakub Jelinek wrote:
>> > On Thu, Aug 27, 2020 at 12:06:13PM +0200, Jakub Jelinek via Gcc-patches wrote:
>> >
>> > Oops, rewrote the testcase from __builtin_bit_cast to std::bit_cast without
>> > adjusting the syntax properly.
>> > Also, let's not use bitfields in there, as clang doesn't support those.
>> > So, adjusted testcase below.  clang++ rejects all 6 of those, but from what
>> > you said, I'd expect that u and z should be well defined.
>> >
>> > #include <bit>
>> >
>> > struct S { short a; int b; };
>> > struct T { int a, b; };
>> >
>> > constexpr int
>> > foo ()
>> > {
>> >  S a = S ();
>> >  S b = { 0, 0 };
>> >  S c = a;
>> >  S d;
>> >  S e;
>> >  d = a;
>> >  e = S ();
>> >  int u = std::bit_cast<T> (a).a; // Is this well defined due to value initialization of a?
>>
>> The standard says that padding bits in the bit_cast result are
>> unspecified, so I don't think they have to be copied even if the
>> source object was zero-initialized and has all-zero padding bits.
>
>My understanding of
>"Padding bits of the To object are unspecified."
>is that one shouldn't treat the result of std::bit_cast as if it was e.g.
>value initialization followed by member-wise assignment.
>But it doesn't say anything about padding bits in the From object.
>In the above testcase, T has no padding bits, only S has them.

Doh, yes, sorry.

>I think the "Padding bits of the To object are unspecified." should be about:
>  T t = { 0, 0 };
>  int s = std::bit_cast<T> (std::bit_cast<S> (t)).a;
>being UB, that one can't expect the padding bits in S to have a particular
>value.
>
>	Jakub



* Re: [PATCH] c++: Add __builtin_bit_cast to implement std::bit_cast [PR93121]
  2020-08-27 10:06               ` Jakub Jelinek
                                   ` (2 preceding siblings ...)
  2020-08-27 10:59                 ` Jonathan Wakely
@ 2020-08-27 20:43                 ` Iain Buclaw
  3 siblings, 0 replies; 29+ messages in thread
From: Iain Buclaw @ 2020-08-27 20:43 UTC (permalink / raw)
  To: Jakub Jelinek, Jason Merrill, Martin Jambor, Richard Biener
  Cc: gcc-patches, Jonathan Wakely

Excerpts from Jakub Jelinek's message of August 27, 2020 12:06 pm:
> On Fri, Jul 31, 2020 at 04:28:05PM -0400, Jason Merrill via Gcc-patches wrote:
>> On 7/31/20 6:06 AM, Jakub Jelinek wrote:
>> > On Fri, Jul 31, 2020 at 10:54:46AM +0100, Jonathan Wakely wrote:
>> > > > Does the standard require that somewhere?  Because that is not what the
>> > > > compiler implements right now.
>> > > 
>> > > https://gcc.gnu.org/bugzilla/show_bug.cgi?id=78620
>> > 
>> > But does that imply that all CONSTRUCTORs without CONSTRUCTOR_NO_CLEARING
>> > need to be treated that way?  I mean, aren't such CONSTRUCTORs used also for
>> > other initializations?
>> 
>> Yes, they are also used to represent constant values of classes that are
>> initialized by constexpr constructor.
>> 
>> > And, are the default copy constructors or assignment operators supposed to
>> > also copy the padding bits, or do they become unspecified again through
>> > that?
>> 
>> For a non-union class, a defaulted copy is defined as memberwise copy, not a
>> copy of the entire object representation.  So I guess strictly speaking the
>> padding bits do become unspecified.  But I think if the copy is trivial, in
>> practice all implementations do copy the object representation; perhaps the
>> specification should adjust accordingly.
> 
> Sorry for not responding earlier.  I think at least in GCC there is no
> guarantee the copying is copying the object representation rather than
> memberwise copy, both are possible, depending e.g. whether SRA happens or
> not.
> 
> So, shouldn't we have a new CONSTRUCTOR flag that will represent whether
> padding bits are cleared or not and then use it e.g. in the gimplifier?
> Right now the gimplifier only adds first zero initialization if
> CONSTRUCTOR_NO_CLEARING is not set and some initializers are not present,
> so if there is a new flag, we'd need to in that case find out if there are
> any padding bits and do the zero initialization in that case.
> A question is if GIMPLE var = {}; statement (empty CONSTRUCTOR) is handled
> as zero initialization of also the padding bits, or if we should treat it
> that way only if the CONSTRUCTOR on the rhs has the new bit set and e.g.
> when lowering memset 0 into var = {}; set the bit too.
> From what I understood on IRC, D has similar need for zero initialization of
> padding.
> 

Yes, by the D spec, all holes in structs and arrays should be zeroed at
all times.  This has resulted in some awkward lowerings in order to
attempt forcing the issue.  In particular, small structs not passed in
memory are prone to lose the value of padding bits after memset().

Iain.


* Re: [PATCH] c++: Add __builtin_bit_cast to implement std::bit_cast [PR93121]
  2020-08-27 10:19                 ` Richard Biener
@ 2020-09-02 21:52                   ` Jason Merrill
  0 siblings, 0 replies; 29+ messages in thread
From: Jason Merrill @ 2020-09-02 21:52 UTC (permalink / raw)
  To: Richard Biener, Jakub Jelinek
  Cc: Martin Jambor, Iain Buclaw, Jonathan Wakely, gcc-patches

On 8/27/20 6:19 AM, Richard Biener wrote:
> On Thu, 27 Aug 2020, Jakub Jelinek wrote:
> 
>> On Fri, Jul 31, 2020 at 04:28:05PM -0400, Jason Merrill via Gcc-patches wrote:
>>> On 7/31/20 6:06 AM, Jakub Jelinek wrote:
>>>> On Fri, Jul 31, 2020 at 10:54:46AM +0100, Jonathan Wakely wrote:
>>>>>> Does the standard require that somewhere?  Because that is not what the
>>>>>> compiler implements right now.
>>>>>
>>>>> https://gcc.gnu.org/bugzilla/show_bug.cgi?id=78620
>>>>
>>>> But does that imply that all CONSTRUCTORs without CONSTRUCTOR_NO_CLEARING
>>>> need to be treated that way?  I mean, aren't such CONSTRUCTORs used also for
>>>> other initializations?
>>>
>>> Yes, they are also used to represent constant values of classes that are
>>> initialized by constexpr constructor.
>>>
>>>> And, are the default copy constructors or assignment operators supposed to
>>>> also copy the padding bits, or do they become unspecified again through
>>>> that?
>>>
>>> For a non-union class, a defaulted copy is defined as memberwise copy, not a
>>> copy of the entire object representation.  So I guess strictly speaking the
>>> padding bits do become unspecified.  But I think if the copy is trivial, in
>>> practice all implementations do copy the object representation; perhaps the
>>> specification should adjust accordingly.
>>
>> Sorry for not responding earlier.  I think at least in GCC there is no
>> guarantee the copying is copying the object representation rather than
>> memberwise copy, both are possible, depending e.g. whether SRA happens or
>> not.
> 
> Note we've basically settled on that SRA needs to copy padding and that
> GIMPLE copies all bytes for aggregate copies and thus
> 
>    x = {}
> 
> is equivalent to a memset.
> 
>> So, shouldn't we have a new CONSTRUCTOR flag that will represent whether
>> padding bits are cleared or not and then use it e.g. in the gimplifier?
>> Right now the gimplifier only adds first zero initialization if
>> CONSTRUCTOR_NO_CLEARING is not set and some initializers are not present,
>> so if there is a new flag, we'd need to in that case find out if there are
>> any padding bits and do the zero initialization in that case.
>> A question is if GIMPLE var = {}; statement (empty CONSTRUCTOR) is handled
>> as zero initialization of also the padding bits, or if we should treat it
>> that way only if the CONSTRUCTOR on the rhs has the new bit set and e.g.
>> when lowering memset 0 into var = {}; set the bit too.
>>  From what I understood on IRC, D has similar need for zero initialization of
>> padding.
> 
> Now indeed the gimplifier will turn an aggregate CTOR initialization
> to memberwise init without caring for padding.  Which means GENERIC
> has the less strict semantics and we indeed would need some CTOR flag
> to tell whether padding is implicitly zero or undefined?

CONSTRUCTOR_NO_CLEARING would seem to already mean that, but the C++ 
front end uses it just to indicate whether some fields are 
uninitialized.  I suppose C++ could use a LANG_FLAG for that instead of 
the generic flag.

>> In the testcase below, what is and what is not UB?
>>
>> #include <bit>
>>
>> struct S { int a : 31; int b; };
>> struct T { int a, b; };
>>
>> constexpr int
>> foo ()
>> {
>>    S a = S ();
>>    S b = { 0, 0 };
>>    S c = a;
>>    S d;
>>    S e;
>>    d = a;
>>    e = S ();
>>    int u = std::bit_cast<T> (a).a; // Is this well defined due to value initialization of a?
>>    int v = std::bit_cast<T> (b).a; // But this is invalid, right?  There is no difference in the IL though.
>>    int w = std::bit_cast<T> (c).a; // And this is also invalid, or are default copy ctors required to copy padding bits?
>>    int x = std::bit_cast<T> (d).a; // Similarly for default copy assignment operators...
>>    int y = std::bit_cast<T> (e).a; // And this too?
>>    int z = std::bit_cast<T> (S ()).a; // This one is well defined?
>>    return u + v + w + x + y + z;
>> }
>>
>> constexpr int x = foo ();
>>
>> 	Jakub
>>
>>
> 


^ permalink raw reply	[flat|nested] 29+ messages in thread

* [PATCH] c++: v2: Add __builtin_bit_cast to implement std::bit_cast [PR93121]
  2020-07-30 14:16 ` Jason Merrill
  2020-07-30 14:57   ` Jakub Jelinek
@ 2020-11-02 19:21   ` Jakub Jelinek
  2020-11-25  0:31     ` Jeff Law
  2020-11-25 17:26     ` Jason Merrill
  1 sibling, 2 replies; 29+ messages in thread
From: Jakub Jelinek @ 2020-11-02 19:21 UTC (permalink / raw)
  To: Jason Merrill; +Cc: Jonathan Wakely, gcc-patches

On Thu, Jul 30, 2020 at 10:16:46AM -0400, Jason Merrill via Gcc-patches wrote:
> > The following patch adds a __builtin_bit_cast builtin, similarly to
> > clang or MSVC, which implement std::bit_cast using such a builtin too.
> 
> Great!

Sorry for the long delay.

The following patch implements what you requested in the review.
It doesn't deal with the padding bits being (sometimes) well defined, but if
we e.g. get a new CONSTRUCTOR bit that requests that (and a corresponding
gimplifier change to pre-initialize to all zeros), it could easily be
adjusted (for such CONSTRUCTORs, just memset the whole mask portion
(total_bytes) and set mask to NULL).  If there are other cases where such
handling is desirable (e.g. I wonder about bit_cast being called on
non-automatic const variables, whose CONSTRUCTOR would, at least
implementation-wise, end up in pre-zeroed .rodata (or .data) memory),
that can be handled too.

In the previous version I was using next_initializable_field, which isn't
available in the middle-end, so all I do right now is skip non-FIELD_DECLs,
unnamed bit-fields and fields with zero size.

Bootstrapped/regtested on x86_64-linux and i686-linux, ok for trunk?

2020-11-02  Jakub Jelinek  <jakub@redhat.com>

	PR libstdc++/93121
	* fold-const.h (native_encode_initializer): Add mask argument
	defaulted to nullptr.
	(find_bitfield_repr_type): Declare.
	* fold-const.c (find_bitfield_repr_type): New function.
	(native_encode_initializer): Add mask argument and support for
	filling it.  Handle also some bitfields without integral
	DECL_BIT_FIELD_REPRESENTATIVE.

	* c-common.h (enum rid): Add RID_BUILTIN_BIT_CAST.
	* c-common.c (c_common_reswords): Add __builtin_bit_cast.

	* cp-tree.h (cp_build_bit_cast): Declare.
	* cp-tree.def (BIT_CAST_EXPR): New tree code.
	* cp-objcp-common.c (names_builtin_p): Handle RID_BUILTIN_BIT_CAST.
	(cp_common_init_ts): Handle BIT_CAST_EXPR.
	* cxx-pretty-print.c (cxx_pretty_printer::postfix_expression):
	Likewise.
	* parser.c (cp_parser_postfix_expression): Handle
	RID_BUILTIN_BIT_CAST.
	* semantics.c (cp_build_bit_cast): New function.
	* tree.c (cp_tree_equal): Handle BIT_CAST_EXPR.
	(cp_walk_subtrees): Likewise.
	* pt.c (tsubst_copy): Likewise.
	* constexpr.c (check_bit_cast_type, cxx_native_interpret_aggregate,
	cxx_eval_bit_cast): New functions.
	(cxx_eval_constant_expression): Handle BIT_CAST_EXPR.
	(potential_constant_expression_1): Likewise.
	* cp-gimplify.c (cp_genericize_r): Likewise.

	* g++.dg/cpp2a/bit-cast1.C: New test.
	* g++.dg/cpp2a/bit-cast2.C: New test.
	* g++.dg/cpp2a/bit-cast3.C: New test.
	* g++.dg/cpp2a/bit-cast4.C: New test.
	* g++.dg/cpp2a/bit-cast5.C: New test.

--- gcc/fold-const.h.jj	2020-08-18 07:50:18.716920202 +0200
+++ gcc/fold-const.h	2020-11-02 15:53:59.859477063 +0100
@@ -27,9 +27,10 @@ extern int folding_initializer;
 /* Convert between trees and native memory representation.  */
 extern int native_encode_expr (const_tree, unsigned char *, int, int off = -1);
 extern int native_encode_initializer (tree, unsigned char *, int,
-				      int off = -1);
+				      int off = -1, unsigned char * = nullptr);
 extern tree native_interpret_expr (tree, const unsigned char *, int);
 extern bool can_native_interpret_type_p (tree);
+extern tree find_bitfield_repr_type (int, int);
 extern void shift_bytes_in_array_left (unsigned char *, unsigned int,
 				       unsigned int);
 extern void shift_bytes_in_array_right (unsigned char *, unsigned int,
--- gcc/fold-const.c.jj	2020-10-15 15:09:49.079725120 +0200
+++ gcc/fold-const.c	2020-11-02 17:20:43.784633491 +0100
@@ -7915,25 +7915,78 @@ native_encode_expr (const_tree expr, uns
     }
 }
 
+/* Try to find a type whose byte size is smaller than or equal to LEN bytes
+   and larger than or equal to FIELDSIZE bytes, with underlying mode
+   precision/size a multiple of BITS_PER_UNIT.  As native_{interpret,encode}_int
+   work on machine modes, we can't just use build_nonstandard_integer_type.  */
+
+tree
+find_bitfield_repr_type (int fieldsize, int len)
+{
+  machine_mode mode;
+  for (int pass = 0; pass < 2; pass++)
+    {
+      enum mode_class mclass = pass ? MODE_PARTIAL_INT : MODE_INT;
+      FOR_EACH_MODE_IN_CLASS (mode, mclass)
+	if (known_ge (GET_MODE_SIZE (mode), fieldsize)
+	    && known_eq (GET_MODE_PRECISION (mode),
+			 GET_MODE_BITSIZE (mode))
+	    && known_le (GET_MODE_SIZE (mode), len))
+	  {
+	    tree ret = lang_hooks.types.type_for_mode (mode, 1);
+	    if (ret && TYPE_MODE (ret) == mode)
+	      return ret;
+	  }
+    }
+
+  for (int i = 0; i < NUM_INT_N_ENTS; i ++)
+    if (int_n_enabled_p[i]
+	&& int_n_data[i].bitsize >= (unsigned) (BITS_PER_UNIT * fieldsize)
+	&& int_n_trees[i].unsigned_type)
+      {
+	tree ret = int_n_trees[i].unsigned_type;
+	mode = TYPE_MODE (ret);
+	if (known_ge (GET_MODE_SIZE (mode), fieldsize)
+	    && known_eq (GET_MODE_PRECISION (mode),
+			 GET_MODE_BITSIZE (mode))
+	    && known_le (GET_MODE_SIZE (mode), len))
+	  return ret;
+      }
+
+  return NULL_TREE;
+}
+
 /* Similar to native_encode_expr, but also handle CONSTRUCTORs, VCEs,
-   NON_LVALUE_EXPRs and nops.  */
+   NON_LVALUE_EXPRs and nops.  If MASK is non-NULL (in which case PTR
+   must be non-NULL and OFF zero), then in addition to filling the
+   bytes pointed to by PTR with the value, also clear any bits in MASK
+   that are known to be initialized; leave them set for e.g.
+   uninitialized padding bits or uninitialized fields.  */
 
 int
 native_encode_initializer (tree init, unsigned char *ptr, int len,
-			   int off)
+			   int off, unsigned char *mask)
 {
+  int r;
+
   /* We don't support starting at negative offset and -1 is special.  */
   if (off < -1 || init == NULL_TREE)
     return 0;
 
+  gcc_assert (mask == NULL || (off == 0 && ptr));
+
   STRIP_NOPS (init);
   switch (TREE_CODE (init))
     {
     case VIEW_CONVERT_EXPR:
     case NON_LVALUE_EXPR:
-      return native_encode_initializer (TREE_OPERAND (init, 0), ptr, len, off);
+      return native_encode_initializer (TREE_OPERAND (init, 0), ptr, len, off,
+					mask);
     default:
-      return native_encode_expr (init, ptr, len, off);
+      r = native_encode_expr (init, ptr, len, off);
+      if (mask)
+	memset (mask, 0, r);
+      return r;
     case CONSTRUCTOR:
       tree type = TREE_TYPE (init);
       HOST_WIDE_INT total_bytes = int_size_in_bytes (type);
@@ -7946,7 +7999,7 @@ native_encode_initializer (tree init, un
 	{
 	  HOST_WIDE_INT min_index;
 	  unsigned HOST_WIDE_INT cnt;
-	  HOST_WIDE_INT curpos = 0, fieldsize;
+	  HOST_WIDE_INT curpos = 0, fieldsize, valueinit = -1;
 	  constructor_elt *ce;
 
 	  if (TYPE_DOMAIN (type) == NULL_TREE
@@ -7961,12 +8014,22 @@ native_encode_initializer (tree init, un
 	  if (ptr != NULL)
 	    memset (ptr, '\0', MIN (total_bytes - off, len));
 
-	  FOR_EACH_VEC_SAFE_ELT (CONSTRUCTOR_ELTS (init), cnt, ce)
+	  for (cnt = 0; ; cnt++)
 	    {
-	      tree val = ce->value;
-	      tree index = ce->index;
+	      tree val = NULL_TREE, index = NULL_TREE;
 	      HOST_WIDE_INT pos = curpos, count = 0;
 	      bool full = false;
+	      if (vec_safe_iterate (CONSTRUCTOR_ELTS (init), cnt, &ce))
+		{
+		  val = ce->value;
+		  index = ce->index;
+		}
+	      else if (mask == NULL
+		       || CONSTRUCTOR_NO_CLEARING (init)
+		       || curpos >= total_bytes)
+		break;
+	      else
+		pos = total_bytes;
 	      if (index && TREE_CODE (index) == RANGE_EXPR)
 		{
 		  if (!tree_fits_shwi_p (TREE_OPERAND (index, 0))
@@ -7984,6 +8047,28 @@ native_encode_initializer (tree init, un
 		  pos = (tree_to_shwi (index) - min_index) * fieldsize;
 		}
 
+	      if (mask && !CONSTRUCTOR_NO_CLEARING (init) && curpos != pos)
+		{
+		  if (valueinit == -1)
+		    {
+		      tree zero = build_constructor (TREE_TYPE (type), NULL);
+		      r = native_encode_initializer (zero, ptr + curpos,
+						     fieldsize, 0,
+						     mask + curpos);
+		      ggc_free (zero);
+		      if (!r)
+			return 0;
+		      valueinit = curpos;
+		      curpos += fieldsize;
+		    }
+		  while (curpos != pos)
+		    {
+		      memcpy (ptr + curpos, ptr + valueinit, fieldsize);
+		      memcpy (mask + curpos, mask + valueinit, fieldsize);
+		      curpos += fieldsize;
+		    }
+		}
+
 	      curpos = pos;
 	      if (val)
 		do
@@ -7998,6 +8083,8 @@ native_encode_initializer (tree init, un
 			    if (ptr)
 			      memcpy (ptr + (curpos - o), ptr + (pos - o),
 				      fieldsize);
+			    if (mask)
+			      memcpy (mask + curpos, mask + pos, fieldsize);
 			  }
 			else if (!native_encode_initializer (val,
 							     ptr
@@ -8005,7 +8092,10 @@ native_encode_initializer (tree init, un
 							     : NULL,
 							     fieldsize,
 							     off == -1 ? -1
-								       : 0))
+								       : 0,
+							     mask
+							     ? mask + curpos
+							     : NULL))
 			  return 0;
 			else
 			  {
@@ -8020,6 +8110,7 @@ native_encode_initializer (tree init, un
 			unsigned char *p = NULL;
 			int no = 0;
 			int l;
+			gcc_assert (mask == NULL);
 			if (curpos >= off)
 			  {
 			    if (ptr)
@@ -8033,7 +8124,7 @@ native_encode_initializer (tree init, un
 			    no = off - curpos;
 			    l = len;
 			  }
-			if (!native_encode_initializer (val, p, l, no))
+			if (!native_encode_initializer (val, p, l, no, NULL))
 			  return 0;
 		      }
 		    curpos += fieldsize;
@@ -8047,22 +8138,75 @@ native_encode_initializer (tree init, un
 	{
 	  unsigned HOST_WIDE_INT cnt;
 	  constructor_elt *ce;
+	  tree fld_base = TYPE_FIELDS (type);
+	  tree to_free = NULL_TREE;
 
+	  gcc_assert (TREE_CODE (type) == RECORD_TYPE || mask == NULL);
 	  if (ptr != NULL)
 	    memset (ptr, '\0', MIN (total_bytes - off, len));
-	  FOR_EACH_VEC_SAFE_ELT (CONSTRUCTOR_ELTS (init), cnt, ce)
+	  for (cnt = 0; ; cnt++)
 	    {
-	      tree field = ce->index;
-	      tree val = ce->value;
-	      HOST_WIDE_INT pos, fieldsize;
+	      tree val = NULL_TREE, field = NULL_TREE;
+	      HOST_WIDE_INT pos = 0, fieldsize;
 	      unsigned HOST_WIDE_INT bpos = 0, epos = 0;
 
-	      if (field == NULL_TREE)
-		return 0;
+	      if (to_free)
+		{
+		  ggc_free (to_free);
+		  to_free = NULL_TREE;
+		}
 
-	      pos = int_byte_position (field);
-	      if (off != -1 && (HOST_WIDE_INT) off + len <= pos)
-		continue;
+	      if (vec_safe_iterate (CONSTRUCTOR_ELTS (init), cnt, &ce))
+		{
+		  val = ce->value;
+		  field = ce->index;
+		  if (field == NULL_TREE)
+		    return 0;
+
+		  pos = int_byte_position (field);
+		  if (off != -1 && (HOST_WIDE_INT) off + len <= pos)
+		    continue;
+		}
+	      else if (mask == NULL
+		       || CONSTRUCTOR_NO_CLEARING (init))
+		break;
+	      else
+		pos = total_bytes;
+
+	      if (mask && !CONSTRUCTOR_NO_CLEARING (init))
+		{
+		  tree fld;
+		  for (fld = fld_base; fld; fld = DECL_CHAIN (fld))
+		    {
+		      if (TREE_CODE (fld) != FIELD_DECL)
+			continue;
+		      if (fld == field)
+			break;
+		      if (DECL_BIT_FIELD (fld)
+			  && DECL_NAME (fld) == NULL_TREE)
+			continue;
+		      if (DECL_SIZE_UNIT (fld) == NULL_TREE
+			  || !tree_fits_shwi_p (DECL_SIZE_UNIT (fld)))
+			return 0;
+		      if (integer_zerop (DECL_SIZE_UNIT (fld)))
+			continue;
+		      break;
+		    }
+		  if (fld == NULL_TREE)
+		    {
+		      if (ce == NULL)
+			break;
+		      return 0;
+		    }
+		  fld_base = DECL_CHAIN (fld);
+		  if (fld != field)
+		    {
+		      cnt--;
+		      field = fld;
+		      val = build_constructor (TREE_TYPE (fld), NULL);
+		      to_free = val;
+		    }
+		}
 
 	      if (TREE_CODE (TREE_TYPE (field)) == ARRAY_TYPE
 		  && TYPE_DOMAIN (TREE_TYPE (field))
@@ -8103,30 +8247,49 @@ native_encode_initializer (tree init, un
 		  if (BYTES_BIG_ENDIAN != WORDS_BIG_ENDIAN)
 		    return 0;
 
-		  tree repr = DECL_BIT_FIELD_REPRESENTATIVE (field);
-		  if (repr == NULL_TREE
-		      || TREE_CODE (val) != INTEGER_CST
-		      || !INTEGRAL_TYPE_P (TREE_TYPE (repr)))
+		  if (TREE_CODE (val) != INTEGER_CST)
 		    return 0;
 
-		  HOST_WIDE_INT rpos = int_byte_position (repr);
+		  tree repr = DECL_BIT_FIELD_REPRESENTATIVE (field);
+		  tree repr_type = NULL_TREE;
+		  HOST_WIDE_INT rpos = 0;
+		  if (repr && INTEGRAL_TYPE_P (TREE_TYPE (repr)))
+		    {
+		      rpos = int_byte_position (repr);
+		      repr_type = TREE_TYPE (repr);
+		    }
+		  else
+		    {
+		      repr_type = find_bitfield_repr_type (fieldsize, len);
+		      if (repr_type == NULL_TREE)
+			return 0;
+		      HOST_WIDE_INT repr_size = int_size_in_bytes (repr_type);
+		      gcc_assert (repr_size > 0 && repr_size <= len);
+		      if (pos + repr_size <= len)
+			rpos = pos;
+		      else
+			{
+			  rpos = len - repr_size;
+			  gcc_assert (rpos <= pos);
+			}
+		    }
+
 		  if (rpos > pos)
 		    return 0;
-		  wide_int w = wi::to_wide (val,
-					    TYPE_PRECISION (TREE_TYPE (repr)));
-		  int diff = (TYPE_PRECISION (TREE_TYPE (repr))
+		  wide_int w = wi::to_wide (val, TYPE_PRECISION (repr_type));
+		  int diff = (TYPE_PRECISION (repr_type)
 			      - TYPE_PRECISION (TREE_TYPE (field)));
 		  HOST_WIDE_INT bitoff = (pos - rpos) * BITS_PER_UNIT + bpos;
 		  if (!BYTES_BIG_ENDIAN)
 		    w = wi::lshift (w, bitoff);
 		  else
 		    w = wi::lshift (w, diff - bitoff);
-		  val = wide_int_to_tree (TREE_TYPE (repr), w);
+		  val = wide_int_to_tree (repr_type, w);
 
 		  unsigned char buf[MAX_BITSIZE_MODE_ANY_INT
 				    / BITS_PER_UNIT + 1];
 		  int l = native_encode_int (val, buf, sizeof buf, 0);
-		  if (l * BITS_PER_UNIT != TYPE_PRECISION (TREE_TYPE (repr)))
+		  if (l * BITS_PER_UNIT != TYPE_PRECISION (repr_type))
 		    return 0;
 
 		  if (ptr == NULL)
@@ -8139,15 +8302,31 @@ native_encode_initializer (tree init, un
 		    {
 		      if (!BYTES_BIG_ENDIAN)
 			{
-			  int mask = (1 << bpos) - 1;
-			  buf[pos - rpos] &= ~mask;
-			  buf[pos - rpos] |= ptr[pos - o] & mask;
+			  int msk = (1 << bpos) - 1;
+			  buf[pos - rpos] &= ~msk;
+			  buf[pos - rpos] |= ptr[pos - o] & msk;
+			  if (mask)
+			    {
+			      if (fieldsize > 1 || epos == 0)
+				mask[pos] &= msk;
+			      else
+				mask[pos] &= (msk | ~((1 << epos) - 1));
+			    }
 			}
 		      else
 			{
-			  int mask = (1 << (BITS_PER_UNIT - bpos)) - 1;
-			  buf[pos - rpos] &= mask;
-			  buf[pos - rpos] |= ptr[pos - o] & ~mask;
+			  int msk = (1 << (BITS_PER_UNIT - bpos)) - 1;
+			  buf[pos - rpos] &= msk;
+			  buf[pos - rpos] |= ptr[pos - o] & ~msk;
+			  if (mask)
+			    {
+			      if (fieldsize > 1 || epos == 0)
+				mask[pos] &= ~msk;
+			      else
+				mask[pos] &= (~msk
+					      | ((1 << (BITS_PER_UNIT - epos))
+						 - 1));
+			    }
 			}
 		    }
 		  /* If the bitfield does not end at byte boundary, handle
@@ -8158,27 +8337,37 @@ native_encode_initializer (tree init, un
 		    {
 		      if (!BYTES_BIG_ENDIAN)
 			{
-			  int mask = (1 << epos) - 1;
-			  buf[pos - rpos + fieldsize - 1] &= mask;
+			  int msk = (1 << epos) - 1;
+			  buf[pos - rpos + fieldsize - 1] &= msk;
 			  buf[pos - rpos + fieldsize - 1]
-			    |= ptr[pos + fieldsize - 1 - o] & ~mask;
+			    |= ptr[pos + fieldsize - 1 - o] & ~msk;
+			  if (mask && (fieldsize > 1 || bpos == 0))
+			    mask[pos + fieldsize - 1] &= ~msk;
 			}
 		       else
 			{
-			  int mask = (1 << (BITS_PER_UNIT - epos)) - 1;
-			  buf[pos - rpos + fieldsize - 1] &= ~mask;
+			  int msk = (1 << (BITS_PER_UNIT - epos)) - 1;
+			  buf[pos - rpos + fieldsize - 1] &= ~msk;
 			  buf[pos - rpos + fieldsize - 1]
-			    |= ptr[pos + fieldsize - 1 - o] & mask;
+			    |= ptr[pos + fieldsize - 1 - o] & msk;
+			  if (mask && (fieldsize > 1 || bpos == 0))
+			    mask[pos + fieldsize - 1] &= msk;
 			}
 		    }
 		  if (off == -1
 		      || (pos >= off
 			  && (pos + fieldsize <= (HOST_WIDE_INT) off + len)))
-		    memcpy (ptr + pos - o, buf + (pos - rpos), fieldsize);
+		    {
+		      memcpy (ptr + pos - o, buf + (pos - rpos), fieldsize);
+		      if (mask && (fieldsize > (bpos != 0) + (epos != 0)))
+			memset (mask + pos + (bpos != 0), 0,
+				fieldsize - (bpos != 0) - (epos != 0));
+		    }
 		  else
 		    {
 		      /* Partial overlap.  */
 		      HOST_WIDE_INT fsz = fieldsize;
+		      gcc_assert (mask == NULL);
 		      if (pos < off)
 			{
 			  fsz -= (off - pos);
@@ -8198,7 +8387,8 @@ native_encode_initializer (tree init, un
 		  if (!native_encode_initializer (val, ptr ? ptr + pos - o
 							   : NULL,
 						  fieldsize,
-						  off == -1 ? -1 : 0))
+						  off == -1 ? -1 : 0,
+						  mask ? mask + pos : NULL))
 		    return 0;
 		}
 	      else
@@ -8207,6 +8397,7 @@ native_encode_initializer (tree init, un
 		  unsigned char *p = NULL;
 		  int no = 0;
 		  int l;
+		  gcc_assert (mask == NULL);
 		  if (pos >= off)
 		    {
 		      if (ptr)
@@ -8220,7 +8411,7 @@ native_encode_initializer (tree init, un
 		      no = off - pos;
 		      l = len;
 		    }
-		  if (!native_encode_initializer (val, p, l, no))
+		  if (!native_encode_initializer (val, p, l, no, NULL))
 		    return 0;
 		}
 	    }
--- gcc/c-family/c-common.h.jj	2020-10-27 14:36:42.182246025 +0100
+++ gcc/c-family/c-common.h	2020-11-02 13:42:18.699819747 +0100
@@ -164,7 +164,7 @@ enum rid
   RID_HAS_NOTHROW_COPY,        RID_HAS_TRIVIAL_ASSIGN,
   RID_HAS_TRIVIAL_CONSTRUCTOR, RID_HAS_TRIVIAL_COPY,
   RID_HAS_TRIVIAL_DESTRUCTOR,  RID_HAS_UNIQUE_OBJ_REPRESENTATIONS,
-  RID_HAS_VIRTUAL_DESTRUCTOR,
+  RID_HAS_VIRTUAL_DESTRUCTOR,  RID_BUILTIN_BIT_CAST,
   RID_IS_ABSTRACT,             RID_IS_AGGREGATE,
   RID_IS_BASE_OF,              RID_IS_CLASS,
   RID_IS_EMPTY,                RID_IS_ENUM,
--- gcc/c-family/c-common.c.jj	2020-10-27 14:36:42.182246025 +0100
+++ gcc/c-family/c-common.c	2020-11-02 13:42:18.700819736 +0100
@@ -374,6 +374,7 @@ const struct c_common_resword c_common_r
   { "__auto_type",	RID_AUTO_TYPE,	D_CONLY },
   { "__bases",          RID_BASES, D_CXXONLY },
   { "__builtin_addressof", RID_ADDRESSOF, D_CXXONLY },
+  { "__builtin_bit_cast", RID_BUILTIN_BIT_CAST, D_CXXONLY },
   { "__builtin_call_with_static_chain",
     RID_BUILTIN_CALL_WITH_STATIC_CHAIN, D_CONLY },
   { "__builtin_choose_expr", RID_CHOOSE_EXPR, D_CONLY },
--- gcc/cp/cp-tree.h.jj	2020-10-30 08:59:57.041496709 +0100
+++ gcc/cp/cp-tree.h	2020-11-02 13:42:18.701819725 +0100
@@ -7273,6 +7273,8 @@ extern tree finish_builtin_launder		(loc
 						 tsubst_flags_t);
 extern tree cp_build_vec_convert		(tree, location_t, tree,
 						 tsubst_flags_t);
+extern tree cp_build_bit_cast			(location_t, tree, tree,
+						 tsubst_flags_t);
 extern void start_lambda_scope			(tree);
 extern void record_lambda_scope			(tree);
 extern void record_null_lambda_scope		(tree);
--- gcc/cp/cp-tree.def.jj	2020-09-21 11:15:53.715518437 +0200
+++ gcc/cp/cp-tree.def	2020-11-02 13:42:18.701819725 +0100
@@ -437,6 +437,9 @@ DEFTREECODE (UNARY_RIGHT_FOLD_EXPR, "una
 DEFTREECODE (BINARY_LEFT_FOLD_EXPR, "binary_left_fold_expr", tcc_expression, 3)
 DEFTREECODE (BINARY_RIGHT_FOLD_EXPR, "binary_right_fold_expr", tcc_expression, 3)
 
+/* Represents the __builtin_bit_cast (type, expr) expression.
+   The type is in TREE_TYPE, expression in TREE_OPERAND (bitcast, 0).  */
+DEFTREECODE (BIT_CAST_EXPR, "bit_cast_expr", tcc_expression, 1)
 
 /** C++ extensions. */
 
--- gcc/cp/cp-objcp-common.c.jj	2020-09-21 11:15:53.715518437 +0200
+++ gcc/cp/cp-objcp-common.c	2020-11-02 13:42:18.701819725 +0100
@@ -390,6 +390,7 @@ names_builtin_p (const char *name)
     case RID_BUILTIN_HAS_ATTRIBUTE:
     case RID_BUILTIN_SHUFFLE:
     case RID_BUILTIN_LAUNDER:
+    case RID_BUILTIN_BIT_CAST:
     case RID_OFFSETOF:
     case RID_HAS_NOTHROW_ASSIGN:
     case RID_HAS_NOTHROW_CONSTRUCTOR:
@@ -488,6 +489,7 @@ cp_common_init_ts (void)
   MARK_TS_EXP (ALIGNOF_EXPR);
   MARK_TS_EXP (ARROW_EXPR);
   MARK_TS_EXP (AT_ENCODE_EXPR);
+  MARK_TS_EXP (BIT_CAST_EXPR);
   MARK_TS_EXP (CAST_EXPR);
   MARK_TS_EXP (CONST_CAST_EXPR);
   MARK_TS_EXP (CTOR_INITIALIZER);
--- gcc/cp/cxx-pretty-print.c.jj	2020-10-19 09:32:35.503914847 +0200
+++ gcc/cp/cxx-pretty-print.c	2020-11-02 13:42:18.702819714 +0100
@@ -655,6 +655,15 @@ cxx_pretty_printer::postfix_expression (
       pp_right_paren (this);
       break;
 
+    case BIT_CAST_EXPR:
+      pp_cxx_ws_string (this, "__builtin_bit_cast");
+      pp_left_paren (this);
+      type_id (TREE_TYPE (t));
+      pp_comma (this);
+      expression (TREE_OPERAND (t, 0));
+      pp_right_paren (this);
+      break;
+
     case EMPTY_CLASS_EXPR:
       type_id (TREE_TYPE (t));
       pp_left_paren (this);
--- gcc/cp/parser.c.jj	2020-11-02 13:33:46.039434035 +0100
+++ gcc/cp/parser.c	2020-11-02 13:42:18.705819681 +0100
@@ -7234,6 +7234,32 @@ cp_parser_postfix_expression (cp_parser
 				     tf_warning_or_error);
       }
 
+    case RID_BUILTIN_BIT_CAST:
+      {
+	tree expression;
+	tree type;
+	/* Consume the `__builtin_bit_cast' token.  */
+	cp_lexer_consume_token (parser->lexer);
+	/* Look for the opening `('.  */
+	matching_parens parens;
+	parens.require_open (parser);
+	location_t type_location
+	  = cp_lexer_peek_token (parser->lexer)->location;
+	/* Parse the type-id.  */
+	{
+	  type_id_in_expr_sentinel s (parser);
+	  type = cp_parser_type_id (parser);
+	}
+	/* Look for the `,'.  */
+	cp_parser_require (parser, CPP_COMMA, RT_COMMA);
+	/* Now, parse the assignment-expression.  */
+	expression = cp_parser_assignment_expression (parser);
+	/* Look for the closing `)'.  */
+	parens.require_close (parser);
+	return cp_build_bit_cast (type_location, type, expression,
+				  tf_warning_or_error);
+      }
+
     default:
       {
 	tree type;
--- gcc/cp/semantics.c.jj	2020-10-30 09:16:30.816279461 +0100
+++ gcc/cp/semantics.c	2020-11-02 13:42:18.706819671 +0100
@@ -10591,4 +10591,76 @@ cp_build_vec_convert (tree arg, location
   return build_call_expr_internal_loc (loc, IFN_VEC_CONVERT, type, 1, arg);
 }
 
+/* Finish __builtin_bit_cast (type, arg).  */
+
+tree
+cp_build_bit_cast (location_t loc, tree type, tree arg,
+		   tsubst_flags_t complain)
+{
+  if (error_operand_p (type))
+    return error_mark_node;
+  if (!dependent_type_p (type))
+    {
+      if (!complete_type_or_maybe_complain (type, NULL_TREE, complain))
+	return error_mark_node;
+      if (TREE_CODE (type) == ARRAY_TYPE)
+	{
+	  /* std::bit_cast for destination ARRAY_TYPE is not possible,
+	     as functions may not return an array, so don't bother trying
+	     to support this (and then deal with VLAs etc.).  */
+	  error_at (loc, "%<__builtin_bit_cast%> destination type %qT "
+			 "is an array type", type);
+	  return error_mark_node;
+	}
+      if (!trivially_copyable_p (type))
+	{
+	  error_at (loc, "%<__builtin_bit_cast%> destination type %qT "
+			 "is not trivially copyable", type);
+	  return error_mark_node;
+	}
+    }
+
+  if (error_operand_p (arg))
+    return error_mark_node;
+
+  if (!type_dependent_expression_p (arg))
+    {
+      if (TREE_CODE (TREE_TYPE (arg)) == ARRAY_TYPE)
+	{
+	  /* Don't perform array-to-pointer conversion.  */
+	  arg = mark_rvalue_use (arg, loc, true);
+	  if (!complete_type_or_maybe_complain (TREE_TYPE (arg), arg, complain))
+	    return error_mark_node;
+	}
+      else
+	arg = decay_conversion (arg, complain);
+
+      if (error_operand_p (arg))
+	return error_mark_node;
+
+      arg = convert_from_reference (arg);
+      if (!trivially_copyable_p (TREE_TYPE (arg)))
+	{
+	  error_at (cp_expr_loc_or_loc (arg, loc),
+		    "%<__builtin_bit_cast%> source type %qT "
+		    "is not trivially copyable", TREE_TYPE (arg));
+	  return error_mark_node;
+	}
+      if (!dependent_type_p (type)
+	  && !cp_tree_equal (TYPE_SIZE_UNIT (type),
+			     TYPE_SIZE_UNIT (TREE_TYPE (arg))))
+	{
+	  error_at (loc, "%<__builtin_bit_cast%> source size %qE "
+			 "not equal to destination type size %qE",
+			 TYPE_SIZE_UNIT (TREE_TYPE (arg)),
+			 TYPE_SIZE_UNIT (type));
+	  return error_mark_node;
+	}
+    }
+
+  tree ret = build_min (BIT_CAST_EXPR, type, arg);
+  SET_EXPR_LOCATION (ret, loc);
+  return ret;
+}
+
 #include "gt-cp-semantics.h"
--- gcc/cp/tree.c.jj	2020-10-08 10:58:22.023037307 +0200
+++ gcc/cp/tree.c	2020-11-02 13:42:18.707819660 +0100
@@ -3868,6 +3868,7 @@ cp_tree_equal (tree t1, tree t2)
     CASE_CONVERT:
     case NON_LVALUE_EXPR:
     case VIEW_CONVERT_EXPR:
+    case BIT_CAST_EXPR:
       if (!same_type_p (TREE_TYPE (t1), TREE_TYPE (t2)))
 	return false;
       /* Now compare operands as usual.  */
@@ -5112,6 +5113,7 @@ cp_walk_subtrees (tree *tp, int *walk_su
     case CONST_CAST_EXPR:
     case DYNAMIC_CAST_EXPR:
     case IMPLICIT_CONV_EXPR:
+    case BIT_CAST_EXPR:
       if (TREE_TYPE (*tp))
 	WALK_SUBTREE (TREE_TYPE (*tp));
 
--- gcc/cp/pt.c.jj	2020-10-30 09:16:30.818279438 +0100
+++ gcc/cp/pt.c	2020-11-02 13:42:18.709819638 +0100
@@ -16741,6 +16741,13 @@ tsubst_copy (tree t, tree args, tsubst_f
 	return build1 (code, type, op0);
       }
 
+    case BIT_CAST_EXPR:
+      {
+	tree type = tsubst (TREE_TYPE (t), args, complain, in_decl);
+	tree op0 = tsubst_copy (TREE_OPERAND (t, 0), args, complain, in_decl);
+	return cp_build_bit_cast (EXPR_LOCATION (t), type, op0, complain);
+      }
+
     case SIZEOF_EXPR:
       if (PACK_EXPANSION_P (TREE_OPERAND (t, 0))
 	  || ARGUMENT_PACK_P (TREE_OPERAND (t, 0)))
--- gcc/cp/constexpr.c.jj	2020-10-30 08:59:57.039496732 +0100
+++ gcc/cp/constexpr.c	2020-11-02 15:55:14.259664820 +0100
@@ -3912,6 +3912,442 @@ cxx_eval_bit_field_ref (const constexpr_
   return error_mark_node;
 }
 
+/* Helper for cxx_eval_bit_cast.
+   Check [bit.cast]/3 rules, bit_cast is constexpr only if the To and From
+   types and types of all subobjects have is_union_v<T>, is_pointer_v<T>,
+   is_member_pointer_v<T>, is_volatile_v<T> false and has no non-static
+   data members of reference type.  */
+
+static bool
+check_bit_cast_type (const constexpr_ctx *ctx, location_t loc, tree type,
+		     tree orig_type)
+{
+  if (TREE_CODE (type) == UNION_TYPE)
+    {
+      if (!ctx->quiet)
+	{
+	  if (type == orig_type)
+	    error_at (loc, "%qs is not a constant expression because %qT is "
+			   "a union type", "__builtin_bit_cast", type);
+	  else
+	    error_at (loc, "%qs is not a constant expression because %qT "
+			   "contains a union type", "__builtin_bit_cast",
+		      orig_type);
+	}
+      return true;
+    }
+  if (TREE_CODE (type) == POINTER_TYPE)
+    {
+      if (!ctx->quiet)
+	{
+	  if (type == orig_type)
+	    error_at (loc, "%qs is not a constant expression because %qT is "
+			   "a pointer type", "__builtin_bit_cast", type);
+	  else
+	    error_at (loc, "%qs is not a constant expression because %qT "
+			   "contains a pointer type", "__builtin_bit_cast",
+		      orig_type);
+	}
+      return true;
+    }
+  if (TREE_CODE (type) == REFERENCE_TYPE)
+    {
+      if (!ctx->quiet)
+	{
+	  if (type == orig_type)
+	    error_at (loc, "%qs is not a constant expression because %qT is "
+			   "a reference type", "__builtin_bit_cast", type);
+	  else
+	    error_at (loc, "%qs is not a constant expression because %qT "
+			   "contains a reference type", "__builtin_bit_cast",
+		      orig_type);
+	}
+      return true;
+    }
+  if (TYPE_PTRMEM_P (type))
+    {
+      if (!ctx->quiet)
+	{
+	  if (type == orig_type)
+	    error_at (loc, "%qs is not a constant expression because %qT is "
+			   "a pointer to member type", "__builtin_bit_cast",
+		      type);
+	  else
+	    error_at (loc, "%qs is not a constant expression because %qT "
+			   "contains a pointer to member type",
+		      "__builtin_bit_cast", orig_type);
+	}
+      return true;
+    }
+  if (TYPE_VOLATILE (type))
+    {
+      if (!ctx->quiet)
+	{
+	  if (type == orig_type)
+	    error_at (loc, "%qs is not a constant expression because %qT is "
+			   "volatile", "__builtin_bit_cast", type);
+	  else
+	    error_at (loc, "%qs is not a constant expression because %qT "
+			   "contains a volatile subobject",
+		      "__builtin_bit_cast", orig_type);
+	}
+      return true;
+    }
+  if (TREE_CODE (type) == RECORD_TYPE)
+    for (tree field = TYPE_FIELDS (type); field; field = DECL_CHAIN (field))
+      if (TREE_CODE (field) == FIELD_DECL
+	  && check_bit_cast_type (ctx, loc, TREE_TYPE (field), orig_type))
+	return true;
+  return false;
+}
+
+/* Attempt to interpret aggregate of TYPE from bytes encoded in target
+   byte order at PTR + OFF with LEN bytes.  MASK contains bits set if the value
+   is indeterminate.  */
+
+static tree
+cxx_native_interpret_aggregate (tree type, const unsigned char *ptr, int off,
+				int len, unsigned char *mask,
+				const constexpr_ctx *ctx, bool *non_constant_p,
+				location_t loc)
+{
+  vec<constructor_elt, va_gc> *elts = NULL;
+  if (TREE_CODE (type) == ARRAY_TYPE)
+    {
+      HOST_WIDE_INT eltsz = int_size_in_bytes (TREE_TYPE (type));
+      if (eltsz < 0 || eltsz > len || TYPE_DOMAIN (type) == NULL_TREE)
+	return error_mark_node;
+
+      HOST_WIDE_INT cnt = 0;
+      if (TYPE_MAX_VALUE (TYPE_DOMAIN (type)))
+	{
+	  if (!tree_fits_shwi_p (TYPE_MAX_VALUE (TYPE_DOMAIN (type))))
+	    return error_mark_node;
+	  cnt = tree_to_shwi (TYPE_MAX_VALUE (TYPE_DOMAIN (type))) + 1;
+	}
+      if (eltsz == 0)
+	cnt = 0;
+      HOST_WIDE_INT pos = 0;
+      for (HOST_WIDE_INT i = 0; i < cnt; i++, pos += eltsz)
+	{
+	  tree v = error_mark_node;
+	  if (pos >= len || pos + eltsz > len)
+	    return error_mark_node;
+	  if (can_native_interpret_type_p (TREE_TYPE (type)))
+	    {
+	      v = native_interpret_expr (TREE_TYPE (type),
+					 ptr + off + pos, eltsz);
+	      if (v == NULL_TREE)
+		return error_mark_node;
+	      for (int i = 0; i < eltsz; i++)
+		if (mask[off + pos + i])
+		  {
+		    if (!ctx->quiet)
+		      error_at (loc, "%qs accessing uninitialized byte at "
+				     "offset %d",
+				"__builtin_bit_cast", (int) (off + pos + i));
+		    *non_constant_p = true;
+		    return error_mark_node;
+		  }
+	    }
+	  else if (TREE_CODE (TREE_TYPE (type)) == RECORD_TYPE
+		   || TREE_CODE (TREE_TYPE (type)) == ARRAY_TYPE)
+	    v = cxx_native_interpret_aggregate (TREE_TYPE (type),
+						ptr, off + pos, eltsz,
+						mask, ctx,
+						non_constant_p, loc);
+	  if (v == error_mark_node)
+	    return error_mark_node;
+	  CONSTRUCTOR_APPEND_ELT (elts, size_int (i), v);
+	}
+      return build_constructor (type, elts);
+    }
+  gcc_assert (TREE_CODE (type) == RECORD_TYPE);
+  for (tree field = next_initializable_field (TYPE_FIELDS (type));
+       field; field = next_initializable_field (DECL_CHAIN (field)))
+    {
+      tree fld = field;
+      HOST_WIDE_INT bitoff = 0, pos = 0, sz = 0;
+      int diff = 0;
+      tree v = error_mark_node;
+      if (DECL_BIT_FIELD (field))
+	{
+	  fld = DECL_BIT_FIELD_REPRESENTATIVE (field);
+	  if (fld && INTEGRAL_TYPE_P (TREE_TYPE (fld)))
+	    {
+	      poly_int64 bitoffset;
+	      poly_uint64 field_offset, fld_offset;
+	      if (poly_int_tree_p (DECL_FIELD_OFFSET (field), &field_offset)
+		  && poly_int_tree_p (DECL_FIELD_OFFSET (fld), &fld_offset))
+		bitoffset = (field_offset - fld_offset) * BITS_PER_UNIT;
+	      else
+		bitoffset = 0;
+	      bitoffset += (tree_to_uhwi (DECL_FIELD_BIT_OFFSET (field))
+			    - tree_to_uhwi (DECL_FIELD_BIT_OFFSET (fld)));
+	      diff = (TYPE_PRECISION (TREE_TYPE (fld))
+		      - TYPE_PRECISION (TREE_TYPE (field)));
+	      if (!bitoffset.is_constant (&bitoff)
+		  || bitoff < 0
+		  || bitoff > diff)
+		return error_mark_node;
+	    }
+	  else
+	    {
+	      if (!tree_fits_uhwi_p (DECL_FIELD_BIT_OFFSET (field)))
+		return error_mark_node;
+	      int fieldsize = TYPE_PRECISION (TREE_TYPE (field));
+	      int bpos = tree_to_uhwi (DECL_FIELD_BIT_OFFSET (field));
+	      bpos %= BITS_PER_UNIT;
+	      fieldsize += bpos;
+	      fieldsize += BITS_PER_UNIT - 1;
+	      fieldsize /= BITS_PER_UNIT;
+	      tree repr_type = find_bitfield_repr_type (fieldsize, len);
+	      if (repr_type == NULL_TREE)
+		return error_mark_node;
+	      sz = int_size_in_bytes (repr_type);
+	      if (sz < 0 || sz > len)
+		return error_mark_node;
+	      pos = int_byte_position (field);
+	      if (pos < 0 || pos > len || pos + fieldsize > len)
+		return error_mark_node;
+	      HOST_WIDE_INT rpos;
+	      if (pos + sz <= len)
+		rpos = pos;
+	      else
+		{
+		  rpos = len - sz;
+		  gcc_assert (rpos <= pos);
+		}
+	      bitoff = (HOST_WIDE_INT) (pos - rpos) * BITS_PER_UNIT + bpos;
+	      pos = rpos;
+	      diff = (TYPE_PRECISION (repr_type)
+		      - TYPE_PRECISION (TREE_TYPE (field)));
+	      v = native_interpret_expr (repr_type, ptr + off + pos, sz);
+	      if (v == NULL_TREE)
+		return error_mark_node;
+	      fld = NULL_TREE;
+	    }
+	}
+
+      if (fld)
+	{
+	  sz = int_size_in_bytes (TREE_TYPE (fld));
+	  if (sz < 0 || sz > len)
+	    return error_mark_node;
+	  tree byte_pos = byte_position (fld);
+	  if (!tree_fits_shwi_p (byte_pos))
+	    return error_mark_node;
+	  pos = tree_to_shwi (byte_pos);
+	  if (pos < 0 || pos > len || pos + sz > len)
+	    return error_mark_node;
+	}
+      if (fld == NULL_TREE)
+	/* Already handled above.  */;
+      else if (can_native_interpret_type_p (TREE_TYPE (fld)))
+	{
+	  v = native_interpret_expr (TREE_TYPE (fld),
+				     ptr + off + pos, sz);
+	  if (v == NULL_TREE)
+	    return error_mark_node;
+	  if (fld == field)
+	    for (int i = 0; i < sz; i++)
+	      if (mask[off + pos + i])
+		{
+		  if (!ctx->quiet)
+		    error_at (loc,
+			      "%qs accessing uninitialized byte at offset %d",
+			      "__builtin_bit_cast", (int) (off + pos + i));
+		  *non_constant_p = true;
+		  return error_mark_node;
+		}
+	}
+      else if (TREE_CODE (TREE_TYPE (fld)) == RECORD_TYPE
+	       || TREE_CODE (TREE_TYPE (fld)) == ARRAY_TYPE)
+	v = cxx_native_interpret_aggregate (TREE_TYPE (fld),
+					    ptr, off + pos, sz, mask,
+					    ctx, non_constant_p, loc);
+      if (v == error_mark_node)
+	return error_mark_node;
+      if (fld != field)
+	{
+	  if (TREE_CODE (v) != INTEGER_CST)
+	    return error_mark_node;
+
+	  /* FIXME: Figure out how to handle PDP endian bitfields.  */
+	  if (BYTES_BIG_ENDIAN != WORDS_BIG_ENDIAN)
+	    return error_mark_node;
+	  if (!BYTES_BIG_ENDIAN)
+	    v = wide_int_to_tree (TREE_TYPE (field),
+				  wi::lrshift (wi::to_wide (v), bitoff));
+	  else
+	    v = wide_int_to_tree (TREE_TYPE (field),
+				  wi::lrshift (wi::to_wide (v),
+					       diff - bitoff));
+	  int bpos = bitoff % BITS_PER_UNIT;
+	  int bpos_byte = bitoff / BITS_PER_UNIT;
+	  int fieldsize = TYPE_PRECISION (TREE_TYPE (field));
+	  fieldsize += bpos;
+	  int epos = fieldsize % BITS_PER_UNIT;
+	  fieldsize += BITS_PER_UNIT - 1;
+	  fieldsize /= BITS_PER_UNIT;
+	  int bad = -1;
+	  if (bpos)
+	    {
+	      int msk;
+	      if (!BYTES_BIG_ENDIAN)
+		{
+		  msk = (1 << bpos) - 1;
+		  if (fieldsize == 1 && epos != 0)
+		    msk |= ~((1 << epos) - 1);
+		}
+	      else
+		{
+		  msk = ~((1 << (BITS_PER_UNIT - bpos)) - 1);
+		  if (fieldsize == 1 && epos != 0)
+		    msk |= (1 << (BITS_PER_UNIT - epos)) - 1;
+		}
+	      if (mask[off + pos + bpos_byte] & ~msk)
+		bad = off + pos + bpos_byte;
+	    }
+	  if (epos && (fieldsize > 1 || bpos == 0))
+	    {
+	      int msk;
+	      if (!BYTES_BIG_ENDIAN)
+		msk = (1 << epos) - 1;
+	      else
+		msk = ~((1 << (BITS_PER_UNIT - epos)) - 1);
+	      if (mask[off + pos + bpos_byte + fieldsize - 1] & msk)
+		bad = off + pos + bpos_byte + fieldsize - 1;
+	    }
+	  for (int i = 0; bad < 0 && i < fieldsize - (bpos != 0) - (epos != 0);
+	       i++)
+	    if (mask[off + pos + bpos_byte + (bpos != 0) + i])
+	      bad = off + pos + bpos_byte + (bpos != 0) + i;
+	  if (bad >= 0)
+	    {
+	      if (!ctx->quiet)
+		error_at (loc, "%qs accessing uninitialized byte at offset %d",
+			  "__builtin_bit_cast", bad);
+	      *non_constant_p = true;
+	      return error_mark_node;
+	    }
+	}
+      CONSTRUCTOR_APPEND_ELT (elts, field, v);
+    }
+  return build_constructor (type, elts);
+}
+
+/* Subroutine of cxx_eval_constant_expression.
+   Attempt to evaluate a BIT_CAST_EXPR.  */
+
+static tree
+cxx_eval_bit_cast (const constexpr_ctx *ctx, tree t, bool *non_constant_p,
+		   bool *overflow_p)
+{
+  if (check_bit_cast_type (ctx, EXPR_LOCATION (t), TREE_TYPE (t),
+			   TREE_TYPE (t))
+      || check_bit_cast_type (ctx, cp_expr_loc_or_loc (TREE_OPERAND (t, 0),
+						       EXPR_LOCATION (t)),
+			      TREE_TYPE (TREE_OPERAND (t, 0)),
+			      TREE_TYPE (TREE_OPERAND (t, 0))))
+    {
+      *non_constant_p = true;
+      return t;
+    }
+
+  tree op = cxx_eval_constant_expression (ctx, TREE_OPERAND (t, 0), false,
+					  non_constant_p, overflow_p);
+  if (*non_constant_p)
+    return t;
+
+  location_t loc = EXPR_LOCATION (t);
+  if (BITS_PER_UNIT != 8 || CHAR_BIT != 8)
+    {
+      if (!ctx->quiet)
+	sorry_at (loc, "%qs cannot be constant evaluated on the target",
+		       "__builtin_bit_cast");
+      *non_constant_p = true;
+      return t;
+    }
+
+  if (!tree_fits_shwi_p (TYPE_SIZE_UNIT (TREE_TYPE (t))))
+    {
+      if (!ctx->quiet)
+	sorry_at (loc, "%qs cannot be constant evaluated because the "
+		       "type is too large", "__builtin_bit_cast");
+      *non_constant_p = true;
+      return t;
+    }
+
+  HOST_WIDE_INT len = tree_to_shwi (TYPE_SIZE_UNIT (TREE_TYPE (t)));
+  if (len < 0 || (int) len != len)
+    {
+      if (!ctx->quiet)
+	sorry_at (loc, "%qs cannot be constant evaluated because the "
+		       "type is too large", "__builtin_bit_cast");
+      *non_constant_p = true;
+      return t;
+    }
+
+  unsigned char buf[64];
+  unsigned char *ptr, *mask;
+  size_t alen = (size_t) len * 2;
+  if (alen <= sizeof (buf))
+    ptr = buf;
+  else
+    ptr = XNEWVEC (unsigned char, alen);
+  mask = ptr + (size_t) len;
+  /* At the beginning consider everything indeterminate.  */
+  memset (mask, ~0, (size_t) len);
+
+  if (native_encode_initializer (op, ptr, len, 0, mask) != len)
+    {
+      if (!ctx->quiet)
+	sorry_at (loc, "%qs cannot be constant evaluated because the "
+		       "argument cannot be encoded", "__builtin_bit_cast");
+      *non_constant_p = true;
+      return t;
+    }
+
+  if (can_native_interpret_type_p (TREE_TYPE (t)))
+    if (tree r = native_interpret_expr (TREE_TYPE (t), ptr, len))
+      {
+	for (int i = 0; i < len; i++)
+	  if (mask[i])
+	    {
+	      if (!ctx->quiet)
+		error_at (loc, "%qs accessing uninitialized byte at offset %d",
+			  "__builtin_bit_cast", i);
+	      *non_constant_p = true;
+	      r = t;
+	      break;
+	    }
+	if (ptr != buf)
+	  XDELETE (ptr);
+	return r;
+      }
+
+  if (TREE_CODE (TREE_TYPE (t)) == RECORD_TYPE)
+    {
+      tree r = cxx_native_interpret_aggregate (TREE_TYPE (t), ptr, 0, len,
+					       mask, ctx, non_constant_p, loc);
+      if (r != error_mark_node)
+	{
+	  if (ptr != buf)
+	    XDELETE (ptr);
+	  return r;
+	}
+      if (*non_constant_p)
+	return t;
+    }
+
+  if (!ctx->quiet)
+    sorry_at (loc, "%qs cannot be constant evaluated because the "
+		   "argument cannot be interpreted", "__builtin_bit_cast");
+  *non_constant_p = true;
+  return t;
+}
+
 /* Subroutine of cxx_eval_constant_expression.
    Evaluate a short-circuited logical expression T in the context
    of a given constexpr CALL.  BAILOUT_VALUE is the value for
@@ -6625,6 +7061,10 @@ cxx_eval_constant_expression (const cons
       *non_constant_p = true;
       return t;
 
+    case BIT_CAST_EXPR:
+      r = cxx_eval_bit_cast (ctx, t, non_constant_p, overflow_p);
+      break;
+
     default:
       if (STATEMENT_CODE_P (TREE_CODE (t)))
 	{
@@ -8403,6 +8843,9 @@ potential_constant_expression_1 (tree t,
     case ANNOTATE_EXPR:
       return RECUR (TREE_OPERAND (t, 0), rval);
 
+    case BIT_CAST_EXPR:
+      return RECUR (TREE_OPERAND (t, 0), rval);
+
     /* Coroutine await, yield and return expressions are not.  */
     case CO_AWAIT_EXPR:
     case CO_YIELD_EXPR:
--- gcc/cp/cp-gimplify.c.jj	2020-10-08 10:58:21.939038527 +0200
+++ gcc/cp/cp-gimplify.c	2020-11-02 13:42:18.711819616 +0100
@@ -1568,6 +1568,11 @@ cp_genericize_r (tree *stmt_p, int *walk
 				 cp_genericize_r, cp_walk_subtrees);
       break;
 
+    case BIT_CAST_EXPR:
+      *stmt_p = build1_loc (EXPR_LOCATION (stmt), VIEW_CONVERT_EXPR,
+			    TREE_TYPE (stmt), TREE_OPERAND (stmt, 0));
+      break;
+
     default:
       if (IS_TYPE_OR_DECL_P (stmt))
 	*walk_subtrees = 0;
--- gcc/testsuite/g++.dg/cpp2a/bit-cast1.C.jj	2020-11-02 13:42:18.711819616 +0100
+++ gcc/testsuite/g++.dg/cpp2a/bit-cast1.C	2020-11-02 13:42:18.711819616 +0100
@@ -0,0 +1,47 @@
+// { dg-do compile }
+
+struct S { short a, b; };
+struct T { float a[16]; };
+struct U { int b[16]; };
+
+#if __SIZEOF_FLOAT__ == __SIZEOF_INT__
+int
+f1 (float x)
+{
+  return __builtin_bit_cast (int, x);
+}
+#endif
+
+#if 2 * __SIZEOF_SHORT__ == __SIZEOF_INT__
+S
+f2 (int x)
+{
+  return __builtin_bit_cast (S, x);
+}
+
+int
+f3 (S x)
+{
+  return __builtin_bit_cast (int, x);
+}
+#endif
+
+#if __SIZEOF_FLOAT__ == __SIZEOF_INT__
+U
+f4 (T &x)
+{
+  return __builtin_bit_cast (U, x);
+}
+
+T
+f5 (int (&x)[16])
+{
+  return __builtin_bit_cast (T, x);
+}
+#endif
+
+int
+f6 ()
+{
+  return __builtin_bit_cast (unsigned char, (signed char) 0);
+}
--- gcc/testsuite/g++.dg/cpp2a/bit-cast2.C.jj	2020-11-02 13:42:18.711819616 +0100
+++ gcc/testsuite/g++.dg/cpp2a/bit-cast2.C	2020-11-02 13:42:18.711819616 +0100
@@ -0,0 +1,57 @@
+// { dg-do compile }
+
+struct S { ~S (); int s; };
+S s;
+struct V;				// { dg-message "forward declaration of 'struct V'" }
+extern V v;				// { dg-error "'v' has incomplete type" }
+extern V *p;
+struct U { int a, b; };
+U u;
+
+void
+foo (int *q)
+{
+  __builtin_bit_cast (int, s);		// { dg-error "'__builtin_bit_cast' source type 'S' is not trivially copyable" }
+  __builtin_bit_cast (S, 0);		// { dg-error "'__builtin_bit_cast' destination type 'S' is not trivially copyable" }
+  __builtin_bit_cast (int &, q);	// { dg-error "'__builtin_bit_cast' destination type 'int&' is not trivially copyable" }
+  __builtin_bit_cast (int [1], 0);	// { dg-error "'__builtin_bit_cast' destination type \[^\n\r]* is an array type" }
+  __builtin_bit_cast (V, 0);		// { dg-error "invalid use of incomplete type 'struct V'" }
+  __builtin_bit_cast (int, v);
+  __builtin_bit_cast (int, *p);		// { dg-error "invalid use of incomplete type 'struct V'" }
+  __builtin_bit_cast (U, 0);		// { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
+  __builtin_bit_cast (int, u);		// { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
+}
+
+template <int N>
+void
+bar (int *q)
+{
+  __builtin_bit_cast (int, s);		// { dg-error "'__builtin_bit_cast' source type 'S' is not trivially copyable" }
+  __builtin_bit_cast (S, 0);		// { dg-error "'__builtin_bit_cast' destination type 'S' is not trivially copyable" }
+  __builtin_bit_cast (int &, q);	// { dg-error "'__builtin_bit_cast' destination type 'int&' is not trivially copyable" }
+  __builtin_bit_cast (int [1], 0);	// { dg-error "'__builtin_bit_cast' destination type \[^\n\r]* is an array type" }
+  __builtin_bit_cast (V, 0);		// { dg-error "invalid use of incomplete type 'struct V'" }
+  __builtin_bit_cast (int, *p);		// { dg-error "invalid use of incomplete type 'struct V'" }
+  __builtin_bit_cast (U, 0);		// { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
+  __builtin_bit_cast (int, u);		// { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
+}
+
+template <typename T1, typename T2, typename T3, typename T4>
+void
+baz (T3 s, T4 *p, T1 *q)
+{
+  __builtin_bit_cast (int, s);		// { dg-error "'__builtin_bit_cast' source type 'S' is not trivially copyable" }
+  __builtin_bit_cast (T3, 0);		// { dg-error "'__builtin_bit_cast' destination type 'S' is not trivially copyable" }
+  __builtin_bit_cast (T1 &, q);		// { dg-error "'__builtin_bit_cast' destination type 'int&' is not trivially copyable" }
+  __builtin_bit_cast (T2, 0);		// { dg-error "'__builtin_bit_cast' destination type \[^\n\r]* is an array type" }
+  __builtin_bit_cast (T4, 0);		// { dg-error "invalid use of incomplete type 'struct V'" }
+  __builtin_bit_cast (int, *p);		// { dg-error "invalid use of incomplete type 'struct V'" }
+  __builtin_bit_cast (U, (T1) 0);	// { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
+  __builtin_bit_cast (T1, u);		// { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
+}
+
+void
+qux (int *q)
+{
+  baz <int, int [1], S, V> (s, p, q);
+}
--- gcc/testsuite/g++.dg/cpp2a/bit-cast3.C.jj	2020-11-02 13:42:18.711819616 +0100
+++ gcc/testsuite/g++.dg/cpp2a/bit-cast3.C	2020-11-02 13:42:18.711819616 +0100
@@ -0,0 +1,229 @@
+// { dg-do compile { target c++11 } }
+
+template <typename To, typename From>
+constexpr To
+bit_cast (const From &from)
+{
+  return __builtin_bit_cast (To, from);
+}
+
+template <typename To, typename From>
+constexpr bool
+check (const From &from)
+{
+  return bit_cast <From> (bit_cast <To> (from)) == from;
+}
+
+struct A
+{
+  int a, b, c;
+  constexpr bool operator == (const A &x) const
+  {
+    return x.a == a && x.b == b && x.c == c;
+  }
+};
+
+struct B
+{
+  unsigned a[3];
+  constexpr bool operator == (const B &x) const
+  {
+    return x.a[0] == a[0] && x.a[1] == a[1] && x.a[2] == a[2];
+  }
+};
+
+struct C
+{
+  char a[2][3][2];
+  constexpr bool operator == (const C &x) const
+  {
+    return x.a[0][0][0] == a[0][0][0]
+	   && x.a[0][0][1] == a[0][0][1]
+	   && x.a[0][1][0] == a[0][1][0]
+	   && x.a[0][1][1] == a[0][1][1]
+	   && x.a[0][2][0] == a[0][2][0]
+	   && x.a[0][2][1] == a[0][2][1]
+	   && x.a[1][0][0] == a[1][0][0]
+	   && x.a[1][0][1] == a[1][0][1]
+	   && x.a[1][1][0] == a[1][1][0]
+	   && x.a[1][1][1] == a[1][1][1]
+	   && x.a[1][2][0] == a[1][2][0]
+	   && x.a[1][2][1] == a[1][2][1];
+  }
+};
+
+struct D
+{
+  int a, b;
+  constexpr bool operator == (const D &x) const
+  {
+    return x.a == a && x.b == b;
+  }
+};
+
+struct E {};
+struct F { char c, d, e, f; };
+struct G : public D, E, F
+{
+  int g;
+  constexpr bool operator == (const G &x) const
+  {
+    return x.a == a && x.b == b && x.c == c && x.d == d
+	   && x.e == e && x.f == f && x.g == g;
+  }
+};
+
+struct H
+{
+  int a, b[2], c;
+  constexpr bool operator == (const H &x) const
+  {
+    return x.a == a && x.b[0] == b[0] && x.b[1] == b[1] && x.c == c;
+  }
+};
+
+#if __SIZEOF_INT__ == 4
+struct I
+{
+  int a;
+  int b : 3;
+  int c : 24;
+  int d : 5;
+  int e;
+  constexpr bool operator == (const I &x) const
+  {
+    return x.a == a && x.b == b && x.c == c && x.d == d && x.e == e;
+  }
+};
+#endif
+
+#if __SIZEOF_INT__ == 4 && __SIZEOF_LONG_LONG__ == 8
+struct J
+{
+  long long int a, b : 11, c : 3, d : 37, e : 1, f : 10, g : 2, h;
+  constexpr bool operator == (const J &x) const
+  {
+    return x.a == a && x.b == b && x.c == c && x.d == d && x.e == e
+	   && x.f == f && x.g == g && x.h == h;
+  }
+};
+
+struct K
+{
+  long long int a, b, c;
+  constexpr bool operator == (const K &x) const
+  {
+    return x.a == a && x.b == b && x.c == c;
+  }
+};
+
+struct M
+{
+  signed a : 6, b : 7, c : 6, d : 5;
+  unsigned char e;
+  unsigned int f;
+  long long int g;
+  constexpr bool operator == (const M &x) const
+  {
+    return x.a == a && x.b == b && x.c == c && x.d == d && x.e == e
+	   && x.f == f && x.g == g;
+  }
+};
+
+struct N
+{
+  unsigned long long int a, b;
+  constexpr bool operator == (const N &x) const
+  {
+    return x.a == a && x.b == b;
+  }
+};
+#endif
+
+static_assert (check <unsigned int> (0), "");
+static_assert (check <long long int> (0xdeadbeeffeedbac1ULL), "");
+static_assert (check <signed char> ((unsigned char) 42), "");
+static_assert (check <char> ((unsigned char) 42), "");
+static_assert (check <unsigned char> ((unsigned char) 42), "");
+static_assert (check <signed char> ((signed char) 42), "");
+static_assert (check <char> ((signed char) 42), "");
+static_assert (check <unsigned char> ((signed char) 42), "");
+static_assert (check <signed char> ((char) 42), "");
+static_assert (check <char> ((char) 42), "");
+static_assert (check <unsigned char> ((char) 42), "");
+#if __SIZEOF_INT__ == __SIZEOF_FLOAT__
+static_assert (check <int> (2.5f), "");
+static_assert (check <unsigned int> (136.5f), "");
+#endif
+#if __SIZEOF_LONG_LONG__ == __SIZEOF_DOUBLE__
+static_assert (check <long long> (2.5), "");
+static_assert (check <long long unsigned> (123456.75), "");
+#endif
+
+static_assert (check <B> (A{ 1, 2, 3 }), "");
+static_assert (check <A> (B{ 4, 5, 6 }), "");
+
+#if __SIZEOF_INT__ == 4
+static_assert (check <C> (A{ 7, 8, 9 }), "");
+static_assert (check <C> (B{ 10, 11, 12 }), "");
+static_assert (check <A> (C{ { { { 13, 14 }, { 15, 16 }, { 17, 18 } },
+			       { { 19, 20 }, { 21, 22 }, { 23, 24 } } } }), "");
+constexpr unsigned char c[] = { 1, 2, 3, 4 };
+#if __BYTE_ORDER__ == __ORDER_LITTLE_ENDIAN__
+static_assert (bit_cast <unsigned int> (c) == 0x04030201U, "");
+#elif __BYTE_ORDER__ == __ORDER_BIG_ENDIAN__
+static_assert (bit_cast <unsigned int> (c) == 0x01020304U, "");
+#endif
+
+#if __cplusplus >= 201703L
+static_assert (check <G> (H { 0x12345678, { 0x23456789, 0x5a876543 }, 0x3ba78654 }), "");
+#endif
+constexpr int d[] = { 0x12345678, 0x23456789, 0x5a876543, 0x3ba78654 };
+static_assert (bit_cast <G> (d) == bit_cast <G> (H { 0x12345678, { 0x23456789, 0x5a876543 }, 0x3ba78654 }), "");
+
+#if __BYTE_ORDER__ == __ORDER_LITTLE_ENDIAN__
+static_assert (bit_cast <I> (A { 0x7efa3412, 0x5a876543, 0x1eeffeed })
+	       == I { 0x7efa3412, 3, 0x50eca8, 0xb, 0x1eeffeed }, "");
+static_assert (bit_cast <A> (I { 0x7efa3412, 3, 0x50eca8, 0xb, 0x1eeffeed })
+	       == A { 0x7efa3412, 0x5a876543, 0x1eeffeed }, "");
+#elif __BYTE_ORDER__ == __ORDER_BIG_ENDIAN__
+static_assert (bit_cast <I> (A { 0x7efa3412, 0x5a876543, 0x1eeffeed })
+	       == I { 0x7efa3412, 2, -0x2bc4d6, 0x3, 0x1eeffeed }, "");
+static_assert (bit_cast <A> (I { 0x7efa3412, 2, -0x2bc4d6, 0x3, 0x1eeffeed })
+	       == A { 0x7efa3412, 0x5a876543, 0x1eeffeed }, "");
+#endif
+#endif
+
+#if 2 * __SIZEOF_INT__ == __SIZEOF_LONG_LONG__ && __SIZEOF_INT__ >= 4
+constexpr unsigned long long a = 0xdeadbeeffee1deadULL;
+constexpr unsigned b[] = { 0xfeedbacU, 0xbeeffeedU };
+#if __BYTE_ORDER__ == __ORDER_LITTLE_ENDIAN__
+static_assert (bit_cast <D> (a) == D { int (0xfee1deadU), int (0xdeadbeefU) }, "");
+static_assert (bit_cast <unsigned long long> (b) == 0xbeeffeed0feedbacULL, "");
+#elif __BYTE_ORDER__ == __ORDER_BIG_ENDIAN__
+static_assert (bit_cast <D> (a) == D { int (0xdeadbeefU), int (0xfee1deadU) }, "");
+static_assert (bit_cast <unsigned long long> (b) == 0x0feedbacbeeffeedULL, "");
+#endif
+#endif
+
+#if __SIZEOF_INT__ == 4 && __SIZEOF_LONG_LONG__ == 8
+#if __BYTE_ORDER__ == __ORDER_LITTLE_ENDIAN__
+static_assert (bit_cast <J> (K { 0x0feedbacdeadbeefLL, 7862463375103529997LL, 0x0feedbacdeadbeefLL })
+	       == J { 0x0feedbacdeadbeefLL, -1011, 2, -0xbacdeadbeLL, -1, -303, 1, 0x0feedbacdeadbeefLL }, "");
+static_assert (bit_cast <K> (J { 0x0feedbacdeadbeefLL, -1011, 2, -0xbacdeadbeLL, -1, -303, 1, 0x0feedbacdeadbeefLL })
+	       == K { 0x0feedbacdeadbeefLL, 7862463375103529997LL, 0x0feedbacdeadbeefLL }, "");
+static_assert (bit_cast <M> (N { 0xfeedbacdeadbeef8ULL, 0x123456789abcde42ULL })
+	       == M { -8, 59, 31, -5, 234, 0xfeedbacdU, 0x123456789abcde42ULL }, "");
+static_assert (bit_cast <N> (M { -8, 59, 31, -5, 234, 0xfeedbacdU, 0x123456789abcde42ULL })
+	       == N { 0xfeedbacdeadbeef8ULL, 0x123456789abcde42ULL }, "");
+#elif __BYTE_ORDER__ == __ORDER_BIG_ENDIAN__
+static_assert (bit_cast <J> (K { 0x0feedbacdeadbeefLL, -9103311533965288635LL, 0x0feedbacdeadbeefLL })
+	       == J { 0x0feedbacdeadbeefLL, -1011, 2, -0xbacdeadbeLL, -1, -303, 1, 0x0feedbacdeadbeefLL }, "");
+static_assert (bit_cast <K> (J { 0x0feedbacdeadbeefLL, -1011, 2, -0xbacdeadbeLL, -1, -303, 1, 0x0feedbacdeadbeefLL })
+	       == K { 0x0feedbacdeadbeefLL, -9103311533965288635LL, 0x0feedbacdeadbeefLL }, "");
+static_assert (bit_cast <M> (N { 0xfeedbacdeadbeef8ULL, 0x123456789abcde42ULL })
+	       == M { -1, -35, -19, -6, 205, 0xeadbeef8U, 0x123456789abcde42ULL }, "");
+static_assert (bit_cast <N> (M { -1, -35, -19, -6, 205, 0xeadbeef8U, 0x123456789abcde42ULL })
+	       == N { 0xfeedbacdeadbeef8ULL, 0x123456789abcde42ULL }, "");
+#endif
+#endif
--- gcc/testsuite/g++.dg/cpp2a/bit-cast4.C.jj	2020-11-02 13:42:18.711819616 +0100
+++ gcc/testsuite/g++.dg/cpp2a/bit-cast4.C	2020-11-02 13:42:18.711819616 +0100
@@ -0,0 +1,44 @@
+// { dg-do compile { target c++11 } }
+
+template <typename To, typename From>
+constexpr To
+bit_cast (const From &from)
+{
+  return __builtin_bit_cast (To, from);
+}
+// { dg-error "'__builtin_bit_cast' is not a constant expression because 'U' is a union type" "U" { target *-*-* } 7 }
+// { dg-error "'__builtin_bit_cast' is not a constant expression because 'const U' is a union type" "const U" { target *-*-* } 7 }
+// { dg-error "'__builtin_bit_cast' is not a constant expression because 'B' contains a union type" "B" { target *-*-* } 7 }
+// { dg-error "'__builtin_bit_cast' is not a constant expression because 'char\\\*' is a pointer type" "char ptr" { target *-*-* } 7 }
+// { dg-error "'__builtin_bit_cast' is not a constant expression because 'const int\\\*' is a pointer type" "const int ptr" { target *-*-* } 7 }
+// { dg-error "'__builtin_bit_cast' is not a constant expression because 'C' contains a pointer type" "C" { target *-*-* } 7 }
+// { dg-error "'__builtin_bit_cast' is not a constant expression because 'const C' contains a pointer type" "const C" { target *-*-* } 7 }
+// { dg-error "'__builtin_bit_cast' is not a constant expression because 'int D::\\\*' is a pointer to member type" "ptrmem 1" { target *-*-* } 7 }
+// { dg-error "'__builtin_bit_cast' is not a constant expression because 'int \\\(D::\\\*\\\)\\\(\\\) const' is a pointer to member type" "ptrmem 2" { target *-*-* } 7 }
+// { dg-error "'__builtin_bit_cast' is not a constant expression because 'int \\\(D::\\\*\\\)\\\(\\\)' is a pointer to member type" "ptrmem 3" { target *-*-* } 7 }
+
+union U { int u; };
+struct A { int a; U b; };
+struct B : public A { int c; };
+struct C { const int *p; };
+constexpr int a[] = { 1, 2, 3 };
+constexpr const int *b = &a[0];
+constexpr C c = { b };
+struct D { int d; constexpr int foo () const { return 1; } };
+constexpr int D::*d = &D::d;
+constexpr int (D::*e) () const = &D::foo;
+struct E { __INTPTR_TYPE__ e, f; };
+constexpr E f = { 1, 2 };
+constexpr U g { 0 };
+
+constexpr auto z = bit_cast <U> (0);
+constexpr auto y = bit_cast <int> (g);
+constexpr auto x = bit_cast <B> (a);
+constexpr auto w = bit_cast <char *> ((__INTPTR_TYPE__) 0);
+constexpr auto v = bit_cast <__UINTPTR_TYPE__> (b);
+constexpr auto u = bit_cast <C> ((__INTPTR_TYPE__) 0);
+constexpr auto t = bit_cast <__INTPTR_TYPE__> (c);
+constexpr auto s = bit_cast <__INTPTR_TYPE__> (d);
+constexpr auto r = bit_cast <E> (e);
+constexpr auto q = bit_cast <int D::*> ((__INTPTR_TYPE__) 0);
+constexpr auto p = bit_cast <int (D::*) ()> (f);
--- gcc/testsuite/g++.dg/cpp2a/bit-cast5.C.jj	2020-11-02 13:42:18.711819616 +0100
+++ gcc/testsuite/g++.dg/cpp2a/bit-cast5.C	2020-11-02 13:42:18.711819616 +0100
@@ -0,0 +1,69 @@
+// { dg-do compile { target { c++20 && { ilp32 || lp64 } } } }
+
+struct A { signed char a, b, c, d, e, f; };
+struct B {};
+struct C { B a, b; short c; B d; };
+struct D { int a : 4, b : 24, c : 4; };
+struct E { B a, b; short c; };
+struct F { B a; signed char b, c; B d; };
+
+constexpr bool
+f1 ()
+{
+  A a;
+  a.c = 23; a.d = 42;
+  C b = __builtin_bit_cast (C, a); // OK
+  return false;
+}
+
+constexpr bool
+f2 ()
+{
+  A a;
+  a.a = 1; a.b = 2; a.c = 3; a.e = 4; a.f = 5;
+  C b = __builtin_bit_cast (C, a);	// { dg-error "'__builtin_bit_cast' accessing uninitialized byte at offset 3" }
+  return false;
+}
+
+constexpr bool
+f3 ()
+{
+  D a;
+  a.b = 1;
+  F b = __builtin_bit_cast (F, a); // OK
+  return false;
+}
+
+constexpr bool
+f4 ()
+{
+  D a;
+  a.b = 1; a.c = 2;
+  E b = __builtin_bit_cast (E, a); // OK
+  return false;
+}
+
+constexpr bool
+f5 ()
+{
+  D a;
+  a.b = 1;
+  E b = __builtin_bit_cast (E, a);	// { dg-error "'__builtin_bit_cast' accessing uninitialized byte at offset 3" }
+  return false;
+}
+
+constexpr bool
+f6 ()
+{
+  D a;
+  a.c = 1;
+  E b = __builtin_bit_cast (E, a);	// { dg-error "'__builtin_bit_cast' accessing uninitialized byte at offset 2" }
+  return false;
+}
+
+constexpr bool a = f1 ();
+constexpr bool b = f2 ();
+constexpr bool c = f3 ();
+constexpr bool d = f4 ();
+constexpr bool e = f5 ();
+constexpr bool f = f6 ();


	Jakub


^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: [PATCH] c++: v2: Add __builtin_bit_cast to implement std::bit_cast [PR93121]
  2020-11-02 19:21   ` [PATCH] c++: v2: " Jakub Jelinek
@ 2020-11-25  0:31     ` Jeff Law
  2020-11-25  9:23       ` Jakub Jelinek
  2020-11-25 17:26     ` Jason Merrill
  1 sibling, 1 reply; 29+ messages in thread
From: Jeff Law @ 2020-11-25  0:31 UTC (permalink / raw)
  To: Jakub Jelinek, Jason Merrill; +Cc: Jonathan Wakely, gcc-patches



On 11/2/20 12:21 PM, Jakub Jelinek via Gcc-patches wrote:
> On Thu, Jul 30, 2020 at 10:16:46AM -0400, Jason Merrill via Gcc-patches wrote:
>>> The following patch adds __builtin_bit_cast builtin, similarly to
>>> clang or MSVC which implement std::bit_cast using such a builtin too.
>> Great!
> Sorry for the long delay.
>
> The following patch implements what you've requested in the review.
> It doesn't deal with the padding bits being (sometimes) well defined, but if
> we e.g. get a new CONSTRUCTOR bit that would request that (and corresponding
> gimplifier change to pre-initialize to all zeros), it could be easily
> adjusted (just for such CONSTRUCTORs memset the whole mask portion
> (total_bytes) and set mask to NULL).  If there are other cases where
> such handling is desirable (e.g. I wonder about bit_cast being called on
> non-automatic const variables where their CONSTRUCTOR at least
> implementation-wise would end up being on pre-zeroed .rodata (or .data)
> memory), that can be handled too.
>
> In the previous version I was using next_initializable_field, that is
> something that isn't available in the middle-end, so all I do right now
> is just skip non-FIELD_DECLs, unnamed bit-fields and fields with zero size.
>
> Bootstrapped/regtested on x86_64-linux and i686-linux, ok for trunk?
>
> 2020-11-02  Jakub Jelinek  <jakub@redhat.com>
>
> 	PR libstdc++/93121
> 	* fold-const.h (native_encode_initializer): Add mask argument
> 	defaulted to nullptr.
> 	(find_bitfield_repr_type): Declare.
> 	* fold-const.c (find_bitfield_repr_type): New function.
> 	(native_encode_initializer): Add mask argument and support for
> 	filling it.  Handle also some bitfields without integral
> 	DECL_BIT_FIELD_REPRESENTATIVE.
>
> 	* c-common.h (enum rid): Add RID_BUILTIN_BIT_CAST.
> 	* c-common.c (c_common_reswords): Add __builtin_bit_cast.
>
> 	* cp-tree.h (cp_build_bit_cast): Declare.
> 	* cp-tree.def (BIT_CAST_EXPR): New tree code.
> 	* cp-objcp-common.c (names_builtin_p): Handle RID_BUILTIN_BIT_CAST.
> 	(cp_common_init_ts): Handle BIT_CAST_EXPR.
> 	* cxx-pretty-print.c (cxx_pretty_printer::postfix_expression):
> 	Likewise.
> 	* parser.c (cp_parser_postfix_expression): Handle
> 	RID_BUILTIN_BIT_CAST.
> 	* semantics.c (cp_build_bit_cast): New function.
> 	* tree.c (cp_tree_equal): Handle BIT_CAST_EXPR.
> 	(cp_walk_subtrees): Likewise.
> 	* pt.c (tsubst_copy): Likewise.
> 	* constexpr.c (check_bit_cast_type, cxx_native_interpret_aggregate,
> 	cxx_eval_bit_cast): New functions.
> 	(cxx_eval_constant_expression): Handle BIT_CAST_EXPR.
> 	(potential_constant_expression_1): Likewise.
> 	* cp-gimplify.c (cp_genericize_r): Likewise.
>
> 	* g++.dg/cpp2a/bit-cast1.C: New test.
> 	* g++.dg/cpp2a/bit-cast2.C: New test.
> 	* g++.dg/cpp2a/bit-cast3.C: New test.
> 	* g++.dg/cpp2a/bit-cast4.C: New test.
> 	* g++.dg/cpp2a/bit-cast5.C: New test.
Just looking at the fold-const and c-common bits.  Jason is going to
look at the cp/ stuff per today's meeting.

First, do we need to document the new builtin?

> --- gcc/fold-const.c.jj	2020-10-15 15:09:49.079725120 +0200
> +++ gcc/fold-const.c	2020-11-02 17:20:43.784633491 +0100
> @@ -7998,6 +8083,8 @@ native_encode_initializer (tree init, un
>  			    if (ptr)
>  			      memcpy (ptr + (curpos - o), ptr + (pos - o),
>  				      fieldsize);
> +			    if (mask)
> +			      memcpy (mask + curpos, mask + pos, fieldsize);
>  			  }
>  			else if (!native_encode_initializer (val,
>  							     ptr
Presumably we've already determined that the memcpy won't have an
overlap between the source and destination here?

Assuming that's true, then it looks good to me.

jeff



* Re: [PATCH] c++: v2: Add __builtin_bit_cast to implement std::bit_cast [PR93121]
  2020-11-25  0:31     ` Jeff Law
@ 2020-11-25  9:23       ` Jakub Jelinek
  2020-11-25 10:31         ` Jonathan Wakely
  0 siblings, 1 reply; 29+ messages in thread
From: Jakub Jelinek @ 2020-11-25  9:23 UTC (permalink / raw)
  To: Jeff Law; +Cc: Jason Merrill, Jonathan Wakely, gcc-patches

On Tue, Nov 24, 2020 at 05:31:03PM -0700, Jeff Law wrote:
> First, do we need to document the new builtin?

I think for builtins that are only meant for libstdc++ as the underlying
implementation of its APIs documented in the standard, we have some cases
where we document them and other cases where we don't.
I can certainly add doc/extend.texi documentation for it.

> > --- gcc/fold-const.c.jj	2020-10-15 15:09:49.079725120 +0200
> > +++ gcc/fold-const.c	2020-11-02 17:20:43.784633491 +0100
> > @@ -7998,6 +8083,8 @@ native_encode_initializer (tree init, un
> >  			    if (ptr)
> >  			      memcpy (ptr + (curpos - o), ptr + (pos - o),
> >  				      fieldsize);
> > +			    if (mask)
> > +			      memcpy (mask + curpos, mask + pos, fieldsize);
> >  			  }
> >  			else if (!native_encode_initializer (val,
> >  							     ptr
> Presumably we've already determined that the memcpy won't have an
> overlap between the source and destination here?

Yes, for the same reason the memcpy just above it doesn't overlap.
This is the case where we want to save work: we compute the value bits
and mask bits for one element and then just copy them over to all
following elements that are initialized with the same value, when the
index is a RANGE_EXPR.
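In pseudo-code, the replication looks roughly like this (a simplified
sketch, not the actual fold-const.c code; the function and parameter
names are made up for illustration):

```cpp
#include <cstddef>
#include <cstring>

// Simplified sketch of the RANGE_EXPR replication: the value bytes and
// the mask bytes of the first element of the range have already been
// encoded at offset pos; they are then copied into the slots of the
// remaining elements.  Every destination slot belongs to a different
// element, so the source and destination regions of each memcpy are
// disjoint.
static void
replicate_range (unsigned char *ptr, unsigned char *mask,
		 std::size_t eltsz, std::size_t first, std::size_t last)
{
  std::size_t pos = first * eltsz;	// already-encoded element
  for (std::size_t i = first + 1; i <= last; i++)
    {
      std::size_t curpos = i * eltsz;	// slot of element i
      std::memcpy (ptr + curpos, ptr + pos, eltsz);	// value bits
      std::memcpy (mask + curpos, mask + pos, eltsz);	// mask bits
    }
}
```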

	Jakub



* Re: [PATCH] c++: v2: Add __builtin_bit_cast to implement std::bit_cast [PR93121]
  2020-11-25  9:23       ` Jakub Jelinek
@ 2020-11-25 10:31         ` Jonathan Wakely
  2020-11-25 16:24           ` Jakub Jelinek
  0 siblings, 1 reply; 29+ messages in thread
From: Jonathan Wakely @ 2020-11-25 10:31 UTC (permalink / raw)
  To: Jakub Jelinek; +Cc: Jeff Law, Jason Merrill, gcc-patches

On 25/11/20 10:23 +0100, Jakub Jelinek wrote:
>On Tue, Nov 24, 2020 at 05:31:03PM -0700, Jeff Law wrote:
>> First, do we need to document the new builtin?
>
>I think that for builtins that are only meant for libstdc++ as the underlying
>implementation of its standard-documented APIs, we have some cases where we
>document them and other cases where we don't.

And people report bugs when they're not documented. They might want to
know how the built-in is supposed to behave so they can implement the
equivalent in other compilers. Documenting the *intended* behaviour
also makes it clear which behaviours can be relied on, which allows us
to change the result for edge cases or out-of-contract inputs later.
If we don't describe the expected behaviour, then we can't really
blame people for relying on whatever it happens to do today.

I used to think we don't need to bother documenting them, but I've
been persuaded that it's useful to do it.


^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: [PATCH] c++: v2: Add __builtin_bit_cast to implement std::bit_cast [PR93121]
  2020-11-25 10:31         ` Jonathan Wakely
@ 2020-11-25 16:24           ` Jakub Jelinek
  2020-11-25 16:28             ` Jonathan Wakely
  0 siblings, 1 reply; 29+ messages in thread
From: Jakub Jelinek @ 2020-11-25 16:24 UTC (permalink / raw)
  To: Jonathan Wakely; +Cc: gcc-patches

On Wed, Nov 25, 2020 at 10:31:28AM +0000, Jonathan Wakely via Gcc-patches wrote:
> On 25/11/20 10:23 +0100, Jakub Jelinek wrote:
> > On Tue, Nov 24, 2020 at 05:31:03PM -0700, Jeff Law wrote:
> > > First, do we need to document the new builtin?
> > 
> > I think that for builtins that are only meant for libstdc++ as the underlying
> > implementation of its standard-documented APIs, we have some cases where we
> > document them and other cases where we don't.
> 
> And people report bugs when they're not documented. They might want to
> know how the built-in is supposed to behave so they can implement the
> equivalent in other compilers. Documenting the *intended* behaviour
> also makes it clear which behaviours can be relied on, which allows us
> to change the result for edge cases or out-of-contract inputs later.
> If we don't describe the expected behaviour, then we can't really
> blame people for relying on whatever it happens to do today.
> 
> I used to think we don't need to bother documenting them, but I've
> been persuaded that it's useful to do it.

Ok, so like this?

2020-11-25  Jakub Jelinek  <jakub@redhat.com>

	PR libstdc++/93121
	* doc/extend.texi (__builtin_bit_cast): Document.

--- gcc/doc/extend.texi.jj	2020-11-23 17:01:51.986013540 +0100
+++ gcc/doc/extend.texi	2020-11-25 17:21:14.696046005 +0100
@@ -13574,6 +13574,21 @@ have intederminate values and the object
 bitwise compared to some other object, for example for atomic operations.
 @end deftypefn
 
+@deftypefn {Built-in Function} @var{type} __builtin_bit_cast (@var{type}, @var{arg})
+The @code{__builtin_bit_cast} function is available only
+in C++.  The built-in is intended to be used by implementations of
+the @code{std::bit_cast} C++ template function.  Programs should make
+use of the latter function rather than invoking the built-in directly.
+
+This built-in function allows reinterpreting the bits of the @var{arg}
+argument as if it had type @var{type}.  @var{type} and the type of the
+@var{arg} argument need to be trivially copyable types with the same size.
+When manifestly constant-evaluated, it performs extra diagnostics required
+for @code{std::bit_cast} and returns a constant expression if @var{arg}
+is a constant expression.  For more details
+refer to the latest revision of the C++ standard.
+@end deftypefn
+
 @deftypefn {Built-in Function} long __builtin_expect (long @var{exp}, long @var{c})
 @opindex fprofile-arcs
 You may use @code{__builtin_expect} to provide the compiler with


	Jakub


^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: [PATCH] c++: v2: Add __builtin_bit_cast to implement std::bit_cast [PR93121]
  2020-11-25 16:24           ` Jakub Jelinek
@ 2020-11-25 16:28             ` Jonathan Wakely
  0 siblings, 0 replies; 29+ messages in thread
From: Jonathan Wakely @ 2020-11-25 16:28 UTC (permalink / raw)
  To: Jakub Jelinek; +Cc: gcc-patches

On 25/11/20 17:24 +0100, Jakub Jelinek wrote:
>On Wed, Nov 25, 2020 at 10:31:28AM +0000, Jonathan Wakely via Gcc-patches wrote:
>> On 25/11/20 10:23 +0100, Jakub Jelinek wrote:
>> > On Tue, Nov 24, 2020 at 05:31:03PM -0700, Jeff Law wrote:
>> > > First, do we need to document the new builtin?
>> >
>> > I think that for builtins that are only meant for libstdc++ as the underlying
>> > implementation of its standard-documented APIs, we have some cases where we
>> > document them and other cases where we don't.
>>
>> And people report bugs when they're not documented. They might want to
>> know how the built-in is supposed to behave so they can implement the
>> equivalent in other compilers. Documenting the *intended* behaviour
>> also makes it clear which behaviours can be relied on, which allows us
>> to change the result for edge cases or out-of-contract inputs later.
>> If we don't describe the expected behaviour, then we can't really
>> blame people for relying on whatever it happens to do today.
>>
>> I used to think we don't need to bother documenting them, but I've
>> been persuaded that it's useful to do it.
>
>Ok, so like this?
>
>2020-11-25  Jakub Jelinek  <jakub@redhat.com>
>
>	PR libstdc++/93121
>	* doc/extend.texi (__builtin_bit_cast): Document.
>
>--- gcc/doc/extend.texi.jj	2020-11-23 17:01:51.986013540 +0100
>+++ gcc/doc/extend.texi	2020-11-25 17:21:14.696046005 +0100
>@@ -13574,6 +13574,21 @@ have intederminate values and the object
> bitwise compared to some other object, for example for atomic operations.
> @end deftypefn
>
>+@deftypefn {Built-in Function} @var{type} __builtin_bit_cast (@var{type}, @var{arg})
>+The @code{__builtin_bit_cast} function is available only
>+in C++.  The built-in is intended to be used by implementations of
>+the @code{std::bit_cast} C++ template function.  Programs should make
>+use of the latter function rather than invoking the built-in directly.
>+
>+This built-in function allows reinterpreting the bits of the @var{arg}
>+argument as if it had type @var{type}.  @var{type} and the type of the
>+@var{arg} argument need to be trivially copyable types with the same size.
>+When manifestly constant-evaluated, it performs extra diagnostics required
>+for @code{std::bit_cast} and returns a constant expression if @var{arg}
>+is a constant expression.  For more details
>+refer to the latest revision of the C++ standard.
>+@end deftypefn
>+
> @deftypefn {Built-in Function} long __builtin_expect (long @var{exp}, long @var{c})
> @opindex fprofile-arcs
> You may use @code{__builtin_expect} to provide the compiler with

That looks good to me.



^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: [PATCH] c++: v2: Add __builtin_bit_cast to implement std::bit_cast [PR93121]
  2020-11-02 19:21   ` [PATCH] c++: v2: " Jakub Jelinek
  2020-11-25  0:31     ` Jeff Law
@ 2020-11-25 17:26     ` Jason Merrill
  2020-11-25 18:50       ` Jakub Jelinek
  1 sibling, 1 reply; 29+ messages in thread
From: Jason Merrill @ 2020-11-25 17:26 UTC (permalink / raw)
  To: Jakub Jelinek; +Cc: Jonathan Wakely, gcc-patches

On 11/2/20 2:21 PM, Jakub Jelinek wrote:
> On Thu, Jul 30, 2020 at 10:16:46AM -0400, Jason Merrill via Gcc-patches wrote:
>>> The following patch adds __builtin_bit_cast builtin, similarly to
>>> clang or MSVC which implement std::bit_cast using such an builtin too.
>>
>> Great!
> 
> Sorry for the long delay.
> 
> The following patch implements what you've requested in the review.
> It doesn't deal with the padding bits being (sometimes) well defined, but if
> we e.g. get a new CONSTRUCTOR bit that would request that (and corresponding
> gimplifier change to pre-initialize to all zeros), it could be easily
> adjusted (just for such CONSTRUCTORs memset the whole mask portion
> (total_bytes) and set mask to NULL).  If there are other cases where
> such handling is desirable (e.g. I wonder about bit_cast being called on
> non-automatic const variables where their CONSTRUCTOR at least
> implementation-wise would end up being on pre-zeroed .rodata (or .data)
> memory), that can be handled too.
> 
> In the previous version I was using next_initializable_field, that is
> something that isn't available in the middle-end, so all I do right now
> is just skip non-FIELD_DECLs, unnamed bit-fields and fields with zero size.
> 
> Bootstrapped/regtested on x86_64-linux and i686-linux, ok for trunk?
> 
> 2020-11-02  Jakub Jelinek  <jakub@redhat.com>
> 
> 	PR libstdc++/93121
> 	* fold-const.h (native_encode_initializer): Add mask argument
> 	defaulted to nullptr.
> 	(find_bitfield_repr_type): Declare.
> 	* fold-const.c (find_bitfield_repr_type): New function.
> 	(native_encode_initializer): Add mask argument and support for
> 	filling it.  Handle also some bitfields without integral
> 	DECL_BIT_FIELD_REPRESENTATIVE.
> 
> 	* c-common.h (enum rid): Add RID_BUILTIN_BIT_CAST.
> 	* c-common.c (c_common_reswords): Add __builtin_bit_cast.
> 
> 	* cp-tree.h (cp_build_bit_cast): Declare.
> 	* cp-tree.def (BIT_CAST_EXPR): New tree code.
> 	* cp-objcp-common.c (names_builtin_p): Handle RID_BUILTIN_BIT_CAST.
> 	(cp_common_init_ts): Handle BIT_CAST_EXPR.
> 	* cxx-pretty-print.c (cxx_pretty_printer::postfix_expression):
> 	Likewise.
> 	* parser.c (cp_parser_postfix_expression): Handle
> 	RID_BUILTIN_BIT_CAST.
> 	* semantics.c (cp_build_bit_cast): New function.
> 	* tree.c (cp_tree_equal): Handle BIT_CAST_EXPR.
> 	(cp_walk_subtrees): Likewise.
> 	* pt.c (tsubst_copy): Likewise.
> 	* constexpr.c (check_bit_cast_type, cxx_native_interpret_aggregate,
> 	cxx_eval_bit_cast): New functions.
> 	(cxx_eval_constant_expression): Handle BIT_CAST_EXPR.
> 	(potential_constant_expression_1): Likewise.
> 	* cp-gimplify.c (cp_genericize_r): Likewise.
> 
> 	* g++.dg/cpp2a/bit-cast1.C: New test.
> 	* g++.dg/cpp2a/bit-cast2.C: New test.
> 	* g++.dg/cpp2a/bit-cast3.C: New test.
> 	* g++.dg/cpp2a/bit-cast4.C: New test.
> 	* g++.dg/cpp2a/bit-cast5.C: New test.
> 
> --- gcc/fold-const.h.jj	2020-08-18 07:50:18.716920202 +0200
> +++ gcc/fold-const.h	2020-11-02 15:53:59.859477063 +0100
> @@ -27,9 +27,10 @@ extern int folding_initializer;
>   /* Convert between trees and native memory representation.  */
>   extern int native_encode_expr (const_tree, unsigned char *, int, int off = -1);
>   extern int native_encode_initializer (tree, unsigned char *, int,
> -				      int off = -1);
> +				      int off = -1, unsigned char * = nullptr);
>   extern tree native_interpret_expr (tree, const unsigned char *, int);
>   extern bool can_native_interpret_type_p (tree);
> +extern tree find_bitfield_repr_type (int, int);
>   extern void shift_bytes_in_array_left (unsigned char *, unsigned int,
>   				       unsigned int);
>   extern void shift_bytes_in_array_right (unsigned char *, unsigned int,
> --- gcc/fold-const.c.jj	2020-10-15 15:09:49.079725120 +0200
> +++ gcc/fold-const.c	2020-11-02 17:20:43.784633491 +0100
> @@ -7915,25 +7915,78 @@ native_encode_expr (const_tree expr, uns
>       }
>   }
>   
> +/* Try to find a type whose byte size is smaller or equal to LEN bytes larger
> +   or equal to FIELDSIZE bytes, with underlying mode precision/size multiple
> +   of BITS_PER_UNIT.  As native_{interpret,encode}_int works in term of
> +   machine modes, we can't just use build_nonstandard_integer_type.  */
> +
> +tree
> +find_bitfield_repr_type (int fieldsize, int len)
> +{
> +  machine_mode mode;
> +  for (int pass = 0; pass < 2; pass++)
> +    {
> +      enum mode_class mclass = pass ? MODE_PARTIAL_INT : MODE_INT;
> +      FOR_EACH_MODE_IN_CLASS (mode, mclass)
> +	if (known_ge (GET_MODE_SIZE (mode), fieldsize)
> +	    && known_eq (GET_MODE_PRECISION (mode),
> +			 GET_MODE_BITSIZE (mode))
> +	    && known_le (GET_MODE_SIZE (mode), len))
> +	  {
> +	    tree ret = lang_hooks.types.type_for_mode (mode, 1);
> +	    if (ret && TYPE_MODE (ret) == mode)
> +	      return ret;
> +	  }
> +    }
> +
> +  for (int i = 0; i < NUM_INT_N_ENTS; i ++)
> +    if (int_n_enabled_p[i]
> +	&& int_n_data[i].bitsize >= (unsigned) (BITS_PER_UNIT * fieldsize)
> +	&& int_n_trees[i].unsigned_type)
> +      {
> +	tree ret = int_n_trees[i].unsigned_type;
> +	mode = TYPE_MODE (ret);
> +	if (known_ge (GET_MODE_SIZE (mode), fieldsize)
> +	    && known_eq (GET_MODE_PRECISION (mode),
> +			 GET_MODE_BITSIZE (mode))
> +	    && known_le (GET_MODE_SIZE (mode), len))
> +	  return ret;
> +      }
> +
> +  return NULL_TREE;
> +}
> +
>   /* Similar to native_encode_expr, but also handle CONSTRUCTORs, VCEs,
> -   NON_LVALUE_EXPRs and nops.  */
> +   NON_LVALUE_EXPRs and nops.  If MASK is non-NULL (then PTR has
> +   to be non-NULL and OFF zero), then in addition to filling the
> +   bytes pointed by PTR with the value also clear any bits pointed
> +   by MASK that are known to be initialized, keep them as is for
> +   e.g. uninitialized padding bits or uninitialized fields.  */
>   
>   int
>   native_encode_initializer (tree init, unsigned char *ptr, int len,
> -			   int off)
> +			   int off, unsigned char *mask)
>   {
> +  int r;
> +
>     /* We don't support starting at negative offset and -1 is special.  */
>     if (off < -1 || init == NULL_TREE)
>       return 0;
>   
> +  gcc_assert (mask == NULL || (off == 0 && ptr));
> +
>     STRIP_NOPS (init);
>     switch (TREE_CODE (init))
>       {
>       case VIEW_CONVERT_EXPR:
>       case NON_LVALUE_EXPR:
> -      return native_encode_initializer (TREE_OPERAND (init, 0), ptr, len, off);
> +      return native_encode_initializer (TREE_OPERAND (init, 0), ptr, len, off,
> +					mask);
>       default:
> -      return native_encode_expr (init, ptr, len, off);
> +      r = native_encode_expr (init, ptr, len, off);
> +      if (mask)
> +	memset (mask, 0, r);
> +      return r;
>       case CONSTRUCTOR:
>         tree type = TREE_TYPE (init);
>         HOST_WIDE_INT total_bytes = int_size_in_bytes (type);
> @@ -7946,7 +7999,7 @@ native_encode_initializer (tree init, un
>   	{
>   	  HOST_WIDE_INT min_index;
>   	  unsigned HOST_WIDE_INT cnt;
> -	  HOST_WIDE_INT curpos = 0, fieldsize;
> +	  HOST_WIDE_INT curpos = 0, fieldsize, valueinit = -1;
>   	  constructor_elt *ce;
>   
>   	  if (TYPE_DOMAIN (type) == NULL_TREE
> @@ -7961,12 +8014,22 @@ native_encode_initializer (tree init, un
>   	  if (ptr != NULL)
>   	    memset (ptr, '\0', MIN (total_bytes - off, len));
>   
> -	  FOR_EACH_VEC_SAFE_ELT (CONSTRUCTOR_ELTS (init), cnt, ce)
> +	  for (cnt = 0; ; cnt++)
>   	    {
> -	      tree val = ce->value;
> -	      tree index = ce->index;
> +	      tree val = NULL_TREE, index = NULL_TREE;
>   	      HOST_WIDE_INT pos = curpos, count = 0;
>   	      bool full = false;
> +	      if (vec_safe_iterate (CONSTRUCTOR_ELTS (init), cnt, &ce))
> +		{
> +		  val = ce->value;
> +		  index = ce->index;
> +		}
> +	      else if (mask == NULL
> +		       || CONSTRUCTOR_NO_CLEARING (init)
> +		       || curpos >= total_bytes)
> +		break;
> +	      else
> +		pos = total_bytes;
>   	      if (index && TREE_CODE (index) == RANGE_EXPR)
>   		{
>   		  if (!tree_fits_shwi_p (TREE_OPERAND (index, 0))
> @@ -7984,6 +8047,28 @@ native_encode_initializer (tree init, un
>   		  pos = (tree_to_shwi (index) - min_index) * fieldsize;
>   		}
>   
> +	      if (mask && !CONSTRUCTOR_NO_CLEARING (init) && curpos != pos)
> +		{
> +		  if (valueinit == -1)
> +		    {
> +		      tree zero = build_constructor (TREE_TYPE (type), NULL);
> +		      r = native_encode_initializer (zero, ptr + curpos,
> +						     fieldsize, 0,
> +						     mask + curpos);
> +		      ggc_free (zero);
> +		      if (!r)
> +			return 0;
> +		      valueinit = curpos;
> +		      curpos += fieldsize;
> +		    }
> +		  while (curpos != pos)
> +		    {
> +		      memcpy (ptr + curpos, ptr + valueinit, fieldsize);
> +		      memcpy (mask + curpos, mask + valueinit, fieldsize);
> +		      curpos += fieldsize;
> +		    }
> +		}
> +
>   	      curpos = pos;
>   	      if (val)
>   		do
> @@ -7998,6 +8083,8 @@ native_encode_initializer (tree init, un
>   			    if (ptr)
>   			      memcpy (ptr + (curpos - o), ptr + (pos - o),
>   				      fieldsize);
> +			    if (mask)
> +			      memcpy (mask + curpos, mask + pos, fieldsize);
>   			  }
>   			else if (!native_encode_initializer (val,
>   							     ptr
> @@ -8005,7 +8092,10 @@ native_encode_initializer (tree init, un
>   							     : NULL,
>   							     fieldsize,
>   							     off == -1 ? -1
> -								       : 0))
> +								       : 0,
> +							     mask
> +							     ? mask + curpos
> +							     : NULL))
>   			  return 0;
>   			else
>   			  {
> @@ -8020,6 +8110,7 @@ native_encode_initializer (tree init, un
>   			unsigned char *p = NULL;
>   			int no = 0;
>   			int l;
> +			gcc_assert (mask == NULL);
>   			if (curpos >= off)
>   			  {
>   			    if (ptr)
> @@ -8033,7 +8124,7 @@ native_encode_initializer (tree init, un
>   			    no = off - curpos;
>   			    l = len;
>   			  }
> -			if (!native_encode_initializer (val, p, l, no))
> +			if (!native_encode_initializer (val, p, l, no, NULL))
>   			  return 0;
>   		      }
>   		    curpos += fieldsize;
> @@ -8047,22 +8138,75 @@ native_encode_initializer (tree init, un
>   	{
>   	  unsigned HOST_WIDE_INT cnt;
>   	  constructor_elt *ce;
> +	  tree fld_base = TYPE_FIELDS (type);
> +	  tree to_free = NULL_TREE;
>   
> +	  gcc_assert (TREE_CODE (type) == RECORD_TYPE || mask == NULL);
>   	  if (ptr != NULL)
>   	    memset (ptr, '\0', MIN (total_bytes - off, len));
> -	  FOR_EACH_VEC_SAFE_ELT (CONSTRUCTOR_ELTS (init), cnt, ce)
> +	  for (cnt = 0; ; cnt++)
>   	    {
> -	      tree field = ce->index;
> -	      tree val = ce->value;
> -	      HOST_WIDE_INT pos, fieldsize;
> +	      tree val = NULL_TREE, field = NULL_TREE;
> +	      HOST_WIDE_INT pos = 0, fieldsize;
>   	      unsigned HOST_WIDE_INT bpos = 0, epos = 0;
>   
> -	      if (field == NULL_TREE)
> -		return 0;
> +	      if (to_free)
> +		{
> +		  ggc_free (to_free);
> +		  to_free = NULL_TREE;
> +		}
>   
> -	      pos = int_byte_position (field);
> -	      if (off != -1 && (HOST_WIDE_INT) off + len <= pos)
> -		continue;
> +	      if (vec_safe_iterate (CONSTRUCTOR_ELTS (init), cnt, &ce))
> +		{
> +		  val = ce->value;
> +		  field = ce->index;
> +		  if (field == NULL_TREE)
> +		    return 0;
> +
> +		  pos = int_byte_position (field);
> +		  if (off != -1 && (HOST_WIDE_INT) off + len <= pos)
> +		    continue;
> +		}
> +	      else if (mask == NULL
> +		       || CONSTRUCTOR_NO_CLEARING (init))
> +		break;
> +	      else
> +		pos = total_bytes;
> +
> +	      if (mask && !CONSTRUCTOR_NO_CLEARING (init))
> +		{
> +		  tree fld;
> +		  for (fld = fld_base; fld; fld = DECL_CHAIN (fld))
> +		    {
> +		      if (TREE_CODE (fld) != FIELD_DECL)
> +			continue;
> +		      if (fld == field)
> +			break;
> +		      if (DECL_BIT_FIELD (fld)
> +			  && DECL_NAME (fld) == NULL_TREE)
> +			continue;

I think you want to check DECL_PADDING_P here; the C and C++ front ends 
set it on unnamed bit-fields, and that's what is_empty_type looks at.

> +		      if (DECL_SIZE_UNIT (fld) == NULL_TREE
> +			  || !tree_fits_shwi_p (DECL_SIZE_UNIT (fld)))
> +			return 0;
> +		      if (integer_zerop (DECL_SIZE_UNIT (fld)))
> +			continue;
> +		      break;
> +		    }
> +		  if (fld == NULL_TREE)
> +		    {
> +		      if (ce == NULL)
> +			break;
> +		      return 0;
> +		    }
> +		  fld_base = DECL_CHAIN (fld);
> +		  if (fld != field)
> +		    {
> +		      cnt--;
> +		      field = fld;
> +		      val = build_constructor (TREE_TYPE (fld), NULL);
> +		      to_free = val;
> +		    }
> +		}
>   
>   	      if (TREE_CODE (TREE_TYPE (field)) == ARRAY_TYPE
>   		  && TYPE_DOMAIN (TREE_TYPE (field))
> @@ -8103,30 +8247,49 @@ native_encode_initializer (tree init, un
>   		  if (BYTES_BIG_ENDIAN != WORDS_BIG_ENDIAN)
>   		    return 0;
>   
> -		  tree repr = DECL_BIT_FIELD_REPRESENTATIVE (field);
> -		  if (repr == NULL_TREE
> -		      || TREE_CODE (val) != INTEGER_CST
> -		      || !INTEGRAL_TYPE_P (TREE_TYPE (repr)))
> +		  if (TREE_CODE (val) != INTEGER_CST)
>   		    return 0;
>   
> -		  HOST_WIDE_INT rpos = int_byte_position (repr);
> +		  tree repr = DECL_BIT_FIELD_REPRESENTATIVE (field);
> +		  tree repr_type = NULL_TREE;
> +		  HOST_WIDE_INT rpos = 0;
> +		  if (repr && INTEGRAL_TYPE_P (TREE_TYPE (repr)))
> +		    {
> +		      rpos = int_byte_position (repr);
> +		      repr_type = TREE_TYPE (repr);
> +		    }
> +		  else
> +		    {
> +		      repr_type = find_bitfield_repr_type (fieldsize, len);
> +		      if (repr_type == NULL_TREE)
> +			return 0;
> +		      HOST_WIDE_INT repr_size = int_size_in_bytes (repr_type);
> +		      gcc_assert (repr_size > 0 && repr_size <= len);
> +		      if (pos + repr_size <= len)
> +			rpos = pos;
> +		      else
> +			{
> +			  rpos = len - repr_size;
> +			  gcc_assert (rpos <= pos);
> +			}
> +		    }
> +
>   		  if (rpos > pos)
>   		    return 0;
> -		  wide_int w = wi::to_wide (val,
> -					    TYPE_PRECISION (TREE_TYPE (repr)));
> -		  int diff = (TYPE_PRECISION (TREE_TYPE (repr))
> +		  wide_int w = wi::to_wide (val, TYPE_PRECISION (repr_type));
> +		  int diff = (TYPE_PRECISION (repr_type)
>   			      - TYPE_PRECISION (TREE_TYPE (field)));
>   		  HOST_WIDE_INT bitoff = (pos - rpos) * BITS_PER_UNIT + bpos;
>   		  if (!BYTES_BIG_ENDIAN)
>   		    w = wi::lshift (w, bitoff);
>   		  else
>   		    w = wi::lshift (w, diff - bitoff);
> -		  val = wide_int_to_tree (TREE_TYPE (repr), w);
> +		  val = wide_int_to_tree (repr_type, w);
>   
>   		  unsigned char buf[MAX_BITSIZE_MODE_ANY_INT
>   				    / BITS_PER_UNIT + 1];
>   		  int l = native_encode_int (val, buf, sizeof buf, 0);
> -		  if (l * BITS_PER_UNIT != TYPE_PRECISION (TREE_TYPE (repr)))
> +		  if (l * BITS_PER_UNIT != TYPE_PRECISION (repr_type))
>   		    return 0;
>   
>   		  if (ptr == NULL)
> @@ -8139,15 +8302,31 @@ native_encode_initializer (tree init, un
>   		    {
>   		      if (!BYTES_BIG_ENDIAN)
>   			{
> -			  int mask = (1 << bpos) - 1;
> -			  buf[pos - rpos] &= ~mask;
> -			  buf[pos - rpos] |= ptr[pos - o] & mask;
> +			  int msk = (1 << bpos) - 1;
> +			  buf[pos - rpos] &= ~msk;
> +			  buf[pos - rpos] |= ptr[pos - o] & msk;
> +			  if (mask)
> +			    {
> +			      if (fieldsize > 1 || epos == 0)
> +				mask[pos] &= msk;
> +			      else
> +				mask[pos] &= (msk | ~((1 << epos) - 1));
> +			    }
>   			}
>   		      else
>   			{
> -			  int mask = (1 << (BITS_PER_UNIT - bpos)) - 1;
> -			  buf[pos - rpos] &= mask;
> -			  buf[pos - rpos] |= ptr[pos - o] & ~mask;
> +			  int msk = (1 << (BITS_PER_UNIT - bpos)) - 1;
> +			  buf[pos - rpos] &= msk;
> +			  buf[pos - rpos] |= ptr[pos - o] & ~msk;
> +			  if (mask)
> +			    {
> +			      if (fieldsize > 1 || epos == 0)
> +				mask[pos] &= ~msk;
> +			      else
> +				mask[pos] &= (~msk
> +					      | ((1 << (BITS_PER_UNIT - epos))
> +						 - 1));
> +			    }
>   			}
>   		    }
>   		  /* If the bitfield does not end at byte boundary, handle
> @@ -8158,27 +8337,37 @@ native_encode_initializer (tree init, un
>   		    {
>   		      if (!BYTES_BIG_ENDIAN)
>   			{
> -			  int mask = (1 << epos) - 1;
> -			  buf[pos - rpos + fieldsize - 1] &= mask;
> +			  int msk = (1 << epos) - 1;
> +			  buf[pos - rpos + fieldsize - 1] &= msk;
>   			  buf[pos - rpos + fieldsize - 1]
> -			    |= ptr[pos + fieldsize - 1 - o] & ~mask;
> +			    |= ptr[pos + fieldsize - 1 - o] & ~msk;
> +			  if (mask && (fieldsize > 1 || bpos == 0))
> +			    mask[pos + fieldsize - 1] &= ~msk;
>   			}
>   		       else
>   			{
> -			  int mask = (1 << (BITS_PER_UNIT - epos)) - 1;
> -			  buf[pos - rpos + fieldsize - 1] &= ~mask;
> +			  int msk = (1 << (BITS_PER_UNIT - epos)) - 1;
> +			  buf[pos - rpos + fieldsize - 1] &= ~msk;
>   			  buf[pos - rpos + fieldsize - 1]
> -			    |= ptr[pos + fieldsize - 1 - o] & mask;
> +			    |= ptr[pos + fieldsize - 1 - o] & msk;
> +			  if (mask && (fieldsize > 1 || bpos == 0))
> +			    mask[pos + fieldsize - 1] &= msk;
>   			}
>   		    }
>   		  if (off == -1
>   		      || (pos >= off
>   			  && (pos + fieldsize <= (HOST_WIDE_INT) off + len)))
> -		    memcpy (ptr + pos - o, buf + (pos - rpos), fieldsize);
> +		    {
> +		      memcpy (ptr + pos - o, buf + (pos - rpos), fieldsize);
> +		      if (mask && (fieldsize > (bpos != 0) + (epos != 0)))
> +			memset (mask + pos + (bpos != 0), 0,
> +				fieldsize - (bpos != 0) - (epos != 0));
> +		    }
>   		  else
>   		    {
>   		      /* Partial overlap.  */
>   		      HOST_WIDE_INT fsz = fieldsize;
> +		      gcc_assert (mask == NULL);
>   		      if (pos < off)
>   			{
>   			  fsz -= (off - pos);
> @@ -8198,7 +8387,8 @@ native_encode_initializer (tree init, un
>   		  if (!native_encode_initializer (val, ptr ? ptr + pos - o
>   							   : NULL,
>   						  fieldsize,
> -						  off == -1 ? -1 : 0))
> +						  off == -1 ? -1 : 0,
> +						  mask ? mask + pos : NULL))
>   		    return 0;
>   		}
>   	      else
> @@ -8207,6 +8397,7 @@ native_encode_initializer (tree init, un
>   		  unsigned char *p = NULL;
>   		  int no = 0;
>   		  int l;
> +		  gcc_assert (mask == NULL);
>   		  if (pos >= off)
>   		    {
>   		      if (ptr)
> @@ -8220,7 +8411,7 @@ native_encode_initializer (tree init, un
>   		      no = off - pos;
>   		      l = len;
>   		    }
> -		  if (!native_encode_initializer (val, p, l, no))
> +		  if (!native_encode_initializer (val, p, l, no, NULL))
>   		    return 0;
>   		}
>   	    }
> --- gcc/c-family/c-common.h.jj	2020-10-27 14:36:42.182246025 +0100
> +++ gcc/c-family/c-common.h	2020-11-02 13:42:18.699819747 +0100
> @@ -164,7 +164,7 @@ enum rid
>     RID_HAS_NOTHROW_COPY,        RID_HAS_TRIVIAL_ASSIGN,
>     RID_HAS_TRIVIAL_CONSTRUCTOR, RID_HAS_TRIVIAL_COPY,
>     RID_HAS_TRIVIAL_DESTRUCTOR,  RID_HAS_UNIQUE_OBJ_REPRESENTATIONS,
> -  RID_HAS_VIRTUAL_DESTRUCTOR,
> +  RID_HAS_VIRTUAL_DESTRUCTOR,  RID_BUILTIN_BIT_CAST,
>     RID_IS_ABSTRACT,             RID_IS_AGGREGATE,
>     RID_IS_BASE_OF,              RID_IS_CLASS,
>     RID_IS_EMPTY,                RID_IS_ENUM,
> --- gcc/c-family/c-common.c.jj	2020-10-27 14:36:42.182246025 +0100
> +++ gcc/c-family/c-common.c	2020-11-02 13:42:18.700819736 +0100
> @@ -374,6 +374,7 @@ const struct c_common_resword c_common_r
>     { "__auto_type",	RID_AUTO_TYPE,	D_CONLY },
>     { "__bases",          RID_BASES, D_CXXONLY },
>     { "__builtin_addressof", RID_ADDRESSOF, D_CXXONLY },
> +  { "__builtin_bit_cast", RID_BUILTIN_BIT_CAST, D_CXXONLY },
>     { "__builtin_call_with_static_chain",
>       RID_BUILTIN_CALL_WITH_STATIC_CHAIN, D_CONLY },
>     { "__builtin_choose_expr", RID_CHOOSE_EXPR, D_CONLY },
> --- gcc/cp/cp-tree.h.jj	2020-10-30 08:59:57.041496709 +0100
> +++ gcc/cp/cp-tree.h	2020-11-02 13:42:18.701819725 +0100
> @@ -7273,6 +7273,8 @@ extern tree finish_builtin_launder		(loc
>   						 tsubst_flags_t);
>   extern tree cp_build_vec_convert		(tree, location_t, tree,
>   						 tsubst_flags_t);
> +extern tree cp_build_bit_cast			(location_t, tree, tree,
> +						 tsubst_flags_t);
>   extern void start_lambda_scope			(tree);
>   extern void record_lambda_scope			(tree);
>   extern void record_null_lambda_scope		(tree);
> --- gcc/cp/cp-tree.def.jj	2020-09-21 11:15:53.715518437 +0200
> +++ gcc/cp/cp-tree.def	2020-11-02 13:42:18.701819725 +0100
> @@ -437,6 +437,9 @@ DEFTREECODE (UNARY_RIGHT_FOLD_EXPR, "una
>   DEFTREECODE (BINARY_LEFT_FOLD_EXPR, "binary_left_fold_expr", tcc_expression, 3)
>   DEFTREECODE (BINARY_RIGHT_FOLD_EXPR, "binary_right_fold_expr", tcc_expression, 3)
>   
> +/* Represents the __builtin_bit_cast (type, expr) expression.
> +   The type is in TREE_TYPE, expression in TREE_OPERAND (bitcast, 0).  */
> +DEFTREECODE (BIT_CAST_EXPR, "bit_cast_expr", tcc_expression, 1)
>   
>   /** C++ extensions. */
>   
> --- gcc/cp/cp-objcp-common.c.jj	2020-09-21 11:15:53.715518437 +0200
> +++ gcc/cp/cp-objcp-common.c	2020-11-02 13:42:18.701819725 +0100
> @@ -390,6 +390,7 @@ names_builtin_p (const char *name)
>       case RID_BUILTIN_HAS_ATTRIBUTE:
>       case RID_BUILTIN_SHUFFLE:
>       case RID_BUILTIN_LAUNDER:
> +    case RID_BUILTIN_BIT_CAST:
>       case RID_OFFSETOF:
>       case RID_HAS_NOTHROW_ASSIGN:
>       case RID_HAS_NOTHROW_CONSTRUCTOR:
> @@ -488,6 +489,7 @@ cp_common_init_ts (void)
>     MARK_TS_EXP (ALIGNOF_EXPR);
>     MARK_TS_EXP (ARROW_EXPR);
>     MARK_TS_EXP (AT_ENCODE_EXPR);
> +  MARK_TS_EXP (BIT_CAST_EXPR);
>     MARK_TS_EXP (CAST_EXPR);
>     MARK_TS_EXP (CONST_CAST_EXPR);
>     MARK_TS_EXP (CTOR_INITIALIZER);
> --- gcc/cp/cxx-pretty-print.c.jj	2020-10-19 09:32:35.503914847 +0200
> +++ gcc/cp/cxx-pretty-print.c	2020-11-02 13:42:18.702819714 +0100
> @@ -655,6 +655,15 @@ cxx_pretty_printer::postfix_expression (
>         pp_right_paren (this);
>         break;
>   
> +    case BIT_CAST_EXPR:
> +      pp_cxx_ws_string (this, "__builtin_bit_cast");
> +      pp_left_paren (this);
> +      type_id (TREE_TYPE (t));
> +      pp_comma (this);
> +      expression (TREE_OPERAND (t, 0));
> +      pp_right_paren (this);
> +      break;
> +
>       case EMPTY_CLASS_EXPR:
>         type_id (TREE_TYPE (t));
>         pp_left_paren (this);
> --- gcc/cp/parser.c.jj	2020-11-02 13:33:46.039434035 +0100
> +++ gcc/cp/parser.c	2020-11-02 13:42:18.705819681 +0100
> @@ -7234,6 +7234,32 @@ cp_parser_postfix_expression (cp_parser
>   				     tf_warning_or_error);
>         }
>   
> +    case RID_BUILTIN_BIT_CAST:
> +      {
> +	tree expression;
> +	tree type;
> +	/* Consume the `__builtin_bit_cast' token.  */
> +	cp_lexer_consume_token (parser->lexer);
> +	/* Look for the opening `('.  */
> +	matching_parens parens;
> +	parens.require_open (parser);
> +	location_t type_location
> +	  = cp_lexer_peek_token (parser->lexer)->location;
> +	/* Parse the type-id.  */
> +	{
> +	  type_id_in_expr_sentinel s (parser);
> +	  type = cp_parser_type_id (parser);
> +	}
> +	/* Look for the `,'.  */
> +	cp_parser_require (parser, CPP_COMMA, RT_COMMA);
> +	/* Now, parse the assignment-expression.  */
> +	expression = cp_parser_assignment_expression (parser);
> +	/* Look for the closing `)'.  */
> +	parens.require_close (parser);
> +	return cp_build_bit_cast (type_location, type, expression,
> +				  tf_warning_or_error);
> +      }
> +
>       default:
>         {
>   	tree type;
> --- gcc/cp/semantics.c.jj	2020-10-30 09:16:30.816279461 +0100
> +++ gcc/cp/semantics.c	2020-11-02 13:42:18.706819671 +0100
> @@ -10591,4 +10591,76 @@ cp_build_vec_convert (tree arg, location
>     return build_call_expr_internal_loc (loc, IFN_VEC_CONVERT, type, 1, arg);
>   }
>   
> +/* Finish __builtin_bit_cast (type, arg).  */
> +
> +tree
> +cp_build_bit_cast (location_t loc, tree type, tree arg,
> +		   tsubst_flags_t complain)
> +{
> +  if (error_operand_p (type))
> +    return error_mark_node;
> +  if (!dependent_type_p (type))
> +    {
> +      if (!complete_type_or_maybe_complain (type, NULL_TREE, complain))
> +	return error_mark_node;
> +      if (TREE_CODE (type) == ARRAY_TYPE)
> +	{
> +	  /* std::bit_cast for destination ARRAY_TYPE is not possible,
> +	     as functions may not return an array, so don't bother trying
> +	     to support this (and then deal with VLAs etc.).  */
> +	  error_at (loc, "%<__builtin_bit_cast%> destination type %qT "
> +			 "is an array type", type);
> +	  return error_mark_node;
> +	}
> +      if (!trivially_copyable_p (type))
> +	{
> +	  error_at (loc, "%<__builtin_bit_cast%> destination type %qT "
> +			 "is not trivially copyable", type);
> +	  return error_mark_node;
> +	}
> +    }
> +
> +  if (error_operand_p (arg))
> +    return error_mark_node;
> +
> +  if (!type_dependent_expression_p (arg))
> +    {
> +      if (TREE_CODE (TREE_TYPE (arg)) == ARRAY_TYPE)
> +	{
> +	  /* Don't perform array-to-pointer conversion.  */
> +	  arg = mark_rvalue_use (arg, loc, true);
> +	  if (!complete_type_or_maybe_complain (TREE_TYPE (arg), arg, complain))
> +	    return error_mark_node;
> +	}
> +      else
> +	arg = decay_conversion (arg, complain);

bit_cast operates on an lvalue argument, so I don't think we want 
decay_conversion at all here.

> +      if (error_operand_p (arg))
> +	return error_mark_node;
> +
> +      arg = convert_from_reference (arg);

This shouldn't be necessary; the argument should already be converted 
from reference.  Generally we call convert_from_reference on the result 
of some processing, not on an incoming argument.

> +      if (!trivially_copyable_p (TREE_TYPE (arg)))
> +	{
> +	  error_at (cp_expr_loc_or_loc (arg, loc),
> +		    "%<__builtin_bit_cast%> source type %qT "
> +		    "is not trivially copyable", TREE_TYPE (arg));
> +	  return error_mark_node;
> +	}
> +      if (!dependent_type_p (type)
> +	  && !cp_tree_equal (TYPE_SIZE_UNIT (type),
> +			     TYPE_SIZE_UNIT (TREE_TYPE (arg))))
> +	{
> +	  error_at (loc, "%<__builtin_bit_cast%> source size %qE "
> +			 "not equal to destination type size %qE",
> +			 TYPE_SIZE_UNIT (TREE_TYPE (arg)),
> +			 TYPE_SIZE_UNIT (type));
> +	  return error_mark_node;
> +	}
> +    }
> +
> +  tree ret = build_min (BIT_CAST_EXPR, type, arg);
> +  SET_EXPR_LOCATION (ret, loc);
> +  return ret;
> +}
> +
>   #include "gt-cp-semantics.h"
> --- gcc/cp/tree.c.jj	2020-10-08 10:58:22.023037307 +0200
> +++ gcc/cp/tree.c	2020-11-02 13:42:18.707819660 +0100
> @@ -3868,6 +3868,7 @@ cp_tree_equal (tree t1, tree t2)
>       CASE_CONVERT:
>       case NON_LVALUE_EXPR:
>       case VIEW_CONVERT_EXPR:
> +    case BIT_CAST_EXPR:
>         if (!same_type_p (TREE_TYPE (t1), TREE_TYPE (t2)))
>   	return false;
>         /* Now compare operands as usual.  */
> @@ -5112,6 +5113,7 @@ cp_walk_subtrees (tree *tp, int *walk_su
>       case CONST_CAST_EXPR:
>       case DYNAMIC_CAST_EXPR:
>       case IMPLICIT_CONV_EXPR:
> +    case BIT_CAST_EXPR:
>         if (TREE_TYPE (*tp))
>   	WALK_SUBTREE (TREE_TYPE (*tp));
>   
> --- gcc/cp/pt.c.jj	2020-10-30 09:16:30.818279438 +0100
> +++ gcc/cp/pt.c	2020-11-02 13:42:18.709819638 +0100
> @@ -16741,6 +16741,13 @@ tsubst_copy (tree t, tree args, tsubst_f
>   	return build1 (code, type, op0);
>         }
>   
> +    case BIT_CAST_EXPR:
> +      {
> +	tree type = tsubst (TREE_TYPE (t), args, complain, in_decl);
> +	tree op0 = tsubst_copy (TREE_OPERAND (t, 0), args, complain, in_decl);
> +	return cp_build_bit_cast (EXPR_LOCATION (t), type, op0, complain);
> +      }
> +
>       case SIZEOF_EXPR:
>         if (PACK_EXPANSION_P (TREE_OPERAND (t, 0))
>   	  || ARGUMENT_PACK_P (TREE_OPERAND (t, 0)))
> --- gcc/cp/constexpr.c.jj	2020-10-30 08:59:57.039496732 +0100
> +++ gcc/cp/constexpr.c	2020-11-02 15:55:14.259664820 +0100
> @@ -3912,6 +3912,442 @@ cxx_eval_bit_field_ref (const constexpr_
>     return error_mark_node;
>   }
>   
> +/* Helper for cxx_eval_bit_cast.
> +   Check [bit.cast]/3 rules, bit_cast is constexpr only if the To and From
> +   types and types of all subobjects have is_union_v<T>, is_pointer_v<T>,
> +   is_member_pointer_v<T>, is_volatile_v<T> false and has no non-static
> +   data members of reference type.  */
> +
> +static bool
> +check_bit_cast_type (const constexpr_ctx *ctx, location_t loc, tree type,
> +		     tree orig_type)
> +{
> +  if (TREE_CODE (type) == UNION_TYPE)
> +    {
> +      if (!ctx->quiet)
> +	{
> +	  if (type == orig_type)
> +	    error_at (loc, "%qs is not a constant expression because %qT is "
> +			   "a union type", "__builtin_bit_cast", type);
> +	  else
> +	    error_at (loc, "%qs is not a constant expression because %qT "
> +			   "contains a union type", "__builtin_bit_cast",
> +		      orig_type);
> +	}
> +      return true;
> +    }
> +  if (TREE_CODE (type) == POINTER_TYPE)
> +    {
> +      if (!ctx->quiet)
> +	{
> +	  if (type == orig_type)
> +	    error_at (loc, "%qs is not a constant expression because %qT is "
> +			   "a pointer type", "__builtin_bit_cast", type);
> +	  else
> +	    error_at (loc, "%qs is not a constant expression because %qT "
> +			   "contains a pointer type", "__builtin_bit_cast",
> +		      orig_type);
> +	}
> +      return true;
> +    }
> +  if (TREE_CODE (type) == REFERENCE_TYPE)
> +    {
> +      if (!ctx->quiet)
> +	{
> +	  if (type == orig_type)
> +	    error_at (loc, "%qs is not a constant expression because %qT is "
> +			   "a reference type", "__builtin_bit_cast", type);
> +	  else
> +	    error_at (loc, "%qs is not a constant expression because %qT "
> +			   "contains a reference type", "__builtin_bit_cast",
> +		      orig_type);
> +	}
> +      return true;
> +    }
> +  if (TYPE_PTRMEM_P (type))
> +    {
> +      if (!ctx->quiet)
> +	{
> +	  if (type == orig_type)
> +	    error_at (loc, "%qs is not a constant expression because %qT is "
> +			   "a pointer to member type", "__builtin_bit_cast",
> +		      type);
> +	  else
> +	    error_at (loc, "%qs is not a constant expression because %qT "
> +			   "contains a pointer to member type",
> +		      "__builtin_bit_cast", orig_type);
> +	}
> +      return true;
> +    }
> +  if (TYPE_VOLATILE (type))
> +    {
> +      if (!ctx->quiet)
> +	{
> +	  if (type == orig_type)
> +	    error_at (loc, "%qs is not a constant expression because %qT is "
> +			   "volatile", "__builtin_bit_cast", type);
> +	  else
> +	    error_at (loc, "%qs is not a constant expression because %qT "
> +			   "contains a volatile subobject",
> +		      "__builtin_bit_cast", orig_type);
> +	}
> +      return true;
> +    }
> +  if (TREE_CODE (type) == RECORD_TYPE)
> +    for (tree field = TYPE_FIELDS (type); field; field = DECL_CHAIN (field))
> +      if (TREE_CODE (field) == FIELD_DECL
> +	  && check_bit_cast_type (ctx, loc, TREE_TYPE (field), orig_type))
> +	return true;
> +  return false;
> +}
> +
> +/* Attempt to interpret aggregate of TYPE from bytes encoded in target
> +   byte order at PTR + OFF with LEN bytes.  MASK contains bits set if the value
> +   is indeterminate.  */
> +
> +static tree
> +cxx_native_interpret_aggregate (tree type, const unsigned char *ptr, int off,
> +				int len, unsigned char *mask,
> +				const constexpr_ctx *ctx, bool *non_constant_p,
> +				location_t loc)

Can this be, say, native_interpret_initializer in fold-const?  It 
doesn't seem closely tied to the front end other than diagnostics that 
could move to the caller, like you've already done for the non-aggregate 
case.

> +{
> +  vec<constructor_elt, va_gc> *elts = NULL;
> +  if (TREE_CODE (type) == ARRAY_TYPE)
> +    {
> +      HOST_WIDE_INT eltsz = int_size_in_bytes (TREE_TYPE (type));
> +      if (eltsz < 0 || eltsz > len || TYPE_DOMAIN (type) == NULL_TREE)
> +	return error_mark_node;
> +
> +      HOST_WIDE_INT cnt = 0;
> +      if (TYPE_MAX_VALUE (TYPE_DOMAIN (type)))
> +	{
> +	  if (!tree_fits_shwi_p (TYPE_MAX_VALUE (TYPE_DOMAIN (type))))
> +	    return error_mark_node;
> +	  cnt = tree_to_shwi (TYPE_MAX_VALUE (TYPE_DOMAIN (type))) + 1;
> +	}
> +      if (eltsz == 0)
> +	cnt = 0;
> +      HOST_WIDE_INT pos = 0;
> +      for (HOST_WIDE_INT i = 0; i < cnt; i++, pos += eltsz)
> +	{
> +	  tree v = error_mark_node;
> +	  if (pos >= len || pos + eltsz > len)
> +	    return error_mark_node;
> +	  if (can_native_interpret_type_p (TREE_TYPE (type)))
> +	    {
> +	      v = native_interpret_expr (TREE_TYPE (type),
> +					 ptr + off + pos, eltsz);
> +	      if (v == NULL_TREE)
> +		return error_mark_node;
> +	      for (int i = 0; i < eltsz; i++)
> +		if (mask[off + pos + i])
> +		  {
> +		    if (!ctx->quiet)
> +		      error_at (loc, "%qs accessing uninitialized byte at "
> +				     "offset %d",
> +				"__builtin_bit_cast", (int) (off + pos + i));
> +		    *non_constant_p = true;
> +		    return error_mark_node;
> +		  }
> +	    }
> +	  else if (TREE_CODE (TREE_TYPE (type)) == RECORD_TYPE
> +		   || TREE_CODE (TREE_TYPE (type)) == ARRAY_TYPE)
> +	    v = cxx_native_interpret_aggregate (TREE_TYPE (type),
> +						ptr, off + pos, eltsz,
> +						mask, ctx,
> +						non_constant_p, loc);
> +	  if (v == error_mark_node)
> +	    return error_mark_node;
> +	  CONSTRUCTOR_APPEND_ELT (elts, size_int (i), v);
> +	}
> +      return build_constructor (type, elts);
> +    }
> +  gcc_assert (TREE_CODE (type) == RECORD_TYPE);
> +  for (tree field = next_initializable_field (TYPE_FIELDS (type));
> +       field; field = next_initializable_field (DECL_CHAIN (field)))
> +    {
> +      tree fld = field;
> +      HOST_WIDE_INT bitoff = 0, pos = 0, sz = 0;
> +      int diff = 0;
> +      tree v = error_mark_node;
> +      if (DECL_BIT_FIELD (field))
> +	{
> +	  fld = DECL_BIT_FIELD_REPRESENTATIVE (field);
> +	  if (fld && INTEGRAL_TYPE_P (TREE_TYPE (fld)))
> +	    {
> +	      poly_int64 bitoffset;
> +	      poly_uint64 field_offset, fld_offset;
> +	      if (poly_int_tree_p (DECL_FIELD_OFFSET (field), &field_offset)
> +		  && poly_int_tree_p (DECL_FIELD_OFFSET (fld), &fld_offset))
> +		bitoffset = (field_offset - fld_offset) * BITS_PER_UNIT;
> +	      else
> +		bitoffset = 0;
> +	      bitoffset += (tree_to_uhwi (DECL_FIELD_BIT_OFFSET (field))
> +			    - tree_to_uhwi (DECL_FIELD_BIT_OFFSET (fld)));
> +	      diff = (TYPE_PRECISION (TREE_TYPE (fld))
> +		      - TYPE_PRECISION (TREE_TYPE (field)));
> +	      if (!bitoffset.is_constant (&bitoff)
> +		  || bitoff < 0
> +		  || bitoff > diff)
> +		return error_mark_node;
> +	    }
> +	  else
> +	    {
> +	      if (!tree_fits_uhwi_p (DECL_FIELD_BIT_OFFSET (field)))
> +		return error_mark_node;
> +	      int fieldsize = TYPE_PRECISION (TREE_TYPE (field));
> +	      int bpos = tree_to_uhwi (DECL_FIELD_BIT_OFFSET (field));
> +	      bpos %= BITS_PER_UNIT;
> +	      fieldsize += bpos;
> +	      fieldsize += BITS_PER_UNIT - 1;
> +	      fieldsize /= BITS_PER_UNIT;
> +	      tree repr_type = find_bitfield_repr_type (fieldsize, len);
> +	      if (repr_type == NULL_TREE)
> +		return error_mark_node;
> +	      sz = int_size_in_bytes (repr_type);
> +	      if (sz < 0 || sz > len)
> +		return error_mark_node;
> +	      pos = int_byte_position (field);
> +	      if (pos < 0 || pos > len || pos + fieldsize > len)
> +		return error_mark_node;
> +	      HOST_WIDE_INT rpos;
> +	      if (pos + sz <= len)
> +		rpos = pos;
> +	      else
> +		{
> +		  rpos = len - sz;
> +		  gcc_assert (rpos <= pos);
> +		}
> +	      bitoff = (HOST_WIDE_INT) (pos - rpos) * BITS_PER_UNIT + bpos;
> +	      pos = rpos;
> +	      diff = (TYPE_PRECISION (repr_type)
> +		      - TYPE_PRECISION (TREE_TYPE (field)));
> +	      v = native_interpret_expr (repr_type, ptr + off + pos, sz);
> +	      if (v == NULL_TREE)
> +		return error_mark_node;
> +	      fld = NULL_TREE;
> +	    }
> +	}
> +
> +      if (fld)
> +	{
> +	  sz = int_size_in_bytes (TREE_TYPE (fld));
> +	  if (sz < 0 || sz > len)
> +	    return error_mark_node;
> +	  tree byte_pos = byte_position (fld);
> +	  if (!tree_fits_shwi_p (byte_pos))
> +	    return error_mark_node;
> +	  pos = tree_to_shwi (byte_pos);
> +	  if (pos < 0 || pos > len || pos + sz > len)
> +	    return error_mark_node;
> +	}
> +      if (fld == NULL_TREE)
> +	/* Already handled above.  */;
> +      else if (can_native_interpret_type_p (TREE_TYPE (fld)))
> +	{
> +	  v = native_interpret_expr (TREE_TYPE (fld),
> +				     ptr + off + pos, sz);
> +	  if (v == NULL_TREE)
> +	    return error_mark_node;
> +	  if (fld == field)
> +	    for (int i = 0; i < sz; i++)
> +	      if (mask[off + pos + i])
> +		{
> +		  if (!ctx->quiet)
> +		    error_at (loc,
> +			      "%qs accessing uninitialized byte at offset %d",
> +			      "__builtin_bit_cast", (int) (off + pos + i));
> +		  *non_constant_p = true;
> +		  return error_mark_node;
> +		}
> +	}
> +      else if (TREE_CODE (TREE_TYPE (fld)) == RECORD_TYPE
> +	       || TREE_CODE (TREE_TYPE (fld)) == ARRAY_TYPE)
> +	v = cxx_native_interpret_aggregate (TREE_TYPE (fld),
> +					    ptr, off + pos, sz, mask,
> +					    ctx, non_constant_p, loc);
> +      if (v == error_mark_node)
> +	return error_mark_node;
> +      if (fld != field)
> +	{
> +	  if (TREE_CODE (v) != INTEGER_CST)
> +	    return error_mark_node;
> +
> +	  /* FIXME: Figure out how to handle PDP endian bitfields.  */
> +	  if (BYTES_BIG_ENDIAN != WORDS_BIG_ENDIAN)
> +	    return error_mark_node;
> +	  if (!BYTES_BIG_ENDIAN)
> +	    v = wide_int_to_tree (TREE_TYPE (field),
> +				  wi::lrshift (wi::to_wide (v), bitoff));
> +	  else
> +	    v = wide_int_to_tree (TREE_TYPE (field),
> +				  wi::lrshift (wi::to_wide (v),
> +					       diff - bitoff));
> +	  int bpos = bitoff % BITS_PER_UNIT;
> +	  int bpos_byte = bitoff / BITS_PER_UNIT;
> +	  int fieldsize = TYPE_PRECISION (TREE_TYPE (field));
> +	  fieldsize += bpos;
> +	  int epos = fieldsize % BITS_PER_UNIT;
> +	  fieldsize += BITS_PER_UNIT - 1;
> +	  fieldsize /= BITS_PER_UNIT;
> +	  int bad = -1;
> +	  if (bpos)
> +	    {
> +	      int msk;
> +	      if (!BYTES_BIG_ENDIAN)
> +		{
> +		  msk = (1 << bpos) - 1;
> +		  if (fieldsize == 1 && epos != 0)
> +		    msk |= ~((1 << epos) - 1);
> +		}
> +	      else
> +		{
> +		  msk = ~((1 << (BITS_PER_UNIT - bpos)) - 1);
> +		  if (fieldsize == 1 && epos != 0)
> +		    msk |= (1 << (BITS_PER_UNIT - epos)) - 1;
> +		}
> +	      if (mask[off + pos + bpos_byte] & ~msk)
> +		bad = off + pos + bpos_byte;
> +	    }
> +	  if (epos && (fieldsize > 1 || bpos == 0))
> +	    {
> +	      int msk;
> +	      if (!BYTES_BIG_ENDIAN)
> +		msk = (1 << epos) - 1;
> +	      else
> +		msk = ~((1 << (BITS_PER_UNIT - epos)) - 1);
> +	      if (mask[off + pos + bpos_byte + fieldsize - 1] & msk)
> +		bad = off + pos + bpos_byte + fieldsize - 1;
> +	    }
> +	  for (int i = 0; bad < 0 && i < fieldsize - (bpos != 0) - (epos != 0);
> +	       i++)
> +	    if (mask[off + pos + bpos_byte + (bpos != 0) + i])
> +	      bad = off + pos + bpos_byte + (bpos != 0) + i;
> +	  if (bad >= 0)
> +	    {
> +	      if (!ctx->quiet)
> +		error_at (loc, "%qs accessing uninitialized byte at offset %d",
> +			  "__builtin_bit_cast", bad);
> +	      *non_constant_p = true;
> +	      return error_mark_node;
> +	    }
> +	}
> +      CONSTRUCTOR_APPEND_ELT (elts, field, v);
> +    }
> +  return build_constructor (type, elts);
> +}
> +
> +/* Subroutine of cxx_eval_constant_expression.
> +   Attempt to evaluate a BIT_CAST_EXPR.  */
> +
> +static tree
> +cxx_eval_bit_cast (const constexpr_ctx *ctx, tree t, bool *non_constant_p,
> +		   bool *overflow_p)
> +{
> +  if (check_bit_cast_type (ctx, EXPR_LOCATION (t), TREE_TYPE (t),
> +			   TREE_TYPE (t))
> +      || check_bit_cast_type (ctx, cp_expr_loc_or_loc (TREE_OPERAND (t, 0),
> +						       EXPR_LOCATION (t)),
> +			      TREE_TYPE (TREE_OPERAND (t, 0)),
> +			      TREE_TYPE (TREE_OPERAND (t, 0))))
> +    {
> +      *non_constant_p = true;
> +      return t;
> +    }
> +
> +  tree op = cxx_eval_constant_expression (ctx, TREE_OPERAND (t, 0), false,
> +					  non_constant_p, overflow_p);
> +  if (*non_constant_p)
> +    return t;
> +
> +  location_t loc = EXPR_LOCATION (t);
> +  if (BITS_PER_UNIT != 8 || CHAR_BIT != 8)
> +    {
> +      if (!ctx->quiet)
> +	sorry_at (loc, "%qs cannot be constant evaluated on the target",
> +		       "__builtin_bit_cast");
> +      *non_constant_p = true;
> +      return t;
> +    }
> +
> +  if (!tree_fits_shwi_p (TYPE_SIZE_UNIT (TREE_TYPE (t))))
> +    {
> +      if (!ctx->quiet)
> +	sorry_at (loc, "%qs cannot be constant evaluated because the "
> +		       "type is too large", "__builtin_bit_cast");
> +      *non_constant_p = true;
> +      return t;
> +    }
> +
> +  HOST_WIDE_INT len = tree_to_shwi (TYPE_SIZE_UNIT (TREE_TYPE (t)));
> +  if (len < 0 || (int) len != len)
> +    {
> +      if (!ctx->quiet)
> +	sorry_at (loc, "%qs cannot be constant evaluated because the "
> +		       "type is too large", "__builtin_bit_cast");
> +      *non_constant_p = true;
> +      return t;
> +    }
> +
> +  unsigned char buf[64];
> +  unsigned char *ptr, *mask;
> +  size_t alen = (size_t) len * 2;
> +  if (alen <= sizeof (buf))
> +    ptr = buf;
> +  else
> +    ptr = XNEWVEC (unsigned char, alen);
> +  mask = ptr + (size_t) len;
> +  /* At the beginning consider everything indeterminate.  */
> +  memset (mask, ~0, (size_t) len);
> +
> +  if (native_encode_initializer (op, ptr, len, 0, mask) != len)
> +    {
> +      if (!ctx->quiet)
> +	sorry_at (loc, "%qs cannot be constant evaluated because the "
> +		       "argument cannot be encoded", "__builtin_bit_cast");
> +      *non_constant_p = true;
> +      return t;
> +    }
> +
> +  if (can_native_interpret_type_p (TREE_TYPE (t)))
> +    if (tree r = native_interpret_expr (TREE_TYPE (t), ptr, len))
> +      {
> +	for (int i = 0; i < len; i++)
> +	  if (mask[i])
> +	    {
> +	      if (!ctx->quiet)
> +		error_at (loc, "%qs accessing uninitialized byte at offset %d",
> +			  "__builtin_bit_cast", i);
> +	      *non_constant_p = true;
> +	      r = t;
> +	      break;
> +	    }
> +	if (ptr != buf)
> +	  XDELETE (ptr);
> +	return r;
> +      }
> +
> +  if (TREE_CODE (TREE_TYPE (t)) == RECORD_TYPE)
> +    {
> +      tree r = cxx_native_interpret_aggregate (TREE_TYPE (t), ptr, 0, len,
> +					       mask, ctx, non_constant_p, loc);
> +      if (r != error_mark_node)
> +	{
> +	  if (ptr != buf)
> +	    XDELETE (ptr);
> +	  return r;
> +	}
> +      if (*non_constant_p)
> +	return t;
> +    }
> +
> +  if (!ctx->quiet)
> +    sorry_at (loc, "%qs cannot be constant evaluated because the "
> +		   "argument cannot be interpreted", "__builtin_bit_cast");
> +  *non_constant_p = true;
> +  return t;
> +}
> +
>   /* Subroutine of cxx_eval_constant_expression.
>      Evaluate a short-circuited logical expression T in the context
>      of a given constexpr CALL.  BAILOUT_VALUE is the value for
> @@ -6625,6 +7061,10 @@ cxx_eval_constant_expression (const cons
>         *non_constant_p = true;
>         return t;
>   
> +    case BIT_CAST_EXPR:
> +      r = cxx_eval_bit_cast (ctx, t, non_constant_p, overflow_p);
> +      break;
> +
>       default:
>         if (STATEMENT_CODE_P (TREE_CODE (t)))
>   	{
> @@ -8403,6 +8843,9 @@ potential_constant_expression_1 (tree t,
>       case ANNOTATE_EXPR:
>         return RECUR (TREE_OPERAND (t, 0), rval);
>   
> +    case BIT_CAST_EXPR:
> +      return RECUR (TREE_OPERAND (t, 0), rval);
> +
>       /* Coroutine await, yield and return expressions are not.  */
>       case CO_AWAIT_EXPR:
>       case CO_YIELD_EXPR:
> --- gcc/cp/cp-gimplify.c.jj	2020-10-08 10:58:21.939038527 +0200
> +++ gcc/cp/cp-gimplify.c	2020-11-02 13:42:18.711819616 +0100
> @@ -1568,6 +1568,11 @@ cp_genericize_r (tree *stmt_p, int *walk
>   				 cp_genericize_r, cp_walk_subtrees);
>         break;
>   
> +    case BIT_CAST_EXPR:
> +      *stmt_p = build1_loc (EXPR_LOCATION (stmt), VIEW_CONVERT_EXPR,
> +			    TREE_TYPE (stmt), TREE_OPERAND (stmt, 0));
> +      break;
> +
>       default:
>         if (IS_TYPE_OR_DECL_P (stmt))
>   	*walk_subtrees = 0;
> --- gcc/testsuite/g++.dg/cpp2a/bit-cast1.C.jj	2020-11-02 13:42:18.711819616 +0100
> +++ gcc/testsuite/g++.dg/cpp2a/bit-cast1.C	2020-11-02 13:42:18.711819616 +0100
> @@ -0,0 +1,47 @@
> +// { dg-do compile }
> +
> +struct S { short a, b; };
> +struct T { float a[16]; };
> +struct U { int b[16]; };
> +
> +#if __SIZEOF_FLOAT__ == __SIZEOF_INT__
> +int
> +f1 (float x)
> +{
> +  return __builtin_bit_cast (int, x);
> +}
> +#endif
> +
> +#if 2 * __SIZEOF_SHORT__ == __SIZEOF_INT__
> +S
> +f2 (int x)
> +{
> +  return __builtin_bit_cast (S, x);
> +}
> +
> +int
> +f3 (S x)
> +{
> +  return __builtin_bit_cast (int, x);
> +}
> +#endif
> +
> +#if __SIZEOF_FLOAT__ == __SIZEOF_INT__
> +U
> +f4 (T &x)
> +{
> +  return __builtin_bit_cast (U, x);
> +}
> +
> +T
> +f5 (int (&x)[16])
> +{
> +  return __builtin_bit_cast (T, x);
> +}
> +#endif
> +
> +int
> +f6 ()
> +{
> +  return __builtin_bit_cast (unsigned char, (signed char) 0);
> +}
> --- gcc/testsuite/g++.dg/cpp2a/bit-cast2.C.jj	2020-11-02 13:42:18.711819616 +0100
> +++ gcc/testsuite/g++.dg/cpp2a/bit-cast2.C	2020-11-02 13:42:18.711819616 +0100
> @@ -0,0 +1,57 @@
> +// { dg-do compile }
> +
> +struct S { ~S (); int s; };
> +S s;
> +struct V;				// { dg-message "forward declaration of 'struct V'" }
> +extern V v;				// { dg-error "'v' has incomplete type" }
> +extern V *p;
> +struct U { int a, b; };
> +U u;
> +
> +void
> +foo (int *q)
> +{
> +  __builtin_bit_cast (int, s);		// { dg-error "'__builtin_bit_cast' source type 'S' is not trivially copyable" }
> +  __builtin_bit_cast (S, 0);		// { dg-error "'__builtin_bit_cast' destination type 'S' is not trivially copyable" }
> +  __builtin_bit_cast (int &, q);	// { dg-error "'__builtin_bit_cast' destination type 'int&' is not trivially copyable" }
> +  __builtin_bit_cast (int [1], 0);	// { dg-error "'__builtin_bit_cast' destination type \[^\n\r]* is an array type" }
> +  __builtin_bit_cast (V, 0);		// { dg-error "invalid use of incomplete type 'struct V'" }
> +  __builtin_bit_cast (int, v);
> +  __builtin_bit_cast (int, *p);		// { dg-error "invalid use of incomplete type 'struct V'" }
> +  __builtin_bit_cast (U, 0);		// { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
> +  __builtin_bit_cast (int, u);		// { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
> +}
> +
> +template <int N>
> +void
> +bar (int *q)
> +{
> +  __builtin_bit_cast (int, s);		// { dg-error "'__builtin_bit_cast' source type 'S' is not trivially copyable" }
> +  __builtin_bit_cast (S, 0);		// { dg-error "'__builtin_bit_cast' destination type 'S' is not trivially copyable" }
> +  __builtin_bit_cast (int &, q);	// { dg-error "'__builtin_bit_cast' destination type 'int&' is not trivially copyable" }
> +  __builtin_bit_cast (int [1], 0);	// { dg-error "'__builtin_bit_cast' destination type \[^\n\r]* is an array type" }
> +  __builtin_bit_cast (V, 0);		// { dg-error "invalid use of incomplete type 'struct V'" }
> +  __builtin_bit_cast (int, *p);		// { dg-error "invalid use of incomplete type 'struct V'" }
> +  __builtin_bit_cast (U, 0);		// { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
> +  __builtin_bit_cast (int, u);		// { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
> +}
> +
> +template <typename T1, typename T2, typename T3, typename T4>
> +void
> +baz (T3 s, T4 *p, T1 *q)
> +{
> +  __builtin_bit_cast (int, s);		// { dg-error "'__builtin_bit_cast' source type 'S' is not trivially copyable" }
> +  __builtin_bit_cast (T3, 0);		// { dg-error "'__builtin_bit_cast' destination type 'S' is not trivially copyable" }
> +  __builtin_bit_cast (T1 &, q);		// { dg-error "'__builtin_bit_cast' destination type 'int&' is not trivially copyable" }
> +  __builtin_bit_cast (T2, 0);		// { dg-error "'__builtin_bit_cast' destination type \[^\n\r]* is an array type" }
> +  __builtin_bit_cast (T4, 0);		// { dg-error "invalid use of incomplete type 'struct V'" }
> +  __builtin_bit_cast (int, *p);		// { dg-error "invalid use of incomplete type 'struct V'" }
> +  __builtin_bit_cast (U, (T1) 0);	// { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
> +  __builtin_bit_cast (T1, u);		// { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
> +}
> +
> +void
> +qux (int *q)
> +{
> +  baz <int, int [1], S, V> (s, p, q);
> +}
> --- gcc/testsuite/g++.dg/cpp2a/bit-cast3.C.jj	2020-11-02 13:42:18.711819616 +0100
> +++ gcc/testsuite/g++.dg/cpp2a/bit-cast3.C	2020-11-02 13:42:18.711819616 +0100
> @@ -0,0 +1,229 @@
> +// { dg-do compile { target c++11 } }
> +
> +template <typename To, typename From>
> +constexpr To
> +bit_cast (const From &from)
> +{
> +  return __builtin_bit_cast (To, from);
> +}
> +
> +template <typename To, typename From>
> +constexpr bool
> +check (const From &from)
> +{
> +  return bit_cast <From> (bit_cast <To> (from)) == from;
> +}
> +
> +struct A
> +{
> +  int a, b, c;
> +  constexpr bool operator == (const A &x) const
> +  {
> +    return x.a == a && x.b == b && x.c == c;
> +  }
> +};
> +
> +struct B
> +{
> +  unsigned a[3];
> +  constexpr bool operator == (const B &x) const
> +  {
> +    return x.a[0] == a[0] && x.a[1] == a[1] && x.a[2] == a[2];
> +  }
> +};
> +
> +struct C
> +{
> +  char a[2][3][2];
> +  constexpr bool operator == (const C &x) const
> +  {
> +    return x.a[0][0][0] == a[0][0][0]
> +	   && x.a[0][0][1] == a[0][0][1]
> +	   && x.a[0][1][0] == a[0][1][0]
> +	   && x.a[0][1][1] == a[0][1][1]
> +	   && x.a[0][2][0] == a[0][2][0]
> +	   && x.a[0][2][1] == a[0][2][1]
> +	   && x.a[1][0][0] == a[1][0][0]
> +	   && x.a[1][0][1] == a[1][0][1]
> +	   && x.a[1][1][0] == a[1][1][0]
> +	   && x.a[1][1][1] == a[1][1][1]
> +	   && x.a[1][2][0] == a[1][2][0]
> +	   && x.a[1][2][1] == a[1][2][1];
> +  }
> +};
> +
> +struct D
> +{
> +  int a, b;
> +  constexpr bool operator == (const D &x) const
> +  {
> +    return x.a == a && x.b == b;
> +  }
> +};
> +
> +struct E {};
> +struct F { char c, d, e, f; };
> +struct G : public D, E, F
> +{
> +  int g;
> +  constexpr bool operator == (const G &x) const
> +  {
> +    return x.a == a && x.b == b && x.c == c && x.d == d
> +	   && x.e == e && x.f == f && x.g == g;
> +  }
> +};
> +
> +struct H
> +{
> +  int a, b[2], c;
> +  constexpr bool operator == (const H &x) const
> +  {
> +    return x.a == a && x.b[0] == b[0] && x.b[1] == b[1] && x.c == c;
> +  }
> +};
> +
> +#if __SIZEOF_INT__ == 4
> +struct I
> +{
> +  int a;
> +  int b : 3;
> +  int c : 24;
> +  int d : 5;
> +  int e;
> +  constexpr bool operator == (const I &x) const
> +  {
> +    return x.a == a && x.b == b && x.c == c && x.d == d && x.e == e;
> +  }
> +};
> +#endif
> +
> +#if __SIZEOF_INT__ == 4 && __SIZEOF_LONG_LONG__ == 8
> +struct J
> +{
> +  long long int a, b : 11, c : 3, d : 37, e : 1, f : 10, g : 2, h;
> +  constexpr bool operator == (const J &x) const
> +  {
> +    return x.a == a && x.b == b && x.c == c && x.d == d && x.e == e
> +	   && x.f == f && x.g == g && x.h == h;
> +  }
> +};
> +
> +struct K
> +{
> +  long long int a, b, c;
> +  constexpr bool operator == (const K &x) const
> +  {
> +    return x.a == a && x.b == b && x.c == c;
> +  }
> +};
> +
> +struct M
> +{
> +  signed a : 6, b : 7, c : 6, d : 5;
> +  unsigned char e;
> +  unsigned int f;
> +  long long int g;
> +  constexpr bool operator == (const M &x) const
> +  {
> +    return x.a == a && x.b == b && x.c == c && x.d == d && x.e == e
> +	   && x.f == f && x.g == g;
> +  }
> +};
> +
> +struct N
> +{
> +  unsigned long long int a, b;
> +  constexpr bool operator == (const N &x) const
> +  {
> +    return x.a == a && x.b == b;
> +  }
> +};
> +#endif
> +
> +static_assert (check <unsigned int> (0), "");
> +static_assert (check <long long int> (0xdeadbeeffeedbac1ULL), "");
> +static_assert (check <signed char> ((unsigned char) 42), "");
> +static_assert (check <char> ((unsigned char) 42), "");
> +static_assert (check <unsigned char> ((unsigned char) 42), "");
> +static_assert (check <signed char> ((signed char) 42), "");
> +static_assert (check <char> ((signed char) 42), "");
> +static_assert (check <unsigned char> ((signed char) 42), "");
> +static_assert (check <signed char> ((char) 42), "");
> +static_assert (check <char> ((char) 42), "");
> +static_assert (check <unsigned char> ((char) 42), "");
> +#if __SIZEOF_INT__ == __SIZEOF_FLOAT__
> +static_assert (check <int> (2.5f), "");
> +static_assert (check <unsigned int> (136.5f), "");
> +#endif
> +#if __SIZEOF_LONG_LONG__ == __SIZEOF_DOUBLE__
> +static_assert (check <long long> (2.5), "");
> +static_assert (check <long long unsigned> (123456.75), "");
> +#endif
> +
> +static_assert (check <B> (A{ 1, 2, 3 }), "");
> +static_assert (check <A> (B{ 4, 5, 6 }), "");
> +
> +#if __SIZEOF_INT__ == 4
> +static_assert (check <C> (A{ 7, 8, 9 }), "");
> +static_assert (check <C> (B{ 10, 11, 12 }), "");
> +static_assert (check <A> (C{ { { { 13, 14 }, { 15, 16 }, { 17, 18 } },
> +			       { { 19, 20 }, { 21, 22 }, { 23, 24 } } } }), "");
> +constexpr unsigned char c[] = { 1, 2, 3, 4 };
> +#if __BYTE_ORDER__ == __ORDER_LITTLE_ENDIAN__
> +static_assert (bit_cast <unsigned int> (c) == 0x04030201U, "");
> +#elif __BYTE_ORDER__ == __ORDER_BIG_ENDIAN__
> +static_assert (bit_cast <unsigned int> (c) == 0x01020304U, "");
> +#endif
> +
> +#if __cplusplus >= 201703L
> +static_assert (check <G> (H { 0x12345678, { 0x23456789, 0x5a876543 }, 0x3ba78654 }), "");
> +#endif
> +constexpr int d[] = { 0x12345678, 0x23456789, 0x5a876543, 0x3ba78654 };
> +static_assert (bit_cast <G> (d) == bit_cast <G> (H { 0x12345678, { 0x23456789, 0x5a876543 }, 0x3ba78654 }), "");
> +
> +#if __BYTE_ORDER__ == __ORDER_LITTLE_ENDIAN__
> +static_assert (bit_cast <I> (A { 0x7efa3412, 0x5a876543, 0x1eeffeed })
> +	       == I { 0x7efa3412, 3, 0x50eca8, 0xb, 0x1eeffeed }, "");
> +static_assert (bit_cast <A> (I { 0x7efa3412, 3, 0x50eca8, 0xb, 0x1eeffeed })
> +	       == A { 0x7efa3412, 0x5a876543, 0x1eeffeed }, "");
> +#elif __BYTE_ORDER__ == __ORDER_BIG_ENDIAN__
> +static_assert (bit_cast <I> (A { 0x7efa3412, 0x5a876543, 0x1eeffeed })
> +	       == I { 0x7efa3412, 2, -0x2bc4d6, 0x3, 0x1eeffeed }, "");
> +static_assert (bit_cast <A> (I { 0x7efa3412, 2, -0x2bc4d6, 0x3, 0x1eeffeed })
> +	       == A { 0x7efa3412, 0x5a876543, 0x1eeffeed }, "");
> +#endif
> +#endif
> +
> +#if 2 * __SIZEOF_INT__ == __SIZEOF_LONG_LONG__ && __SIZEOF_INT__ >= 4
> +constexpr unsigned long long a = 0xdeadbeeffee1deadULL;
> +constexpr unsigned b[] = { 0xfeedbacU, 0xbeeffeedU };
> +#if __BYTE_ORDER__ == __ORDER_LITTLE_ENDIAN__
> +static_assert (bit_cast <D> (a) == D { int (0xfee1deadU), int (0xdeadbeefU) }, "");
> +static_assert (bit_cast <unsigned long long> (b) == 0xbeeffeed0feedbacULL, "");
> +#elif __BYTE_ORDER__ == __ORDER_BIG_ENDIAN__
> +static_assert (bit_cast <D> (a) == D { int (0xdeadbeefU), int (0xfee1deadU) }, "");
> +static_assert (bit_cast <unsigned long long> (b) == 0x0feedbacbeeffeedULL, "");
> +#endif
> +#endif
> +
> +#if __SIZEOF_INT__ == 4 && __SIZEOF_LONG_LONG__ == 8
> +#if __BYTE_ORDER__ == __ORDER_LITTLE_ENDIAN__
> +static_assert (bit_cast <J> (K { 0x0feedbacdeadbeefLL, 7862463375103529997LL, 0x0feedbacdeadbeefLL })
> +	       == J { 0x0feedbacdeadbeefLL, -1011, 2, -0xbacdeadbeLL, -1, -303, 1, 0x0feedbacdeadbeefLL }, "");
> +static_assert (bit_cast <K> (J { 0x0feedbacdeadbeefLL, -1011, 2, -0xbacdeadbeLL, -1, -303, 1, 0x0feedbacdeadbeefLL })
> +	       == K { 0x0feedbacdeadbeefLL, 7862463375103529997LL, 0x0feedbacdeadbeefLL }, "");
> +static_assert (bit_cast <M> (N { 0xfeedbacdeadbeef8ULL, 0x123456789abcde42ULL })
> +	       == M { -8, 59, 31, -5, 234, 0xfeedbacdU, 0x123456789abcde42ULL }, "");
> +static_assert (bit_cast <N> (M { -8, 59, 31, -5, 234, 0xfeedbacdU, 0x123456789abcde42ULL })
> +	       == N { 0xfeedbacdeadbeef8ULL, 0x123456789abcde42ULL }, "");
> +#elif __BYTE_ORDER__ == __ORDER_BIG_ENDIAN__
> +static_assert (bit_cast <J> (K { 0x0feedbacdeadbeefLL, -9103311533965288635LL, 0x0feedbacdeadbeefLL })
> +	       == J { 0x0feedbacdeadbeefLL, -1011, 2, -0xbacdeadbeLL, -1, -303, 1, 0x0feedbacdeadbeefLL }, "");
> +static_assert (bit_cast <K> (J { 0x0feedbacdeadbeefLL, -1011, 2, -0xbacdeadbeLL, -1, -303, 1, 0x0feedbacdeadbeefLL })
> +	       == K { 0x0feedbacdeadbeefLL, -9103311533965288635LL, 0x0feedbacdeadbeefLL }, "");
> +static_assert (bit_cast <M> (N { 0xfeedbacdeadbeef8ULL, 0x123456789abcde42ULL })
> +	       == M { -1, -35, -19, -6, 205, 0xeadbeef8U, 0x123456789abcde42ULL }, "");
> +static_assert (bit_cast <N> (M { -1, -35, -19, -6, 205, 0xeadbeef8U, 0x123456789abcde42ULL })
> +	       == N { 0xfeedbacdeadbeef8ULL, 0x123456789abcde42ULL }, "");
> +#endif
> +#endif
> --- gcc/testsuite/g++.dg/cpp2a/bit-cast4.C.jj	2020-11-02 13:42:18.711819616 +0100
> +++ gcc/testsuite/g++.dg/cpp2a/bit-cast4.C	2020-11-02 13:42:18.711819616 +0100
> @@ -0,0 +1,44 @@
> +// { dg-do compile { target c++11 } }
> +
> +template <typename To, typename From>
> +constexpr To
> +bit_cast (const From &from)
> +{
> +  return __builtin_bit_cast (To, from);
> +}
> +// { dg-error "'__builtin_bit_cast' is not a constant expression because 'U' is a union type" "U" { target *-*-* } 7 }
> +// { dg-error "'__builtin_bit_cast' is not a constant expression because 'const U' is a union type" "const U" { target *-*-* } 7 }
> +// { dg-error "'__builtin_bit_cast' is not a constant expression because 'B' contains a union type" "B" { target *-*-* } 7 }
> +// { dg-error "'__builtin_bit_cast' is not a constant expression because 'char\\\*' is a pointer type" "char ptr" { target *-*-* } 7 }
> +// { dg-error "'__builtin_bit_cast' is not a constant expression because 'const int\\\*' is a pointer type" "const int ptr" { target *-*-* } 7 }
> +// { dg-error "'__builtin_bit_cast' is not a constant expression because 'C' contains a pointer type" "C" { target *-*-* } 7 }
> +// { dg-error "'__builtin_bit_cast' is not a constant expression because 'const C' contains a pointer type" "const C" { target *-*-* } 7 }
> +// { dg-error "'__builtin_bit_cast' is not a constant expression because 'int D::\\\*' is a pointer to member type" "ptrmem 1" { target *-*-* } 7 }
> +// { dg-error "'__builtin_bit_cast' is not a constant expression because 'int \\\(D::\\\*\\\)\\\(\\\) const' is a pointer to member type" "ptrmem 2" { target *-*-* } 7 }
> +// { dg-error "'__builtin_bit_cast' is not a constant expression because 'int \\\(D::\\\*\\\)\\\(\\\)' is a pointer to member type" "ptrmem 3" { target *-*-* } 7 }
> +
> +union U { int u; };
> +struct A { int a; U b; };
> +struct B : public A { int c; };
> +struct C { const int *p; };
> +constexpr int a[] = { 1, 2, 3 };
> +constexpr const int *b = &a[0];
> +constexpr C c = { b };
> +struct D { int d; constexpr int foo () const { return 1; } };
> +constexpr int D::*d = &D::d;
> +constexpr int (D::*e) () const = &D::foo;
> +struct E { __INTPTR_TYPE__ e, f; };
> +constexpr E f = { 1, 2 };
> +constexpr U g { 0 };
> +
> +constexpr auto z = bit_cast <U> (0);
> +constexpr auto y = bit_cast <int> (g);
> +constexpr auto x = bit_cast <B> (a);
> +constexpr auto w = bit_cast <char *> ((__INTPTR_TYPE__) 0);
> +constexpr auto v = bit_cast <__UINTPTR_TYPE__> (b);
> +constexpr auto u = bit_cast <C> ((__INTPTR_TYPE__) 0);
> +constexpr auto t = bit_cast <__INTPTR_TYPE__> (c);
> +constexpr auto s = bit_cast <__INTPTR_TYPE__> (d);
> +constexpr auto r = bit_cast <E> (e);
> +constexpr auto q = bit_cast <int D::*> ((__INTPTR_TYPE__) 0);
> +constexpr auto p = bit_cast <int (D::*) ()> (f);
> --- gcc/testsuite/g++.dg/cpp2a/bit-cast5.C.jj	2020-11-02 13:42:18.711819616 +0100
> +++ gcc/testsuite/g++.dg/cpp2a/bit-cast5.C	2020-11-02 13:42:18.711819616 +0100
> @@ -0,0 +1,69 @@
> +// { dg-do compile { target { c++20 && { ilp32 || lp64 } } } }
> +
> +struct A { signed char a, b, c, d, e, f; };
> +struct B {};
> +struct C { B a, b; short c; B d; };
> +struct D { int a : 4, b : 24, c : 4; };
> +struct E { B a, b; short c; };
> +struct F { B a; signed char b, c; B d; };
> +
> +constexpr bool
> +f1 ()
> +{
> +  A a;
> +  a.c = 23; a.d = 42;
> +  C b = __builtin_bit_cast (C, a); // OK
> +  return false;
> +}
> +
> +constexpr bool
> +f2 ()
> +{
> +  A a;
> +  a.a = 1; a.b = 2; a.c = 3; a.e = 4; a.f = 5;
> +  C b = __builtin_bit_cast (C, a);	// { dg-error "'__builtin_bit_cast' accessing uninitialized byte at offset 3" }
> +  return false;
> +}
> +
> +constexpr bool
> +f3 ()
> +{
> +  D a;
> +  a.b = 1;
> +  F b = __builtin_bit_cast (F, a); // OK
> +  return false;
> +}
> +
> +constexpr bool
> +f4 ()
> +{
> +  D a;
> +  a.b = 1; a.c = 2;
> +  E b = __builtin_bit_cast (E, a); // OK
> +  return false;
> +}
> +
> +constexpr bool
> +f5 ()
> +{
> +  D a;
> +  a.b = 1;
> +  E b = __builtin_bit_cast (E, a);	// { dg-error "'__builtin_bit_cast' accessing uninitialized byte at offset 3" }
> +  return false;
> +}
> +
> +constexpr bool
> +f6 ()
> +{
> +  D a;
> +  a.c = 1;
> +  E b = __builtin_bit_cast (E, a);	// { dg-error "'__builtin_bit_cast' accessing uninitialized byte at offset 2" }
> +  return false;
> +}
> +
> +constexpr bool a = f1 ();
> +constexpr bool b = f2 ();
> +constexpr bool c = f3 ();
> +constexpr bool d = f4 ();
> +constexpr bool e = f5 ();
> +constexpr bool f = f6 ();
> 
> 
> 	Jakub
> 



* Re: [PATCH] c++: v2: Add __builtin_bit_cast to implement std::bit_cast [PR93121]
  2020-11-25 17:26     ` Jason Merrill
@ 2020-11-25 18:50       ` Jakub Jelinek
  2020-11-26  0:52         ` Jason Merrill
  0 siblings, 1 reply; 29+ messages in thread
From: Jakub Jelinek @ 2020-11-25 18:50 UTC (permalink / raw)
  To: Jason Merrill; +Cc: Jonathan Wakely, gcc-patches

On Wed, Nov 25, 2020 at 12:26:17PM -0500, Jason Merrill wrote:
> > +		      if (DECL_BIT_FIELD (fld)
> > +			  && DECL_NAME (fld) == NULL_TREE)
> > +			continue;
> 
> I think you want to check DECL_PADDING_P here; the C and C++ front ends set
> it on unnamed bit-fields, and that's what is_empty_type looks at.

Ok, changed in my copy.  I'll also post a patch for
__builtin_clear_padding to use DECL_PADDING_P in there instead of
DECL_BIT_FIELD/DECL_NAME==NULL.

> > +      if (TREE_CODE (TREE_TYPE (arg)) == ARRAY_TYPE)
> > +	{
> > +	  /* Don't perform array-to-pointer conversion.  */
> > +	  arg = mark_rvalue_use (arg, loc, true);
> > +	  if (!complete_type_or_maybe_complain (TREE_TYPE (arg), arg, complain))
> > +	    return error_mark_node;
> > +	}
> > +      else
> > +	arg = decay_conversion (arg, complain);
> 
> bit_cast operates on an lvalue argument, so I don't think we want
> decay_conversion at all here.
> 
> > +      if (error_operand_p (arg))
> > +	return error_mark_node;
> > +
> > +      arg = convert_from_reference (arg);
> 
> This shouldn't be necessary; the argument should already be converted from
> reference.  Generally we call convert_from_reference on the result of some
> processing, not on an incoming argument.

Removing these two regresses some tests in the testsuite.
It is true that std::bit_cast's argument must be a reference, and so when
one uses std::bit_cast one won't run into these problems, but the builtin's
argument itself is an rvalue and so we need to deal with people calling it
directly.
So, commenting out the decay_conversion and convert_from_reference results
in:
extern V v;
...
  __builtin_bit_cast (int, v);
no longer being reported as invalid use of incomplete type, but
error: '__builtin_bit_cast' source size '' not equal to destination type size '4'
(note nothing in between '' for the size because the size is NULL).
Ditto for:
extern V *p;
...
  __builtin_bit_cast (int, *p);
I guess I could add some hand-written code to deal with incomplete types
to cure these.  But decay_conversion also calls mark_rvalue_use, which
we need e.g. for -Wunused-but-set*; don't we also need it for
lambdas?  The builtin is after all using the argument as an rvalue
(it reads it).
Another change that commenting out those two parts causes is different
diagnostics on bit-cast4.C,
bit-cast4.C:7:30: error: '__builtin_bit_cast' is not a constant expression because 'const int* const' is a pointer type
bit-cast4.C:7:30: error: '__builtin_bit_cast' is not a constant expression because 'int D::* const' is a pointer to member type
bit-cast4.C:7:30: error: '__builtin_bit_cast' is not a constant expression because 'int (D::* const)() const' is a pointer to member type
The tests expect 'const int*', 'int D::*' and 'int (D::*)() const', i.e. the
toplevel qualifiers stripped from those.
Commenting out just the arg = convert_from_reference (arg); doesn't regress
anything, though; it is the decay_conversion that does.
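As a non-normative illustration of what "using the argument as an rvalue
(reads it)" means here: outside constexpr evaluation the builtin acts like
a byte reinterpretation of the source object, which can be approximated by
the classic memcpy-based sketch below.  bit_cast_sketch is a hypothetical
helper for illustration only, not the builtin or the libstdc++ wrapper:

```cpp
#include <cassert>
#include <cstring>
#include <type_traits>

// Hypothetical non-constexpr approximation of the builtin's runtime
// semantics: read the source object's bytes and reinterpret them as To.
template <typename To, typename From>
To
bit_cast_sketch (const From &from) noexcept
{
  static_assert (sizeof (To) == sizeof (From), "sizes must match");
  static_assert (std::is_trivially_copyable<From>::value, "");
  static_assert (std::is_trivially_copyable<To>::value, "");
  To to;
  std::memcpy (&to, &from, sizeof (To));
  return to;
}
```

Unlike this sketch, the builtin must also work in constant expressions,
which is where the encode/interpret machinery below comes in.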

> > +/* Attempt to interpret aggregate of TYPE from bytes encoded in target
> > +   byte order at PTR + OFF with LEN bytes.  MASK contains bits set if the value
> > +   is indeterminate.  */
> > +
> > +static tree
> > +cxx_native_interpret_aggregate (tree type, const unsigned char *ptr, int off,
> > +				int len, unsigned char *mask,
> > +				const constexpr_ctx *ctx, bool *non_constant_p,
> > +				location_t loc)
> 
> Can this be, say, native_interpret_initializer in fold-const?  It doesn't
> seem closely tied to the front end other than diagnostics that could move to
> the caller, like you've already done for the non-aggregate case.

The middle-end doesn't need it ATM for anything, plus I think the
ctx/non_constant_p/loc and diagnostics are really C++ FE specific.
If you really want it in fold-const.c, the only way I can imagine it is
that it would be
tree
native_interpret_aggregate (tree type, const unsigned char *ptr, int off,
			    int len, unsigned char *mask = NULL,
			    tree (*mask_callback) (void *, int) = NULL,
			    void *mask_data = NULL)
where C++ would call it with the mask argument, passing as mask_callback an FE
function that would emit the diagnostics and decide what to return when mask is set
on something, and mask_data would be a pointer to struct containing
const constexpr_ctx *ctx; bool *non_constant_p; location_t loc;
for it.
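In miniature, the callback-plus-opaque-data shape sketched above looks like
the following (hypothetical names, not the eventual GCC interface): the core
walker reports each set mask byte through a caller-supplied function, and the
caller decides whether that is an error.

```cpp
#include <cassert>

// Hypothetical illustration of the mask_callback/mask_data pattern:
// the opaque data pointer stands in for the FE's
// ctx/non_constant_p/loc bundle.
struct mask_ctx
{
  int hits; // number of indeterminate bytes seen
};

static int
count_mask_hit (void *data, int byte_off)
{
  mask_ctx *ctx = static_cast<mask_ctx *> (data);
  ctx->hits++;
  (void) byte_off;
  return 1; // keep scanning; an FE could diagnose here and return 0
}

// Scan LEN mask bytes, invoking CB with DATA for each set byte;
// returns 0 as soon as the callback signals failure.
static int
scan_mask (const unsigned char *mask, int len,
	   int (*cb) (void *, int), void *data)
{
  for (int i = 0; i < len; i++)
    if (mask[i] && !cb (data, i))
      return 0;
  return 1;
}
```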

	Jakub



* Re: [PATCH] c++: v2: Add __builtin_bit_cast to implement std::bit_cast [PR93121]
  2020-11-25 18:50       ` Jakub Jelinek
@ 2020-11-26  0:52         ` Jason Merrill
  2020-11-26 15:09           ` [PATCH] c++: v3: " Jakub Jelinek
  0 siblings, 1 reply; 29+ messages in thread
From: Jason Merrill @ 2020-11-26  0:52 UTC (permalink / raw)
  To: Jakub Jelinek; +Cc: Jonathan Wakely, gcc-patches

On 11/25/20 1:50 PM, Jakub Jelinek wrote:
> On Wed, Nov 25, 2020 at 12:26:17PM -0500, Jason Merrill wrote:
>>> +		      if (DECL_BIT_FIELD (fld)
>>> +			  && DECL_NAME (fld) == NULL_TREE)
>>> +			continue;
>>
>> I think you want to check DECL_PADDING_P here; the C and C++ front ends set
>> it on unnamed bit-fields, and that's what is_empty_type looks at.
> 
> Ok, changed in my copy.  I'll also post a patch for
> __builtin_clear_padding to use DECL_PADDING_P in there instead of
> DECL_BIT_FIELD/DECL_NAME==NULL.
> 
>>> +      if (TREE_CODE (TREE_TYPE (arg)) == ARRAY_TYPE)
>>> +	{
>>> +	  /* Don't perform array-to-pointer conversion.  */
>>> +	  arg = mark_rvalue_use (arg, loc, true);
>>> +	  if (!complete_type_or_maybe_complain (TREE_TYPE (arg), arg, complain))
>>> +	    return error_mark_node;
>>> +	}
>>> +      else
>>> +	arg = decay_conversion (arg, complain);
>>
>> bit_cast operates on an lvalue argument, so I don't think we want
>> decay_conversion at all here.
>>
>>> +      if (error_operand_p (arg))
>>> +	return error_mark_node;
>>> +
>>> +      arg = convert_from_reference (arg);
>>
>> This shouldn't be necessary; the argument should already be converted from
>> reference.  Generally we call convert_from_reference on the result of some
>> processing, not on an incoming argument.
> 
> Removing these two regresses some tests in the testsuite.
> It is true that std::bit_cast's argument must be a reference, and so when
> one uses std::bit_cast one won't run into these problems, but the builtin's
> argument itself is an rvalue and so we need to deal with people calling it
> directly.
> So, commenting out the decay_conversion and convert_from_reference results
> in:
> extern V v;
> ...
>    __builtin_bit_cast (int, v);
> no longer being reported as invalid use of incomplete type, but
> error: '__builtin_bit_cast' source size '' not equal to destination type size '4'
> (note nothing in between '' for the size because the size is NULL).
> Ditto for:
> extern V *p;
> ...
>    __builtin_bit_cast (int, *p);
> I guess I could add some hand-written code to deal with incomplete types
> to cure these.  But decay_conversion also calls mark_rvalue_use, which
> we need e.g. for -Wunused-but-set*; don't we also need it for
> lambdas?  The builtin is after all using the argument as an rvalue
> (it reads it).

OK, it isn't exactly use as an rvalue, but I guess it's close enough to 
work.

> Another change that commenting out those two parts causes is different
> diagnostics on bit-cast4.C,
> bit-cast4.C:7:30: error: '__builtin_bit_cast' is not a constant expression because 'const int* const' is a pointer type
> bit-cast4.C:7:30: error: '__builtin_bit_cast' is not a constant expression because 'int D::* const' is a pointer to member type
> bit-cast4.C:7:30: error: '__builtin_bit_cast' is not a constant expression because 'int (D::* const)() const' is a pointer to member type
> The tests expect 'const int*', 'int D::*' and 'int (D::*)() const', i.e. the
> toplevel qualifiers stripped from those.
> Commenting out just the arg = convert_from_reference (arg); doesn't regress
> anything, though; it is the decay_conversion that does.

Let's just drop that part, then.

>>> +/* Attempt to interpret aggregate of TYPE from bytes encoded in target
>>> +   byte order at PTR + OFF with LEN bytes.  MASK contains bits set if the value
>>> +   is indeterminate.  */
>>> +
>>> +static tree
>>> +cxx_native_interpret_aggregate (tree type, const unsigned char *ptr, int off,
>>> +				int len, unsigned char *mask,
>>> +				const constexpr_ctx *ctx, bool *non_constant_p,
>>> +				location_t loc)
>>
>> Can this be, say, native_interpret_initializer in fold-const?  It doesn't
>> seem closely tied to the front end other than diagnostics that could move to
>> the caller, like you've already done for the non-aggregate case.
> 
> The middle-end doesn't need it ATM for anything, plus I think the
> ctx/non_constant_p/loc and diagnostics are really C++ FE specific.
> If you really want it in fold-const.c, the only way I can imagine it is
> that it would be
> tree
> native_interpret_aggregate (tree type, const unsigned char *ptr, int off,
> 			    int len, unsigned char *mask = NULL,
> 			    tree (*mask_callback) (void *, int) = NULL,
> 			    void *mask_data = NULL)
> where C++ would call it with the mask argument, passing as mask_callback an FE
> function that would emit the diagnostics and decide what to return when mask is set
> that would emit the diagnostics and decide what to return when mask is set
> on something, and mask_data would be a pointer to struct containing
> const constexpr_ctx *ctx; bool *non_constant_p; location_t loc;
> for it.

Instead of a callback, the function could express mask-related failure 
in a way that the caller can detect and diagnose.  Perhaps by clearing 
mask elements that correspond to padding in the output, so you can again 
just scan through the mask to see if there are any elements set?

Jason



* [PATCH] c++: v3: Add __builtin_bit_cast to implement std::bit_cast [PR93121]
  2020-11-26  0:52         ` Jason Merrill
@ 2020-11-26 15:09           ` Jakub Jelinek
  2020-12-03 14:24             ` Jason Merrill
  0 siblings, 1 reply; 29+ messages in thread
From: Jakub Jelinek @ 2020-11-26 15:09 UTC (permalink / raw)
  To: Jason Merrill; +Cc: Jonathan Wakely, gcc-patches

On Wed, Nov 25, 2020 at 07:52:44PM -0500, Jason Merrill wrote:
> > > I think you want to check DECL_PADDING_P here; the C and C++ front ends set
> > > it on unnamed bit-fields, and that's what is_empty_type looks at.
> > 
> > Ok, changed in my copy.  I'll also post a patch for
> > __builtin_clear_padding to use DECL_PADDING_P in there instead of
> > DECL_BIT_FIELD/DECL_NAME==NULL.

This is now in the patch.

> OK, it isn't exactly use as an rvalue, but I guess it's close enough to
> work.
> 
> > Commenting out just the arg = convert_from_reference (arg); doesn't regress
> > anything though, it is the decay_conversion.
> 
> Let's just drop that part, then.

decay_conversion kept, convert_from_reference removed.

> Instead of a callback, the function could express mask-related failure in a
> way that the caller can detect and diagnose.  Perhaps by clearing mask
> elements that correspond to padding in the output, so you can again just
> scan through the mask to see if there are any elements set?

The following patch removes the mask from the new native_interpret_aggregate
moved to fold-const.c altogether.  Instead, the code uses the
__builtin_clear_padding infrastructure (new entrypoint in there).
If native_interpret_aggregate succeeds (returns non-NULL), then
the mask from the earlier native_encode_initializer call has 0 bits for
bits that are well defined in the argument and 1 bits for indeterminate
bits.  The new clear_type_padding_in_mask function then clears all bits
which are padding in the given type (the destination type), so if any bits
remain in the mask, it means we are using indeterminate bits and C++ FE
can diagnose it.  The bit-cast* tests didn't change at all.
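In miniature, the flow described above amounts to the following (hypothetical
mask/padding contents for illustration; the real bits come from
native_encode_initializer and clear_type_padding_in_mask):

```cpp
#include <cassert>

// Model of the v3 flow: after encoding, MASK has 1 bits for
// indeterminate source bits; clearing the destination type's padding
// bits from it leaves 1 bits only where an indeterminate bit would
// actually be read, which is what the C++ FE then diagnoses.
static void
clear_padding_bits (unsigned char *mask, const unsigned char *padding,
		    int len)
{
  // PADDING has 1 bits wherever the destination type has padding.
  for (int i = 0; i < len; i++)
    mask[i] &= static_cast<unsigned char> (~padding[i]);
}

static bool
mask_has_set_bits (const unsigned char *mask, int len)
{
  for (int i = 0; i < len; i++)
    if (mask[i])
      return true;
  return false;
}
```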

Ok for trunk if it passes bootstrap/regtest on x86_64-linux and i686-linux?
So far it has passed all builtin-clear-padding* and bit-cast* tests, which
I think is everything that should be affected by the changes.

2020-11-26  Jakub Jelinek  <jakub@redhat.com>

	PR libstdc++/93121
	* fold-const.h (native_encode_initializer): Add mask argument
	defaulted to nullptr.
	(find_bitfield_repr_type): Declare.
	(native_interpret_aggregate): Declare.
	* fold-const.c (find_bitfield_repr_type): New function.
	(native_encode_initializer): Add mask argument and support for
	filling it.  Handle also some bitfields without integral
	DECL_BIT_FIELD_REPRESENTATIVE.
	(native_interpret_aggregate): New function.
	* gimple-fold.h (clear_type_padding_in_mask): Declare.
	* gimple-fold.c (struct clear_padding_struct): Add clear_in_mask
	member.
	(clear_padding_flush): Handle buf->clear_in_mask.
	(clear_padding_union): Copy clear_in_mask.  Don't error if
	buf->clear_in_mask is set.
	(clear_padding_type): Don't error if buf->clear_in_mask is set.
	(clear_type_padding_in_mask): New function.
	(gimple_fold_builtin_clear_padding): Set buf.clear_in_mask to false.

	* c-common.h (enum rid): Add RID_BUILTIN_BIT_CAST.
	* c-common.c (c_common_reswords): Add __builtin_bit_cast.

	* cp-tree.h (cp_build_bit_cast): Declare.
	* cp-tree.def (BIT_CAST_EXPR): New tree code.
	* cp-objcp-common.c (names_builtin_p): Handle RID_BUILTIN_BIT_CAST.
	(cp_common_init_ts): Handle BIT_CAST_EXPR.
	* cxx-pretty-print.c (cxx_pretty_printer::postfix_expression):
	Likewise.
	* parser.c (cp_parser_postfix_expression): Handle
	RID_BUILTIN_BIT_CAST.
	* semantics.c (cp_build_bit_cast): New function.
	* tree.c (cp_tree_equal): Handle BIT_CAST_EXPR.
	(cp_walk_subtrees): Likewise.
	* pt.c (tsubst_copy): Likewise.
	* constexpr.c (check_bit_cast_type, cxx_eval_bit_cast): New functions.
	(cxx_eval_constant_expression): Handle BIT_CAST_EXPR.
	(potential_constant_expression_1): Likewise.
	* cp-gimplify.c (cp_genericize_r): Likewise.

	* g++.dg/cpp2a/bit-cast1.C: New test.
	* g++.dg/cpp2a/bit-cast2.C: New test.
	* g++.dg/cpp2a/bit-cast3.C: New test.
	* g++.dg/cpp2a/bit-cast4.C: New test.
	* g++.dg/cpp2a/bit-cast5.C: New test.

--- gcc/fold-const.h.jj	2020-11-26 14:05:07.392268711 +0100
+++ gcc/fold-const.h	2020-11-26 14:35:07.594192689 +0100
@@ -27,9 +27,11 @@ extern int folding_initializer;
 /* Convert between trees and native memory representation.  */
 extern int native_encode_expr (const_tree, unsigned char *, int, int off = -1);
 extern int native_encode_initializer (tree, unsigned char *, int,
-				      int off = -1);
+				      int off = -1, unsigned char * = nullptr);
 extern tree native_interpret_expr (tree, const unsigned char *, int);
 extern bool can_native_interpret_type_p (tree);
+extern tree native_interpret_aggregate (tree, const unsigned char *, int, int);
+extern tree find_bitfield_repr_type (int, int);
 extern void shift_bytes_in_array_left (unsigned char *, unsigned int,
 				       unsigned int);
 extern void shift_bytes_in_array_right (unsigned char *, unsigned int,
--- gcc/fold-const.c.jj	2020-11-26 14:05:07.392268711 +0100
+++ gcc/fold-const.c	2020-11-26 14:35:07.595192678 +0100
@@ -7968,25 +7968,78 @@ native_encode_expr (const_tree expr, uns
     }
 }
 
+/* Try to find a type whose byte size is smaller or equal to LEN bytes larger
+   or equal to FIELDSIZE bytes, with underlying mode precision/size multiple
+   of BITS_PER_UNIT.  As native_{interpret,encode}_int works in term of
+   machine modes, we can't just use build_nonstandard_integer_type.  */
+
+tree
+find_bitfield_repr_type (int fieldsize, int len)
+{
+  machine_mode mode;
+  for (int pass = 0; pass < 2; pass++)
+    {
+      enum mode_class mclass = pass ? MODE_PARTIAL_INT : MODE_INT;
+      FOR_EACH_MODE_IN_CLASS (mode, mclass)
+	if (known_ge (GET_MODE_SIZE (mode), fieldsize)
+	    && known_eq (GET_MODE_PRECISION (mode),
+			 GET_MODE_BITSIZE (mode))
+	    && known_le (GET_MODE_SIZE (mode), len))
+	  {
+	    tree ret = lang_hooks.types.type_for_mode (mode, 1);
+	    if (ret && TYPE_MODE (ret) == mode)
+	      return ret;
+	  }
+    }
+
+  for (int i = 0; i < NUM_INT_N_ENTS; i ++)
+    if (int_n_enabled_p[i]
+	&& int_n_data[i].bitsize >= (unsigned) (BITS_PER_UNIT * fieldsize)
+	&& int_n_trees[i].unsigned_type)
+      {
+	tree ret = int_n_trees[i].unsigned_type;
+	mode = TYPE_MODE (ret);
+	if (known_ge (GET_MODE_SIZE (mode), fieldsize)
+	    && known_eq (GET_MODE_PRECISION (mode),
+			 GET_MODE_BITSIZE (mode))
+	    && known_le (GET_MODE_SIZE (mode), len))
+	  return ret;
+      }
+
+  return NULL_TREE;
+}
+
 /* Similar to native_encode_expr, but also handle CONSTRUCTORs, VCEs,
-   NON_LVALUE_EXPRs and nops.  */
+   NON_LVALUE_EXPRs and nops.  If MASK is non-NULL (then PTR has
+   to be non-NULL and OFF zero), then in addition to filling the
+   bytes pointed by PTR with the value also clear any bits pointed
+   by MASK that are known to be initialized, keep them as is for
+   e.g. uninitialized padding bits or uninitialized fields.  */
 
 int
 native_encode_initializer (tree init, unsigned char *ptr, int len,
-			   int off)
+			   int off, unsigned char *mask)
 {
+  int r;
+
   /* We don't support starting at negative offset and -1 is special.  */
   if (off < -1 || init == NULL_TREE)
     return 0;
 
+  gcc_assert (mask == NULL || (off == 0 && ptr));
+
   STRIP_NOPS (init);
   switch (TREE_CODE (init))
     {
     case VIEW_CONVERT_EXPR:
     case NON_LVALUE_EXPR:
-      return native_encode_initializer (TREE_OPERAND (init, 0), ptr, len, off);
+      return native_encode_initializer (TREE_OPERAND (init, 0), ptr, len, off,
+					mask);
     default:
-      return native_encode_expr (init, ptr, len, off);
+      r = native_encode_expr (init, ptr, len, off);
+      if (mask)
+	memset (mask, 0, r);
+      return r;
     case CONSTRUCTOR:
       tree type = TREE_TYPE (init);
       HOST_WIDE_INT total_bytes = int_size_in_bytes (type);
@@ -7999,7 +8052,7 @@ native_encode_initializer (tree init, un
 	{
 	  HOST_WIDE_INT min_index;
 	  unsigned HOST_WIDE_INT cnt;
-	  HOST_WIDE_INT curpos = 0, fieldsize;
+	  HOST_WIDE_INT curpos = 0, fieldsize, valueinit = -1;
 	  constructor_elt *ce;
 
 	  if (TYPE_DOMAIN (type) == NULL_TREE
@@ -8014,12 +8067,22 @@ native_encode_initializer (tree init, un
 	  if (ptr != NULL)
 	    memset (ptr, '\0', MIN (total_bytes - off, len));
 
-	  FOR_EACH_VEC_SAFE_ELT (CONSTRUCTOR_ELTS (init), cnt, ce)
+	  for (cnt = 0; ; cnt++)
 	    {
-	      tree val = ce->value;
-	      tree index = ce->index;
+	      tree val = NULL_TREE, index = NULL_TREE;
 	      HOST_WIDE_INT pos = curpos, count = 0;
 	      bool full = false;
+	      if (vec_safe_iterate (CONSTRUCTOR_ELTS (init), cnt, &ce))
+		{
+		  val = ce->value;
+		  index = ce->index;
+		}
+	      else if (mask == NULL
+		       || CONSTRUCTOR_NO_CLEARING (init)
+		       || curpos >= total_bytes)
+		break;
+	      else
+		pos = total_bytes;
 	      if (index && TREE_CODE (index) == RANGE_EXPR)
 		{
 		  if (!tree_fits_shwi_p (TREE_OPERAND (index, 0))
@@ -8037,6 +8100,28 @@ native_encode_initializer (tree init, un
 		  pos = (tree_to_shwi (index) - min_index) * fieldsize;
 		}
 
+	      if (mask && !CONSTRUCTOR_NO_CLEARING (init) && curpos != pos)
+		{
+		  if (valueinit == -1)
+		    {
+		      tree zero = build_constructor (TREE_TYPE (type), NULL);
+		      r = native_encode_initializer (zero, ptr + curpos,
+						     fieldsize, 0,
+						     mask + curpos);
+		      ggc_free (zero);
+		      if (!r)
+			return 0;
+		      valueinit = curpos;
+		      curpos += fieldsize;
+		    }
+		  while (curpos != pos)
+		    {
+		      memcpy (ptr + curpos, ptr + valueinit, fieldsize);
+		      memcpy (mask + curpos, mask + valueinit, fieldsize);
+		      curpos += fieldsize;
+		    }
+		}
+
 	      curpos = pos;
 	      if (val)
 		do
@@ -8051,6 +8136,8 @@ native_encode_initializer (tree init, un
 			    if (ptr)
 			      memcpy (ptr + (curpos - o), ptr + (pos - o),
 				      fieldsize);
+			    if (mask)
+			      memcpy (mask + curpos, mask + pos, fieldsize);
 			  }
 			else if (!native_encode_initializer (val,
 							     ptr
@@ -8058,7 +8145,10 @@ native_encode_initializer (tree init, un
 							     : NULL,
 							     fieldsize,
 							     off == -1 ? -1
-								       : 0))
+								       : 0,
+							     mask
+							     ? mask + curpos
+							     : NULL))
 			  return 0;
 			else
 			  {
@@ -8073,6 +8163,7 @@ native_encode_initializer (tree init, un
 			unsigned char *p = NULL;
 			int no = 0;
 			int l;
+			gcc_assert (mask == NULL);
 			if (curpos >= off)
 			  {
 			    if (ptr)
@@ -8086,7 +8177,7 @@ native_encode_initializer (tree init, un
 			    no = off - curpos;
 			    l = len;
 			  }
-			if (!native_encode_initializer (val, p, l, no))
+			if (!native_encode_initializer (val, p, l, no, NULL))
 			  return 0;
 		      }
 		    curpos += fieldsize;
@@ -8100,22 +8191,74 @@ native_encode_initializer (tree init, un
 	{
 	  unsigned HOST_WIDE_INT cnt;
 	  constructor_elt *ce;
+	  tree fld_base = TYPE_FIELDS (type);
+	  tree to_free = NULL_TREE;
 
+	  gcc_assert (TREE_CODE (type) == RECORD_TYPE || mask == NULL);
 	  if (ptr != NULL)
 	    memset (ptr, '\0', MIN (total_bytes - off, len));
-	  FOR_EACH_VEC_SAFE_ELT (CONSTRUCTOR_ELTS (init), cnt, ce)
+	  for (cnt = 0; ; cnt++)
 	    {
-	      tree field = ce->index;
-	      tree val = ce->value;
-	      HOST_WIDE_INT pos, fieldsize;
+	      tree val = NULL_TREE, field = NULL_TREE;
+	      HOST_WIDE_INT pos = 0, fieldsize;
 	      unsigned HOST_WIDE_INT bpos = 0, epos = 0;
 
-	      if (field == NULL_TREE)
-		return 0;
+	      if (to_free)
+		{
+		  ggc_free (to_free);
+		  to_free = NULL_TREE;
+		}
 
-	      pos = int_byte_position (field);
-	      if (off != -1 && (HOST_WIDE_INT) off + len <= pos)
-		continue;
+	      if (vec_safe_iterate (CONSTRUCTOR_ELTS (init), cnt, &ce))
+		{
+		  val = ce->value;
+		  field = ce->index;
+		  if (field == NULL_TREE)
+		    return 0;
+
+		  pos = int_byte_position (field);
+		  if (off != -1 && (HOST_WIDE_INT) off + len <= pos)
+		    continue;
+		}
+	      else if (mask == NULL
+		       || CONSTRUCTOR_NO_CLEARING (init))
+		break;
+	      else
+		pos = total_bytes;
+
+	      if (mask && !CONSTRUCTOR_NO_CLEARING (init))
+		{
+		  tree fld;
+		  for (fld = fld_base; fld; fld = DECL_CHAIN (fld))
+		    {
+		      if (TREE_CODE (fld) != FIELD_DECL)
+			continue;
+		      if (fld == field)
+			break;
+		      if (DECL_PADDING_P (fld))
+			continue;
+		      if (DECL_SIZE_UNIT (fld) == NULL_TREE
+			  || !tree_fits_shwi_p (DECL_SIZE_UNIT (fld)))
+			return 0;
+		      if (integer_zerop (DECL_SIZE_UNIT (fld)))
+			continue;
+		      break;
+		    }
+		  if (fld == NULL_TREE)
+		    {
+		      if (ce == NULL)
+			break;
+		      return 0;
+		    }
+		  fld_base = DECL_CHAIN (fld);
+		  if (fld != field)
+		    {
+		      cnt--;
+		      field = fld;
+		      val = build_constructor (TREE_TYPE (fld), NULL);
+		      to_free = val;
+		    }
+		}
 
 	      if (TREE_CODE (TREE_TYPE (field)) == ARRAY_TYPE
 		  && TYPE_DOMAIN (TREE_TYPE (field))
@@ -8156,30 +8299,49 @@ native_encode_initializer (tree init, un
 		  if (BYTES_BIG_ENDIAN != WORDS_BIG_ENDIAN)
 		    return 0;
 
-		  tree repr = DECL_BIT_FIELD_REPRESENTATIVE (field);
-		  if (repr == NULL_TREE
-		      || TREE_CODE (val) != INTEGER_CST
-		      || !INTEGRAL_TYPE_P (TREE_TYPE (repr)))
+		  if (TREE_CODE (val) != INTEGER_CST)
 		    return 0;
 
-		  HOST_WIDE_INT rpos = int_byte_position (repr);
+		  tree repr = DECL_BIT_FIELD_REPRESENTATIVE (field);
+		  tree repr_type = NULL_TREE;
+		  HOST_WIDE_INT rpos = 0;
+		  if (repr && INTEGRAL_TYPE_P (TREE_TYPE (repr)))
+		    {
+		      rpos = int_byte_position (repr);
+		      repr_type = TREE_TYPE (repr);
+		    }
+		  else
+		    {
+		      repr_type = find_bitfield_repr_type (fieldsize, len);
+		      if (repr_type == NULL_TREE)
+			return 0;
+		      HOST_WIDE_INT repr_size = int_size_in_bytes (repr_type);
+		      gcc_assert (repr_size > 0 && repr_size <= len);
+		      if (pos + repr_size <= len)
+			rpos = pos;
+		      else
+			{
+			  rpos = len - repr_size;
+			  gcc_assert (rpos <= pos);
+			}
+		    }
+
 		  if (rpos > pos)
 		    return 0;
-		  wide_int w = wi::to_wide (val,
-					    TYPE_PRECISION (TREE_TYPE (repr)));
-		  int diff = (TYPE_PRECISION (TREE_TYPE (repr))
+		  wide_int w = wi::to_wide (val, TYPE_PRECISION (repr_type));
+		  int diff = (TYPE_PRECISION (repr_type)
 			      - TYPE_PRECISION (TREE_TYPE (field)));
 		  HOST_WIDE_INT bitoff = (pos - rpos) * BITS_PER_UNIT + bpos;
 		  if (!BYTES_BIG_ENDIAN)
 		    w = wi::lshift (w, bitoff);
 		  else
 		    w = wi::lshift (w, diff - bitoff);
-		  val = wide_int_to_tree (TREE_TYPE (repr), w);
+		  val = wide_int_to_tree (repr_type, w);
 
 		  unsigned char buf[MAX_BITSIZE_MODE_ANY_INT
 				    / BITS_PER_UNIT + 1];
 		  int l = native_encode_int (val, buf, sizeof buf, 0);
-		  if (l * BITS_PER_UNIT != TYPE_PRECISION (TREE_TYPE (repr)))
+		  if (l * BITS_PER_UNIT != TYPE_PRECISION (repr_type))
 		    return 0;
 
 		  if (ptr == NULL)
@@ -8192,15 +8354,31 @@ native_encode_initializer (tree init, un
 		    {
 		      if (!BYTES_BIG_ENDIAN)
 			{
-			  int mask = (1 << bpos) - 1;
-			  buf[pos - rpos] &= ~mask;
-			  buf[pos - rpos] |= ptr[pos - o] & mask;
+			  int msk = (1 << bpos) - 1;
+			  buf[pos - rpos] &= ~msk;
+			  buf[pos - rpos] |= ptr[pos - o] & msk;
+			  if (mask)
+			    {
+			      if (fieldsize > 1 || epos == 0)
+				mask[pos] &= msk;
+			      else
+				mask[pos] &= (msk | ~((1 << epos) - 1));
+			    }
 			}
 		      else
 			{
-			  int mask = (1 << (BITS_PER_UNIT - bpos)) - 1;
-			  buf[pos - rpos] &= mask;
-			  buf[pos - rpos] |= ptr[pos - o] & ~mask;
+			  int msk = (1 << (BITS_PER_UNIT - bpos)) - 1;
+			  buf[pos - rpos] &= msk;
+			  buf[pos - rpos] |= ptr[pos - o] & ~msk;
+			  if (mask)
+			    {
+			      if (fieldsize > 1 || epos == 0)
+				mask[pos] &= ~msk;
+			      else
+				mask[pos] &= (~msk
+					      | ((1 << (BITS_PER_UNIT - epos))
+						 - 1));
+			    }
 			}
 		    }
 		  /* If the bitfield does not end at byte boundary, handle
@@ -8211,27 +8389,37 @@ native_encode_initializer (tree init, un
 		    {
 		      if (!BYTES_BIG_ENDIAN)
 			{
-			  int mask = (1 << epos) - 1;
-			  buf[pos - rpos + fieldsize - 1] &= mask;
+			  int msk = (1 << epos) - 1;
+			  buf[pos - rpos + fieldsize - 1] &= msk;
 			  buf[pos - rpos + fieldsize - 1]
-			    |= ptr[pos + fieldsize - 1 - o] & ~mask;
+			    |= ptr[pos + fieldsize - 1 - o] & ~msk;
+			  if (mask && (fieldsize > 1 || bpos == 0))
+			    mask[pos + fieldsize - 1] &= ~msk;
 			}
 		       else
 			{
-			  int mask = (1 << (BITS_PER_UNIT - epos)) - 1;
-			  buf[pos - rpos + fieldsize - 1] &= ~mask;
+			  int msk = (1 << (BITS_PER_UNIT - epos)) - 1;
+			  buf[pos - rpos + fieldsize - 1] &= ~msk;
 			  buf[pos - rpos + fieldsize - 1]
-			    |= ptr[pos + fieldsize - 1 - o] & mask;
+			    |= ptr[pos + fieldsize - 1 - o] & msk;
+			  if (mask && (fieldsize > 1 || bpos == 0))
+			    mask[pos + fieldsize - 1] &= msk;
 			}
 		    }
 		  if (off == -1
 		      || (pos >= off
 			  && (pos + fieldsize <= (HOST_WIDE_INT) off + len)))
-		    memcpy (ptr + pos - o, buf + (pos - rpos), fieldsize);
+		    {
+		      memcpy (ptr + pos - o, buf + (pos - rpos), fieldsize);
+		      if (mask && (fieldsize > (bpos != 0) + (epos != 0)))
+			memset (mask + pos + (bpos != 0), 0,
+				fieldsize - (bpos != 0) - (epos != 0));
+		    }
 		  else
 		    {
 		      /* Partial overlap.  */
 		      HOST_WIDE_INT fsz = fieldsize;
+		      gcc_assert (mask == NULL);
 		      if (pos < off)
 			{
 			  fsz -= (off - pos);
@@ -8251,7 +8439,8 @@ native_encode_initializer (tree init, un
 		  if (!native_encode_initializer (val, ptr ? ptr + pos - o
 							   : NULL,
 						  fieldsize,
-						  off == -1 ? -1 : 0))
+						  off == -1 ? -1 : 0,
+						  mask ? mask + pos : NULL))
 		    return 0;
 		}
 	      else
@@ -8260,6 +8449,7 @@ native_encode_initializer (tree init, un
 		  unsigned char *p = NULL;
 		  int no = 0;
 		  int l;
+		  gcc_assert (mask == NULL);
 		  if (pos >= off)
 		    {
 		      if (ptr)
@@ -8273,7 +8463,7 @@ native_encode_initializer (tree init, un
 		      no = off - pos;
 		      l = len;
 		    }
-		  if (!native_encode_initializer (val, p, l, no))
+		  if (!native_encode_initializer (val, p, l, no, NULL))
 		    return 0;
 		}
 	    }
@@ -8549,6 +8739,168 @@ can_native_interpret_type_p (tree type)
     }
 }
 
+/* Attempt to interpret an aggregate of TYPE from LEN bytes encoded in
+   target byte order, starting at PTR + OFF.  Does not handle unions.  */
+
+tree
+native_interpret_aggregate (tree type, const unsigned char *ptr, int off,
+			    int len)
+{
+  vec<constructor_elt, va_gc> *elts = NULL;
+  if (TREE_CODE (type) == ARRAY_TYPE)
+    {
+      HOST_WIDE_INT eltsz = int_size_in_bytes (TREE_TYPE (type));
+      if (eltsz < 0 || eltsz > len || TYPE_DOMAIN (type) == NULL_TREE)
+	return NULL_TREE;
+
+      HOST_WIDE_INT cnt = 0;
+      if (TYPE_MAX_VALUE (TYPE_DOMAIN (type)))
+	{
+	  if (!tree_fits_shwi_p (TYPE_MAX_VALUE (TYPE_DOMAIN (type))))
+	    return NULL_TREE;
+	  cnt = tree_to_shwi (TYPE_MAX_VALUE (TYPE_DOMAIN (type))) + 1;
+	}
+      if (eltsz == 0)
+	cnt = 0;
+      HOST_WIDE_INT pos = 0;
+      for (HOST_WIDE_INT i = 0; i < cnt; i++, pos += eltsz)
+	{
+	  tree v = NULL_TREE;
+	  if (pos >= len || pos + eltsz > len)
+	    return NULL_TREE;
+	  if (can_native_interpret_type_p (TREE_TYPE (type)))
+	    {
+	      v = native_interpret_expr (TREE_TYPE (type),
+					 ptr + off + pos, eltsz);
+	      if (v == NULL_TREE)
+		return NULL_TREE;
+	    }
+	  else if (TREE_CODE (TREE_TYPE (type)) == RECORD_TYPE
+		   || TREE_CODE (TREE_TYPE (type)) == ARRAY_TYPE)
+	    v = native_interpret_aggregate (TREE_TYPE (type), ptr, off + pos,
+					    eltsz);
+	  if (v == NULL_TREE)
+	    return NULL_TREE;
+	  CONSTRUCTOR_APPEND_ELT (elts, size_int (i), v);
+	}
+      return build_constructor (type, elts);
+    }
+  if (TREE_CODE (type) != RECORD_TYPE)
+    return NULL_TREE;
+  for (tree field = TYPE_FIELDS (type); field; field = DECL_CHAIN (field))
+    {
+      if (TREE_CODE (field) != FIELD_DECL || DECL_PADDING_P (field))
+	continue;
+      tree fld = field;
+      HOST_WIDE_INT bitoff = 0, pos = 0, sz = 0;
+      int diff = 0;
+      tree v = NULL_TREE;
+      if (DECL_BIT_FIELD (field))
+	{
+	  fld = DECL_BIT_FIELD_REPRESENTATIVE (field);
+	  if (fld && INTEGRAL_TYPE_P (TREE_TYPE (fld)))
+	    {
+	      poly_int64 bitoffset;
+	      poly_uint64 field_offset, fld_offset;
+	      if (poly_int_tree_p (DECL_FIELD_OFFSET (field), &field_offset)
+		  && poly_int_tree_p (DECL_FIELD_OFFSET (fld), &fld_offset))
+		bitoffset = (field_offset - fld_offset) * BITS_PER_UNIT;
+	      else
+		bitoffset = 0;
+	      bitoffset += (tree_to_uhwi (DECL_FIELD_BIT_OFFSET (field))
+			    - tree_to_uhwi (DECL_FIELD_BIT_OFFSET (fld)));
+	      diff = (TYPE_PRECISION (TREE_TYPE (fld))
+		      - TYPE_PRECISION (TREE_TYPE (field)));
+	      if (!bitoffset.is_constant (&bitoff)
+		  || bitoff < 0
+		  || bitoff > diff)
+		return NULL_TREE;
+	    }
+	  else
+	    {
+	      if (!tree_fits_uhwi_p (DECL_FIELD_BIT_OFFSET (field)))
+		return NULL_TREE;
+	      int fieldsize = TYPE_PRECISION (TREE_TYPE (field));
+	      int bpos = tree_to_uhwi (DECL_FIELD_BIT_OFFSET (field));
+	      bpos %= BITS_PER_UNIT;
+	      fieldsize += bpos;
+	      fieldsize += BITS_PER_UNIT - 1;
+	      fieldsize /= BITS_PER_UNIT;
+	      tree repr_type = find_bitfield_repr_type (fieldsize, len);
+	      if (repr_type == NULL_TREE)
+		return NULL_TREE;
+	      sz = int_size_in_bytes (repr_type);
+	      if (sz < 0 || sz > len)
+		return NULL_TREE;
+	      pos = int_byte_position (field);
+	      if (pos < 0 || pos > len || pos + fieldsize > len)
+		return NULL_TREE;
+	      HOST_WIDE_INT rpos;
+	      if (pos + sz <= len)
+		rpos = pos;
+	      else
+		{
+		  rpos = len - sz;
+		  gcc_assert (rpos <= pos);
+		}
+	      bitoff = (HOST_WIDE_INT) (pos - rpos) * BITS_PER_UNIT + bpos;
+	      pos = rpos;
+	      diff = (TYPE_PRECISION (repr_type)
+		      - TYPE_PRECISION (TREE_TYPE (field)));
+	      v = native_interpret_expr (repr_type, ptr + off + pos, sz);
+	      if (v == NULL_TREE)
+		return NULL_TREE;
+	      fld = NULL_TREE;
+	    }
+	}
+
+      if (fld)
+	{
+	  sz = int_size_in_bytes (TREE_TYPE (fld));
+	  if (sz < 0 || sz > len)
+	    return NULL_TREE;
+	  tree byte_pos = byte_position (fld);
+	  if (!tree_fits_shwi_p (byte_pos))
+	    return NULL_TREE;
+	  pos = tree_to_shwi (byte_pos);
+	  if (pos < 0 || pos > len || pos + sz > len)
+	    return NULL_TREE;
+	}
+      if (fld == NULL_TREE)
+	/* Already handled above.  */;
+      else if (can_native_interpret_type_p (TREE_TYPE (fld)))
+	{
+	  v = native_interpret_expr (TREE_TYPE (fld),
+				     ptr + off + pos, sz);
+	  if (v == NULL_TREE)
+	    return NULL_TREE;
+	}
+      else if (TREE_CODE (TREE_TYPE (fld)) == RECORD_TYPE
+	       || TREE_CODE (TREE_TYPE (fld)) == ARRAY_TYPE)
+	v = native_interpret_aggregate (TREE_TYPE (fld), ptr, off + pos, sz);
+      if (v == NULL_TREE)
+	return NULL_TREE;
+      if (fld != field)
+	{
+	  if (TREE_CODE (v) != INTEGER_CST)
+	    return NULL_TREE;
+
+	  /* FIXME: Figure out how to handle PDP endian bitfields.  */
+	  if (BYTES_BIG_ENDIAN != WORDS_BIG_ENDIAN)
+	    return NULL_TREE;
+	  if (!BYTES_BIG_ENDIAN)
+	    v = wide_int_to_tree (TREE_TYPE (field),
+				  wi::lrshift (wi::to_wide (v), bitoff));
+	  else
+	    v = wide_int_to_tree (TREE_TYPE (field),
+				  wi::lrshift (wi::to_wide (v),
+					       diff - bitoff));
+	}
+      CONSTRUCTOR_APPEND_ELT (elts, field, v);
+    }
+  return build_constructor (type, elts);
+}
+
 /* Routines for manipulation of native_encode_expr encoded data if the encoded
    or extracted constant positions and/or sizes aren't byte aligned.  */
 
--- gcc/gimple-fold.h.jj	2020-11-26 14:05:07.393268700 +0100
+++ gcc/gimple-fold.h	2020-11-26 15:50:08.324992304 +0100
@@ -35,6 +35,7 @@ extern tree maybe_fold_and_comparisons (
 					enum tree_code, tree, tree);
 extern tree maybe_fold_or_comparisons (tree, enum tree_code, tree, tree,
 				       enum tree_code, tree, tree);
+extern void clear_type_padding_in_mask (tree, unsigned char *);
 extern bool optimize_atomic_compare_exchange_p (gimple *);
 extern void fold_builtin_atomic_compare_exchange (gimple_stmt_iterator *);
 extern bool arith_overflowed_p (enum tree_code, const_tree, const_tree,
--- gcc/gimple-fold.c.jj	2020-11-26 14:18:10.826532574 +0100
+++ gcc/gimple-fold.c	2020-11-26 15:51:25.806128376 +0100
@@ -3958,6 +3958,10 @@ static const size_t clear_padding_buf_si
 /* Data passed through __builtin_clear_padding folding.  */
 struct clear_padding_struct {
   location_t loc;
+  /* False during __builtin_clear_padding folding, true during
+     clear_type_padding_in_mask.  In the latter case, instead of clearing
+     the non-padding bits in the union_ptr array, clear the padding bits
+     in it.  */
+  bool clear_in_mask;
   tree base;
   tree alias_type;
   gimple_stmt_iterator *gsi;
@@ -4000,6 +4004,39 @@ clear_padding_flush (clear_padding_struc
   size_t padding_bytes = buf->padding_bytes;
   if (buf->union_ptr)
     {
+      if (buf->clear_in_mask)
+	{
+	  /* During clear_type_padding_in_mask, clear the padding
+	     bits set in buf->buf in the buf->union_ptr mask.  */
+	  for (size_t i = 0; i < end; i++)
+	    {
+	      if (buf->buf[i] == (unsigned char) ~0)
+		padding_bytes++;
+	      else
+		{
+		  memset (&buf->union_ptr[buf->off + i - padding_bytes],
+			  0, padding_bytes);
+		  padding_bytes = 0;
+		  buf->union_ptr[buf->off + i] &= ~buf->buf[i];
+		}
+	    }
+	  if (full)
+	    {
+	      memset (&buf->union_ptr[buf->off + end - padding_bytes],
+		      0, padding_bytes);
+	      buf->off = 0;
+	      buf->size = 0;
+	      buf->padding_bytes = 0;
+	    }
+	  else
+	    {
+	      memmove (buf->buf, buf->buf + end, buf->size - end);
+	      buf->off += end;
+	      buf->size -= end;
+	      buf->padding_bytes = padding_bytes;
+	    }
+	  return;
+	}
       /* Inside of a union, instead of emitting any code, instead
 	 clear all bits in the union_ptr buffer that are clear
 	 in buf.  Whole padding bytes don't clear anything.  */
@@ -4311,6 +4348,7 @@ clear_padding_union (clear_padding_struc
 	clear_padding_flush (buf, false);
       union_buf = XALLOCA (clear_padding_struct);
       union_buf->loc = buf->loc;
+      union_buf->clear_in_mask = buf->clear_in_mask;
       union_buf->base = NULL_TREE;
       union_buf->alias_type = NULL_TREE;
       union_buf->gsi = NULL;
@@ -4335,9 +4373,10 @@ clear_padding_union (clear_padding_struc
 	      continue;
 	    gcc_assert (TREE_CODE (TREE_TYPE (field)) == ARRAY_TYPE
 			&& !COMPLETE_TYPE_P (TREE_TYPE (field)));
-	    error_at (buf->loc, "flexible array member %qD does not have "
-				"well defined padding bits for %qs",
-		      field, "__builtin_clear_padding");
+	    if (!buf->clear_in_mask)
+	      error_at (buf->loc, "flexible array member %qD does not have "
+				  "well defined padding bits for %qs",
+			field, "__builtin_clear_padding");
 	    continue;
 	  }
 	HOST_WIDE_INT fldsz = tree_to_shwi (DECL_SIZE_UNIT (field));
@@ -4529,9 +4568,10 @@ clear_padding_type (clear_padding_struct
 		  continue;
 		gcc_assert (TREE_CODE (ftype) == ARRAY_TYPE
 			    && !COMPLETE_TYPE_P (ftype));
-		error_at (buf->loc, "flexible array member %qD does not have "
-				    "well defined padding bits for %qs",
-			  field, "__builtin_clear_padding");
+		if (!buf->clear_in_mask)
+		  error_at (buf->loc, "flexible array member %qD does not "
+				      "have well defined padding bits for %qs",
+			    field, "__builtin_clear_padding");
 	      }
 	    else if (is_empty_type (TREE_TYPE (field)))
 	      continue;
@@ -4643,6 +4683,27 @@ clear_padding_type (clear_padding_struct
     }
 }
 
+/* Clear padding bits of TYPE in MASK.  */
+
+void
+clear_type_padding_in_mask (tree type, unsigned char *mask)
+{
+  clear_padding_struct buf;
+  buf.loc = UNKNOWN_LOCATION;
+  buf.clear_in_mask = true;
+  buf.base = NULL_TREE;
+  buf.alias_type = NULL_TREE;
+  buf.gsi = NULL;
+  buf.align = 0;
+  buf.off = 0;
+  buf.padding_bytes = 0;
+  buf.sz = int_size_in_bytes (type);
+  buf.size = 0;
+  buf.union_ptr = mask;
+  clear_padding_type (&buf, type, buf.sz);
+  clear_padding_flush (&buf, true);
+}
+
 /* Fold __builtin_clear_padding builtin.  */
 
 static bool
@@ -4662,6 +4723,7 @@ gimple_fold_builtin_clear_padding (gimpl
   gsi_prev (&gsiprev);
 
   buf.loc = loc;
+  buf.clear_in_mask = false;
   buf.base = ptr;
   buf.alias_type = NULL_TREE;
   buf.gsi = gsi;
--- gcc/c-family/c-common.h.jj	2020-11-26 14:05:07.375268900 +0100
+++ gcc/c-family/c-common.h	2020-11-26 14:35:07.597192656 +0100
@@ -169,7 +169,7 @@ enum rid
   RID_HAS_NOTHROW_COPY,        RID_HAS_TRIVIAL_ASSIGN,
   RID_HAS_TRIVIAL_CONSTRUCTOR, RID_HAS_TRIVIAL_COPY,
   RID_HAS_TRIVIAL_DESTRUCTOR,  RID_HAS_UNIQUE_OBJ_REPRESENTATIONS,
-  RID_HAS_VIRTUAL_DESTRUCTOR,
+  RID_HAS_VIRTUAL_DESTRUCTOR,  RID_BUILTIN_BIT_CAST,
   RID_IS_ABSTRACT,             RID_IS_AGGREGATE,
   RID_IS_BASE_OF,              RID_IS_CLASS,
   RID_IS_EMPTY,                RID_IS_ENUM,
--- gcc/c-family/c-common.c.jj	2020-11-26 14:05:07.375268900 +0100
+++ gcc/c-family/c-common.c	2020-11-26 14:35:07.597192656 +0100
@@ -374,6 +374,7 @@ const struct c_common_resword c_common_r
   { "__auto_type",	RID_AUTO_TYPE,	D_CONLY },
   { "__bases",          RID_BASES, D_CXXONLY },
   { "__builtin_addressof", RID_ADDRESSOF, D_CXXONLY },
+  { "__builtin_bit_cast", RID_BUILTIN_BIT_CAST, D_CXXONLY },
   { "__builtin_call_with_static_chain",
     RID_BUILTIN_CALL_WITH_STATIC_CHAIN, D_CONLY },
   { "__builtin_choose_expr", RID_CHOOSE_EXPR, D_CONLY },
--- gcc/cp/cp-tree.h.jj	2020-11-26 14:05:07.379268856 +0100
+++ gcc/cp/cp-tree.h	2020-11-26 14:35:07.589192745 +0100
@@ -7308,6 +7308,8 @@ extern tree finish_builtin_launder		(loc
 						 tsubst_flags_t);
 extern tree cp_build_vec_convert		(tree, location_t, tree,
 						 tsubst_flags_t);
+extern tree cp_build_bit_cast			(location_t, tree, tree,
+						 tsubst_flags_t);
 extern void start_lambda_scope			(tree);
 extern void record_lambda_scope			(tree);
 extern void record_null_lambda_scope		(tree);
--- gcc/cp/cp-tree.def.jj	2020-11-26 14:05:07.377268878 +0100
+++ gcc/cp/cp-tree.def	2020-11-26 14:35:07.591192723 +0100
@@ -437,6 +437,9 @@ DEFTREECODE (UNARY_RIGHT_FOLD_EXPR, "una
 DEFTREECODE (BINARY_LEFT_FOLD_EXPR, "binary_left_fold_expr", tcc_expression, 3)
 DEFTREECODE (BINARY_RIGHT_FOLD_EXPR, "binary_right_fold_expr", tcc_expression, 3)
 
+/* Represents the __builtin_bit_cast (type, expr) expression.
+   The type is in TREE_TYPE, expression in TREE_OPERAND (bitcast, 0).  */
+DEFTREECODE (BIT_CAST_EXPR, "bit_cast_expr", tcc_expression, 1)
 
 /** C++ extensions. */
 
--- gcc/cp/cp-objcp-common.c.jj	2020-11-26 14:05:07.377268878 +0100
+++ gcc/cp/cp-objcp-common.c	2020-11-26 14:35:07.587192767 +0100
@@ -391,6 +391,7 @@ names_builtin_p (const char *name)
     case RID_BUILTIN_HAS_ATTRIBUTE:
     case RID_BUILTIN_SHUFFLE:
     case RID_BUILTIN_LAUNDER:
+    case RID_BUILTIN_BIT_CAST:
     case RID_OFFSETOF:
     case RID_HAS_NOTHROW_ASSIGN:
     case RID_HAS_NOTHROW_CONSTRUCTOR:
@@ -489,6 +490,7 @@ cp_common_init_ts (void)
   MARK_TS_EXP (ALIGNOF_EXPR);
   MARK_TS_EXP (ARROW_EXPR);
   MARK_TS_EXP (AT_ENCODE_EXPR);
+  MARK_TS_EXP (BIT_CAST_EXPR);
   MARK_TS_EXP (CAST_EXPR);
   MARK_TS_EXP (CONST_CAST_EXPR);
   MARK_TS_EXP (CTOR_INITIALIZER);
--- gcc/cp/cxx-pretty-print.c.jj	2020-11-26 14:05:07.379268856 +0100
+++ gcc/cp/cxx-pretty-print.c	2020-11-26 14:35:07.591192723 +0100
@@ -655,6 +655,15 @@ cxx_pretty_printer::postfix_expression (
       pp_right_paren (this);
       break;
 
+    case BIT_CAST_EXPR:
+      pp_cxx_ws_string (this, "__builtin_bit_cast");
+      pp_left_paren (this);
+      type_id (TREE_TYPE (t));
+      pp_comma (this);
+      expression (TREE_OPERAND (t, 0));
+      pp_right_paren (this);
+      break;
+
     case EMPTY_CLASS_EXPR:
       type_id (TREE_TYPE (t));
       pp_left_paren (this);
--- gcc/cp/parser.c.jj	2020-11-26 14:05:07.384268800 +0100
+++ gcc/cp/parser.c	2020-11-26 14:35:07.587192767 +0100
@@ -7247,6 +7247,32 @@ cp_parser_postfix_expression (cp_parser
 				     tf_warning_or_error);
       }
 
+    case RID_BUILTIN_BIT_CAST:
+      {
+	tree expression;
+	tree type;
+	/* Consume the `__builtin_bit_cast' token.  */
+	cp_lexer_consume_token (parser->lexer);
+	/* Look for the opening `('.  */
+	matching_parens parens;
+	parens.require_open (parser);
+	location_t type_location
+	  = cp_lexer_peek_token (parser->lexer)->location;
+	/* Parse the type-id.  */
+	{
+	  type_id_in_expr_sentinel s (parser);
+	  type = cp_parser_type_id (parser);
+	}
+	/* Look for the `,'.  */
+	cp_parser_require (parser, CPP_COMMA, RT_COMMA);
+	/* Now, parse the assignment-expression.  */
+	expression = cp_parser_assignment_expression (parser);
+	/* Look for the closing `)'.  */
+	parens.require_close (parser);
+	return cp_build_bit_cast (type_location, type, expression,
+				  tf_warning_or_error);
+      }
+
     default:
       {
 	tree type;
--- gcc/cp/semantics.c.jj	2020-11-26 14:05:07.390268733 +0100
+++ gcc/cp/semantics.c	2020-11-26 14:35:07.590192734 +0100
@@ -10679,4 +10679,75 @@ cp_build_vec_convert (tree arg, location
   return build_call_expr_internal_loc (loc, IFN_VEC_CONVERT, type, 1, arg);
 }
 
+/* Finish __builtin_bit_cast (type, arg).  */
+
+tree
+cp_build_bit_cast (location_t loc, tree type, tree arg,
+		   tsubst_flags_t complain)
+{
+  if (error_operand_p (type))
+    return error_mark_node;
+  if (!dependent_type_p (type))
+    {
+      if (!complete_type_or_maybe_complain (type, NULL_TREE, complain))
+	return error_mark_node;
+      if (TREE_CODE (type) == ARRAY_TYPE)
+	{
+	  /* std::bit_cast to a destination ARRAY_TYPE is not possible, as
+	     functions may not return an array, so don't bother supporting
+	     it (which would also mean dealing with VLAs etc.).  */
+	  error_at (loc, "%<__builtin_bit_cast%> destination type %qT "
+			 "is an array type", type);
+	  return error_mark_node;
+	}
+      if (!trivially_copyable_p (type))
+	{
+	  error_at (loc, "%<__builtin_bit_cast%> destination type %qT "
+			 "is not trivially copyable", type);
+	  return error_mark_node;
+	}
+    }
+
+  if (error_operand_p (arg))
+    return error_mark_node;
+
+  if (!type_dependent_expression_p (arg))
+    {
+      if (TREE_CODE (TREE_TYPE (arg)) == ARRAY_TYPE)
+	{
+	  /* Don't perform array-to-pointer conversion.  */
+	  arg = mark_rvalue_use (arg, loc, true);
+	  if (!complete_type_or_maybe_complain (TREE_TYPE (arg), arg, complain))
+	    return error_mark_node;
+	}
+      else
+	arg = decay_conversion (arg, complain);
+
+      if (error_operand_p (arg))
+	return error_mark_node;
+
+      if (!trivially_copyable_p (TREE_TYPE (arg)))
+	{
+	  error_at (cp_expr_loc_or_loc (arg, loc),
+		    "%<__builtin_bit_cast%> source type %qT "
+		    "is not trivially copyable", TREE_TYPE (arg));
+	  return error_mark_node;
+	}
+      if (!dependent_type_p (type)
+	  && !cp_tree_equal (TYPE_SIZE_UNIT (type),
+			     TYPE_SIZE_UNIT (TREE_TYPE (arg))))
+	{
+	  error_at (loc, "%<__builtin_bit_cast%> source size %qE "
+			 "not equal to destination type size %qE",
+			 TYPE_SIZE_UNIT (TREE_TYPE (arg)),
+			 TYPE_SIZE_UNIT (type));
+	  return error_mark_node;
+	}
+    }
+
+  tree ret = build_min (BIT_CAST_EXPR, type, arg);
+  SET_EXPR_LOCATION (ret, loc);
+  return ret;
+}
+
 #include "gt-cp-semantics.h"
--- gcc/cp/tree.c.jj	2020-11-26 14:05:07.390268733 +0100
+++ gcc/cp/tree.c	2020-11-26 14:35:07.591192723 +0100
@@ -3918,6 +3918,7 @@ cp_tree_equal (tree t1, tree t2)
     CASE_CONVERT:
     case NON_LVALUE_EXPR:
     case VIEW_CONVERT_EXPR:
+    case BIT_CAST_EXPR:
       if (!same_type_p (TREE_TYPE (t1), TREE_TYPE (t2)))
 	return false;
       /* Now compare operands as usual.  */
@@ -5163,6 +5164,7 @@ cp_walk_subtrees (tree *tp, int *walk_su
     case CONST_CAST_EXPR:
     case DYNAMIC_CAST_EXPR:
     case IMPLICIT_CONV_EXPR:
+    case BIT_CAST_EXPR:
       if (TREE_TYPE (*tp))
 	WALK_SUBTREE (TREE_TYPE (*tp));
 
--- gcc/cp/pt.c.jj	2020-11-26 14:05:07.388268756 +0100
+++ gcc/cp/pt.c	2020-11-26 14:35:07.593192700 +0100
@@ -16727,6 +16727,13 @@ tsubst_copy (tree t, tree args, tsubst_f
 	return build1 (code, type, op0);
       }
 
+    case BIT_CAST_EXPR:
+      {
+	tree type = tsubst (TREE_TYPE (t), args, complain, in_decl);
+	tree op0 = tsubst_copy (TREE_OPERAND (t, 0), args, complain, in_decl);
+	return cp_build_bit_cast (EXPR_LOCATION (t), type, op0, complain);
+      }
+
     case SIZEOF_EXPR:
       if (PACK_EXPANSION_P (TREE_OPERAND (t, 0))
 	  || ARGUMENT_PACK_P (TREE_OPERAND (t, 0)))
--- gcc/cp/constexpr.c.jj	2020-11-26 14:05:07.376268889 +0100
+++ gcc/cp/constexpr.c	2020-11-26 15:28:53.215212466 +0100
@@ -3945,6 +3945,214 @@ cxx_eval_bit_field_ref (const constexpr_
   return error_mark_node;
 }
 
+/* Helper for cxx_eval_bit_cast.
+   Check [bit.cast]/3 rules: bit_cast is constexpr only if the To and From
+   types and the types of all their subobjects have is_union_v<T>,
+   is_pointer_v<T>, is_member_pointer_v<T> and is_volatile_v<T> all false
+   and have no non-static data members of reference type.  */
+
+static bool
+check_bit_cast_type (const constexpr_ctx *ctx, location_t loc, tree type,
+		     tree orig_type)
+{
+  if (TREE_CODE (type) == UNION_TYPE)
+    {
+      if (!ctx->quiet)
+	{
+	  if (type == orig_type)
+	    error_at (loc, "%qs is not a constant expression because %qT is "
+			   "a union type", "__builtin_bit_cast", type);
+	  else
+	    error_at (loc, "%qs is not a constant expression because %qT "
+			   "contains a union type", "__builtin_bit_cast",
+		      orig_type);
+	}
+      return true;
+    }
+  if (TREE_CODE (type) == POINTER_TYPE)
+    {
+      if (!ctx->quiet)
+	{
+	  if (type == orig_type)
+	    error_at (loc, "%qs is not a constant expression because %qT is "
+			   "a pointer type", "__builtin_bit_cast", type);
+	  else
+	    error_at (loc, "%qs is not a constant expression because %qT "
+			   "contains a pointer type", "__builtin_bit_cast",
+		      orig_type);
+	}
+      return true;
+    }
+  if (TREE_CODE (type) == REFERENCE_TYPE)
+    {
+      if (!ctx->quiet)
+	{
+	  if (type == orig_type)
+	    error_at (loc, "%qs is not a constant expression because %qT is "
+			   "a reference type", "__builtin_bit_cast", type);
+	  else
+	    error_at (loc, "%qs is not a constant expression because %qT "
+			   "contains a reference type", "__builtin_bit_cast",
+		      orig_type);
+	}
+      return true;
+    }
+  if (TYPE_PTRMEM_P (type))
+    {
+      if (!ctx->quiet)
+	{
+	  if (type == orig_type)
+	    error_at (loc, "%qs is not a constant expression because %qT is "
+			   "a pointer to member type", "__builtin_bit_cast",
+		      type);
+	  else
+	    error_at (loc, "%qs is not a constant expression because %qT "
+			   "contains a pointer to member type",
+		      "__builtin_bit_cast", orig_type);
+	}
+      return true;
+    }
+  if (TYPE_VOLATILE (type))
+    {
+      if (!ctx->quiet)
+	{
+	  if (type == orig_type)
+	    error_at (loc, "%qs is not a constant expression because %qT is "
+			   "volatile", "__builtin_bit_cast", type);
+	  else
+	    error_at (loc, "%qs is not a constant expression because %qT "
+			   "contains a volatile subobject",
+		      "__builtin_bit_cast", orig_type);
+	}
+      return true;
+    }
+  if (TREE_CODE (type) == RECORD_TYPE)
+    for (tree field = TYPE_FIELDS (type); field; field = DECL_CHAIN (field))
+      if (TREE_CODE (field) == FIELD_DECL
+	  && check_bit_cast_type (ctx, loc, TREE_TYPE (field), orig_type))
+	return true;
+  return false;
+}
+
+/* Subroutine of cxx_eval_constant_expression.
+   Attempt to evaluate a BIT_CAST_EXPR.  */
+
+static tree
+cxx_eval_bit_cast (const constexpr_ctx *ctx, tree t, bool *non_constant_p,
+		   bool *overflow_p)
+{
+  if (check_bit_cast_type (ctx, EXPR_LOCATION (t), TREE_TYPE (t),
+			   TREE_TYPE (t))
+      || check_bit_cast_type (ctx, cp_expr_loc_or_loc (TREE_OPERAND (t, 0),
+						       EXPR_LOCATION (t)),
+			      TREE_TYPE (TREE_OPERAND (t, 0)),
+			      TREE_TYPE (TREE_OPERAND (t, 0))))
+    {
+      *non_constant_p = true;
+      return t;
+    }
+
+  tree op = cxx_eval_constant_expression (ctx, TREE_OPERAND (t, 0), false,
+					  non_constant_p, overflow_p);
+  if (*non_constant_p)
+    return t;
+
+  location_t loc = EXPR_LOCATION (t);
+  if (BITS_PER_UNIT != 8 || CHAR_BIT != 8)
+    {
+      if (!ctx->quiet)
+	sorry_at (loc, "%qs cannot be constant evaluated on the target",
+		       "__builtin_bit_cast");
+      *non_constant_p = true;
+      return t;
+    }
+
+  if (!tree_fits_shwi_p (TYPE_SIZE_UNIT (TREE_TYPE (t))))
+    {
+      if (!ctx->quiet)
+	sorry_at (loc, "%qs cannot be constant evaluated because the "
+		       "type is too large", "__builtin_bit_cast");
+      *non_constant_p = true;
+      return t;
+    }
+
+  HOST_WIDE_INT len = tree_to_shwi (TYPE_SIZE_UNIT (TREE_TYPE (t)));
+  if (len < 0 || (int) len != len)
+    {
+      if (!ctx->quiet)
+	sorry_at (loc, "%qs cannot be constant evaluated because the "
+		       "type is too large", "__builtin_bit_cast");
+      *non_constant_p = true;
+      return t;
+    }
+
+  unsigned char buf[64];
+  unsigned char *ptr, *mask;
+  size_t alen = (size_t) len * 2;
+  if (alen <= sizeof (buf))
+    ptr = buf;
+  else
+    ptr = XNEWVEC (unsigned char, alen);
+  mask = ptr + (size_t) len;
+  /* At the beginning consider everything indeterminate.  */
+  memset (mask, ~0, (size_t) len);
+
+  if (native_encode_initializer (op, ptr, len, 0, mask) != len)
+    {
+      if (!ctx->quiet)
+	sorry_at (loc, "%qs cannot be constant evaluated because the "
+		       "argument cannot be encoded", "__builtin_bit_cast");
+      *non_constant_p = true;
+      return t;
+    }
+
+  if (can_native_interpret_type_p (TREE_TYPE (t)))
+    if (tree r = native_interpret_expr (TREE_TYPE (t), ptr, len))
+      {
+	for (int i = 0; i < len; i++)
+	  if (mask[i])
+	    {
+	      if (!ctx->quiet)
+		error_at (loc, "%qs accessing uninitialized byte at offset %d",
+			  "__builtin_bit_cast", i);
+	      *non_constant_p = true;
+	      r = t;
+	      break;
+	    }
+	if (ptr != buf)
+	  XDELETE (ptr);
+	return r;
+      }
+
+  if (TREE_CODE (TREE_TYPE (t)) == RECORD_TYPE)
+    {
+      tree r = native_interpret_aggregate (TREE_TYPE (t), ptr, 0, len);
+      if (r != NULL_TREE)
+	{
+	  clear_type_padding_in_mask (TREE_TYPE (t), mask);
+	  for (int i = 0; i < len; i++)
+	    if (mask[i])
+	      {
+		if (!ctx->quiet)
+		  error_at (loc, "%qs accessing uninitialized byte at offset "
+				 "%d", "__builtin_bit_cast", i);
+		*non_constant_p = true;
+		r = t;
+		break;
+	      }
+	  if (ptr != buf)
+	    XDELETE (ptr);
+	  return r;
+	}
+    }
+
+  if (!ctx->quiet)
+    sorry_at (loc, "%qs cannot be constant evaluated because the "
+		   "argument cannot be interpreted", "__builtin_bit_cast");
+  *non_constant_p = true;
+  return t;
+}
+
 /* Subroutine of cxx_eval_constant_expression.
    Evaluate a short-circuited logical expression T in the context
    of a given constexpr CALL.  BAILOUT_VALUE is the value for
@@ -6648,6 +6856,10 @@ cxx_eval_constant_expression (const cons
       *non_constant_p = true;
       return t;
 
+    case BIT_CAST_EXPR:
+      r = cxx_eval_bit_cast (ctx, t, non_constant_p, overflow_p);
+      break;
+
     default:
       if (STATEMENT_CODE_P (TREE_CODE (t)))
 	{
@@ -8436,6 +8648,9 @@ potential_constant_expression_1 (tree t,
     case ANNOTATE_EXPR:
       return RECUR (TREE_OPERAND (t, 0), rval);
 
+    case BIT_CAST_EXPR:
+      return RECUR (TREE_OPERAND (t, 0), rval);
+
     /* Coroutine await, yield and return expressions are not.  */
     case CO_AWAIT_EXPR:
     case CO_YIELD_EXPR:
--- gcc/cp/cp-gimplify.c.jj	2020-11-26 14:05:07.377268878 +0100
+++ gcc/cp/cp-gimplify.c	2020-11-26 14:35:07.590192734 +0100
@@ -1552,6 +1552,11 @@ cp_genericize_r (tree *stmt_p, int *walk
 				 cp_genericize_r, cp_walk_subtrees);
       break;
 
+    case BIT_CAST_EXPR:
+      *stmt_p = build1_loc (EXPR_LOCATION (stmt), VIEW_CONVERT_EXPR,
+			    TREE_TYPE (stmt), TREE_OPERAND (stmt, 0));
+      break;
+
     default:
       if (IS_TYPE_OR_DECL_P (stmt))
 	*walk_subtrees = 0;
--- gcc/testsuite/g++.dg/cpp2a/bit-cast1.C.jj	2020-11-26 14:35:07.594192689 +0100
+++ gcc/testsuite/g++.dg/cpp2a/bit-cast1.C	2020-11-26 14:35:07.594192689 +0100
@@ -0,0 +1,47 @@
+// { dg-do compile }
+
+struct S { short a, b; };
+struct T { float a[16]; };
+struct U { int b[16]; };
+
+#if __SIZEOF_FLOAT__ == __SIZEOF_INT__
+int
+f1 (float x)
+{
+  return __builtin_bit_cast (int, x);
+}
+#endif
+
+#if 2 * __SIZEOF_SHORT__ == __SIZEOF_INT__
+S
+f2 (int x)
+{
+  return __builtin_bit_cast (S, x);
+}
+
+int
+f3 (S x)
+{
+  return __builtin_bit_cast (int, x);
+}
+#endif
+
+#if __SIZEOF_FLOAT__ == __SIZEOF_INT__
+U
+f4 (T &x)
+{
+  return __builtin_bit_cast (U, x);
+}
+
+T
+f5 (int (&x)[16])
+{
+  return __builtin_bit_cast (T, x);
+}
+#endif
+
+int
+f6 ()
+{
+  return __builtin_bit_cast (unsigned char, (signed char) 0);
+}
--- gcc/testsuite/g++.dg/cpp2a/bit-cast2.C.jj	2020-11-26 14:35:07.594192689 +0100
+++ gcc/testsuite/g++.dg/cpp2a/bit-cast2.C	2020-11-26 14:35:07.594192689 +0100
@@ -0,0 +1,57 @@
+// { dg-do compile }
+
+struct S { ~S (); int s; };
+S s;
+struct V;				// { dg-message "forward declaration of 'struct V'" }
+extern V v;				// { dg-error "'v' has incomplete type" }
+extern V *p;
+struct U { int a, b; };
+U u;
+
+void
+foo (int *q)
+{
+  __builtin_bit_cast (int, s);		// { dg-error "'__builtin_bit_cast' source type 'S' is not trivially copyable" }
+  __builtin_bit_cast (S, 0);		// { dg-error "'__builtin_bit_cast' destination type 'S' is not trivially copyable" }
+  __builtin_bit_cast (int &, q);	// { dg-error "'__builtin_bit_cast' destination type 'int&' is not trivially copyable" }
+  __builtin_bit_cast (int [1], 0);	// { dg-error "'__builtin_bit_cast' destination type \[^\n\r]* is an array type" }
+  __builtin_bit_cast (V, 0);		// { dg-error "invalid use of incomplete type 'struct V'" }
+  __builtin_bit_cast (int, v);
+  __builtin_bit_cast (int, *p);		// { dg-error "invalid use of incomplete type 'struct V'" }
+  __builtin_bit_cast (U, 0);		// { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
+  __builtin_bit_cast (int, u);		// { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
+}
+
+template <int N>
+void
+bar (int *q)
+{
+  __builtin_bit_cast (int, s);		// { dg-error "'__builtin_bit_cast' source type 'S' is not trivially copyable" }
+  __builtin_bit_cast (S, 0);		// { dg-error "'__builtin_bit_cast' destination type 'S' is not trivially copyable" }
+  __builtin_bit_cast (int &, q);	// { dg-error "'__builtin_bit_cast' destination type 'int&' is not trivially copyable" }
+  __builtin_bit_cast (int [1], 0);	// { dg-error "'__builtin_bit_cast' destination type \[^\n\r]* is an array type" }
+  __builtin_bit_cast (V, 0);		// { dg-error "invalid use of incomplete type 'struct V'" }
+  __builtin_bit_cast (int, *p);		// { dg-error "invalid use of incomplete type 'struct V'" }
+  __builtin_bit_cast (U, 0);		// { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
+  __builtin_bit_cast (int, u);		// { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
+}
+
+template <typename T1, typename T2, typename T3, typename T4>
+void
+baz (T3 s, T4 *p, T1 *q)
+{
+  __builtin_bit_cast (int, s);		// { dg-error "'__builtin_bit_cast' source type 'S' is not trivially copyable" }
+  __builtin_bit_cast (T3, 0);		// { dg-error "'__builtin_bit_cast' destination type 'S' is not trivially copyable" }
+  __builtin_bit_cast (T1 &, q);		// { dg-error "'__builtin_bit_cast' destination type 'int&' is not trivially copyable" }
+  __builtin_bit_cast (T2, 0);		// { dg-error "'__builtin_bit_cast' destination type \[^\n\r]* is an array type" }
+  __builtin_bit_cast (T4, 0);		// { dg-error "invalid use of incomplete type 'struct V'" }
+  __builtin_bit_cast (int, *p);		// { dg-error "invalid use of incomplete type 'struct V'" }
+  __builtin_bit_cast (U, (T1) 0);	// { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
+  __builtin_bit_cast (T1, u);		// { dg-error "'__builtin_bit_cast' source size '\[0-9]*' not equal to destination type size '\[0-9]*'" }
+}
+
+void
+qux (int *q)
+{
+  baz <int, int [1], S, V> (s, p, q);
+}
--- gcc/testsuite/g++.dg/cpp2a/bit-cast3.C.jj	2020-11-26 14:35:07.594192689 +0100
+++ gcc/testsuite/g++.dg/cpp2a/bit-cast3.C	2020-11-26 14:35:07.594192689 +0100
@@ -0,0 +1,229 @@
+// { dg-do compile { target c++11 } }
+
+template <typename To, typename From>
+constexpr To
+bit_cast (const From &from)
+{
+  return __builtin_bit_cast (To, from);
+}
+
+template <typename To, typename From>
+constexpr bool
+check (const From &from)
+{
+  return bit_cast <From> (bit_cast <To> (from)) == from;
+}
+
+struct A
+{
+  int a, b, c;
+  constexpr bool operator == (const A &x) const
+  {
+    return x.a == a && x.b == b && x.c == c;
+  }
+};
+
+struct B
+{
+  unsigned a[3];
+  constexpr bool operator == (const B &x) const
+  {
+    return x.a[0] == a[0] && x.a[1] == a[1] && x.a[2] == a[2];
+  }
+};
+
+struct C
+{
+  char a[2][3][2];
+  constexpr bool operator == (const C &x) const
+  {
+    return x.a[0][0][0] == a[0][0][0]
+	   && x.a[0][0][1] == a[0][0][1]
+	   && x.a[0][1][0] == a[0][1][0]
+	   && x.a[0][1][1] == a[0][1][1]
+	   && x.a[0][2][0] == a[0][2][0]
+	   && x.a[0][2][1] == a[0][2][1]
+	   && x.a[1][0][0] == a[1][0][0]
+	   && x.a[1][0][1] == a[1][0][1]
+	   && x.a[1][1][0] == a[1][1][0]
+	   && x.a[1][1][1] == a[1][1][1]
+	   && x.a[1][2][0] == a[1][2][0]
+	   && x.a[1][2][1] == a[1][2][1];
+  }
+};
+
+struct D
+{
+  int a, b;
+  constexpr bool operator == (const D &x) const
+  {
+    return x.a == a && x.b == b;
+  }
+};
+
+struct E {};
+struct F { char c, d, e, f; };
+struct G : public D, E, F
+{
+  int g;
+  constexpr bool operator == (const G &x) const
+  {
+    return x.a == a && x.b == b && x.c == c && x.d == d
+	   && x.e == e && x.f == f && x.g == g;
+  }
+};
+
+struct H
+{
+  int a, b[2], c;
+  constexpr bool operator == (const H &x) const
+  {
+    return x.a == a && x.b[0] == b[0] && x.b[1] == b[1] && x.c == c;
+  }
+};
+
+#if __SIZEOF_INT__ == 4
+struct I
+{
+  int a;
+  int b : 3;
+  int c : 24;
+  int d : 5;
+  int e;
+  constexpr bool operator == (const I &x) const
+  {
+    return x.a == a && x.b == b && x.c == c && x.d == d && x.e == e;
+  }
+};
+#endif
+
+#if __SIZEOF_INT__ == 4 && __SIZEOF_LONG_LONG__ == 8
+struct J
+{
+  long long int a, b : 11, c : 3, d : 37, e : 1, f : 10, g : 2, h;
+  constexpr bool operator == (const J &x) const
+  {
+    return x.a == a && x.b == b && x.c == c && x.d == d && x.e == e
+	   && x.f == f && x.g == g && x.h == h;
+  }
+};
+
+struct K
+{
+  long long int a, b, c;
+  constexpr bool operator == (const K &x) const
+  {
+    return x.a == a && x.b == b && x.c == c;
+  }
+};
+
+struct M
+{
+  signed a : 6, b : 7, c : 6, d : 5;
+  unsigned char e;
+  unsigned int f;
+  long long int g;
+  constexpr bool operator == (const M &x) const
+  {
+    return x.a == a && x.b == b && x.c == c && x.d == d && x.e == e
+	   && x.f == f && x.g == g;
+  }
+};
+
+struct N
+{
+  unsigned long long int a, b;
+  constexpr bool operator == (const N &x) const
+  {
+    return x.a == a && x.b == b;
+  }
+};
+#endif
+
+static_assert (check <unsigned int> (0), "");
+static_assert (check <long long int> (0xdeadbeeffeedbac1ULL), "");
+static_assert (check <signed char> ((unsigned char) 42), "");
+static_assert (check <char> ((unsigned char) 42), "");
+static_assert (check <unsigned char> ((unsigned char) 42), "");
+static_assert (check <signed char> ((signed char) 42), "");
+static_assert (check <char> ((signed char) 42), "");
+static_assert (check <unsigned char> ((signed char) 42), "");
+static_assert (check <signed char> ((char) 42), "");
+static_assert (check <char> ((char) 42), "");
+static_assert (check <unsigned char> ((char) 42), "");
+#if __SIZEOF_INT__ == __SIZEOF_FLOAT__
+static_assert (check <int> (2.5f), "");
+static_assert (check <unsigned int> (136.5f), "");
+#endif
+#if __SIZEOF_LONG_LONG__ == __SIZEOF_DOUBLE__
+static_assert (check <long long> (2.5), "");
+static_assert (check <long long unsigned> (123456.75), "");
+#endif
+
+static_assert (check <B> (A{ 1, 2, 3 }), "");
+static_assert (check <A> (B{ 4, 5, 6 }), "");
+
+#if __SIZEOF_INT__ == 4
+static_assert (check <C> (A{ 7, 8, 9 }), "");
+static_assert (check <C> (B{ 10, 11, 12 }), "");
+static_assert (check <A> (C{ { { { 13, 14 }, { 15, 16 }, { 17, 18 } },
+			       { { 19, 20 }, { 21, 22 }, { 23, 24 } } } }), "");
+constexpr unsigned char c[] = { 1, 2, 3, 4 };
+#if __BYTE_ORDER__ == __ORDER_LITTLE_ENDIAN__
+static_assert (bit_cast <unsigned int> (c) == 0x04030201U, "");
+#elif __BYTE_ORDER__ == __ORDER_BIG_ENDIAN__
+static_assert (bit_cast <unsigned int> (c) == 0x01020304U, "");
+#endif
+
+#if __cplusplus >= 201703L
+static_assert (check <G> (H { 0x12345678, { 0x23456789, 0x5a876543 }, 0x3ba78654 }), "");
+#endif
+constexpr int d[] = { 0x12345678, 0x23456789, 0x5a876543, 0x3ba78654 };
+static_assert (bit_cast <G> (d) == bit_cast <G> (H { 0x12345678, { 0x23456789, 0x5a876543 }, 0x3ba78654 }), "");
+
+#if __BYTE_ORDER__ == __ORDER_LITTLE_ENDIAN__
+static_assert (bit_cast <I> (A { 0x7efa3412, 0x5a876543, 0x1eeffeed })
+	       == I { 0x7efa3412, 3, 0x50eca8, 0xb, 0x1eeffeed }, "");
+static_assert (bit_cast <A> (I { 0x7efa3412, 3, 0x50eca8, 0xb, 0x1eeffeed })
+	       == A { 0x7efa3412, 0x5a876543, 0x1eeffeed }, "");
+#elif __BYTE_ORDER__ == __ORDER_BIG_ENDIAN__
+static_assert (bit_cast <I> (A { 0x7efa3412, 0x5a876543, 0x1eeffeed })
+	       == I { 0x7efa3412, 2, -0x2bc4d6, 0x3, 0x1eeffeed }, "");
+static_assert (bit_cast <A> (I { 0x7efa3412, 2, -0x2bc4d6, 0x3, 0x1eeffeed })
+	       == A { 0x7efa3412, 0x5a876543, 0x1eeffeed }, "");
+#endif
+#endif
+
+#if 2 * __SIZEOF_INT__ == __SIZEOF_LONG_LONG__ && __SIZEOF_INT__ >= 4
+constexpr unsigned long long a = 0xdeadbeeffee1deadULL;
+constexpr unsigned b[] = { 0xfeedbacU, 0xbeeffeedU };
+#if __BYTE_ORDER__ == __ORDER_LITTLE_ENDIAN__
+static_assert (bit_cast <D> (a) == D { int (0xfee1deadU), int (0xdeadbeefU) }, "");
+static_assert (bit_cast <unsigned long long> (b) == 0xbeeffeed0feedbacULL, "");
+#elif __BYTE_ORDER__ == __ORDER_BIG_ENDIAN__
+static_assert (bit_cast <D> (a) == D { int (0xdeadbeefU), int (0xfee1deadU) }, "");
+static_assert (bit_cast <unsigned long long> (b) == 0x0feedbacbeeffeedULL, "");
+#endif
+#endif
+
+#if __SIZEOF_INT__ == 4 && __SIZEOF_LONG_LONG__ == 8
+#if __BYTE_ORDER__ == __ORDER_LITTLE_ENDIAN__
+static_assert (bit_cast <J> (K { 0x0feedbacdeadbeefLL, 7862463375103529997LL, 0x0feedbacdeadbeefLL })
+	       == J { 0x0feedbacdeadbeefLL, -1011, 2, -0xbacdeadbeLL, -1, -303, 1, 0x0feedbacdeadbeefLL }, "");
+static_assert (bit_cast <K> (J { 0x0feedbacdeadbeefLL, -1011, 2, -0xbacdeadbeLL, -1, -303, 1, 0x0feedbacdeadbeefLL })
+	       == K { 0x0feedbacdeadbeefLL, 7862463375103529997LL, 0x0feedbacdeadbeefLL }, "");
+static_assert (bit_cast <M> (N { 0xfeedbacdeadbeef8ULL, 0x123456789abcde42ULL })
+	       == M { -8, 59, 31, -5, 234, 0xfeedbacdU, 0x123456789abcde42ULL }, "");
+static_assert (bit_cast <N> (M { -8, 59, 31, -5, 234, 0xfeedbacdU, 0x123456789abcde42ULL })
+	       == N { 0xfeedbacdeadbeef8ULL, 0x123456789abcde42ULL }, "");
+#elif __BYTE_ORDER__ == __ORDER_BIG_ENDIAN__
+static_assert (bit_cast <J> (K { 0x0feedbacdeadbeefLL, -9103311533965288635LL, 0x0feedbacdeadbeefLL })
+	       == J { 0x0feedbacdeadbeefLL, -1011, 2, -0xbacdeadbeLL, -1, -303, 1, 0x0feedbacdeadbeefLL }, "");
+static_assert (bit_cast <K> (J { 0x0feedbacdeadbeefLL, -1011, 2, -0xbacdeadbeLL, -1, -303, 1, 0x0feedbacdeadbeefLL })
+	       == K { 0x0feedbacdeadbeefLL, -9103311533965288635LL, 0x0feedbacdeadbeefLL }, "");
+static_assert (bit_cast <M> (N { 0xfeedbacdeadbeef8ULL, 0x123456789abcde42ULL })
+	       == M { -1, -35, -19, -6, 205, 0xeadbeef8U, 0x123456789abcde42ULL }, "");
+static_assert (bit_cast <N> (M { -1, -35, -19, -6, 205, 0xeadbeef8U, 0x123456789abcde42ULL })
+	       == N { 0xfeedbacdeadbeef8ULL, 0x123456789abcde42ULL }, "");
+#endif
+#endif
--- gcc/testsuite/g++.dg/cpp2a/bit-cast4.C.jj	2020-11-26 14:35:07.594192689 +0100
+++ gcc/testsuite/g++.dg/cpp2a/bit-cast4.C	2020-11-26 14:35:07.594192689 +0100
@@ -0,0 +1,44 @@
+// { dg-do compile { target c++11 } }
+
+template <typename To, typename From>
+constexpr To
+bit_cast (const From &from)
+{
+  return __builtin_bit_cast (To, from);
+}
+// { dg-error "'__builtin_bit_cast' is not a constant expression because 'U' is a union type" "U" { target *-*-* } 7 }
+// { dg-error "'__builtin_bit_cast' is not a constant expression because 'const U' is a union type" "const U" { target *-*-* } 7 }
+// { dg-error "'__builtin_bit_cast' is not a constant expression because 'B' contains a union type" "B" { target *-*-* } 7 }
+// { dg-error "'__builtin_bit_cast' is not a constant expression because 'char\\\*' is a pointer type" "char ptr" { target *-*-* } 7 }
+// { dg-error "'__builtin_bit_cast' is not a constant expression because 'const int\\\*' is a pointer type" "const int ptr" { target *-*-* } 7 }
+// { dg-error "'__builtin_bit_cast' is not a constant expression because 'C' contains a pointer type" "C" { target *-*-* } 7 }
+// { dg-error "'__builtin_bit_cast' is not a constant expression because 'const C' contains a pointer type" "const C" { target *-*-* } 7 }
+// { dg-error "'__builtin_bit_cast' is not a constant expression because 'int D::\\\*' is a pointer to member type" "ptrmem 1" { target *-*-* } 7 }
+// { dg-error "'__builtin_bit_cast' is not a constant expression because 'int \\\(D::\\\*\\\)\\\(\\\) const' is a pointer to member type" "ptrmem 2" { target *-*-* } 7 }
+// { dg-error "'__builtin_bit_cast' is not a constant expression because 'int \\\(D::\\\*\\\)\\\(\\\)' is a pointer to member type" "ptrmem 3" { target *-*-* } 7 }
+
+union U { int u; };
+struct A { int a; U b; };
+struct B : public A { int c; };
+struct C { const int *p; };
+constexpr int a[] = { 1, 2, 3 };
+constexpr const int *b = &a[0];
+constexpr C c = { b };
+struct D { int d; constexpr int foo () const { return 1; } };
+constexpr int D::*d = &D::d;
+constexpr int (D::*e) () const = &D::foo;
+struct E { __INTPTR_TYPE__ e, f; };
+constexpr E f = { 1, 2 };
+constexpr U g { 0 };
+
+constexpr auto z = bit_cast <U> (0);
+constexpr auto y = bit_cast <int> (g);
+constexpr auto x = bit_cast <B> (a);
+constexpr auto w = bit_cast <char *> ((__INTPTR_TYPE__) 0);
+constexpr auto v = bit_cast <__UINTPTR_TYPE__> (b);
+constexpr auto u = bit_cast <C> ((__INTPTR_TYPE__) 0);
+constexpr auto t = bit_cast <__INTPTR_TYPE__> (c);
+constexpr auto s = bit_cast <__INTPTR_TYPE__> (d);
+constexpr auto r = bit_cast <E> (e);
+constexpr auto q = bit_cast <int D::*> ((__INTPTR_TYPE__) 0);
+constexpr auto p = bit_cast <int (D::*) ()> (f);
--- gcc/testsuite/g++.dg/cpp2a/bit-cast5.C.jj	2020-11-26 14:35:07.594192689 +0100
+++ gcc/testsuite/g++.dg/cpp2a/bit-cast5.C	2020-11-26 14:35:07.594192689 +0100
@@ -0,0 +1,69 @@
+// { dg-do compile { target { c++20 && { ilp32 || lp64 } } } }
+
+struct A { signed char a, b, c, d, e, f; };
+struct B {};
+struct C { B a, b; short c; B d; };
+struct D { int a : 4, b : 24, c : 4; };
+struct E { B a, b; short c; };
+struct F { B a; signed char b, c; B d; };
+
+constexpr bool
+f1 ()
+{
+  A a;
+  a.c = 23; a.d = 42;
+  C b = __builtin_bit_cast (C, a); // OK
+  return false;
+}
+
+constexpr bool
+f2 ()
+{
+  A a;
+  a.a = 1; a.b = 2; a.c = 3; a.e = 4; a.f = 5;
+  C b = __builtin_bit_cast (C, a);	// { dg-error "'__builtin_bit_cast' accessing uninitialized byte at offset 3" }
+  return false;
+}
+
+constexpr bool
+f3 ()
+{
+  D a;
+  a.b = 1;
+  F b = __builtin_bit_cast (F, a); // OK
+  return false;
+}
+
+constexpr bool
+f4 ()
+{
+  D a;
+  a.b = 1; a.c = 2;
+  E b = __builtin_bit_cast (E, a); // OK
+  return false;
+}
+
+constexpr bool
+f5 ()
+{
+  D a;
+  a.b = 1;
+  E b = __builtin_bit_cast (E, a);	// { dg-error "'__builtin_bit_cast' accessing uninitialized byte at offset 3" }
+  return false;
+}
+
+constexpr bool
+f6 ()
+{
+  D a;
+  a.c = 1;
+  E b = __builtin_bit_cast (E, a);	// { dg-error "'__builtin_bit_cast' accessing uninitialized byte at offset 2" }
+  return false;
+}
+
+constexpr bool a = f1 ();
+constexpr bool b = f2 ();
+constexpr bool c = f3 ();
+constexpr bool d = f4 ();
+constexpr bool e = f5 ();
+constexpr bool f = f6 ();


	Jakub



* Re: [PATCH] c++: v3: Add __builtin_bit_cast to implement std::bit_cast [PR93121]
  2020-11-26 15:09           ` [PATCH] c++: v3: " Jakub Jelinek
@ 2020-12-03 14:24             ` Jason Merrill
  0 siblings, 0 replies; 29+ messages in thread
From: Jason Merrill @ 2020-12-03 14:24 UTC (permalink / raw)
  To: Jakub Jelinek; +Cc: Jonathan Wakely, gcc-patches

On 11/26/20 10:09 AM, Jakub Jelinek wrote:

Sorry, thought I had replied to this before.

> The following patch removes the mask from the new native_interpret_aggregate
> moved to fold-const.c altogether.  Instead, the code uses the
> __builtin_clear_padding infrastructure (new entrypoint in there).
> If native_interpret_aggregate succeeds (returns non-NULL), then the mask
> from the earlier native_encode_initializer call has 0 bits for bits that
> are well defined in the argument and 1 bits for indeterminate bits.  The
> new clear_type_padding_in_mask function then clears all bits
> which are padding in the given type (the destination type), so if any bits
> remain in the mask, it means we are using indeterminate bits and C++ FE
> can diagnose it.  The bit-cast* tests didn't change at all.

Great, thanks.

> +  if (can_native_interpret_type_p (TREE_TYPE (t)))
> +    if (tree r = native_interpret_expr (TREE_TYPE (t), ptr, len))
> +      {
> +	for (int i = 0; i < len; i++)
> +	  if (mask[i])
> +	    {
> +	      if (!ctx->quiet)
> +		error_at (loc, "%qs accessing uninitialized byte at offset %d",
> +			  "__builtin_bit_cast", i);
> +	      *non_constant_p = true;
> +	      r = t;
> +	      break;
> +	    }
> +	if (ptr != buf)
> +	  XDELETE (ptr);
> +	return r;
> +      }
> +
> +  if (TREE_CODE (TREE_TYPE (t)) == RECORD_TYPE)
> +    {
> +      tree r = native_interpret_aggregate (TREE_TYPE (t), ptr, 0, len);
> +      if (r != NULL_TREE)
> +	{
> +	  clear_type_padding_in_mask (TREE_TYPE (t), mask);
> +	  for (int i = 0; i < len; i++)
> +	    if (mask[i])
> +	      {
> +		if (!ctx->quiet)
> +		  error_at (loc, "%qs accessing uninitialized byte at offset "
> +				 "%d", "__builtin_bit_cast", i);
> +		*non_constant_p = true;
> +		r = t;
> +		break;
> +	      }
> +	  if (ptr != buf)
> +	    XDELETE (ptr);
> +	  return r;
> +	}
> +    }

There's a lot of code that can be shared between these two cases now. 
The C++ bits are OK with that change.

Jason



Thread overview: 29+ messages
2020-07-18 18:50 [PATCH] c++: Add __builtin_bit_cast to implement std::bit_cast [PR93121] Jakub Jelinek
2020-07-22 14:03 ` Paul Koning
2020-07-30 14:16 ` Jason Merrill
2020-07-30 14:57   ` Jakub Jelinek
2020-07-30 21:59     ` Jason Merrill
2020-07-31  8:19       ` Jakub Jelinek
2020-07-31  9:54         ` Jonathan Wakely
2020-07-31 10:06           ` Jakub Jelinek
2020-07-31 20:28             ` Jason Merrill
2020-08-27 10:06               ` Jakub Jelinek
2020-08-27 10:19                 ` Richard Biener
2020-09-02 21:52                   ` Jason Merrill
2020-08-27 10:46                 ` Jakub Jelinek
2020-08-27 11:06                   ` Jonathan Wakely
2020-08-27 11:17                     ` Jakub Jelinek
2020-08-27 11:22                       ` Jonathan Wakely
2020-08-27 10:59                 ` Jonathan Wakely
2020-08-27 20:43                 ` Iain Buclaw
2020-11-02 19:21   ` [PATCH] c++: v2: " Jakub Jelinek
2020-11-25  0:31     ` Jeff Law
2020-11-25  9:23       ` Jakub Jelinek
2020-11-25 10:31         ` Jonathan Wakely
2020-11-25 16:24           ` Jakub Jelinek
2020-11-25 16:28             ` Jonathan Wakely
2020-11-25 17:26     ` Jason Merrill
2020-11-25 18:50       ` Jakub Jelinek
2020-11-26  0:52         ` Jason Merrill
2020-11-26 15:09           ` [PATCH] c++: v3: " Jakub Jelinek
2020-12-03 14:24             ` Jason Merrill
