public inbox for gcc-patches@gcc.gnu.org
* [PATCH 1/4][RFC] middle-end/90348 - add explicit birth
@ 2022-02-04 13:49 Richard Biener
  2022-02-04 14:03 ` Jakub Jelinek
                   ` (2 more replies)
  0 siblings, 3 replies; 16+ messages in thread
From: Richard Biener @ 2022-02-04 13:49 UTC (permalink / raw)
  To: gcc-patches; +Cc: Jakub Jelinek, matz

This adds explicit variable birth CLOBBERs in an attempt to fix
PR90348 and its duplicates.  The birth / death CLOBBER pairs are
used to compute liveness and conflicts for stack variable
coalescing.  Previously, lacking an explicit birth, the first
mention of a variable was used instead, which misrepresents
variable life ranges when optimization moves that first mention
upward, past the original birth point at the start of the
variable's BIND_EXPR.

Birth CLOBBERs are represented as traditional CLOBBERs with all
the accompanying effects on optimization.  While they do not serve
as a barrier for address mentions, they act as a barrier for the
actual accesses, which is enough for determining conflicts in
the context of stack slot sharing.  Birth CLOBBERs are
distinguished from death CLOBBERs by a new CLOBBER_BIRTH
clobber_kind, complementing the CLOBBER_EOL kind introduced
earlier.

The patch changes how we handle stack variables that are not marked
with CLOBBERs.  For those, the first mention started the live range,
which then lasted until function exit, so effectively every unmarked
variable conflicted with every other variable.  This property is best
represented by an extra flag rather than a full conflict bitmap,
which is what the patch introduces with conflicts_represented: it is
cleared at the start and set to true once we visit the first birth
CLOBBER.  From then on we assume all accesses to the variable are
properly fenced by birth/death CLOBBERs.

Variables without explicit births will not take part in stack slot
sharing after this change.

Currently birth CLOBBERs are added when gimplification adds the
corresponding end-of-life CLOBBERs, during bind and target expression
gimplification.  Generally, inserting births at DECL_EXPRs is more
precise, so we also do it there (noting that not all such variables
are mentioned in BINDs).  Avoiding redundant births is on the TODO
list and might also remove the need for the warn_switch_unreachable_r
hunk.

This is the meat of the PR90348 fix; the following 3 patches perform
testcase adjustments and follow-up fixes to avoid regressions.

Bootstrapped on x86_64-unknown-linux-gnu - re-testing in progress.

Any comments?  I have mixed feelings about proposing this for GCC 12
but would like to hear from others as well.  I didn't try to evaluate
the quality of stack slot sharing before/after this change beyond
fixing the testsuite fallout (we have a few testcases checking for
specific instances).

Thanks,
Richard.

2022-02-01  Richard Biener  <rguenther@suse.de>

	PR middle-end/90348
	PR middle-end/103006
	* tree-core.h (clobber_kind): Add CLOBBER_BIRTH.
	* gimplify.cc (gimplify_bind_expr): Also add birth CLOBBERs.
	(gimplify_target_expr): Likewise.
	(gimplify_decl_expr): Likewise.
	(warn_switch_unreachable_r): Do not treat birth CLOBBERs as
	real stmt - they get added at BIND starts but that's before
	the case labels.
	* tree-pretty-print.cc (dump_generic_node): Mark birth CLOBBERs.
	* cfgexpand.cc (stack_var::conflicts_represented): New.
	(add_stack_var): Initialize conflicts_represented.
	(add_stack_var_conflict): Assert the conflicts for the
	vars are represented.
	(stack_var_conflict_p): Honor conflicts_represented flag.
	(visit_op): Remove.
	(visit_conflict): Likewise.
	(add_scope_conflicts_1): Simplify by only considering birth
	and death CLOBBERs.
	(add_scope_conflicts): Adjust comment.
	(add_stack_protection_conflicts): Only add conflicts for
	variables that have them represented.

	* gcc.dg/torture/pr103006-1.c: New testcase.
	* gcc.dg/torture/pr103006-2.c: Likewise.
	* gcc.dg/torture/pr90348.c: Likewise.
---
 gcc/cfgexpand.cc                          | 157 +++++++++-------------
 gcc/gimplify.cc                           |  83 +++++++++++-
 gcc/testsuite/gcc.dg/torture/pr103006-1.c |  27 ++++
 gcc/testsuite/gcc.dg/torture/pr103006-2.c |  29 ++++
 gcc/testsuite/gcc.dg/torture/pr90348.c    |  40 ++++++
 gcc/tree-core.h                           |   2 +
 gcc/tree-pretty-print.cc                  |   4 +-
 7 files changed, 244 insertions(+), 98 deletions(-)
 create mode 100644 gcc/testsuite/gcc.dg/torture/pr103006-1.c
 create mode 100644 gcc/testsuite/gcc.dg/torture/pr103006-2.c
 create mode 100644 gcc/testsuite/gcc.dg/torture/pr90348.c

diff --git a/gcc/cfgexpand.cc b/gcc/cfgexpand.cc
index d51af2e3084..02467874996 100644
--- a/gcc/cfgexpand.cc
+++ b/gcc/cfgexpand.cc
@@ -319,6 +319,10 @@ public:
      size, the alignment for this partition.  */
   unsigned int alignb;
 
+  /* Whether this variable has conflicts represented in the CONFLICTS
+     bitmap.  If not, it conflicts with all other stack variables.  */
+  bool conflicts_represented;
+
   /* The partition representative.  */
   size_t representative;
 
@@ -479,7 +483,8 @@ add_stack_var (tree decl, bool really_expand)
   v->representative = stack_vars_num;
   v->next = EOC;
 
-  /* All variables initially conflict with no other.  */
+  /* All variables initially conflict with each other.  */
+  v->conflicts_represented = false;
   v->conflicts = NULL;
 
   /* Ensure that this decl doesn't get put onto the list twice.  */
@@ -497,6 +502,7 @@ add_stack_var_conflict (size_t x, size_t y)
   class stack_var *b = &stack_vars[y];
   if (x == y)
     return;
+  gcc_assert (a->conflicts_represented && b->conflicts_represented);
   if (!a->conflicts)
     a->conflicts = BITMAP_ALLOC (&stack_var_bitmap_obstack);
   if (!b->conflicts)
@@ -520,57 +526,16 @@ stack_var_conflict_p (size_t x, size_t y)
   if (TREE_CODE (a->decl) == SSA_NAME || TREE_CODE (b->decl) == SSA_NAME)
     return true;
 
+  /* If we cannot compute liveness in a meaningful way for either
+     variable, they conflict.  */
+  if (!a->conflicts_represented || !b->conflicts_represented)
+    return true;
+
   if (!a->conflicts || !b->conflicts)
     return false;
   return bitmap_bit_p (a->conflicts, y);
 }
 
-/* Callback for walk_stmt_ops.  If OP is a decl touched by add_stack_var
-   enter its partition number into bitmap DATA.  */
-
-static bool
-visit_op (gimple *, tree op, tree, void *data)
-{
-  bitmap active = (bitmap)data;
-  op = get_base_address (op);
-  if (op
-      && DECL_P (op)
-      && DECL_RTL_IF_SET (op) == pc_rtx)
-    {
-      size_t *v = decl_to_stack_part->get (op);
-      if (v)
-	bitmap_set_bit (active, *v);
-    }
-  return false;
-}
-
-/* Callback for walk_stmt_ops.  If OP is a decl touched by add_stack_var
-   record conflicts between it and all currently active other partitions
-   from bitmap DATA.  */
-
-static bool
-visit_conflict (gimple *, tree op, tree, void *data)
-{
-  bitmap active = (bitmap)data;
-  op = get_base_address (op);
-  if (op
-      && DECL_P (op)
-      && DECL_RTL_IF_SET (op) == pc_rtx)
-    {
-      size_t *v = decl_to_stack_part->get (op);
-      if (v && bitmap_set_bit (active, *v))
-	{
-	  size_t num = *v;
-	  bitmap_iterator bi;
-	  unsigned i;
-	  gcc_assert (num < stack_vars_num);
-	  EXECUTE_IF_SET_IN_BITMAP (active, 0, i, bi)
-	    add_stack_var_conflict (num, i);
-	}
-    }
-  return false;
-}
-
 /* Helper routine for add_scope_conflicts, calculating the active partitions
    at the end of BB, leaving the result in WORK.  We're called to generate
    conflicts when FOR_CONFLICT is true, otherwise we're just tracking
@@ -582,58 +547,63 @@ add_scope_conflicts_1 (basic_block bb, bitmap work, bool for_conflict)
   edge e;
   edge_iterator ei;
   gimple_stmt_iterator gsi;
-  walk_stmt_load_store_addr_fn visit;
+  bitmap_iterator bi;
+  unsigned i;
 
   bitmap_clear (work);
   FOR_EACH_EDGE (e, ei, bb->preds)
     bitmap_ior_into (work, (bitmap)e->src->aux);
 
-  visit = visit_op;
+  /* Since we do not get to see the birth clobber for variables that
+     are live only via backedges, add conflicts for everything live
+     at the start of the block.  */
+  if (for_conflict)
+    EXECUTE_IF_SET_IN_BITMAP (work, 0, i, bi)
+      {
+	class stack_var *a = &stack_vars[i];
+	if (!a->conflicts)
+	  a->conflicts = BITMAP_ALLOC (&stack_var_bitmap_obstack);
+	bitmap_ior_into (a->conflicts, work);
+      }
 
-  for (gsi = gsi_start_phis (bb); !gsi_end_p (gsi); gsi_next (&gsi))
-    {
-      gimple *stmt = gsi_stmt (gsi);
-      walk_stmt_load_store_addr_ops (stmt, work, NULL, NULL, visit);
-    }
+  /* Now do the local dataflow.  */
   for (gsi = gsi_after_labels (bb); !gsi_end_p (gsi); gsi_next (&gsi))
     {
       gimple *stmt = gsi_stmt (gsi);
+      if (!gimple_clobber_p (stmt))
+	continue;
 
-      if (gimple_clobber_p (stmt))
-	{
-	  tree lhs = gimple_assign_lhs (stmt);
-	  size_t *v;
-	  /* Nested function lowering might introduce LHSs
-	     that are COMPONENT_REFs.  */
-	  if (!VAR_P (lhs))
-	    continue;
-	  if (DECL_RTL_IF_SET (lhs) == pc_rtx
-	      && (v = decl_to_stack_part->get (lhs)))
-	    bitmap_clear_bit (work, *v);
-	}
-      else if (!is_gimple_debug (stmt))
+      tree lhs = gimple_assign_lhs (stmt);
+      size_t *v;
+      /* Nested function lowering might introduce LHSs
+	 that are COMPONENT_REFs.  */
+      if (!VAR_P (lhs))
+	continue;
+      if (DECL_RTL_IF_SET (lhs) == pc_rtx
+	  && (v = decl_to_stack_part->get (lhs)))
 	{
-	  if (for_conflict
-	      && visit == visit_op)
+	  size_t num = *v;
+	  if (CLOBBER_KIND (gimple_assign_rhs1 (stmt)) == CLOBBER_BIRTH)
 	    {
-	      /* If this is the first real instruction in this BB we need
-	         to add conflicts for everything live at this point now.
-		 Unlike classical liveness for named objects we can't
-		 rely on seeing a def/use of the names we're interested in.
-		 There might merely be indirect loads/stores.  We'd not add any
-		 conflicts for such partitions.  */
-	      bitmap_iterator bi;
-	      unsigned i;
-	      EXECUTE_IF_SET_IN_BITMAP (work, 0, i, bi)
+	      /* Birth.  */
+	      if (for_conflict)
 		{
-		  class stack_var *a = &stack_vars[i];
-		  if (!a->conflicts)
-		    a->conflicts = BITMAP_ALLOC (&stack_var_bitmap_obstack);
-		  bitmap_ior_into (a->conflicts, work);
+		  /* The births are where we add conflicts.  */
+		  if (bitmap_set_bit (work, num))
+		    EXECUTE_IF_SET_IN_BITMAP (work, 0, i, bi)
+		      add_stack_var_conflict (num, i);
+		}
+	      else
+		{
+		  /* We've seen a birth, assume life-ranges are
+		     properly represented by birth and death CLOBBERs.  */
+		  stack_vars[num].conflicts_represented = true;
+		  bitmap_set_bit (work, num);
 		}
-	      visit = visit_conflict;
 	    }
-	  walk_stmt_load_store_addr_ops (stmt, work, visit, visit, visit);
+	  else if (CLOBBER_KIND (gimple_assign_rhs1 (stmt)) == CLOBBER_EOL)
+	    /* Death.  */
+	    bitmap_clear_bit (work, num);
 	}
     }
 }
@@ -650,13 +620,13 @@ add_scope_conflicts (void)
   int *rpo;
   int n_bbs;
 
-  /* We approximate the live range of a stack variable by taking the first
-     mention of its name as starting point(s), and by the end-of-scope
-     death clobber added by gimplify as ending point(s) of the range.
-     This overapproximates in the case we for instance moved an address-taken
-     operation upward, without also moving a dereference to it upwards.
-     But it's conservatively correct as a variable never can hold values
-     before its name is mentioned at least once.
+  /* We approximate the live range of a stack variable by using the
+     birth and death CLOBBERs as added by for example gimplification
+     based on the scopes the variables were declared in.
+     For variables we never see mentioned in a birth CLOBBER we have
+     to assume conflicts with all other stack variables.  We could
+     improve this for the case of stack variables that do not have
+     their address taken as we can precisely track all mentions.
 
      We then do a mostly classical bitmap liveness algorithm.  */
 
@@ -2022,9 +1992,12 @@ add_stack_protection_conflicts (void)
 
   for (i = 0; i < n; ++i)
     {
+      if (!stack_vars[i].conflicts_represented)
+	continue;
       unsigned char ph_i = phase[i];
       for (j = i + 1; j < n; ++j)
-	if (ph_i != phase[j])
+	if (stack_vars[j].conflicts_represented
+	    && ph_i != phase[j])
 	  add_stack_var_conflict (i, j);
     }
 
diff --git a/gcc/gimplify.cc b/gcc/gimplify.cc
index 875b115d02d..37c80d609c4 100644
--- a/gcc/gimplify.cc
+++ b/gcc/gimplify.cc
@@ -1355,6 +1355,7 @@ gimplify_bind_expr (tree *expr_p, gimple_seq *pre_p)
   tree temp = voidify_wrapper_expr (bind_expr, NULL);
 
   /* Mark variables seen in this bind expr.  */
+  body = NULL;
   for (t = BIND_EXPR_VARS (bind_expr); t ; t = DECL_CHAIN (t))
     {
       if (VAR_P (t))
@@ -1412,6 +1413,43 @@ gimplify_bind_expr (tree *expr_p, gimple_seq *pre_p)
 
 	  if (DECL_HARD_REGISTER (t) && !is_global_var (t) && cfun)
 	    cfun->has_local_explicit_reg_vars = true;
+
+	  /* For variables we add clobbers for in the cleanup, add
+	     corresponding births at the start.  */
+	  /* ???  We'd like to not add the birth here if we knew that
+	     we'd see a DECL_EXPR for this variable later.  Incidentally
+	     OMP also seems to have a need to know that (see find_decl_expr).
+	     Adding the birth here for example causes us to have a spurious
+	     one for
+	       switch (..)
+		 {
+		 case 1:
+		   S s;
+		 }
+	     leading to a bogus diagnostic that we now mitigate in
+	     warn_switch_unreachable_r as the birth will appear before
+	     the case label in that case.
+	     It might be possible to hijack the GENERIC body walk
+	     we do in unshare_body to set a flag on all decls that
+	     we discover having a DECL_EXPR or alternatively wrap
+	     all builders of DECL_EXPR with a function doing that.  */
+	  if (!is_global_var (t)
+	      && DECL_CONTEXT (t) == current_function_decl
+	      && !DECL_HARD_REGISTER (t)
+	      && !TREE_THIS_VOLATILE (t)
+	      && !DECL_HAS_VALUE_EXPR_P (t)
+	      /* Only care for variables that have to be in memory.  Others
+		 will be rewritten into SSA names, hence moved to the
+		 top-level.  */
+	      && !is_gimple_reg (t)
+	      && flag_stack_reuse != SR_NONE)
+	    {
+	      tree clobber = build_clobber (TREE_TYPE (t), CLOBBER_BIRTH);
+	      gimple *clobber_stmt;
+	      clobber_stmt = gimple_build_assign (t, clobber);
+	      gimple_set_location (clobber_stmt, start_locus);
+	      gimplify_seq_add_stmt (&body, clobber_stmt);
+	    }
 	}
     }
 
@@ -1423,7 +1461,6 @@ gimplify_bind_expr (tree *expr_p, gimple_seq *pre_p)
   gimplify_ctxp->save_stack = false;
 
   /* Gimplify the body into the GIMPLE_BIND tuple's body.  */
-  body = NULL;
   gimplify_stmt (&BIND_EXPR_BODY (bind_expr), &body);
   gimple_bind_set_body (bind_stmt, body);
 
@@ -1901,6 +1938,23 @@ gimplify_decl_expr (tree *stmt_p, gimple_seq *seq_p)
 	  is_vla = true;
 	}
 
+      /* This mostly copies the condition we used for building a CLOBBER
+	 when visiting a BIND_EXPR.  We should rather remember that;
+	 for now we approximate it by checking DECL_SEEN_IN_BIND_EXPR_P.  */
+      if (!decl_had_value_expr_p
+	  && !is_global_var (decl)
+	  && !TREE_THIS_VOLATILE (decl)
+	  && !DECL_HARD_REGISTER (decl)
+	  && !is_gimple_reg (decl)
+	  && DECL_SEEN_IN_BIND_EXPR_P (decl)
+	  && flag_stack_reuse != SR_NONE)
+	{
+	  tree clobber = build_clobber (TREE_TYPE (decl), CLOBBER_BIRTH);
+	  gimple *clobber_stmt = gimple_build_assign (decl, clobber);
+	  gimple_set_location (clobber_stmt, EXPR_LOCATION (stmt));
+	  gimplify_seq_add_stmt (seq_p, clobber_stmt);
+	}
+
       if (asan_poisoned_variables
 	  && !is_vla
 	  && TREE_ADDRESSABLE (decl)
@@ -2075,7 +2129,10 @@ warn_switch_unreachable_r (gimple_stmt_iterator *gsi_p, bool *handled_ops_p,
 	}
       /* Fall through.  */
     default:
-      /* Save the first "real" statement (not a decl/lexical scope/...).  */
+      /* Save the first "real" statement (not a decl/lexical scope/...)
+	 but avoid clobbers we add for the birth of variables.  */
+      if (gimple_clobber_p (stmt, CLOBBER_BIRTH))
+	break;
       wi->info = stmt;
       return integer_zero_node;
     }
@@ -6917,7 +6974,7 @@ gimplify_target_expr (tree *expr_p, gimple_seq *pre_p, gimple_seq *post_p)
   enum gimplify_status ret;
 
   bool unpoison_empty_seq = false;
-  gimple_stmt_iterator unpoison_it;
+  gimple_stmt_iterator unpoison_it = gsi_none ();
 
   if (init)
     {
@@ -6935,8 +6992,9 @@ gimplify_target_expr (tree *expr_p, gimple_seq *pre_p, gimple_seq *post_p)
 	}
       else
 	{
-	  /* Save location where we need to place unpoisoning.  It's possible
-	     that a variable will be converted to needs_to_live_in_memory.  */
+	  /* Save location where we need to place unpoisoning or birth.
+	     It's possible that a variable will be converted to
+	     needs_to_live_in_memory.  */
 	  unpoison_it = gsi_last (*pre_p);
 	  unpoison_empty_seq = gsi_end_p (unpoison_it);
 
@@ -6984,6 +7042,21 @@ gimplify_target_expr (tree *expr_p, gimple_seq *pre_p, gimple_seq *post_p)
 	      tree clobber = build_clobber (TREE_TYPE (temp), CLOBBER_EOL);
 	      clobber = build2 (MODIFY_EXPR, TREE_TYPE (temp), temp, clobber);
 	      gimple_push_cleanup (temp, clobber, false, pre_p, true);
+
+	      /* Replace the placed NOP with a birth clobber.  */
+	      clobber = build_clobber (TREE_TYPE (temp), CLOBBER_BIRTH);
+	      clobber = build2 (MODIFY_EXPR, TREE_TYPE (temp), temp, clobber);
+	      gimple_seq stmts = NULL;
+	      gimplify_and_add (clobber, &stmts);
+	      if (unpoison_empty_seq)
+		{
+		  gimple_stmt_iterator si = gsi_start (*pre_p);
+		  gsi_insert_seq_before_without_update (&si, stmts,
+							GSI_SAME_STMT);
+		}
+	      else
+		gsi_insert_seq_after_without_update (&unpoison_it, stmts,
+						     GSI_LAST_NEW_STMT);
 	    }
 	  if (asan_poisoned_variables
 	      && DECL_ALIGN (temp) <= MAX_SUPPORTED_STACK_ALIGNMENT
diff --git a/gcc/testsuite/gcc.dg/torture/pr103006-1.c b/gcc/testsuite/gcc.dg/torture/pr103006-1.c
new file mode 100644
index 00000000000..0053896f19f
--- /dev/null
+++ b/gcc/testsuite/gcc.dg/torture/pr103006-1.c
@@ -0,0 +1,27 @@
+/* { dg-do run } */
+
+int printf(const char *, ...);
+int a, *b, c, e, f;
+void g() {
+  int *d[7];
+  d[6] = b = (int *)d;
+  printf("0\n");
+}
+int i() {
+  for (c = 0; c < 2; c++) {
+    long h[6][2];
+    for (e = 0; e < 6; e++)
+      for (f = 0; f < 2; f++)
+        h[e][f] = 1;
+    if (c) {
+      g();
+      return h[3][0];
+    }
+  }
+  return 0;
+}
+int main() {
+  if (i() != 1)
+    __builtin_abort ();
+  return 0;
+}
diff --git a/gcc/testsuite/gcc.dg/torture/pr103006-2.c b/gcc/testsuite/gcc.dg/torture/pr103006-2.c
new file mode 100644
index 00000000000..38acf85c4b9
--- /dev/null
+++ b/gcc/testsuite/gcc.dg/torture/pr103006-2.c
@@ -0,0 +1,29 @@
+/* { dg-do run } */
+
+__attribute__((noipa)) void ff(void){}
+int a, *b, c, e, f;
+__attribute__((always_inline))
+static inline void g() {
+  int *d[7];
+  d[6] = b = (int *)d;
+  ff();
+}
+__attribute__((noinline))
+int i() {
+  for (c = 0; c < 2; c++) {
+    long h[6][2];
+    for (e = 0; e < 6; e++)
+      for (f = 0; f < 2; f++)
+        h[e][f] = 1;
+    if (c) {
+      g();
+      return h[3][0];
+    }
+  }
+  return 0;
+}
+int main() {
+  if (i() != 1)
+    __builtin_abort ();
+  return 0;
+}
diff --git a/gcc/testsuite/gcc.dg/torture/pr90348.c b/gcc/testsuite/gcc.dg/torture/pr90348.c
new file mode 100644
index 00000000000..132aff7861f
--- /dev/null
+++ b/gcc/testsuite/gcc.dg/torture/pr90348.c
@@ -0,0 +1,40 @@
+/* { dg-do run } */
+
+void __attribute__ ((noipa))
+set_one (unsigned char* ptr)
+{
+  *ptr = 1;
+}
+
+int __attribute__ ((noipa))
+check_nonzero (unsigned char const* in, unsigned int len)
+{
+  for (unsigned int i = 0; i < len; ++i)
+    if (in[i] != 0)
+      return 1;
+  return 0;
+}
+
+void
+set_one_on_stack ()
+{
+  unsigned char buf[1];
+  set_one (buf);
+}
+
+int
+main ()
+{
+  for (int i = 0; i < 5; ++i)
+    {
+      unsigned char in[4];
+      for (int j = 0; j < i; ++j)
+	{
+	  in[j] = 0;
+	  set_one_on_stack ();
+	}
+      if (check_nonzero (in, i))
+	__builtin_abort ();
+    }
+}
+
diff --git a/gcc/tree-core.h b/gcc/tree-core.h
index bf2efa61330..786bc91571d 100644
--- a/gcc/tree-core.h
+++ b/gcc/tree-core.h
@@ -967,6 +967,8 @@ enum annot_expr_kind {
 enum clobber_kind {
   /* Unspecified, this clobber acts as a store of an undefined value.  */
   CLOBBER_UNDEF,
+  /* This clobber starts the lifetime of the storage.  */
+  CLOBBER_BIRTH,
   /* This clobber ends the lifetime of the storage.  */
   CLOBBER_EOL,
   CLOBBER_LAST
diff --git a/gcc/tree-pretty-print.cc b/gcc/tree-pretty-print.cc
index 666b7a70ea2..d601f3f980d 100644
--- a/gcc/tree-pretty-print.cc
+++ b/gcc/tree-pretty-print.cc
@@ -2502,7 +2502,9 @@ dump_generic_node (pretty_printer *pp, tree node, int spc, dump_flags_t flags,
 	if (TREE_CLOBBER_P (node))
 	  {
 	    pp_string (pp, "CLOBBER");
-	    if (CLOBBER_KIND (node) == CLOBBER_EOL)
+	    if (CLOBBER_KIND (node) == CLOBBER_BIRTH)
+	      pp_string (pp, "(birth)");
+	    else if (CLOBBER_KIND (node) == CLOBBER_EOL)
 	      pp_string (pp, "(eol)");
 	  }
 	else if (TREE_CODE (TREE_TYPE (node)) == RECORD_TYPE
-- 
2.34.1

