public inbox for gcc-patches@gcc.gnu.org
* [PATCH]middle-end: check memory accesses in the destination block [PR113588].
@ 2024-01-29 15:04 Tamar Christina
  2024-01-30  9:51 ` Richard Biener
  0 siblings, 1 reply; 8+ messages in thread
From: Tamar Christina @ 2024-01-29 15:04 UTC (permalink / raw)
  To: gcc-patches; +Cc: nd, rguenther, jlaw


Hi All,

When analyzing loads for an early break it was always the intention that, for
the exit that statements get moved to, we only check the loads that can be
reached from the condition.

However, the main loop checks all loads but skips the destination BB.  As such
we never actually check the loads reachable from the COND in the last BB unless
that BB was also the exit chosen by the vectorizer.

This leads us to incorrectly vectorize the loop in the PR and, in doing so,
access memory out of bounds.

Bootstrapped and regtested on aarch64-none-linux-gnu with no issues.

Ok for master?

Thanks,
Tamar

gcc/ChangeLog:

	PR tree-optimization/113588
	* tree-vect-data-refs.cc (vect_analyze_early_break_dependences_1): New.
	(vect_analyze_data_ref_dependence): Use it.
	(vect_analyze_early_break_dependences): Update comments.

gcc/testsuite/ChangeLog:

	PR tree-optimization/113588
	* gcc.dg/vect/vect-early-break_108-pr113588.c: New test.
	* gcc.dg/vect/vect-early-break_109-pr113588.c: New test.

--- inline copy of patch -- 
diff --git a/gcc/testsuite/gcc.dg/vect/vect-early-break_108-pr113588.c b/gcc/testsuite/gcc.dg/vect/vect-early-break_108-pr113588.c
new file mode 100644
index 0000000000000000000000000000000000000000..e488619c9aac41fafbcf479818392a6bb7c6924f
--- /dev/null
+++ b/gcc/testsuite/gcc.dg/vect/vect-early-break_108-pr113588.c
@@ -0,0 +1,15 @@
+/* { dg-do compile } */
+/* { dg-add-options vect_early_break } */
+/* { dg-require-effective-target vect_early_break } */
+/* { dg-require-effective-target vect_int } */
+
+/* { dg-final { scan-tree-dump-not "LOOP VECTORIZED" "vect" } } */
+
+int foo (const char *s, unsigned long n)
+{
+ unsigned long len = 0;
+ while (*s++ && n--)
+   ++len;
+ return len;
+}
+
diff --git a/gcc/testsuite/gcc.dg/vect/vect-early-break_109-pr113588.c b/gcc/testsuite/gcc.dg/vect/vect-early-break_109-pr113588.c
new file mode 100644
index 0000000000000000000000000000000000000000..488c19d3ede809631d1a7ede0e7f7bcdc7a1ae43
--- /dev/null
+++ b/gcc/testsuite/gcc.dg/vect/vect-early-break_109-pr113588.c
@@ -0,0 +1,44 @@
+/* { dg-add-options vect_early_break } */
+/* { dg-require-effective-target vect_early_break } */
+/* { dg-require-effective-target vect_int } */
+/* { dg-require-effective-target mmap } */
+
+/* { dg-final { scan-tree-dump-not "LOOP VECTORIZED" "vect" } } */
+
+#include <sys/mman.h>
+#include <unistd.h>
+
+#include "tree-vect.h"
+
+__attribute__((noipa))
+int foo (const char *s, unsigned long n)
+{
+ unsigned long len = 0;
+ while (*s++ && n--)
+   ++len;
+ return len;
+}
+
+int main()
+{
+
+  check_vect ();
+
+  long pgsz = sysconf (_SC_PAGESIZE);
+  void *p = mmap (NULL, pgsz * 3, PROT_READ|PROT_WRITE,
+     MAP_ANONYMOUS|MAP_PRIVATE, 0, 0);
+  if (p == MAP_FAILED)
+    return 0;
+  mprotect (p, pgsz, PROT_NONE);
+  mprotect (p+2*pgsz, pgsz, PROT_NONE);
+  char *p1 = p + pgsz;
+  p1[0] = 1;
+  p1[1] = 0;
+  foo (p1, 1000);
+  p1 = p + 2*pgsz - 2;
+  p1[0] = 1;
+  p1[1] = 0;
+  foo (p1, 1000);
+  return 0;
+}
+
diff --git a/gcc/tree-vect-data-refs.cc b/gcc/tree-vect-data-refs.cc
index f592aeb8028afd4fd70e2175104efab2a2c0d82e..52cef242a7ce5d0e525bff639fa1dc2f0a6f30b9 100644
--- a/gcc/tree-vect-data-refs.cc
+++ b/gcc/tree-vect-data-refs.cc
@@ -619,10 +619,69 @@ vect_analyze_data_ref_dependence (struct data_dependence_relation *ddr,
   return opt_result::success ();
 }
 
-/* Funcion vect_analyze_early_break_dependences.
+/* Function vect_analyze_early_break_dependences_1
 
-   Examime all the data references in the loop and make sure that if we have
-   mulitple exits that we are able to safely move stores such that they become
+   Helper function of vect_analyze_early_break_dependences which performs safety
+   analysis for load operations in an early break.  */
+
+static opt_result
+vect_analyze_early_break_dependences_1 (data_reference *dr_ref, gimple *stmt)
+{
+  /* We currently only support statically allocated objects due to
+     not having first-faulting loads support or peeling for
+     alignment support.  Compute the size of the referenced object
+     (it could be dynamically allocated).  */
+  tree obj = DR_BASE_ADDRESS (dr_ref);
+  if (!obj || TREE_CODE (obj) != ADDR_EXPR)
+    {
+      if (dump_enabled_p ())
+	dump_printf_loc (MSG_MISSED_OPTIMIZATION, vect_location,
+			 "early breaks only supported on statically"
+			 " allocated objects.\n");
+      return opt_result::failure_at (stmt,
+			 "can't safely apply code motion to "
+			 "dependencies of %G to vectorize "
+			 "the early exit.\n", stmt);
+    }
+
+  tree refop = TREE_OPERAND (obj, 0);
+  tree refbase = get_base_address (refop);
+  if (!refbase || !DECL_P (refbase) || !DECL_SIZE (refbase)
+      || TREE_CODE (DECL_SIZE (refbase)) != INTEGER_CST)
+    {
+      if (dump_enabled_p ())
+	dump_printf_loc (MSG_MISSED_OPTIMIZATION, vect_location,
+			 "early breaks only supported on"
+			 " statically allocated objects.\n");
+      return opt_result::failure_at (stmt,
+			 "can't safely apply code motion to "
+			 "dependencies of %G to vectorize "
+			 "the early exit.\n", stmt);
+    }
+
+  /* Check if vector accesses to the object will be within bounds.
+     must be a constant or assume loop will be versioned or niters
+     bounded by VF so accesses are within range.  */
+  if (!ref_within_array_bound (stmt, DR_REF (dr_ref)))
+    {
+      if (dump_enabled_p ())
+	dump_printf_loc (MSG_MISSED_OPTIMIZATION, vect_location,
+			 "early breaks not supported: vectorization "
+			 "would %s beyond size of obj.",
+			 DR_IS_READ (dr_ref) ? "read" : "write");
+      return opt_result::failure_at (stmt,
+			 "can't safely apply code motion to "
+			 "dependencies of %G to vectorize "
+			 "the early exit.\n", stmt);
+    }
+
+  return opt_result::success ();
+}
+
+/* Function vect_analyze_early_break_dependences.
+
+   Examine all the data references in the loop and make sure that if we have
+   multiple exits that we are able to safely move stores such that they become
    safe for vectorization.  The function also calculates the place where to move
    the instructions to and computes what the new vUSE chain should be.
 
@@ -639,7 +698,7 @@ vect_analyze_data_ref_dependence (struct data_dependence_relation *ddr,
      - Multiple loads are allowed as long as they don't alias.
 
    NOTE:
-     This implemementation is very conservative. Any overlappig loads/stores
+     This implementation is very conservative. Any overlapping loads/stores
      that take place before the early break statement gets rejected aside from
      WAR dependencies.
 
@@ -668,7 +727,6 @@ vect_analyze_early_break_dependences (loop_vec_info loop_vinfo)
   auto_vec<data_reference *> bases;
   basic_block dest_bb = NULL;
 
-  hash_set <gimple *> visited;
   class loop *loop = LOOP_VINFO_LOOP (loop_vinfo);
   class loop *loop_nest = loop_outer (loop);
 
@@ -683,6 +741,7 @@ vect_analyze_early_break_dependences (loop_vec_info loop_vinfo)
   dest_bb = single_pred (loop->latch);
   basic_block bb = dest_bb;
 
+  /* First analyze all blocks leading to dest_bb excluding dest_bb itself.  */
   do
     {
       /* If the destination block is also the header then we have nothing to do.  */
@@ -707,53 +766,11 @@ vect_analyze_early_break_dependences (loop_vec_info loop_vinfo)
 	  if (!dr_ref)
 	    continue;
 
-	  /* We currently only support statically allocated objects due to
-	     not having first-faulting loads support or peeling for
-	     alignment support.  Compute the size of the referenced object
-	     (it could be dynamically allocated).  */
-	  tree obj = DR_BASE_ADDRESS (dr_ref);
-	  if (!obj || TREE_CODE (obj) != ADDR_EXPR)
-	    {
-	      if (dump_enabled_p ())
-		dump_printf_loc (MSG_MISSED_OPTIMIZATION, vect_location,
-				 "early breaks only supported on statically"
-				 " allocated objects.\n");
-	      return opt_result::failure_at (stmt,
-				 "can't safely apply code motion to "
-				 "dependencies of %G to vectorize "
-				 "the early exit.\n", stmt);
-	    }
-
-	  tree refop = TREE_OPERAND (obj, 0);
-	  tree refbase = get_base_address (refop);
-	  if (!refbase || !DECL_P (refbase) || !DECL_SIZE (refbase)
-	      || TREE_CODE (DECL_SIZE (refbase)) != INTEGER_CST)
-	    {
-	      if (dump_enabled_p ())
-		dump_printf_loc (MSG_MISSED_OPTIMIZATION, vect_location,
-				 "early breaks only supported on"
-				 " statically allocated objects.\n");
-	      return opt_result::failure_at (stmt,
-				 "can't safely apply code motion to "
-				 "dependencies of %G to vectorize "
-				 "the early exit.\n", stmt);
-	    }
-
-	  /* Check if vector accesses to the object will be within bounds.
-	     must be a constant or assume loop will be versioned or niters
-	     bounded by VF so accesses are within range.  */
-	  if (!ref_within_array_bound (stmt, DR_REF (dr_ref)))
-	    {
-	      if (dump_enabled_p ())
-		dump_printf_loc (MSG_MISSED_OPTIMIZATION, vect_location,
-				 "early breaks not supported: vectorization "
-				 "would %s beyond size of obj.",
-				 DR_IS_READ (dr_ref) ? "read" : "write");
-	      return opt_result::failure_at (stmt,
-				 "can't safely apply code motion to "
-				 "dependencies of %G to vectorize "
-				 "the early exit.\n", stmt);
-	    }
+	  /* Check if the operation is one we can safely do.  */
+	  opt_result res
+	    = vect_analyze_early_break_dependences_1 (dr_ref, stmt);
+	  if (!res)
+	    return res;
 
 	  if (DR_IS_READ (dr_ref))
 	    bases.safe_push (dr_ref);
@@ -817,6 +834,51 @@ vect_analyze_early_break_dependences (loop_vec_info loop_vinfo)
     }
   while (bb != loop->header);
 
+  /* For the destination BB we need to only analyze loads reachable from the early
+     break statement itself.  */
+  auto_vec <tree> workset;
+  hash_set <tree> visited;
+  gimple *last_stmt = gsi_stmt (gsi_last_bb (dest_bb));
+  gcond *last_cond = dyn_cast <gcond *> (last_stmt);
+  /* If the cast fails we have a different control flow statement in the latch.  Most
+     commonly this is a switch.  */
+  if (!last_cond)
+   return opt_result::failure_at (last_stmt,
+			     "can't safely apply code motion to dependencies"
+			     " to vectorize the early exit, unknown control flow"
+			     " in stmt %G", last_stmt);
+  workset.safe_push (gimple_cond_lhs (last_cond));
+  workset.safe_push (gimple_cond_rhs (last_cond));
+
+  imm_use_iterator imm_iter;
+  use_operand_p use_p;
+  tree lhs;
+  do
+    {
+      tree op = workset.pop ();
+      if (visited.add (op))
+	continue;
+      stmt_vec_info stmt_vinfo = loop_vinfo->lookup_def (op);
+
+      /* Not defined in loop, don't care.  */
+      if (!stmt_vinfo)
+	continue;
+      gimple *stmt = STMT_VINFO_STMT (stmt_vinfo);
+      auto dr_ref = STMT_VINFO_DATA_REF (stmt_vinfo);
+      if (dr_ref)
+	{
+	  opt_result res
+	    = vect_analyze_early_break_dependences_1 (dr_ref, stmt);
+	  if (!res)
+	    return res;
+	}
+      else
+	FOR_EACH_IMM_USE_FAST (use_p, imm_iter, op)
+	  if ((lhs = gimple_get_lhs (USE_STMT (use_p))))
+	    workset.safe_push (lhs);
+    }
+  while (!workset.is_empty ());
+
   /* We don't allow outer -> inner loop transitions which should have been
      trapped already during loop form analysis.  */
   gcc_assert (dest_bb->loop_father == loop);




-- 
