From: Alan Hayward <alan.hayward@arm.com>
To: gcc-patches@gcc.gnu.org
Cc: nd@arm.com, Alan Hayward <alan.hayward@arm.com>
Subject: [PATCH v2 6/7] Remaining support for clobber high
Date: Thu, 26 Jul 2018 09:14:00 -0000
Message-Id: <20180726091329.26875-7-alan.hayward@arm.com>
In-Reply-To: <20180726091329.26875-1-alan.hayward@arm.com>
References: <20180726091329.26875-1-alan.hayward@arm.com>

Add the remainder of the clobber high checks.  Happy to split this into
smaller patches if required (there didn't seem to be anything obvious to
split out).  An illustrative standalone sketch of the semantics these
checks rely on is appended after the patch.

2018-07-25  Alan Hayward  <alan.hayward@arm.com>

	* alias.c (record_set): Check for clobber high.
	* cfgexpand.c (expand_gimple_stmt): Likewise.
	* combine-stack-adj.c (single_set_for_csa): Likewise.
	* combine.c (find_single_use_1): Likewise.
	(set_nonzero_bits_and_sign_copies): Likewise.
	(can_combine_p): Likewise.
	(is_parallel_of_n_reg_sets): Likewise.
	(try_combine): Likewise.
	(record_dead_and_set_regs_1): Likewise.
	(reg_dead_at_p_1): Likewise.
	(reg_dead_at_p): Likewise.
	* dce.c (deletable_insn_p): Likewise.
	(mark_nonreg_stores_1): Likewise.
	(mark_nonreg_stores_2): Likewise.
	* df-scan.c (df_find_hard_reg_defs): Likewise.
	(df_uses_record): Likewise.
	(df_get_call_refs): Likewise.
	* dwarf2out.c (mem_loc_descriptor): Likewise.
	* haifa-sched.c (haifa_classify_rtx): Likewise.
	* ira-build.c (create_insn_allocnos): Likewise.
	* ira-costs.c (scan_one_insn): Likewise.
	* ira.c (equiv_init_movable_p): Likewise.
	(rtx_moveable_p): Likewise.
	(interesting_dest_for_shprep): Likewise.
	* jump.c (mark_jump_label_1): Likewise.
	* postreload-gcse.c (record_opr_changes): Likewise.
	* postreload.c (reload_cse_simplify): Likewise.
	(struct reg_use): Add source expr.
	(reload_combine): Check for clobber high.
	(reload_combine_note_use): Likewise.
	(reload_cse_move2add): Likewise.
	(move2add_note_store): Likewise.
	* print-rtl.c (print_pattern): Likewise.
	* recog.c (decode_asm_operands): Likewise.
	(store_data_bypass_p): Likewise.
	(if_test_bypass_p): Likewise.
	* regcprop.c (kill_clobbered_value): Likewise.
	(kill_set_value): Likewise.
	* reginfo.c (reg_scan_mark_refs): Likewise.
	* reload1.c (maybe_fix_stack_asms): Likewise.
	(eliminate_regs_1): Likewise.
	(elimination_effects): Likewise.
	(mark_not_eliminable): Likewise.
	(scan_paradoxical_subregs): Likewise.
	(forget_old_reloads_1): Likewise.
	* reorg.c (find_end_label): Likewise.
	(try_merge_delay_insns): Likewise.
	(redundant_insn): Likewise.
	(own_thread_p): Likewise.
	(fill_simple_delay_slots): Likewise.
	(fill_slots_from_thread): Likewise.
	(dbr_schedule): Likewise.
	* resource.c (update_live_status): Likewise.
	(mark_referenced_resources): Likewise.
	(mark_set_resources): Likewise.
	* rtl.c (copy_rtx): Likewise.
	* rtlanal.c (reg_referenced_p): Likewise.
	(single_set_2): Likewise.
	(noop_move_p): Likewise.
	(note_stores): Likewise.
	* sched-deps.c (sched_analyze_reg): Likewise.
	(sched_analyze_insn): Likewise.
---
 gcc/alias.c             | 11 +++++++++++
 gcc/cfgexpand.c         |  1 +
 gcc/combine-stack-adj.c |  1 +
 gcc/combine.c           | 38 +++++++++++++++++++++++++++++++++-----
 gcc/dce.c               | 11 +++++++++--
 gcc/df-scan.c           |  6 ++++++
 gcc/dwarf2out.c         |  1 +
 gcc/haifa-sched.c       |  3 +++
 gcc/ira-build.c         |  5 +++++
 gcc/ira-costs.c         |  7 +++++++
 gcc/ira.c               |  6 +++++-
 gcc/jump.c              |  1 +
 gcc/postreload-gcse.c   | 21 ++++++++++++---------
 gcc/postreload.c        | 25 ++++++++++++++++++++++++-
 gcc/print-rtl.c         |  1 +
 gcc/recog.c             |  9 ++++++---
 gcc/regcprop.c          | 10 ++++++++--
 gcc/reginfo.c           |  4 ++++
 gcc/reload1.c           | 16 +++++++++++++++-
 gcc/reorg.c             | 27 ++++++++++++++++++---------
 gcc/resource.c          | 24 +++++++++++++++++++++---
 gcc/rtl.c               |  4 ++++
 gcc/rtlanal.c           | 18 +++++++++++++++---
 gcc/sched-deps.c        | 15 ++++++++++++++-
 24 files changed, 225 insertions(+), 40 deletions(-)

diff --git a/gcc/alias.c b/gcc/alias.c
index 2091dfbf3d7..748da2b6951 100644
--- a/gcc/alias.c
+++ b/gcc/alias.c
@@ -1554,6 +1554,17 @@ record_set (rtx dest, const_rtx set, void *data ATTRIBUTE_UNUSED)
 	  new_reg_base_value[regno] = 0;
 	  return;
 	}
+      /* A CLOBBER_HIGH only wipes out the old value if the mode of the old
+	 value is greater than that of the clobber.  */
+      else if (GET_CODE (set) == CLOBBER_HIGH)
+	{
+	  if (new_reg_base_value[regno] != 0
+	      && reg_is_clobbered_by_clobber_high (
+		   regno, GET_MODE (new_reg_base_value[regno]), XEXP (set, 0)))
+	    new_reg_base_value[regno] = 0;
+	  return;
+	}
+
       src = SET_SRC (set);
     }
   else
diff --git a/gcc/cfgexpand.c b/gcc/cfgexpand.c
index d6e3c382085..39db6ed435f 100644
--- a/gcc/cfgexpand.c
+++ b/gcc/cfgexpand.c
@@ -3750,6 +3750,7 @@ expand_gimple_stmt (gimple *stmt)
 	      /* If we want exceptions for non-call insns, any
		 may_trap_p instruction may throw.  */
 	      && GET_CODE (PATTERN (insn)) != CLOBBER
+	      && GET_CODE (PATTERN (insn)) != CLOBBER_HIGH
 	      && GET_CODE (PATTERN (insn)) != USE
 	      && insn_could_throw_p (insn))
 	    make_reg_eh_region_note (insn, 0, lp_nr);
diff --git a/gcc/combine-stack-adj.c b/gcc/combine-stack-adj.c
index 4573dc2a87d..bd7f5d0d2ec 100644
--- a/gcc/combine-stack-adj.c
+++ b/gcc/combine-stack-adj.c
@@ -133,6 +133,7 @@ single_set_for_csa (rtx_insn *insn)
 	       && SET_SRC (this_rtx) == SET_DEST (this_rtx))
 	;
       else if (GET_CODE (this_rtx) != CLOBBER
+	       && GET_CODE (this_rtx) != CLOBBER_HIGH
 	       && GET_CODE (this_rtx) != USE)
 	return NULL_RTX;
     }
diff --git a/gcc/combine.c b/gcc/combine.c
index cfe0f190ece..be8397b5763 100644
--- a/gcc/combine.c
+++ b/gcc/combine.c
@@ -570,6 +570,7 @@ find_single_use_1 (rtx dest, rtx *loc)
     case SYMBOL_REF:
     CASE_CONST_ANY:
     case CLOBBER:
+    case CLOBBER_HIGH:
       return 0;
 
     case SET:
@@ -1752,6 +1753,9 @@ set_nonzero_bits_and_sign_copies (rtx x, const_rtx set, void *data)
 	  return;
 	}
 
+      /* Should not happen as we are only using pseudo registers.  */
+      gcc_assert (GET_CODE (set) != CLOBBER_HIGH);
+
       /* If this register is being initialized using itself, and the
 	 register is uninitialized in this basic block, and there are
 	 no LOG_LINKS which set the register, then part of the
@@ -1910,6 +1914,7 @@ can_combine_p (rtx_insn *insn, rtx_insn *i3, rtx_insn *pred ATTRIBUTE_UNUSED,
 
 	    /* We can ignore CLOBBERs.  */
 	    case CLOBBER:
+	    case CLOBBER_HIGH:
 	      break;
 
 	    case SET:
@@ -2570,10 +2575,17 @@ is_parallel_of_n_reg_sets (rtx pat, int n)
 	|| !REG_P (SET_DEST (XVECEXP (pat, 0, i))))
       return false;
   for ( ; i < len; i++)
-    if (GET_CODE (XVECEXP (pat, 0, i)) != CLOBBER
-	|| XEXP (XVECEXP (pat, 0, i), 0) == const0_rtx)
-      return false;
-
+    switch (GET_CODE (XVECEXP (pat, 0, i)))
+      {
+      case CLOBBER:
+	if (XEXP (XVECEXP (pat, 0, i), 0) == const0_rtx)
+	  return false;
+	break;
+      case CLOBBER_HIGH:
+	break;
+      default:
+	return false;
+      }
   return true;
 }
 
@@ -2848,7 +2860,8 @@ try_combine (rtx_insn *i3, rtx_insn *i2, rtx_insn *i1, rtx_insn *i0,
       for (i = 0; ok && i < XVECLEN (p2, 0); i++)
 	{
 	  if ((GET_CODE (XVECEXP (p2, 0, i)) == SET
-	       || GET_CODE (XVECEXP (p2, 0, i)) == CLOBBER)
+	       || GET_CODE (XVECEXP (p2, 0, i)) == CLOBBER
+	       || GET_CODE (XVECEXP (p2, 0, i)) == CLOBBER_HIGH)
 	      && reg_overlap_mentioned_p (SET_DEST (PATTERN (i3)),
 					  SET_DEST (XVECEXP (p2, 0, i))))
 	    ok = false;
@@ -13296,6 +13309,15 @@ record_dead_and_set_regs_1 (rtx dest, const_rtx setter, void *data)
			  ? SET_SRC (setter)
			  : gen_lowpart (GET_MODE (dest),
					 SET_SRC (setter)));
+  else if (GET_CODE (setter) == CLOBBER_HIGH)
+    {
+      reg_stat_type *rsp = &reg_stat[REGNO (dest)];
+      if (rsp->last_set_value
+	  && reg_is_clobbered_by_clobber_high
+	       (REGNO (dest), GET_MODE (rsp->last_set_value),
+		XEXP (setter, 0)))
+	record_value_for_reg (dest, NULL, NULL_RTX);
+    }
   else
     record_value_for_reg (dest, record_dead_insn, NULL_RTX);
 }
@@ -13716,6 +13738,7 @@ get_last_value (const_rtx x)
 
 static unsigned int reg_dead_regno, reg_dead_endregno;
 static int reg_dead_flag;
+rtx reg_dead_reg;
 
 /* Function called via note_stores from reg_dead_at_p.
 
@@ -13730,6 +13753,10 @@ reg_dead_at_p_1 (rtx dest, const_rtx x, void *data ATTRIBUTE_UNUSED)
 
   if (!REG_P (dest))
     return;
 
+  if (GET_CODE (x) == CLOBBER_HIGH
+      && !reg_is_clobbered_by_clobber_high (reg_dead_reg, XEXP (x, 0)))
+    return;
+
   regno = REGNO (dest);
   endregno = END_REGNO (dest);
   if (reg_dead_endregno > regno && reg_dead_regno < endregno)
@@ -13753,6 +13780,7 @@ reg_dead_at_p (rtx reg, rtx_insn *insn)
   /* Set variables for reg_dead_at_p_1.  */
   reg_dead_regno = REGNO (reg);
   reg_dead_endregno = END_REGNO (reg);
+  reg_dead_reg = reg;
 
   reg_dead_flag = 0;
 
diff --git a/gcc/dce.c b/gcc/dce.c
index e4d442c62b5..7fc3fd209b5 100644
--- a/gcc/dce.c
+++ b/gcc/dce.c
@@ -145,6 +145,7 @@ deletable_insn_p (rtx_insn *insn, bool fast, bitmap arg_stores)
       return false;
 
     case CLOBBER:
+    case CLOBBER_HIGH:
       if (fast)
 	{
 	  /* A CLOBBER of a dead pseudo register serves no purpose.
@@ -213,7 +214,10 @@ static void
 mark_nonreg_stores_1 (rtx dest, const_rtx pattern, void *data)
 {
   if (GET_CODE (pattern) != CLOBBER && !REG_P (dest))
-    mark_insn ((rtx_insn *) data, true);
+    {
+      gcc_checking_assert (GET_CODE (pattern) != CLOBBER_HIGH);
+      mark_insn ((rtx_insn *) data, true);
+    }
 }
 
 
@@ -224,7 +228,10 @@ static void
 mark_nonreg_stores_2 (rtx dest, const_rtx pattern, void *data)
 {
   if (GET_CODE (pattern) != CLOBBER && !REG_P (dest))
-    mark_insn ((rtx_insn *) data, false);
+    {
+      gcc_checking_assert (GET_CODE (pattern) != CLOBBER_HIGH);
+      mark_insn ((rtx_insn *) data, false);
+    }
 }
 
 
diff --git a/gcc/df-scan.c b/gcc/df-scan.c
index cbb08fc36ae..0b119f211ea 100644
--- a/gcc/df-scan.c
+++ b/gcc/df-scan.c
@@ -2778,6 +2778,7 @@ df_find_hard_reg_defs (rtx x, HARD_REG_SET *defs)
       break;
 
     case CLOBBER:
+    case CLOBBER_HIGH:
       df_find_hard_reg_defs_1 (XEXP (x, 0), defs);
       break;
 
@@ -2837,6 +2838,10 @@ df_uses_record (struct df_collection_rec *collection_rec,
       /* If we're clobbering a REG then we have a def so ignore.  */
       return;
 
+    case CLOBBER_HIGH:
+      gcc_assert (REG_P (XEXP (x, 0)));
+      return;
+
     case MEM:
       df_uses_record (collection_rec,
		      &XEXP (x, 0), DF_REF_REG_MEM_LOAD,
@@ -3133,6 +3138,7 @@ df_get_call_refs (struct df_collection_rec *collection_rec,
   for (note = CALL_INSN_FUNCTION_USAGE (insn_info->insn); note;
        note = XEXP (note, 1))
     {
+      gcc_assert (GET_CODE (XEXP (note, 0)) != CLOBBER_HIGH);
       if (GET_CODE (XEXP (note, 0)) == USE)
	df_uses_record (collection_rec, &XEXP (XEXP (note, 0), 0),
			DF_REF_REG_USE, bb, insn_info, flags);
diff --git a/gcc/dwarf2out.c b/gcc/dwarf2out.c
index 8377cbc5dd1..33d6d3ea3a2 100644
--- a/gcc/dwarf2out.c
+++ b/gcc/dwarf2out.c
@@ -16321,6 +16321,7 @@ mem_loc_descriptor (rtx rtl, machine_mode mode,
     case CONST_FIXED:
     case CLRSB:
     case CLOBBER:
+    case CLOBBER_HIGH:
       /* If delegitimize_address couldn't do anything with the UNSPEC, we
	 can't express it in the debug info.  This can happen e.g. with some
	 TLS UNSPECs.  */
diff --git a/gcc/haifa-sched.c b/gcc/haifa-sched.c
index 4a899b56173..4f0221f6f43 100644
--- a/gcc/haifa-sched.c
+++ b/gcc/haifa-sched.c
@@ -529,6 +529,9 @@ haifa_classify_rtx (const_rtx x)
	  /* Test if it is a 'store'.  */
	  tmp_class = may_trap_exp (XEXP (x, 0), 1);
	  break;
+	case CLOBBER_HIGH:
+	  gcc_assert (REG_P (XEXP (x, 0)));
+	  break;
	case SET:
	  /* Test if it is a store.  */
	  tmp_class = may_trap_exp (SET_DEST (x), 1);
diff --git a/gcc/ira-build.c b/gcc/ira-build.c
index c0fd9384f8b..da017be5b3d 100644
--- a/gcc/ira-build.c
+++ b/gcc/ira-build.c
@@ -1876,6 +1876,11 @@ create_insn_allocnos (rtx x, rtx outer, bool output_p)
       create_insn_allocnos (XEXP (x, 0), NULL, true);
       return;
     }
+  else if (code == CLOBBER_HIGH)
+    {
+      gcc_assert (REG_P (XEXP (x, 0)) && HARD_REGISTER_P (XEXP (x, 0)));
+      return;
+    }
   else if (code == MEM)
     {
       create_insn_allocnos (XEXP (x, 0), NULL, false);
diff --git a/gcc/ira-costs.c b/gcc/ira-costs.c
index 2b4ae38f410..6fa917ab1a1 100644
--- a/gcc/ira-costs.c
+++ b/gcc/ira-costs.c
@@ -1444,6 +1444,13 @@ scan_one_insn (rtx_insn *insn)
       return insn;
     }
 
+  if (pat_code == CLOBBER_HIGH)
+    {
+      gcc_assert (REG_P (XEXP (PATTERN (insn), 0))
+		  && HARD_REGISTER_P (XEXP (PATTERN (insn), 0)));
+      return insn;
+    }
+
   counted_mem = false;
   set = single_set (insn);
   extract_insn (insn);
diff --git a/gcc/ira.c b/gcc/ira.c
index b7bcc15a15d..def194a0782 100644
--- a/gcc/ira.c
+++ b/gcc/ira.c
@@ -3086,6 +3086,7 @@ equiv_init_movable_p (rtx x, int regno)
 
     case CC0:
     case CLOBBER:
+    case CLOBBER_HIGH:
       return 0;
 
     case PRE_INC:
@@ -4411,6 +4412,7 @@ rtx_moveable_p (rtx *loc, enum op_type type)
	      && rtx_moveable_p (&XEXP (x, 2), OP_IN));
 
     case CLOBBER:
+    case CLOBBER_HIGH:
       return rtx_moveable_p (&SET_DEST (x), OP_OUT);
 
     case UNSPEC_VOLATILE:
@@ -4863,7 +4865,9 @@ interesting_dest_for_shprep (rtx_insn *insn, basic_block call_dom)
   for (int i = 0; i < XVECLEN (pat, 0); i++)
     {
       rtx sub = XVECEXP (pat, 0, i);
-      if (GET_CODE (sub) == USE || GET_CODE (sub) == CLOBBER)
+      if (GET_CODE (sub) == USE
+	  || GET_CODE (sub) == CLOBBER
+	  || GET_CODE (sub) == CLOBBER_HIGH)
	continue;
 
       if (GET_CODE (sub) != SET || side_effects_p (sub))
diff --git a/gcc/jump.c b/gcc/jump.c
index f379048f636..06f7255d24d 100644
--- a/gcc/jump.c
+++ b/gcc/jump.c
@@ -1094,6 +1094,7 @@ mark_jump_label_1 (rtx x, rtx_insn *insn, bool in_mem, bool is_target)
     case CC0:
     case REG:
     case CLOBBER:
+    case CLOBBER_HIGH:
     case CALL:
       return;
 
diff --git a/gcc/postreload-gcse.c b/gcc/postreload-gcse.c
index 152dd6ae452..afa61dcede6 100644
--- a/gcc/postreload-gcse.c
+++ b/gcc/postreload-gcse.c
@@ -791,15 +791,18 @@ record_opr_changes (rtx_insn *insn)
	  record_last_reg_set_info_regno (insn, regno);
 
       for (link = CALL_INSN_FUNCTION_USAGE (insn); link; link = XEXP (link, 1))
-	if (GET_CODE (XEXP (link, 0)) == CLOBBER)
-	  {
-	    x = XEXP (XEXP (link, 0), 0);
-	    if (REG_P (x))
-	      {
-		gcc_assert (HARD_REGISTER_P (x));
-		record_last_reg_set_info (insn, x);
-	      }
-	  }
+	{
+	  gcc_assert (GET_CODE (XEXP (link, 0)) != CLOBBER_HIGH);
+	  if (GET_CODE (XEXP (link, 0)) == CLOBBER)
+	    {
+	      x = XEXP (XEXP (link, 0), 0);
+	      if (REG_P (x))
+		{
+		  gcc_assert (HARD_REGISTER_P (x));
+		  record_last_reg_set_info (insn, x);
+		}
+	    }
+	}
 
       if (! RTL_CONST_OR_PURE_CALL_P (insn))
	record_last_mem_set_info (insn);
diff --git a/gcc/postreload.c b/gcc/postreload.c
index 0638709639b..56cb14dc676 100644
--- a/gcc/postreload.c
+++ b/gcc/postreload.c
@@ -133,6 +133,8 @@ reload_cse_simplify (rtx_insn *insn, rtx testreg)
	  for (i = XVECLEN (body, 0) - 1; i >= 0; --i)
	    {
	      rtx part = XVECEXP (body, 0, i);
+	      /* asms can only have full clobbers, not clobber_highs.  */
+	      gcc_assert (GET_CODE (part) != CLOBBER_HIGH);
	      if (GET_CODE (part) == CLOBBER && REG_P (XEXP (part, 0)))
		cselib_invalidate_rtx (XEXP (part, 0));
	    }
@@ -156,6 +158,7 @@ reload_cse_simplify (rtx_insn *insn, rtx testreg)
	    }
	}
       else if (GET_CODE (part) != CLOBBER
+	       && GET_CODE (part) != CLOBBER_HIGH
	       && GET_CODE (part) != USE)
	break;
     }
@@ -667,7 +670,8 @@ struct reg_use
    STORE_RUID is always meaningful if we only want to use a value in a
    register in a different place: it denotes the next insn in the insn
    stream (i.e. the last encountered) that sets or clobbers the register.
-   REAL_STORE_RUID is similar, but clobbers are ignored when updating it.  */
+   REAL_STORE_RUID is similar, but clobbers are ignored when updating it.
+   EXPR is the expression used when storing the register.  */
 static struct
   {
     struct reg_use reg_use[RELOAD_COMBINE_MAX_USES];
@@ -677,6 +681,7 @@ static struct
     int real_store_ruid;
     int use_ruid;
     bool all_offsets_match;
+    rtx expr;
   } reg_state[FIRST_PSEUDO_REGISTER];
 
 /* Reverse linear uid.  This is increased in reload_combine while scanning
@@ -1341,6 +1346,10 @@ reload_combine (void)
	  {
	    rtx setuse = XEXP (link, 0);
	    rtx usage_rtx = XEXP (setuse, 0);
+	    /* We could support CLOBBER_HIGH and treat it in the same way as
+	       HARD_REGNO_CALL_PART_CLOBBERED, but no port needs that yet.  */
+	    gcc_assert (GET_CODE (setuse) != CLOBBER_HIGH);
+
	    if ((GET_CODE (setuse) == USE || GET_CODE (setuse) == CLOBBER)
	        && REG_P (usage_rtx))
	      {
@@ -1516,6 +1525,10 @@ reload_combine_note_use (rtx *xp, rtx_insn *insn, int ruid, rtx containing_mem)
	}
       break;
 
+    case CLOBBER_HIGH:
+      gcc_assert (REG_P (SET_DEST (x)));
+      return;
+
     case PLUS:
       /* We are interested in (plus (reg) (const_int)) .  */
       if (!REG_P (XEXP (x, 0))
@@ -2135,6 +2148,9 @@ reload_cse_move2add (rtx_insn *first)
	    {
	      rtx setuse = XEXP (link, 0);
	      rtx usage_rtx = XEXP (setuse, 0);
+	      /* CALL_INSN_FUNCTION_USAGEs can only have full clobbers, not
+		 clobber_highs.  */
+	      gcc_assert (GET_CODE (setuse) != CLOBBER_HIGH);
	      if (GET_CODE (setuse) == CLOBBER
		  && REG_P (usage_rtx))
		{
@@ -2297,6 +2313,13 @@ move2add_note_store (rtx dst, const_rtx set, void *data)
 
       move2add_record_mode (dst);
     }
+  else if (GET_CODE (set) == CLOBBER_HIGH)
+    {
+      /* Only invalidate if actually clobbered.  */
+      if (reg_mode[regno] == BLKmode
+	  || reg_is_clobbered_by_clobber_high (regno, reg_mode[regno], dst))
+	goto invalidate;
+    }
   else
     {
     invalidate:
diff --git a/gcc/print-rtl.c b/gcc/print-rtl.c
index 37c0d53fae2..ba9ac02fce7 100644
--- a/gcc/print-rtl.c
+++ b/gcc/print-rtl.c
@@ -1742,6 +1742,7 @@ print_pattern (pretty_printer *pp, const_rtx x, int verbose)
       print_exp (pp, x, verbose);
       break;
     case CLOBBER:
+    case CLOBBER_HIGH:
     case USE:
       pp_printf (pp, "%s ", GET_RTX_NAME (GET_CODE (x)));
       print_value (pp, XEXP (x, 0), verbose);
diff --git a/gcc/recog.c b/gcc/recog.c
index 0a8fa2ce46c..d7c69439683 100644
--- a/gcc/recog.c
+++ b/gcc/recog.c
@@ -1600,6 +1600,7 @@ decode_asm_operands (rtx body, rtx *operands, rtx **operand_locs,
	{
	  if (GET_CODE (XVECEXP (body, 0, i)) == CLOBBER)
	    break;		/* Past last SET */
+	  gcc_assert (GET_CODE (XVECEXP (body, 0, i)) == SET);
	  if (operands)
	    operands[i] = SET_DEST (XVECEXP (body, 0, i));
	  if (operand_locs)
@@ -3715,7 +3716,8 @@ store_data_bypass_p_1 (rtx_insn *out_insn, rtx in_set)
     {
       rtx out_exp = XVECEXP (out_pat, 0, i);
 
-      if (GET_CODE (out_exp) == CLOBBER || GET_CODE (out_exp) == USE)
+      if (GET_CODE (out_exp) == CLOBBER || GET_CODE (out_exp) == USE
+	  || GET_CODE (out_exp) == CLOBBER_HIGH)
	continue;
 
       gcc_assert (GET_CODE (out_exp) == SET);
@@ -3746,7 +3748,8 @@ store_data_bypass_p (rtx_insn *out_insn, rtx_insn *in_insn)
	{
	  rtx in_exp = XVECEXP (in_pat, 0, i);
 
-	  if (GET_CODE (in_exp) == CLOBBER || GET_CODE (in_exp) == USE)
+	  if (GET_CODE (in_exp) == CLOBBER || GET_CODE (in_exp) == USE
+	      || GET_CODE (in_exp) == CLOBBER_HIGH)
	    continue;
 
	  gcc_assert (GET_CODE (in_exp) == SET);
@@ -3798,7 +3801,7 @@ if_test_bypass_p (rtx_insn *out_insn, rtx_insn *in_insn)
	{
	  rtx exp = XVECEXP (out_pat, 0, i);
 
-	  if (GET_CODE (exp) == CLOBBER)
+	  if (GET_CODE (exp) == CLOBBER || GET_CODE (exp) == CLOBBER_HIGH)
	    continue;
 
	  gcc_assert (GET_CODE (exp) == SET);
diff --git a/gcc/regcprop.c b/gcc/regcprop.c
index 18132425ab2..1f805765b93 100644
--- a/gcc/regcprop.c
+++ b/gcc/regcprop.c
@@ -237,7 +237,11 @@ static void
 kill_clobbered_value (rtx x, const_rtx set, void *data)
 {
   struct value_data *const vd = (struct value_data *) data;
-  if (GET_CODE (set) == CLOBBER)
+
+  gcc_assert (GET_CODE (set) != CLOBBER_HIGH || REG_P (x));
+
+  if (GET_CODE (set) == CLOBBER
+      || (GET_CODE (set) == CLOBBER_HIGH
+	  && reg_is_clobbered_by_clobber_high (x, XEXP (set, 0))))
     kill_value (x, vd);
 }
 
@@ -257,7 +261,9 @@ kill_set_value (rtx x, const_rtx set, void *data)
   struct kill_set_value_data *ksvd = (struct kill_set_value_data *) data;
   if (rtx_equal_p (x, ksvd->ignore_set_reg))
     return;
-  if (GET_CODE (set) != CLOBBER)
+
+  gcc_assert (GET_CODE (set) != CLOBBER_HIGH || REG_P (x));
+  if (GET_CODE (set) != CLOBBER && GET_CODE (set) != CLOBBER_HIGH)
     {
       kill_value (x, ksvd->vd);
       if (REG_P (x))
diff --git a/gcc/reginfo.c b/gcc/reginfo.c
index f4071dac8b4..1f36d141c73 100644
--- a/gcc/reginfo.c
+++ b/gcc/reginfo.c
@@ -1100,6 +1100,10 @@ reg_scan_mark_refs (rtx x, rtx_insn *insn)
	reg_scan_mark_refs (XEXP (XEXP (x, 0), 0), insn);
       break;
 
+    case CLOBBER_HIGH:
+      gcc_assert (!(MEM_P (XEXP (x, 0))));
+      break;
+
     case SET:
       /* Count a set of the destination if it is a register.  */
       for (dest = SET_DEST (x);
diff --git a/gcc/reload1.c b/gcc/reload1.c
index a4cc3ee02ea..8e15160c6ad 100644
--- a/gcc/reload1.c
+++ b/gcc/reload1.c
@@ -1339,6 +1339,8 @@ maybe_fix_stack_asms (void)
	    rtx t = XVECEXP (pat, 0, i);
	    if (GET_CODE (t) == CLOBBER && STACK_REG_P (XEXP (t, 0)))
	      SET_HARD_REG_BIT (clobbered, REGNO (XEXP (t, 0)));
+	    /* CLOBBER_HIGH is only supported for LRA.  */
+	    gcc_assert (GET_CODE (t) != CLOBBER_HIGH);
	  }
 
       /* Get the operand values and constraints out of the insn.  */
@@ -2879,6 +2881,7 @@ eliminate_regs_1 (rtx x, machine_mode mem_mode, rtx insn,
       return x;
 
     case CLOBBER:
+    case CLOBBER_HIGH:
     case ASM_OPERANDS:
       gcc_assert (insn && DEBUG_INSN_P (insn));
       break;
@@ -3089,6 +3092,10 @@ elimination_effects (rtx x, machine_mode mem_mode)
	elimination_effects (XEXP (x, 0), mem_mode);
       return;
 
+    case CLOBBER_HIGH:
+      /* CLOBBER_HIGH is only supported for LRA.  */
+      return;
+
     case SET:
       /* Check for setting a register that we know about.  */
       if (REG_P (SET_DEST (x)))
@@ -3810,6 +3817,9 @@ mark_not_eliminable (rtx dest, const_rtx x, void *data ATTRIBUTE_UNUSED)
   if (dest == hard_frame_pointer_rtx)
     return;
 
+  /* CLOBBER_HIGH is only supported for LRA.  */
+  gcc_assert (GET_CODE (x) != CLOBBER_HIGH);
+
   for (i = 0; i < NUM_ELIMINABLE_REGS; i++)
     if (reg_eliminate[i].can_eliminate && dest == reg_eliminate[i].to_rtx
	&& (GET_CODE (x) != SET
@@ -4445,6 +4455,7 @@ scan_paradoxical_subregs (rtx x)
     case PC:
     case USE:
     case CLOBBER:
+    case CLOBBER_HIGH:
       return;
 
     case SUBREG:
@@ -4899,7 +4910,7 @@ reload_as_needed (int live_known)
    to be forgotten later.  */
 
 static void
-forget_old_reloads_1 (rtx x, const_rtx ignored ATTRIBUTE_UNUSED,
+forget_old_reloads_1 (rtx x, const_rtx setter,
		      void *data)
 {
   unsigned int regno;
@@ -4919,6 +4930,9 @@ forget_old_reloads_1 (rtx x, const_rtx ignored ATTRIBUTE_UNUSED,
   if (!REG_P (x))
     return;
 
+  /* CLOBBER_HIGH is only supported for LRA.  */
+  gcc_assert (GET_CODE (setter) != CLOBBER_HIGH);
+
   regno = REGNO (x);
 
   if (regno >= FIRST_PSEUDO_REGISTER)
diff --git a/gcc/reorg.c b/gcc/reorg.c
index 904d91ec9e8..f8a986cea1d 100644
--- a/gcc/reorg.c
+++ b/gcc/reorg.c
@@ -397,7 +397,8 @@ find_end_label (rtx kind)
   while (NOTE_P (insn)
	 || (NONJUMP_INSN_P (insn)
	     && (GET_CODE (PATTERN (insn)) == USE
-		 || GET_CODE (PATTERN (insn)) == CLOBBER)))
+		 || GET_CODE (PATTERN (insn)) == CLOBBER
+		 || GET_CODE (PATTERN (insn)) == CLOBBER_HIGH)))
     insn = PREV_INSN (insn);
 
   /* When a target threads its epilogue we might already have a
@@ -1297,7 +1298,8 @@ try_merge_delay_insns (rtx_insn *insn, rtx_insn *thread)
 
       /* TRIAL must be a CALL_INSN or INSN.  Skip USE and CLOBBER.  */
       if (NONJUMP_INSN_P (trial)
-	  && (GET_CODE (pat) == USE || GET_CODE (pat) == CLOBBER))
+	  && (GET_CODE (pat) == USE || GET_CODE (pat) == CLOBBER
+	      || GET_CODE (pat) == CLOBBER_HIGH))
	continue;
 
       if (GET_CODE (next_to_match) == GET_CODE (trial)
@@ -1491,7 +1493,8 @@ redundant_insn (rtx insn, rtx_insn *target, const vec<rtx_insn *> &delay_list)
       --insns_to_search;
 
       pat = PATTERN (trial);
-      if (GET_CODE (pat) == USE || GET_CODE (pat) == CLOBBER)
+      if (GET_CODE (pat) == USE || GET_CODE (pat) == CLOBBER
+	  || GET_CODE (pat) == CLOBBER_HIGH)
	continue;
 
       if (GET_CODE (trial) == DEBUG_INSN)
@@ -1589,7 +1592,8 @@ redundant_insn (rtx insn, rtx_insn *target, const vec<rtx_insn *> &delay_list)
       --insns_to_search;
 
       pat = PATTERN (trial);
-      if (GET_CODE (pat) == USE || GET_CODE (pat) == CLOBBER)
+      if (GET_CODE (pat) == USE || GET_CODE (pat) == CLOBBER
+	  || GET_CODE (pat) == CLOBBER_HIGH)
	continue;
 
       if (GET_CODE (trial) == DEBUG_INSN)
@@ -1701,7 +1705,8 @@ own_thread_p (rtx thread, rtx label, int allow_fallthrough)
	|| LABEL_P (insn)
	|| (NONJUMP_INSN_P (insn)
	    && GET_CODE (PATTERN (insn)) != USE
-	    && GET_CODE (PATTERN (insn)) != CLOBBER))
+	    && GET_CODE (PATTERN (insn)) != CLOBBER
+	    && GET_CODE (PATTERN (insn)) != CLOBBER_HIGH))
      return 0;
 
   return 1;
@@ -2024,7 +2029,8 @@ fill_simple_delay_slots (int non_jumps_p)
	  pat = PATTERN (trial);
 
	  /* Stand-alone USE and CLOBBER are just for flow.  */
-	  if (GET_CODE (pat) == USE || GET_CODE (pat) == CLOBBER)
+	  if (GET_CODE (pat) == USE || GET_CODE (pat) == CLOBBER
+	      || GET_CODE (pat) == CLOBBER_HIGH)
	    continue;
 
	  /* And DEBUG_INSNs never go into delay slots.  */
@@ -2150,7 +2156,8 @@ fill_simple_delay_slots (int non_jumps_p)
	      pat = PATTERN (trial);
 
	      /* Stand-alone USE and CLOBBER are just for flow.  */
-	      if (GET_CODE (pat) == USE || GET_CODE (pat) == CLOBBER)
+	      if (GET_CODE (pat) == USE || GET_CODE (pat) == CLOBBER
+		  || GET_CODE (pat) == CLOBBER_HIGH)
		continue;
 
	      /* And DEBUG_INSNs do not go in delay slots.  */
@@ -2418,7 +2425,8 @@ fill_slots_from_thread (rtx_jump_insn *insn, rtx condition,
	}
 
       pat = PATTERN (trial);
-      if (GET_CODE (pat) == USE || GET_CODE (pat) == CLOBBER)
+      if (GET_CODE (pat) == USE || GET_CODE (pat) == CLOBBER
+	  || GET_CODE (pat) == CLOBBER_HIGH)
	continue;
 
       if (GET_CODE (trial) == DEBUG_INSN)
@@ -3818,7 +3826,8 @@ dbr_schedule (rtx_insn *first)
       if (! insn->deleted ()
	  && NONJUMP_INSN_P (insn)
	  && GET_CODE (PATTERN (insn)) != USE
-	  && GET_CODE (PATTERN (insn)) != CLOBBER)
+	  && GET_CODE (PATTERN (insn)) != CLOBBER
+	  && GET_CODE (PATTERN (insn)) != CLOBBER_HIGH)
	{
	  if (GET_CODE (PATTERN (insn)) == SEQUENCE)
	    {
diff --git a/gcc/resource.c b/gcc/resource.c
index 0822daebde7..fdfab6916d4 100644
--- a/gcc/resource.c
+++ b/gcc/resource.c
@@ -108,6 +108,11 @@ update_live_status (rtx dest, const_rtx x, void *data ATTRIBUTE_UNUSED)
   if (GET_CODE (x) == CLOBBER)
     for (i = first_regno; i < last_regno; i++)
       CLEAR_HARD_REG_BIT (current_live_regs, i);
+  else if (GET_CODE (x) == CLOBBER_HIGH)
+    /* No current target supports both branch delay slots and CLOBBER_HIGH.
+       We'd need more elaborate liveness tracking to handle that
+       combination.  */
+    gcc_unreachable ();
   else
     for (i = first_regno; i < last_regno; i++)
       {
@@ -293,6 +298,7 @@ mark_referenced_resources (rtx x, struct resources *res,
       return;
 
     case CLOBBER:
+    case CLOBBER_HIGH:
       return;
 
     case CALL_INSN:
@@ -668,9 +674,15 @@ mark_set_resources (rtx x, struct resources *res, int in_dest,
 
	  for (link = CALL_INSN_FUNCTION_USAGE (call_insn);
	       link; link = XEXP (link, 1))
-	    if (GET_CODE (XEXP (link, 0)) == CLOBBER)
-	      mark_set_resources (SET_DEST (XEXP (link, 0)), res, 1,
-				  MARK_SRC_DEST);
+	    {
+	      /* We could support CLOBBER_HIGH and treat it in the same way as
+		 HARD_REGNO_CALL_PART_CLOBBERED, but no port needs that
+		 yet.  */
+	      gcc_assert (GET_CODE (XEXP (link, 0)) != CLOBBER_HIGH);
+	      if (GET_CODE (XEXP (link, 0)) == CLOBBER)
+		mark_set_resources (SET_DEST (XEXP (link, 0)), res, 1,
+				    MARK_SRC_DEST);
+	    }
 
	  /* Check for a REG_SETJMP.  If it exists, then we must assume
	     that this call can clobber any register.  */
@@ -713,6 +725,12 @@ mark_set_resources (rtx x, struct resources *res, int in_dest,
       mark_set_resources (XEXP (x, 0), res, 1, MARK_SRC_DEST);
       return;
 
+    case CLOBBER_HIGH:
+      /* No current target supports both branch delay slots and CLOBBER_HIGH.
+	 We'd need more elaborate liveness tracking to handle that
+	 combination.  */
+      gcc_unreachable ();
+
     case SEQUENCE:
       {
	rtx_sequence *seq = as_a <rtx_sequence *> (x);
diff --git a/gcc/rtl.c b/gcc/rtl.c
index 985db1c14f0..f9146afcf2c 100644
--- a/gcc/rtl.c
+++ b/gcc/rtl.c
@@ -304,6 +304,10 @@ copy_rtx (rtx orig)
	return orig;
       break;
 
+    case CLOBBER_HIGH:
+      gcc_assert (REG_P (XEXP (orig, 0)));
+      return orig;
+
     case CONST:
       if (shared_const_p (orig))
	return orig;
diff --git a/gcc/rtlanal.c b/gcc/rtlanal.c
index 1cab1545744..6c620a4a3e5 100644
--- a/gcc/rtlanal.c
+++ b/gcc/rtlanal.c
@@ -1198,6 +1198,10 @@ reg_referenced_p (const_rtx x, const_rtx body)
	  return 1;
       return 0;
 
+    case CLOBBER_HIGH:
+      gcc_assert (REG_P (XEXP (body, 0)));
+      return 0;
+
     case COND_EXEC:
       if (reg_overlap_mentioned_p (x, COND_EXEC_TEST (body)))
	return 1;
@@ -1420,7 +1424,11 @@ set_of_1 (rtx x, const_rtx pat, void *data1)
 {
   struct set_of_data *const data = (struct set_of_data *) (data1);
   if (rtx_equal_p (x, data->pat)
-      || (!MEM_P (x) && reg_overlap_mentioned_p (data->pat, x)))
+      || (GET_CODE (pat) == CLOBBER_HIGH
+	  && REGNO (data->pat) == REGNO (XEXP (pat, 0))
+	  && reg_is_clobbered_by_clobber_high (data->pat, XEXP (pat, 0)))
+      || (GET_CODE (pat) != CLOBBER_HIGH && !MEM_P (x)
+	  && reg_overlap_mentioned_p (data->pat, x)))
     data->found = pat;
 }
 
@@ -1509,6 +1517,7 @@ single_set_2 (const rtx_insn *insn, const_rtx pat)
	{
	case USE:
	case CLOBBER:
+	case CLOBBER_HIGH:
	  break;
 
	case SET:
@@ -1663,7 +1672,8 @@ noop_move_p (const rtx_insn *insn)
	  rtx tem = XVECEXP (pat, 0, i);
 
	  if (GET_CODE (tem) == USE
-	      || GET_CODE (tem) == CLOBBER)
+	      || GET_CODE (tem) == CLOBBER
+	      || GET_CODE (tem) == CLOBBER_HIGH)
	    continue;
 
	  if (GET_CODE (tem) != SET || ! set_noop_p (tem))
@@ -1895,7 +1905,9 @@ note_stores (const_rtx x, void (*fun) (rtx, const_rtx, void *), void *data)
   if (GET_CODE (x) == COND_EXEC)
     x = COND_EXEC_CODE (x);
 
-  if (GET_CODE (x) == SET || GET_CODE (x) == CLOBBER)
+  if (GET_CODE (x) == SET
+      || GET_CODE (x) == CLOBBER
+      || GET_CODE (x) == CLOBBER_HIGH)
     {
       rtx dest = SET_DEST (x);
 
diff --git a/gcc/sched-deps.c b/gcc/sched-deps.c
index 120b5f0ddc1..f89f28269fd 100644
--- a/gcc/sched-deps.c
+++ b/gcc/sched-deps.c
@@ -2319,6 +2319,13 @@ sched_analyze_reg (struct deps_desc *deps, int regno, machine_mode mode,
	  while (--i >= 0)
	    note_reg_use (regno + i);
	}
+      else if (ref == CLOBBER_HIGH)
+	{
+	  gcc_assert (i == 1);
+	  /* We don't know the current state of the register, so have to treat
+	     the clobber high as a full clobber.  */
+	  note_reg_clobber (regno);
+	}
       else
	{
	  while (--i >= 0)
@@ -2342,6 +2349,8 @@ sched_analyze_reg (struct deps_desc *deps, int regno, machine_mode mode,
       else if (ref == USE)
	note_reg_use (regno);
       else
+	/* For CLOBBER_HIGH, we don't know the current state of the register,
+	   so have to treat it as a full clobber.  */
	note_reg_clobber (regno);
 
       /* Pseudos that are REG_EQUIV to something may be replaced
@@ -2962,7 +2971,7 @@ sched_analyze_insn (struct deps_desc *deps, rtx x, rtx_insn *insn)
	      sub = COND_EXEC_CODE (sub);
	      code = GET_CODE (sub);
	    }
-	  if (code == SET || code == CLOBBER)
+	  else if (code == SET || code == CLOBBER || code == CLOBBER_HIGH)
	    sched_analyze_1 (deps, sub, insn);
	  else
	    sched_analyze_2 (deps, sub, insn);
@@ -2978,6 +2987,10 @@ sched_analyze_insn (struct deps_desc *deps, rtx x, rtx_insn *insn)
	{
	  if (GET_CODE (XEXP (link, 0)) == CLOBBER)
	    sched_analyze_1 (deps, XEXP (link, 0), insn);
+	  else if (GET_CODE (XEXP (link, 0)) == CLOBBER_HIGH)
+	    /* We could support CLOBBER_HIGH and treat it in the same way as
+	       HARD_REGNO_CALL_PART_CLOBBERED, but no port needs that yet.  */
+	    gcc_unreachable ();
	  else if (GET_CODE (XEXP (link, 0)) != SET)
	    sched_analyze_2 (deps, XEXP (link, 0), insn);
	}
-- 
2.15.2 (Apple Git-101.1)
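
Appendix (illustration only, not part of the patch): the checks above all
lean on the same rule, namely that a CLOBBER_HIGH preserves only the low
part of a register, so a recorded value survives only if its mode fits in
that preserved part; anything wider, or of unknown (BLKmode) width, must be
treated as clobbered.  On AArch64 the motivating case is a tlsdesc call,
which preserves the low 128 bits of the vector registers but may clobber
the upper SVE part.  The following is a toy, standalone C model of that
rule; it is not GCC code and every name in it is made up.

/* Toy model: does a value of VALUE_BYTES bytes recorded in a register
   survive a clobber_high that preserves the low PRESERVED_BYTES bytes?  */

#include <stdbool.h>
#include <stdio.h>

static bool
toy_reg_is_clobbered_by_clobber_high (unsigned value_bytes,
                                      unsigned preserved_bytes)
{
  /* Unknown width: assume the worst, i.e. the value is clobbered.  */
  if (value_bytes == 0)
    return true;
  /* The value survives only if it fits entirely in the preserved low
     part of the register.  */
  return value_bytes > preserved_bytes;
}

int
main (void)
{
  const unsigned preserved = 16;   /* Low 128 bits kept across the call.  */

  /* A 128-bit (TImode-sized) value fits in the preserved part: kept.  */
  printf ("128-bit value clobbered: %d\n",
          toy_reg_is_clobbered_by_clobber_high (16, preserved));
  /* A 256-bit SVE value does not fit: it must be treated as clobbered.  */
  printf ("256-bit value clobbered: %d\n",
          toy_reg_is_clobbered_by_clobber_high (32, preserved));
  return 0;
}

This mirrors the shape of the changes in record_set,
record_dead_and_set_regs_1 and move2add_note_store above: the real
reg_is_clobbered_by_clobber_high is only consulted when a value is actually
recorded, and an unknown/BLKmode width falls back to invalidating it.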