From: Richard Sandiford
To: gcc-patches@gcc.gnu.org
Mail-Followup-To: gcc-patches@gcc.gnu.org, richard.sandiford@linaro.org
Subject: [056/nnn] poly_int: MEM_REF offsets
References: <871sltvm7r.fsf@linaro.org>
Date: Mon, 23 Oct 2017 17:24:00 -0000
In-Reply-To: <871sltvm7r.fsf@linaro.org> (Richard Sandiford's message of
  "Mon, 23 Oct 2017 17:54:32 +0100")
Message-ID: <8760b5lqw7.fsf@linaro.org>

This patch allows MEM_REF offsets to be polynomial, with mem_ref_offset
now returning a poly_offset_int instead of an offset_int.  The
non-mechanical changes to callers of mem_ref_offset were handled by
previous patches.


2017-10-23  Richard Sandiford
	    Alan Hayward
	    David Sherwood

gcc/
	* fold-const.h (mem_ref_offset): Return a poly_offset_int rather
	than an offset_int.
	* tree.c (mem_ref_offset): Likewise.
	* builtins.c (get_object_alignment_2): Treat MEM_REF offsets as
	poly_ints.
	* expr.c (get_inner_reference, expand_expr_real_1): Likewise.
	* gimple-fold.c (get_base_constructor): Likewise.
	* gimple-ssa-strength-reduction.c (restructure_reference): Likewise.
	* ipa-polymorphic-call.c
	(ipa_polymorphic_call_context::ipa_polymorphic_call_context): Likewise.
	* ipa-prop.c (compute_complex_assign_jump_func, get_ancestor_addr_info)
	(ipa_get_adjustment_candidate): Likewise.
	* match.pd: Likewise.
	* tree-data-ref.c (dr_analyze_innermost): Likewise.
	* tree-dfa.c (get_addr_base_and_unit_offset_1): Likewise.
	* tree-eh.c (tree_could_trap_p): Likewise.
	* tree-object-size.c (addr_object_size): Likewise.
	* tree-ssa-address.c (copy_ref_info): Likewise.
	* tree-ssa-alias.c (indirect_ref_may_alias_decl_p): Likewise.
	(indirect_refs_may_alias_p): Likewise.
	* tree-ssa-sccvn.c (copy_reference_ops_from_ref): Likewise.
	* tree-ssa.c (maybe_rewrite_mem_ref_base): Likewise.
	(non_rewritable_mem_ref_base): Likewise.
	* tree-vect-data-refs.c (vect_check_gather_scatter): Likewise.
	* tree-vrp.c (search_for_addr_array): Likewise.
	* varasm.c (decode_addr_const): Likewise.

Index: gcc/fold-const.h
===================================================================
--- gcc/fold-const.h	2017-10-23 17:18:47.662057360 +0100
+++ gcc/fold-const.h	2017-10-23 17:22:18.228825053 +0100
@@ -114,7 +114,7 @@ extern tree fold_indirect_ref_loc (locat
 extern tree build_simple_mem_ref_loc (location_t, tree);
 #define build_simple_mem_ref(T)\
 	build_simple_mem_ref_loc (UNKNOWN_LOCATION, T)
-extern offset_int mem_ref_offset (const_tree);
+extern poly_offset_int mem_ref_offset (const_tree);
 extern tree build_invariant_address (tree, tree, poly_int64);
 extern tree constant_boolean_node (bool, tree);
 extern tree div_if_zero_remainder (const_tree, const_tree);
Index: gcc/tree.c
===================================================================
--- gcc/tree.c	2017-10-23 17:17:01.436033953 +0100
+++ gcc/tree.c	2017-10-23 17:22:18.236826658 +0100
@@ -4925,10 +4925,11 @@ build_simple_mem_ref_loc (location_t loc
 /* Return the constant offset of a MEM_REF or TARGET_MEM_REF tree T.
    */
-offset_int
+poly_offset_int
 mem_ref_offset (const_tree t)
 {
-  return offset_int::from (wi::to_wide (TREE_OPERAND (t, 1)), SIGNED);
+  return poly_offset_int::from (wi::to_poly_wide (TREE_OPERAND (t, 1)),
+				SIGNED);
 }
 
 /* Return an invariant ADDR_EXPR of type TYPE taking the address of BASE
Index: gcc/builtins.c
===================================================================
--- gcc/builtins.c	2017-10-23 17:18:57.855161317 +0100
+++ gcc/builtins.c	2017-10-23 17:22:18.226824652 +0100
@@ -350,7 +350,7 @@ get_object_alignment_2 (tree exp, unsign
 	  bitpos += ptr_bitpos;
 	  if (TREE_CODE (exp) == MEM_REF
 	      || TREE_CODE (exp) == TARGET_MEM_REF)
-	    bitpos += mem_ref_offset (exp).to_short_addr () * BITS_PER_UNIT;
+	    bitpos += mem_ref_offset (exp).force_shwi () * BITS_PER_UNIT;
 	}
     }
   else if (TREE_CODE (exp) == STRING_CST)
Index: gcc/expr.c
===================================================================
--- gcc/expr.c	2017-10-23 17:20:49.571719793 +0100
+++ gcc/expr.c	2017-10-23 17:22:18.228825053 +0100
@@ -7165,8 +7165,8 @@ get_inner_reference (tree exp, poly_int6
 	  tree off = TREE_OPERAND (exp, 1);
 	  if (!integer_zerop (off))
 	    {
-	      offset_int boff, coff = mem_ref_offset (exp);
-	      boff = coff << LOG2_BITS_PER_UNIT;
+	      poly_offset_int boff = mem_ref_offset (exp);
+	      boff <<= LOG2_BITS_PER_UNIT;
 	      bit_offset += boff;
 	    }
 	  exp = TREE_OPERAND (TREE_OPERAND (exp, 0), 0);
@@ -10255,9 +10255,9 @@ expand_expr_real_1 (tree exp, rtx target
 	 might end up in a register.
	 */
      if (mem_ref_refers_to_non_mem_p (exp))
	{
-	  HOST_WIDE_INT offset = mem_ref_offset (exp).to_short_addr ();
+	  poly_int64 offset = mem_ref_offset (exp).force_shwi ();
	  base = TREE_OPERAND (base, 0);
-	  if (offset == 0
+	  if (known_zero (offset)
	      && !reverse
	      && tree_fits_uhwi_p (TYPE_SIZE (type))
	      && (GET_MODE_BITSIZE (DECL_MODE (base))
Index: gcc/gimple-fold.c
===================================================================
--- gcc/gimple-fold.c	2017-10-23 17:17:01.430034763 +0100
+++ gcc/gimple-fold.c	2017-10-23 17:22:18.228825053 +0100
@@ -6176,7 +6176,7 @@ get_base_constructor (tree base, poly_in
     {
       if (!tree_fits_shwi_p (TREE_OPERAND (base, 1)))
	 return NULL_TREE;
-      *bit_offset += (mem_ref_offset (base).to_short_addr ()
+      *bit_offset += (mem_ref_offset (base).force_shwi ()
		      * BITS_PER_UNIT);
     }
 
Index: gcc/gimple-ssa-strength-reduction.c
===================================================================
--- gcc/gimple-ssa-strength-reduction.c	2017-10-23 17:18:47.663057272 +0100
+++ gcc/gimple-ssa-strength-reduction.c	2017-10-23 17:22:18.229825254 +0100
@@ -970,17 +970,19 @@ restructure_reference (tree *pbase, tree
   widest_int index = *pindex;
   tree mult_op0, t1, t2, type;
   widest_int c1, c2, c3, c4, c5;
+  offset_int mem_offset;
 
   if (!base
       || !offset
       || TREE_CODE (base) != MEM_REF
+      || !mem_ref_offset (base).is_constant (&mem_offset)
       || TREE_CODE (offset) != MULT_EXPR
       || TREE_CODE (TREE_OPERAND (offset, 1)) != INTEGER_CST
       || wi::umod_floor (index, BITS_PER_UNIT) != 0)
     return false;
 
   t1 = TREE_OPERAND (base, 0);
-  c1 = widest_int::from (mem_ref_offset (base), SIGNED);
+  c1 = widest_int::from (mem_offset, SIGNED);
   type = TREE_TYPE (TREE_OPERAND (base, 1));
 
   mult_op0 = TREE_OPERAND (offset, 0);
Index: gcc/ipa-polymorphic-call.c
===================================================================
--- gcc/ipa-polymorphic-call.c	2017-10-23 17:16:59.704267816 +0100
+++ gcc/ipa-polymorphic-call.c	2017-10-23 17:22:18.229825254 +0100
@@ -917,9 +917,11 @@
ipa_polymorphic_call_context::ipa_polymo
	{
	  /* We found dereference of a pointer.  Type of the pointer
	     and MEM_REF is meaningless, but we can look futher.  */
-	  if (TREE_CODE (base) == MEM_REF)
+	  offset_int mem_offset;
+	  if (TREE_CODE (base) == MEM_REF
+	      && mem_ref_offset (base).is_constant (&mem_offset))
	    {
-	      offset_int o = mem_ref_offset (base) * BITS_PER_UNIT;
+	      offset_int o = mem_offset * BITS_PER_UNIT;
	      o += offset;
	      o += offset2;
	      if (!wi::fits_shwi_p (o))
Index: gcc/ipa-prop.c
===================================================================
--- gcc/ipa-prop.c	2017-10-23 17:17:01.431034628 +0100
+++ gcc/ipa-prop.c	2017-10-23 17:22:18.230825454 +0100
@@ -1267,9 +1267,12 @@ compute_complex_assign_jump_func (struct
   if (TREE_CODE (TREE_TYPE (op1)) != RECORD_TYPE)
     return;
   base = get_ref_base_and_extent_hwi (op1, &offset, &size, &reverse);
-  if (!base || TREE_CODE (base) != MEM_REF)
+  offset_int mem_offset;
+  if (!base
+      || TREE_CODE (base) != MEM_REF
+      || !mem_ref_offset (base).is_constant (&mem_offset))
     return;
-  offset += mem_ref_offset (base).to_short_addr () * BITS_PER_UNIT;
+  offset += mem_offset.to_short_addr () * BITS_PER_UNIT;
   ssa = TREE_OPERAND (base, 0);
   if (TREE_CODE (ssa) != SSA_NAME
       || !SSA_NAME_IS_DEFAULT_DEF (ssa)
@@ -1311,7 +1314,10 @@ get_ancestor_addr_info (gimple *assign,
   obj = expr;
   expr = get_ref_base_and_extent_hwi (expr, offset, &size, &reverse);
 
-  if (!expr || TREE_CODE (expr) != MEM_REF)
+  offset_int mem_offset;
+  if (!expr
+      || TREE_CODE (expr) != MEM_REF
+      || !mem_ref_offset (expr).is_constant (&mem_offset))
     return NULL_TREE;
   parm = TREE_OPERAND (expr, 0);
   if (TREE_CODE (parm) != SSA_NAME
@@ -1319,7 +1325,7 @@ get_ancestor_addr_info (gimple *assign,
       || TREE_CODE (SSA_NAME_VAR (parm)) != PARM_DECL)
     return NULL_TREE;
 
-  *offset += mem_ref_offset (expr).to_short_addr () * BITS_PER_UNIT;
+  *offset += mem_offset.to_short_addr () * BITS_PER_UNIT;
   *obj_p = obj;
   return expr;
 }
@@ -4568,7 +4574,7 @@ ipa_get_adjustment_candidate (tree **exp
   if
(TREE_CODE (base) == MEM_REF)
     {
-      offset += mem_ref_offset (base).to_short_addr () * BITS_PER_UNIT;
+      offset += mem_ref_offset (base).force_shwi () * BITS_PER_UNIT;
       base = TREE_OPERAND (base, 0);
     }
Index: gcc/match.pd
===================================================================
--- gcc/match.pd	2017-10-23 17:18:47.664057184 +0100
+++ gcc/match.pd	2017-10-23 17:22:18.230825454 +0100
@@ -3350,12 +3350,12 @@ DEFINE_INT_AND_FLOAT_ROUND_FN (RINT)
     tree base1 = get_addr_base_and_unit_offset (TREE_OPERAND (@1, 0), &off1);
     if (base0 && TREE_CODE (base0) == MEM_REF)
       {
-	off0 += mem_ref_offset (base0).to_short_addr ();
+	off0 += mem_ref_offset (base0).force_shwi ();
	base0 = TREE_OPERAND (base0, 0);
       }
     if (base1 && TREE_CODE (base1) == MEM_REF)
       {
-	off1 += mem_ref_offset (base1).to_short_addr ();
+	off1 += mem_ref_offset (base1).force_shwi ();
	base1 = TREE_OPERAND (base1, 0);
       }
   }
Index: gcc/tree-data-ref.c
===================================================================
--- gcc/tree-data-ref.c	2017-10-23 17:18:47.666057008 +0100
+++ gcc/tree-data-ref.c	2017-10-23 17:22:18.231825655 +0100
@@ -820,16 +820,16 @@ dr_analyze_innermost (innermost_loop_beh
     }
 
   /* Calculate the alignment and misalignment for the inner reference.  */
-  unsigned HOST_WIDE_INT base_misalignment;
-  unsigned int base_alignment;
-  get_object_alignment_1 (base, &base_alignment, &base_misalignment);
+  unsigned HOST_WIDE_INT bit_base_misalignment;
+  unsigned int bit_base_alignment;
+  get_object_alignment_1 (base, &bit_base_alignment, &bit_base_misalignment);
 
   /* There are no bitfield references remaining in BASE, so the values
      we got back must be whole bytes.
     */
-  gcc_assert (base_alignment % BITS_PER_UNIT == 0
-	      && base_misalignment % BITS_PER_UNIT == 0);
-  base_alignment /= BITS_PER_UNIT;
-  base_misalignment /= BITS_PER_UNIT;
+  gcc_assert (bit_base_alignment % BITS_PER_UNIT == 0
+	      && bit_base_misalignment % BITS_PER_UNIT == 0);
+  unsigned int base_alignment = bit_base_alignment / BITS_PER_UNIT;
+  poly_int64 base_misalignment = bit_base_misalignment / BITS_PER_UNIT;
 
   if (TREE_CODE (base) == MEM_REF)
     {
@@ -837,8 +837,8 @@ dr_analyze_innermost (innermost_loop_beh
	{
	  /* Subtract MOFF from the base and add it to POFFSET instead.
	     Adjust the misalignment to reflect the amount we subtracted.  */
-	  offset_int moff = mem_ref_offset (base);
-	  base_misalignment -= moff.to_short_addr ();
+	  poly_offset_int moff = mem_ref_offset (base);
+	  base_misalignment -= moff.force_shwi ();
	  tree mofft = wide_int_to_tree (sizetype, moff);
	  if (!poffset)
	    poffset = mofft;
@@ -925,8 +925,14 @@ dr_analyze_innermost (innermost_loop_beh
   drb->offset = fold_convert (ssizetype, offset_iv.base);
   drb->init = init;
   drb->step = step;
-  drb->base_alignment = base_alignment;
-  drb->base_misalignment = base_misalignment & (base_alignment - 1);
+  if (known_misalignment (base_misalignment, base_alignment,
+			  &drb->base_misalignment))
+    drb->base_alignment = base_alignment;
+  else
+    {
+      drb->base_alignment = known_alignment (base_misalignment);
+      drb->base_misalignment = 0;
+    }
   drb->offset_alignment = highest_pow2_factor (offset_iv.base);
   drb->step_alignment = highest_pow2_factor (step);
Index: gcc/tree-dfa.c
===================================================================
--- gcc/tree-dfa.c	2017-10-23 17:17:01.432034493 +0100
+++ gcc/tree-dfa.c	2017-10-23 17:22:18.231825655 +0100
@@ -797,8 +797,8 @@ get_addr_base_and_unit_offset_1 (tree ex
	{
	  if (!integer_zerop (TREE_OPERAND (exp, 1)))
	    {
-	      offset_int off = mem_ref_offset (exp);
-	      byte_offset += off.to_short_addr ();
+	      poly_offset_int off = mem_ref_offset (exp);
+	      byte_offset += off.force_shwi ();
	    }
	  exp =
TREE_OPERAND (base, 0);
	}
@@ -819,8 +819,8 @@ get_addr_base_and_unit_offset_1 (tree ex
	    return NULL_TREE;
	  if (!integer_zerop (TMR_OFFSET (exp)))
	    {
-	      offset_int off = mem_ref_offset (exp);
-	      byte_offset += off.to_short_addr ();
+	      poly_offset_int off = mem_ref_offset (exp);
+	      byte_offset += off.force_shwi ();
	    }
	  exp = TREE_OPERAND (base, 0);
	}
Index: gcc/tree-eh.c
===================================================================
--- gcc/tree-eh.c	2017-10-23 16:52:17.994460559 +0100
+++ gcc/tree-eh.c	2017-10-23 17:22:18.231825655 +0100
@@ -2658,14 +2658,15 @@ tree_could_trap_p (tree expr)
       if (TREE_CODE (TREE_OPERAND (expr, 0)) == ADDR_EXPR)
	{
	  tree base = TREE_OPERAND (TREE_OPERAND (expr, 0), 0);
-	  offset_int off = mem_ref_offset (expr);
-	  if (wi::neg_p (off, SIGNED))
+	  poly_offset_int off = mem_ref_offset (expr);
+	  if (may_lt (off, 0))
	    return true;
	  if (TREE_CODE (base) == STRING_CST)
-	    return wi::leu_p (TREE_STRING_LENGTH (base), off);
-	  else if (DECL_SIZE_UNIT (base) == NULL_TREE
-		   || TREE_CODE (DECL_SIZE_UNIT (base)) != INTEGER_CST
-		   || wi::leu_p (wi::to_offset (DECL_SIZE_UNIT (base)), off))
+	    return may_le (TREE_STRING_LENGTH (base), off);
+	  tree size = DECL_SIZE_UNIT (base);
+	  if (size == NULL_TREE
+	      || !poly_int_tree_p (size)
+	      || may_le (wi::to_poly_offset (size), off))
	    return true;
	  /* Now we are sure the first byte of the access is inside
	     the object.
	     */
Index: gcc/tree-object-size.c
===================================================================
--- gcc/tree-object-size.c	2017-10-23 16:52:17.994460559 +0100
+++ gcc/tree-object-size.c	2017-10-23 17:22:18.232825856 +0100
@@ -210,11 +210,17 @@ addr_object_size (struct object_size_inf
	}
      if (sz != unknown[object_size_type])
	{
-	  offset_int dsz = wi::sub (sz, mem_ref_offset (pt_var));
-	  if (wi::neg_p (dsz))
-	    sz = 0;
-	  else if (wi::fits_uhwi_p (dsz))
-	    sz = dsz.to_uhwi ();
+	  offset_int mem_offset;
+	  if (mem_ref_offset (pt_var).is_constant (&mem_offset))
+	    {
+	      offset_int dsz = wi::sub (sz, mem_offset);
+	      if (wi::neg_p (dsz))
+		sz = 0;
+	      else if (wi::fits_uhwi_p (dsz))
+		sz = dsz.to_uhwi ();
+	      else
+		sz = unknown[object_size_type];
+	    }
	  else
	    sz = unknown[object_size_type];
	}
Index: gcc/tree-ssa-address.c
===================================================================
--- gcc/tree-ssa-address.c	2017-10-23 17:17:03.207794688 +0100
+++ gcc/tree-ssa-address.c	2017-10-23 17:22:18.232825856 +0100
@@ -1008,8 +1008,8 @@ copy_ref_info (tree new_ref, tree old_re
		      && (TREE_INT_CST_LOW (TMR_STEP (new_ref))
			  < align)))))
	{
-	  unsigned int inc = (mem_ref_offset (old_ref).to_short_addr ()
-			      - mem_ref_offset (new_ref).to_short_addr ());
+	  poly_uint64 inc = (mem_ref_offset (old_ref)
+			     - mem_ref_offset (new_ref)).force_uhwi ();
	  adjust_ptr_info_misalignment (new_pi, inc);
	}
      else
Index: gcc/tree-ssa-alias.c
===================================================================
--- gcc/tree-ssa-alias.c	2017-10-23 17:17:01.433034358 +0100
+++ gcc/tree-ssa-alias.c	2017-10-23 17:22:18.232825856 +0100
@@ -1143,7 +1143,7 @@ indirect_ref_may_alias_decl_p (tree ref1
	      && DECL_P (base2));
 
   ptr1 = TREE_OPERAND (base1, 0);
-  offset_int moff = mem_ref_offset (base1) << LOG2_BITS_PER_UNIT;
+  poly_offset_int moff = mem_ref_offset (base1) << LOG2_BITS_PER_UNIT;
 
   /* If only one reference is based on a variable, they cannot alias if
      the pointer access is beyond the extent of the variable access.
@@ -1299,8 +1299,8 @@ indirect_refs_may_alias_p (tree ref1 ATT
	       && operand_equal_p (TMR_INDEX2 (base1),
				   TMR_INDEX2 (base2), 0))))))
    {
-      offset_int moff1 = mem_ref_offset (base1) << LOG2_BITS_PER_UNIT;
-      offset_int moff2 = mem_ref_offset (base2) << LOG2_BITS_PER_UNIT;
+      poly_offset_int moff1 = mem_ref_offset (base1) << LOG2_BITS_PER_UNIT;
+      poly_offset_int moff2 = mem_ref_offset (base2) << LOG2_BITS_PER_UNIT;
      return ranges_may_overlap_p (offset1 + moff1, max_size1,
				   offset2 + moff2, max_size2);
    }
Index: gcc/tree-ssa-sccvn.c
===================================================================
--- gcc/tree-ssa-sccvn.c	2017-10-23 17:20:50.884679814 +0100
+++ gcc/tree-ssa-sccvn.c	2017-10-23 17:22:18.233826056 +0100
@@ -753,11 +753,8 @@ copy_reference_ops_from_ref (tree ref, v
	case MEM_REF:
	  /* The base address gets its own vn_reference_op_s structure.  */
	  temp.op0 = TREE_OPERAND (ref, 1);
-	  {
-	    offset_int off = mem_ref_offset (ref);
-	    if (wi::fits_shwi_p (off))
-	      temp.off = off.to_shwi ();
-	  }
+	  if (!mem_ref_offset (ref).to_shwi (&temp.off))
+	    temp.off = -1;
	  temp.clique = MR_DEPENDENCE_CLIQUE (ref);
	  temp.base = MR_DEPENDENCE_BASE (ref);
	  temp.reverse = REF_REVERSE_STORAGE_ORDER (ref);
Index: gcc/tree-ssa.c
===================================================================
--- gcc/tree-ssa.c	2017-10-23 16:52:17.994460559 +0100
+++ gcc/tree-ssa.c	2017-10-23 17:22:18.233826056 +0100
@@ -1379,10 +1379,10 @@ maybe_rewrite_mem_ref_base (tree *tp, bi
	}
      else if (DECL_SIZE (sym)
	       && TREE_CODE (DECL_SIZE (sym)) == INTEGER_CST
-	       && mem_ref_offset (*tp) >= 0
-	       && wi::leu_p (mem_ref_offset (*tp)
-			     + wi::to_offset (TYPE_SIZE_UNIT (TREE_TYPE (*tp))),
-			     wi::to_offset (DECL_SIZE_UNIT (sym)))
+	       && (known_subrange_p
+		   (mem_ref_offset (*tp),
+		    wi::to_offset (TYPE_SIZE_UNIT (TREE_TYPE (*tp))),
+		    0, wi::to_offset (DECL_SIZE_UNIT (sym))))
	       && (!
INTEGRAL_TYPE_P (TREE_TYPE (*tp))
		   || (wi::to_offset (TYPE_SIZE (TREE_TYPE (*tp)))
		       == TYPE_PRECISION (TREE_TYPE (*tp))))
@@ -1433,9 +1433,8 @@ non_rewritable_mem_ref_base (tree ref)
	   || TREE_CODE (TREE_TYPE (decl)) == COMPLEX_TYPE)
	  && useless_type_conversion_p (TREE_TYPE (base),
					TREE_TYPE (TREE_TYPE (decl)))
-	  && wi::fits_uhwi_p (mem_ref_offset (base))
-	  && wi::gtu_p (wi::to_offset (TYPE_SIZE_UNIT (TREE_TYPE (decl))),
-			mem_ref_offset (base))
+	  && must_gt (wi::to_poly_offset (TYPE_SIZE_UNIT (TREE_TYPE (decl))),
+		      mem_ref_offset (base))
	  && multiple_of_p (sizetype, TREE_OPERAND (base, 1),
			    TYPE_SIZE_UNIT (TREE_TYPE (base))))
	return NULL_TREE;
@@ -1445,11 +1444,10 @@ non_rewritable_mem_ref_base (tree ref)
	return NULL_TREE;
      /* For integral typed extracts we can use a BIT_FIELD_REF.  */
      if (DECL_SIZE (decl)
-	  && TREE_CODE (DECL_SIZE (decl)) == INTEGER_CST
-	  && mem_ref_offset (base) >= 0
-	  && wi::leu_p (mem_ref_offset (base)
-			+ wi::to_offset (TYPE_SIZE_UNIT (TREE_TYPE (base))),
-			wi::to_offset (DECL_SIZE_UNIT (decl)))
+	  && (known_subrange_p
+	      (mem_ref_offset (base),
+	       wi::to_poly_offset (TYPE_SIZE_UNIT (TREE_TYPE (base))),
+	       0, wi::to_poly_offset (DECL_SIZE_UNIT (decl))))
	  /* ???
	     We can't handle bitfield precision extracts without either
	     using an alternate type for the BIT_FIELD_REF and then doing
	     a conversion or possibly adjusting the offset
Index: gcc/tree-vect-data-refs.c
===================================================================
--- gcc/tree-vect-data-refs.c	2017-10-23 17:18:47.668056833 +0100
+++ gcc/tree-vect-data-refs.c	2017-10-23 17:22:18.234826257 +0100
@@ -3265,10 +3265,7 @@ vect_check_gather_scatter (gimple *stmt,
      if (!integer_zerop (TREE_OPERAND (base, 1)))
	{
	  if (off == NULL_TREE)
-	    {
-	      offset_int moff = mem_ref_offset (base);
-	      off = wide_int_to_tree (sizetype, moff);
-	    }
+	    off = wide_int_to_tree (sizetype, mem_ref_offset (base));
	  else
	    off = size_binop (PLUS_EXPR, off,
			      fold_convert (sizetype, TREE_OPERAND (base, 1)));
Index: gcc/tree-vrp.c
===================================================================
--- gcc/tree-vrp.c	2017-10-23 17:11:40.251958611 +0100
+++ gcc/tree-vrp.c	2017-10-23 17:22:18.235826458 +0100
@@ -6808,7 +6808,9 @@ search_for_addr_array (tree t, location_
      || TREE_CODE (el_sz) != INTEGER_CST)
    return;
 
-  idx = mem_ref_offset (t);
+  if (!mem_ref_offset (t).is_constant (&idx))
+    return;
+
  idx = wi::sdiv_trunc (idx, wi::to_offset (el_sz));
  if (idx < 0)
    {
Index: gcc/varasm.c
===================================================================
--- gcc/varasm.c	2017-10-23 17:20:52.530629696 +0100
+++ gcc/varasm.c	2017-10-23 17:22:18.236826658 +0100
@@ -2903,7 +2903,7 @@ decode_addr_const (tree exp, struct addr
  else if (TREE_CODE (target) == MEM_REF
	   && TREE_CODE (TREE_OPERAND (target, 0)) == ADDR_EXPR)
    {
-      offset += mem_ref_offset (target).to_short_addr ();
+      offset += mem_ref_offset (target).force_shwi ();
      target = TREE_OPERAND (TREE_OPERAND (target, 0), 0);
    }
  else if (TREE_CODE (target) == INDIRECT_REF