From: Richard Sandiford
To: gcc-patches@gcc.gnu.org
Mail-Followup-To: gcc-patches@gcc.gnu.org, richard.sandiford@linaro.org
Subject: [029/nnn] poly_int: get_ref_base_and_extent
References: <871sltvm7r.fsf@linaro.org>
Date: Mon, 23 Oct 2017 17:12:00 -0000
In-Reply-To: <871sltvm7r.fsf@linaro.org> (Richard Sandiford's message of
 "Mon, 23 Oct 2017 17:54:32 +0100")
Message-ID: <87efptpz54.fsf@linaro.org>
X-SW-Source: 2017-10/txt/msg01530.txt.bz2

This patch changes the types of the bit offsets and sizes returned by
get_ref_base_and_extent to poly_int64.

There are some callers that can't sensibly operate on polynomial offsets
or handle cases where the offset and size aren't known exactly.  This
includes the IPA devirtualisation code (since there's no defined way of
having vtables at variable offsets) and some parts of the DWARF code.
The patch therefore adds a helper function get_ref_base_and_extent_hwi
that either returns exact HOST_WIDE_INT bit positions and sizes or
returns a null base to indicate failure.


2017-10-23  Richard Sandiford
	    Alan Hayward
	    David Sherwood

gcc/
	* tree-dfa.h (get_ref_base_and_extent): Return the base, size and
	max_size as poly_int64_pods rather than HOST_WIDE_INTs.
	(get_ref_base_and_extent_hwi): Declare.
	* tree-dfa.c (get_ref_base_and_extent): Return the base, size and
	max_size as poly_int64_pods rather than HOST_WIDE_INTs.
	(get_ref_base_and_extent_hwi): New function.
	* cfgexpand.c (expand_debug_expr): Update call to
	get_ref_base_and_extent.
	* dwarf2out.c (add_var_loc_to_decl): Likewise.
	* gimple-fold.c (get_base_constructor): Return the offset as a
	poly_int64_pod rather than a HOST_WIDE_INT.
	(fold_const_aggregate_ref_1): Track polynomial sizes and offsets.
	* ipa-polymorphic-call.c
	(ipa_polymorphic_call_context::set_by_invariant)
	(extr_type_from_vtbl_ptr_store): Track polynomial offsets.
	(ipa_polymorphic_call_context::ipa_polymorphic_call_context)
	(check_stmt_for_type_change): Use get_ref_base_and_extent_hwi
	rather than get_ref_base_and_extent.
	(ipa_polymorphic_call_context::get_dynamic_type): Likewise.
	* ipa-prop.c (ipa_load_from_parm_agg, compute_complex_assign_jump_func)
	(get_ancestor_addr_info, determine_locally_known_aggregate_parts):
	Likewise.
	(ipa_get_adjustment_candidate): Update call to
	get_ref_base_and_extent.
	* tree-sra.c (create_access, get_access_for_expr): Likewise.
	* tree-ssa-alias.c (ao_ref_base, aliasing_component_refs_p)
	(stmt_kills_ref_p): Likewise.
	* tree-ssa-dce.c (mark_aliased_reaching_defs_necessary_1): Likewise.
	* tree-ssa-scopedtables.c (avail_expr_hash, equal_mem_array_ref_p):
	Likewise.
	* tree-ssa-sccvn.c (vn_reference_lookup_3): Likewise.  Use
	get_ref_base_and_extent_hwi rather than get_ref_base_and_extent
	when calling native_encode_expr.
	* tree-ssa-structalias.c (get_constraint_for_component_ref): Update
	call to get_ref_base_and_extent.
	(do_structure_copy): Use get_ref_base_and_extent_hwi rather
	than get_ref_base_and_extent.
	* var-tracking.c (track_expr_p): Likewise.
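For readers unfamiliar with poly_int, the contract of the new helper can be
sketched in ordinary standalone C++.  This is not GCC code: poly_int64 is
stubbed here as a two-coefficient value a + b*N (N being a runtime quantity
such as the SVE vector length), and the names poly64, extent and extent_hwi
are invented for illustration only.

```cpp
#include <cstdint>

// Minimal stand-in for poly_int64: represents a + b * N for a runtime N.
struct poly64
{
  int64_t a = 0, b = 0;
  // The value is a compile-time constant iff the N coefficient is zero.
  bool is_constant (int64_t *out) const
  {
    if (b != 0)
      return false;
    *out = a;
    return true;
  }
};

// What a get_ref_base_and_extent-style analysis might hand back.
struct extent
{
  poly64 offset, size, max_size;
  bool known_base;   // stands in for a non-null base tree
};

// Mirrors the punting logic of get_ref_base_and_extent_hwi: succeed only
// when the offset and size are exact HOST_WIDE_INT constants, the offset
// is non-negative, and max_size is known and equal to size.
bool
extent_hwi (const extent &e, int64_t *poffset, int64_t *psize)
{
  int64_t off, sz, max;
  if (!e.known_base
      || !e.offset.is_constant (&off)
      || !e.size.is_constant (&sz)
      || off < 0
      || !e.max_size.is_constant (&max)  // polynomial max_size: punt
      || max < 0                         // unknown (-1) max_size: punt
      || max != sz)                      // variable-position access: punt
    return false;
  *poffset = off;
  *psize = sz;
  return true;
}
```

Callers such as the devirtualisation code then test only the returned base
for null instead of separately checking size == max_size and offset >= 0.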
Index: gcc/tree-dfa.h
===================================================================
--- gcc/tree-dfa.h	2017-10-23 17:07:40.909726192 +0100
+++ gcc/tree-dfa.h	2017-10-23 17:16:59.705267681 +0100
@@ -29,8 +29,10 @@ extern void debug_dfa_stats (void);
 extern tree ssa_default_def (struct function *, tree);
 extern void set_ssa_default_def (struct function *, tree, tree);
 extern tree get_or_create_ssa_default_def (struct function *, tree);
-extern tree get_ref_base_and_extent (tree, HOST_WIDE_INT *,
-				     HOST_WIDE_INT *, HOST_WIDE_INT *, bool *);
+extern tree get_ref_base_and_extent (tree, poly_int64_pod *, poly_int64_pod *,
+				     poly_int64_pod *, bool *);
+extern tree get_ref_base_and_extent_hwi (tree, HOST_WIDE_INT *,
+					 HOST_WIDE_INT *, bool *);
 extern tree get_addr_base_and_unit_offset_1 (tree, HOST_WIDE_INT *,
					      tree (*) (tree));
 extern tree get_addr_base_and_unit_offset (tree, HOST_WIDE_INT *);
Index: gcc/tree-dfa.c
===================================================================
--- gcc/tree-dfa.c	2017-10-23 17:07:40.909726192 +0100
+++ gcc/tree-dfa.c	2017-10-23 17:16:59.705267681 +0100
@@ -377,15 +377,15 @@ get_or_create_ssa_default_def (struct fu
    true, the storage order of the reference is reversed.
*/ tree -get_ref_base_and_extent (tree exp, HOST_WIDE_INT *poffset, - HOST_WIDE_INT *psize, - HOST_WIDE_INT *pmax_size, +get_ref_base_and_extent (tree exp, poly_int64_pod *poffset, + poly_int64_pod *psize, + poly_int64_pod *pmax_size, bool *preverse) { - offset_int bitsize = -1; - offset_int maxsize; + poly_offset_int bitsize = -1; + poly_offset_int maxsize; tree size_tree = NULL_TREE; - offset_int bit_offset = 0; + poly_offset_int bit_offset = 0; bool seen_variable_array_ref = false; /* First get the final access size and the storage order from just the @@ -400,11 +400,11 @@ get_ref_base_and_extent (tree exp, HOST_ if (mode == BLKmode) size_tree = TYPE_SIZE (TREE_TYPE (exp)); else - bitsize = int (GET_MODE_BITSIZE (mode)); + bitsize = GET_MODE_BITSIZE (mode); } if (size_tree != NULL_TREE - && TREE_CODE (size_tree) == INTEGER_CST) - bitsize = wi::to_offset (size_tree); + && poly_int_tree_p (size_tree)) + bitsize = wi::to_poly_offset (size_tree); *preverse = reverse_storage_order_for_component_p (exp); @@ -419,7 +419,7 @@ get_ref_base_and_extent (tree exp, HOST_ switch (TREE_CODE (exp)) { case BIT_FIELD_REF: - bit_offset += wi::to_offset (TREE_OPERAND (exp, 2)); + bit_offset += wi::to_poly_offset (TREE_OPERAND (exp, 2)); break; case COMPONENT_REF: @@ -427,10 +427,10 @@ get_ref_base_and_extent (tree exp, HOST_ tree field = TREE_OPERAND (exp, 1); tree this_offset = component_ref_field_offset (exp); - if (this_offset && TREE_CODE (this_offset) == INTEGER_CST) + if (this_offset && poly_int_tree_p (this_offset)) { - offset_int woffset = (wi::to_offset (this_offset) - << LOG2_BITS_PER_UNIT); + poly_offset_int woffset = (wi::to_poly_offset (this_offset) + << LOG2_BITS_PER_UNIT); woffset += wi::to_offset (DECL_FIELD_BIT_OFFSET (field)); bit_offset += woffset; @@ -438,7 +438,7 @@ get_ref_base_and_extent (tree exp, HOST_ referenced the last field of a struct or a union member then we have to adjust maxsize by the padding at the end of our field. 
*/ - if (seen_variable_array_ref && maxsize != -1) + if (seen_variable_array_ref && known_size_p (maxsize)) { tree stype = TREE_TYPE (TREE_OPERAND (exp, 0)); tree next = DECL_CHAIN (field); @@ -450,14 +450,15 @@ get_ref_base_and_extent (tree exp, HOST_ tree fsize = DECL_SIZE_UNIT (field); tree ssize = TYPE_SIZE_UNIT (stype); if (fsize == NULL - || TREE_CODE (fsize) != INTEGER_CST + || !poly_int_tree_p (fsize) || ssize == NULL - || TREE_CODE (ssize) != INTEGER_CST) + || !poly_int_tree_p (ssize)) maxsize = -1; else { - offset_int tem = (wi::to_offset (ssize) - - wi::to_offset (fsize)); + poly_offset_int tem + = (wi::to_poly_offset (ssize) + - wi::to_poly_offset (fsize)); tem <<= LOG2_BITS_PER_UNIT; tem -= woffset; maxsize += tem; @@ -471,10 +472,10 @@ get_ref_base_and_extent (tree exp, HOST_ /* We need to adjust maxsize to the whole structure bitsize. But we can subtract any constant offset seen so far, because that would get us out of the structure otherwise. */ - if (maxsize != -1 + if (known_size_p (maxsize) && csize - && TREE_CODE (csize) == INTEGER_CST) - maxsize = wi::to_offset (csize) - bit_offset; + && poly_int_tree_p (csize)) + maxsize = wi::to_poly_offset (csize) - bit_offset; else maxsize = -1; } @@ -488,14 +489,15 @@ get_ref_base_and_extent (tree exp, HOST_ tree low_bound, unit_size; /* If the resulting bit-offset is constant, track it. 
*/ - if (TREE_CODE (index) == INTEGER_CST + if (poly_int_tree_p (index) && (low_bound = array_ref_low_bound (exp), - TREE_CODE (low_bound) == INTEGER_CST) + poly_int_tree_p (low_bound)) && (unit_size = array_ref_element_size (exp), TREE_CODE (unit_size) == INTEGER_CST)) { - offset_int woffset - = wi::sext (wi::to_offset (index) - wi::to_offset (low_bound), + poly_offset_int woffset + = wi::sext (wi::to_poly_offset (index) + - wi::to_poly_offset (low_bound), TYPE_PRECISION (TREE_TYPE (index))); woffset *= wi::to_offset (unit_size); woffset <<= LOG2_BITS_PER_UNIT; @@ -512,10 +514,10 @@ get_ref_base_and_extent (tree exp, HOST_ /* We need to adjust maxsize to the whole array bitsize. But we can subtract any constant offset seen so far, because that would get us outside of the array otherwise. */ - if (maxsize != -1 + if (known_size_p (maxsize) && asize - && TREE_CODE (asize) == INTEGER_CST) - maxsize = wi::to_offset (asize) - bit_offset; + && poly_int_tree_p (asize)) + maxsize = wi::to_poly_offset (asize) - bit_offset; else maxsize = -1; @@ -560,11 +562,11 @@ get_ref_base_and_extent (tree exp, HOST_ base type boundary. This needs to include possible trailing padding that is there for alignment purposes. */ if (seen_variable_array_ref - && maxsize != -1 + && known_size_p (maxsize) && (TYPE_SIZE (TREE_TYPE (exp)) == NULL_TREE - || TREE_CODE (TYPE_SIZE (TREE_TYPE (exp))) != INTEGER_CST - || (bit_offset + maxsize - == wi::to_offset (TYPE_SIZE (TREE_TYPE (exp)))))) + || !poly_int_tree_p (TYPE_SIZE (TREE_TYPE (exp))) + || may_eq (bit_offset + maxsize, + wi::to_poly_offset (TYPE_SIZE (TREE_TYPE (exp)))))) maxsize = -1; /* Hand back the decl for MEM[&decl, off]. 
*/ @@ -574,12 +576,13 @@ get_ref_base_and_extent (tree exp, HOST_ exp = TREE_OPERAND (TREE_OPERAND (exp, 0), 0); else { - offset_int off = mem_ref_offset (exp); + poly_offset_int off = mem_ref_offset (exp); off <<= LOG2_BITS_PER_UNIT; off += bit_offset; - if (wi::fits_shwi_p (off)) + poly_int64 off_hwi; + if (off.to_shwi (&off_hwi)) { - bit_offset = off; + bit_offset = off_hwi; exp = TREE_OPERAND (TREE_OPERAND (exp, 0), 0); } } @@ -594,7 +597,7 @@ get_ref_base_and_extent (tree exp, HOST_ } done: - if (!wi::fits_shwi_p (bitsize) || wi::neg_p (bitsize)) + if (!bitsize.to_shwi (psize) || may_lt (*psize, 0)) { *poffset = 0; *psize = -1; @@ -603,9 +606,10 @@ get_ref_base_and_extent (tree exp, HOST_ return exp; } - *psize = bitsize.to_shwi (); - - if (!wi::fits_shwi_p (bit_offset)) + /* ??? Due to negative offsets in ARRAY_REF we can end up with + negative bit_offset here. We might want to store a zero offset + in this case. */ + if (!bit_offset.to_shwi (poffset)) { *poffset = 0; *pmax_size = -1; @@ -625,44 +629,37 @@ get_ref_base_and_extent (tree exp, HOST_ if (TREE_CODE (TREE_TYPE (exp)) == ARRAY_TYPE || (seen_variable_array_ref && (sz_tree == NULL_TREE - || TREE_CODE (sz_tree) != INTEGER_CST - || (bit_offset + maxsize == wi::to_offset (sz_tree))))) + || !poly_int_tree_p (sz_tree) + || may_eq (bit_offset + maxsize, + wi::to_poly_offset (sz_tree))))) maxsize = -1; } /* If maxsize is unknown adjust it according to the size of the base decl. */ - else if (maxsize == -1 - && DECL_SIZE (exp) - && TREE_CODE (DECL_SIZE (exp)) == INTEGER_CST) - maxsize = wi::to_offset (DECL_SIZE (exp)) - bit_offset; + else if (!known_size_p (maxsize) + && DECL_SIZE (exp) + && poly_int_tree_p (DECL_SIZE (exp))) + maxsize = wi::to_poly_offset (DECL_SIZE (exp)) - bit_offset; } else if (CONSTANT_CLASS_P (exp)) { /* If maxsize is unknown adjust it according to the size of the base type constant. 
*/ - if (maxsize == -1 + if (!known_size_p (maxsize) && TYPE_SIZE (TREE_TYPE (exp)) - && TREE_CODE (TYPE_SIZE (TREE_TYPE (exp))) == INTEGER_CST) - maxsize = (wi::to_offset (TYPE_SIZE (TREE_TYPE (exp))) + && poly_int_tree_p (TYPE_SIZE (TREE_TYPE (exp)))) + maxsize = (wi::to_poly_offset (TYPE_SIZE (TREE_TYPE (exp))) - bit_offset); } - /* ??? Due to negative offsets in ARRAY_REF we can end up with - negative bit_offset here. We might want to store a zero offset - in this case. */ - *poffset = bit_offset.to_shwi (); - if (!wi::fits_shwi_p (maxsize) || wi::neg_p (maxsize)) + if (!maxsize.to_shwi (pmax_size) + || may_lt (*pmax_size, 0) + || !endpoint_representable_p (*poffset, *pmax_size)) *pmax_size = -1; - else - { - *pmax_size = maxsize.to_shwi (); - if (*poffset > HOST_WIDE_INT_MAX - *pmax_size) - *pmax_size = -1; - } /* Punt if *POFFSET + *PSIZE overflows in HOST_WIDE_INT, the callers don't check for such overflows individually and assume it works. */ - if (*psize != -1 && *poffset > HOST_WIDE_INT_MAX - *psize) + if (!endpoint_representable_p (*poffset, *psize)) { *poffset = 0; *psize = -1; @@ -674,6 +671,32 @@ get_ref_base_and_extent (tree exp, HOST_ return exp; } +/* Like get_ref_base_and_extent, but for cases in which we only care + about constant-width accesses at constant offsets. Return null + if the access is anything else. 
*/ + +tree +get_ref_base_and_extent_hwi (tree exp, HOST_WIDE_INT *poffset, + HOST_WIDE_INT *psize, bool *preverse) +{ + poly_int64 offset, size, max_size; + HOST_WIDE_INT const_offset, const_size; + bool reverse; + tree decl = get_ref_base_and_extent (exp, &offset, &size, &max_size, + &reverse); + if (!offset.is_constant (&const_offset) + || !size.is_constant (&const_size) + || const_offset < 0 + || !known_size_p (max_size) + || may_ne (max_size, const_size)) + return NULL_TREE; + + *poffset = const_offset; + *psize = const_size; + *preverse = reverse; + return decl; +} + /* Returns the base object and a constant BITS_PER_UNIT offset in *POFFSET that denotes the starting address of the memory access EXP. Returns NULL_TREE if the offset is not constant or any component Index: gcc/cfgexpand.c =================================================================== --- gcc/cfgexpand.c 2017-10-23 17:11:40.240937549 +0100 +++ gcc/cfgexpand.c 2017-10-23 17:16:59.700268356 +0100 @@ -4878,7 +4878,7 @@ expand_debug_expr (tree exp) if (handled_component_p (TREE_OPERAND (exp, 0))) { - HOST_WIDE_INT bitoffset, bitsize, maxsize; + poly_int64 bitoffset, bitsize, maxsize, byteoffset; bool reverse; tree decl = get_ref_base_and_extent (TREE_OPERAND (exp, 0), &bitoffset, @@ -4888,12 +4888,12 @@ expand_debug_expr (tree exp) || TREE_CODE (decl) == RESULT_DECL) && (!TREE_ADDRESSABLE (decl) || target_for_debug_bind (decl)) - && (bitoffset % BITS_PER_UNIT) == 0 - && bitsize > 0 - && bitsize == maxsize) + && multiple_p (bitoffset, BITS_PER_UNIT, &byteoffset) + && must_gt (bitsize, 0) + && must_eq (bitsize, maxsize)) { rtx base = gen_rtx_DEBUG_IMPLICIT_PTR (mode, decl); - return plus_constant (mode, base, bitoffset / BITS_PER_UNIT); + return plus_constant (mode, base, byteoffset); } } Index: gcc/dwarf2out.c =================================================================== --- gcc/dwarf2out.c 2017-10-23 17:16:57.210604569 +0100 +++ gcc/dwarf2out.c 2017-10-23 17:16:59.703267951 +0100 @@ 
-5914,17 +5914,15 @@ add_var_loc_to_decl (tree decl, rtx loc_ || (TREE_CODE (realdecl) == MEM_REF && TREE_CODE (TREE_OPERAND (realdecl, 0)) == ADDR_EXPR)) { - HOST_WIDE_INT maxsize; bool reverse; - tree innerdecl - = get_ref_base_and_extent (realdecl, &bitpos, &bitsize, &maxsize, - &reverse); - if (!DECL_P (innerdecl) + tree innerdecl = get_ref_base_and_extent_hwi (realdecl, &bitpos, + &bitsize, &reverse); + if (!innerdecl + || !DECL_P (innerdecl) || DECL_IGNORED_P (innerdecl) || TREE_STATIC (innerdecl) - || bitsize <= 0 - || bitpos + bitsize > 256 - || bitsize != maxsize) + || bitsize == 0 + || bitpos + bitsize > 256) return NULL; decl = innerdecl; } Index: gcc/gimple-fold.c =================================================================== --- gcc/gimple-fold.c 2017-10-23 17:11:40.321090726 +0100 +++ gcc/gimple-fold.c 2017-10-23 17:16:59.703267951 +0100 @@ -6171,10 +6171,10 @@ gimple_fold_stmt_to_constant (gimple *st is not explicitly available, but it is known to be zero such as 'static const int a;'. 
*/ static tree -get_base_constructor (tree base, HOST_WIDE_INT *bit_offset, +get_base_constructor (tree base, poly_int64_pod *bit_offset, tree (*valueize)(tree)) { - HOST_WIDE_INT bit_offset2, size, max_size; + poly_int64 bit_offset2, size, max_size; bool reverse; if (TREE_CODE (base) == MEM_REF) @@ -6226,7 +6226,7 @@ get_base_constructor (tree base, HOST_WI case COMPONENT_REF: base = get_ref_base_and_extent (base, &bit_offset2, &size, &max_size, &reverse); - if (max_size == -1 || size != max_size) + if (!known_size_p (max_size) || may_ne (size, max_size)) return NULL_TREE; *bit_offset += bit_offset2; return get_base_constructor (base, bit_offset, valueize); @@ -6437,7 +6437,7 @@ fold_ctor_reference (tree type, tree cto fold_const_aggregate_ref_1 (tree t, tree (*valueize) (tree)) { tree ctor, idx, base; - HOST_WIDE_INT offset, size, max_size; + poly_int64 offset, size, max_size; tree tem; bool reverse; @@ -6463,23 +6463,23 @@ fold_const_aggregate_ref_1 (tree t, tree if (TREE_CODE (TREE_OPERAND (t, 1)) == SSA_NAME && valueize && (idx = (*valueize) (TREE_OPERAND (t, 1))) - && TREE_CODE (idx) == INTEGER_CST) + && poly_int_tree_p (idx)) { tree low_bound, unit_size; /* If the resulting bit-offset is constant, track it. */ if ((low_bound = array_ref_low_bound (t), - TREE_CODE (low_bound) == INTEGER_CST) + poly_int_tree_p (low_bound)) && (unit_size = array_ref_element_size (t), tree_fits_uhwi_p (unit_size))) { - offset_int woffset - = wi::sext (wi::to_offset (idx) - wi::to_offset (low_bound), + poly_offset_int woffset + = wi::sext (wi::to_poly_offset (idx) + - wi::to_poly_offset (low_bound), TYPE_PRECISION (TREE_TYPE (idx))); - if (wi::fits_shwi_p (woffset)) + if (woffset.to_shwi (&offset)) { - offset = woffset.to_shwi (); /* TODO: This code seems wrong, multiply then check to see if it fits. */ offset *= tree_to_uhwi (unit_size); @@ -6492,7 +6492,7 @@ fold_const_aggregate_ref_1 (tree t, tree return build_zero_cst (TREE_TYPE (t)); /* Out of bound array access. 
Value is undefined, but don't fold. */ - if (offset < 0) + if (may_lt (offset, 0)) return NULL_TREE; /* We can not determine ctor. */ if (!ctor) @@ -6517,14 +6517,14 @@ fold_const_aggregate_ref_1 (tree t, tree if (ctor == error_mark_node) return build_zero_cst (TREE_TYPE (t)); /* We do not know precise address. */ - if (max_size == -1 || max_size != size) + if (!known_size_p (max_size) || may_ne (max_size, size)) return NULL_TREE; /* We can not determine ctor. */ if (!ctor) return NULL_TREE; /* Out of bound array access. Value is undefined, but don't fold. */ - if (offset < 0) + if (may_lt (offset, 0)) return NULL_TREE; return fold_ctor_reference (TREE_TYPE (t), ctor, offset, size, Index: gcc/ipa-polymorphic-call.c =================================================================== --- gcc/ipa-polymorphic-call.c 2017-10-23 17:07:40.909726192 +0100 +++ gcc/ipa-polymorphic-call.c 2017-10-23 17:16:59.704267816 +0100 @@ -759,7 +759,7 @@ ipa_polymorphic_call_context::set_by_inv tree otr_type, HOST_WIDE_INT off) { - HOST_WIDE_INT offset2, size, max_size; + poly_int64 offset2, size, max_size; bool reverse; tree base; @@ -772,7 +772,7 @@ ipa_polymorphic_call_context::set_by_inv cst = TREE_OPERAND (cst, 0); base = get_ref_base_and_extent (cst, &offset2, &size, &max_size, &reverse); - if (!DECL_P (base) || max_size == -1 || max_size != size) + if (!DECL_P (base) || !known_size_p (max_size) || may_ne (max_size, size)) return false; /* Only type inconsistent programs can have otr_type that is @@ -899,23 +899,21 @@ ipa_polymorphic_call_context::ipa_polymo base_pointer = walk_ssa_copies (base_pointer, &visited); if (TREE_CODE (base_pointer) == ADDR_EXPR) { - HOST_WIDE_INT size, max_size; - HOST_WIDE_INT offset2; + HOST_WIDE_INT offset2, size; bool reverse; tree base - = get_ref_base_and_extent (TREE_OPERAND (base_pointer, 0), - &offset2, &size, &max_size, &reverse); + = get_ref_base_and_extent_hwi (TREE_OPERAND (base_pointer, 0), + &offset2, &size, &reverse); + if (!base) + 
break; - if (max_size != -1 && max_size == size) - combine_speculation_with (TYPE_MAIN_VARIANT (TREE_TYPE (base)), - offset + offset2, - true, - NULL /* Do not change outer type. */); + combine_speculation_with (TYPE_MAIN_VARIANT (TREE_TYPE (base)), + offset + offset2, + true, + NULL /* Do not change outer type. */); /* If this is a varying address, punt. */ - if ((TREE_CODE (base) == MEM_REF || DECL_P (base)) - && max_size != -1 - && max_size == size) + if (TREE_CODE (base) == MEM_REF || DECL_P (base)) { /* We found dereference of a pointer. Type of the pointer and MEM_REF is meaningless, but we can look futher. */ @@ -1181,7 +1179,7 @@ noncall_stmt_may_be_vtbl_ptr_store (gimp extr_type_from_vtbl_ptr_store (gimple *stmt, struct type_change_info *tci, HOST_WIDE_INT *type_offset) { - HOST_WIDE_INT offset, size, max_size; + poly_int64 offset, size, max_size; tree lhs, rhs, base; bool reverse; @@ -1263,17 +1261,23 @@ extr_type_from_vtbl_ptr_store (gimple *s } return tci->offset > POINTER_SIZE ? error_mark_node : NULL_TREE; } - if (offset != tci->offset - || size != POINTER_SIZE - || max_size != POINTER_SIZE) + if (may_ne (offset, tci->offset) + || may_ne (size, POINTER_SIZE) + || may_ne (max_size, POINTER_SIZE)) { if (dump_file) - fprintf (dump_file, " wrong offset %i!=%i or size %i\n", - (int)offset, (int)tci->offset, (int)size); - return offset + POINTER_SIZE <= tci->offset - || (max_size != -1 - && tci->offset + POINTER_SIZE > offset + max_size) - ? error_mark_node : NULL; + { + fprintf (dump_file, " wrong offset "); + print_dec (offset, dump_file); + fprintf (dump_file, "!=%i or size ", (int) tci->offset); + print_dec (size, dump_file); + fprintf (dump_file, "\n"); + } + return (must_le (offset + POINTER_SIZE, tci->offset) + || (known_size_p (max_size) + && must_gt (tci->offset + POINTER_SIZE, + offset + max_size)) + ? 
error_mark_node : NULL); } } @@ -1403,7 +1407,7 @@ check_stmt_for_type_change (ao_ref *ao A { tree op = walk_ssa_copies (gimple_call_arg (stmt, 0)); tree type = TYPE_METHOD_BASETYPE (TREE_TYPE (fn)); - HOST_WIDE_INT offset = 0, size, max_size; + HOST_WIDE_INT offset = 0; bool reverse; if (dump_file) @@ -1415,14 +1419,15 @@ check_stmt_for_type_change (ao_ref *ao A /* See if THIS parameter seems like instance pointer. */ if (TREE_CODE (op) == ADDR_EXPR) { - op = get_ref_base_and_extent (TREE_OPERAND (op, 0), &offset, - &size, &max_size, &reverse); - if (size != max_size || max_size == -1) + HOST_WIDE_INT size; + op = get_ref_base_and_extent_hwi (TREE_OPERAND (op, 0), + &offset, &size, &reverse); + if (!op) { tci->speculative++; return csftc_abort_walking_p (tci->speculative); } - if (op && TREE_CODE (op) == MEM_REF) + if (TREE_CODE (op) == MEM_REF) { if (!tree_fits_shwi_p (TREE_OPERAND (op, 1))) { @@ -1578,7 +1583,6 @@ ipa_polymorphic_call_context::get_dynami if (gimple_code (call) == GIMPLE_CALL) { tree ref = gimple_call_fn (call); - HOST_WIDE_INT offset2, size, max_size; bool reverse; if (TREE_CODE (ref) == OBJ_TYPE_REF) @@ -1608,10 +1612,11 @@ ipa_polymorphic_call_context::get_dynami && !SSA_NAME_IS_DEFAULT_DEF (ref) && gimple_assign_load_p (SSA_NAME_DEF_STMT (ref))) { + HOST_WIDE_INT offset2, size; tree ref_exp = gimple_assign_rhs1 (SSA_NAME_DEF_STMT (ref)); tree base_ref - = get_ref_base_and_extent (ref_exp, &offset2, &size, - &max_size, &reverse); + = get_ref_base_and_extent_hwi (ref_exp, &offset2, + &size, &reverse); /* Finally verify that what we found looks like read from OTR_OBJECT or from INSTANCE with offset OFFSET. 
*/ Index: gcc/ipa-prop.c =================================================================== --- gcc/ipa-prop.c 2017-10-23 17:16:58.507429441 +0100 +++ gcc/ipa-prop.c 2017-10-23 17:16:59.704267816 +0100 @@ -1071,12 +1071,11 @@ ipa_load_from_parm_agg (struct ipa_func_ bool *by_ref_p, bool *guaranteed_unmodified) { int index; - HOST_WIDE_INT size, max_size; + HOST_WIDE_INT size; bool reverse; - tree base - = get_ref_base_and_extent (op, offset_p, &size, &max_size, &reverse); + tree base = get_ref_base_and_extent_hwi (op, offset_p, &size, &reverse); - if (max_size == -1 || max_size != size || *offset_p < 0) + if (!base) return false; if (DECL_P (base)) @@ -1204,7 +1203,7 @@ compute_complex_assign_jump_func (struct gcall *call, gimple *stmt, tree name, tree param_type) { - HOST_WIDE_INT offset, size, max_size; + HOST_WIDE_INT offset, size; tree op1, tc_ssa, base, ssa; bool reverse; int index; @@ -1267,11 +1266,8 @@ compute_complex_assign_jump_func (struct op1 = TREE_OPERAND (op1, 0); if (TREE_CODE (TREE_TYPE (op1)) != RECORD_TYPE) return; - base = get_ref_base_and_extent (op1, &offset, &size, &max_size, &reverse); - if (TREE_CODE (base) != MEM_REF - /* If this is a varying address, punt. 
*/ - || max_size == -1 - || max_size != size) + base = get_ref_base_and_extent_hwi (op1, &offset, &size, &reverse); + if (!base || TREE_CODE (base) != MEM_REF) return; offset += mem_ref_offset (base).to_short_addr () * BITS_PER_UNIT; ssa = TREE_OPERAND (base, 0); @@ -1301,7 +1297,7 @@ compute_complex_assign_jump_func (struct static tree get_ancestor_addr_info (gimple *assign, tree *obj_p, HOST_WIDE_INT *offset) { - HOST_WIDE_INT size, max_size; + HOST_WIDE_INT size; tree expr, parm, obj; bool reverse; @@ -1313,13 +1309,9 @@ get_ancestor_addr_info (gimple *assign, return NULL_TREE; expr = TREE_OPERAND (expr, 0); obj = expr; - expr = get_ref_base_and_extent (expr, offset, &size, &max_size, &reverse); + expr = get_ref_base_and_extent_hwi (expr, offset, &size, &reverse); - if (TREE_CODE (expr) != MEM_REF - /* If this is a varying address, punt. */ - || max_size == -1 - || max_size != size - || *offset < 0) + if (!expr || TREE_CODE (expr) != MEM_REF) return NULL_TREE; parm = TREE_OPERAND (expr, 0); if (TREE_CODE (parm) != SSA_NAME @@ -1581,15 +1573,12 @@ determine_locally_known_aggregate_parts } else if (TREE_CODE (arg) == ADDR_EXPR) { - HOST_WIDE_INT arg_max_size; bool reverse; arg = TREE_OPERAND (arg, 0); - arg_base = get_ref_base_and_extent (arg, &arg_offset, &arg_size, - &arg_max_size, &reverse); - if (arg_max_size == -1 - || arg_max_size != arg_size - || arg_offset < 0) + arg_base = get_ref_base_and_extent_hwi (arg, &arg_offset, + &arg_size, &reverse); + if (!arg_base) return; if (DECL_P (arg_base)) { @@ -1604,18 +1593,15 @@ determine_locally_known_aggregate_parts } else { - HOST_WIDE_INT arg_max_size; bool reverse; gcc_checking_assert (AGGREGATE_TYPE_P (TREE_TYPE (arg))); by_ref = false; check_ref = false; - arg_base = get_ref_base_and_extent (arg, &arg_offset, &arg_size, - &arg_max_size, &reverse); - if (arg_max_size == -1 - || arg_max_size != arg_size - || arg_offset < 0) + arg_base = get_ref_base_and_extent_hwi (arg, &arg_offset, + &arg_size, &reverse); + if 
 (!arg_base)
     return;
   ao_ref_init (&r, arg);
@@ -1631,7 +1617,7 @@ determine_locally_known_aggregate_parts
     {
       struct ipa_known_agg_contents_list *n, **p;
       gimple *stmt = gsi_stmt (gsi);
-      HOST_WIDE_INT lhs_offset, lhs_size, lhs_max_size;
+      HOST_WIDE_INT lhs_offset, lhs_size;
       tree lhs, rhs, lhs_base;
       bool reverse;
@@ -1647,10 +1633,9 @@ determine_locally_known_aggregate_parts
 	  || contains_bitfld_component_ref_p (lhs))
 	break;
-      lhs_base = get_ref_base_and_extent (lhs, &lhs_offset, &lhs_size,
-					  &lhs_max_size, &reverse);
-      if (lhs_max_size == -1
-	  || lhs_max_size != lhs_size)
+      lhs_base = get_ref_base_and_extent_hwi (lhs, &lhs_offset,
+					      &lhs_size, &reverse);
+      if (!lhs_base)
 	break;
       if (check_ref)
@@ -4574,11 +4559,11 @@ ipa_get_adjustment_candidate (tree **exp
       *convert = true;
     }
-  HOST_WIDE_INT offset, size, max_size;
+  poly_int64 offset, size, max_size;
   bool reverse;
   tree base = get_ref_base_and_extent (**expr, &offset, &size, &max_size,
				       &reverse);
-  if (!base || size == -1 || max_size == -1)
+  if (!base || !known_size_p (size) || !known_size_p (max_size))
     return NULL;
   if (TREE_CODE (base) == MEM_REF)
Index: gcc/tree-sra.c
===================================================================
--- gcc/tree-sra.c	2017-10-23 17:07:40.909726192 +0100
+++ gcc/tree-sra.c	2017-10-23 17:16:59.705267681 +0100
@@ -865,11 +865,20 @@ static bool maybe_add_sra_candidate (tre
 create_access (tree expr, gimple *stmt, bool write)
 {
   struct access *access;
+  poly_int64 poffset, psize, pmax_size;
   HOST_WIDE_INT offset, size, max_size;
   tree base = expr;
   bool reverse, ptr, unscalarizable_region = false;
-  base = get_ref_base_and_extent (expr, &offset, &size, &max_size, &reverse);
+  base = get_ref_base_and_extent (expr, &poffset, &psize, &pmax_size,
+				  &reverse);
+  if (!poffset.is_constant (&offset)
+      || !psize.is_constant (&size)
+      || !pmax_size.is_constant (&max_size))
+    {
+      disqualify_candidate (base, "Encountered a polynomial-sized access.");
+      return NULL;
+    }
   if (sra_mode == SRA_MODE_EARLY_IPA
       && TREE_CODE (base) == MEM_REF)
@@ -3048,7 +3057,8 @@ clobber_subtree (struct access *access,
 static struct access *
 get_access_for_expr (tree expr)
 {
-  HOST_WIDE_INT offset, size, max_size;
+  poly_int64 poffset, psize, pmax_size;
+  HOST_WIDE_INT offset, max_size;
   tree base;
   bool reverse;
@@ -3058,8 +3068,12 @@ get_access_for_expr (tree expr)
   if (TREE_CODE (expr) == VIEW_CONVERT_EXPR)
     expr = TREE_OPERAND (expr, 0);
-  base = get_ref_base_and_extent (expr, &offset, &size, &max_size, &reverse);
-  if (max_size == -1 || !DECL_P (base))
+  base = get_ref_base_and_extent (expr, &poffset, &psize, &pmax_size,
+				  &reverse);
+  if (!known_size_p (pmax_size)
+      || !pmax_size.is_constant (&max_size)
+      || !poffset.is_constant (&offset)
+      || !DECL_P (base))
     return NULL;
   if (!bitmap_bit_p (candidate_bitmap, DECL_UID (base)))
Index: gcc/tree-ssa-alias.c
===================================================================
--- gcc/tree-ssa-alias.c	2017-10-23 17:11:40.347140508 +0100
+++ gcc/tree-ssa-alias.c	2017-10-23 17:16:59.705267681 +0100
@@ -635,7 +635,7 @@ ao_ref_init (ao_ref *r, tree ref)
 ao_ref_base (ao_ref *ref)
 {
   bool reverse;
-  HOST_WIDE_INT offset, size, max_size;
+  poly_int64 offset, size, max_size;
   if (ref->base)
     return ref->base;
@@ -823,7 +823,7 @@ aliasing_component_refs_p (tree ref1,
	return true;
      else if (same_p == 1)
	{
-	  HOST_WIDE_INT offadj, sztmp, msztmp;
+	  poly_int64 offadj, sztmp, msztmp;
	  bool reverse;
	  get_ref_base_and_extent (*refp, &offadj, &sztmp, &msztmp, &reverse);
	  offset2 -= offadj;
@@ -842,7 +842,7 @@ aliasing_component_refs_p (tree ref1,
	return true;
      else if (same_p == 1)
	{
-	  HOST_WIDE_INT offadj, sztmp, msztmp;
+	  poly_int64 offadj, sztmp, msztmp;
	  bool reverse;
	  get_ref_base_and_extent (*refp, &offadj, &sztmp, &msztmp, &reverse);
	  offset1 -= offadj;
@@ -2450,15 +2450,12 @@ stmt_kills_ref_p (gimple *stmt, ao_ref *
	 the access properly.  */
      if (!ref->max_size_known_p ())
	return false;
-      HOST_WIDE_INT size, max_size, const_offset;
-      poly_int64 ref_offset = ref->offset;
+      poly_int64 size, offset, max_size, ref_offset = ref->offset;
       bool reverse;
-      tree base
-	= get_ref_base_and_extent (lhs, &const_offset, &size, &max_size,
-				   &reverse);
+      tree base = get_ref_base_and_extent (lhs, &offset, &size, &max_size,
+					   &reverse);
       /* We can get MEM[symbol: sZ, index: D.8862_1] here,
	 so base == ref->base does not always hold.  */
-      poly_int64 offset = const_offset;
       if (base != ref->base)
	{
	  /* Try using points-to info.  */
@@ -2490,7 +2487,7 @@ stmt_kills_ref_p (gimple *stmt, ao_ref *
	}
      /* For a must-alias check we need to be able to constrain
	 the access properly.  */
-      if (size == max_size
+      if (must_eq (size, max_size)
	  && known_subrange_p (ref_offset, ref->max_size, offset, size))
	return true;
    }
Index: gcc/tree-ssa-dce.c
===================================================================
--- gcc/tree-ssa-dce.c	2017-10-23 17:11:40.348142423 +0100
+++ gcc/tree-ssa-dce.c	2017-10-23 17:16:59.706267546 +0100
@@ -477,7 +477,7 @@ mark_aliased_reaching_defs_necessary_1 (
	  && !stmt_can_throw_internal (def_stmt))
	{
	  tree base, lhs = gimple_get_lhs (def_stmt);
-	  HOST_WIDE_INT size, offset, max_size;
+	  poly_int64 size, offset, max_size;
	  bool reverse;
	  ao_ref_base (ref);
	  base
@@ -488,7 +488,7 @@ mark_aliased_reaching_defs_necessary_1 (
	    {
	      /* For a must-alias check we need to be able to constrain
		 the accesses properly.  */
-	      if (size == max_size
+	      if (must_eq (size, max_size)
		  && known_subrange_p (ref->offset, ref->max_size, offset,
				       size))
		return true;
	      /* Or they need to be exactly the same.  */
Index: gcc/tree-ssa-scopedtables.c
===================================================================
--- gcc/tree-ssa-scopedtables.c	2017-10-23 17:07:40.909726192 +0100
+++ gcc/tree-ssa-scopedtables.c	2017-10-23 17:16:59.706267546 +0100
@@ -480,13 +480,13 @@ avail_expr_hash (class expr_hash_elt *p)
	 Dealing with both MEM_REF and ARRAY_REF allows us not to care
	 about equivalence with other statements not considered here.  */
      bool reverse;
-      HOST_WIDE_INT offset, size, max_size;
+      poly_int64 offset, size, max_size;
      tree base = get_ref_base_and_extent (t, &offset, &size, &max_size,
					   &reverse);
      /* Strictly, we could try to normalize variable-sized accesses too,
	 but here we just deal with the common case.  */
-      if (size != -1
-	  && size == max_size)
+      if (known_size_p (max_size)
+	  && must_eq (size, max_size))
	{
	  enum tree_code code = MEM_REF;
	  hstate.add_object (code);
@@ -520,26 +520,26 @@ equal_mem_array_ref_p (tree t0, tree t1)
   if (!types_compatible_p (TREE_TYPE (t0), TREE_TYPE (t1)))
     return false;
   bool rev0;
-  HOST_WIDE_INT off0, sz0, max0;
+  poly_int64 off0, sz0, max0;
   tree base0 = get_ref_base_and_extent (t0, &off0, &sz0, &max0, &rev0);
-  if (sz0 == -1
-      || sz0 != max0)
+  if (!known_size_p (max0)
+      || may_ne (sz0, max0))
     return false;
   bool rev1;
-  HOST_WIDE_INT off1, sz1, max1;
+  poly_int64 off1, sz1, max1;
   tree base1 = get_ref_base_and_extent (t1, &off1, &sz1, &max1, &rev1);
-  if (sz1 == -1
-      || sz1 != max1)
+  if (!known_size_p (max1)
+      || may_ne (sz1, max1))
     return false;
   if (rev0 != rev1)
     return false;
   /* Types were compatible, so this is a sanity check.  */
-  gcc_assert (sz0 == sz1);
+  gcc_assert (must_eq (sz0, sz1));
-  return (off0 == off1) && operand_equal_p (base0, base1, 0);
+  return must_eq (off0, off1) && operand_equal_p (base0, base1, 0);
 }
 /* Compare two hashable_expr structures for equivalence.  They are
Index: gcc/tree-ssa-sccvn.c
===================================================================
--- gcc/tree-ssa-sccvn.c	2017-10-23 17:11:40.349144338 +0100
+++ gcc/tree-ssa-sccvn.c	2017-10-23 17:16:59.706267546 +0100
@@ -1920,7 +1920,7 @@ vn_reference_lookup_3 (ao_ref *ref, tree
    {
      tree ref2 = TREE_OPERAND (gimple_call_arg (def_stmt, 0), 0);
      tree base2;
-      HOST_WIDE_INT offset2, size2, maxsize2;
+      poly_int64 offset2, size2, maxsize2;
      bool reverse;
      base2 = get_ref_base_and_extent (ref2, &offset2, &size2, &maxsize2,
				       &reverse);
@@ -1943,7 +1943,7 @@ vn_reference_lookup_3 (ao_ref *ref, tree
	   && CONSTRUCTOR_NELTS (gimple_assign_rhs1 (def_stmt)) == 0)
    {
      tree base2;
-      HOST_WIDE_INT offset2, size2, maxsize2;
+      poly_int64 offset2, size2, maxsize2;
      bool reverse;
      base2 = get_ref_base_and_extent (gimple_assign_lhs (def_stmt),
				       &offset2, &size2, &maxsize2, &reverse);
@@ -1975,13 +1975,12 @@ vn_reference_lookup_3 (ao_ref *ref, tree
	       && is_gimple_min_invariant (SSA_VAL (gimple_assign_rhs1 (def_stmt))))))
    {
      tree base2;
-      HOST_WIDE_INT offset2, size2, maxsize2;
+      HOST_WIDE_INT offset2, size2;
      bool reverse;
-      base2 = get_ref_base_and_extent (gimple_assign_lhs (def_stmt),
-				       &offset2, &size2, &maxsize2, &reverse);
-      if (!reverse
-	  && maxsize2 != -1
-	  && maxsize2 == size2
+      base2 = get_ref_base_and_extent_hwi (gimple_assign_lhs (def_stmt),
+					   &offset2, &size2, &reverse);
+      if (base2
+	  && !reverse
	  && size2 % BITS_PER_UNIT == 0
	  && offset2 % BITS_PER_UNIT == 0
	  && operand_equal_p (base, base2, 0)
@@ -2039,14 +2038,14 @@ vn_reference_lookup_3 (ao_ref *ref, tree
	   && TREE_CODE (gimple_assign_rhs1 (def_stmt)) == SSA_NAME)
    {
      tree base2;
-      HOST_WIDE_INT offset2, size2, maxsize2;
+      poly_int64 offset2, size2, maxsize2;
      bool reverse;
      base2 = get_ref_base_and_extent (gimple_assign_lhs (def_stmt),
				       &offset2, &size2, &maxsize2, &reverse);
      if (!reverse
-	  && maxsize2 != -1
-	  && maxsize2 == size2
+	  && known_size_p (maxsize2)
+	  && must_eq (maxsize2, size2)
	  && operand_equal_p (base, base2, 0)
	  && known_subrange_p (offset, maxsize, offset2, size2)
	  /* ??? We can't handle bitfield precision extracts without
Index: gcc/tree-ssa-structalias.c
===================================================================
--- gcc/tree-ssa-structalias.c	2017-10-23 17:07:40.909726192 +0100
+++ gcc/tree-ssa-structalias.c	2017-10-23 17:16:59.707267411 +0100
@@ -3191,9 +3191,9 @@ get_constraint_for_component_ref (tree t
				  bool address_p, bool lhs_p)
 {
   tree orig_t = t;
-  HOST_WIDE_INT bitsize = -1;
-  HOST_WIDE_INT bitmaxsize = -1;
-  HOST_WIDE_INT bitpos;
+  poly_int64 bitsize = -1;
+  poly_int64 bitmaxsize = -1;
+  poly_int64 bitpos;
   bool reverse;
   tree forzero;
@@ -3255,8 +3255,8 @@ get_constraint_for_component_ref (tree t
	 ignore this constraint.  When we handle pointer subtraction,
	 we may have to do something cute here.  */
-      if ((unsigned HOST_WIDE_INT)bitpos < get_varinfo (result.var)->fullsize
-	  && bitmaxsize != 0)
+      if (may_lt (poly_uint64 (bitpos), get_varinfo (result.var)->fullsize)
+	  && maybe_nonzero (bitmaxsize))
	{
	  /* It's also not true that the constraint will actually start at the
	     right offset, it may start in some padding.  We only care about
@@ -3268,8 +3268,8 @@ get_constraint_for_component_ref (tree t
	  cexpr.offset = 0;
	  for (curr = get_varinfo (cexpr.var); curr; curr = vi_next (curr))
	    {
-	      if (ranges_overlap_p (curr->offset, curr->size,
-				    bitpos, bitmaxsize))
+	      if (ranges_may_overlap_p (poly_int64 (curr->offset), curr->size,
+					bitpos, bitmaxsize))
		{
		  cexpr.var = curr->id;
		  results->safe_push (cexpr);
@@ -3302,7 +3302,7 @@ get_constraint_for_component_ref (tree t
	      results->safe_push (cexpr);
	    }
	}
-      else if (bitmaxsize == 0)
+      else if (known_zero (bitmaxsize))
	{
	  if (dump_file && (dump_flags & TDF_DETAILS))
	    fprintf (dump_file, "Access to zero-sized part of variable, "
@@ -3317,13 +3317,15 @@ get_constraint_for_component_ref (tree t
      /* If we do not know exactly where the access goes say so.  Note
	 that only for non-structure accesses we know that we access
	 at most one subfiled of any variable.  */
-      if (bitpos == -1
-	  || bitsize != bitmaxsize
+      HOST_WIDE_INT const_bitpos;
+      if (!bitpos.is_constant (&const_bitpos)
+	  || const_bitpos == -1
+	  || may_ne (bitsize, bitmaxsize)
	  || AGGREGATE_TYPE_P (TREE_TYPE (orig_t))
	  || result.offset == UNKNOWN_OFFSET)
	result.offset = UNKNOWN_OFFSET;
      else
-	result.offset += bitpos;
+	result.offset += const_bitpos;
    }
  else if (result.type == ADDRESSOF)
    {
@@ -3660,14 +3662,17 @@ do_structure_copy (tree lhsop, tree rhso
      && (rhsp->type == SCALAR
	  || rhsp->type == ADDRESSOF))
    {
-      HOST_WIDE_INT lhssize, lhsmaxsize, lhsoffset;
-      HOST_WIDE_INT rhssize, rhsmaxsize, rhsoffset;
+      HOST_WIDE_INT lhssize, lhsoffset;
+      HOST_WIDE_INT rhssize, rhsoffset;
      bool reverse;
      unsigned k = 0;
-      get_ref_base_and_extent (lhsop, &lhsoffset, &lhssize, &lhsmaxsize,
-			       &reverse);
-      get_ref_base_and_extent (rhsop, &rhsoffset, &rhssize, &rhsmaxsize,
-			       &reverse);
+      if (!get_ref_base_and_extent_hwi (lhsop, &lhsoffset, &lhssize, &reverse)
+	  || !get_ref_base_and_extent_hwi (rhsop, &rhsoffset, &rhssize,
+					   &reverse))
+	{
+	  process_all_all_constraints (lhsc, rhsc);
+	  return;
+	}
      for (j = 0; lhsc.iterate (j, &lhsp);)
	{
	  varinfo_t lhsv, rhsv;
Index: gcc/var-tracking.c
===================================================================
--- gcc/var-tracking.c	2017-10-23 17:16:50.377527331 +0100
+++ gcc/var-tracking.c	2017-10-23 17:16:59.708267276 +0100
@@ -5208,20 +5208,20 @@ track_expr_p (tree expr, bool need_rtl)
	  || (TREE_CODE (realdecl) == MEM_REF
	      && TREE_CODE (TREE_OPERAND (realdecl, 0)) == ADDR_EXPR))
	{
-	  HOST_WIDE_INT bitsize, bitpos, maxsize;
+	  HOST_WIDE_INT bitsize, bitpos;
	  bool reverse;
	  tree innerdecl
-	    = get_ref_base_and_extent (realdecl, &bitpos, &bitsize,
-				       &maxsize, &reverse);
-	  if (!DECL_P (innerdecl)
+	    = get_ref_base_and_extent_hwi (realdecl, &bitpos,
+					   &bitsize, &reverse);
+	  if (!innerdecl
+	      || !DECL_P (innerdecl)
	      || DECL_IGNORED_P (innerdecl)
	      /* Do not track declarations for parts of tracked record
		 parameters since we want to track them as a whole.  */
	      || tracked_record_parameter_p (innerdecl)
	      || TREE_STATIC (innerdecl)
-	      || bitsize <= 0
-	      || bitpos + bitsize > 256
-	      || bitsize != maxsize)
+	      || bitsize == 0
+	      || bitpos + bitsize > 256)
	    return 0;
	  else
	    realdecl = expr;