From: Richard Sandiford
To: gcc-patches@gcc.gnu.org
Mail-Followup-To: gcc-patches@gcc.gnu.org, richard.sandiford@linaro.org
Subject: [005/nnn] poly_int: rtx constants
References: <871sltvm7r.fsf@linaro.org>
Date: Mon, 23 Oct 2017 17:01:00 -0000
In-Reply-To: <871sltvm7r.fsf@linaro.org> (Richard Sandiford's message of "Mon, 23 Oct 2017 17:54:32 +0100")
Message-ID: <87d15du7dh.fsf@linaro.org>

This patch adds an rtl representation of poly_int values.
There were three possible ways of doing this:

(1) Add a new rtl code for the poly_ints themselves and store the
    coefficients as trailing wide_ints.  This would give constants like:

      (const_poly_int [c0 c1 ... cn])

    The runtime value would be:

      c0 + c1 * x1 + ... + cn * xn

(2) Like (1), but use rtxes for the coefficients.  This would give
    constants like:

      (const_poly_int [(const_int c0)
                       (const_int c1)
                       ...
                       (const_int cn)])

    although the coefficients could be const_wide_ints instead of
    const_ints where appropriate.

(3) Add a new rtl code for the polynomial indeterminates, then use
    them in const wrappers.  A constant like c0 + c1 * x1 would then
    look like:

      (const:M (plus:M (mult:M (const_param:M x1)
                               (const_int c1))
                       (const_int c0)))

There didn't seem to be that much to choose between them.  The main
advantage of (1) is that it's a more efficient representation and
that we can refer to the coefficients directly as wide_int_storage.

2017-10-23  Richard Sandiford
	    Alan Hayward
	    David Sherwood

gcc/
	* doc/rtl.texi (const_poly_int): Document.
	* gengenrtl.c (excluded_rtx): Return true for CONST_POLY_INT.
	* rtl.h (const_poly_int_def): New struct.
	(rtx_def::u): Add a cpi field.
	(CASE_CONST_UNIQUE, CASE_CONST_ANY): Add CONST_POLY_INT.
	(CONST_POLY_INT_P, CONST_POLY_INT_COEFFS): New macros.
	(wi::rtx_to_poly_wide_ref): New typedef.
	(const_poly_int_value, wi::to_poly_wide, rtx_to_poly_int64)
	(poly_int_rtx_p): New functions.
	(trunc_int_for_mode): Declare a poly_int64 version.
	(plus_constant): Take a poly_int64 instead of a HOST_WIDE_INT.
	(immed_wide_int_const): Take a poly_wide_int_ref rather than
	a wide_int_ref.
	(strip_offset): Declare.
	(strip_offset_and_add): New function.
	* rtl.def (CONST_POLY_INT): New rtx code.
	* rtl.c (rtx_size): Handle CONST_POLY_INT.
	(shared_const_p): Use poly_int_rtx_p.
	* emit-rtl.h (gen_int_mode): Take a poly_int64 instead of
	a HOST_WIDE_INT.
	(gen_int_shift_amount): Likewise.
	* emit-rtl.c (const_poly_int_hasher): New class.
	(const_poly_int_htab): New variable.
	(init_emit_once): Initialize it when NUM_POLY_INT_COEFFS > 1.
	(const_poly_int_hasher::hash): New function.
	(const_poly_int_hasher::equal): Likewise.
	(gen_int_mode): Take a poly_int64 instead of a HOST_WIDE_INT.
	(immed_wide_int_const): Rename to...
	(immed_wide_int_const_1): ...this and make static.
	(immed_wide_int_const): New function, taking a poly_wide_int_ref
	instead of a wide_int_ref.
	(gen_int_shift_amount): Take a poly_int64 instead of
	a HOST_WIDE_INT.
	(gen_lowpart_common): Handle CONST_POLY_INT.
	* cse.c (hash_rtx_cb, equiv_constant): Likewise.
	* cselib.c (cselib_hash_rtx): Likewise.
	* dwarf2out.c (const_ok_for_output_1): Likewise.
	* expr.c (convert_modes): Likewise.
	* print-rtl.c (rtx_writer::print_rtx, print_value): Likewise.
	* rtlhash.c (add_rtx): Likewise.
	* explow.c (trunc_int_for_mode): Add a poly_int64 version.
	(plus_constant): Take a poly_int64 instead of a HOST_WIDE_INT.
	Handle existing CONST_POLY_INT rtxes.
	* expmed.h (expand_shift): Take a poly_int64 instead of
	a HOST_WIDE_INT.
	* expmed.c (expand_shift): Likewise.
	* rtlanal.c (strip_offset): New function.
	(commutative_operand_precedence): Give CONST_POLY_INT the same
	precedence as CONST_DOUBLE and put CONST_WIDE_INT between that
	and CONST_INT.
	* rtl-tests.c (const_poly_int_tests): New struct.
	(rtl_tests_c_tests): Use it.
	* simplify-rtx.c (simplify_const_unary_operation): Handle
	CONST_POLY_INT.
	(simplify_const_binary_operation): Likewise.
	(simplify_binary_operation_1): Fold additions of symbolic
	constants and CONST_POLY_INTs.
	(simplify_subreg): Handle extensions and truncations of
	CONST_POLY_INTs.
	(simplify_const_poly_int_tests): New struct.
	(simplify_rtx_c_tests): Use it.
	* wide-int.h (storage_ref): Add default constructor.
	(wide_int_ref_storage): Likewise.
	(trailing_wide_ints): Use GTY((user)).
	(trailing_wide_ints::operator[]): Add a const version.
	(trailing_wide_ints::get_precision): New function.
	(trailing_wide_ints::extra_size): Likewise.

Index: gcc/doc/rtl.texi
===================================================================
--- gcc/doc/rtl.texi	2017-10-23 17:00:20.916834036 +0100
+++ gcc/doc/rtl.texi	2017-10-23 17:00:54.437007600 +0100
@@ -1621,6 +1621,15 @@ is accessed with the macro @code{CONST_F
 data is accessed with @code{CONST_FIXED_VALUE_HIGH}; the low part is
 accessed with @code{CONST_FIXED_VALUE_LOW}.
 
+@findex const_poly_int
+@item (const_poly_int:@var{m} [@var{c0} @var{c1} @dots{}])
+Represents a @code{poly_int}-style polynomial integer with coefficients
+@var{c0}, @var{c1}, @dots{}.  The coefficients are @code{wide_int}-based
+integers rather than rtxes.  @code{CONST_POLY_INT_COEFFS} gives the
+values of individual coefficients (which is mostly only useful in
+low-level routines) and @code{const_poly_int_value} gives the full
+@code{poly_int} value.
+
 @findex const_vector
 @item (const_vector:@var{m} [@var{x0} @var{x1} @dots{}])
 Represents a vector constant.
The square brackets stand for the vector Index: gcc/gengenrtl.c =================================================================== --- gcc/gengenrtl.c 2017-10-23 16:52:20.579835373 +0100 +++ gcc/gengenrtl.c 2017-10-23 17:00:54.442003055 +0100 @@ -157,6 +157,7 @@ excluded_rtx (int idx) return (strcmp (defs[idx].enumname, "VAR_LOCATION") == 0 || strcmp (defs[idx].enumname, "CONST_DOUBLE") == 0 || strcmp (defs[idx].enumname, "CONST_WIDE_INT") == 0 + || strcmp (defs[idx].enumname, "CONST_POLY_INT") == 0 || strcmp (defs[idx].enumname, "CONST_FIXED") == 0); } Index: gcc/rtl.h =================================================================== --- gcc/rtl.h 2017-10-23 16:52:20.579835373 +0100 +++ gcc/rtl.h 2017-10-23 17:00:54.444001238 +0100 @@ -280,6 +280,10 @@ #define CWI_GET_NUM_ELEM(RTX) \ #define CWI_PUT_NUM_ELEM(RTX, NUM) \ (RTL_FLAG_CHECK1("CWI_PUT_NUM_ELEM", (RTX), CONST_WIDE_INT)->u2.num_elem = (NUM)) +struct GTY((variable_size)) const_poly_int_def { + trailing_wide_ints coeffs; +}; + /* RTL expression ("rtx"). */ /* The GTY "desc" and "tag" options below are a kludge: we need a desc @@ -424,6 +428,7 @@ struct GTY((desc("0"), tag("0"), struct real_value rv; struct fixed_value fv; struct hwivec_def hwiv; + struct const_poly_int_def cpi; } GTY ((special ("rtx_def"), desc ("GET_CODE (&%0)"))) u; }; @@ -734,6 +739,7 @@ #define CASE_CONST_SCALAR_INT \ #define CASE_CONST_UNIQUE \ case CONST_INT: \ case CONST_WIDE_INT: \ + case CONST_POLY_INT: \ case CONST_DOUBLE: \ case CONST_FIXED @@ -741,6 +747,7 @@ #define CASE_CONST_UNIQUE \ #define CASE_CONST_ANY \ case CONST_INT: \ case CONST_WIDE_INT: \ + case CONST_POLY_INT: \ case CONST_DOUBLE: \ case CONST_FIXED: \ case CONST_VECTOR @@ -773,6 +780,11 @@ #define CONST_INT_P(X) (GET_CODE (X) == /* Predicate yielding nonzero iff X is an rtx for a constant integer. */ #define CONST_WIDE_INT_P(X) (GET_CODE (X) == CONST_WIDE_INT) +/* Predicate yielding nonzero iff X is an rtx for a polynomial constant + integer. 
*/ +#define CONST_POLY_INT_P(X) \ + (NUM_POLY_INT_COEFFS > 1 && GET_CODE (X) == CONST_POLY_INT) + /* Predicate yielding nonzero iff X is an rtx for a constant fixed-point. */ #define CONST_FIXED_P(X) (GET_CODE (X) == CONST_FIXED) @@ -1871,6 +1883,12 @@ #define CONST_WIDE_INT_VEC(RTX) HWIVEC_C #define CONST_WIDE_INT_NUNITS(RTX) CWI_GET_NUM_ELEM (RTX) #define CONST_WIDE_INT_ELT(RTX, N) CWI_ELT (RTX, N) +/* For a CONST_POLY_INT, CONST_POLY_INT_COEFFS gives access to the + individual coefficients, in the form of a trailing_wide_ints structure. */ +#define CONST_POLY_INT_COEFFS(RTX) \ + (RTL_FLAG_CHECK1("CONST_POLY_INT_COEFFS", (RTX), \ + CONST_POLY_INT)->u.cpi.coeffs) + /* For a CONST_DOUBLE: #if TARGET_SUPPORTS_WIDE_INT == 0 For a VOIDmode, there are two integers CONST_DOUBLE_LOW is the @@ -2184,6 +2202,84 @@ wi::max_value (machine_mode mode, signop return max_value (GET_MODE_PRECISION (as_a (mode)), sgn); } +namespace wi +{ + typedef poly_int > > + rtx_to_poly_wide_ref; + rtx_to_poly_wide_ref to_poly_wide (const_rtx, machine_mode); +} + +/* Return the value of a CONST_POLY_INT in its native precision. */ + +inline wi::rtx_to_poly_wide_ref +const_poly_int_value (const_rtx x) +{ + poly_int res; + for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i) + res.coeffs[i] = CONST_POLY_INT_COEFFS (x)[i]; + return res; +} + +/* Return true if X is a scalar integer or a CONST_POLY_INT. The value + can then be extracted using wi::to_poly_wide. */ + +inline bool +poly_int_rtx_p (const_rtx x) +{ + return CONST_SCALAR_INT_P (x) || CONST_POLY_INT_P (x); +} + +/* Access X (which satisfies poly_int_rtx_p) as a poly_wide_int. + MODE is the mode of X. */ + +inline wi::rtx_to_poly_wide_ref +wi::to_poly_wide (const_rtx x, machine_mode mode) +{ + if (CONST_POLY_INT_P (x)) + return const_poly_int_value (x); + return rtx_mode_t (const_cast (x), mode); +} + +/* Return the value of X as a poly_int64. 
*/ + +inline poly_int64 +rtx_to_poly_int64 (const_rtx x) +{ + if (CONST_POLY_INT_P (x)) + { + poly_int64 res; + for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i) + res.coeffs[i] = CONST_POLY_INT_COEFFS (x)[i].to_shwi (); + return res; + } + return INTVAL (x); +} + +/* Return true if arbitrary value X is an integer constant that can + be represented as a poly_int64. Store the value in *RES if so, + otherwise leave it unmodified. */ + +inline bool +poly_int_rtx_p (const_rtx x, poly_int64_pod *res) +{ + if (CONST_INT_P (x)) + { + *res = INTVAL (x); + return true; + } + if (CONST_POLY_INT_P (x)) + { + for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i) + if (!wi::fits_shwi_p (CONST_POLY_INT_COEFFS (x)[i])) + return false; + for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i) + res->coeffs[i] = CONST_POLY_INT_COEFFS (x)[i].to_shwi (); + return true; + } + return false; +} + extern void init_rtlanal (void); extern int rtx_cost (rtx, machine_mode, enum rtx_code, int, bool); extern int address_cost (rtx, machine_mode, addr_space_t, bool); @@ -2721,7 +2817,8 @@ #define EXTRACT_ARGS_IN_RANGE(SIZE, POS, /* In explow.c */ extern HOST_WIDE_INT trunc_int_for_mode (HOST_WIDE_INT, machine_mode); -extern rtx plus_constant (machine_mode, rtx, HOST_WIDE_INT, bool = false); +extern poly_int64 trunc_int_for_mode (poly_int64, machine_mode); +extern rtx plus_constant (machine_mode, rtx, poly_int64, bool = false); extern HOST_WIDE_INT get_stack_check_protect (void); /* In rtl.c */ @@ -3032,13 +3129,11 @@ extern void end_sequence (void); extern double_int rtx_to_double_int (const_rtx); #endif extern void cwi_output_hex (FILE *, const_rtx); -#ifndef GENERATOR_FILE -extern rtx immed_wide_int_const (const wide_int_ref &, machine_mode); -#endif #if TARGET_SUPPORTS_WIDE_INT == 0 extern rtx immed_double_const (HOST_WIDE_INT, HOST_WIDE_INT, machine_mode); #endif +extern rtx immed_wide_int_const (const poly_wide_int_ref &, machine_mode); /* In varasm.c */ extern rtx force_const_mem 
(machine_mode, rtx); @@ -3226,6 +3321,7 @@ extern HOST_WIDE_INT get_integer_term (c extern rtx get_related_value (const_rtx); extern bool offset_within_block_p (const_rtx, HOST_WIDE_INT); extern void split_const (rtx, rtx *, rtx *); +extern rtx strip_offset (rtx, poly_int64_pod *); extern bool unsigned_reg_p (rtx); extern int reg_mentioned_p (const_rtx, const_rtx); extern int count_occurrences (const_rtx, const_rtx, int); @@ -4160,6 +4256,21 @@ load_extend_op (machine_mode mode) return UNKNOWN; } +/* If X is a PLUS of a base and a constant offset, add the constant to *OFFSET + and return the base. Return X otherwise. */ + +inline rtx +strip_offset_and_add (rtx x, poly_int64_pod *offset) +{ + if (GET_CODE (x) == PLUS) + { + poly_int64 suboffset; + x = strip_offset (x, &suboffset); + *offset += suboffset; + } + return x; +} + /* gtype-desc.c. */ extern void gt_ggc_mx (rtx &); extern void gt_pch_nx (rtx &); Index: gcc/rtl.def =================================================================== --- gcc/rtl.def 2017-10-23 16:52:20.579835373 +0100 +++ gcc/rtl.def 2017-10-23 17:00:54.443002147 +0100 @@ -348,6 +348,9 @@ DEF_RTL_EXPR(CONST_INT, "const_int", "w" /* numeric integer constant */ DEF_RTL_EXPR(CONST_WIDE_INT, "const_wide_int", "", RTX_CONST_OBJ) +/* An rtx representation of a poly_wide_int. 
*/ +DEF_RTL_EXPR(CONST_POLY_INT, "const_poly_int", "", RTX_CONST_OBJ) + /* fixed-point constant */ DEF_RTL_EXPR(CONST_FIXED, "const_fixed", "www", RTX_CONST_OBJ) Index: gcc/rtl.c =================================================================== --- gcc/rtl.c 2017-10-23 16:52:20.579835373 +0100 +++ gcc/rtl.c 2017-10-23 17:00:54.443002147 +0100 @@ -189,6 +189,10 @@ rtx_size (const_rtx x) + sizeof (struct hwivec_def) + ((CONST_WIDE_INT_NUNITS (x) - 1) * sizeof (HOST_WIDE_INT))); + if (CONST_POLY_INT_P (x)) + return (RTX_HDR_SIZE + + sizeof (struct const_poly_int_def) + + CONST_POLY_INT_COEFFS (x).extra_size ()); if (GET_CODE (x) == SYMBOL_REF && SYMBOL_REF_HAS_BLOCK_INFO_P (x)) return RTX_HDR_SIZE + sizeof (struct block_symbol); return RTX_CODE_SIZE (GET_CODE (x)); @@ -257,9 +261,10 @@ shared_const_p (const_rtx orig) /* CONST can be shared if it contains a SYMBOL_REF. If it contains a LABEL_REF, it isn't sharable. */ + poly_int64 offset; return (GET_CODE (XEXP (orig, 0)) == PLUS && GET_CODE (XEXP (XEXP (orig, 0), 0)) == SYMBOL_REF - && CONST_INT_P (XEXP (XEXP (orig, 0), 1))); + && poly_int_rtx_p (XEXP (XEXP (orig, 0), 1), &offset)); } Index: gcc/emit-rtl.h =================================================================== --- gcc/emit-rtl.h 2017-10-23 16:52:20.579835373 +0100 +++ gcc/emit-rtl.h 2017-10-23 17:00:54.440004873 +0100 @@ -362,14 +362,14 @@ extern rtvec gen_rtvec (int, ...); extern rtx copy_insn_1 (rtx); extern rtx copy_insn (rtx); extern rtx_insn *copy_delay_slot_insn (rtx_insn *); -extern rtx gen_int_mode (HOST_WIDE_INT, machine_mode); +extern rtx gen_int_mode (poly_int64, machine_mode); extern rtx_insn *emit_copy_of_insn_after (rtx_insn *, rtx_insn *); extern void set_reg_attrs_from_value (rtx, rtx); extern void set_reg_attrs_for_parm (rtx, rtx); extern void set_reg_attrs_for_decl_rtl (tree t, rtx x); extern void adjust_reg_mode (rtx, machine_mode); extern int mem_expr_equal_p (const_tree, const_tree); -extern rtx gen_int_shift_amount (machine_mode, 
HOST_WIDE_INT); +extern rtx gen_int_shift_amount (machine_mode, poly_int64); extern bool need_atomic_barrier_p (enum memmodel, bool); Index: gcc/emit-rtl.c =================================================================== --- gcc/emit-rtl.c 2017-10-23 16:52:20.579835373 +0100 +++ gcc/emit-rtl.c 2017-10-23 17:00:54.440004873 +0100 @@ -148,6 +148,16 @@ struct const_wide_int_hasher : ggc_cache static GTY ((cache)) hash_table *const_wide_int_htab; +struct const_poly_int_hasher : ggc_cache_ptr_hash +{ + typedef std::pair compare_type; + + static hashval_t hash (rtx x); + static bool equal (rtx x, const compare_type &y); +}; + +static GTY ((cache)) hash_table *const_poly_int_htab; + /* A hash table storing register attribute structures. */ struct reg_attr_hasher : ggc_cache_ptr_hash { @@ -257,6 +267,31 @@ const_wide_int_hasher::equal (rtx x, rtx } #endif +/* Returns a hash code for CONST_POLY_INT X. */ + +hashval_t +const_poly_int_hasher::hash (rtx x) +{ + inchash::hash h; + h.add_int (GET_MODE (x)); + for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i) + h.add_wide_int (CONST_POLY_INT_COEFFS (x)[i]); + return h.end (); +} + +/* Returns nonzero if CONST_POLY_INT X is an rtx representation of Y. */ + +bool +const_poly_int_hasher::equal (rtx x, const compare_type &y) +{ + if (GET_MODE (x) != y.first) + return false; + for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i) + if (CONST_POLY_INT_COEFFS (x)[i] != y.second.coeffs[i]) + return false; + return true; +} + /* Returns a hash code for X (which is really a CONST_DOUBLE). 
*/ hashval_t const_double_hasher::hash (rtx x) @@ -520,9 +555,13 @@ gen_rtx_CONST_INT (machine_mode mode ATT } rtx -gen_int_mode (HOST_WIDE_INT c, machine_mode mode) +gen_int_mode (poly_int64 c, machine_mode mode) { - return GEN_INT (trunc_int_for_mode (c, mode)); + c = trunc_int_for_mode (c, mode); + if (c.is_constant ()) + return GEN_INT (c.coeffs[0]); + unsigned int prec = GET_MODE_PRECISION (as_a (mode)); + return immed_wide_int_const (poly_wide_int::from (c, prec, SIGNED), mode); } /* CONST_DOUBLEs might be created from pairs of integers, or from @@ -626,8 +665,8 @@ lookup_const_wide_int (rtx wint) a CONST_DOUBLE (if !TARGET_SUPPORTS_WIDE_INT) or a CONST_WIDE_INT (if TARGET_SUPPORTS_WIDE_INT). */ -rtx -immed_wide_int_const (const wide_int_ref &v, machine_mode mode) +static rtx +immed_wide_int_const_1 (const wide_int_ref &v, machine_mode mode) { unsigned int len = v.get_len (); /* Not scalar_int_mode because we also allow pointer bound modes. */ @@ -714,6 +753,53 @@ immed_double_const (HOST_WIDE_INT i0, HO } #endif +/* Return an rtx representation of C in mode MODE. */ + +rtx +immed_wide_int_const (const poly_wide_int_ref &c, machine_mode mode) +{ + if (c.is_constant ()) + return immed_wide_int_const_1 (c.coeffs[0], mode); + + /* Not scalar_int_mode because we also allow pointer bound modes. */ + unsigned int prec = GET_MODE_PRECISION (as_a (mode)); + + /* Allow truncation but not extension since we do not know if the + number is signed or unsigned. */ + gcc_assert (prec <= c.coeffs[0].get_precision ()); + poly_wide_int newc = poly_wide_int::from (c, prec, SIGNED); + + /* See whether we already have an rtx for this constant. 
*/ + inchash::hash h; + h.add_int (mode); + for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i) + h.add_wide_int (newc.coeffs[i]); + const_poly_int_hasher::compare_type typed_value (mode, newc); + rtx *slot = const_poly_int_htab->find_slot_with_hash (typed_value, + h.end (), INSERT); + rtx x = *slot; + if (x) + return x; + + /* Create a new rtx. There's a choice to be made here between installing + the actual mode of the rtx or leaving it as VOIDmode (for consistency + with CONST_INT). In practice the handling of the codes is different + enough that we get no benefit from using VOIDmode, and various places + assume that VOIDmode implies CONST_INT. Using the real mode seems like + the right long-term direction anyway. */ + typedef trailing_wide_ints twi; + size_t extra_size = twi::extra_size (prec); + x = rtx_alloc_v (CONST_POLY_INT, + sizeof (struct const_poly_int_def) + extra_size); + PUT_MODE (x, mode); + CONST_POLY_INT_COEFFS (x).set_precision (prec); + for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i) + CONST_POLY_INT_COEFFS (x)[i] = newc.coeffs[i]; + + *slot = x; + return x; +} + rtx gen_rtx_REG (machine_mode mode, unsigned int regno) { @@ -1502,7 +1588,8 @@ gen_lowpart_common (machine_mode mode, r } else if (GET_CODE (x) == SUBREG || REG_P (x) || GET_CODE (x) == CONCAT || const_vec_p (x) - || CONST_DOUBLE_AS_FLOAT_P (x) || CONST_SCALAR_INT_P (x)) + || CONST_DOUBLE_AS_FLOAT_P (x) || CONST_SCALAR_INT_P (x) + || CONST_POLY_INT_P (x)) return lowpart_subreg (mode, x, innermode); /* Otherwise, we can't do this. */ @@ -6089,6 +6176,9 @@ init_emit_once (void) #endif const_double_htab = hash_table::create_ggc (37); + if (NUM_POLY_INT_COEFFS > 1) + const_poly_int_htab = hash_table::create_ggc (37); + const_fixed_htab = hash_table::create_ggc (37); reg_attrs_htab = hash_table::create_ggc (37); @@ -6482,7 +6572,7 @@ need_atomic_barrier_p (enum memmodel mod by VALUE bits. 
*/ rtx -gen_int_shift_amount (machine_mode mode, HOST_WIDE_INT value) +gen_int_shift_amount (machine_mode mode, poly_int64 value) { return gen_int_mode (value, get_shift_amount_mode (mode)); } Index: gcc/cse.c =================================================================== --- gcc/cse.c 2017-10-23 16:52:20.579835373 +0100 +++ gcc/cse.c 2017-10-23 17:00:54.436008509 +0100 @@ -2323,6 +2323,15 @@ hash_rtx_cb (const_rtx x, machine_mode m hash += CONST_WIDE_INT_ELT (x, i); return hash; + case CONST_POLY_INT: + { + inchash::hash h; + h.add_int (hash); + for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i) + h.add_wide_int (CONST_POLY_INT_COEFFS (x)[i]); + return h.end (); + } + case CONST_DOUBLE: /* This is like the general case, except that it only counts the integers representing the constant. */ @@ -3781,6 +3790,8 @@ equiv_constant (rtx x) /* See if we previously assigned a constant value to this SUBREG. */ if ((new_rtx = lookup_as_function (x, CONST_INT)) != 0 || (new_rtx = lookup_as_function (x, CONST_WIDE_INT)) != 0 + || (NUM_POLY_INT_COEFFS > 1 + && (new_rtx = lookup_as_function (x, CONST_POLY_INT)) != 0) || (new_rtx = lookup_as_function (x, CONST_DOUBLE)) != 0 || (new_rtx = lookup_as_function (x, CONST_FIXED)) != 0) return new_rtx; Index: gcc/cselib.c =================================================================== --- gcc/cselib.c 2017-10-23 16:52:20.579835373 +0100 +++ gcc/cselib.c 2017-10-23 17:00:54.436008509 +0100 @@ -1128,6 +1128,15 @@ cselib_hash_rtx (rtx x, int create, mach hash += CONST_WIDE_INT_ELT (x, i); return hash; + case CONST_POLY_INT: + { + inchash::hash h; + h.add_int (hash); + for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i) + h.add_wide_int (CONST_POLY_INT_COEFFS (x)[i]); + return h.end (); + } + case CONST_DOUBLE: /* This is like the general case, except that it only counts the integers representing the constant. 
*/ Index: gcc/dwarf2out.c =================================================================== --- gcc/dwarf2out.c 2017-10-23 16:52:20.579835373 +0100 +++ gcc/dwarf2out.c 2017-10-23 17:00:54.439005782 +0100 @@ -13753,6 +13753,9 @@ const_ok_for_output_1 (rtx rtl) return false; } + if (CONST_POLY_INT_P (rtl)) + return false; + if (targetm.const_not_ok_for_debug_p (rtl)) { expansion_failed (NULL_TREE, rtl, Index: gcc/expr.c =================================================================== --- gcc/expr.c 2017-10-23 16:52:20.579835373 +0100 +++ gcc/expr.c 2017-10-23 17:00:54.442003055 +0100 @@ -692,6 +692,7 @@ convert_modes (machine_mode mode, machin && is_int_mode (oldmode, &int_oldmode) && GET_MODE_PRECISION (int_mode) <= GET_MODE_PRECISION (int_oldmode) && ((MEM_P (x) && !MEM_VOLATILE_P (x) && direct_load[(int) int_mode]) + || CONST_POLY_INT_P (x) || (REG_P (x) && (!HARD_REGISTER_P (x) || targetm.hard_regno_mode_ok (REGNO (x), int_mode)) Index: gcc/print-rtl.c =================================================================== --- gcc/print-rtl.c 2017-10-23 16:52:20.579835373 +0100 +++ gcc/print-rtl.c 2017-10-23 17:00:54.443002147 +0100 @@ -898,6 +898,17 @@ rtx_writer::print_rtx (const_rtx in_rtx) fprintf (m_outfile, " "); cwi_output_hex (m_outfile, in_rtx); break; + + case CONST_POLY_INT: + fprintf (m_outfile, " ["); + print_dec (CONST_POLY_INT_COEFFS (in_rtx)[0], m_outfile, SIGNED); + for (unsigned int i = 1; i < NUM_POLY_INT_COEFFS; ++i) + { + fprintf (m_outfile, ", "); + print_dec (CONST_POLY_INT_COEFFS (in_rtx)[i], m_outfile, SIGNED); + } + fprintf (m_outfile, "]"); + break; #endif case CODE_LABEL: @@ -1568,6 +1579,17 @@ print_value (pretty_printer *pp, const_r } break; + case CONST_POLY_INT: + pp_left_bracket (pp); + pp_wide_int (pp, CONST_POLY_INT_COEFFS (x)[0], SIGNED); + for (unsigned int i = 1; i < NUM_POLY_INT_COEFFS; ++i) + { + pp_string (pp, ", "); + pp_wide_int (pp, CONST_POLY_INT_COEFFS (x)[i], SIGNED); + } + pp_right_bracket (pp); + break; + case 
CONST_DOUBLE: if (FLOAT_MODE_P (GET_MODE (x))) { Index: gcc/rtlhash.c =================================================================== --- gcc/rtlhash.c 2017-10-23 16:52:20.579835373 +0100 +++ gcc/rtlhash.c 2017-10-23 17:00:54.444001238 +0100 @@ -55,6 +55,10 @@ add_rtx (const_rtx x, hash &hstate) for (i = 0; i < CONST_WIDE_INT_NUNITS (x); i++) hstate.add_object (CONST_WIDE_INT_ELT (x, i)); return; + case CONST_POLY_INT: + for (i = 0; i < NUM_POLY_INT_COEFFS; ++i) + hstate.add_wide_int (CONST_POLY_INT_COEFFS (x)[i]); + break; case SYMBOL_REF: if (XSTR (x, 0)) hstate.add (XSTR (x, 0), strlen (XSTR (x, 0)) + 1); Index: gcc/explow.c =================================================================== --- gcc/explow.c 2017-10-23 16:52:20.579835373 +0100 +++ gcc/explow.c 2017-10-23 17:00:54.440004873 +0100 @@ -77,13 +77,23 @@ trunc_int_for_mode (HOST_WIDE_INT c, mac return c; } +/* Likewise for polynomial values, using the sign-extended representation + for each individual coefficient. */ + +poly_int64 +trunc_int_for_mode (poly_int64 x, machine_mode mode) +{ + for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i) + x.coeffs[i] = trunc_int_for_mode (x.coeffs[i], mode); + return x; +} + /* Return an rtx for the sum of X and the integer C, given that X has mode MODE. INPLACE is true if X can be modified inplace or false if it must be treated as immutable. 
*/ rtx -plus_constant (machine_mode mode, rtx x, HOST_WIDE_INT c, - bool inplace) +plus_constant (machine_mode mode, rtx x, poly_int64 c, bool inplace) { RTX_CODE code; rtx y; @@ -92,7 +102,7 @@ plus_constant (machine_mode mode, rtx x, gcc_assert (GET_MODE (x) == VOIDmode || GET_MODE (x) == mode); - if (c == 0) + if (known_zero (c)) return x; restart: @@ -180,10 +190,12 @@ plus_constant (machine_mode mode, rtx x, break; default: + if (CONST_POLY_INT_P (x)) + return immed_wide_int_const (const_poly_int_value (x) + c, mode); break; } - if (c != 0) + if (maybe_nonzero (c)) x = gen_rtx_PLUS (mode, x, gen_int_mode (c, mode)); if (GET_CODE (x) == SYMBOL_REF || GET_CODE (x) == LABEL_REF) Index: gcc/expmed.h =================================================================== --- gcc/expmed.h 2017-10-23 16:52:20.579835373 +0100 +++ gcc/expmed.h 2017-10-23 17:00:54.441003964 +0100 @@ -712,8 +712,8 @@ extern unsigned HOST_WIDE_INT choose_mul #ifdef TREE_CODE extern rtx expand_variable_shift (enum tree_code, machine_mode, rtx, tree, rtx, int); -extern rtx expand_shift (enum tree_code, machine_mode, rtx, int, rtx, - int); +extern rtx expand_shift (enum tree_code, machine_mode, rtx, poly_int64, rtx, + int); extern rtx expand_divmod (int, enum tree_code, machine_mode, rtx, rtx, rtx, int); #endif Index: gcc/expmed.c =================================================================== --- gcc/expmed.c 2017-10-23 16:52:20.579835373 +0100 +++ gcc/expmed.c 2017-10-23 17:00:54.441003964 +0100 @@ -2541,7 +2541,7 @@ expand_shift_1 (enum tree_code code, mac rtx expand_shift (enum tree_code code, machine_mode mode, rtx shifted, - int amount, rtx target, int unsignedp) + poly_int64 amount, rtx target, int unsignedp) { return expand_shift_1 (code, mode, shifted, gen_int_shift_amount (mode, amount), Index: gcc/rtlanal.c =================================================================== --- gcc/rtlanal.c 2017-10-23 16:52:20.579835373 +0100 +++ gcc/rtlanal.c 2017-10-23 17:00:54.444001238 +0100 
@@ -915,6 +915,28 @@ split_const (rtx x, rtx *base_out, rtx *
   *base_out = x;
   *offset_out = const0_rtx;
 }
+
+/* Express integer value X as some value Y plus a polynomial offset,
+   where Y is either const0_rtx, X or something within X (as opposed
+   to a new rtx).  Return the Y and store the offset in *OFFSET_OUT.  */
+
+rtx
+strip_offset (rtx x, poly_int64_pod *offset_out)
+{
+  rtx base = const0_rtx;
+  rtx test = x;
+  if (GET_CODE (test) == CONST)
+    test = XEXP (test, 0);
+  if (GET_CODE (test) == PLUS)
+    {
+      base = XEXP (test, 0);
+      test = XEXP (test, 1);
+    }
+  if (poly_int_rtx_p (test, offset_out))
+    return base;
+  *offset_out = 0;
+  return x;
+}
 
 /* Return the number of places FIND appears within X.  If COUNT_DEST is
    zero, we do not count occurrences inside the destination of a SET.  */
@@ -3406,13 +3428,15 @@ commutative_operand_precedence (rtx op)
 
   /* Constants always become the second operand.  Prefer "nice" constants.  */
   if (code == CONST_INT)
-    return -8;
+    return -10;
   if (code == CONST_WIDE_INT)
-    return -7;
+    return -9;
+  if (code == CONST_POLY_INT)
+    return -8;
   if (code == CONST_DOUBLE)
-    return -7;
+    return -8;
   if (code == CONST_FIXED)
-    return -7;
+    return -8;
   op = avoid_constant_pool_reference (op);
   code = GET_CODE (op);
@@ -3420,13 +3444,15 @@ commutative_operand_precedence (rtx op)
     {
     case RTX_CONST_OBJ:
       if (code == CONST_INT)
-	return -6;
+	return -7;
       if (code == CONST_WIDE_INT)
-	return -6;
+	return -6;
+      if (code == CONST_POLY_INT)
+	return -5;
       if (code == CONST_DOUBLE)
-	return -5;
+	return -5;
       if (code == CONST_FIXED)
-	return -5;
+	return -5;
       return -4;
 
     case RTX_EXTRA:
Index: gcc/rtl-tests.c
===================================================================
--- gcc/rtl-tests.c	2017-10-23 16:52:20.579835373 +0100
+++ gcc/rtl-tests.c	2017-10-23 17:00:54.443002147 +0100
@@ -228,6 +228,62 @@ test_uncond_jump ()
 		    jump_insn);
 }
 
+template<unsigned int N>
+struct const_poly_int_tests
+{
+  static void run ();
+};
+
+template<>
+struct const_poly_int_tests<1>
+{
+  static void run () {}
+};
+
+/* Test various CONST_POLY_INT properties.  */
+
+template<unsigned int N>
+void
+const_poly_int_tests<N>::run ()
+{
+  rtx x1 = gen_int_mode (poly_int64 (1, 1), QImode);
+  rtx x255 = gen_int_mode (poly_int64 (1, 255), QImode);
+
+  /* Test that constants are unique.  */
+  ASSERT_EQ (x1, gen_int_mode (poly_int64 (1, 1), QImode));
+  ASSERT_NE (x1, gen_int_mode (poly_int64 (1, 1), HImode));
+  ASSERT_NE (x1, x255);
+
+  /* Test const_poly_int_value.  */
+  ASSERT_MUST_EQ (const_poly_int_value (x1), poly_int64 (1, 1));
+  ASSERT_MUST_EQ (const_poly_int_value (x255), poly_int64 (1, -1));
+
+  /* Test rtx_to_poly_int64.  */
+  ASSERT_MUST_EQ (rtx_to_poly_int64 (x1), poly_int64 (1, 1));
+  ASSERT_MUST_EQ (rtx_to_poly_int64 (x255), poly_int64 (1, -1));
+  ASSERT_MAY_NE (rtx_to_poly_int64 (x255), poly_int64 (1, 255));
+
+  /* Test plus_constant of a symbol.  */
+  rtx symbol = gen_rtx_SYMBOL_REF (Pmode, "foo");
+  rtx offset1 = gen_int_mode (poly_int64 (9, 11), Pmode);
+  rtx sum1 = gen_rtx_CONST (Pmode, gen_rtx_PLUS (Pmode, symbol, offset1));
+  ASSERT_RTX_EQ (plus_constant (Pmode, symbol, poly_int64 (9, 11)), sum1);
+
+  /* Test plus_constant of a CONST.  */
+  rtx offset2 = gen_int_mode (poly_int64 (12, 20), Pmode);
+  rtx sum2 = gen_rtx_CONST (Pmode, gen_rtx_PLUS (Pmode, symbol, offset2));
+  ASSERT_RTX_EQ (plus_constant (Pmode, sum1, poly_int64 (3, 9)), sum2);
+
+  /* Test a cancelling plus_constant.  */
+  ASSERT_EQ (plus_constant (Pmode, sum2, poly_int64 (-12, -20)), symbol);
+
+  /* Test plus_constant on integer constants.  */
+  ASSERT_EQ (plus_constant (QImode, const1_rtx, poly_int64 (4, -2)),
+	     gen_int_mode (poly_int64 (5, -2), QImode));
+  ASSERT_EQ (plus_constant (QImode, x1, poly_int64 (4, -2)),
+	     gen_int_mode (poly_int64 (5, -1), QImode));
+}
+
 /* Run all of the selftests within this file.  */
 
 void
@@ -238,6 +294,7 @@ rtl_tests_c_tests ()
   test_dumping_rtx_reuse ();
   test_single_set ();
   test_uncond_jump ();
+  const_poly_int_tests<NUM_POLY_INT_COEFFS>::run ();
 
   /* Purge state.  */
   set_first_insn (NULL);
Index: gcc/simplify-rtx.c
===================================================================
--- gcc/simplify-rtx.c	2017-10-23 16:52:20.579835373 +0100
+++ gcc/simplify-rtx.c	2017-10-23 17:00:54.445000329 +0100
@@ -2039,6 +2039,26 @@ simplify_const_unary_operation (enum rtx
 	}
     }
 
+  /* Handle polynomial integers.  */
+  else if (CONST_POLY_INT_P (op))
+    {
+      poly_wide_int result;
+      switch (code)
+	{
+	case NEG:
+	  result = -const_poly_int_value (op);
+	  break;
+
+	case NOT:
+	  result = ~const_poly_int_value (op);
+	  break;
+
+	default:
+	  return NULL_RTX;
+	}
+      return immed_wide_int_const (result, mode);
+    }
+
   return NULL_RTX;
 }
 
@@ -2219,6 +2239,7 @@ simplify_binary_operation_1 (enum rtx_co
   rtx tem, reversed, opleft, opright, elt0, elt1;
   HOST_WIDE_INT val;
   scalar_int_mode int_mode, inner_mode;
+  poly_int64 offset;
 
   /* Even if we can't compute a constant result, there are some cases
      worth simplifying.  */
@@ -2531,6 +2552,12 @@ simplify_binary_operation_1 (enum rtx_co
 	  return simplify_gen_binary (MINUS, mode, tem, XEXP (op0, 0));
 	}
 
+      if ((GET_CODE (op0) == CONST
+	   || GET_CODE (op0) == SYMBOL_REF
+	   || GET_CODE (op0) == LABEL_REF)
+	  && poly_int_rtx_p (op1, &offset))
+	return plus_constant (mode, op0, trunc_int_for_mode (-offset, mode));
+
       /* Don't let a relocatable value get a negative coeff.  */
       if (CONST_INT_P (op1) && GET_MODE (op0) != VOIDmode)
 	return simplify_gen_binary (PLUS, mode,
@@ -4325,6 +4352,57 @@ simplify_const_binary_operation (enum rt
       return immed_wide_int_const (result, int_mode);
     }
 
+  /* Handle polynomial integers.  */
+  if (NUM_POLY_INT_COEFFS > 1
+      && is_a <scalar_int_mode> (mode, &int_mode)
+      && poly_int_rtx_p (op0)
+      && poly_int_rtx_p (op1))
+    {
+      poly_wide_int result;
+      switch (code)
+	{
+	case PLUS:
+	  result = wi::to_poly_wide (op0, mode) + wi::to_poly_wide (op1, mode);
+	  break;
+
+	case MINUS:
+	  result = wi::to_poly_wide (op0, mode) - wi::to_poly_wide (op1, mode);
+	  break;
+
+	case MULT:
+	  if (CONST_SCALAR_INT_P (op1))
+	    result = wi::to_poly_wide (op0, mode) * rtx_mode_t (op1, mode);
+	  else
+	    return NULL_RTX;
+	  break;
+
+	case ASHIFT:
+	  if (CONST_SCALAR_INT_P (op1))
+	    {
+	      wide_int shift = rtx_mode_t (op1, mode);
+	      if (SHIFT_COUNT_TRUNCATED)
+		shift = wi::umod_trunc (shift, GET_MODE_PRECISION (int_mode));
+	      else if (wi::geu_p (shift, GET_MODE_PRECISION (int_mode)))
+		return NULL_RTX;
+	      result = wi::to_poly_wide (op0, mode) << shift;
+	    }
+	  else
+	    return NULL_RTX;
+	  break;
+
+	case IOR:
+	  if (!CONST_SCALAR_INT_P (op1)
+	      || !can_ior_p (wi::to_poly_wide (op0, mode),
+			     rtx_mode_t (op1, mode), &result))
+	    return NULL_RTX;
+	  break;
+
+	default:
+	  return NULL_RTX;
+	}
+      return immed_wide_int_const (result, int_mode);
+    }
+
   return NULL_RTX;
 }
 
@@ -6317,13 +6395,27 @@ simplify_subreg (machine_mode outermode,
   scalar_int_mode int_outermode, int_innermode;
   if (is_a <scalar_int_mode> (outermode, &int_outermode)
       && is_a <scalar_int_mode> (innermode, &int_innermode)
-      && (GET_MODE_PRECISION (int_outermode)
-	  < GET_MODE_PRECISION (int_innermode))
       && byte == subreg_lowpart_offset (int_outermode, int_innermode))
     {
-      rtx tem = simplify_truncation (int_outermode, op, int_innermode);
-      if (tem)
-	return tem;
+      /* Handle polynomial integers.  The upper bits of a paradoxical
+	 subreg are undefined, so this is safe regardless of whether
+	 we're truncating or extending.  */
+      if (CONST_POLY_INT_P (op))
+	{
+	  poly_wide_int val
+	    = poly_wide_int::from (const_poly_int_value (op),
+				   GET_MODE_PRECISION (int_outermode),
+				   SIGNED);
+	  return immed_wide_int_const (val, int_outermode);
+	}
+
+      if (GET_MODE_PRECISION (int_outermode)
+	  < GET_MODE_PRECISION (int_innermode))
+	{
+	  rtx tem = simplify_truncation (int_outermode, op, int_innermode);
+	  if (tem)
+	    return tem;
+	}
     }
 
   return NULL_RTX;
@@ -6629,12 +6721,60 @@ test_vector_ops ()
     }
 }
 
+template<unsigned int N>
+struct simplify_const_poly_int_tests
+{
+  static void run ();
+};
+
+template<>
+struct simplify_const_poly_int_tests<1>
+{
+  static void run () {}
+};
+
+/* Test various CONST_POLY_INT properties.  */
+
+template<unsigned int N>
+void
+simplify_const_poly_int_tests<N>::run ()
+{
+  rtx x1 = gen_int_mode (poly_int64 (1, 1), QImode);
+  rtx x2 = gen_int_mode (poly_int64 (-80, 127), QImode);
+  rtx x3 = gen_int_mode (poly_int64 (-79, -128), QImode);
+  rtx x4 = gen_int_mode (poly_int64 (5, 4), QImode);
+  rtx x5 = gen_int_mode (poly_int64 (30, 24), QImode);
+  rtx x6 = gen_int_mode (poly_int64 (20, 16), QImode);
+  rtx x7 = gen_int_mode (poly_int64 (7, 4), QImode);
+  rtx x8 = gen_int_mode (poly_int64 (30, 24), HImode);
+  rtx x9 = gen_int_mode (poly_int64 (-30, -24), HImode);
+  rtx x10 = gen_int_mode (poly_int64 (-31, -24), HImode);
+  rtx two = GEN_INT (2);
+  rtx six = GEN_INT (6);
+  HOST_WIDE_INT offset = subreg_lowpart_offset (QImode, HImode);
+
+  /* These tests only try limited operation combinations.  Fuller arithmetic
+     testing is done directly on poly_ints.  */
+  ASSERT_EQ (simplify_unary_operation (NEG, HImode, x8, HImode), x9);
+  ASSERT_EQ (simplify_unary_operation (NOT, HImode, x8, HImode), x10);
+  ASSERT_EQ (simplify_unary_operation (TRUNCATE, QImode, x8, HImode), x5);
+  ASSERT_EQ (simplify_binary_operation (PLUS, QImode, x1, x2), x3);
+  ASSERT_EQ (simplify_binary_operation (MINUS, QImode, x3, x1), x2);
+  ASSERT_EQ (simplify_binary_operation (MULT, QImode, x4, six), x5);
+  ASSERT_EQ (simplify_binary_operation (MULT, QImode, six, x4), x5);
+  ASSERT_EQ (simplify_binary_operation (ASHIFT, QImode, x4, two), x6);
+  ASSERT_EQ (simplify_binary_operation (IOR, QImode, x4, two), x7);
+  ASSERT_EQ (simplify_subreg (HImode, x5, QImode, 0), x8);
+  ASSERT_EQ (simplify_subreg (QImode, x8, HImode, offset), x5);
+}
+
 /* Run all of the selftests within this file.  */
 
 void
 simplify_rtx_c_tests ()
 {
   test_vector_ops ();
+  simplify_const_poly_int_tests<NUM_POLY_INT_COEFFS>::run ();
 }
 
 } // namespace selftest
Index: gcc/wide-int.h
===================================================================
--- gcc/wide-int.h	2017-10-23 17:00:20.923835582 +0100
+++ gcc/wide-int.h	2017-10-23 17:00:54.445999420 +0100
@@ -613,6 +613,7 @@ #define SHIFT_FUNCTION \
    access.  */
 struct storage_ref
 {
+  storage_ref () {}
   storage_ref (const HOST_WIDE_INT *, unsigned int, unsigned int);
 
   const HOST_WIDE_INT *val;
@@ -944,6 +945,8 @@ struct wide_int_ref_storage : public wi:
   HOST_WIDE_INT scratch[2];
 
 public:
+  wide_int_ref_storage () {}
+
   wide_int_ref_storage (const wi::storage_ref &);
 
   template <typename T>
@@ -1323,7 +1326,7 @@ typedef generic_wide_int
 template <int N>
-class GTY(()) trailing_wide_ints
+class GTY((user)) trailing_wide_ints
 {
 private:
   /* The shared precision of each number.  */
@@ -1340,9 +1343,14 @@ class GTY(()) trailing_wide_ints
   HOST_WIDE_INT m_val[1];
 
 public:
+  typedef WIDE_INT_REF_FOR (trailing_wide_int_storage) const_reference;
+
   void set_precision (unsigned int);
+  unsigned int get_precision () const { return m_precision; }
   trailing_wide_int operator [] (unsigned int);
+  const_reference operator [] (unsigned int) const;
   static size_t extra_size (unsigned int);
+  size_t extra_size () const { return extra_size (m_precision); }
 };
 
 inline trailing_wide_int_storage::
@@ -1414,6 +1422,14 @@ trailing_wide_ints <N>::operator [] (uns
 				    &m_val[index * m_max_len]);
 }
 
+template <int N>
+inline typename trailing_wide_ints <N>::const_reference
+trailing_wide_ints <N>::operator [] (unsigned int index) const
+{
+  return wi::storage_ref (&m_val[index * m_max_len],
+			  m_len[index], m_precision);
+}
+
 /* Return how many extra bytes need to be added to the end of the
    structure in order to handle N wide_ints of precision PRECISION.  */
 template <int N>