From: Jay Foad <jay.foad@gmail.com>
To: Richard Guenther <rguenther@suse.de>
Cc: gcc-patches@gcc.gnu.org
Subject: Re: [PATCH][RFC][1/2] Bitfield lowering, add BIT_FIELD_EXPR
Date: Thu, 16 Jun 2011 12:06:00 -0000
Message-ID: <BANLkTin5m7BOc+OSc_2SmdaEag82Ybzmbg@mail.gmail.com>
In-Reply-To: <alpine.LNX.2.00.1106161328310.810@zhemvz.fhfr.qr>
> BIT_FIELD_EXPR <a, b, C1, C2> is equivalent to computing
> a & ~((1 << C1 - 1) << C2) | ((b << C2) & (1 << C1 = 1)),
a & ~(((1 << C1) - 1) << C2) | ((b & ((1 << C1) - 1)) << C2)
?
Jay.
> thus
> inserting b of width C1 at the bitfield position C2 in a, returning
> the new value. This allows translating
> BIT_FIELD_REF <a, C1, C2> = b;
> to
> a = BIT_FIELD_EXPR <a, b, C1, C2>;
> which avoids partial definitions of a (thus, BIT_FIELD_EXPR is
> similar to COMPLEX_EXPR). BIT_FIELD_EXPR is supposed to work
> on registers only.
>
> Comments welcome, esp. on how to avoid introducing quaternary
> RHS on gimple stmts (or using a GIMPLE_SINGLE_RHS as the patch does).
>
> Folders/combiners are missing to handle some of the easy
> BIT_FIELD_REF / BIT_FIELD_EXPR cases, as well as eventually
> re-writing shift/mask operations to BIT_FIELD_REF/EXPR.
>
> Richard.
>
> 2011-06-16 Richard Guenther <rguenther@suse.de>
>
> * expr.c (expand_expr_real_1): Handle BIT_FIELD_EXPR.
> * fold-const.c (operand_equal_p): Likewise.
> (build_bit_mask): New function.
> (fold_quaternary_loc): Likewise.
> (fold): Call it.
> (fold_build4_stat_loc): New function.
> * gimplify.c (gimplify_expr): Handle BIT_FIELD_EXPR.
> * tree-inline.c (estimate_operator_cost): Likewise.
> * tree-pretty-print.c (dump_generic_node): Likewise.
> * tree-ssa-operands.c (get_expr_operands): Likewise.
> * tree.def (BIT_FIELD_EXPR): New tree code.
> * tree.h (build_bit_mask): Declare.
> (fold_quaternary): Define.
> (fold_quaternary_loc): Declare.
> (fold_build4): Define.
> (fold_build4_loc): Likewise.
> (fold_build4_stat_loc): Declare.
> * gimple.c (gimple_rhs_class_table): Handle BIT_FIELD_EXPR.
>
> Index: trunk/gcc/expr.c
> ===================================================================
> *** trunk.orig/gcc/expr.c 2011-06-15 13:27:40.000000000 +0200
> --- trunk/gcc/expr.c 2011-06-15 15:08:41.000000000 +0200
> *************** expand_expr_real_1 (tree exp, rtx target
> *** 8680,8685 ****
> --- 8680,8708 ----
>
> return expand_constructor (exp, target, modifier, false);
>
> + case BIT_FIELD_EXPR:
> + {
> + unsigned bitpos = (unsigned) TREE_INT_CST_LOW (TREE_OPERAND (exp, 3));
> + unsigned bitsize = (unsigned) TREE_INT_CST_LOW (TREE_OPERAND (exp, 2));
> + tree bits, mask;
> + if (BYTES_BIG_ENDIAN)
> + bitpos = TYPE_PRECISION (type) - bitsize - bitpos;
> + /* Build a mask to mask/clear the bits in the word.  */
> + mask = build_bit_mask (type, bitsize, bitpos);
> + /* Extend the bits to the word type, shift them to the right
> + place and mask the bits.  */
> + bits = fold_convert (type, TREE_OPERAND (exp, 1));
> + bits = fold_build2 (BIT_AND_EXPR, type,
> + fold_build2 (LSHIFT_EXPR, type,
> + bits, size_int (bitpos)), mask);
> + /* Switch to the clear mask and do the composition.  */
> + mask = fold_build1 (BIT_NOT_EXPR, type, mask);
> + return expand_normal (fold_build2 (BIT_IOR_EXPR, type,
> + fold_build2 (BIT_AND_EXPR, type,
> + TREE_OPERAND (exp, 0), mask),
> + bits));
> + }
> +
> case TARGET_MEM_REF:
> {
> addr_space_t as = TYPE_ADDR_SPACE (TREE_TYPE (exp));
> Index: trunk/gcc/fold-const.c
> ===================================================================
> *** trunk.orig/gcc/fold-const.c 2011-06-15 14:51:31.000000000 +0200
> --- trunk/gcc/fold-const.c 2011-06-15 15:33:04.000000000 +0200
> *************** operand_equal_p (const_tree arg0, const_
> *** 2667,2672 ****
> --- 2667,2675 ----
> case DOT_PROD_EXPR:
> return OP_SAME (0) && OP_SAME (1) && OP_SAME (2);
>
> + case BIT_FIELD_EXPR:
> + return OP_SAME (0) && OP_SAME (1) && OP_SAME (2) && OP_SAME (3);
> +
> default:
> return 0;
> }
> *************** contains_label_p (tree st)
> *** 13230,13235 ****
> --- 13233,13251 ----
> (walk_tree_without_duplicates (&st, contains_label_1 , NULL) != NULL_TREE);
> }
>
> + /* Builds and returns a mask of integral type TYPE for masking out
> + BITSIZE bits at bit position BITPOS in a word of type TYPE.
> + The mask has the bits set from bit BITPOS to BITPOS + BITSIZE - 1. */
> +
> + tree
> + build_bit_mask (tree type, unsigned int bitsize, unsigned int bitpos)
> + {
> + tree mask = double_int_to_tree (type, double_int_mask (bitsize));
> + mask = const_binop (LSHIFT_EXPR, mask, size_int (bitpos));
> +
> + return mask;
> + }
> +
> /* Fold a ternary expression of code CODE and type TYPE with operands
> OP0, OP1, and OP2. Return the folded expression if folding is
> successful. Otherwise, return NULL_TREE. */
> *************** fold_ternary_loc (location_t loc, enum t
> *** 13591,13596 ****
> --- 13607,13685 ----
> } /* switch (code) */
> }
>
> + /* Fold a quaternary expression of code CODE and type TYPE with operands
> + OP0, OP1, OP2, and OP3. Return the folded expression if folding is
> + successful. Otherwise, return NULL_TREE. */
> +
> + tree
> + fold_quaternary_loc (location_t loc, enum tree_code code, tree type,
> + tree op0, tree op1, tree op2, tree op3)
> + {
> + tree arg0 = NULL_TREE, arg1 = NULL_TREE;
> + enum tree_code_class kind = TREE_CODE_CLASS (code);
> +
> + gcc_assert (IS_EXPR_CODE_CLASS (kind)
> + && TREE_CODE_LENGTH (code) == 4);
> +
> + /* Strip any conversions that don't change the mode. This is safe
> + for every expression, except for a comparison expression because
> + its signedness is derived from its operands. So, in the latter
> + case, only strip conversions that don't change the signedness.
> +
> + Note that this is done as an internal manipulation within the
> + constant folder, in order to find the simplest representation of
> + the arguments so that their form can be studied. In any case,
> + the appropriate type conversions should be put back in the tree
> + that will get out of the constant folder. */
> + if (op0)
> + {
> + arg0 = op0;
> + STRIP_NOPS (arg0);
> + }
> +
> + if (op1)
> + {
> + arg1 = op1;
> + STRIP_NOPS (arg1);
> + }
> +
> + switch (code)
> + {
> + case BIT_FIELD_EXPR:
> + /* Constant fold BIT_FIELD_EXPR. */
> + if (TREE_CODE (arg0) == INTEGER_CST
> + || TREE_CODE (arg1) == INTEGER_CST)
> + {
> + unsigned bitpos = (unsigned) TREE_INT_CST_LOW (op3);
> + unsigned bitsize = (unsigned) TREE_INT_CST_LOW (op2);
> + tree bits, mask;
> + if (BYTES_BIG_ENDIAN)
> + bitpos = TYPE_PRECISION (type) - bitsize - bitpos;
> + /* Build a mask to mask/clear the bits in the word.  */
> + mask = build_bit_mask (type, bitsize, bitpos);
> + /* Extend the bits to the word type, shift them to the right
> + place and mask the bits.  */
> + bits = fold_convert_loc (loc, type, arg1);
> + bits = fold_build2_loc (loc, BIT_AND_EXPR, type,
> + fold_build2_loc (loc, LSHIFT_EXPR, type,
> + bits, size_int (bitpos)),
> + mask);
> + /* Switch to the clear mask and do the composition.  */
> + mask = fold_build1_loc (loc, BIT_NOT_EXPR, type, mask);
> + return fold_build2_loc (loc, BIT_IOR_EXPR, type,
> + fold_build2_loc (loc, BIT_AND_EXPR, type,
> + fold_convert (type, arg0),
> + mask),
> + bits);
> + }
> +
> + return NULL_TREE;
> +
> + default:
> + return NULL_TREE;
> + }
> + }
> +
> /* Perform constant folding and related simplification of EXPR.
> The related simplifications include x*1 => x, x*0 => 0, etc.,
> and application of the associative law.
> *************** fold (tree expr)
> *** 13632,13638 ****
> if (IS_EXPR_CODE_CLASS (kind))
> {
> tree type = TREE_TYPE (t);
> ! tree op0, op1, op2;
>
> switch (TREE_CODE_LENGTH (code))
> {
> --- 13721,13727 ----
> if (IS_EXPR_CODE_CLASS (kind))
> {
> tree type = TREE_TYPE (t);
> ! tree op0, op1, op2, op3;
>
> switch (TREE_CODE_LENGTH (code))
> {
> *************** fold (tree expr)
> *** 13651,13656 ****
> --- 13740,13752 ----
> op2 = TREE_OPERAND (t, 2);
> tem = fold_ternary_loc (loc, code, type, op0, op1, op2);
> return tem ? tem : expr;
> + case 4:
> + op0 = TREE_OPERAND (t, 0);
> + op1 = TREE_OPERAND (t, 1);
> + op2 = TREE_OPERAND (t, 2);
> + op3 = TREE_OPERAND (t, 3);
> + tem = fold_quaternary_loc (loc, code, type, op0, op1, op2, op3);
> + return tem ? tem : expr;
> default:
> break;
> }
> *************** fold_build3_stat_loc (location_t loc, en
> *** 14107,14112 ****
> --- 14203,14294 ----
> return tem;
> }
>
> + /* Fold a quaternary tree expression with code CODE of type TYPE with
> + operands OP0, OP1, OP2, and OP3. Return a folded expression if
> + successful. Otherwise, return a tree expression with code CODE of
> + type TYPE with operands OP0, OP1, OP2, and OP3. */
> +
> + tree
> + fold_build4_stat_loc (location_t loc, enum tree_code code, tree type,
> + tree op0, tree op1, tree op2, tree op3 MEM_STAT_DECL)
> + {
> + tree tem;
> + #ifdef ENABLE_FOLD_CHECKING
> + unsigned char checksum_before_op0[16],
> + checksum_before_op1[16],
> + checksum_before_op2[16],
> + checksum_before_op3[16],
> + checksum_after_op0[16],
> + checksum_after_op1[16],
> + checksum_after_op2[16],
> + checksum_after_op3[16];
> + struct md5_ctx ctx;
> + htab_t ht;
> +
> + ht = htab_create (32, htab_hash_pointer, htab_eq_pointer, NULL);
> + md5_init_ctx (&ctx);
> + fold_checksum_tree (op0, &ctx, ht);
> + md5_finish_ctx (&ctx, checksum_before_op0);
> + htab_empty (ht);
> +
> + md5_init_ctx (&ctx);
> + fold_checksum_tree (op1, &ctx, ht);
> + md5_finish_ctx (&ctx, checksum_before_op1);
> + htab_empty (ht);
> +
> + md5_init_ctx (&ctx);
> + fold_checksum_tree (op2, &ctx, ht);
> + md5_finish_ctx (&ctx, checksum_before_op2);
> + htab_empty (ht);
> +
> + md5_init_ctx (&ctx);
> + fold_checksum_tree (op3, &ctx, ht);
> + md5_finish_ctx (&ctx, checksum_before_op3);
> + htab_empty (ht);
> + #endif
> +
> + gcc_assert (TREE_CODE_CLASS (code) != tcc_vl_exp);
> + tem = fold_quaternary_loc (loc, code, type, op0, op1, op2, op3);
> + if (!tem)
> + tem = build4_stat_loc (loc, code, type, op0, op1, op2, op3 PASS_MEM_STAT);
> +
> + #ifdef ENABLE_FOLD_CHECKING
> + md5_init_ctx (&ctx);
> + fold_checksum_tree (op0, &ctx, ht);
> + md5_finish_ctx (&ctx, checksum_after_op0);
> + htab_empty (ht);
> +
> + if (memcmp (checksum_before_op0, checksum_after_op0, 16))
> + fold_check_failed (op0, tem);
> +
> + md5_init_ctx (&ctx);
> + fold_checksum_tree (op1, &ctx, ht);
> + md5_finish_ctx (&ctx, checksum_after_op1);
> + htab_empty (ht);
> +
> + if (memcmp (checksum_before_op1, checksum_after_op1, 16))
> + fold_check_failed (op1, tem);
> +
> + md5_init_ctx (&ctx);
> + fold_checksum_tree (op2, &ctx, ht);
> + md5_finish_ctx (&ctx, checksum_after_op2);
> + htab_empty (ht);
> +
> + if (memcmp (checksum_before_op2, checksum_after_op2, 16))
> + fold_check_failed (op2, tem);
> +
> + md5_init_ctx (&ctx);
> + fold_checksum_tree (op3, &ctx, ht);
> + md5_finish_ctx (&ctx, checksum_after_op3);
> + htab_delete (ht);
> +
> + if (memcmp (checksum_before_op3, checksum_after_op3, 16))
> + fold_check_failed (op3, tem);
> + #endif
> + return tem;
> + }
> +
> +
> /* Fold a CALL_EXPR expression of type TYPE with operands FN and NARGS
> arguments in ARGARRAY, and a null static chain.
> Return a folded expression if successful. Otherwise, return a CALL_EXPR
> Index: trunk/gcc/gimplify.c
> ===================================================================
> *** trunk.orig/gcc/gimplify.c 2011-06-10 13:28:11.000000000 +0200
> --- trunk/gcc/gimplify.c 2011-06-15 15:05:55.000000000 +0200
> *************** gimplify_expr (tree *expr_p, gimple_seq
> *** 7239,7244 ****
> --- 7239,7248 ----
> /* Classified as tcc_expression. */
> goto expr_3;
>
> + case BIT_FIELD_EXPR:
> + /* Operands 2 and 3 are constants.  */
> + goto expr_2;
> +
> case POINTER_PLUS_EXPR:
> /* Convert ((type *)A)+offset into &A->field_of_type_and_offset.
> The second is gimple immediate saving a need for extra statement.
> Index: trunk/gcc/tree-inline.c
> ===================================================================
> *** trunk.orig/gcc/tree-inline.c 2011-06-07 14:41:22.000000000 +0200
> --- trunk/gcc/tree-inline.c 2011-06-15 14:56:09.000000000 +0200
> *************** estimate_operator_cost (enum tree_code c
> *** 3357,3362 ****
> --- 3357,3366 ----
> return weights->div_mod_cost;
> return 1;
>
> + /* Bit-field insertion needs several shift and mask operations. */
> + case BIT_FIELD_EXPR:
> + return 3;
> +
> default:
> /* We expect a copy assignment with no operator. */
> gcc_assert (get_gimple_rhs_class (code) == GIMPLE_SINGLE_RHS);
> Index: trunk/gcc/tree-pretty-print.c
> ===================================================================
> *** trunk.orig/gcc/tree-pretty-print.c 2011-06-07 14:41:22.000000000 +0200
> --- trunk/gcc/tree-pretty-print.c 2011-06-15 15:07:05.000000000 +0200
> *************** dump_generic_node (pretty_printer *buffe
> *** 1217,1222 ****
> --- 1217,1234 ----
> pp_string (buffer, ">");
> break;
>
> + case BIT_FIELD_EXPR:
> + pp_string (buffer, "BIT_FIELD_EXPR <");
> + dump_generic_node (buffer, TREE_OPERAND (node, 0), spc, flags, false);
> + pp_string (buffer, ", ");
> + dump_generic_node (buffer, TREE_OPERAND (node, 1), spc, flags, false);
> + pp_string (buffer, ", ");
> + dump_generic_node (buffer, TREE_OPERAND (node, 2), spc, flags, false);
> + pp_string (buffer, ", ");
> + dump_generic_node (buffer, TREE_OPERAND (node, 3), spc, flags, false);
> + pp_string (buffer, ">");
> + break;
> +
> case ARRAY_REF:
> case ARRAY_RANGE_REF:
> op0 = TREE_OPERAND (node, 0);
> Index: trunk/gcc/tree-ssa-operands.c
> ===================================================================
> *** trunk.orig/gcc/tree-ssa-operands.c 2011-04-18 11:06:42.000000000 +0200
> --- trunk/gcc/tree-ssa-operands.c 2011-06-15 14:54:12.000000000 +0200
> *************** get_expr_operands (gimple stmt, tree *ex
> *** 974,979 ****
> --- 974,984 ----
> get_expr_operands (stmt, &TREE_OPERAND (expr, 0), flags);
> return;
>
> + case BIT_FIELD_EXPR:
> + gcc_assert (TREE_CODE (TREE_OPERAND (expr, 2)) == INTEGER_CST
> + && TREE_CODE (TREE_OPERAND (expr, 3)) == INTEGER_CST);
> + /* Fallthru. */
> +
> case TRUTH_AND_EXPR:
> case TRUTH_OR_EXPR:
> case TRUTH_XOR_EXPR:
> Index: trunk/gcc/tree.def
> ===================================================================
> *** trunk.orig/gcc/tree.def 2011-05-11 11:09:42.000000000 +0200
> --- trunk/gcc/tree.def 2011-06-15 14:53:17.000000000 +0200
> *************** DEFTREECODE (ADDR_EXPR, "addr_expr", tcc
> *** 784,789 ****
> --- 784,801 ----
> descriptor of type ptr_mode. */
> DEFTREECODE (FDESC_EXPR, "fdesc_expr", tcc_expression, 2)
>
> + /* Given a word, a value, and a bitfield position and size within
> + the word, produce the value that results from replacing the
> + described part of the word with the value.
> + Operand 0 is a tree for the word, of integral type;
> + Operand 1 is a tree for the value, of integral type;
> + Operand 2 is a tree giving the constant number of bits being
> + referenced, which is less than or equal to the precision of the value;
> + Operand 3 is a tree giving the constant position of the first referenced
> + bit, such that the sum of operands 2 and 3 is less than or equal to the
> + precision of the word. */
> + DEFTREECODE (BIT_FIELD_EXPR, "bitfield_expr", tcc_expression, 4)
> +
> /* Given two real or integer operands of the same type,
> returns a complex value of the corresponding complex type. */
> DEFTREECODE (COMPLEX_EXPR, "complex_expr", tcc_binary, 2)
> Index: trunk/gcc/tree.h
> ===================================================================
> *** trunk.orig/gcc/tree.h 2011-06-15 13:27:06.000000000 +0200
> --- trunk/gcc/tree.h 2011-06-15 15:19:59.000000000 +0200
> *************** extern bool is_typedef_decl (tree x);
> *** 5091,5096 ****
> --- 5091,5097 ----
> extern bool typedef_variant_p (tree);
> extern bool auto_var_in_fn_p (const_tree, const_tree);
> extern tree build_low_bits_mask (tree, unsigned);
> + extern tree build_bit_mask (tree type, unsigned int, unsigned int);
> extern tree tree_strip_nop_conversions (tree);
> extern tree tree_strip_sign_nop_conversions (tree);
> extern tree lhd_gcc_personality (void);
> *************** extern tree fold_binary_loc (location_t,
> *** 5147,5152 ****
> --- 5148,5156 ----
> #define fold_ternary(CODE,T1,T2,T3,T4)\
> fold_ternary_loc (UNKNOWN_LOCATION, CODE, T1, T2, T3, T4)
> extern tree fold_ternary_loc (location_t, enum tree_code, tree, tree, tree, tree);
> + #define fold_quaternary(CODE,T1,T2,T3,T4,T5)\
> + fold_quaternary_loc (UNKNOWN_LOCATION, CODE, T1, T2, T3, T4, T5)
> + extern tree fold_quaternary_loc (location_t, enum tree_code, tree, tree, tree, tree, tree);
> #define fold_build1(c,t1,t2)\
> fold_build1_stat_loc (UNKNOWN_LOCATION, c, t1, t2 MEM_STAT_INFO)
> #define fold_build1_loc(l,c,t1,t2)\
> *************** extern tree fold_build2_stat_loc (locati
> *** 5165,5170 ****
> --- 5169,5180 ----
> fold_build3_stat_loc (l, c, t1, t2, t3, t4 MEM_STAT_INFO)
> extern tree fold_build3_stat_loc (location_t, enum tree_code, tree, tree, tree,
> tree MEM_STAT_DECL);
> + #define fold_build4(c,t1,t2,t3,t4,t5)\
> + fold_build4_stat_loc (UNKNOWN_LOCATION, c, t1, t2, t3, t4, t5 MEM_STAT_INFO)
> + #define fold_build4_loc(l,c,t1,t2,t3,t4,t5)\
> + fold_build4_stat_loc (l, c, t1, t2, t3, t4, t5 MEM_STAT_INFO)
> + extern tree fold_build4_stat_loc (location_t, enum tree_code, tree, tree, tree,
> + tree, tree MEM_STAT_DECL);
> extern tree fold_build1_initializer_loc (location_t, enum tree_code, tree, tree);
> extern tree fold_build2_initializer_loc (location_t, enum tree_code, tree, tree, tree);
> extern tree fold_build3_initializer_loc (location_t, enum tree_code, tree, tree, tree, tree);
> Index: trunk/gcc/gimple.c
> ===================================================================
> *** trunk.orig/gcc/gimple.c 2011-06-15 15:33:04.000000000 +0200
> --- trunk/gcc/gimple.c 2011-06-15 15:33:54.000000000 +0200
> *************** get_gimple_rhs_num_ops (enum tree_code c
> *** 2599,2605 ****
> || (SYM) == ADDR_EXPR \
> || (SYM) == WITH_SIZE_EXPR \
> || (SYM) == SSA_NAME \
> ! || (SYM) == VEC_COND_EXPR) ? GIMPLE_SINGLE_RHS \
> : GIMPLE_INVALID_RHS),
> #define END_OF_BASE_TREE_CODES (unsigned char) GIMPLE_INVALID_RHS,
>
> --- 2599,2606 ----
> || (SYM) == ADDR_EXPR \
> || (SYM) == WITH_SIZE_EXPR \
> || (SYM) == SSA_NAME \
> ! || (SYM) == VEC_COND_EXPR \
> ! || (SYM) == BIT_FIELD_EXPR) ? GIMPLE_SINGLE_RHS \
> : GIMPLE_INVALID_RHS),
> #define END_OF_BASE_TREE_CODES (unsigned char) GIMPLE_INVALID_RHS,
>
>
Thread overview: 12+ messages
2011-06-16 11:46 Richard Guenther
2011-06-16 12:06 ` Jay Foad [this message]
2011-06-16 17:18 ` Richard Henderson
2011-06-16 18:10 ` Eric Botcazou
2011-06-16 19:23 ` Richard Guenther
2011-06-16 19:53 ` Richard Henderson
2011-06-20 14:23 ` William J. Schmidt
2011-06-20 19:09 ` Andrew Pinski
2011-06-20 21:03 ` Richard Guenther
2011-06-20 21:05 ` Andrew Pinski
2011-06-19 23:45 ` William J. Schmidt
2011-06-20 13:44 ` Michael Matz