From: juzhe.zhong@rivai.ai
To: gcc-patches@gcc.gnu.org
Cc: kito.cheng@gmail.com, palmer@dabbelt.com, juzhe.zhong@rivai.ai
Subject: [PATCH 09/21] Add misc function intrinsic support
Date: Tue, 31 May 2022 16:50:00 +0800
Message-Id: <20220531085012.269719-10-juzhe.zhong@rivai.ai>
In-Reply-To: <20220531085012.269719-1-juzhe.zhong@rivai.ai>
References: <20220531085012.269719-1-juzhe.zhong@rivai.ai>

From: zhongjuzhe

gcc/ChangeLog:

	* config/riscv/predicates.md (vector_any_register_operand): New predicate.
	* config/riscv/riscv-protos.h (riscv_regmode_natural_size): New function.
	* config/riscv/riscv-vector-builtins-functions.cc (get_pred_str): New function.
	(get_operation_str): New function.
	(is_dt_ptr): New function.
	(is_dt_unsigned): New function.
	(is_dt_const): New function.
	(intrinsic_rename): New function.
	(get_dt_t_with_index): New function.
	(misc::assemble_name): New function.
	(misc::get_return_type): New function.
	(misc::get_argument_types): New function.
	(vreinterpret::expand): New function.
	(vlmul_ext::assemble_name): New function.
	(vlmul_ext::expand): New function.
	(vlmul_trunc::assemble_name): New function.
	(vlmul_trunc::expand): New function.
	(vundefined::assemble_name): New function.
	(vundefined::get_argument_types): New function.
	(vundefined::expand): New function.
	* config/riscv/riscv-vector-builtins-functions.def (vreinterpret): New macro definition.
	(vlmul_ext): New macro definition.
	(vlmul_trunc): New macro definition.
	(vundefined): New macro definition.
	* config/riscv/riscv-vector-builtins-functions.h (class misc): New class.
	(class vreinterpret): New class.
	(class vlmul_ext): New class.
	(class vlmul_trunc): New class.
	(class vundefined): New class.
	* config/riscv/riscv-vector-builtins-iterators.def (VCONVERFI): New iterator.
	(VCONVERI): New iterator.
	(VCONVERI2): New iterator.
	(VCONVERI3): New iterator.
	(VCONVERF): New iterator.
	(VSETI): New iterator.
	(VSETF): New iterator.
	(VGETI): New iterator.
	(VGETF): New iterator.
	(VLMULEXT): New iterator.
	(VLMULTRUNC): New iterator.
	* config/riscv/riscv.cc (riscv_hard_regno_mode_ok): Fix register allocation.
	(riscv_class_max_nregs): Fix register allocation.
	(riscv_can_change_mode_class): Add support for changing between RVV modes.
	(riscv_regmode_natural_size): New function.
	* config/riscv/riscv.h (REGMODE_NATURAL_SIZE): New target macro.
	* config/riscv/vector-iterators.md: New iterators and attributes.
	* config/riscv/vector.md (@vreinterpret): New pattern.
	(@vlmul_ext): New pattern.
	(*vlmul_ext): New pattern.
	(@vlmul_trunc): New pattern.
	(@vset): New pattern.
	(@vget): New pattern.

gcc/testsuite/ChangeLog:

	* g++.target/riscv/rvv/misc_func.C: New test.
	* g++.target/riscv/rvv/rvv-intrinsic.exp: New test.
	* gcc.target/riscv/rvv/intrinsic/misc_func.c: New test.
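For reviewers, a rough sketch of how the new intrinsics are meant to be
used.  The vreinterpret calls are taken verbatim from the new tests; the
vlmul_ext and vundefined names are assumptions inferred from the naming
scheme in intrinsic_rename and vundefined::assemble_name, and may differ
from the final API:

    #include <riscv_vector.h>

    vuint8m1_t
    cast_to_unsigned (vint8m1_t src)
    {
      return vreinterpret_u8m1 (src);   /* same bits, unsigned view */
    }

    vfloat32m1_t
    cast_to_float (vint32m1_t src)
    {
      return vreinterpret_f32m1 (src);  /* same bits, float view */
    }

    vint32m2_t
    widen_lmul (vint32m1_t src)
    {
      /* Hypothetical non-overloaded name: base + op + src + dst suffix.  */
      return vlmul_ext_v_i32m1_i32m2 (src);
    }

    vint32m1_t
    make_undef (void)
    {
      return vundefined_i32m1 ();       /* expands to a clobber, no code */
    }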
---
 gcc/config/riscv/predicates.md                |    7 +
 gcc/config/riscv/riscv-protos.h               |    1 +
 .../riscv/riscv-vector-builtins-functions.cc  |  288 ++
 .../riscv/riscv-vector-builtins-functions.def |   22 +
 .../riscv/riscv-vector-builtins-functions.h   |   62 +
 .../riscv/riscv-vector-builtins-iterators.def |  171 +
 gcc/config/riscv/riscv.cc                     |   33 +-
 gcc/config/riscv/riscv.h                      |    2 +
 gcc/config/riscv/vector-iterators.md          |  142 +
 gcc/config/riscv/vector.md                    |  133 +
 .../g++.target/riscv/rvv/misc_func.C          | 2597 +++++++++++++++
 .../g++.target/riscv/rvv/rvv-intrinsic.exp    |   39 +
 .../riscv/rvv/intrinsic/misc_func.c           | 2921 +++++++++++++++++
 13 files changed, 6414 insertions(+), 4 deletions(-)
 create mode 100644 gcc/testsuite/g++.target/riscv/rvv/misc_func.C
 create mode 100644 gcc/testsuite/g++.target/riscv/rvv/rvv-intrinsic.exp
 create mode 100644 gcc/testsuite/gcc.target/riscv/rvv/intrinsic/misc_func.c

diff --git a/gcc/config/riscv/predicates.md b/gcc/config/riscv/predicates.md
index 7a101676538..e31c829bf5b 100644
--- a/gcc/config/riscv/predicates.md
+++ b/gcc/config/riscv/predicates.md
@@ -257,6 +257,13 @@
   return GET_MODE (op) == Pmode;
 })
 
+;; A special predicate that doesn't match a particular mode.
+(define_special_predicate "vector_any_register_operand"
+  (match_code "reg, subreg")
+{
+  return VECTOR_MODE_P (GET_MODE (op));
+})
+
 (define_predicate "vector_reg_or_const0_operand"
   (ior (match_operand 0 "register_operand")
        (match_test "op == const0_rtx && !VECTOR_MODE_P (GET_MODE (op))")))
diff --git a/gcc/config/riscv/riscv-protos.h b/gcc/config/riscv/riscv-protos.h
index 2d63fe76930..fd8906e47de 100644
--- a/gcc/config/riscv/riscv-protos.h
+++ b/gcc/config/riscv/riscv-protos.h
@@ -75,6 +75,7 @@ extern bool riscv_store_data_bypass_p (rtx_insn *, rtx_insn *);
 extern rtx riscv_gen_gpr_save_insn (struct riscv_frame_info *);
 extern bool riscv_gpr_save_operation_p (rtx);
 extern rtx riscv_add_offset (rtx, rtx, HOST_WIDE_INT);
+extern poly_uint64 riscv_regmode_natural_size (machine_mode);
 
 /* Routines implemented in riscv-c.cc.  */
 void riscv_cpu_cpp_builtins (cpp_reader *);
diff --git a/gcc/config/riscv/riscv-vector-builtins-functions.cc b/gcc/config/riscv/riscv-vector-builtins-functions.cc
index 0acda8f671e..a25f167f40e 100644
--- a/gcc/config/riscv/riscv-vector-builtins-functions.cc
+++ b/gcc/config/riscv/riscv-vector-builtins-functions.cc
@@ -143,6 +143,131 @@ get_vtype_for_mode (machine_mode mode)
   gcc_unreachable ();
 }
 
+static const char *
+get_pred_str (enum predication_index pred, bool overloaded_p = false)
+{
+  switch (pred)
+    {
+    case PRED_void:
+    case PRED_none:
+      return "";
+
+    case PRED_m:
+      return overloaded_p ? "" : "_m";
+
+    case PRED_tam:
+      return "_tam";
+
+    case PRED_tum:
+      return "_tum";
+
+    case PRED_tu:
+      return "_tu";
+
+    case PRED_ta:
+      return "_ta";
+
+    case PRED_ma:
+      return "_ma";
+
+    case PRED_mu:
+      return "_mu";
+
+    case PRED_tama:
+      return "_tama";
+
+    case PRED_tamu:
+      return "_tamu";
+
+    case PRED_tuma:
+      return "_tuma";
+
+    case PRED_tumu:
+      return "_tumu";
+
+    default:
+      gcc_unreachable ();
+    }
+}
+
+static const char *
+get_operation_str (enum operation_index op)
+{
+  switch (op)
+    {
+    case OP_vv:
+      return "_vv";
+
+    case OP_vx:
+      return "_vx";
+
+    case OP_v:
+      return "_v";
+
+    case OP_wv:
+      return "_wv";
+
+    case OP_wx:
+      return "_wx";
+
+    case OP_x_x_v:
+      return "_x_x_v";
+
+    case OP_vf2:
+      return "_vf2";
+
+    case OP_vf4:
+      return "_vf4";
+
+    case OP_vf8:
+      return "_vf8";
+
+    case OP_vvm:
+      return "_vvm";
+
+    case OP_vxm:
+      return "_vxm";
+
+    case OP_x_x_w:
+      return "_x_x_w";
+
+    case OP_v_v:
+      return "_v_v";
+
+    case OP_v_x:
+      return "_v_x";
+
+    case OP_v_f:
+      return "_v_f";
+
+    case OP_vs:
+      return "_vs";
+
+    case OP_vf:
+      return "_vf";
+
+    case OP_wf:
+      return "_wf";
+
+    case OP_vfm:
+      return "_vfm";
+
+    case OP_vm:
+      return "_vm";
+
+    case OP_mm:
+      return "_mm";
+
+    case OP_m:
+      return "_m";
+
+    default:
+      break;
+    }
+
+  return "";
+}
+
 static const char *
 mode2data_type_str (machine_mode mode, bool u, bool ie)
 {
@@ -300,6 +425,53 @@ get_dt_t (machine_mode mode, bool u, bool ptr = false, bool c = false)
   return type;
 }
 
+/* Helper functions to get datatype of arg.  */
+
+static bool
+is_dt_ptr (enum data_type_index dt)
+{
+  return dt == DT_ptr || dt == DT_uptr || dt == DT_c_ptr || dt == DT_c_uptr;
+}
+
+static bool
+is_dt_unsigned (enum data_type_index dt)
+{
+  return dt == DT_unsigned || dt == DT_uptr || dt == DT_c_uptr;
+}
+
+static bool
+is_dt_const (enum data_type_index dt)
+{
+  return dt == DT_c_ptr || dt == DT_c_uptr;
+}
+
+/* Helper functions for builder implementation.  */
+static void
+intrinsic_rename (function_instance &instance, int index1, int index2)
+{
+  machine_mode dst_mode = instance.get_arg_pattern ().arg_list[index1];
+  machine_mode src_mode = instance.get_arg_pattern ().arg_list[index2];
+  bool dst_unsigned_p = instance.get_data_type_list ()[index1] == DT_unsigned;
+  bool src_unsigned_p = instance.get_data_type_list ()[index2] == DT_unsigned;
+  const char *name = instance.get_base_name ();
+  const char *op = get_operation_str (instance.get_operation ());
+  const char *src_suffix = mode2data_type_str (src_mode, src_unsigned_p, false);
+  const char *dst_suffix = mode2data_type_str (dst_mode, dst_unsigned_p, false);
+  const char *pred = get_pred_str (instance.get_pred ());
+  snprintf (instance.function_name, NAME_MAXLEN, "%s%s%s%s%s", name, op, src_suffix, dst_suffix, pred);
+}
+
+static tree
+get_dt_t_with_index (const function_instance &instance, int index)
+{
+  machine_mode mode = instance.get_arg_pattern ().arg_list[index];
+  bool unsigned_p = is_dt_unsigned (instance.get_data_type_list ()[index]);
+  bool ptr_p = is_dt_ptr (instance.get_data_type_list ()[index]);
+  bool c_p = is_dt_const (instance.get_data_type_list ()[index]);
+  return get_dt_t (mode, unsigned_p, ptr_p, c_p);
+}
+
+
 /* Return true if the function has no return value.  */
 static bool
 function_returns_void_p (tree fndecl)
@@ -1222,6 +1394,122 @@ vsetvlmax::expand (const function_instance &instance, tree exp, rtx target) cons
 			   !function_returns_void_p (fndecl));
 }
 
+/* A function implementation for Miscellaneous functions.  */
+char *
+misc::assemble_name (function_instance &instance)
+{
+  machine_mode mode = instance.get_arg_pattern ().arg_list[0];
+  bool unsigned_p = instance.get_data_type_list ()[0] == DT_unsigned;
+  intrinsic_rename (instance, 0, 1);
+  append_name (instance.get_base_name ());
+  append_name (mode2data_type_str (mode, unsigned_p, false));
+  return finish_name ();
+}
+
+tree
+misc::get_return_type (const function_instance &instance) const
+{
+  return get_dt_t_with_index (instance, 0);
+}
+
+void
+misc::get_argument_types (const function_instance &instance,
+			  vec<tree> &argument_types) const
+{
+  argument_types.quick_push (get_dt_t_with_index (instance, 1));
+}
+
+/* A function implementation for vreinterpret functions.  */
+rtx
+vreinterpret::expand (const function_instance &instance, tree exp, rtx target) const
+{
+  enum insn_code icode = code_for_vreinterpret (instance.get_arg_pattern ().arg_list[0]);
+  return expand_builtin_insn (icode, exp, target, instance);
+}
+
+/* A function implementation for vlmul_ext functions.  */
+char *
+vlmul_ext::assemble_name (function_instance &instance)
+{
+  machine_mode tmode = instance.get_arg_pattern ().arg_list[0];
+  machine_mode smode = instance.get_arg_pattern ().arg_list[1];
+  if (GET_MODE_INNER (tmode) != GET_MODE_INNER (smode))
+    return nullptr;
+
+  if (tmode == smode)
+    return nullptr;
+
+  if (known_lt (GET_MODE_SIZE (tmode), GET_MODE_SIZE (smode)))
+    return nullptr;
+
+  bool unsigned_p = instance.get_data_type_list ()[0] == DT_unsigned;
+  intrinsic_rename (instance, 0, 1);
+  append_name (instance.get_base_name ());
+  append_name (mode2data_type_str (tmode, unsigned_p, false));
+  return finish_name ();
+}
+
+rtx
+vlmul_ext::expand (const function_instance &instance, tree exp, rtx target) const
+{
+  enum insn_code icode = code_for_vlmul_ext (instance.get_arg_pattern ().arg_list[0]);
+  return expand_builtin_insn (icode, exp, target, instance);
+}
+
+/* A function implementation for vlmul_trunc functions.  */
+char *
+vlmul_trunc::assemble_name (function_instance &instance)
+{
+  machine_mode tmode = instance.get_arg_pattern ().arg_list[0];
+  machine_mode smode = instance.get_arg_pattern ().arg_list[1];
+  if (GET_MODE_INNER (tmode) != GET_MODE_INNER (smode))
+    return nullptr;
+
+  if (tmode == smode)
+    return nullptr;
+
+  if (known_gt (GET_MODE_SIZE (tmode), GET_MODE_SIZE (smode)))
+    return nullptr;
+
+  bool unsigned_p = instance.get_data_type_list ()[0] == DT_unsigned;
+  intrinsic_rename (instance, 0, 1);
+  append_name (instance.get_base_name ());
+  append_name (mode2data_type_str (tmode, unsigned_p, false));
+  return finish_name ();
+}
+
+rtx
+vlmul_trunc::expand (const function_instance &instance, tree exp, rtx target) const
+{
+  enum insn_code icode = code_for_vlmul_trunc (instance.get_arg_pattern ().arg_list[0]);
+  return expand_builtin_insn (icode, exp, target, instance);
+}
+
+/* A function implementation for vundefined functions.  */
+char *
+vundefined::assemble_name (function_instance &instance)
+{
+  const char *name = instance.get_base_name ();
+  machine_mode mode = instance.get_arg_pattern ().arg_list[0];
+  bool unsigned_p = instance.get_data_type_list ()[0] == DT_unsigned;
+  const char *dt = mode2data_type_str (mode, unsigned_p, false);
+  snprintf (instance.function_name, NAME_MAXLEN, "%s%s", name, dt);
+  return nullptr;
+}
+
+void
+vundefined::get_argument_types (const function_instance &,
+				vec<tree> &) const
+{
+}
+
+rtx
+vundefined::expand (const function_instance &, tree, rtx target) const
+{
+  emit_clobber (copy_rtx (target));
+  return target;
+}
+
 } // end namespace riscv_vector
 
 using namespace riscv_vector;
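As a reading aid: the net effect of intrinsic_rename plus
misc::assemble_name above is one mangled name and one overloaded name per
instance.  A worked example (hypothetical instance; the overloaded form is
what the new tests call):

    /* base = "vreinterpret", op = "_v",
       src = vint32m1_t (_i32m1), dst = vfloat32m1_t (_f32m1), pred = "":
         intrinsic_rename    => function_name "vreinterpret_v_i32m1_f32m1"
         misc::assemble_name => overloaded name "vreinterpret_f32m1".  */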
diff --git a/gcc/config/riscv/riscv-vector-builtins-functions.def b/gcc/config/riscv/riscv-vector-builtins-functions.def
index 666e8503d81..86130e02381 100644
--- a/gcc/config/riscv/riscv-vector-builtins-functions.def
+++ b/gcc/config/riscv/riscv-vector-builtins-functions.def
@@ -34,6 +34,28 @@ along with GCC; see the file COPYING3.  If not see
 /* 6. Configuration-Setting Instructions.  */
 DEF_RVV_FUNCTION(vsetvl, vsetvl, (1, VITER(VI, signed)), PAT_none, PRED_none, OP_none)
 DEF_RVV_FUNCTION(vsetvlmax, vsetvlmax, (1, VITER(VI, signed)), PAT_none, PRED_none, OP_none)
+/* Helper misc intrinsics for software programmer.  */
+DEF_RVV_FUNCTION(vreinterpret, vreinterpret, (2, VITER(VI, unsigned), VATTR(0, VI, signed)), PAT_none, PRED_none, OP_v)
+DEF_RVV_FUNCTION(vreinterpret, vreinterpret, (2, VITER(VI, signed), VATTR(0, VI, unsigned)), PAT_none, PRED_none, OP_v)
+DEF_RVV_FUNCTION(vreinterpret, vreinterpret, (2, VATTR(1, VCONVERFI, signed), VITER(VCONVERF, signed)), PAT_none, PRED_none, OP_v)
+DEF_RVV_FUNCTION(vreinterpret, vreinterpret, (2, VATTR(1, VCONVERFI, unsigned), VITER(VCONVERF, signed)), PAT_none, PRED_none, OP_v)
+DEF_RVV_FUNCTION(vreinterpret, vreinterpret, (2, VITER(VCONVERF, signed), VATTR(0, VCONVERFI, signed)), PAT_none, PRED_none, OP_v)
+DEF_RVV_FUNCTION(vreinterpret, vreinterpret, (2, VITER(VCONVERF, signed), VATTR(0, VCONVERFI, unsigned)), PAT_none, PRED_none, OP_v)
+DEF_RVV_FUNCTION(vreinterpret, vreinterpret, (2, VITER(VCONVERI, signed), VATTR(0, VCONVERI, signed)), PAT_none, PRED_none, OP_v)
+DEF_RVV_FUNCTION(vreinterpret, vreinterpret, (2, VITER(VCONVERI, unsigned), VATTR(0, VCONVERI, unsigned)), PAT_none, PRED_none, OP_v)
+DEF_RVV_FUNCTION(vreinterpret, vreinterpret, (2, VITER(VCONVERI2, signed), VATTR(0, VCONVERI2, signed)), PAT_none, PRED_none, OP_v)
+DEF_RVV_FUNCTION(vreinterpret, vreinterpret, (2, VITER(VCONVERI2, unsigned), VATTR(0, VCONVERI2, unsigned)), PAT_none, PRED_none, OP_v)
+DEF_RVV_FUNCTION(vreinterpret, vreinterpret, (2, VITER(VCONVERI3, signed), VATTR(0, VCONVERI3, signed)), PAT_none, PRED_none, OP_v)
+DEF_RVV_FUNCTION(vreinterpret, vreinterpret, (2, VITER(VCONVERI3, unsigned), VATTR(0, VCONVERI3, unsigned)), PAT_none, PRED_none, OP_v)
+DEF_RVV_FUNCTION(vlmul_ext, vlmul_ext, (2, VITER(VLMULEXT, signed), VITER(VI, signed)), PAT_none, PRED_none, OP_v)
+DEF_RVV_FUNCTION(vlmul_ext, vlmul_ext, (2, VITER(VLMULEXT, unsigned), VITER(VI, unsigned)), PAT_none, PRED_none, OP_v)
+DEF_RVV_FUNCTION(vlmul_ext, vlmul_ext, (2, VITER(VLMULEXT, signed), VITER(VF, signed)), PAT_none, PRED_none, OP_v)
+DEF_RVV_FUNCTION(vlmul_trunc, vlmul_trunc, (2, VITER(VLMULTRUNC, signed), VITER(VI, signed)), PAT_none, PRED_none, OP_v)
+DEF_RVV_FUNCTION(vlmul_trunc, vlmul_trunc, (2, VITER(VLMULTRUNC, unsigned), VITER(VI, unsigned)), PAT_none, PRED_none, OP_v)
+DEF_RVV_FUNCTION(vlmul_trunc, vlmul_trunc, (2, VITER(VLMULTRUNC, signed), VITER(VF, signed)), PAT_none, PRED_none, OP_v)
+DEF_RVV_FUNCTION(vundefined, vundefined, (1, VITER(VI, signed)), PAT_none, PRED_none, OP_none)
+DEF_RVV_FUNCTION(vundefined, vundefined, (1, VITER(VI, unsigned)), PAT_none, PRED_none, OP_none)
+DEF_RVV_FUNCTION(vundefined, vundefined, (1, VITER(VF, signed)), PAT_none, PRED_none, OP_none)
 #undef REQUIRED_EXTENSIONS
 #undef DEF_RVV_FUNCTION
 #undef VITER
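Each DEF_RVV_FUNCTION entry above expands into one intrinsic per mode of
its iterator.  For example, the first two vreinterpret entries yield the
signed/unsigned pairs exercised by the new tests (a sketch, not the full
generated list):

    vuint8m1_t vreinterpret_v_i8m1_u8m1 (vint8m1_t src);
    vint8m1_t  vreinterpret_v_u8m1_i8m1 (vuint8m1_t src);
    /* ... one such pair for every mode in VI, plus the overloaded
       vreinterpret_u8m1 / vreinterpret_i8m1 forms.  */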
diff --git a/gcc/config/riscv/riscv-vector-builtins-functions.h b/gcc/config/riscv/riscv-vector-builtins-functions.h
index 9846ded1155..85e4af5f079 100644
--- a/gcc/config/riscv/riscv-vector-builtins-functions.h
+++ b/gcc/config/riscv/riscv-vector-builtins-functions.h
@@ -522,6 +522,68 @@ public:
   virtual rtx expand (const function_instance &, tree, rtx) const override;
 };
 
+/* A function_base for Miscellaneous functions.  */
+class misc : public function_builder
+{
+public:
+  // Use the same constructor as function_builder.
+  using function_builder::function_builder;
+
+  virtual char * assemble_name (function_instance &) override;
+
+  virtual void get_argument_types (const function_instance &, vec<tree> &) const override;
+
+  virtual tree get_return_type (const function_instance &) const override;
+};
+
+/* A function_base for vreinterpret functions.  */
+class vreinterpret : public misc
+{
+public:
+  // Use the same constructor as misc.
+  using misc::misc;
+
+  virtual rtx expand (const function_instance &, tree, rtx) const override;
+};
+
+/* A function_base for vlmul_ext functions.  */
+class vlmul_ext : public misc
+{
+public:
+  // Use the same constructor as misc.
+  using misc::misc;
+
+  virtual char * assemble_name (function_instance &) override;
+
+  virtual rtx expand (const function_instance &, tree, rtx) const override;
+};
+
+/* A function_base for vlmul_trunc functions.  */
+class vlmul_trunc : public misc
+{
+public:
+  // Use the same constructor as misc.
+  using misc::misc;
+
+  virtual char * assemble_name (function_instance &) override;
+
+  virtual rtx expand (const function_instance &, tree, rtx) const override;
+};
+
+/* A function_base for vundefined functions.  */
+class vundefined : public misc
+{
+public:
+  // Use the same constructor as misc.
+  using misc::misc;
+
+  virtual char * assemble_name (function_instance &) override;
+
+  virtual void get_argument_types (const function_instance &, vec<tree> &) const override;
+
+  virtual rtx expand (const function_instance &, tree, rtx) const override;
+};
+
 } // namespace riscv_vector
 
 #endif // end GCC_RISCV_VECTOR_BUILTINS_FUNCTIONS_H
\ No newline at end of file
diff --git a/gcc/config/riscv/riscv-vector-builtins-iterators.def b/gcc/config/riscv/riscv-vector-builtins-iterators.def
index 77a391c7630..8f2ea912804 100644
--- a/gcc/config/riscv/riscv-vector-builtins-iterators.def
+++ b/gcc/config/riscv/riscv-vector-builtins-iterators.def
@@ -118,6 +118,177 @@ DEF_RISCV_ARG_MODE_ATTR(V64BITI, 0, VNx2DI, VNx2DI, TARGET_ANY)
 DEF_RISCV_ARG_MODE_ATTR(V64BITI, 1, VNx4DI, VNx4DI, TARGET_ANY)
 DEF_RISCV_ARG_MODE_ATTR(V64BITI, 2, VNx8DI, VNx8DI, TARGET_ANY)
 DEF_RISCV_ARG_MODE_ATTR(V64BITI, 3, VNx16DI, VNx16DI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR_VARIABLE(VCONVERFI, 9)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERFI, 0, VNx2SF, VNx2SI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERFI, 1, VNx4SF, VNx4SI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERFI, 2, VNx8SF, VNx8SI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERFI, 3, VNx16SF, VNx16SI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERFI, 4, VNx32SF, VNx32SI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERFI, 5, VNx2DF, VNx2DI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERFI, 6, VNx4DF, VNx4DI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERFI, 7, VNx8DF, VNx8DI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERFI, 8, VNx16DF, VNx16DI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR_VARIABLE(VCONVERI, 21)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI, 0, VNx2HI, VNx4QI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI, 1, VNx4HI, VNx8QI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI, 2, VNx8HI, VNx16QI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI, 3, VNx16HI, VNx32QI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI, 4, VNx32HI, VNx64QI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI, 5, VNx64HI, VNx128QI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI, 6, VNx2SI, VNx8QI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI, 7, VNx4SI, VNx16QI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI, 8, VNx8SI, VNx32QI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI, 9, VNx16SI, VNx64QI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI, 10, VNx32SI, VNx128QI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI, 11, VNx2DI, VNx16QI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI, 12, VNx4DI, VNx32QI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI, 13, VNx8DI, VNx64QI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI, 14, VNx16DI, VNx128QI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI, 15, VNx4QI, VNx2HI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI, 16, VNx8QI, VNx4HI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI, 17, VNx16QI, VNx8HI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI, 18, VNx32QI, VNx16HI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI, 19, VNx64QI, VNx32HI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI, 20, VNx128QI, VNx64HI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR_VARIABLE(VCONVERI2, 19)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI2, 0, VNx2SI, VNx4HI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI2, 1, VNx4SI, VNx8HI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI2, 2, VNx8SI, VNx16HI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI2, 3, VNx16SI, VNx32HI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI2, 4, VNx32SI, VNx64HI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI2, 5, VNx2DI, VNx8HI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI2, 6, VNx4DI, VNx16HI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI2, 7, VNx8DI, VNx32HI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI2, 8, VNx16DI, VNx64HI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI2, 9, VNx8QI, VNx2SI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI2, 10, VNx16QI, VNx4SI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI2, 11, VNx32QI, VNx8SI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI2, 12, VNx64QI, VNx16SI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI2, 13, VNx128QI, VNx32SI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI2, 14, VNx4HI, VNx2SI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI2, 15, VNx8HI, VNx4SI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI2, 16, VNx16HI, VNx8SI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI2, 17, VNx32HI, VNx16SI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI2, 18, VNx64HI, VNx32SI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR_VARIABLE(VCONVERI3, 16)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI3, 0, VNx2DI, VNx4SI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI3, 1, VNx4DI, VNx8SI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI3, 2, VNx8DI, VNx16SI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI3, 3, VNx16DI, VNx32SI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI3, 4, VNx16QI, VNx2DI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI3, 5, VNx32QI, VNx4DI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI3, 6, VNx64QI, VNx8DI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI3, 7, VNx128QI, VNx16DI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI3, 8, VNx8HI, VNx2DI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI3, 9, VNx16HI, VNx4DI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI3, 10, VNx32HI, VNx8DI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI3, 11, VNx64HI, VNx16DI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI3, 12, VNx4SI, VNx2DI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI3, 13, VNx8SI, VNx4DI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI3, 14, VNx16SI, VNx8DI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERI3, 15, VNx32SI, VNx16DI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR_VARIABLE(VCONVERF, 9)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERF, 0, VNx2SF, VNx2SF, TARGET_HARD_FLOAT)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERF, 1, VNx4SF, VNx4SF, TARGET_HARD_FLOAT)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERF, 2, VNx8SF, VNx8SF, TARGET_HARD_FLOAT)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERF, 3, VNx16SF, VNx16SF, TARGET_HARD_FLOAT)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERF, 4, VNx32SF, VNx32SF, TARGET_HARD_FLOAT)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERF, 5, VNx2DF, VNx2DF, TARGET_DOUBLE_FLOAT)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERF, 6, VNx4DF, VNx4DF, TARGET_DOUBLE_FLOAT)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERF, 7, VNx8DF, VNx8DF, TARGET_DOUBLE_FLOAT)
+DEF_RISCV_ARG_MODE_ATTR(VCONVERF, 8, VNx16DF, VNx16DF, TARGET_DOUBLE_FLOAT)
+DEF_RISCV_ARG_MODE_ATTR_VARIABLE(VSETI, 12)
+DEF_RISCV_ARG_MODE_ATTR(VSETI, 0, VNx32QI, VNx32QI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VSETI, 1, VNx64QI, VNx64QI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VSETI, 2, VNx128QI, VNx128QI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VSETI, 3, VNx16HI, VNx16HI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VSETI, 4, VNx32HI, VNx32HI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VSETI, 5, VNx64HI, VNx64HI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VSETI, 6, VNx8SI, VNx8SI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VSETI, 7, VNx16SI, VNx16SI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VSETI, 8, VNx32SI, VNx32SI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VSETI, 9, VNx4DI, VNx4DI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VSETI, 10, VNx8DI, VNx8DI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VSETI, 11, VNx16DI, VNx16DI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR_VARIABLE(VSETF, 6)
+DEF_RISCV_ARG_MODE_ATTR(VSETF, 0, VNx8SF, VNx8SF, TARGET_HARD_FLOAT)
+DEF_RISCV_ARG_MODE_ATTR(VSETF, 1, VNx16SF, VNx16SF, TARGET_HARD_FLOAT)
+DEF_RISCV_ARG_MODE_ATTR(VSETF, 2, VNx32SF, VNx32SF, TARGET_HARD_FLOAT)
+DEF_RISCV_ARG_MODE_ATTR(VSETF, 3, VNx4DF, VNx4DF, TARGET_DOUBLE_FLOAT)
+DEF_RISCV_ARG_MODE_ATTR(VSETF, 4, VNx8DF, VNx8DF, TARGET_DOUBLE_FLOAT)
+DEF_RISCV_ARG_MODE_ATTR(VSETF, 5, VNx16DF, VNx16DF, TARGET_DOUBLE_FLOAT)
+DEF_RISCV_ARG_MODE_ATTR_VARIABLE(VGETI, 12)
+DEF_RISCV_ARG_MODE_ATTR(VGETI, 0, VNx16QI, VNx16QI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VGETI, 1, VNx32QI, VNx32QI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VGETI, 2, VNx64QI, VNx64QI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VGETI, 3, VNx8HI, VNx8HI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VGETI, 4, VNx16HI, VNx16HI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VGETI, 5, VNx32HI, VNx32HI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VGETI, 6, VNx4SI, VNx4SI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VGETI, 7, VNx8SI, VNx8SI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VGETI, 8, VNx16SI, VNx16SI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VGETI, 9, VNx2DI, VNx2DI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VGETI, 10, VNx4DI, VNx4DI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VGETI, 11, VNx8DI, VNx8DI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR_VARIABLE(VGETF, 6)
+DEF_RISCV_ARG_MODE_ATTR(VGETF, 0, VNx4SF, VNx4SF, TARGET_HARD_FLOAT)
+DEF_RISCV_ARG_MODE_ATTR(VGETF, 1, VNx8SF, VNx8SF, TARGET_HARD_FLOAT)
+DEF_RISCV_ARG_MODE_ATTR(VGETF, 2, VNx16SF, VNx16SF, TARGET_HARD_FLOAT)
+DEF_RISCV_ARG_MODE_ATTR(VGETF, 3, VNx2DF, VNx2DF, TARGET_DOUBLE_FLOAT)
+DEF_RISCV_ARG_MODE_ATTR(VGETF, 4, VNx4DF, VNx4DF, TARGET_DOUBLE_FLOAT)
+DEF_RISCV_ARG_MODE_ATTR(VGETF, 5, VNx8DF, VNx8DF, TARGET_DOUBLE_FLOAT)
+DEF_RISCV_ARG_MODE_ATTR_VARIABLE(VLMULEXT, 25)
+DEF_RISCV_ARG_MODE_ATTR(VLMULEXT, 0, VNx4QI, VNx4QI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULEXT, 1, VNx8QI, VNx8QI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULEXT, 2, VNx16QI, VNx16QI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULEXT, 3, VNx32QI, VNx32QI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULEXT, 4, VNx64QI, VNx64QI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULEXT, 5, VNx128QI, VNx128QI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULEXT, 6, VNx4HI, VNx4HI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULEXT, 7, VNx8HI, VNx8HI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULEXT, 8, VNx16HI, VNx16HI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULEXT, 9, VNx32HI, VNx32HI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULEXT, 10, VNx64HI, VNx64HI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULEXT, 11, VNx4SI, VNx4SI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULEXT, 12, VNx8SI, VNx8SI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULEXT, 13, VNx16SI, VNx16SI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULEXT, 14, VNx32SI, VNx32SI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULEXT, 15, VNx4DI, VNx4DI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULEXT, 16, VNx8DI, VNx8DI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULEXT, 17, VNx16DI, VNx16DI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULEXT, 18, VNx4SF, VNx4SF, TARGET_HARD_FLOAT)
+DEF_RISCV_ARG_MODE_ATTR(VLMULEXT, 19, VNx8SF, VNx8SF, TARGET_HARD_FLOAT)
+DEF_RISCV_ARG_MODE_ATTR(VLMULEXT, 20, VNx16SF, VNx16SF, TARGET_HARD_FLOAT)
+DEF_RISCV_ARG_MODE_ATTR(VLMULEXT, 21, VNx32SF, VNx32SF, TARGET_HARD_FLOAT)
+DEF_RISCV_ARG_MODE_ATTR(VLMULEXT, 22, VNx4DF, VNx4DF, TARGET_DOUBLE_FLOAT)
+DEF_RISCV_ARG_MODE_ATTR(VLMULEXT, 23, VNx8DF, VNx8DF, TARGET_DOUBLE_FLOAT)
+DEF_RISCV_ARG_MODE_ATTR(VLMULEXT, 24, VNx16DF, VNx16DF, TARGET_DOUBLE_FLOAT)
+DEF_RISCV_ARG_MODE_ATTR_VARIABLE(VLMULTRUNC, 25)
+DEF_RISCV_ARG_MODE_ATTR(VLMULTRUNC, 0, VNx2QI, VNx2QI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULTRUNC, 1, VNx4QI, VNx4QI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULTRUNC, 2, VNx8QI, VNx8QI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULTRUNC, 3, VNx16QI, VNx16QI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULTRUNC, 4, VNx32QI, VNx32QI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULTRUNC, 5, VNx64QI, VNx64QI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULTRUNC, 6, VNx2HI, VNx2HI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULTRUNC, 7, VNx4HI, VNx4HI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULTRUNC, 8, VNx8HI, VNx8HI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULTRUNC, 9, VNx16HI, VNx16HI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULTRUNC, 10, VNx32HI, VNx32HI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULTRUNC, 11, VNx2SI, VNx2SI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULTRUNC, 12, VNx4SI, VNx4SI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULTRUNC, 13, VNx8SI, VNx8SI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULTRUNC, 14, VNx16SI, VNx16SI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULTRUNC, 15, VNx2DI, VNx2DI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULTRUNC, 16, VNx4DI, VNx4DI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULTRUNC, 17, VNx8DI, VNx8DI, TARGET_ANY)
+DEF_RISCV_ARG_MODE_ATTR(VLMULTRUNC, 18, VNx2SF, VNx2SF, TARGET_HARD_FLOAT)
+DEF_RISCV_ARG_MODE_ATTR(VLMULTRUNC, 19, VNx4SF, VNx4SF, TARGET_HARD_FLOAT)
+DEF_RISCV_ARG_MODE_ATTR(VLMULTRUNC, 20, VNx8SF, VNx8SF, TARGET_HARD_FLOAT)
+DEF_RISCV_ARG_MODE_ATTR(VLMULTRUNC, 21, VNx16SF, VNx16SF, TARGET_HARD_FLOAT)
+DEF_RISCV_ARG_MODE_ATTR(VLMULTRUNC, 22, VNx2DF, VNx2DF, TARGET_DOUBLE_FLOAT)
+DEF_RISCV_ARG_MODE_ATTR(VLMULTRUNC, 23, VNx4DF, VNx4DF, TARGET_DOUBLE_FLOAT)
+DEF_RISCV_ARG_MODE_ATTR(VLMULTRUNC, 24, VNx8DF, VNx8DF, TARGET_DOUBLE_FLOAT)
 DEF_RISCV_ARG_MODE_ATTR_VARIABLE(VM, 69)
 DEF_RISCV_ARG_MODE_ATTR(VM, 0, VNx2BI, VNx2BI, TARGET_ANY)
 DEF_RISCV_ARG_MODE_ATTR(VM, 1, VNx4BI, VNx4BI, TARGET_ANY)
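A note on these tables: each entry pairs a destination mode with the
equal-sized argument mode it may be reinterpreted as (which modes map to
which LMUL depends on the VLEN configuration).  For example, rows like
(VCONVERI, VNx4HI, VNx8QI) back the same-size, different-SEW casts in the
new tests:

    vint8m1_t
    reinterpret_sew (vint16m1_t src)
    {
      /* Same register group, different element width; lowered to a
         plain vector move via the VCONVERI mapping.  */
      return vreinterpret_i8m1 (src);
    }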
diff --git a/gcc/config/riscv/riscv.cc b/gcc/config/riscv/riscv.cc
index 7a1f19b32ee..e88057e992a 100644
--- a/gcc/config/riscv/riscv.cc
+++ b/gcc/config/riscv/riscv.cc
@@ -5009,6 +5009,9 @@ riscv_hard_regno_mode_ok (unsigned int regno, machine_mode mode)
       if (rvv_mode_p (mode))
	return false;
 
+      if (!FLOAT_MODE_P (mode))
+	return false;
+
       if (!FP_REG_P (regno + nregs - 1))
	return false;
 
@@ -5073,13 +5076,13 @@ riscv_class_max_nregs (reg_class_t rclass, machine_mode mode)
   if (reg_class_subset_p (rclass, GR_REGS))
     return riscv_hard_regno_nregs (GP_REG_FIRST, mode);
 
-  if (reg_class_subset_p (V_REGS, rclass))
+  if (reg_class_subset_p (rclass, V_REGS))
     return riscv_hard_regno_nregs (V_REG_FIRST, mode);
 
-  if (reg_class_subset_p (VL_REGS, rclass))
+  if (reg_class_subset_p (rclass, VL_REGS))
     return 1;
 
-  if (reg_class_subset_p (VTYPE_REGS, rclass))
+  if (reg_class_subset_p (rclass, VTYPE_REGS))
     return 1;
 
   return 0;
@@ -5718,8 +5721,11 @@ riscv_slow_unaligned_access (machine_mode, unsigned int)
 /* Implement TARGET_CAN_CHANGE_MODE_CLASS.  */
 
 static bool
-riscv_can_change_mode_class (machine_mode, machine_mode, reg_class_t rclass)
+riscv_can_change_mode_class (machine_mode from, machine_mode to, reg_class_t rclass)
 {
+  if (rvv_mode_p (from) && rvv_mode_p (to))
+    return true;
+
   return !reg_classes_intersect_p (FP_REGS, rclass);
 }
 
@@ -6003,6 +6009,25 @@ riscv_mangle_type (const_tree type)
   return NULL;
 }
 
+/* Implement REGMODE_NATURAL_SIZE.  */
+
+poly_uint64
+riscv_regmode_natural_size (machine_mode mode)
+{
+  /* The natural size for RVV data modes is one RVV data vector,
+     and similarly for predicates.  We can't independently modify
+     anything smaller than that.  */
+  /* ??? For now, only do this for variable-width RVV registers.
+     Doing it for constant-sized registers breaks lower-subreg.c.  */
+  /* For partial tuple vectors, we should use the mode of the vector
+     subpart, in case of segment loads and stores.  */
+  if (!riscv_vector_chunks.is_constant () && rvv_mode_p (mode))
+    return BYTES_PER_RISCV_VECTOR;
+
+  return UNITS_PER_WORD;
+}
+
 /* Initialize the GCC target structure.  */
 #undef TARGET_ASM_ALIGNED_HI_OP
 #define TARGET_ASM_ALIGNED_HI_OP "\t.half\t"
diff --git a/gcc/config/riscv/riscv.h b/gcc/config/riscv/riscv.h
index cb4cfc0f73e..c54c984e70b 100644
--- a/gcc/config/riscv/riscv.h
+++ b/gcc/config/riscv/riscv.h
@@ -1069,4 +1069,6 @@ extern void riscv_remove_unneeded_save_restore_calls (void);
 
 #define REGISTER_TARGET_PRAGMAS() riscv_register_pragmas ()
 
+#define REGMODE_NATURAL_SIZE(MODE) riscv_regmode_natural_size (MODE)
+
 #endif /* ! GCC_RISCV_H */
"TARGET_DOUBLE_FLOAT")])=0D +=0D +;; vector modes can be set.=0D +(define_mode_iterator VSETI [=0D + VNx32QI VNx64QI VNx128QI=0D + VNx16HI VNx32HI VNx64HI=0D + VNx8SI VNx16SI VNx32SI=0D + VNx4DI VNx8DI VNx16DI])=0D +=0D +(define_mode_iterator VSETF [=0D + (VNx8SF "TARGET_HARD_FLOAT") (VNx16SF "TARGET_HARD_FLOAT") (VNx32SF "TAR= GET_HARD_FLOAT")=0D + (VNx4DF "TARGET_DOUBLE_FLOAT") (VNx8DF "TARGET_DOUBLE_FLOAT") (VNx16DF "= TARGET_DOUBLE_FLOAT")])=0D +=0D +;; vector modes can be get.=0D +(define_mode_iterator VGETI [=0D + VNx16QI VNx32QI VNx64QI=0D + VNx8HI VNx16HI VNx32HI=0D + VNx4SI VNx8SI VNx16SI=0D + VNx2DI VNx4DI VNx8DI])=0D +=0D +(define_mode_iterator VGETF [=0D + (VNx4SF "TARGET_HARD_FLOAT") (VNx8SF "TARGET_HARD_FLOAT") (VNx16SF "TARG= ET_HARD_FLOAT")=0D + (VNx2DF "TARGET_DOUBLE_FLOAT") (VNx4DF "TARGET_DOUBLE_FLOAT") (VNx8DF "T= ARGET_DOUBLE_FLOAT")])=0D +=0D +;; All vector extend modes supported.=0D +(define_mode_iterator VLMULEXT [=0D + VNx4QI VNx8QI VNx16QI VNx32QI VNx64QI VNx128QI=0D + VNx4HI VNx8HI VNx16HI VNx32HI VNx64HI=0D + VNx4SI VNx8SI VNx16SI VNx32SI=0D + VNx4DI VNx8DI VNx16DI=0D + (VNx4SF "TARGET_HARD_FLOAT") (VNx8SF "TARGET_HARD_FLOAT")=0D + (VNx16SF "TARGET_HARD_FLOAT") (VNx32SF "TARGET_HARD_FLOAT")=0D + (VNx4DF "TARGET_DOUBLE_FLOAT") (VNx8DF "TARGET_DOUBLE_FLOAT")=0D + (VNx16DF "TARGET_DOUBLE_FLOAT")])=0D +=0D +;; All vector truncate modes supported.=0D +(define_mode_iterator VLMULTRUNC [=0D + VNx2QI VNx4QI VNx8QI VNx16QI VNx32QI VNx64QI=0D + VNx2HI VNx4HI VNx8HI VNx16HI VNx32HI=0D + VNx2SI VNx4SI VNx8SI VNx16SI=0D + VNx2DI VNx4DI VNx8DI=0D + (VNx2SF "TARGET_HARD_FLOAT") (VNx4SF "TARGET_HARD_FLOAT") (VNx8SF "TARGE= T_HARD_FLOAT")=0D + (VNx16SF "TARGET_HARD_FLOAT")=0D + (VNx2DF "TARGET_DOUBLE_FLOAT") (VNx4DF "TARGET_DOUBLE_FLOAT") (VNx8DF "T= ARGET_DOUBLE_FLOAT")])=0D +=0D ;; Map a vector int or float mode to a vector compare mode.=0D (define_mode_attr VM [=0D (VNx2BI "VNx2BI") (VNx4BI "VNx4BI") (VNx8BI "VNx8BI") (VNx16BI "VNx16BI"= )=0D diff --git a/gcc/config/riscv/vector.md b/gcc/config/riscv/vector.md=0D index 1731d969372..54e68aa165b 100644=0D --- a/gcc/config/riscv/vector.md=0D +++ b/gcc/config/riscv/vector.md=0D @@ -349,6 +349,139 @@=0D [(set_attr "type" "vsetvl")=0D (set_attr "mode" "none")])=0D =0D +;; -----------------------------------------------------------------------= --=0D +;; ---- [INT,FP] Vector Misc Functions=0D +;; -----------------------------------------------------------------------= --=0D +;; Includes:=0D +;; - Reinterpret between different vector modes=0D +;; - Vector LMUL extension=0D +;; - Vector LMUL truncation=0D +;; - insert a vector to a vector group=0D +;; - get a vector from a vector group=0D +;; -----------------------------------------------------------------------= --=0D +=0D +;; Reinterpret operand 1 in operand 0's mode, without changing its content= s.=0D +;; This is equivalent to a subreg on little-endian targets but not for=0D +;; big-endian; see the comment at the head of the file for details.=0D +(define_expand "@vreinterpret"=0D + [(set (match_operand:V 0 "register_operand")=0D + (unspec:V=0D + [(match_operand 1 "vector_any_register_operand")] UNSPEC_REINTERPRET= ))]=0D + "TARGET_VECTOR"=0D +{=0D + machine_mode src_mode =3D GET_MODE (operands[1]);=0D + if (targetm.can_change_mode_class (mode, src_mode, FP_REGS))=0D + {=0D + emit_move_insn (operands[0], gen_lowpart (mode, operands[1]));= =0D + DONE;=0D + }=0D +})=0D +=0D +;; Vector LMUL extension=0D +(define_expand "@vlmul_ext"=0D + [(set (match_operand:VLMULEXT 0 
"register_operand")=0D + (unspec:VLMULEXT=0D + [(match_operand 1 "vector_any_register_operand")] UNSPEC_LMUL_EXT))]= =0D + "TARGET_VECTOR"=0D +{=0D +})=0D +=0D +(define_insn_and_split "*vlmul_ext"=0D + [(set (match_operand:VLMULEXT 0 "register_operand" "=3Dvr, ?&vr")=0D + (unspec:VLMULEXT=0D + [(match_operand:V 1 "register_operand" "0, vr")] UNSPEC_LMUL_EXT))]= =0D + "TARGET_VECTOR"=0D + "#"=0D + "&& reload_completed"=0D + [(const_int 0)]=0D + {=0D + rtx subreg =3D simplify_gen_subreg (mode, operands[0], mode, 0);=0D + riscv_emit_move (subreg, operands[1]);=0D + DONE;=0D + })=0D +=0D +;; Vector LMUL truncation=0D +(define_expand "@vlmul_trunc"=0D + [(set (match_operand:VLMULTRUNC 0 "register_operand")=0D + (unspec:VLMULTRUNC=0D + [(match_operand 1 "vector_any_register_operand")] UNSPEC_LMUL_TRUNC)= )]=0D + "TARGET_VECTOR"=0D +{=0D + rtx subreg =3D simplify_gen_subreg (mode, operands[1], GET_MODE (o= perands[1]), 0);=0D + riscv_emit_move (operands[0], subreg);=0D + DONE;=0D +})=0D +=0D +;; insert a vector to a vector group=0D +(define_expand "@vset"=0D + [(set (match_operand:VSETI 0 "register_operand")=0D + (unspec:VSETI=0D + [(match_operand:VSETI 1 "register_operand" "0")=0D + (match_operand 2 "const_int_operand")=0D + (match_operand 3 "vector_any_register_operand")] UNSPEC_SET_VECTOR)= )]=0D + "TARGET_VECTOR"=0D +{=0D + unsigned int nvecs =3D exact_div (GET_MODE_SIZE (GET_MODE (operands[0]))= ,=0D + GET_MODE_SIZE (GET_MODE (operands[3]))).to_constant ();=0D + poly_int64 offset =3D (INTVAL (operands[2]) & (nvecs - 1))=0D + * GET_MODE_SIZE (GET_MODE (operands[3]));=0D + rtx subreg =3D simplify_gen_subreg (GET_MODE (operands[3]), operands[1],= GET_MODE (operands[1]), offset);=0D + riscv_emit_move (subreg, operands[3]);=0D + riscv_emit_move (operands[0], operands[1]);=0D + DONE;=0D +})=0D +=0D +(define_expand "@vset"=0D + [(set (match_operand:VSETF 0 "register_operand")=0D + (unspec:VSETF=0D + [(match_operand:VSETF 1 "register_operand" "0")=0D + (match_operand 2 "const_int_operand")=0D + (match_operand 3 "vector_any_register_operand")] UNSPEC_SET_VECTOR)= )]=0D + "TARGET_VECTOR"=0D +{=0D + unsigned int nvecs =3D exact_div (GET_MODE_SIZE (GET_MODE (operands[0]))= ,=0D + GET_MODE_SIZE (GET_MODE (operands[3]))).to_constant ();=0D + poly_int64 offset =3D (INTVAL (operands[2]) & (nvecs - 1))=0D + * GET_MODE_SIZE (GET_MODE (operands[3]));=0D + rtx subreg =3D simplify_gen_subreg (GET_MODE (operands[3]), operands[1],= GET_MODE (operands[1]), offset);=0D + riscv_emit_move (subreg, operands[3]);=0D + riscv_emit_move (operands[0], operands[1]);=0D + DONE;=0D +})=0D +=0D +;; get a vector from a vector group=0D +(define_expand "@vget"=0D + [(set (match_operand:VGETI 0 "register_operand")=0D + (unspec:VGETI=0D + [(match_operand 1 "vector_any_register_operand")=0D + (match_operand 2 "const_int_operand")] UNSPEC_GET_VECTOR))]=0D + "TARGET_VECTOR"=0D +{=0D + unsigned int nvecs =3D exact_div (GET_MODE_SIZE (GET_MODE (operands[1]))= ,=0D + GET_MODE_SIZE (GET_MODE (operands[0]))).to_constant ();=0D + poly_int64 offset =3D (INTVAL (operands[2]) & (nvecs - 1))=0D + * GET_MODE_SIZE (GET_MODE (operands[0]));=0D + rtx subreg =3D simplify_gen_subreg (GET_MODE (operands[0]), operands[1],= GET_MODE (operands[1]), offset);=0D + riscv_emit_move (operands[0], subreg);=0D + DONE;=0D +})=0D +=0D +(define_expand "@vget"=0D + [(set (match_operand:VGETF 0 "register_operand")=0D + (unspec:VGETF=0D + [(match_operand 1 "vector_any_register_operand")=0D + (match_operand 2 "const_int_operand")] UNSPEC_GET_VECTOR))]=0D + 
"TARGET_VECTOR"=0D +{=0D + unsigned int nvecs =3D exact_div (GET_MODE_SIZE (GET_MODE (operands[1]))= ,=0D + GET_MODE_SIZE (GET_MODE (operands[0]))).to_constant ();=0D + poly_int64 offset =3D (INTVAL (operands[2]) & (nvecs - 1))=0D + * GET_MODE_SIZE (GET_MODE (operands[0]));=0D + rtx subreg =3D simplify_gen_subreg (GET_MODE (operands[0]), operands[1],= GET_MODE (operands[1]), offset);=0D + riscv_emit_move (operands[0], subreg);=0D + DONE;=0D +})=0D +=0D ;; -----------------------------------------------------------------------= --------=0D ;; ---- 7. Vector Loads and Stores=0D ;; -----------------------------------------------------------------------= --------=0D diff --git a/gcc/testsuite/g++.target/riscv/rvv/misc_func.C b/gcc/testsuite= /g++.target/riscv/rvv/misc_func.C=0D new file mode 100644=0D index 00000000000..b527bce1cab=0D --- /dev/null=0D +++ b/gcc/testsuite/g++.target/riscv/rvv/misc_func.C=0D @@ -0,0 +1,2597 @@=0D +/* { dg-do compile } */=0D +/* { dg-skip-if "test vector intrinsic" { *-*-* } { "*" } { "-march=3Drv*v= *" } } */=0D +=0D +#include =0D +#include =0D +=0D +vuint8mf8_t test_vreinterpret_v_i8mf8_u8mf8(vint8mf8_t src)=0D +{=0D + vuint8mf8_t a =3D vreinterpret_u8mf8(src);=0D + return a;=0D +}=0D +=0D +vuint8mf4_t test_vreinterpret_v_i8mf4_u8mf4(vint8mf4_t src)=0D +{=0D + vuint8mf4_t a =3D vreinterpret_u8mf4(src);=0D + return a;=0D +}=0D +=0D +vuint8mf2_t test_vreinterpret_v_i8mf2_u8mf2(vint8mf2_t src)=0D +{=0D + vuint8mf2_t a =3D vreinterpret_u8mf2(src);=0D + return a;=0D +}=0D +=0D +vuint8m1_t test_vreinterpret_v_i8m1_u8m1(vint8m1_t src)=0D +{=0D + vuint8m1_t a =3D vreinterpret_u8m1(src);=0D + return a;=0D +}=0D +=0D +vuint8m2_t test_vreinterpret_v_i8m2_u8m2(vint8m2_t src)=0D +{=0D + vuint8m2_t a =3D vreinterpret_u8m2(src);=0D + return a;=0D +}=0D +=0D +vuint8m4_t test_vreinterpret_v_i8m4_u8m4(vint8m4_t src)=0D +{=0D + vuint8m4_t a =3D vreinterpret_u8m4(src);=0D + return a;=0D +}=0D +=0D +vuint8m8_t test_vreinterpret_v_i8m8_u8m8(vint8m8_t src)=0D +{=0D + vuint8m8_t a =3D vreinterpret_u8m8(src);=0D + return a;=0D +}=0D +=0D +vint8mf8_t test_vreinterpret_v_u8mf8_i8mf8(vuint8mf8_t src)=0D +{=0D + vint8mf8_t a =3D vreinterpret_i8mf8(src);=0D + return a;=0D +}=0D +=0D +vint8mf4_t test_vreinterpret_v_u8mf4_i8mf4(vuint8mf4_t src)=0D +{=0D + vint8mf4_t a =3D vreinterpret_i8mf4(src);=0D + return a;=0D +}=0D +=0D +vint8mf2_t test_vreinterpret_v_u8mf2_i8mf2(vuint8mf2_t src)=0D +{=0D + vint8mf2_t a =3D vreinterpret_i8mf2(src);=0D + return a;=0D +}=0D +=0D +vint8m1_t test_vreinterpret_v_u8m1_i8m1(vuint8m1_t src)=0D +{=0D + vint8m1_t a =3D vreinterpret_i8m1(src);=0D + return a;=0D +}=0D +=0D +vint8m2_t test_vreinterpret_v_u8m2_i8m2(vuint8m2_t src)=0D +{=0D + vint8m2_t a =3D vreinterpret_i8m2(src);=0D + return a;=0D +}=0D +=0D +vint8m4_t test_vreinterpret_v_u8m4_i8m4(vuint8m4_t src)=0D +{=0D + vint8m4_t a =3D vreinterpret_i8m4(src);=0D + return a;=0D +}=0D +=0D +vint8m8_t test_vreinterpret_v_u8m8_i8m8(vuint8m8_t src)=0D +{=0D + vint8m8_t a =3D vreinterpret_i8m8(src);=0D + return a;=0D +}=0D +=0D +vuint16mf4_t test_vreinterpret_v_i16mf4_u16mf4(vint16mf4_t src)=0D +{=0D + vuint16mf4_t a =3D vreinterpret_u16mf4(src);=0D + return a;=0D +}=0D +=0D +vuint16mf2_t test_vreinterpret_v_i16mf2_u16mf2(vint16mf2_t src)=0D +{=0D + vuint16mf2_t a =3D vreinterpret_u16mf2(src);=0D + return a;=0D +}=0D +=0D +vuint16m1_t test_vreinterpret_v_i16m1_u16m1(vint16m1_t src)=0D +{=0D + vuint16m1_t a =3D vreinterpret_u16m1(src);=0D + return a;=0D +}=0D +=0D +vuint16m2_t test_vreinterpret_v_i16m2_u16m2(vint16m2_t 
+vuint16m2_t test_vreinterpret_v_i16m2_u16m2(vint16m2_t src)
+{
+    vuint16m2_t a = vreinterpret_u16m2(src);
+    return a;
+}
+
+vuint16m4_t test_vreinterpret_v_i16m4_u16m4(vint16m4_t src)
+{
+    vuint16m4_t a = vreinterpret_u16m4(src);
+    return a;
+}
+
+vuint16m8_t test_vreinterpret_v_i16m8_u16m8(vint16m8_t src)
+{
+    vuint16m8_t a = vreinterpret_u16m8(src);
+    return a;
+}
+
+vint16mf4_t test_vreinterpret_v_u16mf4_i16mf4(vuint16mf4_t src)
+{
+    vint16mf4_t a = vreinterpret_i16mf4(src);
+    return a;
+}
+
+vint16mf2_t test_vreinterpret_v_u16mf2_i16mf2(vuint16mf2_t src)
+{
+    vint16mf2_t a = vreinterpret_i16mf2(src);
+    return a;
+}
+
+vint16m1_t test_vreinterpret_v_u16m1_i16m1(vuint16m1_t src)
+{
+    vint16m1_t a = vreinterpret_i16m1(src);
+    return a;
+}
+
+vint16m2_t test_vreinterpret_v_u16m2_i16m2(vuint16m2_t src)
+{
+    vint16m2_t a = vreinterpret_i16m2(src);
+    return a;
+}
+
+vint16m4_t test_vreinterpret_v_u16m4_i16m4(vuint16m4_t src)
+{
+    vint16m4_t a = vreinterpret_i16m4(src);
+    return a;
+}
+
+vint16m8_t test_vreinterpret_v_u16m8_i16m8(vuint16m8_t src)
+{
+    vint16m8_t a = vreinterpret_i16m8(src);
+    return a;
+}
+
+vuint32mf2_t test_vreinterpret_v_i32mf2_u32mf2(vint32mf2_t src)
+{
+    vuint32mf2_t a = vreinterpret_u32mf2(src);
+    return a;
+}
+
+vuint32m1_t test_vreinterpret_v_i32m1_u32m1(vint32m1_t src)
+{
+    vuint32m1_t a = vreinterpret_u32m1(src);
+    return a;
+}
+
+vuint32m2_t test_vreinterpret_v_i32m2_u32m2(vint32m2_t src)
+{
+    vuint32m2_t a = vreinterpret_u32m2(src);
+    return a;
+}
+
+vuint32m4_t test_vreinterpret_v_i32m4_u32m4(vint32m4_t src)
+{
+    vuint32m4_t a = vreinterpret_u32m4(src);
+    return a;
+}
+
+vuint32m8_t test_vreinterpret_v_i32m8_u32m8(vint32m8_t src)
+{
+    vuint32m8_t a = vreinterpret_u32m8(src);
+    return a;
+}
+
+vfloat32mf2_t test_vreinterpret_v_i32mf2_f32mf2(vint32mf2_t src)
+{
+    vfloat32mf2_t a = vreinterpret_f32mf2(src);
+    return a;
+}
+
+vfloat32m1_t test_vreinterpret_v_i32m1_f32m1(vint32m1_t src)
+{
+    vfloat32m1_t a = vreinterpret_f32m1(src);
+    return a;
+}
+
+vfloat32m2_t test_vreinterpret_v_i32m2_f32m2(vint32m2_t src)
+{
+    vfloat32m2_t a = vreinterpret_f32m2(src);
+    return a;
+}
+
+vfloat32m4_t test_vreinterpret_v_i32m4_f32m4(vint32m4_t src)
+{
+    vfloat32m4_t a = vreinterpret_f32m4(src);
+    return a;
+}
+
+vfloat32m8_t test_vreinterpret_v_i32m8_f32m8(vint32m8_t src)
+{
+    vfloat32m8_t a = vreinterpret_f32m8(src);
+    return a;
+}
+
+vint32mf2_t test_vreinterpret_v_u32mf2_i32mf2(vuint32mf2_t src)
+{
+    vint32mf2_t a = vreinterpret_i32mf2(src);
+    return a;
+}
+
+vint32m1_t test_vreinterpret_v_u32m1_i32m1(vuint32m1_t src)
+{
+    vint32m1_t a = vreinterpret_i32m1(src);
+    return a;
+}
+
+vint32m2_t test_vreinterpret_v_u32m2_i32m2(vuint32m2_t src)
+{
+    vint32m2_t a = vreinterpret_i32m2(src);
+    return a;
+}
+
+vint32m4_t test_vreinterpret_v_u32m4_i32m4(vuint32m4_t src)
+{
+    vint32m4_t a = vreinterpret_i32m4(src);
+    return a;
+}
+
+vint32m8_t test_vreinterpret_v_u32m8_i32m8(vuint32m8_t src)
+{
+    vint32m8_t a = vreinterpret_i32m8(src);
+    return a;
+}
+
+vfloat32mf2_t test_vreinterpret_v_u32mf2_f32mf2(vuint32mf2_t src)
+{
+    vfloat32mf2_t a = vreinterpret_f32mf2(src);
+    return a;
+}
+
+vfloat32m1_t test_vreinterpret_v_u32m1_f32m1(vuint32m1_t src)
+{
+    vfloat32m1_t a = vreinterpret_f32m1(src);
+    return a;
+}
+
+vfloat32m2_t test_vreinterpret_v_u32m2_f32m2(vuint32m2_t src)
+{
+    vfloat32m2_t a = vreinterpret_f32m2(src);
+    return a;
+}
+
+vfloat32m4_t test_vreinterpret_v_u32m4_f32m4(vuint32m4_t src)
+{
+    vfloat32m4_t a = vreinterpret_f32m4(src);
+    return a;
+}
+
+vfloat32m8_t test_vreinterpret_v_u32m8_f32m8(vuint32m8_t src)
+{
+    vfloat32m8_t a = vreinterpret_f32m8(src);
+    return a;
+}
+
+vint32mf2_t test_vreinterpret_v_f32mf2_i32mf2(vfloat32mf2_t src)
+{
+    vint32mf2_t a = vreinterpret_i32mf2(src);
+    return a;
+}
+
+vint32m1_t test_vreinterpret_v_f32m1_i32m1(vfloat32m1_t src)
+{
+    vint32m1_t a = vreinterpret_i32m1(src);
+    return a;
+}
+
+vint32m2_t test_vreinterpret_v_f32m2_i32m2(vfloat32m2_t src)
+{
+    vint32m2_t a = vreinterpret_i32m2(src);
+    return a;
+}
+
+vint32m4_t test_vreinterpret_v_f32m4_i32m4(vfloat32m4_t src)
+{
+    vint32m4_t a = vreinterpret_i32m4(src);
+    return a;
+}
+
+vint32m8_t test_vreinterpret_v_f32m8_i32m8(vfloat32m8_t src)
+{
+    vint32m8_t a = vreinterpret_i32m8(src);
+    return a;
+}
+
+vuint32mf2_t test_vreinterpret_v_f32mf2_u32mf2(vfloat32mf2_t src)
+{
+    vuint32mf2_t a = vreinterpret_u32mf2(src);
+    return a;
+}
+
+vuint32m1_t test_vreinterpret_v_f32m1_u32m1(vfloat32m1_t src)
+{
+    vuint32m1_t a = vreinterpret_u32m1(src);
+    return a;
+}
+
+vuint32m2_t test_vreinterpret_v_f32m2_u32m2(vfloat32m2_t src)
+{
+    vuint32m2_t a = vreinterpret_u32m2(src);
+    return a;
+}
+
+vuint32m4_t test_vreinterpret_v_f32m4_u32m4(vfloat32m4_t src)
+{
+    vuint32m4_t a = vreinterpret_u32m4(src);
+    return a;
+}
+
+vuint32m8_t test_vreinterpret_v_f32m8_u32m8(vfloat32m8_t src)
+{
+    vuint32m8_t a = vreinterpret_u32m8(src);
+    return a;
+}
+
+vuint64m1_t test_vreinterpret_v_i64m1_u64m1(vint64m1_t src)
+{
+    vuint64m1_t a = vreinterpret_u64m1(src);
+    return a;
+}
+
+vuint64m2_t test_vreinterpret_v_i64m2_u64m2(vint64m2_t src)
+{
+    vuint64m2_t a = vreinterpret_u64m2(src);
+    return a;
+}
+
+vuint64m4_t test_vreinterpret_v_i64m4_u64m4(vint64m4_t src)
+{
+    vuint64m4_t a = vreinterpret_u64m4(src);
+    return a;
+}
+
+vuint64m8_t test_vreinterpret_v_i64m8_u64m8(vint64m8_t src)
+{
+    vuint64m8_t a = vreinterpret_u64m8(src);
+    return a;
+}
+
+vfloat64m1_t test_vreinterpret_v_i64m1_f64m1(vint64m1_t src)
+{
+    vfloat64m1_t a = vreinterpret_f64m1(src);
+    return a;
+}
+
+vfloat64m2_t test_vreinterpret_v_i64m2_f64m2(vint64m2_t src)
+{
+    vfloat64m2_t a = vreinterpret_f64m2(src);
+    return a;
+}
+
+vfloat64m4_t test_vreinterpret_v_i64m4_f64m4(vint64m4_t src)
+{
+    vfloat64m4_t a = vreinterpret_f64m4(src);
+    return a;
+}
+
+vfloat64m8_t test_vreinterpret_v_i64m8_f64m8(vint64m8_t src)
+{
+    vfloat64m8_t a = vreinterpret_f64m8(src);
+    return a;
+}
+
+vint64m1_t test_vreinterpret_v_u64m1_i64m1(vuint64m1_t src)
+{
+    vint64m1_t a = vreinterpret_i64m1(src);
+    return a;
+}
+
+vint64m2_t test_vreinterpret_v_u64m2_i64m2(vuint64m2_t src)
+{
+    vint64m2_t a = vreinterpret_i64m2(src);
+    return a;
+}
+
+vint64m4_t test_vreinterpret_v_u64m4_i64m4(vuint64m4_t src)=0D +{=0D + vint64m4_t a =3D vreinterpret_i64m4(src);=0D + return a;=0D +}=0D +=0D +vint64m8_t test_vreinterpret_v_u64m8_i64m8(vuint64m8_t src)=0D +{=0D + vint64m8_t a =3D vreinterpret_i64m8(src);=0D + return a;=0D +}=0D +=0D +vfloat64m1_t test_vreinterpret_v_u64m1_f64m1(vuint64m1_t src)=0D +{=0D + vfloat64m1_t a =3D vreinterpret_f64m1(src);=0D + return a;=0D +}=0D +=0D +vfloat64m2_t test_vreinterpret_v_u64m2_f64m2(vuint64m2_t src)=0D +{=0D + vfloat64m2_t a =3D vreinterpret_f64m2(src);=0D + return a;=0D +}=0D +=0D +vfloat64m4_t test_vreinterpret_v_u64m4_f64m4(vuint64m4_t src)=0D +{=0D + vfloat64m4_t a =3D vreinterpret_f64m4(src);=0D + return a;=0D +}=0D +=0D +vfloat64m8_t test_vreinterpret_v_u64m8_f64m8(vuint64m8_t src)=0D +{=0D + vfloat64m8_t a =3D vreinterpret_f64m8(src);=0D + return a;=0D +}=0D +=0D +vint64m1_t test_vreinterpret_v_f64m1_i64m1(vfloat64m1_t src)=0D +{=0D + vint64m1_t a =3D vreinterpret_i64m1(src);=0D + return a;=0D +}=0D +=0D +vint64m2_t test_vreinterpret_v_f64m2_i64m2(vfloat64m2_t src)=0D +{=0D + vint64m2_t a =3D vreinterpret_i64m2(src);=0D + return a;=0D +}=0D +=0D +vint64m4_t test_vreinterpret_v_f64m4_i64m4(vfloat64m4_t src)=0D +{=0D + vint64m4_t a =3D vreinterpret_i64m4(src);=0D + return a;=0D +}=0D +=0D +vint64m8_t test_vreinterpret_v_f64m8_i64m8(vfloat64m8_t src)=0D +{=0D + vint64m8_t a =3D vreinterpret_i64m8(src);=0D + return a;=0D +}=0D +=0D +vuint64m1_t test_vreinterpret_v_f64m1_u64m1(vfloat64m1_t src)=0D +{=0D + vuint64m1_t a =3D vreinterpret_u64m1(src);=0D + return a;=0D +}=0D +=0D +vuint64m2_t test_vreinterpret_v_f64m2_u64m2(vfloat64m2_t src)=0D +{=0D + vuint64m2_t a =3D vreinterpret_u64m2(src);=0D + return a;=0D +}=0D +=0D +vuint64m4_t test_vreinterpret_v_f64m4_u64m4(vfloat64m4_t src)=0D +{=0D + vuint64m4_t a =3D vreinterpret_u64m4(src);=0D + return a;=0D +}=0D +=0D +vuint64m8_t test_vreinterpret_v_f64m8_u64m8(vfloat64m8_t src)=0D +{=0D + vuint64m8_t a =3D vreinterpret_u64m8(src);=0D + return a;=0D +}=0D +=0D +vint16mf4_t test_vreinterpret_v_i8mf4_i16mf4(vint8mf4_t src)=0D +{=0D + vint16mf4_t a =3D vreinterpret_i16mf4(src);=0D + return a;=0D +}=0D +=0D +vint16mf2_t test_vreinterpret_v_i8mf2_i16mf2(vint8mf2_t src)=0D +{=0D + vint16mf2_t a =3D vreinterpret_i16mf2(src);=0D + return a;=0D +}=0D +=0D +vint16m1_t test_vreinterpret_v_i8m1_i16m1(vint8m1_t src)=0D +{=0D + vint16m1_t a =3D vreinterpret_i16m1(src);=0D + return a;=0D +}=0D +=0D +vint16m2_t test_vreinterpret_v_i8m2_i16m2(vint8m2_t src)=0D +{=0D + vint16m2_t a =3D vreinterpret_i16m2(src);=0D + return a;=0D +}=0D +=0D +vint16m4_t test_vreinterpret_v_i8m4_i16m4(vint8m4_t src)=0D +{=0D + vint16m4_t a =3D vreinterpret_i16m4(src);=0D + return a;=0D +}=0D +=0D +vint16m8_t test_vreinterpret_v_i8m8_i16m8(vint8m8_t src)=0D +{=0D + vint16m8_t a =3D vreinterpret_i16m8(src);=0D + return a;=0D +}=0D +=0D +vint32mf2_t test_vreinterpret_v_i8mf2_i32mf2(vint8mf2_t src)=0D +{=0D + vint32mf2_t a =3D vreinterpret_i32mf2(src);=0D + return a;=0D +}=0D +=0D +vint32m1_t test_vreinterpret_v_i8m1_i32m1(vint8m1_t src)=0D +{=0D + vint32m1_t a =3D vreinterpret_i32m1(src);=0D + return a;=0D +}=0D +=0D +vint32m2_t test_vreinterpret_v_i8m2_i32m2(vint8m2_t src)=0D +{=0D + vint32m2_t a =3D vreinterpret_i32m2(src);=0D + return a;=0D +}=0D +=0D +vint32m4_t test_vreinterpret_v_i8m4_i32m4(vint8m4_t src)=0D +{=0D + vint32m4_t a =3D vreinterpret_i32m4(src);=0D + return a;=0D +}=0D +=0D +vint32m8_t test_vreinterpret_v_i8m8_i32m8(vint8m8_t src)=0D +{=0D + vint32m8_t a =3D 
vreinterpret_i32m8(src);=0D + return a;=0D +}=0D +=0D +vint64m1_t test_vreinterpret_v_i8m1_i64m1(vint8m1_t src)=0D +{=0D + vint64m1_t a =3D vreinterpret_i64m1(src);=0D + return a;=0D +}=0D +=0D +vint64m2_t test_vreinterpret_v_i8m2_i64m2(vint8m2_t src)=0D +{=0D + vint64m2_t a =3D vreinterpret_i64m2(src);=0D + return a;=0D +}=0D +=0D +vint64m4_t test_vreinterpret_v_i8m4_i64m4(vint8m4_t src)=0D +{=0D + vint64m4_t a =3D vreinterpret_i64m4(src);=0D + return a;=0D +}=0D +=0D +vint64m8_t test_vreinterpret_v_i8m8_i64m8(vint8m8_t src)=0D +{=0D + vint64m8_t a =3D vreinterpret_i64m8(src);=0D + return a;=0D +}=0D +=0D +vint8mf4_t test_vreinterpret_v_i16mf4_i8mf4(vint16mf4_t src)=0D +{=0D + vint8mf4_t a =3D vreinterpret_i8mf4(src);=0D + return a;=0D +}=0D +=0D +vint8mf2_t test_vreinterpret_v_i16mf2_i8mf2(vint16mf2_t src)=0D +{=0D + vint8mf2_t a =3D vreinterpret_i8mf2(src);=0D + return a;=0D +}=0D +=0D +vint8m1_t test_vreinterpret_v_i16m1_i8m1(vint16m1_t src)=0D +{=0D + vint8m1_t a =3D vreinterpret_i8m1(src);=0D + return a;=0D +}=0D +=0D +vint8m2_t test_vreinterpret_v_i16m2_i8m2(vint16m2_t src)=0D +{=0D + vint8m2_t a =3D vreinterpret_i8m2(src);=0D + return a;=0D +}=0D +=0D +vint8m4_t test_vreinterpret_v_i16m4_i8m4(vint16m4_t src)=0D +{=0D + vint8m4_t a =3D vreinterpret_i8m4(src);=0D + return a;=0D +}=0D +=0D +vint8m8_t test_vreinterpret_v_i16m8_i8m8(vint16m8_t src)=0D +{=0D + vint8m8_t a =3D vreinterpret_i8m8(src);=0D + return a;=0D +}=0D +=0D +vint32mf2_t test_vreinterpret_v_i16mf2_i32mf2(vint16mf2_t src)=0D +{=0D + vint32mf2_t a =3D vreinterpret_i32mf2(src);=0D + return a;=0D +}=0D +=0D +vint32m1_t test_vreinterpret_v_i16m1_i32m1(vint16m1_t src)=0D +{=0D + vint32m1_t a =3D vreinterpret_i32m1(src);=0D + return a;=0D +}=0D +=0D +vint32m2_t test_vreinterpret_v_i16m2_i32m2(vint16m2_t src)=0D +{=0D + vint32m2_t a =3D vreinterpret_i32m2(src);=0D + return a;=0D +}=0D +=0D +vint32m4_t test_vreinterpret_v_i16m4_i32m4(vint16m4_t src)=0D +{=0D + vint32m4_t a =3D vreinterpret_i32m4(src);=0D + return a;=0D +}=0D +=0D +vint32m8_t test_vreinterpret_v_i16m8_i32m8(vint16m8_t src)=0D +{=0D + vint32m8_t a =3D vreinterpret_i32m8(src);=0D + return a;=0D +}=0D +=0D +vint64m1_t test_vreinterpret_v_i16m1_i64m1(vint16m1_t src)=0D +{=0D + vint64m1_t a =3D vreinterpret_i64m1(src);=0D + return a;=0D +}=0D +=0D +vint64m2_t test_vreinterpret_v_i16m2_i64m2(vint16m2_t src)=0D +{=0D + vint64m2_t a =3D vreinterpret_i64m2(src);=0D + return a;=0D +}=0D +=0D +vint64m4_t test_vreinterpret_v_i16m4_i64m4(vint16m4_t src)=0D +{=0D + vint64m4_t a =3D vreinterpret_i64m4(src);=0D + return a;=0D +}=0D +=0D +vint64m8_t test_vreinterpret_v_i16m8_i64m8(vint16m8_t src)=0D +{=0D + vint64m8_t a =3D vreinterpret_i64m8(src);=0D + return a;=0D +}=0D +=0D +vint8mf2_t test_vreinterpret_v_i32mf2_i8mf2(vint32mf2_t src)=0D +{=0D + vint8mf2_t a =3D vreinterpret_i8mf2(src);=0D + return a;=0D +}=0D +=0D +vint8m1_t test_vreinterpret_v_i32m1_i8m1(vint32m1_t src)=0D +{=0D + vint8m1_t a =3D vreinterpret_i8m1(src);=0D + return a;=0D +}=0D +=0D +vint8m2_t test_vreinterpret_v_i32m2_i8m2(vint32m2_t src)=0D +{=0D + vint8m2_t a =3D vreinterpret_i8m2(src);=0D + return a;=0D +}=0D +=0D +vint8m4_t test_vreinterpret_v_i32m4_i8m4(vint32m4_t src)=0D +{=0D + vint8m4_t a =3D vreinterpret_i8m4(src);=0D + return a;=0D +}=0D +=0D +vint8m8_t test_vreinterpret_v_i32m8_i8m8(vint32m8_t src)=0D +{=0D + vint8m8_t a =3D vreinterpret_i8m8(src);=0D + return a;=0D +}=0D +=0D +vint16mf2_t test_vreinterpret_v_i32mf2_i16mf2(vint32mf2_t src)=0D +{=0D + vint16mf2_t a =3D 
vreinterpret_i16mf2(src);=0D + return a;=0D +}=0D +=0D +vint16m1_t test_vreinterpret_v_i32m1_i16m1(vint32m1_t src)=0D +{=0D + vint16m1_t a =3D vreinterpret_i16m1(src);=0D + return a;=0D +}=0D +=0D +vint16m2_t test_vreinterpret_v_i32m2_i16m2(vint32m2_t src)=0D +{=0D + vint16m2_t a =3D vreinterpret_i16m2(src);=0D + return a;=0D +}=0D +=0D +vint16m4_t test_vreinterpret_v_i32m4_i16m4(vint32m4_t src)=0D +{=0D + vint16m4_t a =3D vreinterpret_i16m4(src);=0D + return a;=0D +}=0D +=0D +vint16m8_t test_vreinterpret_v_i32m8_i16m8(vint32m8_t src)=0D +{=0D + vint16m8_t a =3D vreinterpret_i16m8(src);=0D + return a;=0D +}=0D +=0D +vint64m1_t test_vreinterpret_v_i32m1_i64m1(vint32m1_t src)=0D +{=0D + vint64m1_t a =3D vreinterpret_i64m1(src);=0D + return a;=0D +}=0D +=0D +vint64m2_t test_vreinterpret_v_i32m2_i64m2(vint32m2_t src)=0D +{=0D + vint64m2_t a =3D vreinterpret_i64m2(src);=0D + return a;=0D +}=0D +=0D +vint64m4_t test_vreinterpret_v_i32m4_i64m4(vint32m4_t src)=0D +{=0D + vint64m4_t a =3D vreinterpret_i64m4(src);=0D + return a;=0D +}=0D +=0D +vint64m8_t test_vreinterpret_v_i32m8_i64m8(vint32m8_t src)=0D +{=0D + vint64m8_t a =3D vreinterpret_i64m8(src);=0D + return a;=0D +}=0D +=0D +vint8m1_t test_vreinterpret_v_i64m1_i8m1(vint64m1_t src)=0D +{=0D + vint8m1_t a =3D vreinterpret_i8m1(src);=0D + return a;=0D +}=0D +=0D +vint8m2_t test_vreinterpret_v_i64m2_i8m2(vint64m2_t src)=0D +{=0D + vint8m2_t a =3D vreinterpret_i8m2(src);=0D + return a;=0D +}=0D +=0D +vint8m4_t test_vreinterpret_v_i64m4_i8m4(vint64m4_t src)=0D +{=0D + vint8m4_t a =3D vreinterpret_i8m4(src);=0D + return a;=0D +}=0D +=0D +vint8m8_t test_vreinterpret_v_i64m8_i8m8(vint64m8_t src)=0D +{=0D + vint8m8_t a =3D vreinterpret_i8m8(src);=0D + return a;=0D +}=0D +=0D +vint16m1_t test_vreinterpret_v_i64m1_i16m1(vint64m1_t src)=0D +{=0D + vint16m1_t a =3D vreinterpret_i16m1(src);=0D + return a;=0D +}=0D +=0D +vint16m2_t test_vreinterpret_v_i64m2_i16m2(vint64m2_t src)=0D +{=0D + vint16m2_t a =3D vreinterpret_i16m2(src);=0D + return a;=0D +}=0D +=0D +vint16m4_t test_vreinterpret_v_i64m4_i16m4(vint64m4_t src)=0D +{=0D + vint16m4_t a =3D vreinterpret_i16m4(src);=0D + return a;=0D +}=0D +=0D +vint16m8_t test_vreinterpret_v_i64m8_i16m8(vint64m8_t src)=0D +{=0D + vint16m8_t a =3D vreinterpret_i16m8(src);=0D + return a;=0D +}=0D +=0D +vint32m1_t test_vreinterpret_v_i64m1_i32m1(vint64m1_t src)=0D +{=0D + vint32m1_t a =3D vreinterpret_i32m1(src);=0D + return a;=0D +}=0D +=0D +vint32m2_t test_vreinterpret_v_i64m2_i32m2(vint64m2_t src)=0D +{=0D + vint32m2_t a =3D vreinterpret_i32m2(src);=0D + return a;=0D +}=0D +=0D +vint32m4_t test_vreinterpret_v_i64m4_i32m4(vint64m4_t src)=0D +{=0D + vint32m4_t a =3D vreinterpret_i32m4(src);=0D + return a;=0D +}=0D +=0D +vint32m8_t test_vreinterpret_v_i64m8_i32m8(vint64m8_t src)=0D +{=0D + vint32m8_t a =3D vreinterpret_i32m8(src);=0D + return a;=0D +}=0D +=0D +vuint16mf4_t test_vreinterpret_v_u8mf4_u16mf4(vuint8mf4_t src)=0D +{=0D + vuint16mf4_t a =3D vreinterpret_u16mf4(src);=0D + return a;=0D +}=0D +=0D +vuint16mf2_t test_vreinterpret_v_u8mf2_u16mf2(vuint8mf2_t src)=0D +{=0D + vuint16mf2_t a =3D vreinterpret_u16mf2(src);=0D + return a;=0D +}=0D +=0D +vuint16m1_t test_vreinterpret_v_u8m1_u16m1(vuint8m1_t src)=0D +{=0D + vuint16m1_t a =3D vreinterpret_u16m1(src);=0D + return a;=0D +}=0D +=0D +vuint16m2_t test_vreinterpret_v_u8m2_u16m2(vuint8m2_t src)=0D +{=0D + vuint16m2_t a =3D vreinterpret_u16m2(src);=0D + return a;=0D +}=0D +=0D +vuint16m4_t test_vreinterpret_v_u8m4_u16m4(vuint8m4_t src)=0D +{=0D + vuint16m4_t a =3D 
vreinterpret_u16m4(src);=0D + return a;=0D +}=0D +=0D +vuint16m8_t test_vreinterpret_v_u8m8_u16m8(vuint8m8_t src)=0D +{=0D + vuint16m8_t a =3D vreinterpret_u16m8(src);=0D + return a;=0D +}=0D +=0D +vuint32mf2_t test_vreinterpret_v_u8mf2_u32mf2(vuint8mf2_t src)=0D +{=0D + vuint32mf2_t a =3D vreinterpret_u32mf2(src);=0D + return a;=0D +}=0D +=0D +vuint32m1_t test_vreinterpret_v_u8m1_u32m1(vuint8m1_t src)=0D +{=0D + vuint32m1_t a =3D vreinterpret_u32m1(src);=0D + return a;=0D +}=0D +=0D +vuint32m2_t test_vreinterpret_v_u8m2_u32m2(vuint8m2_t src)=0D +{=0D + vuint32m2_t a =3D vreinterpret_u32m2(src);=0D + return a;=0D +}=0D +=0D +vuint32m4_t test_vreinterpret_v_u8m4_u32m4(vuint8m4_t src)=0D +{=0D + vuint32m4_t a =3D vreinterpret_u32m4(src);=0D + return a;=0D +}=0D +=0D +vuint32m8_t test_vreinterpret_v_u8m8_u32m8(vuint8m8_t src)=0D +{=0D + vuint32m8_t a =3D vreinterpret_u32m8(src);=0D + return a;=0D +}=0D +=0D +vuint64m1_t test_vreinterpret_v_u8m1_u64m1(vuint8m1_t src)=0D +{=0D + vuint64m1_t a =3D vreinterpret_u64m1(src);=0D + return a;=0D +}=0D +=0D +vuint64m2_t test_vreinterpret_v_u8m2_u64m2(vuint8m2_t src)=0D +{=0D + vuint64m2_t a =3D vreinterpret_u64m2(src);=0D + return a;=0D +}=0D +=0D +vuint64m4_t test_vreinterpret_v_u8m4_u64m4(vuint8m4_t src)=0D +{=0D + vuint64m4_t a =3D vreinterpret_u64m4(src);=0D + return a;=0D +}=0D +=0D +vuint64m8_t test_vreinterpret_v_u8m8_u64m8(vuint8m8_t src)=0D +{=0D + vuint64m8_t a =3D vreinterpret_u64m8(src);=0D + return a;=0D +}=0D +=0D +vuint8mf4_t test_vreinterpret_v_u16mf4_u8mf4(vuint16mf4_t src)=0D +{=0D + vuint8mf4_t a =3D vreinterpret_u8mf4(src);=0D + return a;=0D +}=0D +=0D +vuint8mf2_t test_vreinterpret_v_u16mf2_u8mf2(vuint16mf2_t src)=0D +{=0D + vuint8mf2_t a =3D vreinterpret_u8mf2(src);=0D + return a;=0D +}=0D +=0D +vuint8m1_t test_vreinterpret_v_u16m1_u8m1(vuint16m1_t src)=0D +{=0D + vuint8m1_t a =3D vreinterpret_u8m1(src);=0D + return a;=0D +}=0D +=0D +vuint8m2_t test_vreinterpret_v_u16m2_u8m2(vuint16m2_t src)=0D +{=0D + vuint8m2_t a =3D vreinterpret_u8m2(src);=0D + return a;=0D +}=0D +=0D +vuint8m4_t test_vreinterpret_v_u16m4_u8m4(vuint16m4_t src)=0D +{=0D + vuint8m4_t a =3D vreinterpret_u8m4(src);=0D + return a;=0D +}=0D +=0D +vuint8m8_t test_vreinterpret_v_u16m8_u8m8(vuint16m8_t src)=0D +{=0D + vuint8m8_t a =3D vreinterpret_u8m8(src);=0D + return a;=0D +}=0D +=0D +vuint32mf2_t test_vreinterpret_v_u16mf2_u32mf2(vuint16mf2_t src)=0D +{=0D + vuint32mf2_t a =3D vreinterpret_u32mf2(src);=0D + return a;=0D +}=0D +=0D +vuint32m1_t test_vreinterpret_v_u16m1_u32m1(vuint16m1_t src)=0D +{=0D + vuint32m1_t a =3D vreinterpret_u32m1(src);=0D + return a;=0D +}=0D +=0D +vuint32m2_t test_vreinterpret_v_u16m2_u32m2(vuint16m2_t src)=0D +{=0D + vuint32m2_t a =3D vreinterpret_u32m2(src);=0D + return a;=0D +}=0D +=0D +vuint32m4_t test_vreinterpret_v_u16m4_u32m4(vuint16m4_t src)=0D +{=0D + vuint32m4_t a =3D vreinterpret_u32m4(src);=0D + return a;=0D +}=0D +=0D +vuint32m8_t test_vreinterpret_v_u16m8_u32m8(vuint16m8_t src)=0D +{=0D + vuint32m8_t a =3D vreinterpret_u32m8(src);=0D + return a;=0D +}=0D +=0D +vuint64m1_t test_vreinterpret_v_u16m1_u64m1(vuint16m1_t src)=0D +{=0D + vuint64m1_t a =3D vreinterpret_u64m1(src);=0D + return a;=0D +}=0D +=0D +vuint64m2_t test_vreinterpret_v_u16m2_u64m2(vuint16m2_t src)=0D +{=0D + vuint64m2_t a =3D vreinterpret_u64m2(src);=0D + return a;=0D +}=0D +=0D +vuint64m4_t test_vreinterpret_v_u16m4_u64m4(vuint16m4_t src)=0D +{=0D + vuint64m4_t a =3D vreinterpret_u64m4(src);=0D + return a;=0D +}=0D +=0D +vuint64m8_t 
test_vreinterpret_v_u16m8_u64m8(vuint16m8_t src)=0D +{=0D + vuint64m8_t a =3D vreinterpret_u64m8(src);=0D + return a;=0D +}=0D +=0D +vuint8mf2_t test_vreinterpret_v_u32mf2_u8mf2(vuint32mf2_t src)=0D +{=0D + vuint8mf2_t a =3D vreinterpret_u8mf2(src);=0D + return a;=0D +}=0D +=0D +vuint8m1_t test_vreinterpret_v_u32m1_u8m1(vuint32m1_t src)=0D +{=0D + vuint8m1_t a =3D vreinterpret_u8m1(src);=0D + return a;=0D +}=0D +=0D +vuint8m2_t test_vreinterpret_v_u32m2_u8m2(vuint32m2_t src)=0D +{=0D + vuint8m2_t a =3D vreinterpret_u8m2(src);=0D + return a;=0D +}=0D +=0D +vuint8m4_t test_vreinterpret_v_u32m4_u8m4(vuint32m4_t src)=0D +{=0D + vuint8m4_t a =3D vreinterpret_u8m4(src);=0D + return a;=0D +}=0D +=0D +vuint8m8_t test_vreinterpret_v_u32m8_u8m8(vuint32m8_t src)=0D +{=0D + vuint8m8_t a =3D vreinterpret_u8m8(src);=0D + return a;=0D +}=0D +=0D +vuint16mf2_t test_vreinterpret_v_u32mf2_u16mf2(vuint32mf2_t src)=0D +{=0D + vuint16mf2_t a =3D vreinterpret_u16mf2(src);=0D + return a;=0D +}=0D +=0D +vuint16m1_t test_vreinterpret_v_u32m1_u16m1(vuint32m1_t src)=0D +{=0D + vuint16m1_t a =3D vreinterpret_u16m1(src);=0D + return a;=0D +}=0D +=0D +vuint16m2_t test_vreinterpret_v_u32m2_u16m2(vuint32m2_t src)=0D +{=0D + vuint16m2_t a =3D vreinterpret_u16m2(src);=0D + return a;=0D +}=0D +=0D +vuint16m4_t test_vreinterpret_v_u32m4_u16m4(vuint32m4_t src)=0D +{=0D + vuint16m4_t a =3D vreinterpret_u16m4(src);=0D + return a;=0D +}=0D +=0D +vuint16m8_t test_vreinterpret_v_u32m8_u16m8(vuint32m8_t src)=0D +{=0D + vuint16m8_t a =3D vreinterpret_u16m8(src);=0D + return a;=0D +}=0D +=0D +vuint64m1_t test_vreinterpret_v_u32m1_u64m1(vuint32m1_t src)=0D +{=0D + vuint64m1_t a =3D vreinterpret_u64m1(src);=0D + return a;=0D +}=0D +=0D +vuint64m2_t test_vreinterpret_v_u32m2_u64m2(vuint32m2_t src)=0D +{=0D + vuint64m2_t a =3D vreinterpret_u64m2(src);=0D + return a;=0D +}=0D +=0D +vuint64m4_t test_vreinterpret_v_u32m4_u64m4(vuint32m4_t src)=0D +{=0D + vuint64m4_t a =3D vreinterpret_u64m4(src);=0D + return a;=0D +}=0D +=0D +vuint64m8_t test_vreinterpret_v_u32m8_u64m8(vuint32m8_t src)=0D +{=0D + vuint64m8_t a =3D vreinterpret_u64m8(src);=0D + return a;=0D +}=0D +=0D +vuint8m1_t test_vreinterpret_v_u64m1_u8m1(vuint64m1_t src)=0D +{=0D + vuint8m1_t a =3D vreinterpret_u8m1(src);=0D + return a;=0D +}=0D +=0D +vuint8m2_t test_vreinterpret_v_u64m2_u8m2(vuint64m2_t src)=0D +{=0D + vuint8m2_t a =3D vreinterpret_u8m2(src);=0D + return a;=0D +}=0D +=0D +vuint8m4_t test_vreinterpret_v_u64m4_u8m4(vuint64m4_t src)=0D +{=0D + vuint8m4_t a =3D vreinterpret_u8m4(src);=0D + return a;=0D +}=0D +=0D +vuint8m8_t test_vreinterpret_v_u64m8_u8m8(vuint64m8_t src)=0D +{=0D + vuint8m8_t a =3D vreinterpret_u8m8(src);=0D + return a;=0D +}=0D +=0D +vuint16m1_t test_vreinterpret_v_u64m1_u16m1(vuint64m1_t src)=0D +{=0D + vuint16m1_t a =3D vreinterpret_u16m1(src);=0D + return a;=0D +}=0D +=0D +vuint16m2_t test_vreinterpret_v_u64m2_u16m2(vuint64m2_t src)=0D +{=0D + vuint16m2_t a =3D vreinterpret_u16m2(src);=0D + return a;=0D +}=0D +=0D +vuint16m4_t test_vreinterpret_v_u64m4_u16m4(vuint64m4_t src)=0D +{=0D + vuint16m4_t a =3D vreinterpret_u16m4(src);=0D + return a;=0D +}=0D +=0D +vuint16m8_t test_vreinterpret_v_u64m8_u16m8(vuint64m8_t src)=0D +{=0D + vuint16m8_t a =3D vreinterpret_u16m8(src);=0D + return a;=0D +}=0D +=0D +vuint32m1_t test_vreinterpret_v_u64m1_u32m1(vuint64m1_t src)=0D +{=0D + vuint32m1_t a =3D vreinterpret_u32m1(src);=0D + return a;=0D +}=0D +=0D +vuint32m2_t test_vreinterpret_v_u64m2_u32m2(vuint64m2_t src)=0D +{=0D + vuint32m2_t a =3D 
vreinterpret_u32m2(src);=0D + return a;=0D +}=0D +=0D +vuint32m4_t test_vreinterpret_v_u64m4_u32m4(vuint64m4_t src)=0D +{=0D + vuint32m4_t a =3D vreinterpret_u32m4(src);=0D + return a;=0D +}=0D +=0D +vuint32m8_t test_vreinterpret_v_u64m8_u32m8(vuint64m8_t src)=0D +{=0D + vuint32m8_t a =3D vreinterpret_u32m8(src);=0D + return a;=0D +}=0D +=0D +vint8mf4_t test_vlmul_ext_v_i8mf8_i8mf4(vint8mf8_t op1)=0D +{=0D + vint8mf4_t a =3D vlmul_ext_i8mf4(op1);=0D + return a;=0D +}=0D +=0D +vint8mf2_t test_vlmul_ext_v_i8mf8_i8mf2(vint8mf8_t op1)=0D +{=0D + vint8mf2_t a =3D vlmul_ext_i8mf2(op1);=0D + return a;=0D +}=0D +=0D +vint8m1_t test_vlmul_ext_v_i8mf8_i8m1(vint8mf8_t op1)=0D +{=0D + vint8m1_t a =3D vlmul_ext_i8m1(op1);=0D + return a;=0D +}=0D +=0D +vint8m2_t test_vlmul_ext_v_i8mf8_i8m2(vint8mf8_t op1)=0D +{=0D + vint8m2_t a =3D vlmul_ext_i8m2(op1);=0D + return a;=0D +}=0D +=0D +vint8m4_t test_vlmul_ext_v_i8mf8_i8m4(vint8mf8_t op1)=0D +{=0D + vint8m4_t a =3D vlmul_ext_i8m4(op1);=0D + return a;=0D +}=0D +=0D +vint8m8_t test_vlmul_ext_v_i8mf8_i8m8(vint8mf8_t op1)=0D +{=0D + vint8m8_t a =3D vlmul_ext_i8m8(op1);=0D + return a;=0D +}=0D +=0D +vint8mf2_t test_vlmul_ext_v_i8mf4_i8mf2(vint8mf4_t op1)=0D +{=0D + vint8mf2_t a =3D vlmul_ext_i8mf2(op1);=0D + return a;=0D +}=0D +=0D +vint8m1_t test_vlmul_ext_v_i8mf4_i8m1(vint8mf4_t op1)=0D +{=0D + vint8m1_t a =3D vlmul_ext_i8m1(op1);=0D + return a;=0D +}=0D +=0D +vint8m2_t test_vlmul_ext_v_i8mf4_i8m2(vint8mf4_t op1)=0D +{=0D + vint8m2_t a =3D vlmul_ext_i8m2(op1);=0D + return a;=0D +}=0D +=0D +vint8m4_t test_vlmul_ext_v_i8mf4_i8m4(vint8mf4_t op1)=0D +{=0D + vint8m4_t a =3D vlmul_ext_i8m4(op1);=0D + return a;=0D +}=0D +=0D +vint8m8_t test_vlmul_ext_v_i8mf4_i8m8(vint8mf4_t op1)=0D +{=0D + vint8m8_t a =3D vlmul_ext_i8m8(op1);=0D + return a;=0D +}=0D +=0D +vint8m1_t test_vlmul_ext_v_i8mf2_i8m1(vint8mf2_t op1)=0D +{=0D + vint8m1_t a =3D vlmul_ext_i8m1(op1);=0D + return a;=0D +}=0D +=0D +vint8m2_t test_vlmul_ext_v_i8mf2_i8m2(vint8mf2_t op1)=0D +{=0D + vint8m2_t a =3D vlmul_ext_i8m2(op1);=0D + return a;=0D +}=0D +=0D +vint8m4_t test_vlmul_ext_v_i8mf2_i8m4(vint8mf2_t op1)=0D +{=0D + vint8m4_t a =3D vlmul_ext_i8m4(op1);=0D + return a;=0D +}=0D +=0D +vint8m8_t test_vlmul_ext_v_i8mf2_i8m8(vint8mf2_t op1)=0D +{=0D + vint8m8_t a =3D vlmul_ext_i8m8(op1);=0D + return a;=0D +}=0D +=0D +vint8m2_t test_vlmul_ext_v_i8m1_i8m2(vint8m1_t op1)=0D +{=0D + vint8m2_t a =3D vlmul_ext_i8m2(op1);=0D + return a;=0D +}=0D +=0D +vint8m4_t test_vlmul_ext_v_i8m1_i8m4(vint8m1_t op1)=0D +{=0D + vint8m4_t a =3D vlmul_ext_i8m4(op1);=0D + return a;=0D +}=0D +=0D +vint8m8_t test_vlmul_ext_v_i8m1_i8m8(vint8m1_t op1)=0D +{=0D + vint8m8_t a =3D vlmul_ext_i8m8(op1);=0D + return a;=0D +}=0D +=0D +vint8m4_t test_vlmul_ext_v_i8m2_i8m4(vint8m2_t op1)=0D +{=0D + vint8m4_t a =3D vlmul_ext_i8m4(op1);=0D + return a;=0D +}=0D +=0D +vint8m8_t test_vlmul_ext_v_i8m2_i8m8(vint8m2_t op1)=0D +{=0D + vint8m8_t a =3D vlmul_ext_i8m8(op1);=0D + return a;=0D +}=0D +=0D +vint8m8_t test_vlmul_ext_v_i8m4_i8m8(vint8m4_t op1)=0D +{=0D + vint8m8_t a =3D vlmul_ext_i8m8(op1);=0D + return a;=0D +}=0D +=0D +vint16mf2_t test_vlmul_ext_v_i16mf4_i16mf2(vint16mf4_t op1)=0D +{=0D + vint16mf2_t a =3D vlmul_ext_i16mf2(op1);=0D + return a;=0D +}=0D +=0D +vint16m1_t test_vlmul_ext_v_i16mf4_i16m1(vint16mf4_t op1)=0D +{=0D + vint16m1_t a =3D vlmul_ext_i16m1(op1);=0D + return a;=0D +}=0D +=0D +vint16m2_t test_vlmul_ext_v_i16mf4_i16m2(vint16mf4_t op1)=0D +{=0D + vint16m2_t a =3D vlmul_ext_i16m2(op1);=0D + return a;=0D +}=0D +=0D +vint16m4_t 
test_vlmul_ext_v_i16mf4_i16m4(vint16mf4_t op1)=0D +{=0D + vint16m4_t a =3D vlmul_ext_i16m4(op1);=0D + return a;=0D +}=0D +=0D +vint16m8_t test_vlmul_ext_v_i16mf4_i16m8(vint16mf4_t op1)=0D +{=0D + vint16m8_t a =3D vlmul_ext_i16m8(op1);=0D + return a;=0D +}=0D +=0D +vint16m1_t test_vlmul_ext_v_i16mf2_i16m1(vint16mf2_t op1)=0D +{=0D + vint16m1_t a =3D vlmul_ext_i16m1(op1);=0D + return a;=0D +}=0D +=0D +vint16m2_t test_vlmul_ext_v_i16mf2_i16m2(vint16mf2_t op1)=0D +{=0D + vint16m2_t a =3D vlmul_ext_i16m2(op1);=0D + return a;=0D +}=0D +=0D +vint16m4_t test_vlmul_ext_v_i16mf2_i16m4(vint16mf2_t op1)=0D +{=0D + vint16m4_t a =3D vlmul_ext_i16m4(op1);=0D + return a;=0D +}=0D +=0D +vint16m8_t test_vlmul_ext_v_i16mf2_i16m8(vint16mf2_t op1)=0D +{=0D + vint16m8_t a =3D vlmul_ext_i16m8(op1);=0D + return a;=0D +}=0D +=0D +vint16m2_t test_vlmul_ext_v_i16m1_i16m2(vint16m1_t op1)=0D +{=0D + vint16m2_t a =3D vlmul_ext_i16m2(op1);=0D + return a;=0D +}=0D +=0D +vint16m4_t test_vlmul_ext_v_i16m1_i16m4(vint16m1_t op1)=0D +{=0D + vint16m4_t a =3D vlmul_ext_i16m4(op1);=0D + return a;=0D +}=0D +=0D +vint16m8_t test_vlmul_ext_v_i16m1_i16m8(vint16m1_t op1)=0D +{=0D + vint16m8_t a =3D vlmul_ext_i16m8(op1);=0D + return a;=0D +}=0D +=0D +vint16m4_t test_vlmul_ext_v_i16m2_i16m4(vint16m2_t op1)=0D +{=0D + vint16m4_t a =3D vlmul_ext_i16m4(op1);=0D + return a;=0D +}=0D +=0D +vint16m8_t test_vlmul_ext_v_i16m2_i16m8(vint16m2_t op1)=0D +{=0D + vint16m8_t a =3D vlmul_ext_i16m8(op1);=0D + return a;=0D +}=0D +=0D +vint16m8_t test_vlmul_ext_v_i16m4_i16m8(vint16m4_t op1)=0D +{=0D + vint16m8_t a =3D vlmul_ext_i16m8(op1);=0D + return a;=0D +}=0D +=0D +vint32m1_t test_vlmul_ext_v_i32mf2_i32m1(vint32mf2_t op1)=0D +{=0D + vint32m1_t a =3D vlmul_ext_i32m1(op1);=0D + return a;=0D +}=0D +=0D +vint32m2_t test_vlmul_ext_v_i32mf2_i32m2(vint32mf2_t op1)=0D +{=0D + vint32m2_t a =3D vlmul_ext_i32m2(op1);=0D + return a;=0D +}=0D +=0D +vint32m4_t test_vlmul_ext_v_i32mf2_i32m4(vint32mf2_t op1)=0D +{=0D + vint32m4_t a =3D vlmul_ext_i32m4(op1);=0D + return a;=0D +}=0D +=0D +vint32m8_t test_vlmul_ext_v_i32mf2_i32m8(vint32mf2_t op1)=0D +{=0D + vint32m8_t a =3D vlmul_ext_i32m8(op1);=0D + return a;=0D +}=0D +=0D +vint32m2_t test_vlmul_ext_v_i32m1_i32m2(vint32m1_t op1)=0D +{=0D + vint32m2_t a =3D vlmul_ext_i32m2(op1);=0D + return a;=0D +}=0D +=0D +vint32m4_t test_vlmul_ext_v_i32m1_i32m4(vint32m1_t op1)=0D +{=0D + vint32m4_t a =3D vlmul_ext_i32m4(op1);=0D + return a;=0D +}=0D +=0D +vint32m8_t test_vlmul_ext_v_i32m1_i32m8(vint32m1_t op1)=0D +{=0D + vint32m8_t a =3D vlmul_ext_i32m8(op1);=0D + return a;=0D +}=0D +=0D +vint32m4_t test_vlmul_ext_v_i32m2_i32m4(vint32m2_t op1)=0D +{=0D + vint32m4_t a =3D vlmul_ext_i32m4(op1);=0D + return a;=0D +}=0D +=0D +vint32m8_t test_vlmul_ext_v_i32m2_i32m8(vint32m2_t op1)=0D +{=0D + vint32m8_t a =3D vlmul_ext_i32m8(op1);=0D + return a;=0D +}=0D +=0D +vint32m8_t test_vlmul_ext_v_i32m4_i32m8(vint32m4_t op1)=0D +{=0D + vint32m8_t a =3D vlmul_ext_i32m8(op1);=0D + return a;=0D +}=0D +=0D +vint64m2_t test_vlmul_ext_v_i64m1_i64m2(vint64m1_t op1)=0D +{=0D + vint64m2_t a =3D vlmul_ext_i64m2(op1);=0D + return a;=0D +}=0D +=0D +vint64m4_t test_vlmul_ext_v_i64m1_i64m4(vint64m1_t op1)=0D +{=0D + vint64m4_t a =3D vlmul_ext_i64m4(op1);=0D + return a;=0D +}=0D +=0D +vint64m8_t test_vlmul_ext_v_i64m1_i64m8(vint64m1_t op1)=0D +{=0D + vint64m8_t a =3D vlmul_ext_i64m8(op1);=0D + return a;=0D +}=0D +=0D +vint64m4_t test_vlmul_ext_v_i64m2_i64m4(vint64m2_t op1)=0D +{=0D + vint64m4_t a =3D vlmul_ext_i64m4(op1);=0D + return a;=0D +}=0D +=0D 
+vint64m8_t test_vlmul_ext_v_i64m2_i64m8(vint64m2_t op1)=0D +{=0D + vint64m8_t a =3D vlmul_ext_i64m8(op1);=0D + return a;=0D +}=0D +=0D +vint64m8_t test_vlmul_ext_v_i64m4_i64m8(vint64m4_t op1)=0D +{=0D + vint64m8_t a =3D vlmul_ext_i64m8(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf4_t test_vlmul_ext_v_u8mf8_u8mf4(vuint8mf8_t op1)=0D +{=0D + vuint8mf4_t a =3D vlmul_ext_u8mf4(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf2_t test_vlmul_ext_v_u8mf8_u8mf2(vuint8mf8_t op1)=0D +{=0D + vuint8mf2_t a =3D vlmul_ext_u8mf2(op1);=0D + return a;=0D +}=0D +=0D +vuint8m1_t test_vlmul_ext_v_u8mf8_u8m1(vuint8mf8_t op1)=0D +{=0D + vuint8m1_t a =3D vlmul_ext_u8m1(op1);=0D + return a;=0D +}=0D +=0D +vuint8m2_t test_vlmul_ext_v_u8mf8_u8m2(vuint8mf8_t op1)=0D +{=0D + vuint8m2_t a =3D vlmul_ext_u8m2(op1);=0D + return a;=0D +}=0D +=0D +vuint8m4_t test_vlmul_ext_v_u8mf8_u8m4(vuint8mf8_t op1)=0D +{=0D + vuint8m4_t a =3D vlmul_ext_u8m4(op1);=0D + return a;=0D +}=0D +=0D +vuint8m8_t test_vlmul_ext_v_u8mf8_u8m8(vuint8mf8_t op1)=0D +{=0D + vuint8m8_t a =3D vlmul_ext_u8m8(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf2_t test_vlmul_ext_v_u8mf4_u8mf2(vuint8mf4_t op1)=0D +{=0D + vuint8mf2_t a =3D vlmul_ext_u8mf2(op1);=0D + return a;=0D +}=0D +=0D +vuint8m1_t test_vlmul_ext_v_u8mf4_u8m1(vuint8mf4_t op1)=0D +{=0D + vuint8m1_t a =3D vlmul_ext_u8m1(op1);=0D + return a;=0D +}=0D +=0D +vuint8m2_t test_vlmul_ext_v_u8mf4_u8m2(vuint8mf4_t op1)=0D +{=0D + vuint8m2_t a =3D vlmul_ext_u8m2(op1);=0D + return a;=0D +}=0D +=0D +vuint8m4_t test_vlmul_ext_v_u8mf4_u8m4(vuint8mf4_t op1)=0D +{=0D + vuint8m4_t a =3D vlmul_ext_u8m4(op1);=0D + return a;=0D +}=0D +=0D +vuint8m8_t test_vlmul_ext_v_u8mf4_u8m8(vuint8mf4_t op1)=0D +{=0D + vuint8m8_t a =3D vlmul_ext_u8m8(op1);=0D + return a;=0D +}=0D +=0D +vuint8m1_t test_vlmul_ext_v_u8mf2_u8m1(vuint8mf2_t op1)=0D +{=0D + vuint8m1_t a =3D vlmul_ext_u8m1(op1);=0D + return a;=0D +}=0D +=0D +vuint8m2_t test_vlmul_ext_v_u8mf2_u8m2(vuint8mf2_t op1)=0D +{=0D + vuint8m2_t a =3D vlmul_ext_u8m2(op1);=0D + return a;=0D +}=0D +=0D +vuint8m4_t test_vlmul_ext_v_u8mf2_u8m4(vuint8mf2_t op1)=0D +{=0D + vuint8m4_t a =3D vlmul_ext_u8m4(op1);=0D + return a;=0D +}=0D +=0D +vuint8m8_t test_vlmul_ext_v_u8mf2_u8m8(vuint8mf2_t op1)=0D +{=0D + vuint8m8_t a =3D vlmul_ext_u8m8(op1);=0D + return a;=0D +}=0D +=0D +vuint8m2_t test_vlmul_ext_v_u8m1_u8m2(vuint8m1_t op1)=0D +{=0D + vuint8m2_t a =3D vlmul_ext_u8m2(op1);=0D + return a;=0D +}=0D +=0D +vuint8m4_t test_vlmul_ext_v_u8m1_u8m4(vuint8m1_t op1)=0D +{=0D + vuint8m4_t a =3D vlmul_ext_u8m4(op1);=0D + return a;=0D +}=0D +=0D +vuint8m8_t test_vlmul_ext_v_u8m1_u8m8(vuint8m1_t op1)=0D +{=0D + vuint8m8_t a =3D vlmul_ext_u8m8(op1);=0D + return a;=0D +}=0D +=0D +vuint8m4_t test_vlmul_ext_v_u8m2_u8m4(vuint8m2_t op1)=0D +{=0D + vuint8m4_t a =3D vlmul_ext_u8m4(op1);=0D + return a;=0D +}=0D +=0D +vuint8m8_t test_vlmul_ext_v_u8m2_u8m8(vuint8m2_t op1)=0D +{=0D + vuint8m8_t a =3D vlmul_ext_u8m8(op1);=0D + return a;=0D +}=0D +=0D +vuint8m8_t test_vlmul_ext_v_u8m4_u8m8(vuint8m4_t op1)=0D +{=0D + vuint8m8_t a =3D vlmul_ext_u8m8(op1);=0D + return a;=0D +}=0D +=0D +vuint16mf2_t test_vlmul_ext_v_u16mf4_u16mf2(vuint16mf4_t op1)=0D +{=0D + vuint16mf2_t a =3D vlmul_ext_u16mf2(op1);=0D + return a;=0D +}=0D +=0D +vuint16m1_t test_vlmul_ext_v_u16mf4_u16m1(vuint16mf4_t op1)=0D +{=0D + vuint16m1_t a =3D vlmul_ext_u16m1(op1);=0D + return a;=0D +}=0D +=0D +vuint16m2_t test_vlmul_ext_v_u16mf4_u16m2(vuint16mf4_t op1)=0D +{=0D + vuint16m2_t a =3D vlmul_ext_u16m2(op1);=0D + return a;=0D +}=0D +=0D +vuint16m4_t 
test_vlmul_ext_v_u16mf4_u16m4(vuint16mf4_t op1)=0D +{=0D + vuint16m4_t a =3D vlmul_ext_u16m4(op1);=0D + return a;=0D +}=0D +=0D +vuint16m8_t test_vlmul_ext_v_u16mf4_u16m8(vuint16mf4_t op1)=0D +{=0D + vuint16m8_t a =3D vlmul_ext_u16m8(op1);=0D + return a;=0D +}=0D +=0D +vuint16m1_t test_vlmul_ext_v_u16mf2_u16m1(vuint16mf2_t op1)=0D +{=0D + vuint16m1_t a =3D vlmul_ext_u16m1(op1);=0D + return a;=0D +}=0D +=0D +vuint16m2_t test_vlmul_ext_v_u16mf2_u16m2(vuint16mf2_t op1)=0D +{=0D + vuint16m2_t a =3D vlmul_ext_u16m2(op1);=0D + return a;=0D +}=0D +=0D +vuint16m4_t test_vlmul_ext_v_u16mf2_u16m4(vuint16mf2_t op1)=0D +{=0D + vuint16m4_t a =3D vlmul_ext_u16m4(op1);=0D + return a;=0D +}=0D +=0D +vuint16m8_t test_vlmul_ext_v_u16mf2_u16m8(vuint16mf2_t op1)=0D +{=0D + vuint16m8_t a =3D vlmul_ext_u16m8(op1);=0D + return a;=0D +}=0D +=0D +vuint16m2_t test_vlmul_ext_v_u16m1_u16m2(vuint16m1_t op1)=0D +{=0D + vuint16m2_t a =3D vlmul_ext_u16m2(op1);=0D + return a;=0D +}=0D +=0D +vuint16m4_t test_vlmul_ext_v_u16m1_u16m4(vuint16m1_t op1)=0D +{=0D + vuint16m4_t a =3D vlmul_ext_u16m4(op1);=0D + return a;=0D +}=0D +=0D +vuint16m8_t test_vlmul_ext_v_u16m1_u16m8(vuint16m1_t op1)=0D +{=0D + vuint16m8_t a =3D vlmul_ext_u16m8(op1);=0D + return a;=0D +}=0D +=0D +vuint16m4_t test_vlmul_ext_v_u16m2_u16m4(vuint16m2_t op1)=0D +{=0D + vuint16m4_t a =3D vlmul_ext_u16m4(op1);=0D + return a;=0D +}=0D +=0D +vuint16m8_t test_vlmul_ext_v_u16m2_u16m8(vuint16m2_t op1)=0D +{=0D + vuint16m8_t a =3D vlmul_ext_u16m8(op1);=0D + return a;=0D +}=0D +=0D +vuint16m8_t test_vlmul_ext_v_u16m4_u16m8(vuint16m4_t op1)=0D +{=0D + vuint16m8_t a =3D vlmul_ext_u16m8(op1);=0D + return a;=0D +}=0D +=0D +vuint32m1_t test_vlmul_ext_v_u32mf2_u32m1(vuint32mf2_t op1)=0D +{=0D + vuint32m1_t a =3D vlmul_ext_u32m1(op1);=0D + return a;=0D +}=0D +=0D +vuint32m2_t test_vlmul_ext_v_u32mf2_u32m2(vuint32mf2_t op1)=0D +{=0D + vuint32m2_t a =3D vlmul_ext_u32m2(op1);=0D + return a;=0D +}=0D +=0D +vuint32m4_t test_vlmul_ext_v_u32mf2_u32m4(vuint32mf2_t op1)=0D +{=0D + vuint32m4_t a =3D vlmul_ext_u32m4(op1);=0D + return a;=0D +}=0D +=0D +vuint32m8_t test_vlmul_ext_v_u32mf2_u32m8(vuint32mf2_t op1)=0D +{=0D + vuint32m8_t a =3D vlmul_ext_u32m8(op1);=0D + return a;=0D +}=0D +=0D +vuint32m2_t test_vlmul_ext_v_u32m1_u32m2(vuint32m1_t op1)=0D +{=0D + vuint32m2_t a =3D vlmul_ext_u32m2(op1);=0D + return a;=0D +}=0D +=0D +vuint32m4_t test_vlmul_ext_v_u32m1_u32m4(vuint32m1_t op1)=0D +{=0D + vuint32m4_t a =3D vlmul_ext_u32m4(op1);=0D + return a;=0D +}=0D +=0D +vuint32m8_t test_vlmul_ext_v_u32m1_u32m8(vuint32m1_t op1)=0D +{=0D + vuint32m8_t a =3D vlmul_ext_u32m8(op1);=0D + return a;=0D +}=0D +=0D +vuint32m4_t test_vlmul_ext_v_u32m2_u32m4(vuint32m2_t op1)=0D +{=0D + vuint32m4_t a =3D vlmul_ext_u32m4(op1);=0D + return a;=0D +}=0D +=0D +vuint32m8_t test_vlmul_ext_v_u32m2_u32m8(vuint32m2_t op1)=0D +{=0D + vuint32m8_t a =3D vlmul_ext_u32m8(op1);=0D + return a;=0D +}=0D +=0D +vuint32m8_t test_vlmul_ext_v_u32m4_u32m8(vuint32m4_t op1)=0D +{=0D + vuint32m8_t a =3D vlmul_ext_u32m8(op1);=0D + return a;=0D +}=0D +=0D +vuint64m2_t test_vlmul_ext_v_u64m1_u64m2(vuint64m1_t op1)=0D +{=0D + vuint64m2_t a =3D vlmul_ext_u64m2(op1);=0D + return a;=0D +}=0D +=0D +vuint64m4_t test_vlmul_ext_v_u64m1_u64m4(vuint64m1_t op1)=0D +{=0D + vuint64m4_t a =3D vlmul_ext_u64m4(op1);=0D + return a;=0D +}=0D +=0D +vuint64m8_t test_vlmul_ext_v_u64m1_u64m8(vuint64m1_t op1)=0D +{=0D + vuint64m8_t a =3D vlmul_ext_u64m8(op1);=0D + return a;=0D +}=0D +=0D +vuint64m4_t test_vlmul_ext_v_u64m2_u64m4(vuint64m2_t op1)=0D +{=0D + 
vuint64m4_t a =3D vlmul_ext_u64m4(op1);=0D + return a;=0D +}=0D +=0D +vuint64m8_t test_vlmul_ext_v_u64m2_u64m8(vuint64m2_t op1)=0D +{=0D + vuint64m8_t a =3D vlmul_ext_u64m8(op1);=0D + return a;=0D +}=0D +=0D +vuint64m8_t test_vlmul_ext_v_u64m4_u64m8(vuint64m4_t op1)=0D +{=0D + vuint64m8_t a =3D vlmul_ext_u64m8(op1);=0D + return a;=0D +}=0D +=0D +vfloat32m1_t test_vlmul_ext_v_f32mf2_f32m1(vfloat32mf2_t op1)=0D +{=0D + vfloat32m1_t a =3D vlmul_ext_f32m1(op1);=0D + return a;=0D +}=0D +=0D +vfloat32m2_t test_vlmul_ext_v_f32mf2_f32m2(vfloat32mf2_t op1)=0D +{=0D + vfloat32m2_t a =3D vlmul_ext_f32m2(op1);=0D + return a;=0D +}=0D +=0D +vfloat32m4_t test_vlmul_ext_v_f32mf2_f32m4(vfloat32mf2_t op1)=0D +{=0D + vfloat32m4_t a =3D vlmul_ext_f32m4(op1);=0D + return a;=0D +}=0D +=0D +vfloat32m8_t test_vlmul_ext_v_f32mf2_f32m8(vfloat32mf2_t op1)=0D +{=0D + vfloat32m8_t a =3D vlmul_ext_f32m8(op1);=0D + return a;=0D +}=0D +=0D +vfloat32m2_t test_vlmul_ext_v_f32m1_f32m2(vfloat32m1_t op1)=0D +{=0D + vfloat32m2_t a =3D vlmul_ext_f32m2(op1);=0D + return a;=0D +}=0D +=0D +vfloat32m4_t test_vlmul_ext_v_f32m1_f32m4(vfloat32m1_t op1)=0D +{=0D + vfloat32m4_t a =3D vlmul_ext_f32m4(op1);=0D + return a;=0D +}=0D +=0D +vfloat32m8_t test_vlmul_ext_v_f32m1_f32m8(vfloat32m1_t op1)=0D +{=0D + vfloat32m8_t a =3D vlmul_ext_f32m8(op1);=0D + return a;=0D +}=0D +=0D +vfloat32m4_t test_vlmul_ext_v_f32m2_f32m4(vfloat32m2_t op1)=0D +{=0D + vfloat32m4_t a =3D vlmul_ext_f32m4(op1);=0D + return a;=0D +}=0D +=0D +vfloat32m8_t test_vlmul_ext_v_f32m2_f32m8(vfloat32m2_t op1)=0D +{=0D + vfloat32m8_t a =3D vlmul_ext_f32m8(op1);=0D + return a;=0D +}=0D +=0D +vfloat32m8_t test_vlmul_ext_v_f32m4_f32m8(vfloat32m4_t op1)=0D +{=0D + vfloat32m8_t a =3D vlmul_ext_f32m8(op1);=0D + return a;=0D +}=0D +=0D +vfloat64m2_t test_vlmul_ext_v_f64m1_f64m2(vfloat64m1_t op1)=0D +{=0D + vfloat64m2_t a =3D vlmul_ext_f64m2(op1);=0D + return a;=0D +}=0D +=0D +vfloat64m4_t test_vlmul_ext_v_f64m1_f64m4(vfloat64m1_t op1)=0D +{=0D + vfloat64m4_t a =3D vlmul_ext_f64m4(op1);=0D + return a;=0D +}=0D +=0D +vfloat64m8_t test_vlmul_ext_v_f64m1_f64m8(vfloat64m1_t op1)=0D +{=0D + vfloat64m8_t a =3D vlmul_ext_f64m8(op1);=0D + return a;=0D +}=0D +=0D +vfloat64m4_t test_vlmul_ext_v_f64m2_f64m4(vfloat64m2_t op1)=0D +{=0D + vfloat64m4_t a =3D vlmul_ext_f64m4(op1);=0D + return a;=0D +}=0D +=0D +vfloat64m8_t test_vlmul_ext_v_f64m2_f64m8(vfloat64m2_t op1)=0D +{=0D + vfloat64m8_t a =3D vlmul_ext_f64m8(op1);=0D + return a;=0D +}=0D +=0D +vfloat64m8_t test_vlmul_ext_v_f64m4_f64m8(vfloat64m4_t op1)=0D +{=0D + vfloat64m8_t a =3D vlmul_ext_f64m8(op1);=0D + return a;=0D +}=0D +=0D +vint8mf8_t test_vlmul_trunc_v_i8mf4_i8mf8(vint8mf4_t op1)=0D +{=0D + vint8mf8_t a =3D vlmul_trunc_i8mf8(op1);=0D + return a;=0D +}=0D +=0D +vint8mf8_t test_vlmul_trunc_v_i8mf2_i8mf8(vint8mf2_t op1)=0D +{=0D + vint8mf8_t a =3D vlmul_trunc_i8mf8(op1);=0D + return a;=0D +}=0D +=0D +vint8mf4_t test_vlmul_trunc_v_i8mf2_i8mf4(vint8mf2_t op1)=0D +{=0D + vint8mf4_t a =3D vlmul_trunc_i8mf4(op1);=0D + return a;=0D +}=0D +=0D +vint8mf8_t test_vlmul_trunc_v_i8m1_i8mf8(vint8m1_t op1)=0D +{=0D + vint8mf8_t a =3D vlmul_trunc_i8mf8(op1);=0D + return a;=0D +}=0D +=0D +vint8mf4_t test_vlmul_trunc_v_i8m1_i8mf4(vint8m1_t op1)=0D +{=0D + vint8mf4_t a =3D vlmul_trunc_i8mf4(op1);=0D + return a;=0D +}=0D +=0D +vint8mf2_t test_vlmul_trunc_v_i8m1_i8mf2(vint8m1_t op1)=0D +{=0D + vint8mf2_t a =3D vlmul_trunc_i8mf2(op1);=0D + return a;=0D +}=0D +=0D +vint8mf8_t test_vlmul_trunc_v_i8m2_i8mf8(vint8m2_t op1)=0D +{=0D + vint8mf8_t a =3D 
vlmul_trunc_i8mf8(op1);=0D + return a;=0D +}=0D +=0D +vint8mf4_t test_vlmul_trunc_v_i8m2_i8mf4(vint8m2_t op1)=0D +{=0D + vint8mf4_t a =3D vlmul_trunc_i8mf4(op1);=0D + return a;=0D +}=0D +=0D +vint8mf2_t test_vlmul_trunc_v_i8m2_i8mf2(vint8m2_t op1)=0D +{=0D + vint8mf2_t a =3D vlmul_trunc_i8mf2(op1);=0D + return a;=0D +}=0D +=0D +vint8m1_t test_vlmul_trunc_v_i8m2_i8m1(vint8m2_t op1)=0D +{=0D + vint8m1_t a =3D vlmul_trunc_i8m1(op1);=0D + return a;=0D +}=0D +=0D +vint8mf8_t test_vlmul_trunc_v_i8m4_i8mf8(vint8m4_t op1)=0D +{=0D + vint8mf8_t a =3D vlmul_trunc_i8mf8(op1);=0D + return a;=0D +}=0D +=0D +vint8mf4_t test_vlmul_trunc_v_i8m4_i8mf4(vint8m4_t op1)=0D +{=0D + vint8mf4_t a =3D vlmul_trunc_i8mf4(op1);=0D + return a;=0D +}=0D +=0D +vint8mf2_t test_vlmul_trunc_v_i8m4_i8mf2(vint8m4_t op1)=0D +{=0D + vint8mf2_t a =3D vlmul_trunc_i8mf2(op1);=0D + return a;=0D +}=0D +=0D +vint8m1_t test_vlmul_trunc_v_i8m4_i8m1(vint8m4_t op1)=0D +{=0D + vint8m1_t a =3D vlmul_trunc_i8m1(op1);=0D + return a;=0D +}=0D +=0D +vint8m2_t test_vlmul_trunc_v_i8m4_i8m2(vint8m4_t op1)=0D +{=0D + vint8m2_t a =3D vlmul_trunc_i8m2(op1);=0D + return a;=0D +}=0D +=0D +vint8mf8_t test_vlmul_trunc_v_i8m8_i8mf8(vint8m8_t op1)=0D +{=0D + vint8mf8_t a =3D vlmul_trunc_i8mf8(op1);=0D + return a;=0D +}=0D +=0D +vint8mf4_t test_vlmul_trunc_v_i8m8_i8mf4(vint8m8_t op1)=0D +{=0D + vint8mf4_t a =3D vlmul_trunc_i8mf4(op1);=0D + return a;=0D +}=0D +=0D +vint8mf2_t test_vlmul_trunc_v_i8m8_i8mf2(vint8m8_t op1)=0D +{=0D + vint8mf2_t a =3D vlmul_trunc_i8mf2(op1);=0D + return a;=0D +}=0D +=0D +vint8m1_t test_vlmul_trunc_v_i8m8_i8m1(vint8m8_t op1)=0D +{=0D + vint8m1_t a =3D vlmul_trunc_i8m1(op1);=0D + return a;=0D +}=0D +=0D +vint8m2_t test_vlmul_trunc_v_i8m8_i8m2(vint8m8_t op1)=0D +{=0D + vint8m2_t a =3D vlmul_trunc_i8m2(op1);=0D + return a;=0D +}=0D +=0D +vint8m4_t test_vlmul_trunc_v_i8m8_i8m4(vint8m8_t op1)=0D +{=0D + vint8m4_t a =3D vlmul_trunc_i8m4(op1);=0D + return a;=0D +}=0D +=0D +vint16mf4_t test_vlmul_trunc_v_i16mf2_i16mf4(vint16mf2_t op1)=0D +{=0D + vint16mf4_t a =3D vlmul_trunc_i16mf4(op1);=0D + return a;=0D +}=0D +=0D +vint16mf4_t test_vlmul_trunc_v_i16m1_i16mf4(vint16m1_t op1)=0D +{=0D + vint16mf4_t a =3D vlmul_trunc_i16mf4(op1);=0D + return a;=0D +}=0D +=0D +vint16mf2_t test_vlmul_trunc_v_i16m1_i16mf2(vint16m1_t op1)=0D +{=0D + vint16mf2_t a =3D vlmul_trunc_i16mf2(op1);=0D + return a;=0D +}=0D +=0D +vint16mf4_t test_vlmul_trunc_v_i16m2_i16mf4(vint16m2_t op1)=0D +{=0D + vint16mf4_t a =3D vlmul_trunc_i16mf4(op1);=0D + return a;=0D +}=0D +=0D +vint16mf2_t test_vlmul_trunc_v_i16m2_i16mf2(vint16m2_t op1)=0D +{=0D + vint16mf2_t a =3D vlmul_trunc_i16mf2(op1);=0D + return a;=0D +}=0D +=0D +vint16m1_t test_vlmul_trunc_v_i16m2_i16m1(vint16m2_t op1)=0D +{=0D + vint16m1_t a =3D vlmul_trunc_i16m1(op1);=0D + return a;=0D +}=0D +=0D +vint16mf4_t test_vlmul_trunc_v_i16m4_i16mf4(vint16m4_t op1)=0D +{=0D + vint16mf4_t a =3D vlmul_trunc_i16mf4(op1);=0D + return a;=0D +}=0D +=0D +vint16mf2_t test_vlmul_trunc_v_i16m4_i16mf2(vint16m4_t op1)=0D +{=0D + vint16mf2_t a =3D vlmul_trunc_i16mf2(op1);=0D + return a;=0D +}=0D +=0D +vint16m1_t test_vlmul_trunc_v_i16m4_i16m1(vint16m4_t op1)=0D +{=0D + vint16m1_t a =3D vlmul_trunc_i16m1(op1);=0D + return a;=0D +}=0D +=0D +vint16m2_t test_vlmul_trunc_v_i16m4_i16m2(vint16m4_t op1)=0D +{=0D + vint16m2_t a =3D vlmul_trunc_i16m2(op1);=0D + return a;=0D +}=0D +=0D +vint16mf4_t test_vlmul_trunc_v_i16m8_i16mf4(vint16m8_t op1)=0D +{=0D + vint16mf4_t a =3D vlmul_trunc_i16mf4(op1);=0D + return a;=0D +}=0D +=0D +vint16mf2_t 
test_vlmul_trunc_v_i16m8_i16mf2(vint16m8_t op1)=0D +{=0D + vint16mf2_t a =3D vlmul_trunc_i16mf2(op1);=0D + return a;=0D +}=0D +=0D +vint16m1_t test_vlmul_trunc_v_i16m8_i16m1(vint16m8_t op1)=0D +{=0D + vint16m1_t a =3D vlmul_trunc_i16m1(op1);=0D + return a;=0D +}=0D +=0D +vint16m2_t test_vlmul_trunc_v_i16m8_i16m2(vint16m8_t op1)=0D +{=0D + vint16m2_t a =3D vlmul_trunc_i16m2(op1);=0D + return a;=0D +}=0D +=0D +vint16m4_t test_vlmul_trunc_v_i16m8_i16m4(vint16m8_t op1)=0D +{=0D + vint16m4_t a =3D vlmul_trunc_i16m4(op1);=0D + return a;=0D +}=0D +=0D +vint32mf2_t test_vlmul_trunc_v_i32m1_i32mf2(vint32m1_t op1)=0D +{=0D + vint32mf2_t a =3D vlmul_trunc_i32mf2(op1);=0D + return a;=0D +}=0D +=0D +vint32mf2_t test_vlmul_trunc_v_i32m2_i32mf2(vint32m2_t op1)=0D +{=0D + vint32mf2_t a =3D vlmul_trunc_i32mf2(op1);=0D + return a;=0D +}=0D +=0D +vint32m1_t test_vlmul_trunc_v_i32m2_i32m1(vint32m2_t op1)=0D +{=0D + vint32m1_t a =3D vlmul_trunc_i32m1(op1);=0D + return a;=0D +}=0D +=0D +vint32mf2_t test_vlmul_trunc_v_i32m4_i32mf2(vint32m4_t op1)=0D +{=0D + vint32mf2_t a =3D vlmul_trunc_i32mf2(op1);=0D + return a;=0D +}=0D +=0D +vint32m1_t test_vlmul_trunc_v_i32m4_i32m1(vint32m4_t op1)=0D +{=0D + vint32m1_t a =3D vlmul_trunc_i32m1(op1);=0D + return a;=0D +}=0D +=0D +vint32m2_t test_vlmul_trunc_v_i32m4_i32m2(vint32m4_t op1)=0D +{=0D + vint32m2_t a =3D vlmul_trunc_i32m2(op1);=0D + return a;=0D +}=0D +=0D +vint32mf2_t test_vlmul_trunc_v_i32m8_i32mf2(vint32m8_t op1)=0D +{=0D + vint32mf2_t a =3D vlmul_trunc_i32mf2(op1);=0D + return a;=0D +}=0D +=0D +vint32m1_t test_vlmul_trunc_v_i32m8_i32m1(vint32m8_t op1)=0D +{=0D + vint32m1_t a =3D vlmul_trunc_i32m1(op1);=0D + return a;=0D +}=0D +=0D +vint32m2_t test_vlmul_trunc_v_i32m8_i32m2(vint32m8_t op1)=0D +{=0D + vint32m2_t a =3D vlmul_trunc_i32m2(op1);=0D + return a;=0D +}=0D +=0D +vint32m4_t test_vlmul_trunc_v_i32m8_i32m4(vint32m8_t op1)=0D +{=0D + vint32m4_t a =3D vlmul_trunc_i32m4(op1);=0D + return a;=0D +}=0D +=0D +vint64m1_t test_vlmul_trunc_v_i64m2_i64m1(vint64m2_t op1)=0D +{=0D + vint64m1_t a =3D vlmul_trunc_i64m1(op1);=0D + return a;=0D +}=0D +=0D +vint64m1_t test_vlmul_trunc_v_i64m4_i64m1(vint64m4_t op1)=0D +{=0D + vint64m1_t a =3D vlmul_trunc_i64m1(op1);=0D + return a;=0D +}=0D +=0D +vint64m2_t test_vlmul_trunc_v_i64m4_i64m2(vint64m4_t op1)=0D +{=0D + vint64m2_t a =3D vlmul_trunc_i64m2(op1);=0D + return a;=0D +}=0D +=0D +vint64m1_t test_vlmul_trunc_v_i64m8_i64m1(vint64m8_t op1)=0D +{=0D + vint64m1_t a =3D vlmul_trunc_i64m1(op1);=0D + return a;=0D +}=0D +=0D +vint64m2_t test_vlmul_trunc_v_i64m8_i64m2(vint64m8_t op1)=0D +{=0D + vint64m2_t a =3D vlmul_trunc_i64m2(op1);=0D + return a;=0D +}=0D +=0D +vint64m4_t test_vlmul_trunc_v_i64m8_i64m4(vint64m8_t op1)=0D +{=0D + vint64m4_t a =3D vlmul_trunc_i64m4(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf8_t test_vlmul_trunc_v_u8mf4_u8mf8(vuint8mf4_t op1)=0D +{=0D + vuint8mf8_t a =3D vlmul_trunc_u8mf8(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf8_t test_vlmul_trunc_v_u8mf2_u8mf8(vuint8mf2_t op1)=0D +{=0D + vuint8mf8_t a =3D vlmul_trunc_u8mf8(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf4_t test_vlmul_trunc_v_u8mf2_u8mf4(vuint8mf2_t op1)=0D +{=0D + vuint8mf4_t a =3D vlmul_trunc_u8mf4(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf8_t test_vlmul_trunc_v_u8m1_u8mf8(vuint8m1_t op1)=0D +{=0D + vuint8mf8_t a =3D vlmul_trunc_u8mf8(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf4_t test_vlmul_trunc_v_u8m1_u8mf4(vuint8m1_t op1)=0D +{=0D + vuint8mf4_t a =3D vlmul_trunc_u8mf4(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf2_t 
test_vlmul_trunc_v_u8m1_u8mf2(vuint8m1_t op1)=0D +{=0D + vuint8mf2_t a =3D vlmul_trunc_u8mf2(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf8_t test_vlmul_trunc_v_u8m2_u8mf8(vuint8m2_t op1)=0D +{=0D + vuint8mf8_t a =3D vlmul_trunc_u8mf8(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf4_t test_vlmul_trunc_v_u8m2_u8mf4(vuint8m2_t op1)=0D +{=0D + vuint8mf4_t a =3D vlmul_trunc_u8mf4(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf2_t test_vlmul_trunc_v_u8m2_u8mf2(vuint8m2_t op1)=0D +{=0D + vuint8mf2_t a =3D vlmul_trunc_u8mf2(op1);=0D + return a;=0D +}=0D +=0D +vuint8m1_t test_vlmul_trunc_v_u8m2_u8m1(vuint8m2_t op1)=0D +{=0D + vuint8m1_t a =3D vlmul_trunc_u8m1(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf8_t test_vlmul_trunc_v_u8m4_u8mf8(vuint8m4_t op1)=0D +{=0D + vuint8mf8_t a =3D vlmul_trunc_u8mf8(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf4_t test_vlmul_trunc_v_u8m4_u8mf4(vuint8m4_t op1)=0D +{=0D + vuint8mf4_t a =3D vlmul_trunc_u8mf4(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf2_t test_vlmul_trunc_v_u8m4_u8mf2(vuint8m4_t op1)=0D +{=0D + vuint8mf2_t a =3D vlmul_trunc_u8mf2(op1);=0D + return a;=0D +}=0D +=0D +vuint8m1_t test_vlmul_trunc_v_u8m4_u8m1(vuint8m4_t op1)=0D +{=0D + vuint8m1_t a =3D vlmul_trunc_u8m1(op1);=0D + return a;=0D +}=0D +=0D +vuint8m2_t test_vlmul_trunc_v_u8m4_u8m2(vuint8m4_t op1)=0D +{=0D + vuint8m2_t a =3D vlmul_trunc_u8m2(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf8_t test_vlmul_trunc_v_u8m8_u8mf8(vuint8m8_t op1)=0D +{=0D + vuint8mf8_t a =3D vlmul_trunc_u8mf8(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf4_t test_vlmul_trunc_v_u8m8_u8mf4(vuint8m8_t op1)=0D +{=0D + vuint8mf4_t a =3D vlmul_trunc_u8mf4(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf2_t test_vlmul_trunc_v_u8m8_u8mf2(vuint8m8_t op1)=0D +{=0D + vuint8mf2_t a =3D vlmul_trunc_u8mf2(op1);=0D + return a;=0D +}=0D +=0D +vuint8m1_t test_vlmul_trunc_v_u8m8_u8m1(vuint8m8_t op1)=0D +{=0D + vuint8m1_t a =3D vlmul_trunc_u8m1(op1);=0D + return a;=0D +}=0D +=0D +vuint8m2_t test_vlmul_trunc_v_u8m8_u8m2(vuint8m8_t op1)=0D +{=0D + vuint8m2_t a =3D vlmul_trunc_u8m2(op1);=0D + return a;=0D +}=0D +=0D +vuint8m4_t test_vlmul_trunc_v_u8m8_u8m4(vuint8m8_t op1)=0D +{=0D + vuint8m4_t a =3D vlmul_trunc_u8m4(op1);=0D + return a;=0D +}=0D +=0D +vuint16mf4_t test_vlmul_trunc_v_u16mf2_u16mf4(vuint16mf2_t op1)=0D +{=0D + vuint16mf4_t a =3D vlmul_trunc_u16mf4(op1);=0D + return a;=0D +}=0D +=0D +vuint16mf4_t test_vlmul_trunc_v_u16m1_u16mf4(vuint16m1_t op1)=0D +{=0D + vuint16mf4_t a =3D vlmul_trunc_u16mf4(op1);=0D + return a;=0D +}=0D +=0D +vuint16mf2_t test_vlmul_trunc_v_u16m1_u16mf2(vuint16m1_t op1)=0D +{=0D + vuint16mf2_t a =3D vlmul_trunc_u16mf2(op1);=0D + return a;=0D +}=0D +=0D +vuint16mf4_t test_vlmul_trunc_v_u16m2_u16mf4(vuint16m2_t op1)=0D +{=0D + vuint16mf4_t a =3D vlmul_trunc_u16mf4(op1);=0D + return a;=0D +}=0D +=0D +vuint16mf2_t test_vlmul_trunc_v_u16m2_u16mf2(vuint16m2_t op1)=0D +{=0D + vuint16mf2_t a =3D vlmul_trunc_u16mf2(op1);=0D + return a;=0D +}=0D +=0D +vuint16m1_t test_vlmul_trunc_v_u16m2_u16m1(vuint16m2_t op1)=0D +{=0D + vuint16m1_t a =3D vlmul_trunc_u16m1(op1);=0D + return a;=0D +}=0D +=0D +vuint16mf4_t test_vlmul_trunc_v_u16m4_u16mf4(vuint16m4_t op1)=0D +{=0D + vuint16mf4_t a =3D vlmul_trunc_u16mf4(op1);=0D + return a;=0D +}=0D +=0D +vuint16mf2_t test_vlmul_trunc_v_u16m4_u16mf2(vuint16m4_t op1)=0D +{=0D + vuint16mf2_t a =3D vlmul_trunc_u16mf2(op1);=0D + return a;=0D +}=0D +=0D +vuint16m1_t test_vlmul_trunc_v_u16m4_u16m1(vuint16m4_t op1)=0D +{=0D + vuint16m1_t a =3D vlmul_trunc_u16m1(op1);=0D + return a;=0D +}=0D +=0D +vuint16m2_t 
test_vlmul_trunc_v_u16m4_u16m2(vuint16m4_t op1)=0D +{=0D + vuint16m2_t a =3D vlmul_trunc_u16m2(op1);=0D + return a;=0D +}=0D +=0D +vuint16mf4_t test_vlmul_trunc_v_u16m8_u16mf4(vuint16m8_t op1)=0D +{=0D + vuint16mf4_t a =3D vlmul_trunc_u16mf4(op1);=0D + return a;=0D +}=0D +=0D +vuint16mf2_t test_vlmul_trunc_v_u16m8_u16mf2(vuint16m8_t op1)=0D +{=0D + vuint16mf2_t a =3D vlmul_trunc_u16mf2(op1);=0D + return a;=0D +}=0D +=0D +vuint16m1_t test_vlmul_trunc_v_u16m8_u16m1(vuint16m8_t op1)=0D +{=0D + vuint16m1_t a =3D vlmul_trunc_u16m1(op1);=0D + return a;=0D +}=0D +=0D +vuint16m2_t test_vlmul_trunc_v_u16m8_u16m2(vuint16m8_t op1)=0D +{=0D + vuint16m2_t a =3D vlmul_trunc_u16m2(op1);=0D + return a;=0D +}=0D +=0D +vuint16m4_t test_vlmul_trunc_v_u16m8_u16m4(vuint16m8_t op1)=0D +{=0D + vuint16m4_t a =3D vlmul_trunc_u16m4(op1);=0D + return a;=0D +}=0D +=0D +vuint32mf2_t test_vlmul_trunc_v_u32m1_u32mf2(vuint32m1_t op1)=0D +{=0D + vuint32mf2_t a =3D vlmul_trunc_u32mf2(op1);=0D + return a;=0D +}=0D +=0D +vuint32mf2_t test_vlmul_trunc_v_u32m2_u32mf2(vuint32m2_t op1)=0D +{=0D + vuint32mf2_t a =3D vlmul_trunc_u32mf2(op1);=0D + return a;=0D +}=0D +=0D +vuint32m1_t test_vlmul_trunc_v_u32m2_u32m1(vuint32m2_t op1)=0D +{=0D + vuint32m1_t a =3D vlmul_trunc_u32m1(op1);=0D + return a;=0D +}=0D +=0D +vuint32mf2_t test_vlmul_trunc_v_u32m4_u32mf2(vuint32m4_t op1)=0D +{=0D + vuint32mf2_t a =3D vlmul_trunc_u32mf2(op1);=0D + return a;=0D +}=0D +=0D +vuint32m1_t test_vlmul_trunc_v_u32m4_u32m1(vuint32m4_t op1)=0D +{=0D + vuint32m1_t a =3D vlmul_trunc_u32m1(op1);=0D + return a;=0D +}=0D +=0D +vuint32m2_t test_vlmul_trunc_v_u32m4_u32m2(vuint32m4_t op1)=0D +{=0D + vuint32m2_t a =3D vlmul_trunc_u32m2(op1);=0D + return a;=0D +}=0D +=0D +vuint32mf2_t test_vlmul_trunc_v_u32m8_u32mf2(vuint32m8_t op1)=0D +{=0D + vuint32mf2_t a =3D vlmul_trunc_u32mf2(op1);=0D + return a;=0D +}=0D +=0D +vuint32m1_t test_vlmul_trunc_v_u32m8_u32m1(vuint32m8_t op1)=0D +{=0D + vuint32m1_t a =3D vlmul_trunc_u32m1(op1);=0D + return a;=0D +}=0D +=0D +vuint32m2_t test_vlmul_trunc_v_u32m8_u32m2(vuint32m8_t op1)=0D +{=0D + vuint32m2_t a =3D vlmul_trunc_u32m2(op1);=0D + return a;=0D +}=0D +=0D +vuint32m4_t test_vlmul_trunc_v_u32m8_u32m4(vuint32m8_t op1)=0D +{=0D + vuint32m4_t a =3D vlmul_trunc_u32m4(op1);=0D + return a;=0D +}=0D +=0D +vuint64m1_t test_vlmul_trunc_v_u64m2_u64m1(vuint64m2_t op1)=0D +{=0D + vuint64m1_t a =3D vlmul_trunc_u64m1(op1);=0D + return a;=0D +}=0D +=0D +vuint64m1_t test_vlmul_trunc_v_u64m4_u64m1(vuint64m4_t op1)=0D +{=0D + vuint64m1_t a =3D vlmul_trunc_u64m1(op1);=0D + return a;=0D +}=0D +=0D +vuint64m2_t test_vlmul_trunc_v_u64m4_u64m2(vuint64m4_t op1)=0D +{=0D + vuint64m2_t a =3D vlmul_trunc_u64m2(op1);=0D + return a;=0D +}=0D +=0D +vuint64m1_t test_vlmul_trunc_v_u64m8_u64m1(vuint64m8_t op1)=0D +{=0D + vuint64m1_t a =3D vlmul_trunc_u64m1(op1);=0D + return a;=0D +}=0D +=0D +vuint64m2_t test_vlmul_trunc_v_u64m8_u64m2(vuint64m8_t op1)=0D +{=0D + vuint64m2_t a =3D vlmul_trunc_u64m2(op1);=0D + return a;=0D +}=0D +=0D +vuint64m4_t test_vlmul_trunc_v_u64m8_u64m4(vuint64m8_t op1)=0D +{=0D + vuint64m4_t a =3D vlmul_trunc_u64m4(op1);=0D + return a;=0D +}=0D +=0D +vfloat32mf2_t test_vlmul_trunc_v_f32m1_f32mf2(vfloat32m1_t op1)=0D +{=0D + vfloat32mf2_t a =3D vlmul_trunc_f32mf2(op1);=0D + return a;=0D +}=0D +=0D +vfloat32mf2_t test_vlmul_trunc_v_f32m2_f32mf2(vfloat32m2_t op1)=0D +{=0D + vfloat32mf2_t a =3D vlmul_trunc_f32mf2(op1);=0D + return a;=0D +}=0D +=0D +vfloat32m1_t test_vlmul_trunc_v_f32m2_f32m1(vfloat32m2_t op1)=0D +{=0D + vfloat32m1_t a =3D 
vlmul_trunc_f32m1(op1);
+  return a;
+}
+
+vfloat32mf2_t test_vlmul_trunc_v_f32m4_f32mf2(vfloat32m4_t op1)
+{
+  vfloat32mf2_t a = vlmul_trunc_f32mf2(op1);
+  return a;
+}
+
+vfloat32m1_t test_vlmul_trunc_v_f32m4_f32m1(vfloat32m4_t op1)
+{
+  vfloat32m1_t a = vlmul_trunc_f32m1(op1);
+  return a;
+}
+
+vfloat32m2_t test_vlmul_trunc_v_f32m4_f32m2(vfloat32m4_t op1)
+{
+  vfloat32m2_t a = vlmul_trunc_f32m2(op1);
+  return a;
+}
+
+vfloat32mf2_t test_vlmul_trunc_v_f32m8_f32mf2(vfloat32m8_t op1)
+{
+  vfloat32mf2_t a = vlmul_trunc_f32mf2(op1);
+  return a;
+}
+
+vfloat32m1_t test_vlmul_trunc_v_f32m8_f32m1(vfloat32m8_t op1)
+{
+  vfloat32m1_t a = vlmul_trunc_f32m1(op1);
+  return a;
+}
+
+vfloat32m2_t test_vlmul_trunc_v_f32m8_f32m2(vfloat32m8_t op1)
+{
+  vfloat32m2_t a = vlmul_trunc_f32m2(op1);
+  return a;
+}
+
+vfloat32m4_t test_vlmul_trunc_v_f32m8_f32m4(vfloat32m8_t op1)
+{
+  vfloat32m4_t a = vlmul_trunc_f32m4(op1);
+  return a;
+}
+
+vfloat64m1_t test_vlmul_trunc_v_f64m2_f64m1(vfloat64m2_t op1)
+{
+  vfloat64m1_t a = vlmul_trunc_f64m1(op1);
+  return a;
+}
+
+vfloat64m1_t test_vlmul_trunc_v_f64m4_f64m1(vfloat64m4_t op1)
+{
+  vfloat64m1_t a = vlmul_trunc_f64m1(op1);
+  return a;
+}
+
+vfloat64m2_t test_vlmul_trunc_v_f64m4_f64m2(vfloat64m4_t op1)
+{
+  vfloat64m2_t a = vlmul_trunc_f64m2(op1);
+  return a;
+}
+
+vfloat64m1_t test_vlmul_trunc_v_f64m8_f64m1(vfloat64m8_t op1)
+{
+  vfloat64m1_t a = vlmul_trunc_f64m1(op1);
+  return a;
+}
+
+vfloat64m2_t test_vlmul_trunc_v_f64m8_f64m2(vfloat64m8_t op1)
+{
+  vfloat64m2_t a = vlmul_trunc_f64m2(op1);
+  return a;
+}
+
+vfloat64m4_t test_vlmul_trunc_v_f64m8_f64m4(vfloat64m8_t op1)
+{
+  vfloat64m4_t a = vlmul_trunc_f64m4(op1);
+  return a;
+}
\ No newline at end of file
diff --git a/gcc/testsuite/g++.target/riscv/rvv/rvv-intrinsic.exp b/gcc/testsuite/g++.target/riscv/rvv/rvv-intrinsic.exp
new file mode 100644
index 00000000000..c8db25f0fbd
--- /dev/null
+++ b/gcc/testsuite/g++.target/riscv/rvv/rvv-intrinsic.exp
@@ -0,0 +1,39 @@
+# Copyright (C) 2022 Free Software Foundation, Inc.
+
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation; either version 3 of the License, or
+# (at your option) any later version.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with GCC; see the file COPYING3.  If not see
+# <http://www.gnu.org/licenses/>.
+
+# GCC testsuite that uses the `dg.exp' driver.
+
+# Test the front end for C++.
+# We don't need to test back-end code generation on RV32 systems for
+# C++, because it is already tested in C.
+# Exit immediately if this isn't a RISC-V target.
+if ![istarget riscv64*-*-*] then {
+  return
+}
+
+# Load support procs.
+load_lib g++-dg.exp
+
+# Initialize `dg'.
+dg-init
+
+# Main loop.
+set CFLAGS "-march=rv64gcv_zfh -O3"
+dg-runtest [lsort [glob -nocomplain $srcdir/$subdir/*.C]] \
+  "" $CFLAGS
+
+# All done.
+dg-finish
\ No newline at end of file
diff --git a/gcc/testsuite/gcc.target/riscv/rvv/intrinsic/misc_func.c b/gcc/testsuite/gcc.target/riscv/rvv/intrinsic/misc_func.c
new file mode 100644
index 00000000000..387da4205f9
--- /dev/null
+++ b/gcc/testsuite/gcc.target/riscv/rvv/intrinsic/misc_func.c
@@ -0,0 +1,2921 @@
+
+/* { dg-do compile } */
+/* { dg-skip-if "test vector intrinsic" { *-*-* } { "*" } { "-march=rv*v*" } } */
+
+#include <stdint.h>
+#include <riscv_vector.h>
+
+
+/* Reinterpret between different types under the same SEW and LMUL */
+
+vuint8mf8_t test_vreinterpret_v_i8mf8_u8mf8(vint8mf8_t src)
+{
+  vuint8mf8_t a = vreinterpret_v_i8mf8_u8mf8(src);
+  return a;
+}
+
+vuint8mf4_t test_vreinterpret_v_i8mf4_u8mf4(vint8mf4_t src)
+{
+  vuint8mf4_t a = vreinterpret_v_i8mf4_u8mf4(src);
+  return a;
+}
+
+vuint8mf2_t test_vreinterpret_v_i8mf2_u8mf2(vint8mf2_t src)
+{
+  vuint8mf2_t a = vreinterpret_v_i8mf2_u8mf2(src);
+  return a;
+}
+
+vuint8m1_t test_vreinterpret_v_i8m1_u8m1(vint8m1_t src)
+{
+  vuint8m1_t a = vreinterpret_v_i8m1_u8m1(src);
+  return a;
+}
+
+vuint8m2_t test_vreinterpret_v_i8m2_u8m2(vint8m2_t src)
+{
+  vuint8m2_t a = vreinterpret_v_i8m2_u8m2(src);
+  return a;
+}
+
+vuint8m4_t test_vreinterpret_v_i8m4_u8m4(vint8m4_t src)
+{
+  vuint8m4_t a = vreinterpret_v_i8m4_u8m4(src);
+  return a;
+}
+
+vuint8m8_t test_vreinterpret_v_i8m8_u8m8(vint8m8_t src)
+{
+  vuint8m8_t a = vreinterpret_v_i8m8_u8m8(src);
+  return a;
+}
+
+vint8mf8_t test_vreinterpret_v_u8mf8_i8mf8(vuint8mf8_t src)
+{
+  vint8mf8_t a = vreinterpret_v_u8mf8_i8mf8(src);
+  return a;
+}
+
+vint8mf4_t test_vreinterpret_v_u8mf4_i8mf4(vuint8mf4_t src)
+{
+  vint8mf4_t a = vreinterpret_v_u8mf4_i8mf4(src);
+  return a;
+}
+
+vint8mf2_t test_vreinterpret_v_u8mf2_i8mf2(vuint8mf2_t src)
+{
+  vint8mf2_t a = vreinterpret_v_u8mf2_i8mf2(src);
+  return a;
+}
+
+vint8m1_t test_vreinterpret_v_u8m1_i8m1(vuint8m1_t src)
+{
+  vint8m1_t a = vreinterpret_v_u8m1_i8m1(src);
+  return a;
+}
+
+vint8m2_t test_vreinterpret_v_u8m2_i8m2(vuint8m2_t src)
+{
+  vint8m2_t a = vreinterpret_v_u8m2_i8m2(src);
+  return a;
+}
+
+vint8m4_t test_vreinterpret_v_u8m4_i8m4(vuint8m4_t src)
+{
+  vint8m4_t a = vreinterpret_v_u8m4_i8m4(src);
+  return a;
+}
+
+vint8m8_t test_vreinterpret_v_u8m8_i8m8(vuint8m8_t src)
+{
+  vint8m8_t a = vreinterpret_v_u8m8_i8m8(src);
+  return a;
+}
+
+vuint16mf4_t test_vreinterpret_v_i16mf4_u16mf4(vint16mf4_t src)
+{
+  vuint16mf4_t a = vreinterpret_v_i16mf4_u16mf4(src);
+  return a;
+}
+
+vuint16mf2_t test_vreinterpret_v_i16mf2_u16mf2(vint16mf2_t src)
+{
+  vuint16mf2_t a = vreinterpret_v_i16mf2_u16mf2(src);
+  return a;
+}
+
+vuint16m1_t test_vreinterpret_v_i16m1_u16m1(vint16m1_t src)
+{
+  vuint16m1_t a = vreinterpret_v_i16m1_u16m1(src);
+  return a;
+}
+
+vuint16m2_t test_vreinterpret_v_i16m2_u16m2(vint16m2_t src)
+{
+  vuint16m2_t a = vreinterpret_v_i16m2_u16m2(src);
+  return a;
+}
+
+vuint16m4_t test_vreinterpret_v_i16m4_u16m4(vint16m4_t src)
+{
+  vuint16m4_t a = vreinterpret_v_i16m4_u16m4(src);
+  return a;
+}
+
+vuint16m8_t test_vreinterpret_v_i16m8_u16m8(vint16m8_t src)
+{
+  vuint16m8_t a = vreinterpret_v_i16m8_u16m8(src);
+  return a;
+}
+
+vint16mf4_t test_vreinterpret_v_u16mf4_i16mf4(vuint16mf4_t src)
+{
+  vint16mf4_t a = vreinterpret_v_u16mf4_i16mf4(src);
+  return a;
+}
+
+vint16mf2_t test_vreinterpret_v_u16mf2_i16mf2(vuint16mf2_t src)
+{
+  vint16mf2_t a = vreinterpret_v_u16mf2_i16mf2(src);
+  return a;
+}
+
+vint16m1_t test_vreinterpret_v_u16m1_i16m1(vuint16m1_t src)
+{
+  vint16m1_t a = vreinterpret_v_u16m1_i16m1(src);
+  return a;
+}
+
+vint16m2_t test_vreinterpret_v_u16m2_i16m2(vuint16m2_t src)
+{
+  vint16m2_t a = vreinterpret_v_u16m2_i16m2(src);
+  return a;
+}
+
+vint16m4_t test_vreinterpret_v_u16m4_i16m4(vuint16m4_t src)
+{
+  vint16m4_t a = vreinterpret_v_u16m4_i16m4(src);
+  return a;
+}
+
+vint16m8_t test_vreinterpret_v_u16m8_i16m8(vuint16m8_t src)
+{
+  vint16m8_t a = vreinterpret_v_u16m8_i16m8(src);
+  return a;
+}
+
+vuint32mf2_t test_vreinterpret_v_i32mf2_u32mf2(vint32mf2_t src)
+{
+  vuint32mf2_t a = vreinterpret_v_i32mf2_u32mf2(src);
+  return a;
+}
+
+vuint32m1_t test_vreinterpret_v_i32m1_u32m1(vint32m1_t src)
+{
+  vuint32m1_t a = vreinterpret_v_i32m1_u32m1(src);
+  return a;
+}
+
+vuint32m2_t test_vreinterpret_v_i32m2_u32m2(vint32m2_t src)
+{
+  vuint32m2_t a = vreinterpret_v_i32m2_u32m2(src);
+  return a;
+}
+
+vuint32m4_t test_vreinterpret_v_i32m4_u32m4(vint32m4_t src)
+{
+  vuint32m4_t a = vreinterpret_v_i32m4_u32m4(src);
+  return a;
+}
+
+vuint32m8_t test_vreinterpret_v_i32m8_u32m8(vint32m8_t src)
+{
+  vuint32m8_t a = vreinterpret_v_i32m8_u32m8(src);
+  return a;
+}
+
+vfloat32mf2_t test_vreinterpret_v_i32mf2_f32mf2(vint32mf2_t src)
+{
+  vfloat32mf2_t a = vreinterpret_v_i32mf2_f32mf2(src);
+  return a;
+}
+
+vfloat32m1_t test_vreinterpret_v_i32m1_f32m1(vint32m1_t src)
+{
+  vfloat32m1_t a = vreinterpret_v_i32m1_f32m1(src);
+  return a;
+}
+
+vfloat32m2_t test_vreinterpret_v_i32m2_f32m2(vint32m2_t src)
+{
+  vfloat32m2_t a = vreinterpret_v_i32m2_f32m2(src);
+  return a;
+}
+
+vfloat32m4_t test_vreinterpret_v_i32m4_f32m4(vint32m4_t src)
+{
+  vfloat32m4_t a = vreinterpret_v_i32m4_f32m4(src);
+  return a;
+}
+
+vfloat32m8_t test_vreinterpret_v_i32m8_f32m8(vint32m8_t src)
+{
+  vfloat32m8_t a = vreinterpret_v_i32m8_f32m8(src);
+  return a;
+}
+
+vint32mf2_t test_vreinterpret_v_u32mf2_i32mf2(vuint32mf2_t src)
+{
+  vint32mf2_t a = vreinterpret_v_u32mf2_i32mf2(src);
+  return a;
+}
+
+vint32m1_t test_vreinterpret_v_u32m1_i32m1(vuint32m1_t src)
+{
+  vint32m1_t a = vreinterpret_v_u32m1_i32m1(src);
+  return a;
+}
+
+vint32m2_t test_vreinterpret_v_u32m2_i32m2(vuint32m2_t src)
+{
+  vint32m2_t a = vreinterpret_v_u32m2_i32m2(src);
+  return a;
+}
+
+vint32m4_t test_vreinterpret_v_u32m4_i32m4(vuint32m4_t src)=0D +{=0D + vint32m4_t a =3D vreinterpret_v_u32m4_i32m4(src);=0D + return a;=0D +}=0D +=0D +vint32m8_t test_vreinterpret_v_u32m8_i32m8(vuint32m8_t src)=0D +{=0D + vint32m8_t a =3D vreinterpret_v_u32m8_i32m8(src);=0D + return a;=0D +}=0D +=0D +vfloat32mf2_t test_vreinterpret_v_u32mf2_f32mf2(vuint32mf2_t src)=0D +{=0D + vfloat32mf2_t a =3D vreinterpret_v_u32mf2_f32mf2(src);=0D + return a;=0D +}=0D +=0D +vfloat32m1_t test_vreinterpret_v_u32m1_f32m1(vuint32m1_t src)=0D +{=0D + vfloat32m1_t a =3D vreinterpret_v_u32m1_f32m1(src);=0D + return a;=0D +}=0D +=0D +vfloat32m2_t test_vreinterpret_v_u32m2_f32m2(vuint32m2_t src)=0D +{=0D + vfloat32m2_t a =3D vreinterpret_v_u32m2_f32m2(src);=0D + return a;=0D +}=0D +=0D +vfloat32m4_t test_vreinterpret_v_u32m4_f32m4(vuint32m4_t src)=0D +{=0D + vfloat32m4_t a =3D vreinterpret_v_u32m4_f32m4(src);=0D + return a;=0D +}=0D +=0D +vfloat32m8_t test_vreinterpret_v_u32m8_f32m8(vuint32m8_t src)=0D +{=0D + vfloat32m8_t a =3D vreinterpret_v_u32m8_f32m8(src);=0D + return a;=0D +}=0D +=0D +vint32mf2_t test_vreinterpret_v_f32mf2_i32mf2(vfloat32mf2_t src)=0D +{=0D + vint32mf2_t a =3D vreinterpret_v_f32mf2_i32mf2(src);=0D + return a;=0D +}=0D +=0D +vint32m1_t test_vreinterpret_v_f32m1_i32m1(vfloat32m1_t src)=0D +{=0D + vint32m1_t a =3D vreinterpret_v_f32m1_i32m1(src);=0D + return a;=0D +}=0D +=0D +vint32m2_t test_vreinterpret_v_f32m2_i32m2(vfloat32m2_t src)=0D +{=0D + vint32m2_t a =3D vreinterpret_v_f32m2_i32m2(src);=0D + return a;=0D +}=0D +=0D +vint32m4_t test_vreinterpret_v_f32m4_i32m4(vfloat32m4_t src)=0D +{=0D + vint32m4_t a =3D vreinterpret_v_f32m4_i32m4(src);=0D + return a;=0D +}=0D +=0D +vint32m8_t test_vreinterpret_v_f32m8_i32m8(vfloat32m8_t src)=0D +{=0D + vint32m8_t a =3D vreinterpret_v_f32m8_i32m8(src);=0D + return a;=0D +}=0D +=0D +vuint32mf2_t test_vreinterpret_v_f32mf2_u32mf2(vfloat32mf2_t src)=0D +{=0D + vuint32mf2_t a =3D vreinterpret_v_f32mf2_u32mf2(src);=0D + return a;=0D +}=0D +=0D +vuint32m1_t test_vreinterpret_v_f32m1_u32m1(vfloat32m1_t src)=0D +{=0D + vuint32m1_t a =3D vreinterpret_v_f32m1_u32m1(src);=0D + return a;=0D +}=0D +=0D +vuint32m2_t test_vreinterpret_v_f32m2_u32m2(vfloat32m2_t src)=0D +{=0D + vuint32m2_t a =3D vreinterpret_v_f32m2_u32m2(src);=0D + return a;=0D +}=0D +=0D +vuint32m4_t test_vreinterpret_v_f32m4_u32m4(vfloat32m4_t src)=0D +{=0D + vuint32m4_t a =3D vreinterpret_v_f32m4_u32m4(src);=0D + return a;=0D +}=0D +=0D +vuint32m8_t test_vreinterpret_v_f32m8_u32m8(vfloat32m8_t src)=0D +{=0D + vuint32m8_t a =3D vreinterpret_v_f32m8_u32m8(src);=0D + return a;=0D +}=0D +=0D +vuint64m1_t test_vreinterpret_v_i64m1_u64m1(vint64m1_t src)=0D +{=0D + vuint64m1_t a =3D vreinterpret_v_i64m1_u64m1(src);=0D + return a;=0D +}=0D +=0D +vuint64m2_t test_vreinterpret_v_i64m2_u64m2(vint64m2_t src)=0D +{=0D + vuint64m2_t a =3D vreinterpret_v_i64m2_u64m2(src);=0D + return a;=0D +}=0D +=0D +vuint64m4_t test_vreinterpret_v_i64m4_u64m4(vint64m4_t src)=0D +{=0D + vuint64m4_t a =3D vreinterpret_v_i64m4_u64m4(src);=0D + return a;=0D +}=0D +=0D +vuint64m8_t test_vreinterpret_v_i64m8_u64m8(vint64m8_t src)=0D +{=0D + vuint64m8_t a =3D vreinterpret_v_i64m8_u64m8(src);=0D + return a;=0D +}=0D +=0D +vfloat64m1_t test_vreinterpret_v_i64m1_f64m1(vint64m1_t src)=0D +{=0D + vfloat64m1_t a =3D vreinterpret_v_i64m1_f64m1(src);=0D + return a;=0D +}=0D +=0D +vfloat64m2_t test_vreinterpret_v_i64m2_f64m2(vint64m2_t src)=0D +{=0D + vfloat64m2_t a =3D vreinterpret_v_i64m2_f64m2(src);=0D + return a;=0D +}=0D +=0D 
+vfloat64m4_t test_vreinterpret_v_i64m4_f64m4(vint64m4_t src)=0D +{=0D + vfloat64m4_t a =3D vreinterpret_v_i64m4_f64m4(src);=0D + return a;=0D +}=0D +=0D +vfloat64m8_t test_vreinterpret_v_i64m8_f64m8(vint64m8_t src)=0D +{=0D + vfloat64m8_t a =3D vreinterpret_v_i64m8_f64m8(src);=0D + return a;=0D +}=0D +=0D +vint64m1_t test_vreinterpret_v_u64m1_i64m1(vuint64m1_t src)=0D +{=0D + vint64m1_t a =3D vreinterpret_v_u64m1_i64m1(src);=0D + return a;=0D +}=0D +=0D +vint64m2_t test_vreinterpret_v_u64m2_i64m2(vuint64m2_t src)=0D +{=0D + vint64m2_t a =3D vreinterpret_v_u64m2_i64m2(src);=0D + return a;=0D +}=0D +=0D +vint64m4_t test_vreinterpret_v_u64m4_i64m4(vuint64m4_t src)=0D +{=0D + vint64m4_t a =3D vreinterpret_v_u64m4_i64m4(src);=0D + return a;=0D +}=0D +=0D +vint64m8_t test_vreinterpret_v_u64m8_i64m8(vuint64m8_t src)=0D +{=0D + vint64m8_t a =3D vreinterpret_v_u64m8_i64m8(src);=0D + return a;=0D +}=0D +=0D +vfloat64m1_t test_vreinterpret_v_u64m1_f64m1(vuint64m1_t src)=0D +{=0D + vfloat64m1_t a =3D vreinterpret_v_u64m1_f64m1(src);=0D + return a;=0D +}=0D +=0D +vfloat64m2_t test_vreinterpret_v_u64m2_f64m2(vuint64m2_t src)=0D +{=0D + vfloat64m2_t a =3D vreinterpret_v_u64m2_f64m2(src);=0D + return a;=0D +}=0D +=0D +vfloat64m4_t test_vreinterpret_v_u64m4_f64m4(vuint64m4_t src)=0D +{=0D + vfloat64m4_t a =3D vreinterpret_v_u64m4_f64m4(src);=0D + return a;=0D +}=0D +=0D +vfloat64m8_t test_vreinterpret_v_u64m8_f64m8(vuint64m8_t src)=0D +{=0D + vfloat64m8_t a =3D vreinterpret_v_u64m8_f64m8(src);=0D + return a;=0D +}=0D +=0D +vint64m1_t test_vreinterpret_v_f64m1_i64m1(vfloat64m1_t src)=0D +{=0D + vint64m1_t a =3D vreinterpret_v_f64m1_i64m1(src);=0D + return a;=0D +}=0D +=0D +vint64m2_t test_vreinterpret_v_f64m2_i64m2(vfloat64m2_t src)=0D +{=0D + vint64m2_t a =3D vreinterpret_v_f64m2_i64m2(src);=0D + return a;=0D +}=0D +=0D +vint64m4_t test_vreinterpret_v_f64m4_i64m4(vfloat64m4_t src)=0D +{=0D + vint64m4_t a =3D vreinterpret_v_f64m4_i64m4(src);=0D + return a;=0D +}=0D +=0D +vint64m8_t test_vreinterpret_v_f64m8_i64m8(vfloat64m8_t src)=0D +{=0D + vint64m8_t a =3D vreinterpret_v_f64m8_i64m8(src);=0D + return a;=0D +}=0D +=0D +vuint64m1_t test_vreinterpret_v_f64m1_u64m1(vfloat64m1_t src)=0D +{=0D + vuint64m1_t a =3D vreinterpret_v_f64m1_u64m1(src);=0D + return a;=0D +}=0D +=0D +vuint64m2_t test_vreinterpret_v_f64m2_u64m2(vfloat64m2_t src)=0D +{=0D + vuint64m2_t a =3D vreinterpret_v_f64m2_u64m2(src);=0D + return a;=0D +}=0D +=0D +vuint64m4_t test_vreinterpret_v_f64m4_u64m4(vfloat64m4_t src)=0D +{=0D + vuint64m4_t a =3D vreinterpret_v_f64m4_u64m4(src);=0D + return a;=0D +}=0D +=0D +vuint64m8_t test_vreinterpret_v_f64m8_u64m8(vfloat64m8_t src)=0D +{=0D + vuint64m8_t a =3D vreinterpret_v_f64m8_u64m8(src);=0D + return a;=0D +}=0D +=0D +/* Reinterpret between different SEW under the same LMUL */=0D +=0D +vint16mf4_t test_vreinterpret_v_i8mf4_i16mf4(vint8mf4_t src)=0D +{=0D + vint16mf4_t a =3D vreinterpret_v_i8mf4_i16mf4(src);=0D + return a;=0D +}=0D +=0D +vint16mf2_t test_vreinterpret_v_i8mf2_i16mf2(vint8mf2_t src)=0D +{=0D + vint16mf2_t a =3D vreinterpret_v_i8mf2_i16mf2(src);=0D + return a;=0D +}=0D +=0D +vint16m1_t test_vreinterpret_v_i8m1_i16m1(vint8m1_t src)=0D +{=0D + vint16m1_t a =3D vreinterpret_v_i8m1_i16m1(src);=0D + return a;=0D +}=0D +=0D +vint16m2_t test_vreinterpret_v_i8m2_i16m2(vint8m2_t src)=0D +{=0D + vint16m2_t a =3D vreinterpret_v_i8m2_i16m2(src);=0D + return a;=0D +}=0D +=0D +vint16m4_t test_vreinterpret_v_i8m4_i16m4(vint8m4_t src)=0D +{=0D + vint16m4_t a =3D vreinterpret_v_i8m4_i16m4(src);=0D + 
return a;=0D +}=0D +=0D +vint16m8_t test_vreinterpret_v_i8m8_i16m8(vint8m8_t src)=0D +{=0D + vint16m8_t a =3D vreinterpret_v_i8m8_i16m8(src);=0D + return a;=0D +}=0D +=0D +vint32mf2_t test_vreinterpret_v_i8mf2_i32mf2(vint8mf2_t src)=0D +{=0D + vint32mf2_t a =3D vreinterpret_v_i8mf2_i32mf2(src);=0D + return a;=0D +}=0D +=0D +vint32m1_t test_vreinterpret_v_i8m1_i32m1(vint8m1_t src)=0D +{=0D + vint32m1_t a =3D vreinterpret_v_i8m1_i32m1(src);=0D + return a;=0D +}=0D +=0D +vint32m2_t test_vreinterpret_v_i8m2_i32m2(vint8m2_t src)=0D +{=0D + vint32m2_t a =3D vreinterpret_v_i8m2_i32m2(src);=0D + return a;=0D +}=0D +=0D +vint32m4_t test_vreinterpret_v_i8m4_i32m4(vint8m4_t src)=0D +{=0D + vint32m4_t a =3D vreinterpret_v_i8m4_i32m4(src);=0D + return a;=0D +}=0D +=0D +vint32m8_t test_vreinterpret_v_i8m8_i32m8(vint8m8_t src)=0D +{=0D + vint32m8_t a =3D vreinterpret_v_i8m8_i32m8(src);=0D + return a;=0D +}=0D +=0D +vint64m1_t test_vreinterpret_v_i8m1_i64m1(vint8m1_t src)=0D +{=0D + vint64m1_t a =3D vreinterpret_v_i8m1_i64m1(src);=0D + return a;=0D +}=0D +=0D +vint64m2_t test_vreinterpret_v_i8m2_i64m2(vint8m2_t src)=0D +{=0D + vint64m2_t a =3D vreinterpret_v_i8m2_i64m2(src);=0D + return a;=0D +}=0D +=0D +vint64m4_t test_vreinterpret_v_i8m4_i64m4(vint8m4_t src)=0D +{=0D + vint64m4_t a =3D vreinterpret_v_i8m4_i64m4(src);=0D + return a;=0D +}=0D +=0D +vint64m8_t test_vreinterpret_v_i8m8_i64m8(vint8m8_t src)=0D +{=0D + vint64m8_t a =3D vreinterpret_v_i8m8_i64m8(src);=0D + return a;=0D +}=0D +=0D +vint8mf4_t test_vreinterpret_v_i16mf4_i8mf4(vint16mf4_t src)=0D +{=0D + vint8mf4_t a =3D vreinterpret_v_i16mf4_i8mf4(src);=0D + return a;=0D +}=0D +=0D +vint8mf2_t test_vreinterpret_v_i16mf2_i8mf2(vint16mf2_t src)=0D +{=0D + vint8mf2_t a =3D vreinterpret_v_i16mf2_i8mf2(src);=0D + return a;=0D +}=0D +=0D +vint8m1_t test_vreinterpret_v_i16m1_i8m1(vint16m1_t src)=0D +{=0D + vint8m1_t a =3D vreinterpret_v_i16m1_i8m1(src);=0D + return a;=0D +}=0D +=0D +vint8m2_t test_vreinterpret_v_i16m2_i8m2(vint16m2_t src)=0D +{=0D + vint8m2_t a =3D vreinterpret_v_i16m2_i8m2(src);=0D + return a;=0D +}=0D +=0D +vint8m4_t test_vreinterpret_v_i16m4_i8m4(vint16m4_t src)=0D +{=0D + vint8m4_t a =3D vreinterpret_v_i16m4_i8m4(src);=0D + return a;=0D +}=0D +=0D +vint8m8_t test_vreinterpret_v_i16m8_i8m8(vint16m8_t src)=0D +{=0D + vint8m8_t a =3D vreinterpret_v_i16m8_i8m8(src);=0D + return a;=0D +}=0D +=0D +vint32mf2_t test_vreinterpret_v_i16mf2_i32mf2(vint16mf2_t src)=0D +{=0D + vint32mf2_t a =3D vreinterpret_v_i16mf2_i32mf2(src);=0D + return a;=0D +}=0D +=0D +vint32m1_t test_vreinterpret_v_i16m1_i32m1(vint16m1_t src)=0D +{=0D + vint32m1_t a =3D vreinterpret_v_i16m1_i32m1(src);=0D + return a;=0D +}=0D +=0D +vint32m2_t test_vreinterpret_v_i16m2_i32m2(vint16m2_t src)=0D +{=0D + vint32m2_t a =3D vreinterpret_v_i16m2_i32m2(src);=0D + return a;=0D +}=0D +=0D +vint32m4_t test_vreinterpret_v_i16m4_i32m4(vint16m4_t src)=0D +{=0D + vint32m4_t a =3D vreinterpret_v_i16m4_i32m4(src);=0D + return a;=0D +}=0D +=0D +vint32m8_t test_vreinterpret_v_i16m8_i32m8(vint16m8_t src)=0D +{=0D + vint32m8_t a =3D vreinterpret_v_i16m8_i32m8(src);=0D + return a;=0D +}=0D +=0D +vint64m1_t test_vreinterpret_v_i16m1_i64m1(vint16m1_t src)=0D +{=0D + vint64m1_t a =3D vreinterpret_v_i16m1_i64m1(src);=0D + return a;=0D +}=0D +=0D +vint64m2_t test_vreinterpret_v_i16m2_i64m2(vint16m2_t src)=0D +{=0D + vint64m2_t a =3D vreinterpret_v_i16m2_i64m2(src);=0D + return a;=0D +}=0D +=0D +vint64m4_t test_vreinterpret_v_i16m4_i64m4(vint16m4_t src)=0D +{=0D + vint64m4_t a =3D 
vreinterpret_v_i16m4_i64m4(src);=0D + return a;=0D +}=0D +=0D +vint64m8_t test_vreinterpret_v_i16m8_i64m8(vint16m8_t src)=0D +{=0D + vint64m8_t a =3D vreinterpret_v_i16m8_i64m8(src);=0D + return a;=0D +}=0D +=0D +vint8mf2_t test_vreinterpret_v_i32mf2_i8mf2(vint32mf2_t src)=0D +{=0D + vint8mf2_t a =3D vreinterpret_v_i32mf2_i8mf2(src);=0D + return a;=0D +}=0D +=0D +vint8m1_t test_vreinterpret_v_i32m1_i8m1(vint32m1_t src)=0D +{=0D + vint8m1_t a =3D vreinterpret_v_i32m1_i8m1(src);=0D + return a;=0D +}=0D +=0D +vint8m2_t test_vreinterpret_v_i32m2_i8m2(vint32m2_t src)=0D +{=0D + vint8m2_t a =3D vreinterpret_v_i32m2_i8m2(src);=0D + return a;=0D +}=0D +=0D +vint8m4_t test_vreinterpret_v_i32m4_i8m4(vint32m4_t src)=0D +{=0D + vint8m4_t a =3D vreinterpret_v_i32m4_i8m4(src);=0D + return a;=0D +}=0D +=0D +vint8m8_t test_vreinterpret_v_i32m8_i8m8(vint32m8_t src)=0D +{=0D + vint8m8_t a =3D vreinterpret_v_i32m8_i8m8(src);=0D + return a;=0D +}=0D +=0D +vint16mf2_t test_vreinterpret_v_i32mf2_i16mf2(vint32mf2_t src)=0D +{=0D + vint16mf2_t a =3D vreinterpret_v_i32mf2_i16mf2(src);=0D + return a;=0D +}=0D +=0D +vint16m1_t test_vreinterpret_v_i32m1_i16m1(vint32m1_t src)=0D +{=0D + vint16m1_t a =3D vreinterpret_v_i32m1_i16m1(src);=0D + return a;=0D +}=0D +=0D +vint16m2_t test_vreinterpret_v_i32m2_i16m2(vint32m2_t src)=0D +{=0D + vint16m2_t a =3D vreinterpret_v_i32m2_i16m2(src);=0D + return a;=0D +}=0D +=0D +vint16m4_t test_vreinterpret_v_i32m4_i16m4(vint32m4_t src)=0D +{=0D + vint16m4_t a =3D vreinterpret_v_i32m4_i16m4(src);=0D + return a;=0D +}=0D +=0D +vint16m8_t test_vreinterpret_v_i32m8_i16m8(vint32m8_t src)=0D +{=0D + vint16m8_t a =3D vreinterpret_v_i32m8_i16m8(src);=0D + return a;=0D +}=0D +=0D +vint64m1_t test_vreinterpret_v_i32m1_i64m1(vint32m1_t src)=0D +{=0D + vint64m1_t a =3D vreinterpret_v_i32m1_i64m1(src);=0D + return a;=0D +}=0D +=0D +vint64m2_t test_vreinterpret_v_i32m2_i64m2(vint32m2_t src)=0D +{=0D + vint64m2_t a =3D vreinterpret_v_i32m2_i64m2(src);=0D + return a;=0D +}=0D +=0D +vint64m4_t test_vreinterpret_v_i32m4_i64m4(vint32m4_t src)=0D +{=0D + vint64m4_t a =3D vreinterpret_v_i32m4_i64m4(src);=0D + return a;=0D +}=0D +=0D +vint64m8_t test_vreinterpret_v_i32m8_i64m8(vint32m8_t src)=0D +{=0D + vint64m8_t a =3D vreinterpret_v_i32m8_i64m8(src);=0D + return a;=0D +}=0D +=0D +vint8m1_t test_vreinterpret_v_i64m1_i8m1(vint64m1_t src)=0D +{=0D + vint8m1_t a =3D vreinterpret_v_i64m1_i8m1(src);=0D + return a;=0D +}=0D +=0D +vint8m2_t test_vreinterpret_v_i64m2_i8m2(vint64m2_t src)=0D +{=0D + vint8m2_t a =3D vreinterpret_v_i64m2_i8m2(src);=0D + return a;=0D +}=0D +=0D +vint8m4_t test_vreinterpret_v_i64m4_i8m4(vint64m4_t src)=0D +{=0D + vint8m4_t a =3D vreinterpret_v_i64m4_i8m4(src);=0D + return a;=0D +}=0D +=0D +vint8m8_t test_vreinterpret_v_i64m8_i8m8(vint64m8_t src)=0D +{=0D + vint8m8_t a =3D vreinterpret_v_i64m8_i8m8(src);=0D + return a;=0D +}=0D +=0D +vint16m1_t test_vreinterpret_v_i64m1_i16m1(vint64m1_t src)=0D +{=0D + vint16m1_t a =3D vreinterpret_v_i64m1_i16m1(src);=0D + return a;=0D +}=0D +=0D +vint16m2_t test_vreinterpret_v_i64m2_i16m2(vint64m2_t src)=0D +{=0D + vint16m2_t a =3D vreinterpret_v_i64m2_i16m2(src);=0D + return a;=0D +}=0D +=0D +vint16m4_t test_vreinterpret_v_i64m4_i16m4(vint64m4_t src)=0D +{=0D + vint16m4_t a =3D vreinterpret_v_i64m4_i16m4(src);=0D + return a;=0D +}=0D +=0D +vint16m8_t test_vreinterpret_v_i64m8_i16m8(vint64m8_t src)=0D +{=0D + vint16m8_t a =3D vreinterpret_v_i64m8_i16m8(src);=0D + return a;=0D +}=0D +=0D +vint32m1_t test_vreinterpret_v_i64m1_i32m1(vint64m1_t src)=0D 
+{=0D + vint32m1_t a =3D vreinterpret_v_i64m1_i32m1(src);=0D + return a;=0D +}=0D +=0D +vint32m2_t test_vreinterpret_v_i64m2_i32m2(vint64m2_t src)=0D +{=0D + vint32m2_t a =3D vreinterpret_v_i64m2_i32m2(src);=0D + return a;=0D +}=0D +=0D +vint32m4_t test_vreinterpret_v_i64m4_i32m4(vint64m4_t src)=0D +{=0D + vint32m4_t a =3D vreinterpret_v_i64m4_i32m4(src);=0D + return a;=0D +}=0D +=0D +vint32m8_t test_vreinterpret_v_i64m8_i32m8(vint64m8_t src)=0D +{=0D + vint32m8_t a =3D vreinterpret_v_i64m8_i32m8(src);=0D + return a;=0D +}=0D +=0D +vuint16mf4_t test_vreinterpret_v_u8mf4_u16mf4(vuint8mf4_t src)=0D +{=0D + vuint16mf4_t a =3D vreinterpret_v_u8mf4_u16mf4(src);=0D + return a;=0D +}=0D +=0D +vuint16mf2_t test_vreinterpret_v_u8mf2_u16mf2(vuint8mf2_t src)=0D +{=0D + vuint16mf2_t a =3D vreinterpret_v_u8mf2_u16mf2(src);=0D + return a;=0D +}=0D +=0D +vuint16m1_t test_vreinterpret_v_u8m1_u16m1(vuint8m1_t src)=0D +{=0D + vuint16m1_t a =3D vreinterpret_v_u8m1_u16m1(src);=0D + return a;=0D +}=0D +=0D +vuint16m2_t test_vreinterpret_v_u8m2_u16m2(vuint8m2_t src)=0D +{=0D + vuint16m2_t a =3D vreinterpret_v_u8m2_u16m2(src);=0D + return a;=0D +}=0D +=0D +vuint16m4_t test_vreinterpret_v_u8m4_u16m4(vuint8m4_t src)=0D +{=0D + vuint16m4_t a =3D vreinterpret_v_u8m4_u16m4(src);=0D + return a;=0D +}=0D +=0D +vuint16m8_t test_vreinterpret_v_u8m8_u16m8(vuint8m8_t src)=0D +{=0D + vuint16m8_t a =3D vreinterpret_v_u8m8_u16m8(src);=0D + return a;=0D +}=0D +=0D +vuint32mf2_t test_vreinterpret_v_u8mf2_u32mf2(vuint8mf2_t src)=0D +{=0D + vuint32mf2_t a =3D vreinterpret_v_u8mf2_u32mf2(src);=0D + return a;=0D +}=0D +=0D +vuint32m1_t test_vreinterpret_v_u8m1_u32m1(vuint8m1_t src)=0D +{=0D + vuint32m1_t a =3D vreinterpret_v_u8m1_u32m1(src);=0D + return a;=0D +}=0D +=0D +vuint32m2_t test_vreinterpret_v_u8m2_u32m2(vuint8m2_t src)=0D +{=0D + vuint32m2_t a =3D vreinterpret_v_u8m2_u32m2(src);=0D + return a;=0D +}=0D +=0D +vuint32m4_t test_vreinterpret_v_u8m4_u32m4(vuint8m4_t src)=0D +{=0D + vuint32m4_t a =3D vreinterpret_v_u8m4_u32m4(src);=0D + return a;=0D +}=0D +=0D +vuint32m8_t test_vreinterpret_v_u8m8_u32m8(vuint8m8_t src)=0D +{=0D + vuint32m8_t a =3D vreinterpret_v_u8m8_u32m8(src);=0D + return a;=0D +}=0D +=0D +vuint64m1_t test_vreinterpret_v_u8m1_u64m1(vuint8m1_t src)=0D +{=0D + vuint64m1_t a =3D vreinterpret_v_u8m1_u64m1(src);=0D + return a;=0D +}=0D +=0D +vuint64m2_t test_vreinterpret_v_u8m2_u64m2(vuint8m2_t src)=0D +{=0D + vuint64m2_t a =3D vreinterpret_v_u8m2_u64m2(src);=0D + return a;=0D +}=0D +=0D +vuint64m4_t test_vreinterpret_v_u8m4_u64m4(vuint8m4_t src)=0D +{=0D + vuint64m4_t a =3D vreinterpret_v_u8m4_u64m4(src);=0D + return a;=0D +}=0D +=0D +vuint64m8_t test_vreinterpret_v_u8m8_u64m8(vuint8m8_t src)=0D +{=0D + vuint64m8_t a =3D vreinterpret_v_u8m8_u64m8(src);=0D + return a;=0D +}=0D +=0D +vuint8mf4_t test_vreinterpret_v_u16mf4_u8mf4(vuint16mf4_t src)=0D +{=0D + vuint8mf4_t a =3D vreinterpret_v_u16mf4_u8mf4(src);=0D + return a;=0D +}=0D +=0D +vuint8mf2_t test_vreinterpret_v_u16mf2_u8mf2(vuint16mf2_t src)=0D +{=0D + vuint8mf2_t a =3D vreinterpret_v_u16mf2_u8mf2(src);=0D + return a;=0D +}=0D +=0D +vuint8m1_t test_vreinterpret_v_u16m1_u8m1(vuint16m1_t src)=0D +{=0D + vuint8m1_t a =3D vreinterpret_v_u16m1_u8m1(src);=0D + return a;=0D +}=0D +=0D +vuint8m2_t test_vreinterpret_v_u16m2_u8m2(vuint16m2_t src)=0D +{=0D + vuint8m2_t a =3D vreinterpret_v_u16m2_u8m2(src);=0D + return a;=0D +}=0D +=0D +vuint8m4_t test_vreinterpret_v_u16m4_u8m4(vuint16m4_t src)=0D +{=0D + vuint8m4_t a =3D vreinterpret_v_u16m4_u8m4(src);=0D + return 
a;=0D +}=0D +=0D +vuint8m8_t test_vreinterpret_v_u16m8_u8m8(vuint16m8_t src)=0D +{=0D + vuint8m8_t a =3D vreinterpret_v_u16m8_u8m8(src);=0D + return a;=0D +}=0D +=0D +vuint32mf2_t test_vreinterpret_v_u16mf2_u32mf2(vuint16mf2_t src)=0D +{=0D + vuint32mf2_t a =3D vreinterpret_v_u16mf2_u32mf2(src);=0D + return a;=0D +}=0D +=0D +vuint32m1_t test_vreinterpret_v_u16m1_u32m1(vuint16m1_t src)=0D +{=0D + vuint32m1_t a =3D vreinterpret_v_u16m1_u32m1(src);=0D + return a;=0D +}=0D +=0D +vuint32m2_t test_vreinterpret_v_u16m2_u32m2(vuint16m2_t src)=0D +{=0D + vuint32m2_t a =3D vreinterpret_v_u16m2_u32m2(src);=0D + return a;=0D +}=0D +=0D +vuint32m4_t test_vreinterpret_v_u16m4_u32m4(vuint16m4_t src)=0D +{=0D + vuint32m4_t a =3D vreinterpret_v_u16m4_u32m4(src);=0D + return a;=0D +}=0D +=0D +vuint32m8_t test_vreinterpret_v_u16m8_u32m8(vuint16m8_t src)=0D +{=0D + vuint32m8_t a =3D vreinterpret_v_u16m8_u32m8(src);=0D + return a;=0D +}=0D +=0D +vuint64m1_t test_vreinterpret_v_u16m1_u64m1(vuint16m1_t src)=0D +{=0D + vuint64m1_t a =3D vreinterpret_v_u16m1_u64m1(src);=0D + return a;=0D +}=0D +=0D +vuint64m2_t test_vreinterpret_v_u16m2_u64m2(vuint16m2_t src)=0D +{=0D + vuint64m2_t a =3D vreinterpret_v_u16m2_u64m2(src);=0D + return a;=0D +}=0D +=0D +vuint64m4_t test_vreinterpret_v_u16m4_u64m4(vuint16m4_t src)=0D +{=0D + vuint64m4_t a =3D vreinterpret_v_u16m4_u64m4(src);=0D + return a;=0D +}=0D +=0D +vuint64m8_t test_vreinterpret_v_u16m8_u64m8(vuint16m8_t src)=0D +{=0D + vuint64m8_t a =3D vreinterpret_v_u16m8_u64m8(src);=0D + return a;=0D +}=0D +=0D +vuint8mf2_t test_vreinterpret_v_u32mf2_u8mf2(vuint32mf2_t src)=0D +{=0D + vuint8mf2_t a =3D vreinterpret_v_u32mf2_u8mf2(src);=0D + return a;=0D +}=0D +=0D +vuint8m1_t test_vreinterpret_v_u32m1_u8m1(vuint32m1_t src)=0D +{=0D + vuint8m1_t a =3D vreinterpret_v_u32m1_u8m1(src);=0D + return a;=0D +}=0D +=0D +vuint8m2_t test_vreinterpret_v_u32m2_u8m2(vuint32m2_t src)=0D +{=0D + vuint8m2_t a =3D vreinterpret_v_u32m2_u8m2(src);=0D + return a;=0D +}=0D +=0D +vuint8m4_t test_vreinterpret_v_u32m4_u8m4(vuint32m4_t src)=0D +{=0D + vuint8m4_t a =3D vreinterpret_v_u32m4_u8m4(src);=0D + return a;=0D +}=0D +=0D +vuint8m8_t test_vreinterpret_v_u32m8_u8m8(vuint32m8_t src)=0D +{=0D + vuint8m8_t a =3D vreinterpret_v_u32m8_u8m8(src);=0D + return a;=0D +}=0D +=0D +vuint16mf2_t test_vreinterpret_v_u32mf2_u16mf2(vuint32mf2_t src)=0D +{=0D + vuint16mf2_t a =3D vreinterpret_v_u32mf2_u16mf2(src);=0D + return a;=0D +}=0D +=0D +vuint16m1_t test_vreinterpret_v_u32m1_u16m1(vuint32m1_t src)=0D +{=0D + vuint16m1_t a =3D vreinterpret_v_u32m1_u16m1(src);=0D + return a;=0D +}=0D +=0D +vuint16m2_t test_vreinterpret_v_u32m2_u16m2(vuint32m2_t src)=0D +{=0D + vuint16m2_t a =3D vreinterpret_v_u32m2_u16m2(src);=0D + return a;=0D +}=0D +=0D +vuint16m4_t test_vreinterpret_v_u32m4_u16m4(vuint32m4_t src)=0D +{=0D + vuint16m4_t a =3D vreinterpret_v_u32m4_u16m4(src);=0D + return a;=0D +}=0D +=0D +vuint16m8_t test_vreinterpret_v_u32m8_u16m8(vuint32m8_t src)=0D +{=0D + vuint16m8_t a =3D vreinterpret_v_u32m8_u16m8(src);=0D + return a;=0D +}=0D +=0D +vuint64m1_t test_vreinterpret_v_u32m1_u64m1(vuint32m1_t src)=0D +{=0D + vuint64m1_t a =3D vreinterpret_v_u32m1_u64m1(src);=0D + return a;=0D +}=0D +=0D +vuint64m2_t test_vreinterpret_v_u32m2_u64m2(vuint32m2_t src)=0D +{=0D + vuint64m2_t a =3D vreinterpret_v_u32m2_u64m2(src);=0D + return a;=0D +}=0D +=0D +vuint64m4_t test_vreinterpret_v_u32m4_u64m4(vuint32m4_t src)=0D +{=0D + vuint64m4_t a =3D vreinterpret_v_u32m4_u64m4(src);=0D + return a;=0D +}=0D +=0D +vuint64m8_t 
test_vreinterpret_v_u32m8_u64m8(vuint32m8_t src)=0D +{=0D + vuint64m8_t a =3D vreinterpret_v_u32m8_u64m8(src);=0D + return a;=0D +}=0D +=0D +vuint8m1_t test_vreinterpret_v_u64m1_u8m1(vuint64m1_t src)=0D +{=0D + vuint8m1_t a =3D vreinterpret_v_u64m1_u8m1(src);=0D + return a;=0D +}=0D +=0D +vuint8m2_t test_vreinterpret_v_u64m2_u8m2(vuint64m2_t src)=0D +{=0D + vuint8m2_t a =3D vreinterpret_v_u64m2_u8m2(src);=0D + return a;=0D +}=0D +=0D +vuint8m4_t test_vreinterpret_v_u64m4_u8m4(vuint64m4_t src)=0D +{=0D + vuint8m4_t a =3D vreinterpret_v_u64m4_u8m4(src);=0D + return a;=0D +}=0D +=0D +vuint8m8_t test_vreinterpret_v_u64m8_u8m8(vuint64m8_t src)=0D +{=0D + vuint8m8_t a =3D vreinterpret_v_u64m8_u8m8(src);=0D + return a;=0D +}=0D +=0D +vuint16m1_t test_vreinterpret_v_u64m1_u16m1(vuint64m1_t src)=0D +{=0D + vuint16m1_t a =3D vreinterpret_v_u64m1_u16m1(src);=0D + return a;=0D +}=0D +=0D +vuint16m2_t test_vreinterpret_v_u64m2_u16m2(vuint64m2_t src)=0D +{=0D + vuint16m2_t a =3D vreinterpret_v_u64m2_u16m2(src);=0D + return a;=0D +}=0D +=0D +vuint16m4_t test_vreinterpret_v_u64m4_u16m4(vuint64m4_t src)=0D +{=0D + vuint16m4_t a =3D vreinterpret_v_u64m4_u16m4(src);=0D + return a;=0D +}=0D +=0D +vuint16m8_t test_vreinterpret_v_u64m8_u16m8(vuint64m8_t src)=0D +{=0D + vuint16m8_t a =3D vreinterpret_v_u64m8_u16m8(src);=0D + return a;=0D +}=0D +=0D +vuint32m1_t test_vreinterpret_v_u64m1_u32m1(vuint64m1_t src)=0D +{=0D + vuint32m1_t a =3D vreinterpret_v_u64m1_u32m1(src);=0D + return a;=0D +}=0D +=0D +vuint32m2_t test_vreinterpret_v_u64m2_u32m2(vuint64m2_t src)=0D +{=0D + vuint32m2_t a =3D vreinterpret_v_u64m2_u32m2(src);=0D + return a;=0D +}=0D +=0D +vuint32m4_t test_vreinterpret_v_u64m4_u32m4(vuint64m4_t src)=0D +{=0D + vuint32m4_t a =3D vreinterpret_v_u64m4_u32m4(src);=0D + return a;=0D +}=0D +=0D +vuint32m8_t test_vreinterpret_v_u64m8_u32m8(vuint64m8_t src)=0D +{=0D + vuint32m8_t a =3D vreinterpret_v_u64m8_u32m8(src);=0D + return a;=0D +}=0D +=0D +vint8mf4_t test_vlmul_ext_v_i8mf8_i8mf4(vint8mf8_t op1)=0D +{=0D + vint8mf4_t a =3D vlmul_ext_v_i8mf8_i8mf4(op1);=0D + return a;=0D +}=0D +=0D +vint8mf2_t test_vlmul_ext_v_i8mf8_i8mf2(vint8mf8_t op1)=0D +{=0D + vint8mf2_t a =3D vlmul_ext_v_i8mf8_i8mf2(op1);=0D + return a;=0D +}=0D +=0D +vint8m1_t test_vlmul_ext_v_i8mf8_i8m1(vint8mf8_t op1)=0D +{=0D + vint8m1_t a =3D vlmul_ext_v_i8mf8_i8m1(op1);=0D + return a;=0D +}=0D +=0D +vint8m2_t test_vlmul_ext_v_i8mf8_i8m2(vint8mf8_t op1)=0D +{=0D + vint8m2_t a =3D vlmul_ext_v_i8mf8_i8m2(op1);=0D + return a;=0D +}=0D +=0D +vint8m4_t test_vlmul_ext_v_i8mf8_i8m4(vint8mf8_t op1)=0D +{=0D + vint8m4_t a =3D vlmul_ext_v_i8mf8_i8m4(op1);=0D + return a;=0D +}=0D +=0D +vint8m8_t test_vlmul_ext_v_i8mf8_i8m8(vint8mf8_t op1)=0D +{=0D + vint8m8_t a =3D vlmul_ext_v_i8mf8_i8m8(op1);=0D + return a;=0D +}=0D +=0D +vint8mf2_t test_vlmul_ext_v_i8mf4_i8mf2(vint8mf4_t op1)=0D +{=0D + vint8mf2_t a =3D vlmul_ext_v_i8mf4_i8mf2(op1);=0D + return a;=0D +}=0D +=0D +vint8m1_t test_vlmul_ext_v_i8mf4_i8m1(vint8mf4_t op1)=0D +{=0D + vint8m1_t a =3D vlmul_ext_v_i8mf4_i8m1(op1);=0D + return a;=0D +}=0D +=0D +vint8m2_t test_vlmul_ext_v_i8mf4_i8m2(vint8mf4_t op1)=0D +{=0D + vint8m2_t a =3D vlmul_ext_v_i8mf4_i8m2(op1);=0D + return a;=0D +}=0D +=0D +vint8m4_t test_vlmul_ext_v_i8mf4_i8m4(vint8mf4_t op1)=0D +{=0D + vint8m4_t a =3D vlmul_ext_v_i8mf4_i8m4(op1);=0D + return a;=0D +}=0D +=0D +vint8m8_t test_vlmul_ext_v_i8mf4_i8m8(vint8mf4_t op1)=0D +{=0D + vint8m8_t a =3D vlmul_ext_v_i8mf4_i8m8(op1);=0D + return a;=0D +}=0D +=0D +vint8m1_t 
test_vlmul_ext_v_i8mf2_i8m1(vint8mf2_t op1)=0D +{=0D + vint8m1_t a =3D vlmul_ext_v_i8mf2_i8m1(op1);=0D + return a;=0D +}=0D +=0D +vint8m2_t test_vlmul_ext_v_i8mf2_i8m2(vint8mf2_t op1)=0D +{=0D + vint8m2_t a =3D vlmul_ext_v_i8mf2_i8m2(op1);=0D + return a;=0D +}=0D +=0D +vint8m4_t test_vlmul_ext_v_i8mf2_i8m4(vint8mf2_t op1)=0D +{=0D + vint8m4_t a =3D vlmul_ext_v_i8mf2_i8m4(op1);=0D + return a;=0D +}=0D +=0D +vint8m8_t test_vlmul_ext_v_i8mf2_i8m8(vint8mf2_t op1)=0D +{=0D + vint8m8_t a =3D vlmul_ext_v_i8mf2_i8m8(op1);=0D + return a;=0D +}=0D +=0D +vint8m2_t test_vlmul_ext_v_i8m1_i8m2(vint8m1_t op1)=0D +{=0D + vint8m2_t a =3D vlmul_ext_v_i8m1_i8m2(op1);=0D + return a;=0D +}=0D +=0D +vint8m4_t test_vlmul_ext_v_i8m1_i8m4(vint8m1_t op1)=0D +{=0D + vint8m4_t a =3D vlmul_ext_v_i8m1_i8m4(op1);=0D + return a;=0D +}=0D +=0D +vint8m8_t test_vlmul_ext_v_i8m1_i8m8(vint8m1_t op1)=0D +{=0D + vint8m8_t a =3D vlmul_ext_v_i8m1_i8m8(op1);=0D + return a;=0D +}=0D +=0D +vint8m4_t test_vlmul_ext_v_i8m2_i8m4(vint8m2_t op1)=0D +{=0D + vint8m4_t a =3D vlmul_ext_v_i8m2_i8m4(op1);=0D + return a;=0D +}=0D +=0D +vint8m8_t test_vlmul_ext_v_i8m2_i8m8(vint8m2_t op1)=0D +{=0D + vint8m8_t a =3D vlmul_ext_v_i8m2_i8m8(op1);=0D + return a;=0D +}=0D +=0D +vint8m8_t test_vlmul_ext_v_i8m4_i8m8(vint8m4_t op1)=0D +{=0D + vint8m8_t a =3D vlmul_ext_v_i8m4_i8m8(op1);=0D + return a;=0D +}=0D +=0D +vint16mf2_t test_vlmul_ext_v_i16mf4_i16mf2(vint16mf4_t op1)=0D +{=0D + vint16mf2_t a =3D vlmul_ext_v_i16mf4_i16mf2(op1);=0D + return a;=0D +}=0D +=0D +vint16m1_t test_vlmul_ext_v_i16mf4_i16m1(vint16mf4_t op1)=0D +{=0D + vint16m1_t a =3D vlmul_ext_v_i16mf4_i16m1(op1);=0D + return a;=0D +}=0D +=0D +vint16m2_t test_vlmul_ext_v_i16mf4_i16m2(vint16mf4_t op1)=0D +{=0D + vint16m2_t a =3D vlmul_ext_v_i16mf4_i16m2(op1);=0D + return a;=0D +}=0D +=0D +vint16m4_t test_vlmul_ext_v_i16mf4_i16m4(vint16mf4_t op1)=0D +{=0D + vint16m4_t a =3D vlmul_ext_v_i16mf4_i16m4(op1);=0D + return a;=0D +}=0D +=0D +vint16m8_t test_vlmul_ext_v_i16mf4_i16m8(vint16mf4_t op1)=0D +{=0D + vint16m8_t a =3D vlmul_ext_v_i16mf4_i16m8(op1);=0D + return a;=0D +}=0D +=0D +vint16m1_t test_vlmul_ext_v_i16mf2_i16m1(vint16mf2_t op1)=0D +{=0D + vint16m1_t a =3D vlmul_ext_v_i16mf2_i16m1(op1);=0D + return a;=0D +}=0D +=0D +vint16m2_t test_vlmul_ext_v_i16mf2_i16m2(vint16mf2_t op1)=0D +{=0D + vint16m2_t a =3D vlmul_ext_v_i16mf2_i16m2(op1);=0D + return a;=0D +}=0D +=0D +vint16m4_t test_vlmul_ext_v_i16mf2_i16m4(vint16mf2_t op1)=0D +{=0D + vint16m4_t a =3D vlmul_ext_v_i16mf2_i16m4(op1);=0D + return a;=0D +}=0D +=0D +vint16m8_t test_vlmul_ext_v_i16mf2_i16m8(vint16mf2_t op1)=0D +{=0D + vint16m8_t a =3D vlmul_ext_v_i16mf2_i16m8(op1);=0D + return a;=0D +}=0D +=0D +vint16m2_t test_vlmul_ext_v_i16m1_i16m2(vint16m1_t op1)=0D +{=0D + vint16m2_t a =3D vlmul_ext_v_i16m1_i16m2(op1);=0D + return a;=0D +}=0D +=0D +vint16m4_t test_vlmul_ext_v_i16m1_i16m4(vint16m1_t op1)=0D +{=0D + vint16m4_t a =3D vlmul_ext_v_i16m1_i16m4(op1);=0D + return a;=0D +}=0D +=0D +vint16m8_t test_vlmul_ext_v_i16m1_i16m8(vint16m1_t op1)=0D +{=0D + vint16m8_t a =3D vlmul_ext_v_i16m1_i16m8(op1);=0D + return a;=0D +}=0D +=0D +vint16m4_t test_vlmul_ext_v_i16m2_i16m4(vint16m2_t op1)=0D +{=0D + vint16m4_t a =3D vlmul_ext_v_i16m2_i16m4(op1);=0D + return a;=0D +}=0D +=0D +vint16m8_t test_vlmul_ext_v_i16m2_i16m8(vint16m2_t op1)=0D +{=0D + vint16m8_t a =3D vlmul_ext_v_i16m2_i16m8(op1);=0D + return a;=0D +}=0D +=0D +vint16m8_t test_vlmul_ext_v_i16m4_i16m8(vint16m4_t op1)=0D +{=0D + vint16m8_t a =3D vlmul_ext_v_i16m4_i16m8(op1);=0D + return a;=0D 
+}=0D +=0D +vint32m1_t test_vlmul_ext_v_i32mf2_i32m1(vint32mf2_t op1)=0D +{=0D + vint32m1_t a =3D vlmul_ext_v_i32mf2_i32m1(op1);=0D + return a;=0D +}=0D +=0D +vint32m2_t test_vlmul_ext_v_i32mf2_i32m2(vint32mf2_t op1)=0D +{=0D + vint32m2_t a =3D vlmul_ext_v_i32mf2_i32m2(op1);=0D + return a;=0D +}=0D +=0D +vint32m4_t test_vlmul_ext_v_i32mf2_i32m4(vint32mf2_t op1)=0D +{=0D + vint32m4_t a =3D vlmul_ext_v_i32mf2_i32m4(op1);=0D + return a;=0D +}=0D +=0D +vint32m8_t test_vlmul_ext_v_i32mf2_i32m8(vint32mf2_t op1)=0D +{=0D + vint32m8_t a =3D vlmul_ext_v_i32mf2_i32m8(op1);=0D + return a;=0D +}=0D +=0D +vint32m2_t test_vlmul_ext_v_i32m1_i32m2(vint32m1_t op1)=0D +{=0D + vint32m2_t a =3D vlmul_ext_v_i32m1_i32m2(op1);=0D + return a;=0D +}=0D +=0D +vint32m4_t test_vlmul_ext_v_i32m1_i32m4(vint32m1_t op1)=0D +{=0D + vint32m4_t a =3D vlmul_ext_v_i32m1_i32m4(op1);=0D + return a;=0D +}=0D +=0D +vint32m8_t test_vlmul_ext_v_i32m1_i32m8(vint32m1_t op1)=0D +{=0D + vint32m8_t a =3D vlmul_ext_v_i32m1_i32m8(op1);=0D + return a;=0D +}=0D +=0D +vint32m4_t test_vlmul_ext_v_i32m2_i32m4(vint32m2_t op1)=0D +{=0D + vint32m4_t a =3D vlmul_ext_v_i32m2_i32m4(op1);=0D + return a;=0D +}=0D +=0D +vint32m8_t test_vlmul_ext_v_i32m2_i32m8(vint32m2_t op1)=0D +{=0D + vint32m8_t a =3D vlmul_ext_v_i32m2_i32m8(op1);=0D + return a;=0D +}=0D +=0D +vint32m8_t test_vlmul_ext_v_i32m4_i32m8(vint32m4_t op1)=0D +{=0D + vint32m8_t a =3D vlmul_ext_v_i32m4_i32m8(op1);=0D + return a;=0D +}=0D +=0D +vint64m2_t test_vlmul_ext_v_i64m1_i64m2(vint64m1_t op1)=0D +{=0D + vint64m2_t a =3D vlmul_ext_v_i64m1_i64m2(op1);=0D + return a;=0D +}=0D +=0D +vint64m4_t test_vlmul_ext_v_i64m1_i64m4(vint64m1_t op1)=0D +{=0D + vint64m4_t a =3D vlmul_ext_v_i64m1_i64m4(op1);=0D + return a;=0D +}=0D +=0D +vint64m8_t test_vlmul_ext_v_i64m1_i64m8(vint64m1_t op1)=0D +{=0D + vint64m8_t a =3D vlmul_ext_v_i64m1_i64m8(op1);=0D + return a;=0D +}=0D +=0D +vint64m4_t test_vlmul_ext_v_i64m2_i64m4(vint64m2_t op1)=0D +{=0D + vint64m4_t a =3D vlmul_ext_v_i64m2_i64m4(op1);=0D + return a;=0D +}=0D +=0D +vint64m8_t test_vlmul_ext_v_i64m2_i64m8(vint64m2_t op1)=0D +{=0D + vint64m8_t a =3D vlmul_ext_v_i64m2_i64m8(op1);=0D + return a;=0D +}=0D +=0D +vint64m8_t test_vlmul_ext_v_i64m4_i64m8(vint64m4_t op1)=0D +{=0D + vint64m8_t a =3D vlmul_ext_v_i64m4_i64m8(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf4_t test_vlmul_ext_v_u8mf8_u8mf4(vuint8mf8_t op1)=0D +{=0D + vuint8mf4_t a =3D vlmul_ext_v_u8mf8_u8mf4(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf2_t test_vlmul_ext_v_u8mf8_u8mf2(vuint8mf8_t op1)=0D +{=0D + vuint8mf2_t a =3D vlmul_ext_v_u8mf8_u8mf2(op1);=0D + return a;=0D +}=0D +=0D +vuint8m1_t test_vlmul_ext_v_u8mf8_u8m1(vuint8mf8_t op1)=0D +{=0D + vuint8m1_t a =3D vlmul_ext_v_u8mf8_u8m1(op1);=0D + return a;=0D +}=0D +=0D +vuint8m2_t test_vlmul_ext_v_u8mf8_u8m2(vuint8mf8_t op1)=0D +{=0D + vuint8m2_t a =3D vlmul_ext_v_u8mf8_u8m2(op1);=0D + return a;=0D +}=0D +=0D +vuint8m4_t test_vlmul_ext_v_u8mf8_u8m4(vuint8mf8_t op1)=0D +{=0D + vuint8m4_t a =3D vlmul_ext_v_u8mf8_u8m4(op1);=0D + return a;=0D +}=0D +=0D +vuint8m8_t test_vlmul_ext_v_u8mf8_u8m8(vuint8mf8_t op1)=0D +{=0D + vuint8m8_t a =3D vlmul_ext_v_u8mf8_u8m8(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf2_t test_vlmul_ext_v_u8mf4_u8mf2(vuint8mf4_t op1)=0D +{=0D + vuint8mf2_t a =3D vlmul_ext_v_u8mf4_u8mf2(op1);=0D + return a;=0D +}=0D +=0D +vuint8m1_t test_vlmul_ext_v_u8mf4_u8m1(vuint8mf4_t op1)=0D +{=0D + vuint8m1_t a =3D vlmul_ext_v_u8mf4_u8m1(op1);=0D + return a;=0D +}=0D +=0D +vuint8m2_t test_vlmul_ext_v_u8mf4_u8m2(vuint8mf4_t op1)=0D +{=0D + 
vuint8m2_t a =3D vlmul_ext_v_u8mf4_u8m2(op1);=0D + return a;=0D +}=0D +=0D +vuint8m4_t test_vlmul_ext_v_u8mf4_u8m4(vuint8mf4_t op1)=0D +{=0D + vuint8m4_t a =3D vlmul_ext_v_u8mf4_u8m4(op1);=0D + return a;=0D +}=0D +=0D +vuint8m8_t test_vlmul_ext_v_u8mf4_u8m8(vuint8mf4_t op1)=0D +{=0D + vuint8m8_t a =3D vlmul_ext_v_u8mf4_u8m8(op1);=0D + return a;=0D +}=0D +=0D +vuint8m1_t test_vlmul_ext_v_u8mf2_u8m1(vuint8mf2_t op1)=0D +{=0D + vuint8m1_t a =3D vlmul_ext_v_u8mf2_u8m1(op1);=0D + return a;=0D +}=0D +=0D +vuint8m2_t test_vlmul_ext_v_u8mf2_u8m2(vuint8mf2_t op1)=0D +{=0D + vuint8m2_t a =3D vlmul_ext_v_u8mf2_u8m2(op1);=0D + return a;=0D +}=0D +=0D +vuint8m4_t test_vlmul_ext_v_u8mf2_u8m4(vuint8mf2_t op1)=0D +{=0D + vuint8m4_t a =3D vlmul_ext_v_u8mf2_u8m4(op1);=0D + return a;=0D +}=0D +=0D +vuint8m8_t test_vlmul_ext_v_u8mf2_u8m8(vuint8mf2_t op1)=0D +{=0D + vuint8m8_t a =3D vlmul_ext_v_u8mf2_u8m8(op1);=0D + return a;=0D +}=0D +=0D +vuint8m2_t test_vlmul_ext_v_u8m1_u8m2(vuint8m1_t op1)=0D +{=0D + vuint8m2_t a =3D vlmul_ext_v_u8m1_u8m2(op1);=0D + return a;=0D +}=0D +=0D +vuint8m4_t test_vlmul_ext_v_u8m1_u8m4(vuint8m1_t op1)=0D +{=0D + vuint8m4_t a =3D vlmul_ext_v_u8m1_u8m4(op1);=0D + return a;=0D +}=0D +=0D +vuint8m8_t test_vlmul_ext_v_u8m1_u8m8(vuint8m1_t op1)=0D +{=0D + vuint8m8_t a =3D vlmul_ext_v_u8m1_u8m8(op1);=0D + return a;=0D +}=0D +=0D +vuint8m4_t test_vlmul_ext_v_u8m2_u8m4(vuint8m2_t op1)=0D +{=0D + vuint8m4_t a =3D vlmul_ext_v_u8m2_u8m4(op1);=0D + return a;=0D +}=0D +=0D +vuint8m8_t test_vlmul_ext_v_u8m2_u8m8(vuint8m2_t op1)=0D +{=0D + vuint8m8_t a =3D vlmul_ext_v_u8m2_u8m8(op1);=0D + return a;=0D +}=0D +=0D +vuint8m8_t test_vlmul_ext_v_u8m4_u8m8(vuint8m4_t op1)=0D +{=0D + vuint8m8_t a =3D vlmul_ext_v_u8m4_u8m8(op1);=0D + return a;=0D +}=0D +=0D +vuint16mf2_t test_vlmul_ext_v_u16mf4_u16mf2(vuint16mf4_t op1)=0D +{=0D + vuint16mf2_t a =3D vlmul_ext_v_u16mf4_u16mf2(op1);=0D + return a;=0D +}=0D +=0D +vuint16m1_t test_vlmul_ext_v_u16mf4_u16m1(vuint16mf4_t op1)=0D +{=0D + vuint16m1_t a =3D vlmul_ext_v_u16mf4_u16m1(op1);=0D + return a;=0D +}=0D +=0D +vuint16m2_t test_vlmul_ext_v_u16mf4_u16m2(vuint16mf4_t op1)=0D +{=0D + vuint16m2_t a =3D vlmul_ext_v_u16mf4_u16m2(op1);=0D + return a;=0D +}=0D +=0D +vuint16m4_t test_vlmul_ext_v_u16mf4_u16m4(vuint16mf4_t op1)=0D +{=0D + vuint16m4_t a =3D vlmul_ext_v_u16mf4_u16m4(op1);=0D + return a;=0D +}=0D +=0D +vuint16m8_t test_vlmul_ext_v_u16mf4_u16m8(vuint16mf4_t op1)=0D +{=0D + vuint16m8_t a =3D vlmul_ext_v_u16mf4_u16m8(op1);=0D + return a;=0D +}=0D +=0D +vuint16m1_t test_vlmul_ext_v_u16mf2_u16m1(vuint16mf2_t op1)=0D +{=0D + vuint16m1_t a =3D vlmul_ext_v_u16mf2_u16m1(op1);=0D + return a;=0D +}=0D +=0D +vuint16m2_t test_vlmul_ext_v_u16mf2_u16m2(vuint16mf2_t op1)=0D +{=0D + vuint16m2_t a =3D vlmul_ext_v_u16mf2_u16m2(op1);=0D + return a;=0D +}=0D +=0D +vuint16m4_t test_vlmul_ext_v_u16mf2_u16m4(vuint16mf2_t op1)=0D +{=0D + vuint16m4_t a =3D vlmul_ext_v_u16mf2_u16m4(op1);=0D + return a;=0D +}=0D +=0D +vuint16m8_t test_vlmul_ext_v_u16mf2_u16m8(vuint16mf2_t op1)=0D +{=0D + vuint16m8_t a =3D vlmul_ext_v_u16mf2_u16m8(op1);=0D + return a;=0D +}=0D +=0D +vuint16m2_t test_vlmul_ext_v_u16m1_u16m2(vuint16m1_t op1)=0D +{=0D + vuint16m2_t a =3D vlmul_ext_v_u16m1_u16m2(op1);=0D + return a;=0D +}=0D +=0D +vuint16m4_t test_vlmul_ext_v_u16m1_u16m4(vuint16m1_t op1)=0D +{=0D + vuint16m4_t a =3D vlmul_ext_v_u16m1_u16m4(op1);=0D + return a;=0D +}=0D +=0D +vuint16m8_t test_vlmul_ext_v_u16m1_u16m8(vuint16m1_t op1)=0D +{=0D + vuint16m8_t a =3D vlmul_ext_v_u16m1_u16m8(op1);=0D + return 
a;=0D +}=0D +=0D +vuint16m4_t test_vlmul_ext_v_u16m2_u16m4(vuint16m2_t op1)=0D +{=0D + vuint16m4_t a =3D vlmul_ext_v_u16m2_u16m4(op1);=0D + return a;=0D +}=0D +=0D +vuint16m8_t test_vlmul_ext_v_u16m2_u16m8(vuint16m2_t op1)=0D +{=0D + vuint16m8_t a =3D vlmul_ext_v_u16m2_u16m8(op1);=0D + return a;=0D +}=0D +=0D +vuint16m8_t test_vlmul_ext_v_u16m4_u16m8(vuint16m4_t op1)=0D +{=0D + vuint16m8_t a =3D vlmul_ext_v_u16m4_u16m8(op1);=0D + return a;=0D +}=0D +=0D +vuint32m1_t test_vlmul_ext_v_u32mf2_u32m1(vuint32mf2_t op1)=0D +{=0D + vuint32m1_t a =3D vlmul_ext_v_u32mf2_u32m1(op1);=0D + return a;=0D +}=0D +=0D +vuint32m2_t test_vlmul_ext_v_u32mf2_u32m2(vuint32mf2_t op1)=0D +{=0D + vuint32m2_t a =3D vlmul_ext_v_u32mf2_u32m2(op1);=0D + return a;=0D +}=0D +=0D +vuint32m4_t test_vlmul_ext_v_u32mf2_u32m4(vuint32mf2_t op1)=0D +{=0D + vuint32m4_t a =3D vlmul_ext_v_u32mf2_u32m4(op1);=0D + return a;=0D +}=0D +=0D +vuint32m8_t test_vlmul_ext_v_u32mf2_u32m8(vuint32mf2_t op1)=0D +{=0D + vuint32m8_t a =3D vlmul_ext_v_u32mf2_u32m8(op1);=0D + return a;=0D +}=0D +=0D +vuint32m2_t test_vlmul_ext_v_u32m1_u32m2(vuint32m1_t op1)=0D +{=0D + vuint32m2_t a =3D vlmul_ext_v_u32m1_u32m2(op1);=0D + return a;=0D +}=0D +=0D +vuint32m4_t test_vlmul_ext_v_u32m1_u32m4(vuint32m1_t op1)=0D +{=0D + vuint32m4_t a =3D vlmul_ext_v_u32m1_u32m4(op1);=0D + return a;=0D +}=0D +=0D +vuint32m8_t test_vlmul_ext_v_u32m1_u32m8(vuint32m1_t op1)=0D +{=0D + vuint32m8_t a =3D vlmul_ext_v_u32m1_u32m8(op1);=0D + return a;=0D +}=0D +=0D +vuint32m4_t test_vlmul_ext_v_u32m2_u32m4(vuint32m2_t op1)=0D +{=0D + vuint32m4_t a =3D vlmul_ext_v_u32m2_u32m4(op1);=0D + return a;=0D +}=0D +=0D +vuint32m8_t test_vlmul_ext_v_u32m2_u32m8(vuint32m2_t op1)=0D +{=0D + vuint32m8_t a =3D vlmul_ext_v_u32m2_u32m8(op1);=0D + return a;=0D +}=0D +=0D +vuint32m8_t test_vlmul_ext_v_u32m4_u32m8(vuint32m4_t op1)=0D +{=0D + vuint32m8_t a =3D vlmul_ext_v_u32m4_u32m8(op1);=0D + return a;=0D +}=0D +=0D +vuint64m2_t test_vlmul_ext_v_u64m1_u64m2(vuint64m1_t op1)=0D +{=0D + vuint64m2_t a =3D vlmul_ext_v_u64m1_u64m2(op1);=0D + return a;=0D +}=0D +=0D +vuint64m4_t test_vlmul_ext_v_u64m1_u64m4(vuint64m1_t op1)=0D +{=0D + vuint64m4_t a =3D vlmul_ext_v_u64m1_u64m4(op1);=0D + return a;=0D +}=0D +=0D +vuint64m8_t test_vlmul_ext_v_u64m1_u64m8(vuint64m1_t op1)=0D +{=0D + vuint64m8_t a =3D vlmul_ext_v_u64m1_u64m8(op1);=0D + return a;=0D +}=0D +=0D +vuint64m4_t test_vlmul_ext_v_u64m2_u64m4(vuint64m2_t op1)=0D +{=0D + vuint64m4_t a =3D vlmul_ext_v_u64m2_u64m4(op1);=0D + return a;=0D +}=0D +=0D +vuint64m8_t test_vlmul_ext_v_u64m2_u64m8(vuint64m2_t op1)=0D +{=0D + vuint64m8_t a =3D vlmul_ext_v_u64m2_u64m8(op1);=0D + return a;=0D +}=0D +=0D +vuint64m8_t test_vlmul_ext_v_u64m4_u64m8(vuint64m4_t op1)=0D +{=0D + vuint64m8_t a =3D vlmul_ext_v_u64m4_u64m8(op1);=0D + return a;=0D +}=0D +=0D +vfloat32m1_t test_vlmul_ext_v_f32mf2_f32m1(vfloat32mf2_t op1)=0D +{=0D + vfloat32m1_t a =3D vlmul_ext_v_f32mf2_f32m1(op1);=0D + return a;=0D +}=0D +=0D +vfloat32m2_t test_vlmul_ext_v_f32mf2_f32m2(vfloat32mf2_t op1)=0D +{=0D + vfloat32m2_t a =3D vlmul_ext_v_f32mf2_f32m2(op1);=0D + return a;=0D +}=0D +=0D +vfloat32m4_t test_vlmul_ext_v_f32mf2_f32m4(vfloat32mf2_t op1)=0D +{=0D + vfloat32m4_t a =3D vlmul_ext_v_f32mf2_f32m4(op1);=0D + return a;=0D +}=0D +=0D +vfloat32m8_t test_vlmul_ext_v_f32mf2_f32m8(vfloat32mf2_t op1)=0D +{=0D + vfloat32m8_t a =3D vlmul_ext_v_f32mf2_f32m8(op1);=0D + return a;=0D +}=0D +=0D +vfloat32m2_t test_vlmul_ext_v_f32m1_f32m2(vfloat32m1_t op1)=0D +{=0D + vfloat32m2_t a =3D 
vlmul_ext_v_f32m1_f32m2(op1);=0D + return a;=0D +}=0D +=0D +vfloat32m4_t test_vlmul_ext_v_f32m1_f32m4(vfloat32m1_t op1)=0D +{=0D + vfloat32m4_t a =3D vlmul_ext_v_f32m1_f32m4(op1);=0D + return a;=0D +}=0D +=0D +vfloat32m8_t test_vlmul_ext_v_f32m1_f32m8(vfloat32m1_t op1)=0D +{=0D + vfloat32m8_t a =3D vlmul_ext_v_f32m1_f32m8(op1);=0D + return a;=0D +}=0D +=0D +vfloat32m4_t test_vlmul_ext_v_f32m2_f32m4(vfloat32m2_t op1)=0D +{=0D + vfloat32m4_t a =3D vlmul_ext_v_f32m2_f32m4(op1);=0D + return a;=0D +}=0D +=0D +vfloat32m8_t test_vlmul_ext_v_f32m2_f32m8(vfloat32m2_t op1)=0D +{=0D + vfloat32m8_t a =3D vlmul_ext_v_f32m2_f32m8(op1);=0D + return a;=0D +}=0D +=0D +vfloat32m8_t test_vlmul_ext_v_f32m4_f32m8(vfloat32m4_t op1)=0D +{=0D + vfloat32m8_t a =3D vlmul_ext_v_f32m4_f32m8(op1);=0D + return a;=0D +}=0D +=0D +vfloat64m2_t test_vlmul_ext_v_f64m1_f64m2(vfloat64m1_t op1)=0D +{=0D + vfloat64m2_t a =3D vlmul_ext_v_f64m1_f64m2(op1);=0D + return a;=0D +}=0D +=0D +vfloat64m4_t test_vlmul_ext_v_f64m1_f64m4(vfloat64m1_t op1)=0D +{=0D + vfloat64m4_t a =3D vlmul_ext_v_f64m1_f64m4(op1);=0D + return a;=0D +}=0D +=0D +vfloat64m8_t test_vlmul_ext_v_f64m1_f64m8(vfloat64m1_t op1)=0D +{=0D + vfloat64m8_t a =3D vlmul_ext_v_f64m1_f64m8(op1);=0D + return a;=0D +}=0D +=0D +vfloat64m4_t test_vlmul_ext_v_f64m2_f64m4(vfloat64m2_t op1)=0D +{=0D + vfloat64m4_t a =3D vlmul_ext_v_f64m2_f64m4(op1);=0D + return a;=0D +}=0D +=0D +vfloat64m8_t test_vlmul_ext_v_f64m2_f64m8(vfloat64m2_t op1)=0D +{=0D + vfloat64m8_t a =3D vlmul_ext_v_f64m2_f64m8(op1);=0D + return a;=0D +}=0D +=0D +vfloat64m8_t test_vlmul_ext_v_f64m4_f64m8(vfloat64m4_t op1)=0D +{=0D + vfloat64m8_t a =3D vlmul_ext_v_f64m4_f64m8(op1);=0D + return a;=0D +}=0D +=0D +vint8mf8_t test_vlmul_trunc_v_i8mf4_i8mf8(vint8mf4_t op1)=0D +{=0D + vint8mf8_t a =3D vlmul_trunc_v_i8mf4_i8mf8(op1);=0D + return a;=0D +}=0D +=0D +vint8mf8_t test_vlmul_trunc_v_i8mf2_i8mf8(vint8mf2_t op1)=0D +{=0D + vint8mf8_t a =3D vlmul_trunc_v_i8mf2_i8mf8(op1);=0D + return a;=0D +}=0D +=0D +vint8mf4_t test_vlmul_trunc_v_i8mf2_i8mf4(vint8mf2_t op1)=0D +{=0D + vint8mf4_t a =3D vlmul_trunc_v_i8mf2_i8mf4(op1);=0D + return a;=0D +}=0D +=0D +vint8mf8_t test_vlmul_trunc_v_i8m1_i8mf8(vint8m1_t op1)=0D +{=0D + vint8mf8_t a =3D vlmul_trunc_v_i8m1_i8mf8(op1);=0D + return a;=0D +}=0D +=0D +vint8mf4_t test_vlmul_trunc_v_i8m1_i8mf4(vint8m1_t op1)=0D +{=0D + vint8mf4_t a =3D vlmul_trunc_v_i8m1_i8mf4(op1);=0D + return a;=0D +}=0D +=0D +vint8mf2_t test_vlmul_trunc_v_i8m1_i8mf2(vint8m1_t op1)=0D +{=0D + vint8mf2_t a =3D vlmul_trunc_v_i8m1_i8mf2(op1);=0D + return a;=0D +}=0D +=0D +vint8mf8_t test_vlmul_trunc_v_i8m2_i8mf8(vint8m2_t op1)=0D +{=0D + vint8mf8_t a =3D vlmul_trunc_v_i8m2_i8mf8(op1);=0D + return a;=0D +}=0D +=0D +vint8mf4_t test_vlmul_trunc_v_i8m2_i8mf4(vint8m2_t op1)=0D +{=0D + vint8mf4_t a =3D vlmul_trunc_v_i8m2_i8mf4(op1);=0D + return a;=0D +}=0D +=0D +vint8mf2_t test_vlmul_trunc_v_i8m2_i8mf2(vint8m2_t op1)=0D +{=0D + vint8mf2_t a =3D vlmul_trunc_v_i8m2_i8mf2(op1);=0D + return a;=0D +}=0D +=0D +vint8m1_t test_vlmul_trunc_v_i8m2_i8m1(vint8m2_t op1)=0D +{=0D + vint8m1_t a =3D vlmul_trunc_v_i8m2_i8m1(op1);=0D + return a;=0D +}=0D +=0D +vint8mf8_t test_vlmul_trunc_v_i8m4_i8mf8(vint8m4_t op1)=0D +{=0D + vint8mf8_t a =3D vlmul_trunc_v_i8m4_i8mf8(op1);=0D + return a;=0D +}=0D +=0D +vint8mf4_t test_vlmul_trunc_v_i8m4_i8mf4(vint8m4_t op1)=0D +{=0D + vint8mf4_t a =3D vlmul_trunc_v_i8m4_i8mf4(op1);=0D + return a;=0D +}=0D +=0D +vint8mf2_t test_vlmul_trunc_v_i8m4_i8mf2(vint8m4_t op1)=0D +{=0D + vint8mf2_t a =3D 
vlmul_trunc_v_i8m4_i8mf2(op1);=0D + return a;=0D +}=0D +=0D +vint8m1_t test_vlmul_trunc_v_i8m4_i8m1(vint8m4_t op1)=0D +{=0D + vint8m1_t a =3D vlmul_trunc_v_i8m4_i8m1(op1);=0D + return a;=0D +}=0D +=0D +vint8m2_t test_vlmul_trunc_v_i8m4_i8m2(vint8m4_t op1)=0D +{=0D + vint8m2_t a =3D vlmul_trunc_v_i8m4_i8m2(op1);=0D + return a;=0D +}=0D +=0D +vint8mf8_t test_vlmul_trunc_v_i8m8_i8mf8(vint8m8_t op1)=0D +{=0D + vint8mf8_t a =3D vlmul_trunc_v_i8m8_i8mf8(op1);=0D + return a;=0D +}=0D +=0D +vint8mf4_t test_vlmul_trunc_v_i8m8_i8mf4(vint8m8_t op1)=0D +{=0D + vint8mf4_t a =3D vlmul_trunc_v_i8m8_i8mf4(op1);=0D + return a;=0D +}=0D +=0D +vint8mf2_t test_vlmul_trunc_v_i8m8_i8mf2(vint8m8_t op1)=0D +{=0D + vint8mf2_t a =3D vlmul_trunc_v_i8m8_i8mf2(op1);=0D + return a;=0D +}=0D +=0D +vint8m1_t test_vlmul_trunc_v_i8m8_i8m1(vint8m8_t op1)=0D +{=0D + vint8m1_t a =3D vlmul_trunc_v_i8m8_i8m1(op1);=0D + return a;=0D +}=0D +=0D +vint8m2_t test_vlmul_trunc_v_i8m8_i8m2(vint8m8_t op1)=0D +{=0D + vint8m2_t a =3D vlmul_trunc_v_i8m8_i8m2(op1);=0D + return a;=0D +}=0D +=0D +vint8m4_t test_vlmul_trunc_v_i8m8_i8m4(vint8m8_t op1)=0D +{=0D + vint8m4_t a =3D vlmul_trunc_v_i8m8_i8m4(op1);=0D + return a;=0D +}=0D +=0D +vint16mf4_t test_vlmul_trunc_v_i16mf2_i16mf4(vint16mf2_t op1)=0D +{=0D + vint16mf4_t a =3D vlmul_trunc_v_i16mf2_i16mf4(op1);=0D + return a;=0D +}=0D +=0D +vint16mf4_t test_vlmul_trunc_v_i16m1_i16mf4(vint16m1_t op1)=0D +{=0D + vint16mf4_t a =3D vlmul_trunc_v_i16m1_i16mf4(op1);=0D + return a;=0D +}=0D +=0D +vint16mf2_t test_vlmul_trunc_v_i16m1_i16mf2(vint16m1_t op1)=0D +{=0D + vint16mf2_t a =3D vlmul_trunc_v_i16m1_i16mf2(op1);=0D + return a;=0D +}=0D +=0D +vint16mf4_t test_vlmul_trunc_v_i16m2_i16mf4(vint16m2_t op1)=0D +{=0D + vint16mf4_t a =3D vlmul_trunc_v_i16m2_i16mf4(op1);=0D + return a;=0D +}=0D +=0D +vint16mf2_t test_vlmul_trunc_v_i16m2_i16mf2(vint16m2_t op1)=0D +{=0D + vint16mf2_t a =3D vlmul_trunc_v_i16m2_i16mf2(op1);=0D + return a;=0D +}=0D +=0D +vint16m1_t test_vlmul_trunc_v_i16m2_i16m1(vint16m2_t op1)=0D +{=0D + vint16m1_t a =3D vlmul_trunc_v_i16m2_i16m1(op1);=0D + return a;=0D +}=0D +=0D +vint16mf4_t test_vlmul_trunc_v_i16m4_i16mf4(vint16m4_t op1)=0D +{=0D + vint16mf4_t a =3D vlmul_trunc_v_i16m4_i16mf4(op1);=0D + return a;=0D +}=0D +=0D +vint16mf2_t test_vlmul_trunc_v_i16m4_i16mf2(vint16m4_t op1)=0D +{=0D + vint16mf2_t a =3D vlmul_trunc_v_i16m4_i16mf2(op1);=0D + return a;=0D +}=0D +=0D +vint16m1_t test_vlmul_trunc_v_i16m4_i16m1(vint16m4_t op1)=0D +{=0D + vint16m1_t a =3D vlmul_trunc_v_i16m4_i16m1(op1);=0D + return a;=0D +}=0D +=0D +vint16m2_t test_vlmul_trunc_v_i16m4_i16m2(vint16m4_t op1)=0D +{=0D + vint16m2_t a =3D vlmul_trunc_v_i16m4_i16m2(op1);=0D + return a;=0D +}=0D +=0D +vint16mf4_t test_vlmul_trunc_v_i16m8_i16mf4(vint16m8_t op1)=0D +{=0D + vint16mf4_t a =3D vlmul_trunc_v_i16m8_i16mf4(op1);=0D + return a;=0D +}=0D +=0D +vint16mf2_t test_vlmul_trunc_v_i16m8_i16mf2(vint16m8_t op1)=0D +{=0D + vint16mf2_t a =3D vlmul_trunc_v_i16m8_i16mf2(op1);=0D + return a;=0D +}=0D +=0D +vint16m1_t test_vlmul_trunc_v_i16m8_i16m1(vint16m8_t op1)=0D +{=0D + vint16m1_t a =3D vlmul_trunc_v_i16m8_i16m1(op1);=0D + return a;=0D +}=0D +=0D +vint16m2_t test_vlmul_trunc_v_i16m8_i16m2(vint16m8_t op1)=0D +{=0D + vint16m2_t a =3D vlmul_trunc_v_i16m8_i16m2(op1);=0D + return a;=0D +}=0D +=0D +vint16m4_t test_vlmul_trunc_v_i16m8_i16m4(vint16m8_t op1)=0D +{=0D + vint16m4_t a =3D vlmul_trunc_v_i16m8_i16m4(op1);=0D + return a;=0D +}=0D +=0D +vint32mf2_t test_vlmul_trunc_v_i32m1_i32mf2(vint32m1_t op1)=0D +{=0D + vint32mf2_t a =3D 
vlmul_trunc_v_i32m1_i32mf2(op1);=0D + return a;=0D +}=0D +=0D +vint32mf2_t test_vlmul_trunc_v_i32m2_i32mf2(vint32m2_t op1)=0D +{=0D + vint32mf2_t a =3D vlmul_trunc_v_i32m2_i32mf2(op1);=0D + return a;=0D +}=0D +=0D +vint32m1_t test_vlmul_trunc_v_i32m2_i32m1(vint32m2_t op1)=0D +{=0D + vint32m1_t a =3D vlmul_trunc_v_i32m2_i32m1(op1);=0D + return a;=0D +}=0D +=0D +vint32mf2_t test_vlmul_trunc_v_i32m4_i32mf2(vint32m4_t op1)=0D +{=0D + vint32mf2_t a =3D vlmul_trunc_v_i32m4_i32mf2(op1);=0D + return a;=0D +}=0D +=0D +vint32m1_t test_vlmul_trunc_v_i32m4_i32m1(vint32m4_t op1)=0D +{=0D + vint32m1_t a =3D vlmul_trunc_v_i32m4_i32m1(op1);=0D + return a;=0D +}=0D +=0D +vint32m2_t test_vlmul_trunc_v_i32m4_i32m2(vint32m4_t op1)=0D +{=0D + vint32m2_t a =3D vlmul_trunc_v_i32m4_i32m2(op1);=0D + return a;=0D +}=0D +=0D +vint32mf2_t test_vlmul_trunc_v_i32m8_i32mf2(vint32m8_t op1)=0D +{=0D + vint32mf2_t a =3D vlmul_trunc_v_i32m8_i32mf2(op1);=0D + return a;=0D +}=0D +=0D +vint32m1_t test_vlmul_trunc_v_i32m8_i32m1(vint32m8_t op1)=0D +{=0D + vint32m1_t a =3D vlmul_trunc_v_i32m8_i32m1(op1);=0D + return a;=0D +}=0D +=0D +vint32m2_t test_vlmul_trunc_v_i32m8_i32m2(vint32m8_t op1)=0D +{=0D + vint32m2_t a =3D vlmul_trunc_v_i32m8_i32m2(op1);=0D + return a;=0D +}=0D +=0D +vint32m4_t test_vlmul_trunc_v_i32m8_i32m4(vint32m8_t op1)=0D +{=0D + vint32m4_t a =3D vlmul_trunc_v_i32m8_i32m4(op1);=0D + return a;=0D +}=0D +=0D +vint64m1_t test_vlmul_trunc_v_i64m2_i64m1(vint64m2_t op1)=0D +{=0D + vint64m1_t a =3D vlmul_trunc_v_i64m2_i64m1(op1);=0D + return a;=0D +}=0D +=0D +vint64m1_t test_vlmul_trunc_v_i64m4_i64m1(vint64m4_t op1)=0D +{=0D + vint64m1_t a =3D vlmul_trunc_v_i64m4_i64m1(op1);=0D + return a;=0D +}=0D +=0D +vint64m2_t test_vlmul_trunc_v_i64m4_i64m2(vint64m4_t op1)=0D +{=0D + vint64m2_t a =3D vlmul_trunc_v_i64m4_i64m2(op1);=0D + return a;=0D +}=0D +=0D +vint64m1_t test_vlmul_trunc_v_i64m8_i64m1(vint64m8_t op1)=0D +{=0D + vint64m1_t a =3D vlmul_trunc_v_i64m8_i64m1(op1);=0D + return a;=0D +}=0D +=0D +vint64m2_t test_vlmul_trunc_v_i64m8_i64m2(vint64m8_t op1)=0D +{=0D + vint64m2_t a =3D vlmul_trunc_v_i64m8_i64m2(op1);=0D + return a;=0D +}=0D +=0D +vint64m4_t test_vlmul_trunc_v_i64m8_i64m4(vint64m8_t op1)=0D +{=0D + vint64m4_t a =3D vlmul_trunc_v_i64m8_i64m4(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf8_t test_vlmul_trunc_v_u8mf4_u8mf8(vuint8mf4_t op1)=0D +{=0D + vuint8mf8_t a =3D vlmul_trunc_v_u8mf4_u8mf8(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf8_t test_vlmul_trunc_v_u8mf2_u8mf8(vuint8mf2_t op1)=0D +{=0D + vuint8mf8_t a =3D vlmul_trunc_v_u8mf2_u8mf8(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf4_t test_vlmul_trunc_v_u8mf2_u8mf4(vuint8mf2_t op1)=0D +{=0D + vuint8mf4_t a =3D vlmul_trunc_v_u8mf2_u8mf4(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf8_t test_vlmul_trunc_v_u8m1_u8mf8(vuint8m1_t op1)=0D +{=0D + vuint8mf8_t a =3D vlmul_trunc_v_u8m1_u8mf8(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf4_t test_vlmul_trunc_v_u8m1_u8mf4(vuint8m1_t op1)=0D +{=0D + vuint8mf4_t a =3D vlmul_trunc_v_u8m1_u8mf4(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf2_t test_vlmul_trunc_v_u8m1_u8mf2(vuint8m1_t op1)=0D +{=0D + vuint8mf2_t a =3D vlmul_trunc_v_u8m1_u8mf2(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf8_t test_vlmul_trunc_v_u8m2_u8mf8(vuint8m2_t op1)=0D +{=0D + vuint8mf8_t a =3D vlmul_trunc_v_u8m2_u8mf8(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf4_t test_vlmul_trunc_v_u8m2_u8mf4(vuint8m2_t op1)=0D +{=0D + vuint8mf4_t a =3D vlmul_trunc_v_u8m2_u8mf4(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf2_t test_vlmul_trunc_v_u8m2_u8mf2(vuint8m2_t op1)=0D +{=0D 
+ vuint8mf2_t a =3D vlmul_trunc_v_u8m2_u8mf2(op1);=0D + return a;=0D +}=0D +=0D +vuint8m1_t test_vlmul_trunc_v_u8m2_u8m1(vuint8m2_t op1)=0D +{=0D + vuint8m1_t a =3D vlmul_trunc_v_u8m2_u8m1(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf8_t test_vlmul_trunc_v_u8m4_u8mf8(vuint8m4_t op1)=0D +{=0D + vuint8mf8_t a =3D vlmul_trunc_v_u8m4_u8mf8(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf4_t test_vlmul_trunc_v_u8m4_u8mf4(vuint8m4_t op1)=0D +{=0D + vuint8mf4_t a =3D vlmul_trunc_v_u8m4_u8mf4(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf2_t test_vlmul_trunc_v_u8m4_u8mf2(vuint8m4_t op1)=0D +{=0D + vuint8mf2_t a =3D vlmul_trunc_v_u8m4_u8mf2(op1);=0D + return a;=0D +}=0D +=0D +vuint8m1_t test_vlmul_trunc_v_u8m4_u8m1(vuint8m4_t op1)=0D +{=0D + vuint8m1_t a =3D vlmul_trunc_v_u8m4_u8m1(op1);=0D + return a;=0D +}=0D +=0D +vuint8m2_t test_vlmul_trunc_v_u8m4_u8m2(vuint8m4_t op1)=0D +{=0D + vuint8m2_t a =3D vlmul_trunc_v_u8m4_u8m2(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf8_t test_vlmul_trunc_v_u8m8_u8mf8(vuint8m8_t op1)=0D +{=0D + vuint8mf8_t a =3D vlmul_trunc_v_u8m8_u8mf8(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf4_t test_vlmul_trunc_v_u8m8_u8mf4(vuint8m8_t op1)=0D +{=0D + vuint8mf4_t a =3D vlmul_trunc_v_u8m8_u8mf4(op1);=0D + return a;=0D +}=0D +=0D +vuint8mf2_t test_vlmul_trunc_v_u8m8_u8mf2(vuint8m8_t op1)=0D +{=0D + vuint8mf2_t a =3D vlmul_trunc_v_u8m8_u8mf2(op1);=0D + return a;=0D +}=0D +=0D +vuint8m1_t test_vlmul_trunc_v_u8m8_u8m1(vuint8m8_t op1)=0D +{=0D + vuint8m1_t a =3D vlmul_trunc_v_u8m8_u8m1(op1);=0D + return a;=0D +}=0D +=0D +vuint8m2_t test_vlmul_trunc_v_u8m8_u8m2(vuint8m8_t op1)=0D +{=0D + vuint8m2_t a =3D vlmul_trunc_v_u8m8_u8m2(op1);=0D + return a;=0D +}=0D +=0D +vuint8m4_t test_vlmul_trunc_v_u8m8_u8m4(vuint8m8_t op1)=0D +{=0D + vuint8m4_t a =3D vlmul_trunc_v_u8m8_u8m4(op1);=0D + return a;=0D +}=0D +=0D +vuint16mf4_t test_vlmul_trunc_v_u16mf2_u16mf4(vuint16mf2_t op1)=0D +{=0D + vuint16mf4_t a =3D vlmul_trunc_v_u16mf2_u16mf4(op1);=0D + return a;=0D +}=0D +=0D +vuint16mf4_t test_vlmul_trunc_v_u16m1_u16mf4(vuint16m1_t op1)=0D +{=0D + vuint16mf4_t a =3D vlmul_trunc_v_u16m1_u16mf4(op1);=0D + return a;=0D +}=0D +=0D +vuint16mf2_t test_vlmul_trunc_v_u16m1_u16mf2(vuint16m1_t op1)=0D +{=0D + vuint16mf2_t a =3D vlmul_trunc_v_u16m1_u16mf2(op1);=0D + return a;=0D +}=0D +=0D +vuint16mf4_t test_vlmul_trunc_v_u16m2_u16mf4(vuint16m2_t op1)=0D +{=0D + vuint16mf4_t a =3D vlmul_trunc_v_u16m2_u16mf4(op1);=0D + return a;=0D +}=0D +=0D +vuint16mf2_t test_vlmul_trunc_v_u16m2_u16mf2(vuint16m2_t op1)=0D +{=0D + vuint16mf2_t a =3D vlmul_trunc_v_u16m2_u16mf2(op1);=0D + return a;=0D +}=0D +=0D +vuint16m1_t test_vlmul_trunc_v_u16m2_u16m1(vuint16m2_t op1)=0D +{=0D + vuint16m1_t a =3D vlmul_trunc_v_u16m2_u16m1(op1);=0D + return a;=0D +}=0D +=0D +vuint16mf4_t test_vlmul_trunc_v_u16m4_u16mf4(vuint16m4_t op1)=0D +{=0D + vuint16mf4_t a =3D vlmul_trunc_v_u16m4_u16mf4(op1);=0D + return a;=0D +}=0D +=0D +vuint16mf2_t test_vlmul_trunc_v_u16m4_u16mf2(vuint16m4_t op1)=0D +{=0D + vuint16mf2_t a =3D vlmul_trunc_v_u16m4_u16mf2(op1);=0D + return a;=0D +}=0D +=0D +vuint16m1_t test_vlmul_trunc_v_u16m4_u16m1(vuint16m4_t op1)=0D +{=0D + vuint16m1_t a =3D vlmul_trunc_v_u16m4_u16m1(op1);=0D + return a;=0D +}=0D +=0D +vuint16m2_t test_vlmul_trunc_v_u16m4_u16m2(vuint16m4_t op1)=0D +{=0D + vuint16m2_t a =3D vlmul_trunc_v_u16m4_u16m2(op1);=0D + return a;=0D +}=0D +=0D +vuint16mf4_t test_vlmul_trunc_v_u16m8_u16mf4(vuint16m8_t op1)=0D +{=0D + vuint16mf4_t a =3D vlmul_trunc_v_u16m8_u16mf4(op1);=0D + return a;=0D +}=0D +=0D +vuint16mf2_t 
test_vlmul_trunc_v_u16m8_u16mf2(vuint16m8_t op1)
+{
+  vuint16mf2_t a = vlmul_trunc_v_u16m8_u16mf2(op1);
+  return a;
+}
+
+vuint16m1_t test_vlmul_trunc_v_u16m8_u16m1(vuint16m8_t op1)
+{
+  vuint16m1_t a = vlmul_trunc_v_u16m8_u16m1(op1);
+  return a;
+}
+
+vuint16m2_t test_vlmul_trunc_v_u16m8_u16m2(vuint16m8_t op1)
+{
+  vuint16m2_t a = vlmul_trunc_v_u16m8_u16m2(op1);
+  return a;
+}
+
+vuint16m4_t test_vlmul_trunc_v_u16m8_u16m4(vuint16m8_t op1)
+{
+  vuint16m4_t a = vlmul_trunc_v_u16m8_u16m4(op1);
+  return a;
+}
+
+vuint32mf2_t test_vlmul_trunc_v_u32m1_u32mf2(vuint32m1_t op1)
+{
+  vuint32mf2_t a = vlmul_trunc_v_u32m1_u32mf2(op1);
+  return a;
+}
+
+vuint32mf2_t test_vlmul_trunc_v_u32m2_u32mf2(vuint32m2_t op1)
+{
+  vuint32mf2_t a = vlmul_trunc_v_u32m2_u32mf2(op1);
+  return a;
+}
+
+vuint32m1_t test_vlmul_trunc_v_u32m2_u32m1(vuint32m2_t op1)
+{
+  vuint32m1_t a = vlmul_trunc_v_u32m2_u32m1(op1);
+  return a;
+}
+
+vuint32mf2_t test_vlmul_trunc_v_u32m4_u32mf2(vuint32m4_t op1)
+{
+  vuint32mf2_t a = vlmul_trunc_v_u32m4_u32mf2(op1);
+  return a;
+}
+
+vuint32m1_t test_vlmul_trunc_v_u32m4_u32m1(vuint32m4_t op1)
+{
+  vuint32m1_t a = vlmul_trunc_v_u32m4_u32m1(op1);
+  return a;
+}
+
+vuint32m2_t test_vlmul_trunc_v_u32m4_u32m2(vuint32m4_t op1)
+{
+  vuint32m2_t a = vlmul_trunc_v_u32m4_u32m2(op1);
+  return a;
+}
+
+vuint32mf2_t test_vlmul_trunc_v_u32m8_u32mf2(vuint32m8_t op1)
+{
+  vuint32mf2_t a = vlmul_trunc_v_u32m8_u32mf2(op1);
+  return a;
+}
+
+vuint32m1_t test_vlmul_trunc_v_u32m8_u32m1(vuint32m8_t op1)
+{
+  vuint32m1_t a = vlmul_trunc_v_u32m8_u32m1(op1);
+  return a;
+}
+
+vuint32m2_t test_vlmul_trunc_v_u32m8_u32m2(vuint32m8_t op1)
+{
+  vuint32m2_t a = vlmul_trunc_v_u32m8_u32m2(op1);
+  return a;
+}
+
+vuint32m4_t test_vlmul_trunc_v_u32m8_u32m4(vuint32m8_t op1)
+{
+  vuint32m4_t a = vlmul_trunc_v_u32m8_u32m4(op1);
+  return a;
+}
+
+vuint64m1_t test_vlmul_trunc_v_u64m2_u64m1(vuint64m2_t op1)
+{
+  vuint64m1_t a = vlmul_trunc_v_u64m2_u64m1(op1);
+  return a;
+}
+
+vuint64m1_t test_vlmul_trunc_v_u64m4_u64m1(vuint64m4_t op1)
+{
+  vuint64m1_t a = vlmul_trunc_v_u64m4_u64m1(op1);
+  return a;
+}
+
+vuint64m2_t test_vlmul_trunc_v_u64m4_u64m2(vuint64m4_t op1)
+{
+  vuint64m2_t a = vlmul_trunc_v_u64m4_u64m2(op1);
+  return a;
+}
+
+vuint64m1_t test_vlmul_trunc_v_u64m8_u64m1(vuint64m8_t op1)
+{
+  vuint64m1_t a = vlmul_trunc_v_u64m8_u64m1(op1);
+  return a;
+}
+
+vuint64m2_t test_vlmul_trunc_v_u64m8_u64m2(vuint64m8_t op1)
+{
+  vuint64m2_t a = vlmul_trunc_v_u64m8_u64m2(op1);
+  return a;
+}
+
+vuint64m4_t test_vlmul_trunc_v_u64m8_u64m4(vuint64m8_t op1)
+{
+  vuint64m4_t a = vlmul_trunc_v_u64m8_u64m4(op1);
+  return a;
+}
+
+vfloat32mf2_t test_vlmul_trunc_v_f32m1_f32mf2(vfloat32m1_t op1)
+{
+  vfloat32mf2_t a = vlmul_trunc_v_f32m1_f32mf2(op1);
+  return a;
+}
+
+vfloat32mf2_t test_vlmul_trunc_v_f32m2_f32mf2(vfloat32m2_t op1)
+{
+  vfloat32mf2_t a = vlmul_trunc_v_f32m2_f32mf2(op1);
+  return a;
+}
+
+vfloat32m1_t test_vlmul_trunc_v_f32m2_f32m1(vfloat32m2_t op1)
+{
+  vfloat32m1_t a = vlmul_trunc_v_f32m2_f32m1(op1);
+  return a;
+}
+
+vfloat32mf2_t test_vlmul_trunc_v_f32m4_f32mf2(vfloat32m4_t op1)
+{
+  vfloat32mf2_t a = vlmul_trunc_v_f32m4_f32mf2(op1);
+  return a;
+}
+
+vfloat32m1_t test_vlmul_trunc_v_f32m4_f32m1(vfloat32m4_t op1)
+{
+  vfloat32m1_t a = vlmul_trunc_v_f32m4_f32m1(op1);
+  return a;
+}
+
+vfloat32m2_t test_vlmul_trunc_v_f32m4_f32m2(vfloat32m4_t op1)
+{
+  vfloat32m2_t a = vlmul_trunc_v_f32m4_f32m2(op1);
+  return a;
+}
+
+vfloat32mf2_t test_vlmul_trunc_v_f32m8_f32mf2(vfloat32m8_t op1)
+{
+  vfloat32mf2_t a = vlmul_trunc_v_f32m8_f32mf2(op1);
+  return a;
+}
+
+vfloat32m1_t test_vlmul_trunc_v_f32m8_f32m1(vfloat32m8_t op1)
+{
+  vfloat32m1_t a = vlmul_trunc_v_f32m8_f32m1(op1);
+  return a;
+}
+
+vfloat32m2_t test_vlmul_trunc_v_f32m8_f32m2(vfloat32m8_t op1)
+{
+  vfloat32m2_t a = vlmul_trunc_v_f32m8_f32m2(op1);
+  return a;
+}
+
+vfloat32m4_t test_vlmul_trunc_v_f32m8_f32m4(vfloat32m8_t op1)
+{
+  vfloat32m4_t a = vlmul_trunc_v_f32m8_f32m4(op1);
+  return a;
+}
+
+vfloat64m1_t test_vlmul_trunc_v_f64m2_f64m1(vfloat64m2_t op1)
+{
+  vfloat64m1_t a = vlmul_trunc_v_f64m2_f64m1(op1);
+  return a;
+}
+
+vfloat64m1_t test_vlmul_trunc_v_f64m4_f64m1(vfloat64m4_t op1)
+{
+  vfloat64m1_t a = vlmul_trunc_v_f64m4_f64m1(op1);
+  return a;
+}
+
+vfloat64m2_t test_vlmul_trunc_v_f64m4_f64m2(vfloat64m4_t op1)
+{
+  vfloat64m2_t a = vlmul_trunc_v_f64m4_f64m2(op1);
+  return a;
+}
+
+vfloat64m1_t test_vlmul_trunc_v_f64m8_f64m1(vfloat64m8_t op1)
+{
+  vfloat64m1_t a = vlmul_trunc_v_f64m8_f64m1(op1);
+  return a;
+}
+
+vfloat64m2_t test_vlmul_trunc_v_f64m8_f64m2(vfloat64m8_t op1)
+{
+  vfloat64m2_t a = vlmul_trunc_v_f64m8_f64m2(op1);
+  return a;
+}
+
+vfloat64m4_t test_vlmul_trunc_v_f64m8_f64m4(vfloat64m8_t op1)
+{
+  vfloat64m4_t a = vlmul_trunc_v_f64m8_f64m4(op1);
+  return a;
+}
+
+vint8mf8_t test_vundefined_i8mf8()
+{
+  vint8mf8_t a = vundefined_i8mf8();
+  return a;
+}
+
+vint8mf4_t test_vundefined_i8mf4()
+{
+  vint8mf4_t a = vundefined_i8mf4();
+  return a;
+}
+
+vint8mf2_t test_vundefined_i8mf2()
+{
+  vint8mf2_t a = vundefined_i8mf2();
+  return a;
+}
+
+vint8m1_t test_vundefined_i8m1()
+{
+  vint8m1_t a = vundefined_i8m1();
+  return a;
+}
+
+vint8m2_t test_vundefined_i8m2()
+{
+  vint8m2_t a = vundefined_i8m2();
+  return a;
+}
+
+vint8m4_t test_vundefined_i8m4()
+{
+  vint8m4_t a = vundefined_i8m4();
+  return a;
+}
+
+vint8m8_t test_vundefined_i8m8()
+{
+  vint8m8_t a = vundefined_i8m8();
+  return a;
+}
+
+vint16mf4_t test_vundefined_i16mf4()
+{
+  vint16mf4_t a = vundefined_i16mf4();
+  return a;
+}
+
+vint16mf2_t test_vundefined_i16mf2()
+{
+  vint16mf2_t a = vundefined_i16mf2();
+  return a;
+}
+
+vint16m1_t test_vundefined_i16m1()
+{
+  vint16m1_t a = vundefined_i16m1();
+  return a;
+}
+
+vint16m2_t test_vundefined_i16m2()
+{
+  vint16m2_t a = vundefined_i16m2();
+  return a;
+}
+
+vint16m4_t test_vundefined_i16m4()
+{
+  vint16m4_t a = vundefined_i16m4();
+  return a;
+}
+
+vint16m8_t test_vundefined_i16m8()
+{
+  vint16m8_t a = vundefined_i16m8();
+  return a;
+}
+
+vint32mf2_t test_vundefined_i32mf2()
+{
+  vint32mf2_t a = vundefined_i32mf2();
+  return a;
+}
+
+vint32m1_t test_vundefined_i32m1()
+{
+  vint32m1_t a = vundefined_i32m1();
+  return a;
+}
+
+vint32m2_t test_vundefined_i32m2()
+{
+  vint32m2_t a = vundefined_i32m2();
+  return a;
+}
+
+vint32m4_t test_vundefined_i32m4()
+{
+  vint32m4_t a = vundefined_i32m4();
+  return a;
+}
+
+vint32m8_t test_vundefined_i32m8()
+{
+  vint32m8_t a = vundefined_i32m8();
+  return a;
+}
+
+vint64m1_t test_vundefined_i64m1()
+{
+  vint64m1_t a = vundefined_i64m1();
+  return a;
+}
+
+vint64m2_t test_vundefined_i64m2()
+{
+  vint64m2_t a = vundefined_i64m2();
+  return a;
+}
+
+vint64m4_t test_vundefined_i64m4()
+{
+  vint64m4_t a = vundefined_i64m4();
+  return a;
+}
+
+vint64m8_t test_vundefined_i64m8()
+{
+  vint64m8_t a = vundefined_i64m8();
+  return a;
+}
+
+vuint8mf8_t test_vundefined_u8mf8()
+{
+  vuint8mf8_t a = vundefined_u8mf8();
+  return a;
+}
+
+vuint8mf4_t test_vundefined_u8mf4()
+{
+  vuint8mf4_t a = vundefined_u8mf4();
+  return a;
+}
+
+vuint8mf2_t test_vundefined_u8mf2()
+{
+  vuint8mf2_t a = vundefined_u8mf2();
+  return a;
+}
+
+vuint8m1_t test_vundefined_u8m1()
+{
+  vuint8m1_t a = vundefined_u8m1();
+  return a;
+}
+
+vuint8m2_t test_vundefined_u8m2()
+{
+  vuint8m2_t a = vundefined_u8m2();
+  return a;
+}
+
+vuint8m4_t test_vundefined_u8m4()
+{
+  vuint8m4_t a = vundefined_u8m4();
+  return a;
+}
+
+vuint8m8_t test_vundefined_u8m8()
+{
+  vuint8m8_t a = vundefined_u8m8();
+  return a;
+}
+
+vuint16mf4_t test_vundefined_u16mf4()
+{
+  vuint16mf4_t a = vundefined_u16mf4();
+  return a;
+}
+
+vuint16mf2_t test_vundefined_u16mf2()
+{
+  vuint16mf2_t a = vundefined_u16mf2();
+  return a;
+}
+
+vuint16m1_t test_vundefined_u16m1()
+{
+  vuint16m1_t a = vundefined_u16m1();
+  return a;
+}
+
+vuint16m2_t test_vundefined_u16m2()
+{
+  vuint16m2_t a = vundefined_u16m2();
+  return a;
+}
+
+vuint16m4_t test_vundefined_u16m4()
+{
+  vuint16m4_t a = vundefined_u16m4();
+  return a;
+}
+
+vuint16m8_t test_vundefined_u16m8()
+{
+  vuint16m8_t a = vundefined_u16m8();
+  return a;
+}
+
+vuint32mf2_t test_vundefined_u32mf2()
+{
+  vuint32mf2_t a = vundefined_u32mf2();
+  return a;
+}
+
+vuint32m1_t test_vundefined_u32m1()
+{
+  vuint32m1_t a = vundefined_u32m1();
+  return a;
+}
+
+vuint32m2_t test_vundefined_u32m2()
+{
+  vuint32m2_t a = vundefined_u32m2();
+  return a;
+}
+
+vuint32m4_t test_vundefined_u32m4()
+{
+  vuint32m4_t a = vundefined_u32m4();
+  return a;
+}
+
+vuint32m8_t test_vundefined_u32m8()
+{
+  vuint32m8_t a = vundefined_u32m8();
+  return a;
+}
+
+vuint64m1_t test_vundefined_u64m1()
+{
+  vuint64m1_t a = vundefined_u64m1();
+  return a;
+}
+
+vuint64m2_t test_vundefined_u64m2()
+{
+  vuint64m2_t a = vundefined_u64m2();
+  return a;
+}
+
+vuint64m4_t test_vundefined_u64m4()
+{
+  vuint64m4_t a = vundefined_u64m4();
+  return a;
+}
+
+vuint64m8_t test_vundefined_u64m8()
+{
+  vuint64m8_t a = vundefined_u64m8();
+  return a;
+}
+
+vfloat32mf2_t test_vundefined_f32mf2()
+{
+  vfloat32mf2_t a = vundefined_f32mf2();
+  return a;
+}
+
+vfloat32m1_t test_vundefined_f32m1()
+{
+  vfloat32m1_t a = vundefined_f32m1();
+  return a;
+}
+
+vfloat32m2_t test_vundefined_f32m2()
+{
+  vfloat32m2_t a = vundefined_f32m2();
+  return a;
+}
+
+vfloat32m4_t test_vundefined_f32m4()
+{
+  vfloat32m4_t a = vundefined_f32m4();
+  return a;
+}
+
+vfloat32m8_t test_vundefined_f32m8()
+{
+  vfloat32m8_t a = vundefined_f32m8();
+  return a;
+}
+
+vfloat64m1_t test_vundefined_f64m1()
+{
+  vfloat64m1_t a = vundefined_f64m1();
+  return a;
+}
+
+vfloat64m2_t test_vundefined_f64m2()
+{
+  vfloat64m2_t a = vundefined_f64m2();
+  return a;
+}
+
+vfloat64m4_t test_vundefined_f64m4()
+{
+  vfloat64m4_t a = vundefined_f64m4();
+  return a;
+}
+
+vfloat64m8_t test_vundefined_f64m8()
+{
+  vfloat64m8_t a = vundefined_f64m8();
+  return a;
+}
-- 
2.36.1