Hi all, I'd like to submit two patches that add support for Arm's MVE Tail Predicated Low Overhead Loop feature.

--- Introduction ---

The M-class Arm-ARM:
https://developer.arm.com/documentation/ddi0553/bu/?lang=en
Section B5.5.1 "Loop tail predication" describes the feature we are adding support for with this patch (although we only add codegen for DLSTP/LETP instruction loops).

Previously, with commit d2ed233cb94, we added support for non-MVE DLS/LE loops through the loop-doloop pass, which, given a standard MVE loop like:

```
void __attribute__ ((noinline))
test (int16_t *a, int16_t *b, int16_t *c, int n)
{
  while (n > 0)
    {
      mve_pred16_t p = vctp16q (n);
      int16x8_t va = vldrhq_z_s16 (a, p);
      int16x8_t vb = vldrhq_z_s16 (b, p);
      int16x8_t vc = vaddq_x_s16 (va, vb, p);
      vstrhq_p_s16 (c, vc, p);
      c += 8;
      a += 8;
      b += 8;
      n -= 8;
    }
}
```

... would output:

```
        dls     lr, lr
.L3:
        vctp.16 r3
        vmrs    ip, P0  @ movhi
        sxth    ip, ip
        vmsr    P0, ip  @ movhi
        mov     r4, r0
        vpst
        vldrht.16       q2, [r4]
        mov     r4, r1
        vmov    q3, q0
        vpst
        vldrht.16       q1, [r4]
        mov     r4, r2
        vpst
        vaddt.i16       q3, q2, q1
        subs    r3, r3, #8
        vpst
        vstrht.16       q3, [r4]
        adds    r0, r0, #16
        adds    r1, r1, #16
        adds    r2, r2, #16
        le      lr, .L3
```

where the LE instruction decrements LR by 1, compares, and branches if needed. (There are also other inefficiencies in the above code, like the pointless vmrs/sxth/vmsr on the VPR, the adds not being merged into the vldrht/vstrht as #16 offsets, and some redundant movs. But those are separate problems.)

The MVE version is similar, except that:

* Instead of DLS/LE, the instructions are DLSTP/LETP.
* Instead of pre-calculating the number of iterations of the loop, we place the number of elements to be processed by the loop into LR.
* Instead of decrementing LR by one, LETP will decrement it by FPSCR.LTPSIZE, which is the number of elements being processed in each iteration: 16 for 8-bit elements, 8 for 16-bit elements, etc.
* On the final iteration, automatic Loop Tail Predication is performed, as if the instructions within the loop had been VPT-predicated with a VCTP generating the VPR predicate in every loop iteration.

The dlstp/letp loop now looks like:

```
        dlstp.16        lr, r3
.L14:
        mov     r3, r0
        vldrh.16        q3, [r3]
        mov     r3, r1
        vldrh.16        q2, [r3]
        mov     r3, r2
        vadd.i16        q3, q3, q2
        adds    r0, r0, #16
        vstrh.16        q3, [r3]
        adds    r1, r1, #16
        adds    r2, r2, #16
        letp    lr, .L14
```

Since the loop tail predication is automatic, we have eliminated the VCTP that had been specified by the user in the intrinsic and converted the VPT-predicated instructions into their unpredicated equivalents (which also saves us from the VPST insns). The LETP instruction here decrements LR by 8 in each iteration.

--- This 1/2 patch ---

This first patch lays some groundwork by adding an attribute to md patterns; the second patch contains the functional changes.

One major difficulty in implementing MVE Tail-Predicated Low Overhead Loops was the need to transform VPT-predicated insns in the insn chain into their unpredicated equivalents, like: `mve_vldrbq_z_ -> mve_vldrbq_`. This requires us to have a deterministic link between two different patterns in mve.md -- this _could_ be done by re-ordering the entirety of mve.md such that the patterns are at some constant icode proximity (e.g.
having the _z version immediately after the unpredicated version would mean that to map from the former to the latter you could use icode-1), but that is a very messy solution that would lead to fragile, implicit dependencies between patterns. This patch provides an alternative way of doing that: using an insn attribute to encode the icode of the unpredicated instruction.

This was implemented by doing a find-and-replace across mve.md using the following patterns:

```
define_insn "(.*)_p_(.*)"((.|\n)*?)\n( )*\[\(set_attr
define_insn "$1_p_$2"$3\n$5[(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_$1_$2"))\n$5 (set_attr

define_insn "(.*)_m_(.*)"((.|\n)*?)\n( )*\[\(set_attr
define_insn "$1_m_$2"$3\n$5[(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_$1_$2"))\n$5 (set_attr

define_insn "(.*)_z_(.*)"((.|\n)*?)\n( )*\[\(set_attr
define_insn "$1_z_$2"$3\n$5[(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_$1_$2"))\n$5 (set_attr
```

and then a number of manual fixes were needed for the md patterns that did not conform to the above.
Those changes were:

Dropped the type suffix _s/_u/_f:
CODE_FOR_mve_vcmpcsq_n_
CODE_FOR_mve_vcmpcsq_
CODE_FOR_mve_vcmpeqq_n_
CODE_FOR_mve_vcmpeqq_
CODE_FOR_mve_vcmpgeq_n_
CODE_FOR_mve_vcmpgeq_
CODE_FOR_mve_vcmpgtq_n_
CODE_FOR_mve_vcmpgtq_
CODE_FOR_mve_vcmphiq_n_
CODE_FOR_mve_vcmphiq_
CODE_FOR_mve_vcmpleq_n_
CODE_FOR_mve_vcmpleq_
CODE_FOR_mve_vcmpltq_n_
CODE_FOR_mve_vcmpltq_
CODE_FOR_mve_vcmpneq_n_
CODE_FOR_mve_vcmpneq_
CODE_FOR_mve_vaddq
CODE_FOR_mve_vcaddq_rot270
CODE_FOR_mve_vcaddq_rot90
CODE_FOR_mve_vcaddq_rot270
CODE_FOR_mve_vcaddq_rot90
CODE_FOR_mve_vcmlaq
CODE_FOR_mve_vcmlaq_rot180
CODE_FOR_mve_vcmlaq_rot270
CODE_FOR_mve_vcmlaq_rot90
CODE_FOR_mve_vcmulq
CODE_FOR_mve_vcmulq_rot180
CODE_FOR_mve_vcmulq_rot270
CODE_FOR_mve_vcmulq_rot90

Dropped _wb_:
CODE_FOR_mve_vidupq_u_insn
CODE_FOR_mve_vddupq_u_insn

Dropped one underscore character:
CODE_FOR_arm_vcx1qv16qi
CODE_FOR_arm_vcx2qv16qi
CODE_FOR_arm_vcx3qv16qi

No regressions on arm-none-eabi with an MVE target.

Thank you,
Stam Markianos-Wright

gcc/ChangeLog:

        * config/arm/arm.md (mve_unpredicated_insn): New attribute.
        * config/arm/mve.md (mve_vrndq_m_f): Add attribute.
        (mve_vaddlvq_p_v4si): Likewise.
        (mve_vaddvq_p_): Likewise.
        (mve_vbicq_m_n_): Likewise.
        (mve_vcmpeqq_m_f): Likewise.
        (mve_vcvtaq_m_): Likewise.
        (mve_vcvtq_m_to_f_): Likewise.
        (mve_vabsq_m_s): Likewise.
        (mve_vaddvaq_p_): Likewise.
        (mve_vclsq_m_s): Likewise.
        (mve_vclzq_m_): Likewise.
        (mve_vcmpcsq_m_n_u): Likewise.
        (mve_vcmpcsq_m_u): Likewise.
        (mve_vcmpeqq_m_n_): Likewise.
        (mve_vcmpeqq_m_): Likewise.
        (mve_vcmpgeq_m_n_s): Likewise.
        (mve_vcmpgeq_m_s): Likewise.
        (mve_vcmpgtq_m_n_s): Likewise.
        (mve_vcmpgtq_m_s): Likewise.
        (mve_vcmphiq_m_n_u): Likewise.
        (mve_vcmphiq_m_u): Likewise.
        (mve_vcmpleq_m_n_s): Likewise.
        (mve_vcmpleq_m_s): Likewise.
        (mve_vcmpltq_m_n_s): Likewise.
(mve_vcmpltq_m_s): Likewise.        (mve_vcmpneq_m_n_): Likewise.        (mve_vcmpneq_m_): Likewise.        (mve_vdupq_m_n_): Likewise.        (mve_vmaxaq_m_s): Likewise.        (mve_vmaxavq_p_s): Likewise.        (mve_vmaxvq_p_): Likewise.        (mve_vminaq_m_s): Likewise.        (mve_vminavq_p_s): Likewise.        (mve_vminvq_p_): Likewise.        (mve_vmladavq_p_): Likewise.        (mve_vmladavxq_p_s): Likewise.        (mve_vmlsdavq_p_s): Likewise.        (mve_vmlsdavxq_p_s): Likewise.        (mve_vmvnq_m_): Likewise.        (mve_vnegq_m_s): Likewise.        (mve_vqabsq_m_s): Likewise.        (mve_vqnegq_m_s): Likewise.        (mve_vqrshlq_m_n_): Likewise.        (mve_vqshlq_m_r_): Likewise.        (mve_vrev64q_m_): Likewise.        (mve_vrshlq_m_n_): Likewise.        (mve_vshlq_m_r_): Likewise.        (mve_vabsq_m_f): Likewise.        (mve_vaddlvaq_p_v4si): Likewise.        (mve_vcmpeqq_m_n_f): Likewise.        (mve_vcmpgeq_m_f): Likewise.        (mve_vcmpgeq_m_n_f): Likewise.        (mve_vcmpgtq_m_f): Likewise.        (mve_vcmpgtq_m_n_f): Likewise.        (mve_vcmpleq_m_f): Likewise.        (mve_vcmpleq_m_n_f): Likewise.        (mve_vcmpltq_m_f): Likewise.        (mve_vcmpltq_m_n_f): Likewise.        (mve_vcmpneq_m_f): Likewise.        (mve_vcmpneq_m_n_f): Likewise.        (mve_vcvtbq_m_f16_f32v8hf): Likewise.        (mve_vcvtbq_m_f32_f16v4sf): Likewise.        (mve_vcvttq_m_f16_f32v8hf): Likewise.        (mve_vcvttq_m_f32_f16v4sf): Likewise.        (mve_vdupq_m_n_f): Likewise.        (mve_vmaxnmaq_m_f): Likewise.        (mve_vmaxnmavq_p_f): Likewise.        (mve_vmaxnmvq_p_f): Likewise.        (mve_vminnmaq_m_f): Likewise.        (mve_vminnmavq_p_f): Likewise.        (mve_vminnmvq_p_f): Likewise.        (mve_vmlaldavq_p_): Likewise.        (mve_vmlaldavxq_p_s): Likewise.        (mve_vmlsldavq_p_s): Likewise.        (mve_vmlsldavxq_p_s): Likewise.        (mve_vmovlbq_m_): Likewise.        (mve_vmovltq_m_): Likewise.        (mve_vmovnbq_m_): Likewise.        
(mve_vmovntq_m_): Likewise.        (mve_vmvnq_m_n_): Likewise.        (mve_vnegq_m_f): Likewise.        (mve_vorrq_m_n_): Likewise.        (mve_vqmovnbq_m_): Likewise.        (mve_vqmovntq_m_): Likewise.        (mve_vqmovunbq_m_s): Likewise.        (mve_vqmovuntq_m_s): Likewise.        (mve_vrev32q_m_fv8hf): Likewise.        (mve_vrev32q_m_): Likewise.        (mve_vrev64q_m_f): Likewise.        (mve_vrmlaldavhxq_p_sv4si): Likewise.        (mve_vrmlsldavhq_p_sv4si): Likewise.        (mve_vrmlsldavhxq_p_sv4si): Likewise.        (mve_vrndaq_m_f): Likewise.        (mve_vrndmq_m_f): Likewise.        (mve_vrndnq_m_f): Likewise.        (mve_vrndpq_m_f): Likewise.        (mve_vrndxq_m_f): Likewise.        (mve_vcvtmq_m_): Likewise.        (mve_vcvtpq_m_): Likewise.        (mve_vcvtnq_m_): Likewise.        (mve_vcvtq_m_n_from_f_): Likewise.        (mve_vrev16q_m_v16qi): Likewise.        (mve_vcvtq_m_from_f_): Likewise.        (mve_vrmlaldavhq_p_v4si): Likewise.        (mve_vabavq_p_): Likewise.        (mve_vqshluq_m_n_s): Likewise.        (mve_vshlq_m_): Likewise.        (mve_vsriq_m_n_): Likewise.        (mve_vsubq_m_): Likewise.        (mve_vcvtq_m_n_to_f_): Likewise.        (mve_vabdq_m_): Likewise.        (mve_vaddq_m_n_): Likewise.        (mve_vaddq_m_): Likewise.        (mve_vandq_m_): Likewise.        (mve_vbicq_m_): Likewise.        (mve_vbrsrq_m_n_): Likewise.        (mve_vcaddq_rot270_m_): Likewise.        (mve_vcaddq_rot90_m_): Likewise.        (mve_veorq_m_): Likewise.        (mve_vhaddq_m_n_): Likewise.        (mve_vhaddq_m_): Likewise.        (mve_vhsubq_m_n_): Likewise.        (mve_vhsubq_m_): Likewise.        (mve_vmaxq_m_): Likewise.        (mve_vminq_m_): Likewise.        (mve_vmladavaq_p_): Likewise.        (mve_vmlaq_m_n_): Likewise.        (mve_vmlasq_m_n_): Likewise.        (mve_vmulhq_m_): Likewise.        (mve_vmullbq_int_m_): Likewise.        (mve_vmulltq_int_m_): Likewise.        (mve_vmulq_m_n_): Likewise.        (mve_vmulq_m_): Likewise.        
(mve_vornq_m_): Likewise.        (mve_vorrq_m_): Likewise.        (mve_vqaddq_m_n_): Likewise.        (mve_vqaddq_m_): Likewise.        (mve_vqdmlahq_m_n_s): Likewise.        (mve_vqdmlashq_m_n_s): Likewise.        (mve_vqrdmlahq_m_n_s): Likewise.        (mve_vqrdmlashq_m_n_s): Likewise.        (mve_vqrshlq_m_): Likewise.        (mve_vqshlq_m_n_): Likewise.        (mve_vqshlq_m_): Likewise.        (mve_vqsubq_m_n_): Likewise.        (mve_vqsubq_m_): Likewise.        (mve_vrhaddq_m_): Likewise.        (mve_vrmulhq_m_): Likewise.        (mve_vrshlq_m_): Likewise.        (mve_vrshrq_m_n_): Likewise.        (mve_vshlq_m_n_): Likewise.        (mve_vshrq_m_n_): Likewise.        (mve_vsliq_m_n_): Likewise.        (mve_vsubq_m_n_): Likewise.        (mve_vhcaddq_rot270_m_s): Likewise.        (mve_vhcaddq_rot90_m_s): Likewise.        (mve_vmladavaxq_p_s): Likewise.        (mve_vmlsdavaq_p_s): Likewise.        (mve_vmlsdavaxq_p_s): Likewise.        (mve_vqdmladhq_m_s): Likewise.        (mve_vqdmladhxq_m_s): Likewise.        (mve_vqdmlsdhq_m_s): Likewise.        (mve_vqdmlsdhxq_m_s): Likewise.        (mve_vqdmulhq_m_n_s): Likewise.        (mve_vqdmulhq_m_s): Likewise.        (mve_vqrdmladhq_m_s): Likewise.        (mve_vqrdmladhxq_m_s): Likewise.        (mve_vqrdmlsdhq_m_s): Likewise.        (mve_vqrdmlsdhxq_m_s): Likewise.        (mve_vqrdmulhq_m_n_s): Likewise.        (mve_vqrdmulhq_m_s): Likewise.        (mve_vmlaldavaq_p_): Likewise.        (mve_vmlaldavaxq_p_): Likewise.        (mve_vqrshrnbq_m_n_): Likewise.        (mve_vqrshrntq_m_n_): Likewise.        (mve_vqshrnbq_m_n_): Likewise.        (mve_vqshrntq_m_n_): Likewise.        (mve_vrmlaldavhaq_p_sv4si): Likewise.        (mve_vrshrnbq_m_n_): Likewise.        (mve_vrshrntq_m_n_): Likewise.        (mve_vshllbq_m_n_): Likewise.        (mve_vshlltq_m_n_): Likewise.        (mve_vshrnbq_m_n_): Likewise.        (mve_vshrntq_m_n_): Likewise.        (mve_vmlsldavaq_p_s): Likewise.        (mve_vmlsldavaxq_p_s): Likewise.        
(mve_vmullbq_poly_m_p): Likewise.        (mve_vmulltq_poly_m_p): Likewise.        (mve_vqdmullbq_m_n_s): Likewise.        (mve_vqdmullbq_m_s): Likewise.        (mve_vqdmulltq_m_n_s): Likewise.        (mve_vqdmulltq_m_s): Likewise.        (mve_vqrshrunbq_m_n_s): Likewise.        (mve_vqrshruntq_m_n_s): Likewise.        (mve_vqshrunbq_m_n_s): Likewise.        (mve_vqshruntq_m_n_s): Likewise.        (mve_vrmlaldavhaq_p_uv4si): Likewise.        (mve_vrmlaldavhaxq_p_sv4si): Likewise.        (mve_vrmlsldavhaq_p_sv4si): Likewise.        (mve_vrmlsldavhaxq_p_sv4si): Likewise.        (mve_vabdq_m_f): Likewise.        (mve_vaddq_m_f): Likewise.        (mve_vaddq_m_n_f): Likewise.        (mve_vandq_m_f): Likewise.        (mve_vbicq_m_f): Likewise.        (mve_vbrsrq_m_n_f): Likewise.        (mve_vcaddq_rot270_m_f): Likewise.        (mve_vcaddq_rot90_m_f): Likewise.        (mve_vcmlaq_m_f): Likewise.        (mve_vcmlaq_rot180_m_f): Likewise.        (mve_vcmlaq_rot270_m_f): Likewise.        (mve_vcmlaq_rot90_m_f): Likewise.        (mve_vcmulq_m_f): Likewise.        (mve_vcmulq_rot180_m_f): Likewise.        (mve_vcmulq_rot270_m_f): Likewise.        (mve_vcmulq_rot90_m_f): Likewise.        (mve_veorq_m_f): Likewise.        (mve_vfmaq_m_f): Likewise.        (mve_vfmaq_m_n_f): Likewise.        (mve_vfmasq_m_n_f): Likewise.        (mve_vfmsq_m_f): Likewise.        (mve_vmaxnmq_m_f): Likewise.        (mve_vminnmq_m_f): Likewise.        (mve_vmulq_m_f): Likewise.        (mve_vmulq_m_n_f): Likewise.        (mve_vornq_m_f): Likewise.        (mve_vorrq_m_f): Likewise.        (mve_vsubq_m_f): Likewise.        (mve_vsubq_m_n_f): Likewise. (mve_vstrbq_scatter_offset_p__insn): Likewise.        (mve_vstrwq_scatter_base_p_v4si): Likewise.        (mve_vstrbq_p_): Likewise.        (mve_vldrbq_gather_offset_z_): Likewise.        (mve_vldrbq_z_): Likewise.        (mve_vldrwq_gather_base_z_v4si): Likewise.        (mve_vldrhq_gather_offset_z_): Likewise. 
(mve_vldrhq_gather_shifted_offset_z_): Likewise.        (mve_vldrhq_z_fv8hf): Likewise.        (mve_vldrhq_z_): Likewise.        (mve_vldrwq_z_fv4sf): Likewise.        (mve_vldrwq_z_v4si): Likewise.        (mve_vldrdq_gather_base_z_v2di): Likewise.        (mve_vldrdq_gather_offset_z_v2di): Likewise.        (mve_vldrdq_gather_shifted_offset_z_v2di): Likewise.        (mve_vldrhq_gather_offset_z_fv8hf): Likewise.        (mve_vldrhq_gather_shifted_offset_z_fv8hf): Likewise.        (mve_vldrwq_gather_base_z_fv4sf): Likewise.        (mve_vldrwq_gather_offset_z_fv4sf): Likewise.        (mve_vldrwq_gather_offset_z_v4si): Likewise.        (mve_vldrwq_gather_shifted_offset_z_fv4sf): Likewise.        (mve_vldrwq_gather_shifted_offset_z_v4si): Likewise.        (mve_vstrhq_p_fv8hf): Likewise.        (mve_vstrhq_p_): Likewise. (mve_vstrhq_scatter_offset_p__insn): Likewise. (mve_vstrhq_scatter_shifted_offset_p__insn): Likewise.        (mve_vstrwq_p_fv4sf): Likewise.        (mve_vstrwq_p_v4si): Likewise.        (mve_vstrdq_scatter_base_p_v2di): Likewise.        (mve_vstrdq_scatter_offset_p_v2di_insn): Likewise. (mve_vstrdq_scatter_shifted_offset_p_v2di_insn): Likewise.        (mve_vstrhq_scatter_offset_p_fv8hf_insn): Likewise.        (mve_vstrhq_scatter_shifted_offset_p_fv8hf_insn): Likewise.        (mve_vstrwq_scatter_base_p_fv4sf): Likewise.        (mve_vstrwq_scatter_offset_p_fv4sf_insn): Likewise.        (mve_vstrwq_scatter_offset_p_v4si_insn): Likewise.        (mve_vstrwq_scatter_shifted_offset_p_fv4sf_insn): Likewise. (mve_vstrwq_scatter_shifted_offset_p_v4si_insn): Likewise.        (mve_vidupq_m_wb_u_insn): Likewise.        (mve_vddupq_m_wb_u_insn): Likewise.        (mve_vdwdupq_m_wb_u_insn): Likewise.        (mve_viwdupq_m_wb_u_insn): Likewise.        (mve_vstrwq_scatter_base_wb_p_v4si): Likewise.        (mve_vstrwq_scatter_base_wb_p_fv4sf): Likewise.        (mve_vstrdq_scatter_base_wb_p_v2di): Likewise.        (mve_vldrwq_gather_base_wb_z_v4si_insn): Likewise.        
(mve_vldrwq_gather_base_wb_z_fv4sf_insn): Likewise.        (mve_vldrdq_gather_base_wb_z_v2di_insn): Likewise.        (mve_vadciq_m_v4si): Likewise.        (mve_vadcq_m_v4si): Likewise.        (mve_vsbciq_m_v4si): Likewise.        (mve_vsbcq_m_v4si): Likewise.        (mve_vshlcq_m_): Likewise.        (arm_vcx1q_p_v16qi): Likewise.        (arm_vcx2q_p_v16qi): Likewise.        (arm_vcx3q_p_v16qi): Likewise. gcc/testsuite/ChangeLog:         * gcc.target/arm/dlstp-compile-asm.c: New test. #### Inline copy of patch ### diff --git a/gcc/config/arm/arm.md b/gcc/config/arm/arm.md index 69bf343fb0ed601014979cfc1803abe84c87f179..e1d2e62593085accfcc111cf6fa5795e4520f213 100644 --- a/gcc/config/arm/arm.md +++ b/gcc/config/arm/arm.md @@ -123,6 +123,8 @@  ; and not all ARM insns do.  (define_attr "predicated" "yes,no" (const_string "no")) +(define_attr "mve_unpredicated_insn" "" (const_int 0)) +  ; LENGTH of an instruction (in bytes)  (define_attr "length" ""    (const_int 4)) diff --git a/gcc/config/arm/mve.md b/gcc/config/arm/mve.md index 62186f124da183fe1b1eb57a1aea1e8fff680a22..b1c8c1c569f31a6cb1bfdc16394047f02d6cddf4 100644 --- a/gcc/config/arm/mve.md +++ b/gcc/config/arm/mve.md @@ -142,7 +142,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vrintzt.f%# %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -818,7 +819,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vaddlvt.32 %Q0, %R0, %q1" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddlvq_v4si")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -910,7 +912,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vaddvt.%#    %0, %q1" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddvq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -2560,7 +2563,8 @@ 
   ]    "TARGET_HAVE_MVE"    "vpst\;vbict.i%#    %q0, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbicq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;;  ;; [vcmpeqq_m_f]) @@ -2575,7 +2579,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcmpt.f%#    eq, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpeqq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;;  ;; [vcvtaq_m_u, vcvtaq_m_s]) @@ -2590,7 +2595,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvtat.%#.f%#\t%q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtaq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;;  ;; [vcvtq_m_to_f_s, vcvtq_m_to_f_u]) @@ -2605,7 +2611,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvtt.f%#.%#  %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_to_f_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;;  ;; [vqrshrnbq_n_u, vqrshrnbq_n_s]) @@ -2727,7 +2734,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vabst.s%#    %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vabsq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -2743,7 +2751,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vaddvat.%#    %0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddvaq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -2759,7 +2768,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vclst.s%#    %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vclsq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -2775,7 +2785,8 @@    ]    
"TARGET_HAVE_MVE"    "vpst\;vclzt.i%#    %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vclzq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -2791,7 +2802,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vcmpt.u%#    cs, %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpcsq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -2807,7 +2819,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vcmpt.u%#    cs, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpcsq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -2823,7 +2836,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vcmpt.i%#    eq, %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpeqq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -2839,7 +2853,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vcmpt.i%#    eq, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpeqq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -2855,7 +2870,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vcmpt.s%#    ge, %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpgeq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -2871,7 +2887,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vcmpt.s%#    ge, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpgeq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -2887,7 +2904,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vcmpt.s%#    gt, %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpgtq_n_")) +  (set_attr 
"type" "mve_move")     (set_attr "length""8")])  ;; @@ -2903,7 +2921,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vcmpt.s%#    gt, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpgtq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -2919,7 +2938,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vcmpt.u%#    hi, %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmphiq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -2935,7 +2955,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vcmpt.u%#    hi, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmphiq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -2951,7 +2972,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vcmpt.s%#    le, %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpleq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -2967,7 +2989,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vcmpt.s%#    le, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpleq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -2983,7 +3006,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vcmpt.s%#    lt, %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpltq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -2999,7 +3023,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vcmpt.s%#    lt, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpltq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3015,7 +3040,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vcmpt.i%#    ne, %q1, %2" -  [(set_attr "type" "mve_move") + [(set 
(attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpneq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3031,7 +3057,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vcmpt.i%#    ne, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpneq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3047,7 +3074,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vdupt.%#    %q0, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vdupq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3063,7 +3091,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmaxat.s%#    %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxaq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3079,7 +3108,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmaxavt.s%#    %0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxavq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3095,7 +3125,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmaxvt.%#    %0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxvq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3111,7 +3142,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vminat.s%#    %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminaq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3127,7 +3159,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vminavt.s%#    %0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminavq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3143,7 +3176,8 @@    ]    "TARGET_HAVE_MVE"    
"vpst\;vminvt.%#\t%0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminvq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3175,7 +3209,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmladavt.%#\t%0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmladavq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3191,7 +3226,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmladavxt.s%#\t%0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmladavxq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3239,7 +3275,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmlsdavt.s%#    %0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsdavq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3255,7 +3292,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmlsdavxt.s%#    %0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsdavxq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3271,7 +3309,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmvnt %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmvnq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3287,7 +3326,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vnegt.s%#\t%q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vnegq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3319,7 +3359,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqabst.s%#\t%q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqabsq_s")) +  (set_attr "type" "mve_move")     (set_attr 
"length""8")])  ;; @@ -3367,7 +3408,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqnegt.s%#    %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqnegq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3479,7 +3521,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqrshlt.%#    %q0, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshlq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3495,7 +3538,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqshlt.%#\t%q0, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshlq_r_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3511,7 +3555,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vrev64t.%#\t%q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrev64q_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3527,7 +3572,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vrshlt.%#\t%q0, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrshlq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3543,7 +3589,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vshlt.%#\t%q0, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlq_r_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3702,7 +3749,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vabst.f%#    %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vabsq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3718,7 +3766,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vaddlvat.32 %Q0, %R0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vaddlvaq_v4si")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;;  ;; [vcmlaq, vcmlaq_rot90, vcmlaq_rot180, vcmlaq_rot270]) @@ -3752,7 +3801,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcmpt.f%#    eq, %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpeqq_n_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3768,7 +3818,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcmpt.f%#    ge, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpgeq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3784,7 +3835,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcmpt.f%#    ge, %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpgeq_n_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3800,7 +3852,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcmpt.f%#    gt, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpgtq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3816,7 +3869,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcmpt.f%#    gt, %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpgtq_n_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3832,7 +3886,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcmpt.f%#    le, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpleq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3848,7 +3903,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcmpt.f%#    le, %q1, %2" -  [(set_attr 
"type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpleq_n_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3864,7 +3920,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcmpt.f%#    lt, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpltq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3880,7 +3937,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcmpt.f%#    lt, %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpltq_n_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3896,7 +3954,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcmpt.f%#    ne, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpneq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3912,7 +3971,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcmpt.f%#    ne, %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpneq_n_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3928,7 +3988,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcvtbt.f16.f32 %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f16_f32v8hf")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3944,7 +4005,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcvtbt.f32.f16 %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f32_f16v4sf")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3960,7 +4022,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcvttt.f16.f32 %q0, 
%q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f16_f32v8hf")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3976,7 +4039,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcvttt.f32.f16 %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f32_f16v4sf")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -3992,7 +4056,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vdupt.%#    %q0, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vdupq_n_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4071,7 +4136,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vmaxnmat.f%#    %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxnmaq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;;  ;; [vmaxnmavq_p_f]) @@ -4086,7 +4152,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vmaxnmavt.f%#    %0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxnmavq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4102,7 +4169,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vmaxnmvt.f%#    %0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxnmvq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;;  ;; [vminnmaq_m_f]) @@ -4117,7 +4185,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vminnmat.f%#    %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminnmaq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4133,7 +4202,8 @@    ]    "TARGET_HAVE_MVE && 
TARGET_HAVE_MVE_FLOAT"    "vpst\;vminnmavt.f%#    %0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminnmavq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;;  ;; [vminnmvq_p_f]) @@ -4148,7 +4218,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vminnmvt.f%#    %0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminnmvq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4196,7 +4267,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmlaldavt.%# %Q0, %R0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlaldavq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4212,7 +4284,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmlaldavxt.s%#\t%Q0, %R0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlaldavxq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;;  ;; [vmlsldavaq_s]) @@ -4259,7 +4332,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmlsldavt.s%# %Q0, %R0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsldavq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4275,7 +4349,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmlsldavxt.s%# %Q0, %R0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsldavxq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;;  ;; [vmovlbq_m_u, vmovlbq_m_s]) @@ -4290,7 +4365,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmovlbt.%#    %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmovlbq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;;  ;; [vmovltq_m_u, vmovltq_m_s]) @@ -4305,7 +4381,8 @@    ]    
"TARGET_HAVE_MVE"    "vpst\;vmovltt.%#    %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmovltq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;;  ;; [vmovnbq_m_u, vmovnbq_m_s]) @@ -4320,7 +4397,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmovnbt.i%#    %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmovnbq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4336,7 +4414,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmovntt.i%#    %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmovntq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4352,7 +4431,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmvnt.i%#    %q0, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmvnq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;;  ;; [vnegq_m_f]) @@ -4367,7 +4447,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vnegt.f%#    %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vnegq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4383,7 +4464,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vorrt.i%#    %q0, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vorrq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;;  ;; [vpselq_f]) @@ -4414,7 +4496,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqmovnbt.%#    %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqmovnbq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4430,7 +4513,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqmovntt.%#    %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr 
"mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqmovntq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4446,7 +4530,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqmovunbt.s%#    %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqmovunbq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4462,7 +4547,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqmovuntt.s%#    %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqmovuntq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4574,7 +4660,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vrev32t.16 %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrev32q_fv8hf")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4590,7 +4677,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vrev32t.%#    %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrev32q_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4606,7 +4694,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vrev64t.%#    %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrev64q_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4638,7 +4727,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vrmlaldavhxt.s32 %Q0, %R0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlaldavhxq_sv4si")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4670,7 +4760,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vrmlsldavht.s32 %Q0, %R0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlsldavhq_sv4si")) +  (set_attr "type" 
"mve_move")     (set_attr "length""8")])  ;; @@ -4686,7 +4777,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vrmlsldavhxt.s32 %Q0, %R0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlsldavhxq_sv4si")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4702,7 +4794,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vrintat.f%#    %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndaq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4718,7 +4811,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vrintmt.f%#    %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndmq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4734,7 +4828,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vrintnt.f%#    %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndnq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4750,7 +4845,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vrintpt.f%#    %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndpq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4766,7 +4862,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vrintxt.f%#    %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndxq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4846,7 +4943,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvtmt.%#.f%#\t%q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtmq_")) +  (set_attr "type" "mve_move")     (set_attr 
"length""8")])  ;; @@ -4862,7 +4960,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvtpt.%#.f%#\t%q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtpq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4878,7 +4977,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvtnt.%#.f%#\t%q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtnq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4895,7 +4995,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvtt.%#.f%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_from_f_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4911,7 +5012,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vrev16t.8 %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrev16q_v16qi")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4927,7 +5029,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvtt.%#.f%#\t%q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_from_f_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4943,7 +5046,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vrmlaldavht.32 %Q0, %R0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlaldavhq_v4si")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4976,7 +5080,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vabavt.%#\t%0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vabavq_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -4993,7 +5098,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\n\tvqshlut.s%#\t%q0, %q2, 
%3" -  [(set_attr "type" "mve_move")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshluq_n_s")) +  (set_attr "type" "mve_move")])  ;;  ;; [vshlq_m_s, vshlq_m_u]) @@ -5009,7 +5115,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vshlt.%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlq_")) +  (set_attr "type" "mve_move")])  ;;  ;; [vsriq_m_n_s, vsriq_m_n_u]) @@ -5025,7 +5132,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vsrit.%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsriq_n_")) +  (set_attr "type" "mve_move")])  ;;  ;; [vsubq_m_u, vsubq_m_s]) @@ -5041,7 +5149,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vsubt.i%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsubq_")) +  (set_attr "type" "mve_move")])  ;;  ;; [vcvtq_m_n_to_f_u, vcvtq_m_n_to_f_s]) @@ -5057,7 +5166,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvtt.f%#.%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_to_f_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;;  ;; [vabdq_m_s, vabdq_m_u]) @@ -5073,7 +5183,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vabdt.%#    %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vabdq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5090,7 +5201,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vaddt.i%#    %q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5107,7 +5219,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vaddt.i%#    %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vaddq")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5124,7 +5237,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vandt %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vandq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5141,7 +5255,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vbict %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbicq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5158,7 +5273,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vbrsrt.%#    %q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbrsrq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5175,7 +5291,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vcaddt.i%#    %q0, %q2, %q3, #270" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcaddq_rot270")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5192,7 +5309,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vcaddt.i%#    %q0, %q2, %q3, #90" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcaddq_rot90")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5209,7 +5327,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;veort %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_veorq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5226,7 +5345,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vhaddt.%#    %q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhaddq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5243,7 +5363,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vhaddt.%#    %q0, %q2, %q3" -  
[(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhaddq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5260,7 +5381,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vhsubt.%#    %q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhsubq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5277,7 +5399,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vhsubt.%#    %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhsubq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5294,7 +5417,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmaxt.%#    %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5311,7 +5435,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmint.%#    %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5328,7 +5453,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmladavat.%#    %0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmladavaq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5345,7 +5471,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmlat.%#    %q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlaq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5362,7 +5489,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmlast.%#    %q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlasq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ 
-5379,7 +5507,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmulht.%#    %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulhq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5396,7 +5525,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmullbt.%#    %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmullbq_int_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5413,7 +5543,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmulltt.%#    %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulltq_int_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5430,7 +5561,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmult.i%#    %q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5447,7 +5579,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmult.i%#    %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5464,7 +5597,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vornt %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5481,7 +5615,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vorrt %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vorrq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5498,7 +5633,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqaddt.%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqaddq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5515,7 +5651,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqaddt.%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqaddq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5532,7 +5669,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqdmlaht.s%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmlahq_n_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5549,7 +5687,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqdmlasht.s%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmlashq_n_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5566,7 +5705,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqrdmlaht.s%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmlahq_n_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5583,7 +5723,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqrdmlasht.s%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmlashq_n_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5600,7 +5741,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqrshlt.%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshlq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5617,7 +5759,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqshlt.%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshlq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5634,7 +5777,8 @@    ]    "TARGET_HAVE_MVE"    
"vpst\;vqshlt.%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshlq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5651,7 +5795,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqsubt.%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqsubq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5668,7 +5813,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqsubt.%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqsubq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5685,7 +5831,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vrhaddt.%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrhaddq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5702,7 +5849,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vrmulht.%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmulhq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5719,7 +5867,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vrshlt.%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrshlq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5736,7 +5885,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vrshrt.%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrshrq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5753,7 +5903,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vshlt.%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlq_n_")) +  (set_attr "type" "mve_move")     (set_attr 
"length""8")])  ;; @@ -5770,7 +5921,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vshrt.%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshrq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5787,7 +5939,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vslit.%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsliq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5804,7 +5957,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vsubt.i%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsubq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5821,7 +5975,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vhcaddt.s%#\t%q0, %q2, %q3, #270" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhcaddq_rot270_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5838,7 +5993,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vhcaddt.s%#\t%q0, %q2, %q3, #90" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhcaddq_rot90_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5855,7 +6011,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmladavaxt.s%#\t%0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmladavaxq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5872,7 +6029,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmlsdavat.s%#\t%0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsdavaq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5889,7 +6047,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmlsdavaxt.s%#\t%0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set 
(attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsdavaxq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5906,7 +6065,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqdmladht.s%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmladhq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5923,7 +6083,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqdmladhxt.s%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmladhxq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5940,7 +6101,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqdmlsdht.s%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmlsdhq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5957,7 +6119,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqdmlsdhxt.s%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmlsdhxq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5974,7 +6137,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqdmulht.s%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmulhq_n_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5991,7 +6155,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqdmulht.s%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmulhq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6008,7 +6173,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqrdmladht.s%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmladhq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  
;; @@ -6025,7 +6191,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqrdmladhxt.s%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmladhxq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6042,7 +6209,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqrdmlsdht.s%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmlsdhq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6059,7 +6227,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqrdmlsdhxt.s%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmlsdhxq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6076,7 +6245,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqrdmulht.s%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmulhq_n_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6093,7 +6263,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqrdmulht.s%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmulhq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6110,7 +6281,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmlaldavat.%#    %Q0, %R0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlaldavaq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6127,7 +6299,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmlaldavaxt.%# %Q0, %R0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlaldavaxq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6144,7 +6317,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqrshrnbt.%#    %q0, %q2, %3" -  [(set_attr "type" 
"mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshrnbq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6161,7 +6335,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqrshrntt.%#    %q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshrntq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6178,7 +6353,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\n\tvqshrnbt.%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshrnbq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6195,7 +6371,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqshrntt.%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshrntq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6212,7 +6389,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vrmlaldavhat.s32\t%Q0, %R0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlaldavhaq_sv4si")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6229,7 +6407,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vrshrnbt.i%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrshrnbq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6246,7 +6425,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vrshrntt.i%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrshrntq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6263,7 +6443,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vshllbt.%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshllbq_n_")) +  (set_attr "type" "mve_move")     
(set_attr "length""8")])  ;; @@ -6280,7 +6461,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vshlltt.%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlltq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6297,7 +6479,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vshrnbt.i%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshrnbq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6314,7 +6497,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vshrntt.i%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshrntq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6331,7 +6515,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmlsldavat.s%#\t%Q0, %R0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsldavaq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6348,7 +6533,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmlsldavaxt.s%#\t%Q0, %R0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsldavaxq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6365,7 +6551,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmullbt.p%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmullbq_poly_p")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6382,7 +6569,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmulltt.p%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulltq_poly_p")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6399,7 +6587,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqdmullbt.s%#\t%q0, %q2, %3" -  [(set_attr 
"type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmullbq_n_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6416,7 +6605,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqdmullbt.s%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmullbq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6433,7 +6623,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqdmulltt.s%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmulltq_n_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6450,7 +6641,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqdmulltt.s%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmulltq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6467,7 +6659,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqrshrunbt.s%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshrunbq_n_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6484,7 +6677,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqrshruntt.s%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshruntq_n_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6501,7 +6695,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqshrunbt.s%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshrunbq_n_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6518,7 +6713,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqshruntt.s%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshruntq_n_s")) +  (set_attr "type" 
"mve_move")     (set_attr "length""8")])  ;; @@ -6535,7 +6731,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vrmlaldavhat.u32\t%Q0, %R0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlaldavhaq_uv4si")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6552,7 +6749,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vrmlaldavhaxt.s32\t%Q0, %R0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlaldavhaxq_sv4si")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6569,7 +6767,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vrmlsldavhat.s32\t%Q0, %R0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlsldavhaq_sv4si")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6586,7 +6785,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vrmlsldavhaxt.s32\t%Q0, %R0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlsldavhaxq_sv4si")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;;  ;; [vabdq_m_f]) @@ -6602,7 +6802,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vabdt.f%#    %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vabdq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6619,7 +6820,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vaddt.f%#    %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6636,7 +6838,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vaddt.f%#    %q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddq_n_f")) +  (set_attr 
"type" "mve_move")     (set_attr "length""8")])  ;; @@ -6653,7 +6856,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vandt %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vandq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6670,7 +6874,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vbict %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbicq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6687,7 +6892,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vbrsrt.%#    %q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbrsrq_n_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6704,7 +6910,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcaddt.f%#    %q0, %q2, %q3, #270" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcaddq_rot270")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6721,7 +6928,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcaddt.f%#    %q0, %q2, %q3, #90" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcaddq_rot90")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6738,7 +6946,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcmlat.f%#    %q0, %q2, %q3, #0" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmlaq")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6755,7 +6964,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcmlat.f%#    %q0, %q2, %q3, #180" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmlaq_rot180")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6772,7 +6982,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcmlat.f%#    %q0, %q2, %q3, #270" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmlaq_rot270")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6789,7 +7000,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcmlat.f%#    %q0, %q2, %q3, #90" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmlaq_rot90")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6806,7 +7018,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcmult.f%#    %q0, %q2, %q3, #0" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmulq")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6823,7 +7036,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcmult.f%#    %q0, %q2, %q3, #180" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmulq_rot180")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6840,7 +7054,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcmult.f%#    %q0, %q2, %q3, #270" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmulq_rot270")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6857,7 +7072,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcmult.f%#    %q0, %q2, %q3, #90" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmulq_rot90")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6874,7 +7090,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;veort %q0, %q2, %q3" -  [(set_attr "type" 
"mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_veorq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6891,7 +7108,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vfmat.f%#    %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vfmaq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6908,7 +7126,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vfmat.f%#    %q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vfmaq_n_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6925,7 +7144,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vfmast.f%#    %q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vfmasq_n_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6942,7 +7162,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vfmst.f%#    %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vfmsq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6959,7 +7180,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vmaxnmt.f%#    %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxnmq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6976,7 +7198,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vminnmt.f%#    %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminnmq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6993,7 +7216,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vmult.f%#    %q0, %q2, %q3" -  [(set_attr 
"type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -7010,7 +7234,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vmult.f%#    %q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulq_n_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -7027,7 +7252,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vornt %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -7044,7 +7270,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vorrt %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vorrq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -7061,7 +7288,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vsubt.f%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsubq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -7078,7 +7306,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vsubt.f%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsubq_n_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -7245,7 +7474,8 @@        VSTRBSOQ))]    "TARGET_HAVE_MVE"    "vpst\;vstrbt.\t%q2, [%0, %q1]" -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_scatter_offset__insn")) +  (set_attr "length" "8")])  ;;  ;; [vstrwq_scatter_base_p_s vstrwq_scatter_base_p_u] @@ -7268,7 +7498,8 @@     output_asm_insn ("vpst\n\tvstrwt.u32\t%q2, [%q0, %1]",ops);     return "";  } -  [(set_attr 
"length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_v4si")) +  (set_attr "length" "8")])  ;;  ;; [vstrbq_p_s vstrbq_p_u] @@ -7288,7 +7519,8 @@     output_asm_insn ("vpst\;vstrbt.\t%q1, %E0",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_")) +  (set_attr "length" "8")])  ;;  ;; [vldrbq_gather_offset_z_s vldrbq_gather_offset_z_u] @@ -7313,7 +7545,8 @@       output_asm_insn ("vpst\n\tvldrbt.\t%q0, [%m1, %q2]",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_gather_offset_")) +  (set_attr "length" "8")])  ;;  ;; [vldrbq_z_s vldrbq_z_u] @@ -7336,7 +7569,8 @@       output_asm_insn ("vpst\;vldrbt.\t%q0, %E1",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_")) +  (set_attr "length" "8")])  ;;  ;; [vldrwq_gather_base_z_s vldrwq_gather_base_z_u] @@ -7357,7 +7591,8 @@     output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%q1, %2]",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_v4si")) +  (set_attr "length" "8")])  ;;  ;; [vldrhq_f] @@ -7424,7 +7659,8 @@       output_asm_insn ("vpst\n\tvldrht.\t%q0, [%m1, %q2]",ops);     return "";  } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_")) +  (set_attr "length" "8")])  ;;  ;; [vldrhq_gather_shifted_offset_s vldrhq_gather_shifted_offset_u] @@ -7472,7 +7708,8 @@       output_asm_insn ("vpst\n\tvldrht.\t%q0, [%m1, %q2, uxtw #1]",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_")) +  (set_attr "length" "8")])  ;;  ;; [vldrhq_s, vldrhq_u] @@ -7514,7 +7751,8 @@     output_asm_insn ("vpst\;vldrht.16\t%q0, 
%E1",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_fv8hf")) +  (set_attr "length" "8")])  ;;  ;; [vldrhq_z_s vldrhq_z_u] @@ -7537,7 +7775,8 @@       output_asm_insn ("vpst\;vldrht.\t%q0, %E1",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_")) +  (set_attr "length" "8")])  ;;  ;; [vldrwq_f] @@ -7595,7 +7834,8 @@     output_asm_insn ("vpst\;vldrwt.32\t%q0, %E1",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_fv4sf")) +  (set_attr "length" "8")])  ;;  ;; [vldrwq_z_s vldrwq_z_u] @@ -7615,7 +7855,8 @@     output_asm_insn ("vpst\;vldrwt.32\t%q0, %E1",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_v4si")) +  (set_attr "length" "8")])  (define_expand "mve_vld1q_f"    [(match_operand:MVE_0 0 "s_register_operand") @@ -7676,7 +7917,8 @@     output_asm_insn ("vpst\n\tvldrdt.u64\t%q0, [%q1, %2]",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_v2di")) +  (set_attr "length" "8")])  ;;  ;; [vldrdq_gather_offset_s vldrdq_gather_offset_u] @@ -7717,7 +7959,8 @@    output_asm_insn ("vpst\n\tvldrdt.u64\t%q0, [%m1, %q2]",ops);    return "";  } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_offset_v2di")) +  (set_attr "length" "8")])  ;;  ;; [vldrdq_gather_shifted_offset_s vldrdq_gather_shifted_offset_u] @@ -7758,7 +8001,8 @@     output_asm_insn ("vpst\n\tvldrdt.u64\t%q0, [%m1, %q2, uxtw #3]",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_shifted_offset_v2di")) +  (set_attr "length" "8")])  ;;  ;; [vldrhq_gather_offset_f] @@ -7800,7 +8044,8 @@   
  output_asm_insn ("vpst\n\tvldrht.f16\t%q0, [%m1, %q2]",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_fv8hf")) +  (set_attr "length" "8")])  ;;  ;; [vldrhq_gather_shifted_offset_f] @@ -7842,7 +8087,8 @@     output_asm_insn ("vpst\n\tvldrht.f16\t%q0, [%m1, %q2, uxtw #1]",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_fv8hf")) +  (set_attr "length" "8")])  ;;  ;; [vldrwq_gather_base_f] @@ -7883,7 +8129,8 @@     output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%q1, %2]",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_fv4sf")) +  (set_attr "length" "8")])  ;;  ;; [vldrwq_gather_offset_f] @@ -7945,7 +8192,8 @@     output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2]",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_fv4sf")) +  (set_attr "length" "8")])  ;;  ;; [vldrwq_gather_offset_z_s vldrwq_gather_offset_z_u] @@ -7967,7 +8215,8 @@     output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2]",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_v4si")) +  (set_attr "length" "8")])  ;;  ;; [vldrwq_gather_shifted_offset_f] @@ -8029,7 +8278,8 @@     output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2, uxtw #2]",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_fv4sf")) +  (set_attr "length" "8")])  ;;  ;; [vldrwq_gather_shifted_offset_z_s vldrwq_gather_shifted_offset_z_u] @@ -8051,7 +8301,8 @@     output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2, uxtw #2]",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr 
"mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_v4si")) +  (set_attr "length" "8")])  ;;  ;; [vstrhq_f] @@ -8090,7 +8341,8 @@     output_asm_insn ("vpst\;vstrht.16\t%q1, %E0",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_fv8hf")) +  (set_attr "length" "8")])  ;;  ;; [vstrhq_p_s vstrhq_p_u] @@ -8110,7 +8362,8 @@     output_asm_insn ("vpst\;vstrht.\t%q1, %E0",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_")) +  (set_attr "length" "8")])  ;;  ;; [vstrhq_scatter_offset_p_s vstrhq_scatter_offset_p_u] @@ -8142,7 +8395,8 @@        VSTRHSOQ))]    "TARGET_HAVE_MVE"    "vpst\;vstrht.\t%q2, [%0, %q1]" -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset__insn")) +  (set_attr "length" "8")])  ;;  ;; [vstrhq_scatter_offset_s vstrhq_scatter_offset_u] @@ -8202,7 +8456,8 @@        VSTRHSSOQ))]    "TARGET_HAVE_MVE"    "vpst\;vstrht.\t%q2, [%0, %q1, uxtw #1]" -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset__insn")) +  (set_attr "length" "8")])  ;;  ;; [vstrhq_scatter_shifted_offset_s vstrhq_scatter_shifted_offset_u] @@ -8289,7 +8544,8 @@     output_asm_insn ("vpst\;vstrwt.32\t%q1, %E0",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_fv4sf")) +  (set_attr "length" "8")])  ;;  ;; [vstrwq_p_s vstrwq_p_u] @@ -8309,7 +8565,8 @@     output_asm_insn ("vpst\;vstrwt.32\t%q1, %E0",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_v4si")) +  (set_attr "length" "8")])  ;;  ;; [vstrwq_s vstrwq_u] @@ -8371,7 +8628,8 @@     output_asm_insn ("vpst\;\tvstrdt.u64\t%q2, [%q0, %1]",ops);     return "";  } -  [(set_attr 
"length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_v2di")) +  (set_attr "length" "8")])  ;;  ;; [vstrdq_scatter_base_s vstrdq_scatter_base_u] @@ -8424,7 +8682,8 @@        VSTRDSOQ))]    "TARGET_HAVE_MVE"    "vpst\;vstrdt.64\t%q2, [%0, %q1]" -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_offset_v2di_insn")) +  (set_attr "length" "8")])  ;;  ;; [vstrdq_scatter_offset_s vstrdq_scatter_offset_u] @@ -8484,7 +8743,8 @@        VSTRDSSOQ))]    "TARGET_HAVE_MVE"    "vpst\;vstrdt.64\t%q2, [%0, %q1, UXTW #3]" -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_shifted_offset_v2di_insn")) +  (set_attr "length" "8")])  ;;  ;; [vstrdq_scatter_shifted_offset_s vstrdq_scatter_shifted_offset_u] @@ -8572,7 +8832,8 @@        VSTRHQSO_F))]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vstrht.16\t%q2, [%0, %q1]" -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset_fv8hf_insn")) +  (set_attr "length" "8")])  ;;  ;; [vstrhq_scatter_shifted_offset_f] @@ -8632,7 +8893,8 @@        VSTRHQSSO_F))]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vstrht.16\t%q2, [%0, %q1, uxtw #1]" -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset_fv8hf_insn")) +  (set_attr "length" "8")])  ;;  ;; [vstrwq_scatter_base_f] @@ -8677,7 +8939,8 @@     output_asm_insn ("vpst\n\tvstrwt.u32\t%q2, [%q0, %1]",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_fv4sf")) +  (set_attr "length" "8")])  ;;  ;; [vstrwq_scatter_offset_f] @@ -8736,7 +8999,8 @@        VSTRWQSO_F))]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vstrwt.32\t%q2, [%0, %q1]" -  [(set_attr "length" "8")]) + [(set (attr 
"mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_fv4sf_insn")) +  (set_attr "length" "8")])  ;;  ;; [vstrwq_scatter_offset_s vstrwq_scatter_offset_u] @@ -8767,7 +9031,8 @@        VSTRWSOQ))]    "TARGET_HAVE_MVE"    "vpst\;vstrwt.32\t%q2, [%0, %q1]" -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_v4si_insn")) +  (set_attr "length" "8")])  ;;  ;; [vstrwq_scatter_offset_s vstrwq_scatter_offset_u] @@ -8855,7 +9120,8 @@        VSTRWQSSO_F))]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vstrwt.32\t%q2, [%0, %q1, uxtw #2]" -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_fv4sf_insn")) +  (set_attr "length" "8")])  ;;  ;; [vstrwq_scatter_shifted_offset_p_s vstrwq_scatter_shifted_offset_p_u] @@ -8887,7 +9153,8 @@        VSTRWSSOQ))]    "TARGET_HAVE_MVE"    "vpst\;vstrwt.32\t%q2, [%0, %q1, uxtw #2]" -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_v4si_insn")) +  (set_attr "length" "8")])  ;;  ;; [vstrwq_scatter_shifted_offset_s vstrwq_scatter_shifted_offset_u] @@ -9012,7 +9279,8 @@          (match_operand:SI 6 "immediate_operand" "i")))]   "TARGET_HAVE_MVE"   "vpst\;\tvidupt.u%#\t%q0, %2, %4" - [(set_attr "length""8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vidupq_u_insn")) +  (set_attr "length""8")])  ;;  ;; [vddupq_n_u]) @@ -9080,7 +9348,8 @@           (match_operand:SI 6 "immediate_operand" "i")))]   "TARGET_HAVE_MVE"   "vpst\;\tvddupt.u%#\t%q0, %2, %4" - [(set_attr "length""8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vddupq_u_insn")) +  (set_attr "length""8")])  ;;  ;; [vdwdupq_n_u]) @@ -9196,7 +9465,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;\tvdwdupt.u%#\t%q2, %3, %R4, %5" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vdwdupq_wb_u_insn")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -9313,7 +9583,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;\tviwdupt.u%#\t%q2, %3, %R4, %5" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_viwdupq_wb_u_insn")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -9365,7 +9636,8 @@     output_asm_insn ("vpst\;\tvstrwt.u32\t%q2, [%q0, %1]!",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_v4si")) +  (set_attr "length" "8")])  ;;  ;; [vstrwq_scatter_base_wb_f] @@ -9416,7 +9688,8 @@     output_asm_insn ("vpst\;\tvstrwt.u32\t%q2, [%q0, %1]!",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_fv4sf")) +  (set_attr "length" "8")])  ;;  ;; [vstrdq_scatter_base_wb_s vstrdq_scatter_base_wb_u] @@ -9467,7 +9740,8 @@     output_asm_insn ("vpst;vstrdt.u64\t%q2, [%q0, %1]!",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_wb_v2di")) +  (set_attr "length" "8")])  (define_expand "mve_vldrwq_gather_base_wb_v4si"    [(match_operand:V4SI 0 "s_register_operand") @@ -9575,7 +9849,8 @@     output_asm_insn ("vpst\;vldrwt.u32\t%q0, [%q1, %2]!",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_v4si_insn")) +  (set_attr "length" "8")])  (define_expand "mve_vldrwq_gather_base_wb_fv4sf"    [(match_operand:V4SI 0 "s_register_operand") @@ -9684,7 +9959,8 @@     output_asm_insn ("vpst\;vldrwt.u32\t%q0, [%q1, %2]!",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_fv4sf_insn")) +  (set_attr "length" "8")])  (define_expand 
"mve_vldrdq_gather_base_wb_v2di"    [(match_operand:V2DI 0 "s_register_operand") @@ -9809,7 +10085,8 @@     output_asm_insn ("vpst\;vldrdt.u64\t%q0, [%q1, %2]!",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_wb_v2di_insn")) +  (set_attr "length" "8")])  ;;  ;; [vadciq_m_s, vadciq_m_u])  ;; @@ -9826,7 +10103,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vadcit.i32\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadciq_v4si")) +  (set_attr "type" "mve_move")     (set_attr "length" "8")])  ;; @@ -9862,7 +10140,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vadct.i32\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadcq_v4si")) +  (set_attr "type" "mve_move")     (set_attr "length" "8")])  ;; @@ -9899,7 +10178,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vsbcit.i32\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbciq_v4si")) +  (set_attr "type" "mve_move")     (set_attr "length" "8")])  ;; @@ -9935,7 +10215,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vsbct.i32\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbcq_v4si")) +  (set_attr "type" "mve_move")     (set_attr "length" "8")])  ;; @@ -10352,7 +10633,8 @@   ]   "TARGET_HAVE_MVE"   "vpst\;vshlct\t%q0, %1, %4" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlcq_")) +  (set_attr "type" "mve_move")    (set_attr "length" "8")])  ;; CDE instructions on MVE registers. 
@@ -10435,7 +10717,8 @@
      CDE_VCX))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vpst\;vcx1t\\tp%c1, %q0, #%c3"
-  [(set_attr "type" "coproc")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx1qv16qi"))
+  (set_attr "type" "coproc")
    (set_attr "length" "8")]
 )
@@ -10449,7 +10732,8 @@
      CDE_VCX))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vpst\;vcx2t\\tp%c1, %q0, %q3, #%c4"
-  [(set_attr "type" "coproc")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx2qv16qi"))
+  (set_attr "type" "coproc")
    (set_attr "length" "8")]
 )
@@ -10464,7 +10748,8 @@
      CDE_VCX))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vpst\;vcx3t\\tp%c1, %q0, %q3, %q4, #%c5"
-  [(set_attr "type" "coproc")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx3qv16qi"))
+  (set_attr "type" "coproc")
    (set_attr "length" "8")]
 )
diff --git a/gcc/testsuite/gcc.target/arm/dlstp-compile-asm.c b/gcc/testsuite/gcc.target/arm/dlstp-compile-asm.c
new file mode 100644
index 0000000000000000000000000000000000000000..ec6ee774cbda6604c4c24b57cd4d5d3bd08e07cd
--- /dev/null
+++ b/gcc/testsuite/gcc.target/arm/dlstp-compile-asm.c
@@ -0,0 +1,82 @@
+/* { dg-do compile { target { arm*-*-* } } } */
+/* { dg-require-effective-target arm_v8_1m_mve_ok } */
+/* { dg-skip-if "avoid conflicting multilib options" { *-*-* } { "-marm" "-mcpu=*" } } */
+/* { dg-options "-march=armv8.1-m.main+fp.dp+mve.fp -O3" } */
+
+#include <arm_mve.h>
+
+#define IMM 5
+
+#define TEST_COMPILE_IN_DLSTP_TERNARY(BITS, LANES, LDRSTRYTPE, TYPE, SIGN, NAME, PRED)                \
+void test_##NAME##PRED##_##SIGN##BITS (TYPE##BITS##_t *a, TYPE##BITS##_t *b,  TYPE##BITS##_t *c, int n)    \
+{                                            \
+  while (n > 0)                                        \
+    {                                            \
+      mve_pred16_t p = vctp##BITS##q (n); \
+      TYPE##BITS##x##LANES##_t va = vldr##LDRSTRYTPE##q_z_##SIGN##BITS (a, p);        \
+      TYPE##BITS##x##LANES##_t vb = vldr##LDRSTRYTPE##q_z_##SIGN##BITS (b, p);        \
+      TYPE##BITS##x##LANES##_t vc = NAME##PRED##_##SIGN##BITS (va, vb, p);        \
+      vstr##LDRSTRYTPE##q_p_##SIGN##BITS (c, vc, p);         \
+      c += LANES;                                    \
+      a += LANES;                                    \
+      b += LANES;                                    \
+      n -= LANES;                                    \
+    }                                            \
+}
+
+#define TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY(BITS, LANES, LDRSTRYTPE, NAME, PRED)    \
+TEST_COMPILE_IN_DLSTP_TERNARY (BITS, LANES, LDRSTRYTPE, int, s, NAME, PRED)            \
+TEST_COMPILE_IN_DLSTP_TERNARY (BITS, LANES, LDRSTRYTPE, uint, u, NAME, PRED)
+
+#define TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY(NAME, PRED)            \
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY (8, 16, b, NAME, PRED)                \
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY (16, 8, h, NAME, PRED)                \
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY (32, 4, w, NAME, PRED)
+
+
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY (vaddq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY (vorrq, _x)
+
+
+#define TEST_COMPILE_IN_DLSTP_TERNARY_N(BITS, LANES, LDRSTRYTPE, TYPE, SIGN, NAME, PRED)    \
+void test_##NAME##PRED##_n_##SIGN##BITS (TYPE##BITS##_t *a, TYPE##BITS##_t *c, int n)    \
+{                                            \
+  while (n > 0)                                        \
+    {                                            \
+      mve_pred16_t p = vctp##BITS##q (n); \
+      TYPE##BITS##x##LANES##_t va = vldr##LDRSTRYTPE##q_z_##SIGN##BITS (a, p);        \
+      TYPE##BITS##x##LANES##_t vc = NAME##PRED##_n_##SIGN##BITS (va, IMM, p);        \
+      vstr##LDRSTRYTPE##q_p_##SIGN##BITS (c, vc, p);         \
+      c += LANES;                                    \
+      a += LANES;                                    \
+      n -= LANES;                                    \
+    }                                            \
+}
+
+#define TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_N(BITS, LANES, LDRSTRYTPE, NAME, PRED)    \
+TEST_COMPILE_IN_DLSTP_TERNARY_N (BITS, LANES, LDRSTRYTPE, int, s, NAME, PRED)            \
+TEST_COMPILE_IN_DLSTP_TERNARY_N (BITS, LANES, LDRSTRYTPE, uint, u, NAME, PRED)
+
+#define TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N(NAME, PRED)            \
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_N (8, 16, b, NAME, PRED)                \
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_N (16, 8, h, NAME, PRED)                \
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_N (32, 4, w, NAME, PRED)
+
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vaddq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vmulq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vsubq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vhaddq, _x)
+
+
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vbrsrq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vshlq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vshrq, _x)
+
+
+/* The final number of DLSTPs currently is calculated by the number of
+  `TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY.*` macros * 6.  */
+/* { dg-final { scan-assembler-times {\tdlstp} 54 } } */
+/* { dg-final { scan-assembler-times {\tletp} 54 } } */
+/* { dg-final { scan-assembler-not "\tvctp\t" } } */
+/* { dg-final { scan-assembler-not "\tvpst\t" } } */
+/* { dg-final { scan-assembler-not "P0" } } */
\ No newline at end of file
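
For reviewers who want a feel for what the attribute buys us: conceptually it is nothing more than a per-pattern lookup from a predicated icode to its unpredicated twin, which the second patch can then query when rewriting the insn chain. Below is a toy, self-contained C sketch of that mapping; the enum values and the helper function are invented for illustration only (the real codes are generated into GCC's insn-codes.h and the accessor comes from genattrtab):

```c
#include <assert.h>

/* Toy stand-ins for generated insn codes; real values come from
   GCC's generated insn-codes.h, these are illustration only.  */
enum insn_code
{
  CODE_FOR_nothing = -1,
  CODE_FOR_mve_vldrbq,	/* unpredicated load pattern */
  CODE_FOR_mve_vldrbq_z	/* its VPT-predicated twin */
};

/* What the new "mve_unpredicated_insn" md attribute records per
   predicated pattern: a deterministic link to the unpredicated icode,
   instead of relying on icode proximity from pattern ordering.  */
enum insn_code
mve_unpredicated_insn (enum insn_code predicated)
{
  switch (predicated)
    {
    case CODE_FOR_mve_vldrbq_z:
      return CODE_FOR_mve_vldrbq;
    default:
      return CODE_FOR_nothing;	/* no unpredicated equivalent known */
    }
}
```

The point of encoding the link in the pattern itself is that it survives any reordering of mve.md, unlike a scheme based on keeping predicated and unpredicated patterns at a fixed icode distance.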