Hi all,

I've created a v3 of these two patches. For this 1/2 patch, the changes are:

a) A rebase onto latest trunk and onto Andre's patch series:
   https://gcc.gnu.org/pipermail/gcc-patches/2023-January/610520.html
b) A minor change to the macro definitions in arm.h

I recognise that we are now in Stage 4 and, even though these patches have
been on the list since Stage 1, the 2/2 patch does contain mid-end changes,
so do let me know whether there's still a chance to get this into GCC 13 or
whether we should retarget it for GCC 14.

Thank you!
Stam

On 11/01/2023 14:23, Stam Markianos-Wright via Gcc-patches wrote:
> ----- Respin of the below patch -----
>
> In this 1/2 patch, from v1 to v2 I have added:
>
> * The three new helper #defines in arm.h.
>
> * Attribute mappings from unpredicated MVE instructions to
>   themselves.  This allows us to distinguish between unpredicated
>   insns that do have a VPT-predicated form (i.e. are VPT-predicable)
>   and insns that do not.
>
> Original email, with the updated ChangeLog at the end:
>
>
> Hi all,
>
> I'd like to submit two patches that add support for Arm's MVE
> Tail-Predicated Low Overhead Loop feature.
>
> --- Introduction ---
>
> The M-class Arm ARM:
> https://developer.arm.com/documentation/ddi0553/bu/?lang=en
> Section B5.5.1 "Loop tail predication" describes the feature
> we are adding support for with this patch (although
> we only add codegen for DLSTP/LETP instruction loops).
>
> Previously, with commit d2ed233cb94, we added support for
> non-MVE DLS/LE loops through the loop-doloop pass, which, given
> a standard MVE loop like:
>
> ```
> void __attribute__ ((noinline))
> test (int16_t *a, int16_t *b, int16_t *c, int n)
> {
>   while (n > 0)
>     {
>       mve_pred16_t p = vctp16q (n);
>       int16x8_t va = vldrhq_z_s16 (a, p);
>       int16x8_t vb = vldrhq_z_s16 (b, p);
>       int16x8_t vc = vaddq_x_s16 (va, vb, p);
>       vstrhq_p_s16 (c, vc, p);
>       c += 8;
>       a += 8;
>       b += 8;
>       n -= 8;
>     }
> }
> ```
> ..
> would output:
>
> ```
>
>         dls     lr, lr
> .L3:
>         vctp.16 r3
>         vmrs    ip, P0  @ movhi
>         sxth    ip, ip
>         vmsr    P0, ip  @ movhi
>         mov     r4, r0
>         vpst
>         vldrht.16       q2, [r4]
>         mov     r4, r1
>         vmov    q3, q0
>         vpst
>         vldrht.16       q1, [r4]
>         mov     r4, r2
>         vpst
>         vaddt.i16       q3, q2, q1
>         subs    r3, r3, #8
>         vpst
>         vstrht.16       q3, [r4]
>         adds    r0, r0, #16
>         adds    r1, r1, #16
>         adds    r2, r2, #16
>         le      lr, .L3
> ```
>
> where the LE instruction will decrement LR by 1, then compare and
> branch if needed.
>
> (There are also other inefficiencies in the above code, like the
> pointless vmrs/sxth/vmsr on the VPR, the adds not being merged
> into the vldrht/vstrht as #16 offsets, and some random movs!
> But those are separate problems...)
>
> The MVE version is similar, except that:
> * Instead of DLS/LE the instructions are DLSTP/LETP.
> * Instead of pre-calculating the number of iterations of the
>   loop, we place the number of elements to be processed by the
>   loop into LR.
> * Instead of decrementing LR by one, LETP will decrement it
>   by FPSCR.LTPSIZE, which is the number of elements being
>   processed in each iteration: 16 for 8-bit elements, 8 for 16-bit
>   elements, etc.
> * On the final iteration, automatic Loop Tail Predication is
>   performed, as if the instructions within the loop had been VPT
>   predicated with a VCTP generating the VPR predicate in every
>   loop iteration.
>
> The dlstp/letp loop now looks like:
>
> ```
>
>         dlstp.16        lr, r3
> .L14:
>         mov     r3, r0
>         vldrh.16        q3, [r3]
>         mov     r3, r1
>         vldrh.16        q2, [r3]
>         mov     r3, r2
>         vadd.i16        q3, q3, q2
>         adds    r0, r0, #16
>         vstrh.16        q3, [r3]
>         adds    r1, r1, #16
>         adds    r2, r2, #16
>         letp    lr, .L14
> ```
>
> Since the loop tail predication is automatic, we have eliminated
> the VCTP that had been specified by the user in the intrinsic
> and converted the VPT-predicated instructions into their
> unpredicated equivalents (which also saves us from VPST insns).
>
> The LETP instruction here decrements LR by 8 in each iteration.
>
> --- This 1/2 patch ---
>
> This first patch lays some groundwork by adding an attribute to
> md patterns; the second patch then contains the functional
> changes.
>
> One major difficulty in implementing MVE Tail-Predicated Low
> Overhead Loops was the need to transform VPT-predicated insns
> in the insn chain into their unpredicated equivalents, like:
> `mve_vldrbq_z_ -> mve_vldrbq_`.
>
> This requires us to have a deterministic link between two
> different patterns in mve.md -- this _could_ be done by
> re-ordering the entirety of mve.md so that the patterns sit at
> some constant icode proximity (e.g. having the _z form
> immediately after the unpredicated version would mean that to
> map from the former to the latter you could use icode-1), but
> that is a very messy solution that would lead to complex unknown
> dependencies between patterns.
>
> This patch provides an alternative way of doing that: using an insn
> attribute to encode the icode of the unpredicated instruction.
>
> This was implemented by doing a find-and-replace across mve.md
> using the following patterns:
>
> define_insn "(.*)_p_(.*)"((.|\n)*?)\n( )*\[\(set_attr
> define_insn "$1_p_$2"$3\n$5[(set (attr "mve_unpredicated_insn")
> (symbol_ref "CODE_FOR_$1_$2"))\n$5 (set_attr
>
> define_insn "(.*)_m_(.*)"((.|\n)*?)\n( )*\[\(set_attr
> define_insn "$1_m_$2"$3\n$5[(set (attr "mve_unpredicated_insn")
> (symbol_ref "CODE_FOR_$1_$2"))\n$5 (set_attr
>
> define_insn "(.*)_z_(.*)"((.|\n)*?)\n( )*\[\(set_attr
> define_insn "$1_z_$2"$3\n$5[(set (attr "mve_unpredicated_insn")
> (symbol_ref "CODE_FOR_$1_$2"))\n$5 (set_attr
>
> and then a number of manual fixes were needed for the md patterns
> that did not conform to the above.  Those changes were:
>
> Dropped the type suffix _s/_u/_f:
> CODE_FOR_mve_vcmpcsq_n_
> CODE_FOR_mve_vcmpcsq_
> CODE_FOR_mve_vcmpeqq_n_
> CODE_FOR_mve_vcmpeqq_
> CODE_FOR_mve_vcmpgeq_n_
> CODE_FOR_mve_vcmpgeq_
> CODE_FOR_mve_vcmpgtq_n_
> CODE_FOR_mve_vcmpgtq_
> CODE_FOR_mve_vcmphiq_n_
> CODE_FOR_mve_vcmphiq_
> CODE_FOR_mve_vcmpleq_n_
> CODE_FOR_mve_vcmpleq_
> CODE_FOR_mve_vcmpltq_n_
> CODE_FOR_mve_vcmpltq_
> CODE_FOR_mve_vcmpneq_n_
> CODE_FOR_mve_vcmpneq_
> CODE_FOR_mve_vaddq
> CODE_FOR_mve_vcaddq_rot270
> CODE_FOR_mve_vcaddq_rot90
> CODE_FOR_mve_vcaddq_rot270
> CODE_FOR_mve_vcaddq_rot90
> CODE_FOR_mve_vcmlaq
> CODE_FOR_mve_vcmlaq_rot180
> CODE_FOR_mve_vcmlaq_rot270
> CODE_FOR_mve_vcmlaq_rot90
> CODE_FOR_mve_vcmulq
> CODE_FOR_mve_vcmulq_rot180
> CODE_FOR_mve_vcmulq_rot270
> CODE_FOR_mve_vcmulq_rot90
>
> Dropped _wb_:
> CODE_FOR_mve_vidupq_u_insn
> CODE_FOR_mve_vddupq_u_insn
>
> Dropped one underscore character:
> CODE_FOR_arm_vcx1qv16qi
> CODE_FOR_arm_vcx2qv16qi
> CODE_FOR_arm_vcx3qv16qi
>
> No regressions on arm-none-eabi with an MVE target.
>
> Thank you,
> Stam Markianos-Wright
>
> gcc/ChangeLog:
>
>         * config/arm/arm.md (mve_unpredicated_insn): New attribute.
>         * config/arm/arm.h (MVE_VPT_PREDICATED_INSN_P): New define.
>     (MVE_VPT_UNPREDICATED_INSN_P): Likewise. >     (MVE_VPT_PREDICABLE_INSN_P): Likewise. >         * config/arm/vec-common.md (mve_vshlq_): Add > attribute. >         * config/arm/mve.md (arm_vcx1q_p_v16qi): Add attribute. >     (arm_vcx1qv16qi): Likewise. >     (arm_vcx1qav16qi): Likewise. >     (arm_vcx1qv16qi): Likewise. >     (arm_vcx2q_p_v16qi): Likewise. >     (arm_vcx2qv16qi): Likewise. >     (arm_vcx2qav16qi): Likewise. >     (arm_vcx2qv16qi): Likewise. >     (arm_vcx3q_p_v16qi): Likewise. >     (arm_vcx3qv16qi): Likewise. >     (arm_vcx3qav16qi): Likewise. >     (arm_vcx3qv16qi): Likewise. >     (mve_vabavq_): Likewise. >     (mve_vabavq_p_): Likewise. >     (mve_vabdq_): Likewise. >     (mve_vabdq_f): Likewise. >     (mve_vabdq_m_): Likewise. >     (mve_vabdq_m_f): Likewise. >     (mve_vabsq_f): Likewise. >     (mve_vabsq_m_f): Likewise. >     (mve_vabsq_m_s): Likewise. >     (mve_vabsq_s): Likewise. >     (mve_vadciq_v4si): Likewise. >     (mve_vadciq_m_v4si): Likewise. >     (mve_vadcq_v4si): Likewise. >     (mve_vadcq_m_v4si): Likewise. >     (mve_vaddlvaq_v4si): Likewise. >     (mve_vaddlvaq_p_v4si): Likewise. >     (mve_vaddlvq_v4si): Likewise. >     (mve_vaddlvq_p_v4si): Likewise. >     (mve_vaddq_f): Likewise. >     (mve_vaddq_m_): Likewise. >     (mve_vaddq_m_f): Likewise. >     (mve_vaddq_m_n_): Likewise. >     (mve_vaddq_m_n_f): Likewise. >     (mve_vaddq_n_): Likewise. >     (mve_vaddq_n_f): Likewise. >     (mve_vaddq): Likewise. >     (mve_vaddvaq_): Likewise. >     (mve_vaddvaq_p_): Likewise. >     (mve_vaddvq_): Likewise. >     (mve_vaddvq_p_): Likewise. >     (mve_vandq_): Likewise. >     (mve_vandq_f): Likewise. >     (mve_vandq_m_): Likewise. >     (mve_vandq_m_f): Likewise. >     (mve_vandq_s): Likewise. >     (mve_vandq_u): Likewise. >     (mve_vbicq_): Likewise. >     (mve_vbicq_f): Likewise. >     (mve_vbicq_m_): Likewise. >     (mve_vbicq_m_f): Likewise. >     (mve_vbicq_m_n_): Likewise. >     (mve_vbicq_n_): Likewise. 
>     (mve_vbicq_s): Likewise. >     (mve_vbicq_u): Likewise. >     (mve_vbrsrq_m_n_): Likewise. >     (mve_vbrsrq_m_n_f): Likewise. >     (mve_vbrsrq_n_): Likewise. >     (mve_vbrsrq_n_f): Likewise. >     (mve_vcaddq_rot270_m_): Likewise. >     (mve_vcaddq_rot270_m_f): Likewise. >     (mve_vcaddq_rot270): Likewise. >     (mve_vcaddq_rot270): Likewise. >     (mve_vcaddq_rot90_m_): Likewise. >     (mve_vcaddq_rot90_m_f): Likewise. >     (mve_vcaddq_rot90): Likewise. >     (mve_vcaddq_rot90): Likewise. >     (mve_vcaddq): Likewise. >     (mve_vcaddq): Likewise. >     (mve_vclsq_m_s): Likewise. >     (mve_vclsq_s): Likewise. >     (mve_vclzq_): Likewise. >     (mve_vclzq_m_): Likewise. >     (mve_vclzq_s): Likewise. >     (mve_vclzq_u): Likewise. >     (mve_vcmlaq_m_f): Likewise. >     (mve_vcmlaq_rot180_m_f): Likewise. >     (mve_vcmlaq_rot180): Likewise. >     (mve_vcmlaq_rot270_m_f): Likewise. >     (mve_vcmlaq_rot270): Likewise. >     (mve_vcmlaq_rot90_m_f): Likewise. >     (mve_vcmlaq_rot90): Likewise. >     (mve_vcmlaq): Likewise. >     (mve_vcmlaq): Likewise. >     (mve_vcmpq_): Likewise. >     (mve_vcmpq_f): Likewise. >     (mve_vcmpq_n_): Likewise. >     (mve_vcmpq_n_f): Likewise. >     (mve_vcmpcsq_): Likewise. >     (mve_vcmpcsq_m_n_u): Likewise. >     (mve_vcmpcsq_m_u): Likewise. >     (mve_vcmpcsq_n_): Likewise. >     (mve_vcmpeqq_): Likewise. >     (mve_vcmpeqq_f): Likewise. >     (mve_vcmpeqq_m_): Likewise. >     (mve_vcmpeqq_m_f): Likewise. >     (mve_vcmpeqq_m_n_): Likewise. >     (mve_vcmpeqq_m_n_f): Likewise. >     (mve_vcmpeqq_n_): Likewise. >     (mve_vcmpeqq_n_f): Likewise. >     (mve_vcmpgeq_): Likewise. >     (mve_vcmpgeq_f): Likewise. >     (mve_vcmpgeq_m_f): Likewise. >     (mve_vcmpgeq_m_n_f): Likewise. >     (mve_vcmpgeq_m_n_s): Likewise. >     (mve_vcmpgeq_m_s): Likewise. >     (mve_vcmpgeq_n_): Likewise. >     (mve_vcmpgeq_n_f): Likewise. >     (mve_vcmpgtq_): Likewise. >     (mve_vcmpgtq_f): Likewise. >     (mve_vcmpgtq_m_f): Likewise. 
>     (mve_vcmpgtq_m_n_f): Likewise. >     (mve_vcmpgtq_m_n_s): Likewise. >     (mve_vcmpgtq_m_s): Likewise. >     (mve_vcmpgtq_n_): Likewise. >     (mve_vcmpgtq_n_f): Likewise. >     (mve_vcmphiq_): Likewise. >     (mve_vcmphiq_m_n_u): Likewise. >     (mve_vcmphiq_m_u): Likewise. >     (mve_vcmphiq_n_): Likewise. >     (mve_vcmpleq_): Likewise. >     (mve_vcmpleq_f): Likewise. >     (mve_vcmpleq_m_f): Likewise. >     (mve_vcmpleq_m_n_f): Likewise. >     (mve_vcmpleq_m_n_s): Likewise. >     (mve_vcmpleq_m_s): Likewise. >     (mve_vcmpleq_n_): Likewise. >     (mve_vcmpleq_n_f): Likewise. >     (mve_vcmpltq_): Likewise. >     (mve_vcmpltq_f): Likewise. >     (mve_vcmpltq_m_f): Likewise. >     (mve_vcmpltq_m_n_f): Likewise. >     (mve_vcmpltq_m_n_s): Likewise. >     (mve_vcmpltq_m_s): Likewise. >     (mve_vcmpltq_n_): Likewise. >     (mve_vcmpltq_n_f): Likewise. >     (mve_vcmpneq_): Likewise. >     (mve_vcmpneq_f): Likewise. >     (mve_vcmpneq_m_): Likewise. >     (mve_vcmpneq_m_f): Likewise. >     (mve_vcmpneq_m_n_): Likewise. >     (mve_vcmpneq_m_n_f): Likewise. >     (mve_vcmpneq_n_): Likewise. >     (mve_vcmpneq_n_f): Likewise. >     (mve_vcmulq_m_f): Likewise. >     (mve_vcmulq_rot180_m_f): Likewise. >     (mve_vcmulq_rot180): Likewise. >     (mve_vcmulq_rot270_m_f): Likewise. >     (mve_vcmulq_rot270): Likewise. >     (mve_vcmulq_rot90_m_f): Likewise. >     (mve_vcmulq_rot90): Likewise. >     (mve_vcmulq): Likewise. >     (mve_vcmulq): Likewise. >     (mve_vctpq_mhi): Likewise. >     (mve_vctpqhi): Likewise. >     (mve_vcvtaq_): Likewise. >     (mve_vcvtaq_m_): Likewise. >     (mve_vcvtbq_f16_f32v8hf): Likewise. >     (mve_vcvtbq_f32_f16v4sf): Likewise. >     (mve_vcvtbq_m_f16_f32v8hf): Likewise. >     (mve_vcvtbq_m_f32_f16v4sf): Likewise. >     (mve_vcvtmq_): Likewise. >     (mve_vcvtmq_m_): Likewise. >     (mve_vcvtnq_): Likewise. >     (mve_vcvtnq_m_): Likewise. >     (mve_vcvtpq_): Likewise. >     (mve_vcvtpq_m_): Likewise. 
>     (mve_vcvtq_from_f_): Likewise. >     (mve_vcvtq_m_from_f_): Likewise. >     (mve_vcvtq_m_n_from_f_): Likewise. >     (mve_vcvtq_m_n_to_f_): Likewise. >     (mve_vcvtq_m_to_f_): Likewise. >     (mve_vcvtq_n_from_f_): Likewise. >     (mve_vcvtq_n_to_f_): Likewise. >     (mve_vcvtq_to_f_): Likewise. >     (mve_vcvttq_f16_f32v8hf): Likewise. >     (mve_vcvttq_f32_f16v4sf): Likewise. >     (mve_vcvttq_m_f16_f32v8hf): Likewise. >     (mve_vcvttq_m_f32_f16v4sf): Likewise. >     (mve_vddupq_m_wb_u_insn): Likewise. >     (mve_vddupq_u_insn): Likewise. >     (mve_vdupq_m_n_): Likewise. >     (mve_vdupq_m_n_f): Likewise. >     (mve_vdupq_n_): Likewise. >     (mve_vdupq_n_f): Likewise. >     (mve_vdwdupq_m_wb_u_insn): Likewise. >     (mve_vdwdupq_wb_u_insn): Likewise. >     (mve_veorq_): Likewise. >     (mve_veorq_f): Likewise. >     (mve_veorq_m_): Likewise. >     (mve_veorq_m_f): Likewise. >     (mve_veorq_s): Likewise. >     (mve_veorq_u): Likewise. >     (mve_vfmaq_f): Likewise. >     (mve_vfmaq_m_f): Likewise. >     (mve_vfmaq_m_n_f): Likewise. >     (mve_vfmaq_n_f): Likewise. >     (mve_vfmasq_m_n_f): Likewise. >     (mve_vfmasq_n_f): Likewise. >     (mve_vfmsq_f): Likewise. >     (mve_vfmsq_m_f): Likewise. >     (mve_vhaddq_): Likewise. >     (mve_vhaddq_m_): Likewise. >     (mve_vhaddq_m_n_): Likewise. >     (mve_vhaddq_n_): Likewise. >     (mve_vhcaddq_rot270_m_s): Likewise. >     (mve_vhcaddq_rot270_s): Likewise. >     (mve_vhcaddq_rot90_m_s): Likewise. >     (mve_vhcaddq_rot90_s): Likewise. >     (mve_vhsubq_): Likewise. >     (mve_vhsubq_m_): Likewise. >     (mve_vhsubq_m_n_): Likewise. >     (mve_vhsubq_n_): Likewise. >     (mve_vidupq_m_wb_u_insn): Likewise. >     (mve_vidupq_u_insn): Likewise. >     (mve_viwdupq_m_wb_u_insn): Likewise. >     (mve_viwdupq_wb_u_insn): Likewise. >     (mve_vldrbq_): Likewise. >     (mve_vldrbq_gather_offset_): Likewise. >     (mve_vldrbq_gather_offset_z_): Likewise. >     (mve_vldrbq_z_): Likewise. 
>     (mve_vldrdq_gather_base_v2di): Likewise. >     (mve_vldrdq_gather_base_wb_v2di_insn): Likewise. >     (mve_vldrdq_gather_base_wb_z_v2di_insn): Likewise. >     (mve_vldrdq_gather_base_z_v2di): Likewise. >     (mve_vldrdq_gather_offset_v2di): Likewise. >     (mve_vldrdq_gather_offset_z_v2di): Likewise. >     (mve_vldrdq_gather_shifted_offset_v2di): Likewise. >     (mve_vldrdq_gather_shifted_offset_z_v2di): Likewise. >     (mve_vldrhq_): Likewise. >     (mve_vldrhq_fv8hf): Likewise. >     (mve_vldrhq_gather_offset_): Likewise. >     (mve_vldrhq_gather_offset_fv8hf): Likewise. >     (mve_vldrhq_gather_offset_z_): Likewise. >     (mve_vldrhq_gather_offset_z_fv8hf): Likewise. >     (mve_vldrhq_gather_shifted_offset_): Likewise. >     (mve_vldrhq_gather_shifted_offset_fv8hf): Likewise. >     (mve_vldrhq_gather_shifted_offset_z_): Likewise. >     (mve_vldrhq_gather_shifted_offset_z_fv8hf): Likewise. >     (mve_vldrhq_z_): Likewise. >     (mve_vldrhq_z_fv8hf): Likewise. >     (mve_vldrwq_v4si): Likewise. >     (mve_vldrwq_fv4sf): Likewise. >     (mve_vldrwq_gather_base_v4si): Likewise. >     (mve_vldrwq_gather_base_fv4sf): Likewise. >     (mve_vldrwq_gather_base_wb_v4si_insn): Likewise. >     (mve_vldrwq_gather_base_wb_fv4sf_insn): Likewise. >     (mve_vldrwq_gather_base_wb_z_v4si_insn): Likewise. >     (mve_vldrwq_gather_base_wb_z_fv4sf_insn): Likewise. >     (mve_vldrwq_gather_base_z_v4si): Likewise. >     (mve_vldrwq_gather_base_z_fv4sf): Likewise. >     (mve_vldrwq_gather_offset_v4si): Likewise. >     (mve_vldrwq_gather_offset_fv4sf): Likewise. >     (mve_vldrwq_gather_offset_z_v4si): Likewise. >     (mve_vldrwq_gather_offset_z_fv4sf): Likewise. >     (mve_vldrwq_gather_shifted_offset_v4si): Likewise. >     (mve_vldrwq_gather_shifted_offset_fv4sf): Likewise. >     (mve_vldrwq_gather_shifted_offset_z_v4si): Likewise. >     (mve_vldrwq_gather_shifted_offset_z_fv4sf): Likewise. >     (mve_vldrwq_z_v4si): Likewise. >     (mve_vldrwq_z_fv4sf): Likewise. 
>     (mve_vmaxaq_m_s): Likewise. >     (mve_vmaxaq_s): Likewise. >     (mve_vmaxavq_p_s): Likewise. >     (mve_vmaxavq_s): Likewise. >     (mve_vmaxnmaq_f): Likewise. >     (mve_vmaxnmaq_m_f): Likewise. >     (mve_vmaxnmavq_f): Likewise. >     (mve_vmaxnmavq_p_f): Likewise. >     (mve_vmaxnmq_f): Likewise. >     (mve_vmaxnmq_m_f): Likewise. >     (mve_vmaxnmvq_f): Likewise. >     (mve_vmaxnmvq_p_f): Likewise. >     (mve_vmaxq_): Likewise. >     (mve_vmaxq_m_): Likewise. >     (mve_vmaxq_s): Likewise. >     (mve_vmaxq_u): Likewise. >     (mve_vmaxvq_): Likewise. >     (mve_vmaxvq_p_): Likewise. >     (mve_vminaq_m_s): Likewise. >     (mve_vminaq_s): Likewise. >     (mve_vminavq_p_s): Likewise. >     (mve_vminavq_s): Likewise. >     (mve_vminnmaq_f): Likewise. >     (mve_vminnmaq_m_f): Likewise. >     (mve_vminnmavq_f): Likewise. >     (mve_vminnmavq_p_f): Likewise. >     (mve_vminnmq_f): Likewise. >     (mve_vminnmq_m_f): Likewise. >     (mve_vminnmvq_f): Likewise. >     (mve_vminnmvq_p_f): Likewise. >     (mve_vminq_): Likewise. >     (mve_vminq_m_): Likewise. >     (mve_vminq_s): Likewise. >     (mve_vminq_u): Likewise. >     (mve_vminvq_): Likewise. >     (mve_vminvq_p_): Likewise. >     (mve_vmladavaq_): Likewise. >     (mve_vmladavaq_p_): Likewise. >     (mve_vmladavaxq_p_s): Likewise. >     (mve_vmladavaxq_s): Likewise. >     (mve_vmladavq_): Likewise. >     (mve_vmladavq_p_): Likewise. >     (mve_vmladavxq_p_s): Likewise. >     (mve_vmladavxq_s): Likewise. >     (mve_vmlaldavaq_): Likewise. >     (mve_vmlaldavaq_p_): Likewise. >     (mve_vmlaldavaxq_): Likewise. >     (mve_vmlaldavaxq_p_): Likewise. >     (mve_vmlaldavaxq_s): Likewise. >     (mve_vmlaldavq_): Likewise. >     (mve_vmlaldavq_p_): Likewise. >     (mve_vmlaldavxq_p_s): Likewise. >     (mve_vmlaldavxq_s): Likewise. >     (mve_vmlaq_m_n_): Likewise. >     (mve_vmlaq_n_): Likewise. >     (mve_vmlasq_m_n_): Likewise. >     (mve_vmlasq_n_): Likewise. >     (mve_vmlsdavaq_p_s): Likewise. 
>     (mve_vmlsdavaq_s): Likewise. >     (mve_vmlsdavaxq_p_s): Likewise. >     (mve_vmlsdavaxq_s): Likewise. >     (mve_vmlsdavq_p_s): Likewise. >     (mve_vmlsdavq_s): Likewise. >     (mve_vmlsdavxq_p_s): Likewise. >     (mve_vmlsdavxq_s): Likewise. >     (mve_vmlsldavaq_p_s): Likewise. >     (mve_vmlsldavaq_s): Likewise. >     (mve_vmlsldavaxq_p_s): Likewise. >     (mve_vmlsldavaxq_s): Likewise. >     (mve_vmlsldavq_p_s): Likewise. >     (mve_vmlsldavq_s): Likewise. >     (mve_vmlsldavxq_p_s): Likewise. >     (mve_vmlsldavxq_s): Likewise. >     (mve_vmovlbq_): Likewise. >     (mve_vmovlbq_m_): Likewise. >     (mve_vmovltq_): Likewise. >     (mve_vmovltq_m_): Likewise. >     (mve_vmovnbq_): Likewise. >     (mve_vmovnbq_m_): Likewise. >     (mve_vmovntq_): Likewise. >     (mve_vmovntq_m_): Likewise. >     (mve_vmulhq_): Likewise. >     (mve_vmulhq_m_): Likewise. >     (mve_vmullbq_int_): Likewise. >     (mve_vmullbq_int_m_): Likewise. >     (mve_vmullbq_poly_m_p): Likewise. >     (mve_vmullbq_poly_p): Likewise. >     (mve_vmulltq_int_): Likewise. >     (mve_vmulltq_int_m_): Likewise. >     (mve_vmulltq_poly_m_p): Likewise. >     (mve_vmulltq_poly_p): Likewise. >     (mve_vmulq_): Likewise. >     (mve_vmulq_f): Likewise. >     (mve_vmulq_m_): Likewise. >     (mve_vmulq_m_f): Likewise. >     (mve_vmulq_m_n_): Likewise. >     (mve_vmulq_m_n_f): Likewise. >     (mve_vmulq_n_): Likewise. >     (mve_vmulq_n_f): Likewise. >     (mve_vmvnq_): Likewise. >     (mve_vmvnq_m_): Likewise. >     (mve_vmvnq_m_n_): Likewise. >     (mve_vmvnq_n_): Likewise. >     (mve_vmvnq_s): Likewise. >     (mve_vmvnq_u): Likewise. >     (mve_vnegq_f): Likewise. >     (mve_vnegq_m_f): Likewise. >     (mve_vnegq_m_s): Likewise. >     (mve_vnegq_s): Likewise. >     (mve_vornq_): Likewise. >     (mve_vornq_f): Likewise. >     (mve_vornq_m_): Likewise. >     (mve_vornq_m_f): Likewise. >     (mve_vornq_s): Likewise. >     (mve_vornq_u): Likewise. >     (mve_vorrq_): Likewise. 
>     (mve_vorrq_f): Likewise. >     (mve_vorrq_m_): Likewise. >     (mve_vorrq_m_f): Likewise. >     (mve_vorrq_m_n_): Likewise. >     (mve_vorrq_n_): Likewise. >     (mve_vorrq_s): Likewise. >     (mve_vorrq_s): Likewise. >     (mve_vqabsq_m_s): Likewise. >     (mve_vqabsq_s): Likewise. >     (mve_vqaddq_): Likewise. >     (mve_vqaddq_m_): Likewise. >     (mve_vqaddq_m_n_): Likewise. >     (mve_vqaddq_n_): Likewise. >     (mve_vqdmladhq_m_s): Likewise. >     (mve_vqdmladhq_s): Likewise. >     (mve_vqdmladhxq_m_s): Likewise. >     (mve_vqdmladhxq_s): Likewise. >     (mve_vqdmlahq_m_n_s): Likewise. >     (mve_vqdmlahq_n_): Likewise. >     (mve_vqdmlahq_n_s): Likewise. >     (mve_vqdmlashq_m_n_s): Likewise. >     (mve_vqdmlashq_n_): Likewise. >     (mve_vqdmlashq_n_s): Likewise. >     (mve_vqdmlsdhq_m_s): Likewise. >     (mve_vqdmlsdhq_s): Likewise. >     (mve_vqdmlsdhxq_m_s): Likewise. >     (mve_vqdmlsdhxq_s): Likewise. >     (mve_vqdmulhq_m_n_s): Likewise. >     (mve_vqdmulhq_m_s): Likewise. >     (mve_vqdmulhq_n_s): Likewise. >     (mve_vqdmulhq_s): Likewise. >     (mve_vqdmullbq_m_n_s): Likewise. >     (mve_vqdmullbq_m_s): Likewise. >     (mve_vqdmullbq_n_s): Likewise. >     (mve_vqdmullbq_s): Likewise. >     (mve_vqdmulltq_m_n_s): Likewise. >     (mve_vqdmulltq_m_s): Likewise. >     (mve_vqdmulltq_n_s): Likewise. >     (mve_vqdmulltq_s): Likewise. >     (mve_vqmovnbq_): Likewise. >     (mve_vqmovnbq_m_): Likewise. >     (mve_vqmovntq_): Likewise. >     (mve_vqmovntq_m_): Likewise. >     (mve_vqmovunbq_m_s): Likewise. >     (mve_vqmovunbq_s): Likewise. >     (mve_vqmovuntq_m_s): Likewise. >     (mve_vqmovuntq_s): Likewise. >     (mve_vqnegq_m_s): Likewise. >     (mve_vqnegq_s): Likewise. >     (mve_vqrdmladhq_m_s): Likewise. >     (mve_vqrdmladhq_s): Likewise. >     (mve_vqrdmladhxq_m_s): Likewise. >     (mve_vqrdmladhxq_s): Likewise. >     (mve_vqrdmlahq_m_n_s): Likewise. >     (mve_vqrdmlahq_n_): Likewise. >     (mve_vqrdmlahq_n_s): Likewise. 
>     (mve_vqrdmlashq_m_n_s): Likewise. >     (mve_vqrdmlashq_n_): Likewise. >     (mve_vqrdmlashq_n_s): Likewise. >     (mve_vqrdmlsdhq_m_s): Likewise. >     (mve_vqrdmlsdhq_s): Likewise. >     (mve_vqrdmlsdhxq_m_s): Likewise. >     (mve_vqrdmlsdhxq_s): Likewise. >     (mve_vqrdmulhq_m_n_s): Likewise. >     (mve_vqrdmulhq_m_s): Likewise. >     (mve_vqrdmulhq_n_s): Likewise. >     (mve_vqrdmulhq_s): Likewise. >     (mve_vqrshlq_): Likewise. >     (mve_vqrshlq_m_): Likewise. >     (mve_vqrshlq_m_n_): Likewise. >     (mve_vqrshlq_n_): Likewise. >     (mve_vqrshrnbq_m_n_): Likewise. >     (mve_vqrshrnbq_n_): Likewise. >     (mve_vqrshrntq_m_n_): Likewise. >     (mve_vqrshrntq_n_): Likewise. >     (mve_vqrshrunbq_m_n_s): Likewise. >     (mve_vqrshrunbq_n_s): Likewise. >     (mve_vqrshruntq_m_n_s): Likewise. >     (mve_vqrshruntq_n_s): Likewise. >     (mve_vqshlq_): Likewise. >     (mve_vqshlq_m_): Likewise. >     (mve_vqshlq_m_n_): Likewise. >     (mve_vqshlq_m_r_): Likewise. >     (mve_vqshlq_n_): Likewise. >     (mve_vqshlq_r_): Likewise. >     (mve_vqshluq_m_n_s): Likewise. >     (mve_vqshluq_n_s): Likewise. >     (mve_vqshrnbq_m_n_): Likewise. >     (mve_vqshrnbq_n_): Likewise. >     (mve_vqshrntq_m_n_): Likewise. >     (mve_vqshrntq_n_): Likewise. >     (mve_vqshrunbq_m_n_s): Likewise. >     (mve_vqshrunbq_n_s): Likewise. >     (mve_vqshruntq_m_n_s): Likewise. >     (mve_vqshruntq_n_s): Likewise. >     (mve_vqsubq_): Likewise. >     (mve_vqsubq_m_): Likewise. >     (mve_vqsubq_m_n_): Likewise. >     (mve_vqsubq_n_): Likewise. >     (mve_vrev16q_v16qi): Likewise. >     (mve_vrev16q_m_v16qi): Likewise. >     (mve_vrev32q_): Likewise. >     (mve_vrev32q_fv8hf): Likewise. >     (mve_vrev32q_m_): Likewise. >     (mve_vrev32q_m_fv8hf): Likewise. >     (mve_vrev64q_): Likewise. >     (mve_vrev64q_f): Likewise. >     (mve_vrev64q_m_): Likewise. >     (mve_vrev64q_m_f): Likewise. >     (mve_vrhaddq_): Likewise. >     (mve_vrhaddq_m_): Likewise. 
>     (mve_vrmlaldavhaq_v4si): Likewise. >     (mve_vrmlaldavhaq_p_sv4si): Likewise. >     (mve_vrmlaldavhaq_p_uv4si): Likewise. >     (mve_vrmlaldavhaq_sv4si): Likewise. >     (mve_vrmlaldavhaq_uv4si): Likewise. >     (mve_vrmlaldavhaxq_p_sv4si): Likewise. >     (mve_vrmlaldavhaxq_sv4si): Likewise. >     (mve_vrmlaldavhq_v4si): Likewise. >     (mve_vrmlaldavhq_p_v4si): Likewise. >     (mve_vrmlaldavhxq_p_sv4si): Likewise. >     (mve_vrmlaldavhxq_sv4si): Likewise. >     (mve_vrmlsldavhaq_p_sv4si): Likewise. >     (mve_vrmlsldavhaq_sv4si): Likewise. >     (mve_vrmlsldavhaxq_p_sv4si): Likewise. >     (mve_vrmlsldavhaxq_sv4si): Likewise. >     (mve_vrmlsldavhq_p_sv4si): Likewise. >     (mve_vrmlsldavhq_sv4si): Likewise. >     (mve_vrmlsldavhxq_p_sv4si): Likewise. >     (mve_vrmlsldavhxq_sv4si): Likewise. >     (mve_vrmulhq_): Likewise. >     (mve_vrmulhq_m_): Likewise. >     (mve_vrndaq_f): Likewise. >     (mve_vrndaq_m_f): Likewise. >     (mve_vrndmq_f): Likewise. >     (mve_vrndmq_m_f): Likewise. >     (mve_vrndnq_f): Likewise. >     (mve_vrndnq_m_f): Likewise. >     (mve_vrndpq_f): Likewise. >     (mve_vrndpq_m_f): Likewise. >     (mve_vrndq_f): Likewise. >     (mve_vrndq_m_f): Likewise. >     (mve_vrndxq_f): Likewise. >     (mve_vrndxq_m_f): Likewise. >     (mve_vrshlq_): Likewise. >     (mve_vrshlq_m_): Likewise. >     (mve_vrshlq_m_n_): Likewise. >     (mve_vrshlq_n_): Likewise. >     (mve_vrshrnbq_m_n_): Likewise. >     (mve_vrshrnbq_n_): Likewise. >     (mve_vrshrntq_m_n_): Likewise. >     (mve_vrshrntq_n_): Likewise. >     (mve_vrshrq_m_n_): Likewise. >     (mve_vrshrq_n_): Likewise. >     (mve_vsbciq_v4si): Likewise. >     (mve_vsbciq_m_v4si): Likewise. >     (mve_vsbcq_v4si): Likewise. >     (mve_vsbcq_m_v4si): Likewise. >     (mve_vshlcq_): Likewise. >     (mve_vshlcq_m_): Likewise. >     (mve_vshllbq_m_n_): Likewise. >     (mve_vshllbq_n_): Likewise. >     (mve_vshlltq_m_n_): Likewise. >     (mve_vshlltq_n_): Likewise. >     (mve_vshlq_): Likewise. 
>     (mve_vshlq_): Likewise. >     (mve_vshlq_m_): Likewise. >     (mve_vshlq_m_n_): Likewise. >     (mve_vshlq_m_r_): Likewise. >     (mve_vshlq_n_): Likewise. >     (mve_vshlq_r_): Likewise. >     (mve_vshrnbq_m_n_): Likewise. >     (mve_vshrnbq_n_): Likewise. >     (mve_vshrntq_m_n_): Likewise. >     (mve_vshrntq_n_): Likewise. >     (mve_vshrq_m_n_): Likewise. >     (mve_vshrq_n_): Likewise. >     (mve_vsliq_m_n_): Likewise. >     (mve_vsliq_n_): Likewise. >     (mve_vsriq_m_n_): Likewise. >     (mve_vsriq_n_): Likewise. >     (mve_vstrbq_): Likewise. >     (mve_vstrbq_p_): Likewise. >     (mve_vstrbq_scatter_offset__insn): Likewise. >     (mve_vstrbq_scatter_offset_p__insn): Likewise. >     (mve_vstrdq_scatter_base_v2di): Likewise. >     (mve_vstrdq_scatter_base_p_v2di): Likewise. >     (mve_vstrdq_scatter_base_wb_v2di): Likewise. >     (mve_vstrdq_scatter_base_wb_p_v2di): Likewise. >     (mve_vstrdq_scatter_offset_v2di_insn): Likewise. >     (mve_vstrdq_scatter_offset_p_v2di_insn): Likewise. >     (mve_vstrdq_scatter_shifted_offset_v2di_insn): Likewise. >     (mve_vstrdq_scatter_shifted_offset_p_v2di_insn): Likewise. >     (mve_vstrhq_): Likewise. >     (mve_vstrhq_fv8hf): Likewise. >     (mve_vstrhq_p_): Likewise. >     (mve_vstrhq_p_fv8hf): Likewise. >     (mve_vstrhq_scatter_offset__insn): Likewise. >     (mve_vstrhq_scatter_offset_fv8hf_insn): Likewise. >     (mve_vstrhq_scatter_offset_p__insn): Likewise. >     (mve_vstrhq_scatter_offset_p_fv8hf_insn): Likewise. >  (mve_vstrhq_scatter_shifted_offset__insn): Likewise. >     (mve_vstrhq_scatter_shifted_offset_fv8hf_insn): Likewise. >  (mve_vstrhq_scatter_shifted_offset_p__insn): Likewise. >     (mve_vstrhq_scatter_shifted_offset_p_fv8hf_insn): Likewise. >     (mve_vstrwq_v4si): Likewise. >     (mve_vstrwq_fv4sf): Likewise. >     (mve_vstrwq_p_v4si): Likewise. >     (mve_vstrwq_p_fv4sf): Likewise. >     (mve_vstrwq_scatter_base_v4si): Likewise. >     (mve_vstrwq_scatter_base_fv4sf): Likewise. 
>     (mve_vstrwq_scatter_base_p_v4si): Likewise. >     (mve_vstrwq_scatter_base_p_fv4sf): Likewise. >     (mve_vstrwq_scatter_base_wb_v4si): Likewise. >     (mve_vstrwq_scatter_base_wb_fv4sf): Likewise. >     (mve_vstrwq_scatter_base_wb_p_v4si): Likewise. >     (mve_vstrwq_scatter_base_wb_p_fv4sf): Likewise. >     (mve_vstrwq_scatter_offset_v4si_insn): Likewise. >     (mve_vstrwq_scatter_offset_fv4sf_insn): Likewise. >     (mve_vstrwq_scatter_offset_p_v4si_insn): Likewise. >     (mve_vstrwq_scatter_offset_p_fv4sf_insn): Likewise. >     (mve_vstrwq_scatter_shifted_offset_v4si_insn): Likewise. >     (mve_vstrwq_scatter_shifted_offset_fv4sf_insn): Likewise. >     (mve_vstrwq_scatter_shifted_offset_p_v4si_insn): Likewise. >     (mve_vstrwq_scatter_shifted_offset_p_fv4sf_insn): Likewise. >     (mve_vsubq_): Likewise. >     (mve_vsubq_f): Likewise. >     (mve_vsubq_m_): Likewise. >     (mve_vsubq_m_f): Likewise. >     (mve_vsubq_m_n_): Likewise. >     (mve_vsubq_m_n_f): Likewise. >     (mve_vsubq_n_): Likewise. >     (mve_vsubq_n_f): Likewise. > > > gcc/testsuite/ChangeLog: > >         * gcc.target/arm/dlstp-compile-asm.c: New test. > Inline copy of v3 patch: diff --git a/gcc/config/arm/arm.h b/gcc/config/arm/arm.h index 7d40b8b7e00bc3b4dcff7ec685ba864ca3885052..40972c24ba1497302a2f23429ce071385ae29984 100644 --- a/gcc/config/arm/arm.h +++ b/gcc/config/arm/arm.h @@ -2358,6 +2358,21 @@ extern int making_const_table;    else if (TARGET_THUMB1)                \      thumb1_final_prescan_insn (INSN) +/* These defines are useful to refer to the value of the mve_unpredicated_insn +   insn attribute.  Note that, because these use the get_attr_* function, these +   will change recog_data if (INSN) isn't current_insn.  
*/
+#define MVE_VPT_PREDICABLE_INSN_P(INSN)				\
+  (recog_memoized (INSN) >= 0					\
+   && get_attr_mve_unpredicated_insn (INSN) != 0)
+
+#define MVE_VPT_PREDICATED_INSN_P(INSN)				\
+  (MVE_VPT_PREDICABLE_INSN_P (INSN)				\
+   && recog_memoized (INSN) != get_attr_mve_unpredicated_insn (INSN))
+
+#define MVE_VPT_UNPREDICATED_INSN_P(INSN)			\
+  (MVE_VPT_PREDICABLE_INSN_P (INSN)				\
+   && recog_memoized (INSN) == get_attr_mve_unpredicated_insn (INSN))
+
 #define ARM_SIGN_EXTEND(x)  ((HOST_WIDE_INT)			\
   (HOST_BITS_PER_WIDE_INT <= 32 ? (unsigned HOST_WIDE_INT) (x)	\
    : ((((unsigned HOST_WIDE_INT)(x)) & (unsigned HOST_WIDE_INT) 0xffffffff) |\
diff --git a/gcc/config/arm/arm.md b/gcc/config/arm/arm.md
index cbfc4543531452b0708a38bdf4abf5105b54f8b7..e97943751878820813fcd7a497e7e5b2e80ef289 100644
--- a/gcc/config/arm/arm.md
+++ b/gcc/config/arm/arm.md
@@ -124,6 +124,8 @@
 ; and not all ARM insns do.
 (define_attr "predicated" "yes,no" (const_string "no"))
 
+(define_attr "mve_unpredicated_insn" "" (const_int 0))
+
 ; LENGTH of an instruction (in bytes)
 (define_attr "length" ""
   (const_int 4))
diff --git a/gcc/config/arm/mve.md b/gcc/config/arm/mve.md
index 74523f49356a9fb06afd47e2a19a98caba17937d..854b8ab935f82ad0eb99e6af9852ce8154cf9d9d 100644
--- a/gcc/config/arm/mve.md
+++ b/gcc/config/arm/mve.md
@@ -142,7 +142,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vrintzt.f%# %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndq_f"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -156,7 +157,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vrintx.f%#    %q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndxq_f"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -170,7 +172,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vrintz.f%#    %q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndq_f"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -184,7 +187,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vrintp.f%#    %q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndpq_f"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -198,7 +202,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vrintn.f%#    %q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndnq_f"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -212,7 +217,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vrintm.f%#    %q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndmq_f"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -226,7 +232,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vrinta.f%#    %q0, %q1" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndaq_f")) +  (set_attr "type" "mve_move")  ])  ;; @@ -240,7 +247,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vrev64.%# %q0, %q1" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrev64q_f")) +  (set_attr "type" "mve_move")  ])  ;; @@ -253,7 +261,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vneg.f%#\t%q0, %q1" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vnegq_f")) +  (set_attr "type" "mve_move")  ])  ;; @@ -267,7 +276,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vdup.%#\t%q0, %1" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vdupq_n_f")) +  (set_attr "type" "mve_move")  ])  ;; @@ -280,7 +290,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vabs.f%#\t%q0, %q1" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vabsq_f")) +  (set_attr "type" "mve_move")  ])  ;; @@ -294,7 +305,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vrev32.16 %q0, %q1" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrev32q_fv8hf")) +  (set_attr "type" "mve_move")  ])  ;;  ;; [vcvttq_f32_f16]) @@ -307,7 +319,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vcvtt.f32.f16 %q0, %q1" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f32_f16v4sf")) +  (set_attr "type" "mve_move")  ])  ;; @@ -321,7 +334,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vcvtb.f32.f16 %q0, %q1" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f32_f16v4sf")) +  (set_attr "type" "mve_move")  ])  ;; @@ -335,7 +349,8 @@    ]    "TARGET_HAVE_MVE && 
TARGET_HAVE_MVE_FLOAT" "vcvt.f%#.%# %q0, %q1" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_to_f_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -349,7 +364,8 @@    ]    "TARGET_HAVE_MVE"    "vrev64.%# %q0, %q1" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrev64q_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -363,7 +379,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcvt.%#.f%# %q0, %q1" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_from_f_")) +  (set_attr "type" "mve_move")  ])  ;; [vqnegq_s])  ;; @@ -375,7 +392,8 @@    ]    "TARGET_HAVE_MVE"    "vqneg.s%#\t%q0, %q1" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqnegq_s")) +  (set_attr "type" "mve_move")  ])  ;; @@ -389,7 +407,8 @@    ]    "TARGET_HAVE_MVE"    "vqabs.s%#\t%q0, %q1" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqabsq_s")) +  (set_attr "type" "mve_move")  ])  ;; @@ -402,7 +421,8 @@    ]    "TARGET_HAVE_MVE"    "vneg.s%#\t%q0, %q1" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vnegq_s")) +  (set_attr "type" "mve_move")  ])  ;; @@ -415,7 +435,8 @@    ]    "TARGET_HAVE_MVE"    "vmvn\t%q0, %q1" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmvnq_u")) +  (set_attr "type" "mve_move")  ])  (define_expand "mve_vmvnq_s"    [ @@ -436,7 +457,8 @@    ]    "TARGET_HAVE_MVE"    "vdup.%#\t%q0, %1" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vdupq_n_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -449,7 +471,8 @@    ]    "TARGET_HAVE_MVE"    "vclz.i%#\t%q0, %q1" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vclzq_s")) +  
(set_attr "type" "mve_move")  ])  (define_expand "mve_vclzq_u"    [ @@ -470,7 +493,8 @@    ]    "TARGET_HAVE_MVE"    "vcls.s%#\t%q0, %q1" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vclsq_s")) +  (set_attr "type" "mve_move")  ])  ;; @@ -484,7 +508,8 @@    ]    "TARGET_HAVE_MVE"    "vaddv.%#\t%0, %q1" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddvq_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -497,7 +522,8 @@    ]    "TARGET_HAVE_MVE"    "vabs.s%#\t%q0, %q1" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vabsq_s")) +  (set_attr "type" "mve_move")  ])  ;; @@ -511,7 +537,8 @@    ]    "TARGET_HAVE_MVE"    "vrev32.%#\t%q0, %q1" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrev32q_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -525,7 +552,8 @@    ]    "TARGET_HAVE_MVE"    "vmovlt.%#   %q0, %q1" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmovltq_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -539,7 +567,8 @@    ]    "TARGET_HAVE_MVE"    "vmovlb.%#   %q0, %q1" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmovlbq_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -553,7 +582,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcvtp.%#.f%# %q0, %q1" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtpq_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -567,7 +597,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcvtn.%#.f%# %q0, %q1" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtnq_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -581,7 +612,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcvtm.%#.f%# %q0, %q1" -  [(set_attr "type" 
"mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtmq_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -595,7 +627,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcvta.%#.f%# %q0, %q1" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtaq_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -609,7 +642,8 @@    ]    "TARGET_HAVE_MVE"    "vmvn.i%#  %q0, %1" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmvnq_n_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -623,7 +657,8 @@    ]    "TARGET_HAVE_MVE"    "vrev16.8 %q0, %q1" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrev16q_v16qi")) +  (set_attr "type" "mve_move")  ])  ;; @@ -637,7 +672,8 @@    ]    "TARGET_HAVE_MVE"    "vaddlv.32\t%Q0, %R0, %q1" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddlvq_v4si")) +  (set_attr "type" "mve_move")  ])  ;; @@ -651,7 +687,8 @@    ]    "TARGET_HAVE_MVE"    "vctp. %1" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vctpq")) +  (set_attr "type" "mve_move")  ])  ;; @@ -680,7 +717,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vsub.f\t%q0, %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsubq_n_f")) +  (set_attr "type" "mve_move")  ])  ;; @@ -695,7 +733,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vbrsr.  
%q0, %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbrsrq_n_f")) +  (set_attr "type" "mve_move")  ])  ;; @@ -710,7 +749,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vcvt.f.\t%q0, %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_to_f_")) +  (set_attr "type" "mve_move")  ])  ;; [vcreateq_f]) @@ -755,7 +795,8 @@    ]    "TARGET_HAVE_MVE"    "vshr.\t%q0, %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshrq_n_")) +  (set_attr "type" "mve_move")  ])  ;; Versions that take constant vectors as operand 2 (with all elements @@ -803,7 +844,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vcvt..f\t%q0, %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_from_f_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -818,8 +860,9 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vaddlvt.32\t%Q0, %R0, %q1" -  [(set_attr "type" "mve_move") -   (set_attr "length""8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddlvq_v4si")) +  (set_attr "type" "mve_move") +  (set_attr "length""8")])  ;;  ;; [vcmpneq_, vcmpcsq_, vcmpeqq_, vcmpgeq_, vcmpgtq_, vcmphiq_, vcmpleq_, vcmpltq_]) @@ -832,7 +875,8 @@    ]    "TARGET_HAVE_MVE" "vcmp.%#\t, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpq_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -847,7 +891,8 @@    ]    "TARGET_HAVE_MVE"    "vcmp.%# , %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpq_n_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -866,7 +911,8 @@    ]    "TARGET_HAVE_MVE"    "vabd.%#    %q0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vabdq_")) +  (set_attr "type" 
"mve_move")  ])  ;; @@ -881,7 +927,8 @@    ]    "TARGET_HAVE_MVE"    "vadd.i%#\t%q0, %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddq_n_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -896,7 +943,8 @@    ]    "TARGET_HAVE_MVE"    "vaddva.%#\t%0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddvaq_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -911,7 +959,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vaddvt.%#    %0, %q1" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddvq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -931,7 +980,8 @@    "@     vand\t%q0, %q1, %q2     * return neon_output_logic_immediate (\"vand\", &operands[2], mode, 1, VALID_NEON_QREG_MODE (mode));" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vandq_u")) +  (set_attr "type" "mve_move")  ])  (define_expand "mve_vandq_s"    [ @@ -953,7 +1003,8 @@    ]    "TARGET_HAVE_MVE"    "vbic\t%q0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbicq_u")) +  (set_attr "type" "mve_move")  ])  (define_expand "mve_vbicq_s" @@ -977,7 +1028,8 @@    ]    "TARGET_HAVE_MVE"    "vbrsr.%#    %q0, %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbrsrq_n_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -992,7 +1044,8 @@    ]    "TARGET_HAVE_MVE"    "vcadd.i%#    %q0, %q1, %q2, #" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcaddq")) +  (set_attr "type" "mve_move")  ])  ;; Auto vectorizer pattern for int vcadd @@ -1015,7 +1068,8 @@    ]    "TARGET_HAVE_MVE"    "veor\t%q0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_veorq_u")) +  (set_attr "type" 
"mve_move")  ])  (define_expand "mve_veorq_s"    [ @@ -1038,7 +1092,8 @@    ]    "TARGET_HAVE_MVE"    "vhadd.%#\t%q0, %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhaddq_n_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1053,7 +1108,8 @@    ]    "TARGET_HAVE_MVE"    "vhadd.%#\t%q0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhaddq_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1068,7 +1124,8 @@    ]    "TARGET_HAVE_MVE"    "vhcadd.s%#\t%q0, %q1, %q2, #270" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhcaddq_rot270_s")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1083,7 +1140,8 @@    ]    "TARGET_HAVE_MVE"    "vhcadd.s%#\t%q0, %q1, %q2, #90" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhcaddq_rot90_s")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1098,7 +1156,8 @@    ]    "TARGET_HAVE_MVE"    "vhsub.%#\t%q0, %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhsubq_n_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1113,7 +1172,8 @@    ]    "TARGET_HAVE_MVE"    "vhsub.%#\t%q0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhsubq_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1128,7 +1188,8 @@    ]    "TARGET_HAVE_MVE"    "vmaxa.s%#    %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxaq_s")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1143,7 +1204,8 @@    ]    "TARGET_HAVE_MVE"    "vmaxav.s%#\t%0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxavq_s")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1157,7 +1219,8 @@    ]    "TARGET_HAVE_MVE"    "vmax.%#\t%q0, %q1, %q2" -  [(set_attr "type" "mve_move") 
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxq_s")) +  (set_attr "type" "mve_move")  ])  (define_insn "mve_vmaxq_u" @@ -1168,7 +1231,8 @@    ]    "TARGET_HAVE_MVE"    "vmax.%#\t%q0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxq_u")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1183,7 +1247,8 @@    ]    "TARGET_HAVE_MVE"    "vmaxv.%#\t%0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxvq_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1198,7 +1263,8 @@    ]    "TARGET_HAVE_MVE"    "vmina.s%#\t%q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminaq_s")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1213,7 +1279,8 @@    ]    "TARGET_HAVE_MVE"    "vminav.s%#\t%0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminavq_s")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1227,7 +1294,8 @@    ]    "TARGET_HAVE_MVE"    "vmin.%#\t%q0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminq_s")) +  (set_attr "type" "mve_move")  ])  (define_insn "mve_vminq_u" @@ -1238,7 +1306,8 @@    ]    "TARGET_HAVE_MVE"    "vmin.%#\t%q0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminq_u")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1253,7 +1322,8 @@    ]    "TARGET_HAVE_MVE"    "vminv.%#\t%0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminvq_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1268,7 +1338,8 @@    ]    "TARGET_HAVE_MVE"    "vmladav.%#\t%0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmladavq_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1283,7 +1354,8 @@    ]    
"TARGET_HAVE_MVE"    "vmladavx.s%#\t%0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmladavxq_s")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1298,7 +1370,8 @@    ]    "TARGET_HAVE_MVE"    "vmlsdav.s%#\t%0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsdavq_s")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1313,7 +1386,8 @@    ]    "TARGET_HAVE_MVE"    "vmlsdavx.s%#\t%0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsdavxq_s")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1328,7 +1402,8 @@    ]    "TARGET_HAVE_MVE"    "vmulh.%#\t%q0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulhq_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1343,7 +1418,8 @@    ]    "TARGET_HAVE_MVE"    "vmullb.%#\t%q0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmullbq_int_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1358,7 +1434,8 @@    ]    "TARGET_HAVE_MVE"    "vmullt.%#\t%q0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulltq_int_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1373,7 +1450,8 @@    ]    "TARGET_HAVE_MVE"    "vmul.i%#\t%q0, %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulq_n_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1388,7 +1466,8 @@    ]    "TARGET_HAVE_MVE"    "vmul.i%#\t%q0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulq_")) +  (set_attr "type" "mve_move")  ])  (define_insn "mve_vmulq" @@ -1399,7 +1478,8 @@    ]    "TARGET_HAVE_MVE"    "vmul.i%#\t%q0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmulq")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1413,7 +1493,8 @@    ]    "TARGET_HAVE_MVE"     "vorn\t%q0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_s")) +  (set_attr "type" "mve_move")  ])  (define_expand "mve_vornq_u" @@ -1442,7 +1523,8 @@    "@     vorr\t%q0, %q1, %q2     * return neon_output_logic_immediate (\"vorr\", &operands[2], mode, 0, VALID_NEON_QREG_MODE (mode));" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vorrq_s")) +  (set_attr "type" "mve_move")  ])  (define_expand "mve_vorrq_u"    [ @@ -1465,7 +1547,8 @@    ]    "TARGET_HAVE_MVE"    "vqadd.%#\t%q0, %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqaddq_n_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1480,7 +1563,8 @@    ]    "TARGET_HAVE_MVE"    "vqadd.%#\t%q0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqaddq_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1495,7 +1579,8 @@    ]    "TARGET_HAVE_MVE"    "vqdmulh.s%#\t%q0, %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmulhq_n_s")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1510,7 +1595,8 @@    ]    "TARGET_HAVE_MVE"    "vqdmulh.s%#\t%q0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmulhq_s")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1525,7 +1611,8 @@    ]    "TARGET_HAVE_MVE"    "vqrdmulh.s%#\t%q0, %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmulhq_n_s")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1540,7 +1627,8 @@    ]    "TARGET_HAVE_MVE"    "vqrdmulh.s%#\t%q0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmulhq_s")) +  
(set_attr "type" "mve_move")  ])  ;; @@ -1555,7 +1643,8 @@    ]    "TARGET_HAVE_MVE"    "vqrshl.%#\t%q0, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshlq_n_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1570,7 +1659,8 @@    ]    "TARGET_HAVE_MVE"    "vqrshl.%#\t%q0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshlq_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1585,7 +1675,8 @@    ]    "TARGET_HAVE_MVE"    "vqshl.%#\t%q0, %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshlq_n_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1600,7 +1691,8 @@    ]    "TARGET_HAVE_MVE"    "vqshl.%#\t%q0, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshlq_r_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1615,7 +1707,8 @@    ]    "TARGET_HAVE_MVE"    "vqshl.%#\t%q0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshlq_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1630,7 +1723,8 @@    ]    "TARGET_HAVE_MVE"    "vqshlu.s%#\t%q0, %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshluq_n_s")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1645,7 +1739,8 @@    ]    "TARGET_HAVE_MVE"    "vqsub.%#\t%q0, %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqsubq_n_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1660,7 +1755,8 @@    ]    "TARGET_HAVE_MVE"    "vqsub.%#\t%q0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqsubq_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1675,7 +1771,8 @@    ]    "TARGET_HAVE_MVE"    "vrhadd.%#\t%q0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") 
(symbol_ref "CODE_FOR_mve_vrhaddq_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1690,7 +1787,8 @@    ]    "TARGET_HAVE_MVE"    "vrmulh.%#\t%q0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmulhq_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1705,7 +1803,8 @@    ]    "TARGET_HAVE_MVE"    "vrshl.%#\t%q0, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrshlq_n_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1720,7 +1819,8 @@    ]    "TARGET_HAVE_MVE"    "vrshl.%#\t%q0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrshlq_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1735,7 +1835,8 @@    ]    "TARGET_HAVE_MVE"    "vrshr.%#\t%q0, %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrshrq_n_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1750,7 +1851,8 @@    ]    "TARGET_HAVE_MVE"    "vshl.%#\t%q0, %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlq_n_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1765,7 +1867,8 @@    ]    "TARGET_HAVE_MVE"    "vshl.%#\t%q0, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlq_r_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1780,7 +1883,8 @@    ]    "TARGET_HAVE_MVE"    "vsub.i%#\t%q0, %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsubq_n_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1795,7 +1899,8 @@    ]    "TARGET_HAVE_MVE"    "vsub.i%#\t%q0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsubq_")) +  (set_attr "type" "mve_move")  ])  (define_insn "mve_vsubq" @@ -1806,7 +1911,8 @@    ]    "TARGET_HAVE_MVE"    "vsub.i%#\t%q0, %q1, %q2" -  [(set_attr "type" 
"mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsubq")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1821,7 +1927,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vabd.f%#    %q0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vabdq_f")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1836,7 +1943,8 @@    ]    "TARGET_HAVE_MVE"    "vaddlva.32\t%Q0, %R0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddlvaq_v4si")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1851,7 +1959,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vadd.f%#\t%q0, %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddq_n_f")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1865,7 +1974,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vand %q0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vandq_f")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1879,7 +1989,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vbic %q0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbicq_f")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1894,7 +2005,8 @@    ]    "TARGET_HAVE_MVE"    "vbic.i%#    %q0, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbicq_n_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1909,7 +2021,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vcadd.f%#    %q0, %q1, %q2, #" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcaddq")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1923,7 +2036,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vcmp.f%#    , %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr
"mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpq_f")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1938,7 +2052,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vcmp.f%#    , %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpq_n_f")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1953,7 +2068,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vcmul.f%#    %q0, %q1, %q2, #" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmulq")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1968,7 +2084,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vctpt. %1" -  [(set_attr "type" "mve_move") +  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vctpq")) +   (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -1983,7 +2100,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vcvtb.f16.f32 %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f16_f32v8hf")) +  (set_attr "type" "mve_move")  ])  ;; @@ -1998,7 +2116,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vcvtt.f16.f32 %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f16_f32v8hf")) +  (set_attr "type" "mve_move")  ])  ;; @@ -2012,7 +2131,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "veor %q0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_veorq_f")) +  (set_attr "type" "mve_move")  ])  ;; @@ -2027,7 +2147,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vmaxnma.f%#    %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxnmaq_f")) +  (set_attr "type" "mve_move")  ])  ;; @@ -2042,7 +2163,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vmaxnmav.f%#    %0, %q2" -  [(set_attr "type" 
"mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxnmavq_f")) +  (set_attr "type" "mve_move")  ])  ;; @@ -2056,7 +2178,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vmaxnm.f%#    %q0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxnmq_f")) +  (set_attr "type" "mve_move")  ])  ;; @@ -2071,7 +2194,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vmaxnmv.f%#    %0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxnmvq_f")) +  (set_attr "type" "mve_move")  ])  ;; @@ -2086,7 +2210,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vminnma.f%#    %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminnmaq_f")) +  (set_attr "type" "mve_move")  ])  ;; @@ -2101,7 +2226,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vminnmav.f%#    %0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminnmavq_f")) +  (set_attr "type" "mve_move")  ])  ;; @@ -2115,7 +2241,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vminnm.f%#    %q0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminnmq_f")) +  (set_attr "type" "mve_move")  ])  ;; @@ -2130,7 +2257,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vminnmv.f%#    %0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminnmvq_f")) +  (set_attr "type" "mve_move")  ])  ;; @@ -2145,7 +2273,8 @@    ]    "TARGET_HAVE_MVE"    "vmlaldav.%#    %Q0, %R0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlaldavq_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -2160,7 +2289,8 @@    ]    "TARGET_HAVE_MVE"    "vmlaldavx.s%# %Q0, %R0, %q1, %q2" -  [(set_attr 
"type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlaldavxq_s")) +  (set_attr "type" "mve_move")  ])  ;; @@ -2175,7 +2305,8 @@    ]    "TARGET_HAVE_MVE"    "vmlsldav.s%# %Q0, %R0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsldavq_s")) +  (set_attr "type" "mve_move")  ])  ;; @@ -2190,7 +2321,8 @@    ]    "TARGET_HAVE_MVE"    "vmlsldavx.s%# %Q0, %R0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsldavxq_s")) +  (set_attr "type" "mve_move")  ])  ;; @@ -2205,7 +2337,8 @@    ]    "TARGET_HAVE_MVE"    "vmovnb.i%#    %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmovnbq_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -2220,7 +2353,8 @@    ]    "TARGET_HAVE_MVE"    "vmovnt.i%#    %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmovntq_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -2234,7 +2368,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vmul.f%#    %q0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulq_f")) +  (set_attr "type" "mve_move")  ])  ;; @@ -2249,7 +2384,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vmul.f%#    %q0, %q1, %2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulq_n_f")) +  (set_attr "type" "mve_move")  ])  ;; @@ -2263,7 +2399,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vorn %q0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_f")) +  (set_attr "type" "mve_move")  ])  ;; @@ -2277,7 +2414,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vorr %q0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") 
(symbol_ref "CODE_FOR_mve_vorrq_f"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -2292,7 +2430,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vorr.i%#    %q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vorrq_n_"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -2307,7 +2446,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vqdmullb.s%#    %q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmullbq_n_s"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -2322,7 +2462,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vqdmullb.s%#    %q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmullbq_s"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -2337,7 +2478,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vqdmullt.s%#    %q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmulltq_n_s"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -2352,7 +2494,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vqdmullt.s%#    %q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmulltq_s"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -2367,7 +2510,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vqmovnb.%#    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqmovnbq_"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -2382,7 +2526,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vqmovnt.%#    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqmovntq_"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -2397,7 +2542,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vqmovunb.s%#    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqmovunbq_s"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -2412,7 +2558,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vqmovunt.s%#    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqmovuntq_s"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -2427,7 +2574,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vrmlaldavhx.s32 %Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlaldavhxq_sv4si"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -2442,7 +2590,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vrmlsldavh.s32\t%Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlsldavhq_sv4si"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -2457,7 +2606,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vrmlsldavhx.s32\t%Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlsldavhxq_sv4si"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -2472,7 +2622,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vshllb.%#\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshllbq_n_"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -2487,7 +2638,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vshllt.%#\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlltq_n_"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -2501,7 +2653,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vsub.f%#\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsubq_f"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -2516,7 +2669,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vmullt.p%#\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulltq_poly_p"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -2531,7 +2685,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vmullb.p%#\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmullbq_poly_p"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -2546,7 +2701,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vrmlaldavh.32\t%Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlaldavhq_v4si"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -2562,7 +2718,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vbict.i%#    %q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbicq_n_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
 ;; [vcmpeqq_m_f])
@@ -2577,7 +2734,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmpt.f%#    eq, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpeqq_f"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
 ;; [vcvtaq_m_u, vcvtaq_m_s])
@@ -2592,7 +2750,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtat.%#.f%#\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtaq_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
 ;; [vcvtq_m_to_f_s, vcvtq_m_to_f_u])
@@ -2607,7 +2766,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtt.f%#.%#  %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_to_f_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
 ;; [vqrshrnbq_n_u, vqrshrnbq_n_s])
@@ -2622,7 +2782,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vqrshrnb.%#    %q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshrnbq_n_"))
+  (set_attr "type" "mve_move")
 ])
 ;;
 ;; [vqrshrunbq_n_s])
@@ -2637,7 +2798,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vqrshrunb.s%#\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshrunbq_n_s"))
+  (set_attr "type" "mve_move")
 ])
 ;;
 ;; [vrmlaldavhaq_s vrmlaldavhaq_u])
@@ -2652,7 +2814,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vrmlaldavha.32\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlaldavhaq_v4si"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -2668,7 +2831,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vabav.%#\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vabavq_"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -2729,7 +2893,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vabst.s%#    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vabsq_s"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -2745,7 +2910,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vaddvat.%#    %0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddvaq_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -2761,7 +2927,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vclst.s%#    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vclsq_s"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -2777,7 +2944,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vclzt.i%#    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vclzq_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -2793,7 +2961,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.u%#    cs, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpcsq_n_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -2809,7 +2978,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.u%#    cs, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpcsq_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -2825,7 +2995,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.i%#    eq, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpeqq_n_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -2841,7 +3012,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.i%#    eq, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpeqq_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -2857,7 +3029,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.s%#    ge, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpgeq_n_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -2873,7 +3046,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.s%#    ge, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpgeq_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -2889,7 +3063,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.s%#    gt, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpgtq_n_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -2905,7 +3080,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.s%#    gt, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpgtq_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -2921,7 +3097,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.u%#    hi, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmphiq_n_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -2937,7 +3114,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.u%#    hi, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmphiq_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -2953,7 +3131,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.s%#    le, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpleq_n_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -2969,7 +3148,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.s%#    le, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpleq_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -2985,7 +3165,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.s%#    lt, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpltq_n_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3001,7 +3182,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.s%#    lt, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpltq_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3017,7 +3199,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.i%#    ne, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpneq_n_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3033,7 +3216,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.i%#    ne, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpneq_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3049,8 +3233,9 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vdupt.%#\t%q0, %2"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vdupq_n_"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 ;;
 ;; [vmaxaq_m_s])
@@ -3065,7 +3250,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmaxat.s%#    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxaq_s"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3081,7 +3267,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmaxavt.s%#    %0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxavq_s"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3097,7 +3284,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmaxvt.%#    %0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxvq_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3113,7 +3301,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vminat.s%#    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminaq_s"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3129,7 +3318,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vminavt.s%#    %0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminavq_s"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3145,7 +3335,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vminvt.%#\t%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminvq_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3161,7 +3352,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vmladava.%#    %0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmladavaq_"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -3177,7 +3369,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmladavt.%#\t%0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmladavq_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3193,7 +3386,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmladavxt.s%#\t%0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmladavxq_s"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3209,7 +3403,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vmla.%#\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlaq_n_"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -3225,7 +3420,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vmlas.%#    %q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlasq_n_"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -3241,7 +3437,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmlsdavt.s%#    %0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsdavq_s"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3257,7 +3454,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmlsdavxt.s%#    %0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsdavxq_s"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3273,7 +3471,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmvnt %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmvnq_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3289,7 +3488,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vnegt.s%#\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vnegq_s"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3305,7 +3505,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpsel %q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vpselq_"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -3321,7 +3522,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqabst.s%#\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqabsq_s"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3337,7 +3539,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vqdmlah.s%#\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmlahq_n_"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -3353,7 +3556,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vqdmlash.s%#\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmlashq_n_"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -3369,7 +3573,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqnegt.s%#    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqnegq_s"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3385,7 +3590,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vqrdmladh.s%#\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmladhq_s"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -3401,7 +3607,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vqrdmladhx.s%#\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmladhxq_s"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -3417,7 +3624,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vqrdmlah.s%#\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmlahq_n_"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -3433,7 +3641,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vqrdmlash.s%#\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmlashq_n_"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -3449,7 +3658,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vqrdmlsdh.s%#\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmlsdhq_s"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -3465,7 +3675,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vqrdmlsdhx.s%#\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmlsdhxq_s"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -3481,7 +3692,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqrshlt.%#    %q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshlq_n_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3497,7 +3709,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqshlt.%#\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshlq_r_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3513,7 +3726,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vrev64t.%#\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrev64q_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3529,7 +3743,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vrshlt.%#\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrshlq_n_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3545,7 +3760,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vshlt.%#\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlq_r_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3561,7 +3777,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vsli.%#\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsliq_n_"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -3577,7 +3794,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vsri.%#\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsriq_n_"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -3593,7 +3811,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vqdmlsdhx.s%#\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmlsdhxq_s"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -3609,7 +3828,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vqdmlsdh.s%#\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmlsdhq_s"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -3625,7 +3845,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vqdmladhx.s%#\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmladhxq_s"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -3641,7 +3862,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vqdmladh.s%#\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmladhq_s"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -3657,7 +3879,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vmlsdavax.s%#\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsdavaxq_s"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -3673,7 +3896,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vmlsdava.s%#\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsdavaq_s"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -3689,7 +3913,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vmladavax.s%#\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmladavaxq_s"))
+  (set_attr "type" "mve_move")
 ])
 ;;
 ;; [vabsq_m_f])
@@ -3704,7 +3929,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vabst.f%#    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vabsq_f"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3720,8 +3946,10 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vaddlvat.32\t%Q0, %R0, %q2"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddlvaq_v4si"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
+
 ;;
 ;; [vcmlaq, vcmlaq_rot90, vcmlaq_rot180, vcmlaq_rot270])
 ;;
@@ -3738,7 +3966,8 @@
   "@
    vcmul.f%#    %q0, %q2, %q3, #
    vcmla.f%#    %q0, %q2, %q3, #"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmlaq"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -3754,7 +3983,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmpt.f%#    eq, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpeqq_n_f"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3770,7 +4000,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmpt.f%#    ge, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpgeq_f"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3786,7 +4017,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmpt.f%#    ge, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpgeq_n_f"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3802,7 +4034,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmpt.f%#    gt, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpgtq_f"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3818,7 +4051,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmpt.f%#    gt, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpgtq_n_f"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3834,7 +4068,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmpt.f%#    le, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpleq_f"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3850,7 +4085,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmpt.f%#    le, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpleq_n_f"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3866,7 +4102,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmpt.f%#    lt, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpltq_f"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3882,7 +4119,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmpt.f%#    lt, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpltq_n_f"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3898,7 +4136,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmpt.f%#    ne, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpneq_f"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3914,7 +4153,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmpt.f%#    ne, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpneq_n_f"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3930,7 +4170,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtbt.f16.f32 %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f16_f32v8hf"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3946,7 +4187,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtbt.f32.f16 %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f32_f16v4sf"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3962,7 +4204,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvttt.f16.f32 %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f16_f32v8hf"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3978,7 +4221,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvttt.f32.f16 %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f32_f16v4sf"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -3994,8 +4238,9 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vdupt.%#\t%q0, %2"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vdupq_n_f"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 ;;
 ;; [vfmaq_f])
@@ -4010,7 +4255,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vfma.f%#    %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vfmaq_f"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -4026,7 +4272,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vfma.f%#    %q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vfmaq_n_f"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -4042,7 +4289,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vfmas.f%#    %q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vfmasq_n_f"))
+  (set_attr "type" "mve_move")
 ])
 ;;
 ;; [vfmsq_f])
@@ -4057,7 +4305,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vfms.f%#    %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vfmsq_f"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -4073,7 +4322,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vmaxnmat.f%#    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxnmaq_f"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
 ;; [vmaxnmavq_p_f])
@@ -4088,7 +4338,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vmaxnmavt.f%#    %0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxnmavq_f"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -4104,7 +4355,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vmaxnmvt.f%#    %0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxnmvq_f"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
 ;; [vminnmaq_m_f])
@@ -4119,7 +4371,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vminnmat.f%#    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminnmaq_f"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -4135,7 +4388,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vminnmavt.f%#    %0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminnmavq_f"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
 ;; [vminnmvq_p_f])
@@ -4150,7 +4404,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vminnmvt.f%#    %0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminnmvq_f"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -4166,7 +4421,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vmlaldava.%#\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlaldavaq_"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -4182,7 +4438,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vmlaldavax.s%#\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlaldavaxq_s"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -4198,7 +4455,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmlaldavt.%# %Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlaldavq_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -4214,7 +4472,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmlaldavxt.s%#\t%Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlaldavxq_s"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
 ;; [vmlsldavaq_s])
@@ -4229,7 +4488,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vmlsldava.s%# %Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsldavaq_s"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -4245,7 +4505,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vmlsldavax.s%# %Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsldavaxq_s"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -4261,7 +4522,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmlsldavt.s%# %Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsldavq_s"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -4277,7 +4539,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmlsldavxt.s%# %Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsldavxq_s"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
 ;; [vmovlbq_m_u, vmovlbq_m_s])
@@ -4292,7 +4555,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmovlbt.%#    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmovlbq_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
 ;; [vmovltq_m_u, vmovltq_m_s])
@@ -4307,7 +4571,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmovltt.%#    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmovltq_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
 ;; [vmovnbq_m_u, vmovnbq_m_s])
@@ -4322,7 +4587,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmovnbt.i%#    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmovnbq_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -4338,7 +4604,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmovntt.i%#    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmovntq_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -4354,7 +4621,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmvnt.i%#    %q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmvnq_n_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
 ;; [vnegq_m_f])
@@ -4369,7 +4637,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vnegt.f%#    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vnegq_f"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -4385,7 +4654,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vorrt.i%#    %q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vorrq_n_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
 ;; [vpselq_f])
@@ -4400,7 +4670,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpsel %q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vpselq_f"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -4416,7 +4687,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqmovnbt.%#    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqmovnbq_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -4432,7 +4704,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqmovntt.%#    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqmovntq_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -4448,7 +4721,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqmovunbt.s%#    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqmovunbq_s"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -4464,7 +4738,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqmovuntt.s%#    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqmovuntq_s"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -4480,7 +4755,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vqrshrnt.%#    %q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshrntq_n_"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -4496,7 +4772,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vqrshrunt.s%#    %q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshruntq_n_s"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -4512,7 +4789,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vqshrnb.%#\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshrnbq_n_"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -4528,7 +4806,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vqshrnt.%#    %q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshrntq_n_"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -4544,7 +4823,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vqshrunb.s%#    %q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshrunbq_n_s"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -4560,7 +4840,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vqshrunt.s%#    %q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshruntq_n_s"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -4576,7 +4857,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vrev32t.16 %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrev32q_fv8hf"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -4592,7 +4874,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vrev32t.%#    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrev32q_"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -4608,7 +4891,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vrev64t.%#    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrev64q_f"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -4624,7 +4908,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vrmlaldavhax.s32 %Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlaldavhaxq_sv4si"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -4640,7 +4925,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vrmlaldavhxt.s32 %Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlaldavhxq_sv4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -4656,7 +4942,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vrmlsldavhax.s32 %Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlsldavhaxq_sv4si"))
+  (set_attr "type" "mve_move")
 ])
 ;;
@@ -4672,7 +4959,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vrmlsldavht.s32 %Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlsldavhq_sv4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -4688,7 +4976,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vrmlsldavhxt.s32 %Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlsldavhxq_sv4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -4704,7 +4993,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vrintat.f%#    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndaq_f"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -4720,7 +5010,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vrintmt.f%#    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndmq_f"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
@@ -4736,7 +5027,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vrintnt.f%#    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndnq_f"))
+  (set_attr
"type" "mve_move")     (set_attr "length""8")])  ;; @@ -4752,7 +5044,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vrintpt.f%#    %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndpq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4768,7 +5061,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vrintxt.f%#    %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndxq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4784,7 +5078,8 @@    ]    "TARGET_HAVE_MVE"    "vrshrnb.i%#    %q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrshrnbq_n_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -4800,7 +5095,8 @@    ]    "TARGET_HAVE_MVE"    "vrshrnt.i%#    %q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrshrntq_n_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -4816,7 +5112,8 @@    ]    "TARGET_HAVE_MVE"    "vshrnb.i%#    %q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshrnbq_n_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -4832,7 +5129,8 @@    ]    "TARGET_HAVE_MVE"    "vshrnt.i%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshrntq_n_")) +  (set_attr "type" "mve_move")  ])  ;; @@ -4848,7 +5146,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvtmt.%#.f%#\t%q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtmq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4864,7 +5163,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvtpt.%#.f%#\t%q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr 
"mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtpq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4880,7 +5180,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvtnt.%#.f%#\t%q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtnq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4897,7 +5198,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvtt.%#.f%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_from_f_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4913,7 +5215,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vrev16t.8 %q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrev16q_v16qi")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4929,7 +5232,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvtt.%#.f%#\t%q0, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_from_f_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4945,7 +5249,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vrmlaldavht.32 %Q0, %R0, %q1, %q2" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlaldavhq_v4si")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -4961,7 +5266,8 @@    ]    "TARGET_HAVE_MVE"    "vrmlsldavha.s32 %Q0, %R0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlsldavhaq_sv4si")) +  (set_attr "type" "mve_move")  ])  ;; @@ -4978,7 +5284,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vabavt.%#\t%0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vabavq_")) +  (set_attr "type" "mve_move")  ])  ;; @@ 
-4995,7 +5302,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\n\tvqshlut.s%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshluq_n_s")) +  (set_attr "type" "mve_move")])  ;;  ;; [vshlq_m_s, vshlq_m_u]) @@ -5011,7 +5319,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vshlt.%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlq_")) +  (set_attr "type" "mve_move")])  ;;  ;; [vsriq_m_n_s, vsriq_m_n_u]) @@ -5027,7 +5336,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vsrit.%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsriq_n_")) +  (set_attr "type" "mve_move")])  ;;  ;; [vsubq_m_u, vsubq_m_s]) @@ -5043,7 +5353,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vsubt.i%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsubq_")) +  (set_attr "type" "mve_move")])  ;;  ;; [vcvtq_m_n_to_f_u, vcvtq_m_n_to_f_s]) @@ -5059,7 +5370,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvtt.f%#.%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_to_f_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;;  ;; [vabdq_m_s, vabdq_m_u]) @@ -5075,7 +5387,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vabdt.%#    %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vabdq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5092,7 +5405,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vaddt.i%#    %q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5109,7 +5423,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vaddt.i%#    %q0, %q2, %q3" -  [(set_attr 
"type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddq")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5126,7 +5441,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vandt %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vandq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5143,7 +5459,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vbict %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbicq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5160,7 +5477,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vbrsrt.%#    %q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbrsrq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5177,7 +5495,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vcaddt.i%#    %q0, %q2, %q3, #270" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcaddq_rot270")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5194,7 +5513,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vcaddt.i%#    %q0, %q2, %q3, #90" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcaddq_rot90")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5211,7 +5531,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;veort %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_veorq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5228,7 +5549,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vhaddt.%#    %q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhaddq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5245,7 +5567,8 
@@    ]    "TARGET_HAVE_MVE"    "vpst\;vhaddt.%#    %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhaddq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5262,7 +5585,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vhsubt.%#    %q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhsubq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5279,7 +5603,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vhsubt.%#    %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhsubq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5296,7 +5621,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmaxt.%#    %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5313,7 +5639,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmint.%#    %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5330,7 +5657,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmladavat.%#    %0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmladavaq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5347,7 +5675,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmlat.%#    %q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlaq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5364,7 +5693,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmlast.%#    %q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlasq_n_")) + 
 (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5381,7 +5711,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmulht.%#    %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulhq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5398,7 +5729,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmullbt.%#    %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmullbq_int_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5415,7 +5747,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmulltt.%#    %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulltq_int_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5432,7 +5765,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmult.i%#    %q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5449,7 +5783,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmult.i%#    %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5466,7 +5801,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vornt %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5483,7 +5819,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vorrt %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vorrq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5500,7 +5837,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqaddt.%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + 
[(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqaddq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5517,7 +5855,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqaddt.%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqaddq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5534,7 +5873,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqdmlaht.s%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmlahq_n_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5551,7 +5891,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqdmlasht.s%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmlashq_n_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5568,7 +5909,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqrdmlaht.s%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmlahq_n_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5585,7 +5927,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqrdmlasht.s%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmlashq_n_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5602,7 +5945,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqrshlt.%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshlq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5619,7 +5963,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqshlt.%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshlq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5636,7 
+5981,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqshlt.%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshlq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5653,7 +5999,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqsubt.%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqsubq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5670,7 +6017,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqsubt.%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqsubq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5687,7 +6035,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vrhaddt.%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrhaddq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5704,7 +6053,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vrmulht.%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmulhq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5721,7 +6071,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vrshlt.%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrshlq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5738,7 +6089,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vrshrt.%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrshrq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5755,7 +6107,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vshlt.%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlq_n_")) +  
(set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5772,7 +6125,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vshrt.%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshrq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5789,7 +6143,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vslit.%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsliq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5806,7 +6161,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vsubt.i%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsubq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5823,7 +6179,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vhcaddt.s%#\t%q0, %q2, %q3, #270" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhcaddq_rot270_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5840,7 +6197,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vhcaddt.s%#\t%q0, %q2, %q3, #90" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhcaddq_rot90_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5857,7 +6215,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmladavaxt.s%#\t%0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmladavaxq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5874,7 +6233,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmlsdavat.s%#\t%0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsdavaq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5891,7 +6251,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmlsdavaxt.s%#\t%0, %q2, 
%q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsdavaxq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5908,7 +6269,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqdmladht.s%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmladhq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5925,7 +6287,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqdmladhxt.s%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmladhxq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5942,7 +6305,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqdmlsdht.s%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmlsdhq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5959,7 +6323,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqdmlsdhxt.s%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmlsdhxq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5976,7 +6341,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqdmulht.s%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmulhq_n_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -5993,7 +6359,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqdmulht.s%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmulhq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6010,7 +6377,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqrdmladht.s%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmladhq_s")) +  (set_attr 
"type" "mve_move")     (set_attr "length""8")])  ;; @@ -6027,7 +6395,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqrdmladhxt.s%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmladhxq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6044,7 +6413,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqrdmlsdht.s%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmlsdhq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6061,7 +6431,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqrdmlsdhxt.s%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmlsdhxq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6078,7 +6449,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqrdmulht.s%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmulhq_n_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6095,7 +6467,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqrdmulht.s%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmulhq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6112,7 +6485,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmlaldavat.%#    %Q0, %R0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlaldavaq_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6129,8 +6503,9 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmlaldavaxt.%#\t%Q0, %R0, %q2, %q3" -  [(set_attr "type" "mve_move") -   (set_attr "length""8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlaldavaxq_")) +  (set_attr "type" "mve_move") +  (set_attr "length""8")])  ;;  ;; [vqrshrnbq_m_n_u, 
vqrshrnbq_m_n_s]) @@ -6146,7 +6521,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqrshrnbt.%#    %q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshrnbq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6163,7 +6539,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqrshrntt.%#    %q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshrntq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6180,7 +6557,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\n\tvqshrnbt.%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshrnbq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6197,7 +6575,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqshrntt.%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshrntq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6214,7 +6593,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vrmlaldavhat.s32\t%Q0, %R0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlaldavhaq_sv4si")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6231,7 +6611,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vrshrnbt.i%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrshrnbq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6248,7 +6629,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vrshrntt.i%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrshrntq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6265,7 +6647,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vshllbt.%#\t%q0, %q2, %3" -  [(set_attr "type" 
"mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshllbq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6282,7 +6665,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vshlltt.%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlltq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6299,7 +6683,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vshrnbt.i%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshrnbq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6316,7 +6701,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vshrntt.i%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshrntq_n_")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6333,7 +6719,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmlsldavat.s%#\t%Q0, %R0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsldavaq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6350,7 +6737,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmlsldavaxt.s%#\t%Q0, %R0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsldavaxq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6367,7 +6755,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmullbt.p%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmullbq_poly_p")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6384,7 +6773,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vmulltt.p%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulltq_poly_p")) +  (set_attr "type" "mve_move")     
(set_attr "length""8")])  ;; @@ -6401,7 +6791,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqdmullbt.s%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmullbq_n_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6418,7 +6809,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqdmullbt.s%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmullbq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6435,7 +6827,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqdmulltt.s%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmulltq_n_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6452,7 +6845,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqdmulltt.s%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmulltq_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6469,7 +6863,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqrshrunbt.s%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshrunbq_n_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6486,7 +6881,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqrshruntt.s%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshruntq_n_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6503,7 +6899,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqshrunbt.s%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshrunbq_n_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6520,7 +6917,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vqshruntt.s%#\t%q0, %q2, %3" -  
[(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshruntq_n_s")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6537,7 +6935,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vrmlaldavhat.u32\t%Q0, %R0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlaldavhaq_uv4si")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6554,7 +6953,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vrmlaldavhaxt.s32\t%Q0, %R0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlaldavhaxq_sv4si")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6571,7 +6971,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vrmlsldavhat.s32\t%Q0, %R0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlsldavhaq_sv4si")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6588,7 +6989,8 @@    ]    "TARGET_HAVE_MVE"    "vpst\;vrmlsldavhaxt.s32\t%Q0, %R0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlsldavhaxq_sv4si")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;;  ;; [vabdq_m_f]) @@ -6604,7 +7006,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vabdt.f%#    %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vabdq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6621,7 +7024,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vaddt.f%#    %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6638,7 +7042,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vaddt.f%#    %q0, %q2, 
%3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddq_n_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6655,7 +7060,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vandt %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vandq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6672,7 +7078,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vbict %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbicq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6689,7 +7096,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vbrsrt.%#    %q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbrsrq_n_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6706,7 +7114,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcaddt.f%#    %q0, %q2, %q3, #270" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcaddq_rot270")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6723,7 +7132,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcaddt.f%#    %q0, %q2, %q3, #90" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcaddq_rot90")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6740,7 +7150,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcmlat.f%#    %q0, %q2, %q3, #0" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmlaq")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6757,7 +7168,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcmlat.f%# 
   %q0, %q2, %q3, #180" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmlaq_rot180")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6774,7 +7186,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcmlat.f%#    %q0, %q2, %q3, #270" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmlaq_rot270")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6791,7 +7204,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcmlat.f%#    %q0, %q2, %q3, #90" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmlaq_rot90")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6808,7 +7222,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcmult.f%#    %q0, %q2, %q3, #0" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmulq")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6825,7 +7240,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcmult.f%#    %q0, %q2, %q3, #180" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmulq_rot180")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6842,7 +7258,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcmult.f%#    %q0, %q2, %q3, #270" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmulq_rot270")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6859,7 +7276,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vcmult.f%#    %q0, %q2, %q3, #90" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmulq_rot90")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6876,7 +7294,8 
@@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;veort %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_veorq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6893,7 +7312,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vfmat.f%#    %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vfmaq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6910,7 +7330,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vfmat.f%#    %q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vfmaq_n_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6927,7 +7348,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vfmast.f%#    %q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vfmasq_n_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6944,7 +7366,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vfmst.f%#    %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vfmsq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6961,7 +7384,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vmaxnmt.f%#    %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxnmq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6978,7 +7402,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vminnmt.f%#    %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminnmq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -6995,7 +7420,8 
@@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vmult.f%#    %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -7012,7 +7438,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vmult.f%#    %q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulq_n_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -7029,7 +7456,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vornt %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -7046,7 +7474,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vorrt %q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vorrq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -7063,7 +7492,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vsubt.f%#\t%q0, %q2, %q3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsubq_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -7080,7 +7510,8 @@    ]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vsubt.f%#\t%q0, %q2, %3" -  [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsubq_n_f")) +  (set_attr "type" "mve_move")     (set_attr "length""8")])  ;; @@ -7100,7 +7531,8 @@     output_asm_insn("vstrb.\t%q1, %E0",ops);     return "";  } -  [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_")) +  (set_attr "length" "4")])  ;;  ;; [vstrbq_scatter_offset_s vstrbq_scatter_offset_u] @@ -7128,7 +7560,8 @@        VSTRBSOQ))]    
"TARGET_HAVE_MVE"    "vstrb.\t%q2, [%0, %q1]" -  [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_scatter_offset__insn")) +  (set_attr "length" "4")])  ;;  ;; [vstrwq_scatter_base_s vstrwq_scatter_base_u] @@ -7150,7 +7583,8 @@     output_asm_insn("vstrw.u32\t%q2, [%q0, %1]",ops);     return "";  } -  [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_v4si")) +  (set_attr "length" "4")])  ;;  ;; [vldrbq_gather_offset_s vldrbq_gather_offset_u] @@ -7173,7 +7607,8 @@       output_asm_insn ("vldrb.\t%q0, [%m1, %q2]",ops);     return "";  } -  [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_gather_offset_")) +  (set_attr "length" "4")])  ;;  ;; [vldrbq_s vldrbq_u] @@ -7195,7 +7630,8 @@       output_asm_insn ("vldrb.\t%q0, %E1",ops);     return "";  } -  [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_")) +  (set_attr "length" "4")])  ;;  ;; [vldrwq_gather_base_s vldrwq_gather_base_u] @@ -7215,7 +7651,8 @@     output_asm_insn ("vldrw.u32\t%q0, [%q1, %2]",ops);     return "";  } -  [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_v4si")) +  (set_attr "length" "4")])  ;;  ;; [vstrbq_scatter_offset_p_s vstrbq_scatter_offset_p_u] @@ -7247,7 +7684,8 @@        VSTRBSOQ))]    "TARGET_HAVE_MVE"    "vpst\;vstrbt.\t%q2, [%0, %q1]" -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_scatter_offset__insn")) +  (set_attr "length" "8")])  ;;  ;; [vstrwq_scatter_base_p_s vstrwq_scatter_base_p_u] @@ -7270,7 +7708,8 @@     output_asm_insn ("vpst\n\tvstrwt.u32\t%q2, [%q0, %1]",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_v4si")) +  (set_attr "length" "8")])  (define_insn 
"mve_vstrbq_p_"    [(set (match_operand: 0 "mve_memory_operand" "=Ux") @@ -7288,7 +7727,8 @@     output_asm_insn ("vpst\;vstrbt.\t%q1, %E0",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_")) +  (set_attr "length" "8")])  ;;  ;; [vldrbq_gather_offset_z_s vldrbq_gather_offset_z_u] @@ -7313,7 +7753,8 @@       output_asm_insn ("vpst\n\tvldrbt.\t%q0, [%m1, %q2]",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_gather_offset_")) +  (set_attr "length" "8")])  ;;  ;; [vldrbq_z_s vldrbq_z_u] @@ -7336,7 +7777,8 @@       output_asm_insn ("vpst\;vldrbt.\t%q0, %E1",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_")) +  (set_attr "length" "8")])  ;;  ;; [vldrwq_gather_base_z_s vldrwq_gather_base_z_u] @@ -7357,7 +7799,8 @@     output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%q1, %2]",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_v4si")) +  (set_attr "length" "8")])  ;;  ;; [vldrhq_f] @@ -7376,7 +7819,8 @@     output_asm_insn ("vldrh.16\t%q0, %E1",ops);     return "";  } -  [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_fv8hf")) +  (set_attr "length" "4")])  ;;  ;; [vldrhq_gather_offset_s vldrhq_gather_offset_u] @@ -7399,7 +7843,8 @@       output_asm_insn ("vldrh.\t%q0, [%m1, %q2]",ops);     return "";  } -  [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_")) +  (set_attr "length" "4")])  ;;  ;; [vldrhq_gather_offset_z_s vldrhq_gather_offset_z_u] @@ -7424,7 +7869,8 @@       output_asm_insn ("vpst\n\tvldrht.\t%q0, [%m1, %q2]",ops);     return "";  } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vldrhq_gather_offset_")) +  (set_attr "length" "8")])  ;;  ;; [vldrhq_gather_shifted_offset_s vldrhq_gather_shifted_offset_u] @@ -7447,7 +7893,8 @@       output_asm_insn ("vldrh.\t%q0, [%m1, %q2, uxtw #1]",ops);     return "";  } -  [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_")) +  (set_attr "length" "4")])  ;;  ;; [vldrhq_gather_shifted_offset_z_s vldrhq_gather_shited_offset_z_u] @@ -7472,7 +7919,8 @@       output_asm_insn ("vpst\n\tvldrht.\t%q0, [%m1, %q2, uxtw #1]",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_")) +  (set_attr "length" "8")])  ;;  ;; [vldrhq_s, vldrhq_u] @@ -7494,7 +7942,8 @@       output_asm_insn ("vldrh.\t%q0, %E1",ops);     return "";  } -  [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_")) +  (set_attr "length" "4")])  ;;  ;; [vldrhq_z_f] @@ -7514,7 +7963,8 @@     output_asm_insn ("vpst\;vldrht.16\t%q0, %E1",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_fv8hf")) +  (set_attr "length" "8")])  ;;  ;; [vldrhq_z_s vldrhq_z_u] @@ -7537,7 +7987,8 @@       output_asm_insn ("vpst\;vldrht.\t%q0, %E1",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_")) +  (set_attr "length" "8")])  ;;  ;; [vldrwq_f] @@ -7556,7 +8007,8 @@     output_asm_insn ("vldrw.32\t%q0, %E1",ops);     return "";  } -  [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_fv4sf")) +  (set_attr "length" "4")])  ;;  ;; [vldrwq_s vldrwq_u] @@ -7575,7 +8027,8 @@     output_asm_insn ("vldrw.32\t%q0, %E1",ops);     return "";  } -  [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_v4si")) +  
(set_attr "length" "4")])  ;;  ;; [vldrwq_z_f] @@ -7595,7 +8048,8 @@     output_asm_insn ("vpst\;vldrwt.32\t%q0, %E1",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_fv4sf")) +  (set_attr "length" "8")])  ;;  ;; [vldrwq_z_s vldrwq_z_u] @@ -7615,7 +8069,8 @@     output_asm_insn ("vpst\;vldrwt.32\t%q0, %E1",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_v4si")) +  (set_attr "length" "8")])  (define_expand "mve_vld1q_f"    [(match_operand:MVE_0 0 "s_register_operand") @@ -7655,7 +8110,8 @@     output_asm_insn ("vldrd.64\t%q0, [%q1, %2]",ops);     return "";  } -  [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_v2di")) +  (set_attr "length" "4")])  ;;  ;; [vldrdq_gather_base_z_s vldrdq_gather_base_z_u] @@ -7676,7 +8132,8 @@     output_asm_insn ("vpst\n\tvldrdt.u64\t%q0, [%q1, %2]",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_v2di")) +  (set_attr "length" "8")])  ;;  ;; [vldrdq_gather_offset_s vldrdq_gather_offset_u] @@ -7696,7 +8153,8 @@    output_asm_insn ("vldrd.u64\t%q0, [%m1, %q2]",ops);    return "";  } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_offset_v2di")) +  (set_attr "length" "4")])  ;;  ;; [vldrdq_gather_offset_z_s vldrdq_gather_offset_z_u] @@ -7717,7 +8175,8 @@    output_asm_insn ("vpst\n\tvldrdt.u64\t%q0, [%m1, %q2]",ops);    return "";  } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_offset_v2di")) +  (set_attr "length" "8")])  ;;  ;; [vldrdq_gather_shifted_offset_s vldrdq_gather_shifted_offset_u] @@ -7737,7 +8196,8 @@     output_asm_insn ("vldrd.u64\t%q0, [%m1, %q2, uxtw #3]",ops);     return "";  } -  [(set_attr "length" 
"4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_shifted_offset_v2di")) +  (set_attr "length" "4")])  ;;  ;; [vldrdq_gather_shifted_offset_z_s vldrdq_gather_shifted_offset_z_u] @@ -7758,7 +8218,8 @@     output_asm_insn ("vpst\n\tvldrdt.u64\t%q0, [%m1, %q2, uxtw #3]",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_shifted_offset_v2di")) +  (set_attr "length" "8")])  ;;  ;; [vldrhq_gather_offset_f] @@ -7778,7 +8239,8 @@     output_asm_insn ("vldrh.f16\t%q0, [%m1, %q2]",ops);     return "";  } -  [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_fv8hf")) +  (set_attr "length" "4")])  ;;  ;; [vldrhq_gather_offset_z_f] @@ -7800,7 +8262,8 @@     output_asm_insn ("vpst\n\tvldrht.f16\t%q0, [%m1, %q2]",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_fv8hf")) +  (set_attr "length" "8")])  ;;  ;; [vldrhq_gather_shifted_offset_f] @@ -7820,7 +8283,8 @@     output_asm_insn ("vldrh.f16\t%q0, [%m1, %q2, uxtw #1]",ops);     return "";  } -  [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_fv8hf")) +  (set_attr "length" "4")])  ;;  ;; [vldrhq_gather_shifted_offset_z_f] @@ -7842,7 +8306,8 @@     output_asm_insn ("vpst\n\tvldrht.f16\t%q0, [%m1, %q2, uxtw #1]",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_fv8hf")) +  (set_attr "length" "8")])  ;;  ;; [vldrwq_gather_base_f] @@ -7862,7 +8327,8 @@     output_asm_insn ("vldrw.u32\t%q0, [%q1, %2]",ops);     return "";  } -  [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_fv4sf")) +  (set_attr "length" "4")])  ;;  ;; 
[vldrwq_gather_base_z_f] @@ -7883,7 +8349,8 @@     output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%q1, %2]",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_fv4sf")) +  (set_attr "length" "8")])  ;;  ;; [vldrwq_gather_offset_f] @@ -7903,7 +8370,8 @@     output_asm_insn ("vldrw.u32\t%q0, [%m1, %q2]",ops);     return "";  } -  [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_fv4sf")) +  (set_attr "length" "4")])  ;;  ;; [vldrwq_gather_offset_s vldrwq_gather_offset_u] @@ -7923,7 +8391,8 @@     output_asm_insn ("vldrw.u32\t%q0, [%m1, %q2]",ops);     return "";  } -  [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_v4si")) +  (set_attr "length" "4")])  ;;  ;; [vldrwq_gather_offset_z_f] @@ -7945,7 +8414,8 @@     output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2]",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_fv4sf")) +  (set_attr "length" "8")])  ;;  ;; [vldrwq_gather_offset_z_s vldrwq_gather_offset_z_u] @@ -7967,7 +8437,8 @@     output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2]",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_v4si")) +  (set_attr "length" "8")])  ;;  ;; [vldrwq_gather_shifted_offset_f] @@ -7987,7 +8458,8 @@     output_asm_insn ("vldrw.u32\t%q0, [%m1, %q2, uxtw #2]",ops);     return "";  } -  [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_fv4sf")) +  (set_attr "length" "4")])  ;;  ;; [vldrwq_gather_shifted_offset_s vldrwq_gather_shifted_offset_u] @@ -8007,7 +8479,8 @@     output_asm_insn ("vldrw.u32\t%q0, [%m1, %q2, uxtw #2]",ops);     return "";  } -  [(set_attr "length" "4")]) + 
[(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_v4si")) +  (set_attr "length" "4")])  ;;  ;; [vldrwq_gather_shifted_offset_z_f] @@ -8029,7 +8502,8 @@     output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2, uxtw #2]",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_fv4sf")) +  (set_attr "length" "8")])  ;;  ;; [vldrwq_gather_shifted_offset_z_s vldrwq_gather_shifted_offset_z_u] @@ -8051,7 +8525,8 @@     output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2, uxtw #2]",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_v4si")) +  (set_attr "length" "8")])  ;;  ;; [vstrhq_f] @@ -8070,7 +8545,8 @@     output_asm_insn ("vstrh.16\t%q1, %E0",ops);     return "";  } -  [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_fv8hf")) +  (set_attr "length" "4")])  ;;  ;; [vstrhq_p_f] @@ -8091,7 +8567,8 @@     output_asm_insn ("vpst\;vstrht.16\t%q1, %E0",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_fv8hf")) +  (set_attr "length" "8")])  ;;  ;; [vstrhq_p_s vstrhq_p_u] @@ -8113,7 +8590,8 @@     output_asm_insn ("vpst\;vstrht.\t%q1, %E0",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_")) +  (set_attr "length" "8")])  ;;  ;; [vstrhq_scatter_offset_p_s vstrhq_scatter_offset_p_u] @@ -8145,7 +8623,8 @@        VSTRHSOQ))]    "TARGET_HAVE_MVE"    "vpst\;vstrht.\t%q2, [%0, %q1]" -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset__insn")) +  (set_attr "length" "8")])  ;;  ;; [vstrhq_scatter_offset_s vstrhq_scatter_offset_u] @@ -8173,7 +8652,8 @@        VSTRHSOQ))]    "TARGET_HAVE_MVE"    
"vstrh.\t%q2, [%0, %q1]" -  [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset__insn")) +  (set_attr "length" "4")])  ;;  ;; [vstrhq_scatter_shifted_offset_p_s vstrhq_scatter_shifted_offset_p_u] @@ -8205,7 +8685,8 @@        VSTRHSSOQ))]    "TARGET_HAVE_MVE"    "vpst\;vstrht.\t%q2, [%0, %q1, uxtw #1]" -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset__insn")) +  (set_attr "length" "8")])  ;;  ;; [vstrhq_scatter_shifted_offset_s vstrhq_scatter_shifted_offset_u] @@ -8234,7 +8715,8 @@        VSTRHSSOQ))]    "TARGET_HAVE_MVE"    "vstrh.\t%q2, [%0, %q1, uxtw #1]" -  [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset__insn")) +  (set_attr "length" "4")])  ;;  ;; [vstrhq_s, vstrhq_u] @@ -8253,7 +8735,8 @@     output_asm_insn ("vstrh.\t%q1, %E0",ops);     return "";  } -  [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_")) +  (set_attr "length" "4")])  ;;  ;; [vstrwq_f] @@ -8272,7 +8755,8 @@     output_asm_insn ("vstrw.32\t%q1, %E0",ops);     return "";  } -  [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_fv4sf")) +  (set_attr "length" "4")])  ;;  ;; [vstrwq_p_f] @@ -8293,7 +8777,8 @@     output_asm_insn ("vpst\;vstrwt.32\t%q1, %E0",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_fv4sf")) +  (set_attr "length" "8")])  ;;  ;; [vstrwq_p_s vstrwq_p_u] @@ -8314,7 +8799,8 @@     output_asm_insn ("vpst\;vstrwt.32\t%q1, %E0",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_v4si")) +  (set_attr "length" "8")])  ;;  ;; [vstrwq_s vstrwq_u] @@ -8333,7 +8819,8 @@     output_asm_insn ("vstrw.32\t%q1, %E0",ops);     return "";  } 
-  [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_v4si")) +  (set_attr "length" "4")])  (define_expand "mve_vst1q_f"    [(match_operand: 0 "mve_memory_operand") @@ -8376,7 +8863,8 @@     output_asm_insn ("vpst\;\tvstrdt.u64\t%q2, [%q0, %1]",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_v2di")) +  (set_attr "length" "8")])  ;;  ;; [vstrdq_scatter_base_s vstrdq_scatter_base_u] @@ -8398,7 +8886,8 @@     output_asm_insn ("vstrd.u64\t%q2, [%q0, %1]",ops);     return "";  } -  [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_v2di")) +  (set_attr "length" "4")])  ;;  ;; [vstrdq_scatter_offset_p_s vstrdq_scatter_offset_p_u] @@ -8429,7 +8918,8 @@        VSTRDSOQ))]    "TARGET_HAVE_MVE"    "vpst\;vstrdt.64\t%q2, [%0, %q1]" -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_offset_v2di_insn")) +  (set_attr "length" "8")])  ;;  ;; [vstrdq_scatter_offset_s vstrdq_scatter_offset_u] @@ -8457,7 +8947,8 @@        VSTRDSOQ))]    "TARGET_HAVE_MVE"    "vstrd.64\t%q2, [%0, %q1]" -  [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_offset_v2di_insn")) +  (set_attr "length" "4")])  ;;  ;; [vstrdq_scatter_shifted_offset_p_s vstrdq_scatter_shifted_offset_p_u] @@ -8489,7 +8980,8 @@        VSTRDSSOQ))]    "TARGET_HAVE_MVE"    "vpst\;vstrdt.64\t%q2, [%0, %q1, UXTW #3]" -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_shifted_offset_v2di_insn")) +  (set_attr "length" "8")])  ;;  ;; [vstrdq_scatter_shifted_offset_s vstrdq_scatter_shifted_offset_u] @@ -8518,7 +9010,8 @@        VSTRDSSOQ))]    "TARGET_HAVE_MVE"    "vstrd.64\t%q2, [%0, %q1, UXTW #3]" -  [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") 
(symbol_ref "CODE_FOR_mve_vstrdq_scatter_shifted_offset_v2di_insn")) +  (set_attr "length" "4")])  ;;  ;; [vstrhq_scatter_offset_f] @@ -8546,7 +9039,8 @@        VSTRHQSO_F))]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vstrh.16\t%q2, [%0, %q1]" -  [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset_fv8hf_insn")) +  (set_attr "length" "4")])  ;;  ;; [vstrhq_scatter_offset_p_f] @@ -8577,7 +9071,8 @@        VSTRHQSO_F))]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vstrht.16\t%q2, [%0, %q1]" -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset_fv8hf_insn")) +  (set_attr "length" "8")])  ;;  ;; [vstrhq_scatter_shifted_offset_f] @@ -8605,7 +9100,8 @@        VSTRHQSSO_F))]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vstrh.16\t%q2, [%0, %q1, uxtw #1]" -  [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset_fv8hf_insn")) +  (set_attr "length" "4")])  ;;  ;; [vstrhq_scatter_shifted_offset_p_f] @@ -8637,7 +9133,8 @@        VSTRHQSSO_F))]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vstrht.16\t%q2, [%0, %q1, uxtw #1]" -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset_fv8hf_insn")) +  (set_attr "length" "8")])  ;;  ;; [vstrwq_scatter_base_f] @@ -8659,7 +9156,8 @@     output_asm_insn ("vstrw.u32\t%q2, [%q0, %1]",ops);     return "";  } -  [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_fv4sf")) +  (set_attr "length" "4")])  ;;  ;; [vstrwq_scatter_base_p_f] @@ -8682,7 +9180,8 @@     output_asm_insn ("vpst\n\tvstrwt.u32\t%q2, [%q0, %1]",ops);     return "";  } -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_fv4sf")) +  (set_attr "length" "8")])  ;;  ;; 
[vstrwq_scatter_offset_f] @@ -8710,7 +9209,8 @@        VSTRWQSO_F))]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vstrw.32\t%q2, [%0, %q1]" -  [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_fv4sf_insn")) +  (set_attr "length" "4")])  ;;  ;; [vstrwq_scatter_offset_p_f] @@ -8741,7 +9241,8 @@        VSTRWQSO_F))]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vstrwt.32\t%q2, [%0, %q1]" -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_fv4sf_insn")) +  (set_attr "length" "8")])  ;;  ;; [vstrwq_scatter_offset_s vstrwq_scatter_offset_u] @@ -8772,7 +9273,8 @@        VSTRWSOQ))]    "TARGET_HAVE_MVE"    "vpst\;vstrwt.32\t%q2, [%0, %q1]" -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_v4si_insn")) +  (set_attr "length" "8")])  ;;  ;; [vstrwq_scatter_offset_s vstrwq_scatter_offset_u] @@ -8800,7 +9302,8 @@        VSTRWSOQ))]    "TARGET_HAVE_MVE"    "vstrw.32\t%q2, [%0, %q1]" -  [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_v4si_insn")) +  (set_attr "length" "4")])  ;;  ;; [vstrwq_scatter_shifted_offset_f] @@ -8828,7 +9331,8 @@       VSTRWQSSO_F))]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vstrw.32\t%q2, [%0, %q1, uxtw #2]" -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_fv4sf_insn")) +  (set_attr "length" "8")])  ;;  ;; [vstrwq_scatter_shifted_offset_p_f] @@ -8860,7 +9364,8 @@        VSTRWQSSO_F))]    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"    "vpst\;vstrwt.32\t%q2, [%0, %q1, uxtw #2]" -  [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_fv4sf_insn")) +  (set_attr "length" "8")])  ;;  ;; [vstrwq_scatter_shifted_offset_p_s 
vstrwq_scatter_shifted_offset_p_u]
@@ -8892,7 +9397,8 @@
 	 VSTRWSSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrwt.32\t%q2, [%0, %q1, uxtw #2]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_v4si_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_shifted_offset_s vstrwq_scatter_shifted_offset_u]
@@ -8921,7 +9427,8 @@
 	 VSTRWSSOQ))]
   "TARGET_HAVE_MVE"
   "vstrw.32\t%q2, [%0, %q1, uxtw #2]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_v4si_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vaddq_s, vaddq_u])
@@ -8934,7 +9441,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vadd.i%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddq<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -8948,7 +9456,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vadd.f%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddq_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -9017,7 +9526,8 @@
 	 (match_operand:SI 6 "immediate_operand" "i")))]
  "TARGET_HAVE_MVE"
  "vpst\;\tvidupt.u%#<V_sz_elem>\t%q0, %2, %4"
- [(set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vidupq_u<mode>_insn"))
+  (set_attr "length""8")])
 
 ;;
 ;; [vddupq_n_u])
@@ -9085,7 +9595,8 @@
 	  (match_operand:SI 6 "immediate_operand" "i")))]
  "TARGET_HAVE_MVE"
  "vpst\;vddupt.u%#<V_sz_elem>\t%q0, %2, %4"
- [(set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vddupq_u<mode>_insn"))
+  (set_attr "length""8")])
 
 ;;
 ;; [vdwdupq_n_u])
@@ -9201,8 +9712,9 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vdwdupt.u%#<V_sz_elem>\t%q2, %3, %R4, %5"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vdwdupq_wb_u<mode>_insn"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [viwdupq_n_u])
@@ -9318,7 +9830,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;\tviwdupt.u%#<V_sz_elem>\t%q2, %3, %R4, %5"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_viwdupq_wb_u<mode>_insn"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -9344,7 +9857,8 @@
    output_asm_insn ("vstrw.u32\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_v4si"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_base_wb_p_s vstrwq_scatter_base_wb_p_u]
@@ -9370,7 +9884,8 @@
    output_asm_insn ("vpst\;\tvstrwt.u32\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_v4si"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_base_wb_f]
@@ -9395,7 +9910,8 @@
    output_asm_insn ("vstrw.u32\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_base_wb_p_f]
@@ -9421,7 +9937,8 @@
    output_asm_insn ("vpst\;\tvstrwt.u32\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrdq_scatter_base_wb_s vstrdq_scatter_base_wb_u]
@@ -9446,7 +9963,8 @@
    output_asm_insn ("vstrd.u64\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_wb_v2di"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrdq_scatter_base_wb_p_s vstrdq_scatter_base_wb_p_u]
@@ -9472,7 +9990,8 @@
    output_asm_insn ("vpst;vstrdt.u64\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_wb_v2di"))
+  (set_attr "length" "8")])
 
 (define_expand "mve_vldrwq_gather_base_wb_v4si"
   [(match_operand:V4SI 0 "s_register_operand")
@@ -9524,7 +10043,8 @@
    output_asm_insn ("vldrw.u32\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_v4si_insn"))
+  (set_attr "length" "4")])
 
 (define_expand "mve_vldrwq_gather_base_wb_z_v4si"
   [(match_operand:V4SI 0 "s_register_operand")
@@ -9580,7 +10100,8 @@
    output_asm_insn ("vpst\;vldrwt.u32\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_v4si_insn"))
+  (set_attr "length" "8")])
 
 (define_expand "mve_vldrwq_gather_base_wb_fv4sf"
   [(match_operand:V4SI 0 "s_register_operand")
@@ -9632,7 +10153,8 @@
    output_asm_insn ("vldrw.u32\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_fv4sf_insn"))
+  (set_attr "length" "4")])
 
 (define_expand "mve_vldrwq_gather_base_wb_z_fv4sf"
   [(match_operand:V4SI 0 "s_register_operand")
@@ -9689,7 +10211,8 @@
    output_asm_insn ("vpst\;vldrwt.u32\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_fv4sf_insn"))
+  (set_attr "length" "8")])
 
 (define_expand "mve_vldrdq_gather_base_wb_v2di"
   [(match_operand:V2DI 0 "s_register_operand")
@@ -9742,7 +10265,8 @@
    output_asm_insn ("vldrd.64\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_wb_v2di_insn"))
+  (set_attr "length" "4")])
 
 (define_expand "mve_vldrdq_gather_base_wb_z_v2di"
   [(match_operand:V2DI 0 "s_register_operand")
@@ -9781,7 +10305,7 @@
    (unspec:SI [(reg:SI VFPCC_REGNUM)] UNSPEC_GET_FPSCR_NZCVQC))]
  "TARGET_HAVE_MVE"
  "vmrs\\t%0, FPSCR_nzcvqc"
-  [(set_attr "type" "mve_move")])
+ [(set_attr "type" "mve_move")])
 
 (define_insn "set_fpscr_nzcvqc"
  [(set (reg:SI VFPCC_REGNUM)
@@ -9789,7 +10313,7 @@
     VUNSPEC_SET_FPSCR_NZCVQC))]
  "TARGET_HAVE_MVE"
  "vmsr\\tFPSCR_nzcvqc, %0"
-  [(set_attr "type" "mve_move")])
+ [(set_attr "type" "mve_move")])
 
 ;;
 ;; [vldrdq_gather_base_wb_z_s vldrdq_gather_base_wb_z_u]
@@ -9814,7 +10338,8 @@
    output_asm_insn ("vpst\;vldrdt.u64\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_wb_v2di_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vadciq_m_s, vadciq_m_u])
 ;;
@@ -9831,7 +10356,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vadcit.i32\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadciq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "8")])
 
 ;;
@@ -9848,7 +10374,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vadci.i32\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadciq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "4")])
 
 ;;
@@ -9867,7 +10394,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vadct.i32\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadcq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "8")])
 
 ;;
@@ -9884,7 +10412,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vadc.i32\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadcq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "4")
    (set_attr "conds" "set")])
 
@@ -9904,7 +10433,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vsbcit.i32\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbciq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "8")])
 
 ;;
@@ -9921,7 +10451,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vsbci.i32\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbciq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "4")])
 
 ;;
@@ -9940,7 +10471,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vsbct.i32\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbcq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "8")])
 
 ;;
@@ -9957,7 +10489,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vsbc.i32\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbcq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "4")])
 
 ;;
@@ -9986,7 +10519,7 @@
 	     "vst21.<V_sz_elem>\t{%q0, %q1}, %3", ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set_attr "length" "8")])
 
 ;;
 ;; [vld2q])
@@ -10014,7 +10547,7 @@
 	     "vld21.<V_sz_elem>\t{%q0, %q1}, %3", ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set_attr "length" "8")])
 
 ;;
 ;; [vld4q])
@@ -10357,7 +10890,8 @@
  ]
  "TARGET_HAVE_MVE"
  "vpst\;vshlct\t%q0, %1, %4"
- [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlcq_<supf><mode>"))
+  (set_attr "type" "mve_move")
   (set_attr "length" "8")])
 
 ;; CDE instructions on MVE registers.
@@ -10369,7 +10903,8 @@
      UNSPEC_VCDE))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx1\\tp%c1, %q0, #%c2"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx1qv16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx1qav16qi"
@@ -10380,7 +10915,8 @@
      UNSPEC_VCDEA))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx1a\\tp%c1, %q0, #%c3"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx1qav16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx2qv16qi"
@@ -10391,7 +10927,8 @@
      UNSPEC_VCDE))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx2\\tp%c1, %q0, %q2, #%c3"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx2qv16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx2qav16qi"
@@ -10403,7 +10940,8 @@
      UNSPEC_VCDEA))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx2a\\tp%c1, %q0, %q3, #%c4"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx2qav16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx3qv16qi"
@@ -10415,7 +10953,8 @@
      UNSPEC_VCDE))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx3\\tp%c1, %q0, %q2, %q3, #%c4"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx3qv16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx3qav16qi"
@@ -10428,7 +10967,8 @@
      UNSPEC_VCDEA))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx3a\\tp%c1, %q0, %q3, %q4, #%c5"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx3qav16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx1q_p_v16qi"
@@ -10440,7 +10980,8 @@
      CDE_VCX))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vpst\;vcx1t\\tp%c1, %q0, #%c3"
-  [(set_attr "type" "coproc")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx1qv16qi"))
+  (set_attr "type" "coproc")
    (set_attr "length" "8")]
 )
 
@@ -10454,7 +10995,8 @@
      CDE_VCX))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vpst\;vcx2t\\tp%c1, %q0, %q3, #%c4"
-  [(set_attr "type" "coproc")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx2qv16qi"))
+  (set_attr "type" "coproc")
    (set_attr "length" "8")]
 )
 
@@ -10469,7 +11011,8 @@
      CDE_VCX))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vpst\;vcx3t\\tp%c1, %q0, %q3, %q4, #%c5"
-  [(set_attr "type" "coproc")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx3qv16qi"))
+  (set_attr "type" "coproc")
    (set_attr "length" "8")]
 )
diff --git a/gcc/config/arm/vec-common.md b/gcc/config/arm/vec-common.md
index f06df4db63654db94d30eac637e9cc6464ec1941..0aa6b16348baa4d9632b354cd36268ba3e24e9dc 100644
--- a/gcc/config/arm/vec-common.md
+++ b/gcc/config/arm/vec-common.md
@@ -366,7 +366,8 @@
   "@
    vshl.<supf>%#<V_sz_elem>\t%<V_reg>0, %<V_reg>1, %<V_reg>2
    * return neon_output_shift_immediate (\"vshl\", 'i', &operands[2], <MODE>mode, VALID_NEON_QREG_MODE (<MODE>mode), true);"
-  [(set_attr "type" "neon_shift_reg, neon_shift_imm")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlq_<supf><mode>"))
+  (set_attr "type" "neon_shift_reg, neon_shift_imm")]
 )
 
 (define_expand "vashl<mode>3"
diff --git a/gcc/testsuite/gcc.target/arm/dlstp-compile-asm.c b/gcc/testsuite/gcc.target/arm/dlstp-compile-asm.c
new file mode 100644
index 0000000000000000000000000000000000000000..acf0836050c19b983feeaf97c3e52e1318bb194d
--- /dev/null
+++ b/gcc/testsuite/gcc.target/arm/dlstp-compile-asm.c
@@ -0,0 +1,149 @@
+/* { dg-do compile { target { arm*-*-* } } } */
+/* { dg-require-effective-target arm_v8_1m_mve_ok } */
+/* { dg-skip-if "avoid conflicting multilib options" { *-*-* } { "-marm" "-mcpu=*" } } */
+/* { dg-options "-march=armv8.1-m.main+fp.dp+mve.fp -mfloat-abi=hard -mfpu=auto -O3" } */
+
+#include <arm_mve.h>
+
+#define IMM 5
+
+#define TEST_COMPILE_IN_DLSTP_TERNARY(BITS, LANES, LDRSTRYTPE, TYPE, SIGN, NAME, PRED)	\
+void test_##NAME##PRED##_##SIGN##BITS (TYPE##BITS##_t *a, TYPE##BITS##_t *b,  TYPE##BITS##_t *c, int n)	\
+{						\
+  while (n > 0)					\
+    {						\
+      mve_pred16_t p = vctp##BITS##q (n);	\
+      TYPE##BITS##x##LANES##_t va = vldr##LDRSTRYTPE##q_z_##SIGN##BITS (a, p);	\
+      TYPE##BITS##x##LANES##_t vb = vldr##LDRSTRYTPE##q_z_##SIGN##BITS (b, p);	\
+      TYPE##BITS##x##LANES##_t vc = NAME##PRED##_##SIGN##BITS (va, vb, p);	\
+      vstr##LDRSTRYTPE##q_p_##SIGN##BITS (c, vc, p);	\
+      c += LANES;				\
+      a += LANES;				\
+      b += LANES;				\
+      n -= LANES;				\
+    }						\
+}
+
+#define TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY(BITS, LANES, LDRSTRYTPE, NAME, PRED)	\
+TEST_COMPILE_IN_DLSTP_TERNARY (BITS, LANES, LDRSTRYTPE, int, s, NAME, PRED)	\
+TEST_COMPILE_IN_DLSTP_TERNARY (BITS, LANES, LDRSTRYTPE, uint, u, NAME, PRED)
+
+#define TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY(NAME, PRED)	\
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY (8, 16, b, NAME, PRED)	\
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY (16, 8, h, NAME, PRED)	\
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY (32, 4, w, NAME, PRED)
+
+
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY (vaddq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY (vmulq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY (vsubq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY (vhaddq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY (vorrq, _x)
+
+
+#define TEST_COMPILE_IN_DLSTP_TERNARY_M(BITS, LANES, LDRSTRYTPE, TYPE, SIGN, NAME, PRED)	\
+void test_##NAME##PRED##_##SIGN##BITS (TYPE##BITS##x##LANES##_t __inactive, TYPE##BITS##_t *a, TYPE##BITS##_t *b,  TYPE##BITS##_t *c, int n)	\
+{						\
+  while (n > 0)					\
+    {						\
+      mve_pred16_t p = vctp##BITS##q (n);	\
+      TYPE##BITS##x##LANES##_t va = vldr##LDRSTRYTPE##q_z_##SIGN##BITS (a, p);	\
+      TYPE##BITS##x##LANES##_t vb = vldr##LDRSTRYTPE##q_z_##SIGN##BITS (b, p);	\
+      TYPE##BITS##x##LANES##_t vc = NAME##PRED##_##SIGN##BITS (__inactive, va, vb, p);	\
+      vstr##LDRSTRYTPE##q_p_##SIGN##BITS (c, vc, p);	\
+      c += LANES;				\
+      a += LANES;				\
+      b += LANES;				\
+      n -= LANES;				\
+    }						\
+}
+
+#define TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_M(BITS, LANES, LDRSTRYTPE, NAME, PRED)	\
+TEST_COMPILE_IN_DLSTP_TERNARY_M (BITS, LANES, LDRSTRYTPE, int, s, NAME, PRED)	\
+TEST_COMPILE_IN_DLSTP_TERNARY_M (BITS, LANES, LDRSTRYTPE, uint, u, NAME, PRED)
+
+#define TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_M(NAME, PRED)	\
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_M (8, 16, b, NAME, PRED)	\
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_M (16, 8, h, NAME, PRED)	\
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_M (32, 4, w, NAME, PRED)
+
+
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_M (vaddq, _m)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_M (vmulq, _m)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_M (vsubq, _m)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_M (vhaddq, _m)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_M (vorrq, _m)
+
+#define TEST_COMPILE_IN_DLSTP_TERNARY_N(BITS, LANES, LDRSTRYTPE, TYPE, SIGN, NAME, PRED)	\
+void test_##NAME##PRED##_n_##SIGN##BITS (TYPE##BITS##_t *a, TYPE##BITS##_t *c, int n)	\
+{						\
+  while (n > 0)					\
+    {						\
+      mve_pred16_t p = vctp##BITS##q (n);	\
+      TYPE##BITS##x##LANES##_t va = vldr##LDRSTRYTPE##q_z_##SIGN##BITS (a, p);	\
+      TYPE##BITS##x##LANES##_t vc = NAME##PRED##_n_##SIGN##BITS (va, IMM, p);	\
+      vstr##LDRSTRYTPE##q_p_##SIGN##BITS (c, vc, p);	\
+      c += LANES;				\
+      a += LANES;				\
+      n -= LANES;				\
+    }						\
+}
+
+#define TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_N(BITS, LANES, LDRSTRYTPE, NAME, PRED)	\
+TEST_COMPILE_IN_DLSTP_TERNARY_N (BITS, LANES, LDRSTRYTPE, int, s, NAME, PRED)	\
+TEST_COMPILE_IN_DLSTP_TERNARY_N (BITS, LANES, LDRSTRYTPE, uint, u, NAME, PRED)
+
+#define TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N(NAME, PRED)	\
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_N (8, 16, b, NAME, PRED)	\
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_N (16, 8, h, NAME, PRED)	\
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_N (32, 4, w, NAME, PRED)
+
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vaddq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vmulq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vsubq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vhaddq, _x)
+
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vbrsrq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vshlq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vshrq, _x)
+
+#define TEST_COMPILE_IN_DLSTP_TERNARY_M_N(BITS, LANES, LDRSTRYTPE, TYPE, SIGN, NAME, PRED)	\
+void test_##NAME##PRED##_n_##SIGN##BITS (TYPE##BITS##x##LANES##_t __inactive, TYPE##BITS##_t *a,  TYPE##BITS##_t *c, int n)	\
+{						\
+  while (n > 0)					\
+    {						\
+      mve_pred16_t p = vctp##BITS##q (n);	\
+      TYPE##BITS##x##LANES##_t va = vldr##LDRSTRYTPE##q_z_##SIGN##BITS (a, p);	\
+      TYPE##BITS##x##LANES##_t vc = NAME##PRED##_n_##SIGN##BITS (__inactive, va, IMM, p);	\
+      vstr##LDRSTRYTPE##q_p_##SIGN##BITS (c, vc, p);	\
+      c += LANES;				\
+      a += LANES;				\
+      n -= LANES;				\
+    }						\
+}
+
+#define TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_M_N(BITS, LANES, LDRSTRYTPE, NAME, PRED)	\
+TEST_COMPILE_IN_DLSTP_TERNARY_M_N (BITS, LANES, LDRSTRYTPE, int, s, NAME, PRED)	\
+TEST_COMPILE_IN_DLSTP_TERNARY_M_N (BITS, LANES, LDRSTRYTPE, uint, u, NAME, PRED)
+
+#define TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_M_N(NAME, PRED)	\
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_M_N (8, 16, b, NAME, PRED)	\
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_M_N (16, 8, h, NAME, PRED)	\
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_M_N (32, 4, w, NAME, PRED)
+
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_M_N (vaddq, _m)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_M_N (vmulq, _m)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_M_N (vsubq, _m)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_M_N (vhaddq, _m)
+
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_M_N (vbrsrq, _m)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_M_N (vshlq, _m)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_M_N (vshrq, _m)
+
+/* The expected number of DLSTPs is the number of
+   TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY.* macro invocations
+   above * 6 (three element sizes, signed and unsigned).  */
+/* { dg-final { scan-assembler-times {\tdlstp} 144 } } */
+/* { dg-final { scan-assembler-times {\tletp} 144 } } */
+/* { dg-final { scan-assembler-not "\tvctp\t" } } */
+/* { dg-final { scan-assembler-not "\tvpst\t" } } */
+/* { dg-final { scan-assembler-not "p0" } } */