* [PATCH 1/2] arm: Add define_attr to create a mapping between MVE predicated and unpredicated insns
@ 2022-11-11 17:39 Stam Markianos-Wright
  0 siblings, 0 replies; 7+ messages in thread
From: Stam Markianos-Wright @ 2022-11-11 17:39 UTC (permalink / raw)
  To: gcc-patches


Hi all,

I'd like to submit two patches that add support for Arm's MVE
Tail Predicated Low Overhead Loop feature.

--- Introduction ---

The M-profile Arm Architecture Reference Manual:
https://developer.arm.com/documentation/ddi0553/bu/?lang=en
Section B5.5.1 "Loop tail predication" describes the feature
we are adding support for with these patches (although
we only add codegen for DLSTP/LETP instruction loops).

Previously, with commit d2ed233cb94, we added support for
non-MVE DLS/LE loops through the loop-doloop pass, which, given
a standard MVE loop like:

```
void  __attribute__ ((noinline)) test (int16_t *a, int16_t *b, int16_t *c, int n)
{
   while (n > 0)
     {
       mve_pred16_t p = vctp16q (n);
       int16x8_t va = vldrhq_z_s16 (a, p);
       int16x8_t vb = vldrhq_z_s16 (b, p);
       int16x8_t vc = vaddq_x_s16 (va, vb, p);
       vstrhq_p_s16 (c, vc, p);
       c+=8;
       a+=8;
       b+=8;
       n-=8;
     }
}
```
... would output:

```
         <pre-calculate the number of iterations and place it into lr>
         dls     lr, lr
.L3:
         vctp.16 r3
         vmrs    ip, P0  @ movhi
         sxth    ip, ip
         vmsr     P0, ip @ movhi
         mov     r4, r0
         vpst
         vldrht.16       q2, [r4]
         mov     r4, r1
         vmov    q3, q0
         vpst
         vldrht.16       q1, [r4]
         mov     r4, r2
         vpst
         vaddt.i16       q3, q2, q1
         subs    r3, r3, #8
         vpst
         vstrht.16       q3, [r4]
         adds    r0, r0, #16
         adds    r1, r1, #16
         adds    r2, r2, #16
         le      lr, .L3
```

where the LE instruction will decrement LR by 1, compare and
branch if needed.

(There are also other inefficiencies in the above code, like the
pointless vmrs/sxth/vmsr on the VPR, the adds not being merged
into the vldrht/vstrht as #16 offsets, and some random movs!
But those are different problems...)
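
As a side note on the arithmetic: the iteration count that is
pre-calculated into LR for the DLS/LE form is just the element count
divided by the number of lanes, rounded up.  A minimal C sketch of that
calculation (illustrative values only, not compiler code):

```
/* Illustrative sketch: DLS/LE iteration count for 16-bit elements,
   i.e. 8 lanes per 128-bit vector iteration.  */
#include <stdio.h>

int main (void)
{
  int n = 20;                                /* elements to process */
  int lanes = 8;                             /* elements per iteration */
  int iterations = (n + lanes - 1) / lanes;  /* value placed into LR */
  printf ("n = %d -> %d iterations\n", n, iterations);  /* prints 3 */
  return 0;
}
```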

The MVE version is similar, except that:
* Instead of DLS/LE the instructions are DLSTP/LETP.
* Instead of pre-calculating the number of iterations of the
   loop, we place the number of elements to be processed by the
   loop into LR.
* Instead of decrementing the LR by one, LETP will decrement it
   by FPSCR.LTPSIZE, which is the number of elements being
   processed in each iteration: 16 for 8-bit elements, 8 for 16-bit
   elements, etc.
* On the final iteration, automatic Loop Tail Predication is
   performed, as if the instructions within the loop had been VPT
   predicated with a VCTP generating the VPR predicate in every
   loop iteration.

The dlstp/letp loop now looks like:

```
         <place n into r3>
         dlstp.16        lr, r3
.L14:
         mov     r3, r0
         vldrh.16        q3, [r3]
         mov     r3, r1
         vldrh.16        q2, [r3]
         mov     r3, r2
         vadd.i16  q3, q3, q2
         adds    r0, r0, #16
         vstrh.16        q3, [r3]
         adds    r1, r1, #16
         adds    r2, r2, #16
         letp    lr, .L14

```

Since the loop tail predication is automatic, we have eliminated
the VCTP that had been specified by the user in the intrinsic
and converted the VPT-predicated instructions into their
unpredicated equivalents (which also saves us from VPST insns).

The LETP instruction here decrements LR by 8 in each iteration.
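
To make the LR accounting concrete, here is a small C model of the
semantics described above (an illustration only, not compiler code):

```
/* Illustrative sketch: in the DLSTP/LETP form LR holds the number of
   remaining *elements*.  Each LETP subtracts the lane count (8 for
   16-bit elements); on the final iteration fewer than 8 elements
   remain and only that many lanes are enabled by the implicit tail
   predication.  */
#include <stdio.h>

int main (void)
{
  int lr = 20;           /* elements, as set up by DLSTP */
  const int lanes = 8;   /* FPSCR.LTPSIZE for 16-bit elements */
  while (lr > 0)
    {
      int active = lr < lanes ? lr : lanes;  /* lanes enabled this time */
      printf ("iteration processes %d element(s)\n", active);
      lr -= lanes;       /* what LETP does at the bottom of the loop */
    }
  return 0;
}
```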

--- This 1/2 patch ---

This first patch lays some groundwork by adding an attribute to
md patterns, and then the second patch contains the functional
changes.

One major difficulty in implementing MVE Tail-Predicated Low
Overhead Loops was the need to transform VPT-predicated insns
in the insn chain into their unpredicated equivalents, like:
`mve_vldrbq_z_<supf><mode> -> mve_vldrbq_<supf><mode>`.

This requires a deterministic link between two different
patterns in mve.md.  This _could_ be done by re-ordering the
entirety of mve.md so that related patterns sit at a constant
icode distance (e.g. placing the _z pattern immediately after
the unpredicated version, so that mapping from the former to the
latter is just icode-1), but that is a messy solution that would
introduce fragile, implicit ordering dependencies between
patterns.

This patch provides an alternative way of doing that: using an insn
attribute to encode the icode of the unpredicated instruction.
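
The intended use (the actual consumer is in the 2/2 patch) is that,
given a predicated insn, we read this attribute back and treat the
value as the icode of the unpredicated twin.  The stand-alone C sketch
below only models that idea; the names insn_info and
unpredicated_icode_of are hypothetical and merely stand in for the
accessor that the attribute machinery generates:

```
/* Hypothetical model: each predicated pattern records the icode of its
   unpredicated twin in an attribute, so the mapping never depends on
   the ordering of patterns (icodes) in mve.md.  */
#include <stdio.h>

enum icode { CODE_FOR_mve_vldrbq, CODE_FOR_mve_vldrbq_z };

struct insn_info
{
  enum icode code;                   /* the predicated insn itself */
  enum icode mve_unpredicated_insn;  /* value of the new attribute */
};

static enum icode
unpredicated_icode_of (const struct insn_info *insn)
{
  return insn->mve_unpredicated_insn;
}

int main (void)
{
  struct insn_info vldrbq_z = { CODE_FOR_mve_vldrbq_z, CODE_FOR_mve_vldrbq };
  printf ("predicated icode %d maps to unpredicated icode %d\n",
          (int) vldrbq_z.code, (int) unpredicated_icode_of (&vldrbq_z));
  return 0;
}
```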

This was implemented by doing a find-and-replace across mve.md
using the following search/replacement pattern pairs:

define_insn "(.*)_p_(.*)"((.|\n)*?)\n( )*\[\(set_attr
define_insn "$1_p_$2"$3\n$5[(set (attr "mve_unpredicated_insn") 
(symbol_ref "CODE_FOR_$1_$2"))\n$5 (set_attr


define_insn "(.*)_m_(.*)"((.|\n)*?)\n( )*\[\(set_attr
define_insn "$1_m_$2"$3\n$5[(set (attr "mve_unpredicated_insn") 
(symbol_ref "CODE_FOR_$1_$2"))\n$5 (set_attr

define_insn "(.*)_z_(.*)"((.|\n)*?)\n( )*\[\(set_attr
define_insn "$1_z_$2"$3\n$5[(set (attr "mve_unpredicated_insn") 
(symbol_ref "CODE_FOR_$1_$2"))\n$5 (set_attr

and then a number of manual fixes were needed for the md patterns
that did not conform to the above.  Those changes were:

Dropped the type suffix _s/_u/_f:
CODE_FOR_mve_vcmpcsq_n_<mode>
CODE_FOR_mve_vcmpcsq_<mode>
CODE_FOR_mve_vcmpeqq_n_<mode>
CODE_FOR_mve_vcmpeqq_<mode>
CODE_FOR_mve_vcmpgeq_n_<mode>
CODE_FOR_mve_vcmpgeq_<mode>
CODE_FOR_mve_vcmpgtq_n_<mode>
CODE_FOR_mve_vcmpgtq_<mode>
CODE_FOR_mve_vcmphiq_n_<mode>
CODE_FOR_mve_vcmphiq_<mode>
CODE_FOR_mve_vcmpleq_n_<mode>
CODE_FOR_mve_vcmpleq_<mode>
CODE_FOR_mve_vcmpltq_n_<mode>
CODE_FOR_mve_vcmpltq_<mode>
CODE_FOR_mve_vcmpneq_n_<mode>
CODE_FOR_mve_vcmpneq_<mode>
CODE_FOR_mve_vaddq<mode>
CODE_FOR_mve_vcaddq_rot270<mode>
CODE_FOR_mve_vcaddq_rot90<mode>
CODE_FOR_mve_vcaddq_rot270<mode>
CODE_FOR_mve_vcaddq_rot90<mode>
CODE_FOR_mve_vcmlaq<mode>
CODE_FOR_mve_vcmlaq_rot180<mode>
CODE_FOR_mve_vcmlaq_rot270<mode>
CODE_FOR_mve_vcmlaq_rot90<mode>
CODE_FOR_mve_vcmulq<mode>
CODE_FOR_mve_vcmulq_rot180<mode>
CODE_FOR_mve_vcmulq_rot270<mode>
CODE_FOR_mve_vcmulq_rot90<mode>

Dropped _wb_:
CODE_FOR_mve_vidupq_u<mode>_insn
CODE_FOR_mve_vddupq_u<mode>_insn

Dropped one underscore character:
CODE_FOR_arm_vcx1q<a>v16qi
CODE_FOR_arm_vcx2q<a>v16qi
CODE_FOR_arm_vcx3q<a>v16qi

No regressions on arm-none-eabi with an MVE target.

Thank you,
Stam Markianos-Wright

gcc/ChangeLog:

         * config/arm/arm.md (mve_unpredicated_insn): New attribute.
         * config/arm/mve.md (mve_vrndq_m_f<mode>): Add attribute.
        (mve_vaddlvq_p_<supf>v4si): Likewise.
        (mve_vaddvq_p_<supf><mode>): Likewise.
        (mve_vbicq_m_n_<supf><mode>): Likewise.
        (mve_vcmpeqq_m_f<mode>): Likewise.
        (mve_vcvtaq_m_<supf><mode>): Likewise.
        (mve_vcvtq_m_to_f_<supf><mode>): Likewise.
        (mve_vabsq_m_s<mode>): Likewise.
        (mve_vaddvaq_p_<supf><mode>): Likewise.
        (mve_vclsq_m_s<mode>): Likewise.
        (mve_vclzq_m_<supf><mode>): Likewise.
        (mve_vcmpcsq_m_n_u<mode>): Likewise.
        (mve_vcmpcsq_m_u<mode>): Likewise.
        (mve_vcmpeqq_m_n_<supf><mode>): Likewise.
        (mve_vcmpeqq_m_<supf><mode>): Likewise.
        (mve_vcmpgeq_m_n_s<mode>): Likewise.
        (mve_vcmpgeq_m_s<mode>): Likewise.
        (mve_vcmpgtq_m_n_s<mode>): Likewise.
        (mve_vcmpgtq_m_s<mode>): Likewise.
        (mve_vcmphiq_m_n_u<mode>): Likewise.
        (mve_vcmphiq_m_u<mode>): Likewise.
        (mve_vcmpleq_m_n_s<mode>): Likewise.
        (mve_vcmpleq_m_s<mode>): Likewise.
        (mve_vcmpltq_m_n_s<mode>): Likewise.
        (mve_vcmpltq_m_s<mode>): Likewise.
        (mve_vcmpneq_m_n_<supf><mode>): Likewise.
        (mve_vcmpneq_m_<supf><mode>): Likewise.
        (mve_vdupq_m_n_<supf><mode>): Likewise.
        (mve_vmaxaq_m_s<mode>): Likewise.
        (mve_vmaxavq_p_s<mode>): Likewise.
        (mve_vmaxvq_p_<supf><mode>): Likewise.
        (mve_vminaq_m_s<mode>): Likewise.
        (mve_vminavq_p_s<mode>): Likewise.
        (mve_vminvq_p_<supf><mode>): Likewise.
        (mve_vmladavq_p_<supf><mode>): Likewise.
        (mve_vmladavxq_p_s<mode>): Likewise.
        (mve_vmlsdavq_p_s<mode>): Likewise.
        (mve_vmlsdavxq_p_s<mode>): Likewise.
        (mve_vmvnq_m_<supf><mode>): Likewise.
        (mve_vnegq_m_s<mode>): Likewise.
        (mve_vqabsq_m_s<mode>): Likewise.
        (mve_vqnegq_m_s<mode>): Likewise.
        (mve_vqrshlq_m_n_<supf><mode>): Likewise.
        (mve_vqshlq_m_r_<supf><mode>): Likewise.
        (mve_vrev64q_m_<supf><mode>): Likewise.
        (mve_vrshlq_m_n_<supf><mode>): Likewise.
        (mve_vshlq_m_r_<supf><mode>): Likewise.
        (mve_vabsq_m_f<mode>): Likewise.
        (mve_vaddlvaq_p_<supf>v4si): Likewise.
        (mve_vcmpeqq_m_n_f<mode>): Likewise.
        (mve_vcmpgeq_m_f<mode>): Likewise.
        (mve_vcmpgeq_m_n_f<mode>): Likewise.
        (mve_vcmpgtq_m_f<mode>): Likewise.
        (mve_vcmpgtq_m_n_f<mode>): Likewise.
        (mve_vcmpleq_m_f<mode>): Likewise.
        (mve_vcmpleq_m_n_f<mode>): Likewise.
        (mve_vcmpltq_m_f<mode>): Likewise.
        (mve_vcmpltq_m_n_f<mode>): Likewise.
        (mve_vcmpneq_m_f<mode>): Likewise.
        (mve_vcmpneq_m_n_f<mode>): Likewise.
        (mve_vcvtbq_m_f16_f32v8hf): Likewise.
        (mve_vcvtbq_m_f32_f16v4sf): Likewise.
        (mve_vcvttq_m_f16_f32v8hf): Likewise.
        (mve_vcvttq_m_f32_f16v4sf): Likewise.
        (mve_vdupq_m_n_f<mode>): Likewise.
        (mve_vmaxnmaq_m_f<mode>): Likewise.
        (mve_vmaxnmavq_p_f<mode>): Likewise.
        (mve_vmaxnmvq_p_f<mode>): Likewise.
        (mve_vminnmaq_m_f<mode>): Likewise.
        (mve_vminnmavq_p_f<mode>): Likewise.
        (mve_vminnmvq_p_f<mode>): Likewise.
        (mve_vmlaldavq_p_<supf><mode>): Likewise.
        (mve_vmlaldavxq_p_s<mode>): Likewise.
        (mve_vmlsldavq_p_s<mode>): Likewise.
        (mve_vmlsldavxq_p_s<mode>): Likewise.
        (mve_vmovlbq_m_<supf><mode>): Likewise.
        (mve_vmovltq_m_<supf><mode>): Likewise.
        (mve_vmovnbq_m_<supf><mode>): Likewise.
        (mve_vmovntq_m_<supf><mode>): Likewise.
        (mve_vmvnq_m_n_<supf><mode>): Likewise.
        (mve_vnegq_m_f<mode>): Likewise.
        (mve_vorrq_m_n_<supf><mode>): Likewise.
        (mve_vqmovnbq_m_<supf><mode>): Likewise.
        (mve_vqmovntq_m_<supf><mode>): Likewise.
        (mve_vqmovunbq_m_s<mode>): Likewise.
        (mve_vqmovuntq_m_s<mode>): Likewise.
        (mve_vrev32q_m_fv8hf): Likewise.
        (mve_vrev32q_m_<supf><mode>): Likewise.
        (mve_vrev64q_m_f<mode>): Likewise.
        (mve_vrmlaldavhxq_p_sv4si): Likewise.
        (mve_vrmlsldavhq_p_sv4si): Likewise.
        (mve_vrmlsldavhxq_p_sv4si): Likewise.
        (mve_vrndaq_m_f<mode>): Likewise.
        (mve_vrndmq_m_f<mode>): Likewise.
        (mve_vrndnq_m_f<mode>): Likewise.
        (mve_vrndpq_m_f<mode>): Likewise.
        (mve_vrndxq_m_f<mode>): Likewise.
        (mve_vcvtmq_m_<supf><mode>): Likewise.
        (mve_vcvtpq_m_<supf><mode>): Likewise.
        (mve_vcvtnq_m_<supf><mode>): Likewise.
        (mve_vcvtq_m_n_from_f_<supf><mode>): Likewise.
        (mve_vrev16q_m_<supf>v16qi): Likewise.
        (mve_vcvtq_m_from_f_<supf><mode>): Likewise.
        (mve_vrmlaldavhq_p_<supf>v4si): Likewise.
        (mve_vabavq_p_<supf><mode>): Likewise.
        (mve_vqshluq_m_n_s<mode>): Likewise.
        (mve_vshlq_m_<supf><mode>): Likewise.
        (mve_vsriq_m_n_<supf><mode>): Likewise.
        (mve_vsubq_m_<supf><mode>): Likewise.
        (mve_vcvtq_m_n_to_f_<supf><mode>): Likewise.
        (mve_vabdq_m_<supf><mode>): Likewise.
        (mve_vaddq_m_n_<supf><mode>): Likewise.
        (mve_vaddq_m_<supf><mode>): Likewise.
        (mve_vandq_m_<supf><mode>): Likewise.
        (mve_vbicq_m_<supf><mode>): Likewise.
        (mve_vbrsrq_m_n_<supf><mode>): Likewise.
        (mve_vcaddq_rot270_m_<supf><mode>): Likewise.
        (mve_vcaddq_rot90_m_<supf><mode>): Likewise.
        (mve_veorq_m_<supf><mode>): Likewise.
        (mve_vhaddq_m_n_<supf><mode>): Likewise.
        (mve_vhaddq_m_<supf><mode>): Likewise.
        (mve_vhsubq_m_n_<supf><mode>): Likewise.
        (mve_vhsubq_m_<supf><mode>): Likewise.
        (mve_vmaxq_m_<supf><mode>): Likewise.
        (mve_vminq_m_<supf><mode>): Likewise.
        (mve_vmladavaq_p_<supf><mode>): Likewise.
        (mve_vmlaq_m_n_<supf><mode>): Likewise.
        (mve_vmlasq_m_n_<supf><mode>): Likewise.
        (mve_vmulhq_m_<supf><mode>): Likewise.
        (mve_vmullbq_int_m_<supf><mode>): Likewise.
        (mve_vmulltq_int_m_<supf><mode>): Likewise.
        (mve_vmulq_m_n_<supf><mode>): Likewise.
        (mve_vmulq_m_<supf><mode>): Likewise.
        (mve_vornq_m_<supf><mode>): Likewise.
        (mve_vorrq_m_<supf><mode>): Likewise.
        (mve_vqaddq_m_n_<supf><mode>): Likewise.
        (mve_vqaddq_m_<supf><mode>): Likewise.
        (mve_vqdmlahq_m_n_s<mode>): Likewise.
        (mve_vqdmlashq_m_n_s<mode>): Likewise.
        (mve_vqrdmlahq_m_n_s<mode>): Likewise.
        (mve_vqrdmlashq_m_n_s<mode>): Likewise.
        (mve_vqrshlq_m_<supf><mode>): Likewise.
        (mve_vqshlq_m_n_<supf><mode>): Likewise.
        (mve_vqshlq_m_<supf><mode>): Likewise.
        (mve_vqsubq_m_n_<supf><mode>): Likewise.
        (mve_vqsubq_m_<supf><mode>): Likewise.
        (mve_vrhaddq_m_<supf><mode>): Likewise.
        (mve_vrmulhq_m_<supf><mode>): Likewise.
        (mve_vrshlq_m_<supf><mode>): Likewise.
        (mve_vrshrq_m_n_<supf><mode>): Likewise.
        (mve_vshlq_m_n_<supf><mode>): Likewise.
        (mve_vshrq_m_n_<supf><mode>): Likewise.
        (mve_vsliq_m_n_<supf><mode>): Likewise.
        (mve_vsubq_m_n_<supf><mode>): Likewise.
        (mve_vhcaddq_rot270_m_s<mode>): Likewise.
        (mve_vhcaddq_rot90_m_s<mode>): Likewise.
        (mve_vmladavaxq_p_s<mode>): Likewise.
        (mve_vmlsdavaq_p_s<mode>): Likewise.
        (mve_vmlsdavaxq_p_s<mode>): Likewise.
        (mve_vqdmladhq_m_s<mode>): Likewise.
        (mve_vqdmladhxq_m_s<mode>): Likewise.
        (mve_vqdmlsdhq_m_s<mode>): Likewise.
        (mve_vqdmlsdhxq_m_s<mode>): Likewise.
        (mve_vqdmulhq_m_n_s<mode>): Likewise.
        (mve_vqdmulhq_m_s<mode>): Likewise.
        (mve_vqrdmladhq_m_s<mode>): Likewise.
        (mve_vqrdmladhxq_m_s<mode>): Likewise.
        (mve_vqrdmlsdhq_m_s<mode>): Likewise.
        (mve_vqrdmlsdhxq_m_s<mode>): Likewise.
        (mve_vqrdmulhq_m_n_s<mode>): Likewise.
        (mve_vqrdmulhq_m_s<mode>): Likewise.
        (mve_vmlaldavaq_p_<supf><mode>): Likewise.
        (mve_vmlaldavaxq_p_<supf><mode>): Likewise.
        (mve_vqrshrnbq_m_n_<supf><mode>): Likewise.
        (mve_vqrshrntq_m_n_<supf><mode>): Likewise.
        (mve_vqshrnbq_m_n_<supf><mode>): Likewise.
        (mve_vqshrntq_m_n_<supf><mode>): Likewise.
        (mve_vrmlaldavhaq_p_sv4si): Likewise.
        (mve_vrshrnbq_m_n_<supf><mode>): Likewise.
        (mve_vrshrntq_m_n_<supf><mode>): Likewise.
        (mve_vshllbq_m_n_<supf><mode>): Likewise.
        (mve_vshlltq_m_n_<supf><mode>): Likewise.
        (mve_vshrnbq_m_n_<supf><mode>): Likewise.
        (mve_vshrntq_m_n_<supf><mode>): Likewise.
        (mve_vmlsldavaq_p_s<mode>): Likewise.
        (mve_vmlsldavaxq_p_s<mode>): Likewise.
        (mve_vmullbq_poly_m_p<mode>): Likewise.
        (mve_vmulltq_poly_m_p<mode>): Likewise.
        (mve_vqdmullbq_m_n_s<mode>): Likewise.
        (mve_vqdmullbq_m_s<mode>): Likewise.
        (mve_vqdmulltq_m_n_s<mode>): Likewise.
        (mve_vqdmulltq_m_s<mode>): Likewise.
        (mve_vqrshrunbq_m_n_s<mode>): Likewise.
        (mve_vqrshruntq_m_n_s<mode>): Likewise.
        (mve_vqshrunbq_m_n_s<mode>): Likewise.
        (mve_vqshruntq_m_n_s<mode>): Likewise.
        (mve_vrmlaldavhaq_p_uv4si): Likewise.
        (mve_vrmlaldavhaxq_p_sv4si): Likewise.
        (mve_vrmlsldavhaq_p_sv4si): Likewise.
        (mve_vrmlsldavhaxq_p_sv4si): Likewise.
        (mve_vabdq_m_f<mode>): Likewise.
        (mve_vaddq_m_f<mode>): Likewise.
        (mve_vaddq_m_n_f<mode>): Likewise.
        (mve_vandq_m_f<mode>): Likewise.
        (mve_vbicq_m_f<mode>): Likewise.
        (mve_vbrsrq_m_n_f<mode>): Likewise.
        (mve_vcaddq_rot270_m_f<mode>): Likewise.
        (mve_vcaddq_rot90_m_f<mode>): Likewise.
        (mve_vcmlaq_m_f<mode>): Likewise.
        (mve_vcmlaq_rot180_m_f<mode>): Likewise.
        (mve_vcmlaq_rot270_m_f<mode>): Likewise.
        (mve_vcmlaq_rot90_m_f<mode>): Likewise.
        (mve_vcmulq_m_f<mode>): Likewise.
        (mve_vcmulq_rot180_m_f<mode>): Likewise.
        (mve_vcmulq_rot270_m_f<mode>): Likewise.
        (mve_vcmulq_rot90_m_f<mode>): Likewise.
        (mve_veorq_m_f<mode>): Likewise.
        (mve_vfmaq_m_f<mode>): Likewise.
        (mve_vfmaq_m_n_f<mode>): Likewise.
        (mve_vfmasq_m_n_f<mode>): Likewise.
        (mve_vfmsq_m_f<mode>): Likewise.
        (mve_vmaxnmq_m_f<mode>): Likewise.
        (mve_vminnmq_m_f<mode>): Likewise.
        (mve_vmulq_m_f<mode>): Likewise.
        (mve_vmulq_m_n_f<mode>): Likewise.
        (mve_vornq_m_f<mode>): Likewise.
        (mve_vorrq_m_f<mode>): Likewise.
        (mve_vsubq_m_f<mode>): Likewise.
        (mve_vsubq_m_n_f<mode>): Likewise.
        (mve_vstrbq_scatter_offset_p_<supf><mode>_insn): Likewise.
        (mve_vstrwq_scatter_base_p_<supf>v4si): Likewise.
        (mve_vstrbq_p_<supf><mode>): Likewise.
        (mve_vldrbq_gather_offset_z_<supf><mode>): Likewise.
        (mve_vldrbq_z_<supf><mode>): Likewise.
        (mve_vldrwq_gather_base_z_<supf>v4si): Likewise.
        (mve_vldrhq_gather_offset_z_<supf><mode>): Likewise.
        (mve_vldrhq_gather_shifted_offset_z_<supf><mode>): Likewise.
        (mve_vldrhq_z_fv8hf): Likewise.
        (mve_vldrhq_z_<supf><mode>): Likewise.
        (mve_vldrwq_z_fv4sf): Likewise.
        (mve_vldrwq_z_<supf>v4si): Likewise.
        (mve_vldrdq_gather_base_z_<supf>v2di): Likewise.
        (mve_vldrdq_gather_offset_z_<supf>v2di): Likewise.
        (mve_vldrdq_gather_shifted_offset_z_<supf>v2di): Likewise.
        (mve_vldrhq_gather_offset_z_fv8hf): Likewise.
        (mve_vldrhq_gather_shifted_offset_z_fv8hf): Likewise.
        (mve_vldrwq_gather_base_z_fv4sf): Likewise.
        (mve_vldrwq_gather_offset_z_fv4sf): Likewise.
        (mve_vldrwq_gather_offset_z_<supf>v4si): Likewise.
        (mve_vldrwq_gather_shifted_offset_z_fv4sf): Likewise.
        (mve_vldrwq_gather_shifted_offset_z_<supf>v4si): Likewise.
        (mve_vstrhq_p_fv8hf): Likewise.
        (mve_vstrhq_p_<supf><mode>): Likewise.
        (mve_vstrhq_scatter_offset_p_<supf><mode>_insn): Likewise.
        (mve_vstrhq_scatter_shifted_offset_p_<supf><mode>_insn): Likewise.
        (mve_vstrwq_p_fv4sf): Likewise.
        (mve_vstrwq_p_<supf>v4si): Likewise.
        (mve_vstrdq_scatter_base_p_<supf>v2di): Likewise.
        (mve_vstrdq_scatter_offset_p_<supf>v2di_insn): Likewise.
        (mve_vstrdq_scatter_shifted_offset_p_<supf>v2di_insn): Likewise.
        (mve_vstrhq_scatter_offset_p_fv8hf_insn): Likewise.
        (mve_vstrhq_scatter_shifted_offset_p_fv8hf_insn): Likewise.
        (mve_vstrwq_scatter_base_p_fv4sf): Likewise.
        (mve_vstrwq_scatter_offset_p_fv4sf_insn): Likewise.
        (mve_vstrwq_scatter_offset_p_<supf>v4si_insn): Likewise.
        (mve_vstrwq_scatter_shifted_offset_p_fv4sf_insn): Likewise.
        (mve_vstrwq_scatter_shifted_offset_p_<supf>v4si_insn): Likewise.
        (mve_vidupq_m_wb_u<mode>_insn): Likewise.
        (mve_vddupq_m_wb_u<mode>_insn): Likewise.
        (mve_vdwdupq_m_wb_u<mode>_insn): Likewise.
        (mve_viwdupq_m_wb_u<mode>_insn): Likewise.
        (mve_vstrwq_scatter_base_wb_p_<supf>v4si): Likewise.
        (mve_vstrwq_scatter_base_wb_p_fv4sf): Likewise.
        (mve_vstrdq_scatter_base_wb_p_<supf>v2di): Likewise.
        (mve_vldrwq_gather_base_wb_z_<supf>v4si_insn): Likewise.
        (mve_vldrwq_gather_base_wb_z_fv4sf_insn): Likewise.
        (mve_vldrdq_gather_base_wb_z_<supf>v2di_insn): Likewise.
        (mve_vadciq_m_<supf>v4si): Likewise.
        (mve_vadcq_m_<supf>v4si): Likewise.
        (mve_vsbciq_m_<supf>v4si): Likewise.
        (mve_vsbcq_m_<supf>v4si): Likewise.
        (mve_vshlcq_m_<supf><mode>): Likewise.
        (arm_vcx1q<a>_p_v16qi): Likewise.
        (arm_vcx2q<a>_p_v16qi): Likewise.
        (arm_vcx3q<a>_p_v16qi): Likewise.


gcc/testsuite/ChangeLog:

         * gcc.target/arm/dlstp-compile-asm.c: New test.



### Inline copy of patch ###

diff --git a/gcc/config/arm/arm.md b/gcc/config/arm/arm.md
index 69bf343fb0ed601014979cfc1803abe84c87f179..e1d2e62593085accfcc111cf6fa5795e4520f213 100644
--- a/gcc/config/arm/arm.md
+++ b/gcc/config/arm/arm.md
@@ -123,6 +123,8 @@
  ; and not all ARM insns do.
  (define_attr "predicated" "yes,no" (const_string "no"))

+(define_attr "mve_unpredicated_insn" "" (const_int 0))
+
  ; LENGTH of an instruction (in bytes)
  (define_attr "length" ""
    (const_int 4))
diff --git a/gcc/config/arm/mve.md b/gcc/config/arm/mve.md
index 62186f124da183fe1b1eb57a1aea1e8fff680a22..b1c8c1c569f31a6cb1bfdc16394047f02d6cddf4 100644
--- a/gcc/config/arm/mve.md
+++ b/gcc/config/arm/mve.md
@@ -142,7 +142,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vrintzt.f%#<V_sz_elem> %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vrndq_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -818,7 +819,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vaddlvt.<supf>32 %Q0, %R0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vaddlvq_<supf>v4si"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -910,7 +912,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vaddvt.<supf>%#<V_sz_elem>    %0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vaddvq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -2560,7 +2563,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vbict.i%#<V_sz_elem>    %q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vbicq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])
  ;;
  ;; [vcmpeqq_m_f])
@@ -2575,7 +2579,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vcmpt.f%#<V_sz_elem>    eq, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmpeqq_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])
  ;;
  ;; [vcvtaq_m_u, vcvtaq_m_s])
@@ -2590,7 +2595,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
"vpst\;vcvtat.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcvtaq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])
  ;;
  ;; [vcvtq_m_to_f_s, vcvtq_m_to_f_u])
@@ -2605,7 +2611,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
"vpst\;vcvtt.f%#<V_sz_elem>.<supf>%#<V_sz_elem>  %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcvtq_to_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])
  ;;
  ;; [vqrshrnbq_n_u, vqrshrnbq_n_s])
@@ -2727,7 +2734,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vabst.s%#<V_sz_elem>    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vabsq_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -2743,7 +2751,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vaddvat.<supf>%#<V_sz_elem>    %0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vaddvaq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -2759,7 +2768,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vclst.s%#<V_sz_elem>    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vclsq_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -2775,7 +2785,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vclzt.i%#<V_sz_elem>    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vclzq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -2791,7 +2802,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vcmpt.u%#<V_sz_elem>    cs, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmpcsq_n_<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -2807,7 +2819,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vcmpt.u%#<V_sz_elem>    cs, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmpcsq_<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -2823,7 +2836,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vcmpt.i%#<V_sz_elem>    eq, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmpeqq_n_<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -2839,7 +2853,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vcmpt.i%#<V_sz_elem>    eq, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmpeqq_<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -2855,7 +2870,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vcmpt.s%#<V_sz_elem>    ge, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmpgeq_n_<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -2871,7 +2887,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vcmpt.s%#<V_sz_elem>    ge, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmpgeq_<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -2887,7 +2904,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vcmpt.s%#<V_sz_elem>    gt, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmpgtq_n_<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -2903,7 +2921,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vcmpt.s%#<V_sz_elem>    gt, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmpgtq_<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -2919,7 +2938,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vcmpt.u%#<V_sz_elem>    hi, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmphiq_n_<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -2935,7 +2955,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vcmpt.u%#<V_sz_elem>    hi, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmphiq_<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -2951,7 +2972,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vcmpt.s%#<V_sz_elem>    le, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmpleq_n_<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -2967,7 +2989,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vcmpt.s%#<V_sz_elem>    le, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmpleq_<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -2983,7 +3006,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vcmpt.s%#<V_sz_elem>    lt, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmpltq_n_<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -2999,7 +3023,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vcmpt.s%#<V_sz_elem>    lt, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmpltq_<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3015,7 +3040,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vcmpt.i%#<V_sz_elem>    ne, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmpneq_n_<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3031,7 +3057,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vcmpt.i%#<V_sz_elem>    ne, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmpneq_<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3047,7 +3074,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vdupt.%#<V_sz_elem>    %q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vdupq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3063,7 +3091,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmaxat.s%#<V_sz_elem>    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmaxaq_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3079,7 +3108,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmaxavt.s%#<V_sz_elem>    %0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmaxavq_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3095,7 +3125,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmaxvt.<supf>%#<V_sz_elem>    %0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmaxvq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3111,7 +3142,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vminat.s%#<V_sz_elem>    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vminaq_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3127,7 +3159,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vminavt.s%#<V_sz_elem>    %0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vminavq_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3143,7 +3176,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vminvt.<supf>%#<V_sz_elem>\t%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vminvq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3175,7 +3209,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmladavt.<supf>%#<V_sz_elem>\t%0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmladavq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3191,7 +3226,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmladavxt.s%#<V_sz_elem>\t%0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmladavxq_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3239,7 +3275,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmlsdavt.s%#<V_sz_elem>    %0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmlsdavq_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3255,7 +3292,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmlsdavxt.s%#<V_sz_elem>    %0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmlsdavxq_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3271,7 +3309,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmvnt %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmvnq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3287,7 +3326,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vnegt.s%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vnegq_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3319,7 +3359,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqabst.s%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqabsq_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3367,7 +3408,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqnegt.s%#<V_sz_elem>    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqnegq_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3479,7 +3521,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqrshlt.<supf>%#<V_sz_elem>    %q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqrshlq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3495,7 +3538,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqshlt.<supf>%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqshlq_r_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3511,7 +3555,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vrev64t.%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vrev64q_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3527,7 +3572,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vrshlt.<supf>%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vrshlq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3543,7 +3589,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vshlt.<supf>%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vshlq_r_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3702,7 +3749,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vabst.f%#<V_sz_elem>    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vabsq_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3718,7 +3766,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vaddlvat.<supf>32 %Q0, %R0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vaddlvaq_<supf>v4si"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])
  ;;
  ;; [vcmlaq, vcmlaq_rot90, vcmlaq_rot180, vcmlaq_rot270])
@@ -3752,7 +3801,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vcmpt.f%#<V_sz_elem>    eq, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmpeqq_n_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3768,7 +3818,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vcmpt.f%#<V_sz_elem>    ge, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmpgeq_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3784,7 +3835,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vcmpt.f%#<V_sz_elem>    ge, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmpgeq_n_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3800,7 +3852,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vcmpt.f%#<V_sz_elem>    gt, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmpgtq_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3816,7 +3869,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vcmpt.f%#<V_sz_elem>    gt, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmpgtq_n_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3832,7 +3886,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vcmpt.f%#<V_sz_elem>    le, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmpleq_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3848,7 +3903,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vcmpt.f%#<V_sz_elem>    le, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmpleq_n_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3864,7 +3920,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vcmpt.f%#<V_sz_elem>    lt, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmpltq_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3880,7 +3937,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vcmpt.f%#<V_sz_elem>    lt, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmpltq_n_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3896,7 +3954,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vcmpt.f%#<V_sz_elem>    ne, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmpneq_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3912,7 +3971,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vcmpt.f%#<V_sz_elem>    ne, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmpneq_n_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3928,7 +3988,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vcvtbt.f16.f32 %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcvtbq_f16_f32v8hf"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3944,7 +4005,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vcvtbt.f32.f16 %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcvtbq_f32_f16v4sf"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3960,7 +4022,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vcvttt.f16.f32 %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcvttq_f16_f32v8hf"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3976,7 +4039,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vcvttt.f32.f16 %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcvttq_f32_f16v4sf"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -3992,7 +4056,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vdupt.%#<V_sz_elem>    %q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vdupq_n_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -4071,7 +4136,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vmaxnmat.f%#<V_sz_elem>    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmaxnmaq_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])
  ;;
  ;; [vmaxnmavq_p_f])
@@ -4086,7 +4152,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vmaxnmavt.f%#<V_sz_elem>    %0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmaxnmavq_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -4102,7 +4169,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vmaxnmvt.f%#<V_sz_elem>    %0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmaxnmvq_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])
  ;;
  ;; [vminnmaq_m_f])
@@ -4117,7 +4185,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vminnmat.f%#<V_sz_elem>    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vminnmaq_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -4133,7 +4202,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vminnmavt.f%#<V_sz_elem>    %0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vminnmavq_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])
  ;;
  ;; [vminnmvq_p_f])
@@ -4148,7 +4218,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vminnmvt.f%#<V_sz_elem>    %0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vminnmvq_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -4196,7 +4267,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmlaldavt.<supf>%#<V_sz_elem> %Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmlaldavq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -4212,7 +4284,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmlaldavxt.s%#<V_sz_elem>\t%Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmlaldavxq_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])
  ;;
  ;; [vmlsldavaq_s])
@@ -4259,7 +4332,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmlsldavt.s%#<V_sz_elem> %Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmlsldavq_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -4275,7 +4349,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmlsldavxt.s%#<V_sz_elem> %Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmlsldavxq_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])
  ;;
  ;; [vmovlbq_m_u, vmovlbq_m_s])
@@ -4290,7 +4365,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmovlbt.<supf>%#<V_sz_elem>    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmovlbq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])
  ;;
  ;; [vmovltq_m_u, vmovltq_m_s])
@@ -4305,7 +4381,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmovltt.<supf>%#<V_sz_elem>    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmovltq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])
  ;;
  ;; [vmovnbq_m_u, vmovnbq_m_s])
@@ -4320,7 +4397,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmovnbt.i%#<V_sz_elem>    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmovnbq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -4336,7 +4414,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmovntt.i%#<V_sz_elem>    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmovntq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -4352,7 +4431,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmvnt.i%#<V_sz_elem>    %q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmvnq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])
  ;;
  ;; [vnegq_m_f])
@@ -4367,7 +4447,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vnegt.f%#<V_sz_elem>    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vnegq_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -4383,7 +4464,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vorrt.i%#<V_sz_elem>    %q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vorrq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])
  ;;
  ;; [vpselq_f])
@@ -4414,7 +4496,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqmovnbt.<supf>%#<V_sz_elem>    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqmovnbq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -4430,7 +4513,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqmovntt.<supf>%#<V_sz_elem>    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqmovntq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -4446,7 +4530,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqmovunbt.s%#<V_sz_elem>    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqmovunbq_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -4462,7 +4547,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqmovuntt.s%#<V_sz_elem>    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqmovuntq_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -4574,7 +4660,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vrev32t.16 %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vrev32q_fv8hf"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -4590,7 +4677,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vrev32t.%#<V_sz_elem>    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vrev32q_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -4606,7 +4694,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vrev64t.%#<V_sz_elem>    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vrev64q_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -4638,7 +4727,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vrmlaldavhxt.s32 %Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vrmlaldavhxq_sv4si"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -4670,7 +4760,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vrmlsldavht.s32 %Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vrmlsldavhq_sv4si"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -4686,7 +4777,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vrmlsldavhxt.s32 %Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vrmlsldavhxq_sv4si"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -4702,7 +4794,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vrintat.f%#<V_sz_elem>    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vrndaq_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -4718,7 +4811,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vrintmt.f%#<V_sz_elem>    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vrndmq_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -4734,7 +4828,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vrintnt.f%#<V_sz_elem>    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vrndnq_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -4750,7 +4845,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vrintpt.f%#<V_sz_elem>    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vrndpq_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -4766,7 +4862,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vrintxt.f%#<V_sz_elem>    %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vrndxq_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -4846,7 +4943,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
"vpst\;vcvtmt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcvtmq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -4862,7 +4960,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
"vpst\;vcvtpt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcvtpq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -4878,7 +4977,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
"vpst\;vcvtnt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcvtnq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -4895,7 +4995,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
"vpst\;vcvtt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcvtq_n_from_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -4911,7 +5012,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vrev16t.8 %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vrev16q_<supf>v16qi"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -4927,7 +5029,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
"vpst\;vcvtt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcvtq_from_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -4943,7 +5046,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vrmlaldavht.<supf>32 %Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vrmlaldavhq_<supf>v4si"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -4976,7 +5080,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vabavt.<supf>%#<V_sz_elem>\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vabavq_<supf><mode>"))
+  (set_attr "type" "mve_move")
  ])

  ;;
@@ -4993,7 +5098,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\n\tvqshlut.s%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqshluq_n_s<mode>"))
+  (set_attr "type" "mve_move")])

  ;;
  ;; [vshlq_m_s, vshlq_m_u])
@@ -5009,7 +5115,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vshlt.<supf>%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vshlq_<supf><mode>"))
+  (set_attr "type" "mve_move")])

  ;;
  ;; [vsriq_m_n_s, vsriq_m_n_u])
@@ -5025,7 +5132,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vsrit.%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vsriq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")])

  ;;
  ;; [vsubq_m_u, vsubq_m_s])
@@ -5041,7 +5149,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vsubt.i%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vsubq_<supf><mode>"))
+  (set_attr "type" "mve_move")])

  ;;
  ;; [vcvtq_m_n_to_f_u, vcvtq_m_n_to_f_s])
@@ -5057,7 +5166,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
"vpst\;vcvtt.f%#<V_sz_elem>.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcvtq_n_to_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])
  ;;
  ;; [vabdq_m_s, vabdq_m_u])
@@ -5073,7 +5183,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vabdt.<supf>%#<V_sz_elem>    %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vabdq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5090,7 +5201,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vaddt.i%#<V_sz_elem>    %q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vaddq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5107,7 +5219,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vaddt.i%#<V_sz_elem>    %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vaddq<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5124,7 +5237,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vandt %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vandq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5141,7 +5255,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vbict %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vbicq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5158,7 +5273,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vbrsrt.%#<V_sz_elem>    %q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vbrsrq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5175,7 +5291,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vcaddt.i%#<V_sz_elem>    %q0, %q2, %q3, #270"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcaddq_rot270<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5192,7 +5309,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vcaddt.i%#<V_sz_elem>    %q0, %q2, %q3, #90"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcaddq_rot90<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5209,7 +5327,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;veort %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_veorq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5226,7 +5345,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vhaddt.<supf>%#<V_sz_elem>    %q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vhaddq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5243,7 +5363,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vhaddt.<supf>%#<V_sz_elem>    %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vhaddq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5260,7 +5381,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vhsubt.<supf>%#<V_sz_elem>    %q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vhsubq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5277,7 +5399,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vhsubt.<supf>%#<V_sz_elem>    %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vhsubq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5294,7 +5417,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmaxt.<supf>%#<V_sz_elem>    %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmaxq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5311,7 +5435,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmint.<supf>%#<V_sz_elem>    %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vminq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5328,7 +5453,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmladavat.<supf>%#<V_sz_elem>    %0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmladavaq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5345,7 +5471,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmlat.<supf>%#<V_sz_elem>    %q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmlaq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5362,7 +5489,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmlast.<supf>%#<V_sz_elem>    %q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmlasq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5379,7 +5507,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmulht.<supf>%#<V_sz_elem>    %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmulhq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5396,7 +5525,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmullbt.<supf>%#<V_sz_elem>    %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmullbq_int_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5413,7 +5543,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmulltt.<supf>%#<V_sz_elem>    %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmulltq_int_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5430,7 +5561,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmult.i%#<V_sz_elem>    %q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmulq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5447,7 +5579,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmult.i%#<V_sz_elem>    %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmulq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5464,7 +5597,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vornt %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vornq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5481,7 +5615,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vorrt %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vorrq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5498,7 +5633,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqaddt.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqaddq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5515,7 +5651,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqaddt.<supf>%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqaddq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5532,7 +5669,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqdmlaht.s%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqdmlahq_n_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5549,7 +5687,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqdmlasht.s%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqdmlashq_n_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5566,7 +5705,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqrdmlaht.s%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqrdmlahq_n_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5583,7 +5723,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqrdmlasht.s%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqrdmlashq_n_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5600,7 +5741,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqrshlt.<supf>%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqrshlq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5617,7 +5759,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqshlt.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqshlq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5634,7 +5777,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqshlt.<supf>%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqshlq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5651,7 +5795,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqsubt.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqsubq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5668,7 +5813,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqsubt.<supf>%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqsubq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5685,7 +5831,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vrhaddt.<supf>%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vrhaddq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5702,7 +5849,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vrmulht.<supf>%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vrmulhq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5719,7 +5867,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vrshlt.<supf>%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vrshlq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5736,7 +5885,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vrshrt.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vrshrq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5753,7 +5903,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vshlt.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vshlq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5770,7 +5921,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vshrt.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vshrq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5787,7 +5939,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vslit.%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vsliq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5804,7 +5957,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vsubt.i%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vsubq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5821,7 +5975,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vhcaddt.s%#<V_sz_elem>\t%q0, %q2, %q3, #270"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vhcaddq_rot270_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5838,7 +5993,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vhcaddt.s%#<V_sz_elem>\t%q0, %q2, %q3, #90"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vhcaddq_rot90_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5855,7 +6011,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmladavaxt.s%#<V_sz_elem>\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmladavaxq_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5872,7 +6029,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmlsdavat.s%#<V_sz_elem>\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmlsdavaq_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5889,7 +6047,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmlsdavaxt.s%#<V_sz_elem>\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmlsdavaxq_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5906,7 +6065,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqdmladht.s%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqdmladhq_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5923,7 +6083,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqdmladhxt.s%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqdmladhxq_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5940,7 +6101,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqdmlsdht.s%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqdmlsdhq_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5957,7 +6119,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqdmlsdhxt.s%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqdmlsdhxq_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5974,7 +6137,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqdmulht.s%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqdmulhq_n_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -5991,7 +6155,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqdmulht.s%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqdmulhq_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6008,7 +6173,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqrdmladht.s%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqrdmladhq_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6025,7 +6191,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqrdmladhxt.s%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqrdmladhxq_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6042,7 +6209,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqrdmlsdht.s%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqrdmlsdhq_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6059,7 +6227,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqrdmlsdhxt.s%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqrdmlsdhxq_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6076,7 +6245,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqrdmulht.s%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqrdmulhq_n_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6093,7 +6263,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqrdmulht.s%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqrdmulhq_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6110,7 +6281,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmlaldavat.<supf>%#<V_sz_elem>    %Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmlaldavaq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6127,7 +6299,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmlaldavaxt.<supf>%#<V_sz_elem> %Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmlaldavaxq_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6144,7 +6317,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqrshrnbt.<supf>%#<V_sz_elem>    %q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqrshrnbq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6161,7 +6335,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqrshrntt.<supf>%#<V_sz_elem>    %q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqrshrntq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6178,7 +6353,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\n\tvqshrnbt.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqshrnbq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6195,7 +6371,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqshrntt.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqshrntq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6212,7 +6389,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vrmlaldavhat.s32\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vrmlaldavhaq_sv4si"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6229,7 +6407,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vrshrnbt.i%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vrshrnbq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6246,7 +6425,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vrshrntt.i%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vrshrntq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6263,7 +6443,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vshllbt.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vshllbq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6280,7 +6461,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vshlltt.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vshlltq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6297,7 +6479,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vshrnbt.i%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vshrnbq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6314,7 +6497,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vshrntt.i%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vshrntq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6331,7 +6515,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmlsldavat.s%#<V_sz_elem>\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmlsldavaq_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6348,7 +6533,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmlsldavaxt.s%#<V_sz_elem>\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmlsldavaxq_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6365,7 +6551,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmullbt.p%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmullbq_poly_p<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6382,7 +6569,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vmulltt.p%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmulltq_poly_p<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6399,7 +6587,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqdmullbt.s%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqdmullbq_n_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6416,7 +6605,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqdmullbt.s%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqdmullbq_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6433,7 +6623,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqdmulltt.s%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqdmulltq_n_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6450,7 +6641,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqdmulltt.s%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqdmulltq_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6467,7 +6659,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqrshrunbt.s%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqrshrunbq_n_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6484,7 +6677,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqrshruntt.s%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqrshruntq_n_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6501,7 +6695,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqshrunbt.s%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqshrunbq_n_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6518,7 +6713,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vqshruntt.s%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqshruntq_n_s<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6535,7 +6731,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vrmlaldavhat.u32\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vrmlaldavhaq_uv4si"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6552,7 +6749,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vrmlaldavhaxt.s32\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vrmlaldavhaxq_sv4si"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6569,7 +6767,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vrmlsldavhat.s32\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vrmlsldavhaq_sv4si"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6586,7 +6785,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vrmlsldavhaxt.s32\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vrmlsldavhaxq_sv4si"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])
  ;;
  ;; [vabdq_m_f])
@@ -6602,7 +6802,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vabdt.f%#<V_sz_elem>    %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vabdq_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6619,7 +6820,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vaddt.f%#<V_sz_elem>    %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vaddq_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6636,7 +6838,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vaddt.f%#<V_sz_elem>    %q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vaddq_n_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6653,7 +6856,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vandt %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vandq_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6670,7 +6874,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vbict %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vbicq_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6687,7 +6892,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vbrsrt.%#<V_sz_elem>    %q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vbrsrq_n_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6704,7 +6910,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vcaddt.f%#<V_sz_elem>    %q0, %q2, %q3, #270"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcaddq_rot270<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6721,7 +6928,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vcaddt.f%#<V_sz_elem>    %q0, %q2, %q3, #90"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcaddq_rot90<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6738,7 +6946,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vcmlat.f%#<V_sz_elem>    %q0, %q2, %q3, #0"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmlaq<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6755,7 +6964,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vcmlat.f%#<V_sz_elem>    %q0, %q2, %q3, #180"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmlaq_rot180<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6772,7 +6982,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vcmlat.f%#<V_sz_elem>    %q0, %q2, %q3, #270"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmlaq_rot270<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6789,7 +7000,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vcmlat.f%#<V_sz_elem>    %q0, %q2, %q3, #90"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmlaq_rot90<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6806,7 +7018,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vcmult.f%#<V_sz_elem>    %q0, %q2, %q3, #0"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmulq<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6823,7 +7036,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vcmult.f%#<V_sz_elem>    %q0, %q2, %q3, #180"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmulq_rot180<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6840,7 +7054,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vcmult.f%#<V_sz_elem>    %q0, %q2, %q3, #270"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmulq_rot270<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6857,7 +7072,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vcmult.f%#<V_sz_elem>    %q0, %q2, %q3, #90"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmulq_rot90<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6874,7 +7090,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;veort %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_veorq_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6891,7 +7108,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vfmat.f%#<V_sz_elem>    %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vfmaq_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6908,7 +7126,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vfmat.f%#<V_sz_elem>    %q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vfmaq_n_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6925,7 +7144,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vfmast.f%#<V_sz_elem>    %q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vfmasq_n_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6942,7 +7162,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vfmst.f%#<V_sz_elem>    %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vfmsq_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6959,7 +7180,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vmaxnmt.f%#<V_sz_elem>    %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmaxnmq_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6976,7 +7198,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vminnmt.f%#<V_sz_elem>    %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vminnmq_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -6993,7 +7216,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vmult.f%#<V_sz_elem>    %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmulq_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -7010,7 +7234,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vmult.f%#<V_sz_elem>    %q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmulq_n_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -7027,7 +7252,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vornt %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vornq_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -7044,7 +7270,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vorrt %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vorrq_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -7061,7 +7288,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vsubt.f%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vsubq_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -7078,7 +7306,8 @@
    ]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vsubt.f%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vsubq_n_f<mode>"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -7245,7 +7474,8 @@
        VSTRBSOQ))]
    "TARGET_HAVE_MVE"
    "vpst\;vstrbt.<V_sz_elem>\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vstrbq_scatter_offset_<supf><mode>_insn"))
+  (set_attr "length" "8")])

  ;;
  ;; [vstrwq_scatter_base_p_s vstrwq_scatter_base_p_u]
@@ -7268,7 +7498,8 @@
     output_asm_insn ("vpst\n\tvstrwt.u32\t%q2, [%q0, %1]",ops);
     return "";
  }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vstrwq_scatter_base_<supf>v4si"))
+  (set_attr "length" "8")])

  ;;
  ;; [vstrbq_p_s vstrbq_p_u]
@@ -7288,7 +7519,8 @@
     output_asm_insn ("vpst\;vstrbt.<V_sz_elem>\t%q1, %E0",ops);
     return "";
  }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vstrbq_<supf><mode>"))
+  (set_attr "length" "8")])

  ;;
  ;; [vldrbq_gather_offset_z_s vldrbq_gather_offset_z_u]
@@ -7313,7 +7545,8 @@
       output_asm_insn ("vpst\n\tvldrbt.<supf><V_sz_elem>\t%q0, [%m1, 
%q2]",ops);
     return "";
  }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vldrbq_gather_offset_<supf><mode>"))
+  (set_attr "length" "8")])

  ;;
  ;; [vldrbq_z_s vldrbq_z_u]
@@ -7336,7 +7569,8 @@
       output_asm_insn ("vpst\;vldrbt.<supf><V_sz_elem>\t%q0, %E1",ops);
     return "";
  }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vldrbq_<supf><mode>"))
+  (set_attr "length" "8")])

  ;;
  ;; [vldrwq_gather_base_z_s vldrwq_gather_base_z_u]
@@ -7357,7 +7591,8 @@
     output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%q1, %2]",ops);
     return "";
  }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vldrwq_gather_base_<supf>v4si"))
+  (set_attr "length" "8")])

  ;;
  ;; [vldrhq_f]
@@ -7424,7 +7659,8 @@
       output_asm_insn ("vpst\n\tvldrht.<supf><V_sz_elem>\t%q0, [%m1, 
%q2]",ops);
     return "";
  }
- [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vldrhq_gather_offset_<supf><mode>"))
+  (set_attr "length" "8")])

  ;;
  ;; [vldrhq_gather_shifted_offset_s vldrhq_gather_shifted_offset_u]
@@ -7472,7 +7708,8 @@
       output_asm_insn ("vpst\n\tvldrht.<supf><V_sz_elem>\t%q0, [%m1, 
%q2, uxtw #1]",ops);
     return "";
  }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vldrhq_gather_shifted_offset_<supf><mode>"))
+  (set_attr "length" "8")])

  ;;
  ;; [vldrhq_s, vldrhq_u]
@@ -7514,7 +7751,8 @@
     output_asm_insn ("vpst\;vldrht.16\t%q0, %E1",ops);
     return "";
  }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vldrhq_fv8hf"))
+  (set_attr "length" "8")])

  ;;
  ;; [vldrhq_z_s vldrhq_z_u]
@@ -7537,7 +7775,8 @@
       output_asm_insn ("vpst\;vldrht.<supf><V_sz_elem>\t%q0, %E1",ops);
     return "";
  }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vldrhq_<supf><mode>"))
+  (set_attr "length" "8")])

  ;;
  ;; [vldrwq_f]
@@ -7595,7 +7834,8 @@
     output_asm_insn ("vpst\;vldrwt.32\t%q0, %E1",ops);
     return "";
  }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vldrwq_fv4sf"))
+  (set_attr "length" "8")])

  ;;
  ;; [vldrwq_z_s vldrwq_z_u]
@@ -7615,7 +7855,8 @@
     output_asm_insn ("vpst\;vldrwt.32\t%q0, %E1",ops);
     return "";
  }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vldrwq_<supf>v4si"))
+  (set_attr "length" "8")])

  (define_expand "mve_vld1q_f<mode>"
    [(match_operand:MVE_0 0 "s_register_operand")
@@ -7676,7 +7917,8 @@
     output_asm_insn ("vpst\n\tvldrdt.u64\t%q0, [%q1, %2]",ops);
     return "";
  }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vldrdq_gather_base_<supf>v2di"))
+  (set_attr "length" "8")])

  ;;
  ;; [vldrdq_gather_offset_s vldrdq_gather_offset_u]
@@ -7717,7 +7959,8 @@
    output_asm_insn ("vpst\n\tvldrdt.u64\t%q0, [%m1, %q2]",ops);
    return "";
  }
- [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vldrdq_gather_offset_<supf>v2di"))
+  (set_attr "length" "8")])

  ;;
  ;; [vldrdq_gather_shifted_offset_s vldrdq_gather_shifted_offset_u]
@@ -7758,7 +8001,8 @@
     output_asm_insn ("vpst\n\tvldrdt.u64\t%q0, [%m1, %q2, uxtw #3]",ops);
     return "";
  }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vldrdq_gather_shifted_offset_<supf>v2di"))
+  (set_attr "length" "8")])

  ;;
  ;; [vldrhq_gather_offset_f]
@@ -7800,7 +8044,8 @@
     output_asm_insn ("vpst\n\tvldrht.f16\t%q0, [%m1, %q2]",ops);
     return "";
  }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vldrhq_gather_offset_fv8hf"))
+  (set_attr "length" "8")])

  ;;
  ;; [vldrhq_gather_shifted_offset_f]
@@ -7842,7 +8087,8 @@
     output_asm_insn ("vpst\n\tvldrht.f16\t%q0, [%m1, %q2, uxtw #1]",ops);
     return "";
  }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vldrhq_gather_shifted_offset_fv8hf"))
+  (set_attr "length" "8")])

  ;;
  ;; [vldrwq_gather_base_f]
@@ -7883,7 +8129,8 @@
     output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%q1, %2]",ops);
     return "";
  }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vldrwq_gather_base_fv4sf"))
+  (set_attr "length" "8")])

  ;;
  ;; [vldrwq_gather_offset_f]
@@ -7945,7 +8192,8 @@
     output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2]",ops);
     return "";
  }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vldrwq_gather_offset_fv4sf"))
+  (set_attr "length" "8")])

  ;;
  ;; [vldrwq_gather_offset_z_s vldrwq_gather_offset_z_u]
@@ -7967,7 +8215,8 @@
     output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2]",ops);
     return "";
  }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vldrwq_gather_offset_<supf>v4si"))
+  (set_attr "length" "8")])

  ;;
  ;; [vldrwq_gather_shifted_offset_f]
@@ -8029,7 +8278,8 @@
     output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2, uxtw #2]",ops);
     return "";
  }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vldrwq_gather_shifted_offset_fv4sf"))
+  (set_attr "length" "8")])

  ;;
  ;; [vldrwq_gather_shifted_offset_z_s vldrwq_gather_shifted_offset_z_u]
@@ -8051,7 +8301,8 @@
     output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2, uxtw #2]",ops);
     return "";
  }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vldrwq_gather_shifted_offset_<supf>v4si"))
+  (set_attr "length" "8")])

  ;;
  ;; [vstrhq_f]
@@ -8090,7 +8341,8 @@
     output_asm_insn ("vpst\;vstrht.16\t%q1, %E0",ops);
     return "";
  }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vstrhq_fv8hf"))
+  (set_attr "length" "8")])

  ;;
  ;; [vstrhq_p_s vstrhq_p_u]
@@ -8110,7 +8362,8 @@
     output_asm_insn ("vpst\;vstrht.<V_sz_elem>\t%q1, %E0",ops);
     return "";
  }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vstrhq_<supf><mode>"))
+  (set_attr "length" "8")])

  ;;
  ;; [vstrhq_scatter_offset_p_s vstrhq_scatter_offset_p_u]
@@ -8142,7 +8395,8 @@
        VSTRHSOQ))]
    "TARGET_HAVE_MVE"
    "vpst\;vstrht.<V_sz_elem>\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vstrhq_scatter_offset_<supf><mode>_insn"))
+  (set_attr "length" "8")])

  ;;
  ;; [vstrhq_scatter_offset_s vstrhq_scatter_offset_u]
@@ -8202,7 +8456,8 @@
        VSTRHSSOQ))]
    "TARGET_HAVE_MVE"
    "vpst\;vstrht.<V_sz_elem>\t%q2, [%0, %q1, uxtw #1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vstrhq_scatter_shifted_offset_<supf><mode>_insn"))
+  (set_attr "length" "8")])

  ;;
  ;; [vstrhq_scatter_shifted_offset_s vstrhq_scatter_shifted_offset_u]
@@ -8289,7 +8544,8 @@
     output_asm_insn ("vpst\;vstrwt.32\t%q1, %E0",ops);
     return "";
  }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vstrwq_fv4sf"))
+  (set_attr "length" "8")])

  ;;
  ;; [vstrwq_p_s vstrwq_p_u]
@@ -8309,7 +8565,8 @@
     output_asm_insn ("vpst\;vstrwt.32\t%q1, %E0",ops);
     return "";
  }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vstrwq_<supf>v4si"))
+  (set_attr "length" "8")])

  ;;
  ;; [vstrwq_s vstrwq_u]
@@ -8371,7 +8628,8 @@
     output_asm_insn ("vpst\;\tvstrdt.u64\t%q2, [%q0, %1]",ops);
     return "";
  }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vstrdq_scatter_base_<supf>v2di"))
+  (set_attr "length" "8")])

  ;;
  ;; [vstrdq_scatter_base_s vstrdq_scatter_base_u]
@@ -8424,7 +8682,8 @@
        VSTRDSOQ))]
    "TARGET_HAVE_MVE"
    "vpst\;vstrdt.64\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vstrdq_scatter_offset_<supf>v2di_insn"))
+  (set_attr "length" "8")])

  ;;
  ;; [vstrdq_scatter_offset_s vstrdq_scatter_offset_u]
@@ -8484,7 +8743,8 @@
        VSTRDSSOQ))]
    "TARGET_HAVE_MVE"
    "vpst\;vstrdt.64\t%q2, [%0, %q1, UXTW #3]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vstrdq_scatter_shifted_offset_<supf>v2di_insn"))
+  (set_attr "length" "8")])

  ;;
  ;; [vstrdq_scatter_shifted_offset_s vstrdq_scatter_shifted_offset_u]
@@ -8572,7 +8832,8 @@
        VSTRHQSO_F))]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vstrht.16\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vstrhq_scatter_offset_fv8hf_insn"))
+  (set_attr "length" "8")])

  ;;
  ;; [vstrhq_scatter_shifted_offset_f]
@@ -8632,7 +8893,8 @@
        VSTRHQSSO_F))]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vstrht.16\t%q2, [%0, %q1, uxtw #1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vstrhq_scatter_shifted_offset_fv8hf_insn"))
+  (set_attr "length" "8")])

  ;;
  ;; [vstrwq_scatter_base_f]
@@ -8677,7 +8939,8 @@
     output_asm_insn ("vpst\n\tvstrwt.u32\t%q2, [%q0, %1]",ops);
     return "";
  }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vstrwq_scatter_base_fv4sf"))
+  (set_attr "length" "8")])

  ;;
  ;; [vstrwq_scatter_offset_f]
@@ -8736,7 +8999,8 @@
        VSTRWQSO_F))]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vstrwt.32\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vstrwq_scatter_offset_fv4sf_insn"))
+  (set_attr "length" "8")])

  ;;
  ;; [vstrwq_scatter_offset_s vstrwq_scatter_offset_u]
@@ -8767,7 +9031,8 @@
        VSTRWSOQ))]
    "TARGET_HAVE_MVE"
    "vpst\;vstrwt.32\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vstrwq_scatter_offset_<supf>v4si_insn"))
+  (set_attr "length" "8")])

  ;;
  ;; [vstrwq_scatter_offset_s vstrwq_scatter_offset_u]
@@ -8855,7 +9120,8 @@
        VSTRWQSSO_F))]
    "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
    "vpst\;vstrwt.32\t%q2, [%0, %q1, uxtw #2]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vstrwq_scatter_shifted_offset_fv4sf_insn"))
+  (set_attr "length" "8")])

  ;;
  ;; [vstrwq_scatter_shifted_offset_p_s vstrwq_scatter_shifted_offset_p_u]
@@ -8887,7 +9153,8 @@
        VSTRWSSOQ))]
    "TARGET_HAVE_MVE"
    "vpst\;vstrwt.32\t%q2, [%0, %q1, uxtw #2]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vstrwq_scatter_shifted_offset_<supf>v4si_insn"))
+  (set_attr "length" "8")])

  ;;
  ;; [vstrwq_scatter_shifted_offset_s vstrwq_scatter_shifted_offset_u]
@@ -9012,7 +9279,8 @@
          (match_operand:SI 6 "immediate_operand" "i")))]
   "TARGET_HAVE_MVE"
   "vpst\;\tvidupt.u%#<V_sz_elem>\t%q0, %2, %4"
- [(set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vidupq_u<mode>_insn"))
+  (set_attr "length""8")])

  ;;
  ;; [vddupq_n_u])
@@ -9080,7 +9348,8 @@
           (match_operand:SI 6 "immediate_operand" "i")))]
   "TARGET_HAVE_MVE"
   "vpst\;\tvddupt.u%#<V_sz_elem>\t%q0, %2, %4"
- [(set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vddupq_u<mode>_insn"))
+  (set_attr "length""8")])

  ;;
  ;; [vdwdupq_n_u])
@@ -9196,7 +9465,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;\tvdwdupt.u%#<V_sz_elem>\t%q2, %3, %R4, %5"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vdwdupq_wb_u<mode>_insn"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -9313,7 +9583,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;\tviwdupt.u%#<V_sz_elem>\t%q2, %3, %R4, %5"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_viwdupq_wb_u<mode>_insn"))
+  (set_attr "type" "mve_move")
     (set_attr "length""8")])

  ;;
@@ -9365,7 +9636,8 @@
     output_asm_insn ("vpst\;\tvstrwt.u32\t%q2, [%q0, %1]!",ops);
     return "";
  }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vstrwq_scatter_base_wb_<supf>v4si"))
+  (set_attr "length" "8")])

  ;;
  ;; [vstrwq_scatter_base_wb_f]
@@ -9416,7 +9688,8 @@
     output_asm_insn ("vpst\;\tvstrwt.u32\t%q2, [%q0, %1]!",ops);
     return "";
  }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vstrwq_scatter_base_wb_fv4sf"))
+  (set_attr "length" "8")])

  ;;
  ;; [vstrdq_scatter_base_wb_s vstrdq_scatter_base_wb_u]
@@ -9467,7 +9740,8 @@
     output_asm_insn ("vpst;vstrdt.u64\t%q2, [%q0, %1]!",ops);
     return "";
  }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vstrdq_scatter_base_wb_<supf>v2di"))
+  (set_attr "length" "8")])

  (define_expand "mve_vldrwq_gather_base_wb_<supf>v4si"
    [(match_operand:V4SI 0 "s_register_operand")
@@ -9575,7 +9849,8 @@
     output_asm_insn ("vpst\;vldrwt.u32\t%q0, [%q1, %2]!",ops);
     return "";
  }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vldrwq_gather_base_wb_<supf>v4si_insn"))
+  (set_attr "length" "8")])

  (define_expand "mve_vldrwq_gather_base_wb_fv4sf"
    [(match_operand:V4SI 0 "s_register_operand")
@@ -9684,7 +9959,8 @@
     output_asm_insn ("vpst\;vldrwt.u32\t%q0, [%q1, %2]!",ops);
     return "";
  }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vldrwq_gather_base_wb_fv4sf_insn"))
+  (set_attr "length" "8")])

  (define_expand "mve_vldrdq_gather_base_wb_<supf>v2di"
    [(match_operand:V2DI 0 "s_register_operand")
@@ -9809,7 +10085,8 @@
     output_asm_insn ("vpst\;vldrdt.u64\t%q0, [%q1, %2]!",ops);
     return "";
  }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vldrdq_gather_base_wb_<supf>v2di_insn"))
+  (set_attr "length" "8")])
  ;;
  ;; [vadciq_m_s, vadciq_m_u])
  ;;
@@ -9826,7 +10103,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vadcit.i32\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vadciq_<supf>v4si"))
+  (set_attr "type" "mve_move")
     (set_attr "length" "8")])

  ;;
@@ -9862,7 +10140,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vadct.i32\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vadcq_<supf>v4si"))
+  (set_attr "type" "mve_move")
     (set_attr "length" "8")])

  ;;
@@ -9899,7 +10178,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vsbcit.i32\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vsbciq_<supf>v4si"))
+  (set_attr "type" "mve_move")
     (set_attr "length" "8")])

  ;;
@@ -9935,7 +10215,8 @@
    ]
    "TARGET_HAVE_MVE"
    "vpst\;vsbct.i32\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vsbcq_<supf>v4si"))
+  (set_attr "type" "mve_move")
     (set_attr "length" "8")])

  ;;
@@ -10352,7 +10633,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vshlct\t%q0, %1, %4"
- [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vshlcq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "8")])

  ;; CDE instructions on MVE registers.
@@ -10435,7 +10717,8 @@
       CDE_VCX))]
    "TARGET_CDE && TARGET_HAVE_MVE"
    "vpst\;vcx1<a>t\\tp%c1, %q0, #%c3"
-  [(set_attr "type" "coproc")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_arm_vcx1q<a>v16qi"))
+  (set_attr "type" "coproc")
     (set_attr "length" "8")]
  )

@@ -10449,7 +10732,8 @@
       CDE_VCX))]
    "TARGET_CDE && TARGET_HAVE_MVE"
    "vpst\;vcx2<a>t\\tp%c1, %q0, %q3, #%c4"
-  [(set_attr "type" "coproc")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_arm_vcx2q<a>v16qi"))
+  (set_attr "type" "coproc")
     (set_attr "length" "8")]
  )

@@ -10464,7 +10748,8 @@
       CDE_VCX))]
    "TARGET_CDE && TARGET_HAVE_MVE"
    "vpst\;vcx3<a>t\\tp%c1, %q0, %q3, %q4, #%c5"
-  [(set_attr "type" "coproc")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_arm_vcx3q<a>v16qi"))
+  (set_attr "type" "coproc")
     (set_attr "length" "8")]
  )

diff --git a/gcc/testsuite/gcc.target/arm/dlstp-compile-asm.c 
b/gcc/testsuite/gcc.target/arm/dlstp-compile-asm.c
new file mode 100644
index 
0000000000000000000000000000000000000000..ec6ee774cbda6604c4c24b57cd4d5d3bd08e07cd
--- /dev/null
+++ b/gcc/testsuite/gcc.target/arm/dlstp-compile-asm.c
@@ -0,0 +1,82 @@
+/* { dg-do compile { target { arm*-*-* } } } */
+/* { dg-require-effective-target arm_v8_1m_mve_ok } */
+/* { dg-skip-if "avoid conflicting multilib options" { *-*-* } { 
"-marm" "-mcpu=*" } } */
+/* { dg-options "-march=armv8.1-m.main+fp.dp+mve.fp -O3" } */
+
+#include <arm_mve.h>
+
+#define IMM 5
+
+#define TEST_COMPILE_IN_DLSTP_TERNARY(BITS, LANES, LDRSTRYTPE, TYPE, SIGN, NAME, PRED)                \
+void test_##NAME##PRED##_##SIGN##BITS (TYPE##BITS##_t *a, TYPE##BITS##_t *b,  TYPE##BITS##_t *c, int n)    \
+{                                            \
+  while (n > 0)                                        \
+    {                                            \
+      mve_pred16_t p = vctp##BITS##q (n); \
+      TYPE##BITS##x##LANES##_t va = vldr##LDRSTRYTPE##q_z_##SIGN##BITS (a, p);        \
+      TYPE##BITS##x##LANES##_t vb = vldr##LDRSTRYTPE##q_z_##SIGN##BITS (b, p);        \
+      TYPE##BITS##x##LANES##_t vc = NAME##PRED##_##SIGN##BITS (va, vb, p);        \
+      vstr##LDRSTRYTPE##q_p_##SIGN##BITS (c, vc, p);         \
+      c += LANES;                                    \
+      a += LANES;                                    \
+      b += LANES;                                    \
+      n -= LANES;                                    \
+    }                                            \
+}
+
+#define TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY(BITS, LANES, LDRSTRYTPE, NAME, PRED)    \
+TEST_COMPILE_IN_DLSTP_TERNARY (BITS, LANES, LDRSTRYTPE, int, s, NAME, PRED)            \
+TEST_COMPILE_IN_DLSTP_TERNARY (BITS, LANES, LDRSTRYTPE, uint, u, NAME, PRED)
+
+#define TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY(NAME, PRED)            \
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY (8, 16, b, NAME, PRED)                \
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY (16, 8, h, NAME, PRED)                \
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY (32, 4, w, NAME, PRED)
+
+
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY (vaddq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY (vorrq, _x)
+
+
+#define TEST_COMPILE_IN_DLSTP_TERNARY_N(BITS, LANES, LDRSTRYTPE, TYPE, SIGN, NAME, PRED)    \
+void test_##NAME##PRED##_n_##SIGN##BITS (TYPE##BITS##_t *a, TYPE##BITS##_t *c, int n)    \
+{                                            \
+  while (n > 0)                                        \
+    {                                            \
+      mve_pred16_t p = vctp##BITS##q (n); \
+      TYPE##BITS##x##LANES##_t va = vldr##LDRSTRYTPE##q_z_##SIGN##BITS (a, p);        \
+      TYPE##BITS##x##LANES##_t vc = NAME##PRED##_n_##SIGN##BITS (va, IMM, p);        \
+      vstr##LDRSTRYTPE##q_p_##SIGN##BITS (c, vc, p);         \
+      c += LANES;                                    \
+      a += LANES;                                    \
+      n -= LANES;                                    \
+    }                                            \
+}
+
+#define TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_N(BITS, LANES, LDRSTRYTPE, NAME, PRED)    \
+TEST_COMPILE_IN_DLSTP_TERNARY_N (BITS, LANES, LDRSTRYTPE, int, s, NAME, PRED)            \
+TEST_COMPILE_IN_DLSTP_TERNARY_N (BITS, LANES, LDRSTRYTPE, uint, u, NAME, PRED)
+
+#define TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N(NAME, PRED)            \
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_N (8, 16, b, NAME, PRED)                \
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_N (16, 8, h, NAME, PRED)                \
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_N (32, 4, w, NAME, PRED)
+
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vaddq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vmulq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vsubq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vhaddq, _x)
+
+
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vbrsrq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vshlq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vshrq, _x)
+
+
+/* The expected number of DLSTPs is currently the number of
+   `TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY.*` macro invocations * 6.  */
+/* { dg-final { scan-assembler-times {\tdlstp} 54 } } */
+/* { dg-final { scan-assembler-times {\tletp} 54 } } */
+/* { dg-final { scan-assembler-not "\tvctp\t" } } */
+/* { dg-final { scan-assembler-not "\tvpst\t" } } */
+/* { dg-final { scan-assembler-not "P0" } } */
\ No newline at end of file

[-- Attachment #2: rb16364.patch --]
[-- Type: text/x-patch, Size: 89194 bytes --]

diff --git a/gcc/config/arm/arm.md b/gcc/config/arm/arm.md
index 69bf343fb0ed601014979cfc1803abe84c87f179..e1d2e62593085accfcc111cf6fa5795e4520f213 100644
--- a/gcc/config/arm/arm.md
+++ b/gcc/config/arm/arm.md
@@ -123,6 +123,8 @@
 ; and not all ARM insns do.
 (define_attr "predicated" "yes,no" (const_string "no"))
 
+(define_attr "mve_unpredicated_insn" "" (const_int 0))
+
 ; LENGTH of an instruction (in bytes)
 (define_attr "length" ""
   (const_int 4))
diff --git a/gcc/config/arm/mve.md b/gcc/config/arm/mve.md
index 62186f124da183fe1b1eb57a1aea1e8fff680a22..b1c8c1c569f31a6cb1bfdc16394047f02d6cddf4 100644
--- a/gcc/config/arm/mve.md
+++ b/gcc/config/arm/mve.md
@@ -142,7 +142,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vrintzt.f%#<V_sz_elem> %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -818,7 +819,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vaddlvt.<supf>32 %Q0, %R0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddlvq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -910,7 +912,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vaddvt.<supf>%#<V_sz_elem>	%0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddvq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2560,7 +2563,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vbict.i%#<V_sz_elem>	%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbicq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
 ;; [vcmpeqq_m_f])
@@ -2575,7 +2579,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmpt.f%#<V_sz_elem>	eq, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpeqq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
 ;; [vcvtaq_m_u, vcvtaq_m_s])
@@ -2590,7 +2595,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtat.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtaq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
 ;; [vcvtq_m_to_f_s, vcvtq_m_to_f_u])
@@ -2605,7 +2611,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtt.f%#<V_sz_elem>.<supf>%#<V_sz_elem>	 %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_to_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
 ;; [vqrshrnbq_n_u, vqrshrnbq_n_s])
@@ -2727,7 +2734,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vabst.s%#<V_sz_elem>	%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vabsq_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2743,7 +2751,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vaddvat.<supf>%#<V_sz_elem>	%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddvaq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2759,7 +2768,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vclst.s%#<V_sz_elem>	%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vclsq_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2775,7 +2785,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vclzt.i%#<V_sz_elem>	%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vclzq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2791,7 +2802,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.u%#<V_sz_elem>	cs, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpcsq_n_<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2807,7 +2819,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.u%#<V_sz_elem>	cs, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpcsq_<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2823,7 +2836,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.i%#<V_sz_elem>	eq, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpeqq_n_<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2839,7 +2853,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.i%#<V_sz_elem>	eq, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpeqq_<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2855,7 +2870,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.s%#<V_sz_elem>	ge, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpgeq_n_<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2871,7 +2887,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.s%#<V_sz_elem>	ge, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpgeq_<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2887,7 +2904,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.s%#<V_sz_elem>	gt, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpgtq_n_<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2903,7 +2921,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.s%#<V_sz_elem>	gt, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpgtq_<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2919,7 +2938,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.u%#<V_sz_elem>	hi, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmphiq_n_<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2935,7 +2955,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.u%#<V_sz_elem>	hi, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmphiq_<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2951,7 +2972,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.s%#<V_sz_elem>	le, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpleq_n_<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2967,7 +2989,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.s%#<V_sz_elem>	le, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpleq_<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2983,7 +3006,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.s%#<V_sz_elem>	lt, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpltq_n_<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2999,7 +3023,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.s%#<V_sz_elem>	lt, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpltq_<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3015,7 +3040,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.i%#<V_sz_elem>	ne, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpneq_n_<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3031,7 +3057,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.i%#<V_sz_elem>	ne, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpneq_<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3047,7 +3074,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vdupt.%#<V_sz_elem>	%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vdupq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3063,7 +3091,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmaxat.s%#<V_sz_elem>	%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxaq_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3079,7 +3108,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmaxavt.s%#<V_sz_elem>	%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxavq_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3095,7 +3125,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmaxvt.<supf>%#<V_sz_elem>	%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxvq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3111,7 +3142,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vminat.s%#<V_sz_elem>	%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminaq_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3127,7 +3159,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vminavt.s%#<V_sz_elem>	%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminavq_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3143,7 +3176,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vminvt.<supf>%#<V_sz_elem>\t%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminvq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3175,7 +3209,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmladavt.<supf>%#<V_sz_elem>\t%0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmladavq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3191,7 +3226,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmladavxt.s%#<V_sz_elem>\t%0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmladavxq_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3239,7 +3275,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmlsdavt.s%#<V_sz_elem>	%0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsdavq_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3255,7 +3292,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmlsdavxt.s%#<V_sz_elem>	%0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsdavxq_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3271,7 +3309,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmvnt %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmvnq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3287,7 +3326,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vnegt.s%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vnegq_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3319,7 +3359,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqabst.s%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqabsq_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3367,7 +3408,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqnegt.s%#<V_sz_elem>	%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqnegq_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3479,7 +3521,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqrshlt.<supf>%#<V_sz_elem>	%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshlq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3495,7 +3538,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqshlt.<supf>%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshlq_r_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3511,7 +3555,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vrev64t.%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrev64q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3527,7 +3572,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vrshlt.<supf>%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrshlq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3543,7 +3589,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vshlt.<supf>%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlq_r_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3702,7 +3749,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vabst.f%#<V_sz_elem>	%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vabsq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3718,7 +3766,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vaddlvat.<supf>32 %Q0, %R0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddlvaq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
 ;; [vcmlaq, vcmlaq_rot90, vcmlaq_rot180, vcmlaq_rot270])
@@ -3752,7 +3801,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmpt.f%#<V_sz_elem>	eq, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpeqq_n_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3768,7 +3818,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmpt.f%#<V_sz_elem>	ge, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpgeq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3784,7 +3835,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmpt.f%#<V_sz_elem>	ge, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpgeq_n_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3800,7 +3852,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmpt.f%#<V_sz_elem>	gt, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpgtq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3816,7 +3869,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmpt.f%#<V_sz_elem>	gt, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpgtq_n_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3832,7 +3886,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmpt.f%#<V_sz_elem>	le, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpleq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3848,7 +3903,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmpt.f%#<V_sz_elem>	le, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpleq_n_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3864,7 +3920,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmpt.f%#<V_sz_elem>	lt, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpltq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3880,7 +3937,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmpt.f%#<V_sz_elem>	lt, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpltq_n_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3896,7 +3954,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmpt.f%#<V_sz_elem>	ne, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpneq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3912,7 +3971,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmpt.f%#<V_sz_elem>	ne, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpneq_n_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3928,7 +3988,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtbt.f16.f32 %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f16_f32v8hf"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3944,7 +4005,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtbt.f32.f16 %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f32_f16v4sf"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3960,7 +4022,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvttt.f16.f32 %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f16_f32v8hf"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3976,7 +4039,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvttt.f32.f16 %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f32_f16v4sf"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3992,7 +4056,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vdupt.%#<V_sz_elem>	%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vdupq_n_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -4071,7 +4136,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vmaxnmat.f%#<V_sz_elem>	%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxnmaq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
 ;; [vmaxnmavq_p_f])
@@ -4086,7 +4152,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vmaxnmavt.f%#<V_sz_elem>	%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxnmavq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -4102,7 +4169,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vmaxnmvt.f%#<V_sz_elem>	%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxnmvq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
 ;; [vminnmaq_m_f])
@@ -4117,7 +4185,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vminnmat.f%#<V_sz_elem>	%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminnmaq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -4133,7 +4202,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vminnmavt.f%#<V_sz_elem>	%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminnmavq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
 ;; [vminnmvq_p_f])
@@ -4148,7 +4218,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vminnmvt.f%#<V_sz_elem>	%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminnmvq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -4196,7 +4267,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmlaldavt.<supf>%#<V_sz_elem> %Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlaldavq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -4212,7 +4284,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmlaldavxt.s%#<V_sz_elem>\t%Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlaldavxq_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
 ;; [vmlsldavaq_s])
@@ -4259,7 +4332,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmlsldavt.s%#<V_sz_elem> %Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsldavq_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -4275,7 +4349,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmlsldavxt.s%#<V_sz_elem> %Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsldavxq_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
 ;; [vmovlbq_m_u, vmovlbq_m_s])
@@ -4290,7 +4365,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmovlbt.<supf>%#<V_sz_elem>	%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmovlbq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
 ;; [vmovltq_m_u, vmovltq_m_s])
@@ -4305,7 +4381,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmovltt.<supf>%#<V_sz_elem>	%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmovltq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
 ;; [vmovnbq_m_u, vmovnbq_m_s])
@@ -4320,7 +4397,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmovnbt.i%#<V_sz_elem>	%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmovnbq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -4336,7 +4414,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmovntt.i%#<V_sz_elem>	%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmovntq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -4352,7 +4431,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmvnt.i%#<V_sz_elem>	%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmvnq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
 ;; [vnegq_m_f])
@@ -4367,7 +4447,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vnegt.f%#<V_sz_elem>	%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vnegq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -4383,7 +4464,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vorrt.i%#<V_sz_elem>	%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vorrq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
 ;; [vpselq_f])
@@ -4414,7 +4496,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqmovnbt.<supf>%#<V_sz_elem>	%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqmovnbq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -4430,7 +4513,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqmovntt.<supf>%#<V_sz_elem>	%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqmovntq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -4446,7 +4530,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqmovunbt.s%#<V_sz_elem>	%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqmovunbq_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -4462,7 +4547,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqmovuntt.s%#<V_sz_elem>	%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqmovuntq_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -4574,7 +4660,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vrev32t.16 %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrev32q_fv8hf"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -4590,7 +4677,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vrev32t.%#<V_sz_elem>	%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrev32q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -4606,7 +4694,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vrev64t.%#<V_sz_elem>	%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrev64q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -4638,7 +4727,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vrmlaldavhxt.s32 %Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlaldavhxq_sv4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -4670,7 +4760,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vrmlsldavht.s32 %Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlsldavhq_sv4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -4686,7 +4777,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vrmlsldavhxt.s32 %Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlsldavhxq_sv4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -4702,7 +4794,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vrintat.f%#<V_sz_elem>	%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndaq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -4718,7 +4811,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vrintmt.f%#<V_sz_elem>	%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndmq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -4734,7 +4828,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vrintnt.f%#<V_sz_elem>	%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndnq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -4750,7 +4845,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vrintpt.f%#<V_sz_elem>	%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndpq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -4766,7 +4862,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vrintxt.f%#<V_sz_elem>	%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndxq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -4846,7 +4943,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtmt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtmq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -4862,7 +4960,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtpt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtpq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -4878,7 +4977,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtnt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtnq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -4895,7 +4995,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_from_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -4911,7 +5012,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vrev16t.8 %q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrev16q_<supf>v16qi"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -4927,7 +5029,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_from_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -4943,7 +5046,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vrmlaldavht.<supf>32 %Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlaldavhq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -4976,7 +5080,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vabavt.<supf>%#<V_sz_elem>\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vabavq_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -4993,7 +5098,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\n\tvqshlut.s%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshluq_n_s<mode>"))
+  (set_attr "type" "mve_move")])
 
 ;;
 ;; [vshlq_m_s, vshlq_m_u])
@@ -5009,7 +5115,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vshlt.<supf>%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlq_<supf><mode>"))
+  (set_attr "type" "mve_move")])
 
 ;;
 ;; [vsriq_m_n_s, vsriq_m_n_u])
@@ -5025,7 +5132,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vsrit.%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsriq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")])
 
 ;;
 ;; [vsubq_m_u, vsubq_m_s])
@@ -5041,7 +5149,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vsubt.i%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsubq_<supf><mode>"))
+  (set_attr "type" "mve_move")])
 
 ;;
 ;; [vcvtq_m_n_to_f_u, vcvtq_m_n_to_f_s])
@@ -5057,7 +5166,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtt.f%#<V_sz_elem>.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_to_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
 ;; [vabdq_m_s, vabdq_m_u])
@@ -5073,7 +5183,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vabdt.<supf>%#<V_sz_elem>	%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vabdq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5090,7 +5201,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vaddt.i%#<V_sz_elem>	%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5107,7 +5219,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vaddt.i%#<V_sz_elem>	%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddq<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5124,7 +5237,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vandt %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vandq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5141,7 +5255,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vbict %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbicq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5158,7 +5273,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vbrsrt.%#<V_sz_elem>	%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbrsrq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5175,7 +5291,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcaddt.i%#<V_sz_elem>	%q0, %q2, %q3, #270"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcaddq_rot270<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5192,7 +5309,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcaddt.i%#<V_sz_elem>	%q0, %q2, %q3, #90"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcaddq_rot90<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5209,7 +5327,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;veort %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_veorq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5226,7 +5345,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vhaddt.<supf>%#<V_sz_elem>	%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhaddq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5243,7 +5363,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vhaddt.<supf>%#<V_sz_elem>	%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhaddq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5260,7 +5381,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vhsubt.<supf>%#<V_sz_elem>	%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhsubq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5277,7 +5399,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vhsubt.<supf>%#<V_sz_elem>	%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhsubq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5294,7 +5417,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmaxt.<supf>%#<V_sz_elem>	%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5311,7 +5435,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmint.<supf>%#<V_sz_elem>	%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5328,7 +5453,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmladavat.<supf>%#<V_sz_elem>	%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmladavaq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5345,7 +5471,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmlat.<supf>%#<V_sz_elem>	%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlaq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5362,7 +5489,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmlast.<supf>%#<V_sz_elem>	%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlasq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5379,7 +5507,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmulht.<supf>%#<V_sz_elem>	%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulhq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5396,7 +5525,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmullbt.<supf>%#<V_sz_elem>	%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmullbq_int_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5413,7 +5543,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmulltt.<supf>%#<V_sz_elem>	%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulltq_int_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5430,7 +5561,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmult.i%#<V_sz_elem>	%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5447,7 +5579,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmult.i%#<V_sz_elem>	%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5464,7 +5597,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vornt %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5481,7 +5615,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vorrt %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vorrq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5498,7 +5633,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqaddt.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqaddq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5515,7 +5651,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqaddt.<supf>%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqaddq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5532,7 +5669,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqdmlaht.s%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmlahq_n_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5549,7 +5687,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqdmlasht.s%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmlashq_n_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5566,7 +5705,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqrdmlaht.s%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmlahq_n_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5583,7 +5723,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqrdmlasht.s%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmlashq_n_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5600,7 +5741,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqrshlt.<supf>%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshlq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5617,7 +5759,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqshlt.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshlq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5634,7 +5777,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqshlt.<supf>%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshlq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5651,7 +5795,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqsubt.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqsubq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5668,7 +5813,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqsubt.<supf>%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqsubq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5685,7 +5831,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vrhaddt.<supf>%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrhaddq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5702,7 +5849,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vrmulht.<supf>%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmulhq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5719,7 +5867,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vrshlt.<supf>%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrshlq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5736,7 +5885,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vrshrt.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrshrq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5753,7 +5903,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vshlt.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5770,7 +5921,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vshrt.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshrq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5787,7 +5939,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vslit.%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsliq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5804,7 +5957,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vsubt.i%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsubq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5821,7 +5975,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vhcaddt.s%#<V_sz_elem>\t%q0, %q2, %q3, #270"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhcaddq_rot270_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5838,7 +5993,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vhcaddt.s%#<V_sz_elem>\t%q0, %q2, %q3, #90"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhcaddq_rot90_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5855,7 +6011,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmladavaxt.s%#<V_sz_elem>\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmladavaxq_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5872,7 +6029,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmlsdavat.s%#<V_sz_elem>\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsdavaq_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5889,7 +6047,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmlsdavaxt.s%#<V_sz_elem>\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsdavaxq_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5906,7 +6065,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqdmladht.s%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmladhq_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5923,7 +6083,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqdmladhxt.s%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmladhxq_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5940,7 +6101,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqdmlsdht.s%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmlsdhq_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5957,7 +6119,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqdmlsdhxt.s%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmlsdhxq_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5974,7 +6137,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqdmulht.s%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmulhq_n_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5991,7 +6155,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqdmulht.s%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmulhq_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6008,7 +6173,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqrdmladht.s%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmladhq_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6025,7 +6191,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqrdmladhxt.s%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmladhxq_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6042,7 +6209,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqrdmlsdht.s%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmlsdhq_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6059,7 +6227,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqrdmlsdhxt.s%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmlsdhxq_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6076,7 +6245,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqrdmulht.s%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmulhq_n_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6093,7 +6263,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqrdmulht.s%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmulhq_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6110,7 +6281,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmlaldavat.<supf>%#<V_sz_elem>	%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlaldavaq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6127,7 +6299,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmlaldavaxt.<supf>%#<V_sz_elem> %Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlaldavaxq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6144,7 +6317,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqrshrnbt.<supf>%#<V_sz_elem>	%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshrnbq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6161,7 +6335,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqrshrntt.<supf>%#<V_sz_elem>	%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshrntq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6178,7 +6353,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\n\tvqshrnbt.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshrnbq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6195,7 +6371,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqshrntt.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshrntq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6212,7 +6389,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vrmlaldavhat.s32\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlaldavhaq_sv4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6229,7 +6407,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vrshrnbt.i%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrshrnbq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6246,7 +6425,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vrshrntt.i%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrshrntq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6263,7 +6443,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vshllbt.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshllbq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6280,7 +6461,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vshlltt.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlltq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6297,7 +6479,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vshrnbt.i%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshrnbq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6314,7 +6497,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vshrntt.i%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshrntq_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6331,7 +6515,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmlsldavat.s%#<V_sz_elem>\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsldavaq_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6348,7 +6533,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmlsldavaxt.s%#<V_sz_elem>\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsldavaxq_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6365,7 +6551,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmullbt.p%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmullbq_poly_p<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6382,7 +6569,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmulltt.p%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulltq_poly_p<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6399,7 +6587,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqdmullbt.s%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmullbq_n_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6416,7 +6605,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqdmullbt.s%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmullbq_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6433,7 +6623,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqdmulltt.s%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmulltq_n_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6450,7 +6641,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqdmulltt.s%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmulltq_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6467,7 +6659,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqrshrunbt.s%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshrunbq_n_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6484,7 +6677,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqrshruntt.s%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshruntq_n_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6501,7 +6695,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqshrunbt.s%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshrunbq_n_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6518,7 +6713,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vqshruntt.s%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshruntq_n_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6535,7 +6731,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vrmlaldavhat.u32\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlaldavhaq_uv4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6552,7 +6749,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vrmlaldavhaxt.s32\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlaldavhaxq_sv4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6569,7 +6767,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vrmlsldavhat.s32\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlsldavhaq_sv4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6586,7 +6785,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vrmlsldavhaxt.s32\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlsldavhaxq_sv4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
 ;; [vabdq_m_f])
@@ -6602,7 +6802,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vabdt.f%#<V_sz_elem>	%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vabdq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6619,7 +6820,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vaddt.f%#<V_sz_elem>	%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6636,7 +6838,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vaddt.f%#<V_sz_elem>	%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddq_n_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6653,7 +6856,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vandt %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vandq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6670,7 +6874,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vbict %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbicq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6687,7 +6892,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vbrsrt.%#<V_sz_elem>	%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbrsrq_n_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6704,7 +6910,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcaddt.f%#<V_sz_elem>	%q0, %q2, %q3, #270"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcaddq_rot270<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6721,7 +6928,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcaddt.f%#<V_sz_elem>	%q0, %q2, %q3, #90"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcaddq_rot90<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6738,7 +6946,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmlat.f%#<V_sz_elem>	%q0, %q2, %q3, #0"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmlaq<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6755,7 +6964,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmlat.f%#<V_sz_elem>	%q0, %q2, %q3, #180"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmlaq_rot180<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6772,7 +6982,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmlat.f%#<V_sz_elem>	%q0, %q2, %q3, #270"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmlaq_rot270<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6789,7 +7000,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmlat.f%#<V_sz_elem>	%q0, %q2, %q3, #90"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmlaq_rot90<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6806,7 +7018,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmult.f%#<V_sz_elem>	%q0, %q2, %q3, #0"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmulq<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6823,7 +7036,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmult.f%#<V_sz_elem>	%q0, %q2, %q3, #180"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmulq_rot180<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6840,7 +7054,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmult.f%#<V_sz_elem>	%q0, %q2, %q3, #270"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmulq_rot270<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6857,7 +7072,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmult.f%#<V_sz_elem>	%q0, %q2, %q3, #90"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmulq_rot90<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6874,7 +7090,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;veort %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_veorq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6891,7 +7108,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vfmat.f%#<V_sz_elem>	%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vfmaq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6908,7 +7126,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vfmat.f%#<V_sz_elem>	%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vfmaq_n_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6925,7 +7144,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vfmast.f%#<V_sz_elem>	%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vfmasq_n_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6942,7 +7162,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vfmst.f%#<V_sz_elem>	%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vfmsq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6959,7 +7180,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vmaxnmt.f%#<V_sz_elem>	%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxnmq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6976,7 +7198,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vminnmt.f%#<V_sz_elem>	%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminnmq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -6993,7 +7216,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vmult.f%#<V_sz_elem>	%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -7010,7 +7234,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vmult.f%#<V_sz_elem>	%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulq_n_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -7027,7 +7252,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vornt %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -7044,7 +7270,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vorrt %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vorrq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -7061,7 +7288,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vsubt.f%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsubq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -7078,7 +7306,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vsubt.f%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsubq_n_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -7245,7 +7474,8 @@
 	  VSTRBSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrbt.<V_sz_elem>\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_scatter_offset_<supf><mode>_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_base_p_s vstrwq_scatter_base_p_u]
@@ -7268,7 +7498,8 @@
    output_asm_insn ("vpst\n\tvstrwt.u32\t%q2, [%q0, %1]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_<supf>v4si"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrbq_p_s vstrbq_p_u]
@@ -7288,7 +7519,8 @@
    output_asm_insn ("vpst\;vstrbt.<V_sz_elem>\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrbq_gather_offset_z_s vldrbq_gather_offset_z_u]
@@ -7313,7 +7545,8 @@
      output_asm_insn ("vpst\n\tvldrbt.<supf><V_sz_elem>\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_gather_offset_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrbq_z_s vldrbq_z_u]
@@ -7336,7 +7569,8 @@
      output_asm_insn ("vpst\;vldrbt.<supf><V_sz_elem>\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_base_z_s vldrwq_gather_base_z_u]
@@ -7357,7 +7591,8 @@
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%q1, %2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_<supf>v4si"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_f]
@@ -7424,7 +7659,8 @@
      output_asm_insn ("vpst\n\tvldrht.<supf><V_sz_elem>\t%q0, [%m1, %q2]",ops);
    return "";
 }
- [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_gather_shifted_offset_s vldrhq_gather_shifted_offset_u]
@@ -7472,7 +7708,8 @@
      output_asm_insn ("vpst\n\tvldrht.<supf><V_sz_elem>\t%q0, [%m1, %q2, uxtw #1]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_s, vldrhq_u]
@@ -7514,7 +7751,8 @@
    output_asm_insn ("vpst\;vldrht.16\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_fv8hf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_z_s vldrhq_z_u]
@@ -7537,7 +7775,8 @@
      output_asm_insn ("vpst\;vldrht.<supf><V_sz_elem>\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_f]
@@ -7595,7 +7834,8 @@
    output_asm_insn ("vpst\;vldrwt.32\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_z_s vldrwq_z_u]
@@ -7615,7 +7855,8 @@
    output_asm_insn ("vpst\;vldrwt.32\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_<supf>v4si"))
+  (set_attr "length" "8")])
 
 (define_expand "mve_vld1q_f<mode>"
   [(match_operand:MVE_0 0 "s_register_operand")
@@ -7676,7 +7917,8 @@
    output_asm_insn ("vpst\n\tvldrdt.u64\t%q0, [%q1, %2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_<supf>v2di"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrdq_gather_offset_s vldrdq_gather_offset_u]
@@ -7717,7 +7959,8 @@
   output_asm_insn ("vpst\n\tvldrdt.u64\t%q0, [%m1, %q2]",ops);
   return "";
 }
- [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_offset_<supf>v2di"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrdq_gather_shifted_offset_s vldrdq_gather_shifted_offset_u]
@@ -7758,7 +8001,8 @@
    output_asm_insn ("vpst\n\tvldrdt.u64\t%q0, [%m1, %q2, uxtw #3]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_shifted_offset_<supf>v2di"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_gather_offset_f]
@@ -7800,7 +8044,8 @@
    output_asm_insn ("vpst\n\tvldrht.f16\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_fv8hf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_gather_shifted_offset_f]
@@ -7842,7 +8087,8 @@
    output_asm_insn ("vpst\n\tvldrht.f16\t%q0, [%m1, %q2, uxtw #1]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_fv8hf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_base_f]
@@ -7883,7 +8129,8 @@
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%q1, %2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_offset_f]
@@ -7945,7 +8192,8 @@
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_offset_z_s vldrwq_gather_offset_z_u]
@@ -7967,7 +8215,8 @@
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_<supf>v4si"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_shifted_offset_f]
@@ -8029,7 +8278,8 @@
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2, uxtw #2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_shifted_offset_z_s vldrwq_gather_shifted_offset_z_u]
@@ -8051,7 +8301,8 @@
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2, uxtw #2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_<supf>v4si"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_f]
@@ -8090,7 +8341,8 @@
    output_asm_insn ("vpst\;vstrht.16\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_fv8hf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_p_s vstrhq_p_u]
@@ -8110,7 +8362,8 @@
    output_asm_insn ("vpst\;vstrht.<V_sz_elem>\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_scatter_offset_p_s vstrhq_scatter_offset_p_u]
@@ -8142,7 +8395,8 @@
 	  VSTRHSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrht.<V_sz_elem>\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset_<supf><mode>_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_scatter_offset_s vstrhq_scatter_offset_u]
@@ -8202,7 +8456,8 @@
 	  VSTRHSSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrht.<V_sz_elem>\t%q2, [%0, %q1, uxtw #1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset_<supf><mode>_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_scatter_shifted_offset_s vstrhq_scatter_shifted_offset_u]
@@ -8289,7 +8544,8 @@
    output_asm_insn ("vpst\;vstrwt.32\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_p_s vstrwq_p_u]
@@ -8309,7 +8565,8 @@
    output_asm_insn ("vpst\;vstrwt.32\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_<supf>v4si"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_s vstrwq_u]
@@ -8371,7 +8628,8 @@
    output_asm_insn ("vpst\;\tvstrdt.u64\t%q2, [%q0, %1]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_<supf>v2di"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrdq_scatter_base_s vstrdq_scatter_base_u]
@@ -8424,7 +8682,8 @@
 	  VSTRDSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrdt.64\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_offset_<supf>v2di_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrdq_scatter_offset_s vstrdq_scatter_offset_u]
@@ -8484,7 +8743,8 @@
 	  VSTRDSSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrdt.64\t%q2, [%0, %q1, UXTW #3]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_shifted_offset_<supf>v2di_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrdq_scatter_shifted_offset_s vstrdq_scatter_shifted_offset_u]
@@ -8572,7 +8832,8 @@
 	  VSTRHQSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vstrht.16\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset_fv8hf_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_scatter_shifted_offset_f]
@@ -8632,7 +8893,8 @@
 	  VSTRHQSSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vstrht.16\t%q2, [%0, %q1, uxtw #1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset_fv8hf_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_base_f]
@@ -8677,7 +8939,8 @@
    output_asm_insn ("vpst\n\tvstrwt.u32\t%q2, [%q0, %1]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_offset_f]
@@ -8736,7 +8999,8 @@
 	  VSTRWQSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vstrwt.32\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_fv4sf_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_offset_s vstrwq_scatter_offset_u]
@@ -8767,7 +9031,8 @@
 	  VSTRWSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrwt.32\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_<supf>v4si_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_offset_s vstrwq_scatter_offset_u]
@@ -8855,7 +9120,8 @@
 	  VSTRWQSSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vstrwt.32\t%q2, [%0, %q1, uxtw #2]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_fv4sf_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_shifted_offset_p_s vstrwq_scatter_shifted_offset_p_u]
@@ -8887,7 +9153,8 @@
 	  VSTRWSSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrwt.32\t%q2, [%0, %q1, uxtw #2]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_<supf>v4si_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_shifted_offset_s vstrwq_scatter_shifted_offset_u]
@@ -9012,7 +9279,8 @@
 		(match_operand:SI 6 "immediate_operand" "i")))]
  "TARGET_HAVE_MVE"
  "vpst\;\tvidupt.u%#<V_sz_elem>\t%q0, %2, %4"
- [(set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vidupq_u<mode>_insn"))
+  (set_attr "length""8")])
 
 ;;
 ;; [vddupq_n_u])
@@ -9080,7 +9348,8 @@
 		 (match_operand:SI 6 "immediate_operand" "i")))]
  "TARGET_HAVE_MVE"
  "vpst\;\tvddupt.u%#<V_sz_elem>\t%q0, %2, %4"
- [(set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vddupq_u<mode>_insn"))
+  (set_attr "length""8")])
 
 ;;
 ;; [vdwdupq_n_u])
@@ -9196,7 +9465,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;\tvdwdupt.u%#<V_sz_elem>\t%q2, %3, %R4, %5"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vdwdupq_wb_u<mode>_insn"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -9313,7 +9583,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;\tviwdupt.u%#<V_sz_elem>\t%q2, %3, %R4, %5"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_viwdupq_wb_u<mode>_insn"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -9365,7 +9636,8 @@
    output_asm_insn ("vpst\;\tvstrwt.u32\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_<supf>v4si"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_base_wb_f]
@@ -9416,7 +9688,8 @@
    output_asm_insn ("vpst\;\tvstrwt.u32\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrdq_scatter_base_wb_s vstrdq_scatter_base_wb_u]
@@ -9467,7 +9740,8 @@
    output_asm_insn ("vpst;vstrdt.u64\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_wb_<supf>v2di"))
+  (set_attr "length" "8")])
 
 (define_expand "mve_vldrwq_gather_base_wb_<supf>v4si"
   [(match_operand:V4SI 0 "s_register_operand")
@@ -9575,7 +9849,8 @@
    output_asm_insn ("vpst\;vldrwt.u32\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_<supf>v4si_insn"))
+  (set_attr "length" "8")])
 
 (define_expand "mve_vldrwq_gather_base_wb_fv4sf"
   [(match_operand:V4SI 0 "s_register_operand")
@@ -9684,7 +9959,8 @@
    output_asm_insn ("vpst\;vldrwt.u32\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_fv4sf_insn"))
+  (set_attr "length" "8")])
 
 (define_expand "mve_vldrdq_gather_base_wb_<supf>v2di"
   [(match_operand:V2DI 0 "s_register_operand")
@@ -9809,7 +10085,8 @@
    output_asm_insn ("vpst\;vldrdt.u64\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_wb_<supf>v2di_insn"))
+  (set_attr "length" "8")])
 ;;
 ;; [vadciq_m_s, vadciq_m_u])
 ;;
@@ -9826,7 +10103,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vadcit.i32\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadciq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "8")])
 
 ;;
@@ -9862,7 +10140,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vadct.i32\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadcq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "8")])
 
 ;;
@@ -9899,7 +10178,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vsbcit.i32\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbciq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "8")])
 
 ;;
@@ -9935,7 +10215,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vsbct.i32\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbcq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "8")])
 
 ;;
@@ -10352,7 +10633,8 @@
  ]
  "TARGET_HAVE_MVE"
  "vpst\;vshlct\t%q0, %1, %4"
- [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlcq_<supf><mode>"))
+  (set_attr "type" "mve_move")
   (set_attr "length" "8")])
 
 ;; CDE instructions on MVE registers.
@@ -10435,7 +10717,8 @@
 	 CDE_VCX))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vpst\;vcx1<a>t\\tp%c1, %q0, #%c3"
-  [(set_attr "type" "coproc")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx1q<a>v16qi"))
+  (set_attr "type" "coproc")
    (set_attr "length" "8")]
 )
 
@@ -10449,7 +10732,8 @@
 	 CDE_VCX))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vpst\;vcx2<a>t\\tp%c1, %q0, %q3, #%c4"
-  [(set_attr "type" "coproc")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx2q<a>v16qi"))
+  (set_attr "type" "coproc")
    (set_attr "length" "8")]
 )
 
@@ -10464,7 +10748,8 @@
 	 CDE_VCX))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vpst\;vcx3<a>t\\tp%c1, %q0, %q3, %q4, #%c5"
-  [(set_attr "type" "coproc")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx3q<a>v16qi"))
+  (set_attr "type" "coproc")
    (set_attr "length" "8")]
 )
 
diff --git a/gcc/testsuite/gcc.target/arm/dlstp-compile-asm.c b/gcc/testsuite/gcc.target/arm/dlstp-compile-asm.c
new file mode 100644
index 0000000000000000000000000000000000000000..ec6ee774cbda6604c4c24b57cd4d5d3bd08e07cd
--- /dev/null
+++ b/gcc/testsuite/gcc.target/arm/dlstp-compile-asm.c
@@ -0,0 +1,82 @@
+/* { dg-do compile { target { arm*-*-* } } } */
+/* { dg-require-effective-target arm_v8_1m_mve_ok } */
+/* { dg-skip-if "avoid conflicting multilib options" { *-*-* } { "-marm" "-mcpu=*" } } */
+/* { dg-options "-march=armv8.1-m.main+fp.dp+mve.fp -O3" } */
+
+#include <arm_mve.h>
+
+#define IMM 5
+
+#define TEST_COMPILE_IN_DLSTP_TERNARY(BITS, LANES, LDRSTRYTPE, TYPE, SIGN, NAME, PRED)				\
+void test_##NAME##PRED##_##SIGN##BITS (TYPE##BITS##_t *a, TYPE##BITS##_t *b,  TYPE##BITS##_t *c, int n)	\
+{											\
+  while (n > 0)										\
+    {											\
+      mve_pred16_t p = vctp##BITS##q (n);						\
+      TYPE##BITS##x##LANES##_t va = vldr##LDRSTRYTPE##q_z_##SIGN##BITS (a, p);		\
+      TYPE##BITS##x##LANES##_t vb = vldr##LDRSTRYTPE##q_z_##SIGN##BITS (b, p);		\
+      TYPE##BITS##x##LANES##_t vc = NAME##PRED##_##SIGN##BITS (va, vb, p);		\
+      vstr##LDRSTRYTPE##q_p_##SIGN##BITS (c, vc, p);					\
+      c += LANES;									\
+      a += LANES;									\
+      b += LANES;									\
+      n -= LANES;									\
+    }											\
+}
+
+#define TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY(BITS, LANES, LDRSTRYTPE, NAME, PRED)	\
+TEST_COMPILE_IN_DLSTP_TERNARY (BITS, LANES, LDRSTRYTPE, int, s, NAME, PRED)			\
+TEST_COMPILE_IN_DLSTP_TERNARY (BITS, LANES, LDRSTRYTPE, uint, u, NAME, PRED)
+
+#define TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY(NAME, PRED)			\
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY (8, 16, b, NAME, PRED)				\
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY (16, 8, h, NAME, PRED)				\
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY (32, 4, w, NAME, PRED)
+
+
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY (vaddq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY (vorrq, _x)
+
+
+#define TEST_COMPILE_IN_DLSTP_TERNARY_N(BITS, LANES, LDRSTRYTPE, TYPE, SIGN, NAME, PRED)	\
+void test_##NAME##PRED##_n_##SIGN##BITS (TYPE##BITS##_t *a,  TYPE##BITS##_t *c, int n)	\
+{											\
+  while (n > 0)										\
+    {											\
+      mve_pred16_t p = vctp##BITS##q (n);						\
+      TYPE##BITS##x##LANES##_t va = vldr##LDRSTRYTPE##q_z_##SIGN##BITS (a, p);		\
+      TYPE##BITS##x##LANES##_t vc = NAME##PRED##_n_##SIGN##BITS (va, IMM, p);		\
+      vstr##LDRSTRYTPE##q_p_##SIGN##BITS (c, vc, p);					\
+      c += LANES;									\
+      a += LANES;									\
+      n -= LANES;									\
+    }											\
+}
+
+#define TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_N(BITS, LANES, LDRSTRYTPE, NAME, PRED)	\
+TEST_COMPILE_IN_DLSTP_TERNARY_N (BITS, LANES, LDRSTRYTPE, int, s, NAME, PRED)			\
+TEST_COMPILE_IN_DLSTP_TERNARY_N (BITS, LANES, LDRSTRYTPE, uint, u, NAME, PRED)
+
+#define TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N(NAME, PRED)			\
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_N (8, 16, b, NAME, PRED)				\
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_N (16, 8, h, NAME, PRED)				\
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_N (32, 4, w, NAME, PRED)
+
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vaddq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vmulq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vsubq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vhaddq, _x)
+
+
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vbrsrq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vshlq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vshrq, _x)
+
+
+/* The expected number of DLSTPs is currently the number of
+  `TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY.*` macro uses * 6.  */
+/* { dg-final { scan-assembler-times {\tdlstp} 54 } } */
+/* { dg-final { scan-assembler-times {\tletp} 54 } } */
+/* { dg-final { scan-assembler-not "\tvctp\t" } } */
+/* { dg-final { scan-assembler-not "\tvpst\t" } } */
+/* { dg-final { scan-assembler-not "P0" } } */
\ No newline at end of file

^ permalink raw reply	[flat|nested] 7+ messages in thread

* Re: [PATCH 1/2] arm: Add define_attr to to create a mapping between MVE predicated and unpredicated insns
  2023-12-18 11:53 ` [PATCH 1/2] arm: Add define_attr to to create a mapping between MVE predicated and unpredicated insns Andre Vieira
@ 2023-12-20 16:54   ` Andre Vieira (lists)
  0 siblings, 0 replies; 7+ messages in thread
From: Andre Vieira (lists) @ 2023-12-20 16:54 UTC (permalink / raw)
  To: gcc-patches; +Cc: Richard.Earnshaw, Stam Markianos-Wright

[-- Attachment #1: Type: text/plain, Size: 377 bytes --]

Reworked patch after Richard's comments and moved 
predicated_doloop_end_internal and dlstp*_insn to the next patch in the 
series to make sure this one builds on its own.
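
For reference, a rough sketch (not part of the patch) of how a later pass
might query the new attribute through the arm.h macros added here; the
helper name below is made up purely for illustration:

```c
/* Illustrative only: return the CODE_FOR_* of the unpredicated twin of a
   VPT-predicated insn, or CODE_FOR_nothing if the insn is not an MVE
   predicated insn.  Relies on the MVE_VPT_* macros and the
   mve_unpredicated_insn attribute from this patch; as noted in arm.h,
   the get_attr_* call may clobber recog_data.  */
static int
mve_unpredicated_twin_code (rtx_insn *insn)
{
  if (!MVE_VPT_PREDICATED_INSN_P (insn))
    return CODE_FOR_nothing;
  return get_attr_mve_unpredicated_insn (insn);
}
```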

On 18/12/2023 11:53, Andre Vieira wrote:
> 
> Re-sending Stam's first patch, same as:
> https://gcc.gnu.org/pipermail/gcc-patches/2023-November/635301.html
> 
> Hopefully patchworks can pick this up :)
> 

[-- Attachment #2: 0001-arm-Add-define_attr-to-to-create-a-mapping-between-M_v2.patch --]
[-- Type: text/plain, Size: 103870 bytes --]

diff --git a/gcc/config/arm/arm.h b/gcc/config/arm/arm.h
index a9c2752c0ea5ecd4597ded254e9426753ac0a098..f0b01b7461f883994a0be137cb6cbf079d54618b 100644
--- a/gcc/config/arm/arm.h
+++ b/gcc/config/arm/arm.h
@@ -2375,6 +2375,21 @@ extern int making_const_table;
   else if (TARGET_THUMB1)				\
     thumb1_final_prescan_insn (INSN)
 
+/* These defines are useful to refer to the value of the mve_unpredicated_insn
+   insn attribute.  Note that, because these use the get_attr_* function, these
+   will change recog_data if (INSN) isn't current_insn.  */
+#define MVE_VPT_PREDICABLE_INSN_P(INSN)					\
+  (recog_memoized (INSN) >= 0						\
+   && get_attr_mve_unpredicated_insn (INSN) != CODE_FOR_nothing)
+
+#define MVE_VPT_PREDICATED_INSN_P(INSN)					\
+  (MVE_VPT_PREDICABLE_INSN_P (INSN)					\
+   && recog_memoized (INSN) != get_attr_mve_unpredicated_insn (INSN))
+
+#define MVE_VPT_UNPREDICATED_INSN_P(INSN)				\
+  (MVE_VPT_PREDICABLE_INSN_P (INSN)					\
+   && recog_memoized (INSN) == get_attr_mve_unpredicated_insn (INSN))
+
 #define ARM_SIGN_EXTEND(x)  ((HOST_WIDE_INT)			\
   (HOST_BITS_PER_WIDE_INT <= 32 ? (unsigned HOST_WIDE_INT) (x)	\
    : ((((unsigned HOST_WIDE_INT)(x)) & (unsigned HOST_WIDE_INT) 0xffffffff) |\
diff --git a/gcc/config/arm/arm.md b/gcc/config/arm/arm.md
index 07eaf06cdeace750fe1c7d399deb833ef5fc2b66..296212be33ffe6397b05491d8854d2a59f7c54df 100644
--- a/gcc/config/arm/arm.md
+++ b/gcc/config/arm/arm.md
@@ -124,6 +124,12 @@ (define_attr "fpu" "none,vfp"
 ; and not all ARM insns do.
 (define_attr "predicated" "yes,no" (const_string "no"))
 
+; An attribute that encodes the CODE_FOR_<insn> of the MVE VPT unpredicated
+; version of a VPT-predicated instruction.  For unpredicated instructions
+; that are predicable, record the pattern's own CODE_FOR_<insn> as a way of
+; marking that the instruction is predicable.
+(define_attr "mve_unpredicated_insn" "" (symbol_ref "CODE_FOR_nothing"))
+
 ; LENGTH of an instruction (in bytes)
 (define_attr "length" ""
   (const_int 4))
diff --git a/gcc/config/arm/iterators.md b/gcc/config/arm/iterators.md
index a980353810166312d5bdfc8ad58b2825c910d0a0..5ea2d9e866891bdb3dc73fcf6cbd6cdd2f989951 100644
--- a/gcc/config/arm/iterators.md
+++ b/gcc/config/arm/iterators.md
@@ -2305,6 +2305,7 @@ (define_int_attr simd32_op [(UNSPEC_QADD8 "qadd8") (UNSPEC_QSUB8 "qsub8")
 
 (define_int_attr mmla_sfx [(UNSPEC_MATMUL_S "s8") (UNSPEC_MATMUL_U "u8")
 			   (UNSPEC_MATMUL_US "s8")])
+
 ;;MVE int attribute.
 (define_int_attr supf [(VCVTQ_TO_F_S "s") (VCVTQ_TO_F_U "u") (VREV16Q_S "s")
 		       (VREV16Q_U "u") (VMVNQ_N_S "s") (VMVNQ_N_U "u")
diff --git a/gcc/config/arm/mve.md b/gcc/config/arm/mve.md
index b0d3443da9cee991193d390200738290806a1e69..b1862d7977e91605cd971e634105bed3fa6e75cb 100644
--- a/gcc/config/arm/mve.md
+++ b/gcc/config/arm/mve.md
@@ -17,7 +17,7 @@
 ;; along with GCC; see the file COPYING3.  If not see
 ;; <http://www.gnu.org/licenses/>.
 
-(define_insn "*mve_mov<mode>"
+(define_insn "mve_mov<mode>"
   [(set (match_operand:MVE_types 0 "nonimmediate_operand" "=w,w,r,w   , w,   r,Ux,w")
 	(match_operand:MVE_types 1 "general_operand"      " w,r,w,DnDm,UxUi,r,w, Ul"))]
   "TARGET_HAVE_MVE || TARGET_HAVE_MVE_FLOAT"
@@ -81,18 +81,27 @@ (define_insn "*mve_mov<mode>"
       return "";
     }
 }
-  [(set_attr "type" "mve_move,mve_move,mve_move,mve_move,mve_load,multiple,mve_store,mve_load")
+   [(set_attr_alternative "mve_unpredicated_insn" [(symbol_ref "CODE_FOR_mve_mov<mode>")
+						   (symbol_ref "CODE_FOR_nothing")
+						   (symbol_ref "CODE_FOR_nothing")
+						   (symbol_ref "CODE_FOR_mve_mov<mode>")
+						   (symbol_ref "CODE_FOR_mve_mov<mode>")
+						   (symbol_ref "CODE_FOR_nothing")
+						   (symbol_ref "CODE_FOR_mve_mov<mode>")
+						   (symbol_ref "CODE_FOR_nothing")])
+   (set_attr "type" "mve_move,mve_move,mve_move,mve_move,mve_load,multiple,mve_store,mve_load")
    (set_attr "length" "4,8,8,4,4,8,4,8")
    (set_attr "thumb2_pool_range" "*,*,*,*,1018,*,*,*")
    (set_attr "neg_pool_range" "*,*,*,*,996,*,*,*")])
 
-(define_insn "*mve_vdup<mode>"
+(define_insn "mve_vdup<mode>"
   [(set (match_operand:MVE_vecs 0 "s_register_operand" "=w")
 	(vec_duplicate:MVE_vecs
 	  (match_operand:<V_elem> 1 "s_register_operand" "r")))]
   "TARGET_HAVE_MVE || TARGET_HAVE_MVE_FLOAT"
   "vdup.<V_sz_elem>\t%q0, %1"
-  [(set_attr "length" "4")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vdup<mode>"))
+  (set_attr "length" "4")
    (set_attr "type" "mve_move")])
 
 ;;
@@ -145,7 +154,8 @@ (define_insn "@mve_<mve_insn>q_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_mnemo>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -159,7 +169,8 @@ (define_insn "@mve_<mve_insn>q_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -173,7 +184,8 @@ (define_insn "mve_v<absneg_str>q_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "v<absneg_str>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_v<absneg_str>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -187,7 +199,8 @@ (define_insn "@mve_<mve_insn>q_n_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -201,7 +214,8 @@ (define_insn "@mve_<mve_insn>q_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 ;;
 ;; [vcvttq_f32_f16])
@@ -214,7 +228,8 @@ (define_insn "mve_vcvttq_f32_f16v4sf"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtt.f32.f16\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f32_f16v4sf"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -228,7 +243,8 @@ (define_insn "mve_vcvtbq_f32_f16v4sf"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtb.f32.f16\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f32_f16v4sf"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -242,7 +258,8 @@ (define_insn "mve_vcvtq_to_f_<supf><mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvt.f%#<V_sz_elem>.<supf>%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_to_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -256,7 +273,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -270,7 +288,8 @@ (define_insn "mve_vcvtq_from_f_<supf><mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_from_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -284,7 +303,8 @@ (define_insn "mve_v<absneg_str>q_s<mode>"
   ]
   "TARGET_HAVE_MVE"
   "v<absneg_str>.s%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_v<absneg_str>q_s<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -297,7 +317,8 @@ (define_insn "mve_vmvnq_u<mode>"
   ]
   "TARGET_HAVE_MVE"
   "vmvn\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmvnq_u<mode>"))
+  (set_attr "type" "mve_move")
 ])
 (define_expand "mve_vmvnq_s<mode>"
   [
@@ -318,7 +339,8 @@ (define_insn "@mve_<mve_insn>q_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -331,7 +353,8 @@ (define_insn "@mve_vclzq_s<mode>"
   ]
   "TARGET_HAVE_MVE"
   "vclz.i%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vclzq_s<mode>"))
+  (set_attr "type" "mve_move")
 ])
 (define_expand "mve_vclzq_u<mode>"
   [
@@ -354,7 +377,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -368,7 +392,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -382,7 +407,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -397,7 +423,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -411,7 +438,8 @@ (define_insn "mve_vcvtpq_<supf><mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtp.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtpq_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -425,7 +453,8 @@ (define_insn "mve_vcvtnq_<supf><mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtn.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtnq_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -439,7 +468,8 @@ (define_insn "mve_vcvtmq_<supf><mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtm.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtmq_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -453,7 +483,8 @@ (define_insn "mve_vcvtaq_<supf><mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvta.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtaq_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -467,7 +498,8 @@ (define_insn "@mve_<mve_insn>q_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.i%#<V_sz_elem>\t%q0, %1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -481,7 +513,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -495,7 +528,8 @@ (define_insn "@mve_<mve_insn>q_<supf>v4si"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>32\t%Q0, %R0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -509,7 +543,8 @@ (define_insn "mve_vctp<MVE_vctp>q<MVE_vpred>"
   ]
   "TARGET_HAVE_MVE"
   "vctp.<MVE_vctp>\t%1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vctp<MVE_vctp>q<MVE_vpred>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -523,7 +558,8 @@ (define_insn "mve_vpnotv16bi"
   ]
   "TARGET_HAVE_MVE"
   "vpnot"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vpnotv16bi"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -538,7 +574,8 @@ (define_insn "@mve_<mve_insn>q_n_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -553,7 +590,8 @@ (define_insn "mve_vcvtq_n_to_f_<supf><mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvt.f<V_sz_elem>.<supf><V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_to_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;; [vcreateq_f])
@@ -599,7 +637,8 @@ (define_insn "@mve_<mve_insn>q_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf><V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;; Versions that take constant vectors as operand 2 (with all elements
@@ -617,7 +656,8 @@ (define_insn "mve_vshrq_n_s<mode>_imm"
 					VALID_NEON_QREG_MODE (<MODE>mode),
 					true);
   }
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshrq_n_s<mode>_imm"))
+  (set_attr "type" "mve_move")
 ])
 (define_insn "mve_vshrq_n_u<mode>_imm"
   [
@@ -632,7 +672,8 @@ (define_insn "mve_vshrq_n_u<mode>_imm"
 					VALID_NEON_QREG_MODE (<MODE>mode),
 					true);
   }
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshrq_n_u<mode>_imm"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -647,7 +688,8 @@ (define_insn "mve_vcvtq_n_from_f_<supf><mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvt.<supf><V_sz_elem>.f<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_from_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -662,8 +704,9 @@ (define_insn "@mve_<mve_insn>q_p_<supf>v4si"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>32\t%Q0, %R0, %q1"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [vcmpneq_, vcmpcsq_, vcmpeqq_, vcmpgeq_, vcmpgtq_, vcmphiq_, vcmpleq_, vcmpltq_])
@@ -676,7 +719,8 @@ (define_insn "@mve_vcmp<mve_cmp_op>q_<mode>"
   ]
   "TARGET_HAVE_MVE"
   "vcmp.<mve_cmp_type>%#<V_sz_elem>\t<mve_cmp_op>, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op>q_<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -691,7 +735,8 @@ (define_insn "@mve_vcmp<mve_cmp_op>q_n_<mode>"
   ]
   "TARGET_HAVE_MVE"
   "vcmp.<mve_cmp_type>%#<V_sz_elem>	<mve_cmp_op>, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op>q_n_<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -722,7 +767,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -739,7 +785,8 @@ (define_insn "@mve_<mve_insn>q_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.i%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -754,7 +801,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -769,7 +817,8 @@ (define_insn "@mve_<mve_insn>q_p_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -789,8 +838,11 @@ (define_insn "mve_vandq_u<mode>"
   "@
    vand\t%q0, %q1, %q2
    * return neon_output_logic_immediate (\"vand\", &operands[2], <MODE>mode, 1, VALID_NEON_QREG_MODE (<MODE>mode));"
-  [(set_attr "type" "mve_move")
+   [(set_attr_alternative "mve_unpredicated_insn" [(symbol_ref "CODE_FOR_mve_vandq_u<mode>")
+						   (symbol_ref "CODE_FOR_nothing")])
+  (set_attr "type" "mve_move")
 ])
+
 (define_expand "mve_vandq_s<mode>"
   [
    (set (match_operand:MVE_2 0 "s_register_operand")
@@ -811,7 +863,8 @@ (define_insn "mve_vbicq_u<mode>"
   ]
   "TARGET_HAVE_MVE"
   "vbic\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbicq_u<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 (define_expand "mve_vbicq_s<mode>"
@@ -835,7 +888,8 @@ (define_insn "@mve_<mve_insn>q_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -853,7 +907,8 @@ (define_insn "@mve_<mve_insn>q<mve_rot>_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<isu>%#<V_sz_elem>\t%q0, %q1, %q2, #<rot>"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q<mve_rot>_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;; Auto vectorizer pattern for int vcadd
@@ -876,7 +931,8 @@ (define_insn "mve_veorq_u<mode>"
   ]
   "TARGET_HAVE_MVE"
   "veor\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_veorq_u<mode>"))
+  (set_attr "type" "mve_move")
 ])
 (define_expand "mve_veorq_s<mode>"
   [
@@ -904,7 +960,8 @@ (define_insn "@mve_<mve_insn>q_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -920,7 +977,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.s%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -935,7 +993,8 @@ (define_insn "mve_<max_min_su_str>q_<max_min_supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<max_min_su_str>.<max_min_supf>%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<max_min_su_str>q_<max_min_supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 
@@ -954,7 +1013,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -972,7 +1032,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -988,7 +1049,8 @@ (define_insn "@mve_<mve_insn>q_int_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<isu>%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_int_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1004,7 +1066,8 @@ (define_insn "mve_<mve_addsubmul>q<mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_addsubmul>.i%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_addsubmul>q<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1018,7 +1081,8 @@ (define_insn "mve_vornq_s<mode>"
   ]
   "TARGET_HAVE_MVE"
    "vorn\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_s<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 (define_expand "mve_vornq_u<mode>"
@@ -1047,7 +1111,8 @@ (define_insn "mve_vorrq_s<mode>"
   "@
    vorr\t%q0, %q1, %q2
    * return neon_output_logic_immediate (\"vorr\", &operands[2], <MODE>mode, 0, VALID_NEON_QREG_MODE (<MODE>mode));"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vorrq_s<mode>"))
+  (set_attr "type" "mve_move")
 ])
 (define_expand "mve_vorrq_u<mode>"
   [
@@ -1071,7 +1136,8 @@ (define_insn "@mve_<mve_insn>q_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1087,7 +1153,8 @@ (define_insn "@mve_<mve_insn>q_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1103,7 +1170,8 @@ (define_insn "@mve_<mve_insn>q_r_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_r_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1118,7 +1186,8 @@ (define_insn "@mve_<mve_insn>q_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1133,7 +1202,8 @@ (define_insn "@mve_<mve_insn>q_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.f%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1148,7 +1218,8 @@ (define_insn "@mve_<mve_insn>q_<supf>v4si"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>32\t%Q0, %R0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1165,7 +1236,8 @@ (define_insn "@mve_<mve_insn>q_n_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.f%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1179,7 +1251,8 @@ (define_insn "mve_vandq_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vand\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vandq_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1193,7 +1266,8 @@ (define_insn "mve_vbicq_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vbic\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbicq_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1209,7 +1283,8 @@ (define_insn "@mve_<mve_insn>q<mve_rot>_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.f%#<V_sz_elem>\t%q0, %q1, %q2, #<rot>"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q<mve_rot>_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1223,7 +1298,8 @@ (define_insn "@mve_vcmp<mve_cmp_op>q_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcmp.f%#<V_sz_elem>	<mve_cmp_op>, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1238,7 +1314,8 @@ (define_insn "@mve_vcmp<mve_cmp_op>q_n_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcmp.f%#<V_sz_elem>	<mve_cmp_op>, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1253,8 +1330,10 @@ (define_insn "mve_vctp<MVE_vctp>q_m<MVE_vpred>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vctpt.<MVE_vctp>\t%1"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vctp<MVE_vctp>q<MVE_vpred>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")
+])
 
 ;;
 ;; [vcvtbq_f16_f32])
@@ -1268,7 +1347,8 @@ (define_insn "mve_vcvtbq_f16_f32v8hf"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtb.f16.f32\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f16_f32v8hf"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1283,7 +1363,8 @@ (define_insn "mve_vcvttq_f16_f32v8hf"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtt.f16.f32\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f16_f32v8hf"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1297,7 +1378,8 @@ (define_insn "mve_veorq_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "veor\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_veorq_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1313,7 +1395,8 @@ (define_insn "@mve_<mve_insn>q_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1331,7 +1414,8 @@ (define_insn "@mve_<mve_insn>q_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.f%#<V_sz_elem>\t%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1346,7 +1430,8 @@ (define_insn "@mve_<max_min_f_str>q_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<max_min_f_str>.f%#<V_sz_elem>	%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<max_min_f_str>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1364,7 +1449,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1384,7 +1470,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<isu>%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1400,7 +1487,8 @@ (define_insn "mve_<mve_addsubmul>q_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_addsubmul>.f%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_addsubmul>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1414,7 +1502,8 @@ (define_insn "mve_vornq_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vorn\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1428,7 +1517,8 @@ (define_insn "mve_vorrq_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vorr\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vorrq_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1444,7 +1534,8 @@ (define_insn "@mve_<mve_insn>q_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.i%#<V_sz_elem>	%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1460,7 +1551,8 @@ (define_insn "@mve_<mve_insn>q_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.s%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1476,7 +1568,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.s%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1494,7 +1587,8 @@ (define_insn "@mve_<mve_insn>q_<supf>v4si"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>32\t%Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1510,7 +1604,8 @@ (define_insn "@mve_<mve_insn>q_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1526,7 +1621,8 @@ (define_insn "@mve_<mve_insn>q_poly_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_poly_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1547,8 +1643,9 @@ (define_insn "@mve_vcmp<mve_cmp_op1>q_m_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmpt.f%#<V_sz_elem>\t<mve_cmp_op1>, %q1, %q2"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op1>q_f<mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 ;;
 ;; [vcvtaq_m_u, vcvtaq_m_s])
 ;;
@@ -1562,8 +1659,10 @@ (define_insn "mve_vcvtaq_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtat.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtaq_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
+
 ;;
 ;; [vcvtq_m_to_f_s, vcvtq_m_to_f_u])
 ;;
@@ -1577,8 +1676,9 @@ (define_insn "mve_vcvtq_m_to_f_<supf><mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtt.f%#<V_sz_elem>.<supf>%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_to_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [vqrshrnbq_n_u, vqrshrnbq_n_s]
@@ -1604,7 +1704,8 @@ (define_insn "@mve_<mve_insn>q_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<isu>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1623,7 +1724,8 @@ (define_insn "@mve_<mve_insn>q_<supf>v4si"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>32\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1639,7 +1741,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1685,7 +1788,10 @@ (define_insn "mve_vshlcq_<supf><mode>"
 		   (match_dup 4)]
 	VSHLCQ))]
  "TARGET_HAVE_MVE"
- "vshlc\t%q0, %1, %4")
+ "vshlc\t%q0, %1, %4"
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlcq_<supf><mode>"))
+  (set_attr "type" "mve_move")
+])
 
 ;;
 ;; [vabsq_m_s]
@@ -1705,7 +1811,8 @@ (define_insn "@mve_<mve_insn>q_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<isu>%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1721,7 +1828,8 @@ (define_insn "@mve_<mve_insn>q_p_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1744,7 +1852,8 @@ (define_insn "@mve_vcmp<mve_cmp_op1>q_m_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.<isu>%#<V_sz_elem>\t<mve_cmp_op1>, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op1>q_n_<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1767,7 +1876,8 @@ (define_insn "@mve_vcmp<mve_cmp_op1>q_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.<isu>%#<V_sz_elem>\t<mve_cmp_op1>, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op1>q_<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1783,7 +1893,8 @@ (define_insn "@mve_<mve_insn>q_m_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1800,7 +1911,8 @@ (define_insn "@mve_<mve_insn>q_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.s%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1819,7 +1931,8 @@ (define_insn "@mve_<mve_insn>q_p_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1838,7 +1951,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1857,7 +1971,8 @@ (define_insn "@mve_<mve_insn>q_p_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1878,7 +1993,8 @@ (define_insn "@mve_<mve_insn>q_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1894,7 +2010,8 @@ (define_insn "@mve_<mve_insn>q_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1910,7 +2027,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1933,7 +2051,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.s%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1950,7 +2069,8 @@ (define_insn "@mve_<mve_insn>q_m_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1967,7 +2087,8 @@ (define_insn "@mve_<mve_insn>q_m_r_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_r_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1983,7 +2104,8 @@ (define_insn "@mve_<mve_insn>q_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1999,7 +2121,8 @@ (define_insn "@mve_<mve_insn>q_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2015,7 +2138,8 @@ (define_insn "@mve_<mve_insn>q_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2038,7 +2162,8 @@ (define_insn "@mve_<mve_insn>q_m_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_mnemo>t.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2054,7 +2179,8 @@ (define_insn "@mve_<mve_insn>q_p_<supf>v4si"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>32\t%Q0, %R0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
 ;; [vcmlaq, vcmlaq_rot90, vcmlaq_rot180, vcmlaq_rot270])
@@ -2072,7 +2198,9 @@ (define_insn "@mve_<mve_insn>q<mve_rot>_f<mode>"
   "@
    vcmul.f%#<V_sz_elem>	%q0, %q2, %q3, #<rot>
    vcmla.f%#<V_sz_elem>	%q0, %q2, %q3, #<rot>"
-  [(set_attr "type" "mve_move")
+  [(set_attr_alternative "mve_unpredicated_insn" [(symbol_ref "CODE_FOR_mve_<mve_insn>q<mve_rot>_f<mode>")
+						  (symbol_ref "CODE_FOR_mve_<mve_insn>q<mve_rot>_f<mode>")])
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2093,7 +2221,8 @@ (define_insn "@mve_vcmp<mve_cmp_op1>q_m_n_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmpt.f%#<V_sz_elem>\t<mve_cmp_op1>, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op1>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2109,7 +2238,8 @@ (define_insn "mve_vcvtbq_m_f16_f32v8hf"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtbt.f16.f32\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f16_f32v8hf"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2125,7 +2255,8 @@ (define_insn "mve_vcvtbq_m_f32_f16v4sf"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtbt.f32.f16\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f32_f16v4sf"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2141,7 +2272,8 @@ (define_insn "mve_vcvttq_m_f16_f32v8hf"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvttt.f16.f32\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f16_f32v8hf"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2157,8 +2289,9 @@ (define_insn "mve_vcvttq_m_f32_f16v4sf"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvttt.f32.f16\t%q0, %q2"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f32_f16v4sf"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [vdupq_m_n_f])
@@ -2173,7 +2306,8 @@ (define_insn "@mve_<mve_insn>q_m_n_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2190,7 +2324,8 @@ (define_insn "@mve_<mve_insn>q_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.f%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2207,7 +2342,8 @@ (define_insn "@mve_<mve_insn>q_n_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.f%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2224,7 +2360,8 @@ (define_insn "@mve_<mve_insn>q_m_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2243,7 +2380,8 @@ (define_insn "@mve_<mve_insn>q_p_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.f%#<V_sz_elem>\t%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2262,7 +2400,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2281,7 +2420,8 @@ (define_insn "@mve_<mve_insn>q_p_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2298,7 +2438,8 @@ (define_insn "@mve_<mve_insn>q_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2319,7 +2460,8 @@ (define_insn "@mve_<mve_insn>q_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<isu>%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2335,7 +2477,8 @@ (define_insn "@mve_<mve_insn>q_m_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.i%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2352,7 +2495,8 @@ (define_insn "@mve_<mve_insn>q_m_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.i%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2368,7 +2512,8 @@ (define_insn "@mve_<mve_insn>q_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2384,7 +2529,8 @@ (define_insn "@mve_<mve_insn>q_m_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2400,7 +2546,8 @@ (define_insn "@mve_<mve_insn>q_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2416,7 +2563,8 @@ (define_insn "@mve_<mve_insn>q_m_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2435,7 +2583,8 @@ (define_insn "@mve_<mve_insn>q_p_<supf>v4si"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>32\t%Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2451,7 +2600,8 @@ (define_insn "mve_vcvtmq_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtmt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtmq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2467,7 +2617,8 @@ (define_insn "mve_vcvtpq_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtpt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtpq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2483,7 +2634,8 @@ (define_insn "mve_vcvtnq_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtnt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtnq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2500,7 +2652,8 @@ (define_insn "mve_vcvtq_m_n_from_f_<supf><mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_from_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2516,7 +2669,8 @@ (define_insn "@mve_<mve_insn>q_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2532,8 +2686,9 @@ (define_insn "mve_vcvtq_m_from_f_<supf><mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_from_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [vabavq_p_s, vabavq_p_u])
@@ -2549,7 +2704,8 @@ (define_insn "@mve_<mve_insn>q_p_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "8")])
 
 ;;
@@ -2566,8 +2722,9 @@ (define_insn "@mve_<mve_insn>q_m_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\n\t<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
-   (set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length" "8")])
 
 ;;
 ;; [vsriq_m_n_s, vsriq_m_n_u])
@@ -2583,8 +2740,9 @@ (define_insn "@mve_<mve_insn>q_m_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
-   (set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length" "8")])
 
 ;;
 ;; [vcvtq_m_n_to_f_u, vcvtq_m_n_to_f_s])
@@ -2600,7 +2758,8 @@ (define_insn "mve_vcvtq_m_n_to_f_<supf><mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtt.f%#<V_sz_elem>.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_to_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2640,7 +2799,8 @@ (define_insn "@mve_<mve_insn>q_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2659,8 +2819,9 @@ (define_insn "@mve_<mve_insn>q_m_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.i%#<V_sz_elem>	%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [vaddq_m_u, vaddq_m_s]
@@ -2678,7 +2839,8 @@ (define_insn "@mve_<mve_insn>q_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.i%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2698,7 +2860,8 @@ (define_insn "@mve_<mve_insn>q_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2715,8 +2878,9 @@ (define_insn "@mve_<mve_insn>q_m_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [vcaddq_rot90_m_u, vcaddq_rot90_m_s]
@@ -2735,7 +2899,8 @@ (define_insn "@mve_<mve_insn>q<mve_rot>_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<isu>%#<V_sz_elem>\t%q0, %q2, %q3, #<rot>"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q<mve_rot>_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2763,7 +2928,8 @@ (define_insn "@mve_<mve_insn>q_m_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2784,7 +2950,8 @@ (define_insn "@mve_<mve_insn>q_p_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2802,7 +2969,8 @@ (define_insn "@mve_<mve_insn>q_int_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_int_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2819,7 +2987,8 @@ (define_insn "mve_vornq_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vornt\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2837,7 +3006,8 @@ (define_insn "@mve_<mve_insn>q_m_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2855,7 +3025,8 @@ (define_insn "@mve_<mve_insn>q_m_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2872,7 +3043,8 @@ (define_insn "@mve_<mve_insn>q_m_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2892,7 +3064,8 @@ (define_insn "@mve_<mve_insn>q_p_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2920,7 +3093,8 @@ (define_insn "@mve_<mve_insn>q_m_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<isu>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2940,7 +3114,8 @@ (define_insn "@mve_<mve_insn>q_p_<supf>v4si"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>32\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2958,7 +3133,8 @@ (define_insn "@mve_<mve_insn>q_m_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2976,7 +3152,8 @@ (define_insn "@mve_<mve_insn>q_poly_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_poly_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2994,7 +3171,8 @@ (define_insn "@mve_<mve_insn>q_m_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.s%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3012,7 +3190,8 @@ (define_insn "@mve_<mve_insn>q_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.s%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3036,7 +3215,8 @@ (define_insn "@mve_<mve_insn>q_m_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.f%#<V_sz_elem>	%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3057,7 +3237,8 @@ (define_insn "@mve_<mve_insn>q_m_n_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.f%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3077,7 +3258,8 @@ (define_insn "@mve_<mve_insn>q_m_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3094,7 +3276,8 @@ (define_insn "@mve_<mve_insn>q_m_n_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3116,7 +3299,8 @@ (define_insn "@mve_<mve_insn>q<mve_rot>_m_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.f%#<V_sz_elem>\t%q0, %q2, %q3, #<rot>"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q<mve_rot>_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3136,7 +3320,8 @@ (define_insn "@mve_<mve_insn>q<mve_rot>_m_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.f%#<V_sz_elem>\t%q0, %q2, %q3, #<rot>"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q<mve_rot>_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3153,7 +3338,8 @@ (define_insn "mve_vornq_m_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vornt\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3173,7 +3359,8 @@ (define_insn "mve_vstrbq_<supf><mode>"
    output_asm_insn("vstrb.<V_sz_elem>\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrbq_scatter_offset_s vstrbq_scatter_offset_u]
@@ -3201,7 +3388,8 @@ (define_insn "mve_vstrbq_scatter_offset_<supf><mode>_insn"
 	  VSTRBSOQ))]
   "TARGET_HAVE_MVE"
   "vstrb.<V_sz_elem>\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_scatter_offset_<supf><mode>_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_base_s vstrwq_scatter_base_u]
@@ -3223,7 +3411,8 @@ (define_insn "mve_vstrwq_scatter_base_<supf>v4si"
    output_asm_insn("vstrw.u32\t%q2, [%q0, %1]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_<supf>v4si"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrbq_gather_offset_s vldrbq_gather_offset_u]
@@ -3246,7 +3435,8 @@ (define_insn "mve_vldrbq_gather_offset_<supf><mode>"
      output_asm_insn ("vldrb.<supf><V_sz_elem>\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_gather_offset_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrbq_s vldrbq_u]
@@ -3268,7 +3458,8 @@ (define_insn "mve_vldrbq_<supf><mode>"
      output_asm_insn ("vldrb.<supf><V_sz_elem>\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_base_s vldrwq_gather_base_u]
@@ -3288,7 +3479,8 @@ (define_insn "mve_vldrwq_gather_base_<supf>v4si"
    output_asm_insn ("vldrw.u32\t%q0, [%q1, %2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_<supf>v4si"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrbq_scatter_offset_p_s vstrbq_scatter_offset_p_u]
@@ -3320,7 +3512,8 @@ (define_insn "mve_vstrbq_scatter_offset_p_<supf><mode>_insn"
 	  VSTRBSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrbt.<V_sz_elem>\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_scatter_offset_<supf><mode>_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_base_p_s vstrwq_scatter_base_p_u]
@@ -3343,7 +3536,8 @@ (define_insn "mve_vstrwq_scatter_base_p_<supf>v4si"
    output_asm_insn ("vpst\n\tvstrwt.u32\t%q2, [%q0, %1]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_<supf>v4si"))
+  (set_attr "length" "8")])
 
 (define_insn "mve_vstrbq_p_<supf><mode>"
   [(set (match_operand:<MVE_B_ELEM> 0 "mve_memory_operand" "=Ux")
@@ -3361,7 +3555,8 @@ (define_insn "mve_vstrbq_p_<supf><mode>"
    output_asm_insn ("vpst\;vstrbt.<V_sz_elem>\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrbq_gather_offset_z_s vldrbq_gather_offset_z_u]
@@ -3386,7 +3581,8 @@ (define_insn "mve_vldrbq_gather_offset_z_<supf><mode>"
      output_asm_insn ("vpst\n\tvldrbt.<supf><V_sz_elem>\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_gather_offset_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrbq_z_s vldrbq_z_u]
@@ -3409,7 +3605,8 @@ (define_insn "mve_vldrbq_z_<supf><mode>"
      output_asm_insn ("vpst\;vldrbt.<supf><V_sz_elem>\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_base_z_s vldrwq_gather_base_z_u]
@@ -3430,7 +3627,8 @@ (define_insn "mve_vldrwq_gather_base_z_<supf>v4si"
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%q1, %2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_<supf>v4si"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_f]
@@ -3449,7 +3647,8 @@ (define_insn "mve_vldrhq_fv8hf"
    output_asm_insn ("vldrh.16\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_fv8hf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_gather_offset_s vldrhq_gather_offset_u]
@@ -3472,7 +3671,8 @@ (define_insn "mve_vldrhq_gather_offset_<supf><mode>"
      output_asm_insn ("vldrh.<supf><V_sz_elem>\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_gather_offset_z_s vldrhq_gather_offset_z_u]
@@ -3497,7 +3697,8 @@ (define_insn "mve_vldrhq_gather_offset_z_<supf><mode>"
      output_asm_insn ("vpst\n\tvldrht.<supf><V_sz_elem>\t%q0, [%m1, %q2]",ops);
    return "";
 }
- [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_gather_shifted_offset_s vldrhq_gather_shifted_offset_u]
@@ -3520,7 +3721,8 @@ (define_insn "mve_vldrhq_gather_shifted_offset_<supf><mode>"
      output_asm_insn ("vldrh.<supf><V_sz_elem>\t%q0, [%m1, %q2, uxtw #1]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_gather_shifted_offset_z_s vldrhq_gather_shifted_offset_z_u]
@@ -3545,7 +3747,8 @@ (define_insn "mve_vldrhq_gather_shifted_offset_z_<supf><mode>"
      output_asm_insn ("vpst\n\tvldrht.<supf><V_sz_elem>\t%q0, [%m1, %q2, uxtw #1]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_s, vldrhq_u]
@@ -3567,7 +3770,8 @@ (define_insn "mve_vldrhq_<supf><mode>"
      output_asm_insn ("vldrh.<supf><V_sz_elem>\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_z_f]
@@ -3587,7 +3791,8 @@ (define_insn "mve_vldrhq_z_fv8hf"
    output_asm_insn ("vpst\;vldrht.16\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_fv8hf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_z_s vldrhq_z_u]
@@ -3610,7 +3815,8 @@ (define_insn "mve_vldrhq_z_<supf><mode>"
      output_asm_insn ("vpst\;vldrht.<supf><V_sz_elem>\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_f]
@@ -3629,7 +3835,8 @@ (define_insn "mve_vldrwq_fv4sf"
    output_asm_insn ("vldrw.32\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_s vldrwq_u]
@@ -3648,7 +3855,8 @@ (define_insn "mve_vldrwq_<supf>v4si"
    output_asm_insn ("vldrw.32\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_<supf>v4si"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_z_f]
@@ -3668,7 +3876,8 @@ (define_insn "mve_vldrwq_z_fv4sf"
    output_asm_insn ("vpst\;vldrwt.32\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_z_s vldrwq_z_u]
@@ -3688,7 +3897,8 @@ (define_insn "mve_vldrwq_z_<supf>v4si"
    output_asm_insn ("vpst\;vldrwt.32\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_<supf>v4si"))
+  (set_attr "length" "8")])
 
 (define_expand "@mve_vld1q_f<mode>"
   [(match_operand:MVE_0 0 "s_register_operand")
@@ -3728,7 +3938,8 @@ (define_insn "mve_vldrdq_gather_base_<supf>v2di"
    output_asm_insn ("vldrd.64\t%q0, [%q1, %2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_<supf>v2di"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrdq_gather_base_z_s vldrdq_gather_base_z_u]
@@ -3749,7 +3960,8 @@ (define_insn "mve_vldrdq_gather_base_z_<supf>v2di"
    output_asm_insn ("vpst\n\tvldrdt.u64\t%q0, [%q1, %2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_<supf>v2di"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrdq_gather_offset_s vldrdq_gather_offset_u]
@@ -3769,7 +3981,8 @@ (define_insn "mve_vldrdq_gather_offset_<supf>v2di"
   output_asm_insn ("vldrd.u64\t%q0, [%m1, %q2]",ops);
   return "";
 }
- [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_offset_<supf>v2di"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrdq_gather_offset_z_s vldrdq_gather_offset_z_u]
@@ -3790,7 +4003,8 @@ (define_insn "mve_vldrdq_gather_offset_z_<supf>v2di"
   output_asm_insn ("vpst\n\tvldrdt.u64\t%q0, [%m1, %q2]",ops);
   return "";
 }
- [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_offset_<supf>v2di"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrdq_gather_shifted_offset_s vldrdq_gather_shifted_offset_u]
@@ -3810,7 +4024,8 @@ (define_insn "mve_vldrdq_gather_shifted_offset_<supf>v2di"
    output_asm_insn ("vldrd.u64\t%q0, [%m1, %q2, uxtw #3]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_shifted_offset_<supf>v2di"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrdq_gather_shifted_offset_z_s vldrdq_gather_shifted_offset_z_u]
@@ -3831,7 +4046,8 @@ (define_insn "mve_vldrdq_gather_shifted_offset_z_<supf>v2di"
    output_asm_insn ("vpst\n\tvldrdt.u64\t%q0, [%m1, %q2, uxtw #3]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_shifted_offset_<supf>v2di"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_gather_offset_f]
@@ -3851,7 +4067,8 @@ (define_insn "mve_vldrhq_gather_offset_fv8hf"
    output_asm_insn ("vldrh.f16\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_fv8hf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_gather_offset_z_f]
@@ -3873,7 +4090,8 @@ (define_insn "mve_vldrhq_gather_offset_z_fv8hf"
    output_asm_insn ("vpst\n\tvldrht.f16\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_fv8hf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_gather_shifted_offset_f]
@@ -3893,7 +4111,8 @@ (define_insn "mve_vldrhq_gather_shifted_offset_fv8hf"
    output_asm_insn ("vldrh.f16\t%q0, [%m1, %q2, uxtw #1]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_fv8hf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_gather_shifted_offset_z_f]
@@ -3915,7 +4134,8 @@ (define_insn "mve_vldrhq_gather_shifted_offset_z_fv8hf"
    output_asm_insn ("vpst\n\tvldrht.f16\t%q0, [%m1, %q2, uxtw #1]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_fv8hf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_base_f]
@@ -3935,7 +4155,8 @@ (define_insn "mve_vldrwq_gather_base_fv4sf"
    output_asm_insn ("vldrw.u32\t%q0, [%q1, %2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_base_z_f]
@@ -3956,7 +4177,8 @@ (define_insn "mve_vldrwq_gather_base_z_fv4sf"
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%q1, %2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_offset_f]
@@ -3976,7 +4198,8 @@ (define_insn "mve_vldrwq_gather_offset_fv4sf"
    output_asm_insn ("vldrw.u32\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_offset_s vldrwq_gather_offset_u]
@@ -3996,7 +4219,8 @@ (define_insn "mve_vldrwq_gather_offset_<supf>v4si"
    output_asm_insn ("vldrw.u32\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_<supf>v4si"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_offset_z_f]
@@ -4018,7 +4242,8 @@ (define_insn "mve_vldrwq_gather_offset_z_fv4sf"
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_offset_z_s vldrwq_gather_offset_z_u]
@@ -4040,7 +4265,8 @@ (define_insn "mve_vldrwq_gather_offset_z_<supf>v4si"
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_<supf>v4si"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_shifted_offset_f]
@@ -4060,7 +4286,8 @@ (define_insn "mve_vldrwq_gather_shifted_offset_fv4sf"
    output_asm_insn ("vldrw.u32\t%q0, [%m1, %q2, uxtw #2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_shifted_offset_s vldrwq_gather_shifted_offset_u]
@@ -4080,7 +4307,8 @@ (define_insn "mve_vldrwq_gather_shifted_offset_<supf>v4si"
    output_asm_insn ("vldrw.u32\t%q0, [%m1, %q2, uxtw #2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_<supf>v4si"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_shifted_offset_z_f]
@@ -4102,7 +4330,8 @@ (define_insn "mve_vldrwq_gather_shifted_offset_z_fv4sf"
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2, uxtw #2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_shifted_offset_z_s vldrwq_gather_shifted_offset_z_u]
@@ -4124,7 +4353,8 @@ (define_insn "mve_vldrwq_gather_shifted_offset_z_<supf>v4si"
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2, uxtw #2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_<supf>v4si"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_f]
@@ -4143,7 +4373,8 @@ (define_insn "mve_vstrhq_fv8hf"
    output_asm_insn ("vstrh.16\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_fv8hf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_p_f]
@@ -4164,7 +4395,8 @@ (define_insn "mve_vstrhq_p_fv8hf"
    output_asm_insn ("vpst\;vstrht.16\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_fv8hf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_p_s vstrhq_p_u]
@@ -4186,7 +4418,8 @@ (define_insn "mve_vstrhq_p_<supf><mode>"
    output_asm_insn ("vpst\;vstrht.<V_sz_elem>\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_scatter_offset_p_s vstrhq_scatter_offset_p_u]
@@ -4218,7 +4451,8 @@ (define_insn "mve_vstrhq_scatter_offset_p_<supf><mode>_insn"
 	  VSTRHSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrht.<V_sz_elem>\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset_<supf><mode>_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_scatter_offset_s vstrhq_scatter_offset_u]
@@ -4246,7 +4480,8 @@ (define_insn "mve_vstrhq_scatter_offset_<supf><mode>_insn"
 	  VSTRHSOQ))]
   "TARGET_HAVE_MVE"
   "vstrh.<V_sz_elem>\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset_<supf><mode>_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_scatter_shifted_offset_p_s vstrhq_scatter_shifted_offset_p_u]
@@ -4278,7 +4513,8 @@ (define_insn "mve_vstrhq_scatter_shifted_offset_p_<supf><mode>_insn"
 	  VSTRHSSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrht.<V_sz_elem>\t%q2, [%0, %q1, uxtw #1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset_<supf><mode>_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_scatter_shifted_offset_s vstrhq_scatter_shifted_offset_u]
@@ -4307,7 +4543,8 @@ (define_insn "mve_vstrhq_scatter_shifted_offset_<supf><mode>_insn"
 	  VSTRHSSOQ))]
   "TARGET_HAVE_MVE"
   "vstrh.<V_sz_elem>\t%q2, [%0, %q1, uxtw #1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset_<supf><mode>_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_s, vstrhq_u]
@@ -4326,7 +4563,8 @@ (define_insn "mve_vstrhq_<supf><mode>"
    output_asm_insn ("vstrh.<V_sz_elem>\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_f]
@@ -4345,7 +4583,8 @@ (define_insn "mve_vstrwq_fv4sf"
    output_asm_insn ("vstrw.32\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_p_f]
@@ -4366,7 +4605,8 @@ (define_insn "mve_vstrwq_p_fv4sf"
    output_asm_insn ("vpst\;vstrwt.32\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_p_s vstrwq_p_u]
@@ -4387,7 +4627,8 @@ (define_insn "mve_vstrwq_p_<supf>v4si"
    output_asm_insn ("vpst\;vstrwt.32\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_<supf>v4si"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_s vstrwq_u]
@@ -4406,7 +4647,8 @@ (define_insn "mve_vstrwq_<supf>v4si"
    output_asm_insn ("vstrw.32\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_<supf>v4si"))
+  (set_attr "length" "4")])
 
 (define_expand "@mve_vst1q_f<mode>"
   [(match_operand:<MVE_CNVT> 0 "mve_memory_operand")
@@ -4449,7 +4691,8 @@ (define_insn "mve_vstrdq_scatter_base_p_<supf>v2di"
    output_asm_insn ("vpst\;\tvstrdt.u64\t%q2, [%q0, %1]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_<supf>v2di"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrdq_scatter_base_s vstrdq_scatter_base_u]
@@ -4471,7 +4714,8 @@ (define_insn "mve_vstrdq_scatter_base_<supf>v2di"
    output_asm_insn ("vstrd.u64\t%q2, [%q0, %1]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_<supf>v2di"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrdq_scatter_offset_p_s vstrdq_scatter_offset_p_u]
@@ -4502,7 +4746,8 @@ (define_insn "mve_vstrdq_scatter_offset_p_<supf>v2di_insn"
 	  VSTRDSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrdt.64\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_offset_<supf>v2di_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrdq_scatter_offset_s vstrdq_scatter_offset_u]
@@ -4530,7 +4775,8 @@ (define_insn "mve_vstrdq_scatter_offset_<supf>v2di_insn"
 	  VSTRDSOQ))]
   "TARGET_HAVE_MVE"
   "vstrd.64\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_offset_<supf>v2di_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrdq_scatter_shifted_offset_p_s vstrdq_scatter_shifted_offset_p_u]
@@ -4562,7 +4808,8 @@ (define_insn "mve_vstrdq_scatter_shifted_offset_p_<supf>v2di_insn"
 	  VSTRDSSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrdt.64\t%q2, [%0, %q1, uxtw #3]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_shifted_offset_<supf>v2di_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrdq_scatter_shifted_offset_s vstrdq_scatter_shifted_offset_u]
@@ -4591,7 +4838,8 @@ (define_insn "mve_vstrdq_scatter_shifted_offset_<supf>v2di_insn"
 	  VSTRDSSOQ))]
   "TARGET_HAVE_MVE"
   "vstrd.64\t%q2, [%0, %q1, uxtw #3]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_shifted_offset_<supf>v2di_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_scatter_offset_f]
@@ -4619,7 +4867,8 @@ (define_insn "mve_vstrhq_scatter_offset_fv8hf_insn"
 	  VSTRHQSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vstrh.16\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset_fv8hf_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_scatter_offset_p_f]
@@ -4650,7 +4899,8 @@ (define_insn "mve_vstrhq_scatter_offset_p_fv8hf_insn"
 	  VSTRHQSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vstrht.16\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset_fv8hf_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_scatter_shifted_offset_f]
@@ -4678,7 +4928,8 @@ (define_insn "mve_vstrhq_scatter_shifted_offset_fv8hf_insn"
 	  VSTRHQSSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vstrh.16\t%q2, [%0, %q1, uxtw #1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset_fv8hf_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_scatter_shifted_offset_p_f]
@@ -4710,7 +4961,8 @@ (define_insn "mve_vstrhq_scatter_shifted_offset_p_fv8hf_insn"
 	  VSTRHQSSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vstrht.16\t%q2, [%0, %q1, uxtw #1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset_fv8hf_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_base_f]
@@ -4732,7 +4984,8 @@ (define_insn "mve_vstrwq_scatter_base_fv4sf"
    output_asm_insn ("vstrw.u32\t%q2, [%q0, %1]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_base_p_f]
@@ -4755,7 +5008,8 @@ (define_insn "mve_vstrwq_scatter_base_p_fv4sf"
    output_asm_insn ("vpst\n\tvstrwt.u32\t%q2, [%q0, %1]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_offset_f]
@@ -4783,7 +5037,8 @@ (define_insn "mve_vstrwq_scatter_offset_fv4sf_insn"
 	  VSTRWQSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vstrw.32\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_fv4sf_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_offset_p_f]
@@ -4814,7 +5069,8 @@ (define_insn "mve_vstrwq_scatter_offset_p_fv4sf_insn"
 	  VSTRWQSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vstrwt.32\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_fv4sf_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_offset_s vstrwq_scatter_offset_u]
@@ -4845,7 +5101,8 @@ (define_insn "mve_vstrwq_scatter_offset_p_<supf>v4si_insn"
 	  VSTRWSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrwt.32\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_<supf>v4si_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_offset_s vstrwq_scatter_offset_u]
@@ -4873,7 +5130,8 @@ (define_insn "mve_vstrwq_scatter_offset_<supf>v4si_insn"
 	  VSTRWSOQ))]
   "TARGET_HAVE_MVE"
   "vstrw.32\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_<supf>v4si_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_shifted_offset_f]
@@ -4901,7 +5159,8 @@ (define_insn "mve_vstrwq_scatter_shifted_offset_fv4sf_insn"
 	 VSTRWQSSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vstrw.32\t%q2, [%0, %q1, uxtw #2]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_fv4sf_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_shifted_offset_p_f]
@@ -4933,7 +5192,8 @@ (define_insn "mve_vstrwq_scatter_shifted_offset_p_fv4sf_insn"
 	  VSTRWQSSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vstrwt.32\t%q2, [%0, %q1, uxtw #2]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_fv4sf_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_shifted_offset_p_s vstrwq_scatter_shifted_offset_p_u]
@@ -4965,7 +5225,8 @@ (define_insn "mve_vstrwq_scatter_shifted_offset_p_<supf>v4si_insn"
 	  VSTRWSSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrwt.32\t%q2, [%0, %q1, uxtw #2]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_<supf>v4si_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_shifted_offset_s vstrwq_scatter_shifted_offset_u]
@@ -4994,7 +5255,8 @@ (define_insn "mve_vstrwq_scatter_shifted_offset_<supf>v4si_insn"
 	  VSTRWSSOQ))]
   "TARGET_HAVE_MVE"
   "vstrw.32\t%q2, [%0, %q1, uxtw #2]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_<supf>v4si_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vidupq_n_u])
@@ -5062,7 +5324,8 @@ (define_insn "mve_vidupq_m_wb_u<mode>_insn"
 		(match_operand:SI 6 "immediate_operand" "i")))]
  "TARGET_HAVE_MVE"
  "vpst\;\tvidupt.u%#<V_sz_elem>\t%q0, %2, %4"
- [(set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vidupq_u<mode>_insn"))
+  (set_attr "length""8")])
 
 ;;
 ;; [vddupq_n_u])
@@ -5130,7 +5393,8 @@ (define_insn "mve_vddupq_m_wb_u<mode>_insn"
 		 (match_operand:SI 6 "immediate_operand" "i")))]
  "TARGET_HAVE_MVE"
  "vpst\;vddupt.u%#<V_sz_elem>\t%q0, %2, %4"
- [(set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vddupq_u<mode>_insn"))
+  (set_attr "length""8")])
 
 ;;
 ;; [vdwdupq_n_u])
@@ -5246,8 +5510,9 @@ (define_insn "mve_vdwdupq_m_wb_u<mode>_insn"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vdwdupt.u%#<V_sz_elem>\t%q2, %3, %R4, %5"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vdwdupq_wb_u<mode>_insn"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [viwdupq_n_u])
@@ -5363,7 +5628,8 @@ (define_insn "mve_viwdupq_m_wb_u<mode>_insn"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;\tviwdupt.u%#<V_sz_elem>\t%q2, %3, %R4, %5"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_viwdupq_wb_u<mode>_insn"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5389,7 +5655,8 @@ (define_insn "mve_vstrwq_scatter_base_wb_<supf>v4si"
    output_asm_insn ("vstrw.u32\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_<supf>v4si"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_base_wb_p_s vstrwq_scatter_base_wb_p_u]
@@ -5415,7 +5682,8 @@ (define_insn "mve_vstrwq_scatter_base_wb_p_<supf>v4si"
    output_asm_insn ("vpst\;\tvstrwt.u32\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_<supf>v4si"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_base_wb_f]
@@ -5440,7 +5708,8 @@ (define_insn "mve_vstrwq_scatter_base_wb_fv4sf"
    output_asm_insn ("vstrw.u32\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_base_wb_p_f]
@@ -5466,7 +5735,8 @@ (define_insn "mve_vstrwq_scatter_base_wb_p_fv4sf"
    output_asm_insn ("vpst\;vstrwt.u32\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrdq_scatter_base_wb_s vstrdq_scatter_base_wb_u]
@@ -5491,7 +5761,8 @@ (define_insn "mve_vstrdq_scatter_base_wb_<supf>v2di"
    output_asm_insn ("vstrd.u64\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_wb_<supf>v2di"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrdq_scatter_base_wb_p_s vstrdq_scatter_base_wb_p_u]
@@ -5517,7 +5788,8 @@ (define_insn "mve_vstrdq_scatter_base_wb_p_<supf>v2di"
    output_asm_insn ("vpst\;vstrdt.u64\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_wb_<supf>v2di"))
+  (set_attr "length" "8")])
 
 (define_expand "mve_vldrwq_gather_base_wb_<supf>v4si"
   [(match_operand:V4SI 0 "s_register_operand")
@@ -5569,7 +5841,8 @@ (define_insn "mve_vldrwq_gather_base_wb_<supf>v4si_insn"
    output_asm_insn ("vldrw.u32\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_<supf>v4si_insn"))
+  (set_attr "length" "4")])
 
 (define_expand "mve_vldrwq_gather_base_wb_z_<supf>v4si"
   [(match_operand:V4SI 0 "s_register_operand")
@@ -5625,7 +5898,8 @@ (define_insn "mve_vldrwq_gather_base_wb_z_<supf>v4si_insn"
    output_asm_insn ("vpst\;vldrwt.u32\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_<supf>v4si_insn"))
+  (set_attr "length" "8")])
 
 (define_expand "mve_vldrwq_gather_base_wb_fv4sf"
   [(match_operand:V4SI 0 "s_register_operand")
@@ -5677,7 +5951,8 @@ (define_insn "mve_vldrwq_gather_base_wb_fv4sf_insn"
    output_asm_insn ("vldrw.u32\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_fv4sf_insn"))
+  (set_attr "length" "4")])
 
 (define_expand "mve_vldrwq_gather_base_wb_z_fv4sf"
   [(match_operand:V4SI 0 "s_register_operand")
@@ -5734,7 +6009,8 @@ (define_insn "mve_vldrwq_gather_base_wb_z_fv4sf_insn"
    output_asm_insn ("vpst\;vldrwt.u32\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_fv4sf_insn"))
+  (set_attr "length" "8")])
 
 (define_expand "mve_vldrdq_gather_base_wb_<supf>v2di"
   [(match_operand:V2DI 0 "s_register_operand")
@@ -5787,7 +6063,8 @@ (define_insn "mve_vldrdq_gather_base_wb_<supf>v2di_insn"
    output_asm_insn ("vldrd.64\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_wb_<supf>v2di_insn"))
+  (set_attr "length" "4")])
 
 (define_expand "mve_vldrdq_gather_base_wb_z_<supf>v2di"
   [(match_operand:V2DI 0 "s_register_operand")
@@ -5826,7 +6103,7 @@ (define_insn "get_fpscr_nzcvqc"
    (unspec_volatile:SI [(reg:SI VFPCC_REGNUM)] UNSPEC_GET_FPSCR_NZCVQC))]
  "TARGET_HAVE_MVE"
  "vmrs\\t%0, FPSCR_nzcvqc"
-  [(set_attr "type" "mve_move")])
+ [(set_attr "type" "mve_move")])
 
 (define_insn "set_fpscr_nzcvqc"
  [(set (reg:SI VFPCC_REGNUM)
@@ -5834,7 +6111,7 @@ (define_insn "set_fpscr_nzcvqc"
     VUNSPEC_SET_FPSCR_NZCVQC))]
  "TARGET_HAVE_MVE"
  "vmsr\\tFPSCR_nzcvqc, %0"
-  [(set_attr "type" "mve_move")])
+ [(set_attr "type" "mve_move")])
 
 ;;
 ;; [vldrdq_gather_base_wb_z_s vldrdq_gather_base_wb_z_u]
@@ -5859,7 +6136,8 @@ (define_insn "mve_vldrdq_gather_base_wb_z_<supf>v2di_insn"
    output_asm_insn ("vpst\;vldrdt.u64\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_wb_<supf>v2di_insn"))
+  (set_attr "length" "8")])
 ;;
 ;; [vadciq_m_s, vadciq_m_u])
 ;;
@@ -5876,7 +6154,8 @@ (define_insn "mve_vadciq_m_<supf>v4si"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vadcit.i32\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadciq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "8")])
 
 ;;
@@ -5893,7 +6172,8 @@ (define_insn "mve_vadciq_<supf>v4si"
   ]
   "TARGET_HAVE_MVE"
   "vadci.i32\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadciq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "4")])
 
 ;;
@@ -5912,7 +6192,8 @@ (define_insn "mve_vadcq_m_<supf>v4si"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vadct.i32\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadcq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "8")])
 
 ;;
@@ -5929,7 +6210,8 @@ (define_insn "mve_vadcq_<supf>v4si"
   ]
   "TARGET_HAVE_MVE"
   "vadc.i32\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadcq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "4")
    (set_attr "conds" "set")])
 
@@ -5949,7 +6231,8 @@ (define_insn "mve_vsbciq_m_<supf>v4si"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vsbcit.i32\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbciq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "8")])
 
 ;;
@@ -5966,7 +6249,8 @@ (define_insn "mve_vsbciq_<supf>v4si"
   ]
   "TARGET_HAVE_MVE"
   "vsbci.i32\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbciq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "4")])
 
 ;;
@@ -5985,7 +6269,8 @@ (define_insn "mve_vsbcq_m_<supf>v4si"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vsbct.i32\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbcq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "8")])
 
 ;;
@@ -6002,7 +6287,8 @@ (define_insn "mve_vsbcq_<supf>v4si"
   ]
   "TARGET_HAVE_MVE"
   "vsbc.i32\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbcq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "4")])
 
 ;;
@@ -6031,7 +6317,7 @@ (define_insn "mve_vst2q<mode>"
 		    "vst21.<V_sz_elem>\t{%q0, %q1}, %3", ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set_attr "length" "8")])
 
 ;;
 ;; [vld2q])
@@ -6059,7 +6345,7 @@ (define_insn "mve_vld2q<mode>"
 		    "vld21.<V_sz_elem>\t{%q0, %q1}, %3", ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set_attr "length" "8")])
 
 ;;
 ;; [vld4q])
@@ -6402,7 +6688,8 @@ (define_insn "mve_vshlcq_m_<supf><mode>"
  ]
  "TARGET_HAVE_MVE"
  "vpst\;vshlct\t%q0, %1, %4"
- [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlcq_<supf><mode>"))
+  (set_attr "type" "mve_move")
   (set_attr "length" "8")])
 
 ;; CDE instructions on MVE registers.
@@ -6414,7 +6701,8 @@ (define_insn "arm_vcx1qv16qi"
 	 UNSPEC_VCDE))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx1\\tp%c1, %q0, #%c2"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx1qv16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx1qav16qi"
@@ -6425,7 +6713,8 @@ (define_insn "arm_vcx1qav16qi"
 	 UNSPEC_VCDEA))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx1a\\tp%c1, %q0, #%c3"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx1qav16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx2qv16qi"
@@ -6436,7 +6725,8 @@ (define_insn "arm_vcx2qv16qi"
 	 UNSPEC_VCDE))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx2\\tp%c1, %q0, %q2, #%c3"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx2qv16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx2qav16qi"
@@ -6448,7 +6738,8 @@ (define_insn "arm_vcx2qav16qi"
 	 UNSPEC_VCDEA))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx2a\\tp%c1, %q0, %q3, #%c4"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx2qav16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx3qv16qi"
@@ -6460,7 +6751,8 @@ (define_insn "arm_vcx3qv16qi"
 	 UNSPEC_VCDE))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx3\\tp%c1, %q0, %q2, %q3, #%c4"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx3qv16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx3qav16qi"
@@ -6473,7 +6765,8 @@ (define_insn "arm_vcx3qav16qi"
 	 UNSPEC_VCDEA))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx3a\\tp%c1, %q0, %q3, %q4, #%c5"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx3qav16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx1q<a>_p_v16qi"
@@ -6485,7 +6778,8 @@ (define_insn "arm_vcx1q<a>_p_v16qi"
 	 CDE_VCX))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vpst\;vcx1<a>t\\tp%c1, %q0, #%c3"
-  [(set_attr "type" "coproc")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx1q<a>v16qi"))
+  (set_attr "type" "coproc")
    (set_attr "length" "8")]
 )
 
@@ -6499,7 +6793,8 @@ (define_insn "arm_vcx2q<a>_p_v16qi"
 	 CDE_VCX))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vpst\;vcx2<a>t\\tp%c1, %q0, %q3, #%c4"
-  [(set_attr "type" "coproc")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx2q<a>v16qi"))
+  (set_attr "type" "coproc")
    (set_attr "length" "8")]
 )
 
@@ -6514,11 +6809,12 @@ (define_insn "arm_vcx3q<a>_p_v16qi"
 	 CDE_VCX))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vpst\;vcx3<a>t\\tp%c1, %q0, %q3, %q4, #%c5"
-  [(set_attr "type" "coproc")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx3q<a>v16qi"))
+  (set_attr "type" "coproc")
    (set_attr "length" "8")]
 )
 
-(define_insn "*movmisalign<mode>_mve_store"
+(define_insn "movmisalign<mode>_mve_store"
   [(set (match_operand:MVE_VLD_ST 0 "mve_memory_operand"	     "=Ux")
 	(unspec:MVE_VLD_ST [(match_operand:MVE_VLD_ST 1 "s_register_operand" " w")]
 	 UNSPEC_MISALIGNED_ACCESS))]
@@ -6526,11 +6822,12 @@ (define_insn "*movmisalign<mode>_mve_store"
     || (TARGET_HAVE_MVE_FLOAT && VALID_MVE_SF_MODE (<MODE>mode)))
    && !BYTES_BIG_ENDIAN && unaligned_access"
   "vstr<V_sz_elem1>.<V_sz_elem>\t%q1, %E0"
-  [(set_attr "type" "mve_store")]
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_movmisalign<mode>_mve_store"))
+   (set_attr "type" "mve_store")]
 )
 
 
-(define_insn "*movmisalign<mode>_mve_load"
+(define_insn "movmisalign<mode>_mve_load"
   [(set (match_operand:MVE_VLD_ST 0 "s_register_operand"				 "=w")
 	(unspec:MVE_VLD_ST [(match_operand:MVE_VLD_ST 1 "mve_memory_operand" " Ux")]
 	 UNSPEC_MISALIGNED_ACCESS))]
@@ -6538,7 +6835,8 @@ (define_insn "*movmisalign<mode>_mve_load"
     || (TARGET_HAVE_MVE_FLOAT && VALID_MVE_SF_MODE (<MODE>mode)))
    && !BYTES_BIG_ENDIAN && unaligned_access"
   "vldr<V_sz_elem1>.<V_sz_elem>\t%q0, %E1"
-  [(set_attr "type" "mve_load")]
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_movmisalign<mode>_mve_load"))
+   (set_attr "type" "mve_load")]
 )
 
 ;; Expander for VxBI moves
diff --git a/gcc/config/arm/vec-common.md b/gcc/config/arm/vec-common.md
index 9af8429968d8662b3c814306c94f033434378e7d..74871cb984b3fe1fb9571841cdcae39631abf8e2 100644
--- a/gcc/config/arm/vec-common.md
+++ b/gcc/config/arm/vec-common.md
@@ -366,7 +366,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   "@
    <mve_insn>.<supf>%#<V_sz_elem>\t%<V_reg>0, %<V_reg>1, %<V_reg>2
    * return neon_output_shift_immediate (\"vshl\", 'i', &operands[2], <MODE>mode, VALID_NEON_QREG_MODE (<MODE>mode), true);"
-  [(set_attr "type" "neon_shift_reg<q>, neon_shift_imm<q>")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "neon_shift_reg<q>, neon_shift_imm<q>")]
 )
 
 (define_expand "vashl<mode>3"


* [PATCH 1/2] arm: Add define_attr to create a mapping between MVE predicated and unpredicated insns
  2023-12-18 11:53 [PATCH 0/2] arm: Add support for MVE Tail-Predicated Low Overhead Loops Andre Vieira
@ 2023-12-18 11:53 ` Andre Vieira
  2023-12-20 16:54   ` Andre Vieira (lists)
  0 siblings, 1 reply; 7+ messages in thread
From: Andre Vieira @ 2023-12-18 11:53 UTC (permalink / raw)
  To: gcc-patches; +Cc: Richard.Earnshaw, Stam Markianos-Wright


[-- Attachment #2: Type: text/plain, Size: 152 bytes --]


Re-sending Stam's first patch, same as:
https://gcc.gnu.org/pipermail/gcc-patches/2023-November/635301.html

Hopefully patchworks can pick this up :)


[-- Attachment #3: 0001-arm-Add-define_attr-to-to-create-a-mapping-between-M.patch --]
[-- Type: text/x-patch; name="0001-arm-Add-define_attr-to-to-create-a-mapping-between-M.patch", Size: 104533 bytes --]

diff --git a/gcc/config/arm/arm.h b/gcc/config/arm/arm.h
index a9c2752c0ea..0b0e8620717 100644
--- a/gcc/config/arm/arm.h
+++ b/gcc/config/arm/arm.h
@@ -2375,6 +2375,21 @@ extern int making_const_table;
   else if (TARGET_THUMB1)				\
     thumb1_final_prescan_insn (INSN)
 
+/* These defines are useful to refer to the value of the mve_unpredicated_insn
+   insn attribute.  Note that, because these use the get_attr_* function, these
+   will change recog_data if (INSN) isn't current_insn.  */
+#define MVE_VPT_PREDICABLE_INSN_P(INSN)					\
+  (recog_memoized (INSN) >= 0						\
+  && get_attr_mve_unpredicated_insn (INSN) != 0)			\
+
+#define MVE_VPT_PREDICATED_INSN_P(INSN)					\
+  (MVE_VPT_PREDICABLE_INSN_P (INSN)					\
+   && recog_memoized (INSN) != get_attr_mve_unpredicated_insn (INSN))	\
+
+#define MVE_VPT_UNPREDICATED_INSN_P(INSN)				\
+  (MVE_VPT_PREDICABLE_INSN_P (INSN)					\
+   && recog_memoized (INSN) == get_attr_mve_unpredicated_insn (INSN))	\
+
 #define ARM_SIGN_EXTEND(x)  ((HOST_WIDE_INT)			\
   (HOST_BITS_PER_WIDE_INT <= 32 ? (unsigned HOST_WIDE_INT) (x)	\
    : ((((unsigned HOST_WIDE_INT)(x)) & (unsigned HOST_WIDE_INT) 0xffffffff) |\
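
To make the intent of these macros concrete, here is a minimal sketch (not part of the patch) of how a later pass in the arm backend might use them to find the unpredicated counterpart of a (possibly VPT-predicated) MVE insn. The helper name `mve_insn_unpredicated_code` is made up for this illustration; `recog_memoized`, `get_attr_mve_unpredicated_insn` and the `MVE_VPT_*` macros are the ones introduced above.

```
/* Illustrative sketch only (not part of the patch): look up the
   CODE_FOR_* of the unpredicated form of an MVE insn via the new
   mve_unpredicated_insn attribute.  */

static bool
mve_insn_unpredicated_code (rtx_insn *insn, enum insn_code *code_out)
{
  /* recog_memoized must have succeeded before any get_attr_* query is
     valid; MVE_VPT_PREDICABLE_INSN_P checks that and also rejects insns
     whose attribute is still the default of 0.  */
  if (!MVE_VPT_PREDICABLE_INSN_P (insn))
    return false;

  /* The attribute stores the insn code of the unpredicated form; an
     already-unpredicated insn simply maps back to its own code.  */
  *code_out = (enum insn_code) get_attr_mve_unpredicated_insn (insn);
  return true;
}
```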
diff --git a/gcc/config/arm/arm.md b/gcc/config/arm/arm.md
index 07eaf06cdea..8efdebecc3c 100644
--- a/gcc/config/arm/arm.md
+++ b/gcc/config/arm/arm.md
@@ -124,6 +124,8 @@ (define_attr "fpu" "none,vfp"
 ; and not all ARM insns do.
 (define_attr "predicated" "yes,no" (const_string "no"))
 
+(define_attr "mve_unpredicated_insn" "" (const_int 0))
+
 ; LENGTH of an instruction (in bytes)
 (define_attr "length" ""
   (const_int 4))
diff --git a/gcc/config/arm/iterators.md b/gcc/config/arm/iterators.md
index a9803538101..5ea2d9e8668 100644
--- a/gcc/config/arm/iterators.md
+++ b/gcc/config/arm/iterators.md
@@ -2305,6 +2305,7 @@ (define_int_attr simd32_op [(UNSPEC_QADD8 "qadd8") (UNSPEC_QSUB8 "qsub8")
 
 (define_int_attr mmla_sfx [(UNSPEC_MATMUL_S "s8") (UNSPEC_MATMUL_U "u8")
 			   (UNSPEC_MATMUL_US "s8")])
+
 ;;MVE int attribute.
 (define_int_attr supf [(VCVTQ_TO_F_S "s") (VCVTQ_TO_F_U "u") (VREV16Q_S "s")
 		       (VREV16Q_U "u") (VMVNQ_N_S "s") (VMVNQ_N_U "u")
diff --git a/gcc/config/arm/mve.md b/gcc/config/arm/mve.md
index b0d3443da9c..62df022ef19 100644
--- a/gcc/config/arm/mve.md
+++ b/gcc/config/arm/mve.md
@@ -17,7 +17,7 @@
 ;; along with GCC; see the file COPYING3.  If not see
 ;; <http://www.gnu.org/licenses/>.
 
-(define_insn "*mve_mov<mode>"
+(define_insn "mve_mov<mode>"
   [(set (match_operand:MVE_types 0 "nonimmediate_operand" "=w,w,r,w   , w,   r,Ux,w")
 	(match_operand:MVE_types 1 "general_operand"      " w,r,w,DnDm,UxUi,r,w, Ul"))]
   "TARGET_HAVE_MVE || TARGET_HAVE_MVE_FLOAT"
@@ -81,18 +81,27 @@ (define_insn "*mve_mov<mode>"
       return "";
     }
 }
-  [(set_attr "type" "mve_move,mve_move,mve_move,mve_move,mve_load,multiple,mve_store,mve_load")
+   [(set_attr_alternative "mve_unpredicated_insn" [(symbol_ref "CODE_FOR_mve_mov<mode>")
+						   (symbol_ref "CODE_FOR_nothing")
+						   (symbol_ref "CODE_FOR_nothing")
+						   (symbol_ref "CODE_FOR_mve_mov<mode>")
+						   (symbol_ref "CODE_FOR_mve_mov<mode>")
+						   (symbol_ref "CODE_FOR_nothing")
+						   (symbol_ref "CODE_FOR_mve_mov<mode>")
+						   (symbol_ref "CODE_FOR_nothing")])
+   (set_attr "type" "mve_move,mve_move,mve_move,mve_move,mve_load,multiple,mve_store,mve_load")
    (set_attr "length" "4,8,8,4,4,8,4,8")
    (set_attr "thumb2_pool_range" "*,*,*,*,1018,*,*,*")
    (set_attr "neg_pool_range" "*,*,*,*,996,*,*,*")])
 
-(define_insn "*mve_vdup<mode>"
+(define_insn "mve_vdup<mode>"
   [(set (match_operand:MVE_vecs 0 "s_register_operand" "=w")
 	(vec_duplicate:MVE_vecs
 	  (match_operand:<V_elem> 1 "s_register_operand" "r")))]
   "TARGET_HAVE_MVE || TARGET_HAVE_MVE_FLOAT"
   "vdup.<V_sz_elem>\t%q0, %1"
-  [(set_attr "length" "4")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vdup<mode>"))
+  (set_attr "length" "4")
    (set_attr "type" "mve_move")])
 
 ;;
@@ -145,7 +154,8 @@ (define_insn "@mve_<mve_insn>q_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_mnemo>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -159,7 +169,8 @@ (define_insn "@mve_<mve_insn>q_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -173,7 +184,8 @@ (define_insn "mve_v<absneg_str>q_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "v<absneg_str>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_v<absneg_str>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -187,7 +199,8 @@ (define_insn "@mve_<mve_insn>q_n_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -201,7 +214,8 @@ (define_insn "@mve_<mve_insn>q_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 ;;
 ;; [vcvttq_f32_f16])
@@ -214,7 +228,8 @@ (define_insn "mve_vcvttq_f32_f16v4sf"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtt.f32.f16\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f32_f16v4sf"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -228,7 +243,8 @@ (define_insn "mve_vcvtbq_f32_f16v4sf"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtb.f32.f16\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f32_f16v4sf"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -242,7 +258,8 @@ (define_insn "mve_vcvtq_to_f_<supf><mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvt.f%#<V_sz_elem>.<supf>%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_to_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -256,7 +273,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -270,7 +288,8 @@ (define_insn "mve_vcvtq_from_f_<supf><mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_from_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -284,7 +303,8 @@ (define_insn "mve_v<absneg_str>q_s<mode>"
   ]
   "TARGET_HAVE_MVE"
   "v<absneg_str>.s%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_v<absneg_str>q_s<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -297,7 +317,8 @@ (define_insn "mve_vmvnq_u<mode>"
   ]
   "TARGET_HAVE_MVE"
   "vmvn\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmvnq_u<mode>"))
+  (set_attr "type" "mve_move")
 ])
 (define_expand "mve_vmvnq_s<mode>"
   [
@@ -318,7 +339,8 @@ (define_insn "@mve_<mve_insn>q_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -331,7 +353,8 @@ (define_insn "@mve_vclzq_s<mode>"
   ]
   "TARGET_HAVE_MVE"
   "vclz.i%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vclzq_s<mode>"))
+  (set_attr "type" "mve_move")
 ])
 (define_expand "mve_vclzq_u<mode>"
   [
@@ -354,7 +377,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -368,7 +392,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -382,7 +407,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -397,7 +423,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -411,7 +438,8 @@ (define_insn "mve_vcvtpq_<supf><mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtp.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtpq_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -425,7 +453,8 @@ (define_insn "mve_vcvtnq_<supf><mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtn.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtnq_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -439,7 +468,8 @@ (define_insn "mve_vcvtmq_<supf><mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtm.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtmq_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -453,7 +483,8 @@ (define_insn "mve_vcvtaq_<supf><mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvta.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtaq_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -467,7 +498,8 @@ (define_insn "@mve_<mve_insn>q_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.i%#<V_sz_elem>\t%q0, %1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -481,7 +513,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -495,7 +528,8 @@ (define_insn "@mve_<mve_insn>q_<supf>v4si"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>32\t%Q0, %R0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -509,7 +543,8 @@ (define_insn "mve_vctp<MVE_vctp>q<MVE_vpred>"
   ]
   "TARGET_HAVE_MVE"
   "vctp.<MVE_vctp>\t%1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vctp<MVE_vctp>q<MVE_vpred>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -523,7 +558,8 @@ (define_insn "mve_vpnotv16bi"
   ]
   "TARGET_HAVE_MVE"
   "vpnot"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vpnotv16bi"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -538,7 +574,8 @@ (define_insn "@mve_<mve_insn>q_n_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -553,7 +590,8 @@ (define_insn "mve_vcvtq_n_to_f_<supf><mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvt.f<V_sz_elem>.<supf><V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_to_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;; [vcreateq_f])
@@ -599,7 +637,8 @@ (define_insn "@mve_<mve_insn>q_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf><V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;; Versions that take constant vectors as operand 2 (with all elements
@@ -617,7 +656,8 @@ (define_insn "mve_vshrq_n_s<mode>_imm"
 					VALID_NEON_QREG_MODE (<MODE>mode),
 					true);
   }
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshrq_n_s<mode>_imm"))
+  (set_attr "type" "mve_move")
 ])
 (define_insn "mve_vshrq_n_u<mode>_imm"
   [
@@ -632,7 +672,8 @@ (define_insn "mve_vshrq_n_u<mode>_imm"
 					VALID_NEON_QREG_MODE (<MODE>mode),
 					true);
   }
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshrq_n_u<mode>_imm"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -647,7 +688,8 @@ (define_insn "mve_vcvtq_n_from_f_<supf><mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvt.<supf><V_sz_elem>.f<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_from_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -662,8 +704,9 @@ (define_insn "@mve_<mve_insn>q_p_<supf>v4si"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>32\t%Q0, %R0, %q1"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [vcmpneq_, vcmpcsq_, vcmpeqq_, vcmpgeq_, vcmpgtq_, vcmphiq_, vcmpleq_, vcmpltq_])
@@ -676,7 +719,8 @@ (define_insn "@mve_vcmp<mve_cmp_op>q_<mode>"
   ]
   "TARGET_HAVE_MVE"
   "vcmp.<mve_cmp_type>%#<V_sz_elem>\t<mve_cmp_op>, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op>q_<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -691,7 +735,8 @@ (define_insn "@mve_vcmp<mve_cmp_op>q_n_<mode>"
   ]
   "TARGET_HAVE_MVE"
   "vcmp.<mve_cmp_type>%#<V_sz_elem>	<mve_cmp_op>, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op>q_n_<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -722,7 +767,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -739,7 +785,8 @@ (define_insn "@mve_<mve_insn>q_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.i%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -754,7 +801,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -769,7 +817,8 @@ (define_insn "@mve_<mve_insn>q_p_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -789,8 +838,11 @@ (define_insn "mve_vandq_u<mode>"
   "@
    vand\t%q0, %q1, %q2
    * return neon_output_logic_immediate (\"vand\", &operands[2], <MODE>mode, 1, VALID_NEON_QREG_MODE (<MODE>mode));"
-  [(set_attr "type" "mve_move")
+   [(set_attr_alternative "mve_unpredicated_insn" [(symbol_ref "CODE_FOR_mve_vandq_u<mode>")
+						   (symbol_ref "CODE_FOR_nothing")])
+  (set_attr "type" "mve_move")
 ])
+
 (define_expand "mve_vandq_s<mode>"
   [
    (set (match_operand:MVE_2 0 "s_register_operand")
@@ -811,7 +863,8 @@ (define_insn "mve_vbicq_u<mode>"
   ]
   "TARGET_HAVE_MVE"
   "vbic\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbicq_u<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 (define_expand "mve_vbicq_s<mode>"
@@ -835,7 +888,8 @@ (define_insn "@mve_<mve_insn>q_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -853,7 +907,8 @@ (define_insn "@mve_<mve_insn>q<mve_rot>_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<isu>%#<V_sz_elem>\t%q0, %q1, %q2, #<rot>"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q<mve_rot>_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;; Auto vectorizer pattern for int vcadd
@@ -876,7 +931,8 @@ (define_insn "mve_veorq_u<mode>"
   ]
   "TARGET_HAVE_MVE"
   "veor\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_veorq_u<mode>"))
+  (set_attr "type" "mve_move")
 ])
 (define_expand "mve_veorq_s<mode>"
   [
@@ -904,7 +960,8 @@ (define_insn "@mve_<mve_insn>q_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -920,7 +977,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.s%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -935,7 +993,8 @@ (define_insn "mve_<max_min_su_str>q_<max_min_supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<max_min_su_str>.<max_min_supf>%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<max_min_su_str>q_<max_min_supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 
@@ -954,7 +1013,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -972,7 +1032,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -988,7 +1049,8 @@ (define_insn "@mve_<mve_insn>q_int_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<isu>%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_int_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1004,7 +1066,8 @@ (define_insn "mve_<mve_addsubmul>q<mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_addsubmul>.i%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_addsubmul>q<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1018,7 +1081,8 @@ (define_insn "mve_vornq_s<mode>"
   ]
   "TARGET_HAVE_MVE"
    "vorn\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_s<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 (define_expand "mve_vornq_u<mode>"
@@ -1047,7 +1111,8 @@ (define_insn "mve_vorrq_s<mode>"
   "@
    vorr\t%q0, %q1, %q2
    * return neon_output_logic_immediate (\"vorr\", &operands[2], <MODE>mode, 0, VALID_NEON_QREG_MODE (<MODE>mode));"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vorrq_s<mode>"))
+  (set_attr "type" "mve_move")
 ])
 (define_expand "mve_vorrq_u<mode>"
   [
@@ -1071,7 +1136,8 @@ (define_insn "@mve_<mve_insn>q_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1087,7 +1153,8 @@ (define_insn "@mve_<mve_insn>q_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1103,7 +1170,8 @@ (define_insn "@mve_<mve_insn>q_r_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_r_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1118,7 +1186,8 @@ (define_insn "@mve_<mve_insn>q_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1133,7 +1202,8 @@ (define_insn "@mve_<mve_insn>q_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.f%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1148,7 +1218,8 @@ (define_insn "@mve_<mve_insn>q_<supf>v4si"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>32\t%Q0, %R0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1165,7 +1236,8 @@ (define_insn "@mve_<mve_insn>q_n_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.f%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1179,7 +1251,8 @@ (define_insn "mve_vandq_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vand\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vandq_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1193,7 +1266,8 @@ (define_insn "mve_vbicq_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vbic\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbicq_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1209,7 +1283,8 @@ (define_insn "@mve_<mve_insn>q<mve_rot>_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.f%#<V_sz_elem>\t%q0, %q1, %q2, #<rot>"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q<mve_rot>_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1223,7 +1298,8 @@ (define_insn "@mve_vcmp<mve_cmp_op>q_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcmp.f%#<V_sz_elem>	<mve_cmp_op>, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1238,7 +1314,8 @@ (define_insn "@mve_vcmp<mve_cmp_op>q_n_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcmp.f%#<V_sz_elem>	<mve_cmp_op>, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1253,8 +1330,10 @@ (define_insn "mve_vctp<MVE_vctp>q_m<MVE_vpred>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vctpt.<MVE_vctp>\t%1"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vctp<MVE_vctp>q<MVE_vpred>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")
+])
 
 ;;
 ;; [vcvtbq_f16_f32])
@@ -1268,7 +1347,8 @@ (define_insn "mve_vcvtbq_f16_f32v8hf"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtb.f16.f32\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f16_f32v8hf"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1283,7 +1363,8 @@ (define_insn "mve_vcvttq_f16_f32v8hf"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtt.f16.f32\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f16_f32v8hf"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1297,7 +1378,8 @@ (define_insn "mve_veorq_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "veor\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_veorq_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1313,7 +1395,8 @@ (define_insn "@mve_<mve_insn>q_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1331,7 +1414,8 @@ (define_insn "@mve_<mve_insn>q_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.f%#<V_sz_elem>\t%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1346,7 +1430,8 @@ (define_insn "@mve_<max_min_f_str>q_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<max_min_f_str>.f%#<V_sz_elem>	%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<max_min_f_str>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1364,7 +1449,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1384,7 +1470,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<isu>%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1400,7 +1487,8 @@ (define_insn "mve_<mve_addsubmul>q_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_addsubmul>.f%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_addsubmul>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1414,7 +1502,8 @@ (define_insn "mve_vornq_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vorn\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1428,7 +1517,8 @@ (define_insn "mve_vorrq_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vorr\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vorrq_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1444,7 +1534,8 @@ (define_insn "@mve_<mve_insn>q_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.i%#<V_sz_elem>	%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1460,7 +1551,8 @@ (define_insn "@mve_<mve_insn>q_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.s%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1476,7 +1568,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.s%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1494,7 +1587,8 @@ (define_insn "@mve_<mve_insn>q_<supf>v4si"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>32\t%Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1510,7 +1604,8 @@ (define_insn "@mve_<mve_insn>q_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1526,7 +1621,8 @@ (define_insn "@mve_<mve_insn>q_poly_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_poly_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1547,8 +1643,9 @@ (define_insn "@mve_vcmp<mve_cmp_op1>q_m_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmpt.f%#<V_sz_elem>\t<mve_cmp_op1>, %q1, %q2"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op1>q_f<mode>"))
+  (set_attr "length""8")])
+
 ;;
 ;; [vcvtaq_m_u, vcvtaq_m_s])
 ;;
@@ -1562,8 +1659,10 @@ (define_insn "mve_vcvtaq_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtat.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtaq_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
+
 ;;
 ;; [vcvtq_m_to_f_s, vcvtq_m_to_f_u])
 ;;
@@ -1577,8 +1676,9 @@ (define_insn "mve_vcvtq_m_to_f_<supf><mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtt.f%#<V_sz_elem>.<supf>%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_to_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [vqrshrnbq_n_u, vqrshrnbq_n_s]
@@ -1604,7 +1704,8 @@ (define_insn "@mve_<mve_insn>q_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<isu>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1623,7 +1724,8 @@ (define_insn "@mve_<mve_insn>q_<supf>v4si"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>32\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1639,7 +1741,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1685,7 +1788,10 @@ (define_insn "mve_vshlcq_<supf><mode>"
 		   (match_dup 4)]
 	VSHLCQ))]
  "TARGET_HAVE_MVE"
- "vshlc\t%q0, %1, %4")
+ "vshlc\t%q0, %1, %4"
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlcq_<supf><mode>"))
+  (set_attr "type" "mve_move")
+])
 
 ;;
 ;; [vabsq_m_s]
@@ -1705,7 +1811,8 @@ (define_insn "@mve_<mve_insn>q_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<isu>%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1721,7 +1828,8 @@ (define_insn "@mve_<mve_insn>q_p_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1744,7 +1852,8 @@ (define_insn "@mve_vcmp<mve_cmp_op1>q_m_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.<isu>%#<V_sz_elem>\t<mve_cmp_op1>, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op1>q_n_<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1767,7 +1876,8 @@ (define_insn "@mve_vcmp<mve_cmp_op1>q_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.<isu>%#<V_sz_elem>\t<mve_cmp_op1>, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op1>q_<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1783,7 +1893,8 @@ (define_insn "@mve_<mve_insn>q_m_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1800,7 +1911,8 @@ (define_insn "@mve_<mve_insn>q_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.s%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1819,7 +1931,8 @@ (define_insn "@mve_<mve_insn>q_p_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1838,7 +1951,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1857,7 +1971,8 @@ (define_insn "@mve_<mve_insn>q_p_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1878,7 +1993,8 @@ (define_insn "@mve_<mve_insn>q_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1894,7 +2010,8 @@ (define_insn "@mve_<mve_insn>q_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1910,7 +2027,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1933,7 +2051,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.s%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1950,7 +2069,8 @@ (define_insn "@mve_<mve_insn>q_m_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1967,7 +2087,8 @@ (define_insn "@mve_<mve_insn>q_m_r_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_r_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1983,7 +2104,8 @@ (define_insn "@mve_<mve_insn>q_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1999,7 +2121,8 @@ (define_insn "@mve_<mve_insn>q_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2015,7 +2138,8 @@ (define_insn "@mve_<mve_insn>q_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2038,7 +2162,8 @@ (define_insn "@mve_<mve_insn>q_m_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_mnemo>t.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2054,7 +2179,8 @@ (define_insn "@mve_<mve_insn>q_p_<supf>v4si"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>32\t%Q0, %R0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
 ;; [vcmlaq, vcmlaq_rot90, vcmlaq_rot180, vcmlaq_rot270])
@@ -2072,7 +2198,9 @@ (define_insn "@mve_<mve_insn>q<mve_rot>_f<mode>"
   "@
    vcmul.f%#<V_sz_elem>	%q0, %q2, %q3, #<rot>
    vcmla.f%#<V_sz_elem>	%q0, %q2, %q3, #<rot>"
-  [(set_attr "type" "mve_move")
+  [(set_attr_alternative "mve_unpredicated_insn" [(symbol_ref "CODE_FOR_mve_<mve_insn>q<mve_rot>_f<mode>")
+						  (symbol_ref "CODE_FOR_mve_<mve_insn>q<mve_rot>_f<mode>")])
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2093,7 +2221,8 @@ (define_insn "@mve_vcmp<mve_cmp_op1>q_m_n_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmpt.f%#<V_sz_elem>\t<mve_cmp_op1>, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op1>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2109,7 +2238,8 @@ (define_insn "mve_vcvtbq_m_f16_f32v8hf"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtbt.f16.f32\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f16_f32v8hf"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2125,7 +2255,8 @@ (define_insn "mve_vcvtbq_m_f32_f16v4sf"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtbt.f32.f16\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f32_f16v4sf"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2141,7 +2272,8 @@ (define_insn "mve_vcvttq_m_f16_f32v8hf"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvttt.f16.f32\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f16_f32v8hf"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2157,8 +2289,9 @@ (define_insn "mve_vcvttq_m_f32_f16v4sf"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvttt.f32.f16\t%q0, %q2"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f32_f16v4sf"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [vdupq_m_n_f])
@@ -2173,7 +2306,8 @@ (define_insn "@mve_<mve_insn>q_m_n_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2190,7 +2324,8 @@ (define_insn "@mve_<mve_insn>q_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.f%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2207,7 +2342,8 @@ (define_insn "@mve_<mve_insn>q_n_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.f%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2224,7 +2360,8 @@ (define_insn "@mve_<mve_insn>q_m_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2243,7 +2380,8 @@ (define_insn "@mve_<mve_insn>q_p_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.f%#<V_sz_elem>\t%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2262,7 +2400,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2281,7 +2420,8 @@ (define_insn "@mve_<mve_insn>q_p_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2298,7 +2438,8 @@ (define_insn "@mve_<mve_insn>q_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2319,7 +2460,8 @@ (define_insn "@mve_<mve_insn>q_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<isu>%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2335,7 +2477,8 @@ (define_insn "@mve_<mve_insn>q_m_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.i%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2352,7 +2495,8 @@ (define_insn "@mve_<mve_insn>q_m_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.i%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2368,7 +2512,8 @@ (define_insn "@mve_<mve_insn>q_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2384,7 +2529,8 @@ (define_insn "@mve_<mve_insn>q_m_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2400,7 +2546,8 @@ (define_insn "@mve_<mve_insn>q_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2416,7 +2563,8 @@ (define_insn "@mve_<mve_insn>q_m_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2435,7 +2583,8 @@ (define_insn "@mve_<mve_insn>q_p_<supf>v4si"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>32\t%Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2451,7 +2600,8 @@ (define_insn "mve_vcvtmq_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtmt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtmq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2467,7 +2617,8 @@ (define_insn "mve_vcvtpq_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtpt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtpq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2483,7 +2634,8 @@ (define_insn "mve_vcvtnq_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtnt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtnq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2500,7 +2652,8 @@ (define_insn "mve_vcvtq_m_n_from_f_<supf><mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_from_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2516,7 +2669,8 @@ (define_insn "@mve_<mve_insn>q_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2532,8 +2686,9 @@ (define_insn "mve_vcvtq_m_from_f_<supf><mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_from_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [vabavq_p_s, vabavq_p_u])
@@ -2549,7 +2704,8 @@ (define_insn "@mve_<mve_insn>q_p_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "8")])
 
 ;;
@@ -2566,8 +2722,9 @@ (define_insn "@mve_<mve_insn>q_m_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\n\t<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
-   (set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length" "8")])
 
 ;;
 ;; [vsriq_m_n_s, vsriq_m_n_u])
@@ -2583,8 +2740,9 @@ (define_insn "@mve_<mve_insn>q_m_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
-   (set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length" "8")])
 
 ;;
 ;; [vcvtq_m_n_to_f_u, vcvtq_m_n_to_f_s])
@@ -2600,7 +2758,8 @@ (define_insn "mve_vcvtq_m_n_to_f_<supf><mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtt.f%#<V_sz_elem>.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_to_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2640,7 +2799,8 @@ (define_insn "@mve_<mve_insn>q_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2659,8 +2819,9 @@ (define_insn "@mve_<mve_insn>q_m_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.i%#<V_sz_elem>	%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [vaddq_m_u, vaddq_m_s]
@@ -2678,7 +2839,8 @@ (define_insn "@mve_<mve_insn>q_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.i%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2698,7 +2860,8 @@ (define_insn "@mve_<mve_insn>q_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2715,8 +2878,9 @@ (define_insn "@mve_<mve_insn>q_m_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [vcaddq_rot90_m_u, vcaddq_rot90_m_s]
@@ -2735,7 +2899,8 @@ (define_insn "@mve_<mve_insn>q<mve_rot>_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<isu>%#<V_sz_elem>\t%q0, %q2, %q3, #<rot>"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q<mve_rot>_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2763,7 +2928,8 @@ (define_insn "@mve_<mve_insn>q_m_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2784,7 +2950,8 @@ (define_insn "@mve_<mve_insn>q_p_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2802,7 +2969,8 @@ (define_insn "@mve_<mve_insn>q_int_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_int_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2819,7 +2987,8 @@ (define_insn "mve_vornq_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vornt\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2837,7 +3006,8 @@ (define_insn "@mve_<mve_insn>q_m_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2855,7 +3025,8 @@ (define_insn "@mve_<mve_insn>q_m_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2872,7 +3043,8 @@ (define_insn "@mve_<mve_insn>q_m_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2892,7 +3064,8 @@ (define_insn "@mve_<mve_insn>q_p_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2920,7 +3093,8 @@ (define_insn "@mve_<mve_insn>q_m_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<isu>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2940,7 +3114,8 @@ (define_insn "@mve_<mve_insn>q_p_<supf>v4si"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>32\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2958,7 +3133,8 @@ (define_insn "@mve_<mve_insn>q_m_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2976,7 +3152,8 @@ (define_insn "@mve_<mve_insn>q_poly_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_poly_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2994,7 +3171,8 @@ (define_insn "@mve_<mve_insn>q_m_n_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.s%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3012,7 +3190,8 @@ (define_insn "@mve_<mve_insn>q_m_<supf><mode>"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.s%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3036,7 +3215,8 @@ (define_insn "@mve_<mve_insn>q_m_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.f%#<V_sz_elem>	%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3057,7 +3237,8 @@ (define_insn "@mve_<mve_insn>q_m_n_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.f%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3077,7 +3258,8 @@ (define_insn "@mve_<mve_insn>q_m_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3094,7 +3276,8 @@ (define_insn "@mve_<mve_insn>q_m_n_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3116,7 +3299,8 @@ (define_insn "@mve_<mve_insn>q<mve_rot>_m_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.f%#<V_sz_elem>\t%q0, %q2, %q3, #<rot>"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q<mve_rot>_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3136,7 +3320,8 @@ (define_insn "@mve_<mve_insn>q<mve_rot>_m_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.f%#<V_sz_elem>\t%q0, %q2, %q3, #<rot>"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q<mve_rot>_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3153,7 +3338,8 @@ (define_insn "mve_vornq_m_f<mode>"
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vornt\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3173,7 +3359,8 @@ (define_insn "mve_vstrbq_<supf><mode>"
    output_asm_insn("vstrb.<V_sz_elem>\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrbq_scatter_offset_s vstrbq_scatter_offset_u]
@@ -3201,7 +3388,8 @@ (define_insn "mve_vstrbq_scatter_offset_<supf><mode>_insn"
 	  VSTRBSOQ))]
   "TARGET_HAVE_MVE"
   "vstrb.<V_sz_elem>\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_scatter_offset_<supf><mode>_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_base_s vstrwq_scatter_base_u]
@@ -3223,7 +3411,8 @@ (define_insn "mve_vstrwq_scatter_base_<supf>v4si"
    output_asm_insn("vstrw.u32\t%q2, [%q0, %1]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_<supf>v4si"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrbq_gather_offset_s vldrbq_gather_offset_u]
@@ -3246,7 +3435,8 @@ (define_insn "mve_vldrbq_gather_offset_<supf><mode>"
      output_asm_insn ("vldrb.<supf><V_sz_elem>\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_gather_offset_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrbq_s vldrbq_u]
@@ -3268,7 +3458,8 @@ (define_insn "mve_vldrbq_<supf><mode>"
      output_asm_insn ("vldrb.<supf><V_sz_elem>\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_base_s vldrwq_gather_base_u]
@@ -3288,7 +3479,8 @@ (define_insn "mve_vldrwq_gather_base_<supf>v4si"
    output_asm_insn ("vldrw.u32\t%q0, [%q1, %2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_<supf>v4si"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrbq_scatter_offset_p_s vstrbq_scatter_offset_p_u]
@@ -3320,7 +3512,8 @@ (define_insn "mve_vstrbq_scatter_offset_p_<supf><mode>_insn"
 	  VSTRBSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrbt.<V_sz_elem>\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_scatter_offset_<supf><mode>_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_base_p_s vstrwq_scatter_base_p_u]
@@ -3343,7 +3536,8 @@ (define_insn "mve_vstrwq_scatter_base_p_<supf>v4si"
    output_asm_insn ("vpst\n\tvstrwt.u32\t%q2, [%q0, %1]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_<supf>v4si"))
+  (set_attr "length" "8")])
 
 (define_insn "mve_vstrbq_p_<supf><mode>"
   [(set (match_operand:<MVE_B_ELEM> 0 "mve_memory_operand" "=Ux")
@@ -3361,7 +3555,8 @@ (define_insn "mve_vstrbq_p_<supf><mode>"
    output_asm_insn ("vpst\;vstrbt.<V_sz_elem>\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrbq_gather_offset_z_s vldrbq_gather_offset_z_u]
@@ -3386,7 +3581,8 @@ (define_insn "mve_vldrbq_gather_offset_z_<supf><mode>"
      output_asm_insn ("vpst\n\tvldrbt.<supf><V_sz_elem>\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_gather_offset_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrbq_z_s vldrbq_z_u]
@@ -3409,7 +3605,8 @@ (define_insn "mve_vldrbq_z_<supf><mode>"
      output_asm_insn ("vpst\;vldrbt.<supf><V_sz_elem>\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_base_z_s vldrwq_gather_base_z_u]
@@ -3430,7 +3627,8 @@ (define_insn "mve_vldrwq_gather_base_z_<supf>v4si"
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%q1, %2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_<supf>v4si"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_f]
@@ -3449,7 +3647,8 @@ (define_insn "mve_vldrhq_fv8hf"
    output_asm_insn ("vldrh.16\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_fv8hf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_gather_offset_s vldrhq_gather_offset_u]
@@ -3472,7 +3671,8 @@ (define_insn "mve_vldrhq_gather_offset_<supf><mode>"
      output_asm_insn ("vldrh.<supf><V_sz_elem>\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_gather_offset_z_s vldrhq_gather_offset_z_u]
@@ -3497,7 +3697,8 @@ (define_insn "mve_vldrhq_gather_offset_z_<supf><mode>"
      output_asm_insn ("vpst\n\tvldrht.<supf><V_sz_elem>\t%q0, [%m1, %q2]",ops);
    return "";
 }
- [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_gather_shifted_offset_s vldrhq_gather_shifted_offset_u]
@@ -3520,7 +3721,8 @@ (define_insn "mve_vldrhq_gather_shifted_offset_<supf><mode>"
      output_asm_insn ("vldrh.<supf><V_sz_elem>\t%q0, [%m1, %q2, uxtw #1]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_gather_shifted_offset_z_s vldrhq_gather_shifted_offset_z_u]
@@ -3545,7 +3747,8 @@ (define_insn "mve_vldrhq_gather_shifted_offset_z_<supf><mode>"
      output_asm_insn ("vpst\n\tvldrht.<supf><V_sz_elem>\t%q0, [%m1, %q2, uxtw #1]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_s, vldrhq_u]
@@ -3567,7 +3770,8 @@ (define_insn "mve_vldrhq_<supf><mode>"
      output_asm_insn ("vldrh.<supf><V_sz_elem>\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_z_f]
@@ -3587,7 +3791,8 @@ (define_insn "mve_vldrhq_z_fv8hf"
    output_asm_insn ("vpst\;vldrht.16\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_fv8hf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_z_s vldrhq_z_u]
@@ -3610,7 +3815,8 @@ (define_insn "mve_vldrhq_z_<supf><mode>"
      output_asm_insn ("vpst\;vldrht.<supf><V_sz_elem>\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_f]
@@ -3629,7 +3835,8 @@ (define_insn "mve_vldrwq_fv4sf"
    output_asm_insn ("vldrw.32\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_s vldrwq_u]
@@ -3648,7 +3855,8 @@ (define_insn "mve_vldrwq_<supf>v4si"
    output_asm_insn ("vldrw.32\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_<supf>v4si"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_z_f]
@@ -3668,7 +3876,8 @@ (define_insn "mve_vldrwq_z_fv4sf"
    output_asm_insn ("vpst\;vldrwt.32\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_z_s vldrwq_z_u]
@@ -3688,7 +3897,8 @@ (define_insn "mve_vldrwq_z_<supf>v4si"
    output_asm_insn ("vpst\;vldrwt.32\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_<supf>v4si"))
+  (set_attr "length" "8")])
 
 (define_expand "@mve_vld1q_f<mode>"
   [(match_operand:MVE_0 0 "s_register_operand")
@@ -3728,7 +3938,8 @@ (define_insn "mve_vldrdq_gather_base_<supf>v2di"
    output_asm_insn ("vldrd.64\t%q0, [%q1, %2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_<supf>v2di"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrdq_gather_base_z_s vldrdq_gather_base_z_u]
@@ -3749,7 +3960,8 @@ (define_insn "mve_vldrdq_gather_base_z_<supf>v2di"
    output_asm_insn ("vpst\n\tvldrdt.u64\t%q0, [%q1, %2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_<supf>v2di"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrdq_gather_offset_s vldrdq_gather_offset_u]
@@ -3769,7 +3981,8 @@ (define_insn "mve_vldrdq_gather_offset_<supf>v2di"
   output_asm_insn ("vldrd.u64\t%q0, [%m1, %q2]",ops);
   return "";
 }
- [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_offset_<supf>v2di"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrdq_gather_offset_z_s vldrdq_gather_offset_z_u]
@@ -3790,7 +4003,8 @@ (define_insn "mve_vldrdq_gather_offset_z_<supf>v2di"
   output_asm_insn ("vpst\n\tvldrdt.u64\t%q0, [%m1, %q2]",ops);
   return "";
 }
- [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_offset_<supf>v2di"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrdq_gather_shifted_offset_s vldrdq_gather_shifted_offset_u]
@@ -3810,7 +4024,8 @@ (define_insn "mve_vldrdq_gather_shifted_offset_<supf>v2di"
    output_asm_insn ("vldrd.u64\t%q0, [%m1, %q2, uxtw #3]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_shifted_offset_<supf>v2di"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrdq_gather_shifted_offset_z_s vldrdq_gather_shifted_offset_z_u]
@@ -3831,7 +4046,8 @@ (define_insn "mve_vldrdq_gather_shifted_offset_z_<supf>v2di"
    output_asm_insn ("vpst\n\tvldrdt.u64\t%q0, [%m1, %q2, uxtw #3]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_shifted_offset_<supf>v2di"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_gather_offset_f]
@@ -3851,7 +4067,8 @@ (define_insn "mve_vldrhq_gather_offset_fv8hf"
    output_asm_insn ("vldrh.f16\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_fv8hf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_gather_offset_z_f]
@@ -3873,7 +4090,8 @@ (define_insn "mve_vldrhq_gather_offset_z_fv8hf"
    output_asm_insn ("vpst\n\tvldrht.f16\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_fv8hf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_gather_shifted_offset_f]
@@ -3893,7 +4111,8 @@ (define_insn "mve_vldrhq_gather_shifted_offset_fv8hf"
    output_asm_insn ("vldrh.f16\t%q0, [%m1, %q2, uxtw #1]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_fv8hf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_gather_shifted_offset_z_f]
@@ -3915,7 +4134,8 @@ (define_insn "mve_vldrhq_gather_shifted_offset_z_fv8hf"
    output_asm_insn ("vpst\n\tvldrht.f16\t%q0, [%m1, %q2, uxtw #1]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_fv8hf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_base_f]
@@ -3935,7 +4155,8 @@ (define_insn "mve_vldrwq_gather_base_fv4sf"
    output_asm_insn ("vldrw.u32\t%q0, [%q1, %2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_base_z_f]
@@ -3956,7 +4177,8 @@ (define_insn "mve_vldrwq_gather_base_z_fv4sf"
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%q1, %2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_offset_f]
@@ -3976,7 +4198,8 @@ (define_insn "mve_vldrwq_gather_offset_fv4sf"
    output_asm_insn ("vldrw.u32\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_offset_s vldrwq_gather_offset_u]
@@ -3996,7 +4219,8 @@ (define_insn "mve_vldrwq_gather_offset_<supf>v4si"
    output_asm_insn ("vldrw.u32\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_<supf>v4si"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_offset_z_f]
@@ -4018,7 +4242,8 @@ (define_insn "mve_vldrwq_gather_offset_z_fv4sf"
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_offset_z_s vldrwq_gather_offset_z_u]
@@ -4040,7 +4265,8 @@ (define_insn "mve_vldrwq_gather_offset_z_<supf>v4si"
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_<supf>v4si"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_shifted_offset_f]
@@ -4060,7 +4286,8 @@ (define_insn "mve_vldrwq_gather_shifted_offset_fv4sf"
    output_asm_insn ("vldrw.u32\t%q0, [%m1, %q2, uxtw #2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_shifted_offset_s vldrwq_gather_shifted_offset_u]
@@ -4080,7 +4307,8 @@ (define_insn "mve_vldrwq_gather_shifted_offset_<supf>v4si"
    output_asm_insn ("vldrw.u32\t%q0, [%m1, %q2, uxtw #2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_<supf>v4si"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_shifted_offset_z_f]
@@ -4102,7 +4330,8 @@ (define_insn "mve_vldrwq_gather_shifted_offset_z_fv4sf"
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2, uxtw #2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_shifted_offset_z_s vldrwq_gather_shifted_offset_z_u]
@@ -4124,7 +4353,8 @@ (define_insn "mve_vldrwq_gather_shifted_offset_z_<supf>v4si"
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2, uxtw #2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_<supf>v4si"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_f]
@@ -4143,7 +4373,8 @@ (define_insn "mve_vstrhq_fv8hf"
    output_asm_insn ("vstrh.16\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_fv8hf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_p_f]
@@ -4164,7 +4395,8 @@ (define_insn "mve_vstrhq_p_fv8hf"
    output_asm_insn ("vpst\;vstrht.16\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_fv8hf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_p_s vstrhq_p_u]
@@ -4186,7 +4418,8 @@ (define_insn "mve_vstrhq_p_<supf><mode>"
    output_asm_insn ("vpst\;vstrht.<V_sz_elem>\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_scatter_offset_p_s vstrhq_scatter_offset_p_u]
@@ -4218,7 +4451,8 @@ (define_insn "mve_vstrhq_scatter_offset_p_<supf><mode>_insn"
 	  VSTRHSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrht.<V_sz_elem>\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset_<supf><mode>_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_scatter_offset_s vstrhq_scatter_offset_u]
@@ -4246,7 +4480,8 @@ (define_insn "mve_vstrhq_scatter_offset_<supf><mode>_insn"
 	  VSTRHSOQ))]
   "TARGET_HAVE_MVE"
   "vstrh.<V_sz_elem>\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset_<supf><mode>_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_scatter_shifted_offset_p_s vstrhq_scatter_shifted_offset_p_u]
@@ -4278,7 +4513,8 @@ (define_insn "mve_vstrhq_scatter_shifted_offset_p_<supf><mode>_insn"
 	  VSTRHSSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrht.<V_sz_elem>\t%q2, [%0, %q1, uxtw #1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset_<supf><mode>_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_scatter_shifted_offset_s vstrhq_scatter_shifted_offset_u]
@@ -4307,7 +4543,8 @@ (define_insn "mve_vstrhq_scatter_shifted_offset_<supf><mode>_insn"
 	  VSTRHSSOQ))]
   "TARGET_HAVE_MVE"
   "vstrh.<V_sz_elem>\t%q2, [%0, %q1, uxtw #1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset_<supf><mode>_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_s, vstrhq_u]
@@ -4326,7 +4563,8 @@ (define_insn "mve_vstrhq_<supf><mode>"
    output_asm_insn ("vstrh.<V_sz_elem>\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_f]
@@ -4345,7 +4583,8 @@ (define_insn "mve_vstrwq_fv4sf"
    output_asm_insn ("vstrw.32\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_p_f]
@@ -4366,7 +4605,8 @@ (define_insn "mve_vstrwq_p_fv4sf"
    output_asm_insn ("vpst\;vstrwt.32\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_p_s vstrwq_p_u]
@@ -4387,7 +4627,8 @@ (define_insn "mve_vstrwq_p_<supf>v4si"
    output_asm_insn ("vpst\;vstrwt.32\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_<supf>v4si"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_s vstrwq_u]
@@ -4406,7 +4647,8 @@ (define_insn "mve_vstrwq_<supf>v4si"
    output_asm_insn ("vstrw.32\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_<supf>v4si"))
+  (set_attr "length" "4")])
 
 (define_expand "@mve_vst1q_f<mode>"
   [(match_operand:<MVE_CNVT> 0 "mve_memory_operand")
@@ -4449,7 +4691,8 @@ (define_insn "mve_vstrdq_scatter_base_p_<supf>v2di"
    output_asm_insn ("vpst\;\tvstrdt.u64\t%q2, [%q0, %1]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_<supf>v2di"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrdq_scatter_base_s vstrdq_scatter_base_u]
@@ -4471,7 +4714,8 @@ (define_insn "mve_vstrdq_scatter_base_<supf>v2di"
    output_asm_insn ("vstrd.u64\t%q2, [%q0, %1]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_<supf>v2di"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrdq_scatter_offset_p_s vstrdq_scatter_offset_p_u]
@@ -4502,7 +4746,8 @@ (define_insn "mve_vstrdq_scatter_offset_p_<supf>v2di_insn"
 	  VSTRDSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrdt.64\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_offset_<supf>v2di_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrdq_scatter_offset_s vstrdq_scatter_offset_u]
@@ -4530,7 +4775,8 @@ (define_insn "mve_vstrdq_scatter_offset_<supf>v2di_insn"
 	  VSTRDSOQ))]
   "TARGET_HAVE_MVE"
   "vstrd.64\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_offset_<supf>v2di_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrdq_scatter_shifted_offset_p_s vstrdq_scatter_shifted_offset_p_u]
@@ -4562,7 +4808,8 @@ (define_insn "mve_vstrdq_scatter_shifted_offset_p_<supf>v2di_insn"
 	  VSTRDSSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrdt.64\t%q2, [%0, %q1, uxtw #3]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_shifted_offset_<supf>v2di_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrdq_scatter_shifted_offset_s vstrdq_scatter_shifted_offset_u]
@@ -4591,7 +4838,8 @@ (define_insn "mve_vstrdq_scatter_shifted_offset_<supf>v2di_insn"
 	  VSTRDSSOQ))]
   "TARGET_HAVE_MVE"
   "vstrd.64\t%q2, [%0, %q1, uxtw #3]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_shifted_offset_<supf>v2di_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_scatter_offset_f]
@@ -4619,7 +4867,8 @@ (define_insn "mve_vstrhq_scatter_offset_fv8hf_insn"
 	  VSTRHQSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vstrh.16\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset_fv8hf_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_scatter_offset_p_f]
@@ -4650,7 +4899,8 @@ (define_insn "mve_vstrhq_scatter_offset_p_fv8hf_insn"
 	  VSTRHQSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vstrht.16\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset_fv8hf_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_scatter_shifted_offset_f]
@@ -4678,7 +4928,8 @@ (define_insn "mve_vstrhq_scatter_shifted_offset_fv8hf_insn"
 	  VSTRHQSSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vstrh.16\t%q2, [%0, %q1, uxtw #1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset_fv8hf_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_scatter_shifted_offset_p_f]
@@ -4710,7 +4961,8 @@ (define_insn "mve_vstrhq_scatter_shifted_offset_p_fv8hf_insn"
 	  VSTRHQSSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vstrht.16\t%q2, [%0, %q1, uxtw #1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset_fv8hf_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_base_f]
@@ -4732,7 +4984,8 @@ (define_insn "mve_vstrwq_scatter_base_fv4sf"
    output_asm_insn ("vstrw.u32\t%q2, [%q0, %1]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_base_p_f]
@@ -4755,7 +5008,8 @@ (define_insn "mve_vstrwq_scatter_base_p_fv4sf"
    output_asm_insn ("vpst\n\tvstrwt.u32\t%q2, [%q0, %1]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_offset_f]
@@ -4783,7 +5037,8 @@ (define_insn "mve_vstrwq_scatter_offset_fv4sf_insn"
 	  VSTRWQSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vstrw.32\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_fv4sf_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_offset_p_f]
@@ -4814,7 +5069,8 @@ (define_insn "mve_vstrwq_scatter_offset_p_fv4sf_insn"
 	  VSTRWQSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vstrwt.32\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_fv4sf_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_offset_s vstrwq_scatter_offset_u]
@@ -4845,7 +5101,8 @@ (define_insn "mve_vstrwq_scatter_offset_p_<supf>v4si_insn"
 	  VSTRWSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrwt.32\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_<supf>v4si_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_offset_s vstrwq_scatter_offset_u]
@@ -4873,7 +5130,8 @@ (define_insn "mve_vstrwq_scatter_offset_<supf>v4si_insn"
 	  VSTRWSOQ))]
   "TARGET_HAVE_MVE"
   "vstrw.32\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_<supf>v4si_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_shifted_offset_f]
@@ -4901,7 +5159,8 @@ (define_insn "mve_vstrwq_scatter_shifted_offset_fv4sf_insn"
 	 VSTRWQSSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vstrw.32\t%q2, [%0, %q1, uxtw #2]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_fv4sf_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_shifted_offset_p_f]
@@ -4933,7 +5192,8 @@ (define_insn "mve_vstrwq_scatter_shifted_offset_p_fv4sf_insn"
 	  VSTRWQSSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vstrwt.32\t%q2, [%0, %q1, uxtw #2]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_fv4sf_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_shifted_offset_p_s vstrwq_scatter_shifted_offset_p_u]
@@ -4965,7 +5225,8 @@ (define_insn "mve_vstrwq_scatter_shifted_offset_p_<supf>v4si_insn"
 	  VSTRWSSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrwt.32\t%q2, [%0, %q1, uxtw #2]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_<supf>v4si_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_shifted_offset_s vstrwq_scatter_shifted_offset_u]
@@ -4994,7 +5255,8 @@ (define_insn "mve_vstrwq_scatter_shifted_offset_<supf>v4si_insn"
 	  VSTRWSSOQ))]
   "TARGET_HAVE_MVE"
   "vstrw.32\t%q2, [%0, %q1, uxtw #2]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_<supf>v4si_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vidupq_n_u])
@@ -5062,7 +5324,8 @@ (define_insn "mve_vidupq_m_wb_u<mode>_insn"
 		(match_operand:SI 6 "immediate_operand" "i")))]
  "TARGET_HAVE_MVE"
  "vpst\;\tvidupt.u%#<V_sz_elem>\t%q0, %2, %4"
- [(set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vidupq_u<mode>_insn"))
+  (set_attr "length""8")])
 
 ;;
 ;; [vddupq_n_u])
@@ -5130,7 +5393,8 @@ (define_insn "mve_vddupq_m_wb_u<mode>_insn"
 		 (match_operand:SI 6 "immediate_operand" "i")))]
  "TARGET_HAVE_MVE"
  "vpst\;vddupt.u%#<V_sz_elem>\t%q0, %2, %4"
- [(set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vddupq_u<mode>_insn"))
+  (set_attr "length""8")])
 
 ;;
 ;; [vdwdupq_n_u])
@@ -5246,8 +5510,9 @@ (define_insn "mve_vdwdupq_m_wb_u<mode>_insn"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vdwdupt.u%#<V_sz_elem>\t%q2, %3, %R4, %5"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vdwdupq_wb_u<mode>_insn"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [viwdupq_n_u])
@@ -5363,7 +5628,8 @@ (define_insn "mve_viwdupq_m_wb_u<mode>_insn"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;\tviwdupt.u%#<V_sz_elem>\t%q2, %3, %R4, %5"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_viwdupq_wb_u<mode>_insn"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5389,7 +5655,8 @@ (define_insn "mve_vstrwq_scatter_base_wb_<supf>v4si"
    output_asm_insn ("vstrw.u32\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_<supf>v4si"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_base_wb_p_s vstrwq_scatter_base_wb_p_u]
@@ -5415,7 +5682,8 @@ (define_insn "mve_vstrwq_scatter_base_wb_p_<supf>v4si"
    output_asm_insn ("vpst\;\tvstrwt.u32\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_<supf>v4si"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_base_wb_f]
@@ -5440,7 +5708,8 @@ (define_insn "mve_vstrwq_scatter_base_wb_fv4sf"
    output_asm_insn ("vstrw.u32\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_base_wb_p_f]
@@ -5466,7 +5735,8 @@ (define_insn "mve_vstrwq_scatter_base_wb_p_fv4sf"
    output_asm_insn ("vpst\;vstrwt.u32\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrdq_scatter_base_wb_s vstrdq_scatter_base_wb_u]
@@ -5491,7 +5761,8 @@ (define_insn "mve_vstrdq_scatter_base_wb_<supf>v2di"
    output_asm_insn ("vstrd.u64\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_wb_<supf>v2di"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrdq_scatter_base_wb_p_s vstrdq_scatter_base_wb_p_u]
@@ -5517,7 +5788,8 @@ (define_insn "mve_vstrdq_scatter_base_wb_p_<supf>v2di"
    output_asm_insn ("vpst\;vstrdt.u64\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_wb_<supf>v2di"))
+  (set_attr "length" "8")])
 
 (define_expand "mve_vldrwq_gather_base_wb_<supf>v4si"
   [(match_operand:V4SI 0 "s_register_operand")
@@ -5569,7 +5841,8 @@ (define_insn "mve_vldrwq_gather_base_wb_<supf>v4si_insn"
    output_asm_insn ("vldrw.u32\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_<supf>v4si_insn"))
+  (set_attr "length" "4")])
 
 (define_expand "mve_vldrwq_gather_base_wb_z_<supf>v4si"
   [(match_operand:V4SI 0 "s_register_operand")
@@ -5625,7 +5898,8 @@ (define_insn "mve_vldrwq_gather_base_wb_z_<supf>v4si_insn"
    output_asm_insn ("vpst\;vldrwt.u32\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_<supf>v4si_insn"))
+  (set_attr "length" "8")])
 
 (define_expand "mve_vldrwq_gather_base_wb_fv4sf"
   [(match_operand:V4SI 0 "s_register_operand")
@@ -5677,7 +5951,8 @@ (define_insn "mve_vldrwq_gather_base_wb_fv4sf_insn"
    output_asm_insn ("vldrw.u32\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_fv4sf_insn"))
+  (set_attr "length" "4")])
 
 (define_expand "mve_vldrwq_gather_base_wb_z_fv4sf"
   [(match_operand:V4SI 0 "s_register_operand")
@@ -5734,7 +6009,8 @@ (define_insn "mve_vldrwq_gather_base_wb_z_fv4sf_insn"
    output_asm_insn ("vpst\;vldrwt.u32\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_fv4sf_insn"))
+  (set_attr "length" "8")])
 
 (define_expand "mve_vldrdq_gather_base_wb_<supf>v2di"
   [(match_operand:V2DI 0 "s_register_operand")
@@ -5787,7 +6063,8 @@ (define_insn "mve_vldrdq_gather_base_wb_<supf>v2di_insn"
    output_asm_insn ("vldrd.64\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_wb_<supf>v2di_insn"))
+  (set_attr "length" "4")])
 
 (define_expand "mve_vldrdq_gather_base_wb_z_<supf>v2di"
   [(match_operand:V2DI 0 "s_register_operand")
@@ -5826,7 +6103,7 @@ (define_insn "get_fpscr_nzcvqc"
    (unspec_volatile:SI [(reg:SI VFPCC_REGNUM)] UNSPEC_GET_FPSCR_NZCVQC))]
  "TARGET_HAVE_MVE"
  "vmrs\\t%0, FPSCR_nzcvqc"
-  [(set_attr "type" "mve_move")])
+ [(set_attr "type" "mve_move")])
 
 (define_insn "set_fpscr_nzcvqc"
  [(set (reg:SI VFPCC_REGNUM)
@@ -5834,7 +6111,7 @@ (define_insn "set_fpscr_nzcvqc"
     VUNSPEC_SET_FPSCR_NZCVQC))]
  "TARGET_HAVE_MVE"
  "vmsr\\tFPSCR_nzcvqc, %0"
-  [(set_attr "type" "mve_move")])
+ [(set_attr "type" "mve_move")])
 
 ;;
 ;; [vldrdq_gather_base_wb_z_s vldrdq_gather_base_wb_z_u]
@@ -5859,7 +6136,8 @@ (define_insn "mve_vldrdq_gather_base_wb_z_<supf>v2di_insn"
    output_asm_insn ("vpst\;vldrdt.u64\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_wb_<supf>v2di_insn"))
+  (set_attr "length" "8")])
 ;;
 ;; [vadciq_m_s, vadciq_m_u])
 ;;
@@ -5876,7 +6154,8 @@ (define_insn "mve_vadciq_m_<supf>v4si"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vadcit.i32\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadciq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "8")])
 
 ;;
@@ -5893,7 +6172,8 @@ (define_insn "mve_vadciq_<supf>v4si"
   ]
   "TARGET_HAVE_MVE"
   "vadci.i32\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadciq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "4")])
 
 ;;
@@ -5912,7 +6192,8 @@ (define_insn "mve_vadcq_m_<supf>v4si"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vadct.i32\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadcq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "8")])
 
 ;;
@@ -5929,7 +6210,8 @@ (define_insn "mve_vadcq_<supf>v4si"
   ]
   "TARGET_HAVE_MVE"
   "vadc.i32\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadcq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "4")
    (set_attr "conds" "set")])
 
@@ -5949,7 +6231,8 @@ (define_insn "mve_vsbciq_m_<supf>v4si"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vsbcit.i32\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbciq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "8")])
 
 ;;
@@ -5966,7 +6249,8 @@ (define_insn "mve_vsbciq_<supf>v4si"
   ]
   "TARGET_HAVE_MVE"
   "vsbci.i32\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbciq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "4")])
 
 ;;
@@ -5985,7 +6269,8 @@ (define_insn "mve_vsbcq_m_<supf>v4si"
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vsbct.i32\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbcq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "8")])
 
 ;;
@@ -6002,7 +6287,8 @@ (define_insn "mve_vsbcq_<supf>v4si"
   ]
   "TARGET_HAVE_MVE"
   "vsbc.i32\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbcq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "4")])
 
 ;;
@@ -6031,7 +6317,7 @@ (define_insn "mve_vst2q<mode>"
 		    "vst21.<V_sz_elem>\t{%q0, %q1}, %3", ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set_attr "length" "8")])
 
 ;;
 ;; [vld2q])
@@ -6059,7 +6345,7 @@ (define_insn "mve_vld2q<mode>"
 		    "vld21.<V_sz_elem>\t{%q0, %q1}, %3", ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set_attr "length" "8")])
 
 ;;
 ;; [vld4q])
@@ -6402,7 +6688,8 @@ (define_insn "mve_vshlcq_m_<supf><mode>"
  ]
  "TARGET_HAVE_MVE"
  "vpst\;vshlct\t%q0, %1, %4"
- [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlcq_<supf><mode>"))
+  (set_attr "type" "mve_move")
   (set_attr "length" "8")])
 
 ;; CDE instructions on MVE registers.
@@ -6414,7 +6701,8 @@ (define_insn "arm_vcx1qv16qi"
 	 UNSPEC_VCDE))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx1\\tp%c1, %q0, #%c2"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx1qv16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx1qav16qi"
@@ -6425,7 +6713,8 @@ (define_insn "arm_vcx1qav16qi"
 	 UNSPEC_VCDEA))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx1a\\tp%c1, %q0, #%c3"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx1qav16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx2qv16qi"
@@ -6436,7 +6725,8 @@ (define_insn "arm_vcx2qv16qi"
 	 UNSPEC_VCDE))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx2\\tp%c1, %q0, %q2, #%c3"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx2qv16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx2qav16qi"
@@ -6448,7 +6738,8 @@ (define_insn "arm_vcx2qav16qi"
 	 UNSPEC_VCDEA))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx2a\\tp%c1, %q0, %q3, #%c4"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx2qav16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx3qv16qi"
@@ -6460,7 +6751,8 @@ (define_insn "arm_vcx3qv16qi"
 	 UNSPEC_VCDE))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx3\\tp%c1, %q0, %q2, %q3, #%c4"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx3qv16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx3qav16qi"
@@ -6473,7 +6765,8 @@ (define_insn "arm_vcx3qav16qi"
 	 UNSPEC_VCDEA))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx3a\\tp%c1, %q0, %q3, %q4, #%c5"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx3qav16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx1q<a>_p_v16qi"
@@ -6485,7 +6778,8 @@ (define_insn "arm_vcx1q<a>_p_v16qi"
 	 CDE_VCX))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vpst\;vcx1<a>t\\tp%c1, %q0, #%c3"
-  [(set_attr "type" "coproc")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx1q<a>v16qi"))
+  (set_attr "type" "coproc")
    (set_attr "length" "8")]
 )
 
@@ -6499,7 +6793,8 @@ (define_insn "arm_vcx2q<a>_p_v16qi"
 	 CDE_VCX))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vpst\;vcx2<a>t\\tp%c1, %q0, %q3, #%c4"
-  [(set_attr "type" "coproc")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx2q<a>v16qi"))
+  (set_attr "type" "coproc")
    (set_attr "length" "8")]
 )
 
@@ -6514,11 +6809,12 @@ (define_insn "arm_vcx3q<a>_p_v16qi"
 	 CDE_VCX))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vpst\;vcx3<a>t\\tp%c1, %q0, %q3, %q4, #%c5"
-  [(set_attr "type" "coproc")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx3q<a>v16qi"))
+  (set_attr "type" "coproc")
    (set_attr "length" "8")]
 )
 
-(define_insn "*movmisalign<mode>_mve_store"
+(define_insn "movmisalign<mode>_mve_store"
   [(set (match_operand:MVE_VLD_ST 0 "mve_memory_operand"	     "=Ux")
 	(unspec:MVE_VLD_ST [(match_operand:MVE_VLD_ST 1 "s_register_operand" " w")]
 	 UNSPEC_MISALIGNED_ACCESS))]
@@ -6526,11 +6822,12 @@ (define_insn "*movmisalign<mode>_mve_store"
     || (TARGET_HAVE_MVE_FLOAT && VALID_MVE_SF_MODE (<MODE>mode)))
    && !BYTES_BIG_ENDIAN && unaligned_access"
   "vstr<V_sz_elem1>.<V_sz_elem>\t%q1, %E0"
-  [(set_attr "type" "mve_store")]
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_movmisalign<mode>_mve_store"))
+   (set_attr "type" "mve_store")]
 )
 
 
-(define_insn "*movmisalign<mode>_mve_load"
+(define_insn "movmisalign<mode>_mve_load"
   [(set (match_operand:MVE_VLD_ST 0 "s_register_operand"				 "=w")
 	(unspec:MVE_VLD_ST [(match_operand:MVE_VLD_ST 1 "mve_memory_operand" " Ux")]
 	 UNSPEC_MISALIGNED_ACCESS))]
@@ -6538,7 +6835,8 @@ (define_insn "*movmisalign<mode>_mve_load"
     || (TARGET_HAVE_MVE_FLOAT && VALID_MVE_SF_MODE (<MODE>mode)))
    && !BYTES_BIG_ENDIAN && unaligned_access"
   "vldr<V_sz_elem1>.<V_sz_elem>\t%q0, %E1"
-  [(set_attr "type" "mve_load")]
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_movmisalign<mode>_mve_load"))
+   (set_attr "type" "mve_load")]
 )
 
 ;; Expander for VxBI moves
@@ -6620,3 +6918,40 @@ (define_expand "@arm_mve_reinterpret<mode>"
       }
   }
 )
+
+;; Originally expanded by 'predicated_doloop_end'.
+;; In the rare situation where the branch is too far, we do also need to
+;; revert FPSCR.LTPSIZE back to 0x100 after the last iteration.
+(define_insn "*predicated_doloop_end_internal"
+  [(set (pc)
+	(if_then_else
+	   (ge (plus:SI (reg:SI LR_REGNUM)
+			(match_operand:SI 0 "const_int_operand" ""))
+		(const_int 0))
+	 (label_ref (match_operand 1 "" ""))
+	 (pc)))
+   (set (reg:SI LR_REGNUM)
+	(plus:SI (reg:SI LR_REGNUM) (match_dup 0)))
+   (clobber (reg:CC CC_REGNUM))]
+  "TARGET_32BIT && TARGET_HAVE_LOB && TARGET_HAVE_MVE && TARGET_THUMB2"
+  {
+    if (get_attr_length (insn) == 4)
+      return "letp\t%|lr, %l1";
+    else
+      return "subs\t%|lr, #%n0\n\tbgt\t%l1\n\tlctp";
+  }
+  [(set (attr "length")
+	(if_then_else
+	   (ltu (minus (pc) (match_dup 1)) (const_int 1024))
+	    (const_int 4)
+	    (const_int 6)))
+   (set_attr "type" "branch")])
+
+(define_insn "dlstp<mode1>_insn"
+  [
+    (set (reg:SI LR_REGNUM)
+	 (unspec:SI [(match_operand:SI 0 "s_register_operand" "r")]
+	  DLSTP))
+  ]
+  "TARGET_32BIT && TARGET_HAVE_LOB && TARGET_HAVE_MVE && TARGET_THUMB2"
+  "dlstp.<mode1>\t%|lr, %0")
diff --git a/gcc/config/arm/vec-common.md b/gcc/config/arm/vec-common.md
index 9af8429968d..74871cb984b 100644
--- a/gcc/config/arm/vec-common.md
+++ b/gcc/config/arm/vec-common.md
@@ -366,7 +366,8 @@ (define_insn "@mve_<mve_insn>q_<supf><mode>"
   "@
    <mve_insn>.<supf>%#<V_sz_elem>\t%<V_reg>0, %<V_reg>1, %<V_reg>2
    * return neon_output_shift_immediate (\"vshl\", 'i', &operands[2], <MODE>mode, VALID_NEON_QREG_MODE (<MODE>mode), true);"
-  [(set_attr "type" "neon_shift_reg<q>, neon_shift_imm<q>")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "neon_shift_reg<q>, neon_shift_imm<q>")]
 )
 
 (define_expand "vashl<mode>3"


* Re: [PATCH 1/2] arm: Add define_attr to create a mapping between MVE predicated and unpredicated insns
  2023-11-06 11:20 Stamatis Markianos-Wright
@ 2023-12-12 10:33 ` Richard Earnshaw
  0 siblings, 0 replies; 7+ messages in thread
From: Richard Earnshaw @ 2023-12-12 10:33 UTC (permalink / raw)
  To: Stamatis Markianos-Wright, gcc-patches
  Cc: Kyrylo Tkachov, Richard Earnshaw, richard.sandiford, ramana.gcc



On 06/11/2023 11:20, Stamatis Markianos-Wright wrote:
> Patch has already been approved at:
> 
> https://gcc.gnu.org/pipermail/gcc-patches/2023-September/630326.html
> 
> 
> ... But I'm sending this again for archiving on the list after rebasing

A couple of minor nits:

1)

+#define MVE_VPT_PREDICABLE_INSN_P(INSN)					\
+  (recog_memoized (INSN) >= 0						\
+  && get_attr_mve_unpredicated_insn (INSN) != 0)			\

I think it's better to write "!= CODE_FOR_nothing".

+(define_attr "mve_unpredicated_insn" "" (const_int 0))
+

And the default value here should similarly be 'symbol_ref 
"CODE_FOR_nothing"'.

So that the style matches the symbol refs elsewhere.
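
i.e. something along these lines (untested sketch):

#define MVE_VPT_PREDICABLE_INSN_P(INSN)					\
  (recog_memoized (INSN) >= 0						\
   && get_attr_mve_unpredicated_insn (INSN) != CODE_FOR_nothing)

(define_attr "mve_unpredicated_insn" "" (symbol_ref "CODE_FOR_nothing"))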


2)
+(define_insn "*predicated_doloop_end_internal"
+  [(set (pc)
+	(if_then_else
+	   (ge (plus:SI (reg:SI LR_REGNUM)
+			(match_operand:SI 0 "const_int_operand" ""))
+		(const_int 0))
+	 (label_ref (match_operand 1 "" ""))
+	 (pc)))
+   (set (reg:SI LR_REGNUM)
+	(plus:SI (reg:SI LR_REGNUM) (match_dup 0)))
+   (clobber (reg:CC CC_REGNUM))]
+  "TARGET_32BIT && TARGET_HAVE_LOB && TARGET_HAVE_MVE && TARGET_THUMB2"

TARGET_THUMB2 => TARGET_32BIT, so the first test is redundant.  In fact, 
given that TARGET_HAVE_LOB => armv8.1-m.main => thumb2, why do we need 
either?

So
	TARGET_HAVE_LOB && TARGET_HAVE_MVE
should be sufficient.


+(define_insn "dlstp<mode1>_insn"
+  [
+    (set (reg:SI LR_REGNUM)
+	 (unspec:SI [(match_operand:SI 0 "s_register_operand" "r")]
+	  DLSTP))
+  ]
+  "TARGET_32BIT && TARGET_HAVE_LOB && TARGET_HAVE_MVE && TARGET_THUMB2"

Same here.
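
i.e. this second one would then just be (sketch only, condition simplified
as above, template taken from the patch):

(define_insn "dlstp<mode1>_insn"
  [(set (reg:SI LR_REGNUM)
	(unspec:SI [(match_operand:SI 0 "s_register_operand" "r")]
		   DLSTP))]
  "TARGET_HAVE_LOB && TARGET_HAVE_MVE"
  "dlstp.<mode1>\t%|lr, %0")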

Otherwise, OK.

R.


* [PATCH 1/2] arm: Add define_attr to create a mapping between MVE predicated and unpredicated insns
@ 2023-11-06 11:20 Stamatis Markianos-Wright
  2023-12-12 10:33 ` Richard Earnshaw
  0 siblings, 1 reply; 7+ messages in thread
From: Stamatis Markianos-Wright @ 2023-11-06 11:20 UTC (permalink / raw)
  To: gcc-patches
  Cc: Kyrylo Tkachov, Richard Earnshaw, richard.sandiford, ramana.gcc

[-- Attachment #1: Type: text/plain, Size: 180 bytes --]

Patch has already been approved at:

https://gcc.gnu.org/pipermail/gcc-patches/2023-September/630326.html


... But I'm sending this again for archiving on the list after rebasing

[-- Attachment #2: 1.patch --]
[-- Type: text/x-patch, Size: 127650 bytes --]

commit 5919a33d0280d35b0ebcbc07f10b2a09461b1508
Author: Stam Markianos-Wright <stam.markianos-wright@arm.com>
Date:   Tue Oct 18 17:42:56 2022 +0100

    arm: Add define_attr to create a mapping between MVE predicated and unpredicated insns
    
    I'd like to submit two patches that add support for Arm's MVE
    Tail Predicated Low Overhead Loop feature.
    
    --- Introduction ---
    
    The M-class Arm-ARM:
    https://developer.arm.com/documentation/ddi0553/bu/?lang=en
    Section B5.5.1 "Loop tail predication" describes the feature
    we are adding support for with this patch (although
    we only add codegen for DLSTP/LETP instruction loops).
    
    Previously with commit d2ed233cb94 we'd added support for
    non-MVE DLS/LE loops through the loop-doloop pass, which, given
    a standard MVE loop like:
    
    ```
    void  __attribute__ ((noinline)) test (int16_t *a, int16_t *b, int16_t *c, int n)
    {
      while (n > 0)
        {
          mve_pred16_t p = vctp16q (n);
          int16x8_t va = vldrhq_z_s16 (a, p);
          int16x8_t vb = vldrhq_z_s16 (b, p);
          int16x8_t vc = vaddq_x_s16 (va, vb, p);
          vstrhq_p_s16 (c, vc, p);
          c+=8;
          a+=8;
          b+=8;
          n-=8;
        }
    }
    ```
    .. would output:
    
    ```
            <pre-calculate the number of iterations and place it into lr>
            dls     lr, lr
    .L3:
            vctp.16 r3
            vmrs    ip, P0  @ movhi
            sxth    ip, ip
            vmsr     P0, ip @ movhi
            mov     r4, r0
            vpst
            vldrht.16       q2, [r4]
            mov     r4, r1
            vmov    q3, q0
            vpst
            vldrht.16       q1, [r4]
            mov     r4, r2
            vpst
            vaddt.i16       q3, q2, q1
            subs    r3, r3, #8
            vpst
            vstrht.16       q3, [r4]
            adds    r0, r0, #16
            adds    r1, r1, #16
            adds    r2, r2, #16
            le      lr, .L3
    ```
    
    where the LE instruction will decrement LR by 1, compare and
    branch if needed.
    
    (there are also other inefficiencies with the above code, like the
    pointless vmrs/sxth/vmsr on the VPR and the adds not being merged
    into the vldrht/vstrht as a #16 offsets and some random movs!
    But that's different problems...)
    
    The MVE version is similar, except that:
    * Instead of DLS/LE the instructions are DLSTP/LETP.
    * Instead of pre-calculating the number of iterations of the
      loop, we place the number of elements to be processed by the
      loop into LR.
    * Instead of decrementing the LR by one, LETP will decrement it
      by FPSCR.LTPSIZE, which is the number of elements being
      processed in each iteration: 16 for 8-bit elements, 8 for 16-bit
      elements, etc.
    * On the final iteration, automatic Loop Tail Predication is
      performed, as if the instructions within the loop had been VPT
      predicated with a VCTP generating the VPR predicate in every
      loop iteration.
    
    The dlstp/letp loop now looks like:
    
    ```
            <place n into r3>
            dlstp.16        lr, r3
    .L14:
            mov     r3, r0
            vldrh.16        q3, [r3]
            mov     r3, r1
            vldrh.16        q2, [r3]
            mov     r3, r2
            vadd.i16  q3, q3, q2
            adds    r0, r0, #16
            vstrh.16        q3, [r3]
            adds    r1, r1, #16
            adds    r2, r2, #16
            letp    lr, .L14
    
    ```
    
    Since the loop tail predication is automatic, we have eliminated
    the VCTP that had been specified by the user in the intrinsic
    and converted the VPT-predicated instructions into their
    unpredicated equivalents (which also saves us from VPST insns).
    
    The LETP instruction here decrements LR by 8 in each iteration.
    
    --- This 1/2 patch ---
    
    This first patch lays some groundwork by adding an attribute to
    md patterns, and then the second patch contains the functional
    changes.
    
    One major difficulty in implementing MVE Tail-Predicated Low
    Overhead Loops was the need to transform VPT-predicated insns
    in the insn chain into their unpredicated equivalents, like:
    `mve_vldrbq_z_<supf><mode> -> mve_vldrbq_<supf><mode>`.
    
    This requires us to have a deterministic link between two
    different patterns in mve.md -- this _could_ be done by
    re-ordering the entirety of mve.md such that the patterns are
    at some constant icode proximity (e.g. having the _z immediately
    after the unpredicated version would mean that to map from the
    former to the latter you could use icode-1), but that is a very
    messy solution that would lead to complex unknown dependencies
    between the ordering of patterns.
    
    This patch provides an alternative way of doing that: using an insn
    attribute to encode the icode of the unpredicated instruction.
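
    As a rough illustration (a sketch only, not part of this patch; the
    functional changes live in patch 2/2), a pass holding some recognized
    insn `insn' can then recover the unpredicated twin through the new
    arm.h helpers:

    ```
    /* Sketch: `insn' is assumed to be any recognized rtx_insn in the loop.  */
    if (MVE_VPT_PREDICATED_INSN_P (insn))
      {
	/* The attribute stores the CODE_FOR_* icode of the insn's
	   unpredicated equivalent.  */
	int unpredicated_icode = get_attr_mve_unpredicated_insn (insn);
	/* Patch 2/2 can use this icode to re-emit the insn without its
	   VPT predication when forming the dlstp/letp loop.  */
      }
    ```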
    
    No regressions on arm-none-eabi with an MVE target.
    
    Thank you,
    Stam Markianos-Wright
    
    gcc/ChangeLog:
    
            * config/arm/arm.md (mve_unpredicated_insn): New attribute.
            * config/arm/arm.h (MVE_VPT_PREDICATED_INSN_P): New define.
            (MVE_VPT_UNPREDICATED_INSN_P): Likewise.
            (MVE_VPT_PREDICABLE_INSN_P): Likewise.
            * config/arm/vec-common.md (mve_vshlq_<supf><mode>): Add attribute.
            * config/arm/mve.md (arm_vcx1q<a>_p_v16qi): Add attribute.
            (arm_vcx1q<a>v16qi): Likewise.
            (arm_vcx1qav16qi): Likewise.
            (arm_vcx1qv16qi): Likewise.
            (arm_vcx2q<a>_p_v16qi): Likewise.
            (arm_vcx2q<a>v16qi): Likewise.
            (arm_vcx2qav16qi): Likewise.
            (arm_vcx2qv16qi): Likewise.
            (arm_vcx3q<a>_p_v16qi): Likewise.
            (arm_vcx3q<a>v16qi): Likewise.
            (arm_vcx3qav16qi): Likewise.
            (arm_vcx3qv16qi): Likewise.
            (mve_vabavq_<supf><mode>): Likewise.
            (mve_vabavq_p_<supf><mode>): Likewise.
            (mve_vabdq_<supf><mode>): Likewise.
            (mve_vabdq_f<mode>): Likewise.
            (mve_vabdq_m_<supf><mode>): Likewise.
            (mve_vabdq_m_f<mode>): Likewise.
            (mve_vabsq_f<mode>): Likewise.
            (mve_vabsq_m_f<mode>): Likewise.
            (mve_vabsq_m_s<mode>): Likewise.
            (mve_vabsq_s<mode>): Likewise.
            (mve_vadciq_<supf>v4si): Likewise.
            (mve_vadciq_m_<supf>v4si): Likewise.
            (mve_vadcq_<supf>v4si): Likewise.
            (mve_vadcq_m_<supf>v4si): Likewise.
            (mve_vaddlvaq_<supf>v4si): Likewise.
            (mve_vaddlvaq_p_<supf>v4si): Likewise.
            (mve_vaddlvq_<supf>v4si): Likewise.
            (mve_vaddlvq_p_<supf>v4si): Likewise.
            (mve_vaddq_f<mode>): Likewise.
            (mve_vaddq_m_<supf><mode>): Likewise.
            (mve_vaddq_m_f<mode>): Likewise.
            (mve_vaddq_m_n_<supf><mode>): Likewise.
            (mve_vaddq_m_n_f<mode>): Likewise.
            (mve_vaddq_n_<supf><mode>): Likewise.
            (mve_vaddq_n_f<mode>): Likewise.
            (mve_vaddq<mode>): Likewise.
            (mve_vaddvaq_<supf><mode>): Likewise.
            (mve_vaddvaq_p_<supf><mode>): Likewise.
            (mve_vaddvq_<supf><mode>): Likewise.
            (mve_vaddvq_p_<supf><mode>): Likewise.
            (mve_vandq_<supf><mode>): Likewise.
            (mve_vandq_f<mode>): Likewise.
            (mve_vandq_m_<supf><mode>): Likewise.
            (mve_vandq_m_f<mode>): Likewise.
            (mve_vandq_s<mode>): Likewise.
            (mve_vandq_u<mode>): Likewise.
            (mve_vbicq_<supf><mode>): Likewise.
            (mve_vbicq_f<mode>): Likewise.
            (mve_vbicq_m_<supf><mode>): Likewise.
            (mve_vbicq_m_f<mode>): Likewise.
            (mve_vbicq_m_n_<supf><mode>): Likewise.
            (mve_vbicq_n_<supf><mode>): Likewise.
            (mve_vbicq_s<mode>): Likewise.
            (mve_vbicq_u<mode>): Likewise.
            (mve_vbrsrq_m_n_<supf><mode>): Likewise.
            (mve_vbrsrq_m_n_f<mode>): Likewise.
            (mve_vbrsrq_n_<supf><mode>): Likewise.
            (mve_vbrsrq_n_f<mode>): Likewise.
            (mve_vcaddq_rot270_m_<supf><mode>): Likewise.
            (mve_vcaddq_rot270_m_f<mode>): Likewise.
            (mve_vcaddq_rot270<mode>): Likewise.
            (mve_vcaddq_rot270<mode>): Likewise.
            (mve_vcaddq_rot90_m_<supf><mode>): Likewise.
            (mve_vcaddq_rot90_m_f<mode>): Likewise.
            (mve_vcaddq_rot90<mode>): Likewise.
            (mve_vcaddq_rot90<mode>): Likewise.
            (mve_vcaddq<mve_rot><mode>): Likewise.
            (mve_vcaddq<mve_rot><mode>): Likewise.
            (mve_vclsq_m_s<mode>): Likewise.
            (mve_vclsq_s<mode>): Likewise.
            (mve_vclzq_<supf><mode>): Likewise.
            (mve_vclzq_m_<supf><mode>): Likewise.
            (mve_vclzq_s<mode>): Likewise.
            (mve_vclzq_u<mode>): Likewise.
            (mve_vcmlaq_m_f<mode>): Likewise.
            (mve_vcmlaq_rot180_m_f<mode>): Likewise.
            (mve_vcmlaq_rot180<mode>): Likewise.
            (mve_vcmlaq_rot270_m_f<mode>): Likewise.
            (mve_vcmlaq_rot270<mode>): Likewise.
            (mve_vcmlaq_rot90_m_f<mode>): Likewise.
            (mve_vcmlaq_rot90<mode>): Likewise.
            (mve_vcmlaq<mode>): Likewise.
            (mve_vcmlaq<mve_rot><mode>): Likewise.
            (mve_vcmp<mve_cmp_op>q_<mode>): Likewise.
            (mve_vcmp<mve_cmp_op>q_f<mode>): Likewise.
            (mve_vcmp<mve_cmp_op>q_n_<mode>): Likewise.
            (mve_vcmp<mve_cmp_op>q_n_f<mode>): Likewise.
            (mve_vcmpcsq_<mode>): Likewise.
            (mve_vcmpcsq_m_n_u<mode>): Likewise.
            (mve_vcmpcsq_m_u<mode>): Likewise.
            (mve_vcmpcsq_n_<mode>): Likewise.
            (mve_vcmpeqq_<mode>): Likewise.
            (mve_vcmpeqq_f<mode>): Likewise.
            (mve_vcmpeqq_m_<supf><mode>): Likewise.
            (mve_vcmpeqq_m_f<mode>): Likewise.
            (mve_vcmpeqq_m_n_<supf><mode>): Likewise.
            (mve_vcmpeqq_m_n_f<mode>): Likewise.
            (mve_vcmpeqq_n_<mode>): Likewise.
            (mve_vcmpeqq_n_f<mode>): Likewise.
            (mve_vcmpgeq_<mode>): Likewise.
            (mve_vcmpgeq_f<mode>): Likewise.
            (mve_vcmpgeq_m_f<mode>): Likewise.
            (mve_vcmpgeq_m_n_f<mode>): Likewise.
            (mve_vcmpgeq_m_n_s<mode>): Likewise.
            (mve_vcmpgeq_m_s<mode>): Likewise.
            (mve_vcmpgeq_n_<mode>): Likewise.
            (mve_vcmpgeq_n_f<mode>): Likewise.
            (mve_vcmpgtq_<mode>): Likewise.
            (mve_vcmpgtq_f<mode>): Likewise.
            (mve_vcmpgtq_m_f<mode>): Likewise.
            (mve_vcmpgtq_m_n_f<mode>): Likewise.
            (mve_vcmpgtq_m_n_s<mode>): Likewise.
            (mve_vcmpgtq_m_s<mode>): Likewise.
            (mve_vcmpgtq_n_<mode>): Likewise.
            (mve_vcmpgtq_n_f<mode>): Likewise.
            (mve_vcmphiq_<mode>): Likewise.
            (mve_vcmphiq_m_n_u<mode>): Likewise.
            (mve_vcmphiq_m_u<mode>): Likewise.
            (mve_vcmphiq_n_<mode>): Likewise.
            (mve_vcmpleq_<mode>): Likewise.
            (mve_vcmpleq_f<mode>): Likewise.
            (mve_vcmpleq_m_f<mode>): Likewise.
            (mve_vcmpleq_m_n_f<mode>): Likewise.
            (mve_vcmpleq_m_n_s<mode>): Likewise.
            (mve_vcmpleq_m_s<mode>): Likewise.
            (mve_vcmpleq_n_<mode>): Likewise.
            (mve_vcmpleq_n_f<mode>): Likewise.
            (mve_vcmpltq_<mode>): Likewise.
            (mve_vcmpltq_f<mode>): Likewise.
            (mve_vcmpltq_m_f<mode>): Likewise.
            (mve_vcmpltq_m_n_f<mode>): Likewise.
            (mve_vcmpltq_m_n_s<mode>): Likewise.
            (mve_vcmpltq_m_s<mode>): Likewise.
            (mve_vcmpltq_n_<mode>): Likewise.
            (mve_vcmpltq_n_f<mode>): Likewise.
            (mve_vcmpneq_<mode>): Likewise.
            (mve_vcmpneq_f<mode>): Likewise.
            (mve_vcmpneq_m_<supf><mode>): Likewise.
            (mve_vcmpneq_m_f<mode>): Likewise.
            (mve_vcmpneq_m_n_<supf><mode>): Likewise.
            (mve_vcmpneq_m_n_f<mode>): Likewise.
            (mve_vcmpneq_n_<mode>): Likewise.
            (mve_vcmpneq_n_f<mode>): Likewise.
            (mve_vcmulq_m_f<mode>): Likewise.
            (mve_vcmulq_rot180_m_f<mode>): Likewise.
            (mve_vcmulq_rot180<mode>): Likewise.
            (mve_vcmulq_rot270_m_f<mode>): Likewise.
            (mve_vcmulq_rot270<mode>): Likewise.
            (mve_vcmulq_rot90_m_f<mode>): Likewise.
            (mve_vcmulq_rot90<mode>): Likewise.
            (mve_vcmulq<mode>): Likewise.
            (mve_vcmulq<mve_rot><mode>): Likewise.
            (mve_vctp<mode1>q_mhi): Likewise.
            (mve_vctp<mode1>qhi): Likewise.
            (mve_vcvtaq_<supf><mode>): Likewise.
            (mve_vcvtaq_m_<supf><mode>): Likewise.
            (mve_vcvtbq_f16_f32v8hf): Likewise.
            (mve_vcvtbq_f32_f16v4sf): Likewise.
            (mve_vcvtbq_m_f16_f32v8hf): Likewise.
            (mve_vcvtbq_m_f32_f16v4sf): Likewise.
            (mve_vcvtmq_<supf><mode>): Likewise.
            (mve_vcvtmq_m_<supf><mode>): Likewise.
            (mve_vcvtnq_<supf><mode>): Likewise.
            (mve_vcvtnq_m_<supf><mode>): Likewise.
            (mve_vcvtpq_<supf><mode>): Likewise.
            (mve_vcvtpq_m_<supf><mode>): Likewise.
            (mve_vcvtq_from_f_<supf><mode>): Likewise.
            (mve_vcvtq_m_from_f_<supf><mode>): Likewise.
            (mve_vcvtq_m_n_from_f_<supf><mode>): Likewise.
            (mve_vcvtq_m_n_to_f_<supf><mode>): Likewise.
            (mve_vcvtq_m_to_f_<supf><mode>): Likewise.
            (mve_vcvtq_n_from_f_<supf><mode>): Likewise.
            (mve_vcvtq_n_to_f_<supf><mode>): Likewise.
            (mve_vcvtq_to_f_<supf><mode>): Likewise.
            (mve_vcvttq_f16_f32v8hf): Likewise.
            (mve_vcvttq_f32_f16v4sf): Likewise.
            (mve_vcvttq_m_f16_f32v8hf): Likewise.
            (mve_vcvttq_m_f32_f16v4sf): Likewise.
            (mve_vddupq_m_wb_u<mode>_insn): Likewise.
            (mve_vddupq_u<mode>_insn): Likewise.
            (mve_vdupq_m_n_<supf><mode>): Likewise.
            (mve_vdupq_m_n_f<mode>): Likewise.
            (mve_vdupq_n_<supf><mode>): Likewise.
            (mve_vdupq_n_f<mode>): Likewise.
            (mve_vdwdupq_m_wb_u<mode>_insn): Likewise.
            (mve_vdwdupq_wb_u<mode>_insn): Likewise.
            (mve_veorq_<supf><mode>): Likewise.
            (mve_veorq_f<mode>): Likewise.
            (mve_veorq_m_<supf><mode>): Likewise.
            (mve_veorq_m_f<mode>): Likewise.
            (mve_veorq_s<mode>): Likewise.
            (mve_veorq_u<mode>): Likewise.
            (mve_vfmaq_f<mode>): Likewise.
            (mve_vfmaq_m_f<mode>): Likewise.
            (mve_vfmaq_m_n_f<mode>): Likewise.
            (mve_vfmaq_n_f<mode>): Likewise.
            (mve_vfmasq_m_n_f<mode>): Likewise.
            (mve_vfmasq_n_f<mode>): Likewise.
            (mve_vfmsq_f<mode>): Likewise.
            (mve_vfmsq_m_f<mode>): Likewise.
            (mve_vhaddq_<supf><mode>): Likewise.
            (mve_vhaddq_m_<supf><mode>): Likewise.
            (mve_vhaddq_m_n_<supf><mode>): Likewise.
            (mve_vhaddq_n_<supf><mode>): Likewise.
            (mve_vhcaddq_rot270_m_s<mode>): Likewise.
            (mve_vhcaddq_rot270_s<mode>): Likewise.
            (mve_vhcaddq_rot90_m_s<mode>): Likewise.
            (mve_vhcaddq_rot90_s<mode>): Likewise.
            (mve_vhsubq_<supf><mode>): Likewise.
            (mve_vhsubq_m_<supf><mode>): Likewise.
            (mve_vhsubq_m_n_<supf><mode>): Likewise.
            (mve_vhsubq_n_<supf><mode>): Likewise.
            (mve_vidupq_m_wb_u<mode>_insn): Likewise.
            (mve_vidupq_u<mode>_insn): Likewise.
            (mve_viwdupq_m_wb_u<mode>_insn): Likewise.
            (mve_viwdupq_wb_u<mode>_insn): Likewise.
            (mve_vldrbq_<supf><mode>): Likewise.
            (mve_vldrbq_gather_offset_<supf><mode>): Likewise.
            (mve_vldrbq_gather_offset_z_<supf><mode>): Likewise.
            (mve_vldrbq_z_<supf><mode>): Likewise.
            (mve_vldrdq_gather_base_<supf>v2di): Likewise.
            (mve_vldrdq_gather_base_wb_<supf>v2di_insn): Likewise.
            (mve_vldrdq_gather_base_wb_z_<supf>v2di_insn): Likewise.
            (mve_vldrdq_gather_base_z_<supf>v2di): Likewise.
            (mve_vldrdq_gather_offset_<supf>v2di): Likewise.
            (mve_vldrdq_gather_offset_z_<supf>v2di): Likewise.
            (mve_vldrdq_gather_shifted_offset_<supf>v2di): Likewise.
            (mve_vldrdq_gather_shifted_offset_z_<supf>v2di): Likewise.
            (mve_vldrhq_<supf><mode>): Likewise.
            (mve_vldrhq_fv8hf): Likewise.
            (mve_vldrhq_gather_offset_<supf><mode>): Likewise.
            (mve_vldrhq_gather_offset_fv8hf): Likewise.
            (mve_vldrhq_gather_offset_z_<supf><mode>): Likewise.
            (mve_vldrhq_gather_offset_z_fv8hf): Likewise.
            (mve_vldrhq_gather_shifted_offset_<supf><mode>): Likewise.
            (mve_vldrhq_gather_shifted_offset_fv8hf): Likewise.
            (mve_vldrhq_gather_shifted_offset_z_<supf><mode>): Likewise.
            (mve_vldrhq_gather_shifted_offset_z_fv8hf): Likewise.
            (mve_vldrhq_z_<supf><mode>): Likewise.
            (mve_vldrhq_z_fv8hf): Likewise.
            (mve_vldrwq_<supf>v4si): Likewise.
            (mve_vldrwq_fv4sf): Likewise.
            (mve_vldrwq_gather_base_<supf>v4si): Likewise.
            (mve_vldrwq_gather_base_fv4sf): Likewise.
            (mve_vldrwq_gather_base_wb_<supf>v4si_insn): Likewise.
            (mve_vldrwq_gather_base_wb_fv4sf_insn): Likewise.
            (mve_vldrwq_gather_base_wb_z_<supf>v4si_insn): Likewise.
            (mve_vldrwq_gather_base_wb_z_fv4sf_insn): Likewise.
            (mve_vldrwq_gather_base_z_<supf>v4si): Likewise.
            (mve_vldrwq_gather_base_z_fv4sf): Likewise.
            (mve_vldrwq_gather_offset_<supf>v4si): Likewise.
            (mve_vldrwq_gather_offset_fv4sf): Likewise.
            (mve_vldrwq_gather_offset_z_<supf>v4si): Likewise.
            (mve_vldrwq_gather_offset_z_fv4sf): Likewise.
            (mve_vldrwq_gather_shifted_offset_<supf>v4si): Likewise.
            (mve_vldrwq_gather_shifted_offset_fv4sf): Likewise.
            (mve_vldrwq_gather_shifted_offset_z_<supf>v4si): Likewise.
            (mve_vldrwq_gather_shifted_offset_z_fv4sf): Likewise.
            (mve_vldrwq_z_<supf>v4si): Likewise.
            (mve_vldrwq_z_fv4sf): Likewise.
            (mve_vmaxaq_m_s<mode>): Likewise.
            (mve_vmaxaq_s<mode>): Likewise.
            (mve_vmaxavq_p_s<mode>): Likewise.
            (mve_vmaxavq_s<mode>): Likewise.
            (mve_vmaxnmaq_f<mode>): Likewise.
            (mve_vmaxnmaq_m_f<mode>): Likewise.
            (mve_vmaxnmavq_f<mode>): Likewise.
            (mve_vmaxnmavq_p_f<mode>): Likewise.
            (mve_vmaxnmq_f<mode>): Likewise.
            (mve_vmaxnmq_m_f<mode>): Likewise.
            (mve_vmaxnmvq_f<mode>): Likewise.
            (mve_vmaxnmvq_p_f<mode>): Likewise.
            (mve_vmaxq_<supf><mode>): Likewise.
            (mve_vmaxq_m_<supf><mode>): Likewise.
            (mve_vmaxq_s<mode>): Likewise.
            (mve_vmaxq_u<mode>): Likewise.
            (mve_vmaxvq_<supf><mode>): Likewise.
            (mve_vmaxvq_p_<supf><mode>): Likewise.
            (mve_vminaq_m_s<mode>): Likewise.
            (mve_vminaq_s<mode>): Likewise.
            (mve_vminavq_p_s<mode>): Likewise.
            (mve_vminavq_s<mode>): Likewise.
            (mve_vminnmaq_f<mode>): Likewise.
            (mve_vminnmaq_m_f<mode>): Likewise.
            (mve_vminnmavq_f<mode>): Likewise.
            (mve_vminnmavq_p_f<mode>): Likewise.
            (mve_vminnmq_f<mode>): Likewise.
            (mve_vminnmq_m_f<mode>): Likewise.
            (mve_vminnmvq_f<mode>): Likewise.
            (mve_vminnmvq_p_f<mode>): Likewise.
            (mve_vminq_<supf><mode>): Likewise.
            (mve_vminq_m_<supf><mode>): Likewise.
            (mve_vminq_s<mode>): Likewise.
            (mve_vminq_u<mode>): Likewise.
            (mve_vminvq_<supf><mode>): Likewise.
            (mve_vminvq_p_<supf><mode>): Likewise.
            (mve_vmladavaq_<supf><mode>): Likewise.
            (mve_vmladavaq_p_<supf><mode>): Likewise.
            (mve_vmladavaxq_p_s<mode>): Likewise.
            (mve_vmladavaxq_s<mode>): Likewise.
            (mve_vmladavq_<supf><mode>): Likewise.
            (mve_vmladavq_p_<supf><mode>): Likewise.
            (mve_vmladavxq_p_s<mode>): Likewise.
            (mve_vmladavxq_s<mode>): Likewise.
            (mve_vmlaldavaq_<supf><mode>): Likewise.
            (mve_vmlaldavaq_p_<supf><mode>): Likewise.
            (mve_vmlaldavaxq_<supf><mode>): Likewise.
            (mve_vmlaldavaxq_p_<supf><mode>): Likewise.
            (mve_vmlaldavaxq_s<mode>): Likewise.
            (mve_vmlaldavq_<supf><mode>): Likewise.
            (mve_vmlaldavq_p_<supf><mode>): Likewise.
            (mve_vmlaldavxq_p_s<mode>): Likewise.
            (mve_vmlaldavxq_s<mode>): Likewise.
            (mve_vmlaq_m_n_<supf><mode>): Likewise.
            (mve_vmlaq_n_<supf><mode>): Likewise.
            (mve_vmlasq_m_n_<supf><mode>): Likewise.
            (mve_vmlasq_n_<supf><mode>): Likewise.
            (mve_vmlsdavaq_p_s<mode>): Likewise.
            (mve_vmlsdavaq_s<mode>): Likewise.
            (mve_vmlsdavaxq_p_s<mode>): Likewise.
            (mve_vmlsdavaxq_s<mode>): Likewise.
            (mve_vmlsdavq_p_s<mode>): Likewise.
            (mve_vmlsdavq_s<mode>): Likewise.
            (mve_vmlsdavxq_p_s<mode>): Likewise.
            (mve_vmlsdavxq_s<mode>): Likewise.
            (mve_vmlsldavaq_p_s<mode>): Likewise.
            (mve_vmlsldavaq_s<mode>): Likewise.
            (mve_vmlsldavaxq_p_s<mode>): Likewise.
            (mve_vmlsldavaxq_s<mode>): Likewise.
            (mve_vmlsldavq_p_s<mode>): Likewise.
            (mve_vmlsldavq_s<mode>): Likewise.
            (mve_vmlsldavxq_p_s<mode>): Likewise.
            (mve_vmlsldavxq_s<mode>): Likewise.
            (mve_vmovlbq_<supf><mode>): Likewise.
            (mve_vmovlbq_m_<supf><mode>): Likewise.
            (mve_vmovltq_<supf><mode>): Likewise.
            (mve_vmovltq_m_<supf><mode>): Likewise.
            (mve_vmovnbq_<supf><mode>): Likewise.
            (mve_vmovnbq_m_<supf><mode>): Likewise.
            (mve_vmovntq_<supf><mode>): Likewise.
            (mve_vmovntq_m_<supf><mode>): Likewise.
            (mve_vmulhq_<supf><mode>): Likewise.
            (mve_vmulhq_m_<supf><mode>): Likewise.
            (mve_vmullbq_int_<supf><mode>): Likewise.
            (mve_vmullbq_int_m_<supf><mode>): Likewise.
            (mve_vmullbq_poly_m_p<mode>): Likewise.
            (mve_vmullbq_poly_p<mode>): Likewise.
            (mve_vmulltq_int_<supf><mode>): Likewise.
            (mve_vmulltq_int_m_<supf><mode>): Likewise.
            (mve_vmulltq_poly_m_p<mode>): Likewise.
            (mve_vmulltq_poly_p<mode>): Likewise.
            (mve_vmulq_<supf><mode>): Likewise.
            (mve_vmulq_f<mode>): Likewise.
            (mve_vmulq_m_<supf><mode>): Likewise.
            (mve_vmulq_m_f<mode>): Likewise.
            (mve_vmulq_m_n_<supf><mode>): Likewise.
            (mve_vmulq_m_n_f<mode>): Likewise.
            (mve_vmulq_n_<supf><mode>): Likewise.
            (mve_vmulq_n_f<mode>): Likewise.
            (mve_vmvnq_<supf><mode>): Likewise.
            (mve_vmvnq_m_<supf><mode>): Likewise.
            (mve_vmvnq_m_n_<supf><mode>): Likewise.
            (mve_vmvnq_n_<supf><mode>): Likewise.
            (mve_vmvnq_s<mode>): Likewise.
            (mve_vmvnq_u<mode>): Likewise.
            (mve_vnegq_f<mode>): Likewise.
            (mve_vnegq_m_f<mode>): Likewise.
            (mve_vnegq_m_s<mode>): Likewise.
            (mve_vnegq_s<mode>): Likewise.
            (mve_vornq_<supf><mode>): Likewise.
            (mve_vornq_f<mode>): Likewise.
            (mve_vornq_m_<supf><mode>): Likewise.
            (mve_vornq_m_f<mode>): Likewise.
            (mve_vornq_s<mode>): Likewise.
            (mve_vornq_u<mode>): Likewise.
            (mve_vorrq_<supf><mode>): Likewise.
            (mve_vorrq_f<mode>): Likewise.
            (mve_vorrq_m_<supf><mode>): Likewise.
            (mve_vorrq_m_f<mode>): Likewise.
            (mve_vorrq_m_n_<supf><mode>): Likewise.
            (mve_vorrq_n_<supf><mode>): Likewise.
            (mve_vorrq_s<mode>): Likewise.
            (mve_vorrq_s<mode>): Likewise.
            (mve_vqabsq_m_s<mode>): Likewise.
            (mve_vqabsq_s<mode>): Likewise.
            (mve_vqaddq_<supf><mode>): Likewise.
            (mve_vqaddq_m_<supf><mode>): Likewise.
            (mve_vqaddq_m_n_<supf><mode>): Likewise.
            (mve_vqaddq_n_<supf><mode>): Likewise.
            (mve_vqdmladhq_m_s<mode>): Likewise.
            (mve_vqdmladhq_s<mode>): Likewise.
            (mve_vqdmladhxq_m_s<mode>): Likewise.
            (mve_vqdmladhxq_s<mode>): Likewise.
            (mve_vqdmlahq_m_n_s<mode>): Likewise.
            (mve_vqdmlahq_n_<supf><mode>): Likewise.
            (mve_vqdmlahq_n_s<mode>): Likewise.
            (mve_vqdmlashq_m_n_s<mode>): Likewise.
            (mve_vqdmlashq_n_<supf><mode>): Likewise.
            (mve_vqdmlashq_n_s<mode>): Likewise.
            (mve_vqdmlsdhq_m_s<mode>): Likewise.
            (mve_vqdmlsdhq_s<mode>): Likewise.
            (mve_vqdmlsdhxq_m_s<mode>): Likewise.
            (mve_vqdmlsdhxq_s<mode>): Likewise.
            (mve_vqdmulhq_m_n_s<mode>): Likewise.
            (mve_vqdmulhq_m_s<mode>): Likewise.
            (mve_vqdmulhq_n_s<mode>): Likewise.
            (mve_vqdmulhq_s<mode>): Likewise.
            (mve_vqdmullbq_m_n_s<mode>): Likewise.
            (mve_vqdmullbq_m_s<mode>): Likewise.
            (mve_vqdmullbq_n_s<mode>): Likewise.
            (mve_vqdmullbq_s<mode>): Likewise.
            (mve_vqdmulltq_m_n_s<mode>): Likewise.
            (mve_vqdmulltq_m_s<mode>): Likewise.
            (mve_vqdmulltq_n_s<mode>): Likewise.
            (mve_vqdmulltq_s<mode>): Likewise.
            (mve_vqmovnbq_<supf><mode>): Likewise.
            (mve_vqmovnbq_m_<supf><mode>): Likewise.
            (mve_vqmovntq_<supf><mode>): Likewise.
            (mve_vqmovntq_m_<supf><mode>): Likewise.
            (mve_vqmovunbq_m_s<mode>): Likewise.
            (mve_vqmovunbq_s<mode>): Likewise.
            (mve_vqmovuntq_m_s<mode>): Likewise.
            (mve_vqmovuntq_s<mode>): Likewise.
            (mve_vqnegq_m_s<mode>): Likewise.
            (mve_vqnegq_s<mode>): Likewise.
            (mve_vqrdmladhq_m_s<mode>): Likewise.
            (mve_vqrdmladhq_s<mode>): Likewise.
            (mve_vqrdmladhxq_m_s<mode>): Likewise.
            (mve_vqrdmladhxq_s<mode>): Likewise.
            (mve_vqrdmlahq_m_n_s<mode>): Likewise.
            (mve_vqrdmlahq_n_<supf><mode>): Likewise.
            (mve_vqrdmlahq_n_s<mode>): Likewise.
            (mve_vqrdmlashq_m_n_s<mode>): Likewise.
            (mve_vqrdmlashq_n_<supf><mode>): Likewise.
            (mve_vqrdmlashq_n_s<mode>): Likewise.
            (mve_vqrdmlsdhq_m_s<mode>): Likewise.
            (mve_vqrdmlsdhq_s<mode>): Likewise.
            (mve_vqrdmlsdhxq_m_s<mode>): Likewise.
            (mve_vqrdmlsdhxq_s<mode>): Likewise.
            (mve_vqrdmulhq_m_n_s<mode>): Likewise.
            (mve_vqrdmulhq_m_s<mode>): Likewise.
            (mve_vqrdmulhq_n_s<mode>): Likewise.
            (mve_vqrdmulhq_s<mode>): Likewise.
            (mve_vqrshlq_<supf><mode>): Likewise.
            (mve_vqrshlq_m_<supf><mode>): Likewise.
            (mve_vqrshlq_m_n_<supf><mode>): Likewise.
            (mve_vqrshlq_n_<supf><mode>): Likewise.
            (mve_vqrshrnbq_m_n_<supf><mode>): Likewise.
            (mve_vqrshrnbq_n_<supf><mode>): Likewise.
            (mve_vqrshrntq_m_n_<supf><mode>): Likewise.
            (mve_vqrshrntq_n_<supf><mode>): Likewise.
            (mve_vqrshrunbq_m_n_s<mode>): Likewise.
            (mve_vqrshrunbq_n_s<mode>): Likewise.
            (mve_vqrshruntq_m_n_s<mode>): Likewise.
            (mve_vqrshruntq_n_s<mode>): Likewise.
            (mve_vqshlq_<supf><mode>): Likewise.
            (mve_vqshlq_m_<supf><mode>): Likewise.
            (mve_vqshlq_m_n_<supf><mode>): Likewise.
            (mve_vqshlq_m_r_<supf><mode>): Likewise.
            (mve_vqshlq_n_<supf><mode>): Likewise.
            (mve_vqshlq_r_<supf><mode>): Likewise.
            (mve_vqshluq_m_n_s<mode>): Likewise.
            (mve_vqshluq_n_s<mode>): Likewise.
            (mve_vqshrnbq_m_n_<supf><mode>): Likewise.
            (mve_vqshrnbq_n_<supf><mode>): Likewise.
            (mve_vqshrntq_m_n_<supf><mode>): Likewise.
            (mve_vqshrntq_n_<supf><mode>): Likewise.
            (mve_vqshrunbq_m_n_s<mode>): Likewise.
            (mve_vqshrunbq_n_s<mode>): Likewise.
            (mve_vqshruntq_m_n_s<mode>): Likewise.
            (mve_vqshruntq_n_s<mode>): Likewise.
            (mve_vqsubq_<supf><mode>): Likewise.
            (mve_vqsubq_m_<supf><mode>): Likewise.
            (mve_vqsubq_m_n_<supf><mode>): Likewise.
            (mve_vqsubq_n_<supf><mode>): Likewise.
            (mve_vrev16q_<supf>v16qi): Likewise.
            (mve_vrev16q_m_<supf>v16qi): Likewise.
            (mve_vrev32q_<supf><mode>): Likewise.
            (mve_vrev32q_fv8hf): Likewise.
            (mve_vrev32q_m_<supf><mode>): Likewise.
            (mve_vrev32q_m_fv8hf): Likewise.
            (mve_vrev64q_<supf><mode>): Likewise.
            (mve_vrev64q_f<mode>): Likewise.
            (mve_vrev64q_m_<supf><mode>): Likewise.
            (mve_vrev64q_m_f<mode>): Likewise.
            (mve_vrhaddq_<supf><mode>): Likewise.
            (mve_vrhaddq_m_<supf><mode>): Likewise.
            (mve_vrmlaldavhaq_<supf>v4si): Likewise.
            (mve_vrmlaldavhaq_p_sv4si): Likewise.
            (mve_vrmlaldavhaq_p_uv4si): Likewise.
            (mve_vrmlaldavhaq_sv4si): Likewise.
            (mve_vrmlaldavhaq_uv4si): Likewise.
            (mve_vrmlaldavhaxq_p_sv4si): Likewise.
            (mve_vrmlaldavhaxq_sv4si): Likewise.
            (mve_vrmlaldavhq_<supf>v4si): Likewise.
            (mve_vrmlaldavhq_p_<supf>v4si): Likewise.
            (mve_vrmlaldavhxq_p_sv4si): Likewise.
            (mve_vrmlaldavhxq_sv4si): Likewise.
            (mve_vrmlsldavhaq_p_sv4si): Likewise.
            (mve_vrmlsldavhaq_sv4si): Likewise.
            (mve_vrmlsldavhaxq_p_sv4si): Likewise.
            (mve_vrmlsldavhaxq_sv4si): Likewise.
            (mve_vrmlsldavhq_p_sv4si): Likewise.
            (mve_vrmlsldavhq_sv4si): Likewise.
            (mve_vrmlsldavhxq_p_sv4si): Likewise.
            (mve_vrmlsldavhxq_sv4si): Likewise.
            (mve_vrmulhq_<supf><mode>): Likewise.
            (mve_vrmulhq_m_<supf><mode>): Likewise.
            (mve_vrndaq_f<mode>): Likewise.
            (mve_vrndaq_m_f<mode>): Likewise.
            (mve_vrndmq_f<mode>): Likewise.
            (mve_vrndmq_m_f<mode>): Likewise.
            (mve_vrndnq_f<mode>): Likewise.
            (mve_vrndnq_m_f<mode>): Likewise.
            (mve_vrndpq_f<mode>): Likewise.
            (mve_vrndpq_m_f<mode>): Likewise.
            (mve_vrndq_f<mode>): Likewise.
            (mve_vrndq_m_f<mode>): Likewise.
            (mve_vrndxq_f<mode>): Likewise.
            (mve_vrndxq_m_f<mode>): Likewise.
            (mve_vrshlq_<supf><mode>): Likewise.
            (mve_vrshlq_m_<supf><mode>): Likewise.
            (mve_vrshlq_m_n_<supf><mode>): Likewise.
            (mve_vrshlq_n_<supf><mode>): Likewise.
            (mve_vrshrnbq_m_n_<supf><mode>): Likewise.
            (mve_vrshrnbq_n_<supf><mode>): Likewise.
            (mve_vrshrntq_m_n_<supf><mode>): Likewise.
            (mve_vrshrntq_n_<supf><mode>): Likewise.
            (mve_vrshrq_m_n_<supf><mode>): Likewise.
            (mve_vrshrq_n_<supf><mode>): Likewise.
            (mve_vsbciq_<supf>v4si): Likewise.
            (mve_vsbciq_m_<supf>v4si): Likewise.
            (mve_vsbcq_<supf>v4si): Likewise.
            (mve_vsbcq_m_<supf>v4si): Likewise.
            (mve_vshlcq_<supf><mode>): Likewise.
            (mve_vshlcq_m_<supf><mode>): Likewise.
            (mve_vshllbq_m_n_<supf><mode>): Likewise.
            (mve_vshllbq_n_<supf><mode>): Likewise.
            (mve_vshlltq_m_n_<supf><mode>): Likewise.
            (mve_vshlltq_n_<supf><mode>): Likewise.
            (mve_vshlq_<supf><mode>): Likewise.
            (mve_vshlq_<supf><mode>): Likewise.
            (mve_vshlq_m_<supf><mode>): Likewise.
            (mve_vshlq_m_n_<supf><mode>): Likewise.
            (mve_vshlq_m_r_<supf><mode>): Likewise.
            (mve_vshlq_n_<supf><mode>): Likewise.
            (mve_vshlq_r_<supf><mode>): Likewise.
            (mve_vshrnbq_m_n_<supf><mode>): Likewise.
            (mve_vshrnbq_n_<supf><mode>): Likewise.
            (mve_vshrntq_m_n_<supf><mode>): Likewise.
            (mve_vshrntq_n_<supf><mode>): Likewise.
            (mve_vshrq_m_n_<supf><mode>): Likewise.
            (mve_vshrq_n_<supf><mode>): Likewise.
            (mve_vsliq_m_n_<supf><mode>): Likewise.
            (mve_vsliq_n_<supf><mode>): Likewise.
            (mve_vsriq_m_n_<supf><mode>): Likewise.
            (mve_vsriq_n_<supf><mode>): Likewise.
            (mve_vstrbq_<supf><mode>): Likewise.
            (mve_vstrbq_p_<supf><mode>): Likewise.
            (mve_vstrbq_scatter_offset_<supf><mode>_insn): Likewise.
            (mve_vstrbq_scatter_offset_p_<supf><mode>_insn): Likewise.
            (mve_vstrdq_scatter_base_<supf>v2di): Likewise.
            (mve_vstrdq_scatter_base_p_<supf>v2di): Likewise.
            (mve_vstrdq_scatter_base_wb_<supf>v2di): Likewise.
            (mve_vstrdq_scatter_base_wb_p_<supf>v2di): Likewise.
            (mve_vstrdq_scatter_offset_<supf>v2di_insn): Likewise.
            (mve_vstrdq_scatter_offset_p_<supf>v2di_insn): Likewise.
            (mve_vstrdq_scatter_shifted_offset_<supf>v2di_insn): Likewise.
            (mve_vstrdq_scatter_shifted_offset_p_<supf>v2di_insn): Likewise.
            (mve_vstrhq_<supf><mode>): Likewise.
            (mve_vstrhq_fv8hf): Likewise.
            (mve_vstrhq_p_<supf><mode>): Likewise.
            (mve_vstrhq_p_fv8hf): Likewise.
            (mve_vstrhq_scatter_offset_<supf><mode>_insn): Likewise.
            (mve_vstrhq_scatter_offset_fv8hf_insn): Likewise.
            (mve_vstrhq_scatter_offset_p_<supf><mode>_insn): Likewise.
            (mve_vstrhq_scatter_offset_p_fv8hf_insn): Likewise.
            (mve_vstrhq_scatter_shifted_offset_<supf><mode>_insn): Likewise.
            (mve_vstrhq_scatter_shifted_offset_fv8hf_insn): Likewise.
            (mve_vstrhq_scatter_shifted_offset_p_<supf><mode>_insn): Likewise.
            (mve_vstrhq_scatter_shifted_offset_p_fv8hf_insn): Likewise.
            (mve_vstrwq_<supf>v4si): Likewise.
            (mve_vstrwq_fv4sf): Likewise.
            (mve_vstrwq_p_<supf>v4si): Likewise.
            (mve_vstrwq_p_fv4sf): Likewise.
            (mve_vstrwq_scatter_base_<supf>v4si): Likewise.
            (mve_vstrwq_scatter_base_fv4sf): Likewise.
            (mve_vstrwq_scatter_base_p_<supf>v4si): Likewise.
            (mve_vstrwq_scatter_base_p_fv4sf): Likewise.
            (mve_vstrwq_scatter_base_wb_<supf>v4si): Likewise.
            (mve_vstrwq_scatter_base_wb_fv4sf): Likewise.
            (mve_vstrwq_scatter_base_wb_p_<supf>v4si): Likewise.
            (mve_vstrwq_scatter_base_wb_p_fv4sf): Likewise.
            (mve_vstrwq_scatter_offset_<supf>v4si_insn): Likewise.
            (mve_vstrwq_scatter_offset_fv4sf_insn): Likewise.
            (mve_vstrwq_scatter_offset_p_<supf>v4si_insn): Likewise.
            (mve_vstrwq_scatter_offset_p_fv4sf_insn): Likewise.
            (mve_vstrwq_scatter_shifted_offset_<supf>v4si_insn): Likewise.
            (mve_vstrwq_scatter_shifted_offset_fv4sf_insn): Likewise.
            (mve_vstrwq_scatter_shifted_offset_p_<supf>v4si_insn): Likewise.
            (mve_vstrwq_scatter_shifted_offset_p_fv4sf_insn): Likewise.
            (mve_vsubq_<supf><mode>): Likewise.
            (mve_vsubq_f<mode>): Likewise.
            (mve_vsubq_m_<supf><mode>): Likewise.
            (mve_vsubq_m_f<mode>): Likewise.
            (mve_vsubq_m_n_<supf><mode>): Likewise.
            (mve_vsubq_m_n_f<mode>): Likewise.
            (mve_vsubq_n_<supf><mode>): Likewise.
            (mve_vsubq_n_f<mode>): Likewise.

diff --git a/gcc/config/arm/arm.h b/gcc/config/arm/arm.h
index a9c2752c0ea..0b0e8620717 100644
--- a/gcc/config/arm/arm.h
+++ b/gcc/config/arm/arm.h
@@ -2375,6 +2375,21 @@ extern int making_const_table;
   else if (TARGET_THUMB1)				\
     thumb1_final_prescan_insn (INSN)
 
+/* These defines are useful to refer to the value of the mve_unpredicated_insn
+   insn attribute.  Note that, because these use the get_attr_* function, these
+   will change recog_data if (INSN) isn't current_insn.  */
+#define MVE_VPT_PREDICABLE_INSN_P(INSN)					\
+  (recog_memoized (INSN) >= 0						\
+  && get_attr_mve_unpredicated_insn (INSN) != 0)			\
+
+#define MVE_VPT_PREDICATED_INSN_P(INSN)					\
+  (MVE_VPT_PREDICABLE_INSN_P (INSN)					\
+   && recog_memoized (INSN) != get_attr_mve_unpredicated_insn (INSN))	\
+
+#define MVE_VPT_UNPREDICATED_INSN_P(INSN)				\
+  (MVE_VPT_PREDICABLE_INSN_P (INSN)					\
+   && recog_memoized (INSN) == get_attr_mve_unpredicated_insn (INSN))	\
+
 #define ARM_SIGN_EXTEND(x)  ((HOST_WIDE_INT)			\
   (HOST_BITS_PER_WIDE_INT <= 32 ? (unsigned HOST_WIDE_INT) (x)	\
    : ((((unsigned HOST_WIDE_INT)(x)) & (unsigned HOST_WIDE_INT) 0xffffffff) |\
diff --git a/gcc/config/arm/arm.md b/gcc/config/arm/arm.md
index 07eaf06cdea..8efdebecc3c 100644
--- a/gcc/config/arm/arm.md
+++ b/gcc/config/arm/arm.md
@@ -124,6 +124,8 @@
 ; and not all ARM insns do.
 (define_attr "predicated" "yes,no" (const_string "no"))
 
+(define_attr "mve_unpredicated_insn" "" (const_int 0))
+
 ; LENGTH of an instruction (in bytes)
 (define_attr "length" ""
   (const_int 4))
diff --git a/gcc/config/arm/iterators.md b/gcc/config/arm/iterators.md
index a9803538101..5ea2d9e8668 100644
--- a/gcc/config/arm/iterators.md
+++ b/gcc/config/arm/iterators.md
@@ -2305,6 +2305,7 @@
 
 (define_int_attr mmla_sfx [(UNSPEC_MATMUL_S "s8") (UNSPEC_MATMUL_U "u8")
 			   (UNSPEC_MATMUL_US "s8")])
+
 ;;MVE int attribute.
 (define_int_attr supf [(VCVTQ_TO_F_S "s") (VCVTQ_TO_F_U "u") (VREV16Q_S "s")
 		       (VREV16Q_U "u") (VMVNQ_N_S "s") (VMVNQ_N_U "u")
diff --git a/gcc/config/arm/mve.md b/gcc/config/arm/mve.md
index 366cec0812a..44a04b86cb5 100644
--- a/gcc/config/arm/mve.md
+++ b/gcc/config/arm/mve.md
@@ -17,7 +17,7 @@
 ;; along with GCC; see the file COPYING3.  If not see
 ;; <http://www.gnu.org/licenses/>.
 
-(define_insn "*mve_mov<mode>"
+(define_insn "mve_mov<mode>"
   [(set (match_operand:MVE_types 0 "nonimmediate_operand" "=w,w,r,w   , w,   r,Ux,w")
 	(match_operand:MVE_types 1 "general_operand"      " w,r,w,DnDm,UxUi,r,w, Ul"))]
   "TARGET_HAVE_MVE || TARGET_HAVE_MVE_FLOAT"
@@ -81,18 +81,27 @@
       return "";
     }
 }
-  [(set_attr "type" "mve_move,mve_move,mve_move,mve_move,mve_load,multiple,mve_store,mve_load")
+   [(set_attr_alternative "mve_unpredicated_insn" [(symbol_ref "CODE_FOR_mve_mov<mode>")
+						   (symbol_ref "CODE_FOR_nothing")
+						   (symbol_ref "CODE_FOR_nothing")
+						   (symbol_ref "CODE_FOR_mve_mov<mode>")
+						   (symbol_ref "CODE_FOR_mve_mov<mode>")
+						   (symbol_ref "CODE_FOR_nothing")
+						   (symbol_ref "CODE_FOR_mve_mov<mode>")
+						   (symbol_ref "CODE_FOR_nothing")])
+   (set_attr "type" "mve_move,mve_move,mve_move,mve_move,mve_load,multiple,mve_store,mve_load")
    (set_attr "length" "4,8,8,4,4,8,4,8")
    (set_attr "thumb2_pool_range" "*,*,*,*,1018,*,*,*")
    (set_attr "neg_pool_range" "*,*,*,*,996,*,*,*")])
 
-(define_insn "*mve_vdup<mode>"
+(define_insn "mve_vdup<mode>"
   [(set (match_operand:MVE_vecs 0 "s_register_operand" "=w")
 	(vec_duplicate:MVE_vecs
 	  (match_operand:<V_elem> 1 "s_register_operand" "r")))]
   "TARGET_HAVE_MVE || TARGET_HAVE_MVE_FLOAT"
   "vdup.<V_sz_elem>\t%q0, %1"
-  [(set_attr "length" "4")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vdup<mode>"))
+  (set_attr "length" "4")
    (set_attr "type" "mve_move")])
 
 ;;
@@ -145,7 +154,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_mnemo>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -159,7 +169,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -173,7 +184,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "v<absneg_str>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_v<absneg_str>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -187,7 +199,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -201,7 +214,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 ;;
 ;; [vcvttq_f32_f16])
@@ -214,7 +228,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtt.f32.f16\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f32_f16v4sf"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -228,7 +243,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtb.f32.f16\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f32_f16v4sf"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -242,7 +258,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvt.f%#<V_sz_elem>.<supf>%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_to_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -256,7 +273,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -270,7 +288,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_from_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -284,7 +303,8 @@
   ]
   "TARGET_HAVE_MVE"
   "v<absneg_str>.s%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_v<absneg_str>q_s<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -297,7 +317,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vmvn\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmvnq_u<mode>"))
+  (set_attr "type" "mve_move")
 ])
 (define_expand "mve_vmvnq_s<mode>"
   [
@@ -318,7 +339,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -331,7 +353,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vclz.i%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vclzq_s<mode>"))
+  (set_attr "type" "mve_move")
 ])
 (define_expand "mve_vclzq_u<mode>"
   [
@@ -354,7 +377,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -368,7 +392,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -382,7 +407,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -397,7 +423,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -411,7 +438,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtp.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtpq_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -425,7 +453,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtn.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtnq_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -439,7 +468,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtm.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtmq_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -453,7 +483,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvta.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtaq_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -467,7 +498,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.i%#<V_sz_elem>\t%q0, %1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -481,7 +513,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -495,7 +528,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>32\t%Q0, %R0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -509,7 +543,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vctp.<MVE_vctp>\t%1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vctp<MVE_vctp>q<MVE_vpred>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -523,7 +558,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpnot"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vpnotv16bi"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -538,7 +574,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -553,7 +590,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvt.f<V_sz_elem>.<supf><V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_to_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;; [vcreateq_f])
@@ -599,7 +637,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf><V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;; Versions that take constant vectors as operand 2 (with all elements
@@ -617,7 +656,8 @@
 					VALID_NEON_QREG_MODE (<MODE>mode),
 					true);
   }
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshrq_n_s<mode>_imm"))
+  (set_attr "type" "mve_move")
 ])
 (define_insn "mve_vshrq_n_u<mode>_imm"
   [
@@ -632,7 +672,8 @@
 					VALID_NEON_QREG_MODE (<MODE>mode),
 					true);
   }
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshrq_n_u<mode>_imm"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -647,7 +688,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvt.<supf><V_sz_elem>.f<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_from_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -662,8 +704,9 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>32\t%Q0, %R0, %q1"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [vcmpneq_, vcmpcsq_, vcmpeqq_, vcmpgeq_, vcmpgtq_, vcmphiq_, vcmpleq_, vcmpltq_])
@@ -676,7 +719,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vcmp.<mve_cmp_type>%#<V_sz_elem>\t<mve_cmp_op>, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op>q_<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -691,7 +735,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vcmp.<mve_cmp_type>%#<V_sz_elem>	<mve_cmp_op>, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op>q_n_<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -722,7 +767,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -739,7 +785,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.i%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -754,7 +801,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -769,7 +817,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -789,8 +838,11 @@
   "@
    vand\t%q0, %q1, %q2
    * return neon_output_logic_immediate (\"vand\", &operands[2], <MODE>mode, 1, VALID_NEON_QREG_MODE (<MODE>mode));"
-  [(set_attr "type" "mve_move")
+   [(set_attr_alternative "mve_unpredicated_insn" [(symbol_ref "CODE_FOR_mve_vandq_u<mode>")
+						   (symbol_ref "CODE_FOR_nothing")])
+  (set_attr "type" "mve_move")
 ])
+
 (define_expand "mve_vandq_s<mode>"
   [
    (set (match_operand:MVE_2 0 "s_register_operand")
@@ -811,7 +863,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vbic\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbicq_u<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 (define_expand "mve_vbicq_s<mode>"
@@ -835,7 +888,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -853,7 +907,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<isu>%#<V_sz_elem>\t%q0, %q1, %q2, #<rot>"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q<mve_rot>_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;; Auto vectorizer pattern for int vcadd
@@ -876,7 +931,8 @@
   ]
   "TARGET_HAVE_MVE"
   "veor\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_veorq_u<mode>"))
+  (set_attr "type" "mve_move")
 ])
 (define_expand "mve_veorq_s<mode>"
   [
@@ -904,7 +960,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -920,7 +977,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.s%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -935,7 +993,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<max_min_su_str>.<max_min_supf>%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<max_min_su_str>q_<max_min_supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 
@@ -954,7 +1013,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -972,7 +1032,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -988,7 +1049,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<isu>%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_int_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1004,7 +1066,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_addsubmul>.i%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_addsubmul>q<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1018,7 +1081,8 @@
   ]
   "TARGET_HAVE_MVE"
    "vorn\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_s<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 (define_expand "mve_vornq_u<mode>"
@@ -1047,7 +1111,8 @@
   "@
    vorr\t%q0, %q1, %q2
    * return neon_output_logic_immediate (\"vorr\", &operands[2], <MODE>mode, 0, VALID_NEON_QREG_MODE (<MODE>mode));"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vorrq_s<mode>"))
+  (set_attr "type" "mve_move")
 ])
 (define_expand "mve_vorrq_u<mode>"
   [
@@ -1071,7 +1136,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1087,7 +1153,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1103,7 +1170,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_r_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1118,7 +1186,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1133,7 +1202,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.f%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1148,7 +1218,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>32\t%Q0, %R0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1165,7 +1236,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.f%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1179,7 +1251,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vand\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vandq_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1193,7 +1266,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vbic\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbicq_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1209,7 +1283,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.f%#<V_sz_elem>\t%q0, %q1, %q2, #<rot>"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q<mve_rot>_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1223,7 +1298,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcmp.f%#<V_sz_elem>	<mve_cmp_op>, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1238,7 +1314,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcmp.f%#<V_sz_elem>	<mve_cmp_op>, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1253,8 +1330,10 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vctpt.<MVE_vctp>\t%1"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vctp<MVE_vctp>q<MVE_vpred>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")
+])
 
 ;;
 ;; [vcvtbq_f16_f32])
@@ -1268,7 +1347,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtb.f16.f32\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f16_f32v8hf"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1283,7 +1363,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtt.f16.f32\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f16_f32v8hf"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1297,7 +1378,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "veor\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_veorq_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1313,7 +1395,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1331,7 +1414,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.f%#<V_sz_elem>\t%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1346,7 +1430,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<max_min_f_str>.f%#<V_sz_elem>	%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<max_min_f_str>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1364,7 +1449,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1384,7 +1470,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<isu>%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1400,7 +1487,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_addsubmul>.f%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_addsubmul>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1414,7 +1502,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vorn\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1428,7 +1517,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vorr\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vorrq_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1444,7 +1534,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.i%#<V_sz_elem>	%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1460,7 +1551,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.s%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1476,7 +1568,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.s%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1494,7 +1587,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>32\t%Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1510,7 +1604,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1526,7 +1621,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_poly_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1547,8 +1643,10 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmpt.f%#<V_sz_elem>\t<mve_cmp_op1>, %q1, %q2"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op1>q_f<mode>"))
+  (set_attr "length""8")])
+
 ;;
 ;; [vcvtaq_m_u, vcvtaq_m_s])
 ;;
@@ -1562,8 +1659,10 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtat.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtaq_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
+
 ;;
 ;; [vcvtq_m_to_f_s, vcvtq_m_to_f_u])
 ;;
@@ -1577,8 +1676,9 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtt.f%#<V_sz_elem>.<supf>%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_to_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [vqrshrnbq_n_u, vqrshrnbq_n_s]
@@ -1604,7 +1704,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<isu>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1623,7 +1724,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>32\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1639,7 +1741,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1685,7 +1788,10 @@
 		   (match_dup 4)]
 	VSHLCQ))]
  "TARGET_HAVE_MVE"
- "vshlc\t%q0, %1, %4")
+ "vshlc\t%q0, %1, %4"
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlcq_<supf><mode>"))
+  (set_attr "type" "mve_move")
+])
 
 ;;
 ;; [vabsq_m_s]
@@ -1705,7 +1811,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<isu>%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1721,7 +1828,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1744,7 +1852,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.<isu>%#<V_sz_elem>\t<mve_cmp_op1>, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op1>q_n_<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1767,7 +1876,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.<isu>%#<V_sz_elem>\t<mve_cmp_op1>, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op1>q_<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1783,7 +1893,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1800,7 +1911,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.s%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1819,7 +1931,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1838,7 +1951,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1857,7 +1971,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1878,7 +1993,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1894,7 +2010,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1910,7 +2027,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1933,7 +2051,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.s%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1950,7 +2069,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1967,7 +2087,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_r_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1983,7 +2104,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1999,7 +2121,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2015,7 +2138,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2038,7 +2162,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_mnemo>t.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2054,7 +2179,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>32\t%Q0, %R0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
 ;; [vcmlaq, vcmlaq_rot90, vcmlaq_rot180, vcmlaq_rot270])
@@ -2072,7 +2198,9 @@
   "@
    vcmul.f%#<V_sz_elem>	%q0, %q2, %q3, #<rot>
    vcmla.f%#<V_sz_elem>	%q0, %q2, %q3, #<rot>"
-  [(set_attr "type" "mve_move")
+  [(set_attr_alternative "mve_unpredicated_insn" [(symbol_ref "CODE_FOR_mve_<mve_insn>q<mve_rot>_f<mode>")
+						  (symbol_ref "CODE_FOR_mve_<mve_insn>q<mve_rot>_f<mode>")])
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2093,7 +2221,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmpt.f%#<V_sz_elem>\t<mve_cmp_op1>, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op1>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2109,7 +2238,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtbt.f16.f32\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f16_f32v8hf"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2125,7 +2255,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtbt.f32.f16\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f32_f16v4sf"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2141,7 +2272,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvttt.f16.f32\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f16_f32v8hf"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2157,8 +2289,9 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvttt.f32.f16\t%q0, %q2"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f32_f16v4sf"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [vdupq_m_n_f])
@@ -2173,7 +2306,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2190,7 +2324,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.f%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2207,7 +2342,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.f%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2224,7 +2360,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2243,7 +2380,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.f%#<V_sz_elem>\t%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2262,7 +2400,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2281,7 +2420,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2298,7 +2438,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2319,7 +2460,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<isu>%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2335,7 +2477,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.i%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2352,7 +2495,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.i%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2368,7 +2512,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2384,7 +2529,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2400,7 +2546,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2416,7 +2563,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2435,7 +2583,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>32\t%Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2451,7 +2600,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtmt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtmq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2467,7 +2617,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtpt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtpq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2483,7 +2634,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtnt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtnq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2500,7 +2652,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_from_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2516,7 +2669,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2532,8 +2686,9 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_from_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [vabavq_p_s, vabavq_p_u])
@@ -2549,7 +2704,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "8")])
 
 ;;
@@ -2566,8 +2722,9 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\n\t<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
-   (set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length" "8")])
 
 ;;
 ;; [vsriq_m_n_s, vsriq_m_n_u])
@@ -2583,8 +2740,9 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
-   (set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length" "8")])
 
 ;;
 ;; [vcvtq_m_n_to_f_u, vcvtq_m_n_to_f_s])
@@ -2600,7 +2758,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtt.f%#<V_sz_elem>.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_to_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2640,7 +2799,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2659,8 +2819,9 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.i%#<V_sz_elem>	%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [vaddq_m_u, vaddq_m_s]
@@ -2678,7 +2839,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.i%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2698,7 +2860,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2715,8 +2878,9 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [vcaddq_rot90_m_u, vcaddq_rot90_m_s]
@@ -2735,7 +2899,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<isu>%#<V_sz_elem>\t%q0, %q2, %q3, #<rot>"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q<mve_rot>_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2763,7 +2928,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2784,7 +2950,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2802,7 +2969,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_int_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2819,7 +2987,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vornt\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2837,7 +3006,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2855,7 +3025,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2872,7 +3043,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2892,7 +3064,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2920,7 +3093,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<isu>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2940,7 +3114,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>32\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2958,7 +3133,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2976,7 +3152,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_poly_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2994,7 +3171,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.s%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3012,7 +3190,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.s%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3036,7 +3215,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.f%#<V_sz_elem>	%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3057,7 +3237,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.f%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3077,7 +3258,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3094,7 +3276,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3116,7 +3299,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.f%#<V_sz_elem>\t%q0, %q2, %q3, #<rot>"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q<mve_rot>_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3136,7 +3320,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.f%#<V_sz_elem>\t%q0, %q2, %q3, #<rot>"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q<mve_rot>_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3153,7 +3338,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vornt\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3173,7 +3359,8 @@
    output_asm_insn("vstrb.<V_sz_elem>\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrbq_scatter_offset_s vstrbq_scatter_offset_u]
@@ -3201,7 +3388,8 @@
 	  VSTRBSOQ))]
   "TARGET_HAVE_MVE"
   "vstrb.<V_sz_elem>\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_scatter_offset_<supf><mode>_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_base_s vstrwq_scatter_base_u]
@@ -3223,7 +3411,8 @@
    output_asm_insn("vstrw.u32\t%q2, [%q0, %1]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_<supf>v4si"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrbq_gather_offset_s vldrbq_gather_offset_u]
@@ -3246,7 +3435,8 @@
      output_asm_insn ("vldrb.<supf><V_sz_elem>\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_gather_offset_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrbq_s vldrbq_u]
@@ -3268,7 +3458,8 @@
      output_asm_insn ("vldrb.<supf><V_sz_elem>\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_base_s vldrwq_gather_base_u]
@@ -3288,7 +3479,8 @@
    output_asm_insn ("vldrw.u32\t%q0, [%q1, %2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_<supf>v4si"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrbq_scatter_offset_p_s vstrbq_scatter_offset_p_u]
@@ -3320,7 +3512,8 @@
 	  VSTRBSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrbt.<V_sz_elem>\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_scatter_offset_<supf><mode>_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_base_p_s vstrwq_scatter_base_p_u]
@@ -3343,7 +3536,8 @@
    output_asm_insn ("vpst\n\tvstrwt.u32\t%q2, [%q0, %1]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_<supf>v4si"))
+  (set_attr "length" "8")])
 
 (define_insn "mve_vstrbq_p_<supf><mode>"
   [(set (match_operand:<MVE_B_ELEM> 0 "mve_memory_operand" "=Ux")
@@ -3361,7 +3555,8 @@
    output_asm_insn ("vpst\;vstrbt.<V_sz_elem>\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrbq_gather_offset_z_s vldrbq_gather_offset_z_u]
@@ -3386,7 +3581,8 @@
      output_asm_insn ("vpst\n\tvldrbt.<supf><V_sz_elem>\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_gather_offset_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrbq_z_s vldrbq_z_u]
@@ -3409,7 +3605,8 @@
      output_asm_insn ("vpst\;vldrbt.<supf><V_sz_elem>\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_base_z_s vldrwq_gather_base_z_u]
@@ -3430,7 +3627,8 @@
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%q1, %2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_<supf>v4si"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_f]
@@ -3449,7 +3647,8 @@
    output_asm_insn ("vldrh.16\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_fv8hf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_gather_offset_s vldrhq_gather_offset_u]
@@ -3472,7 +3671,8 @@
      output_asm_insn ("vldrh.<supf><V_sz_elem>\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_gather_offset_z_s vldrhq_gather_offset_z_u]
@@ -3497,7 +3697,8 @@
      output_asm_insn ("vpst\n\tvldrht.<supf><V_sz_elem>\t%q0, [%m1, %q2]",ops);
    return "";
 }
- [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_gather_shifted_offset_s vldrhq_gather_shifted_offset_u]
@@ -3520,7 +3721,8 @@
      output_asm_insn ("vldrh.<supf><V_sz_elem>\t%q0, [%m1, %q2, uxtw #1]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_gather_shifted_offset_z_s vldrhq_gather_shited_offset_z_u]
@@ -3545,7 +3747,8 @@
      output_asm_insn ("vpst\n\tvldrht.<supf><V_sz_elem>\t%q0, [%m1, %q2, uxtw #1]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_s, vldrhq_u]
@@ -3567,7 +3770,8 @@
      output_asm_insn ("vldrh.<supf><V_sz_elem>\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_z_f]
@@ -3587,7 +3791,8 @@
    output_asm_insn ("vpst\;vldrht.16\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_fv8hf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_z_s vldrhq_z_u]
@@ -3610,7 +3815,8 @@
      output_asm_insn ("vpst\;vldrht.<supf><V_sz_elem>\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_f]
@@ -3629,7 +3835,8 @@
    output_asm_insn ("vldrw.32\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_s vldrwq_u]
@@ -3648,7 +3855,8 @@
    output_asm_insn ("vldrw.32\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_<supf>v4si"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_z_f]
@@ -3668,7 +3876,8 @@
    output_asm_insn ("vpst\;vldrwt.32\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_z_s vldrwq_z_u]
@@ -3688,7 +3897,8 @@
    output_asm_insn ("vpst\;vldrwt.32\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_<supf>v4si"))
+  (set_attr "length" "8")])
 
 (define_expand "mve_vld1q_f<mode>"
   [(match_operand:MVE_0 0 "s_register_operand")
@@ -3728,7 +3938,8 @@
    output_asm_insn ("vldrd.64\t%q0, [%q1, %2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_<supf>v2di"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrdq_gather_base_z_s vldrdq_gather_base_z_u]
@@ -3749,7 +3960,8 @@
    output_asm_insn ("vpst\n\tvldrdt.u64\t%q0, [%q1, %2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_<supf>v2di"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrdq_gather_offset_s vldrdq_gather_offset_u]
@@ -3769,7 +3981,8 @@
   output_asm_insn ("vldrd.u64\t%q0, [%m1, %q2]",ops);
   return "";
 }
- [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_offset_<supf>v2di"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrdq_gather_offset_z_s vldrdq_gather_offset_z_u]
@@ -3790,7 +4003,8 @@
   output_asm_insn ("vpst\n\tvldrdt.u64\t%q0, [%m1, %q2]",ops);
   return "";
 }
- [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_offset_<supf>v2di"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrdq_gather_shifted_offset_s vldrdq_gather_shifted_offset_u]
@@ -3810,7 +4024,8 @@
    output_asm_insn ("vldrd.u64\t%q0, [%m1, %q2, uxtw #3]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_shifted_offset_<supf>v2di"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrdq_gather_shifted_offset_z_s vldrdq_gather_shifted_offset_z_u]
@@ -3831,7 +4046,8 @@
    output_asm_insn ("vpst\n\tvldrdt.u64\t%q0, [%m1, %q2, uxtw #3]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_shifted_offset_<supf>v2di"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_gather_offset_f]
@@ -3851,7 +4067,8 @@
    output_asm_insn ("vldrh.f16\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_fv8hf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_gather_offset_z_f]
@@ -3873,7 +4090,8 @@
    output_asm_insn ("vpst\n\tvldrht.f16\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_fv8hf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_gather_shifted_offset_f]
@@ -3893,7 +4111,8 @@
    output_asm_insn ("vldrh.f16\t%q0, [%m1, %q2, uxtw #1]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_fv8hf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_gather_shifted_offset_z_f]
@@ -3915,7 +4134,8 @@
    output_asm_insn ("vpst\n\tvldrht.f16\t%q0, [%m1, %q2, uxtw #1]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_fv8hf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_base_f]
@@ -3935,7 +4155,8 @@
    output_asm_insn ("vldrw.u32\t%q0, [%q1, %2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_base_z_f]
@@ -3956,7 +4177,8 @@
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%q1, %2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_offset_f]
@@ -3976,7 +4198,8 @@
    output_asm_insn ("vldrw.u32\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_offset_s vldrwq_gather_offset_u]
@@ -3996,7 +4219,8 @@
    output_asm_insn ("vldrw.u32\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_<supf>v4si"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_offset_z_f]
@@ -4018,7 +4242,8 @@
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_offset_z_s vldrwq_gather_offset_z_u]
@@ -4040,7 +4265,8 @@
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_<supf>v4si"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_shifted_offset_f]
@@ -4060,7 +4286,8 @@
    output_asm_insn ("vldrw.u32\t%q0, [%m1, %q2, uxtw #2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_shifted_offset_s vldrwq_gather_shifted_offset_u]
@@ -4080,7 +4307,8 @@
    output_asm_insn ("vldrw.u32\t%q0, [%m1, %q2, uxtw #2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_<supf>v4si"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_shifted_offset_z_f]
@@ -4102,7 +4330,8 @@
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2, uxtw #2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_shifted_offset_z_s vldrwq_gather_shifted_offset_z_u]
@@ -4124,7 +4353,8 @@
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2, uxtw #2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_<supf>v4si"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_f]
@@ -4143,7 +4373,8 @@
    output_asm_insn ("vstrh.16\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_fv8hf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_p_f]
@@ -4164,7 +4395,8 @@
    output_asm_insn ("vpst\;vstrht.16\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_fv8hf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_p_s vstrhq_p_u]
@@ -4186,7 +4418,8 @@
    output_asm_insn ("vpst\;vstrht.<V_sz_elem>\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_scatter_offset_p_s vstrhq_scatter_offset_p_u]
@@ -4218,7 +4451,8 @@
 	  VSTRHSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrht.<V_sz_elem>\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset_<supf><mode>_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_scatter_offset_s vstrhq_scatter_offset_u]
@@ -4246,7 +4480,8 @@
 	  VSTRHSOQ))]
   "TARGET_HAVE_MVE"
   "vstrh.<V_sz_elem>\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset_<supf><mode>_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_scatter_shifted_offset_p_s vstrhq_scatter_shifted_offset_p_u]
@@ -4278,7 +4513,8 @@
 	  VSTRHSSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrht.<V_sz_elem>\t%q2, [%0, %q1, uxtw #1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset_<supf><mode>_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_scatter_shifted_offset_s vstrhq_scatter_shifted_offset_u]
@@ -4307,7 +4543,8 @@
 	  VSTRHSSOQ))]
   "TARGET_HAVE_MVE"
   "vstrh.<V_sz_elem>\t%q2, [%0, %q1, uxtw #1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset_<supf><mode>_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_s, vstrhq_u]
@@ -4326,7 +4563,8 @@
    output_asm_insn ("vstrh.<V_sz_elem>\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_f]
@@ -4345,7 +4583,8 @@
    output_asm_insn ("vstrw.32\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_p_f]
@@ -4366,7 +4605,8 @@
    output_asm_insn ("vpst\;vstrwt.32\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_p_s vstrwq_p_u]
@@ -4387,7 +4627,8 @@
    output_asm_insn ("vpst\;vstrwt.32\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_<supf>v4si"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_s vstrwq_u]
@@ -4406,7 +4647,8 @@
    output_asm_insn ("vstrw.32\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_<supf>v4si"))
+  (set_attr "length" "4")])
 
 (define_expand "mve_vst1q_f<mode>"
   [(match_operand:<MVE_CNVT> 0 "mve_memory_operand")
@@ -4449,7 +4691,8 @@
    output_asm_insn ("vpst\;\tvstrdt.u64\t%q2, [%q0, %1]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_<supf>v2di"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrdq_scatter_base_s vstrdq_scatter_base_u]
@@ -4471,7 +4714,8 @@
    output_asm_insn ("vstrd.u64\t%q2, [%q0, %1]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_<supf>v2di"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrdq_scatter_offset_p_s vstrdq_scatter_offset_p_u]
@@ -4502,7 +4746,8 @@
 	  VSTRDSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrdt.64\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_offset_<supf>v2di_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrdq_scatter_offset_s vstrdq_scatter_offset_u]
@@ -4530,7 +4775,8 @@
 	  VSTRDSOQ))]
   "TARGET_HAVE_MVE"
   "vstrd.64\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_offset_<supf>v2di_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrdq_scatter_shifted_offset_p_s vstrdq_scatter_shifted_offset_p_u]
@@ -4562,7 +4808,8 @@
 	  VSTRDSSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrdt.64\t%q2, [%0, %q1, uxtw #3]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_shifted_offset_<supf>v2di_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrdq_scatter_shifted_offset_s vstrdq_scatter_shifted_offset_u]
@@ -4591,7 +4838,8 @@
 	  VSTRDSSOQ))]
   "TARGET_HAVE_MVE"
   "vstrd.64\t%q2, [%0, %q1, uxtw #3]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_shifted_offset_<supf>v2di_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_scatter_offset_f]
@@ -4619,7 +4867,8 @@
 	  VSTRHQSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vstrh.16\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset_fv8hf_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_scatter_offset_p_f]
@@ -4650,7 +4899,8 @@
 	  VSTRHQSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vstrht.16\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset_fv8hf_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_scatter_shifted_offset_f]
@@ -4678,7 +4928,8 @@
 	  VSTRHQSSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vstrh.16\t%q2, [%0, %q1, uxtw #1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset_fv8hf_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_scatter_shifted_offset_p_f]
@@ -4710,7 +4961,8 @@
 	  VSTRHQSSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vstrht.16\t%q2, [%0, %q1, uxtw #1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset_fv8hf_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_base_f]
@@ -4732,7 +4984,8 @@
    output_asm_insn ("vstrw.u32\t%q2, [%q0, %1]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_base_p_f]
@@ -4755,7 +5008,8 @@
    output_asm_insn ("vpst\n\tvstrwt.u32\t%q2, [%q0, %1]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_offset_f]
@@ -4783,7 +5037,8 @@
 	  VSTRWQSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vstrw.32\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_fv4sf_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_offset_p_f]
@@ -4814,7 +5069,8 @@
 	  VSTRWQSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vstrwt.32\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_fv4sf_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_offset_s vstrwq_scatter_offset_u]
@@ -4845,7 +5101,8 @@
 	  VSTRWSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrwt.32\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_<supf>v4si_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_offset_s vstrwq_scatter_offset_u]
@@ -4873,7 +5130,8 @@
 	  VSTRWSOQ))]
   "TARGET_HAVE_MVE"
   "vstrw.32\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_<supf>v4si_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_shifted_offset_f]
@@ -4901,7 +5159,8 @@
 	 VSTRWQSSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vstrw.32\t%q2, [%0, %q1, uxtw #2]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_fv4sf_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_shifted_offset_p_f]
@@ -4933,7 +5192,8 @@
 	  VSTRWQSSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vstrwt.32\t%q2, [%0, %q1, uxtw #2]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_fv4sf_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_shifted_offset_p_s vstrwq_scatter_shifted_offset_p_u]
@@ -4965,7 +5225,8 @@
 	  VSTRWSSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrwt.32\t%q2, [%0, %q1, uxtw #2]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_<supf>v4si_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_shifted_offset_s vstrwq_scatter_shifted_offset_u]
@@ -4994,7 +5255,8 @@
 	  VSTRWSSOQ))]
   "TARGET_HAVE_MVE"
   "vstrw.32\t%q2, [%0, %q1, uxtw #2]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_<supf>v4si_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vidupq_n_u])
@@ -5062,7 +5324,8 @@
 		(match_operand:SI 6 "immediate_operand" "i")))]
  "TARGET_HAVE_MVE"
  "vpst\;\tvidupt.u%#<V_sz_elem>\t%q0, %2, %4"
- [(set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vidupq_u<mode>_insn"))
+  (set_attr "length""8")])
 
 ;;
 ;; [vddupq_n_u])
@@ -5130,7 +5393,8 @@
 		 (match_operand:SI 6 "immediate_operand" "i")))]
  "TARGET_HAVE_MVE"
  "vpst\;vddupt.u%#<V_sz_elem>\t%q0, %2, %4"
- [(set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vddupq_u<mode>_insn"))
+  (set_attr "length""8")])
 
 ;;
 ;; [vdwdupq_n_u])
@@ -5246,8 +5510,9 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vdwdupt.u%#<V_sz_elem>\t%q2, %3, %R4, %5"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vdwdupq_wb_u<mode>_insn"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [viwdupq_n_u])
@@ -5363,7 +5628,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;\tviwdupt.u%#<V_sz_elem>\t%q2, %3, %R4, %5"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_viwdupq_wb_u<mode>_insn"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5389,7 +5655,8 @@
    output_asm_insn ("vstrw.u32\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_<supf>v4si"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_base_wb_p_s vstrwq_scatter_base_wb_p_u]
@@ -5415,7 +5682,8 @@
    output_asm_insn ("vpst\;\tvstrwt.u32\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_<supf>v4si"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_base_wb_f]
@@ -5440,7 +5708,8 @@
    output_asm_insn ("vstrw.u32\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_base_wb_p_f]
@@ -5466,7 +5735,8 @@
    output_asm_insn ("vpst\;vstrwt.u32\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrdq_scatter_base_wb_s vstrdq_scatter_base_wb_u]
@@ -5491,7 +5761,8 @@
    output_asm_insn ("vstrd.u64\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_wb_<supf>v2di"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrdq_scatter_base_wb_p_s vstrdq_scatter_base_wb_p_u]
@@ -5517,7 +5788,8 @@
    output_asm_insn ("vpst\;vstrdt.u64\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_wb_<supf>v2di"))
+  (set_attr "length" "8")])
 
 (define_expand "mve_vldrwq_gather_base_wb_<supf>v4si"
   [(match_operand:V4SI 0 "s_register_operand")
@@ -5569,7 +5841,8 @@
    output_asm_insn ("vldrw.u32\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_<supf>v4si_insn"))
+  (set_attr "length" "4")])
 
 (define_expand "mve_vldrwq_gather_base_wb_z_<supf>v4si"
   [(match_operand:V4SI 0 "s_register_operand")
@@ -5625,7 +5898,8 @@
    output_asm_insn ("vpst\;vldrwt.u32\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_<supf>v4si_insn"))
+  (set_attr "length" "8")])
 
 (define_expand "mve_vldrwq_gather_base_wb_fv4sf"
   [(match_operand:V4SI 0 "s_register_operand")
@@ -5677,7 +5951,8 @@
    output_asm_insn ("vldrw.u32\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_fv4sf_insn"))
+  (set_attr "length" "4")])
 
 (define_expand "mve_vldrwq_gather_base_wb_z_fv4sf"
   [(match_operand:V4SI 0 "s_register_operand")
@@ -5734,7 +6009,8 @@
    output_asm_insn ("vpst\;vldrwt.u32\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_fv4sf_insn"))
+  (set_attr "length" "8")])
 
 (define_expand "mve_vldrdq_gather_base_wb_<supf>v2di"
   [(match_operand:V2DI 0 "s_register_operand")
@@ -5787,7 +6063,8 @@
    output_asm_insn ("vldrd.64\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_wb_<supf>v2di_insn"))
+  (set_attr "length" "4")])
 
 (define_expand "mve_vldrdq_gather_base_wb_z_<supf>v2di"
   [(match_operand:V2DI 0 "s_register_operand")
@@ -5826,7 +6103,7 @@
    (unspec_volatile:SI [(reg:SI VFPCC_REGNUM)] UNSPEC_GET_FPSCR_NZCVQC))]
  "TARGET_HAVE_MVE"
  "vmrs\\t%0, FPSCR_nzcvqc"
-  [(set_attr "type" "mve_move")])
+ [(set_attr "type" "mve_move")])
 
 (define_insn "set_fpscr_nzcvqc"
  [(set (reg:SI VFPCC_REGNUM)
@@ -5834,7 +6111,7 @@
     VUNSPEC_SET_FPSCR_NZCVQC))]
  "TARGET_HAVE_MVE"
  "vmsr\\tFPSCR_nzcvqc, %0"
-  [(set_attr "type" "mve_move")])
+ [(set_attr "type" "mve_move")])
 
 ;;
 ;; [vldrdq_gather_base_wb_z_s vldrdq_gather_base_wb_z_u]
@@ -5859,7 +6136,8 @@
    output_asm_insn ("vpst\;vldrdt.u64\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_wb_<supf>v2di_insn"))
+  (set_attr "length" "8")])
 ;;
 ;; [vadciq_m_s, vadciq_m_u])
 ;;
@@ -5876,7 +6154,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vadcit.i32\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadciq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "8")])
 
 ;;
@@ -5893,7 +6172,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vadci.i32\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadciq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "4")])
 
 ;;
@@ -5912,7 +6192,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vadct.i32\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadcq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "8")])
 
 ;;
@@ -5929,7 +6210,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vadc.i32\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadcq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "4")
    (set_attr "conds" "set")])
 
@@ -5949,7 +6231,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vsbcit.i32\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbciq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "8")])
 
 ;;
@@ -5966,7 +6249,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vsbci.i32\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbciq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "4")])
 
 ;;
@@ -5985,7 +6269,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vsbct.i32\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbcq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "8")])
 
 ;;
@@ -6002,7 +6287,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vsbc.i32\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbcq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "4")])
 
 ;;
@@ -6031,7 +6317,7 @@
 		    "vst21.<V_sz_elem>\t{%q0, %q1}, %3", ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set_attr "length" "8")])
 
 ;;
 ;; [vld2q])
@@ -6059,7 +6345,7 @@
 		    "vld21.<V_sz_elem>\t{%q0, %q1}, %3", ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set_attr "length" "8")])
 
 ;;
 ;; [vld4q])
@@ -6402,7 +6688,8 @@
  ]
  "TARGET_HAVE_MVE"
  "vpst\;vshlct\t%q0, %1, %4"
- [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlcq_<supf><mode>"))
+  (set_attr "type" "mve_move")
   (set_attr "length" "8")])
 
 ;; CDE instructions on MVE registers.
@@ -6414,7 +6701,8 @@
 	 UNSPEC_VCDE))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx1\\tp%c1, %q0, #%c2"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx1qv16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx1qav16qi"
@@ -6425,7 +6713,8 @@
 	 UNSPEC_VCDEA))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx1a\\tp%c1, %q0, #%c3"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx1qav16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx2qv16qi"
@@ -6436,7 +6725,8 @@
 	 UNSPEC_VCDE))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx2\\tp%c1, %q0, %q2, #%c3"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx2qv16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx2qav16qi"
@@ -6448,7 +6738,8 @@
 	 UNSPEC_VCDEA))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx2a\\tp%c1, %q0, %q3, #%c4"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx2qav16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx3qv16qi"
@@ -6460,7 +6751,8 @@
 	 UNSPEC_VCDE))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx3\\tp%c1, %q0, %q2, %q3, #%c4"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx3qv16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx3qav16qi"
@@ -6473,7 +6765,8 @@
 	 UNSPEC_VCDEA))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx3a\\tp%c1, %q0, %q3, %q4, #%c5"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx3qav16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx1q<a>_p_v16qi"
@@ -6485,7 +6778,8 @@
 	 CDE_VCX))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vpst\;vcx1<a>t\\tp%c1, %q0, #%c3"
-  [(set_attr "type" "coproc")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx1q<a>v16qi"))
+  (set_attr "type" "coproc")
    (set_attr "length" "8")]
 )
 
@@ -6499,7 +6793,8 @@
 	 CDE_VCX))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vpst\;vcx2<a>t\\tp%c1, %q0, %q3, #%c4"
-  [(set_attr "type" "coproc")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx2q<a>v16qi"))
+  (set_attr "type" "coproc")
    (set_attr "length" "8")]
 )
 
@@ -6514,11 +6809,12 @@
 	 CDE_VCX))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vpst\;vcx3<a>t\\tp%c1, %q0, %q3, %q4, #%c5"
-  [(set_attr "type" "coproc")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx3q<a>v16qi"))
+  (set_attr "type" "coproc")
    (set_attr "length" "8")]
 )
 
-(define_insn "*movmisalign<mode>_mve_store"
+(define_insn "movmisalign<mode>_mve_store"
   [(set (match_operand:MVE_VLD_ST 0 "mve_memory_operand"	     "=Ux")
 	(unspec:MVE_VLD_ST [(match_operand:MVE_VLD_ST 1 "s_register_operand" " w")]
 	 UNSPEC_MISALIGNED_ACCESS))]
@@ -6526,11 +6822,12 @@
     || (TARGET_HAVE_MVE_FLOAT && VALID_MVE_SF_MODE (<MODE>mode)))
    && !BYTES_BIG_ENDIAN && unaligned_access"
   "vstr<V_sz_elem1>.<V_sz_elem>\t%q1, %E0"
-  [(set_attr "type" "mve_store")]
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_movmisalign<mode>_mve_store"))
+   (set_attr "type" "mve_store")]
 )
 
 
-(define_insn "*movmisalign<mode>_mve_load"
+(define_insn "movmisalign<mode>_mve_load"
   [(set (match_operand:MVE_VLD_ST 0 "s_register_operand"				 "=w")
 	(unspec:MVE_VLD_ST [(match_operand:MVE_VLD_ST 1 "mve_memory_operand" " Ux")]
 	 UNSPEC_MISALIGNED_ACCESS))]
@@ -6538,7 +6835,8 @@
     || (TARGET_HAVE_MVE_FLOAT && VALID_MVE_SF_MODE (<MODE>mode)))
    && !BYTES_BIG_ENDIAN && unaligned_access"
   "vldr<V_sz_elem1>.<V_sz_elem>\t%q0, %E1"
-  [(set_attr "type" "mve_load")]
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_movmisalign<mode>_mve_load"))
+   (set_attr "type" "mve_load")]
 )
 
 ;; Expander for VxBI moves
@@ -6620,3 +6918,40 @@
       }
   }
 )
+
+;; Originally expanded by 'predicated_doloop_end'.
+;; In the rare situation where the branch is too far, we also need to
+;; revert FPSCR.LTPSIZE back to 0b100 after the last iteration.
+(define_insn "*predicated_doloop_end_internal"
+  [(set (pc)
+	(if_then_else
+	   (ge (plus:SI (reg:SI LR_REGNUM)
+			(match_operand:SI 0 "const_int_operand" ""))
+		(const_int 0))
+	 (label_ref (match_operand 1 "" ""))
+	 (pc)))
+   (set (reg:SI LR_REGNUM)
+	(plus:SI (reg:SI LR_REGNUM) (match_dup 0)))
+   (clobber (reg:CC CC_REGNUM))]
+  "TARGET_32BIT && TARGET_HAVE_LOB && TARGET_HAVE_MVE && TARGET_THUMB2"
+  {
+    if (get_attr_length (insn) == 4)
+      return "letp\t%|lr, %l1";
+    else
+      return "subs\t%|lr, #%n0\n\tbgt\t%l1\n\tlctp";
+  }
+  [(set (attr "length")
+	(if_then_else
+	   (ltu (minus (pc) (match_dup 1)) (const_int 1024))
+	    (const_int 4)
+	    (const_int 6)))
+   (set_attr "type" "branch")])
+
+(define_insn "dlstp<mode1>_insn"
+  [
+    (set (reg:SI LR_REGNUM)
+	 (unspec:SI [(match_operand:SI 0 "s_register_operand" "r")]
+	  DLSTP))
+  ]
+  "TARGET_32BIT && TARGET_HAVE_LOB && TARGET_HAVE_MVE && TARGET_THUMB2"
+  "dlstp.<mode1>\t%|lr, %0")
diff --git a/gcc/config/arm/vec-common.md b/gcc/config/arm/vec-common.md
index 9af8429968d..74871cb984b 100644
--- a/gcc/config/arm/vec-common.md
+++ b/gcc/config/arm/vec-common.md
@@ -366,7 +366,8 @@
   "@
    <mve_insn>.<supf>%#<V_sz_elem>\t%<V_reg>0, %<V_reg>1, %<V_reg>2
    * return neon_output_shift_immediate (\"vshl\", 'i', &operands[2], <MODE>mode, VALID_NEON_QREG_MODE (<MODE>mode), true);"
-  [(set_attr "type" "neon_shift_reg<q>, neon_shift_imm<q>")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "neon_shift_reg<q>, neon_shift_imm<q>")]
 )
 
 (define_expand "vashl<mode>3"

^ permalink raw reply	[flat|nested] 7+ messages in thread

* [PATCH 1/2] arm: Add define_attr to to create a mapping between MVE predicated and unpredicated insns
@ 2023-08-17 10:30 Stamatis Markianos-Wright
  0 siblings, 0 replies; 7+ messages in thread
From: Stamatis Markianos-Wright @ 2023-08-17 10:30 UTC (permalink / raw)
  To: gcc-patches; +Cc: Kyrylo Tkachov, Richard Earnshaw

[-- Attachment #1: Type: text/plain, Size: 33653 bytes --]

Hi all,

I'd like to submit two patches that add support for Arm's MVE
Tail Predicated Low Overhead Loop feature.

--- Introduction ---

The M-class Arm-ARM:
https://developer.arm.com/documentation/ddi0553/bu/?lang=en
Section B5.5.1 "Loop tail predication" describes the feature
we are adding support for with this patch (although
we only add codegen for DLSTP/LETP instruction loops).

Previously with commit d2ed233cb94 we'd added support for
non-MVE DLS/LE loops through the loop-doloop pass, which, given
a standard MVE loop like:

```
void  __attribute__ ((noinline)) test (int16_t *a, int16_t *b, int16_t *c, int n)
{
   while (n > 0)
     {
       mve_pred16_t p = vctp16q (n);
       int16x8_t va = vldrhq_z_s16 (a, p);
       int16x8_t vb = vldrhq_z_s16 (b, p);
       int16x8_t vc = vaddq_x_s16 (va, vb, p);
       vstrhq_p_s16 (c, vc, p);
       c+=8;
       a+=8;
       b+=8;
       n-=8;
     }
}
```
.. would output:

```
         <pre-calculate the number of iterations and place it into lr>
         dls     lr, lr
.L3:
         vctp.16 r3
         vmrs    ip, P0  @ movhi
         sxth    ip, ip
         vmsr     P0, ip @ movhi
         mov     r4, r0
         vpst
         vldrht.16       q2, [r4]
         mov     r4, r1
         vmov    q3, q0
         vpst
         vldrht.16       q1, [r4]
         mov     r4, r2
         vpst
         vaddt.i16       q3, q2, q1
         subs    r3, r3, #8
         vpst
         vstrht.16       q3, [r4]
         adds    r0, r0, #16
         adds    r1, r1, #16
         adds    r2, r2, #16
         le      lr, .L3
```

where the LE instruction will decrement LR by 1, compare and
branch if needed.

(there are also other inefficiencies in the above code, like the
pointless vmrs/sxth/vmsr on the VPR, the adds not being merged
into the vldrht/vstrht as #16 offsets, and some random movs!
But those are different problems...)

The MVE version is similar, except that:
* Instead of DLS/LE the instructions are DLSTP/LETP.
* Instead of pre-calculating the number of iterations of the
   loop, we place the number of elements to be processed by the
   loop into LR.
* Instead of decrementing the LR by one, LETP will decrement it
   by FPSCR.LTPSIZE, which is the number of elements being
   processed in each iteration: 16 for 8-bit elements, 5 for 16-bit
   elements, etc.
* On the final iteration, automatic Loop Tail Predication is
   performed, as if the instructions within the loop had been VPT
   predicated with a VCTP generating the VPR predicate in every
   loop iteration.

The dlstp/letp loop now looks like:

```
         <place n into r3>
         dlstp.16        lr, r3
.L14:
         mov     r3, r0
         vldrh.16        q3, [r3]
         mov     r3, r1
         vldrh.16        q2, [r3]
         mov     r3, r2
         vadd.i16  q3, q3, q2
         adds    r0, r0, #16
         vstrh.16        q3, [r3]
         adds    r1, r1, #16
         adds    r2, r2, #16
         letp    lr, .L14

```

Since the loop tail predication is automatic, we have eliminated
the VCTP that had been specified by the user in the intrinsic
and converted the VPT-predicated instructions into their
unpredicated equivalents (which also saves us from VPST insns).

The LETP instruction here decrements LR by 8 in each iteration.
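
To make the new loop semantics concrete, here is a rough scalar model of
what the dlstp.16/letp pair above does for this example. This is an
illustrative sketch only -- the function and its name are not part of the
patch, and the lane handling is simplified:

```
#include <stdint.h>

/* Scalar model of the dlstp.16/letp loop: LR starts out holding the
   element count n, and LETP subtracts FPSCR.LTPSIZE (8 for 16-bit
   elements) on each iteration.  */
void test_model (int16_t *a, int16_t *b, int16_t *c, int n)
{
  for (int lr = n; lr > 0; lr -= 8)
    {
      /* On the final iteration only the remaining lanes are enabled --
	 this is the automatic loop tail predication.  */
      int active = lr < 8 ? lr : 8;
      for (int lane = 0; lane < active; lane++)
	c[lane] = (int16_t) (a[lane] + b[lane]);
      a += 8;
      b += 8;
      c += 8;
    }
}
```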

--- This 1/2 patch ---

This first patch lays some groundwork by adding an attribute to
md patterns, and then the second patch contains the functional
changes.

One major difficulty in implementing MVE Tail-Predicated Low
Overhead Loops was the need to transform VPT-predicated insns
in the insn chain into their unpredicated equivalents, like:
`mve_vldrbq_z_<supf><mode> -> mve_vldrbq_<supf><mode>`.

This requires a deterministic link between two different patterns
in mve.md -- this _could_ be done by re-ordering the entirety of
mve.md so that related patterns sit at a constant icode distance
(e.g. having the _z pattern immediately after the unpredicated
version would mean that to map from the former to the latter you
could use icode-1), but that would be a very messy solution,
creating fragile, hidden dependencies on the ordering of the
patterns.

This patch provides an alternative way of doing that: using an insn
attribute to encode the icode of the unpredicated instruction.
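
As a sketch of how a later pass might consume this mapping: the accessor
name below simply follows the get_attr_<name> convention that genattrtab
generates for insn attributes, and the helper itself is illustrative
rather than the actual arm.h code (the real interface is the
MVE_VPT_*_INSN_P macros listed in the ChangeLog), so treat the details as
assumptions:

```
/* Illustrative only: query the new attribute on a recognised insn and
   check whether it records the icode of an unpredicated twin, e.g. the
   predicated mve_vldrhq_z_<supf><mode> pattern records
   CODE_FOR_mve_vldrhq_<supf><mode>.  The attribute's default value and
   the real helpers live in the arm.md/arm.h hunks of this patch, so the
   comparison against CODE_FOR_nothing is an assumption.  */
static bool
mve_has_unpredicated_twin_p (rtx_insn *insn)
{
  if (!NONDEBUG_INSN_P (insn) || recog_memoized (insn) < 0)
    return false;
  return get_attr_mve_unpredicated_insn (insn) != CODE_FOR_nothing;
}
```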

No regressions on arm-none-eabi with an MVE target.

Thank you,
Stam Markianos-Wright

gcc/ChangeLog:

         * config/arm/arm.md (mve_unpredicated_insn): New attribute.
         * config/arm/arm.h (MVE_VPT_PREDICATED_INSN_P): New define.
     (MVE_VPT_UNPREDICATED_INSN_P): Likewise.
     (MVE_VPT_PREDICABLE_INSN_P): Likewise.
         * config/arm/vec-common.md (mve_vshlq_<supf><mode>): Add attribute.
         * config/arm/mve.md (arm_vcx1q<a>_p_v16qi): Add attribute.
     (arm_vcx1q<a>v16qi): Likewise.
     (arm_vcx1qav16qi): Likewise.
     (arm_vcx1qv16qi): Likewise.
     (arm_vcx2q<a>_p_v16qi): Likewise.
     (arm_vcx2q<a>v16qi): Likewise.
     (arm_vcx2qav16qi): Likewise.
     (arm_vcx2qv16qi): Likewise.
     (arm_vcx3q<a>_p_v16qi): Likewise.
     (arm_vcx3q<a>v16qi): Likewise.
     (arm_vcx3qav16qi): Likewise.
     (arm_vcx3qv16qi): Likewise.
     (mve_vabavq_<supf><mode>): Likewise.
     (mve_vabavq_p_<supf><mode>): Likewise.
     (mve_vabdq_<supf><mode>): Likewise.
     (mve_vabdq_f<mode>): Likewise.
     (mve_vabdq_m_<supf><mode>): Likewise.
     (mve_vabdq_m_f<mode>): Likewise.
     (mve_vabsq_f<mode>): Likewise.
     (mve_vabsq_m_f<mode>): Likewise.
     (mve_vabsq_m_s<mode>): Likewise.
     (mve_vabsq_s<mode>): Likewise.
     (mve_vadciq_<supf>v4si): Likewise.
     (mve_vadciq_m_<supf>v4si): Likewise.
     (mve_vadcq_<supf>v4si): Likewise.
     (mve_vadcq_m_<supf>v4si): Likewise.
     (mve_vaddlvaq_<supf>v4si): Likewise.
     (mve_vaddlvaq_p_<supf>v4si): Likewise.
     (mve_vaddlvq_<supf>v4si): Likewise.
     (mve_vaddlvq_p_<supf>v4si): Likewise.
     (mve_vaddq_f<mode>): Likewise.
     (mve_vaddq_m_<supf><mode>): Likewise.
     (mve_vaddq_m_f<mode>): Likewise.
     (mve_vaddq_m_n_<supf><mode>): Likewise.
     (mve_vaddq_m_n_f<mode>): Likewise.
     (mve_vaddq_n_<supf><mode>): Likewise.
     (mve_vaddq_n_f<mode>): Likewise.
     (mve_vaddq<mode>): Likewise.
     (mve_vaddvaq_<supf><mode>): Likewise.
     (mve_vaddvaq_p_<supf><mode>): Likewise.
     (mve_vaddvq_<supf><mode>): Likewise.
     (mve_vaddvq_p_<supf><mode>): Likewise.
     (mve_vandq_<supf><mode>): Likewise.
     (mve_vandq_f<mode>): Likewise.
     (mve_vandq_m_<supf><mode>): Likewise.
     (mve_vandq_m_f<mode>): Likewise.
     (mve_vandq_s<mode>): Likewise.
     (mve_vandq_u<mode>): Likewise.
     (mve_vbicq_<supf><mode>): Likewise.
     (mve_vbicq_f<mode>): Likewise.
     (mve_vbicq_m_<supf><mode>): Likewise.
     (mve_vbicq_m_f<mode>): Likewise.
     (mve_vbicq_m_n_<supf><mode>): Likewise.
     (mve_vbicq_n_<supf><mode>): Likewise.
     (mve_vbicq_s<mode>): Likewise.
     (mve_vbicq_u<mode>): Likewise.
     (mve_vbrsrq_m_n_<supf><mode>): Likewise.
     (mve_vbrsrq_m_n_f<mode>): Likewise.
     (mve_vbrsrq_n_<supf><mode>): Likewise.
     (mve_vbrsrq_n_f<mode>): Likewise.
     (mve_vcaddq_rot270_m_<supf><mode>): Likewise.
     (mve_vcaddq_rot270_m_f<mode>): Likewise.
     (mve_vcaddq_rot270<mode>): Likewise.
     (mve_vcaddq_rot270<mode>): Likewise.
     (mve_vcaddq_rot90_m_<supf><mode>): Likewise.
     (mve_vcaddq_rot90_m_f<mode>): Likewise.
     (mve_vcaddq_rot90<mode>): Likewise.
     (mve_vcaddq_rot90<mode>): Likewise.
     (mve_vcaddq<mve_rot><mode>): Likewise.
     (mve_vcaddq<mve_rot><mode>): Likewise.
     (mve_vclsq_m_s<mode>): Likewise.
     (mve_vclsq_s<mode>): Likewise.
     (mve_vclzq_<supf><mode>): Likewise.
     (mve_vclzq_m_<supf><mode>): Likewise.
     (mve_vclzq_s<mode>): Likewise.
     (mve_vclzq_u<mode>): Likewise.
     (mve_vcmlaq_m_f<mode>): Likewise.
     (mve_vcmlaq_rot180_m_f<mode>): Likewise.
     (mve_vcmlaq_rot180<mode>): Likewise.
     (mve_vcmlaq_rot270_m_f<mode>): Likewise.
     (mve_vcmlaq_rot270<mode>): Likewise.
     (mve_vcmlaq_rot90_m_f<mode>): Likewise.
     (mve_vcmlaq_rot90<mode>): Likewise.
     (mve_vcmlaq<mode>): Likewise.
     (mve_vcmlaq<mve_rot><mode>): Likewise.
     (mve_vcmp<mve_cmp_op>q_<mode>): Likewise.
     (mve_vcmp<mve_cmp_op>q_f<mode>): Likewise.
     (mve_vcmp<mve_cmp_op>q_n_<mode>): Likewise.
     (mve_vcmp<mve_cmp_op>q_n_f<mode>): Likewise.
     (mve_vcmpcsq_<mode>): Likewise.
     (mve_vcmpcsq_m_n_u<mode>): Likewise.
     (mve_vcmpcsq_m_u<mode>): Likewise.
     (mve_vcmpcsq_n_<mode>): Likewise.
     (mve_vcmpeqq_<mode>): Likewise.
     (mve_vcmpeqq_f<mode>): Likewise.
     (mve_vcmpeqq_m_<supf><mode>): Likewise.
     (mve_vcmpeqq_m_f<mode>): Likewise.
     (mve_vcmpeqq_m_n_<supf><mode>): Likewise.
     (mve_vcmpeqq_m_n_f<mode>): Likewise.
     (mve_vcmpeqq_n_<mode>): Likewise.
     (mve_vcmpeqq_n_f<mode>): Likewise.
     (mve_vcmpgeq_<mode>): Likewise.
     (mve_vcmpgeq_f<mode>): Likewise.
     (mve_vcmpgeq_m_f<mode>): Likewise.
     (mve_vcmpgeq_m_n_f<mode>): Likewise.
     (mve_vcmpgeq_m_n_s<mode>): Likewise.
     (mve_vcmpgeq_m_s<mode>): Likewise.
     (mve_vcmpgeq_n_<mode>): Likewise.
     (mve_vcmpgeq_n_f<mode>): Likewise.
     (mve_vcmpgtq_<mode>): Likewise.
     (mve_vcmpgtq_f<mode>): Likewise.
     (mve_vcmpgtq_m_f<mode>): Likewise.
     (mve_vcmpgtq_m_n_f<mode>): Likewise.
     (mve_vcmpgtq_m_n_s<mode>): Likewise.
     (mve_vcmpgtq_m_s<mode>): Likewise.
     (mve_vcmpgtq_n_<mode>): Likewise.
     (mve_vcmpgtq_n_f<mode>): Likewise.
     (mve_vcmphiq_<mode>): Likewise.
     (mve_vcmphiq_m_n_u<mode>): Likewise.
     (mve_vcmphiq_m_u<mode>): Likewise.
     (mve_vcmphiq_n_<mode>): Likewise.
     (mve_vcmpleq_<mode>): Likewise.
     (mve_vcmpleq_f<mode>): Likewise.
     (mve_vcmpleq_m_f<mode>): Likewise.
     (mve_vcmpleq_m_n_f<mode>): Likewise.
     (mve_vcmpleq_m_n_s<mode>): Likewise.
     (mve_vcmpleq_m_s<mode>): Likewise.
     (mve_vcmpleq_n_<mode>): Likewise.
     (mve_vcmpleq_n_f<mode>): Likewise.
     (mve_vcmpltq_<mode>): Likewise.
     (mve_vcmpltq_f<mode>): Likewise.
     (mve_vcmpltq_m_f<mode>): Likewise.
     (mve_vcmpltq_m_n_f<mode>): Likewise.
     (mve_vcmpltq_m_n_s<mode>): Likewise.
     (mve_vcmpltq_m_s<mode>): Likewise.
     (mve_vcmpltq_n_<mode>): Likewise.
     (mve_vcmpltq_n_f<mode>): Likewise.
     (mve_vcmpneq_<mode>): Likewise.
     (mve_vcmpneq_f<mode>): Likewise.
     (mve_vcmpneq_m_<supf><mode>): Likewise.
     (mve_vcmpneq_m_f<mode>): Likewise.
     (mve_vcmpneq_m_n_<supf><mode>): Likewise.
     (mve_vcmpneq_m_n_f<mode>): Likewise.
     (mve_vcmpneq_n_<mode>): Likewise.
     (mve_vcmpneq_n_f<mode>): Likewise.
     (mve_vcmulq_m_f<mode>): Likewise.
     (mve_vcmulq_rot180_m_f<mode>): Likewise.
     (mve_vcmulq_rot180<mode>): Likewise.
     (mve_vcmulq_rot270_m_f<mode>): Likewise.
     (mve_vcmulq_rot270<mode>): Likewise.
     (mve_vcmulq_rot90_m_f<mode>): Likewise.
     (mve_vcmulq_rot90<mode>): Likewise.
     (mve_vcmulq<mode>): Likewise.
     (mve_vcmulq<mve_rot><mode>): Likewise.
     (mve_vctp<mode1>q_mhi): Likewise.
     (mve_vctp<mode1>qhi): Likewise.
     (mve_vcvtaq_<supf><mode>): Likewise.
     (mve_vcvtaq_m_<supf><mode>): Likewise.
     (mve_vcvtbq_f16_f32v8hf): Likewise.
     (mve_vcvtbq_f32_f16v4sf): Likewise.
     (mve_vcvtbq_m_f16_f32v8hf): Likewise.
     (mve_vcvtbq_m_f32_f16v4sf): Likewise.
     (mve_vcvtmq_<supf><mode>): Likewise.
     (mve_vcvtmq_m_<supf><mode>): Likewise.
     (mve_vcvtnq_<supf><mode>): Likewise.
     (mve_vcvtnq_m_<supf><mode>): Likewise.
     (mve_vcvtpq_<supf><mode>): Likewise.
     (mve_vcvtpq_m_<supf><mode>): Likewise.
     (mve_vcvtq_from_f_<supf><mode>): Likewise.
     (mve_vcvtq_m_from_f_<supf><mode>): Likewise.
     (mve_vcvtq_m_n_from_f_<supf><mode>): Likewise.
     (mve_vcvtq_m_n_to_f_<supf><mode>): Likewise.
     (mve_vcvtq_m_to_f_<supf><mode>): Likewise.
     (mve_vcvtq_n_from_f_<supf><mode>): Likewise.
     (mve_vcvtq_n_to_f_<supf><mode>): Likewise.
     (mve_vcvtq_to_f_<supf><mode>): Likewise.
     (mve_vcvttq_f16_f32v8hf): Likewise.
     (mve_vcvttq_f32_f16v4sf): Likewise.
     (mve_vcvttq_m_f16_f32v8hf): Likewise.
     (mve_vcvttq_m_f32_f16v4sf): Likewise.
     (mve_vddupq_m_wb_u<mode>_insn): Likewise.
     (mve_vddupq_u<mode>_insn): Likewise.
     (mve_vdupq_m_n_<supf><mode>): Likewise.
     (mve_vdupq_m_n_f<mode>): Likewise.
     (mve_vdupq_n_<supf><mode>): Likewise.
     (mve_vdupq_n_f<mode>): Likewise.
     (mve_vdwdupq_m_wb_u<mode>_insn): Likewise.
     (mve_vdwdupq_wb_u<mode>_insn): Likewise.
     (mve_veorq_<supf><mode>): Likewise.
     (mve_veorq_f<mode>): Likewise.
     (mve_veorq_m_<supf><mode>): Likewise.
     (mve_veorq_m_f<mode>): Likewise.
     (mve_veorq_s<mode>): Likewise.
     (mve_veorq_u<mode>): Likewise.
     (mve_vfmaq_f<mode>): Likewise.
     (mve_vfmaq_m_f<mode>): Likewise.
     (mve_vfmaq_m_n_f<mode>): Likewise.
     (mve_vfmaq_n_f<mode>): Likewise.
     (mve_vfmasq_m_n_f<mode>): Likewise.
     (mve_vfmasq_n_f<mode>): Likewise.
     (mve_vfmsq_f<mode>): Likewise.
     (mve_vfmsq_m_f<mode>): Likewise.
     (mve_vhaddq_<supf><mode>): Likewise.
     (mve_vhaddq_m_<supf><mode>): Likewise.
     (mve_vhaddq_m_n_<supf><mode>): Likewise.
     (mve_vhaddq_n_<supf><mode>): Likewise.
     (mve_vhcaddq_rot270_m_s<mode>): Likewise.
     (mve_vhcaddq_rot270_s<mode>): Likewise.
     (mve_vhcaddq_rot90_m_s<mode>): Likewise.
     (mve_vhcaddq_rot90_s<mode>): Likewise.
     (mve_vhsubq_<supf><mode>): Likewise.
     (mve_vhsubq_m_<supf><mode>): Likewise.
     (mve_vhsubq_m_n_<supf><mode>): Likewise.
     (mve_vhsubq_n_<supf><mode>): Likewise.
     (mve_vidupq_m_wb_u<mode>_insn): Likewise.
     (mve_vidupq_u<mode>_insn): Likewise.
     (mve_viwdupq_m_wb_u<mode>_insn): Likewise.
     (mve_viwdupq_wb_u<mode>_insn): Likewise.
     (mve_vldrbq_<supf><mode>): Likewise.
     (mve_vldrbq_gather_offset_<supf><mode>): Likewise.
     (mve_vldrbq_gather_offset_z_<supf><mode>): Likewise.
     (mve_vldrbq_z_<supf><mode>): Likewise.
     (mve_vldrdq_gather_base_<supf>v2di): Likewise.
     (mve_vldrdq_gather_base_wb_<supf>v2di_insn): Likewise.
     (mve_vldrdq_gather_base_wb_z_<supf>v2di_insn): Likewise.
     (mve_vldrdq_gather_base_z_<supf>v2di): Likewise.
     (mve_vldrdq_gather_offset_<supf>v2di): Likewise.
     (mve_vldrdq_gather_offset_z_<supf>v2di): Likewise.
     (mve_vldrdq_gather_shifted_offset_<supf>v2di): Likewise.
     (mve_vldrdq_gather_shifted_offset_z_<supf>v2di): Likewise.
     (mve_vldrhq_<supf><mode>): Likewise.
     (mve_vldrhq_fv8hf): Likewise.
     (mve_vldrhq_gather_offset_<supf><mode>): Likewise.
     (mve_vldrhq_gather_offset_fv8hf): Likewise.
     (mve_vldrhq_gather_offset_z_<supf><mode>): Likewise.
     (mve_vldrhq_gather_offset_z_fv8hf): Likewise.
     (mve_vldrhq_gather_shifted_offset_<supf><mode>): Likewise.
     (mve_vldrhq_gather_shifted_offset_fv8hf): Likewise.
     (mve_vldrhq_gather_shifted_offset_z_<supf><mode>): Likewise.
     (mve_vldrhq_gather_shifted_offset_z_fv8hf): Likewise.
     (mve_vldrhq_z_<supf><mode>): Likewise.
     (mve_vldrhq_z_fv8hf): Likewise.
     (mve_vldrwq_<supf>v4si): Likewise.
     (mve_vldrwq_fv4sf): Likewise.
     (mve_vldrwq_gather_base_<supf>v4si): Likewise.
     (mve_vldrwq_gather_base_fv4sf): Likewise.
     (mve_vldrwq_gather_base_wb_<supf>v4si_insn): Likewise.
     (mve_vldrwq_gather_base_wb_fv4sf_insn): Likewise.
     (mve_vldrwq_gather_base_wb_z_<supf>v4si_insn): Likewise.
     (mve_vldrwq_gather_base_wb_z_fv4sf_insn): Likewise.
     (mve_vldrwq_gather_base_z_<supf>v4si): Likewise.
     (mve_vldrwq_gather_base_z_fv4sf): Likewise.
     (mve_vldrwq_gather_offset_<supf>v4si): Likewise.
     (mve_vldrwq_gather_offset_fv4sf): Likewise.
     (mve_vldrwq_gather_offset_z_<supf>v4si): Likewise.
     (mve_vldrwq_gather_offset_z_fv4sf): Likewise.
     (mve_vldrwq_gather_shifted_offset_<supf>v4si): Likewise.
     (mve_vldrwq_gather_shifted_offset_fv4sf): Likewise.
     (mve_vldrwq_gather_shifted_offset_z_<supf>v4si): Likewise.
     (mve_vldrwq_gather_shifted_offset_z_fv4sf): Likewise.
     (mve_vldrwq_z_<supf>v4si): Likewise.
     (mve_vldrwq_z_fv4sf): Likewise.
     (mve_vmaxaq_m_s<mode>): Likewise.
     (mve_vmaxaq_s<mode>): Likewise.
     (mve_vmaxavq_p_s<mode>): Likewise.
     (mve_vmaxavq_s<mode>): Likewise.
     (mve_vmaxnmaq_f<mode>): Likewise.
     (mve_vmaxnmaq_m_f<mode>): Likewise.
     (mve_vmaxnmavq_f<mode>): Likewise.
     (mve_vmaxnmavq_p_f<mode>): Likewise.
     (mve_vmaxnmq_f<mode>): Likewise.
     (mve_vmaxnmq_m_f<mode>): Likewise.
     (mve_vmaxnmvq_f<mode>): Likewise.
     (mve_vmaxnmvq_p_f<mode>): Likewise.
     (mve_vmaxq_<supf><mode>): Likewise.
     (mve_vmaxq_m_<supf><mode>): Likewise.
     (mve_vmaxq_s<mode>): Likewise.
     (mve_vmaxq_u<mode>): Likewise.
     (mve_vmaxvq_<supf><mode>): Likewise.
     (mve_vmaxvq_p_<supf><mode>): Likewise.
     (mve_vminaq_m_s<mode>): Likewise.
     (mve_vminaq_s<mode>): Likewise.
     (mve_vminavq_p_s<mode>): Likewise.
     (mve_vminavq_s<mode>): Likewise.
     (mve_vminnmaq_f<mode>): Likewise.
     (mve_vminnmaq_m_f<mode>): Likewise.
     (mve_vminnmavq_f<mode>): Likewise.
     (mve_vminnmavq_p_f<mode>): Likewise.
     (mve_vminnmq_f<mode>): Likewise.
     (mve_vminnmq_m_f<mode>): Likewise.
     (mve_vminnmvq_f<mode>): Likewise.
     (mve_vminnmvq_p_f<mode>): Likewise.
     (mve_vminq_<supf><mode>): Likewise.
     (mve_vminq_m_<supf><mode>): Likewise.
     (mve_vminq_s<mode>): Likewise.
     (mve_vminq_u<mode>): Likewise.
     (mve_vminvq_<supf><mode>): Likewise.
     (mve_vminvq_p_<supf><mode>): Likewise.
     (mve_vmladavaq_<supf><mode>): Likewise.
     (mve_vmladavaq_p_<supf><mode>): Likewise.
     (mve_vmladavaxq_p_s<mode>): Likewise.
     (mve_vmladavaxq_s<mode>): Likewise.
     (mve_vmladavq_<supf><mode>): Likewise.
     (mve_vmladavq_p_<supf><mode>): Likewise.
     (mve_vmladavxq_p_s<mode>): Likewise.
     (mve_vmladavxq_s<mode>): Likewise.
     (mve_vmlaldavaq_<supf><mode>): Likewise.
     (mve_vmlaldavaq_p_<supf><mode>): Likewise.
     (mve_vmlaldavaxq_<supf><mode>): Likewise.
     (mve_vmlaldavaxq_p_<supf><mode>): Likewise.
     (mve_vmlaldavaxq_s<mode>): Likewise.
     (mve_vmlaldavq_<supf><mode>): Likewise.
     (mve_vmlaldavq_p_<supf><mode>): Likewise.
     (mve_vmlaldavxq_p_s<mode>): Likewise.
     (mve_vmlaldavxq_s<mode>): Likewise.
     (mve_vmlaq_m_n_<supf><mode>): Likewise.
     (mve_vmlaq_n_<supf><mode>): Likewise.
     (mve_vmlasq_m_n_<supf><mode>): Likewise.
     (mve_vmlasq_n_<supf><mode>): Likewise.
     (mve_vmlsdavaq_p_s<mode>): Likewise.
     (mve_vmlsdavaq_s<mode>): Likewise.
     (mve_vmlsdavaxq_p_s<mode>): Likewise.
     (mve_vmlsdavaxq_s<mode>): Likewise.
     (mve_vmlsdavq_p_s<mode>): Likewise.
     (mve_vmlsdavq_s<mode>): Likewise.
     (mve_vmlsdavxq_p_s<mode>): Likewise.
     (mve_vmlsdavxq_s<mode>): Likewise.
     (mve_vmlsldavaq_p_s<mode>): Likewise.
     (mve_vmlsldavaq_s<mode>): Likewise.
     (mve_vmlsldavaxq_p_s<mode>): Likewise.
     (mve_vmlsldavaxq_s<mode>): Likewise.
     (mve_vmlsldavq_p_s<mode>): Likewise.
     (mve_vmlsldavq_s<mode>): Likewise.
     (mve_vmlsldavxq_p_s<mode>): Likewise.
     (mve_vmlsldavxq_s<mode>): Likewise.
     (mve_vmovlbq_<supf><mode>): Likewise.
     (mve_vmovlbq_m_<supf><mode>): Likewise.
     (mve_vmovltq_<supf><mode>): Likewise.
     (mve_vmovltq_m_<supf><mode>): Likewise.
     (mve_vmovnbq_<supf><mode>): Likewise.
     (mve_vmovnbq_m_<supf><mode>): Likewise.
     (mve_vmovntq_<supf><mode>): Likewise.
     (mve_vmovntq_m_<supf><mode>): Likewise.
     (mve_vmulhq_<supf><mode>): Likewise.
     (mve_vmulhq_m_<supf><mode>): Likewise.
     (mve_vmullbq_int_<supf><mode>): Likewise.
     (mve_vmullbq_int_m_<supf><mode>): Likewise.
     (mve_vmullbq_poly_m_p<mode>): Likewise.
     (mve_vmullbq_poly_p<mode>): Likewise.
     (mve_vmulltq_int_<supf><mode>): Likewise.
     (mve_vmulltq_int_m_<supf><mode>): Likewise.
     (mve_vmulltq_poly_m_p<mode>): Likewise.
     (mve_vmulltq_poly_p<mode>): Likewise.
     (mve_vmulq_<supf><mode>): Likewise.
     (mve_vmulq_f<mode>): Likewise.
     (mve_vmulq_m_<supf><mode>): Likewise.
     (mve_vmulq_m_f<mode>): Likewise.
     (mve_vmulq_m_n_<supf><mode>): Likewise.
     (mve_vmulq_m_n_f<mode>): Likewise.
     (mve_vmulq_n_<supf><mode>): Likewise.
     (mve_vmulq_n_f<mode>): Likewise.
     (mve_vmvnq_<supf><mode>): Likewise.
     (mve_vmvnq_m_<supf><mode>): Likewise.
     (mve_vmvnq_m_n_<supf><mode>): Likewise.
     (mve_vmvnq_n_<supf><mode>): Likewise.
     (mve_vmvnq_s<mode>): Likewise.
     (mve_vmvnq_u<mode>): Likewise.
     (mve_vnegq_f<mode>): Likewise.
     (mve_vnegq_m_f<mode>): Likewise.
     (mve_vnegq_m_s<mode>): Likewise.
     (mve_vnegq_s<mode>): Likewise.
     (mve_vornq_<supf><mode>): Likewise.
     (mve_vornq_f<mode>): Likewise.
     (mve_vornq_m_<supf><mode>): Likewise.
     (mve_vornq_m_f<mode>): Likewise.
     (mve_vornq_s<mode>): Likewise.
     (mve_vornq_u<mode>): Likewise.
     (mve_vorrq_<supf><mode>): Likewise.
     (mve_vorrq_f<mode>): Likewise.
     (mve_vorrq_m_<supf><mode>): Likewise.
     (mve_vorrq_m_f<mode>): Likewise.
     (mve_vorrq_m_n_<supf><mode>): Likewise.
     (mve_vorrq_n_<supf><mode>): Likewise.
     (mve_vorrq_s<mode>): Likewise.
     (mve_vorrq_s<mode>): Likewise.
     (mve_vqabsq_m_s<mode>): Likewise.
     (mve_vqabsq_s<mode>): Likewise.
     (mve_vqaddq_<supf><mode>): Likewise.
     (mve_vqaddq_m_<supf><mode>): Likewise.
     (mve_vqaddq_m_n_<supf><mode>): Likewise.
     (mve_vqaddq_n_<supf><mode>): Likewise.
     (mve_vqdmladhq_m_s<mode>): Likewise.
     (mve_vqdmladhq_s<mode>): Likewise.
     (mve_vqdmladhxq_m_s<mode>): Likewise.
     (mve_vqdmladhxq_s<mode>): Likewise.
     (mve_vqdmlahq_m_n_s<mode>): Likewise.
     (mve_vqdmlahq_n_<supf><mode>): Likewise.
     (mve_vqdmlahq_n_s<mode>): Likewise.
     (mve_vqdmlashq_m_n_s<mode>): Likewise.
     (mve_vqdmlashq_n_<supf><mode>): Likewise.
     (mve_vqdmlashq_n_s<mode>): Likewise.
     (mve_vqdmlsdhq_m_s<mode>): Likewise.
     (mve_vqdmlsdhq_s<mode>): Likewise.
     (mve_vqdmlsdhxq_m_s<mode>): Likewise.
     (mve_vqdmlsdhxq_s<mode>): Likewise.
     (mve_vqdmulhq_m_n_s<mode>): Likewise.
     (mve_vqdmulhq_m_s<mode>): Likewise.
     (mve_vqdmulhq_n_s<mode>): Likewise.
     (mve_vqdmulhq_s<mode>): Likewise.
     (mve_vqdmullbq_m_n_s<mode>): Likewise.
     (mve_vqdmullbq_m_s<mode>): Likewise.
     (mve_vqdmullbq_n_s<mode>): Likewise.
     (mve_vqdmullbq_s<mode>): Likewise.
     (mve_vqdmulltq_m_n_s<mode>): Likewise.
     (mve_vqdmulltq_m_s<mode>): Likewise.
     (mve_vqdmulltq_n_s<mode>): Likewise.
     (mve_vqdmulltq_s<mode>): Likewise.
     (mve_vqmovnbq_<supf><mode>): Likewise.
     (mve_vqmovnbq_m_<supf><mode>): Likewise.
     (mve_vqmovntq_<supf><mode>): Likewise.
     (mve_vqmovntq_m_<supf><mode>): Likewise.
     (mve_vqmovunbq_m_s<mode>): Likewise.
     (mve_vqmovunbq_s<mode>): Likewise.
     (mve_vqmovuntq_m_s<mode>): Likewise.
     (mve_vqmovuntq_s<mode>): Likewise.
     (mve_vqnegq_m_s<mode>): Likewise.
     (mve_vqnegq_s<mode>): Likewise.
     (mve_vqrdmladhq_m_s<mode>): Likewise.
     (mve_vqrdmladhq_s<mode>): Likewise.
     (mve_vqrdmladhxq_m_s<mode>): Likewise.
     (mve_vqrdmladhxq_s<mode>): Likewise.
     (mve_vqrdmlahq_m_n_s<mode>): Likewise.
     (mve_vqrdmlahq_n_<supf><mode>): Likewise.
     (mve_vqrdmlahq_n_s<mode>): Likewise.
     (mve_vqrdmlashq_m_n_s<mode>): Likewise.
     (mve_vqrdmlashq_n_<supf><mode>): Likewise.
     (mve_vqrdmlashq_n_s<mode>): Likewise.
     (mve_vqrdmlsdhq_m_s<mode>): Likewise.
     (mve_vqrdmlsdhq_s<mode>): Likewise.
     (mve_vqrdmlsdhxq_m_s<mode>): Likewise.
     (mve_vqrdmlsdhxq_s<mode>): Likewise.
     (mve_vqrdmulhq_m_n_s<mode>): Likewise.
     (mve_vqrdmulhq_m_s<mode>): Likewise.
     (mve_vqrdmulhq_n_s<mode>): Likewise.
     (mve_vqrdmulhq_s<mode>): Likewise.
     (mve_vqrshlq_<supf><mode>): Likewise.
     (mve_vqrshlq_m_<supf><mode>): Likewise.
     (mve_vqrshlq_m_n_<supf><mode>): Likewise.
     (mve_vqrshlq_n_<supf><mode>): Likewise.
     (mve_vqrshrnbq_m_n_<supf><mode>): Likewise.
     (mve_vqrshrnbq_n_<supf><mode>): Likewise.
     (mve_vqrshrntq_m_n_<supf><mode>): Likewise.
     (mve_vqrshrntq_n_<supf><mode>): Likewise.
     (mve_vqrshrunbq_m_n_s<mode>): Likewise.
     (mve_vqrshrunbq_n_s<mode>): Likewise.
     (mve_vqrshruntq_m_n_s<mode>): Likewise.
     (mve_vqrshruntq_n_s<mode>): Likewise.
     (mve_vqshlq_<supf><mode>): Likewise.
     (mve_vqshlq_m_<supf><mode>): Likewise.
     (mve_vqshlq_m_n_<supf><mode>): Likewise.
     (mve_vqshlq_m_r_<supf><mode>): Likewise.
     (mve_vqshlq_n_<supf><mode>): Likewise.
     (mve_vqshlq_r_<supf><mode>): Likewise.
     (mve_vqshluq_m_n_s<mode>): Likewise.
     (mve_vqshluq_n_s<mode>): Likewise.
     (mve_vqshrnbq_m_n_<supf><mode>): Likewise.
     (mve_vqshrnbq_n_<supf><mode>): Likewise.
     (mve_vqshrntq_m_n_<supf><mode>): Likewise.
     (mve_vqshrntq_n_<supf><mode>): Likewise.
     (mve_vqshrunbq_m_n_s<mode>): Likewise.
     (mve_vqshrunbq_n_s<mode>): Likewise.
     (mve_vqshruntq_m_n_s<mode>): Likewise.
     (mve_vqshruntq_n_s<mode>): Likewise.
     (mve_vqsubq_<supf><mode>): Likewise.
     (mve_vqsubq_m_<supf><mode>): Likewise.
     (mve_vqsubq_m_n_<supf><mode>): Likewise.
     (mve_vqsubq_n_<supf><mode>): Likewise.
     (mve_vrev16q_<supf>v16qi): Likewise.
     (mve_vrev16q_m_<supf>v16qi): Likewise.
     (mve_vrev32q_<supf><mode>): Likewise.
     (mve_vrev32q_fv8hf): Likewise.
     (mve_vrev32q_m_<supf><mode>): Likewise.
     (mve_vrev32q_m_fv8hf): Likewise.
     (mve_vrev64q_<supf><mode>): Likewise.
     (mve_vrev64q_f<mode>): Likewise.
     (mve_vrev64q_m_<supf><mode>): Likewise.
     (mve_vrev64q_m_f<mode>): Likewise.
     (mve_vrhaddq_<supf><mode>): Likewise.
     (mve_vrhaddq_m_<supf><mode>): Likewise.
     (mve_vrmlaldavhaq_<supf>v4si): Likewise.
     (mve_vrmlaldavhaq_p_sv4si): Likewise.
     (mve_vrmlaldavhaq_p_uv4si): Likewise.
     (mve_vrmlaldavhaq_sv4si): Likewise.
     (mve_vrmlaldavhaq_uv4si): Likewise.
     (mve_vrmlaldavhaxq_p_sv4si): Likewise.
     (mve_vrmlaldavhaxq_sv4si): Likewise.
     (mve_vrmlaldavhq_<supf>v4si): Likewise.
     (mve_vrmlaldavhq_p_<supf>v4si): Likewise.
     (mve_vrmlaldavhxq_p_sv4si): Likewise.
     (mve_vrmlaldavhxq_sv4si): Likewise.
     (mve_vrmlsldavhaq_p_sv4si): Likewise.
     (mve_vrmlsldavhaq_sv4si): Likewise.
     (mve_vrmlsldavhaxq_p_sv4si): Likewise.
     (mve_vrmlsldavhaxq_sv4si): Likewise.
     (mve_vrmlsldavhq_p_sv4si): Likewise.
     (mve_vrmlsldavhq_sv4si): Likewise.
     (mve_vrmlsldavhxq_p_sv4si): Likewise.
     (mve_vrmlsldavhxq_sv4si): Likewise.
     (mve_vrmulhq_<supf><mode>): Likewise.
     (mve_vrmulhq_m_<supf><mode>): Likewise.
     (mve_vrndaq_f<mode>): Likewise.
     (mve_vrndaq_m_f<mode>): Likewise.
     (mve_vrndmq_f<mode>): Likewise.
     (mve_vrndmq_m_f<mode>): Likewise.
     (mve_vrndnq_f<mode>): Likewise.
     (mve_vrndnq_m_f<mode>): Likewise.
     (mve_vrndpq_f<mode>): Likewise.
     (mve_vrndpq_m_f<mode>): Likewise.
     (mve_vrndq_f<mode>): Likewise.
     (mve_vrndq_m_f<mode>): Likewise.
     (mve_vrndxq_f<mode>): Likewise.
     (mve_vrndxq_m_f<mode>): Likewise.
     (mve_vrshlq_<supf><mode>): Likewise.
     (mve_vrshlq_m_<supf><mode>): Likewise.
     (mve_vrshlq_m_n_<supf><mode>): Likewise.
     (mve_vrshlq_n_<supf><mode>): Likewise.
     (mve_vrshrnbq_m_n_<supf><mode>): Likewise.
     (mve_vrshrnbq_n_<supf><mode>): Likewise.
     (mve_vrshrntq_m_n_<supf><mode>): Likewise.
     (mve_vrshrntq_n_<supf><mode>): Likewise.
     (mve_vrshrq_m_n_<supf><mode>): Likewise.
     (mve_vrshrq_n_<supf><mode>): Likewise.
     (mve_vsbciq_<supf>v4si): Likewise.
     (mve_vsbciq_m_<supf>v4si): Likewise.
     (mve_vsbcq_<supf>v4si): Likewise.
     (mve_vsbcq_m_<supf>v4si): Likewise.
     (mve_vshlcq_<supf><mode>): Likewise.
     (mve_vshlcq_m_<supf><mode>): Likewise.
     (mve_vshllbq_m_n_<supf><mode>): Likewise.
     (mve_vshllbq_n_<supf><mode>): Likewise.
     (mve_vshlltq_m_n_<supf><mode>): Likewise.
     (mve_vshlltq_n_<supf><mode>): Likewise.
     (mve_vshlq_<supf><mode>): Likewise.
     (mve_vshlq_<supf><mode>): Likewise.
     (mve_vshlq_m_<supf><mode>): Likewise.
     (mve_vshlq_m_n_<supf><mode>): Likewise.
     (mve_vshlq_m_r_<supf><mode>): Likewise.
     (mve_vshlq_n_<supf><mode>): Likewise.
     (mve_vshlq_r_<supf><mode>): Likewise.
     (mve_vshrnbq_m_n_<supf><mode>): Likewise.
     (mve_vshrnbq_n_<supf><mode>): Likewise.
     (mve_vshrntq_m_n_<supf><mode>): Likewise.
     (mve_vshrntq_n_<supf><mode>): Likewise.
     (mve_vshrq_m_n_<supf><mode>): Likewise.
     (mve_vshrq_n_<supf><mode>): Likewise.
     (mve_vsliq_m_n_<supf><mode>): Likewise.
     (mve_vsliq_n_<supf><mode>): Likewise.
     (mve_vsriq_m_n_<supf><mode>): Likewise.
     (mve_vsriq_n_<supf><mode>): Likewise.
     (mve_vstrbq_<supf><mode>): Likewise.
     (mve_vstrbq_p_<supf><mode>): Likewise.
     (mve_vstrbq_scatter_offset_<supf><mode>_insn): Likewise.
     (mve_vstrbq_scatter_offset_p_<supf><mode>_insn): Likewise.
     (mve_vstrdq_scatter_base_<supf>v2di): Likewise.
     (mve_vstrdq_scatter_base_p_<supf>v2di): Likewise.
     (mve_vstrdq_scatter_base_wb_<supf>v2di): Likewise.
     (mve_vstrdq_scatter_base_wb_p_<supf>v2di): Likewise.
     (mve_vstrdq_scatter_offset_<supf>v2di_insn): Likewise.
     (mve_vstrdq_scatter_offset_p_<supf>v2di_insn): Likewise.
     (mve_vstrdq_scatter_shifted_offset_<supf>v2di_insn): Likewise.
     (mve_vstrdq_scatter_shifted_offset_p_<supf>v2di_insn): Likewise.
     (mve_vstrhq_<supf><mode>): Likewise.
     (mve_vstrhq_fv8hf): Likewise.
     (mve_vstrhq_p_<supf><mode>): Likewise.
     (mve_vstrhq_p_fv8hf): Likewise.
     (mve_vstrhq_scatter_offset_<supf><mode>_insn): Likewise.
     (mve_vstrhq_scatter_offset_fv8hf_insn): Likewise.
     (mve_vstrhq_scatter_offset_p_<supf><mode>_insn): Likewise.
     (mve_vstrhq_scatter_offset_p_fv8hf_insn): Likewise.
     (mve_vstrhq_scatter_shifted_offset_<supf><mode>_insn): Likewise.
     (mve_vstrhq_scatter_shifted_offset_fv8hf_insn): Likewise.
     (mve_vstrhq_scatter_shifted_offset_p_<supf><mode>_insn): Likewise.
     (mve_vstrhq_scatter_shifted_offset_p_fv8hf_insn): Likewise.
     (mve_vstrwq_<supf>v4si): Likewise.
     (mve_vstrwq_fv4sf): Likewise.
     (mve_vstrwq_p_<supf>v4si): Likewise.
     (mve_vstrwq_p_fv4sf): Likewise.
     (mve_vstrwq_scatter_base_<supf>v4si): Likewise.
     (mve_vstrwq_scatter_base_fv4sf): Likewise.
     (mve_vstrwq_scatter_base_p_<supf>v4si): Likewise.
     (mve_vstrwq_scatter_base_p_fv4sf): Likewise.
     (mve_vstrwq_scatter_base_wb_<supf>v4si): Likewise.
     (mve_vstrwq_scatter_base_wb_fv4sf): Likewise.
     (mve_vstrwq_scatter_base_wb_p_<supf>v4si): Likewise.
     (mve_vstrwq_scatter_base_wb_p_fv4sf): Likewise.
     (mve_vstrwq_scatter_offset_<supf>v4si_insn): Likewise.
     (mve_vstrwq_scatter_offset_fv4sf_insn): Likewise.
     (mve_vstrwq_scatter_offset_p_<supf>v4si_insn): Likewise.
     (mve_vstrwq_scatter_offset_p_fv4sf_insn): Likewise.
     (mve_vstrwq_scatter_shifted_offset_<supf>v4si_insn): Likewise.
     (mve_vstrwq_scatter_shifted_offset_fv4sf_insn): Likewise.
     (mve_vstrwq_scatter_shifted_offset_p_<supf>v4si_insn): Likewise.
     (mve_vstrwq_scatter_shifted_offset_p_fv4sf_insn): Likewise.
     (mve_vsubq_<supf><mode>): Likewise.
     (mve_vsubq_f<mode>): Likewise.
     (mve_vsubq_m_<supf><mode>): Likewise.
     (mve_vsubq_m_f<mode>): Likewise.
     (mve_vsubq_m_n_<supf><mode>): Likewise.
     (mve_vsubq_m_n_f<mode>): Likewise.
     (mve_vsubq_n_<supf><mode>): Likewise.
     (mve_vsubq_n_f<mode>): Likewise.


[-- Attachment #2: 1.patch --]
[-- Type: text/x-patch, Size: 128695 bytes --]

commit 7a25d85f91d84e53e707bb36d052f8196e49e147
Author: Stam Markianos-Wright <stam.markianos-wright@arm.com>
Date:   Tue Oct 18 17:42:56 2022 +0100

    arm: Add define_attr to create a mapping between MVE predicated and unpredicated insns
    
    I'd like to submit two patches that add support for Arm's MVE
    Tail Predicated Low Overhead Loop feature.
    
    --- Introduction ---
    
    The M-class Arm-ARM:
    https://developer.arm.com/documentation/ddi0553/bu/?lang=en
    Section B5.5.1 "Loop tail predication" describes the feature
    we are adding support for with this patch (although
    we only add codegen for DLSTP/LETP instruction loops).
    
    Previously with commit d2ed233cb94 we'd added support for
    non-MVE DLS/LE loops through the loop-doloop pass, which, given
    a standard MVE loop like:
    
    ```
    void  __attribute__ ((noinline)) test (int16_t *a, int16_t *b, int16_t *c, int n)
    {
      while (n > 0)
        {
          mve_pred16_t p = vctp16q (n);
          int16x8_t va = vldrhq_z_s16 (a, p);
          int16x8_t vb = vldrhq_z_s16 (b, p);
          int16x8_t vc = vaddq_x_s16 (va, vb, p);
          vstrhq_p_s16 (c, vc, p);
          c+=8;
          a+=8;
          b+=8;
          n-=8;
        }
    }
    ```
    .. would output:
    
    ```
            <pre-calculate the number of iterations and place it into lr>
            dls     lr, lr
    .L3:
            vctp.16 r3
            vmrs    ip, P0  @ movhi
            sxth    ip, ip
            vmsr     P0, ip @ movhi
            mov     r4, r0
            vpst
            vldrht.16       q2, [r4]
            mov     r4, r1
            vmov    q3, q0
            vpst
            vldrht.16       q1, [r4]
            mov     r4, r2
            vpst
            vaddt.i16       q3, q2, q1
            subs    r3, r3, #8
            vpst
            vstrht.16       q3, [r4]
            adds    r0, r0, #16
            adds    r1, r1, #16
            adds    r2, r2, #16
            le      lr, .L3
    ```
    
    where the LE instruction will decrement LR by 1, compare and
    branch if needed.
    
    (there are also other inefficiencies with the above code, like the
    pointless vmrs/sxth/vmsr on the VPR, the adds not being merged into
    the vldrht/vstrht as #16 offsets, and some random movs!
    But those are different problems...)
    
    The MVE version is similar, except that:
    * Instead of DLS/LE the instructions are DLSTP/LETP.
    * Instead of pre-calculating the number of iterations of the
      loop, we place the number of elements to be processed by the
      loop into LR.
    * Instead of decrementing the LR by one, LETP will decrement it
      by FPSCR.LTPSIZE, which is the number of elements being
      processed in each iteration: 16 for 8-bit elements, 8 for 16-bit
      elements, etc.
    * On the final iteration, automatic Loop Tail Predication is
      performed, as if the instructions within the loop had been VPT
      predicated with a VCTP generating the VPR predicate in every
      loop iteration.
    
    The dlstp/letp loop now looks like:
    
    ```
            <place n into r3>
            dlstp.16        lr, r3
    .L14:
            mov     r3, r0
            vldrh.16        q3, [r3]
            mov     r3, r1
            vldrh.16        q2, [r3]
            mov     r3, r2
            vadd.i16  q3, q3, q2
            adds    r0, r0, #16
            vstrh.16        q3, [r3]
            adds    r1, r1, #16
            adds    r2, r2, #16
            letp    lr, .L14
    
    ```
    
    Since the loop tail predication is automatic, we have eliminated
    the VCTP that had been specified by the user in the intrinsic
    and converted the VPT-predicated instructions into their
    unpredicated equivalents (which also saves us from VPST insns).
    
    The LETP instruction here decrements LR by 8 in each iteration.
    
    --- This 1/2 patch ---
    
    This first patch lays some groundwork by adding an attribute to
    md patterns, and then the second patch contains the functional
    changes.
    
    One major difficulty in implementing MVE Tail-Predicated Low
    Overhead Loops was the need to transform VPT-predicated insns
    in the insn chain into their unpredicated equivalents, like:
    `mve_vldrbq_z_<supf><mode> -> mve_vldrbq_<supf><mode>`.
    
    This requires us to have a deterministic link between two
    different patterns in mve.md -- this _could_ be done by
    re-ordering the entirety of mve.md such that the patterns are
    at some constant icode proximity (e.g. having the _z immediately
    after the unpredicated version would mean that to map from the
    former to the latter you could use icode-1), but that is a very
    messy solution that would create fragile, hidden dependencies
    on the ordering of the patterns.
    
    This patch provides an alternative way of doing that: using an insn
    attribute to encode the icode of the unpredicated instruction.
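    
    As an illustration only (this is a sketch of how the 2/2 patch is
    expected to consume the attribute, not code contained in this patch),
    a later pass can recover the unpredicated icode through the
    get_attr_mve_unpredicated_insn accessor that genattrtab generates
    for the new attribute:
    
    ```
    /* Sketch only: look up the unpredicated twin of a (possibly)
       VPT-predicated insn.  Returns 0 (the attribute's default) when
       INSN has no unpredicated mapping, i.e. it is not MVE-predicable.  */
    static int
    mve_unpredicated_icode (rtx_insn *insn)
    {
      /* The attribute can only be read once the insn is recognised.  */
      if (recog_memoized (insn) < 0)
        return 0;
    
      return get_attr_mve_unpredicated_insn (insn);
    }
    ```
    
    The MVE_VPT_PREDICATED_INSN_P / MVE_VPT_UNPREDICATED_INSN_P checks in
    arm.h then reduce to comparing that value against recog_memoized (insn).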
    
    No regressions on arm-none-eabi with an MVE target.
    
    Thank you,
    Stam Markianos-Wright
    
    gcc/ChangeLog:
    
            * config/arm/arm.md (mve_unpredicated_insn): New attribute.
            * config/arm/arm.h (MVE_VPT_PREDICATED_INSN_P): New define.
            (MVE_VPT_UNPREDICATED_INSN_P): Likewise.
            (MVE_VPT_PREDICABLE_INSN_P): Likewise.
            * config/arm/vec-common.md (mve_vshlq_<supf><mode>): Add attribute.
            * config/arm/mve.md (arm_vcx1q<a>_p_v16qi): Add attribute.
            (arm_vcx1q<a>v16qi): Likewise.
            (arm_vcx1qav16qi): Likewise.
            (arm_vcx1qv16qi): Likewise.
            (arm_vcx2q<a>_p_v16qi): Likewise.
            (arm_vcx2q<a>v16qi): Likewise.
            (arm_vcx2qav16qi): Likewise.
            (arm_vcx2qv16qi): Likewise.
            (arm_vcx3q<a>_p_v16qi): Likewise.
            (arm_vcx3q<a>v16qi): Likewise.
            (arm_vcx3qav16qi): Likewise.
            (arm_vcx3qv16qi): Likewise.
            (mve_vabavq_<supf><mode>): Likewise.
            (mve_vabavq_p_<supf><mode>): Likewise.
            (mve_vabdq_<supf><mode>): Likewise.
            (mve_vabdq_f<mode>): Likewise.
            (mve_vabdq_m_<supf><mode>): Likewise.
            (mve_vabdq_m_f<mode>): Likewise.
            (mve_vabsq_f<mode>): Likewise.
            (mve_vabsq_m_f<mode>): Likewise.
            (mve_vabsq_m_s<mode>): Likewise.
            (mve_vabsq_s<mode>): Likewise.
            (mve_vadciq_<supf>v4si): Likewise.
            (mve_vadciq_m_<supf>v4si): Likewise.
            (mve_vadcq_<supf>v4si): Likewise.
            (mve_vadcq_m_<supf>v4si): Likewise.
            (mve_vaddlvaq_<supf>v4si): Likewise.
            (mve_vaddlvaq_p_<supf>v4si): Likewise.
            (mve_vaddlvq_<supf>v4si): Likewise.
            (mve_vaddlvq_p_<supf>v4si): Likewise.
            (mve_vaddq_f<mode>): Likewise.
            (mve_vaddq_m_<supf><mode>): Likewise.
            (mve_vaddq_m_f<mode>): Likewise.
            (mve_vaddq_m_n_<supf><mode>): Likewise.
            (mve_vaddq_m_n_f<mode>): Likewise.
            (mve_vaddq_n_<supf><mode>): Likewise.
            (mve_vaddq_n_f<mode>): Likewise.
            (mve_vaddq<mode>): Likewise.
            (mve_vaddvaq_<supf><mode>): Likewise.
            (mve_vaddvaq_p_<supf><mode>): Likewise.
            (mve_vaddvq_<supf><mode>): Likewise.
            (mve_vaddvq_p_<supf><mode>): Likewise.
            (mve_vandq_<supf><mode>): Likewise.
            (mve_vandq_f<mode>): Likewise.
            (mve_vandq_m_<supf><mode>): Likewise.
            (mve_vandq_m_f<mode>): Likewise.
            (mve_vandq_s<mode>): Likewise.
            (mve_vandq_u<mode>): Likewise.
            (mve_vbicq_<supf><mode>): Likewise.
            (mve_vbicq_f<mode>): Likewise.
            (mve_vbicq_m_<supf><mode>): Likewise.
            (mve_vbicq_m_f<mode>): Likewise.
            (mve_vbicq_m_n_<supf><mode>): Likewise.
            (mve_vbicq_n_<supf><mode>): Likewise.
            (mve_vbicq_s<mode>): Likewise.
            (mve_vbicq_u<mode>): Likewise.
            (mve_vbrsrq_m_n_<supf><mode>): Likewise.
            (mve_vbrsrq_m_n_f<mode>): Likewise.
            (mve_vbrsrq_n_<supf><mode>): Likewise.
            (mve_vbrsrq_n_f<mode>): Likewise.
            (mve_vcaddq_rot270_m_<supf><mode>): Likewise.
            (mve_vcaddq_rot270_m_f<mode>): Likewise.
            (mve_vcaddq_rot270<mode>): Likewise.
            (mve_vcaddq_rot270<mode>): Likewise.
            (mve_vcaddq_rot90_m_<supf><mode>): Likewise.
            (mve_vcaddq_rot90_m_f<mode>): Likewise.
            (mve_vcaddq_rot90<mode>): Likewise.
            (mve_vcaddq_rot90<mode>): Likewise.
            (mve_vcaddq<mve_rot><mode>): Likewise.
            (mve_vcaddq<mve_rot><mode>): Likewise.
            (mve_vclsq_m_s<mode>): Likewise.
            (mve_vclsq_s<mode>): Likewise.
            (mve_vclzq_<supf><mode>): Likewise.
            (mve_vclzq_m_<supf><mode>): Likewise.
            (mve_vclzq_s<mode>): Likewise.
            (mve_vclzq_u<mode>): Likewise.
            (mve_vcmlaq_m_f<mode>): Likewise.
            (mve_vcmlaq_rot180_m_f<mode>): Likewise.
            (mve_vcmlaq_rot180<mode>): Likewise.
            (mve_vcmlaq_rot270_m_f<mode>): Likewise.
            (mve_vcmlaq_rot270<mode>): Likewise.
            (mve_vcmlaq_rot90_m_f<mode>): Likewise.
            (mve_vcmlaq_rot90<mode>): Likewise.
            (mve_vcmlaq<mode>): Likewise.
            (mve_vcmlaq<mve_rot><mode>): Likewise.
            (mve_vcmp<mve_cmp_op>q_<mode>): Likewise.
            (mve_vcmp<mve_cmp_op>q_f<mode>): Likewise.
            (mve_vcmp<mve_cmp_op>q_n_<mode>): Likewise.
            (mve_vcmp<mve_cmp_op>q_n_f<mode>): Likewise.
            (mve_vcmpcsq_<mode>): Likewise.
            (mve_vcmpcsq_m_n_u<mode>): Likewise.
            (mve_vcmpcsq_m_u<mode>): Likewise.
            (mve_vcmpcsq_n_<mode>): Likewise.
            (mve_vcmpeqq_<mode>): Likewise.
            (mve_vcmpeqq_f<mode>): Likewise.
            (mve_vcmpeqq_m_<supf><mode>): Likewise.
            (mve_vcmpeqq_m_f<mode>): Likewise.
            (mve_vcmpeqq_m_n_<supf><mode>): Likewise.
            (mve_vcmpeqq_m_n_f<mode>): Likewise.
            (mve_vcmpeqq_n_<mode>): Likewise.
            (mve_vcmpeqq_n_f<mode>): Likewise.
            (mve_vcmpgeq_<mode>): Likewise.
            (mve_vcmpgeq_f<mode>): Likewise.
            (mve_vcmpgeq_m_f<mode>): Likewise.
            (mve_vcmpgeq_m_n_f<mode>): Likewise.
            (mve_vcmpgeq_m_n_s<mode>): Likewise.
            (mve_vcmpgeq_m_s<mode>): Likewise.
            (mve_vcmpgeq_n_<mode>): Likewise.
            (mve_vcmpgeq_n_f<mode>): Likewise.
            (mve_vcmpgtq_<mode>): Likewise.
            (mve_vcmpgtq_f<mode>): Likewise.
            (mve_vcmpgtq_m_f<mode>): Likewise.
            (mve_vcmpgtq_m_n_f<mode>): Likewise.
            (mve_vcmpgtq_m_n_s<mode>): Likewise.
            (mve_vcmpgtq_m_s<mode>): Likewise.
            (mve_vcmpgtq_n_<mode>): Likewise.
            (mve_vcmpgtq_n_f<mode>): Likewise.
            (mve_vcmphiq_<mode>): Likewise.
            (mve_vcmphiq_m_n_u<mode>): Likewise.
            (mve_vcmphiq_m_u<mode>): Likewise.
            (mve_vcmphiq_n_<mode>): Likewise.
            (mve_vcmpleq_<mode>): Likewise.
            (mve_vcmpleq_f<mode>): Likewise.
            (mve_vcmpleq_m_f<mode>): Likewise.
            (mve_vcmpleq_m_n_f<mode>): Likewise.
            (mve_vcmpleq_m_n_s<mode>): Likewise.
            (mve_vcmpleq_m_s<mode>): Likewise.
            (mve_vcmpleq_n_<mode>): Likewise.
            (mve_vcmpleq_n_f<mode>): Likewise.
            (mve_vcmpltq_<mode>): Likewise.
            (mve_vcmpltq_f<mode>): Likewise.
            (mve_vcmpltq_m_f<mode>): Likewise.
            (mve_vcmpltq_m_n_f<mode>): Likewise.
            (mve_vcmpltq_m_n_s<mode>): Likewise.
            (mve_vcmpltq_m_s<mode>): Likewise.
            (mve_vcmpltq_n_<mode>): Likewise.
            (mve_vcmpltq_n_f<mode>): Likewise.
            (mve_vcmpneq_<mode>): Likewise.
            (mve_vcmpneq_f<mode>): Likewise.
            (mve_vcmpneq_m_<supf><mode>): Likewise.
            (mve_vcmpneq_m_f<mode>): Likewise.
            (mve_vcmpneq_m_n_<supf><mode>): Likewise.
            (mve_vcmpneq_m_n_f<mode>): Likewise.
            (mve_vcmpneq_n_<mode>): Likewise.
            (mve_vcmpneq_n_f<mode>): Likewise.
            (mve_vcmulq_m_f<mode>): Likewise.
            (mve_vcmulq_rot180_m_f<mode>): Likewise.
            (mve_vcmulq_rot180<mode>): Likewise.
            (mve_vcmulq_rot270_m_f<mode>): Likewise.
            (mve_vcmulq_rot270<mode>): Likewise.
            (mve_vcmulq_rot90_m_f<mode>): Likewise.
            (mve_vcmulq_rot90<mode>): Likewise.
            (mve_vcmulq<mode>): Likewise.
            (mve_vcmulq<mve_rot><mode>): Likewise.
            (mve_vctp<mode1>q_mhi): Likewise.
            (mve_vctp<mode1>qhi): Likewise.
            (mve_vcvtaq_<supf><mode>): Likewise.
            (mve_vcvtaq_m_<supf><mode>): Likewise.
            (mve_vcvtbq_f16_f32v8hf): Likewise.
            (mve_vcvtbq_f32_f16v4sf): Likewise.
            (mve_vcvtbq_m_f16_f32v8hf): Likewise.
            (mve_vcvtbq_m_f32_f16v4sf): Likewise.
            (mve_vcvtmq_<supf><mode>): Likewise.
            (mve_vcvtmq_m_<supf><mode>): Likewise.
            (mve_vcvtnq_<supf><mode>): Likewise.
            (mve_vcvtnq_m_<supf><mode>): Likewise.
            (mve_vcvtpq_<supf><mode>): Likewise.
            (mve_vcvtpq_m_<supf><mode>): Likewise.
            (mve_vcvtq_from_f_<supf><mode>): Likewise.
            (mve_vcvtq_m_from_f_<supf><mode>): Likewise.
            (mve_vcvtq_m_n_from_f_<supf><mode>): Likewise.
            (mve_vcvtq_m_n_to_f_<supf><mode>): Likewise.
            (mve_vcvtq_m_to_f_<supf><mode>): Likewise.
            (mve_vcvtq_n_from_f_<supf><mode>): Likewise.
            (mve_vcvtq_n_to_f_<supf><mode>): Likewise.
            (mve_vcvtq_to_f_<supf><mode>): Likewise.
            (mve_vcvttq_f16_f32v8hf): Likewise.
            (mve_vcvttq_f32_f16v4sf): Likewise.
            (mve_vcvttq_m_f16_f32v8hf): Likewise.
            (mve_vcvttq_m_f32_f16v4sf): Likewise.
            (mve_vddupq_m_wb_u<mode>_insn): Likewise.
            (mve_vddupq_u<mode>_insn): Likewise.
            (mve_vdupq_m_n_<supf><mode>): Likewise.
            (mve_vdupq_m_n_f<mode>): Likewise.
            (mve_vdupq_n_<supf><mode>): Likewise.
            (mve_vdupq_n_f<mode>): Likewise.
            (mve_vdwdupq_m_wb_u<mode>_insn): Likewise.
            (mve_vdwdupq_wb_u<mode>_insn): Likewise.
            (mve_veorq_<supf><mode>): Likewise.
            (mve_veorq_f<mode>): Likewise.
            (mve_veorq_m_<supf><mode>): Likewise.
            (mve_veorq_m_f<mode>): Likewise.
            (mve_veorq_s<mode>): Likewise.
            (mve_veorq_u<mode>): Likewise.
            (mve_vfmaq_f<mode>): Likewise.
            (mve_vfmaq_m_f<mode>): Likewise.
            (mve_vfmaq_m_n_f<mode>): Likewise.
            (mve_vfmaq_n_f<mode>): Likewise.
            (mve_vfmasq_m_n_f<mode>): Likewise.
            (mve_vfmasq_n_f<mode>): Likewise.
            (mve_vfmsq_f<mode>): Likewise.
            (mve_vfmsq_m_f<mode>): Likewise.
            (mve_vhaddq_<supf><mode>): Likewise.
            (mve_vhaddq_m_<supf><mode>): Likewise.
            (mve_vhaddq_m_n_<supf><mode>): Likewise.
            (mve_vhaddq_n_<supf><mode>): Likewise.
            (mve_vhcaddq_rot270_m_s<mode>): Likewise.
            (mve_vhcaddq_rot270_s<mode>): Likewise.
            (mve_vhcaddq_rot90_m_s<mode>): Likewise.
            (mve_vhcaddq_rot90_s<mode>): Likewise.
            (mve_vhsubq_<supf><mode>): Likewise.
            (mve_vhsubq_m_<supf><mode>): Likewise.
            (mve_vhsubq_m_n_<supf><mode>): Likewise.
            (mve_vhsubq_n_<supf><mode>): Likewise.
            (mve_vidupq_m_wb_u<mode>_insn): Likewise.
            (mve_vidupq_u<mode>_insn): Likewise.
            (mve_viwdupq_m_wb_u<mode>_insn): Likewise.
            (mve_viwdupq_wb_u<mode>_insn): Likewise.
            (mve_vldrbq_<supf><mode>): Likewise.
            (mve_vldrbq_gather_offset_<supf><mode>): Likewise.
            (mve_vldrbq_gather_offset_z_<supf><mode>): Likewise.
            (mve_vldrbq_z_<supf><mode>): Likewise.
            (mve_vldrdq_gather_base_<supf>v2di): Likewise.
            (mve_vldrdq_gather_base_wb_<supf>v2di_insn): Likewise.
            (mve_vldrdq_gather_base_wb_z_<supf>v2di_insn): Likewise.
            (mve_vldrdq_gather_base_z_<supf>v2di): Likewise.
            (mve_vldrdq_gather_offset_<supf>v2di): Likewise.
            (mve_vldrdq_gather_offset_z_<supf>v2di): Likewise.
            (mve_vldrdq_gather_shifted_offset_<supf>v2di): Likewise.
            (mve_vldrdq_gather_shifted_offset_z_<supf>v2di): Likewise.
            (mve_vldrhq_<supf><mode>): Likewise.
            (mve_vldrhq_fv8hf): Likewise.
            (mve_vldrhq_gather_offset_<supf><mode>): Likewise.
            (mve_vldrhq_gather_offset_fv8hf): Likewise.
            (mve_vldrhq_gather_offset_z_<supf><mode>): Likewise.
            (mve_vldrhq_gather_offset_z_fv8hf): Likewise.
            (mve_vldrhq_gather_shifted_offset_<supf><mode>): Likewise.
            (mve_vldrhq_gather_shifted_offset_fv8hf): Likewise.
            (mve_vldrhq_gather_shifted_offset_z_<supf><mode>): Likewise.
            (mve_vldrhq_gather_shifted_offset_z_fv8hf): Likewise.
            (mve_vldrhq_z_<supf><mode>): Likewise.
            (mve_vldrhq_z_fv8hf): Likewise.
            (mve_vldrwq_<supf>v4si): Likewise.
            (mve_vldrwq_fv4sf): Likewise.
            (mve_vldrwq_gather_base_<supf>v4si): Likewise.
            (mve_vldrwq_gather_base_fv4sf): Likewise.
            (mve_vldrwq_gather_base_wb_<supf>v4si_insn): Likewise.
            (mve_vldrwq_gather_base_wb_fv4sf_insn): Likewise.
            (mve_vldrwq_gather_base_wb_z_<supf>v4si_insn): Likewise.
            (mve_vldrwq_gather_base_wb_z_fv4sf_insn): Likewise.
            (mve_vldrwq_gather_base_z_<supf>v4si): Likewise.
            (mve_vldrwq_gather_base_z_fv4sf): Likewise.
            (mve_vldrwq_gather_offset_<supf>v4si): Likewise.
            (mve_vldrwq_gather_offset_fv4sf): Likewise.
            (mve_vldrwq_gather_offset_z_<supf>v4si): Likewise.
            (mve_vldrwq_gather_offset_z_fv4sf): Likewise.
            (mve_vldrwq_gather_shifted_offset_<supf>v4si): Likewise.
            (mve_vldrwq_gather_shifted_offset_fv4sf): Likewise.
            (mve_vldrwq_gather_shifted_offset_z_<supf>v4si): Likewise.
            (mve_vldrwq_gather_shifted_offset_z_fv4sf): Likewise.
            (mve_vldrwq_z_<supf>v4si): Likewise.
            (mve_vldrwq_z_fv4sf): Likewise.
            (mve_vmaxaq_m_s<mode>): Likewise.
            (mve_vmaxaq_s<mode>): Likewise.
            (mve_vmaxavq_p_s<mode>): Likewise.
            (mve_vmaxavq_s<mode>): Likewise.
            (mve_vmaxnmaq_f<mode>): Likewise.
            (mve_vmaxnmaq_m_f<mode>): Likewise.
            (mve_vmaxnmavq_f<mode>): Likewise.
            (mve_vmaxnmavq_p_f<mode>): Likewise.
            (mve_vmaxnmq_f<mode>): Likewise.
            (mve_vmaxnmq_m_f<mode>): Likewise.
            (mve_vmaxnmvq_f<mode>): Likewise.
            (mve_vmaxnmvq_p_f<mode>): Likewise.
            (mve_vmaxq_<supf><mode>): Likewise.
            (mve_vmaxq_m_<supf><mode>): Likewise.
            (mve_vmaxq_s<mode>): Likewise.
            (mve_vmaxq_u<mode>): Likewise.
            (mve_vmaxvq_<supf><mode>): Likewise.
            (mve_vmaxvq_p_<supf><mode>): Likewise.
            (mve_vminaq_m_s<mode>): Likewise.
            (mve_vminaq_s<mode>): Likewise.
            (mve_vminavq_p_s<mode>): Likewise.
            (mve_vminavq_s<mode>): Likewise.
            (mve_vminnmaq_f<mode>): Likewise.
            (mve_vminnmaq_m_f<mode>): Likewise.
            (mve_vminnmavq_f<mode>): Likewise.
            (mve_vminnmavq_p_f<mode>): Likewise.
            (mve_vminnmq_f<mode>): Likewise.
            (mve_vminnmq_m_f<mode>): Likewise.
            (mve_vminnmvq_f<mode>): Likewise.
            (mve_vminnmvq_p_f<mode>): Likewise.
            (mve_vminq_<supf><mode>): Likewise.
            (mve_vminq_m_<supf><mode>): Likewise.
            (mve_vminq_s<mode>): Likewise.
            (mve_vminq_u<mode>): Likewise.
            (mve_vminvq_<supf><mode>): Likewise.
            (mve_vminvq_p_<supf><mode>): Likewise.
            (mve_vmladavaq_<supf><mode>): Likewise.
            (mve_vmladavaq_p_<supf><mode>): Likewise.
            (mve_vmladavaxq_p_s<mode>): Likewise.
            (mve_vmladavaxq_s<mode>): Likewise.
            (mve_vmladavq_<supf><mode>): Likewise.
            (mve_vmladavq_p_<supf><mode>): Likewise.
            (mve_vmladavxq_p_s<mode>): Likewise.
            (mve_vmladavxq_s<mode>): Likewise.
            (mve_vmlaldavaq_<supf><mode>): Likewise.
            (mve_vmlaldavaq_p_<supf><mode>): Likewise.
            (mve_vmlaldavaxq_<supf><mode>): Likewise.
            (mve_vmlaldavaxq_p_<supf><mode>): Likewise.
            (mve_vmlaldavaxq_s<mode>): Likewise.
            (mve_vmlaldavq_<supf><mode>): Likewise.
            (mve_vmlaldavq_p_<supf><mode>): Likewise.
            (mve_vmlaldavxq_p_s<mode>): Likewise.
            (mve_vmlaldavxq_s<mode>): Likewise.
            (mve_vmlaq_m_n_<supf><mode>): Likewise.
            (mve_vmlaq_n_<supf><mode>): Likewise.
            (mve_vmlasq_m_n_<supf><mode>): Likewise.
            (mve_vmlasq_n_<supf><mode>): Likewise.
            (mve_vmlsdavaq_p_s<mode>): Likewise.
            (mve_vmlsdavaq_s<mode>): Likewise.
            (mve_vmlsdavaxq_p_s<mode>): Likewise.
            (mve_vmlsdavaxq_s<mode>): Likewise.
            (mve_vmlsdavq_p_s<mode>): Likewise.
            (mve_vmlsdavq_s<mode>): Likewise.
            (mve_vmlsdavxq_p_s<mode>): Likewise.
            (mve_vmlsdavxq_s<mode>): Likewise.
            (mve_vmlsldavaq_p_s<mode>): Likewise.
            (mve_vmlsldavaq_s<mode>): Likewise.
            (mve_vmlsldavaxq_p_s<mode>): Likewise.
            (mve_vmlsldavaxq_s<mode>): Likewise.
            (mve_vmlsldavq_p_s<mode>): Likewise.
            (mve_vmlsldavq_s<mode>): Likewise.
            (mve_vmlsldavxq_p_s<mode>): Likewise.
            (mve_vmlsldavxq_s<mode>): Likewise.
            (mve_vmovlbq_<supf><mode>): Likewise.
            (mve_vmovlbq_m_<supf><mode>): Likewise.
            (mve_vmovltq_<supf><mode>): Likewise.
            (mve_vmovltq_m_<supf><mode>): Likewise.
            (mve_vmovnbq_<supf><mode>): Likewise.
            (mve_vmovnbq_m_<supf><mode>): Likewise.
            (mve_vmovntq_<supf><mode>): Likewise.
            (mve_vmovntq_m_<supf><mode>): Likewise.
            (mve_vmulhq_<supf><mode>): Likewise.
            (mve_vmulhq_m_<supf><mode>): Likewise.
            (mve_vmullbq_int_<supf><mode>): Likewise.
            (mve_vmullbq_int_m_<supf><mode>): Likewise.
            (mve_vmullbq_poly_m_p<mode>): Likewise.
            (mve_vmullbq_poly_p<mode>): Likewise.
            (mve_vmulltq_int_<supf><mode>): Likewise.
            (mve_vmulltq_int_m_<supf><mode>): Likewise.
            (mve_vmulltq_poly_m_p<mode>): Likewise.
            (mve_vmulltq_poly_p<mode>): Likewise.
            (mve_vmulq_<supf><mode>): Likewise.
            (mve_vmulq_f<mode>): Likewise.
            (mve_vmulq_m_<supf><mode>): Likewise.
            (mve_vmulq_m_f<mode>): Likewise.
            (mve_vmulq_m_n_<supf><mode>): Likewise.
            (mve_vmulq_m_n_f<mode>): Likewise.
            (mve_vmulq_n_<supf><mode>): Likewise.
            (mve_vmulq_n_f<mode>): Likewise.
            (mve_vmvnq_<supf><mode>): Likewise.
            (mve_vmvnq_m_<supf><mode>): Likewise.
            (mve_vmvnq_m_n_<supf><mode>): Likewise.
            (mve_vmvnq_n_<supf><mode>): Likewise.
            (mve_vmvnq_s<mode>): Likewise.
            (mve_vmvnq_u<mode>): Likewise.
            (mve_vnegq_f<mode>): Likewise.
            (mve_vnegq_m_f<mode>): Likewise.
            (mve_vnegq_m_s<mode>): Likewise.
            (mve_vnegq_s<mode>): Likewise.
            (mve_vornq_<supf><mode>): Likewise.
            (mve_vornq_f<mode>): Likewise.
            (mve_vornq_m_<supf><mode>): Likewise.
            (mve_vornq_m_f<mode>): Likewise.
            (mve_vornq_s<mode>): Likewise.
            (mve_vornq_u<mode>): Likewise.
            (mve_vorrq_<supf><mode>): Likewise.
            (mve_vorrq_f<mode>): Likewise.
            (mve_vorrq_m_<supf><mode>): Likewise.
            (mve_vorrq_m_f<mode>): Likewise.
            (mve_vorrq_m_n_<supf><mode>): Likewise.
            (mve_vorrq_n_<supf><mode>): Likewise.
            (mve_vorrq_s<mode>): Likewise.
            (mve_vorrq_s<mode>): Likewise.
            (mve_vqabsq_m_s<mode>): Likewise.
            (mve_vqabsq_s<mode>): Likewise.
            (mve_vqaddq_<supf><mode>): Likewise.
            (mve_vqaddq_m_<supf><mode>): Likewise.
            (mve_vqaddq_m_n_<supf><mode>): Likewise.
            (mve_vqaddq_n_<supf><mode>): Likewise.
            (mve_vqdmladhq_m_s<mode>): Likewise.
            (mve_vqdmladhq_s<mode>): Likewise.
            (mve_vqdmladhxq_m_s<mode>): Likewise.
            (mve_vqdmladhxq_s<mode>): Likewise.
            (mve_vqdmlahq_m_n_s<mode>): Likewise.
            (mve_vqdmlahq_n_<supf><mode>): Likewise.
            (mve_vqdmlahq_n_s<mode>): Likewise.
            (mve_vqdmlashq_m_n_s<mode>): Likewise.
            (mve_vqdmlashq_n_<supf><mode>): Likewise.
            (mve_vqdmlashq_n_s<mode>): Likewise.
            (mve_vqdmlsdhq_m_s<mode>): Likewise.
            (mve_vqdmlsdhq_s<mode>): Likewise.
            (mve_vqdmlsdhxq_m_s<mode>): Likewise.
            (mve_vqdmlsdhxq_s<mode>): Likewise.
            (mve_vqdmulhq_m_n_s<mode>): Likewise.
            (mve_vqdmulhq_m_s<mode>): Likewise.
            (mve_vqdmulhq_n_s<mode>): Likewise.
            (mve_vqdmulhq_s<mode>): Likewise.
            (mve_vqdmullbq_m_n_s<mode>): Likewise.
            (mve_vqdmullbq_m_s<mode>): Likewise.
            (mve_vqdmullbq_n_s<mode>): Likewise.
            (mve_vqdmullbq_s<mode>): Likewise.
            (mve_vqdmulltq_m_n_s<mode>): Likewise.
            (mve_vqdmulltq_m_s<mode>): Likewise.
            (mve_vqdmulltq_n_s<mode>): Likewise.
            (mve_vqdmulltq_s<mode>): Likewise.
            (mve_vqmovnbq_<supf><mode>): Likewise.
            (mve_vqmovnbq_m_<supf><mode>): Likewise.
            (mve_vqmovntq_<supf><mode>): Likewise.
            (mve_vqmovntq_m_<supf><mode>): Likewise.
            (mve_vqmovunbq_m_s<mode>): Likewise.
            (mve_vqmovunbq_s<mode>): Likewise.
            (mve_vqmovuntq_m_s<mode>): Likewise.
            (mve_vqmovuntq_s<mode>): Likewise.
            (mve_vqnegq_m_s<mode>): Likewise.
            (mve_vqnegq_s<mode>): Likewise.
            (mve_vqrdmladhq_m_s<mode>): Likewise.
            (mve_vqrdmladhq_s<mode>): Likewise.
            (mve_vqrdmladhxq_m_s<mode>): Likewise.
            (mve_vqrdmladhxq_s<mode>): Likewise.
            (mve_vqrdmlahq_m_n_s<mode>): Likewise.
            (mve_vqrdmlahq_n_<supf><mode>): Likewise.
            (mve_vqrdmlahq_n_s<mode>): Likewise.
            (mve_vqrdmlashq_m_n_s<mode>): Likewise.
            (mve_vqrdmlashq_n_<supf><mode>): Likewise.
            (mve_vqrdmlashq_n_s<mode>): Likewise.
            (mve_vqrdmlsdhq_m_s<mode>): Likewise.
            (mve_vqrdmlsdhq_s<mode>): Likewise.
            (mve_vqrdmlsdhxq_m_s<mode>): Likewise.
            (mve_vqrdmlsdhxq_s<mode>): Likewise.
            (mve_vqrdmulhq_m_n_s<mode>): Likewise.
            (mve_vqrdmulhq_m_s<mode>): Likewise.
            (mve_vqrdmulhq_n_s<mode>): Likewise.
            (mve_vqrdmulhq_s<mode>): Likewise.
            (mve_vqrshlq_<supf><mode>): Likewise.
            (mve_vqrshlq_m_<supf><mode>): Likewise.
            (mve_vqrshlq_m_n_<supf><mode>): Likewise.
            (mve_vqrshlq_n_<supf><mode>): Likewise.
            (mve_vqrshrnbq_m_n_<supf><mode>): Likewise.
            (mve_vqrshrnbq_n_<supf><mode>): Likewise.
            (mve_vqrshrntq_m_n_<supf><mode>): Likewise.
            (mve_vqrshrntq_n_<supf><mode>): Likewise.
            (mve_vqrshrunbq_m_n_s<mode>): Likewise.
            (mve_vqrshrunbq_n_s<mode>): Likewise.
            (mve_vqrshruntq_m_n_s<mode>): Likewise.
            (mve_vqrshruntq_n_s<mode>): Likewise.
            (mve_vqshlq_<supf><mode>): Likewise.
            (mve_vqshlq_m_<supf><mode>): Likewise.
            (mve_vqshlq_m_n_<supf><mode>): Likewise.
            (mve_vqshlq_m_r_<supf><mode>): Likewise.
            (mve_vqshlq_n_<supf><mode>): Likewise.
            (mve_vqshlq_r_<supf><mode>): Likewise.
            (mve_vqshluq_m_n_s<mode>): Likewise.
            (mve_vqshluq_n_s<mode>): Likewise.
            (mve_vqshrnbq_m_n_<supf><mode>): Likewise.
            (mve_vqshrnbq_n_<supf><mode>): Likewise.
            (mve_vqshrntq_m_n_<supf><mode>): Likewise.
            (mve_vqshrntq_n_<supf><mode>): Likewise.
            (mve_vqshrunbq_m_n_s<mode>): Likewise.
            (mve_vqshrunbq_n_s<mode>): Likewise.
            (mve_vqshruntq_m_n_s<mode>): Likewise.
            (mve_vqshruntq_n_s<mode>): Likewise.
            (mve_vqsubq_<supf><mode>): Likewise.
            (mve_vqsubq_m_<supf><mode>): Likewise.
            (mve_vqsubq_m_n_<supf><mode>): Likewise.
            (mve_vqsubq_n_<supf><mode>): Likewise.
            (mve_vrev16q_<supf>v16qi): Likewise.
            (mve_vrev16q_m_<supf>v16qi): Likewise.
            (mve_vrev32q_<supf><mode>): Likewise.
            (mve_vrev32q_fv8hf): Likewise.
            (mve_vrev32q_m_<supf><mode>): Likewise.
            (mve_vrev32q_m_fv8hf): Likewise.
            (mve_vrev64q_<supf><mode>): Likewise.
            (mve_vrev64q_f<mode>): Likewise.
            (mve_vrev64q_m_<supf><mode>): Likewise.
            (mve_vrev64q_m_f<mode>): Likewise.
            (mve_vrhaddq_<supf><mode>): Likewise.
            (mve_vrhaddq_m_<supf><mode>): Likewise.
            (mve_vrmlaldavhaq_<supf>v4si): Likewise.
            (mve_vrmlaldavhaq_p_sv4si): Likewise.
            (mve_vrmlaldavhaq_p_uv4si): Likewise.
            (mve_vrmlaldavhaq_sv4si): Likewise.
            (mve_vrmlaldavhaq_uv4si): Likewise.
            (mve_vrmlaldavhaxq_p_sv4si): Likewise.
            (mve_vrmlaldavhaxq_sv4si): Likewise.
            (mve_vrmlaldavhq_<supf>v4si): Likewise.
            (mve_vrmlaldavhq_p_<supf>v4si): Likewise.
            (mve_vrmlaldavhxq_p_sv4si): Likewise.
            (mve_vrmlaldavhxq_sv4si): Likewise.
            (mve_vrmlsldavhaq_p_sv4si): Likewise.
            (mve_vrmlsldavhaq_sv4si): Likewise.
            (mve_vrmlsldavhaxq_p_sv4si): Likewise.
            (mve_vrmlsldavhaxq_sv4si): Likewise.
            (mve_vrmlsldavhq_p_sv4si): Likewise.
            (mve_vrmlsldavhq_sv4si): Likewise.
            (mve_vrmlsldavhxq_p_sv4si): Likewise.
            (mve_vrmlsldavhxq_sv4si): Likewise.
            (mve_vrmulhq_<supf><mode>): Likewise.
            (mve_vrmulhq_m_<supf><mode>): Likewise.
            (mve_vrndaq_f<mode>): Likewise.
            (mve_vrndaq_m_f<mode>): Likewise.
            (mve_vrndmq_f<mode>): Likewise.
            (mve_vrndmq_m_f<mode>): Likewise.
            (mve_vrndnq_f<mode>): Likewise.
            (mve_vrndnq_m_f<mode>): Likewise.
            (mve_vrndpq_f<mode>): Likewise.
            (mve_vrndpq_m_f<mode>): Likewise.
            (mve_vrndq_f<mode>): Likewise.
            (mve_vrndq_m_f<mode>): Likewise.
            (mve_vrndxq_f<mode>): Likewise.
            (mve_vrndxq_m_f<mode>): Likewise.
            (mve_vrshlq_<supf><mode>): Likewise.
            (mve_vrshlq_m_<supf><mode>): Likewise.
            (mve_vrshlq_m_n_<supf><mode>): Likewise.
            (mve_vrshlq_n_<supf><mode>): Likewise.
            (mve_vrshrnbq_m_n_<supf><mode>): Likewise.
            (mve_vrshrnbq_n_<supf><mode>): Likewise.
            (mve_vrshrntq_m_n_<supf><mode>): Likewise.
            (mve_vrshrntq_n_<supf><mode>): Likewise.
            (mve_vrshrq_m_n_<supf><mode>): Likewise.
            (mve_vrshrq_n_<supf><mode>): Likewise.
            (mve_vsbciq_<supf>v4si): Likewise.
            (mve_vsbciq_m_<supf>v4si): Likewise.
            (mve_vsbcq_<supf>v4si): Likewise.
            (mve_vsbcq_m_<supf>v4si): Likewise.
            (mve_vshlcq_<supf><mode>): Likewise.
            (mve_vshlcq_m_<supf><mode>): Likewise.
            (mve_vshllbq_m_n_<supf><mode>): Likewise.
            (mve_vshllbq_n_<supf><mode>): Likewise.
            (mve_vshlltq_m_n_<supf><mode>): Likewise.
            (mve_vshlltq_n_<supf><mode>): Likewise.
            (mve_vshlq_<supf><mode>): Likewise.
            (mve_vshlq_<supf><mode>): Likewise.
            (mve_vshlq_m_<supf><mode>): Likewise.
            (mve_vshlq_m_n_<supf><mode>): Likewise.
            (mve_vshlq_m_r_<supf><mode>): Likewise.
            (mve_vshlq_n_<supf><mode>): Likewise.
            (mve_vshlq_r_<supf><mode>): Likewise.
            (mve_vshrnbq_m_n_<supf><mode>): Likewise.
            (mve_vshrnbq_n_<supf><mode>): Likewise.
            (mve_vshrntq_m_n_<supf><mode>): Likewise.
            (mve_vshrntq_n_<supf><mode>): Likewise.
            (mve_vshrq_m_n_<supf><mode>): Likewise.
            (mve_vshrq_n_<supf><mode>): Likewise.
            (mve_vsliq_m_n_<supf><mode>): Likewise.
            (mve_vsliq_n_<supf><mode>): Likewise.
            (mve_vsriq_m_n_<supf><mode>): Likewise.
            (mve_vsriq_n_<supf><mode>): Likewise.
            (mve_vstrbq_<supf><mode>): Likewise.
            (mve_vstrbq_p_<supf><mode>): Likewise.
            (mve_vstrbq_scatter_offset_<supf><mode>_insn): Likewise.
            (mve_vstrbq_scatter_offset_p_<supf><mode>_insn): Likewise.
            (mve_vstrdq_scatter_base_<supf>v2di): Likewise.
            (mve_vstrdq_scatter_base_p_<supf>v2di): Likewise.
            (mve_vstrdq_scatter_base_wb_<supf>v2di): Likewise.
            (mve_vstrdq_scatter_base_wb_p_<supf>v2di): Likewise.
            (mve_vstrdq_scatter_offset_<supf>v2di_insn): Likewise.
            (mve_vstrdq_scatter_offset_p_<supf>v2di_insn): Likewise.
            (mve_vstrdq_scatter_shifted_offset_<supf>v2di_insn): Likewise.
            (mve_vstrdq_scatter_shifted_offset_p_<supf>v2di_insn): Likewise.
            (mve_vstrhq_<supf><mode>): Likewise.
            (mve_vstrhq_fv8hf): Likewise.
            (mve_vstrhq_p_<supf><mode>): Likewise.
            (mve_vstrhq_p_fv8hf): Likewise.
            (mve_vstrhq_scatter_offset_<supf><mode>_insn): Likewise.
            (mve_vstrhq_scatter_offset_fv8hf_insn): Likewise.
            (mve_vstrhq_scatter_offset_p_<supf><mode>_insn): Likewise.
            (mve_vstrhq_scatter_offset_p_fv8hf_insn): Likewise.
            (mve_vstrhq_scatter_shifted_offset_<supf><mode>_insn): Likewise.
            (mve_vstrhq_scatter_shifted_offset_fv8hf_insn): Likewise.
            (mve_vstrhq_scatter_shifted_offset_p_<supf><mode>_insn): Likewise.
            (mve_vstrhq_scatter_shifted_offset_p_fv8hf_insn): Likewise.
            (mve_vstrwq_<supf>v4si): Likewise.
            (mve_vstrwq_fv4sf): Likewise.
            (mve_vstrwq_p_<supf>v4si): Likewise.
            (mve_vstrwq_p_fv4sf): Likewise.
            (mve_vstrwq_scatter_base_<supf>v4si): Likewise.
            (mve_vstrwq_scatter_base_fv4sf): Likewise.
            (mve_vstrwq_scatter_base_p_<supf>v4si): Likewise.
            (mve_vstrwq_scatter_base_p_fv4sf): Likewise.
            (mve_vstrwq_scatter_base_wb_<supf>v4si): Likewise.
            (mve_vstrwq_scatter_base_wb_fv4sf): Likewise.
            (mve_vstrwq_scatter_base_wb_p_<supf>v4si): Likewise.
            (mve_vstrwq_scatter_base_wb_p_fv4sf): Likewise.
            (mve_vstrwq_scatter_offset_<supf>v4si_insn): Likewise.
            (mve_vstrwq_scatter_offset_fv4sf_insn): Likewise.
            (mve_vstrwq_scatter_offset_p_<supf>v4si_insn): Likewise.
            (mve_vstrwq_scatter_offset_p_fv4sf_insn): Likewise.
            (mve_vstrwq_scatter_shifted_offset_<supf>v4si_insn): Likewise.
            (mve_vstrwq_scatter_shifted_offset_fv4sf_insn): Likewise.
            (mve_vstrwq_scatter_shifted_offset_p_<supf>v4si_insn): Likewise.
            (mve_vstrwq_scatter_shifted_offset_p_fv4sf_insn): Likewise.
            (mve_vsubq_<supf><mode>): Likewise.
            (mve_vsubq_f<mode>): Likewise.
            (mve_vsubq_m_<supf><mode>): Likewise.
            (mve_vsubq_m_f<mode>): Likewise.
            (mve_vsubq_m_n_<supf><mode>): Likewise.
            (mve_vsubq_m_n_f<mode>): Likewise.
            (mve_vsubq_n_<supf><mode>): Likewise.
            (mve_vsubq_n_f<mode>): Likewise.

diff --git a/gcc/config/arm/arm.h b/gcc/config/arm/arm.h
index 4f54530adcb..f06e5c2cda4 100644
--- a/gcc/config/arm/arm.h
+++ b/gcc/config/arm/arm.h
@@ -2358,6 +2358,21 @@ extern int making_const_table;
   else if (TARGET_THUMB1)				\
     thumb1_final_prescan_insn (INSN)
 
+/* These defines are useful to refer to the value of the mve_unpredicated_insn
+   insn attribute.  Note that, because these use the get_attr_* function, these
+   will change recog_data if (INSN) isn't current_insn.  */
+#define MVE_VPT_PREDICABLE_INSN_P(INSN)					\
+  (recog_memoized (INSN) >= 0						\
+  && get_attr_mve_unpredicated_insn (INSN) != 0)			\
+
+#define MVE_VPT_PREDICATED_INSN_P(INSN)					\
+  (MVE_VPT_PREDICABLE_INSN_P (INSN)					\
+   && recog_memoized (INSN) != get_attr_mve_unpredicated_insn (INSN))	\
+
+#define MVE_VPT_UNPREDICATED_INSN_P(INSN)				\
+  (MVE_VPT_PREDICABLE_INSN_P (INSN)					\
+   && recog_memoized (INSN) == get_attr_mve_unpredicated_insn (INSN))	\
+
 #define ARM_SIGN_EXTEND(x)  ((HOST_WIDE_INT)			\
   (HOST_BITS_PER_WIDE_INT <= 32 ? (unsigned HOST_WIDE_INT) (x)	\
    : ((((unsigned HOST_WIDE_INT)(x)) & (unsigned HOST_WIDE_INT) 0xffffffff) |\
diff --git a/gcc/config/arm/arm.md b/gcc/config/arm/arm.md
index 2ac97232ffd..ee931ad6ebd 100644
--- a/gcc/config/arm/arm.md
+++ b/gcc/config/arm/arm.md
@@ -124,6 +124,8 @@
 ; and not all ARM insns do.
 (define_attr "predicated" "yes,no" (const_string "no"))
 
+(define_attr "mve_unpredicated_insn" "" (const_int 0))
+
 ; LENGTH of an instruction (in bytes)
 (define_attr "length" ""
   (const_int 4))
diff --git a/gcc/config/arm/iterators.md b/gcc/config/arm/iterators.md
index 2edd0b06370..71e43539616 100644
--- a/gcc/config/arm/iterators.md
+++ b/gcc/config/arm/iterators.md
@@ -2296,6 +2296,7 @@
 
 (define_int_attr mmla_sfx [(UNSPEC_MATMUL_S "s8") (UNSPEC_MATMUL_U "u8")
 			   (UNSPEC_MATMUL_US "s8")])
+
 ;;MVE int attribute.
 (define_int_attr supf [(VCVTQ_TO_F_S "s") (VCVTQ_TO_F_U "u") (VREV16Q_S "s")
 		       (VREV16Q_U "u") (VMVNQ_N_S "s") (VMVNQ_N_U "u")
diff --git a/gcc/config/arm/mve.md b/gcc/config/arm/mve.md
index 6e4b143affa..87cbf6c1726 100644
--- a/gcc/config/arm/mve.md
+++ b/gcc/config/arm/mve.md
@@ -17,7 +17,7 @@
 ;; along with GCC; see the file COPYING3.  If not see
 ;; <http://www.gnu.org/licenses/>.
 
-(define_insn "*mve_mov<mode>"
+(define_insn "mve_mov<mode>"
   [(set (match_operand:MVE_types 0 "nonimmediate_operand" "=w,w,r,w   , w,   r,Ux,w")
 	(match_operand:MVE_types 1 "general_operand"      " w,r,w,DnDm,UxUi,r,w, Ul"))]
   "TARGET_HAVE_MVE || TARGET_HAVE_MVE_FLOAT"
@@ -81,18 +81,27 @@
       return "";
     }
 }
-  [(set_attr "type" "mve_move,mve_move,mve_move,mve_move,mve_load,multiple,mve_store,mve_load")
+   [(set_attr_alternative "mve_unpredicated_insn" [(symbol_ref "CODE_FOR_mve_mov<mode>")
+						   (symbol_ref "CODE_FOR_nothing")
+						   (symbol_ref "CODE_FOR_nothing")
+						   (symbol_ref "CODE_FOR_mve_mov<mode>")
+						   (symbol_ref "CODE_FOR_mve_mov<mode>")
+						   (symbol_ref "CODE_FOR_nothing")
+						   (symbol_ref "CODE_FOR_mve_mov<mode>")
+						   (symbol_ref "CODE_FOR_nothing")])
+   (set_attr "type" "mve_move,mve_move,mve_move,mve_move,mve_load,multiple,mve_store,mve_load")
    (set_attr "length" "4,8,8,4,4,8,4,8")
    (set_attr "thumb2_pool_range" "*,*,*,*,1018,*,*,*")
    (set_attr "neg_pool_range" "*,*,*,*,996,*,*,*")])
 
-(define_insn "*mve_vdup<mode>"
+(define_insn "mve_vdup<mode>"
   [(set (match_operand:MVE_vecs 0 "s_register_operand" "=w")
 	(vec_duplicate:MVE_vecs
 	  (match_operand:<V_elem> 1 "s_register_operand" "r")))]
   "TARGET_HAVE_MVE || TARGET_HAVE_MVE_FLOAT"
   "vdup.<V_sz_elem>\t%q0, %1"
-  [(set_attr "length" "4")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vdup<mode>"))
+  (set_attr "length" "4")
    (set_attr "type" "mve_move")])
 
 ;;
@@ -145,7 +154,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_mnemo>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -159,7 +169,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -173,7 +184,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "v<absneg_str>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_v<absneg_str>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -187,7 +199,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -201,7 +214,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 ;;
 ;; [vcvttq_f32_f16])
@@ -214,7 +228,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtt.f32.f16\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f32_f16v4sf"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -228,7 +243,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtb.f32.f16\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f32_f16v4sf"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -242,7 +258,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvt.f%#<V_sz_elem>.<supf>%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_to_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -256,7 +273,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -270,7 +288,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_from_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -284,7 +303,8 @@
   ]
   "TARGET_HAVE_MVE"
   "v<absneg_str>.s%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_v<absneg_str>q_s<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -297,7 +317,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vmvn\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmvnq_u<mode>"))
+  (set_attr "type" "mve_move")
 ])
 (define_expand "mve_vmvnq_s<mode>"
   [
@@ -318,7 +339,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -331,7 +353,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vclz.i%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vclzq_s<mode>"))
+  (set_attr "type" "mve_move")
 ])
 (define_expand "mve_vclzq_u<mode>"
   [
@@ -354,7 +377,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -368,7 +392,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -382,7 +407,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -397,7 +423,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -411,7 +438,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtp.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtpq_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -425,7 +453,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtn.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtnq_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -439,7 +468,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtm.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtmq_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -453,7 +483,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvta.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtaq_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -467,7 +498,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.i%#<V_sz_elem>\t%q0, %1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -481,7 +513,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -495,7 +528,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>32\t%Q0, %R0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -509,7 +543,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vctp.<MVE_vctp>\t%1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vctp<MVE_vctp>q<MVE_vpred>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -523,7 +558,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpnot"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vpnotv16bi"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -538,7 +574,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -553,7 +590,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvt.f<V_sz_elem>.<supf><V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_to_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;; [vcreateq_f])
@@ -599,7 +637,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf><V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;; Versions that take constant vectors as operand 2 (with all elements
@@ -617,7 +656,8 @@
 					VALID_NEON_QREG_MODE (<MODE>mode),
 					true);
   }
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshrq_n_s<mode>_imm"))
+  (set_attr "type" "mve_move")
 ])
 (define_insn "mve_vshrq_n_u<mode>_imm"
   [
@@ -632,7 +672,8 @@
 					VALID_NEON_QREG_MODE (<MODE>mode),
 					true);
   }
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshrq_n_u<mode>_imm"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -647,7 +688,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvt.<supf><V_sz_elem>.f<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_from_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -662,8 +704,9 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>32\t%Q0, %R0, %q1"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [vcmpneq_, vcmpcsq_, vcmpeqq_, vcmpgeq_, vcmpgtq_, vcmphiq_, vcmpleq_, vcmpltq_])
@@ -676,7 +719,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vcmp.<mve_cmp_type>%#<V_sz_elem>\t<mve_cmp_op>, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op>q_<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -691,7 +735,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vcmp.<mve_cmp_type>%#<V_sz_elem>	<mve_cmp_op>, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op>q_n_<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -722,7 +767,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -739,7 +785,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.i%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -754,7 +801,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -769,7 +817,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -789,8 +838,11 @@
   "@
    vand\t%q0, %q1, %q2
    * return neon_output_logic_immediate (\"vand\", &operands[2], <MODE>mode, 1, VALID_NEON_QREG_MODE (<MODE>mode));"
-  [(set_attr "type" "mve_move")
+   [(set_attr_alternative "mve_unpredicated_insn" [(symbol_ref "CODE_FOR_mve_vandq_u<mode>")
+						   (symbol_ref "CODE_FOR_nothing")])
+  (set_attr "type" "mve_move")
 ])
+
 (define_expand "mve_vandq_s<mode>"
   [
    (set (match_operand:MVE_2 0 "s_register_operand")
@@ -811,7 +863,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vbic\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbicq_u<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 (define_expand "mve_vbicq_s<mode>"
@@ -835,7 +888,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -853,7 +907,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<isu>%#<V_sz_elem>\t%q0, %q1, %q2, #<rot>"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q<mve_rot>_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;; Auto vectorizer pattern for int vcadd
@@ -876,7 +931,8 @@
   ]
   "TARGET_HAVE_MVE"
   "veor\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_veorq_u<mode>"))
+  (set_attr "type" "mve_move")
 ])
 (define_expand "mve_veorq_s<mode>"
   [
@@ -904,7 +960,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -920,7 +977,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.s%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -935,7 +993,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<max_min_su_str>.<max_min_supf>%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<max_min_su_str>q_<max_min_supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 
@@ -954,7 +1013,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -972,7 +1032,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -987,7 +1048,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vmullb.<supf>%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmullbq_int_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1002,7 +1064,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vmullt.<supf>%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulltq_int_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1018,7 +1081,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_addsubmul>.i%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_addsubmul>q<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1032,7 +1096,8 @@
   ]
   "TARGET_HAVE_MVE"
    "vorn\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_s<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 (define_expand "mve_vornq_u<mode>"
@@ -1061,7 +1126,8 @@
   "@
    vorr\t%q0, %q1, %q2
    * return neon_output_logic_immediate (\"vorr\", &operands[2], <MODE>mode, 0, VALID_NEON_QREG_MODE (<MODE>mode));"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vorrq_s<mode>"))
+  (set_attr "type" "mve_move")
 ])
 (define_expand "mve_vorrq_u<mode>"
   [
@@ -1085,7 +1151,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1101,7 +1168,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1117,7 +1185,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_r_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1132,7 +1201,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1147,7 +1217,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.f%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1162,7 +1233,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>32\t%Q0, %R0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1179,7 +1251,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.f%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1193,7 +1266,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vand\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vandq_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1207,7 +1281,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vbic\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbicq_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1223,7 +1298,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.f%#<V_sz_elem>\t%q0, %q1, %q2, #<rot>"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q<mve_rot>_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1237,7 +1313,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcmp.f%#<V_sz_elem>	<mve_cmp_op>, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1252,7 +1329,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcmp.f%#<V_sz_elem>	<mve_cmp_op>, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1267,8 +1345,10 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vctpt.<MVE_vctp>\t%1"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vctp<MVE_vctp>q<MVE_vpred>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")
+])
 
 ;;
 ;; [vcvtbq_f16_f32])
@@ -1282,7 +1362,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtb.f16.f32\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f16_f32v8hf"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1297,7 +1378,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtt.f16.f32\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f16_f32v8hf"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1311,7 +1393,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "veor\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_veorq_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1327,7 +1410,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1345,7 +1429,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.f%#<V_sz_elem>\t%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1360,7 +1445,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<max_min_f_str>.f%#<V_sz_elem>	%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<max_min_f_str>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1378,7 +1464,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1398,7 +1485,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<isu>%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1414,7 +1502,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_addsubmul>.f%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_addsubmul>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1428,7 +1517,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vorn\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1442,7 +1532,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vorr\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vorrq_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1458,7 +1549,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.i%#<V_sz_elem>	%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1474,7 +1566,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.s%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1490,7 +1583,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.s%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1508,7 +1602,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>32\t%Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1524,7 +1619,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1539,7 +1635,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vmullt.p%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulltq_poly_p<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1554,7 +1651,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vmullb.p%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmullbq_poly_p<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1575,8 +1673,10 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmpt.f%#<V_sz_elem>\t<mve_cmp_op1>, %q1, %q2"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op1>q_f<mode>"))
+  (set_attr "length""8")])
+
 ;;
 ;; [vcvtaq_m_u, vcvtaq_m_s])
 ;;
@@ -1590,8 +1689,10 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtat.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtaq_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
+
 ;;
 ;; [vcvtq_m_to_f_s, vcvtq_m_to_f_u])
 ;;
@@ -1605,8 +1706,9 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtt.f%#<V_sz_elem>.<supf>%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_to_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [vqrshrnbq_n_u, vqrshrnbq_n_s]
@@ -1632,7 +1734,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<isu>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1651,7 +1754,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>32\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1667,7 +1771,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1713,7 +1818,10 @@
 		   (match_dup 4)]
 	VSHLCQ))]
  "TARGET_HAVE_MVE"
- "vshlc\t%q0, %1, %4")
+ "vshlc\t%q0, %1, %4"
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlcq_<supf><mode>"))
+  (set_attr "type" "mve_move")
+])
 
 ;;
 ;; [vabsq_m_s]
@@ -1733,7 +1841,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<isu>%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1749,7 +1858,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1772,7 +1882,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.<isu>%#<V_sz_elem>\t<mve_cmp_op1>, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op1>q_n_<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1795,7 +1906,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.<isu>%#<V_sz_elem>\t<mve_cmp_op1>, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op1>q_<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1811,7 +1923,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1828,7 +1941,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.s%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1847,7 +1961,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1866,7 +1981,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1885,7 +2001,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1906,7 +2023,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1922,7 +2040,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1938,7 +2057,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1961,7 +2081,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.s%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1978,7 +2099,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1995,7 +2117,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_r_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2011,7 +2134,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2027,7 +2151,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2043,7 +2168,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2066,7 +2192,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_mnemo>t.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2082,7 +2209,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>32\t%Q0, %R0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
 ;; [vcmlaq, vcmlaq_rot90, vcmlaq_rot180, vcmlaq_rot270])
@@ -2100,7 +2228,9 @@
   "@
    vcmul.f%#<V_sz_elem>	%q0, %q2, %q3, #<rot>
    vcmla.f%#<V_sz_elem>	%q0, %q2, %q3, #<rot>"
-  [(set_attr "type" "mve_move")
+  [(set_attr_alternative "mve_unpredicated_insn" [(symbol_ref "CODE_FOR_mve_<mve_insn>q<mve_rot>_f<mode>")
+						  (symbol_ref "CODE_FOR_mve_<mve_insn>q<mve_rot>_f<mode>")])
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2121,7 +2251,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmpt.f%#<V_sz_elem>\t<mve_cmp_op1>, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op1>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2137,7 +2268,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtbt.f16.f32\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f16_f32v8hf"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2153,7 +2285,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtbt.f32.f16\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f32_f16v4sf"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2169,7 +2302,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvttt.f16.f32\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f16_f32v8hf"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2185,8 +2319,9 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvttt.f32.f16\t%q0, %q2"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f32_f16v4sf"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [vdupq_m_n_f])
@@ -2201,7 +2336,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2218,7 +2354,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.f%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2235,7 +2372,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.f%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2252,7 +2390,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2271,7 +2410,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.f%#<V_sz_elem>\t%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2290,7 +2430,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2309,7 +2450,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2326,7 +2468,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2347,7 +2490,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<isu>%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2363,7 +2507,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.i%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2380,7 +2525,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.i%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2396,7 +2542,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2412,7 +2559,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2428,7 +2576,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2444,7 +2593,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2463,7 +2613,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>32\t%Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2479,7 +2630,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtmt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtmq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2495,7 +2647,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtpt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtpq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2511,7 +2664,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtnt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtnq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2528,7 +2682,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_from_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2544,7 +2699,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2560,8 +2716,9 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_from_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [vabavq_p_s, vabavq_p_u])
@@ -2577,7 +2734,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "8")])
 
 ;;
@@ -2594,8 +2752,9 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\n\t<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
-   (set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length" "8")])
 
 ;;
 ;; [vsriq_m_n_s, vsriq_m_n_u])
@@ -2611,8 +2770,9 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
-   (set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length" "8")])
 
 ;;
 ;; [vcvtq_m_n_to_f_u, vcvtq_m_n_to_f_s])
@@ -2628,7 +2788,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtt.f%#<V_sz_elem>.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_to_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2668,7 +2829,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2687,8 +2849,9 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.i%#<V_sz_elem>	%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [vaddq_m_u, vaddq_m_s]
@@ -2706,7 +2869,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.i%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2726,7 +2890,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2743,8 +2908,9 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [vcaddq_rot90_m_u, vcaddq_rot90_m_s]
@@ -2763,7 +2929,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<isu>%#<V_sz_elem>\t%q0, %q2, %q3, #<rot>"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q<mve_rot>_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2791,7 +2958,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2812,7 +2980,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2829,7 +2998,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmullbt.<supf>%#<V_sz_elem>	%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmullbq_int_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2846,7 +3016,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmulltt.<supf>%#<V_sz_elem>	%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulltq_int_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2863,7 +3034,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vornt\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2881,7 +3053,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2899,7 +3072,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2916,7 +3090,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2936,7 +3111,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2964,7 +3140,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<isu>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2984,7 +3161,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>32\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3002,7 +3180,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3019,7 +3198,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmullbt.p%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmullbq_poly_p<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3036,7 +3216,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmulltt.p%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulltq_poly_p<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3054,7 +3235,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.s%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3072,7 +3254,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.s%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3096,7 +3279,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.f%#<V_sz_elem>	%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3117,7 +3301,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.f%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3137,7 +3322,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3154,7 +3340,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3176,7 +3363,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.f%#<V_sz_elem>\t%q0, %q2, %q3, #<rot>"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q<mve_rot>_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3196,7 +3384,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.f%#<V_sz_elem>\t%q0, %q2, %q3, #<rot>"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q<mve_rot>_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3213,7 +3402,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vornt\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3233,7 +3423,8 @@
    output_asm_insn("vstrb.<V_sz_elem>\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrbq_scatter_offset_s vstrbq_scatter_offset_u]
@@ -3261,7 +3452,8 @@
 	  VSTRBSOQ))]
   "TARGET_HAVE_MVE"
   "vstrb.<V_sz_elem>\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_scatter_offset_<supf><mode>_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_base_s vstrwq_scatter_base_u]
@@ -3283,7 +3475,8 @@
    output_asm_insn("vstrw.u32\t%q2, [%q0, %1]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_<supf>v4si"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrbq_gather_offset_s vldrbq_gather_offset_u]
@@ -3306,7 +3499,8 @@
      output_asm_insn ("vldrb.<supf><V_sz_elem>\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_gather_offset_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrbq_s vldrbq_u]
@@ -3328,7 +3522,8 @@
      output_asm_insn ("vldrb.<supf><V_sz_elem>\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_base_s vldrwq_gather_base_u]
@@ -3348,7 +3543,8 @@
    output_asm_insn ("vldrw.u32\t%q0, [%q1, %2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_<supf>v4si"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrbq_scatter_offset_p_s vstrbq_scatter_offset_p_u]
@@ -3380,7 +3576,8 @@
 	  VSTRBSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrbt.<V_sz_elem>\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_scatter_offset_<supf><mode>_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_base_p_s vstrwq_scatter_base_p_u]
@@ -3403,7 +3600,8 @@
    output_asm_insn ("vpst\n\tvstrwt.u32\t%q2, [%q0, %1]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_<supf>v4si"))
+  (set_attr "length" "8")])
 
 (define_insn "mve_vstrbq_p_<supf><mode>"
   [(set (match_operand:<MVE_B_ELEM> 0 "mve_memory_operand" "=Ux")
@@ -3421,7 +3619,8 @@
    output_asm_insn ("vpst\;vstrbt.<V_sz_elem>\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrbq_gather_offset_z_s vldrbq_gather_offset_z_u]
@@ -3446,7 +3645,8 @@
      output_asm_insn ("vpst\n\tvldrbt.<supf><V_sz_elem>\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_gather_offset_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrbq_z_s vldrbq_z_u]
@@ -3469,7 +3669,8 @@
      output_asm_insn ("vpst\;vldrbt.<supf><V_sz_elem>\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_base_z_s vldrwq_gather_base_z_u]
@@ -3490,7 +3691,8 @@
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%q1, %2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_<supf>v4si"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_f]
@@ -3509,7 +3711,8 @@
    output_asm_insn ("vldrh.16\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_fv8hf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_gather_offset_s vldrhq_gather_offset_u]
@@ -3532,7 +3735,8 @@
      output_asm_insn ("vldrh.<supf><V_sz_elem>\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_gather_offset_z_s vldrhq_gather_offset_z_u]
@@ -3557,7 +3761,8 @@
      output_asm_insn ("vpst\n\tvldrht.<supf><V_sz_elem>\t%q0, [%m1, %q2]",ops);
    return "";
 }
- [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_gather_shifted_offset_s vldrhq_gather_shifted_offset_u]
@@ -3580,7 +3785,8 @@
      output_asm_insn ("vldrh.<supf><V_sz_elem>\t%q0, [%m1, %q2, uxtw #1]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_gather_shifted_offset_z_s vldrhq_gather_shited_offset_z_u]
@@ -3605,7 +3811,8 @@
      output_asm_insn ("vpst\n\tvldrht.<supf><V_sz_elem>\t%q0, [%m1, %q2, uxtw #1]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_s, vldrhq_u]
@@ -3627,7 +3834,8 @@
      output_asm_insn ("vldrh.<supf><V_sz_elem>\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_z_f]
@@ -3647,7 +3855,8 @@
    output_asm_insn ("vpst\;vldrht.16\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_fv8hf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_z_s vldrhq_z_u]
@@ -3670,7 +3879,8 @@
      output_asm_insn ("vpst\;vldrht.<supf><V_sz_elem>\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_f]
@@ -3689,7 +3899,8 @@
    output_asm_insn ("vldrw.32\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_s vldrwq_u]
@@ -3708,7 +3919,8 @@
    output_asm_insn ("vldrw.32\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_<supf>v4si"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_z_f]
@@ -3728,7 +3940,8 @@
    output_asm_insn ("vpst\;vldrwt.32\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_z_s vldrwq_z_u]
@@ -3748,7 +3961,8 @@
    output_asm_insn ("vpst\;vldrwt.32\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_<supf>v4si"))
+  (set_attr "length" "8")])
 
 (define_expand "mve_vld1q_f<mode>"
   [(match_operand:MVE_0 0 "s_register_operand")
@@ -3788,7 +4002,8 @@
    output_asm_insn ("vldrd.64\t%q0, [%q1, %2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_<supf>v2di"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrdq_gather_base_z_s vldrdq_gather_base_z_u]
@@ -3809,7 +4024,8 @@
    output_asm_insn ("vpst\n\tvldrdt.u64\t%q0, [%q1, %2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_<supf>v2di"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrdq_gather_offset_s vldrdq_gather_offset_u]
@@ -3829,7 +4045,8 @@
   output_asm_insn ("vldrd.u64\t%q0, [%m1, %q2]",ops);
   return "";
 }
- [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_offset_<supf>v2di"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrdq_gather_offset_z_s vldrdq_gather_offset_z_u]
@@ -3850,7 +4067,8 @@
   output_asm_insn ("vpst\n\tvldrdt.u64\t%q0, [%m1, %q2]",ops);
   return "";
 }
- [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_offset_<supf>v2di"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrdq_gather_shifted_offset_s vldrdq_gather_shifted_offset_u]
@@ -3870,7 +4088,8 @@
    output_asm_insn ("vldrd.u64\t%q0, [%m1, %q2, uxtw #3]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_shifted_offset_<supf>v2di"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrdq_gather_shifted_offset_z_s vldrdq_gather_shifted_offset_z_u]
@@ -3891,7 +4110,8 @@
    output_asm_insn ("vpst\n\tvldrdt.u64\t%q0, [%m1, %q2, uxtw #3]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_shifted_offset_<supf>v2di"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_gather_offset_f]
@@ -3911,7 +4131,8 @@
    output_asm_insn ("vldrh.f16\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_fv8hf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_gather_offset_z_f]
@@ -3933,7 +4154,8 @@
    output_asm_insn ("vpst\n\tvldrht.f16\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_fv8hf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_gather_shifted_offset_f]
@@ -3953,7 +4175,8 @@
    output_asm_insn ("vldrh.f16\t%q0, [%m1, %q2, uxtw #1]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_fv8hf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_gather_shifted_offset_z_f]
@@ -3975,7 +4198,8 @@
    output_asm_insn ("vpst\n\tvldrht.f16\t%q0, [%m1, %q2, uxtw #1]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_fv8hf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_base_f]
@@ -3995,7 +4219,8 @@
    output_asm_insn ("vldrw.u32\t%q0, [%q1, %2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_base_z_f]
@@ -4016,7 +4241,8 @@
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%q1, %2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_offset_f]
@@ -4036,7 +4262,8 @@
    output_asm_insn ("vldrw.u32\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_offset_s vldrwq_gather_offset_u]
@@ -4056,7 +4283,8 @@
    output_asm_insn ("vldrw.u32\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_<supf>v4si"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_offset_z_f]
@@ -4078,7 +4306,8 @@
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_offset_z_s vldrwq_gather_offset_z_u]
@@ -4100,7 +4329,8 @@
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_<supf>v4si"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_shifted_offset_f]
@@ -4120,7 +4350,8 @@
    output_asm_insn ("vldrw.u32\t%q0, [%m1, %q2, uxtw #2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_shifted_offset_s vldrwq_gather_shifted_offset_u]
@@ -4140,7 +4371,8 @@
    output_asm_insn ("vldrw.u32\t%q0, [%m1, %q2, uxtw #2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_<supf>v4si"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_shifted_offset_z_f]
@@ -4162,7 +4394,8 @@
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2, uxtw #2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_shifted_offset_z_s vldrwq_gather_shifted_offset_z_u]
@@ -4184,7 +4417,8 @@
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2, uxtw #2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_<supf>v4si"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_f]
@@ -4203,7 +4437,8 @@
    output_asm_insn ("vstrh.16\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_fv8hf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_p_f]
@@ -4224,7 +4459,8 @@
    output_asm_insn ("vpst\;vstrht.16\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_fv8hf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_p_s vstrhq_p_u]
@@ -4246,7 +4482,8 @@
    output_asm_insn ("vpst\;vstrht.<V_sz_elem>\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_scatter_offset_p_s vstrhq_scatter_offset_p_u]
@@ -4278,7 +4515,8 @@
 	  VSTRHSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrht.<V_sz_elem>\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset_<supf><mode>_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_scatter_offset_s vstrhq_scatter_offset_u]
@@ -4306,7 +4544,8 @@
 	  VSTRHSOQ))]
   "TARGET_HAVE_MVE"
   "vstrh.<V_sz_elem>\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset_<supf><mode>_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_scatter_shifted_offset_p_s vstrhq_scatter_shifted_offset_p_u]
@@ -4338,7 +4577,8 @@
 	  VSTRHSSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrht.<V_sz_elem>\t%q2, [%0, %q1, uxtw #1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset_<supf><mode>_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_scatter_shifted_offset_s vstrhq_scatter_shifted_offset_u]
@@ -4367,7 +4607,8 @@
 	  VSTRHSSOQ))]
   "TARGET_HAVE_MVE"
   "vstrh.<V_sz_elem>\t%q2, [%0, %q1, uxtw #1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset_<supf><mode>_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_s, vstrhq_u]
@@ -4386,7 +4627,8 @@
    output_asm_insn ("vstrh.<V_sz_elem>\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_f]
@@ -4405,7 +4647,8 @@
    output_asm_insn ("vstrw.32\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_p_f]
@@ -4426,7 +4669,8 @@
    output_asm_insn ("vpst\;vstrwt.32\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_p_s vstrwq_p_u]
@@ -4447,7 +4691,8 @@
    output_asm_insn ("vpst\;vstrwt.32\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_<supf>v4si"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_s vstrwq_u]
@@ -4466,7 +4711,8 @@
    output_asm_insn ("vstrw.32\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_<supf>v4si"))
+  (set_attr "length" "4")])
 
 (define_expand "mve_vst1q_f<mode>"
   [(match_operand:<MVE_CNVT> 0 "mve_memory_operand")
@@ -4509,7 +4755,8 @@
    output_asm_insn ("vpst\;\tvstrdt.u64\t%q2, [%q0, %1]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_<supf>v2di"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrdq_scatter_base_s vstrdq_scatter_base_u]
@@ -4531,7 +4778,8 @@
    output_asm_insn ("vstrd.u64\t%q2, [%q0, %1]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_<supf>v2di"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrdq_scatter_offset_p_s vstrdq_scatter_offset_p_u]
@@ -4562,7 +4810,8 @@
 	  VSTRDSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrdt.64\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_offset_<supf>v2di_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrdq_scatter_offset_s vstrdq_scatter_offset_u]
@@ -4590,7 +4839,8 @@
 	  VSTRDSOQ))]
   "TARGET_HAVE_MVE"
   "vstrd.64\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_offset_<supf>v2di_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrdq_scatter_shifted_offset_p_s vstrdq_scatter_shifted_offset_p_u]
@@ -4622,7 +4872,8 @@
 	  VSTRDSSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrdt.64\t%q2, [%0, %q1, uxtw #3]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_shifted_offset_<supf>v2di_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrdq_scatter_shifted_offset_s vstrdq_scatter_shifted_offset_u]
@@ -4651,7 +4902,8 @@
 	  VSTRDSSOQ))]
   "TARGET_HAVE_MVE"
   "vstrd.64\t%q2, [%0, %q1, uxtw #3]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_shifted_offset_<supf>v2di_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_scatter_offset_f]
@@ -4679,7 +4931,8 @@
 	  VSTRHQSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vstrh.16\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset_fv8hf_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_scatter_offset_p_f]
@@ -4710,7 +4963,8 @@
 	  VSTRHQSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vstrht.16\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset_fv8hf_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_scatter_shifted_offset_f]
@@ -4738,7 +4992,8 @@
 	  VSTRHQSSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vstrh.16\t%q2, [%0, %q1, uxtw #1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset_fv8hf_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_scatter_shifted_offset_p_f]
@@ -4770,7 +5025,8 @@
 	  VSTRHQSSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vstrht.16\t%q2, [%0, %q1, uxtw #1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset_fv8hf_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_base_f]
@@ -4792,7 +5048,8 @@
    output_asm_insn ("vstrw.u32\t%q2, [%q0, %1]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_base_p_f]
@@ -4815,7 +5072,8 @@
    output_asm_insn ("vpst\n\tvstrwt.u32\t%q2, [%q0, %1]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_offset_f]
@@ -4843,7 +5101,8 @@
 	  VSTRWQSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vstrw.32\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_fv4sf_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_offset_p_f]
@@ -4874,7 +5133,8 @@
 	  VSTRWQSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vstrwt.32\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_fv4sf_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_offset_s vstrwq_scatter_offset_u]
@@ -4905,7 +5165,8 @@
 	  VSTRWSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrwt.32\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_<supf>v4si_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_offset_s vstrwq_scatter_offset_u]
@@ -4933,7 +5194,8 @@
 	  VSTRWSOQ))]
   "TARGET_HAVE_MVE"
   "vstrw.32\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_<supf>v4si_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_shifted_offset_f]
@@ -4961,7 +5223,8 @@
 	 VSTRWQSSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vstrw.32\t%q2, [%0, %q1, uxtw #2]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_fv4sf_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_shifted_offset_p_f]
@@ -4993,7 +5256,8 @@
 	  VSTRWQSSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vstrwt.32\t%q2, [%0, %q1, uxtw #2]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_fv4sf_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_shifted_offset_p_s vstrwq_scatter_shifted_offset_p_u]
@@ -5025,7 +5289,8 @@
 	  VSTRWSSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrwt.32\t%q2, [%0, %q1, uxtw #2]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_<supf>v4si_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_shifted_offset_s vstrwq_scatter_shifted_offset_u]
@@ -5054,7 +5319,8 @@
 	  VSTRWSSOQ))]
   "TARGET_HAVE_MVE"
   "vstrw.32\t%q2, [%0, %q1, uxtw #2]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_<supf>v4si_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vidupq_n_u])
@@ -5122,7 +5388,8 @@
 		(match_operand:SI 6 "immediate_operand" "i")))]
  "TARGET_HAVE_MVE"
  "vpst\;\tvidupt.u%#<V_sz_elem>\t%q0, %2, %4"
- [(set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vidupq_u<mode>_insn"))
+  (set_attr "length""8")])
 
 ;;
 ;; [vddupq_n_u])
@@ -5190,7 +5457,8 @@
 		 (match_operand:SI 6 "immediate_operand" "i")))]
  "TARGET_HAVE_MVE"
  "vpst\;vddupt.u%#<V_sz_elem>\t%q0, %2, %4"
- [(set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vddupq_u<mode>_insn"))
+  (set_attr "length""8")])
 
 ;;
 ;; [vdwdupq_n_u])
@@ -5306,8 +5574,9 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vdwdupt.u%#<V_sz_elem>\t%q2, %3, %R4, %5"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vdwdupq_wb_u<mode>_insn"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [viwdupq_n_u])
@@ -5423,7 +5692,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;\tviwdupt.u%#<V_sz_elem>\t%q2, %3, %R4, %5"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_viwdupq_wb_u<mode>_insn"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5449,7 +5719,8 @@
    output_asm_insn ("vstrw.u32\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_<supf>v4si"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_base_wb_p_s vstrwq_scatter_base_wb_p_u]
@@ -5475,7 +5746,8 @@
    output_asm_insn ("vpst\;\tvstrwt.u32\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_<supf>v4si"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_base_wb_f]
@@ -5500,7 +5772,8 @@
    output_asm_insn ("vstrw.u32\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_base_wb_p_f]
@@ -5526,7 +5799,8 @@
    output_asm_insn ("vpst\;vstrwt.u32\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrdq_scatter_base_wb_s vstrdq_scatter_base_wb_u]
@@ -5551,7 +5825,8 @@
    output_asm_insn ("vstrd.u64\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_wb_<supf>v2di"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrdq_scatter_base_wb_p_s vstrdq_scatter_base_wb_p_u]
@@ -5577,7 +5852,8 @@
    output_asm_insn ("vpst\;vstrdt.u64\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_wb_<supf>v2di"))
+  (set_attr "length" "8")])
 
 (define_expand "mve_vldrwq_gather_base_wb_<supf>v4si"
   [(match_operand:V4SI 0 "s_register_operand")
@@ -5629,7 +5905,8 @@
    output_asm_insn ("vldrw.u32\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_<supf>v4si_insn"))
+  (set_attr "length" "4")])
 
 (define_expand "mve_vldrwq_gather_base_wb_z_<supf>v4si"
   [(match_operand:V4SI 0 "s_register_operand")
@@ -5685,7 +5962,8 @@
    output_asm_insn ("vpst\;vldrwt.u32\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_<supf>v4si_insn"))
+  (set_attr "length" "8")])
 
 (define_expand "mve_vldrwq_gather_base_wb_fv4sf"
   [(match_operand:V4SI 0 "s_register_operand")
@@ -5737,7 +6015,8 @@
    output_asm_insn ("vldrw.u32\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_fv4sf_insn"))
+  (set_attr "length" "4")])
 
 (define_expand "mve_vldrwq_gather_base_wb_z_fv4sf"
   [(match_operand:V4SI 0 "s_register_operand")
@@ -5794,7 +6073,8 @@
    output_asm_insn ("vpst\;vldrwt.u32\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_fv4sf_insn"))
+  (set_attr "length" "8")])
 
 (define_expand "mve_vldrdq_gather_base_wb_<supf>v2di"
   [(match_operand:V2DI 0 "s_register_operand")
@@ -5847,7 +6127,8 @@
    output_asm_insn ("vldrd.64\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_wb_<supf>v2di_insn"))
+  (set_attr "length" "4")])
 
 (define_expand "mve_vldrdq_gather_base_wb_z_<supf>v2di"
   [(match_operand:V2DI 0 "s_register_operand")
@@ -5886,7 +6167,7 @@
    (unspec_volatile:SI [(reg:SI VFPCC_REGNUM)] UNSPEC_GET_FPSCR_NZCVQC))]
  "TARGET_HAVE_MVE"
  "vmrs\\t%0, FPSCR_nzcvqc"
-  [(set_attr "type" "mve_move")])
+ [(set_attr "type" "mve_move")])
 
 (define_insn "set_fpscr_nzcvqc"
  [(set (reg:SI VFPCC_REGNUM)
@@ -5894,7 +6175,7 @@
     VUNSPEC_SET_FPSCR_NZCVQC))]
  "TARGET_HAVE_MVE"
  "vmsr\\tFPSCR_nzcvqc, %0"
-  [(set_attr "type" "mve_move")])
+ [(set_attr "type" "mve_move")])
 
 ;;
 ;; [vldrdq_gather_base_wb_z_s vldrdq_gather_base_wb_z_u]
@@ -5919,7 +6200,8 @@
    output_asm_insn ("vpst\;vldrdt.u64\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_wb_<supf>v2di_insn"))
+  (set_attr "length" "8")])
 ;;
 ;; [vadciq_m_s, vadciq_m_u])
 ;;
@@ -5936,7 +6218,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vadcit.i32\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadciq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "8")])
 
 ;;
@@ -5953,7 +6236,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vadci.i32\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadciq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "4")])
 
 ;;
@@ -5972,7 +6256,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vadct.i32\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadcq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "8")])
 
 ;;
@@ -5989,7 +6274,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vadc.i32\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadcq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "4")
    (set_attr "conds" "set")])
 
@@ -6009,7 +6295,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vsbcit.i32\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbciq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "8")])
 
 ;;
@@ -6026,7 +6313,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vsbci.i32\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbciq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "4")])
 
 ;;
@@ -6045,7 +6333,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vsbct.i32\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbcq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "8")])
 
 ;;
@@ -6062,7 +6351,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vsbc.i32\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbcq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "4")])
 
 ;;
@@ -6091,7 +6381,7 @@
 		    "vst21.<V_sz_elem>\t{%q0, %q1}, %3", ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set_attr "length" "8")])
 
 ;;
 ;; [vld2q])
@@ -6119,7 +6409,7 @@
 		    "vld21.<V_sz_elem>\t{%q0, %q1}, %3", ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set_attr "length" "8")])
 
 ;;
 ;; [vld4q])
@@ -6462,7 +6752,8 @@
  ]
  "TARGET_HAVE_MVE"
  "vpst\;vshlct\t%q0, %1, %4"
- [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlcq_<supf><mode>"))
+  (set_attr "type" "mve_move")
   (set_attr "length" "8")])
 
 ;; CDE instructions on MVE registers.
@@ -6474,7 +6765,8 @@
 	 UNSPEC_VCDE))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx1\\tp%c1, %q0, #%c2"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx1qv16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx1qav16qi"
@@ -6485,7 +6777,8 @@
 	 UNSPEC_VCDEA))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx1a\\tp%c1, %q0, #%c3"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx1qav16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx2qv16qi"
@@ -6496,7 +6789,8 @@
 	 UNSPEC_VCDE))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx2\\tp%c1, %q0, %q2, #%c3"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx2qv16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx2qav16qi"
@@ -6508,7 +6802,8 @@
 	 UNSPEC_VCDEA))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx2a\\tp%c1, %q0, %q3, #%c4"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx2qav16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx3qv16qi"
@@ -6520,7 +6815,8 @@
 	 UNSPEC_VCDE))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx3\\tp%c1, %q0, %q2, %q3, #%c4"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx3qv16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx3qav16qi"
@@ -6533,7 +6829,8 @@
 	 UNSPEC_VCDEA))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx3a\\tp%c1, %q0, %q3, %q4, #%c5"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx3qav16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx1q<a>_p_v16qi"
@@ -6545,7 +6842,8 @@
 	 CDE_VCX))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vpst\;vcx1<a>t\\tp%c1, %q0, #%c3"
-  [(set_attr "type" "coproc")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx1q<a>v16qi"))
+  (set_attr "type" "coproc")
    (set_attr "length" "8")]
 )
 
@@ -6559,7 +6857,8 @@
 	 CDE_VCX))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vpst\;vcx2<a>t\\tp%c1, %q0, %q3, #%c4"
-  [(set_attr "type" "coproc")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx2q<a>v16qi"))
+  (set_attr "type" "coproc")
    (set_attr "length" "8")]
 )
 
@@ -6574,11 +6873,12 @@
 	 CDE_VCX))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vpst\;vcx3<a>t\\tp%c1, %q0, %q3, %q4, #%c5"
-  [(set_attr "type" "coproc")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx3q<a>v16qi"))
+  (set_attr "type" "coproc")
    (set_attr "length" "8")]
 )
 
-(define_insn "*movmisalign<mode>_mve_store"
+(define_insn "movmisalign<mode>_mve_store"
   [(set (match_operand:MVE_VLD_ST 0 "mve_memory_operand"	     "=Ux")
 	(unspec:MVE_VLD_ST [(match_operand:MVE_VLD_ST 1 "s_register_operand" " w")]
 	 UNSPEC_MISALIGNED_ACCESS))]
@@ -6586,11 +6886,12 @@
     || (TARGET_HAVE_MVE_FLOAT && VALID_MVE_SF_MODE (<MODE>mode)))
    && !BYTES_BIG_ENDIAN && unaligned_access"
   "vstr<V_sz_elem1>.<V_sz_elem>\t%q1, %E0"
-  [(set_attr "type" "mve_store")]
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_movmisalign<mode>_mve_store"))
+   (set_attr "type" "mve_store")]
 )
 
 
-(define_insn "*movmisalign<mode>_mve_load"
+(define_insn "movmisalign<mode>_mve_load"
   [(set (match_operand:MVE_VLD_ST 0 "s_register_operand"				 "=w")
 	(unspec:MVE_VLD_ST [(match_operand:MVE_VLD_ST 1 "mve_memory_operand" " Ux")]
 	 UNSPEC_MISALIGNED_ACCESS))]
@@ -6598,7 +6899,8 @@
     || (TARGET_HAVE_MVE_FLOAT && VALID_MVE_SF_MODE (<MODE>mode)))
    && !BYTES_BIG_ENDIAN && unaligned_access"
   "vldr<V_sz_elem1>.<V_sz_elem>\t%q0, %E1"
-  [(set_attr "type" "mve_load")]
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_movmisalign<mode>_mve_load"))
+   (set_attr "type" "mve_load")]
 )
 
 ;; Expander for VxBI moves
@@ -6680,3 +6982,40 @@
       }
   }
 )
+
+;; Originally expanded by 'predicated_doloop_end'.
+;; In the rare situation where the branch is too far, we do also need to
+;; revert FPSCR.LTPSIZE back to 0x100 after the last iteration.
+(define_insn "*predicated_doloop_end_internal"
+  [(set (pc)
+	(if_then_else
+	   (ge (plus:SI (reg:SI LR_REGNUM)
+			(match_operand:SI 0 "const_int_operand" ""))
+		(const_int 0))
+	 (label_ref (match_operand 1 "" ""))
+	 (pc)))
+   (set (reg:SI LR_REGNUM)
+	(plus:SI (reg:SI LR_REGNUM) (match_dup 0)))
+   (clobber (reg:CC CC_REGNUM))]
+  "TARGET_32BIT && TARGET_HAVE_LOB && TARGET_HAVE_MVE && TARGET_THUMB2"
+  {
+    if (get_attr_length (insn) == 4)
+      return "letp\t%|lr, %l1";
+    else
+      return "subs\t%|lr, #%n0\n\tbgt\t%l1\n\tlctp";
+  }
+  [(set (attr "length")
+	(if_then_else
+	   (ltu (minus (pc) (match_dup 1)) (const_int 1024))
+	    (const_int 4)
+	    (const_int 6)))
+   (set_attr "type" "branch")])
+
+(define_insn "dlstp<mode1>_insn"
+  [
+    (set (reg:SI LR_REGNUM)
+	 (unspec:SI [(match_operand:SI 0 "s_register_operand" "r")]
+	  DLSTP))
+  ]
+  "TARGET_32BIT && TARGET_HAVE_LOB && TARGET_HAVE_MVE && TARGET_THUMB2"
+  "dlstp.<mode1>\t%|lr, %0")
diff --git a/gcc/config/arm/vec-common.md b/gcc/config/arm/vec-common.md
index 9af8429968d..74871cb984b 100644
--- a/gcc/config/arm/vec-common.md
+++ b/gcc/config/arm/vec-common.md
@@ -366,7 +366,8 @@
   "@
    <mve_insn>.<supf>%#<V_sz_elem>\t%<V_reg>0, %<V_reg>1, %<V_reg>2
    * return neon_output_shift_immediate (\"vshl\", 'i', &operands[2], <MODE>mode, VALID_NEON_QREG_MODE (<MODE>mode), true);"
-  [(set_attr "type" "neon_shift_reg<q>, neon_shift_imm<q>")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "neon_shift_reg<q>, neon_shift_imm<q>")]
 )
 
 (define_expand "vashl<mode>3"

^ permalink raw reply	[flat|nested] 7+ messages in thread

* [PATCH 1/2] arm: Add define_attr to create a mapping between MVE predicated and unpredicated insns
@ 2023-06-15 11:47 Stamatis Markianos-Wright
  0 siblings, 0 replies; 7+ messages in thread
From: Stamatis Markianos-Wright @ 2023-06-15 11:47 UTC (permalink / raw)
  To: gcc-patches; +Cc: Kyrylo Tkachov, Richard Earnshaw, ramana.gcc, nickc

[-- Attachment #1: Type: text/plain, Size: 44151 bytes --]

     Hi all,

     I'd like to submit two patches that add support for Arm's MVE
     Tail Predicated Low Overhead Loop feature.

     --- Introduction ---

     The M-class Arm-ARM:
     https://developer.arm.com/documentation/ddi0553/bu/?lang=en
     Section B5.5.1 "Loop tail predication" describes the feature
     we are adding support for with this patch (although
     we only add codegen for DLSTP/LETP instruction loops).

     Previously with commit d2ed233cb94 we'd added support for
     non-MVE DLS/LE loops through the loop-doloop pass, which, given
     a standard MVE loop like:

     ```
     void  __attribute__ ((noinline)) test (int16_t *a, int16_t *b, int16_t *c, int n)
     {
       while (n > 0)
         {
           mve_pred16_t p = vctp16q (n);
           int16x8_t va = vldrhq_z_s16 (a, p);
           int16x8_t vb = vldrhq_z_s16 (b, p);
           int16x8_t vc = vaddq_x_s16 (va, vb, p);
           vstrhq_p_s16 (c, vc, p);
           c+=8;
           a+=8;
           b+=8;
           n-=8;
         }
     }
     ```
     .. would output:

     ```
             <pre-calculate the number of iterations and place it into lr>
             dls     lr, lr
     .L3:
             vctp.16 r3
             vmrs    ip, P0  @ movhi
             sxth    ip, ip
             vmsr     P0, ip @ movhi
             mov     r4, r0
             vpst
             vldrht.16       q2, [r4]
             mov     r4, r1
             vmov    q3, q0
             vpst
             vldrht.16       q1, [r4]
             mov     r4, r2
             vpst
             vaddt.i16       q3, q2, q1
             subs    r3, r3, #8
             vpst
             vstrht.16       q3, [r4]
             adds    r0, r0, #16
             adds    r1, r1, #16
             adds    r2, r2, #16
             le      lr, .L3
     ```

     where the LE instruction will decrement LR by 1, compare and
     branch if needed.

     (there are also other inefficiencies with the above code, like the
     pointless vmrs/sxth/vmsr on the VPR and the adds not being merged
     into the vldrht/vstrht as a #16 offsets and some random movs!
     But that's different problems...)

     The MVE version is similar, except that:
     * Instead of DLS/LE the instructions are DLSTP/LETP.
     * Instead of pre-calculating the number of iterations of the
       loop, we place the number of elements to be processed by the
       loop into LR.
     * Instead of decrementing the LR by one, LETP will decrement it
       by FPSCR.LTPSIZE, which is the number of elements being
       processed in each iteration: 16 for 8-bit elements, 8 for 16-bit
       elements, etc.
     * On the final iteration, automatic Loop Tail Predication is
       performed, as if the instructions within the loop had been VPT
       predicated with a VCTP generating the VPR predicate in every
       loop iteration.

     The dlstp/letp loop now looks like:

     ```
             <place n into r3>
             dlstp.16        lr, r3
     .L14:
             mov     r3, r0
             vldrh.16        q3, [r3]
             mov     r3, r1
             vldrh.16        q2, [r3]
             mov     r3, r2
             vadd.i16  q3, q3, q2
             adds    r0, r0, #16
             vstrh.16        q3, [r3]
             adds    r1, r1, #16
             adds    r2, r2, #16
             letp    lr, .L14

     ```

     Since the loop tail predication is automatic, we have eliminated
     the VCTP that had been specified by the user in the intrinsic
     and converted the VPT-predicated instructions into their
     unpredicated equivalents (which also saves us from VPST insns).

     The LETP instruction here decrements LR by 8 in each iteration.
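
     To make the element counting concrete, below is a minimal scalar
     sketch (illustrative only, not part of the patch) of what the
     dlstp/letp loop computes: exactly n elements, with any lanes beyond
     the remaining element count switched off by the automatic tail
     predication on the final iteration.

     ```
     #include <stdint.h>

     /* Illustrative scalar equivalent of the tail-predicated loop above.
        The vector loop runs (n + 7) / 8 iterations; on the last one the
        lanes past the remaining element count are disabled.  */
     void test_scalar_equiv (int16_t *a, int16_t *b, int16_t *c, int n)
     {
       for (int i = 0; i < n; i++)
         c[i] = a[i] + b[i];
     }
     ```

     For example, with n = 20 the loop runs three iterations, processing
     8, 8 and then 4 elements, with LETP taking LR from 20 to 12 to 4
     before falling through.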

     --- This 1/2 patch ---

     This first patch lays some groundwork by adding an attribute to
     md patterns, and then the second patch contains the functional
     changes.

     One major difficulty in implementing MVE Tail-Predicated Low
     Overhead Loops was the need to transform VPT-predicated insns
     in the insn chain into their unpredicated equivalents, like:
     `mve_vldrbq_z_<supf><mode> -> mve_vldrbq_<supf><mode>`.

     This requires us to have a deterministic link between two
     different patterns in mve.md -- this _could_ be done by
     re-ordering the entirety of mve.md such that the patterns are
     at some constant icode proximity (e.g. having the _z immediately
     after the unpredicated version would mean that to map from the
     former to the latter you could use icode-1), but that would be a
     messy solution, introducing fragile, hidden dependencies on the
     ordering of the patterns.

     This patch provides an alternative way of doing that: using an insn
     attribute to encode the icode of the unpredicated instruction.
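
     Concretely, every predicated pattern in the attached patch now sets
     the new "mve_unpredicated_insn" attribute to the CODE_FOR_* symbol
     of its unpredicated counterpart.  A rough sketch of how a later pass
     can use that mapping follows (the helper function is made up here
     for illustration; only recog_memoized, CODE_FOR_nothing and the
     generated get_attr_mve_unpredicated_insn accessor are existing
     interfaces, and the attribute's default value is assumed to be
     CODE_FOR_nothing):

     ```
     /* Sketch only: recover the icode of the unpredicated form of a
        VPT-predicated insn through the new attribute.  */
     static bool
     mve_get_unpredicated_icode (rtx_insn *insn, int *icode_out)
     {
       if (recog_memoized (insn) < 0)
         return false;
       int unpred = get_attr_mve_unpredicated_insn (insn);
       if (unpred == CODE_FOR_nothing)
         return false;           /* No unpredicated twin recorded.  */
       *icode_out = unpred;      /* E.g. CODE_FOR_mve_vstrhq_uv8hi.  */
       return true;
     }
     ```

     The MVE_VPT_*_INSN_P macros listed in the ChangeLog below package
     this kind of check for the second patch to use.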

     No regressions on arm-none-eabi with an MVE target.

     Thank you,
     Stam Markianos-Wright

     gcc/ChangeLog:

             * config/arm/arm.md (mve_unpredicated_insn): New attribute.
             * config/arm/arm.h (MVE_VPT_PREDICATED_INSN_P): New define.
             (MVE_VPT_UNPREDICATED_INSN_P): Likewise.
             (MVE_VPT_PREDICABLE_INSN_P): Likewise.
             * config/arm/vec-common.md (mve_vshlq_<supf><mode>): Add 
             attribute.
             * config/arm/mve.md (arm_vcx1q<a>_p_v16qi): Add attribute.
             (arm_vcx1q<a>v16qi): Likewise.
             (arm_vcx1qav16qi): Likewise.
             (arm_vcx1qv16qi): Likewise.
             (arm_vcx2q<a>_p_v16qi): Likewise.
             (arm_vcx2q<a>v16qi): Likewise.
             (arm_vcx2qav16qi): Likewise.
             (arm_vcx2qv16qi): Likewise.
             (arm_vcx3q<a>_p_v16qi): Likewise.
             (arm_vcx3q<a>v16qi): Likewise.
             (arm_vcx3qav16qi): Likewise.
             (arm_vcx3qv16qi): Likewise.
             (mve_vabavq_<supf><mode>): Likewise.
             (mve_vabavq_p_<supf><mode>): Likewise.
             (mve_vabdq_<supf><mode>): Likewise.
             (mve_vabdq_f<mode>): Likewise.
             (mve_vabdq_m_<supf><mode>): Likewise.
             (mve_vabdq_m_f<mode>): Likewise.
             (mve_vabsq_f<mode>): Likewise.
             (mve_vabsq_m_f<mode>): Likewise.
             (mve_vabsq_m_s<mode>): Likewise.
             (mve_vabsq_s<mode>): Likewise.
             (mve_vadciq_<supf>v4si): Likewise.
             (mve_vadciq_m_<supf>v4si): Likewise.
             (mve_vadcq_<supf>v4si): Likewise.
             (mve_vadcq_m_<supf>v4si): Likewise.
             (mve_vaddlvaq_<supf>v4si): Likewise.
             (mve_vaddlvaq_p_<supf>v4si): Likewise.
             (mve_vaddlvq_<supf>v4si): Likewise.
             (mve_vaddlvq_p_<supf>v4si): Likewise.
             (mve_vaddq_f<mode>): Likewise.
             (mve_vaddq_m_<supf><mode>): Likewise.
             (mve_vaddq_m_f<mode>): Likewise.
             (mve_vaddq_m_n_<supf><mode>): Likewise.
             (mve_vaddq_m_n_f<mode>): Likewise.
             (mve_vaddq_n_<supf><mode>): Likewise.
             (mve_vaddq_n_f<mode>): Likewise.
             (mve_vaddq<mode>): Likewise.
             (mve_vaddvaq_<supf><mode>): Likewise.
             (mve_vaddvaq_p_<supf><mode>): Likewise.
             (mve_vaddvq_<supf><mode>): Likewise.
             (mve_vaddvq_p_<supf><mode>): Likewise.
             (mve_vandq_<supf><mode>): Likewise.
             (mve_vandq_f<mode>): Likewise.
             (mve_vandq_m_<supf><mode>): Likewise.
             (mve_vandq_m_f<mode>): Likewise.
             (mve_vandq_s<mode>): Likewise.
             (mve_vandq_u<mode>): Likewise.
             (mve_vbicq_<supf><mode>): Likewise.
             (mve_vbicq_f<mode>): Likewise.
             (mve_vbicq_m_<supf><mode>): Likewise.
             (mve_vbicq_m_f<mode>): Likewise.
             (mve_vbicq_m_n_<supf><mode>): Likewise.
             (mve_vbicq_n_<supf><mode>): Likewise.
             (mve_vbicq_s<mode>): Likewise.
             (mve_vbicq_u<mode>): Likewise.
             (mve_vbrsrq_m_n_<supf><mode>): Likewise.
             (mve_vbrsrq_m_n_f<mode>): Likewise.
             (mve_vbrsrq_n_<supf><mode>): Likewise.
             (mve_vbrsrq_n_f<mode>): Likewise.
             (mve_vcaddq_rot270_m_<supf><mode>): Likewise.
             (mve_vcaddq_rot270_m_f<mode>): Likewise.
             (mve_vcaddq_rot270<mode>): Likewise.
             (mve_vcaddq_rot270<mode>): Likewise.
             (mve_vcaddq_rot90_m_<supf><mode>): Likewise.
             (mve_vcaddq_rot90_m_f<mode>): Likewise.
             (mve_vcaddq_rot90<mode>): Likewise.
             (mve_vcaddq_rot90<mode>): Likewise.
             (mve_vcaddq<mve_rot><mode>): Likewise.
             (mve_vcaddq<mve_rot><mode>): Likewise.
             (mve_vclsq_m_s<mode>): Likewise.
             (mve_vclsq_s<mode>): Likewise.
             (mve_vclzq_<supf><mode>): Likewise.
             (mve_vclzq_m_<supf><mode>): Likewise.
             (mve_vclzq_s<mode>): Likewise.
             (mve_vclzq_u<mode>): Likewise.
             (mve_vcmlaq_m_f<mode>): Likewise.
             (mve_vcmlaq_rot180_m_f<mode>): Likewise.
             (mve_vcmlaq_rot180<mode>): Likewise.
             (mve_vcmlaq_rot270_m_f<mode>): Likewise.
             (mve_vcmlaq_rot270<mode>): Likewise.
             (mve_vcmlaq_rot90_m_f<mode>): Likewise.
             (mve_vcmlaq_rot90<mode>): Likewise.
             (mve_vcmlaq<mode>): Likewise.
             (mve_vcmlaq<mve_rot><mode>): Likewise.
             (mve_vcmp<mve_cmp_op>q_<mode>): Likewise.
             (mve_vcmp<mve_cmp_op>q_f<mode>): Likewise.
             (mve_vcmp<mve_cmp_op>q_n_<mode>): Likewise.
             (mve_vcmp<mve_cmp_op>q_n_f<mode>): Likewise.
             (mve_vcmpcsq_<mode>): Likewise.
             (mve_vcmpcsq_m_n_u<mode>): Likewise.
             (mve_vcmpcsq_m_u<mode>): Likewise.
             (mve_vcmpcsq_n_<mode>): Likewise.
             (mve_vcmpeqq_<mode>): Likewise.
             (mve_vcmpeqq_f<mode>): Likewise.
             (mve_vcmpeqq_m_<supf><mode>): Likewise.
             (mve_vcmpeqq_m_f<mode>): Likewise.
             (mve_vcmpeqq_m_n_<supf><mode>): Likewise.
             (mve_vcmpeqq_m_n_f<mode>): Likewise.
             (mve_vcmpeqq_n_<mode>): Likewise.
             (mve_vcmpeqq_n_f<mode>): Likewise.
             (mve_vcmpgeq_<mode>): Likewise.
             (mve_vcmpgeq_f<mode>): Likewise.
             (mve_vcmpgeq_m_f<mode>): Likewise.
             (mve_vcmpgeq_m_n_f<mode>): Likewise.
             (mve_vcmpgeq_m_n_s<mode>): Likewise.
             (mve_vcmpgeq_m_s<mode>): Likewise.
             (mve_vcmpgeq_n_<mode>): Likewise.
             (mve_vcmpgeq_n_f<mode>): Likewise.
             (mve_vcmpgtq_<mode>): Likewise.
             (mve_vcmpgtq_f<mode>): Likewise.
             (mve_vcmpgtq_m_f<mode>): Likewise.
             (mve_vcmpgtq_m_n_f<mode>): Likewise.
             (mve_vcmpgtq_m_n_s<mode>): Likewise.
             (mve_vcmpgtq_m_s<mode>): Likewise.
             (mve_vcmpgtq_n_<mode>): Likewise.
             (mve_vcmpgtq_n_f<mode>): Likewise.
             (mve_vcmphiq_<mode>): Likewise.
             (mve_vcmphiq_m_n_u<mode>): Likewise.
             (mve_vcmphiq_m_u<mode>): Likewise.
             (mve_vcmphiq_n_<mode>): Likewise.
             (mve_vcmpleq_<mode>): Likewise.
             (mve_vcmpleq_f<mode>): Likewise.
             (mve_vcmpleq_m_f<mode>): Likewise.
             (mve_vcmpleq_m_n_f<mode>): Likewise.
             (mve_vcmpleq_m_n_s<mode>): Likewise.
             (mve_vcmpleq_m_s<mode>): Likewise.
             (mve_vcmpleq_n_<mode>): Likewise.
             (mve_vcmpleq_n_f<mode>): Likewise.
             (mve_vcmpltq_<mode>): Likewise.
             (mve_vcmpltq_f<mode>): Likewise.
             (mve_vcmpltq_m_f<mode>): Likewise.
             (mve_vcmpltq_m_n_f<mode>): Likewise.
             (mve_vcmpltq_m_n_s<mode>): Likewise.
             (mve_vcmpltq_m_s<mode>): Likewise.
             (mve_vcmpltq_n_<mode>): Likewise.
             (mve_vcmpltq_n_f<mode>): Likewise.
             (mve_vcmpneq_<mode>): Likewise.
             (mve_vcmpneq_f<mode>): Likewise.
             (mve_vcmpneq_m_<supf><mode>): Likewise.
             (mve_vcmpneq_m_f<mode>): Likewise.
             (mve_vcmpneq_m_n_<supf><mode>): Likewise.
             (mve_vcmpneq_m_n_f<mode>): Likewise.
             (mve_vcmpneq_n_<mode>): Likewise.
             (mve_vcmpneq_n_f<mode>): Likewise.
             (mve_vcmulq_m_f<mode>): Likewise.
             (mve_vcmulq_rot180_m_f<mode>): Likewise.
             (mve_vcmulq_rot180<mode>): Likewise.
             (mve_vcmulq_rot270_m_f<mode>): Likewise.
             (mve_vcmulq_rot270<mode>): Likewise.
             (mve_vcmulq_rot90_m_f<mode>): Likewise.
             (mve_vcmulq_rot90<mode>): Likewise.
             (mve_vcmulq<mode>): Likewise.
             (mve_vcmulq<mve_rot><mode>): Likewise.
             (mve_vctp<mode1>q_mhi): Likewise.
             (mve_vctp<mode1>qhi): Likewise.
             (mve_vcvtaq_<supf><mode>): Likewise.
             (mve_vcvtaq_m_<supf><mode>): Likewise.
             (mve_vcvtbq_f16_f32v8hf): Likewise.
             (mve_vcvtbq_f32_f16v4sf): Likewise.
             (mve_vcvtbq_m_f16_f32v8hf): Likewise.
             (mve_vcvtbq_m_f32_f16v4sf): Likewise.
             (mve_vcvtmq_<supf><mode>): Likewise.
             (mve_vcvtmq_m_<supf><mode>): Likewise.
             (mve_vcvtnq_<supf><mode>): Likewise.
             (mve_vcvtnq_m_<supf><mode>): Likewise.
             (mve_vcvtpq_<supf><mode>): Likewise.
             (mve_vcvtpq_m_<supf><mode>): Likewise.
             (mve_vcvtq_from_f_<supf><mode>): Likewise.
             (mve_vcvtq_m_from_f_<supf><mode>): Likewise.
             (mve_vcvtq_m_n_from_f_<supf><mode>): Likewise.
             (mve_vcvtq_m_n_to_f_<supf><mode>): Likewise.
             (mve_vcvtq_m_to_f_<supf><mode>): Likewise.
             (mve_vcvtq_n_from_f_<supf><mode>): Likewise.
             (mve_vcvtq_n_to_f_<supf><mode>): Likewise.
             (mve_vcvtq_to_f_<supf><mode>): Likewise.
             (mve_vcvttq_f16_f32v8hf): Likewise.
             (mve_vcvttq_f32_f16v4sf): Likewise.
             (mve_vcvttq_m_f16_f32v8hf): Likewise.
             (mve_vcvttq_m_f32_f16v4sf): Likewise.
             (mve_vddupq_m_wb_u<mode>_insn): Likewise.
             (mve_vddupq_u<mode>_insn): Likewise.
             (mve_vdupq_m_n_<supf><mode>): Likewise.
             (mve_vdupq_m_n_f<mode>): Likewise.
             (mve_vdupq_n_<supf><mode>): Likewise.
             (mve_vdupq_n_f<mode>): Likewise.
             (mve_vdwdupq_m_wb_u<mode>_insn): Likewise.
             (mve_vdwdupq_wb_u<mode>_insn): Likewise.
             (mve_veorq_<supf><mode>): Likewise.
             (mve_veorq_f<mode>): Likewise.
             (mve_veorq_m_<supf><mode>): Likewise.
             (mve_veorq_m_f<mode>): Likewise.
             (mve_veorq_s<mode>): Likewise.
             (mve_veorq_u<mode>): Likewise.
             (mve_vfmaq_f<mode>): Likewise.
             (mve_vfmaq_m_f<mode>): Likewise.
             (mve_vfmaq_m_n_f<mode>): Likewise.
             (mve_vfmaq_n_f<mode>): Likewise.
             (mve_vfmasq_m_n_f<mode>): Likewise.
             (mve_vfmasq_n_f<mode>): Likewise.
             (mve_vfmsq_f<mode>): Likewise.
             (mve_vfmsq_m_f<mode>): Likewise.
             (mve_vhaddq_<supf><mode>): Likewise.
             (mve_vhaddq_m_<supf><mode>): Likewise.
             (mve_vhaddq_m_n_<supf><mode>): Likewise.
             (mve_vhaddq_n_<supf><mode>): Likewise.
             (mve_vhcaddq_rot270_m_s<mode>): Likewise.
             (mve_vhcaddq_rot270_s<mode>): Likewise.
             (mve_vhcaddq_rot90_m_s<mode>): Likewise.
             (mve_vhcaddq_rot90_s<mode>): Likewise.
             (mve_vhsubq_<supf><mode>): Likewise.
             (mve_vhsubq_m_<supf><mode>): Likewise.
             (mve_vhsubq_m_n_<supf><mode>): Likewise.
             (mve_vhsubq_n_<supf><mode>): Likewise.
             (mve_vidupq_m_wb_u<mode>_insn): Likewise.
             (mve_vidupq_u<mode>_insn): Likewise.
             (mve_viwdupq_m_wb_u<mode>_insn): Likewise.
             (mve_viwdupq_wb_u<mode>_insn): Likewise.
             (mve_vldrbq_<supf><mode>): Likewise.
             (mve_vldrbq_gather_offset_<supf><mode>): Likewise.
             (mve_vldrbq_gather_offset_z_<supf><mode>): Likewise.
             (mve_vldrbq_z_<supf><mode>): Likewise.
             (mve_vldrdq_gather_base_<supf>v2di): Likewise.
             (mve_vldrdq_gather_base_wb_<supf>v2di_insn): Likewise.
             (mve_vldrdq_gather_base_wb_z_<supf>v2di_insn): Likewise.
             (mve_vldrdq_gather_base_z_<supf>v2di): Likewise.
             (mve_vldrdq_gather_offset_<supf>v2di): Likewise.
             (mve_vldrdq_gather_offset_z_<supf>v2di): Likewise.
             (mve_vldrdq_gather_shifted_offset_<supf>v2di): Likewise.
             (mve_vldrdq_gather_shifted_offset_z_<supf>v2di): Likewise.
             (mve_vldrhq_<supf><mode>): Likewise.
             (mve_vldrhq_fv8hf): Likewise.
             (mve_vldrhq_gather_offset_<supf><mode>): Likewise.
             (mve_vldrhq_gather_offset_fv8hf): Likewise.
             (mve_vldrhq_gather_offset_z_<supf><mode>): Likewise.
             (mve_vldrhq_gather_offset_z_fv8hf): Likewise.
             (mve_vldrhq_gather_shifted_offset_<supf><mode>): Likewise.
             (mve_vldrhq_gather_shifted_offset_fv8hf): Likewise.
             (mve_vldrhq_gather_shifted_offset_z_<supf><mode>): Likewise.
             (mve_vldrhq_gather_shifted_offset_z_fv8hf): Likewise.
             (mve_vldrhq_z_<supf><mode>): Likewise.
             (mve_vldrhq_z_fv8hf): Likewise.
             (mve_vldrwq_<supf>v4si): Likewise.
             (mve_vldrwq_fv4sf): Likewise.
             (mve_vldrwq_gather_base_<supf>v4si): Likewise.
             (mve_vldrwq_gather_base_fv4sf): Likewise.
             (mve_vldrwq_gather_base_wb_<supf>v4si_insn): Likewise.
             (mve_vldrwq_gather_base_wb_fv4sf_insn): Likewise.
             (mve_vldrwq_gather_base_wb_z_<supf>v4si_insn): Likewise.
             (mve_vldrwq_gather_base_wb_z_fv4sf_insn): Likewise.
             (mve_vldrwq_gather_base_z_<supf>v4si): Likewise.
             (mve_vldrwq_gather_base_z_fv4sf): Likewise.
             (mve_vldrwq_gather_offset_<supf>v4si): Likewise.
             (mve_vldrwq_gather_offset_fv4sf): Likewise.
             (mve_vldrwq_gather_offset_z_<supf>v4si): Likewise.
             (mve_vldrwq_gather_offset_z_fv4sf): Likewise.
             (mve_vldrwq_gather_shifted_offset_<supf>v4si): Likewise.
             (mve_vldrwq_gather_shifted_offset_fv4sf): Likewise.
             (mve_vldrwq_gather_shifted_offset_z_<supf>v4si): Likewise.
             (mve_vldrwq_gather_shifted_offset_z_fv4sf): Likewise.
             (mve_vldrwq_z_<supf>v4si): Likewise.
             (mve_vldrwq_z_fv4sf): Likewise.
             (mve_vmaxaq_m_s<mode>): Likewise.
             (mve_vmaxaq_s<mode>): Likewise.
             (mve_vmaxavq_p_s<mode>): Likewise.
             (mve_vmaxavq_s<mode>): Likewise.
             (mve_vmaxnmaq_f<mode>): Likewise.
             (mve_vmaxnmaq_m_f<mode>): Likewise.
             (mve_vmaxnmavq_f<mode>): Likewise.
             (mve_vmaxnmavq_p_f<mode>): Likewise.
             (mve_vmaxnmq_f<mode>): Likewise.
             (mve_vmaxnmq_m_f<mode>): Likewise.
             (mve_vmaxnmvq_f<mode>): Likewise.
             (mve_vmaxnmvq_p_f<mode>): Likewise.
             (mve_vmaxq_<supf><mode>): Likewise.
             (mve_vmaxq_m_<supf><mode>): Likewise.
             (mve_vmaxq_s<mode>): Likewise.
             (mve_vmaxq_u<mode>): Likewise.
             (mve_vmaxvq_<supf><mode>): Likewise.
             (mve_vmaxvq_p_<supf><mode>): Likewise.
             (mve_vminaq_m_s<mode>): Likewise.
             (mve_vminaq_s<mode>): Likewise.
             (mve_vminavq_p_s<mode>): Likewise.
             (mve_vminavq_s<mode>): Likewise.
             (mve_vminnmaq_f<mode>): Likewise.
             (mve_vminnmaq_m_f<mode>): Likewise.
             (mve_vminnmavq_f<mode>): Likewise.
             (mve_vminnmavq_p_f<mode>): Likewise.
             (mve_vminnmq_f<mode>): Likewise.
             (mve_vminnmq_m_f<mode>): Likewise.
             (mve_vminnmvq_f<mode>): Likewise.
             (mve_vminnmvq_p_f<mode>): Likewise.
             (mve_vminq_<supf><mode>): Likewise.
             (mve_vminq_m_<supf><mode>): Likewise.
             (mve_vminq_s<mode>): Likewise.
             (mve_vminq_u<mode>): Likewise.
             (mve_vminvq_<supf><mode>): Likewise.
             (mve_vminvq_p_<supf><mode>): Likewise.
             (mve_vmladavaq_<supf><mode>): Likewise.
             (mve_vmladavaq_p_<supf><mode>): Likewise.
             (mve_vmladavaxq_p_s<mode>): Likewise.
             (mve_vmladavaxq_s<mode>): Likewise.
             (mve_vmladavq_<supf><mode>): Likewise.
             (mve_vmladavq_p_<supf><mode>): Likewise.
             (mve_vmladavxq_p_s<mode>): Likewise.
             (mve_vmladavxq_s<mode>): Likewise.
             (mve_vmlaldavaq_<supf><mode>): Likewise.
             (mve_vmlaldavaq_p_<supf><mode>): Likewise.
             (mve_vmlaldavaxq_<supf><mode>): Likewise.
             (mve_vmlaldavaxq_p_<supf><mode>): Likewise.
             (mve_vmlaldavaxq_s<mode>): Likewise.
             (mve_vmlaldavq_<supf><mode>): Likewise.
             (mve_vmlaldavq_p_<supf><mode>): Likewise.
             (mve_vmlaldavxq_p_s<mode>): Likewise.
             (mve_vmlaldavxq_s<mode>): Likewise.
             (mve_vmlaq_m_n_<supf><mode>): Likewise.
             (mve_vmlaq_n_<supf><mode>): Likewise.
             (mve_vmlasq_m_n_<supf><mode>): Likewise.
             (mve_vmlasq_n_<supf><mode>): Likewise.
             (mve_vmlsdavaq_p_s<mode>): Likewise.
             (mve_vmlsdavaq_s<mode>): Likewise.
             (mve_vmlsdavaxq_p_s<mode>): Likewise.
             (mve_vmlsdavaxq_s<mode>): Likewise.
             (mve_vmlsdavq_p_s<mode>): Likewise.
             (mve_vmlsdavq_s<mode>): Likewise.
             (mve_vmlsdavxq_p_s<mode>): Likewise.
             (mve_vmlsdavxq_s<mode>): Likewise.
             (mve_vmlsldavaq_p_s<mode>): Likewise.
             (mve_vmlsldavaq_s<mode>): Likewise.
             (mve_vmlsldavaxq_p_s<mode>): Likewise.
             (mve_vmlsldavaxq_s<mode>): Likewise.
             (mve_vmlsldavq_p_s<mode>): Likewise.
             (mve_vmlsldavq_s<mode>): Likewise.
             (mve_vmlsldavxq_p_s<mode>): Likewise.
             (mve_vmlsldavxq_s<mode>): Likewise.
             (mve_vmovlbq_<supf><mode>): Likewise.
             (mve_vmovlbq_m_<supf><mode>): Likewise.
             (mve_vmovltq_<supf><mode>): Likewise.
             (mve_vmovltq_m_<supf><mode>): Likewise.
             (mve_vmovnbq_<supf><mode>): Likewise.
             (mve_vmovnbq_m_<supf><mode>): Likewise.
             (mve_vmovntq_<supf><mode>): Likewise.
             (mve_vmovntq_m_<supf><mode>): Likewise.
             (mve_vmulhq_<supf><mode>): Likewise.
             (mve_vmulhq_m_<supf><mode>): Likewise.
             (mve_vmullbq_int_<supf><mode>): Likewise.
             (mve_vmullbq_int_m_<supf><mode>): Likewise.
             (mve_vmullbq_poly_m_p<mode>): Likewise.
             (mve_vmullbq_poly_p<mode>): Likewise.
             (mve_vmulltq_int_<supf><mode>): Likewise.
             (mve_vmulltq_int_m_<supf><mode>): Likewise.
             (mve_vmulltq_poly_m_p<mode>): Likewise.
             (mve_vmulltq_poly_p<mode>): Likewise.
             (mve_vmulq_<supf><mode>): Likewise.
             (mve_vmulq_f<mode>): Likewise.
             (mve_vmulq_m_<supf><mode>): Likewise.
             (mve_vmulq_m_f<mode>): Likewise.
             (mve_vmulq_m_n_<supf><mode>): Likewise.
             (mve_vmulq_m_n_f<mode>): Likewise.
             (mve_vmulq_n_<supf><mode>): Likewise.
             (mve_vmulq_n_f<mode>): Likewise.
             (mve_vmvnq_<supf><mode>): Likewise.
             (mve_vmvnq_m_<supf><mode>): Likewise.
             (mve_vmvnq_m_n_<supf><mode>): Likewise.
             (mve_vmvnq_n_<supf><mode>): Likewise.
             (mve_vmvnq_s<mode>): Likewise.
             (mve_vmvnq_u<mode>): Likewise.
             (mve_vnegq_f<mode>): Likewise.
             (mve_vnegq_m_f<mode>): Likewise.
             (mve_vnegq_m_s<mode>): Likewise.
             (mve_vnegq_s<mode>): Likewise.
             (mve_vornq_<supf><mode>): Likewise.
             (mve_vornq_f<mode>): Likewise.
             (mve_vornq_m_<supf><mode>): Likewise.
             (mve_vornq_m_f<mode>): Likewise.
             (mve_vornq_s<mode>): Likewise.
             (mve_vornq_u<mode>): Likewise.
             (mve_vorrq_<supf><mode>): Likewise.
             (mve_vorrq_f<mode>): Likewise.
             (mve_vorrq_m_<supf><mode>): Likewise.
             (mve_vorrq_m_f<mode>): Likewise.
             (mve_vorrq_m_n_<supf><mode>): Likewise.
             (mve_vorrq_n_<supf><mode>): Likewise.
             (mve_vorrq_s<mode>): Likewise.
             (mve_vorrq_s<mode>): Likewise.
             (mve_vqabsq_m_s<mode>): Likewise.
             (mve_vqabsq_s<mode>): Likewise.
             (mve_vqaddq_<supf><mode>): Likewise.
             (mve_vqaddq_m_<supf><mode>): Likewise.
             (mve_vqaddq_m_n_<supf><mode>): Likewise.
             (mve_vqaddq_n_<supf><mode>): Likewise.
             (mve_vqdmladhq_m_s<mode>): Likewise.
             (mve_vqdmladhq_s<mode>): Likewise.
             (mve_vqdmladhxq_m_s<mode>): Likewise.
             (mve_vqdmladhxq_s<mode>): Likewise.
             (mve_vqdmlahq_m_n_s<mode>): Likewise.
             (mve_vqdmlahq_n_<supf><mode>): Likewise.
             (mve_vqdmlahq_n_s<mode>): Likewise.
             (mve_vqdmlashq_m_n_s<mode>): Likewise.
             (mve_vqdmlashq_n_<supf><mode>): Likewise.
             (mve_vqdmlashq_n_s<mode>): Likewise.
             (mve_vqdmlsdhq_m_s<mode>): Likewise.
             (mve_vqdmlsdhq_s<mode>): Likewise.
             (mve_vqdmlsdhxq_m_s<mode>): Likewise.
             (mve_vqdmlsdhxq_s<mode>): Likewise.
             (mve_vqdmulhq_m_n_s<mode>): Likewise.
             (mve_vqdmulhq_m_s<mode>): Likewise.
             (mve_vqdmulhq_n_s<mode>): Likewise.
             (mve_vqdmulhq_s<mode>): Likewise.
             (mve_vqdmullbq_m_n_s<mode>): Likewise.
             (mve_vqdmullbq_m_s<mode>): Likewise.
             (mve_vqdmullbq_n_s<mode>): Likewise.
             (mve_vqdmullbq_s<mode>): Likewise.
             (mve_vqdmulltq_m_n_s<mode>): Likewise.
             (mve_vqdmulltq_m_s<mode>): Likewise.
             (mve_vqdmulltq_n_s<mode>): Likewise.
             (mve_vqdmulltq_s<mode>): Likewise.
             (mve_vqmovnbq_<supf><mode>): Likewise.
             (mve_vqmovnbq_m_<supf><mode>): Likewise.
             (mve_vqmovntq_<supf><mode>): Likewise.
             (mve_vqmovntq_m_<supf><mode>): Likewise.
             (mve_vqmovunbq_m_s<mode>): Likewise.
             (mve_vqmovunbq_s<mode>): Likewise.
             (mve_vqmovuntq_m_s<mode>): Likewise.
             (mve_vqmovuntq_s<mode>): Likewise.
             (mve_vqnegq_m_s<mode>): Likewise.
             (mve_vqnegq_s<mode>): Likewise.
             (mve_vqrdmladhq_m_s<mode>): Likewise.
             (mve_vqrdmladhq_s<mode>): Likewise.
             (mve_vqrdmladhxq_m_s<mode>): Likewise.
             (mve_vqrdmladhxq_s<mode>): Likewise.
             (mve_vqrdmlahq_m_n_s<mode>): Likewise.
             (mve_vqrdmlahq_n_<supf><mode>): Likewise.
             (mve_vqrdmlahq_n_s<mode>): Likewise.
             (mve_vqrdmlashq_m_n_s<mode>): Likewise.
             (mve_vqrdmlashq_n_<supf><mode>): Likewise.
             (mve_vqrdmlashq_n_s<mode>): Likewise.
             (mve_vqrdmlsdhq_m_s<mode>): Likewise.
             (mve_vqrdmlsdhq_s<mode>): Likewise.
             (mve_vqrdmlsdhxq_m_s<mode>): Likewise.
             (mve_vqrdmlsdhxq_s<mode>): Likewise.
             (mve_vqrdmulhq_m_n_s<mode>): Likewise.
             (mve_vqrdmulhq_m_s<mode>): Likewise.
             (mve_vqrdmulhq_n_s<mode>): Likewise.
             (mve_vqrdmulhq_s<mode>): Likewise.
             (mve_vqrshlq_<supf><mode>): Likewise.
             (mve_vqrshlq_m_<supf><mode>): Likewise.
             (mve_vqrshlq_m_n_<supf><mode>): Likewise.
             (mve_vqrshlq_n_<supf><mode>): Likewise.
             (mve_vqrshrnbq_m_n_<supf><mode>): Likewise.
             (mve_vqrshrnbq_n_<supf><mode>): Likewise.
             (mve_vqrshrntq_m_n_<supf><mode>): Likewise.
             (mve_vqrshrntq_n_<supf><mode>): Likewise.
             (mve_vqrshrunbq_m_n_s<mode>): Likewise.
             (mve_vqrshrunbq_n_s<mode>): Likewise.
             (mve_vqrshruntq_m_n_s<mode>): Likewise.
             (mve_vqrshruntq_n_s<mode>): Likewise.
             (mve_vqshlq_<supf><mode>): Likewise.
             (mve_vqshlq_m_<supf><mode>): Likewise.
             (mve_vqshlq_m_n_<supf><mode>): Likewise.
             (mve_vqshlq_m_r_<supf><mode>): Likewise.
             (mve_vqshlq_n_<supf><mode>): Likewise.
             (mve_vqshlq_r_<supf><mode>): Likewise.
             (mve_vqshluq_m_n_s<mode>): Likewise.
             (mve_vqshluq_n_s<mode>): Likewise.
             (mve_vqshrnbq_m_n_<supf><mode>): Likewise.
             (mve_vqshrnbq_n_<supf><mode>): Likewise.
             (mve_vqshrntq_m_n_<supf><mode>): Likewise.
             (mve_vqshrntq_n_<supf><mode>): Likewise.
             (mve_vqshrunbq_m_n_s<mode>): Likewise.
             (mve_vqshrunbq_n_s<mode>): Likewise.
             (mve_vqshruntq_m_n_s<mode>): Likewise.
             (mve_vqshruntq_n_s<mode>): Likewise.
             (mve_vqsubq_<supf><mode>): Likewise.
             (mve_vqsubq_m_<supf><mode>): Likewise.
             (mve_vqsubq_m_n_<supf><mode>): Likewise.
             (mve_vqsubq_n_<supf><mode>): Likewise.
             (mve_vrev16q_<supf>v16qi): Likewise.
             (mve_vrev16q_m_<supf>v16qi): Likewise.
             (mve_vrev32q_<supf><mode>): Likewise.
             (mve_vrev32q_fv8hf): Likewise.
             (mve_vrev32q_m_<supf><mode>): Likewise.
             (mve_vrev32q_m_fv8hf): Likewise.
             (mve_vrev64q_<supf><mode>): Likewise.
             (mve_vrev64q_f<mode>): Likewise.
             (mve_vrev64q_m_<supf><mode>): Likewise.
             (mve_vrev64q_m_f<mode>): Likewise.
             (mve_vrhaddq_<supf><mode>): Likewise.
             (mve_vrhaddq_m_<supf><mode>): Likewise.
             (mve_vrmlaldavhaq_<supf>v4si): Likewise.
             (mve_vrmlaldavhaq_p_sv4si): Likewise.
             (mve_vrmlaldavhaq_p_uv4si): Likewise.
             (mve_vrmlaldavhaq_sv4si): Likewise.
             (mve_vrmlaldavhaq_uv4si): Likewise.
             (mve_vrmlaldavhaxq_p_sv4si): Likewise.
             (mve_vrmlaldavhaxq_sv4si): Likewise.
             (mve_vrmlaldavhq_<supf>v4si): Likewise.
             (mve_vrmlaldavhq_p_<supf>v4si): Likewise.
             (mve_vrmlaldavhxq_p_sv4si): Likewise.
             (mve_vrmlaldavhxq_sv4si): Likewise.
             (mve_vrmlsldavhaq_p_sv4si): Likewise.
             (mve_vrmlsldavhaq_sv4si): Likewise.
             (mve_vrmlsldavhaxq_p_sv4si): Likewise.
             (mve_vrmlsldavhaxq_sv4si): Likewise.
             (mve_vrmlsldavhq_p_sv4si): Likewise.
             (mve_vrmlsldavhq_sv4si): Likewise.
             (mve_vrmlsldavhxq_p_sv4si): Likewise.
             (mve_vrmlsldavhxq_sv4si): Likewise.
             (mve_vrmulhq_<supf><mode>): Likewise.
             (mve_vrmulhq_m_<supf><mode>): Likewise.
             (mve_vrndaq_f<mode>): Likewise.
             (mve_vrndaq_m_f<mode>): Likewise.
             (mve_vrndmq_f<mode>): Likewise.
             (mve_vrndmq_m_f<mode>): Likewise.
             (mve_vrndnq_f<mode>): Likewise.
             (mve_vrndnq_m_f<mode>): Likewise.
             (mve_vrndpq_f<mode>): Likewise.
             (mve_vrndpq_m_f<mode>): Likewise.
             (mve_vrndq_f<mode>): Likewise.
             (mve_vrndq_m_f<mode>): Likewise.
             (mve_vrndxq_f<mode>): Likewise.
             (mve_vrndxq_m_f<mode>): Likewise.
             (mve_vrshlq_<supf><mode>): Likewise.
             (mve_vrshlq_m_<supf><mode>): Likewise.
             (mve_vrshlq_m_n_<supf><mode>): Likewise.
             (mve_vrshlq_n_<supf><mode>): Likewise.
             (mve_vrshrnbq_m_n_<supf><mode>): Likewise.
             (mve_vrshrnbq_n_<supf><mode>): Likewise.
             (mve_vrshrntq_m_n_<supf><mode>): Likewise.
             (mve_vrshrntq_n_<supf><mode>): Likewise.
             (mve_vrshrq_m_n_<supf><mode>): Likewise.
             (mve_vrshrq_n_<supf><mode>): Likewise.
             (mve_vsbciq_<supf>v4si): Likewise.
             (mve_vsbciq_m_<supf>v4si): Likewise.
             (mve_vsbcq_<supf>v4si): Likewise.
             (mve_vsbcq_m_<supf>v4si): Likewise.
             (mve_vshlcq_<supf><mode>): Likewise.
             (mve_vshlcq_m_<supf><mode>): Likewise.
             (mve_vshllbq_m_n_<supf><mode>): Likewise.
             (mve_vshllbq_n_<supf><mode>): Likewise.
             (mve_vshlltq_m_n_<supf><mode>): Likewise.
             (mve_vshlltq_n_<supf><mode>): Likewise.
             (mve_vshlq_<supf><mode>): Likewise.
             (mve_vshlq_<supf><mode>): Likewise.
             (mve_vshlq_m_<supf><mode>): Likewise.
             (mve_vshlq_m_n_<supf><mode>): Likewise.
             (mve_vshlq_m_r_<supf><mode>): Likewise.
             (mve_vshlq_n_<supf><mode>): Likewise.
             (mve_vshlq_r_<supf><mode>): Likewise.
             (mve_vshrnbq_m_n_<supf><mode>): Likewise.
             (mve_vshrnbq_n_<supf><mode>): Likewise.
             (mve_vshrntq_m_n_<supf><mode>): Likewise.
             (mve_vshrntq_n_<supf><mode>): Likewise.
             (mve_vshrq_m_n_<supf><mode>): Likewise.
             (mve_vshrq_n_<supf><mode>): Likewise.
             (mve_vsliq_m_n_<supf><mode>): Likewise.
             (mve_vsliq_n_<supf><mode>): Likewise.
             (mve_vsriq_m_n_<supf><mode>): Likewise.
             (mve_vsriq_n_<supf><mode>): Likewise.
             (mve_vstrbq_<supf><mode>): Likewise.
             (mve_vstrbq_p_<supf><mode>): Likewise.
             (mve_vstrbq_scatter_offset_<supf><mode>_insn): Likewise.
             (mve_vstrbq_scatter_offset_p_<supf><mode>_insn): Likewise.
             (mve_vstrdq_scatter_base_<supf>v2di): Likewise.
             (mve_vstrdq_scatter_base_p_<supf>v2di): Likewise.
             (mve_vstrdq_scatter_base_wb_<supf>v2di): Likewise.
             (mve_vstrdq_scatter_base_wb_p_<supf>v2di): Likewise.
             (mve_vstrdq_scatter_offset_<supf>v2di_insn): Likewise.
             (mve_vstrdq_scatter_offset_p_<supf>v2di_insn): Likewise.
             (mve_vstrdq_scatter_shifted_offset_<supf>v2di_insn): Likewise.
             (mve_vstrdq_scatter_shifted_offset_p_<supf>v2di_insn): Likewise.
             (mve_vstrhq_<supf><mode>): Likewise.
             (mve_vstrhq_fv8hf): Likewise.
             (mve_vstrhq_p_<supf><mode>): Likewise.
             (mve_vstrhq_p_fv8hf): Likewise.
             (mve_vstrhq_scatter_offset_<supf><mode>_insn): Likewise.
             (mve_vstrhq_scatter_offset_fv8hf_insn): Likewise.
             (mve_vstrhq_scatter_offset_p_<supf><mode>_insn): Likewise.
             (mve_vstrhq_scatter_offset_p_fv8hf_insn): Likewise.
             (mve_vstrhq_scatter_shifted_offset_<supf><mode>_insn): Likewise.
             (mve_vstrhq_scatter_shifted_offset_fv8hf_insn): Likewise.
             (mve_vstrhq_scatter_shifted_offset_p_<supf><mode>_insn): Likewise.
             (mve_vstrhq_scatter_shifted_offset_p_fv8hf_insn): Likewise.
             (mve_vstrwq_<supf>v4si): Likewise.
             (mve_vstrwq_fv4sf): Likewise.
             (mve_vstrwq_p_<supf>v4si): Likewise.
             (mve_vstrwq_p_fv4sf): Likewise.
             (mve_vstrwq_scatter_base_<supf>v4si): Likewise.
             (mve_vstrwq_scatter_base_fv4sf): Likewise.
             (mve_vstrwq_scatter_base_p_<supf>v4si): Likewise.
             (mve_vstrwq_scatter_base_p_fv4sf): Likewise.
             (mve_vstrwq_scatter_base_wb_<supf>v4si): Likewise.
             (mve_vstrwq_scatter_base_wb_fv4sf): Likewise.
             (mve_vstrwq_scatter_base_wb_p_<supf>v4si): Likewise.
             (mve_vstrwq_scatter_base_wb_p_fv4sf): Likewise.
             (mve_vstrwq_scatter_offset_<supf>v4si_insn): Likewise.
             (mve_vstrwq_scatter_offset_fv4sf_insn): Likewise.
             (mve_vstrwq_scatter_offset_p_<supf>v4si_insn): Likewise.
             (mve_vstrwq_scatter_offset_p_fv4sf_insn): Likewise.
             (mve_vstrwq_scatter_shifted_offset_<supf>v4si_insn): Likewise.
             (mve_vstrwq_scatter_shifted_offset_fv4sf_insn): Likewise.
             (mve_vstrwq_scatter_shifted_offset_p_<supf>v4si_insn): Likewise.
             (mve_vstrwq_scatter_shifted_offset_p_fv4sf_insn): Likewise.
             (mve_vsubq_<supf><mode>): Likewise.
             (mve_vsubq_f<mode>): Likewise.
             (mve_vsubq_m_<supf><mode>): Likewise.
             (mve_vsubq_m_f<mode>): Likewise.
             (mve_vsubq_m_n_<supf><mode>): Likewise.
             (mve_vsubq_m_n_f<mode>): Likewise.
             (mve_vsubq_n_<supf><mode>): Likewise.
             (mve_vsubq_n_f<mode>): Likewise.

[-- Attachment #2: 1.patch --]
[-- Type: text/x-patch, Size: 126482 bytes --]

commit 739b52501f95fe5073967009214e55f0dba0eda2
Author: Stam Markianos-Wright <stam.markianos-wright@arm.com>
Date:   Tue Oct 18 17:42:56 2022 +0100

    arm: Add define_attr to create a mapping between MVE predicated and unpredicated insns
    
    I'd like to submit two patches that add support for Arm's MVE
    Tail Predicated Low Overhead Loop feature.
    
    --- Introduction ---
    
    The M-class Arm-ARM:
    https://developer.arm.com/documentation/ddi0553/bu/?lang=en
    Section B5.5.1 "Loop tail predication" describes the feature
    we are adding support for with this patch (although
    we only add codegen for DLSTP/LETP instruction loops).
    
    Previously with commit d2ed233cb94 we'd added support for
    non-MVE DLS/LE loops through the loop-doloop pass, which, given
    a standard MVE loop like:
    
    ```
    void  __attribute__ ((noinline)) test (int16_t *a, int16_t *b, int16_t *c, int n)
    {
      while (n > 0)
        {
          mve_pred16_t p = vctp16q (n);
          int16x8_t va = vldrhq_z_s16 (a, p);
          int16x8_t vb = vldrhq_z_s16 (b, p);
          int16x8_t vc = vaddq_x_s16 (va, vb, p);
          vstrhq_p_s16 (c, vc, p);
          c+=8;
          a+=8;
          b+=8;
          n-=8;
        }
    }
    ```
    .. would output:
    
    ```
            <pre-calculate the number of iterations and place it into lr>
            dls     lr, lr
    .L3:
            vctp.16 r3
            vmrs    ip, P0  @ movhi
            sxth    ip, ip
            vmsr     P0, ip @ movhi
            mov     r4, r0
            vpst
            vldrht.16       q2, [r4]
            mov     r4, r1
            vmov    q3, q0
            vpst
            vldrht.16       q1, [r4]
            mov     r4, r2
            vpst
            vaddt.i16       q3, q2, q1
            subs    r3, r3, #8
            vpst
            vstrht.16       q3, [r4]
            adds    r0, r0, #16
            adds    r1, r1, #16
            adds    r2, r2, #16
            le      lr, .L3
    ```
    
    where the LE instruction will decrement LR by 1, compare and
    branch if needed.
    
    (There are also other inefficiencies in the above code, like the
    pointless vmrs/sxth/vmsr on the VPR, the adds not being merged
    into the vldrht/vstrht as #16 offsets, and some random movs!
    But those are different problems...)
    
    The MVE version is similar, except that:
    * Instead of DLS/LE the instructions are DLSTP/LETP.
    * Instead of pre-calculating the number of iterations of the
      loop, we place the number of elements to be processed by the
      loop into LR.
    * Instead of decrementing the LR by one, LETP will decrement it
      by FPSCR.LTPSIZE, which is the number of elements being
      processed in each iteration: 16 for 8-bit elements, 8 for 16-bit
      elements, etc.
    * On the final iteration, automatic Loop Tail Predication is
      performed, as if the instructions within the loop had been VPT
      predicated with a VCTP generating the VPR predicate in every
      loop iteration.
    
    The dlstp/letp loop now looks like:
    
    ```
            <place n into r3>
            dlstp.16        lr, r3
    .L14:
            mov     r3, r0
            vldrh.16        q3, [r3]
            mov     r3, r1
            vldrh.16        q2, [r3]
            mov     r3, r2
            vadd.i16  q3, q3, q2
            adds    r0, r0, #16
            vstrh.16        q3, [r3]
            adds    r1, r1, #16
            adds    r2, r2, #16
            letp    lr, .L14
    
    ```
    
    Since the loop tail predication is automatic, we have eliminated
    the VCTP that had been specified by the user in the intrinsic
    and converted the VPT-predicated instructions into their
    unpredicated equivalents (which also saves us from VPST insns).
    
    The LETP instruction here decrements LR by 8 in each iteration.
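    
    A rough C model of that counting (an illustration only, not code from
    this patch): with 128-bit vectors and 16-bit elements there are 8
    lanes, LR starts at the element count n, and the lanes left over on
    the final pass are simply predicated out:
    
    ```
    #include <stdint.h>
    
    #define LANES 8   /* 128-bit vector / 16-bit elements.  */
    
    /* Number of DLSTP/LETP iterations executed for N elements.  */
    int
    letp_iterations (uint32_t n)
    {
      int iters = 0;
      while (n > 0)
        {
          /* LETP subtracts the lane count each time around; the last,
             partial pass is tail-predicated automatically.  */
          n = (n > LANES) ? n - LANES : 0;
          iters++;
        }
      return iters;   /* Equals (original n + LANES - 1) / LANES.  */
    }
    ```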
    
    --- This 1/2 patch ---
    
    This first patch lays some groundwork by adding an attribute to
    md patterns, and then the second patch contains the functional
    changes.
    
    One major difficulty in implementing MVE Tail-Predicated Low
    Overhead Loops was the need to transform VPT-predicated insns
    in the insn chain into their unpredicated equivalents, like:
    `mve_vldrbq_z_<supf><mode> -> mve_vldrbq_<supf><mode>`.
    
    This requires us to have a deterministic link between two
    different patterns in mve.md -- this _could_ be done by
    re-ordering the entirety of mve.md such that the patterns are
    at some constant icode proximity (e.g. having the _z immediately
    after the unpredicated version would mean that to map from the
    former to the latter you could use icode-1), but that would be a
    very messy solution, introducing fragile, hidden dependencies on
    the ordering of patterns.
    
    This patch provides an alternative way of doing that: using an insn
    attribute to encode the icode of the unpredicated instruction.
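    
    The second patch can then recover the mapping through the new arm.h
    helpers; a minimal sketch of such a query (the helper name is made up
    here, it is not code from this patch) would be:
    
    ```
    /* Return the icode of INSN's unpredicated MVE form, or -1 if the
       mve_unpredicated_insn attribute records no mapping for INSN.  */
    static int
    mve_unpredicated_icode_for (rtx_insn *insn)
    {
      if (!MVE_VPT_PREDICABLE_INSN_P (insn))
        return -1;
      /* For an insn that is already unpredicated this is its own icode.  */
      return get_attr_mve_unpredicated_insn (insn);
    }
    ```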
    
    No regressions on arm-none-eabi with an MVE target.
    
    Thank you,
    Stam Markianos-Wright
    
    gcc/ChangeLog:
    
            * config/arm/arm.md (mve_unpredicated_insn): New attribute.
            * config/arm/arm.h (MVE_VPT_PREDICATED_INSN_P): New define.
            (MVE_VPT_UNPREDICATED_INSN_P): Likewise.
            (MVE_VPT_PREDICABLE_INSN_P): Likewise.
            * config/arm/vec-common.md (mve_vshlq_<supf><mode>): Add attribute.
            * config/arm/mve.md (arm_vcx1q<a>_p_v16qi): Add attribute.
            (arm_vcx1q<a>v16qi): Likewise.
            (arm_vcx1qav16qi): Likewise.
            (arm_vcx1qv16qi): Likewise.
            (arm_vcx2q<a>_p_v16qi): Likewise.
            (arm_vcx2q<a>v16qi): Likewise.
            (arm_vcx2qav16qi): Likewise.
            (arm_vcx2qv16qi): Likewise.
            (arm_vcx3q<a>_p_v16qi): Likewise.
            (arm_vcx3q<a>v16qi): Likewise.
            (arm_vcx3qav16qi): Likewise.
            (arm_vcx3qv16qi): Likewise.
            (mve_vabavq_<supf><mode>): Likewise.
            (mve_vabavq_p_<supf><mode>): Likewise.
            (mve_vabdq_<supf><mode>): Likewise.
            (mve_vabdq_f<mode>): Likewise.
            (mve_vabdq_m_<supf><mode>): Likewise.
            (mve_vabdq_m_f<mode>): Likewise.
            (mve_vabsq_f<mode>): Likewise.
            (mve_vabsq_m_f<mode>): Likewise.
            (mve_vabsq_m_s<mode>): Likewise.
            (mve_vabsq_s<mode>): Likewise.
            (mve_vadciq_<supf>v4si): Likewise.
            (mve_vadciq_m_<supf>v4si): Likewise.
            (mve_vadcq_<supf>v4si): Likewise.
            (mve_vadcq_m_<supf>v4si): Likewise.
            (mve_vaddlvaq_<supf>v4si): Likewise.
            (mve_vaddlvaq_p_<supf>v4si): Likewise.
            (mve_vaddlvq_<supf>v4si): Likewise.
            (mve_vaddlvq_p_<supf>v4si): Likewise.
            (mve_vaddq_f<mode>): Likewise.
            (mve_vaddq_m_<supf><mode>): Likewise.
            (mve_vaddq_m_f<mode>): Likewise.
            (mve_vaddq_m_n_<supf><mode>): Likewise.
            (mve_vaddq_m_n_f<mode>): Likewise.
            (mve_vaddq_n_<supf><mode>): Likewise.
            (mve_vaddq_n_f<mode>): Likewise.
            (mve_vaddq<mode>): Likewise.
            (mve_vaddvaq_<supf><mode>): Likewise.
            (mve_vaddvaq_p_<supf><mode>): Likewise.
            (mve_vaddvq_<supf><mode>): Likewise.
            (mve_vaddvq_p_<supf><mode>): Likewise.
            (mve_vandq_<supf><mode>): Likewise.
            (mve_vandq_f<mode>): Likewise.
            (mve_vandq_m_<supf><mode>): Likewise.
            (mve_vandq_m_f<mode>): Likewise.
            (mve_vandq_s<mode>): Likewise.
            (mve_vandq_u<mode>): Likewise.
            (mve_vbicq_<supf><mode>): Likewise.
            (mve_vbicq_f<mode>): Likewise.
            (mve_vbicq_m_<supf><mode>): Likewise.
            (mve_vbicq_m_f<mode>): Likewise.
            (mve_vbicq_m_n_<supf><mode>): Likewise.
            (mve_vbicq_n_<supf><mode>): Likewise.
            (mve_vbicq_s<mode>): Likewise.
            (mve_vbicq_u<mode>): Likewise.
            (mve_vbrsrq_m_n_<supf><mode>): Likewise.
            (mve_vbrsrq_m_n_f<mode>): Likewise.
            (mve_vbrsrq_n_<supf><mode>): Likewise.
            (mve_vbrsrq_n_f<mode>): Likewise.
            (mve_vcaddq_rot270_m_<supf><mode>): Likewise.
            (mve_vcaddq_rot270_m_f<mode>): Likewise.
            (mve_vcaddq_rot270<mode>): Likewise.
            (mve_vcaddq_rot270<mode>): Likewise.
            (mve_vcaddq_rot90_m_<supf><mode>): Likewise.
            (mve_vcaddq_rot90_m_f<mode>): Likewise.
            (mve_vcaddq_rot90<mode>): Likewise.
            (mve_vcaddq_rot90<mode>): Likewise.
            (mve_vcaddq<mve_rot><mode>): Likewise.
            (mve_vcaddq<mve_rot><mode>): Likewise.
            (mve_vclsq_m_s<mode>): Likewise.
            (mve_vclsq_s<mode>): Likewise.
            (mve_vclzq_<supf><mode>): Likewise.
            (mve_vclzq_m_<supf><mode>): Likewise.
            (mve_vclzq_s<mode>): Likewise.
            (mve_vclzq_u<mode>): Likewise.
            (mve_vcmlaq_m_f<mode>): Likewise.
            (mve_vcmlaq_rot180_m_f<mode>): Likewise.
            (mve_vcmlaq_rot180<mode>): Likewise.
            (mve_vcmlaq_rot270_m_f<mode>): Likewise.
            (mve_vcmlaq_rot270<mode>): Likewise.
            (mve_vcmlaq_rot90_m_f<mode>): Likewise.
            (mve_vcmlaq_rot90<mode>): Likewise.
            (mve_vcmlaq<mode>): Likewise.
            (mve_vcmlaq<mve_rot><mode>): Likewise.
            (mve_vcmp<mve_cmp_op>q_<mode>): Likewise.
            (mve_vcmp<mve_cmp_op>q_f<mode>): Likewise.
            (mve_vcmp<mve_cmp_op>q_n_<mode>): Likewise.
            (mve_vcmp<mve_cmp_op>q_n_f<mode>): Likewise.
            (mve_vcmpcsq_<mode>): Likewise.
            (mve_vcmpcsq_m_n_u<mode>): Likewise.
            (mve_vcmpcsq_m_u<mode>): Likewise.
            (mve_vcmpcsq_n_<mode>): Likewise.
            (mve_vcmpeqq_<mode>): Likewise.
            (mve_vcmpeqq_f<mode>): Likewise.
            (mve_vcmpeqq_m_<supf><mode>): Likewise.
            (mve_vcmpeqq_m_f<mode>): Likewise.
            (mve_vcmpeqq_m_n_<supf><mode>): Likewise.
            (mve_vcmpeqq_m_n_f<mode>): Likewise.
            (mve_vcmpeqq_n_<mode>): Likewise.
            (mve_vcmpeqq_n_f<mode>): Likewise.
            (mve_vcmpgeq_<mode>): Likewise.
            (mve_vcmpgeq_f<mode>): Likewise.
            (mve_vcmpgeq_m_f<mode>): Likewise.
            (mve_vcmpgeq_m_n_f<mode>): Likewise.
            (mve_vcmpgeq_m_n_s<mode>): Likewise.
            (mve_vcmpgeq_m_s<mode>): Likewise.
            (mve_vcmpgeq_n_<mode>): Likewise.
            (mve_vcmpgeq_n_f<mode>): Likewise.
            (mve_vcmpgtq_<mode>): Likewise.
            (mve_vcmpgtq_f<mode>): Likewise.
            (mve_vcmpgtq_m_f<mode>): Likewise.
            (mve_vcmpgtq_m_n_f<mode>): Likewise.
            (mve_vcmpgtq_m_n_s<mode>): Likewise.
            (mve_vcmpgtq_m_s<mode>): Likewise.
            (mve_vcmpgtq_n_<mode>): Likewise.
            (mve_vcmpgtq_n_f<mode>): Likewise.
            (mve_vcmphiq_<mode>): Likewise.
            (mve_vcmphiq_m_n_u<mode>): Likewise.
            (mve_vcmphiq_m_u<mode>): Likewise.
            (mve_vcmphiq_n_<mode>): Likewise.
            (mve_vcmpleq_<mode>): Likewise.
            (mve_vcmpleq_f<mode>): Likewise.
            (mve_vcmpleq_m_f<mode>): Likewise.
            (mve_vcmpleq_m_n_f<mode>): Likewise.
            (mve_vcmpleq_m_n_s<mode>): Likewise.
            (mve_vcmpleq_m_s<mode>): Likewise.
            (mve_vcmpleq_n_<mode>): Likewise.
            (mve_vcmpleq_n_f<mode>): Likewise.
            (mve_vcmpltq_<mode>): Likewise.
            (mve_vcmpltq_f<mode>): Likewise.
            (mve_vcmpltq_m_f<mode>): Likewise.
            (mve_vcmpltq_m_n_f<mode>): Likewise.
            (mve_vcmpltq_m_n_s<mode>): Likewise.
            (mve_vcmpltq_m_s<mode>): Likewise.
            (mve_vcmpltq_n_<mode>): Likewise.
            (mve_vcmpltq_n_f<mode>): Likewise.
            (mve_vcmpneq_<mode>): Likewise.
            (mve_vcmpneq_f<mode>): Likewise.
            (mve_vcmpneq_m_<supf><mode>): Likewise.
            (mve_vcmpneq_m_f<mode>): Likewise.
            (mve_vcmpneq_m_n_<supf><mode>): Likewise.
            (mve_vcmpneq_m_n_f<mode>): Likewise.
            (mve_vcmpneq_n_<mode>): Likewise.
            (mve_vcmpneq_n_f<mode>): Likewise.
            (mve_vcmulq_m_f<mode>): Likewise.
            (mve_vcmulq_rot180_m_f<mode>): Likewise.
            (mve_vcmulq_rot180<mode>): Likewise.
            (mve_vcmulq_rot270_m_f<mode>): Likewise.
            (mve_vcmulq_rot270<mode>): Likewise.
            (mve_vcmulq_rot90_m_f<mode>): Likewise.
            (mve_vcmulq_rot90<mode>): Likewise.
            (mve_vcmulq<mode>): Likewise.
            (mve_vcmulq<mve_rot><mode>): Likewise.
            (mve_vctp<mode1>q_mhi): Likewise.
            (mve_vctp<mode1>qhi): Likewise.
            (mve_vcvtaq_<supf><mode>): Likewise.
            (mve_vcvtaq_m_<supf><mode>): Likewise.
            (mve_vcvtbq_f16_f32v8hf): Likewise.
            (mve_vcvtbq_f32_f16v4sf): Likewise.
            (mve_vcvtbq_m_f16_f32v8hf): Likewise.
            (mve_vcvtbq_m_f32_f16v4sf): Likewise.
            (mve_vcvtmq_<supf><mode>): Likewise.
            (mve_vcvtmq_m_<supf><mode>): Likewise.
            (mve_vcvtnq_<supf><mode>): Likewise.
            (mve_vcvtnq_m_<supf><mode>): Likewise.
            (mve_vcvtpq_<supf><mode>): Likewise.
            (mve_vcvtpq_m_<supf><mode>): Likewise.
            (mve_vcvtq_from_f_<supf><mode>): Likewise.
            (mve_vcvtq_m_from_f_<supf><mode>): Likewise.
            (mve_vcvtq_m_n_from_f_<supf><mode>): Likewise.
            (mve_vcvtq_m_n_to_f_<supf><mode>): Likewise.
            (mve_vcvtq_m_to_f_<supf><mode>): Likewise.
            (mve_vcvtq_n_from_f_<supf><mode>): Likewise.
            (mve_vcvtq_n_to_f_<supf><mode>): Likewise.
            (mve_vcvtq_to_f_<supf><mode>): Likewise.
            (mve_vcvttq_f16_f32v8hf): Likewise.
            (mve_vcvttq_f32_f16v4sf): Likewise.
            (mve_vcvttq_m_f16_f32v8hf): Likewise.
            (mve_vcvttq_m_f32_f16v4sf): Likewise.
            (mve_vddupq_m_wb_u<mode>_insn): Likewise.
            (mve_vddupq_u<mode>_insn): Likewise.
            (mve_vdupq_m_n_<supf><mode>): Likewise.
            (mve_vdupq_m_n_f<mode>): Likewise.
            (mve_vdupq_n_<supf><mode>): Likewise.
            (mve_vdupq_n_f<mode>): Likewise.
            (mve_vdwdupq_m_wb_u<mode>_insn): Likewise.
            (mve_vdwdupq_wb_u<mode>_insn): Likewise.
            (mve_veorq_<supf><mode>): Likewise.
            (mve_veorq_f<mode>): Likewise.
            (mve_veorq_m_<supf><mode>): Likewise.
            (mve_veorq_m_f<mode>): Likewise.
            (mve_veorq_s<mode>): Likewise.
            (mve_veorq_u<mode>): Likewise.
            (mve_vfmaq_f<mode>): Likewise.
            (mve_vfmaq_m_f<mode>): Likewise.
            (mve_vfmaq_m_n_f<mode>): Likewise.
            (mve_vfmaq_n_f<mode>): Likewise.
            (mve_vfmasq_m_n_f<mode>): Likewise.
            (mve_vfmasq_n_f<mode>): Likewise.
            (mve_vfmsq_f<mode>): Likewise.
            (mve_vfmsq_m_f<mode>): Likewise.
            (mve_vhaddq_<supf><mode>): Likewise.
            (mve_vhaddq_m_<supf><mode>): Likewise.
            (mve_vhaddq_m_n_<supf><mode>): Likewise.
            (mve_vhaddq_n_<supf><mode>): Likewise.
            (mve_vhcaddq_rot270_m_s<mode>): Likewise.
            (mve_vhcaddq_rot270_s<mode>): Likewise.
            (mve_vhcaddq_rot90_m_s<mode>): Likewise.
            (mve_vhcaddq_rot90_s<mode>): Likewise.
            (mve_vhsubq_<supf><mode>): Likewise.
            (mve_vhsubq_m_<supf><mode>): Likewise.
            (mve_vhsubq_m_n_<supf><mode>): Likewise.
            (mve_vhsubq_n_<supf><mode>): Likewise.
            (mve_vidupq_m_wb_u<mode>_insn): Likewise.
            (mve_vidupq_u<mode>_insn): Likewise.
            (mve_viwdupq_m_wb_u<mode>_insn): Likewise.
            (mve_viwdupq_wb_u<mode>_insn): Likewise.
            (mve_vldrbq_<supf><mode>): Likewise.
            (mve_vldrbq_gather_offset_<supf><mode>): Likewise.
            (mve_vldrbq_gather_offset_z_<supf><mode>): Likewise.
            (mve_vldrbq_z_<supf><mode>): Likewise.
            (mve_vldrdq_gather_base_<supf>v2di): Likewise.
            (mve_vldrdq_gather_base_wb_<supf>v2di_insn): Likewise.
            (mve_vldrdq_gather_base_wb_z_<supf>v2di_insn): Likewise.
            (mve_vldrdq_gather_base_z_<supf>v2di): Likewise.
            (mve_vldrdq_gather_offset_<supf>v2di): Likewise.
            (mve_vldrdq_gather_offset_z_<supf>v2di): Likewise.
            (mve_vldrdq_gather_shifted_offset_<supf>v2di): Likewise.
            (mve_vldrdq_gather_shifted_offset_z_<supf>v2di): Likewise.
            (mve_vldrhq_<supf><mode>): Likewise.
            (mve_vldrhq_fv8hf): Likewise.
            (mve_vldrhq_gather_offset_<supf><mode>): Likewise.
            (mve_vldrhq_gather_offset_fv8hf): Likewise.
            (mve_vldrhq_gather_offset_z_<supf><mode>): Likewise.
            (mve_vldrhq_gather_offset_z_fv8hf): Likewise.
            (mve_vldrhq_gather_shifted_offset_<supf><mode>): Likewise.
            (mve_vldrhq_gather_shifted_offset_fv8hf): Likewise.
            (mve_vldrhq_gather_shifted_offset_z_<supf><mode>): Likewise.
            (mve_vldrhq_gather_shifted_offset_z_fv8hf): Likewise.
            (mve_vldrhq_z_<supf><mode>): Likewise.
            (mve_vldrhq_z_fv8hf): Likewise.
            (mve_vldrwq_<supf>v4si): Likewise.
            (mve_vldrwq_fv4sf): Likewise.
            (mve_vldrwq_gather_base_<supf>v4si): Likewise.
            (mve_vldrwq_gather_base_fv4sf): Likewise.
            (mve_vldrwq_gather_base_wb_<supf>v4si_insn): Likewise.
            (mve_vldrwq_gather_base_wb_fv4sf_insn): Likewise.
            (mve_vldrwq_gather_base_wb_z_<supf>v4si_insn): Likewise.
            (mve_vldrwq_gather_base_wb_z_fv4sf_insn): Likewise.
            (mve_vldrwq_gather_base_z_<supf>v4si): Likewise.
            (mve_vldrwq_gather_base_z_fv4sf): Likewise.
            (mve_vldrwq_gather_offset_<supf>v4si): Likewise.
            (mve_vldrwq_gather_offset_fv4sf): Likewise.
            (mve_vldrwq_gather_offset_z_<supf>v4si): Likewise.
            (mve_vldrwq_gather_offset_z_fv4sf): Likewise.
            (mve_vldrwq_gather_shifted_offset_<supf>v4si): Likewise.
            (mve_vldrwq_gather_shifted_offset_fv4sf): Likewise.
            (mve_vldrwq_gather_shifted_offset_z_<supf>v4si): Likewise.
            (mve_vldrwq_gather_shifted_offset_z_fv4sf): Likewise.
            (mve_vldrwq_z_<supf>v4si): Likewise.
            (mve_vldrwq_z_fv4sf): Likewise.
            (mve_vmaxaq_m_s<mode>): Likewise.
            (mve_vmaxaq_s<mode>): Likewise.
            (mve_vmaxavq_p_s<mode>): Likewise.
            (mve_vmaxavq_s<mode>): Likewise.
            (mve_vmaxnmaq_f<mode>): Likewise.
            (mve_vmaxnmaq_m_f<mode>): Likewise.
            (mve_vmaxnmavq_f<mode>): Likewise.
            (mve_vmaxnmavq_p_f<mode>): Likewise.
            (mve_vmaxnmq_f<mode>): Likewise.
            (mve_vmaxnmq_m_f<mode>): Likewise.
            (mve_vmaxnmvq_f<mode>): Likewise.
            (mve_vmaxnmvq_p_f<mode>): Likewise.
            (mve_vmaxq_<supf><mode>): Likewise.
            (mve_vmaxq_m_<supf><mode>): Likewise.
            (mve_vmaxq_s<mode>): Likewise.
            (mve_vmaxq_u<mode>): Likewise.
            (mve_vmaxvq_<supf><mode>): Likewise.
            (mve_vmaxvq_p_<supf><mode>): Likewise.
            (mve_vminaq_m_s<mode>): Likewise.
            (mve_vminaq_s<mode>): Likewise.
            (mve_vminavq_p_s<mode>): Likewise.
            (mve_vminavq_s<mode>): Likewise.
            (mve_vminnmaq_f<mode>): Likewise.
            (mve_vminnmaq_m_f<mode>): Likewise.
            (mve_vminnmavq_f<mode>): Likewise.
            (mve_vminnmavq_p_f<mode>): Likewise.
            (mve_vminnmq_f<mode>): Likewise.
            (mve_vminnmq_m_f<mode>): Likewise.
            (mve_vminnmvq_f<mode>): Likewise.
            (mve_vminnmvq_p_f<mode>): Likewise.
            (mve_vminq_<supf><mode>): Likewise.
            (mve_vminq_m_<supf><mode>): Likewise.
            (mve_vminq_s<mode>): Likewise.
            (mve_vminq_u<mode>): Likewise.
            (mve_vminvq_<supf><mode>): Likewise.
            (mve_vminvq_p_<supf><mode>): Likewise.
            (mve_vmladavaq_<supf><mode>): Likewise.
            (mve_vmladavaq_p_<supf><mode>): Likewise.
            (mve_vmladavaxq_p_s<mode>): Likewise.
            (mve_vmladavaxq_s<mode>): Likewise.
            (mve_vmladavq_<supf><mode>): Likewise.
            (mve_vmladavq_p_<supf><mode>): Likewise.
            (mve_vmladavxq_p_s<mode>): Likewise.
            (mve_vmladavxq_s<mode>): Likewise.
            (mve_vmlaldavaq_<supf><mode>): Likewise.
            (mve_vmlaldavaq_p_<supf><mode>): Likewise.
            (mve_vmlaldavaxq_<supf><mode>): Likewise.
            (mve_vmlaldavaxq_p_<supf><mode>): Likewise.
            (mve_vmlaldavaxq_s<mode>): Likewise.
            (mve_vmlaldavq_<supf><mode>): Likewise.
            (mve_vmlaldavq_p_<supf><mode>): Likewise.
            (mve_vmlaldavxq_p_s<mode>): Likewise.
            (mve_vmlaldavxq_s<mode>): Likewise.
            (mve_vmlaq_m_n_<supf><mode>): Likewise.
            (mve_vmlaq_n_<supf><mode>): Likewise.
            (mve_vmlasq_m_n_<supf><mode>): Likewise.
            (mve_vmlasq_n_<supf><mode>): Likewise.
            (mve_vmlsdavaq_p_s<mode>): Likewise.
            (mve_vmlsdavaq_s<mode>): Likewise.
            (mve_vmlsdavaxq_p_s<mode>): Likewise.
            (mve_vmlsdavaxq_s<mode>): Likewise.
            (mve_vmlsdavq_p_s<mode>): Likewise.
            (mve_vmlsdavq_s<mode>): Likewise.
            (mve_vmlsdavxq_p_s<mode>): Likewise.
            (mve_vmlsdavxq_s<mode>): Likewise.
            (mve_vmlsldavaq_p_s<mode>): Likewise.
            (mve_vmlsldavaq_s<mode>): Likewise.
            (mve_vmlsldavaxq_p_s<mode>): Likewise.
            (mve_vmlsldavaxq_s<mode>): Likewise.
            (mve_vmlsldavq_p_s<mode>): Likewise.
            (mve_vmlsldavq_s<mode>): Likewise.
            (mve_vmlsldavxq_p_s<mode>): Likewise.
            (mve_vmlsldavxq_s<mode>): Likewise.
            (mve_vmovlbq_<supf><mode>): Likewise.
            (mve_vmovlbq_m_<supf><mode>): Likewise.
            (mve_vmovltq_<supf><mode>): Likewise.
            (mve_vmovltq_m_<supf><mode>): Likewise.
            (mve_vmovnbq_<supf><mode>): Likewise.
            (mve_vmovnbq_m_<supf><mode>): Likewise.
            (mve_vmovntq_<supf><mode>): Likewise.
            (mve_vmovntq_m_<supf><mode>): Likewise.
            (mve_vmulhq_<supf><mode>): Likewise.
            (mve_vmulhq_m_<supf><mode>): Likewise.
            (mve_vmullbq_int_<supf><mode>): Likewise.
            (mve_vmullbq_int_m_<supf><mode>): Likewise.
            (mve_vmullbq_poly_m_p<mode>): Likewise.
            (mve_vmullbq_poly_p<mode>): Likewise.
            (mve_vmulltq_int_<supf><mode>): Likewise.
            (mve_vmulltq_int_m_<supf><mode>): Likewise.
            (mve_vmulltq_poly_m_p<mode>): Likewise.
            (mve_vmulltq_poly_p<mode>): Likewise.
            (mve_vmulq_<supf><mode>): Likewise.
            (mve_vmulq_f<mode>): Likewise.
            (mve_vmulq_m_<supf><mode>): Likewise.
            (mve_vmulq_m_f<mode>): Likewise.
            (mve_vmulq_m_n_<supf><mode>): Likewise.
            (mve_vmulq_m_n_f<mode>): Likewise.
            (mve_vmulq_n_<supf><mode>): Likewise.
            (mve_vmulq_n_f<mode>): Likewise.
            (mve_vmvnq_<supf><mode>): Likewise.
            (mve_vmvnq_m_<supf><mode>): Likewise.
            (mve_vmvnq_m_n_<supf><mode>): Likewise.
            (mve_vmvnq_n_<supf><mode>): Likewise.
            (mve_vmvnq_s<mode>): Likewise.
            (mve_vmvnq_u<mode>): Likewise.
            (mve_vnegq_f<mode>): Likewise.
            (mve_vnegq_m_f<mode>): Likewise.
            (mve_vnegq_m_s<mode>): Likewise.
            (mve_vnegq_s<mode>): Likewise.
            (mve_vornq_<supf><mode>): Likewise.
            (mve_vornq_f<mode>): Likewise.
            (mve_vornq_m_<supf><mode>): Likewise.
            (mve_vornq_m_f<mode>): Likewise.
            (mve_vornq_s<mode>): Likewise.
            (mve_vornq_u<mode>): Likewise.
            (mve_vorrq_<supf><mode>): Likewise.
            (mve_vorrq_f<mode>): Likewise.
            (mve_vorrq_m_<supf><mode>): Likewise.
            (mve_vorrq_m_f<mode>): Likewise.
            (mve_vorrq_m_n_<supf><mode>): Likewise.
            (mve_vorrq_n_<supf><mode>): Likewise.
            (mve_vorrq_s<mode>): Likewise.
            (mve_vorrq_s<mode>): Likewise.
            (mve_vqabsq_m_s<mode>): Likewise.
            (mve_vqabsq_s<mode>): Likewise.
            (mve_vqaddq_<supf><mode>): Likewise.
            (mve_vqaddq_m_<supf><mode>): Likewise.
            (mve_vqaddq_m_n_<supf><mode>): Likewise.
            (mve_vqaddq_n_<supf><mode>): Likewise.
            (mve_vqdmladhq_m_s<mode>): Likewise.
            (mve_vqdmladhq_s<mode>): Likewise.
            (mve_vqdmladhxq_m_s<mode>): Likewise.
            (mve_vqdmladhxq_s<mode>): Likewise.
            (mve_vqdmlahq_m_n_s<mode>): Likewise.
            (mve_vqdmlahq_n_<supf><mode>): Likewise.
            (mve_vqdmlahq_n_s<mode>): Likewise.
            (mve_vqdmlashq_m_n_s<mode>): Likewise.
            (mve_vqdmlashq_n_<supf><mode>): Likewise.
            (mve_vqdmlashq_n_s<mode>): Likewise.
            (mve_vqdmlsdhq_m_s<mode>): Likewise.
            (mve_vqdmlsdhq_s<mode>): Likewise.
            (mve_vqdmlsdhxq_m_s<mode>): Likewise.
            (mve_vqdmlsdhxq_s<mode>): Likewise.
            (mve_vqdmulhq_m_n_s<mode>): Likewise.
            (mve_vqdmulhq_m_s<mode>): Likewise.
            (mve_vqdmulhq_n_s<mode>): Likewise.
            (mve_vqdmulhq_s<mode>): Likewise.
            (mve_vqdmullbq_m_n_s<mode>): Likewise.
            (mve_vqdmullbq_m_s<mode>): Likewise.
            (mve_vqdmullbq_n_s<mode>): Likewise.
            (mve_vqdmullbq_s<mode>): Likewise.
            (mve_vqdmulltq_m_n_s<mode>): Likewise.
            (mve_vqdmulltq_m_s<mode>): Likewise.
            (mve_vqdmulltq_n_s<mode>): Likewise.
            (mve_vqdmulltq_s<mode>): Likewise.
            (mve_vqmovnbq_<supf><mode>): Likewise.
            (mve_vqmovnbq_m_<supf><mode>): Likewise.
            (mve_vqmovntq_<supf><mode>): Likewise.
            (mve_vqmovntq_m_<supf><mode>): Likewise.
            (mve_vqmovunbq_m_s<mode>): Likewise.
            (mve_vqmovunbq_s<mode>): Likewise.
            (mve_vqmovuntq_m_s<mode>): Likewise.
            (mve_vqmovuntq_s<mode>): Likewise.
            (mve_vqnegq_m_s<mode>): Likewise.
            (mve_vqnegq_s<mode>): Likewise.
            (mve_vqrdmladhq_m_s<mode>): Likewise.
            (mve_vqrdmladhq_s<mode>): Likewise.
            (mve_vqrdmladhxq_m_s<mode>): Likewise.
            (mve_vqrdmladhxq_s<mode>): Likewise.
            (mve_vqrdmlahq_m_n_s<mode>): Likewise.
            (mve_vqrdmlahq_n_<supf><mode>): Likewise.
            (mve_vqrdmlahq_n_s<mode>): Likewise.
            (mve_vqrdmlashq_m_n_s<mode>): Likewise.
            (mve_vqrdmlashq_n_<supf><mode>): Likewise.
            (mve_vqrdmlashq_n_s<mode>): Likewise.
            (mve_vqrdmlsdhq_m_s<mode>): Likewise.
            (mve_vqrdmlsdhq_s<mode>): Likewise.
            (mve_vqrdmlsdhxq_m_s<mode>): Likewise.
            (mve_vqrdmlsdhxq_s<mode>): Likewise.
            (mve_vqrdmulhq_m_n_s<mode>): Likewise.
            (mve_vqrdmulhq_m_s<mode>): Likewise.
            (mve_vqrdmulhq_n_s<mode>): Likewise.
            (mve_vqrdmulhq_s<mode>): Likewise.
            (mve_vqrshlq_<supf><mode>): Likewise.
            (mve_vqrshlq_m_<supf><mode>): Likewise.
            (mve_vqrshlq_m_n_<supf><mode>): Likewise.
            (mve_vqrshlq_n_<supf><mode>): Likewise.
            (mve_vqrshrnbq_m_n_<supf><mode>): Likewise.
            (mve_vqrshrnbq_n_<supf><mode>): Likewise.
            (mve_vqrshrntq_m_n_<supf><mode>): Likewise.
            (mve_vqrshrntq_n_<supf><mode>): Likewise.
            (mve_vqrshrunbq_m_n_s<mode>): Likewise.
            (mve_vqrshrunbq_n_s<mode>): Likewise.
            (mve_vqrshruntq_m_n_s<mode>): Likewise.
            (mve_vqrshruntq_n_s<mode>): Likewise.
            (mve_vqshlq_<supf><mode>): Likewise.
            (mve_vqshlq_m_<supf><mode>): Likewise.
            (mve_vqshlq_m_n_<supf><mode>): Likewise.
            (mve_vqshlq_m_r_<supf><mode>): Likewise.
            (mve_vqshlq_n_<supf><mode>): Likewise.
            (mve_vqshlq_r_<supf><mode>): Likewise.
            (mve_vqshluq_m_n_s<mode>): Likewise.
            (mve_vqshluq_n_s<mode>): Likewise.
            (mve_vqshrnbq_m_n_<supf><mode>): Likewise.
            (mve_vqshrnbq_n_<supf><mode>): Likewise.
            (mve_vqshrntq_m_n_<supf><mode>): Likewise.
            (mve_vqshrntq_n_<supf><mode>): Likewise.
            (mve_vqshrunbq_m_n_s<mode>): Likewise.
            (mve_vqshrunbq_n_s<mode>): Likewise.
            (mve_vqshruntq_m_n_s<mode>): Likewise.
            (mve_vqshruntq_n_s<mode>): Likewise.
            (mve_vqsubq_<supf><mode>): Likewise.
            (mve_vqsubq_m_<supf><mode>): Likewise.
            (mve_vqsubq_m_n_<supf><mode>): Likewise.
            (mve_vqsubq_n_<supf><mode>): Likewise.
            (mve_vrev16q_<supf>v16qi): Likewise.
            (mve_vrev16q_m_<supf>v16qi): Likewise.
            (mve_vrev32q_<supf><mode>): Likewise.
            (mve_vrev32q_fv8hf): Likewise.
            (mve_vrev32q_m_<supf><mode>): Likewise.
            (mve_vrev32q_m_fv8hf): Likewise.
            (mve_vrev64q_<supf><mode>): Likewise.
            (mve_vrev64q_f<mode>): Likewise.
            (mve_vrev64q_m_<supf><mode>): Likewise.
            (mve_vrev64q_m_f<mode>): Likewise.
            (mve_vrhaddq_<supf><mode>): Likewise.
            (mve_vrhaddq_m_<supf><mode>): Likewise.
            (mve_vrmlaldavhaq_<supf>v4si): Likewise.
            (mve_vrmlaldavhaq_p_sv4si): Likewise.
            (mve_vrmlaldavhaq_p_uv4si): Likewise.
            (mve_vrmlaldavhaq_sv4si): Likewise.
            (mve_vrmlaldavhaq_uv4si): Likewise.
            (mve_vrmlaldavhaxq_p_sv4si): Likewise.
            (mve_vrmlaldavhaxq_sv4si): Likewise.
            (mve_vrmlaldavhq_<supf>v4si): Likewise.
            (mve_vrmlaldavhq_p_<supf>v4si): Likewise.
            (mve_vrmlaldavhxq_p_sv4si): Likewise.
            (mve_vrmlaldavhxq_sv4si): Likewise.
            (mve_vrmlsldavhaq_p_sv4si): Likewise.
            (mve_vrmlsldavhaq_sv4si): Likewise.
            (mve_vrmlsldavhaxq_p_sv4si): Likewise.
            (mve_vrmlsldavhaxq_sv4si): Likewise.
            (mve_vrmlsldavhq_p_sv4si): Likewise.
            (mve_vrmlsldavhq_sv4si): Likewise.
            (mve_vrmlsldavhxq_p_sv4si): Likewise.
            (mve_vrmlsldavhxq_sv4si): Likewise.
            (mve_vrmulhq_<supf><mode>): Likewise.
            (mve_vrmulhq_m_<supf><mode>): Likewise.
            (mve_vrndaq_f<mode>): Likewise.
            (mve_vrndaq_m_f<mode>): Likewise.
            (mve_vrndmq_f<mode>): Likewise.
            (mve_vrndmq_m_f<mode>): Likewise.
            (mve_vrndnq_f<mode>): Likewise.
            (mve_vrndnq_m_f<mode>): Likewise.
            (mve_vrndpq_f<mode>): Likewise.
            (mve_vrndpq_m_f<mode>): Likewise.
            (mve_vrndq_f<mode>): Likewise.
            (mve_vrndq_m_f<mode>): Likewise.
            (mve_vrndxq_f<mode>): Likewise.
            (mve_vrndxq_m_f<mode>): Likewise.
            (mve_vrshlq_<supf><mode>): Likewise.
            (mve_vrshlq_m_<supf><mode>): Likewise.
            (mve_vrshlq_m_n_<supf><mode>): Likewise.
            (mve_vrshlq_n_<supf><mode>): Likewise.
            (mve_vrshrnbq_m_n_<supf><mode>): Likewise.
            (mve_vrshrnbq_n_<supf><mode>): Likewise.
            (mve_vrshrntq_m_n_<supf><mode>): Likewise.
            (mve_vrshrntq_n_<supf><mode>): Likewise.
            (mve_vrshrq_m_n_<supf><mode>): Likewise.
            (mve_vrshrq_n_<supf><mode>): Likewise.
            (mve_vsbciq_<supf>v4si): Likewise.
            (mve_vsbciq_m_<supf>v4si): Likewise.
            (mve_vsbcq_<supf>v4si): Likewise.
            (mve_vsbcq_m_<supf>v4si): Likewise.
            (mve_vshlcq_<supf><mode>): Likewise.
            (mve_vshlcq_m_<supf><mode>): Likewise.
            (mve_vshllbq_m_n_<supf><mode>): Likewise.
            (mve_vshllbq_n_<supf><mode>): Likewise.
            (mve_vshlltq_m_n_<supf><mode>): Likewise.
            (mve_vshlltq_n_<supf><mode>): Likewise.
            (mve_vshlq_<supf><mode>): Likewise.
            (mve_vshlq_<supf><mode>): Likewise.
            (mve_vshlq_m_<supf><mode>): Likewise.
            (mve_vshlq_m_n_<supf><mode>): Likewise.
            (mve_vshlq_m_r_<supf><mode>): Likewise.
            (mve_vshlq_n_<supf><mode>): Likewise.
            (mve_vshlq_r_<supf><mode>): Likewise.
            (mve_vshrnbq_m_n_<supf><mode>): Likewise.
            (mve_vshrnbq_n_<supf><mode>): Likewise.
            (mve_vshrntq_m_n_<supf><mode>): Likewise.
            (mve_vshrntq_n_<supf><mode>): Likewise.
            (mve_vshrq_m_n_<supf><mode>): Likewise.
            (mve_vshrq_n_<supf><mode>): Likewise.
            (mve_vsliq_m_n_<supf><mode>): Likewise.
            (mve_vsliq_n_<supf><mode>): Likewise.
            (mve_vsriq_m_n_<supf><mode>): Likewise.
            (mve_vsriq_n_<supf><mode>): Likewise.
            (mve_vstrbq_<supf><mode>): Likewise.
            (mve_vstrbq_p_<supf><mode>): Likewise.
            (mve_vstrbq_scatter_offset_<supf><mode>_insn): Likewise.
            (mve_vstrbq_scatter_offset_p_<supf><mode>_insn): Likewise.
            (mve_vstrdq_scatter_base_<supf>v2di): Likewise.
            (mve_vstrdq_scatter_base_p_<supf>v2di): Likewise.
            (mve_vstrdq_scatter_base_wb_<supf>v2di): Likewise.
            (mve_vstrdq_scatter_base_wb_p_<supf>v2di): Likewise.
            (mve_vstrdq_scatter_offset_<supf>v2di_insn): Likewise.
            (mve_vstrdq_scatter_offset_p_<supf>v2di_insn): Likewise.
            (mve_vstrdq_scatter_shifted_offset_<supf>v2di_insn): Likewise.
            (mve_vstrdq_scatter_shifted_offset_p_<supf>v2di_insn): Likewise.
            (mve_vstrhq_<supf><mode>): Likewise.
            (mve_vstrhq_fv8hf): Likewise.
            (mve_vstrhq_p_<supf><mode>): Likewise.
            (mve_vstrhq_p_fv8hf): Likewise.
            (mve_vstrhq_scatter_offset_<supf><mode>_insn): Likewise.
            (mve_vstrhq_scatter_offset_fv8hf_insn): Likewise.
            (mve_vstrhq_scatter_offset_p_<supf><mode>_insn): Likewise.
            (mve_vstrhq_scatter_offset_p_fv8hf_insn): Likewise.
            (mve_vstrhq_scatter_shifted_offset_<supf><mode>_insn): Likewise.
            (mve_vstrhq_scatter_shifted_offset_fv8hf_insn): Likewise.
            (mve_vstrhq_scatter_shifted_offset_p_<supf><mode>_insn): Likewise.
            (mve_vstrhq_scatter_shifted_offset_p_fv8hf_insn): Likewise.
            (mve_vstrwq_<supf>v4si): Likewise.
            (mve_vstrwq_fv4sf): Likewise.
            (mve_vstrwq_p_<supf>v4si): Likewise.
            (mve_vstrwq_p_fv4sf): Likewise.
            (mve_vstrwq_scatter_base_<supf>v4si): Likewise.
            (mve_vstrwq_scatter_base_fv4sf): Likewise.
            (mve_vstrwq_scatter_base_p_<supf>v4si): Likewise.
            (mve_vstrwq_scatter_base_p_fv4sf): Likewise.
            (mve_vstrwq_scatter_base_wb_<supf>v4si): Likewise.
            (mve_vstrwq_scatter_base_wb_fv4sf): Likewise.
            (mve_vstrwq_scatter_base_wb_p_<supf>v4si): Likewise.
            (mve_vstrwq_scatter_base_wb_p_fv4sf): Likewise.
            (mve_vstrwq_scatter_offset_<supf>v4si_insn): Likewise.
            (mve_vstrwq_scatter_offset_fv4sf_insn): Likewise.
            (mve_vstrwq_scatter_offset_p_<supf>v4si_insn): Likewise.
            (mve_vstrwq_scatter_offset_p_fv4sf_insn): Likewise.
            (mve_vstrwq_scatter_shifted_offset_<supf>v4si_insn): Likewise.
            (mve_vstrwq_scatter_shifted_offset_fv4sf_insn): Likewise.
            (mve_vstrwq_scatter_shifted_offset_p_<supf>v4si_insn): Likewise.
            (mve_vstrwq_scatter_shifted_offset_p_fv4sf_insn): Likewise.
            (mve_vsubq_<supf><mode>): Likewise.
            (mve_vsubq_f<mode>): Likewise.
            (mve_vsubq_m_<supf><mode>): Likewise.
            (mve_vsubq_m_f<mode>): Likewise.
            (mve_vsubq_m_n_<supf><mode>): Likewise.
            (mve_vsubq_m_n_f<mode>): Likewise.
            (mve_vsubq_n_<supf><mode>): Likewise.
            (mve_vsubq_n_f<mode>): Likewise.

diff --git a/gcc/config/arm/arm.h b/gcc/config/arm/arm.h
index 7d40b8b7e00..40972c24ba1 100644
--- a/gcc/config/arm/arm.h
+++ b/gcc/config/arm/arm.h
@@ -2358,6 +2358,21 @@ extern int making_const_table;
   else if (TARGET_THUMB1)				\
     thumb1_final_prescan_insn (INSN)
 
+/* These defines are useful to refer to the value of the mve_unpredicated_insn
+   insn attribute.  Note that, because these use the get_attr_* function, these
+   will change recog_data if (INSN) isn't current_insn.  */
+#define MVE_VPT_PREDICABLE_INSN_P(INSN)					\
+  (recog_memoized (INSN) >= 0						\
+  && get_attr_mve_unpredicated_insn (INSN) != 0)			\
+
+#define MVE_VPT_PREDICATED_INSN_P(INSN)					\
+  (MVE_VPT_PREDICABLE_INSN_P (INSN)					\
+   && recog_memoized (INSN) != get_attr_mve_unpredicated_insn (INSN))	\
+
+#define MVE_VPT_UNPREDICATED_INSN_P(INSN)				\
+  (MVE_VPT_PREDICABLE_INSN_P (INSN)					\
+   && recog_memoized (INSN) == get_attr_mve_unpredicated_insn (INSN))	\
+
 #define ARM_SIGN_EXTEND(x)  ((HOST_WIDE_INT)			\
   (HOST_BITS_PER_WIDE_INT <= 32 ? (unsigned HOST_WIDE_INT) (x)	\
    : ((((unsigned HOST_WIDE_INT)(x)) & (unsigned HOST_WIDE_INT) 0xffffffff) |\
diff --git a/gcc/config/arm/arm.md b/gcc/config/arm/arm.md
index cbfc4543531..e9794375187 100644
--- a/gcc/config/arm/arm.md
+++ b/gcc/config/arm/arm.md
@@ -124,6 +124,8 @@
 ; and not all ARM insns do.
 (define_attr "predicated" "yes,no" (const_string "no"))
 
+(define_attr "mve_unpredicated_insn" "" (const_int 0))
+
 ; LENGTH of an instruction (in bytes)
 (define_attr "length" ""
   (const_int 4))
diff --git a/gcc/config/arm/mve.md b/gcc/config/arm/mve.md
index 9e3570c5264..74b8af8d57e 100644
--- a/gcc/config/arm/mve.md
+++ b/gcc/config/arm/mve.md
@@ -145,7 +145,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_mnemo>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -159,7 +160,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -173,7 +175,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "v<absneg_str>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_v<absneg_str>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -187,7 +190,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -201,7 +205,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 ;;
 ;; [vcvttq_f32_f16])
@@ -214,7 +219,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtt.f32.f16\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f32_f16v4sf"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -228,7 +234,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtb.f32.f16\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f32_f16v4sf"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -242,7 +249,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvt.f%#<V_sz_elem>.<supf>%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_to_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -256,7 +264,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -270,7 +279,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_from_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -284,7 +294,8 @@
   ]
   "TARGET_HAVE_MVE"
   "v<absneg_str>.s%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_v<absneg_str>q_s<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -297,7 +308,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vmvn\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmvnq_u<mode>"))
+  (set_attr "type" "mve_move")
 ])
 (define_expand "mve_vmvnq_s<mode>"
   [
@@ -318,7 +330,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -331,7 +344,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vclz.i%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vclzq_s<mode>"))
+  (set_attr "type" "mve_move")
 ])
 (define_expand "mve_vclzq_u<mode>"
   [
@@ -354,7 +368,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -368,7 +383,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -382,7 +398,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -397,7 +414,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -411,7 +429,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtp.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtpq_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -425,7 +444,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtn.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtnq_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -439,7 +459,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtm.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtmq_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -453,7 +474,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvta.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtaq_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -467,7 +489,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.i%#<V_sz_elem>\t%q0, %1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -481,7 +504,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<V_sz_elem>\t%q0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -495,7 +519,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>32\t%Q0, %R0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -509,7 +534,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vctp.<MVE_vctp>\t%1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vctp<MVE_vctp>q<MVE_vpred>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -538,7 +564,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -553,7 +580,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvt.f<V_sz_elem>.<supf><V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_to_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;; [vcreateq_f])
@@ -599,7 +627,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf><V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;; Versions that take constant vectors as operand 2 (with all elements
@@ -647,7 +676,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvt.<supf><V_sz_elem>.f<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_from_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -662,8 +692,9 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>32\t%Q0, %R0, %q1"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [vcmpneq_, vcmpcsq_, vcmpeqq_, vcmpgeq_, vcmpgtq_, vcmphiq_, vcmpleq_, vcmpltq_])
@@ -676,7 +707,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vcmp.<mve_cmp_type>%#<V_sz_elem>\t<mve_cmp_op>, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op>q_<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -691,7 +723,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vcmp.<mve_cmp_type>%#<V_sz_elem>	<mve_cmp_op>, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op>q_n_<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -722,7 +755,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -739,7 +773,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.i%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -754,7 +789,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -769,7 +805,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%0, %q1"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -789,7 +826,8 @@
   "@
    vand\t%q0, %q1, %q2
    * return neon_output_logic_immediate (\"vand\", &operands[2], <MODE>mode, 1, VALID_NEON_QREG_MODE (<MODE>mode));"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vandq_u<mode>"))
+  (set_attr "type" "mve_move")
 ])
 (define_expand "mve_vandq_s<mode>"
   [
@@ -811,7 +849,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vbic\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbicq_u<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 (define_expand "mve_vbicq_s<mode>"
@@ -835,7 +874,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -850,7 +890,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vcadd.i%#<V_sz_elem>	%q0, %q1, %q2, #<rot>"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcaddq<mve_rot><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;; Auto vectorizer pattern for int vcadd
@@ -873,7 +914,8 @@
   ]
   "TARGET_HAVE_MVE"
   "veor\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_veorq_u<mode>"))
+  (set_attr "type" "mve_move")
 ])
 (define_expand "mve_veorq_s<mode>"
   [
@@ -901,7 +943,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -916,7 +959,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vhcadd.s%#<V_sz_elem>\t%q0, %q1, %q2, #270"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhcaddq_rot270_s<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -931,7 +975,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vhcadd.s%#<V_sz_elem>\t%q0, %q1, %q2, #90"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhcaddq_rot90_s<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -947,7 +992,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.s%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -962,7 +1008,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<max_min_su_str>.<max_min_supf>%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<max_min_su_str>q_<max_min_supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 
@@ -981,7 +1028,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -999,7 +1047,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1014,7 +1063,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vmullb.<supf>%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmullbq_int_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1029,7 +1079,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vmullt.<supf>%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulltq_int_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1045,7 +1096,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_addsubmul>.i%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_addsubmul>q<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1059,7 +1111,8 @@
   ]
   "TARGET_HAVE_MVE"
    "vorn\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_s<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 (define_expand "mve_vornq_u<mode>"
@@ -1088,8 +1141,10 @@
   "@
    vorr\t%q0, %q1, %q2
    * return neon_output_logic_immediate (\"vorr\", &operands[2], <MODE>mode, 0, VALID_NEON_QREG_MODE (<MODE>mode));"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vorrq_s<mode>"))
+  (set_attr "type" "mve_move")
 ])
+
 (define_expand "mve_vorrq_u<mode>"
   [
    (set (match_operand:MVE_2 0 "s_register_operand")
@@ -1112,7 +1167,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1128,7 +1184,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1144,7 +1201,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_r_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1159,7 +1217,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1174,7 +1233,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.f%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1189,7 +1249,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>32\t%Q0, %R0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1206,7 +1267,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.f%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1220,7 +1282,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vand\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vandq_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1234,7 +1297,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vbic\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbicq_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1249,7 +1313,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcadd.f%#<V_sz_elem>	%q0, %q1, %q2, #<rot>"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcaddq<mve_rot><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1263,7 +1328,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcmp.f%#<V_sz_elem>	<mve_cmp_op>, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1278,7 +1344,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcmp.f%#<V_sz_elem>	<mve_cmp_op>, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1293,7 +1360,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcmul.f%#<V_sz_elem>	%q0, %q1, %q2, #<rot>"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmulq<mve_rot><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1308,8 +1376,10 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vctpt.<MVE_vctp>\t%1"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vctp<MVE_vctp>q<MVE_vpred>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")
+])
 
 ;;
 ;; [vcvtbq_f16_f32])
@@ -1323,7 +1393,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtb.f16.f32\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f16_f32v8hf"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1338,7 +1409,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vcvtt.f16.f32\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f16_f32v8hf"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1386,7 +1458,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.f%#<V_sz_elem>\t%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1401,7 +1474,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<max_min_f_str>.f%#<V_sz_elem>	%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<max_min_f_str>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1419,7 +1493,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1439,7 +1514,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<isu>%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1455,7 +1531,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_addsubmul>.f%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_addsubmul>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1469,7 +1546,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vorn\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1483,7 +1561,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vorr\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vorrq_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1499,7 +1578,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.i%#<V_sz_elem>	%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1515,7 +1595,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.s%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1531,7 +1612,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.s%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1549,7 +1631,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>32\t%Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1565,7 +1648,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1580,7 +1664,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vmullt.p%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulltq_poly_p<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1595,7 +1680,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vmullb.p%#<V_sz_elem>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmullbq_poly_p<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1616,8 +1702,10 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmpt.f%#<V_sz_elem>\t<mve_cmp_op1>, %q1, %q2"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op1>q_f<mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
+
 ;;
 ;; [vcvtaq_m_u, vcvtaq_m_s])
 ;;
@@ -1631,8 +1718,10 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtat.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtaq_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
+
 ;;
 ;; [vcvtq_m_to_f_s, vcvtq_m_to_f_u])
 ;;
@@ -1646,8 +1735,9 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtt.f%#<V_sz_elem>.<supf>%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_to_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [vqrshrnbq_n_u, vqrshrnbq_n_s]
@@ -1673,7 +1763,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<isu>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1692,7 +1783,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>32\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1708,7 +1800,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1754,7 +1847,10 @@
 		   (match_dup 4)]
 	VSHLCQ))]
  "TARGET_HAVE_MVE"
- "vshlc\t%q0, %1, %4")
+ "vshlc\t%q0, %1, %4"
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlcq_<supf><mode>"))
+  (set_attr "type" "mve_move")
+])
 
 ;;
 ;; [vabsq_m_s]
@@ -1774,7 +1870,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<isu>%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1790,7 +1887,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1813,7 +1911,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.<isu>%#<V_sz_elem>\t<mve_cmp_op1>, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op1>q_n_<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1836,7 +1935,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcmpt.<isu>%#<V_sz_elem>\t<mve_cmp_op1>, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op1>q_<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1852,7 +1952,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1869,7 +1970,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.s%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1888,7 +1990,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1907,7 +2010,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1926,7 +2030,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1947,7 +2052,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -1963,7 +2069,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -1979,7 +2086,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2002,7 +2110,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.s%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2019,7 +2128,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2036,7 +2146,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_r_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2052,7 +2163,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2068,7 +2180,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2084,7 +2197,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2107,7 +2221,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_mnemo>t.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2123,7 +2238,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>32\t%Q0, %R0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 ;;
 ;; [vcmlaq, vcmlaq_rot90, vcmlaq_rot180, vcmlaq_rot270])
@@ -2141,7 +2257,8 @@
   "@
    vcmul.f%#<V_sz_elem>	%q0, %q2, %q3, #<rot>
    vcmla.f%#<V_sz_elem>	%q0, %q2, %q3, #<rot>"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmlaq<mve_rot><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2162,7 +2279,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmpt.f%#<V_sz_elem>\t<mve_cmp_op1>, %q1, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmp<mve_cmp_op1>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2178,7 +2296,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtbt.f16.f32\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f16_f32v8hf"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2194,7 +2313,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtbt.f32.f16\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f32_f16v4sf"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2210,7 +2330,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvttt.f16.f32\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f16_f32v8hf"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2226,8 +2347,9 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvttt.f32.f16\t%q0, %q2"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f32_f16v4sf"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [vdupq_m_n_f])
@@ -2242,7 +2364,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2259,7 +2382,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.f%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2276,7 +2400,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>.f%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2293,7 +2418,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2312,7 +2438,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.f%#<V_sz_elem>\t%0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2331,7 +2458,8 @@
   ]
   "TARGET_HAVE_MVE"
   "<mve_insn>.<supf>%#<V_sz_elem>\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2350,7 +2478,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2367,7 +2496,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2388,7 +2518,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<isu>%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2404,7 +2535,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.i%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2421,7 +2553,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.i%#<V_sz_elem>\t%q0, %2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2437,7 +2570,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "<mve_insn>\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -2453,7 +2587,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2469,7 +2604,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2485,7 +2621,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2504,7 +2641,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>32\t%Q0, %R0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2520,7 +2658,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtmt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtmq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2536,7 +2675,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtpt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtpq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2552,7 +2692,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtnt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtnq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2569,7 +2710,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_from_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2585,7 +2727,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2601,8 +2744,9 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtt.<supf>%#<V_sz_elem>.f%#<V_sz_elem>\t%q0, %q2"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_from_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [vabavq_p_s, vabavq_p_u])
@@ -2618,7 +2762,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "8")])
 
 ;;
@@ -2635,8 +2780,9 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\n\t<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
-   (set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length" "8")])
 
 ;;
 ;; [vsriq_m_n_s, vsriq_m_n_u])
@@ -2652,8 +2798,9 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
-   (set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length" "8")])
 
 ;;
 ;; [vcvtq_m_n_to_f_u, vcvtq_m_n_to_f_s])
@@ -2669,7 +2816,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcvtt.f%#<V_sz_elem>.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_to_f_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2709,7 +2857,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2728,8 +2877,9 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.i%#<V_sz_elem>	%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [vaddq_m_u, vaddq_m_s]
@@ -2747,7 +2897,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.i%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2767,7 +2918,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2784,8 +2936,9 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [vcaddq_rot270_m_u, vcaddq_rot270_m_s])
@@ -2801,7 +2954,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcaddt.i%#<V_sz_elem>	%q0, %q2, %q3, #270"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcaddq_rot270<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2818,7 +2972,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vcaddt.i%#<V_sz_elem>	%q0, %q2, %q3, #90"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcaddq_rot90<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2846,7 +3001,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2867,7 +3023,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2884,7 +3041,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmullbt.<supf>%#<V_sz_elem>	%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmullbq_int_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2901,7 +3059,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmulltt.<supf>%#<V_sz_elem>	%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulltq_int_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2918,7 +3077,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vornt\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2936,7 +3096,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2954,7 +3115,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2971,7 +3133,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -2988,7 +3151,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vhcaddt.s%#<V_sz_elem>\t%q0, %q2, %q3, #270"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhcaddq_rot270_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3005,7 +3169,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vhcaddt.s%#<V_sz_elem>\t%q0, %q2, %q3, #90"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhcaddq_rot90_s<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3025,7 +3190,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3053,7 +3219,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<isu>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3073,7 +3240,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>32\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3091,7 +3259,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.<supf>%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3125,7 +3294,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vmulltt.p%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulltq_poly_p<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3143,7 +3313,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.s%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3161,7 +3332,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;<mve_insn>t.s%#<V_sz_elem>\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_<supf><mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3185,7 +3357,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.f%#<V_sz_elem>	%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3206,7 +3379,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.f%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3226,7 +3400,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3243,7 +3418,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;<mve_insn>t.%#<V_sz_elem>\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_<mve_insn>q_n_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3260,7 +3436,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcaddt.f%#<V_sz_elem>	%q0, %q2, %q3, #270"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcaddq_rot270<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3277,7 +3454,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcaddt.f%#<V_sz_elem>	%q0, %q2, %q3, #90"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcaddq_rot90<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3294,7 +3472,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmlat.f%#<V_sz_elem>	%q0, %q2, %q3, #0"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmlaq<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3311,7 +3490,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmlat.f%#<V_sz_elem>	%q0, %q2, %q3, #180"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmlaq_rot180<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3328,7 +3508,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmlat.f%#<V_sz_elem>	%q0, %q2, %q3, #270"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmlaq_rot270<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3345,7 +3526,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmlat.f%#<V_sz_elem>	%q0, %q2, %q3, #90"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmlaq_rot90<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3362,7 +3544,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmult.f%#<V_sz_elem>	%q0, %q2, %q3, #0"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmulq<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3379,7 +3562,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmult.f%#<V_sz_elem>	%q0, %q2, %q3, #180"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmulq_rot180<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3396,7 +3580,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmult.f%#<V_sz_elem>	%q0, %q2, %q3, #270"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmulq_rot270<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3413,7 +3598,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmult.f%#<V_sz_elem>	%q0, %q2, %q3, #90"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmulq_rot90<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3430,7 +3616,8 @@
   ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vornt\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_f<mode>"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -3450,7 +3637,8 @@
    output_asm_insn("vstrb.<V_sz_elem>\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrbq_scatter_offset_s vstrbq_scatter_offset_u]
@@ -3478,7 +3666,8 @@
 	  VSTRBSOQ))]
   "TARGET_HAVE_MVE"
   "vstrb.<V_sz_elem>\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_scatter_offset_<supf><mode>_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_base_s vstrwq_scatter_base_u]
@@ -3500,7 +3689,8 @@
    output_asm_insn("vstrw.u32\t%q2, [%q0, %1]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_<supf>v4si"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrbq_gather_offset_s vldrbq_gather_offset_u]
@@ -3523,7 +3713,8 @@
      output_asm_insn ("vldrb.<supf><V_sz_elem>\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_gather_offset_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrbq_s vldrbq_u]
@@ -3545,7 +3736,8 @@
      output_asm_insn ("vldrb.<supf><V_sz_elem>\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_base_s vldrwq_gather_base_u]
@@ -3565,7 +3757,8 @@
    output_asm_insn ("vldrw.u32\t%q0, [%q1, %2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_<supf>v4si"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrbq_scatter_offset_p_s vstrbq_scatter_offset_p_u]
@@ -3597,7 +3790,8 @@
 	  VSTRBSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrbt.<V_sz_elem>\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_scatter_offset_<supf><mode>_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_base_p_s vstrwq_scatter_base_p_u]
@@ -3620,7 +3814,8 @@
    output_asm_insn ("vpst\n\tvstrwt.u32\t%q2, [%q0, %1]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_<supf>v4si"))
+  (set_attr "length" "8")])
 
 (define_insn "mve_vstrbq_p_<supf><mode>"
   [(set (match_operand:<MVE_B_ELEM> 0 "mve_memory_operand" "=Ux")
@@ -3638,7 +3833,8 @@
    output_asm_insn ("vpst\;vstrbt.<V_sz_elem>\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrbq_gather_offset_z_s vldrbq_gather_offset_z_u]
@@ -3663,7 +3859,8 @@
      output_asm_insn ("vpst\n\tvldrbt.<supf><V_sz_elem>\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_gather_offset_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrbq_z_s vldrbq_z_u]
@@ -3686,7 +3883,8 @@
      output_asm_insn ("vpst\;vldrbt.<supf><V_sz_elem>\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_base_z_s vldrwq_gather_base_z_u]
@@ -3707,7 +3905,8 @@
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%q1, %2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_<supf>v4si"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_f]
@@ -3726,7 +3925,8 @@
    output_asm_insn ("vldrh.16\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_fv8hf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_gather_offset_s vldrhq_gather_offset_u]
@@ -3749,7 +3949,8 @@
      output_asm_insn ("vldrh.<supf><V_sz_elem>\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_gather_offset_z_s vldrhq_gather_offset_z_u]
@@ -3774,7 +3975,8 @@
      output_asm_insn ("vpst\n\tvldrht.<supf><V_sz_elem>\t%q0, [%m1, %q2]",ops);
    return "";
 }
- [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_gather_shifted_offset_s vldrhq_gather_shifted_offset_u]
@@ -3797,7 +3999,8 @@
      output_asm_insn ("vldrh.<supf><V_sz_elem>\t%q0, [%m1, %q2, uxtw #1]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_gather_shifted_offset_z_s vldrhq_gather_shited_offset_z_u]
@@ -3822,7 +4025,8 @@
      output_asm_insn ("vpst\n\tvldrht.<supf><V_sz_elem>\t%q0, [%m1, %q2, uxtw #1]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_s, vldrhq_u]
@@ -3844,7 +4048,8 @@
      output_asm_insn ("vldrh.<supf><V_sz_elem>\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_z_f]
@@ -3864,7 +4069,8 @@
    output_asm_insn ("vpst\;vldrht.16\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_fv8hf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_z_s vldrhq_z_u]
@@ -3887,7 +4093,8 @@
      output_asm_insn ("vpst\;vldrht.<supf><V_sz_elem>\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_f]
@@ -3906,7 +4113,8 @@
    output_asm_insn ("vldrw.32\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_s vldrwq_u]
@@ -3925,7 +4133,8 @@
    output_asm_insn ("vldrw.32\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_<supf>v4si"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_z_f]
@@ -3945,7 +4154,8 @@
    output_asm_insn ("vpst\;vldrwt.32\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_z_s vldrwq_z_u]
@@ -3965,7 +4175,8 @@
    output_asm_insn ("vpst\;vldrwt.32\t%q0, %E1",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_<supf>v4si"))
+  (set_attr "length" "8")])
 
 (define_expand "mve_vld1q_f<mode>"
   [(match_operand:MVE_0 0 "s_register_operand")
@@ -4005,7 +4216,8 @@
    output_asm_insn ("vldrd.64\t%q0, [%q1, %2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_<supf>v2di"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrdq_gather_base_z_s vldrdq_gather_base_z_u]
@@ -4026,7 +4238,8 @@
    output_asm_insn ("vpst\n\tvldrdt.u64\t%q0, [%q1, %2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_<supf>v2di"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrdq_gather_offset_s vldrdq_gather_offset_u]
@@ -4046,7 +4259,8 @@
   output_asm_insn ("vldrd.u64\t%q0, [%m1, %q2]",ops);
   return "";
 }
- [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_offset_<supf>v2di"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrdq_gather_offset_z_s vldrdq_gather_offset_z_u]
@@ -4067,7 +4281,8 @@
   output_asm_insn ("vpst\n\tvldrdt.u64\t%q0, [%m1, %q2]",ops);
   return "";
 }
- [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_offset_<supf>v2di"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrdq_gather_shifted_offset_s vldrdq_gather_shifted_offset_u]
@@ -4087,7 +4302,8 @@
    output_asm_insn ("vldrd.u64\t%q0, [%m1, %q2, uxtw #3]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_shifted_offset_<supf>v2di"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrdq_gather_shifted_offset_z_s vldrdq_gather_shifted_offset_z_u]
@@ -4108,7 +4324,8 @@
    output_asm_insn ("vpst\n\tvldrdt.u64\t%q0, [%m1, %q2, uxtw #3]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_shifted_offset_<supf>v2di"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_gather_offset_f]
@@ -4128,7 +4345,8 @@
    output_asm_insn ("vldrh.f16\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_fv8hf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_gather_offset_z_f]
@@ -4150,7 +4368,8 @@
    output_asm_insn ("vpst\n\tvldrht.f16\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_fv8hf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_gather_shifted_offset_f]
@@ -4170,7 +4389,8 @@
    output_asm_insn ("vldrh.f16\t%q0, [%m1, %q2, uxtw #1]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_fv8hf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_gather_shifted_offset_z_f]
@@ -4192,7 +4412,8 @@
    output_asm_insn ("vpst\n\tvldrht.f16\t%q0, [%m1, %q2, uxtw #1]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_fv8hf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_base_f]
@@ -4212,7 +4433,8 @@
    output_asm_insn ("vldrw.u32\t%q0, [%q1, %2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_base_z_f]
@@ -4233,7 +4455,8 @@
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%q1, %2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_offset_f]
@@ -4253,7 +4476,8 @@
    output_asm_insn ("vldrw.u32\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_offset_s vldrwq_gather_offset_u]
@@ -4273,7 +4497,8 @@
    output_asm_insn ("vldrw.u32\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_<supf>v4si"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_offset_z_f]
@@ -4295,7 +4520,8 @@
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_offset_z_s vldrwq_gather_offset_z_u]
@@ -4317,7 +4543,8 @@
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_<supf>v4si"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_shifted_offset_f]
@@ -4337,7 +4564,8 @@
    output_asm_insn ("vldrw.u32\t%q0, [%m1, %q2, uxtw #2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_shifted_offset_s vldrwq_gather_shifted_offset_u]
@@ -4357,7 +4585,8 @@
    output_asm_insn ("vldrw.u32\t%q0, [%m1, %q2, uxtw #2]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_<supf>v4si"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_shifted_offset_z_f]
@@ -4379,7 +4608,8 @@
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2, uxtw #2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_shifted_offset_z_s vldrwq_gather_shifted_offset_z_u]
@@ -4401,7 +4631,8 @@
    output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2, uxtw #2]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_<supf>v4si"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_f]
@@ -4420,7 +4651,8 @@
    output_asm_insn ("vstrh.16\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_fv8hf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_p_f]
@@ -4441,7 +4673,8 @@
    output_asm_insn ("vpst\;vstrht.16\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_fv8hf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_p_s vstrhq_p_u]
@@ -4463,7 +4696,8 @@
    output_asm_insn ("vpst\;vstrht.<V_sz_elem>\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_<supf><mode>"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_scatter_offset_p_s vstrhq_scatter_offset_p_u]
@@ -4495,7 +4729,8 @@
 	  VSTRHSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrht.<V_sz_elem>\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset_<supf><mode>_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_scatter_offset_s vstrhq_scatter_offset_u]
@@ -4523,7 +4758,8 @@
 	  VSTRHSOQ))]
   "TARGET_HAVE_MVE"
   "vstrh.<V_sz_elem>\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset_<supf><mode>_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_scatter_shifted_offset_p_s vstrhq_scatter_shifted_offset_p_u]
@@ -4555,7 +4791,8 @@
 	  VSTRHSSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrht.<V_sz_elem>\t%q2, [%0, %q1, uxtw #1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset_<supf><mode>_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_scatter_shifted_offset_s vstrhq_scatter_shifted_offset_u]
@@ -4584,7 +4821,8 @@
 	  VSTRHSSOQ))]
   "TARGET_HAVE_MVE"
   "vstrh.<V_sz_elem>\t%q2, [%0, %q1, uxtw #1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset_<supf><mode>_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_s, vstrhq_u]
@@ -4603,7 +4841,8 @@
    output_asm_insn ("vstrh.<V_sz_elem>\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_<supf><mode>"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_f]
@@ -4622,7 +4861,8 @@
    output_asm_insn ("vstrw.32\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_p_f]
@@ -4643,7 +4883,8 @@
    output_asm_insn ("vpst\;vstrwt.32\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_p_s vstrwq_p_u]
@@ -4664,7 +4905,8 @@
    output_asm_insn ("vpst\;vstrwt.32\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_<supf>v4si"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_s vstrwq_u]
@@ -4683,7 +4925,8 @@
    output_asm_insn ("vstrw.32\t%q1, %E0",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_<supf>v4si"))
+  (set_attr "length" "4")])
 
 (define_expand "mve_vst1q_f<mode>"
   [(match_operand:<MVE_CNVT> 0 "mve_memory_operand")
@@ -4726,7 +4969,8 @@
    output_asm_insn ("vpst\;\tvstrdt.u64\t%q2, [%q0, %1]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_<supf>v2di"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrdq_scatter_base_s vstrdq_scatter_base_u]
@@ -4748,7 +4992,8 @@
    output_asm_insn ("vstrd.u64\t%q2, [%q0, %1]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_<supf>v2di"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrdq_scatter_offset_p_s vstrdq_scatter_offset_p_u]
@@ -4779,7 +5024,8 @@
 	  VSTRDSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrdt.64\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_offset_<supf>v2di_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrdq_scatter_offset_s vstrdq_scatter_offset_u]
@@ -4807,7 +5053,8 @@
 	  VSTRDSOQ))]
   "TARGET_HAVE_MVE"
   "vstrd.64\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_offset_<supf>v2di_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrdq_scatter_shifted_offset_p_s vstrdq_scatter_shifted_offset_p_u]
@@ -4839,7 +5086,8 @@
 	  VSTRDSSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrdt.64\t%q2, [%0, %q1, uxtw #3]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_shifted_offset_<supf>v2di_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrdq_scatter_shifted_offset_s vstrdq_scatter_shifted_offset_u]
@@ -4868,7 +5116,8 @@
 	  VSTRDSSOQ))]
   "TARGET_HAVE_MVE"
   "vstrd.64\t%q2, [%0, %q1, uxtw #3]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_shifted_offset_<supf>v2di_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_scatter_offset_f]
@@ -4896,7 +5145,8 @@
 	  VSTRHQSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vstrh.16\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset_fv8hf_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_scatter_offset_p_f]
@@ -4927,7 +5177,8 @@
 	  VSTRHQSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vstrht.16\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset_fv8hf_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_scatter_shifted_offset_f]
@@ -4955,7 +5206,8 @@
 	  VSTRHQSSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vstrh.16\t%q2, [%0, %q1, uxtw #1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset_fv8hf_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_scatter_shifted_offset_p_f]
@@ -4987,7 +5239,8 @@
 	  VSTRHQSSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vstrht.16\t%q2, [%0, %q1, uxtw #1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset_fv8hf_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_base_f]
@@ -5009,7 +5262,8 @@
    output_asm_insn ("vstrw.u32\t%q2, [%q0, %1]",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_base_p_f]
@@ -5032,7 +5286,8 @@
    output_asm_insn ("vpst\n\tvstrwt.u32\t%q2, [%q0, %1]",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_offset_f]
@@ -5060,7 +5315,8 @@
 	  VSTRWQSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vstrw.32\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_fv4sf_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_offset_p_f]
@@ -5091,7 +5347,8 @@
 	  VSTRWQSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vstrwt.32\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_fv4sf_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_offset_s vstrwq_scatter_offset_u]
@@ -5122,7 +5379,8 @@
 	  VSTRWSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrwt.32\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_<supf>v4si_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_offset_s vstrwq_scatter_offset_u]
@@ -5150,7 +5408,8 @@
 	  VSTRWSOQ))]
   "TARGET_HAVE_MVE"
   "vstrw.32\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_<supf>v4si_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_shifted_offset_f]
@@ -5178,7 +5437,8 @@
 	 VSTRWQSSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vstrw.32\t%q2, [%0, %q1, uxtw #2]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_fv4sf_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_shifted_offset_p_f]
@@ -5210,7 +5470,8 @@
 	  VSTRWQSSO_F))]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vstrwt.32\t%q2, [%0, %q1, uxtw #2]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_fv4sf_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_shifted_offset_p_s vstrwq_scatter_shifted_offset_p_u]
@@ -5242,7 +5503,8 @@
 	  VSTRWSSOQ))]
   "TARGET_HAVE_MVE"
   "vpst\;vstrwt.32\t%q2, [%0, %q1, uxtw #2]"
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_<supf>v4si_insn"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_shifted_offset_s vstrwq_scatter_shifted_offset_u]
@@ -5271,7 +5533,8 @@
 	  VSTRWSSOQ))]
   "TARGET_HAVE_MVE"
   "vstrw.32\t%q2, [%0, %q1, uxtw #2]"
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_<supf>v4si_insn"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vidupq_n_u])
@@ -5339,7 +5602,8 @@
 		(match_operand:SI 6 "immediate_operand" "i")))]
  "TARGET_HAVE_MVE"
  "vpst\;\tvidupt.u%#<V_sz_elem>\t%q0, %2, %4"
- [(set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vidupq_u<mode>_insn"))
+  (set_attr "length""8")])
 
 ;;
 ;; [vddupq_n_u])
@@ -5407,7 +5671,8 @@
 		 (match_operand:SI 6 "immediate_operand" "i")))]
  "TARGET_HAVE_MVE"
  "vpst\;vddupt.u%#<V_sz_elem>\t%q0, %2, %4"
- [(set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vddupq_u<mode>_insn"))
+  (set_attr "length""8")])
 
 ;;
 ;; [vdwdupq_n_u])
@@ -5523,8 +5788,9 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vdwdupt.u%#<V_sz_elem>\t%q2, %3, %R4, %5"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vdwdupq_wb_u<mode>_insn"))
+  (set_attr "type" "mve_move")
+  (set_attr "length""8")])
 
 ;;
 ;; [viwdupq_n_u])
@@ -5640,7 +5906,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;\tviwdupt.u%#<V_sz_elem>\t%q2, %3, %R4, %5"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_viwdupq_wb_u<mode>_insn"))
+  (set_attr "type" "mve_move")
    (set_attr "length""8")])
 
 ;;
@@ -5666,7 +5933,8 @@
    output_asm_insn ("vstrw.u32\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_<supf>v4si"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_base_wb_p_s vstrwq_scatter_base_wb_p_u]
@@ -5692,7 +5960,8 @@
    output_asm_insn ("vpst\;\tvstrwt.u32\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_<supf>v4si"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_base_wb_f]
@@ -5717,7 +5986,8 @@
    output_asm_insn ("vstrw.u32\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_fv4sf"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_base_wb_p_f]
@@ -5743,7 +6013,8 @@
    output_asm_insn ("vpst\;vstrwt.u32\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_fv4sf"))
+  (set_attr "length" "8")])
 
 ;;
 ;; [vstrdq_scatter_base_wb_s vstrdq_scatter_base_wb_u]
@@ -5768,7 +6039,8 @@
    output_asm_insn ("vstrd.u64\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_wb_<supf>v2di"))
+  (set_attr "length" "4")])
 
 ;;
 ;; [vstrdq_scatter_base_wb_p_s vstrdq_scatter_base_wb_p_u]
@@ -5794,7 +6066,8 @@
    output_asm_insn ("vpst\;vstrdt.u64\t%q2, [%q0, %1]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_wb_<supf>v2di"))
+  (set_attr "length" "8")])
 
 (define_expand "mve_vldrwq_gather_base_wb_<supf>v4si"
   [(match_operand:V4SI 0 "s_register_operand")
@@ -5846,7 +6119,8 @@
    output_asm_insn ("vldrw.u32\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_<supf>v4si_insn"))
+  (set_attr "length" "4")])
 
 (define_expand "mve_vldrwq_gather_base_wb_z_<supf>v4si"
   [(match_operand:V4SI 0 "s_register_operand")
@@ -5902,7 +6176,8 @@
    output_asm_insn ("vpst\;vldrwt.u32\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_<supf>v4si_insn"))
+  (set_attr "length" "8")])
 
 (define_expand "mve_vldrwq_gather_base_wb_fv4sf"
   [(match_operand:V4SI 0 "s_register_operand")
@@ -5954,7 +6229,8 @@
    output_asm_insn ("vldrw.u32\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_fv4sf_insn"))
+  (set_attr "length" "4")])
 
 (define_expand "mve_vldrwq_gather_base_wb_z_fv4sf"
   [(match_operand:V4SI 0 "s_register_operand")
@@ -6011,7 +6287,8 @@
    output_asm_insn ("vpst\;vldrwt.u32\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_fv4sf_insn"))
+  (set_attr "length" "8")])
 
 (define_expand "mve_vldrdq_gather_base_wb_<supf>v2di"
   [(match_operand:V2DI 0 "s_register_operand")
@@ -6064,7 +6341,8 @@
    output_asm_insn ("vldrd.64\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "4")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_wb_<supf>v2di_insn"))
+  (set_attr "length" "4")])
 
 (define_expand "mve_vldrdq_gather_base_wb_z_<supf>v2di"
   [(match_operand:V2DI 0 "s_register_operand")
@@ -6103,7 +6381,7 @@
    (unspec_volatile:SI [(reg:SI VFPCC_REGNUM)] UNSPEC_GET_FPSCR_NZCVQC))]
  "TARGET_HAVE_MVE"
  "vmrs\\t%0, FPSCR_nzcvqc"
-  [(set_attr "type" "mve_move")])
+ [(set_attr "type" "mve_move")])
 
 (define_insn "set_fpscr_nzcvqc"
  [(set (reg:SI VFPCC_REGNUM)
@@ -6111,7 +6389,7 @@
     VUNSPEC_SET_FPSCR_NZCVQC))]
  "TARGET_HAVE_MVE"
  "vmsr\\tFPSCR_nzcvqc, %0"
-  [(set_attr "type" "mve_move")])
+ [(set_attr "type" "mve_move")])
 
 ;;
 ;; [vldrdq_gather_base_wb_z_s vldrdq_gather_base_wb_z_u]
@@ -6136,7 +6414,8 @@
    output_asm_insn ("vpst\;vldrdt.u64\t%q0, [%q1, %2]!",ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_wb_<supf>v2di_insn"))
+  (set_attr "length" "8")])
 ;;
 ;; [vadciq_m_s, vadciq_m_u])
 ;;
@@ -6153,7 +6432,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vadcit.i32\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadciq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "8")])
 
 ;;
@@ -6170,7 +6450,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vadci.i32\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadciq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "4")])
 
 ;;
@@ -6189,7 +6470,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vadct.i32\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadcq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "8")])
 
 ;;
@@ -6206,7 +6488,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vadc.i32\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadcq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "4")
    (set_attr "conds" "set")])
 
@@ -6226,7 +6509,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vsbcit.i32\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbciq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "8")])
 
 ;;
@@ -6243,7 +6527,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vsbci.i32\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbciq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "4")])
 
 ;;
@@ -6262,7 +6547,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vpst\;vsbct.i32\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbcq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "8")])
 
 ;;
@@ -6279,7 +6565,8 @@
   ]
   "TARGET_HAVE_MVE"
   "vsbc.i32\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbcq_<supf>v4si"))
+  (set_attr "type" "mve_move")
    (set_attr "length" "4")])
 
 ;;
@@ -6308,7 +6595,7 @@
 		    "vst21.<V_sz_elem>\t{%q0, %q1}, %3", ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set_attr "length" "8")])
 
 ;;
 ;; [vld2q])
@@ -6336,7 +6623,7 @@
 		    "vld21.<V_sz_elem>\t{%q0, %q1}, %3", ops);
    return "";
 }
-  [(set_attr "length" "8")])
+ [(set_attr "length" "8")])
 
 ;;
 ;; [vld4q])
@@ -6679,7 +6966,8 @@
  ]
  "TARGET_HAVE_MVE"
  "vpst\;vshlct\t%q0, %1, %4"
- [(set_attr "type" "mve_move")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlcq_<supf><mode>"))
+  (set_attr "type" "mve_move")
   (set_attr "length" "8")])
 
 ;; CDE instructions on MVE registers.
@@ -6691,7 +6979,8 @@
 	 UNSPEC_VCDE))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx1\\tp%c1, %q0, #%c2"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx1qv16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx1qav16qi"
@@ -6702,7 +6991,8 @@
 	 UNSPEC_VCDEA))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx1a\\tp%c1, %q0, #%c3"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx1qav16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx2qv16qi"
@@ -6713,7 +7003,8 @@
 	 UNSPEC_VCDE))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx2\\tp%c1, %q0, %q2, #%c3"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx2qv16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx2qav16qi"
@@ -6725,7 +7016,8 @@
 	 UNSPEC_VCDEA))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx2a\\tp%c1, %q0, %q3, #%c4"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx2qav16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx3qv16qi"
@@ -6737,7 +7029,8 @@
 	 UNSPEC_VCDE))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx3\\tp%c1, %q0, %q2, %q3, #%c4"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx3qv16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx3qav16qi"
@@ -6750,7 +7043,8 @@
 	 UNSPEC_VCDEA))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vcx3a\\tp%c1, %q0, %q3, %q4, #%c5"
-  [(set_attr "type" "coproc")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx3qav16qi"))
+  (set_attr "type" "coproc")]
 )
 
 (define_insn "arm_vcx1q<a>_p_v16qi"
@@ -6762,7 +7056,8 @@
 	 CDE_VCX))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vpst\;vcx1<a>t\\tp%c1, %q0, #%c3"
-  [(set_attr "type" "coproc")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx1q<a>v16qi"))
+  (set_attr "type" "coproc")
    (set_attr "length" "8")]
 )
 
@@ -6776,7 +7071,8 @@
 	 CDE_VCX))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vpst\;vcx2<a>t\\tp%c1, %q0, %q3, #%c4"
-  [(set_attr "type" "coproc")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx2q<a>v16qi"))
+  (set_attr "type" "coproc")
    (set_attr "length" "8")]
 )
 
@@ -6791,7 +7087,8 @@
 	 CDE_VCX))]
   "TARGET_CDE && TARGET_HAVE_MVE"
   "vpst\;vcx3<a>t\\tp%c1, %q0, %q3, %q4, #%c5"
-  [(set_attr "type" "coproc")
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx3q<a>v16qi"))
+  (set_attr "type" "coproc")
    (set_attr "length" "8")]
 )
 
diff --git a/gcc/config/arm/vec-common.md b/gcc/config/arm/vec-common.md
index 9af8429968d..45b6735b15c 100644
--- a/gcc/config/arm/vec-common.md
+++ b/gcc/config/arm/vec-common.md
@@ -366,7 +366,8 @@
   "@
    <mve_insn>.<supf>%#<V_sz_elem>\t%<V_reg>0, %<V_reg>1, %<V_reg>2
    * return neon_output_shift_immediate (\"vshl\", 'i', &operands[2], <MODE>mode, VALID_NEON_QREG_MODE (<MODE>mode), true);"
-  [(set_attr "type" "neon_shift_reg<q>, neon_shift_imm<q>")]
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlq_<supf><mode>"))
+  (set_attr "type" "neon_shift_reg<q>, neon_shift_imm<q>")]
 )
 
 (define_expand "vashl<mode>3"

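To make the intent of the new attribute concrete, here is a rough sketch (not
part of the patch) of how a consumer pass could map a predicated insn back to
its unpredicated twin using the symbol_refs added in the hunks above.  It
assumes "mve_unpredicated_insn" is declared as an integer define_attr whose
default is CODE_FOR_nothing, so that genattrtab emits the usual
get_attr_mve_unpredicated_insn accessor; the helper name below is made up
purely for illustration:

```c
/* Sketch only, under the assumptions stated above.  In a real GCC backend
   file the usual config.h/system.h/coretypes.h prologue precedes these.  */
#include "config.h"
#include "system.h"
#include "coretypes.h"
#include "backend.h"
#include "rtl.h"
#include "insn-codes.h"
#include "insn-attr.h"
#include "recog.h"

/* Return true if INSN is a (predicated) MVE insn whose unpredicated
   equivalent is the pattern with insn code UNPREDICATED_ICODE.
   Hypothetical helper, not taken from the patch.  */
static bool
example_matches_unpredicated_form (rtx_insn *insn, int unpredicated_icode)
{
  /* Only recognised non-debug insns carry attribute values.  */
  if (!NONDEBUG_INSN_P (insn) || recog_memoized (insn) < 0)
    return false;

  /* Insns that never set the attribute keep the default value
     (CODE_FOR_nothing), so they cannot match a real insn code.  The
     attribute stores the insn code of the equivalent unpredicated
     pattern, so recovering the mapping is a single comparison.  */
  return get_attr_mve_unpredicated_insn (insn) == unpredicated_icode;
}
```

The symbol_refs attached to each predicated pattern above exist precisely to
make a comparison like this possible, instead of having to pattern-match every
predicated form individually.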