* [integration v2 1/4] RISC-V/rvv: Added assembly pseudo and changed assembler mnemonics.
2021-10-05 12:51 [integration v2 0/4] RISC-V/rvv: Update rvv from v01.0 to v1.0 Nelson Chu
@ 2021-10-05 12:51 ` Nelson Chu
2021-10-05 12:51 ` [integration v2 2/4] RISC-V/rvv: Update constraints for widening and narrowing instructions Nelson Chu
From: Nelson Chu @ 2021-10-05 12:51 UTC (permalink / raw)
To: binutils, jimw, andrew
* Added a pseudo instruction:
- vfabs.v vd,vs = vfsgnjx.vv vd,vs,vs
* Changed assembler mnemonics; the older names are kept as aliases:
- Changed vle1.v to vlm.v, and vse1.v to vsm.v.
- Changed vfredsum and vfwredsum to vfredusum and vfwredusum, respectively.
- Changed vpopc.m to vcpop.m, to be consistent with the scalar instruction.
- Changed vmandnot.mm and vmornot.mm to vmandn.mm and vmorn.mm.
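The renames above change only the preferred spelling, not the encodings. A minimal Python sketch checks a few of the expected disassembly words from vector-insns.d against the MATCH/MASK pairs in this patch (the MATCH_VFSGNJXVV value is an assumption taken from the existing opcode table; it is not visible in these hunks):

```python
# MATCH/MASK pairs copied from include/opcode/riscv-opc-extended.h in this patch.
MATCH_VLMV,     MASK_VLMV     = 0x02b00007, 0xfff0707f
MATCH_VCPOPM,   MASK_VCPOPM   = 0x40082057, 0xfc0ff07f
MATCH_VMANDNMM, MASK_VMANDNMM = 0x62002057, 0xfe00707f
MATCH_VFSGNJXVV = 0x28001057   # assumed value; not shown in these hunks

def matches(insn, match, mask):
    # An opcode-table entry matches when the instruction's fixed bits
    # (selected by MASK) equal MATCH; operand fields are masked out.
    return (insn & mask) == match

# Instruction words taken from the expected disassembly in vector-insns.d:
assert matches(0x02b50207, MATCH_VLMV, MASK_VLMV)          # vlm.v v4,(a0)
assert matches(0x42c82557, MATCH_VCPOPM, MASK_VCPOPM)      # vcpop.m a0,v12
assert matches(0x62862257, MATCH_VMANDNMM, MASK_VMANDNMM)  # vmandn.mm v4,v8,v12

# vfabs.v vd,vs expands to vfsgnjx.vv vd,vs,vs (vs1 field == vs2 field):
def encode_vfsgnjx_vv(vd, vs2, vs1, masked=False):
    insn = MATCH_VFSGNJXVV | (vs2 << 20) | (vs1 << 15) | (vd << 7)
    return insn if masked else insn | (1 << 25)  # vm=1 means "unmasked"

assert encode_vfsgnjx_vv(4, 8, 8) == 0x2a841257  # vfabs.v v4,v8 in vector-insns.d
```

The aliases assemble to the same words, which is why the .d tests only needed their mnemonic columns updated.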
gas/
* testsuite/gas/riscv/extended/vector-insns-fail-arith-floatp.l: Updated.
* testsuite/gas/riscv/extended/vector-insns-fail-arith-floatp.s: Likewise.
* testsuite/gas/riscv/extended/vector-insns-vmsgtvx.d: Likewise.
* testsuite/gas/riscv/extended/vector-insns.d: Likewise.
* testsuite/gas/riscv/extended/vector-insns.s: Likewise.
include/
* opcode/riscv-opc-extended.h: Updated.
opcodes/
* riscv-opc.c: Added pseudo vfabs.v, and changed assembler mnemonics.
---
.../extended/vector-insns-fail-arith-floatp.l | 1 +
.../extended/vector-insns-fail-arith-floatp.s | 2 ++
.../gas/riscv/extended/vector-insns-vmsgtvx.d | 12 +++----
.../gas/riscv/extended/vector-insns.d | 36 ++++++++++++-------
.../gas/riscv/extended/vector-insns.s | 32 +++++++++++------
include/opcode/riscv-opc-extended.h | 28 +++++++--------
opcodes/riscv-opc.c | 23 ++++++++----
7 files changed, 85 insertions(+), 49 deletions(-)
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-floatp.l b/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-floatp.l
index bcc49a09080..9900dbb1e58 100644
--- a/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-floatp.l
+++ b/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-floatp.l
@@ -34,6 +34,7 @@
.*Error: illegal operands vd cannot overlap vm `vfmax.vv v0,v4,v8,v0.t'
.*Error: illegal operands vd cannot overlap vm `vfmax.vf v0,v4,fa1,v0.t'
.*Error: illegal operands vd cannot overlap vm `vfneg.v v0,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfabs.v v0,v4,v0.t'
.*Error: illegal operands vd cannot overlap vm `vfsgnj.vv v0,v4,v8,v0.t'
.*Error: illegal operands vd cannot overlap vm `vfsgnj.vf v0,v4,fa1,v0.t'
.*Error: illegal operands vd cannot overlap vm `vfsgnjn.vv v0,v4,v8,v0.t'
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-floatp.s b/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-floatp.s
index a48b1a3fd33..19ed26a95aa 100644
--- a/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-floatp.s
+++ b/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-floatp.s
@@ -120,6 +120,8 @@
vfneg.v v4, v4 # OK
vfneg.v v0, v4, v0.t # vd overlap vm
+ vfabs.v v4, v4 # OK
+ vfabs.v v0, v4, v0.t # vd overlap vm
vfsgnj.vv v4, v4, v8 # OK
vfsgnj.vv v8, v4, v8 # OK
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-vmsgtvx.d b/gas/testsuite/gas/riscv/extended/vector-insns-vmsgtvx.d
index 4d33fe7d596..dcc951a3cbf 100644
--- a/gas/testsuite/gas/riscv/extended/vector-insns-vmsgtvx.d
+++ b/gas/testsuite/gas/riscv/extended/vector-insns-vmsgtvx.d
@@ -12,18 +12,18 @@ Disassembly of section .text:
[ ]+[0-9a-f]+:[ ]+6cc64457[ ]+vmslt.vx[ ]+v8,v12,a2,v0.t
[ ]+[0-9a-f]+:[ ]+6e802457[ ]+vmxor.mm[ ]+v8,v8,v0
[ ]+[0-9a-f]+:[ ]+6c85c657[ ]+vmslt.vx[ ]+v12,v8,a1,v0.t
-[ ]+[0-9a-f]+:[ ]+62062057[ ]+vmandnot.mm[ ]+v0,v0,v12
+[ ]+[0-9a-f]+:[ ]+62062057[ ]+vmandn.mm[ ]+v0,v0,v12
[ ]+[0-9a-f]+:[ ]+6c85c657[ ]+vmslt.vx[ ]+v12,v8,a1,v0.t
-[ ]+[0-9a-f]+:[ ]+62062657[ ]+vmandnot.mm[ ]+v12,v0,v12
-[ ]+[0-9a-f]+:[ ]+62402257[ ]+vmandnot.mm[ ]+v4,v4,v0
+[ ]+[0-9a-f]+:[ ]+62062657[ ]+vmandn.mm[ ]+v12,v0,v12
+[ ]+[0-9a-f]+:[ ]+62402257[ ]+vmandn.mm[ ]+v4,v4,v0
[ ]+[0-9a-f]+:[ ]+6ac22257[ ]+vmor.mm[ ]+v4,v12,v4
[ ]+[0-9a-f]+:[ ]+6a85c257[ ]+vmsltu.vx[ ]+v4,v8,a1
[ ]+[0-9a-f]+:[ ]+76422257[ ]+vmnot.m[ ]+v4,v4
[ ]+[0-9a-f]+:[ ]+68c64457[ ]+vmsltu.vx[ ]+v8,v12,a2,v0.t
[ ]+[0-9a-f]+:[ ]+6e802457[ ]+vmxor.mm[ ]+v8,v8,v0
[ ]+[0-9a-f]+:[ ]+6885c657[ ]+vmsltu.vx[ ]+v12,v8,a1,v0.t
-[ ]+[0-9a-f]+:[ ]+62062057[ ]+vmandnot.mm[ ]+v0,v0,v12
+[ ]+[0-9a-f]+:[ ]+62062057[ ]+vmandn.mm[ ]+v0,v0,v12
[ ]+[0-9a-f]+:[ ]+6885c657[ ]+vmsltu.vx[ ]+v12,v8,a1,v0.t
-[ ]+[0-9a-f]+:[ ]+62062657[ ]+vmandnot.mm[ ]+v12,v0,v12
-[ ]+[0-9a-f]+:[ ]+62402257[ ]+vmandnot.mm[ ]+v4,v4,v0
+[ ]+[0-9a-f]+:[ ]+62062657[ ]+vmandn.mm[ ]+v12,v0,v12
+[ ]+[0-9a-f]+:[ ]+62402257[ ]+vmandn.mm[ ]+v4,v4,v0
[ ]+[0-9a-f]+:[ ]+6ac22257[ ]+vmor.mm[ ]+v4,v12,v4
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns.d b/gas/testsuite/gas/riscv/extended/vector-insns.d
index 01770c48bdd..1b665f6cc0d 100644
--- a/gas/testsuite/gas/riscv/extended/vector-insns.d
+++ b/gas/testsuite/gas/riscv/extended/vector-insns.d
@@ -44,10 +44,14 @@ Disassembly of section .text:
[ ]+[0-9a-f]+:[ ]+ca95f557[ ]+vsetivli[ ]+a0,11,e256,m2,tu,ma
[ ]+[0-9a-f]+:[ ]+c695f557[ ]+vsetivli[ ]+a0,11,e256,m2,ta,mu
[ ]+[0-9a-f]+:[ ]+c295f557[ ]+vsetivli[ ]+a0,11,e256,m2,tu,mu
-[ ]+[0-9a-f]+:[ ]+02b50207[ ]+vle1.v[ ]+v4,\(a0\)
-[ ]+[0-9a-f]+:[ ]+02b50207[ ]+vle1.v[ ]+v4,\(a0\)
-[ ]+[0-9a-f]+:[ ]+02b50227[ ]+vse1.v[ ]+v4,\(a0\)
-[ ]+[0-9a-f]+:[ ]+02b50227[ ]+vse1.v[ ]+v4,\(a0\)
+[ ]+[0-9a-f]+:[ ]+02b50207[ ]+vlm.v[ ]+v4,\(a0\)
+[ ]+[0-9a-f]+:[ ]+02b50207[ ]+vlm.v[ ]+v4,\(a0\)
+[ ]+[0-9a-f]+:[ ]+02b50207[ ]+vlm.v[ ]+v4,\(a0\)
+[ ]+[0-9a-f]+:[ ]+02b50207[ ]+vlm.v[ ]+v4,\(a0\)
+[ ]+[0-9a-f]+:[ ]+02b50227[ ]+vsm.v[ ]+v4,\(a0\)
+[ ]+[0-9a-f]+:[ ]+02b50227[ ]+vsm.v[ ]+v4,\(a0\)
+[ ]+[0-9a-f]+:[ ]+02b50227[ ]+vsm.v[ ]+v4,\(a0\)
+[ ]+[0-9a-f]+:[ ]+02b50227[ ]+vsm.v[ ]+v4,\(a0\)
[ ]+[0-9a-f]+:[ ]+02050207[ ]+vle8.v[ ]+v4,\(a0\)
[ ]+[0-9a-f]+:[ ]+02050207[ ]+vle8.v[ ]+v4,\(a0\)
[ ]+[0-9a-f]+:[ ]+00050207[ ]+vle8.v[ ]+v4,\(a0\),v0.t
@@ -1762,6 +1766,8 @@ Disassembly of section .text:
[ ]+[0-9a-f]+:[ ]+18865257[ ]+vfmax.vf[ ]+v4,v8,fa2,v0.t
[ ]+[0-9a-f]+:[ ]+26841257[ ]+vfneg.v[ ]+v4,v8
[ ]+[0-9a-f]+:[ ]+24841257[ ]+vfneg.v[ ]+v4,v8,v0.t
+[ ]+[0-9a-f]+:[ ]+2a841257[ ]+vfabs.v[ ]+v4,v8
+[ ]+[0-9a-f]+:[ ]+28841257[ ]+vfabs.v[ ]+v4,v8,v0.t
[ ]+[0-9a-f]+:[ ]+22861257[ ]+vfsgnj.vv[ ]+v4,v8,v12
[ ]+[0-9a-f]+:[ ]+22865257[ ]+vfsgnj.vf[ ]+v4,v8,fa2
[ ]+[0-9a-f]+:[ ]+26861257[ ]+vfsgnjn.vv[ ]+v4,v8,v12
@@ -1863,17 +1869,21 @@ Disassembly of section .text:
[ ]+[0-9a-f]+:[ ]+c0860257[ ]+vwredsumu.vs[ ]+v4,v8,v12,v0.t
[ ]+[0-9a-f]+:[ ]+c4860257[ ]+vwredsum.vs[ ]+v4,v8,v12,v0.t
[ ]+[0-9a-f]+:[ ]+0e861257[ ]+vfredosum.vs[ ]+v4,v8,v12
-[ ]+[0-9a-f]+:[ ]+06861257[ ]+vfredsum.vs[ ]+v4,v8,v12
+[ ]+[0-9a-f]+:[ ]+06861257[ ]+vfredusum.vs[ ]+v4,v8,v12
+[ ]+[0-9a-f]+:[ ]+06861257[ ]+vfredusum.vs[ ]+v4,v8,v12
[ ]+[0-9a-f]+:[ ]+1e861257[ ]+vfredmax.vs[ ]+v4,v8,v12
[ ]+[0-9a-f]+:[ ]+16861257[ ]+vfredmin.vs[ ]+v4,v8,v12
[ ]+[0-9a-f]+:[ ]+0c861257[ ]+vfredosum.vs[ ]+v4,v8,v12,v0.t
-[ ]+[0-9a-f]+:[ ]+04861257[ ]+vfredsum.vs[ ]+v4,v8,v12,v0.t
+[ ]+[0-9a-f]+:[ ]+04861257[ ]+vfredusum.vs[ ]+v4,v8,v12,v0.t
+[ ]+[0-9a-f]+:[ ]+04861257[ ]+vfredusum.vs[ ]+v4,v8,v12,v0.t
[ ]+[0-9a-f]+:[ ]+1c861257[ ]+vfredmax.vs[ ]+v4,v8,v12,v0.t
[ ]+[0-9a-f]+:[ ]+14861257[ ]+vfredmin.vs[ ]+v4,v8,v12,v0.t
[ ]+[0-9a-f]+:[ ]+ce861257[ ]+vfwredosum.vs[ ]+v4,v8,v12
-[ ]+[0-9a-f]+:[ ]+c6861257[ ]+vfwredsum.vs[ ]+v4,v8,v12
+[ ]+[0-9a-f]+:[ ]+c6861257[ ]+vfwredusum.vs[ ]+v4,v8,v12
+[ ]+[0-9a-f]+:[ ]+c6861257[ ]+vfwredusum.vs[ ]+v4,v8,v12
[ ]+[0-9a-f]+:[ ]+cc861257[ ]+vfwredosum.vs[ ]+v4,v8,v12,v0.t
-[ ]+[0-9a-f]+:[ ]+c4861257[ ]+vfwredsum.vs[ ]+v4,v8,v12,v0.t
+[ ]+[0-9a-f]+:[ ]+c4861257[ ]+vfwredusum.vs[ ]+v4,v8,v12,v0.t
+[ ]+[0-9a-f]+:[ ]+c4861257[ ]+vfwredusum.vs[ ]+v4,v8,v12,v0.t
[ ]+[0-9a-f]+:[ ]+66842257[ ]+vmmv.m[ ]+v4,v8
[ ]+[0-9a-f]+:[ ]+66842257[ ]+vmmv.m[ ]+v4,v8
[ ]+[0-9a-f]+:[ ]+6e422257[ ]+vmclr.m[ ]+v4
@@ -1881,20 +1891,22 @@ Disassembly of section .text:
[ ]+[0-9a-f]+:[ ]+76842257[ ]+vmnot.m[ ]+v4,v8
[ ]+[0-9a-f]+:[ ]+66862257[ ]+vmand.mm[ ]+v4,v8,v12
[ ]+[0-9a-f]+:[ ]+76862257[ ]+vmnand.mm[ ]+v4,v8,v12
-[ ]+[0-9a-f]+:[ ]+62862257[ ]+vmandnot.mm[ ]+v4,v8,v12
+[ ]+[0-9a-f]+:[ ]+62862257[ ]+vmandn.mm[ ]+v4,v8,v12
+[ ]+[0-9a-f]+:[ ]+62862257[ ]+vmandn.mm[ ]+v4,v8,v12
[ ]+[0-9a-f]+:[ ]+6e862257[ ]+vmxor.mm[ ]+v4,v8,v12
[ ]+[0-9a-f]+:[ ]+6a862257[ ]+vmor.mm[ ]+v4,v8,v12
[ ]+[0-9a-f]+:[ ]+7a862257[ ]+vmnor.mm[ ]+v4,v8,v12
-[ ]+[0-9a-f]+:[ ]+72862257[ ]+vmornot.mm[ ]+v4,v8,v12
+[ ]+[0-9a-f]+:[ ]+72862257[ ]+vmorn.mm[ ]+v4,v8,v12
+[ ]+[0-9a-f]+:[ ]+72862257[ ]+vmorn.mm[ ]+v4,v8,v12
[ ]+[0-9a-f]+:[ ]+7e862257[ ]+vmxnor.mm[ ]+v4,v8,v12
-[ ]+[0-9a-f]+:[ ]+42c82557[ ]+vpopc.m[ ]+a0,v12
+[ ]+[0-9a-f]+:[ ]+42c82557[ ]+vcpop.m[ ]+a0,v12
[ ]+[0-9a-f]+:[ ]+42c8a557[ ]+vfirst.m[ ]+a0,v12
[ ]+[0-9a-f]+:[ ]+5280a257[ ]+vmsbf.m[ ]+v4,v8
[ ]+[0-9a-f]+:[ ]+5281a257[ ]+vmsif.m[ ]+v4,v8
[ ]+[0-9a-f]+:[ ]+52812257[ ]+vmsof.m[ ]+v4,v8
[ ]+[0-9a-f]+:[ ]+52882257[ ]+viota.m[ ]+v4,v8
[ ]+[0-9a-f]+:[ ]+5208a257[ ]+vid.v[ ]+v4
-[ ]+[0-9a-f]+:[ ]+40c82557[ ]+vpopc.m[ ]+a0,v12,v0.t
+[ ]+[0-9a-f]+:[ ]+40c82557[ ]+vcpop.m[ ]+a0,v12,v0.t
[ ]+[0-9a-f]+:[ ]+40c8a557[ ]+vfirst.m[ ]+a0,v12,v0.t
[ ]+[0-9a-f]+:[ ]+5080a257[ ]+vmsbf.m[ ]+v4,v8,v0.t
[ ]+[0-9a-f]+:[ ]+5081a257[ ]+vmsif.m[ ]+v4,v8,v0.t
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns.s b/gas/testsuite/gas/riscv/extended/vector-insns.s
index 5c78e28e776..0f894a355aa 100644
--- a/gas/testsuite/gas/riscv/extended/vector-insns.s
+++ b/gas/testsuite/gas/riscv/extended/vector-insns.s
@@ -36,9 +36,13 @@
vsetivli a0, 0xb, e256, m2, ta, mu
vsetivli a0, 0xb, e256, m2, tu, mu
- vle1.v v4, (a0)
+ vlm.v v4, (a0)
+ vlm.v v4, 0(a0)
+ vle1.v v4, (a0) # Alias of vlm.v
vle1.v v4, 0(a0)
- vse1.v v4, (a0)
+ vsm.v v4, (a0)
+ vsm.v v4, 0(a0)
+ vse1.v v4, (a0) # Alias of vsm.v
vse1.v v4, 0(a0)
vle8.v v4, (a0)
@@ -1978,6 +1982,8 @@
vfneg.v v4, v8
vfneg.v v4, v8, v0.t
+ vfabs.v v4, v8
+ vfabs.v v4, v8, v0.t
vfsgnj.vv v4, v8, v12
vfsgnj.vf v4, v8, fa2
@@ -2090,18 +2096,22 @@
vwredsum.vs v4, v8, v12, v0.t
vfredosum.vs v4, v8, v12
- vfredsum.vs v4, v8, v12
+ vfredusum.vs v4, v8, v12
+ vfredsum.vs v4, v8, v12 # Alias of vfredusum.vs.
vfredmax.vs v4, v8, v12
vfredmin.vs v4, v8, v12
vfredosum.vs v4, v8, v12, v0.t
- vfredsum.vs v4, v8, v12, v0.t
+ vfredusum.vs v4, v8, v12, v0.t
+ vfredsum.vs v4, v8, v12, v0.t # Alias of vfredusum.vs.
vfredmax.vs v4, v8, v12, v0.t
vfredmin.vs v4, v8, v12, v0.t
vfwredosum.vs v4, v8, v12
- vfwredsum.vs v4, v8, v12
+ vfwredusum.vs v4, v8, v12
+ vfwredsum.vs v4, v8, v12 # Alias of vfwredusum.vs.
vfwredosum.vs v4, v8, v12, v0.t
- vfwredsum.vs v4, v8, v12, v0.t
+ vfwredusum.vs v4, v8, v12, v0.t
+ vfwredsum.vs v4, v8, v12, v0.t # Alias of vfwredusum.vs.
# Aliases
vmcpy.m v4, v8
@@ -2112,21 +2122,23 @@
vmand.mm v4, v8, v12
vmnand.mm v4, v8, v12
- vmandnot.mm v4, v8, v12
+ vmandn.mm v4, v8, v12
+ vmandnot.mm v4, v8, v12 # Alias of vmandn.mm.
vmxor.mm v4, v8, v12
vmor.mm v4, v8, v12
vmnor.mm v4, v8, v12
- vmornot.mm v4, v8, v12
+ vmorn.mm v4, v8, v12
+ vmornot.mm v4, v8, v12 # Alias of vmorn.mm.
vmxnor.mm v4, v8, v12
- vpopc.m a0, v12
+ vcpop.m a0, v12
vfirst.m a0, v12
vmsbf.m v4, v8
vmsif.m v4, v8
vmsof.m v4, v8
viota.m v4, v8
vid.v v4
- vpopc.m a0, v12, v0.t
+ vcpop.m a0, v12, v0.t
vfirst.m a0, v12, v0.t
vmsbf.m v4, v8, v0.t
vmsif.m v4, v8, v0.t
diff --git a/include/opcode/riscv-opc-extended.h b/include/opcode/riscv-opc-extended.h
index 3de8809b4c2..de9741f9b39 100644
--- a/include/opcode/riscv-opc-extended.h
+++ b/include/opcode/riscv-opc-extended.h
@@ -99,10 +99,10 @@
#define MASK_VSETIVLI 0xc000707f
#define MATCH_VSETVLI 0x00007057
#define MASK_VSETVLI 0x8000707f
-#define MATCH_VLE1V 0x02b00007
-#define MASK_VLE1V 0xfff0707f
-#define MATCH_VSE1V 0x02b00027
-#define MASK_VSE1V 0xfff0707f
+#define MATCH_VLMV 0x02b00007
+#define MASK_VLMV 0xfff0707f
+#define MATCH_VSMV 0x02b00027
+#define MASK_VSMV 0xfff0707f
#define MATCH_VLE8V 0x00000007
#define MASK_VLE8V 0xfdf0707f
#define MATCH_VLE16V 0x00005007
@@ -1359,34 +1359,34 @@
#define MASK_VWREDSUMVS 0xfc00707f
#define MATCH_VFREDOSUMVS 0x0c001057
#define MASK_VFREDOSUMVS 0xfc00707f
-#define MATCH_VFREDSUMVS 0x04001057
-#define MASK_VFREDSUMVS 0xfc00707f
+#define MATCH_VFREDUSUMVS 0x04001057
+#define MASK_VFREDUSUMVS 0xfc00707f
#define MATCH_VFREDMAXVS 0x1c001057
#define MASK_VFREDMAXVS 0xfc00707f
#define MATCH_VFREDMINVS 0x14001057
#define MASK_VFREDMINVS 0xfc00707f
#define MATCH_VFWREDOSUMVS 0xcc001057
#define MASK_VFWREDOSUMVS 0xfc00707f
-#define MATCH_VFWREDSUMVS 0xc4001057
-#define MASK_VFWREDSUMVS 0xfc00707f
+#define MATCH_VFWREDUSUMVS 0xc4001057
+#define MASK_VFWREDUSUMVS 0xfc00707f
#define MATCH_VMANDMM 0x66002057
#define MASK_VMANDMM 0xfe00707f
#define MATCH_VMNANDMM 0x76002057
#define MASK_VMNANDMM 0xfe00707f
-#define MATCH_VMANDNOTMM 0x62002057
-#define MASK_VMANDNOTMM 0xfe00707f
+#define MATCH_VMANDNMM 0x62002057
+#define MASK_VMANDNMM 0xfe00707f
#define MATCH_VMXORMM 0x6e002057
#define MASK_VMXORMM 0xfe00707f
#define MATCH_VMORMM 0x6a002057
#define MASK_VMORMM 0xfe00707f
#define MATCH_VMNORMM 0x7a002057
#define MASK_VMNORMM 0xfe00707f
-#define MATCH_VMORNOTMM 0x72002057
-#define MASK_VMORNOTMM 0xfe00707f
+#define MATCH_VMORNMM 0x72002057
+#define MASK_VMORNMM 0xfe00707f
#define MATCH_VMXNORMM 0x7e002057
#define MASK_VMXNORMM 0xfe00707f
-#define MATCH_VPOPCM 0x40082057
-#define MASK_VPOPCM 0xfc0ff07f
+#define MATCH_VCPOPM 0x40082057
+#define MASK_VCPOPM 0xfc0ff07f
#define MATCH_VFIRSTM 0x4008a057
#define MASK_VFIRSTM 0xfc0ff07f
#define MATCH_VMSBFM 0x5000a057
diff --git a/opcodes/riscv-opc.c b/opcodes/riscv-opc.c
index 05f94704774..cccc3553c60 100644
--- a/opcodes/riscv-opc.c
+++ b/opcodes/riscv-opc.c
@@ -1417,8 +1417,10 @@ const struct riscv_opcode riscv_draft_opcodes[] =
{"vsetvli", 0, INSN_CLASS_V, "d,s,Vc", MATCH_VSETVLI, MASK_VSETVLI, match_opcode, 0},
{"vsetivli", 0, INSN_CLASS_V, "d,Z,Vb", MATCH_VSETIVLI, MASK_VSETIVLI, match_opcode, 0},
-{"vle1.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VLE1V, MASK_VLE1V, match_opcode, INSN_DREF },
-{"vse1.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VSE1V, MASK_VSE1V, match_opcode, INSN_DREF },
+{"vlm.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VLMV, MASK_VLMV, match_opcode, INSN_DREF },
+{"vsm.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VSMV, MASK_VSMV, match_opcode, INSN_DREF },
+{"vle1.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VLMV, MASK_VLMV, match_opcode, INSN_DREF|INSN_ALIAS },
+{"vse1.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VSMV, MASK_VSMV, match_opcode, INSN_DREF|INSN_ALIAS },
{"vle8.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLE8V, MASK_VLE8V, match_vd_neq_vm, INSN_DREF },
{"vle16.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLE16V, MASK_VLE16V, match_vd_neq_vm, INSN_DREF },
@@ -2087,6 +2089,7 @@ const struct riscv_opcode riscv_draft_opcodes[] =
{"vfmax.vf", 0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFMAXVF, MASK_VFMAXVF, match_vd_neq_vm, 0},
{"vfneg.v", 0, INSN_CLASS_V_AND_F, "Vd,VuVm", MATCH_VFSGNJNVV, MASK_VFSGNJNVV, match_vs1_eq_vs2_neq_vm, INSN_ALIAS },
+{"vfabs.v", 0, INSN_CLASS_V_AND_F, "Vd,VuVm", MATCH_VFSGNJXVV, MASK_VFSGNJXVV, match_vs1_eq_vs2_neq_vm, INSN_ALIAS },
{"vfsgnj.vv", 0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFSGNJVV, MASK_VFSGNJVV, match_vd_neq_vm, 0},
{"vfsgnj.vf", 0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFSGNJVF, MASK_VFSGNJVF, match_vd_neq_vm, 0},
@@ -2150,12 +2153,14 @@ const struct riscv_opcode riscv_draft_opcodes[] =
{"vwredsum.vs",0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VWREDSUMVS, MASK_VWREDSUMVS, match_opcode, 0},
{"vfredosum.vs",0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFREDOSUMVS, MASK_VFREDOSUMVS, match_opcode, 0},
-{"vfredsum.vs", 0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFREDSUMVS, MASK_VFREDSUMVS, match_opcode, 0},
+{"vfredusum.vs",0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFREDUSUMVS, MASK_VFREDUSUMVS, match_opcode, 0},
+{"vfredsum.vs", 0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFREDUSUMVS, MASK_VFREDUSUMVS, match_opcode, INSN_ALIAS},
{"vfredmax.vs", 0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFREDMAXVS, MASK_VFREDMAXVS, match_opcode, 0},
{"vfredmin.vs", 0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFREDMINVS, MASK_VFREDMINVS, match_opcode, 0},
{"vfwredosum.vs",0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFWREDOSUMVS, MASK_VFWREDOSUMVS, match_opcode, 0},
-{"vfwredsum.vs", 0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFWREDSUMVS, MASK_VFWREDSUMVS, match_opcode, 0},
+{"vfwredusum.vs",0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFWREDUSUMVS, MASK_VFWREDUSUMVS, match_opcode, 0},
+{"vfwredsum.vs", 0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFWREDUSUMVS, MASK_VFWREDUSUMVS, match_opcode, INSN_ALIAS},
{"vmmv.m", 0, INSN_CLASS_V, "Vd,Vu", MATCH_VMANDMM, MASK_VMANDMM, match_vs1_eq_vs2, INSN_ALIAS},
{"vmcpy.m", 0, INSN_CLASS_V, "Vd,Vu", MATCH_VMANDMM, MASK_VMANDMM, match_vs1_eq_vs2, INSN_ALIAS},
@@ -2165,14 +2170,18 @@ const struct riscv_opcode riscv_draft_opcodes[] =
{"vmand.mm", 0, INSN_CLASS_V, "Vd,Vt,Vs", MATCH_VMANDMM, MASK_VMANDMM, match_opcode, 0},
{"vmnand.mm", 0, INSN_CLASS_V, "Vd,Vt,Vs", MATCH_VMNANDMM, MASK_VMNANDMM, match_opcode, 0},
-{"vmandnot.mm",0, INSN_CLASS_V, "Vd,Vt,Vs", MATCH_VMANDNOTMM, MASK_VMANDNOTMM, match_opcode, 0},
+
+{"vmandn.mm", 0, INSN_CLASS_V, "Vd,Vt,Vs", MATCH_VMANDNMM, MASK_VMANDNMM, match_opcode, 0},
+{"vmandnot.mm",0, INSN_CLASS_V, "Vd,Vt,Vs", MATCH_VMANDNMM, MASK_VMANDNMM, match_opcode, INSN_ALIAS},
{"vmxor.mm", 0, INSN_CLASS_V, "Vd,Vt,Vs", MATCH_VMXORMM, MASK_VMXORMM, match_opcode, 0},
{"vmor.mm", 0, INSN_CLASS_V, "Vd,Vt,Vs", MATCH_VMORMM, MASK_VMORMM, match_opcode, 0},
{"vmnor.mm", 0, INSN_CLASS_V, "Vd,Vt,Vs", MATCH_VMNORMM, MASK_VMNORMM, match_opcode, 0},
-{"vmornot.mm", 0, INSN_CLASS_V, "Vd,Vt,Vs", MATCH_VMORNOTMM, MASK_VMORNOTMM, match_opcode, 0},
+{"vmorn.mm", 0, INSN_CLASS_V, "Vd,Vt,Vs", MATCH_VMORNMM, MASK_VMORNMM, match_opcode, 0},
+{"vmornot.mm", 0, INSN_CLASS_V, "Vd,Vt,Vs", MATCH_VMORNMM, MASK_VMORNMM, match_opcode, INSN_ALIAS},
{"vmxnor.mm", 0, INSN_CLASS_V, "Vd,Vt,Vs", MATCH_VMXNORMM, MASK_VMXNORMM, match_opcode, 0},
-{"vpopc.m", 0, INSN_CLASS_V, "d,VtVm", MATCH_VPOPCM, MASK_VPOPCM, match_opcode, 0},
+{"vcpop.m", 0, INSN_CLASS_V, "d,VtVm", MATCH_VCPOPM, MASK_VCPOPM, match_opcode, 0},
+{"vpopc.m", 0, INSN_CLASS_V, "d,VtVm", MATCH_VCPOPM, MASK_VCPOPM, match_opcode, INSN_ALIAS},
{"vfirst.m", 0, INSN_CLASS_V, "d,VtVm", MATCH_VFIRSTM, MASK_VFIRSTM, match_opcode, 0},
{"vmsbf.m", 0, INSN_CLASS_V, "Vd,VtVm", MATCH_VMSBFM, MASK_VMSBFM, match_vd_neq_vs2_neq_vm, 0},
{"vmsif.m", 0, INSN_CLASS_V, "Vd,VtVm", MATCH_VMSIFM, MASK_VMSIFM, match_vd_neq_vs2_neq_vm, 0},
--
2.30.2
* [integration v2 2/4] RISC-V/rvv: Update constraints for widening and narrowing instructions.
2021-10-05 12:51 [integration v2 0/4] RISC-V/rvv: Update rvv from v01.0 to v1.0 Nelson Chu
2021-10-05 12:51 ` [integration v2 1/4] RISC-V/rvv: Added assembly pseudo and changed assembler mnemonics Nelson Chu
@ 2021-10-05 12:51 ` Nelson Chu
2021-10-05 12:51 ` [integration v2 3/4] RISC-V/rvv: Separate zvamo from v, and removed the zvlsseg extension name Nelson Chu
From: Nelson Chu @ 2021-10-05 12:51 UTC (permalink / raw)
To: binutils, jimw, andrew
* Since fractional LMUL is supported, we cannot just assume LMUL is 1.
Otherwise, the old conflict-checking rules may reject valid operand combinations.
* Removed the overlap constraints for narrowing instructions.
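The overlap checks operate on the standard RVV operand fields. A simplified sketch of the vd/vs2 overlap test that this patch drops for narrowing instructions (field positions are from the base vector encoding; the vnsrl.wv word below is encoded by hand for illustration, not taken from the testsuite, and the real match functions also consider register-group multiples):

```python
# Standard RVV operand fields in a 32-bit instruction word.
def vd(insn):  return (insn >> 7)  & 0x1f
def vs1(insn): return (insn >> 15) & 0x1f
def vs2(insn): return (insn >> 20) & 0x1f
def vm(insn):  return (insn >> 25) & 0x1   # vm=1: unmasked

def vd_overlaps_vs2(insn):
    # The old narrowing constraint rejected any vd/vs2 overlap on the
    # assumption that the wide source always spans multiple registers;
    # with fractional LMUL it may occupy a single register, so the
    # overlap can be legal and the assembler check was removed.
    return vd(insn) == vs2(insn)

# vnsrl.wv v2,v2,v4, unmasked (funct6=101100, OPIVV) -- previously
# rejected with "vd cannot overlap vs2", now accepted.
insn = 0xb2220157
assert (vd(insn), vs2(insn), vs1(insn), vm(insn)) == (2, 2, 4, 1)
assert vd_overlaps_vs2(insn)
```

This is why the whole vector-insns-fail-arith-narrow test triple is deleted rather than updated: there is no longer an assembler-time error to expect.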
gas/
* testsuite/gas/riscv/extended/vector-insns-fail-arith-narrow.d: Removed.
* testsuite/gas/riscv/extended/vector-insns-fail-arith-narrow.l: Likewise.
* testsuite/gas/riscv/extended/vector-insns-fail-arith-narrow.s: Likewise.
* testsuite/gas/riscv/extended/vector-insns-fail-arith-widen.l: Updated.
* testsuite/gas/riscv/extended/vector-insns-fail-arith-widen.s: Likewise.
opcodes/
* riscv-opc.c (match_vd_neq_vs1_neq_vm): Added for vw*.wv instructions.
(match_widen_vd_neq_vs1_neq_vs2_neq_vm): Replaced by match_vd_neq_vs1_neq_vs2_neq_vm.
(match_widen_vd_neq_vs1_neq_vm): Replaced by match_vd_neq_vs1_neq_vm.
(match_widen_vd_neq_vs2_neq_vm): Replaced by match_vd_neq_vs2_neq_vm.
(match_widen_vd_neq_vm): Replaced by match_vd_neq_vm.
(match_narrow_vd_neq_vs2_neq_vm): Same as match_widen_vd_neq_vs2_neq_vm.
---
.../extended/vector-insns-fail-arith-narrow.d | 3 -
.../extended/vector-insns-fail-arith-narrow.l | 85 -----
.../extended/vector-insns-fail-arith-narrow.s | 100 ------
.../extended/vector-insns-fail-arith-widen.l | 131 --------
.../extended/vector-insns-fail-arith-widen.s | 88 +++---
opcodes/riscv-opc.c | 294 ++++++------------
6 files changed, 139 insertions(+), 562 deletions(-)
delete mode 100644 gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-narrow.d
delete mode 100644 gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-narrow.l
delete mode 100644 gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-narrow.s
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-narrow.d b/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-narrow.d
deleted file mode 100644
index e7a4d4e00c2..00000000000
--- a/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-narrow.d
+++ /dev/null
@@ -1,3 +0,0 @@
-#as: -march=rv32ifv -mcheck-constraints
-#source: vector-insns-fail-arith-narrow.s
-#error_output: vector-insns-fail-arith-narrow.l
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-narrow.l b/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-narrow.l
deleted file mode 100644
index 3a3634cd098..00000000000
--- a/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-narrow.l
+++ /dev/null
@@ -1,85 +0,0 @@
-.*: Assembler messages:
-.*Error: illegal operands vd cannot overlap vs2 `vncvt.x.x.w v2,v2'
-.*Error: illegal operands vd must be multiple of 2 `vncvt.x.x.w v2,v3'
-.*Error: illegal operands vd cannot overlap vs2 `vncvt.x.x.w v3,v2'
-.*Error: illegal operands vd cannot overlap vm `vncvt.x.x.w v0,v2,v0.t'
-.*Error: illegal operands vd cannot overlap vs2 `vnsrl.wv v2,v2,v4'
-.*Error: illegal operands vd must be multiple of 2 `vnsrl.wv v2,v3,v4'
-.*Error: illegal operands vd cannot overlap vs2 `vnsrl.wv v3,v2,v4'
-.*Error: illegal operands vd cannot overlap vm `vnsrl.wv v0,v2,v4,v0.t'
-.*Error: illegal operands vd cannot overlap vs2 `vnsrl.wx v2,v2,a1'
-.*Error: illegal operands vd must be multiple of 2 `vnsrl.wx v2,v3,a1'
-.*Error: illegal operands vd cannot overlap vs2 `vnsrl.wx v3,v2,a1'
-.*Error: illegal operands vd cannot overlap vm `vnsrl.wx v0,v2,a1,v0.t'
-.*Error: illegal operands vd cannot overlap vs2 `vnsrl.wi v2,v2,31'
-.*Error: illegal operands vd must be multiple of 2 `vnsrl.wi v2,v3,31'
-.*Error: illegal operands vd cannot overlap vs2 `vnsrl.wi v3,v2,31'
-.*Error: illegal operands vd cannot overlap vm `vnsrl.wi v0,v2,31,v0.t'
-.*Error: illegal operands vd cannot overlap vs2 `vnsra.wv v2,v2,v4'
-.*Error: illegal operands vd must be multiple of 2 `vnsra.wv v2,v3,v4'
-.*Error: illegal operands vd cannot overlap vs2 `vnsra.wv v3,v2,v4'
-.*Error: illegal operands vd cannot overlap vm `vnsra.wv v0,v2,v4,v0.t'
-.*Error: illegal operands vd cannot overlap vs2 `vnsra.wx v2,v2,a1'
-.*Error: illegal operands vd must be multiple of 2 `vnsra.wx v2,v3,a1'
-.*Error: illegal operands vd cannot overlap vs2 `vnsra.wx v3,v2,a1'
-.*Error: illegal operands vd cannot overlap vm `vnsra.wx v0,v2,a1,v0.t'
-.*Error: illegal operands vd cannot overlap vs2 `vnsra.wi v2,v2,31'
-.*Error: illegal operands vd must be multiple of 2 `vnsra.wi v2,v3,31'
-.*Error: illegal operands vd cannot overlap vs2 `vnsra.wi v3,v2,31'
-.*Error: illegal operands vd cannot overlap vm `vnsra.wi v0,v2,31,v0.t'
-.*Error: illegal operands vd cannot overlap vs2 `vnclipu.wv v2,v2,v4'
-.*Error: illegal operands vd must be multiple of 2 `vnclipu.wv v2,v3,v4'
-.*Error: illegal operands vd cannot overlap vs2 `vnclipu.wv v3,v2,v4'
-.*Error: illegal operands vd cannot overlap vm `vnclipu.wv v0,v2,v4,v0.t'
-.*Error: illegal operands vd cannot overlap vs2 `vnclipu.wx v2,v2,a1'
-.*Error: illegal operands vd must be multiple of 2 `vnclipu.wx v2,v3,a1'
-.*Error: illegal operands vd cannot overlap vs2 `vnclipu.wx v3,v2,a1'
-.*Error: illegal operands vd cannot overlap vm `vnclipu.wx v0,v2,a1,v0.t'
-.*Error: illegal operands vd cannot overlap vs2 `vnclipu.wi v2,v2,31'
-.*Error: illegal operands vd must be multiple of 2 `vnclipu.wi v2,v3,31'
-.*Error: illegal operands vd cannot overlap vs2 `vnclipu.wi v3,v2,31'
-.*Error: illegal operands vd cannot overlap vm `vnclipu.wi v0,v2,31,v0.t'
-.*Error: illegal operands vd cannot overlap vs2 `vnclip.wv v2,v2,v4'
-.*Error: illegal operands vd must be multiple of 2 `vnclip.wv v2,v3,v4'
-.*Error: illegal operands vd cannot overlap vs2 `vnclip.wv v3,v2,v4'
-.*Error: illegal operands vd cannot overlap vm `vnclip.wv v0,v2,v4,v0.t'
-.*Error: illegal operands vd cannot overlap vs2 `vnclip.wx v2,v2,a1'
-.*Error: illegal operands vd must be multiple of 2 `vnclip.wx v2,v3,a1'
-.*Error: illegal operands vd cannot overlap vs2 `vnclip.wx v3,v2,a1'
-.*Error: illegal operands vd cannot overlap vm `vnclip.wx v0,v2,a1,v0.t'
-.*Error: illegal operands vd cannot overlap vs2 `vnclip.wi v2,v2,31'
-.*Error: illegal operands vd must be multiple of 2 `vnclip.wi v2,v3,31'
-.*Error: illegal operands vd cannot overlap vs2 `vnclip.wi v3,v2,31'
-.*Error: illegal operands vd cannot overlap vm `vnclip.wi v0,v2,31,v0.t'
-.*Error: illegal operands vd cannot overlap vs2 `vfncvt.xu.f.w v2,v2'
-.*Error: illegal operands vd must be multiple of 2 `vfncvt.xu.f.w v2,v3'
-.*Error: illegal operands vd cannot overlap vs2 `vfncvt.xu.f.w v3,v2'
-.*Error: illegal operands vd cannot overlap vm `vfncvt.xu.f.w v0,v2,v0.t'
-.*Error: illegal operands vd cannot overlap vs2 `vfncvt.x.f.w v2,v2'
-.*Error: illegal operands vd must be multiple of 2 `vfncvt.x.f.w v2,v3'
-.*Error: illegal operands vd cannot overlap vs2 `vfncvt.x.f.w v3,v2'
-.*Error: illegal operands vd cannot overlap vm `vfncvt.x.f.w v0,v2,v0.t'
-.*Error: illegal operands vd cannot overlap vs2 `vfncvt.rtz.xu.f.w v2,v2'
-.*Error: illegal operands vd must be multiple of 2 `vfncvt.rtz.xu.f.w v2,v3'
-.*Error: illegal operands vd cannot overlap vs2 `vfncvt.rtz.xu.f.w v3,v2'
-.*Error: illegal operands vd cannot overlap vm `vfncvt.rtz.xu.f.w v0,v2,v0.t'
-.*Error: illegal operands vd cannot overlap vs2 `vfncvt.rtz.x.f.w v2,v2'
-.*Error: illegal operands vd must be multiple of 2 `vfncvt.rtz.x.f.w v2,v3'
-.*Error: illegal operands vd cannot overlap vs2 `vfncvt.rtz.x.f.w v3,v2'
-.*Error: illegal operands vd cannot overlap vm `vfncvt.rtz.x.f.w v0,v2,v0.t'
-.*Error: illegal operands vd cannot overlap vs2 `vfncvt.f.xu.w v2,v2'
-.*Error: illegal operands vd must be multiple of 2 `vfncvt.f.xu.w v2,v3'
-.*Error: illegal operands vd cannot overlap vs2 `vfncvt.f.xu.w v3,v2'
-.*Error: illegal operands vd cannot overlap vm `vfncvt.f.xu.w v0,v2,v0.t'
-.*Error: illegal operands vd cannot overlap vs2 `vfncvt.f.x.w v2,v2'
-.*Error: illegal operands vd must be multiple of 2 `vfncvt.f.x.w v2,v3'
-.*Error: illegal operands vd cannot overlap vs2 `vfncvt.f.x.w v3,v2'
-.*Error: illegal operands vd cannot overlap vm `vfncvt.f.x.w v0,v2,v0.t'
-.*Error: illegal operands vd cannot overlap vs2 `vfncvt.f.f.w v2,v2'
-.*Error: illegal operands vd must be multiple of 2 `vfncvt.f.f.w v2,v3'
-.*Error: illegal operands vd cannot overlap vs2 `vfncvt.f.f.w v3,v2'
-.*Error: illegal operands vd cannot overlap vm `vfncvt.f.f.w v0,v2,v0.t'
-.*Error: illegal operands vd cannot overlap vs2 `vfncvt.rod.f.f.w v2,v2'
-.*Error: illegal operands vd must be multiple of 2 `vfncvt.rod.f.f.w v2,v3'
-.*Error: illegal operands vd cannot overlap vs2 `vfncvt.rod.f.f.w v3,v2'
-.*Error: illegal operands vd cannot overlap vm `vfncvt.rod.f.f.w v0,v2,v0.t'
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-narrow.s b/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-narrow.s
deleted file mode 100644
index 73b96ef800f..00000000000
--- a/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-narrow.s
+++ /dev/null
@@ -1,100 +0,0 @@
-# Vector Narrowing Integer Right Shift Instructions
-
- # vncvt.x.x.w vd,vs,vm = vnsrl.wx vd,vs,x0,vm
- vncvt.x.x.w v2, v2 # vd overlap vs2
- vncvt.x.x.w v2, v3 # vs2 should be multiple of 2
- vncvt.x.x.w v3, v2 # vd overlap vs2
- vncvt.x.x.w v0, v2, v0.t # vd overlap vm
-
- vnsrl.wv v2, v2, v4 # vd overlap vs2
- vnsrl.wv v2, v3, v4 # vs2 should be multiple of 2
- vnsrl.wv v3, v2, v4 # vd overlap vs2
- vnsrl.wv v4, v2, v4 # OK
- vnsrl.wv v0, v2, v4, v0.t # vd overlap vm
- vnsrl.wx v2, v2, a1 # vd overlap vs2
- vnsrl.wx v2, v3, a1 # vs2 should be multiple of 2
- vnsrl.wx v3, v2, a1 # vd overlap vs2
- vnsrl.wx v0, v2, a1, v0.t # vd overlap vm
- vnsrl.wi v2, v2, 31 # vd overlap vs2
- vnsrl.wi v2, v3, 31 # vs2 should be multiple of 2
- vnsrl.wi v3, v2, 31 # vd overlap vs2
- vnsrl.wi v0, v2, 31, v0.t # vd overlap vm
-
- vnsra.wv v2, v2, v4
- vnsra.wv v2, v3, v4
- vnsra.wv v3, v2, v4
- vnsra.wv v4, v2, v4
- vnsra.wv v0, v2, v4, v0.t
- vnsra.wx v2, v2, a1
- vnsra.wx v2, v3, a1
- vnsra.wx v3, v2, a1
- vnsra.wx v0, v2, a1, v0.t
- vnsra.wi v2, v2, 31
- vnsra.wi v2, v3, 31
- vnsra.wi v3, v2, 31
- vnsra.wi v0, v2, 31, v0.t
-
-# Vector Narrowing Fixed-Point Clip Instructions
-
- vnclipu.wv v2, v2, v4 # vd overlap vs2
- vnclipu.wv v2, v3, v4 # vs2 should be multiple of 2
- vnclipu.wv v3, v2, v4 # vd overlap vs2
- vnclipu.wv v4, v2, v4 # OK
- vnclipu.wv v0, v2, v4, v0.t # vd overlap vm
- vnclipu.wx v2, v2, a1 # vd overlap vs2
- vnclipu.wx v2, v3, a1 # vs2 should be multiple of 2
- vnclipu.wx v3, v2, a1 # vd overlap vs2
- vnclipu.wx v0, v2, a1, v0.t # vd overlap vm
- vnclipu.wi v2, v2, 31 # vd overlap vs2
- vnclipu.wi v2, v3, 31 # vs2 should be multiple of 2
- vnclipu.wi v3, v2, 31 # vd overlap vs2
- vnclipu.wi v0, v2, 31, v0.t # vd overlap vm
-
- vnclip.wv v2, v2, v4
- vnclip.wv v2, v3, v4
- vnclip.wv v3, v2, v4
- vnclip.wv v4, v2, v4
- vnclip.wv v0, v2, v4, v0.t
- vnclip.wx v2, v2, a1
- vnclip.wx v2, v3, a1
- vnclip.wx v3, v2, a1
- vnclip.wx v0, v2, a1, v0.t
- vnclip.wi v2, v2, 31
- vnclip.wi v2, v3, 31
- vnclip.wi v3, v2, 31
- vnclip.wi v0, v2, 31, v0.t
-
-# Narrowing Floating-Point/Integer Type-Convert Instructions
-
- vfncvt.xu.f.w v2, v2 # vd overlap vs2
- vfncvt.xu.f.w v2, v3 # vs2 should be multiple of 2
- vfncvt.xu.f.w v3, v2 # vd overlap vs2
- vfncvt.xu.f.w v0, v2, v0.t # vd overlap vm
- vfncvt.x.f.w v2, v2
- vfncvt.x.f.w v2, v3
- vfncvt.x.f.w v3, v2
- vfncvt.x.f.w v0, v2, v0.t
- vfncvt.rtz.xu.f.w v2, v2
- vfncvt.rtz.xu.f.w v2, v3
- vfncvt.rtz.xu.f.w v3, v2
- vfncvt.rtz.xu.f.w v0, v2, v0.t
- vfncvt.rtz.x.f.w v2, v2
- vfncvt.rtz.x.f.w v2, v3
- vfncvt.rtz.x.f.w v3, v2
- vfncvt.rtz.x.f.w v0, v2, v0.t
- vfncvt.f.xu.w v2, v2
- vfncvt.f.xu.w v2, v3
- vfncvt.f.xu.w v3, v2
- vfncvt.f.xu.w v0, v2, v0.t
- vfncvt.f.x.w v2, v2
- vfncvt.f.x.w v2, v3
- vfncvt.f.x.w v3, v2
- vfncvt.f.x.w v0, v2, v0.t
- vfncvt.f.f.w v2, v2
- vfncvt.f.f.w v2, v3
- vfncvt.f.f.w v3, v2
- vfncvt.f.f.w v0, v2, v0.t
- vfncvt.rod.f.f.w v2, v2
- vfncvt.rod.f.f.w v2, v3
- vfncvt.rod.f.f.w v3, v2
- vfncvt.rod.f.f.w v0, v2, v0.t
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-widen.l b/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-widen.l
index 5f22ca99e9e..364b765a981 100644
--- a/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-widen.l
+++ b/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-widen.l
@@ -1,253 +1,122 @@
.*: Assembler messages:
-.*Error: illegal operands vd must be multiple of 2 `vwcvtu.x.x.v v1,v2'
.*Error: illegal operands vd cannot overlap vs2 `vwcvtu.x.x.v v2,v2'
-.*Error: illegal operands vd cannot overlap vs2 `vwcvtu.x.x.v v2,v3'
.*Error: illegal operands vd cannot overlap vm `vwcvtu.x.x.v v0,v2,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vwcvt.x.x.v v1,v2'
.*Error: illegal operands vd cannot overlap vs2 `vwcvt.x.x.v v2,v2'
-.*Error: illegal operands vd cannot overlap vs2 `vwcvt.x.x.v v2,v3'
.*Error: illegal operands vd cannot overlap vm `vwcvt.x.x.v v0,v2,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vwaddu.vv v1,v2,v4'
.*Error: illegal operands vd cannot overlap vs2 `vwaddu.vv v2,v2,v4'
-.*Error: illegal operands vd cannot overlap vs2 `vwaddu.vv v2,v3,v4'
.*Error: illegal operands vd cannot overlap vs1 `vwaddu.vv v4,v2,v4'
-.*Error: illegal operands vd cannot overlap vs1 `vwaddu.vv v4,v2,v5'
.*Error: illegal operands vd cannot overlap vm `vwaddu.vv v0,v2,v4,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vwaddu.vx v1,v2,a1'
.*Error: illegal operands vd cannot overlap vs2 `vwaddu.vx v2,v2,a1'
-.*Error: illegal operands vd cannot overlap vs2 `vwaddu.vx v2,v3,a1'
.*Error: illegal operands vd cannot overlap vm `vwaddu.vx v0,v2,a1,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vwaddu.wv v1,v2,v4'
-.*Error: illegal operands vs2 must be multiple of 2 `vwaddu.wv v2,v3,v4'
.*Error: illegal operands vd cannot overlap vs1 `vwaddu.wv v4,v2,v4'
-.*Error: illegal operands vd cannot overlap vs1 `vwaddu.wv v4,v2,v5'
.*Error: illegal operands vd cannot overlap vm `vwaddu.wv v0,v2,v4,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vwaddu.wx v1,v2,a1'
-.*Error: illegal operands vs2 must be multiple of 2 `vwaddu.wx v2,v3,a1'
.*Error: illegal operands vd cannot overlap vm `vwaddu.wx v0,v2,a1,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vwsubu.vv v1,v2,v4'
.*Error: illegal operands vd cannot overlap vs2 `vwsubu.vv v2,v2,v4'
-.*Error: illegal operands vd cannot overlap vs2 `vwsubu.vv v2,v3,v4'
.*Error: illegal operands vd cannot overlap vs1 `vwsubu.vv v4,v2,v4'
-.*Error: illegal operands vd cannot overlap vs1 `vwsubu.vv v4,v2,v5'
.*Error: illegal operands vd cannot overlap vm `vwsubu.vv v0,v2,v4,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vwsubu.vx v1,v2,a1'
.*Error: illegal operands vd cannot overlap vs2 `vwsubu.vx v2,v2,a1'
-.*Error: illegal operands vd cannot overlap vs2 `vwsubu.vx v2,v3,a1'
.*Error: illegal operands vd cannot overlap vm `vwsubu.vx v0,v2,a1,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vwsubu.wv v1,v2,v4'
-.*Error: illegal operands vs2 must be multiple of 2 `vwsubu.wv v2,v3,v4'
.*Error: illegal operands vd cannot overlap vs1 `vwsubu.wv v4,v2,v4'
-.*Error: illegal operands vd cannot overlap vs1 `vwsubu.wv v4,v2,v5'
.*Error: illegal operands vd cannot overlap vm `vwsubu.wv v0,v2,v4,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vwsubu.wx v1,v2,a1'
-.*Error: illegal operands vs2 must be multiple of 2 `vwsubu.wx v2,v3,a1'
.*Error: illegal operands vd cannot overlap vm `vwsubu.wx v0,v2,a1,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vwadd.vv v1,v2,v4'
.*Error: illegal operands vd cannot overlap vs2 `vwadd.vv v2,v2,v4'
-.*Error: illegal operands vd cannot overlap vs2 `vwadd.vv v2,v3,v4'
.*Error: illegal operands vd cannot overlap vs1 `vwadd.vv v4,v2,v4'
-.*Error: illegal operands vd cannot overlap vs1 `vwadd.vv v4,v2,v5'
.*Error: illegal operands vd cannot overlap vm `vwadd.vv v0,v2,v4,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vwadd.vx v1,v2,a1'
.*Error: illegal operands vd cannot overlap vs2 `vwadd.vx v2,v2,a1'
-.*Error: illegal operands vd cannot overlap vs2 `vwadd.vx v2,v3,a1'
.*Error: illegal operands vd cannot overlap vm `vwadd.vx v0,v2,a1,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vwadd.wv v1,v2,v4'
-.*Error: illegal operands vs2 must be multiple of 2 `vwadd.wv v2,v3,v4'
.*Error: illegal operands vd cannot overlap vs1 `vwadd.wv v4,v2,v4'
-.*Error: illegal operands vd cannot overlap vs1 `vwadd.wv v4,v2,v5'
.*Error: illegal operands vd cannot overlap vm `vwadd.wv v0,v2,v4,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vwadd.wx v1,v2,a1'
-.*Error: illegal operands vs2 must be multiple of 2 `vwadd.wx v2,v3,a1'
.*Error: illegal operands vd cannot overlap vm `vwadd.wx v0,v2,a1,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vwsub.vv v1,v2,v4'
.*Error: illegal operands vd cannot overlap vs2 `vwsub.vv v2,v2,v4'
-.*Error: illegal operands vd cannot overlap vs2 `vwsub.vv v2,v3,v4'
.*Error: illegal operands vd cannot overlap vs1 `vwsub.vv v4,v2,v4'
-.*Error: illegal operands vd cannot overlap vs1 `vwsub.vv v4,v2,v5'
.*Error: illegal operands vd cannot overlap vm `vwsub.vv v0,v2,v4,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vwsub.vx v1,v2,a1'
.*Error: illegal operands vd cannot overlap vs2 `vwsub.vx v2,v2,a1'
-.*Error: illegal operands vd cannot overlap vs2 `vwsub.vx v2,v3,a1'
.*Error: illegal operands vd cannot overlap vm `vwsub.vx v0,v2,a1,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vwsub.wv v1,v2,v4'
-.*Error: illegal operands vs2 must be multiple of 2 `vwsub.wv v2,v3,v4'
.*Error: illegal operands vd cannot overlap vs1 `vwsub.wv v4,v2,v4'
-.*Error: illegal operands vd cannot overlap vs1 `vwsub.wv v4,v2,v5'
.*Error: illegal operands vd cannot overlap vm `vwsub.wv v0,v2,v4,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vwsub.wx v1,v2,a1'
-.*Error: illegal operands vs2 must be multiple of 2 `vwsub.wx v2,v3,a1'
.*Error: illegal operands vd cannot overlap vm `vwsub.wx v0,v2,a1,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vwmul.vv v1,v2,v4'
.*Error: illegal operands vd cannot overlap vs2 `vwmul.vv v2,v2,v4'
-.*Error: illegal operands vd cannot overlap vs2 `vwmul.vv v2,v3,v4'
.*Error: illegal operands vd cannot overlap vs1 `vwmul.vv v4,v2,v4'
-.*Error: illegal operands vd cannot overlap vs1 `vwmul.vv v4,v2,v5'
.*Error: illegal operands vd cannot overlap vm `vwmul.vv v0,v2,v4,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vwmul.vx v1,v2,a1'
.*Error: illegal operands vd cannot overlap vs2 `vwmul.vx v2,v2,a1'
-.*Error: illegal operands vd cannot overlap vs2 `vwmul.vx v2,v3,a1'
.*Error: illegal operands vd cannot overlap vm `vwmul.vx v0,v2,a1,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vwmulu.vv v1,v2,v4'
.*Error: illegal operands vd cannot overlap vs2 `vwmulu.vv v2,v2,v4'
-.*Error: illegal operands vd cannot overlap vs2 `vwmulu.vv v2,v3,v4'
.*Error: illegal operands vd cannot overlap vs1 `vwmulu.vv v4,v2,v4'
-.*Error: illegal operands vd cannot overlap vs1 `vwmulu.vv v4,v2,v5'
.*Error: illegal operands vd cannot overlap vm `vwmulu.vv v0,v2,v4,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vwmulu.vx v1,v2,a1'
.*Error: illegal operands vd cannot overlap vs2 `vwmulu.vx v2,v2,a1'
-.*Error: illegal operands vd cannot overlap vs2 `vwmulu.vx v2,v3,a1'
.*Error: illegal operands vd cannot overlap vm `vwmulu.vx v0,v2,a1,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vwmulsu.vv v1,v2,v4'
.*Error: illegal operands vd cannot overlap vs2 `vwmulsu.vv v2,v2,v4'
-.*Error: illegal operands vd cannot overlap vs2 `vwmulsu.vv v2,v3,v4'
.*Error: illegal operands vd cannot overlap vs1 `vwmulsu.vv v4,v2,v4'
-.*Error: illegal operands vd cannot overlap vs1 `vwmulsu.vv v4,v2,v5'
.*Error: illegal operands vd cannot overlap vm `vwmulsu.vv v0,v2,v4,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vwmulsu.vx v1,v2,a1'
.*Error: illegal operands vd cannot overlap vs2 `vwmulsu.vx v2,v2,a1'
-.*Error: illegal operands vd cannot overlap vs2 `vwmulsu.vx v2,v3,a1'
.*Error: illegal operands vd cannot overlap vm `vwmulsu.vx v0,v2,a1,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vwmaccu.vv v1,v2,v4'
.*Error: illegal operands vd cannot overlap vs1 `vwmaccu.vv v2,v2,v4'
-.*Error: illegal operands vd cannot overlap vs1 `vwmaccu.vv v2,v3,v4'
.*Error: illegal operands vd cannot overlap vs2 `vwmaccu.vv v4,v2,v4'
-.*Error: illegal operands vd cannot overlap vs2 `vwmaccu.vv v4,v2,v5'
.*Error: illegal operands vd cannot overlap vm `vwmaccu.vv v0,v2,v4,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vwmaccu.vx v1,a1,v2'
.*Error: illegal operands vd cannot overlap vs2 `vwmaccu.vx v2,a1,v2'
-.*Error: illegal operands vd cannot overlap vs2 `vwmaccu.vx v2,a1,v3'
.*Error: illegal operands vd cannot overlap vm `vwmaccu.vx v0,a1,v2,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vwmacc.vv v1,v2,v4'
.*Error: illegal operands vd cannot overlap vs1 `vwmacc.vv v2,v2,v4'
-.*Error: illegal operands vd cannot overlap vs1 `vwmacc.vv v2,v3,v4'
.*Error: illegal operands vd cannot overlap vs2 `vwmacc.vv v4,v2,v4'
-.*Error: illegal operands vd cannot overlap vs2 `vwmacc.vv v4,v2,v5'
.*Error: illegal operands vd cannot overlap vm `vwmacc.vv v0,v2,v4,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vwmacc.vx v1,a1,v2'
.*Error: illegal operands vd cannot overlap vs2 `vwmacc.vx v2,a1,v2'
-.*Error: illegal operands vd cannot overlap vs2 `vwmacc.vx v2,a1,v3'
.*Error: illegal operands vd cannot overlap vm `vwmacc.vx v0,a1,v2,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vwmaccsu.vv v1,v2,v4'
.*Error: illegal operands vd cannot overlap vs1 `vwmaccsu.vv v2,v2,v4'
-.*Error: illegal operands vd cannot overlap vs1 `vwmaccsu.vv v2,v3,v4'
.*Error: illegal operands vd cannot overlap vs2 `vwmaccsu.vv v4,v2,v4'
-.*Error: illegal operands vd cannot overlap vs2 `vwmaccsu.vv v4,v2,v5'
.*Error: illegal operands vd cannot overlap vm `vwmaccsu.vv v0,v2,v4,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vwmaccsu.vx v1,a1,v2'
.*Error: illegal operands vd cannot overlap vs2 `vwmaccsu.vx v2,a1,v2'
-.*Error: illegal operands vd cannot overlap vs2 `vwmaccsu.vx v2,a1,v3'
.*Error: illegal operands vd cannot overlap vm `vwmaccsu.vx v0,a1,v2,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vwmaccus.vx v1,a1,v2'
.*Error: illegal operands vd cannot overlap vs2 `vwmaccus.vx v2,a1,v2'
-.*Error: illegal operands vd cannot overlap vs2 `vwmaccus.vx v2,a1,v3'
.*Error: illegal operands vd cannot overlap vm `vwmaccus.vx v0,a1,v2,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vfwadd.vv v1,v2,v4'
.*Error: illegal operands vd cannot overlap vs2 `vfwadd.vv v2,v2,v4'
-.*Error: illegal operands vd cannot overlap vs2 `vfwadd.vv v2,v3,v4'
.*Error: illegal operands vd cannot overlap vs1 `vfwadd.vv v4,v2,v4'
-.*Error: illegal operands vd cannot overlap vs1 `vfwadd.vv v4,v2,v5'
.*Error: illegal operands vd cannot overlap vm `vfwadd.vv v0,v2,v4,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vfwadd.vf v1,v2,fa1'
.*Error: illegal operands vd cannot overlap vs2 `vfwadd.vf v2,v2,fa1'
-.*Error: illegal operands vd cannot overlap vs2 `vfwadd.vf v2,v3,fa1'
.*Error: illegal operands vd cannot overlap vm `vfwadd.vf v0,v2,fa1,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vfwadd.wv v1,v2,v4'
-.*Error: illegal operands vs2 must be multiple of 2 `vfwadd.wv v2,v3,v4'
.*Error: illegal operands vd cannot overlap vs1 `vfwadd.wv v4,v2,v4'
-.*Error: illegal operands vd cannot overlap vs1 `vfwadd.wv v4,v2,v5'
.*Error: illegal operands vd cannot overlap vm `vfwadd.wv v0,v2,v4,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vfwsub.vv v1,v2,v4'
.*Error: illegal operands vd cannot overlap vs2 `vfwsub.vv v2,v2,v4'
-.*Error: illegal operands vd cannot overlap vs2 `vfwsub.vv v2,v3,v4'
.*Error: illegal operands vd cannot overlap vs1 `vfwsub.vv v4,v2,v4'
-.*Error: illegal operands vd cannot overlap vs1 `vfwsub.vv v4,v2,v5'
.*Error: illegal operands vd cannot overlap vm `vfwsub.vv v0,v2,v4,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vfwsub.vf v1,v2,fa1'
.*Error: illegal operands vd cannot overlap vs2 `vfwsub.vf v2,v2,fa1'
-.*Error: illegal operands vd cannot overlap vs2 `vfwsub.vf v2,v3,fa1'
.*Error: illegal operands vd cannot overlap vm `vfwsub.vf v0,v2,fa1,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vfwsub.wv v1,v2,v4'
-.*Error: illegal operands vs2 must be multiple of 2 `vfwsub.wv v2,v3,v4'
.*Error: illegal operands vd cannot overlap vs1 `vfwsub.wv v4,v2,v4'
-.*Error: illegal operands vd cannot overlap vs1 `vfwsub.wv v4,v2,v5'
.*Error: illegal operands vd cannot overlap vm `vfwsub.wv v0,v2,v4,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vfwmul.vv v1,v2,v4'
.*Error: illegal operands vd cannot overlap vs2 `vfwmul.vv v2,v2,v4'
-.*Error: illegal operands vd cannot overlap vs2 `vfwmul.vv v2,v3,v4'
.*Error: illegal operands vd cannot overlap vs1 `vfwmul.vv v4,v2,v4'
-.*Error: illegal operands vd cannot overlap vs1 `vfwmul.vv v4,v2,v5'
.*Error: illegal operands vd cannot overlap vm `vfwmul.vv v0,v2,v4,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vfwsub.vf v1,v2,fa1'
.*Error: illegal operands vd cannot overlap vs2 `vfwsub.vf v2,v2,fa1'
-.*Error: illegal operands vd cannot overlap vs2 `vfwsub.vf v2,v3,fa1'
.*Error: illegal operands vd cannot overlap vm `vfwsub.vf v0,v2,fa1,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vfwmacc.vv v1,v2,v4'
.*Error: illegal operands vd cannot overlap vs1 `vfwmacc.vv v2,v2,v4'
-.*Error: illegal operands vd cannot overlap vs1 `vfwmacc.vv v2,v3,v4'
.*Error: illegal operands vd cannot overlap vs2 `vfwmacc.vv v4,v2,v4'
-.*Error: illegal operands vd cannot overlap vs2 `vfwmacc.vv v4,v2,v5'
.*Error: illegal operands vd cannot overlap vm `vfwmacc.vv v0,v2,v4,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vfwmacc.vf v1,fa1,v2'
.*Error: illegal operands vd cannot overlap vs2 `vfwmacc.vf v2,fa1,v2'
-.*Error: illegal operands vd cannot overlap vs2 `vfwmacc.vf v2,fa1,v3'
.*Error: illegal operands vd cannot overlap vm `vfwmacc.vf v0,fa1,v2,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vfwnmacc.vv v1,v2,v4'
.*Error: illegal operands vd cannot overlap vs1 `vfwnmacc.vv v2,v2,v4'
-.*Error: illegal operands vd cannot overlap vs1 `vfwnmacc.vv v2,v3,v4'
.*Error: illegal operands vd cannot overlap vs2 `vfwnmacc.vv v4,v2,v4'
-.*Error: illegal operands vd cannot overlap vs2 `vfwnmacc.vv v4,v2,v5'
.*Error: illegal operands vd cannot overlap vm `vfwnmacc.vv v0,v2,v4,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vfwnmacc.vf v1,fa1,v2'
.*Error: illegal operands vd cannot overlap vs2 `vfwnmacc.vf v2,fa1,v2'
-.*Error: illegal operands vd cannot overlap vs2 `vfwnmacc.vf v2,fa1,v3'
.*Error: illegal operands vd cannot overlap vm `vfwnmacc.vf v0,fa1,v2,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vfwmsac.vv v1,v2,v4'
.*Error: illegal operands vd cannot overlap vs1 `vfwmsac.vv v2,v2,v4'
-.*Error: illegal operands vd cannot overlap vs1 `vfwmsac.vv v2,v3,v4'
.*Error: illegal operands vd cannot overlap vs2 `vfwmsac.vv v4,v2,v4'
-.*Error: illegal operands vd cannot overlap vs2 `vfwmsac.vv v4,v2,v5'
.*Error: illegal operands vd cannot overlap vm `vfwmsac.vv v0,v2,v4,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vfwmsac.vf v1,fa1,v2'
.*Error: illegal operands vd cannot overlap vs2 `vfwmsac.vf v2,fa1,v2'
-.*Error: illegal operands vd cannot overlap vs2 `vfwmsac.vf v2,fa1,v3'
.*Error: illegal operands vd cannot overlap vm `vfwmsac.vf v0,fa1,v2,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vfwnmsac.vv v1,v2,v4'
.*Error: illegal operands vd cannot overlap vs1 `vfwnmsac.vv v2,v2,v4'
-.*Error: illegal operands vd cannot overlap vs1 `vfwnmsac.vv v2,v3,v4'
.*Error: illegal operands vd cannot overlap vs2 `vfwnmsac.vv v4,v2,v4'
-.*Error: illegal operands vd cannot overlap vs2 `vfwnmsac.vv v4,v2,v5'
.*Error: illegal operands vd cannot overlap vm `vfwnmsac.vv v0,v2,v4,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vfwnmsac.vf v1,fa1,v2'
.*Error: illegal operands vd cannot overlap vs2 `vfwnmsac.vf v2,fa1,v2'
-.*Error: illegal operands vd cannot overlap vs2 `vfwnmsac.vf v2,fa1,v3'
.*Error: illegal operands vd cannot overlap vm `vfwnmsac.vf v0,fa1,v2,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vfwcvt.xu.f.v v1,v2'
.*Error: illegal operands vd cannot overlap vs2 `vfwcvt.xu.f.v v2,v2'
-.*Error: illegal operands vd cannot overlap vs2 `vfwcvt.xu.f.v v2,v3'
.*Error: illegal operands vd cannot overlap vm `vfwcvt.xu.f.v v0,v2,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vfwcvt.x.f.v v1,v2'
.*Error: illegal operands vd cannot overlap vs2 `vfwcvt.x.f.v v2,v2'
-.*Error: illegal operands vd cannot overlap vs2 `vfwcvt.x.f.v v2,v3'
.*Error: illegal operands vd cannot overlap vm `vfwcvt.x.f.v v0,v2,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vfwcvt.rtz.xu.f.v v1,v2'
.*Error: illegal operands vd cannot overlap vs2 `vfwcvt.rtz.xu.f.v v2,v2'
-.*Error: illegal operands vd cannot overlap vs2 `vfwcvt.rtz.xu.f.v v2,v3'
.*Error: illegal operands vd cannot overlap vm `vfwcvt.rtz.xu.f.v v0,v2,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vfwcvt.rtz.x.f.v v1,v2'
.*Error: illegal operands vd cannot overlap vs2 `vfwcvt.rtz.x.f.v v2,v2'
-.*Error: illegal operands vd cannot overlap vs2 `vfwcvt.rtz.x.f.v v2,v3'
.*Error: illegal operands vd cannot overlap vm `vfwcvt.rtz.x.f.v v0,v2,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vfwcvt.f.xu.v v1,v2'
.*Error: illegal operands vd cannot overlap vs2 `vfwcvt.f.xu.v v2,v2'
-.*Error: illegal operands vd cannot overlap vs2 `vfwcvt.f.xu.v v2,v3'
.*Error: illegal operands vd cannot overlap vm `vfwcvt.f.xu.v v0,v2,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vfwcvt.f.x.v v1,v2'
.*Error: illegal operands vd cannot overlap vs2 `vfwcvt.f.x.v v2,v2'
-.*Error: illegal operands vd cannot overlap vs2 `vfwcvt.f.x.v v2,v3'
.*Error: illegal operands vd cannot overlap vm `vfwcvt.f.x.v v0,v2,v0.t'
-.*Error: illegal operands vd must be multiple of 2 `vfwcvt.f.f.v v1,v2'
.*Error: illegal operands vd cannot overlap vs2 `vfwcvt.f.f.v v2,v2'
-.*Error: illegal operands vd cannot overlap vs2 `vfwcvt.f.f.v v2,v3'
.*Error: illegal operands vd cannot overlap vm `vfwcvt.f.f.v v0,v2,v0.t'
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-widen.s b/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-widen.s
index addedd4dc26..bbddceca76f 100644
--- a/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-widen.s
+++ b/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-widen.s
@@ -1,9 +1,9 @@
# Vector Widening Integer Add/Subtract
# vwcvtu.x.x.v vd,vs,vm = vwaddu.vx vd,vs,x0,vm
- vwcvtu.x.x.v v1, v2 # vd should be multiple of 2
+ vwcvtu.x.x.v v1, v2 # OK since fractional LMUL. vd should be multiple of 2
vwcvtu.x.x.v v2, v2 # vd overlap vs2
- vwcvtu.x.x.v v2, v3 # vd overlap vs2
+ vwcvtu.x.x.v v2, v3 # OK since fractional LMUL. vd overlap vs2
vwcvtu.x.x.v v0, v2, v0.t # vd overlap vm
# vwcvt.x.x.v vd,vs,vm = vwadd.vx vd,vs,x0,vm
@@ -12,25 +12,25 @@
vwcvt.x.x.v v2, v3
vwcvt.x.x.v v0, v2, v0.t
- vwaddu.vv v1, v2, v4 # vd should be multiple of 2
+ vwaddu.vv v1, v2, v4 # OK since fractional LMUL. vd should be multiple of 2
vwaddu.vv v2, v2, v4 # vd overlap vs2
- vwaddu.vv v2, v3, v4 # vd overlap vs2
+ vwaddu.vv v2, v3, v4 # OK since fractional LMUL. vd overlap vs2
vwaddu.vv v4, v2, v4 # vd overlap vs1
- vwaddu.vv v4, v2, v5 # vd overlap vs1
+ vwaddu.vv v4, v2, v5 # OK since fractional LMUL. vd overlap vs1
vwaddu.vv v0, v2, v4, v0.t # vd overlap vm
- vwaddu.vx v1, v2, a1 # vd should be multiple of 2
+ vwaddu.vx v1, v2, a1 # OK since fractional LMUL. vd should be multiple of 2
vwaddu.vx v2, v2, a1 # vd overlap vs2
- vwaddu.vx v2, v3, a1 # vd overlap vs2
+ vwaddu.vx v2, v3, a1 # OK since fractional LMUL. vd overlap vs2
vwaddu.vx v0, v2, a1, v0.t # vd overlap vm
- vwaddu.wv v1, v2, v4 # vd should be multiple of 2
+ vwaddu.wv v1, v2, v4 # OK since fractional LMUL. vd should be multiple of 2
vwaddu.wv v2, v2, v4 # OK
- vwaddu.wv v2, v3, v4 # vs2 should be multiple of 2
+ vwaddu.wv v2, v3, v4 # OK since fractional LMUL. vs2 should be multiple of 2
vwaddu.wv v4, v2, v4 # vd overlap vs1
- vwaddu.wv v4, v2, v5 # vd overlap vs1
+ vwaddu.wv v4, v2, v5 # OK since fractional LMUL. vd overlap vs1
vwaddu.wv v0, v2, v4, v0.t # vd overlap vm
- vwaddu.wx v1, v2, a1 # vd should be multiple of 2
+ vwaddu.wx v1, v2, a1 # OK since fractional LMUL. vd should be multiple of 2
vwaddu.wx v2, v2, a1 # OK
- vwaddu.wx v2, v3, a1 # vs2 should be multiple of 2
+ vwaddu.wx v2, v3, a1 # OK since fractional LMUL. vs2 should be multiple of 2
vwaddu.wx v0, v2, a1, v0.t # vd overlap vm
vwsubu.vv v1, v2, v4
@@ -98,15 +98,15 @@
# Vector Widening Integer Multiply Instructions
- vwmul.vv v1, v2, v4 # vd should be multiple of 2
+ vwmul.vv v1, v2, v4 # OK since fractional LMUL. vd should be multiple of 2
vwmul.vv v2, v2, v4 # vd overlap vs2
- vwmul.vv v2, v3, v4 # vd overlap vs2
+ vwmul.vv v2, v3, v4 # OK since fractional LMUL. vd overlap vs2
vwmul.vv v4, v2, v4 # vd overlap vs1
- vwmul.vv v4, v2, v5 # vd overlap vs1
+ vwmul.vv v4, v2, v5 # OK since fractional LMUL. vd overlap vs1
vwmul.vv v0, v2, v4, v0.t # vd overlap vm
- vwmul.vx v1, v2, a1 # vd should be multiple of 2
+ vwmul.vx v1, v2, a1 # OK since fractional LMUL. vd should be multiple of 2
vwmul.vx v2, v2, a1 # vd overlap vs2
- vwmul.vx v2, v3, a1 # vd overlap vs2
+ vwmul.vx v2, v3, a1 # OK since fractional LMUL. vd overlap vs2
vwmul.vx v0, v2, a1, v0.t # vd overlap vm
vwmulu.vv v1, v2, v4
@@ -133,15 +133,15 @@
# Vector Widening Integer Multiply-Add Instructions
- vwmaccu.vv v1, v2, v4 # vd should be multiple of 2
+ vwmaccu.vv v1, v2, v4 # OK since fractional LMUL. vd should be multiple of 2
vwmaccu.vv v2, v2, v4 # vd overlap vs1
- vwmaccu.vv v2, v3, v4 # vd overlap vs1
+ vwmaccu.vv v2, v3, v4 # OK since fractional LMUL. vd overlap vs1
vwmaccu.vv v4, v2, v4 # vd overlap vs2
- vwmaccu.vv v4, v2, v5 # vd overlap vs2
+ vwmaccu.vv v4, v2, v5 # OK since fractional LMUL. vd overlap vs2
vwmaccu.vv v0, v2, v4, v0.t # vd overlap vm
- vwmaccu.vx v1, a1, v2 # vd should be multiple of 2
+ vwmaccu.vx v1, a1, v2 # OK since fractional LMUL. vd should be multiple of 2
vwmaccu.vx v2, a1, v2 # vd overlap vs2
- vwmaccu.vx v2, a1, v3 # vd overlap vs2
+ vwmaccu.vx v2, a1, v3 # OK since fractional LMUL. vd overlap vs2
vwmaccu.vx v0, a1, v2, v0.t # vd overlap vm
vwmacc.vv v1, v2, v4
@@ -166,28 +166,28 @@
vwmaccsu.vx v2, a1, v3
vwmaccsu.vx v0, a1, v2, v0.t
- vwmaccus.vx v1, a1, v2 # vd should be multiple of 2
+ vwmaccus.vx v1, a1, v2 # OK since fractional LMUL. vd should be multiple of 2
vwmaccus.vx v2, a1, v2 # vd overlap vs2
- vwmaccus.vx v2, a1, v3 # vd overlap vs2
+ vwmaccus.vx v2, a1, v3 # OK since fractional LMUL. vd overlap vs2
vwmaccus.vx v0, a1, v2, v0.t # vd overlap vm
# Vector Widening Floating-Point Add/Subtract Instructions
- vfwadd.vv v1, v2, v4 # vd should be multiple of 2
+ vfwadd.vv v1, v2, v4 # OK since fractional LMUL. vd should be multiple of 2
vfwadd.vv v2, v2, v4 # vd overlap vs2
- vfwadd.vv v2, v3, v4 # vd overlap vs2
+ vfwadd.vv v2, v3, v4 # OK since fractional LMUL. vd overlap vs2
vfwadd.vv v4, v2, v4 # vd overlap vs1
- vfwadd.vv v4, v2, v5 # vd overlap vs1
+ vfwadd.vv v4, v2, v5 # OK since fractional LMUL. vd overlap vs1
vfwadd.vv v0, v2, v4, v0.t # vd overlap vm
- vfwadd.vf v1, v2, fa1 # vd should be multiple of 2
+ vfwadd.vf v1, v2, fa1 # OK since fractional LMUL. vd should be multiple of 2
vfwadd.vf v2, v2, fa1 # vd overlap vs2
- vfwadd.vf v2, v3, fa1 # vd overlap vs2
+ vfwadd.vf v2, v3, fa1 # OK since fractional LMUL. vd overlap vs2
vfwadd.vf v0, v2, fa1, v0.t # vd overlap vm
- vfwadd.wv v1, v2, v4 # vd should be multiple of 2
+ vfwadd.wv v1, v2, v4 # OK since fractional LMUL. vd should be multiple of 2
vfwadd.wv v2, v2, v4 # OK
- vfwadd.wv v2, v3, v4 # vs2 should be multiple of 2
+ vfwadd.wv v2, v3, v4 # OK since fractional LMUL. vs2 should be multiple of 2
vfwadd.wv v4, v2, v4 # vd overlap vs1
- vfwadd.wv v4, v2, v5 # vd overlap vs1
+ vfwadd.wv v4, v2, v5 # OK since fractional LMUL. vd overlap vs1
vfwadd.wv v0, v2, v4, v0.t # vd overlap vm
vfwsub.vv v1, v2, v4
@@ -209,27 +209,27 @@
# Vector Widening Floating-Point Multiply
- vfwmul.vv v1, v2, v4 # vd should be multiple of 2
+ vfwmul.vv v1, v2, v4 # OK since fractional LMUL. vd should be multiple of 2
vfwmul.vv v2, v2, v4 # vd overlap vs2
- vfwmul.vv v2, v3, v4 # vd overlap vs2
+ vfwmul.vv v2, v3, v4 # OK since fractional LMUL. vd overlap vs2
vfwmul.vv v4, v2, v4 # vd overlap vs1
- vfwmul.vv v4, v2, v5 # vd overlap vs1
+ vfwmul.vv v4, v2, v5 # OK since fractional LMUL. vd overlap vs1
vfwmul.vv v0, v2, v4, v0.t # vd overlap vm
- vfwsub.vf v1, v2, fa1 # vd should be multiple of 2
+ vfwsub.vf v1, v2, fa1 # OK since fractional LMUL. vd should be multiple of 2
vfwsub.vf v2, v2, fa1 # vd overlap vs2
- vfwsub.vf v2, v3, fa1 # vd overlap vs2
+ vfwsub.vf v2, v3, fa1 # OK since fractional LMUL. vd overlap vs2
vfwsub.vf v0, v2, fa1, v0.t # vd overlap vm
# Vector Widening Floating-Point Fused Multiply-Add Instructions
- vfwmacc.vv v1, v2, v4 # vd should be multiple of 2
+ vfwmacc.vv v1, v2, v4 # OK since fractional LMUL. vd should be multiple of 2
vfwmacc.vv v2, v2, v4 # vd overlap vs1
- vfwmacc.vv v2, v3, v4 # vd overlap vs1
+ vfwmacc.vv v2, v3, v4 # OK since fractional LMUL. vd overlap vs1
vfwmacc.vv v4, v2, v4 # vd overlap vs2
- vfwmacc.vv v4, v2, v5 # vd overlap vs2
+ vfwmacc.vv v4, v2, v5 # OK since fractional LMUL. vd overlap vs2
vfwmacc.vv v0, v2, v4, v0.t # vd overlap vm
- vfwmacc.vf v1, fa1, v2 # vd should be multiple of 2
+ vfwmacc.vf v1, fa1, v2 # OK since fractional LMUL. vd should be multiple of 2
vfwmacc.vf v2, fa1, v2 # vd overlap vs2
- vfwmacc.vf v2, fa1, v3 # vd overlap vs2
+ vfwmacc.vf v2, fa1, v3 # OK since fractional LMUL. vd overlap vs2
vfwmacc.vf v0, fa1, v2, v0.t # vd overlap vm
vfwnmacc.vv v1, v2, v4
@@ -267,9 +267,9 @@
# Widening Floating-Point/Integer Type-Convert Instructions
- vfwcvt.xu.f.v v1, v2 # vd should be multiple of 2
+ vfwcvt.xu.f.v v1, v2 # OK since fractional LMUL. vd should be multiple of 2
vfwcvt.xu.f.v v2, v2 # vd overlap vs2
- vfwcvt.xu.f.v v2, v3 # vd overlap vs2
+ vfwcvt.xu.f.v v2, v3 # OK since fractional LMUL. vd overlap vs2
vfwcvt.xu.f.v v0, v2, v0.t # vd overlap vm
vfwcvt.x.f.v v1, v2
vfwcvt.x.f.v v2, v2
diff --git a/opcodes/riscv-opc.c b/opcodes/riscv-opc.c
index cccc3553c60..9d733aff9d3 100644
--- a/opcodes/riscv-opc.c
+++ b/opcodes/riscv-opc.c
@@ -1084,141 +1084,38 @@ match_vd_eq_vs1_eq_vs2 (const struct riscv_opcode *op,
/* These are used to check the vector constraints. */
static int
-match_widen_vd_neq_vs1_neq_vs2_neq_vm (const struct riscv_opcode *op,
- insn_t insn,
- int constraints,
- const char **error)
+match_vd_neq_vs1_neq_vs2 (const struct riscv_opcode *op,
+ insn_t insn,
+ int constraints,
+ const char **error)
{
int vd = (insn & MASK_VD) >> OP_SH_VD;
int vs1 = (insn & MASK_VS1) >> OP_SH_VS1;
int vs2 = (insn & MASK_VS2) >> OP_SH_VS2;
- int vm = (insn & MASK_VMASK) >> OP_SH_VMASK;
if (!constraints || error == NULL)
return match_opcode (op, insn, 0, NULL);
- if ((vd % 2) != 0)
- *error = "illegal operands vd must be multiple of 2";
- else if (vs1 >= vd && vs1 <= (vd + 1))
+ if (vs1 == vd)
*error = "illegal operands vd cannot overlap vs1";
- else if (vs2 >= vd && vs2 <= (vd + 1))
+ else if (vs2 == vd)
*error = "illegal operands vd cannot overlap vs2";
- else if (!vm && vm >= vd && vm <= (vd + 1))
- *error = "illegal operands vd cannot overlap vm";
else
return match_opcode (op, insn, 0, NULL);
return 0;
}
static int
-match_widen_vd_neq_vs1_neq_vm (const struct riscv_opcode *op,
- insn_t insn,
- int constraints,
- const char **error)
+match_vd_neq_vs1_neq_vs2_neq_vm (const struct riscv_opcode *op,
+ insn_t insn,
+ int constraints,
+ const char **error)
{
int vd = (insn & MASK_VD) >> OP_SH_VD;
int vs1 = (insn & MASK_VS1) >> OP_SH_VS1;
int vs2 = (insn & MASK_VS2) >> OP_SH_VS2;
int vm = (insn & MASK_VMASK) >> OP_SH_VMASK;
- if (!constraints || error == NULL)
- return match_opcode (op, insn, 0, NULL);
-
- if ((vd % 2) != 0)
- *error = "illegal operands vd must be multiple of 2";
- else if ((vs2 % 2) != 0)
- *error = "illegal operands vs2 must be multiple of 2";
- else if (vs1 >= vd && vs1 <= (vd + 1))
- *error = "illegal operands vd cannot overlap vs1";
- else if (!vm && vm >= vd && vm <= (vd + 1))
- *error = "illegal operands vd cannot overlap vm";
- else
- return match_opcode (op, insn, 0, NULL);
- return 0;
-}
-
-static int
-match_widen_vd_neq_vs2_neq_vm (const struct riscv_opcode *op,
- insn_t insn,
- int constraints,
- const char **error)
-{
- int vd = (insn & MASK_VD) >> OP_SH_VD;
- int vs2 = (insn & MASK_VS2) >> OP_SH_VS2;
- int vm = (insn & MASK_VMASK) >> OP_SH_VMASK;
-
- if (!constraints || error == NULL)
- return match_opcode (op, insn, 0, NULL);
-
- if ((vd % 2) != 0)
- *error = "illegal operands vd must be multiple of 2";
- else if (vs2 >= vd && vs2 <= (vd + 1))
- *error = "illegal operands vd cannot overlap vs2";
- else if (!vm && vm >= vd && vm <= (vd + 1))
- *error = "illegal operands vd cannot overlap vm";
- else
- return match_opcode (op, insn, 0, NULL);
- return 0;
-}
-
-static int
-match_widen_vd_neq_vm (const struct riscv_opcode *op,
- insn_t insn,
- int constraints,
- const char **error)
-{
- int vd = (insn & MASK_VD) >> OP_SH_VD;
- int vs2 = (insn & MASK_VS2) >> OP_SH_VS2;
- int vm = (insn & MASK_VMASK) >> OP_SH_VMASK;
-
- if (!constraints || error == NULL)
- return match_opcode (op, insn, 0, NULL);
-
- if ((vd % 2) != 0)
- *error = "illegal operands vd must be multiple of 2";
- else if ((vs2 % 2) != 0)
- *error = "illegal operands vs2 must be multiple of 2";
- else if (!vm && vm >= vd && vm <= (vd + 1))
- *error = "illegal operands vd cannot overlap vm";
- else
- return match_opcode (op, insn, 0, NULL);
- return 0;
-}
-
-static int
-match_narrow_vd_neq_vs2_neq_vm (const struct riscv_opcode *op,
- insn_t insn,
- int constraints,
- const char **error)
-{
- int vd = (insn & MASK_VD) >> OP_SH_VD;
- int vs2 = (insn & MASK_VS2) >> OP_SH_VS2;
- int vm = (insn & MASK_VMASK) >> OP_SH_VMASK;
-
- if (!constraints || error == NULL)
- return match_opcode (op, insn, 0, NULL);
-
- if ((vs2 % 2) != 0)
- *error = "illegal operands vd must be multiple of 2";
- else if (vd >= vs2 && vd <= (vs2 + 1))
- *error = "illegal operands vd cannot overlap vs2";
- else if (!vm && vd >= vm && vd <= (vm + 1))
- *error = "illegal operands vd cannot overlap vm";
- else
- return match_opcode (op, insn, 0, NULL);
- return 0;
-}
-
-static int
-match_vd_neq_vs1_neq_vs2 (const struct riscv_opcode *op,
- insn_t insn,
- int constraints,
- const char **error)
-{
- int vd = (insn & MASK_VD) >> OP_SH_VD;
- int vs1 = (insn & MASK_VS1) >> OP_SH_VS1;
- int vs2 = (insn & MASK_VS2) >> OP_SH_VS2;
-
if (!constraints || error == NULL)
return match_opcode (op, insn, 0, NULL);
@@ -1226,20 +1123,21 @@ match_vd_neq_vs1_neq_vs2 (const struct riscv_opcode *op,
*error = "illegal operands vd cannot overlap vs1";
else if (vs2 == vd)
*error = "illegal operands vd cannot overlap vs2";
+ else if (!vm && vm == vd)
+ *error = "illegal operands vd cannot overlap vm";
else
return match_opcode (op, insn, 0, NULL);
return 0;
}
static int
-match_vd_neq_vs1_neq_vs2_neq_vm (const struct riscv_opcode *op,
- insn_t insn,
- int constraints,
- const char **error)
+match_vd_neq_vs1_neq_vm (const struct riscv_opcode *op,
+ insn_t insn,
+ int constraints,
+ const char **error)
{
int vd = (insn & MASK_VD) >> OP_SH_VD;
int vs1 = (insn & MASK_VS1) >> OP_SH_VS1;
- int vs2 = (insn & MASK_VS2) >> OP_SH_VS2;
int vm = (insn & MASK_VMASK) >> OP_SH_VMASK;
if (!constraints || error == NULL)
@@ -1247,8 +1145,6 @@ match_vd_neq_vs1_neq_vs2_neq_vm (const struct riscv_opcode *op,
if (vs1 == vd)
*error = "illegal operands vd cannot overlap vs1";
- else if (vs2 == vd)
- *error = "illegal operands vd cannot overlap vs2";
else if (!vm && vm == vd)
*error = "illegal operands vd cannot overlap vm";
else
@@ -1818,25 +1714,25 @@ const struct riscv_opcode riscv_draft_opcodes[] =
{"vrsub.vx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VRSUBVX, MASK_VRSUBVX, match_vd_neq_vm, 0 },
{"vrsub.vi", 0, INSN_CLASS_V, "Vd,Vt,ViVm", MATCH_VRSUBVI, MASK_VRSUBVI, match_vd_neq_vm, 0 },
-{"vwcvt.x.x.v", 0, INSN_CLASS_V, "Vd,VtVm", MATCH_VWCVTXXV, MASK_VWCVTXXV, match_widen_vd_neq_vs2_neq_vm, INSN_ALIAS },
-{"vwcvtu.x.x.v", 0, INSN_CLASS_V, "Vd,VtVm", MATCH_VWCVTUXXV, MASK_VWCVTUXXV, match_widen_vd_neq_vs2_neq_vm, INSN_ALIAS },
-
-{"vwaddu.vv", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VWADDUVV, MASK_VWADDUVV, match_widen_vd_neq_vs1_neq_vs2_neq_vm, 0 },
-{"vwaddu.vx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VWADDUVX, MASK_VWADDUVX, match_widen_vd_neq_vs2_neq_vm, 0 },
-{"vwsubu.vv", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VWSUBUVV, MASK_VWSUBUVV, match_widen_vd_neq_vs1_neq_vs2_neq_vm, 0 },
-{"vwsubu.vx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VWSUBUVX, MASK_VWSUBUVX, match_widen_vd_neq_vs2_neq_vm, 0 },
-{"vwadd.vv", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VWADDVV, MASK_VWADDVV, match_widen_vd_neq_vs1_neq_vs2_neq_vm, 0 },
-{"vwadd.vx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VWADDVX, MASK_VWADDVX, match_widen_vd_neq_vs2_neq_vm, 0 },
-{"vwsub.vv", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VWSUBVV, MASK_VWSUBVV, match_widen_vd_neq_vs1_neq_vs2_neq_vm, 0 },
-{"vwsub.vx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VWSUBVX, MASK_VWSUBVX, match_widen_vd_neq_vs2_neq_vm, 0 },
-{"vwaddu.wv", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VWADDUWV, MASK_VWADDUWV, match_widen_vd_neq_vs1_neq_vm, 0 },
-{"vwaddu.wx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VWADDUWX, MASK_VWADDUWX, match_widen_vd_neq_vm, 0 },
-{"vwsubu.wv", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VWSUBUWV, MASK_VWSUBUWV, match_widen_vd_neq_vs1_neq_vm, 0 },
-{"vwsubu.wx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VWSUBUWX, MASK_VWSUBUWX, match_widen_vd_neq_vm, 0 },
-{"vwadd.wv", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VWADDWV, MASK_VWADDWV, match_widen_vd_neq_vs1_neq_vm, 0 },
-{"vwadd.wx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VWADDWX, MASK_VWADDWX, match_widen_vd_neq_vm, 0 },
-{"vwsub.wv", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VWSUBWV, MASK_VWSUBWV, match_widen_vd_neq_vs1_neq_vm, 0 },
-{"vwsub.wx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VWSUBWX, MASK_VWSUBWX, match_widen_vd_neq_vm, 0 },
+{"vwcvt.x.x.v", 0, INSN_CLASS_V, "Vd,VtVm", MATCH_VWCVTXXV, MASK_VWCVTXXV, match_vd_neq_vs2_neq_vm, INSN_ALIAS },
+{"vwcvtu.x.x.v", 0, INSN_CLASS_V, "Vd,VtVm", MATCH_VWCVTUXXV, MASK_VWCVTUXXV, match_vd_neq_vs2_neq_vm, INSN_ALIAS },
+
+{"vwaddu.vv", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VWADDUVV, MASK_VWADDUVV, match_vd_neq_vs1_neq_vs2_neq_vm, 0 },
+{"vwaddu.vx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VWADDUVX, MASK_VWADDUVX, match_vd_neq_vs2_neq_vm, 0 },
+{"vwsubu.vv", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VWSUBUVV, MASK_VWSUBUVV, match_vd_neq_vs1_neq_vs2_neq_vm, 0 },
+{"vwsubu.vx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VWSUBUVX, MASK_VWSUBUVX, match_vd_neq_vs2_neq_vm, 0 },
+{"vwadd.vv", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VWADDVV, MASK_VWADDVV, match_vd_neq_vs1_neq_vs2_neq_vm, 0 },
+{"vwadd.vx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VWADDVX, MASK_VWADDVX, match_vd_neq_vs2_neq_vm, 0 },
+{"vwsub.vv", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VWSUBVV, MASK_VWSUBVV, match_vd_neq_vs1_neq_vs2_neq_vm, 0 },
+{"vwsub.vx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VWSUBVX, MASK_VWSUBVX, match_vd_neq_vs2_neq_vm, 0 },
+{"vwaddu.wv", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VWADDUWV, MASK_VWADDUWV, match_vd_neq_vs1_neq_vm, 0 },
+{"vwaddu.wx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VWADDUWX, MASK_VWADDUWX, match_vd_neq_vm, 0 },
+{"vwsubu.wv", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VWSUBUWV, MASK_VWSUBUWV, match_vd_neq_vs1_neq_vm, 0 },
+{"vwsubu.wx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VWSUBUWX, MASK_VWSUBUWX, match_vd_neq_vm, 0 },
+{"vwadd.wv", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VWADDWV, MASK_VWADDWV, match_vd_neq_vs1_neq_vm, 0 },
+{"vwadd.wx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VWADDWX, MASK_VWADDWX, match_vd_neq_vm, 0 },
+{"vwsub.wv", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VWSUBWV, MASK_VWSUBWV, match_vd_neq_vs1_neq_vm, 0 },
+{"vwsub.wx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VWSUBWX, MASK_VWSUBWX, match_vd_neq_vm, 0 },
{"vzext.vf2", 0, INSN_CLASS_V, "Vd,VtVm", MATCH_VZEXT_VF2, MASK_VZEXT_VF2, match_vd_neq_vm, 0 },
{"vsext.vf2", 0, INSN_CLASS_V, "Vd,VtVm", MATCH_VSEXT_VF2, MASK_VSEXT_VF2, match_vd_neq_vm, 0 },
@@ -1883,14 +1779,14 @@ const struct riscv_opcode riscv_draft_opcodes[] =
{"vsra.vx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VSRAVX, MASK_VSRAVX, match_vd_neq_vm, 0 },
{"vsra.vi", 0, INSN_CLASS_V, "Vd,Vt,VjVm", MATCH_VSRAVI, MASK_VSRAVI, match_vd_neq_vm, 0 },
-{"vncvt.x.x.w",0, INSN_CLASS_V, "Vd,VtVm", MATCH_VNCVTXXW, MASK_VNCVTXXW, match_narrow_vd_neq_vs2_neq_vm, INSN_ALIAS },
+{"vncvt.x.x.w",0, INSN_CLASS_V, "Vd,VtVm", MATCH_VNCVTXXW, MASK_VNCVTXXW, match_opcode, INSN_ALIAS },
-{"vnsrl.wv", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VNSRLWV, MASK_VNSRLWV, match_narrow_vd_neq_vs2_neq_vm, 0 },
-{"vnsrl.wx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VNSRLWX, MASK_VNSRLWX, match_narrow_vd_neq_vs2_neq_vm, 0 },
-{"vnsrl.wi", 0, INSN_CLASS_V, "Vd,Vt,VjVm", MATCH_VNSRLWI, MASK_VNSRLWI, match_narrow_vd_neq_vs2_neq_vm, 0 },
-{"vnsra.wv", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VNSRAWV, MASK_VNSRAWV, match_narrow_vd_neq_vs2_neq_vm, 0 },
-{"vnsra.wx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VNSRAWX, MASK_VNSRAWX, match_narrow_vd_neq_vs2_neq_vm, 0 },
-{"vnsra.wi", 0, INSN_CLASS_V, "Vd,Vt,VjVm", MATCH_VNSRAWI, MASK_VNSRAWI, match_narrow_vd_neq_vs2_neq_vm, 0 },
+{"vnsrl.wv", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VNSRLWV, MASK_VNSRLWV, match_opcode, 0 },
+{"vnsrl.wx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VNSRLWX, MASK_VNSRLWX, match_opcode, 0 },
+{"vnsrl.wi", 0, INSN_CLASS_V, "Vd,Vt,VjVm", MATCH_VNSRLWI, MASK_VNSRLWI, match_opcode, 0 },
+{"vnsra.wv", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VNSRAWV, MASK_VNSRAWV, match_opcode, 0 },
+{"vnsra.wx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VNSRAWX, MASK_VNSRAWX, match_opcode, 0 },
+{"vnsra.wi", 0, INSN_CLASS_V, "Vd,Vt,VjVm", MATCH_VNSRAWI, MASK_VNSRAWI, match_opcode, 0 },
{"vmseq.vv", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VMSEQVV, MASK_VMSEQVV, match_opcode, 0 },
{"vmseq.vx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VMSEQVX, MASK_VMSEQVX, match_opcode, 0 },
@@ -1948,12 +1844,12 @@ const struct riscv_opcode riscv_draft_opcodes[] =
{"vmulhsu.vv", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VMULHSUVV, MASK_VMULHSUVV, match_vd_neq_vm, 0 },
{"vmulhsu.vx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VMULHSUVX, MASK_VMULHSUVX, match_vd_neq_vm, 0 },
-{"vwmul.vv", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VWMULVV, MASK_VWMULVV, match_widen_vd_neq_vs1_neq_vs2_neq_vm, 0 },
-{"vwmul.vx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VWMULVX, MASK_VWMULVX, match_widen_vd_neq_vs2_neq_vm, 0 },
-{"vwmulu.vv", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VWMULUVV, MASK_VWMULUVV, match_widen_vd_neq_vs1_neq_vs2_neq_vm, 0 },
-{"vwmulu.vx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VWMULUVX, MASK_VWMULUVX, match_widen_vd_neq_vs2_neq_vm, 0 },
-{"vwmulsu.vv", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VWMULSUVV, MASK_VWMULSUVV, match_widen_vd_neq_vs1_neq_vs2_neq_vm, 0 },
-{"vwmulsu.vx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VWMULSUVX, MASK_VWMULSUVX, match_widen_vd_neq_vs2_neq_vm, 0 },
+{"vwmul.vv", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VWMULVV, MASK_VWMULVV, match_vd_neq_vs1_neq_vs2_neq_vm, 0 },
+{"vwmul.vx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VWMULVX, MASK_VWMULVX, match_vd_neq_vs2_neq_vm, 0 },
+{"vwmulu.vv", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VWMULUVV, MASK_VWMULUVV, match_vd_neq_vs1_neq_vs2_neq_vm, 0 },
+{"vwmulu.vx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VWMULUVX, MASK_VWMULUVX, match_vd_neq_vs2_neq_vm, 0 },
+{"vwmulsu.vv", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VWMULSUVV, MASK_VWMULSUVV, match_vd_neq_vs1_neq_vs2_neq_vm, 0 },
+{"vwmulsu.vx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VWMULSUVX, MASK_VWMULSUVX, match_vd_neq_vs2_neq_vm, 0 },
{"vmacc.vv", 0, INSN_CLASS_V, "Vd,Vs,VtVm", MATCH_VMACCVV, MASK_VMACCVV, match_vd_neq_vm, 0},
{"vmacc.vx", 0, INSN_CLASS_V, "Vd,s,VtVm", MATCH_VMACCVX, MASK_VMACCVX, match_vd_neq_vm, 0},
@@ -1964,13 +1860,13 @@ const struct riscv_opcode riscv_draft_opcodes[] =
{"vnmsub.vv", 0, INSN_CLASS_V, "Vd,Vs,VtVm", MATCH_VNMSUBVV, MASK_VNMSUBVV, match_vd_neq_vm, 0},
{"vnmsub.vx", 0, INSN_CLASS_V, "Vd,s,VtVm", MATCH_VNMSUBVX, MASK_VNMSUBVX, match_vd_neq_vm, 0},
-{"vwmaccu.vv", 0, INSN_CLASS_V, "Vd,Vs,VtVm", MATCH_VWMACCUVV, MASK_VWMACCUVV, match_widen_vd_neq_vs1_neq_vs2_neq_vm, 0},
-{"vwmaccu.vx", 0, INSN_CLASS_V, "Vd,s,VtVm", MATCH_VWMACCUVX, MASK_VWMACCUVX, match_widen_vd_neq_vs2_neq_vm, 0},
-{"vwmacc.vv", 0, INSN_CLASS_V, "Vd,Vs,VtVm", MATCH_VWMACCVV, MASK_VWMACCVV, match_widen_vd_neq_vs1_neq_vs2_neq_vm, 0},
-{"vwmacc.vx", 0, INSN_CLASS_V, "Vd,s,VtVm", MATCH_VWMACCVX, MASK_VWMACCVX, match_widen_vd_neq_vs2_neq_vm, 0},
-{"vwmaccsu.vv", 0, INSN_CLASS_V, "Vd,Vs,VtVm", MATCH_VWMACCSUVV, MASK_VWMACCSUVV, match_widen_vd_neq_vs1_neq_vs2_neq_vm, 0},
-{"vwmaccsu.vx", 0, INSN_CLASS_V, "Vd,s,VtVm", MATCH_VWMACCSUVX, MASK_VWMACCSUVX, match_widen_vd_neq_vs2_neq_vm, 0},
-{"vwmaccus.vx", 0, INSN_CLASS_V, "Vd,s,VtVm", MATCH_VWMACCUSVX, MASK_VWMACCUSVX, match_widen_vd_neq_vs2_neq_vm, 0},
+{"vwmaccu.vv", 0, INSN_CLASS_V, "Vd,Vs,VtVm", MATCH_VWMACCUVV, MASK_VWMACCUVV, match_vd_neq_vs1_neq_vs2_neq_vm, 0},
+{"vwmaccu.vx", 0, INSN_CLASS_V, "Vd,s,VtVm", MATCH_VWMACCUVX, MASK_VWMACCUVX, match_vd_neq_vs2_neq_vm, 0},
+{"vwmacc.vv", 0, INSN_CLASS_V, "Vd,Vs,VtVm", MATCH_VWMACCVV, MASK_VWMACCVV, match_vd_neq_vs1_neq_vs2_neq_vm, 0},
+{"vwmacc.vx", 0, INSN_CLASS_V, "Vd,s,VtVm", MATCH_VWMACCVX, MASK_VWMACCVX, match_vd_neq_vs2_neq_vm, 0},
+{"vwmaccsu.vv", 0, INSN_CLASS_V, "Vd,Vs,VtVm", MATCH_VWMACCSUVV, MASK_VWMACCSUVV, match_vd_neq_vs1_neq_vs2_neq_vm, 0},
+{"vwmaccsu.vx", 0, INSN_CLASS_V, "Vd,s,VtVm", MATCH_VWMACCSUVX, MASK_VWMACCSUVX, match_vd_neq_vs2_neq_vm, 0},
+{"vwmaccus.vx", 0, INSN_CLASS_V, "Vd,s,VtVm", MATCH_VWMACCUSVX, MASK_VWMACCUSVX, match_vd_neq_vs2_neq_vm, 0},
{"vdivu.vv", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VDIVUVV, MASK_VDIVUVV, match_vd_neq_vm, 0 },
{"vdivu.vx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VDIVUVX, MASK_VDIVUVX, match_vd_neq_vm, 0 },
@@ -2019,12 +1915,12 @@ const struct riscv_opcode riscv_draft_opcodes[] =
{"vssra.vx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VSSRAVX, MASK_VSSRAVX, match_vd_neq_vm, 0 },
{"vssra.vi", 0, INSN_CLASS_V, "Vd,Vt,VjVm", MATCH_VSSRAVI, MASK_VSSRAVI, match_vd_neq_vm, 0 },
-{"vnclipu.wv", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VNCLIPUWV, MASK_VNCLIPUWV, match_narrow_vd_neq_vs2_neq_vm, 0 },
-{"vnclipu.wx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VNCLIPUWX, MASK_VNCLIPUWX, match_narrow_vd_neq_vs2_neq_vm, 0 },
-{"vnclipu.wi", 0, INSN_CLASS_V, "Vd,Vt,VjVm", MATCH_VNCLIPUWI, MASK_VNCLIPUWI, match_narrow_vd_neq_vs2_neq_vm, 0 },
-{"vnclip.wv", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VNCLIPWV, MASK_VNCLIPWV, match_narrow_vd_neq_vs2_neq_vm, 0 },
-{"vnclip.wx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VNCLIPWX, MASK_VNCLIPWX, match_narrow_vd_neq_vs2_neq_vm, 0 },
-{"vnclip.wi", 0, INSN_CLASS_V, "Vd,Vt,VjVm", MATCH_VNCLIPWI, MASK_VNCLIPWI, match_narrow_vd_neq_vs2_neq_vm, 0 },
+{"vnclipu.wv", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VNCLIPUWV, MASK_VNCLIPUWV, match_opcode, 0 },
+{"vnclipu.wx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VNCLIPUWX, MASK_VNCLIPUWX, match_opcode, 0 },
+{"vnclipu.wi", 0, INSN_CLASS_V, "Vd,Vt,VjVm", MATCH_VNCLIPUWI, MASK_VNCLIPUWI, match_opcode, 0 },
+{"vnclip.wv", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VNCLIPWV, MASK_VNCLIPWV, match_opcode, 0 },
+{"vnclip.wx", 0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VNCLIPWX, MASK_VNCLIPWX, match_opcode, 0 },
+{"vnclip.wi", 0, INSN_CLASS_V, "Vd,Vt,VjVm", MATCH_VNCLIPWI, MASK_VNCLIPWI, match_opcode, 0 },
{"vfadd.vv", 0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFADDVV, MASK_VFADDVV, match_vd_neq_vm, 0},
{"vfadd.vf", 0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFADDVF, MASK_VFADDVF, match_vd_neq_vm, 0},
@@ -2032,14 +1928,14 @@ const struct riscv_opcode riscv_draft_opcodes[] =
{"vfsub.vf", 0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFSUBVF, MASK_VFSUBVF, match_vd_neq_vm, 0},
{"vfrsub.vf", 0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFRSUBVF, MASK_VFRSUBVF, match_vd_neq_vm, 0},
-{"vfwadd.vv", 0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFWADDVV, MASK_VFWADDVV, match_widen_vd_neq_vs1_neq_vs2_neq_vm, 0},
-{"vfwadd.vf", 0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFWADDVF, MASK_VFWADDVF, match_widen_vd_neq_vs2_neq_vm, 0},
-{"vfwsub.vv", 0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFWSUBVV, MASK_VFWSUBVV, match_widen_vd_neq_vs1_neq_vs2_neq_vm, 0},
-{"vfwsub.vf", 0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFWSUBVF, MASK_VFWSUBVF, match_widen_vd_neq_vs2_neq_vm, 0},
-{"vfwadd.wv", 0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFWADDWV, MASK_VFWADDWV, match_widen_vd_neq_vs1_neq_vm, 0},
-{"vfwadd.wf", 0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFWADDWF, MASK_VFWADDWF, match_widen_vd_neq_vm, 0},
-{"vfwsub.wv", 0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFWSUBWV, MASK_VFWSUBWV, match_widen_vd_neq_vs1_neq_vm, 0},
-{"vfwsub.wf", 0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFWSUBWF, MASK_VFWSUBWF, match_widen_vd_neq_vm, 0},
+{"vfwadd.vv", 0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFWADDVV, MASK_VFWADDVV, match_vd_neq_vs1_neq_vs2_neq_vm, 0},
+{"vfwadd.vf", 0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFWADDVF, MASK_VFWADDVF, match_vd_neq_vs2_neq_vm, 0},
+{"vfwsub.vv", 0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFWSUBVV, MASK_VFWSUBVV, match_vd_neq_vs1_neq_vs2_neq_vm, 0},
+{"vfwsub.vf", 0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFWSUBVF, MASK_VFWSUBVF, match_vd_neq_vs2_neq_vm, 0},
+{"vfwadd.wv", 0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFWADDWV, MASK_VFWADDWV, match_vd_neq_vs1_neq_vm, 0},
+{"vfwadd.wf", 0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFWADDWF, MASK_VFWADDWF, match_vd_neq_vm, 0},
+{"vfwsub.wv", 0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFWSUBWV, MASK_VFWSUBWV, match_vd_neq_vs1_neq_vm, 0},
+{"vfwsub.wf", 0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFWSUBWF, MASK_VFWSUBWF, match_vd_neq_vm, 0},
{"vfmul.vv", 0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFMULVV, MASK_VFMULVV, match_vd_neq_vm, 0},
{"vfmul.vf", 0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFMULVF, MASK_VFMULVF, match_vd_neq_vm, 0},
@@ -2047,8 +1943,8 @@ const struct riscv_opcode riscv_draft_opcodes[] =
{"vfdiv.vf", 0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFDIVVF, MASK_VFDIVVF, match_vd_neq_vm, 0},
{"vfrdiv.vf", 0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFRDIVVF, MASK_VFRDIVVF, match_vd_neq_vm, 0},
-{"vfwmul.vv", 0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFWMULVV, MASK_VFWMULVV, match_widen_vd_neq_vs1_neq_vs2_neq_vm, 0},
-{"vfwmul.vf", 0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFWMULVF, MASK_VFWMULVF, match_widen_vd_neq_vs2_neq_vm, 0},
+{"vfwmul.vv", 0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFWMULVV, MASK_VFWMULVV, match_vd_neq_vs1_neq_vs2_neq_vm, 0},
+{"vfwmul.vf", 0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFWMULVF, MASK_VFWMULVF, match_vd_neq_vs2_neq_vm, 0},
{"vfmadd.vv", 0, INSN_CLASS_V_AND_F, "Vd,Vs,VtVm", MATCH_VFMADDVV, MASK_VFMADDVV, match_vd_neq_vm, 0},
{"vfmadd.vf", 0, INSN_CLASS_V_AND_F, "Vd,S,VtVm", MATCH_VFMADDVF, MASK_VFMADDVF, match_vd_neq_vm, 0},
@@ -2067,14 +1963,14 @@ const struct riscv_opcode riscv_draft_opcodes[] =
{"vfnmsac.vv", 0, INSN_CLASS_V_AND_F, "Vd,Vs,VtVm", MATCH_VFNMSACVV, MASK_VFNMSACVV, match_vd_neq_vm, 0},
{"vfnmsac.vf", 0, INSN_CLASS_V_AND_F, "Vd,S,VtVm", MATCH_VFNMSACVF, MASK_VFNMSACVF, match_vd_neq_vm, 0},
-{"vfwmacc.vv", 0, INSN_CLASS_V_AND_F, "Vd,Vs,VtVm", MATCH_VFWMACCVV, MASK_VFWMACCVV, match_widen_vd_neq_vs1_neq_vs2_neq_vm, 0},
-{"vfwmacc.vf", 0, INSN_CLASS_V_AND_F, "Vd,S,VtVm", MATCH_VFWMACCVF, MASK_VFWMACCVF, match_widen_vd_neq_vs2_neq_vm, 0},
-{"vfwnmacc.vv", 0, INSN_CLASS_V_AND_F, "Vd,Vs,VtVm", MATCH_VFWNMACCVV, MASK_VFWNMACCVV, match_widen_vd_neq_vs1_neq_vs2_neq_vm, 0},
-{"vfwnmacc.vf", 0, INSN_CLASS_V_AND_F, "Vd,S,VtVm", MATCH_VFWNMACCVF, MASK_VFWNMACCVF, match_widen_vd_neq_vs2_neq_vm, 0},
-{"vfwmsac.vv", 0, INSN_CLASS_V_AND_F, "Vd,Vs,VtVm", MATCH_VFWMSACVV, MASK_VFWMSACVV, match_widen_vd_neq_vs1_neq_vs2_neq_vm, 0},
-{"vfwmsac.vf", 0, INSN_CLASS_V_AND_F, "Vd,S,VtVm", MATCH_VFWMSACVF, MASK_VFWMSACVF, match_widen_vd_neq_vs2_neq_vm, 0},
-{"vfwnmsac.vv", 0, INSN_CLASS_V_AND_F, "Vd,Vs,VtVm", MATCH_VFWNMSACVV, MASK_VFWNMSACVV, match_widen_vd_neq_vs1_neq_vs2_neq_vm, 0},
-{"vfwnmsac.vf", 0, INSN_CLASS_V_AND_F, "Vd,S,VtVm", MATCH_VFWNMSACVF, MASK_VFWNMSACVF, match_widen_vd_neq_vs2_neq_vm, 0},
+{"vfwmacc.vv", 0, INSN_CLASS_V_AND_F, "Vd,Vs,VtVm", MATCH_VFWMACCVV, MASK_VFWMACCVV, match_vd_neq_vs1_neq_vs2_neq_vm, 0},
+{"vfwmacc.vf", 0, INSN_CLASS_V_AND_F, "Vd,S,VtVm", MATCH_VFWMACCVF, MASK_VFWMACCVF, match_vd_neq_vs2_neq_vm, 0},
+{"vfwnmacc.vv", 0, INSN_CLASS_V_AND_F, "Vd,Vs,VtVm", MATCH_VFWNMACCVV, MASK_VFWNMACCVV, match_vd_neq_vs1_neq_vs2_neq_vm, 0},
+{"vfwnmacc.vf", 0, INSN_CLASS_V_AND_F, "Vd,S,VtVm", MATCH_VFWNMACCVF, MASK_VFWNMACCVF, match_vd_neq_vs2_neq_vm, 0},
+{"vfwmsac.vv", 0, INSN_CLASS_V_AND_F, "Vd,Vs,VtVm", MATCH_VFWMSACVV, MASK_VFWMSACVV, match_vd_neq_vs1_neq_vs2_neq_vm, 0},
+{"vfwmsac.vf", 0, INSN_CLASS_V_AND_F, "Vd,S,VtVm", MATCH_VFWMSACVF, MASK_VFWMSACVF, match_vd_neq_vs2_neq_vm, 0},
+{"vfwnmsac.vv", 0, INSN_CLASS_V_AND_F, "Vd,Vs,VtVm", MATCH_VFWNMSACVV, MASK_VFWNMSACVV, match_vd_neq_vs1_neq_vs2_neq_vm, 0},
+{"vfwnmsac.vf", 0, INSN_CLASS_V_AND_F, "Vd,S,VtVm", MATCH_VFWNMSACVF, MASK_VFWNMSACVF, match_vd_neq_vs2_neq_vm, 0},
{"vfsqrt.v", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFSQRTV, MASK_VFSQRTV, match_vd_neq_vm, 0},
{"vfrsqrt7.v", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFRSQRT7V, MASK_VFRSQRT7V, match_vd_neq_vm, 0},
@@ -2123,22 +2019,22 @@ const struct riscv_opcode riscv_draft_opcodes[] =
{"vfcvt.f.xu.v", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFCVTFXUV, MASK_VFCVTFXUV, match_vd_neq_vm, 0},
{"vfcvt.f.x.v", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFCVTFXV, MASK_VFCVTFXV, match_vd_neq_vm, 0},
-{"vfwcvt.xu.f.v", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFWCVTXUFV, MASK_VFWCVTXUFV, match_widen_vd_neq_vs2_neq_vm, 0},
-{"vfwcvt.x.f.v", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFWCVTXFV, MASK_VFWCVTXFV, match_widen_vd_neq_vs2_neq_vm, 0},
-{"vfwcvt.rtz.xu.f.v", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFWCVTRTZXUFV, MASK_VFWCVTRTZXUFV, match_widen_vd_neq_vs2_neq_vm, 0},
-{"vfwcvt.rtz.x.f.v", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFWCVTRTZXFV, MASK_VFWCVTRTZXFV, match_widen_vd_neq_vs2_neq_vm, 0},
-{"vfwcvt.f.xu.v", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFWCVTFXUV, MASK_VFWCVTFXUV, match_widen_vd_neq_vs2_neq_vm, 0},
-{"vfwcvt.f.x.v", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFWCVTFXV, MASK_VFWCVTFXV, match_widen_vd_neq_vs2_neq_vm, 0},
-{"vfwcvt.f.f.v", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFWCVTFFV, MASK_VFWCVTFFV, match_widen_vd_neq_vs2_neq_vm, 0},
-
-{"vfncvt.xu.f.w", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFNCVTXUFW, MASK_VFNCVTXUFW, match_narrow_vd_neq_vs2_neq_vm, 0},
-{"vfncvt.x.f.w", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFNCVTXFW, MASK_VFNCVTXFW, match_narrow_vd_neq_vs2_neq_vm, 0},
-{"vfncvt.rtz.xu.f.w", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFNCVTRTZXUFW, MASK_VFNCVTRTZXUFW, match_narrow_vd_neq_vs2_neq_vm, 0},
-{"vfncvt.rtz.x.f.w", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFNCVTRTZXFW, MASK_VFNCVTRTZXFW, match_narrow_vd_neq_vs2_neq_vm, 0},
-{"vfncvt.f.xu.w", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFNCVTFXUW, MASK_VFNCVTFXUW, match_narrow_vd_neq_vs2_neq_vm, 0},
-{"vfncvt.f.x.w", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFNCVTFXW, MASK_VFNCVTFXW, match_narrow_vd_neq_vs2_neq_vm, 0},
-{"vfncvt.f.f.w", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFNCVTFFW, MASK_VFNCVTFFW, match_narrow_vd_neq_vs2_neq_vm, 0},
-{"vfncvt.rod.f.f.w", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFNCVTRODFFW, MASK_VFNCVTRODFFW, match_narrow_vd_neq_vs2_neq_vm, 0},
+{"vfwcvt.xu.f.v", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFWCVTXUFV, MASK_VFWCVTXUFV, match_vd_neq_vs2_neq_vm, 0},
+{"vfwcvt.x.f.v", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFWCVTXFV, MASK_VFWCVTXFV, match_vd_neq_vs2_neq_vm, 0},
+{"vfwcvt.rtz.xu.f.v", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFWCVTRTZXUFV, MASK_VFWCVTRTZXUFV, match_vd_neq_vs2_neq_vm, 0},
+{"vfwcvt.rtz.x.f.v", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFWCVTRTZXFV, MASK_VFWCVTRTZXFV, match_vd_neq_vs2_neq_vm, 0},
+{"vfwcvt.f.xu.v", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFWCVTFXUV, MASK_VFWCVTFXUV, match_vd_neq_vs2_neq_vm, 0},
+{"vfwcvt.f.x.v", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFWCVTFXV, MASK_VFWCVTFXV, match_vd_neq_vs2_neq_vm, 0},
+{"vfwcvt.f.f.v", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFWCVTFFV, MASK_VFWCVTFFV, match_vd_neq_vs2_neq_vm, 0},
+
+{"vfncvt.xu.f.w", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFNCVTXUFW, MASK_VFNCVTXUFW, match_opcode, 0},
+{"vfncvt.x.f.w", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFNCVTXFW, MASK_VFNCVTXFW, match_opcode, 0},
+{"vfncvt.rtz.xu.f.w", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFNCVTRTZXUFW, MASK_VFNCVTRTZXUFW, match_opcode, 0},
+{"vfncvt.rtz.x.f.w", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFNCVTRTZXFW, MASK_VFNCVTRTZXFW, match_opcode, 0},
+{"vfncvt.f.xu.w", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFNCVTFXUW, MASK_VFNCVTFXUW, match_opcode, 0},
+{"vfncvt.f.x.w", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFNCVTFXW, MASK_VFNCVTFXW, match_opcode, 0},
+{"vfncvt.f.f.w", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFNCVTFFW, MASK_VFNCVTFFW, match_opcode, 0},
+{"vfncvt.rod.f.f.w", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFNCVTRODFFW, MASK_VFNCVTRODFFW, match_opcode, 0},
{"vredsum.vs", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VREDSUMVS, MASK_VREDSUMVS, match_opcode, 0},
{"vredmaxu.vs",0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VREDMAXUVS, MASK_VREDMAXUVS, match_opcode, 0},
--
2.30.2
* [integration v2 3/4] RISC-V/rvv: Separate zvamo from v, and removed the zvlsseg extension name.
2021-10-05 12:51 [integration v2 0/4] RISC-V/rvv: Update rvv from v01.0 to v1.0 Nelson Chu
2021-10-05 12:51 ` [integration v2 1/4] RISC-V/rvv: Added assembly pseudo and changed assembler mnemonics Nelson Chu
2021-10-05 12:51 ` [integration v2 2/4] RISC-V/rvv: Update constraints for widening and narrowing instructions Nelson Chu
@ 2021-10-05 12:51 ` Nelson Chu
2021-10-05 12:51 ` [integration v2 4/4] RISC-V/rvv: Added zve* and zvl* extensions, and clarify the imply rules Nelson Chu
2021-10-25 6:04 ` [integration v2 0/4] RISC-V/rvv: Update rvv from v01.0 to v1.0 Nelson Chu
4 siblings, 0 replies; 6+ messages in thread
From: Nelson Chu @ 2021-10-05 12:51 UTC (permalink / raw)
To: binutils, jimw, andrew
* Separated zvamo from the v extension for v1.0, but kept the implementations.
* Removed the zvlsseg extension name, as the vector segment loads and stores
are required (included) in the v extension.
* Updated the versions of v and zvamo from draft v0.10 to frozen v1.0.
bfd/
* elfxx-riscv.c (riscv_supported_std_z_ext): Removed entry of zvlsseg.
gas/
* config/tc-riscv.c (riscv_extended_subset_supports): Changed
INSN_CLASS_V_OR_ZVAMO to INSN_CLASS_ZVAMO, and removed
INSN_CLASS_V_OR_ZVLSSEG.
(riscv_extended_csr_class_check): Updated since the name zvlsseg
is removed.
* testsuite/gas/riscv/extended/vector-insns-fail-zvamo.d: Changed
-march from rv32iav to rv32ia_zvamo.
* testsuite/gas/riscv/extended/vector-insns.d: Changed -march from
rv32iafv to rv32iafv_zvamo.
include/
* opcode/riscv.h (riscv_extended_insn_class): Changed
INSN_CLASS_V_OR_ZVAMO to INSN_CLASS_ZVAMO, and removed
INSN_CLASS_V_OR_ZVLSSEG.
opcodes/
* riscv-opc.c (riscv_draft_opcodes): Changed INSN_CLASS_V_OR_ZVAMO
to INSN_CLASS_ZVAMO since they are separated from v. Also changed
INSN_CLASS_V_OR_ZVLSSEG to INSN_CLASS_V as they are included in v.
---
bfd/elfxx-riscv.c | 5 +-
gas/config/tc-riscv.c | 12 +-
.../riscv/extended/vector-insns-fail-zvamo.d | 2 +-
.../gas/riscv/extended/vector-insns.d | 2 +-
include/opcode/riscv.h | 3 +-
opcodes/riscv-opc.c | 620 +++++++++---------
6 files changed, 318 insertions(+), 326 deletions(-)
diff --git a/bfd/elfxx-riscv.c b/bfd/elfxx-riscv.c
index 2bec654e3a7..62b92a05f25 100644
--- a/bfd/elfxx-riscv.c
+++ b/bfd/elfxx-riscv.c
@@ -1132,7 +1132,7 @@ static struct riscv_supported_ext riscv_supported_std_ext[] =
{"j", ISA_SPEC_CLASS_NONE, RISCV_UNKNOWN_VERSION, RISCV_UNKNOWN_VERSION, 0 },
{"t", ISA_SPEC_CLASS_NONE, RISCV_UNKNOWN_VERSION, RISCV_UNKNOWN_VERSION, 0 },
{"p", ISA_SPEC_CLASS_NONE, RISCV_UNKNOWN_VERSION, RISCV_UNKNOWN_VERSION, 0 },
- {"v", ISA_SPEC_CLASS_DRAFT, 0, 10, 0 }, /* draft. */
+ {"v", ISA_SPEC_CLASS_DRAFT, 1, 0, 0 },
{"n", ISA_SPEC_CLASS_NONE, RISCV_UNKNOWN_VERSION, RISCV_UNKNOWN_VERSION, 0 },
{NULL, 0, 0, 0, 0}
};
@@ -1147,8 +1147,7 @@ static struct riscv_supported_ext riscv_supported_std_z_ext[] =
{"zbb", ISA_SPEC_CLASS_DRAFT, 0, 93, 0 },
{"zba", ISA_SPEC_CLASS_DRAFT, 0, 93, 0 },
{"zbc", ISA_SPEC_CLASS_DRAFT, 0, 93, 0 },
- {"zvamo", ISA_SPEC_CLASS_DRAFT, 0, 10, 0 }, /* draft. */
- {"zvlsseg", ISA_SPEC_CLASS_DRAFT, 0, 10, 0 }, /* draft. */
+ {"zvamo", ISA_SPEC_CLASS_DRAFT, 1, 0, 0 },
{"zfh", ISA_SPEC_CLASS_DRAFT, 0, 1, 0 }, /* draft. */
{NULL, 0, 0, 0, 0}
};
diff --git a/gas/config/tc-riscv.c b/gas/config/tc-riscv.c
index d4cf99e002b..acdb303f4b9 100644
--- a/gas/config/tc-riscv.c
+++ b/gas/config/tc-riscv.c
@@ -277,13 +277,8 @@ riscv_extended_subset_supports (int insn_class)
case INSN_CLASS_V: return riscv_subset_supports ("v");
case INSN_CLASS_V_AND_F:
return riscv_subset_supports ("v") && riscv_subset_supports ("f");
- case INSN_CLASS_V_OR_ZVAMO:
- return (riscv_subset_supports ("a")
- && (riscv_subset_supports ("v")
- || riscv_subset_supports ("zvamo")));
- case INSN_CLASS_V_OR_ZVLSSEG:
- return (riscv_subset_supports ("v")
- || riscv_subset_supports ("zvlsseg"));
+ case INSN_CLASS_ZVAMO:
+ return riscv_subset_supports ("a") && riscv_subset_supports ("zvamo");
case INSN_CLASS_ZFH:
return riscv_subset_supports ("zfh");
@@ -988,8 +983,7 @@ riscv_extended_csr_class_check (int csr_class)
{
case CSR_CLASS_V:
return (riscv_subset_supports ("v")
- || riscv_subset_supports ("zvamo")
- || riscv_subset_supports ("zvlsseg"));
+ || riscv_subset_supports ("zvamo"));
case CSR_CLASS_VENDOR_THEAD:
return true;
default:
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-zvamo.d b/gas/testsuite/gas/riscv/extended/vector-insns-fail-zvamo.d
index 5749449bd03..8a6de14600c 100644
--- a/gas/testsuite/gas/riscv/extended/vector-insns-fail-zvamo.d
+++ b/gas/testsuite/gas/riscv/extended/vector-insns-fail-zvamo.d
@@ -1,3 +1,3 @@
-#as: -march=rv32iav -mcheck-constraints
+#as: -march=rv32ia_zvamo -mcheck-constraints
#source: vector-insns-fail-zvamo.s
#error_output: vector-insns-fail-zvamo.l
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns.d b/gas/testsuite/gas/riscv/extended/vector-insns.d
index 1b665f6cc0d..4e0ffff368c 100644
--- a/gas/testsuite/gas/riscv/extended/vector-insns.d
+++ b/gas/testsuite/gas/riscv/extended/vector-insns.d
@@ -1,4 +1,4 @@
-#as: -march=rv32iafv
+#as: -march=rv32iafv_zvamo
#objdump: -dr
.*:[ ]+file format .*
diff --git a/include/opcode/riscv.h b/include/opcode/riscv.h
index 1603fcfc495..329bbc95ac9 100644
--- a/include/opcode/riscv.h
+++ b/include/opcode/riscv.h
@@ -514,8 +514,7 @@ enum riscv_extended_insn_class
/* Draft */
INSN_CLASS_V = INSN_CLASS_EXTENDED,
INSN_CLASS_V_AND_F,
- INSN_CLASS_V_OR_ZVAMO,
- INSN_CLASS_V_OR_ZVLSSEG,
+ INSN_CLASS_ZVAMO,
INSN_CLASS_ZFH,
INSN_CLASS_D_AND_ZFH,
INSN_CLASS_Q_AND_ZFH,
diff --git a/opcodes/riscv-opc.c b/opcodes/riscv-opc.c
index 9d733aff9d3..830b1bbf128 100644
--- a/opcodes/riscv-opc.c
+++ b/opcodes/riscv-opc.c
@@ -1363,277 +1363,277 @@ const struct riscv_opcode riscv_draft_opcodes[] =
{"vle32ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLE32FFV, MASK_VLE32FFV, match_vd_neq_vm, INSN_DREF },
{"vle64ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLE64FFV, MASK_VLE64FFV, match_vd_neq_vm, INSN_DREF },
-{"vlseg2e8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG2E8V, MASK_VLSEG2E8V, match_vd_neq_vm, INSN_DREF },
-{"vsseg2e8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VSSEG2E8V, MASK_VSSEG2E8V, match_vd_neq_vm, INSN_DREF },
-{"vlseg3e8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG3E8V, MASK_VLSEG3E8V, match_vd_neq_vm, INSN_DREF },
-{"vsseg3e8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VSSEG3E8V, MASK_VSSEG3E8V, match_vd_neq_vm, INSN_DREF },
-{"vlseg4e8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG4E8V, MASK_VLSEG4E8V, match_vd_neq_vm, INSN_DREF },
-{"vsseg4e8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VSSEG4E8V, MASK_VSSEG4E8V, match_vd_neq_vm, INSN_DREF },
-{"vlseg5e8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG5E8V, MASK_VLSEG5E8V, match_vd_neq_vm, INSN_DREF },
-{"vsseg5e8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VSSEG5E8V, MASK_VSSEG5E8V, match_vd_neq_vm, INSN_DREF },
-{"vlseg6e8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG6E8V, MASK_VLSEG6E8V, match_vd_neq_vm, INSN_DREF },
-{"vsseg6e8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VSSEG6E8V, MASK_VSSEG6E8V, match_vd_neq_vm, INSN_DREF },
-{"vlseg7e8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG7E8V, MASK_VLSEG7E8V, match_vd_neq_vm, INSN_DREF },
-{"vsseg7e8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VSSEG7E8V, MASK_VSSEG7E8V, match_vd_neq_vm, INSN_DREF },
-{"vlseg8e8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG8E8V, MASK_VLSEG8E8V, match_vd_neq_vm, INSN_DREF },
-{"vsseg8e8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VSSEG8E8V, MASK_VSSEG8E8V, match_vd_neq_vm, INSN_DREF },
-
-{"vlseg2e16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG2E16V, MASK_VLSEG2E16V, match_vd_neq_vm, INSN_DREF },
-{"vsseg2e16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VSSEG2E16V, MASK_VSSEG2E16V, match_vd_neq_vm, INSN_DREF },
-{"vlseg3e16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG3E16V, MASK_VLSEG3E16V, match_vd_neq_vm, INSN_DREF },
-{"vsseg3e16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VSSEG3E16V, MASK_VSSEG3E16V, match_vd_neq_vm, INSN_DREF },
-{"vlseg4e16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG4E16V, MASK_VLSEG4E16V, match_vd_neq_vm, INSN_DREF },
-{"vsseg4e16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VSSEG4E16V, MASK_VSSEG4E16V, match_vd_neq_vm, INSN_DREF },
-{"vlseg5e16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG5E16V, MASK_VLSEG5E16V, match_vd_neq_vm, INSN_DREF },
-{"vsseg5e16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VSSEG5E16V, MASK_VSSEG5E16V, match_vd_neq_vm, INSN_DREF },
-{"vlseg6e16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG6E16V, MASK_VLSEG6E16V, match_vd_neq_vm, INSN_DREF },
-{"vsseg6e16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VSSEG6E16V, MASK_VSSEG6E16V, match_vd_neq_vm, INSN_DREF },
-{"vlseg7e16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG7E16V, MASK_VLSEG7E16V, match_vd_neq_vm, INSN_DREF },
-{"vsseg7e16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VSSEG7E16V, MASK_VSSEG7E16V, match_vd_neq_vm, INSN_DREF },
-{"vlseg8e16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG8E16V, MASK_VLSEG8E16V, match_vd_neq_vm, INSN_DREF },
-{"vsseg8e16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VSSEG8E16V, MASK_VSSEG8E16V, match_vd_neq_vm, INSN_DREF },
-
-{"vlseg2e32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG2E32V, MASK_VLSEG2E32V, match_vd_neq_vm, INSN_DREF },
-{"vsseg2e32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VSSEG2E32V, MASK_VSSEG2E32V, match_vd_neq_vm, INSN_DREF },
-{"vlseg3e32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG3E32V, MASK_VLSEG3E32V, match_vd_neq_vm, INSN_DREF },
-{"vsseg3e32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VSSEG3E32V, MASK_VSSEG3E32V, match_vd_neq_vm, INSN_DREF },
-{"vlseg4e32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG4E32V, MASK_VLSEG4E32V, match_vd_neq_vm, INSN_DREF },
-{"vsseg4e32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VSSEG4E32V, MASK_VSSEG4E32V, match_vd_neq_vm, INSN_DREF },
-{"vlseg5e32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG5E32V, MASK_VLSEG5E32V, match_vd_neq_vm, INSN_DREF },
-{"vsseg5e32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VSSEG5E32V, MASK_VSSEG5E32V, match_vd_neq_vm, INSN_DREF },
-{"vlseg6e32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG6E32V, MASK_VLSEG6E32V, match_vd_neq_vm, INSN_DREF },
-{"vsseg6e32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VSSEG6E32V, MASK_VSSEG6E32V, match_vd_neq_vm, INSN_DREF },
-{"vlseg7e32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG7E32V, MASK_VLSEG7E32V, match_vd_neq_vm, INSN_DREF },
-{"vsseg7e32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VSSEG7E32V, MASK_VSSEG7E32V, match_vd_neq_vm, INSN_DREF },
-{"vlseg8e32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG8E32V, MASK_VLSEG8E32V, match_vd_neq_vm, INSN_DREF },
-{"vsseg8e32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VSSEG8E32V, MASK_VSSEG8E32V, match_vd_neq_vm, INSN_DREF },
-
-{"vlseg2e64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG2E64V, MASK_VLSEG2E64V, match_vd_neq_vm, INSN_DREF },
-{"vsseg2e64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VSSEG2E64V, MASK_VSSEG2E64V, match_vd_neq_vm, INSN_DREF },
-{"vlseg3e64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG3E64V, MASK_VLSEG3E64V, match_vd_neq_vm, INSN_DREF },
-{"vsseg3e64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VSSEG3E64V, MASK_VSSEG3E64V, match_vd_neq_vm, INSN_DREF },
-{"vlseg4e64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG4E64V, MASK_VLSEG4E64V, match_vd_neq_vm, INSN_DREF },
-{"vsseg4e64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VSSEG4E64V, MASK_VSSEG4E64V, match_vd_neq_vm, INSN_DREF },
-{"vlseg5e64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG5E64V, MASK_VLSEG5E64V, match_vd_neq_vm, INSN_DREF },
-{"vsseg5e64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VSSEG5E64V, MASK_VSSEG5E64V, match_vd_neq_vm, INSN_DREF },
-{"vlseg6e64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG6E64V, MASK_VLSEG6E64V, match_vd_neq_vm, INSN_DREF },
-{"vsseg6e64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VSSEG6E64V, MASK_VSSEG6E64V, match_vd_neq_vm, INSN_DREF },
-{"vlseg7e64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG7E64V, MASK_VLSEG7E64V, match_vd_neq_vm, INSN_DREF },
-{"vsseg7e64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VSSEG7E64V, MASK_VSSEG7E64V, match_vd_neq_vm, INSN_DREF },
-{"vlseg8e64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG8E64V, MASK_VLSEG8E64V, match_vd_neq_vm, INSN_DREF },
-{"vsseg8e64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VSSEG8E64V, MASK_VSSEG8E64V, match_vd_neq_vm, INSN_DREF },
-
-{"vlsseg2e8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VLSSEG2E8V, MASK_VLSSEG2E8V, match_vd_neq_vm, INSN_DREF },
-{"vssseg2e8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VSSSEG2E8V, MASK_VSSSEG2E8V, match_vd_neq_vm, INSN_DREF },
-{"vlsseg3e8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VLSSEG3E8V, MASK_VLSSEG3E8V, match_vd_neq_vm, INSN_DREF },
-{"vssseg3e8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VSSSEG3E8V, MASK_VSSSEG3E8V, match_vd_neq_vm, INSN_DREF },
-{"vlsseg4e8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VLSSEG4E8V, MASK_VLSSEG4E8V, match_vd_neq_vm, INSN_DREF },
-{"vssseg4e8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VSSSEG4E8V, MASK_VSSSEG4E8V, match_vd_neq_vm, INSN_DREF },
-{"vlsseg5e8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VLSSEG5E8V, MASK_VLSSEG5E8V, match_vd_neq_vm, INSN_DREF },
-{"vssseg5e8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VSSSEG5E8V, MASK_VSSSEG5E8V, match_vd_neq_vm, INSN_DREF },
-{"vlsseg6e8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VLSSEG6E8V, MASK_VLSSEG6E8V, match_vd_neq_vm, INSN_DREF },
-{"vssseg6e8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VSSSEG6E8V, MASK_VSSSEG6E8V, match_vd_neq_vm, INSN_DREF },
-{"vlsseg7e8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VLSSEG7E8V, MASK_VLSSEG7E8V, match_vd_neq_vm, INSN_DREF },
-{"vssseg7e8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VSSSEG7E8V, MASK_VSSSEG7E8V, match_vd_neq_vm, INSN_DREF },
-{"vlsseg8e8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VLSSEG8E8V, MASK_VLSSEG8E8V, match_vd_neq_vm, INSN_DREF },
-{"vssseg8e8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VSSSEG8E8V, MASK_VSSSEG8E8V, match_vd_neq_vm, INSN_DREF },
-
-{"vlsseg2e16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VLSSEG2E16V, MASK_VLSSEG2E16V, match_vd_neq_vm, INSN_DREF },
-{"vssseg2e16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VSSSEG2E16V, MASK_VSSSEG2E16V, match_vd_neq_vm, INSN_DREF },
-{"vlsseg3e16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VLSSEG3E16V, MASK_VLSSEG3E16V, match_vd_neq_vm, INSN_DREF },
-{"vssseg3e16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VSSSEG3E16V, MASK_VSSSEG3E16V, match_vd_neq_vm, INSN_DREF },
-{"vlsseg4e16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VLSSEG4E16V, MASK_VLSSEG4E16V, match_vd_neq_vm, INSN_DREF },
-{"vssseg4e16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VSSSEG4E16V, MASK_VSSSEG4E16V, match_vd_neq_vm, INSN_DREF },
-{"vlsseg5e16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VLSSEG5E16V, MASK_VLSSEG5E16V, match_vd_neq_vm, INSN_DREF },
-{"vssseg5e16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VSSSEG5E16V, MASK_VSSSEG5E16V, match_vd_neq_vm, INSN_DREF },
-{"vlsseg6e16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VLSSEG6E16V, MASK_VLSSEG6E16V, match_vd_neq_vm, INSN_DREF },
-{"vssseg6e16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VSSSEG6E16V, MASK_VSSSEG6E16V, match_vd_neq_vm, INSN_DREF },
-{"vlsseg7e16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VLSSEG7E16V, MASK_VLSSEG7E16V, match_vd_neq_vm, INSN_DREF },
-{"vssseg7e16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VSSSEG7E16V, MASK_VSSSEG7E16V, match_vd_neq_vm, INSN_DREF },
-{"vlsseg8e16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VLSSEG8E16V, MASK_VLSSEG8E16V, match_vd_neq_vm, INSN_DREF },
-{"vssseg8e16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VSSSEG8E16V, MASK_VSSSEG8E16V, match_vd_neq_vm, INSN_DREF },
-
-{"vlsseg2e32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VLSSEG2E32V, MASK_VLSSEG2E32V, match_vd_neq_vm, INSN_DREF },
-{"vssseg2e32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VSSSEG2E32V, MASK_VSSSEG2E32V, match_vd_neq_vm, INSN_DREF },
-{"vlsseg3e32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VLSSEG3E32V, MASK_VLSSEG3E32V, match_vd_neq_vm, INSN_DREF },
-{"vssseg3e32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VSSSEG3E32V, MASK_VSSSEG3E32V, match_vd_neq_vm, INSN_DREF },
-{"vlsseg4e32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VLSSEG4E32V, MASK_VLSSEG4E32V, match_vd_neq_vm, INSN_DREF },
-{"vssseg4e32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VSSSEG4E32V, MASK_VSSSEG4E32V, match_vd_neq_vm, INSN_DREF },
-{"vlsseg5e32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VLSSEG5E32V, MASK_VLSSEG5E32V, match_vd_neq_vm, INSN_DREF },
-{"vssseg5e32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VSSSEG5E32V, MASK_VSSSEG5E32V, match_vd_neq_vm, INSN_DREF },
-{"vlsseg6e32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VLSSEG6E32V, MASK_VLSSEG6E32V, match_vd_neq_vm, INSN_DREF },
-{"vssseg6e32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VSSSEG6E32V, MASK_VSSSEG6E32V, match_vd_neq_vm, INSN_DREF },
-{"vlsseg7e32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VLSSEG7E32V, MASK_VLSSEG7E32V, match_vd_neq_vm, INSN_DREF },
-{"vssseg7e32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VSSSEG7E32V, MASK_VSSSEG7E32V, match_vd_neq_vm, INSN_DREF },
-{"vlsseg8e32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VLSSEG8E32V, MASK_VLSSEG8E32V, match_vd_neq_vm, INSN_DREF },
-{"vssseg8e32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VSSSEG8E32V, MASK_VSSSEG8E32V, match_vd_neq_vm, INSN_DREF },
-
-{"vlsseg2e64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VLSSEG2E64V, MASK_VLSSEG2E64V, match_vd_neq_vm, INSN_DREF },
-{"vssseg2e64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VSSSEG2E64V, MASK_VSSSEG2E64V, match_vd_neq_vm, INSN_DREF },
-{"vlsseg3e64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VLSSEG3E64V, MASK_VLSSEG3E64V, match_vd_neq_vm, INSN_DREF },
-{"vssseg3e64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VSSSEG3E64V, MASK_VSSSEG3E64V, match_vd_neq_vm, INSN_DREF },
-{"vlsseg4e64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VLSSEG4E64V, MASK_VLSSEG4E64V, match_vd_neq_vm, INSN_DREF },
-{"vssseg4e64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VSSSEG4E64V, MASK_VSSSEG4E64V, match_vd_neq_vm, INSN_DREF },
-{"vlsseg5e64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VLSSEG5E64V, MASK_VLSSEG5E64V, match_vd_neq_vm, INSN_DREF },
-{"vssseg5e64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VSSSEG5E64V, MASK_VSSSEG5E64V, match_vd_neq_vm, INSN_DREF },
-{"vlsseg6e64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VLSSEG6E64V, MASK_VLSSEG6E64V, match_vd_neq_vm, INSN_DREF },
-{"vssseg6e64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VSSSEG6E64V, MASK_VSSSEG6E64V, match_vd_neq_vm, INSN_DREF },
-{"vlsseg7e64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VLSSEG7E64V, MASK_VLSSEG7E64V, match_vd_neq_vm, INSN_DREF },
-{"vssseg7e64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VSSSEG7E64V, MASK_VSSSEG7E64V, match_vd_neq_vm, INSN_DREF },
-{"vlsseg8e64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VLSSEG8E64V, MASK_VLSSEG8E64V, match_vd_neq_vm, INSN_DREF },
-{"vssseg8e64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),tVm", MATCH_VSSSEG8E64V, MASK_VSSSEG8E64V, match_vd_neq_vm, INSN_DREF },
-
-{"vloxseg2ei8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLOXSEG2EI8V, MASK_VLOXSEG2EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg2ei8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSOXSEG2EI8V, MASK_VSOXSEG2EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vloxseg3ei8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLOXSEG3EI8V, MASK_VLOXSEG3EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg3ei8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSOXSEG3EI8V, MASK_VSOXSEG3EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vloxseg4ei8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLOXSEG4EI8V, MASK_VLOXSEG4EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg4ei8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSOXSEG4EI8V, MASK_VSOXSEG4EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vloxseg5ei8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLOXSEG5EI8V, MASK_VLOXSEG5EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg5ei8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSOXSEG5EI8V, MASK_VSOXSEG5EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vloxseg6ei8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLOXSEG6EI8V, MASK_VLOXSEG6EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg6ei8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSOXSEG6EI8V, MASK_VSOXSEG6EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vloxseg7ei8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLOXSEG7EI8V, MASK_VLOXSEG7EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg7ei8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSOXSEG7EI8V, MASK_VSOXSEG7EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vloxseg8ei8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLOXSEG8EI8V, MASK_VLOXSEG8EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg8ei8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSOXSEG8EI8V, MASK_VSOXSEG8EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-
-{"vloxseg2ei16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLOXSEG2EI16V, MASK_VLOXSEG2EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg2ei16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSOXSEG2EI16V, MASK_VSOXSEG2EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vloxseg3ei16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLOXSEG3EI16V, MASK_VLOXSEG3EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg3ei16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSOXSEG3EI16V, MASK_VSOXSEG3EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vloxseg4ei16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLOXSEG4EI16V, MASK_VLOXSEG4EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg4ei16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSOXSEG4EI16V, MASK_VSOXSEG4EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vloxseg5ei16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLOXSEG5EI16V, MASK_VLOXSEG5EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg5ei16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSOXSEG5EI16V, MASK_VSOXSEG5EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vloxseg6ei16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLOXSEG6EI16V, MASK_VLOXSEG6EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg6ei16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSOXSEG6EI16V, MASK_VSOXSEG6EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vloxseg7ei16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLOXSEG7EI16V, MASK_VLOXSEG7EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg7ei16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSOXSEG7EI16V, MASK_VSOXSEG7EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vloxseg8ei16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLOXSEG8EI16V, MASK_VLOXSEG8EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg8ei16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSOXSEG8EI16V, MASK_VSOXSEG8EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-
-{"vloxseg2ei32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLOXSEG2EI32V, MASK_VLOXSEG2EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg2ei32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSOXSEG2EI32V, MASK_VSOXSEG2EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vloxseg3ei32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLOXSEG3EI32V, MASK_VLOXSEG3EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg3ei32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSOXSEG3EI32V, MASK_VSOXSEG3EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vloxseg4ei32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLOXSEG4EI32V, MASK_VLOXSEG4EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg4ei32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSOXSEG4EI32V, MASK_VSOXSEG4EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vloxseg5ei32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLOXSEG5EI32V, MASK_VLOXSEG5EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg5ei32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSOXSEG5EI32V, MASK_VSOXSEG5EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vloxseg6ei32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLOXSEG6EI32V, MASK_VLOXSEG6EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg6ei32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSOXSEG6EI32V, MASK_VSOXSEG6EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vloxseg7ei32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLOXSEG7EI32V, MASK_VLOXSEG7EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg7ei32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSOXSEG7EI32V, MASK_VSOXSEG7EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vloxseg8ei32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLOXSEG8EI32V, MASK_VLOXSEG8EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg8ei32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSOXSEG8EI32V, MASK_VSOXSEG8EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-
-{"vloxseg2ei64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLOXSEG2EI64V, MASK_VLOXSEG2EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg2ei64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSOXSEG2EI64V, MASK_VSOXSEG2EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vloxseg3ei64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLOXSEG3EI64V, MASK_VLOXSEG3EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg3ei64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSOXSEG3EI64V, MASK_VSOXSEG3EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vloxseg4ei64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLOXSEG4EI64V, MASK_VLOXSEG4EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg4ei64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSOXSEG4EI64V, MASK_VSOXSEG4EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vloxseg5ei64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLOXSEG5EI64V, MASK_VLOXSEG5EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg5ei64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSOXSEG5EI64V, MASK_VSOXSEG5EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vloxseg6ei64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLOXSEG6EI64V, MASK_VLOXSEG6EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg6ei64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSOXSEG6EI64V, MASK_VSOXSEG6EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vloxseg7ei64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLOXSEG7EI64V, MASK_VLOXSEG7EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg7ei64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSOXSEG7EI64V, MASK_VSOXSEG7EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vloxseg8ei64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLOXSEG8EI64V, MASK_VLOXSEG8EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg8ei64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSOXSEG8EI64V, MASK_VSOXSEG8EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-
-{"vluxseg2ei8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLUXSEG2EI8V, MASK_VLUXSEG2EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg2ei8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSUXSEG2EI8V, MASK_VSUXSEG2EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vluxseg3ei8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLUXSEG3EI8V, MASK_VLUXSEG3EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg3ei8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSUXSEG3EI8V, MASK_VSUXSEG3EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vluxseg4ei8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLUXSEG4EI8V, MASK_VLUXSEG4EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg4ei8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSUXSEG4EI8V, MASK_VSUXSEG4EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vluxseg5ei8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLUXSEG5EI8V, MASK_VLUXSEG5EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg5ei8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSUXSEG5EI8V, MASK_VSUXSEG5EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vluxseg6ei8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLUXSEG6EI8V, MASK_VLUXSEG6EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg6ei8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSUXSEG6EI8V, MASK_VSUXSEG6EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vluxseg7ei8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLUXSEG7EI8V, MASK_VLUXSEG7EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg7ei8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSUXSEG7EI8V, MASK_VSUXSEG7EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vluxseg8ei8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLUXSEG8EI8V, MASK_VLUXSEG8EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg8ei8.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSUXSEG8EI8V, MASK_VSUXSEG8EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-
-{"vluxseg2ei16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLUXSEG2EI16V, MASK_VLUXSEG2EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg2ei16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSUXSEG2EI16V, MASK_VSUXSEG2EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vluxseg3ei16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLUXSEG3EI16V, MASK_VLUXSEG3EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg3ei16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSUXSEG3EI16V, MASK_VSUXSEG3EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vluxseg4ei16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLUXSEG4EI16V, MASK_VLUXSEG4EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg4ei16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSUXSEG4EI16V, MASK_VSUXSEG4EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vluxseg5ei16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLUXSEG5EI16V, MASK_VLUXSEG5EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg5ei16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSUXSEG5EI16V, MASK_VSUXSEG5EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vluxseg6ei16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLUXSEG6EI16V, MASK_VLUXSEG6EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg6ei16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSUXSEG6EI16V, MASK_VSUXSEG6EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vluxseg7ei16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLUXSEG7EI16V, MASK_VLUXSEG7EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg7ei16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSUXSEG7EI16V, MASK_VSUXSEG7EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vluxseg8ei16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLUXSEG8EI16V, MASK_VLUXSEG8EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg8ei16.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSUXSEG8EI16V, MASK_VSUXSEG8EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-
-{"vluxseg2ei32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLUXSEG2EI32V, MASK_VLUXSEG2EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg2ei32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSUXSEG2EI32V, MASK_VSUXSEG2EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vluxseg3ei32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLUXSEG3EI32V, MASK_VLUXSEG3EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg3ei32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSUXSEG3EI32V, MASK_VSUXSEG3EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vluxseg4ei32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLUXSEG4EI32V, MASK_VLUXSEG4EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg4ei32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSUXSEG4EI32V, MASK_VSUXSEG4EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vluxseg5ei32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLUXSEG5EI32V, MASK_VLUXSEG5EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg5ei32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSUXSEG5EI32V, MASK_VSUXSEG5EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vluxseg6ei32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLUXSEG6EI32V, MASK_VLUXSEG6EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg6ei32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSUXSEG6EI32V, MASK_VSUXSEG6EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vluxseg7ei32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLUXSEG7EI32V, MASK_VLUXSEG7EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg7ei32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSUXSEG7EI32V, MASK_VSUXSEG7EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vluxseg8ei32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLUXSEG8EI32V, MASK_VLUXSEG8EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg8ei32.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSUXSEG8EI32V, MASK_VSUXSEG8EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-
-{"vluxseg2ei64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLUXSEG2EI64V, MASK_VLUXSEG2EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg2ei64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSUXSEG2EI64V, MASK_VSUXSEG2EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vluxseg3ei64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLUXSEG3EI64V, MASK_VLUXSEG3EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg3ei64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSUXSEG3EI64V, MASK_VSUXSEG3EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vluxseg4ei64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLUXSEG4EI64V, MASK_VLUXSEG4EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg4ei64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSUXSEG4EI64V, MASK_VSUXSEG4EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vluxseg5ei64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLUXSEG5EI64V, MASK_VLUXSEG5EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg5ei64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSUXSEG5EI64V, MASK_VSUXSEG5EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vluxseg6ei64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLUXSEG6EI64V, MASK_VLUXSEG6EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg6ei64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSUXSEG6EI64V, MASK_VSUXSEG6EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vluxseg7ei64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLUXSEG7EI64V, MASK_VLUXSEG7EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg7ei64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSUXSEG7EI64V, MASK_VSUXSEG7EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vluxseg8ei64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VLUXSEG8EI64V, MASK_VLUXSEG8EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg8ei64.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s),VtVm", MATCH_VSUXSEG8EI64V, MASK_VSUXSEG8EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-
-{"vlseg2e8ff.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG2E8FFV, MASK_VLSEG2E8FFV, match_vd_neq_vm, INSN_DREF },
-{"vlseg3e8ff.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG3E8FFV, MASK_VLSEG3E8FFV, match_vd_neq_vm, INSN_DREF },
-{"vlseg4e8ff.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG4E8FFV, MASK_VLSEG4E8FFV, match_vd_neq_vm, INSN_DREF },
-{"vlseg5e8ff.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG5E8FFV, MASK_VLSEG5E8FFV, match_vd_neq_vm, INSN_DREF },
-{"vlseg6e8ff.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG6E8FFV, MASK_VLSEG6E8FFV, match_vd_neq_vm, INSN_DREF },
-{"vlseg7e8ff.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG7E8FFV, MASK_VLSEG7E8FFV, match_vd_neq_vm, INSN_DREF },
-{"vlseg8e8ff.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG8E8FFV, MASK_VLSEG8E8FFV, match_vd_neq_vm, INSN_DREF },
-
-{"vlseg2e16ff.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG2E16FFV, MASK_VLSEG2E16FFV, match_vd_neq_vm, INSN_DREF },
-{"vlseg3e16ff.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG3E16FFV, MASK_VLSEG3E16FFV, match_vd_neq_vm, INSN_DREF },
-{"vlseg4e16ff.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG4E16FFV, MASK_VLSEG4E16FFV, match_vd_neq_vm, INSN_DREF },
-{"vlseg5e16ff.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG5E16FFV, MASK_VLSEG5E16FFV, match_vd_neq_vm, INSN_DREF },
-{"vlseg6e16ff.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG6E16FFV, MASK_VLSEG6E16FFV, match_vd_neq_vm, INSN_DREF },
-{"vlseg7e16ff.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG7E16FFV, MASK_VLSEG7E16FFV, match_vd_neq_vm, INSN_DREF },
-{"vlseg8e16ff.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG8E16FFV, MASK_VLSEG8E16FFV, match_vd_neq_vm, INSN_DREF },
-
-{"vlseg2e32ff.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG2E32FFV, MASK_VLSEG2E32FFV, match_vd_neq_vm, INSN_DREF },
-{"vlseg3e32ff.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG3E32FFV, MASK_VLSEG3E32FFV, match_vd_neq_vm, INSN_DREF },
-{"vlseg4e32ff.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG4E32FFV, MASK_VLSEG4E32FFV, match_vd_neq_vm, INSN_DREF },
-{"vlseg5e32ff.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG5E32FFV, MASK_VLSEG5E32FFV, match_vd_neq_vm, INSN_DREF },
-{"vlseg6e32ff.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG6E32FFV, MASK_VLSEG6E32FFV, match_vd_neq_vm, INSN_DREF },
-{"vlseg7e32ff.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG7E32FFV, MASK_VLSEG7E32FFV, match_vd_neq_vm, INSN_DREF },
-{"vlseg8e32ff.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG8E32FFV, MASK_VLSEG8E32FFV, match_vd_neq_vm, INSN_DREF },
-
-{"vlseg2e64ff.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG2E64FFV, MASK_VLSEG2E64FFV, match_vd_neq_vm, INSN_DREF },
-{"vlseg3e64ff.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG3E64FFV, MASK_VLSEG3E64FFV, match_vd_neq_vm, INSN_DREF },
-{"vlseg4e64ff.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG4E64FFV, MASK_VLSEG4E64FFV, match_vd_neq_vm, INSN_DREF },
-{"vlseg5e64ff.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG5E64FFV, MASK_VLSEG5E64FFV, match_vd_neq_vm, INSN_DREF },
-{"vlseg6e64ff.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG6E64FFV, MASK_VLSEG6E64FFV, match_vd_neq_vm, INSN_DREF },
-{"vlseg7e64ff.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG7E64FFV, MASK_VLSEG7E64FFV, match_vd_neq_vm, INSN_DREF },
-{"vlseg8e64ff.v", 0, INSN_CLASS_V_OR_ZVLSSEG, "Vd,0(s)Vm", MATCH_VLSEG8E64FFV, MASK_VLSEG8E64FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg2e8.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG2E8V, MASK_VLSEG2E8V, match_vd_neq_vm, INSN_DREF },
+{"vsseg2e8.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG2E8V, MASK_VSSEG2E8V, match_vd_neq_vm, INSN_DREF },
+{"vlseg3e8.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG3E8V, MASK_VLSEG3E8V, match_vd_neq_vm, INSN_DREF },
+{"vsseg3e8.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG3E8V, MASK_VSSEG3E8V, match_vd_neq_vm, INSN_DREF },
+{"vlseg4e8.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG4E8V, MASK_VLSEG4E8V, match_vd_neq_vm, INSN_DREF },
+{"vsseg4e8.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG4E8V, MASK_VSSEG4E8V, match_vd_neq_vm, INSN_DREF },
+{"vlseg5e8.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG5E8V, MASK_VLSEG5E8V, match_vd_neq_vm, INSN_DREF },
+{"vsseg5e8.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG5E8V, MASK_VSSEG5E8V, match_vd_neq_vm, INSN_DREF },
+{"vlseg6e8.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG6E8V, MASK_VLSEG6E8V, match_vd_neq_vm, INSN_DREF },
+{"vsseg6e8.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG6E8V, MASK_VSSEG6E8V, match_vd_neq_vm, INSN_DREF },
+{"vlseg7e8.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG7E8V, MASK_VLSEG7E8V, match_vd_neq_vm, INSN_DREF },
+{"vsseg7e8.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG7E8V, MASK_VSSEG7E8V, match_vd_neq_vm, INSN_DREF },
+{"vlseg8e8.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG8E8V, MASK_VLSEG8E8V, match_vd_neq_vm, INSN_DREF },
+{"vsseg8e8.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG8E8V, MASK_VSSEG8E8V, match_vd_neq_vm, INSN_DREF },
+
+{"vlseg2e16.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG2E16V, MASK_VLSEG2E16V, match_vd_neq_vm, INSN_DREF },
+{"vsseg2e16.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG2E16V, MASK_VSSEG2E16V, match_vd_neq_vm, INSN_DREF },
+{"vlseg3e16.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG3E16V, MASK_VLSEG3E16V, match_vd_neq_vm, INSN_DREF },
+{"vsseg3e16.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG3E16V, MASK_VSSEG3E16V, match_vd_neq_vm, INSN_DREF },
+{"vlseg4e16.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG4E16V, MASK_VLSEG4E16V, match_vd_neq_vm, INSN_DREF },
+{"vsseg4e16.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG4E16V, MASK_VSSEG4E16V, match_vd_neq_vm, INSN_DREF },
+{"vlseg5e16.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG5E16V, MASK_VLSEG5E16V, match_vd_neq_vm, INSN_DREF },
+{"vsseg5e16.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG5E16V, MASK_VSSEG5E16V, match_vd_neq_vm, INSN_DREF },
+{"vlseg6e16.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG6E16V, MASK_VLSEG6E16V, match_vd_neq_vm, INSN_DREF },
+{"vsseg6e16.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG6E16V, MASK_VSSEG6E16V, match_vd_neq_vm, INSN_DREF },
+{"vlseg7e16.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG7E16V, MASK_VLSEG7E16V, match_vd_neq_vm, INSN_DREF },
+{"vsseg7e16.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG7E16V, MASK_VSSEG7E16V, match_vd_neq_vm, INSN_DREF },
+{"vlseg8e16.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG8E16V, MASK_VLSEG8E16V, match_vd_neq_vm, INSN_DREF },
+{"vsseg8e16.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG8E16V, MASK_VSSEG8E16V, match_vd_neq_vm, INSN_DREF },
+
+{"vlseg2e32.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG2E32V, MASK_VLSEG2E32V, match_vd_neq_vm, INSN_DREF },
+{"vsseg2e32.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG2E32V, MASK_VSSEG2E32V, match_vd_neq_vm, INSN_DREF },
+{"vlseg3e32.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG3E32V, MASK_VLSEG3E32V, match_vd_neq_vm, INSN_DREF },
+{"vsseg3e32.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG3E32V, MASK_VSSEG3E32V, match_vd_neq_vm, INSN_DREF },
+{"vlseg4e32.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG4E32V, MASK_VLSEG4E32V, match_vd_neq_vm, INSN_DREF },
+{"vsseg4e32.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG4E32V, MASK_VSSEG4E32V, match_vd_neq_vm, INSN_DREF },
+{"vlseg5e32.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG5E32V, MASK_VLSEG5E32V, match_vd_neq_vm, INSN_DREF },
+{"vsseg5e32.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG5E32V, MASK_VSSEG5E32V, match_vd_neq_vm, INSN_DREF },
+{"vlseg6e32.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG6E32V, MASK_VLSEG6E32V, match_vd_neq_vm, INSN_DREF },
+{"vsseg6e32.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG6E32V, MASK_VSSEG6E32V, match_vd_neq_vm, INSN_DREF },
+{"vlseg7e32.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG7E32V, MASK_VLSEG7E32V, match_vd_neq_vm, INSN_DREF },
+{"vsseg7e32.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG7E32V, MASK_VSSEG7E32V, match_vd_neq_vm, INSN_DREF },
+{"vlseg8e32.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG8E32V, MASK_VLSEG8E32V, match_vd_neq_vm, INSN_DREF },
+{"vsseg8e32.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG8E32V, MASK_VSSEG8E32V, match_vd_neq_vm, INSN_DREF },
+
+{"vlseg2e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG2E64V, MASK_VLSEG2E64V, match_vd_neq_vm, INSN_DREF },
+{"vsseg2e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG2E64V, MASK_VSSEG2E64V, match_vd_neq_vm, INSN_DREF },
+{"vlseg3e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG3E64V, MASK_VLSEG3E64V, match_vd_neq_vm, INSN_DREF },
+{"vsseg3e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG3E64V, MASK_VSSEG3E64V, match_vd_neq_vm, INSN_DREF },
+{"vlseg4e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG4E64V, MASK_VLSEG4E64V, match_vd_neq_vm, INSN_DREF },
+{"vsseg4e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG4E64V, MASK_VSSEG4E64V, match_vd_neq_vm, INSN_DREF },
+{"vlseg5e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG5E64V, MASK_VLSEG5E64V, match_vd_neq_vm, INSN_DREF },
+{"vsseg5e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG5E64V, MASK_VSSEG5E64V, match_vd_neq_vm, INSN_DREF },
+{"vlseg6e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG6E64V, MASK_VLSEG6E64V, match_vd_neq_vm, INSN_DREF },
+{"vsseg6e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG6E64V, MASK_VSSEG6E64V, match_vd_neq_vm, INSN_DREF },
+{"vlseg7e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG7E64V, MASK_VLSEG7E64V, match_vd_neq_vm, INSN_DREF },
+{"vsseg7e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG7E64V, MASK_VSSEG7E64V, match_vd_neq_vm, INSN_DREF },
+{"vlseg8e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG8E64V, MASK_VLSEG8E64V, match_vd_neq_vm, INSN_DREF },
+{"vsseg8e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG8E64V, MASK_VSSEG8E64V, match_vd_neq_vm, INSN_DREF },
+
+{"vlsseg2e8.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG2E8V, MASK_VLSSEG2E8V, match_vd_neq_vm, INSN_DREF },
+{"vssseg2e8.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG2E8V, MASK_VSSSEG2E8V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg3e8.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG3E8V, MASK_VLSSEG3E8V, match_vd_neq_vm, INSN_DREF },
+{"vssseg3e8.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG3E8V, MASK_VSSSEG3E8V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg4e8.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG4E8V, MASK_VLSSEG4E8V, match_vd_neq_vm, INSN_DREF },
+{"vssseg4e8.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG4E8V, MASK_VSSSEG4E8V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg5e8.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG5E8V, MASK_VLSSEG5E8V, match_vd_neq_vm, INSN_DREF },
+{"vssseg5e8.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG5E8V, MASK_VSSSEG5E8V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg6e8.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG6E8V, MASK_VLSSEG6E8V, match_vd_neq_vm, INSN_DREF },
+{"vssseg6e8.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG6E8V, MASK_VSSSEG6E8V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg7e8.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG7E8V, MASK_VLSSEG7E8V, match_vd_neq_vm, INSN_DREF },
+{"vssseg7e8.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG7E8V, MASK_VSSSEG7E8V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg8e8.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG8E8V, MASK_VLSSEG8E8V, match_vd_neq_vm, INSN_DREF },
+{"vssseg8e8.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG8E8V, MASK_VSSSEG8E8V, match_vd_neq_vm, INSN_DREF },
+
+{"vlsseg2e16.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG2E16V, MASK_VLSSEG2E16V, match_vd_neq_vm, INSN_DREF },
+{"vssseg2e16.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG2E16V, MASK_VSSSEG2E16V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg3e16.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG3E16V, MASK_VLSSEG3E16V, match_vd_neq_vm, INSN_DREF },
+{"vssseg3e16.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG3E16V, MASK_VSSSEG3E16V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg4e16.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG4E16V, MASK_VLSSEG4E16V, match_vd_neq_vm, INSN_DREF },
+{"vssseg4e16.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG4E16V, MASK_VSSSEG4E16V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg5e16.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG5E16V, MASK_VLSSEG5E16V, match_vd_neq_vm, INSN_DREF },
+{"vssseg5e16.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG5E16V, MASK_VSSSEG5E16V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg6e16.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG6E16V, MASK_VLSSEG6E16V, match_vd_neq_vm, INSN_DREF },
+{"vssseg6e16.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG6E16V, MASK_VSSSEG6E16V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg7e16.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG7E16V, MASK_VLSSEG7E16V, match_vd_neq_vm, INSN_DREF },
+{"vssseg7e16.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG7E16V, MASK_VSSSEG7E16V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg8e16.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG8E16V, MASK_VLSSEG8E16V, match_vd_neq_vm, INSN_DREF },
+{"vssseg8e16.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG8E16V, MASK_VSSSEG8E16V, match_vd_neq_vm, INSN_DREF },
+
+{"vlsseg2e32.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG2E32V, MASK_VLSSEG2E32V, match_vd_neq_vm, INSN_DREF },
+{"vssseg2e32.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG2E32V, MASK_VSSSEG2E32V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg3e32.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG3E32V, MASK_VLSSEG3E32V, match_vd_neq_vm, INSN_DREF },
+{"vssseg3e32.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG3E32V, MASK_VSSSEG3E32V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg4e32.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG4E32V, MASK_VLSSEG4E32V, match_vd_neq_vm, INSN_DREF },
+{"vssseg4e32.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG4E32V, MASK_VSSSEG4E32V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg5e32.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG5E32V, MASK_VLSSEG5E32V, match_vd_neq_vm, INSN_DREF },
+{"vssseg5e32.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG5E32V, MASK_VSSSEG5E32V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg6e32.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG6E32V, MASK_VLSSEG6E32V, match_vd_neq_vm, INSN_DREF },
+{"vssseg6e32.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG6E32V, MASK_VSSSEG6E32V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg7e32.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG7E32V, MASK_VLSSEG7E32V, match_vd_neq_vm, INSN_DREF },
+{"vssseg7e32.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG7E32V, MASK_VSSSEG7E32V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg8e32.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG8E32V, MASK_VLSSEG8E32V, match_vd_neq_vm, INSN_DREF },
+{"vssseg8e32.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG8E32V, MASK_VSSSEG8E32V, match_vd_neq_vm, INSN_DREF },
+
+{"vlsseg2e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG2E64V, MASK_VLSSEG2E64V, match_vd_neq_vm, INSN_DREF },
+{"vssseg2e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG2E64V, MASK_VSSSEG2E64V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg3e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG3E64V, MASK_VLSSEG3E64V, match_vd_neq_vm, INSN_DREF },
+{"vssseg3e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG3E64V, MASK_VSSSEG3E64V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg4e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG4E64V, MASK_VLSSEG4E64V, match_vd_neq_vm, INSN_DREF },
+{"vssseg4e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG4E64V, MASK_VSSSEG4E64V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg5e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG5E64V, MASK_VLSSEG5E64V, match_vd_neq_vm, INSN_DREF },
+{"vssseg5e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG5E64V, MASK_VSSSEG5E64V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg6e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG6E64V, MASK_VLSSEG6E64V, match_vd_neq_vm, INSN_DREF },
+{"vssseg6e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG6E64V, MASK_VSSSEG6E64V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg7e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG7E64V, MASK_VLSSEG7E64V, match_vd_neq_vm, INSN_DREF },
+{"vssseg7e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG7E64V, MASK_VSSSEG7E64V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg8e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG8E64V, MASK_VLSSEG8E64V, match_vd_neq_vm, INSN_DREF },
+{"vssseg8e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG8E64V, MASK_VSSSEG8E64V, match_vd_neq_vm, INSN_DREF },
+
+{"vloxseg2ei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG2EI8V, MASK_VLOXSEG2EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg2ei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG2EI8V, MASK_VSOXSEG2EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg3ei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG3EI8V, MASK_VLOXSEG3EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg3ei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG3EI8V, MASK_VSOXSEG3EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg4ei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG4EI8V, MASK_VLOXSEG4EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg4ei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG4EI8V, MASK_VSOXSEG4EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg5ei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG5EI8V, MASK_VLOXSEG5EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg5ei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG5EI8V, MASK_VSOXSEG5EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg6ei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG6EI8V, MASK_VLOXSEG6EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg6ei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG6EI8V, MASK_VSOXSEG6EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg7ei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG7EI8V, MASK_VLOXSEG7EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg7ei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG7EI8V, MASK_VSOXSEG7EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg8ei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG8EI8V, MASK_VLOXSEG8EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg8ei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG8EI8V, MASK_VSOXSEG8EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+
+{"vloxseg2ei16.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG2EI16V, MASK_VLOXSEG2EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg2ei16.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG2EI16V, MASK_VSOXSEG2EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg3ei16.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG3EI16V, MASK_VLOXSEG3EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg3ei16.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG3EI16V, MASK_VSOXSEG3EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg4ei16.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG4EI16V, MASK_VLOXSEG4EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg4ei16.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG4EI16V, MASK_VSOXSEG4EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg5ei16.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG5EI16V, MASK_VLOXSEG5EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg5ei16.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG5EI16V, MASK_VSOXSEG5EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg6ei16.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG6EI16V, MASK_VLOXSEG6EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg6ei16.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG6EI16V, MASK_VSOXSEG6EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg7ei16.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG7EI16V, MASK_VLOXSEG7EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg7ei16.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG7EI16V, MASK_VSOXSEG7EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg8ei16.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG8EI16V, MASK_VLOXSEG8EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg8ei16.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG8EI16V, MASK_VSOXSEG8EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+
+{"vloxseg2ei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG2EI32V, MASK_VLOXSEG2EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg2ei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG2EI32V, MASK_VSOXSEG2EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg3ei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG3EI32V, MASK_VLOXSEG3EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg3ei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG3EI32V, MASK_VSOXSEG3EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg4ei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG4EI32V, MASK_VLOXSEG4EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg4ei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG4EI32V, MASK_VSOXSEG4EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg5ei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG5EI32V, MASK_VLOXSEG5EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg5ei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG5EI32V, MASK_VSOXSEG5EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg6ei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG6EI32V, MASK_VLOXSEG6EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg6ei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG6EI32V, MASK_VSOXSEG6EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg7ei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG7EI32V, MASK_VLOXSEG7EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg7ei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG7EI32V, MASK_VSOXSEG7EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg8ei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG8EI32V, MASK_VLOXSEG8EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg8ei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG8EI32V, MASK_VSOXSEG8EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+
+{"vloxseg2ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG2EI64V, MASK_VLOXSEG2EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg2ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG2EI64V, MASK_VSOXSEG2EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg3ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG3EI64V, MASK_VLOXSEG3EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg3ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG3EI64V, MASK_VSOXSEG3EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg4ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG4EI64V, MASK_VLOXSEG4EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg4ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG4EI64V, MASK_VSOXSEG4EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg5ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG5EI64V, MASK_VLOXSEG5EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg5ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG5EI64V, MASK_VSOXSEG5EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg6ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG6EI64V, MASK_VLOXSEG6EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg6ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG6EI64V, MASK_VSOXSEG6EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg7ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG7EI64V, MASK_VLOXSEG7EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg7ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG7EI64V, MASK_VSOXSEG7EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg8ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG8EI64V, MASK_VLOXSEG8EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg8ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG8EI64V, MASK_VSOXSEG8EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+
+{"vluxseg2ei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG2EI8V, MASK_VLUXSEG2EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg2ei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG2EI8V, MASK_VSUXSEG2EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg3ei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG3EI8V, MASK_VLUXSEG3EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg3ei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG3EI8V, MASK_VSUXSEG3EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg4ei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG4EI8V, MASK_VLUXSEG4EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg4ei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG4EI8V, MASK_VSUXSEG4EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg5ei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG5EI8V, MASK_VLUXSEG5EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg5ei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG5EI8V, MASK_VSUXSEG5EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg6ei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG6EI8V, MASK_VLUXSEG6EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg6ei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG6EI8V, MASK_VSUXSEG6EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg7ei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG7EI8V, MASK_VLUXSEG7EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg7ei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG7EI8V, MASK_VSUXSEG7EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg8ei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG8EI8V, MASK_VLUXSEG8EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg8ei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG8EI8V, MASK_VSUXSEG8EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+
+{"vluxseg2ei16.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG2EI16V, MASK_VLUXSEG2EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg2ei16.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG2EI16V, MASK_VSUXSEG2EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg3ei16.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG3EI16V, MASK_VLUXSEG3EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg3ei16.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG3EI16V, MASK_VSUXSEG3EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg4ei16.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG4EI16V, MASK_VLUXSEG4EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg4ei16.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG4EI16V, MASK_VSUXSEG4EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg5ei16.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG5EI16V, MASK_VLUXSEG5EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg5ei16.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG5EI16V, MASK_VSUXSEG5EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg6ei16.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG6EI16V, MASK_VLUXSEG6EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg6ei16.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG6EI16V, MASK_VSUXSEG6EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg7ei16.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG7EI16V, MASK_VLUXSEG7EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg7ei16.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG7EI16V, MASK_VSUXSEG7EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg8ei16.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG8EI16V, MASK_VLUXSEG8EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg8ei16.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG8EI16V, MASK_VSUXSEG8EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+
+{"vluxseg2ei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG2EI32V, MASK_VLUXSEG2EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg2ei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG2EI32V, MASK_VSUXSEG2EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg3ei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG3EI32V, MASK_VLUXSEG3EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg3ei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG3EI32V, MASK_VSUXSEG3EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg4ei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG4EI32V, MASK_VLUXSEG4EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg4ei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG4EI32V, MASK_VSUXSEG4EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg5ei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG5EI32V, MASK_VLUXSEG5EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg5ei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG5EI32V, MASK_VSUXSEG5EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg6ei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG6EI32V, MASK_VLUXSEG6EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg6ei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG6EI32V, MASK_VSUXSEG6EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg7ei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG7EI32V, MASK_VLUXSEG7EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg7ei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG7EI32V, MASK_VSUXSEG7EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg8ei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG8EI32V, MASK_VLUXSEG8EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg8ei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG8EI32V, MASK_VSUXSEG8EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+
+{"vluxseg2ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG2EI64V, MASK_VLUXSEG2EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg2ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG2EI64V, MASK_VSUXSEG2EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg3ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG3EI64V, MASK_VLUXSEG3EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg3ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG3EI64V, MASK_VSUXSEG3EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg4ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG4EI64V, MASK_VLUXSEG4EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg4ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG4EI64V, MASK_VSUXSEG4EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg5ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG5EI64V, MASK_VLUXSEG5EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg5ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG5EI64V, MASK_VSUXSEG5EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg6ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG6EI64V, MASK_VLUXSEG6EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg6ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG6EI64V, MASK_VSUXSEG6EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg7ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG7EI64V, MASK_VLUXSEG7EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg7ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG7EI64V, MASK_VSUXSEG7EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg8ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG8EI64V, MASK_VLUXSEG8EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg8ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG8EI64V, MASK_VSUXSEG8EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+
+{"vlseg2e8ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG2E8FFV, MASK_VLSEG2E8FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg3e8ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG3E8FFV, MASK_VLSEG3E8FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg4e8ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG4E8FFV, MASK_VLSEG4E8FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg5e8ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG5E8FFV, MASK_VLSEG5E8FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg6e8ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG6E8FFV, MASK_VLSEG6E8FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg7e8ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG7E8FFV, MASK_VLSEG7E8FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg8e8ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG8E8FFV, MASK_VLSEG8E8FFV, match_vd_neq_vm, INSN_DREF },
+
+{"vlseg2e16ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG2E16FFV, MASK_VLSEG2E16FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg3e16ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG3E16FFV, MASK_VLSEG3E16FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg4e16ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG4E16FFV, MASK_VLSEG4E16FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg5e16ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG5E16FFV, MASK_VLSEG5E16FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg6e16ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG6E16FFV, MASK_VLSEG6E16FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg7e16ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG7E16FFV, MASK_VLSEG7E16FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg8e16ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG8E16FFV, MASK_VLSEG8E16FFV, match_vd_neq_vm, INSN_DREF },
+
+{"vlseg2e32ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG2E32FFV, MASK_VLSEG2E32FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg3e32ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG3E32FFV, MASK_VLSEG3E32FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg4e32ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG4E32FFV, MASK_VLSEG4E32FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg5e32ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG5E32FFV, MASK_VLSEG5E32FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg6e32ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG6E32FFV, MASK_VLSEG6E32FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg7e32ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG7E32FFV, MASK_VLSEG7E32FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg8e32ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG8E32FFV, MASK_VLSEG8E32FFV, match_vd_neq_vm, INSN_DREF },
+
+{"vlseg2e64ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG2E64FFV, MASK_VLSEG2E64FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg3e64ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG3E64FFV, MASK_VLSEG3E64FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg4e64ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG4E64FFV, MASK_VLSEG4E64FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg5e64ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG5E64FFV, MASK_VLSEG5E64FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg6e64ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG6E64FFV, MASK_VLSEG6E64FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg7e64ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG7E64FFV, MASK_VLSEG7E64FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg8e64ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG8E64FFV, MASK_VLSEG8E64FFV, match_vd_neq_vm, INSN_DREF },
{"vl1r.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VL1RE8V, MASK_VL1RE8V, match_vls_nf_rv, INSN_DREF|INSN_ALIAS },
{"vl1re8.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VL1RE8V, MASK_VL1RE8V, match_vls_nf_rv, INSN_DREF },
@@ -1664,45 +1664,45 @@ const struct riscv_opcode riscv_draft_opcodes[] =
{"vs4r.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VS4RV, MASK_VS4RV, match_vls_nf_rv, INSN_DREF },
{"vs8r.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VS8RV, MASK_VS8RV, match_vls_nf_rv, INSN_DREF },
-{"vamoaddei8.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOADDEI8V, MASK_VAMOADDEI8V, match_vd_neq_vm, INSN_DREF},
-{"vamoswapei8.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOSWAPEI8V, MASK_VAMOSWAPEI8V, match_vd_neq_vm, INSN_DREF},
-{"vamoxorei8.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOXOREI8V, MASK_VAMOXOREI8V, match_vd_neq_vm, INSN_DREF},
-{"vamoandei8.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOANDEI8V, MASK_VAMOANDEI8V, match_vd_neq_vm, INSN_DREF},
-{"vamoorei8.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOOREI8V, MASK_VAMOOREI8V, match_vd_neq_vm, INSN_DREF},
-{"vamominei8.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOMINEI8V, MASK_VAMOMINEI8V, match_vd_neq_vm, INSN_DREF},
-{"vamomaxei8.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOMAXEI8V, MASK_VAMOMAXEI8V, match_vd_neq_vm, INSN_DREF},
-{"vamominuei8.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOMINUEI8V, MASK_VAMOMINUEI8V, match_vd_neq_vm, INSN_DREF},
-{"vamomaxuei8.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOMAXUEI8V, MASK_VAMOMAXUEI8V, match_vd_neq_vm, INSN_DREF},
-
-{"vamoaddei16.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOADDEI16V, MASK_VAMOADDEI16V, match_vd_neq_vm, INSN_DREF},
-{"vamoswapei16.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOSWAPEI16V, MASK_VAMOSWAPEI16V, match_vd_neq_vm, INSN_DREF},
-{"vamoxorei16.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOXOREI16V, MASK_VAMOXOREI16V, match_vd_neq_vm, INSN_DREF},
-{"vamoandei16.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOANDEI16V, MASK_VAMOANDEI16V, match_vd_neq_vm, INSN_DREF},
-{"vamoorei16.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOOREI16V, MASK_VAMOOREI16V, match_vd_neq_vm, INSN_DREF},
-{"vamominei16.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOMINEI16V, MASK_VAMOMINEI16V, match_vd_neq_vm, INSN_DREF},
-{"vamomaxei16.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOMAXEI16V, MASK_VAMOMAXEI16V, match_vd_neq_vm, INSN_DREF},
-{"vamominuei16.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOMINUEI16V, MASK_VAMOMINUEI16V, match_vd_neq_vm, INSN_DREF},
-{"vamomaxuei16.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOMAXUEI16V, MASK_VAMOMAXUEI16V, match_vd_neq_vm, INSN_DREF},
-
-{"vamoaddei32.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOADDEI32V, MASK_VAMOADDEI32V, match_vd_neq_vm, INSN_DREF},
-{"vamoswapei32.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOSWAPEI32V, MASK_VAMOSWAPEI32V, match_vd_neq_vm, INSN_DREF},
-{"vamoxorei32.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOXOREI32V, MASK_VAMOXOREI32V, match_vd_neq_vm, INSN_DREF},
-{"vamoandei32.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOANDEI32V, MASK_VAMOANDEI32V, match_vd_neq_vm, INSN_DREF},
-{"vamoorei32.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOOREI32V, MASK_VAMOOREI32V, match_vd_neq_vm, INSN_DREF},
-{"vamominei32.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOMINEI32V, MASK_VAMOMINEI32V, match_vd_neq_vm, INSN_DREF},
-{"vamomaxei32.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOMAXEI32V, MASK_VAMOMAXEI32V, match_vd_neq_vm, INSN_DREF},
-{"vamominuei32.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOMINUEI32V, MASK_VAMOMINUEI32V, match_vd_neq_vm, INSN_DREF},
-{"vamomaxuei32.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOMAXUEI32V, MASK_VAMOMAXUEI32V, match_vd_neq_vm, INSN_DREF},
-
-{"vamoaddei64.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOADDEI64V, MASK_VAMOADDEI64V, match_vd_neq_vm, INSN_DREF},
-{"vamoswapei64.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOSWAPEI64V, MASK_VAMOSWAPEI64V, match_vd_neq_vm, INSN_DREF},
-{"vamoxorei64.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOXOREI64V, MASK_VAMOXOREI64V, match_vd_neq_vm, INSN_DREF},
-{"vamoandei64.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOANDEI64V, MASK_VAMOANDEI64V, match_vd_neq_vm, INSN_DREF},
-{"vamoorei64.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOOREI64V, MASK_VAMOOREI64V, match_vd_neq_vm, INSN_DREF},
-{"vamominei64.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOMINEI64V, MASK_VAMOMINEI64V, match_vd_neq_vm, INSN_DREF},
-{"vamomaxei64.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOMAXEI64V, MASK_VAMOMAXEI64V, match_vd_neq_vm, INSN_DREF},
-{"vamominuei64.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOMINUEI64V, MASK_VAMOMINUEI64V, match_vd_neq_vm, INSN_DREF},
-{"vamomaxuei64.v", 0, INSN_CLASS_V_OR_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOMAXUEI64V, MASK_VAMOMAXUEI64V, match_vd_neq_vm, INSN_DREF},
+{"vamoaddei8.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOADDEI8V, MASK_VAMOADDEI8V, match_vd_neq_vm, INSN_DREF},
+{"vamoswapei8.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOSWAPEI8V, MASK_VAMOSWAPEI8V, match_vd_neq_vm, INSN_DREF},
+{"vamoxorei8.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOXOREI8V, MASK_VAMOXOREI8V, match_vd_neq_vm, INSN_DREF},
+{"vamoandei8.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOANDEI8V, MASK_VAMOANDEI8V, match_vd_neq_vm, INSN_DREF},
+{"vamoorei8.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOOREI8V, MASK_VAMOOREI8V, match_vd_neq_vm, INSN_DREF},
+{"vamominei8.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOMINEI8V, MASK_VAMOMINEI8V, match_vd_neq_vm, INSN_DREF},
+{"vamomaxei8.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOMAXEI8V, MASK_VAMOMAXEI8V, match_vd_neq_vm, INSN_DREF},
+{"vamominuei8.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOMINUEI8V, MASK_VAMOMINUEI8V, match_vd_neq_vm, INSN_DREF},
+{"vamomaxuei8.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOMAXUEI8V, MASK_VAMOMAXUEI8V, match_vd_neq_vm, INSN_DREF},
+
+{"vamoaddei16.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOADDEI16V, MASK_VAMOADDEI16V, match_vd_neq_vm, INSN_DREF},
+{"vamoswapei16.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOSWAPEI16V, MASK_VAMOSWAPEI16V, match_vd_neq_vm, INSN_DREF},
+{"vamoxorei16.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOXOREI16V, MASK_VAMOXOREI16V, match_vd_neq_vm, INSN_DREF},
+{"vamoandei16.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOANDEI16V, MASK_VAMOANDEI16V, match_vd_neq_vm, INSN_DREF},
+{"vamoorei16.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOOREI16V, MASK_VAMOOREI16V, match_vd_neq_vm, INSN_DREF},
+{"vamominei16.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOMINEI16V, MASK_VAMOMINEI16V, match_vd_neq_vm, INSN_DREF},
+{"vamomaxei16.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOMAXEI16V, MASK_VAMOMAXEI16V, match_vd_neq_vm, INSN_DREF},
+{"vamominuei16.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOMINUEI16V, MASK_VAMOMINUEI16V, match_vd_neq_vm, INSN_DREF},
+{"vamomaxuei16.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOMAXUEI16V, MASK_VAMOMAXUEI16V, match_vd_neq_vm, INSN_DREF},
+
+{"vamoaddei32.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOADDEI32V, MASK_VAMOADDEI32V, match_vd_neq_vm, INSN_DREF},
+{"vamoswapei32.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOSWAPEI32V, MASK_VAMOSWAPEI32V, match_vd_neq_vm, INSN_DREF},
+{"vamoxorei32.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOXOREI32V, MASK_VAMOXOREI32V, match_vd_neq_vm, INSN_DREF},
+{"vamoandei32.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOANDEI32V, MASK_VAMOANDEI32V, match_vd_neq_vm, INSN_DREF},
+{"vamoorei32.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOOREI32V, MASK_VAMOOREI32V, match_vd_neq_vm, INSN_DREF},
+{"vamominei32.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOMINEI32V, MASK_VAMOMINEI32V, match_vd_neq_vm, INSN_DREF},
+{"vamomaxei32.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOMAXEI32V, MASK_VAMOMAXEI32V, match_vd_neq_vm, INSN_DREF},
+{"vamominuei32.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOMINUEI32V, MASK_VAMOMINUEI32V, match_vd_neq_vm, INSN_DREF},
+{"vamomaxuei32.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOMAXUEI32V, MASK_VAMOMAXUEI32V, match_vd_neq_vm, INSN_DREF},
+
+{"vamoaddei64.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOADDEI64V, MASK_VAMOADDEI64V, match_vd_neq_vm, INSN_DREF},
+{"vamoswapei64.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOSWAPEI64V, MASK_VAMOSWAPEI64V, match_vd_neq_vm, INSN_DREF},
+{"vamoxorei64.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOXOREI64V, MASK_VAMOXOREI64V, match_vd_neq_vm, INSN_DREF},
+{"vamoandei64.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOANDEI64V, MASK_VAMOANDEI64V, match_vd_neq_vm, INSN_DREF},
+{"vamoorei64.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOOREI64V, MASK_VAMOOREI64V, match_vd_neq_vm, INSN_DREF},
+{"vamominei64.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOMINEI64V, MASK_VAMOMINEI64V, match_vd_neq_vm, INSN_DREF},
+{"vamomaxei64.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOMAXEI64V, MASK_VAMOMAXEI64V, match_vd_neq_vm, INSN_DREF},
+{"vamominuei64.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOMINUEI64V, MASK_VAMOMINUEI64V, match_vd_neq_vm, INSN_DREF},
+{"vamomaxuei64.v", 0, INSN_CLASS_ZVAMO, "Ve,0(s),Vt,VfVm", MATCH_VAMOMAXUEI64V, MASK_VAMOMAXUEI64V, match_vd_neq_vm, INSN_DREF},
{"vneg.v", 0, INSN_CLASS_V, "Vd,VtVm", MATCH_VRSUBVX, MASK_VRSUBVX | MASK_RS1, match_vd_neq_vm, INSN_ALIAS },
--
2.30.2
* [integration v2 4/4] RISC-V/rvv: Added zve* and zvl* extensions, and clarify the imply rules.
2021-10-05 12:51 [integration v2 0/4] RISC-V/rvv: Update rvv from v01.0 to v1.0 Nelson Chu
` (2 preceding siblings ...)
2021-10-05 12:51 ` [integration v2 3/4] RISC-V/rvv: Separate zvamo from v, and removed the zvlsseg extension name Nelson Chu
@ 2021-10-05 12:51 ` Nelson Chu
2021-10-25 6:04 ` [integration v2 0/4] RISC-V/rvv: Update rvv from v01.0 to v1.0 Nelson Chu
4 siblings, 0 replies; 6+ messages in thread
From: Nelson Chu @ 2021-10-05 12:51 UTC (permalink / raw)
To: binutils, jimw, andrew
* Recognized zve* and zvl* extensions.
- zve*: zve64d, zve64f, zve64x, zve32f and zve32x.
- zvl*: zvl32b, zvl64b, zvl128b, zvl256b, zvl512b, zvl1024b, zvl2048b,
zvl4096b, zvl8192b, zvl16384b, zvl32768b and zvl65536b.
* The spec says that v requires f and d, zve64d requires d, and zve64f and
zve32f require f. However, according to issue 723,
[https://github.com/riscv/riscv-v-spec/issues/723]
the general rule is that extension names imply the things they require.
Therefore, the current imply rules should be as follows,
- v implies f and d.
- zve64d implies d.
- zve64f and zve32f imply f.
- zvamo implies a.
Besides, considering the implicit zve and zvl extensions,
- v implies zve64d and zvl128b.
- zve64* implies the corresponding zve32*. For example, zve64f implies zve32f,
and zve64x implies zve32x.
- zve*d implies zve*f and zve*x. For example, zve64d implies zve64f and zve64x.
- zve*f implies zve*x. For example, zve64f implies zve64x.
- zve64* implies zvl64b, and zve32* implies zvl32b.
- Each zvl* implies all smaller zvl*. For example, zvl128b implies zvl64b
and zvl32b.
Therefore, "-march=rv64iv -misa-spec=20191213" will be expanded to
"rv64i2p0_f2p0_d2p0_v1p0_zicsr2p0_zve32f1p0_zve32x1p0_zve64d1p0_zve64f1p0_zve64x1p0_zvl128b1p0_zvl32b1p0_zvl64b1p0".
Note: zicsr is the implied extension of f.
* For zve32x, the (segment) load/store instructions are illegal when EEW is
64. Besides, vsew cannot be set to 64 by vsetvli when zve32* is enabled.
* The zvl*b extensions also require either the v or a zve* extension to be
enabled. Otherwise we should issue errors.
bfd/
* elfxx-riscv.c (riscv_implicit_subsets): Added imply rules for v,
zve* and zvl*b extensions.
(riscv_supported_std_z_ext): Added zve* and zvl*b extensions.
(riscv_parse_check_conflicts): The zvl*b extensions cannot be set
without v and zve* extensions.
gas/
* config/tc-riscv.c (riscv_extended_subset_supports): Handle zve*.
(my_getVsetvliExpression): vsew cannot be set to 64 by vsetvli
when zve32* is enabled.
(riscv_ip): The (segment) loads and stores with EEW 64 cannot be
used when zve32x is enabled.
* testsuite/gas/riscv/extended/march-imply-v.d: New testcase.
* testsuite/gas/riscv/extended/march-imply-zve*.d: Likewise.
* testsuite/gas/riscv/extended/march-imply-zvl*b.d: Likewise.
* testsuite/gas/riscv/extended/vector-insns-fail-zve32x.d: Likewise.
* testsuite/gas/riscv/extended/vector-insns-fail-zve32x.l: Likewise.
* testsuite/gas/riscv/extended/vector-insns-fail-zve32x.s: Likewise.
* testsuite/gas/riscv/extended/vector-insns-fail-zvl.d: Likewise.
* testsuite/gas/riscv/extended/vector-insns-fail-zvl.l: Likewise.
* testsuite/gas/riscv/extended/vector-insns-fail-zvamo.d: Removed
a-ext from -march since it will be added as implicit ext for zvamo.
* testsuite/gas/riscv/extended/vector-insns.d: Likewise.
include/
* opcode/riscv.h: Defined INSN_V_EEW64.
opcodes/
* riscv-opc.c (riscv_draft_opcodes): Added INSN_V_EEW64 for vector
loads and stores when the eew encodings are 64.
---
bfd/elfxx-riscv.c | 66 +++
gas/config/tc-riscv.c | 29 +-
.../gas/riscv/extended/march-imply-v.d | 6 +
.../gas/riscv/extended/march-imply-zve32f.d | 6 +
.../gas/riscv/extended/march-imply-zve32x.d | 6 +
.../gas/riscv/extended/march-imply-zve64d.d | 6 +
.../gas/riscv/extended/march-imply-zve64f.d | 6 +
.../gas/riscv/extended/march-imply-zve64x.d | 6 +
.../gas/riscv/extended/march-imply-zvl1024b.d | 6 +
.../gas/riscv/extended/march-imply-zvl128b.d | 6 +
.../riscv/extended/march-imply-zvl16384b.d | 6 +
.../gas/riscv/extended/march-imply-zvl2048b.d | 6 +
.../gas/riscv/extended/march-imply-zvl256b.d | 6 +
.../riscv/extended/march-imply-zvl32768b.d | 6 +
.../gas/riscv/extended/march-imply-zvl4096b.d | 6 +
.../gas/riscv/extended/march-imply-zvl512b.d | 6 +
.../gas/riscv/extended/march-imply-zvl64b.d | 6 +
.../riscv/extended/march-imply-zvl65536b.d | 6 +
.../gas/riscv/extended/march-imply-zvl8192b.d | 6 +
.../riscv/extended/vector-insns-fail-zvamo.d | 2 +-
.../riscv/extended/vector-insns-fail-zve32x.d | 3 +
.../riscv/extended/vector-insns-fail-zve32x.l | 82 ++++
.../riscv/extended/vector-insns-fail-zve32x.s | 413 ++++++++++++++++++
.../riscv/extended/vector-insns-fail-zvl.d | 3 +
.../riscv/extended/vector-insns-fail-zvl.l | 2 +
.../gas/riscv/extended/vector-insns.d | 2 +-
include/opcode/riscv.h | 2 +
opcodes/riscv-opc.c | 152 +++----
28 files changed, 777 insertions(+), 81 deletions(-)
create mode 100644 gas/testsuite/gas/riscv/extended/march-imply-v.d
create mode 100644 gas/testsuite/gas/riscv/extended/march-imply-zve32f.d
create mode 100644 gas/testsuite/gas/riscv/extended/march-imply-zve32x.d
create mode 100644 gas/testsuite/gas/riscv/extended/march-imply-zve64d.d
create mode 100644 gas/testsuite/gas/riscv/extended/march-imply-zve64f.d
create mode 100644 gas/testsuite/gas/riscv/extended/march-imply-zve64x.d
create mode 100644 gas/testsuite/gas/riscv/extended/march-imply-zvl1024b.d
create mode 100644 gas/testsuite/gas/riscv/extended/march-imply-zvl128b.d
create mode 100644 gas/testsuite/gas/riscv/extended/march-imply-zvl16384b.d
create mode 100644 gas/testsuite/gas/riscv/extended/march-imply-zvl2048b.d
create mode 100644 gas/testsuite/gas/riscv/extended/march-imply-zvl256b.d
create mode 100644 gas/testsuite/gas/riscv/extended/march-imply-zvl32768b.d
create mode 100644 gas/testsuite/gas/riscv/extended/march-imply-zvl4096b.d
create mode 100644 gas/testsuite/gas/riscv/extended/march-imply-zvl512b.d
create mode 100644 gas/testsuite/gas/riscv/extended/march-imply-zvl64b.d
create mode 100644 gas/testsuite/gas/riscv/extended/march-imply-zvl65536b.d
create mode 100644 gas/testsuite/gas/riscv/extended/march-imply-zvl8192b.d
create mode 100644 gas/testsuite/gas/riscv/extended/vector-insns-fail-zve32x.d
create mode 100644 gas/testsuite/gas/riscv/extended/vector-insns-fail-zve32x.l
create mode 100644 gas/testsuite/gas/riscv/extended/vector-insns-fail-zve32x.s
create mode 100644 gas/testsuite/gas/riscv/extended/vector-insns-fail-zvl.d
create mode 100644 gas/testsuite/gas/riscv/extended/vector-insns-fail-zvl.l
diff --git a/bfd/elfxx-riscv.c b/bfd/elfxx-riscv.c
index 62b92a05f25..f161181bcac 100644
--- a/bfd/elfxx-riscv.c
+++ b/bfd/elfxx-riscv.c
@@ -1073,6 +1073,32 @@ static struct riscv_implicit_subset riscv_implicit_subsets[] =
{"g", "zicsr", check_implicit_always},
{"g", "zifencei", check_implicit_always},
{"q", "d", check_implicit_always},
+ {"v", "d", check_implicit_always},
+ {"v", "zve64d", check_implicit_always},
+ {"v", "zvl128b", check_implicit_always},
+ {"zvamo", "a", check_implicit_always},
+ {"zve64d", "d", check_implicit_always},
+ {"zve64d", "zve64f", check_implicit_always},
+ {"zve64f", "zve32f", check_implicit_always},
+ {"zve64f", "zve64x", check_implicit_always},
+ {"zve64f", "zvl64b", check_implicit_always},
+ {"zve32f", "f", check_implicit_always},
+ {"zve32f", "zvl32b", check_implicit_always},
+ {"zve32f", "zve32x", check_implicit_always},
+ {"zve64x", "zve32x", check_implicit_always},
+ {"zve64x", "zvl64b", check_implicit_always},
+ {"zve32x", "zvl32b", check_implicit_always},
+ {"zvl65536b", "zvl32768b", check_implicit_always},
+ {"zvl32768b", "zvl16384b", check_implicit_always},
+ {"zvl16384b", "zvl8192b", check_implicit_always},
+ {"zvl8192b", "zvl4096b", check_implicit_always},
+ {"zvl4096b", "zvl2048b", check_implicit_always},
+ {"zvl2048b", "zvl1024b", check_implicit_always},
+ {"zvl1024b", "zvl512b", check_implicit_always},
+ {"zvl512b", "zvl256b", check_implicit_always},
+ {"zvl256b", "zvl128b", check_implicit_always},
+ {"zvl128b", "zvl64b", check_implicit_always},
+ {"zvl64b", "zvl32b", check_implicit_always},
{"d", "f", check_implicit_always},
{"f", "zicsr", check_implicit_always},
{"zfh", "f", check_implicit_always},
@@ -1148,6 +1174,24 @@ static struct riscv_supported_ext riscv_supported_std_z_ext[] =
{"zba", ISA_SPEC_CLASS_DRAFT, 0, 93, 0 },
{"zbc", ISA_SPEC_CLASS_DRAFT, 0, 93, 0 },
{"zvamo", ISA_SPEC_CLASS_DRAFT, 1, 0, 0 },
+ {"zve32x", ISA_SPEC_CLASS_DRAFT, 1, 0, 0 },
+ {"zve32f", ISA_SPEC_CLASS_DRAFT, 1, 0, 0 },
+ {"zve32d", ISA_SPEC_CLASS_DRAFT, 1, 0, 0 },
+ {"zve64x", ISA_SPEC_CLASS_DRAFT, 1, 0, 0 },
+ {"zve64f", ISA_SPEC_CLASS_DRAFT, 1, 0, 0 },
+ {"zve64d", ISA_SPEC_CLASS_DRAFT, 1, 0, 0 },
+ {"zvl32b", ISA_SPEC_CLASS_DRAFT, 1, 0, 0 },
+ {"zvl64b", ISA_SPEC_CLASS_DRAFT, 1, 0, 0 },
+ {"zvl128b", ISA_SPEC_CLASS_DRAFT, 1, 0, 0 },
+ {"zvl256b", ISA_SPEC_CLASS_DRAFT, 1, 0, 0 },
+ {"zvl512b", ISA_SPEC_CLASS_DRAFT, 1, 0, 0 },
+ {"zvl1024b", ISA_SPEC_CLASS_DRAFT, 1, 0, 0 },
+ {"zvl2048b", ISA_SPEC_CLASS_DRAFT, 1, 0, 0 },
+ {"zvl4096b", ISA_SPEC_CLASS_DRAFT, 1, 0, 0 },
+ {"zvl8192b", ISA_SPEC_CLASS_DRAFT, 1, 0, 0 },
+ {"zvl16384b", ISA_SPEC_CLASS_DRAFT, 1, 0, 0 },
+ {"zvl32768b", ISA_SPEC_CLASS_DRAFT, 1, 0, 0 },
+ {"zvl65536b", ISA_SPEC_CLASS_DRAFT, 1, 0, 0 },
{"zfh", ISA_SPEC_CLASS_DRAFT, 0, 1, 0 }, /* draft. */
{NULL, 0, 0, 0, 0}
};
@@ -1841,6 +1885,28 @@ riscv_parse_check_conflicts (riscv_parse_subset_t *rps)
(_("rv32e does not support the `f' extension"));
no_conflict = false;
}
+
+ bool support_zve = false;
+ bool support_zvl = false;
+ riscv_subset_t *s = rps->subset_list->head;
+ for (; s != NULL; s = s->next)
+ {
+ if (!support_zve
+ && strncmp (s->name, "zve", 3) == 0)
+ support_zve = true;
+ if (!support_zvl
+ && strncmp (s->name, "zvl", 3) == 0)
+ support_zvl = true;
+ if (support_zve && support_zvl)
+ break;
+ }
+ if (support_zvl && !support_zve)
+ {
+ rps->error_handler
+ (_("zvl*b extensions need to enable either `v' or `zve' extension"));
+ no_conflict = false;
+ }
+
return no_conflict;
}
diff --git a/gas/config/tc-riscv.c b/gas/config/tc-riscv.c
index acdb303f4b9..8c0aa1cecde 100644
--- a/gas/config/tc-riscv.c
+++ b/gas/config/tc-riscv.c
@@ -274,11 +274,17 @@ riscv_extended_subset_supports (int insn_class)
{
switch (insn_class)
{
- case INSN_CLASS_V: return riscv_subset_supports ("v");
+ case INSN_CLASS_V:
+ return (riscv_subset_supports ("v")
+ || riscv_subset_supports ("zve64x")
+ || riscv_subset_supports ("zve32x"));
case INSN_CLASS_V_AND_F:
- return riscv_subset_supports ("v") && riscv_subset_supports ("f");
+ return (riscv_subset_supports ("v")
+ || riscv_subset_supports ("zve64d")
+ || riscv_subset_supports ("zve64f")
+ || riscv_subset_supports ("zve32f"));
case INSN_CLASS_ZVAMO:
- return riscv_subset_supports ("a") && riscv_subset_supports ("zvamo");
+ return riscv_subset_supports ("zvamo");
case INSN_CLASS_ZFH:
return riscv_subset_supports ("zfh");
@@ -2450,6 +2456,13 @@ my_getVsetvliExpression (expressionS *ep, char *str)
++str;
if (vsew_found)
as_bad (_("multiple vsew constants"));
+ /* Zve32x is implied by zve32f, and zve64x is implied by
+    zve64f.  */
+ else if (riscv_subset_supports ("zve32x")
+ && !riscv_subset_supports ("zve64x")
+ && vsew_value > 2)
+ as_bad (_("illegal vsew %s for zve32x and zve32f"),
+ riscv_vsew[vsew_value]);
vsew_found = true;
}
if (arg_lookup (&str, riscv_vlmul, ARRAY_SIZE (riscv_vlmul), &vlmul_value))
@@ -2859,6 +2872,16 @@ riscv_ip (char *str, struct riscv_cl_insn *ip, expressionS *imm_expr,
as_warn (_("read-only CSR is written `%s'"), str);
insn_with_csr = false;
}
+
+ /* The (segment) loads and stores with EEW 64 cannot be used
+    when zve32x is enabled.  */
+ if (ip->insn_mo->pinfo & INSN_V_EEW64
+ && riscv_subset_supports ("zve32x")
+ && !riscv_subset_supports ("zve64x"))
+ {
+ error = _("illegal opcode for zve32x");
+ break;
+ }
}
if (*asarg != '\0')
break;
diff --git a/gas/testsuite/gas/riscv/extended/march-imply-v.d b/gas/testsuite/gas/riscv/extended/march-imply-v.d
new file mode 100644
index 00000000000..4177f0328bc
--- /dev/null
+++ b/gas/testsuite/gas/riscv/extended/march-imply-v.d
@@ -0,0 +1,6 @@
+#as: -march=rv32iv -march-attr -misa-spec=20191213
+#readelf: -A
+#source: ../empty.s
+Attribute Section: riscv
+File Attributes
+ Tag_RISCV_arch: "rv32i2p1_f2p2_d2p2_v1p0_zicsr2p0_zve32f1p0_zve32x1p0_zve64d1p0_zve64f1p0_zve64x1p0_zvl128b1p0_zvl32b1p0_zvl64b1p0"
diff --git a/gas/testsuite/gas/riscv/extended/march-imply-zve32f.d b/gas/testsuite/gas/riscv/extended/march-imply-zve32f.d
new file mode 100644
index 00000000000..17b7a67cc95
--- /dev/null
+++ b/gas/testsuite/gas/riscv/extended/march-imply-zve32f.d
@@ -0,0 +1,6 @@
+#as: -march=rv32i_zve32f -march-attr -misa-spec=20191213
+#readelf: -A
+#source: ../empty.s
+Attribute Section: riscv
+File Attributes
+ Tag_RISCV_arch: "rv32i2p1_f2p2_zicsr2p0_zve32f1p0_zve32x1p0_zvl32b1p0"
diff --git a/gas/testsuite/gas/riscv/extended/march-imply-zve32x.d b/gas/testsuite/gas/riscv/extended/march-imply-zve32x.d
new file mode 100644
index 00000000000..34742e544dc
--- /dev/null
+++ b/gas/testsuite/gas/riscv/extended/march-imply-zve32x.d
@@ -0,0 +1,6 @@
+#as: -march=rv32i_zve32x -march-attr -misa-spec=20191213
+#readelf: -A
+#source: ../empty.s
+Attribute Section: riscv
+File Attributes
+ Tag_RISCV_arch: "rv32i2p1_zve32x1p0_zvl32b1p0"
diff --git a/gas/testsuite/gas/riscv/extended/march-imply-zve64d.d b/gas/testsuite/gas/riscv/extended/march-imply-zve64d.d
new file mode 100644
index 00000000000..f5fb6a5fc1c
--- /dev/null
+++ b/gas/testsuite/gas/riscv/extended/march-imply-zve64d.d
@@ -0,0 +1,6 @@
+#as: -march=rv32i_zve64d -march-attr -misa-spec=20191213
+#readelf: -A
+#source: ../empty.s
+Attribute Section: riscv
+File Attributes
+ Tag_RISCV_arch: "rv32i2p1_f2p2_d2p2_zicsr2p0_zve32f1p0_zve32x1p0_zve64d1p0_zve64f1p0_zve64x1p0_zvl32b1p0_zvl64b1p0"
diff --git a/gas/testsuite/gas/riscv/extended/march-imply-zve64f.d b/gas/testsuite/gas/riscv/extended/march-imply-zve64f.d
new file mode 100644
index 00000000000..2cd7224a29d
--- /dev/null
+++ b/gas/testsuite/gas/riscv/extended/march-imply-zve64f.d
@@ -0,0 +1,6 @@
+#as: -march=rv32i_zve64f -march-attr -misa-spec=20191213
+#readelf: -A
+#source: ../empty.s
+Attribute Section: riscv
+File Attributes
+ Tag_RISCV_arch: "rv32i2p1_f2p2_zicsr2p0_zve32f1p0_zve32x1p0_zve64f1p0_zve64x1p0_zvl32b1p0_zvl64b1p0"
diff --git a/gas/testsuite/gas/riscv/extended/march-imply-zve64x.d b/gas/testsuite/gas/riscv/extended/march-imply-zve64x.d
new file mode 100644
index 00000000000..96dea6f58a6
--- /dev/null
+++ b/gas/testsuite/gas/riscv/extended/march-imply-zve64x.d
@@ -0,0 +1,6 @@
+#as: -march=rv32i_zve64x -march-attr -misa-spec=20191213
+#readelf: -A
+#source: ../empty.s
+Attribute Section: riscv
+File Attributes
+ Tag_RISCV_arch: "rv32i2p1_zve32x1p0_zve64x1p0_zvl32b1p0_zvl64b1p0"
diff --git a/gas/testsuite/gas/riscv/extended/march-imply-zvl1024b.d b/gas/testsuite/gas/riscv/extended/march-imply-zvl1024b.d
new file mode 100644
index 00000000000..4a68a9c9425
--- /dev/null
+++ b/gas/testsuite/gas/riscv/extended/march-imply-zvl1024b.d
@@ -0,0 +1,6 @@
+#as: -march=rv32i_zve32x_zvl1024b -march-attr -misa-spec=20191213
+#readelf: -A
+#source: ../empty.s
+Attribute Section: riscv
+File Attributes
+ Tag_RISCV_arch: "rv32i2p1_zve32x1p0_zvl1024b1p0_zvl128b1p0_zvl256b1p0_zvl32b1p0_zvl512b1p0_zvl64b1p0"
diff --git a/gas/testsuite/gas/riscv/extended/march-imply-zvl128b.d b/gas/testsuite/gas/riscv/extended/march-imply-zvl128b.d
new file mode 100644
index 00000000000..ef1ac30b9ba
--- /dev/null
+++ b/gas/testsuite/gas/riscv/extended/march-imply-zvl128b.d
@@ -0,0 +1,6 @@
+#as: -march=rv32i_zve32x_zvl128b -march-attr -misa-spec=20191213
+#readelf: -A
+#source: ../empty.s
+Attribute Section: riscv
+File Attributes
+ Tag_RISCV_arch: "rv32i2p1_zve32x1p0_zvl128b1p0_zvl32b1p0_zvl64b1p0"
diff --git a/gas/testsuite/gas/riscv/extended/march-imply-zvl16384b.d b/gas/testsuite/gas/riscv/extended/march-imply-zvl16384b.d
new file mode 100644
index 00000000000..a05077afbcd
--- /dev/null
+++ b/gas/testsuite/gas/riscv/extended/march-imply-zvl16384b.d
@@ -0,0 +1,6 @@
+#as: -march=rv32i_zve32x_zvl16384b -march-attr -misa-spec=20191213
+#readelf: -A
+#source: ../empty.s
+Attribute Section: riscv
+File Attributes
+ Tag_RISCV_arch: "rv32i2p1_zve32x1p0_zvl1024b1p0_zvl128b1p0_zvl16384b1p0_zvl2048b1p0_zvl256b1p0_zvl32b1p0_zvl4096b1p0_zvl512b1p0_zvl64b1p0_zvl8192b1p0"
diff --git a/gas/testsuite/gas/riscv/extended/march-imply-zvl2048b.d b/gas/testsuite/gas/riscv/extended/march-imply-zvl2048b.d
new file mode 100644
index 00000000000..8c45a9812d2
--- /dev/null
+++ b/gas/testsuite/gas/riscv/extended/march-imply-zvl2048b.d
@@ -0,0 +1,6 @@
+#as: -march=rv32i_zve32x_zvl2048b -march-attr -misa-spec=20191213
+#readelf: -A
+#source: ../empty.s
+Attribute Section: riscv
+File Attributes
+ Tag_RISCV_arch: "rv32i2p1_zve32x1p0_zvl1024b1p0_zvl128b1p0_zvl2048b1p0_zvl256b1p0_zvl32b1p0_zvl512b1p0_zvl64b1p0"
diff --git a/gas/testsuite/gas/riscv/extended/march-imply-zvl256b.d b/gas/testsuite/gas/riscv/extended/march-imply-zvl256b.d
new file mode 100644
index 00000000000..515bfdb0513
--- /dev/null
+++ b/gas/testsuite/gas/riscv/extended/march-imply-zvl256b.d
@@ -0,0 +1,6 @@
+#as: -march=rv32i_zve32x_zvl256b -march-attr -misa-spec=20191213
+#readelf: -A
+#source: ../empty.s
+Attribute Section: riscv
+File Attributes
+ Tag_RISCV_arch: "rv32i2p1_zve32x1p0_zvl128b1p0_zvl256b1p0_zvl32b1p0_zvl64b1p0"
diff --git a/gas/testsuite/gas/riscv/extended/march-imply-zvl32768b.d b/gas/testsuite/gas/riscv/extended/march-imply-zvl32768b.d
new file mode 100644
index 00000000000..a10b11a74e9
--- /dev/null
+++ b/gas/testsuite/gas/riscv/extended/march-imply-zvl32768b.d
@@ -0,0 +1,6 @@
+#as: -march=rv32i_zve32x_zvl32768b -march-attr -misa-spec=20191213
+#readelf: -A
+#source: ../empty.s
+Attribute Section: riscv
+File Attributes
+ Tag_RISCV_arch: "rv32i2p1_zve32x1p0_zvl1024b1p0_zvl128b1p0_zvl16384b1p0_zvl2048b1p0_zvl256b1p0_zvl32768b1p0_zvl32b1p0_zvl4096b1p0_zvl512b1p0_zvl64b1p0_zvl8192b1p0"
diff --git a/gas/testsuite/gas/riscv/extended/march-imply-zvl4096b.d b/gas/testsuite/gas/riscv/extended/march-imply-zvl4096b.d
new file mode 100644
index 00000000000..0951b57366f
--- /dev/null
+++ b/gas/testsuite/gas/riscv/extended/march-imply-zvl4096b.d
@@ -0,0 +1,6 @@
+#as: -march=rv32i_zve32x_zvl4096b -march-attr -misa-spec=20191213
+#readelf: -A
+#source: ../empty.s
+Attribute Section: riscv
+File Attributes
+ Tag_RISCV_arch: "rv32i2p1_zve32x1p0_zvl1024b1p0_zvl128b1p0_zvl2048b1p0_zvl256b1p0_zvl32b1p0_zvl4096b1p0_zvl512b1p0_zvl64b1p0"
diff --git a/gas/testsuite/gas/riscv/extended/march-imply-zvl512b.d b/gas/testsuite/gas/riscv/extended/march-imply-zvl512b.d
new file mode 100644
index 00000000000..14b439011c7
--- /dev/null
+++ b/gas/testsuite/gas/riscv/extended/march-imply-zvl512b.d
@@ -0,0 +1,6 @@
+#as: -march=rv32i_zve32x_zvl512b -march-attr -misa-spec=20191213
+#readelf: -A
+#source: ../empty.s
+Attribute Section: riscv
+File Attributes
+ Tag_RISCV_arch: "rv32i2p1_zve32x1p0_zvl128b1p0_zvl256b1p0_zvl32b1p0_zvl512b1p0_zvl64b1p0"
diff --git a/gas/testsuite/gas/riscv/extended/march-imply-zvl64b.d b/gas/testsuite/gas/riscv/extended/march-imply-zvl64b.d
new file mode 100644
index 00000000000..071bd245a28
--- /dev/null
+++ b/gas/testsuite/gas/riscv/extended/march-imply-zvl64b.d
@@ -0,0 +1,6 @@
+#as: -march=rv32i_zve32x_zvl64b -march-attr -misa-spec=20191213
+#readelf: -A
+#source: ../empty.s
+Attribute Section: riscv
+File Attributes
+ Tag_RISCV_arch: "rv32i2p1_zve32x1p0_zvl32b1p0_zvl64b1p0"
diff --git a/gas/testsuite/gas/riscv/extended/march-imply-zvl65536b.d b/gas/testsuite/gas/riscv/extended/march-imply-zvl65536b.d
new file mode 100644
index 00000000000..8b1739feb74
--- /dev/null
+++ b/gas/testsuite/gas/riscv/extended/march-imply-zvl65536b.d
@@ -0,0 +1,6 @@
+#as: -march=rv32i_zve32x_zvl65536b -march-attr -misa-spec=20191213
+#readelf: -A
+#source: ../empty.s
+Attribute Section: riscv
+File Attributes
+ Tag_RISCV_arch: "rv32i2p1_zve32x1p0_zvl1024b1p0_zvl128b1p0_zvl16384b1p0_zvl2048b1p0_zvl256b1p0_zvl32768b1p0_zvl32b1p0_zvl4096b1p0_zvl512b1p0_zvl64b1p0_zvl65536b1p0_zvl8192b1p0"
diff --git a/gas/testsuite/gas/riscv/extended/march-imply-zvl8192b.d b/gas/testsuite/gas/riscv/extended/march-imply-zvl8192b.d
new file mode 100644
index 00000000000..ddfdc5019d7
--- /dev/null
+++ b/gas/testsuite/gas/riscv/extended/march-imply-zvl8192b.d
@@ -0,0 +1,6 @@
+#as: -march=rv32i_zve32x_zvl8192b -march-attr -misa-spec=20191213
+#readelf: -A
+#source: ../empty.s
+Attribute Section: riscv
+File Attributes
+ Tag_RISCV_arch: "rv32i2p1_zve32x1p0_zvl1024b1p0_zvl128b1p0_zvl2048b1p0_zvl256b1p0_zvl32b1p0_zvl4096b1p0_zvl512b1p0_zvl64b1p0_zvl8192b1p0"
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-zvamo.d b/gas/testsuite/gas/riscv/extended/vector-insns-fail-zvamo.d
index 8a6de14600c..ff06d2377a3 100644
--- a/gas/testsuite/gas/riscv/extended/vector-insns-fail-zvamo.d
+++ b/gas/testsuite/gas/riscv/extended/vector-insns-fail-zvamo.d
@@ -1,3 +1,3 @@
-#as: -march=rv32ia_zvamo -mcheck-constraints
+#as: -march=rv32i_zvamo -mcheck-constraints
#source: vector-insns-fail-zvamo.s
#error_output: vector-insns-fail-zvamo.l
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-zve32x.d b/gas/testsuite/gas/riscv/extended/vector-insns-fail-zve32x.d
new file mode 100644
index 00000000000..3c60e8026a9
--- /dev/null
+++ b/gas/testsuite/gas/riscv/extended/vector-insns-fail-zve32x.d
@@ -0,0 +1,3 @@
+#as: -march=rv32i_zvamo_zve32x
+#source: vector-insns-fail-zve32x.s
+#error_output: vector-insns-fail-zve32x.l
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-zve32x.l b/gas/testsuite/gas/riscv/extended/vector-insns-fail-zve32x.l
new file mode 100644
index 00000000000..13f747c7ccb
--- /dev/null
+++ b/gas/testsuite/gas/riscv/extended/vector-insns-fail-zve32x.l
@@ -0,0 +1,82 @@
+.*Assembler messages:
+.*Error: illegal vsew e64 for zve32x and zve32f
+.*Error: illegal vsew e128 for zve32x and zve32f
+.*Error: illegal vsew e256 for zve32x and zve32f
+.*Error: illegal vsew e512 for zve32x and zve32f
+.*Error: illegal vsew e1024 for zve32x and zve32f
+.*Error: illegal opcode for zve32x `vle64.v v4,\(a0\)'
+.*Error: illegal opcode for zve32x `vse64.v v4,\(a0\)'
+.*Error: illegal opcode for zve32x `vlse64.v v4,\(a0\),a1'
+.*Error: illegal opcode for zve32x `vsse64.v v4,\(a0\),a1'
+.*Error: illegal opcode for zve32x `vloxei64.v v4,\(a0\),v12'
+.*Error: illegal opcode for zve32x `vsoxei64.v v4,\(a0\),v12'
+.*Error: illegal opcode for zve32x `vluxei64.v v4,\(a0\),v12'
+.*Error: illegal opcode for zve32x `vsuxei64.v v4,\(a0\),v12'
+.*Error: illegal opcode for zve32x `vle64ff.v v4,\(a0\)'
+.*Error: illegal opcode for zve32x `vlseg2e64.v v4,\(a0\)'
+.*Error: illegal opcode for zve32x `vsseg2e64.v v4,\(a0\)'
+.*Error: illegal opcode for zve32x `vlseg3e64.v v4,\(a0\)'
+.*Error: illegal opcode for zve32x `vsseg3e64.v v4,\(a0\)'
+.*Error: illegal opcode for zve32x `vlseg4e64.v v4,\(a0\)'
+.*Error: illegal opcode for zve32x `vsseg4e64.v v4,\(a0\)'
+.*Error: illegal opcode for zve32x `vlseg5e64.v v4,\(a0\)'
+.*Error: illegal opcode for zve32x `vsseg5e64.v v4,\(a0\)'
+.*Error: illegal opcode for zve32x `vlseg6e64.v v4,\(a0\)'
+.*Error: illegal opcode for zve32x `vsseg6e64.v v4,\(a0\)'
+.*Error: illegal opcode for zve32x `vlseg7e64.v v4,\(a0\)'
+.*Error: illegal opcode for zve32x `vsseg7e64.v v4,\(a0\)'
+.*Error: illegal opcode for zve32x `vlseg8e64.v v4,\(a0\)'
+.*Error: illegal opcode for zve32x `vsseg8e64.v v4,\(a0\)'
+.*Error: illegal opcode for zve32x `vlsseg2e64.v v4,\(a0\),a1'
+.*Error: illegal opcode for zve32x `vssseg2e64.v v4,\(a0\),a1'
+.*Error: illegal opcode for zve32x `vlsseg3e64.v v4,\(a0\),a1'
+.*Error: illegal opcode for zve32x `vssseg3e64.v v4,\(a0\),a1'
+.*Error: illegal opcode for zve32x `vlsseg4e64.v v4,\(a0\),a1'
+.*Error: illegal opcode for zve32x `vssseg4e64.v v4,\(a0\),a1'
+.*Error: illegal opcode for zve32x `vlsseg5e64.v v4,\(a0\),a1'
+.*Error: illegal opcode for zve32x `vssseg5e64.v v4,\(a0\),a1'
+.*Error: illegal opcode for zve32x `vlsseg6e64.v v4,\(a0\),a1'
+.*Error: illegal opcode for zve32x `vssseg6e64.v v4,\(a0\),a1'
+.*Error: illegal opcode for zve32x `vlsseg7e64.v v4,\(a0\),a1'
+.*Error: illegal opcode for zve32x `vssseg7e64.v v4,\(a0\),a1'
+.*Error: illegal opcode for zve32x `vlsseg8e64.v v4,\(a0\),a1'
+.*Error: illegal opcode for zve32x `vssseg8e64.v v4,\(a0\),a1'
+.*Error: illegal opcode for zve32x `vloxseg2ei64.v v4,\(a0\),v12'
+.*Error: illegal opcode for zve32x `vsoxseg2ei64.v v4,\(a0\),v12'
+.*Error: illegal opcode for zve32x `vloxseg3ei64.v v4,\(a0\),v12'
+.*Error: illegal opcode for zve32x `vsoxseg3ei64.v v4,\(a0\),v12'
+.*Error: illegal opcode for zve32x `vloxseg4ei64.v v4,\(a0\),v12'
+.*Error: illegal opcode for zve32x `vsoxseg4ei64.v v4,\(a0\),v12'
+.*Error: illegal opcode for zve32x `vloxseg5ei64.v v4,\(a0\),v12'
+.*Error: illegal opcode for zve32x `vsoxseg5ei64.v v4,\(a0\),v12'
+.*Error: illegal opcode for zve32x `vloxseg6ei64.v v4,\(a0\),v12'
+.*Error: illegal opcode for zve32x `vsoxseg6ei64.v v4,\(a0\),v12'
+.*Error: illegal opcode for zve32x `vloxseg7ei64.v v4,\(a0\),v12'
+.*Error: illegal opcode for zve32x `vsoxseg7ei64.v v4,\(a0\),v12'
+.*Error: illegal opcode for zve32x `vloxseg8ei64.v v4,\(a0\),v12'
+.*Error: illegal opcode for zve32x `vsoxseg8ei64.v v4,\(a0\),v12'
+.*Error: illegal opcode for zve32x `vluxseg2ei64.v v4,\(a0\),v12'
+.*Error: illegal opcode for zve32x `vsuxseg2ei64.v v4,\(a0\),v12'
+.*Error: illegal opcode for zve32x `vluxseg3ei64.v v4,\(a0\),v12'
+.*Error: illegal opcode for zve32x `vsuxseg3ei64.v v4,\(a0\),v12'
+.*Error: illegal opcode for zve32x `vluxseg4ei64.v v4,\(a0\),v12'
+.*Error: illegal opcode for zve32x `vsuxseg4ei64.v v4,\(a0\),v12'
+.*Error: illegal opcode for zve32x `vluxseg5ei64.v v4,\(a0\),v12'
+.*Error: illegal opcode for zve32x `vsuxseg5ei64.v v4,\(a0\),v12'
+.*Error: illegal opcode for zve32x `vluxseg6ei64.v v4,\(a0\),v12'
+.*Error: illegal opcode for zve32x `vsuxseg6ei64.v v4,\(a0\),v12'
+.*Error: illegal opcode for zve32x `vluxseg7ei64.v v4,\(a0\),v12'
+.*Error: illegal opcode for zve32x `vsuxseg7ei64.v v4,\(a0\),v12'
+.*Error: illegal opcode for zve32x `vluxseg8ei64.v v4,\(a0\),v12'
+.*Error: illegal opcode for zve32x `vsuxseg8ei64.v v4,\(a0\),v12'
+.*Error: illegal opcode for zve32x `vlseg2e64ff.v v4,\(a0\)'
+.*Error: illegal opcode for zve32x `vlseg3e64ff.v v4,\(a0\)'
+.*Error: illegal opcode for zve32x `vlseg4e64ff.v v4,\(a0\)'
+.*Error: illegal opcode for zve32x `vlseg5e64ff.v v4,\(a0\)'
+.*Error: illegal opcode for zve32x `vlseg6e64ff.v v4,\(a0\)'
+.*Error: illegal opcode for zve32x `vlseg7e64ff.v v4,\(a0\)'
+.*Error: illegal opcode for zve32x `vlseg8e64ff.v v4,\(a0\)'
+.*Error: illegal opcode for zve32x `vl1re64.v v3,\(a0\)'
+.*Error: illegal opcode for zve32x `vl2re64.v v2,\(a0\)'
+.*Error: illegal opcode for zve32x `vl4re64.v v4,\(a0\)'
+.*Error: illegal opcode for zve32x `vl8re64.v v8,\(a0\)'
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-zve32x.s b/gas/testsuite/gas/riscv/extended/vector-insns-fail-zve32x.s
new file mode 100644
index 00000000000..7a0bfe88db8
--- /dev/null
+++ b/gas/testsuite/gas/riscv/extended/vector-insns-fail-zve32x.s
@@ -0,0 +1,413 @@
+ vsetvl a0, a1, a2
+ vsetvli a0, a1, e8
+ vsetvli a0, a1, e16, m2
+ vsetvli a0, a1, e32, m2, ta
+ vsetvli a0, a1, e64, m2, ta, ma
+ vsetvli a0, a1, e128
+ vsetvli a0, a1, e256, m2
+ vsetvli a0, a1, e512, m2, ta
+ vsetvli a0, a1, e1024, m2, ta, ma
+
+ vlm.v v4, (a0)
+ vle1.v v4, (a0) # Alias of vlm.v
+ vsm.v v4, (a0)
+ vse1.v v4, (a0) # Alias of vsm.v
+
+ vle8.v v4, (a0)
+ vse8.v v4, (a0)
+ vle16.v v4, (a0)
+ vse16.v v4, (a0)
+ vle32.v v4, (a0)
+ vse32.v v4, (a0)
+ vle64.v v4, (a0)
+ vse64.v v4, (a0)
+
+ vlse8.v v4, (a0), a1
+ vsse8.v v4, (a0), a1
+ vlse16.v v4, (a0), a1
+ vsse16.v v4, (a0), a1
+ vlse32.v v4, (a0), a1
+ vsse32.v v4, (a0), a1
+ vlse64.v v4, (a0), a1
+ vsse64.v v4, (a0), a1
+
+ vloxei8.v v4, (a0), v12
+ vsoxei8.v v4, (a0), v12
+ vluxei8.v v4, (a0), v12
+ vsuxei8.v v4, (a0), v12
+ vloxei16.v v4, (a0), v12
+ vsoxei16.v v4, (a0), v12
+ vluxei16.v v4, (a0), v12
+ vsuxei16.v v4, (a0), v12
+ vloxei32.v v4, (a0), v12
+ vsoxei32.v v4, (a0), v12
+ vluxei32.v v4, (a0), v12
+ vsuxei32.v v4, (a0), v12
+ vloxei64.v v4, (a0), v12
+ vsoxei64.v v4, (a0), v12
+ vluxei64.v v4, (a0), v12
+ vsuxei64.v v4, (a0), v12
+
+ vle8ff.v v4, (a0)
+ vle16ff.v v4, (a0)
+ vle32ff.v v4, (a0)
+ vle64ff.v v4, (a0)
+
+ vlseg2e8.v v4, (a0)
+ vsseg2e8.v v4, (a0)
+ vlseg3e8.v v4, (a0)
+ vsseg3e8.v v4, (a0)
+ vlseg4e8.v v4, (a0)
+ vsseg4e8.v v4, (a0)
+ vlseg5e8.v v4, (a0)
+ vsseg5e8.v v4, (a0)
+ vlseg6e8.v v4, (a0)
+ vsseg6e8.v v4, (a0)
+ vlseg7e8.v v4, (a0)
+ vsseg7e8.v v4, (a0)
+ vlseg8e8.v v4, (a0)
+ vsseg8e8.v v4, (a0)
+ vlseg2e16.v v4, (a0)
+ vsseg2e16.v v4, (a0)
+ vlseg3e16.v v4, (a0)
+ vsseg3e16.v v4, (a0)
+ vlseg4e16.v v4, (a0)
+ vsseg4e16.v v4, (a0)
+ vlseg5e16.v v4, (a0)
+ vsseg5e16.v v4, (a0)
+ vlseg6e16.v v4, (a0)
+ vsseg6e16.v v4, (a0)
+ vlseg7e16.v v4, (a0)
+ vsseg7e16.v v4, (a0)
+ vlseg8e16.v v4, (a0)
+ vsseg8e16.v v4, (a0)
+ vlseg2e32.v v4, (a0)
+ vsseg2e32.v v4, (a0)
+ vlseg3e32.v v4, (a0)
+ vsseg3e32.v v4, (a0)
+ vlseg4e32.v v4, (a0)
+ vsseg4e32.v v4, (a0)
+ vlseg5e32.v v4, (a0)
+ vsseg5e32.v v4, (a0)
+ vlseg6e32.v v4, (a0)
+ vsseg6e32.v v4, (a0)
+ vlseg7e32.v v4, (a0)
+ vsseg7e32.v v4, (a0)
+ vlseg8e32.v v4, (a0)
+ vsseg8e32.v v4, (a0)
+ vlseg2e64.v v4, (a0)
+ vsseg2e64.v v4, (a0)
+ vlseg3e64.v v4, (a0)
+ vsseg3e64.v v4, (a0)
+ vlseg4e64.v v4, (a0)
+ vsseg4e64.v v4, (a0)
+ vlseg5e64.v v4, (a0)
+ vsseg5e64.v v4, (a0)
+ vlseg6e64.v v4, (a0)
+ vsseg6e64.v v4, (a0)
+ vlseg7e64.v v4, (a0)
+ vsseg7e64.v v4, (a0)
+ vlseg8e64.v v4, (a0)
+ vsseg8e64.v v4, (a0)
+
+ vlsseg2e8.v v4, (a0), a1
+ vssseg2e8.v v4, (a0), a1
+ vlsseg3e8.v v4, (a0), a1
+ vssseg3e8.v v4, (a0), a1
+ vlsseg4e8.v v4, (a0), a1
+ vssseg4e8.v v4, (a0), a1
+ vlsseg5e8.v v4, (a0), a1
+ vssseg5e8.v v4, (a0), a1
+ vlsseg6e8.v v4, (a0), a1
+ vssseg6e8.v v4, (a0), a1
+ vlsseg7e8.v v4, (a0), a1
+ vssseg7e8.v v4, (a0), a1
+ vlsseg8e8.v v4, (a0), a1
+ vssseg8e8.v v4, (a0), a1
+ vlsseg2e16.v v4, (a0), a1
+ vssseg2e16.v v4, (a0), a1
+ vlsseg3e16.v v4, (a0), a1
+ vssseg3e16.v v4, (a0), a1
+ vlsseg4e16.v v4, (a0), a1
+ vssseg4e16.v v4, (a0), a1
+ vlsseg5e16.v v4, (a0), a1
+ vssseg5e16.v v4, (a0), a1
+ vlsseg6e16.v v4, (a0), a1
+ vssseg6e16.v v4, (a0), a1
+ vlsseg7e16.v v4, (a0), a1
+ vssseg7e16.v v4, (a0), a1
+ vlsseg8e16.v v4, (a0), a1
+ vssseg8e16.v v4, (a0), a1
+ vlsseg2e32.v v4, (a0), a1
+ vssseg2e32.v v4, (a0), a1
+ vlsseg3e32.v v4, (a0), a1
+ vssseg3e32.v v4, (a0), a1
+ vlsseg4e32.v v4, (a0), a1
+ vssseg4e32.v v4, (a0), a1
+ vlsseg5e32.v v4, (a0), a1
+ vssseg5e32.v v4, (a0), a1
+ vlsseg6e32.v v4, (a0), a1
+ vssseg6e32.v v4, (a0), a1
+ vlsseg7e32.v v4, (a0), a1
+ vssseg7e32.v v4, (a0), a1
+ vlsseg8e32.v v4, (a0), a1
+ vssseg8e32.v v4, (a0), a1
+ vlsseg2e64.v v4, (a0), a1
+ vssseg2e64.v v4, (a0), a1
+ vlsseg3e64.v v4, (a0), a1
+ vssseg3e64.v v4, (a0), a1
+ vlsseg4e64.v v4, (a0), a1
+ vssseg4e64.v v4, (a0), a1
+ vlsseg5e64.v v4, (a0), a1
+ vssseg5e64.v v4, (a0), a1
+ vlsseg6e64.v v4, (a0), a1
+ vssseg6e64.v v4, (a0), a1
+ vlsseg7e64.v v4, (a0), a1
+ vssseg7e64.v v4, (a0), a1
+ vlsseg8e64.v v4, (a0), a1
+ vssseg8e64.v v4, (a0), a1
+
+ vloxseg2ei8.v v4, (a0), v12
+ vsoxseg2ei8.v v4, (a0), v12
+ vloxseg3ei8.v v4, (a0), v12
+ vsoxseg3ei8.v v4, (a0), v12
+ vloxseg4ei8.v v4, (a0), v12
+ vsoxseg4ei8.v v4, (a0), v12
+ vloxseg5ei8.v v4, (a0), v12
+ vsoxseg5ei8.v v4, (a0), v12
+ vloxseg6ei8.v v4, (a0), v12
+ vsoxseg6ei8.v v4, (a0), v12
+ vloxseg7ei8.v v4, (a0), v12
+ vsoxseg7ei8.v v4, (a0), v12
+ vloxseg8ei8.v v4, (a0), v12
+ vsoxseg8ei8.v v4, (a0), v12
+ vloxseg2ei16.v v4, (a0), v12
+ vsoxseg2ei16.v v4, (a0), v12
+ vloxseg3ei16.v v4, (a0), v12
+ vsoxseg3ei16.v v4, (a0), v12
+ vloxseg4ei16.v v4, (a0), v12
+ vsoxseg4ei16.v v4, (a0), v12
+ vloxseg5ei16.v v4, (a0), v12
+ vsoxseg5ei16.v v4, (a0), v12
+ vloxseg6ei16.v v4, (a0), v12
+ vsoxseg6ei16.v v4, (a0), v12
+ vloxseg7ei16.v v4, (a0), v12
+ vsoxseg7ei16.v v4, (a0), v12
+ vloxseg8ei16.v v4, (a0), v12
+ vsoxseg8ei16.v v4, (a0), v12
+ vloxseg2ei32.v v4, (a0), v12
+ vsoxseg2ei32.v v4, (a0), v12
+ vloxseg3ei32.v v4, (a0), v12
+ vsoxseg3ei32.v v4, (a0), v12
+ vloxseg4ei32.v v4, (a0), v12
+ vsoxseg4ei32.v v4, (a0), v12
+ vloxseg5ei32.v v4, (a0), v12
+ vsoxseg5ei32.v v4, (a0), v12
+ vloxseg6ei32.v v4, (a0), v12
+ vsoxseg6ei32.v v4, (a0), v12
+ vloxseg7ei32.v v4, (a0), v12
+ vsoxseg7ei32.v v4, (a0), v12
+ vloxseg8ei32.v v4, (a0), v12
+ vsoxseg8ei32.v v4, (a0), v12
+ vloxseg2ei64.v v4, (a0), v12
+ vsoxseg2ei64.v v4, (a0), v12
+ vloxseg3ei64.v v4, (a0), v12
+ vsoxseg3ei64.v v4, (a0), v12
+ vloxseg4ei64.v v4, (a0), v12
+ vsoxseg4ei64.v v4, (a0), v12
+ vloxseg5ei64.v v4, (a0), v12
+ vsoxseg5ei64.v v4, (a0), v12
+ vloxseg6ei64.v v4, (a0), v12
+ vsoxseg6ei64.v v4, (a0), v12
+ vloxseg7ei64.v v4, (a0), v12
+ vsoxseg7ei64.v v4, (a0), v12
+ vloxseg8ei64.v v4, (a0), v12
+ vsoxseg8ei64.v v4, (a0), v12
+
+ vluxseg2ei8.v v4, (a0), v12
+ vsuxseg2ei8.v v4, (a0), v12
+ vluxseg3ei8.v v4, (a0), v12
+ vsuxseg3ei8.v v4, (a0), v12
+ vluxseg4ei8.v v4, (a0), v12
+ vsuxseg4ei8.v v4, (a0), v12
+ vluxseg5ei8.v v4, (a0), v12
+ vsuxseg5ei8.v v4, (a0), v12
+ vluxseg6ei8.v v4, (a0), v12
+ vsuxseg6ei8.v v4, (a0), v12
+ vluxseg7ei8.v v4, (a0), v12
+ vsuxseg7ei8.v v4, (a0), v12
+ vluxseg8ei8.v v4, (a0), v12
+ vsuxseg8ei8.v v4, (a0), v12
+ vluxseg2ei16.v v4, (a0), v12
+ vsuxseg2ei16.v v4, (a0), v12
+ vluxseg3ei16.v v4, (a0), v12
+ vsuxseg3ei16.v v4, (a0), v12
+ vluxseg4ei16.v v4, (a0), v12
+ vsuxseg4ei16.v v4, (a0), v12
+ vluxseg5ei16.v v4, (a0), v12
+ vsuxseg5ei16.v v4, (a0), v12
+ vluxseg6ei16.v v4, (a0), v12
+ vsuxseg6ei16.v v4, (a0), v12
+ vluxseg7ei16.v v4, (a0), v12
+ vsuxseg7ei16.v v4, (a0), v12
+ vluxseg8ei16.v v4, (a0), v12
+ vsuxseg8ei16.v v4, (a0), v12
+ vluxseg2ei32.v v4, (a0), v12
+ vsuxseg2ei32.v v4, (a0), v12
+ vluxseg3ei32.v v4, (a0), v12
+ vsuxseg3ei32.v v4, (a0), v12
+ vluxseg4ei32.v v4, (a0), v12
+ vsuxseg4ei32.v v4, (a0), v12
+ vluxseg5ei32.v v4, (a0), v12
+ vsuxseg5ei32.v v4, (a0), v12
+ vluxseg6ei32.v v4, (a0), v12
+ vsuxseg6ei32.v v4, (a0), v12
+ vluxseg7ei32.v v4, (a0), v12
+ vsuxseg7ei32.v v4, (a0), v12
+ vluxseg8ei32.v v4, (a0), v12
+ vsuxseg8ei32.v v4, (a0), v12
+ vluxseg2ei64.v v4, (a0), v12
+ vsuxseg2ei64.v v4, (a0), v12
+ vluxseg3ei64.v v4, (a0), v12
+ vsuxseg3ei64.v v4, (a0), v12
+ vluxseg4ei64.v v4, (a0), v12
+ vsuxseg4ei64.v v4, (a0), v12
+ vluxseg5ei64.v v4, (a0), v12
+ vsuxseg5ei64.v v4, (a0), v12
+ vluxseg6ei64.v v4, (a0), v12
+ vsuxseg6ei64.v v4, (a0), v12
+ vluxseg7ei64.v v4, (a0), v12
+ vsuxseg7ei64.v v4, (a0), v12
+ vluxseg8ei64.v v4, (a0), v12
+ vsuxseg8ei64.v v4, (a0), v12
+
+ vlseg2e8ff.v v4, (a0)
+ vlseg3e8ff.v v4, (a0)
+ vlseg4e8ff.v v4, (a0)
+ vlseg5e8ff.v v4, (a0)
+ vlseg6e8ff.v v4, (a0)
+ vlseg7e8ff.v v4, (a0)
+ vlseg8e8ff.v v4, (a0)
+ vlseg2e16ff.v v4, (a0)
+ vlseg3e16ff.v v4, (a0)
+ vlseg4e16ff.v v4, (a0)
+ vlseg5e16ff.v v4, (a0)
+ vlseg6e16ff.v v4, (a0)
+ vlseg7e16ff.v v4, (a0)
+ vlseg8e16ff.v v4, (a0)
+ vlseg2e32ff.v v4, (a0)
+ vlseg3e32ff.v v4, (a0)
+ vlseg4e32ff.v v4, (a0)
+ vlseg5e32ff.v v4, (a0)
+ vlseg6e32ff.v v4, (a0)
+ vlseg7e32ff.v v4, (a0)
+ vlseg8e32ff.v v4, (a0)
+ vlseg2e64ff.v v4, (a0)
+ vlseg3e64ff.v v4, (a0)
+ vlseg4e64ff.v v4, (a0)
+ vlseg5e64ff.v v4, (a0)
+ vlseg6e64ff.v v4, (a0)
+ vlseg7e64ff.v v4, (a0)
+ vlseg8e64ff.v v4, (a0)
+
+ vl1r.v v3, (a0)
+ vl1re8.v v3, (a0)
+ vl1re16.v v3, (a0)
+ vl1re32.v v3, (a0)
+ vl1re64.v v3, (a0)
+ vl2r.v v2, (a0)
+ vl2re8.v v2, (a0)
+ vl2re16.v v2, (a0)
+ vl2re32.v v2, (a0)
+ vl2re64.v v2, (a0)
+ vl4r.v v4, (a0)
+ vl4re8.v v4, (a0)
+ vl4re16.v v4, (a0)
+ vl4re32.v v4, (a0)
+ vl4re64.v v4, (a0)
+ vl8r.v v8, (a0)
+ vl8re8.v v8, (a0)
+ vl8re16.v v8, (a0)
+ vl8re32.v v8, (a0)
+ vl8re64.v v8, (a0)
+
+ vs1r.v v3, (a1)
+ vs2r.v v2, (a1)
+ vs4r.v v4, (a1)
+ vs8r.v v8, (a1)
+
+ vamoaddei8.v v4, (a1), v8, v4
+ vamoswapei8.v v4, (a1), v8, v4
+ vamoxorei8.v v4, (a1), v8, v4
+ vamoandei8.v v4, (a1), v8, v4
+ vamoorei8.v v4, (a1), v8, v4
+ vamominei8.v v4, (a1), v8, v4
+ vamomaxei8.v v4, (a1), v8, v4
+ vamominuei8.v v4, (a1), v8, v4
+ vamomaxuei8.v v4, (a1), v8, v4
+ vamoaddei8.v v4, 0(a1), v8, v4
+ vamoswapei8.v v4, 0(a1), v8, v4
+ vamoxorei8.v v4, 0(a1), v8, v4
+ vamoandei8.v v4, 0(a1), v8, v4
+ vamoorei8.v v4, 0(a1), v8, v4
+ vamominei8.v v4, 0(a1), v8, v4
+ vamomaxei8.v v4, 0(a1), v8, v4
+ vamominuei8.v v4, 0(a1), v8, v4
+ vamomaxuei8.v v4, 0(a1), v8, v4
+
+ vamoaddei16.v v4, (a1), v8, v4
+ vamoswapei16.v v4, (a1), v8, v4
+ vamoxorei16.v v4, (a1), v8, v4
+ vamoandei16.v v4, (a1), v8, v4
+ vamoorei16.v v4, (a1), v8, v4
+ vamominei16.v v4, (a1), v8, v4
+ vamomaxei16.v v4, (a1), v8, v4
+ vamominuei16.v v4, (a1), v8, v4
+ vamomaxuei16.v v4, (a1), v8, v4
+ vamoaddei16.v v4, 0(a1), v8, v4
+ vamoswapei16.v v4, 0(a1), v8, v4
+ vamoxorei16.v v4, 0(a1), v8, v4
+ vamoandei16.v v4, 0(a1), v8, v4
+ vamoorei16.v v4, 0(a1), v8, v4
+ vamominei16.v v4, 0(a1), v8, v4
+ vamomaxei16.v v4, 0(a1), v8, v4
+ vamominuei16.v v4, 0(a1), v8, v4
+ vamomaxuei16.v v4, 0(a1), v8, v4
+
+ vamoaddei32.v v4, (a1), v8, v4
+ vamoswapei32.v v4, (a1), v8, v4
+ vamoxorei32.v v4, (a1), v8, v4
+ vamoandei32.v v4, (a1), v8, v4
+ vamoorei32.v v4, (a1), v8, v4
+ vamominei32.v v4, (a1), v8, v4
+ vamomaxei32.v v4, (a1), v8, v4
+ vamominuei32.v v4, (a1), v8, v4
+ vamomaxuei32.v v4, (a1), v8, v4
+ vamoaddei32.v v4, 0(a1), v8, v4
+ vamoswapei32.v v4, 0(a1), v8, v4
+ vamoxorei32.v v4, 0(a1), v8, v4
+ vamoandei32.v v4, 0(a1), v8, v4
+ vamoorei32.v v4, 0(a1), v8, v4
+ vamominei32.v v4, 0(a1), v8, v4
+ vamomaxei32.v v4, 0(a1), v8, v4
+ vamominuei32.v v4, 0(a1), v8, v4
+ vamomaxuei32.v v4, 0(a1), v8, v4
+
+ vamoaddei64.v v4, (a1), v8, v4
+ vamoswapei64.v v4, (a1), v8, v4
+ vamoxorei64.v v4, (a1), v8, v4
+ vamoandei64.v v4, (a1), v8, v4
+ vamoorei64.v v4, (a1), v8, v4
+ vamominei64.v v4, (a1), v8, v4
+ vamomaxei64.v v4, (a1), v8, v4
+ vamominuei64.v v4, (a1), v8, v4
+ vamomaxuei64.v v4, (a1), v8, v4
+ vamoaddei64.v v4, 0(a1), v8, v4
+ vamoswapei64.v v4, 0(a1), v8, v4
+ vamoxorei64.v v4, 0(a1), v8, v4
+ vamoandei64.v v4, 0(a1), v8, v4
+ vamoorei64.v v4, 0(a1), v8, v4
+ vamominei64.v v4, 0(a1), v8, v4
+ vamomaxei64.v v4, 0(a1), v8, v4
+ vamominuei64.v v4, 0(a1), v8, v4
+ vamomaxuei64.v v4, 0(a1), v8, v4
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-zvl.d b/gas/testsuite/gas/riscv/extended/vector-insns-fail-zvl.d
new file mode 100644
index 00000000000..16884e7e906
--- /dev/null
+++ b/gas/testsuite/gas/riscv/extended/vector-insns-fail-zvl.d
@@ -0,0 +1,3 @@
+#as: -march=rv32i_zvl65536b
+#source: ../empty.s
+#error_output: vector-insns-fail-zvl.l
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-zvl.l b/gas/testsuite/gas/riscv/extended/vector-insns-fail-zvl.l
new file mode 100644
index 00000000000..d820ded1e91
--- /dev/null
+++ b/gas/testsuite/gas/riscv/extended/vector-insns-fail-zvl.l
@@ -0,0 +1,2 @@
+.*Assembler messages:
+.*Error: zvl\*b extensions need to enable either `v' or `zve' extension
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns.d b/gas/testsuite/gas/riscv/extended/vector-insns.d
index 4e0ffff368c..3f9b9a1de02 100644
--- a/gas/testsuite/gas/riscv/extended/vector-insns.d
+++ b/gas/testsuite/gas/riscv/extended/vector-insns.d
@@ -1,4 +1,4 @@
-#as: -march=rv32iafv_zvamo
+#as: -march=rv32ifv_zvamo
#objdump: -dr
.*:[ ]+file format .*
diff --git a/include/opcode/riscv.h b/include/opcode/riscv.h
index 329bbc95ac9..25ef888bd63 100644
--- a/include/opcode/riscv.h
+++ b/include/opcode/riscv.h
@@ -445,6 +445,8 @@ extern const struct riscv_opcode riscv_insn_types[];
/* Extended extensions. */
+/* The insn_info fields. */
+#define INSN_V_EEW64 0x10000000
/* RVV IMM encodings. */
#define EXTRACT_RVV_VI_IMM(x) \
(RV_X(x, 15, 5) | (-RV_X(x, 19, 1) << 5))
diff --git a/opcodes/riscv-opc.c b/opcodes/riscv-opc.c
index 830b1bbf128..278b67326f3 100644
--- a/opcodes/riscv-opc.c
+++ b/opcodes/riscv-opc.c
@@ -1321,47 +1321,47 @@ const struct riscv_opcode riscv_draft_opcodes[] =
{"vle8.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLE8V, MASK_VLE8V, match_vd_neq_vm, INSN_DREF },
{"vle16.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLE16V, MASK_VLE16V, match_vd_neq_vm, INSN_DREF },
{"vle32.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLE32V, MASK_VLE32V, match_vd_neq_vm, INSN_DREF },
-{"vle64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLE64V, MASK_VLE64V, match_vd_neq_vm, INSN_DREF },
+{"vle64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLE64V, MASK_VLE64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
{"vse8.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSE8V, MASK_VSE8V, match_vd_neq_vm, INSN_DREF },
{"vse16.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSE16V, MASK_VSE16V, match_vd_neq_vm, INSN_DREF },
{"vse32.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSE32V, MASK_VSE32V, match_vd_neq_vm, INSN_DREF },
-{"vse64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSE64V, MASK_VSE64V, match_vd_neq_vm, INSN_DREF },
+{"vse64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSE64V, MASK_VSE64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
{"vlse8.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSE8V, MASK_VLSE8V, match_vd_neq_vm, INSN_DREF },
{"vlse16.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSE16V, MASK_VLSE16V, match_vd_neq_vm, INSN_DREF },
{"vlse32.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSE32V, MASK_VLSE32V, match_vd_neq_vm, INSN_DREF },
-{"vlse64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSE64V, MASK_VLSE64V, match_vd_neq_vm, INSN_DREF },
+{"vlse64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSE64V, MASK_VLSE64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
{"vsse8.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSE8V, MASK_VSSE8V, match_vd_neq_vm, INSN_DREF },
{"vsse16.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSE16V, MASK_VSSE16V, match_vd_neq_vm, INSN_DREF },
{"vsse32.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSE32V, MASK_VSSE32V, match_vd_neq_vm, INSN_DREF },
-{"vsse64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSE64V, MASK_VSSE64V, match_vd_neq_vm, INSN_DREF },
+{"vsse64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSE64V, MASK_VSSE64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
{"vloxei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXEI8V, MASK_VLOXEI8V, match_vd_neq_vm, INSN_DREF },
{"vloxei16.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXEI16V, MASK_VLOXEI16V, match_vd_neq_vm, INSN_DREF },
{"vloxei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXEI32V, MASK_VLOXEI32V, match_vd_neq_vm, INSN_DREF },
-{"vloxei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXEI64V, MASK_VLOXEI64V, match_vd_neq_vm, INSN_DREF },
+{"vloxei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXEI64V, MASK_VLOXEI64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
{"vsoxei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXEI8V, MASK_VSOXEI8V, match_vd_neq_vm, INSN_DREF },
{"vsoxei16.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXEI16V, MASK_VSOXEI16V, match_vd_neq_vm, INSN_DREF },
{"vsoxei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXEI32V, MASK_VSOXEI32V, match_vd_neq_vm, INSN_DREF },
-{"vsoxei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXEI64V, MASK_VSOXEI64V, match_vd_neq_vm, INSN_DREF },
+{"vsoxei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXEI64V, MASK_VSOXEI64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
{"vluxei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXEI8V, MASK_VLUXEI8V, match_vd_neq_vm, INSN_DREF },
{"vluxei16.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXEI16V, MASK_VLUXEI16V, match_vd_neq_vm, INSN_DREF },
{"vluxei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXEI32V, MASK_VLUXEI32V, match_vd_neq_vm, INSN_DREF },
-{"vluxei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXEI64V, MASK_VLUXEI64V, match_vd_neq_vm, INSN_DREF },
+{"vluxei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXEI64V, MASK_VLUXEI64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
{"vsuxei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXEI8V, MASK_VSUXEI8V, match_vd_neq_vm, INSN_DREF },
{"vsuxei16.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXEI16V, MASK_VSUXEI16V, match_vd_neq_vm, INSN_DREF },
{"vsuxei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXEI32V, MASK_VSUXEI32V, match_vd_neq_vm, INSN_DREF },
-{"vsuxei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXEI64V, MASK_VSUXEI64V, match_vd_neq_vm, INSN_DREF },
+{"vsuxei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXEI64V, MASK_VSUXEI64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
{"vle8ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLE8FFV, MASK_VLE8FFV, match_vd_neq_vm, INSN_DREF },
{"vle16ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLE16FFV, MASK_VLE16FFV, match_vd_neq_vm, INSN_DREF },
{"vle32ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLE32FFV, MASK_VLE32FFV, match_vd_neq_vm, INSN_DREF },
-{"vle64ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLE64FFV, MASK_VLE64FFV, match_vd_neq_vm, INSN_DREF },
+{"vle64ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLE64FFV, MASK_VLE64FFV, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
{"vlseg2e8.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG2E8V, MASK_VLSEG2E8V, match_vd_neq_vm, INSN_DREF },
{"vsseg2e8.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG2E8V, MASK_VSSEG2E8V, match_vd_neq_vm, INSN_DREF },
@@ -1408,20 +1408,20 @@ const struct riscv_opcode riscv_draft_opcodes[] =
{"vlseg8e32.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG8E32V, MASK_VLSEG8E32V, match_vd_neq_vm, INSN_DREF },
{"vsseg8e32.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG8E32V, MASK_VSSEG8E32V, match_vd_neq_vm, INSN_DREF },
-{"vlseg2e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG2E64V, MASK_VLSEG2E64V, match_vd_neq_vm, INSN_DREF },
-{"vsseg2e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG2E64V, MASK_VSSEG2E64V, match_vd_neq_vm, INSN_DREF },
-{"vlseg3e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG3E64V, MASK_VLSEG3E64V, match_vd_neq_vm, INSN_DREF },
-{"vsseg3e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG3E64V, MASK_VSSEG3E64V, match_vd_neq_vm, INSN_DREF },
-{"vlseg4e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG4E64V, MASK_VLSEG4E64V, match_vd_neq_vm, INSN_DREF },
-{"vsseg4e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG4E64V, MASK_VSSEG4E64V, match_vd_neq_vm, INSN_DREF },
-{"vlseg5e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG5E64V, MASK_VLSEG5E64V, match_vd_neq_vm, INSN_DREF },
-{"vsseg5e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG5E64V, MASK_VSSEG5E64V, match_vd_neq_vm, INSN_DREF },
-{"vlseg6e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG6E64V, MASK_VLSEG6E64V, match_vd_neq_vm, INSN_DREF },
-{"vsseg6e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG6E64V, MASK_VSSEG6E64V, match_vd_neq_vm, INSN_DREF },
-{"vlseg7e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG7E64V, MASK_VLSEG7E64V, match_vd_neq_vm, INSN_DREF },
-{"vsseg7e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG7E64V, MASK_VSSEG7E64V, match_vd_neq_vm, INSN_DREF },
-{"vlseg8e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG8E64V, MASK_VLSEG8E64V, match_vd_neq_vm, INSN_DREF },
-{"vsseg8e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG8E64V, MASK_VSSEG8E64V, match_vd_neq_vm, INSN_DREF },
+{"vlseg2e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG2E64V, MASK_VLSEG2E64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vsseg2e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG2E64V, MASK_VSSEG2E64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vlseg3e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG3E64V, MASK_VLSEG3E64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vsseg3e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG3E64V, MASK_VSSEG3E64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vlseg4e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG4E64V, MASK_VLSEG4E64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vsseg4e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG4E64V, MASK_VSSEG4E64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vlseg5e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG5E64V, MASK_VLSEG5E64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vsseg5e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG5E64V, MASK_VSSEG5E64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vlseg6e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG6E64V, MASK_VLSEG6E64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vsseg6e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG6E64V, MASK_VSSEG6E64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vlseg7e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG7E64V, MASK_VLSEG7E64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vsseg7e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG7E64V, MASK_VSSEG7E64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vlseg8e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG8E64V, MASK_VLSEG8E64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vsseg8e64.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VSSEG8E64V, MASK_VSSEG8E64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
{"vlsseg2e8.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG2E8V, MASK_VLSSEG2E8V, match_vd_neq_vm, INSN_DREF },
{"vssseg2e8.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG2E8V, MASK_VSSSEG2E8V, match_vd_neq_vm, INSN_DREF },
@@ -1468,20 +1468,20 @@ const struct riscv_opcode riscv_draft_opcodes[] =
{"vlsseg8e32.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG8E32V, MASK_VLSSEG8E32V, match_vd_neq_vm, INSN_DREF },
{"vssseg8e32.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG8E32V, MASK_VSSSEG8E32V, match_vd_neq_vm, INSN_DREF },
-{"vlsseg2e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG2E64V, MASK_VLSSEG2E64V, match_vd_neq_vm, INSN_DREF },
-{"vssseg2e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG2E64V, MASK_VSSSEG2E64V, match_vd_neq_vm, INSN_DREF },
-{"vlsseg3e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG3E64V, MASK_VLSSEG3E64V, match_vd_neq_vm, INSN_DREF },
-{"vssseg3e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG3E64V, MASK_VSSSEG3E64V, match_vd_neq_vm, INSN_DREF },
-{"vlsseg4e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG4E64V, MASK_VLSSEG4E64V, match_vd_neq_vm, INSN_DREF },
-{"vssseg4e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG4E64V, MASK_VSSSEG4E64V, match_vd_neq_vm, INSN_DREF },
-{"vlsseg5e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG5E64V, MASK_VLSSEG5E64V, match_vd_neq_vm, INSN_DREF },
-{"vssseg5e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG5E64V, MASK_VSSSEG5E64V, match_vd_neq_vm, INSN_DREF },
-{"vlsseg6e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG6E64V, MASK_VLSSEG6E64V, match_vd_neq_vm, INSN_DREF },
-{"vssseg6e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG6E64V, MASK_VSSSEG6E64V, match_vd_neq_vm, INSN_DREF },
-{"vlsseg7e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG7E64V, MASK_VLSSEG7E64V, match_vd_neq_vm, INSN_DREF },
-{"vssseg7e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG7E64V, MASK_VSSSEG7E64V, match_vd_neq_vm, INSN_DREF },
-{"vlsseg8e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG8E64V, MASK_VLSSEG8E64V, match_vd_neq_vm, INSN_DREF },
-{"vssseg8e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG8E64V, MASK_VSSSEG8E64V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg2e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG2E64V, MASK_VLSSEG2E64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vssseg2e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG2E64V, MASK_VSSSEG2E64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vlsseg3e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG3E64V, MASK_VLSSEG3E64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vssseg3e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG3E64V, MASK_VSSSEG3E64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vlsseg4e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG4E64V, MASK_VLSSEG4E64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vssseg4e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG4E64V, MASK_VSSSEG4E64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vlsseg5e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG5E64V, MASK_VLSSEG5E64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vssseg5e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG5E64V, MASK_VSSSEG5E64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vlsseg6e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG6E64V, MASK_VLSSEG6E64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vssseg6e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG6E64V, MASK_VSSSEG6E64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vlsseg7e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG7E64V, MASK_VLSSEG7E64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vssseg7e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG7E64V, MASK_VSSSEG7E64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vlsseg8e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VLSSEG8E64V, MASK_VLSSEG8E64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vssseg8e64.v", 0, INSN_CLASS_V, "Vd,0(s),tVm", MATCH_VSSSEG8E64V, MASK_VSSSEG8E64V, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
{"vloxseg2ei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG2EI8V, MASK_VLOXSEG2EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
{"vsoxseg2ei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG2EI8V, MASK_VSOXSEG2EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
@@ -1528,20 +1528,20 @@ const struct riscv_opcode riscv_draft_opcodes[] =
{"vloxseg8ei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG8EI32V, MASK_VLOXSEG8EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
{"vsoxseg8ei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG8EI32V, MASK_VSOXSEG8EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vloxseg2ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG2EI64V, MASK_VLOXSEG2EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg2ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG2EI64V, MASK_VSOXSEG2EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vloxseg3ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG3EI64V, MASK_VLOXSEG3EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg3ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG3EI64V, MASK_VSOXSEG3EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vloxseg4ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG4EI64V, MASK_VLOXSEG4EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg4ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG4EI64V, MASK_VSOXSEG4EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vloxseg5ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG5EI64V, MASK_VLOXSEG5EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg5ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG5EI64V, MASK_VSOXSEG5EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vloxseg6ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG6EI64V, MASK_VLOXSEG6EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg6ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG6EI64V, MASK_VSOXSEG6EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vloxseg7ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG7EI64V, MASK_VLOXSEG7EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg7ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG7EI64V, MASK_VSOXSEG7EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vloxseg8ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG8EI64V, MASK_VLOXSEG8EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsoxseg8ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG8EI64V, MASK_VSOXSEG8EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg2ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG2EI64V, MASK_VLOXSEG2EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vsoxseg2ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG2EI64V, MASK_VSOXSEG2EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vloxseg3ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG3EI64V, MASK_VLOXSEG3EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vsoxseg3ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG3EI64V, MASK_VSOXSEG3EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vloxseg4ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG4EI64V, MASK_VLOXSEG4EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vsoxseg4ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG4EI64V, MASK_VSOXSEG4EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vloxseg5ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG5EI64V, MASK_VLOXSEG5EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vsoxseg5ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG5EI64V, MASK_VSOXSEG5EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vloxseg6ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG6EI64V, MASK_VLOXSEG6EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vsoxseg6ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG6EI64V, MASK_VSOXSEG6EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vloxseg7ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG7EI64V, MASK_VLOXSEG7EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vsoxseg7ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG7EI64V, MASK_VSOXSEG7EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vloxseg8ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLOXSEG8EI64V, MASK_VLOXSEG8EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vsoxseg8ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSOXSEG8EI64V, MASK_VSOXSEG8EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF|INSN_V_EEW64 },
{"vluxseg2ei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG2EI8V, MASK_VLUXSEG2EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
{"vsuxseg2ei8.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG2EI8V, MASK_VSUXSEG2EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
@@ -1588,20 +1588,20 @@ const struct riscv_opcode riscv_draft_opcodes[] =
{"vluxseg8ei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG8EI32V, MASK_VLUXSEG8EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
{"vsuxseg8ei32.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG8EI32V, MASK_VSUXSEG8EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vluxseg2ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG2EI64V, MASK_VLUXSEG2EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg2ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG2EI64V, MASK_VSUXSEG2EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vluxseg3ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG3EI64V, MASK_VLUXSEG3EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg3ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG3EI64V, MASK_VSUXSEG3EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vluxseg4ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG4EI64V, MASK_VLUXSEG4EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg4ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG4EI64V, MASK_VSUXSEG4EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vluxseg5ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG5EI64V, MASK_VLUXSEG5EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg5ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG5EI64V, MASK_VSUXSEG5EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vluxseg6ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG6EI64V, MASK_VLUXSEG6EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg6ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG6EI64V, MASK_VSUXSEG6EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vluxseg7ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG7EI64V, MASK_VLUXSEG7EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg7ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG7EI64V, MASK_VSUXSEG7EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vluxseg8ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG8EI64V, MASK_VLUXSEG8EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
-{"vsuxseg8ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG8EI64V, MASK_VSUXSEG8EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg2ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG2EI64V, MASK_VLUXSEG2EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vsuxseg2ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG2EI64V, MASK_VSUXSEG2EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vluxseg3ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG3EI64V, MASK_VLUXSEG3EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vsuxseg3ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG3EI64V, MASK_VSUXSEG3EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vluxseg4ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG4EI64V, MASK_VLUXSEG4EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vsuxseg4ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG4EI64V, MASK_VSUXSEG4EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vluxseg5ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG5EI64V, MASK_VLUXSEG5EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vsuxseg5ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG5EI64V, MASK_VSUXSEG5EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vluxseg6ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG6EI64V, MASK_VLUXSEG6EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vsuxseg6ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG6EI64V, MASK_VSUXSEG6EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vluxseg7ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG7EI64V, MASK_VLUXSEG7EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vsuxseg7ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG7EI64V, MASK_VSUXSEG7EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vluxseg8ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VLUXSEG8EI64V, MASK_VLUXSEG8EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vsuxseg8ei64.v", 0, INSN_CLASS_V, "Vd,0(s),VtVm", MATCH_VSUXSEG8EI64V, MASK_VSUXSEG8EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF|INSN_V_EEW64 },
{"vlseg2e8ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG2E8FFV, MASK_VLSEG2E8FFV, match_vd_neq_vm, INSN_DREF },
{"vlseg3e8ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG3E8FFV, MASK_VLSEG3E8FFV, match_vd_neq_vm, INSN_DREF },
@@ -1627,37 +1627,37 @@ const struct riscv_opcode riscv_draft_opcodes[] =
{"vlseg7e32ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG7E32FFV, MASK_VLSEG7E32FFV, match_vd_neq_vm, INSN_DREF },
{"vlseg8e32ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG8E32FFV, MASK_VLSEG8E32FFV, match_vd_neq_vm, INSN_DREF },
-{"vlseg2e64ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG2E64FFV, MASK_VLSEG2E64FFV, match_vd_neq_vm, INSN_DREF },
-{"vlseg3e64ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG3E64FFV, MASK_VLSEG3E64FFV, match_vd_neq_vm, INSN_DREF },
-{"vlseg4e64ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG4E64FFV, MASK_VLSEG4E64FFV, match_vd_neq_vm, INSN_DREF },
-{"vlseg5e64ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG5E64FFV, MASK_VLSEG5E64FFV, match_vd_neq_vm, INSN_DREF },
-{"vlseg6e64ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG6E64FFV, MASK_VLSEG6E64FFV, match_vd_neq_vm, INSN_DREF },
-{"vlseg7e64ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG7E64FFV, MASK_VLSEG7E64FFV, match_vd_neq_vm, INSN_DREF },
-{"vlseg8e64ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG8E64FFV, MASK_VLSEG8E64FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg2e64ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG2E64FFV, MASK_VLSEG2E64FFV, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vlseg3e64ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG3E64FFV, MASK_VLSEG3E64FFV, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vlseg4e64ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG4E64FFV, MASK_VLSEG4E64FFV, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vlseg5e64ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG5E64FFV, MASK_VLSEG5E64FFV, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vlseg6e64ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG6E64FFV, MASK_VLSEG6E64FFV, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vlseg7e64ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG7E64FFV, MASK_VLSEG7E64FFV, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
+{"vlseg8e64ff.v", 0, INSN_CLASS_V, "Vd,0(s)Vm", MATCH_VLSEG8E64FFV, MASK_VLSEG8E64FFV, match_vd_neq_vm, INSN_DREF|INSN_V_EEW64 },
{"vl1r.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VL1RE8V, MASK_VL1RE8V, match_vls_nf_rv, INSN_DREF|INSN_ALIAS },
{"vl1re8.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VL1RE8V, MASK_VL1RE8V, match_vls_nf_rv, INSN_DREF },
{"vl1re16.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VL1RE16V, MASK_VL1RE16V, match_vls_nf_rv, INSN_DREF },
{"vl1re32.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VL1RE32V, MASK_VL1RE32V, match_vls_nf_rv, INSN_DREF },
-{"vl1re64.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VL1RE64V, MASK_VL1RE64V, match_vls_nf_rv, INSN_DREF },
+{"vl1re64.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VL1RE64V, MASK_VL1RE64V, match_vls_nf_rv, INSN_DREF|INSN_V_EEW64 },
{"vl2r.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VL2RE8V, MASK_VL2RE8V, match_vls_nf_rv, INSN_DREF|INSN_ALIAS },
{"vl2re8.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VL2RE8V, MASK_VL2RE8V, match_vls_nf_rv, INSN_DREF },
{"vl2re16.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VL2RE16V, MASK_VL2RE16V, match_vls_nf_rv, INSN_DREF },
{"vl2re32.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VL2RE32V, MASK_VL2RE32V, match_vls_nf_rv, INSN_DREF },
-{"vl2re64.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VL2RE64V, MASK_VL2RE64V, match_vls_nf_rv, INSN_DREF },
+{"vl2re64.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VL2RE64V, MASK_VL2RE64V, match_vls_nf_rv, INSN_DREF|INSN_V_EEW64 },
{"vl4r.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VL4RE8V, MASK_VL4RE8V, match_vls_nf_rv, INSN_DREF|INSN_ALIAS },
{"vl4re8.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VL4RE8V, MASK_VL4RE8V, match_vls_nf_rv, INSN_DREF },
{"vl4re16.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VL4RE16V, MASK_VL4RE16V, match_vls_nf_rv, INSN_DREF },
{"vl4re32.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VL4RE32V, MASK_VL4RE32V, match_vls_nf_rv, INSN_DREF },
-{"vl4re64.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VL4RE64V, MASK_VL4RE64V, match_vls_nf_rv, INSN_DREF },
+{"vl4re64.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VL4RE64V, MASK_VL4RE64V, match_vls_nf_rv, INSN_DREF|INSN_V_EEW64 },
{"vl8r.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VL8RE8V, MASK_VL8RE8V, match_vls_nf_rv, INSN_DREF|INSN_ALIAS },
{"vl8re8.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VL8RE8V, MASK_VL8RE8V, match_vls_nf_rv, INSN_DREF },
{"vl8re16.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VL8RE16V, MASK_VL8RE16V, match_vls_nf_rv, INSN_DREF },
{"vl8re32.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VL8RE32V, MASK_VL8RE32V, match_vls_nf_rv, INSN_DREF },
-{"vl8re64.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VL8RE64V, MASK_VL8RE64V, match_vls_nf_rv, INSN_DREF },
+{"vl8re64.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VL8RE64V, MASK_VL8RE64V, match_vls_nf_rv, INSN_DREF|INSN_V_EEW64 },
{"vs1r.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VS1RV, MASK_VS1RV, match_vls_nf_rv, INSN_DREF },
{"vs2r.v", 0, INSN_CLASS_V, "Vd,0(s)", MATCH_VS2RV, MASK_VS2RV, match_vls_nf_rv, INSN_DREF },
--
2.30.2
* Re: [integration v2 0/4] RISC-V/rvv: Update rvv from v01.0 to v1.0
2021-10-05 12:51 [integration v2 0/4] RISC-V/rvv: Update rvv from v01.0 to v1.0 Nelson Chu
` (3 preceding siblings ...)
2021-10-05 12:51 ` [integration v2 4/4] RISC-V/rvv: Added zve* and zvl* extensions, and clarify the imply rules Nelson Chu
@ 2021-10-25 6:04 ` Nelson Chu
4 siblings, 0 replies; 6+ messages in thread
From: Nelson Chu @ 2021-10-25 6:04 UTC (permalink / raw)
To: Binutils, Jim Wilson, Andrew Waterman
Committed these patches to the riscv integration branch. I am going to
move the whole rvv 1.0 support back to the master branch, since rvv
1.0 is frozen and at the public review stage.
Thanks
Nelson
On Tue, Oct 5, 2021 at 8:51 PM Nelson Chu <nelson.chu@sifive.com> wrote:
>
> Hi Guys,
>
> There are four patches that update rvv from v0.10 to v1.0:
> * [integration v2 1/4] RISC-V/rvv: Added assembly pseudo and changed assembler mnemonics.
> * [integration v2 2/4] RISC-V/rvv: Update constraints for widening and narrowing instructions.
> * [integration v2 3/4] RISC-V/rvv: Separate zvamo from v, and removed the zvlsseg extension name.
> * [integration v2 4/4] RISC-V/rvv: Added zve* and zvl* extensions, and clarify the imply rules.
>
> I am still sending the rvv patches to the integration branch, since that
> should be easier to review. After the review, I will squash the rvv patches
> into one and move it from the integration branch to mainline,
> since rvv v1.0 is frozen and at the public review stage now.
>
> Thanks
> Nelson
>
>