From: Richard Sandiford
To: richard.earnshaw@arm.com, binutils@sourceware.org
Cc: Richard Sandiford
Subject: [PATCH 1/2] aarch64: Restructure feature flag handling
Date: Mon, 25 Sep 2023 21:26:46 +0100
Message-Id: <20230925202647.2049600-2-richard.sandiford@arm.com>
In-Reply-To: <20230925202647.2049600-1-richard.sandiford@arm.com>
References: <20230925202647.2049600-1-richard.sandiford@arm.com>

The AArch64 feature-flag code is currently limited to a maximum of 64
features.  This patch reworks it so that the limit can be increased more
easily.  The basic idea is:

(1) Turn the AARCH64_FEATURE_FOO macros into an enum, with the enum
    counting bit positions.

(2) Make the feature-list macros take an array index argument
    (currently always 0).  The macros then return the
    aarch64_feature_set contents for that array index.  An N-element
    array would then be initialised as:

      { MACRO (0), ..., MACRO (N - 1) }

(3) Provide convenience macros for initialising an aarch64_feature_set
    for:

    - a single feature
    - a list of individual features
    - an architecture version
    - an architecture version + a list of additional features

(2) and (3) use the preprocessor to generate static initialisers.
The main restriction was that uses of the same preprocessor macro
cannot be nested.  So if a macro wants to do something for N individual
arguments, it needs to use a chain of N macros to do it.  There then
needs to be a way of deriving N, as a preprocessor token suitable for
pasting.

The easiest way of doing that was to precede each list of features by
the number of features in the list.  So an aarch64_feature_set
initialiser for three features A, B and C would be written:

  AARCH64_FEATURES (3, A, B, C)

This scheme makes it difficult to keep AARCH64_FEATURE_CRYPTO as a
synonym for SHA2+AES, so the patch expands the former to the latter.
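For illustration only: the following minimal sketch is not part of the
patch.  The names feature_bit, UINT64_BIT, FEATBIT, OR_FEATURES_N and
FEATURES are made-up stand-ins for the real AARCH64_* macros, and the
set is kept to a single 64-bit word.  It shows how the count-prefixed
macro chain described above expands:

  /* Self-contained example of the technique, with made-up names.  */
  #include <stdio.h>

  enum feature_bit { FEAT_A, FEAT_B, FEAT_C, FEAT_D };

  typedef unsigned long long feature_set;

  /* Mask for BIT within 64-bit word X of a (possibly multi-word) set.  */
  #define UINT64_BIT(X, BIT) ((X) == (BIT) / 64 ? 1ULL << (BIT) % 64 : 0)
  #define FEATBIT(X, NAME) UINT64_BIT (X, NAME)

  /* The same macro cannot be expanded recursively, so one helper per
     argument is chained together...  */
  #define OR_FEATURES_1(X, F1)      FEATBIT (X, F1)
  #define OR_FEATURES_2(X, F1, ...) \
    (FEATBIT (X, F1) | OR_FEATURES_1 (X, __VA_ARGS__))
  #define OR_FEATURES_3(X, F1, ...) \
    (FEATBIT (X, F1) | OR_FEATURES_2 (X, __VA_ARGS__))

  /* ...and the leading count N picks the right helper by token pasting,
     which is why each feature list is preceded by its length.  */
  #define FEATURES(N, ...) OR_FEATURES_##N (0, __VA_ARGS__)

  int
  main (void)
  {
    /* Expands to FEATBIT (0, FEAT_A) | FEATBIT (0, FEAT_B)
       | FEATBIT (0, FEAT_C), i.e. bits 0, 1 and 2.  */
    static const feature_set s = FEATURES (3, FEAT_A, FEAT_B, FEAT_C);
    printf ("%#llx\n", s);  /* prints 0x7 */
    return 0;
  }

If the feature set later grows into a two-element array, the same list
can initialise it as { OR_FEATURES_3 (0, ...), OR_FEATURES_3 (1, ...) }
without changing the users, which is the point of the index argument.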
--- gas/config/tc-aarch64.c | 462 ++++++++++++++------------------------- include/opcode/aarch64.h | 419 ++++++++++++++++++++++------------- opcodes/aarch64-dis.c | 7 +- opcodes/aarch64-opc.c | 54 +++-- opcodes/aarch64-tbl.h | 120 +++++----- 5 files changed, 516 insertions(+), 546 deletions(-) diff --git a/gas/config/tc-aarch64.c b/gas/config/tc-aarch64.c index 4458f6dd663..5f5ec1b3dbc 100644 --- a/gas/config/tc-aarch64.c +++ b/gas/config/tc-aarch64.c @@ -38,11 +38,6 @@ #include "dw2gencfi.h" #include "dwarf2dbg.h" -/* Types of processor to assemble for. */ -#ifndef CPU_DEFAULT -#define CPU_DEFAULT AARCH64_ARCH_V8A -#endif - #define streq(a, b) (strcmp (a, b) == 0) #define END_OF_INSN '\0' @@ -56,7 +51,7 @@ static const aarch64_feature_set *mcpu_cpu_opt = NULL; static const aarch64_feature_set *march_cpu_opt = NULL; /* Constants for known architecture features. */ -static const aarch64_feature_set cpu_default = CPU_DEFAULT; +static const aarch64_feature_set cpu_default = AARCH64_ARCH_FEATURES (V8A); /* Currently active instruction sequence. */ static aarch64_instr_sequence *insn_sequence = NULL; @@ -4894,7 +4889,8 @@ parse_sys_reg (char **str, htab_t sys_regs, "name '%s'"), buf); if (!pstatefield_p && !aarch64_sys_ins_reg_supported_p (cpu_variant, o->name, - o->value, o->flags, o->features)) + o->value, o->flags, + &o->features)) as_bad (_("selected processor does not support system register " "name '%s'"), buf); if (aarch64_sys_reg_deprecated_p (o->flags)) @@ -6527,11 +6523,10 @@ parse_operands (char *str, const aarch64_opcode *opcode) clear_error (); skip_whitespace (str); - if (AARCH64_CPU_HAS_FEATURE (*opcode->avariant, AARCH64_FEATURE_SME2)) + if (AARCH64_CPU_HAS_FEATURE (*opcode->avariant, SME2)) imm_reg_type = REG_TYPE_R_ZR_SP_BHSDQ_VZP_PN; - else if (AARCH64_CPU_HAS_ANY_FEATURES (*opcode->avariant, - AARCH64_FEATURE_SVE - | AARCH64_FEATURE_SVE2)) + else if (AARCH64_CPU_HAS_FEATURE (*opcode->avariant, SVE) + || AARCH64_CPU_HAS_FEATURE (*opcode->avariant, SVE2)) imm_reg_type = REG_TYPE_R_ZR_SP_BHSDQ_VZP; else imm_reg_type = REG_TYPE_R_ZR_BHSDQ_V; @@ -10149,170 +10144,80 @@ struct aarch64_cpu_option_table /* This list should, at a minimum, contain all the cpu names recognized by GCC. 
*/ static const struct aarch64_cpu_option_table aarch64_cpus[] = { - {"all", AARCH64_ANY, NULL}, - {"cortex-a34", AARCH64_FEATURE (AARCH64_ARCH_V8A, - AARCH64_FEATURE_CRC), "Cortex-A34"}, - {"cortex-a35", AARCH64_FEATURE (AARCH64_ARCH_V8A, - AARCH64_FEATURE_CRC), "Cortex-A35"}, - {"cortex-a53", AARCH64_FEATURE (AARCH64_ARCH_V8A, - AARCH64_FEATURE_CRC), "Cortex-A53"}, - {"cortex-a57", AARCH64_FEATURE (AARCH64_ARCH_V8A, - AARCH64_FEATURE_CRC), "Cortex-A57"}, - {"cortex-a72", AARCH64_FEATURE (AARCH64_ARCH_V8A, - AARCH64_FEATURE_CRC), "Cortex-A72"}, - {"cortex-a73", AARCH64_FEATURE (AARCH64_ARCH_V8A, - AARCH64_FEATURE_CRC), "Cortex-A73"}, - {"cortex-a55", AARCH64_FEATURE (AARCH64_ARCH_V8_2A, - AARCH64_FEATURE_RCPC | AARCH64_FEATURE_F16 | AARCH64_FEATURE_DOTPROD), + {"all", AARCH64_ALL_FEATURES, NULL}, + {"cortex-a34", AARCH64_CPU_FEATURES (V8A, 1, CRC), "Cortex-A34"}, + {"cortex-a35", AARCH64_CPU_FEATURES (V8A, 1, CRC), "Cortex-A35"}, + {"cortex-a53", AARCH64_CPU_FEATURES (V8A, 1, CRC), "Cortex-A53"}, + {"cortex-a57", AARCH64_CPU_FEATURES (V8A, 1, CRC), "Cortex-A57"}, + {"cortex-a72", AARCH64_CPU_FEATURES (V8A, 1, CRC), "Cortex-A72"}, + {"cortex-a73", AARCH64_CPU_FEATURES (V8A, 1, CRC), "Cortex-A73"}, + {"cortex-a55", AARCH64_CPU_FEATURES (V8_2A, 3, RCPC, F16, DOTPROD), "Cortex-A55"}, - {"cortex-a75", AARCH64_FEATURE (AARCH64_ARCH_V8_2A, - AARCH64_FEATURE_RCPC | AARCH64_FEATURE_F16 | AARCH64_FEATURE_DOTPROD), + {"cortex-a75", AARCH64_CPU_FEATURES (V8_2A, 3, RCPC, F16, DOTPROD), "Cortex-A75"}, - {"cortex-a76", AARCH64_FEATURE (AARCH64_ARCH_V8_2A, - AARCH64_FEATURE_RCPC | AARCH64_FEATURE_F16 | AARCH64_FEATURE_DOTPROD), + {"cortex-a76", AARCH64_CPU_FEATURES (V8_2A, 3, RCPC, F16, DOTPROD), "Cortex-A76"}, - {"cortex-a76ae", AARCH64_FEATURE (AARCH64_ARCH_V8_2A, - AARCH64_FEATURE_F16 | AARCH64_FEATURE_RCPC - | AARCH64_FEATURE_DOTPROD - | AARCH64_FEATURE_SSBS), - "Cortex-A76AE"}, - {"cortex-a77", AARCH64_FEATURE (AARCH64_ARCH_V8_2A, - AARCH64_FEATURE_F16 | AARCH64_FEATURE_RCPC - | AARCH64_FEATURE_DOTPROD - | AARCH64_FEATURE_SSBS), - "Cortex-A77"}, - {"cortex-a65", AARCH64_FEATURE (AARCH64_ARCH_V8_2A, - AARCH64_FEATURE_F16 | AARCH64_FEATURE_RCPC - | AARCH64_FEATURE_DOTPROD - | AARCH64_FEATURE_SSBS), - "Cortex-A65"}, - {"cortex-a65ae", AARCH64_FEATURE (AARCH64_ARCH_V8_2A, - AARCH64_FEATURE_F16 | AARCH64_FEATURE_RCPC - | AARCH64_FEATURE_DOTPROD - | AARCH64_FEATURE_SSBS), - "Cortex-A65AE"}, - {"cortex-a78", AARCH64_FEATURE (AARCH64_ARCH_V8_2A, - AARCH64_FEATURE_F16 - | AARCH64_FEATURE_RCPC - | AARCH64_FEATURE_DOTPROD - | AARCH64_FEATURE_SSBS - | AARCH64_FEATURE_PROFILE), - "Cortex-A78"}, - {"cortex-a78ae", AARCH64_FEATURE (AARCH64_ARCH_V8_2A, - AARCH64_FEATURE_F16 - | AARCH64_FEATURE_RCPC - | AARCH64_FEATURE_DOTPROD - | AARCH64_FEATURE_SSBS - | AARCH64_FEATURE_PROFILE), - "Cortex-A78AE"}, - {"cortex-a78c", AARCH64_FEATURE (AARCH64_ARCH_V8_2A, - AARCH64_FEATURE_DOTPROD - | AARCH64_FEATURE_F16 - | AARCH64_FEATURE_FLAGM - | AARCH64_FEATURE_PAC - | AARCH64_FEATURE_PROFILE - | AARCH64_FEATURE_RCPC - | AARCH64_FEATURE_SSBS), - "Cortex-A78C"}, - {"cortex-a510", AARCH64_FEATURE (AARCH64_ARCH_V9A, - AARCH64_FEATURE_BFLOAT16 - | AARCH64_FEATURE_I8MM - | AARCH64_FEATURE_MEMTAG - | AARCH64_FEATURE_SVE2_BITPERM), - "Cortex-A510"}, - {"cortex-a520", AARCH64_FEATURE (AARCH64_ARCH_V9_2A, - AARCH64_FEATURE_MEMTAG - | AARCH64_FEATURE_SVE2_BITPERM), - "Cortex-A520"}, - {"cortex-a710", AARCH64_FEATURE (AARCH64_ARCH_V9A, - AARCH64_FEATURE_BFLOAT16 - | AARCH64_FEATURE_I8MM - | AARCH64_FEATURE_MEMTAG - | 
AARCH64_FEATURE_SVE2_BITPERM), - "Cortex-A710"}, - {"cortex-a720", AARCH64_FEATURE (AARCH64_ARCH_V9_2A, - AARCH64_FEATURE_MEMTAG - | AARCH64_FEATURE_PROFILE - | AARCH64_FEATURE_SVE2_BITPERM), - "Cortex-A720"}, - {"ares", AARCH64_FEATURE (AARCH64_ARCH_V8_2A, - AARCH64_FEATURE_RCPC | AARCH64_FEATURE_F16 - | AARCH64_FEATURE_DOTPROD - | AARCH64_FEATURE_PROFILE), - "Ares"}, - {"exynos-m1", AARCH64_FEATURE (AARCH64_ARCH_V8A, - AARCH64_FEATURE_CRC | AARCH64_FEATURE_CRYPTO), - "Samsung Exynos M1"}, - {"falkor", AARCH64_FEATURE (AARCH64_ARCH_V8A, - AARCH64_FEATURE_CRC | AARCH64_FEATURE_CRYPTO - | AARCH64_FEATURE_RDMA), - "Qualcomm Falkor"}, - {"neoverse-e1", AARCH64_FEATURE (AARCH64_ARCH_V8_2A, - AARCH64_FEATURE_RCPC | AARCH64_FEATURE_F16 - | AARCH64_FEATURE_DOTPROD - | AARCH64_FEATURE_SSBS), - "Neoverse E1"}, - {"neoverse-n1", AARCH64_FEATURE (AARCH64_ARCH_V8_2A, - AARCH64_FEATURE_RCPC | AARCH64_FEATURE_F16 - | AARCH64_FEATURE_DOTPROD - | AARCH64_FEATURE_PROFILE), - "Neoverse N1"}, - {"neoverse-n2", AARCH64_FEATURE (AARCH64_ARCH_V8_5A, - AARCH64_FEATURE_BFLOAT16 - | AARCH64_FEATURE_I8MM - | AARCH64_FEATURE_F16 - | AARCH64_FEATURE_SVE - | AARCH64_FEATURE_SVE2 - | AARCH64_FEATURE_SVE2_BITPERM - | AARCH64_FEATURE_MEMTAG - | AARCH64_FEATURE_RNG), - "Neoverse N2"}, - {"neoverse-v1", AARCH64_FEATURE (AARCH64_ARCH_V8_4A, - AARCH64_FEATURE_PROFILE - | AARCH64_FEATURE_CVADP - | AARCH64_FEATURE_SVE - | AARCH64_FEATURE_SSBS - | AARCH64_FEATURE_RNG - | AARCH64_FEATURE_F16 - | AARCH64_FEATURE_BFLOAT16 - | AARCH64_FEATURE_I8MM), "Neoverse V1"}, - {"qdf24xx", AARCH64_FEATURE (AARCH64_ARCH_V8A, - AARCH64_FEATURE_CRC | AARCH64_FEATURE_CRYPTO - | AARCH64_FEATURE_RDMA), - "Qualcomm QDF24XX"}, - {"saphira", AARCH64_FEATURE (AARCH64_ARCH_V8_4A, - AARCH64_FEATURE_CRYPTO | AARCH64_FEATURE_PROFILE), - "Qualcomm Saphira"}, - {"thunderx", AARCH64_FEATURE (AARCH64_ARCH_V8A, - AARCH64_FEATURE_CRC | AARCH64_FEATURE_CRYPTO), - "Cavium ThunderX"}, - {"vulcan", AARCH64_FEATURE (AARCH64_ARCH_V8_1A, - AARCH64_FEATURE_CRYPTO), - "Broadcom Vulcan"}, + {"cortex-a76ae", AARCH64_CPU_FEATURES (V8_2A, 4, F16, RCPC, DOTPROD, + SSBS), "Cortex-A76AE"}, + {"cortex-a77", AARCH64_CPU_FEATURES (V8_2A, 4, F16, RCPC, DOTPROD, + SSBS), "Cortex-A77"}, + {"cortex-a65", AARCH64_CPU_FEATURES (V8_2A, 4, F16, RCPC, DOTPROD, + SSBS), "Cortex-A65"}, + {"cortex-a65ae", AARCH64_CPU_FEATURES (V8_2A, 4, F16, RCPC, DOTPROD, + SSBS), "Cortex-A65AE"}, + {"cortex-a78", AARCH64_CPU_FEATURES (V8_2A, 5, F16, RCPC, DOTPROD, + SSBS, PROFILE), "Cortex-A78"}, + {"cortex-a78ae", AARCH64_CPU_FEATURES (V8_2A, 5, F16, RCPC, DOTPROD, + SSBS, PROFILE), "Cortex-A78AE"}, + {"cortex-a78c", AARCH64_CPU_FEATURES (V8_2A, 7, DOTPROD, F16, FLAGM, + PAC, PROFILE, RCPC, SSBS), + "Cortex-A78C"}, + {"cortex-a510", AARCH64_CPU_FEATURES (V9A, 4, BFLOAT16, I8MM, MEMTAG, + SVE2_BITPERM), "Cortex-A510"}, + {"cortex-a520", AARCH64_CPU_FEATURES (V9_2A, 2, MEMTAG, SVE2_BITPERM), + "Cortex-A520"}, + {"cortex-a710", AARCH64_CPU_FEATURES (V9A, 4, BFLOAT16, I8MM, MEMTAG, + SVE2_BITPERM), "Cortex-A710"}, + {"cortex-a720", AARCH64_CPU_FEATURES (V9_2A, 3, MEMTAG, PROFILE, + SVE2_BITPERM), "Cortex-A720"}, + {"ares", AARCH64_CPU_FEATURES (V8_2A, 4, RCPC, F16, DOTPROD, + PROFILE), "Ares"}, + {"exynos-m1", AARCH64_CPU_FEATURES (V8A, 3, CRC, SHA2, AES), + "Samsung Exynos M1"}, + {"falkor", AARCH64_CPU_FEATURES (V8A, 4, CRC, SHA2, AES, RDMA), + "Qualcomm Falkor"}, + {"neoverse-e1", AARCH64_CPU_FEATURES (V8_2A, 4, RCPC, F16, DOTPROD, + SSBS), "Neoverse E1"}, + {"neoverse-n1", AARCH64_CPU_FEATURES (V8_2A, 4, 
RCPC, F16, DOTPROD, + PROFILE), "Neoverse N1"}, + {"neoverse-n2", AARCH64_CPU_FEATURES (V8_5A, 8, BFLOAT16, I8MM, F16, + SVE, SVE2, SVE2_BITPERM, MEMTAG, + RNG), "Neoverse N2"}, + {"neoverse-v1", AARCH64_CPU_FEATURES (V8_4A, 8, PROFILE, CVADP, SVE, + SSBS, RNG, F16, BFLOAT16, I8MM), + "Neoverse V1"}, + {"qdf24xx", AARCH64_CPU_FEATURES (V8A, 4, CRC, SHA2, AES, RDMA), + "Qualcomm QDF24XX"}, + {"saphira", AARCH64_CPU_FEATURES (V8_4A, 3, SHA2, AES, PROFILE), + "Qualcomm Saphira"}, + {"thunderx", AARCH64_CPU_FEATURES (V8A, 3, CRC, SHA2, AES), + "Cavium ThunderX"}, + {"vulcan", AARCH64_CPU_FEATURES (V8_1A, 2, SHA2, AES), + "Broadcom Vulcan"}, /* The 'xgene-1' name is an older name for 'xgene1', which was used in earlier releases and is superseded by 'xgene1' in all tools. */ - {"xgene-1", AARCH64_ARCH_V8A, "APM X-Gene 1"}, - {"xgene1", AARCH64_ARCH_V8A, "APM X-Gene 1"}, - {"xgene2", AARCH64_FEATURE (AARCH64_ARCH_V8A, - AARCH64_FEATURE_CRC), "APM X-Gene 2"}, - {"cortex-r82", AARCH64_ARCH_V8R, "Cortex-R82"}, - {"cortex-x1", AARCH64_FEATURE (AARCH64_ARCH_V8_2A, - AARCH64_FEATURE_F16 - | AARCH64_FEATURE_RCPC - | AARCH64_FEATURE_DOTPROD - | AARCH64_FEATURE_SSBS - | AARCH64_FEATURE_PROFILE), - "Cortex-X1"}, - {"cortex-x2", AARCH64_FEATURE (AARCH64_ARCH_V9A, - AARCH64_FEATURE_BFLOAT16 - | AARCH64_FEATURE_I8MM - | AARCH64_FEATURE_MEMTAG - | AARCH64_FEATURE_SVE2_BITPERM), - "Cortex-X2"}, - {"generic", AARCH64_ARCH_V8A, NULL}, - - {NULL, AARCH64_ARCH_NONE, NULL} + {"xgene-1", AARCH64_ARCH_FEATURES (V8A), "APM X-Gene 1"}, + {"xgene1", AARCH64_ARCH_FEATURES (V8A), "APM X-Gene 1"}, + {"xgene2", AARCH64_CPU_FEATURES (V8A, 1, CRC), "APM X-Gene 2"}, + {"cortex-r82", AARCH64_ARCH_FEATURES (V8R), "Cortex-R82"}, + {"cortex-x1", AARCH64_CPU_FEATURES (V8_2A, 5, F16, RCPC, DOTPROD, + SSBS, PROFILE), "Cortex-X1"}, + {"cortex-x2", AARCH64_CPU_FEATURES (V9A, 4, BFLOAT16, I8MM, MEMTAG, + SVE2_BITPERM), "Cortex-X2"}, + {"generic", AARCH64_ARCH_FEATURES (V8A), NULL}, + + {NULL, AARCH64_NO_FEATURES, NULL} }; struct aarch64_arch_option_table @@ -10324,22 +10229,22 @@ struct aarch64_arch_option_table /* This list should, at a minimum, contain all the architecture names recognized by GCC. 
*/ static const struct aarch64_arch_option_table aarch64_archs[] = { - {"all", AARCH64_ANY}, - {"armv8-a", AARCH64_ARCH_V8A}, - {"armv8.1-a", AARCH64_ARCH_V8_1A}, - {"armv8.2-a", AARCH64_ARCH_V8_2A}, - {"armv8.3-a", AARCH64_ARCH_V8_3A}, - {"armv8.4-a", AARCH64_ARCH_V8_4A}, - {"armv8.5-a", AARCH64_ARCH_V8_5A}, - {"armv8.6-a", AARCH64_ARCH_V8_6A}, - {"armv8.7-a", AARCH64_ARCH_V8_7A}, - {"armv8.8-a", AARCH64_ARCH_V8_8A}, - {"armv8-r", AARCH64_ARCH_V8R}, - {"armv9-a", AARCH64_ARCH_V9A}, - {"armv9.1-a", AARCH64_ARCH_V9_1A}, - {"armv9.2-a", AARCH64_ARCH_V9_2A}, - {"armv9.3-a", AARCH64_ARCH_V9_3A}, - {NULL, AARCH64_ARCH_NONE} + {"all", AARCH64_ALL_FEATURES}, + {"armv8-a", AARCH64_ARCH_FEATURES (V8A)}, + {"armv8.1-a", AARCH64_ARCH_FEATURES (V8_1A)}, + {"armv8.2-a", AARCH64_ARCH_FEATURES (V8_2A)}, + {"armv8.3-a", AARCH64_ARCH_FEATURES (V8_3A)}, + {"armv8.4-a", AARCH64_ARCH_FEATURES (V8_4A)}, + {"armv8.5-a", AARCH64_ARCH_FEATURES (V8_5A)}, + {"armv8.6-a", AARCH64_ARCH_FEATURES (V8_6A)}, + {"armv8.7-a", AARCH64_ARCH_FEATURES (V8_7A)}, + {"armv8.8-a", AARCH64_ARCH_FEATURES (V8_8A)}, + {"armv8-r", AARCH64_ARCH_FEATURES (V8R)}, + {"armv9-a", AARCH64_ARCH_FEATURES (V9A)}, + {"armv9.1-a", AARCH64_ARCH_FEATURES (V9_1A)}, + {"armv9.2-a", AARCH64_ARCH_FEATURES (V9_2A)}, + {"armv9.3-a", AARCH64_ARCH_FEATURES (V9_3A)}, + {NULL, AARCH64_NO_FEATURES} }; /* ISA extensions. */ @@ -10351,106 +10256,61 @@ struct aarch64_option_cpu_value_table }; static const struct aarch64_option_cpu_value_table aarch64_features[] = { - {"crc", AARCH64_FEATURE (AARCH64_FEATURE_CRC, 0), - AARCH64_ARCH_NONE}, - {"crypto", AARCH64_FEATURE (AARCH64_FEATURE_CRYPTO, 0), - AARCH64_FEATURE (AARCH64_FEATURE_SIMD, 0)}, - {"fp", AARCH64_FEATURE (AARCH64_FEATURE_FP, 0), - AARCH64_ARCH_NONE}, - {"lse", AARCH64_FEATURE (AARCH64_FEATURE_LSE, 0), - AARCH64_ARCH_NONE}, - {"simd", AARCH64_FEATURE (AARCH64_FEATURE_SIMD, 0), - AARCH64_FEATURE (AARCH64_FEATURE_FP, 0)}, - {"pan", AARCH64_FEATURE (AARCH64_FEATURE_PAN, 0), - AARCH64_ARCH_NONE}, - {"lor", AARCH64_FEATURE (AARCH64_FEATURE_LOR, 0), - AARCH64_ARCH_NONE}, - {"ras", AARCH64_FEATURE (AARCH64_FEATURE_RAS, 0), - AARCH64_ARCH_NONE}, - {"rdma", AARCH64_FEATURE (AARCH64_FEATURE_RDMA, 0), - AARCH64_FEATURE (AARCH64_FEATURE_SIMD, 0)}, - {"fp16", AARCH64_FEATURE (AARCH64_FEATURE_F16, 0), - AARCH64_FEATURE (AARCH64_FEATURE_FP, 0)}, - {"fp16fml", AARCH64_FEATURE (AARCH64_FEATURE_F16_FML, 0), - AARCH64_FEATURE (AARCH64_FEATURE_F16, 0)}, - {"profile", AARCH64_FEATURE (AARCH64_FEATURE_PROFILE, 0), - AARCH64_ARCH_NONE}, - {"sve", AARCH64_FEATURE (AARCH64_FEATURE_SVE, 0), - AARCH64_FEATURE (AARCH64_FEATURE_COMPNUM, 0)}, - {"tme", AARCH64_FEATURE (AARCH64_FEATURE_TME, 0), - AARCH64_ARCH_NONE}, - {"compnum", AARCH64_FEATURE (AARCH64_FEATURE_COMPNUM, 0), - AARCH64_FEATURE (AARCH64_FEATURE_F16 - | AARCH64_FEATURE_SIMD, 0)}, - {"rcpc", AARCH64_FEATURE (AARCH64_FEATURE_RCPC, 0), - AARCH64_ARCH_NONE}, - {"dotprod", AARCH64_FEATURE (AARCH64_FEATURE_DOTPROD, 0), - AARCH64_FEATURE (AARCH64_FEATURE_SIMD, 0)}, - {"sha2", AARCH64_FEATURE (AARCH64_FEATURE_SHA2, 0), - AARCH64_FEATURE (AARCH64_FEATURE_FP, 0)}, - {"sb", AARCH64_FEATURE (AARCH64_FEATURE_SB, 0), - AARCH64_ARCH_NONE}, - {"predres", AARCH64_FEATURE (AARCH64_FEATURE_PREDRES, 0), - AARCH64_ARCH_NONE}, - {"aes", AARCH64_FEATURE (AARCH64_FEATURE_AES, 0), - AARCH64_FEATURE (AARCH64_FEATURE_SIMD, 0)}, - {"sm4", AARCH64_FEATURE (AARCH64_FEATURE_SM4, 0), - AARCH64_FEATURE (AARCH64_FEATURE_SIMD, 0)}, - {"sha3", AARCH64_FEATURE (AARCH64_FEATURE_SHA3, 0), - AARCH64_FEATURE 
(AARCH64_FEATURE_SHA2, 0)}, - {"rng", AARCH64_FEATURE (AARCH64_FEATURE_RNG, 0), - AARCH64_ARCH_NONE}, - {"ssbs", AARCH64_FEATURE (AARCH64_FEATURE_SSBS, 0), - AARCH64_ARCH_NONE}, - {"memtag", AARCH64_FEATURE (AARCH64_FEATURE_MEMTAG, 0), - AARCH64_ARCH_NONE}, - {"sve2", AARCH64_FEATURE (AARCH64_FEATURE_SVE2, 0), - AARCH64_FEATURE (AARCH64_FEATURE_SVE, 0)}, - {"sve2-sm4", AARCH64_FEATURE (AARCH64_FEATURE_SVE2_SM4, 0), - AARCH64_FEATURE (AARCH64_FEATURE_SVE2 - | AARCH64_FEATURE_SM4, 0)}, - {"sve2-aes", AARCH64_FEATURE (AARCH64_FEATURE_SVE2_AES, 0), - AARCH64_FEATURE (AARCH64_FEATURE_SVE2 - | AARCH64_FEATURE_AES, 0)}, - {"sve2-sha3", AARCH64_FEATURE (AARCH64_FEATURE_SVE2_SHA3, 0), - AARCH64_FEATURE (AARCH64_FEATURE_SVE2 - | AARCH64_FEATURE_SHA3, 0)}, - {"sve2-bitperm", AARCH64_FEATURE (AARCH64_FEATURE_SVE2_BITPERM, 0), - AARCH64_FEATURE (AARCH64_FEATURE_SVE2, 0)}, - {"sme", AARCH64_FEATURE (AARCH64_FEATURE_SME, 0), - AARCH64_FEATURE (AARCH64_FEATURE_SVE2 - | AARCH64_FEATURE_BFLOAT16, 0)}, - {"sme-f64", AARCH64_FEATURE (AARCH64_FEATURE_SME_F64F64, 0), - AARCH64_FEATURE (AARCH64_FEATURE_SME, 0)}, - {"sme-f64f64", AARCH64_FEATURE (AARCH64_FEATURE_SME_F64F64, 0), - AARCH64_FEATURE (AARCH64_FEATURE_SME, 0)}, - {"sme-i64", AARCH64_FEATURE (AARCH64_FEATURE_SME_I16I64, 0), - AARCH64_FEATURE (AARCH64_FEATURE_SME, 0)}, - {"sme-i16i64", AARCH64_FEATURE (AARCH64_FEATURE_SME_I16I64, 0), - AARCH64_FEATURE (AARCH64_FEATURE_SME, 0)}, - {"sme2", AARCH64_FEATURE (AARCH64_FEATURE_SME2, 0), - AARCH64_FEATURE (AARCH64_FEATURE_SME, 0)}, - {"bf16", AARCH64_FEATURE (AARCH64_FEATURE_BFLOAT16, 0), - AARCH64_FEATURE (AARCH64_FEATURE_FP, 0)}, - {"i8mm", AARCH64_FEATURE (AARCH64_FEATURE_I8MM, 0), - AARCH64_FEATURE (AARCH64_FEATURE_SIMD, 0)}, - {"f32mm", AARCH64_FEATURE (AARCH64_FEATURE_F32MM, 0), - AARCH64_FEATURE (AARCH64_FEATURE_SVE, 0)}, - {"f64mm", AARCH64_FEATURE (AARCH64_FEATURE_F64MM, 0), - AARCH64_FEATURE (AARCH64_FEATURE_SVE, 0)}, - {"ls64", AARCH64_FEATURE (AARCH64_FEATURE_LS64, 0), - AARCH64_ARCH_NONE}, - {"flagm", AARCH64_FEATURE (AARCH64_FEATURE_FLAGM, 0), - AARCH64_ARCH_NONE}, - {"pauth", AARCH64_FEATURE (AARCH64_FEATURE_PAC, 0), - AARCH64_ARCH_NONE}, - {"mops", AARCH64_FEATURE (AARCH64_FEATURE_MOPS, 0), - AARCH64_ARCH_NONE}, - {"hbc", AARCH64_FEATURE (AARCH64_FEATURE_HBC, 0), - AARCH64_ARCH_NONE}, - {"cssc", AARCH64_FEATURE (AARCH64_FEATURE_CSSC, 0), - AARCH64_ARCH_NONE}, - {NULL, AARCH64_ARCH_NONE, AARCH64_ARCH_NONE}, + {"crc", AARCH64_FEATURE (CRC), AARCH64_NO_FEATURES}, + {"crypto", AARCH64_FEATURES (2, AES, SHA2), + AARCH64_FEATURE (SIMD)}, + {"fp", AARCH64_FEATURE (FP), AARCH64_NO_FEATURES}, + {"lse", AARCH64_FEATURE (LSE), AARCH64_NO_FEATURES}, + {"simd", AARCH64_FEATURE (SIMD), AARCH64_FEATURE (FP)}, + {"pan", AARCH64_FEATURE (PAN), AARCH64_NO_FEATURES}, + {"lor", AARCH64_FEATURE (LOR), AARCH64_NO_FEATURES}, + {"ras", AARCH64_FEATURE (RAS), AARCH64_NO_FEATURES}, + {"rdma", AARCH64_FEATURE (RDMA), AARCH64_FEATURE (SIMD)}, + {"fp16", AARCH64_FEATURE (F16), AARCH64_FEATURE (FP)}, + {"fp16fml", AARCH64_FEATURE (F16_FML), AARCH64_FEATURE (F16)}, + {"profile", AARCH64_FEATURE (PROFILE), AARCH64_NO_FEATURES}, + {"sve", AARCH64_FEATURE (SVE), AARCH64_FEATURE (COMPNUM)}, + {"tme", AARCH64_FEATURE (TME), AARCH64_NO_FEATURES}, + {"compnum", AARCH64_FEATURE (COMPNUM), + AARCH64_FEATURES (2, F16, SIMD)}, + {"rcpc", AARCH64_FEATURE (RCPC), AARCH64_NO_FEATURES}, + {"dotprod", AARCH64_FEATURE (DOTPROD), AARCH64_FEATURE (SIMD)}, + {"sha2", AARCH64_FEATURE (SHA2), AARCH64_FEATURE (FP)}, + {"sb", AARCH64_FEATURE (SB), 
AARCH64_NO_FEATURES}, + {"predres", AARCH64_FEATURE (PREDRES), AARCH64_NO_FEATURES}, + {"aes", AARCH64_FEATURE (AES), AARCH64_FEATURE (SIMD)}, + {"sm4", AARCH64_FEATURE (SM4), AARCH64_FEATURE (SIMD)}, + {"sha3", AARCH64_FEATURE (SHA3), AARCH64_FEATURE (SHA2)}, + {"rng", AARCH64_FEATURE (RNG), AARCH64_NO_FEATURES}, + {"ssbs", AARCH64_FEATURE (SSBS), AARCH64_NO_FEATURES}, + {"memtag", AARCH64_FEATURE (MEMTAG), AARCH64_NO_FEATURES}, + {"sve2", AARCH64_FEATURE (SVE2), AARCH64_FEATURE (SVE)}, + {"sve2-sm4", AARCH64_FEATURE (SVE2_SM4), + AARCH64_FEATURES (2, SVE2, SM4)}, + {"sve2-aes", AARCH64_FEATURE (SVE2_AES), + AARCH64_FEATURES (2, SVE2, AES)}, + {"sve2-sha3", AARCH64_FEATURE (SVE2_SHA3), + AARCH64_FEATURES (2, SVE2, SHA3)}, + {"sve2-bitperm", AARCH64_FEATURE (SVE2_BITPERM), + AARCH64_FEATURE (SVE2)}, + {"sme", AARCH64_FEATURE (SME), + AARCH64_FEATURES (2, SVE2, BFLOAT16)}, + {"sme-f64", AARCH64_FEATURE (SME_F64F64), AARCH64_FEATURE (SME)}, + {"sme-f64f64", AARCH64_FEATURE (SME_F64F64), AARCH64_FEATURE (SME)}, + {"sme-i64", AARCH64_FEATURE (SME_I16I64), AARCH64_FEATURE (SME)}, + {"sme-i16i64", AARCH64_FEATURE (SME_I16I64), AARCH64_FEATURE (SME)}, + {"sme2", AARCH64_FEATURE (SME2), AARCH64_FEATURE (SME)}, + {"bf16", AARCH64_FEATURE (BFLOAT16), AARCH64_FEATURE (FP)}, + {"i8mm", AARCH64_FEATURE (I8MM), AARCH64_FEATURE (SIMD)}, + {"f32mm", AARCH64_FEATURE (F32MM), AARCH64_FEATURE (SVE)}, + {"f64mm", AARCH64_FEATURE (F64MM), AARCH64_FEATURE (SVE)}, + {"ls64", AARCH64_FEATURE (LS64), AARCH64_NO_FEATURES}, + {"flagm", AARCH64_FEATURE (FLAGM), AARCH64_NO_FEATURES}, + {"pauth", AARCH64_FEATURE (PAC), AARCH64_NO_FEATURES}, + {"mops", AARCH64_FEATURE (MOPS), AARCH64_NO_FEATURES}, + {"hbc", AARCH64_FEATURE (HBC), AARCH64_NO_FEATURES}, + {"cssc", AARCH64_FEATURE (CSSC), AARCH64_NO_FEATURES}, + {NULL, AARCH64_NO_FEATURES, AARCH64_NO_FEATURES}, }; struct aarch64_long_option_table @@ -10466,14 +10326,15 @@ static aarch64_feature_set aarch64_feature_disable_set (aarch64_feature_set set) { const struct aarch64_option_cpu_value_table *opt; - aarch64_feature_set prev = 0; + aarch64_feature_set prev = AARCH64_NO_FEATURES; - while (prev != set) { - prev = set; - for (opt = aarch64_features; opt->name != NULL; opt++) - if (AARCH64_CPU_HAS_ANY_FEATURES (opt->require, set)) - AARCH64_MERGE_FEATURE_SETS (set, set, opt->value); - } + while (!AARCH64_CPU_HAS_ALL_FEATURES (prev, set)) + { + prev = set; + for (opt = aarch64_features; opt->name != NULL; opt++) + if (AARCH64_CPU_HAS_ANY_FEATURES (opt->require, set)) + AARCH64_MERGE_FEATURE_SETS (set, set, opt->value); + } return set; } @@ -10482,14 +10343,15 @@ static aarch64_feature_set aarch64_feature_enable_set (aarch64_feature_set set) { const struct aarch64_option_cpu_value_table *opt; - aarch64_feature_set prev = 0; + aarch64_feature_set prev = AARCH64_NO_FEATURES; - while (prev != set) { - prev = set; - for (opt = aarch64_features; opt->name != NULL; opt++) - if (AARCH64_CPU_HAS_FEATURE (set, opt->value)) - AARCH64_MERGE_FEATURE_SETS (set, set, opt->require); - } + while (!AARCH64_CPU_HAS_ALL_FEATURES (prev, set)) + { + prev = set; + for (opt = aarch64_features; opt->name != NULL; opt++) + if (AARCH64_CPU_HAS_ALL_FEATURES (set, opt->value)) + AARCH64_MERGE_FEATURE_SETS (set, set, opt->require); + } return set; } @@ -10571,7 +10433,7 @@ aarch64_parse_features (const char *str, const aarch64_feature_set **opt_p, else { set = aarch64_feature_disable_set (opt->value); - AARCH64_CLEAR_FEATURE (*ext_set, *ext_set, set); + AARCH64_CLEAR_FEATURES (*ext_set, *ext_set, set); 
} break; } diff --git a/include/opcode/aarch64.h b/include/opcode/aarch64.h index 381934ab39b..54ac1d85fd4 100644 --- a/include/opcode/aarch64.h +++ b/include/opcode/aarch64.h @@ -38,165 +38,241 @@ extern "C" { typedef uint32_t aarch64_insn; -/* The following bitmasks control CPU features. */ -#define AARCH64_FEATURE_V8 (1ULL << 0) /* All processors. */ -#define AARCH64_FEATURE_V8_6A (1ULL << 1) /* ARMv8.6 processors. */ -#define AARCH64_FEATURE_BFLOAT16 (1ULL << 2) /* Bfloat16 insns. */ -#define AARCH64_FEATURE_V8A (1ULL << 3) /* Armv8-A processors. */ -#define AARCH64_FEATURE_SVE2 (1ULL << 4) /* SVE2 instructions. */ -#define AARCH64_FEATURE_V8_2A (1ULL << 5) /* ARMv8.2 processors. */ -#define AARCH64_FEATURE_V8_3A (1ULL << 6) /* ARMv8.3 processors. */ -#define AARCH64_FEATURE_SVE2_AES (1ULL << 7) -#define AARCH64_FEATURE_SVE2_BITPERM (1ULL << 8) -#define AARCH64_FEATURE_SVE2_SM4 (1ULL << 9) -#define AARCH64_FEATURE_SVE2_SHA3 (1ULL << 10) -#define AARCH64_FEATURE_V8_4A (1ULL << 11) /* ARMv8.4 processors. */ -#define AARCH64_FEATURE_V8R (1ULL << 12) /* Armv8-R processors. */ -#define AARCH64_FEATURE_V8_7A (1ULL << 13) /* Armv8.7 processors. */ -#define AARCH64_FEATURE_SME (1ULL << 14) /* Scalable Matrix Extension. */ -#define AARCH64_FEATURE_LS64 (1ULL << 15) /* Atomic 64-byte load/store. */ -#define AARCH64_FEATURE_PAC (1ULL << 16) /* v8.3 Pointer Authentication. */ -#define AARCH64_FEATURE_FP (1ULL << 17) /* FP instructions. */ -#define AARCH64_FEATURE_SIMD (1ULL << 18) /* SIMD instructions. */ -#define AARCH64_FEATURE_CRC (1ULL << 19) /* CRC instructions. */ -#define AARCH64_FEATURE_LSE (1ULL << 20) /* LSE instructions. */ -#define AARCH64_FEATURE_PAN (1ULL << 21) /* PAN instructions. */ -#define AARCH64_FEATURE_LOR (1ULL << 22) /* LOR instructions. */ -#define AARCH64_FEATURE_RDMA (1ULL << 23) /* v8.1 SIMD instructions. */ -#define AARCH64_FEATURE_V8_1A (1ULL << 24) /* v8.1 features. */ -#define AARCH64_FEATURE_F16 (1ULL << 25) /* v8.2 FP16 instructions. */ -#define AARCH64_FEATURE_RAS (1ULL << 26) /* RAS Extensions. */ -#define AARCH64_FEATURE_PROFILE (1ULL << 27) /* Statistical Profiling. */ -#define AARCH64_FEATURE_SVE (1ULL << 28) /* SVE instructions. */ -#define AARCH64_FEATURE_RCPC (1ULL << 29) /* RCPC instructions. */ -#define AARCH64_FEATURE_COMPNUM (1ULL << 30) /* Complex # instructions. */ -#define AARCH64_FEATURE_DOTPROD (1ULL << 31) /* Dot Product instructions. */ -#define AARCH64_FEATURE_SM4 (1ULL << 32) /* SM3 & SM4 instructions. */ -#define AARCH64_FEATURE_SHA2 (1ULL << 33) /* SHA2 instructions. */ -#define AARCH64_FEATURE_SHA3 (1ULL << 34) /* SHA3 instructions. */ -#define AARCH64_FEATURE_AES (1ULL << 35) /* AES instructions. */ -#define AARCH64_FEATURE_F16_FML (1ULL << 36) /* v8.2 FP16FML ins. */ -#define AARCH64_FEATURE_V8_5A (1ULL << 37) /* ARMv8.5 processors. */ -#define AARCH64_FEATURE_FLAGMANIP (1ULL << 38) /* v8.5 Flag Manipulation version 2. */ -#define AARCH64_FEATURE_FRINTTS (1ULL << 39) /* FRINT[32,64][Z,X] insns. */ -#define AARCH64_FEATURE_SB (1ULL << 40) /* SB instruction. */ -#define AARCH64_FEATURE_PREDRES (1ULL << 41) /* Execution and Data Prediction Restriction instructions. */ -#define AARCH64_FEATURE_CVADP (1ULL << 42) /* DC CVADP. */ -#define AARCH64_FEATURE_RNG (1ULL << 43) /* Random Number instructions. */ -#define AARCH64_FEATURE_BTI (1ULL << 44) /* BTI instructions. */ -#define AARCH64_FEATURE_SCXTNUM (1ULL << 45) /* SCXTNUM_ELx. */ -#define AARCH64_FEATURE_ID_PFR2 (1ULL << 46) /* ID_PFR2 instructions. 
*/ -#define AARCH64_FEATURE_SSBS (1ULL << 47) /* SSBS mechanism enabled. */ -#define AARCH64_FEATURE_MEMTAG (1ULL << 48) /* Memory Tagging Extension. */ -#define AARCH64_FEATURE_TME (1ULL << 49) /* Transactional Memory Extension. */ -#define AARCH64_FEATURE_MOPS (1ULL << 50) /* Standardization of memory operations. */ -#define AARCH64_FEATURE_HBC (1ULL << 51) /* Hinted conditional branches. */ -#define AARCH64_FEATURE_I8MM (1ULL << 52) /* Matrix Multiply instructions. */ -#define AARCH64_FEATURE_F32MM (1ULL << 53) -#define AARCH64_FEATURE_F64MM (1ULL << 54) -#define AARCH64_FEATURE_FLAGM (1ULL << 55) /* v8.4 Flag Manipulation. */ -#define AARCH64_FEATURE_V9A (1ULL << 56) /* Armv9.0-A processors. */ -#define AARCH64_FEATURE_SME_F64F64 (1ULL << 57) /* SME F64F64. */ -#define AARCH64_FEATURE_SME_I16I64 (1ULL << 58) /* SME I16I64. */ -#define AARCH64_FEATURE_V8_8A (1ULL << 59) /* Armv8.8 processors. */ -#define AARCH64_FEATURE_CSSC (1ULL << 60) /* Common Short Sequence Compression instructions. */ -#define AARCH64_FEATURE_SME2 (1ULL << 61) /* SME2. */ - -/* Crypto instructions are the combination of AES and SHA2. */ -#define AARCH64_FEATURE_CRYPTO (AARCH64_FEATURE_SHA2 | AARCH64_FEATURE_AES) - -#define AARCH64_ARCH_V8A_FEATURES (AARCH64_FEATURE_V8A \ - | AARCH64_FEATURE_FP \ - | AARCH64_FEATURE_RAS \ - | AARCH64_FEATURE_SIMD) -#define AARCH64_ARCH_V8_1A_FEATURES (AARCH64_FEATURE_V8_1A \ - | AARCH64_FEATURE_CRC \ - | AARCH64_FEATURE_LSE \ - | AARCH64_FEATURE_PAN \ - | AARCH64_FEATURE_LOR \ - | AARCH64_FEATURE_RDMA) -#define AARCH64_ARCH_V8_2A_FEATURES (AARCH64_FEATURE_V8_2A) -#define AARCH64_ARCH_V8_3A_FEATURES (AARCH64_FEATURE_V8_3A \ - | AARCH64_FEATURE_PAC \ - | AARCH64_FEATURE_RCPC \ - | AARCH64_FEATURE_COMPNUM) -#define AARCH64_ARCH_V8_4A_FEATURES (AARCH64_FEATURE_V8_4A \ - | AARCH64_FEATURE_DOTPROD \ - | AARCH64_FEATURE_FLAGM \ - | AARCH64_FEATURE_F16_FML) -#define AARCH64_ARCH_V8_5A_FEATURES (AARCH64_FEATURE_V8_5A \ - | AARCH64_FEATURE_FLAGMANIP \ - | AARCH64_FEATURE_FRINTTS \ - | AARCH64_FEATURE_SB \ - | AARCH64_FEATURE_PREDRES \ - | AARCH64_FEATURE_CVADP \ - | AARCH64_FEATURE_BTI \ - | AARCH64_FEATURE_SCXTNUM \ - | AARCH64_FEATURE_ID_PFR2 \ - | AARCH64_FEATURE_SSBS) -#define AARCH64_ARCH_V8_6A_FEATURES (AARCH64_FEATURE_V8_6A \ - | AARCH64_FEATURE_BFLOAT16 \ - | AARCH64_FEATURE_I8MM) -#define AARCH64_ARCH_V8_7A_FEATURES (AARCH64_FEATURE_V8_7A \ - | AARCH64_FEATURE_LS64) -#define AARCH64_ARCH_V8_8A_FEATURES (AARCH64_FEATURE_V8_8A \ - | AARCH64_FEATURE_MOPS \ - | AARCH64_FEATURE_HBC) - -#define AARCH64_ARCH_V9A_FEATURES (AARCH64_FEATURE_V9A \ - | AARCH64_FEATURE_F16 \ - | AARCH64_FEATURE_SVE \ - | AARCH64_FEATURE_SVE2) -#define AARCH64_ARCH_V9_1A_FEATURES (AARCH64_ARCH_V8_6A_FEATURES) -#define AARCH64_ARCH_V9_2A_FEATURES (AARCH64_ARCH_V8_7A_FEATURES) -#define AARCH64_ARCH_V9_3A_FEATURES (AARCH64_ARCH_V8_8A_FEATURES) +/* An enum containing all known CPU features. The values act as bit positions + into aarch64_feature_set. */ +enum aarch64_feature_bit { + /* All processors. */ + AARCH64_FEATURE_V8, + /* ARMv8.6 processors. */ + AARCH64_FEATURE_V8_6A, + /* Bfloat16 insns. */ + AARCH64_FEATURE_BFLOAT16, + /* Armv8-A processors. */ + AARCH64_FEATURE_V8A, + /* SVE2 instructions. */ + AARCH64_FEATURE_SVE2, + /* ARMv8.2 processors. */ + AARCH64_FEATURE_V8_2A, + /* ARMv8.3 processors. */ + AARCH64_FEATURE_V8_3A, + AARCH64_FEATURE_SVE2_AES, + AARCH64_FEATURE_SVE2_BITPERM, + AARCH64_FEATURE_SVE2_SM4, + AARCH64_FEATURE_SVE2_SHA3, + /* ARMv8.4 processors. 
*/ + AARCH64_FEATURE_V8_4A, + /* Armv8-R processors. */ + AARCH64_FEATURE_V8R, + /* Armv8.7 processors. */ + AARCH64_FEATURE_V8_7A, + /* Scalable Matrix Extension. */ + AARCH64_FEATURE_SME, + /* Atomic 64-byte load/store. */ + AARCH64_FEATURE_LS64, + /* v8.3 Pointer Authentication. */ + AARCH64_FEATURE_PAC, + /* FP instructions. */ + AARCH64_FEATURE_FP, + /* SIMD instructions. */ + AARCH64_FEATURE_SIMD, + /* CRC instructions. */ + AARCH64_FEATURE_CRC, + /* LSE instructions. */ + AARCH64_FEATURE_LSE, + /* PAN instructions. */ + AARCH64_FEATURE_PAN, + /* LOR instructions. */ + AARCH64_FEATURE_LOR, + /* v8.1 SIMD instructions. */ + AARCH64_FEATURE_RDMA, + /* v8.1 features. */ + AARCH64_FEATURE_V8_1A, + /* v8.2 FP16 instructions. */ + AARCH64_FEATURE_F16, + /* RAS Extensions. */ + AARCH64_FEATURE_RAS, + /* Statistical Profiling. */ + AARCH64_FEATURE_PROFILE, + /* SVE instructions. */ + AARCH64_FEATURE_SVE, + /* RCPC instructions. */ + AARCH64_FEATURE_RCPC, + /* Complex # instructions. */ + AARCH64_FEATURE_COMPNUM, + /* Dot Product instructions. */ + AARCH64_FEATURE_DOTPROD, + /* SM3 & SM4 instructions. */ + AARCH64_FEATURE_SM4, + /* SHA2 instructions. */ + AARCH64_FEATURE_SHA2, + /* SHA3 instructions. */ + AARCH64_FEATURE_SHA3, + /* AES instructions. */ + AARCH64_FEATURE_AES, + /* v8.2 FP16FML ins. */ + AARCH64_FEATURE_F16_FML, + /* ARMv8.5 processors. */ + AARCH64_FEATURE_V8_5A, + /* v8.5 Flag Manipulation version 2. */ + AARCH64_FEATURE_FLAGMANIP, + /* FRINT[32,64][Z,X] insns. */ + AARCH64_FEATURE_FRINTTS, + /* SB instruction. */ + AARCH64_FEATURE_SB, + /* Execution and Data Prediction Restriction instructions. */ + AARCH64_FEATURE_PREDRES, + /* DC CVADP. */ + AARCH64_FEATURE_CVADP, + /* Random Number instructions. */ + AARCH64_FEATURE_RNG, + /* BTI instructions. */ + AARCH64_FEATURE_BTI, + /* SCXTNUM_ELx. */ + AARCH64_FEATURE_SCXTNUM, + /* ID_PFR2 instructions. */ + AARCH64_FEATURE_ID_PFR2, + /* SSBS mechanism enabled. */ + AARCH64_FEATURE_SSBS, + /* Memory Tagging Extension. */ + AARCH64_FEATURE_MEMTAG, + /* Transactional Memory Extension. */ + AARCH64_FEATURE_TME, + /* Standardization of memory operations. */ + AARCH64_FEATURE_MOPS, + /* Hinted conditional branches. */ + AARCH64_FEATURE_HBC, + /* Matrix Multiply instructions. */ + AARCH64_FEATURE_I8MM, + AARCH64_FEATURE_F32MM, + AARCH64_FEATURE_F64MM, + /* v8.4 Flag Manipulation. */ + AARCH64_FEATURE_FLAGM, + /* Armv9.0-A processors. */ + AARCH64_FEATURE_V9A, + /* SME F64F64. */ + AARCH64_FEATURE_SME_F64F64, + /* SME I16I64. */ + AARCH64_FEATURE_SME_I16I64, + /* Armv8.8 processors. */ + AARCH64_FEATURE_V8_8A, + /* Common Short Sequence Compression instructions. */ + AARCH64_FEATURE_CSSC, + /* SME2. */ + AARCH64_FEATURE_SME2 +}; + +/* These macros take an initial argument X that gives the index into + an aarch64_feature_set. The macros then return the bitmask for + that array index. */ + +/* A mask in which feature bit BIT is set and all other bits are clear. */ +#define AARCH64_UINT64_BIT(X, BIT) \ + ((X) == (BIT) / 64 ? 1ULL << (BIT) % 64 : 0) + +/* A mask that includes only AARCH64_FEATURE_. */ +#define AARCH64_FEATBIT(X, NAME) \ + AARCH64_UINT64_BIT (X, AARCH64_FEATURE_##NAME) + +/* A mask of the features that are enabled by each architecture version, + excluding those that are inherited from other architecture versions. 
*/ +#define AARCH64_ARCH_V8A_FEATURES(X) (AARCH64_FEATBIT (X, V8A) \ + | AARCH64_FEATBIT (X, FP) \ + | AARCH64_FEATBIT (X, RAS) \ + | AARCH64_FEATBIT (X, SIMD)) +#define AARCH64_ARCH_V8_1A_FEATURES(X) (AARCH64_FEATBIT (X, V8_1A) \ + | AARCH64_FEATBIT (X, CRC) \ + | AARCH64_FEATBIT (X, LSE) \ + | AARCH64_FEATBIT (X, PAN) \ + | AARCH64_FEATBIT (X, LOR) \ + | AARCH64_FEATBIT (X, RDMA)) +#define AARCH64_ARCH_V8_2A_FEATURES(X) (AARCH64_FEATBIT (X, V8_2A)) +#define AARCH64_ARCH_V8_3A_FEATURES(X) (AARCH64_FEATBIT (X, V8_3A) \ + | AARCH64_FEATBIT (X, PAC) \ + | AARCH64_FEATBIT (X, RCPC) \ + | AARCH64_FEATBIT (X, COMPNUM)) +#define AARCH64_ARCH_V8_4A_FEATURES(X) (AARCH64_FEATBIT (X, V8_4A) \ + | AARCH64_FEATBIT (X, DOTPROD) \ + | AARCH64_FEATBIT (X, FLAGM) \ + | AARCH64_FEATBIT (X, F16_FML)) +#define AARCH64_ARCH_V8_5A_FEATURES(X) (AARCH64_FEATBIT (X, V8_5A) \ + | AARCH64_FEATBIT (X, FLAGMANIP) \ + | AARCH64_FEATBIT (X, FRINTTS) \ + | AARCH64_FEATBIT (X, SB) \ + | AARCH64_FEATBIT (X, PREDRES) \ + | AARCH64_FEATBIT (X, CVADP) \ + | AARCH64_FEATBIT (X, BTI) \ + | AARCH64_FEATBIT (X, SCXTNUM) \ + | AARCH64_FEATBIT (X, ID_PFR2) \ + | AARCH64_FEATBIT (X, SSBS)) +#define AARCH64_ARCH_V8_6A_FEATURES(X) (AARCH64_FEATBIT (X, V8_6A) \ + | AARCH64_FEATBIT (X, BFLOAT16) \ + | AARCH64_FEATBIT (X, I8MM)) +#define AARCH64_ARCH_V8_7A_FEATURES(X) (AARCH64_FEATBIT (X, V8_7A) \ + | AARCH64_FEATBIT (X, LS64)) +#define AARCH64_ARCH_V8_8A_FEATURES(X) (AARCH64_FEATBIT (X, V8_8A) \ + | AARCH64_FEATBIT (X, MOPS) \ + | AARCH64_FEATBIT (X, HBC)) + +#define AARCH64_ARCH_V9A_FEATURES(X) (AARCH64_FEATBIT (X, V9A) \ + | AARCH64_FEATBIT (X, F16) \ + | AARCH64_FEATBIT (X, SVE) \ + | AARCH64_FEATBIT (X, SVE2)) +#define AARCH64_ARCH_V9_1A_FEATURES(X) AARCH64_ARCH_V8_6A_FEATURES (X) +#define AARCH64_ARCH_V9_2A_FEATURES(X) AARCH64_ARCH_V8_7A_FEATURES (X) +#define AARCH64_ARCH_V9_3A_FEATURES(X) AARCH64_ARCH_V8_8A_FEATURES (X) /* Architectures are the sum of the base and extensions. */ -#define AARCH64_ARCH_V8A AARCH64_FEATURE (AARCH64_FEATURE_V8, \ - AARCH64_ARCH_V8A_FEATURES) -#define AARCH64_ARCH_V8_1A AARCH64_FEATURE (AARCH64_ARCH_V8A, \ - AARCH64_ARCH_V8_1A_FEATURES) -#define AARCH64_ARCH_V8_2A AARCH64_FEATURE (AARCH64_ARCH_V8_1A, \ - AARCH64_ARCH_V8_2A_FEATURES) -#define AARCH64_ARCH_V8_3A AARCH64_FEATURE (AARCH64_ARCH_V8_2A, \ - AARCH64_ARCH_V8_3A_FEATURES) -#define AARCH64_ARCH_V8_4A AARCH64_FEATURE (AARCH64_ARCH_V8_3A, \ - AARCH64_ARCH_V8_4A_FEATURES) -#define AARCH64_ARCH_V8_5A AARCH64_FEATURE (AARCH64_ARCH_V8_4A, \ - AARCH64_ARCH_V8_5A_FEATURES) -#define AARCH64_ARCH_V8_6A AARCH64_FEATURE (AARCH64_ARCH_V8_5A, \ - AARCH64_ARCH_V8_6A_FEATURES) -#define AARCH64_ARCH_V8_7A AARCH64_FEATURE (AARCH64_ARCH_V8_6A, \ - AARCH64_ARCH_V8_7A_FEATURES) -#define AARCH64_ARCH_V8_8A AARCH64_FEATURE (AARCH64_ARCH_V8_7A, \ - AARCH64_ARCH_V8_8A_FEATURES) -#define AARCH64_ARCH_V8R (AARCH64_FEATURE (AARCH64_ARCH_V8_4A, \ - AARCH64_FEATURE_V8R) \ - & ~(AARCH64_FEATURE_V8A | AARCH64_FEATURE_LOR)) - -#define AARCH64_ARCH_V9A AARCH64_FEATURE (AARCH64_ARCH_V8_5A, \ - AARCH64_ARCH_V9A_FEATURES) -#define AARCH64_ARCH_V9_1A AARCH64_FEATURE (AARCH64_ARCH_V9A, \ - AARCH64_ARCH_V9_1A_FEATURES) -#define AARCH64_ARCH_V9_2A AARCH64_FEATURE (AARCH64_ARCH_V9_1A, \ - AARCH64_ARCH_V9_2A_FEATURES) -#define AARCH64_ARCH_V9_3A AARCH64_FEATURE (AARCH64_ARCH_V9_2A, \ - AARCH64_ARCH_V9_3A_FEATURES) - -#define AARCH64_ARCH_NONE AARCH64_FEATURE (0, 0) -#define AARCH64_ANY AARCH64_FEATURE (-1, 0) /* Any basic core. 
*/ +#define AARCH64_ARCH_V8A(X) (AARCH64_FEATBIT (X, V8) \ + | AARCH64_ARCH_V8A_FEATURES (X)) +#define AARCH64_ARCH_V8_1A(X) (AARCH64_ARCH_V8A (X) \ + | AARCH64_ARCH_V8_1A_FEATURES (X)) +#define AARCH64_ARCH_V8_2A(X) (AARCH64_ARCH_V8_1A (X) \ + | AARCH64_ARCH_V8_2A_FEATURES (X)) +#define AARCH64_ARCH_V8_3A(X) (AARCH64_ARCH_V8_2A (X) \ + | AARCH64_ARCH_V8_3A_FEATURES (X)) +#define AARCH64_ARCH_V8_4A(X) (AARCH64_ARCH_V8_3A (X) \ + | AARCH64_ARCH_V8_4A_FEATURES (X)) +#define AARCH64_ARCH_V8_5A(X) (AARCH64_ARCH_V8_4A (X) \ + | AARCH64_ARCH_V8_5A_FEATURES (X)) +#define AARCH64_ARCH_V8_6A(X) (AARCH64_ARCH_V8_5A (X) \ + | AARCH64_ARCH_V8_6A_FEATURES (X)) +#define AARCH64_ARCH_V8_7A(X) (AARCH64_ARCH_V8_6A (X) \ + | AARCH64_ARCH_V8_7A_FEATURES (X)) +#define AARCH64_ARCH_V8_8A(X) (AARCH64_ARCH_V8_7A (X) \ + | AARCH64_ARCH_V8_8A_FEATURES (X)) +#define AARCH64_ARCH_V8R(X) ((AARCH64_ARCH_V8_4A (X) \ + | AARCH64_FEATBIT (X, V8R)) \ + & ~AARCH64_FEATBIT (X, V8A) \ + & ~AARCH64_FEATBIT (X, LOR)) + +#define AARCH64_ARCH_V9A(X) (AARCH64_ARCH_V8_5A (X) \ + | AARCH64_ARCH_V9A_FEATURES (X)) +#define AARCH64_ARCH_V9_1A(X) (AARCH64_ARCH_V9A (X) \ + | AARCH64_ARCH_V9_1A_FEATURES (X)) +#define AARCH64_ARCH_V9_2A(X) (AARCH64_ARCH_V9_1A (X) \ + | AARCH64_ARCH_V9_2A_FEATURES (X)) +#define AARCH64_ARCH_V9_3A(X) (AARCH64_ARCH_V9_2A (X) \ + | AARCH64_ARCH_V9_3A_FEATURES (X)) + +#define AARCH64_ARCH_NONE(X) 0 /* CPU-specific features. */ typedef unsigned long long aarch64_feature_set; +#define AARCH64_CPU_HAS_FEATURE(CPU,FEAT) \ + ((~(CPU) & AARCH64_FEATBIT (0, FEAT)) == 0) + #define AARCH64_CPU_HAS_ALL_FEATURES(CPU,FEAT) \ ((~(CPU) & (FEAT)) == 0) #define AARCH64_CPU_HAS_ANY_FEATURES(CPU,FEAT) \ (((CPU) & (FEAT)) != 0) -#define AARCH64_CPU_HAS_FEATURE(CPU,FEAT) \ - AARCH64_CPU_HAS_ALL_FEATURES (CPU,FEAT) +#define AARCH64_SET_FEATURE(DEST, FEAT) \ + ((DEST) = FEAT (0)) + +#define AARCH64_CLEAR_FEATURE(DEST, SRC, FEAT) \ + ((DEST) = (SRC) & ~AARCH64_FEATBIT (0, FEAT)) #define AARCH64_MERGE_FEATURE_SETS(TARG,F1,F2) \ do \ @@ -205,14 +281,55 @@ typedef unsigned long long aarch64_feature_set; } \ while (0) -#define AARCH64_CLEAR_FEATURE(TARG,F1,F2) \ +#define AARCH64_CLEAR_FEATURES(TARG,F1,F2) \ do \ { \ (TARG) = (F1) &~ (F2); \ } \ while (0) -#define AARCH64_FEATURE(core,coproc) ((core) | (coproc)) +/* aarch64_feature_set initializers for no features and all features, + respectively. */ +#define AARCH64_NO_FEATURES 0 +#define AARCH64_ALL_FEATURES -1 + +/* An aarch64_feature_set initializer for a single feature, + AARCH64_FEATURE_. */ +#define AARCH64_FEATURE(FEAT) AARCH64_FEATBIT (0, FEAT) + +/* An aarch64_feature_set initializer for a specific architecture version, + including all the features that are enabled by default for that architecture + version. */ +#define AARCH64_ARCH_FEATURES(ARCH) AARCH64_ARCH_##ARCH (0) + +/* Used by AARCH64_CPU_FEATURES. */ +#define AARCH64_OR_FEATURES_1(X, ARCH, F1) \ + (AARCH64_FEATBIT (X, F1) | AARCH64_ARCH_##ARCH (X)) +#define AARCH64_OR_FEATURES_2(X, ARCH, F1, F2) \ + (AARCH64_FEATBIT (X, F1) | AARCH64_OR_FEATURES_1 (X, ARCH, F2)) +#define AARCH64_OR_FEATURES_3(X, ARCH, F1, ...) \ + (AARCH64_FEATBIT (X, F1) | AARCH64_OR_FEATURES_2 (X, ARCH, __VA_ARGS__)) +#define AARCH64_OR_FEATURES_4(X, ARCH, F1, ...) \ + (AARCH64_FEATBIT (X, F1) | AARCH64_OR_FEATURES_3 (X, ARCH, __VA_ARGS__)) +#define AARCH64_OR_FEATURES_5(X, ARCH, F1, ...) \ + (AARCH64_FEATBIT (X, F1) | AARCH64_OR_FEATURES_4 (X, ARCH, __VA_ARGS__)) +#define AARCH64_OR_FEATURES_6(X, ARCH, F1, ...) 
\ + (AARCH64_FEATBIT (X, F1) | AARCH64_OR_FEATURES_5 (X, ARCH, __VA_ARGS__)) +#define AARCH64_OR_FEATURES_7(X, ARCH, F1, ...) \ + (AARCH64_FEATBIT (X, F1) | AARCH64_OR_FEATURES_6 (X, ARCH, __VA_ARGS__)) +#define AARCH64_OR_FEATURES_8(X, ARCH, F1, ...) \ + (AARCH64_FEATBIT (X, F1) | AARCH64_OR_FEATURES_7 (X, ARCH, __VA_ARGS__)) +#define AARCH64_OR_FEATURES_9(X, ARCH, F1, ...) \ + (AARCH64_FEATBIT (X, F1) | AARCH64_OR_FEATURES_8 (X, ARCH, __VA_ARGS__)) + +/* An aarch64_feature_set initializer for a CPU that implements architecture + version ARCH, and additionally provides the N features listed in "...". */ +#define AARCH64_CPU_FEATURES(ARCH, N, ...) \ + AARCH64_OR_FEATURES_##N (0, ARCH, __VA_ARGS__) + +/* An aarch64_feature_set initializer for the N features listed in "...". */ +#define AARCH64_FEATURES(N, ...) \ + AARCH64_CPU_FEATURES (NONE, N, __VA_ARGS__) enum aarch64_operand_class { @@ -1121,7 +1238,7 @@ extern bool aarch64_sys_ins_reg_has_xt (const aarch64_sys_ins_reg *); extern bool aarch64_sys_ins_reg_supported_p (const aarch64_feature_set, const char *reg_name, aarch64_insn, - uint32_t, aarch64_feature_set); + uint32_t, const aarch64_feature_set *); extern const aarch64_sys_ins_reg aarch64_sys_regs_ic []; extern const aarch64_sys_ins_reg aarch64_sys_regs_dc []; @@ -1575,7 +1692,7 @@ extern bool aarch64_sve_dupm_mov_immediate_p (uint64_t, int); extern bool -aarch64_cpu_supports_inst_p (uint64_t, aarch64_inst *); +aarch64_cpu_supports_inst_p (aarch64_feature_set, aarch64_inst *); #ifdef DEBUG_AARCH64 extern int debug_dump; diff --git a/opcodes/aarch64-dis.c b/opcodes/aarch64-dis.c index 03bcc372a56..3bf5f50effc 100644 --- a/opcodes/aarch64-dis.c +++ b/opcodes/aarch64-dis.c @@ -3008,7 +3008,7 @@ determine_disassembling_preference (struct aarch64_inst *inst, continue; } - if (!AARCH64_CPU_HAS_FEATURE (arch_variant, *alias->avariant)) + if (!AARCH64_CPU_HAS_ALL_FEATURES (arch_variant, *alias->avariant)) { DEBUG_TRACE ("skip %s: we're missing features", alias->name); continue; @@ -3969,10 +3969,11 @@ select_aarch64_variant (unsigned mach) switch (mach) { case bfd_mach_aarch64_8R: - arch_variant = AARCH64_ARCH_V8R; + AARCH64_SET_FEATURE (arch_variant, AARCH64_ARCH_V8R); break; default: - arch_variant = AARCH64_ANY & ~(AARCH64_FEATURE_V8R); + arch_variant = (aarch64_feature_set) AARCH64_ALL_FEATURES; + AARCH64_CLEAR_FEATURE (arch_variant, arch_variant, V8R); } } diff --git a/opcodes/aarch64-opc.c b/opcodes/aarch64-opc.c index 41dec5cdc04..df1b5113641 100644 --- a/opcodes/aarch64-opc.c +++ b/opcodes/aarch64-opc.c @@ -4501,7 +4501,7 @@ aarch64_print_operand (char *buf, size_t size, bfd_vma pc, bool exact_match = (!(sr->flags & (F_REG_READ | F_REG_WRITE)) || (sr->flags & opnd->sysreg.flags) == opnd->sysreg.flags) - && AARCH64_CPU_HAS_FEATURE (features, sr->features); + && AARCH64_CPU_HAS_ALL_FEATURES (features, sr->features); /* Try and find an exact match, But if that fails, return the first partial match that was found. 
*/ @@ -4674,17 +4674,14 @@ aarch64_print_operand (char *buf, size_t size, bfd_vma pc, #define C14 14 #define C15 15 -#define SYSREG(name, encoding, flags, features) \ - { name, encoding, flags, features } - -#define SR_CORE(n,e,f) SYSREG (n,e,f,0) +#define SR_CORE(n,e,f) {n,e,f,AARCH64_NO_FEATURES} #define SR_FEAT(n,e,f,feat) \ - SYSREG ((n), (e), (f) | F_ARCHEXT, AARCH64_FEATURE_##feat) + { (n), (e), (f) | F_ARCHEXT, AARCH64_FEATURE (feat) } #define SR_FEAT2(n,e,f,fe1,fe2) \ - SYSREG ((n), (e), (f) | F_ARCHEXT, \ - AARCH64_FEATURE_##fe1 | AARCH64_FEATURE_##fe2) + { (n), (e), (f) | F_ARCHEXT, \ + AARCH64_FEATURES (2, fe1, fe2) } #define SR_V8_1_A(n,e,f) SR_FEAT2(n,e,f,V8A,V8_1A) #define SR_V8_4_A(n,e,f) SR_FEAT2(n,e,f,V8A,V8_4A) @@ -5736,7 +5733,7 @@ const aarch64_sys_reg aarch64_sys_regs [] = SR_V8_8A ("allint", CPENC (3,0,C4,C3,0), 0), SR_V8_8A ("icc_nmiar1_el1", CPENC (3,0,C12,C9,5), F_REG_READ), - { 0, CPENC (0,0,0,0,0), 0, 0 } + { 0, CPENC (0,0,0,0,0), 0, AARCH64_NO_FEATURES } }; bool @@ -5769,7 +5766,7 @@ const aarch64_sys_reg aarch64_pstatefields [] = SR_SME ("svcrsmza", 0x1b, PSTATE_ENCODE_CRM_AND_IMM(0x6,0x1) | F_REG_MAX_VALUE (1)), SR_V8_8A ("allint", 0x08, F_REG_MAX_VALUE (1)), - { 0, CPENC (0,0,0,0,0), 0, 0 }, + { 0, CPENC (0,0,0,0,0), 0, AARCH64_NO_FEATURES }, }; bool @@ -5954,13 +5951,13 @@ aarch64_sys_ins_reg_has_xt (const aarch64_sys_ins_reg *sys_ins_reg) extern bool aarch64_sys_ins_reg_supported_p (const aarch64_feature_set features, - const char *reg_name, - aarch64_insn reg_value, - uint32_t reg_flags, - aarch64_feature_set reg_features) + const char *reg_name, + aarch64_insn reg_value, + uint32_t reg_flags, + const aarch64_feature_set *reg_features) { /* Armv8-R has no EL3. */ - if (AARCH64_CPU_HAS_FEATURE (features, AARCH64_FEATURE_V8R)) + if (AARCH64_CPU_HAS_FEATURE (features, V8R)) { const char *suffix = strrchr (reg_name, '_'); if (suffix && !strcmp (suffix, "_el3")) @@ -5971,7 +5968,7 @@ aarch64_sys_ins_reg_supported_p (const aarch64_feature_set features, return true; if (reg_features - && AARCH64_CPU_HAS_ALL_FEATURES (features, reg_features)) + && AARCH64_CPU_HAS_ALL_FEATURES (features, *reg_features)) return true; /* ARMv8.4 TLB instructions. */ @@ -6021,17 +6018,17 @@ aarch64_sys_ins_reg_supported_p (const aarch64_feature_set features, || reg_value == CPENS (6, C8, C2, 5) || reg_value == CPENS (6, C8, C5, 1) || reg_value == CPENS (6, C8, C5, 5)) - && AARCH64_CPU_HAS_FEATURE (features, AARCH64_FEATURE_V8_4A)) + && AARCH64_CPU_HAS_FEATURE (features, V8_4A)) return true; /* DC CVAP. Values are from aarch64_sys_regs_dc. */ if (reg_value == CPENS (3, C7, C12, 1) - && AARCH64_CPU_HAS_FEATURE (features, AARCH64_FEATURE_V8_2A)) + && AARCH64_CPU_HAS_FEATURE (features, V8_2A)) return true; /* DC CVADP. Values are from aarch64_sys_regs_dc. */ if (reg_value == CPENS (3, C7, C13, 1) - && AARCH64_CPU_HAS_FEATURE (features, AARCH64_FEATURE_CVADP)) + && AARCH64_CPU_HAS_FEATURE (features, CVADP)) return true; /* DC for ARMv8.5-A Memory Tagging Extension. */ @@ -6053,18 +6050,18 @@ aarch64_sys_ins_reg_supported_p (const aarch64_feature_set features, || reg_value == CPENS (3, C7, C13, 5) || reg_value == CPENS (3, C7, C14, 5) || reg_value == CPENS (3, C7, C4, 4)) - && AARCH64_CPU_HAS_FEATURE (features, AARCH64_FEATURE_MEMTAG)) + && AARCH64_CPU_HAS_FEATURE (features, MEMTAG)) return true; /* AT S1E1RP, AT S1E1WP. Values are from aarch64_sys_regs_at. 
*/ if ((reg_value == CPENS (0, C7, C9, 0) || reg_value == CPENS (0, C7, C9, 1)) - && AARCH64_CPU_HAS_FEATURE (features, AARCH64_FEATURE_V8_2A)) + && AARCH64_CPU_HAS_FEATURE (features, V8_2A)) return true; /* CFP/DVP/CPP RCTX : Value are from aarch64_sys_regs_sr. */ if (reg_value == CPENS (3, C7, C3, 0) - && AARCH64_CPU_HAS_FEATURE (features, AARCH64_FEATURE_PREDRES)) + && AARCH64_CPU_HAS_FEATURE (features, PREDRES)) return true; return false; @@ -6372,8 +6369,8 @@ verify_constraints (const struct aarch64_inst *inst, /* Check to see if the MOVPRFX SVE instruction is followed by an SVE instruction for better error messages. */ if (!opcode->avariant - || !(*opcode->avariant & - (AARCH64_FEATURE_SVE | AARCH64_FEATURE_SVE2))) + || (!AARCH64_CPU_HAS_FEATURE (*opcode->avariant, SVE) + && !AARCH64_CPU_HAS_FEATURE (*opcode->avariant, SVE2))) { mismatch_detail->kind = AARCH64_OPDE_SYNTAX_ERROR; mismatch_detail->error = _("SVE instruction expected after " @@ -6614,7 +6611,8 @@ aarch64_sve_dupm_mov_immediate_p (uint64_t uvalue, int esize) supports the instruction described by INST. */ bool -aarch64_cpu_supports_inst_p (uint64_t cpu_variant, aarch64_inst *inst) +aarch64_cpu_supports_inst_p (aarch64_feature_set cpu_variant, + aarch64_inst *inst) { if (!inst->opcode->avariant || !AARCH64_CPU_HAS_ALL_FEATURES (cpu_variant, *inst->opcode->avariant)) @@ -6622,14 +6620,12 @@ aarch64_cpu_supports_inst_p (uint64_t cpu_variant, aarch64_inst *inst) if (inst->opcode->iclass == sme_fp_sd && inst->operands[0].qualifier == AARCH64_OPND_QLF_S_D - && !AARCH64_CPU_HAS_ALL_FEATURES (cpu_variant, - AARCH64_FEATURE_SME_F64F64)) + && !AARCH64_CPU_HAS_FEATURE (cpu_variant, SME_F64F64)) return false; if (inst->opcode->iclass == sme_int_sd && inst->operands[0].qualifier == AARCH64_OPND_QLF_S_D - && !AARCH64_CPU_HAS_ALL_FEATURES (cpu_variant, - AARCH64_FEATURE_SME_I16I64)) + && !AARCH64_CPU_HAS_FEATURE (cpu_variant, SME_I16I64)) return false; return true; diff --git a/opcodes/aarch64-tbl.h b/opcodes/aarch64-tbl.h index 3de1e37b373..9e099cfd65e 100644 --- a/opcodes/aarch64-tbl.h +++ b/opcodes/aarch64-tbl.h @@ -2461,123 +2461,117 @@ error message for MOVPRFX constraint violations. 
*/ static const aarch64_feature_set aarch64_feature_v8 = - AARCH64_FEATURE (AARCH64_FEATURE_V8, 0); + AARCH64_FEATURE (V8); static const aarch64_feature_set aarch64_feature_fp = - AARCH64_FEATURE (AARCH64_FEATURE_FP, 0); + AARCH64_FEATURE (FP); static const aarch64_feature_set aarch64_feature_simd = - AARCH64_FEATURE (AARCH64_FEATURE_SIMD, 0); + AARCH64_FEATURE (SIMD); static const aarch64_feature_set aarch64_feature_crc = - AARCH64_FEATURE (AARCH64_FEATURE_CRC, 0); + AARCH64_FEATURE (CRC); static const aarch64_feature_set aarch64_feature_lse = - AARCH64_FEATURE (AARCH64_FEATURE_LSE, 0); + AARCH64_FEATURE (LSE); static const aarch64_feature_set aarch64_feature_lor = - AARCH64_FEATURE (AARCH64_FEATURE_LOR, 0); + AARCH64_FEATURE (LOR); static const aarch64_feature_set aarch64_feature_rdma = - AARCH64_FEATURE (AARCH64_FEATURE_RDMA, 0); + AARCH64_FEATURE (RDMA); static const aarch64_feature_set aarch64_feature_v8_2a = - AARCH64_FEATURE (AARCH64_FEATURE_V8_2A, 0); + AARCH64_FEATURE (V8_2A); static const aarch64_feature_set aarch64_feature_fp_f16 = - AARCH64_FEATURE (AARCH64_FEATURE_F16 | AARCH64_FEATURE_FP, 0); + AARCH64_FEATURES (2, F16, FP); static const aarch64_feature_set aarch64_feature_simd_f16 = - AARCH64_FEATURE (AARCH64_FEATURE_F16 | AARCH64_FEATURE_SIMD, 0); + AARCH64_FEATURES (2, F16, SIMD); static const aarch64_feature_set aarch64_feature_sve = - AARCH64_FEATURE (AARCH64_FEATURE_SVE, 0); + AARCH64_FEATURE (SVE); static const aarch64_feature_set aarch64_feature_v8_3a = - AARCH64_FEATURE (AARCH64_FEATURE_V8_3A, 0); + AARCH64_FEATURE (V8_3A); static const aarch64_feature_set aarch64_feature_fp_v8_3a = - AARCH64_FEATURE (AARCH64_FEATURE_V8_3A | AARCH64_FEATURE_FP, 0); + AARCH64_FEATURES (2, V8_3A, FP); static const aarch64_feature_set aarch64_feature_pac = - AARCH64_FEATURE (AARCH64_FEATURE_PAC, 0); + AARCH64_FEATURE (PAC); static const aarch64_feature_set aarch64_feature_compnum = - AARCH64_FEATURE (AARCH64_FEATURE_COMPNUM, 0); + AARCH64_FEATURE (COMPNUM); static const aarch64_feature_set aarch64_feature_rcpc = - AARCH64_FEATURE (AARCH64_FEATURE_RCPC, 0); + AARCH64_FEATURE (RCPC); static const aarch64_feature_set aarch64_feature_dotprod = - AARCH64_FEATURE (AARCH64_FEATURE_DOTPROD, 0); + AARCH64_FEATURE (DOTPROD); static const aarch64_feature_set aarch64_feature_sha2 = - AARCH64_FEATURE (AARCH64_FEATURE_V8 | AARCH64_FEATURE_SHA2, 0); + AARCH64_FEATURES (2, V8, SHA2); static const aarch64_feature_set aarch64_feature_aes = - AARCH64_FEATURE (AARCH64_FEATURE_V8 | AARCH64_FEATURE_AES, 0); + AARCH64_FEATURES (2, V8, AES); static const aarch64_feature_set aarch64_feature_v8_4a = - AARCH64_FEATURE (AARCH64_FEATURE_V8_4A, 0); + AARCH64_FEATURE (V8_4A); static const aarch64_feature_set aarch64_feature_sm4 = - AARCH64_FEATURE (AARCH64_FEATURE_SM4 | AARCH64_FEATURE_SIMD - | AARCH64_FEATURE_FP, 0); + AARCH64_FEATURES (3, SM4, SIMD, FP); static const aarch64_feature_set aarch64_feature_sha3 = - AARCH64_FEATURE (AARCH64_FEATURE_SHA2 | AARCH64_FEATURE_SHA3 - | AARCH64_FEATURE_SIMD | AARCH64_FEATURE_FP, 0); + AARCH64_FEATURES (4, SHA2, SHA3, SIMD, FP); static const aarch64_feature_set aarch64_feature_fp_16_v8_2a = - AARCH64_FEATURE (AARCH64_FEATURE_F16_FML | AARCH64_FEATURE_F16 - | AARCH64_FEATURE_FP, 0); -static const aarch64_feature_set aarch64_feature_v8_5a = - AARCH64_FEATURE (AARCH64_FEATURE_V8_5A, 0); + AARCH64_FEATURES (3, F16_FML, F16, FP); +static const aarch64_feature_set aarch64_feature_v8_5 = + AARCH64_FEATURE (V8_5A); static const aarch64_feature_set aarch64_feature_flagmanip = - 
AARCH64_FEATURE (AARCH64_FEATURE_FLAGMANIP, 0); + AARCH64_FEATURE (FLAGMANIP); static const aarch64_feature_set aarch64_feature_frintts = - AARCH64_FEATURE (AARCH64_FEATURE_FRINTTS, 0); + AARCH64_FEATURE (FRINTTS); static const aarch64_feature_set aarch64_feature_sb = - AARCH64_FEATURE (AARCH64_FEATURE_SB, 0); + AARCH64_FEATURE (SB); static const aarch64_feature_set aarch64_feature_predres = - AARCH64_FEATURE (AARCH64_FEATURE_PREDRES, 0); + AARCH64_FEATURE (PREDRES); static const aarch64_feature_set aarch64_feature_memtag = - AARCH64_FEATURE (AARCH64_FEATURE_MEMTAG, 0); + AARCH64_FEATURE (MEMTAG); static const aarch64_feature_set aarch64_feature_bfloat16 = - AARCH64_FEATURE (AARCH64_FEATURE_BFLOAT16, 0); + AARCH64_FEATURE (BFLOAT16); static const aarch64_feature_set aarch64_feature_bfloat16_sve = - AARCH64_FEATURE (AARCH64_FEATURE_BFLOAT16 | AARCH64_FEATURE_SVE, 0); + AARCH64_FEATURES (2, BFLOAT16, SVE); static const aarch64_feature_set aarch64_feature_tme = - AARCH64_FEATURE (AARCH64_FEATURE_TME, 0); + AARCH64_FEATURE (TME); static const aarch64_feature_set aarch64_feature_sve2 = - AARCH64_FEATURE (AARCH64_FEATURE_SVE2, 0); + AARCH64_FEATURE (SVE2); static const aarch64_feature_set aarch64_feature_sve2aes = - AARCH64_FEATURE (AARCH64_FEATURE_SVE2 | AARCH64_FEATURE_SVE2_AES, 0); + AARCH64_FEATURES (2, SVE2, SVE2_AES); static const aarch64_feature_set aarch64_feature_sve2sha3 = - AARCH64_FEATURE (AARCH64_FEATURE_SVE2 | AARCH64_FEATURE_SVE2_SHA3, 0); + AARCH64_FEATURES (2, SVE2, SVE2_SHA3); static const aarch64_feature_set aarch64_feature_sve2sm4 = - AARCH64_FEATURE (AARCH64_FEATURE_SVE2 | AARCH64_FEATURE_SVE2_SM4, 0); + AARCH64_FEATURES (2, SVE2, SVE2_SM4); static const aarch64_feature_set aarch64_feature_sve2bitperm = - AARCH64_FEATURE (AARCH64_FEATURE_SVE2 | AARCH64_FEATURE_SVE2_BITPERM, 0); + AARCH64_FEATURES (2, SVE2, SVE2_BITPERM); static const aarch64_feature_set aarch64_feature_sme = - AARCH64_FEATURE (AARCH64_FEATURE_SVE2 | AARCH64_FEATURE_SME, 0); + AARCH64_FEATURES (2, SVE2, SME); static const aarch64_feature_set aarch64_feature_sme_f64f64 = - AARCH64_FEATURE (AARCH64_FEATURE_SVE2 | AARCH64_FEATURE_SME - | AARCH64_FEATURE_SME_F64F64, 0); + AARCH64_FEATURES (3, SVE2, SME, SME_F64F64); static const aarch64_feature_set aarch64_feature_sme_i16i64 = - AARCH64_FEATURE (AARCH64_FEATURE_SVE2 | AARCH64_FEATURE_SME - | AARCH64_FEATURE_SME_I16I64, 0); + AARCH64_FEATURES (3, SVE2, SME, SME_I16I64); static const aarch64_feature_set aarch64_feature_sme2 = - AARCH64_FEATURE (AARCH64_FEATURE_SVE2 | AARCH64_FEATURE_SME - | AARCH64_FEATURE_SME2, 0); + AARCH64_FEATURES (3, SVE2, SME, SME2); static const aarch64_feature_set aarch64_feature_sme2_i16i64 = - AARCH64_FEATURE (AARCH64_FEATURE_SME2 | AARCH64_FEATURE_SME_I16I64, 0); + AARCH64_FEATURES (2, SME2, SME_I16I64); static const aarch64_feature_set aarch64_feature_sme2_f64f64 = - AARCH64_FEATURE (AARCH64_FEATURE_SME2 | AARCH64_FEATURE_SME_F64F64, 0); + AARCH64_FEATURES (2, SME2, SME_F64F64); static const aarch64_feature_set aarch64_feature_v8_6a = - AARCH64_FEATURE (AARCH64_FEATURE_V8_6A, 0); + AARCH64_FEATURE (V8_6A); static const aarch64_feature_set aarch64_feature_v8_7a = - AARCH64_FEATURE (AARCH64_FEATURE_V8_7A, 0); + AARCH64_FEATURE (V8_7A); static const aarch64_feature_set aarch64_feature_i8mm = - AARCH64_FEATURE (AARCH64_FEATURE_I8MM, 0); + AARCH64_FEATURE (I8MM); static const aarch64_feature_set aarch64_feature_i8mm_sve = - AARCH64_FEATURE (AARCH64_FEATURE_I8MM | AARCH64_FEATURE_SVE, 0); + AARCH64_FEATURES (2, I8MM, SVE); static const 
aarch64_feature_set aarch64_feature_f32mm_sve = - AARCH64_FEATURE (AARCH64_FEATURE_F32MM | AARCH64_FEATURE_SVE, 0); + AARCH64_FEATURES (2, F32MM, SVE); static const aarch64_feature_set aarch64_feature_f64mm_sve = - AARCH64_FEATURE (AARCH64_FEATURE_F64MM | AARCH64_FEATURE_SVE, 0); + AARCH64_FEATURES (2, F64MM, SVE); static const aarch64_feature_set aarch64_feature_v8r = - AARCH64_FEATURE (AARCH64_FEATURE_V8R, 0); + AARCH64_FEATURE (V8R); static const aarch64_feature_set aarch64_feature_ls64 = - AARCH64_FEATURE (AARCH64_FEATURE_LS64, 0); + AARCH64_FEATURE (LS64); static const aarch64_feature_set aarch64_feature_flagm = - AARCH64_FEATURE (AARCH64_FEATURE_FLAGM, 0); + AARCH64_FEATURE (FLAGM); static const aarch64_feature_set aarch64_feature_mops = - AARCH64_FEATURE (AARCH64_FEATURE_MOPS, 0); + AARCH64_FEATURE (MOPS); static const aarch64_feature_set aarch64_feature_mops_memtag = - AARCH64_FEATURE (AARCH64_FEATURE_MOPS | AARCH64_FEATURE_MEMTAG, 0); + AARCH64_FEATURES (2, MOPS, MEMTAG); static const aarch64_feature_set aarch64_feature_hbc = - AARCH64_FEATURE (AARCH64_FEATURE_HBC, 0); + AARCH64_FEATURE (HBC); static const aarch64_feature_set aarch64_feature_cssc = - AARCH64_FEATURE (AARCH64_FEATURE_CSSC, 0); + AARCH64_FEATURE (CSSC); #define CORE &aarch64_feature_v8 #define FP &aarch64_feature_fp -- 2.25.1