Carl Love
68dd57808f rs6000: Add command line and builtin compatibility check
2020-03-20  Carl Love  <cel@us.ibm.com>

	PR target/87583
	* gcc/config/rs6000/rs6000.c (rs6000_option_override_internal):
	Add check for TARGET_FPRND for Power 7 or newer.
2020-03-20 18:16:24 -05:00
Jan Hubicka
72b3bc895f Fix verifier ICE on wrong comdat local flag [PR93347]
gcc/ChangeLog:

2020-03-20  Jan Hubicka  <hubicka@ucw.cz>

	PR ipa/93347
	* cgraph.c (symbol_table::create_edge): Update calls_comdat_local flag.
	(cgraph_edge::redirect_callee): Move here; likewise.
	(cgraph_node::remove_callees): Update calls_comdat_local flag.
	(cgraph_node::verify_node): Verify that the calls_comdat_local flag
	matches reality.
	(cgraph_node::check_calls_comdat_local_p): New member function.
	* cgraph.h (cgraph_node::check_calls_comdat_local_p): Declare.
	(cgraph_edge::redirect_callee): Move offline.
	* ipa-fnsummary.c (compute_fn_summary): Do not compute
	calls_comdat_local flag here.
	* ipa-inline-transform.c (inline_call): Fix updating of
	calls_comdat_local flag.
	* ipa-split.c (split_function): Use true instead of 1 to set the flag.
	* symtab.c (symtab_node::add_to_same_comdat_group): Update
	calls_comdat_local flag.

gcc/testsuite/ChangeLog:

2020-03-20  Jan Hubicka  <hubicka@ucw.cz>

	* g++.dg/torture/pr93347.C: New test.
2020-03-20 22:06:24 +01:00
Richard Biener
a89349e664 adjust SLP tree dumping
This also dumps the root node we eventually smuggle in.

2020-03-20  Richard Biener  <rguenther@suse.de>

	* tree-vect-slp.c (vect_analyze_slp_instance): Dump SLP tree
	from the possibly modified root.
2020-03-20 18:58:56 +01:00
Patrick Palka
a23eff1bd0 c++: Add testcases from PR c++/69694
These testcases have compiled successfully since GCC 7.1.

gcc/testsuite/ChangeLog:

	PR c++/69694
	* g++.dg/cpp0x/decltype74.C: New test.
	* g++.dg/cpp0x/decltype75.C: New test.
2020-03-20 13:06:21 -04:00
Srinath Parvathaneni
1dfcc3b541 [ARM][GCC][11x]: MVE ACLE vector interleaving store and deinterleaving load intrinsics, and aliases to the vstr and vldr intrinsics.
This patch supports the following MVE ACLE intrinsics, which are aliases
of the vstr and vldr intrinsics.

vst1q_p_u8, vst1q_p_s8, vld1q_z_u8, vld1q_z_s8, vst1q_p_u16, vst1q_p_s16,
vld1q_z_u16, vld1q_z_s16, vst1q_p_u32, vst1q_p_s32, vld1q_z_u32, vld1q_z_s32,
vld1q_z_f16, vst1q_p_f16, vld1q_z_f32, vst1q_p_f32.

This patch also supports the following MVE ACLE vector deinterleaving loads
and vector interleaving stores.

vst2q_s8, vst2q_u8, vld2q_s8, vld2q_u8, vld4q_s8, vld4q_u8, vst2q_s16, vst2q_u16,
vld2q_s16, vld2q_u16, vld4q_s16, vld4q_u16, vst2q_s32, vst2q_u32, vld2q_s32,
vld2q_u32, vld4q_s32, vld4q_u32, vld4q_f16, vld2q_f16, vst2q_f16, vld4q_f32,
vld2q_f32, vst2q_f32.

Please refer to the M-profile Vector Extension (MVE) intrinsics documentation [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
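
For illustration only (not part of the patch), a minimal sketch of how the
deinterleaving load and predicated store forms combine; the helper name and
the use of the previously added vmulq_n_s16 and vctp16q intrinsics are
assumptions made for the example:

  #include <arm_mve.h>

  /* Deinterleave 8 stereo int16_t sample pairs, scale the left channel,
     and store only the first n_left lanes using tail predication.  */
  void demo (int16_t *samples, int16_t *left_out, unsigned n_left)
  {
    int16x8x2_t lr = vld2q_s16 (samples);         /* lr.val[0] = left, lr.val[1] = right */
    int16x8_t left = vmulq_n_s16 (lr.val[0], 2);  /* scale the left channel */
    mve_pred16_t p = vctp16q (n_left);            /* predicate covering the first n_left lanes */
    vst1q_p_s16 (left_out, left, p);              /* predicated (tail) store */
  }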

2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
            Andre Vieira  <andre.simoesdiasvieira@arm.com>
            Mihail Ionescu  <mihail.ionescu@arm.com>

	* config/arm/arm_mve.h (vst1q_p_u8): Define macro.
	(vst1q_p_s8): Likewise.
	(vst2q_s8): Likewise.
	(vst2q_u8): Likewise.
	(vld1q_z_u8): Likewise.
	(vld1q_z_s8): Likewise.
	(vld2q_s8): Likewise.
	(vld2q_u8): Likewise.
	(vld4q_s8): Likewise.
	(vld4q_u8): Likewise.
	(vst1q_p_u16): Likewise.
	(vst1q_p_s16): Likewise.
	(vst2q_s16): Likewise.
	(vst2q_u16): Likewise.
	(vld1q_z_u16): Likewise.
	(vld1q_z_s16): Likewise.
	(vld2q_s16): Likewise.
	(vld2q_u16): Likewise.
	(vld4q_s16): Likewise.
	(vld4q_u16): Likewise.
	(vst1q_p_u32): Likewise.
	(vst1q_p_s32): Likewise.
	(vst2q_s32): Likewise.
	(vst2q_u32): Likewise.
	(vld1q_z_u32): Likewise.
	(vld1q_z_s32): Likewise.
	(vld2q_s32): Likewise.
	(vld2q_u32): Likewise.
	(vld4q_s32): Likewise.
	(vld4q_u32): Likewise.
	(vld4q_f16): Likewise.
	(vld2q_f16): Likewise.
	(vld1q_z_f16): Likewise.
	(vst2q_f16): Likewise.
	(vst1q_p_f16): Likewise.
	(vld4q_f32): Likewise.
	(vld2q_f32): Likewise.
	(vld1q_z_f32): Likewise.
	(vst2q_f32): Likewise.
	(vst1q_p_f32): Likewise.
	(__arm_vst1q_p_u8): Define intrinsic.
	(__arm_vst1q_p_s8): Likewise.
	(__arm_vst2q_s8): Likewise.
	(__arm_vst2q_u8): Likewise.
	(__arm_vld1q_z_u8): Likewise.
	(__arm_vld1q_z_s8): Likewise.
	(__arm_vld2q_s8): Likewise.
	(__arm_vld2q_u8): Likewise.
	(__arm_vld4q_s8): Likewise.
	(__arm_vld4q_u8): Likewise.
	(__arm_vst1q_p_u16): Likewise.
	(__arm_vst1q_p_s16): Likewise.
	(__arm_vst2q_s16): Likewise.
	(__arm_vst2q_u16): Likewise.
	(__arm_vld1q_z_u16): Likewise.
	(__arm_vld1q_z_s16): Likewise.
	(__arm_vld2q_s16): Likewise.
	(__arm_vld2q_u16): Likewise.
	(__arm_vld4q_s16): Likewise.
	(__arm_vld4q_u16): Likewise.
	(__arm_vst1q_p_u32): Likewise.
	(__arm_vst1q_p_s32): Likewise.
	(__arm_vst2q_s32): Likewise.
	(__arm_vst2q_u32): Likewise.
	(__arm_vld1q_z_u32): Likewise.
	(__arm_vld1q_z_s32): Likewise.
	(__arm_vld2q_s32): Likewise.
	(__arm_vld2q_u32): Likewise.
	(__arm_vld4q_s32): Likewise.
	(__arm_vld4q_u32): Likewise.
	(__arm_vld4q_f16): Likewise.
	(__arm_vld2q_f16): Likewise.
	(__arm_vld1q_z_f16): Likewise.
	(__arm_vst2q_f16): Likewise.
	(__arm_vst1q_p_f16): Likewise.
	(__arm_vld4q_f32): Likewise.
	(__arm_vld2q_f32): Likewise.
	(__arm_vld1q_z_f32): Likewise.
	(__arm_vst2q_f32): Likewise.
	(__arm_vst1q_p_f32): Likewise.
	(vld1q_z): Define polymorphic variant.
	(vld2q): Likewise.
	(vld4q): Likewise.
	(vst1q_p): Likewise.
	(vst2q): Likewise.
	* config/arm/arm_mve_builtins.def (STORE1): Use builtin qualifier.
	(LOAD1): Likewise.
	* config/arm/mve.md (mve_vst2q<mode>): Define RTL pattern.
	(mve_vld2q<mode>): Likewise.
	(mve_vld4q<mode>): Likewise.

gcc/testsuite/ChangeLog:

2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
            Andre Vieira  <andre.simoesdiasvieira@arm.com>
            Mihail Ionescu  <mihail.ionescu@arm.com>

	* gcc.target/arm/mve/intrinsics/vld1q_z_f16.c: New test.
	* gcc.target/arm/mve/intrinsics/vld1q_z_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vld1q_z_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vld1q_z_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vld1q_z_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vld1q_z_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vld1q_z_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vld1q_z_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vld2q_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vld2q_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vld2q_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vld2q_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vld2q_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vld2q_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vld2q_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vld2q_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vld4q_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vld4q_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vld4q_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vld4q_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vld4q_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vld4q_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vld4q_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vld4q_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vst1q_p_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vst1q_p_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vst1q_p_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vst1q_p_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vst1q_p_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vst1q_p_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vst1q_p_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vst1q_p_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vst2q_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vst2q_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vst2q_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vst2q_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vst2q_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vst2q_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vst2q_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vst2q_u8.c: Likewise.
2020-03-20 16:57:45 +00:00
Iain Buclaw
b5446d0cc0 d: Fix SEGV in hash_table<odr_name_hasher, false, xcallocator>::find_slot_with_hash
This patch fixes an LTO bug with the D front-end.  Because DECL_ASSEMBLER_NAME
is set on the TYPE_DECL, TYPE_CXX_ODR_P must also be set on the type.

The addition of merge_aggregate_types is not strictly needed now, but it
fixes a problem introduced in newer versions of the dmd front-end where
templated types could be sent more than once to the D code generator.

gcc/d/ChangeLog:

2020-03-20  Iain Buclaw  <ibuclaw@gdcproject.org>

	PR lto/91027
	* d-tree.h (struct GTY): Add daggregate field.
	(IDENTIFIER_DAGGREGATE): Define.
	(d_mangle_decl): Add declaration.
	* decl.cc (mangle_decl): Remove static linkage, rename to...
	(d_mangle_decl): ...this, update all callers.
	* types.cc (merge_aggregate_types): New function.
	(TypeVisitor::visit (TypeStruct *)): Call merge_aggregate_types, set
	IDENTIFIER_DAGGREGATE and TYPE_CXX_ODR_P.
	(TypeVisitor::visit (TypeClass *)): Likewise.
2020-03-20 17:26:29 +01:00
Richard Sandiford
1aa22b1916 c-family: Tighten vector handling in type_for_mode [PR94072]
In this PR we had a 512-bit VECTOR_TYPE whose mode is XImode
(an integer mode used for four 128-bit vectors).  When trying
to expand a zero constant for it, we hit code in expand_expr_real_1
that tries to use the associated integer type instead.  The code used
type_for_mode (XImode, 1) to get this integer type.

However, the c-family implementation of type_for_mode checks for
any registered built-in type that matches the mode and has the
right signedness.  This meant that it could return a built-in
vector type when given an integer mode (particularly if, as here,
the vector type isn't supported by the current subtarget and so
TYPE_MODE != TYPE_MODE_RAW).  The expand code would then cycle
endlessly trying to use this "new" type instead of the original
vector type.
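
A rough sketch of the shape of the added check (not the verbatim patch; the
exact loop in c_common_type_for_mode may differ):

  /* Sketch only: when scanning registered_builtin_types for a type with
     the requested mode, also require that the type is a vector type if
     and only if the mode is a vector mode.  */
  for (tree t = registered_builtin_types; t; t = TREE_CHAIN (t))
    {
      tree type = TREE_VALUE (t);
      if (TYPE_MODE (type) == mode
	  && VECTOR_TYPE_P (type) == VECTOR_MODE_P (mode)
	  && !!unsignedp == !!TYPE_UNSIGNED (type))
	return type;
    }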

2020-03-20  Richard Sandiford  <richard.sandiford@arm.com>

gcc/c-family/
	PR middle-end/94072
	* c-common.c (c_common_type_for_mode): Before using a registered
	built-in type, check that the vectorness of the type matches
	the vectorness of the mode.

gcc/testsuite/
	PR middle-end/94072
	* gcc.target/aarch64/pr94072.c: New test.
2020-03-20 16:14:56 +00:00
Srinath Parvathaneni
c3562f8104 [ARM][GCC][10x]: MVE ACLE intrinsics "add with carry across beats" and "beat-wise subtract".
This patch supports the following MVE ACLE "add with carry across beats" and "beat-wise subtract" intrinsics.

vadciq_s32, vadciq_u32, vadciq_m_s32, vadciq_m_u32, vadcq_s32, vadcq_u32, vadcq_m_s32, vadcq_m_u32, vsbciq_s32, vsbciq_u32, vsbciq_m_s32, vsbciq_m_u32, vsbcq_s32, vsbcq_u32, vsbcq_m_s32, vsbcq_m_u32.

Please refer to the M-profile Vector Extension (MVE) intrinsics documentation [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
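
As an illustration (not part of the patch), the carry-chaining forms can be
composed for multi-word arithmetic; the helper name, data layout, and the
assumption that the carry value written by vadciq can be fed straight back
into vadcq are mine:

  #include <arm_mve.h>

  /* Add two 256-bit integers, each held as two uint32x4_t vectors with the
     least-significant 128 bits first; the carry propagates across beats
     within each vadciq/vadcq and between the two halves via *carry.  */
  void add256 (const uint32x4_t a[2], const uint32x4_t b[2], uint32x4_t r[2])
  {
    unsigned carry;
    r[0] = vadciq_u32 (a[0], b[0], &carry); /* carry-in is zero, carry-out written */
    r[1] = vadcq_u32 (a[1], b[1], &carry);  /* consumes and updates the carry */
  }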

2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
            Andre Vieira  <andre.simoesdiasvieira@arm.com>
            Mihail Ionescu  <mihail.ionescu@arm.com>

	* config/arm/arm-builtins.c (ARM_BUILTIN_GET_FPSCR_NZCVQC): Define.
	(ARM_BUILTIN_SET_FPSCR_NZCVQC): Likewise.
	(arm_init_mve_builtins): Add "__builtin_arm_get_fpscr_nzcvqc" and
	"__builtin_arm_set_fpscr_nzcvqc" to arm_builtin_decls array.
	(arm_expand_builtin): Define case ARM_BUILTIN_GET_FPSCR_NZCVQC
	and ARM_BUILTIN_SET_FPSCR_NZCVQC.
	* config/arm/arm_mve.h (vadciq_s32): Define macro.
	(vadciq_u32): Likewise.
	(vadciq_m_s32): Likewise.
	(vadciq_m_u32): Likewise.
	(vadcq_s32): Likewise.
	(vadcq_u32): Likewise.
	(vadcq_m_s32): Likewise.
	(vadcq_m_u32): Likewise.
	(vsbciq_s32): Likewise.
	(vsbciq_u32): Likewise.
	(vsbciq_m_s32): Likewise.
	(vsbciq_m_u32): Likewise.
	(vsbcq_s32): Likewise.
	(vsbcq_u32): Likewise.
	(vsbcq_m_s32): Likewise.
	(vsbcq_m_u32): Likewise.
	(__arm_vadciq_s32): Define intrinsic.
	(__arm_vadciq_u32): Likewise.
	(__arm_vadciq_m_s32): Likewise.
	(__arm_vadciq_m_u32): Likewise.
	(__arm_vadcq_s32): Likewise.
	(__arm_vadcq_u32): Likewise.
	(__arm_vadcq_m_s32): Likewise.
	(__arm_vadcq_m_u32): Likewise.
	(__arm_vsbciq_s32): Likewise.
	(__arm_vsbciq_u32): Likewise.
	(__arm_vsbciq_m_s32): Likewise.
	(__arm_vsbciq_m_u32): Likewise.
	(__arm_vsbcq_s32): Likewise.
	(__arm_vsbcq_u32): Likewise.
	(__arm_vsbcq_m_s32): Likewise.
	(__arm_vsbcq_m_u32): Likewise.
	(vadciq_m): Define polymorphic variant.
	(vadciq): Likewise.
	(vadcq_m): Likewise.
	(vadcq): Likewise.
	(vsbciq_m): Likewise.
	(vsbciq): Likewise.
	(vsbcq_m): Likewise.
	(vsbcq): Likewise.
	* config/arm/arm_mve_builtins.def (BINOP_NONE_NONE_NONE): Use builtin
	qualifier.
	(BINOP_UNONE_UNONE_UNONE): Likewise.
	(QUADOP_NONE_NONE_NONE_NONE_UNONE): Likewise.
	(QUADOP_UNONE_UNONE_UNONE_UNONE_UNONE): Likewise.
	* config/arm/mve.md (VADCIQ): Define iterator.
	(VADCIQ_M): Likewise.
	(VSBCQ): Likewise.
	(VSBCQ_M): Likewise.
	(VSBCIQ): Likewise.
	(VSBCIQ_M): Likewise.
	(VADCQ): Likewise.
	(VADCQ_M): Likewise.
	(mve_vadciq_m_<supf>v4si): Define RTL pattern.
	(mve_vadciq_<supf>v4si): Likewise.
	(mve_vadcq_m_<supf>v4si): Likewise.
	(mve_vadcq_<supf>v4si): Likewise.
	(mve_vsbciq_m_<supf>v4si): Likewise.
	(mve_vsbciq_<supf>v4si): Likewise.
	(mve_vsbcq_m_<supf>v4si): Likewise.
	(mve_vsbcq_<supf>v4si): Likewise.
	(get_fpscr_nzcvqc): Define insn.
	(set_fpscr_nzcvqc): Define insn.
	* config/arm/unspecs.md (UNSPEC_GET_FPSCR_NZCVQC): Define.
	(UNSPEC_SET_FPSCR_NZCVQC): Define.

gcc/testsuite/ChangeLog:

2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
            Andre Vieira  <andre.simoesdiasvieira@arm.com>
            Mihail Ionescu  <mihail.ionescu@arm.com>

	* gcc.target/arm/mve/intrinsics/vadciq_m_s32.c: New test.
	* gcc.target/arm/mve/intrinsics/vadciq_m_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vadciq_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vadciq_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vadcq_m_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vadcq_m_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vadcq_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vadcq_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vsbciq_m_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vsbciq_m_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vsbciq_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vsbciq_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vsbcq_m_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vsbcq_m_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vsbcq_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vsbcq_u32.c: Likewise.
2020-03-20 16:10:01 +00:00
Patrick Palka
828878c35c c++: Include the constraint parameter mapping in diagnostic constraint contexts
When diagnosing a constraint error, we currently try to print the constraint
inside a diagnostic constraint context with its template arguments substituted
in.  If substitution fails, then we instead just print the dependent form, as in
the test case below:

  .../diagnostic6.C:14:15: error: static assertion failed
     14 | static_assert(E<int>); // { dg-error "static assertion failed|not a class" }
        |               ^~~~~~
  .../diagnostic6.C:14:15: note: constraints not satisfied
  .../diagnostic6.C:4:11:   required for the satisfaction of ‘C<T>’
  .../diagnostic6.C:8:11:   required for the satisfaction of ‘D<typename T::type>’
  .../diagnostic6.C:14:15: error: ‘int’ is not a class, struct, or union type

But printing just the dependent form sometimes makes it difficult to understand
the underlying failure.  In the above example, for instance, there's no
indication of how the template argument 'int' relates to either of the 'T's.

This patch improves the situation by changing these diagnostics to always print
the dependent form of the constraint, and alongside it the (preferably
substituted) constraint parameter mapping.  So with the same test case below we
now get:

  .../diagnostic6.C:14:15: error: static assertion failed
     14 | static_assert(E<int>); // { dg-error "static assertion failed|not a class" }
        |               ^~~~~~
  .../diagnostic6.C:14:15: note: constraints not satisfied
  .../diagnostic6.C:4:11:   required for the satisfaction of ‘C<T>’ [with T = typename T::type]
  .../diagnostic6.C:8:11:   required for the satisfaction of ‘D<typename T::type>’ [with T = int]
  .../diagnostic6.C:14:15: error: ‘int’ is not a class, struct, or union type

This change arguably makes it easier to figure out what's going on whenever a
constraint fails due to substitution creating an invalid type rather than
failing due to the constraint evaluating to false.
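
For reference, a hypothetical reduction in the spirit of the new test (the
committed diagnostic6.C may differ in its exact contents) that produces
diagnostics like the ones quoted above:

  template<typename T>
    concept C = __is_class(T);

  template<typename T>
    concept D = C<T>;

  template<typename T>
    concept E = D<typename T::type>;

  static_assert(E<int>); // fails: substituting T = int into 'typename T::type' is invalid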

gcc/cp/ChangeLog:

	* cxx-pretty-print.c (pp_cxx_parameter_mapping): Make extern.  Move
	the "[with ]" bits to here from ...
	(pp_cxx_atomic_constraint): ... here.
	* cxx-pretty-print.h (pp_cxx_parameter_mapping): Declare.
	* error.c (rebuild_concept_check): Delete.
	(print_concept_check_info): Print the dependent form of the constraint and the
	preferably substituted parameter mapping alongside it.

gcc/testsuite/ChangeLog:

	* g++.dg/concepts/diagnostic6.C: New test.
2020-03-20 10:19:54 -04:00
Srinath Parvathaneni
261014a1be [ARM][GCC][9x]: MVE ACLE predicated intrinsics with the (don't-care) variant.
This patch supports the following MVE ACLE predicated intrinsics with the `_x` (don't-care) variant.
* ``_x`` (don't-care) indicates that the false-predicated lanes have undefined values.
These are syntactic sugar for merge intrinsics with a ``vuninitializedq`` inactive parameter, as sketched below.
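
A minimal sketch of that equivalence, using vaddq as the example (illustrative
only; the helper names are mine):

  #include <arm_mve.h>

  int32x4_t add_x (int32x4_t a, int32x4_t b, mve_pred16_t p)
  {
    /* Don't-care variant: false-predicated lanes are undefined.  */
    return vaddq_x_s32 (a, b, p);
  }

  int32x4_t add_m_equivalent (int32x4_t a, int32x4_t b, mve_pred16_t p)
  {
    /* Equivalent merge form with an uninitialized inactive argument.  */
    return vaddq_m_s32 (vuninitializedq_s32 (), a, b, p);
  }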

vabdq_x_f16, vabdq_x_f32, vabdq_x_s16, vabdq_x_s32, vabdq_x_s8, vabdq_x_u16, vabdq_x_u32, vabdq_x_u8,
vabsq_x_f16, vabsq_x_f32, vabsq_x_s16, vabsq_x_s32, vabsq_x_s8, vaddq_x_f16, vaddq_x_f32, vaddq_x_n_f16,
vaddq_x_n_f32, vaddq_x_n_s16, vaddq_x_n_s32, vaddq_x_n_s8, vaddq_x_n_u16, vaddq_x_n_u32, vaddq_x_n_u8,
vaddq_x_s16, vaddq_x_s32, vaddq_x_s8, vaddq_x_u16, vaddq_x_u32, vaddq_x_u8, vandq_x_f16, vandq_x_f32,
vandq_x_s16, vandq_x_s32, vandq_x_s8, vandq_x_u16, vandq_x_u32, vandq_x_u8, vbicq_x_f16, vbicq_x_f32,
vbicq_x_s16, vbicq_x_s32, vbicq_x_s8, vbicq_x_u16, vbicq_x_u32, vbicq_x_u8, vbrsrq_x_n_f16,
vbrsrq_x_n_f32, vbrsrq_x_n_s16, vbrsrq_x_n_s32, vbrsrq_x_n_s8, vbrsrq_x_n_u16, vbrsrq_x_n_u32,
vbrsrq_x_n_u8, vcaddq_rot270_x_f16, vcaddq_rot270_x_f32, vcaddq_rot270_x_s16, vcaddq_rot270_x_s32,
vcaddq_rot270_x_s8, vcaddq_rot270_x_u16, vcaddq_rot270_x_u32, vcaddq_rot270_x_u8, vcaddq_rot90_x_f16,
vcaddq_rot90_x_f32, vcaddq_rot90_x_s16, vcaddq_rot90_x_s32, vcaddq_rot90_x_s8, vcaddq_rot90_x_u16,
vcaddq_rot90_x_u32, vcaddq_rot90_x_u8, vclsq_x_s16, vclsq_x_s32, vclsq_x_s8, vclzq_x_s16, vclzq_x_s32,
vclzq_x_s8, vclzq_x_u16, vclzq_x_u32, vclzq_x_u8, vcmulq_rot180_x_f16, vcmulq_rot180_x_f32,
vcmulq_rot270_x_f16, vcmulq_rot270_x_f32, vcmulq_rot90_x_f16, vcmulq_rot90_x_f32, vcmulq_x_f16,
vcmulq_x_f32, vcvtaq_x_s16_f16, vcvtaq_x_s32_f32, vcvtaq_x_u16_f16, vcvtaq_x_u32_f32, vcvtbq_x_f32_f16,
vcvtmq_x_s16_f16, vcvtmq_x_s32_f32, vcvtmq_x_u16_f16, vcvtmq_x_u32_f32, vcvtnq_x_s16_f16,
vcvtnq_x_s32_f32, vcvtnq_x_u16_f16, vcvtnq_x_u32_f32, vcvtpq_x_s16_f16, vcvtpq_x_s32_f32,
vcvtpq_x_u16_f16, vcvtpq_x_u32_f32, vcvtq_x_f16_s16, vcvtq_x_f16_u16, vcvtq_x_f32_s32, vcvtq_x_f32_u32,
vcvtq_x_n_f16_s16, vcvtq_x_n_f16_u16, vcvtq_x_n_f32_s32, vcvtq_x_n_f32_u32, vcvtq_x_n_s16_f16,
vcvtq_x_n_s32_f32, vcvtq_x_n_u16_f16, vcvtq_x_n_u32_f32, vcvtq_x_s16_f16, vcvtq_x_s32_f32,
vcvtq_x_u16_f16, vcvtq_x_u32_f32, vcvttq_x_f32_f16, vddupq_x_n_u16, vddupq_x_n_u32, vddupq_x_n_u8,
vddupq_x_wb_u16, vddupq_x_wb_u32, vddupq_x_wb_u8, vdupq_x_n_f16, vdupq_x_n_f32, vdupq_x_n_s16,
vdupq_x_n_s32, vdupq_x_n_s8, vdupq_x_n_u16, vdupq_x_n_u32, vdupq_x_n_u8, vdwdupq_x_n_u16, vdwdupq_x_n_u32,
vdwdupq_x_n_u8, vdwdupq_x_wb_u16, vdwdupq_x_wb_u32, vdwdupq_x_wb_u8, veorq_x_f16, veorq_x_f32, veorq_x_s16,
veorq_x_s32, veorq_x_s8, veorq_x_u16, veorq_x_u32, veorq_x_u8, vhaddq_x_n_s16, vhaddq_x_n_s32,
vhaddq_x_n_s8, vhaddq_x_n_u16, vhaddq_x_n_u32, vhaddq_x_n_u8, vhaddq_x_s16, vhaddq_x_s32, vhaddq_x_s8,
vhaddq_x_u16, vhaddq_x_u32, vhaddq_x_u8, vhcaddq_rot270_x_s16, vhcaddq_rot270_x_s32, vhcaddq_rot270_x_s8,
vhcaddq_rot90_x_s16, vhcaddq_rot90_x_s32, vhcaddq_rot90_x_s8, vhsubq_x_n_s16, vhsubq_x_n_s32,
vhsubq_x_n_s8, vhsubq_x_n_u16, vhsubq_x_n_u32, vhsubq_x_n_u8, vhsubq_x_s16, vhsubq_x_s32, vhsubq_x_s8,
vhsubq_x_u16, vhsubq_x_u32, vhsubq_x_u8, vidupq_x_n_u16, vidupq_x_n_u32, vidupq_x_n_u8, vidupq_x_wb_u16,
vidupq_x_wb_u32, vidupq_x_wb_u8, viwdupq_x_n_u16, viwdupq_x_n_u32, viwdupq_x_n_u8, viwdupq_x_wb_u16,
viwdupq_x_wb_u32, viwdupq_x_wb_u8, vmaxnmq_x_f16, vmaxnmq_x_f32, vmaxq_x_s16, vmaxq_x_s32, vmaxq_x_s8,
vmaxq_x_u16, vmaxq_x_u32, vmaxq_x_u8, vminnmq_x_f16, vminnmq_x_f32, vminq_x_s16, vminq_x_s32, vminq_x_s8,
vminq_x_u16, vminq_x_u32, vminq_x_u8, vmovlbq_x_s16, vmovlbq_x_s8, vmovlbq_x_u16, vmovlbq_x_u8,
vmovltq_x_s16, vmovltq_x_s8, vmovltq_x_u16, vmovltq_x_u8, vmulhq_x_s16, vmulhq_x_s32, vmulhq_x_s8,
vmulhq_x_u16, vmulhq_x_u32, vmulhq_x_u8, vmullbq_int_x_s16, vmullbq_int_x_s32, vmullbq_int_x_s8,
vmullbq_int_x_u16, vmullbq_int_x_u32, vmullbq_int_x_u8, vmullbq_poly_x_p16, vmullbq_poly_x_p8,
vmulltq_int_x_s16, vmulltq_int_x_s32, vmulltq_int_x_s8, vmulltq_int_x_u16, vmulltq_int_x_u32,
vmulltq_int_x_u8, vmulltq_poly_x_p16, vmulltq_poly_x_p8, vmulq_x_f16, vmulq_x_f32, vmulq_x_n_f16,
vmulq_x_n_f32, vmulq_x_n_s16, vmulq_x_n_s32, vmulq_x_n_s8, vmulq_x_n_u16, vmulq_x_n_u32, vmulq_x_n_u8,
vmulq_x_s16, vmulq_x_s32, vmulq_x_s8, vmulq_x_u16, vmulq_x_u32, vmulq_x_u8, vmvnq_x_n_s16, vmvnq_x_n_s32,
vmvnq_x_n_u16, vmvnq_x_n_u32, vmvnq_x_s16, vmvnq_x_s32, vmvnq_x_s8, vmvnq_x_u16, vmvnq_x_u32, vmvnq_x_u8,
vnegq_x_f16, vnegq_x_f32, vnegq_x_s16, vnegq_x_s32, vnegq_x_s8, vornq_x_f16, vornq_x_f32, vornq_x_s16,
vornq_x_s32, vornq_x_s8, vornq_x_u16, vornq_x_u32, vornq_x_u8, vorrq_x_f16, vorrq_x_f32, vorrq_x_s16,
vorrq_x_s32, vorrq_x_s8, vorrq_x_u16, vorrq_x_u32, vorrq_x_u8, vrev16q_x_s8, vrev16q_x_u8, vrev32q_x_f16,
vrev32q_x_s16, vrev32q_x_s8, vrev32q_x_u16, vrev32q_x_u8, vrev64q_x_f16, vrev64q_x_f32, vrev64q_x_s16,
vrev64q_x_s32, vrev64q_x_s8, vrev64q_x_u16, vrev64q_x_u32, vrev64q_x_u8, vrhaddq_x_s16, vrhaddq_x_s32,
vrhaddq_x_s8, vrhaddq_x_u16, vrhaddq_x_u32, vrhaddq_x_u8, vrmulhq_x_s16, vrmulhq_x_s32, vrmulhq_x_s8,
vrmulhq_x_u16, vrmulhq_x_u32, vrmulhq_x_u8, vrndaq_x_f16, vrndaq_x_f32, vrndmq_x_f16, vrndmq_x_f32,
vrndnq_x_f16, vrndnq_x_f32, vrndpq_x_f16, vrndpq_x_f32, vrndq_x_f16, vrndq_x_f32, vrndxq_x_f16,
vrndxq_x_f32, vrshlq_x_s16, vrshlq_x_s32, vrshlq_x_s8, vrshlq_x_u16, vrshlq_x_u32, vrshlq_x_u8,
vrshrq_x_n_s16, vrshrq_x_n_s32, vrshrq_x_n_s8, vrshrq_x_n_u16, vrshrq_x_n_u32, vrshrq_x_n_u8,
vshllbq_x_n_s16, vshllbq_x_n_s8, vshllbq_x_n_u16, vshllbq_x_n_u8, vshlltq_x_n_s16, vshlltq_x_n_s8,
vshlltq_x_n_u16, vshlltq_x_n_u8, vshlq_x_n_s16, vshlq_x_n_s32, vshlq_x_n_s8, vshlq_x_n_u16, vshlq_x_n_u32,
vshlq_x_n_u8, vshlq_x_s16, vshlq_x_s32, vshlq_x_s8, vshlq_x_u16, vshlq_x_u32, vshlq_x_u8, vshrq_x_n_s16,
vshrq_x_n_s32, vshrq_x_n_s8, vshrq_x_n_u16, vshrq_x_n_u32, vshrq_x_n_u8, vsubq_x_f16, vsubq_x_f32,
vsubq_x_n_f16, vsubq_x_n_f32, vsubq_x_n_s16, vsubq_x_n_s32, vsubq_x_n_s8, vsubq_x_n_u16, vsubq_x_n_u32,
vsubq_x_n_u8, vsubq_x_s16, vsubq_x_s32, vsubq_x_s8, vsubq_x_u16, vsubq_x_u32, vsubq_x_u8.

Please refer to the M-profile Vector Extension (MVE) intrinsics documentation [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics

2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>

	* config/arm/arm_mve.h (vddupq_x_n_u8): Define macro.
	(vddupq_x_n_u16): Likewise.
	(vddupq_x_n_u32): Likewise.
	(vddupq_x_wb_u8): Likewise.
	(vddupq_x_wb_u16): Likewise.
	(vddupq_x_wb_u32): Likewise.
	(vdwdupq_x_n_u8): Likewise.
	(vdwdupq_x_n_u16): Likewise.
	(vdwdupq_x_n_u32): Likewise.
	(vdwdupq_x_wb_u8): Likewise.
	(vdwdupq_x_wb_u16): Likewise.
	(vdwdupq_x_wb_u32): Likewise.
	(vidupq_x_n_u8): Likewise.
	(vidupq_x_n_u16): Likewise.
	(vidupq_x_n_u32): Likewise.
	(vidupq_x_wb_u8): Likewise.
	(vidupq_x_wb_u16): Likewise.
	(vidupq_x_wb_u32): Likewise.
	(viwdupq_x_n_u8): Likewise.
	(viwdupq_x_n_u16): Likewise.
	(viwdupq_x_n_u32): Likewise.
	(viwdupq_x_wb_u8): Likewise.
	(viwdupq_x_wb_u16): Likewise.
	(viwdupq_x_wb_u32): Likewise.
	(vdupq_x_n_s8): Likewise.
	(vdupq_x_n_s16): Likewise.
	(vdupq_x_n_s32): Likewise.
	(vdupq_x_n_u8): Likewise.
	(vdupq_x_n_u16): Likewise.
	(vdupq_x_n_u32): Likewise.
	(vminq_x_s8): Likewise.
	(vminq_x_s16): Likewise.
	(vminq_x_s32): Likewise.
	(vminq_x_u8): Likewise.
	(vminq_x_u16): Likewise.
	(vminq_x_u32): Likewise.
	(vmaxq_x_s8): Likewise.
	(vmaxq_x_s16): Likewise.
	(vmaxq_x_s32): Likewise.
	(vmaxq_x_u8): Likewise.
	(vmaxq_x_u16): Likewise.
	(vmaxq_x_u32): Likewise.
	(vabdq_x_s8): Likewise.
	(vabdq_x_s16): Likewise.
	(vabdq_x_s32): Likewise.
	(vabdq_x_u8): Likewise.
	(vabdq_x_u16): Likewise.
	(vabdq_x_u32): Likewise.
	(vabsq_x_s8): Likewise.
	(vabsq_x_s16): Likewise.
	(vabsq_x_s32): Likewise.
	(vaddq_x_s8): Likewise.
	(vaddq_x_s16): Likewise.
	(vaddq_x_s32): Likewise.
	(vaddq_x_n_s8): Likewise.
	(vaddq_x_n_s16): Likewise.
	(vaddq_x_n_s32): Likewise.
	(vaddq_x_u8): Likewise.
	(vaddq_x_u16): Likewise.
	(vaddq_x_u32): Likewise.
	(vaddq_x_n_u8): Likewise.
	(vaddq_x_n_u16): Likewise.
	(vaddq_x_n_u32): Likewise.
	(vclsq_x_s8): Likewise.
	(vclsq_x_s16): Likewise.
	(vclsq_x_s32): Likewise.
	(vclzq_x_s8): Likewise.
	(vclzq_x_s16): Likewise.
	(vclzq_x_s32): Likewise.
	(vclzq_x_u8): Likewise.
	(vclzq_x_u16): Likewise.
	(vclzq_x_u32): Likewise.
	(vnegq_x_s8): Likewise.
	(vnegq_x_s16): Likewise.
	(vnegq_x_s32): Likewise.
	(vmulhq_x_s8): Likewise.
	(vmulhq_x_s16): Likewise.
	(vmulhq_x_s32): Likewise.
	(vmulhq_x_u8): Likewise.
	(vmulhq_x_u16): Likewise.
	(vmulhq_x_u32): Likewise.
	(vmullbq_poly_x_p8): Likewise.
	(vmullbq_poly_x_p16): Likewise.
	(vmullbq_int_x_s8): Likewise.
	(vmullbq_int_x_s16): Likewise.
	(vmullbq_int_x_s32): Likewise.
	(vmullbq_int_x_u8): Likewise.
	(vmullbq_int_x_u16): Likewise.
	(vmullbq_int_x_u32): Likewise.
	(vmulltq_poly_x_p8): Likewise.
	(vmulltq_poly_x_p16): Likewise.
	(vmulltq_int_x_s8): Likewise.
	(vmulltq_int_x_s16): Likewise.
	(vmulltq_int_x_s32): Likewise.
	(vmulltq_int_x_u8): Likewise.
	(vmulltq_int_x_u16): Likewise.
	(vmulltq_int_x_u32): Likewise.
	(vmulq_x_s8): Likewise.
	(vmulq_x_s16): Likewise.
	(vmulq_x_s32): Likewise.
	(vmulq_x_n_s8): Likewise.
	(vmulq_x_n_s16): Likewise.
	(vmulq_x_n_s32): Likewise.
	(vmulq_x_u8): Likewise.
	(vmulq_x_u16): Likewise.
	(vmulq_x_u32): Likewise.
	(vmulq_x_n_u8): Likewise.
	(vmulq_x_n_u16): Likewise.
	(vmulq_x_n_u32): Likewise.
	(vsubq_x_s8): Likewise.
	(vsubq_x_s16): Likewise.
	(vsubq_x_s32): Likewise.
	(vsubq_x_n_s8): Likewise.
	(vsubq_x_n_s16): Likewise.
	(vsubq_x_n_s32): Likewise.
	(vsubq_x_u8): Likewise.
	(vsubq_x_u16): Likewise.
	(vsubq_x_u32): Likewise.
	(vsubq_x_n_u8): Likewise.
	(vsubq_x_n_u16): Likewise.
	(vsubq_x_n_u32): Likewise.
	(vcaddq_rot90_x_s8): Likewise.
	(vcaddq_rot90_x_s16): Likewise.
	(vcaddq_rot90_x_s32): Likewise.
	(vcaddq_rot90_x_u8): Likewise.
	(vcaddq_rot90_x_u16): Likewise.
	(vcaddq_rot90_x_u32): Likewise.
	(vcaddq_rot270_x_s8): Likewise.
	(vcaddq_rot270_x_s16): Likewise.
	(vcaddq_rot270_x_s32): Likewise.
	(vcaddq_rot270_x_u8): Likewise.
	(vcaddq_rot270_x_u16): Likewise.
	(vcaddq_rot270_x_u32): Likewise.
	(vhaddq_x_n_s8): Likewise.
	(vhaddq_x_n_s16): Likewise.
	(vhaddq_x_n_s32): Likewise.
	(vhaddq_x_n_u8): Likewise.
	(vhaddq_x_n_u16): Likewise.
	(vhaddq_x_n_u32): Likewise.
	(vhaddq_x_s8): Likewise.
	(vhaddq_x_s16): Likewise.
	(vhaddq_x_s32): Likewise.
	(vhaddq_x_u8): Likewise.
	(vhaddq_x_u16): Likewise.
	(vhaddq_x_u32): Likewise.
	(vhcaddq_rot90_x_s8): Likewise.
	(vhcaddq_rot90_x_s16): Likewise.
	(vhcaddq_rot90_x_s32): Likewise.
	(vhcaddq_rot270_x_s8): Likewise.
	(vhcaddq_rot270_x_s16): Likewise.
	(vhcaddq_rot270_x_s32): Likewise.
	(vhsubq_x_n_s8): Likewise.
	(vhsubq_x_n_s16): Likewise.
	(vhsubq_x_n_s32): Likewise.
	(vhsubq_x_n_u8): Likewise.
	(vhsubq_x_n_u16): Likewise.
	(vhsubq_x_n_u32): Likewise.
	(vhsubq_x_s8): Likewise.
	(vhsubq_x_s16): Likewise.
	(vhsubq_x_s32): Likewise.
	(vhsubq_x_u8): Likewise.
	(vhsubq_x_u16): Likewise.
	(vhsubq_x_u32): Likewise.
	(vrhaddq_x_s8): Likewise.
	(vrhaddq_x_s16): Likewise.
	(vrhaddq_x_s32): Likewise.
	(vrhaddq_x_u8): Likewise.
	(vrhaddq_x_u16): Likewise.
	(vrhaddq_x_u32): Likewise.
	(vrmulhq_x_s8): Likewise.
	(vrmulhq_x_s16): Likewise.
	(vrmulhq_x_s32): Likewise.
	(vrmulhq_x_u8): Likewise.
	(vrmulhq_x_u16): Likewise.
	(vrmulhq_x_u32): Likewise.
	(vandq_x_s8): Likewise.
	(vandq_x_s16): Likewise.
	(vandq_x_s32): Likewise.
	(vandq_x_u8): Likewise.
	(vandq_x_u16): Likewise.
	(vandq_x_u32): Likewise.
	(vbicq_x_s8): Likewise.
	(vbicq_x_s16): Likewise.
	(vbicq_x_s32): Likewise.
	(vbicq_x_u8): Likewise.
	(vbicq_x_u16): Likewise.
	(vbicq_x_u32): Likewise.
	(vbrsrq_x_n_s8): Likewise.
	(vbrsrq_x_n_s16): Likewise.
	(vbrsrq_x_n_s32): Likewise.
	(vbrsrq_x_n_u8): Likewise.
	(vbrsrq_x_n_u16): Likewise.
	(vbrsrq_x_n_u32): Likewise.
	(veorq_x_s8): Likewise.
	(veorq_x_s16): Likewise.
	(veorq_x_s32): Likewise.
	(veorq_x_u8): Likewise.
	(veorq_x_u16): Likewise.
	(veorq_x_u32): Likewise.
	(vmovlbq_x_s8): Likewise.
	(vmovlbq_x_s16): Likewise.
	(vmovlbq_x_u8): Likewise.
	(vmovlbq_x_u16): Likewise.
	(vmovltq_x_s8): Likewise.
	(vmovltq_x_s16): Likewise.
	(vmovltq_x_u8): Likewise.
	(vmovltq_x_u16): Likewise.
	(vmvnq_x_s8): Likewise.
	(vmvnq_x_s16): Likewise.
	(vmvnq_x_s32): Likewise.
	(vmvnq_x_u8): Likewise.
	(vmvnq_x_u16): Likewise.
	(vmvnq_x_u32): Likewise.
	(vmvnq_x_n_s16): Likewise.
	(vmvnq_x_n_s32): Likewise.
	(vmvnq_x_n_u16): Likewise.
	(vmvnq_x_n_u32): Likewise.
	(vornq_x_s8): Likewise.
	(vornq_x_s16): Likewise.
	(vornq_x_s32): Likewise.
	(vornq_x_u8): Likewise.
	(vornq_x_u16): Likewise.
	(vornq_x_u32): Likewise.
	(vorrq_x_s8): Likewise.
	(vorrq_x_s16): Likewise.
	(vorrq_x_s32): Likewise.
	(vorrq_x_u8): Likewise.
	(vorrq_x_u16): Likewise.
	(vorrq_x_u32): Likewise.
	(vrev16q_x_s8): Likewise.
	(vrev16q_x_u8): Likewise.
	(vrev32q_x_s8): Likewise.
	(vrev32q_x_s16): Likewise.
	(vrev32q_x_u8): Likewise.
	(vrev32q_x_u16): Likewise.
	(vrev64q_x_s8): Likewise.
	(vrev64q_x_s16): Likewise.
	(vrev64q_x_s32): Likewise.
	(vrev64q_x_u8): Likewise.
	(vrev64q_x_u16): Likewise.
	(vrev64q_x_u32): Likewise.
	(vrshlq_x_s8): Likewise.
	(vrshlq_x_s16): Likewise.
	(vrshlq_x_s32): Likewise.
	(vrshlq_x_u8): Likewise.
	(vrshlq_x_u16): Likewise.
	(vrshlq_x_u32): Likewise.
	(vshllbq_x_n_s8): Likewise.
	(vshllbq_x_n_s16): Likewise.
	(vshllbq_x_n_u8): Likewise.
	(vshllbq_x_n_u16): Likewise.
	(vshlltq_x_n_s8): Likewise.
	(vshlltq_x_n_s16): Likewise.
	(vshlltq_x_n_u8): Likewise.
	(vshlltq_x_n_u16): Likewise.
	(vshlq_x_s8): Likewise.
	(vshlq_x_s16): Likewise.
	(vshlq_x_s32): Likewise.
	(vshlq_x_u8): Likewise.
	(vshlq_x_u16): Likewise.
	(vshlq_x_u32): Likewise.
	(vshlq_x_n_s8): Likewise.
	(vshlq_x_n_s16): Likewise.
	(vshlq_x_n_s32): Likewise.
	(vshlq_x_n_u8): Likewise.
	(vshlq_x_n_u16): Likewise.
	(vshlq_x_n_u32): Likewise.
	(vrshrq_x_n_s8): Likewise.
	(vrshrq_x_n_s16): Likewise.
	(vrshrq_x_n_s32): Likewise.
	(vrshrq_x_n_u8): Likewise.
	(vrshrq_x_n_u16): Likewise.
	(vrshrq_x_n_u32): Likewise.
	(vshrq_x_n_s8): Likewise.
	(vshrq_x_n_s16): Likewise.
	(vshrq_x_n_s32): Likewise.
	(vshrq_x_n_u8): Likewise.
	(vshrq_x_n_u16): Likewise.
	(vshrq_x_n_u32): Likewise.
	(vdupq_x_n_f16): Likewise.
	(vdupq_x_n_f32): Likewise.
	(vminnmq_x_f16): Likewise.
	(vminnmq_x_f32): Likewise.
	(vmaxnmq_x_f16): Likewise.
	(vmaxnmq_x_f32): Likewise.
	(vabdq_x_f16): Likewise.
	(vabdq_x_f32): Likewise.
	(vabsq_x_f16): Likewise.
	(vabsq_x_f32): Likewise.
	(vaddq_x_f16): Likewise.
	(vaddq_x_f32): Likewise.
	(vaddq_x_n_f16): Likewise.
	(vaddq_x_n_f32): Likewise.
	(vnegq_x_f16): Likewise.
	(vnegq_x_f32): Likewise.
	(vmulq_x_f16): Likewise.
	(vmulq_x_f32): Likewise.
	(vmulq_x_n_f16): Likewise.
	(vmulq_x_n_f32): Likewise.
	(vsubq_x_f16): Likewise.
	(vsubq_x_f32): Likewise.
	(vsubq_x_n_f16): Likewise.
	(vsubq_x_n_f32): Likewise.
	(vcaddq_rot90_x_f16): Likewise.
	(vcaddq_rot90_x_f32): Likewise.
	(vcaddq_rot270_x_f16): Likewise.
	(vcaddq_rot270_x_f32): Likewise.
	(vcmulq_x_f16): Likewise.
	(vcmulq_x_f32): Likewise.
	(vcmulq_rot90_x_f16): Likewise.
	(vcmulq_rot90_x_f32): Likewise.
	(vcmulq_rot180_x_f16): Likewise.
	(vcmulq_rot180_x_f32): Likewise.
	(vcmulq_rot270_x_f16): Likewise.
	(vcmulq_rot270_x_f32): Likewise.
	(vcvtaq_x_s16_f16): Likewise.
	(vcvtaq_x_s32_f32): Likewise.
	(vcvtaq_x_u16_f16): Likewise.
	(vcvtaq_x_u32_f32): Likewise.
	(vcvtnq_x_s16_f16): Likewise.
	(vcvtnq_x_s32_f32): Likewise.
	(vcvtnq_x_u16_f16): Likewise.
	(vcvtnq_x_u32_f32): Likewise.
	(vcvtpq_x_s16_f16): Likewise.
	(vcvtpq_x_s32_f32): Likewise.
	(vcvtpq_x_u16_f16): Likewise.
	(vcvtpq_x_u32_f32): Likewise.
	(vcvtmq_x_s16_f16): Likewise.
	(vcvtmq_x_s32_f32): Likewise.
	(vcvtmq_x_u16_f16): Likewise.
	(vcvtmq_x_u32_f32): Likewise.
	(vcvtbq_x_f32_f16): Likewise.
	(vcvttq_x_f32_f16): Likewise.
	(vcvtq_x_f16_u16): Likewise.
	(vcvtq_x_f16_s16): Likewise.
	(vcvtq_x_f32_s32): Likewise.
	(vcvtq_x_f32_u32): Likewise.
	(vcvtq_x_n_f16_s16): Likewise.
	(vcvtq_x_n_f16_u16): Likewise.
	(vcvtq_x_n_f32_s32): Likewise.
	(vcvtq_x_n_f32_u32): Likewise.
	(vcvtq_x_s16_f16): Likewise.
	(vcvtq_x_s32_f32): Likewise.
	(vcvtq_x_u16_f16): Likewise.
	(vcvtq_x_u32_f32): Likewise.
	(vcvtq_x_n_s16_f16): Likewise.
	(vcvtq_x_n_s32_f32): Likewise.
	(vcvtq_x_n_u16_f16): Likewise.
	(vcvtq_x_n_u32_f32): Likewise.
	(vrndq_x_f16): Likewise.
	(vrndq_x_f32): Likewise.
	(vrndnq_x_f16): Likewise.
	(vrndnq_x_f32): Likewise.
	(vrndmq_x_f16): Likewise.
	(vrndmq_x_f32): Likewise.
	(vrndpq_x_f16): Likewise.
	(vrndpq_x_f32): Likewise.
	(vrndaq_x_f16): Likewise.
	(vrndaq_x_f32): Likewise.
	(vrndxq_x_f16): Likewise.
	(vrndxq_x_f32): Likewise.
	(vandq_x_f16): Likewise.
	(vandq_x_f32): Likewise.
	(vbicq_x_f16): Likewise.
	(vbicq_x_f32): Likewise.
	(vbrsrq_x_n_f16): Likewise.
	(vbrsrq_x_n_f32): Likewise.
	(veorq_x_f16): Likewise.
	(veorq_x_f32): Likewise.
	(vornq_x_f16): Likewise.
	(vornq_x_f32): Likewise.
	(vorrq_x_f16): Likewise.
	(vorrq_x_f32): Likewise.
	(vrev32q_x_f16): Likewise.
	(vrev64q_x_f16): Likewise.
	(vrev64q_x_f32): Likewise.
	(__arm_vddupq_x_n_u8): Define intrinsic.
	(__arm_vddupq_x_n_u16): Likewise.
	(__arm_vddupq_x_n_u32): Likewise.
	(__arm_vddupq_x_wb_u8): Likewise.
	(__arm_vddupq_x_wb_u16): Likewise.
	(__arm_vddupq_x_wb_u32): Likewise.
	(__arm_vdwdupq_x_n_u8): Likewise.
	(__arm_vdwdupq_x_n_u16): Likewise.
	(__arm_vdwdupq_x_n_u32): Likewise.
	(__arm_vdwdupq_x_wb_u8): Likewise.
	(__arm_vdwdupq_x_wb_u16): Likewise.
	(__arm_vdwdupq_x_wb_u32): Likewise.
	(__arm_vidupq_x_n_u8): Likewise.
	(__arm_vidupq_x_n_u16): Likewise.
	(__arm_vidupq_x_n_u32): Likewise.
	(__arm_vidupq_x_wb_u8): Likewise.
	(__arm_vidupq_x_wb_u16): Likewise.
	(__arm_vidupq_x_wb_u32): Likewise.
	(__arm_viwdupq_x_n_u8): Likewise.
	(__arm_viwdupq_x_n_u16): Likewise.
	(__arm_viwdupq_x_n_u32): Likewise.
	(__arm_viwdupq_x_wb_u8): Likewise.
	(__arm_viwdupq_x_wb_u16): Likewise.
	(__arm_viwdupq_x_wb_u32): Likewise.
	(__arm_vdupq_x_n_s8): Likewise.
	(__arm_vdupq_x_n_s16): Likewise.
	(__arm_vdupq_x_n_s32): Likewise.
	(__arm_vdupq_x_n_u8): Likewise.
	(__arm_vdupq_x_n_u16): Likewise.
	(__arm_vdupq_x_n_u32): Likewise.
	(__arm_vminq_x_s8): Likewise.
	(__arm_vminq_x_s16): Likewise.
	(__arm_vminq_x_s32): Likewise.
	(__arm_vminq_x_u8): Likewise.
	(__arm_vminq_x_u16): Likewise.
	(__arm_vminq_x_u32): Likewise.
	(__arm_vmaxq_x_s8): Likewise.
	(__arm_vmaxq_x_s16): Likewise.
	(__arm_vmaxq_x_s32): Likewise.
	(__arm_vmaxq_x_u8): Likewise.
	(__arm_vmaxq_x_u16): Likewise.
	(__arm_vmaxq_x_u32): Likewise.
	(__arm_vabdq_x_s8): Likewise.
	(__arm_vabdq_x_s16): Likewise.
	(__arm_vabdq_x_s32): Likewise.
	(__arm_vabdq_x_u8): Likewise.
	(__arm_vabdq_x_u16): Likewise.
	(__arm_vabdq_x_u32): Likewise.
	(__arm_vabsq_x_s8): Likewise.
	(__arm_vabsq_x_s16): Likewise.
	(__arm_vabsq_x_s32): Likewise.
	(__arm_vaddq_x_s8): Likewise.
	(__arm_vaddq_x_s16): Likewise.
	(__arm_vaddq_x_s32): Likewise.
	(__arm_vaddq_x_n_s8): Likewise.
	(__arm_vaddq_x_n_s16): Likewise.
	(__arm_vaddq_x_n_s32): Likewise.
	(__arm_vaddq_x_u8): Likewise.
	(__arm_vaddq_x_u16): Likewise.
	(__arm_vaddq_x_u32): Likewise.
	(__arm_vaddq_x_n_u8): Likewise.
	(__arm_vaddq_x_n_u16): Likewise.
	(__arm_vaddq_x_n_u32): Likewise.
	(__arm_vclsq_x_s8): Likewise.
	(__arm_vclsq_x_s16): Likewise.
	(__arm_vclsq_x_s32): Likewise.
	(__arm_vclzq_x_s8): Likewise.
	(__arm_vclzq_x_s16): Likewise.
	(__arm_vclzq_x_s32): Likewise.
	(__arm_vclzq_x_u8): Likewise.
	(__arm_vclzq_x_u16): Likewise.
	(__arm_vclzq_x_u32): Likewise.
	(__arm_vnegq_x_s8): Likewise.
	(__arm_vnegq_x_s16): Likewise.
	(__arm_vnegq_x_s32): Likewise.
	(__arm_vmulhq_x_s8): Likewise.
	(__arm_vmulhq_x_s16): Likewise.
	(__arm_vmulhq_x_s32): Likewise.
	(__arm_vmulhq_x_u8): Likewise.
	(__arm_vmulhq_x_u16): Likewise.
	(__arm_vmulhq_x_u32): Likewise.
	(__arm_vmullbq_poly_x_p8): Likewise.
	(__arm_vmullbq_poly_x_p16): Likewise.
	(__arm_vmullbq_int_x_s8): Likewise.
	(__arm_vmullbq_int_x_s16): Likewise.
	(__arm_vmullbq_int_x_s32): Likewise.
	(__arm_vmullbq_int_x_u8): Likewise.
	(__arm_vmullbq_int_x_u16): Likewise.
	(__arm_vmullbq_int_x_u32): Likewise.
	(__arm_vmulltq_poly_x_p8): Likewise.
	(__arm_vmulltq_poly_x_p16): Likewise.
	(__arm_vmulltq_int_x_s8): Likewise.
	(__arm_vmulltq_int_x_s16): Likewise.
	(__arm_vmulltq_int_x_s32): Likewise.
	(__arm_vmulltq_int_x_u8): Likewise.
	(__arm_vmulltq_int_x_u16): Likewise.
	(__arm_vmulltq_int_x_u32): Likewise.
	(__arm_vmulq_x_s8): Likewise.
	(__arm_vmulq_x_s16): Likewise.
	(__arm_vmulq_x_s32): Likewise.
	(__arm_vmulq_x_n_s8): Likewise.
	(__arm_vmulq_x_n_s16): Likewise.
	(__arm_vmulq_x_n_s32): Likewise.
	(__arm_vmulq_x_u8): Likewise.
	(__arm_vmulq_x_u16): Likewise.
	(__arm_vmulq_x_u32): Likewise.
	(__arm_vmulq_x_n_u8): Likewise.
	(__arm_vmulq_x_n_u16): Likewise.
	(__arm_vmulq_x_n_u32): Likewise.
	(__arm_vsubq_x_s8): Likewise.
	(__arm_vsubq_x_s16): Likewise.
	(__arm_vsubq_x_s32): Likewise.
	(__arm_vsubq_x_n_s8): Likewise.
	(__arm_vsubq_x_n_s16): Likewise.
	(__arm_vsubq_x_n_s32): Likewise.
	(__arm_vsubq_x_u8): Likewise.
	(__arm_vsubq_x_u16): Likewise.
	(__arm_vsubq_x_u32): Likewise.
	(__arm_vsubq_x_n_u8): Likewise.
	(__arm_vsubq_x_n_u16): Likewise.
	(__arm_vsubq_x_n_u32): Likewise.
	(__arm_vcaddq_rot90_x_s8): Likewise.
	(__arm_vcaddq_rot90_x_s16): Likewise.
	(__arm_vcaddq_rot90_x_s32): Likewise.
	(__arm_vcaddq_rot90_x_u8): Likewise.
	(__arm_vcaddq_rot90_x_u16): Likewise.
	(__arm_vcaddq_rot90_x_u32): Likewise.
	(__arm_vcaddq_rot270_x_s8): Likewise.
	(__arm_vcaddq_rot270_x_s16): Likewise.
	(__arm_vcaddq_rot270_x_s32): Likewise.
	(__arm_vcaddq_rot270_x_u8): Likewise.
	(__arm_vcaddq_rot270_x_u16): Likewise.
	(__arm_vcaddq_rot270_x_u32): Likewise.
	(__arm_vhaddq_x_n_s8): Likewise.
	(__arm_vhaddq_x_n_s16): Likewise.
	(__arm_vhaddq_x_n_s32): Likewise.
	(__arm_vhaddq_x_n_u8): Likewise.
	(__arm_vhaddq_x_n_u16): Likewise.
	(__arm_vhaddq_x_n_u32): Likewise.
	(__arm_vhaddq_x_s8): Likewise.
	(__arm_vhaddq_x_s16): Likewise.
	(__arm_vhaddq_x_s32): Likewise.
	(__arm_vhaddq_x_u8): Likewise.
	(__arm_vhaddq_x_u16): Likewise.
	(__arm_vhaddq_x_u32): Likewise.
	(__arm_vhcaddq_rot90_x_s8): Likewise.
	(__arm_vhcaddq_rot90_x_s16): Likewise.
	(__arm_vhcaddq_rot90_x_s32): Likewise.
	(__arm_vhcaddq_rot270_x_s8): Likewise.
	(__arm_vhcaddq_rot270_x_s16): Likewise.
	(__arm_vhcaddq_rot270_x_s32): Likewise.
	(__arm_vhsubq_x_n_s8): Likewise.
	(__arm_vhsubq_x_n_s16): Likewise.
	(__arm_vhsubq_x_n_s32): Likewise.
	(__arm_vhsubq_x_n_u8): Likewise.
	(__arm_vhsubq_x_n_u16): Likewise.
	(__arm_vhsubq_x_n_u32): Likewise.
	(__arm_vhsubq_x_s8): Likewise.
	(__arm_vhsubq_x_s16): Likewise.
	(__arm_vhsubq_x_s32): Likewise.
	(__arm_vhsubq_x_u8): Likewise.
	(__arm_vhsubq_x_u16): Likewise.
	(__arm_vhsubq_x_u32): Likewise.
	(__arm_vrhaddq_x_s8): Likewise.
	(__arm_vrhaddq_x_s16): Likewise.
	(__arm_vrhaddq_x_s32): Likewise.
	(__arm_vrhaddq_x_u8): Likewise.
	(__arm_vrhaddq_x_u16): Likewise.
	(__arm_vrhaddq_x_u32): Likewise.
	(__arm_vrmulhq_x_s8): Likewise.
	(__arm_vrmulhq_x_s16): Likewise.
	(__arm_vrmulhq_x_s32): Likewise.
	(__arm_vrmulhq_x_u8): Likewise.
	(__arm_vrmulhq_x_u16): Likewise.
	(__arm_vrmulhq_x_u32): Likewise.
	(__arm_vandq_x_s8): Likewise.
	(__arm_vandq_x_s16): Likewise.
	(__arm_vandq_x_s32): Likewise.
	(__arm_vandq_x_u8): Likewise.
	(__arm_vandq_x_u16): Likewise.
	(__arm_vandq_x_u32): Likewise.
	(__arm_vbicq_x_s8): Likewise.
	(__arm_vbicq_x_s16): Likewise.
	(__arm_vbicq_x_s32): Likewise.
	(__arm_vbicq_x_u8): Likewise.
	(__arm_vbicq_x_u16): Likewise.
	(__arm_vbicq_x_u32): Likewise.
	(__arm_vbrsrq_x_n_s8): Likewise.
	(__arm_vbrsrq_x_n_s16): Likewise.
	(__arm_vbrsrq_x_n_s32): Likewise.
	(__arm_vbrsrq_x_n_u8): Likewise.
	(__arm_vbrsrq_x_n_u16): Likewise.
	(__arm_vbrsrq_x_n_u32): Likewise.
	(__arm_veorq_x_s8): Likewise.
	(__arm_veorq_x_s16): Likewise.
	(__arm_veorq_x_s32): Likewise.
	(__arm_veorq_x_u8): Likewise.
	(__arm_veorq_x_u16): Likewise.
	(__arm_veorq_x_u32): Likewise.
	(__arm_vmovlbq_x_s8): Likewise.
	(__arm_vmovlbq_x_s16): Likewise.
	(__arm_vmovlbq_x_u8): Likewise.
	(__arm_vmovlbq_x_u16): Likewise.
	(__arm_vmovltq_x_s8): Likewise.
	(__arm_vmovltq_x_s16): Likewise.
	(__arm_vmovltq_x_u8): Likewise.
	(__arm_vmovltq_x_u16): Likewise.
	(__arm_vmvnq_x_s8): Likewise.
	(__arm_vmvnq_x_s16): Likewise.
	(__arm_vmvnq_x_s32): Likewise.
	(__arm_vmvnq_x_u8): Likewise.
	(__arm_vmvnq_x_u16): Likewise.
	(__arm_vmvnq_x_u32): Likewise.
	(__arm_vmvnq_x_n_s16): Likewise.
	(__arm_vmvnq_x_n_s32): Likewise.
	(__arm_vmvnq_x_n_u16): Likewise.
	(__arm_vmvnq_x_n_u32): Likewise.
	(__arm_vornq_x_s8): Likewise.
	(__arm_vornq_x_s16): Likewise.
	(__arm_vornq_x_s32): Likewise.
	(__arm_vornq_x_u8): Likewise.
	(__arm_vornq_x_u16): Likewise.
	(__arm_vornq_x_u32): Likewise.
	(__arm_vorrq_x_s8): Likewise.
	(__arm_vorrq_x_s16): Likewise.
	(__arm_vorrq_x_s32): Likewise.
	(__arm_vorrq_x_u8): Likewise.
	(__arm_vorrq_x_u16): Likewise.
	(__arm_vorrq_x_u32): Likewise.
	(__arm_vrev16q_x_s8): Likewise.
	(__arm_vrev16q_x_u8): Likewise.
	(__arm_vrev32q_x_s8): Likewise.
	(__arm_vrev32q_x_s16): Likewise.
	(__arm_vrev32q_x_u8): Likewise.
	(__arm_vrev32q_x_u16): Likewise.
	(__arm_vrev64q_x_s8): Likewise.
	(__arm_vrev64q_x_s16): Likewise.
	(__arm_vrev64q_x_s32): Likewise.
	(__arm_vrev64q_x_u8): Likewise.
	(__arm_vrev64q_x_u16): Likewise.
	(__arm_vrev64q_x_u32): Likewise.
	(__arm_vrshlq_x_s8): Likewise.
	(__arm_vrshlq_x_s16): Likewise.
	(__arm_vrshlq_x_s32): Likewise.
	(__arm_vrshlq_x_u8): Likewise.
	(__arm_vrshlq_x_u16): Likewise.
	(__arm_vrshlq_x_u32): Likewise.
	(__arm_vshllbq_x_n_s8): Likewise.
	(__arm_vshllbq_x_n_s16): Likewise.
	(__arm_vshllbq_x_n_u8): Likewise.
	(__arm_vshllbq_x_n_u16): Likewise.
	(__arm_vshlltq_x_n_s8): Likewise.
	(__arm_vshlltq_x_n_s16): Likewise.
	(__arm_vshlltq_x_n_u8): Likewise.
	(__arm_vshlltq_x_n_u16): Likewise.
	(__arm_vshlq_x_s8): Likewise.
	(__arm_vshlq_x_s16): Likewise.
	(__arm_vshlq_x_s32): Likewise.
	(__arm_vshlq_x_u8): Likewise.
	(__arm_vshlq_x_u16): Likewise.
	(__arm_vshlq_x_u32): Likewise.
	(__arm_vshlq_x_n_s8): Likewise.
	(__arm_vshlq_x_n_s16): Likewise.
	(__arm_vshlq_x_n_s32): Likewise.
	(__arm_vshlq_x_n_u8): Likewise.
	(__arm_vshlq_x_n_u16): Likewise.
	(__arm_vshlq_x_n_u32): Likewise.
	(__arm_vrshrq_x_n_s8): Likewise.
	(__arm_vrshrq_x_n_s16): Likewise.
	(__arm_vrshrq_x_n_s32): Likewise.
	(__arm_vrshrq_x_n_u8): Likewise.
	(__arm_vrshrq_x_n_u16): Likewise.
	(__arm_vrshrq_x_n_u32): Likewise.
	(__arm_vshrq_x_n_s8): Likewise.
	(__arm_vshrq_x_n_s16): Likewise.
	(__arm_vshrq_x_n_s32): Likewise.
	(__arm_vshrq_x_n_u8): Likewise.
	(__arm_vshrq_x_n_u16): Likewise.
	(__arm_vshrq_x_n_u32): Likewise.
	(__arm_vdupq_x_n_f16): Likewise.
	(__arm_vdupq_x_n_f32): Likewise.
	(__arm_vminnmq_x_f16): Likewise.
	(__arm_vminnmq_x_f32): Likewise.
	(__arm_vmaxnmq_x_f16): Likewise.
	(__arm_vmaxnmq_x_f32): Likewise.
	(__arm_vabdq_x_f16): Likewise.
	(__arm_vabdq_x_f32): Likewise.
	(__arm_vabsq_x_f16): Likewise.
	(__arm_vabsq_x_f32): Likewise.
	(__arm_vaddq_x_f16): Likewise.
	(__arm_vaddq_x_f32): Likewise.
	(__arm_vaddq_x_n_f16): Likewise.
	(__arm_vaddq_x_n_f32): Likewise.
	(__arm_vnegq_x_f16): Likewise.
	(__arm_vnegq_x_f32): Likewise.
	(__arm_vmulq_x_f16): Likewise.
	(__arm_vmulq_x_f32): Likewise.
	(__arm_vmulq_x_n_f16): Likewise.
	(__arm_vmulq_x_n_f32): Likewise.
	(__arm_vsubq_x_f16): Likewise.
	(__arm_vsubq_x_f32): Likewise.
	(__arm_vsubq_x_n_f16): Likewise.
	(__arm_vsubq_x_n_f32): Likewise.
	(__arm_vcaddq_rot90_x_f16): Likewise.
	(__arm_vcaddq_rot90_x_f32): Likewise.
	(__arm_vcaddq_rot270_x_f16): Likewise.
	(__arm_vcaddq_rot270_x_f32): Likewise.
	(__arm_vcmulq_x_f16): Likewise.
	(__arm_vcmulq_x_f32): Likewise.
	(__arm_vcmulq_rot90_x_f16): Likewise.
	(__arm_vcmulq_rot90_x_f32): Likewise.
	(__arm_vcmulq_rot180_x_f16): Likewise.
	(__arm_vcmulq_rot180_x_f32): Likewise.
	(__arm_vcmulq_rot270_x_f16): Likewise.
	(__arm_vcmulq_rot270_x_f32): Likewise.
	(__arm_vcvtaq_x_s16_f16): Likewise.
	(__arm_vcvtaq_x_s32_f32): Likewise.
	(__arm_vcvtaq_x_u16_f16): Likewise.
	(__arm_vcvtaq_x_u32_f32): Likewise.
	(__arm_vcvtnq_x_s16_f16): Likewise.
	(__arm_vcvtnq_x_s32_f32): Likewise.
	(__arm_vcvtnq_x_u16_f16): Likewise.
	(__arm_vcvtnq_x_u32_f32): Likewise.
	(__arm_vcvtpq_x_s16_f16): Likewise.
	(__arm_vcvtpq_x_s32_f32): Likewise.
	(__arm_vcvtpq_x_u16_f16): Likewise.
	(__arm_vcvtpq_x_u32_f32): Likewise.
	(__arm_vcvtmq_x_s16_f16): Likewise.
	(__arm_vcvtmq_x_s32_f32): Likewise.
	(__arm_vcvtmq_x_u16_f16): Likewise.
	(__arm_vcvtmq_x_u32_f32): Likewise.
	(__arm_vcvtbq_x_f32_f16): Likewise.
	(__arm_vcvttq_x_f32_f16): Likewise.
	(__arm_vcvtq_x_f16_u16): Likewise.
	(__arm_vcvtq_x_f16_s16): Likewise.
	(__arm_vcvtq_x_f32_s32): Likewise.
	(__arm_vcvtq_x_f32_u32): Likewise.
	(__arm_vcvtq_x_n_f16_s16): Likewise.
	(__arm_vcvtq_x_n_f16_u16): Likewise.
	(__arm_vcvtq_x_n_f32_s32): Likewise.
	(__arm_vcvtq_x_n_f32_u32): Likewise.
	(__arm_vcvtq_x_s16_f16): Likewise.
	(__arm_vcvtq_x_s32_f32): Likewise.
	(__arm_vcvtq_x_u16_f16): Likewise.
	(__arm_vcvtq_x_u32_f32): Likewise.
	(__arm_vcvtq_x_n_s16_f16): Likewise.
	(__arm_vcvtq_x_n_s32_f32): Likewise.
	(__arm_vcvtq_x_n_u16_f16): Likewise.
	(__arm_vcvtq_x_n_u32_f32): Likewise.
	(__arm_vrndq_x_f16): Likewise.
	(__arm_vrndq_x_f32): Likewise.
	(__arm_vrndnq_x_f16): Likewise.
	(__arm_vrndnq_x_f32): Likewise.
	(__arm_vrndmq_x_f16): Likewise.
	(__arm_vrndmq_x_f32): Likewise.
	(__arm_vrndpq_x_f16): Likewise.
	(__arm_vrndpq_x_f32): Likewise.
	(__arm_vrndaq_x_f16): Likewise.
	(__arm_vrndaq_x_f32): Likewise.
	(__arm_vrndxq_x_f16): Likewise.
	(__arm_vrndxq_x_f32): Likewise.
	(__arm_vandq_x_f16): Likewise.
	(__arm_vandq_x_f32): Likewise.
	(__arm_vbicq_x_f16): Likewise.
	(__arm_vbicq_x_f32): Likewise.
	(__arm_vbrsrq_x_n_f16): Likewise.
	(__arm_vbrsrq_x_n_f32): Likewise.
	(__arm_veorq_x_f16): Likewise.
	(__arm_veorq_x_f32): Likewise.
	(__arm_vornq_x_f16): Likewise.
	(__arm_vornq_x_f32): Likewise.
	(__arm_vorrq_x_f16): Likewise.
	(__arm_vorrq_x_f32): Likewise.
	(__arm_vrev32q_x_f16): Likewise.
	(__arm_vrev64q_x_f16): Likewise.
	(__arm_vrev64q_x_f32): Likewise.
	(vabdq_x): Define polymorphic variant.
	(vabsq_x): Likewise.
	(vaddq_x): Likewise.
	(vandq_x): Likewise.
	(vbicq_x): Likewise.
	(vbrsrq_x): Likewise.
	(vcaddq_rot270_x): Likewise.
	(vcaddq_rot90_x): Likewise.
	(vcmulq_rot180_x): Likewise.
	(vcmulq_rot270_x): Likewise.
	(vcmulq_x): Likewise.
	(vcvtq_x): Likewise.
	(vcvtq_x_n): Likewise.
	(vcvtnq_m): Likewise.
	(veorq_x): Likewise.
	(vmaxnmq_x): Likewise.
	(vminnmq_x): Likewise.
	(vmulq_x): Likewise.
	(vnegq_x): Likewise.
	(vornq_x): Likewise.
	(vorrq_x): Likewise.
	(vrev32q_x): Likewise.
	(vrev64q_x): Likewise.
	(vrndaq_x): Likewise.
	(vrndmq_x): Likewise.
	(vrndnq_x): Likewise.
	(vrndpq_x): Likewise.
	(vrndq_x): Likewise.
	(vrndxq_x): Likewise.
	(vsubq_x): Likewise.
	(vcmulq_rot90_x): Likewise.
	(vadciq): Likewise.
	(vclsq_x): Likewise.
	(vclzq_x): Likewise.
	(vhaddq_x): Likewise.
	(vhcaddq_rot270_x): Likewise.
	(vhcaddq_rot90_x): Likewise.
	(vhsubq_x): Likewise.
	(vmaxq_x): Likewise.
	(vminq_x): Likewise.
	(vmovlbq_x): Likewise.
	(vmovltq_x): Likewise.
	(vmulhq_x): Likewise.
	(vmullbq_int_x): Likewise.
	(vmullbq_poly_x): Likewise.
	(vmulltq_int_x): Likewise.
	(vmulltq_poly_x): Likewise.
	(vmvnq_x): Likewise.
	(vrev16q_x): Likewise.
	(vrhaddq_x): Likewise.
	(vrmulhq_x): Likewise.
	(vrshlq_x): Likewise.
	(vrshrq_x): Likewise.
	(vshllbq_x): Likewise.
	(vshlltq_x): Likewise.
	(vshlq_x_n): Likewise.
	(vshlq_x): Likewise.
	(vdwdupq_x_u8): Likewise.
	(vdwdupq_x_u16): Likewise.
	(vdwdupq_x_u32): Likewise.
	(viwdupq_x_u8): Likewise.
	(viwdupq_x_u16): Likewise.
	(viwdupq_x_u32): Likewise.
	(vidupq_x_u8): Likewise.
	(vddupq_x_u8): Likewise.
	(vidupq_x_u16): Likewise.
	(vddupq_x_u16): Likewise.
	(vidupq_x_u32): Likewise.
	(vddupq_x_u32): Likewise.
	(vshrq_x): Likewise.

gcc/testsuite/ChangeLog:

2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>

	* gcc.target/arm/mve/intrinsics/vabdq_x_f16.c: New test.
	* gcc.target/arm/mve/intrinsics/vabdq_x_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vabdq_x_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vabdq_x_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vabdq_x_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vabdq_x_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vabdq_x_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vabdq_x_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vabsq_x_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vabsq_x_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vabsq_x_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vabsq_x_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vabsq_x_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vaddq_x_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vaddq_x_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vaddq_x_n_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vaddq_x_n_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vaddq_x_n_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vaddq_x_n_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vaddq_x_n_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vaddq_x_n_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vaddq_x_n_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vaddq_x_n_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vaddq_x_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vaddq_x_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vaddq_x_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vaddq_x_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vaddq_x_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vaddq_x_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vandq_x_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vandq_x_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vandq_x_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vandq_x_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vandq_x_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vandq_x_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vandq_x_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vandq_x_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vbicq_x_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vbicq_x_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vbicq_x_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vbicq_x_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vbicq_x_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vbicq_x_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vbicq_x_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vbicq_x_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vbrsrq_x_n_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vbrsrq_x_n_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vbrsrq_x_n_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vbrsrq_x_n_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vbrsrq_x_n_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vbrsrq_x_n_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vbrsrq_x_n_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vbrsrq_x_n_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcaddq_rot270_x_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcaddq_rot270_x_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcaddq_rot270_x_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcaddq_rot270_x_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcaddq_rot270_x_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcaddq_rot270_x_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcaddq_rot270_x_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcaddq_rot270_x_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcaddq_rot90_x_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcaddq_rot90_x_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcaddq_rot90_x_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcaddq_rot90_x_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcaddq_rot90_x_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcaddq_rot90_x_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcaddq_rot90_x_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcaddq_rot90_x_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vclsq_x_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vclsq_x_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vclsq_x_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vclzq_x_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vclzq_x_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vclzq_x_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vclzq_x_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vclzq_x_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vclzq_x_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcmulq_rot180_x_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcmulq_rot180_x_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcmulq_rot270_x_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcmulq_rot270_x_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcmulq_rot90_x_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcmulq_rot90_x_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcmulq_x_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcmulq_x_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtaq_x_s16_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtaq_x_s32_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtaq_x_u16_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtaq_x_u32_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtbq_x_f32_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtmq_x_s16_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtmq_x_s32_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtmq_x_u16_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtmq_x_u32_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtnq_x_s16_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtnq_x_s32_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtnq_x_u16_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtnq_x_u32_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtpq_x_s16_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtpq_x_s32_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtpq_x_u16_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtpq_x_u32_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtq_x_f16_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtq_x_f16_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtq_x_f32_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtq_x_f32_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtq_x_n_f16_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtq_x_n_f16_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtq_x_n_f32_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtq_x_n_f32_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtq_x_n_s16_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtq_x_n_s32_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtq_x_n_u16_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtq_x_n_u32_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtq_x_s16_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtq_x_s32_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtq_x_u16_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtq_x_u32_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvttq_x_f32_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vddupq_x_n_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vddupq_x_n_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vddupq_x_n_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vddupq_x_wb_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vddupq_x_wb_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vddupq_x_wb_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vdupq_x_n_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vdupq_x_n_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vdupq_x_n_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vdupq_x_n_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vdupq_x_n_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vdupq_x_n_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vdupq_x_n_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vdupq_x_n_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vdwdupq_x_n_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vdwdupq_x_n_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vdwdupq_x_n_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vdwdupq_x_wb_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vdwdupq_x_wb_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vdwdupq_x_wb_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/veorq_x_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/veorq_x_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/veorq_x_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/veorq_x_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/veorq_x_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/veorq_x_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/veorq_x_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/veorq_x_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vhaddq_x_n_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vhaddq_x_n_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vhaddq_x_n_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vhaddq_x_n_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vhaddq_x_n_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vhaddq_x_n_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vhaddq_x_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vhaddq_x_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vhaddq_x_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vhaddq_x_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vhaddq_x_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vhaddq_x_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vhcaddq_rot270_x_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vhcaddq_rot270_x_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vhcaddq_rot270_x_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vhcaddq_rot90_x_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vhcaddq_rot90_x_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vhcaddq_rot90_x_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vhsubq_x_n_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vhsubq_x_n_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vhsubq_x_n_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vhsubq_x_n_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vhsubq_x_n_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vhsubq_x_n_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vhsubq_x_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vhsubq_x_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vhsubq_x_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vhsubq_x_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vhsubq_x_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vhsubq_x_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vidupq_x_n_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vidupq_x_n_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vidupq_x_n_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vidupq_x_wb_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vidupq_x_wb_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vidupq_x_wb_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/viwdupq_x_n_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/viwdupq_x_n_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/viwdupq_x_n_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/viwdupq_x_wb_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/viwdupq_x_wb_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/viwdupq_x_wb_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmaxnmq_x_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmaxnmq_x_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmaxq_x_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmaxq_x_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmaxq_x_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmaxq_x_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmaxq_x_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmaxq_x_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vminnmq_x_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vminnmq_x_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vminq_x_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vminq_x_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vminq_x_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vminq_x_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vminq_x_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vminq_x_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmovlbq_x_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmovlbq_x_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmovlbq_x_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmovlbq_x_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmovltq_x_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmovltq_x_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmovltq_x_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmovltq_x_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmulhq_x_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmulhq_x_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmulhq_x_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmulhq_x_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmulhq_x_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmulhq_x_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmullbq_int_x_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmullbq_int_x_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmullbq_int_x_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmullbq_int_x_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmullbq_int_x_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmullbq_int_x_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmullbq_poly_x_p16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmullbq_poly_x_p8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmulltq_int_x_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmulltq_int_x_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmulltq_int_x_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmulltq_int_x_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmulltq_int_x_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmulltq_int_x_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmulltq_poly_x_p16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmulltq_poly_x_p8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmulq_x_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmulq_x_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmulq_x_n_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmulq_x_n_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmulq_x_n_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmulq_x_n_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmulq_x_n_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmulq_x_n_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmulq_x_n_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmulq_x_n_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmulq_x_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmulq_x_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmulq_x_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmulq_x_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmulq_x_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmulq_x_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmvnq_x_n_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmvnq_x_n_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmvnq_x_n_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmvnq_x_n_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmvnq_x_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmvnq_x_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmvnq_x_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmvnq_x_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmvnq_x_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmvnq_x_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vnegq_x_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vnegq_x_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vnegq_x_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vnegq_x_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vnegq_x_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vornq_x_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vornq_x_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vornq_x_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vornq_x_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vornq_x_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vornq_x_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vornq_x_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vornq_x_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vorrq_x_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vorrq_x_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vorrq_x_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vorrq_x_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vorrq_x_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vorrq_x_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vorrq_x_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vorrq_x_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrev16q_x_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrev16q_x_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrev32q_x_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrev32q_x_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrev32q_x_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrev32q_x_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrev32q_x_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrev64q_x_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrev64q_x_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrev64q_x_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrev64q_x_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrev64q_x_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrev64q_x_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrev64q_x_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrev64q_x_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrhaddq_x_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrhaddq_x_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrhaddq_x_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrhaddq_x_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrhaddq_x_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrhaddq_x_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrmulhq_x_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrmulhq_x_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrmulhq_x_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrmulhq_x_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrmulhq_x_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrmulhq_x_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrndaq_x_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrndaq_x_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrndmq_x_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrndmq_x_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrndnq_x_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrndnq_x_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrndpq_x_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrndpq_x_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrndq_x_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrndq_x_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrndxq_x_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrndxq_x_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrshlq_x_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrshlq_x_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrshlq_x_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrshlq_x_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrshlq_x_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrshlq_x_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrshrq_x_n_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrshrq_x_n_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrshrq_x_n_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrshrq_x_n_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrshrq_x_n_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vrshrq_x_n_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vshllbq_x_n_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vshllbq_x_n_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vshllbq_x_n_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vshllbq_x_n_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vshlltq_x_n_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vshlltq_x_n_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vshlltq_x_n_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vshlltq_x_n_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vshlq_x_n_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vshlq_x_n_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vshlq_x_n_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vshlq_x_n_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vshlq_x_n_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vshlq_x_n_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vshlq_x_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vshlq_x_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vshlq_x_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vshlq_x_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vshlq_x_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vshlq_x_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vshrq_x_n_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vshrq_x_n_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vshrq_x_n_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vshrq_x_n_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vshrq_x_n_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vsubq_x_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vsubq_x_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vsubq_x_n_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vsubq_x_n_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vsubq_x_n_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vsubq_x_n_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vsubq_x_n_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vsubq_x_n_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vsubq_x_n_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vsubq_x_n_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vsubq_x_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vsubq_x_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vsubq_x_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vsubq_x_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vsubq_x_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vsubq_x_u8.c: Likewise.
2020-03-20 14:14:35 +00:00
Richard Biener
3d42842c07 fix CTOR vectorization
We failed to handle pattern stmts appropriately.

2020-03-20  Richard Biener  <rguenther@suse.de>

	* tree-vect-slp.c (vect_analyze_slp_instance): Push the stmts
	to vectorize for CTOR defs.
2020-03-20 15:02:32 +01:00
Srinath Parvathaneni
41e1a7ffae [ARM][GCC][2/8x]: MVE ACLE gather load and scatter store intrinsics with writeback.
This patch supports the following MVE ACLE intrinsics with writeback.

vldrdq_gather_base_wb_s64, vldrdq_gather_base_wb_u64, vldrdq_gather_base_wb_z_s64,
vldrdq_gather_base_wb_z_u64, vldrwq_gather_base_wb_f32, vldrwq_gather_base_wb_s32,
vldrwq_gather_base_wb_u32, vldrwq_gather_base_wb_z_f32, vldrwq_gather_base_wb_z_s32,
vldrwq_gather_base_wb_z_u32, vstrdq_scatter_base_wb_p_s64, vstrdq_scatter_base_wb_p_u64,
vstrdq_scatter_base_wb_s64, vstrdq_scatter_base_wb_u64, vstrwq_scatter_base_wb_p_s32,
vstrwq_scatter_base_wb_p_f32, vstrwq_scatter_base_wb_p_u32, vstrwq_scatter_base_wb_s32,
vstrwq_scatter_base_wb_u32, vstrwq_scatter_base_wb_f32.

Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
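As a rough illustration only (a minimal sketch assuming the standard ACLE
signatures from <arm_mve.h> and a target built with MVE enabled, e.g.
-march=armv8.1-m.main+mve -mfloat-abi=hard), the writeback forms take a
pointer to the base-address vector and update it in place:

    #include <arm_mve.h>

    /* Gather four 32-bit values addressed by *base (plus the immediate
       offset) and let the instruction write the updated base-address
       vector back through the pointer.  */
    int32x4_t
    load_and_advance (uint32x4_t *base)
    {
      return vldrwq_gather_base_wb_s32 (base, 4);
    }

    /* Scatter-store with writeback; the predicated _p form only stores
       the lanes enabled by the mve_pred16_t mask.  */
    void
    store_and_advance (uint32x4_t *base, int32x4_t value, mve_pred16_t p)
    {
      vstrwq_scatter_base_wb_p_s32 (base, 4, value, p);
    }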

2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
            Andre Vieira  <andre.simoesdiasvieira@arm.com>
            Mihail Ionescu  <mihail.ionescu@arm.com>

	* config/arm/arm-builtins.c (LDRGBWBS_QUALIFIERS): Define builtin
	qualifier.
	(LDRGBWBU_QUALIFIERS): Likewise.
	(LDRGBWBS_Z_QUALIFIERS): Likewise.
	(LDRGBWBU_Z_QUALIFIERS): Likewise.
	(STRSBWBS_QUALIFIERS): Likewise.
	(STRSBWBU_QUALIFIERS): Likewise.
	(STRSBWBS_P_QUALIFIERS): Likewise.
	(STRSBWBU_P_QUALIFIERS): Likewise.
	* config/arm/arm_mve.h (vldrdq_gather_base_wb_s64): Define macro.
	(vldrdq_gather_base_wb_u64): Likewise.
	(vldrdq_gather_base_wb_z_s64): Likewise.
	(vldrdq_gather_base_wb_z_u64): Likewise.
	(vldrwq_gather_base_wb_f32): Likewise.
	(vldrwq_gather_base_wb_s32): Likewise.
	(vldrwq_gather_base_wb_u32): Likewise.
	(vldrwq_gather_base_wb_z_f32): Likewise.
	(vldrwq_gather_base_wb_z_s32): Likewise.
	(vldrwq_gather_base_wb_z_u32): Likewise.
	(vstrdq_scatter_base_wb_p_s64): Likewise.
	(vstrdq_scatter_base_wb_p_u64): Likewise.
	(vstrdq_scatter_base_wb_s64): Likewise.
	(vstrdq_scatter_base_wb_u64): Likewise.
	(vstrwq_scatter_base_wb_p_s32): Likewise.
	(vstrwq_scatter_base_wb_p_f32): Likewise.
	(vstrwq_scatter_base_wb_p_u32): Likewise.
	(vstrwq_scatter_base_wb_s32): Likewise.
	(vstrwq_scatter_base_wb_u32): Likewise.
	(vstrwq_scatter_base_wb_f32): Likewise.
	(__arm_vldrdq_gather_base_wb_s64): Define intrinsic.
	(__arm_vldrdq_gather_base_wb_u64): Likewise.
	(__arm_vldrdq_gather_base_wb_z_s64): Likewise.
	(__arm_vldrdq_gather_base_wb_z_u64): Likewise.
	(__arm_vldrwq_gather_base_wb_s32): Likewise.
	(__arm_vldrwq_gather_base_wb_u32): Likewise.
	(__arm_vldrwq_gather_base_wb_z_s32): Likewise.
	(__arm_vldrwq_gather_base_wb_z_u32): Likewise.
	(__arm_vstrdq_scatter_base_wb_s64): Likewise.
	(__arm_vstrdq_scatter_base_wb_u64): Likewise.
	(__arm_vstrdq_scatter_base_wb_p_s64): Likewise.
	(__arm_vstrdq_scatter_base_wb_p_u64): Likewise.
	(__arm_vstrwq_scatter_base_wb_p_s32): Likewise.
	(__arm_vstrwq_scatter_base_wb_p_u32): Likewise.
	(__arm_vstrwq_scatter_base_wb_s32): Likewise.
	(__arm_vstrwq_scatter_base_wb_u32): Likewise.
	(__arm_vldrwq_gather_base_wb_f32): Likewise.
	(__arm_vldrwq_gather_base_wb_z_f32): Likewise.
	(__arm_vstrwq_scatter_base_wb_f32): Likewise.
	(__arm_vstrwq_scatter_base_wb_p_f32): Likewise.
	(vstrwq_scatter_base_wb): Define polymorphic variant.
	(vstrwq_scatter_base_wb_p): Likewise.
	(vstrdq_scatter_base_wb_p): Likewise.
	(vstrdq_scatter_base_wb): Likewise.
	* config/arm/arm_mve_builtins.def (LDRGBWBS_QUALIFIERS): Use builtin
	qualifier.
	* config/arm/mve.md (mve_vstrwq_scatter_base_wb_<supf>v4si): Define RTL
	pattern.
	(mve_vstrwq_scatter_base_wb_add_<supf>v4si): Likewise.
	(mve_vstrwq_scatter_base_wb_<supf>v4si_insn): Likewise.
	(mve_vstrwq_scatter_base_wb_p_<supf>v4si): Likewise.
	(mve_vstrwq_scatter_base_wb_p_add_<supf>v4si): Likewise.
	(mve_vstrwq_scatter_base_wb_p_<supf>v4si_insn): Likewise.
	(mve_vstrwq_scatter_base_wb_fv4sf): Likewise.
	(mve_vstrwq_scatter_base_wb_add_fv4sf): Likewise.
	(mve_vstrwq_scatter_base_wb_fv4sf_insn): Likewise.
	(mve_vstrwq_scatter_base_wb_p_fv4sf): Likewise.
	(mve_vstrwq_scatter_base_wb_p_add_fv4sf): Likewise.
	(mve_vstrwq_scatter_base_wb_p_fv4sf_insn): Likewise.
	(mve_vstrdq_scatter_base_wb_<supf>v2di): Likewise.
	(mve_vstrdq_scatter_base_wb_add_<supf>v2di): Likewise.
	(mve_vstrdq_scatter_base_wb_<supf>v2di_insn): Likewise.
	(mve_vstrdq_scatter_base_wb_p_<supf>v2di): Likewise.
	(mve_vstrdq_scatter_base_wb_p_add_<supf>v2di): Likewise.
	(mve_vstrdq_scatter_base_wb_p_<supf>v2di_insn): Likewise.
	(mve_vldrwq_gather_base_wb_<supf>v4si): Likewise.
	(mve_vldrwq_gather_base_wb_<supf>v4si_insn): Likewise.
	(mve_vldrwq_gather_base_wb_z_<supf>v4si): Likewise.
	(mve_vldrwq_gather_base_wb_z_<supf>v4si_insn): Likewise.
	(mve_vldrwq_gather_base_wb_fv4sf): Likewise.
	(mve_vldrwq_gather_base_wb_fv4sf_insn): Likewise.
	(mve_vldrwq_gather_base_wb_z_fv4sf): Likewise.
	(mve_vldrwq_gather_base_wb_z_fv4sf_insn): Likewise.
	(mve_vldrdq_gather_base_wb_<supf>v2di): Likewise.
	(mve_vldrdq_gather_base_wb_<supf>v2di_insn): Likewise.
	(mve_vldrdq_gather_base_wb_z_<supf>v2di): Likewise.
	(mve_vldrdq_gather_base_wb_z_<supf>v2di_insn): Likewise.

gcc/testsuite/ChangeLog:

2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
            Andre Vieira  <andre.simoesdiasvieira@arm.com>
            Mihail Ionescu  <mihail.ionescu@arm.com>

	* gcc.target/arm/mve/intrinsics/vldrdq_gather_base_wb_s64.c: New test.
	* gcc.target/arm/mve/intrinsics/vldrdq_gather_base_wb_u64.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrdq_gather_base_wb_z_s64.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrdq_gather_base_wb_z_u64.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_wb_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_wb_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_wb_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_wb_z_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_wb_z_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_wb_z_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrdq_scatter_base_wb_p_s64.c:
	Likewise.
	* gcc.target/arm/mve/intrinsics/vstrdq_scatter_base_wb_p_u64.c:
	Likewise.
	* gcc.target/arm/mve/intrinsics/vstrdq_scatter_base_wb_s64.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrdq_scatter_base_wb_u64.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_wb_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_wb_p_f32.c:
	Likewise.
	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_wb_p_s32.c:
	Likewise.
	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_wb_p_u32.c:
	Likewise.
	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_wb_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_wb_u32.c: Likewise.
2020-03-20 12:06:26 +00:00
Srinath Parvathaneni
92f80065d1 [ARM][GCC][1/8x]: MVE ACLE vidup, vddup, viwdup and vdwdup intrinsics with writeback.
This patch supports the following MVE ACLE intrinsics with writeback.

vddupq_m_n_u8, vddupq_m_n_u32, vddupq_m_n_u16, vddupq_m_wb_u8, vddupq_m_wb_u16,
vddupq_m_wb_u32, vddupq_n_u8, vddupq_n_u32, vddupq_n_u16, vddupq_wb_u8,
vddupq_wb_u16, vddupq_wb_u32, vdwdupq_m_n_u8, vdwdupq_m_n_u32, vdwdupq_m_n_u16,
vdwdupq_m_wb_u8, vdwdupq_m_wb_u32, vdwdupq_m_wb_u16, vdwdupq_n_u8, vdwdupq_n_u32,
vdwdupq_n_u16, vdwdupq_wb_u8, vdwdupq_wb_u32, vdwdupq_wb_u16, vidupq_m_n_u8,
vidupq_m_n_u32, vidupq_m_n_u16, vidupq_m_wb_u8, vidupq_m_wb_u16, vidupq_m_wb_u32,
vidupq_n_u8, vidupq_n_u32, vidupq_n_u16, vidupq_wb_u8, vidupq_wb_u16,
vidupq_wb_u32, viwdupq_m_n_u8, viwdupq_m_n_u32, viwdupq_m_n_u16, viwdupq_m_wb_u8,
viwdupq_m_wb_u32, viwdupq_m_wb_u16, viwdupq_n_u8, viwdupq_n_u32, viwdupq_n_u16,
viwdupq_wb_u8, viwdupq_wb_u32, viwdupq_wb_u16.

Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
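As a rough usage sketch (assuming the standard ACLE signatures and an
MVE-enabled target; the constant step must be 1, 2, 4 or 8), vidupq builds
an incrementing lane sequence, the _wb forms also write the next start
value back through the pointer, and the viwdupq/vdwdupq forms wrap at a
given limit:

    #include <arm_mve.h>

    /* E.g. with *start == s this returns {s, s+4, s+8, s+12} and writes
       the next start value back to *start.  */
    uint32x4_t
    next_indices (uint32_t *start)
    {
      return vidupq_wb_u32 (start, 4);
    }

    /* Incrementing sequence that wraps around at 'wrap'.  */
    uint32x4_t
    wrapping_indices (uint32_t start, uint32_t wrap)
    {
      return viwdupq_n_u32 (start, wrap, 1);
    }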

2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
            Andre Vieira  <andre.simoesdiasvieira@arm.com>
            Mihail Ionescu  <mihail.ionescu@arm.com>

	* config/arm/arm-builtins.c
	(QUINOP_UNONE_UNONE_UNONE_UNONE_IMM_UNONE_QUALIFIERS): Define quinary
	builtin qualifier.
	* config/arm/arm_mve.h (vddupq_m_n_u8): Define macro.
	(vddupq_m_n_u32): Likewise.
	(vddupq_m_n_u16): Likewise.
	(vddupq_m_wb_u8): Likewise.
	(vddupq_m_wb_u16): Likewise.
	(vddupq_m_wb_u32): Likewise.
	(vddupq_n_u8): Likewise.
	(vddupq_n_u32): Likewise.
	(vddupq_n_u16): Likewise.
	(vddupq_wb_u8): Likewise.
	(vddupq_wb_u16): Likewise.
	(vddupq_wb_u32): Likewise.
	(vdwdupq_m_n_u8): Likewise.
	(vdwdupq_m_n_u32): Likewise.
	(vdwdupq_m_n_u16): Likewise.
	(vdwdupq_m_wb_u8): Likewise.
	(vdwdupq_m_wb_u32): Likewise.
	(vdwdupq_m_wb_u16): Likewise.
	(vdwdupq_n_u8): Likewise.
	(vdwdupq_n_u32): Likewise.
	(vdwdupq_n_u16): Likewise.
	(vdwdupq_wb_u8): Likewise.
	(vdwdupq_wb_u32): Likewise.
	(vdwdupq_wb_u16): Likewise.
	(vidupq_m_n_u8): Likewise.
	(vidupq_m_n_u32): Likewise.
	(vidupq_m_n_u16): Likewise.
	(vidupq_m_wb_u8): Likewise.
	(vidupq_m_wb_u16): Likewise.
	(vidupq_m_wb_u32): Likewise.
	(vidupq_n_u8): Likewise.
	(vidupq_n_u32): Likewise.
	(vidupq_n_u16): Likewise.
	(vidupq_wb_u8): Likewise.
	(vidupq_wb_u16): Likewise.
	(vidupq_wb_u32): Likewise.
	(viwdupq_m_n_u8): Likewise.
	(viwdupq_m_n_u32): Likewise.
	(viwdupq_m_n_u16): Likewise.
	(viwdupq_m_wb_u8): Likewise.
	(viwdupq_m_wb_u32): Likewise.
	(viwdupq_m_wb_u16): Likewise.
	(viwdupq_n_u8): Likewise.
	(viwdupq_n_u32): Likewise.
	(viwdupq_n_u16): Likewise.
	(viwdupq_wb_u8): Likewise.
	(viwdupq_wb_u32): Likewise.
	(viwdupq_wb_u16): Likewise.
	(__arm_vddupq_m_n_u8): Define intrinsic.
	(__arm_vddupq_m_n_u32): Likewise.
	(__arm_vddupq_m_n_u16): Likewise.
	(__arm_vddupq_m_wb_u8): Likewise.
	(__arm_vddupq_m_wb_u16): Likewise.
	(__arm_vddupq_m_wb_u32): Likewise.
	(__arm_vddupq_n_u8): Likewise.
	(__arm_vddupq_n_u32): Likewise.
	(__arm_vddupq_n_u16): Likewise.
	(__arm_vdwdupq_m_n_u8): Likewise.
	(__arm_vdwdupq_m_n_u32): Likewise.
	(__arm_vdwdupq_m_n_u16): Likewise.
	(__arm_vdwdupq_m_wb_u8): Likewise.
	(__arm_vdwdupq_m_wb_u32): Likewise.
	(__arm_vdwdupq_m_wb_u16): Likewise.
	(__arm_vdwdupq_n_u8): Likewise.
	(__arm_vdwdupq_n_u32): Likewise.
	(__arm_vdwdupq_n_u16): Likewise.
	(__arm_vdwdupq_wb_u8): Likewise.
	(__arm_vdwdupq_wb_u32): Likewise.
	(__arm_vdwdupq_wb_u16): Likewise.
	(__arm_vidupq_m_n_u8): Likewise.
	(__arm_vidupq_m_n_u32): Likewise.
	(__arm_vidupq_m_n_u16): Likewise.
	(__arm_vidupq_n_u8): Likewise.
	(__arm_vidupq_m_wb_u8): Likewise.
	(__arm_vidupq_m_wb_u16): Likewise.
	(__arm_vidupq_m_wb_u32): Likewise.
	(__arm_vidupq_n_u32): Likewise.
	(__arm_vidupq_n_u16): Likewise.
	(__arm_vidupq_wb_u8): Likewise.
	(__arm_vidupq_wb_u16): Likewise.
	(__arm_vidupq_wb_u32): Likewise.
	(__arm_vddupq_wb_u8): Likewise.
	(__arm_vddupq_wb_u16): Likewise.
	(__arm_vddupq_wb_u32): Likewise.
	(__arm_viwdupq_m_n_u8): Likewise.
	(__arm_viwdupq_m_n_u32): Likewise.
	(__arm_viwdupq_m_n_u16): Likewise.
	(__arm_viwdupq_m_wb_u8): Likewise.
	(__arm_viwdupq_m_wb_u32): Likewise.
	(__arm_viwdupq_m_wb_u16): Likewise.
	(__arm_viwdupq_n_u8): Likewise.
	(__arm_viwdupq_n_u32): Likewise.
	(__arm_viwdupq_n_u16): Likewise.
	(__arm_viwdupq_wb_u8): Likewise.
	(__arm_viwdupq_wb_u32): Likewise.
	(__arm_viwdupq_wb_u16): Likewise.
	(vidupq_m): Define polymorphic variant.
	(vddupq_m): Likewise.
	(vidupq_u16): Likewise.
	(vidupq_u32): Likewise.
	(vidupq_u8): Likewise.
	(vddupq_u16): Likewise.
	(vddupq_u32): Likewise.
	(vddupq_u8): Likewise.
	(viwdupq_m): Likewise.
	(viwdupq_u16): Likewise.
	(viwdupq_u32): Likewise.
	(viwdupq_u8): Likewise.
	(vdwdupq_m): Likewise.
	(vdwdupq_u16): Likewise.
	(vdwdupq_u32): Likewise.
	(vdwdupq_u8): Likewise.
	* config/arm/arm_mve_builtins.def
	(QUINOP_UNONE_UNONE_UNONE_UNONE_IMM_UNONE_QUALIFIERS): Use builtin
	qualifier.
	* config/arm/mve.md (mve_vidupq_n_u<mode>): Define RTL pattern.
	(mve_vidupq_u<mode>_insn): Likewise.
	(mve_vidupq_m_n_u<mode>): Likewise.
	(mve_vidupq_m_wb_u<mode>_insn): Likewise.
	(mve_vddupq_n_u<mode>): Likewise.
	(mve_vddupq_u<mode>_insn): Likewise.
	(mve_vddupq_m_n_u<mode>): Likewise.
	(mve_vddupq_m_wb_u<mode>_insn): Likewise.
	(mve_vdwdupq_n_u<mode>): Likewise.
	(mve_vdwdupq_wb_u<mode>): Likewise.
	(mve_vdwdupq_wb_u<mode>_insn): Likewise.
	(mve_vdwdupq_m_n_u<mode>): Likewise.
	(mve_vdwdupq_m_wb_u<mode>): Likewise.
	(mve_vdwdupq_m_wb_u<mode>_insn): Likewise.
	(mve_viwdupq_n_u<mode>): Likewise.
	(mve_viwdupq_wb_u<mode>): Likewise.
	(mve_viwdupq_wb_u<mode>_insn): Likewise.
	(mve_viwdupq_m_n_u<mode>): Likewise.
	(mve_viwdupq_m_wb_u<mode>): Likewise.
	(mve_viwdupq_m_wb_u<mode>_insn): Likewise.

gcc/testsuite/ChangeLog:

2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
            Andre Vieira  <andre.simoesdiasvieira@arm.com>
            Mihail Ionescu  <mihail.ionescu@arm.com>

	* gcc.target/arm/mve/intrinsics/vddupq_m_n_u16.c: New test.
	* gcc.target/arm/mve/intrinsics/vddupq_m_n_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vddupq_m_n_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vddupq_m_wb_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vddupq_m_wb_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vddupq_m_wb_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vddupq_n_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vddupq_n_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vddupq_n_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vddupq_wb_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vddupq_wb_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vddupq_wb_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vdwdupq_m_n_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vdwdupq_m_n_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vdwdupq_m_n_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vdwdupq_m_wb_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vdwdupq_m_wb_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vdwdupq_m_wb_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vdwdupq_n_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vdwdupq_n_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vdwdupq_n_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vdwdupq_wb_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vdwdupq_wb_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vdwdupq_wb_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vidupq_m_n_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vidupq_m_n_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vidupq_m_n_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vidupq_m_wb_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vidupq_m_wb_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vidupq_m_wb_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vidupq_n_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vidupq_n_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vidupq_n_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vidupq_wb_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vidupq_wb_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vidupq_wb_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/viwdupq_m_n_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/viwdupq_m_n_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/viwdupq_m_n_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/viwdupq_m_wb_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/viwdupq_m_wb_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/viwdupq_m_wb_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/viwdupq_n_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/viwdupq_n_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/viwdupq_n_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/viwdupq_wb_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/viwdupq_wb_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/viwdupq_wb_u8.c: Likewise.
2020-03-20 11:58:30 +00:00
Srinath Parvathaneni
85a94e8790 [ARM][GCC][7x]: MVE vreinterpretq and vuninitializedq intrinsics.
This patch supports the following MVE ACLE intrinsics.

vreinterpretq_s16_s32, vreinterpretq_s16_s64, vreinterpretq_s16_s8, vreinterpretq_s16_u16,
vreinterpretq_s16_u32, vreinterpretq_s16_u64, vreinterpretq_s16_u8, vreinterpretq_s32_s16,
vreinterpretq_s32_s64, vreinterpretq_s32_s8, vreinterpretq_s32_u16, vreinterpretq_s32_u32,
vreinterpretq_s32_u64, vreinterpretq_s32_u8, vreinterpretq_s64_s16, vreinterpretq_s64_s32,
vreinterpretq_s64_s8, vreinterpretq_s64_u16, vreinterpretq_s64_u32, vreinterpretq_s64_u64,
vreinterpretq_s64_u8, vreinterpretq_s8_s16, vreinterpretq_s8_s32, vreinterpretq_s8_s64,
vreinterpretq_s8_u16, vreinterpretq_s8_u32, vreinterpretq_s8_u64, vreinterpretq_s8_u8,
vreinterpretq_u16_s16, vreinterpretq_u16_s32, vreinterpretq_u16_s64, vreinterpretq_u16_s8,
vreinterpretq_u16_u32, vreinterpretq_u16_u64, vreinterpretq_u16_u8, vreinterpretq_u32_s16,
vreinterpretq_u32_s32, vreinterpretq_u32_s64, vreinterpretq_u32_s8, vreinterpretq_u32_u16,
vreinterpretq_u32_u64, vreinterpretq_u32_u8, vreinterpretq_u64_s16, vreinterpretq_u64_s32,
vreinterpretq_u64_s64, vreinterpretq_u64_s8, vreinterpretq_u64_u16, vreinterpretq_u64_u32,
vreinterpretq_u64_u8, vreinterpretq_u8_s16, vreinterpretq_u8_s32, vreinterpretq_u8_s64,
vreinterpretq_u8_s8, vreinterpretq_u8_u16, vreinterpretq_u8_u32, vreinterpretq_u8_u64,
vreinterpretq_s32_f16, vreinterpretq_s32_f32, vreinterpretq_u16_f16, vreinterpretq_u16_f32,
vreinterpretq_u32_f16, vreinterpretq_u32_f32, vreinterpretq_u64_f16, vreinterpretq_u64_f32,
vreinterpretq_u8_f16, vreinterpretq_u8_f32, vreinterpretq_f16_f32, vreinterpretq_f16_s16,
vreinterpretq_f16_s32, vreinterpretq_f16_s64, vreinterpretq_f16_s8, vreinterpretq_f16_u16,
vreinterpretq_f16_u32, vreinterpretq_f16_u64, vreinterpretq_f16_u8, vreinterpretq_f32_f16,
vreinterpretq_f32_s16, vreinterpretq_f32_s32, vreinterpretq_f32_s64, vreinterpretq_f32_s8,
vreinterpretq_f32_u16, vreinterpretq_f32_u32, vreinterpretq_f32_u64, vreinterpretq_f32_u8,
vreinterpretq_s16_f16, vreinterpretq_s16_f32, vreinterpretq_s64_f16, vreinterpretq_s64_f32,
vreinterpretq_s8_f16, vreinterpretq_s8_f32, vuninitializedq_u8, vuninitializedq_u16,
vuninitializedq_u32, vuninitializedq_u64, vuninitializedq_s8, vuninitializedq_s16,
vuninitializedq_s32, vuninitializedq_s64, vuninitializedq_f16, vuninitializedq_f32 and
vuninitializedq.

Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
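For illustration (a minimal sketch assuming the standard ACLE signatures):
vreinterpretq only retypes the 128-bit register contents without emitting a
conversion instruction, and vuninitializedq yields a vector whose contents
are unspecified, handy as a don't-care operand:

    #include <arm_mve.h>

    /* Reuse the raw bits of a float vector as a signed integer vector.  */
    int32x4_t
    bits_as_s32 (float32x4_t f)
    {
      return vreinterpretq_s32_f32 (f);
    }

    /* A vector with unspecified contents; no initialisation code needs to
       be generated for it.  */
    uint32x4_t
    dont_care (void)
    {
      return vuninitializedq_u32 ();
    }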

2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>

	* config/arm/arm_mve.h (vreinterpretq_s16_s32): Define macro.
	(vreinterpretq_s16_s64): Likewise.
	(vreinterpretq_s16_s8): Likewise.
	(vreinterpretq_s16_u16): Likewise.
	(vreinterpretq_s16_u32): Likewise.
	(vreinterpretq_s16_u64): Likewise.
	(vreinterpretq_s16_u8): Likewise.
	(vreinterpretq_s32_s16): Likewise.
	(vreinterpretq_s32_s64): Likewise.
	(vreinterpretq_s32_s8): Likewise.
	(vreinterpretq_s32_u16): Likewise.
	(vreinterpretq_s32_u32): Likewise.
	(vreinterpretq_s32_u64): Likewise.
	(vreinterpretq_s32_u8): Likewise.
	(vreinterpretq_s64_s16): Likewise.
	(vreinterpretq_s64_s32): Likewise.
	(vreinterpretq_s64_s8): Likewise.
	(vreinterpretq_s64_u16): Likewise.
	(vreinterpretq_s64_u32): Likewise.
	(vreinterpretq_s64_u64): Likewise.
	(vreinterpretq_s64_u8): Likewise.
	(vreinterpretq_s8_s16): Likewise.
	(vreinterpretq_s8_s32): Likewise.
	(vreinterpretq_s8_s64): Likewise.
	(vreinterpretq_s8_u16): Likewise.
	(vreinterpretq_s8_u32): Likewise.
	(vreinterpretq_s8_u64): Likewise.
	(vreinterpretq_s8_u8): Likewise.
	(vreinterpretq_u16_s16): Likewise.
	(vreinterpretq_u16_s32): Likewise.
	(vreinterpretq_u16_s64): Likewise.
	(vreinterpretq_u16_s8): Likewise.
	(vreinterpretq_u16_u32): Likewise.
	(vreinterpretq_u16_u64): Likewise.
	(vreinterpretq_u16_u8): Likewise.
	(vreinterpretq_u32_s16): Likewise.
	(vreinterpretq_u32_s32): Likewise.
	(vreinterpretq_u32_s64): Likewise.
	(vreinterpretq_u32_s8): Likewise.
	(vreinterpretq_u32_u16): Likewise.
	(vreinterpretq_u32_u64): Likewise.
	(vreinterpretq_u32_u8): Likewise.
	(vreinterpretq_u64_s16): Likewise.
	(vreinterpretq_u64_s32): Likewise.
	(vreinterpretq_u64_s64): Likewise.
	(vreinterpretq_u64_s8): Likewise.
	(vreinterpretq_u64_u16): Likewise.
	(vreinterpretq_u64_u32): Likewise.
	(vreinterpretq_u64_u8): Likewise.
	(vreinterpretq_u8_s16): Likewise.
	(vreinterpretq_u8_s32): Likewise.
	(vreinterpretq_u8_s64): Likewise.
	(vreinterpretq_u8_s8): Likewise.
	(vreinterpretq_u8_u16): Likewise.
	(vreinterpretq_u8_u32): Likewise.
	(vreinterpretq_u8_u64): Likewise.
	(vreinterpretq_s32_f16): Likewise.
	(vreinterpretq_s32_f32): Likewise.
	(vreinterpretq_u16_f16): Likewise.
	(vreinterpretq_u16_f32): Likewise.
	(vreinterpretq_u32_f16): Likewise.
	(vreinterpretq_u32_f32): Likewise.
	(vreinterpretq_u64_f16): Likewise.
	(vreinterpretq_u64_f32): Likewise.
	(vreinterpretq_u8_f16): Likewise.
	(vreinterpretq_u8_f32): Likewise.
	(vreinterpretq_f16_f32): Likewise.
	(vreinterpretq_f16_s16): Likewise.
	(vreinterpretq_f16_s32): Likewise.
	(vreinterpretq_f16_s64): Likewise.
	(vreinterpretq_f16_s8): Likewise.
	(vreinterpretq_f16_u16): Likewise.
	(vreinterpretq_f16_u32): Likewise.
	(vreinterpretq_f16_u64): Likewise.
	(vreinterpretq_f16_u8): Likewise.
	(vreinterpretq_f32_f16): Likewise.
	(vreinterpretq_f32_s16): Likewise.
	(vreinterpretq_f32_s32): Likewise.
	(vreinterpretq_f32_s64): Likewise.
	(vreinterpretq_f32_s8): Likewise.
	(vreinterpretq_f32_u16): Likewise.
	(vreinterpretq_f32_u32): Likewise.
	(vreinterpretq_f32_u64): Likewise.
	(vreinterpretq_f32_u8): Likewise.
	(vreinterpretq_s16_f16): Likewise.
	(vreinterpretq_s16_f32): Likewise.
	(vreinterpretq_s64_f16): Likewise.
	(vreinterpretq_s64_f32): Likewise.
	(vreinterpretq_s8_f16): Likewise.
	(vreinterpretq_s8_f32): Likewise.
	(vuninitializedq_u8): Likewise.
	(vuninitializedq_u16): Likewise.
	(vuninitializedq_u32): Likewise.
	(vuninitializedq_u64): Likewise.
	(vuninitializedq_s8): Likewise.
	(vuninitializedq_s16): Likewise.
	(vuninitializedq_s32): Likewise.
	(vuninitializedq_s64): Likewise.
	(vuninitializedq_f16): Likewise.
	(vuninitializedq_f32): Likewise.
	(__arm_vuninitializedq_u8): Define intrinsic.
	(__arm_vuninitializedq_u16): Likewise.
	(__arm_vuninitializedq_u32): Likewise.
	(__arm_vuninitializedq_u64): Likewise.
	(__arm_vuninitializedq_s8): Likewise.
	(__arm_vuninitializedq_s16): Likewise.
	(__arm_vuninitializedq_s32): Likewise.
	(__arm_vuninitializedq_s64): Likewise.
	(__arm_vreinterpretq_s16_s32): Likewise.
	(__arm_vreinterpretq_s16_s64): Likewise.
	(__arm_vreinterpretq_s16_s8): Likewise.
	(__arm_vreinterpretq_s16_u16): Likewise.
	(__arm_vreinterpretq_s16_u32): Likewise.
	(__arm_vreinterpretq_s16_u64): Likewise.
	(__arm_vreinterpretq_s16_u8): Likewise.
	(__arm_vreinterpretq_s32_s16): Likewise.
	(__arm_vreinterpretq_s32_s64): Likewise.
	(__arm_vreinterpretq_s32_s8): Likewise.
	(__arm_vreinterpretq_s32_u16): Likewise.
	(__arm_vreinterpretq_s32_u32): Likewise.
	(__arm_vreinterpretq_s32_u64): Likewise.
	(__arm_vreinterpretq_s32_u8): Likewise.
	(__arm_vreinterpretq_s64_s16): Likewise.
	(__arm_vreinterpretq_s64_s32): Likewise.
	(__arm_vreinterpretq_s64_s8): Likewise.
	(__arm_vreinterpretq_s64_u16): Likewise.
	(__arm_vreinterpretq_s64_u32): Likewise.
	(__arm_vreinterpretq_s64_u64): Likewise.
	(__arm_vreinterpretq_s64_u8): Likewise.
	(__arm_vreinterpretq_s8_s16): Likewise.
	(__arm_vreinterpretq_s8_s32): Likewise.
	(__arm_vreinterpretq_s8_s64): Likewise.
	(__arm_vreinterpretq_s8_u16): Likewise.
	(__arm_vreinterpretq_s8_u32): Likewise.
	(__arm_vreinterpretq_s8_u64): Likewise.
	(__arm_vreinterpretq_s8_u8): Likewise.
	(__arm_vreinterpretq_u16_s16): Likewise.
	(__arm_vreinterpretq_u16_s32): Likewise.
	(__arm_vreinterpretq_u16_s64): Likewise.
	(__arm_vreinterpretq_u16_s8): Likewise.
	(__arm_vreinterpretq_u16_u32): Likewise.
	(__arm_vreinterpretq_u16_u64): Likewise.
	(__arm_vreinterpretq_u16_u8): Likewise.
	(__arm_vreinterpretq_u32_s16): Likewise.
	(__arm_vreinterpretq_u32_s32): Likewise.
	(__arm_vreinterpretq_u32_s64): Likewise.
	(__arm_vreinterpretq_u32_s8): Likewise.
	(__arm_vreinterpretq_u32_u16): Likewise.
	(__arm_vreinterpretq_u32_u64): Likewise.
	(__arm_vreinterpretq_u32_u8): Likewise.
	(__arm_vreinterpretq_u64_s16): Likewise.
	(__arm_vreinterpretq_u64_s32): Likewise.
	(__arm_vreinterpretq_u64_s64): Likewise.
	(__arm_vreinterpretq_u64_s8): Likewise.
	(__arm_vreinterpretq_u64_u16): Likewise.
	(__arm_vreinterpretq_u64_u32): Likewise.
	(__arm_vreinterpretq_u64_u8): Likewise.
	(__arm_vreinterpretq_u8_s16): Likewise.
	(__arm_vreinterpretq_u8_s32): Likewise.
	(__arm_vreinterpretq_u8_s64): Likewise.
	(__arm_vreinterpretq_u8_s8): Likewise.
	(__arm_vreinterpretq_u8_u16): Likewise.
	(__arm_vreinterpretq_u8_u32): Likewise.
	(__arm_vreinterpretq_u8_u64): Likewise.
	(__arm_vuninitializedq_f16): Likewise.
	(__arm_vuninitializedq_f32): Likewise.
	(__arm_vreinterpretq_s32_f16): Likewise.
	(__arm_vreinterpretq_s32_f32): Likewise.
	(__arm_vreinterpretq_s16_f16): Likewise.
	(__arm_vreinterpretq_s16_f32): Likewise.
	(__arm_vreinterpretq_s64_f16): Likewise.
	(__arm_vreinterpretq_s64_f32): Likewise.
	(__arm_vreinterpretq_s8_f16): Likewise.
	(__arm_vreinterpretq_s8_f32): Likewise.
	(__arm_vreinterpretq_u16_f16): Likewise.
	(__arm_vreinterpretq_u16_f32): Likewise.
	(__arm_vreinterpretq_u32_f16): Likewise.
	(__arm_vreinterpretq_u32_f32): Likewise.
	(__arm_vreinterpretq_u64_f16): Likewise.
	(__arm_vreinterpretq_u64_f32): Likewise.
	(__arm_vreinterpretq_u8_f16): Likewise.
	(__arm_vreinterpretq_u8_f32): Likewise.
	(__arm_vreinterpretq_f16_f32): Likewise.
	(__arm_vreinterpretq_f16_s16): Likewise.
	(__arm_vreinterpretq_f16_s32): Likewise.
	(__arm_vreinterpretq_f16_s64): Likewise.
	(__arm_vreinterpretq_f16_s8): Likewise.
	(__arm_vreinterpretq_f16_u16): Likewise.
	(__arm_vreinterpretq_f16_u32): Likewise.
	(__arm_vreinterpretq_f16_u64): Likewise.
	(__arm_vreinterpretq_f16_u8): Likewise.
	(__arm_vreinterpretq_f32_f16): Likewise.
	(__arm_vreinterpretq_f32_s16): Likewise.
	(__arm_vreinterpretq_f32_s32): Likewise.
	(__arm_vreinterpretq_f32_s64): Likewise.
	(__arm_vreinterpretq_f32_s8): Likewise.
	(__arm_vreinterpretq_f32_u16): Likewise.
	(__arm_vreinterpretq_f32_u32): Likewise.
	(__arm_vreinterpretq_f32_u64): Likewise.
	(__arm_vreinterpretq_f32_u8): Likewise.
	(vuninitializedq): Define polymorphic variant.
	(vreinterpretq_f16): Likewise.
	(vreinterpretq_f32): Likewise.
	(vreinterpretq_s16): Likewise.
	(vreinterpretq_s32): Likewise.
	(vreinterpretq_s64): Likewise.
	(vreinterpretq_s8): Likewise.
	(vreinterpretq_u16): Likewise.
	(vreinterpretq_u32): Likewise.
	(vreinterpretq_u64): Likewise.
	(vreinterpretq_u8): Likewise.

gcc/testsuite/ChangeLog:

2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>

	* gcc.target/arm/mve/intrinsics/vuninitializedq_float.c: New test.
	* gcc.target/arm/mve/intrinsics/vuninitializedq_float1.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vuninitializedq_int.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vuninitializedq_int1.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vreinterpretq_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vreinterpretq_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vreinterpretq_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vreinterpretq_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vreinterpretq_s64.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vreinterpretq_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vreinterpretq_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vreinterpretq_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vreinterpretq_u64.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vreinterpretq_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vuninitializedq_float.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vuninitializedq_float1.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vuninitializedq_int.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vuninitializedq_int1.c: Likewise.
2020-03-20 11:50:21 +00:00
Srinath Parvathaneni
3eff57aacf [ARM][GCC][6x]: MVE ACLE vaddq intrinsics using arithmetic plus operator.
This patch supports the following MVE ACLE vaddq intrinsics.  The RTL patterns
for these intrinsics are added using the arithmetic "plus" operator.

vaddq_s8, vaddq_s16, vaddq_s32, vaddq_u8, vaddq_u16, vaddq_u32, vaddq_f16, vaddq_f32.

Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
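A small hedged sketch (standard ACLE signatures and an MVE-enabled target
assumed): because the RTL is built on the generic "plus" operator, the same
vector addition is reachable both through the explicit intrinsic and,
potentially, through auto-vectorization of a plain loop:

    #include <stdint.h>
    #include <arm_mve.h>

    float32x4_t
    add_f32 (float32x4_t a, float32x4_t b)
    {
      return vaddq_f32 (a, b);   /* Explicit intrinsic.  */
    }

    void
    add_arrays (int32_t *restrict d, const int32_t *restrict a,
                const int32_t *restrict b, int n)
    {
      /* A candidate loop for the vectorizer via the standard add<mode>3
         pattern.  */
      for (int i = 0; i < n; i++)
        d[i] = a[i] + b[i];
    }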

2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
            Andre Vieira  <andre.simoesdiasvieira@arm.com>
            Mihail Ionescu  <mihail.ionescu@arm.com>

	* config/arm/arm_mve.h (vaddq_s8): Define macro.
	(vaddq_s16): Likewise.
	(vaddq_s32): Likewise.
	(vaddq_u8): Likewise.
	(vaddq_u16): Likewise.
	(vaddq_u32): Likewise.
	(vaddq_f16): Likewise.
	(vaddq_f32): Likewise.
	(__arm_vaddq_s8): Define intrinsic.
	(__arm_vaddq_s16): Likewise.
	(__arm_vaddq_s32): Likewise.
	(__arm_vaddq_u8): Likewise.
	(__arm_vaddq_u16): Likewise.
	(__arm_vaddq_u32): Likewise.
	(__arm_vaddq_f16): Likewise.
	(__arm_vaddq_f32): Likewise.
	(vaddq): Define polymorphic variant.
	* config/arm/iterators.md (VNIM): Define mode iterator for common types
	Neon, IWMMXT and MVE.
	(VNINOTM): Likewise.
	* config/arm/mve.md (mve_vaddq<mode>): Define RTL pattern.
	(mve_vaddq_f<mode>): Define RTL pattern.
	* config/arm/neon.md (add<mode>3): Rename to addv4hf3 RTL pattern.
	(addv8hf3_neon): Define RTL pattern.
	* config/arm/vec-common.md (add<mode>3): Modify standard add RTL pattern
	to support MVE.
	(addv8hf3): Define standard RTL pattern for MVE and Neon.
	(add<mode>3): Modify existing standard add RTL pattern for Neon and IWMMXT.

gcc/testsuite/ChangeLog:

2020-03-20  Srinath Parvathaneni  <srinath.parvathaneni@arm.com>
            Andre Vieira  <andre.simoesdiasvieira@arm.com>
            Mihail Ionescu  <mihail.ionescu@arm.com>

	* gcc.target/arm/mve/intrinsics/vaddq_f16.c: New test.
	* gcc.target/arm/mve/intrinsics/vaddq_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vaddq_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vaddq_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vaddq_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vaddq_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vaddq_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vaddq_u8.c: Likewise.
2020-03-20 11:44:08 +00:00
Martin Liska
7d4549b2cd
Use the correct offset in ipa_get_jf_ancestor_result.
PR ipa/94232
	* ipa-cp.c (ipa_get_jf_ancestor_result): Use offset in bytes.  Previously
	the build_ref_for_offset function was used, which converts the offset
	from bits to bytes.
2020-03-20 11:01:13 +01:00
Richard Biener
8fefa21fcf tree-optimization/94266 - fix object type extraction heuristics
This fixes the heuristic deriving an actual object type from a
MEM_REF's pointer operand to use the more sensible type of an
actual object instead of the pointed-to type.

2020-03-20  Richard Biener  <rguenther@suse.de>

	PR tree-optimization/94266
	* gimple-ssa-sprintf.c (get_origin_and_offset): Use the
	type of the underlying object to adjust for the containing
	field if available.
2020-03-20 10:52:45 +01:00
Andre Simoes Dias Vieira
719c864225 gcc, Arm: Revert changes to {get,set}_fpscr
MVE made changes to {get,set}_fpscr to enable the compiler to optimize
unnecessary gets and sets when using these for intrinsics that use and/or write
the carry bit.  However, these actually get and set the full FPSCR register and
are used by fp env intrinsics to modify the fp context.  So MVE should not be
using these.
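For reference, the sort of full-FPSCR read-modify-write the fp environment
code performs can be sketched with GCC's documented ARM builtins (the
FPSCR_RMODE_MASK value below is an illustrative assumption, not a
definition taken from GCC):

    /* Assumed layout: the RMode field occupies FPSCR bits [23:22].  */
    #define FPSCR_RMODE_MASK  (3u << 22)

    static inline void
    set_rounding_mode (unsigned int rmode)
    {
      unsigned int fpscr = __builtin_arm_get_fpscr ();
      fpscr = (fpscr & ~FPSCR_RMODE_MASK) | ((rmode & 3u) << 22);
      __builtin_arm_set_fpscr (fpscr);
    }

Because these builtins access the whole register and are relied on by the
fp environment intrinsics, MVE should not treat them as carry-bit-only
accessors, hence the revert.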

gcc/ChangeLog:
2020-03-20  Andre Vieira  <andre.simoesdiasvieira@arm.com>

	* config/arm/unspecs.md (UNSPEC_GET_FPSCR): Rename this to ...
	(VUNSPEC_GET_FPSCR): ... this, and move it to vunspec.
	* config/arm/vfp.md: (get_fpscr, set_fpscr): Revert to old patterns.
2020-03-20 09:22:47 +00:00
Andre Simoes Dias Vieira
005f6fc59e gcc, Arm: Fix testisms for MVE testsuite
This patch fixes some testisms where -mfpu=auto was missing or where we could
end up with -mfloat-abi=hard and soft on the same command-line.

gcc/testsuite/ChangeLog:
2020-03-20  Andre Vieira  <andre.simoesdiasvieira@arm.com>

	* gcc.target/arm/mve/intrinsics/mve_fp_fpu1.c: Fix testisms.
	* gcc.target/arm/mve/intrinsics/mve_fp_fpu2.c: Likewise.
	* gcc.target/arm/mve/intrinsics/mve_fpu1.c: Likewise.
	* gcc.target/arm/mve/intrinsics/mve_fpu2.c: Likewise.
	* gcc.target/arm/mve/intrinsics/mve_fpu3.c: Likewise.
	* gcc.target/arm/mve/intrinsics/mve_libcall1.c: Likewise.
	* gcc.target/arm/mve/intrinsics/mve_libcall2.c: Likewise.
	* gcc.target/arm/mve/intrinsics/mve_vector_float.c: Likewise.
	* gcc.target/arm/mve/intrinsics/mve_vector_float1.c: Likewise.
	* gcc.target/arm/mve/intrinsics/mve_vector_float2.c: Likewise.
	* gcc.target/arm/mve/intrinsics/mve_vector_int.c: Likewise.
	* gcc.target/arm/mve/intrinsics/mve_vector_int1.c: Likewise.
	* gcc.target/arm/mve/intrinsics/mve_vector_int2.c: Likewise.
	* gcc.target/arm/mve/intrinsics/mve_vector_uint.c: Likewise.
	* gcc.target/arm/mve/intrinsics/mve_vector_uint1.c: Likewise.
	* gcc.target/arm/mve/intrinsics/mve_vector_uint2.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vshrntq_m_n_u32.c: Likewise.
2020-03-20 09:12:44 +00:00
Andre Simoes Dias Vieira
0efe7d8796 gcc, Arm: Fix MVE move from GPR -> GPR
This patch fixes the pattern mve_mov for the case where both MVE vectors are in
R registers and the move does not get optimized away.  I use the same approach
as we do for NEON, where we use four register moves.

gcc/ChangeLog:
2020-03-20  Andre Vieira  <andre.simoesdiasvieira@arm.com>

	* config/arm/mve.md (mve_mov<mode>): Fix R->R case.

gcc/testsuite/ChangeLog:
2020-03-20  Andre Vieira  <andre.simoesdiasvieira@arm.com>

	* gcc.target/arm/mve/intrinsics/mve_move_gpr_to_gpr.c: New test.
2020-03-20 09:08:39 +00:00
Jakub Jelinek
4119cd693d store-merging: Fix up -fnon-call-exceptions handling [PR94224]
When we are adding a single store into a store group, we are already
checking that store->lp_nr matches, but we also have code to add further
INTEGER_CST stores into the group right away if the ordering requires that
either we put there all or none from a certain set of stores.  And in those
cases we weren't doing these lp_nr checks, which means we could end up with
stores with different lp_nr in the same group, which then ICEs during
output_merged_store.

2020-03-20  Jakub Jelinek  <jakub@redhat.com>

	PR tree-optimization/94224
	* gimple-ssa-store-merging.c
	(imm_store_chain_info::coalesce_immediate): Don't consider overlapping
	or adjacent INTEGER_CST rhs_code stores as mergeable if they have
	different lp_nr.

	* g++.dg/tree-ssa/pr94224.C: New test.
2020-03-20 09:33:38 +01:00
Andre Simoes Dias Vieira
05009698ee gcc, Arm: Fix no_cond issue introduced by MVE
This was a matter of mistaken logic in (define_attr "conds" ...).  It was
setting the conds attribute for any neon instruction to no_cond, which was
messing up code generation.

gcc/ChangeLog:
2020-03-20  Andre Vieira  <andre.simoesdiasvieira@arm.com>

	* config/arm/arm.md (define_attr "conds"): Fix logic for neon and mve.
2020-03-20 08:27:30 +00:00
Bin Bin Lv
4a18f168f4 [rs6000] Rewrite the declaration of a variable
Move the declaration of toc_section from the source file rs6000.c to its
header file to standardize the code.

Bootstrap and regression were done on powerpc64le-linux-gnu (LE) with no
regressions.

gcc/ChangeLog

2020-03-20  Bin Bin Lv  <shlb.linux.ibm.com>

	* config/rs6000/rs6000-internal.h (toc_section): Remove the
	declaration.
	* config/rs6000/rs6000.h (toc_section): Add the declaration.
	* config/rs6000/rs6000.c (toc_section): Remove the declaration.
2020-03-19 23:32:44 -04:00
Jason Merrill
94e2418780 c++: Avoid unnecessary empty class copy [94175].
A simple empty class copy is still simple when wrapped in a TARGET_EXPR, so
we need to strip that as well.  This change also exposed some unnecessary
copies in return statements, which when returning by invisible reference led
to <RETURN_EXPR <MEM_REF <RESULT_DECL>>>, which gimplify_return_expr didn't
like.  So we also need to strip the _REF when we eliminate the INIT_EXPR.

gcc/cp/ChangeLog
2020-03-19  Jason Merrill  <jason@redhat.com>

	PR c++/94175
	* cp-gimplify.c (simple_empty_class_p): Look through
	SIMPLE_TARGET_EXPR_P.
	(cp_gimplify_expr) [MODIFY_EXPR]: Likewise.
	[RETURN_EXPR]: Avoid producing 'return *retval;'.
	* call.c (build_call_a): Strip TARGET_EXPR from empty class arg.
	* cp-tree.h (SIMPLE_TARGET_EXPR_P): Check that TARGET_EXPR_INITIAL
	is non-null.
2020-03-19 21:47:28 -04:00
GCC Administrator
3373d3e38e Daily bump. 2020-03-20 00:16:30 +00:00
Jan Hubicka
f7dceb4e65 Fix cgraph_node::function_symbol availability computation [PR94202]
This fixes an ICE in the inliner cache sanity check which is caused by a very
old bug in the visibility calculation in cgraph_node::function_symbol and
cgraph_node::function_or_virtual_thunk_symbol.

In the testcase there is an indirect call to a thunk.  At the beginning we
correctly see its body as AVAIL_AVAILABLE, but later we inline into the thunk
and this turns it into AVAIL_INTERPOSABLE.

This is because function_symbol incorrectly overwrites the availability
parameter with the availability of the alias used in the call within the
thunk, which is a local alias.

gcc/ChangeLog:

2020-03-19  Jan Hubicka  <hubicka@ucw.cz>

	PR ipa/94202
	* cgraph.c (cgraph_node::function_symbol): Fix availability computation.
	(cgraph_node::function_or_virtual_thunk_symbol): Likewise.

gcc/testsuite/ChangeLog:

2020-03-19  Jan Hubicka  <hubicka@ucw.cz>

	PR ipa/94202
	* g++.dg/torture/pr94202.C: New test.
2020-03-20 00:42:13 +01:00
Jakub Jelinek
9def91e9f2 c: Fix up cfun->function_end_locus from the C FE [PR94029]
On the following testcase we ICE because while
      DECL_STRUCT_FUNCTION (current_function_decl)->function_start_locus
        = c_parser_peek_token (parser)->location;
and similarly DECL_SOURCE_LOCATION (fndecl) is set from some token's
location, the end is set as:
  /* Store the end of the function, so that we get good line number
     info for the epilogue.  */
  cfun->function_end_locus = input_location;
and the thing is that input_location is only very rarely set in the C FE
(the primary spot that changes it is the cb_line_change/fe_file_change).
Which means, e.g. for pretty much all C functions that are on a single line,
the function_start_locus column is greater than the function_end_locus column,
and the testcase even has a smaller line in function_end_locus because
cb_line_change
isn't performed while parsing multi-line arguments of a function-like macro.

Attached are two possible fixes to achieve what the C++ FE does, in
particular that cfun->function_end_locus is the locus of the closing } of
the function.  The first one updates input_location when we see a closing }
of a compound statement (though any, not just the function body) and thus
input_location in the finish_function call is what we need.
The second instead propagates the location_t from the parsing of the
outermost compound statement (the function body) to finish_function.
The second one is this version.
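A hypothetical illustration of the failure mode described above (not taken
from the actual testcase):

    /* Single-line function: the old code set cfun->function_end_locus from
       a stale input_location, so the recorded end could precede the start.  */
    int one_liner (int x) { return x * 2; }

    /* Multi-line arguments to a function-like macro: cb_line_change is not
       run while they are parsed, so input_location can even point at an
       earlier line than function_start_locus.  */
    #define ADD(a, b) ((a) + (b))
    int with_macro (int x) { return ADD (x,
                                         1); }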

2020-03-19  Jakub Jelinek  <jakub@redhat.com>

	PR gcov-profile/94029
	* c-tree.h (finish_function): Add location_t argument defaulted to
	input_location.
	* c-parser.c (c_parser_compound_statement): Add endlocp argument and
	set it to the locus of closing } if non-NULL.
	(c_parser_compound_statement_nostart): Return locus of closing }.
	(c_parser_parse_rtl_body): Likewise.
	(c_parser_declaration_or_fndef): Propagate locus of closing } to
	finish_function.
	* c-decl.c (finish_function): Add end_loc argument, use it instead of
	input_location to set function_end_locus.

	* gcc.misc-tests/gcov-pr94029.c: New test.
2020-03-19 22:56:20 +01:00
Iain Buclaw
37482edc3f d/dmd: Merge upstream dmd d1a606599
Fixes a long-standing regression in the D front-end implementation, and adds
a new field to allow retrieving a list of all content imports from the
code generator.

Reviewed-on: https://github.com/dlang/dmd/pull/10913
	     https://github.com/dlang/dmd/pull/10933
2020-03-19 18:34:52 +01:00
Jan Hubicka
f22712bd8a Fix inliner ICE on alias with flatten attribute [PR92372]
gcc/ChangeLog:

2020-03-19  Jan Hubicka  <hubicka@ucw.cz>

	PR ipa/92372
	* cgraphunit.c (process_function_and_variable_attributes): Warn
	for flatten attribute on alias.
	* ipa-inline.c (ipa_inline): Do not ICE on flatten attribute on alias.

gcc/testsuite/ChangeLog:

2020-03-19  Jan Hubicka  <hubicka@ucw.cz>

	PR ipa/92372
	* gcc.c-torture/pr92372.c: New test.
	* gcc.dg/attr-flatten-1.c: New test.
2020-03-19 17:12:56 +01:00
Martin Liska
c8429c2aba
API extension for binutils (type of symbols).
* lto-section-in.c: Add ext_symtab.
	* lto-streamer-out.c (write_symbol_extension_info): New.
	(produce_symtab_extension): New.
	(produce_asm_for_decls): Stream also produce_symtab_extension.
	* lto-streamer.h (enum lto_section_type): New section.
	* lto-symtab.h (enum gcc_plugin_symbol_type): New.
	(enum gcc_plugin_symbol_section_kind): Likewise.
	* lto-plugin.c (LTO_SECTION_PREFIX): Rename to ...
	(LTO_SYMTAB_PREFIX): ... this.
	(LTO_SECTION_PREFIX_LEN): Rename to ...
	(LTO_SYMTAB_PREFIX_LEN): ... this.
	(LTO_SYMTAB_EXT_PREFIX): New.
	(LTO_SYMTAB_EXT_PREFIX_LEN): New.
	(LTO_LTO_PREFIX): New.
	(LTO_LTO_PREFIX_LEN): New.
	(parse_table_entry): Fill up unused to zero.
	(parse_table_entry_extension): New.
	(parse_symtab_extension): New.
	(finish_conflict_resolution): Change type
	for resolution.
	(process_symtab): Use new macro name.
	(process_symtab_extension): New.
	(claim_file_handler): Parse also process_symtab_extension.
	(onload): Call new add_symbols_v2.
2020-03-19 16:56:27 +01:00
Martin Liska
f5389e17e4
Update include/plugin-api.h.
* plugin-api.h (struct ld_plugin_symbol): Split
	int def into 4 char fields.
	(enum ld_plugin_symbol_type): New.
	(enum ld_plugin_symbol_section_kind): New.
	(enum ld_plugin_tag): Add LDPT_ADD_SYMBOLS_V2.
2020-03-19 16:55:59 +01:00
Jakub Jelinek
02f7334ac9 c++: Fix up handling of captured vars in lambdas in OpenMP clauses [PR93931]
Without the parser.c change we were ICEing on the testcase, because while the
uses of the captured vars inside of the constructs were replaced with capture
proxy decls, we didn't do that for decls in OpenMP clauses.

With that fixed, we don't ICE anymore, but the testcase is miscompiled and FAILs
at runtime.  This is because the capture proxy decls have DECL_VALUE_EXPR and
during gimplification we were gimplifying those to their DECL_VALUE_EXPRs.
That is fine for shared vars, but for privatized ones we must not do that.
So that is what the cp-gimplify.c changes do.  Had to add a DECL_CONTEXT check
before calling is_capture_proxy because some VAR_DECLs don't have DECL_CONTEXT
set (yet) and is_capture_proxy relies on that being non-NULL always.
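
A rough sketch of the kind of code involved (not the actual
libgomp.c++/pr93931.C test; compile with -fopenmp):

#include <cassert>

int
main ()
{
  int x = 1;
  int out[4] = { 0, 0, 0, 0 };
  auto f = [&] ()
    {
      /* Inside the lambda "x" is really a capture proxy, and it is named
	 both in the clause and in the loop body.  */
#pragma omp parallel for firstprivate (x)
      for (int i = 0; i < 4; i++)
	out[i] = x + i;
    };
  f ();
  for (int i = 0; i < 4; i++)
    assert (out[i] == 1 + i);
  assert (x == 1);  /* x was privatized, so the original is untouched.  */
  return 0;
}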

2020-03-19  Jakub Jelinek  <jakub@redhat.com>

	PR c++/93931
	* parser.c (cp_parser_omp_var_list_no_open): Call process_outer_var_ref
	on outer_automatic_var_p decls.
	* cp-gimplify.c (cxx_omp_disregard_value_expr): Return true also for
	capture proxy decls.

	* testsuite/libgomp.c++/pr93931.C: New test.
2020-03-19 12:22:47 +01:00
Tobias Burnus
bb83e069eb libgomp/testsuite: ignore blank-line output for function-not-offloaded.c
* testsuite/libgomp.c-c++-common/function-not-offloaded.c: Add
	dg-allow-blank-lines-in-output.
2020-03-19 11:42:49 +01:00
Jakub Jelinek
c7e9019681 phiopt: Avoid -fcompare-debug bug in phiopt [PR94211]
Two years ago, I added support for up to 2 simple preparation statements
in value_replacement, but the
-      && estimate_num_insns (assign, &eni_time_weights)
+      && estimate_num_insns (bb_seq (middle_bb), &eni_time_weights)
change, which was meant to compute the cost of all those statements rather
than just the single assign that had previously been the only supported
non-debug statement in the bb, doesn't do what I thought it would:
gimple_seq is just gimple * and thus the call can't really be overloaded
depending on whether we pass a single gimple * or a whole sequence.  Which
means that for the last two years it hasn't counted all the statements,
but only the first one.  With -g that happens to be a DEBUG_STMT, or it
could be e.g. the first preparation statement, which could be much cheaper
than the actual assign.
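
A sketch of the kind of code value_replacement looks at (made up for
illustration, not the gcc.dg/pr94211.c test; compile with -O2):

long long
f (long long x, int y)
{
  long long r = 0;
  if (y != 0)
    r = x * y;  /* middle bb: widening conversion of y, then the multiply */
  return r;
}

Since x * y is 0 whenever y is 0, value_replacement can, in principle, drop
the branch and keep just the multiplication, but only when the whole middle
block is estimated to be cheap enough: that is what the
estimate_num_insns_seq call is supposed to measure, independently of -g.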

2020-03-19  Jakub Jelinek  <jakub@redhat.com>

	PR tree-optimization/94211
	* tree-ssa-phiopt.c (value_replacement): Use estimate_num_insns_seq
	instead of estimate_num_insns for bb_seq (middle_bb).  Rename
	emtpy_or_with_defined_p variable to empty_or_with_defined_p, adjust
	all uses.

	* gcc.dg/pr94211.c: New test.
2020-03-19 10:24:16 +01:00
Richard Biener
f3280e4c0c ipa/94217 simplify offsetted address build
This avoids using build_ref_for_offset and build_fold_addr_expr
where type mixup easily results in something not IP invariant.

2020-03-19  Richard Biener  <rguenther@suse.de>

	PR ipa/94217
	* ipa-cp.c (ipa_get_jf_ancestor_result): Avoid build_fold_addr_expr
	and build_ref_for_offset.
2020-03-19 10:20:30 +01:00
Richard Biener
73bc09fa8c middle-end/94216 fix another build_fold_addr_expr use
2020-03-19  Richard Biener  <rguenther@suse.de>

	PR middle-end/94216
	* fold-const.c (fold_binary_loc): Avoid using
	build_fold_addr_expr when we really want an ADDR_EXPR.

	* g++.dg/torture/pr94216.C: New testcase.
2020-03-19 10:18:01 +01:00
GCC Administrator
b5562f1187 Daily bump. 2020-03-19 00:16:20 +00:00
Jonathan Wakely
b334182653 libstdc++: Fix is_trivially_constructible (PR 94033)
This attempts to make is_nothrow_constructible more robust (and
efficient to compile) by not depending on is_constructible. Instead the
__is_constructible intrinsic is used directly. The helper class
__is_nt_constructible_impl which checks whether the construction is
non-throwing now takes a bool template parameter that is substituted by
the result of the intrinsic. This fixes the reported bug by not using
the already-instantiated (and incorrect) value of std::is_constructible.
I don't think it really fixes the problem in general, because
std::is_nothrow_constructible itself could already have been
instantiated in a context where it gives the wrong result. A proper fix
needs to be done in the compiler.
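
A minimal sketch of the shape described above (simplified names, C++17,
GCC/Clang-specific because of the intrinsic; this is not the actual
libstdc++ code):

#include <type_traits>
#include <utility>

template<bool /* constructible */, typename T, typename... Args>
  struct nt_ctor_impl : std::false_type { };

/* The noexcept check is only instantiated when the intrinsic already said
   the construction is valid at all.  */
template<typename T, typename... Args>
  struct nt_ctor_impl<true, T, Args...>
  : std::bool_constant<noexcept (T (std::declval<Args> ()...))> { };

template<typename T, typename... Args>
  using nothrow_ctor
    = nt_ctor_impl<__is_constructible (T, Args...), T, Args...>;

static_assert (nothrow_ctor<int, int>::value, "");
static_assert (!nothrow_ctor<int, void>::value, "");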

	PR libstdc++/94033
	* include/std/type_traits (__is_nt_default_constructible_atom): Remove.
	(__is_nt_default_constructible_impl): Remove.
	(__is_nothrow_default_constructible_impl): Remove.
	(__is_nt_constructible_impl): Add bool template parameter. Adjust
	partial specializations.
	(__is_nothrow_constructible_impl): Replace class template with alias
	template.
	(is_nothrow_default_constructible): Derive from alias template
	__is_nothrow_constructible_impl instead of
	__is_nothrow_default_constructible_impl.
	* testsuite/20_util/is_nothrow_constructible/94003.cc: New test.
2020-03-18 23:19:35 +00:00
Segher Boessenkool
07fe4af4d5 rs6000: Add back some w* constraints (PR91886)
In May and June last year I deleted many of our (vector) constraints.
We can now just use "wa" for those, together with some other
conditions, which can be per alternative using the "enabled" attribute
(which in turn primarily uses the "isa" attribute).

But, it turns out that Clang implements some of those constraints as
well, and at least musl uses some of them.  It is easy for us to add
those constraints back (as undocumented aliases to "wa", which always
did mean the same thing for valid inline assembler code), so do that.
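
For example, inline assembler like the following (a made-up sketch, similar
in spirit to such uses; compile for a VSX-enabled target, e.g. -mcpu=power8)
spells the constraint letters directly, so they have to keep being accepted:

typedef double v2df __attribute__ ((vector_size (16)));

v2df
add2 (v2df a, v2df b)
{
  v2df r;
  /* "wd" used to mean "VSX register holding vector double"; it now simply
     maps to "wa" again.  */
  __asm__ ("xvadddp %x0,%x1,%x2" : "=wd" (r) : "wd" (a), "wd" (b));
  return r;
}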

gcc/
	* config/rs6000/constraints.md (wd, wf, wi, ws, ww): New undocumented
	aliases for "wa".
2020-03-18 23:17:28 +00:00
Jeff Law
529ea7d959 Complete change to resolve pr90275.
PR rtl-optimization/90275
	* cse.c (cse_insn): Delete no-op register moves too.
2020-03-18 16:07:28 -06:00
Martin Sebor
3512dc0108 PR ipa/92799 - ICE on a weakref function definition followed by a declaration
gcc/testsuite/ChangeLog:

	PR ipa/92799
	* gcc.dg/attr-weakref-5.c: New test.

gcc/ChangeLog:

	PR ipa/92799
	* cgraphunit.c (process_function_and_variable_attributes): Also
	complain about weakref function definitions and drop all effects
	of the attribute.
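
One shape of code in this area (a sketch only; the actual
gcc.dg/attr-weakref-5.c test may differ):

static void target_fn (void) { }

/* A weakref is supposed to be a mere declaration referring to its target;
   giving it a body is now diagnosed and the attribute's effects are
   dropped instead of ICEing later.  */
static __attribute__ ((weakref ("target_fn"))) void alias_fn (void) { }
static void alias_fn (void);  /* ...followed by a declaration.  */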
2020-03-18 14:47:29 -06:00
Srinath Parvathaneni
7a5fffa5ed [ARM][GCC][8/5x]: Remaining MVE store intrinsics which store a halfword, word or double word to memory.
This patch supports the following MVE ACLE store intrinsics, which store a halfword, word or double word to memory.

vstrdq_scatter_base_p_s64, vstrdq_scatter_base_p_u64, vstrdq_scatter_base_s64, vstrdq_scatter_base_u64, vstrdq_scatter_offset_p_s64, vstrdq_scatter_offset_p_u64, vstrdq_scatter_offset_s64, vstrdq_scatter_offset_u64, vstrdq_scatter_shifted_offset_p_s64,
vstrdq_scatter_shifted_offset_p_u64, vstrdq_scatter_shifted_offset_s64,
vstrdq_scatter_shifted_offset_u64, vstrhq_scatter_offset_f16, vstrhq_scatter_offset_p_f16, vstrhq_scatter_shifted_offset_f16, vstrhq_scatter_shifted_offset_p_f16,
vstrwq_scatter_base_f32, vstrwq_scatter_base_p_f32, vstrwq_scatter_offset_f32, vstrwq_scatter_offset_p_f32, vstrwq_scatter_offset_p_s32, vstrwq_scatter_offset_p_u32, vstrwq_scatter_offset_s32, vstrwq_scatter_offset_u32, vstrwq_scatter_shifted_offset_f32,
vstrwq_scatter_shifted_offset_p_f32, vstrwq_scatter_shifted_offset_p_s32,
vstrwq_scatter_shifted_offset_p_u32, vstrwq_scatter_shifted_offset_s32,
vstrwq_scatter_shifted_offset_u32.

Please refer to M-profile Vector Extension (MVE) intrinsics [1]  for more details.
[1]  https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics

In this patch a new predicate "Ri" is defined to check that the immediate is in the range +/-1016 and a multiple of 8.
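
A rough usage sketch of two of these (hypothetical helper functions, not
from the patch; compile with e.g. -O2 -march=armv8.1-m.main+mve.fp
-mfloat-abi=hard):

#include <arm_mve.h>

void
store_dwords_base (uint64x2_t addresses, int64x2_t value)
{
  /* The immediate is the new "Ri" operand: a multiple of 8 in +/-1016.  */
  vstrdq_scatter_base_s64 (addresses, 8, value);
}

void
store_dwords_offset (int64_t *base, uint64x2_t offsets, int64x2_t value)
{
  vstrdq_scatter_shifted_offset_s64 (base, offsets, value);
}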

2020-03-18  Andre Vieira  <andre.simoesdiasvieira@arm.com>
            Mihail Ionescu  <mihail.ionescu@arm.com>
            Srinath Parvathaneni  <srinath.parvathaneni@arm.com>

	* config/arm/arm_mve.h (vstrdq_scatter_base_p_s64): Define macro.
	(vstrdq_scatter_base_p_u64): Likewise.
	(vstrdq_scatter_base_s64): Likewise.
	(vstrdq_scatter_base_u64): Likewise.
	(vstrdq_scatter_offset_p_s64): Likewise.
	(vstrdq_scatter_offset_p_u64): Likewise.
	(vstrdq_scatter_offset_s64): Likewise.
	(vstrdq_scatter_offset_u64): Likewise.
	(vstrdq_scatter_shifted_offset_p_s64): Likewise.
	(vstrdq_scatter_shifted_offset_p_u64): Likewise.
	(vstrdq_scatter_shifted_offset_s64): Likewise.
	(vstrdq_scatter_shifted_offset_u64): Likewise.
	(vstrhq_scatter_offset_f16): Likewise.
	(vstrhq_scatter_offset_p_f16): Likewise.
	(vstrhq_scatter_shifted_offset_f16): Likewise.
	(vstrhq_scatter_shifted_offset_p_f16): Likewise.
	(vstrwq_scatter_base_f32): Likewise.
	(vstrwq_scatter_base_p_f32): Likewise.
	(vstrwq_scatter_offset_f32): Likewise.
	(vstrwq_scatter_offset_p_f32): Likewise.
	(vstrwq_scatter_offset_p_s32): Likewise.
	(vstrwq_scatter_offset_p_u32): Likewise.
	(vstrwq_scatter_offset_s32): Likewise.
	(vstrwq_scatter_offset_u32): Likewise.
	(vstrwq_scatter_shifted_offset_f32): Likewise.
	(vstrwq_scatter_shifted_offset_p_f32): Likewise.
	(vstrwq_scatter_shifted_offset_p_s32): Likewise.
	(vstrwq_scatter_shifted_offset_p_u32): Likewise.
	(vstrwq_scatter_shifted_offset_s32): Likewise.
	(vstrwq_scatter_shifted_offset_u32): Likewise.
	(__arm_vstrdq_scatter_base_p_s64): Define intrinsic.
	(__arm_vstrdq_scatter_base_p_u64): Likewise.
	(__arm_vstrdq_scatter_base_s64): Likewise.
	(__arm_vstrdq_scatter_base_u64): Likewise.
	(__arm_vstrdq_scatter_offset_p_s64): Likewise.
	(__arm_vstrdq_scatter_offset_p_u64): Likewise.
	(__arm_vstrdq_scatter_offset_s64): Likewise.
	(__arm_vstrdq_scatter_offset_u64): Likewise.
	(__arm_vstrdq_scatter_shifted_offset_p_s64): Likewise.
	(__arm_vstrdq_scatter_shifted_offset_p_u64): Likewise.
	(__arm_vstrdq_scatter_shifted_offset_s64): Likewise.
	(__arm_vstrdq_scatter_shifted_offset_u64): Likewise.
	(__arm_vstrwq_scatter_offset_p_s32): Likewise.
	(__arm_vstrwq_scatter_offset_p_u32): Likewise.
	(__arm_vstrwq_scatter_offset_s32): Likewise.
	(__arm_vstrwq_scatter_offset_u32): Likewise.
	(__arm_vstrwq_scatter_shifted_offset_p_s32): Likewise.
	(__arm_vstrwq_scatter_shifted_offset_p_u32): Likewise.
	(__arm_vstrwq_scatter_shifted_offset_s32): Likewise.
	(__arm_vstrwq_scatter_shifted_offset_u32): Likewise.
	(__arm_vstrhq_scatter_offset_f16): Likewise.
	(__arm_vstrhq_scatter_offset_p_f16): Likewise.
	(__arm_vstrhq_scatter_shifted_offset_f16): Likewise.
	(__arm_vstrhq_scatter_shifted_offset_p_f16): Likewise.
	(__arm_vstrwq_scatter_base_f32): Likewise.
	(__arm_vstrwq_scatter_base_p_f32): Likewise.
	(__arm_vstrwq_scatter_offset_f32): Likewise.
	(__arm_vstrwq_scatter_offset_p_f32): Likewise.
	(__arm_vstrwq_scatter_shifted_offset_f32): Likewise.
	(__arm_vstrwq_scatter_shifted_offset_p_f32): Likewise.
	(vstrhq_scatter_offset): Define polymorphic variant.
	(vstrhq_scatter_offset_p): Likewise.
	(vstrhq_scatter_shifted_offset): Likewise.
	(vstrhq_scatter_shifted_offset_p): Likewise.
	(vstrwq_scatter_base): Likewise.
	(vstrwq_scatter_base_p): Likewise.
	(vstrwq_scatter_offset): Likewise.
	(vstrwq_scatter_offset_p): Likewise.
	(vstrwq_scatter_shifted_offset): Likewise.
	(vstrwq_scatter_shifted_offset_p): Likewise.
	(vstrdq_scatter_base_p): Likewise.
	(vstrdq_scatter_base): Likewise.
	(vstrdq_scatter_offset_p): Likewise.
	(vstrdq_scatter_offset): Likewise.
	(vstrdq_scatter_shifted_offset_p): Likewise.
	(vstrdq_scatter_shifted_offset): Likewise.
	* config/arm/arm_mve_builtins.def (STRSBS): Use builtin qualifier.
	(STRSBS_P): Likewise.
	(STRSBU): Likewise.
	(STRSBU_P): Likewise.
	(STRSS): Likewise.
	(STRSS_P): Likewise.
	(STRSU): Likewise.
	(STRSU_P): Likewise.
	* config/arm/constraints.md (Ri): Define.
	* config/arm/mve.md (VSTRDSBQ): Define iterator.
	(VSTRDSOQ): Likewise.
	(VSTRDSSOQ): Likewise.
	(VSTRWSOQ): Likewise.
	(VSTRWSSOQ): Likewise.
	(mve_vstrdq_scatter_base_p_<supf>v2di): Define RTL pattern.
	(mve_vstrdq_scatter_base_<supf>v2di): Likewise.
	(mve_vstrdq_scatter_offset_p_<supf>v2di): Likewise.
	(mve_vstrdq_scatter_offset_<supf>v2di): Likewise.
	(mve_vstrdq_scatter_shifted_offset_p_<supf>v2di): Likewise.
	(mve_vstrdq_scatter_shifted_offset_<supf>v2di): Likewise.
	(mve_vstrhq_scatter_offset_fv8hf): Likewise.
	(mve_vstrhq_scatter_offset_p_fv8hf): Likewise.
	(mve_vstrhq_scatter_shifted_offset_fv8hf): Likewise.
	(mve_vstrhq_scatter_shifted_offset_p_fv8hf): Likewise.
	(mve_vstrwq_scatter_base_fv4sf): Likewise.
	(mve_vstrwq_scatter_base_p_fv4sf): Likewise.
	(mve_vstrwq_scatter_offset_fv4sf): Likewise.
	(mve_vstrwq_scatter_offset_p_fv4sf): Likewise.
	(mve_vstrwq_scatter_offset_p_<supf>v4si): Likewise.
	(mve_vstrwq_scatter_offset_<supf>v4si): Likewise.
	(mve_vstrwq_scatter_shifted_offset_fv4sf): Likewise.
	(mve_vstrwq_scatter_shifted_offset_p_fv4sf): Likewise.
	(mve_vstrwq_scatter_shifted_offset_p_<supf>v4si): Likewise.
	(mve_vstrwq_scatter_shifted_offset_<supf>v4si): Likewise.
	* config/arm/predicates.md (Ri): Define predicate to check that the
	immediate is in the range +/-1016 and a multiple of 8.

gcc/testsuite/ChangeLog:

2020-03-18  Andre Vieira  <andre.simoesdiasvieira@arm.com>
            Mihail Ionescu  <mihail.ionescu@arm.com>
            Srinath Parvathaneni  <srinath.parvathaneni@arm.com>

	* gcc.target/arm/mve/intrinsics/vstrdq_scatter_base_p_s64.c: New test.
	* gcc.target/arm/mve/intrinsics/vstrdq_scatter_base_p_u64.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrdq_scatter_base_s64.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrdq_scatter_base_u64.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrdq_scatter_offset_p_s64.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrdq_scatter_offset_p_u64.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrdq_scatter_offset_s64.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrdq_scatter_offset_u64.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrdq_scatter_shifted_offset_p_s64.c:
	Likewise.
	* gcc.target/arm/mve/intrinsics/vstrdq_scatter_shifted_offset_p_u64.c:
	Likewise.
	* gcc.target/arm/mve/intrinsics/vstrdq_scatter_shifted_offset_s64.c:
	Likewise.
	* gcc.target/arm/mve/intrinsics/vstrdq_scatter_shifted_offset_u64.c:
	Likewise.
	* gcc.target/arm/mve/intrinsics/vstrhq_scatter_offset_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrhq_scatter_offset_p_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrhq_scatter_shifted_offset_f16.c:
	Likewise.
	* gcc.target/arm/mve/intrinsics/vstrhq_scatter_shifted_offset_p_f16.c:
	Likewise.
	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_p_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_offset_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_offset_p_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_offset_p_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_offset_p_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_offset_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_offset_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_shifted_offset_f32.c:
	Likewise.
	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_shifted_offset_p_f32.c:
	Likewise.
	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_shifted_offset_p_s32.c:
	Likewise.
	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_shifted_offset_p_u32.c:
	Likewise.
	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_shifted_offset_s32.c:
	Likewise.
	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_shifted_offset_u32.c:
	Likewise.
2020-03-18 19:16:32 +00:00
Srinath Parvathaneni
5cad47e0f8 [ARM][GCC][7/5x]: MVE store intrinsics which store a byte, halfword or word to memory.
This patch supports the following MVE ACLE store intrinsics, which store a byte, halfword, or word to memory.

vst1q_f32, vst1q_f16, vst1q_s8, vst1q_s32, vst1q_s16, vst1q_u8, vst1q_u32, vst1q_u16, vstrhq_f16, vstrhq_scatter_offset_s32, vstrhq_scatter_offset_s16, vstrhq_scatter_offset_u32, vstrhq_scatter_offset_u16, vstrhq_scatter_offset_p_s32, vstrhq_scatter_offset_p_s16, vstrhq_scatter_offset_p_u32, vstrhq_scatter_offset_p_u16, vstrhq_scatter_shifted_offset_s32,
vstrhq_scatter_shifted_offset_s16, vstrhq_scatter_shifted_offset_u32,
vstrhq_scatter_shifted_offset_u16, vstrhq_scatter_shifted_offset_p_s32,
vstrhq_scatter_shifted_offset_p_s16, vstrhq_scatter_shifted_offset_p_u32,
vstrhq_scatter_shifted_offset_p_u16, vstrhq_s32, vstrhq_s16, vstrhq_u32, vstrhq_u16, vstrhq_p_f16, vstrhq_p_s32, vstrhq_p_s16, vstrhq_p_u32, vstrhq_p_u16, vstrwq_f32, vstrwq_s32, vstrwq_u32, vstrwq_p_f32, vstrwq_p_s32, vstrwq_p_u32.

Please refer to M-profile Vector Extension (MVE) intrinsics [1]  for more details.
[1]  https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
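
A rough usage sketch of a few of these (hypothetical helper functions, not
from the patch; built with MVE enabled, e.g. -march=armv8.1-m.main+mve.fp
-mfloat-abi=hard):

#include <arm_mve.h>

void
store_bytes (int8_t *p, int8x16_t v)
{
  vst1q_s8 (p, v);            /* contiguous store of all 16 lanes */
}

void
store_halfwords_p (int16_t *p, int16x8_t v, mve_pred16_t pred)
{
  vstrhq_p_s16 (p, v, pred);  /* only the predicated-true lanes are written */
}

void
scatter_halfwords (int16_t *base, uint16x8_t offsets, int16x8_t v)
{
  vstrhq_scatter_offset_s16 (base, offsets, v);
}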

2020-03-18  Andre Vieira  <andre.simoesdiasvieira@arm.com>
            Mihail Ionescu  <mihail.ionescu@arm.com>
            Srinath Parvathaneni  <srinath.parvathaneni@arm.com>

	* config/arm/arm_mve.h (vst1q_f32): Define macro.
	(vst1q_f16): Likewise.
	(vst1q_s8): Likewise.
	(vst1q_s32): Likewise.
	(vst1q_s16): Likewise.
	(vst1q_u8): Likewise.
	(vst1q_u32): Likewise.
	(vst1q_u16): Likewise.
	(vstrhq_f16): Likewise.
	(vstrhq_scatter_offset_s32): Likewise.
	(vstrhq_scatter_offset_s16): Likewise.
	(vstrhq_scatter_offset_u32): Likewise.
	(vstrhq_scatter_offset_u16): Likewise.
	(vstrhq_scatter_offset_p_s32): Likewise.
	(vstrhq_scatter_offset_p_s16): Likewise.
	(vstrhq_scatter_offset_p_u32): Likewise.
	(vstrhq_scatter_offset_p_u16): Likewise.
	(vstrhq_scatter_shifted_offset_s32): Likewise.
	(vstrhq_scatter_shifted_offset_s16): Likewise.
	(vstrhq_scatter_shifted_offset_u32): Likewise.
	(vstrhq_scatter_shifted_offset_u16): Likewise.
	(vstrhq_scatter_shifted_offset_p_s32): Likewise.
	(vstrhq_scatter_shifted_offset_p_s16): Likewise.
	(vstrhq_scatter_shifted_offset_p_u32): Likewise.
	(vstrhq_scatter_shifted_offset_p_u16): Likewise.
	(vstrhq_s32): Likewise.
	(vstrhq_s16): Likewise.
	(vstrhq_u32): Likewise.
	(vstrhq_u16): Likewise.
	(vstrhq_p_f16): Likewise.
	(vstrhq_p_s32): Likewise.
	(vstrhq_p_s16): Likewise.
	(vstrhq_p_u32): Likewise.
	(vstrhq_p_u16): Likewise.
	(vstrwq_f32): Likewise.
	(vstrwq_s32): Likewise.
	(vstrwq_u32): Likewise.
	(vstrwq_p_f32): Likewise.
	(vstrwq_p_s32): Likewise.
	(vstrwq_p_u32): Likewise.
	(__arm_vst1q_s8): Define intrinsic.
	(__arm_vst1q_s32): Likewise.
	(__arm_vst1q_s16): Likewise.
	(__arm_vst1q_u8): Likewise.
	(__arm_vst1q_u32): Likewise.
	(__arm_vst1q_u16): Likewise.
	(__arm_vstrhq_scatter_offset_s32): Likewise.
	(__arm_vstrhq_scatter_offset_s16): Likewise.
	(__arm_vstrhq_scatter_offset_u32): Likewise.
	(__arm_vstrhq_scatter_offset_u16): Likewise.
	(__arm_vstrhq_scatter_offset_p_s32): Likewise.
	(__arm_vstrhq_scatter_offset_p_s16): Likewise.
	(__arm_vstrhq_scatter_offset_p_u32): Likewise.
	(__arm_vstrhq_scatter_offset_p_u16): Likewise.
	(__arm_vstrhq_scatter_shifted_offset_s32): Likewise.
	(__arm_vstrhq_scatter_shifted_offset_s16): Likewise.
	(__arm_vstrhq_scatter_shifted_offset_u32): Likewise.
	(__arm_vstrhq_scatter_shifted_offset_u16): Likewise.
	(__arm_vstrhq_scatter_shifted_offset_p_s32): Likewise.
	(__arm_vstrhq_scatter_shifted_offset_p_s16): Likewise.
	(__arm_vstrhq_scatter_shifted_offset_p_u32): Likewise.
	(__arm_vstrhq_scatter_shifted_offset_p_u16): Likewise.
	(__arm_vstrhq_s32): Likewise.
	(__arm_vstrhq_s16): Likewise.
	(__arm_vstrhq_u32): Likewise.
	(__arm_vstrhq_u16): Likewise.
	(__arm_vstrhq_p_s32): Likewise.
	(__arm_vstrhq_p_s16): Likewise.
	(__arm_vstrhq_p_u32): Likewise.
	(__arm_vstrhq_p_u16): Likewise.
	(__arm_vstrwq_s32): Likewise.
	(__arm_vstrwq_u32): Likewise.
	(__arm_vstrwq_p_s32): Likewise.
	(__arm_vstrwq_p_u32): Likewise.
	(__arm_vstrwq_p_f32): Likewise.
	(__arm_vstrwq_f32): Likewise.
	(__arm_vst1q_f32): Likewise.
	(__arm_vst1q_f16): Likewise.
	(__arm_vstrhq_f16): Likewise.
	(__arm_vstrhq_p_f16): Likewise.
	(vst1q): Define polymorphic variant.
	(vstrhq): Likewise.
	(vstrhq_p): Likewise.
	(vstrhq_scatter_offset_p): Likewise.
	(vstrhq_scatter_offset): Likewise.
	(vstrhq_scatter_shifted_offset_p): Likewise.
	(vstrhq_scatter_shifted_offset): Likewise.
	(vstrwq_p): Likewise.
	(vstrwq): Likewise.
	* config/arm/arm_mve_builtins.def (STRS): Use builtin qualifier.
	(STRS_P): Likewise.
	(STRSS): Likewise.
	(STRSS_P): Likewise.
	(STRSU): Likewise.
	(STRSU_P): Likewise.
	(STRU): Likewise.
	(STRU_P): Likewise.
	* config/arm/mve.md (VST1Q): Define iterator.
	(VSTRHSOQ): Likewise.
	(VSTRHSSOQ): Likewise.
	(VSTRHQ): Likewise.
	(VSTRWQ): Likewise.
	(mve_vstrhq_fv8hf): Define RTL pattern.
	(mve_vstrhq_p_fv8hf): Likewise.
	(mve_vstrhq_p_<supf><mode>): Likewise.
	(mve_vstrhq_scatter_offset_p_<supf><mode>): Likewise.
	(mve_vstrhq_scatter_offset_<supf><mode>): Likewise.
	(mve_vstrhq_scatter_shifted_offset_p_<supf><mode>): Likewise.
	(mve_vstrhq_scatter_shifted_offset_<supf><mode>): Likewise.
	(mve_vstrhq_<supf><mode>): Likewise.
	(mve_vstrwq_fv4sf): Likewise.
	(mve_vstrwq_p_fv4sf): Likewise.
	(mve_vstrwq_p_<supf>v4si): Likewise.
	(mve_vstrwq_<supf>v4si): Likewise.
	(mve_vst1q_f<mode>): Define expand.
	(mve_vst1q_<supf><mode>): Likewise.

gcc/testsuite/ChangeLog:

2020-03-18  Andre Vieira  <andre.simoesdiasvieira@arm.com>
            Mihail Ionescu  <mihail.ionescu@arm.com>
            Srinath Parvathaneni  <srinath.parvathaneni@arm.com>

	* gcc.target/arm/mve/intrinsics/vst1q_f16.c: New test.
	* gcc.target/arm/mve/intrinsics/vst1q_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vst1q_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vst1q_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vst1q_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vst1q_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vst1q_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vst1q_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrhq_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrhq_p_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrhq_p_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrhq_p_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrhq_p_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrhq_p_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrhq_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrhq_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrhq_scatter_offset_p_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrhq_scatter_offset_p_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrhq_scatter_offset_p_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrhq_scatter_offset_p_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrhq_scatter_offset_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrhq_scatter_offset_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrhq_scatter_offset_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrhq_scatter_offset_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrhq_scatter_shifted_offset_p_s16.c:
	Likewise.
	* gcc.target/arm/mve/intrinsics/vstrhq_scatter_shifted_offset_p_s32.c:
	Likewise.
	* gcc.target/arm/mve/intrinsics/vstrhq_scatter_shifted_offset_p_u16.c:
	Likewise.
	* gcc.target/arm/mve/intrinsics/vstrhq_scatter_shifted_offset_p_u32.c:
	Likewise.
	* gcc.target/arm/mve/intrinsics/vstrhq_scatter_shifted_offset_s16.c:
	Likewise.
	* gcc.target/arm/mve/intrinsics/vstrhq_scatter_shifted_offset_s32.c:
	Likewise.
	* gcc.target/arm/mve/intrinsics/vstrhq_scatter_shifted_offset_u16.c:
	Likewise.
	* gcc.target/arm/mve/intrinsics/vstrhq_scatter_shifted_offset_u32.c:
	Likewise.
	* gcc.target/arm/mve/intrinsics/vstrhq_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrhq_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrwq_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrwq_p_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrwq_p_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrwq_p_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrwq_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrwq_u32.c: Likewise.
2020-03-18 19:08:29 +00:00
Srinath Parvathaneni
4cc23303ba [ARM][GCC][6/5x]: Remaining MVE load intrinsics which load a halfword, word or double word from memory.
This patch supports the following remaining MVE ACLE load intrinsics, which load a halfword,
word or double word from memory.

vldrdq_gather_base_s64, vldrdq_gather_base_u64, vldrdq_gather_base_z_s64,
vldrdq_gather_base_z_u64, vldrdq_gather_offset_s64, vldrdq_gather_offset_u64,
vldrdq_gather_offset_z_s64, vldrdq_gather_offset_z_u64, vldrdq_gather_shifted_offset_s64,
vldrdq_gather_shifted_offset_u64, vldrdq_gather_shifted_offset_z_s64,
vldrdq_gather_shifted_offset_z_u64, vldrhq_gather_offset_f16, vldrhq_gather_offset_z_f16,
vldrhq_gather_shifted_offset_f16, vldrhq_gather_shifted_offset_z_f16, vldrwq_gather_base_f32,
vldrwq_gather_base_z_f32, vldrwq_gather_offset_f32, vldrwq_gather_offset_s32,
vldrwq_gather_offset_u32, vldrwq_gather_offset_z_f32, vldrwq_gather_offset_z_s32,
vldrwq_gather_offset_z_u32, vldrwq_gather_shifted_offset_f32, vldrwq_gather_shifted_offset_s32,
vldrwq_gather_shifted_offset_u32, vldrwq_gather_shifted_offset_z_f32,
vldrwq_gather_shifted_offset_z_s32, vldrwq_gather_shifted_offset_z_u32.

Please refer to M-profile Vector Extension (MVE) intrinsics [1]  for more details.
[1]  https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
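
As a brief usage sketch (hypothetical helpers, not from the patch; built
with MVE FP enabled as for the other MVE tests):

#include <arm_mve.h>

int64x2_t
gather_dwords (int64_t const *base, uint64x2_t offsets)
{
  return vldrdq_gather_offset_s64 (base, offsets);
}

float32x4_t
gather_floats_z (float32_t const *base, uint32x4_t offsets, mve_pred16_t pred)
{
  /* _z: predicated-false lanes are loaded as zero.  */
  return vldrwq_gather_shifted_offset_z_f32 (base, offsets, pred);
}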

2020-03-18  Andre Vieira  <andre.simoesdiasvieira@arm.com>
            Mihail Ionescu  <mihail.ionescu@arm.com>
            Srinath Parvathaneni  <srinath.parvathaneni@arm.com>

	* config/arm/arm_mve.h (vld1q_s8): Define macro.
	(vld1q_s32): Likewise.
	(vld1q_s16): Likewise.
	(vld1q_u8): Likewise.
	(vld1q_u32): Likewise.
	(vld1q_u16): Likewise.
	(vldrhq_gather_offset_s32): Likewise.
	(vldrhq_gather_offset_s16): Likewise.
	(vldrhq_gather_offset_u32): Likewise.
	(vldrhq_gather_offset_u16): Likewise.
	(vldrhq_gather_offset_z_s32): Likewise.
	(vldrhq_gather_offset_z_s16): Likewise.
	(vldrhq_gather_offset_z_u32): Likewise.
	(vldrhq_gather_offset_z_u16): Likewise.
	(vldrhq_gather_shifted_offset_s32): Likewise.
	(vldrhq_gather_shifted_offset_s16): Likewise.
	(vldrhq_gather_shifted_offset_u32): Likewise.
	(vldrhq_gather_shifted_offset_u16): Likewise.
	(vldrhq_gather_shifted_offset_z_s32): Likewise.
	(vldrhq_gather_shifted_offset_z_s16): Likewise.
	(vldrhq_gather_shifted_offset_z_u32): Likewise.
	(vldrhq_gather_shifted_offset_z_u16): Likewise.
	(vldrhq_s32): Likewise.
	(vldrhq_s16): Likewise.
	(vldrhq_u32): Likewise.
	(vldrhq_u16): Likewise.
	(vldrhq_z_s32): Likewise.
	(vldrhq_z_s16): Likewise.
	(vldrhq_z_u32): Likewise.
	(vldrhq_z_u16): Likewise.
	(vldrwq_s32): Likewise.
	(vldrwq_u32): Likewise.
	(vldrwq_z_s32): Likewise.
	(vldrwq_z_u32): Likewise.
	(vld1q_f32): Likewise.
	(vld1q_f16): Likewise.
	(vldrhq_f16): Likewise.
	(vldrhq_z_f16): Likewise.
	(vldrwq_f32): Likewise.
	(vldrwq_z_f32): Likewise.
	(__arm_vld1q_s8): Define intrinsic.
	(__arm_vld1q_s32): Likewise.
	(__arm_vld1q_s16): Likewise.
	(__arm_vld1q_u8): Likewise.
	(__arm_vld1q_u32): Likewise.
	(__arm_vld1q_u16): Likewise.
	(__arm_vldrhq_gather_offset_s32): Likewise.
	(__arm_vldrhq_gather_offset_s16): Likewise.
	(__arm_vldrhq_gather_offset_u32): Likewise.
	(__arm_vldrhq_gather_offset_u16): Likewise.
	(__arm_vldrhq_gather_offset_z_s32): Likewise.
	(__arm_vldrhq_gather_offset_z_s16): Likewise.
	(__arm_vldrhq_gather_offset_z_u32): Likewise.
	(__arm_vldrhq_gather_offset_z_u16): Likewise.
	(__arm_vldrhq_gather_shifted_offset_s32): Likewise.
	(__arm_vldrhq_gather_shifted_offset_s16): Likewise.
	(__arm_vldrhq_gather_shifted_offset_u32): Likewise.
	(__arm_vldrhq_gather_shifted_offset_u16): Likewise.
	(__arm_vldrhq_gather_shifted_offset_z_s32): Likewise.
	(__arm_vldrhq_gather_shifted_offset_z_s16): Likewise.
	(__arm_vldrhq_gather_shifted_offset_z_u32): Likewise.
	(__arm_vldrhq_gather_shifted_offset_z_u16): Likewise.
	(__arm_vldrhq_s32): Likewise.
	(__arm_vldrhq_s16): Likewise.
	(__arm_vldrhq_u32): Likewise.
	(__arm_vldrhq_u16): Likewise.
	(__arm_vldrhq_z_s32): Likewise.
	(__arm_vldrhq_z_s16): Likewise.
	(__arm_vldrhq_z_u32): Likewise.
	(__arm_vldrhq_z_u16): Likewise.
	(__arm_vldrwq_s32): Likewise.
	(__arm_vldrwq_u32): Likewise.
	(__arm_vldrwq_z_s32): Likewise.
	(__arm_vldrwq_z_u32): Likewise.
	(__arm_vld1q_f32): Likewise.
	(__arm_vld1q_f16): Likewise.
	(__arm_vldrwq_f32): Likewise.
	(__arm_vldrwq_z_f32): Likewise.
	(__arm_vldrhq_z_f16): Likewise.
	(__arm_vldrhq_f16): Likewise.
	(vld1q): Define polymorphic variant.
	(vldrhq_gather_offset): Likewise.
	(vldrhq_gather_offset_z): Likewise.
	(vldrhq_gather_shifted_offset): Likewise.
	(vldrhq_gather_shifted_offset_z): Likewise.
	* config/arm/arm_mve_builtins.def (LDRU): Use builtin qualifier.
	(LDRS): Likewise.
	(LDRU_Z): Likewise.
	(LDRS_Z): Likewise.
	(LDRGU_Z): Likewise.
	(LDRGU): Likewise.
	(LDRGS_Z): Likewise.
	(LDRGS): Likewise.
	* config/arm/mve.md (MVE_H_ELEM): Define mode iterator.
	(V_sz_elem1): Likewise.
	(VLD1Q): Define iterator.
	(VLDRHGOQ): Likewise.
	(VLDRHGSOQ): Likewise.
	(VLDRHQ): Likewise.
	(VLDRWQ): Likewise.
	(mve_vldrhq_fv8hf): Define RTL pattern.
	(mve_vldrhq_gather_offset_<supf><mode>): Likewise.
	(mve_vldrhq_gather_offset_z_<supf><mode>): Likewise.
	(mve_vldrhq_gather_shifted_offset_<supf><mode>): Likewise.
	(mve_vldrhq_gather_shifted_offset_z_<supf><mode>): Likewise.
	(mve_vldrhq_<supf><mode>): Likewise.
	(mve_vldrhq_z_fv8hf): Likewise.
	(mve_vldrhq_z_<supf><mode>): Likewise.
	(mve_vldrwq_fv4sf): Likewise.
	(mve_vldrwq_<supf>v4si): Likewise.
	(mve_vldrwq_z_fv4sf): Likewise.
	(mve_vldrwq_z_<supf>v4si): Likewise.
	(mve_vld1q_f<mode>): Define RTL expand pattern.
	(mve_vld1q_<supf><mode>): Likewise.

gcc/testsuite/ChangeLog:

2020-03-18  Andre Vieira  <andre.simoesdiasvieira@arm.com>
            Mihail Ionescu  <mihail.ionescu@arm.com>
            Srinath Parvathaneni  <srinath.parvathaneni@arm.com>

	* gcc.target/arm/mve/intrinsics/vld1q_f16.c: New test.
	* gcc.target/arm/mve/intrinsics/vld1q_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vld1q_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vld1q_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vld1q_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vld1q_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vld1q_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vld1q_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_z_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_z_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_z_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_z_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_z_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_z_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_z_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_z_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_z_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_z_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_z_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_z_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_z_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrwq_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrwq_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrwq_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrwq_z_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrwq_z_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrwq_z_u32.c: Likewise.
2020-03-18 18:58:48 +00:00
Srinath Parvathaneni
bf1e3d5afa [ARM][GCC][5/5x]: MVE ACLE load intrinsics which load a byte, halfword, or word from memory.
This patch supports the following MVE ACLE load intrinsics which load a byte, halfword,
or word from memory.
vld1q_s8, vld1q_s32, vld1q_s16, vld1q_u8, vld1q_u32, vld1q_u16, vldrhq_gather_offset_s32,
vldrhq_gather_offset_s16, vldrhq_gather_offset_u32, vldrhq_gather_offset_u16,
vldrhq_gather_offset_z_s32, vldrhq_gather_offset_z_s16, vldrhq_gather_offset_z_u32,
vldrhq_gather_offset_z_u16, vldrhq_gather_shifted_offset_s32,vldrwq_f32, vldrwq_z_f32,
vldrhq_gather_shifted_offset_s16, vldrhq_gather_shifted_offset_u32,
vldrhq_gather_shifted_offset_u16, vldrhq_gather_shifted_offset_z_s32,
vldrhq_gather_shifted_offset_z_s16, vldrhq_gather_shifted_offset_z_u32,
vldrhq_gather_shifted_offset_z_u16, vldrhq_s32, vldrhq_s16, vldrhq_u32, vldrhq_u16,
vldrhq_z_s32, vldrhq_z_s16, vldrhq_z_u32, vldrhq_z_u16, vldrwq_s32, vldrwq_u32,
vldrwq_z_s32, vldrwq_z_u32, vld1q_f32, vld1q_f16, vldrhq_f16, vldrhq_z_f16.

Please refer to M-profile Vector Extension (MVE) intrinsics [1]  for more details.
[1]  https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
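
For instance (hypothetical helper functions, MVE-enabled compilation
assumed):

#include <arm_mve.h>

int8x16_t
load_bytes (int8_t const *p)
{
  return vld1q_s8 (p);             /* contiguous load of 16 lanes */
}

int16x8_t
gather_halfwords (int16_t const *base, uint16x8_t offsets)
{
  return vldrhq_gather_offset_s16 (base, offsets);
}

uint32x4_t
load_words_z (uint32_t const *p, mve_pred16_t pred)
{
  return vldrwq_z_u32 (p, pred);   /* predicated-false lanes read as zero */
}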

2020-03-18  Andre Vieira  <andre.simoesdiasvieira@arm.com>
            Mihail Ionescu  <mihail.ionescu@arm.com>
            Srinath Parvathaneni  <srinath.parvathaneni@arm.com>

	* config/arm/arm_mve.h (vld1q_s8): Define macro.
	(vld1q_s32): Likewise.
	(vld1q_s16): Likewise.
	(vld1q_u8): Likewise.
	(vld1q_u32): Likewise.
	(vld1q_u16): Likewise.
	(vldrhq_gather_offset_s32): Likewise.
	(vldrhq_gather_offset_s16): Likewise.
	(vldrhq_gather_offset_u32): Likewise.
	(vldrhq_gather_offset_u16): Likewise.
	(vldrhq_gather_offset_z_s32): Likewise.
	(vldrhq_gather_offset_z_s16): Likewise.
	(vldrhq_gather_offset_z_u32): Likewise.
	(vldrhq_gather_offset_z_u16): Likewise.
	(vldrhq_gather_shifted_offset_s32): Likewise.
	(vldrhq_gather_shifted_offset_s16): Likewise.
	(vldrhq_gather_shifted_offset_u32): Likewise.
	(vldrhq_gather_shifted_offset_u16): Likewise.
	(vldrhq_gather_shifted_offset_z_s32): Likewise.
	(vldrhq_gather_shifted_offset_z_s16): Likewise.
	(vldrhq_gather_shifted_offset_z_u32): Likewise.
	(vldrhq_gather_shifted_offset_z_u16): Likewise.
	(vldrhq_s32): Likewise.
	(vldrhq_s16): Likewise.
	(vldrhq_u32): Likewise.
	(vldrhq_u16): Likewise.
	(vldrhq_z_s32): Likewise.
	(vldrhq_z_s16): Likewise.
	(vldrhq_z_u32): Likewise.
	(vldrhq_z_u16): Likewise.
	(vldrwq_s32): Likewise.
	(vldrwq_u32): Likewise.
	(vldrwq_z_s32): Likewise.
	(vldrwq_z_u32): Likewise.
	(vld1q_f32): Likewise.
	(vld1q_f16): Likewise.
	(vldrhq_f16): Likewise.
	(vldrhq_z_f16): Likewise.
	(vldrwq_f32): Likewise.
	(vldrwq_z_f32): Likewise.
	(__arm_vld1q_s8): Define intrinsic.
	(__arm_vld1q_s32): Likewise.
	(__arm_vld1q_s16): Likewise.
	(__arm_vld1q_u8): Likewise.
	(__arm_vld1q_u32): Likewise.
	(__arm_vld1q_u16): Likewise.
	(__arm_vldrhq_gather_offset_s32): Likewise.
	(__arm_vldrhq_gather_offset_s16): Likewise.
	(__arm_vldrhq_gather_offset_u32): Likewise.
	(__arm_vldrhq_gather_offset_u16): Likewise.
	(__arm_vldrhq_gather_offset_z_s32): Likewise.
	(__arm_vldrhq_gather_offset_z_s16): Likewise.
	(__arm_vldrhq_gather_offset_z_u32): Likewise.
	(__arm_vldrhq_gather_offset_z_u16): Likewise.
	(__arm_vldrhq_gather_shifted_offset_s32): Likewise.
	(__arm_vldrhq_gather_shifted_offset_s16): Likewise.
	(__arm_vldrhq_gather_shifted_offset_u32): Likewise.
	(__arm_vldrhq_gather_shifted_offset_u16): Likewise.
	(__arm_vldrhq_gather_shifted_offset_z_s32): Likewise.
	(__arm_vldrhq_gather_shifted_offset_z_s16): Likewise.
	(__arm_vldrhq_gather_shifted_offset_z_u32): Likewise.
	(__arm_vldrhq_gather_shifted_offset_z_u16): Likewise.
	(__arm_vldrhq_s32): Likewise.
	(__arm_vldrhq_s16): Likewise.
	(__arm_vldrhq_u32): Likewise.
	(__arm_vldrhq_u16): Likewise.
	(__arm_vldrhq_z_s32): Likewise.
	(__arm_vldrhq_z_s16): Likewise.
	(__arm_vldrhq_z_u32): Likewise.
	(__arm_vldrhq_z_u16): Likewise.
	(__arm_vldrwq_s32): Likewise.
	(__arm_vldrwq_u32): Likewise.
	(__arm_vldrwq_z_s32): Likewise.
	(__arm_vldrwq_z_u32): Likewise.
	(__arm_vld1q_f32): Likewise.
	(__arm_vld1q_f16): Likewise.
	(__arm_vldrwq_f32): Likewise.
	(__arm_vldrwq_z_f32): Likewise.
	(__arm_vldrhq_z_f16): Likewise.
	(__arm_vldrhq_f16): Likewise.
	(vld1q): Define polymorphic variant.
	(vldrhq_gather_offset): Likewise.
	(vldrhq_gather_offset_z): Likewise.
	(vldrhq_gather_shifted_offset): Likewise.
	(vldrhq_gather_shifted_offset_z): Likewise.
	* config/arm/arm_mve_builtins.def (LDRU): Use builtin qualifier.
	(LDRS): Likewise.
	(LDRU_Z): Likewise.
	(LDRS_Z): Likewise.
	(LDRGU_Z): Likewise.
	(LDRGU): Likewise.
	(LDRGS_Z): Likewise.
	(LDRGS): Likewise.
	* config/arm/mve.md (MVE_H_ELEM): Define mode iterator.
	(V_sz_elem1): Likewise.
	(VLD1Q): Define iterator.
	(VLDRHGOQ): Likewise.
	(VLDRHGSOQ): Likewise.
	(VLDRHQ): Likewise.
	(VLDRWQ): Likewise.
	(mve_vldrhq_fv8hf): Define RTL pattern.
	(mve_vldrhq_gather_offset_<supf><mode>): Likewise.
	(mve_vldrhq_gather_offset_z_<supf><mode>): Likewise.
	(mve_vldrhq_gather_shifted_offset_<supf><mode>): Likewise.
	(mve_vldrhq_gather_shifted_offset_z_<supf><mode>): Likewise.
	(mve_vldrhq_<supf><mode>): Likewise.
	(mve_vldrhq_z_fv8hf): Likewise.
	(mve_vldrhq_z_<supf><mode>): Likewise.
	(mve_vldrwq_fv4sf): Likewise.
	(mve_vldrwq_<supf>v4si): Likewise.
	(mve_vldrwq_z_fv4sf): Likewise.
	(mve_vldrwq_z_<supf>v4si): Likewise.
	(mve_vld1q_f<mode>): Define RTL expand pattern.
	(mve_vld1q_<supf><mode>): Likewise.

gcc/testsuite/ChangeLog:

2020-03-18  Andre Vieira  <andre.simoesdiasvieira@arm.com>
            Mihail Ionescu  <mihail.ionescu@arm.com>
            Srinath Parvathaneni  <srinath.parvathaneni@arm.com>

	* gcc.target/arm/mve/intrinsics/vld1q_f16.c: New test.
	* gcc.target/arm/mve/intrinsics/vld1q_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vld1q_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vld1q_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vld1q_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vld1q_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vld1q_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vld1q_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_z_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_z_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_z_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_z_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_z_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_z_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_z_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_z_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_z_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_z_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_z_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_z_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrhq_z_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrwq_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrwq_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrwq_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrwq_z_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrwq_z_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrwq_z_u32.c: Likewise.
2020-03-18 18:48:05 +00:00
Srinath Parvathaneni
429d607bc4 [ARM][GCC][4/5x]: MVE load intrinsics with zero(_z) suffix.
This patch supports the following MVE ACLE load intrinsics with zero(_z) suffix.
* ``_z`` (zero), which indicates that false-predicated lanes are filled with zeroes; this suffix is only used for load instructions.

vldrbq_gather_offset_z_s16, vldrbq_gather_offset_z_u8, vldrbq_gather_offset_z_s32, vldrbq_gather_offset_z_u16, vldrbq_gather_offset_z_u32, vldrbq_gather_offset_z_s8, vldrbq_z_s16, vldrbq_z_u8, vldrbq_z_s8, vldrbq_z_s32, vldrbq_z_u16, vldrbq_z_u32, vldrwq_gather_base_z_u32, vldrwq_gather_base_z_s32.

Please refer to M-profile Vector Extension (MVE) intrinsics [1]  for more details.
[1]  https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
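
A short usage sketch (hypothetical helpers, not from the patch;
MVE-enabled compilation assumed):

#include <arm_mve.h>

uint8x16_t
load_bytes_z (uint8_t const *p, mve_pred16_t pred)
{
  return vldrbq_z_u8 (p, pred);    /* false-predicated lanes become zero */
}

int32x4_t
gather_words_z (uint32x4_t addresses, mve_pred16_t pred)
{
  /* The immediate offset is added to every lane's base address.  */
  return vldrwq_gather_base_z_s32 (addresses, 4, pred);
}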

2020-03-18  Andre Vieira  <andre.simoesdiasvieira@arm.com>
            Mihail Ionescu  <mihail.ionescu@arm.com>
            Srinath Parvathaneni  <srinath.parvathaneni@arm.com>

	* config/arm/arm-builtins.c (LDRGBS_Z_QUALIFIERS): Define builtin
	qualifier.
	(LDRGBU_Z_QUALIFIERS): Likewise.
	(LDRGS_Z_QUALIFIERS): Likewise.
	(LDRGU_Z_QUALIFIERS): Likewise.
	(LDRS_Z_QUALIFIERS): Likewise.
	(LDRU_Z_QUALIFIERS): Likewise.
	* config/arm/arm_mve.h (vldrbq_gather_offset_z_s16): Define macro.
	(vldrbq_gather_offset_z_u8): Likewise.
	(vldrbq_gather_offset_z_s32): Likewise.
	(vldrbq_gather_offset_z_u16): Likewise.
	(vldrbq_gather_offset_z_u32): Likewise.
	(vldrbq_gather_offset_z_s8): Likewise.
	(vldrbq_z_s16): Likewise.
	(vldrbq_z_u8): Likewise.
	(vldrbq_z_s8): Likewise.
	(vldrbq_z_s32): Likewise.
	(vldrbq_z_u16): Likewise.
	(vldrbq_z_u32): Likewise.
	(vldrwq_gather_base_z_u32): Likewise.
	(vldrwq_gather_base_z_s32): Likewise.
	(__arm_vldrbq_gather_offset_z_s8): Define intrinsic.
	(__arm_vldrbq_gather_offset_z_s32): Likewise.
	(__arm_vldrbq_gather_offset_z_s16): Likewise.
	(__arm_vldrbq_gather_offset_z_u8): Likewise.
	(__arm_vldrbq_gather_offset_z_u32): Likewise.
	(__arm_vldrbq_gather_offset_z_u16): Likewise.
	(__arm_vldrbq_z_s8): Likewise.
	(__arm_vldrbq_z_s32): Likewise.
	(__arm_vldrbq_z_s16): Likewise.
	(__arm_vldrbq_z_u8): Likewise.
	(__arm_vldrbq_z_u32): Likewise.
	(__arm_vldrbq_z_u16): Likewise.
	(__arm_vldrwq_gather_base_z_s32): Likewise.
	(__arm_vldrwq_gather_base_z_u32): Likewise.
	(vldrbq_gather_offset_z): Define polymorphic variant.
	* config/arm/arm_mve_builtins.def (LDRGBS_Z_QUALIFIERS): Use builtin
	qualifier.
	(LDRGBU_Z_QUALIFIERS): Likewise.
	(LDRGS_Z_QUALIFIERS): Likewise.
	(LDRGU_Z_QUALIFIERS): Likewise.
	(LDRS_Z_QUALIFIERS): Likewise.
	(LDRU_Z_QUALIFIERS): Likewise.
	* config/arm/mve.md (mve_vldrbq_gather_offset_z_<supf><mode>): Define
	RTL pattern.
	(mve_vldrbq_z_<supf><mode>): Likewise.
	(mve_vldrwq_gather_base_z_<supf>v4si): Likewise.

gcc/testsuite/ChangeLog:

2020-03-18  Andre Vieira  <andre.simoesdiasvieira@arm.com>
            Mihail Ionescu  <mihail.ionescu@arm.com>
            Srinath Parvathaneni  <srinath.parvathaneni@arm.com>

	* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_z_s16.c: New test.
	* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_z_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_z_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_z_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_z_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_z_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrbq_z_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrbq_z_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrbq_z_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrbq_z_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrbq_z_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrbq_z_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_z_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_z_u32.c: Likewise.
2020-03-18 18:35:17 +00:00
Srinath Parvathaneni
405e918c31 [ARM][GCC][3/5x]: MVE store intrinsics with predicated suffix.
This patch supports the following MVE ACLE store intrinsics with predicated suffix.

vstrbq_p_s8, vstrbq_p_s32, vstrbq_p_s16, vstrbq_p_u8, vstrbq_p_u32, vstrbq_p_u16, vstrbq_scatter_offset_p_s8, vstrbq_scatter_offset_p_s32, vstrbq_scatter_offset_p_s16, vstrbq_scatter_offset_p_u8, vstrbq_scatter_offset_p_u32, vstrbq_scatter_offset_p_u16, vstrwq_scatter_base_p_s32, vstrwq_scatter_base_p_u32.

Please refer to M-profile Vector Extension (MVE) intrinsics [1]  for more details.
[1]  https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
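
Usage looks roughly like this (hypothetical helper functions, not from the
patch; MVE-enabled compilation assumed):

#include <arm_mve.h>

void
store_bytes_p (int8_t *p, int8x16_t v, mve_pred16_t pred)
{
  vstrbq_p_s8 (p, v, pred);    /* only the active lanes are written */
}

void
scatter_words_p (uint32x4_t addresses, int32x4_t v, mve_pred16_t pred)
{
  vstrwq_scatter_base_p_s32 (addresses, 4, v, pred);
}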

2020-03-18  Andre Vieira  <andre.simoesdiasvieira@arm.com>
            Mihail Ionescu  <mihail.ionescu@arm.com>
            Srinath Parvathaneni  <srinath.parvathaneni@arm.com>

	* config/arm/arm-builtins.c (STRS_P_QUALIFIERS): Define builtin
	qualifier.
	(STRU_P_QUALIFIERS): Likewise.
	(STRSU_P_QUALIFIERS): Likewise.
	(STRSS_P_QUALIFIERS): Likewise.
	(STRSBS_P_QUALIFIERS): Likewise.
	(STRSBU_P_QUALIFIERS): Likewise.
	* config/arm/arm_mve.h (vstrbq_p_s8): Define macro.
	(vstrbq_p_s32): Likewise.
	(vstrbq_p_s16): Likewise.
	(vstrbq_p_u8): Likewise.
	(vstrbq_p_u32): Likewise.
	(vstrbq_p_u16): Likewise.
	(vstrbq_scatter_offset_p_s8): Likewise.
	(vstrbq_scatter_offset_p_s32): Likewise.
	(vstrbq_scatter_offset_p_s16): Likewise.
	(vstrbq_scatter_offset_p_u8): Likewise.
	(vstrbq_scatter_offset_p_u32): Likewise.
	(vstrbq_scatter_offset_p_u16): Likewise.
	(vstrwq_scatter_base_p_s32): Likewise.
	(vstrwq_scatter_base_p_u32): Likewise.
	(__arm_vstrbq_p_s8): Define intrinsic.
	(__arm_vstrbq_p_s32): Likewise.
	(__arm_vstrbq_p_s16): Likewise.
	(__arm_vstrbq_p_u8): Likewise.
	(__arm_vstrbq_p_u32): Likewise.
	(__arm_vstrbq_p_u16): Likewise.
	(__arm_vstrbq_scatter_offset_p_s8): Likewise.
	(__arm_vstrbq_scatter_offset_p_s32): Likewise.
	(__arm_vstrbq_scatter_offset_p_s16): Likewise.
	(__arm_vstrbq_scatter_offset_p_u8): Likewise.
	(__arm_vstrbq_scatter_offset_p_u32): Likewise.
	(__arm_vstrbq_scatter_offset_p_u16): Likewise.
	(__arm_vstrwq_scatter_base_p_s32): Likewise.
	(__arm_vstrwq_scatter_base_p_u32): Likewise.
	(vstrbq_p): Define polymorphic variant.
	(vstrbq_scatter_offset_p): Likewise.
	(vstrwq_scatter_base_p): Likewise.
	* config/arm/arm_mve_builtins.def (STRS_P_QUALIFIERS): Use builtin
	qualifier.
	(STRU_P_QUALIFIERS): Likewise.
	(STRSU_P_QUALIFIERS): Likewise.
	(STRSS_P_QUALIFIERS): Likewise.
	(STRSBS_P_QUALIFIERS): Likewise.
	(STRSBU_P_QUALIFIERS): Likewise.
	* config/arm/mve.md (mve_vstrbq_scatter_offset_p_<supf><mode>): Define
	RTL pattern.
	(mve_vstrwq_scatter_base_p_<supf>v4si): Likewise.
	(mve_vstrbq_p_<supf><mode>): Likewise.

gcc/testsuite/ChangeLog:

2020-03-18  Andre Vieira  <andre.simoesdiasvieira@arm.com>
            Mihail Ionescu  <mihail.ionescu@arm.com>
            Srinath Parvathaneni  <srinath.parvathaneni@arm.com>

	* gcc.target/arm/mve/intrinsics/vstrbq_p_s16.c: New test.
	* gcc.target/arm/mve/intrinsics/vstrbq_p_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrbq_p_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrbq_p_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrbq_p_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrbq_p_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_p_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_p_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_p_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_p_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_p_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_p_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_p_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_p_u32.c: Likewise.
2020-03-18 18:22:21 +00:00
Srinath Parvathaneni
535a8645bb [ARM][GCC][2/5x]: MVE load intrinsics.
This patch supports the following MVE ACLE load intrinsics.

vldrbq_gather_offset_u8, vldrbq_gather_offset_s8, vldrbq_s8, vldrbq_u8, vldrbq_gather_offset_u16, vldrbq_gather_offset_s16, vldrbq_s16, vldrbq_u16, vldrbq_gather_offset_u32, vldrbq_gather_offset_s32, vldrbq_s32, vldrbq_u32, vldrwq_gather_base_s32, vldrwq_gather_base_u32.

Please refer to M-profile Vector Extension (MVE) intrinsics [1]  for more details.
[1]  https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
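
As a brief usage sketch (hypothetical helpers, not from the patch;
MVE-enabled compilation assumed):

#include <arm_mve.h>

uint8x16_t
gather_bytes (uint8_t const *base, uint8x16_t offsets)
{
  return vldrbq_gather_offset_u8 (base, offsets);
}

int32x4_t
gather_words (uint32x4_t addresses)
{
  /* Loads one word per lane from addresses[i] plus the immediate.  */
  return vldrwq_gather_base_s32 (addresses, 4);
}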

2020-03-18  Andre Vieira  <andre.simoesdiasvieira@arm.com>
            Mihail Ionescu  <mihail.ionescu@arm.com>
            Srinath Parvathaneni  <srinath.parvathaneni@arm.com>

	* config/arm/arm-builtins.c (LDRGU_QUALIFIERS): Define builtin
	qualifier.
	(LDRGS_QUALIFIERS): Likewise.
	(LDRS_QUALIFIERS): Likewise.
	(LDRU_QUALIFIERS): Likewise.
	(LDRGBS_QUALIFIERS): Likewise.
	(LDRGBU_QUALIFIERS): Likewise.
	* config/arm/arm_mve.h (vldrbq_gather_offset_u8): Define macro.
	(vldrbq_gather_offset_s8): Likewise.
	(vldrbq_s8): Likewise.
	(vldrbq_u8): Likewise.
	(vldrbq_gather_offset_u16): Likewise.
	(vldrbq_gather_offset_s16): Likewise.
	(vldrbq_s16): Likewise.
	(vldrbq_u16): Likewise.
	(vldrbq_gather_offset_u32): Likewise.
	(vldrbq_gather_offset_s32): Likewise.
	(vldrbq_s32): Likewise.
	(vldrbq_u32): Likewise.
	(vldrwq_gather_base_s32): Likewise.
	(vldrwq_gather_base_u32): Likewise.
	(__arm_vldrbq_gather_offset_u8): Define intrinsic.
	(__arm_vldrbq_gather_offset_s8): Likewise.
	(__arm_vldrbq_s8): Likewise.
	(__arm_vldrbq_u8): Likewise.
	(__arm_vldrbq_gather_offset_u16): Likewise.
	(__arm_vldrbq_gather_offset_s16): Likewise.
	(__arm_vldrbq_s16): Likewise.
	(__arm_vldrbq_u16): Likewise.
	(__arm_vldrbq_gather_offset_u32): Likewise.
	(__arm_vldrbq_gather_offset_s32): Likewise.
	(__arm_vldrbq_s32): Likewise.
	(__arm_vldrbq_u32): Likewise.
	(__arm_vldrwq_gather_base_s32): Likewise.
	(__arm_vldrwq_gather_base_u32): Likewise.
	(vldrbq_gather_offset): Define polymorphic variant.
	* config/arm/arm_mve_builtins.def (LDRGU_QUALIFIERS): Use builtin
	qualifier.
	(LDRGS_QUALIFIERS): Likewise.
	(LDRS_QUALIFIERS): Likewise.
	(LDRU_QUALIFIERS): Likewise.
	(LDRGBS_QUALIFIERS): Likewise.
	(LDRGBU_QUALIFIERS): Likewise.
	* config/arm/mve.md (VLDRBGOQ): Define iterator.
	(VLDRBQ): Likewise.
	(VLDRWGBQ): Likewise.
	(mve_vldrbq_gather_offset_<supf><mode>): Define RTL pattern.
	(mve_vldrbq_<supf><mode>): Likewise.
	(mve_vldrwq_gather_base_<supf>v4si): Likewise.

gcc/testsuite/ChangeLog:

2020-03-18  Andre Vieira  <andre.simoesdiasvieira@arm.com>
            Mihail Ionescu  <mihail.ionescu@arm.com>
            Srinath Parvathaneni  <srinath.parvathaneni@arm.com>

	* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_s16.c: New test.
	* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrbq_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrbq_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrbq_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrbq_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrbq_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrbq_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_u32.c: Likewise.
2020-03-18 18:13:53 +00:00
Srinath Parvathaneni
4ff6857599 [ARM][GCC][1/5x]: MVE store intrinsics.
This patch supports the following MVE ACLE store intrinsics.

vstrbq_scatter_offset_s8, vstrbq_scatter_offset_s32, vstrbq_scatter_offset_s16, vstrbq_scatter_offset_u8, vstrbq_scatter_offset_u32, vstrbq_scatter_offset_u16, vstrbq_s8, vstrbq_s32, vstrbq_s16, vstrbq_u8, vstrbq_u32, vstrbq_u16, vstrwq_scatter_base_s32, vstrwq_scatter_base_u32.

Please refer to M-profile Vector Extension (MVE) intrinsics [1]  for more details.
[1]  https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
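
A short usage sketch (hypothetical helper functions, not from the patch;
MVE-enabled compilation assumed):

#include <arm_mve.h>

void
store_bytes (int8_t *p, int8x16_t v)
{
  vstrbq_s8 (p, v);    /* contiguous byte store */
}

void
scatter_bytes (uint8_t *base, uint8x16_t offsets, uint8x16_t v)
{
  vstrbq_scatter_offset_u8 (base, offsets, v);
}

void
scatter_words (uint32x4_t addresses, uint32x4_t v)
{
  vstrwq_scatter_base_u32 (addresses, 4, v);
}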

2020-03-18  Andre Vieira  <andre.simoesdiasvieira@arm.com>
            Mihail Ionescu  <mihail.ionescu@arm.com>
            Srinath Parvathaneni  <srinath.parvathaneni@arm.com>

	* config/arm/arm-builtins.c (STRS_QUALIFIERS): Define builtin qualifier.
	(STRU_QUALIFIERS): Likewise.
	(STRSS_QUALIFIERS): Likewise.
	(STRSU_QUALIFIERS): Likewise.
	(STRSBS_QUALIFIERS): Likewise.
	(STRSBU_QUALIFIERS): Likewise.
	* config/arm/arm_mve.h (vstrbq_s8): Define macro.
	(vstrbq_u8): Likewise.
	(vstrbq_u16): Likewise.
	(vstrbq_scatter_offset_s8): Likewise.
	(vstrbq_scatter_offset_u8): Likewise.
	(vstrbq_scatter_offset_u16): Likewise.
	(vstrbq_s16): Likewise.
	(vstrbq_u32): Likewise.
	(vstrbq_scatter_offset_s16): Likewise.
	(vstrbq_scatter_offset_u32): Likewise.
	(vstrbq_s32): Likewise.
	(vstrbq_scatter_offset_s32): Likewise.
	(vstrwq_scatter_base_s32): Likewise.
	(vstrwq_scatter_base_u32): Likewise.
	(__arm_vstrbq_scatter_offset_s8): Define intrinsic.
	(__arm_vstrbq_scatter_offset_s32): Likewise.
	(__arm_vstrbq_scatter_offset_s16): Likewise.
	(__arm_vstrbq_scatter_offset_u8): Likewise.
	(__arm_vstrbq_scatter_offset_u32): Likewise.
	(__arm_vstrbq_scatter_offset_u16): Likewise.
	(__arm_vstrbq_s8): Likewise.
	(__arm_vstrbq_s32): Likewise.
	(__arm_vstrbq_s16): Likewise.
	(__arm_vstrbq_u8): Likewise.
	(__arm_vstrbq_u32): Likewise.
	(__arm_vstrbq_u16): Likewise.
	(__arm_vstrwq_scatter_base_s32): Likewise.
	(__arm_vstrwq_scatter_base_u32): Likewise.
	(vstrbq): Define polymorphic variant.
	(vstrbq_scatter_offset): Likewise.
	(vstrwq_scatter_base): Likewise.
	* config/arm/arm_mve_builtins.def (STRS_QUALIFIERS): Use builtin
	qualifier.
	(STRU_QUALIFIERS): Likewise.
	(STRSS_QUALIFIERS): Likewise.
	(STRSU_QUALIFIERS): Likewise.
	(STRSBS_QUALIFIERS): Likewise.
	(STRSBU_QUALIFIERS): Likewise.
	* config/arm/mve.md (MVE_B_ELEM): Define mode attribute iterator.
	(VSTRWSBQ): Define iterators.
	(VSTRBSOQ): Likewise.
	(VSTRBQ): Likewise.
	(mve_vstrbq_<supf><mode>): Define RTL pattern.
	(mve_vstrbq_scatter_offset_<supf><mode>): Likewise.
	(mve_vstrwq_scatter_base_<supf>v4si): Likewise.

gcc/testsuite/ChangeLog:

2020-03-18  Andre Vieira  <andre.simoesdiasvieira@arm.com>
            Mihail Ionescu  <mihail.ionescu@arm.com>
            Srinath Parvathaneni  <srinath.parvathaneni@arm.com>

	* gcc.target/arm/mve/intrinsics/vstrbq_s16.c: New test.
	* gcc.target/arm/mve/intrinsics/vstrbq_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrbq_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_s16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_s8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrbq_u16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrbq_u32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrbq_u8.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_s32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_u32.c: Likewise.
2020-03-18 17:54:30 +00:00
Srinath Parvathaneni
532e9e2402 [ARM][GCC][4/4x]: MVE intrinsics with quaternary operands.
This patch supports the following MVE ACLE intrinsics with quaternary operands.

vabdq_m_f32, vabdq_m_f16, vaddq_m_f32, vaddq_m_f16, vaddq_m_n_f32, vaddq_m_n_f16, vandq_m_f32, vandq_m_f16, vbicq_m_f32, vbicq_m_f16, vbrsrq_m_n_f32, vbrsrq_m_n_f16, vcaddq_rot270_m_f32, vcaddq_rot270_m_f16, vcaddq_rot90_m_f32, vcaddq_rot90_m_f16, vcmlaq_m_f32, vcmlaq_m_f16, vcmlaq_rot180_m_f32, vcmlaq_rot180_m_f16, vcmlaq_rot270_m_f32, vcmlaq_rot270_m_f16, vcmlaq_rot90_m_f32, vcmlaq_rot90_m_f16, vcmulq_m_f32, vcmulq_m_f16, vcmulq_rot180_m_f32, vcmulq_rot180_m_f16, vcmulq_rot270_m_f32, vcmulq_rot270_m_f16, vcmulq_rot90_m_f32, vcmulq_rot90_m_f16, vcvtq_m_n_s32_f32, vcvtq_m_n_s16_f16, vcvtq_m_n_u32_f32, vcvtq_m_n_u16_f16, veorq_m_f32, veorq_m_f16, vfmaq_m_f32, vfmaq_m_f16, vfmaq_m_n_f32, vfmaq_m_n_f16, vfmasq_m_n_f32, vfmasq_m_n_f16, vfmsq_m_f32, vfmsq_m_f16, vmaxnmq_m_f32, vmaxnmq_m_f16, vminnmq_m_f32, vminnmq_m_f16, vmulq_m_f32, vmulq_m_f16, vmulq_m_n_f32, vmulq_m_n_f16, vornq_m_f32, vornq_m_f16, vorrq_m_f32, vorrq_m_f16, vsubq_m_f32, vsubq_m_f16, vsubq_m_n_f32, vsubq_m_n_f16.

Please refer to M-profile Vector Extension (MVE) intrinsics [1]  for more details.
[1]  https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
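
A rough usage sketch of two of these (hypothetical helper functions, not
from the patch; compile with e.g. -O2 -march=armv8.1-m.main+mve.fp
-mfloat-abi=hard):

#include <arm_mve.h>

float32x4_t
masked_add (float32x4_t inactive, float32x4_t a, float32x4_t b,
            mve_pred16_t pred)
{
  /* Active lanes get a + b; inactive lanes are copied from 'inactive'.  */
  return vaddq_m_f32 (inactive, a, b, pred);
}

float32x4_t
masked_fma (float32x4_t acc, float32x4_t m1, float32x4_t m2,
            mve_pred16_t pred)
{
  /* Active lanes get acc + m1 * m2; inactive lanes keep acc.  */
  return vfmaq_m_f32 (acc, m1, m2, pred);
}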

2020-03-18  Andre Vieira  <andre.simoesdiasvieira@arm.com>
            Mihail Ionescu  <mihail.ionescu@arm.com>
            Srinath Parvathaneni  <srinath.parvathaneni@arm.com>

	* config/arm/arm_mve.h (vabdq_m_f32): Define macro.
	(vabdq_m_f16): Likewise.
	(vaddq_m_f32): Likewise.
	(vaddq_m_f16): Likewise.
	(vaddq_m_n_f32): Likewise.
	(vaddq_m_n_f16): Likewise.
	(vandq_m_f32): Likewise.
	(vandq_m_f16): Likewise.
	(vbicq_m_f32): Likewise.
	(vbicq_m_f16): Likewise.
	(vbrsrq_m_n_f32): Likewise.
	(vbrsrq_m_n_f16): Likewise.
	(vcaddq_rot270_m_f32): Likewise.
	(vcaddq_rot270_m_f16): Likewise.
	(vcaddq_rot90_m_f32): Likewise.
	(vcaddq_rot90_m_f16): Likewise.
	(vcmlaq_m_f32): Likewise.
	(vcmlaq_m_f16): Likewise.
	(vcmlaq_rot180_m_f32): Likewise.
	(vcmlaq_rot180_m_f16): Likewise.
	(vcmlaq_rot270_m_f32): Likewise.
	(vcmlaq_rot270_m_f16): Likewise.
	(vcmlaq_rot90_m_f32): Likewise.
	(vcmlaq_rot90_m_f16): Likewise.
	(vcmulq_m_f32): Likewise.
	(vcmulq_m_f16): Likewise.
	(vcmulq_rot180_m_f32): Likewise.
	(vcmulq_rot180_m_f16): Likewise.
	(vcmulq_rot270_m_f32): Likewise.
	(vcmulq_rot270_m_f16): Likewise.
	(vcmulq_rot90_m_f32): Likewise.
	(vcmulq_rot90_m_f16): Likewise.
	(vcvtq_m_n_s32_f32): Likewise.
	(vcvtq_m_n_s16_f16): Likewise.
	(vcvtq_m_n_u32_f32): Likewise.
	(vcvtq_m_n_u16_f16): Likewise.
	(veorq_m_f32): Likewise.
	(veorq_m_f16): Likewise.
	(vfmaq_m_f32): Likewise.
	(vfmaq_m_f16): Likewise.
	(vfmaq_m_n_f32): Likewise.
	(vfmaq_m_n_f16): Likewise.
	(vfmasq_m_n_f32): Likewise.
	(vfmasq_m_n_f16): Likewise.
	(vfmsq_m_f32): Likewise.
	(vfmsq_m_f16): Likewise.
	(vmaxnmq_m_f32): Likewise.
	(vmaxnmq_m_f16): Likewise.
	(vminnmq_m_f32): Likewise.
	(vminnmq_m_f16): Likewise.
	(vmulq_m_f32): Likewise.
	(vmulq_m_f16): Likewise.
	(vmulq_m_n_f32): Likewise.
	(vmulq_m_n_f16): Likewise.
	(vornq_m_f32): Likewise.
	(vornq_m_f16): Likewise.
	(vorrq_m_f32): Likewise.
	(vorrq_m_f16): Likewise.
	(vsubq_m_f32): Likewise.
	(vsubq_m_f16): Likewise.
	(vsubq_m_n_f32): Likewise.
	(vsubq_m_n_f16): Likewise.
	(__attribute__): Likewise.
	(__arm_vabdq_m_f32): Define intrinsic.
	(__arm_vabdq_m_f16): Likewise.
	(__arm_vaddq_m_f32): Likewise.
	(__arm_vaddq_m_f16): Likewise.
	(__arm_vaddq_m_n_f32): Likewise.
	(__arm_vaddq_m_n_f16): Likewise.
	(__arm_vandq_m_f32): Likewise.
	(__arm_vandq_m_f16): Likewise.
	(__arm_vbicq_m_f32): Likewise.
	(__arm_vbicq_m_f16): Likewise.
	(__arm_vbrsrq_m_n_f32): Likewise.
	(__arm_vbrsrq_m_n_f16): Likewise.
	(__arm_vcaddq_rot270_m_f32): Likewise.
	(__arm_vcaddq_rot270_m_f16): Likewise.
	(__arm_vcaddq_rot90_m_f32): Likewise.
	(__arm_vcaddq_rot90_m_f16): Likewise.
	(__arm_vcmlaq_m_f32): Likewise.
	(__arm_vcmlaq_m_f16): Likewise.
	(__arm_vcmlaq_rot180_m_f32): Likewise.
	(__arm_vcmlaq_rot180_m_f16): Likewise.
	(__arm_vcmlaq_rot270_m_f32): Likewise.
	(__arm_vcmlaq_rot270_m_f16): Likewise.
	(__arm_vcmlaq_rot90_m_f32): Likewise.
	(__arm_vcmlaq_rot90_m_f16): Likewise.
	(__arm_vcmulq_m_f32): Likewise.
	(__arm_vcmulq_m_f16): Likewise.
	(__arm_vcmulq_rot180_m_f32): Likewise.
	(__arm_vcmulq_rot180_m_f16): Likewise.
	(__arm_vcmulq_rot270_m_f32): Likewise.
	(__arm_vcmulq_rot270_m_f16): Likewise.
	(__arm_vcmulq_rot90_m_f32): Likewise.
	(__arm_vcmulq_rot90_m_f16): Likewise.
	(__arm_vcvtq_m_n_s32_f32): Likewise.
	(__arm_vcvtq_m_n_s16_f16): Likewise.
	(__arm_vcvtq_m_n_u32_f32): Likewise.
	(__arm_vcvtq_m_n_u16_f16): Likewise.
	(__arm_veorq_m_f32): Likewise.
	(__arm_veorq_m_f16): Likewise.
	(__arm_vfmaq_m_f32): Likewise.
	(__arm_vfmaq_m_f16): Likewise.
	(__arm_vfmaq_m_n_f32): Likewise.
	(__arm_vfmaq_m_n_f16): Likewise.
	(__arm_vfmasq_m_n_f32): Likewise.
	(__arm_vfmasq_m_n_f16): Likewise.
	(__arm_vfmsq_m_f32): Likewise.
	(__arm_vfmsq_m_f16): Likewise.
	(__arm_vmaxnmq_m_f32): Likewise.
	(__arm_vmaxnmq_m_f16): Likewise.
	(__arm_vminnmq_m_f32): Likewise.
	(__arm_vminnmq_m_f16): Likewise.
	(__arm_vmulq_m_f32): Likewise.
	(__arm_vmulq_m_f16): Likewise.
	(__arm_vmulq_m_n_f32): Likewise.
	(__arm_vmulq_m_n_f16): Likewise.
	(__arm_vornq_m_f32): Likewise.
	(__arm_vornq_m_f16): Likewise.
	(__arm_vorrq_m_f32): Likewise.
	(__arm_vorrq_m_f16): Likewise.
	(__arm_vsubq_m_f32): Likewise.
	(__arm_vsubq_m_f16): Likewise.
	(__arm_vsubq_m_n_f32): Likewise.
	(__arm_vsubq_m_n_f16): Likewise.
	(vabdq_m): Define polymorphic variant.
	(vaddq_m): Likewise.
	(vaddq_m_n): Likewise.
	(vandq_m): Likewise.
	(vbicq_m): Likewise.
	(vbrsrq_m_n): Likewise.
	(vcaddq_rot270_m): Likewise.
	(vcaddq_rot90_m): Likewise.
	(vcmlaq_m): Likewise.
	(vcmlaq_rot180_m): Likewise.
	(vcmlaq_rot270_m): Likewise.
	(vcmlaq_rot90_m): Likewise.
	(vcmulq_m): Likewise.
	(vcmulq_rot180_m): Likewise.
	(vcmulq_rot270_m): Likewise.
	(vcmulq_rot90_m): Likewise.
	(veorq_m): Likewise.
	(vfmaq_m): Likewise.
	(vfmaq_m_n): Likewise.
	(vfmasq_m_n): Likewise.
	(vfmsq_m): Likewise.
	(vmaxnmq_m): Likewise.
	(vminnmq_m): Likewise.
	(vmulq_m): Likewise.
	(vmulq_m_n): Likewise.
	(vornq_m): Likewise.
	(vsubq_m): Likewise.
	(vsubq_m_n): Likewise.
	(vorrq_m): Likewise.
	* config/arm/arm_mve_builtins.def (QUADOP_NONE_NONE_NONE_IMM_UNONE): Use
	builtin qualifier.
	(QUADOP_NONE_NONE_NONE_NONE_UNONE): Likewise.
	(QUADOP_UNONE_UNONE_NONE_IMM_UNONE): Likewise.
	* config/arm/mve.md (mve_vabdq_m_f<mode>): Define RTL pattern.
	(mve_vaddq_m_f<mode>): Likewise.
	(mve_vaddq_m_n_f<mode>): Likewise.
	(mve_vandq_m_f<mode>): Likewise.
	(mve_vbicq_m_f<mode>): Likewise.
	(mve_vbrsrq_m_n_f<mode>): Likewise.
	(mve_vcaddq_rot270_m_f<mode>): Likewise.
	(mve_vcaddq_rot90_m_f<mode>): Likewise.
	(mve_vcmlaq_m_f<mode>): Likewise.
	(mve_vcmlaq_rot180_m_f<mode>): Likewise.
	(mve_vcmlaq_rot270_m_f<mode>): Likewise.
	(mve_vcmlaq_rot90_m_f<mode>): Likewise.
	(mve_vcmulq_m_f<mode>): Likewise.
	(mve_vcmulq_rot180_m_f<mode>): Likewise.
	(mve_vcmulq_rot270_m_f<mode>): Likewise.
	(mve_vcmulq_rot90_m_f<mode>): Likewise.
	(mve_veorq_m_f<mode>): Likewise.
	(mve_vfmaq_m_f<mode>): Likewise.
	(mve_vfmaq_m_n_f<mode>): Likewise.
	(mve_vfmasq_m_n_f<mode>): Likewise.
	(mve_vfmsq_m_f<mode>): Likewise.
	(mve_vmaxnmq_m_f<mode>): Likewise.
	(mve_vminnmq_m_f<mode>): Likewise.
	(mve_vmulq_m_f<mode>): Likewise.
	(mve_vmulq_m_n_f<mode>): Likewise.
	(mve_vornq_m_f<mode>): Likewise.
	(mve_vorrq_m_f<mode>): Likewise.
	(mve_vsubq_m_f<mode>): Likewise.
	(mve_vsubq_m_n_f<mode>): Likewise.

gcc/testsuite/ChangeLog:

2020-03-18  Andre Vieira  <andre.simoesdiasvieira@arm.com>
            Mihail Ionescu  <mihail.ionescu@arm.com>
            Srinath Parvathaneni  <srinath.parvathaneni@arm.com>

	* gcc.target/arm/mve/intrinsics/vabdq_m_f16.c: New test.
	* gcc.target/arm/mve/intrinsics/vabdq_m_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vaddq_m_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vaddq_m_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vaddq_m_n_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vaddq_m_n_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vandq_m_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vandq_m_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vbicq_m_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vbicq_m_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vbrsrq_m_n_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vbrsrq_m_n_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcaddq_rot270_m_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcaddq_rot270_m_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcaddq_rot90_m_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcaddq_rot90_m_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcmlaq_m_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcmlaq_m_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcmlaq_rot180_m_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcmlaq_rot180_m_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcmlaq_rot270_m_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcmlaq_rot270_m_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcmlaq_rot90_m_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcmlaq_rot90_m_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcmulq_m_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcmulq_m_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcmulq_rot180_m_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcmulq_rot180_m_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcmulq_rot270_m_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcmulq_rot270_m_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcmulq_rot90_m_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcmulq_rot90_m_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtq_m_n_s16_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtq_m_n_s32_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtq_m_n_u16_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vcvtq_m_n_u32_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/veorq_m_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/veorq_m_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vfmaq_m_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vfmaq_m_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vfmaq_m_n_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vfmaq_m_n_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vfmasq_m_n_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vfmasq_m_n_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vfmsq_m_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vfmsq_m_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmaxnmq_m_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmaxnmq_m_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vminnmq_m_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vminnmq_m_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmulq_m_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmulq_m_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmulq_m_n_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vmulq_m_n_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vornq_m_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vornq_m_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vorrq_m_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vorrq_m_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vsubq_m_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vsubq_m_f32.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vsubq_m_n_f16.c: Likewise.
	* gcc.target/arm/mve/intrinsics/vsubq_m_n_f32.c: Likewise.
2020-03-18 17:16:21 +00:00