Training  /  11 June 2024  -  12 June 2024

EU AI Act and High-Risk Systems

The Artificial Intelligence Act (AI Act) is a proposal by the European Commission to establish a common regulatory and legal framework for artificial intelligence. The AI Act uses a risk-based approach and sets out a series of escalating legal and technical obligations depending on whether the AI product or service is considered low, medium or high risk, while some applications of AI will be banned outright.

An assessment of the design, development, and use of AI systems can mitigate the risks of AI failures and, by avoiding liability issues, can prevent reputational and financial damage.

Our AI Act Training offers systematic support for the compliance of AI systems and addresses the following topics:

  • Key principles, objectives, and definitions from the EU AI Act, including implications and approach to compliance.
  • Methods and tools for addressing requirements.
  • A verification methodology for industry-specific use cases, which supports:
    • Ensuring high-quality training, validation, and testing data, and minimizing bias in data sets.
    • Providing complete technical documentation in an automated manner.
    • Ensuring post-market monitoring and verification of safety and performance properties during the whole lifecycle.
    • Selecting methods for the design and evaluation of ML models.


Target audience: 

This training is for anyone who wants to bridge the gap between the high-level regulations of the AI Act and the detailed, application-specific requirements for AI systems.

If there are 3 or more participants from one company, you can also bring along a company-specific use case, which the IKS experts will examine together with you.