15/07/2025
rexx systems news

EU AI Act: These are the new regulations from August 2025

Coffee, AI, colleagues: AI is now part of the standard toolkit at many companies. Intelligent systems save HR departments in particular a lot of time, especially in lengthy pre-selection processes. But as with any new technology: first comes the rollout, then come the legal regulations. We explain what the EU AI Act requires from August 2, 2025.

A year has already passed: on August 2, 2024, the eagerly awaited EU AI Act came into force. We have already shed light on what the EU AI Act means for HR departments. Now it is getting serious step by step: the next deadline to mark in red in the calendar is August 2, 2025. From then on, obligations will apply to general-purpose AI models.

The most important takeaways for HR departments

  • The rules that come into force on August 2, 2025 as part of the EU AI Act will become important over the coming year for, among others, HR departments that use AI.
  • Artificial Intelligence in human resources is classified as high-risk AI.
  • Companies that use such high-risk AI must have a risk assessment carried out by newly created notified bodies and meet further requirements.
  • In addition to documentation and training, a transparency obligation for the use of AI in recruiting will also become important for companies in the future.

EU AI Act from August 2, 2025: These rules are new


AI systems that can manipulate users, perform social scoring or allow biometric categorization or emotion recognition have been banned since February 2, 2025.

On August 2, 2025, rules will come into force that will be important for you in the future if you use AI in your HR department. According to a recent study by ManpowerGroup, 44% of German companies are already doing so.

Provider obligations for general purpose AI models

Providers of General Purpose Artificial Intelligence (GPAI) must fulfill numerous requirements. GPAI includes, for example, large language models such as ChatGPT or image AIs such as Midjourney. The new obligations include:

  • keeping the technical documentation of the model permanently up to date and providing it to providers of downstream AI systems
  • developing a copyright compliance strategy
  • summarizing and publishing the content used for training

So far, so complicated. The good news for you as a user of these AI systems: for now, you have no special obligations under the new law. However, it is worth keeping an eye on the extent to which providers comply with their obligations.

Governance Rules

The EU and the individual member states are obliged to create authorities and contact points. The EU must set up an AI Office, give the AI Board a structure and define its tasks. An advisory forum must be established, a scientific panel of independent experts must be set up, and the member states must be given access to this pool of experts.

The individual EU member states, including Germany, are obliged to set up competent national authorities and central contact points. Companies can turn to these to have the conformity assessment of their high-risk AI systems carried out.

How the changes coming with the EU AI Act from August 2025 will affect your company

Providers and governments, all well and good – but how does the EU AI Act affect your company? Systems that make personnel-related decisions are considered high-risk AI. Companies that use them will therefore have to take measures to assess risks and carefully check and document their systems.

This means that companies themselves are responsible for contacting one of the advisory bodies opening on August 2, 2025 in order to clarify as precisely as possible which obligations apply to their use of AI.

From August 2, 2026 – another year later – you will have to have a conformity assessment carried out by notified bodies. High-risk systems will be registered. You will then be responsible for complying with the regulations and documenting their use – there will also be a reporting obligation. You are also obliged to train your employees carefully so that they can handle the intelligent systems professionally and responsibly at all times.

The labeling requirement for AI content and for working with AI is coming

Article 50 of the EU AI Act stipulates that companies and authorities working with AI must make this clear. This can relate to very different content.

  • You have produced a text with AI.
  • You have generated an image with AI.
  • You let a chatbot answer questions.
  • You use AI in the pre-selection of applications.

In all these cases, the labeling obligation applies. Particularly in the application process, it shapes the relationship of trust between applicants and the company, as Marian Härtel, a specialist lawyer for IT law, knows: “Transparency about the use of AI in application processes is required by law and is important for building trust.”
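To make the idea concrete, here is a minimal sketch of how AI-generated content could carry a visible disclosure. The notice text and the helper name are illustrative assumptions; the EU AI Act requires transparency but does not prescribe a specific wording.

```python
# Illustrative AI disclosure for generated content (wording is an assumption, not prescribed by law).
AI_NOTICE = "This content was created with the support of artificial intelligence."

def label_ai_content(text: str) -> str:
    """Append a visible AI disclosure to AI-generated text."""
    return f"{text}\n\n{AI_NOTICE}"

print(label_ai_content("Thank you for your application. We will be in touch shortly."))
```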

Please note: you may not use data from application processes to train your recruiting AI unless you obtain explicit permission to do so or completely anonymize the data in advance!
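Here is a minimal sketch of the second option, assuming a simple applicant record held as a Python dictionary: direct identifiers are stripped before the data is even considered for training. The field names are illustrative, and removing direct identifiers alone does not guarantee full anonymization if quasi-identifiers remain.

```python
# Illustrative removal of direct identifiers from an applicant record (not legal advice).
from typing import Any

# Fields that directly identify a person and must not end up in training data.
DIRECT_IDENTIFIERS = {"name", "email", "phone", "address", "date_of_birth", "photo_url"}

def strip_identifiers(record: dict[str, Any]) -> dict[str, Any]:
    """Return a copy of the applicant record without direct identifiers."""
    return {key: value for key, value in record.items() if key not in DIRECT_IDENTIFIERS}

applicant = {
    "name": "Jane Doe",
    "email": "jane@example.com",
    "years_of_experience": 7,
    "skills": ["payroll", "HR analytics"],
}

print(strip_identifiers(applicant))
# {'years_of_experience': 7, 'skills': ['payroll', 'HR analytics']}
```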

Planned penalties if the provisions of the EU AI Act are not complied with

EU member states are to impose penalties for infringements that are effective, proportionate and dissuasive. Fines of up to 35 million euros or up to seven percent of the previous year’s global turnover are possible. For start-ups and SMEs, the lower of the two amounts applies; for large companies, the higher one. These penalties can be imposed from August 2, 2027.
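As a worked example of the arithmetic described above, here is a minimal sketch; the turnover figures are invented, and the thresholds are taken from the text (35 million euros or seven percent of the previous year’s global turnover, with the lower cap for start-ups and SMEs and the higher cap for large companies).

```python
# Illustrative fine-cap arithmetic based on the thresholds named in the text (not legal advice).
FIXED_CAP_EUR = 35_000_000
TURNOVER_SHARE = 0.07  # seven percent of the previous year's global turnover

def max_fine(global_turnover_eur: float, is_sme: bool) -> float:
    """Return the maximum possible fine: the lower cap for SMEs, the higher cap otherwise."""
    turnover_cap = TURNOVER_SHARE * global_turnover_eur
    return min(FIXED_CAP_EUR, turnover_cap) if is_sme else max(FIXED_CAP_EUR, turnover_cap)

print(max_fine(10_000_000, is_sme=True))       # 700000.0 – 7% of 10 million is below 35 million
print(max_fine(2_000_000_000, is_sme=False))   # 140000000.0 – 7% of 2 billion exceeds 35 million
```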

Conclusion: Not just a distant prospect

There’s still plenty of time until August 2, 2026, right? Not really. One year is not much time for the new authorities to be set up and for your company to make the necessary adjustments so that you are on the safe side legally by the deadline.

It’s better to prepare now:

  • Label the AI content and inform applicants transparently about which steps of the application process are supported by artificial intelligence.
  • Conduct training courses for your employees to familiarize them with the AI systems used.
  • Document the use of AI for subsequent reports and for the risk assessment by the new authorities – a simple structured record, as sketched below, is a good start.
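A minimal sketch of what such a documentation record could look like, assuming one simple JSON entry per AI system; the field names and values are purely illustrative and are not a format prescribed by the EU AI Act or by any authority.

```python
# Illustrative internal documentation entry for AI use in HR (not an official format).
import json
from datetime import date

ai_usage_entry = {
    "system": "recruiting pre-selection assistant",   # hypothetical system name
    "purpose": "ranking incoming applications",
    "risk_class": "high-risk (personnel-related decisions)",
    "human_oversight": "final decision is always taken by a recruiter",
    "applicants_informed": True,                      # labeling / transparency obligation
    "staff_training_completed": True,                 # employee training obligation
    "recorded_on": date.today().isoformat(),
}

print(json.dumps(ai_usage_entry, indent=2))
```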

The more experienced your company is in dealing with the AI system and documenting the work, the easier it will be for you to make the adjustments that are likely to be necessary after the risk assessment.

The right software will make it easy for you to master the tasks ahead. The rexx Suite, for example, makes application management easier for you, is GDPR-compliant and is continuously adapted to all new legal regulations. This means you are on the safe side when using AI in your HR department.
