Robotics is increasingly subject to regulatory and legislative change, keeping pace with technological advances in order to limit safety and cybersecurity risks. Recent European legislation has been game-changing for manufacturers and users alike. Our expert explains.
The boom in robotics in industrial settings inevitably raises questions around safety and security, and is prompting new regulations. At the European level, the directives on machinery (2006/42/EC), low voltage (2014/35/EU) and electromagnetic compatibility (2014/30/EU), together with the machinery regulation (2023/1230), to name just the most significant, establish an already-extensive regulatory framework applicable to all robotic systems — or at least to those manufactured after these legislative acts took effect, since they are not retrospective.
“While it may seem restrictive, this legislation is crucial, and the industrial companies that use these machines are themselves calling for it in order to clearly delineate everyone’s responsibilities,” says Max Deleruelle, Business Technical Manager at CETIM, the French technical centre for mechanical industries.
Substantial modification
The new 2023/1230 regulation on machinery, which replaces Directive 2006/42/EC, is an EU regulation rather than a directive: it will apply directly to every new machine, without national transposition. One new feature is the concept of substantial modification (hardware or software) to a machine in service.
“The new regulation takes account of the fact that machines in service are frequently modified by the end user,” explains Max Deleruelle. “These modifications can create a new hazard or increase an existing risk in a way the manufacturer never considered. The new regulation therefore stipulates that anyone making a substantial modification to a machine in service is to be considered a manufacturer.”
This means that the user will now assume the same obligation that falls on the manufacturer, namely to assess the equipment’s compliance with all applicable directives and regulations, and to renew its CE marking to that effect.
“This legislation is crucial, and the manufacturers are calling for it in order to clearly delineate everyone’s responsibilities”
Max Deleruelle explains: “Having in effect become the manufacturer, the end user generally lacks the expertise to revalidate the modified machine’s safety features. They have to reach out to an expert — a provider such as Actemium, or the machine’s original manufacturer. But ultimately, the end user is legally responsible for any accident.”
Repair and maintenance operations that do not affect the machine’s compliance do not count as substantial modifications.
Data corruption
The other major change in the 2023/1230 regulation concerns cybersecurity. Insofar as a machine meets the definition of a “product that includes digital components” with data transmission, new safety requirements will also apply.
“This involves taking measures against data corruption that could cause hazardous situations, and also ensuring that every operation is recorded and tracked,” says the CETIM manager.
Whether data corruption is accidental or the result of a deliberate cyberattack, the manufacturer must now take into account reasonably foreseeable malicious acts that could lead to a hazardous situation.
“In other words,” explains Max Deleruelle, “if this type of corruption causes the machinery to stop working and the user suffers a loss of business as a result, the user can take legal action against the manufacturer for failing to provide the machinery with appropriate cybersecurity measures.”
These new requirements should enable the end user, when purchasing a new machine, to comply with the new European Cyber Resilience Act. From 2026, this will require the component manufacturer to inform the integrator of possible incidents and attacks; from 2027, the integrator must in turn inform the end user and the authorities, for a period of five years. If necessary, the integrator will be responsible for making the required hardware or software modifications.
“It’s worth mentioning that the next revision of the ISO 10218 standard on robots will include cybersecurity. This will require a security assessment, and if necessary, the implementation of specific measures to strengthen cybersecurity.”
The new regulation will also cover AI, including self-evolving components that perform safety functions. “For example, you might have an AI for recognising human beings or detecting prohibited actions, such as reaching an arm toward the robot, which would trigger an emergency stop,” says Max Deleruelle, adding: “In this case, the AI would have to be certified by a third party.” Work to fully understand the impact of the AI Act, which comes into effect in 2027, is still in progress.
17 October 2024