
INFORMATION TECHNOLOGY

Directive (EU) 2024/2853 on product liability comes into force. New rules on proof and causation of damage caused by Artificial Intelligence systems.

On December 8, 2024, Directive (EU) 2024/2853 of the European Parliament and of the Council on liability for defective products and repealing Council Directive 85/374/EEC (the Product Liability Directive) came into force.

The Directive is especially important because it adapts liability rules to the digital age and introduces provisions on liability for Artificial Intelligence systems.

Recital 13 of the Directive is particularly illuminating:

(13) Products in the digital age can be tangible or intangible. Software, such as operating systems, firmware, computer programs, applications or AI systems, is increasingly prevalent in the market, and its importance for product safety is growing. Software can be placed on the market as a stand-alone product or can later be integrated into other products as a component and can cause damage through its operation. To ensure legal certainty, it should be clarified in this Directive that, for the purposes of applying strict liability, software is a product, regardless of how it is supplied or used, and thus regardless of whether the software is integrated into a device, used through a communications network or cloud technologies, or provided through a software-as-a-service model. Information, on the other hand, should not be considered a product, and product liability rules should therefore not apply to the content of digital files, such as multimedia files, e-books or the mere source code of software. The manufacturer or developer of software, including the provider of AI systems within the meaning of Regulation (EU) 2024/1689 of the European Parliament and of the Council, should be considered a manufacturer.

A developer or producer of software, including of artificial intelligence (AI) systems within the meaning of Regulation (EU) 2024/1689 (the AI Act), is therefore considered a "manufacturer", as is clear from Article 4(10) of the Directive:

"Manufacturer" means any natural or legal person who: (a) develops, produces or manufactures a product; (b) has a product designed or manufactured or who, by affixing its name, trademark or other distinguishing features to such product, presents itself as a manufacturer; or (c) develops, produces or manufactures a product for its own use.

The Product Liability Directive also provides that, since products can be designed so as to allow software changes, including upgrades, the same principles apply to such changes. Accordingly, where a substantial modification is made through a software update or through the continuous learning of an AI system, the substantially modified product is considered to have been made available on the market or put into service at the time of that modification.

The procedural rules on proof of damage and causation are fundamental: such proof is often almost impossible for the injured party to provide, especially in the case of damage caused by AI systems, owing to the so-called "black box" effect of the algorithm.
           
National courts may therefore presume the defectiveness of a product or the causal link between the defect and the damage, or both, where, even though the defendant manufacturer has disclosed the relevant information, it is excessively difficult for the injured party, in particular because of the technical or scientific complexity of the case, to prove the defectiveness of the product or the existence of the causal link, or both.

In order not to jeopardize the right to compensation, and given that manufacturers have specialized knowledge and better information than the injured party, and in order to ensure a fair allocation of risks while avoiding a reversal of the burden of proof, where the difficulties concern proof of the defectiveness of the product, the plaintiff will be required to prove only that it is likely that the product was defective; where the difficulties concern proof of causation, only that the defectiveness of the product is a likely cause of the damage.

Technical or scientific complexity will be assessed by the national courts on a case-by-case basis, taking into account several factors: the complex nature of the product, as in the case of an innovative medical device; the complex nature of the technology used, such as machine learning; the complex nature of the information and data that the plaintiff must analyse; and the complex nature of the causal link, for example between a pharmaceutical or food product and the onset of a disease, or a link whose proof would require the plaintiff to explain the inner workings of an AI system.

The plaintiff will have to put forward arguments demonstrating such excessive difficulty, but will not be required to provide evidence of it. For example, in an action concerning an AI system, in order for the court to find that excessive difficulty exists, the plaintiff will not be required to explain the specific characteristics of that AI system or how those characteristics make it harder to prove causation. The defendant, for its part, will still be able to contest all elements of the claim, including the existence of such excessive difficulty.
 