CHECK AGAINST DELIVERY
Dear Ana, dear colleagues,
It is a pleasure to be here, at the High-Level Forum, to listen to all of the expertise and innovative thinking from my fellow speakers, and to reflect on the work carried out by the High-Level Forum over the last year, which the Fundamental Rights Agency has contributed to.
I wish to thank the Commission for its efforts to bring together Member States and agencies, as well as practitioners and experts to shape a forward-looking agenda on EU criminal justice. The EU Agency for Fundamental Rights has been working in the area of criminal justice since it was established and we have acquired a wealth of evidence, which we are always happy to share.
Listening carefully to the varied input and perspectives at Forum meetings, and considering all the available evidence, helps us to understand not only the law on paper, but also the lived realities of how the law functions in practice. This, in turn, ensures that decisions on the direction that EU criminal justice will take are well-informed and evidence-based. This is something that FRA strives to underpin with its evidence-based reports, and their accompanying key findings and opinions – which are based on the experiences of hundreds of practitioners who are applying EU law in practice, alongside the experiences of defendants, offenders, witnesses and victims.
As we discuss, and eventually adopt, measures to make future EU law more responsive to emerging threats and to legal and technological developments, we need to bear the full spectrum of fundamental rights in mind.
For example, when we talk about the future of criminal procedural safeguards in EU law, we need to consider not only the rights of persons suspected or accused of crime, but also the rights of victims. The Agency’s data consistently show that people living in the EU are concerned about crime, and that many are victims of crime. Our large-scale surveys have consistently revealed the extent to which different groups experience crime – ranging from violence through to fraud. However, many crimes remain unreported to the authorities.
There are some forms of crime which affect very large numbers of people. One prime example is online fraud, which features prominently as a priority crime area in the EU and is referenced under the ‘ProtectEU’ internal security strategy – reflecting the growing reality of online crime. This is also an area where the Agency is undertaking extensive research on victims of online fraud. However, many of these crimes remain under-reported by victims, which not only undermines the effectiveness of victims’ rights but also distorts the perception of which crimes are most prevalent. Under-reporting means that many offenders continue to act with impunity, and EU law – which sets out to address crime and ensure victims’ rights – is less effective. This is not to ‘blame’ victims for not coming forward to report crime.
Therefore, any possible actions in the area of EU criminal law – as outlined in the High-Level Forum report – must go hand in hand with a commitment to empower victims. This is perhaps one of the areas that would have deserved specific attention during the High-Level Forum discussions and in the report, also taking into account that the current EU Victims’ Rights Strategy comes to an end this year and that a new one will be proposed and adopted in 2026. In addition, we should soon have the newly revised Victims’ Rights Directive.
As I mentioned earlier, this is a topic which is underpinned by the wealth of expertise and evidence developed by FRA over the years, and the Agency stands ready to support EU institutions and Member States with any actions that may be taken following these discussions.
For example, the report considers the possible need to adopt new EU criminal law initiatives to uphold respect for EU values, notably in the area of hate offences, and possibly also regarding gender-based violence.
FRA holds an extensive body of evidence in this regard, from our large-scale surveys that capture different population groups’ experiences of hate (both online and offline), to our dedicated research on the volume and nature of online hate against different groups across different online platforms. This work sheds light on the diverse and widespread nature of online hate, which warrants further consideration with respect to criminal law. Close attention to the application of the Digital Services Act – in practice – needs to run in parallel with any discussion about revising existing law or adopting new legislation to combat online hate. To this end, the Agency is undertaking further research on the application of the Digital Services Act with respect to illegal and harmful online content targeting boys and girls.
In respect of gender-based violence, key results from an EU-wide survey – undertaken by Eurostat, FRA and EIGE – show that 1 in 3 women in the EU have experienced physical violence (or threats) and/or sexual violence during their lifetime, by any perpetrator. This figure has not changed since FRA undertook its first EU-wide survey on violence against women, which we published in 2014. Our work on the Digital Services Act also underlines the extent of online misogyny, which outstrips hate targeting the other groups we looked at.
FRA holds much expertise in relation to two further calls contained in the High-Level Forum’s report. The first is the call to strengthen the effective functioning of the European Arrest Warrant Framework Decision. In this regard, FRA carried out extensive research in all Member States, published in reports in 2019 and 2024, which contain a pool of best practices across Member States that could be exchanged and built upon.
The second is the call for effective follow-up to the 2023 Commission Recommendation on detention, by further exploring soft law measures such as guidelines, best practices, project financing to support the Member States on detention-related matters, or cross-border cooperation instruments on alternatives to detention, such as electronic monitoring.
FRA has produced a tool to support Member States in their efforts to improve conditions of detention, as well as to guide judges and other legal practitioners in cross-border cases. FRA’s database on criminal detention is regularly updated and contains national standards in all Member States, national and international case law, and references to monitoring reports. FRA will continue working in this field to provide accessible and reliable information. And in this regard, we very much welcome the feedback we have received from the different judicial networks where we have presented the database, including direct feedback from Member States indicating that they find the database useful – particularly with respect to cross-border cases.
Allow me to mention two specific areas which FRA has been prioritising, in order to most effectively and efficiently support ongoing and future developments in the area of criminal law at EU and national level: namely, digitalisation of justice, and the use of AI.
On 13 November, FRA published its report ‘Digitalising Justice’, which looked comprehensively at 31 digital tools in the justice field.
FRA’s findings in these areas aim to guide actors who are developing, using and overseeing digital tools to understand how fundamental rights can be impacted through increased digitalisation.
All of our findings are built upon insights we gathered from people working in the justice field who are dealing with these realities on a daily basis. The findings can help guide the process of digitalisation in a way that ensures digitalised justice works for everyone. FRA’s findings can be a resource for the EU institutions and for national justice authorities, including by feeding into actions arising from this Forum – for example when mapping digital tools or developing standards.
In this area, FRA has worked hard to gather rights-focused empirical data which has two benefits: in the first instance it helps to flag emerging or structural fundamental rights risks, but in addition, the data sets out safeguards and solutions to maximise the benefits of digitalisation.
Chapter 4 of the High-Level Forum’s report refers to two digital advances: videoconferencing and artificial intelligence. Allow me to elaborate on these two aspects, from the perspective of FRA’s work.
In relation to the use of videoconferencing, in addition to the importance of the technical standards highlighted in the report, FRA’s findings signal the need for targeted safeguards to balance efficiency with fair trial and defence rights concerns. As the European Union is set to rely increasingly on cross-border digital tools (for example e-CODEX or AI in justice), people, businesses and service users would benefit from shared rights-based standards in addition to technical standards. Such standards would lay the foundations for justice systems that are as efficient, accessible, transparent, secure, and inclusive as possible in the digital era. They would also contribute to the extremely important task of strengthening trust in justice systems, which would, in turn, support stronger economies.
As regards artificial intelligence, FRA interviewed justice practitioners, technical experts and experts from justice ministries for its work on the digitalisation of justice. They all emphasised the need for a cautious approach when it comes to the use of AI in justice, stressing the importance of strict compliance with fundamental rights. They also highlighted the need to maintain human discretion and oversight in decision-making related to the use of AI in justice going forward.
Fundamental rights compliance of any AI tool is essential. Ethical considerations are not sufficient; developers and users of AI must follow established fundamental rights norms and standards.
In this regard, the work of the Council, in particular its e-Justice Working Party, is important to follow. To this end, FRA has informed the work of the e-Justice Working Party by sharing its key findings and stands ready to continue its engagement to support Member States in this area.
FRA is also working closely with the AI Office in DG CNECT in the area of fundamental rights impact assessments. The key message we reiterate in our work is that all digital tools – not only high-risk AI used in the justice field – carry risks that can be mitigated through rights assessments, broad consultation, and thoughtful design. There is an obvious link here, and much evidence indicates that, in addition to reducing risks, these rights-compliant elements also maximise benefits.
As justice systems in many Member States accelerate their digital transformations, we consider FRA’s analysis timely and useful. Our aim with it was not to pinpoint problems and leave actors alone to come up with solutions. It was to increase understanding of on-the-ground risks and to highlight safeguards and good practices that work to counter those risks.
It is essential for EU Member States to carefully and successfully embrace digitalisation in the justice field, in a way that allows everyone in the EU to benefit from improved access to justice, efficient court proceedings, and hearings that are conducted within a reasonable time.
All of us gathered here today must work together to build a case for investing in a critical assessment of fundamental rights compliance during the design phase of digital tools. We must showcase how this can pay off in the long run, by leading to more universal use, digital inclusion, legal certainty, and a level playing field for people and businesses. We must also highlight how building in fundamental rights compliance at an early stage is simply ‘good business’: should future litigation before EU or national courts reveal rights violations in digital tools, costly redesigns would follow. Given the relatively new territory we find ourselves in, we have a great opportunity to build concrete fundamental rights safeguards into digitalisation efforts from an early stage. Such efforts will only be improved further by ensuring that the needs of the most vulnerable groups in society are also considered when designing and deploying digital tools, in particular those that use AI. This must include tools that can benefit victims of crime – as I mentioned at the beginning of my intervention today.
Our attention must remain focused on practice. As digital tools become more widely used in the justice field, we must watch their application with respect to legal safeguards, alongside the development of relevant case law relating to digital and AI applications. We must react swiftly where we see shortcomings. FRA will continue to follow this area closely and to support EU institutions and Member States with its evidence.
In particular, we know that courts and justice authorities, as well as law enforcement authorities, need clear guidance on the appropriate use of artificial intelligence. In this respect, FRA is about to publish its report on Assessing High-risk AI on 4 December, which will support the EU and Member States by providing guidance on how best to use the AI Act to protect fundamental rights.
FRA will continue to support where we can. The promise of quicker, more efficient and more inclusive access to justice is one we must grasp, while bearing in mind the potential pitfalls we might encounter as we advance.
Thank you.