
Chile
AT A GLANCE
Chile is integrating AI across its justice and law enforcement systems to support prediction, investigation, and case management. The Urban Crime Prediction System, used by law enforcement, applies historical data and predictive algorithms to identify crime hotspots, while surveillance initiatives such as SITIA and Calle Segura deploy facial recognition, license-plate reading, and data sharing to support policing. The Public Prosecutor’s Office uses FISCAL HEREDIA to map criminal networks and analyse victim statements for links between cases. Courts are piloting tools such as Justa for transcription and document handling and the Buscador de Jurisprudencia for legal research. AI has also been used in human rights litigation, and training programmes are being developed to support responsible use of AI in the justice sector.
There are no national laws that expressly regulate the use of AI in criminal proceedings, but existing general laws, such as the Code of Criminal Procedure and the General Data Protection Law (Law 21.719), already apply. Chile is also in the process of developing a comprehensive, risk-based framework for the regulation of AI.
USE
Law enforcement
Predictive analytics
The Security Analysis and Mathematical Modeling Center at the University of Chile, working with Carabineros' Criminal Analysis Department, developed a system known as the ‘Urban Crime Prediction System’. First piloted in 2015, the tool uses historical crime data, geographic information, and temporal patterns to predict where crimes such as robberies and assaults are most likely to occur. The system combines three predictive algorithms to generate colour-coded risk maps that indicate hotspots by time and location, and these maps are integrated into Carabineros’ existing criminal-analysis platform for patrol planning and resource allocation.
Unlike many similar programmes worldwide, the Chilean software not only seeks to predict crime but also enables search and filtering by specific crime type (for example, robbery with force, robbery with violence, or sexual assault) and can link these crimes to particular activities and services, such as liquor stores, banks, and service stations, which often become focal points for criminal activity.
Initially tested in 12 communities of Santiago, it later expanded to more than 70 municipalities, with early reports claiming up to 89% accuracy in identifying risk areas.
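To make the approach concrete, the sketch below shows, in very simplified form, how several simple predictors over a spatial-temporal grid might be combined into colour-coded risk bands. It is a minimal illustration only: the predictors (`historical_rate`, `type_weighted`, `neighbour_spillover`), the toy data, the severity weights and the band thresholds are all assumptions made for illustration and do not reflect the actual algorithms used by the Urban Crime Prediction System.

```python
# Toy incident records: (grid_cell, hour_block, crime_type). Purely illustrative.
incidents = [
    ((3, 5), "18-24", "robbery_with_violence"),
    ((3, 5), "18-24", "robbery_with_force"),
    ((3, 6), "18-24", "robbery_with_violence"),
    ((3, 5), "12-18", "sexual_assault"),
]

def historical_rate(cell, hour_block):
    """Predictor 1: raw historical frequency for this cell and time block."""
    return sum(1 for c, h, _ in incidents if c == cell and h == hour_block)

def type_weighted(cell, hour_block, weights):
    """Predictor 2: frequency weighted by an assumed crime-type severity."""
    return sum(weights.get(t, 1.0)
               for c, h, t in incidents if c == cell and h == hour_block)

def neighbour_spillover(cell, hour_block):
    """Predictor 3: discounted counts in adjacent cells, to capture spillover."""
    x, y = cell
    neighbours = {(x + dx, y + dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)} - {cell}
    return 0.5 * sum(1 for c, h, _ in incidents if c in neighbours and h == hour_block)

def risk_band(score):
    """Map a combined score onto colour bands for the patrol map."""
    return "red" if score >= 3 else "orange" if score >= 1.5 else "green"

severity = {"robbery_with_violence": 2.0, "sexual_assault": 2.5}
for cell in [(3, 5), (3, 6), (4, 5)]:
    score = (historical_rate(cell, "18-24")
             + type_weighted(cell, "18-24", severity)
             + neighbour_spillover(cell, "18-24")) / 3
    print(cell, round(score, 2), risk_band(score))
```

In a real system the three predictors would be statistical or machine-learning models trained on far richer data, but the final step is conceptually similar: scores are merged and translated into the colour-coded hotspot maps used for patrol planning.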
Data review and analysis
The ‘Sistema de Teleprotección Nacional’ (SITIA) is a telecommunication system that integrates facial recognition and license-plate reading with existing camera infrastructure. As at August 2025, the pilot is planned for 14 communities in the central-north area of the Región Metropolitana. In that pilot, 130 of 1,000 cameras in the system are varifocal ‘bullet’ cameras that can perform facial recognition.
The system is designed to detect missing persons or individuals with outstanding arrest warrants through facial recognition and to identify stolen vehicles via automatic license-plate reading. According to the government, SITIA will support both the Carabineros and the Public Prosecutor’s Office in the investigation and prosecution of crimes.
SITIA is described as the first large-scale application of artificial intelligence in the field of teleprotection in Chile, and is the result of collaboration between the public and private sectors and the scientific community.
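The two detection tasks described above can be illustrated with a minimal sketch: an exact lookup of normalised license plates against a stolen-vehicle register, and a comparison of face embeddings against a watchlist using cosine similarity. The data structures, threshold value and function names below are illustrative assumptions and do not describe SITIA’s actual implementation.

```python
from math import sqrt

# Illustrative watchlists (assumptions, not real SITIA data structures).
STOLEN_PLATES = {"BBCL32", "HJXT81"}
FACE_WATCHLIST = {
    "missing_person_001": [0.12, 0.80, 0.55],   # precomputed face embedding
    "arrest_warrant_417": [0.90, 0.10, 0.35],
}
MATCH_THRESHOLD = 0.92  # illustrative; real systems tune this carefully

def normalise_plate(raw: str) -> str:
    """Strip separators and uppercase, so 'bb-cl-32' matches 'BBCL32'."""
    return "".join(ch for ch in raw.upper() if ch.isalnum())

def check_plate(raw_plate: str) -> bool:
    """Automatic license-plate reading output checked against the register."""
    return normalise_plate(raw_plate) in STOLEN_PLATES

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

def check_face(embedding):
    """Compare a camera-derived face embedding against the watchlist."""
    best_id, best_score = None, 0.0
    for person_id, ref in FACE_WATCHLIST.items():
        score = cosine(embedding, ref)
        if score > best_score:
            best_id, best_score = person_id, score
    # Only raise an alert above the threshold; alerts still require human review.
    return (best_id, best_score) if best_score >= MATCH_THRESHOLD else None

print(check_plate("bb-cl-32"))          # True -> stolen vehicle alert
print(check_face([0.11, 0.82, 0.53]))   # likely matches missing_person_001
```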
The Calle Segura programme, launched in 2019, was an initiative to deploy over 2,000 cameras with advanced technology and AI across 1,000 critical points nationwide. This included drones for televigilance, checkpoints to intercept escape routes for stolen vehicles, and improved interconnection between law-enforcement agencies and camera systems. The system connects various institutions (Carabineros, municipalities, highway operators, and prosecutors) via a platform for sharing surveillance data and analytical outputs. The programme also includes a ‘comisaría virtual’ (virtual police station) to facilitate procedures that previously required in-person visits, freeing the Chilean national police to increase their presence on the streets.
Prosecutors
Evidence review and analysis
The Chilean Attorney General’s Office has deployed an AI system called ‘FISCAL HEREDIA’, now used by more than 240 prosecutors nationwide. Drawing on large volumes of data obtained from the databases of the Prosecutor’s Support System and the ‘SOSAFE’ citizen-security application (a community reporting platform), the tool employs two primary models:
- Mapping connections between individuals with criminal records, aiding in identifying suspects and understanding criminal networks.
- Using natural language tools to process victim statements and police reports to detect patterns in criminal behaviour, assisting in linking related cases.
All results generated by FISCAL HEREDIA are subject to human oversight and validation by prosecutorial teams before integration into cases.
To develop the project, both the Prosecutor’s Support System and the SOSAFE citizen-security application made their databases available to researchers, while complying with privacy protection standards.
Anonymisation of data and building the model
The data from the Prosecutor’s Support System and the SOSAFE application was anonymised to protect identities but included information on accused individuals from the past 15 years, covering both solved and unsolved cases. Using this data, the research team created a mathematical model to rebuild the social network of each accused person, showing how they were connected to others.
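A minimal sketch of this network-building step is shown below: co-appearance of anonymised accused identifiers in the same case is treated as an edge, with the edge weight counting shared cases. The record fields and the co-occurrence rule are assumptions made for illustration, not the Prosecutor’s Office’s actual model.

```python
from collections import defaultdict
from itertools import combinations

# Anonymised toy case records: each case lists pseudonymous accused IDs.
cases = [
    {"case_id": "C-001", "accused": ["A17", "A42"]},
    {"case_id": "C-002", "accused": ["A42", "A98", "A17"]},
    {"case_id": "C-003", "accused": ["A98"]},
]

def build_network(cases):
    """Edge weight = number of cases in which two accused IDs appear together."""
    edges = defaultdict(int)
    for case in cases:
        for a, b in combinations(sorted(set(case["accused"])), 2):
            edges[(a, b)] += 1
    return edges

def neighbours(edges, person):
    """All IDs directly connected to `person` in the co-offending graph."""
    out = {}
    for (a, b), w in edges.items():
        if a == person:
            out[b] = w
        elif b == person:
            out[a] = w
    return out

network = build_network(cases)
print(dict(network))               # {('A17', 'A42'): 2, ('A17', 'A98'): 1, ('A42', 'A98'): 1}
print(neighbours(network, "A42"))  # {'A17': 2, 'A98': 1}
```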
Statement analysis
After mapping a suspect’s social network, the system uses a second AI model to analyse victim statements given to the police. These statements generally follow a set structure, making it easier for the model to identify patterns in behaviour, particularly when criminals use similar methods across different cases. The AI used for these statements functions similarly to large language models such as ChatGPT: it can understand the text even if there are grammatical or spelling mistakes, and it pulls out key details such as times, locations, types of weapons, and descriptions of victims and events. This helps the system spot patterns in behaviour, such as whether the criminals worked in pairs or groups, or used firearms rather than blunt objects.
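The extraction step can be sketched as follows. A production system would rely on a language model that tolerates spelling and grammar errors; the rule-based keyword and regular-expression matching below is only a simplified stand-in to show how a free-text statement might be reduced to a structured modus-operandi signature.

```python
import re

# Keyword lists are illustrative assumptions, not the real model's vocabulary.
WEAPON_TERMS = {
    "firearm": ["pistol", "gun", "firearm", "revolver"],
    "blunt_object": ["bat", "club", "pipe"],
}

def extract_pattern(statement: str) -> dict:
    """Pull simple modus-operandi features out of a free-text statement."""
    text = statement.lower()
    weapon = next((w for w, terms in WEAPON_TERMS.items()
                   if any(t in text for t in terms)), "unknown")
    group = "pair_or_group" if re.search(r"\b(two|three|group of|pair)\b", text) else "single"
    time_match = re.search(r"\b(\d{1,2}:\d{2}|\d{1,2}\s?(am|pm))\b", text)
    return {
        "weapon": weapon,
        "offenders": group,
        "time": time_match.group(1) if time_match else None,
    }

s1 = "Two men on a motorbike pointed a gun at me around 11 pm near the bank."
s2 = "A group of three threatened me with a revolver at 23:10 outside the bank."
print(extract_pattern(s1))
print(extract_pattern(s2))
# Similar signatures (firearm, pair_or_group, late evening) suggest the cases may be linked.
```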
Suspect connection
The insights derived from these behavioural patterns are cross-referenced with the earlier social-network data, which links suspects to one another. This combined approach makes it possible to correlate how individuals are connected with how crimes are committed, strengthening the investigatory value of the findings.
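A minimal sketch of this cross-referencing step is given below: pairs of cases are scored by combining the similarity of their extracted patterns with whether their known suspects are directly connected in the co-offending network. All data, weights and function names are illustrative assumptions, not the scoring actually used by FISCAL HEREDIA.

```python
# Cases whose extracted patterns match AND whose known suspects are close in the
# co-offending network are flagged for human review. All data below is illustrative.
case_patterns = {
    "C-101": {"weapon": "firearm", "offenders": "pair_or_group"},
    "C-102": {"weapon": "firearm", "offenders": "pair_or_group"},
    "C-103": {"weapon": "blunt_object", "offenders": "single"},
}
case_suspects = {"C-101": {"A17"}, "C-102": {"A42"}, "C-103": {"A98"}}
network_edges = {("A17", "A42"): 2, ("A17", "A98"): 1}

def pattern_similarity(p1, p2):
    """Share of modus-operandi features that two cases have in common."""
    shared = sum(1 for k in p1 if p2.get(k) == p1[k])
    return shared / max(len(p1), 1)

def network_link(suspects_a, suspects_b, edges):
    """1.0 if any suspect pair is directly connected in the network, else 0.0."""
    return 1.0 if any((a, b) in edges or (b, a) in edges
                      for a in suspects_a for b in suspects_b) else 0.0

def link_score(c1, c2):
    """Weighted blend of pattern similarity and network proximity (weights assumed)."""
    return (0.6 * pattern_similarity(case_patterns[c1], case_patterns[c2])
            + 0.4 * network_link(case_suspects[c1], case_suspects[c2], network_edges))

for pair in [("C-101", "C-102"), ("C-101", "C-103")]:
    print(pair, round(link_score(*pair), 2))   # higher score -> candidate for review
```

As the text notes, outputs of this kind are analytical leads only and remain subject to human validation by prosecutorial teams.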
In pilot testing, the system was able to identify multiple suspects linked to a known individual, some of whom were still at large. The tool can work continuously, scanning hundreds of thousands of police reports. While it does not replace the work of legal experts, it offers a powerful analytic tool to enhance and accelerate their prosecutorial investigations.
The development and deployment of the FISCAL HEREDIA tool had multiple objectives:
- Enhance prosecutorial intelligence. The system links incidents, such as thefts committed using vehicles with the same license plate or similar characteristics reported through the SOSAFE application, to help identify ‘crime drives’, or patterns of repeated criminal activity (see the sketch after this list).
- Predictive burglary models. It produces risk forecasts for home burglaries in different areas, supporting better management of neighbourhood security and more effective preventative patrols.
- Improve alert management. It improves the organisation of alerts sent through the SOSAFE application and received by police and municipalities, enabling better prioritisation of urgent events that require a quick response.
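The ‘crime drive’ objective in the first bullet can be illustrated with a short sketch that groups SOSAFE-style reports by normalised license plate and flags plates appearing in multiple reports. The report fields and the threshold are assumptions for illustration only, not the real application schema.

```python
from collections import defaultdict

# Toy SOSAFE-style reports (illustrative fields, not the real application schema).
reports = [
    {"id": 1, "plate": "GH-TR-21", "offence": "theft", "district": "Recoleta"},
    {"id": 2, "plate": "ghtr21",   "offence": "theft", "district": "Independencia"},
    {"id": 3, "plate": "XZ-PL-09", "offence": "theft", "district": "Recoleta"},
]

def normalise(plate: str) -> str:
    """Uppercase and strip separators so differently formatted plates match."""
    return "".join(ch for ch in plate.upper() if ch.isalnum())

def find_crime_drives(reports, min_reports=2):
    """Group reports by normalised plate; repeated plates suggest a 'crime drive'."""
    by_plate = defaultdict(list)
    for r in reports:
        by_plate[normalise(r["plate"])].append(r["id"])
    return {plate: ids for plate, ids in by_plate.items() if len(ids) >= min_reports}

print(find_crime_drives(reports))   # {'GHTR21': [1, 2]}
```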
Looking ahead, the Prosecutor’s Office plans to expand the tool’s capabilities by integrating additional datasets, such as prison records and inmate associations, to further enrich the social network model. They also aim to refine how crimes are weighed within the system (for instance, factoring in the severity of sentences) and to host the model on a central server or cloud platform so it can be used nationwide by regional prosecutors and police. Eventually, it is hoped that the technology will be adapted to support investigations into other serious crimes such as homicide, fraud, kidnapping, and drug trafficking.
Courts
Case management
As at August 2025, a pilot tool named ‘Justa’ is being tested by a team of judges, administrators, and technical advisors at the Juzgado de Letras y Garantía de Mulchén (Court of Letters and Guarantees of Mulchén), which handles criminal cases among other matters. It aims to speed up judicial workflows by assisting with internal tasks such as transcribing audio, drafting summaries, generating hearing minutes, and processing legal documents. The system is expanding to other courts, including a chamber of the Supreme Court, and related pilots of automated transcription and document analysis are under way in other local courts.
Legal research, analysis and drafting support
Chile’s Supreme Court uses the ‘Buscador de Jurisprudencia’ (Jurisprudence Search Tool), established by Auto Acordado No. 164-2024 (effective January 2025) and designed as a nationwide search engine for court rulings. The tool uses AI for semantic search, anonymisation, and relevance ranking, aiding judges and clerks in legal research.
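Semantic search and relevance ranking of this kind can be sketched as follows: rulings and queries are mapped to vectors and ranked by cosine similarity. The bag-of-words ‘embedding’ below is only a stand-in for the trained sentence-embedding model a real system would use, and nothing here reflects the Buscador de Jurisprudencia’s actual implementation.

```python
from math import sqrt

# Toy corpus of ruling summaries (illustrative only).
rulings = {
    "Rol 18.566-2024": "biometric data iris scan privacy informed consent minor",
    "Rol 9054-2020":   "videoconference criminal trial fair trial effective defence",
}

def embed(text: str) -> dict:
    """Stand-in embedding: a bag-of-words vector. A real system would use a
    trained sentence-embedding model instead of word counts."""
    vec = {}
    for token in text.lower().split():
        vec[token] = vec.get(token, 0) + 1
    return vec

def cosine(a: dict, b: dict) -> float:
    dot = sum(a[t] * b.get(t, 0) for t in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def search(query: str, top_k: int = 2):
    """Rank rulings by similarity to the query (the relevance-ranking step)."""
    q = embed(query)
    scored = [(cosine(q, embed(text)), rol) for rol, text in rulings.items()]
    return sorted(scored, reverse=True)[:top_k]

print(search("consent for processing biometric iris data"))
```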
Defence
Evidence review and analysis
In 2024, AI was used in a criminal case brought against the High Commanders of the Carabineros (the leaders of Chile’s national police force). Although the AI tools were deployed by counsel for the victims rather than the defence, the case illustrates the type of tools available in Chile. After one victim who suffered police-related injuries died by suicide in 2004, seven other victims died by suicide amid the 2019-20 violations of the right to protest committed by the Carabineros commanders and state organs. In April 2021, 18 months after the protests, a partnership between civil society groups and academia, drawing on information on 2,000 cases collected during the attacks and processed in Excel spreadsheets, brought a complaint against the Police High Command for these crimes. In October 2024, two generals were indicted as perpetrators of unlawful arrest on the basis of 280 cases involving the shooting of victims. One challenge in the case was systematising the large volume of information involved. Counsel used AI to organise the data and identify communication patterns within the police command, in order to demonstrate the links between those responsible for the crimes.
TRAINING
Chile has begun incorporating training programmes and awareness initiatives for justice-sector professionals on the use and risks of AI. The Academia Judicial de Chile (a public institution providing training to legal practitioners) has run ethics-focused sessions on AI and announced that it is developing guidance and training instruments to promote a ‘responsible, realistic’ use of AI in courts, with an emphasis on transparency, bias, and data protection. In 2022, the Attorney General’s Academy promoted and hosted AI-focused activities to familiarise prosecutors with new tools such as FISCAL HEREDIA.
At the academic level, the Universidad de Chile launched a short course in 2025 titled ‘Inteligencia Artificial Aplicada a la Justicia’ (Artificial Intelligence Applied to Justice), open to practitioners and officials, covering applications of AI in legal processes.
In August 2025, the Colegio de Abogados de Chile (Chilean Bar Association) convened a national seminar on ethics and AI, bringing together members of the judiciary and practising lawyers to debate risks, responsibilities, and ethical limits.
Training on AI use is also available in Chile through UNESCO programmes.
REGULATION
As at August 2025, there is no regulation specifically governing the use of AI in criminal proceedings or in judicial proceedings more broadly. However, existing rules, such as the Code of Criminal Procedure, data protection laws and constitutional principles, may be construed to regulate or limit the use of AI in criminal proceedings. In addition, a comprehensive draft AI bill (Boletín Nº 16.821-19) is being considered by the Chamber of Deputies.
Guidelines for practitioners
Judicial guidelines
While there are no judicial guidelines specifically addressing the use of AI within the judiciary, judges are bound by constitutional principles of impartiality, independence, and reasoned decision-making. These imply caution in relying on automated tools.
There are also no specific guidelines for lawyers regarding the use of AI, as at August 2025.
Criminal procedure rules
Code of Criminal Procedure
Pursuant to Article 297 of the Code of Criminal Procedure, judges must evaluate all evidence, which would include outputs from forensic software, facial recognition systems, or predictive analytics tools, in accordance with principles of logic, maxims of experience, and scientifically well-founded knowledge. Courts are also required to explain the reasoning behind their assessment. This evidentiary rule functions as a natural safeguard against admitting or relying on opaque or insufficiently validated AI systems that cannot meet thresholds of scientific rigour and transparency.
Data protection legislation
General Data Protection Law (Law 21.719)
Law No. 21.719 on the Protection of Personal Data (LPDP), which was approved by the Chilean Congress in December 2024 and will come into full effect in December 2026, establishes a legal framework for personal data protection. The LPDP creates a Personal Data Protection Agency to ensure compliance, expands data subject rights such as data portability and erasure, and imposes new obligations on data controllers, aligning Chile's data privacy rules with international standards. The LPDP applies to data controllers (entities or individuals who determine the purposes and means of processing personal data) and data processors (entities or individuals who process data on behalf of a controller) at both private and public entities (including courts and prosecutorial bodies). While not specific to AI, the LPDP regulates the processing of personal data, which is highly relevant to AI systems used in the justice sector (e.g., risk assessment tools and predictive policing). Obligations around lawfulness, transparency, proportionality, and data subject rights apply equally to AI systems in the justice sector. These could, for example, restrict the deployment of AI tools that involve biometric or other sensitive data, or require procedural safeguards to ensure fair treatment of defendants.
Human rights
The use of AI in criminal proceedings must be consistent with the procedural guarantees and fundamental rights enshrined in the Chilean Constitution, including due process, equality before the law, and the right to a fair trial. Article 19 of the Constitution, which protects privacy, honour, and the inviolability of communications, as well as the right to data protection, may, for example, operate as a broad constitutional constraint on the use of AI where sensitive personal or biometric data are processed, or where automated outputs may affect due process and equality before the law.
Additional guidance is offered by regional and international instruments. Chile participates in Inter-American Commission on Human Rights (‘IACHR’) initiatives concerning digital rights and the use of technology in justice systems. The IACHR has emphasised that any AI use in judicial contexts must safeguard access to justice and non-discrimination. Chile has also endorsed the UNESCO Recommendation on the Ethics of Artificial Intelligence (2021), which, while non-binding, sets principles on transparency, accountability, and human rights protections that extend to justice-sector applications.
OUTLOOK
In 2021, Chile launched its first National Artificial Intelligence Policy (2021–2030), which outlines the country’s vision for AI development across three areas: (i) enabling infrastructure, (ii) AI development and adoption, and (iii) ethics, regulation, and socioeconomic impacts. The policy establishes the foundation for responsible use of AI across both the public and private sectors, including in the justice sector.
In May 2024, Chile presented an updated National AI Policy and an accompanying draft law following UNESCO’s Readiness Assessment Methodology. The updated policy acknowledges the need for AI regulation, including guidelines on the ethical use of AI systems in the public sector, standards for transparency, accountability and data protection, as well as reform of laws complementary to AI. The draft law (Boletín 16.821-19) proposes a regulatory approach that combines self-regulation with risk-based regulation, classifying AI systems into those presenting unacceptable risk, high risk, limited risk, and no obvious risk. It mandates transparency, explainability, documentation, and human oversight, and provides sanctions for non-compliance.
CASES
In a 2024 decision (Rol N° 18.566-2024), the Supreme Court held that Worldcoin, a digital ID platform co-founded by OpenAI’s CEO Sam Altman, violated a minor’s constitutional rights to privacy and data protection by scanning her iris without informed consent, and ordered Worldcoin to delete her biometric data within 30 days. The Court highlighted that biometric data is highly sensitive and requires special constitutional protection. The ruling leaves open important questions about how courts will handle biometric and AI-processed data in criminal proceedings.
In 2020, the Chilean Constitutional Court addressed the constitutionality of conducting criminal trials via videoconference under pandemic-era legislation. The petitioner challenged the law that allowed remote trials, arguing it violated his right to a fair trial, including the right to an effective defence. The complaint was ultimately rejected (Rol N° 9054-2020) because the court did not reach the majority vote required to declare provisions of the law unconstitutional. Several justices, however, raised concerns about the use of digital and AI tools to replace judicial functions and the erosion of fair trial guarantees, highlighting that AI should ‘never’ replace judicial decision-making (‘the judge’s subjectivity’) and evidence assessment, especially in criminal cases.