
South Africa

Tools
Specialised systems
Tasks
Data review and analysis | Legal research, analysis and drafting support | Operational support | Predictive analytics
Users
Law enforcement | Courts | Defence
Scope
Nationwide
Training
No systematic or mandatory training
Regulation
No dedicated AI regulation. The Code of Conduct for Legal Practitioners, criminal procedure rules, and data protection laws apply to regulate the use of AI in court.
Cases
In a series of civil cases, courts have considered hallucinated case citations generated by AI tools relied upon by counsel. Sanctions have included referrals to South Africa’s legal regulator for investigation and adverse costs orders if the court was misled.
Insights
South Africa’s draft National AI Policy Framework, issued in 2024, promotes the adoption of ethical AI guidelines and may pave the way for regulation of the use of AI in court once finalised.

AT A GLANCE

In South Africa, prosecutors and judges have not yet officially adopted the use of AI (though there are some indications of AI use in civil proceedings). Rather, the use of AI has been focused on law enforcement. Police use predictive policing in Cape Town to curb gang violence. Vumacam’s licence-plate recognition network, which generates tens of thousands of daily alerts, has led to hundreds of arrests and vehicle seizures. The South African Police Service’s Persons Identification and Verification Application uses AI to identify and verify suspects. In addition, initiatives like LegalFundi’s 'Maya' chatbot expand access to legal advice, including on reporting gender-based violence crimes. South Africa has no regulation targeting the use of AI in legal proceedings, but the courts have considered fake case citations hallucinated by ChatGPT and other AI tools in a series of civil cases and referred legal practitioners found to have misled the court to the Legal Practice Council, South Africa’s regulator, for investigation. The Council has provided training to highlight the risks of hallucinated case law, urging lawyers to validate AI outputs.

In addition to the Code of Conduct for Legal Practitioners, South Africa’s Protection of Personal Information Act and criminal procedure rules are also relevant to the use of AI tools in court. A draft National AI Policy Framework promotes the development of ethical AI rules and is seen as a first step towards more targeted regulation of the use of AI, including in court.

USE

Law enforcement 

Operational support 

The ShotSpotter system, in use in Cape Town since 2016, delivers real-time gunshot detection, pinpointing the exact location where a shot was fired and relaying the details to law enforcement and emergency services for immediate response within 30-45 seconds of a trigger being pulled. Acoustic events are filtered by machine-learning algorithms before being verified by human experts. The system also indicates the number of shooters and shots fired, and its output can be used as evidence when prosecuting offenders. The system’s alerts and pinpoint accuracy have contributed to over 50 arrests, the recovery of more than 35 guns, and the confiscation of 400 rounds of ammunition.

Predictive analytics

In Cape Town, police have used predictive policing to focus on neighbourhoods where gang violence is common. By placing more officers in those areas at the right time, they have prevented some incidents before they even happen. 

Vumacam operates licence-plate recognition cameras in South Africa to identify and pinpoint vehicles used for criminal activities. Daily, Vumacam flags approximately 28,000 Vehicle of Interest alerts, with 23,500 drawn from police databases and 4,500 from private security firms. The system triggered 14,255 alerts in the first half of 2024, leading to 783 interceptions, the detention of 370 suspects, and 335 impounded vehicles.

In 2025, the South African Police Service identified AI-driven predictive policing as an area for development between 2025 and 2030, as such initiatives enable law enforcement to take preventative measures before crimes happen.

Data review and analysis

The South African Police Service has rolled out the ‘Persons Identification and Verification Application’ (PIVA), which uses AI technologies (such as natural language processing, deep learning, and machine learning) to identify and verify suspects. By using fingerprints, facial and iris scans, and (in some cases) voice recognition, the system reduces instances of identity fraud, and links verified records directly into case files.

We implemented the solution during the [2023/24] reporting period to verify the identities of over 1,081,688 accused individuals in near-real-time, and found that over 412,721 of these individuals (38%) had prior criminal records that we could reference. Further, 25,259 (2%) wanted persons could be identified as linked to [South African Police Service] circulations as persons of interest for other cases. This timely information is assisting [the South African Police Service] and [the National Prosecuting Authority] in their subsequent management of the accused, and it also provides data to aid in bail considerations.

South African Department of Justice and Constitutional Development, 2023


Prosecutors 

As at August 2025, there are no reported cases of prosecutors in South Africa making use of AI. 

Courts 

Legal research, analysis and drafting support 

Judges have reportedly begun experimenting with basic speech-to-text tools such as Microsoft Word Dictate to produce initial drafts of judgments.

We have so many languages in South Africa. What would really help to avoid cases being unnecessarily postponed are AI tools specifically trained to assist with transcription services and with interpretation. [...] Some general AI translation tools don’t work very well for court purposes. That’s why people have started developing specialized tools trained specifically for legal and courtroom use. We’re exploring those options, but they’re still quite limited—especially when it comes to less common languages. Another area we’re just beginning to address is court transcription. Since official transcripts are required and transcription services are expensive, finding better solutions there is also a priority.

Jakkie Wessels, Regional Court President Limpopo, September 2025


Defence

Legal research, analysis and drafting support 

‘LegalFundi’ — a South African law-tech non-profit committed to improving access to justice through technology solutions — is offering an AI-powered legal assistant named Maya, which provides free, accessible, and understandable legal guidance via WhatsApp. In the context of gender-based violence, the tool can provide advice to victims on how to report a crime and apply for a protection order. The information Maya provides, which is not limited to criminal proceedings, is grounded in the Paralegal Manual 2025, jointly published by LegalFundi, the Education & Training Unit, Social Change Assistance Trust, and Black Sash. Users can get direct references to chapters from this manual. Over 50,000 conversations have taken place via Maya, with a monthly user base of around 7,000 individuals. 

Another example is ‘My AI Lawyer’, an AI-powered commercial legal assistant offering legal support through a WhatsApp chatbot. Using a structured legal knowledge database, the chatbot can draft basic letters, provide relevant legal provisions, and suggest next steps, while cautioning users to seek professional validation in complex cases.

TRAINING 

There is no formalised system in place to train judges, advocates or other lawyers on the use of AI, but individual judges offer training courses through the South African Judicial Education Institute (which trains judges, magistrates, and aspiring judicial office holders) on virtual courts, cybercrime, and electronic evidence. In August 2025, the Legal Practice Council, a statutory body established to regulate the legal profession in South Africa, held a webinar on ‘The use of AI tools in legal research and legal practice’. The webinar addressed a growing number of incidents in which legal practitioners were found to have cited non-existent case law hallucinated by generative AI tools.

In March 2025, referring to the future of AI as a potential ‘boon and threat to legal practitioners’, the Legal Practice Council embedded information and communication technology (ICT) training into the official admission examination syllabus for 2026.

REGULATION

South Africa has not yet adopted any laws, judicial guidelines or protocols that address the use of AI in judicial proceedings. The Department of Communications and Digital Technologies’ South Africa National Artificial Intelligence Policy Framework may pave the way for regulation of the use of AI in court once enacted. In the meantime, a number of laws and guidelines that do not specifically target AI nonetheless regulate its use in criminal proceedings.

Guidelines for practitioners 

Code of Conduct for Legal Practitioners 

The South African Code of Conduct for All Legal Practitioners, Candidate Legal Practitioners and Juristic Entities sets out principles that must be observed in the practice of law. In respect of ethical duties, the Code of Conduct speaks in broad terms to the competency and skills of a legal practitioner, including the general duty to act with necessary skill, care and diligence. The Code of Conduct also sets out the general ethical and professional principles in relation to confidentiality, an attorney’s responsibility for ‘proper control and supervision over his or her staff and offices’, and the requirement to ‘maintain the highest standards of honesty and integrity’. While the Code of Conduct does not mention AI tools, the courts have applied it when considering legal practitioners’ obligations in the context of fake case citations or hallucinations generated using AI tools in a number of civil cases (see below). 

Criminal procedure rules 

Electronic Communications and Transactions Act, 2002 

The Electronic Communications and Transactions Act (Act 25 of 2002) (ECTA) regulates electronic communications in South Africa. It requires that a data message ‘must be given due evidential weight’. Specifically, section 15 of ECTA provides, generally (i.e. not only in the context of criminal proceedings), that ‘the rules of evidence must not be applied so as to deny the admissibility of a data message’ into evidence (i) merely because it is a data message, or (ii) merely because ‘it is not in its original form’, if it is the best evidence that can be adduced. A data message is defined as ‘data generated, sent, received or stored by electronic means’, including voice and any other electronically stored record. Section 15 was enacted to ensure that data messages are given their full evidential weight by placing electronic information on the same footing as traditional paper-based evidence. It may also be construed as applying to the use of AI in criminal proceedings.

Data protection legislation 

Protection of Personal Information Act 

The Protection of Personal Information Act (Act 4 of 2013) regulates the processing of personal information or personal data by a responsible party (controller) or operator (processor). Modelled on similar EU legislation, the Act contains eight conditions for lawful processing which must be adhered to when managing personal information. Amongst these is an obligation to adopt ‘appropriate reasonable technical and organisational measures’ to secure personal information against unlawful or unauthorised access. The Act also requires responsible parties to collect only the minimum personal information required for the task at hand and not to retain records ‘longer than is necessary for the purpose for which the information was collected or subsequently processed’. Even though the Act does not speak directly to the use of AI tools, it should apply equally to such use, as the obligations it places on a responsible party are technology agnostic.

Human rights

Provisions in national, regional and international human rights instruments may also be relevant to the regulation of AI in court. These include the right to privacy and the right to a fair trial in sections 14 and 35 of the South African Constitution, fair trial guarantees in article 7 of the African Charter on Human and Peoples’ Rights, and protections of the right to a fair trial and the right to privacy in articles 14 and 17 of the International Covenant on Civil and Political Rights and articles 16 and 40 of the Convention on the Rights of the Child.

Outlook

South Africa National Artificial Intelligence Policy Framework

In 2024, the Department of Communications and Digital Technologies issued the South Africa National Artificial Intelligence Policy Framework. (The Policy Framework is in line with the African Union’s 2024 Continental Strategy on AI, which encourages the ‘adoption and implementation of ethical principles for AI’.) A public consultation ran in late 2024, after which stakeholder inputs began to be reviewed. The Department indicated in July 2025 that, ‘in this financial year’, it would be ‘finalising the National Artificial Intelligence Policy’, which would be ‘submitted to Cabinet for approval’ with implementation to ‘commence thereafter’. As at August 2025, the Framework was still in draft form and had not been formally gazetted or legislatively enacted.

The Policy Framework is drafted as a first step to guide the ethical, inclusive, and strategic development of AI to ensure that it contributes to economic growth, social equity, and technological leadership. The Framework outlines broad applications of AI across sectors such as healthcare, education, public administration, and economic development. It does not detail sector-specific strategies for the justice system or legal processes like court proceedings. But the Framework emphasises several key areas of focus that are relevant to the future application of AI in legal services, including:

Ethical AI guidelines 

There should be development of ethical AI guidelines which address ‘issues including fairness, transparency, and accountability’, ensure ‘alignment with human rights principles’, and adhere to ‘relevant laws, regulations, and policies governing AI development and use’.

Human oversight over AI

There should be human oversight over AI in critical decision-making, for example, by creating human-in-the-loop systems and developing ‘frameworks for AI decision-making that prioritize human judgment’.

Explainable and transparent AI 

Explainable and transparent AI should be promoted, ensuring that ‘processes, decision making criteria, and outcomes are understandable and accessible to users and stakeholders’, and that they can understand and interpret ‘how AI systems arrive at their decisions or conclusions’.

Mitigating biases in AI systems

Biases in AI systems should be mitigated, for instance, by ensuring that AI systems are ‘trained on diverse data sets representing all demographics’.  

Responsible use of AI 

A code of conduct for AI professionals should be created and ethical training integrated into AI education and professional development.

Initiatives to regulate the use of AI in court

The Rules Board for Courts of Law is a statutory body established to review and amend the rules of court, subject to the approval of the Minister of Justice and Correctional Services. In 2025, its initiatives have included draft e-justice rules amending court procedure for a proposed e-justice system that includes electronic filing and delivery of court documents. There are also plans to consider amendments to court procedure to regulate the use of AI in court, but as at August 2025, details remained to be confirmed.

The Office of the Chief Justice from time to time issues directives concerning court management. In September 2025, the Chief Justice spoke out on the topic of AI, suggesting that the courts must take an active role when it comes to its use.

 

AI’s profound potential to enhance efficiency, reduce crippling backlogs, and lower the cost of legal services is exciting. But lawyers who abdicate their professional responsibility to technology risk harming their clients and eroding the trust on which the legal system depends … Clearly, if we do not shape the future of AI in law, it will shape us, and do so mercilessly.

Chief Justice Mandisa Maya, 2025


CASES

As at August 2025, there were no cases on the use of AI in criminal proceedings. But in a civil context, courts have considered fabricated citations generated by AI systems and referred the lawyers who included these hallucinations in their submissions to the Legal Practice Council for investigation. For instance:  

In a case concerning the licensing and sale of a precious metals business before the High Court in Johannesburg, Northbound Processing (Pty) Ltd v South African Diamond and Precious Metals Regulator and Others (2025/072038) [2025] ZAGPJHC 661, the judge discovered “while drafting the judgment” that Northbound’s legal team had cited fictitious case law in their heads of argument, which had been generated by a legal AI tool called Legal Genius. The judge noted that counsel “apologised unreservedly for the oversight on behalf of Northbound’s legal team”. Although “there was no deliberate attempt to mislead the court in relation to the use of incorrect case citations in the heads of argument”, the judge, applying the earlier authority of Mavundla v MEC Department of Co-Operative Government and Traditional Affairs and Others (see below), concluded that it was appropriate to refer the conduct of Northbound’s legal practitioners to the Legal Practice Council for investigation.

In Mavundla v MEC Department of Co-Operative Government and Traditional Affairs and Others (7940/2024P) [2025] ZAKZPHC 2, the judge in the High Court in Pietermaritzburg considered an application for leave to appeal an earlier ruling. Mavundla’s legal team had cited seven fake cases in support of the application. The judge highlighted counsel’s duty not to mislead the court, which “should also be able to assume and rely on counsel’s tacit representation that the authorities cited and relied upon do actually exist”. The judge noted that “an inordinate amount of legal and judicial resources were spent to find the authorities referred to in court” by Mavundla’s legal team, who denied the use of ChatGPT. The judge considered that the circumstances were “significantly more serious” than in the earlier authority of Parker v Forsyth NO and others, Case No. 1585/20 (2023) (see below), concluded that the matter should be referred to the Legal Practice Council “for investigation and further action”, and ordered Mavundla’s lawyers to bear the costs for additional court appearances related to locating the fictitious cases.

In Parker v Forsyth, Case No. 1585/20 (2023), a defamation case before the Regional Court in Johannesburg, Parker’s counsel provided opposing lawyers with a list of allegedly relevant cases. By the time of the hearing, counsel admitted that the cases were inaccessible and had been generated using ChatGPT. The judge criticised the lawyers’ “undue faith in the veracity of the legal research generated by artificial intelligence”, describing their conduct as both “overzealous and careless”. But the judge accepted that, as the fake cases were shared only with the opposing lawyers, there was no intent to mislead the court, and no punitive costs order was imposed, nor was the conduct referred to the Legal Practice Council.