
Singapore
AT A GLANCE
Singapore was one of the world’s first countries to issue a national AI strategy to harness AI’s potential. AI use in Singapore’s criminal proceedings remains limited but continues to expand through targeted experiments. Law enforcement employs predictive analytics (e.g., GRAND-VISION) and digital forensics to enhance crime prevention and evidence review. Courts use AI for translation, summarising case documents, real-time transcription, and legal research, while tools like Harvey AI support self-represented parties in civil cases. Defence lawyers benefit from AI-integrated platforms for drafting, case management, and research. Extensive training initiatives by legal institutions aim to equip professionals with AI skills.
Use of AI in court is regulated by a number of different laws and guidance, including judges’ and lawyers’ professional conduct rules, general data protection legislation, and rules of evidence and procedure, as well as a dedicated guide on the use of generative AI in court issued by the Registrar of the State Courts and advisory guidelines on the use of personal data in AI systems from the Personal Data Protection Commission.
USE
As at September 2025, the use of AI in Singaporean criminal proceedings remains relatively limited, but there is a push to expand the application of generative AI (GenAI), accompanied by active, targeted experimentation with various AI tools.
In trying to explore what we do, we are aware that we must approach our exploration and testing in a principled way, so that we do not go chasing after every single new shiny thing, and that we make the best use of resources–manpower, money, and time–available to us. We do not need to be on the bleeding edge, but neither should we be so far behind that we fail in our mission to serve the community. We must find that right balance.
Law enforcement
Predictive analytics
‘GRAND-VISION’ is a research collaboration between the Singapore Police Force, Fujitsu, and Singapore Management University, which uses machine learning on historical crime data (location, time of day, season, weather) to generate ‘heat maps’ of where and when crimes are more likely to occur. This system supports the smarter deployment of patrols and crime prevention resources.
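As an illustration only (the actual GRAND-VISION model and its features are not public beyond the description above), the heat-map idea can be sketched as a frequency count over hypothetical grid cells and time buckets:

```python
from collections import Counter

# Hypothetical historical incident records: (grid_cell, time_bucket) pairs.
# The real system uses machine learning over richer features (season,
# weather, etc.); this sketch only illustrates the heat-map concept.
incidents = [
    ("A1", "evening"), ("A1", "evening"), ("A1", "night"),
    ("B2", "evening"), ("C3", "morning"),
]

def crime_heat_map(records):
    """Return the relative incident frequency for each (cell, time) bucket."""
    counts = Counter(records)
    total = sum(counts.values())
    return {bucket: n / total for bucket, n in counts.items()}

heat = crime_heat_map(incidents)
hottest = max(heat, key=heat.get)  # the bucket to prioritise for patrols
```

Under this toy data, patrols would be weighted towards cell A1 in the evening, mirroring how a heat map directs crime-prevention resources.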
Data review and analysis
The Singapore Police Force operates a digital forensics lab that automatically scans large volumes of digital evidence (such as phones, hard drives, and computers) to flag possibly obscene or illicit content, such as child sexual abuse material. Investigators then review the flagged items, saving significant time.
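One common technique behind this kind of triage is hash matching, in which each file’s digest is compared against a database of known illicit material. A minimal sketch, using hypothetical files and the SHA-256 of empty content as a stand-in for a known-bad hash (the Police Force’s actual tooling is not public):

```python
import hashlib

# Stand-in database of known-bad hashes; here, the SHA-256 of empty content.
# Real systems rely on curated hash sets (and perceptual hashes for images).
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def flag_files(files):
    """Return the names of files whose SHA-256 digest matches a known hash."""
    return [
        name for name, content in files.items()
        if hashlib.sha256(content).hexdigest() in KNOWN_HASHES
    ]

# Hypothetical seized evidence: file name -> raw bytes.
evidence = {"photo1.jpg": b"", "notes.txt": b"hello"}
flagged = flag_files(evidence)  # only these items are queued for human review
```

The design point is the one the text describes: the machine narrows thousands of files down to a short list, and humans make the final determination.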
Prosecutors
As at September 2025, there are no reported cases of prosecutors in Singapore using AI.
Courts
Currently, there is no use of AI for judicial decision-making or outcome prediction in court proceedings. AI use in Singaporean courts is, as at September 2025, limited to case management.
Case management
In civil proceedings, as part of the MoU between the Singapore Courts and Harvey AI (discussed below), a generative AI tool has been made available to Small Claims Tribunal magistrates which allows users to generate English translations of party-filed documents in Singapore’s three other official languages (Chinese, Malay, and Tamil). Under an expanded MoU signed in September 2025, the tool is being extended to provide summaries of each party’s case and to assist in analysing the documents.
With the increasing use of electronic messaging such as WhatsApp and email, parties are filing larger volumes of evidence. The new AI-generated summaries help Tribunal Magistrates prepare for trial with a good understanding of the facts of each dispute. These efficiency gains are crucial for upholding the quality of justice: by streamlining the analysis of documents, the tool allows Tribunal Magistrates to focus their time and expertise on the more complex aspects of a case, ensuring that high standards of judicial adjudication are maintained even as the volume of claims continues to rise.
A speech translation system has also been deployed by courts. The tool uses neural networks trained with language models and domain-specific terms to transcribe court hearings in real-time, allowing judges and parties to review oral testimonies in court instantaneously. Internally, the judiciary uses AI tools to clean up and improve the quality of machine-generated transcripts before they are finalised.
For court administrators, commercial AI products such as Microsoft Copilot are being trialled for taking minutes, preparing slide decks and papers, and transcribing court proceedings.
Legal research, analysis and drafting support
For judicial officers and judges, the Singapore courts are experimenting with AI for legal research. The courts have considered publicly available options such as ‘Pair Search’, a prototype tool provided by the Government Technology Agency, powered by the same large language models that ChatGPT is built on, for searching policy-related information. Users can access resources from the Hansard parliamentary reports, Supreme Court judgments, and legislation, and the tool’s ‘analyse results’ feature assists users in the preparation of policies and court decisions. Other legal research databases, such as ‘LawNet’, offered by the Singapore Academy of Law, have also introduced AI features.
In July 2025, Justice Aidan Xu, the Judge in Charge of Innovation and Transformation for the Singaporean Judiciary, reflected on experiments to use AI to prepare first drafts of judgments, which judges could then review and adapt, and to engage in a process called ‘red-teaming’, which involved using AI to generate draft judgments for different outcomes in a case to help judges identify the strengths and weaknesses in their reasoning. Justice Xu noted the sensitivity of using AI in this context and confirmed that judges would try it out ‘carefully, and impose appropriate training, safeguards and supervision’.
Defence
Legal research, analysis and drafting support
The Ministry of Law has developed a cloud-based management and collaboration platform called the Legal Technology Platform (in collaboration with Lupl, an American legal technology firm), which offers a unified dashboard consolidating case management (covering drafts, client instructions, time tracking, billing, templates, and integrated practice tools) into a single interface. The platform is integrated with a GenAI system, developed with Lupl, through which lawyers can interact with Microsoft Copilot via prompts to automate a range of tasks. Key features include:
- Generating drafts, summaries of legal documents, and contextually-intelligent translations;
- Monitoring and managing task progress, team workload and capacity; and
- Drafting client communications.
In July 2025, a Singaporean judge announced the launch of GPT-Legal Q&A, developed by the Singapore Academy of Law, which will enable lawyers to ask legal research questions in natural language and receive contextual, relevant AI-generated responses grounded in LawNet’s content.
In civil proceedings, the Singapore judiciary has signed an MoU with Harvey AI, an American legal technology startup, to develop an AI-powered platform to assist self-represented persons in the Small Claims Tribunal (SCT), which handles more than 10,000 cases a year with a claim limit of up to $30,000. Since lawyers are not permitted to represent parties in SCT proceedings, the platform is designed to help users understand the evidence required, organise and submit documents, prepare and structure submissions, summarise opponents’ submissions and, eventually, predict likely outcomes and assess the probability of success. The system will be made available to individuals from November 2025 onwards.
TRAINING
The Singapore Academy of Law (a statutory body and legal industry development agency), in collaboration with Microsoft, has released a guide and instructional video to provide all lawyers with information on how to obtain better results from generative AI tools.
In May 2025, the Academy also launched the ‘Junior Lawyers Professional Certification Programme’, a specifically curated programme designed to help young lawyers (of up to five years’ experience) develop the practical skills needed in the face of rapid technological change in both the disputes and corporate practice areas. Participants can, for example, take courses on the ethics of generative AI, prompt engineering for lawyers, and cross-border contract drafting. Most of the disputes modules will be led by current or former members of the judiciary as trainers or guest speakers. Participation in the programme is voluntary but recommended.
The Singapore Institute of Legal Education regularly conducts webinars on AI.
Universities also provide training to legal professionals. For example, Singapore Management University and the National University of Singapore regularly provide training for legal professionals using AI. These training sessions usually focus on introducing lawyers to fundamental concepts behind LLMs, legal and ethical risks and challenges, and the use of legal technologies in professional settings. The Singapore government incentivises legal professionals to opt for such training through the ‘Continuing Professional Development (CPD) Programme’, under which qualified lawyers in Singapore are required to accrue a certain number of CPD points every year by attending conferences, courses, and workshops to demonstrate their commitment to upgrading their skill sets and critical competencies.
The judiciary in Singapore also uses UNESCO training resources.
REGULATION
AI regulations
Singapore does not have any dedicated AI legislation. A number of frameworks, principles and approaches have been released by ministries and other bodies to help ensure responsible use of AI, but they have largely been voluntary and sector-agnostic. For example, the Model Artificial Intelligence Governance Framework (MAIG Framework), the second edition of which was released by Singapore’s Personal Data Protection Commission and its Infocomm Media Development Authority (IMDA) in 2020, sets out principles of AI governance and recommends practical methods by which they can be achieved. The MAIG Framework is intended to improve stakeholder confidence in organisations’ use of AI and includes recommendations such as establishing a coordinating body to oversee AI use and decide on the appropriate level of human involvement in AI-augmented decision-making. A Model AI Governance Framework for Generative AI (GenAI MAIG Framework) followed in 2024 to address GenAI risks such as hallucinations and copyright infringement.
Guidelines for practitioners
Judicial Code of Conduct
Insofar as the judiciary is concerned, in the absence of specific regulations, judges are bound by their general ethical obligations. It can therefore be argued that the Judicial Code of Conduct already regulates their conduct regarding the use of AI even though the Code does not mention it. The Code supplements the judges’ ethical obligations with ‘general statements’. The general statement for ‘Diligence’ provides that: ‘Judges must be conscientious in all aspects of their work and in the discharge of their judicial functions’. The Code further explains that, as part of their diligence obligations, judges are expected to ‘communicate clearly how they arrive at their decisions’. Therefore, any use of GenAI in this context may require adequate disclosures.
Legal Profession (Professional Conduct) Rules 2015
The Legal Profession (Professional Conduct) Rules 2015, made under the Legal Profession Act 1966, set out the general principles and rules applicable to the practice of law in Singapore, including legal practitioners’ ‘paramount duty’ to the court and the requirement to act with honesty, competence and diligence in their dealings with clients. These may also apply to practitioners’ use of AI in court.
Guide on the Use of Generative Artificial Intelligence Tools by Court Users (2024)
In 2024, the Registrar of the Singapore State Courts issued a Guide on the Use of Generative Artificial Intelligence Tools by Court Users. The 2024 Court Guide applies to ‘all matters’ in the Supreme Court, the State Courts and the Family Justice Courts, and therefore encompasses criminal proceedings. It provides a brief overview of what generative AI (GenAI) is and emphasises that the information generated by GenAI is not always accurate. The Guide provides that the courts do not generally prohibit the use of GenAI tools for the preparation of ‘Court Documents’, which are defined as including ‘any… material that is filed in or submitted to Court’. Users of the courts – which include lawyers, litigants-in-person, and witnesses – are advised that they remain fully responsible for the content of all their Court Documents.
Under the Guide, the following misuses of AI would trigger enforcement and/or sanction mechanisms:
- A failure to comply with existing laws and practice directions: The Guide reminds all court users of their obligations to ensure that all information provided to the Court is ‘independently verified, accurate, true and appropriate’. This includes verifying all quotes and citations. The Guide specifically mentions that it is not sufficient to use one GenAI tool to verify the accuracy of another GenAI tool.
- The use of AI to create, fabricate or alter evidence: The Guide is clear that GenAI tools must not be used to generate any evidence that court users wish to rely upon in court. Users may use GenAI tools to generate a first draft of an affidavit or statement, but may not use such tools to create, fabricate, embellish, tamper with, strengthen or dilute evidence.
- Using AI in a way that infringes intellectual property and/or disclosure rights: The Guide warns court users not to infringe intellectual property rights (such as copyrights) with their use of GenAI content and that proper source attribution must be provided. The Guide also warns against the disclosure of any personal or confidential information to GenAI tools, noting that ‘all information you provide to Generative AI chatbots may potentially be disclosed publicly’. It emphasises that confidentiality orders and laws, personal data protection laws, intellectual property laws and the relevant rules on legal privilege must be complied with when using GenAI tools.
Sanctions: Court users who fail to follow the Guide may be subject to an adverse costs order; have material they submitted to the court disregarded or given less evidentiary weight; face disciplinary action, if they are lawyers; or be subject to ‘appropriate action in accordance with existing laws in respect of intellectual property rights, personal data protection, the protection of legal privilege and contempt of court’.
Other guidance for lawyers
- Microsoft and the Singapore Academy of Law (see above) published a guide on ‘Prompt Engineering for Lawyers’ in 2024. It ‘provides fundamental prompt engineering techniques that are helpful across various chat-based generative AI tools’. For example, the guide suggests that good prompts ‘will provide a clear description of the task, explain the role the AI tool needs to play, describe the audience, provide guidance on the tone, style and length of the expected output, and any additional context to be considered’. The guide also reminds lawyers of the importance of confidentiality and professionalism, and that GenAI tools should not be a ‘substitute for developing subject matter expertise’.
- The Law Society of Singapore has issued guides on the use of technology in the legal sector. In 2023, it issued a Guide on the Application of LegalTech for Law Practices to help law practices decide if a specific legal technology product or provider was suitable for their practice. The Guide briefly refers to AI, noting both GenAI’s potential as a ‘powerful productivity enhancer’ and concerns over its impact ‘on reliability of evidence’ and ‘its use in the preparation of submissions’ and concludes that ‘the full extent of its ramifications has yet to reveal itself’. An earlier Guide to Cybersecurity for Law Practices, published by the Law Society in 2020 to assist lawyers with protecting confidential information from cyber threats, includes recommendations that may apply to AI tools even though they are not mentioned. For example, the Guide advises lawyers to ‘[u]se software mechanisms to detect and log unauthorised changes to data in storage’.
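The prompt elements listed in the ‘Prompt Engineering for Lawyers’ guide above (task, role, audience, tone, length, additional context) can be sketched as a simple template; the function and field names here are illustrative, not taken from the guide:

```python
def build_prompt(task, role, audience, tone, length, context=""):
    """Assemble a structured prompt from the elements the guide describes."""
    parts = [
        f"Role: You are {role}.",
        f"Task: {task}",
        f"Audience: {audience}",
        f"Tone and style: {tone}",
        f"Expected length: {length}",
    ]
    if context:  # extra background is optional
        parts.append(f"Additional context: {context}")
    return "\n".join(parts)

# Hypothetical usage: a plain-English summary request.
prompt = build_prompt(
    task="Summarise the attached judgment in plain English.",
    role="a Singapore-qualified disputes lawyer",
    audience="a client with no legal training",
    tone="clear and neutral",
    length="about 200 words",
)
```

The point is not the code but the discipline it encodes: a prompt that states the task, role, audience, tone and length up front tends to produce more usable output from chat-based GenAI tools.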
Criminal procedure rules
Evidence Act 1893
Although not specific to criminal proceedings, the Evidence Act 1893 contains certain provisions that are drafted broadly enough to apply to AI. For example, section 116A(1) of the Act provides that unless sufficient contrary evidence is adduced, a ‘device or process’ that, ‘if properly used, ordinarily produces or accurately communicates an electronic record’ is to be presumed as having accurately produced or communicated an electronic record on a particular occasion. The party adducing the electronic record as evidence must therefore show that the device or process used to create that record ordinarily does so accurately, if properly used. Once that foundation is laid, the system in question is presumed reliable, although the presumption may be displaced by sufficient contrary evidence, and the party adducing the record still bears the burden of proving the electronic evidence in question.
Administration of Justice (Protection) Act 2016
While also not limited to criminal proceedings, the Singapore Parliament recently enacted an amendment to the Administration of Justice (Protection) Act 2016, which widens the range of circumstances that amount to contempt of court. Under the amended legislation, the definition of contempt of court includes, among other things, conduct that ‘involves a deception on the court, or is fictitious or constitutes a mere sham’. Given the breadth of the language used in the provision, this may arguably create a new avenue to punish litigants who misuse AI to, for example, generate fictitious evidence.
Data protection legislation
Personal Data Protection Act 2012
Singapore’s Personal Data Protection Act 2012 (PDPA) may be applicable when personal data is used to train AI systems or is processed by such systems in the context of court proceedings. Singapore’s Personal Data Protection Commission published Advisory Guidelines on use of Personal Data in AI Recommendation and Decision Systems in 2024, setting out guidance:
- to organisations on ‘when they can use personal data to develop and deploy systems that embed machine learning models’;
- on information to be provided to consumers when seeking consent;
- to third-party developers of bespoke AI systems who may occupy the role of data intermediaries on their obligations under the PDPA, which includes anonymisation; and
- on best practices that businesses could adopt to comply with the PDPA.
Under the Guidelines, organisations are permitted to use personal data when there is meaningful consent or when one of two exceptions is satisfied: the business improvement exception applies when businesses are developing AI systems to enhance an existing product or service; the research exception applies when organisations are conducting commercial research to develop AI systems that have a public benefit.
Human rights
Article 14 of the Constitution of the Republic of Singapore may potentially also be relevant in regulating the use of AI in courts, although no reported case of its use for these purposes was found as at September 2025. The Singapore Constitution is the main source of human rights in Singapore. Article 14(1)(a) provides for the right of every Singaporean citizen to freedom of speech and expression, subject to the Singapore Parliament’s power under article 14(2)(a) to restrict that right provided that Parliament considers it ‘necessary or expedient’ in the interest of various considerations, including ‘public order and morality’. In The Online Citizen Pte Ltd v Attorney-General and another [2021] SGCA 96, a high-profile case, Singapore’s apex court, the Court of Appeal, held that article 14(1)(a) of the Singapore Constitution does not protect the making of false speech, such that a maker of online falsehoods could be compelled by a different statute to correct the falsehood, that statute not being contrary to the right enshrined in article 14(1)(a). While this precedent does not directly concern the use of AI in courts, it reinforces Singapore’s no-tolerance approach to falsehoods, which may be relevant to the increasing use of generative AI and the risk of hallucinations.
Applicable regional and international human rights instruments may also be relevant to the use of AI in the Singaporean courts. The ASEAN Human Rights Declaration protects the right to a fair trial and the right to privacy as does the Convention on the Rights of the Child.
Outlook
There is growing acceptance of the notion that Generative AI will bring about a paradigm shift in lawyering, and that we should all take this development seriously.
National AI Strategy: The Singapore National AI Strategy was first published in 2019 and updated in 2023, confirming the country’s belief “in the transformative potential of AI”. The strategy refers to Singapore’s launch of “the world’s first Model AI Governance Framework in 2019”, to AI Verify, “an AI governance testing framework and software toolkit” issued in 2022, and confirms that the Singaporean government “will continue to maintain a regulatory environment for AI that is pro-innovation while ensuring appropriate guardrails”.
Proposed GenAI Guide: In so far as regulation of the legal sector is concerned, in September 2025 the Ministry of Law launched a public consultation seeking feedback on a proposed Guide for Using Generative Artificial Intelligence in the Legal Sector. The Guide builds on the 2024 Court Guide and the GenAI MAIG Framework (see above) and was introduced in response to “ethical concerns”, “the rapid development of gen AI by different programmers”, and “feedback from law firms that they required support to improve their tech processes while juggling existing priorities”. It refers to the “duties under the Legal Profession Act 1966 and the Professional Conduct Rules 2015” and lists professional ethics, confidentiality, and transparency as key principles that legal professionals must observe when using GenAI in legal work. For example, it suggests ensuring that there always is a “lawyer-in-the-loop” and that legal professionals “review, analyse, and verify all GenAI-generated output before incorporating them into their work”.
CASES
It appears that, in a civil context, there has been at least one case in Singapore in which a self-represented person used ChatGPT to create submissions that included fabricated case law. The issue was uncovered after counsel for the opposing party searched for the authorities and found that they did not exist. It is unclear whether the person was sanctioned.