This speech was delivered by Lew Chuen Hong, Chief Executive, IMDA, at the IAPP Asia Privacy Forum 2023 on 19 Jul 2023.
SINGAPORE – 19 JUL 2023
Introduction
1. A very good morning. It is a real pleasure to be here at the IAPP Asia Privacy Forum 2023.
2. I would like to extend a very warm welcome to all our distinguished guests, and to my counterparts from ASEAN, the rest of the Asia Pacific, and the Middle East.
Growth of the digital economy and the importance of data
3. 2023 marks the third year since the onset of the COVID-19 pandemic. Travel restrictions have eased and most economic and social activities have resumed, restoring a sense of normalcy for many.
4. As global economies shape their paths to recovery, one of the lasting legacies of COVID-19 has been the greater adoption of digital technologies. Indeed, the rise of the digital economy remains a bright spot today, providing new opportunities to millions. South East Asia’s digital economy in 2022 was estimated at US$200 billion – twice its size just three years ago. Data volumes and data flows have increased correspondingly. In fact, some estimates indicate that 90% of the world’s data today was generated in the last two years!
5. One analogy often used is that data is the new oil – I don’t quite agree. Oil gets consumed and disappears. I think data is more like new capital. The more it is shared and re-used (it doesn’t disappear) – and, in financial terms, the faster the “velocity of capital” – the more value is created. Data underpins the efficient functioning of global supply chains – from the placement of orders, to payment details, to the transport of goods across distribution networks and the last-mile delivery.
6. While there are clear opportunities accompanying greater use and sharing of data in the digital economy, consumers, businesses and regulators alike are increasingly wary of the risks and potential for abuse by bad actors – all of which undermine trust. I want to place emphasis on the word “trust”.
- Consumers need to have trust that the new services and technologies they interact with are secure, and that their data will be used responsibly.
- Businesses need to build trust, to enable the use of data for innovation and keep pace with changing consumer expectations.
- The role of regulators is to provide guardrails for that trust. Only then can we address the risks while facilitating innovation.
What are some of the concerns that have appeared on the horizon? I see three broad areas of concern involving data use.
7. First, AI – more specifically, generative AI. Gen AI exploded onto the scene literally in the past few months. There is a deep sense that developments in tech are running ahead very quickly, bringing both opportunities and risks.
- Recent incidents underscore the risks of generative AI – from the citation of non-existent court cases in legal research, in what is termed “hallucination”, to the creation of deepfakes used in scams and for spreading disinformation. Regulators around the world are also grappling with the use of huge volumes of data for the training of AI.
- Earlier this year, the Italian data protection authority imposed a temporary ban on ChatGPT. Its reasons included the lack of a legal basis under the EU GDPR for the massive collection and use of personal data, the lack of age restrictions, and the potential for ChatGPT to provide factually incorrect information in its responses.
- The EU is looking to introduce a new AI Act, while Japan and the UK have developed, or are in the midst of developing, guidance on key issues of concern instead of “hard regulations”.
8. Singapore has chosen to take a balanced approach to meet the twin goals of data protection and market innovation.
- We first launched the Model AI Governance Framework in 2019, setting out key issues and guiding principles such as transparency, explainability and fairness to promote trust in AI.
- Based on these internationally-aligned principles, we developed the AI Verify Minimum Viable Product (MVP) last year. AI Verify is an AI Governance Testing Framework and software toolkit to help industry be more transparent about the performance of their AI models. The toolkit tests for fairness, explainability and robustness, primarily in supervised learning models. Today, the science of AI testing still lags behind AI developments. That is why we have open-sourced AI Verify and set up an open-source Foundation to crowdsource AI governance testing development from the global AI testing community.
- As a starting point to address the risks posed by Gen AI, we published a discussion paper that identifies key policy considerations and the need for a practical, trust-based approach. For example: (i) defining accountability between model developers and businesses developing services on top of these models; (ii) developing evaluation metrics for Gen AI models; and (iii) clarifying the use of data for model training.
- In relation to data use for the training of Gen AI, PDPC is also studying these issues and considering whether further guidance should be provided under the PDPA, recognising that generative AI raises unique concerns, such as the wholesale “memorisation” and “regurgitation” of the personal data used to train it.
- Before we address the issues brought on by Gen AI, PDPC will be providing clarity on how the core principles of the PDPA apply to the use of data in traditional AI systems. Yesterday, the Minister announced that we will be launching the Advisory Guidelines on the use of Personal Data in AI Recommendation and Decision Systems for open consultation. We look forward to hearing from industry on the draft.
9. The second issue is the landscape to facilitate responsible cross-border data flows.
- We now have 137 countries with data protection laws, a marked increase over the last decade. Data protection laws across the world are similar in most of their underlying data protection principles, yet each differs in its own ways.
- The data transfer landscape is fragmented. While most data protection laws put in place requirements to ensure citizens’ personal data is adequately protected as it is transferred, there is still a lack of simple, globally accepted transfer standards. This complex regulatory landscape poses challenges for global businesses, and especially for SMEs that need to access overseas markets for growth.
10. Within ASEAN, there have been efforts to harmonise data protection standards and facilitate data flows through the ASEAN Model Contractual Clauses for intra-ASEAN data transfers.
- Building on ASEAN’s efforts, we have also worked with the EU to promote mutual understanding of EU and ASEAN contractual clauses for data transfers.
- We also believe in the value of supporting G2G standards and solutions for such transfers, such as the Global Cross Border Privacy Rules (or Global CBPR). The Global CBPR started out in APEC, with the same set of standards now being opened up globally. It brings us a step closer to having a common network, and I am heartened to see many like-minded countries joining the Global CBPR Forum.
11. The third issue I would like to highlight is in relation to online harms.
- Businesses and digital platforms today are able to collect vast amounts of data generated through consumers’ interactions with their services and products. Children are more impressionable and highly susceptible to external influences, which makes them the group most vulnerable to personalised content that is age-inappropriate or harmful. How can we build a safe and trusted digital space for children?
- Last year, Singapore passed the Online Safety (Miscellaneous Amendments) Act to protect Singaporeans, especially the young, from harmful content online. The Code of Practice for Online Safety, which requires designated social media services to take measures to enhance online safety on their platforms, took effect on 18 July 2023. PDPC will now be consulting the public on how businesses can implement appropriate safeguards to ensure children’s data is collected, used and protected adequately. We have also launched an Innovation Call to explore the viability of privacy-preserving age estimation solutions, such as facial analysis to determine an individual’s age, that complement existing age assurance methods.
Conclusion
12. Singapore is not alone in confronting these challenges that have come to the fore with increased use of data and new technologies. We will continue to collaborate with industry, practitioners and data protection authorities to build trust while enabling innovation in the digital economy.
13. With that, thank you for your kind attention, and I wish you a most fruitful time at the IAPP Asia Privacy Forum.