Global Trustee and Fiduciary Services Bite-Sized Issue 10 2023

FINTECH

IOSCO Establishes Global Approach to Address Risks in Decentralised Finance

On 7 September 2023 the International Organization of Securities Commissions (IOSCO) issued for consultation nine policy recommendations to address market integrity and investor protection concerns arising from Decentralised Finance (DeFi).

The Recommendations cover six key areas, consistent with the IOSCO Objectives and Principles for Securities Regulation and relevant supporting IOSCO standards, recommendations, and good practices. These are:

1. Understanding DeFi Arrangements and Structures;
2. Achieving Common Standards of Regulatory Outcomes;
3. Identification and Management of Key Risks;
4. Clear, Accurate and Comprehensive Disclosures;
5. Enforcement of Applicable Laws; and
6. Cross-Border Cooperation.

IOSCO aims to finalise its DeFi recommendations around the end of 2023, in accordance with its Crypto-Asset Roadmap of July 2022 and in conjunction with its Crypto and Digital Assets recommendations. Comments on the consultation paper should be sent to DeFiconsultation@iosco.org on or before 19 October 2023.

Link to Consultation here

Potential Bias in Firms’ Use of Consumer Data

On 4 September 2023 the Financial Services Consumer Panel (FSCP) published new research which found evidence suggesting that consumers with protected characteristics are experiencing bias because of the way financial firms use personal data and algorithms.

The FSCP found anecdotal evidence and widespread concern that bias is introduced into financial systems and processes, whether intentionally or unintentionally, and that this can lead to consumer harm. For example, the research found that consumers were experiencing unfair bias relating to their ethnicity in terms of access to products, pricing of products and the service received.

The FSCP notes the absence of concrete evidence that algorithmic decision-making is the direct cause of consumer harm. The research suggests this is because firms’ use of personal data and algorithms is complex and opaque, making it difficult for anyone to understand:

• How consumers’ data is being used ‘behind the scenes’; and
• The consequences such decisions can have on consumers’ lives.

The FSCP believes that this lack of evidence should itself be a significant cause for concern. The ethical use of algorithms and artificial intelligence (AI) is a priority for the FSCP, and effective governance to mitigate the risk of consumer harm is currently a topic of lively debate amongst consumer organisations, regulators and the UK government. The FSCP concludes with six recommendations to the Financial Conduct Authority (FCA), based on the research findings.

Link to the Report here
