New UK Approach To Financial Services Regulation Demands ChatGPT-Like Solutions

Nikhil Rathi, chief executive officer of Britain’s Financial Conduct Authority (FCA). Photographer: Chris J. Ratcliffe/Bloomberg
One of the reasons why I think AI-to-AI connections are the only way to make this new duty work is that the outcomes depend to a large extent on customer understanding and on providing customers with the information that they need to make decisions. This makes me slightly nervous, because some years ago I was involved in a research project looking at obtaining informed consent from British consumers. The conclusion, as I recall, was that while obtaining consent was straightforward, obtaining informed consent was an almost insurmountable barrier to business. Remember, the general public have little grounding in mathematics, statistics, portfolio management or the history of investments. Hence it seems to me that the money spent on trying to educate the public about the difference between the arithmetic mean and the mode might be better spent on commissioning and certifying bots capable of making informed decisions on their behalf.
(When it comes to the provisions about value – that products and services should be sold at a price that reflects their value and that there should be no excessively high fees — I have literally no idea how to judge whether a price is appropriate but presumably lawyers do, so that’s OK.)
What will this duty of care mean in practice for an average consumer such as myself? Well, one immediate impact will be on savings. A common complaint is that banks do not offer their best savings rates to existing customers and that they tempt people in by offering high rates for an initial period only. A combination of the demands of the duty of care, the practicalities of sweeping accounts, variable recurring payments (VRPs) in open banking and a little machine learning should mean that British consumers are assured much better value, and that the banks will have to work harder. Financial Conduct Authority CEO Nikhil Rathi (pictured above) said last year that he was keeping a “beady eye” on banks, and the new Consumer Duty will heighten scrutiny of how actively banks shepherd customers into products offering better rates. Consumer inertia (or lack of confidence in making financial decisions, as the banks prefer to put it) now becomes a bank problem rather than a consumer problem.
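To make that concrete, here is a minimal sketch of the kind of rate comparison that could sit behind an open banking VRP sweep. The product data, the `SavingsProduct` type and the `sweep_if_worthwhile` helper are all invented for illustration; a real implementation would call the bank’s open banking APIs rather than printing an instruction.

```python
# A minimal sketch (not any bank's real API) of "sweep to a better rate":
# compare the customer's current savings rate against the products on offer
# and, if the gap is material, raise a variable recurring payment (VRP)
# instruction to move the balance.

from dataclasses import dataclass

@dataclass
class SavingsProduct:
    provider: str
    name: str
    rate: float          # annual interest rate, e.g. 0.045 for 4.5%
    min_balance: float

def best_product(balance: float, products: list[SavingsProduct]) -> SavingsProduct:
    eligible = [p for p in products if balance >= p.min_balance]
    return max(eligible, key=lambda p: p.rate)

def sweep_if_worthwhile(balance: float, current_rate: float,
                        products: list[SavingsProduct],
                        threshold: float = 0.005) -> str:
    """Return a (hypothetical) VRP instruction if the customer would gain
    more than `threshold` (0.5 percentage points) by switching."""
    target = best_product(balance, products)
    if target.rate - current_rate > threshold:
        return (f"VRP: move £{balance:,.2f} to {target.provider} "
                f"{target.name} at {target.rate:.2%}")
    return "No sweep: current product is within tolerance of the best available rate"

# Example: a customer stuck on 1.2% with £10,000 on deposit.
market = [
    SavingsProduct("Bank A", "Easy Access Saver", 0.041, 1.0),
    SavingsProduct("Bank B", "Loyalty Saver", 0.035, 500.0),
]
print(sweep_if_worthwhile(10_000.0, 0.012, market))
```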
Really smart cards.
What does the bank do about that problem? Well, it might spend years rewriting thousands of lines of COBOL to try to optimise product selection for individual consumers, but it will undoubtedly be more cost-effective to allow AI to review the customer’s circumstances, the changing environment and the available products, not only to advise customers on what the best products might be but also to move customers’ funds around automatically.
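As a rough illustration of that “review, advise and (where mandated) act” idea, the sketch below uses made-up `Customer` and `Product` types and a hypothetical `review` function; it is not any bank’s actual system, and a production version would sit behind proper advice and consent controls.

```python
# A hypothetical sketch of reviewing a customer's circumstances, not just the
# headline rate, and either advising or acting depending on their mandate.

from dataclasses import dataclass

@dataclass
class Customer:
    balance: float
    needs_instant_access: bool
    auto_switch_mandate: bool   # has the customer authorised automatic moves?

@dataclass
class Product:
    name: str
    rate: float
    instant_access: bool

def review(customer: Customer, products: list[Product]) -> str:
    # Filter to products that fit the customer's circumstances.
    suitable = [p for p in products
                if p.instant_access or not customer.needs_instant_access]
    if not suitable:
        return "No suitable products found; refer to a human adviser"
    best = max(suitable, key=lambda p: p.rate)
    if customer.auto_switch_mandate:
        return f"Executed: moved £{customer.balance:,.2f} into {best.name} ({best.rate:.2%})"
    return f"Advice: consider {best.name} at {best.rate:.2%}"

print(review(Customer(25_000, True, False),
             [Product("95-Day Notice", 0.048, False),
              Product("Instant Saver", 0.042, True)]))
```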
Banks must, however, be careful about just how this is done. Last year’s Bank of England discussion paper on AI pointed out that while the Consumer Duty does not prevent firms from adopting business models with different pricing for different groups (for instance, risk-based pricing), certain AI-derived price-discrimination strategies could breach the requirements if they result in poor outcomes for groups of retail customers. Firms should therefore be able to monitor, explain and justify any differences in price and value that their AI models produce for different cohorts of customers: ChatGPT isn’t a magic bullet, in other words.
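One way to think about that monitoring obligation is as a simple cohort-level check on the outcomes an AI pricing model actually produces. The sketch below uses an invented dataset, illustrative column names and an arbitrary 10% tolerance; it is a starting point for the “monitor, explain and justify” step, not a compliance tool.

```python
# A minimal sketch of cohort-level outcome monitoring: compare the prices an
# AI pricing model actually produced across customer groups and flag gaps
# that would need explaining and justifying under the Consumer Duty.

import pandas as pd

outcomes = pd.DataFrame({
    "cohort": ["new", "new", "existing", "existing", "vulnerable", "vulnerable"],
    "annual_fee": [90.0, 95.0, 120.0, 118.0, 125.0, 130.0],
})

by_cohort = outcomes.groupby("cohort")["annual_fee"].mean()
overall = outcomes["annual_fee"].mean()

for cohort, fee in by_cohort.items():
    gap = (fee - overall) / overall
    status = "REVIEW" if abs(gap) > 0.10 else "ok"
    print(f"{cohort:>10}: mean fee £{fee:.2f} ({gap:+.1%} vs overall) [{status}]")
```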
(McKinsey reckon that ChatGPT and its ilk will affect all industry sectors, but they single out banking as one of the sectors that could see the biggest impact, estimating that the technology could deliver additional value of $200 billion to $340 billion annually if the use cases they examine were fully implemented. Interestingly, the biggest use case that they identify in banking is the conversion of legacy code!)
This new way of regulating financial services means a significant opportunity for fintechs and regtechs. Take a look at the report “FCA Consumer Duty: Business Burden or Golden Opportunity” prepared by U.K. fintech Moneyhub, which does a great job of exploring the value-creating use cases around compliance with each of the requirements. This makes a lot of sense to me: if you are building a machine-learning model that uses data from multiple sources to assess whether a consumer is vulnerable to fraud, for example, you can use that same model to help the customer build resilience against fraud and against other external changes (e.g. changes in employment status).
This approach helps with the duty to show that an organisation’s products and services are fit for purpose, as well as helping the organisation to work with others toward the desirable goal of delivering consumer financial health rather than a collection of loosely related financial services.
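To illustrate that dual use, here is a toy sketch in which a single fraud-vulnerability model, trained on invented features and synthetic data, is also read back to suggest resilience actions. The feature names, thresholds and the `resilience_actions` helper are all assumptions for illustration, not anything from the Moneyhub report.

```python
# A rough sketch of the dual use described above: the same model that scores
# a customer's vulnerability to fraud is read "in reverse" to suggest
# resilience-building actions. All data and thresholds are synthetic.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Features drawn from multiple (here invented) sources: account activity,
# open banking data and self-reported circumstances.
feature_names = ["new_payee_rate", "income_volatility", "recent_job_change", "uses_2fa"]
X = np.array([
    [0.8, 0.6, 1, 0],
    [0.1, 0.2, 0, 1],
    [0.7, 0.5, 1, 0],
    [0.2, 0.1, 0, 1],
    [0.9, 0.7, 1, 0],
    [0.3, 0.2, 0, 1],
])
y = np.array([1, 0, 1, 0, 1, 0])  # 1 = suffered or attempted fraud

model = LogisticRegression().fit(X, y)

def resilience_actions(customer: np.ndarray) -> list[str]:
    """Use the fitted coefficients to suggest which risk drivers to address."""
    contributions = model.coef_[0] * customer
    actions = []
    if contributions[feature_names.index("uses_2fa")] >= 0:
        actions.append("Encourage enabling two-factor authentication")
    if contributions[feature_names.index("new_payee_rate")] > 0.3:
        actions.append("Prompt confirmation-of-payee checks for new payees")
    return actions or ["No specific action suggested"]

new_customer = np.array([0.75, 0.4, 1, 0])
print("Fraud-risk score:", round(model.predict_proba([new_customer])[0, 1], 2))
print("Resilience suggestions:", resilience_actions(new_customer))
```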
