AI in the financial sector should be handled with caution

A problem with artificial intelligence is its opacity: it does not spell out which indicators led it to a solution. More generally, Elin Hauge, an independent AI and business strategist, believes that we should first look at the problem we are trying to solve, and only then at the toolbox, in which AI is one tool among many. Sometimes fixing a problem is "just getting the data flow to work properly between two different systems."

The AI won't tell you why it's weird, but it will tell you that this data point is weird.

Elin Hauge, AI and business strategist

She observed that many executives' decisions to use AI are "very largely" driven by consulting firms. "If McKinsey says it, it must be true, right?" Hauge noted that understanding data dynamics requires many skills spanning an industry, its data and its value chain. Such a background is also useful when deciding whether mathematics based on stochastic modeling should be used at all.

AI in the financial sector

While discussing the suitability of using AI to predict market indicators such as stocks or interest rates, Ms. Hauge suggested that AI is more appropriately used to detect outliers. She spoke in an interview on November 27, 2024, following her presentation at a European Investment Bank conference. "It won't tell you why it's weird, but it will tell you that this data point is weird."

She noted that technical experts at financial institutions "kind of use" large language models when writing internal development code. "It has nothing to do with trading."

Can insurers and banks learn anything from AI?

She noted that among insurers, machine learning or “good old stochastic modeling” is used for “micro-pricing of individual risk level.” Claims fraud detection is another area that benefits from AI. Ms. Hauge explained that machine learning can identify “patterns of behavior in known fraud cases,” also known as “pattern recognition.” In the financial sector, the credit card industry approaches the issue from the opposite angle, that is, “a deviation from a typical model.” An alarm would go off if your card was used in the Bahamas, for example.
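The card-fraud approach she describes — flagging "a deviation from a typical model" — can be sketched with a simple robust-statistics check. This is an illustrative toy, not how any bank's system actually works; the transaction figures are invented.

```python
# Sketch: flagging a "deviation from a typical model", as in card-fraud
# alarms. All amounts here are made up for illustration.

from statistics import median

def flag_outliers(amounts, threshold=5.0):
    """Flag transactions that deviate from the cardholder's typical
    spending, measured in median absolute deviations (MAD). The median
    is used so that the fraud itself does not distort the baseline."""
    med = median(amounts)
    mad = median(abs(a - med) for a in amounts)
    return [a for a in amounts if abs(a - med) > threshold * mad]

history = [42.0, 38.5, 51.0, 47.2, 40.1, 44.9, 39.0, 43.3, 2500.0]
print(flag_outliers(history))  # the 2500.0 charge trips the alarm
```

A real system would also weigh location, merchant type and timing, but the principle is the same: model the typical pattern, then alarm on deviation.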

She believes that "classic old statistics work really well when you have a predefined data set" to perform regression analysis on. On the other hand, Ms. Hauge says that when you are faced with a "bigger data set and you don't really know which data are more important and which are less important in pricing," then "machine learning is one way to do this."
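The "classic old statistics" route she contrasts with machine learning can be illustrated by ranking candidate pricing features by how strongly they correlate with the outcome. The feature names and figures below are invented for illustration.

```python
# Sketch: classical variable screening for pricing — rank candidate
# features by Pearson correlation with the target. Data is invented.

from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

features = {
    "driver_age":   [22, 35, 47, 51, 63],
    "engine_power": [150, 90, 110, 85, 70],
    "car_colour":   [1, 3, 2, 3, 1],   # encoded; presumably irrelevant
}
claim_cost = [900, 350, 420, 300, 250]

ranked = sorted(features, key=lambda f: -abs(pearson(features[f], claim_cost)))
print(ranked)  # features ordered by how strongly they track claim cost
```

On a large dataset with thousands of candidate features, a machine-learning model performs an analogous ranking implicitly — which is the speed advantage Ms. Hauge refers to, at the cost of transparency.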

A surprising statement to your correspondent, given that statistical methods exist for selecting the relevant variables. Ms. Hauge confirmed that the approach, or "underlying mathematics, is the same," but that using AI is a quicker way to reach the goal. However, it is possible to lose information about the relevant indicators along the way.

Identifying relevant indicators and their impact: all is not lost

"There is a way to address this, which is to use heat maps on segments of your neural network; it is one of the sub-fields of AI currently being developed and studied," said Ms. Hauge. She clarified that neural networks with a large number of layers are called "deep learning" systems, which is the method behind most of these models.

She noted that regulations require transparency about the data and algorithms used in "high-risk applications." Heat maps therefore attempt to assign weights to the nodes of a neural network, "a sort of black box." However, she admits that the technique is not "mature enough… but it is an area that is progressing."
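The heat-map idea — scoring how much each input drives a network's output — can be sketched crudely with finite differences on a toy network. The weights and inputs below are invented; real explainability tooling applies saliency-style methods to trained deep networks, not a hand-coded one like this.

```python
# Sketch of the heat-map idea: score how sensitive a tiny neural
# network's output is to each input feature. Toy weights, toy inputs.

import math

def tiny_net(x):
    """One hidden layer with tanh activations and hard-coded weights.
    The third input is deliberately given zero weight everywhere."""
    w1 = [[0.9, -0.2, 0.0], [0.1, 0.8, 0.0]]
    w2 = [0.7, 0.5]
    hidden = [math.tanh(sum(w * xi for w, xi in zip(row, x))) for row in w1]
    return sum(w * h for w, h in zip(w2, hidden))

def heat_map(f, x, eps=1e-5):
    """Finite-difference sensitivity of f to each input component."""
    base = f(x)
    scores = []
    for i in range(len(x)):
        bumped = list(x)
        bumped[i] += eps
        scores.append(abs(f(bumped) - base) / eps)
    return scores

print(heat_map(tiny_net, [0.3, -0.4, 1.0]))
# the third score is ~0: that feature never influences the output
```

Mapping such scores back onto the inputs is what turns the "sort of black box" into something a regulator can inspect, which is why the field is still maturing.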

Humans remain necessary to support AI

Hauge noted that "any of these models" will only ever provide stochastic predictions. Even an outcome with a 95% probability means, she suggested, that "you still need a human in the loop to look at the case."
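Her human-in-the-loop point amounts to a simple routing rule: act automatically only when the model is confident in either direction, and send everything else to a reviewer. The cases and probabilities below are illustrative.

```python
# Sketch: routing low-confidence model outputs to a human reviewer.
# Case IDs and fraud probabilities are invented for illustration.

def triage(cases, auto_threshold=0.95):
    """Split (case_id, fraud_probability) pairs into cases the model
    decides on its own and cases needing a human in the loop."""
    automated = [c for c, p in cases
                 if p >= auto_threshold or p <= 1 - auto_threshold]
    review = [c for c, p in cases
              if 1 - auto_threshold < p < auto_threshold]
    return automated, review

cases = [("A", 0.99), ("B", 0.62), ("C", 0.03), ("D", 0.90)]
auto, review = triage(cases)
print(auto)    # ['A', 'C'] — confident enough either way
print(review)  # ['B', 'D'] — a human should look at these
```

The threshold is a policy choice, not a technical one: lowering it automates more cases at the price of more errors like those in the Dutch benefits scandal described below.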

Relying entirely on model results can actually backfire. Elin Hauge reports that the Dutch tax and customs administration wrongly accused 26,000 families of child benefit fraud four years ago. The Dutch government resigned following this scandal.

The consequences of an AI error in choosing music on Spotify are far milder than those of an error in the financial sector or in a medical examination.

This article was originally written in French for the Paperjam website and has been translated and edited.
