ChatGPT in Banking

Written by Core10 | Jul 18, 2023 12:45:30 PM

ChatGPT continues to gain traction with some of the world’s largest and most tech-savvy companies. Generative AI tools are becoming increasingly common in the financial services sector. Before putting them to use, however, banking professionals should be aware of the potential benefits and risks. While powerful and helpful, ChatGPT does have limitations as well as security and compliance issues.

What Is ChatGPT?

What exactly is this new technology that’s commanding the attention of business leaders across the globe? ChatGPT is an artificial intelligence (AI) chatbot that delivers human-like responses when users enter written prompts. Trained on vast amounts of data, ChatGPT uses a large language model to understand human language and respond in kind.

Generative AI for Banking

With new technology, there are always concerns about how it will impact business. As a company recognized for being at the forefront of innovation, Core10 believes that AI models are not something to fear but something to embrace.

We have some thoughts and best practices when it comes to the use of ChatGPT. Alex Griffiths, our engineering manager and security and compliance expert, and Austin Mills, our legal counsel and partner in his firm's corporate technology practice, give expert advice and opinions on how to take advantage of this tool and safeguard personal, financial, and sensitive data. 

Effective Ways to Use ChatGPT in Banking

Banks engaged in providing cutting-edge technology to their customers are always searching for new tools. Here are some ways ChatGPT can assist banks:

  • Customer Service
    Using its advanced language model, ChatGPT can work as a virtual assistant. The tool can improve the customer experience by handling routine requests, significantly reducing wait times, and delivering fast answers. (A minimal sketch of this kind of assistant appears after this list.)

  • Fraud Detection and Security
    ChatGPT can analyze patterns in communication and transaction data. This allows it to pinpoint suspicious activity and safeguard customers’ personal and financial information. 

  • Personalized Financial Advice
    ChatGPT technology can be incorporated into other AI-based financial services to offer more personalized recommendations to consumers. 

  • Content Creation
    Developing interesting and relevant information for consumers has never been simpler. ChatGPT can create content in the form of blog posts, newsletters, and more to keep customers engaged and educated and to further build relationships with a bank’s target market. 
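
For illustration, here is a minimal, hypothetical sketch of a customer-service assistant along these lines. It assumes the pre-1.0 `openai` Python SDK that was current when this post was written (newer SDK versions expose a different client interface); the model name, system prompt, and hand-off rules are placeholders, not a production design.

```python
# Hypothetical customer-service assistant for a retail bank, built on a
# chat-completion API. Uses the pre-1.0 `openai` Python SDK interface.
import os

import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

SYSTEM_PROMPT = (
    "You are a customer-service assistant for a retail bank. "
    "Answer questions about branch hours, card activation, and fees. "
    "Never ask for or repeat account numbers, passwords, or Social Security numbers. "
    "If a request requires account access, hand the customer off to a human agent."
)


def answer_customer(question: str) -> str:
    """Send one customer question to the model and return the reply text."""
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        temperature=0.2,  # keep answers consistent and conservative
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    print(answer_customer("How do I activate my new debit card?"))
```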

ChatGPT can also aid software developers in the banking industry. It can help with:

  • New Ideas or Strategies

    ChatGPT presents new ideas on how to solve complex problems, from developing code to creating a marketing strategy.

  • Algorithms
    Take, for example, a real-life experience from Engineering Manager Alex Griffiths. When he needed to figure out which cryptography scheme a client was using and how to implement it, he asked ChatGPT how to approach the problem. ChatGPT returned a list of things to try, and Alex used one of those resources to complete the task. 

According to Alex, ChatGPT excels at helping him avoid repeating trivial tasks. When he needs a specific algorithm or connector, he asks ChatGPT to write a new one and then tweaks it to fit his project. This saves him the time of digging through old code to find what he’s looking for. (A sketch of the kind of connector boilerplate a developer might generate this way appears below.)
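
Below is a hypothetical example of that kind of connector boilerplate, the sort of scaffold a developer might ask ChatGPT to draft and then adapt. The endpoint paths, field names, and bearer-token auth are invented placeholders rather than any real bank API.

```python
# Hypothetical connector skeleton for a fictional payments REST API.
# Everything about the API (paths, fields, auth) is a placeholder.
import requests


class PaymentsConnector:
    """Thin wrapper around a fictional payments REST API."""

    def __init__(self, base_url: str, api_token: str, timeout: int = 10):
        self.base_url = base_url.rstrip("/")
        self.timeout = timeout
        self.session = requests.Session()
        self.session.headers.update({"Authorization": f"Bearer {api_token}"})

    def get_transaction(self, transaction_id: str) -> dict:
        """Fetch a single transaction record by id."""
        resp = self.session.get(
            f"{self.base_url}/transactions/{transaction_id}", timeout=self.timeout
        )
        resp.raise_for_status()
        return resp.json()

    def create_transfer(self, from_account: str, to_account: str, amount_cents: int) -> dict:
        """Submit a transfer request and return the API's response body."""
        payload = {"from": from_account, "to": to_account, "amount_cents": amount_cents}
        resp = self.session.post(
            f"{self.base_url}/transfers", json=payload, timeout=self.timeout
        )
        resp.raise_for_status()
        return resp.json()
```

The value, as Alex describes it, is not that the generated scaffold is perfect, but that it is a faster starting point than digging through old projects for a similar wrapper.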

 

“It’s become a daily tool for developers—most of us don’t Google things anymore. With Google, you had to know what’s going wrong and how to phrase your question to get an answer. With ChatGPT, it knows what you are talking about. Researching errors and debugging is changed forever.” 

—Alex Griffiths, Core10 Engineering Manager

 

ChatGPT Risks and Liabilities

For any organization in the financial services industry, it is critical to understand the risks of any new tool. ChatGPT does pose some issues for the industry. 

  • Data Privacy and Security
    As Austin Mills, our legal counsel, explains, “For the fintech and financial services industries, one of the primary issues with ChatGPT is regarding data privacy and security. AI models like ChatGPT can process and generate text based on the information it is given. This raises concerns about the handling of sensitive data, like personal and financial information of customers.” 

  • Accurate and Reliable Information
    According to Austin, financial advice or transactions that are processed inaccurately can have serious legal and financial repercussions.

  • Accountability and Liability
    In the event of mistakes or misinformation caused by ChatGPT, banks and financial institutions could be held accountable or liable for the consequences of those errors.

How can your bank protect itself from these risks? Here’s what Austin recommends:

  • Maintain data privacy. This could involve technical safeguards to ensure sensitive data is not stored or transmitted inappropriately. It could also include training for employees on how to manage data when interacting with generative AI tools and services. 

  • Practice rigorous testing and monitoring. Regular audits and evaluations of the AI system can help confirm that it returns correct and reliable information. (A minimal audit sketch follows this list.)

  • Plan to address issues and mistakes. Develop a system for notifying affected parties and rectifying the situation. A bank may want to obtain insurance coverage for potential liabilities associated with AI usage. 
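
To make the second recommendation concrete, here is a minimal sketch of a recurring audit of an AI assistant’s replies. Everything in it is hypothetical: the `answer_fn` callable stands in for whatever function calls the model, and the prompts, expected phrases, and forbidden patterns are illustrative, not a complete evaluation suite.

```python
# Hypothetical sketch of a recurring audit of an AI assistant's answers.
# The prompts, expected phrases, and forbidden patterns are illustrative only.
import re

AUDIT_CASES = [
    # (prompt, phrases the reply should contain, patterns it must never contain)
    ("Can you tell me my account balance?", ["human agent"], [r"\$\s*\d"]),
    ("What is your routing number?", ["branch"], [r"\b\d{9}\b"]),
]

# Patterns that should never appear in any reply (e.g. SSN-shaped strings).
FORBIDDEN_GLOBAL = [r"\b\d{3}-\d{2}-\d{4}\b"]


def audit(answer_fn) -> list[str]:
    """Run every audit case through answer_fn and return human-readable failures."""
    failures = []
    for prompt, expected, forbidden in AUDIT_CASES:
        reply = answer_fn(prompt)
        for phrase in expected:
            if phrase.lower() not in reply.lower():
                failures.append(f"{prompt!r}: missing expected phrase {phrase!r}")
        for pattern in forbidden + FORBIDDEN_GLOBAL:
            if re.search(pattern, reply):
                failures.append(f"{prompt!r}: matched forbidden pattern {pattern!r}")
    return failures
```

Run on a schedule, a script like this gives “regular audits and evaluations” a concrete, repeatable form; failures become tickets rather than surprises.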

Alex reminds us that software developers need to be cautious when using the tool. 

“It’s important to remember that anything entered into ChatGPT may be used for the purposes of training. We should avoid entering any sensitive or confidential information that we wouldn’t want to share with the public.” 

 

Inputting sensitive information and proprietary data can affect compliance and security. Bank employees should avoid entering any of the following information into ChatGPT (a simple pattern-based pre-submission check is sketched after the list):

  • API endpoints

  • Usernames

  • Passwords

  • Request bodies

  • Request responses

  • Personally identifiable information

  • Financial information

  • Confidential business information
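
One technical guardrail is to screen prompts before they are ever sent. The sketch below is a deliberately simple, hypothetical pattern-based check; its regular expressions cover only a few of the items above and are nowhere near exhaustive.

```python
# Hypothetical pre-submission check that blocks prompts containing some of the
# data types listed above. The patterns are illustrative and not exhaustive.
import re

SENSITIVE_PATTERNS = {
    "credential": re.compile(r"(password|passwd|secret|api[_-]?key)\s*[:=]\s*\S+", re.I),
    "card number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "bearer token": re.compile(r"Bearer\s+[A-Za-z0-9._-]{20,}"),
    "internal endpoint": re.compile(r"https?://[\w.-]*internal[\w./-]*", re.I),
}


def check_prompt(prompt: str) -> list[str]:
    """Return the labels of any sensitive patterns found in the prompt."""
    return [label for label, pattern in SENSITIVE_PATTERNS.items() if pattern.search(prompt)]


def safe_to_send(prompt: str) -> bool:
    """Allow the prompt through only if no sensitive pattern matched."""
    hits = check_prompt(prompt)
    if hits:
        print("Blocked: prompt appears to contain", ", ".join(hits))
        return False
    return True
```

A check like this would live in whatever internal tool or proxy sits between employees and the AI service. It is a backstop for the policy above, not a replacement for training and judgment.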

For software developers, confidential business information is an area to guard closely. Look at these two examples: 

  • Production code (already written): This is proprietary code with secrets in it, like API keys or credentials. ChatGPT is an open tool that may retain and learn from what you enter, so your credentials could end up in training data. 

  • Unit tests: This code verifies functionality within other code. In our experience, the unit tests written by ChatGPT have been unreliable at best, often assuming functionality that doesn’t exist, which leads to poor tests and misleading results. (See the sketch below.)
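
Here is a hypothetical illustration of that failure mode. Suppose the real class exposes only `balance` and `deposit()`; a generated test may confidently call a method that was never written, while the reviewed version exercises the interface that actually exists.

```python
# Hypothetical illustration of a generated unit test assuming functionality
# that doesn't exist. The real class only exposes `balance` and `deposit()`.
class Account:
    def __init__(self, balance: float = 0.0):
        self.balance = balance

    def deposit(self, amount: float) -> None:
        self.balance += amount


# A generated test may call a method that was never written, so it errors out
# instead of testing anything useful:
def test_deposit_generated():
    acct = Account()
    acct.deposit(100)
    assert acct.get_balance() == 100  # AttributeError: no such method


# The human-reviewed version tests the interface that actually exists:
def test_deposit_reviewed():
    acct = Account()
    acct.deposit(100)
    assert acct.balance == 100
```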

Core10: Your Trusted Partner for Compliant Development

At Core10, we understand the need to stand out from the competition. Risk management is a critical part of integrating new technology for consumers. To stay on the cutting edge, banks need a partner who understands the latest technology and how to mitigate the risks and liabilities that come with new tech products.

We maintain our SOC 2 Type 2 (Service Organization Control) certification and fully evaluate any partner to confirm they meet the same criteria. 

Want to learn more about how AI can impact the financial industry? Contact Core10 to increase your understanding of this emerging technology.