Banking technologies are undergoing a major shift as banks face the challenge of serving digital-native customers who expect real-time service across voice and text conversations, together with the low data latency needed to meet marketplace and regulatory requirements. Banks can prepare for this paradigm shift in servicing requirements by embedding intelligence into their data infrastructure.
Unless that intelligence is contextualized for the bank and built into every component of the data infrastructure, it will not deliver the intended benefits.
Data infrastructure can be divided into three components: data preparation and integration; data storage and warehouse; and analytics and visualisation. This blog discusses how intelligence and analytics can be embedded into each of these components to help banks reduce latency.
Business intelligence is not a new term in the banking domain. A combination of smart tools and techniques creates business intelligence solutions that banks are increasingly adopting. These solutions make processes seamless and convenient for customers and ensure precision in business outcomes.
1. Intelligent data preparation and integration
Banks are focusing on making data preparation and integration intelligent, right from data ingestion through to data preparation. These stages are being embedded with visual exploration for generating exceptions, analytics and insights on integration, with the end objective of producing accurate results. Banks are also looking to enable visual data discovery, including geospatial capabilities, to create empowered business users. In parallel, automated data governance capabilities, covering data quality, metadata, data custody, data and semantic definitions, traceability and privacy, are being developed to enhance the overall offering for customers.
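To make this concrete, the sketch below shows one way ingestion-time quality and lineage checks might look. It is a minimal illustration only, not a reference implementation: the record fields, accepted currencies and rules are hypothetical, and a real bank would source them from its governance catalogue and metadata repository.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical transaction record, used only for illustration.
@dataclass
class TransactionRecord:
    txn_id: str
    account_id: str
    amount: float
    currency: str
    booked_at: str       # ISO-8601 timestamp
    source_system: str   # lineage / traceability tag

def quality_issues(record: TransactionRecord) -> list:
    """Return the data-quality exceptions found in a single record."""
    issues = []
    if not record.txn_id:
        issues.append("missing txn_id")
    if record.amount <= 0:
        issues.append("non-positive amount")
    if record.currency not in {"USD", "EUR", "INR", "GBP"}:
        issues.append(f"unknown currency: {record.currency}")
    try:
        datetime.fromisoformat(record.booked_at)
    except ValueError:
        issues.append("unparseable booked_at timestamp")
    if not record.source_system:
        issues.append("missing lineage tag (source_system)")
    return issues

records = [
    TransactionRecord("T001", "A42", 250.00, "USD", "2023-06-01T10:15:00", "core-banking"),
    TransactionRecord("", "A43", -10.00, "XYZ", "not-a-date", ""),
]

# Exceptions are surfaced at ingestion time rather than discovered downstream.
for rec in records:
    problems = quality_issues(rec)
    if problems:
        print(rec.txn_id or "<no id>", "->", problems)
```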
2. Efficient data storage and warehouse
Do banks have processes in place that ensure the huge volumes of data available to them can be trusted? This is where data storage and the data warehouse play a key role. A well-governed storehouse of data is essential to eliminate operational challenges. Banks should therefore extend the logical data warehouse to NoSQL data and data in other formats, and offload computation and storage to public, private or hybrid cloud. Additionally, banks should look at an in-memory columnar engine for faster, better performance, which in turn supports interactive and visual applications, multiple data sources, complex data models and complex calculations.
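As a rough illustration of what an in-memory columnar engine offers, the sketch below uses DuckDB, one example of such an engine, to run an interactive-style aggregation over a small, made-up transactions table. The table name, columns and values are assumptions for illustration only.

```python
import duckdb

con = duckdb.connect()  # in-memory database by default

# Hypothetical transactions table with a handful of rows.
con.execute("""
    CREATE TABLE transactions AS
    SELECT * FROM (VALUES
        ('A42', 'USD', 250.00),
        ('A42', 'USD', 120.50),
        ('A43', 'EUR',  75.25)
    ) AS t(account_id, currency, amount)
""")

# Columnar storage and vectorized execution keep aggregations like this fast,
# which is what interactive dashboards and visual applications depend on.
result = con.execute("""
    SELECT account_id, currency, SUM(amount) AS total_spend
    FROM transactions
    GROUP BY account_id, currency
    ORDER BY total_spend DESC
""").fetchall()

print(result)
```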
3. Augmented data analytics and visualisation
In today’s data-driven world, it is critical for banks to adopt tools that help analyze, process and evaluate data to generate strategic business outcomes. Customers and businesses need clear information to operate intelligently, which is why enhancing analytics and visualisation with linguistic computing capabilities is essential for banking processes. Banks can adopt Natural Language Processing (NLP) and geospatial intelligence to augment analytics. NLP offers a simple way to raise a query on data and to generate narratives that explain drivers and graphs, making the entire process clear and transparent. Voice- and search-based interfaces for querying data are another way to enhance data and analytics capabilities. For a more engaging platform, banks could even weigh conversational analytics, in the form of chatbots and virtual assistants, to drive precision in operational workflows.
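The sketch below is a deliberately simple, rule-based illustration of raising a query on data in natural language. A production conversational interface would rely on proper NLP and intent-classification models; the intents, patterns and account balances here are hypothetical.

```python
import re

# Toy data and intents, purely for illustration.
BALANCES = {"savings": 5200.75, "current": 1340.10}

INTENT_PATTERNS = {
    "check_balance": re.compile(r"\b(balance|how much)\b.*\b(savings|current)\b", re.I),
    "list_accounts": re.compile(r"\b(list|show)\b.*\baccounts?\b", re.I),
}

def answer(utterance: str) -> str:
    """Match a natural-language utterance to an intent and return a narrative answer."""
    for intent, pattern in INTENT_PATTERNS.items():
        match = pattern.search(utterance)
        if not match:
            continue
        if intent == "check_balance":
            account = match.group(2).lower()
            return f"Your {account} account balance is {BALANCES[account]:.2f}."
        if intent == "list_accounts":
            return "You hold: " + ", ".join(BALANCES)
    return "Sorry, I did not understand that. Could you rephrase?"

print(answer("How much is in my savings account?"))
print(answer("Please list my accounts"))
```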
Conclusion:
Technology has matured, yet the banking industry has miles to go. Augmented analytics and the intelligence built so far only augment technology capabilities: they lack domain content. They miss the domain data model and domain-specific measures, correlations, clustering, links and narrations. In particular, this intelligence is still not augmented with knowledge of:
- Banking
- Banking and Financial Services Regulations
- Accounting Standards
- Financial Instruments, Markets and Products
The differentiating mark of a successful FinTech is that it builds domain knowledge into the technology; success is directly proportional to the depth and quality of the domain knowledge built in. An example is embedded geospatial data for the areas where a bank provides ATMs, so that longitude and latitude information is mapped to a physical address, with the entire conversion built in (a simple sketch of such a lookup follows below). Some banks use the generic intelligent capabilities provided by Google, AWS, Azure and Facebook, but unless they build subject-matter depth, adoption of technologies like conversational UI will remain limited. That depth includes banking terms, banking products, banking product computation, banking intents, banking sentences and so on. We need to keep in mind that the subject-matter expertise of bots works both ways: depth of expertise grows adoption, while a shortfall reduces it exponentially.
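As a small illustration of that built-in geospatial conversion, the sketch below maps a customer's coordinates to the nearest ATM's physical address using a haversine distance. The ATM list and coordinates are made up; a production system would use the bank's location master data and a geocoding service.

```python
from math import radians, sin, cos, asin, sqrt

# Hypothetical ATM location master data.
ATMS = [
    {"address": "12 MG Road, Bengaluru",   "lat": 12.9752, "lon": 77.6068},
    {"address": "45 Marine Drive, Mumbai", "lat": 18.9430, "lon": 72.8238},
]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def nearest_atm(lat, lon):
    """Map a customer's coordinates to the closest ATM's physical address."""
    return min(ATMS, key=lambda atm: haversine_km(lat, lon, atm["lat"], atm["lon"]))

print(nearest_atm(12.97, 77.59)["address"])  # -> 12 MG Road, Bengaluru
```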
Embedding intelligence into data infrastructure should not be treated as a software project; it should be treated as a business-model disruption project and given the necessary investment, effort and attention.
In the next blog, we will discuss the challenges a bank faces if it manages intelligence projects as software projects.
By Mohan Bhatia | Wipro Fellow, Distinguished Member of Technical Staff