How financial institutions are embracing change


The notion of a ‘tipping point’ became part of the public psyche after Malcolm Gladwell’s seminal book, The Tipping Point: How Little Things Can Make a Big Difference, which explored what drives rapid, and often unexpected, change across diverse areas including business, education, fashion and media.

This model of change draws parallels with what is currently happening in IT across financial services. Numerous factors, including more stringent regulation and the need to rein in costs, are having a compounding effect, leading financial institutions to acknowledge that the time to modernise IT and processes is now.

High-profile fines following data breaches mean banks are now realising that, when it comes to technology, complacency is costly, both in competitiveness and in the ability to support existing customers and attract new ones.

Modernising traditionally under-invested and complex legacy technology has often been perceived as a daunting task, something to put off until after a colossal operational failure or data breach, so banks have been slow to move. But recent security breaches and increasing regulation indicate that a proactive approach is now required.

Whilst computer technology has evolved at breakneck pace, becoming faster and more powerful over time, stringent budgeting and a complacent attitude have meant the back-end infrastructure in banks has not necessarily kept pace.

Instead, legacy infrastructure remains in place, some dating back several decades, creating a bottleneck by inefficiently managing the transfer of large volumes of information, which in turn hinders the rate of business growth and innovation.

In recent times the reduction of proprietary trading has forced financial institutions to refocus on customers, interacting with them in far greater volumes and driving security requirements to a new level.

Vast amounts of data therefore need to be transferred both internally and externally to various parties across the world, and the ability to analyse, report on and gain central oversight of these transfers is now business critical.

Banks are now becoming aware that streamlining and securing their IT systems is a tier-one requirement that can no longer be delayed. It can be said we have now reached this oft-talked-about ‘tipping point’.

Driving the Tipping Point
So what is causing this rapid change in financial services IT?

Most financial institutions have a great deal of ingrained complexity due to the number of disparate systems spread across internal divisions and business units.

Some of these systems are the result of M&A activity, and have then needed to be brought together across different lines of business.

Rather than investing the time in appropriate integration, institutions have instead implemented rudimentary, inefficient means of incorporating these systems, without a strategic approach to system infrastructure.

Instead of having modern data movement processes in place, many institutions are still reliant on archaic, spaghetti-scripted, FTP-based file transfer methods that are open to data breach and don’t provide the transparency needed to remain compliant.

Regulatory concerns and the risk of data breaches are enough to have banks’ compliance departments waking up in a cold sweat. When high-profile regulatory fines seem to be an almost monthly occurrence, it is little wonder that data security has become one of the most important considerations.

To combat the rising number of data breaches, regulatory compliance is becoming ever more stringent, aiming to stamp out the high-profile incidents that are costing banks both financially and in customer confidence and reputation, ultimately damaging the image of the financial sector as a whole.

The reality is that banks’ data doesn’t only face external threats, with names as ominous-sounding as Advanced Persistent Threats (APTs) and Distributed Denial of Service (DDoS) attacks, but also threats from their own employees.

Trends around Bring Your Own Device (BYOD) and an increasingly tech-savvy workforce have meant that employees are accessing and sharing business critical data in unprotected environments.

It is therefore necessary to mitigate vulnerabilities posed by accidental, or sometimes malicious, data breaches caused by people within the business.

On top of this, all businesses, regardless of sector or size, will always look to “do more with less” as the most fundamental approach to operational efficiency. Technology budgets are becoming ever more constrained, meaning existing and new systems have to work harder and more efficiently without incurring additional costs.

Compound this with competition from other financial institutions and technology will play an even greater role in how institutions remain competitive.


‘Modernisation and Consolidation’, ‘Big Data’ and ‘Governance and Compliance’
Within the last 15 years, only a small percentage of the Tier 1 banks have successfully modernised their banking technology, but this is beginning to change.

Undoubtedly, with so many factors weighing so heavily on the minds of financial institutions’ IT departments, there is a need to modernise and consolidate the IT that is powering data transfers and trades using a Business Integration Suite.

Banks have often invested tens of millions of dollars into their core banking systems, which can encapsulate payments engines and hubs, and these systems are expected to last twenty years or so.

The area where banks can fall short is in the intricate and granular levels of integration needed to acquire data and to then direct that data to other systems within the bank. Often the data needed resides in multiple types of systems, with multiple interface methods and across multiple lines of business and in large volumes that are difficult to move. This is where business integration plays a critical role in the governing of the data and processes.

It is now possible to decommission old legacy systems, to migrate tens of thousands of customers onto a single platform, and test and deploy the system in a matter of months – instead of years. This in turn allows the financial institutions to meet much tougher Service Level Agreements for customers who want the banks to deal with transactions in near real time.

The compliance department needs to connect with regulators to show that it is taking proactive steps to secure data, minimise rogue trades, and to show that there is a ‘culture of compliance’ within the organisation.

This is where financial institutions are turning to secure, enterprise-grade Managed File Transfer (MFT) solutions, in conjunction with an integrated approach, to help automate and secure structured and unstructured data, allowing for more effective reporting.

MFT simplifies the compliance process with clear workflows, reporting and audit trails that are able to demonstrate clearly that sensitive information has only been shared with, or accessed by, approved parties.

An MFT platform can also help automate the process of governing the data that is being sent, alerting the compliance department of any sensitive information and requiring an authorisation before the transfer is allowed.
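As a minimal illustration of this kind of governance (a hypothetical sketch, not any vendor’s actual API), a transfer gate can scan outbound payloads for sensitive patterns, hold flagged transfers, and release them only once a compliance officer has explicitly authorised them, with every step recorded for audit:

```python
import re
from dataclasses import dataclass, field

# Hypothetical sensitive-data rules; a real platform would use a
# configurable rule set (card numbers, IBANs, client identifiers, etc.).
SENSITIVE_PATTERNS = [
    re.compile(r"\b\d{16}\b"),                        # 16-digit card-like number
    re.compile(r"\b[A-Z]{2}\d{2}[A-Z0-9]{11,30}\b"),  # IBAN-like string
]

@dataclass
class Transfer:
    payload: str
    destination: str
    status: str = "pending"
    audit_log: list = field(default_factory=list)

def submit_transfer(transfer: Transfer) -> Transfer:
    """Scan the payload; release clean transfers, hold flagged ones."""
    hits = [p.pattern for p in SENSITIVE_PATTERNS if p.search(transfer.payload)]
    if hits:
        transfer.status = "held_for_authorisation"
        transfer.audit_log.append(f"flagged sensitive content: {hits}")
    else:
        transfer.status = "released"
        transfer.audit_log.append("released: no sensitive patterns found")
    return transfer

def authorise(transfer: Transfer, officer: str) -> Transfer:
    """A compliance officer explicitly approves a held transfer."""
    if transfer.status == "held_for_authorisation":
        transfer.status = "released"
        transfer.audit_log.append(f"authorised by {officer}")
    return transfer
```

A payment instruction containing a card-like number would land in `held_for_authorisation` until `authorise()` is called, while a clean report is released immediately; in both cases the `audit_log` records what happened and why.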

This in turn can help reduce rogue trades and increase overall transparency of trading activity, showing regulators that the bank has a clear strategy and demonstrating that everything is being done to root out fraudulent activity.

To halt the fines that banks are receiving from regulators, there needs to be evidence of a process that consistently identifies what kinds of transactions are moving through the system, using algorithmic checks and clear business rules.

The right solution will also be able to provide detail such as how many times a document has been printed, whether it was forwarded on, or even whether it made it to its intended destination. This is all part of what is referred to as an ‘audit trail’, which helps to simplify the reporting process.

Conclusion
When it comes to IT, financial institutions have previously had a reputation (rightly or wrongly) for being almost ‘glacial’ in their pace of change, due in part to their reluctance to move away from perceived ‘stability’.

But recently, a lot of financial institutions have woken up to the reality that their outdated, legacy computing systems have become ineffective and, in most cases, prove a hindrance to their growth and competitiveness, and that now is the time to modernise and consolidate their complex systems.

Institutions have realised that the financial and reputational cost of a data breach can be catastrophic. Regulation will likely only become more stringent over time, so organisations need to be taking proactive steps now to safeguard against the risk of data breaches.

One of the most difficult issues for financial institutions is migration: a great deal of innovation is required to move large communities of users from one platform to another in order to reduce complexity. In doing so, however, institutions gain far greater levels of agility.

Fortunately, innovative solutions are now available that provide the required ROI, allowing institutions to decommission old, insecure systems, reduce overall complexity and add much-needed agility. In answer to these demands, technology has evolved to the point where systems can be integrated and deployed far more rapidly, meaning the benefits and cost savings can be reaped sooner rather than later.

The tipping point is therefore now.

Derek Schwartz is senior vice-president financial services at Seeburger
