9 Ways Financial Institutions Can Take Data Management to the Next Level

James Crosby
8 min readJul 29, 2020

Over the past 20 years, dozens of data management platforms have come to market, each offering essential functionality that both buy-side and sell-side institutions need. From setting up security master processes to integrating front-to-back-office operations, financial institutions need a fast ROI: a speedy implementation, ongoing self-sufficiency and enhanced data quality. They also need a strong data governance framework with clear visibility over the complete life cycle of the data, from data source to data consumer, packaged with full audit capabilities throughout.

Having built and implemented several of these data management solutions over the past 15 years, I’ve listed 9 important considerations that financial institutions need to take into account when reviewing their data management systems and projects, with a focus on how the industry is currently falling short. Taking on board these recommendations will help financial institutions optimise the way they manage their reference, market and investment data — or any other types of data for that matter — within their organisations.

Finding the right data management platform can be tough (picture by Jens Johnson from Pexels)

1. Use native cloud technology

Cloud adoption is growing every day within the finance industry. And this shift towards the cloud has meant that data management platforms have had to change with it. But can a platform that was built for a different architectural model ever truly be up to the job of running on the cloud? Many established platforms still require what is known as a “thick client” (like installing Microsoft Word rather than using Google Docs) to be installed on desktop computers in order to perform the configuration. This outdated paradigm is something to watch out for.

To ensure financial institutions remain ready to take on the challenges of the future, they need to make sure they choose a data management platform that was designed from the ground up with a modern cloud-based microservices architecture. Both configuration and end-user dashboards should be fully web-based with a minimal amount of training required.

So, when you’re being shown flashy end-user dashboards during a demo, always ask: How do we configure that?

2. Focus on the user experience

Existing vendors focus on the user experience only once the software is live. However, a tremendous amount of configuration is required at the implementation stage to get to that point. Implementations of established software can take many months, even years, as much of the work involves configuring modules and data packages for every unique client instead of providing out-of-the-box functionality that is ready to go.

In the future, simple drop-down menus will be all the configuration needed and implementation time frames will be reduced significantly, saving clients time and money and enhancing their overall user experience, not just when the software is live.

3. Put a flexible data dictionary at the core

In much of the incumbent data management software, data dictionary capabilities were either an afterthought, or designed too rigidly. In both cases, there is a detrimental impact on implementation timelines and on-going maintenance.

The data dictionary (the model or schema of the data being managed) should be at the centre of the data management platform. This is where a lot of the data governance should spring from by ensuring there is clear ownership and integrity of the data as well as by aligning the data to your own preferred codes (e.g. country codes, currency codes, etc). It should also serve as a business glossary, giving each attribute a common meaning across the organisation.

Every financial institution has its own unique view of the world and therefore this data dictionary must be easily extensible. By combining this extensibility with powerful and user-friendly management interfaces, maintenance can be made a breeze. Your data landscape is constantly evolving, so it’s important that your data management platform enables this evolution and doesn’t become a hindrance to change.
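As a minimal illustration of the idea (the class and attribute names here are hypothetical, not taken from any particular platform), an extensible data dictionary might let the governance team register new attributes at runtime, each carrying an owner, a glossary definition and a mapping to the firm's preferred codes:

```python
from dataclasses import dataclass, field

# Hypothetical sketch: each attribute carries a governance owner, a
# business-glossary definition, and a mapping to the firm's house codes.
@dataclass
class AttributeDefinition:
    name: str
    data_type: str
    owner: str             # data-governance owner of this attribute
    definition: str        # common business-glossary meaning
    preferred_codes: dict = field(default_factory=dict)  # vendor code -> house code

class DataDictionary:
    """Extensible container: new attributes can be registered at any time."""
    def __init__(self):
        self._attributes = {}

    def register(self, attr: AttributeDefinition):
        self._attributes[attr.name] = attr

    def lookup(self, name: str) -> AttributeDefinition:
        return self._attributes[name]

dictionary = DataDictionary()
dictionary.register(AttributeDefinition(
    name="country_of_risk",
    data_type="string",
    owner="Reference Data Team",
    definition="ISO country of the issuer's primary economic exposure",
    preferred_codes={"UK": "GB"},  # align a vendor code to the house ISO code
))
print(dictionary.lookup("country_of_risk").preferred_codes["UK"])  # GB
```

Because the dictionary is just data, extending it is a registration call rather than a schema-change project.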

4. Keep permission management simple

One of the critical jobs of a data management platform is to safeguard your data. First and foremost this is about making sure that only correctly authenticated and authorised users can interact with the data. A well-designed architecture also plays a crucial part, particularly when dealing with the cloud. Again, it’s important to avoid ageing tech that wasn’t built with the cloud in mind.

Cloud security must employ the highest industry standards (picture by Arek Socha from Pixabay)

So how do you put those granular controls in place to determine who can access and change what data? It needs to be powerful, flexible and easy to maintain, using engaging and intuitive UIs that are fit for purpose. If this area is overly complicated, not only will maintenance suffer, but it opens up security holes too.
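To make the "easy to maintain" point concrete, here is a deliberately simple sketch (the roles and datasets are invented for illustration): when permissions reduce to explicit role-to-(dataset, action) grants, an access check is a single lookup, which is both auditable and hard to get wrong:

```python
# Hypothetical sketch of granular, easy-to-maintain permissions:
# each role grants a set of (dataset, action) pairs, and every
# access check is a plain set-membership test.
ROLE_GRANTS = {
    "analyst":  {("equities", "read")},
    "data_ops": {("equities", "read"), ("equities", "write")},
}

def is_allowed(role: str, dataset: str, action: str) -> bool:
    return (dataset, action) in ROLE_GRANTS.get(role, set())

print(is_allowed("analyst", "equities", "write"))  # False
print(is_allowed("data_ops", "equities", "write"))  # True
```

The harder the permission model is to read off the configuration like this, the more likely a security hole goes unnoticed.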

5. Store raw data, historical data and snapshots of outgoing data

In the past, you had to hand-craft many of the common requirements of a data management platform. An example of this is storing the raw data as it comes in from the data source, before any transformations or derivations have been carried out. Similarly you want to keep a record of your historical data and snapshots of outgoing data.

Configuring all this can take many weeks of expensive consultancy time in much of the software out there today. The future of data management is that all of this happens out-of-the-box.
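The three stores described above can be sketched in a few lines (this is an illustrative toy, not any vendor's implementation; the uppercase-currency transformation stands in for whatever derivations a real platform applies):

```python
import copy
from datetime import datetime, timezone

# Hypothetical sketch: keep the untouched raw record, every historical
# version after transformation, and snapshots of what went downstream.
class RecordStore:
    def __init__(self):
        self.raw = []        # data exactly as received from the source
        self.history = []    # (timestamp, transformed record) versions
        self.snapshots = []  # copies of outgoing data

    def ingest(self, record: dict) -> dict:
        self.raw.append(copy.deepcopy(record))  # store before any transformation
        transformed = {**record, "currency": record["currency"].upper()}
        self.history.append((datetime.now(timezone.utc), transformed))
        return transformed

    def export(self, record: dict) -> dict:
        self.snapshots.append(copy.deepcopy(record))  # snapshot outgoing data
        return record

store = RecordStore()
out = store.export(store.ingest({"isin": "US0378331005", "currency": "usd"}))
print(store.raw[0]["currency"], out["currency"])  # usd USD
```

The point of the sketch is that none of this should be bespoke consultancy work: raw capture, history and snapshots are the same pattern every time.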

6. Track the complete data lineage

Knowing where your data came from and where it has been sent to is vital in today’s finance industry. Regulatory requirements often dictate this need for data traceability. Alongside the source of the data, you also want to know what transformations took place on each data point’s journey.

Many current providers struggle to show the complete trail from data source to data target. This is particularly the case for systems where large amounts of coding and SQL scripting form part of the implementation. The trail often gets lost at that point.

Data lineage from source to target

Out-of-the-box data lineage from source to target is the feature to look out for. Not only will this mean less implementation work and therefore reduced costs, but the data management team can also get to the bottom of issues quickly and easily. Plus the regulatory authorities will be delighted!
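The trail described above can be sketched as a chain of recorded steps attached to each value (a toy illustration with invented names, not a real lineage engine):

```python
# Hypothetical sketch: every transformation appends a lineage step, so the
# full source-to-target trail can be replayed for the data team or auditors.
class TracedValue:
    def __init__(self, value, source: str):
        self.value = value
        self.lineage = [("source", source)]

    def transform(self, description: str, fn):
        self.value = fn(self.value)
        self.lineage.append(("transform", description))
        return self

price = TracedValue("102.5", source="VendorFeedA")
price.transform("parse to float", float)
price.transform("round to 2dp", lambda v: round(v, 2))
print(price.lineage)
# [('source', 'VendorFeedA'), ('transform', 'parse to float'), ('transform', 'round to 2dp')]
```

Note that the trail survives only because the transformations go through a common interface; hand-written SQL scripts bypass exactly this, which is where the trail usually gets lost.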

7. Keep the end-to-end flow simple

What can really add to the complexity and steep learning curve of data management software is the sheer number of different component types (modules of software that perform a certain task, such as moving data or validating data) required for the end-to-end flow of your data. Quite incredibly, it really can be more complicated than rocket science. With a small number of component types, each taking on a more substantial role, there is less to learn and it becomes far harder to go wrong when building out the end-to-end flow.

Another big issue with a large number of different component types for various different tasks is the long dependency chain this causes. Each component is linked to the others to varying degrees, so making a change in one place means there needs to be a corresponding change in dozens of other places. So do ask: How many different component types does your software have?

8. Reduce the number of mapping exercises

What I have found to be one of the most error-prone and time-consuming aspects of data management projects is the sheer number of mapping exercises required between different sources/components/tables/targets/etc. Different systems and vendors regularly use varying mnemonics to represent the same fields, and therefore an element of manual mapping work will always be required on a data management project.

Too many mapping exercises drive up the costs of projects

However, this can become exponentially more difficult and time-consuming if your data management software offers less out-of-the-box functionality. Imagine having to manually map to the raw data storage tables, historical data tables, stage, audit, master, data warehouse, export tables and so on, and you are facing a mammoth effort lasting many weeks or months. You then end up with a maintenance nightmare! No wonder adding a single extra field to your data master becomes a multi-day piece of work.

In the future, mapping exercises should be largely automated and kept to an absolute minimum. This will save financial institutions an incredible amount of hard work, time, and most importantly money.
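The mnemonic problem reduces to maintaining one canonical mapping per vendor, from which the platform should derive everything else (the vendor names and field mnemonics below are made up for illustration):

```python
# Hypothetical sketch: one mapping per vendor into canonical field names.
# The platform should derive stage/audit/master/warehouse tables from this
# single mapping, rather than asking for a separate exercise per table.
VENDOR_TO_CANONICAL = {
    "VendorA": {"CPN_RT": "coupon_rate", "MAT_DT": "maturity_date"},
    "VendorB": {"CouponRate": "coupon_rate", "Maturity": "maturity_date"},
}

def normalise(vendor: str, record: dict) -> dict:
    mapping = VENDOR_TO_CANONICAL[vendor]
    # unmapped fields pass through under their original name
    return {mapping.get(k, k): v for k, v in record.items()}

print(normalise("VendorA", {"CPN_RT": 4.25, "MAT_DT": "2030-06-01"}))
# {'coupon_rate': 4.25, 'maturity_date': '2030-06-01'}
```

With this shape, adding a field to the data master means touching one mapping, not a dozen table definitions.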

9. Employ a tiered approval system

A feature that often requires weeks of configuration in today’s software is implementing a 4-eye approval process. Many financial institutions don’t want to allow a single person to make changes to sensitive datasets without another user (their boss or their colleague) reviewing the change first and then approving or rejecting it. This has big business benefits from an audit as well as security perspective.

However, in a lot of data management software, this has to be painstakingly configured for each data set you want to apply it to, and could include the following steps:

  • Adding a status column to the relevant table to store the current row status (New, Pending Approval, Approved, Rejected, etc)
  • Building and configuring a dedicated Approvals user interface to show the records that need approving
  • Writing SQL scripts or source code to filter the rows by the status
  • Creating backend processes that handle the saving of changes

Going forward, all this painstaking manual work should be eliminated and just work out-of-the-box. Wouldn’t it be great if this feature could be turned on or off with a single click of the mouse?

Summary

Taking all these points into account is a great start in achieving the perfect target operating model. Maintenance is made easy, new data feeds and systems can be brought on board in no time and fields can be added or removed from data masters without causing any headaches or costing an arm and a leg.

Speak to Fencore today to find out how we can help you on your data management journey and get you ready for the future. Connect with me here and visit us at https://fencore.com.sg to schedule a demo of our ground-breaking platform.

