The Importance of Data Standards

There is no denying that data standards play a major role in how banks and financial institutions run their organisations, and they offer a wide range of benefits. Reports and analytics, as an example, become much easier to collate when data sets are normalised by default.
Third-party integrations with other technologies that have a direct impact on business operations become easier when there is commonality in the integration data sets. With regulatory compliance now front and centre of every financial institution's reporting requirements, standardisation plays a critical role in a bank's level of compliance.
But how does this play out in the global economy?
One of the issues we face is that multiple organisations are pursuing standardisation within their own regions and for specific use cases. What is needed is a truly global approach to data standardisation across industries, use cases and geographies.
While significant effort has been put into standardising various aspects of financial data, no single effort has been all-encompassing. The result? A list of standards, each with its own versions, that quite often cover only a narrow scope of the financial ecosystem or a specific set of use cases.

CHALLENGES

The sheer scope of a single financial data standard presents a significant challenge. Questions such as who oversees the global standard, and how all the necessary parties are brought together to create it, are difficult to answer.

As Shane Rigby, CEO of LIXI, comments, there are four main interrelated reasons that make global data standards challenging to develop, maintain and have broadly adopted:

1. Sustainability

2. Critical Mass for Adoption

3. Optimum Level of Complexity

4. Ongoing Change Management

Sustainability

The ongoing development and maintenance of standards need to be sustainable. There needs to be a community of participants that are incentivised to continually evolve the standards through raising and analysing requirements, proposing, discussing and testing updates, and issuing new versions of the standards.

Initiatives such as Open Banking can provide a good solution to this conundrum. By pushing the players in a market to define a standard that brings efficiency and transparency, governments are possibly in the strongest position to drive data standardisation. That leaves the practical creation of the standard.

Internationally recognised and experienced bodies such as ISO are best placed to practically drive the creation of standards forward. Working with other bodies that deal with data transmission, such as the W3C, could also prove beneficial, though it would increase the overall overhead involved.

Even once a standard is largely ‘finalised’, the community needs to continue to maintain the platforms used to manage the standards, apply software patches, and test and update change management processes. Even the underlying format specifications (such as JSON Schema) change over time and may require standards based on them to be updated.
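To illustrate that last point, consider a small, hypothetical schema fragment (not drawn from any existing standard; the loanAmount and currency fields are invented for illustration): the same business rules must be declared against a specific JSON Schema draft, and keywords change between drafts. A minimal Python sketch, assuming the third-party jsonschema package is installed, might look like this:

```python
# Minimal sketch of draft-pinning in JSON Schema. The field names are
# hypothetical; the point is that the "$schema" declaration ties the
# standard to a specific draft of the underlying format specification.
# Requires: pip install jsonschema (Draft202012Validator needs v4.0+).
from jsonschema import Draft7Validator, Draft202012Validator

# Hypothetical loan fields, pinned to draft-07.
schema_draft7 = {
    "$schema": "http://json-schema.org/draft-07/schema#",
    "type": "object",
    "properties": {
        "loanAmount": {"type": "number", "minimum": 0},
        "currency": {"type": "string", "pattern": "^[A-Z]{3}$"},
    },
    "required": ["loanAmount", "currency"],
}

# The same rules restated against draft 2020-12. Between drafts, keywords
# changed (e.g. "definitions" became "$defs", tuple-form "items" became
# "prefixItems"), so a standard built on draft-07 may need updating even
# though its business content is unchanged.
schema_202012 = {**schema_draft7,
                 "$schema": "https://json-schema.org/draft/2020-12/schema"}

message = {"loanAmount": 350000, "currency": "AUD"}
Draft7Validator(schema_draft7).validate(message)       # passes under draft-07
Draft202012Validator(schema_202012).validate(message)  # passes under 2020-12
```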

Data standards that are closely tied to a specific technology seem bound to intertwine their fate with that of the technology. In the early 2000s, XML was a widely used messaging format, but it has since been superseded by JSON for many use cases. The SOAP protocol was built around XML, and as interest in XML declined, interest in SOAP waned with it, though XML and SOAP will likely remain in use in specific use cases where document markup is required.[1] It would therefore seem prudent to avoid tying data standards to a specific technology as far as feasible. It may be better to define a data standard within a technology-agnostic framework, with the goal of providing tools that can transform the standard into a specific technology or format, or between different technologies and formats as they shift and evolve over time.
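To make the idea concrete, here is a minimal sketch (illustrative only, and not LIXI's or any standards body's actual tooling; the applicant record and function names are invented) of defining a record once in a technology-agnostic form and rendering it into whichever wire format a counterparty needs, in this case JSON or XML:

```python
# Sketch: one abstract record, two renderings. The record itself carries
# no assumptions about the wire format; transforms are applied at the edge.
import json
import xml.etree.ElementTree as ET

# Technology-agnostic definition: plain data, no format implied.
record = {"applicant": {"givenName": "Jane", "familyName": "Citizen"}}

def to_json(data: dict) -> str:
    """Render the abstract record as a JSON message."""
    return json.dumps(data, indent=2)

def to_xml(data: dict, root_name: str = "Message") -> str:
    """Render the abstract record as an XML document."""
    def build(parent, obj):
        for key, value in obj.items():
            child = ET.SubElement(parent, key)
            if isinstance(value, dict):
                build(child, value)
            else:
                child.text = str(value)
    root = ET.Element(root_name)
    build(root, data)
    return ET.tostring(root, encoding="unicode")

print(to_json(record))  # {"applicant": {"givenName": "Jane", ...}}
print(to_xml(record))   # <Message><applicant><givenName>Jane</givenName>...
```

If JSON were later displaced the way SOAP was, only the rendering functions would need replacing; the abstract definition of the standard would remain intact.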

Critical Mass

The paradox of voluntary standards is that there needs to be a significant community of adopters to overcome the fear of being first. The more use cases for which a participant can adopt the standard, the more value they can derive from it. This ‘chicken and egg’ problem takes significant effort to overcome.

LIXI's experience provides an in-market example: by not focussing solely on the lodgement of mortgages within the Australian market, LIXI found that lenders are much more likely to adopt the standards across multiple functions. Other processes supported by these standards, such as valuations, insurance issuance and core banking integrations, all increase the value that a lender (and the supporting ecosystem) can derive from using them.
