New Trends in the Data Warehouse Market

The data warehouse marketplace has evolved from vertical players selling separate components for each data task to a larger set of vendors offering solutions that span the full data technology stack, from data governance and data movement to reports and dashboards. This shift reflects emerging needs, such as insurers looking to improve their big data capabilities as they explore new data sources and the insights those sources can provide. It also reflects incumbent realities: insurers tend to approach data technology transformations as a single data warehouse initiative rather than as piece-by-piece data enhancements. In fact, despite the proliferation of full-stack data warehouse providers, the majority of insurers still approach these projects as in-house builds using horizontal platform technology.

Some vendors are partnering with third-party data source providers, some are adding tools to analyze unstructured data, and others are adding capabilities to analyze data streamed from Internet of Things devices. Insurers should keep an eye on these additional solution provider capabilities; as AI technologies become more prevalent in insurance technology, a historical data record spanning multiple systems will be important.

Many large insurers are incorporating data lake technology, data streaming, and cloud massively parallel processing data platforms into their data strategies. While these trends are still in the early stages, Novarica expects them to become more relevant to this space in the next few years. The extract, transform, and load (ETL) processes that move and transform data from one form to another are complex—they make up as much as 80% of the effort required for a data warehouse build. The complexity of these processes also makes them difficult to change.

Analytics solutions built on cloud platforms (such as Snowflake, Google BigQuery, and AWS Redshift), on the other hand, work well with normalized models similar to those found in core systems. With cloud-based analytics platforms, data transformation previously done via batch ETL can instead be done as part of reports or virtualized data marts. Because data is loaded first and transformed afterward, this reversal of the traditional order of steps is often called ELT (extract, load, transform).
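The ELT pattern described above can be sketched with a toy example. Here Python's built-in sqlite3 stands in for a cloud analytics platform, and the table, view, and column names (raw_policies, policy_mart, premium_cents) are hypothetical, chosen only for illustration: raw records are loaded as-is, and the transformation happens at read time through a view, playing the role of a virtualized data mart.

```python
import sqlite3

# An in-memory database stands in for a cloud analytics platform.
conn = sqlite3.connect(":memory:")

# E and L: extract and load the raw policy records first, exactly as sourced
# (hypothetical schema; a real warehouse load would come from source systems).
conn.execute("CREATE TABLE raw_policies (policy_id TEXT, premium_cents INTEGER)")
conn.executemany(
    "INSERT INTO raw_policies VALUES (?, ?)",
    [("P-001", 125000), ("P-002", 98000)],
)

# T: transform at read time via a view, rather than in a batch ETL job.
conn.execute(
    "CREATE VIEW policy_mart AS "
    "SELECT policy_id, premium_cents / 100.0 AS premium_dollars "
    "FROM raw_policies"
)

for row in conn.execute("SELECT * FROM policy_mart ORDER BY policy_id"):
    print(row)  # e.g. ('P-001', 1250.0)
```

Because the transformation lives in the view rather than in a pipeline job, changing it is a one-line redefinition rather than a rework of the load process, which is one reason ELT is attractive given how hard batch ETL is to change.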

Shifting from ETL to ELT isn’t the only change afoot in the data warehouse marketplace. The traditional approach of “Schema on Write” is evolving into a “Schema on Read” approach at many insurers. In a Schema on Read system, data is loaded into an analytics repository in its original form. Consumers then interpret and transform the data as needed when it is read out.
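A minimal sketch of the Schema on Read idea follows; the claim records and field names are hypothetical. Raw records land in the repository in their original form, and each consumer applies its own schema at read time, tolerating fields that Schema on Write validation would have forced into shape up front.

```python
import json

# Raw claim records stored exactly as they arrived (hypothetical fields).
# Schema on Write would have validated and reshaped these before storage;
# Schema on Read defers that to each consumer.
raw_records = [
    '{"claim_id": "C-1", "amount": "1500.00", "loss_date": "2020-03-01"}',
    '{"claim_id": "C-2", "amount": "275.50"}',  # fields may be missing
]

def read_with_schema(raw: str) -> dict:
    """Apply this consumer's schema at read time."""
    doc = json.loads(raw)
    return {
        "claim_id": doc["claim_id"],
        "amount": float(doc["amount"]),          # cast string to number on read
        "loss_date": doc.get("loss_date"),       # tolerate an absent field
    }

claims = [read_with_schema(r) for r in raw_records]
print(claims)
```

A different consumer, such as an actuarial model versus a fraud dashboard, could read the same raw records with a different function and a different schema, which is the flexibility Schema on Read trades against up-front data quality enforcement.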

The number of full-stack data warehouse vendors for P/C insurers has grown, but most insurers are still taking a “build it themselves” approach to data warehousing and reporting, using horizontal platform technologies. Even with a vendor providing a full data technology stack that includes a data layer, industry-specific data models, presentation and visualization tools, and pre-built insurance reports and dashboards, an EDW project is still time-consuming and complex, requiring not only forward-looking design but also reviewing, cleaning, and rethinking existing data and core systems. To learn more about the vendors active in this space, read Novarica’s latest Market Navigator report, Data Warehouse Solutions for Property/Casualty Insurers.
