K2VIEW MARKET GUIDE

Data fabric vendors and market guide

Data fabric is an emerging data architecture capable of managing enterprise data at scale – for both operational and analytical workloads. Read this guide before evaluating the top data fabric vendors.


INTRO

Data fabric vendors and their offerings

Data fabric is the modern data architecture of choice for enterprises looking to democratize data access for operational and analytical use cases, at massive scale.

This article is intended for CDOs, CIOs, data engineers, data scientists, and their business counterparts – as a quick and easy guide to understanding data fabric vendors and their offerings.

Chapter 01

What is data fabric

Analyst firm Gartner defines data fabric as an important emerging trend that requires a combination of multiple data management technologies – such as data integration, data pipelining, data orchestration, data governance, and data catalog.

Data fabric diagram from Gartner

Gartner: “A data fabric stitches together integrated data from
many different sources and delivers it to various data consumers.”

The main goal of a data fabric is to deliver integrated and enriched data – at the right time, in the right way, and to the right data consumer – to support both operational and analytical use cases.

 

By 2024, 25% of data management vendors will provide a complete framework for data fabric – up from 5% today.

– Gartner

Chapter 02

Why data fabric

Successful data-driven enterprises map intended business outcomes to their data architecture and technology. So, business and IT stakeholders need to know the reasons for implementing data fabric, and thoroughly understand its value proposition. Key drivers are listed below.

Business drivers for data fabric

  • Cleaner, faster data pipelining leads to shorter time to insights, more informed decision-making, and ultimately better business outcomes.
  • Gaining a real-time, 360-degree view of any business entity – such as a customer, product, store, order, claim – enables new digital experiences, such as hyper-segmentation, proactive service, and sales personalization.
  • Decoupling data from the underlying applications enables incremental and rapid modernization of legacy siloed systems.

Data management drivers for data fabric

  • Automation frees up data scientists, data engineers, and other IT resources from tedious grunt work.
  • A data fabric infrastructure integrates disjointed data management tools and makes others redundant, for greater cost efficiency.

Organizational drivers for data fabric




Knowledge graphs are important because most relationship insights are lost when using traditional data modeling and integration tools.

Chapter 03

Data fabric vendor capabilities

When researching data fabric vendors, look for a platform that comes with the following key capabilities:


  • Data catalog, to inventory and classify data assets, and visually map information supply chains
  • Data engineering, to create scalable data pipelines for operational and analytical workloads
  • Data governance, to ensure quality, enforce privacy, and make data safely accessible at scale
  • Data orchestration, to package fragmented data from multiple source systems and make it accessible via secured web services
  • Data preparation, to cleanse, transform, and enrich enterprise data for analytics and AI
  • Data integration and delivery, to integrate data from any source and deliver it to any target, in bulk (batch) or real time
  • Data persistence layer, to dynamically persist data in different relational and nonrelational models

Learn how K2View Data Fabric can make your business more profitable, from the moment it is deployed

Request Demo

Chapter 04

Leading data fabric vendors

The table below summarizes the strengths and concerns of the top 5 data fabric vendors.

K2View

Strengths:
  • Single, integrated platform, combining all data fabric capabilities
  • Data is uniquely organized by business entity, for real-time data pipelining and “x360” workloads at scale
  • Support for massive data workloads requiring real-time data integration and movement
  • Full support for both analytical and operational workloads
  • Quick deployment (typically in a few weeks) and easy adaptation, supporting agile and CI/CD
  • Low total cost of ownership (TCO)

Concerns:
  • Focus on large enterprises, with relatively few mid-sized customers
  • High concentration of deployments in telco, healthcare, and financial services markets
  • Few system integration partners outside North America and Europe

Denodo

Strengths:
  • Focus and strength in data virtualization
  • Catalog serves as a single entry point for enforcing security and governance
  • Broad go-to-market partnerships
  • Optimization for analytics use cases

Concerns:
  • Complexity in managing and operating the data fabric
  • Not suitable for high-volume operational workloads
  • Processes and effort required to ensure distributed query performance on the platform

Talend

Strengths:
  • Focus and strength in data integration across multi-cloud and hybrid ecosystems
  • Wide-ranging capabilities for data engineering
  • Broad set of connectors for a large variety of data sources

Concerns:
  • Not suitable for high-volume operational workloads; best suited for analytics use cases
  • Support required for complex data orchestration and data pipeline operationalization
  • Limited data virtualization capabilities

Informatica

Strengths:
  • Use of AI and ML for augmented data integration and data quality support
  • Strengths in data integration for optimized analytics, data migration, and MDM
  • Ability to scale in support of complex data integration scenarios

Concerns:
  • Complex and costly deployment and adaptation
  • Data virtualization support required
  • Limited real-time data pipelining capabilities, making it less suitable for operational workloads where real-time data integration is required
  • Multiple disjointed tools acquired over time and not yet integrated into a single platform

IBM Cloud Pak for Data

Strengths:
  • Strong product scalability and performance
  • Diverse data integration delivery styles and architectures
  • Data virtualization and metadata management
  • Improved integration capabilities repackaged as Cloud Pak for Data

Concerns:
  • Data fabric comprises multiple standalone products, creating uncertainty around the platform’s structure, cost, and deployment
  • Complex architecture, resulting in difficult and costly upgrades
  • Self-service and cloud-based data integration capabilities required

 

Chapter 05

Why K2View

K2View stands out as the only platform capable of responding to entity-centric data queries in real time, at massive scale, and supporting both analytical and operational workloads.

Unique technology

At the core of K2View’s patented technology is the Digital Entity™, the logical data schema that unifies everything the enterprise knows about a business entity (such as a customer, product, location, credit card, etc.), including business transactions, cross-channel interactions, network data, and master data.

The digital entity integrates all data elements, regardless of source systems, technologies, and formats.
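As an illustration only, the idea of a single logical schema that unifies entity data from several source systems can be sketched as below. The field names and the build_entity helper are hypothetical assumptions for this example, not K2View's actual Digital Entity schema or API:

```python
from dataclasses import dataclass, field

# Hedged sketch: field names and build_entity are illustrative assumptions,
# not K2View's actual Digital Entity schema.

@dataclass
class CustomerEntity:
    """One logical view of everything known about a customer."""
    customer_id: str
    master_data: dict = field(default_factory=dict)    # e.g. from a CRM
    transactions: list = field(default_factory=list)   # e.g. from billing
    interactions: list = field(default_factory=list)   # e.g. from channel logs

def build_entity(customer_id, crm, billing, channels):
    """Assemble one entity from several (here dict-based) source systems."""
    return CustomerEntity(
        customer_id=customer_id,
        master_data=crm.get(customer_id, {}),
        transactions=billing.get(customer_id, []),
        interactions=channels.get(customer_id, []),
    )
```

The point of the sketch is that the entity, not the source system, is the unit of organization: a consumer asks for a customer and receives all of that customer's data, regardless of where each piece originated.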

The Digital Entity2-1

Everything a company knows about a customer is encapsulated in a digital entity.

How it works

At runtime, digital entity data is integrated from underlying data sources, enriched, transformed, compressed by 90%, and stored in a patented Micro-Database™ – one micro-DB per business entity.

K2View manages 120 million micro-DBs in the AT&T data fabric – one for each customer.

Each micro-DB is secured with its own encryption key, meaning that data is protected at the customer record level. Data sync rules manage the frequency of data updates between the source systems and the micro-DBs, and the architecture is distributed to support workloads of massive scale – in the cloud, on-premises, or in a hybrid architecture.

Key features

  • Modular, open, and scalable architecture
    Data integration, transformation, enrichment, preparation, and delivery – integrated in a single, extensible platform
  • Split-second, end-to-end response times
    Enterprise data fabric, built to support real-time operations, with bi-directional data movement between sources and targets
  • Macro big data management in micro-DBs
    Data for each business entity is integrated into its own micro-DB, for a single source of truth, and a unified view of that entity 

WHITEPAPER

Uncover the patented technology behind our operational data fabric

We wrote the book on entity-based data fabric. Our technical whitepaper reveals the inner workings of K2View Data Fabric, and how it effectively organizes and manages big data.