Automated Data Catalog

K2View Data Catalog is built into the company's Data Product Platform to inventory and classify data assets, and to visually map information supply chains.

Achieve agile information governance


Take control of your operational data

Gain end-to-end data lineage, from data sources to data consumers. 


Stay up to date,
all the time

Discover, profile, organize, and document metadata – automatically.


Turn data into a business asset

Provide data teams with the context they need to extract value from data.

Data catalog tools

Auto-discover your metadata

The K2View Data Catalog collects, analyzes, and visualizes the metadata for an enterprise's data products.

It enables self-service data discovery for both operational and analytical workloads, and serves as the data registry for a federated data mesh or centralized data fabric architecture.

Data catalog software

Go with the flow

The K2View Data Catalog visually describes how an organization collects, transforms, and stores its data inventory.

With its user-friendly UI, the data catalog enables users to follow a data flow from its data source (table and column) to its target data consumer, providing full clarity and confidence in the company's data model.

For example, the catalog can help a software engineer identify the source systems of a web service's required output fields.
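Conceptually, that lookup is a reverse walk over a directed graph of catalog entities. The sketch below is purely illustrative: the edge list and entity names are hypothetical examples, not the K2View Data Catalog's actual API or data model.

```python
# Illustrative sketch of tracing a web service output field back to its
# source tables and columns. The UPSTREAM edge list is a hypothetical
# example, not the K2View Data Catalog API.

# Each entry maps a catalog entity to the upstream entities that feed it.
UPSTREAM = {
    "svc.getCustomer.balance": ["product.Customer.balance"],
    "product.Customer.balance": ["crm.ACCOUNTS.BAL_AMT", "billing.INVOICES.TOTAL"],
}

def trace_sources(entity, upstream=UPSTREAM):
    """Return the set of root source columns feeding a given entity."""
    parents = upstream.get(entity)
    if not parents:                      # no upstream edges: a root source
        return {entity}
    sources = set()
    for parent in parents:
        sources |= trace_sources(parent, upstream)
    return sources

# The web service's output field resolves to two source-system columns.
print(sorted(trace_sources("svc.getCustomer.balance")))
# → ['billing.INVOICES.TOTAL', 'crm.ACCOUNTS.BAL_AMT']
```

The same traversal run in the other direction answers impact-analysis questions: which consumers are affected if a source column changes.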

Enterprise Data Catalog

Get the full picture

K2View Data Catalog visualizes all the data entities managed within the K2View Data Product Platform, including web services (and their output fields), integration points (interfaces), data product schemas, tables, columns, and more – as well as the relationships between them.

The relationships indicate the connections between the entities and determine their hierarchy.

The data catalog and data lineage structures can be readily exported in a standard format to third-party enterprise data catalog tools.
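An export like this can be sketched as serializing the entity/relationship graph to JSON. The schema and entity names below are hypothetical illustrations; the source does not specify the actual export format used by K2View or the receiving tools.

```python
import json

# Hypothetical entity/relationship model for a catalog export.
# The real export schema may differ; this only illustrates the shape
# of a graph of typed entities plus the relationships between them.
entities = [
    {"id": "crm.ACCOUNTS", "type": "table"},
    {"id": "crm.ACCOUNTS.BAL_AMT", "type": "column"},
    {"id": "product.Customer", "type": "data_product_schema"},
]
relationships = [
    {"from": "crm.ACCOUNTS", "to": "crm.ACCOUNTS.BAL_AMT", "kind": "contains"},
    {"from": "crm.ACCOUNTS.BAL_AMT", "to": "product.Customer", "kind": "feeds"},
]

export = json.dumps({"entities": entities, "relationships": relationships}, indent=2)
print(export)

# A third-party tool would parse the same document back into a graph.
catalog = json.loads(export)
print(len(catalog["entities"]), len(catalog["relationships"]))
# → 3 2
```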



Trust your source using our data catalog tools

K2View Data Catalog is auto-updated at all times, with every change to underlying sources, data products, consuming web services, or anything in between.

With data teams spending more than 80% of their time searching, preparing, and protecting data, they have very little time left to understand it and put it to good use.

That's why a single, reliable, and always-up-to-date source of data is a mandate for today's data and analytics professionals.