
WHAT IS UNIFIED ENTERPRISE DATA LIFECYCLE MANAGEMENT? (ALSO CALLED UNIFIED EDM)


Successful IT organizations must be able to incorporate the best talent and tools in order to stay ahead of the
game. With so many moving parts in the Data Integration and Migration process, visibility and collaboration
are now more important than ever.

Unified Enterprise Data Lifecycle Management is the unification of:

1. Data Governance
2. Metadata Management - Business & Technical with Data Lineage
3. Data Quality
4. Enterprise Development Methodologies/Lifecycle - Agile & SDLC/Waterfall
5. Code and Collaboration Automation

[Figure: a wheel of data attributes surrounding the data - Data Governance; Business Metadata (Dictionaries, Glossary); Technical Metadata; Data Quality; Data Lineage; and the Development Lifecycle.]

Without an intentional focus on things like Data Governance, Business and Technical Metadata, Data Lineage and Data Quality, the deliverables of the entire Data Development Lifecycle may be at risk.

There is a necessary synergy between people, processes and technology that must be considered when
planning for success. However, fragmentation or silos within organizations can impact our ability to see things
in a unified way. They can prevent us from working with maximum transparency and collaboration. This can
result in confusion, churn, and inefficiencies. Teams struggle to determine where each part of the lifecycle begins and ends among the various organizations. Ultimately this is a breakdown of Unified Enterprise Data Lifecycle Management, leading to a lack of trust among the business users who depend on it.

Although Unified Enterprise Data Lifecycle Management comprises many parts, there is ONE that is a key element to the success of any IT organization:

1. Data Governance

Without standards and practices that drive the proper use and format of the data, there can be no clear path to
success.

Many organizations do not know where to begin. Others start by making rules and assigning Data Stewards, which can ultimately give the Governance Initiative a negative connotation, producing short-term rejection and long-term failure.

Successful organizations know that data needs to be governed through a strategic approach. Data Stewards must be knowledgeable and have the right tools and methodologies at their disposal, making the data governance initiative a natural, built-in part of every step of the process.
It also requires the appropriate organizational structure, support and technology tools to set the trajectory for success. Maintaining the relevant policies, processes and standards with regard to change management, Data Stewardship and Subject Matter Experts (SMEs) is key.
Governance is also an effective means of measuring and monitoring the data. Making it part of the process will ensure adoption and long-term growth of the overall initiative.
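As a loose illustration of governance embedded in the process, the sketch below expresses policies as executable checks with named stewards; every name, field and structure here is a hypothetical assumption, not part of any specific governance product.

# Illustrative sketch only: governance policies expressed as executable
# checks, so "measuring and monitoring the data" is part of the normal
# process rather than a one-off audit. All names are hypothetical.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Policy:
    name: str
    steward: str                      # accountable Data Steward
    check: Callable[[dict], bool]     # returns True if a record complies

policies = [
    Policy("customer_id is mandatory", "jane.doe",
           lambda r: bool(r.get("customer_id"))),
    Policy("country uses ISO-3166 alpha-2", "john.roe",
           lambda r: len(r.get("country", "")) == 2),
]

records = [
    {"customer_id": "C001", "country": "US"},
    {"customer_id": "", "country": "USA"},
]

# Score each policy across the dataset and report compliance to the steward.
for p in policies:
    passed = sum(p.check(r) for r in records)
    print(f"{p.name}: {passed}/{len(records)} compliant (steward: {p.steward})")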

[Figure: the Data Governance wheel - Strategy; Organization; Policies, Processes & Standards; Stewardship; Measurements & Monitoring; Collaboration & Communication - with each segment broken out as follows.]

Strategy: Vision & Mission; Objectives & Goals; Alignment with Corporate Objectives; Alignment with Business Strategy; Guiding Principles; Operating Model

Change Management: Business Impact & Readiness; IT Operations & Readiness; Stakeholder Management & Communication; Defining Ownership & Accountability Mechanisms; Training Strategy

Communication Plan: Mass Communication; Training & Awareness; Individual Updates

Organization: Arbiters & Escalation Points; Data Governance Organization Members; Roles and Responsibilities; Data Ownership & Accountability

Policies, Processes & Standards: Policies & Rules; Processes; Workflow; Controls; Data Standards & Definitions; Metadata, Taxonomy, Cataloging and Classification

Measurements & Monitoring: Statistics and Analysis; Tracking of Progress; Monitoring of Issues; Continuous Improvement; Score-carding

Collaboration & Tools: Information Life Cycle Tools; Data Mastering & Sharing; Data Architecture & Security; Data Quality & Stewardship

2. Metadata Management - Business & Technical with Data Lineage

Other key components of the Unified Enterprise Data Lifecycle Management approach include Metadata Management, which is, in turn, made up of Business Metadata captured in a shared Business Glossary, as well as Technical Metadata collected and centralized with context that gives it meaning across business units and in relation to the business process.

To enable proper Technical Metadata use that fits with an existing architecture, there must be flexibility in integration, and interoperability with a broad set of data management tools and technologies.

This ensures that as you work to instill better practices with purpose-built tools, the people who currently maintain and benefit from your existing architecture can continue to do so as the organization evolves.

[Figure: Metadata Management - Automated Parsers/Scanners; Business Metadata; Technical Metadata; Terms linked to Metadata; Data Lineage & Impact Analysis.]
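As a rough sketch of how business and technical metadata connect, the following models a glossary term linked to the physical columns that carry it; the data structures and names are illustrative assumptions, not any particular tool's API.

# Minimal sketch: a business glossary term linked to technical metadata,
# so a business definition resolves to the physical columns that carry it.
# All structures and names here are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class TechnicalAsset:
    system: str      # e.g. the warehouse or source application
    table: str
    column: str

@dataclass
class GlossaryTerm:
    name: str
    definition: str
    assets: list = field(default_factory=list)  # linked technical metadata

glossary = {
    "Customer Lifetime Value": GlossaryTerm(
        name="Customer Lifetime Value",
        definition="Projected net revenue attributed to a customer.",
        assets=[TechnicalAsset("EDW", "dim_customer", "cltv_amount")],
    )
}

term = glossary["Customer Lifetime Value"]
for a in term.assets:
    print(f"'{term.name}' is physically stored in {a.system}.{a.table}.{a.column}")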

[Figure: Unified Enterprise Data Lifecycle Management pipeline - data sources, BI reports, requirements and SMEs feed an Ingest > Prepare > Automated Review & Approval > Publish flow that produces data lineage, impact analysis, new development analysis and regulatory compliance views, with scheduled automated refreshes.]

The scanning and parsing of code to accurately harvest and create data lineage is made possible by integrating with Metadata Connectors, ETL Parsers, Scripting Parsers, Metadata Tool Integrators, ETL Connectors, Big Data Connectors, and Modeling and Testing Tool Integrators, all with the ability to connect and share data freely inside the enterprise architecture.
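To make the idea concrete, here is a deliberately naive sketch of lineage harvesting in Python; a real parser handles many SQL dialects and ETL formats, so treat the regex and names below purely as illustration.

# Toy sketch of lineage harvesting: scan an INSERT ... SELECT statement and
# record source-to-target edges. A real ETL/SQL parser covers many dialects
# and constructs; this regex-based pass is purely illustrative.
import re

sql = """
INSERT INTO edw.dim_customer
SELECT c.id, c.name FROM staging.customers c
JOIN staging.accounts a ON a.cust_id = c.id
"""

target = re.search(r"INSERT\s+INTO\s+([\w.]+)", sql, re.I).group(1)
sources = re.findall(r"(?:FROM|JOIN)\s+([\w.]+)", sql, re.I)

# Each (source, target) pair is one edge in the data lineage graph.
lineage_edges = [(src, target) for src in sources]
print(lineage_edges)
# [('staging.customers', 'edw.dim_customer'), ('staging.accounts', 'edw.dim_customer')]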

The next step toward maintaining accuracy across these systems is automated source-to-target mapping, which can be used to generate ETL jobs for a broad variety of ETL platforms and automatically updates lineage and impact analysis views as the source and target metadata change over time.
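As a minimal sketch of this idea (with an assumed, illustrative mapping format), a source-to-target mapping kept as data can drive both ETL code generation and lineage derivation, so the two never drift apart.

# Minimal sketch: a source-to-target mapping kept as data, from which both
# an ETL statement and a lineage view can be derived. The mapping format is
# an illustrative assumption, not a specific product's specification.
mapping = {
    "target": "edw.dim_customer",
    "source": "staging.customers",
    "columns": [
        {"src": "id",        "tgt": "customer_id", "rule": "id"},
        {"src": "full_name", "tgt": "customer_nm", "rule": "UPPER(full_name)"},
    ],
}

def generate_sql(m: dict) -> str:
    """Generate a simple INSERT ... SELECT from the mapping."""
    cols = ", ".join(c["tgt"] for c in m["columns"])
    exprs = ", ".join(c["rule"] for c in m["columns"])
    return (f"INSERT INTO {m['target']} ({cols})\n"
            f"SELECT {exprs} FROM {m['source']};")

def lineage(m: dict):
    """Derive column-level lineage edges from the same mapping."""
    return [(f"{m['source']}.{c['src']}", f"{m['target']}.{c['tgt']}")
            for c in m["columns"]]

print(generate_sql(mapping))
print(lineage(mapping))

Because the generated code and the lineage view come from the same mapping, a change to the mapping automatically refreshes both.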

3. Data Quality

Continuing further into the Unified Enterprise Data Lifecycle Management recipe for success, we come to Data
Quality, which also plays a key role in building trust with the business stakeholders.
Every report and decision point can lead to questions about the quality of the data. Where did it come from?
How was it transformed along the way? Can I trust this report and its precision or is it just directionally correct?

Data Quality can mean a broad variety of things from organization to organization. Ensuring ongoing
consistency in Data Quality is the next big challenge.

The ability to systematically scan, profile, assess, and fix the data will play an important part in the overall outcome, both in the short and longer term.
Traditionally, this would mean a long and arduous process undertaken as a one-time effort and not repeated until it becomes necessary. This creates a pendulum swinging from good to bad and back again, creating frustration among the business users and reinforcing the belief that the data cannot be trusted.

In order to put a solution in place that is initially tactical and becomes strategic, you need a robust set of capabilities that go beyond identification of the problem and dig deeper into solving it in a way that can be replicated on whatever interval may be necessary.
Without an effective and integrated way to remediate, enhance, match and consolidate the results, the process will be, at best, incomplete. This is where workflow-enabled Data Quality Assessment Tools are changing the landscape for companies around the world.
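For illustration only, here is a toy version of the match-and-consolidate step using a naive blocking key and a simple survivorship rule; real workflow tools use fuzzy matching and far richer rules.

# Illustrative sketch of "match & consolidate": group candidate duplicates
# with a simple normalized key and keep the most complete record per group.
from collections import defaultdict

records = [
    {"name": "ACME Corp.",  "city": "Boston", "phone": "617-555-0100"},
    {"name": "Acme Corp",   "city": "Boston", "phone": None},
    {"name": "Widget LLC",  "city": "Austin", "phone": "512-555-0199"},
]

def match_key(r: dict) -> str:
    # Naive blocking key: lowercase name without punctuation, plus city.
    name = "".join(ch for ch in r["name"].lower() if ch.isalnum())
    return f"{name}|{r['city'].lower()}"

groups = defaultdict(list)
for r in records:
    groups[match_key(r)].append(r)

# Survivorship: keep the record with the most populated fields in each group.
golden = [max(g, key=lambda r: sum(v is not None for v in r.values()))
          for g in groups.values()]
print(golden)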

WORKFLOW BASED DATA QUALITY ASSESSMENT MANAGER

[Figure: the DQAM workflow wheel - Data Quality Management and Issue Management drive a continuous cycle of Data Sourcing, Parsing/Scanning, Data Profiling, Analysis, Measurement, Reporting, Standardization (Rules Management), Match & Consolidate, Enhance, and Automated Remediation via generated code, with continuous monitoring throughout.]
An end-to-end data quality process tool with workflow capabilities and automatic code generation for data remediation allows you to manage data quality for the business, with visibility, measurement, and a collaborative approach to maintenance. This process starts with the end in mind, and is uniquely comprehensive, complete and reusable.

Once the foundation is formed, you will have a continuous process to monitor and report on Data Quality through Issues Management, Visualization and Dashboards, enabling ongoing and accurate Data Quality that ensures precision reporting and decision-making and ultimately builds trust.
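A minimal sketch of what such continuous monitoring might look like, assuming illustrative rule names and thresholds.

# Hedged sketch: a recurring data quality check that produces a simple
# scorecard, so quality is monitored continuously rather than assessed once.
# Rule names and thresholds are illustrative assumptions.
rows = [
    {"order_id": 1, "amount": 125.0, "email": "a@example.com"},
    {"order_id": 2, "amount": None,  "email": "not-an-email"},
]

rules = {
    "amount_populated": lambda r: r["amount"] is not None,
    "email_has_at":     lambda r: "@" in (r["email"] or ""),
}

scorecard = {
    name: sum(rule(r) for r in rows) / len(rows)
    for name, rule in rules.items()
}

# Scores below a threshold would be raised as issues for remediation.
for name, score in scorecard.items():
    status = "OK" if score >= 0.95 else "ISSUE"
    print(f"{name}: {score:.0%} [{status}]")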

4. Enterprise Development Methodologies - Agile & SDLC/Waterfall

We must also consider the Development Methodologies that enable new development or maintenance of the existing repositories, with approaches like the Agile Framework and the traditional System Development Lifecycle (SDLC), or Waterfall.

DATA LIFECYCLE MANAGEMENT & AUTOMATION

[Figure: the automated data lifecycle across seven phases - Discover, Define, Design, Build, Test, Deploy, Maintain. Discover covers Metadata Management (automated parsers/scanners, business and technical metadata, terms linked to metadata, data lineage and impact analysis). Define covers Requirements Management (business, functional and technical requirements gathering; system/data analysis; impact analysis). Design covers Data Mapping & Rules Design (data modeling, mapping design, mapping specifications, source-to-target mapping, schema enhancement). Build covers Solution Development and Mapping Release (code development, release management, code generation). Test covers System Testing (system and integration testing, user acceptance testing). Deploy covers Data Quality Assessment and Release Management (code migration). Maintain covers updates, mapping maintenance, code changes and continuous data quality. Each phase names its resources (data stewards, business analysts, SMEs, architects, developers, QA analysts, release analysts, data analysts) and tooling (XConnect, RQM, Mapping Manager, Release Manager, CATfX, Test Manager, DQAM, Faster MDM Reference Data & Codeset Manager).]
Each of these methodologies is a commonly used practice for Data Exchange, Integration, Transformation, Migration, Extraction and Conversion for Technology Modernization. These may include Database Platform Migration, ETL Platform Migration, Data Vault Methodology and Automation, Data Lake Automation, and new ETL Development, as well as Business Intelligence enablement.

These enable organizations to truly deliver competencies in Data Management Process Design, Data Transformation Process, Information Literacy, Operational Forensics, Data Compounding, Data Harmonization, Contextual Analytics, Outcome Analysis, Enterprise Data Inventory, and more.

With all of these critical process components being satisfied in a way that is sound, robust and repeatable, the
next natural step is Automation.

5. Code and Collaboration Automation

Traditionally these processes have been accomplished through practices that are less than efficient, time-consuming, and often not reusable. But with advanced technology we can reuse, reorient, and redeploy nearly everything, writing the future of data management with the flexibility to be out front in every chapter.
The traditional one-off approach requires additional justification and funding each time new updates or changes are required. It is a painful cycle that costs time and money.

The future of data warehousing, data integration, and data management is in future-proof, efficient automation
and reusability.
The resulting acceleration and time savings eliminate half of the repetitive funding cycles, leveraging instead the automation already built into the architecture year over year.
An integrated ecosystem delivers the ability to keep pace with data practices and business demands. Best of all, the setup effort is small and the learning curve for data management professionals and business users is nearly flat.

Key Areas of Automation

1. Enterprise Source-to-Target Mapping


2. Integrated Development Environment, known as Code Automation Template Frameworks. This enables you to access metadata through a series of custom APIs with unlimited automation options and code-generation capabilities. Purpose-built modules take automation to a new level and offer a wider variety of output types than any product on the market, giving you the freedom to generate all brands of ETL code, all types of SQL Stored Procedures, as well as Big Data scripting for Pig, Python, MapReduce and many more (see the sketch after this list).
3. Automated Design, Development and Testing
4. Impact Assessments
5. Requirements-to-Delivery Traceability Matrix
6. Data Quality Standardization & Reporting
7. Data Lineage Mapping and Refreshes
8. Release Management
9. Collaboration, Work Queues & Workflows
10. Performance & Status Reporting
11. LDAP Integration to meet security requirements
12. And more
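Here is a minimal sketch of template-driven code generation of the kind named in item 2, assuming an illustrative template and metadata shape rather than any specific framework's API.

# Illustrative sketch of a code-automation template: metadata fills a
# template to emit a stored procedure. The template text and metadata shape
# are assumptions for illustration, not any product's actual framework.
from string import Template

proc_template = Template("""\
CREATE PROCEDURE load_$table AS
BEGIN
    INSERT INTO $target ($columns)
    SELECT $columns FROM $source;
END;""")

metadata = {
    "table": "dim_customer",
    "target": "edw.dim_customer",
    "source": "staging.customers",
    "columns": "customer_id, customer_nm",
}

# The same metadata could feed other templates (ETL jobs, Python/Pig
# scripts), which is what makes template-driven generation reusable
# across output types.
print(proc_template.substitute(metadata))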

Leveraging a metadata-driven platform and internal APIs, automation for all parts of the lifecycle, in agile or traditional methodologies, is now a reality.

This gives you a comprehensive approach to Automating the overall Data Management process in a way that
is highly collaborative, visible and reusable.

It ensures that key benefits are not "lost in translation" through siloed systems, and creates bridges with the Business Teams who seek to better leverage and understand the projects they are funding every day.

The Unified Enterprise Data Lifecycle Management process includes key features that make for a simplified
approach to standardization, True Data Governance, and fast adaptation to a broad variety of Regulatory and
Compliance requirements for just about every industry.

Through increased visibility and collaboration, we create an environment that inspires idea sharing and a best
practices approach naturally and seamlessly, for everyone in the enterprise.

The Unified EDM Platform Solution approach generates exponential value at a very low cost of entry. It gives you the freedom to implement in a way that truly fits your business today and well into the future. With LDAP Integration to meet security requirements and local install, cloud, or hybrid options available, there are no limits.

Alternative solutions that meet the combined value proposition of the Unified EDM Platform can be three or
four times the cost, with additional risk & incomplete functionality, and no centralization.

Considering all of the advantages and efficiencies Unified EDM creates for both the Business Organizations and IT, it's no wonder it is the way of the future for Data Management.

Author: Gaurav Mangal


Dec 2016/Jan 2017
gmangal@falconrock.com

