Are you making the most of your mainframe data?

Mainframe data is big data!

Data Quality Matters

When most people think of legacy software, they think of software that is outdated and due for replacement.

Yet an alternative definition of legacy, particularly when it comes to mainframe applications, is simply software that works.

This is a definition that our partner, Syncsort, is proud of. The legacy DMX Sort product has been helping customers to reduce the cost of running their mainframe for decades.

This legacy, the understanding of how to move vast amounts of data optimally, carries over into Syncsort's line of data integration tools, particularly for moving both logs and data from the IBM mainframe and the IBM i series to advanced analytics platforms like Hadoop and Splunk.

These data integration and change data capture solutions are complemented by the data quality stack, meaning that we don’t just move data efficiently, we ensure its quality as well.
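The idea of pairing movement with quality checks can be pictured as a single pipeline step. The following is a minimal, illustrative sketch in Python, not Syncsort's actual API; the field names and the completeness rule are assumptions made for the example.

```python
# Illustrative only: a toy "move plus quality check" step, not Syncsort's API.
# The field names and the completeness rule are hypothetical assumptions.
import csv
import io

def move_with_quality_check(source_rows, required_fields=("account_id", "amount")):
    """Copy records to a target list, rejecting rows that fail a basic completeness rule."""
    accepted, rejected = [], []
    for row in source_rows:
        if all(row.get(field) not in (None, "") for field in required_fields):
            accepted.append(row)
        else:
            rejected.append(row)
    return accepted, rejected

# Example: rows as they might arrive from a mainframe extract already converted to CSV.
raw = "account_id,amount\n1001,250.00\n1002,\n"
rows = list(csv.DictReader(io.StringIO(raw)))
good, bad = move_with_quality_check(rows)
print(len(good), "accepted;", len(bad), "rejected")
```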

Mainframe data is big data

View original post 155 more words


5 Product Data Levels to Consider

Different kinds of product data can be divided into layered schemas. Product pricing, for example, is usually a subject that belongs mainly to the ERP side of things. But how do you connect the dots and take things to the next level? This write-up throws light on Product Master Data Management.

When talking about Product Master Data Management (Product MDM) and Product Information Management (PIM), I like to divide the different kinds of product data into the schema below:

Five levels

Level 1, Basic Data

At the first level, we find the basic product data that typically is the minimum required for creating a product in any system of record.

Here we find the primary product identification number or code that is the internal key to all other product data structures and transactions related to the product within an organization.

Then there usually is a short product description. This description helps internal employees identify a product and distinguish it from other products. Most often the product is named in the official language of the company.

If an upstream trading partner produces the product, we may find the identification of that supplier here too. If the product is part of internal production, we may…

View original post 872 more words
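The Level 1 basic data described above amounts to a very small record: an internal product key, a short description and possibly a supplier reference. A hedged sketch, with hypothetical field names, might look like this:

```python
# Hypothetical sketch of "Level 1, Basic Data" as a minimal product record.
# Field names are assumptions for illustration, not a prescribed MDM schema.
from dataclasses import dataclass
from typing import Optional

@dataclass
class BasicProductData:
    product_id: str                     # internal key referenced by all other product data
    short_description: str              # helps employees identify and distinguish the product
    supplier_id: Optional[str] = None   # set when an upstream trading partner produces it

item = BasicProductData(product_id="P-000123", short_description="Stainless hex bolt M8")
print(item)
```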

Data-centric approach to enterprise architecture

Data is the key to taking a measured approach to change, rather than a simple, imprudent reaction to an internal or external stimulus. But it is not that simple to uncover the right insights in real time, and how your technology is built can have a very real impact on data discovery. Data architecture and enterprise architecture are linked in responding to change while limiting unintended consequences.

DBTA recently held a webcast featuring Donald Soulsby, vice president of Architecture Strategies at Sandhill Consultants, and Jeffrey Giles, principal architect at Sandhill Consultants, who discussed a data-centric approach to enterprise architecture. Sandhill Consultants is a group of people, products and processes that helps clients build comprehensive data architectures, resulting from a persistent data management process founded on a robust data governance practice and producing trusted, reliable data, according to Soulsby and Giles.

A good architecture for data solutions addresses risk management (strategic, regulatory, media, consumer) and compliance (statutory, supervising body, watchdog, commercial, value chain, professional). Enterprise architecture frameworks start with risk management as their building blocks, Soulsby and Giles said.

A typical model asks what, how, where, when and who. A unified architectural approach asks what, how, where, when, who and why. This type of solution is offered by erwin and is called Enterprise Architecture Prime 6. According to Soulsby and Giles, the platform can achieve compliance, either regulatory or value chain; can limit unintended consequences; and includes risk management for classification, valuation, detection and mitigation.

The erwin and Sandhill Consultants offerings provide a holistic view of governing architectures from an enterprise perspective. This set of solutions provides a strong data foundation across the enterprise to understand the impact of change, reduce risk and achieve compliance, Soulsby and Giles said. An archived on-demand replay of this webinar is available here.

via The Building Blocks of Great Enterprise Architecture for Uncovering Data — Architectural CAD Drawings
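As a rough illustration of the difference between the five-question and six-question views described in the webcast summary, the snippet below models the interrogatives as a simple lookup; the artifact descriptions are assumptions for the example, not the erwin Enterprise Architecture Prime 6 metamodel.

```python
# Illustrative mapping of the six interrogatives to what an architecture model
# answers; the descriptions are assumptions, not a vendor metamodel.
INTERROGATIVES = {
    "what":  "data entities and their classification",
    "how":   "processes and transformations",
    "where": "locations and deployment nodes",
    "when":  "schedules and event triggers",
    "who":   "roles, owners and stewards",
    "why":   "drivers: risk, compliance and value-chain goals",
}

def coverage_gaps(answered):
    """Report which interrogatives a given architecture description still leaves open."""
    return [q for q in INTERROGATIVES if q not in answered]

# A "typical" five-question model leaves the why unanswered.
print(coverage_gaps({"what", "how", "where", "when", "who"}))  # ['why']
```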

Data Architecture in a digital world; empowering the Data Driven Enterprise

To be truly Data Driven, an organisation needs to carry on a Data Management discussion throughout the whole organisation.

Source: Data Architecture in a digital world; empowering the Data Driven Enterprise

Automating Enterprise Architecture

Modern EA practices must involve more stakeholders so that enterprise architects themselves aren't the ones doing each and every task. The combination of smart tooling and collaborative process is the key to success in automating your enterprise architecture practice.

Center Mast Consulting

Delivering faster. Saving money. Building new business capabilities. These are value points that make enterprise architecture more relevant than ever. Yet delivering EA at today’s breakneck pace of business requires automation. Here are two ways to do just that.

EA’s Slow Pace

For many organizations, enterprise architecture is still viewed as a governance process or center of excellence which major projects must be “run through.” This puts enterprise architecture on a project’s critical path, and as a result, enterprise architects must scramble to complete solution architecture, standards reviews, and documentation on time. It is these processes that must be automated in order to deliver EA faster.

Antiquated EA Processes

Let’s start the conversation with process, as that’s where things are inefficient to begin with. If we examine common EA work statistically, it doesn’t take long to see that there are an awful lot of tasks that are repetitive…

View original post 1,014 more words
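One of the repetitive tasks named above, the standards review, is a natural first candidate for automation. The sketch below is purely illustrative; the approved-technology list and the component names are made up for the example.

```python
# A toy automated standards review: check a proposed solution's components
# against an approved-technology list. Both lists are hypothetical examples.
APPROVED = {"postgresql", "kafka", "react", "spring-boot"}

def standards_review(solution_components):
    """Return the components that would need an architecture exception."""
    return sorted(c for c in solution_components if c.lower() not in APPROVED)

exceptions = standards_review(["Kafka", "React", "CouchDB"])
print("Needs review:", exceptions)  # ['CouchDB']
```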

Who is a Data Subject in GDPR

Who is a data subject in GDPR? An identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person. #AbhiSrivastava #GDPRArticle4 #GDPR #DataSubject
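As a rough illustration of that definition, the snippet below flags records that carry direct or indirect identifiers; the field lists are assumptions made for the example, not a legal test of identifiability.

```python
# Illustrative check for whether a record carries identifiers that could make
# a natural person identifiable under the Article 4 definition quoted above.
# The field names are hypothetical; a real assessment needs legal review.
DIRECT_IDENTIFIERS = {"name", "national_id", "email"}
INDIRECT_IDENTIFIERS = {"location_data", "online_identifier", "device_id"}

def describes_data_subject(record: dict) -> bool:
    """True if any populated field is a direct or indirect identifier."""
    present = {key for key, value in record.items() if value}
    return bool(present & (DIRECT_IDENTIFIERS | INDIRECT_IDENTIFIERS))

print(describes_data_subject({"name": "A. Srivastava"}))   # True
print(describes_data_subject({"aggregate_count": 1402}))   # False
```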


Three kinds of MDM Data Model that come with a tool

Differences between an off-the-shelf model, a buildable model and a dynamic model when buying an MDM solution from a vendor. #AbhiSrivastava #MDM #MDMDataModel

Master Data Management (MDM) is a lot about data modelling. When you buy an MDM tool, it will have some implications for your data model. Here are three kinds of data models that may come with a tool:

An off-the-shelf model

This kind is particularly popular with customer and other party master data models. Core party data are pretty much the same for every company. We have national identification numbers, names, addresses, phone numbers and that kind of stuff where you do not have to reinvent the wheel.

Also, with such a model you will have access to rich reference data sources, such as address directories (which you may regard as belonging to a separate location domain), business directories (for example the Dun & Bradstreet Worldbase) and, in some countries, citizen directories as well. MDM tools may come with a model shaped for these sources.

Tools which are optimized for data…

View original post 185 more words
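To make the off-the-shelf party model concrete, here is a hedged sketch of the kind of record such a model typically ships with; the field names are illustrative assumptions, and real vendor models differ in detail.

```python
# Sketch of the kind of party record an off-the-shelf model often ships with.
# Field names are illustrative assumptions; vendor models differ in detail.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class PartyMasterRecord:
    party_id: str                        # internal master key
    legal_name: str = ""
    national_id: Optional[str] = None    # e.g. company registration or citizen number
    addresses: List[str] = field(default_factory=list)      # could link to an address directory
    phone_numbers: List[str] = field(default_factory=list)
    duns_number: Optional[str] = None    # external reference, e.g. Dun & Bradstreet Worldbase

acme = PartyMasterRecord(party_id="PTY-001", legal_name="Acme A/S", duns_number="000000000")
print(acme.legal_name, acme.duns_number)
```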