Data Management Strategy
Understand how your current data is captured, stored, manipulated, and used, and how it may need to change in order to be used effectively for analytics.
Data Requirements for Analytics:
Data is at the heart of every analytics program. We will work with the relevant stakeholders to identify the data requirements for your solution. The primary input to data requirements identification will be the business use cases. To realize the vision of the initiative, it is important to have access to relevant, complete data at the right level of granularity and with an acceptable level of quality.
Data Management Current State Assessment
Defining the current state of the analytic infrastructure at your organization is an important step. The information gained during this process contributes to many other strategic endeavors. Once the analytics platform architecture is defined, understanding how these technologies are currently used is necessary to identify any technology gaps in supporting analytics.
Data Gap Analysis
During this assessment, we identify the data subject areas currently captured and any data gaps that would prevent the identified use cases from being supported. Any active projects within the organization, as well as planned future projects to capture and manage data, need to be aligned and prioritized against the maintenance and reliability analytics roadmap.
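At its core, this kind of gap analysis compares the subject areas and data elements the use cases require against what is currently captured. A minimal sketch in Python, using entirely hypothetical subject area and element names:

```python
# Hypothetical data gap analysis: subject areas required by the prioritized
# use cases are compared against those currently captured. All names below
# are illustrative, not drawn from any real system.

required = {
    "work_orders": {"asset_id", "failure_code", "completion_date"},
    "asset_master": {"asset_id", "location", "criticality"},
    "sensor_readings": {"asset_id", "timestamp", "vibration"},
}

captured = {
    "work_orders": {"asset_id", "completion_date"},
    "asset_master": {"asset_id", "location", "criticality"},
}

def find_gaps(required, captured):
    """Return the missing data elements per required subject area."""
    gaps = {}
    for area, elements in required.items():
        have = captured.get(area, set())
        missing = elements - have
        if missing:
            gaps[area] = sorted(missing)
    return gaps

print(find_gaps(required, captured))
```

Here the analysis would flag that work order failure codes are not captured and that sensor readings are missing entirely, which then feeds project prioritization.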
Master Data Management
Many organizations become overwhelmed by the ever-increasing volumes of data being generated. To overcome this, we help implement proven and effective master data management processes. This exercise consists of the following: identifying the master data sources and their relationships, identifying the source systems of record for master data, defining business rules, and determining data lineage and history requirements.
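The outputs of that exercise can be captured in a simple master data registry. A minimal sketch, with hypothetical domain names, systems, and rules:

```python
from dataclasses import dataclass, field

# Illustrative master data registry capturing the outputs of the exercise
# above: sources, systems of record, business rules, and lineage/history
# requirements. All names are hypothetical.

@dataclass
class MasterDataSource:
    name: str                   # master data domain, e.g. "asset"
    system_of_record: str       # authoritative source system
    business_rules: list = field(default_factory=list)
    keep_history: bool = False  # lineage / history requirement
    related_to: list = field(default_factory=list)

registry = [
    MasterDataSource(
        name="asset",
        system_of_record="EAM",
        business_rules=["asset_id must be unique", "criticality in {A, B, C}"],
        keep_history=True,
        related_to=["location", "work_order"],
    ),
    MasterDataSource(name="location", system_of_record="ERP"),
]

# Quick lookup: authoritative system per master data domain.
sor = {s.name: s.system_of_record for s in registry}
```

Even a lightweight registry like this makes the relationships and ownership of master data explicit, which is a prerequisite for governing it.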
Source Systems Analysis
Closing any current data gaps will require a high-level analysis of source systems to understand the data subject areas housed in each. This includes data in internal systems as well as external systems that support your analytics initiative.
Data Quality Management
Data quality is one of the most important constituents of data management. To deliver accurate and usable customer intelligence, the underlying data must be of good quality.
There are several integration points, from the time data is extracted from the source all the way to the point of published results, where data can be cleansed. Managing the data quality lifecycle needs to be a repeatable and sustainable process. During this stage, we will outline the process and tools required for data quality management.
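To make such checks repeatable, each quality dimension can be expressed as a small, reusable rule applied at an integration point. A hypothetical sketch, with illustrative rule names, fields, and records:

```python
# Minimal, hypothetical data quality checks applied at an integration point
# (e.g. after extraction, before loading). Fields and values are illustrative.

def check_completeness(rows, field):
    """Fraction of rows where `field` is present and non-empty."""
    if not rows:
        return 1.0
    ok = sum(1 for r in rows if r.get(field) not in (None, ""))
    return ok / len(rows)

def check_validity(rows, field, allowed):
    """Fraction of rows where `field` falls within an allowed set."""
    if not rows:
        return 1.0
    ok = sum(1 for r in rows if r.get(field) in allowed)
    return ok / len(rows)

rows = [
    {"asset_id": "A1", "status": "active"},
    {"asset_id": "",   "status": "retired"},
    {"asset_id": "A3", "status": "unknown"},
]

scores = {
    "asset_id_completeness": check_completeness(rows, "asset_id"),
    "status_validity": check_validity(rows, "status", {"active", "retired"}),
}
# Each score can then be compared against an agreed quality threshold.
```

Running the same rules at every integration point, and tracking scores over time, is what turns cleansing from a one-off effort into a sustainable process.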
Data Modeling
Whether you are implementing a traditional Enterprise Data Warehouse architecture, a Big Data architecture, or a hybrid of the two, proper data modeling is important for long-term success.
We also bring several internally developed logical data models (LDMs), built on our industry and analytics experience, to our engagements and customize them to client needs. The LDM serves as an accelerator during implementation. Using these proven, pre-established data models not only reduces the overall cost of the project but, equally importantly, speeds up the delivery of business value.
An LDM consists of key data subject areas (DSAs) and data entities, down to the data elements in each entity. A DSA is a logical grouping of data that relates to an area of the business, with its associated data identified during business process map development. The LDM defines the relationships across entities, which consist of dimensions and facts. Data must be at the right level of granularity so that entities can be linked to support the use cases.
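The structure described above can be sketched as a small LDM fragment: dimension and fact entities whose keys define both the relationships and the grain of the fact. All entity and element names below are hypothetical:

```python
# Hypothetical LDM fragment for one data subject area. Dimension keys
# define the relationships; the fact's grain keys define its granularity.

ldm = {
    "dim_asset": {
        "type": "dimension",
        "key": "asset_id",
        "elements": ["asset_id", "asset_type", "location", "criticality"],
    },
    "dim_date": {
        "type": "dimension",
        "key": "date_id",
        "elements": ["date_id", "day", "month", "year"],
    },
    "fact_maintenance": {
        "type": "fact",
        # Grain: one row per asset per day.
        "grain": ["asset_id", "date_id"],
        "measures": ["downtime_hours", "repair_cost"],
    },
}

def referenced_dimensions(ldm, fact_name):
    """Resolve a fact's grain keys to the dimensions they link to."""
    keys = set(ldm[fact_name]["grain"])
    return sorted(
        name for name, entity in ldm.items()
        if entity["type"] == "dimension" and entity["key"] in keys
    )

print(referenced_dimensions(ldm, "fact_maintenance"))
```

Modeling the grain explicitly, as in the fact entity here, is what ensures the data is captured at the level of granularity the use cases require.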
The LDM engagement is scoped so that the DSAs and entities in the LDM are limited to what is needed to support the identified and prioritized use cases. The LDM will be designed to be modular, scalable, and extensible.