David Jez Consultant

David Jez

I'm a Data Strategy Consultant

About Me


I am David Jez, Data Strategy & Business Intelligence Consultant from Prague, Czech Republic.

I have a strong background in data warehousing and Business Intelligence development, with a further focus on related data management topics such as Data Governance, Privacy, and Security.

I help companies define their data strategy and Business Intelligence framework.

Development

Data Warehousing

Data Management

Leadership

Services


Data Warehouse Design

Design of a data platform that integrates data from multiple sources to support analytical reporting and data analysis.



End-to-End Reporting Solution

Development of a complete, end-to-end reporting solution, from data ingestion to the final reporting and analytical layer.


Data Privacy Consultancy

Consultancy with a focus on the proper handling of data and compliance with data protection regulations.




Data Strategy Definition

Defining a data strategy that helps you make informed decisions based on your data while keeping it safe and compliant.


Data Analysis

Service related to cleaning, transforming, and modeling data to discover useful information for business decision-making.


Master Data Management

Helping organizations define and manage their critical data and set up MDM processes within the company.

Looking for a custom job? Click here to contact me! 👋

Recent Projects

Design & Architecture of Enterprise Data Warehouse

At Avast, I was the owner and leader of the global Enterprise Data Warehouse capability, responsible for the design, management, and maintenance of the internal Data Warehouse. I was also actively involved in the development of the DWH itself; for example, I developed various ETLs and owned several business-critical projects.

My other duties included:

  • defining the development standards,
  • agreeing on what data to process and make available to end users,
  • determining the minimum quality criteria for data made available to end users, and how to measure and analyze those criteria against the quality of the source data feeding the warehouse,
  • determining the data retention policy,
  • designing a data access policy,
  • designing a data backup strategy,
  • monitoring and reporting on data usage & activity.

The data warehouse was first built on MS SQL Server and later on Greenplum, a data platform based on the PostgreSQL core.

Customer 360 Project Architecture & Development

I worked on the Customer C360 project, where the goal was to unify all information about end customers and increase the level of analytical capabilities of the entire company. Apart from the development itself, I was responsible for the overall architecture of the solution.

The main idea behind Customer 360 is to combine all structured and unstructured data about every customer in your company to create a complete and accurate picture of every customer. This integrated knowledge enables you to create great customer experience, personalize customer interactions, and build greater customer insights.


The problem is that customer data is stored in different databases across sales channels (e.g. store, internet, social media), geographic regions, and product lines. Duplicates and discrepancies are unavoidable when these systems are not synchronized.

The C360 solution acts as a hub for connecting and synchronizing all of the customer information. It becomes the reference source for finding the most up-to-date information; many people refer to this as the customer's "single source of truth." Data can be de-duplicated, aggregated, analyzed, and displayed on demand.

Having all the data about your customers in one place provides powerful analytical capabilities, from simple data queries to more advanced data science such as machine learning.

The biggest challenge when developing the Customer 360 is always to correctly identify all of the records belonging to a particular customer because of inconsistencies across databases and poor data quality.
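The record-matching problem described above is usually attacked with a mix of exact keys and fuzzy comparison. The sketch below is purely illustrative (the field names, threshold, and similarity measure are my assumptions, not the actual C360 implementation): match on a normalized email when one exists, otherwise fall back to name similarity.

```python
import difflib

def normalize(record):
    """Normalize the fields used for matching (illustrative only)."""
    return {
        "email": record.get("email", "").strip().lower(),
        "name": " ".join(record.get("name", "").lower().split()),
    }

def same_customer(a, b, threshold=0.85):
    """Heuristic match: an exact email wins, otherwise fuzzy name similarity."""
    a, b = normalize(a), normalize(b)
    if a["email"] and a["email"] == b["email"]:
        return True
    ratio = difflib.SequenceMatcher(None, a["name"], b["name"]).ratio()
    return ratio >= threshold

# Two records from different source systems referring to one customer
crm = {"name": "Jane  Doe", "email": "Jane.Doe@example.com"}
shop = {"name": "jane doe", "email": "jane.doe@example.com"}
print(same_customer(crm, shop))  # True
```

A production pipeline would add blocking (only comparing candidate pairs that share some key) so matching scales beyond a pairwise comparison of all records.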

GDPR Compliance Data Integration

Like every other company on the European market processing personal data of EU citizens, Avast was affected by the GDPR and had to implement internal processes that comply with the regulation.

One of the processes was to integrate individual customer data (PII) on one common data platform so that we could provide this data to customers upon request. That is the process I was responsible for: the integration of the customer data, which involved developing individual ETLs. Once done, the data was displayed on a public GDPR portal owned by the company.

KPI Data Model Development

I was involved in the development of KPI reporting, where I was responsible for the entire database layer of the solution. The output of this reporting served both top management in decision making and external reporting to the stock exchange.

Work on the database layer involved integrating various data sources within DWH and developing individual ETLs and automating them.

The solution was built on the MS SQL Server platform including automation using a custom solution that I was the exclusive author of.

MDM/RDM Design

  • Client: Avast

MDM/RDM Design & Implementation

I was involved in the design and implementation of the MDM solution due to my position as the data warehouse owner and owner of the data mastering process.

The overall project included defining the different areas of master data, data quality management, data profiling, data acquisition and data merging.

Master data management is a method used to define and manage an organization's key data in order to provide, along with data integration, a single point of reference for other data consumers. Poor master data quality can affect an enterprise in several ways. It can impact business transparency, thereby compromising effective management, or it can slow down an organization's processes by requiring additional manual solutions and steps.

Data Warehouse Migration

  • Client: Avast

Data Warehouse Migration

As the owner of the data warehouse, I was responsible for migrating all of its content from one platform to another.

Over time, the client decided to migrate all content from the MS SQL Server environment to Greenplum.

The migration involved not only moving the data, but also rewriting all the data pipelines (ETLs). In addition, the individual analytical models that were built on top of the data warehouse needed to be rebuilt.

This project involved migrating roughly:

  • 10 business-critical projects,
  • 90 databases,
  • 4 TB of data,
  • thousands of stored procedures and entities.

The project was very demanding in terms of communication with individual stakeholders and also in terms of planning the individual steps of the migration.

Despite the complexity of the whole project, the migration ended successfully in the end.

Data Security Measures Design

I created a policy and process for granting and controlling access rights to individual data objects, which governed access to both the data warehouse and the data lake. For these purposes, I introduced the RBAC (Role-Based Access Control) model in the company, replacing the original system of assigning permissions to individual users.
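The core idea of RBAC is that permissions attach to roles, not users, and a user inherits whatever their roles grant. A minimal sketch of that lookup, with entirely hypothetical role, user, and object names:

```python
# Minimal role-based access control sketch (all names are illustrative).
# Grants attach to roles; users only ever receive roles.
ROLE_GRANTS = {
    "analyst":  {"dwh.sales": {"SELECT"}},
    "engineer": {"dwh.sales": {"SELECT", "INSERT", "UPDATE"}},
}
USER_ROLES = {"alice": {"analyst"}, "bob": {"engineer"}}

def is_allowed(user, obj, action):
    """A user may perform an action on an object if any of their roles grants it."""
    return any(
        action in ROLE_GRANTS.get(role, {}).get(obj, set())
        for role in USER_ROLES.get(user, set())
    )

print(is_allowed("alice", "dwh.sales", "SELECT"))  # True
print(is_allowed("alice", "dwh.sales", "INSERT"))  # False
```

In a database such as Greenplum or MS SQL Server the same idea maps onto native roles and GRANT statements; revoking access then means removing a role membership rather than touching dozens of per-user grants.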

In addition, I defined methods of data protection in the data warehouse, from data masking to replacing all PII data with hashes.
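The two protection methods mentioned above differ in intent: a keyed hash keeps values joinable across tables without exposing the original, while masking hides the identity but preserves the shape of the value. A sketch under my own assumptions (the key handling and masking format are illustrative, not the actual company implementation):

```python
import hashlib
import hmac

SECRET_KEY = b"rotate-me"  # illustrative; in practice loaded from a secret store

def pseudonymize(pii_value: str) -> str:
    """Replace a PII value with a keyed hash: the same input always yields the
    same token (so joins still work), but the original cannot be recovered."""
    return hmac.new(SECRET_KEY, pii_value.strip().lower().encode(),
                    hashlib.sha256).hexdigest()

def mask_email(email: str) -> str:
    """Masking variant: keep enough shape for debugging, hide the identity."""
    local, _, domain = email.partition("@")
    return (local[0] + "***@" + domain) if local else email

print(mask_email("jane.doe@example.com"))  # j***@example.com
```

A plain unkeyed hash of an email is vulnerable to dictionary attacks, which is why the keyed (HMAC) variant is sketched here.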

Last but not least, I participated in defining the process of data retention management in the company's analytical environments.

Management of Data Engineering Team

I managed a team of 7 data engineers where my main responsibilities included:

  • Managing the day-to-day activities of the team.
  • Motivating the team to achieve organizational goals.
  • Developing and implementing a timeline to achieve targets.
  • Delegating tasks to team members.
  • Conducting training of team members to maximize their potential.
  • Empowering team members with skills to improve their confidence, product knowledge, and communication skills.
  • Conducting quarterly performance reviews.
  • Contributing to the growth of the company through a successful team.
  • Creating a pleasant working environment that inspires the team.

Additionally, I worked on many other projects where I also managed colleagues from other teams within the project team. These project teams had around ten members on average.

Experience

2019 - Present

Data Engineering Manager

  • responsible for the design, management and maintenance of the internal Data Warehouse,
  • development of various ETLs, data pipelines and internal tools in order to automate data flows and administrative tasks,
  • data governance, MDM, data privacy & security management (GDPR),
  • set team goals, delegate tasks and set deadlines, oversee the day-to-day team's operation.
2017 - 2019

Principal BI/DWH Developer

  • Data Warehouse development on Microsoft platform (MS BI Full Stack),
  • responsibility for design, management and maintenance of Data Warehouse,
  • data analysis & ETL development,
  • Master Data Management process ownership,
  • setting development standards for the local Data Warehouse.
2015 - 2017

Business Data Analyst

  • data warehouse development,
  • ETL development,
  • data analysis,
  • creation, automation and maintenance of the regular reports,
  • forecasting and planning responsibility.
More information can be found on my LinkedIn profile.

Education

2020 - 2021

Master of Business Administration

Data & Business, University of Economics
Specialization: Data & Analytics for Business Management

2012 - 2015

Master's Degree

University of Economics
Specialization: Information Systems and Technologies
Minor Specialization: Project Management

2009 - 2012

Bachelor's Degree

University of Economics
Specialization: Informatics and Statistics

Get In Touch

Let's talk about everything!

Don't like forms? Get in touch with me on LinkedIn 👋