One of the largest logistics companies in the world, in cooperation with Adastra, has used agile methods to build a data warehouse from scratch in a private cloud, based on the CI/CD (Continuous Integration and Continuous Delivery) concept. The data warehouse contains 3.5 billion records, with daily increments of 20 million records for processing, uploaded from 10 source systems. Data and KPI indicators are updated four times a day. Generating a new KPI from the data warehouse and incorporating it into a report was reduced from several days, or even weeks, to only two days. The number of monitored KPIs has increased tenfold since the start of the project.
Data migration from the source systems, a unified approach to the creation of transformation scripts, and logging are all handled by a metadata-driven approach controlled by the DWH Framework, Adastra's own ETL tool, which is fully integrated with Microsoft database technologies (on-premises and cloud). Reports are delivered on the Microsoft Power BI platform.
Solution to a problem
In line with growing customer demand for transport and logistics, one of the largest logistics companies in the world places ever-increasing demands on the speed and efficiency of its processes, including internal IT. The division that covers, among other things, the ticketing and help desk systems for the whole world therefore needed to monitor its own Key Performance Indicators (KPIs), as well as provide corporate KPIs to other departments of the company. The same division also manages the data warehouse that serves as the basis for these KPIs.
Some modifications to the original data warehouse, such as adding new attributes to company employees, were complicated, and their processing and deployment could take up to several days. Frequent errors or issues could not be corrected within the required time frame because they were waiting, for example, for the opinion of a colleague from another department, which delayed the closing of the ticket. The goal was to ensure that changes to the data warehouse and their deployment to the production environment no longer took so long.
The data warehouse stores approx. 3.5 billion records.
Business description of the solution
The company's divisions, together with Adastra, began by upgrading the data model of the original data warehouse. The new data warehouse was created on a cloud platform, drawing on knowledge of and access to data from the original solution, while eliminating the weaknesses and issues that time and specific tasks had revealed in the original solution. The new data warehouse also allows:
- faster data loads from source systems and faster processing within the individual DWH layers
- storage and maintenance of a long data history
- streamlined access to data for reporting in the DWH
Daily increments of processed data reach up to 20 million records from 10 source systems.
Data processing - i.e. the uploading of data from source systems, the transformation and consolidation of data in the core layer of the data warehouse, and its distribution to specific data marts - is provided through the Adastra DWH Framework, which helped to map existing, and set up new, transformation and integration processes using a metadata-driven approach. It also facilitates the standardization and unification of ETL logic (data access, logging, validation, data quality control, etc.). A great benefit of the solution is the ability to (re)generate transformation scripts for all entities of the data warehouse, especially when specific customer requirements need to be reflected in the solution.
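The DWH Framework itself is proprietary, so its API is not shown here; the sketch below is only a hypothetical simplification of the metadata-driven principle. Entity mappings are stored as metadata, and the transformation script for each target table is (re)generated from them, so a changed requirement means editing metadata and regenerating, not hand-editing code. All names (`ENTITY_METADATA`, the entity and columns) are invented for illustration.

```python
# Hypothetical illustration of metadata-driven script generation:
# mappings live in metadata, and the T-SQL MERGE for each DWH entity
# is (re)generated from them instead of being hand-written.

ENTITY_METADATA = {
    "dim_employee": {
        "source": "stg.hr_employees",
        "key": ["employee_id"],
        "columns": {  # target column -> source expression
            "employee_id": "employee_id",
            "full_name": "CONCAT(first_name, ' ', last_name)",
            "department": "department_code",
        },
    },
}

def generate_merge_script(entity: str) -> str:
    """Render a T-SQL MERGE statement for one DWH entity from its metadata."""
    meta = ENTITY_METADATA[entity]
    cols = meta["columns"]
    on = " AND ".join(f"tgt.{k} = src.{k}" for k in meta["key"])
    select = ",\n        ".join(f"{expr} AS {col}" for col, expr in cols.items())
    updates = ",\n        ".join(
        f"tgt.{c} = src.{c}" for c in cols if c not in meta["key"]
    )
    inserts = ", ".join(cols)
    values = ", ".join(f"src.{c}" for c in cols)
    return f"""
    MERGE dwh.{entity} AS tgt
    USING (SELECT
        {select}
    FROM {meta['source']}) AS src
    ON {on}
    WHEN MATCHED THEN UPDATE SET
        {updates}
    WHEN NOT MATCHED THEN INSERT ({inserts}) VALUES ({values});"""

print(generate_merge_script("dim_employee"))
```

Under this scheme, adding a new employee attribute of the kind mentioned above means extending the metadata and regenerating the script, which is what makes such changes fast to deliver.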
This leads to a significant reduction in development time, and thus faster delivery of the required KPIs to business users. Subsequent integration with the Azure DevOps platform enables unified code versioning and clearer, standardized code review, and simplifies the entire deployment process with Continuous Integration and Continuous Deployment (CI/CD) support.
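The source does not describe the pipeline definition or the deployment tooling, so the following is only a plausible sketch: a release step, of the kind an Azure DevOps pipeline might invoke, that publishes a compiled database project (.dacpac) with Microsoft's SqlPackage tool. The dacpac approach and all server and file names are assumptions for illustration.

```python
# Sketch of an automated DWH release step (assumption: dacpac-based
# deployment via Microsoft's SqlPackage, driven from a CI/CD pipeline).
import subprocess
import sys

def deploy_dwh(dacpac_path: str, server: str, database: str) -> None:
    """Publish a compiled database project to the target environment."""
    result = subprocess.run(
        [
            "sqlpackage",
            "/Action:Publish",
            f"/SourceFile:{dacpac_path}",
            f"/TargetServerName:{server}",
            f"/TargetDatabaseName:{database}",
        ],
        capture_output=True,
        text=True,
    )
    if result.returncode != 0:
        # Surface the deployment log so the pipeline marks the release failed.
        print(result.stdout, result.stderr, file=sys.stderr)
        raise RuntimeError(f"Deployment of {database} failed")

if __name__ == "__main__":
    # Hypothetical target names; in a pipeline these come from variables.
    deploy_dwh("bin/Release/dwh.dacpac", "dwh-prod.example.net", "DWH")
```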
Data updates and KPI recalculations are performed in the data warehouse four times a day, reflecting the four main time zones.
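The source does not name the four time zones; as a purely hypothetical illustration, a scheduler could derive the four daily refresh slots like this, with the zones below standing in as placeholders:

```python
# Hypothetical illustration: four daily DWH refresh slots, each aligned
# to the start of business in one of four assumed major time zones.
from datetime import datetime, time
from zoneinfo import ZoneInfo

# The actual zones are not named in the source; these are placeholders.
REFRESH_ZONES = [
    "Asia/Shanghai",
    "Europe/Berlin",
    "America/New_York",
    "America/Los_Angeles",
]

def refresh_slots_utc(run_date: datetime) -> list[datetime]:
    """Return the day's refresh times in UTC: 06:00 local in each zone."""
    slots = []
    for zone in REFRESH_ZONES:
        local = datetime.combine(run_date.date(), time(6, 0),
                                 tzinfo=ZoneInfo(zone))
        slots.append(local.astimezone(ZoneInfo("UTC")))
    return sorted(slots)

for slot in refresh_slots_utc(datetime.now()):
    print(slot.isoformat())
```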
Project outcome
- For data warehouse development and ETL process workflow settings:
  - the workflow of ETL processes is driven by metadata and their dependencies
  - manual code writing is minimised: developers and administrators do not write code directly; they only configure the process
  - as a result, the coding of ETL procedures is unified, standardized and simplified
  - input data is validated automatically, which improves the quality of the data transmitted to the DWH; validation takes both technical and business rules into account (see the sketch after this list)
  - the execution of ETL processes is monitored: the DWH administrator receives a notification, e-mail or report if an ETL process fails
- For the deployment of changes to target environments: the process is initiated automatically, reducing the need for manual processing and involvement in the deployment process
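The framework's own validation API is not public, so the following is only a minimal sketch of the idea, with rules and data invented for illustration: each incoming batch is checked against technical rules (presence, types) and business rules (domain constraints agreed with the business), and any failures would be raised to the DWH administrator.

```python
# Minimal sketch of automated input validation (illustrative rules and
# data; the real framework's API is not shown in the source).
from collections.abc import Callable

Row = dict[str, object]
Rule = tuple[str, Callable[[Row], bool]]

# Technical rules: structural integrity of the incoming record.
TECHNICAL_RULES: list[Rule] = [
    ("employee_id is present", lambda r: r.get("employee_id") is not None),
    ("hours is numeric", lambda r: isinstance(r.get("hours"), (int, float))),
]

# Business rules: domain constraints agreed with the business users.
BUSINESS_RULES: list[Rule] = [
    ("hours within 0-24", lambda r: 0 <= r["hours"] <= 24),
]

def validate_batch(rows: list[Row]) -> list[str]:
    """Run all rules over a batch; return human-readable failures."""
    failures = []
    for i, row in enumerate(rows):
        for name, check in TECHNICAL_RULES + BUSINESS_RULES:
            try:
                ok = check(row)
            except (KeyError, TypeError):
                ok = False
            if not ok:
                failures.append(f"row {i}: failed rule '{name}'")
    return failures

errors = validate_batch([{"employee_id": 42, "hours": 8},
                         {"employee_id": None, "hours": 30}])
if errors:
    # In production this would trigger the notification (e-mail/report)
    # to the DWH administrator described above.
    print("\n".join(errors))
```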
The ratio of manual to automated processing, and the percentage of time saved by ETL specialists when deploying changes to target environments after the process modifications, is shown in the following table:
The delivery of a new KPI from the data warehouse to the report was reduced from several days to 2 days.
| Type | Step | Release 5.1 [min] | Release 5.2 [min] | Release 5.5 [min] |
|---|---|---|---|---|
| Manual | Pre-Deployment scripts | — | — | 2 |
| Process | DWH build | 7 | 7 | 7 |
| Process | DWH release | 70 | 14 | 11 |
| Manual | Post-Deployment scripts | 10 | 8 | 1 |
| Manual | Release check | — | 5 | 5 |
| Process | OLAP release | — | — | 55 |
| | Total | 87 | 34 | 81 |
| | Participation | 11% | 38% | 10% |
Legend:
Manual - performed 'manually' by the responsible person
Process - performed 'automatically' on the basis of a pre-prepared process
Total - total time spent deploying DWH changes to the target environment
Participation - the percentage of the total time that the responsible person actively spends implementing changes to the target environment
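The Participation row follows directly from the table: it is the manual time divided by the total time of the release. A quick check reproduces the published figures:

```python
# Reproduce the Total and Participation rows from the release table.
# Per-release step times in minutes, taken from the table above.
releases = {
    "5.1": {"manual": [10], "process": [7, 70]},
    "5.2": {"manual": [8, 5], "process": [7, 14]},
    "5.5": {"manual": [2, 1, 5], "process": [7, 11, 55]},
}

for name, steps in releases.items():
    manual = sum(steps["manual"])
    total = manual + sum(steps["process"])
    print(f"Release {name}: total {total} min, "
          f"participation {manual / total:.0%}")
# Release 5.1: total 87 min, participation 11%
# Release 5.2: total 34 min, participation 38%
# Release 5.5: total 81 min, participation 10%
```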
Requests to deliver a new KPI are now implemented much faster than in the original solution, usually within a few days. Thanks to agile management and to dividing complex requirements into smaller sub-tasks, it is possible to respond immediately to user change requests and continuously incorporate them into the final solution.
At the start of the project, we defined 5 KPIs. We now monitor 50, i.e. ten times as many.
Who participated in the project
Our excellent reputation is built by competent people who bring our projects to a successful end. In this case, the client relied on these consultants.
Josef Pinkr
Consultant