What will you learn?
- What Hadoop is and what it is typically used for.
- What Hadoop components are and how to choose the right component for the problem to be solved.
- What Hadoop distributions are.
- How the Hadoop installation process works, what is needed and how long it takes.
- How to upload simple data to a Hadoop cluster.
- Introduction to Hive, Impala, HBase, Kafka and Spark.
- Basic principles for securing Hadoop clusters.
- What analytical tasks can be performed in the Hadoop environment.
- Typical cases of Hadoop use.
- Specific analytical tasks in the Hadoop environment.
- How to use selected analysis tools in the Hadoop environment.
- The business impact of the Hadoop platform, illustrated with selected examples: data archiving, enhanced analytical capabilities, use in Data Warehousing, etc.
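The data-upload step covered above can be sketched with the standard HDFS command-line client. This is a minimal illustration only: the directory path and the file name `sales_2023.csv` are hypothetical, and a running, configured Hadoop cluster is assumed.

```shell
# Assumes a running Hadoop cluster with the `hdfs` client on the PATH.
# Paths and file names below are illustrative.

# Create a target directory in HDFS
hdfs dfs -mkdir -p /user/training/input

# Copy a local file into that HDFS directory
hdfs dfs -put sales_2023.csv /user/training/input/

# List the directory to confirm the upload
hdfs dfs -ls /user/training/input
```

Once the data is in HDFS, tools such as Hive, Impala, or Spark can query it in place.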
The training takes one day.
Who is the training intended for?
- Anyone who wants to learn more about Big Data and explore the topic in greater depth.
- MIS, IS / IT, DW / BI specialists, developers, administrators, analysts, data architects.
The price includes
- Training materials
- Certificate of course completion
EUR per person (without VAT)
Michal Kubica will guide you through the training.