Workshop: Hadoop Essentials

This half-day introductory workshop provides a technical overview of the Apache Hadoop ecosystem. You will be introduced to the core of Hadoop – HDFS for data storage and MapReduce for data processing. Additionally, we will cover important components of the ecosystem – Apache Pig as a scripting language and Apache Hive as an additional data warehouse layer, both built on top of HDFS and MapReduce.
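To give a small taste of what MapReduce code looks like before the labs (this sketch is illustrative only and not part of the official workshop material; it assumes the standard Hadoop 2.x Java MapReduce API, while the labs themselves use the Hortonworks Sandbox), the classic word-count job reads text from HDFS, emits (word, 1) pairs in the map phase and sums them per word in the reduce phase:

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Map phase: split each input line into tokens and emit (word, 1).
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {

    private final static IntWritable one = new IntWritable(1);
    private final Text word = new Text();

    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Reduce phase: sum the counts emitted for each word.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {

    private final IntWritable result = new IntWritable();

    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  // Driver: configure and submit the job; args are HDFS input and output paths.
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

The same kind of query can be expressed far more compactly in Pig Latin or HiveQL, which is one of the main motivations for the higher-level tools covered in the second half of the workshop.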
Prior programming skills (in any language) are a plus, but not strictly required, as the hands-on labs with step-by-step instructions will give even novice users a deeper understanding of Apache Hadoop.
This beginners' workshop will be a 50/50 mix of hands-on technical labs and lectures/discussions, which together provide a good overview of the Hadoop ecosystem.
Participants must provide their own laptop with a minimum of 8 GB of RAM and at least 50 GB of available disk space. We will be using the Hortonworks Sandbox running in VirtualBox. All required software will be provided via USB flash drive during the workshop, but please note that there may be no dedicated broadband connection for this workshop.
Info
- Day: 2013-08-25
- Start time: 10:00
- Duration: 04:00
- Room: C115/Workshops
- Track: Database
- Language: en
Concurrent events
- HS3: The Renaissance of Perl
- HS5: Working with massively distributed database systems
- C119/Hauptkonferenz: Dokumenten-Management mit Alfresco - eine Einführung
- HS1/2: crowdgovernance
- C115/Workshops: Hadoop Essentials
- HS4: SystemTap - Skripten im Kernelspace
- C118/PHP: Symfony 2 Rest Edition
- HS6: Ruby is magic
- C219 (Sa Tine 2.0 / So Fedora): The Cat Is Alive And Running Out Of The Box
Speakers
Uwe Seiler