Speaker "Martin Lurie" Details



The Hadoop Ecosystem - all the major projects


Hadoop Hands-On

This session will introduce you to several use cases for Hadoop. Through these use cases you will understand the capabilities of the major Hadoop projects. Think of it as "speed-dating" with the different components: newcomers will understand the range of capabilities within Hadoop, while those with experience will gain exposure to projects they haven't worked with before.

Prerequisite skills: the ability to enter a command at a bash-shell prompt, or to watch the person next to you enter the command.

Which use cases?
- Political analysis: was Abe Lincoln a "team player"?
- NYSE stock-ticker analysis, including the use of an analytics application
- Labor census analysis and data-warehouse offloading
- Web site interaction profiling
- Email clustering with machine learning
- Mining social media
- Providing the "next-best-offer" based on geolocation

Which Hadoop projects? HDFS, Streaming MapReduce, Hive, Impala, Spark, Search, Pig, Sqoop, Flume, Kafka, Oozie, HBase, and Mahout. You will see sample code from each project, and we will also run some industry-standard Hadoop benchmarks.
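To give a flavor of the sample code the session promises, here is a minimal sketch of the classic Hadoop Streaming word count in Python. This is illustrative only, not the session's actual material: Hadoop Streaming pipes input splits through a mapper script and sorted key/value records through a reducer script over stdin/stdout, and the function names and local chaining below are my assumptions.

```python
#!/usr/bin/env python3
# Sketch of a Hadoop Streaming word-count job. The mapper emits one
# "word\t1" record per word; the reducer sums the counts per word.
# Hadoop sorts the mapper output by key before the reduce phase, which
# is why groupby on consecutive keys is sufficient.
import sys
from itertools import groupby


def mapper(lines):
    """Emit one tab-separated 'word\\t1' record per word in the input."""
    for line in lines:
        for word in line.strip().split():
            yield f"{word}\t1"


def reducer(records):
    """Sum counts for each word, assuming records arrive sorted by key."""
    parsed = (rec.split("\t") for rec in records)
    for word, group in groupby(parsed, key=lambda kv: kv[0]):
        total = sum(int(count) for _, count in group)
        yield f"{word}\t{total}"


if __name__ == "__main__":
    # In a real job, Hadoop runs this script twice (once as mapper, once
    # as reducer) with a shuffle/sort in between; here we chain the two
    # phases locally so the script can be tried on any text file.
    mapped = sorted(mapper(sys.stdin))
    for out in reducer(mapped):
        print(out)
```

Run locally with `cat sample.txt | python3 wordcount.py` to see per-word counts; in a cluster the same script would be passed to the `hadoop jar` streaming command as both the mapper and the reducer program.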


Marty Lurie started his computer career generating chads while attempting to write Fortran on an IBM 1130. His day job is Hadoop Systems Engineering at Cloudera, but if pressed he will admit he mostly plays with computers. His favorite program is the one he wrote to connect his NordicTrack to his laptop (the laptop lost two pounds and lowered its cholesterol by 20%). Marty is a Cloudera Certified Hadoop Administrator, a Cloudera Certified Hadoop Developer, an IBM Certified Advanced WebSphere Administrator, an Informix Certified Professional, a Certified DB2 DBA, a Certified Business Intelligence Solutions Professional, Linux+ certified, and has trained his dog to play basketball. You can contact Marty at