Speaker: John Mertic



Apache Hadoop is Retro: Unlocking Business Value


In 2006, Apache Hadoop was a small project deployed on 20 machines at Yahoo; by 2010 it was running on 45,000 machines and had truly become the backbone of Yahoo's data infrastructure. One would think that by 2016 Apache Hadoop would be the backbone of data infrastructure for all enterprises, but widespread adoption has been shockingly low. Apache Hadoop and Big Data proponents recognize that this technology has not achieved its game-changing business potential.

Gartner puts it well: "Despite considerable hype and reported successes for early adopters, 54 percent of survey respondents report no plans to invest [in Hadoop] at this time, while only 18 percent have plans to invest in Hadoop over the next two years," said Nick Heudecker, research director at Gartner. "Furthermore, the early adopters don't appear to be championing for substantial Hadoop adoption over the next 24 months; in fact, there are fewer who plan to begin in the next two years than already have." - Gartner Survey Highlights Challenges to Hadoop Adoption

While Hadoop has proven a popular platform among developers who need a technology that can power large, complex applications, the rapid, and often healthy, innovation happening across Hadoop components and Hadoop distros can also slow big data ecosystem development and limit adoption. In this presentation, John Mertic, director of program management for ODPi at The Linux Foundation, will discuss new developments that help unlock more business value for Apache Hadoop initiatives. In ODPi's view, the industry needs more open source-based big data technologies and standards so that application developers and enterprises can more easily build data-driven applications. This includes standardizing the commodity components of a Hadoop distribution to spur the creation of more applications, which is a boost for the entire ecosystem.

What's the takeaway for the audience?
Attendees will learn:
- Why widespread adoption of Apache Hadoop in the enterprise has been low
- New developments enabling increased business value for Apache Hadoop initiatives
- The need for standardizing the commodity components of a Hadoop distribution
- The need for a common platform against which to certify apps, reducing the complexities of interoperability
- The benefits of compatibility and standardization across distribution and application offerings for management and integration


John Mertic is Director of Program Management for ODPi and the Open Mainframe Project at The Linux Foundation. Previously, Mertic was director of business development software alliances at Bitnami. Mertic comes from a PHP and open source background, having been a developer, evangelist, and partnership leader at SugarCRM, a board member at OW2, president of OpenSocial, and a frequent conference speaker around the world. An avid writer, Mertic has published articles on IBM DeveloperWorks, Apple Developer Connection, and PHP Architect, and authored the books The Definitive Guide to SugarCRM: Better Business Applications and Building on SugarCRM. He recently keynoted at Apache: Big Data North America 2016.