How Disney Built a Pipeline for Streaming Analytics (Posted on May 30, 2018)

The explosion of on-demand video content is having a huge impact on how we watch television. You can now binge-watch an entire season's worth of Grey's Anatomy in one sitting, if that suits your fancy. For a media giant like the Walt Disney Company, streaming video provides a great opportunity to engage with customers in new ways, but it also presents formidable technical obstacles.

Disney ABC Television Group (DATG) is the Burbank, California-based television arm of the global media conglomerate. With more than 7,000 employees, DATG is responsible for producing and delivering content across the ABC Television Network, including ABC News, ABC Entertainment, Disney Channels Worldwide, and others.

In recent years, the company has developed platforms that allow it to bypass traditional cable, satellite, and over-the-air broadcasts and deliver streaming television content over the Internet. Depending on where customers are, they can view current, archived, or even live ABC TV shows via Web browsers, apps on smart mobile devices, and streaming boxes from Roku.

Delivering that much content around the globe in a reliable manner is a big task in and of itself. But beyond the challenges inherent in building a content delivery network (CDN) are business-oriented questions that every advertising-supported media company needs answered, such as "Who is watching my show?" "Are they seeing my ads?" and "How can I get them to watch more?"

Adam Ahringer, the manager of software data engineering at DATG, shared some details about how the company went about answering that type of question during a session at the recent Strata Data Conference in San Jose, California. One thing was readily apparent right from the outset: The old mechanism for Web analytics would no longer cut it.

“Omniture data was proving to not be sufficient,” Ahringer said. “For a long time, that was the standard way people instrumented their applications or website. We do get some decent analytics from it, but they’re really not timely and they’re very difficult to make changes to.”

Streaming Architecture

To answer age-old media questions in the new streaming landscape, the company decided to build its own real-time data analytics pipeline in the Amazon Web Services cloud. The pipeline would be architected to collect event data from all streaming content endpoints and feed it back into a central warehouse for analysis.
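To make the idea of "collecting event data from streaming endpoints" concrete, here is a minimal sketch of the kind of playback event a client might emit into such a pipeline. The field names, event types, and newline-delimited JSON batching are illustrative assumptions, not Disney's actual schema or wire format.

```python
import json
import time
import uuid

def build_playback_event(user_id, content_id, event_type, position_seconds):
    """Assemble one hypothetical analytics event as a JSON-serializable dict.

    All field names here are assumptions for illustration only.
    """
    return {
        "event_id": str(uuid.uuid4()),         # unique id, useful for deduplication
        "timestamp": int(time.time() * 1000),  # client-side epoch milliseconds
        "user_id": user_id,
        "content_id": content_id,
        "event_type": event_type,              # e.g. "play", "pause", "ad_view"
        "position_seconds": position_seconds,  # playback position when event fired
    }

def serialize_batch(events):
    """Serialize a batch as newline-delimited JSON, a common wire format
    for shipping events to a cloud ingestion service."""
    return "\n".join(json.dumps(e, sort_keys=True) for e in events)

# Example: two events from one viewing session, ready to send to an
# ingestion endpoint (e.g. a managed streaming service in AWS).
batch = serialize_batch([
    build_playback_event("u-123", "show-s14e01", "play", 0),
    build_playback_event("u-123", "show-s14e01", "ad_view", 312),
])
```

In a real deployment the batch would be posted to a managed ingestion service rather than kept in memory; the point of the sketch is that each endpoint emits small, uniformly structured events that a central warehouse can later aggregate.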

Ahringer and his team put a lot of thought into how to architect the new pipeline. One of the core design principles revolved around the fact that they wanted to collect as much event data as was practicable.