To realize the full potential of artificial intelligence (AI), organizations are increasingly moving AI from experimentation to production. Doing so requires developers to efficiently apply deep learning technologies (such as computer vision, NLP, and neural recommendation models) to production data pipelines. To address this challenge, we have developed new open source technologies that unify data analytics and AI into an integrated workflow: BigDL, a distributed deep learning framework for Apache Spark, and Analytics Zoo, a unified analytics + AI platform for distributed TensorFlow, Keras, PyTorch, and BigDL on Apache Spark. This talk provides an overview of BigDL and Analytics Zoo and presents the underlying distributed algorithms. More importantly, it shows how to build and productionize end-to-end deep learning pipelines for big data, drawing on real-world use cases from Microsoft Azure, JD.com, CERN, and Midea/KUKA.