Cloud Native Data Pipelines for Global Supply Chains | D2iQ
Orchestrating global supply chain operations using cloud-agnostic microservices running on DC/OS.
With online shopping continuing to grow by 15 to 20 percent annually, retailers worldwide are increasingly challenged by the complexities of getting products where they need to go. From production all the way to last-mile delivery at the customer's doorstep, there are a multitude of supply chain challenges companies must conquer if they hope to create repeat customers.
Many businesses turn to Elementum, a leading supply chain management company, to orchestrate their global supply chain operations and help them meet their customers' instant gratification expectations.
Elementum's Product Graph maps the global $25 trillion product economy to shed light on the flow of goods around the world and facilitate cross-ecosystem incident management to ensure products are available at the right place, time, quantity and cost. One key to Elementum's success: the deployment of a microservices model on a cloud-based platform.
Daniel Avila, the engineering DevOps lead at Elementum, has the daunting task of building the technological base for the company's massive global data collection and sharing operation. His first task upon arrival two years ago was to break free from the company's monolithic app-based platform and move future development to a more flexible and easier-to-scale microservices model.
"We collect and consolidate data from all sorts of sources including IoT devices and third parties, and it was clear that to get the enterprise solution we needed, our best shot would be with Mesosphere DC/OS," Avila said.
In Elementum's evolving cloud-based, cloud-agnostic stack, legacy applications are being migrated to DC/OS. This move has prompted Avila to say, half-jokingly, that he's aiming for "one ring to rule them all": a data platform that will work well with whatever cloud infrastructure a large client already has in place and wants to continue using.
"We are 100 percent cloud," said Avila. "We don't even have a real data center."
With components such as Apache Kafka, Elasticsearch, Redis, and DSE in the mix, Elementum is able to ingest and process streaming data with near-real-time updates, which can be shared with customers on elegant dashboards with minimal overhead.
"The most challenging thing for us is creating a standard way to represent the data," said Avila. "Doing a custom development project for every customer just isn't scalable."
Avila hopes to continue to move workloads, data pipelines, and data ingestion operations onto DC/OS and get involved with machine learning as well. The overarching goal is to make sure Elementum clients can see, analyze, manage, and optimize their entire supply chains and delivery logistics so their products get made, boxed, shipped, and delivered to the doorstep without hitting any potholes along the way.
Achieving that goal means keeping a laser-like focus on cloud technology and cloud agnosticism while plotting a path toward a multi-cloud implementation that promises scalability, flexibility, risk mitigation through redundancy and, perhaps most importantly, the kind of adaptability that's vital for success in such a fast-paced, ever-changing industry.