Next Generation Integration Strategies and the End of EAI/ETL Templates

The software and integration community is making a significant shift in how integration and process intelligence fit within the enterprise architecture framework. Traditionally, integration logic and requirements such as security checks, routing, business flow, validation and transformation have lived within the integration transport layer itself. This approach produced monolithic integration layers that could not scale because of their many point-to-point connections. The tools involved often forced reliance on independent third-party integration vendors who had no vested interest in the processes they were attempting to orchestrate. Since the source and destination software vendors providing the data are best positioned to understand its security, routing and flow requirements, support for the smart endpoint is growing rapidly.

The use of smart endpoints provides an answer to the technical and manpower overhead incurred by monolithic integration technologies. This concept moves vendor-specific business and transformation/mapping logic into smart endpoint adapters attached to the application architecture itself, relying on the vendor's understanding of how and when it needs data and workflow. These services are paired with a robust communication method that provides delivery and queuing of requests. Following this trend, the more discrete the endpoint (i.e., a microservice), the closer the application gets to a true componentized, service-based solution.
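As a minimal sketch of this division of labor, the adapter below owns validation and transformation while a plain queue stands in for the durable transport. All names here (OrderAdapter, validate, transform, the message fields) are illustrative assumptions, not a real product API:

```python
# Sketch: a "smart endpoint" adapter consuming from a "dumb" pipe.
# The pipe only queues and delivers; business rules live in the adapter.
import json
import queue

pipe = queue.Queue()  # stand-in transport: delivery and queuing only


class OrderAdapter:
    """Vendor-supplied endpoint: owns validation, transformation, routing."""

    def validate(self, msg):
        # The endpoint, not the pipe, knows what a valid order looks like.
        return "order_id" in msg and msg.get("qty", 0) > 0

    def transform(self, msg):
        # Map the wire format into the application's internal model.
        return {"id": msg["order_id"], "quantity": msg["qty"]}

    def consume(self):
        raw = pipe.get()              # the pipe just hands over bytes
        msg = json.loads(raw)
        if not self.validate(msg):    # business rules applied at the edge
            return None
        record = self.transform(msg)
        pipe.task_done()              # acknowledge successful handling
        return record


pipe.put(json.dumps({"order_id": "A-100", "qty": 3}))
print(OrderAdapter().consume())       # {'id': 'A-100', 'quantity': 3}
```

In a production setting the in-memory queue would be replaced by a durable broker, but the boundary stays the same: the transport never inspects the payload.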

Application endpoints focus on producing and consuming messages, while a transport layer (or “pipe”) is the inter-service communication layer. This satisfies the primary objective of pushing the integration intelligence requirements to the vendor that best understands the need while requiring them, rather than the customer, to maintain the functionality of the data/process endpoint.

This also avoids the inevitable future state of EAI-based "smart integrations," often delivered as templates, that become systems in their own right. These systems frequently become deeply entrenched in an organization's infrastructure and are very difficult to remove once the technology is outdated. The transformation from simple "integrations" into a full-fledged critical system is rarely planned. Over time these systems grow in complexity and scope while depending on proprietary maintenance and development protocols. If an unplanned mission-critical system sounds risky and expensive, keep reading.

The History of the Smart Endpoint

The concept of the smart endpoint was born, in part, from too much dependence on integration vendors to fill functional solution gaps and supply the business logic that orchestrates the overall business process between applications. Integration developers were encouraged to put more and more logic and intelligence into the transport-layer infrastructure (the pipes), and less and less into the services and applications (the endpoints) that consume them.

As these solutions grew, certain flaws became evident:

  • These systems often relied on point-to-point integration patterns that grew larger and more complex over time, which often resulted in over-reliance on the integration vendor.
  • Attempts to alleviate point-to-point integration by moving to hub-and-spoke models (i.e., data repositories) simply pushed the complexity into data marts, making them the bottleneck for performance and scale.
  • Changes in the endpoint solutions required very large-scale efforts to adapt and modify the many integrations layered through the communication infrastructure.

So, is this Smart Endpoints – Dumb Pipes?

There are key capabilities required for an endpoint component within this framework to be considered smart. The endpoint must provide security checks, routing, business flow, validation and transformation. However, this does not mean the pipe itself must be dumb. The pipe provides critical capabilities that make this concept work, namely persistence and delivery.

Persistence elevates the pipe above a mere transport mechanism. Objectively the pipe is dumb in that it doesn't understand the context of what it's carrying, but it's smart about managing those messages. The idea is to write once, read once, and delete only when the endpoint acknowledges that the need for persistence is finished. And who better to understand when that acknowledgement should happen than the endpoint, developed by the application vendor, itself?
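The write-once, read-once, delete-on-acknowledgment cycle can be sketched in a few lines. This is a toy model, assuming an in-memory dict where a real broker would persist to disk; the class and method names are illustrative:

```python
# Toy persistent pipe: a message survives delivery and is deleted only
# when the consuming endpoint explicitly acknowledges it.
import uuid


class PersistentPipe:
    def __init__(self):
        self._store = {}    # message id -> payload, kept until acknowledged
        self._pending = []  # ids delivered but not yet acknowledged

    def write(self, payload):
        """Write once: persist the message and return its id."""
        msg_id = str(uuid.uuid4())
        self._store[msg_id] = payload
        return msg_id

    def read(self):
        """Read once: deliver the oldest unread message, keep it persisted."""
        for msg_id in self._store:
            if msg_id not in self._pending:
                self._pending.append(msg_id)
                return msg_id, self._store[msg_id]
        return None

    def ack(self, msg_id):
        """Delete only when the endpoint says it is done with the message."""
        self._pending.remove(msg_id)
        del self._store[msg_id]


pipe = PersistentPipe()
mid = pipe.write({"event": "shipment_created"})
msg_id, payload = pipe.read()   # delivered, but still persisted
pipe.ack(msg_id)                # endpoint signals completion; now deleted
```

The key design point is that the pipe never decides on its own when a message is disposable; that judgment belongs to the endpoint.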

So, while a true “dumb” pipe provides a simple transport layer, the effective pipe is smart enough to understand the necessity for guaranteed persistence and delivery.

True business process orchestration is much too data dependent to overlook data in transit. The possibility of data loss causing disruption is unacceptable from a technical and business perspective. Hence, there’s no room for truly dumb pipes.

Extracting Further Value from Loosely Coupled Technology

This capability can help extract additional value from loosely coupled technology architectures and deliver harmonized business process orchestration. The recent blog post, When is 'Best-of-Breed' the Best Choice for Supply Chains?, explores the need for this loosely coupled architecture to maintain agility and competitive advantage. If implemented properly, it also allows existing integrations to remain in place, acting as an "alongside" layer that extracts further value from investments in current solutions.

The goal is to pull the integration intelligence back into the solutions themselves, while providing not just interfaced solution silos, but truly interactive ones. The best solutions will also provide toolkits that support coordination between new and legacy components.

It should be obvious that operating via an agreed-upon, standardized, yet extensible canonical data model will provide more rapid development and other ROI opportunities. Agreeing on a single definition of a business object that satisfies existing and future solutions takes work, but it pays off in reduced long-term maintenance and upgrade costs when extending a workflow beyond known solutions.
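A "standardized yet extensible" canonical object can be as simple as an agreed core plus an extension point. The sketch below assumes a dataclass for illustration; the field names are hypothetical, not a published canonical model:

```python
# Sketch: a canonical business object with an agreed core and an
# extension point for solution-specific fields.
from dataclasses import dataclass, field


@dataclass
class CanonicalOrder:
    # Core fields every participating solution agrees on.
    order_id: str
    customer_id: str
    quantity: int
    # Extension point: solution-specific data travels along without
    # breaking consumers that only understand the core model.
    extensions: dict = field(default_factory=dict)


order = CanonicalOrder("A-100", "CUST-7", 3,
                       extensions={"sap_plant_code": "DE01"})
# A consumer that knows only the core model still works unchanged:
print(order.order_id, order.quantity)   # A-100 3
```

Consumers read the core fields they understand and ignore extensions they don't, which is what lets the model evolve without forcing every endpoint to change at once.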

Let’s allow the solutions to understand the content, context, and utilization of the intelligence contained in the data and allow the transport layer to do what it does best.

JDA is Here to Help

JDA’s solution to this need is called JDA Connect. JDA Connect provides value by orchestrating workflows and business processes rather than just moving data. Incorporating JDA intelligence (and third-party system needs, such as SAP HANA) in the application adapter, rather than the pipe, means that a system change impacts just one element of your integration architecture, whereas with template integration (ETL tools) that change may affect many components. Based on open standards, it can be deployed within a much broader IT footprint.

JDA is here to help customers avoid the heavy, and often futile, costs of field integration of new solutions by pulling the integration intelligence back into the solution. Toolkits deployed alongside JDA solutions enable truly orchestrated business processes that allow field extensions. JDA modules and third-party systems can work together to deliver sophisticated solutions, such as JDA Intelligent Fulfillment, that weren’t possible in the past.

This framework also provides a more upgradeable and sustainable environment for customers, an important consideration when dealing with inconsistent data models. As a stretch goal, it may also be an opportunity to leverage this flexibility as part of an enterprise innovation initiative, allowing JDA customers to experiment without major investment, which will pay dividends as concepts of edge integration become more mainstream.

As the complexity of the omni-channel world expands, technology must evolve to ensure the necessary end-to-end processes are executed flawlessly and enable workflow between internal and external applications. Whether you need to execute the processing of thousands of consumer orders daily, link your assortment plan into your demand planning process, synchronize appointments between warehousing and transportation, effectively plan for store labor, execute an S&OP process or plan shelf resets, integration requirements are everywhere. Learn more about how JDA can help by contacting us today.
