Success with job shop operations (Part IV) – Interoperability

Traditional ERP systems were all-inclusive. This makes for complex software suites with high license fees and long implementation times, and there is no guarantee that all modules will be "best of breed".

Sourcing modules individually allows each to be selected on its relative merits, but the risk is: will the modules be able to talk to each other?

The desired state of affairs is seamless integration across applications. Seamless integration, short of building an all-in-one application or developing a standard data transport envelope, requires serious thinking.

Suppose we accept that each system has its own tables/fields.

Any publisher of data will want to publish using its own native data element naming conventions. If a publisher wants to call "home address" [hadr] and store the data in a 50-character field, go ahead.

Subscribers similarly want to read data using their own native data element naming/data type/sizing conventions. One subscriber may want to call "home address" [ah], another [address_home].

If [ah] has been sized at 70 characters, no problem. If [address_home] has been sized at 40 characters, we have a problem (i.e. loss of data through truncation).
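The sizing check above can be sketched in a few lines. This is a hypothetical illustration, not code from any product; the field names ([hadr], [ah], [address_home]) and sizes come from the examples above, while the dictionary structure is an assumption.

```python
# Publisher's "home address" field, per its own naming/sizing conventions.
PUBLISHER_FIELD = {"name": "hadr", "size": 50}

# Two subscribers, each with their own name and size for the same element.
SUBSCRIBER_FIELDS = [
    {"name": "ah", "size": 70},            # large enough: no problem
    {"name": "address_home", "size": 40},  # too small: loss of data
]

def check_mapping(publisher, subscriber):
    """Return True if data can flow from publisher to subscriber
    without truncation (subscriber field at least as wide)."""
    return subscriber["size"] >= publisher["size"]

for sub in SUBSCRIBER_FIELDS:
    ok = check_mapping(PUBLISHER_FIELD, sub)
    print(f"{PUBLISHER_FIELD['name']} -> {sub['name']}: "
          f"{'ok' if ok else 'risk of data loss (truncation)'}")
```

A real exchange facility would run this kind of check once, at mapping-setup time, rather than on every record.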

Now, given a scenario where each stakeholder can both read and write data (i.e. a subscriber can also be a publisher), no subscriber is likely to need all of a publisher's data, nor is any publisher likely to be willing to share all available data.

Primitive data sharing sends all of a publisher's data to all subscribers, in a standard format, on the notion that this is easier.

First of all, it costs considerable time and money for any system to format outgoing data to a standard and to parse incoming data out of a standard.

Why not encourage the use of standards, yet allow stakeholders to use other data transport formats, accepting the cost of developing custom formatters and of making custom parsers available for use by other stakeholders?
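One way to read this is as a registry of pluggable formatter/parser pairs: a standard envelope is registered by default, and any stakeholder willing to pay for a custom pair can register one alongside it. The registry, the format name "standard", and the key=value envelope are all assumptions made for the sketch.

```python
# Registries of transport formats: each format supplies a formatter
# (record -> wire text) and a matching parser (wire text -> record).
FORMATTERS = {}
PARSERS = {}

def register(fmt, formatter, parser):
    """Register a formatter/parser pair for a named transport format."""
    FORMATTERS[fmt] = formatter
    PARSERS[fmt] = parser

# A "standard" envelope (assumed convention): newline-delimited key=value.
register(
    "standard",
    lambda rec: "\n".join(f"{k}={v}" for k, v in rec.items()),
    lambda text: dict(line.split("=", 1) for line in text.splitlines()),
)

# Any stakeholder can register a custom format the same way, at the
# cost of writing (and sharing) both halves of the pair.
rec = {"hadr": "12 Main St"}
wire = FORMATTERS["standard"](rec)
assert PARSERS["standard"](wire) == rec  # round-trip check
```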

So, we have data to be selectively shared and formats to be accommodated, leaving one unanswered question: how?

And the answer is a data exchanger.

Find one of these and you will have achieved interoperability.
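A minimal data exchanger might combine the two ideas above: it owns the field-name mappings and the per-subscriber grants, so publishers and subscribers never need to know each other's naming conventions. The class and method names below are invented for illustration; this is a sketch of the concept, not any particular product.

```python
class DataExchanger:
    """Hypothetical hub: maps publisher field names onto each
    subscriber's names and shares only what has been granted."""

    def __init__(self):
        self.maps = {}    # (publisher_field, subscriber) -> subscriber_field
        self.grants = {}  # subscriber -> set of granted publisher fields

    def map_field(self, pub_field, subscriber, sub_field):
        """Declare that pub_field flows to subscriber under sub_field."""
        self.maps[(pub_field, subscriber)] = sub_field
        self.grants.setdefault(subscriber, set()).add(pub_field)

    def route(self, record, subscriber):
        """Deliver only granted fields, renamed to the subscriber's names."""
        granted = self.grants.get(subscriber, set())
        return {self.maps[(k, subscriber)]: v
                for k, v in record.items() if k in granted}

ex = DataExchanger()
ex.map_field("hadr", "system_a", "ah")            # names from the examples above
ex.map_field("hadr", "system_b", "address_home")

# internal_note was never mapped, so it is never shared.
print(ex.route({"hadr": "12 Main St", "internal_note": "private"}, "system_a"))
```

A production exchanger would add the sizing checks and pluggable transport formats discussed earlier; the routing core, however, is essentially this.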


(Part I) – How to increase throughput

(Part II) – How to collect data at workflow steps

(Part III) – How to put decision support in-line

(Part IV) – Interoperability


Management consultant and process control engineer (MSc EE) with a focus on bridging the gap between operations and strategy in the areas of critical infrastructure protection, major crimes case management, healthcare services delivery, and b2b/b2c/b2d transactions. (C) 2010-2019 Karl Walter Keirstead, P. Eng. All rights reserved. The opinions expressed here are those of the author, and are not connected with Jay-Kell Technologies Inc, Civerex Systems Inc. (Canada), Civerex Systems Inc. (USA) or CvX Productions.
