“Big Data” poses some challenges for healthcare – Find out how to circumvent these.

A HIMSS discussion group on LinkedIn recently posted the following question: “Smarter, Safer, Cheaper. What’s the true promise for healthcare big data?”

Here was my response:

Smarter, safer, cheaper for sure, but there are some challenges, and there are ways of making analytics seamless and ways of making analytics very difficult.

Rule #1 is you cannot analyze data that you did not collect.

Rule #2 is you need easy ways of displaying the results of analytics as overlays to best practice protocols.

Rule #3 is you cannot follow Rule #2 if you don’t have best practice protocols (preferably in the form of flowgraphs as opposed to written policy/procedure).

In industrial Case Management (one example being military helicopter MRO – maintenance, repair and overhaul), no two units that come in have the same configuration, so it’s important to have a Case History of all interventions, organized by “session” and presented in reverse chronological order, with the ability to present the data by system/sub-system. The key to MRO is knowing when a particular system/sub-system needs periodic inspection and having data on hand to help predict problems before they actually materialize.
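To make the organization concrete, here is a minimal sketch in Python of such a Case History; the field names are my own invention, not a description of any particular MRO system:

    from dataclasses import dataclass, field
    from datetime import datetime

    @dataclass
    class Intervention:
        timestamp: datetime
        system: str         # e.g. "rotor"
        subsystem: str      # e.g. "swashplate"
        notes: str

    @dataclass
    class CaseHistory:
        unit_id: str
        # each session is a list of Interventions, appended as work is done
        sessions: list = field(default_factory=list)

        def by_subsystem(self, system, subsystem):
            """All interventions for one system/sub-system, newest first."""
            hits = [iv for session in self.sessions for iv in session
                    if iv.system == system and iv.subsystem == subsystem]
            return sorted(hits, key=lambda iv: iv.timestamp, reverse=True)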

You end up with a lot of data but the data collection is effortless because all work is guided by flowgraphs (do this and then do that), data collection is automated and analytics routinely examine the data to identify ways and means of improving the flowgraphs.

It’s no different in healthcare, except much more complicated: the number of permutations and combinations is such that you cannot expect to have end-to-end flowgraphs capable of handling the processing of most individual patients.

So we end up with best practice process fragments that healthcare professionals have to thread together manually, with some assistance from software and machines.

The following capabilities are needed:

a) Skip a step along a best practice protocol.

b) Revisit an already committed step.

c) Forward-record data at steps not yet current.

d) Insert a step not in the original protocol.

In all of this it is important to be able to capture the data, so you need a workflow management software environment that automatically logs all data as entered, session by session, at the various computer screens at process steps and at ad hoc intervention steps. That is: map out your processes plan-side, roll them out, manage tasks via a run-time workflow management environment at Cases, and have in place data logging at the Case Hx with a parallel flow to a “data warehouse” that analytic tools can easily link to.
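As a rough sketch of the data-logging side (the event fields and the warehouse hand-off are assumptions on my part, not any vendor’s format):

    import json, time

    def log_step_event(case_history, warehouse, case_id, session_id, step_id,
                       data, ad_hoc=False):
        """Record one step's data at the Case Hx and mirror it to a warehouse feed."""
        event = {
            "case_id": case_id,
            "session_id": session_id,
            "step_id": step_id,     # the protocol step, or a label for an ad hoc step
            "ad_hoc": ad_hoc,
            "timestamp": time.time(),
            "data": data,           # whatever was entered at the screen
        }
        case_history.append(event)                  # longitudinal Case history
        warehouse.write(json.dumps(event) + "\n")   # parallel flow for analytics

    # usage sketch:
    # history = []
    # with open("warehouse.jsonl", "a") as wh:
    #     log_step_event(history, wh, "case-001", "s-07", "step-3", {"bp": "120/80"})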

The challenge then becomes detecting, via analytics, patterns in the logged data. For a twelve-step workflow (steps 1-2-3 . . . 12), examples include the following (a minimal detection sketch follows the list):

* Step 5 is almost always skipped. Why then should it continue to be in the protocol as a mandatory step? Either leave it as an optional step, or eliminate the step.

* Step 3 is almost always repeated via an ad hoc intervention following a commit at step 8. The process template probably needs to have a loopback added.

* One or more ad hoc steps are almost always inserted at step 12 – why not update the protocol to make these part of the best practice template?
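Here is the minimal detection sketch promised above, assuming each Case’s run-time log has been reduced to the sequence of protocol steps actually performed (the log format is an assumption):

    from collections import Counter

    def protocol_stats(step_sequences, protocol_steps):
        """step_sequences: one list of performed steps per Case."""
        n = len(step_sequences)
        skipped, repeated = Counter(), Counter()
        for seq in step_sequences:
            seen = Counter(seq)
            for step in protocol_steps:
                if seen[step] == 0:
                    skipped[step] += 1
                elif seen[step] > 1:
                    repeated[step] += 1
        return {step: {"skip_rate": skipped[step] / n,
                       "repeat_rate": repeated[step] / n}
                for step in protocol_steps}

    # A step with a skip_rate near 1.0 is a candidate to make optional or remove;
    # a high repeat_rate suggests the template needs a loopback.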

It is helpful to inventory all of the best practices as graphic flowgraphs for easy updating.

Analytics should ideally be able to build virtual meta cases that show protocol variations across Cases relating to patients with similar presentations, yielding statistics along the lines of “across 1,000 cases, the cases branched this way 5% of the time, 12% of the time that way, with no branching 83% of the time”.
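Under the same assumed log format, a meta case of this sort can be approximated by tallying the distinct paths taken across Cases with similar presentations:

    from collections import Counter

    def branch_distribution(paths):
        """paths: one tuple of visited steps per Case with a similar presentation."""
        counts = Counter(paths)
        total = len(paths)
        return {path: round(100.0 * count / total, 1)
                for path, count in counts.items()}

    # Across 1,000 Cases this yields exactly the kind of statistic quoted above:
    # one path 83% of the time, another 12%, another 5%.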


It’s all about that Case, ‘bout that Case, no Trouble

I figured it would be easier to let Meghan Trainor attract folks to my blog.

In healthcare services delivery, your Case Managers need Case Management software. Seems obvious, doesn’t it? Why, then, do clinics/hospitals have EMRs that do not have a Patient History as their core pillar?

If you want to provide continuity of care to individual patients you need, first of all, “best practices” workflows to guide intake, assessment, diagnosis, tx planning, tx monitoring and discharge planning.

And, yes, the Government says you also need long term outcomes data collection so that your patient’s great-grandchildren will also have good outcomes.

Clearly, the team of caregivers does not, will not, have the time to stare at paper process maps.

Accordingly, best practices maps need to be prepared and, for practical purposes, be carved up into individual interventions and automatically posted to the right staff at the right time.

The maps go a long way toward getting the “right” things done, by the “right” people, the “right” way, at the “right” time.

None of this can be done in thin air – “Case” is the environment of choice to host process steps.

Case accommodates any mix of structured and ad hoc interventions.

None of this will come as a surprise to practitioners who are old enough to remember paper charts.

Paper charts used “Case” and there still is no shortage of “Case Managers”, but look at the hoops they have to jump through to do what used to be easy.

What is perplexing is why some designers of EMRs forgot to carry forward the notion of “Case”.

Meghan Trainor – All About That Bass, .. that Bass, no Treble


Why are we still having problems in the area of Healthcare software?

The question was raised in a recent LinkedIn HIMSS discussion group: “At what point does an EHR implementation become too large an endeavor?”

My response was that this happens at the time you pick a bad EHR “solution”. The usual outcome is a need for extensive customization, which extends the timeline and the cost of acquisition.

We don’t hear many stories about how easy/difficult it was for an organization to “customize”, say, MS Excel.

The reason, of course, is that setting up rules at spreadsheet cells is typically referred to as “configuration” as opposed to “customization” and you can configure different spreadsheet “books” to process data in quite different ways.

With EHRs, you can basically look at the software as having four modules (Clinical, Scheduling, Billing, Data Interchange). Not all that different from spreadsheets (i.e. different “books” that perform different functions), right?

The statement “. . the level of effort is mind boggling” seems indeed to be the case with many EHRs, but not all.

My view is the origin of the mess the healthcare industry is in can be traced back to the way many of the EHRs were designed (requiring customization instead of configuration).

If your EHR ships with a proper scheduler designed for healthcare and a billing engine designed for healthcare, the only things you really need to worry about are Clinical and Data Exchange.


In most Clinical modules you quickly get to the practical problem Dr Liebert raises in respect of clinical work which, if I am reading it right, is that cookie-cutter medicine does not work.

Practical input like this from experts could have allowed EHR software designers to predict that one-size-fits-all would not work (” . . . the system needs you to do it this way”).

Some went to the other extreme where the customer is allowed to start with a blank sheet and map out clinical best practices as process maps, but with the caveat ” . . . make sure you anticipate all possible eventualities”.

This, too, was “not close, no cigar”.

The lights go on when your Clinical piece has its foundation in ACM (Adaptive Case Management) because, here, you start with a blank sheet, and use BPM (Business Process Management) to map your processes to cover most eventualities, without having to stay up nights worrying about all eventualities. The environment takes care of the need to accommodate ad hoc interventions.

BPM provides orchestration, ACM accommodates (via global rule sets) eventualities you did not include in your BPM process maps.

Think of a highway. BPM is the yellow center line (guidelines), ACM provides the guardrails on each side of the road. You can vary from the guidelines but extreme excursions are reined in by the guardrails.
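A toy sketch of that split – the rule names and Case attributes below are invented for illustration, not any vendor’s API:

    def permitted_actions(case, suggested_step, ad_hoc_options, guardrails):
        """BPM suggests the next step; global Case-level rules veto excursions."""
        candidates = [suggested_step] + ad_hoc_options   # center line + excursions
        return [action for action in candidates
                if all(rule(case, action) for rule in guardrails)]

    # Example guardrail: no discharge while lab results are outstanding.
    def no_discharge_with_pending_labs(case, action):
        return not (action == "discharge" and case.get("pending_labs"))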

You get to have your cake and eat it.

Don’t try to go out and buy an ACM EHR.  ACM is a methodology, not a piece of software.

Data Exchange

The wrong turn with Data Exchange was to reckon that EHRs need to be standardized so that they can exchange data.

One generic, standalone Data Exchanger is all you really need and, simply stated, most of what it needs to do is take in data from publishers and make different subsets of this data available to different subscribers.

Publishers clearly want to post using their native data element naming conventions, subscribers want to read using their native data element naming conventions.

In respect of data types, character strings are fine for most data and you can post complex objects such as video files with an indication of their file type so that subscribers know what program is needed to access/open the file.

Data Exchange under the hood is nowhere near as simple as I have described it here – you need long descriptors for data elements so that potential subscribers can know what it is they are subscribing to. You don’t, as a publisher, want to make your data available on other than a strict need-to-know basis. And different EHRs require different data formats.

You end up writing formatters to get data into the Data Exchanger and parsers to get the data out of the Data Exchanger but, guess what? There are only so many different EHRs, so each new formatter/parser gets a little easier.
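For illustration only, a stripped-down Data Exchanger along these lines (the class and field names are mine):

    class DataExchanger:
        def __init__(self):
            self.records = []   # data held under publisher-neutral element names
            self.grants = {}    # subscriber -> set of neutral names (need-to-know)
            self.aliases = {}   # subscriber -> {neutral name: native name}

        def publish(self, record):
            # an upstream formatter has already mapped the publisher's
            # native element names to the neutral names
            self.records.append(record)

        def read(self, subscriber):
            allowed = self.grants.get(subscriber, set())
            alias = self.aliases.get(subscriber, {})
            out = []
            for record in self.records:
                subset = {alias.get(name, name): value
                          for name, value in record.items() if name in allowed}
                if subset:
                    out.append(subset)  # a downstream parser formats this for the EHR
            return out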

With this in place, I think the healthcare business could significantly increase efficiency, increase throughput, decrease errors and improve compliance with internal and external rules and regulations, all of which would improve outcomes (i.e. individual patient outcomes and data collection for long-term outcomes across different patient populations).

The only way this is going to happen is for buyers of EHRs to spec out what they want/need and accept nothing less.

Remember what you really need is a solution capable of addressing current as well as unanticipated future needs, not just current needs.


What’s your corporate IT strategy?

In the beginning, IT reported to the CFO because the main focus was on automating accounting functions with, time permitting, help for other functional units in addressing their computing needs. None of the application systems were able to easily exchange data.

In manufacturing and construction, things progressed to where we saw “all-singing-all-dancing” materials management/workflow management applications, with data transfers to accounting.

The focus at that time shifted from addressing functional unit needs to addressing the needs of the enterprise, and CIOs started to report to the CEO, not the CFO.

The problem was that strategic planners recognized the need for business analysis capabilities, so they proceeded independently to set up Business Analysis/Business Process Improvement units. These units did not have the skills to evolve sophisticated mind maps capable of consolidating operational-level data to KPIs, so they did most of their work using spreadsheets.

Meanwhile, most IT departments – with a prior focus on filling orders and keeping the lights on, and not following the state of the art in mind maps – had neither the time nor the inclination to acquire the skills needed to select enterprise solutions.

Consolidating the organization’s Business Analysis/Business Improvement capabilities with those of traditional IT became possible as corporations moved to “cloud” solutions, relieving IT of the burden of “keeping the lights on”.

Not all of these transitions went smoothly.

The combined talent pool of internal BA/BI/IT proved, in many cases, to be a poor substitute for a couple of days of on-site time from a senior management consultant. The competitive advantage of these folks is that they have seen multiple good/bad strategy development/implementation initiatives across different industry areas over several decades.

In the absence of expert advice/assistance, the tendency remains to go for one of the following three options.

Internal Build – the undertaking here is to build the next-generation Ferrari in one’s garage. Not impossible – Bill Gates did it, Steve Jobs did it, so why not us?

One-stop-shop – the idea here is to find a system that addresses one’s current needs to the extent possible and then proceed to “customize” the system. Vendors love this option. The core problem, of course, is corporations really don’t want/need software that addresses only current needs – they need software that can also address unanticipated future needs.

Best of Breed Integration – this makes a lot of sense PROVIDED all of the applications you pick have a demonstrable capability of exporting their data and importing data from local and remote third-party systems and applications. Any slip-ups here and you join the queue of customers pleading with vendors for customization.

So, what to do? . . . .

Hire a good consultant for a couple of days on-site.

Pick a reputable busy hands-on consultant who starts to get ready to leave from the time he/she arrives on site.

Their focus will be on analysis/assessment and on training your staff so that they can take over once the consultant has gone off-site. Do your homework in advance. Expect the consultant to also do a lot of homework such that on arrival he/she has a reasonable understanding of your business, your strategy and the current state of your operations.

Do not hire a consultant who borrows your watch to tell you what time it is.

What’s the best possible outcome?

The best possible outcome is likely to come out of generic software that allows operations level staff to build, own and manage their own processes.

They should only have to go to BA/IT for rule set construction and for inter-connectivity between all of the systems you already have in place and do not particularly want to cast aside.

The usual great deficiency in organizations is an inability to provide continuity across silos.

Adaptive Case Management software plus Business Process Management software go a long way toward bridging the gap. Graphic enterprise free-form search knowledge bases go a long way toward helping corporate stakeholders evolve strategies and track performance at KPIs.

Yes, my group promotes these solution sets and we also use them internally.

A good starting position is to recognize that every organization has “best practices” – it does not matter whether these exist only in the minds of staff or whether they are on paper, on-line or in-line. The main point is that these are their “best practices” and will remain so until they are replaced with improved practices.

The other thing they need to keep in mind is that their Job #1 is to build, maintain and enhance “competitive advantage”. This responsibility is shared across the Board, senior management and all staff. It’s all about “rowing in the same direction”.

What next? Read this blog post again and then again.

Then, take action.


How to fund process improvement initiatives

Every organization has processes; the processes are either documented or undocumented. They may exist only in the minds of workers or be on paper, online, or in-line.

Every organization has “best practices”. In the absence of any declared best practices, the practices in use are the organization’s “best practices” until these are replaced by better practices.

Processes are key for building and enhancing competitive advantage. The singular purpose of a process is to convert inputs to outputs.

All good, except that conversion requires time, money and resources that are typically in short supply.

The questions that therefore often arise are: which processes should be improved, to what extent, and how should process improvement initiatives be funded?


Option 1 – Fix your broken processes

If a process has bottlenecks, shuts down frequently or produces inconsistent outcomes, it’s a prime candidate for improvement. Automated or manual inspection of process run-time logs will typically reveal bottlenecks and shutdowns.
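For the automated variant, here is a rough sketch of the kind of inspection I mean, assuming the log has been reduced to (step, elapsed time) pairs:

    from statistics import median

    def bottleneck_steps(log, factor=3.0):
        """log: list of (step, elapsed_seconds) pairs from process run-time logs."""
        per_step = {}
        for step, elapsed in log:
            per_step.setdefault(step, []).append(elapsed)
        overall = median(elapsed for _, elapsed in log)
        # flag steps whose typical duration is far above the process norm
        return [step for step, times in per_step.items()
                if median(times) > factor * overall]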

If a step or series of steps shows inconsistent outcomes, consider automating these.

A simple financial analysis is usually sufficient to justify fixing broken processes. Some need to be replaced outright.

When fixing processes, consider updating and improving these.

Option 2 – Go for the low-hanging fruit when looking to improve processes

If your organization has a “continuous process improvement” unit, bear in mind that most change is disruptive and that too much tweaking of some processes takes you to diminishing returns. Guard against having a “hammer looking for nails” mentality within such units.

With respect to the identification of candidate processes for improvement, it makes sense to focus on easy, inexpensive, fast-track, low-risk, high-return initiatives.

Some of these span silos and can have an important impact on an organization. Others do not.

Most processes in this category can be funded out of annual operating budgets.

Option 3 – Manage significant change

Wider, more complex initiatives require ROI submissions.

Objectives need to be clearly defined, benefits need to be stated, resources, timelines and funding need to be put in place.

Periodic monitoring is needed with, at times, re-setting of objectives.

Why many ROI-based initiatives fail

It sounds simplistic, but going through the motions of preparing an ROI and then not bothering to monitor performance, time, cost and outcomes over the life cycle of the initiative means the benefits declared in the ROI stand a good chance of not being attained.

The reason is that things change between the preparation of an ROI and the implementation of the initiative – if things have changed to where the projected ROI of an initiative is trending negative, it is important to “know when to hold and when to fold”.

Worst case scenarios include being leapfrogged by a competitor before “new technology” gets to market. It may be best to shut down an initiative and put your scarce resources to initiatives with more promising outcomes.


Where’s The Beef? – An under-the-hood look at “seamlessly integrated” BPMs

I keep reading about “seamlessly integrated” BPMs (Business Process Management Systems) but, when I start to look under the hood, it quickly becomes obvious that many content authors have no clue regarding the needs of organizations in the area of process management.

Reality is, they should be talking about “hay-wired” BPMs modules.

Most of the product pitches start with a description of a process mapping environment. Some of the descriptions go on and on about languages/notations. This tells us that the environment being promoted is not end-user oriented.

No end-user wants to learn a new language/notation nor do they want to hire or have to liaise with a team of process experts or software development intermediaries.

The real experts for any process are the folks who work with that process day in/day out. Chances are you need facilitators to drag them out of their silos but, with minor prompting, end-users can build, own and manage their own processes.

Next in the list of capabilities we learn that there are “interfaces” that convert the graphic process map into, it seems, a run-time template that is “not rigid”.

What “interface” exactly would one possibly want other than a “rollout” button? If there is more to it than this, that is a dead giveaway that the product is too complicated and unworthy of receiving a tag of “seamlessly integrated”.

Same for “not rigid”, given we know that managing any Case requires being able to deal with a random mix of structured and ad hoc interventions.

Any detailed explanation about a BPM template not being “rigid” is a smokescreen for inadequacy in this area.

We all know that at any step along any process template you may be required to reach out to a customer (or some outside system/application) or accept input from a customer or outside system/application.   If “rigidity” has to be highlighted, other than in a historical account of the evolution of BPM, the setup is too complicated.

Strike three!

I could quit here but if any readers are still with me, I am not yet done with the rant.

Here goes – at any process template step it’s a given that users will need to collect data.

These software systems therefore need at least one form at each step/intervention and, from an end-user perspective, the form needs to be local to the step.

Same for all manner of .txt, .doc, .ppt, .pdf, .xls, even video/audio recordings that may relate to a step. All of these need to be attributes of steps, not off in some “shared” repository.

What end-users want/need is real “seamless integration” and a modest amount of “interfacing.”

Clearly, when they click on a .doc attribute at a step, they want interfacing with MS Word, not replication of word processor functionality within the Case environment.

Why multiple references to Case?

The thing is, we want the primary focus to be on reaching objectives, and since “work” has become a combination of ad hoc and structured interventions, we pretty much need to host all of this in a run-time environment that lets users decide what to do next, when and how (ad hoc intervention capability), with some “guidance” attributable to background BPM process templates. Clearly, it’s not all about BPM.

We also need to look to the environment to “rein in” extreme excursions away from policy/procedure (global Case-level rule sets, if you like). Otherwise you have no “governance”.

Case provides all of the necessary functionality, including data exchange plus auto-buildup of a longitudinal history of interventions (how otherwise would a user be able to decide what the next intervention should be at a Case?).

The real icing on the cake in some of these nonsensical pitches is references to “process objectives”.

If you no longer have “processes” (what we have today are “process fragments” that get threaded together by users, robots and software at run time), how can/should we expect to find objectives at process template end points?

No processes, no convenient end points, no objectives.

The answer is objectives have gone from being plan-side attributes of process maps to run-time attributes of Cases.

Once you venture into areas where there is a need to accommodate a random mix of ad hoc and structured interventions (i.e. most business areas today except where we may find a few not-yet-automated end-to-end processes), it is the Case Manager who decides what the objectives are for a Case and they, not BI analysts, nor IT, park these objectives at each Case.

Case Managers also monitor progress toward meeting Case objectives and this typically requires fairly complex non-subjective decision making, not simple counting of the number of remaining steps along a process template pathway.

See posts I have made on FOMM (Figure of Merit Matrices).
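For readers who have not seen those posts, the core idea can be sketched in a few lines; the objectives, weights and ratings below are invented:

    def figure_of_merit(rows):
        """rows: (objective, weight, progress rating 0..1) for one Case."""
        total_weight = sum(weight for _, weight, _ in rows)
        return sum(weight * rating for _, weight, rating in rows) / total_weight

    fom = figure_of_merit([("pain control",  3, 0.8),
                           ("mobility",      2, 0.5),
                           ("med adherence", 1, 1.0)])
    # ~0.73 here; trend this figure over time rather than counting
    # remaining steps along a template pathway.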

Just last week I read some promotional material announcing a ‘transition’ to “Case”.

I pointed out to the authors that Case is not new – it is a term borrowed from the legal profession and was alive and well in UK medicine as early as 1603.

They have not thus far responded to my objections.

It’s easy to determine within healthcare whether there is a focus on Cases.

Just walk into a clinic/hospital and ask if you can talk to a Case Manager. You will probably have a roomful of choices and some of these folks have been doing Case Management for decades. They have job titles, job descriptions and really perform “Case Management” day-in/day-out.

Most of my readers, aside from members of the flat-earth society, are starting to get this. Except that, the way things have been going lately, we may very well soon have a flat earth, so they may end up having the last laugh.

“Case” – not new, not close, no cigar.


3D Strategic Planning – What you need to know about it

Strategic Planning is a “connect-the-dots” exercise that usually starts off like this . . .

In order to develop strategy you need information on multiple Entities relating to your business (i.e. Capital, Access to Capital, Land, Equipment, Tools, Premises, Staff, Current Products/Services, Products / Services Under Development, Projects Awaiting Approval, Technology Trends, Changing Legislation, Competitors).

We know that decision-making involves the transformation of information into action. The problem is you cannot make decisions if you cannot easily find and access the information needed for such decisions.

For any given opportunity, arriving at a go/no-go decision will impact one or more of these Entities.

One added complexity is the way information belonging to one Entity interacts with information belonging to another Entity.

This is where we make the link between traditional strategic planning (deciding how to make best use of scarce resources on initiatives that track with corporate mission) and the “connect-the-dots” approach used by law enforcement for investigations.

The key point is the law enforcement “connect-the-dots” approach can be “borrowed” for corporate strategic planning purposes.

Here is a typical practical scenario

An opportunity has been identified, the sponsors present to the Board, and the Board now has to assess the opportunity, assign a ranking, assign a priority and, if the project manages to make its way through these filters, allocate funds to allow an initiative to move forward.

Different opportunities impact different Entities in different ways.

It follows that if you are consolidating information relating to corporate Entities you may need to provisionally allocate assets/resources to several competing opportunities.

All in the interest of making better decisions.

One way to do this is to consolidate all Entity information for the Corporation at a graphic knowledge base and then alias the information relevant to each opportunity for local consultation at that opportunity. This allows you to toggle between the “big picture” and individual opportunities, with each opportunity being able to “see” competition from others on the use of scarce resources.
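One way to sketch the provisional-allocation side of this (the entity names and numbers would, of course, be your own):

    def over_allocations(entities, opportunities):
        """entities: {name: amount available}
           opportunities: {opportunity: {entity: provisional draw}}"""
        committed = {}
        for draws in opportunities.values():
            for entity, amount in draws.items():
                committed[entity] = committed.get(entity, 0) + amount
        return {entity: committed[entity] - entities.get(entity, 0)
                for entity in committed
                if committed[entity] > entities.get(entity, 0)}

    # Two opportunities both drawing on the same engineering team show up here
    # as an over-allocation to resolve before either gets a green light.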

If you find your favorite proposed initiative has a ranking below another initiative you can perhaps do more research on the merits/benefits of your initiative and improve its ranking.

The more you are able to “connect-the-dots” between initiatives and their “draws” on scarce resources, the greater the potential quality of your decision-making at and across initiatives.

Why 3D?

Well, you will discover soon enough that trying to display a single hierarchy comprising, say, 500,000 information points on one computer screen requires the use of 3D “Russian Doll” or “Cascading Folder” mapping, as illustrated in the US State Dept Country Profiles demo database (all countries: business, travel, law enforcement, narcotics, terrorism, etc.).

Try that on paper or whiteboard with post-its.

What you need is a graphic free-form search knowledge base capability that accommodates any number of separate hierarchies, with “connect-the-dots” facilities and with the ability to quickly zoom in from the “big picture” to low-level detail and back.

At the end of the day, it’s all about how you like to look at your corporation – you really only have two choices . . .

Like this . . .

Or like this . . .
Think about this article the next time you go to a meeting pushing a cart with a three-foot pile of reports, papers and spreadsheets.

Key Words: Strategic planning, Connect-the-dots, knowledge bases


Policy, procedure, KPIs – how to run a business

Corporations have infrastructures with various asset classes (capital, access to capital, plant, equipment, people, existing products/services, new products/services, customers, and partners).

The ones that have “secret sauces” succeed; most of the others fail.

The rules for success are quite simple – it’s important to build and maintain each asset class individually.

Then, aside from the risk of being leapfrogged, you need to also enhance your assets to keep ahead of the competition.

Not all assets have the same relative strategic value, so you need a way, when making decisions re the commitment/deployment of assets, to study each fund request in the light of its potential to support strategy.

It pays to maintain a reserve in each asset class. If, for instance, you commit all of your staff to a large project, you may need to refuse new opportunities, and if your “all-eggs-in-one-basket” initiative fails you will find yourself in damage-control mode.

The traditional approach to “management” has been to standardize and streamline.

Policy provides governance, procedure provides guidance, KPIs allow CEOs to steer the ship.

The problem is people don’t keep policy in mind, they don’t read procedure and because things change quickly, it’s very easy to be looking at the wrong KPIs.

This is why we have BPM (Business Process Management) and ACM/BPM (Adaptive Case Management/Business Process Management).

A Bit of History

I have always maintained the position that BPM had its origins in CPM (the Critical Path Method: nodes, arcs, an end-node objective).

CPM goes back to 1957 (possibly earlier), with flow graphing hitting the streets both in the military (the Polaris Program) and in the construction business (E. I. du Pont de Nemours).

Media coverage of flow graphing peaked in 1962 with DOD/NASA’s PERT Cost. I don’t recall seeing much media frenzy over CPM, but the “critical path” methodology has evolved over time to where few considering launching a large project would do so without it.

The main contribution introduced by BPM was content-driven decision-box branching and loopbacks.

It is worth pointing out that branching itself was a part of GERT (an invention of General Electric), which recognized the need to avoid having to engage all pathways in a flow graph. The difference in GERT was that the branching was evidence-based (i.e. plan-side), whereas BPM branching is content-sensitive, with rule set engagement occurring at run time.

BPM Today

The problem with BPM came when most of the low-hanging fruit (i.e. mostly linear, straight through, end-to-end processes) had been harvested.

This is resulting in an exodus from traditional BPMs to Case, where objectives can no longer be conveniently parked at process end points.


The thing about Case is that it can host objectives and allow background BPM to continue to provide Orchestration.

Governance comes from rule sets at the Case level. We call all of this ACM/BPM, where ACM stands for “Adaptive Case Management”. Some just call the hybrid approach ACM, but we need both Orchestration and Governance, plus a few other tools.

Corporations that embrace ACM/BPM end up with the best of two worlds i.e. a flexible environment in which structured “best practices” in the form of process fragments are available for users, machines and software to thread together at run time, and where the ability exists at any stage during the lifecycle of a Case to insert ad hoc interventions.

Contrast this with the shaky foundation that maintains that you can, with BPM alone, map out all eventualities. It does not take a lot of analysis to realize that, in a relatively simple flow graph, the closer the branching decision-boxes are to the start node, the greater the number of permutations and combinations. It is easy to identify flow graphs where the number of permutations and combinations runs into the hundreds of thousands.

ACM/BPM wins hands down because there are no restrictions at any point in the lifecycle of a Case re engaging specific processing.

Bridging the gap between operations and strategy

It’s fine to be practicing ACM/BPM at the operations level, but it’s a trees-vs-forest wheel-spinning exercise unless/until you have a way to evolve strategy, put in place ways and means of assigning priorities (few companies have the wherewithal to authorize all funding requests that are submitted) and then monitor approved funding requests to make sure work is proceeding on time, on budget and within specification.

Strategy -> Priority Setting -> Objectives <- ROIs <- Cases <- Case objectives

Narrowing the gap between operations and strategy requires ensuring that Case objectives are at all times supportive of strategic objectives.

i.e.  Case objectives -> KPIs -> Strategic objectives
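A minimal sketch of that consolidation; the mapping from Case objectives to KPIs is hypothetical:

    def roll_up_kpis(cases, kpi_map):
        """cases: {case_id: {objective: progress 0..1}}
           kpi_map: {kpi: [objectives that feed it]}"""
        kpis = {}
        for kpi, feeders in kpi_map.items():
            readings = [progress for case in cases.values()
                        for objective, progress in case.items()
                        if objective in feeders]
            kpis[kpi] = sum(readings) / len(readings) if readings else None
        return kpis   # compare these against strategic targets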

The main tools I have used over time to bridge the gap between operations and strategy are:

a) shifting from ROIs to SROIs (1) at the funding level.

b) use of FOMM (Figure of Merit Matrices) during monitoring/tracking as a non-subjective means of assessing progress toward meeting Case objectives.

c) finding ways to consolidate operations level data to a free-form search knowledge base environment that is able to simultaneously host KPIs.

I will re-visit these in future blog posts.



(1) SROI is “Socio-Economic Return on Investment” – basically, it takes the sharp edge off purely financial assessments. The complexity of business today requires taking on a wider horizon than financial returns.




Your BPMs shopping list

If you are in the market for a BPMs, you may be better off looking for a BPFMs (Business Process Fragment Management System).

Please don’t make this into a new acronym – we already have enough with “Intelligent Business Process Management”, “Agile Business Process Management” and “Dynamic Business Process Management”. I am here to simplify things, not complicate them.

The thing is, in today’s business world there are very few remaining end-to-end Business Processes to map.

Corporations long ago automated their continuous and end-to-end processes, with the result that most of what we have left are “Process Fragments”.

What’s the difference between a Business Process and a Business Process Fragment?

Basically, it’s the presence or absence of objectives at flow graph end nodes.

Business Process flow graphs conveniently dovetail into a terminal node which can accommodate an objective. Get to the end node and you have reached the objective.

Business Process Fragment flow graphs, on the other hand, have no plan-side objectives.

You get to the end node of one process fragment and a robot, human or software threads the end node to another Process Fragment.

Of course, you could thread process fragments together plan-side, but this would require that you anticipate all possible eventualities and have in place rule sets to guide the branching down specific pathways.

Do the math: the higher the number of decision branching points toward the start of a flow graph, the higher the number of permutations and combinations. It is easy, in a relatively simple flow graph, to get to 500,000 (see the back-of-envelope sketch below).
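A back-of-envelope sketch of that math, assuming independent two-way decision boxes:

    for branches in (4, 10, 19):
        print(branches, "binary decision boxes ->", 2 ** branches, "possible pathways")
    # 4 -> 16, 10 -> 1,024, 19 -> 524,288 (past the 500,000 mark)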

It’s OK to let your knowledge workers do some of the decision branching. The usual reason for hiring knowledge workers is that they know what to do, when, how and why – and, if you deploy them properly, where.

If you are worried about allowing knowledge workers to pick the branching at run time, you probably should not have hired them.

Under the new era of Business Process Fragments, how do we know, plan side, when we are done?

Answer: you don’t.

Objectives move to run-time Case environments where a Case Manager decides when objectives have been met and it is OK to close the Case. (i.e. “It ain’t over until the fat lady sings”)

Now, it’s obvious from the above that our running list of “must haves” includes a) a graphic process mapping facility, b) a compiler so you can roll out templates of your graphic process maps, and c) a run-time environment capable of providing workload management across orders and users in the context of scarce resources.

You need d) global Case-level rules so that as and when users deviate from “best practice” protocols (i.e. skip over steps, perform steps out of sequence, perform ad hoc steps not in any process fragment, or thread together process fragments in ways that are less than ideal), these users will be tripped up by such rules.

You also need e) a repository so you can look back over time and see who did what, how, when and where.

And you need f) the capability to import data from and export data to third-party local and remote systems, including customer systems and applications.

So, there you have it, your complete shopping list for a BPMs.

  • Process mapping
  • Compiler
  • Run-time Case management environment w/ workload management facilities
  • Global Case-level rule sets
  • Data Repository
  • Data Exchanger

By my count, that makes six (6) “must haves”.

Keep this list handy when you book a seat at a webinar or register for a seminar on BPMs.

The generic name for the type of system you should be looking for is ACM/BPM (Adaptive Case Management/Business Process Management).


Pick a BPMs, any BPMs

We too easily settle for various states of affairs, only to find that outcomes could be more favorable given more research, better use of logic and less attention to paid “reviews”.

Hardly a day goes by without a new LinkedIn invite to attend a web demo or seminar on the “best” BPMs (business process management software suite).

“Best” for whom? The answer typically is “best” for the vendor.

Nowhere in these presentations is there much of an attempt to itemize essential functionality of a BPMs followed by a demonstration of how a product being showcased provides such functionality.

Most of these presentations are “show and tell” which translate to “. . . see what we can do with this fantastic product”.

If you have read my “Spend and Save” blog post, you can easily understand where many of the web demo/seminar attendees are coming from.

Unlike Sue in “Along Came Jones”, they don’t need fixes for their problems: they have not done an analysis of what their problems are, so there is no need to bring anyone in to fix them.

“Along Came Jones”, by the way, was a hit song written by Jerry Leiber and Mike Stoller and originally recorded by The Coasters in 1959. The song tells of the interaction between a stock gunslinger villain, “Salty Sam”, and a ranch owner, “Sweet Sue”. Sam makes various attempts, the first at an abandoned sawmill, to get Sue to give him the deed to her ranch but is, each time, outdone by Jones.

Remember the Coasters?

I did not expect you would, but here is a link (great saxophone!)


Now, at a more practical level, stay tuned for my checklist of “must have” functions for a BPMs. Don’t sign up for any BPM web demos or seminars until you have this checklist, and don’t be shy about asking the presenters how/where their product offerings address these functions.
