What Predictive Analytics Can Do For You!

If you are responsible for managing a business, the maturity level you want to reach is one where staff receive advanced decision support from your business management system.


Here is how you can use the “Easy” button to improve management of your business.

1. Start by evolving a set of best practices, encourage their consistent use, but allow variations (ad hoc interventions), where warranted.

You will be practicing ACM/BPM: orchestration comes from background BPM, and governance comes from the run-time ACM environment that hosts your best practice workflows.

BPM logic and rule sets contribute to efficiency; ACM rule sets contribute to effectiveness (i.e. avoiding extreme, unwanted variation away from best practices).

2. Now, track deviations away from best practices across multiple instances of your best practices.

Too many skips means that you have steps in your best practices that are not necessary; too many ad hoc interventions means steps are missing from your best practices.
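A sketch of that tracking, assuming each completed instance logs which best-practice steps were skipped and which ad hoc steps were inserted (the log shape and thresholds here are hypothetical, not any particular product's schema):

```python
from collections import Counter

def deviation_report(instances, best_practice_steps,
                     skip_threshold=0.3, adhoc_threshold=0.3):
    """Tally deviations across completed instances of a best practice.

    Each instance is a dict like {"skipped": [...], "ad_hoc": [...]}.
    Returns (steps to consider removing, ad hoc work to consider adding).
    """
    n = len(instances)
    skips, adhocs = Counter(), Counter()
    for inst in instances:
        skips.update(inst.get("skipped", []))
        adhocs.update(inst.get("ad_hoc", []))
    # Steps skipped too often are probably unnecessary;
    # ad hoc work recurring too often probably belongs in the template.
    remove = [s for s in best_practice_steps if skips[s] / n > skip_threshold]
    add = [s for s, c in adhocs.items() if c / n > adhoc_threshold]
    return remove, add
```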

3. Update your best practices as appropriate.

Take things one step further.

4. At manual branching decision boxes along your best practice flowgraphs, tally the number of times staff engage processing along different optional pathways and, soon, you will get to where you can highlight “favorites” that help staff with decision making.
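The tallying itself is straightforward; a sketch, with decision-box and pathway names as placeholders:

```python
from collections import Counter, defaultdict

class BranchTally:
    """Count which outgoing pathway staff pick at each manual decision box."""

    def __init__(self):
        self.counts = defaultdict(Counter)

    def record(self, decision_box, pathway):
        self.counts[decision_box][pathway] += 1

    def favorite(self, decision_box, min_share=0.5):
        """Pathway to highlight as a 'favorite', or None if no clear one."""
        c = self.counts[decision_box]
        total = sum(c.values())
        if total == 0:
            return None
        pathway, n = c.most_common(1)[0]
        return pathway if n / total >= min_share else None
```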

5. Now, blend in predictive analytics . . .

Remember CPM (Critical Path Method)? CPM was great (and still is) for managing once-through initiatives – CPM lets you calculate forward arrival times at project objectives, and takes care of resource allocation, leveling and balancing whilst providing cost containment.
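The forward-pass arithmetic at the heart of CPM is simple enough to sketch (a minimal version that ignores resource constraints; activity names are made up):

```python
def forward_pass(activities):
    """Earliest finish time per activity via the CPM forward pass.

    `activities`: {name: (duration, [predecessor names])}, assumed acyclic.
    """
    earliest = {}

    def finish(name):
        if name not in earliest:
            duration, preds = activities[name]
            start = max((finish(p) for p in preds), default=0)
            earliest[name] = start + duration
        return earliest[name]

    for name in activities:
        finish(name)
    return earliest
```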

CPM is deterministic, which means if the design of the product/facility is right and you push through the pathways/sub-pathways you will get the expected performance.

Time/Cost/Performance management – it doesn’t get much better than this.

The problem is that business management and its subset, business process management, are anything but deterministic.

But look, if you get on board with ACM/BPM and focus on continuous improvement of your processes, you will, to a large extent, be getting the benefits of CPM (plan, monitor, control) plus a predictive outcomes capability at Cases.

Posted in Adaptive Case Management, Business Process Improvement, Business Process Management, Case Management, Decision Making, Risk Analysis | Tagged , , , | Leave a comment

Beyond Case by Case Management

We know that Case Managers manage Cases but that is not the end of it.

Cases often need to draw on pooled resources and when the pool runs dry someone has to step in and make decisions regarding which Cases get the resources they need and which ones do not.


It’s hard to anticipate Case demands for resources unless we have structured sequences of steps with resource loading at each step.

Satisfying those demands is typically beyond the boundaries for most Cases, other than ones that own the resource pools they use.

It follows that we need to consolidate copies of Case instances somewhere so that given a start date for each Case instance we can position all instances along a common timeline and see resource demand peaks and valleys.
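A sketch of that consolidation, assuming a simple illustrative shape for Case instance data (day offsets and resource names are placeholders, not a real Case schema):

```python
from collections import defaultdict

def resource_histogram(case_instances):
    """Position Case instances along a common timeline and sum resource demand.

    Each instance: {"start": day_offset,
                    "steps": [(duration_days, {resource: units})]}.
    Returns {day: {resource: total units}} so peaks and valleys are visible.
    """
    demand = defaultdict(lambda: defaultdict(float))
    for inst in case_instances:
        day = inst["start"]
        for duration, resources in inst["steps"]:
            for d in range(day, day + duration):
                for res, units in resources.items():
                    demand[d][res] += units
            day += duration
    return demand
```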

Leveling resource demand requires an understanding of the relative priority of each Case with stretchout of Case instance timelines in such a way that minimizes damage.

Clearly, leveling/stretchout algorithms are likely to be complex.

Do we throttle back resources evenly across all Case instances, leading to such eventualities as leveling one resource at the expense of others? Do we rank Cases in such a way that the ranking reflects the extent to which each Case contributes to corporate strategic objectives? Do we try to negotiate with the owners of key resources to get them to defer, for example, planned maintenance?

No easy answers here.

The mechanics are easy.  Either you build a meta case infrastructure and put in place resource allocation, leveling and balancing (RALB) at the top level of such infrastructure or you mirror Case instance data to, say, a Critical Path Method (CPM) or Enterprise Resource Planning (ERP) run time environment that allows you to model and adjust Case instance step durations then import durations back to individual Cases in the Case Management environment.

The latter seems to be the preferred option because most CPM environments support multi-project management, have built-in RALB and accommodate import/export.


Posted in Automated Resource Allocation, Business Process Management, Case Management, Decision Making, FOMM, Job Shop Operations, Manufacturing Operations, Operational Planning, Process Management, Project Planning, R.A.L.B., Scheduling, Uncategorized

Performance based reimbursement – coming soon to a place near you


Healthcare services delivery in the USA is out of control.

Costs have skyrocketed, facilities are overloaded, doctors are suffering burnout and government intervention has, under the guise of improving patient safety and outcomes, yielded only modest improvements.

MU (Meaningful Use) is largely responsible for the current alarming state of affairs.

It takes longer to process patients than before MU and it seems the focus has shifted away from treatment of individual patients to long-term outcomes data collection.

The remedy, after billions of dollars spent, is to rewind and set the focus on quality, efficiency and effectiveness of healthcare services delivery. Something that should have been the focus of MU from the start. Better late than never.

The problem is going to be with implementation.

Current EHRs were not designed to generate performance data. Replacing what is currently in use will be expensive and we can expect several rounds of false starts as vendors shift into a feeding frenzy to crank out “new” and “improved” EHRs using, in many cases, the same old database architectures invented in the 1960s. Customers will be buying pigs with lipstick.

Strangely, the methodologies to do things the right way are readily available. We need four methodologies (BPM, RALB, ACM and FOMM) to make performance-based reimbursement a success.

And, there are two hurdles that need to be sorted.

One is “not invented here” and the other is “resistance to change”. Both of these are cultural hurdles.

NIH is particularly well entrenched in healthcare so it will be difficult to port BPM / RALB / ACM / FOMM. The easy solution for NIH is to get over it.

As for Resistance to Change, there is an easy fix that does not require making changes in the way we manage work.

If you think about it, all of us, each day, come into our places of work and immediately take note of our fixed time appointments. No one has a problem with a calendar. No change, no resistance.

Following calendar inspection, we look at our to-do list and we micro-schedule to-do tasks to fit between fixed time commitments.

If you have a half hour appointment at 0900 hours and another at 1100 hours, you may reasonably pick a couple of small tasks to clear off your desk between 0930 and 1045. Or, you may prefer to make progress on one large to-do task. Up to you, and no obligation to stick with one approach or the other from one day to the next.
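That gap-filling habit can be sketched as a tiny greedy scheduler (longest-first is just one policy among many; task names and durations are illustrative):

```python
def fill_gap(todo, gap_minutes):
    """Greedily pick to-do tasks (name, minutes) that fit a free slot.

    Picks longest tasks first; nothing obliges you to stick with this
    policy from one day to the next.
    """
    picked, remaining = [], gap_minutes
    for name, minutes in sorted(todo, key=lambda t: -t[1]):
        if minutes <= remaining:
            picked.append(name)
            remaining -= minutes
    return picked, remaining
```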

Resistance to change in healthcare can be minimized so long as the pitch is right.

The thing is case management has been at the core of medicine since the 1600s. Accordingly, healthcare workers have no problem going to a patient chart prior to meeting with a patient so transitioning to an e-chart that looks the same as the old manila folder is not a problem.

The other thing is the concept in healthcare of “best practices” is understood.

BPM excels at enabling building and enhancing best practices, but it has a reputation of imposing rigid protocols. BPM and ACM together replace the rigidity of structured protocols where these make sense and accommodate unstructured or ad hoc interventions where appropriate. No rigidity, no resistance.

The other positive attribute of BPM is that it lets functional unit staff document their workflows featuring existing agency forms, so healthcare professionals see their own workflows posting their own forms. No change, no resistance.

All of these scheduling maneuvers are eminently handled by RALB (i.e. 3-tier scheduling or Resource Allocation, Leveling and Balancing).

FOMM (Figure of Merit Matrices) is also not new, and easy to implement. Basically, it’s all about non-subjective assessment of progress toward meeting Case objectives. You could do it on the back of an envelope, but it’s a lot faster/easier if you append a spreadsheet template to each Case Record and follow the methodology.
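A minimal sketch of the scoring such a spreadsheet template performs, assuming a simple weighted-average variant of FOMM (objective names, scales and weights are illustrative):

```python
def fomm_score(progress, weights):
    """Overall figure of merit as a weighted average of per-objective scores.

    `progress`: {objective: score on a 0-10 scale};
    `weights`:  {objective: relative importance}.
    """
    total_weight = sum(weights.values())
    return sum(progress[obj] * w for obj, w in weights.items()) / total_weight
```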

None of these tried and true healthcare services delivery methods will work if the software User Interface is not right.

Here again, no change will result in no resistance.

So, let’s go forward with a UI consisting of a split screen featuring two constructs everyone is familiar with (a calendar and a to-do list) and let’s make things such that using the UI requires less effort than not using it. No resistance here.

OK, how does this get us to performance-based?

This is where IT comes in – with end-users in the drivers’ seat, building and enhancing their own workflows, IT will have time to focus on predictive analytics. As users perform interventions and record data, the data will flow to the EHR (as it does now) but with a parallel feed to a data warehouse where all manner of analytics can take place.

The final piece of the puzzle is not to simply crank out after-the-fact statistical and tabular reports but to analyze data in real-time and improve decision-making in respect of healthcare services delivery to individual patients.

Reporting on measures is the easy part.


BPM: Business Process Management

ACM: Adaptive Case Management

RALB: Resource Allocation, Leveling and Balancing

FOMM: Figure Of Merit Matrices


Posted in Adaptive Case Management, Business Process Management, Case Management, FOMM, Meaningful Use, Performance Based Reimbursement, R.A.L.B.

Is Social BPM a failure?

BPM.com is a great place to hang out.

Peter Schooff asked the question above and I recommend you take a look at the range of comments received.



My comment was . . .


So many diverse comments here on this one discussion topic.


In healthcare it’s “no verbal orders”.

For a child, at any step along a best practice pathway, you are likely to get a call from the parent re “Why are you doing this?” or “I see on the internet that beet root is a better treatment modality, why are you not using this?”

For an adult, same thing plus a desire to go to a portal and gain access to their EMR file (the law says they have the right to access information in their file).

The healthcare log needs to have a record of each of these “ad hoc” interventions – not just date/time and caller but any data that was recorded, at the time it was recorded, on the form versions in service OR an audio recording OR a video telehealth encounter recording.

No way we would allow data flows to patients/caregivers using Facebook, Twitter, e-mail because of the risk of disclosure of patient information and possibility of heavy fines.

In respect of portal accesses, you want the user to be able to log in and see a menu of services (trimmed to what this user is allowed to see/request); a message goes from the portal to a webserver engine, and it is the engine alone that links to the backend db server, indexes to the right record, retrieves the data and pushes the info out to the user at the portal. Any suspicious incoming data stream diverts to a healthcare professional/admin person who will probably call and say “if you really need this amount of information, how about you come into the clinic to pick it up?”
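The portal-to-engine-to-db flow described above can be sketched as follows (roles, service names and the `fetch` hook are all hypothetical stand-ins for the engine's real plumbing):

```python
ALLOWED_SERVICES = {          # menu trimmed per role -- illustrative only
    "patient":   {"view_results", "request_refill"},
    "caregiver": {"view_results"},
}

def handle_portal_request(role, service, record_id, fetch):
    """The portal never touches the database; the engine mediates everything.

    `fetch` stands in for the engine's backend db call.
    Requests outside the user's trimmed menu divert for human review.
    """
    if service not in ALLOWED_SERVICES.get(role, set()):
        return {"status": "diverted", "reason": "not on this user's menu"}
    data = fetch(record_id)   # the engine alone links to the backend db server
    return {"status": "ok", "data": data}
```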

Bottom line, no “social” in healthcare. And if you are building generic platforms for healthcare (hospitals/clinics interacting with patients), for manufacturing (organizations like Lockheed interacting with suppliers), or for b2b (a job shop operation interacting with a customer), why not use the same approach?

I am with Emiel Kelly: “. . . some type of processes rely more on social interactions than other ones”

I think we could avoid the controversies by saying ” . . . some type of processes rely more on ad hoc interactions than other ones”

Posted in Adaptive Case Management, Business Process Management, Case Management, Interconnectivity

Where is professional videography headed?


Thinking of going to 4K? You might want to wait for 8K assuming you want to be able to film what people a mile across a valley are eating for lunch.


Viewing 8K is another matter. Few residences have an attached cinema hall. The likely deal when you buy an 8K TV is you get a discount on the renovation contractor who enlarges your living room – after you have negotiated a variance with the local planning committee to allow your house to be closer to your property line.

I can’t get excited about 8K at all and, until recently, have not had a lot of motivation to trade up to 4K, other than for the ability to crop to HD and still have a good image.

For me, the order of things is the storyline, audio, lighting, camera.

Why? Because you lose the audience if the storyline is bad, next in line comes the audio and, if the lighting is no good, you quickly get to the end of the line i.e. the camera.

My challenges in videography have been to track stage performers as they rapidly glide across a stage, often going from front of stage to back of stage at the same time. To do this right you need a smooth tripod/head that can be managed with one finger and you pretty much have to know where the performers are about to move so you can track them. For this reason, I go once to take notes, then a 2nd time to practice tracking and yet a 3rd time to do the recording. Imagine the savings if I could lock on faces and have the camera do the tracking.

Out of doors, the technique needs to vary. Here, we also have fast moving objects, often not close together, so you have to master smooth pan and zoom.

Auto-focus makes things a lot easier but it seems to me, based on testing I did a couple of years ago, that unless you move through an arc that lets the camera focus whilst you are panning/zooming, you may, depending on the camera, be in for a longer than desirable settling-in time.

The experimentation I did with my AG-AC160A gave me very fast settling-in-times or rather slow settling-in-times depending on the panning arc I chose to go from a near object to a church a mile or so across and down a river. Imagine the savings if I could lasso a target area on my monitor and have the camera track to the target (panning and zooming along a reasonable arc).

A very recent technology advance, for me, is the DJI Osmo.

Sure, you have to live with the wide angle lens, and low light issues (physics rules), but think of the footage you are likely to get relative to missing an event entirely because of setup time with a big camera/tripod.

Adapting to the DJI Osmo is not likely to be a picnic.

In the ads, you only see the camera and monitor on top of a handle, but I can see things starting to look a lot less mobile once you start loading down the camera and yourself with accessories. You probably need to add two crew members, one with a portable mixer/recorder and one with a boom mic.

Quite a transition to make for a one-trick pony like me.

Posted in TECHNOLOGY, Video Production

Success Factors with BPM

If you are thinking about the potential benefits of Business Process Management or want to fast track your current BPM initiative, here are a few “must haves” for success.

0. “First the problem then the solution”, meaning no point mapping/implementing processes if the organization does not have a mission and has not evolved a set of strategies.

1. a reasonable subset of the business activity to be “managed” involves the performance of tasks in logical sequence.

2. the work will be performed more than once (otherwise use Critical Path Method).

3. no work should be performed that does not either directly or indirectly support strategy.

4. the benefits vary such that for a large initiative it is advisable to go through the formality of an ROI or SROI.

5. the more complex the sequencing, the more specialized the tasks (requiring specific skill sets), the larger number of silos that a process must overarch, the more beneficial it becomes to go beyond paper mapping to achieve in-line implementation of a process (as opposed to an off-line or on-line implementation).

6. too low a level of detail (i.e. splitting one short term task to be performed exclusively by one person into three tasks) is bad; too high a level of summary makes monitoring/control difficult (i.e. one task comprising several tasks, to be performed by several people, over an extended period of time).

7. the run-time environment hosting instances of templates (i.e. compiled rolled-out flowgraphs) needs to be able to accommodate re-visiting already committed tasks, recording data at tasks not yet current along their instances, and insertion of ad hoc interventions at the environment.

8. usual run-time services to support the processing of instances include R.A.L.B (three-tier scheduling); a formal History (committed tasks, with date/timestamped user “signatures”, with recall of data, as it was, at the time it was entered, on the form versions that were in service at the time); data logging for possible machine real-time predictions OR after-the-fact data mining to allow process owners to improve their processes; data import/export to increase the reach of the run time environment.

9. reasonable accommodation for deviating from instances, but with governance from rule sets at the environment to “rein in” extreme, unwanted deviations i.e. guidance from BPM, governance from the environment. [the highway example of center lines to provide guidance and guardrails on both sides for governance].

10. the environment selected must have a simple UI, otherwise the initiative will fail – i.e. none of these assumptions will increase productivity, increase throughput, decrease errors, improve compliance with internal and external regulations or improve outcomes if the User Interface at the run-time environment fails to improve the user experience (avoid having to say to users “easy for me, difficult for you”).

11. adequate training must be provided – the best results are obtained when the facilitator kicks off the 1st mapping session by giving the mouse to a stakeholder and saying “let’s map out one of your processes”.

12. many processes are dynamic, they must be maintained and occasionally targeted as candidates for improvement.

13. Wraparound BPM (360 degree BPM) is achieved when work performed under the guidance of BPM results in data that can be consolidated to KPIs at the strategy level.

Hurdles that need to be overcome

a) “you can manage complex processes by staring at paper process maps” – not true.

b) except for end-to-end processes, objectives belong at Cases hosting BPM/ACM, not at end points along flowgraphs – many times there are no end points (i.e. process fragments) – users, machines and software thread process fragments at run-time. In theory each Case can be different.

c) Cases can only be closed by Case Managers (“it ain’t over until the fat lady sings”).

d) Case Managers need decision support (from rules at tasks along flowgraph template instances, from the Case History, from rules global to the run-time environment, and from FOMM (Figure of Merit Matrices)) to avoid subjective assessment of progress toward Case goals/objectives.

Management needs to exercise reasonable patience – you can’t change a corporate culture overnight.

Posted in Business Process Management, Process Mapping, Automated Resource Allocation, Strategic Planning, Operational Planning, Process Management, Case Management, R.A.L.B., Competitive Advantage

How to achieve quick wins with BPM

Quick Wins definitely are the preferred business development approach for consultants compared to wasting time responding to RFPs.

Here is the pitch we have perfected over the past two decades.

How to Quick Start BPM in your organization

Let’s face it.

BPM and its direct antecedents (flowgraphs) have been around for a long time.

  • The methodology is not well known – we encounter business people daily who have never heard of BPM.
  • Another subset has heard of BPM but feel what they are doing presently (or not doing) is sufficient.
  • A third group has tried to implement BPM only to end up as members of a not-so-elite group that, according to some, experience failure rates of 70%.

We need to break out of this mold.

Technology alone is not going to help, so this leaves leadership and user onboarding to master.

If an organization wants BPM and users cooperate, it should be able to achieve liftoff and here is how to fast track your BPM initiative.


Pilot Phase

1. Go for low-hanging fruit – pick an initiative with not too long a timeframe, not too high a risk, with the potential to demonstrate quantifiable benefits.

2. Pick a pilot project process that is confined to one or two silos.

3. Go to the trouble of preparing an ROI (you will want to document before/after to get support for other initiatives).

Make sure you document the “before” (i.e. how long it takes to do work, how consistent the outcomes are).

Desired State of Affairs: e.g. The new process reduces the time to analyze a claim by 30%, the level of customer satisfaction increased from 2.5 to 4.5.

4. Bring in a facilitator to graphically map out the process in real-time.

Forget notations and UMLs – most processes only need nodes, directional arcs, branching decision boxes, imposed delays, loopbacks, exit points.

Facilitators lose much of their “magic” when they force a group of ordinary users to watch them build processes with notations, languages.

5. Park images of data collection forms needed at process steps on your mapping canvas so you can drag and drop form images at steps as you build your process.

Make sure the images post to forms that include a memo field – you will want at run time to be able to take quick note of complaints from stakeholders that the process logic is wrong, the forms in use are wrong, the performing roles are wrong, etc.

6. Do not slow down the project by programming rule sets during the first cycle.

Instead, describe rules in narrative terms only and make the branching decision boxes manual.

You can build rule sets and convert decision boxes to auto off-line.

7. Assign actual imposed delays to process steps that need these, but use a run-time environment that allows a temporary override down to 1-2 minutes for testing purposes.

8. Encode process steps with their correct Routings but allow a temporary override of the parent Routing of all Routings so that one person can run through the entire process without having to log out/in under different user accounts.

9. Compile your mapped process, roll it out, get a small group of stakeholders to piano-play the process, incorporate their suggestions/comments/corrections, re-map, roll out again etc.

If you cannot get through all of the listed steps in 1 ½ hours, your SOW for today is larger/more complex than it should be.

Only map in one session what you can roll out and test, update, roll out and test again. You can advance the process next session. Your users want/expect “instant gratification”.

Production Phase

1. Replace the imaged forms with real forms, build rule sets, put branching at decision boxes to auto, reset imposed delays to their plan-side values.

2. Collect run-time data (should be automatic in the environment you are using) for statistical analysis/machine analysis to improve your process.

3. Blend in FOMM (Figure of Merit Matrix) constructs at Cases so you can more easily track progress toward meeting Case objectives.

The overall objective for your “Quick Results” BPM project is on-time/on-budget completion, before/after documentation, user testimonials that it is easier to do their work with the system than without it.

Overall, you should be able to see increased productivity, increased throughput (be it in the area of processing patients, settling insurance claims, or completing MRO on a Blackhawk helicopter), reduction in processing errors, increased compliance with internal and external rules/regulations, all of which contribute to better outcomes and increased competitive advantage.

Posted in Business Process Management, Competitive Advantage, Process Mapping, Productivity Improvement

Success in Business – Reach for the Top With ACM 360

Success in business is all about building and enhancing competitive advantage.

Strategy “rules” – if you don’t know where you want to be, you won’t get there.


No point evolving plans that don’t make efficient/effective use of scarce resources. No point having plans that never get implemented. No point implementing plans that are not monitored.

Plan, monitor, control.

We knew that, but how do we actually make it happen?

  1. Go for the big picture for strategy formulation/management

If you can’t see it, you can’t include it or exclude it.

Seeing the big picture allows you to assess and prioritize initiatives.

Graphic free-form search Knowledgebases give you the big picture.

Kbases – don’t try to evolve/manage strategy without these.

Use the following model:

Corporate assets inventory -> strategy -> KPIs -> candidate initiatives -> ROIs -> Authorized Initiatives

     2. Go with Adaptive Case for operations management

Case is capable of hosting any mix of ad hoc and structured interventions. Look to background BPM to provide guidance re structured interventions. Look to Case level rule sets to provide governance.

A Case can store any object including data element values, .pdf, .doc, .txt, video/audio recordings and spreadsheets.

The latter are capable of providing a framework for assessing progress toward meeting ROI goals/objectives. The methodology of choice for non-subjective assessment of progress is FOMM (Figure of Merit Matrices).

Case environments automatically build longitudinal histories with date/timestamped, user-“signed” hyperlinks that allow viewing of data as it was, at the time it was collected, on the form versions that were in service at the time.

Case provides dynamic decision support in respect of the performance of work that derives from annual budget authorizations and work that is related to initiatives funded via ROIs.

Case – don’t go to the office without it.

Use the following model:

Initiatives -> Case setup -> monitoring -> data consolidation to Kbases -> KPI trending

     3. Face Up to the Dynamics and Bridge The Gap between Strategy and Operations.

In today’s world where 5-year plans have, under many scenarios, been compressed to 1 ½ years . . .

strategy can change, initiative priorities can change, goals and objectives can change.

Enter ACM  360 (Adaptive Case Management 360), for bridging the gap between Strategy and Operations, because of the wraparound that can be achieved by:

  1.  launching Cases for individual ROIs;
  2.  setting up permanent “bucket” Cases\Sub-Cases for the many different database record types organizations need (Corporate Assets, Land, Plant, Equipment, Tools, Customers, Customer Orders, Inventory In/Out, Supplier Orders, Shipments, etc);
  3.  managing operations, and;
  4.  consolidating data to corporate Kbases\KPIs.



Posted in Adaptive Case Management, Business Process Management, Case Management, Decision Making, Operational Planning, Strategic Planning, Uncategorized

Risk vs Uncertainty . . . again

I’m not sure people in general understand the difference between risk and uncertainty so here is an update on an article dated 2012.

Barry Ritholtz does a good job in the article, quoting Michael Mauboussin:


Risk: We don’t know what is going to happen next, but we do know what the distribution looks like.

Uncertainty: We don’t know what is going to happen next, and we do not know what the possible distribution looks like.

The distinction is important in the area of strategic planning.

ROIs for initiatives should always include a Risk Assessment (worst case, expected, best case).
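One common way to roll a worst/expected/best triple into a single planning figure is the classic three-point (PERT) estimate; a sketch, rather than anything the ROI template itself mandates:

```python
def three_point_estimate(worst, expected, best):
    """Classic PERT three-point estimate: the expected value weighted 4x."""
    return (worst + 4 * expected + best) / 6
```

The 4x weighting pulls the result toward the expected case, which is consistent with the advice that approvers should not count on landing too close to either extreme.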

Approvers of ROIs are well advised not to expect to end up too close to the bottom or too close to the top.

In respect of uncertainty, ROIs should also always include exit strategies.

Posted in Decision Making, Risk Analysis, Strategic Planning, Uncategorized

Patient Portals versus APIs for Patient Access to Healthcare Information

Back in November 2015, Health Data Management published an article called “Challenges Ahead for Portals”.

This is an interesting article because it indirectly describes the effect of too much regulatory involvement in healthcare services delivery.



In the article, Raj Ratwani, scientific director of MedStar Health’s National Center for Human Factors in Healthcare, states that patient portals “. . . do not present information in a manner that is understandable and useful”.

It’s likely that views regarding the inappropriateness of existing patient portals led to the inclusion, in the Stage 3 objective for Patient Electronic Access, of the patient needs to “view their health information, download their health information, transmit their health information to a third party and access their health information through an API”.

My point is it’s fine for regulatory agencies to set incentive objectives but not to narrowly specify the means by which such objectives should be met.

Whether a patient gains access to PHI via a portal or via an API is a decision best left to stakeholders who have a close connection to patients.

Under this scenario, if a vendor implements a portal that does not address patient needs, the patients will move to another healthcare service provider who either has a better portal implementation or an API that works well for such patients, and the provider, supposedly, would pick up on this and move to a different vendor.

Accordingly, portal/API selection should be the responsibility of vendors first, then healthcare service providers, picking solutions they feel their patients will find acceptable.

Vendor -> Provider Selection-> Patient Needs

The way things go when there is too much regulation is that regulators impose demands on vendors, healthcare service providers then select, from a reduced set of options, solutions they feel will address internal/patient needs, and the patients then decide whether the “solutions” meet their needs. I doubt very much whether the regulators consulted patients before reaching the conclusion that patients would be best served via APIs.

See how far away the patient is from the regulators under this alternative scenario.

Regulatory Authority -> Vendor -> Restricted Solution Selection for Providers-> Patient Needs

The reality is you can deliver patient healthcare information to patients using a number of technologies, one of which is an API at a Patient Portal (i.e. a hybrid solution). This avoids the need for the patient to download and install an API on the various devices they may want to use to access their healthcare information. All they need with a portal/API is to type in a URL and enter a user name/password.

The danger with the phraseology in the Stage 3 Final Rule is that software systems that do not have a traditional API could be categorized as not meeting the Stage 3 Final Rule.

Posted in FIXING HEALTHCARE, Interconnectivity, Interoperability, Meaningful Use, Uncategorized