Here’s the deal . . .
You are in a corporation that
1) “thinks” BPM,
2) has mapped processes,
3) has a run-time workflow/workload Case platform, where Cases have declared objectives,
4) has non-subjective means of assessing progress toward meeting Case objectives,
5) has a compiler that is capable of carving up your process maps into run-time BPM templates,
6) has a way to stream Cases onto BPM templates to generate private instances of said templates for each Case,
7) has good Case Managers.
All good, except that essential to requisite #3 (the run-time workflow/workload Case platform) is the ability to auto-build a Case History.
Each user log-in that augments data needs to result in auto-recording of the “session”: a system-applied date/timestamp, a user “signature,” and all data as it was at the time it was entered, on the form versions that were in service at the time.
Once in, the platform must not allow any changes to the data. Errors, omissions, and late data are handled by loading/posting copies of Forms, editing those copies, and recording new “sessions.”
Considering that not all data needed at a Case can be recorded precisely when it becomes known to a user, all Forms at process steps (structured or ad hoc) must accommodate a reference date-of-capture, which can precede the Case History session date by hours, days, even weeks.
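The mechanics above can be sketched in a few lines. This is a minimal illustration, not any particular platform's API: the names `CaseHistory` and `Session` are hypothetical, and a real platform would add cryptographic signing and durable storage. The point is the shape of the rule set: system-applied timestamp, user signature, form version, a separate date-of-capture, and append-only recording.

```python
# Minimal sketch of an append-only Case History (illustrative names only).
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass(frozen=True)  # frozen: a recorded session can never be edited
class Session:
    user_signature: str    # who entered the data
    form_version: str      # the form version in service at time of entry
    data: dict             # the data exactly as entered
    capture_date: datetime # reference date the data actually became known
    recorded_at: datetime = field(  # system-applied session date/timestamp
        default_factory=lambda: datetime.now(timezone.utc))


class CaseHistory:
    def __init__(self, case_id: str):
        self.case_id = case_id
        self._sessions: list[Session] = []

    def record(self, session: Session) -> None:
        """Append only -- corrections are posted as new sessions on
        copies of the Form, never as edits to what is already here."""
        self._sessions.append(session)

    def sessions(self) -> tuple[Session, ...]:
        return tuple(self._sessions)  # read-only view for auditors
```

Note that `capture_date` and `recorded_at` are deliberately distinct fields, which is what lets the capture date trail the session date by hours, days, or weeks.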
The cardinal rule in some industry sectors is that data not in the system “does not exist.” The interpretation of this rule is that protocol requires users to visit the Case History before taking any current decisions/actions. If the data is not in the Hx, there is a good chance the decision will be made solely on the basis of what is in the Case History. Who knew what, when, is all-important in many Case audits.
So, given a Case History, how do you go about improving decision-making at Cases and dynamically improving your inventory of processes that run across Cases?
First comes data analytics.
Unless you are trying to use crowd facial recognition at shopping malls to push big-screen notices to individual buyers about the Internet searches they did last night, data analytics for improved dynamic decision-making is not complicated.
A small change at branching decision boxes along workflows allows analytics to provide a hint to the user as to which branching options have been the most popular (i.e. they went that way 60% of the time).
Clearly, your sample size must be sufficient, and you may need or want to filter your statistics by timeframe, especially for initiatives that anticipate different seasonal outcomes or initiatives where legislation has changed recently.
As for dynamic process improvement, the best approach I have been able to think of is to post the process map and then overlay it to show skips, jumps, and loopbacks, with stats where possible.
Ad hoc interventions should be noted as well, particularly in terms of their timing (e.g. each time step #1024 is skipped, we observe that a specific ad hoc intervention is inserted, possibly giving a good indication that the map needs to be updated to show the ad hoc intervention in lieu of the skipped step).
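Building that overlay starts with comparing what each Case actually did against the mapped step order. A rough sketch, with illustrative names only (`mapped_steps` is the map's step sequence, `trace` is the executed sequence from the Case History): a step visited out of forward order counts as a loopback, and mapped steps passed over on the way forward count as skips. Tallying these across Cases gives the stats for the overlay and flags candidates, like the step #1024 example, for a map update.

```python
# Sketch: flag skips and loopbacks by overlaying an executed Case trace
# on the mapped step order (names are illustrative, not a real API).
def deviations(mapped_steps, trace):
    """Return (skipped_steps, loopback_steps) for one Case trace."""
    pos = {step: i for i, step in enumerate(mapped_steps)}
    skipped, loopbacks = [], []
    last = -1  # index of the furthest mapped step reached so far
    for step in trace:
        if step not in pos:
            continue  # ad hoc intervention: tallied separately
        i = pos[step]
        if i < last:
            loopbacks.append(step)            # went backwards on the map
        else:
            skipped.extend(mapped_steps[last + 1:i])  # jumped ahead
            last = i
    return skipped, loopbacks
```

Steps in the trace that are not on the map at all are the ad hoc interventions; correlating them with the skips returned here is what surfaces the "every time #1024 is skipped, intervention X appears" pattern.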