A HIMSS discussion group on LinkedIn recently posted the following question: “Smarter, Safer, Cheaper. What’s the true promise for healthcare big data?”
Here was my response:
Rule #1 is you cannot analyze data you did not collect.
Rule #2 is you need easy ways of displaying the results of analytics as overlays to best practice protocols.
Rule #3 is you cannot follow Rule #2 if you don’t have best practice protocols (preferably in the form of flowgraphs as opposed to written policy/procedure).
In industrial Case Management (one example being military helicopter MRO), no two units that come in have the same configuration, so it’s important to have a Case History of all interventions, organized by “session,” presented in reverse chronological order, and viewable by system/sub-system. The key to MRO is knowing when a particular system/sub-system needs periodic inspection and having data on hand to help predict problems before they actually materialize.
You end up with a lot of data, but the data collection is effortless because all work is guided by flowgraphs (do this, then do that), data collection is automated, and analytics routinely examine the data to identify ways and means of improving the flowgraphs.
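A Case History of this kind can be sketched as a simple data structure. This is a minimal illustration, not any particular MRO system’s schema; the class and field names (`Intervention`, `CaseHistory`, `system`, `subsystem`) are my own assumptions:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Intervention:
    timestamp: datetime
    system: str       # e.g. "hydraulics"
    subsystem: str    # e.g. "tail-rotor actuator"
    notes: str

@dataclass
class CaseHistory:
    unit_id: str
    interventions: list = field(default_factory=list)

    def log(self, intervention: Intervention) -> None:
        self.interventions.append(intervention)

    def sessions(self) -> list:
        """All interventions, most recent session first."""
        return sorted(self.interventions, key=lambda i: i.timestamp, reverse=True)

    def by_system(self, system: str) -> list:
        """The same reverse-chronological view, filtered to one system."""
        return [i for i in self.sessions() if i.system == system]
```

The point of the two views is exactly what the MRO example calls for: the full history in reverse chronological order, plus the ability to slice it by system/sub-system when assessing inspection needs.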
It’s no different in healthcare, except much more complicated: the number of permutations and combinations is such that you cannot expect to have end-to-end flowgraphs capable of handling the processing of most individual patients.
So we end up with best practice process fragments that healthcare professionals have to thread together manually, with some assistance from software and machines.
The following capabilities are needed:
#a. Skip a step along a best practice protocol.
#b. Revisit an already committed step.
#c. Forward record data at steps not yet current.
#d. Insert a step not in the original protocol.
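The four capabilities above can be sketched as operations on a running protocol instance. This is a toy model under my own assumptions (a linear step list, an audit log of `(action, step)` tuples), not a real workflow product’s API:

```python
# Minimal sketch of the four run-time capabilities (a)-(d) against a
# linear best-practice protocol.

class ProtocolRun:
    def __init__(self, steps):
        self.steps = list(steps)          # working copy of the protocol
        self.committed = {}               # step -> recorded data
        self.log = []                     # audit trail of every action
        self.current = 0                  # index of the current step

    def commit(self, data=None):
        step = self.steps[self.current]
        self.committed[step] = data
        self.log.append(("commit", step))
        self.current += 1

    def skip(self):                       # (a) skip a step
        step = self.steps[self.current]
        self.log.append(("skip", step))
        self.current += 1

    def revisit(self, step, data=None):   # (b) revisit a committed step
        assert step in self.committed
        self.committed[step] = data
        self.log.append(("revisit", step))

    def forward_record(self, step, data): # (c) record data at a step not yet current
        assert self.steps.index(step) > self.current
        self.committed[step] = data
        self.log.append(("forward", step))

    def insert(self, step):               # (d) insert an ad hoc step
        self.steps.insert(self.current, step)
        self.log.append(("insert", step))
```

Note that every operation writes to `self.log`; that audit trail is what the analytics described later would mine for skip, loopback, and insertion patterns.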
In all of this it is important to be able to capture the data, so you need a workflow management software environment that automatically logs all data as it is entered, session by session, at computer screens, at process steps, and at ad hoc intervention steps. (That is: map out your processes plan-side, roll them out, manage tasks at Cases via a run-time workflow management environment, and log data at the Case Hx with a parallel flow to a “data warehouse” that analytic tools can easily link to.)
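The “log once, flow twice” idea can be shown in a few lines: each committed screen entry lands in the Case History and, in parallel, in a flat analytics-friendly log. The names (`SessionLogger`, `record`) and record shape are illustrative assumptions:

```python
# Sketch: every entry committed at a screen/step writes to the Case
# History and, in parallel, to a warehouse-friendly flat log.

class SessionLogger:
    def __init__(self):
        self.case_hx = {}       # case_id -> ordered list of entries
        self.warehouse = []     # flat rows analytic tools can link to

    def record(self, case_id, session, step, payload):
        entry = {"session": session, "step": step, "data": payload}
        self.case_hx.setdefault(case_id, []).append(entry)
        # Parallel flow: the same entry, flattened into a warehouse row.
        self.warehouse.append({"case_id": case_id, **entry})
```

Because both sinks are fed from one `record()` call, data collection stays effortless for the user while the warehouse stays complete for the analysts.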
The challenge then becomes detecting, via analytics, patterns in the logged data. For a 1-2-3-4-5-6 … 12 workflow, examples include:
* Step 5 is almost always skipped. Why then should it remain a mandatory step in the protocol? Either make it optional, or eliminate it.
* Step 3 is almost always repeated via an ad hoc intervention following a commit at step 8. The process template probably needs to have a loopback added.
* One or more ad hoc steps are almost always inserted at step 12. Why not update the protocol to make these part of the best practice template?
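Detecting the first and third patterns above amounts to counting, per protocol step, how often cases skipped it or inserted ad hoc steps at it. A minimal sketch, assuming each case’s log is a list of `(action, step)` events like those a workflow engine would emit:

```python
from collections import Counter

def step_pattern_stats(logs, protocol):
    """logs: one list of (action, step) events per case.
    Returns per-step skip and ad hoc insertion rates across all cases."""
    skips, inserts = Counter(), Counter()
    for events in logs:
        for action, step in events:
            if action == "skip":
                skips[step] += 1
            elif action == "insert":
                inserts[step] += 1
    n = len(logs)
    return {step: {"skip_rate": skips[step] / n,
                   "insert_rate": inserts[step] / n}
            for step in protocol}
```

A step with a skip rate near 1.0 is a candidate for demotion to optional; a step with a high insertion rate is a candidate for promotion into the template.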
It is helpful to inventory all of the best practices as graphic flowgraphs for easy updating.
Analytics should ideally be able to build virtual meta cases that show protocol variations across Cases relating to patients with similar presentations, yielding statistics along the lines of “across 1,000 cases, the cases branched this way 5% of the time, 12% of the time that way, with no branching 83% of the time”.
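The branching statistics described above reduce to counting distinct paths taken across comparable Cases. A minimal sketch, assuming each case is recorded as the sequence of steps it actually followed:

```python
from collections import Counter

def branch_stats(case_paths):
    """case_paths: one step sequence per case.
    Returns the share of cases that followed each distinct path variant."""
    counts = Counter(tuple(p) for p in case_paths)
    total = len(case_paths)
    return {path: count / total for path, count in counts.items()}
```

Run over 1,000 comparable cases, the output is exactly the kind of virtual meta case summary described: each path variant with the fraction of cases that branched that way.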