Unlocking The Secrets To Building and Sustaining Competitive Advantage


 
 

Competitive Advantage is the result of better use of available Resources.

The range of Resources for any corporation can include:

Capital, Access to Capital, Land, Equipment, Tools, Premises, Staff, Current Products/Services, Products/Services Under Development, Projects Awaiting Approval, Technology Trends, Changing Legislation, Competitors.

We know from RBV (Resource Based View) that corporations that are able to “view” all of their Resources tend to make better decisions about building up a proper mix of initiatives that draw on these resources (i.e. avoiding high-risk/low-return initiatives, avoiding initiatives that tie up key resources for too long a period of time, and terminating or cancelling initiatives that are non-performing).

Clearly, Operations needs to put a dual focus on work that advances the state of initiatives and on work that is supportive of ongoing initiatives (i.e. maintaining compliance with external rules and regulations).

A problem arises when Operations puts too sharp a focus on, for example, processes.

There is no direct path between “continuous process improvement” and success from the implementation of corporate initiatives.  Whereas process improvement impacts efficiency, it only impacts effectiveness marginally.

The direct path from work to competitive advantage is as detailed below:

It’s not that difficult for an organization to transition to this model.

1. Actors who perform the work and oversee the progress of work need a workspace (commonly called a “Case”).

2. The workspace must have an undercurrent comprising:

  • orchestration from background BPM,
  • governance at the Case/Initiative level,
  • workload management, i.e. RALB (Resource Allocation, Leveling and Balancing),
  • non-subjective assessment of progress toward meeting goals/objectives, i.e. FOMM (Figure of Merit Matrices); a minimal sketch follows below.
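
By way of illustration only (the goals, weights and 0-10 scoring scale below are invented, not a prescribed FOMM format), a figure of merit can be as simple as a weighted average across Case goals, which is what makes the assessment non-subjective:

    # Minimal FOMM sketch: weighted average of 0-10 progress scores per Case goal.
    # Goals, weights and scores are hypothetical.

    def figure_of_merit(scores: dict[str, float], weights: dict[str, float]) -> float:
        total_weight = sum(weights.values())
        return sum(scores[g] * weights[g] for g in scores) / total_weight

    scores = {"intake complete": 10.0, "assessment done": 6.0, "plan approved": 2.0}
    weights = {"intake complete": 1.0, "assessment done": 2.0, "plan approved": 3.0}
    print(f"Figure of merit: {figure_of_merit(scores, weights):.1f} / 10")  # 4.7 / 10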

See the 300+ articles on this blog on the importance of orchestration from workflows, governance, workload management and non-subjective approaches to decision-making, both for strategy development and for the achievement of operational efficiency and effectiveness:

https://kwkeirstead.wordpress.com/

Photo Credit:
benwhitephotography

Posted in #strategy, Adaptive Case Management, BPM, Business Process Management, Case Management, Decision Making, FOMM, Operations Management, Resource Based View, Strategic Planning | 2 Comments

Problems with Throwaway Code in Transaction Processing Systems


It seems pointless in any transaction processing system to:

1) go to the trouble of only allowing data to be recorded via Official Interfaces,

2) go to the trouble of building Transaction Histories that allow recall of Sessions and viewing of data, as it was, at the time it was collected, on the Form versions that were in service at the time,

and then process transactions using code you build on-the-fly and throw away afterward.

Part of the purpose of a Transaction History is to be able to re-trace the processing in the event of errors.

The problem goes beyond throwaway code – it extends to any local or remote system or application where you do not have absolute control over source code changes (i.e. where you do not hold an archive of all versions of the source).

So, the only practical option when accepting data from any local or remote system that is not part of your main transaction processing app is to carry out incoming-data reasonableness checks that confirm that processing results are within range.

This is difficult (e.g. if the temperature today is 32 degrees F and an app that maps your temperature reading to degrees C shows 300 degrees C, you know something is not right; but if you get back 1 degree C, the processing is also wrong, and a simple range check will not flag it).

A partial remedy in a BPMS is to position pre-processing and post-processing rules at process steps to carry out real-time “audits” on outgoing and incoming data.

Pre-processing: “Is it OK to engage processing at this step?”

Post-processing: “Are the calculated results from the local or remote external app within boundary conditions?”
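
As a sketch of what such boundary-condition “audits” can and cannot catch (using the temperature example above; the function names and ranges are illustrative, not from any particular BPMS):

    # Hypothetical "reasonableness" checks on data returned by an external app.

    def in_plausible_range(temp_c: float) -> bool:
        # Coarse range check: catches 300 C, but passes a wrong 1 C.
        return -60.0 <= temp_c <= 60.0

    def matches_recomputation(temp_f: float, temp_c: float, tol: float = 0.5) -> bool:
        # Tighter check, available only when the transformation can be recomputed.
        return abs(temp_c - (temp_f - 32.0) * 5.0 / 9.0) <= tol

    print(in_plausible_range(300.0))         # False - gross error caught
    print(in_plausible_range(1.0))           # True  - wrong result slips through
    print(matches_recomputation(32.0, 1.0))  # False - caught only by recomputation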

Comment:

Note the reference to “official interfaces” above.

No organization today will allow data to be “poked” into its data structures – the reason is that such actions bypass in-line security.

It follows that the only acceptable approach to shipping/receiving transactions between any two systems involves use of an agreed-upon data transport envelope that publishers generate and subscribers import, presumably invoking appropriate pre-processing rules.
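
A minimal sketch of such an envelope (the field names and schema are assumptions for illustration; any agreed-upon format would serve), with the subscriber invoking a pre-processing rule before accepting the payload:

    import json

    # Hypothetical agreed-upon data transport envelope (field names illustrative).
    def publish(payload: dict, source_system: str) -> str:
        envelope = {"source": source_system, "schema_version": "1.0", "payload": payload}
        return json.dumps(envelope)

    def subscribe(raw: str) -> dict:
        envelope = json.loads(raw)
        # Pre-processing rule: reject anything that bypasses the agreed envelope.
        if envelope.get("schema_version") != "1.0" or "payload" not in envelope:
            raise ValueError("rejected: data did not arrive via an official interface")
        return envelope["payload"]

    wire = publish({"temp_f": 32.0}, source_system="weather-app")
    print(subscribe(wire))  # {'temp_f': 32.0}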

The usual range of “official interfaces” includes:

  1. Direct keying at traditional User Interfaces (user logs in, loads an app, engages whatever processing options are available at the application menu bar and icon bar, then logs out).
  2. Portal access using a range of devices that are able to get to the Internet via a browser.
  3. Data exchange in/out of a generic data exchanger (using “push” or “pull”).
Posted in Database Technology, Software Design, Software Source Control | Leave a comment

Augmenting Decision Support at Adaptive Case Management (ACM) Platforms


Traditional ACM(1) augments background BPM(2) decision-support at Cases via two methods: RALB(3) and FOMM(4).

  1. Allowing Users to micro-schedule assigned tasks (RALB).
  2. Allowing Case Managers to periodically assess progress toward meeting Case goals and objectives (FOMM).

Statistical Overlays

Augmented decision support at active Cases can be provided via statistical overlays of data mined across completed Cases:

  1. Provisional assignment of durations for not-yet-current tasks.
  2. Engagement of a CPM (Critical Path Method) algorithm that calculates actual/expected dates at the Case end node.

In practical use, this setup would provide advice/assistance as follows: engage this sub-pathway and get to Case closure in eight (8) weeks; engage another sub-pathway and get to Case closure in six (6) weeks.
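
A minimal sketch of the forward-pass calculation behind such advice (the task graph and durations, in weeks, are invented; in practice the provisional durations would come from data mined across completed Cases):

    # CPM forward pass: earliest finish at each task, expected date at the end node.
    durations = {"A": 1, "B": 3, "C": 2, "D": 2}                # provisional, weeks
    predecessors = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}

    earliest_finish: dict[str, int] = {}
    for task in ["A", "B", "C", "D"]:                           # topological order
        start = max((earliest_finish[p] for p in predecessors[task]), default=0)
        earliest_finish[task] = start + durations[task]

    print(f"Expected Case closure: week {earliest_finish['D']}")  # week 6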

Caveat

Traditional CPM assumes a merge of all pathways to a single end node. Cases typically have multiple end nodes.

It follows that unless users are prompted to indicate one or more successor nodes at each Case end node, at the time the node is declared to be “complete”, the CPM algorithm will not be able to calculate the “critical path”.

Probabilistic Branching Overlays

Given that, unlike CPM, ACM engagement of some divergent sub-pathways is optional, data mining can further augment decision support at ACM platform branching decision boxes via probabilistic overlays (i.e. users chose option “A” 40% of the time, users chose option “B” 60% of the time).

Clearly, some filtering is required when data mining (i.e. exclude Cases that did not go to successful closings; exclude branching decision box probabilistic overlay options that have low rates of reported use).

Note that if seasonal filtering is in effect for data mining, a 40/60% overlay for “summer” can easily display/shift to 20/80% or 80/20% for “winter”, depending on the focus of a Case.
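
A minimal sketch of how such overlays might be mined (the record format, filters and thresholds are assumptions for illustration):

    from collections import Counter

    # Hypothetical mined records: (option picked at a decision box, outcome, season).
    history = [
        ("A", "closed_ok", "summer"), ("B", "closed_ok", "summer"),
        ("B", "closed_ok", "summer"), ("A", "abandoned", "summer"),
        ("B", "closed_ok", "winter"), ("B", "closed_ok", "winter"),
    ]

    def overlay(records, season, min_share=0.05):
        # Keep successful closings only, matching the seasonal filter.
        kept = [opt for opt, outcome, s in records
                if outcome == "closed_ok" and s == season]
        counts = Counter(kept)
        total = sum(counts.values())
        # Drop options with low rates of reported use.
        return {opt: n / total for opt, n in counts.items() if n / total >= min_share}

    print(overlay(history, "summer"))  # {'A': 0.33..., 'B': 0.66...}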

Recommendation
If you do not currently have an initiative to improve Decision Making at your ACM platform, my recommendation is to study the potential of statistical overlays and probabilistic overlays before jumping onto the RPA and AI “bandwagons”.

 

Terms used in this Blog Post:

(1) ACM – Adaptive Case Management

(2) BPM – Business Process Management

(3) RALB – Resource Allocation, Leveling and Balancing

(4) FOMM – Figure of Merit Matrices

Posted in Adaptive Case Management, Decision Making, FOMM, Risk Analysis | Tagged | Leave a comment

Is this where you want to be?


This is a pitch to consultants and business managers who subscribe to and use BPM (Business Process Management).

It turns out there are two flavors of BPM  –  “B(P)M” (business management that receives orchestration from process templates, process fragment templates, users, software and machines) and “(BP)M” (management by business processes).

Any similarity between the two ends here.

If you put a focus on B(P)M what this means is that you are subscribing to the use of BPM to provide background workflow orchestration at Cases – this impacts efficiency and effectiveness.

(BP)M, on the other hand, puts too sharp a focus on processes – you impact efficiency, but only minimally impact effectiveness, unless your processes are all end-to-end processes.

If you are up to it, and your clients are on board, clearly, B(P)M is where you will want to be.

Transitioning from (BP)M to B(P)M requires a change in mindset, so here are a few tips & tricks.

Most (BP)M consultants come from a background of end-to-end processes.  Mapped end-to-end processes are easy to roll out to production environments. You have one start task and one end task (e.g. “cutting the ribbon”). The objective is to get to the end task. Your process map details what, why, who and, to an extent, where and when.

B(P)M is different.

Much of the “process management” assistance that clients are looking for today is not in the area of end-to-end processes.

What we have today are “process fragments” that get threaded together at run time by workers, software and machines. Process fragments do not have plan-side objectives.

Under B(P)M, objectives become a property of the run-time Case, i.e. a patient, an insurance claim, a helicopter under MRO. For discussion purposes here, “Case” is not equivalent to “use case”.

Process fragments continue the tradition of what, why, who, where and when, except that some of the interventions are now ad hoc interventions. Tasks at Cases become a mix of structured and unstructured interventions – this adds flexibility to “process management” (i.e. Case management, really), but it also adds variability to where and when.

“Ribbon-cutting” at a Case under B(P)M takes place when the Case Manager closes the Case – no exceptions!

As and when you transition to B(P)M, your legacy BPMS will need major surgery.

As explained, your new BPMS will need to accommodate any mix of structured and unstructured Case interventions. Fortunately, once you realize that a process of one step still is a process, no accommodation is needed so long as your BPMS provides workflow and workload functionality.

Seamless threading of process fragments adds a bit of complexity.

Since you can no longer rely on logic connections between tasks to guide all processing, each process fragment needs a rule set at its start task so that the task can report “OK to engage processing” or “NOT OK to engage processing”.

Your tasks become more data-driven under B(P)M.

However, manual override by a user/Case Manager is always an option i.e. skip the task. At some risk/peril to all stakeholders. . .

It’s worthwhile here to elaborate on the term “data-driven”.

Whereas, with end-to-end BPM and legacy BPMSs, data flows take place along process pathways, under B(P)M all non-instance-specific data at a Case can be accessed by a process fragment rule set.
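
As a sketch of how such a rule set might read Case data and report “OK to engage processing” or “NOT OK to engage processing” (the Case field names and rules are hypothetical, not the syntax of any particular BPMS):

    # Start-task rule set for a process fragment, reading Case-level data.

    def ok_to_engage(case_data: dict, user_override: bool = False) -> bool:
        if user_override:
            return True  # manual skip/engage, at some risk to stakeholders
        rules = [
            case_data.get("intake_complete", False),
            case_data.get("consent_on_file", False),
        ]
        return all(rules)

    case = {"intake_complete": True, "consent_on_file": False}
    print(ok_to_engage(case))                      # False: NOT OK to engage
    print(ok_to_engage(case, user_override=True))  # True: Case Manager override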

Rule sets in B(P)M are pervasive – they can be found upstream from tasks, at tasks, and immediately downstream from tasks. They can be found at branching decision boxes and are essential at most loop back constructs to prevent churning.

Finally, your BPMS needs R.A.L.B. and F.O.M.M.

You can’t properly manage work at Cases without R.A.L.B. (Resource Allocation, Leveling and Balancing) – absent R.A.L.B., for anything beyond a moderately complex workflow you become unable to carry out workload management.

Re F.O.M.M. (Figure of Merit Matrices) – you won’t get consistency across Cases if Case Managers cannot get a “second opinion” from F.O.M.M. The unique contribution of F.O.M.M. is to make decision-making non-subjective.

Now, before leaving this space, click on the link below – you will get to a hard-to-find music video that I feel is fantastic.

Title Inspiration:

Kacey Musgraves & Willie Nelson “Are you sure this is where you want to be”

https://music.youtube.com/watch?v=cDCFjYVKAkY&feature=share

Posted in BPM, Business Process Management, Case Management, Process Management | Tagged | 1 Comment

How do we get to Excellence from Efficiency and Effectiveness?


Efficiency is the easiest journey; the journey to effectiveness is more difficult.

Achieving excellence requires all hands on deck, in my view, and is therefore the most difficult of the three journeys.

One thing I recall from my days at GE was my boss encouraging all design engineers to do daily walkabouts, to see firsthand the impact of their designs on production-line work/workers.

Walkabouts are helpful for discovering the reasons for inconsistencies in finished products and, in a few cases, for discovering and explaining consistencies.

In the plant where I worked, one of the products was watt-hour meters and these, at the time, required magnets.

For years, HQ could not understand why the quality of the magnets was so high relative to other watt-hour meter production plants.

A walkabout revealed that the blacksmith (yes, they had a blacksmith in those days), whose shift was 8 to 4, same as everyone else’s, actually came in at 6 AM to start up his hearth and stayed on the job until 6 PM to clean up.

The effect was a steady state by 8 AM and readiness to start the next day by 6 PM.

I have been a fan of walkabouts (along production facilities and in the office) since that time.

Posted in Competitive Advantage, Operational Efficiency, Productivity Improvement | Tagged , , | 1 Comment

The Nature of Strategic Decision Making


Much of what I read about “Business Management/Decision Making” seems to be written by folks who have little experience in business management.

Business Management is all about Decision Making (but not only about decision making).

The purpose of this article is to put a focus on strategic decision-making – specifically how strategic decisions are made in real corporate life.

Corporations evolve strategies and allocate resources to Initiatives by way of ROI (Return on Investment) submissions/authorizations.

All Initiatives have goals/objectives. All Initiatives have time spans. If you see one that appears to go on and on, you are looking at an initiative that receives extensions to previous allocations of resources.

There is no point doing work that does not contribute to advancing the state of an Initiative toward its goals/objectives.

Progress toward initiative goals/objectives is non-linear. Generally, it follows an “S” curve (slow to achieve liftoff, followed by rapid progress, then slowing down toward the end of implementation).

Work involves steps, different steps require different resources and most work benefits from consistent use of “best practice” protocols. Some work is unstructured.

Decisions along Initiative timelines must be made before steps, at steps and after steps in order to maintain forward momentum of Initiatives.

Decision-making is the transformation of information into action.

I count six (6) sources of information for decision-making (knowledge, experience, intuition, wisdom, data/analytics, and rule sets/algorithms).

Good decisions are generally the result of reliance on more than one of the six (6) sources of information.

  1. Knowledge maps easily to information providing the decision-maker understands what specific knowledge he/she has access to (i.e. known knowns, known unknowns, unknown knowns, unknown unknowns).
  2. Experience maps to information when such experience was gained dealing with initiatives similar to the one that has the focus.
  3. Intuition maps to information when the decision-maker has a good track record relying on intuition at prior initiatives.
  4. Wisdom is a state of maturity that some people reach – in respect of decision making it has two manifestations i.e. knowing what to do, knowing what not to do.
  5. Data/analytics maps to information when the data is good and the analysis is sound.
  6. Rule-sets map well to initiatives when data is within the boundary conditions of the rule sets or when an algorithm working on the same type/quality of data has yielded good decisions.

Final points . . . .

Decisions typically get made when they need to be made.

Many decisions are made in the absence of adequate information, without consideration of associated risk/uncertainty and without consideration of the amount of resources they tie up (i.e. from low risk/short timeline/high return to high risk/long timeline/low return).

When you are making decisions, bear in mind Donald Rumsfeld’s 4Ks (known knowns, known unknowns, unknown knowns, unknown unknowns). What you don’t know will hurt you!

Another good piece of advice is to bear in mind that if you cannot see the resources you are committing to initiatives, the quality of any decisions you make will be diminished.

The core message of the RBV (Resource Based View) methodology is “… it is difficult to make decisions when you cannot see the resources that will be impacted by such decisions”.

See “Decisions, Decisions, Decisions” (2014-12-02) for an operational perspective on decision-making.

https://wp.me/pzzpB-CX

 

Posted in #strategy, Enterprise Content Management, Knowledge Bases, Risk Analysis | Tagged | Leave a comment

Protect and Serve – The search for efficiency and effectiveness


Police Departments have the same overall focus as private sector corporations:

1) evolving strategies that are supportive of a mission, then defining and putting in place initiatives that make good use of available scarce resources;

2) achieving operational efficiency and effectiveness.

Photo Credit: Jan-Gottweis

 

Whereas corporations bridge the gap between operations and strategy using ROI requests/approvals, PDs strive to eliminate/avoid any gap between operations and strategy by way of operational adherence to published policy and procedures.

For this to happen, Policy and Procedure (P&P) must exist and be readily available to all members of any individual or team response to incidents, and be readily available to staff tasked with managing Cases.

P&P can be evolved using a range of Document Management Systems.

Assuming a common set of services across same-size-city PDs, three approaches can be used for writing/distributing P&P.

a) independent research.

b) reference to policy models (i.e. IACP).

c) construction of a Kbase featuring P&P from other same-size-city PDs.

Our preference in providing consulting services to PDs is option c), where, subject to copyright approval, we provide our clients with a Kbase comprising full-text P&P content from 10-20 PDs. The client can then extract their P&P from the DMSs they are using (e.g. PowerDMS), add it to the Kbase, and proceed to carry out full-text searches across the content of the Kbase.

Typical questions are:

  1. Do we have P&P for terrorist drone attacks?
  2. What is covered and to what level of detail?
  3. Are any updates to our P&P appropriate for “terrorist drone attacks”?

Regarding availability of P&P, there are currently three options for rollout:

a) “off-line” (printed manuals),

b) “on-line” (portal access),

c) “in-line” (rollout of P&P in the form of checklists with data capture facilities or real-time posting of P&P content task-by-task as tasks become current along the incident or Case timeline, with data capture facilities).

Our preference for rollout is, again, the last listed option (i.e. “in-line”).

This involves mapping P&P narratives to workflows, followed by software carving up the workflows into tasks according to skill contribution or administrative level of approval. A workload management engine then posts individual tasks to the attention of staff for information and action as these tasks become “current” along the workflow timeline.

“In-Line” improves incident response and Case decision-making (performing the right tasks, at the right time, using the right resources, using the right forms), with auto-consolidation of all data from all tasks to a command and control Incident/Case Management platform and Incident/Case History.

Errors and omissions decrease dramatically as a result of orchestration (i.e. auto-task posting) plus governance (i.e. rule sets that operate on any data that is input by staff).

As usual, methods are NOT totally portable from the private sector to the public sector.

Whereas in a private sector setting governance plays the role of accommodating deviation away from P&P so long as extreme, unwanted deviation does not take place, governance in the public sector needs to be tighter.

Secondly, whereas corporations find it useful to consolidate and trend key indicators to a Kbase, Police Departments want full-text Case content at Kbases so that staff can “connect-the-dots” across active and cold Cases.

Here is a screenshot of a Policy and Procedure Kbase that has links to various resources plus links to 10 published major-city Police Department P&P data sets (about 5,000 documents in all, with the potential to go to 15,000 documents).

For more information on Kbase construction and the use of 3D free-form-search knowledgebases to increase operational efficiency and effectiveness within police departments, call Civerex at +1 800 529 5355 (USA) or +1 450 458 5601 (elsewhere).

[Screenshot: Policy and Procedure Kbase with links to major-city PD P&P data sets]

Posted in Community Policing, Crime Reduction Initiatives, Homicide Investigations, Law Enforcement, Major Crimes Case Management, Strategic Planning | Tagged | Leave a comment

BPM Process Automation


BPM (Business Process Management) can deliver solid automation benefits to corporations willing to transition through several BPM Maturity Levels.

The benefits are increased operational efficiency and effectiveness.

It’s important to understand the difference between the two terms (i.e. you can be efficient and effective; you can be somewhat inefficient, yet effective; but if you are not effective, then it does not matter whether you are inefficient or efficient).

Here are the four BPM maturity levels.

No rocket science here – you will have no difficulty assessing the current level of maturity of your organization.

Level I: Process Mapping (process maps on paper)

Level II: Rollout of compiled process maps to a run-time Case Management Platform (ACM(1)/BPM)

Level III: Rollout of improved process maps featuring automation

Level IV: Orchestration based on predictive analytics of run-time data

Practical Automation Example

If you are a regular subscriber to this Blog, you are familiar with BPM maturity Levels I and II (see the Background notes at the end of this article).

The objective of this blog post is to remove a hurdle that prevents many practitioners from reaching BPM Level III.

The hurdle is the notion that it is difficult to automate.

Clearly, the approach to automation varies from one BPM mapping environment to the next, and some rule sets are truly complicated to set up, but the screenshots below should convince consultants, business analysts and IT staff that basic automation typically requires little more than putting in place a stack of simple algebraic expressions (i.e. a Rule Set) at process steps.

Let’s say you have a BPM (Business Process Management) workflow that features several linked tasks, where all of the tasks along the workflow are currently performed by humans.

In the example below, we see a very small subset of a Law Enforcement Protocol for responding to a 911 violent crime call.

The flowgraph features a start node, followed by a branching decision box (in yellow), with two sub-flows depending on whether the Suspect is NOT On Scene or is ON Scene.

Clearly, the flowgraph is a good candidate for automation. We can eliminate the decision box by asking the question “Is the suspect on scene?” at the “Arrival on Scene” data collection Form, adding a rule set at the decision box, and then declaring the decision box to be an “auto-commit” flowgraph step.

[Screenshot: On-Scene/NOT-On-Scene workflow automation example]

 

Example:

An obvious part of any process improvement initiative will involve considering automation of some of the tasks along the workflow.

Where to start?

Clearly, it’s best to look to tasks that are tedious to perform manually, tasks that do not need to be performed manually, and tasks where processing errors put Cases “off the rails”.

The two (2) ‘Branching decision box’ tasks (Steps #2/#3) in our workflow are good candidates for automation.

Run the sample workflow and you get the following sequence of tasks.

[Screenshot: task sequence before automation]


Now, introduce automation.

1. First, add data recording at step #1: “Is the suspect ON Scene or NOT On Scene?” (Y/N)

2. Next, add a Rule Set at the step #2 option “NOT On Scene”:

i=0; x=FALSE; If Edit finds string “N”, then i=1; If i=1 Then x=TRUE

3. Add a Rule Set at the step #3 option “ON Scene”:

i=0; x=FALSE; If Edit finds string “Y”, then i=1; If i=1 Then x=TRUE

4. Now compile and run your upgraded workflow and note that you have reduced the number of steps in your workflow by 40%.

[Screenshot: task sequence after automation]
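
For readers more comfortable with conventional code than with the rule-set notation above, the two rule sets amount to something like this (a sketch only; the actual rule syntax is specific to the BPMS in use):

    # The answer recorded at step #1 is "Y" (ON Scene) or "N" (NOT On Scene).

    def not_on_scene_rule(answer: str) -> bool:
        # Mirrors: i=0; x=FALSE; If Edit finds "N", then i=1; If i=1 Then x=TRUE
        return "N" in answer.upper()

    def on_scene_rule(answer: str) -> bool:
        # Mirrors: i=0; x=FALSE; If Edit finds "Y", then i=1; If i=1 Then x=TRUE
        return "Y" in answer.upper()

    answer = "N"
    print(not_on_scene_rule(answer))  # True  -> auto-commit the NOT-On-Scene branch
    print(on_scene_rule(answer))      # False -> the ON-Scene branch is not engaged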


Hint: Rule-building at options in a Branching Decision Box can be greatly simplified by noting (as in the above) that the differences from the rule set at one option to the next are usually minor. The way to capitalize on this is to build the Form for the 1st option, clone the Form, rename the clone to the new option name, and then carry out minor edits on the new option’s Form.


Background

Level I has your users staring at BPM paper process maps – better than no process maps in that you get some orchestration, but there is no governance.

Level II adds real-time orchestration – your compiled, rolled-out BPM process templates guide the performance of work along BPM workflows.

Host BPM template instances within a workflow/workload run-time environment to allow Users to micro-schedule their steps and allow supervisors to allocate, level and balance workload (R.A.L.B.(2)) across Users.

Add F.O.M.M.(3) to allow Case Managers to better assess progress toward meeting Case goals/objectives and you have reached Level II.

Efficiency increases when workers are encouraged to follow “best practices” while being allowed to deviate from “best practices” when necessary or appropriate.

Governance, in the form of rule sets upstream from steps, at steps and downstream from steps, acts to reduce extreme, unwanted, deviations away from “best practices”.

Level III reduces the amount of labor required via automation of some workflow steps, giving, under many scenarios, important cost savings, improved throughput and increased efficiency (i.e. reduced number of steps, absolute compliance with protocol at automated steps).

It’s difficult to provide metrics – for some workflows 70% of the steps are candidates for automation, for others only 20% of the steps are candidates for automation.

Bear in mind that automating only a few steps can give important savings if you are processing thousands of orders per day using BPM process templates.

Level IV provides predictive advice and assistance to workers at branching decision points along workflow templates (this requires routing of data collected along the workflow to a data warehouse, followed by data mining).

The results of data mining can then be posted at steps to guide users (i.e. “60% of Cases went this way, as opposed to going that way”). In the interest of not getting caught up in self-fulfilling prophecies, only display predictions after the User has picked an option, and then only when the picked option is not toward the top of the list of the analytics.
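
A small sketch of that display guard (the function and data shapes are invented):

    # Show the mined prediction only when the user's pick is not already at
    # the top of the analytics, avoiding self-fulfilling prophecies.

    def should_display(prediction_ranking: list[str], picked: str, top_n: int = 1) -> bool:
        return picked not in prediction_ranking[:top_n]

    ranking = ["A", "B", "C"]  # mined options, most-travelled first
    print(should_display(ranking, picked="A"))  # False: already on the top option
    print(should_display(ranking, picked="C"))  # True: worth showing the analytics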

Level IV puts you in semi-auto process improvement mode.

 

Definitions
(1) ACM: Adaptive Case Management
(2) RALB: Resource Allocation, Leveling and Balancing
(3) FOMM: Figure of Merit Matrices

 

Posted in BPM, Business Process Improvement, Business Process Management, FOMM, Operational Efficiency, Process Mapping | Tagged | Leave a comment

Project Management versus Business Process Management


Rule #1 in the area of technology selection is “First the problem, then the solution”.


Photo by rawpixel.com on Pexels.com

Suppose you want to improve outcomes and you state your need as “workflow and workload management” software.

Vendors offering project management (CPM) software will immediately detail how their software addresses workflows and workload management, backing this up with a list of satisfied customers.

Vendors of Business Process Management (BPM) software will immediately deliver the exact same pitch, and back up their pitch with a list of satisfied customers.

How can two very different technologies satisfy the same need?

The answer is they don’t and won’t.

The reason is prospective buyers typically fail to state whether, for their scenario, workflow logic and timing can be predicted (as in the construction business) or whether workflow logic and timing cannot be predicted (as in knowledge work).

CPM delivers its promise, i.e. to calculate the “critical path”, which involves predicting project completion dates and predicting costs at completion. This implies advance knowledge of workflow logic and advance knowledge of task durations and costs.

Some flavors of BPM deliver the BPM promise (i.e. providing orchestration to knowledge workers via background BPM) via facilities that allow periodic assessment of progress toward stated goals/objectives. In most knowledge work, workflow logic will vary over part of the lifecycle of the initiative but otherwise follow the mapped logic. As for costs, these will not be predictable, as they are tied to how long it takes to complete the tasks that end up needing to be performed.

The CPM methodology has been around since the mid-1950’s with antecedents invented before 1900. If you see terminology like “early-start, early-finish, late-start, late-finish, critical path and float” you know you are dealing with CPM software.  “Simplified” CPM software drops formal task logic, giving users Gantt Charts, timelines etc. With CPM, you are doing “project management”.

The BPM methodology dates back to the early 1990’s. Early BPM solutions had an almost exclusive focus on process documenting, process modeling and process improvement.  Aspiring BPM practitioners had to first master a BPM notation (e.g. BPMN) before undertaking their 1st BPM project.

BPM consultants and BPM software vendors soon realized that what customers wanted/needed was not “Business ‘Process Management’ ” but “Business (Process) Management”. This led to a method called Adaptive Case Management (ACM) where the focus is on managing workload to reach Case goals/objectives with the help of orchestration from background BPM workflows called “best practices”.

Whereas BPM is a plan-side method, ACM is strictly a run-time-side method, and it needs two additional methods: RALB (Resource Allocation, Leveling and Balancing), with obvious implications, plus FOMM (Figure of Merit Matrices), which gives Case Managers the ability to make periodic non-subjective assessments of progress toward Case goals/objectives. ACM also requires “Rule Sets” to provide governance (i.e. the prevention of extreme unwanted deviation away from “best practices”). With ACM/BPM you are doing Case Management.

Discovery that customers wanted/needed ‘Business (Process) Management’ led to a temporary paradox of encouraging workers, on the one hand, to make consistent use of “best practices” (orchestration), yet instructing these same workers that they should feel free to deviate from “best practices” where appropriate, with the proviso not to engage in extreme unwanted deviation away from ‘best practices” (governance).

It turned out there was no paradox at all.  You can have background orchestration from “best practices” providing you also have in place governance that prevents extreme, unwanted deviation away from “best practices”.

Steve Jobs summed it all up nicely “It doesn’t make sense to hire smart people and then tell them what to do; we hire smart people so they can tell us what to do.”

Bottom line, if you are looking for workflow/workload management solutions, make sure you know the type of work you want to manage and then, only, focus on CPM or ACM/BPM.

Steer clear of CRM, DCM and other narrow ‘solutions’ – these are likely to address a portion of your needs, but not all of your stated needs.

Actually, the thing about needs is you ideally want something capable of satisfying your unanticipated future needs, not just your current needs.

Posted in Adaptive Case Management, Business Process Management, Competitive Advantage, Operational Planning, Process Management, Productivity Improvement, Project Planning, R.A.L.B., Scheduling, Software Acquisition | Leave a comment

Crime Reduction Initiatives Implementation Guide


Success with Crime Reduction Initiatives involves identification of high-return/low-risk candidate initiatives, setting the focus on a small number of initiatives, goal/objective setting, then updating and rollout of Departmental Policy & Procedure.

Crime Reduction – Drug Distribution/Use

Phase I
Identification of high return/low risk candidate initiatives                 RBV
Selection of one or more initiatives                                                           RBV
Goal/objective setting                                                                                   RBV

The starting position for implementing a Crime Reduction Initiative is to use a strategy-building method such as RBV (Resource-Based View), i.e. consulting local, state and federal crime statistics, browsing available publications/documents, identifying candidate initiatives, selecting one or more initiatives for implementation, and setting goals/objectives.

Phase II
Updating Policy & Procedure (P&P), rollout of P&P                                 BPM

Implementation of a Crime Reduction Initiative includes updating P&P and rollout of P&P to staff for day-to-day guidance/governance. Here, the method of choice is BPM (Business Process Management), i.e. detailing the sequence of needed interventions, guiding the processing of interventions, and preventing deviation away from protocol using rules.

Phase III
Day-to-day incident/case management                                                      ACM
Periodic assessment of progress to goals/objectives                                ACM

ACM (Adaptive Case Management) is the method of choice for incident response/case management. Background BPM templates provide orchestration, in-line rules provide governance, ACM accommodates recording of data at protocol steps and auto-building of Incident/Case Histories.

Researching Initiatives
In the CiverMind™ demo system screenshot below, we have set up  links to various “resources” from select resource publishers, one of which is National Public Safety Participation (NPSP).

Depending on a PD’s setup for CiverMind, a CiverMind sheet could host links to 20,000 or more resources from various Publishers. For this demo, we will be featuring 3,000 resource links.

It is important to be able to research/select initiatives within a 3D knowledgebase such as CiverMind because the amount of information searching/prioritization needed can otherwise be overwhelming – a 3D Kbase User Interface allows all searching/ prioritization of candidate initiatives to be carried out at ONE computer screen.


CiverMind 3D Kbase Screenshot

We can engage keyword searches such as “drug, juvenile” to find resources to include in a Crime Reduction Initiative for “drug use by juveniles”. The search gave 487 hits in the demo.

A revised search phrase consisting of “drug juvenile NPSP” narrowed the hits to a more manageable 100.

Notice that NPSP provides a “Toolkit” for extraction of copies of NPSP resources – CiverMind accommodates buildup of links to resources of multiple hosted publishers.
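
A minimal sketch of this kind of narrowing keyword search (the resource records are invented for illustration):

    # AND-keyword search across resource links; adding a term narrows the hits.
    resources = [
        {"title": "Juvenile drug diversion programs", "publisher": "NPSP"},
        {"title": "Juvenile drug courts", "publisher": "IACP"},
        {"title": "Drug interdiction on highways", "publisher": "IACP"},
        {"title": "Juvenile curfew enforcement", "publisher": "NPSP"},
    ]

    def search(docs, *terms):
        terms = [t.lower() for t in terms]
        return [d for d in docs
                if all(t in (d["title"] + " " + d["publisher"]).lower() for t in terms)]

    print(len(search(resources, "drug", "juvenile")))          # 2 hits
    print(len(search(resources, "drug", "juvenile", "npsp")))  # narrowed to 1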

Updating Policy/Procedure
As and when initiatives are identified for implementation, there will almost always be a need to update Policy/Procedure (i.e. preparing new P&P, updating existing P&P).

The usual scenario in respect of P&P involves updating of multiple P&P documents.
P&P management is greatly simplified when a Department hosts all of its P&P in a Kbase like CiverMind (i.e. auto-revisioning, auto-storage, easy access to hundreds or thousands of protocols from one computer screen, sophisticated search/masking of protocols with no “hits”).

Start with an initial search across your current P&P to find, for the above example, existing P&P relating to drugs/juveniles to avoid unnecessary proliferation of new P&P documents.

Most PDs have some 200 Protocols; some run only 1-2 pages, others run 20 or more pages.

Our demo Kbase includes P&P for various major city PDs. This is a dramatization – in the normal course of events a PD will only need/want to host its own Policy and Procedure.

Rollout of Policy and Procedure to Member Smartphones/Tablets
Once you have an updated set of your P&P, your choices for rollout are: a) off-line publication (i.e. printed manuals), b) on-line publication (e-books), or c) in-line publication, where narrative P&P texts for process steps are embedded in flowgraphs as checklists, with auto-posting of steps to user InTrays at smartphones and tablets.

The benefits of in-line publication with orchestration and governance are as follows:

• Greatly reduced errors and omissions
• Real-time data collection, improved dynamic decision making at Incidents /Cases
• Improved outcomes

Not convinced you need 3D Kbases to manage 5,000 or more document links?

Here is a view of the same data featured above, minus the organizational capability that is native to 3D Kbase platforms . . .

[Screenshot: the same resource links, expanded, without the organizational capability of a 3D Kbase]

 

Posted in Crime Reduction Initiatives, Knowledge Bases, Law Enforcement | Leave a comment