The Nature of Strategic Decision Making

Much of what I read about “Business Management/Decision Making” seems to be written by folks who have little experience in business management.

Business Management is all about Decision Making (but not only about decision making).

The purpose of this article is to put a focus on strategic decision-making – specifically how strategic decisions are made in real corporate life.

Corporations evolve strategies and allocate resources to Initiatives by way of ROI (Return on Investment) submissions/ authorizations.

All Initiatives have goals/objectives. All Initiatives have time spans. If you see one that appears to go on and on, you are looking at an initiative that receives extensions to previous allocations of resources.

There is no point doing work that does not contribute to advance the state of an Initiative toward its goals/objectives.

Progress toward initiative goals/objectives is non-linear. Generally, it follows an “S” curve (slow to achieve liftoff, followed by rapid progress, only to slow down toward the end of implementation).
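The “S” curve can be modeled with a logistic function. Here is a minimal Python sketch; the midpoint and steepness values are illustrative assumptions, not taken from any particular initiative.

```python
import math

def s_curve_progress(t, midpoint=0.5, steepness=10.0):
    """Illustrative logistic model of initiative progress.

    t: fraction of the initiative timeline elapsed (0.0 to 1.0).
    Returns an estimated fraction of goals achieved (0.0 to 1.0).
    midpoint and steepness are assumed, tunable parameters.
    """
    return 1.0 / (1.0 + math.exp(-steepness * (t - midpoint)))

# Slow liftoff, rapid middle, slow finish:
for t in (0.1, 0.5, 0.9):
    print(f"{t:.1f} -> {s_curve_progress(t):.2f}")
```

At 10% of the timeline the model predicts only about 2% progress, at the halfway mark 50%, and at 90% of the timeline about 98% progress, matching the liftoff/rapid-progress/tail-off shape described above.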

Work involves steps, different steps require different resources and most work benefits from consistent use of “best practice” protocols. Some work is unstructured.

Decisions along Initiative timelines must be made before steps, at steps and after steps in order to maintain momentum of Initiatives.

Decision-making is the transformation of information into action.

I count six (6) sources of information for decision-making (knowledge, experience, intuition, wisdom, data/analytics, and rule sets/algorithms).

Good decisions are generally the result of reliance on more than one of the six (6) sources of information.

  1. Knowledge maps easily to information providing the decision-maker understands what specific knowledge he/she has access to (i.e. known knowns, known unknowns, unknown knowns, unknown unknowns).
  2. Experience maps to information when such experience was gained dealing with initiatives similar to the one that has the focus.
  3. Intuition maps to information when the decision-maker has a good track record relying on intuition for prior initiatives.
  4. Wisdom is a state of maturity that some people reach – in respect of decision-making it has two manifestations: knowing what to do and knowing what not to do.
  5. Data/analytics maps to information when the data is good and the analysis is sound.
  6. Rule-sets map well to initiatives when data is within the boundary conditions of the rule sets or when an algorithm working on the same type/quality of data has yielded good decisions.
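The boundary-condition caveat in point 6 can be sketched in a few lines of Python. The function name, bounds and thresholds below are illustrative assumptions, not from any particular rule engine: the point is that an automated rule set should only fire when the data is inside its boundary conditions, and otherwise defer to a human.

```python
def decide(value, lower=0.0, upper=100.0):
    """Apply an automated rule set only when `value` is inside the
    rule set's boundary conditions; otherwise escalate to a human
    decision-maker. Bounds and the 50.0 threshold are illustrative."""
    if not (lower <= value <= upper):
        return "escalate"          # outside boundary conditions
    return "approve" if value >= 50.0 else "reject"

print(decide(75.0))   # inside bounds -> "approve"
print(decide(-5.0))   # outside bounds -> "escalate"
```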

Final points . . . .

Decisions typically get made when they need to be made.

Many decisions are made in the absence of adequate information, without consideration of associated risk/uncertainty and without consideration of the amount of resources they tie up (i.e. from low risk/short timeline/high return to high risk/long timeframe/low return).

When you are making decisions bear in mind Donald Rumsfeld’s 4K’s (known knowns, known unknowns, unknown knowns, unknown unknowns). What you don’t know will hurt you!

Another good piece of advice is to bear in mind that if you cannot see the resources you are committing to initiatives, the quality of any decisions you make will be diminished.

The core message of the RBV (Resource Based View) methodology is “… it is difficult to make decisions when you cannot see the resources that will be impacted by such decisions”.

See “Decisions, Decisions, Decisions” (2014-12-02) for an operational perspective on decision-making.



Protect and Serve – The search for efficiency and effectiveness

Police Departments have the same overall focus as private sector corporations:

1) evolving strategies that are supportive of a mission, then defining and putting in place initiatives that make good use of available scarce resources.

2) achieving operational efficiency and effectiveness.




Whereas corporations bridge the gap between operations and strategy using ROI requests/approvals, PDs strive to eliminate/avoid any gap between operations and strategy by way of operational adherence to published policy and procedures.

In order for this to happen, Policy and Procedure (P&P) must exist and be readily available both to all members responding to incidents, individually or in teams, and to staff tasked with managing Cases.

P&P can be evolved using a range of Document Management Systems.

Assuming a common set of services across same-size-city PDs, three approaches can be used for writing/distributing P&P.

a) independent research.

b) reference to policy models (i.e. IACP).

c) construction of a Kbase featuring P&P from other same-size-city PDs.

Our preference in providing consulting services to PDs is option c) where, subject to copyright approval, we provide our clients with a Kbase comprising full-text P&P content from 10-20 PDs. The client can then extract their own P&P from whatever DMSs they are using, add it to the Kbase, and proceed to carry out full-text searches across the content of the Kbase.

Typical questions are:

  1. Do we have P&P for terrorist drone attacks?
  2. What is covered and to what level of detail?
  3. Are any updates to our P&P appropriate for “terrorist drone attacks”?

Regarding availability of P&P, there are currently three options for rollout:

a) “off-line” (printed manuals),

b) “on-line” (portal access),

c) “in-line” (rollout of P&P in the form of checklists with data capture facilities or real-time posting of P&P content task by task as tasks become current along the incident or Case timeline, with data capture facilities).

Our preference for rollout is, again, the last listed option (i.e. “in-line”).

This involves mapping P&P narratives to workflows, followed by software carving up the workflows into tasks according to skill contribution or administrative levels of approval. A workload management engine posts tasks to the attention of staff for information and action.
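A toy Python sketch of the carve-and-post step described above. The task names, skills and the supervisor fallback are invented for illustration; a real workload management engine would also handle priorities and leveling.

```python
from collections import defaultdict

def post_tasks(workflow, staff_by_skill):
    """Post each workflow task to the InTray of a staff member
    holding the required skill.

    workflow: list of (task_name, required_skill) tuples.
    staff_by_skill: dict mapping a skill to a staff member.
    Returns a dict of InTrays keyed by staff member."""
    intrays = defaultdict(list)
    for task, skill in workflow:
        # Assumed fallback: unmatched skills route to a supervisor.
        member = staff_by_skill.get(skill, "supervisor")
        intrays[member].append(task)
    return dict(intrays)

workflow = [("Secure scene", "patrol"),
            ("Interview witnesses", "detective"),
            ("Log evidence", "patrol")]
print(post_tasks(workflow, {"patrol": "Officer A", "detective": "Det. B"}))
```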

“In-Line” improves incident response and Case decision-making (performing the right tasks, at the right time, using the right resources, using the right forms), with auto-consolidation of all data from all tasks to a command and control Incident/Case History.

Errors and omissions decrease dramatically as a result of orchestration (i.e. auto-task posting) plus governance (i.e. rule sets that operate on any data that is input by staff).

As usual, methods are NOT totally portable from private sector to the public sector.

Whereas in a private sector setting governance plays the role of accommodating deviation away from P&P so long as extreme, unwanted deviation does not take place, governance in the public sector needs to be tighter.

Secondly, whereas corporations find it useful to consolidate and trend key indicators to a Kbase, Police Departments want full-text Case content at Kbases so that staff can “connect-the-dots” across active and cold Cases.

Here is a screenshot of a Policy and Procedure Kbase that has links to various resources and to 10 published major-city Police Department P&P data sets (about 5,000 documents in all, with the potential to go to 15,000 documents).

For more information on Kbase construction and the use of 3D free-form-search knowledgebases to increase operational efficiency and effectiveness within police departments, call Civerex at +1 800 529 5355 (USA) or +1 450 458 5601 (elsewhere).



BPM Process Automation

BPM (Business Process Management) can deliver solid automation benefits to corporations willing to transition through several BPM Maturity Levels.

The benefits are increased operational efficiency and effectiveness.

It’s important to understand the difference between the two terms (i.e. you can be efficient and effective; you can be somewhat inefficient, yet effective; but if you are not effective, then it does not matter whether you are inefficient or efficient).

Here are the four BPM maturity levels.

No rocket science here – you will have no difficulty assessing the current level of maturity of your organization.

Level I: Process Mapping (process maps on paper)

Level II: Rollout of compiled process maps to a run-time Case Management Platform (ACM(1)/BPM)

Level III: Rollout of improved process maps featuring automation.

Level IV:  Orchestration based on predictive analytics of run-time data

Practical Automation Example

If you are a regular subscriber to this Blog, you are familiar with BPM maturity Levels I and II. (See the Background Notes at the end of this article.)

The objective of this blog post is to remove a hurdle that prevents many practitioners from reaching BPM Level III.

The hurdle is the notion that it is difficult to automate.

Clearly, the approach to automation varies from one BPM mapping environment to the next and some rule sets are truly complicated to set up, but the screenshots below should convince consultants/business analysts and IT staff that basic automation typically requires little more than putting in place a stack of simple algebraic expressions (i.e. a Rule Set) at process steps.

Let’s say you have a BPM (Business Process Management) workflow that features several linked tasks, where all of the tasks along the workflow are currently performed by humans.

In the example below, we see a very small subset of a Law Enforcement Protocol for responding to a 911 violent crime call.


An obvious part of any process improvement initiative will involve considering automation of some of the tasks along the workflow.

Where to start?

Clearly, it’s best to look to tasks that are tedious to perform manually, tasks that do not need to be performed manually, and tasks where processing errors put Cases “off the rails”.

‘Branching decision box’ tasks (Steps #2/#3 in our workflow) are good candidates for automation.

Run the sample workflow and you get the following sequence of tasks.







Now, introduce automation.


First, add data recording at step #1

“Is the suspect ‘ON Scene’ or ‘NOT ON Scene’?” (Y/N)


Next, add a Rule Set at the #2 step\option “NOT On Scene”

i=0; x=FALSE; If Edit finds string “N”, then i=1; If i=1 Then x=TRUE


Add a Rule Set at the #3 step\option “On Scene”

i=0; x=FALSE; If Edit finds string “Y”, then i=1; If i=1 Then x=TRUE


Now compile and run your upgraded workflow and note that you have reduced the number of steps in your workflow by 40%.
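For readers who prefer conventional code to the tool-specific rule syntax above, here is an illustrative Python translation of the two rule sets. The `Edit`/string-matching mechanics of the BPM tool are assumptions; the logic shown is the same "if the recorded value contains the trigger string, the branch fires" pattern.

```python
def rule_fires(recorded_value, trigger):
    """Python mirror of: i=0; x=FALSE; If Edit finds string
    <trigger>, then i=1; If i=1 Then x=TRUE."""
    i = 1 if trigger in recorded_value else 0
    return i == 1   # x

# Step #1 records "Y" (suspect on scene) or "N" (not on scene).
on_scene_branch = rule_fires("Y", "Y")        # rule set at step #3
not_on_scene_branch = rule_fires("Y", "N")    # rule set at step #2
print(on_scene_branch, not_on_scene_branch)   # True False
```

With one data element recorded at step #1, the two branching decision boxes resolve themselves, which is what removes the two manual steps from the workflow.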







Hint: Rule-building at options in a Branching Decision Box can be greatly simplified by noting (as in the above) that differences from one rule set at one option to the next are usually minor. The way to capitalize on this is to build the form for the 1st option, clone the Form, rename the clone to the new option name and then carry out minor edits on the new option Form.


Background Notes

Level I has your users staring at BPM paper process maps – better than no process maps in that you get some orchestration, but there is no governance.

Level II adds real-time orchestration – your compiled, rolled-out BPM process templates guide the performance of work along BPM workflows.

Host BPM template instances within a workflow/workload run-time environment to allow Users to micro-schedule their steps and allow supervisors to allocate, level and balance workload (R.A.L.B.(2)) across Users.

Add F.O.M.M.(3) to allow Case Managers to better assess progress toward meeting Case goals/objectives and you have reached Level II.

Efficiency increases when workers are encouraged to follow “best practices”, while allowing them to deviate from “best practices” when necessary or appropriate.

Governance, in the form of rule sets upstream from steps, at steps and downstream from steps, acts to reduce extreme, unwanted, deviations away from “best practices”.

Level III reduces the amount of labor required via automation of some workflow steps, giving, under many scenarios, important cost savings, improved throughput and increased efficiency (i.e. reduced number of steps, absolute compliance with protocol at automated steps).

It’s difficult to provide metrics – for some workflows 70% of the steps are candidates for automation, for others only 20% of the steps are candidates for automation.

Bear in mind that automating only a few steps can give important savings if you are processing thousands of orders per day using BPM process templates.

Level IV provides predictive advice and assistance to workers at branching decision points along workflow templates (requires routing of data collected along workflows to a data warehouse, followed by data mining).

The results of data mining can then be posted at steps to guide users (i.e. “60% of Cases went this way, as opposed to going that way”). In the interest of not getting caught up in self-fulfilling prophecies, only display predictions after the User has picked an option and then, only when the picked option is not toward the top of the list of the analytics.
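The display guard described above can be sketched as a small Python predicate. The data shapes and option names are assumed; the point is simply "no pick, no advice; agreeable pick, no advice".

```python
def should_show_prediction(picked, ranked_options, top_n=1):
    """Decide whether to display analytics to the user.

    ranked_options: options ordered by historical frequency,
    most common first. Returns True only when the user has
    already picked and the pick disagrees with the top `top_n`
    predictions - avoiding self-fulfilling prophecies."""
    if picked is None:               # no pick yet -> never show
        return False
    return picked not in ranked_options[:top_n]

ranked = ["refer to detective", "close case", "escalate"]
print(should_show_prediction("close case", ranked))          # True
print(should_show_prediction("refer to detective", ranked))  # False
```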

Level IV puts you in semi-auto process improvement mode.


(1)ACM : Adaptive Case Management
(2)RALB: Resource Allocation, Leveling and Balancing
(3)FOMM: Figure of Merit Matrices



Project Management versus Business Process Management

Rule #1 in the area of technology selection is “First the problem, then the solution”.


Suppose you want to improve outcomes and you state your need as “workflow and workload management” software.

Vendors offering project management (CPM) software will immediately detail how their software addresses workflows and workload management, backing this up with a list of satisfied customers.

Vendors of Business Process Management (BPM) software will immediately deliver the exact same pitch, and back up their pitch with a list of satisfied customers.

How can two very different technologies satisfy the same need?

The answer is they don’t and won’t.

The reason is prospective buyers typically fail to state whether, for their scenario, workflow logic and timing can be predicted (as in the construction business) or whether workflow logic and timing cannot be predicted (as in knowledge work).

CPM delivers its promise, i.e. to calculate the “critical path”, which involves predicting project completion dates and costs at completion. This implies advance knowledge of workflow logic and advance knowledge of task durations and costs.

Some flavors of BPM deliver the BPM promise (i.e. providing orchestration to knowledge workers via background BPM) via facilities that allow periodic assessment of progress toward stated goals/objectives. In most knowledge work, workflow logic will vary over part of the lifecycle of the initiative but otherwise follow the templated “best practices” logic. As for costs, these will not be predictable, as they are tied to how long it takes to complete the tasks that end up needing to be performed.

The CPM methodology has been around since the mid-1950s, with antecedents invented before 1900. If you see terminology like “early-start, early-finish, late-start, late-finish, critical path and float” you know you are dealing with CPM software. “Simplified” CPM software drops formal task logic, giving users Gantt Charts, timelines etc. With CPM, you are doing “project management”.
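The forward/backward pass behind those CPM terms can be shown in a short Python sketch. Task names and durations are invented, and the tasks are assumed to be listed in topological order; real CPM software handles arbitrary networks, calendars and resource constraints.

```python
def cpm(durations, preds):
    """Minimal Critical Path Method pass.

    durations: dict of task -> duration, in topological order.
    preds: dict of task -> list of predecessor tasks.
    Returns (project_end, float_per_task, critical_path)."""
    es, ef = {}, {}                            # early start/finish
    for t in durations:
        es[t] = max((ef[p] for p in preds[t]), default=0)
        ef[t] = es[t] + durations[t]
    project_end = max(ef.values())
    succs = {t: [s for s in durations if t in preds[s]] for t in durations}
    ls, lf = {}, {}                            # late start/finish
    for t in reversed(list(durations)):
        lf[t] = min((ls[s] for s in succs[t]), default=project_end)
        ls[t] = lf[t] - durations[t]
    float_ = {t: ls[t] - es[t] for t in durations}
    critical = [t for t in durations if float_[t] == 0]
    return project_end, float_, critical

durations = {"A": 3, "B": 2, "C": 4, "D": 1}
preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}
print(cpm(durations, preds))
```

In this tiny network, task B carries 2 units of float while A, C and D form the critical path, so the project completes at time 8; that date is only trustworthy because the logic and durations were known in advance, which is exactly the precondition CPM imposes.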

The BPM methodology dates back to the early 1990s. Early BPM solutions had an almost exclusive focus on process documenting, process modeling and process improvement. Aspiring BPM practitioners had to first master a BPM notation (e.g. BPMN) before undertaking their 1st BPM project.

BPM consultants and BPM software vendors soon realized that what customers wanted/needed was not “Business ‘Process Management’ ” but “Business (Process) Management”. This led to a method called Adaptive Case Management (ACM) where the focus is on managing workload to reach Case goals/objectives with the help of orchestration from background BPM workflows called “best practices”.

Whereas BPM is a plan-side method, ACM is strictly a run-time-side method and it needs two additional methods: RALB (Resource Allocation, Leveling and Balancing), with obvious implications, plus FOMM (Figure of Merit Matrices), which gives Case Managers the ability to make periodic non-subjective assessments of progress toward Case goals/objectives. ACM also requires “Rule Sets” to provide governance (i.e. the prevention of extreme unwanted deviation away from “best practices”). With ACM/BPM you are doing Case Management.

Discovery that customers wanted/needed “Business (Process) Management” led to a temporary paradox of encouraging workers, on the one hand, to make consistent use of “best practices” (orchestration), yet instructing these same workers that they should feel free to deviate from “best practices” where appropriate, with the proviso not to engage in extreme unwanted deviation away from “best practices” (governance).

It turned out there was no paradox at all. You can have background orchestration from “best practices” provided you also have in place governance that prevents extreme, unwanted deviation away from “best practices”.

Steve Jobs summed it all up nicely: “It doesn’t make sense to hire smart people and then tell them what to do; we hire smart people so they can tell us what to do.”

Bottom line, if you are looking for workflow/workload management solutions, make sure you know the type of work you want to manage, and only then focus on CPM or ACM/BPM.

Steer clear of CRM, DCM and other narrow ‘solutions’ – these are likely to address a portion of your needs, but not all of your stated needs.

Actually, the thing about needs is you ideally want something capable of satisfying your unanticipated future needs, not just your current needs.


Crime Reduction Initiatives Implementation Guide

Success with Crime Reduction Initiatives involves identification of high return/low risk candidate initiatives, setting the focus on a small number of initiatives, goal/objective setting, then updating and rollout of Departmental Policy & Procedure.

Crime Reduction – Drug Distribution/Use

Phase I
Identification of high return/low risk candidate initiatives                 RBV
Selection of one or more initiatives                                                           RBV
Goal/objective setting                                                                                   RBV

The starting position for implementing a Crime Reduction Initiative is to use a strategy-building method such as RBV (Resource-Based View) (i.e. consulting local, state and federal crime statistics, browsing available publications/documents, identifying candidate initiatives, selecting one or more initiatives for implementation, and setting goals/objectives).

Phase II
Updating Policy & Procedure (P&P), rollout of P&P                                 BPM

Implementation of a Crime Reduction Initiative includes updating P&P and rollout of P&P to staff for day-to-day guidance/governance. Here, the method of choice is BPM (Business Process Management) i.e. detailing the sequence of needed interventions, guiding the processing of interventions, preventing deviation away from protocol using rules.

Phase III
Day-to-day incident/case management                                                      ACM
Periodic assessment of progress to goals/objectives                                ACM

ACM (Adaptive Case Management) is the method of choice for incident response/case management. Background BPM templates provide orchestration, in-line rules provide governance, ACM accommodates recording of data at protocol steps and auto-building of Incident/Case Histories.

Researching Initiatives
In the CiverMind™ demo system screenshot below, we have set up links to various “resources” from select resource publishers, one of which is National Public Safety Participation (NPSP).

Depending on a PD’s setup for CiverMind, a CiverMind sheet could host links to 20,000 or more resources from various Publishers. For this demo, we will be featuring 3,000 resource links.

It is important to be able to research/select initiatives within a 3D knowledgebase such as CiverMind because the amount of information searching/prioritization needed can otherwise be overwhelming – a 3D Kbase User Interface allows all searching/ prioritization of candidate initiatives to be carried out at ONE computer screen.

CiverMind Crime Reduction

CiverMind 3D Kbase Screenshot

We can engage keyword searches such as “drug, juvenile” to find resources to include in a Crime Reduction Initiative for “drug use by juveniles”. The search gave 487 hits in the demo.

A revised search phrase consisting of “drug juvenile NPSP” narrowed the hits to a more manageable 100.
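The narrowing effect of adding search terms can be illustrated with a few lines of Python. The documents below are made up, and this is not the CiverMind search engine, just the underlying AND-of-terms idea.

```python
def search(docs, *terms):
    """Return docs whose text contains every term (case-insensitive).
    Each additional term can only shrink the hit list."""
    terms = [t.lower() for t in terms]
    return [d for d in docs if all(t in d["text"].lower() for t in terms)]

docs = [
    {"id": 1, "text": "NPSP toolkit: drug use by juveniles"},
    {"id": 2, "text": "Drug awareness program for juvenile offenders"},
    {"id": 3, "text": "Traffic stop procedure"},
]
print(len(search(docs, "drug", "juvenile")))          # broader: 2 hits
print(len(search(docs, "drug", "juvenile", "NPSP")))  # narrower: 1 hit
```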

Notice that NPSP provides a “Toolkit” for extraction of copies of NPSP resources – CiverMind accommodates buildup of links to resources of multiple hosted publishers.

Updating Policy/Procedure
As and when initiatives are identified for implementation, there will almost always be a need to update Policy/Procedure (i.e. preparing new P&P, updating existing P&P).

The usual scenario in respect of P&P involves updating of multiple P&P documents.

P&P management is greatly simplified when a Department hosts all of its P&P in a Kbase like CiverMind (i.e. auto-revisioning, auto-storage, easy access to hundreds or thousands of protocols from one computer screen, sophisticated search/masking of protocols with no “hits”).

Start with an initial search across your current P&P to find, for the above example, existing P&P relating to drugs/juveniles to avoid unnecessary proliferation of new P&P documents.

Most PDs have about 200 Protocols; some run only 1-2 pages, others run 20 or more pages.

Our demo Kbase includes P&P for various major city PDs. This is a dramatization – in the normal course of events a PD will only need/want to host its own Policy and Procedure.

Rollout of Policy and Procedure to Member Smartphones/Tablets
Once you have an updated set of your P&P, your choices for rollout are a) Off-line publication (i.e. printed manuals) b) On-line publication (e-books) or c) In-line publication where narrative P&P texts for process steps are embedded in flowgraphs as checklists with auto-posting of steps to user InTrays at smartphones and tablets.

The benefits of in-line publication with orchestration and governance are as follows:

• Greatly reduced errors and omissions
• Real-time data collection, improved dynamic decision making at Incidents /Cases
• Improved outcomes

Not convinced you need 3D Kbases to manage 5,000 or more document links?

Here is a view of the same data featured above, minus the organizational capability that is native to 3D Kbase platforms . . .




Must-Have Features at a Run-Time Case Management Platform

Case Managers spend their time managing Cases and leveling and balancing resources across Cases.

Popular examples of Cases are Patients in healthcare, Investigations in law enforcement but, in general, a Case is just a cursor position at a post-relational database management system. So, we can have “Cases” where the focus is on a supplier, a customer, an insurance claim, a helicopter receiving periodic maintenance.

The methodology of choice for Case Management is ACM (Adaptive Case Management).

ACM recognizes that a Case ends up as a mix of structured protocols (i.e. process fragments) plus ad hoc interventions. Each Case is typically unique. Each has goals and objectives and their presence is essential to closing any Case.

If you are planning on developing a Case Management software suite or planning to acquire one, here is a list of fifteen (15) “must-have” features:

  • Official interfaces only (normal users, casual users, import/export engine)
  • Case Hx (longitudinal view)
  • Case Hx (workflow view)
  • Case Goals/Objectives
  • FOMM for assessing progress toward Case Goals/Objectives
  • Post-relational dbms
  • Menu of Services (for selecting “best practices” protocols)
  • User Workspace (InTray, capable of hosting “best practices” protocol template instances)
  • Background orchestration from BPM process template instances
  • 3-Tier Scheduling (system, users, supervisors)
  • Skip/Jump at Process Steps
  • Insert an ad hoc intervention at a Case
  • Break Glass (for emergency takeover of an in-progress intervention)
  • Re-assign/take back a process step that is not being worked on
  • Advanced Security (who can do what, when)

Whether you decide to build or buy, don’t drop any of the above items from your shopping list without careful consideration of the consequences.

Feel free to call Civerex at 800 529 5355 if you have questions.



Closing the gap between strategy and operations – is as easy as A, B, C, D, E

Corporations need two pillars (ECM and DCM) plus three meta-methods to build, sustain and augment competitive advantage.

Absent any one of these, the corporation is likely to fail at closing the gap between strategy and operations, and between operations and strategy.

Let’s highlight these “must haves” in reverse order (E-D-C-B-A)

ECM (Enterprise Content Management) [strategic pillar]

The inventory of corporate resources/capabilities and management thereof (land, plant, equipment, tools, methods, staff, suppliers, customers etc.), along with tentative and funded initiatives that consume/use resources and capabilities over specified timespans.

DCM (Document Content Management) [operational level pillar]

The inventory of “documents” within an organization comprising text, .pdf, .doc, spreadsheet, image, video, audio files – Includes templates (data elements, layouts, rules) plus instances of templates that have received data element values, typically directly at Cases or via remote system and application data imports.

CPM (Critical Path Method)

The method of choice for planning, monitoring and controlling once-through initiatives.

BPM (Business Process Management)

The corporation’s inventory of “best practices” protocols (workflows, comprising steps, with attached data presentation/data collection forms and performance roles).

ACM (Adaptive Case Management)

A run-time environment that accommodates management of initiatives (hosting BPM templates that provide orchestration, accommodating ad hoc interventions, goals/objectives, plus a means of non-subjective assessment of progress toward goals/objectives).


Strategy Development – Same old, or entirely new?

Recently, I was asked whether Edith Penrose’s “RBV” (Resource Based View) is today obsolete.

Fair question, given that RBV goes back to 1959.

This led to a question as to whether Porter’s Competitive Advantage method has or has not evolved to “Sustainable competitive advantage” / ” Temporary Competitive Advantage”.

Here is a link to one leading article on the topic: “Sustainable competitive advantage or temporary competitive advantage: Improving understanding of an important strategy construct” (T. O’Shannassy, 2008).

A key statement in the article is “. . .in many industries for many firms competitive advantage is only a temporary outcome due to the influence of environmental uncertainty.”

Seems to me uncertainty (not just environmental uncertainty) has always been present, just as any initiative is, and always has been, characterized by risk.

I would add to this that all competitive advantage is, and always has been, temporary (Yogi Berra probably would have said that “You have it, until you no longer have it”), so my question is why have people spent time and money picking at Porter’s method?

If we go along with the distinction, a “temporary” CA could be the result of landing a big contract. A more likely scenario is that building CA takes a lot of time and money; once you have CA, you need to pause to consolidate/sustain it, and then try to augment it.

One thing we can say, for sure, is that anyone who subscribes to RBV and follows the method

knows that if strategy changes, in-progress initiatives need to be reviewed and, in some instances, terminated (e.g. a competitor leapfrogs your innovative product, no point continuing with that implementation).

RBV requires that all work performed be supportive of strategy; accordingly, the timeline for a strategy must exceed the implementation time for its supporting initiatives.

Since most important initiatives get authorized by way of ROI submissions, the timeline for an initiative is the time to breakeven, hopefully, a bit longer.

Bottom line: long-running initiatives work for long strategies, short-running initiatives are needed for short duration strategies, and all competitive advantage is temporary.

Part of identifying/prioritizing initiatives to build/sustain/augment CA is to have a mix of short-lifecycle strategies and long-lifecycle strategies.

Anything new in all of the above?


What is “new” is the ability to view all corporate resources, view all initiatives, view all prioritized initiatives, view the status of all initiative implementation and view the status of all Cases at a graphic free-form-search knowledge base. The icing on the cake is the ability to change resource allocations and priorities of tentative initiatives and make real-time assessments in respect of in-progress Cases.

Nice alternative to arriving at a strategy planning meeting pushing a cartload of studies, reports, supporting documents and spreadsheets.



How the EU’s GDPR (General Data Protection Regulation) impacts your business


GDPR (General Data Protection Regulation, EU 2016/679) took effect one week ago (May 25, 2018).

The Regulation and two related Directives EU 2016/680 and EU 2016/681 deal with the processing of personal data relating to natural persons who are citizens of one of the 28-member States or persons residing in one or more of the 28-member States.

The first thing to note is that if you are a corporation whose headquarters is outside of the EU and you host any personal data relating to EU citizens/residents, you are also subject to EU Regulation 2016/679.

Article 4 of EU 679 defines “processing” to include “collection, recording, organisation, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, restriction, erasure or destruction”.

The only way to understand the far-reaching impact of the legislation is to read the legal texts – there is one overarching Regulation and two Directives (680 & 681), but each member state (28 of them) has the right to publish its own versions of EU 2016/679, 680 and 681.

If you do the math, you get 3 legal texts x 28 states x 150 explanatory texts/articles, for a possible total of 12,600 notes/articles.

If you have operations in more than one EU state, your policy/procedure re personal data relating to natural persons may require adjustments.

Know and Understand the legislation

A graphic free-form-search knowledgebase where you can simultaneously view all texts/articles for a search phrase facilitates the task.

You can view the texts/articles this way (i.e. without a knowledge base),

or this way (i.e. with a knowledge base) . . .


The problem for organizations running transaction-based software applications is that current systems are not likely to meet GDPR's minimal requirements. There will be disclosures, and fines will be imposed.

If you do a risk assessment, or bring in a consultant to carry out a risk assessment, the opinion is likely to be that in the event of a disclosure you could receive a fine of up to 4% of your total worldwide revenue (see Article 83).

A search at my EU 679/680/681 knowledgebase for “fine” highlights all “hits” down to the individual text/clause, allowing you quick/easy access to, in this case, Article 83.


GDPR sets the bar at a new, higher level in the area of data protection for natural persons.

Be aware that “rule sets”, typically pervasive in transaction processing software, have great difficulty protecting personal “documents” (e.g. images, audio recordings, video recordings, text documents, memo field recordings, etc.).

For this reason, only share “documents” via a data exchanger that handles data sharing at transaction application process instance steps, using PCPs (process control points) that require manual intervention/clearance rather than across-the-board user permissioning.

Here are three (3) high-level recommendations regarding the design and configuration of transaction processing software systems that handle data relating to natural persons.

1.      Restrict sharing of data relating to natural persons.

Only share data relating to natural persons on a need-to-know basis – the more data you share, the greater the risk of deliberate or inadvertent data breaches.

2.      Maintain formal data repositories.

Your core transaction processing software systems need to feature consolidating Case Histories that post, at any process step you commit, system-generated date/timestamps and user “signatures”, along with the data as it was, on the form versions that were in service at the time (who, what, where, when). Simple “logs” that show accesses, without precise tracking of what was accessed, when, by whom and from where, are not good enough.

Postings to data histories must be automated, as opposed to selective, and must include visits to all forms that host personal data, even if no data is recorded on these forms.

When a user leaves a process step form that has been visited and edited, the software suite must immediately lock down the new History posting.

Following this, no changes may be made to the new posting (or to any previous postings), except that authorized users can look up a form in the History, clone the form with its data, carry out edits on the form copy and then save the form template and data as a new session in the History.
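The append-only History discipline described above can be sketched in a few lines. This is a minimal illustration of the concept, not any vendor's implementation; the class and field names (CaseHistory, HistoryPosting, cloned_from, etc.) are hypothetical:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass(frozen=True)          # frozen: a posting locks down on save
class HistoryPosting:
    """One immutable Case History entry: who, what, where, when."""
    user_signature: str          # who committed the step
    form_version: str            # form version in service at the time
    form_data: dict              # data as it was at commit time
    workstation: str             # where the commit originated
    timestamp: datetime          # system-generated, not user-supplied
    cloned_from: Optional[int] = None  # index of the posting this was cloned from

class CaseHistory:
    """Append-only history: postings are never edited or deleted."""
    def __init__(self):
        self._postings: list[HistoryPosting] = []

    def commit(self, user, form_version, form_data, workstation,
               cloned_from=None) -> int:
        posting = HistoryPosting(
            user_signature=user,
            form_version=form_version,
            form_data=dict(form_data),             # defensive copy
            timestamp=datetime.now(timezone.utc),  # system-generated timestamp
            workstation=workstation,
            cloned_from=cloned_from,
        )
        self._postings.append(posting)
        return len(self._postings) - 1             # index of the new, locked posting

    def clone_and_edit(self, index, user, workstation, edits: dict) -> int:
        """Authorized edit path: clone an old posting, edit the copy,
        save the result as a NEW session; the original is untouched."""
        original = self._postings[index]
        new_data = {**original.form_data, **edits}
        return self.commit(user, original.form_version, new_data,
                           workstation, cloned_from=index)
```

The design point is that there is no update-in-place path at all: the only way to "change" history is to append a new, attributed session.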

3.      Strictly control access to stored personal data.

All access to personal data for purposes of viewing or data sharing should be via “official” interfaces only (i.e. direct log in, portal access, data exchanger).

a)      Data being directly entered into the system by logged-in ordinary users should be controlled by user name and password, with permissions down to the form level (i.e. a newly added user should have no access to any record in the system). Access to records that contain data relating to natural persons should be granted by “role”. Some records (e.g. ‘VIP’ records) should be blocked from access/viewing by all but specific designated users. VIP records should be excluded from searches, from directory listings and from reports that list data across records.
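A minimal sketch of the gate described in (a): no default access, role-granted form permissions, and VIP records that are invisible to all but designated users, including in searches. The role names, form names and the "vip_viewers" field are illustrative assumptions, not from any real product:

```python
# Hypothetical role -> permitted-forms map; a user with no roles sees nothing.
ROLE_FORMS = {
    "intake_clerk": {"contact_form"},
    "case_officer": {"contact_form", "medical_form"},
}

def can_view(user_roles, form, record, user):
    # VIP records: blocked for everyone except specifically designated users.
    if record.get("vip") and user not in record.get("vip_viewers", ()):
        return False
    # Otherwise, access only via a role that grants this form.
    return any(form in ROLE_FORMS.get(r, ()) for r in user_roles)

def search(records, predicate, user_roles, user):
    """Searches, directory listings and cross-record reports silently
    exclude records the user is not permitted to view."""
    return [r for r in records
            if predicate(r) and can_view(user_roles, r["form"], r, user)]
```

Note that the VIP check runs before the role check, so even a broadly-permissioned role cannot surface a VIP record in a listing.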

b)     Casual and external users should only be able to access data relating to natural persons via a portal login using a user name and password, preferably with two-factor authentication.

One possible configuration for portals is to use a portal-facing IIS server that pushes out a user-specific “menu of services” itemizing authorized service requests. The content of the “menu of services” should be “role-based” (i.e. the only people who get to see the data for a database record are those actively involved in processing that data). Clicking on a portal user menu line-item invokes a database engine at the portal-facing server. The database engine alone logs into the back-end server, retrieves the requested forms/data and posts the form(s) and data to the user in-tray at the portal.
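The role-based “menu of services” can be sketched as a simple mapping assembled per user; the role and service names below are invented purely for illustration:

```python
# Hypothetical role -> authorized-service-requests map for the portal server.
SERVICES_BY_ROLE = {
    "claimant": ["View my claim status", "Upload supporting document"],
    "adjuster": ["View assigned claims", "Post assessment"],
}

def menu_of_services(roles):
    """Build the user-specific menu: only line-items tied to the user's
    roles are ever pushed out to the portal; everything else is absent,
    not merely greyed out."""
    menu = []
    for role in roles:
        for item in SERVICES_BY_ROLE.get(role, []):
            if item not in menu:        # de-duplicate across roles
                menu.append(item)
    return menu
```

Because the menu is assembled server-side from roles, a casual user cannot request a service the back end never offered them.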

c)      Batch data streams from/to local or remote 3rd-party systems and applications should only post to a standalone data exchanger that accommodates a mapping of need-to-know data elements per subscriber (typically a local or remote system). Parsers and formatters are the responsibility of subscribers and publishers, and data streams must be encrypted for transport. The data exchanger should accommodate rule sets that operate on incoming data (point-of-origin checks, range checks, boilerplate pick-list lookup verification) and tag problematic incoming data for manual clearance.
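A minimal sketch of the incoming-data rule sets described in (c): a point-of-origin check, a range check and a pick-list check, where failures tag the record for manual clearance rather than silently dropping it. The publisher names, field names and ranges are illustrative assumptions:

```python
# Hypothetical rule-set data for the exchanger's incoming-stream checks.
KNOWN_PUBLISHERS = {"claims_app", "billing_app"}   # point-of-origin check
COUNTRY_PICKLIST = {"DE", "FR", "IE", "NL"}        # boilerplate pick-list lookup

def screen_incoming(record: dict) -> dict:
    """Apply the exchanger's rule sets to one incoming record and tag it."""
    problems = []
    if record.get("origin") not in KNOWN_PUBLISHERS:
        problems.append("unknown point of origin")
    age = record.get("age")
    if age is not None and not (0 <= age <= 120):   # range check
        problems.append("age out of range")
    if record.get("country") not in COUNTRY_PICKLIST:
        problems.append("country not in pick list")
    # Problematic records are tagged, not dropped: a human clears them.
    return {**record,
            "needs_manual_clearance": bool(problems),
            "problems": problems}
```

Keeping the rule sets as data (sets, ranges, pick lists) rather than hard-coded logic makes it easier to adjust them per publisher without touching the exchanger itself.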

Bottom line: if your current transaction processing systems/apps do not restrict sharing on a strict need-to-know basis, if your data repositories do not build/maintain formal Histories, or if they allow access to natural persons’ data via other than official interfaces, you need to replace, re-engineer or adapt your systems/apps.

Many corporations view such undertakings as requiring a huge amount of capital/time/disruption.

They fail to realize that task execution/tracking (the major source of natural person data) can be carried out under a Case platform that can link to existing systems/apps via a generic data exchanger, giving the corporation improved control over workflow/workload, with the option of later adapting/re-engineering/replacing legacy systems/apps.

Here is a list of caveats in respect of sharing data relating to natural persons.

  • Avoid e-mail systems (too easy to inadvertently reference the wrong addressee; proliferation of storage of personal data).
  • Avoid cloud document-sharing facilities (better to have everything relating to a dbms record “at” the dbms record).
  • Avoid use of APIs or RPA apps that build transaction logs complete with personal data (i.e. determine where these apps are, who has access to them and what the retention strategy of each app is) – your Case Histories are your permanent record of each transaction.


Posted in Data Interoperability, Database Technology, Enterprise Content Management, Software Design, Uncategorized | Leave a comment

How Low Can You Go? – Part IV – Essentials for building, sustaining & improving corporate competitive advantage.


This article attempts to bridge the gap between operational methods such as ACM/BPM/CPM/RALB and FOMM and the methods used to evolve business strategy. The mission of all strategic and operational initiatives is to build, sustain and augment competitive advantage.

See “Theories of the Firm – Expanding RBV using 3D free-form search Kbases” at

RBV (Resource-Based View) allows organizations to assign priorities to promising initiatives that are competing for scarce resources. Proper implementation, in theory, leads to increased competitive advantage.

The mechanism for bridging the gap between an initiative and its practical implementation (typically within a Case) is to have operations prepare an ROI submission citing benefits and objectives. The latter can be parked at a Case in a Figure of Merit Matrix (FOMM). Periodic assessment of progress toward meeting the objectives of the Case, with regard to the ROI, really is all you need to bridge the gap.

Things often don’t go this way because of the bizarre habit some corporations have of authorizing initiatives and then not bothering to carry out periodic monitoring/auditing of progress.

In extreme cases, the strategy is changed yet the initiative continues.  The practical equivalent of this in government is to have an agency in charge of taking care of monuments that no longer exist.

“Must haves” for bridging the gap between operations and strategy/between strategy and operations include:

  • RBV
  • Case w/FOMM

Note, from the “Theories of the Firm . .” article, that for large corporations RBV does not work very well if you don’t have access to 3D free-form-search knowledgebases.

For me, all of this boils down to “. . . you cannot manage what you cannot see”.

Photo by: Anneli Salo

Posted in Case Management, Decision Making, Financial Planning, FOMM, R.A.L.B., Risk Analysis, Strategic Planning, Uncategorized | Leave a comment