Where, Oh Where, have my Documents Gone?

Decision-making is the art of converting information into action. It’s an art because it requires a blending of experience, knowledge, wisdom, intuition, and information. All of these except information are acquired or intrinsic capabilities.

The problem with information is that it is difficult to bring together all of the content needed to make decisions. The result is that many decisions are made in the absence of key information.

If you think about it, the information needed to make most decisions is likely to be spread across multiple “documents” (e.g. MS Word .doc files, .pdfs, spreadsheets, PowerPoint presentations, web pages etc.). These documents are likely to be all over the place. Each document type requires special software for viewing/editing.

Pulling all of these together and keeping them together as documents go through multiple revisions is tedious and time-consuming, with no guarantee that all of the information will be where you need it to be at the time you need to make a decision.

Another consideration is that some of the information needed to make decisions relating to one initiative will be common to other initiatives. Corporate Policy & Procedure is a good example. This means that if you want to consolidate information by initiative, you need to duplicate certain documents.

Free-form-search knowledgebases such as Civerex’s CiverManage™ are a solution to the problem of consolidating information and prioritizing initiatives that compete for the same scarce resources.

Here’s how you can set up a Kbase that consolidates all of your documents and organizes them for decision-making.

#1 Start by finding a Kbase environment that allows bulk import of the content of your Windows Directories and files located across various servers/PCs. You want to be able to import entire Directories, not individual files.

You will want the documents imported into your Kbase’s underlying database management system. Don’t settle for a system that uses links back to source Directories (if someone moves the files, the links will break; if someone opens a document from a Directory, edits it and does a save-as, your Kbase will be pointing to the wrong version of the document).

#2 Now, organize your documents the way you wish as “things-within-things-within-things”, same as you would organize physical files, within file folders, within drawers, within physical filing cabinets.

#3 Once your documents are all in one place, create as many alias documents as you like and park these under as many separate “filing cabinets” as you wish. The way alias documents work is that any edit made at any alias document operates on the source document for that alias.
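The alias mechanism described above can be sketched in a few lines of Python (hypothetical names, not the CiverManage API): each document is stored once, alias nodes simply share the source document’s id, and an edit made via any alias therefore lands on the one source document.

```python
# Sketch of alias-document behavior (illustrative only, not a real Kbase API).

class Kbase:
    def __init__(self):
        self.documents = {}   # doc_id -> content (each document stored ONCE)
        self.nodes = {}       # node_name -> doc_id (many nodes may share one id)

    def add_document(self, node_name, doc_id, content):
        self.documents[doc_id] = content
        self.nodes[node_name] = doc_id

    def add_alias(self, alias_name, source_node):
        # The alias points at the same underlying document id as its source.
        self.nodes[alias_name] = self.nodes[source_node]

    def edit(self, node_name, new_content):
        # Editing via ANY node (source or alias) updates the one source document.
        self.documents[self.nodes[node_name]] = new_content

    def read(self, node_name):
        return self.documents[self.nodes[node_name]]

kb = Kbase()
kb.add_document("HR/Policy", "doc-1", "v1 of corporate policy")
kb.add_alias("ProjectA/Policy", "HR/Policy")   # parked under a second "filing cabinet"
kb.edit("ProjectA/Policy", "v2 of corporate policy")
assert kb.read("HR/Policy") == "v2 of corporate policy"   # source updated via alias
```

The design choice here is the extra level of indirection (node to document id): nothing is duplicated, so aliases can never drift out of sync with their source.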

This leaves editing to be discussed along with free-form searches.

When you click on a stored document for editing, the Kbase environment needs to automatically save to the database before any editing starts so that you end up with a full audit trail of your document content. Since your documents are IN your Kbase, file names are no longer needed (the node name, keywords and mirrored content are all you need to find a document).
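The save-before-edit behavior amounts to snapshotting the current content before every edit session begins. A minimal sketch, assuming a simple in-memory store (names are hypothetical):

```python
# Sketch of save-before-edit auditing: the current content is snapshotted to a
# version history BEFORE any edit starts, so every prior state stays retrievable.

import datetime

class VersionedDocument:
    def __init__(self, content):
        self.content = content
        self.history = []  # list of (timestamp, content) snapshots

    def begin_edit(self):
        # Snapshot the current content before editing starts.
        self.history.append((datetime.datetime.now(), self.content))

    def save(self, new_content):
        self.content = new_content

doc = VersionedDocument("draft 1")
doc.begin_edit()
doc.save("draft 2")
doc.begin_edit()
doc.save("draft 3")
# The full audit trail of earlier states survives:
assert [c for _, c in doc.history] == ["draft 1", "draft 2"]
```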

Free-Form Searches

The big advantage of Kbases becomes apparent when you engage a search.

Whereas traditional SQL searches tell you what a query was able to find (providing you ask the software to look in the right place), a free-form Kbase search tells you visually what the software was able to find and what the software was NOT able to find. This greatly increases the amount of information returned per search.

A search at a hospital Kbase can give a patient information regarding which facilities have EMRs but the same search can give a vendor information regarding which facilities do not have EMRs.

If you try to look for a phone number in an address field using traditional SQL you will get back “not found” – the same search in a free-form Kbase will find the phone number even if it is embedded in a memo field.
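The contrast can be sketched as follows (the record layout is hypothetical): a field-specific lookup only finds the number where it was told to look, while a free-form scan inspects every field of every record, memo fields included.

```python
# Sketch: field-specific lookup vs. free-form search across all fields.

records = [
    {"name": "A. Smith", "phone": "", "memo": "call me at 450-458-5601 re invoice"},
    {"name": "B. Jones", "phone": "514-555-0000", "memo": "paid in full"},
]

def sql_style_lookup(rows, number):
    # Only looks where it was told to look: the phone field.
    return [r for r in rows if r["phone"] == number]

def freeform_search(rows, number):
    # Scans every field of every record.
    return [r for r in rows
            if any(number in str(v) for v in r.values())]

assert sql_style_lookup(records, "450-458-5601") == []               # "not found"
assert freeform_search(records, "450-458-5601")[0]["name"] == "A. Smith"
```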

It’s not all clear sailing – for fast searches, it pays to mirror document content to plain memo fields, but for, say, PowerPoint presentations, images, etc., you need to assign keywords at document nodes so that you can find content. Videos are typically too large to store IN Kbases, so these are exceptions to the “no-links” rule.

With the exception of videos, you can, following import, move original documents to backup and, hopefully, never have to refer to these again.

A Practical Scenario

Consider a filmmaker trying to put together a documentary – there will typically be a screenplay, a project schedule, a project budget, promotional material, financing plans/requests/responses, equipment inventories, multiple video clips, actor/model release forms, copyright permission royalty agreements, etc. If several projects are being worked on at the same time, some of the content may need to be used across projects.

File Folders (paper or electronic) make it difficult to find content.

Large corporations can easily have hundreds of active initiatives plus a number of initiatives under consideration, each with its own set of documents. Most initiatives under consideration compete for the same scarce resources. This gives rise to a need to prioritize initiatives, which is next to impossible to do looking at one initiative at a time.

Knowledgebases help with information consolidation (products, products under development, asset inventories, staff, internal policy/procedure, external rules/regulations, markets, competitors) and, since there are no physical boundaries to Kbases, it is possible, using “Russian doll” data-hiding techniques, to have any number of projects\folders\nodes at one screen with free-form search facilities across the entire space.

The Kbases our group works with usually have in excess of 1,000 documents, sometimes 10,000 documents, all visible and accessible from one screen. Try that on your PC desktop and you will appreciate the value of a Kbase.

If you are not using Kbases, you or your assistants probably spend 1-2 hours a day hunting for information. Cost this out over a year and you have the potential to save a lot of money. The quality of decisions you make will improve as a result of having more complete information available if you migrate your documents to a Kbase.

If you are in the market for a Kbase, pick a LinkedIn discussion topic that has received 200-300 comments and ask the vendor to put together a demo Kbase comprising the comments, organized respondent by respondent, in reverse chronological order. Try to identify a sub-theme that two respondents have found to be particularly interesting and engage a search. Your search results will automatically thread together the dialog between the two respondents.

Here below is a screenshot of a Satellite Inventory, comprising some 6,000 documents relating to three Entities (Satellites, Launch Sites, and Launch Vehicles).

  • A search for a Launch Site will give you site demographics plus all satellites launched from that site (all countries, all years) plus all launch vehicles used.
  • A search for a Satellite will give you demographics for that satellite and simultaneously highlight Launch Sites and Launch Vehicles used. You will be able to see clearly whether a country used a particular Launch Site for a period of time and then switched to a different site. If that country uses one site for commercial launches and a different site for military satellite launches, you will see it.
  • A search for a Launch Vehicle will give you all satellites/sites that have used this launch vehicle.


Need to find needles in haystacks? Start using a Kbase today!

If you have any questions re your use of Kbases to improve decision-making, call me at 1 450 458 5601.


The setup for investigative searches such as encountered in major crimes case management is different. Here, you will want a mix of “things-within-things-within-things” plus networked documents (i.e. multiple hierarchical trees) as you “connect-the-dots”.

Posted in Database Technology, Decision Making, Enterprise Content Management, Major Crimes Case Management

Case – why you need it (Part III of III)

Here is Part III of III on Case Management (ACM, BPM, RALB, ECM, CRM, and FOMM), highlighting data storage issues at Cases plus the need for interoperability and non-subjective ways and means of assessing progress toward meeting Case objectives.

Watch the 11-minute video at

Posted in Adaptive Case Management, Business Process Management, Case Management, Data Interoperability, Database Technology, R.A.L.B.

Case – why you need it (Part II of III)

Here is part II on Case, ACM, BPM, RALB, ECM, CRM, and FOMM.

Watch the video at

Posted in Adaptive Case Management, Automated Resource Allocation, Business Process Improvement, Business Process Management, Case Management, Customer Centricity, Data Interoperability, Enterprise Content Management, R.A.L.B.

How much money should you spend on a promo video?

We live in an era where “also ran” no longer works. The other thing is that viewers of web videos have very short attention spans.

Any small business owner has to decide up front what level of sophistication they want for, to take one example, a promo video.

Four options I can think of are PowerPoint, talking heads, animation sequences and interview style.

All can produce absolutely dreadful results when done wrong. The last of these, IMO, is the most effective when done right.

Here is how we pitch video to our clients as a way toward improving competitive advantage.

The business owner has to understand that for interview style there has to be a script (that must not be referred to during the recording), a good camera (preferably two), good lighting and good sound. The video should not run more than 3-5 minutes.

The script has to immediately answer viewer questions that include:

1) should I continue to view this video?

2) is the message addressing a need that I have?

3) does it cause me to want to contact the owner/company to contract for products/services or at least get more information?

Budgets will dictate whether the owner can afford to hire a video production company to hopefully generate good value for money or whether they should try to do the video themselves using consumer level technology.

With the latter, there is no telling whether the resulting video will be any better than no video at all.

You may have noticed that the media in some countries are shifting away from the use of pro staff/equipment for news event coverage. The difference in quality is dramatic, but the argument against my recommendations is “do contemporary audiences actually care?”

Posted in Video Production

Case – why you need it (Part I of III)

Here is an 11 minute video on Case, ACM, BPM, RALB, ECM, CRM, and FOMM.

No ads, no waiting until the last few frames to find out what the video is all about.

No clue what RALB or FOMM stand for?

Listen to the video to find out.

Posted in Adaptive Case Management, Business Process Management, Case Management, Customer Centricity, Data Interoperability, Enterprise Content Management, Interoperability, R.A.L.B.

Check your Business Rules on the way in and out

All successful business activity takes place under the control of Business Rules.

There are two types of Business Rules: those that issue warnings and those that cause hard stops.

In highly-automated systems such as automobiles, an audible or visual warning is typically issued when a rule is violated (e.g. engine overheating). If the warning is ignored, a state of affairs may be reached where the system causes a shutdown.

A well-designed BPM process behaves in much the same way – the process provides behind-the-scenes orchestration and the BPMS environment provides in-line, real-time governance.

Some prefer to use the terms guidelines and guardrails (i.e. center lines on a highway / physical barriers along the sides).

The big question becomes “where do we park business rules?”

If we consider a sequence of linked steps, we know that data flows along template instances.  We can, at any step, test data that is being carried forward as well as data being collected at the step.

We can, for example, prevent alpha characters from being input at numeric fields, reject end dates that are earlier than start dates, and carry out range checks on data values (e.g. percent complete must be between 0 and 100).
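Checks of this kind can be sketched as a rule set parked at a data-collection form (the field names below are hypothetical); hard stops block the commit, while soft warnings let processing continue:

```python
# Sketch of business rules parked at a form: each rule appends a problem to
# either the warnings list (soft) or the hard_stops list (blocks the commit).

from datetime import date

def check_form(form):
    warnings, hard_stops = [], []
    # Numeric field must actually be numeric.
    if not str(form.get("quantity", "")).isdigit():
        hard_stops.append("quantity must be numeric")
    # End date may not precede start date.
    if form["end_date"] < form["start_date"]:
        hard_stops.append("end date earlier than start date")
    # Range check: percent complete must be between 0 and 100.
    pct = form.get("percent_complete", 0)
    if not 0 <= pct <= 100:
        hard_stops.append("percent complete out of range 0..100")
    # A missing address is tolerable at this stage: warn only.
    if not form.get("street_address"):
        warnings.append("street address not yet on file")
    return warnings, hard_stops

w, h = check_form({"quantity": "12", "start_date": date(2014, 1, 1),
                   "end_date": date(2013, 12, 1), "percent_complete": 150})
assert "end date earlier than start date" in h
assert "percent complete out of range 0..100" in h
```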

The obvious place to park rules is at data collection forms at template instance steps.

If you are at a step, recording data, there is a presumption that you are an appropriate/authorized resource to be at the step and that the step is being performed at the right time.

BPM process maps where steps have plan-side routings take care of some of this but do not take into consideration whether an instance is on the right sub-path (e.g. we engaged processing of an adult healthcare patient only to discover that the patient is actually an adolescent).

Accordingly, whereas process map logic may tell us that a step has become “current”, it may be prudent to test via pre-condition rules that it is OK to access the step and initiate processing.

The rule set placement in this case needs to be upstream from the step and this is easily accommodated by a precursor auto-exec step with a routing of “system”.

Users don’t see the step but if the results of processing at the step indicate a problem, a prompt will be issued or a hard stop will occur. If the outcome of the auto-step is a hard stop, processing never gets to the next-in-line step unless/until the problem is fixed or a supervisor inputs an override.

In respect of soft stops, an auto-exec step can be more lenient (e.g. a step asks for a street address for an individual or an organization). Here, whereas the workflow designers’ preference was to pick up a street address at a particular stage of the processing, an address is actually not needed until such time as a form is submitted, a letter is sent, or a worker goes on the road to that address. Downstream rules will apply a hard stop as and when appropriate.

We can see from the above that auto-exec process control template instance steps are no different from ordinary input/output conversion steps, so no special constructs are needed – these steps auto-commit instead of requiring manual commits, and their routing is “system” instead of some user skill category.

How do we provide governance at ad hoc steps?

Whereas we can declare an ad hoc step to be a process of one step, if the step form consists of a single memo field, there is not much that we can do. We don’t, in the normal course of events, need to be concerned about what is recorded at memo fields but, you never know, so there is no harm in auto-parsing the data to see if there are any keywords/contexts that should be flagged.

Suppose, however, someone wants to ship a prototype.

Global rules say “no shipments unless a product has been through QA” so the generic solution to this and other unwanted behavior is to isolate users from directly streaming records on templates by presenting to the user a menu of services.

The menu can include all workflow templates plus a replication of all steps within each of these workflow templates as ad hoc steps.

“Build-test-ship” expands to “build”, “ship”, “test” (alphabetic listing), and in respect of “ship” we would reasonably have, behind the scenes, a two-step template, the first step of which is an auto-exec step that asks “has this product been tested?” – if the answer is no, the “ship” ad hoc intervention will fail.
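The two-step “ship” template can be sketched as follows (all names are illustrative): the precursor auto-exec “system” step tests the pre-condition, and the user-visible ship step is never reached on a fail.

```python
# Sketch of a "ship" intervention guarded by a precursor auto-exec step.
# The user never sees the system step; if the product has not been through
# QA, the "ship" step is never reached.

def qa_gate(case):
    # Auto-exec "system" step: pre-condition on the next-in-line user step.
    return case.get("qa_passed", False)

def ship(case):
    case["shipped"] = True
    return "shipped"

def run_ship_template(case):
    if not qa_gate(case):
        return "hard stop: no shipments unless product has been through QA"
    return ship(case)

prototype = {"product": "widget-X", "qa_passed": False}
assert run_ship_template(prototype).startswith("hard stop")
prototype["qa_passed"] = True   # e.g. after QA sign-off or a supervisor override
assert run_ship_template(prototype) == "shipped"
```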


We can implement governance at a Case for work that is made up of any mix of structured /unstructured activity without having to jump through hoops.

Simply park your rules at steps or set up links at steps to regional rule sets (this avoids any need for 1:1 coupling between a rule set and a process step).

As a final refinement, once at a process step, if the processing calls for reaching out to an algorithm, extend your rules implementation model by preceding the algorithm with pre-conditions and, just to make sure, inserting an auto-exec post-condition step immediately “downstream” from the algorithm.

Do the same as you leave process steps via next-in-line auto-exec “system” steps that include loopbacks in the event of a fail.

Here is an Eiffel code fragment that illustrates “require”, “do” and “ensure”:

class ACCOUNT create
    make

feature
    … Attributes as before:
    balance, minimum_balance, owner, open …

    deposit (sum: INTEGER) is
            -- Deposit sum into the account.
        require
            sum >= 0
        do
            add (sum)
        ensure
            balance = old balance + sum
        end

    withdraw (sum: INTEGER) is
            -- Withdraw sum from the account.
        require
            sum >= 0
            sum <= balance - minimum_balance
        do
            add (-sum)
        ensure
            balance = old balance - sum
        end

    may_withdraw … -- As before

feature {NONE}
    add … -- As before

    make (initial: INTEGER) is
            -- Initialize account with balance initial.
        require
            initial >= minimum_balance
        do
            balance := initial
        ensure
            balance >= minimum_balance
        end

end -- class ACCOUNT

Apologies to Civerex customers who have never seen nor had to worry about computer code. Rest assured, we will not be putting out upgrades to our software suites that require coding, same as we have not, in the past, required customers to worry about building and maintaining database tables\fields.

Most of the heavy lifting for the approach used in Civerex workflow management software suites comes from research done by Dr. Bertrand Meyer, inventor of the Eiffel language. We are indebted to Dr. Meyer and his team for their important contribution in the area of workflow management.

Civerex was the Canadian distributor of IES for a number of years.


Posted in Case Management, Software Design

Employee Engagement

The best methodologies and working tools will get you nowhere if your employees are not working efficiently and effectively.

Here is a mini-plan to increase employee engagement within your organization.

#1. Hurdle one is to make sure none of the employees hate their jobs. If you get to step #7 with an employee and they are not on board, let both sides move on. It’s critical to have everyone in the boat, rowing in the same direction.

#2. Explain the big picture (why we are all here).

#3. Next, give the employees proper tools to do the work you are asking them to do and train them in the use of these tools.

#4. Show them what’s in it for them to use these tools effectively (e.g. less difficulty performing the work with the tools than not working with the tools, reduced stress).

#5. Give them some responsibility to make decisions.

#6. Mentor employees along a career path tailored to their interests, desires and capabilities.

#7. Get each employee to sign off on their engagement plan.

#8. Do a review every 6 months.

#9. Do daily/weekly walkabouts.

Posted in Organizational Development

Working with KPIs V – Measuring Performance

This is the final post in the series “Working with KPIs”, where operations meets strategy.

In the screenshot below, we have workflows consolidating data in real time to the “3D Printing Corporation” Kbase.

Rule sets at process template steps upload data to the Kbase, where expressions of the type x = x + 1 increment counters and record details of events (e.g. QA defect details).

If a KPI is, for example, a moving average, then algorithms at the Kbase can carry out the required additional calculations.
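The counter and moving-average mechanics can be sketched as follows (the names are hypothetical, not Civerex’s rule syntax): process steps push events, x = x + 1 expressions increment totals, and a moving-average KPI is recomputed over a rolling window of the most recent samples.

```python
# Sketch of Kbase-side KPI upkeep: counters incremented per event, plus a
# moving average computed over a fixed-size rolling window of samples.

from collections import deque

class KpiStore:
    def __init__(self, window=3):
        self.counters = {}
        self.samples = deque(maxlen=window)  # rolling window for the average

    def increment(self, name):
        # The x = x + 1 style counter update.
        self.counters[name] = self.counters.get(name, 0) + 1

    def record_sample(self, value):
        self.samples.append(value)  # oldest sample drops off automatically

    def moving_average(self):
        return sum(self.samples) / len(self.samples)

kpi = KpiStore(window=3)
for _ in range(4):
    kpi.increment("qa_defects")
for v in (10, 20, 30, 40):          # only the last 3 samples count
    kpi.record_sample(v)
assert kpi.counters["qa_defects"] == 4
assert kpi.moving_average() == 30.0
```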

Direct data import from local and remote systems and applications can take place at a Kbase without the need to initially import data to your run time environment and, from there, upload the data to the Kbase.


As expected, the Kbase will auto-version all incoming data by “session” so all free-form searches need to be constrained by a time window (e.g. T2 current, T1 last 3 months, T0 last 6 months).


As with a workflow management run-time environment that auto-exports transaction data to

a) a history,
b) a Kbase,
c) a data exchanger,

a Kbase can be configured to export session data for upload to a graphic environment to facilitate trending.

Bottom line: you need 360-degree methodologies to narrow the gap between operations and strategy.

Traditional row-and-column data reporting has gone the way of buggy whips in today’s fast-paced business climate.

“What you don’t see may end up being more important than what you are currently looking at.”

The combination of traditional relational database technology plus advanced graphic capabilities allows you to look at the past, manage the present and get a better idea of what is likely to take place in the future.

To review the entire sequence of operations -> strategy blog posts see the following:

Case Management Tips and Tricks

I    Work Scheduling http://wp.me/pzzpB-wa

II  Decision Support/Work Performance http://wp.me/pzzpB-wj

III Data Exchange http://wp.me/pzzpB-wr

IV Consolidation of Case Data to Executive Dashboards . . . http://wp.me/pzzpB-wG


Working with KPIs

I    Taking Stock http://wp.me/pzzpB-xG

II  Formulating Strategy http://wp.me/pzzpB-xN

III Setting Goals/Objectives http://wp.me/pzzpB-xU

IV Defining KPIs http://wp.me/pzzpB-y9

V   Measuring Performance (this post)


Posted in Enterprise Content Management, Strategic Planning

Working with KPIs IV – Defining KPIs

Notice in this blog series, as we have transitioned from “Working with KPIs I”, to II, to III and, now, to “Working with KPIs – IV Defining KPIs”, how the emphasis has shifted from putting in place methodologies to managing knowledge, information and data.

What is remarkable is that this has been done with no required change to the venue where the knowledge, information and data are being consolidated (i.e. the Kbase).

In “Working with KPIs III – Setting Goals/Objectives”, we ended up with two major strategies.

#1. Scale-up of Plant A to meet increasing demand for 3D Printing Corporation’s USA-based manufacturing industry customers plus healthcare customers and prospective customers/partners.

#2. Locating, sizing and building Plant B (Singapore or UK) to service new healthcare customers outside of the USA.

The logical KPI choices for Strategy #1 would be to track the growth of manufacturing and healthcare customers plus the rate of conversion of prospects to customers/partners within the healthcare industry. (i.e. three KPIs).

As for Strategy #2, the lead KPI during the early phase of this project is likely to be tracking response to ad placement and trade show participation to determine overseas demand for 3D Printing healthcare products, followed by a KPI extracted from a 3rd party CPM application system (time, cost, plant capability) used to plan, monitor and control the construction of Plant B (i.e. two more KPIs).

Other KPIs might include a) monitoring changes in the competition b) monitoring changes in legislation and c) monitoring changes in the technology of 3D printing.

A key point is that when these KPIs are parked at the Kbase, choices relating to KPIs and the relative importance of KPIs can be assessed in the context of 3D Printing Corporation’s Asset Inventory, evolving Strategies and other content present at the Kbase.


There is an expectation here of being able to auto-update KPIs based on ongoing operational BPM/ACM Case activity as opposed to having to extract data from low-level transaction systems, carry out calculations on the data and then manually post results to the Kbase.

Behind the scenes, there will be several entity systems hosting data collection along BPM best practice workflows (i.e. manufacturing customers, healthcare customers, prospective customers in manufacturing and healthcare, prospective healthcare industry partners, new product designs).

This auto-updating of KPIs and trending of KPIs will be the focus of “Working with KPIs V – Measuring Performance”.

This blog post is #4 in a set of five, with titles as shown here below:

Working with KPIs I    – Taking Stock
Working with KPIs II   – Formulating strategy
Working with KPIs III – Setting Goals/Objectives
Working with KPIs IV – Defining KPIs
Working with KPIs V – Measuring Performance

Posted in Strategic Planning

Working with KPIs III – Setting Goals/Objectives

Continuing the example of strategy development for “3D Printing Corporation”, following

a) preparation of an inventory of corporate assets (“Working with KPIs I”),

b) cataloging of various strategies (“Working with KPIs II”),

the next step is to set Objectives and then, for complex initiatives, establish one or more Goals per Objective.

If your initiative has only one Objective and is not complex, the Objective and the Goal can be the same, eliminating the need for a separate Goal.

Whereas rough timelines for individual strategy implementations are usually set during strategy development, Goal/Objective setting involves itemizing “deliverables” that must dovetail together to reach a status of “complete” for each strategic initiative.

Goals are points in time along the way to Objectives. They must represent verifiable stages toward preparation of a deliverable (e.g. “prototype complete, certified OK by QA”). Goals always have calendar dates.

Top management needs to consult with operations managers at this time to work out the logistics for preparation of deliverables and assess each strategy with respect to risk, costing and timelines.

It’s best to start with a formal description of each strategy, then prepare a timeline, followed by an ROI. If the timeline does not fit the initial time expectation set during articulation of a strategy, it may be necessary to re-work the strategy.

If the ROI does not reach a breakeven point within a reasonable timeframe, it may be necessary to re-work the strategy. As for risk, the strategy should receive a low, medium or high risk ranking.

Now comes a gatekeeper phase which involves confirmation that the organization has sufficient resources to meet timelines. This is where strategies compete with each other. Here is a screenshot of our Kbase for 3D Printing Corporation.


Notice that the Kbase consolidates data from multiple entities (assets, industry specific strategies, plants, customers and Country Profiles).

The Kbase shows the sharing of Plant A and proposed Plant B across the customer base/proposed customer base in the USA, UK and Singapore. Projected volume across the three target countries impacts the extent of use of Plant A as well as the location and size of Plant B.

It’s important to point out that whereas, in the 3D Printing Kbase, the principal use of the US Dept of State Country Profiles is for placement of Plant B, it is not essential, in a free-form search environment, to park the Country Profiles under, say, Strategy – Healthcare\Singapore or Strategy – Healthcare\UK.

Another point is that for nodes labeled “Plant A” and “(Plant B)”, there really is only one instance of each, the others being alias nodes (i.e. update any one occurrence of one of these nodes and the others all update contemporaneously).

Once a firm decision is made re location, if that location is Singapore, the node under Strategy – Healthcare\UK would routinely be deleted.

Not visible in the screenshot is auto-versioning of forms/documents that may be attached to individual Kbase nodes – as edits are made, the system automatically builds a history such that users can “browse” it (e.g. what was the preferred location for Plant B in December 2013?).

One of the great advantages of the use of a Kbase for planning is to be able to collapse entire sub-structures, allowing planners to reduce clutter.


In respect of searches, planners can carry out isolated searches on sub-sets of the Kbase (e.g. only search the US Dept of State Country Profiles).

Going forward to “Working with KPIs IV – Defining KPIs”, once a strategy has evolved to, say, where construction of Plant B is underway, we would reasonably want to set up a KPI that assesses progress toward construction of the plant. A second KPI would be set up to assess progress toward signing up new customers in the UK and in Singapore.

This blog post is #3 in a set of five, with titles as shown here below:

Working with KPIs I    – Taking Stock
Working with KPIs II   – Formulating strategy
Working with KPIs III – Setting Goals/Objectives
Working with KPIs IV – Defining KPIs
Working with KPIs V – Measuring Performance




Posted in Uncategorized