2016 Recap – Basic Requirements for Success with BPM


Once again, with minor updates and consolidations, here is my list of basic requirements for success with BPM.


1. Some of the work involves the performance of tasks in logical sequence.

2. The work will be performed more than once (otherwise use Critical Path Method).

3. The benefits vary such that, for a large initiative, it is advisable to prepare an ROI or SROI.

4. The more complex the sequencing, and the more specialized the tasks (requiring specific skill sets), the more beneficial it becomes to go beyond paper mapping to an in-line implementation of the process (as opposed to an off-line or on-line implementation).

5. The run-time environment hosting instances of templates (i.e. compiled flowgraphs) needs to accommodate re-visiting already-committed tasks, recording data at tasks that are not yet current along their instances, and insertion of ad hoc interventions at the environment (see the sketch following this list).

6. The usual essential services needed to support the processing of instances include:

a) R.A.L.B. (three-tier scheduling: auto-Resource Allocation, Leveling and Balancing);
b) a non-subjective approach, such as F.O.M.M. (Figure of Merit Matrices), to assessing progress toward attainment of Case goals/objectives;
c) a formal History (committed tasks, with date/time-stamped user "signatures", with recall of data as it was at the time it was entered, on the form versions that were in service at the time);
d) data logging for possible machine analysis, allowing process owners to improve their processes;
e) data import/export to increase the reach of the run-time environment.

7. Reasonable accommodation to deviate from the sequencing of steps, but with governance from rule sets along instance pathways and at the environment (typically Case) to "rein in" extreme, unwanted deviations from "best practices protocols" – i.e. guidance from BPM, governance from the environment. [The highway analogy is helpful: center lines provide guidance; guardrails on both sides provide governance.]

8. The environment selected must have a simple User Interface; otherwise the initiative will fail.

9. Adequate training must be provided.

10. If, for whatever reason (lack of time, inability to "think" process), in-house staff cannot do the work, bring in a facilitator for a short period of time. On-site visits (1-2 days) may be necessary with some clients, but the balance of the work should reasonably be achievable as a series of one-hour GoToMeeting (or equivalent) sessions per week.

11. Advanced capabilities include: bi-directional data exchange with local and remote 3rd-party systems and applications; predictive analytics for improved decision-making; and consolidation of run-time data to a free-form-search corporate Knowledge Base that hosts corporate assets, strategies and KPIs.
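To make points 5 and 7 concrete, here is a minimal sketch (Python, with invented class and method names – an illustration of the required behavior, not any vendor's implementation) of a run-time instance that supports re-visiting committed tasks, recording data at not-yet-current tasks, and ad hoc insertions:

```python
from datetime import datetime, timezone

class Task:
    """One step along a template instance (all names hypothetical)."""
    def __init__(self, name, skill=None):
        self.name = name
        self.skill = skill            # skill class needed to perform the step
        self.data = {}                # recorded form data
        self.committed_by = None
        self.committed_at = None

    def record(self, form_data):
        # Point 5: data may be recorded at a task that is not yet current.
        self.data.update(form_data)

    def commit(self, user):
        self.committed_by = user
        self.committed_at = datetime.now(timezone.utc)

class Instance:
    """A running copy of a compiled flowgraph template."""
    def __init__(self, tasks):
        self.tasks = list(tasks)

    def insert_ad_hoc(self, position, task):
        # Points 5 and 7: an ad hoc intervention can be inserted
        # at any point along the instance pathway.
        self.tasks.insert(position, task)

    def revisit(self, name, user, form_data):
        # Point 5: an already-committed task can be re-opened,
        # updated and re-committed.
        task = next(t for t in self.tasks if t.name == name)
        task.record(form_data)
        task.commit(user)
```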

Posted in Adaptive Case Management, Automated Resource Allocation, Business Process Improvement, Business Process Management, Case Management, Compliance Control, Data Interoperability, FOMM, Operations Management, Process Management, Process Mapping, Productivity Improvement

Managing Source Code using Kbases


After years of looking for better ways and means of managing source code, we turned to one of our own software suites, a graphic free-form-search Knowledgebase, to manage the O-O code that is used across the ten commercial software products we develop, maintain and support.

We found that the “must-have” features for managing source were:

a) auto-version control;

b) node aliasing, because source code units are typically used across several, sometimes all, products;

c) free-form-search facilities, so you can pick any code fragment and immediately see where it is used, pick any developer to find the units they have worked on, etc.
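Here is a minimal sketch of how (a), (b) and (c) might hang together. All names are invented for illustration; this is not the actual implementation:

```python
class Node:
    """A source code unit, stored once and aliased into many products."""
    def __init__(self, name):
        self.name = name
        self.versions = []    # (a) auto-version control: append-only history
        self.aliases = set()  # (b) products that reference this unit

    def save(self, source, author):
        self.versions.append({"rev": len(self.versions) + 1,
                              "source": source, "author": author})

class Kbase:
    def __init__(self):
        self.nodes = {}

    def node(self, name):
        return self.nodes.setdefault(name, Node(name))

    def search(self, fragment):
        # (c) free-form search: pick any code fragment and immediately
        # see which units (and hence which products) use it.
        return [n.name for n in self.nodes.values()
                if any(fragment in v["source"] for v in n.versions)]

    def by_author(self, author):
        # ...or pick any developer and list the units they have worked on.
        return [n.name for n in self.nodes.values()
                if any(v["author"] == author for v in n.versions)]
```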

In the screenshot below we have a code set relating to a custom app that we built for a customer in 1998, with 250+ source code units and 16 database tables.

[Screenshot: the 1998 custom app code set, viewed as a graphic Kbase]

Clicking on any node reveals the source; drilling down allows you to browse the versions of the source; engaging a search for any code fragment or table construct (latest version or all versions) causes all primary node "hits" to light up.

Posted in Database Technology, Software Source Control

Steering the ship – the new business management reality


Increased global competition has more corporations chasing after the same opportunities.

The impact on strategic planning has been a reduction from 5-year planning cycles with annual reviews to 18-month cycles with quarterly reviews.

The playing field has changed in other ways as well.

Corporations used to strive for “satisfied” customers.  Today, they need “delighted” customers.

In certain cases, the primary focus needs to be on future customers, not on current customers.

Behind the scenes, the corporate mission remains unchanged (i.e. building, protecting and enhancing competitive advantage), but with shorter-term initiatives characterized by increased risk and uncertainty.

The traditional role of steering the ship, i.e. "plan->monitor->control", needs upgrading to "plan->monitor->re-assess", where "monitor" now includes advanced decision support and predictive analytics.

Whereas initiatives traditionally were allowed to run their course, "plan->monitor->re-assess" means more initiatives are at risk of becoming the focus of budget cuts or outright termination.

SWOT (Strengths, Weaknesses, Opportunities and Threats) continues to be front-and-center, except that the number of corporate assets likely to be impacted by an initiative has increased.

Corporations are finding it more difficult to build competitive advantage based on one or two key assets.

Success comes from innovation in the way clusters of assets are put into service.

Here is a partial list of corporate assets that need to be under constant review:

(Capital, Access to Capital, Land, Equipment, Tools, Premises, Staff, Intellectual Property/Knowhow, Current Products/Services, Products / Services Under Development, Projects Awaiting Approval, Technology Trends, Changing Legislation, Competitors)

Decisions, Decisions, Decisions

It’s not surprising that the process of decision-making has changed.

Whereas, in the past, decisions were often made on the basis of information with a heavy dose of experience and intuition, today’s decision makers look for ways and means of rapidly converting knowledge into information.

Free-Form-Search Kbases are the environment of choice for rapid conversion of knowledge to information, for the following reasons:

  1. Ability to see the big picture at a graphic User Interface,
  2. Built-in connect-the-dots facilities,
  3. Availability of e-map/e-build environments for rapid roll out of operational processes,
  4. Real-time data collection and uploading / consolidation of operational data to Kbases.

The changes I have described here have not been without casualties:

  • Traditional BPM, with its focus on end-to-end processes, is now a core capability under Case that provides background orchestration and governance in respect of the performance of work,
  • Senior management no longer just stares at executive dashboards featuring KPIs, leaving operations to do what they like. Senior management is now able to piano-play their environment and not only challenge trends but also challenge KPIs themselves,
  • CRM has been absorbed into Case,
  • ECM is now embedded in Case.

Survivors include flow-graphing (mid 1950s) and F.O.M.M. (1960s).

Both are alive and well and pretty much “must-haves” for anyone working in business today.

So, what’s next?

  • IoT interconnectivity.
  • Interoperability by and between local and remote 3rd party systems and applications.
  • Predictive analytics.
  • Fewer decision makers as AI kicks in, but the ones who survive will appear to be "smarter".
Posted in Competitive Advantage, Decision Making, FOMM, Risk Analysis, Strategic Planning

What you don’t know will hurt you


Donald Rumsfeld did us a big favor by describing three categories of knowledge.

"There are known knowns. These are things we know that we know. There are known unknowns. That is to say, there are things that we know we don't know. But there are also unknown unknowns. There are things we don't know we don't know." (Donald Rumsfeld, U.S. Department of Defense (DoD) news briefing, February 12, 2002)

He initially left out one category, "unknown knowns" (i.e. knowledge that organizations have but cannot find or access when needed).

If you are looking to improve competitive advantage, you do not have much control over unknown unknowns, but you need a good handle on the remaining categories.

Known knowns and known unknowns are the result of hard work.

The problem of unknown knowns, on the other hand, does not require much more than putting in place a free-form-search Knowledge Base.

FFS Kbases basically ensure that if you have it, you will be able to find it.

As usual, there are no free lunches. FFS Kbases require daily contributions. Otherwise, they quickly become useless. You also have to use them.

The starting position is to acquire FFS Kbase software – make sure you do your homework.
Posted in Decision Making, Knowledge Bases

Where are the ‘Easy’ buttons in BPM?


If you are looking for success with BPM, there are, by my count, 19 hurdles. Different consultants use different approaches; their approaches are effective within some customer cultures and not so effective in others; and the tools used by the consultant and the customer vary.

At the end of the day what counts is the customer journey.


To contain the scope of the discussion, let’s isolate the following as not being part of “BPM” . . .

  • formulating corporate strategies
  • ranking candidate initiatives
  • selecting initiatives in the context of scarce resources, risk and uncertainty
  • authorizing implementation of initiatives

The question becomes: which of the following areas of BPM expertise are "easy" at the customer level, and which are not . . .

Go ahead and rank these, and feel free to critique any you feel should not be part of BPM and to add others you feel should be part of BPM.

Straight away I can identify a 20th, which is "the ability to carry out CEM within your BPM run-time environment so that you are on the lookout for, and responsive to, customer touch points".

E=Easy, M=Moderate, D=Difficult

  1. mapping out processes (concept level);
  2. transitioning concept maps to production-level detail;
  3. improving processes prior to rollout;
  4. selecting an appropriate run-time environment (i.e. Case, unless you can suggest something better);
  5. rollout of improved processes to the run-time environment (compiling graphic maps to run-time templates);
  6. setting up Case-level governance;
  7. setting Case objectives;
  8. streaming Case records onto instances of run-time templates;
  9. threading together process fragments;
  10. managing workflow at Cases (skill performance roles);
  11. managing workload at Cases (users prioritizing tasks);
  12. insertion of ad hoc steps (processes of one step, if you like);
  13. interoperability (people, machines, software, at various places);
  14. managing workload across Cases (by supervisors);
  15. assessing progress toward meeting objectives at Cases;
  16. consolidating Case data to KPIs;
  17. challenging KPI trends, KPIs, initiatives, strategies;
  18.  real-time decision support at Cases;
  19. data mining for the purpose of auto-improvement of processes.

o o o

Posted in Business Process Improvement, Business Process Management, Case Management, Customer Experience Management

Mini-firestorm at BPM.COM


What Is the Best Way to Build an Executable Process Model?

From a comment E. Scott Menter made on this discussion, where he wrote: "Flowcharts (including IMHO BPMN) are simply not a great way to build an executable process model." What do you think?

As of this morning, we are at response #19, with no clear consensus.

#19 Karl Walter Keirstead

[Reading over the material at this discussion, I think we need a few more rounds before the traditional stampede over to a new question takes place.

Scott riled up a bunch of us by stating “Flowcharts (including IMHO BPMN) are simply not a great way to build an executable process model.”

At the 1st post, Emiel threw a spanner in the works, asking what "executing" means.

Here we are at response #19.

My experience is we can execute a model or we can execute a “best practice” template.

The former is a best practice under construction (albeit at a higher level of summary), whereas the latter is, graphically (for those who relate to flowgraphs), the "best practice".

We cannot "execute" graphs, but we can compile/interpret/scan/transform them by carving them up into discrete run-time steps that have various attributes, such as the performance skill level needed, the data collection forms needed to record data, and some mechanism for attesting to the completion or commitment of individual steps.
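As a rough illustration, carving a graph into run-time steps might reduce to something like the following sketch (the graph structure and field names are invented, for illustration only):

```python
def compile_flowgraph(graph):
    """Carve a drawn flowgraph into discrete run-time steps.

    `graph` is assumed to look like:
    {node_id: {"label": ..., "skill": ..., "form": ..., "next": [node_ids]}}
    -- an invented structure, not any product's actual format.
    """
    template = {}
    for node_id, node in graph.items():
        template[node_id] = {
            "label": node["label"],
            "skill": node.get("skill"),          # performance skill level needed
            "form": node.get("form"),            # data collection form for the step
            "successors": node.get("next", []),  # directional arrows become routing
            "commit": "user_signature",          # mechanism for attesting completion
        }
    return template
```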

Scott says the "really fun part" is "press the compile button". I agree – a lot of behind-the-scenes things take place when you do this.

The objective is to be in a position, within some run-time environment, where software, machines and/or people can post steps to user InTrays for their attention/action.

Clearly, as one step is completed, we need the environment to provide orchestration by posting the next-in-line steps to the appropriate classes of users.
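Continuing the sketch above, orchestration on commit might look like this (hypothetical structures again):

```python
def on_commit(template, intrays, step_id):
    """On commit of one step, post the next-in-line steps to the InTrays
    of the appropriate classes of users (routing by skill class, not by
    named individual; RALB-style leveling/balancing would refine this)."""
    for succ_id in template[step_id]["successors"]:
        step = template[succ_id]
        intrays.setdefault(step["skill"], []).append(step["label"])
```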

These users really need orchestration – in healthcare, a nurse can easily need to provide services to 20-50 patients per day; each of these patients will typically be at a different step along a private care pathway (read: best practice template), and many patients will be on different pathways at different steps.

The nurses indeed are not robots; it's totally unrealistic to think that all possible interventions for any patient could be "guided" by a best practice template.

What this means is the nurse needs to be able to insert an ad hoc step at any point.

They also need the ability to skip steps in the best practice, re-visit an already-completed step, and record data at a not-yet-current step (i.e. when the data has become available) – there is no point writing the data down on a piece of paper, waiting for the step to become current and, only then, recording the data in the patient's Electronic Health Record.

Lastly, nurses need to be able to micro-schedule their work, while supervisors need to be able to set priorities and to level and balance workload across, in this example, nurses.

Can one do all of this on a cellphone? In principle, yes.

Can the run-time system work without orchestration? No – this would mean the organization has no way to manage its best practices.

Is a flowgraph yes/no “a great way” to build an executable process model? You tell me.

How long does it take to build a flowgraph that captures the steps/directional arrows etc.?

How long does it take to build a best practice some other way? What does the result look like?

If you don't like building flowgraphs, use another way, even if it takes five times longer, providing a) you have the time and b) the customer is prepared to pay for the end result.

The main lookup I made in participating in this discussion was the advice given by Dr. Russell Ackoff (The Art of Problem Solving, 1978).

The advice was along these lines: "Decisions involve choices, and if you can't see the likely outcome of a choice, then you cannot make decisions".

My comment . . . no flowgraphs, no way to see likely outcomes, no way to make decisions.

Except that maybe there IS (+1, Scott) an alternative to flowgraphs, but we need to see/hear what that is.

Linear task lists clearly are the way to go at run-time. Amit uses them, my group uses them.

A live/batch chat capability is a no-brainer (improves the customer experience or journey).

Once you admit that chats are "a good thing", you are admitting to ad hoc steps, and your feet are firmly planted in "Adaptive Case Management" (a run-time environment with governance, with background BPM, with embedded CEM, with RALB or auto resource allocation, leveling and balancing, plus lots of other capabilities) that lets you "manage work".

I have found a few folks who hold the view that “process management” means building, testing, updating paper process maps. OK, that works for a silo whose output is a paper process map.

For others, the end is a new beginning and the next thing to do is roll out the best practice in a run-time format so as to manage, not the process, but the Case that is hosting various (typically) process fragments and ad hoc insertions.

And the purpose of "managing" a Case is to support or contribute to corporate strategy/initiatives.

And the purpose of formulating strategies/initiatives is to build, maintain and enhance competitive advantage.

All of this in support of Peter Drucker’s various statements along the lines of “the purpose of a business is to stay in business”.

A small problem is that for some, the purpose has changed to “let’s buy this company, fire management, put some lipstick on the pig, flip the company and make a lot of quick money”.

]

Posted in Adaptive Case Management, Business Process Improvement, Business Process Management, Case Management, Competitive Advantage, Customer Centricity, Customer Experience Management, Process Management, Process Mapping, R.A.L.B.

CEM and BPM, In The Sandbox


No one disputes the potential of CEM (Customer Experience Management) for attracting and retaining customers and for reducing administrative costs in the area of goods and services delivery.

Since BPM has a similar focus (i.e. delighting customers), the question arises as to whether an organization needs both a CEMS and a BPMS (i.e. can CEM and BPM play together in the sandbox?).


The answer? It depends.

Whereas BPM implementations are fairly straightforward, a range of implementation strategies exists for CEM.

Aggressive CEM implementations mine your social data.  These implementations post ads for goods/services you have been researching to a giant screen as you walk through a shopping mall.

Passive CEM implementations wait for customer inreach and try to delight customers on-the-fly.

A third strategy is to put in place outreach facilities.

Here’s one way you can practice outreach CEM within BPM.

When mapping processes, anticipate and include customer touch points as process steps. At run time, as each of these becomes current along a Case timeline, you will be able to seamlessly reach out to your customer in content- and situation-appropriate ways.

Secondly, accommodate ad hoc reach-out in your BPMS run-time environment. This allows you to contact the customer at any point along a Case timeline.

E-mail is not the communication method of choice for a CEM implementation: it is not secure, it is not easy to extract content from messages, and it is not easy to route incoming messages to Cases.

A better strategy is to get customers to log into a Customer Portal that you set up.

For security reasons, no Portal user should be able to establish a cursor position at a back end DBMS (database management system). A processing engine that sits between the back end DBMS and the Portal solves the problem nicely.

All Civerex BPMS products rely on back-end application system Calendar Events to trigger posts to Customer Portal InTrays (plan-side customer touch points, run-time ad hoc customer touch points). Posting a Calendar Event at the back-end application system results in a pending information/action line at the Portal. Calendar Events can be posted at the back-end application manually or automatically.
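Here is a sketch of the trigger mechanism as described (field names and data shapes are illustrative, not the actual Civerex schema):

```python
from datetime import datetime, timezone

def post_calendar_event(portal_intrays, case_id, customer, text,
                        due=None, source="auto"):
    """A Calendar Event at the back-end application becomes a pending
    information/action line at the Customer Portal.

    The processing engine owns this function; the Portal user never
    holds a cursor position at the back-end DBMS itself.
    """
    portal_intrays.setdefault(customer, []).append({
        "case": case_id,
        "text": text,
        "due": due or datetime.now(timezone.utc),
        "posted_by": source,   # "manual" or "auto"
        "status": "pending",
    })
```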

Can you have 360-degree CEM as part of your BPMS?

Yes! You may have the capability to integrate CEM right now, or you may be only a few steps away from integrating CEM into your BPMS.

Cost savings in the order of 20-30% from implementing BPM/CEM are not uncommon, owing to reductions in the number of phone calls and mailings to customers.

For more information on CEM and BPM, call Civerex Systems at 1+450 458 5601


Posted in Business Process Management, Case Management, Customer Centricity, Customer Experience Management

Big Data and Competitive Advantage


The link between big data and corporate competitive advantage

Success in business is all about building, sustaining and augmenting competitive advantage.

Given comparable infrastructure (Capital, Access to Capital, Land, Equipment, Tools, Premises, Staff, Intellectual Property/Knowhow, Current Products/Services, Products/Services Under Development, Projects Awaiting Approval, Technology Trends, Changing Legislation, Competitors), what is it that distinguishes one corporation from another in terms of ability to augment competitive advantage?

If you subscribe to the notion that managing a business today is more complex, with more options, shorter ROI timelines, and increased risk and uncertainty, one differentiating factor is the set of methodologies in use for strategic and operational planning, monitoring and control.

Let’s start with the problem of making best use of scarce corporate resources.

Most organizations have no shortage of exciting initiatives that could be undertaken at any point in time but lack the resources to implement more than a few of these.

It follows that strategists need ways and means of ranking prospective initiatives in order of decreasing attractiveness.

For this, they need to be able to inventory candidate initiatives, with an indication of the resources they would need going forward. Clearly, we might as well also inventory existing initiatives with the resources they are using in order to be able to determine on an ongoing basis which resources are available for new initiatives.

Strategists don't like to tie up any one resource completely, as that might prevent new initiatives from being undertaken, so each resource needs a minimum reserve level. Similarly, they don't want any one resource to be tied up for too long a period of time.

A practical approach is to dynamically cross-link resources to initiatives (current and prospective). Resources sit in a pool and are assigned to initiatives and returned to the pool when no longer needed.

Strategists reasonably want to see all corporate assets/resources/initiatives on one computer screen and have the ability to drag/drop resources to new initiatives, as well as repatriate resources to their respective resource pools as and when initiatives no longer need them.
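One way to model the pool-with-reserve behavior just described is sketched below (names and thresholds are invented for illustration):

```python
class ResourcePool:
    """A shared resource with a minimum reserve level (names invented)."""
    def __init__(self, name, capacity, min_reserve):
        self.name = name
        self.available = capacity
        self.min_reserve = min_reserve   # never tie up the resource completely
        self.allocations = {}            # initiative -> amount allocated

    def allocate(self, initiative, amount):
        if self.available - amount < self.min_reserve:
            raise ValueError(f"{self.name}: allocation would breach the reserve")
        self.available -= amount
        self.allocations[initiative] = self.allocations.get(initiative, 0) + amount

    def repatriate(self, initiative):
        # Return the resource to the pool when the initiative no longer needs it.
        self.available += self.allocations.pop(initiative, 0)
```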

The final step is to rank new initiatives according to their attractiveness (i.e. read “according to their ability to sustain or augment competitive advantage”).

This puts senior management in a position to select the more promising initiatives and  declare these as ”ready for implementation”.

A graphic free-form search knowledge base is the environment of choice here as it can provide visual oversight for tens of thousands of dynamic data points, with hierarchical linking.

Strategy Implementation

Responsibility for implementation of new initiatives goes to operational managers, who compete for resources via ROI requests and annual budget requests. The only initiatives that should get approved are those that contribute directly or indirectly to strategic objectives.

Operations managers similarly need infrastructure for setting up Projects or Cases, engaging best practice protocols for the performance of work, and assessing progress toward meeting Case goals.

Here, the methodologies of choice are BPM (Business Process Management), R.A.L.B. (auto-Resource Allocation, Leveling and Balancing) and FOMM (Figure of Merit Matrices), within a Case Management run-time environment.
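For readers unfamiliar with FOMM, here is a minimal sketch of the idea (objectives, weights and scores are invented for illustration):

```python
def figure_of_merit(matrix):
    """matrix: list of (objective, weight, score_0_to_10) rows.

    Returns a 0-100 figure of merit for one Case -- a non-subjective
    measure of progress toward Case goals/objectives.
    """
    total_weight = sum(weight for _, weight, _ in matrix)
    return sum(weight * score for _, weight, score in matrix) / total_weight * 10

progress = figure_of_merit([
    ("symptom reduction", 5, 7),     # rows, weights and scores are invented;
    ("medication adherence", 3, 9),  # real matrices are built by the
    ("family engagement", 2, 4),     # process owner for each Case type
])
# progress == 70.0
```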

Enter Big Data

Consistent with the trend toward making decisions assisted by real-time predictive analytics, organizations are seeing dramatic increases in the quantity of data being collected as part of workflow management.

Given that one cannot analyze data that one does not collect, corporations do not, today, unduly agonize over what data to collect / not collect.

Collecting data carries with it no obligation to analyze the data and, within reason, the incremental cost of collecting more data rather than less data is not significant.

Two examples of practical use of big data are as follows:

  1. Operations Level (predictive analytics)

Overlaying cross-Case data at decision branching points along best practice template instances can guide users in the selection of sub-pathways to engage along instances (e.g. similar Cases went this way, 60% of the time; see the sketch following this list).

  2. Strategy Level (connect-the-dots gaming exercises)

Consolidation of operational data to corporate dashboards/KPIs at a graphic free-form search knowledge base gives managers the option of challenging trended data by engaging connect-the-dots searches across the entire space (e.g. we are projecting a 10% increase in sales, which is 120% of target, except that, on analysis, the competition is increasing at a higher percentage, so maybe 10% is not "good").
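The example-1 guidance ("similar Cases went this way, 60% of the time") can be computed as simple cross-Case branch frequencies. A sketch, assuming a hypothetical decision log:

```python
from collections import Counter

def branch_frequencies(decision_log, branch_point):
    """decision_log: records like {"branch_point": ..., "choice": ...}
    accumulated across committed Cases (an invented log format)."""
    choices = Counter(rec["choice"] for rec in decision_log
                      if rec["branch_point"] == branch_point)
    total = sum(choices.values())
    return {choice: count / total for choice, count in choices.items()}

# e.g. {"pathway_A": 0.6, "pathway_B": 0.4} -> "similar Cases went
# this way, 60% of the time"
```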

Posted in Case Management, FOMM, Operational Planning, Risk Analysis, Strategic Planning

Is it time to rename “Business Process Management” to “Business Performance Management”?


One of the commenters at the BPM.com discussion "Are Processes Key to Scalability?" asked the question:

“What do we call ‘process’ so the high growth CEO will see it as important?”


Here is my response:

Agree with the need for a rename.

We know everything is a process. Processes convert inputs to outputs.

We know that a “process” can be a linked set of steps or a single step, and at a practical level any mix of these (the “processing” remains the same – inputs get converted to outputs).

We know that managing a business is all about evolving strategies, defining goals/objectives, with periodic assessment of progress toward meeting these goals/objectives.

The problem is that few processes in b2b are end-to-end, so we end up having to accommodate random mixes of linked sets of steps and ad hoc steps. We need a place to manage these steps – call it Case, if no one can come up with a better suggestion.

Case is nothing more than a cursor position in a post-relational database management structure. The structure is not restricted to pre-defined data storage tables/fields. A Case record can accommodate objects that require apps to view the content (images, spreadsheets, .doc/.pdf/.rtf, even audio/video recordings).

Some of these objects are stored in specific database fields, some in Binary Large Object fields, some are too large and need to be “stored” in external files with links to these objects. Bottom line, Case can accommodate anything and if you need sub-Cases (multiple orders for the same customer, multiple claims on an insurance policy, multiple episodes for a patient), no worries.
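Schematically, and with all field names invented for illustration, a Case record along these lines might look as follows:

```python
case_record = {
    "case_id": "2016-0042",
    "fields": {"customer": "ACME", "status": "open"},  # typed DBMS fields
    "blobs": {"intake_form.pdf": b"..."},              # Binary Large Object fields
    "external": {"site_visit.mp4":                     # too large to store in-line;
                 "file://archive/2016/0042/visit.mp4"},#   kept as links to files
    "sub_cases": ["2016-0042-01", "2016-0042-02"],     # multiple orders, claims
}                                                      #   or patient episodes
```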

What is Case Management? Is this really not part of "Business Performance Management" – relying on what we could probably call "best practices" (i.e. process fragments consisting of linked steps, plus ad hoc interventions by resources who use experience, judgment, intuition and decision support)?

Surely CEOs would relate to Business Performance Management (BPM) as an alternative to Business Process Management? (i.e. they set strategy; resources are allocated to Cases via ROI submissions and via annual operating budgets – this ensures, to an extent, that the only work undertaken is work that is supportive of strategy).

Phase II is to monitor progress at the operational level toward meeting Case goals/objectives. Where are these goals/objectives? Surely not plan-side as the end steps in flowgraphs (i.e. we have moved from end-to-end processes to process fragments).

The answer is we find goals/objectives at Cases (run-time side) and Case Management breaks down to performing interventions at Cases that advance Case goals/objectives.

Now, few knowledge workers deal with only one Case – the reason is progress at Cases is often held up for various reasons (handoffs, wait times), so most workers will be working on 10, 20, possibly 50 Cases at a time (yes, for healthcare; yes, for insurance claims; yes, for job-shop manufacturing), so, the role of supervisors is not to manage individual Cases but rather that of allocating, leveling and balancing resources across multiple Cases.

Enter KPIs at the strategy level – this is where we narrow the gap between operations and strategy.

Strategy -> Initiatives -> Cases -> KPIs -> Strategy

Here, all we need is the ability to consolidate Case data from multiple Cases to an environment hosting corporate KPIs, and CEOs have what they need to 'steer the ship' – i.e. Business Performance Management.
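The consolidation itself is mechanically simple. A sketch (both data shapes are invented):

```python
def consolidate(cases, kpis):
    """Roll Case-level data up to the corporate KPI environment.

    cases: iterable of Case data dicts; kpis: {kpi_name: extractor(case)}.
    Both shapes are invented -- the point is the direction of flow:
    Strategy -> Initiatives -> Cases -> KPIs -> Strategy.
    """
    return {name: sum(extract(case) for case in cases)
            for name, extract in kpis.items()}

totals = consolidate(
    cases=[{"country": "ABC", "sale": 120.0}, {"country": "XYZ", "sale": 80.0}],
    kpis={"sales_ABC": lambda c: c["sale"] if c["country"] == "ABC" else 0.0},
)
# totals == {"sales_ABC": 120.0}
```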

Almost, but not quite.

Missing is the ability to challenge the statistics and, to do this, CEOs need corporate knowledgebases where they can, on their own or with a little help from staff, test KPI trends against information in the corporate Kbase (i.e. our sales in country ABC are up 10%, sales in ABC for our three main competitors are up 20%, so reporting that "10%" is "good" needs to be investigated).

Conclusion . . .
Structured sequences of steps plus ad hoc steps can collectively be called “best practices”.

Case provides the environment for performing work.

The way goals/objectives are set up at Cases and the way Cases are managed helps ensure that work is at all times supportive of strategy.

Business Performance Management is what the organization does to build, maintain and enhance competitive advantage.

Posted in Adaptive Case Management, Business Process Management, Case Management

Theories of the Firm – Expanding RBV using 3D free-form search Kbases


RBV (Resource-Based View) is a strategy development theory whose roots go back to the 1980s/1990s, with antecedents going back to 1959: Penrose, E. T. (1959). The Theory of the Growth of the Firm. New York: John Wiley.

RBV makes the point that to be the best you can be, you have to know what your resources (and capabilities) are before trying to make most effective use of these.

The RBV message basically stops there, leaving firms with the task of figuring out how, exactly, to view their resources, both initially and on an ongoing basis.

The thing about resources is they are often scarce and must be shared, so the questions that arise are which resources are being used, to what extent, for which initiatives.

The problem is things can change rapidly. Accordingly, firms need a capability to dynamically allocate/re-allocate resources as they are tracking progress toward meeting strategic objectives and as they are pondering launching new initiatives.

It’s not a stretch to say that large and perhaps not-so-large firms need an inventory of the following “resource/capability” classes – capital, access to capital, land, plant, equipment, tools, intellectual property, knowhow, staff, suppliers, customers, competitors, plus changing technology and changing legislation.

Best use of a firm's resources requires making decisions relating to low/high-investment, low/high-risk, and quick/slow-return new initiatives, and backing out of initiatives that have taken a wrong turn.

It becomes obvious that, when allocating/re-allocating resources, we need to know which allocations give the biggest bang for the buck. Firms reasonably do not want to tie up a resource on a low-potential initiative when the resource could be used on a high-potential initiative.

Decisions of this type require knowledge which flows from information enhanced by wisdom, experience and intuition.

Donald Rumsfeld identified three categories of knowledge – a fourth (unknown knowns) was later added by others.

“Reports that say that something hasn’t happened are always interesting to me, because as we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns – the ones we don’t know we don’t know. And if one looks throughout the history of our country and other free countries, it is the latter category that tend to be the difficult ones”.

When making decisions regarding initiatives and the allocation/deployment of resources/capabilities, we need to take into account known knowns, known unknowns and unknown knowns – the latter posing a real challenge in the absence of ways and means of finding information the firm has but is unable to find at the time a major decision is needed.

Enter 3D free-form search Kbases that allow strategists/planners to consolidate thousands or tens of thousands of documents within one space.

[Screenshot: a 3D free-form search Kbase consolidating documents within one space]

The range of documents goes from text to DOC, PDF, XLS, structured data, unstructured data, images, and audio/video files.

Clearly any environment that hosts many document types needs to be able to call apps that are capable of displaying the data.

The search mechanisms have to be able, in the background, to find and highlight any "hits"; otherwise users of the Kbase have to tag documents with keywords, which we all know is tedious and unreliable (i.e. user mindsets tend to be different at the time they encode keywords relative to when they later launch searches).

See my “US State Dept Country Profiles” 3D demo Kbase (all countries, business, travel, law enforcement, narcotics, terrorism).

http://wp.me/pzzpB-FX

Posted in Decision Making, Enterprise Content Management, Knowledge Bases