Successful Engineering Design Strategies

You may have read about Toyota’s announced focus on hydrogen fuel-cell powered automobiles.

This is a win-win-lose strategy (a win for Toyota, a win for those of us who like living on this planet, but probably a loss for manufacturers of home generators).

It seems that if you had one of the proposed Toyota cars you could, provided you didn’t need to drive anywhere, power the essential services at your home for a couple of days.

Clearly this hypothesis needs to be tested, but if the promise is kept, small home generators are going to be a lot harder to sell than they are today.
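As a rough sanity check on the promise, here is a back-of-envelope calculation. Every figure below is an assumption on my part (tank capacity, fuel-cell efficiency, household load), not a Toyota specification:

```python
# Back-of-envelope: how long could a fuel-cell car power a home's essentials?
# All figures are assumptions, not manufacturer specifications.

H2_ONBOARD_KG = 5.0           # assumed onboard hydrogen capacity
KWH_PER_KG_H2 = 33.3          # approximate energy content of hydrogen (lower heating value)
FUEL_CELL_EFFICIENCY = 0.55   # assumed stack plus conversion efficiency
ESSENTIAL_LOAD_KW = 2.0       # assumed fridge, furnace fan, lights, sump pump

usable_kwh = H2_ONBOARD_KG * KWH_PER_KG_H2 * FUEL_CELL_EFFICIENCY
hours = usable_kwh / ESSENTIAL_LOAD_KW

print(f"Usable energy: {usable_kwh:.0f} kWh")
print(f"Runtime at {ESSENTIAL_LOAD_KW:.0f} kW: {hours:.0f} hours (~{hours / 24:.1f} days)")
```

Under these assumptions the car runs the essentials for roughly two days, consistent with the claim, though the numbers are guesses and the hypothesis still needs testing.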

My point about engineering design in general is that you need three designs:

  1. The one you have in production.
  2. Another, under test, that you could roll out fairly quickly.
  3. A disruptive design, in concept or on the drawing board, with the potential to sideline designs 1 and 2.

There is no guarantee, of course, that you won’t be leapfrogged by one of your current competitors.

The reality here is you might stand a greater chance of being leapfrogged by an organization that currently is not one of your competitors or by an organization that does not yet exist.

If this causes you not to sleep at night, read more of my blog articles.

My close friends maintain they are a sure remedy for insomnia.


Posted in Engineering Design, Manufacturing Operations, TECHNOLOGY | Tagged , , | Leave a comment

“Save Time and Spend” versus “Spend Time and Save”

When you go into any marketplace looking for a technology solution to address a complex problem, you basically have two options.

You can look at “best of class” rankings and pick near the top – this will save you time but it may cost you a lot of money.

The other option is to carry out an exhaustive search and hopefully use non-subjective filtering to find a “best fit”, spending time but saving money.


“Best of Class” – How it often works.

The first criterion of importance when you want to make a “best of class” product selection decision is the integrity and capability of the ranking service. Have the vendor products actually been evaluated, or is the ranking a function of how much money vendors have paid to have their products “evaluated”?

Next, we have to worry about what type of prospective customer the ranking was done for.

A high-ranking system that handles a certain volume of transactions but is not scalable may not work for you.


“Best Fit” – How it usually really works

For organizations that want to do their own ranking, the usual approach is to prepare and issue an RFP. Corner-cutting here usually results in a features list that may or may not reflect the needs of end users.

A better approach is to consult users to find out what they actually need and write up the RFP on that basis.

Neither of these approaches results in a shopping list that requests a solution for unanticipated future needs. For this, you have to go to an experienced consultant, bearing in mind that most consultants with the ability to consistently predict the future will be difficult to lure out of the private Caribbean islands they have retired to.

This puts your solution search between “Save Time and Spend” and “Spend Time and Save”, in that preparing an RFP is a non-trivial exercise in both time and money. The folks who prepare the RFPs do not always know, or bother to find out, what their users want or need, so they consult product literature and prepare an inventory of features across a large number of vendors.

The vendors invest heavily in responding, and the one with the most features often gets the contract, all other things being equal. The exception is when the buyer has decided in advance who is to get the contract, in which case the “also ran” vendors really should not spend a lot of time and money responding to the RFP.

RFPs often start “feature wars” where any vendor capable of understanding the Scope of Work (SOW) will traipse through the checklists and respond positively.

The problem with “best fit” is that users usually need only 10% of the features requested, so a vendor high up on the feature-count list might score very poorly on the few features the users actually need. It would be helpful if features could be ranked by usefulness, but that seems to be difficult, particularly when buyers go into the marketplace without bothering to consult the users about what is and is not important to them.
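A small sketch of the mismatch, with hypothetical vendors, features and weights, shows how the vendor with the longest feature list can lose once features are weighted by what users actually need:

```python
# Sketch: why a high raw feature count can hide a poor fit.
# Vendors, features and weights below are hypothetical.

needed = {"audit_trail": 10, "hl7_export": 9, "role_based_access": 8}  # what users need
nice_to_have_weight = 0.1   # everything else barely matters to the users

vendors = {
    "VendorA": {"audit_trail", "hl7_export", "role_based_access"},      # only 3 features
    "VendorB": {f"feature_{i}" for i in range(40)} | {"audit_trail"},   # 41 features
}

def weighted_score(features):
    """Score a vendor by usefulness-weighted features, not raw count."""
    return sum(needed.get(f, nice_to_have_weight) for f in features)

for name, feats in vendors.items():
    print(name, "raw count:", len(feats), "weighted:", weighted_score(feats))
```

VendorB tops the raw feature count yet scores well below VendorA on the weighted measure, which is the point: rank usefulness, not volume.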

Sound depressing?

‘Fraid so.

The easy route ends up being expensive and the difficult route takes a long time.



Posted in TECHNOLOGY, Uncategorized | Leave a comment

Hello, our name is . . .

This post is for readers who are new to this blog.

Civerex is a 22-year veteran of the healthcare EHR wars, with occasional forays into law enforcement, knowledgebase building, b2b, workflow management and data exchange.

We are not looking for investment capital, we want to help you to invest in yourself by private-labeling our software to give you the product you have been dreaming about and wanting to put on the market.

We have eight software products (CiverWeb, CiverMind, CiverManage, CiverPsych, CiverMed, CiverOrders, CiverMail, CiverExchange), all of which seamlessly interconnect and cover the full spectrum of strategy development, KPI setup, process mapping, modeling, simulation, rollout, data consolidation and performance monitoring.

Over time, we have put together 1,500,000 lines of source code that can be private-labeled as a distinctive new product in virtually any business sector/application area, with a few changes in terminology. Your business sector/your application, the one you have been dreaming of.

Our product portfolio has been developed by one team of developers, not eight. Each time a problem presented itself, we held back until we could work out a generic solution. Better one solution than eight.

Now, the thing about private-labeling with the right code base is that, aside from being able to speak different languages, one code base can support multiple products where each install is different: the service directories, the workflows, the data collection forms in service and the rule sets in place all differ.

We leaped over the hurdle of translating our application shells into different languages by building language translation facilities into the software: a partner in a different country or corporate culture can run the software we distribute, overtype the English and then, when no more English expressions can be found, ship back to us a language file that we compile into a new executable and return as a translated application.
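The overtype mechanism can be sketched as a simple string-table lookup (hypothetical code, not the actual Civerex implementation): the application renders every UI string through a lookup, untranslated strings fall back to English, and the fallbacks are collected so the partner knows what is left to overtype.

```python
# Sketch of an overtype-style translation table (hypothetical).
# UI text is rendered through lookup(); untranslated strings fall back to
# English and are recorded so a partner can see what remains to overtype.

translations = {"Open File": "Ouvrir le fichier", "Save": "Enregistrer"}
untranslated = set()

def lookup(english_text: str) -> str:
    if english_text in translations:
        return translations[english_text]
    untranslated.add(english_text)   # still showing English; flag for the partner
    return english_text

print(lookup("Save"))                      # translated
print(lookup("Print"))                     # falls back to English, recorded as pending
print("Pending:", sorted(untranslated))    # the partner's remaining worklist
```

When the pending set is empty, the table is the “language file” that gets shipped back for compilation.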

Our clients like “hands-on”.

They do not fancy having to build database tables/fields. We automated all of this long ago. They do like building, owning and managing their own workflows as an alternative to having to contract with us for “customization” or having to hire an expert consultant to help them build workflows and roll these out to a production environment. They don’t like learning languages or notations. They just want solutions to their problems.

In the area of data exchange, where the challenge is to get disparate systems to talk to each other, we used to build parsers/formatters to allow trading partners to exchange data. We found this tedious, so we wrote “sniffers” that can scan an incoming data file and, to an extent, save the programmer from having to write code for yet another minor variation of an earlier data transport file format.

There are a few things we have not yet figured out how to do and, of course, we have not done much about solving problems we have not yet heard about.

Here’s the deal.

If you want to put a new application on the market in a particular industry area, you can do this easily by inventing the next-generation Ferrari in your garage. Bill Gates did it. Steve Jobs did it. Why not you?

Another option that will give you “instant gratification” is to become a partner of ours and configure a new product, branded the way you want. The turnoff is the fees you will need to pay us to brand a product and provide ongoing support/maintenance, or the 5x fees you will have to pay to buy a copy of our source code, but only an ROI analysis can say whether the garage option or the private-label option might work for you.

You won’t get much help from us in the way of “Cheshire cat” smiling sales rep “assistance”. We are unusual in that we have a management consulting division, so for each question you put to us you are likely to get ten questions back. We don’t hesitate to tell you that you might be better off with another software vendor, because we have found it a lot more pleasant to deal with “delighted customers” than with “disgruntled former customers”.

Finally, if you have a concept you need to promote to others, we have a Video Production Unit that can put a good spin on any set of ideas you need to communicate to your peers, top management or other stakeholders.

Video recording sessions with us tend to be painful. We make you take, re-take and re-take until the end result “looks good”. We can afford to do this because you pay by the hour, but you will like the end result. If you want an inexpensive promo, talk to a teenager who has an iPad. You might end up with an award-winning video. Ask the tooth fairy if I am giving out good advice here.

Enough said.

Bottom line, you will never know whether this is a “good deal” unless/until you pick up the phone and call our Managing Director, Karl Walter Keirstead. Call 450 458 5601.

Who is Karl Walter Keirstead?

Just Google the name and then do some homework before you make your call. It’s a lot easier to talk to someone when you know who they are.

Posted in Decision Making, Organizational Development, Software Acquisition, Strategic Planning, TECHNOLOGY | Tagged | Leave a comment

Healthcare – The chickens finally have come home to roost

Make your day by clicking on the link below and then read this blog post.

If you feel healthcare in the USA is “too expensive”, write to Rep. Michael Burgess (R-Texas), a physician who leads the House Energy and Commerce trade subcommittee and is drawing up a bill to enforce data sharing, and tell him he can have interoperability simply by taking some of the members of the “Electronic Health Records Association” to court.

My comments at “story” were:


Who buys EMR software that is incapable of exporting its data? Who subscribes to a cloud EMR service that has no option for exporting the data?

And if you must acquire/subscribe to a system that charges ‘extra’ to unlock data export, why are doctors suffering sticker shock?

Is it because they bought a “car” without checking to see whether a motor/transmission was included and if not, how much extra?

Seems to me HIPAA says healthcare service providers are custodians of patient data. How can you be a custodian when you don’t have custody?

Thank goodness at least one person has it right: “Interoperability is what makes an EHR useful,” said Rep. Michael Burgess (R-Texas). No surprise to see that Rep. Burgess is an MD, not an IT person going into the marketplace to find software that “meets the needs” of clinical staff who have never been consulted about their needs.

And, if you are reading this and think that “interoperability” is “difficult” – think again.

My group builds software for healthcare, law enforcement, manufacturing, b2b.

Aside from healthcare, none of these other sectors could function without seamless interconnectivity. For this reason we built a Data Exchanger that lets any system talk to any other system.

For healthcare, we even built an e-hub that allowed 100+ clinics, all using different EMRs, to exchange data. It ran in pilot mode for about 12 months, consolidating more than 120,000,000 data elements without any significant hiccup.

Each time we found a new set of trading partners who could not use one of our “standard” data transport formats, we wrote a parser/formatter.

We found over time that the number of new formats slowed, but by then we had grown weary of writing parsers/formatters, so we developed a “sniffer” that could scan an incoming document, figure out pretty much on its own what was new or different, and greatly reduce the amount of custom programming needed.
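A minimal sniffer along these lines can be sketched with Python’s standard csv module (the file layout and field names here are invented): it guesses the delimiter and reads off the field names, so hand-written parsing is needed only for genuine novelties.

```python
import csv

# Minimal "sniffer" sketch (hypothetical): guess the delimiter and field names
# of an incoming flat file so a parser only needs hand-tuning for true novelties.

def sniff(sample_text: str):
    dialect = csv.Sniffer().sniff(sample_text, delimiters=",;|\t")
    has_header = csv.Sniffer().has_header(sample_text)
    first_row = next(csv.reader(sample_text.splitlines(), dialect))
    return dialect.delimiter, has_header, first_row

sample = "patient_id|dob|code\n1001|1980-02-14|F41.1\n1002|1975-07-01|F33.0\n"
delim, header, fields = sniff(sample)
print("delimiter:", repr(delim), "header row:", header, "fields:", fields)
```

Real transport formats (HL7, EDI and the like) need far more than this, but the principle is the same: let the software classify the incoming file before a programmer touches it.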

None of this was “rocket science”.

It may be time for a class action suit against the big “x” EMR vendors. No need to include Civerex in this pack; our EMRs have included import/export facilities since 1995.

My take . . . I think many of the players in this game deserve each other.

Do we really need an act of congress to provide relief for victims of self-imposed stupidity?

As my grandmother used to say, “Well . . . I never.”


Posted in FIXING HEALTHCARE, Interoperability | Tagged | Leave a comment

Closing the Business Planning, Monitoring, Control Circle

As specialty practitioners continue to focus on specific operational methodologies and tools and argue about which are “best”, the fundamentals are that managing a business is about building, maintaining and enhancing competitive advantage.

Any organization that is not able to build/maintain competitive advantage will fade. And, any organization that is unable to enhance its competitive advantage runs the risk of being leapfrogged by disruptive technologies (e.g. fuel cell powered automobiles that can act as portable electrical generators during power outages).

It all starts with strategy formulation.

We all know strategy is typically developed in isolation and then “communicated” to operational units, who are left to interpret it. It’s very easy to identify organizations that have a good strategy and poor operational performance, and there are many examples of organizations that are super-efficient at building products customers do not want.

The notion that “everything is a process” at the operational level is not sustainable in organizations where most of the staff members are knowledge workers. There are few end-to-end processes with convenient end point objectives in these organizations. Instead, we find “process fragments” that people, machines and software thread together at run time.

The range of operational methodologies/tools is such that no amount of buying “best in class” rankings will deter enterprising organizations from finding more innovative and cost-effective “solutions” to their problems.

Buyers can “save time and spend” or “spend time and save”.

It’s a bad idea, actually, to wait for problems to be identified and to then only start looking for “solutions”.

Some capabilities need to be classified as “strategic corporate assets”, IT infrastructure being a prime example.

Picking an IT Infrastructure – one of the most important corporate decisions you will make.

If an organization subscribes to the notion that the objective of any business is to remain in business, it is important to pick an Information infrastructure that has the potential to address unanticipated future needs, problems and opportunities.

Easily said, not so easy to do.

But failure to adopt technology that is “future-proof” results in a need to return to the marketplace for new technology in 2-3 years, long before the ROI for the old technology has had time to run its course.

No one understands this better than video production companies who, having recently gone to HD, are currently tripping over each other to upgrade to 4K.

Meanwhile, manufacturers are getting ready to roll out 8K cameras but, wait, Canon just announced a new sensor that is 60 times sharper than 1080p HD.

If an organization wants to be successful, it needs methodologies and tools that are capable of allowing the organization to close the gap between strategy and operations.

If you are a subscriber to my collection of rants, you will have seen comments on the different information architectural needs of transaction processing applications versus applications/environments needed to carry out strategic planning.

Transaction processing applications pull/push messages from/to engines that provide very specific services. A Case-based run-time environment that showcases seamless interoperability has the wherewithal to provide decision support, collect operational level data, share data and build up a history of transactions, Case by Case.  All you need in such an environment is the ability to establish a cursor position in an RDBMS and engage processing.

Knowledgebase applications require a different IT architecture.

Here we have a need for graphic display of thousands (sometimes tens of thousands) of records, organized in multi-root hierarchical trees, with free-form search capabilities across all records. The records typically come from multiple Entities where, in some organizations, there is no duplication of data elements across Entities (e.g. resources, customers, products, competitors, policy/procedure/legislation).

The focus can be on a particular structured data element value, text, key words, even images, and the scope of any search is likely to extend to current and historical data as opposed to current data only.

The 2nd difference between Kbase application IT architectures and the usual structures found in traditional RDBMS applications is that in a Kbase you need to be able to put a simultaneous focus on all records, not just the “hits”.

Users are likely, some of the time at least, to be more interested in what a search did NOT find than in what it found (e.g. McDonalds today wants to put an outlet at a location where Wendys is; tomorrow they may want to put an outlet where Wendys is not).

The 3rd and final difference is that the “structure” of data in a Kbase is likely to change on the fly (e.g. I have records organized by City; following a search I may want to cluster some of these, temporarily or permanently, by type of business, keeping the City structure intact).

It’s easy to understand that in a Kbase application you need to load record stubs for all records into memory, carry out searches there and then build your KMap from memory.
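A toy sketch of the idea, with invented records and attributes: load lightweight stubs into memory, search them, keep both the hits and the misses, and re-cluster on the fly without disturbing the City organization.

```python
from collections import defaultdict

# Sketch (hypothetical data): in-memory record stubs support three Kbase needs
# at once: search hits, search misses, and on-the-fly re-clustering.

stubs = [
    {"id": 1, "city": "Dallas",  "type": "burger", "name": "Wendys"},
    {"id": 2, "city": "Dallas",  "type": "pizza",  "name": "Lou's"},
    {"id": 3, "city": "Austin",  "type": "burger", "name": "Wendys"},
    {"id": 4, "city": "Houston", "type": "coffee", "name": "Beano"},
]

hits = [s for s in stubs if s["name"] == "Wendys"]

# What the search did NOT find: cities with no Wendys are candidate locations.
cities_without_wendys = {s["city"] for s in stubs} - {s["city"] for s in hits}
print("No Wendys in:", sorted(cities_without_wendys))

# Temporary re-clustering by type of business; the City view is untouched.
by_type = defaultdict(list)
for s in stubs:
    by_type[s["type"]].append(s["city"])
print(dict(by_type))
```

Because the stubs stay in memory, the “where Wendys is not” question and the temporary by-type clustering are both a single pass over the same data.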

Different User mindset, different User Interface, different architecture.

See “It’s Time For You To Get Your Big Data Organized”

Stay tuned for more information on “Cases of Cases” in a subsequent blog post.

Posted in Case Management, Competitive Advantage, Decision Making, Operational Planning, Organizational Development, Strategic Planning | Tagged , , | Leave a comment

The potential of telemedicine for reducing the cost of healthcare and for improving quality of life

Telehealth has the potential to greatly reduce the cost of healthcare services delivery and greatly increase quality of life.

In respect of returning military personnel, John Liebert, MD and William JJ Birnes, PhD, JD published in 2013 a book called “Wounded Minds”, in which they highlighted the cost of inefficient traditional treatment approaches ($32.2 billion in annual expenditure for anxiety disorders alone).

These two authors state (page 254), in respect of new suicide prevention initiatives, that “Technology aimed at augmenting therapy is another strategy, one designed to overcome some access to care issues in remote areas. Virtual reality and telemedicine are examples.”

Anxiety is just one area of medicine that can benefit from telehealth (others include substance abuse, depression, etc.).

It would, in my view, be a mistake to limit telehealth to the behavioral subset of conditions that patients can present with, or to restrict it to care access in remote areas.

It’s my view we have hardly begun to scratch the surface here.

We need to remind ourselves that the approach to medicine as currently practiced (i.e. fixing problems) is far less efficient than encouraging lifestyles that help to prevent problems from developing (i.e. wellness).

We can use telehealth in the area of treatment planning/monitoring as well as in the area of promoting wellness, with the caveat that no single approach/methodology/technique should replace all others.

The increasing availability of medical devices in the field brings us back to telemetry, a technology that is absolutely pervasive in industry, with origins in the 19th century (data transmission between the Russian Tsar’s winter palace and army headquarters, developed in 1845).

My area of interest in healthcare is in continuity of care (i.e. doing the right things, the right way, using the right resources, at the right places and times).

The foundation for this is twofold

a) there cannot be ten best ways to do something nor should there be only one.

b) healthcare resources are scarce so we need to make efficient use of these.

We can talk on and on about telehealth but it is an area that has many moving parts and these all have to fit together smoothly and seamlessly if we are to make effective and efficient use of this important technology.

Civerex has been a pioneer in providing infrastructure for telehealth.

By the early 2000s we had telehealth treatment planning/monitoring software in place for use in the treatment of anxiety disorders. Communication at the time was purely via telephone, but with call centers in one time zone, providers in another and patients in yet a third, it was important that the appointment booking module used by call center staff make sure that providers and patients would “meet” at the right time.
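The scheduling problem reduces to storing one appointment instant and rendering it in each party’s zone. A sketch using Python’s zoneinfo (the zones and times are invented, not from the actual booking module):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# One appointment instant, three local clocks: the booking module must show
# each party the same instant in their own time zone. (Hypothetical zones.)

appointment = datetime(2015, 3, 10, 14, 0, tzinfo=ZoneInfo("America/New_York"))

for party, zone in [("call center", "America/Chicago"),
                    ("provider",    "America/New_York"),
                    ("patient",     "America/Los_Angeles")]:
    local = appointment.astimezone(ZoneInfo(zone))
    print(f"{party:11s} {local:%Y-%m-%d %H:%M %Z}")
```

Booking in the provider’s local time and converting for display avoids the classic failure mode of storing a naive wall-clock time per party, which breaks as soon as daylight saving shifts differ.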

Civerex’s current focus is to provide customers with efficient ways of recording 1:1 telehealth sessions and consolidating video/voice recordings at patient EHRs. We are looking to accommodate live video broadcasts of in-home sessions carried out by clinical staff so that senior staff back at clinics can tune into these broadcasts and provide real-time advice/assistance on the administration of treatment plan protocols.


About Civerex

The owners of Civerex developed, in the late 1980s, a software product called RapidTox for the diagnosis of instances of poisoning. We were inspired by the work of Robert Dreisbach, MD, who published the 1st edition of the Handbook of Poisoning back in 1955.

The foundation of our work on RapidTox was a diagnostic algorithm that was able to identify candidate poisons on the basis of symptoms/signs. Selection of a poison gave the user a list of modalities (treatments that worked) with goals/objectives plus the ability to carry out differential diagnoses.

Our current suite of behavioral/medical software products continues to include the diagnostic algorithm, in addition to putting a focus on encouraging consistent use of “best practices” protocols via background orchestration, with accommodation for deviating from these as and when deemed appropriate or necessary, subject to governance. The two core methodologies we use are BPM (business process management) and ACM (adaptive case management).

Another area of interest is promotion of the concept that discharge planning should start with the first incoming phone call.

We have spent a lot of time providing seamless interoperability between our products and local and remote 3rd-party systems and applications, and we promote for general use a product called CiverExchange that addresses this need.

Any groups interested in collaborating with Civerex should contact us at 450 458 5601 to highlight areas of interest, and should be prepared to apply for research grants to fund any proposed initiatives that Civerex may agree to.

We are happy to provide “private label” versions of our software for loading content likely to be of interest to different communities of prospective users.


Posted in Adaptive Case Management, Business Process Management, Data Interoperability, FIXING HEALTHCARE, Interoperability, Telehealth, Video Production | Tagged , , , | Leave a comment

What do Process Maps and Suicide Sheep have in Common?

I didn’t expect you to “get it”.

Check out this Factual Facts post: “In 2005 in Turkey, a suicide sheep jumped off a cliff and 1500 sheep followed the first one”.

Now, before you go away, I need to very quickly make an important connection.

Many process maps get prepared on paper, are then filed, and are never referred to again.

If this is what happens to the maps you prepare, you might as well prepare them and throw them off a cliff.

My point is. . . .

For any process that has complex steps, connected sometimes in complex ways, with multiple decision box branching points, where different skilled persons must perform different steps, where different steps require the collection of different data, where you want to carry out data mining after-the-fact . . . .

there is NO way instances of your process templates can be “managed” by staring at a paper map and NO way, however long staff might stare at such maps, that you will be able to have process instances performed properly (doing the right things, the right way, at the right time, using the right resources, collecting the right data).

Remedy . . . .

Draw your process maps on an e-canvas, compile them, roll them out for modeling/simulation, improve them, then roll them out again so they can provide real-time orchestration within a Case run-time environment.

Otherwise, see you at the bottom of the cliff.


Posted in Adaptive Case Management, Business Process Management, Case Management, Nonsense | Tagged , , , , | Leave a comment

What’s your action plan for BPM for 2015?

We all understand that organizations need best practices and that, in the case of complex processes, these need to be in-line, not on-line and certainly not off-line.

In theory, you can ask data mining software to run through your event logs and build up an inventory of all processes. If your processes are in-line you will have an event log; otherwise you probably won’t.

So, in the absence of an existing complete inventory of in-line processes, you have little choice but to drag key stakeholders into a room or a virtual room and facilitate process mapping.

Do this and you will be at a stage where how you get to mapped, improved, modeled, compiled and rolled-out processes will be mostly up to you.

You can use an e-canvas to drag and drop process steps as fast as stakeholders say “and then we do this”, click on your compiler and be done OR you can take notes, go away for 1-2 days and then reconvene to discuss your paper process map.

You can easily end up staying away for several weeks if you elect to write computer code to get to where you can “piano-play” your process so that stakeholders can identify missing steps, steps that are improperly sequenced, steps that have the wrong attached forms, and wrong routings.

If you are a consultant trying to stretch out your engagement, this “buggy whip” strategy will only work until your client finds out how long process mapping should take, using appropriate technology.

If you are a BA/BI staff member, undue dragging-on of process mapping initiatives will tax the patience of your stakeholders and put the initiative at risk – most of these folks would rather be somewhere else.

Don’t know where to start with BPM? Google “BPM” – you will see some 68,600,000 results.

Another option is to browse my 235 blog articles written over a period of four years.


Posted in Business Process Improvement, Business Process Management, Operations Management, Process Management, Process Mapping | Leave a comment

Get Your Story Right

If you want to become a management consultant, it’s unreasonable to expect to ease into this the day after you graduate from university.

It takes time to build up the expertise to walk in off the street and facilitate strategy development sessions within a corporation, and time to work out ways and means of helping corporations align operations with strategy.

Once you are ready, unless you plan to rely on word of mouth, you need to get out a story that allows prospective clients to move beyond the notion that all you will do for them is borrow their watch to tell them what time it is.

Most consultancies provide consulting services only, leaving it up to prospective customers to do most of the hands-on heavy-lifting that follows receipt of a study report.

Civerex followed a different path – we have been an operating company for most of our 30-year history and only recently started to offer consulting services to clients. We practiced what we preach.

Our clients quickly discover that whereas we offer non-product-specific advice/assistance, we have a range of products that can be used to move forward from study reports to implementation of ideas/concepts.

All of these products have a stronger focus on application system development than on plug-in “one-size-fits-all” solutions. They can be configured for different industry areas and application situations, either by us, by independent consultants/implementors or, if a client feels up to it, by internal staff whom Civerex mentors.

Here, for the record, is the Civerex History.

Make sure your story is not too short, nor too long and that it reads well.


About Civerex

2015 is our 30th anniversary . . .

We started off as Jay-Kell Marketing Services in 1985, organizing high-tech seminars out of Singapore on satellite communications, military radio, LANs/WANs and Object-Oriented programming.

In 1990 we moved the business to Canada, continuing with seminars, but adding sales/support for Canada for a range of 4GL and O-O software products manufactured by mdbs inc., a Lafayette, IN based software company.

In 1992 we started to develop software applications on our own, thanks to a grant from the Ontario Hospital Association to develop an expert system for the diagnosis of psychiatric disorders. Our first customer was the Royal Ottawa Hospital.

We did a spin-off of software distribution/support to a new business entity, Civerex Systems Inc., in 1994.

Civerex became a sole-source supplier to DOJ/FBI in the late 1990s and spent a number of years developing software for the management of hostage, barricade and suicide critical incidents. We developed for the FBI a software application called L.E.N.S. (Law Enforcement Negotiation System).

During the early 2000’s we formed a joint-venture with an aerospace engineering company called Infinity Technologies and ran our US operations out of Huntsville, AL for a number of years. Infinity was eventually bought out, Infinity/Civerex LLC was shut down, and Civerex (Canada) took over the ICLLC customer base.

Jay-Kell Technologies today continues to own all of the intellectual property for some eight software products (knowledgebases, e-mapping, entity record management, portal, and data exchange) and has responsibility for private-label licensing. The way we see it, having two out of three competitors proposing our technology on an RFP initiative is a good thing.

In 2010 we started to provide management consulting services with a specific focus on bridging the gap between strategy and operations.

In 2012 we added a video production unit to address the growing demand for the development of advanced sales/promotion approaches and at-a-distance-customer training.

As you can see, it’s not our first rodeo.

We can bring a wide range of international hands-on expertise to bear on problems/issues you may need help with.

Call 800 529 5355 (USA) or +1 450 458 5601 elsewhere for more information.



Posted in Adaptive Case Management, Business Process Improvement, Business Process Management, Competitive Advantage, Compliance Control, Customer Centricity, Data Interoperability, Enterprise Content Management, MANAGEMENT, Operational Planning, Operations Management, Organizational Development, Planning, Productivity Improvement, Strategic Planning | Tagged | Leave a comment

How certain can we be about uncertainty?

The answer is 100% – anything we map out as a plan for the future will be characterized by risk and uncertainty.


We can quantify risk but under many scenarios the only thing we can say about uncertainty is it will always be on the horizon.

What do we do when an initiative is impacted by a significant negative event?

A practical example of an unanticipated negative event is a product development initiative using a particular technology that gets hit by a leapfrog technology.

For example, you might be in the process of perfecting a new type of gasoline engine and simultaneously be hit with an oil price drop (this week) plus an announcement by Toyota (this week) that they plan to focus on powering automobiles via fuel cells.

The fact that cars with fuel cells can provide an electrical feed that can handle some of your household electrical needs for a couple of days is bad news for the makers of residential standby generators, although it means you pretty much have to avoid driving until the power company grid comes back on-line, unless you have two cars.

I wonder how long residential standby generator manufacturers have known about the possibility of using a car as a backup generator.

Adaptive Case Management is a methodology that allows knowledge workers to change course on short notice. ACM is not particularly helpful at predicting uncertainty.

Building and maintaining knowledgebases can help you keep an eye on open options, on technology and on the competition.

If you take the trouble to include KPIs in your Kbase and take the time to challenge trends in your KPIs by carrying out free-form searches across your Kbase, you may have a leg up on your competitors.

How do we find people who are good at predicting the future?

Answer: it’s pretty much pointless.

They are difficult to find – most of them retire to fortified islands in the Caribbean and disconnect their phones.


Posted in Decision Making, Operational Planning, Strategic Planning | Tagged , , , | 1 Comment