SCENARIO-BASED MODELLING: Storytelling our way to success.

“The soft stuff is always the hard stuff.”


Whoever said ‘the soft stuff is the hard stuff’ was right.  In fact, Douglas R. Conant, coauthor of TouchPoints: Creating Powerful Leadership Connections in the Smallest of Moments, when talking about an excerpt from The 3rd Alternative: Solving Life’s Most Difficult Problems, by Stephen R. Covey, goes on to note:

“In my 35-year corporate journey and my 60-year life journey, I have consistently found that the thorniest problems I face each day are soft stuff — problems of intention, understanding, communication, and interpersonal effectiveness — not hard stuff such as return on investment and other quantitative challenges. Inevitably, I have found myself needing to step back from the problem, listen more carefully, and frame the conflict more thoughtfully, while still finding a way to advance the corporate agenda empathetically. Most of the time, interestingly, this has led to a more promising path forward and a better relationship, which in turn has made the next conflict easier to deal with.”

Douglas R. Conant.

Conant is talking about the most pressing problem in modern organisations – making sense of stuff.

Sense Making

Companies today are awash with data.  Big data.  Small data.  Sharp data.  Fuzzy data.  Indeed, there are myriad software companies offering niche and bespoke software to help manage and analyse data.  Data, however, is only one-dimensional.  To make sense of information is, essentially, to turn it into knowledge, and to do this we need to contextualise it within the frameworks of our own understanding.  This is a phenomenally important point in sense-making: the notion of understanding something within the parameters of our own mental frameworks, and it is something most people can immediately recognise within their everyday work.


Take, for instance, the building of a bridge.  The mental framework by which an accountant understands the risks in building the bridge is quite different from the way an engineer understands those risks, or indeed how a lawyer sees the very same risks.  Each was educated differently, and the mental models they use to conceptualise the same risks lead to different understandings.  Knowledge has broad utility – it is polyvalent – but it needs to be contextualised before it can be capitalised.


For instance, take again the same risk of a structural weakness within the new bridge.  The accountant will understand it as a financial problem, the engineer will understand it as a design issue, and the lawyer will see some form of liability and warranty issue.  Ontologically, the ‘thing’ is the same but its context is different.  In order to make decisions based on their understanding, however, each person builds a ‘mental model’ to re-contextualise this new knowledge (with some additional information).

There is a problem.

Just like when we all learned to add fractions when we were 8, we have to have a ‘common denominator’ when we add models together.  I call this calibration, i.e. the art and science of creating a common denominator among models in order to combine and make sense of them.


Why do we need to calibrate?  Because trying to analyse vast amounts of the same type of information only increases information overload.  It is a key tenet of Knowledge Management that increasing variation decreases overload.


We know this to be intuitively correct.  We know that staring at reams and reams of data on a spreadsheet will not lead to an epiphany.  The clouds will not part, the trumpets will not blare, and no shepherd in the sky will point the right way.  Overload and confusion occur when one has too much of the same kind of information.  Making sense of something requires more variety.  In fact, overload only increases puzzlement due to the amount of uncertainty and imprecision in the data.  This, in turn, leads to greater deliberation, which then leads to increased emotional arousal.  The ensuing ‘management hysteria’ is all too easily recognisable.  It leads to cost growth as senior management spend time and energy trying to make sense of a problem, and to further strategic risk and lost opportunity as these same people neglect their own jobs whilst trying to make sense of it.


In order to make sense, therefore, we need to aggregate and analyse disparate, calibrated models.  In other words, we need to look at the information from a variety of different perspectives, through a variety of lenses.  The notion IT companies would have us believe – that we can simply pour a load of wild data into a big-tech hopper and have it spit out answers like some modern Delphic oracle – is absurd.


Information still needs a lot of structural similarity if it’s to be calibrated and analysed by both technology and our own brains.

The diagram below gives an outline as to how this is done, but it is only part of the equation.  Once the data is analysed and valid inferences are made, we are still only partially on our way to better understanding.  We still need those inferences to be contextualised and explained back to us in order for the answers to crystallise.  For example, in our model of a bridge, we may make valid inferences of engineering problems based on a detailed analysis of the schedule and the Earned Value, but we still don’t know if that’s correct.


As an accountant or lawyer, therefore, in order to make sense of the technical risks we need the engineers to play back our inferences in our own language.  The easiest way to do this is through storytelling.  Storytelling is a new take on an old phenomenon.  It is the rediscovery of possibly the oldest practice of knowledge management – a practice which has come to the fore out of necessity and due to the abysmal failure of IT in this field.

Scenario-Based Model Development

Using our diagram above in our fictitious example, we can see how the Legal and Finance teams, armed with new analysis-based information, seek to understand how the programme may be recovered.  They themselves have nowhere near enough contextual information or technical understanding of either the makeup or execution of such a complex programme, but they do know it isn’t going according to plan.

So, with new analysis they engage the Project Managers in a series of detailed conversations whereby the technical experts tell their ‘stories’ of how they intend to right-side the ailing project.

Notice the key differentiator between a bedtime story and a business story – DETAIL!  Asking a broad generalised question typically elicits a stormy response.  Being non-specific is either adversarial or leaves too much room to evade the question altogether.  Engaging in specific narratives around particular scenarios (backed up by their S-curves) forces the managers to contextualise the right information in the right way.

From an organisational perspective, specific scenario-based storytelling forces managers into a positive, inquisitive and non-adversarial narrative on how they are going to make things work, without having to painfully translate technical data.  Done right, scenario-based modelling is an ideal way to squeeze the most out of human capital without massive IT spends.






The True Cost of IT

The cost of hardware has been falling for years, and yet enterprise IT budgets continue to rise well beyond inflation.  The truth is that per-capita costs seem to be increasing faster than per-unit costs are decreasing.  Coupled with colossal cost overruns in implementations, most CIOs are under significant pressure for their technology budgets not only to deliver better systems but also demonstrable business benefits.  Many companies are looking to cloud services to solve their technology cost problems, but few realise that not only are the costs of ICT hidden throughout the organisation, but so is the value.  Simply put, IT just should not be so expensive.  So, where does the ‘real’ cost of technology lie?

ICT managers traditionally have little accountability and there are few controls over their cost centre.  Yet, these days, ICT plays a pivotal role in supporting the execution of corporate strategy.  Despite this, the only time the business has the opportunity to analyse and correct failed projects is after they have delivered (or not, as the case may be).
In this paper we argue that by using a 2-step method for ICT costing – (i) a top-down approach to consolidating ICT costs, and (ii) the inclusion of ‘project costs’ – businesses can increase the granularity of their ICT total cost of ownership (TCO).  In this way, the CIO, and not the CFO, becomes master of cost control.  Projects can be costed properly, right-sided accurately and deconflicted, and business capability – not just shiny new tech – can actually be delivered.


Whether in downturn or upturn, cost reduction or growth, the challenge for most CIOs is how to squeeze the most out of their limited budget.  In order to do this the business must gain an exact picture of its IT cost base.  Fundamentally, without knowledge of just how much IT costs the business, every project and every initiative is doomed, and IT will never deliver the capability to execute company strategy.

The cost of hardware may be decreasing but the total cost of enterprise software is always rising for 2 reasons:

  • The cost of labour to run enterprise software is higher because it is, seemingly, more specialised.
  • The cost of delivery is higher because deployment and integration takes longer and involves more people.

Few of these costs, if any at all, are known at the outset.  More problematically, there is no in-house ability to gain insight into total expected costs.  The statistical ability and IT awareness needed to build the necessary models simply does not exist, even in the most sophisticated organisation.

What is clear is that it is impossible to drive the development and approval of ICT projects through the same capital budgeting system as any other large corporate purchase.  Quite simply, the mathematics needed to assess the purchase of new software is radically different from that for new plant, or for acquiring another company.  The business may wish the approach to be the same, but to do so is incorrect.  For example, most projects are assessed on an NPV analysis, yet to say that a software program (typically a Management Information System) can actually increase cash flow to the business is complete nonsense.  All projects need to demonstrate measurable financial return (how much and when, in discounted cash flows) right from the outset – so how is IT to achieve this?  With so many broken promises, jaded executives could be excused for disbelieving the wild exaggerations of the technology prophets.

The challenge is to create a system of ICT cost analysis which allows both better capital allocation and a better understanding of the value added by IT systems.  If the business does not, projects are inevitably delivered late and under-spec.  The IT function then spends years cannibalising its budget in order to deliver the original project.  The result is huge write-downs in software in order to get the organisation back up to its benchmark.
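For comparison, the standard capital-budgeting test mentioned above can be sketched in a few lines; the discount rate, outlay and claimed savings below are purely hypothetical figures, not from this article:

```python
def npv(rate, cash_flows):
    """Net present value; cash_flows[0] falls at time zero (the outlay)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical MIS project: a 500k outlay, then 150k of claimed annual
# savings for five years, discounted at 10%.
flows = [-500_000] + [150_000] * 5
print(round(npv(0.10, flows), 2))  # ~68,618 - a 'positive' business case
```

The trouble the article identifies is with the inputs, not the arithmetic: for a back-office system, the claimed annual ‘cash flows’ are rarely real cash at all, so the NPV test answers the wrong question.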


True Cost of IT – Scatter Plot

The link between IT spend and commercial profitability has never been proven.  Increased technology budgets are largely a product of larger cash reserves.

Despite the recent economic downturns across the globe little has changed in ICT budgets.  Finance may be scarcer and the hurdle heights for budgetary approval may be higher but CIOs are still largely capitalised on a yearly spend level with little additional oversight.  Although IT departments have to fight harder for their dollars, once they have them there is still scant understanding or linkage between the money they are given and the value they deliver.

IT departments constantly bemoan the fact that evaluating the benefits of technology is difficult.  Some common misconceptions are that:

  • most of the benefits of technology are intangible,
  • the benefits take a long time to materialise,
  • the real benefits of IT are strategic and lie in competitive advantage,
  • the benefits are indirect,
  • the benefits often come from dependencies, and
  • there are insufficient mechanisms for capturing the value of these information systems.

All of this is nonsense.  The fact is that all of our traditional analysis is either too general or too narrow.  The cost of technology and the value it creates are already accounted for; they are just hard to find.


The core of the issue is: where are the IT costs?  In order to show IT executing corporate strategy, its costs have to be clearly aligned to the corporate chart of accounts.  Unfortunately, IFRS/GAAP accounting procedures do not take into consideration the indirect, non-capital ICT spend.  Even most cost tracking systems do not isolate IT-related training budgets, do not extricate executive management oversight of IT projects, and do not account for meetings and interviews relating to ICT choice and procurement.  In fact, 10–15% of total ICT project costs are usually spent before delivery even starts.
Most IT project budgets are hugely understated.  Management time is unaccounted for, and even operating costs are hidden in other budgets in order to reduce the visibility of overspend.  Even with these costs, the IT spend often does not incorporate systems-related training, or the operational time of non-IT personnel in assisting implementation and deployment.

APPROACH – The Top-Down Model

True Cost of IT – Top-Down Cost Model

When creating a cost model for the total cost of IT ownership, most management accountants go about accumulating line items and activities of the various financial centres relating to IT.  This goes nowhere, fast.  The secret is to create a top-down model, such as the cost accounting model shown to the right.

Hard Costs v Soft Costs v Project Costs

By subtracting from total revenues all non-ICT-related costs, the business naturally arrives at the total cost of its ‘Information’ spend.  What becomes clear is the staggering amount of time and energy which management spend on the development, purchase and integration of IT, for they are the primary beneficiaries of information.  The role of management, after all, is largely to use information to better allocate capital and resources within the business, predominantly through the use of IT to make decisions.
The cost of management relating to IT project work (i.e. non-transactional work) is called ‘Soft Costs’.  It is essential to arrive at soft costs from the top down because management rarely accounts for its time.  A bottom-up cost model, therefore, will always be woefully inadequate.
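That top-down logic can be sketched as a residual calculation: everything the chart of accounts can firmly attribute elsewhere is subtracted from revenue, and what remains unexplained (after profit) is the total ‘Information’ spend.  All figures below are invented for illustration:

```python
# Hypothetical figures - a top-down derivation of the 'Information' spend.
revenue = 100_000_000
non_ict_costs = {                      # costs firmly attributable elsewhere
    "cost_of_goods_sold": 55_000_000,
    "property_and_plant": 12_000_000,
    "non_ict_salaries":   20_000_000,
    "other_overheads":     6_000_000,
}
operating_profit = 4_000_000           # left over after every cost, ICT included

# The residual neither explained by non-ICT costs nor retained as profit
# is the total 'Information' spend - hard, soft and project costs together.
information_spend = revenue - sum(non_ict_costs.values()) - operating_profit
print(f"{information_spend:,}")  # 3,000,000
```

The point of working downwards is that the residual automatically captures management time and other soft costs that no bottom-up line-item model ever records.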

The Defence community has also quantified the relationship between certain qualitative indicators and project costs.  These are called ‘Project Costs’ and they take into account, for instance, the effect that a cohesive stakeholder community has on a project.  There are 14 factors in total, covering project complexity and workforce maturity, among others.

Traditionally, the business’ management accountants attempt to shoehorn a fraction of these costs – inaccurately – into a ‘bottom-up’ cost model of hard costs which has limited granularity.  Unable to account for the soft services of management and the effect of qualitative indicators, projects inevitably present the most optimistic, rose-tinted view of delivery.  Given that most ICT projects spend 10% of their budget before they start, and most misallocate another 20% of their time, even a project that modestly reports a 10% overspend has really overrun by 40%.  An ICT project which fails to deliver within cost parameters has 2 possible outcomes: (i) people get sacked and budgets are written down, or (ii) the IT department cannibalises the operational budgets of all other projects trying to deliver the failed initiative.
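The arithmetic behind that 40% figure is simply additive – the reported overspend sits on top of budget already sunk before delivery and time already misallocated during execution:

```python
# The article's own figures for a typical ICT project.
pre_delivery_spend = 0.10   # budget spent before delivery even starts
misallocated_time  = 0.20   # project time misallocated during execution
reported_overspend = 0.10   # the overspend the project actually reports

true_overrun = pre_delivery_spend + misallocated_time + reported_overspend
print(f"{true_overrun:.0%}")  # 40%
```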

In the post-GFC world, CIOs cannot afford to keep over-promising and under-delivering through a failure to understand the true cost of their ICT.  Until they do, they will remain merely heads of the IT department.


ICT projects are more complex than normal capital projects.  They often contain a large number of unknown variables and are often implemented to achieve vague business outcomes.  Normal capital projects, such as building a new factory or developing a new product, have clear, well-established objectives, outcomes and processes.  What is more, their success directly relates to increased revenue.  ICT projects, on the other hand, do not generally increase revenue directly; they usually support back-office functions.  Budgeting for ICT projects therefore requires a different mindset and a different costing process.

The Defence community have long realised the expense of soft costs involved in projects.  Of the 6 phases of a project’s lifecycle the first 3 are devoted entirely to systems engineering and managerial decision making.

Defence often calculates the effect that qualitative factors have on projects – these are what may be termed ‘Project Costs’.  These factors take into consideration the complexity of the project, the design factors, and the maturity and cohesion of the team, amongst others.
A CIO, therefore, who only costs their department on the basis of hard costs is only scratching the surface of real costs.  Without taking into account ‘Soft Costs’ and ‘Project Costs’, ICT projects are doomed to overrun on cost and schedule and to under-deliver on capability.

Benefits of Better IT Costings

Naturally, the new financial model of the ICT spend shows an increase due to the addition of previously hidden costs.  Although the new cost of management may be shocking, it is also revealing given how many of the systems are not directed at better managerial decision making, and how rarely management value-added is included in any business case.  In fact, most business cases myopically rely on the traditional approach of reduced labour costs and increased business efficiency – both of which usually turn out to be fanciful.
This, in turn, also enables projects to show (i) the value of ICT to managerial decision making, and (ii) a clearer and more precise method of ICT chargebacks.
Ultimately, finer granularity in the cost accounting of the ICT spend goes beyond an increase in the alignment between the IT department and the Finance function.  Greater transparency and scrutiny will inevitably lead not only to better implementations but also to better projects which deliver hard-dollar benefits to the business in cost savings, precise growth and increased management value-added.

When this is done, the business can adjust and fine-tune IT costs in the same way as any other operational spend.  More importantly, the business will be able to choose the best IT investments – those able to execute corporate strategy – rather than relying on industry fads and vendor promises of wild growth and productivity increases.  In the end, only when the CIO gets to grips with the true cost of their technology spend can they take a seat at the top table.