SCENARIO-BASED MODELLING: Storytelling our way to success

“The soft stuff is always the hard stuff.”

Unknown.

Whoever said ‘the soft stuff is the hard stuff’ was right.  Douglas R. Conant, co-author of TouchPoints: Creating Powerful Leadership Connections in the Smallest of Moments, made the same point when discussing an excerpt from The 3rd Alternative: Solving Life’s Most Difficult Problems, by Stephen R. Covey:

“In my 35-year corporate journey and my 60-year life journey, I have consistently found that the thorniest problems I face each day are soft stuff — problems of intention, understanding, communication, and interpersonal effectiveness — not hard stuff such as return on investment and other quantitative challenges. Inevitably, I have found myself needing to step back from the problem, listen more carefully, and frame the conflict more thoughtfully, while still finding a way to advance the corporate agenda empathetically. Most of the time, interestingly, this has led to a more promising path forward and a better relationship, which in turn has made the next conflict easier to deal with.”

Douglas R. Conant.

Conant is talking about the most pressing problem in modern organisations – making sense of stuff.

Sense Making

Companies today are awash with data.  Big data.  Small data.  Sharp data.  Fuzzy data.  Indeed, there are myriad software companies offering niche and bespoke software to help manage and analyse it.  Data, however, is only one-dimensional.  To make sense of information is, essentially, to turn it into knowledge, and to do that we need to contextualise it within the frameworks of our own understanding.  This is a phenomenally important point in sense-making: the notion of understanding something within the parameters of our own mental frameworks is something most people can immediately recognise in their everyday work.

Contextualisation

Take, for instance, the building of a bridge.  The mental framework through which an accountant understands the risks of building the bridge is quite different from the way an engineer understands those risks, or indeed how a lawyer sees the very same risks.  Each was educated differently, and the mental models they use to conceptualise the same risks lead to different understandings.  Knowledge has broad utility – it is polyvalent – but it needs to be contextualised before it can be capitalised.


For instance, take again the same risk of a structural weakness within the new bridge.  The accountant will understand it as a financial problem, the engineer will understand it as a design issue, and the lawyer will see some form of liability and warranty issue.  Ontologically, the ‘thing’ is the same but its context is different.  In order to make decisions based on their understanding, each person builds a ‘mental model’ to re-contextualise this new knowledge (with some additional information).

There is a problem.

Just as we all needed a ‘common denominator’ when we learned to add fractions at age 8, we need a common denominator when we add models together.  I call this calibration: the art and science of creating a common denominator among models so that they can be combined and made sense of.
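
To make the idea concrete, here is a minimal Python sketch of calibration under simple assumptions: each discipline’s view of the same structural-weakness risk is expressed in one common denominator (a probability-weighted financial impact) so the views can be compared and combined.  The Risk class, the figures and the probabilities are all invented for illustration and are not a prescribed schema.

```python
# A minimal sketch of 'calibration': risks captured in different mental models
# (engineering, legal, finance) are expressed in one common denominator --
# expected financial impact -- so the models can be combined.
# All field names and figures are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Risk:
    description: str
    discipline: str          # "engineering", "legal", "finance", ...
    probability: float       # chance the risk is realised (0..1)
    financial_impact: float  # cost if realised, in the project currency

    @property
    def expected_cost(self) -> float:
        # The common denominator: probability-weighted financial impact.
        return self.probability * self.financial_impact

# The same structural-weakness risk, seen through three different lenses.
views = [
    Risk("Deck redesign and rework", "engineering", 0.10, 2_500_000),
    Risk("Warranty and liability exposure", "legal", 0.10, 1_200_000),
    Risk("Cost overrun against contingency", "finance", 0.10, 1_800_000),
]

# Once calibrated, the views sit on one scale and can be compared.
# (Summing them here is purely illustrative; the three lenses overlap.)
for v in views:
    print(f"{v.discipline:12s} {v.expected_cost:>12,.0f}")
print(f"{'combined':12s} {sum(v.expected_cost for v in views):>12,.0f}")
```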

Calibration

Why do we need to calibrate?  Because trying to analyse vast amounts of the same type of information only increases information overload.  It is a key tenet of Knowledge Management that increasing variation decreases overload.


We know this to be intuitively correct.  Staring at reams and reams of data on a spreadsheet will not lead to an epiphany.  The clouds will not part, the trumpets will not blare, and no shepherd in the sky will point the right way.  Overload and confusion occur when we have too much of the same kind of information; making sense of something requires more variety.  In fact, overload only increases puzzlement because of the amount of uncertainty and imprecision in the data.  This leads to greater deliberation, which in turn leads to increased emotional arousal.  The ensuing ‘management hysteria’ is all too easily recognisable.  It drives cost growth as senior managers spend time and energy trying to make sense of the problem, and it creates further strategic risk and lost opportunity as those same people neglect their own jobs while they do so.

De-Mystifying

In order to make sense, therefore, we need to aggregate and analyse disparate, calibrated models.  In other words, we need to look at the information from a variety of perspectives, through a variety of lenses.  The notion IT companies would have us believe, that we can simply pour a load of raw data into a big tech hopper and have it spit out answers like some modern Delphic oracle, is absurd.


Information still needs a lot of structural similarity if it’s to be calibrated and analysed by both technology and our own brains.

The diagram below outlines how this is done, but it is only part of the equation.  Once the data is analysed and valid inferences are made, we are still only partially on our way to better understanding.  We still need those inferences to be contextualised and explained back to us for the answers to crystallise.  For example, in our bridge model we may make valid inferences about engineering problems based on a detailed analysis of the schedule and the Earned Value, but we still don’t know if that is correct.
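
As a rough illustration of the kind of inference meant here, the sketch below applies the standard Earned Value formulae (CPI, SPI and a simple estimate at completion) to invented figures for the bridge.  Poor indices hint at engineering trouble, but the numbers alone cannot confirm it.

```python
# A minimal illustration (with invented figures) of reading trouble out of the
# schedule and Earned Value data -- an inference that still has to be confirmed
# by the engineers' own account.
budget_at_completion = 10_000_000   # BAC: approved budget for the bridge works
planned_value = 4_000_000           # PV: value of work scheduled to date
earned_value = 3_200_000            # EV: value of work actually completed
actual_cost = 4_500_000             # AC: cost actually incurred to date

cpi = earned_value / actual_cost    # cost performance index
spi = earned_value / planned_value  # schedule performance index
estimate_at_completion = budget_at_completion / cpi  # a common EAC formula

print(f"CPI = {cpi:.2f}, SPI = {spi:.2f}, EAC = {estimate_at_completion:,.0f}")
# CPI and SPI both well below 1.0 suggest rework or technical difficulty,
# but the data alone cannot say which -- hence the need for the story.
```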

Storytelling

As an accountant or lawyer, therefore, to make sense of the technical risks we need the engineers to play back our inferences in our own language.  The easiest way to do this is through storytelling.  Storytelling is a new take on an old phenomenon.  It is the rediscovery of possibly the oldest practice of knowledge management, a practice which has come to the fore out of necessity and because of the abysmal failure of IT in this field.

[Diagram: Scenario-Based Model Development]

Using the diagram above in our fictitious example, we can see how the Legal and Finance teams, armed with new analysis-based information, seek to understand how the programme might be recovered.  They have nowhere near enough contextual information or technical understanding of either the make-up or the execution of such a complex programme, but they do know it isn’t going according to plan.

So, armed with the new analysis, they engage the Project Managers in a series of detailed conversations in which the technical experts tell their ‘stories’ of how they intend to right the ailing project.

Notice the key differentiator between a bedtime story and a business story – DETAIL!  Asking a broad generalised question typically elicits a stormy response.  Being non-specific is either adversarial or leaves too much room to evade the question altogether.  Engaging in specific narratives around particular scenarios (backed up by their S-curves) forces the managers to contextualise the right information in the right way.

From an organisational perspective, specific scenario-based storytelling forces managers into a positive, inquisitive and non-adversarial narrative about how they are going to make things work, without having to painfully translate technical data.  Done right, scenario-based modelling is an ideal way to squeeze the most out of human capital without massive IT spend.


Governance is More Than Openness

In a recent blog, Richard Sage (@BakedIdea) suggests that governance is just a matter of openness and sharing.  If only life were so simple.  If that were the case, then what of Enron?  What of the whole global financial crisis?  These people were open.  These people shared.  They had GAAP reporting duties.  So what went wrong?

The simple fact of the matter is that governance is (i) more than just sharing, but (ii) less than the full apparatus of conformance which Richard sets out.

More Than Sharing

Governance is more than sharing.  It is about design and flow.  Financial institutions shared information internally and reported it externally, but this made not one jot of difference to the near collapse of the global economy.  Collateralised Debt Obligations (CDOs) were so complex that it would take a long time to unpick each one.  It is essential to understand that if an organisation actively conspires to confound regulatory procedures, no governance structure will catch it.

“Governance without design is somewhat akin to looking at a ball of multi-coloured string and trying to guess what the pullover will look like.”

Organisations (and here I extend the net to government and the not-for-profit sector) need to design for misuse, and to understand that cross-functional information flows require some degree of architecture.  Without the necessary degree of design in governable artefacts (e.g. cost models, delivery schedules and contracts) it is impossible to unpick them.  In fact, it is somewhat akin to looking at a ball of multi-coloured string and guessing what the pullover is going to look like.

Governance Is Less Than You Think

I believe that governance is only the set of structures necessary to give institutional shareholders confidence that their interests are being well looked after.  The functions are the business processes and technical systems which enforce and deliver them.  This is why corporate governance speaks only of Directors’ Duties and not of business process: the how will be forever changing in our modern and dynamic world.

In the end, governance is counterintuitive to business.  Good governance is seen to reduce profits, to close off avenues of growth and to burden management with bureaucracy and nugatory process.  Yet good governance should clear the way.  It should lower the bar and reduce the hurdles.  In concert with a stringent and effective assurance process governance becomes light yet effective.  It delivers confidence without suffocating the organisation.

The Failure of Risk: lessons from the GFC

We live in uncertain times.  The failures in risk management which led to the global financial crisis have created an unprecedented set of circumstances.  Not only are regulators imposing heavier compliance burdens, but shareholders and investors are demanding greater reporting and higher levels of information transparency.  On top of all this, operational costs are too tight to carry the overhead of separate risk and assurance functions.

When the analysis is done there are 6 key lessons to learn from the global financial crisis:

  1. Integrate G, R & C.  In medium and large corporations, isolated risk management practices actively work against the business.  Technical and operational experts identify risks from experience and create risk slush funds to mitigate them.  These increase the cost of business and in many cases price the company out of the market.  In an integrated GRC system the firm is able to manage risks across business units, so that risk funds are held centrally and do not add a premium to initial project costs.  Risk identification and analysis percolates from the bottom up but governance is driven from the top down.  In an integrated system they both work within the business lifecycle to add the right mix of checks and balances, so that no additional drag is added to investment/project approvals.
  2. Make Passive GRC Active.  Systems need to be active.  They need to hunt out risk, define it, quantify it and measure its dependencies.  Then those same systems need to bring it to the attention of executives so that they may make informed investment decisions.  In the end, humans follow the law of least effort: employees will follow the path of least resistance in designing and gaining approval for their projects.  GRC must not follow a system of honour and audit but rather one of active assurance.  When GRC systems are passive, the business lifecycle becomes clogged with nugatory and useless program reviews that turn into technical sales pitches by design teams.  Such events and practices only serve to affirm the belief that GRC is a legal burden which exists only to satisfy the needs of regulatory compliance.  Raytheon, for instance, has an excellent system of governance-by-exception: its Integrated Product Design System (IPDS) has active governance measures and allows Raytheon to manage a pipeline of thousands of critical projects dynamically and by exception (a minimal sketch of the exception-based idea follows this list).
  3. Get Granular.  When projects fail it is not usually because the risks have not been adequately managed; the primary problems in risk practices are failures of risk identification and analysis.  Managers are simply unable to deal with risks at a granular level and then weigh them up on a per-project basis.  This is largely because the technical skills needed to do so are not within the standard skill sets of most executives (although they are within the more mathematical ones of the FS&I industry).  Where this disparity exists, businesses need to develop separate Red Teams or Assurance Teams, drawn either from the existing PMO or from hand-picked executives.
  4. Bottom Up & Top Down.  Risk management is bottom-up but governance is top-down.  The technical skills and software reliance involved in effective risk management mean that the entire practice usually percolates from the bottom of a business, upwards.  Consequently, unless it fits within a comprehensive governance framework it will be open to being gamed by senior executives.  This is why major projects which are seen as must-win are often approved with little or no governance or assurance.
  5. Risk Ownership.  Risks need to be owned at the lowest responsible level.  This is to say that when things go wrong the person at the lowest level who has the greatest amount of operational responsibility must be able to take charge to mitigate all aspects of the risk.  It is vital that the person owning the risk be able to recognise the variables which may see the risk realised.  It is also critical that the risk owner understand the corporate decision points, i.e. the points at which the contingency plans should be triggered.
  6. Invest in the Right Type of Risk Culture.  Risk should not be a dirty word.  Risks are inherent in every project and balancing them quantitatively and qualitatively is an essential skill for all senior executives.  Risk should be as much about seizing opportunity as it is about guarding profitability.  Businesses need to invest in top talent in order to drive good risk practices from the top.  Effective, Active-GRC involves a complex array of tools, practices, structures and processes which need an experienced senior executive to drive them constantly and consistently in the business.  The softer side of risk management cannot be neglected.  The nature of risk forces people onto the defensive as they attempt to justify all aspects of their project designs.  CROs need to help executives understand that all projects must balance risk if they are to push profitability.  Otherwise, risk cultures will mire companies in conservative, risk-averse cultures which only act to add friction and reduce profitability.
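
By way of illustration only, the sketch below shows what exception-based, ‘active’ governance might look like in miniature: the system scans a portfolio and surfaces only the projects that breach a threshold for executive attention.  The thresholds, fields and figures are assumptions made for the example; this is not a description of Raytheon’s IPDS or any other real system.

```python
# A sketch of 'active' governance-by-exception: rather than reviewing every
# project, scan the project data and surface only the exceptions.
# Thresholds and fields are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class ProjectStatus:
    name: str
    cpi: float                    # cost performance index
    spi: float                    # schedule performance index
    risk_exposure: float          # probability-weighted risk cost
    contingency_remaining: float  # funds still held against those risks

def exceptions(projects, cpi_floor=0.9, spi_floor=0.9):
    """Return only the projects that breach a governance threshold."""
    flagged = []
    for p in projects:
        reasons = []
        if p.cpi < cpi_floor:
            reasons.append(f"CPI {p.cpi:.2f} below {cpi_floor}")
        if p.spi < spi_floor:
            reasons.append(f"SPI {p.spi:.2f} below {spi_floor}")
        if p.risk_exposure > p.contingency_remaining:
            reasons.append("risk exposure exceeds remaining contingency")
        if reasons:
            flagged.append((p.name, reasons))
    return flagged

portfolio = [
    ProjectStatus("Bridge", 0.71, 0.80, 900_000, 600_000),
    ProjectStatus("Depot upgrade", 1.02, 0.97, 150_000, 400_000),
]
for name, reasons in exceptions(portfolio):
    print(name, "->", "; ".join(reasons))
```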

Risk practices need to work together inside a single, comprehensive risk framework that goes beyond simple probabilistic modelling and disjointed regulatory compliance.  Businesses need to implement processes which not only integrate the business lifecycle but actively increase both liquidity and opportunity, so that risk can be seen to add real value to the company.  Only once this is achieved can risk management cease to be an operational drag on the business and become a value-adding proposition which works actively to increase the profit and performance of a company.


Top 5 Benefits of Effective Risk Management

BENEFITS OF AN INTEGRATED “ACTIVE GRC” FRAMEWORK

After the failure of risk management during the recent (and ongoing) financial crisis one could be forgiven for thinking that risk management – as we know it – is dead.  However, effective risk management is the only means which businesses have to:  (i) assess and compare investment decisions, (ii) seize subtle opportunities, and (iii) ensure regulatory compliance.  Risk management has greater utility beyond these obvious benefits.  Listed below are 5 of the top financial benefits of effective risk management:

1.  IMPROVED LIQUIDITY

When managers cannot identify or mitigate complex risks they create risk contingency slush funds and pad their accounts with excessive risk premiums. This is not an efficient allocation of capital and it can even price a business out of the market. Precise identification of risk premiums removes these slush funds and creates greater firm liquidity and the ability to allocate capital where it is needed.
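
A back-of-the-envelope sketch (invented figures, with risks assumed independent and roughly normal) shows why a centrally pooled contingency frees capital compared with each project holding its own padded reserve.

```python
# Illustrative only: sum of per-project contingencies vs a centrally pooled
# contingency at roughly the same confidence level. Figures are invented;
# independence and a normal approximation are assumed.
import math

# (expected risk cost, standard deviation) for five projects, in $'000
projects = [(500, 300), (800, 450), (300, 200), (650, 400), (400, 250)]
z80 = 0.84  # z-score for roughly an 80th-percentile confidence level

padded_individually = sum(mean + z80 * sd for mean, sd in projects)
pooled = (sum(mean for mean, _ in projects)
          + z80 * math.sqrt(sum(sd ** 2 for _, sd in projects)))

print(f"Sum of per-project contingencies: {padded_individually:,.0f}k")
print(f"Centrally pooled contingency:     {pooled:,.0f}k")
print(f"Capital freed for other uses:     {padded_individually - pooled:,.0f}k")
```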

2.  BETTER PROJECT PERFORMANCE

The best methods for identifying and analysing risk in projects are the quantitative analysis of cost models and project schedules. However, these methods are only useful where such models are sufficiently detailed. Good risk management therefore leads to greater collaboration by cross-functional teams to optimise cost and schedule performance.
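
As a hedged illustration of what such quantitative analysis involves, the sketch below runs a small Monte Carlo simulation over a hypothetical three-task chain using three-point estimates.  A real analysis would work over the full schedule network and the cost model, not a toy chain.

```python
# A minimal Monte Carlo sketch of quantitative schedule risk analysis using
# three-point estimates and triangular distributions. Tasks and durations
# are invented for illustration.
import random

tasks = [  # (name, optimistic, most likely, pessimistic) durations in weeks
    ("Design", 8, 10, 16),
    ("Fabrication", 12, 14, 24),
    ("Installation", 6, 8, 14),
]

def simulate_total(n=10_000):
    totals = []
    for _ in range(n):
        totals.append(sum(random.triangular(lo, hi, ml) for _, lo, ml, hi in tasks))
    return sorted(totals)

totals = simulate_total()
p50 = totals[len(totals) // 2]
p80 = totals[int(len(totals) * 0.8)]
deterministic = sum(ml for _, _, ml, _ in tasks)
print(f"Deterministic plan: {deterministic} weeks; P50 ~ {p50:.1f}; P80 ~ {p80:.1f}")
```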

3.  BETTER OPPORTUNITY MANAGEMENT

With greater liquidity comes the ability to seize emerging opportunities. Not only can the company use this capital across portfolios to manage risks but it can also seize opportunities for M&A, talent acquisition, share buybacks, increased dividends, employee bonuses or increased project funding/investment.

4.  CONSENSUAL MANAGEMENT CULTURE

As managers work across the business to calibrate cost models with project schedules, and contracts and commercial terms with the technical architecture, the business is forced to adopt a more consensual, multi-disciplinary approach. Where GRC is implemented as part of a high-performance business initiative, the resulting culture is more likely to stick than one imposed from the top down.

5.  IMPROVED REPORTING & DECISION MAKING

An active GRC process which is fully integrated with the business relies on the quantitative analysis of core artefacts (cost models, project schedules, technical architectures and contracts). A quantitative culture, coupled with regular, detailed analytical outputs, also greatly improves the standard of financial and operational reporting and therefore the potential for improved investment decision making.

Building a Risk Culture is a Waste of Time

The focus of a good risk management practice is the building of a high-performance operational culture which is baked into the business.  Efforts to develop risk cultures only serve to increase risk aversion in senior executives and calcify adversarial governance measures which decrease overall profitability.  The right approach to risk management is a comprehensive, holistic risk management framework which integrates tightly with the business.

The financial crisis is largely due to the failure of risk management and over-exposure in leading risk-based institutions.  More specifically, the failure of risk management is linked to:

  • The failure to link risk to investment/project approval decision making.  The aim of risk management is not to create really big risk registers, although in many organisations one could be forgiven for thinking that this is the goal.  The aim of identifying risks is to calibrate them with the financial models and program plans of the projects so that risks can be comprehensively assessed within the value of the investment.  Only once their financial value is quantified and their inputs and dependencies are mapped can realistic and practical contingency planning be implemented for accurate risk management.
  • The failure to identify risks accurately and comprehensively.  Most risk toolsets and risk registers reveal a higgledy-piggledy mess of risks, mixed up in a range from the strategic down to the technical.  Risks are identified differently at each level (strategic, financial, operational, technical).  Technical and operational risks are best identified by overlapping processes of technical experts and parametric systems/discrete event simulation.  Financial risks are best identified by sensitivity analysis and stochastic simulation, while strategic risks will largely focus on brand and competitor risks.  Risk identification is the most critical yet most overlooked aspect of risk management (a small sensitivity-analysis sketch follows this list).
  • The failure to use current risk toolsets in a meaningful way.  The software market is flooded with excellent risk modelling and management tools.  Risk management programs, however, are usually implemented by vendors with a “build it and they will come” mentality.  Risk management benefits investment appraisal at Board and C-Suite level and it cannot be expected to percolate from the bottom up.
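
The sensitivity-analysis sketch promised above: a one-at-a-time swing of each cost driver on a toy cost model, producing the ranking behind a classic tornado chart.  The drivers, ranges and cost function are invented for the example.

```python
# One-at-a-time sensitivity analysis: swing each driver between its low and
# high estimate, hold the others at base, and rank drivers by the swing in
# total cost. The cost model and figures are illustrative only.
def total_cost(steel_price, labour_rate, schedule_months, overhead_per_month=120_000):
    return steel_price * 4_000 + labour_rate * 180_000 + schedule_months * overhead_per_month

base = {"steel_price": 900, "labour_rate": 55, "schedule_months": 30}
ranges = {"steel_price": (750, 1_200), "labour_rate": (48, 70), "schedule_months": (27, 42)}

swings = []
for driver, (low, high) in ranges.items():
    lo_case = total_cost(**{**base, driver: low})
    hi_case = total_cost(**{**base, driver: high})
    swings.append((abs(hi_case - lo_case), driver))

# Largest swing first: the drivers that deserve the most risk attention.
for swing, driver in sorted(swings, reverse=True):
    print(f"{driver:16s} swing = {swing:,.0f}")
```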

RISK MANAGEMENT IS COUNTER-INTUITIVE

All this does not mean that risk management is a waste of time, but rather that it is counter-intuitive to the business.  It is almost impossible to ask most executives to push profits to the limit if their focus is on conservatism.  Building a culture of risk management is fraught with danger.  The result is usually a culture of risk aversion and conservatism, and a heavy, burdensome governance framework that only adds friction to the business lifecycle and the investment/project approval process.  Executives, unable to navigate the labyrinthine technicalities of such systems, achieve approvals for their pet programs by political means.  Worse, projects that are obviously important to the business actually receive less risk attention than small projects.  Employees learn to dismiss risk management and lose trust in senior management.

If risk management is to be an effective and value-adding component it must be baked into the business as part of the project/investment design phase.  If not, risk management processes just build another silo within the business.  The key is to forget about “Risk” as the aim.  The goal must be a performance culture with an active and dynamic governance system which acts as a failsafe.  The threat of censure is the best risk incentive.

AWARENESS IS NOT MANAGEMENT

Management has long been aware of risk, but this does not always translate into a true understanding of the risk implications of business decisions.  Risk policies and practices are often viewed as being parallel to the business rather than complementary to it.

Why is it that most businesses rate themselves highly on risk management behaviours?  Largely because businesses do not correlate the failure of projects with the failure of risk and assurance processes.

In a 2009 McKinsey & Co survey (published in June 2012 as “Driving Value from Post-Crisis Operational Risk Management”) it was clear that risk management was seen as adding little value to the business.  Responses were collected from the financial services industry – an industry seen as the high-water mark for quantitative risk management.

COLLABORATION IS THE KEY

Risk management needs to become a collaborative process which is tightly integrated with the business.  The key is to incentivise operational managers to take calculated risks.  As a rule of thumb, there are 4 key measures to integrate risk management into the business:

  1. Red Teams.  Despite all the talk of collaboration, the unique specialities of risk management often require senior executives to polarise the business.  It is often easier to incentivise operational managers to maximise risks and to check them by using Red Teams to minimise risks.  Where Red Teams are not cost effective, a dynamic assurance team (potentially drawn from the PMO) will suffice.  Effective risk management requires different skills and backgrounds.  Using quantitative and qualitative risk management practices together requires a multi-disciplinary team of experts to draw out all the risks and calibrate them within the financial models and program schedules so that investment committees can make sensible appraisals.
  2. Contingency Planning.  Operational risk management should usually boil down to good contingency planning.  Because of the unique skill sets involved, operational teams should largely focus on contingency planning and leave the financial calibration to the assurance/Red teams to sweep up.
  3. Build Transparency through Common Artefacts.  The most fundamental element of a comprehensive risk process is a lingua franca of risk – and that language is finance.  All risk management tools need to percolate up into a financial model of the project, so that the decision-making process is based on a comprehensive assessment and, when it comes time to optimise the program, the various risky components can be traced and unpicked (a sketch of such a record appears after this list).
  4. Deeper Assurance by the PMO.  The PMO needs to get involved in the ongoing identification of risk.  Executives try to game the governance system, and the assurance team simply does not have the capacity for 100% audit and assurance.  The PMO is by far the best structure to assist in quantitative and qualitative risk identification because it already has oversight of 100% of projects and their financial controls.
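
The sketch promised above: a minimal, assumed risk-record schema in which every risk, wherever it is identified, carries enough financial quantification to roll up into the project’s financial model and be traced back later.  It illustrates the idea of a common artefact; it is not a standard.

```python
# A sketch of the 'lingua franca' idea: each risk carries a financial
# quantification and a cost-model reference so registers from different teams
# roll up into one financial view. The schema is an illustrative assumption.
from dataclasses import dataclass

@dataclass
class RiskRecord:
    risk_id: str
    source: str            # "technical", "operational", "financial", "strategic"
    owner: str             # lowest responsible level that owns the risk
    probability: float
    cost_impact: float     # financial impact if realised
    wbs_element: str       # where it lands in the cost model / schedule

    @property
    def expected_cost(self) -> float:
        return self.probability * self.cost_impact

def roll_up(register):
    """Aggregate expected risk cost by cost-model element for the financial model."""
    totals = {}
    for r in register:
        totals[r.wbs_element] = totals.get(r.wbs_element, 0.0) + r.expected_cost
    return totals

register = [
    RiskRecord("R-001", "technical", "Lead Engineer", 0.2, 1_500_000, "1.3 Superstructure"),
    RiskRecord("R-002", "operational", "Site Manager", 0.4, 300_000, "1.5 Site Works"),
    RiskRecord("R-003", "technical", "Lead Engineer", 0.1, 800_000, "1.3 Superstructure"),
]
print(roll_up(register))
```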

Traditional risk management practices only provide broad oversight. With the added cost pressures that businesses now feel, it is impossible to create large risk teams funded by a fat overhead. The future of risk management is not for companies to waste money on costly and ineffective risk-culture programs.  Good risk management can only be developed by tightly integrating it with a GRC framework that actively and dynamically supports better operational performance.

The Complexity of Cost (Pt.2): a 3-tiered strategy for an effective ICT cost reduction program


In our last blog we recounted that most ICT cost reduction programs fail.  More to the point, we noted how they fail in larger businesses through a vicious cycle of increased overhead driven by poor process analysis.  All this stems from a limited view of direct and indirect ICT spend.

In sum, the answer is detailed cost modelling of ICT which analyses the firm’s technology in its place as a business capability enabler. This is vital in the current economic climate; otherwise businesses will simply benchmark their costs against similar firms rather than pare ICT costs to the bone.

The results of traditional IT programs?

  1. ICT cost reduction programs usually only attack the easy and obvious.  For sustained cost management in ICT, the cost reduction program needs to attack: (i) soft costs (indirect spend), (ii) managerial costs and (iii) program costs, as well as all the standard hard costs.
  2. Cost cutting reduces capability.  The traditional approach is to cut applications and services as well as heads, but capability will eventually suffer.  Senior people are often made redundant as work is pushed from higher to lower pay bands.  With them goes much of the firm’s knowledge capital and goodwill.  If we want to quantify this cost of lost knowledge, it is the difference between the market value and the book value of a business.

The problem is that IT is usually seen as a black box.  Few senior executives understand the subtle dependencies which stretch from technology throughout the business.  More importantly, few understand that the actual capex and opex of ICT represent just the hard costs of ICT.  In addition to the hard costs are the soft costs, the management costs and the program costs of ICT.  In more detail:

  • Soft Costs relate to all the indirect spend which flows from ICT procurement.  This may include travel for non-IT personnel involved in change, training and customisation, process change, and so on.
  • Managerial Costs are the accumulated cost of management decision making.  This is pure overhead and is not accounted for in the Cost of Goods Sold, but rather shows up in bloated Selling, General & Administrative (SG&A) accounts.
  • Program Costs are the costs of running ICT programs beyond those accounted for in the various cost allocation systems.  These can include the cost of running distributed teams, the cost of low development capability, and so on.  Such cost coefficients are statistically generated.

On top of all these sit the hard costs of ICT; the sketch below pulls the four categories together.
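
All figures and the ‘program cost coefficient’ in this toy roll-up are invented; it simply illustrates how the true cost of an ICT capability can run well past the visible hard costs.

```python
# A toy roll-up of the four cost categories discussed above. The program cost
# coefficient stands in for a statistically derived uplift; every number here
# is an illustrative assumption.
hard_costs = 2_000_000           # capex + opex visible in the ICT budget
soft_costs = 350_000             # indirect spend: travel, training, process change
managerial_overhead = 180_000    # accumulated decision-making cost, sitting in SG&A
program_cost_coefficient = 0.12  # e.g. uplift for distributed teams, low capability

program_costs = program_cost_coefficient * hard_costs
true_cost = hard_costs + soft_costs + managerial_overhead + program_costs

print(f"Visible ICT spend: {hard_costs:,.0f}")
print(f"True cost of the capability: {true_cost:,.0f} "
      f"({true_cost / hard_costs:.0%} of the visible figure)")
```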

Borrowing diagrams from Accenture, the solution is to run a 3-tiered cost reduction strategy:

[Diagram: Strategic cost management (Accenture)]

After the easy stuff is done, the business must ultimately streamline its processes (and align cost structures accordingly) and then lower its non-discretionary spend.  The key is to (i) see the whole process, (ii) understand the dependencies, and (iii) engage locally.

  • Minimise (Hard Costs) –  Tactical Cost Reduction. Grab the low-hanging fruit and take out the obvious costs; the costs in plain sight.  Engage locally with account managers and business unit leaders to reduce headcount, but understand and model the dependencies by seeing the whole capability.  The Boston Consulting Group advise that managers proceed on a ‘third of a third’ rule, i.e. 1/3 of all FTEs are non-customer-facing and 1/3 of those (roughly a ninth of total headcount) can be removed without adverse impact on the business.
  • Optimise (Soft & Program Costs) –  Proactive Cost Governance.  This involves detailed spend analysis and process optimisation.  Indirect process costs grow like barnacles on a ship: the longer they are there, the more they are accepted, but ultimately they increase the financial drag on the business.  Remove all the invented tasks by modelling the firm’s value chain and seeing where the processes fit into larger business capabilities.  Once this is done, executives can optimise the key cost drivers and their inputs.  This improves the delivery model for ICT and enables better demand management.  Alongside these operational actions the business should improve cost governance.  It can achieve this by removing the management structures around excessive process governance.  This requires a more active and dynamic GRC system, but ultimately the business feels a lighter GRC touch.  Most importantly, simplify processes and remove the ‘cost of complexity’, i.e. vertical integration and convoluted workflows which increase process time and transactional costs.

[Diagram: Cost reduction levels (Accenture)]

  • Re-design (Program & Managerial Costs) –  Strategic Cost Management.  In order to achieve significant and lasting cost reduction benefits the business must lower its discretionary spend.  However, managerial cost structures (which are significant) can only be made redundant when the overall complexity is reduced.  Once this happens, shared services may be implemented and rationalised.  The ICT offering can be standardised and the business can create re-usable technology components.  Then the business can change its transfer pricing models and look towards offering the customer-facing SBUs a more sophisticated multi-channel mix of capabilities, i.e. give them the agility to increase their high-end customer offerings.  Only once this is achieved can the business look towards modernising and streamlining technical architectures.

The key is to look at ICT as a capability enabler and not as a business unit in its own right.  ICT should have to justify its very existence.  However, once it does and develops full cost transparency then and only then can it move forward in real partnership with the business.


Setting Strategic Posture: Boards and technology


In their recent article on the low standard of the technological conversation in the boardroom (“On Technology, Boards Need to Get More Sophisticated”), Michael Bloch, Brad Brown and Johnson Sikes missed the goal but scored the point.

I posit that the paucity of good tech-talk in the boardroom does not mean that Directors need to lift their technological game.  Does technical governance and commercial oversight need to improve?  Absolutely!  But I believe it is the CIOs and the CMOs who really need to improve their conversation.

In sum: Directors, through the cachet of their reputation and the gravitas of their experience, provide the confidence to institutional investors that their money is being well looked after.  Company officers (senior execs) provide the Directors with the facts necessary to deliver that corporate oversight.

To say, therefore, that Directors need tech lessons points clearly to one of two underlying problems: those companies have either the wrong Directors or the wrong execs – or both.  Either way, Directors are the wrong people to be educating.  In fact, pointing the finger at the Directors is a commercial abdication by the company.  Better information and decision-making frameworks are the keys, and this does include the focus on governance to which the authors allude.

James Quinn wrote a great article in HBR in May 1985 on the increasing sensitivity of the Board to technology issues.  He noted that despite ICT assets often accounting for somewhere in the vicinity of 50% of a company’s capital spend, there was still a lack of understanding by the Board of what part ICT actually played in these projects.  That was 27 years ago.

Lack of Board understanding equals lack of Board oversight.

Richard Nolan and F. Warren McFarlan wrote a terrific article, published in HBR in October 2005 and entitled “Information Technology and the Board of Directors”, in which they argued for differing approaches to oversight depending on the market.  They outline that companies have either a Defensive or an Offensive need for IT.  For instance, vertically integrated companies with a simple supply chain, or businesses with operational computing (such as factories), have little need for strategic IT.  These firms operate IT in a Defensive mode, which allows a more passive approach to IT investment oversight.

On the other hand, companies which place a strategic imperative on IT (such as banks and insurance brokers), or businesses which are betting on IT to turn the business around (such as the publishing industry), would see an immediate loss of significant revenue if the IT failed.  In these cases they operate IT in an Offensive mode, which requires that IT investment programs receive greater scrutiny and oversight.

The question remains: do Boards need to get more sophisticated on tech?  No:

  1. Boards need to set the strategic posture of ICT investment and oversight;
  2. Chief Marketing Officers need to be clear about the technology and user profiles of consumers of the company’s products;
  3. CIOs need to be transparent about the data needed to support decision making;
  4. CTOs need to be clear about strategically important technology and opaque about all the other boxes and wires; and
  5. Whoever is in charge of GRC needs to be clear about the integration of business and technical governance.

In the end, technology is but one topic as the Board combs through the technical enablers of critical investments.  What does not need to happen is for technology to become the focus of the conversation.

Technology is fun and interesting, but it must not bedazzle the company.  Technology is usually a critical enabler and sometimes a strategic instrument.  The Board needs enough information to choose which, and it is up to the senior execs to provide the information necessary for effective oversight.