It’s alive!

This week, a strategic plan came alive. Strategic plans are traditionally documents that describe a single path to success, accompanied by some financial projections, a couple of alternative scenarios, and a set of risks, conditions, assumptions and caveats.

It’s not often that a plan can be said to ‘jump into life’. This one did, because it is not a document but an interactive, dynamic model of the business, one that those with a stake in the outcome could explore and play with. This was strategic planning through Action Learning.

As a result, the Executives, more confident now of the space within which they can manoeuvre the organisation, are selling the plan to others – potential funding partners in particular.

And their message is that we have an agile plan, one that can be flexed and changed in the face of events that come out of left field.

Unshackle the managers

Trustees and potential funders like this approach to planning. They want the management team to be unfettered. If bad stuff happens, they don’t want expressions of angst about an inability to stick to the original plan; they want fast, flexible and imaginative responses with choices and options attached. This means knowing in advance how much key variables are likely to move when bad stuff happens, and how far they can travel before the plan has to change.

And let them play

The business is one of providing care and support for elderly people. The front piece of the model is the ability to ‘goal seek’ across 15 variables – including prices, trends, staff numbers, interest rates and efficiency cushions. The goals are set by key financial ‘golden rules’, e.g. minimum levels of earnings, earnings growth and debt cover.

‘Goal seek’ means testing each variable individually to see how much it would have to change for the golden rules to be breached. Tests can also run the other way – for example, to see what would have to change for new earnings targets to be reached.
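As an illustration only, here is a minimal sketch of what a single goal-seek test might look like in code. The model, the figures and the debt-cover rule are all invented for this example; the real model is far richer:

```python
# Toy goal-seek: how far can one variable move before a 'golden rule'
# (here, a minimum debt-cover ratio) is breached? All figures invented.

def debt_cover(occupancy_rate):
    """Hypothetical model: debt cover as a simple function of occupancy."""
    revenue = 12_000_000 * occupancy_rate   # annual revenue
    operating_costs = 9_500_000             # held fixed in this sketch
    debt_service = 1_200_000                # annual interest and repayments
    return (revenue - operating_costs) / debt_service

def goal_seek(rule_minimum, lo=0.5, hi=1.0, tol=1e-6):
    """Bisect to find the occupancy rate at which the rule is breached."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if debt_cover(mid) >= rule_minimum:
            hi = mid    # rule still holds; the breach point lies lower
        else:
            lo = mid    # rule breached; the breach point lies higher
    return (lo + hi) / 2

breach_point = goal_seek(rule_minimum=1.1)
print(f"Debt cover falls below 1.1x if occupancy drops to {breach_point:.1%}")
```

Run against each of the 15 variables in turn, a loop like this tells the executives exactly how much headroom they have against each golden rule.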

There is nothing new in this type of test. But it is unusual to place this capability in the hands of senior executives, board members and funding partners to engage them in collaborative exploration and play.

But the shelves are bare

It would be good to think that an interactive and dynamic model of the business might fall off a nearby shelf. It will not, and any suggestion that it might would be misleading.

Building such a model requires design effort, data and time, all of which carry their own costs. It can seem an insurmountable task. But within a recent class on budgeting and forecasting, held for a mix of FDs and Financial Controllers, we ran an exercise that overcame those concerns.

We picked one of the businesses represented in the room and showed that:

  • The key processes that need to be modelled are largely well understood;
  • Not every dynamic in the business needs to be modelled;
  • Key data may be fragmented but it is often available and easy to retrieve;
  • Missing from the mix were the necessary skills and time, but these can be found;
  • It’s a fun thing to do.

So, bring your planning to life and find ways for your Executives to play. That you can do this is a strategic advantage in its own right. That it will also help you figure out how to make your strategy come true is an added prize.

———————
Paul Clarke
Director
Develin Consulting

Organisations that need help in answering questions involving big data can submit them to Kaggle.com. The site matches organisations that have a problem to solve with data scientists who are looking for a challenge. At the time of writing, the problems on offer ranged from reading the political landscape in Hillary Clinton’s emails to identifying individual right whales from aerial photographs of the mammals swimming through the ocean.

We use Kaggle as a training ground. Using data submitted by the San Francisco Police Department, we are attempting to predict crime patterns across San Francisco. Our entry will be tested by Kaggle against those from others. If we do well, we may win a prize.

The big data and predictive analytics solutions at Kaggle.com have a direct application within Social Housing. For example, our Kaggle algorithm will answer the question ‘given a particular time and place, how likely is it that I will be the victim of a crime?’ A second algorithm that we are building (but not for Kaggle) will answer the question ‘given the identity of a social housing resident, how likely is it that they will fall into significant arrears and be either evicted or abandon their property?’

Both involve the use of a classification algorithm called a decision tree (e.g. to classify someone as a potential crime victim or not). Accuracy of outcome depends upon two things: our ability to interpret and model the data available (e.g. times and locations of crimes that have happened); and our ability to improve the algorithm as fresh data becomes available.
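For readers who want to see the shape of the technique, here is a minimal sketch of a decision tree built with scikit-learn. The features and the handful of rows are invented for illustration; this is not the Kaggle entry itself:

```python
# Minimal decision-tree classification sketch (scikit-learn).
# The features and rows below are invented purely for illustration.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier

# Hypothetical history: hour of day, day of week, district code, and
# whether an incident of interest occurred (1) or not (0).
history = pd.DataFrame({
    "hour":     [23, 2, 14, 3, 22, 10, 15, 1],
    "weekday":  [5, 6, 2, 6, 5, 1, 3, 6],
    "district": [1, 1, 3, 2, 1, 3, 2, 1],
    "incident": [1, 1, 0, 1, 1, 0, 0, 1],
})

model = DecisionTreeClassifier(max_depth=3, random_state=0)
model.fit(history[["hour", "weekday", "district"]], history["incident"])

# Classify a new time and place; ask for a probability, not just a label.
new_case = pd.DataFrame({"hour": [1], "weekday": [6], "district": [1]})
print(model.predict(new_case))        # predicted class, e.g. [1]
print(model.predict_proba(new_case))  # class probabilities
```

The tree itself is just a series of yes/no questions learned from the history; the skill lies in choosing the features it gets to ask questions about.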

Approaches such as these are highly relevant to Housing Associations and Local Councils. They have to pick up the pieces when tenants, perhaps faced with mounting arrears, present a range of problematic behaviours. Tenants may leave without warning or wait to be evicted. Or they may cause disruption through anti-social behaviour. Even a small lift in the ability to predict these events could have a big impact. Fewer people would lose their homes and high levels of arrears might be avoided.

But this is a far harder challenge than our crime classification task. For that, we are limited to a pre-determined and finite set of data. For our social housing resident, the data on offer is far from finite: we have payment records, the type and frequency of contact with the landlord, socio-economic profiles, the history of tenancy movements, the local environment, the history of the property, crime stats and more. Buried somewhere within these records are the indicators that we need.

Indeed this example illustrates both the vital role that big data can play and the curse of having almost unlimited data available from which to draw the necessary insights.

It is vital, therefore, to speak first of all to the people who know most about the relationship with the customer and about what drives certain outcomes. For example, some significant arrears arise because, from the word go, the tenant simply has no wish to pay their rent. For this group, predictions are of little value; we just need a process to detect that this is happening and to close things down as quickly as possible.

But for other groups, predictions may be vital. If specific patterns of behaviour can be detected, in particular changes in behaviour which we know to be potential indicators of future arrears and abandonments, interventions become possible.

The key steps to building a predictive algorithm are as follows:

  • Listen to those with first-hand experience of the issue that will become the basis for the prediction required. Their stories will hold the key to how the data should be ‘visualised’ and insights drawn;
  • Gather the data needed and be prepared to ‘wrangle it into shape’. For example, systems that record events, e.g. rent payments, do not typically record when patterns of behaviour change;
  • ‘Visualise’ the data (a picture is worth a thousand words) to expose the potential indicators that will play a vital role in the building and testing of the prediction algorithm to come. This is the hardest part of all, particularly when the indicators needed are counter-intuitive;
  • Develop a body of history that contains the most important indicators and, crucially, evidence of the outcomes that you want to predict;
  • Divide that history into two (but not equally sized) sets of data: the first is used to develop the prediction algorithm, the second to test it (see the sketch after this list);
  • Develop the prediction algorithm;
  • Test and refine it.
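As a sketch of steps five to seven only – the file name, the column names and the choice of a 70/30 split are all assumptions made for illustration:

```python
# Sketch of steps 5-7: split the history, develop the algorithm, test it.
# 'tenancy_history.csv' and its column names are hypothetical.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import classification_report

history = pd.read_csv("tenancy_history.csv")
features = ["missed_payments_3m", "contact_frequency", "tenancy_years"]
X, y = history[features], history["significant_arrears"]

# Step 5: two sets of unequal size, e.g. 70% to develop, 30% to test.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# Step 6: develop the prediction algorithm.
model = DecisionTreeClassifier(max_depth=4, random_state=0)
model.fit(X_train, y_train)

# Step 7: test it on data the algorithm has never seen, then refine.
print(classification_report(y_test, model.predict(X_test)))
```

Holding back the test set matters: an algorithm judged only on the data it was built from will always look better than it really is.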

It will not be surprising to hear that the steps above are far from straightforward and that success is far from guaranteed. Trial and error is the name of the game.

And the more complex the behaviours that you are trying to predict, the less likely it is that ‘clear’ predictions will be possible. This may lead to an understandable worry about ‘false positives’ e.g. residents who are believed, incorrectly, to be amongst those at risk of letting their arrears get out of hand.

It helps to include costs. If the full cost of arrears, evictions and abandonments is known (lost rent, the arrears themselves, the time and effort to chase the departed tenant, the effort to find a new tenant), then tolerance of false positives is likely to be quite high.
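A back-of-an-envelope sketch of that trade-off, with purely illustrative costs:

```python
# Illustrative cost comparison. A false positive triggers an unnecessary
# intervention; a missed case can end in eviction or abandonment.
COST_FALSE_POSITIVE = 50    # e.g. one support visit that wasn't needed
COST_MISSED_CASE = 5_000    # e.g. lost rent, arrears, re-letting effort

# How many needless interventions can we afford per prevented eviction
# before the interventions cost more than they save?
break_even = COST_MISSED_CASE / COST_FALSE_POSITIVE
print(f"Up to {break_even:.0f} false positives per prevented case")
```

On numbers like these, a model that flags far more tenancies than will actually fail can still pay for itself many times over.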

This blog is an introduction to a series of blogs coming soon which will explain in detail how each of the above steps works and how it has been applied to the business of predicting significant arrears. All the ingredients of a big data task will be on show – stories, numbers, technology, statistics, programming.

But all the way through, the author will remind the reader that this exercise, along with countless others involving big data, is about people. If just one person can avoid losing their home as a result of this work, then it will have been worth doing.

The first blog in the series will be called ‘The art of turning a story into a need for numbers’. We are starting to listen to the stories, and to collect the data that will help shape them. All being well, it will be along soon.

In the meantime, we have our San Francisco crime prediction algorithm to complete. If anyone planning to visit the Bay Area would like to know about the safest places to visit, please try kaggle.com. Those who have submitted solutions to the prediction challenge may just be able to tell you.

———————

If you would like to understand better how the simple use of big data techniques can help your organisation please don’t hesitate to get in touch.

Paul Clarke
Director
Develin Consulting