This is an abbreviated transcript of our presentation at Tech@Housing, Housing 2018.

Housing providers have a major role to play in looking after vulnerable people. Most of the time their staff know who those people are and how best to support them. But now and again we need help from data, in particular to find people who might otherwise spend their lives under the radar. At the Housing 2018 conference we took the opportunity to draw on our experience and say a little about what’s involved.

—-

Predicting vulnerability means either knowing in advance that someone is going to be vulnerable in some way, or finding information that suggests that this is already the case.

Yes, you can use data for the task, but accuracy is essential. Also, it’s easy to get immersed in the world of data. You must never lose sight of the real objective – to find and help people.

It’s all about accuracy

This probably sounds obvious.

If you are an online bookseller such as Amazon, your predictions are all about someone’s preferred choice of book. So you will use data relating to everyone else’s browsing choices to come up with some recommendations.

If you get a recommendation right and the person browsing buys the book, the machine has delivered a ‘true positive’. If no purchase happens, that is a ‘false positive’. If they buy a book that hadn’t been recommended, that is a ‘false negative’.
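These outcomes can be counted directly. The sketch below uses invented book titles purely to show how the three terms fit together, along with the two accuracy measures (precision and recall) commonly built from them:

```python
# Illustrative sketch: classifying recommendation outcomes.
# The titles below are invented purely for illustration.

recommended = {"book_a", "book_b", "book_c"}   # titles the machine suggested
purchased   = {"book_b", "book_d"}             # titles actually bought

true_positives  = recommended & purchased      # recommended and bought
false_positives = recommended - purchased      # recommended but not bought
false_negatives = purchased - recommended      # bought but never recommended

precision = len(true_positives) / len(recommended)  # how often a recommendation lands
recall    = len(true_positives) / len(purchased)    # how many purchases were anticipated

print(f"TP={len(true_positives)} FP={len(false_positives)} FN={len(false_negatives)}")
print(f"precision={precision:.2f} recall={recall:.2f}")
```

For the bookseller, a low precision is cheap; as the next paragraphs explain, for a housing provider both kinds of error carry real cost.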

Whatever the outcome, the machine learns from its mistakes. Maybe a missed sale today, but increased sales tomorrow.

Nor does it matter how many false positives and negatives there are: the cost to the seller of getting it wrong isn’t great – a delayed sale at worst.

But, with vulnerable people, it is vital to understand these positives and negatives. A false positive means that someone thought to be vulnerable turned out not to be so. That’s good, but to know this outcome and learn from it, someone had to go and track the person down to find out.

A false negative means that someone who really was vulnerable was missed – a heavy price to pay for the person concerned.

Accuracy is therefore vital. There are two ways to improve it.

Increase the variety and amount of data used, and improve the quality of the signals indicating vulnerability.

Increase the variety of data used

At Clarion Housing Group, Develin was tasked with predicting which residents would, at some point, abandon their homes. For whatever reason, they can no longer pay the rent and decide it is better to disappear than face the consequences.

There aren’t many such cases, but the costs for everyone are high. For the customer this might mean a life on the streets. For the housing provider, it means lost rent and the cost of clearing up afterwards.

To make a difference someone needed to intervene in time to turn things around. This meant predicting in advance that abandonment was a possibility.

We brought all the data we could into one place and applied Amazon-style prediction algorithms to it.

It worked – in as far as we had a pool of customers predicted to be heading for abandonment. But the pool was too large to be of use.

If we were to let a machine do the work, we needed data from other agencies and organisations, as well as from social media.

Improve signal quality

So we applied the second approach. We asked professionals in the field what signals they look for when assessing the well-being of a resident. The obvious ones were included – signs of neglect of the property, or prolonged absence.

More valuably, we discovered that they could visually scan the digital records for a single customer and spot clues buried within transcripts from the contact centre, or within emails.

They could spot a prolonged absence of actual contact with the customer; uncertainty in the responses from family and friends; increasing references to other agencies; patterns of speech suggesting rising levels of stress; changing patterns in rent payments.

We translated their abilities into a set of algorithms that could detect these signals and measure these patterns across the entire set of customer records.
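As a flavour of what such a translation might look like, here is a minimal sketch of two of the signals mentioned above – a prolonged gap in contact, and stressed language in transcripts – combined into a crude score. The phrase list, thresholds, and record layout are all invented for illustration; they are not Clarion's or Develin's actual algorithms.

```python
from datetime import date

# Hypothetical signal phrases and thresholds, invented for illustration only.
STRESS_PHRASES = {"can't cope", "eviction", "bailiff", "nowhere to go"}

def contact_gap_days(contact_dates, today):
    """Days since the customer was last in touch (None if never)."""
    return (today - max(contact_dates)).days if contact_dates else None

def stress_mentions(transcripts):
    """Count transcript lines containing any flagged phrase."""
    return sum(
        1 for line in transcripts
        if any(phrase in line.lower() for phrase in STRESS_PHRASES)
    )

def score(record, today):
    """Combine the signals into a crude vulnerability score."""
    s = 0
    gap = contact_gap_days(record["contacts"], today)
    if gap is not None and gap > 60:                 # prolonged absence of contact
        s += 1
    if stress_mentions(record["transcripts"]) >= 2:  # repeated signs of stress
        s += 1
    return s

record = {
    "contacts": [date(2018, 1, 5), date(2018, 2, 1)],
    "transcripts": ["I can't cope any more", "the bailiff letter arrived"],
}
print(score(record, date(2018, 6, 1)))  # both signals fire for this record
```

Run across the entire set of customer records, a score like this is what lets the faint patterns the professionals spot by eye be measured at scale.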

Once again we had a pool of candidates, but it was smaller. By refining our choice of signals and improving the way we detected them, we worked the pool down to 20 – a small enough group for Field Officers to investigate. From knocking on doors and speaking to customers, they estimated that the prediction was about 40% correct.

Housing providers have to do more

What do we learn from this?

The problem with the first approach is that the standard datasets may not contain the signals needed. To find correlations within data strong enough for the task, a wide variety of data is needed.

You have to combine multiple datasets – for example from education providers, social housing, the criminal justice system, drug and alcohol prevention services, and social media sites.

If you are a housing provider this is a big ask.

A better approach is to find the right signals and react to them.

You will need to create new sets of data. For example, from a dataset of rental payments, flag changes in the interval between payments, or fluctuations in the amount paid beyond a particular threshold. From contact centre transcripts, flag specific words, and patterns of words, indicating a changing relationship with the customer.

Also train people to be curious. Why has the customer locked themselves out of their flat twice in two days? Is this the onset of a more serious situation?

This is where much more can be done. Much is said about the need to improve the quality of data before attempting this sort of task. But if that simply means improving the robustness of the standard datasets (filling in blanks in addresses, ensuring that formats are consistent), then a big opportunity to make a difference is being lost.

This matters in particular because some of the people who most need help may be virtually silent – the person living alone with an increasing level of physical impairment, where it may be weeks before anyone notices that they never put their rubbish bins out; or the domestic abuse victim who is never seen in public and who will never speak out.

To find them you need to bring a wide variety of data into one place AND find a way to magnify the faint signals that may be there AND have curious people asking the right questions.

If we can do these things we might stand a chance of predicting the vulnerable using data, before it’s too late.

——

Paul Clarke, Director Develin Consulting

E: paul.clarke@develin.co.uk

T: +44 333 8000 825


SMEs can and should be using technology to help lift productivity further. This is the message from many of those looking for ways to close the productivity gap between the UK and other countries.

This is primarily through ‘digital transformation’, in particular the automation of some activities, and the ability to use data to better engage with customers and find new customers further afield.

These advances have been written about for years. However, as reports such as that from the RSA Future Work Centre indicate (1), many businesses are not even close to being able to take on this challenge.

Those businesses that have adopted this technology have a big advantage: lower-cost operations, more loyal customers who are also spending more, and the ability to find the right customers further afield.

But as an SME it’s easy to be held back – through a lack of skills, confusion and uncertainty about technology, insufficient funds or resources. And for business owners it’s reasonable to believe that barriers such as these will take years to overcome.

But that would be a mistake. Ground can be covered surprisingly fast. What matters is the clarity of your business strategy and your ability to identify the critical questions and extract the answers which will help propel your business forward.

For example, say you are striving always to be price competitive. A key and obvious question might be: how do we take more cost out of the business?

This is difficult. You may already be at the minimum level of spend. Can you cut further without damaging the business?

An alternative question might be: how do we take steps out of our operation, thereby reducing cost and speeding things up?

This is much more creative. A small amount of data could reveal the parts of the operation taking the greatest amount of time, and indicate whether automation could make a difference.
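The "small amount of data" really can be small. As an illustration (the task names and durations below are entirely invented), even a simple log of tasks and minutes spent is enough to show where the time goes:

```python
from collections import Counter

# Invented task log: (task name, minutes spent). In practice this might
# come from a spreadsheet kept for a week or two.
task_log = [
    ("re-key order into accounts system", 15),
    ("chase supplier by phone", 20),
    ("re-key order into accounts system", 12),
    ("print and file delivery note", 5),
    ("re-key order into accounts system", 18),
]

time_by_task = Counter()
for task, minutes in task_log:
    time_by_task[task] += minutes

# Tasks ranked by total time - the top entries are automation candidates.
for task, minutes in time_by_task.most_common():
    print(f"{minutes:4d} min  {task}")
```

In this invented log, re-keying orders dominates – exactly the kind of repetitive step that automation could remove.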

Another alternative question might be, simply: how do we become more price competitive?

A similarly focused set of data could help build your engagement with customers such that your services become better value for money than those from the competition.

Questions such as these form part of your ‘data strategy’ – a definition of the data and technology that will do most to accelerate your business.

By turning your business strategy into a set of critical ‘unanswered questions’ you can turn things on their head. Instead of surveying all available data and technology before making a choice, you are drawn to the specific elements that will make the biggest difference.

You will spend less, move quicker, and know what skills you need. Barriers will fall away with surprising speed and your business will be able to take some big steps forward.

(1) RSA Future Work Centre report: Artificial intelligence, robotics and the future of low-skilled work: http://ow.ly/A51G30lGaXC

————————————-

Paul Clarke

Director, Develin Consulting

If you would like help in developing a data strategy or the data sources and intelligence that will propel your business forward, please contact us.