This is an abbreviated transcript of our Tech@Housing presentation at Housing 2018.
Housing providers have a major role to play in looking after vulnerable people. Most of the time their staff know who those people are and how best to support them. But now and again we need help from data, in particular to find people who might otherwise spend their lives under the radar. At the Housing 2018 conference we took the opportunity to draw upon our experience and say a little about what’s involved.
Predicting vulnerability means either knowing in advance that someone is going to be vulnerable in some way, or finding information that suggests that this is already the case.
Yes, you can use data for the task, but accuracy is essential. Also, it’s easy to get immersed in the world of data. You must never lose sight of the real objective – to find and help people.
It’s all about accuracy
This probably sounds obvious.
If you are an online bookseller such as Amazon, your predictions are all about someone’s preferred choice of book. So you use data on everyone else’s browsing choices to come up with some recommendations.
If you get a recommendation right, and the person browsing buys the book, then the machine has delivered a ‘true positive’. If no purchase happened, it is a ‘false positive’. If they bought a book that hadn’t been recommended, that is a ‘false negative’.
Whatever the outcome, the machine learns from its mistakes. Maybe a missed sale today, but increased sales tomorrow.
Nor does it matter how many false positives and negatives there are: the cost to the seller of getting it wrong isn’t great, a delayed sale at worst.
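As a minimal sketch (with made-up browsing data, not Amazon’s actual system), the outcomes above can be tallied by comparing what was recommended with what was bought:

```python
# Hypothetical data: titles the machine recommended, and titles actually bought.
recommended = {"book_a", "book_b", "book_c"}
purchased = {"book_b", "book_d"}

true_positives = recommended & purchased    # recommended and bought
false_positives = recommended - purchased   # recommended but not bought
false_negatives = purchased - recommended   # bought without being recommended

print(sorted(true_positives), sorted(false_positives), sorted(false_negatives))
# → ['book_b'] ['book_a', 'book_c'] ['book_d']
```

For the seller, each false positive or false negative here is just a delayed sale; the stakes change entirely when the ‘positive’ is a person.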
But, with vulnerable people, it is vital to understand these positives and negatives. A false positive means that someone thought to be vulnerable turned out not to be so. That’s good, but to know this outcome and learn from it, someone had to go and track the person down to find out.
A false negative means that someone who really was vulnerable was missed – a heavy price to pay for the person concerned.
Accuracy is therefore vital. There are two ways to improve it: increase the variety and amount of data used, and improve the quality of the signals indicating vulnerability.
Increase the variety of data used
At Clarion Housing Group, Develin was tasked with predicting which residents would, at some point, abandon their homes. For whatever reason, they can no longer pay the rent and decide that it’s better to disappear than face the consequences.
There aren’t many such cases, but the costs for everyone are high. For the customer this might mean a life on the streets. For the housing provider, it is lost rent and the cost of clearing up afterwards.
To make a difference someone needed to intervene in time to turn things around. This meant predicting in advance that abandonment was a possibility.
We brought all the data we could into one place and applied Amazon-style prediction algorithms to it.
It worked, in so far as we had a pool of customers predicted to be heading for abandonment. But the pool was too large to be of use.
If we were to let a machine do the work, we needed data from other agencies and organisations, as well as from social media.
Improve signal quality
So we applied the second approach: we asked the professionals in the field what signals they look for when assessing the well-being of a resident. The obvious ones were included – signs of neglect to the property, or prolonged absence.
Of greater value, we discovered that they could visually scan the digital records for a single customer and spot clues buried within transcripts from the contact centre, or within emails. They could spot a prolonged absence of actual contact with the customer; uncertainty in the responses from family and friends; increasing references to other agencies; patterns of speech suggesting rising levels of stress; changing patterns in rent payments.
We translated their abilities into a set of algorithms that could detect these signals and measure these patterns across the entire set of customer records.
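A toy illustration of one such detector – counting references to outside agencies in free-text contact notes. The notes and keyword list are invented for the example; the real signal is a rising count over time, not any single note:

```python
# Invented contact-centre notes for one customer, oldest first.
notes = [
    "Customer called about a repair. Friendly chat.",
    "No answer at the door. Neighbour unsure where they are.",
    "Referred to debt advice service. Sounded anxious.",
    "Mentioned foodbank and a citizens advice appointment.",
]

# Hypothetical watch-list of outside-agency terms.
AGENCY_TERMS = {"foodbank", "citizens advice", "debt advice", "social services"}

def agency_references(note):
    """Count references to outside agencies in a single note."""
    text = note.lower()
    return sum(term in text for term in AGENCY_TERMS)

counts = [agency_references(n) for n in notes]
print(counts)  # → [0, 0, 1, 2] — increasing references are the signal
```

A production version would use proper text analytics rather than substring matching, but the shape of the problem – turning a professional’s instinct into a measurable trend – is the same.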
Once again we had a pool of candidates, but it was smaller. By refining our choice of signals and improving the way we detected them, we worked the pool down to 20, a group small enough for Field Officers to investigate. From knocking on doors and speaking to customers, they estimated that the prediction was about 40% correct.
Housing providers have to do more
What do we learn from this?
The problem with the first approach is that the standard datasets may not contain the signals needed. To find correlations within data strong enough for the task, a wide variety of data is required.
You have to combine multiple datasets: from education providers, social housing, the criminal justice system, drug and alcohol prevention services, and social media sites.
If you are a housing provider this is a big ask.
A better approach is to find the right signals and react to them.
You will need to create new sets of data. For example, from a dataset of rental payments, flag changes in the period between payments, or fluctuations in the amount paid beyond a particular threshold. From contact centre transcripts, flag specific words, and patterns of words, indicating a changing relationship with the customer.
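A minimal sketch of the first of those, assuming a simple list of payment dates and amounts. The thresholds and the data are illustrative, not Clarion’s actual rules:

```python
from datetime import date

# Hypothetical rent-payment history for one customer: (payment date, amount).
payments = [
    (date(2018, 1, 1), 400.0),
    (date(2018, 2, 1), 400.0),
    (date(2018, 3, 5), 400.0),   # a few days late
    (date(2018, 4, 20), 250.0),  # much later, and short
]

def payment_flags(history, interval_days=31, jitter_days=7, amount_threshold=0.2):
    """Flag gaps that stretch well beyond the usual payment interval,
    and amounts that swing more than the threshold from the previous payment."""
    flags = []
    for (d1, a1), (d2, a2) in zip(history, history[1:]):
        gap = (d2 - d1).days
        if gap > interval_days + jitter_days:
            flags.append(f"late: {gap} days between payments")
        if a1 and abs(a2 - a1) / a1 > amount_threshold:
            flags.append(f"amount change: {a1:.0f} -> {a2:.0f}")
    return flags

print(payment_flags(payments))
# → ['late: 46 days between payments', 'amount change: 400 -> 250']
```

None of these flags proves anything on its own; they simply shrink the pool to a size a Field Officer can knock on doors for.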
Also train people to be curious. Why has the customer locked themselves out of their flat twice in two days? Is this the onset of a more serious situation?
This is where much more can be done. Much is said about the need to improve the quality of data before attempting this sort of task. But if that simply means improving the robustness of the standard datasets (filling in blanks in addresses, ensuring that formats are consistent), then a big opportunity to make a difference is being lost.
This matters because some of the people who most need help may be virtually silent: the person living alone suffering from an increasing level of physical impairment, where it may be weeks before anyone notices that they never put their rubbish bins out; or the domestic abuse victim who is never seen in public and will never speak out.
To find them you need to bring a wide variety of data into one place AND find a way to magnify the faint signals that may be there AND have curious people asking the right questions.
If we can do these things we might stand a chance of predicting the vulnerable using data, before it’s too late.
Paul Clarke, Director Develin Consulting
T: +44 333 8000 825
As businesses move their finances over to self-contained, self-service applications in the cloud, some of the information they need to run the business will certainly improve. Fundamental questions such as ‘how much have we spent this month?’ can be answered in an instant.
But to improve a business, the right questions need to be asked, such as ‘do you know what to do to lift your business to the next level?’. And they need to be asked by a person, rather than by something like Amazon’s Alexa or Google Assistant.
We need the human touch
Only a person will have the experience and the nous to see the opportunities and pitfalls ahead, and be able to say ‘if you want to know what to do, I have some ideas’.
However, the better the technology, the less likely it is that this will happen. The person best placed to ask the right question is our Accountant. And if the new tech means we need our Accountants less, we will pay less for their services, and they will be less in touch.
That leaves the business owner with only the information the technology is already set up to provide, and at the mercy of the AI algorithms that the makers of the new tech are introducing.
The clue is in the word ‘artificial’
No matter what spin is attached to the powers of AI, always remember that it is a reworking of data that is already in the machine to show patterns and correlations that may have meaning when looking forward. And if that data is no more than a history of income and expenditure, the insights will be limited.
Better information comes from new data, selected carefully with a key question in mind.
For example, suppose you run the Mrs Miggins bakery business and deliver to retail outlets in your area.
It’s clear from the map which delivery route is shortest. However, if asked ‘is this the route that will make you the most money?’, you will need data about the time each possible route takes, as well as the distance. The route with the shortest time may leave room for more deliveries to be made, even though the distance may be greater.
Finding the right answer may not be an elegant exercise. And shiny new accounting tech may be of limited use.
Google will provide typical travel times over different routes at specific times of day. The accounting system will provide costs and income. And the sales system (along with the website, emails, and Facebook posts) will tell you about enquiries and the potential for additional deliveries.
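To make that concrete, here is a rough sketch with made-up figures – Mrs Miggins’ real costs, margins, and route times would come from the systems above:

```python
# Hypothetical comparison of two delivery routes: the shortest by distance
# versus one that is longer but quicker, freeing time for an extra drop.
routes = {
    "shortest": {"miles": 30, "minutes": 95, "deliveries": 5},
    "faster":   {"miles": 38, "minutes": 70, "deliveries": 6},
}

COST_PER_MILE = 0.45        # assumed fuel and wear per mile
MARGIN_PER_DELIVERY = 12.0  # assumed profit on each drop

def route_profit(route):
    """Profit for a route: delivery margin earned minus mileage cost."""
    return route["deliveries"] * MARGIN_PER_DELIVERY - route["miles"] * COST_PER_MILE

best = max(routes, key=lambda name: route_profit(routes[name]))
print(best)  # → faster — the longer route wins on money, not miles
```

The point is not the arithmetic, which is trivial, but that answering the question needs three datasets joined together, and none of them lives in the accounting package alone.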
But the answer could lead to a higher margin and an edge over your competitors who are still working with just a map, or just Google, and not combining different sets of data to uncover the hidden possibilities.
Here is the question
As specialists in Business Intelligence we are looking at ways to equip Accountants in Practice to transform the information that their clients can use to build a competitive edge.
Training is essential, specifically in the art and science of gathering and using data for a sharper commercial perspective and a stronger bottom line. We are working with BPP to develop the courses that will help.
And the right technology is needed. But this is less about pre-prepared reports and multi-coloured dashboards (although they have their place), and more about connecting data to crack the critical unanswered questions.
But are we right? Here is the question.
If you are an Accountant in Practice, what is the technology that you need most? And what would it allow you to do better?
Please let us know by commenting on LinkedIn. The more we understand of the world of the Accountant in Practice, the better we can make it.