Quality LAC data from quality Python training

Around the start of this year, we asked for expressions of interest in a project to collaboratively build a data checking tool for the SSDA903 dataset. We already knew from an earlier project, led by Greater Manchester Combined Authority and Social Finance, that the data tool would be useful. The new idea was that, instead of paying someone to build the tool for us, we'd pay someone to teach us how to build it ourselves. Within a day, more than 40 colleagues around the country had replied to say they were interested. With Wigan Council as the lead authority, we successfully bid for funding from DLUHC's Local Digital Collaboration Unit, and started work.


This week we're launching the first output from that project. It's a browser-based tool designed and developed by data analysts from more than 20 different local authorities, supported by technical expertise from our friends at Social Finance. It will let you perform the same kinds of data validation as the DfE’s SSDA903 statutory data submission tool, at any time, using your year-to-date extract of SSDA903 data. We recommend a monthly data checking cycle, but we're definitely interested to hear other people's approaches.
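To give a flavour of what "data validation" means here, the sketch below shows the shape of one such check in Python. The rule logic (flagging episodes whose end date falls before their start date) and the sample records are illustrative assumptions for this post, not the tool's actual implementation, though DECOM and DEC are the SSDA903 field names for episode start and end dates.

```python
from datetime import date

# Hypothetical SSDA903 episode records (illustrative only, not real data).
episodes = [
    {"CHILD": "A1", "DECOM": date(2023, 4, 1), "DEC": date(2023, 9, 30)},
    {"CHILD": "B2", "DECOM": date(2023, 6, 15), "DEC": date(2023, 5, 1)},  # end before start
    {"CHILD": "C3", "DECOM": date(2023, 8, 1), "DEC": None},               # episode still open
]

def check_episode_dates(rows):
    """Flag episodes whose end date (DEC) falls before the start date (DECOM).

    A sketch of the style of a statutory validation rule; the real tool runs
    many such checks over your year-to-date extract.
    """
    errors = []
    for i, row in enumerate(rows):
        if row["DEC"] is not None and row["DEC"] < row["DECOM"]:
            errors.append((i, row["CHILD"]))
    return errors

print(check_episode_dates(episodes))  # flags the B2 episode: [(1, 'B2')]
```

Running a bank of small, independent checks like this against a fresh extract is what makes a monthly cycle practical: each run tells you exactly which records to chase while the events are still recent.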


The tool loads Python code in your web browser to read and validate your SSDA903 data files locally. None of your SSDA903 data will leave your network via this tool.


This means that you can safely use it without installing additional software, without any data-sharing agreement, and without creating any new report outputs from your case management system. Technically, once the Python code has loaded into the browser, the tool could work entirely offline.


Data cleaning for the SSDA903 return is hard: typically you can't start it until the DfE site goes live in April, which means you're trying to fix errors that occurred up to a year ago, and you have a whole year's worth of them to fix at once. We hope this tool reduces the year-end bottleneck for local authorities and allows them to fix errors closer to source, which should make the April return easier and the errors themselves easier to fix.


There's another impact, of course: if you're waiting a year to fix errors in your statutory dataset, those errors show up in your regular reporting to decision-makers all year too, potentially hampering their ability to make good decisions. So we also hope that, by helping local authorities move to in-year validation, this tool will keep these crucial datasets clean year-round, ensuring that leaders see an accurate picture of their work for children looked after.


What's next? We'll be interested to hear feedback from anyone who uses the tool. We have some funding in reserve to either fix any issues that arise, or keep pushing the project a little further and add some more data checks. Beyond that, if this is a success then we'll be thinking about how best to maintain it in the long term, and whether to expand the work further - there's at least one other statutory data return we'd like to tackle, and then there's the question of whether this Python-in-a-browser approach could be useful for anything else. Your thoughts on any of these possibilities are, as always, very welcome.

