The excitement about data needs to be balanced with an uncomfortable truth. Most data projects fail. Perhaps as many as 85%. This doesn’t mean your data modeling project or big data analysis tool is doomed. But it does mean you need to know how to swing the odds back in your favour. In this article we look at the reasons many data projects don’t deliver value. We also suggest practical steps you can follow to make your project a success.
There is great potential and enthusiasm in the public sector for data-driven innovation that improves services and reduces costs. This was the key message that the Mastodon C team took away from a round table discussion with senior figures from the public sector this week. Yes, challenges remain around data skills, data governance and shaping the right approaches for data projects, but some clear patterns are emerging that show that data makes an impact at a local level.
Data governance is about giving your people and partners timely but compliant access to data. Whether it’s driving collaboration across departments or engaging external suppliers, an effective approach to data governance gives you control over data access and usage, whilst making sure you stay on the right side of policy, regulations and the law. All whilst being able to prove that you’re doing it right. With new regulations like GDPR carrying fines of up to 4% of global turnover for organisations that get it wrong, that’s more important than ever.
Data exploration is one of the most important elements of designing a data science project. Exploring the data lets you plan your project better and answer important questions early - like whether your data can actually answer the question you’re trying to address. In this short guide we outline, step by step, how we approach investigating a new dataset. Typically, a new data science project means working with a new dataset, and investigating its content and composition will inform how the project should proceed and whether the data can support the question you are aiming to answer.
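That first-pass investigation can be sketched in a few lines of pandas. This is a minimal, hypothetical example - the DataFrame and its columns are invented stand-ins for whatever dataset you have been handed - but the steps (shape, types, missing values, summaries, distinct values) are the ones described above.

```python
import pandas as pd

# Toy stand-in for a newly received dataset (hypothetical columns).
df = pd.DataFrame({
    "service": ["SEN", "SEN", "Adult Care", "Adult Care", None],
    "year": [2015, 2016, 2015, 2016, 2016],
    "demand": [120, 135, 300, None, 290],
})

# 1. Shape and column types: what are we working with?
print(df.shape)
print(df.dtypes)

# 2. Missing values: can the data answer the question at all?
print(df.isna().sum())

# 3. Summary statistics for the numeric columns.
print(df.describe())

# 4. Distinct values in categorical columns, including missing ones.
print(df["service"].value_counts(dropna=False))
```

Even this quick pass surfaces the decisions that shape the project: a column with heavy missingness may rule out a question entirely, while unexpected categories often prompt a conversation with the data owner before any modelling starts.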
As we all know, one big aspect of delivering effective services under financial pressure is making the right commissioning decisions, based on a combination of expected demand and local policy choices about how best to meet that demand for the local community. Research by the Local Government Information Unit with The Municipal Journal found that 65% of authorities expect to have to use their reserves this year, putting extra urgency on smart commissioning. This blog post explains more about how we are trying to tackle that challenge.
We’re looking for a marketing intern to help us reach and educate more clients for our innovative new city data tool, Witan. This is a new position, and depending on product progress and your ambitions, it could develop into a permanent role.
The challenges may be big. And the future unclear. But a data-driven revolution is taking shape in the public sector. A revolution driven forward by innovative leaders and a handful of technology companies, working together to bring the benefits of data to local communities. The development of state-of-the-art platforms like Witan is making the collection, analysis and sharing of data much easier, and more accessible, for public sector organisations. These new solutions offer the public sector practical ways to drive change and meet challenging goals.
In a recent blog post we talked about using data to improve local government service planning and delivery. As part of our Witan platform, we are delivering a number of pre-built models for the common service planning and demand forecasting challenges that face city leaders. One of the most popular of these is Special Educational Needs demand forecasting and commissioning, a particularly costly and complex area to manage. We’ve now launched a video showing the experience of using the model within the Witan platform - please view and enjoy!
One question that often comes up when we talk to people new to data science is “Why not just use Excel?” It’s a reasonable question that’s not as easy to answer as you might think, but answering it does go some way to answering a key question anyone planning a data science project needs to consider - what’s the simplest way to get to the results I need?