The last couple of months have seen a mammoth development effort on PlanningAlerts.
Throughout that time we’ve posted regular updates on the PlanningAlerts Twitter feed. Follow us there if you’re interested in the absolute latest and greatest.
The purpose of this post is to bring those changes together and summarise them, making them easier to digest and giving a sense of the bigger-picture changes that have been taking place.
We’ve also had a lot of technical debt to catch up on. What does that mean? PlanningAlerts has been running for close to three years, and things inevitably break in that time. Changes get made, and subtle breakages creep in that nobody notices, but their effects accumulate. Software libraries also go out of date and need to be updated. We’ll cover some of those fixes in the next post.
But first the more interesting stuff, the new features!
So without much further ceremony…
- See how many people received a particular application via email alerts – you can now see this on the detail page of every recent application
- Added Twitter and Facebook share buttons – we want to make it easy for people to share applications on their networks
- Get email notifications of new comments – if someone comments on an application within the area you’re interested in, you’ll be notified in your normal email alert.
- Designed a prototype email signup widget – so that third-party websites (such as local councils) can make it super-easy for their constituents to sign up to email alerts on PlanningAlerts.
- Each authority now has its own page, including statistics (applications received last week, median applications per week, etc.), the number of applications received over time, a link to its ScraperWiki scraper, and the authority’s population – this lets you quickly get an overview of what’s happening within a particular planning authority. It’s also useful for developers who want to fix or check a scraper.
- Added the percentage of the population covered by PlanningAlerts – this figure is now generated automatically.
- New documentation on how to write a scraper – we’re now using ScraperWiki (http://scraperwiki.com) for any new planning authorities. This makes it really easy to get started writing a scraper. You can do everything from your web browser. You don’t even need to set up a local development environment. Our aim here is to significantly lower the barrier of entry to someone with some basic programming skills to come along and contribute to the project.
- New documentation on how to lobby your local council – we also want to make it easier for people without programming skills to contribute. So, we added a page about how you can lobby your local council to publish their planning data in a machine readable format.
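To give a flavour of what writing a scraper involves, here’s a minimal sketch in Python (one of the languages ScraperWiki supports). Everything here is hypothetical and for illustration only: the HTML structure and field names are invented, and a real scraper would fetch a live council page and save records to ScraperWiki’s datastore rather than parse an inline string and print.

```python
# Hypothetical sketch of a planning-application scraper.
# The table layout and field names below are invented for illustration;
# a real council site will differ.
from html.parser import HTMLParser


class ApplicationParser(HTMLParser):
    """Collects (council_reference, address, description) rows from a
    hypothetical HTML table of planning applications."""

    def __init__(self):
        super().__init__()
        self.in_cell = False          # inside a <td>?
        self.current_row = []         # cell text for the row being read
        self.applications = []        # finished application records

    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self.in_cell = True

    def handle_endtag(self, tag):
        if tag == "td":
            self.in_cell = False
        elif tag == "tr":
            # Only rows with exactly three data cells are applications;
            # the header row (<th> cells) collects nothing and is skipped.
            if len(self.current_row) == 3:
                ref, address, description = self.current_row
                self.applications.append({
                    "council_reference": ref,
                    "address": address,
                    "description": description,
                })
            self.current_row = []

    def handle_data(self, data):
        if self.in_cell and data.strip():
            self.current_row.append(data.strip())


# Invented sample page standing in for a fetched council listing.
sample = """
<table>
  <tr><th>Ref</th><th>Address</th><th>Description</th></tr>
  <tr><td>DA/2011/123</td><td>1 Example St</td><td>New carport</td></tr>
  <tr><td>DA/2011/124</td><td>2 Sample Rd</td><td>Two-storey dwelling</td></tr>
</table>
"""

parser = ApplicationParser()
parser.feed(sample)
for app in parser.applications:
    print(app["council_reference"], app["address"])
```

On ScraperWiki itself you’d run something like this from the browser, with no local setup at all, which is exactly why it lowers the barrier to entry so much.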
We also added 5 new planning authorities:
- Liquor licence applications for Victoria
- City of Launceston, TAS
- Unley, SA
- Development Assessment Panels, WA
- City of Perth, WA
City of Perth and the Development Assessment Panels required scraping PDF documents. Not an easy task!
Next, in part 2, we’ll cover the many improvements to existing features, some small, some large.