Another big story from Right To Know, and how you can do it too

Screenshot showing the released document that has a table of agency names

Over recent weeks there’s been lots of interest in a story about Australia’s mandatory data retention regime. In passing these controversial laws last year, the government agreed to rein in the number of agencies able to access your data. However, the laws allowed agencies to re-apply for access. Two weeks ago it was discovered that over 60 agencies have done just that. This discovery was reported by most major news organisations in Australia.

While mandatory data retention certainly concerns the OpenAustralia Foundation (after all, most of what we do is on the internet), this post isn’t about that. It’s about the largely untold story behind this recent discovery. It was all down to one person who is deeply concerned about the implications of data retention.

A couple of months ago Geordie Guy submitted a Freedom of Information (FOI) request publicly using our FOI request site Right To Know. What sparked all of the media interest recently is that his request was successful and it revealed the names of most of the 61 agencies seeking warrantless access to your metadata.

Dozens of news stories over several days, all from one person’s request on Right To Know.

The amazing thing is that you can do this too, about the issues that you care about. As you can read in his fascinating article about the process he went through, Geordie knows a thing or two about FOI. But you don’t need to, and that’s the great thing about Right To Know.

When the agency initially indicated they were planning to refuse his request, Geordie cleverly changed its scope. If you were making a request and the same thing happened, you could add an annotation to your request asking for help from the fabulous community on Right To Know. They’ll almost certainly help you get the information you’re after.

Usually requests are free, but in this case the agency decided to charge over $500 for access to these documents. This didn’t stop Geordie either. He successfully crowdfunded the fees for the request in just two and a half hours. This isn’t the first time someone has crowdfunded FOI requests on Right To Know. It once again shows that there are people out there ready to support your quest for public access to important documents.

Have a listen to Heidi Pett’s story about this case on ABC Radio National’s PM programme. It’s one of the few media reports that looks at the full story and acknowledges the passionate individuals and volunteers who helped this important information about our data retention system see the light of day.

We hope it will inspire you to make your own request for information about something you care about. Go ahead and create a new request on Right To Know now.

Posted in RightToKnow.org.au

They Vote For You – Finding the real facts about voting

I am a firm believer in the people. If given the truth, they can be depended upon to meet any national crisis. The great point is to bring them the real facts, and beer.

– Abraham Lincoln

When the Australian Citizenship Amendment (Allegiance to Australia) Bill 2015 went through the House of Representatives, it passed without one division (or formal vote) being recorded in the official record of Parliament. This means that there is no record of how individual Members of Parliament (MPs) voted on the bill.

But aren’t all votes in Parliament recorded?

Unbelievably, they are not.

There are two kinds of votes in Parliament: votes ‘on the voices’ and divisions. Votes on the voices are the most common kind of vote and involve our representatives yelling out ‘Aye’ (yes) or ‘No’ and the chair deciding which side is in the majority without writing down any names.

Divisions are less common and only occur if two or more of our representatives call for one. When a division is called, the bells of Parliament ring out for four minutes to alert any missing representatives that they should return to their chamber immediately if they want to vote. Then the chamber (either the House of Representatives or the Senate) is locked. During a division, our representatives walk to either side of their chamber: the right side to vote yes; and the left side to vote no. Then they are counted and their names are recorded.

Because most votes occur ‘on the voices’, we have no practical way of knowing how our representatives vote most of the time. What we see on They Vote For You (which takes its voting data from the Parliament’s official records) represents just a fraction of the votes actually taking place in Parliament.

Why is this important?

If we don’t know how our representatives vote, we can’t hold them accountable. Bills can speed through Parliament ‘on the voices’ without any public record of how each representative actually voted. The example I mentioned above was the Australian Citizenship Amendment (Allegiance to Australia) Bill 2015 (‘Allegiance to Australia bill’), which passed in the House of Representatives without one division being recorded. This was possible for two reasons: (1) the bill had bipartisan support, meaning both the Coalition and the Australian Labor Party supported it; and (2) those major parties control the House of Representatives because almost all our representatives there (known as Members of Parliament or MPs) are members of those two parties.

On the other hand, many of our representatives in the Senate (known as senators) belong to minor parties or are independents. This means that the two major parties have far less control in the Senate and so there is more debate and a far greater chance of divisions being called, as was the case with the Allegiance to Australia bill.

In fact, most anti-terror bills have bipartisan support, so the only voting data about them on They Vote For You comes from the Senate. The consequence is that many of our related policies only include the voting habits of our senators, including:

So if we want to know how our MP voted on any of these subjects, our only option is to go to the House Hansard (the official transcript of the House of Representatives, which is also available on OpenAustralia.org) and trawl through pages and pages of parliamentary jargon and hope that our MP contributed to debate. If they didn’t, it’s unlikely we’ll ever know.

How can we make the system better?

If every vote in Parliament was made by division then we could always see how our representatives voted on our behalf. Unfortunately, this solution has a major downside: divisions take a long time. There is four minutes of bell ringing, then moving across the chamber, counting and recording, and then everyone returning to their seats. In other words, Parliament would go on forever.

There is another way however: electronic voting. Representatives can simply vote by pressing buttons on a screen, providing a complete record and saving time. It shouldn’t take any longer than voting ‘on the voices’ because representatives can stay in their seats and counting is done by computer.

The House of Representatives Standing Committee on Procedure has already done an initial investigation into the possibility of e-voting back in 2013 (see their report). While they concluded against it, they emphasised that theirs is ‘very much a preliminary examination and the Committee cannot make any considered conclusion or recommendation without details of the options and their implications’.

Their main criticism was that e-voting lessens the visibility of Members’ decision-making in the House and takes away the opportunity for Members to ‘move away from their allocated seats and speak informally to their colleagues and Ministers’. The report goes on to say that ‘[a]necdotal evidence suggests that many consider these informal professional exchanges essential to their work’.

Since there are many solutions to these objections – including the use of differently coloured lights to make our representatives’ votes visible to their colleagues in Parliament and more informal discussion opportunities during breaks – the resistance to electronic voting seems to be more about parliamentarians being sticklers for tradition. Though it is possible that parties are concerned there may be an upsurge in party members crossing the floor (or rebelling) if electronic voting were introduced. After all, it’s easier to be a rebel when it’s just a case of pressing a button rather than having to cross a chamber in front of all your party colleagues and the gaze of your party whip (whose job includes ensuring all party members attend and vote as a team).

Calls for parliaments to use electronic voting in order to increase parliamentary accountability are growing louder. In 2012, the Declaration on Parliamentary Openness was launched and has since been endorsed by a number of organisations, including the OpenAustralia Foundation. Article 20 of the Declaration calls for parliaments to minimise voting on the voices and instead use methods such as electronic voting that leave a record of voting behaviour, which can then be used by citizens to hold their representatives accountable.

 

What do you think? Is it time for electronic voting in our federal Parliament?

 

Posted in They Vote For You

Civic Tech Monthly, November 2015

Welcome to the tenth edition of Civic Tech Monthly. Below you’ll find news and notes about civic tech from Australia and around the world.

This will likely be our last newsletter of 2015. Thanks to everyone who has submitted items throughout the year, and thank you for reading. In this edition we’ve got a few local events and bits of news, some interesting research, some useful tech from this side of the globe, and more.

As always we’d love to see you at the OpenAustralia Foundation Sydney Pub Meet next Tuesday in Sydney.

If you know someone who’d like this newsletter, pass it on: http://eepurl.com/bcE0DX.

News and Notes

We’re having an end of year party in Sydney

It’s been a busy year for the OpenAustralia Foundation. To celebrate our achievements, and all of you who have helped us on the way, we’re going to have a little party on Sunday, the 6th of December. It’ll be pretty low key and casual (what else?). We’re going to head to lovely Bicentennial Park on the Glebe foreshore and hang out from 11:00 and enjoy some sunshine, a drink, and something to eat.

We’d love for you to join us. Please come along any time after 11:00. BYO food and drinks. RSVP on the Meetup page.

There’s a hackathon to help refugees in Sydney

The Techfugees Australia Hackathon is happening next weekend in Sydney, on the 28th and 29th of November. It’s billed as the Sydney tech community’s response to the current refugee crisis, involving a network of concerned individuals and inspired by similar recent events in Europe.

It’s great to see that the event has partnered with organisations that will describe the challenges they face to participants. This makes it far more likely that the solutions developed will be for real problems and will hopefully see real world use.

If you want to use your civic tech skills for good, register soon as there are only a few places left.

Australian Government finally commits to finalising OGP membership

It’s been a long and often rocky journey but the Australian Government has finally committed to finalising its membership of the Open Government Partnership. Now the real hard work can begin, starting with the development of a national action plan for open government.

Congratulations to the people working behind the scenes to make this happen. And thanks to Peter Timmins for tirelessly keeping us up to date with the latest developments over at his blog, Open and Shut.

It’s all go on They Vote For You

From time to time we get an email from the office of a Member or Senator asking us to change their voting record on They Vote For You in some way because they think it’s inaccurate. But when we ask them to tell us what the error is, we don’t hear back from them again. Recently we got an email that was refreshingly helpful and we’ve even made some changes thanks to it.

We’ve also got a call out for people that want to help make our parliament more open by contributing to They Vote For You – get in touch if you’re interested.

Civic tech: Would use again

mySociety have released some early findings from their research into the impact of civic technology. It’s definitely worth a look. We’ve been participating in this research and can’t wait to see more of it so it can help inform the work we do.

Hackpad to SayIt importer

We recently heard about this tool that could be a really great way of creating and publishing transcripts. For example, you could use it to transcribe your local council’s meeting on a Hackpad and then have it published in a really nice way using SayIt.

Give it a go and let us know how you get on!

Audrey Tang, Brilliant Programmer, “Hacks” Politics in Taiwan

Audrey is the person that wrote the above tool we just mentioned. This translated interview with her is a really interesting story of one person’s journey to creating civic tech.

Posted in Civic Tech Monthly

They Vote For You – Join our summer working bee!

Sign up to help unlock Parliament

Calling all law students and political science enthusiasts!

This summer, the OpenAustralia Foundation invites you to put your statutory interpretation skills to work! If you can pick through the Parliamentary jargon and make it clear who’s voting for what from a day’s proceedings then you can help.

They Vote For You launched last year to help Australians keep track of how their federal representatives vote on issues people care about, including university fee deregulation, privatisation and same-sex marriage. You can help make it even better by contributing your expertise.

By contributing to They Vote For You, you’ll be making Parliament more open to everyone. At the same time, you’ll develop your ability to interpret legislation and create Plain English summaries for a broad audience – all important skills for lawyers, activists and even future politicians!

We’re looking for people who have:

  • Keen attention to detail – someone who’ll notice when a Member or Senator tries to swap an “and” for an “or”;
  • Knowledge of (or a desire to learn) how Federal Parliament works – someone who can tell the difference between a Second Reading and a Third Reading;
  • Ability to read bills and legislation – someone who is stubborn enough to keep re-reading an amendment until they understand it;
  • Ability to translate Parliamentary jargon into Plain English – someone who can explain an amendment about retrospectivity without relying on the word “retrospectivity”; and
  • Desire to make the doings of Parliament accessible to everyone – someone who believes it’s important for Australians to know how their representatives are voting.

They Vote For You is an open source project, meaning you can contribute to it from home at times that suit you!

To find out how to get involved, get in touch with us by Wednesday 2 December 2015 and we’ll get you started.

 

Posted in They Vote For You

They Vote For You – There’s something wrong with Andrew Wilkie’s voting record!

Picture 1 - Something's wrong with the voting record

Since it launched last year, we’ve received a few emails from the offices of Members or Senators asking us to change their voting record on They Vote For You in some way because they think it’s inaccurate. But when we ask them to tell us what the error is, we don’t hear back from them again. That’s what made Tim’s email different.

Tim is an Adviser to Andrew Wilkie MP and he was concerned that there was something wrong with Mr Wilkie’s voting record on live animal export. But instead of leaving it there, he actually explained what was wrong and why it was inaccurate. He also told us exactly which divisions were causing the problem: one from 16 June 2011 and another from 20 June 2011. And he was right. In fact, Tim’s email was so refreshingly helpful that we’ve published it in full. Read on to see the changes we’ve made in response.

Dear OAF team

I would like to request a correction to Andrew Wilkie’s voting record on They Vote For You.

The page states that Andrew voted “moderately for live animal export”. As you may be aware, Andrew is one of the most, if not the most, outspoken critics of the live export trade in the Parliament.

The “moderately for” claim appears to come from the two divisions on 16 Jun 2011 and 20 Jun 2011 relating to a motion moved by Bob Katter, the Member for Kennedy. There are two issues about this motion that are important to clarify.

Firstly, Andrew did not second or support this motion because he supports live export (you can read his speech outlining his reasons here – http://www.openaustralia.org.au/debates/?id=2011-06-16.65.1#g65.2). The Member for Kennedy’s motion called on the Government to act to put in place appropriate animal welfare safeguards in Indonesia, and to make the existing live export trade as humane as possible. Andrew supports improved animal welfare standards even though he advocates an overall end to the live export trade. Therefore I don’t believe the claim that “People who are for live animal export would have voted Yes” is correct.

Secondly, both the 16 Jun and 20 Jun divisions relate to a motion to suspend standing orders. It is important to note that members of the crossbench have to consider every vote, whether it be on a Bill or a procedural motion, on their individual merits. In Andrew’s case, he will often support a suspension of standing orders (or oppose a gag motion) because he believes in free and open debate in the Parliament or he believes that the Parliament should debate a certain issue. This is not an indicator of his opinion on the substantive issue as that is not what the division in question is about. Members of the crossbench will also often find themselves seconding motions or bills that they might not necessarily agree with or vote for, because such items of business require a seconder which the major parties are not willing to provide.

On the live export issue, I understand that They Vote For You looks only at divisions. But to look at other parliamentary business, which is often much more important and telling than divisions, Andrew has introduced four Private Members’ Bills that would end live animal export cruelty, asked countless questions without notice, and given a number of speeches. I can provide links to those for you if you would like. I think that these other items of parliamentary business are far more relevant than divisions when it comes to Andrew’s position on live animal export. Some things never come to a vote, and as much as Andrew would like his Private Members’ Bills to come to a vote in the Parliament, the Selection Committee, controlled by the Government and the Opposition, has complete control over if and when private members’ business comes to a vote.

Thanks for your time and please don’t hesitate to contact me on the below if you would like any further information.

Kind regards

Tim

The problem he highlighted was that these two divisions were not actually about live animal export. Instead, they were on whether to let the House of Representatives vote on another motion, which was about live animal export. Divisions like these are common because there are many times when a Member or Senator wants to do something they’re not allowed to do because of the standing orders (that is, the rules about when to talk, when to be quiet, when to vote etc.). According to Tim’s email, Mr Wilkie often votes ‘yes’ in divisions like these “because he believes in free and open debate in the Parliament or he believes that the Parliament should debate a certain issue”.

Fortunately, They Vote For You is designed to make it easy for anyone to correct a problem like this so that the site is as accurate as possible.

First, we find the troublesome divisions and click on their link.

Picture 2 - The troublesome motions

Second, once we’re on the division page, we scroll down to ‘Votes’ and click on the link ‘Add or update related policies’.

Picture 3 - How to correct the error

Third, under ‘Related Policies’, we find the live animal export policy and (because we now realise that this division isn’t really about live animal export) we click ‘remove’.

Picture 4 - How to correct the error cont.

Then we’re done. We can now go back to the live animal export policy page and see how things have changed. For example, Andrew Wilkie has now moved from being a moderate supporter to a very strong opponent – which is quite a difference! Other politicians have also moved so that the policy now reflects their positions more accurately.

Picture 5 - All fixed

That’s all there is to it! Three easy steps for us and a giant leap for accuracy. And we don’t have to stop there. The more relevant divisions that we can add to a policy, the more accurately that policy will reflect the positions of our Members and Senators. So why not pop over to Divisions to see if there are any others that we can add?

And next time a division looks out of place, try fixing it yourself. And maybe, as you go, you’ll discover a new issue that can be made into a policy. I’m now wondering if a policy about suspending standing orders to allow further debate could work…

Posted in They Vote For You

Civic Tech Monthly, October 2015

Welcome to the ninth edition of Civic Tech Monthly. Below you’ll find news and notes about civic tech from Australia and around the world.

It seems like October has been a busy month for everyone. All around the world people are flat-out launching new projects, presenting at conferences, and sharing plans and ideas. This edition is full of opportunities and ideas you can draw on.

As always we’d love to see you at the OpenAustralia Foundation Sydney Pub Meet next Tuesday in Sydney.

If you know someone who’d like this newsletter, pass it on: http://eepurl.com/bcE0DX.

News and Notes

Thank you Matthew

There must be something in the water because this has been a big year for leadership changes at civic tech organisations, with Tom Steinberg moving on from mySociety and James McKinney from OpenNorth. This time it’s our very own co-founder Matthew Landauer.

So now it’s our turn to say thank you to Matthew. Since creating OpenAustralia.org.au over 7 years ago Matthew has worked tirelessly to better connect people with their communities, governments and politicians through his work at the OpenAustralia Foundation. While many people talk about this, Matthew just gets on and makes it happen. He’s done this by programming, designing and writing websites that have been used by millions of people, and by teaching and inspiring others to do the same as a leader in civic tech.

From Henare and Luke: thank you, Matthew, for everything you taught us. With Kat keeping our feet on the ground, we’ll keep putting those lessons to work. We’re full steam ahead.

Apply for support and development help from mySociety

There are dozens of civic tech projects, running all over the world, built on mySociety’s open source work. They’re using platforms like Alaveteli, FixMyStreet, WriteInPublic or YourNextRepresentative.

You can get mySociety’s help to set up something like this in your area. For projects that suit their program, they’re offering technical and development help, as well as advice on a range of issues and even hosting.

Applications close October 30, so get in quick.

Democracy Club plans for 2016 UK elections

For the UK’s 2015 elections Democracy Club helped thousands of people in the UK vote and find out more about candidates. Now they’re laying down plans for their next round:

May 2016 will see a much wider range of elections in the UK – from local councils to city mayors, from police commissioners to the devolved assemblies and parliaments…

For May 2016, we’ve set ourselves the ambitious target of knowing about every candidate in every election in the UK…

And that’s not all. On 7 May, we noticed that one of the most popular internet searches was: “Where do I vote?” For that reason, among others, we think there’s real value to be gained in open data for polling stations – their locations and the areas they serve.

If you’ve got elections coming up in your area, check out their plans and how they approach these projects. Maybe these ideas will inspire a project of your own; all their projects are, of course, open source for you to use.

Update to an Open Data Masters Thesis

In 2011 Zarino Zappia completed his Masters thesis on the state of “open data” use in the UK and USA: ‘Participation, Power, Provenance: Mapping Information Flows in Open Data Development’ [4MB PDF]. Earlier this month he posted some thoughts about what has and hasn’t changed since then, touching on what government, hackers, non-profits, and the private sector are up to. It’s a wonderful and refreshingly honest appraisal of the state of open data.

New York Senate relaunches with a website designed for citizens

It’s great to see a legislature launch a website that has been designed to help citizens. There’s still lots of work to do, but it’s a big step ahead of what most people around the world get. How does nysenate.gov compare to your local senate or parliament’s website?

Civicist has collected perspectives on the new system, with particular focus on the interesting “listening” features.

Legislative openness conference in Georgia brings together delegates from over 30 countries

You can now watch the sessions from the Open Government Partnership’s Legislative Openness Working Group meeting hosted by the Parliament of Georgia last month. There were over 75 parliamentary and civil society delegates from more than 30 countries present for the meeting.

You can find out more about what went down in OpeningParliament.org’s helpful review of the event.

Freedom of Information sketch diary

Myfanwy Tristram has posted her amazing sketches from AlaveteliCon 2015, the international conference on Freedom of Information technologies.

The sketches are a great introduction to the characters behind FOI projects around the world (including ours!). Myf gives you a real sense of the different flavour that each team brings to their common mission. The sketches are published over 5 posts on Myf’s blog.

An epic web scraping tutorial

After our first scraping workshop last month we wrote up some of the things we learned. One of them was that a reference guide would have been useful. So Luke wrote an epic step-by-step tutorial on how to write a web scraper in Ruby using morph.io!

Over the last couple of weeks we’ve featured these in a blog post series in which you actually create, publish and run your own working scraper. Give it a try and let us know how you go. We’re keen to help more people get the skills to build the projects they want to see. Any feedback on the tutorial would be greatly appreciated.

If you’re in Sydney this weekend and keen to learn scraping, we’ve still got three spots available in our next Introduction to Web Scraping Workshop on Sunday. We’d love you to join us.

Videos from Code for America Summit

Code for America Summit was earlier this month and you can now watch it all online, including this much tweeted presentation from Tom Loosemore.

Who comments in PlanningAlerts and how could it work better?

About a month ago we started some design research to learn from the people who use PlanningAlerts how we can make the experience of commenting on local planning better. This post talks about how we’re approaching design research and our observations in this project so far.

We’re currently working on a new project to help people write to their elected local councillors about planning applications. The aim is to strengthen the connection between citizens and local councillors around one of the most important things that local government does: planning. We’re also trying to improve the whole commenting flow in PlanningAlerts.

The Impacts of Civic Technology Conference 2016, call for papers

Speaking of research, the Impacts of Civic Technology Conference (TICTEC) is on again in 2016. TICTEC was a great success in 2015, bringing together people from all over the world to talk and learn about research in civic tech and the impact these projects make.

TICTEC 2016 will be held on the 27th–28th of April 2016 in Barcelona. The call for papers and workshop ideas is now open. mySociety also offer grants for travel and registration that you can apply for now.

Posted in Civic Tech Monthly

Ruby web scraping tutorial on morph.io – Part 5, saving your data & running it on morph.io

This post is part of a series of posts that provide step-by-step instructions on how to write a simple web scraper using Ruby on morph.io. If you find any problems, let us know in the comments so we can improve these tutorials.


In the last post we dealt with the site’s pagination and started scraping a complete dataset. In this final post we work out how to save our data and publish our scraper to morph.io.

Scrapers on morph.io use the handy ScraperWiki library to save data to an SQLite database. This is how all data in morph.io is stored. Each scraper page provides options to download the SQLite database, a CSV file of each table, or access the data via an API.

You might remember seeing the ScraperWiki library listed as a dependency in your Gemfile earlier:

ruby "2.0.0"

gem "scraperwiki", git: "https://github.com/openaustralia/scraperwiki-ruby.git", branch: "morph_defaults"
gem "mechanize"

To use this library in your scraper, you need to declare that it is required at the top of your scraper.rb in the same way you have for the Mechanize library:

require 'mechanize'
require 'scraperwiki'

You can save data using the ScraperWiki.save_sqlite() method. This method takes care of the messy business of creating a database and handling duplication for you. There are two arguments you need to pass it: an array of the record’s unique keys, so it knows whether to insert a new record or update an existing one, and the data that you want to save.
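To make the “unique keys” idea concrete, here is a toy sketch of that upsert behaviour in plain Ruby. This is only an illustration of the idea (a hash standing in for the SQLite table, with made-up names), not the ScraperWiki library itself:

```ruby
# Toy illustration of upsert-by-unique-key semantics.
# A hash keyed by the unique fields stands in for the SQLite table.
db = {}

def save(db, unique_keys, record)
  key = record.values_at(*unique_keys)
  db[key] = record # insert a new record, or replace the one with this key
end

save(db, [:title], { title: "Jane Citizen MP", party: "Example Party" })
save(db, [:title], { title: "Jane Citizen MP", party: "Updated Party" })

p db.size                 # 1 record, not 2
p db.values.first[:party] # the second save updated the existing record
```

Saving the same member twice leaves a single, updated record rather than a duplicate; that’s why using the member’s name as the unique key works here.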

A member’s full name is unique to them so you can use that as your unique key (we’ve called the field “title”). The data you want to save is your member object. A good place to save your data is right after your p member statement.

p member
ScraperWiki.save_sqlite([:title], member)

Your scraper.rb should now look like this:

require 'mechanize'
require 'scraperwiki'

agent = Mechanize.new
url = 'https://morph.io/documentation/examples/australian_members_of_parliament'

["1", "2", "3"].each do |page_number|
  page = agent.get(url + "?page=" + page_number)

  page.at('.search-filter-results').search('li').each do |li|
    member = {
      title: li.at('.title').inner_text.strip,
      electorate: li.search('dd')[0].inner_text,
      party: li.search('dd')[1].inner_text,
      url: li.at('.title a').attr('href')
    }

    p member
    ScraperWiki.save_sqlite([:title], member)
  end
end

Save and run your file. The command line output should be unchanged, but if you view the files in your project directory you will see a new file, data.sqlite.

Great job. You’ve now written a scraper to collect data and save it to a database. It’s time to put your new scraper code on morph.io so you can show the world how cool you are—and so it can take care of running the thing, storing your data, and providing you easy access to it.

Running your scraper on morph.io

morph.io runs scraper code that is stored in public GitHub repositories. To run your scraper on morph.io, you’ll first have to push it back up to the GitHub repository you originally cloned it from.

Start off with another git commit to save any outstanding changes.

Push your changes up to your remote GitHub repository with:

> git push origin master

Now go view your scraper’s page on GitHub (the url will be something like github.com/yourusername/the_name_of_this_scraper). Navigate to view your scraper.rb file on GitHub and see that it’s got all your local changes.

You can now go over to your scraper’s page on morph.io and click the “Run scraper” button near the top of the page. The moment of truth is upon us.

As your scraper runs you will see your console output printing the data for the members you are scraping. A few seconds later, underneath the heading “Data”, you’ll find a table showing a representative ten rows of data and buttons to download your data in a range of formats.

Take a moment to explore the download options and check that the data looks as you expected.
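Your data is also available over morph.io’s API, so other programs can fetch it directly. A sketch of building the request URL (the username, scraper name and key below are placeholders; your scraper’s page on morph.io shows the real ones):

```ruby
require 'erb'

# Placeholders: substitute your own username, scraper name and API key.
base  = 'https://api.morph.io/yourusername/the_name_of_this_scraper/data.json'
query = ERB::Util.url_encode('select * from data limit 10')
url   = "#{base}?key=YOUR_API_KEY&query=#{query}"

puts url
# Fetching this URL (for example with the open-uri library) returns
# your scraped rows as JSON.
```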

That’s all folks

Well done my friend, you’ve just written a web scraper.

With just a few lines of code you’ve collected information from a website and saved it in a structured format you can play with. You’ve published your work for all to see on morph.io and set it to run, store and provide access to your data.

If you want to get really fancy you can set your scraper to auto-run daily on your scraper’s settings page so it stays up to date with any changes to the members list.

Before you go mad with power, go and explore some of the scrapers on morph.io. Try searching for topics you find interesting and domains you know. Get ideas for what to scrape next and learn from other people’s scraper code.

Remember to post questions to the help forums if you get blocked by tricky problems.

If you have any feedback on this tutorial we’d love to hear it.

Now go forth with your new powers and scrape all the things!

Posted in Morph

Ruby web scraping tutorial on morph.io – Part 4, dealing with pagination

This post is part of a series of posts that provide step-by-step instructions on how to write a simple web scraper using Ruby on morph.io. If you find any problems, let us know in the comments so we can improve these tutorials.


In the last post we finished collecting the data we want but discovered we needed to collect it over several pages. In this post we learn how to deal with this pagination. There are a number of techniques for dealing with pagination, and the one we present here is deliberately simple.

Visit the target page in your browser and navigate between the different pages using the links above the members list. Notice that when you go to page 2 the url is mostly the same except it has the query string ?page=2 on the end:

https://morph.io/documentation/examples/australian_members_of_parliament?page=2

When scraping websites pay close attention to the page URLs and their query strings. They often include clues to help you scrape.

It turns out you can navigate between the different member pages by just changing the page number to 1, 2 or 3 in the query string.

You can use what you’ve discovered as the basis for another each loop. This time you want to make a loop that runs your scraping code for each page.

You know that the three pages with members are pages 1, 2 and 3. Create an Array of these page numbers ["1", "2", "3"] and then loop through these numbers to run your get request and scraping code for each page.

require 'mechanize'

agent = Mechanize.new
url = 'https://morph.io/documentation/examples/australian_members_of_parliament'

["1", "2", "3"].each do |page_number|
  page = agent.get(url + "?page=" + page_number)

  page.at('.search-filter-results').search('li').each do |li|
    member = {
      title: li.at('.title').inner_text.strip,
      electorate: li.search('dd')[0].inner_text,
      party: li.search('dd')[1].inner_text,
      url: li.at('.title a').attr('href')
    }

    p member
  end
end

Save and run your scraper.rb. You should now see all 150 members’ details printed. Well done! You should do a git commit for this working code.

This is great, but there’s one more step. You’ve written a scraper that collects the details of members of Parliament and prints them to the command line, but you actually want to save this data. You need to store the information you’ve scraped so you can use it in your projects, and that’s one of the things we’ll cover in our next and final post.

Posted in Morph

Ruby web scraping tutorial on morph.io – Part 3, continue writing your scraper

This post is part of a series of posts that provide step-by-step instructions on how to write a simple web scraper using Ruby on morph.io. If you find any problems, let us know in the comments so we can improve these tutorials.


In the last post we started writing our scraper and gathering some data. In this post we’ll expand our scraper to get more of the data we’re after.

So now that you’ve got the title for the first member, get the electorate (the place the member is ‘member for’) and the party.

Looking at the page source again, you can see this information is in the first and second <dd> elements in the member’s <li>.

<li>
  <p class='title'>
    <a href="http://www.aph.gov.au/Senators_and_Members/Parliamentarian?MPID=WN6">
      The Hon Ian Macfarlane MP
    </a>
  </p>
  <p class='thumbnail'>
    <a href="http://www.aph.gov.au/Senators_and_Members/Parliamentarian?MPID=WN6">
      <img alt="Photo of The Hon Ian Macfarlane MP" src="http://parlinfo.aph.gov.au/parlInfo/download/handbook/allmps/WN6/upload_ref_binary/WN6.JPG" width="80" />
    </a>
  </p>
  <dl>
    <dt>Member for</dt>
    <dd>Groom, Queensland</dd>
    <dt>Party</dt>
    <dd>Liberal Party of Australia</dd>
    <dt>Connect</dt>
    <dd>
      <a class="social mail" href="mailto:Ian.Macfarlane.MP@aph.gov.au"
      target="_blank">Email</a>
    </dd>
  </dl>
</li>

Get the electorate and party by first getting an array of the <dd> elements and then selecting the one you want by its index in the array. Remember that [0] is the first item in an Array.

Try getting the data in your irb session:

>> page.at('.search-filter-results').at('li').search('dd')[0].inner_text
=> "Groom, Queensland"
>> page.at('.search-filter-results').at('li').search('dd')[1].inner_text
=> "Liberal Party of Australia"

Then add the code to expand your member object in your scraper.rb:

member = {
  title: page.at('.search-filter-results').at('li').at('.title').inner_text.strip,
  electorate: page.at('.search-filter-results').at('li').search('dd')[0].inner_text,
  party: page.at('.search-filter-results').at('li').search('dd')[1].inner_text
}

Save and run your scraper using bundle exec ruby scraper.rb and check that your object includes the attributes with values you expect.

OK, now you just need the url for the member’s individual page. Look at that source code again and you’ll find it in the href of the <a> inside the <p> with the class title.

In your irb session, first get the <a> element:

>> page.at('.search-filter-results').at('li').at('.title a')
=> #<Nokogiri::XML::Element:0x3fca485cfba0 name="a" attributes=[#<Nokogiri::XML::Attr:0x3fca48432a18 name="href" value="http://www.aph.gov.au/Senators_and_Members/Parliamentarian?MPID=WN6">] children=[#<Nokogiri::XML::Text:0x3fca4843b5c8 "The Hon Ian Macfarlane MP">]>

You get a Nokogiri XML Element with one attribute. The attribute has the name “href” and the value is the url you want. You can use the attr() method here to return this value:

>> page.at('.search-filter-results').at('li').at('.title a').attr('href')
=> "http://www.aph.gov.au/Senators_and_Members/Parliamentarian?MPID=WN6"

You can now add this final attribute to your member object in scraper.rb:

member = {
  title: page.at('.search-filter-results').at('li').at('.title').inner_text.strip,
  electorate: page.at('.search-filter-results').at('li').search('dd')[0].inner_text,
  party: page.at('.search-filter-results').at('li').search('dd')[1].inner_text,
  url: page.at('.search-filter-results').at('li').at('.title a').attr('href')
}

Save and run your scraper file to make sure all is well. This is a good time to do another git commit to save your progress.

Now you’ve written a scraper to get information about one member of Australian Parliament. It’s time to get information about all the members on the first page.

Currently you’re using page.at('.search-filter-results').at('li') to target the first list item in the members list. You can adapt this to get every list item using the search() method:

page.at('.search-filter-results').search('li')

Use a ruby each loop to run your code to collect and print your member object once for each list item.

page.at('.search-filter-results').search('li').each do |li|
  member = {
    title: li.at('.title').inner_text.strip,
    electorate: li.search('dd')[0].inner_text,
    party: li.search('dd')[1].inner_text,
    url: li.at('.title a').attr('href')
  }

  p member
end

Save and run the file and see if it collects all the members on the page as expected. Now you’re really scraping!

You still don’t have all the members though: they are split over three pages and you only have the first. In our next post we’ll work out how to deal with this pagination.

Posted in Morph

Who comments in PlanningAlerts and how could it work better?

In our last two quarterly planning posts (see Q3 2015 and Q4 2015), we’ve talked about helping people write to their elected local councillors about planning applications through PlanningAlerts. As Matthew wrote in June, “The aim is to strengthen the connection between citizens and local councillors around one of the most important things that local government does which is planning”. We’re also trying to improve the whole commenting flow in PlanningAlerts.

I’ve been working on this new system for a while now, prototyping and iterating on the new comment options and folding improvements back into the general comment form so everybody benefits.

About a month ago I ran a survey of people who had made a comment on PlanningAlerts in the last few months. The survey went out to just over 500 people and we had 36 responders, about the same turn-out rate as our PlanningAlerts survey at the beginning of the year (6% of 20,000). As you can see, the vast majority of PlanningAlerts users don’t currently comment.

We’ve never asked users about the commenting process before, so I was initially trying to find out some quite general things:

  • What kind of people are commenting currently?
  • How do they feel about the experience of commenting?
  • How easily do they get through the process of commenting?
  • Do people see the comments as a discussion between neighbours or just a message to council? or both?
  • Who do they think these comments go to? Do they understand the difference between the council organisation and the councillors?

The responses include some clear patterns and have raised a bunch of questions to follow up with short structured interviews. I’m also going to have these people use the new form prototype. This is to weed out usability problems before we launch this new feature to some areas of PlanningAlerts.

Here are some of the observations from the survey responses:

Older people are more likely to comment in PlanningAlerts

We’ve now run two surveys of PlanningAlerts users asking them roughly how old they are. The first survey was sent to all users; this recent one went just to people who had recently commented on a planning application through the site.

Compared to the first survey of all users, responders to the recent commenters survey were relatively older. There were fewer people in their 30s and 40s and more in their 60s and 70s. Older people may be more likely to respond to these surveys generally, but we can still see from the different results that commenters are relatively older.

Knowing this can help us better empathise with the people using PlanningAlerts and make it more usable. For example, there is currently a lot of very small, grey text on the site that is likely not noticeable or comfortable to read for people with diminished eyesight; almost everybody’s eyesight gets at least a little worse with age. Knowing that this could be an issue for lots of PlanningAlerts users makes improving the readability of text a higher priority.

Comparing recent commenters to all PlanningAlerts users

Age group | All users | Recent commenters
30s       | 20%       | 11%
40s       | 26%       | 14%
50s       | 26%       | 28%
60s       | 18%       | 33%
70s       | 5%        | 8%

There’s a good understanding that comments go to planning authorities, but not that they go to neighbours signed up to PlanningAlerts

Asked “Who do you think receives your comments made on PlanningAlerts?”, 86% (32) of responders checked “Local council staff”. Only 35% (13) checked “Neighbours who are signed up to PlanningAlerts”, and only one person thought their comments also went to elected councillors.

There seems to be a good understanding amongst these commenters that their comments are sent to the planning authority for the application, but not that they also go to other people in the area signed up to PlanningAlerts. They were also very clear in their belief that their comments did not go to elected councillors.

In the interviews I want to follow up on this and find out whether people are positive or negative about their comments going to other locals. I personally think it’s an important part of PlanningAlerts that people in an area can learn about local development, local history and how to impact the planning process from their neighbours. It seems like an efficient way to share knowledge, a way to strengthen connections between people and to demonstrate how easy it is to comment. If people are negative about this then what are their concerns?

I have no idea if the comments will be listened to or what impact they will have if any

There’s a clear pattern in the responses that people don’t think their comments are being listened to by planning authorities. They also don’t know how they could find out if they are. One person noted this as a reason why they don’t make more comments.

  • I have no real way of knowing whether my concerns are given any attention by local council.
  • I have no idea if the comments will be listened to or what impact they will have if any
  • I believe that the [council] are going to go ahead and develop, come what may. However, if I and others don’t comment/object we will be seen as providing tacit approval to Council’s actions
  • Insufficient tools and transparency of processes from Planning Panel.
  • I don’t feel I have any influence. I was just sharing my observations, or thoughts with like minded people who may. (have influence)
  • I do get the ‘Form Letter’ from Council but I’m not in any way convinced they listen.
  • The process of being alerted and expressing an opinion works well but whether it has any effect is doubtful.
  • Although councils do respond to my comments, it is just an automated reply. The replies from City of Sydney are quite informative but the ones from Marrickville pretty meaningless.
  • I am not in any way convinced anyone listens. A previous mayor stated he ONLY listens to people whose property directly adjoins the building site.
  • –I know it’s money that matters, not people

Giving people simple access to their elected local representatives, and a way to have a public exchange with them, will hopefully provide a lever to increase their impact.

I would only comment on applications that really affect me

There was a strong pattern of people saying they only comment on applications that will affect them or that are interesting to them:

  • I would only comment on applications that really affect me, don’t want to just restrict any application.
  • Not many are that relevant / interest me.
  • Sometimes it doesn’t feel like it is right making comments that don’t directly impact
  • I target the ones that are most important
  • Only interested in applications which either reflect major planning and development issues for the district as a whole (eg approval for demolition of old houses or repurposing of industrial structures) or which affect the immediate location around where I live.
  • I comment on those that affect my area
  • I only comment on applications that may effect my immediate area.
  • Comment on those that I get that are significant,ie: not on normal sheds,pools,dwellings etc.
  • only comment on ones that I feel directly impact myself or my suburb
  • I would only comment on an application, that adversely affected me or my community.
  • Not all relevant to me. Also don’t want to be seen as simply negative about a lot of the development
  • A lot are irrelevant to my interest.

How do people decide if an application is relevant to them? Is there a common criteria?

Why don’t you comment on more applications? “It takes too much time”

A number of people mentioned that commenting was a time consuming process, and that this prevented them from commenting on more applications:

  • Time – not so much in writing the response but in being across the particulars of DAs and being able to write an informed response.
  • Not enough time in my life – I restrict myself to those most relevant to me
  • Time poor
  • It takes too much time, but one concern is that it generates too much paper and mail from the council.

What are people’s basic processes for commenting in PlanningAlerts? What are the most time consuming components of this? Can we save people time?

I have only commented on applications where I have a knowledge of the property or street amenities.

A few people mentioned that they feel you should have a certain amount of knowledge of an application or area to comment on it, and that they only comment on applications they are knowledgeable about.

How does someone become knowledgeable about an application? What is the most important and useful information about applications?

Comment in private

A small number of people mentioned that they would like to be able to comment without it being made public.

  • Would like an option to remain private on the internet – eg a “name withheld” type system.
  • Should be able to make comments in confidence ie only seen by council, not other residents
  • I prefer not to have my name published on the web. The first time I commented it wasn’t clear that the name was published.

Suggestions & improvements

There were a few suggestions for changes to PlanningAlerts:

  • Should be able to cut and paste photos diagrams, sketches etc.
  • I was pleased that the local council accepted the comments as an Objection. But it was not clear in making the comment that it would be going to the council.
  • There could be a button to share the objection via other social media or a process to enforce the council to contact us.
  • Some times it is hard to find a document to comment on if I don’t know the exact details, The search function is complex.

Summing up PlanningAlerts

We also had a few comments that are just nice summaries of what is good about PlanningAlerts. It’s great to see that there are people who understand and can articulate what PlanningAlerts does well:

  • PlanningAlerts removes the hurdles. I hear about developments I would not have otherwise known about, and I can quickly provide input without having to know any particular council processes.
  • Its an efficient system. I’m alerted to the various viewpoints of others.
  • Because it shares my opinion with other concerned people as well as council. Going directly to council wouldn’t share it with others concerned.

Next steps

If we want to make using PlanningAlerts an intuitive and enjoyable experience, we need to understand the humans at the centre of its design. This is a small step to improve our understanding of the type of people who comment in PlanningAlerts, some of their concerns, and some of the barriers to commenting.

We’ve already drawn on the responses to this survey in updating wording and information surrounding the commenting process to make it better fit people’s mental model and address their concerns.

I’m now lining up interviews with a handful of the people who responded to try and answer some of the questions raised above and get to know them more. They’ll also show us how they use PlanningAlerts and test out the new comment form. This will highlight current usability problems and hopefully suggest ways to make commenting easier for everyone.

Design research is still very new to the OpenAustralia Foundation. Like all our work, we’re always open to advice and contributions to help us improve our projects. If you’re experienced in user research and want to make a contribution to our open source projects to transform democracy, please drop us a line or come down to our monthly pub meet. We’d love to hear your ideas.

Posted in PlanningAlerts.org.au