Introducing the biggest Planning Alerts redesign in fourteen years

We originally launched Planning Alerts in October 2009. That feels like a lifetime ago.

In that time all sorts of features have been added, there have been visual redesigns, and we’ve introduced many little improvements and fixes. Yet nothing we have done before feels as substantial and meaningful as what we’re launching today. 

On the face of it, the changes you see may seem like just a visual redesign. In fact, they’re the result of thorough research and deep thought. We think we’ve created a really beautiful and usable design that is so much clearer and more satisfying to use.

While the new look might be a little jarring at first, we think you’ll come to appreciate it quickly.

After so much work, we’re just so happy to finally share it with you!


This is what we are aiming to achieve with the new Planning Alerts:

  • Simple: Clean and uncluttered interface
  • Extremely legible: Easy-to-read text across all devices
  • Accessible: Big type and high contrast
  • Welcoming
  • Human: Small touches of whimsy to remember we’re people
  • Not government-y: Distinct and friendly design
  • Trustworthy: Improved transparency, without compromising ease of use
  • Respectful space: Inviting clear, respectful interactions
  • Transparent: Showing a little more of the process under the hood, so that people can interact with the service more productively, know what it’s doing, and be more confident that it’s doing what they expect

Simple, Extremely legible and Accessible

The new site has big legible type across all screen sizes with good colour contrast everywhere. While no substitute for manual accessibility testing, we’ve also added automated accessibility testing as part of our test suite. This allows us to make changes more confidently, knowing that we’re less likely to inadvertently reintroduce some old accessibility problem.
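As an illustration of the general idea (not the actual Planning Alerts test suite, which has its own tooling), an automated accessibility check can be as simple as loading a page in a browser, injecting the axe-core script, and failing the test if any violations come back. Here’s a minimal sketch using Selenium with the axe-selenium-python bindings; the URL is a placeholder.

```python
# A minimal sketch of an automated accessibility check, not the actual
# Planning Alerts test suite. Assumes selenium and axe-selenium-python are
# installed; the URL is a placeholder for a locally running copy of the site.
from selenium import webdriver
from axe_selenium_python import Axe

def test_home_page_has_no_accessibility_violations():
    driver = webdriver.Firefox()
    try:
        driver.get("http://localhost:3000/")  # placeholder URL
        axe = Axe(driver)
        axe.inject()         # inject the axe-core script into the page
        results = axe.run()  # run the accessibility rules against the rendered page
        # Fail if any violations (e.g. insufficient contrast, missing labels) are found.
        assert len(results["violations"]) == 0, axe.report(results["violations"])
    finally:
        driver.quit()
```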


Human Touches and Respectful Space

You’ll find illustrations of people as a subtle reminder that you’re in good company. There’s a symphony of other people here. People around Australia, people in our community and people working behind the scenes in government bureaucracy all read our comments and connect with our words. And of course we, the people running the service, do too. This is a service for everyone, so when you make a comment, remember not everyone looks or thinks the same way as you. We all deserve to be part of a respectful dialogue helping to shape the built environment around us all.


Welcoming and Not Government-y

Colours! They’re a little idiosyncratic and we love them. They’re warm but not overbearing, cute but not cutesy, diverse but coherent… we think it would be hard for you to think this was a government service.


Transparent and Trustworthy

Trust us, we have made some changes aimed at increasing transparency and trustworthiness. For example, when you write a comment, from now on you’ll preview the full email before it’s sent to council, instead of Planning Alerts magically delivering it to the planning authority. This is so you can double-check what you’ve written and better understand any privacy implications. Importantly, you will also see more clearly exactly what will be sent to council and what’s publicly displayed on the site.

One of the many things we learned in the research is that we’ve embedded a lot of our values and ethics into decisions that have very specific outcomes for how we present planning information. However, we haven’t always spelled these out directly. We are now working to ensure these values are clearly expressed not just in our hearts but also on our web pages and in our communications. One key example is our commitment to protecting your privacy.

Also, we realise the mechanics of the service haven’t always been clear to new users. So we’ve added a “How it works” section on the home page.


This is Just the Beginning

This redesign is just the start of a new stage for Planning Alerts.

While we think this is a more polished redesign than we have ever launched before, inevitably there will be rough edges or things that just aren’t working as they should. So if you find something that seems weird or wrong or could be better, please share your thoughts with us. We really depend on you sharing your experience to help us focus on better meeting your needs.




The Australian Taxation Office (ATO) has revoked OpenAustralia Foundation’s Deductible Gift Recipient (DGR) endorsement

On Thursday November 16, the OpenAustralia Foundation received the following letter from the Australian Taxation Office.

In it they said “Based on the information we examined during the review, we have determined that The Foundation is not operating as a public library to be entitled for endorsement as a DGR under items 1 & 4 of the table in section 30-15 of the ITAA 1997.”

We do not agree with their decision.

We are talking to our lawyers about our next steps. We will have more to say soon.

Additionally they sent this one today.

Kat and Matthew


They Vote For You – There’s something wrong with Peter Dutton’s voting record!

A screenshot of an erroneous They Vote For You voting record, showing Peter Dutton as a supporter of a constitutionally enshrined First Nations Voice in parliament. The screenshot has a large error symbol overlaid on it.

You may have seen this screenshot showing Opposition Leader Peter Dutton as a supporter of a constitutionally enshrined First Nations Voice in parliament. If this looks wrong to you, it’s because it is. Several votes on the Constitution Alteration (Aboriginal and Torres Strait Islander Voice) 2023 were erroneously connected to this policy, when they were actually votes on whether to have a referendum on the Voice (not whether to have the Voice at all).

Once it was pointed out that Mr Dutton was now showing up as a Voice supporter, the issue was corrected. First, the existing policy was put into draft mode and its text changed to highlight the inaccuracy. This made sure that subscribers to the policy would get a notification that there was a problem.

Next, two new policies on the subject were created that better reflected the votes we have available: one specifically on whether to have a referendum on the Voice and one more generally on implementing the Uluru Statement from the Heart.

Then we made a post on our social media about the problem and what we did to fix it (see Facebook, Instagram and Mastodon).

This isn’t the first time that a mistaken policy connection has happened on They Vote For You. See, for example, this blog post from 2015 on an issue with Andrew Wilkie’s voting record.

Since the policy connections are done manually, it’s inevitable that some errors will occur. So if you do see something that doesn’t look right – such as Mr Dutton showing up as a supporter of a policy he is actively campaigning against – then please let us know so we can get it sorted as soon as possible.


Journey to Improving Planning Alerts: A Service Designer’s Perspective – Part 3 Toolkit for Service Improvement

Guest post by Service Designer and Researcher, Sabina Popin

This blog post is the grand finale in our three-part series on the recent Planning Alerts Service Improvement project. I kicked off the series with a bird’s eye view of the project, and the sequel dove deeper into the approach, research, insights, opportunities, and concepts. Now, let’s shift the spotlight onto the nitty-gritty – the outputs of our work and the fate of these insights.

Oh snap we created Service Principles!

In the process of identifying key opportunities for service improvement, we generated How Might We statements: a method of creative questioning that can then be used as prompts for generating ideas.

While we did indeed use them to help us generate our short-term concepts for improving the service, to our surprise these statements also echoed some profound truths about how the service should operate. At first I wasn’t entirely sure what to do with these statements, conscious of not creating more artefacts that might add noise and confusion for the team. By being open about this with the team, it became clear that some much-needed Service Principles were taking shape: a set of aspirational instructions that would support design, development, onboarding of new staff and collaborators, and decision making that aligns with the core intent of the Planning Alerts service.

We dived headfirst into the details of these principles, which shone light on what the Planning Alerts team implicitly believed about the service. Through this process we found that some earlier principles crafted to support Development Design now had a new home – right next to Service Design principles. Three sets of principles emerged:

  • Experience principles: How we want the service to feel. 
  • Purpose principles:  Core values and intentions of the service.
  • Delivery principles: The “how” of our service delivery.

The emergence of the Service Improvement Toolkit

During the research synthesis process I plotted the user needs onto the different stages of the user journey to understand people’s needs in the context of where they occur when using the Planning Alerts service. This was the moment a new kind of map formed – a map of service improvement opportunities. We found gold in this all-in-one map that encompasses insights about individual, organisational, and planning authority needs, opportunities for improvement, and ideas for those opportunities. Usually, these elements would need different maps.

All-in-one Service Experience (Improvements) Map

A couple of factors led us to believe that such an all-in-one map was the right thing. 

At Planning Alerts the key decision makers are also the same people who are implementing the decisions. They need a tool that’s a Swiss Army Knife – determining strategic focus, stirring up solution discussions, and being a nest for future feedback.

Designing this jack-of-all-trades artefact posed quite a riddle for me. With so many functions, it risked becoming an overwhelming info dump, leaving it gathering dust. I shuddered at the thought of our hard work and precious insights being lost in the labyrinth of a poorly designed artefact.

This called for some good old-fashioned, back-to-basics information design hierarchy, but most importantly it called for ongoing collaboration with the team to ensure they knew the artefacts back to front and developed a sense of ownership that would allow them to continue using them.

Triaging new feedback

During the map creation process, we realised that the team needed a way to differentiate between issues that could be addressed right away and those that needed further analysis. As valuable as GitHub is for tracking immediate tasks, it’s not as well-suited to issues requiring extended discussion and evaluation.

And there’s no lack of feedback — with the new user account features, we have more comments, more blog posts, more emails, and likely even more feedback as we implement new changes. So, we were faced with the question: What should we do with these new issues?

We needed a way for the team to organise these incoming pieces of feedback, opportunities and ideas, and to know how to use them to build on the needs that have already been uncovered and are housed in the Service Experience Map. This led us to experiment with a triaging process for any new issues or feedback that cannot be actioned immediately, so they don’t get lost.

OK, these sound great, but how do we make this work?

“We need a guide, some ‘how-to’ instructions!” was the consensus. And so, the toolkit concept was born. It will be an experiment, of course; we can’t guarantee it will be as beneficial as we hope. This led to an agreement on continued engagement: I’d work with the team once a week for at least the next three months, helping them iterate and integrate the toolkit into their strategic and daily decision-making processes.

Where to from here?

Short-term improvements and GitHub

First up, we prioritised some concepts that could be implemented right away to address recurring issues. These small changes have the potential for a big impact. The Planning Alerts team already uses GitHub, so I logged these ready-to-go issues there. Integrating this process was extremely valuable; it helped me understand the level of detail required for a concept to be actionable and, more importantly, how to weave user insights into the final design and features of a concept. I was watching the insights come alive.

Next, the idea is for the team to act on prioritised concepts, determining what they want to track and measure before implementing. This will allow them to evaluate new features against these metrics and make improvements based on feedback.

Ongoing strategic decision making

Having the support of myself or another service designer on an ongoing basis will help ensure the artefacts in the Planning Alerts improvement toolkit become a regular part of strategic conversations and decisions about the service.

Over time, they’ll need to continue working through the Service Experience (Improvements) Map, making improvements at both the incremental and the bigger-picture level. Part of my ongoing engagement will involve helping them cultivate a habit of using the Miro board and artefacts for strategic conversations. Additionally, they’ll need to start incorporating medium- and long-term improvements into the toolkit to make it an integral part of these strategic discussions.

Having a service designer on the team on an ongoing basis will help them refine their service principles, integrate artefacts into the onboarding process for new staff and collaborators, conduct ongoing research with users to understand evolving needs, and work with council and planning authority staff to better understand their needs and potential improvement opportunities.

Unique Challenges Require Unique Solutions

The chance to continue working alongside the team beyond the project and become true partners in the work is the greatest highlight for me. While it’s a service designer’s nightmare to see valuable insights go unused at the end of a project, the dream is to be there from the start — from research, strategy, and ideation through to implementation and beyond.

So, faced with unique challenges, the Planning Alerts team had the flexibility and willingness to try new approaches. This spirit of improvisation and adaptability was a key asset in the success of this project, and a powerful reminder that no team is too small to make a substantial impact. Size and resources are only part of the equation – collaboration, creative thinking and resilience play a crucial role in the success of any service design project.


Journey to Improving Planning Alerts: A Service Designer’s Perspective – Part 2 Research, Testing and Insights

Guest post by Service Designer and Researcher, Sabina Popin

This post is part of a three-part series about the recent Planning Alerts Service Improvement project. In the previous post I outlined the entirety of the project in brief. In this post you can read in more depth about the epic journey we’ve been on to uncover opportunities for improvement, the insights and learnings we gained along the way, and finally the concepts that motivated the changes you will eventually see implemented on Planning Alerts.

From Email Inbox Research Insights to Concepts

The research for this project was done in two phases: email inbox data analysis, followed by in-depth interviews with the people who use the service. An earlier two-month inbox study generated hypotheses and initial insights about people’s needs. For this project, I extended the exploration to about a year’s worth of inbox emails – thousands of emails (February to December 2022). This amount of data, coupled with the length of time analysed, was more than enough to give us a better understanding of how many people are affected by similar issues. It also helped in identifying the biggest pain points and moments of delight for both staff and the people using the service. From the analysis of this data, categories of needs emerged, which I then mapped along the Planning Alerts service journey stages to better understand the context in which they were occurring.

The biggest pain points for support staff and people interacting with the service

The Planning Alerts team works hard to be supportive and human in their responses to everyone who writes to them. Canned responses are used for some email types, which helps, but they often require adjusting to fit each enquiry. However, with only one support staff member servicing the whole of Australia, the individual handling and sorting of emails, and the time and thought needed to respond to them, mean that some emails get missed or replied to late. On top of this, working through the sometimes emotive or complex community issues raised in comment reports can be emotionally taxing, and requires skill, focus, empathy and time.

As for people using the service, I observed that the issues causing the most frustration were when either alerts or the service didn’t work as they expected, for example missed alerts, or when they found information to be wrong or missing. In addition, many were under the impression that when they contacted Planning Alerts they were getting in touch with their local planning authority.

Uncovering opportunities

Once I had some key insights to share with the team, it was time to roll our sleeves up and really kick off the magic of the design process. I worked closely with the Planning Alerts team in a series of regular collaborative working sessions. These sessions were fast-paced, messy and iterative. We would simultaneously play back research findings, gain further insights, and generate ideas to tackle the issues at hand. It was a sizable departure from the typical service design approach, where sessions are more structured and separated into specific activities. It was also high energy seeing ideas form straight out of insights; it gave us the necessary momentum to keep chipping away. This unique all-in-one approach was made possible because the people delivering and implementing the service were also the decision-makers. They could dive into the nitty-gritty details while also maintaining a high-level perspective.

Opportunity questions

Through our collaborative sessions we identified several key opportunity questions to help us generate ideas for improving the user experience on Planning Alerts. These questions guided us towards imagining solutions to common concerns and challenges people face. They gave us the necessary structure to sketch out ideas and form concepts:

  • How might we support people to express their needs in a way that is relevant and focused, so that they are taken up by council?
  • How might we support people to self-serve for known issues so that they feel empowered and don’t need to email in as often?
  • How might we ensure people are never left wondering about the reliability of the service?
  • How might we encourage people to be respectful of each other, so that there is opportunity for cooperation?

How do we prioritise what improvements to make?

We were fast approaching in-depth interviews with people who use the Planning Alerts service. At the same time we were grappling with a puzzle: we had a plethora of ideas, ranging from small UX tweaks to broader, long-term strategies like partnering with authorities to improve comment reception. We wanted to cover this broad range of ideas, and we also needed to prioritise actionable tasks for the coming year or quarter. Based on the key pain points from the email inbox study and support staff experience, we chose a few key opportunities and ideas to prototype and put in front of people. This would help us learn more about their needs and test our improvement hypotheses. This pivotal moment allowed us to focus on short-term goals while keeping an eye on the bigger picture.

I then combined and refined the ideas that came out of this idea generation activity into concepts, sketching them out digitally to make them approachable and ready to test.

  • Understanding how the service works: A step-by-step “How it works” section for the home page
  • Getting help on the site: A dedicated help and support space on the Planning Alerts site that contains helpful articles, FAQs and a contact-us form
  • Getting help by contacting us: A contact-us form with links to helpful articles and guidance on how to ask for help
  • Getting help by replying to an alert: An automated email sent to people who accidentally reply to an email alert, with help articles and links to support pages
  • Proactive information when not receiving alerts: An email notification sent to people who have not received alerts in some time, letting them know why that might be the case
  • Commenting clarity: An improved comments section that draws attention to the important parts of a comment submission
  • Comment guidelines: Guidelines to help people write comments that have a better chance of being accepted by planning authorities
  • Comment preview: A built-in comment preview function to encourage people to confirm what they want to say and to understand how the comment will appear on Planning Alerts and what will be sent to the planning authority
  • Report comment categories: A reporting feature that adds a few more categories to comment reporting, to support people to make a conscious choice about why they are reporting the comment

While the inbox gave some great insights, I still had a lot of questions. The inbox was useful in creating a base for what might be happening, but I was still making a bunch of assumptions. I was really looking forward to talking to people to get their feedback on our concepts and to better understand what was really going on for them – either busting those assumptions or validating them.

All-in-one research and testing interviews: a winning combo

First, a note on research participant recruitment

Next up, talking to real people! Early on I worked with the Planning Alerts team to identify the different groups of people who use the service to speak with. This included people who have alerts, people who had registered for the new user accounts (which the team was concurrently rolling out), people who have made comments on development applications in their area, as well as people who work for planning authorities in local and state governments. We wanted to hear more about them and their needs to bring life and nuance to our understanding of their experiences. Yet we also didn’t have long to bring the pieces together, as we were heading into the long summer holiday season.

Recruiting participants for research in larger organisations is usually done through a research recruitment agency, from a list of people who have purposely signed up to participate in all kinds of research. When research recruitment is done internally it often takes a lot of time, because it can be difficult to find people willing to participate even for a monetary incentive. I was shocked – or should I say pleasantly surprised – at how willing people who use the Planning Alerts service were to donate their time to participate in the research. We even heard a wonderful story where community residents shared with one of their elected councillors that the Planning Alerts research was happening, and she signed up to participate! WOW! This showed me how engaged the Planning Alerts community is, how much people genuinely care about what’s happening in their communities, and that there is immense opportunity for planning authorities to work together with Planning Alerts for the benefit of communities Australia-wide. Insert ‘smiling crying emoji’.

Gathering insights from people who use Planning Alerts

With the concepts and my trusty interview guide ready, I was all set for research and testing. I engaged with a total of 15 people in 60-minute one-on-one sessions – a mix of residents, community group members and council/planning authority staff who’ve used Planning Alerts.

The first 30 minutes of each session took more of an interview format, to understand their current experience of Planning Alerts and their needs and aspirations when it comes to engaging with the planning process. I structured the interviews in this way to ensure we could learn about their experience in general as well as get their views on our hypotheses about what could improve it.

The second part of the interview focused on testing the ideas using the sketches and wireframes we created – this meant getting feedback to help understand what in their view works and what doesn’t.

We learned a lot from people about their experiences engaging with the planning process and with Planning Alerts. These higher level insights support our understanding of the complexities of the entire planning ecosystem and where Planning Alerts can play a role in supporting community engagement in the planning process.  You can read these insights below.

As for more immediate learnings that can help inform the short-term changes we can make to improve the Planning Alerts service – well, we learnt a lot about that too. The participants’ feedback helped to validate that we were on the right track with the kinds of improvements we wanted to make: namely, supporting a smoother and more transparent commenting experience; giving clarity around who Planning Alerts is; being transparent about service coverage and interruptions; supporting self-service for known issues; and freeing up time for support staff to focus on responses that need more attention.

Participant comments and feedback helped us to refine and improve the concepts further. As you read this you may have noticed some of these changes rolled out already!

Lessons Learned

Part of the prioritisation was already done: we knew that the concepts we had taken out to test represented the key things to solve in the short term. But before we could be ready for implementation we needed to iterate the concepts based on feedback and flesh them out so that they had enough detail to be designed and developed. And lastly, yes, you guessed it, ANOTHER level of more granular prioritisation. What goes off the rank first, like now, this quarter?

I found this immensely satisfying; I so rarely get to be involved from the discovery and research all the way through to writing a detailed specification for design and development. The end of the final concept prioritisation truly felt like a moment of celebration for everyone. 

But that wasn’t the end

With the concepts now fleshed out and prioritised, I needed to help the team get all this useful work into formats and artefacts that they could actually keep using. I wasn’t about to let my service designer nightmare come true and let all the precious insights go to waste! We began to think of this suite of insights, needs, opportunities and concepts as a toolkit.

And toolkits need clear instructions. They need to be well organised and approachable, they should neatly house all you need for the task at hand. 

By treating the suite of insights, needs, opportunities and concepts as a toolkit, we ensured that the Planning Alerts team could continue to put these valuable insights to work, and maintain a human approach to their service.

What did we actually learn about people’s experiences with Planning Alerts?

Planning Alerts keeps communities informed

Staying informed is a key reason that Planning Alerts is valued by residents, particularly as planning authorities are not required to notify residents directly about all applications. With Planning Alerts, community groups and residents are able to form their own understanding of the shape of their suburb.

For the councils and planning authorities that participated in the research, Planning Alerts plays an important role in keeping their community informed when they are not able to, or when their own systems are harder to navigate. This can, however, cause additional work when the planning authority is not required to advertise an application but it is still picked up by Planning Alerts.

Easier to engage with Planning Alerts

Oftentimes Planning Alerts is seen as easier to engage with than council and planning authority portals. People appreciate being able to easily find a DA, comment on it, see other people’s feedback and be kept up to date on planning information in their area of interest.

Planning Alerts also makes it easy to share applications with the community via email and Facebook to discuss and rally together.

Confusion about who Planning Alerts is and what it stands for

Some people who use Planning Alerts think that it’s run by a local planning authority. This leads them to write to Planning Alerts with the assumption that Planning Alerts can provide them with information only a planning authority can, or to express frustration.

People want to be able to hold planning authorities accountable, and thinking that Planning Alerts is run by the government leads them to believe it doesn’t allow them to do that. There is a desire for Planning Alerts to bring clarity about who they are and what they stand for, and to emphasise that they are not connected to planning authorities.

Research participants expressed that emphasising the ethical, pro-democracy, independent, not-for-profit nature of Planning Alerts is important for building trust and for understanding the gaps of each planning authority.

Biggest pain points relate to perception of service reliability

People appear to get most frustrated with alerts not working, showing wrong locations/images or not being able to find additional information and documentation to fully understand the application.

When privacy is in question – for example, when comments are posted publicly – this can be a source of frustration for people, as they want immediate action and resolution.

People feel they need to understand the planning system to engage with it

People find the planning process, development applications and their accompanying documents difficult to understand. The more people engage with the process, the more they learn. While some do their own research, others rely on more knowledgeable members of their community. Councils and planning authorities feel they are doing their best to support residents to understand the planning process.

Residents and planning authorities agree that developing an understanding of the planning process is important to be able to engage with it. They acknowledge the importance of engaging directly with planning authorities. Experienced residents have learnt that the next step beyond making a submission is to form relationships with council planning staff and to write to elected councillors and state representatives, ministers and shadow ministers.

Finding information on an application can be challenging

People looking for more information about an application have found the experience to be convoluted and time consuming. When they go to a council or planning authority website they often have to copy the DA number as some council links go to their homepage rather than to an application directly. Once they get there, the documents are hard to understand for anyone who is not well versed in the planning process.

People with more experience with the planning process feel that there is not enough information on Planning Alerts to understand an application fully, which leads some people to make comments based on little context.

Discussions and comments on applications

People appreciate the presence of other people’s comments. It allows them to get more information, to understand the experience of those in close proximity to the development. One community group that participated in the research is even using it to recruit new members.

People are actively sharing Planning Alerts links on Facebook, where there are discussions and feedback is shared. These discussions may not always make their way back to Planning Alerts nor to the relevant planning authorities. Some community groups however suggest and create guidelines for members to create their own submissions to planning authorities.

While other people’s comments are largely helpful, inflammatory and/or abusive comments do not go unnoticed. For some, seeing commenters ridicule or argue with others deters them from commenting; for others, it raises concerns about their own and others’ privacy.

Comments based on little information may not be valid

There is only a small amount of text in a Planning Alerts email alert and on the application page to describe the nature of the application. Many people do not go through the process of fully understanding the development application before they comment; they make comments based solely on what is written on the Planning Alerts page. This can lead to misunderstandings about the nature of the application and an increased number of comments that planning authorities may consider invalid or irrelevant to the application at hand. These comments ultimately have no set process on the planning authority side and may not get responded to.

People (both planning authorities and residents) have said that it’s important for the community to have enough of an understanding of an application to be able to create a valid response, without having to pore over the minutiae of all the documentation.

People sometimes make comments that they acknowledge are not a direct objection or approval; sometimes they want to provide important context for the community, or have questions and want more information. In addition, planning authorities have noticed that some comments are not related to planning at all, e.g. commenting on the nature of the person applying.

Engaging directly with council or planning authority

Residents who have experience with the planning process are tactical with their commenting, only commenting on what is really important to them. In addition to commenting through Planning Alerts, they will make a submission directly through the respective planning authority and contact them directly as well.

They cited that they have had more luck in getting a response when going directly to the planning authority.

Getting to know the Planning staff at council is helpful for getting information or forming understanding about the development process. It provides a direct contact to turn to about issues and a better chance to make change.

People wonder what happens to comments and why they don’t get responded to

People want to understand what happens to the comments once they are sent to the planning authority. There’s a strong sentiment that planning authorities and councils rarely respond or get back to people. This discourages people from commenting.

Some participants have suggested that council could respond directly to comments publicly to express why something has been resolved the way it has. To continue to feel motivated to engage and take action, people need to see a response from their local planning authority or state member. Research participants have shared that it seems to them the best way to get a response is to engage with councillors and state representatives directly.

Volume and validity of comments from Planning Alerts are challenging for Planning authorities

There is a much higher volume of comments that comes in from Planning Alerts compared to Planning authorities’ own channels. It’s challenging for planning authorities to distinguish between valid responses and non-valid ‘chatter’.

Ensuring comments are going to the relevant people at planning authorities and are being submitted in the appropriate format is essential for them to be responded to and counted as valid. For a comment or submission to be counted, planning authorities have a list of requirements. Depending on the authority, these may include:

  • being made within the submission timeframe
  • including a full name and address
  • specifying approval or disapproval, the reasons why, and how the application impacts the resident
  • being relevant to planning requirements (e.g. built form, landscaping quality)
  • being a unique submission, not copy-paste
  • being mailed directly to the planning authority, or submitted via a form on their site

How Planning authorities respond to comments

Planning authorities have different rules for responding to feedback, depending on the legislation. Most are only required to respond to valid feedback; non-valid feedback is recorded but may be lost and never responded to. Residents are not always aware of these rules. For some authorities, different volumes of feedback trigger different processes – e.g. a higher volume of feedback gets escalated.

Due to organisational limitations planning staff are not always well resourced. This means that comments that are not valid or relevant to the submission are recorded but not always responded to. They don’t always have the capacity to send them to where they need to go and some inevitably fall through the cracks.

There are also developments that don’t require consultation. When they receive comments from Planning Alerts for these applications, it makes the back end process labour intensive as they don’t have a process in place for dealing with unsolicited comments.

There are different rules around advertising in different jurisdictions. In some jurisdictions authorities are not required to advertise certain developments, yet they still get responses from the community through Planning Alerts. This adds extra pressure on them even when they’re not able to do anything about it.

What’s next you ask?

Well, for that you’ll have to tune into the next blog post, where I’ll go into detail about the artefacts we created, why we created them and how we intend for the team to use them on an ongoing basis. Also, I suspect you’re wondering what happens to my relationship with the Planning Alerts team – well, we thought of something interesting. Wait and see!


Journey to Improving Planning Alerts: A Service Designer’s Perspective

Guest post by Service Designer and Researcher, Sabina Popin

Welcome to the journey of improving Planning Alerts!

The OpenAustralia Foundation is a tiny organisation that gets a lot of email! Thousands of emails a year relate directly to the Planning Alerts service – people who use the service reaching out asking for all kinds of help. Currently they respond to each email individually, and as the service grows this is becoming harder and harder to do. As a team, they are always on the lookout for ways to improve their services in the most cost-effective way possible. They focus on making the smallest changes that have the most significant impact. However, in the case of Planning Alerts they struggled to differentiate between the problems people were facing when using the service and the issues they themselves wanted to fix. They wanted a process that provided a more systematic approach rather than relying on gut feelings alone, and that’s where my service design skills came in handy. They invited me to join the team for a few months to put my skills to good use in helping them find opportunities to improve the service – ultimately, to improve the experience for people who use the service and to enable staff to put their effort in where it matters most, so people can have a say on the issues that matter most to them.

In this three-part series, I will be sharing the team’s and my learnings from this project. In this post I’ll walk you through a summary of the entire project. In the second post, you’ll see how I conducted research and testing, and what insights emerged. Finally, the third piece dives into the project’s outputs and the team’s next steps. Put simply, you’ll get the lowdown on the entire process behind some of the changes you will see around the Planning Alerts service.

A Beginner’s Mindset in Service Design

While Planning Alerts had never used service design at this scale before, for me this felt like a comforting familiar flavour to how I’ve made my bread and butter for the past 10 years as a service and strategic designer. Except for a couple of small details… I knew zero, zip, zilch, nada about planning. I had also never worked within the civic tech space… in a charity… run by a teeny tiny team of two! Eeeep!

Sure, uncertainty, anxiety, and self-doubt may seem like red flags in other professions, but for us, they’re key ingredients for good design. In my work I’m often diving head first into a new world for each project. And, for a service designer, having a beginner’s mindset is a must-have. By approaching each project with an open mind, we can connect with people on a deeper level, eliminating assumptions and biases along the way. Most importantly, by approaching with fresh eyes, we’re primed to spot nuances and details that even the savviest subject matter expert might overlook.

Building on the work of others

My 3-month adventure with the Planning Alerts team was packed with insights gleaned over 6–8 weeks’ worth of work. As a newcomer to this space, I was thankful for the earlier work Jo Hill did with the team to understand what people were getting in touch about and why. This provided an excellent starting point: Jo Hill’s write-up and service blueprint gave a great overview of the service, while the categories and insights gleaned were an invaluable basis for mapping people’s needs. The results of that work were eye-opening for the team and came with a set of suggestions on where they might focus next, which set the stage nicely for my work.

Diving into the Deep End: Unearthing Insights, Ideas and Opportunities

I dug into about a year’s worth of Planning Alerts inbox emails (February to December 2022). That’s thousands of emails from people. I then built on the previously created email categories and data analysis using a virtual wall built in the whiteboarding tool Miro. With this mapping exercise, I identified key insights and opportunities for improvement. I met with the team regularly, sharing insights from the inbox research, and together we simultaneously discussed the insights and generated ideas and opportunities over Zoom and screen sharing. This was made possible because the Planning Alerts team of two are both responsible for decision making and implementation. This means they’re able to hold both a strategic and a practical implementation view at the same time, and this is in fact how they regularly work. So while I normally wouldn’t do things in this all-in-one way, adapting to their natural way of working meant that multiple pieces coming together in parallel felt exciting and energising, and gave us all momentum.

Image of white boarding tool Miro with email categories.

But the inbox study alone wasn’t enough; we knew we had to bust or validate the assumptions we had made. So to ensure we didn’t get stuck answering the wrong problems, or miss some giant elephant in the room, we interviewed people who use the service. We asked them about their experiences using Planning Alerts and the planning process in their area more broadly. In addition, we created prototypes of intended changes to the service and brought these along to test out.

After synthesising and analysing the interview data came another round of team sharing, reviewing, prioritising, and organising ideas into buckets of what is desirable, feasible, and viable to create in the short, medium, and long term. Together we unearthed some of this project’s treasures during epic Zoom sessions, including a series of vital service principles.

Creating Artefacts to Support Decision-Making and Action

The next step was to honour the treasure we found by creating outputs of different kinds (artefacts) that the Planning Alerts team could use to support their decisions to make practical improvements to the service. These included a set of current state key insights and a Service Improvement Toolkit, which comprehensively document everything we learned about what people who use Planning Alerts need and their pain points, and outline service improvement opportunities in a Miro board.

We agreed that the artefacts I created would need to be super practical. The Planning Alerts team wanted to use them both to support their decisions on how to improve the service and to keep them on track and accountable. I turned the quick wins and high-priority opportunities for improvement into GitHub issues so that they’re immediately part of the usual working environment, ready for action.

In the next post, I will share more detail about the research and testing, the juicy insights that emerged, and the exciting new features that we prioritised for implementation. And trust me, you do want to see how this delicious sausage is made!


They Vote For You: The rebels of 2023 (so far)

Back in 2016, I wrote a post about the state of rebellion in our parliament. At that time, rebellions were quite rare. How things have changed! We now have several rebels in parliament, mostly from the Liberal Party.

What does it mean to be a rebel?

A rebel voter is someone who crosses the floor to vote against the rest of their party. They may do this in the interest of their electorate/state, out of personal principle, as a protest, or maybe they just fell asleep when the division was taking place and accidentally ended up on the wrong side of the room (see Nickolas Varvaris, the former MP for Barton).

Note that independents can’t be rebel voters because they have no party to rebel from.

Why are there so many rebels these days?

First, some context: while rebellion became rare during the 90s and 00s, particularly under the Howard Government, that wasn’t always the case. In other words, the current situation looks less odd when compared to the pre-Howard years.

That said, we are seeing a lot of rebels right now, which is likely due to many factors. The instability of leadership in both our major parties since the Rudd-Gillard and Abbott-Turnbull governments is probably playing a role. As is the rising political power of independents since they held the balance of power during the second Gillard Government. Certain matters have also become increasingly serious (climate change, housing) or increasingly politicised (COVID vaccination, transgender rights), which means more representatives will be motivated to take a stand on them.

Who’s rebelling in 2023?

Here’s a rundown of what’s been happening so far in 2023. Note that the MP for Calare Andrew Gee (formerly Nationals) and Victorian Senator Lidia Thorpe (formerly Greens) have not been included as they are now both independent.

Starting in the House of Representatives, there’s MP for Bass Bridget Archer (Liberal), who continues to cross the floor on several issues. So far this year, she has rebelled in support of housing affordability and housing availability in regional areas and to vote against the cashless debit card (or indue card) system.

Onwards to the Senate, and in alphabetical order, we have South Australian Senator Alex Antic (Liberal), who crossed the floor to support a United Australia Party motion to create an inquiry into ‘excess deaths’ in 2021 and 2022.

Then we have New South Wales Senator Andrew Bragg (Liberal), who crossed the floor to support transgender rights.

Finally, there’s Queensland Senator Gerard Rennick (Liberal), who crossed the floor to support a One Nation Party motion for the Minister to provide Australian Federal Police documents relating to a GoFundMe page called ‘Supporting the SASR family’.

New South Wales Senator Hollie Hughes (Liberal) also appears to have rebelled in one division, but as the only other Liberal senators present for the vote were Senators Antic and Rennick (both mentioned above), it may be more accurate to say that it was actually Antic and Rennick who were rebelling again. That division related to vaccine mandates and giving legal protections to those who refuse to get vaccinated.

Other current MPs and senators who have rebelled against their parties in the past include (in alphabetical order): Simon Birmingham (Liberal Senator), Richard Colbeck (Liberal Senator), Jane Hume (Liberal Senator), Marise Payne (Liberal Senator), Barbara Pocock (Greens Senator), Dean Smith (Liberal Senator), Ross Vasta (Liberal MP) and Jess Walsh (Labor Senator).


New! Three easy steps to a better PlanningAlerts

Photo by David Clode

Do you use PlanningAlerts and have an email alert set up? Do you realise you’ve never set a password for an account? Have you ever found unsubscribing from an alert not entirely obvious or needed help to confirm a comment or update your email address for an alert?

Good news! From now on you’ll be able to make an account! Once you’ve confirmed your email address, simply log in any time to manage your alerts, email settings and any comments you make, all in one place. Phew!

Don’t worry, you can still search for development applications and see all the usual details on the website without logging in. You’ll only need to log in to set up your email alert or create a comment. All the usual details are freely and openly available at planningalerts.org.au as before.

When you next get an email alert you’ll be asked to “activate your account”, which involves these three simple steps:

  • Step 1: Confirm your email address
  • Step 2: Click on the link in the email “Activate your account”
  • Step 3: Fill in your name and password

And hurrah! You’re in! 

What? And why now? 

Back when we first launched PlanningAlerts, we wanted the sign up for email alerts to be as quick as possible with the fewest clicks standing between you and what you want to know. We also thought it was rude to ask people to sign up for an account just to see what your service did before you had a chance to look around. So we decided early on we would avoid accounts if we could. We could, so we did. In fact by avoiding user accounts altogether we could take a couple of steps out of the process entirely.

Times have changed! People now expect to log in to online services and update their details themselves. Using an email and password is now completely standard, and password managers are even built into browsers to help us stay sane while needing unique passwords for many day-to-day services. In fact, it now seems plain weird that PlanningAlerts didn’t allow this.

Avoiding accounts has helped keep PlanningAlerts super easy to use. However, this approach has come with limitations.

We also hear from some of you that it can be tricky to update or change an email alert or unsubscribe and there’s been no way to manage your alerts in one place. You had to wait for an email alert and follow a link in the email. With a user account you can simply log in, review your details any time, and make changes yourself.

Also there’s been no connection between different alerts and comments that the same person has. Over time we see that this can lead to confusion and also extra effort.

As the number of people using the service has grown so has the amount of support we have to give people for common problems such as these. In the spirit of giving you access to tools and information you need to stay informed on issues that matter to you in your area, it’s also important that we don’t do something for you that you could easily do for yourself.

Easier for you

You’ll find it easier to 

  • Sign up for extra alerts – no need to confirm each one
  • Change your email address – just confirm the new email address once and any alerts you have are then sent to your new address
  • See any and all the alerts you currently have set up in one place
  • See any comments you’ve made in one place
  • See when each alert was last emailed to you

Tiny changes, big impact, or, How it will make things easier for us to support your community

At PlanningAlerts it’s important to us that you can contact us directly with your questions. Frankly, though, we’d like to see fewer of the more mundane support requests in our inbox. This will give us greater capacity to focus on better supporting people through the planning process.

As a tiny organisation we think we’re most effective when we figure out small changes that can have the biggest possible impact. For PlanningAlerts that means giving you control over your alerts and comments. So we’ve taken the time to focus on improving your experience of this local information service over the last few months. By providing you information and tools to take care of your ongoing information needs, we figure it might be that much easier to connect with the big picture; we even suspect you may start to feel a growing sense of confidence that you have a part to play in shaping your neighbourhood.

Some of what’s happening under the hood

So now that you know why we’re doing this, I’ll outline how we’ve been rolling out the change. It includes some technical details.

There are two cases when PlanningAlerts asks people to confirm their intention by following a link sent to their email address: setting up a new alert and making a comment. Previously, each new alert or comment had its own discrete record of the email address attached to it, and each alert and comment needed to be individually confirmed.

The first step in this migration was to attach each alert and comment to a user record. Where there is no user record associated with an email address, we create one in a special un-activated state where someone can’t actually yet log in.

Then, any new alerts or comments are similarly automatically hooked up to these otherwise invisible user accounts.
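To sketch the idea (this is illustrative Python, not the real migration, and names like User and activated are invented rather than the actual Planning Alerts schema), the backfill boils down to a small piece of logic applied to every existing and new alert or comment:

```python
# Illustrative sketch only: the real change is a database migration against the
# actual Planning Alerts schema, so every name here (User, activated, ...) is invented.
from dataclasses import dataclass

@dataclass
class User:
    email: str
    activated: bool = False  # un-activated accounts exist but can't be logged in to yet

def attach_to_user(alert_or_comment, users_by_email):
    """Attach an alert or comment to the user record for its email address,
    creating an un-activated user record if none exists yet."""
    email = alert_or_comment["email"]
    user = users_by_email.get(email)
    if user is None:
        user = User(email=email)       # created silently; no password, no login yet
        users_by_email[email] = user
    alert_or_comment["user"] = user
    return user

# Applied to every existing alert and comment during the backfill,
# and to each new one as it is created.
users = {}
attach_to_user({"email": "jane@example.org", "type": "alert"}, users)
attach_to_user({"email": "jane@example.org", "type": "comment"}, users)  # reuses the same user
```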

Then, we started rolling out changes to production that are only visible to people who are logged in. They don’t need to put in their email address to sign up for alerts and comments. They also don’t need to confirm their email address (because it’s already been confirmed in the user account setup). Phew!

The advantage of rolling the changes out in this way is that we can do it step by step. We don’t have to make one giant change to the service which touches many many aspects of how it works.

Then, the next stage is that to create an alert or make a comment, people now have to log in. We hide this behind a “feature flag” in the application, so again we can roll it out to production without changing anything for current users and have confidence we’re not breaking anything.
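As a rough sketch of what a feature flag like this amounts to (all names below are invented; the real implementation lives inside the Planning Alerts application), it is essentially a conditional check that can be switched on later without deploying different code for different people:

```python
# Invented names throughout; a sketch of the feature-flag idea rather than the
# actual Planning Alerts implementation.
FLAGS = {"require_login_to_create_alert": False}  # switched off in production at first

def feature_enabled(name):
    return FLAGS.get(name, False)

def new_alert_page(current_user):
    if feature_enabled("require_login_to_create_alert"):
        # New behaviour: only logged-in people can create alerts, and their
        # email address is already confirmed via their account.
        if current_user is None:
            return "redirect to login"
        return "show new alert form"
    # Old behaviour, unchanged for everyone while the flag is off:
    # ask for an email address and send a confirmation link.
    return "show email signup form"
```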

That’s better!

We are enabling this final step for everyone today. We’re excited about this change and we hope you will be too. 

Matthew Landauer and Katherine Szuminska

Posted in Announcement, Development, PlanningAlerts.org.au | 12 Responses

What are people contacting PlanningAlerts about?

Guest post by Service Designer and Researcher, Joanna Hill

Since first launching, PlanningAlerts has grown a lot, and the number of people writing to PlanningAlerts has crept up and up. As of February 2011 it had sent out just over 900,000 alerts (thanks Wayback Machine); by the time of writing (1 December 2022) PlanningAlerts has sent out a whopping 192 million alerts across Australia. Only a small percentage of people who receive all those alerts ever get in touch. Yet even that tiny fraction of responses now adds up to thousands of emails a year, and currently PlanningAlerts responds individually to each one.

As PlanningAlerts continues to reach more people, they had questions like: do we need to plan to provide the current level of support people need, or could some tweaks to our existing services reduce how often people need to reach out? They want to be available to connect when there’s clear value in responding personally, and not be swamped with emails. So they needed an independent view to help them better understand what people are getting in touch about and why. Insights from such research could help inform decisions about future directions the OpenAustralia Foundation could take to improve the service and connect with people directly where it most matters to them, so they can have a say on issues that matter to them.

What might that look like?

Approach

I was invited to join the team for a short stint. I’d be using my research and service design skills to look for opportunities to improve the experience for both users and staff. 

How exactly would I do that? Well, we would allow the best approach to reveal itself in time. The team at PlanningAlerts gave me a generous open brief to trust my instincts and my experience. 
An obvious starting point was to have a look at the PlanningAlerts inbox. I’d be able to see exactly what questions were motivating people to write in.

Some of the things I expected we might uncover were:

  • Where automation/canned responses might be helpful (and, just as importantly, where they wouldn’t be)
  • Inspiration for new features
  • Opportunities for improvements on existing features
  • An understanding of why people might be leaving the service

Researchers and service designers are rarely given access to such a rich and raw source of user data. Organisations can be pretty protective of their correspondence with customers. Someone in my role might, at best, be given a pre-approved snapshot or summary, which can limit our ability to understand users’ underlying needs, or at least make it a slower process.

Getting started. First dig into the inbox

I wanted to start by getting a quick first-pass impression of the correspondence, the ‘shape of the data’ we might say. That would tell me what analytical approach to take. So I dived in and started reading emails over a sample of one month. It was quickly apparent on first viewing that there was indeed a huge range of issues. It wasn’t like “oh! everyone was talking about one or two things”. It was everything from small technical hiccups to “I want this service to do more for me!”. On an emotional scale, the tone of the emails ranged from joy to rage.

Based on this, I knew I’d need an understanding of how the service worked from beginning to end, top to bottom. Then I could plot some of the correspondence to points in the service and start making some sense out of this spaghetti bowl of an inbox. So I asked the team for a ‘Service Blueprint’ or service map of PlanningAlerts. Nope, they didn’t have one. Right, I’d build one. 

Building a thing to learn a thing

I never build a Service Blueprint for fun. 

It’s rigorous and meticulous work. It also comes with the very real risk of getting lost in the weeds or, worse, falling in love with my beautifully colour-coded artefacts and losing sight of bigger things.
I wanna be up there in the crown of the tallest trees looking out across the entire forest thinking about the big, strategic issues. But the truth is I can’t do that unless I have a connection to the earth too. It’s a balancing act – existing between the tiniest details and the biggest of issues. I’ve made enough mistakes in my time to know where there be dragons, so I approached this exercise with a determined ‘means to an end’ philosophy. Google Sheets would do just nicely.


All I needed for now was:

  • Some columns – to mark out the stages of the service in chronological order
  • Some rows – to mark out the activities of the four main actors (1. the submission, 2. the public, 3. PlanningAlerts and 4. the council/planning authority)

To fill out the frontend experience (the parts visible to the user) I pretended I was a user (a legit technique thank you) and used the live service, documenting as I went. I then sat down with Matthew to document things that were happening in the backend processes that users couldn’t see.

Here’s a snapshot of the Service Blueprint showing its sections:

Categorising is a skill

Done. Phew. Came out alive and brain not melted. Now I could start categorising and plotting emails against the map. One bonus benefit of taking this approach was that I’d be able to capture a bit of insight based on volume – noticing just how many of the same requests were occurring. At this point I still didn’t know exactly what I would learn, but I knew it would come out in the wash if I trusted the process.
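
To show what ‘insight based on volume’ means in practice, here’s a tiny illustrative sketch of that kind of tally. The email categories and counts are invented, purely to show the shape of a volume count like the one described above.

    # Invented example data, just to illustrate tallying categorised emails by volume.
    from collections import Counter

    categorised_emails = [
        "I'm having technical trouble signing up",
        "I am no longer receiving alerts, what's going on?",
        "I'm having technical trouble signing up",
        "I want to see historical applications for an area/property",
        "I'm having technical trouble signing up",
    ]

    volume_by_category = Counter(categorised_emails)
    for category, count in volume_by_category.most_common():
        print(f"{count:>3}  {category}")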

Category creation itself requires great skill. Archivists, among others, know this well. If you do this bit without deep consideration, you can set yourself up to miss important learnings down the track. The job here was for me to unpack a user’s underlying need in each email, which isn’t always immediately obvious or stated upfront. I also needed to articulate it in fair and neutral language. This deserves a bit of time.

Here are some examples of categories I created:

  • I want to see historical applications for an area /property
  • I want to view/get alerts by council area
  • I’m having technical trouble signing up 
  • I want to know why this application is not on your website
  • I want to amend or add to my comment but don’t know how
  • What does “delivered to the planning authority” mean? 
  • Why isn’t my comment on the Council website?
  • I want to report a rude/racist/discriminatory comment
  • I am no longer receiving alerts, what’s going on?

Here’s a snapshot of the Service Blueprint showing the categories plotted along it:

Oh look, incidental value

At this point I did a bit of a shareback with the PlanningAlerts crew and discovered that, quite apart from any specific learnings, the organisation got a kick out of having their service mapped for the first time. We recognised this as a separate and highly valuable asset. Even without the emotional user content of a Customer Experience Map, this artefact can enable easier conversations about the service in lots of ways.


As a result, I took some time to do a slightly more polished and simplified version of the map. Upgrades included icons, numbering and arrows.

The simplified version looks like this:

Download the pdf

Ok but what did we learn?

After categorising two months of inbox enquiries, it was hard to ignore the story showing up in the volume spikes. While a characteristic such as volume is usually the domain of quantitative analysis, it did at least point me at where some of the biggest tension points might sit in the journey. Here are the three big volume points that generated a lot of email to PlanningAlerts.

Learning #1: Receiving an alert

Getting an alert from the service puts it front of mind, and users will often undergo an assessment of its worth

When a user (service subscriber) receives an email alert from PlanningAlerts regarding a new DA in their area, we know that generates a whole bunch of traffic to the website, but through the inbox analysis it also appeared to act as a trigger for a user to adjust their preferences with the service. The two biggest examples were asking to be unsubscribed (exiting the service) and asking how to change their search location (making the service more useful).

Learning #2: Publishing a comment

Only after a user publishes a comment can they assess if they’re happy with it

Another noticeable volume of enquiries occurred right after a user makes a comment on a development application. These were mostly requests for technical support to modify their comment. An early theory on what’s happening here is that once a user sees their comment published in the public domain in cold hard internet print, they immediately have feelings about wanting to modify it – be it to be more anonymous or to alter the tone or language.

Learning #3: Abuse reports

Abuse report messages are very different from everything else, and require much sensitivity in their handling

Following along chronologically, the third point in the journey where planningalerts.org.au receives a lot of email is when users send an abuse report to the service regarding a public comment. Of the three groups, this is by far the most fiery in nature: users are feeling an injustice has occurred and they want it to be addressed. The careful response these types of emails require from PlanningAlerts calls for a different skillset from the more technical support stuff.

What next?

From here, the next steps for taking this work further could look like:

  • Undertake more analysis, increasing the time span from two months to perhaps six, to level out the data and look for any missed email categories
  • Cluster and group email categories around user need
  • Take each cluster and reframe it as an opportunity
  • Tag and prioritise the opportunities against the organisation’s needs and agendas, e.g. quick wins, automation opportunities, deeper inquiry needed, partnership opportunities or encouraging greater debate

Posted in PlanningAlerts.org.au | Leave a comment

Senator for NSW Andrew Bragg threatens OpenAustralia Foundation with legal action

Liberal Party Senator Andrew Bragg has stepped up his campaign against the OpenAustralia Foundation. The senator has hired high profile lawyer Rebekah Giles to threaten legal action over how the OpenAustralia Foundation website They Vote For You presents his parliamentary voting record. 

They Vote For You takes the official voting record from Hansard and presents these publicly available records in a more accessible form so that voters can see how their elected representatives actually vote in parliament. As They Vote For You explains: “Forget what politicians say. What truly matters is what they do. And what they do is vote, to write our laws which affect us all.”

Senator Bragg, who serves as a member of the Liberal-National Party government, has objected to being recorded as having voted with his government colleagues on issues such as closing the gap, public school and university funding, increasing the Newstart Allowance, the Paris Climate Agreement and increased investment in renewable energy.

Senator Bragg, and some of his Liberal colleagues such as Wentworth MP Dave Sharma, have joined in a campaign against They Vote For You.

First, Senator for NSW Andrew Bragg wrote to the Australian Charities and Not-for-profits Commission (ACNC) attempting to deregister us as a charity. This happened late last year. We don’t know what specifically was said in that letter.

Then, Dave Sharma, MP for Wentworth, wrote, in a coordinated action, to the Australian Electoral Commission (AEC) to try to stop us by claiming that we were breaking the law by not displaying “authorised by” on They Vote For You. That complaint was dismissed by the AEC because “…They Vote For You’s communications did not appear designed to influence elections and therefore were outside the commission’s domain.”

The first we knew of either of these was when they were reported in the Sydney Morning Herald: “MPs call for ‘partisan’ political transparency site to lose charity status”.

Now, Senator Andrew Bragg has hired lawyers to threaten us with legal action for “misleading and deceptive conduct under Australian Consumer Law”. Furthermore they don’t want us to be able to openly discuss the issues because they have claimed that their legal threat is “PRIVATE & CONFIDENTIAL – NOT FOR PUBLICATION”.

The unfortunate thing is that this legal threat letter contains the first concrete and specific allegations which we can meaningfully respond to. Up until now it’s been months of mud-slinging in public from their position of power, claiming all sorts of ludicrous things while multiple attempts from us to invite a productive discussion with them have been met with silence.

It is our belief that the legal threat letter does not contain anything that is “private & confidential”. In reality it’s quite the opposite. It’s largely a discussion of the voting record of Andrew Bragg. Furthermore it is essential that discussions of the facts of politicians’ voting record happen in public. For that reason we have decided to publish a complete copy of the letter below.


Our legal advice is that Senator Bragg has no cause of action and we’ll revisit that legal question below, but first, let’s look at the body of their complaints.

Help They Vote For You remain trusted and independent

Responding to Senator Andrew Bragg’s complaints

(a) Voted generally against closing the gap between indigenous and non-indigenous Australians

While it is true that the votes mentioned “have no substantive effect on the legal rights and duties of Australians, nor to raise or lower funding or taxes”, they are still votes that took place in our parliament on a particular subject matter of interest to Australians.

We are only able to include votes that go to division on They Vote For You, as they are the only votes that are recorded in any detail in the official parliamentary record (which is where we get our data). That is, only division records tell us exactly who was in the room at the time and how each of those individuals voted.

We cannot find any divisions on the bills listed at [7] that would be relevant to this policy. The votes on whether to pass those bills were only recorded as votes ‘on the voices’ in the Journal of the Senate: the Aboriginal Land Rights (Northern Territory) Amendment (Jabiru) Bill, the Territories Stolen Generations Redress Scheme (Consequential Amendments) Bill 2021 and the Territories Stolen Generations Redress Scheme (Facilitation) Bill 2021. Note that the Corporations (Aboriginal and Torres Strait Islander) Amendment Bill has not yet been voted on in the Senate. We also cannot find any divisions specific to the funding listed at [8]. Therefore, these votes cannot be included in our record.

A solution we have recommended previously is for all votes to be adequately recorded in the official parliamentary record, which could be achieved most efficiently by introducing electronic voting to parliament.

(b) Voted consistently against increasing funding for public schools

When attaching a division to a policy, we mark it as either “strong” or ordinary (e.g. “Yes (strong)” or “Yes”). Divisions that actually change the law are “strong”, while divisions that are merely symbolic are ranked as ordinary. The two divisions you mentioned are ordinary and ranked accordingly. A division on an appropriation bill directly relevant to this policy would be ranked as “strong” and therefore carry more weight in the senator’s voting record. However, we cannot find any such division to add to this policy.

(c) Voted consistently against increasing funding for university education

While appropriation bills may be the only bills that impact direct funding arrangements, many bills impact funding arrangements indirectly, including those mentioned at [11], which is why they are included in this policy.

(d) Voted consistently against increasing investment in renewable energy

Low emission technologies are not renewable, which is why these two disallowance motions are included in this policy.

While appropriation bills may be the only bills that directly impact government spending, many divisions either indirectly impact funding arrangements or express an opinion towards funding arrangements, and so are included in our voting records.

We have not found any divisions specific to the funding listed at [16]. Therefore, they cannot be included in our record.

(e) Voted consistently against increasing the Newstart Allowance rate

While appropriation bills may be the only bills that directly impact government spending, many divisions either indirectly impact funding arrangements or express an opinion towards funding arrangements, and so are included in our voting records.

We have not found any divisions specific to the funding listed at [19]. Therefore, they cannot be included in our record.

(f) Voted consistently against the Paris Climate Agreement

Our parliamentarians may not control whether we enter the Agreement, but as our elected representatives they should be expected to have an opinion on the matter. These divisions express that opinion.

(g) Voted consistently against a carbon price

These motions expressed an opinion on the carbon price. Elected representatives are expected to have opinions on important matters of concern to Australians.

(h) Voted consistently against increasing legal protections for LGBTI people

These motions expressed an opinion on legal protections for the LGBTQI+ community. Elected representatives are expected to have opinions on such matters.

As far as we can see, no relevant divisions on the bills mentioned in [26] were recorded in the official parliamentary record, so cannot form part of our voting records. The only record of voting on the Religious Discrimination Bill and the Human Rights Legislation Amendment (we presume this is the bill you referred to as the Sex Discrimination Amendment) is in the Senate Journal, which noted that a first reading vote took place ‘on the voices’ before debate was adjourned. 

Regarding the subjects listed in [27]:

The new category called “We can’t say anything concrete about how they voted on” was created to take account of the fact that not all divisions are equal. 

Only divisions that make real legal changes are classified as “strong”, while the rest (including symbolic motions) are classified as ordinary. After exchanging emails with several staff members of elected representatives, we agreed that no policy position should be determined solely by a single ordinary vote. Instead, where a representative has only voted once in an ordinary division for a policy, their voting record now says “We can’t say anything concrete about how they voted on…”.
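
To make that rule concrete, here is a small illustrative sketch of the logic in code. The names and data structures are invented for this post; it is not They Vote For You’s actual implementation.

    # Illustrative sketch only: invented names, not They Vote For You's actual code.
    def can_state_position(policy_votes):
        """policy_votes: list of (agreed_with_policy, is_strong) tuples for one
        representative on one policy. "Strong" divisions actually change the law;
        ordinary divisions are merely symbolic."""
        if len(policy_votes) == 1 and not policy_votes[0][1]:
            # A single ordinary division isn't enough to attribute a position.
            return False
        return len(policy_votes) > 0

    print(can_state_position([(True, False)]))  # False: only one symbolic motion
    print(can_state_position([(True, True)]))   # True: a single strong division is enough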

We acknowledge the information you have provided regarding Senator Bragg’s position on these subjects, but our site runs solely on data drawn from the official parliamentary record. Without relevant divisions, we cannot make any changes to the record.

(a) We can’t say anything concrete about how they voted on increasing workplace protections for women

We agree that it is bizarre that divisions are not always taken on important voting matters, such as on whether to pass a bill. Unfortunately, this is common voting practice in our parliament.

Our record does not include a division on whether to pass the Sex Discrimination and Fair Work (Respect at Work) Amendment because there was no such division. That vote was taken ‘on the voices’ and so cannot be included on our site. The reason we cannot include votes ‘on the voices’ is that such votes are not recorded with any detail in the official parliamentary record – we do not know exactly who was in the room at the time of the vote, nor how each individual voted. In this case, the Senate Journal simply states, “On the motion of the Attorney-General (Senator Cash) the report from the committee was adopted and the bill read a third time.”

As mentioned above, a solution we have recommended previously is for all votes to be adequately recorded in the official parliamentary record, which could be achieved most efficiently by introducing electronic voting to parliament.

(b) We can’t say anything concrete about how they voted on increasing funding for road infrastructure

While appropriation bills may be the only bills that directly impact government spending, many divisions either indirectly impact funding arrangements or express an opinion towards funding arrangements, and so are included in our voting records.

We have not found any divisions specific to the funding listed at [33]. Therefore, they cannot be included in our record.

(c) We can’t say anything concrete about how they voted on increasing the foreign aid budget to 0.7% of Gross National Income

While appropriation bills may be the only bills that directly impact government spending, many divisions either indirectly impact funding arrangements or express an opinion towards funding arrangements, and so are included in our voting records.

(d) We can’t say anything concrete about how they voted on a constitutionally enshrined First Nations Voice in parliament

While a referendum would be required to actually amend the constitution, this division expressed an opinion on the matter. Elected representatives are expected to have opinions on such matters.

The legal threat

Now back to where they are threatening to sue us under Australian Consumer Law. For this we defer to Michael Bradley from Marque Lawyers, who is generously helping us pro bono. Here is the letter he wrote in response.

Where to from here?

I hope after all this it is more than obvious that there is no attempt by us with They Vote For You to deceive or in any way misrepresent the voting records of any politicians.

However, there are serious issues with the completeness of the parliamentary voting record that have been until now largely unknown, ignored and invisible to the general public. We need a complete record of all votes, one that lists how every single person votes on every single vote, whether “on the voices” or by division. This is the only way that a complete picture of every politician’s voting record will be available so that citizens can truly and fairly hold their elected representatives to account.

How can we make this more widely known? How can we work together so that parliament improves how they record all the votes?

Matthew, Kat and Mackay

Help They Vote For You remain trusted and independent

Posted in They Vote For You | 3 Responses