Eight Things Cities Can Do Today to Generate Evidence and Outcomes

Tips from the Behavioral Insights Team

What Works Cities
7 min read · Oct 16, 2017

By Sasha Tregebov and Elspeth Kirkman

A graphic illustration of coauthor Elspeth Kirkman’s talk on behavioral science at the 2017 What Works Cities Summit

Since What Works Cities began in 2015, the Behavioral Insights Team (BIT) has conducted over 60 low-cost evaluations across more than 25 cities. Using insights from the behavioral sciences, we have helped make an impact on government priorities ranging from police recruitment to water bill collections to access to healthcare. You can read more about our work in this report.

We have learned that cities can quickly and inexpensively apply behavioral insights and test those nudges through randomized controlled trials (RCTs). RCTs are a great evaluation method because they provide the highest-quality evidence about the effectiveness of changes to public policy and public administration.

Based on our experience, there are eight things that cities can do today to start applying behavioral insights and implementing low-cost RCTs.

1. Do your communications pass the “flip test”? Not sure whether your communication is getting the key message across? Turn it face down. Have a colleague flip it over. If the action residents need to take is not obvious to them within five seconds, it’s time to test a new version! People have limited attention, and most of us don’t read everything sent to us from start to finish. Make sure the action people need to take is unmissable.

Example: BIT worked with Louisville to improve collection of parking citation revenue. One of the key features of the new letter we designed was a “pay now” stamp that signaled exactly what action was required.

2. My name is not “Dear Resident”! Each day, we are all bombarded by organizations seeking our time and attention. It’s hard to stand out in this crowd and ensure that your city’s communications actually get residents’ attention. At BIT, we have found that simply using residents’ first names instead of a generic greeting can make a real difference. Instead of “Hello resident” or “To whom it may concern,” try “Hi Elizabeth.”

Example: BIT was working in the UK to increase the number of job seekers showing up for mass hiring events. Text messages that used the job seeker’s first name were more effective than those that were not personalized.

3. Shorten your URLs, and take us right where we need to go. Identify the letters and other non-digital communications your city sends to residents and businesses that include a prompt to take action online. Make sure those communications include a short URL that goes directly to the webpage where the action should be taken (e.g., your payments portal). It is easy to shorten a URL with a variety of free URL shorteners. Removing minor “friction costs” from a process, like typing in a long URL or clicking through multiple links, can have a real impact.

Example: BIT worked with South Bend to design a postcard encouraging applications to the police force. We used the bit.do website to shorten the URL with the application information from https://police.southbendin.gov/get-involved/start-career-sbpd to www.bit.do/SBPDserve.

By sending postcards at the right moment, Portland increased use of its bike-sharing program, BIKETOWN. (Photo credit: Eric Fischer/Flickr)

4. Leverage the “fresh start” effect. Habits, especially bad habits, are notoriously difficult to break. However, it’s a lot easier to change habits when you’re making big life changes like moving to a new home or starting a new job. It can even be easier at arbitrary moments that feel like a “fresh start”; the start of a new year or month, perhaps. Cities should take an inventory of underutilized services whose uptake requires residents to change their habits. Common examples include recreation programs as well as public transit or bike-sharing programs. Then conduct targeted outreach promoting these programs to residents who have recently moved.

Example: Portland was looking to increase take-up of its bike-sharing program, BIKETOWN. Postcards promoting the program were sent to two groups: residents who had recently moved, and residents who had a new bike station recently installed in their neighborhood. We found that people who had recently moved were more than four times as likely to respond to the postcard and take up the service.

5. A/B test your emails. In an email “A/B test,” you randomly select email recipients to receive one of two or more different versions of an email. Then you track differences in how people react to those emails — whether they open them, click on links, sign up for services, etc. A/B tests using digital platforms are the simplest way to get into doing low-cost, high-quality randomized controlled trials. Ask your IT department or your vendor if you’re able to easily run A/B tests on your emails. Most platforms can!

Example: BIT was working with the City of Denver to encourage more people to use its Pocketgov platform for resident services and information. We used an A/B email test to figure out what message would be most effective in promoting the new ability for users to pay their annual license plate fees online. We found that the ironic message “I’d rather be waiting at the DMV during the holidays, said NO ONE EVER” was more effective in generating clickthroughs than the more traditional message “Pocketgov.com is giving you the gift of time this holiday season.”
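The random assignment behind an email A/B test can be sketched in a few lines. This is a minimal illustration, not the actual platform or data from the Denver trial; the recipient list and function name are made up:

```python
import random

def assign_variants(recipients, variants=("A", "B"), seed=42):
    """Randomly assign each recipient to one message variant.

    A fixed seed makes the assignment reproducible, so you can
    audit later exactly who was sent which version.
    """
    rng = random.Random(seed)
    return {r: rng.choice(variants) for r in recipients}

# Hypothetical recipient list for illustration
emails = ["alice@example.com", "bob@example.com",
          "carol@example.com", "dan@example.com"]
groups = assign_variants(emails)
```

Your email platform likely does this assignment for you; the point is simply that each recipient's variant is decided by chance, not by anything about the recipient.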

6. A/B test everything digital. We can run A/B tests on webpages the same way we run them on emails: develop two versions of a page and see how user behavior changes based on which version visitors land on. Your website provider might have this option built in; if not, an online tool such as na.gg can randomly send visitors to one of several versions of a page from a single URL. This is particularly useful when you are pushing residents to websites through social media.

Example: BIT worked with the National Health Service in the UK to increase organ donor registrations. Using A/B tests on a government webpage, we found that messages focused on reciprocity (e.g., “If you needed an organ transplant, would you have one?”) were most effective. If this message were to be used over the whole year, it would lead to approximately 96,000 extra registrations.
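Once an A/B test has run, you need to judge whether a difference in clickthrough rates is real or just noise. A standard way is a two-proportion z-test; here is a minimal sketch with illustrative numbers (not the actual data from the NHS trial):

```python
import math

def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
    """Normal-approximation z-test for a difference in clickthrough rates."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)  # pooled rate under the null
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Made-up example: 320 clicks out of 10,000 vs. 250 out of 10,000
z, p = two_proportion_z(320, 10000, 250, 10000)
```

With these illustrative counts the difference is statistically significant at conventional levels; with smaller samples the same gap in rates might not be.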

7. Monitor for blockages. When you promote the use of an online service, make sure you can track users through their journey. By looking closely at how many emails are opened, how often links are clicked, and who actually signs up for the service, you can identify any major attrition points early and improve the experience for the user in those moments.

Example: In San Jose, we found new emails drove 8.6% of employees to start an application for a subsidized travel scheme, but only 5.6% completed the application. When we looked at the impact on actual ridership, we found no difference between those who received the email and those who did not. Now we know that the application process could be made easier and that going through the process of applying doesn’t necessarily translate into ridership.
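Tracking a user journey like San Jose’s amounts to computing the share of users retained at each step, so you can see where people drop off. A minimal sketch with made-up stage counts:

```python
def funnel_rates(stage_counts):
    """Given ordered (stage, count) pairs, return the share of users
    retained at each step of the journey."""
    rates = []
    for (prev_name, prev_n), (name, n) in zip(stage_counts, stage_counts[1:]):
        rates.append((f"{prev_name} -> {name}", n / prev_n))
    return rates

# Hypothetical funnel for illustration
funnel = [("emailed", 1000), ("opened", 430),
          ("started application", 86), ("completed", 56)]
rates = funnel_rates(funnel)
```

The step with the lowest retention rate is your biggest attrition point, and the first place to simplify.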

8. Get random. The most critical (and most misunderstood) part of any RCT is the randomization. For an RCT to produce valuable results, the groups receiving each “treatment” and the “control” group must be statistically identical. The easiest way to accomplish that is to randomly select who is in each group using a computer program. If you can’t do this easily, think about whether any of the information you already have is as good as random. For example, if you know that account or case numbers are assigned sequentially (not based on any characteristics of the case or account), you can be reasonably confident they’re close to random. That means, if you’re testing the effect of different types of letters, you can tell your systems to send version A to everyone with an odd account or case number and version B to everyone with an even number. Some simple checks in Excel will help you work out whether the numbers are truly as good as random. For example, if you’re looking at payment accounts, you can check whether the average balance is the same for odd- and even-numbered accounts to confirm you’ve picked a random variable.
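The balance check described above works just as well in a few lines of code as in Excel. A minimal sketch with made-up account data:

```python
from statistics import mean

def oddeven_balance(accounts):
    """Compare average balances for odd- vs. even-numbered accounts.

    If assignment by account-number parity is as good as random, the
    two averages should be close. `accounts` maps account number -> balance.
    """
    odd = [bal for num, bal in accounts.items() if num % 2 == 1]
    even = [bal for num, bal in accounts.items() if num % 2 == 0]
    return mean(odd), mean(even)

# Hypothetical accounts for illustration
accounts = {1: 100.0, 2: 110.0, 3: 120.0, 4: 90.0}
odd_avg, even_avg = oddeven_balance(accounts)
```

If the two averages (and other characteristics you care about) are similar, parity is a reasonable stand-in for random assignment; if they differ markedly, pick another variable.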

Example: The Behavioral Insights Team is working with Gresham to increase compliance with code violation courtesy notices and help residents avoid civil penalties. We are testing whether a new notice with a clearer call to action and simplified content is more effective than the current notice. Gresham has automated the randomization by writing simple code to send even-numbered cases the current version and odd-numbered cases the new version.

Sasha Tregebov is a Senior Advisor at BIT North America. Elspeth Kirkman is the Head of BIT North America.
