An open access publication of the American Academy of Arts & Sciences
Winter 2019

Participatory Design for Innovation in Access to Justice

Author
Margaret Hagan
Abstract

Most access-to-justice technologies are designed by lawyers and reflect lawyers’ perspectives on what people need. Most of these technologies do not fulfill their promise because the people they are designed to serve do not use them. Participatory design, which was developed in Scandinavia as a process for creating better software, brings end users and other stakeholders into the design process to help decide what problems need to be solved and how. Work at the Stanford Legal Design Lab highlights new insights about what tools can provide the assistance that people actually need, and about where and how they are likely to access and use those tools. These participatory design models lead to more effective innovation and greater community engagement with courts and the legal system.

MARGARET HAGAN is Director of the Legal Design Lab at Stanford Law School and a Lecturer at the Stanford Institute of Design. She has published in such journals as Virginia Journal of Law and Technology, International Journal of Human Rights, and Business Law International.

A decade into the push for innovation in access to justice, most efforts reflect the interests and concerns of courts and lawyers rather than the needs of the people the innovations are supposed to serve. New legal technologies and services, whether aiming to help people expunge their criminal records or to get divorced in more cooperative ways, have not been adopted by the general public. Instead, it is primarily lawyers who use them.1

One way to increase the likelihood that innovations will serve clients would be to involve clients in designing them. Participatory design emerged in Scandinavia in the 1970s as a way to think more effectively about decision-making in the workplace.2 It evolved into a strategy for developing software in which potential users were invited to help define a vision of a product, and it has since been widely used for changing systems like elementary education, hospital services, and smart cities, which use data and technology to improve sustainability and foster economic development.3

Participatory design’s promise is that “system innovation” is more likely to be effective in producing tools that the target group will use and in spending existing resources efficiently to do so. Courts spend an enormous amount of money on information technology every year. But the technology often fails to meet courts’ goals: barely half of the people affected are satisfied with courts’ customer service.4

No wonder: high-profile technology solutions, like a prospective, unified case-management system for California’s courts, which would have laid the groundwork for online, statewide court services, fizzled out after $500 million had been spent.5 In Alameda County, California, a new court case-management system, rolled out in 2016, resulted in many people being wrongly arrested and jailed, or forced to register as sex offenders.6 Other technologies have been built with the expectation that people without lawyers will use them to prepare for their court proceedings, but people do not use them. An Arizona project called “Computers that Speak of the Law,” for example, was intended to empower Navajo and Hopi communities through legal kiosks with satellite connections that would provide legal education and guidance, but the target communities did not use them.7

Most models for incorporating technology into the provision of legal services have similar characteristics. They are grounded on the notion that providing people with information about court processes will increase their capacity to navigate them. Yet they avoid customizing the information to specific people, out of concern that this might violate legal ethics rules about unauthorized practice of law, or court rules around neutrality. They are also aimed primarily at people who are already in the process of determining their legal options and getting legal tasks done.

Access-to-justice innovation projects usually consist of online guides and tools related to legal forms. The idea behind these projects is that more information and semiautomated tools will allow more litigants to navigate legal processes without lawyers. The most common types of offerings are: self-help websites describing legal processes; form-filling tools for preparing petitions, motions, and other court forms; and e-filing portals for submitting forms online rather than bringing them to court or mailing them. There are also more detailed types of guides, like videos, that walk a litigant through specific court processes.

These tools sometimes include physical access points, where people who do not otherwise have access to online resources can use them, such as kiosk workstations at libraries or other public institutions with computers and printers; or video conferencing support, with high-speed Internet and fax machines. These models aim to help people access legal information and connect remotely with lawyers. Increasingly, programs involve remote communication with legal professionals, in which a person can live-chat on a legal website; call an intake phone line or hot-line; or enter an online advice clinic to talk briefly about their issues. In most instances, this remote communication provides generic information about legal processes, but not about the substance of individual cases.

Finally, the legal profession has invested in back-office capacity to gather and make sense of information about the people using their systems, so that they can make better referrals and better manage their cases. This entails building shared case-management systems that can sync across organizations and jurisdictions, and allow more remote operations and interoperation. It also means wide area networks that improve the connectivity of offices, so they can use high-speed network capabilities to work.8

What these tools have in common is how they were developed: by legal-aid and court-administration groups and by lawyers driven by their own views about what will best engage the community. These tools were also usually funded by the same source: the Legal Services Corporation, a central funder and key supporter of technological innovation in access to justice, whose grant money goes to legal-aid groups staffed largely by lawyers.

By contrast, participatory design involves actively consulting and collaborating with a wide range of stakeholders in a system to understand their perspectives and incorporate their priorities into system innovation. Participatory design asks people who are meant to use a system’s services to help identify where it needs to be reformed, define what “better” operation would look like, and design new interventions to reform it. Teams of designers working with customers and professionals generate innovations that users rate as better than traditional innovations.9

One tool of participatory design is the envisionment design workshop. Service users, service providers, and design facilitators identify key problems of the current system, map out their experiences and ideas for improvement, and draft new concepts for possible implementation. These concepts crystallize hypotheses about how services could be improved, which other developers can use to draft more refined prototypes.

Another technique is the co-design jam. Co-design, or collaborative design, involves a mixed group of stakeholders working together to design a prototype and plan for its pilot implementation. The jam focuses on making something that can be implemented (rather than, for example, exploring a challenge area to draw out insights). A jam is similar to a hackathon: that is, a concentrated sprint of work in which a variety of collaborators and experts work together to develop and refine a new idea.

Different user-testing strategies work best at different points in the design process. Interview and focus-group testing is good for showing early-stage prototypes to target users and asking for their edits and additions. A design team may create an idea book – a catalogue of possible new interventions – and then have stakeholders rank the ideas. Another strategy is to show a prototype of a new solution and ask users to assess its usability and value, while evaluating their engagement with and comprehension of the prototype.

Some groups have pioneered structures for conducting user-testing on a regular basis. Smart Chicago – a nonprofit that works on improving civic software in the city – launched the Civic User Testing Group.10 The group recruits testers through advertisements in local libraries and community spaces, where it organizes focus groups and usability testing sessions. Participants are paid for their feedback. Blue Ridge Labs in New York City, affiliated with the nonprofit innovation group Robin Hood Labs, runs the Design Insight Group, which regularly tests new civic applications and start-ups.11 The federal government, in particular through the group 18F, also conducts user testing of new government websites and interactive services.12

In recent years, government and nonprofit labs have formalized a participatory-design approach to government services and policy-making, providing models for legal system professionals. There are several emerging types of labs: government-based policy labs, nonprofit social innovation labs, and living labs, which are often associated with universities and local governments.13 Government-based policy labs allow for a design-driven process to guide the development of policy, including stakeholder interviews and observations; prototyping and testing of new interventions or rules; and collaborative design with stakeholders.14 Social-innovation nonprofits run similar policy labs.15

Another structure is the living lab model, which has grown out of Europe and has spread throughout Asia, North America, South America, and beyond.16 A living lab entails ongoing policy discussions and prototyping in neighborhoods where potential users live. The labs identify a key future challenge or problem that a community faces, like environmental concerns, the development of smart interconnected city infrastructure, or the development of new food systems. The living lab hosts activities in which members of a core team, which might include policy-makers, designers, technologists, and researchers, interact with members of the community.

The team at Stanford’s Legal Design Lab, which I direct, is exploring ways that participatory design can be used in the civil justice system. The lab takes a prototyping approach to the research, and has tried several different models of gathering feedback, both on-site at courts and at the lab itself. For example, the lab has worked with the University of Denver Court Compass project to run a series of divorce redesign workshops with former litigants, court staff, and lawyers in Massachusetts and Iowa.17 These were envisionment workshops in which participants reflected on the processes and experiences they went through, and then generated new concepts for divorce rules and service changes. Participants placed a high value on simplifying court processes for filing and disclosure of financial information, and expressed a strong interest in an online tool that would provide procedural updates, automated forms, and filings in one coordinated pathway. The lab’s core design team will take these requirements and concepts into the next phase of co-design jams that will involve more technologists, professional designers, and policy experts to refine interventions based on what the former divorce litigants prioritized.

Workshops like these are resource-intensive. They require a core team of researchers and designers to establish partnerships with courts, recruit litigants for full-day participation, compensate them for their time, and work alongside them to create new designs. Trained design facilitators usually guide the process, though court staff could also be trained for this role.

Less resource-intensive methods include court user testing and agenda-setting sessions. These can involve activities with litigants who are in the courthouse waiting to access other services. In California’s Santa Clara County, for example, lab staff asked users of the court system what they believed the agenda for innovation should be. In an idea-ranking session, they evaluated different types of innovations (new products, services, organizations, or policies) based on the benefits to them individually and to their community. In a location-ranking session, they advised about where resources should be spent to make legal help resources more accessible.

In the agenda-setting activity, participants were presented with an array of cards, each describing one initiative that could make their experience of the legal system better. The initiatives came from an inventory of concepts and priorities gathered through previous research. Blank cards allowed participants to suggest new ideas.

The lab told participants to imagine that they had been hired by a civic foundation to spend its budget to make courts more user-friendly for people without lawyers. They were shown each of the ten concepts for “access to justice innovation,” one at a time, and instructed to rank their possible value. Then they were given cards for each idea and a grid with four categories: high value ($100,000 per idea), medium value ($50,000 per idea), low value ($10,000 per idea), and no value ($0 per idea). They could put a maximum of four cards in each category. For each idea ranking, as well as for the allocation/category game, the lab asked the participants to explain their thinking and to add any further ideas.
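As an illustration of how responses from this kind of allocation game could be aggregated, the following is a minimal sketch in Python. The data structure, idea names, and tallying logic are assumptions for illustration only, not the lab’s actual analysis method.

```python
from collections import defaultdict

# Dollar value attached to each category in the budget-allocation game.
CATEGORY_VALUES = {"high": 100_000, "medium": 50_000, "low": 10_000, "none": 0}
MAX_CARDS_PER_CATEGORY = 4  # participants could place at most four cards per category


def tally_allocations(responses):
    """Sum the dollar value each idea received across all participants.

    `responses` is a list of dicts mapping idea name -> category name,
    one dict per participant (a hypothetical structure for illustration).
    """
    totals = defaultdict(int)
    for response in responses:
        per_category = defaultdict(int)
        for idea, category in response.items():
            per_category[category] += 1
            if per_category[category] > MAX_CARDS_PER_CATEGORY:
                raise ValueError(f"more than {MAX_CARDS_PER_CATEGORY} cards in '{category}'")
            totals[idea] += CATEGORY_VALUES[category]
    # Rank ideas from the most to the least total funding allocated.
    return sorted(totals.items(), key=lambda item: item[1], reverse=True)


# Two hypothetical participants ranking three of the ten concepts.
example_responses = [
    {"personalized chat": "high", "virtual court": "medium", "kiosks": "none"},
    {"personalized chat": "high", "virtual court": "high", "kiosks": "low"},
]
for idea, total in tally_allocations(example_responses):
    print(f"{idea}: ${total:,}")
```

A tally like this simply converts category choices into a ranked spending list; the participants’ explanations of their thinking remain the more important output of the session.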

A similar exercise focused on possible venues for providing legal help resources. The ten locations used in the experiment included places in the community, like churches, schools, or transit centers, as well as digital platforms, like text messages or websites. Again, participants were asked to distribute limited resources to these proposed venues, and to think like a community leader when setting their agenda. Both sessions occurred within a twenty-minute time frame, with one facilitator talking with the participant and one taking notes. In these sessions, the most popular concept was personalized chats with lawyers, librarians, and staff via a mobile phone or website. The second-most popular concept was virtual courts, in which a person could appear before clerks, attorneys, or judges via videoconference or text. Remote, personalized services that would give more universal access to court services were championed as the highest priority. In large part, this was to solve logistical challenges of taking time from work, finding child care, and paying for parking and transit, as well as to prevent the wasted time of waiting in lines in person. The least valued ideas were kiosks and workstations, along with paper-based explanations of the court process.

For the location-ranking game, participants overwhelmingly chose public libraries as the best place to put legal-help information. They also prioritized Internet-based solutions on desktop computers, mobile phones, and smart-home assistants (like Amazon’s Alexa). They recommended much more investment in information that is available online and in its interactivity. Other community locations, like shopping malls, churches and other places of worship, and transit centers were less valued. They reasoned that these places were inappropriate for legal matters because they were for other purposes – whether commercial, spiritual, or family – and they doubted that people would engage with legal resources, workshops, or services there.

One overarching message from the participants was that they wanted a face-to-face feeling in their services, but did not need face-to-face experiences with staff. They expressed a hunger for personalized guidance tailored to their situation. This translated to a priority on technology that would allow a person to have a conversation with a court authority and accomplish tasks (like filing documents and scheduling appointments), and not just learn about the process in the abstract. Though many participants did not consider themselves technology experts, they prioritized technology-based solutions via web, text, or mobile applications. They were willing to learn more about the technology to get more prompt and remote services.

This prototyping exercise was a relatively inexpensive and low-barrier way to introduce public input into new technology options, form designs, and service offerings. Because of the large number of users of the court system who are usually waiting in the court for service, there is a ready population of people to participate in feedback and co-design sessions. It was relatively easy to recruit at least ten people every two hours to speak for twenty minutes at a time. Most respondents had been to the court several times before. They had expertise, insights, and frustrations about the system, which translated into thoughtful recommendations about court reforms. Participants expressed appreciation for being given an opportunity to talk about their experiences and hope that the system could learn from their input to be better for future litigants.

These kinds of research efforts can be scaled. Court staff, potentially in partnership with local law schools, for example, could conduct weekly court user testing at self-help centers, clerk’s offices, or other waiting areas at the court. The essential elements are a budget to compensate people for their feedback, a standardized recruitment and consent protocol, and a partnership with the director of the self-help center or other office.

The court and legal-aid community can embrace participatory design as a method for access-to-justice innovation. As other government and social agencies have increasingly done, the civil justice sector can experiment with community-led agendas for innovation efforts and better situate and launch new technologies and services. A participatory approach can ground new initiatives in an understanding of what the community will trust and use, so that legal professionals do not build new software or services that only they themselves will use. It can also ensure that the right problems are being solved, so that any new innovation addresses the kinds of problems that people most frequently face.

This work could combine intensive workshops and co-design jams, which require substantial planning and trained staff, with lightweight feedback and agenda-setting sessions. Participatory work can also employ online data and machine learning. Online reviews, comment sections, and social media posts can be scanned and mined for key frustration points with courts and legal-aid services. By collecting user feedback and ideas from platforms online, civil justice actors can learn where there is a high need for innovation and why people are frustrated by the justice system.
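As a rough illustration of this kind of mining, the sketch below counts frustration-related terms in a pre-collected set of review texts. The keyword list and sample comments are assumptions for illustration; a real project would more likely apply richer methods (topic modeling, sentiment classification) to actual collected data.

```python
import re
from collections import Counter

# Illustrative frustration-signal terms; a real project would derive these
# from qualitative coding of actual reviews and social media posts.
FRUSTRATION_TERMS = ["wait", "line", "confus", "form", "fee", "parking", "resched", "lost"]


def frustration_counts(reviews):
    """Count how often each frustration-related term appears across review texts."""
    counts = Counter()
    for text in reviews:
        words = re.findall(r"[a-z']+", text.lower())
        for term in FRUSTRATION_TERMS:
            # Prefix matching catches simple variants like "waited" or "confusing".
            counts[term] += sum(1 for word in words if word.startswith(term))
    return counts.most_common()


# Hypothetical public comments about a court self-help center.
sample_reviews = [
    "Waited three hours in line and the fee waiver form was confusing.",
    "Parking was impossible, and my hearing got rescheduled twice.",
]
for term, count in frustration_counts(sample_reviews):
    if count:
        print(term, count)
```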

Courts can also establish new units and services that lower the logistical burden of running these sessions. Courts and legal-aid providers could use existing services, like Amazon’s Mechanical Turk, an online crowdsourcing marketplace often used to recruit study participants, to recruit members of the public to answer questions, give reviews, or test out new software before the organization invests in a new solution. Like city governments, courts could also build civic panels, in which they recruit existing and recent litigants to join a prescreened list of people to participate in design. Panel members might be trained to become innovation leaders themselves, taking on the management of setting the innovation agenda and piloting projects. Finally, legal organizations might invest in gathering their own data about users’ (or potential users’) behavior in the current system. As other government policy labs have begun to do, they can measure analytics from their websites, the numbers of people who have problems completing certain tasks, the flow of people through their buildings, and other markers of people’s behavior in their system. These data points can show where the system is failing, whether because the rules are too complicated, the paperwork too hard to get right, the website not easy enough to use, or the building too hard to navigate.
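To make the analytics idea concrete, here is a minimal sketch of computing how many users reach each step of an online task (for example, a fee-waiver form) from exported event logs. The event format, step names, and data are hypothetical, not drawn from any particular court’s analytics system.

```python
def completion_rates(events, steps):
    """Report how many distinct users reached each step of an online task, in order.

    `events` is an iterable of (user_id, step_name) pairs, such as an export
    from a web analytics tool; `steps` lists the task's steps in order.
    Both formats are assumptions for illustration.
    """
    reached = {step: set() for step in steps}
    for user, step in events:
        if step in reached:
            reached[step].add(user)
    started = len(reached[steps[0]]) or 1  # avoid dividing by zero
    return [(step, len(users), len(users) / started) for step, users in reached.items()]


# Hypothetical event log for a fee-waiver form: most users drop off before filing.
log = [
    ("u1", "open_form"), ("u1", "upload_income_proof"), ("u1", "submit"),
    ("u2", "open_form"),
    ("u3", "open_form"), ("u3", "upload_income_proof"),
]
for step, users, rate in completion_rates(log, ["open_form", "upload_income_proof", "submit"]):
    print(f"{step}: {users} users ({rate:.0%} of those who started)")
```

A drop-off report like this points to where people abandon a task, which is where observation, user testing, or co-design sessions can then focus.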

This kind of research allows legal organizations to be more intentional about how they spend resources. It empowers community members to help in deciding how funding, technology, and staff time are used in reforming the legal system. It can be a source of promising ideas for innovations and community partnerships, and it can harness stakeholders to help make the system work better for people it is supposed to serve.

Endnotes

  • 1Margaret Hagan, “The State of Legal Design: The Big Takeaways of the Stanford Law and Design Summit,” Medium, September 25, 2017 [LINK].
  • 2Tone Bratteteig and Ina Wagner, “What Is a Participatory Design Result?” in Proceedings of the 14th Participatory Design Conference, ed. Claus Bossen, Rachel Charlotte Smith, Anne Marie Kanstrup, et al. (Aarhus, Denmark: Participatory Design Conference, 2016), 141–150.
  • 3Betsy diSalvo, Jason Yip, Elizabeth Bonsignore, and Carl diSalvo, eds., Participatory Design for Learning: Perspectives from Practice and Research (New York: Routledge, 2017); and Simon Bowen, Kerry McSeveny, Eleanor Lockley, et al., “How Was It for You? Experiences of Participatory Design in the UK Health Service,” CoDesign 9 (4) (2013): 230–246. See the network of participatory designers working on this effort, for example at the conference “Right to the Smart City: Designing for Public Value and Civic Participation Symposium and Workshop,” Berkman Klein Center for Internet and Society, Harvard University, March 22–23, 2018 [LINK].
  • 4GBA Strategies, “Annual National Tracking Survey Analysis,” December 12, 2016 [LINK].
  • 5Michael Krigsman, “California Abandons $2 Billion Court Management System,” ZDNet, April 2, 2012 [LINK].
  • 6Cyrus Farivar, “Lawyers: New Court Software is So Awful It’s Getting People Wrongly Arrested,” Ars Technica, December 2, 2016 [LINK].
  • 7See the grant report at Legal Services Corporation, “Summary of TIG Awards 2000–2009,” October 19, 2016 [LINK].
  • 8The LSC-TIG report on past grants details the many legal-aid technology projects around back-office capacity-building. See ibid.; and their more recent inventories of funded projects at Legal Services Corporation, “Technology Initiative Grant Awards: TIG Projects Funded by Year” [LINK].
  • 9Jakob Trischler, Simon J. Pervan, Stephen J. Kelly, and Don R. Scott, “The Value of Codesign: The Effect of Customer Involvement in Service Design Teams,” Journal of Service Research 21 (1) (2017): 1–26. This study cautions that these results emerge when the teams have been coached to work collaboratively, rather than individualistically.
  • 10Daniel X. O’Neil, The CUT Group: Civic User Testing Group as a New Model for UX Testing, Digital Skills Development, and Community Engagement in Civic Tech (Chicago: Smart Chicago Collaborative, 2014).
  • 11Blue Ridge Labs, “Design Insight Group” [LINK].
  • 12U.S. Digital Service, “The Digital Services Playbook” [LINK].
  • 13See a lengthy report on various labs, with a focus on European work, at Kyriaki Papageorgiou, Labs for Social Innovation (Barcelona: ESADE Institute for Social Innovation, 2017).
  • 14This model has been developing since 2012 in governments in Finland, the United Kingdom, Chile, and elsewhere. Ayushi Vig, “The Rise of Design in Policymaking: In Conversation with Verena Kontschieder,” Medium, February 7, 2018 [LINK]. See more at M. Fuller and A. Lochard, Public Policy Labs in European Union Member States (Luxembourg City: Publications Office of the European Union, 2016); and Lucy Kimbell, Applying Design Approaches to Policy Making: Discovering Policy Lab (Brighton: University of Brighton, 2015), 1–43 [LINK].
  • 15Margaret Hagan, “A Conversation with Public Policy Lab on Their Government Innovation Work,” Medium, February 12, 2018 [LINK].
  • 16See the European Open Living Labs website for case studies and links to the network of living labs, most of which are working on sustainability and smart cities: European Network of Living Labs [LINK].
  • 17See more details at the project website: Institute for the Advancement of the American Legal System, “Court Compass” [LINK].