Incident Response Form

Summary 

In the spring of 2018 I drafted conduct guidelines and reporting policies for an educational organization. I also helped design the form they used to collect reports. This was a long, intensive process, and thankfully I had the resources to conduct research along the way. The policies and forms underwent multiple public comment periods and user experience studies, and I was able to adapt, iterate, and improve based on the feedback I received. It was also a great learning opportunity. While I do not have space to discuss the policies in depth, I'd like to talk about the form, because the design of the incident reporting form (IRF) presented an opportunity to think and learn through a design problem that involved conflicting requirements, a broad range of stakeholders, and a sensitivity to context.


The Problem 

Incident reporting procedures and forms are difficult to design. On one hand, submitting an incident report to an organization is an act of documentation that may worsen the harm a reporter has experienced. For this reason, many people choose not to report, even when the option is available. In our user testing, the difficulty or emotional toll of filing a report was frequently cited as a reason respondents had not filed reports with organizations in the past. In such a context, ease of use is an accessibility issue. As a designer, my immediate goal was to make sure reporters were not further harmed by a reporting process and form design that exacerbated existing systemic inequality.

However, in order to avoid perpetuating other systemic inequalities and biases, any reporting and conflict-resolution procedure must be robust, and that robustness potentially involves collecting a great deal of difficult information. The goals of having an accessible, easy-to-use incident reporting form and a form that collects sufficient data are always in tension. Privacy is a further concern: reporters often fear retaliation, and bias remains a constant concern for any conflict-resolution process. In response to these issues, permission management became another immediate concern for the design team I led.

With all of these considerations in mind, we were finally able to state our problem: 

How might we design an accessible reporting process that is both easy enough to use and robust enough to be sound? 

Use Cases

Composing the reporting policy and engaging in user research led me to identify several use cases for the Incident Reporting Form, each of which could be broken down along a few different dimensions. I quickly realized that not all questions would be relevant to all users, so I identified a list of requirements that would change the questions asked on the form. The length and content of the form could then be customized according to the needs of the reporter:

  1. Desire for anonymity 
  2. Desire to be contacted 
  3. Staff restrictions 
  4. Desire to submit materials or a narrative account at the time of the report

Some combinations of needs could not be met. Because the software the organization used to accept reports did not have robust permission-management features, privacy restrictions on reports involving staff had to be managed by staff directly and manually. To make sure no named parties were involved in the response process, staff had to process an initial report and get back to the reporter before accepting any narrative account or other information. It was therefore only possible to accept reports with staff-member privacy restrictions if the reporter left contact information. Similarly, reports with no narrative account or other information provided at the time of submission could only be accepted alongside contact information. Once the conditions for each type of report were written out and the states diagrammed, I could move on to the design of the form itself.
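The constraints above can be sketched as a small validation check. This is a hypothetical sketch, not the organization's implementation; the parameter names, the returned messages, and the rule that anonymity precludes follow-up contact are assumptions based on the description above.

```python
# Hypothetical sketch of the submission constraints described above.
# Names and messages are illustrative, not from the production system.

def unmet_needs(anonymous: bool,
                leaves_contact: bool,
                staff_restricted: bool,
                has_narrative: bool) -> list[str]:
    """Return the reasons a combination of reporter needs cannot be met."""
    problems = []
    # Staff could only honor follow-up-dependent requests if they could
    # reach the reporter; an anonymous report leaves no way to do so.
    reachable = leaves_contact and not anonymous
    # Staff viewing restrictions were enforced manually, so staff had to
    # process an initial report and follow up before accepting details.
    if staff_restricted and not reachable:
        problems.append("staff restrictions require contact information")
    # A report with no narrative at submission time also required follow-up.
    if not has_narrative and not reachable:
        problems.append("reports without an initial account require contact information")
    return problems
```

A combination that satisfies every constraint returns an empty list, while an anonymous report with staff-member viewing restrictions cannot be accepted, matching the limitation discussed in the reflection below.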

Designing the Form 

With accessibility and ease of use in mind, the IRF was structured around a key feature: modularity. Based on the answers to a small number of required questions, users would be shown only the parts of the form they needed for their particular use case. This meant less work per page and per user, because tasks were structured sequentially and unnecessary tasks were eliminated. The end result was a form with three primary paths, along with several additional sets of conditional questions.
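As an illustration, the modular branching might look something like the following sketch. The section names and screening answers are hypothetical, chosen to mirror the use-case dimensions above; they are not taken from the production form.

```python
# Hypothetical sketch of the IRF's modular branching. Section and answer
# names are illustrative assumptions, not the production form's.

def sections_to_show(answers: dict) -> list[str]:
    """Map a reporter's required screening answers to the form sections shown."""
    sections = ["incident_basics"]              # always shown first
    if not answers.get("anonymous", False):
        sections.append("contact_details")      # skipped for anonymous reports
    if answers.get("staff_restrictions", False):
        sections.append("restricted_staff")     # staff to exclude from the process
    if answers.get("submit_account_now", False):
        sections.append("narrative_account")
        sections.append("supporting_materials")
    sections.append("consent_and_permissions")  # always shown last
    return sections
```

Each reporter thus completes only the sections relevant to their situation, which is what kept the per-page and per-user workload down.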


Testing and Iteration

After the initial draft of the form was created, it was evaluated in four stages. The first was internal testing within the company and with a group of recruited participants. The form was then passed on to a series of legal and other subject-matter experts. In the next phase, versions of the form were sent out on a mailing list for producers of similar events. Finally, several additional rounds of user testing occurred with event participants and community members.

Two stages of the testing process yielded the most useful information: the initial survey responses from the first wave of testing and the later participant-observation of individuals filling out the forms.

The final design of the form was reached only after significant user testing. Users brought up a number of concerns. One tester stated that although they had taken classes in English, they would have an easier time submitting a report in another language; the organization subsequently decided to allow report submissions in languages other than English and set up the requisite process. In addition, a number of key changes were made to vocabulary. We found that word choices and question order played important roles in how users responded to the reporting process.

One important component of user testing was bearing in mind that the testing sample would, by necessity, differ from the audience filling out the forms in at least one way: they would not be using the forms in the immediate aftermath of a potentially harmful incident. This was taken into account when reviewing the user-testing data.

Documentation

Several forms of documentation for the form remain available. In addition to the gallery of images on this site, a demo of the Incident Response Form remains online, as does the Feedback Form.

Reflection

No one wants to have to use an incident report form. All the same, they can be an important part of running an ethical and accountable organization. While the implementation of the IRP ultimately proved to be a success, there are several changes I would make if I had to design a similar product. 

Due to budgetary constraints, the company I wrote the IRP for could not afford a software solution that maintained a more structured complaint infrastructure with specific access scopes for users, which meant some workarounds were necessary. Users filing a report that could not be seen by a particular staff member had to file a request visible to all staff members in order to start the reporting process, and then wait for a staff member to reach out. As a result, there was no easy way to submit a report with staff-member viewing restrictions using the form alone, and filing an anonymous report with staff-member viewing restrictions was impossible. In an ideal world, report collection would have taken place in a system with built-in access-scope management, and restricted reports could have been made directly through the interface.

Having an online reporting form presents issues of its own. In our research, we found that many respondents preferred giving accounts in person; they indicated that an online form or survey system often felt cold in comparison. However, there are significant barriers to in-person reporting, including logistical problems, problems with information retention and storage, and interpersonal complications. Despite the stated preference for in-person reporting, the organization found that the majority of reports nonetheless came in through the online system. I do not know whether the online form lowered the overall number of reports, but it is possible. Ideally, I would have had metrics on how many people quit the survey partway through, before submitting.

Thinking about such numbers, it is difficult not to think about the ways reporting forms could employ “dark design” to disincentivize reports, something that might be done out of a desire to save an organization trouble. Either way, reporting is a complicated process, and any form involved in it must be able to handle that complexity while remaining usable. Creating such an object, it turns out, takes a great amount of care, and is best done in close consultation with the community.
