04-2020
Improving the Synoptic Report Development Process
Synoptic reports are a structured method used by doctors for capturing patient data. We collaborated with members of Cancer Care Ontario (CCO) to help improve the development of these electronic reports, with the aim of improving the adoption of this method across healthcare facilities in Ontario.
Who I worked with
Max Romanoff - Designer
Fawzi Ammache - Designer
Adham Zaki - Developer
Daniel Yong - Product
Timeline
Sept 2019 - Apr 2020
8 months
What is Synoptic Reporting?
Within Ontario’s healthcare system, there is a lot of variability in how patient data is collected. Report styles vary not only across medical treatments, but also across healthcare facilities, and even across how individual physicians fill out the reports. While this variability isn't an issue for one-time visits, it becomes a problem for patients with chronic diseases, such as cancer. Synoptic reporting aims to solve this variability by having each healthcare facility use a single set of reporting standards. This leads to more accurate reporting, fewer omissions, and better knowledge transfer, all of which are crucial when dealing with chronic diseases.
Why is the adoption of this method so low?
To ensure that reports are concise, the CCO team requires a large panel of doctors, called the Working Group, to go through an iterative voting process to decide which questions to include. Between the size of the Working Group, the busy schedules of the doctors, and the number of contentious questions that need to be resolved, the development of a single report can take up to 12 months.
Multiple stakeholders are involved in the development of each report
We started to see that multiple people were involved in the process, so we decided to conduct in-depth interviews with the CCO team to find out more. I worked on putting together a research plan to uncover the goals and pain points of the individuals we’d heard about: the CCO team and the clinicians.
After identifying who was involved, we worked together to map out the journey of developing a single report. The process would be kicked off by members of the CCO team as they gathered the relevant stakeholders and other requirements. The second phase would be an iterative back-and-forth between the Functional Lead and the clinicians to arrive at a finalized list of questions. Towards the end, a member of the CCO team would take that list and encode it into XML, which would then be built into a digitized report.
Though we identified these 3 primary phases of the journey, we saw the most opportunity within the latter 2. Overall, the existing process was very manual and decentralized, with multiple opportunities to reduce the number of file types and the amount of manual effort involved.
Reducing the back and forth
Looking at our current-state journey, we found some opportunities to reduce the back and forth between teams. Considering what we could achieve within our timeframe, we focused on removing the need for clinicians to save files and email them to the Functional Lead, and for the Functional Lead to send the checklist to a CCO team member to encode in XML.
We ideated around how a digital solution could address these problems, and deliver value to the CCO
team.
Centralizing where users create synoptic reports
Our hypothesis was that, by addressing the manual processes that the Functional Lead and the CCO
team have to go through, we would greatly reduce the amount of time it takes to create a Synoptic
Report. The 4 of us ran a Crazy 8’s exercise, with the following 3 themes in mind:
1. As the Functional Lead is currently responsible for transferring information between multiple file formats, we wanted to instead maintain everything on a single platform. This would keep everything in the cloud and resolve any issues around handling different file types.
2. As each clinician individually sends their results in their own file format, the Functional Lead must wait to receive everyone’s data, convert it, and then calculate the results. We wanted to automate this process, knowing that question lists could run into the hundreds. From here, we thought about connecting a digital voting screen for clinicians to an aggregated data view for the Functional Lead.
3. When the question list is finalized, it’s manually converted into XML by a member of the CCO team; the resulting XML file is then developed into a final report. We also wanted to automate this step and have the XML generated as the report is voted on (a rough sketch of this idea follows the list).
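As a loose illustration of the third theme, the sketch below builds a report’s XML directly from a finalized question list. The Question shape, element names, and toReportXml function are our own assumptions for illustration, not CCO’s actual schema or tooling.

```typescript
// Hypothetical sketch: generating report XML from a finalized question list.
type AnswerType = "text" | "checkbox" | "radio";

interface Question {
  id: number;
  prompt: string;
  answerType: AnswerType;
  options?: string[]; // only used for checkbox / radio questions
}

// Generate the report XML as soon as the Working Group finalizes the list,
// instead of having a CCO team member hand-code it afterwards.
function toReportXml(title: string, questions: Question[]): string {
  const escape = (s: string) =>
    s.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;");

  const items = questions
    .map((q) => {
      const options = (q.options ?? [])
        .map((o) => `      <option>${escape(o)}</option>`)
        .join("\n");
      return [
        `    <question id="${q.id}" type="${q.answerType}">`,
        `      <prompt>${escape(q.prompt)}</prompt>`,
        options,
        `    </question>`,
      ]
        .filter((line) => line.length > 0)
        .join("\n");
    })
    .join("\n");

  return `<report title="${escape(title)}">\n${items}\n</report>`;
}
```

For a text question with no options, the option elements are simply omitted.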
Iterating on our designs
We iterated on our sketches and put together an end-to-end experience that covered the Functional
Lead’s tasks of creating a checklist and viewing aggregated results, and the clinicians’ task of
voting on the questions.
From working with the Functional Lead, we knew that the reports required Text, Checkbox, and Radio answer types, in addition to nesting and logic for certain questions.
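A minimal sketch of how we thought about that question structure follows; the field names, the showIf rule, and the example content are illustrative assumptions, not the final data model.

```typescript
// Hypothetical sketch of a checklist question supporting the three answer
// types, nested child questions, and simple show/hide logic.
type AnswerType = "text" | "checkbox" | "radio";

interface ChecklistQuestion {
  id: number;
  prompt: string;
  answerType: AnswerType;
  options?: string[];             // only for checkbox / radio
  children?: ChecklistQuestion[]; // nested child questions
  // Show this question only when the parent question has a given answer.
  showIf?: { parentId: number; equals: string };
}

// Illustrative example: a child question that only appears for a "Yes" answer.
const example: ChecklistQuestion = {
  id: 1,
  prompt: "Was the procedure completed?",
  answerType: "radio",
  options: ["Yes", "No"],
  children: [
    {
      id: 2,
      prompt: "Additional details",
      answerType: "text",
      showIf: { parentId: 1, equals: "Yes" },
    },
  ],
};
```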
With the clinicians, we knew this group wasn’t tech-savvy, so our priority was to minimize the number of unnecessary interactions they had to go through with this new solution. We also tried to preserve their existing experience, so we added a comment box, as we heard this was something they commonly used in Word.
Usability Testing with Stakeholders
I created the test protocol for both the Functional Lead and the Working Group members. As this was a first look at our designs for both primary users, the in-test questions focused on “what would you expect to be able to do from this page?” rather than a task-based test. With the Working Group members, we also asked additional questions to understand whether their busy schedules hindered their voting.
Overall, our test demonstrated that users saw value in the web-app concept; however, we learned that the platform was missing a few crucial features. The Functional Lead mentioned that the voting results page was missing elements she would typically use to identify contentious questions, and the Working Group members mentioned wanting a save feature, as they often return to voting at a later time. One interesting finding we heard from both users was that it would be helpful to include instructions for the clinicians, written by the Functional Lead. We learned this was commonly done inside a Word doc and typically just mentioned the type of report and the number of questions.
We delivered the final web-app to the CCO team, incorporating the changes identified during testing. Overall, we focused on reducing clutter throughout the interface and adding features that users said were necessary.
Easily create and edit forms
Knowing that the Functional Lead was more tech-savvy than we initially thought, we cleared up the clutter from the original design. The round buttons for child questions now appear on hover over the parent question, rather than being shown at all times, and child questions are indicated simply through numbering, rather than the redundant combination of box, arrow, and numbering.
Share forms and instructions to multiple clinicians
As the Working Group can get large, we designed a text field allowing the Functional Lead to add clinicians’ emails directly in the modal. For instructions, we included a text box so she can add anything clinicians need to know right onto the voting page.
Vote, comment, and submit — or save for later
Instructions for each clinician were added at the top of the page, above the voting criteria, for easy visibility. Changing the background colours also made it much clearer that the right side is meant to be interacted with. We also designed a progress bar and save functionality so clinicians could return later.
Filter, sort, and view voting results
Based on the Functional Lead’s comments, we added clarity and relevance categories to help her better identify contentious questions. We also worked on adding some of the Excel functionality she missed, such as sorting and filtering, to help her work just as quickly as she did in the past.
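As a rough illustration of how the results view could surface contentious questions, the sketch below aggregates clinician votes and flags questions the panel is split on. The Vote fields, the rating scales, and the 40–60% agreement band are assumptions made for the sketch, not the team’s actual criteria.

```typescript
// Hypothetical sketch: aggregating clinician votes and flagging
// contentious questions for the Functional Lead's results view.
interface Vote {
  questionId: number;
  include: boolean;  // should the question stay in the report?
  clarity: number;   // 1–5 rating (assumed scale)
  relevance: number; // 1–5 rating (assumed scale)
}

interface QuestionResult {
  questionId: number;
  includePct: number;
  avgClarity: number;
  avgRelevance: number;
  contentious: boolean;
}

function aggregate(votes: Vote[]): QuestionResult[] {
  // Group votes by question.
  const byQuestion = new Map<number, Vote[]>();
  for (const v of votes) {
    const group = byQuestion.get(v.questionId) ?? [];
    group.push(v);
    byQuestion.set(v.questionId, group);
  }

  const avg = (xs: number[]) => xs.reduce((a, b) => a + b, 0) / xs.length;

  return [...byQuestion.entries()].map(([questionId, group]) => {
    const includePct = group.filter((v) => v.include).length / group.length;
    return {
      questionId,
      includePct,
      avgClarity: avg(group.map((v) => v.clarity)),
      avgRelevance: avg(group.map((v) => v.relevance)),
      // Flag questions the panel is split on (assumed 40–60% agreement band).
      contentious: includePct >= 0.4 && includePct <= 0.6,
    };
  });
}

// Sorting, similar to what the Functional Lead did in Excel:
const sortByClarity = (rs: QuestionResult[]) =>
  [...rs].sort((a, b) => a.avgClarity - b.avgClarity);
```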
Outcome
After delivering the final web-app and admin accounts to the CCO team, we worked on a poster and video giving an overview of the work done for CCO (for the capstone course, shown below).
What I learned
Starting off with the research, I was overwhelmed by the number of problems that CCO faced in their line of work. This left us unsure about the scope, as we were typically used to problems being handed to us along with a clear expectation of the final outcome(s). Discovering the main pain points required a lot of deliberation among us, and everything had to be weighed to ensure we could actually deliver something within our limited timeframe; a lot of scope narrowing and refinement had to be done.
Looking at the design process, I found that small details in the prototypes were often crucial, especially when working with people as meticulous as doctors. I learned from our mistake of using Lorem Ipsum as filler text, because doctors became confused, having never been exposed to it. In addition, some of our prototypes looked realistic to the point that it was difficult for some participants to discern whether they were real or not; this caused a few technical difficulties throughout the sessions. Going forward, I found it important to cater to your audience when designing and when presenting designs (even if they’re prototypes), as this may affect the research and results you’re gathering.