Design
Closing Quality Control Case Study
Role: Lead Product Designer
Design & development of a machine learning quality control system
Objective
Integrate with lending and title company professionals' work routines to surface closing errors, differentiating us from competitors
Tools & Methodology
- User research
- User testing
- Story mapping
- Mockup design
- User flows
- Cross-team collaboration
- Wireframes
- UI design
Introduction
Snapdocs is a platform in the real estate closings space that facilitates mortgage closings nationwide. Its users span three main office types: lender, escrow, and notary.
My role on this project was Lead Product Designer, encompassing the breadth of product design, from research through to mockups and prototypes. In addition to the user-facing solution this case study digs into, I also designed internal tools as part of the project. For brevity, this case study focuses only on the external-facing solution.
History
Existing research indicated that loan officers, funders, escrow officers, and other mortgage professionals spent large amounts of time reviewing closing documents for errors. This dragged down productivity, and if errors went unresolved, it could make a loan unsellable to investors. Together with the Product team, we hypothesized there was an opportunity to assist with document quality control, and that the solutions we built could truly differentiate Snapdocs in the market.
Challenge
To discover document errors and highlight them to the user, we first needed machine learning models to detect the errors. Once the models and proofs of concept were in place, the data became accessible to design and engineering. It was my role to determine where in the lender's workflow to surface these reports, and to create internal tools our staff could use to verify the models were operating correctly.
It was not as simple as just showing loan and escrow officers a report, however. We had to work within the scope of lenders' and title companies' work behaviors:
- Because of the nature of the Snapdocs platform, users were rarely on our site, since we integrated with their other tools via APIs (there were other efforts to modify this user behavior, but they were out of scope for this project)
- In general, email was the #1 form of communication in the mortgage industry, so there was a high noise-to-signal ratio
- The report was only valuable within hours of a signed closing
In addition, the data displayed needed to be as accurate and up to date as possible; information could be stale by the time the user viewed a report, so we had to find ways to mitigate outdated data.
Approach
Research
Because of where Snapdocs sat in the midst of lender and escrow workflows, we knew we could not dictate to our users how to use our tool; we needed to meet them where they were. The reality was that they already used so many other software tools that there was built-in resistance to yet another system to interact with.
I performed discovery interviews with lending and title companies to understand unique workflows and toolsets. Questions included but were not limited to:
- What are your most common errors?
- What does your QC process look like?
- How long does it take you to quality control closings?
- What roles perform QC checks?
- How much time do you have before you must fund a closing?
- What challenges are involved in dealing with the other party (lenders with title, title with lenders)?
Workflows varied from lender to lender, state to state, and region to region. It was paramount that we identify solutions for the majority of customers, understanding we could not perfectly integrate with every workflow. If we reached 80% parity, we would deem the project a success.
Design
From there, I mapped out the complex workflows, and determined where our solution could fit into our users’ task flows. To ensure a successful rollout, I worked closely with other pods, since our solution was going to surface in other feature sets. I presented our concepts, and collaborated with others to ensure our implementation did not negatively affect existing workflows. Product and engineering worked to align release schedules, and I joined the PM in demoing the solutions to sales and customer success so that they were prepared when we went to market.
Testing
Finally, once the hypotheses and release plans were in place, I tested prototypes with five loan and escrow officers to ensure our solutions would work. Given past experience and the type of interfaces we were testing, I was confident that five respondents would accurately represent success or failure.
| User | User-perceived task success | Actual result |
| --- | --- | --- |
| User 1 | 100% | 100% |
| User 2 | 100% | 100% |
| User 3 | 100% | 100% |
| User 4 | 50% | 100% |
| User 5 | 33.33% | 100% |
| Total | 76.67% | 100% |
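The perceived-success total is the mean of the five per-user scores. A quick sanity check of that arithmetic (assuming user 5's 33.33% represents one of three tasks succeeding):

```python
# User-perceived task-success scores from the five test sessions.
perceived = [100.0, 100.0, 100.0, 50.0, 100.0 / 3]  # user 5 ≈ 33.33%

mean_perceived = sum(perceived) / len(perceived)
print(f"{mean_perceived:.2f}%")  # → 76.67%
```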
Unexpected Discovery
During testing, we discovered that the feature's working title was confusing our customers. We had originally called it "Post Closing Quality Control", but I noticed that any time "post closing" came up, the conversation inevitably went off track. I pursued this and found that our definition of "post closing" was not at all how our users used the term. As a result, we renamed the feature set "Closing Quality Control". I tested the new title as part of the overall user testing, and it was immediately recognized.
Deliverables
Sensitive to the timeframes lenders and escrow officers work within, the resulting solution took the form of:
- Notifying the parties via an existing email that they had a quality control report ready to view
- Surfacing the quality control report within the closing page of the Snapdocs software (accessible via the email)
- Providing tools to replace, fix, or mark each document as resolved
Success
Because we were sensitive to our users' workflows, we created solutions that complemented their working styles and met them where they were. As a result, we received reports of adoption within days of implementation at a given company. Upon adoption, we were able to quantify trends across companies:
- 10-15 minutes of a user's manual effort was saved per loan; we were able to leverage the time and money saved by not having to QC documents as a selling point
- Increased our fee by $15 per loan transaction, a $375K/month increase in revenue
- In the early days, at least 3 clients signed up for our entire service package once our quality control features were in place, on top of the 40 who signed on within the first 6 months, showing we had identified a real need in the market
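As a rough back-of-the-envelope check (my own arithmetic, not a figure from the team): a $15 fee increase producing $375K/month of additional revenue implies roughly 25,000 loan transactions per month, assuming the fee applied to every transaction:

```python
# Implied monthly loan volume from the fee increase (illustrative only).
fee_increase_per_loan = 15          # dollars per transaction
monthly_revenue_increase = 375_000  # dollars per month

implied_loans_per_month = monthly_revenue_increase / fee_increase_per_loan
print(f"{implied_loans_per_month:,.0f} loans/month")  # → 25,000 loans/month
```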