Solano Labs CI Session View
Solano CI is a continuous integration product for software development teams.
As lead product designer at Solano Labs, I redesigned the ‘Session View’—the comprehensive report of build progress and test results—solving major usability issues and increasing user trust in the product.
The Session View shows every state of a session, from its initial creation, through to a complete collection of test results, build lifecycle timings, and build artifacts.
Over time, as additional functionality and information were added to the view, the interface had become difficult to use.
How can we simplify a very complex interface and increase user efficiency?
Our goals were:
- Improve accuracy of reporting data.
- Clarify presentation of build status.
- Increase user trust in the system.
- Present consistent views for a variety of session types.
A holistic redesign based on user research and customer feedback, validated with live-data prototypes. We rebuilt how the system reports build status information and created a new real-time status report model, and I designed a set of visually consistent views for all session types.
The new session view addresses all the issues we set out to solve: it reports status more accurately and in greater depth, while being easier to use and understand. We increased user confidence in the system and gave customers a more efficient tool for a critical part of their software development workflow.
Solano Labs is currently beta testing it with a select group of customers, and the feedback is consistently positive. One of the founders, the lead Product Manager, says that “everyone is loving” the update.
At Solano Labs I helped move the product development process to Scrum with two-week sprints. My goal was to restructure the process to allow more design thinking to happen earlier in the cycle.
To accomplish this, I adapted the Design Sprint process to run separate design sprints alongside engineering. This allowed for proper research and validation of hypotheses before diving into implementation; the output of a completed design sprint went directly into the engineering backlog, giving developers a well-defined goal.
Discover > Define > Ideate > Prototype > Validate
Research, analysis, exploration; the bedrock of subsequent design activities.
Through research and user testing sessions, I helped the team better understand how the interface was solving problems for customers, and where it was falling short. Digging beneath surface issues of cluttered visual presentation and confusing interactions, I got to the root of a critical problem: users were not getting a complete and accurate view of the state of their build.
During my research I captured users' input about the specific UI issues they encountered, annotating printouts of the existing session view.
Activities & Outputs: User Research, Use Cases, Usability Testing, Interviews, Concept Maps.
Clarity and focus; framing the problem to create a solution.
I facilitated development of an internal prototype (running with live data) to validate new reporting outputs. I used this information later in the project to construct an improved real-time status report model.
I had previously developed with the team a set of personas profiling new and existing customers. These were hybrid marketing and design personas that gave us a clear idea of our users' goals and pain points. Reinforced by the deeper research done for this project, the picture that emerged helped me get buy-in on making broader changes to the product.
One of the biggest challenges of this project was figuring out how to restructure the presentation of session status. Most of the rough layouts were done with pencil and paper; many were done on whiteboards with the Product Manager, especially when design decisions might impact other views in the product.
Activities & Outputs: Personas, Use Cases, User Flows, Sketches, Lo-fi Wireframes.
Concepts for outcomes; combine understanding of the problem with imagination.
I iterated through a series of wireframes showing all possible configurations of a session view (there are twelve!). Previously, less common session types had inconsistent layouts; I refined the structure to present all sessions consistently.
The visual and interface design came from my work-in-progress style guide and design system.
Working with my Sketch mock-ups from the previous design update, I adapted these styles to new elements and components for a more consistent and cleaner interface.
I also selected icons that would help communicate the detailed status of the build, and refined the color scheme for the build lifecycle stages.
I applied these styles to the clickable prototype that would serve as a final mock-up of the UI design.
Activities & Outputs: Wireframes, Paper Prototypes, Style Tiles, UI Design, Lo-fi Mock-ups, Art Direction, Content Strategy.
If a picture is worth a thousand words, a prototype is worth a thousand pictures.
I constructed a clickable prototype in Middleman to present our current design direction and socialize a number of additional interaction concepts with stakeholders.
Since we had high-fidelity wireframes and a defined set of styles for the UI components, it was more efficient to create a clickable prototype than static mock-ups.
The purpose of this prototype was two-fold:
- Simulate a build running in the new UI, showing all the visual feedback in action.
- Create front-end hooks for the engineers to begin wiring up the data to the view.
This proved to be a very effective step in the process. Engineers got a head start on implementation planning while we continued to refine the visual presentation.
Activities & Outputs: Mock-ups, Clickable Prototypes, Heuristics, Interactions
Prototype as if you know you’re right, but test as if you know you’re wrong
There was no practical way to deliver our initial prototype to customers using real-time data, so the engineering team built the new session view into the production app. We then allowed users to test it as an ‘alpha’ feature (behind a feature flag).
In keeping with Agile practice, we had working software. But in the spirit of Design Thinking, we were still in the design phase.
Now moderated and unmoderated user testing can reveal where our solution hits the mark and where it needs improvement.
And by monitoring analytics we can uncover usage patterns that will help us improve the user experience even further in future versions of the app.
Activities & Outputs: Usability Tests, Analytics, QA