


Driving team productivity with a dashboard that cut soak test time by 81%

HEWLETT PACKARD ENTERPRISE
8 min read
ROLE
Product Strategy, UX Research & Design, Full-Stack Development (MERN Stack)
TEAM
Product development intern, Senior manager, 8 QA engineers, Program manager
MADE IN
4 Months (April 2024 - Aug 2024)
Introduction
Soak testing is a long-duration stress test on Aruba CX switches, typically running for 24 to 72 hours or more. During this time, switches are kept running while events are repeatedly triggered, uncovering slow-build failures that short tests never reveal.
Finding those long-term issues early means fewer customer complaints and more reliable products being shipped.
Problem @ 🖐
QA engineers were spending more time maintaining the soak than learning from it
Manual soak testing was time-consuming and error-prone, and since metrics weren't stored, engineers couldn't identify long-term behavior patterns.
The strategy
SoakMaster: A streamlined experience, built around automation & visibility
The goal is to automate repetitive steps and turn days of device activity into visual patterns that are easy to understand.
TLDR - Toggle between the old and new workflows
At HPE, soak tests were still run manually
This is Alex
A QA engineer working on Aruba CX switches. He spends 1–2 hours every day manually running and monitoring soak tests that last for days.
New Workflow
Old Workflow
The research revealed that the manual soak process pulled engineers away from more critical work
To understand how soak tests are actually run day to day, I spent time observing workflows, reviewing similar internal tools already in use, and speaking with the QA engineers and system software engineers responsible for validating Aruba CX switches.



User Pain Points and Business Impact

Time consuming

Slow root cause analysis

Limited scalability

Hard to communicate results
A key issue surfaced during research:
Soak testing produced valuable system data, but there was no visibility into trends developing over time.
Product Goals
With this opportunity in mind, I sat down with the team's QA engineers to align on the goal of transforming soak testing from a manual, reactive process into a scalable, automated, and insight-driven system.
But why a web-based dashboard?
Manual, Fragmented Tracking
Engineers relied on scripts, terminals, and scattered logs to track soak runs. Monitoring the tests required manual check-ins, personal notes, and constant context switching, making it easy to miss early signs of issues.
CLI Workflows Didn't Scale
While command-line tools worked for triggering tests, they were not designed for monitoring, trend analysis, or collaboration. Understanding system behavior over days meant digging through logs instead of seeing the full picture.
A Central Web Dashboard
A web-based dashboard provided a shared, always-accessible view of soak tests. It allowed engineers to automate execution, visualize health metrics, and easily share context with others.
User Flows
With the research and insights gathered so far, I partnered with my mentor, a software developer, to map out an end-to-end flow that reflected how engineers run soak tests and remained technically feasible.

Presenting Early Prototypes & Pitching Visualization
To gather early feedback on the direction & scope, I shared the initial concept of SoakMaster with QA engineers and PMs across 4 internal teams. The session covered:
Project objectives
Automated flow for running soak tests and a dashboard to view results
Early exploration of how visualization could be used to understand soak execution, logs, and health metrics over time
The overall feedback was positive, but the early concept didn't fully meet expectations.

Unmet User Needs

Preference for custom input

Need for a pre-soak setup step

Confusing microcopy & redundant inputs

Editing soak configurations mid-run
Not all suggestions could be implemented immediately due to technical feasibility, but the feedback helped guide the final design and inform future improvements.
INTRODUCING
SoakMaster
How does SoakMaster reduce manual effort while improving visibility into device behavior over time?
Soak Tests Overview
The dashboard surfaces all ongoing and completed soak runs in one place, showing status, resource usage, and failure indicators at a glance. This gives engineers immediate visibility into the health and progress of testing without digging through logs.

Primary user action placed prominently to reduce friction when starting tests

Status labels indicate whether a run needs attention or can continue unattended

Direct log links enable faster investigation without manual file hunting
Search helps engineers quickly locate their soak runs as volume grows
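The at-a-glance roll-up the overview provides can be sketched roughly as follows. This is an illustrative sketch only: the status strings and the `coreDumps` field are assumed for the example, not SoakMaster's actual schema.

```javascript
// Illustrative roll-up for an overview page: given a list of soak runs,
// count how many are still running and how many need attention.
// Status values and the coreDumps field are assumptions for this sketch.
function summarizeRuns(runs) {
  const needsAttention = runs.filter(
    (r) => r.status === "failed" || r.coreDumps > 0
  );
  return {
    total: runs.length,
    running: runs.filter((r) => r.status === "running").length,
    needsAttention: needsAttention.length,
  };
}
```

A summary like this is what lets the dashboard surface "needs attention" status labels without engineers opening individual logs.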
New soak test setup form
Research showed that engineers repeatedly re-entered the same soak configuration details across runs, often relying on memory and manual notes. This guided setup form structures those inputs into clear, labeled fields, making test configuration faster, more consistent, and less error-prone.
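Structured fields also make configurations checkable before a multi-day run starts. The sketch below shows the kind of validation a guided form enables; the field names (`deviceIp`, `durationHours`, `events`) and the 24-hour minimum are hypothetical, chosen only to mirror the soak parameters described above.

```javascript
// Hypothetical validation for a guided soak setup form: catch bad
// configurations before a multi-day run begins. Field names and the
// 24-hour minimum are assumptions for this sketch.
function validateSoakConfig(cfg) {
  const errors = [];
  if (!cfg.deviceIp || !/^\d{1,3}(\.\d{1,3}){3}$/.test(cfg.deviceIp)) {
    errors.push("deviceIp must be a valid IPv4 address");
  }
  if (!Number.isInteger(cfg.durationHours) || cfg.durationHours < 24) {
    errors.push("durationHours must be at least 24");
  }
  if (!Array.isArray(cfg.events) || cfg.events.length === 0) {
    errors.push("at least one trigger event is required");
  }
  return { ok: errors.length === 0, errors };
}
```

Validation like this is where the reduction in setup-related errors comes from: mistakes surface in seconds instead of hours into a run.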
Post-Test Device Health Metrics
SoakMaster tracks key indicators like CPU usage, memory usage, and core dumps: metrics that commonly reveal long-term stability issues such as system crashes or resource leaks. Visual trends make it easier to identify gradual increases or sudden spikes that signal potential reliability problems.

Summary chips flag issues immediately

Line graphs turn raw values into visible trends
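The "gradual increase" pattern the charts make visible can be detected programmatically too. Below is a minimal sketch: a least-squares line fit over sampled memory readings, flagging a steady climb as a possible leak. The sample shape and the 0.1%-per-hour threshold are illustrative assumptions, not SoakMaster's internals.

```javascript
// Minimal trend check: fit a straight line to memory samples taken
// over a soak run and flag a steady upward slope as a possible leak.
// The sample shape and threshold are assumptions for this sketch.
function memoryTrend(samples) {
  // samples: [{ hour, memPct }] collected across the run
  const n = samples.length;
  const meanX = samples.reduce((s, p) => s + p.hour, 0) / n;
  const meanY = samples.reduce((s, p) => s + p.memPct, 0) / n;
  let num = 0;
  let den = 0;
  for (const p of samples) {
    num += (p.hour - meanX) * (p.memPct - meanY);
    den += (p.hour - meanX) ** 2;
  }
  const slope = num / den; // % of memory per hour
  return { slope, suspectedLeak: slope > 0.1 };
}
```

A slow climb of a fraction of a percent per hour is invisible in any single reading, which is exactly why multi-day trends (visual or computed) matter for soak analysis.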
SoakMaster is now a live project and is being used by 40+ QA Engineers!
I presented the solution to the VP, senior managers, and QA engineers, and they loved it! Leadership appreciated the clarity the dashboard brought to soak tests and asked for the solution to be documented for wider adoption across teams.
The tool is now being used by 40+ engineers, with ongoing improvements building on the initial MVP. Seeing the project move from an intern-led effort to something teams actively use has been a meaningful outcome.
Metrics that mattered
Manual soak effort reduced from 1-2 hours to under 5 minutes per day

~40% fewer setup-related errors

Key Takeaways
Accessibility shouldn't be an afterthought
Because SoakMaster was an internal tool, accessibility wasn’t a focus at the start. I revisited accessibility after designing the MVP and advocated for improvements.
While this strengthened the final design, it made me realize that thinking about contrast, hierarchy, & scannability earlier would have saved time & rework.
There is never just one right solution
At several points, the first solution felt “good enough,” especially under time pressure. But exploring alternatives helped surface better tradeoffs.
Questioning initial decisions, staying curious & iterating opened the door to better outcomes.
Say Hello!
My spidey-sense says we’d get along ;)
Made with coffee (lots of it) & duck-approved decisions
©️2026




