As Holmes Corporation (HC) continues to evolve our digital learning experiences, our goal remains clear: deliver products that exceed the expectations of our learners, instructors, and association partners. A vital part of this dedication to product excellence centers on understanding the needs and motivations of our users and validating assumptions as we innovate. We can’t do this without regularly speaking with our customers.
Throughout my career, I’ve worked at the intersection of product strategy and experience design, translating user insights into products that deliver on our users’ wants and needs. Since joining HC, I’ve focused on building a stronger bridge between what our users experience and how our products evolve. That bridge is built on real feedback gathered from real people and, increasingly, from AI simulations that let us innovate at speed.
Listening to Our Users
At the heart of HC’s digital strategy is a commitment to listening to our users. For example, as we look to develop the next generation of our Learning Analytics Center, I’ve been conducting structured user interviews with course instructors. These sessions have given us valuable insights into how instructors use the platform to track their students’ progress, interpret data, and tailor their teaching. The best part? This glimpse into how our instructors use the platform has already had a substantial influence on our future product direction.
Instructors, however, are only one part of the picture. We’re also expanding our strategy to include self-study learners, ensuring that our future designs support both guided instruction and independent learning paths. Together, these perspectives create a full, nuanced understanding of how our learning ecosystem performs across different user types.
Introducing Synthetic User Testing
While real user feedback remains a foundation of our product development process, we’ve started complementing it with something new: synthetic user testing.
Synthetic testing uses AI-generated user models, or synthetic users, that reflect the demographic and behavioral traits of our real learners. These simulated audiences allow us to explore and gather early validation for design ideas and product hypotheses long before live testing begins.
Think of these synthetic users as temporary stand-ins for real users. They’re built on validated persona data and historical patterns of how learners engage with our platforms. By testing new workflows or concepts with synthetic audiences, we can predict likely user responses, identify usability risks, and refine designs in hours instead of weeks.
This approach doesn’t replace human feedback. It enhances it. Synthetic testing helps us innovate at speed while ensuring that every design decision is still grounded in user reality.
How Synthetic Users Work
So how do these synthetic users work? Using a specialized AI tool, we build groups of users based on existing learner and instructor data such as demographics, engagement metrics, and usage behaviors. Each synthetic cohort mirrors a real segment of our users: for example, first-time certification candidates, experienced instructors, or learners balancing study with full-time work.
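To make the idea concrete, here is a minimal sketch in Python of how a synthetic cohort might be defined and segmented. The field names, thresholds, and segment rules are hypothetical illustrations, not HC’s actual data model or tooling:

```python
from dataclasses import dataclass

@dataclass
class SyntheticUser:
    """A simulated learner built from persona data (hypothetical fields)."""
    role: str               # "learner" or "instructor"
    certifications: int     # prior certifications completed
    weekly_hours: float     # average hours on the platform per week
    employed_full_time: bool

def assign_cohort(user: SyntheticUser) -> str:
    """Map a synthetic user to one of the example segments mentioned above."""
    if user.role == "instructor" and user.certifications >= 3:
        return "experienced instructor"
    if user.role == "learner" and user.certifications == 0:
        return "first-time certification candidate"
    if user.role == "learner" and user.employed_full_time:
        return "learner balancing study with full-time work"
    return "general learner"

# A tiny illustrative cohort
cohort = [
    SyntheticUser("learner", 0, 4.0, False),
    SyntheticUser("instructor", 5, 10.0, True),
    SyntheticUser("learner", 1, 2.5, True),
]
print([assign_cohort(u) for u in cohort])
```

In practice, the segmentation would be driven by validated persona data and historical engagement patterns rather than hand-written rules, but the principle is the same: each simulated user carries traits that mirror a real slice of the audience.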
Using these generated user groups, our product team can “pressure test” features like dashboards, data visualizations, and navigation. If an interface is confusing or a feature doesn’t resonate with our synthetic cohort, we know it needs refinement before we test it with real users.
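A "pressure test" of this kind typically amounts to asking an AI model to role-play a synthetic persona while walking through a feature. The sketch below shows one hypothetical way to compose such a prompt; the function name, wording, and rating scale are illustrative assumptions, and the actual model call (which would depend on whichever AI service is used) is deliberately omitted:

```python
def build_pressure_test_prompt(persona: str, feature: str, task: str) -> str:
    """Compose a prompt asking an AI model to respond as a synthetic user.

    Any LLM client could consume the returned string; the model call
    itself is out of scope for this sketch.
    """
    return (
        f"You are role-playing a user: {persona}.\n"
        f"Feature under test: {feature}\n"
        f"Task: {task}\n"
        "Walk through the task step by step, noting anything confusing, "
        "then rate the experience from 1 (unusable) to 5 (effortless)."
    )

prompt = build_pressure_test_prompt(
    persona="first-time certification candidate studying evenings after work",
    feature="a progress dashboard showing module completion and quiz scores",
    task="find which module to study next",
)
print(prompt)
```

The value of this approach is less in any single simulated answer and more in running the same task across many personas quickly, surfacing confusion points before real users ever see the design.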
The impact is profound. Instead of waiting for multi-week testing cycles, we can simulate user responses, iterate, and arrive at better design decisions within days. This speed-to-insight shortens our product development timelines while improving the quality of what we ultimately deliver.
What’s Next
As we continue this multifaceted approach to testing, our next phase will focus on live validation, testing our new designs with instructors and learners. The insights we gain from synthetic cohorts will inform these upcoming usability sessions, where we’ll confirm what works, refine what doesn’t, and ensure the platform delivers meaningful value to all users.
Our goal isn’t just to make incremental improvements. It’s to build a learning platform that evolves intelligently, backed by data and designed around the human experience.
By combining authentic user feedback with the power of synthetic testing, HC is redefining how we build and scale digital learning experiences. It’s an approach that lets us move faster, make smarter decisions, and focus on providing an outstanding learning platform for our customers.
If you’re looking for a partner who can help you envision, build, promote, distribute, and support your association’s personalized professional development programs, Holmes Corporation can help! Visit HolmesCorp.com or email us at [email protected] to get the conversation started.