
AI Fraud Detection
The Customer Question:
A venture capital firm partnered with a tech startup to create an AI tool to locate instances of Medicaid fraud. Before creating the new tool, the team needed to understand the current process of fraud detection as conducted by humans.
The Approach:
Our researcher conducted a series of contextual inquiry sessions with state employees whose job it was to scan and flag suspicious Medicaid activity in large pharmaceutical and medical databases. We sat at the shoulders of these professionals, watching them work as they narrated their process. In short, we learned how humans identified fraud so the team could then create AI tools to do so at scale.
The Impact:
The contextual inquiry was a necessary element of the product development process. The tool could not have been designed and built without understanding the “as is” process conducted by humans. Our input enabled the team to create both the back-end AI engine and the user interface.

Statistically Validated User Personas
The Customer Question:
A global online resume building leader needed clarity on a fundamental business question: Who are our users, and how different are they from one another? Without validated user segments, the product, design, and marketing teams were making decisions based on assumptions rather than evidence.
The Approach:
We employed a rigorous mixed methods approach to identify and validate distinct user segments. The process began with workshops involving product managers to gather existing knowledge about the user base. This was followed by in-depth interviews with current subscribers to understand their needs, goals, pain points, and challenges. Finally, we applied advanced data mining techniques including cluster analysis and discriminant analysis, along with statistical significance tests, to validate the existence of these segments and measure the degree of separation between them.
The Impact:
The statistically validated personas provided a shared foundation of user understanding across the entire organization. UX teams gained clear direction on feature prioritization, designers created more targeted experiences, and the marketing department launched campaigns grounded in real user needs rather than guesswork. The company moved from fishing in dark waters to operating with confidence, knowing who they were serving and how to reach them effectively.

Stock Trading AI Visualization Tools
The Customer Question:
Scottrade, a US-based stock trading platform, formed a team of designers and technologists to create AI visualization tools for its customers to assist them with their research. The team needed to test alpha and beta versions of their products with actual customers so they could improve the tools prior to launch.
The Approach:
We conducted a series of usability tests with current customers, who performed representative tasks using the tools. The designers and developers observed the sessions in real time. After each session, our researcher debriefed with them to articulate insights and ideate product improvements.
The Impact:
Because the tools were tested and improved prior to launch, the team significantly mitigated the risk of customer dissatisfaction and increased usability, usefulness, and customer delight.

Forms Redesign for Healthcare Applications
The Customer Question:
BlueCross BlueShield of North Carolina (BCBSNC) was receiving many application forms that were incomplete or incorrectly filled out and had to be returned for correction. They needed to understand why applicants were struggling with the forms and how to redesign them to improve completion rates and processing efficiency.
The Approach:
We partnered with Caroline Jarrett, one of the world’s most respected consultants in forms design, to apply an iterative design and usability-testing approach. The process combined user feedback with stakeholder knowledge to identify where the original forms confused applicants and how easily they could be processed. We limited “fill in the blank” questions, standardized check boxes, and created new verbiage to clarify each page of the application, ensuring the redesigned forms better supported users throughout the completion process.
The Impact:
The redesigned forms delivered measurable improvements: a streamlined application process, reduced user errors, and fewer applications returned for clarification. Most significantly, BCBSNC saw 23% more successful applications that required no human intervention compared to the previous year, directly addressing their challenge of incomplete submissions and improving operational efficiency.
Hundreds of projects.
Thousands of users.
Countless insights.

Business Intelligence Dashboard Prototype
The Customer Question:
Emerson Climate Technologies needed to develop a web-based business-intelligence dashboard for retail customers that would unify data from installed products across multiple facilities and call center services. They required a simple, easy-access interface for a diverse retail audience and needed to deliver a customer-ready demo in just 5 weeks to demonstrate the concept at an upcoming conference.
The Approach:
We conducted three days of participatory design sessions with retail customers, developers, and sales and marketing teams to explore branding, integration, and user needs. Over two weeks, we executed three rapid development cycles—moving from sketches to wireframes to a functional prototype. We scripted and conducted internal “dry run” presentations to refine the demo, then provided technical support throughout the 4-day customer conference to ensure smooth delivery.
The Impact:
The TE-Emerson team successfully met the exceptionally tight 5-week deadline, delivering a functional prototype from conception to completion. We explored a variety of navigational, structural, visual, and functional solutions tailored to Emerson’s audience tiers. Conference attendees were able to interface with a high-fidelity prototype that effectively demonstrated how the offerings fit together and could improve their operations, validating the dashboard concept for future development.

Health Benefits Selection, Discussion, and Negotiation Support Tool
The Customer Question:
The National Institutes of Health, working with the University of Michigan, created a web-based group health benefits planning application. Because health-related decisions carry high stakes, it was critically important to ensure the tool was easy to use so it could support users in independently making their health benefits choices. The tool also supported group discussion and negotiation in a gamified format, requiring additional focus on these more social elements of the interface.
The Approach:
We recruited 30 representative participants, many with limited computer literacy, to use the tool to perform a set of basic tasks. We observed and probed as they worked, noting points of friction and confusion as well as what was working well. We presented a prioritized list of issues and worked with the design and development team to address them through design improvements. We then conducted a second set of usability sessions to validate the changes and identify additional issues, followed by a third round for further validation and issue identification.
The Impact:
The iterative, collaborative approach to usability testing allowed our customer to roll out a new, high-impact solution with minimum risk of failure. Interface improvements made along the way raised task completion from 50% in the first round to 90% in the third. This design/develop/test/redesign approach allowed the team to validate design changes made between testing sessions.

Competitor Evaluation of Personal Accounting Software
The Customer Question:
Intuit, a leading provider of accounting software for individuals and small businesses, wanted to ensure its solution was easier to use than its key competitors. By testing and measuring comparative ease of use, Intuit could market its solution as superior as measured by a third party, and also gather the data to continually improve the solution in subsequent iterations.
The Approach:
We recruited 50 typical users of the tool and observed them performing typical tasks using the tool and its chief market competitor. We collected and analyzed both qualitative and quantitative data such as time on task, error rates, and customer satisfaction. We presented these results in a way that allowed Intuit to gauge its competitive advantage as well as prioritize future improvements to maintain and expand its advantage, even as new competitors entered the marketplace.
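As a rough illustration of how such comparative measures are analyzed, the sketch below computes mean time-on-task for two tools and a Welch t statistic to gauge whether the gap is likely real. The data and tool names are invented for illustration; this is not the study's actual analysis.

```python
# Hedged sketch (illustrative only): comparing time-on-task between two
# tools using summary statistics and Welch's t statistic. Data are invented.
import statistics

def welch_t(a, b):
    """Welch's t statistic for two independent samples with unequal variances."""
    va, vb = statistics.variance(a), statistics.variance(b)
    return (statistics.mean(a) - statistics.mean(b)) / (va / len(a) + vb / len(b)) ** 0.5

# Hypothetical time-on-task (seconds) for the same task on each tool.
tool_a = [62, 58, 71, 65, 60, 67, 59, 63]
tool_b = [88, 95, 79, 91, 84, 90, 86, 93]

print(f"Tool A mean: {statistics.mean(tool_a):.1f}s")
print(f"Tool B mean: {statistics.mean(tool_b):.1f}s")
# A large |t| suggests the difference is unlikely to be noise.
print(f"Welch t = {welch_t(tool_a, tool_b):.2f}")
```

In practice the t statistic would be converted to a p-value against the t distribution, and the same comparison repeated for error rates and satisfaction scores.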
The Impact:
Intuit learned that its tool was deemed superior to its key competitors. Our findings were packaged for the team to ensure the results were immediately actionable. Our sponsor said, “Our team has been talking about how comprehensive and professional Tec-Ed’s report is. I’m so thankful we selected you as a vendor. The overwhelming response is, ‘This is so clear, understandable, informative, and well done.’” (Laurel Lee, QuickBooks Product Manager, Intuit)

Gaming Device Customer Experience
The Customer Question:
NVIDIA is the world leader in graphics and gaming chip production and has also produced gaming devices. We were engaged to conduct usability testing on a wireless, standalone gaming handset to determine how avid gamers reacted to the unboxing and initial use of this novel device. Because the device was in a late stage of development and had already undergone much iterative testing, the focus of this engagement was to identify pragmatic changes that could be made prior to launch to enhance the customer experience.
The Approach:
Testing this gaming device required a departure from typical usability studies in which participants conduct a set of representative tasks. Rather, it called for a lengthy simulation of its unboxing, setup, and full initial gaming session. This was necessary to assess the aspects of the experience that could only be observed in real time, such as fatigue, the effects of the temperature of the device, and physical stress. We also observed and gathered typical measurements such as ease of unboxing, understandability of setup instructions, errors, and customer satisfaction, including satisfaction with the perceived value of the device.
The Impact:
Findings focused on improvements to the device’s Quick Start guide, the information architecture of setup and usage information, packaging, hardware overlays, and associated communications that helped set customer expectations for the device. These recommendations allowed NVIDIA to make changes, even at a late stage of development, to improve the customer experience.