
Why Users Ignore Plan Recommendations
Product analytics showed that many users were abandoning the health plan recommendation tool or selecting a different plan than the one recommended. To understand why the tool wasn’t aligning with user decision-making, I led discovery research with Medicare shoppers, conducting interviews to uncover how people evaluate plans and what factors influence trust in recommendations. The findings informed changes to how recommendations were presented and helped improve confidence during the plan selection process.
My Role
Lead Researcher
Methods
Semi-Structured Interviews
Thematic Analysis
Project Type
Interview-based discovery study
Research Goals & Methodology
This research explored why users were not following the plan recommendation provided by a decision support tool on a private Medicare plan marketplace. Stakeholders wanted to understand whether the tool’s logic aligned with real-world decision-making and what was causing users to disengage.
The research was informed by behavioral data from site analytics, which highlighted drop-off points within the recommendation tool and helped guide the research questions. I collaborated with stakeholders to develop a research plan focused on trust, decision-making behavior, and users' perceptions of value and clarity.
Methods Used
- Ten remote, semi-structured user interviews
- Interview script co-developed with stakeholders
- Thematic qualitative analysis
- Designed an intercept survey for potential future quantitative validation
Key Findings
These interviews revealed experience gaps that led users to ignore or distrust the plan recommendation. Each insight reflects a usability or trust issue that impacted decision-making.
📍Users prioritized doctors, prescriptions, and costs, but the tool led with cost
Most users felt the recommendation was driven purely by cost savings, even when other priorities mattered more. When the logic behind the recommendation didn't match their goals, trust broke down.
“I chose a different plan because mine covered my doctors. The one they suggested didn’t.”
📍Users did not recall receiving a specific plan recommendation
Even users familiar with the website didn't always realize they had been given a plan recommendation. Many overlooked or forgot it entirely, revealing that the recommendation lacked visibility in the shopping experience.
“I don’t remember getting a recommendation. I just looked through the options.”
📍Trust in cost estimates was mixed and emotionally charged
Some users trusted the projected cost estimates, while others were skeptical, especially when past experiences didn’t match the numbers. This created uncertainty and eroded confidence in the tool’s guidance.
“I didn’t trust the number. Last year they said it would be cheaper and it wasn’t.”
Outcome & Reflection
The research helped the product team understand why users disregarded plan recommendations and where key experience gaps created friction. Insights from this work were used to make experience-level improvements (such as adding a "recommended plan" banner to the appropriate plans in the shopping experience) to the recommendation tool and shopping experience during the following enrollment cycle, resulting in measurable performance gains.
Success was evaluated through product analytics, including engagement with the recommendation tool and adoption of recommended plans. A feature-specific survey was also implemented to assess whether the updated experience improved user confidence and decision-making. In addition, the follow-up intercept survey I designed extended the research beyond qualitative findings and positioned the team to quantitatively validate these insights in the future.
Key Takeaways
- Revealed that users struggled to compare plans without clear side-by-side visuals.
- Insights led to design adjustments that reduced confusion in the recommendation step.