Defining Impact

jenny comiskey
4 min read · Feb 9, 2023

Anyone who has worked with me, or even met me briefly, knows that impact is something that drives me. It’s a question I always ask of myself and my teams. The role insights play in helping shape direction and drive change can sometimes get lost when the most visible and tangible contributions center on the act of building or on the outputs, outcomes and results (e.g., products shipped or metrics moved). Yet impact can serve as one of the ultimate measures of success for a thriving research and insights organization, especially when working in industry and the applied landscape. What matters is the action that research drives. Even in academic settings, shifting focus from publishing and citations to understanding real-world change and influence becomes a better means of assessing the potential scale, reach and significance of new knowledge.

Researchers can have a tendency to over-index on the report, the process, or a single finding or collection of findings. Focusing on process and actions is not exclusive to research; I’ve seen it in many functions and types of work, where it can be easier to talk about the actions than the results. As researchers we develop knowledge and insight in service of action and change. We need to reinforce this, tell more stories about it, and help shine a light on the broad range of ways research plays a role. To understand the long-term value of our work, we should connect back to the change driven by the work, whether that is a strategy, a mindset, or a direct shift in a product and related metrics. It’s also important for us to build up the language and examples of what impact looks like, along with our shared definitions and expectations for it.

I’ve tried to capture the types of impact research might drive to help showcase the breadth and dimensions that can be at play. This list is not comprehensive, but it can be useful for considering potential outcomes before kicking off work and after activating it.

Impact types can include:

Executional: Driving change in the product, service or marketing experience

  • Direct changes to the launched experience, shaped by an insight, that clearly benefit users
  • A core problem identified by research addressed in what is delivered and how it is delivered
  • Identifying and addressing points of success or failure in delivery from the user’s point of view

Directional: Informing product, marketing, or company strategy

  • Informing team roadmaps, OKRs, or future planning and priorities
  • Helping resolve or advance a critical decision with evidence
  • Identifying opportunities for new markets, products, experiences, services or organizational investments (or insights that led to furthering these)
  • Informing and helping to define metrics and models for how success is assessed
  • Informing a pivot, redirect, or shift in priority (e.g., not launching and avoiding the cost of a negative outcome)
  • Setting entirely new user measures or benchmarks

Cultural: Shaping new mindsets, perspectives, models, or language

  • Informing how an organization thinks about customers, their experience and their needs by developing transferable or durable insights, themes and models (foundational or directional frameworks — journey maps, archetypes, jobs, etc…)
  • Creating new clarity in problem definition or focus
  • Surfacing potential for new research or for investment in new teams or structures, and revealing previously unrecognized gaps or assumptions
  • Developing a broader knowledge base around an area across multiple studies
  • Enhancing and aligning organizational knowledge and/or building shared perspectives across disparate teams
  • Redefining and advancing a true user-first commitment, and shifting team approach, process, or operating mechanisms to prioritize users across a product or marketing team

Operational: Defining new research systems, methods, programs, or process

  • Introducing new methods and processes for how insight is integrated
  • Advancing the maturity of how the team operates with people at the center
  • Improving the quality, rigor and execution of the research itself
  • Increasing operational effectiveness (speed, efficiency, volume), helping teams move toward better outcomes faster
  • Improving organizational integration, advancing the scale, adoption and advocacy of insights
  • Increasing awareness and understanding of research insights
  • Unblocking other teams with expertise and unique perspective

There may also be specific signals that help indicate progression on all of the above, some harder than others to assess concretely. When possible, any of these can also be traced back to measurable user impact. Ideally, any of the research-informed work above drives business metrics and/or direct, measurable value for users. For the business, that may show up as direct changes to support cost, satisfaction, revenue, adoption, retention, acquisition, prevention of wasted resources, etc. Similarly, programmatic or operational impact can have its own indicators of success. When possible, it’s a step further to trace back to concrete evidence (e.g., citations, adoption, engagement, reduction in admin time) or improved or additional insight (e.g., new discoveries).

Considering the types of impact a question is set up to advance, or that can be activated at the end of a study, shifts the focus from delivering and executing research to driving change with it.

Other resources for defining impact:

https://medium.com/meta-research/what-it-means-to-have-impact-as-a-researcher-at-facebook-f7f97ebba668

https://uxdesign.cc/three-levels-of-ux-research-impact-174768b7f4ef

https://gqzhang.medium.com/ux-research-excellence-framework-a824928fd7e9

https://www.userinterviews.com/blog/how-to-track-the-impact-of-ux-research

https://uxdesign.cc/tracking-the-impact-of-ux-research-a-framework-9e8b8f51599b


jenny comiskey

Humanity + tech. Helping create a people-centered future. Led insights at Stripe, Meta AI, Strava, IDEO and McKinsey.