Give users a soft landing on first login.

GROWTH DESIGN
UX DESIGN
PROTOTYPING
EXPERIMENTATION
COLLABORATION

My Roles: Product Designer, Product Manager, Product Marketer

TL;DR

Designed & built in-app onboarding from 0 to 1 for new Lattice users to quickly understand tools and get started.

User retention was the single most predictive measure of a customer’s likelihood to renew their Lattice contract, yet only around ⅓ of Lattice users regularly used the platform’s tools. I designed and built an in-app onboarding flow for new users and tested it in a 50-50 split test, where it drove a small but statistically significant 2% lift in user retention over the first 4 weeks. This met our criteria for a successful experiment, so I launched the flow to the full user base and began preparing for V2.
PROBLEM

User engagement stalled, customers were churning at an alarming rate, and our NDR was dropping.

Lattice is a wide-reaching HR platform with many tools to help a company’s employees manage their performance and continuously improve. However, one of the most common pieces of feedback we received from employees was that they didn’t know where to start or which tools were most helpful for their development. Typically, we relied heavily on the HR team (our primary buyer and decision maker) to onboard their company for us after purchasing Lattice. That required a tremendous amount of in-person training from our implementation team, in the hopes that we could educate the HR team well enough for them to become the teachers for their company.
⚠️ Constraints
While leadership did allow me to create my role and start the Growth function, they were not yet ready to divert engineering resources from building new features on our roadmap. As a tradeoff, I designed, built, and launched the onboarding flow myself using third-party software, trading design flexibility for speed and ease of implementation.
SOLUTION

Create an onboarding flow for new users and run an experiment to boost user activation.

Final: This version of the onboarding flow was shown to individual contributors. Users who were admins (HR team) or managers with direct reports received a similar onboarding flow with different, more relevant content.

OUTCOME & LEARNING

A successful first iteration with room for improvement.

As Lattice’s first full user base A/B test, the new user onboarding experiment taught us a lot. First, it became abundantly clear how difficult it is to run successful experiments in a B2B environment, given how many factors we needed to control and how few users we had access to. As a simple comparison, a B2C company might run an experiment targeting a fraction of its user base that still includes upwards of a million users, and it might take only a week to reach statistical significance. Since B2B companies onboard users much more slowly, Lattice’s experiment group was less than 10,000 users, so our hope of reaching statistical significance from the get-go was low. The fact that we did was an accomplishment and signaled that we could continue experiments like this to improve the product.
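To make the sample-size problem concrete, a standard power calculation for a two-proportion test shows roughly how many users per arm are needed to detect a small lift. This is a generic sketch using the normal-approximation formula; the baseline retention (~33%) and the 2-point absolute lift are illustrative assumptions drawn from the numbers above, not Lattice’s actual experiment design.

```python
import math
from statistics import NormalDist


def n_per_group(p1: float, p2: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate sample size per arm for a two-proportion z-test
    (normal approximation, two-sided test)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # critical value, e.g. ~1.96
    z_beta = NormalDist().inv_cdf(power)           # power term, e.g. ~0.84
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p1 - p2) ** 2)


# Hypothetical inputs: ~33% baseline retention, detecting a 2-point
# absolute lift to 35% — figures assumed for illustration only.
print(n_per_group(0.33, 0.35))
```

With these assumptions, the formula calls for several thousand users per arm — more than 17,000 in total — which is why a B2B experiment group of under 10,000 users makes statistical significance a long shot.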

Still, a 2% improvement, while positive, was smaller than we’d hoped to achieve. It became clear that changing user behavior would be challenging, especially without resources from our EPD team, but the flow gave us a foundation to continue building on and iterating.
UPDATES FOR 2.0

Embrace less-is-more and reduce cognitive load.

Everyone hopes that the first version of a new feature is perfect and we never have to revisit it. Of course, that's wishful thinking, especially for a zero-to-one feature. There are many things I would have liked to improve in the next version of onboarding, including:

1. Decide on one primary metric and action.
I built the onboarding flow with the assumption that successful user activation could be achieved if the user completed at least one of several key actions, depending on the tools that their company prioritized. I assumed giving users more flexible choices would be better, but it very likely just created more confusion. In the next version, I would test a flow that prioritized one core action to make it extremely clear what the user should do vs. what the user could do.

This also made measuring success in a meaningful way much more challenging. The growth metric for evaluating this experiment was a collection of actions that could each have been its own metric. Next time, I would pick just one of those metrics and focus on the action that reliably moves it to drive a greater impact.
   
2. Provide some basic explanation of the tools.
In an effort to keep the onboarding flow quick and to the point, I assumed the actions I was highlighting would be familiar enough for new users to understand in the broader context. Of course, nothing is really that simple, and that assumption hurt more than it helped. In the next version, I would test including messaging around, for example, what Lattice's 1:1s tool does and why setting it up is valuable from the get-go.

3. Apply cleaner design with in-house code.
Being constrained by a third-party tool meant that I could only do so much to make the design clean and on-brand. In the next version, I’d spend more time working with other EPD team members to polish the design and, if resources allowed, build the flow in-house so we could unlock a more creative and engaging UX.