Get Out of the Building: Essential Lessons from IMVU and Amazon on Starting with the Customer
3 core principles: Talk to Real Customers Before Writing Code; Test Assumptions, Not Just Features; Build Measurement Into Your Process
Hello, it’s Ethan & Jason. Welcome to a *paid subscriber-only* edition of Level Up: Your source for executive insights, high performance habits, and specific career growth actions.
Many subscribers expense this newsletter to their Learning & Development budget; here’s an email template to send to your manager.
If you are not a paid subscriber, here are popular articles you missed:
We are thrilled to bring you a guest post by James Birchler, a tech executive who has spent two decades transforming how organizations build products and deliver customer value. In this article, James goes in-depth on the three keys to getting real customer insight:
Talk to Real Customers Before Writing Code
Test Assumptions, Not Just Features
Build Measurement Into Your Process
As VP of Engineering at IMVU, the 3D avatar-based instant messaging platform co-founded by Eric Ries in 2004, James helped pioneer the DevOps movement by building infrastructure to deploy code 50 times per day, coining the term “continuous deployment,” and collaborating with Eric to develop the Lean Startup methodology.
In his current role as Technical Advisor at Amazon, James leads strategic initiatives in portfolio-based innovation, developer experience improvements, and team culture transformation. He’s also a UC Berkeley Haas Certified Executive Coach, helping engineering and product leaders navigate crucial growth phases.
James is offering a special 40% off deal for Level Up readers through Friday, February 15:
40% discounted yearly subscription to his newsletter Continuous Growth
40% discount on his courses on Maven (use discount code LevelUp at checkout) — Get Out of The Building: Mastering Lean Startup and Amazon's Working Backwards and Accelerate Your Tech Teams: Coaching Modes for Technical Leaders using CAMS
While it's been years since I helped build and scale IMVU, the first “Lean Startup”, I'm routinely asked to help CTOs, founders, product leaders, and teams at companies of all sizes decide what to build and how to build it. I'm surprised at how often my advice begins with some form of "Get out of the building and talk to your customer!"
Here, I'll share examples illustrating why this is so important, along with practical advice on how to do it, based on my experience implementing “Lean Startup,” “Customer Development,” and Amazon’s “Working Backwards” methodologies.
First, I will tell a story that changed the course of startup product development…
A Product No One Wanted
In his seminal book The Lean Startup, Eric Ries shares a pivotal moment from IMVU's early days — I witnessed that moment firsthand as one of the company's earliest employees.
I worked alongside Eric and the founding team at IMVU as an engineering leader. We spent countless hours building what we were convinced would be revolutionary: an avatar add-on for instant messaging that would integrate with existing IM networks. The strategy seemed perfect.
But when we finally launched, we discovered a painful truth that would reshape our entire approach to product development: customers didn't want an IM integration at all. They wanted to make new friends and express themselves through our avatars on a messenger platform that was unrelated to their existing ones. We had built our entire initial product strategy on assumptions that we never validated with real users.
You might think this humbling experience immediately transformed how we built products - but the reality was messier. The changes we made to implement what we now think of as “Lean Startup” methodology, including Customer Development, took time and plenty of iteration. For example, we encouraged and empowered engineers to build features they thought would be valuable, but we hadn't yet established a systematic way to ensure proper customer development happened first.
The result? Our product became what our board member Steve Blank called a “bucket of bolts” – a collection of disparate features without a coherent vision. In one particularly memorable board meeting, which we used to run “stadium style” with the entire team observing in an early trial of transparent leadership, he reprimanded us for “throwing spaghetti against the wall to see what sticks” instead of following a disciplined customer development process and coherent product strategy.
Steve himself is a startup luminary and the author of The Four Steps to the Epiphany, where he laid out the Customer Development process. All that’s to say that he knew what he was talking about. The “disciplined customer development process” and “coherent product strategy” that he wanted to see both start with one thing: getting out of the building.
The Three Core Principles of "Getting Out of the Building"
1. Talk to Real Customers Before Writing Code
Sometimes, the hardest lessons are the ones we have to learn more than once. In 2006, as an engineering leader at IMVU, I fell into a trap that will sound familiar to many product leaders.
I became convinced that I'd discovered a killer feature: adding tag clouds to user profiles would revolutionize how our users discovered shared interests. Tag clouds were the hot new feature in Web 2.0 applications, and I personally loved using them in other products.
Without speaking to a single customer–yes, you read that right!–I pitched the idea internally. Then, one of our talented lead engineers built and shipped the feature within days, a testament to our ability to move quickly, but also a perfect example of moving fast in exactly the wrong direction.
The response? Our office phone started ringing–and not in a good way! In those early days, we had a single landline with an answering machine that everyone in our small office could hear, and we'd shared that number with our early customers. That phone became our most direct customer feedback mechanism, especially when we shipped buggy code or, in this case, features that missed the mark.
Instead of praise for the tag clouds, we heard complaints about how we were wasting time on unnecessary features while ignoring the improvements our customers actually wanted. That old office phone was both a blessing and a curse. When we pushed untested code to production (a frequent occurrence before we implemented Continuous Deployment), it would ring almost immediately with customer complaints.
But in the case of the tag clouds, the calls weren't about bugs - they were about our fundamental misunderstanding of customer needs. We had achieved an impressive engineering feat with our rapid feature release, but we'd optimized for speed of delivery and the presumption that I knew what customers were trying to do, rather than validated learnings about what they actually wanted.
Here's how to conduct effective customer interviews and avoid similar mistakes:
Start with open-ended questions about their problems, not your solution (resist that urge to pitch!)
Listen for emotional responses – they often signal the most painful problems.
Pay attention to workarounds they've created – these are gold mines of insight into urgent needs.
Document exact quotes rather than paraphrasing – trust me, you'll want these later.
Look for patterns across multiple interviews – one customer might be an outlier, but patterns tell a story.
I recommend doing as many customer interviews as you can, as quickly as you can–for reference, Lean Launchpad teams at Stanford talk to 10-15 customers per week!
You typically need to hear the same problem, need, or feedback from at least 8-10 customers before considering it a valid pattern that warrants further investigation. This is sometimes referred to as reaching “problem-solution fit.”
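To make the pattern-spotting step concrete, here's a minimal sketch of tallying which problems recur across interviews. The interview records, problem labels, and helper name are illustrative, not part of any actual IMVU process; the idea is simply to count how many distinct customers mention each problem and flag the ones that clear your threshold.

```python
from collections import Counter

# Each interview is tagged with the problems the customer described,
# using a shared label vocabulary the team agrees on up front.
interviews = [
    {"customer": "c01", "problems": ["slow_onboarding", "confusing_ui"]},
    {"customer": "c02", "problems": ["confusing_ui"]},
    {"customer": "c03", "problems": ["slow_onboarding", "missing_export"]},
    # ... one entry per interview
]

def recurring_problems(interviews, threshold=8):
    """Return problems mentioned by at least `threshold` distinct customers."""
    # set() dedupes repeat mentions within a single interview.
    counts = Counter(p for i in interviews for p in set(i["problems"]))
    return {problem: n for problem, n in counts.items() if n >= threshold}
```

With only three interviews and `threshold=2`, this would surface `slow_onboarding` and `confusing_ui` as candidate patterns; with the 8-10 customer bar above, you'd keep interviewing until something clears it.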
Here are some sample questions to help uncover real customer needs rather than validating your assumptions. They will help reveal the actual problems they're trying to solve and the emotional drivers behind their behaviors:
"Walk me through the last time you encountered [specific problem]?"
"What solutions have you tried before?"
"What's the hardest part about [specific task] for you?"
"How do you currently handle this situation?"
Common pitfalls to avoid (and yes, we've fallen into all of these):
Pitching your solution instead of listening - harder than it sounds
Asking leading questions that confirm your biases - we're all guilty of this one
Talking only to friendly users who won't give honest feedback
Focusing on features instead of problems
📋 Take Action This Week:
Schedule three customer conversations
Focus solely on listening and documenting problems - not solutions
Look for patterns across their responses
Write down the top three problems you hear mentioned repeatedly
2. Test Assumptions, Not Just Features
By 2008, we were facing a perplexing problem: While customers were signing up for IMVU in promising numbers, they were disappearing quickly after joining. Our conversion rates were suffering, and despite all our sophisticated A/B testing and data analysis, we couldn't figure out why.
I decided to test our fundamental assumptions about the user experience by setting up a simple user testing lab - really just a desk with a computer - in a back corner of our High Street office in downtown Palo Alto. Using Craigslist, I recruited potential IMVU customers, offering $50 for an hour of their time. What we discovered challenged everything we thought we knew about our product.
Sitting with dozens of users, I watched a painfully consistent pattern unfold: initial excitement as they signed up and discovered they could create an avatar to express themselves and potentially make new friends, followed by increasing confusion and frustration as they encountered not one or two, but NINE different user interface paradigms - three on the website alone, and six more in our "core" product that required a download and installation.
Here are some examples of what our users had to navigate:
1: Web catalog interface
Users started their journey here, browsing our catalog and making initial purchases.
But this experience bore little resemblance to what came next.
2: 3D virtual world interface
After downloading our application, users found themselves in this completely different 3D environment, with new navigation and interaction patterns to learn.
3: Messenger-style inventory
Then, managing their purchases required learning yet another interface, with its own distinct paradigm borrowed from instant messaging applications.
The sheer volume of distinct paradigms that users had to navigate was already a lot, but the most devastating moment came when users launched our core application, only to have it literally covered up by the web-based interface they'd just learned to navigate. Once hidden, users could never find their way back to the core application features. Almost every test subject eventually gave up in frustration.
Here's the ironic part: each of these conflicting interface elements had been the "winner" of some previous A/B test. We had fallen into a trap that many product teams encounter: running experiments without testing fundamental assumptions about user behavior. Each individual test had shown positive metrics, but we had failed to test our assumption that users could navigate between the different interface paradigms.
We were optimizing locally while creating global chaos.
I recorded the user interview sessions that demonstrated this problem and convened a meeting of our product development teams to watch users struggle through their IMVU journey. Seeing real people move from excitement to confusion to abandonment was a wake-up call that data alone couldn't provide. This led to a complete overhaul of our product experience, focusing on creating a coherent, unified interface.
Here’s how the new interface looked:
Unified IMVU Interface
The transformation was dramatic.
Instead of forcing users to navigate multiple disconnected interfaces, we created a single, intuitive hub that brought together all key functions. The new design provided clear, consistent access to everything users needed - chat, shopping, avatars, and social features - all within one cohesive experience. Each function was represented by a simple, clear icon, and users could access everything without leaving the main interface or learning new interaction patterns. This wasn't just an aesthetic improvement; it fundamentally changed how users experienced IMVU. Instead of fighting with the interface, they could focus on what mattered: connecting with others and expressing themselves through their avatars.
This experience taught us that testing our assumptions required a systematic approach. Here's the framework we developed, which is still relevant today:
List all the assumptions that your product idea depends on - even the ones that seem obvious. For example, we had assumed that users knew how to navigate between the user interfaces.
Rank them by risk and uncertainty - be brutally honest here.
Design small experiments to test each assumption - emphasis on small!
Set clear success criteria before running tests - no moving goalposts.
Be willing to pivot based on results - this is often the hardest part.
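One way to keep this framework honest is to write each assumption down as structured data, with its success criterion fixed before the experiment runs. Here's a hypothetical sketch–the field names, scores, and example records are illustrative, not a tool we actually used at IMVU:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Assumption:
    statement: str          # the belief the product depends on
    risk: int               # 1 (low) .. 5 (fatal if wrong)
    uncertainty: int        # 1 (well evidenced) .. 5 (pure guess)
    experiment: str         # the smallest test that could falsify it
    success_criterion: str  # decided BEFORE running - no moving goalposts
    result: Optional[str] = None

    def priority(self) -> int:
        # Test the riskiest, least-understood assumptions first.
        return self.risk * self.uncertainty

backlog = [
    Assumption("Users can navigate between our interfaces", risk=5, uncertainty=4,
               experiment="Watch 5 recruited users complete a purchase end to end",
               success_criterion="4 of 5 users reach the core app unassisted"),
    Assumption("Users want IM integration", risk=3, uncertainty=5,
               experiment="Landing-page signup test for the integration",
               success_criterion=">= 10% of visitors sign up"),
]
backlog.sort(key=lambda a: a.priority(), reverse=True)
```

Ranking by `risk * uncertainty` is a common shorthand for "be brutally honest": the assumption that would hurt most if wrong, and that you have the least evidence for, gets tested first.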
📋 Take Action This Week:
List your product's three riskiest assumptions
Design a small experiment to test each one within 5 days
Set clear success criteria before starting
Document what you learn, even (especially!) if it challenges your assumptions
3. Build Measurement Into Your Process
As we became more sophisticated with our customer development and started to really embrace the Lean Startup “Build-Measure-Learn” loop, we faced a new challenge: how to manage multiple experiments running in parallel.
After all, the Build-Measure-Learn loop is really just the scientific method applied to product development–but try running dozens of experiments simultaneously without proper tools and processes.
Things can get pretty chaotic.
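A lightweight way to tame that chaos is to make each experiment's position in the Build-Measure-Learn loop explicit. This is a hypothetical sketch, not IMVU's actual tooling; the stage names follow the loop itself, and the status board gives a one-glance view for standups:

```python
from enum import Enum

class Stage(Enum):
    BUILD = "build"
    MEASURE = "measure"
    LEARN = "learn"
    DONE = "done"

ORDER = [Stage.BUILD, Stage.MEASURE, Stage.LEARN, Stage.DONE]

class Experiment:
    def __init__(self, name, hypothesis):
        self.name = name
        self.hypothesis = hypothesis
        self.stage = Stage.BUILD  # every experiment starts in Build

    def advance(self):
        """Move to the next stage of the Build-Measure-Learn loop."""
        i = ORDER.index(self.stage)
        if i < len(ORDER) - 1:
            self.stage = ORDER[i + 1]

def status_board(experiments):
    """Group experiments by stage so bottlenecks are visible at a glance."""
    board = {stage: [] for stage in ORDER}
    for e in experiments:
        board[e.stage].append(e.name)
    return board
```

Even something this simple makes one failure mode obvious: if a dozen experiments pile up in Measure and none ever reach Learn, you're shipping tests without harvesting validated learning from them.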