
PRODUCT MANAGEMENT 101

NOTES ON KEY CONCEPTS AND TECHNIQUES


Tom Eisenmann

This document collects my notes about key concepts and techniques covered in the
Harvard Business School MBA elective, Product Management 101 (course
description; 2014-14 syllabus). The document is a work-in-progress which I’ll be
expanding/refining over the coming months. DO NOT ASSUME THAT YOU’VE
DOWNLOADED THE FINAL VERSION. The document is deliberately dense and
meant to convey lots of information on any topic with few words. As such, it is
structured as an outline — one that won’t be fun or easy to read, because it omits
examples and smooth prose transitions between ideas.

THE PM ROLE
DESIGN THINKING
CUSTOMER DISCOVERY
SOLUTION GENERATION
SOLUTION VALIDATION/MVP TESTING
THE MRD
UX PRINCIPLES
TECHNOLOGY BASICS
PRD/DRAFTING SPECIFICATIONS
WORKING WITH DEVELOPERS
AGILE VS. WATERFALL
PROJECT MANAGEMENT
USABILITY TESTING
LAUNCH MARKETING
QA BASICS
POST-LAUNCH ANALYTICS
SQL BASICS
A/B TESTING/QUANTITATIVE EXPERIMENT DESIGN
ADWORDS BASICS

THE PM ROLE

 Roles of PM and How They Vary Over Product Life Cycle


o Visionary/Strategist/Designer/”Voice of the Customer”; Business
Analyst; Project Manager/Coordinator; Integrator of Top Management
Perspectives and Functional Priorities; Advocate/Spokesperson
o Lots of responsibility but little formal authority because no one typically
reports directly to PM
o Agree with statement, “PM is the CEO of the product”?
o “Janitor” who’ll keep engineers productive
o Define “what” not “how”
o Sequence: 1) Problem Framing/Validation; 2) Solution Generation; 3)
Solution Validation/Iterative Refinement
 Relationships With Other Functions, Esp. In Big Companies
o Engineering
o Design
o Project Management
o Product Marketing
 How PM Role Varies in Startups vs. Big Companies
 How PM Role Varies in B2C vs. B2B Companies
 Attributes/attitudes of Great PMs
o Deep knowledge of market/customers/competitors
o Passion/instinct for great product design
o Comfortable with technology
o Tolerance for ambiguity
o Strong intellect and analytical skills
o Easily helicopters between big picture and fine detail
o Process and detail oriented; focused on deadlines/deliverables
o Crisp communicator
o No excuses/takes personal responsibility
o Team player/shares credit freely
o “Blocker” who’ll do mundane dirty work to keep team productive

DESIGN THINKING

Design Thinking: Definition and Process Overview


 Definition
o “a methodology that imbues the full spectrum of innovation activities
with a human-centered design ethos… powered by a thorough
understanding, through direct observation, of what people want and
need in their lives and what they like and dislike about the way
particular products are made, packaged, marketed, sold and
supported.” (Tim Brown, “Design Thinking,” HBR June 2008)
 Key elements
o Goal: a design that is desirable to customers; technically feasible;
and viable from a business perspective
o Iterative approach, relying on rapid prototyping
 Brown, HBR: “[goal is] not to validate preconceived hypotheses
but to help experimenters learn something new from each
iterative stab” [Note difference from lean startup approach,
which insists on initial hypotheses]
o Team based (rather than lone genius)
 Team ideally is co-located and has members representing
multiple disciplines (engineering, marketing, etc.) to get
diversity of views and breadth of solution concepts
o Does NOT relegate design to a downstream activity focused on
creating a “beautiful wrapper”
 Process: three main stages, but teams often loop back through stages in an
iterative manner (Brown, HBR)
o Inspiration (aka “exploration”). Frame the problem through deep
immersion in customer’s world to thoroughly understand needs
 Spend lots of time observing potential customers and gain
empathy for them (see “Customer Discovery” section below for
overview of techniques often employed)
 In the lean startup approach, Steve Blank’s “customer
discovery” process step — which addresses the question,
“Have we found a problem worth solving?” by forcing
entrepreneurs to “get out of the building” — is closely related
to activities completed during the inspiration stage (see
“Customer Discovery” section)
o Ideation. Then, brainstorm, leveraging divergent thinking to generate
multiple solutions
 Prototype potential solutions and get feedback from potential
customers (see “Solution Generation” section below)
 Avoid attachment to any one solution
 Strong analytical thinkers are prone to jumping to a
single solution and defending it vigorously (Kelley &
Kelley, Creative Confidence, p. 28)

o Implementation. Finally, use convergent thinking to select the most
promising solution.
o Process is iterative, so you alternate between
ideation/generation/divergence and
implementation/validation/convergence, broadening then narrowing
the solution space as you go – with a trend toward overall
narrowing (depicting the solution space under exploration on the
vertical axis against time on the horizontal axis yields an
accordion shape, bigger on the left than on the right)
 Critique (Buxton): “so fundamental to their practice that designers see no
more need to talk about it in a book than to mention that they also need to
eat breakfast”
 Relationship to product management

CUSTOMER DISCOVERY

Customer Discovery: What and Why?

 Purposes: confirm that you have identified a customer problem worth
solving – BEFORE overinvesting in building and marketing a product no one
really wants. Generate hypotheses about unmet customer needs, and gain
insight on potential solutions.
o An entrepreneur must first validate the problem (i.e., customer
need), to confirm that a strong unmet need exists; then they must
validate the market (defined as a group of customers who share a
problem), to confirm that the market is big enough to warrant further
development; then they must validate the solution, by confirming that
large numbers of customers will pay for it. Customer discovery
represents the first step in this process, i.e., problem validation.
(Klein, pp. 6-7)
o Entrepreneurs frequently skip customer discovery and later regret it,
after spending too much time developing a solution eventually
revealed to have limited demand. DON’T DO THIS!!
 Format of customer problem hypotheses should be: “I believe [customer
segment] experiences [problem] when doing [task] AND/OR because of
[constraint].” (Alvarez, p. 22)
o For example, “I believe that [new parents] experience [fear about
infant health] when [putting infants to sleep]” and “I believe
[professional men] experience [potential embarrassment] because
[they lack time to purchase new underwear when old ones wear out]”
o “Problem” might also be expressed as “need,” since some solutions
address desires rather than pain points.
o Unmet needs can be measured on two dimensions: 1) SEVERITY, i.e.,
magnitude of gap between optimal performance and performance of
existing solutions; and 2) IMPORTANCE of the need to customers.
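A minimal sketch of how these two dimensions might be combined to rank candidate needs; the example needs, the 1-5 scales, and the multiplicative scoring rule are all illustrative assumptions, not from the course notes:

```python
# Hypothetical sketch: ranking unmet needs by SEVERITY x IMPORTANCE.
# The needs and 1-5 scales below are illustrative, not from the notes.

needs = [
    # (need, severity 1-5: gap vs. existing solutions, importance 1-5)
    ("fear about infant health at bedtime", 4, 5),
    ("embarrassment over worn-out underwear", 3, 2),
    ("forgetting to reorder household staples", 2, 3),
]

def priority(severity, importance):
    """Composite score; multiplying the dimensions (one common
    heuristic) ranks a need low if it is weak on either one."""
    return severity * importance

# Highest-priority needs first
ranked = sorted(needs, key=lambda n: priority(n[1], n[2]), reverse=True)
for need, sev, imp in ranked:
    print(f"{priority(sev, imp):>2}  {need}")
```

In practice the scores would come from interview and survey evidence, not guesses.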

Research Techniques

 Key issue: should you focus customer discovery on early adopters?


o Probably, since you typically must sell to early adopters initially
o But be wary when mainstream adoption is essential for business
model and when earliest adopters have different needs than
mainstream customers, as with tech-savvy “power users” or
individuals whose self-image revolves around being first to try new
products
o Steve Blank’s criteria for targeting customer discovery efforts: seek
individuals who have a problem; know they have it; have been
actively seeking a solution; and can pay for your solution.

o Often, you can learn a lot from interviewing extreme users (e.g., heavy
users, individuals who use product in unusual ways, etc.)

 Quantitative research answers “What, how many, how often?” Qualitative
research explores “Why?” Consequently, qualitative research generates
hypotheses; quantitative research tests them. Qualitative and quantitative
research thus tend to follow each other in iterative cycles.
 Recruiting participants: see Goodman, Ch. 6
o It’s important to have a script when recruiting participants
 Request must address interviewees’ concerns about time
commitment and privacy.
o Target potential early adopters via personal referrals, social
networks, trade shows, store intercepts, landing page, etc.
o Craft your email request with care: a well-crafted request can make
a big difference in hit rates

o Prepare for “No!”
o Be wary of relying exclusively on friends/family who may not be in
target market and may tell you what you want to hear
o Commercial services can be cost effective when you have budget, but
you need to provide clear direction on who to include/exclude (e.g.,
due to employment by rival) etc. Cost is about $100-150 per
participant, excluding any incentives to interviewee
o Incentives may be appropriate, probably in the range of $1.50/minute
for consumers and $2/minute for B2B interviewees

Customer Interviews
• Purposes: both generating and validating hypotheses
– Use interviews to generate hypotheses regarding: 1) potential
customer segments and their unmet needs; 2) potential customers’
current solutions for meeting those needs, and their reasons for
dissatisfaction with those solutions; 3) structure/motivations of
decision making unit; and 4) potential barriers to customers adopting
your envisioned solution (e.g., budget constraint; risk aversion).
– You have validated hypotheses when multiple interviews converge on
the same conclusions
• “Validation” doesn’t mean you have found “truth” – it means
you have, at least for now, failed to falsify the hypothesis and
you can proceed in the same direction with some confidence
• Limitations
– Interviews shouldn’t be used to collect feature lists or solicit product
design advice; rather, you should ask how envisioned solution might
help better perform a specific task
– Interviews are not good for pricing research, due to self-interested
behavior or lack of buying authority

– With radical innovation, potential users may not be able to express
preferences, but they can still describe their problems with current
solutions
• Guidelines for conducting customer interviews
– Set 2-3 explicit research objectives; develop interview guide
– Select sample and recruit interviewees (see above for guidelines)
– Decide who else will join you
• Who needs input: engineers? sales?
• 2+ is better: one moderator + one note taker
• However, consider risk of disjointed jumps with
multiple interviewers, so coordinate roles
– Brace yourself emotionally to hear things you don’t want to hear (Giff
Constable)
– Conduct interview
• Determine approach: on site with customer is preferable if
physical environment plays a role in problem/solution;
otherwise, phone interview is typically more efficient
• Decide if you’ll record (Alvarez p. 83). Pros: capture
everything; you can listen more carefully. Cons:
awkward request to start; increases analysis time;
makes some interviewees more cautious.
• Length: 45 minutes is ideal according to Alvarez, p. 100,
but Cespedes says 1-2 hours is appropriate for field
interview
• Number of interviews: Alvarez (p. 116) suggests you
tend to hit diminishing returns in terms of hypothesis
generation/validation after 15-20 interviews.
• Choose first questions carefully; don’t overdo icebreakers
• Alvarez, pp. 88-89: make interviewee confident she’ll be
helpful; set expectations (“I’ll mostly be listening”); get
interviewee talking by being quiet yourself after posing
first substantive question.
• Disarm politeness: tell people you need honest answers,
including criticism (Giff Constable)
• Define terms
• Use mostly open ended questions and listen WAY more than
you talk
• Restate answers to confirm understanding
• Some basic customer discovery questions (Alvarez p. 60)
• Tell me how you do [task] today.
• Do you use any [tools/tricks/apps] to do the task?
• In completing the task, if you could wave a magic wand
and do anything you can’t do today, what would it be?
• Last time you did task, what were you doing right
before and right afterward?
• Should I have asked anything else?

• Justin Wilcox’s script:
1. What’s the hardest part about [problem context]? [Context should not be too
narrow or broad]
2. Can you tell me about the last time that happened? [stories are especially vivid]
3. Why was that hard?
4. What, if anything, have you done to solve that problem? [if they aren’t looking for
a solution already, we don’t have a problem worth solving!]
5. What don’t you love about the solutions you’ve tried? [this becomes your Unique
Selling Proposition]

Other questions
 How often do you experience this problem?
 How much are you spending to solve this problem now?
 Where do you find information about [problem context]?

• Listen for emotion and frustration (Alvarez, p. 65)


• Probe for constraints: awareness of problem, potential
solutions, limited resources, social/political acceptability of
change (Alvarez, pp. 71-76)
• Focus on past or present, not future (Alvarez, p. 69). Asking
“In the future, would you do XYZ?” will get an inaccurate
answer. People are too optimistic and they want to please you.
• Not: “How likely would you be to do XYZ?” Rather: “Tell
me about the last time you did XYZ?”
• Not: “How much would it cost your company if XYZ
happened?” Rather: “How much did it cost last time XYZ
happened?”
• Avoid leading questions or framing (Alvarez, p. 93), e.g.,
“Don’t you think XYZ?” “Would you like it if XYZ?” “Most people
would say XYZ”
• Avoid judgmental language and never say interviewee is
wrong, even if their understanding of product is seriously
flawed
• Steer interviewee away from offering views about feature
ideas or specific solutions (Alvarez, p. 97)
• “Be very careful asking customers what they want.
They’ll typically send you chasing unicorns. ‘But what I
REALLY want is a car that can hover and uses sadness
as fuel and produces happiness as exhaust.’” (LIFFFT)
• “First rule of validating your idea. Don’t talk about
your idea.” (Justin Wilcox) 2nd rule “Don’t ask about
the future.”
• When closing, keep door open for follow up and ask for
introductions to others who could be helpful. Say thank you!
– Debrief/record findings as soon as possible, before impressions fade:
key learning vs. objectives, surprises, comparison to earlier
interviews, process changes (Cespedes, p. 7). Cespedes suggests after-
action review: how did we do vs. intent? How can we improve? Also
focusing on process, Alvarez suggests (pp. 102-103):
• Did opening go smoothly?
• Did I ask leading questions?
• Did I get any bland answers?
• Did I use new questions on-the-fly that worked well and should
be added to interview guide?
• Was there anything I failed to ask/learn?
• Where did interviewee show the most emotion?
• Human-centered design interview tactics
– “Show me” and “Draw it” – ask interviewees to demonstrate things
they interact with
– Toyota Production System “Five Whys”
– Ask interviewees to “think aloud” as they complete a task

Focus Groups
• Purpose: typically, exploring customers’ current usage patterns and attitudes
– Group setting can be especially effective for eliciting reactions in high-
involvement categories with emotional, status, or life-style
associations.
– Good at finding desires, motivations, values and memories (Goodman,
p. 142)
• Limitations
– Best for generating or disconfirming hypotheses (answer to “Do you
want…?” is usually “Yes”), rather than validating hypotheses
– Best for exploring usage of/attitudes toward existing products by
mainstream customers. Subjects often struggle to state preferences
for products they haven’t experienced, and early adopters’ views may
be too disparate to converge in a group discussion.
– Conducting an effective focus group requires moderator skill, so if
budget allows, rely on a professional researcher
• Guidelines: 6-10 users of a homogeneous type (to encourage people to open
up with less fear of being judged), often compensated, 60-90 minutes
– Introduce objectives & ground rules (e.g., anonymity); participant
intros; how/when they use/bought product; discussion of
likes/dislikes
– To avoid herding, ask subjects to jot down ideas first
– Use whiteboard to guide discussion; never disagree or take sides
(“How do others feel about that…”)

Observation/Ethnography
 IDEO: you can learn a lot from extreme users (e.g., an eight- or eighty-year-old
struggling with a can opener)

 Likewise, look for “hacks” and workarounds to identify unmet needs.
 Synthesize observations using empathy map with four quadrants: DO (lower
left), SAY (upper left), THINK (upper right), FEEL (lower right). Then look for
new/surprising/contradictory patterns to gain insight on latent needs
(Kelley & Kelley, p. 223)
 Journey map describes the steps a customer undertakes in solving a
problem/completing a task, arrayed in a timeline, “with special attention to
emotional highs and lows and the meaning that the experience holds for the
customer” (Liedtke, “Ten Tools for Design Thinking,” UVA case BP-0550)
– Goal: identify pain points/unmet needs
– Include even small steps (Kelley & Kelley, p. 234)
– Can be completed for different personas (see below)

Customer Surveys
• Purpose: assess 1) usage patterns; 2) past purchase behavior; 3) satisfaction
with existing solutions; 4) strength of needs; 5) feature preferences (but be
wary of responses!); 6) purchase intent (again, be wary of responses!); 7)
correlation of above with demographic/behavioral/psychographic attributes
(for segmentation)
– Responses can provide input for estimate of potential market size
• Limitations
– Must understand problem well enough to frame appropriate
questions, so surveys should typically be done only after a round of
customer interviews
– Survey responses regarding purchase intent for radically new
products are not reliable
– More broadly, questions should focus on past/present rather than
expected future behavior, especially when exploring an aspirational
product (e.g., “Would you like to go to the gym more often?”)
• Guidelines
– Conduct pilot testing to ensure question clarity and avoid excessive
length
– As with interviews, avoid leading questions such as “Would you like
XYZ?”
– Ensure adequate sample size; avoid convenience sampling
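The “adequate sample size” guideline above can be made concrete with the standard formula for estimating a proportion, n = z²·p(1−p)/e². The function name is my own, but the formula is the textbook one:

```python
import math

def survey_sample_size(margin_of_error, confidence_z=1.96, p=0.5):
    """Minimum responses needed to estimate a proportion within the
    given margin of error. z = 1.96 corresponds to ~95% confidence;
    p = 0.5 is the conservative worst case. n = z^2 * p(1-p) / e^2."""
    return math.ceil(confidence_z**2 * p * (1 - p) / margin_of_error**2)

print(survey_sample_size(0.05))  # 385 responses for +/-5% at 95% confidence
print(survey_sample_size(0.03))  # a tighter margin needs far more responses
```

For small target populations, a finite-population correction would shrink these numbers.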

Competitor Analysis
 Competitor feature audit, including benchmarking (e.g., with letter grades,
“moon charts”)
 Surveys can uncover frequency of use, features used, strength of loyalty, etc.
 Competitor usability testing (see section below for “How To” guidance) can
uncover unmet needs for existing solutions

Personas
What Are Personas?
• A persona is a precise description of a representative user of a given type.
o How are personas different from customer segments? Customer
segments, which typically reflect shared
demographic/behavioral/attitudinal characteristics revealed through
large sample quantitative analysis, are used to target marketing
efforts. Personas reflect personal stories/goals/motivations, as
revealed through one-on-one interviews and ethnographic research;
they typically are used to provide context on real customers’ needs in
the product design process.
• Persona description might include: 1) image and name (NAME is crucial!); 2)
key demographic and behavioral information or job description; 3) key
needs/worries/motivations/GOALS and their implications for a solution; 4)
quotes
o Some generic needs/tradeoffs to consider in developing personas
(Alvarez, pp. 25-26)
 Preference for time vs. money
 Risk aversion/prefer predictability vs. novelty seeking
 Decision maker vs. taker
 Independence vs. reliance on others’ views
 How tech savvy?
• 3-7 personas are typical for a business; 1-3 should be primary. Having too
many suggests a product that’s too diffuse. Cooper, 135: Sometimes good to
include personas of potential users you are NOT designing for. Can have
personas for influencers/buyers, but you don’t design UI for them.

Why/When/How to Use Personas?


 During customer discovery, developing personas pushes the team to refine
its understanding of customer needs
 Solution generation is also informed by personas: Will proposed solution
meet Mary’s needs?
 During specification, user stories should be drafted and a unique UI should
be designed for each of 1-3 primary personas
o If you design for the largest number of potential users, you get a
mushy product that meets no one’s needs well. Cooper, 125: 80% in
Chrysler Dodge Ram focus groups hated it, but 20% loved it. Must
design products people LOVE!
 Throughout the design and development process, personas serve as a
communication tool for the team. Everyone must be deeply familiar with
them!
 In Cooper’s approach, scenarios entail PERSONAS completing TASKS to fulfill
GOALS.

Example Persona

SOLUTION GENERATION

Design principles (see also design principles under “UX Principles”)


 Keep your focus narrow in order to go faster (Alvarez, p. 23)
 IDEO: designs should be desirable to customers, technologically feasible,
and economically viable from a business model standpoint
 Avoid premature attachment to a specific solution
 Buxton: profitably improving an existing product (and motivating users to
purchase an upgrade) gets harder over time (incremental value grows modestly
while the incremental cost of upgrades grows sharply) because
o Low hanging fruit, in terms of functionality, is added early
o Existing users are resistant to change once they master product use
o Integration of new features is more difficult as product’s complexity
grows (harder to rewrite code)
 Cooper, ch 10: focus should be on user’s GOALS, not tasks. Goal reflects end
condition; tasks are means to goals. Tasks change as technology changes;
goals do not.
o Programmers tend to do task-oriented design, because programming
is inherently procedural and reflects the steps needed to complete
specific tasks. Example: user Jennifer must monitor computer and
periodically reconfigure to ensure peak performance. Programmer
will build separate functions for monitoring and reconfiguring,
because these functions are distinct in terms of programming tasks
required. A UI designer will instead invoke reconfig controls only when
monitoring suggests non-peak performance; from then on, monitoring and
reconfig controls are always shown together, so Jennifer can watch
performance change as she tweaks the config.
o Fundamental personal goals for all users: not feeling stupid, not
making mistakes, having fun, getting enough work done. Corporate goals:
boost profits, boost market share, etc. Practical goals bridge personal
and corporate goals, e.g., “record the client’s order.” False goals relate
to specific technologies or tasks, e.g., save memory, run in a browser,
be easy to learn, be graphically appealing, ensure cross platform
consistency, etc.
o Research shows that humans react to computers like they do to other
humans. So, software should be “polite” (not “please and thank you,”
but rather interested in me, e.g., remembers my preferences e.g.,
monitor settings; deferential to user and respectful of user’s
preferences; generous with relevant info on condition; helpful when
trying to complete tasks; makes common sense assumptions about
which controls should be front-and-center and what I need, e.g., if I
ask for food at restaurant I need silverware too; trustworthy; self-
confident, e.g., needn’t check if I really want to delete file; responsive;
perceptive to see I often do X after I do Y; etc.), because this is a
universal human trait. Programmers struggle with making computers
more human because they see humans as imprecise and imperfect
computing devices.
o Norman, 42: use root cause/5-Whys analysis to understand true goals.
Levitt: users don’t want a quarter-inch drill, they want a quarter-inch
hole. Norman: true, but not the root cause; users want to hang a picture,
or, deeper still, to decorate their house.

Brainstorming/Structured Ideation
 Goal: generate LOTS of new solution concepts, leveraging design criteria and
insights about personas and their needs from exploration phase
 IDEO rules for brainstorming: defer judgment; encourage wild ideas; stay
focused on the topic; build on the ideas of others (Brown, Change by Design,
p. 78)
o Brown: “Butterfly test” gives small post-its to each participant, used to
vote on which ideas should move forward. [Note that this approach is
vulnerable to social pressure.]
o Start with a crisp problem statement and share relevant information
on personas, competitors’ efforts to solve problem, etc.; number ideas;
use space and cover walls; avoid deference to HiPPO – highest paid
person’s opinion (Kelley, Art of Innovation, p. 56)
 Techniques to encourage creative problem solving
o Envision how a negative can be a positive (Liedtke)
o Pretend to be someone else (Liedtke)
o Think about the opposite (Kelley & Kelley, p. 102)
 “Design Studio” technique (Jeff Gothelf, Lean UX, pp. 37-41)
o Multidisciplinary team of 5-8
o Problem definition and constraints (30 minutes)
o Individual idea generation (10 minutes): ask each team member for
six ideas. Encourage sketching.
o Presentation/critique of ideas: each member states persona and pain
point their idea addresses; critique seeks to clarify intentions rather
than evaluate idea
o Iterate and refine: individually, members spend 10 minutes
developing a single “big idea,” then the group critiques these
o Team idea generation, converging on a solution with best prospects
for success (45 minutes)
 Goldenberg et al. (HBR March 2003) patterns for generating product
innovation ideas. Start by breaking product into components, then explore
patterns
o Subtraction especially to avoid feature creep/excessive complexity,
e.g., Philips removing all buttons from DVD player
o Multiplication with altered copies, e.g., Gillette double-bladed razor
(with 2nd blade at different angle to improve the shave)
o Division into components then reintegrate parts in new way, e.g.,
stereo components broken out of integrated HiFi units

o Unification of task previously performed by separate elements into
one element, e.g., defroster that serves as radio antenna. Good when
cost control is imperative.

Prototyping: Why and What? (HH = Houde & Hill, 1997)


 HH: Prototype = any representation of a design idea, regardless of medium. Is
a brick a prototype? It is, if it is used to represent the weight and scale of
some future artifact.
 HH: different purposes for prototypes -- they explore:
1. Role, i.e., function artifact serves in user’s life. Designer wants to
know whether artifact will satisfy an important unmet need. This
requires establishment of user’s context. E.g.,
 paper storyboard depicting use of pen-based computer.
 Manually interactive paper prototype with user selecting
action on a “screen”, then designer replacing, by hand, that
screen with the appropriate next one.
 Knowledge navigator video (NB: lots of fidelity! This is
explained below)
2. Look and feel, i.e., sensory experience of using artifact – aesthetic
appeal and usability. Requires user experience to be simulated, so
users can interact and designer can discern ways to improve the UX.
E.g.,
 “GloBall” toy that talks to and rolls toward/away from kids,
based on kids’ vocalizations– simulated with radio-controlled
car and walkie-talkie inside a ball, both operated by unseen
designers
 ergonomics of an architects’ computer simulated by asking an
architect to carry around a pizza box all day
3. Implementation, i.e., whether activities and components required to
make/deliver artifact are technically feasible. Usually requires a
working system, or at least portions of it.
4. Integration of all above types to explore complete user experience
o Types may be developed in the order above, or in parallel. Clarity on
implementation constraints, for example, may shape role design.
 Prototype purposes also depend on audience (Han & Mendelson,
“Prototyping: A Quick Introduction,” GSB case E-414)
o Help team clarify product, and resolve technical challenges
 Software developers often grasp product requirements more
easily when they have a prototype to work with
o Engage potential customers to get feedback
o Demonstrate concept to stakeholders
 “Prototypes should command only as much time, effort, and investment as
are needed to generate useful feedback” (Brown, HBR) [Note consistency
with lean startup approach]

o Should be disposable. This helps avoid premature attachment to
specific solutions.
 Prototypes vary in terms of fidelity, i.e., closeness to eventual design
o Early, low fidelity prototypes focus on assessing value to potential
customers; later stage prototypes are often of higher fidelity and are
intended to resolve technical issues, gain clarity on workflow or
integration, secure feedback on visual design, etc.
o Brown, Change by Design, p. 91: surface details may require attention
so that potential customers are not distracted (i.e., false negative risk)
o Prototypes should be clean/clear/obvious. They should include some
images. But too much visual polish in prototypes (e.g., obsessing over
colors, font size, etc.) is likely to entail wasted effort as designs
change. (Klein, p. 129)
 Also, having a polished design may make you sub-consciously
less likely to make major and necessary changes.
 Likewise, a polished design feels more “done,” so testers are
more inclined to be polite in giving feedback. Or, they may be
distracted by visual elements (color choices etc.) and not focus
enough on ease of interaction etc.
o HH: high vs low fidelity distinction can be misleading. Focus on
purpose of prototype and intended audience, not tool used to create it.
It isn’t always the case that a finished-looking prototype is used late in
the design process, and vice versa. Depends in part on audience. E.g.,
Apple Knowledge Navigator video was meant to make the concept
concrete for a broad audience
o Buxton pp. 383+ cites research showing that quality of prototypes
didn’t impact test subjects’ perceptions of usability
 Buxton 383: subjects are reluctant to criticize designs because it gives the
impression they are negative people and they don’t want to hurt the tester’s
feelings.
o BUT!!! When shown 3-4 designs, the willingness to criticize goes
away. Same design gets much higher rating when shown alone than
when shown in tandem with rival design ideas.
 Formats for early stage software prototypes:
 Sketches, which Buxton distinguishes from prototypes
 Sketches (vs prototypes): evoke and explore in tentative manner
(vs describe, refine, test)
 Storyboards to demonstrate user experience, often with post-its or note
cards allowing easy manipulation
 Wireframes — skeletal representations of a software application or
website that map key features/functions, navigation elements, and links
between screens
 Paper prototypes – can rely on manual replacement of “screens” to
simulate interaction
o Pros: first iteration is fast
o Con: rapid iteration can be tedious; simulation of
interaction is artificial
 Interactive wireframes (as with Balsamiq) allow users to click on
elements and be transported to the destination screen
 When wireframing, writing copy for navigation elements forces
you to understand what a screen is about (Klein, 124)

How Does Prototype Testing Relate to Next Section’s Topic, MVP Testing? A
prototype is a representation of the envisioned system, and may be functional or
non-functional. Product prototypes can be used in ways that conform to the
definition of an MVP. Soliciting user reaction to a prototype (“Would this software
help you do XYZ?”) is one way to validate a solution. Likewise, functional prototypes
can be used to confirm that key technologies will work as intended.
 How do the MVP categories in the next section relate to prototypes? “Limited
Use MVPs” require a functional prototype; typically, so do “Wizard of Oz
MVPs,” which have a user-facing interface that resembles that of the
envisioned application but an invisible backend. Concierge MVPs tend to rely
on manual front- and backend processes; they are not really faithful
representations of the envisioned system. Smoke tests may or may not
attempt to represent the envisioned system. A landing page test may share
only a text description of the planned service. A video MVP – like the one
used to demonstrate Dropbox before its beta test – will often depict a
prototype.
SOLUTION VALIDATION THROUGH MVP TESTING

Key Lean Startup Principles


 Most startups fail by wasting money building/marketing a product that no
one wants
 Instead, you should launch early and test business model hypotheses with a
minimum viable product
 Pivot when a hypothesis is wrong
 Repeat until you have product-market fit (PMF)
 Don’t scale until you have PMF

Definitions
 Lean startup: A new venture that tests business model hypotheses using
Minimum Viable Product tests. “Lean” does not necessarily imply “low cost”;
rather, it refers to an imperative to “avoid waste.”
 Minimum Viable Product: The smallest set of product functionality and
operational capabilities — including the possibility of a non-functional
product simulacrum — required to rigorously disprove a business model
hypothesis.
 Falsifiable hypothesis: A hypothesis is falsifiable when it can be
disconfirmed through an experiment. For example, NOT “Our product will
grow through word of mouth”; RATHER, “Our viral coefficient will exceed 0.5.”
 False negative: A test result that indicates a hypothesis has been
disconfirmed when in reality it is valid. Common cause: reaction to badly-
designed MVP
 False positive: A test result that fails to disconfirm a hypothesis that in
reality is not valid. Common cause: sampling friends or super-enthusiasts
who aren’t representative of “typical” early adopters
 Pivot: Changing some business model elements while retaining others.
 Product-market fit: Occurs when the venture has the right product for the
market: one with demonstrated demand from early adopters and with solid
profit potential. Lean startups do not commence scaling until they achieve
product-market fit
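The falsifiable-hypothesis definition above can be made concrete with a quick calculation. This is an illustrative sketch only: the function names, the input numbers, and the 0.5 threshold are assumptions chosen to mirror the example in the text, not part of any particular analytics tool.

```python
# Illustrative sketch: turning "our product will grow through word of mouth"
# into the falsifiable form "our viral coefficient will exceed 0.5".
# All names and numbers here are hypothetical.

def viral_coefficient(invites_sent_per_user: float,
                      invite_conversion_rate: float) -> float:
    """K = average invites sent per user * fraction of invites that convert."""
    return invites_sent_per_user * invite_conversion_rate

def hypothesis_confirmed(k: float, threshold: float = 0.5) -> bool:
    """The hypothesis survives the test only if K exceeds the stated threshold."""
    return k > threshold

# Example cohort: each user invites 4 people, and 10% of invites convert.
k = viral_coefficient(invites_sent_per_user=4.0, invite_conversion_rate=0.10)
print(round(k, 2), hypothesis_confirmed(k))  # K = 0.4: hypothesis disconfirmed
```

Because the hypothesis is stated as a measurable threshold, an MVP test either clears it or it doesn't — there is no room to rationalize an ambiguous result.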

Advantages of Launching Early


• Gets reliable feedback quickly by putting a real product in the hands of real
customers in a real world context – ideally, one in which customers are asked
to make a commitment, e.g., by purchasing the product.
• Fast cycles/small batches serve to
– Accelerate feedback
– Make it easier to diagnose and fix product problems
• Launching early may be more risky with
– Mission critical activities
– Industries with small numbers of large customers who regularly
compare notes on vendors
– Products that are especially vulnerable to IP theft (e.g., distinctive
designs that can be easily/legally copied)

MVP Purpose

Tests one or more business model hypotheses.

Need not be a functional product, or even a non-functional prototype.

Typical concerns
 Reputation damage due to poor quality and/or limited
functionality/capability. But: 1) quality won’t necessarily be poor – it should
be set at a level that yields reliable test feedback; and 2) small numbers limit
impact.
 Early launch will lead to concept theft. Maybe, but value of learning typically
more than offsets this risk.

MVP Types

Relative to the envisioned solution, MVPs constrain functionality and/or
operational capability.

Limited Use MVP reduces product functionality relative to envisioned solution, e.g.,
Intuit SnapTax (Ries, pp. 29-31); IMVU (Stanford case E254)

Manual MVP constrains operational capability, relying on manual operations
instead of software, equipment, etc. These solutions make sense when automated
operations would be expensive to develop and, once built, expensive to modify
 Concierge MVP relies on a fully manual solution in ways that are evident to
the customer, facilitating learning about customer needs and response to the
solution through frequent, intense direct interaction
o Example: Food on the Table, Ries pp. 99-102
o Example: Rent the Runway trunk show trial, HBS case 812-077.
 Wizard of Oz MVP (aka Mechanical Turk MVP) relies on manual operations
in ways that aren’t evident to the customer.
o Example: Aardvark (HBS case 811-064)
o Example: approving orders manually rather than building fraud
detection system (Klein, p. 91)

Smoke Test (Alvarez p. 134) radically constrains both functionality and operational
capability, and thereby gauges commitment before product is built, ideally by
securing an actual pre-order. Capturing an email demonstrates less commitment.
Variants include:
 Landing page test
 Button to nowhere: click on a button/link that sends you to a “coming soon”
message. The modest cost is the user’s reaction: “What else can’t I trust here?”
 Video MVP (e.g., Kickstarter)
o Tips for creating a video prototype (via Toy Lab in Kelley & Kelley, p.
135)
 Use a script and a shot list; mix camera angles and shot styles
to avoid boring rhythm
 Use voiceovers to convey meaning/backstory
 Pay attention to lighting and sound quality
 Get early feedback
 Keep it short -- < 2 minutes
 Letter of Intent: often used with enterprise software
o Typically legally non-binding statement that if product with XYZ
specifications is delivered by XYZ date, customer intends to buy at
XYZ price. Even though this isn’t a legal contract, it still signals some
level of commitment, because customers don’t want to have to deal
with a vendor who can cite the LOI and say, “You reneged on a
promise.” There could be reputational consequences from reneging.
o Entrepreneur/PM can learn a lot about customer needs by negotiating
the language of an LOI.

Lean Startup Myths


• Myth: Lean startups are all bootstrapped.
• Reality: Hypothesis testing mindset continues after a startup shifts
from business model validation to business model optimization. In the
latter phase, firms may spend aggressively on product development
and customer acquisition.
• Myth: LS yields crappy products
• Reality: MVP quality should be good enough to yield reliable test
feedback, but no better. If test subjects expect quality and would
reject a sub-par product, then an MVP must deliver quality or risk a
false negative.
• Myth: Lean startups are driven by customer feedback, and the combination of
customers who can’t express a need for innovative solutions and iterative
improvement yields products that represent only incremental improvements
• Reality: The LS approach starts with a founder’s vision, which is
translated into hypotheses, which are validated through customer
feedback. Vision and hypotheses drive development efforts; customer
feedback doesn’t direct efforts, but it may prompt their redirection.
• Myth: LS is a new management approach
• Reality: “Test then invest” has been a mantra for entrepreneurs for
decades; direct marketers have long used the equivalent of “smoke
tests”; design thinking practitioners embrace rapid prototyping. LS
builds on many antecedents, but adds some distinctive ideas, in
particular, testing all elements of a business model using a series of
MVPs.
• Myth: Only for software/web/mobile products
• Reality: Rapid iteration is easier with software-based businesses, but
3D printing and other prototyping tools are speeding product
development cycles in hardware.
• Myth: LS is only for startups
• Reality: “Fast and frugal” testing when launching new ventures makes
just as much sense for big corporations as for startups.
MRD

The Market Requirements Document: What and Why?

• What:
• Why:

Sections of the PM101 MRD

• Vision
– Use positioning statement template from Moore or others
• Customer Value Proposition
– Unmet Needs to be Addressed: Hypotheses
– Customer Segments to be Targeted: Hypotheses
– Key Use Cases: Hypotheses
– Existing Solutions and Their Shortcomings
– Key Requirements for a New Solution: Hypotheses
– Expected Sources of Differentiation
– Why Now?
• Evidence Supporting Hypotheses
– Evidence for Unmet Needs
– Evidence of Demand for Proposed Solution
• Market Size
• Go-To-Market Approach
• Risks and Key Dependencies
• Strategic Considerations
• Team Members
• Go/No Go Recommendation
UX PRINCIPLES

UX design principles (see also design principles under “Solution Generation”)


 UXBooth
o Interface as magic: a great interface draws no attention to itself – it’s
invisible
o Designers should not reinvent when familiar patterns are available
 Norman, Design of Everyday Things – fundamental principles of interaction:
o Affordances are opportunities for action, e.g., a river affords an
opportunity to swim. Affordances are not properties of objects, they
are relations between properties of a physical object and a person. A
chair affords support and therefore affords sitting. Signifiers are often
used to show where an affordance action should take place (e.g., pull
handle on a door; arrow + person icon indicating that user should
swipe up to see contacts).
o Mapping that yields spatial correspondence between controls and
devices being controlled is important, as with steering wheel or
positioning of knobs to control a grid of four stove burners. Controls
should be close to item being controlled.
o Feedback is crucial; people hate feeling stupid when using a system.
Feedback must be prioritized, and too much feedback is bad: results in
feedback being ignored or turned off.
o Conceptual model is a simplified explanation for how things work.
Ideally, structure of object/system provides clues to its function, as
with holes for fingers in scissors. Too simple a model can cause confusion, e.g.,
when cloud based email software shows folders that aren’t available
when not online; turning up a thermostat to maximum doesn’t heat
room to 90 degrees. Some objects provide no clues, e.g., how to use
buttons on side of a digital watch.
 Cooper: “UI design” is all the stuff that affects the user; the rest is “program
design” (e.g., which programming language to use – unless this choice
impacts response times etc.). It’s REALLY important to do UI design before
coding, but many companies skip this step
o Once bad software is written, it’s hard to throw it out – programmers
get emotionally attached
o Agile iteration without advance UX research as a way to get good
software is costly
 Cooper, ch 11: users segment by proficiency into three groups
o Beginners. Marketers and management tend to focus on them a lot,
and push for designs that suit their needs. But no one likes to be a
beginner, so no one stays in this segment long. They either become an
intermediate or attrite.
o Power Users. Programmers relate to them and love to build for them.
o Intermediates. No one champions design for the most important
group!
 Cooper: be wary (esp. with programmers not trained in design) of self-
referential trap, i.e., imagining yourself as user
 Norman ch 5 analyzes user error. Categorizes causes, including deliberate
error (e.g., propping open a door; exceeding speed limit); multitasking;
fatigue; carelessness due to boredom; time stress. Design lessons:
o Distinguish between slips (correct goal, flawed execution) and
mistakes (wrong goal)
 Slips happen due to lower-level cognition problems during
execution. They are action-based (wrong action, e.g., pour
orange juice instead of milk into coffee) or memory lapse-
based (forget to do action, e.g., forget to turn off stove burners)
 Experts are more vulnerable to slips because they go on
autopilot
 Capture slips happen when procedures have similar
opening actions then diverge. You’ve been doing
procedure A a lot, then switch to B, but slip because you
keep following action path for A
 Description slips (e.g., throw dirty shirt into toilet,
instead of laundry basket) happen with controls that
are too similar. Important for different functions to have
different looking controls/displays (often violated in
airplane cockpit)
 Memory-lapse slips e.g., leave card in ATM or originals
in copier. Often caused by interruptions or cognitive
load of so many steps. Avoid with vivid reminders or
forcing, e.g., can’t complete transaction until you
recover your card.
 Mode-error slips, e.g., turn off wrong component in
home entertainment system. Exacerbated when one
control has multiple functions to save money.
 Mistakes are due to problems at higher levels of cognition (i.e.,
planning, and comparing results against expectations). They
are rule based (wrong rule applied despite correct diagnosis of
situation); knowledge based (lack knowledge to diagnose, e.g.,
cargo measured in pounds not kilos, or novel situation with no
set rule), or memory lapse based (e.g., distracted tech botches
diagnosis)
 Good design provides clear, readily accessible display of
system state.
 Memory lapse at higher cognitive level leads to mistakes, at
lower level to slips
 Providing “Undo” option at all stages is really important.
 Constraints can prevent slips, e.g., different coloration of
liquids to be added to reservoir and different container designs
so container spouts only fit one intake.
 Confirmation statements can help with slips, but with mistakes
the user is typically sure that they want to proceed, so error
may not be corrected.
 Checklists help. Have two people execute them.
 Detecting mistakes is hard, because it’s difficult to ascertain
goals. Likewise, memory lapse is hard to detect.
o Use 5 whys to analyze, with awareness that many errors have
multiple (and often nested) causes
 Krug, Don’t Make Me Think: things that diminish goodwill
o Hiding key info like customer service phone number or shipping costs
o Insisting that info conforms to site’s format preference (e.g., dashes in
SS number)
o Asking for unnecessary information
o Sizzle in the way of steak (time consuming Flash prelude)
o Amateurish looking site
 Weinschenk = W: 100 Things Designers Need to Know About People; Krug =
K: Don’t Make Me Think

Layout
o Group similar items together, and use white space to create
patterns: our brains scan for patterns -W
o People respond strongly to images of faces-W
o People who use our alphabet read web page top left to bottom right,
so locate key info on top or in middle of page-W
o Shading on button creates perceived affordance-W
Color
o Blue and red don’t go together – hot and cold colors-W
o 9% of men are color blind-W
o Color meaning varies by culture, e.g., white is color of death for
some, purity for others-W
Text/Reading
o Titles and headlines are CRITICAL-W
o Use all caps – which is harder to read -- sparingly, to get attention-W
o Notwithstanding strong opinions, research shows no difference in
reading comprehension between serif and sans serif fonts. Use them
to evoke mood etc. -W
o Some newer fonts, e.g., Tahoma and Verdana, are designed with
large “x” (i.e., letter body) heights for better screen readability-W
o Especially when reading on a screen, contrast matters: best is black
text on white background-W
o Break text into chunks with bullets, short paragraphs, pictures, etc. -
W
o People read faster with long line length but prefer shorter line
length-W
o Omit needless words, especially happy talk. Eliminate instructions
by making things so simple they aren’t necessary -K
Navigation
 Progressive disclosure: don’t overwhelm with info on first page; let user
choose level of detail that meets their needs. Don’t worry about the fact that
this requires multiple clicks—if clicks make sense, the user will find
navigation easy. Crucial, however, to know what most users want at each
step/down each path –W
 Make targets large enough and not too far away to avoid “motor load”
-W
 People categorize info, so do it for them-W
 Provide progress indicators-W
 Make it easy to undo last action and entire last sequence-W
 What gets attention: faces, things that move (e.g., videos), loud noises-W
 Some users are “search dominant” and always start navigation with a
search; others are “link dominant” and will start by browsing, via
hierarchy –K
o Back button accounts for 30-40% of all web clicks
o Conventions for web nav include sections (with a “you are here”
bold/arrow/highlighted/color-change indicator always evident)
and 4-5 utilities (e.g., shopping cart, site map, checkout, help,
archives, customer service, etc.) in top bar; local (current level)
nav on left side (left or right adjusted – not centered); “about us”
etc. at bottom
o Persistent nav or “global” nav appears on every page (except
home page and some pages for forms), and should always include
logo/site ID (also linked to home page), home link, search box
(with the word “Search”!), sections/primary nav (ideally as tabs
with contrast coloring—different for each section--for active tab;
tabs are self evident, obvious, and suggest physical space) and
utilities
o Think through nav carefully 3-4 levels below home page
o Every page needs a PROMINENT name (e.g., “Auctions” section
and “Sell an Item” subsection, the latter a link in the former top-level
tab)
o Breadcrumbs (in addition to but below section nav) show current
location: You Are Here: Home > Hobbies > Book Collecting >
Welcome (each hyperlinked)
o Home page should accommodate: site identity and 6-8 word clear
and informative/lively tagline or mission statement; how to get
started via 1)site hierarchy, 2) search and 3) best stuff =
teases/timely content/deals; registration
o Pulldowns are tempting as a way to save space, but they reduce
serendipity in discovery and they are “twitchy” because they
disappear easily. Pulldowns are best for alpha lists of known
names, e.g., states
Action/motivation/understanding
 Change beliefs by getting people to make a small commitment-even one
that is forced-W
 People process info best as stories, especially of the journey form
(obstacle overcome); they’ll embrace causation assertions through story-
W
 People learn best from examples; photos and videos are great ways to
deliver examples-W
 Sustained attention lasts about 10 minutes (hence, length of Lynda.com
videos)-W
 With repeated action, there’s a tradeoff with making things too easy:
people will get bored and careless-W
 People are most motivated when close to reaching goal; motivation
plummets/attrition risk rises after reaching goal-W
 Variable/unpredictable rewards (rather than fixed schedule) are
powerful, because unpredictability keeps people searching –especially
when info comes in small doses (e.g., 140 character tweets)-W
 People are motivated by intrinsic rewards, social connection
opportunities, and they value progress, challenge/mastery (so, include
scores, leader boards, progress indicators, etc.) and control (so,
emphasize autonomy when you want them to self service) -W
 People are more motivated to compete when there are fewer
competitors-W
 People let others decide when they are uncertain, so use testimonials-W
 Impulse to imitate is strong, so show videos of people doing action you
want to encourage-W
 Place critical relevant information in close proximity to a call to action

Krug, Don’t Make Me Think = #1 usability rule!!
 For example, bad naming makes us think, e.g., clever names, marketing-
induced names, technical names, company-specific names, etc. (e.g., do you
put prepaid business reply card in box labeled “Stamped Mail” or “Metered
Mail”?)
 Also: Where am I? Where should I start?
 People scan/skim, so design like it’s a billboard
o Create clear visual hierarchy via prominence (larger, bolder) given
to items; grouping similar items; nesting
o Use conventions (and note that designers often resist them)
o Break pages into defined areas
o Make what’s clickable obvious
o Minimize noise/busy-ness

Style Guides (Gothelf, Lean UX, pp. 41-51)
• Codifies interactive, visual and copy elements of a UI/software system
o For example, headers, footers, drop-down menus, grids, forms, button
logic, colors, column sizes, backgrounds, separators, etc.
o Guide specifies what elements look like, where they are placed, and
when/how they should be used
o aka pattern libraries
• Saves effort and ensures consistent experience
TECHNOLOGY BASICS

Model-View-Controller
• Model: Where’s the data?
– Changes the view
• View: Where’s the UI?
– Browser, mobile app, etc.
• Controller: Where does computation happen?
– Updates the data via algorithms etc.
• MVC goals: M = smart; V = dumb; C = thin
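The split above can be sketched in a few lines of code. This is an illustrative toy, not any specific framework: the class and method names are assumptions, chosen to show a smart model (owns data, notifies observers), a dumb view (only renders), and a thin controller (only translates input).

```python
# Minimal MVC sketch (illustrative names, not a specific framework).

class Model:
    """Smart: owns the data and notifies subscribed views when it changes."""
    def __init__(self):
        self._items = []
        self._observers = []

    def subscribe(self, view):
        self._observers.append(view)

    def add_item(self, item):
        self._items.append(item)
        for view in self._observers:   # a model change drives the view
            view.render(self._items)

class View:
    """Dumb: renders whatever state it is handed, holds no business logic."""
    def __init__(self):
        self.last_rendered = None

    def render(self, items):
        self.last_rendered = list(items)
        print("TODO list:", ", ".join(items))

class Controller:
    """Thin: translates user input into model updates, nothing more."""
    def __init__(self, model):
        self.model = model

    def handle_input(self, text):
        self.model.add_item(text.strip())

model, view = Model(), View()
model.subscribe(view)
Controller(model).handle_input("  write the PRD ")
# view.last_rendered is now ["write the PRD"]
```

Note the dependency direction: the view never touches the controller, and the model knows nothing about either class's internals — which is what lets you swap the view (browser, mobile app) without rewriting the model.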

Life of a web request


 Find server > make connection > show page
 DNS = Internet’s “phone book”
 Complications
o If you stay connected
o Need instant response
o Data changes in background
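The "find server > make connection > show page" sequence can be traced with Python's standard library. To keep the sketch self-contained (no real network needed), a throwaway local HTTP server stands in for the remote web server; the page contents and hostname are illustrative.

```python
# Sketch of "find server > make connection > show page", entirely local:
# a throwaway HTTP server stands in for the remote web server.
import socket
import threading
from http.server import HTTPServer, BaseHTTPRequestHandler

class Page(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"<html>hello</html>"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), Page)  # port 0 = pick any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# 1) Find server: DNS maps a name to an IP address (the "phone book")
ip = socket.gethostbyname("localhost")

# 2) Make connection + 3) show page: open a TCP socket and speak HTTP
with socket.create_connection((ip, server.server_port)) as conn:
    conn.sendall(b"GET / HTTP/1.1\r\nHost: localhost\r\nConnection: close\r\n\r\n")
    response = b""
    while chunk := conn.recv(4096):
        response += chunk

server.shutdown()
print(response.split(b"\r\n")[0])  # the status line, e.g. a 200 OK
```

The "complications" bullets above are exactly what this sketch omits: a persistent connection would keep the socket open, and background data changes would require the server to push updates rather than wait for the next GET.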

Version control system


 Check out a file from a central repository, modify it locally, then
check it back in
 Addresses what happens when two developers check out the same file:
the second developer to check back in is shown what the first changed.
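The check-out/check-in conflict idea can be sketched as follows. This is a toy model under stated assumptions — a single file, whole-file revisions, no merging; real systems (Git, Subversion) track diffs and support merges, and the class names here are invented for illustration.

```python
# Toy sketch of centralized version control. When two developers check out
# the same file, the second one to check in is shown what the first did,
# instead of silently overwriting it. Real VCSs (Git, Subversion) are far
# more sophisticated; this shows only the core conflict-detection idea.

class ConflictError(Exception):
    pass

class Repository:
    def __init__(self, contents: str):
        self.contents = contents
        self.revision = 1

    def check_out(self):
        """Return a working copy plus the revision it was based on."""
        return self.contents, self.revision

    def check_in(self, new_contents: str, base_revision: int):
        if base_revision != self.revision:
            # Someone checked in after our checkout: surface their change.
            raise ConflictError(f"file changed since r{base_revision}; "
                                f"current (r{self.revision}): {self.contents!r}")
        self.contents = new_contents
        self.revision += 1

repo = Repository("v1")
copy_a, rev_a = repo.check_out()     # developer A checks out
copy_b, rev_b = repo.check_out()     # developer B checks out the same file
repo.check_in("A's edit", rev_a)     # A checks in first: accepted
try:
    repo.check_in("B's edit", rev_b)  # B's base revision is now stale
except ConflictError as e:
    print("conflict:", e)            # B sees A's change before resolving
```

The design choice worth noting is optimistic concurrency: neither developer locks the file; conflicts are detected at check-in time by comparing base revisions.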
PRD/DRAFTING SPECIFICATIONS

PRDs: What and Why?


• What: PRD explains what you are building and why you are building it
– Structured around user needs and requirements for a solution to
meet them
• Why
– Forces PM to think through issues and formulate language to describe
product to others
– Surfaces team disagreements early; builds consensus
– Provides others with overview of product

Feature Prioritization: What’s in V1.0?


 Launching early tradeoffs
– Early learning allows iteration
– Incomplete or buggy product can alienate customers

PM101 PRD Outline


• Vision
• Background/Motivation
• User Walkthrough
– Verbal Use Case
– Visual Use Case
– Cooper ch 11 defines a “scenario” as a persona using a system to
achieve a goal. He distinguishes between daily use scenarios;
necessary use scenarios (all actions that must be possible, even if
infrequently used); and edge case scenarios, very rarely encountered.
He advises ignoring edge cases during design.
• Detailed Design/Feature Description
– Features/info architecture
– v1 MVP; v Next; v Long Term
• Roadmap/Timing
• Scenarios for service intro
• Metrics
• International
• Projected Cost
• Operational Needs
• Addressing Caveats/Risks
• Other Areas for Investigation
• Group Members
WORKING WITH DEVELOPERS

Cooper, 89 + ch 7: Engineers think they can learn to design. They can’t. They are too
steeped in the deterministic world of silicon.
 Jetway test: do you turn left into cockpit, to be IN CONTROL and tackle the
challenge of figuring out all the dials and buttons (i.e., a highly complex,
deterministic system), or right into cabin to SIMPLIFY and put others in
control? Programmers turn left!
 Arrogance: I didn’t answer wrong, you asked the wrong question. Leads them
to take even specific designs as suggestions.
 Humans, while complex, are too messy and non-deterministic
 Homo logicus wants control, and complexity is the price they’ll pay for it.
The result, when programmers do design, is feature bloat, like Office.
 Homo logicus wants to understand system, and will forfeit success (valued by
homo sapiens) to gain understanding (break a clock by taking it apart to see
how it works). They design programs so that interaction follows internal
functioning (e.g., exposing user unnecessarily to distinction between hard
disk and RAM storage)
 Homo logicus obsessed with possible edge cases (however remote) and will
program in order to address them (adding to cost/UI complexity). Homo
sapiens will cope with remote risk.
 Prone to mental bullying (like the physical domination of jocks)
Cooper, ch 8: programming culture
 Code reuse is valued – saves time and guarantees bug-free code – but UI can
suffer if functionality is force-fit
 reverence for technical skill (and the fact that design doesn’t require as much
technical skill) leads programmers and managers who were programmers to
give it prominence in the product development process, so UI comes later if at
all
 LOVE difficult tasks: this is the prime motivator. The harder the better.
 Rarely have contact with users, and when they do, the best programmers
usually interact only with power users, skewing their view of “average”
 Work alone – programming takes uninterrupted single-minded focus. It can
take longer to check someone’s code than it took to write it, so programmers
have lots of skin in the game – functioning product depends on them alone.
“lonely work of programmer gives him a sense of power” and makes him
uncomfortable delegating – including delegating design.
 View UI design as vague opinion; programmer’s opinion is at least as valid in
his own mind

Lopp Being Geek


 Geeks build stuff – all the time
 114 the initial joy of a game for the geek is discovery… I want to see the
engine that governs this particular universe; I want to see its edges.., the
actual discovery of how to win is a buzzkill. The thrill, the adrenaline, comes
from the discovery, hunt, and eventual mastery of the unknown
 167 sees the world as a system that, given enough time and effort, is
completely knowable
 167 control issues: sensitive to drastic changes… system-defining events
force your nerd to recognize that the world is not always nor entirely a
knowable place
 168 joy in problem solving and discovery; adrenaline rush as each part of
project is completed
 170 hates small talk. “I have no system for understanding this messy person
in front of me.”

Lopp Managing Humans
AGILE VS. WATERFALL

Key Attributes of Waterfall Software Development Processes

Key Attributes of Agile Software Development Processes


• What: Agile processes allow a team to quickly adapt to changing
requirements by relying on rapid iteration, which is achieved through a
series of short but complete software development cycles.
– Agile has several variations which all have similar elements, e.g.,
scrum, XP (“Extreme Programming”).
– The core values of agile development are summarized in the Agile
Manifesto, drafted by 17 developers who met in 2001 to discuss
alternatives to documentation-heavy software development
processes. The values are
• Individuals and interactions over processes and tools
• Working software over comprehensive documentation
• Customer collaboration over contract negotiation
• Responding to change over following a plan
• Why: With short cycles, a team secures feedback sooner, avoiding the waste
of building functionality that turns out to not be required.
– Also, compared to waterfall processes with longer cycles, agile’s short
cycles imply that smaller amounts of new code are released with each
cycle, making it easier to find and fix bugs.
• Key Elements
– A small, cross-functional team (usually with fewer than 10 members),
always including a product owner (who often is a PM), is usually co-
located to facilitate lots of face-to-face communication and rapid
decisions.
• In scrum variations of agile, a scrum master is responsible for
removing impediments to team productivity and ensuring that
the team adheres to scrum process.
– The product owner works with the team to define and prioritize tasks,
called user stories (see below).
– At the start of a “time-boxed” cycle— a period of fixed duration
usually ranging from one week to one month — a team decides which
stories in their backlog should be completed during the cycle.
• Cycles are called sprints in the scrum variation of agile.
• A prioritization process often assigns points to each story,
with each point reflecting a fixed unit of expected effort
(e.g., one programmer-day).
• The point total for a cycle’s stories must not exceed the team’s
available capacity, which is calculated based on its velocity, i.e.,
the trend in total points completed per cycle.
– Story content is typically frozen during cycle.
– Stories that are not completed during a cycle are sent back to the
backlog — a time-boxed cycle is never extended.
– Agile development cycles are sometimes “feature-boxed,” i.e., they
continue until prioritized stories are complete. However, as noted by
developer Kevin Bedell, time-boxing forces sharper prioritization
decisions.
– Teams start each day with a stand-up meeting of roughly 15 minutes,
during which each member summarizes yesterday’s progress, today’s
priorities, and any expected obstacles.
– Status is publicly displayed on a board (often referred to as a
“kanban” board, adapting practices from just-in-time manufacturing)
on a wall of the team’s shared workspace. The board typically
organizes stories into sections reflecting story status, for example:
backlog, ready, coding, testing, approval, done. Stories are captured on
post-its or note cards, which often are color-coded by story type or
team member, as shown below:

Source: Ketil Jensen, Kanban: The Next Step in Agile Evolution
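The points, velocity, and capacity arithmetic described above can be sketched in a few lines. All story names, point values, and the three-sprint averaging window are hypothetical; teams vary in how they compute velocity and fill a sprint.

```python
# Sketch of sprint capacity planning from velocity, as described above.
# Story names, point values, and the averaging window are all hypothetical.

def velocity(points_per_past_cycle: list, window: int = 3) -> float:
    """Trend in points completed per cycle: average over the last few cycles."""
    recent = points_per_past_cycle[-window:]
    return sum(recent) / len(recent)

def plan_cycle(backlog: list, capacity: float):
    """Pull stories from the (already prioritized) backlog while capacity allows."""
    chosen, total = [], 0
    for story, points in backlog:
        if total + points <= capacity:
            chosen.append(story)
            total += points
    return chosen, total

past = [18, 22, 20]                   # points completed in the prior sprints
cap = velocity(past)                  # team capacity for the next sprint
backlog = [("checkout flow", 8), ("expired-card prompt", 5),
           ("order history", 8), ("email receipts", 3)]
stories, committed = plan_cycle(backlog, cap)
print(cap, stories, committed)        # committed points never exceed capacity
```

One design choice to flag: this sketch greedily fills remaining capacity (skipping a story that doesn't fit and taking a smaller one below it); many teams instead stop at the first story that exceeds capacity, preserving strict priority order.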

• User Stories
– User stories describing specific tasks that end users need to complete
are written in everyday language by the team’s product owner in
consultation with the product team.
– While online tools are available to capture and share stories, they are
often hand-written on note cards to ensure brevity and to encourage
ongoing face-to-face dialogue between the product owner and
developers.
– Story format is often: Who? What? Why? Or equivalently, “As a <role
or persona>, I want <goal>, so that <benefit>.”
• Example: “As a user closing the app, I want to be prompted to
save anything that’s changed since the last save, so I can
preserve useful work and discard work that’s not useful.”
– The product owner must specify acceptance criteria — conditions
that must be met or quality assurance tests that must be passed in
order to declare a story “done.” The criteria are often listed on the
back of the story card, as shown below:

Source: Scott Ambler, User Stories: An Agile Introduction

– Big stories, called epics (e.g., “user pays upon checkout”), are broken
into smaller stories (e.g., “user whose credit card has expired is
prompted for another card”)
• Just-in-time decomposition of epics at a cycle’s start (rather
than earlier in the development process) can avoid waste,
since a team will have better knowledge of priorities then
along with better understanding of the effort likely to be
required to complete stories.
• To prioritize tasks, the team must be able to estimate effort
required; decomposition helps with this.
• Interdependencies between stories complicate planning;
decomposition can lead to interdependencies.
– In waterfall planning, a use case is a “generalized description of set of
interactions between system and actors.” Compared to an agile user
story, a waterfall use case is typically:
• Larger in scope; stories must be granular to facilitate agile’s
incremental planning
• More detailed, especially with respect to user interface design
• More permanent; stories are often discarded, revised, and
decomposed

Conditions Under Which Agile Offers Advantages and Limitations


• Key Ways in Which Agile Differs from Waterfall Software Development
– Agile employs shorter development cycles
– Agile is time-boxed rather than feature-boxed
– Agile relies on incremental, ongoing planning rather than upfront
planning
– Agile emphasis is on oral rather than written communication
– Agile relies on a pull vs. push approach. According to Kent Beck,
Extreme Programming Explained, p. 88:
• “The ‘push’ [i.e., waterfall] model of development is to pile up
requirements to be done, then designs to be implemented, then
code to be integrated and tested; culminating in an aptly-
named big bang integration… [By contrast, user s]tories are
specified in detail immediately before they are implemented.
The tests are pulled from the specifications. The programming
interface is designed to match the needs of the test. The code is
written to match the tests and the interface. The interface is
refined to match the needs of the code as written. This leads to
a decision about which story to specify next. In the meantime,
the rest of the stories remain on the wall until chosen for
implementation.”
• The table below summarizes pros and cons of waterfall and agile methods:

Waterfall Pros:
 Avoids piecemeal design
 Can work on modules in parallel
 Can see full output at end of each phase
 Can measure progress against full plan
 Full-time product owner not needed
Waterfall Cons:
 It is difficult to specify all requirements at outset
 Not flexible if requirements change
 Problems not discovered until phase is complete
Agile Pros:
 Ongoing input from product owner
 Product owner buy-in via direct participation
 Flexible if requirements change
 Find/fix bugs faster
 Faster time to market with MVP
 Time-boxing yields cost predictability
Agile Cons:
 Works best with small team and co-location
 Requires full-time product owner
 Flexibility can lead to feature proliferation
 Incremental approach can lead to integration problems
 Incremental approach can lead to a product that lacks vision

PROJECT MANAGEMENT

Cite Nordlander as source for what follows.

Tracking Technologies
• The best approach is usually whatever your team prefers. Any technology
(e.g., whiteboard + post-its; spreadsheets; tracking software like Asana or
Pivotal Tracker) can work if it is used in a disciplined way.
• Spreadsheets are simple, customizable, and easy to share
– Key columns on the sheet might include: item, owner, due date,
priority, time budgeted for completion, status, dependencies, cost, etc.
• Bug tracking software (e.g., JIRA), which includes prioritization functionality,
can be adapted for use in project management. This approach treats all tasks,
including bugs, as “features,” and prioritizes them in a single stream.
• Project tracking software includes Asana, Pivotal Tracker, Sprint.ly,
Basecamp
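As an illustration of the spreadsheet approach, here is a minimal sketch that writes a tracking sheet as CSV using Python's standard library. The tasks, owners, dates, and column names are all hypothetical, not a prescribed schema.

```python
import csv
import io

# Hypothetical tracking rows using the kinds of columns suggested above
# (item, owner, due date, priority, time budget, status, dependencies).
tasks = [
    {"item": "Signup form validation", "owner": "Ana", "due": "03-04",
     "priority": "P1", "budget_hrs": 4, "status": "in progress",
     "depends_on": ""},
    {"item": "Password reset email", "owner": "Ben", "due": "03-06",
     "priority": "P2", "budget_hrs": 3, "status": "not started",
     "depends_on": "Signup form validation"},
]

# Write to an in-memory buffer; in practice this would be a shared file.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(tasks[0]))
writer.writeheader()
writer.writerows(tasks)
print(buf.getvalue())
```

The same columns map directly onto cards or tickets in tools like Asana or Pivotal Tracker; the discipline of filling them in matters more than the tool.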

Key Issues
• Clarity of goals
• Features should ideally be decomposed into tasks that: 1) can be completed
by one person; 2) can be completed in a half day or less; 3) have no/minimal
dependencies
• Some slack in schedule for experiments etc.
• Allowing enough time for integration of multiple streams
• Top management buy-in
• Distributed vs. co-located teams
• Meeting format
• Prioritization scheme
• When is a feature “done”?

Team Roles
 A dedicated project manager is typically found only on very large projects
– Like a product manager, removes barriers and coordinates team effort
– Unlike a product manager, does NOT set vision or overall goals
 Product manager
 Tech lead: most senior engineer, who’s responsible for final technical decisions.
Typically does some project and product management
 People manager

Email Management
 Read emails immediately
 Use automated filters to move and label emails
 Take action, then archive/move the email immediately
 Emails over 2 weeks old should be discarded
 Be able to quickly scan your inbox

 Get good with keyboard shortcuts

Meeting Management
 Avoid status update meetings
 Agenda out beforehand
o Update action items in the notes before meeting
o Email action item owners in CC
 <= 8 people
 Decision maker identified
 Separate note taker from person running the meeting
 Display the notes as the meeting is happening – everyone should agree to the
action items being assigned
 25 or 50 minutes

USABILITY TESTING

Purposes
 Can get respondents’ feedback on rivals’ products or on prototypes of early
solutions, helping to validate demand -- in particular, whether your concept
is easy to understand
o Ask with sketch/prototype, “What do you think this is for?” (NOT “Do
you like this?”); “Does this screen make sense?”; “What would you
click on first?”; “What would you expect to see when you click on
that?”
 Product usability is the key focus
 NOT for (Klein, p. 41):
o Finding out if people like the product or will use it
o Finding out how to fix usability problems

Types of usability issues to explore – anything that requires multiple steps
(Klein, p. 30)
• Multi-step signup flows
• Purchase flows
• Searching/browsing experience
• Sharing experience
• File upload/edits
• Installation process
• Etc.

Session Management
• Do it early!!
• Recruiting representative users is overrated. A loose fit is okay (Krug,
Don’t Make Me Think), since most usability issues don’t relate to domain
knowledge
• 3-5 subjects is usually enough to spot most big problems
• Do them all in one morning then debrief
• Use scripted intro (“Testing site, not you. Don’t worry about hurting our
feelings. I’ll ask you to think out loud as you complete the tasks I give you –
that’ll help us. Ask any questions you have: I may take a minute to answer,
since we want to learn what people do when they don’t have someone
nearby who’s familiar with the site. We’ll record, but we won’t share beyond
internal team – is that okay?”)
• Give a specific task (e.g., “Find a cookbook for under $14”)
• Get subject to think out loud when completing tasks
• Don’t coach, don’t give a guided tour of product
• Be willing to let the user fail
• Ask open ended and neutral/non-leading questions
o NOT “Was that easy to use?” or “Do you think that was cool?” RATHER
“How’d that go?” or “What do you think?”

• Use a screen recorder
• Review/debrief right away
o Typical problems: unclear on overall concept; seeking words that
aren’t there; too much going on
o Triage Guidelines: Ignore “kayak” problems, i.e., user gets stuck then
quickly figures things out (like kayak rolling over); be wary of feature
requests; resist impulse to add things
• Do testing once a month!

Tools
• UserTesting.com

LAUNCH MARKETING

POST-LAUNCH ANALYTICS

Conversion Funnel Analysis/Cohort Analysis
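This section is still a stub in the draft. As a minimal sketch of funnel analysis, the Python below computes step-to-step and overall (top-of-funnel) conversion rates for a hypothetical signup funnel; all step names and counts are made up.

```python
# Hypothetical funnel: unique users reaching each step, in order.
funnel = [
    ("visited landing page", 10000),
    ("started signup", 2500),
    ("completed signup", 1500),
    ("activated (first key action)", 600),
]

prev = funnel[0][1]
for step, count in funnel:
    step_rate = count / prev            # conversion from the previous step
    overall = count / funnel[0][1]      # conversion from the top of the funnel
    print(f"{step:32s} {count:6d}  step: {step_rate:6.1%}  overall: {overall:6.1%}")
    prev = count
```

The step with the sharpest drop-off (here, landing page to signup start) is usually the first place to investigate; cohort analysis repeats this computation per signup cohort to see whether rates improve over time.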

Net Promoter Score
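For reference, NPS is the percentage of promoters (scores of 9–10 on the 0–10 “How likely are you to recommend us?” question) minus the percentage of detractors (scores of 0–6); passives (7–8) are excluded. A minimal sketch with made-up survey scores:

```python
# Hypothetical responses to the 0-10 recommendation question.
scores = [10, 9, 9, 8, 7, 6, 10, 3, 9, 8, 10, 5]

promoters = sum(1 for s in scores if s >= 9)    # 9 or 10
detractors = sum(1 for s in scores if s <= 6)   # 0 through 6
nps = 100 * (promoters - detractors) / len(scores)
print(f"NPS = {nps:.0f}")  # ranges from -100 to +100; here: NPS = 25
```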

Customer Service Feedback

QA BASICS

SQL BASICS
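This section is a placeholder in the current draft. As a minimal sketch of the kind of query a PM might run, the example below uses Python's built-in sqlite3 module against a hypothetical events table; the table schema and rows are invented for illustration.

```python
import sqlite3

# In-memory database with a hypothetical event log: one row per user action.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, action TEXT)")
conn.executemany("INSERT INTO events VALUES (?, ?)", [
    (1, "signup"), (1, "purchase"),
    (2, "signup"),
    (3, "signup"), (3, "purchase"),
])

# Count distinct users per action: the bread-and-butter
# SELECT / GROUP BY / ORDER BY pattern.
rows = conn.execute("""
    SELECT action, COUNT(DISTINCT user_id) AS users
    FROM events
    GROUP BY action
    ORDER BY users DESC
""").fetchall()
print(rows)  # [('signup', 3), ('purchase', 2)]
conn.close()
```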

A/B TESTING/QUANTITATIVE EXPERIMENT DESIGN

A/B Testing Is Useful For (Klein, p. 171)


 Getting statistically significant data on whether a new feature improves key
metrics (e.g., retention, engagement, free-to-premium conversion rates)
 Selecting between alternate designs

Pitfalls With Testing (Klein, p. 181)


 Failure to use a sample large enough to achieve statistical significance
 Failure to consider long-term impacts (e.g., sales rise sharply during a
promotion, but then they fall back – perhaps below normal for a while)
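One common way to check whether an observed difference between variants is statistically significant is a two-proportion z-test. The sketch below uses only Python's standard library; the visitor and conversion counts are made up.

```python
from math import sqrt, erf

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that A and B convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Hypothetical test: 200/4000 conversions on A vs. 260/4000 on B.
p = two_proportion_p_value(conv_a=200, n_a=4000, conv_b=260, n_b=4000)
print(f"p-value: {p:.4f}")  # below 0.05, so unlikely to be chance alone
```

A p-value below the conventional 0.05 threshold suggests the difference is real, though it says nothing about the long-term effects flagged above; that requires tracking the metric well after the test ends.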

ADWORDS BASICS

