Technical Interview Questions Flashcards

1
Q

How do you decide who are appropriate participants in your research?

A

Starting point

  • Population(s) - who am I trying to represent (business, UX goals)?
  • Research goals - am I testing a hypothesis? Describing a population or measuring a population parameter? Surfacing new issues/requirements?

Factors I’d fine-tune

  • Required characteristics (depends)
  • Desired characteristics (depends)
  • Demographic distribution (even unless otherwise)
  • Participant suitability (e.g., diary studies - screen against thoughtfulness, clarity)
2
Q

What are common mistakes that people make when moderating?

A

Moderating what? This can vary between summative usability, formative usability, interviews, generative research exercises, etc.

Common mistakes (with remedies):

  • Losing focus on observation
    -> Get notetakers, use transcripts, stay mindful, modulate
  • Asking leading questions or introducing other types of bias
  • Not determining the right moderation protocol for the method
3
Q

Can you tell me about a particularly challenging bit of qualitative data analysis you’ve done in the past?

A

Latin America home visits

  • 2 countries
  • 16 2-hr sessions
  • 12 observers
  • Simultaneous translators

Understanding behaviors/needs with respect to a specific type of service, complicated by a language barrier

  1. Collect all artifacts (strict data collection process)
    - Observation post-its strictly collected between sessions
    - Heavy reliance on transcripts, catalogued artifacts (e.g., photos)
  2. Full team involved in early analysis
    - Debriefs on the road
    - Every 2 days - affinity diagramming with stakeholders
    - Assign stakeholders particular participants (for role playing exercises later on)
  3. Synthesize - focus in on what matters to build model
    - My own observations, stakeholders’ observations, session records
    - Team synthesis begins the reduction process
    - Further reduce as the researcher to develop an overarching model
4
Q

When designing survey questions with response options that range from “Strongly disagree” to “Strongly agree”, how many response options are ideal? (More importantly, Why?)

A

Typically, I use the rule of thumb of a 7-pt scale for bipolar items and a 5-pt scale for unipolar items.

Factors that matter when choosing:

  • Unipolar or bipolar heuristic
  • Number of items total (>10? 5-pt)
  • Cultural preference
  • Already a benchmark? DON’T CHANGE

Techniques to help test

  • Interpolation
  • Cognitive pretesting
5
Q

Please describe a recent discovery research project and explain how you went from having lots of data to final deliverables. How did you approach the analysis and how did you translate your findings into something that would help the team?

A

TK

6
Q

Pretend that a team wants feedback on some designs, but they don’t have time for any usability testing. What kind of feedback can you give the PM when he comes to you for help?

A

Analytical usability techniques

  • Heuristic evaluations
  • Cognitive walkthroughs
7
Q

Name 3 well-designed products

A
  • Headspace
  • Peloton
  • Evernote
8
Q

Name 3 poorly designed products

A
  • Tinder
  • HBO GO

9
Q

A PM dismisses your 6-person study. What do you say?

A

Not all information that’s meaningful can easily be measured.

Qual - models, stories, experiences, pain points
Quant - engagement, adoption, retention, break-off points

Short-term responses

1) Do we have user-centered measures we can refer to instead of business-centered measures (from log data)? These give indirect measures of the user experience.
2) Where different methods/data sources fail to corroborate each other, treat that as a path for further inquiry.
- Is there a gap between what people say and what they do?
- New feature example: usability sessions and log analysis contradict each other; we’re running wide-scale surveys and crowdsourcing to figure out why.

Organizational culture (long-term)

1) Thick data - ability to inspire, ideate, identify new opportunities
2) If possible, bring them along to sessions, involve them in group analysis.
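The small sample can also be defended quantitatively. A minimal sketch of the cumulative problem-discovery model commonly attributed to Nielsen and Landauer, assuming a per-participant detection probability of p ≈ 0.31 (a figure often cited in the usability literature; it is an assumption here, not part of this card):

```python
# Expected share of usability problems uncovered after n participants,
# under the problem-discovery model P(n) = 1 - (1 - p)^n.
# p is the assumed probability that any single participant reveals a given problem.
def discovery_rate(n: int, p: float = 0.31) -> float:
    return 1 - (1 - p) ** n

for n in (1, 3, 6, 15):
    # With p = 0.31, six participants are expected to surface roughly 89%
    # of discoverable problems.
    print(f"n={n:2d}: {discovery_rate(n):.0%} of problems expected found")
```

Under these assumptions, a 6-person study is expected to surface most discoverable usability problems, which is one concrete answer to "why only six people?" (with the caveat that p varies by product and task complexity).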
