Product execution mock interview: user engagement (w/ Microsoft PM)
- Added July 4, 2024
- Here's a product management mock case interview featuring a Microsoft PM focusing on a product execution question.
This full PM mock interview features Ankur Biswas, Microsoft PM, as the candidate answering a product execution question, specifically investigating why Instagram Story user engagement is down. Our ex-Googler founder, Kenton Kivestu, plays the interviewer. Ankur is part of The Product Folks, an awesome community of product enthusiasts that puts on a lot of great events: www.theproductfolks.com/.
If you're preparing for Facebook product management interviews, this video is a must-watch. Also, for anyone preparing for other PM roles, execution and analytics questions about how to investigate a drop (or spike) are a common topic - this interview will give you insight into what these questions are like and what a good response looks like.
Interviewing soon? RocketBlocks has the best concepts, drills, and coaching to get you more offers: www.rocketblocks.me/product-m...
Looking to become familiar with the PM role, the interview process, and how to start preparing? Our free product management guide covers it all: www.rocketblocks.me/guide/pm/...
Check out our intro video to understand what PM is: • What is product manage...
Here's what to expect in PM interviews: • Introduction to produc...
#PMinterviews #productexecution #productanalytics #facebookPM #RocketBlocks
One of the best structures and frameworks. Ankur did a really great job; I would hire him. One thing that I would have considered was dissecting the engagement by source, meaning engagement could come either from 1) users directly opening the app whenever they get the time and 2) engagement coming from push notifications (PN). This would have helped identify the root cause faster.
Kenton and Ankur, I tried to think of the plan before I heard Ankur's answer after learning the three changes. My thought process was a funnel-based check:
1) Content creation: the Messenger team's story creation --- check the number of stories uploaded
2) Content discovery: the notification changes
3) Content recommendation: the story bar.
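The funnel check above can be sketched as a simple week-over-week comparison per stage. A minimal illustration in Python - the metric names, numbers, and the 5% alert threshold are all hypothetical, not real Instagram data:

```python
# Hypothetical week-over-week funnel check for the three changes.
# All metric names and numbers are illustrative only.
last_week = {
    "stories_uploaded": 1_000_000,    # content creation (Messenger team change)
    "notification_opens": 400_000,    # content discovery (notification batching)
    "story_bar_taps": 2_500_000,      # content recommendation (story bar change)
}
this_week = {
    "stories_uploaded": 980_000,
    "notification_opens": 250_000,
    "story_bar_taps": 2_450_000,
}

ALERT_THRESHOLD = -0.05  # flag any stage that dropped more than 5% WoW

for stage, baseline in last_week.items():
    delta = (this_week[stage] - baseline) / baseline
    flag = "  <-- investigate" if delta < ALERT_THRESHOLD else ""
    print(f"{stage}: {delta:+.1%}{flag}")
```

With these made-up numbers, only the notification stage crosses the threshold, which localizes the drop to one of the three changes instead of debating all of them at once.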
Watched this right before my product sense interview and took a lot of inspiration from it! Thanks and keep up the great work.
Hats off to Ankur for his thorough analysis and breakdown.
Great video and great exercise. Two comments:
1. Facebook doesn't care about NPS/CSAT.
2. I'd also suggest diving deeper into revenue impact given stories are an individual placement for advertisers.
Something I was expecting to hear about with experiments 2 & 3: if a user's time is finite, you're asking them to view stories in a single long session rather than many smaller ones throughout the day, which they may not want to do, or can't do. Alternatively, the stories from the batched notifications are competing against each other for the same piece of limited time, resulting in lower engagement.
His upspeak burned a hole in my brain
Overall good structuring, and he explained the thought process or reasons before drawing any conclusions.
An amazing mock interview. The overall framework was great. I really enjoy all your videos. Very insightful.
Yes, Ankur did a great job here!
1:18 Stories engagement is down: how would you investigate this issue?
2:08 2:49 metrics
3:09 the specific metric?
3:50 the specific time period?
5:04 the specific location?
5:50 the specific device/os?
7:11
8:14
9:11
9:43
10:20
10:38
11:30
11:53
13:10
13:30
13:46 experience of using the stories tool that's been changed
14:04
14:40 explanation on the story bar mechanics
15:46 15:58
This was amazing. Thanks!
Excellent job by Ankur! Thanks for a great video on this topic
Ideal way for an interview to run. However, it will require the interviewer to be close to the product, and the question to not be hypothetical, for such an exchange to happen. That aside, a great example of peeling back the layers and letting reasons pave the way through the investigation.
OK, I have watched many of these mock interviews and this is by far the best I have seen. If the interviewee really didn't get the question ahead of time, I tip my hat to him. Very solid thought process. If it was indeed scripted, it's still good, as it was very illustrative.
Agreed, Ankur really did a solid job here + great enthusiasm throughout!
I for sure believe it's scripted, but it's a good one nevertheless.
@@doublerebel6783 Definitely not scripted...
This was very useful! Loved the analysis framework!
thanks, glad you enjoyed it! more good stuff coming soon :)
This is a very good mock, but not excellent! However, this is probably a realistic interview with minimal behind-the-scenes preparation. Actually, you want to watch something like this: first, because it portrays the real reaction of a candidate, and second, because you learn how to pivot when things don't go perfectly. One good thing in this interview: the candidate pivoted well on potential issues impacting user experience. Some assumptions and comments throughout the interview were not necessarily the best choices, but again, if this is an on-the-spot interview, I think it is really good.
it is "on the fly" - he is not reading from a script :)
If overall engagement was down and the experiment was to bundle notifications, I would investigate that first, because it's quite obvious that batching those notifications would reduce the traffic they drive to stories. Not sure why it got so muddled up in the end.
Overall I thought it was excellent framing by Ankur in the interview. The initial framing and approach were perfect. However, I felt a little lost when it came to analyzing the variants in the test. For instance, I would typically approach something like this by identifying all the exhaustive factors and ruling them out one by one to narrow down to the specific scenario. With respect to the assumption of no external impact, stock markets are a great example: a lowering of consumer sentiment in the US can have a ripple effect on all major geos. In this case, it seems like there were underperforming variants in the A/B test. So the approach should have been to first identify the impact on the metrics for each variant and then hypothesize about the change in functionality which could have led to the decline in overall DAUs. Also, it wasn't clear if the A/B test was launched globally, in certain geos, or to certain segments (another hint on whether these were the contributing source). Overall great stuff.
I also think his framework/answer got a little messy. It seems like he is not answering the interviewer's question, and he kept asking questions that may be a little outside the framework. But a great interview in general.
Excellent analysis.
Great video, once again
Ankur is so thoughtful!!
I observed a few concerns:
1. Customer segmentation could have been taken further.
2. The assumption that a change specifically involving the stories feature is to blame is not necessarily always true.
3. We did not establish whether the previous week's number was itself within the baseline range.
4. The remaining conversation hinged on the notifications. But what % of stories' DAU and engagement is contributed by clicks on notifications? Unless that is established, it leaves a major variable unanswered. For example, if notifications contribute only 1% of DAU, the whole hypothesis fails.
5. The Messenger changes were not covered. IMO, if the interviewer has cited this, one should either cover it or ask to drop it with a valid assumption, confirmed by the interviewer.
The interview overall was well structured and well delivered.
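The point about notification contribution is really a bounding argument: a traffic source can explain at most its own share of total engagement, so if notifications drive only a small slice, no notification change can account for a large overall drop. A quick sanity check with hypothetical numbers (both figures below are made up for illustration):

```python
# Hypothetical bounding check: can a notification change explain the drop?
# Both numbers are illustrative, not real data.
total_drop_pct = 0.10       # overall story engagement is down 10%
notification_share = 0.01   # notifications drive 1% of story engagement

# Even if notification-driven engagement fell to zero, the maximum
# possible impact on overall engagement is bounded by its share.
max_explained = notification_share * 1.0  # assume a total loss of that source

print(f"Notifications can explain at most {max_explained:.1%} of engagement.")
if max_explained < total_drop_pct:
    print("Hypothesis fails: the notification change cannot explain the drop alone.")
else:
    print("Hypothesis is at least plausible; keep investigating notifications.")
```

With a 1% share against a 10% drop, the hypothesis fails immediately, which is exactly why establishing that share early would have saved most of the back-and-forth.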
I agree with #4
can you add me on linkedin? :)
Beautiful video made. Thanks!
Glad you enjoyed it!
Really detailed and excellent interview, though it was hard to keep up with all of the details.
Any good software engineer can do this kind of troubleshooting. I think the focus of the interview should be how to increase user engagement for a product. Scalability and onboarding new users are the more challenging jobs for a product manager.
Ankur is so good
Yep, he does a really solid job here
Don't know why there are any down-votes! Admittedly a few concerns could be covered more completely, but Ankur's structure is SO good overall.
Agree - very good for on the fly! No script!
Hi, we are unable to open the above-mentioned blogs.
weird - should be fixed now!
please ux interview
@14:37 why does Ankur have red nail polish on his pinky? :D
i have no idea and never noticed - good attention to detail tho?
Great overall interview, but I felt there were too many questions he was asking at every step, which felt rather distracting. I feel most PM interviewers would just say "it's up to you" and not really explain as you did.
This demonstrated a great scenario of actual PM "work" but not so much for an interview.
Symbian :D
Lol. true legends know
The interviewee was very unstructured!! Kenton, couldn't we use MECE here? I used MECE in my own solution and could diagnose more easily.
MECE is a great tool - no doubt about that. Ankur does in fact lay out his process as internal vs. external which is as MECE as it gets. He doesn't start by doing that, which I think in this case was OK since he did a lot of "discovery" around the issue. Since he was on point, his questions and assumptions are good, I think it works well for him.
You meant Mee See MECE framework right?
@@gauriapte673 Yes, referring to the MECE framework. More details here: www.rocketblocks.me/blog/the-mece-principle.php
It's unclear why the person believes that such a quick drop in engagement can only be due to an internal issue.
If you spend time running these massive apps, you'll find that most of the "kink in the curve" issues come from acute things like bugs, new features, bad experiments, etc. There are some external factors that certainly could cause a dramatic drop, and I do think the answer could have been strengthened by acknowledging that, but overall he still does a really solid job.
He did acknowledge external factors to an extent....
External issues, such as competitive forces etc, tend to show results over longer periods of time. It's unlikely that there would be a sharp gain or drop in something unless it was an internally generated issue.