Discussion about this post

Kurt Wood

This is excellent. Thanks for putting it together. Loads of insights to shamelessly steal (I mean, strategically summarise) for clients.

If you’re open to it, I had a few follow-up questions:

1. In my experience, one challenge with remote usability testing is that participants often engage more deeply than they would in an organic setting (demand characteristics, and all that).

Do you think your testers might have paid more attention to AIOs than typical users would in the wild, possibly making the CTR drop-offs here lower than what might happen naturally? Or does this roughly align with the larger-scale traffic patterns you’ve seen?

2. How much variance did you notice in scroll depth or click-out rates across categories? Did user behaviour tend to cluster, or did it vary widely by query type? And did you categorise queries in any more granular ways that could be used to stratify the data, beyond health/DIY (e.g., by query structure, industry, etc.)?

Essentially I’m wondering how likely it is for scroll depth to vary significantly across industries and different query types.

3. That section on users clicking through to social platforms to validate AIOs is fascinating—30% on desktop is higher than I’d have guessed. When you say a user clicked through to Reddit or other forums, was that specifically through the “Discussions and forums” SERP feature, or did it also include clicks on organic listings or search refinements?

Sorry if any of those are answered in the write-up and I missed it. Really appreciate the work you’ve done here.

Ralf Seybold

I remember we chatted about these things a few months back, toward the end of last year. Great to see what you've done here. This piece of work made me a paying subscriber to the pro plan.

