Do Users Understand Mobile Menu Icons?
Call it burger, hamburger, navicon or whatever – it’s that icon made up of 3 bars that indicates a menu.
Indicates to whom exactly? Like virtually every other element I add to a website, I just assume that users will know.
This interesting Twitter conversation made me think otherwise.
@lukew In my mobile testing a surprising no. of ppl dont know the icon that well. So I’ve added “Menu” sometimes. Depends on the users.
— Anders Matre Gåsland (@andersmg) February 6, 2014
So I ran a small A/B test using Optimizely*. The results were a little underwhelming but interesting.
- (Control) Icon only, no text or border.
- Same as control, but with “MENU” written below.
- Same as control, but with a rounded border.
The differences were negligible: slightly more users clicked the icon + border than the icon alone.
I suspect this is because the border draws the eye a bit more.
The experiment was targeted at all mobile browsers, and ran across all pages on the site.
What’s surprising is the low number of clicks – only 2% of users actually open the menu. This reflects the kind of site it is (blog-type content), as opposed to a more highly engaged service-type site (e.g. Facebook).
In my opinion, discerning the user’s overall intent is far more important (and challenging) than the exact appearance of the menu icon.
If users have a reason to go looking for something, they will happily spend the extra millisecond finding whatever looks like a menu.
Desktop Behavior is Different
Last year I did a similar test on the desktop layout of the site.
I tested 4 variations of the content inside a blue button:
- (Baseline) The word “MENU”
- The word “MORE”
- Hamburger icon + “MENU”
- Hamburger icon + “MORE”
The results were far more significant.
Variations 2, 3, and 4 performed far worse (18%, 31%, and 43% worse, respectively).
Adding the bars icon appeared to confuse users and over-complicate a simple button.
An Expensive Test
* Optimizely’s billing and pricing are painful. They charge monthly for a fixed number of visits to your A/B test page(s), and it’s not cheap.
They charge you when you go over the monthly allocation, but don’t roll over any unused visits. It’s a lot like government tax departments that charge you interest for late payments, but don’t appear to do the same when they owe you a refund.
With Optimizely I was on a 2,000-visits-per-month plan.
My experiment quickly hit 19,684 visits before I was able to pause it. Bill? An extra $117 for those extra 17,684 visits – ouch!
Even 20,000 visits does not buy you much statistical confidence when the click-through rate is only around 2%.
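To see why, here’s a quick sanity check: a two-proportion z-test on numbers in the same ballpark as this experiment. The visit counts and click counts below are made up for illustration (the post doesn’t publish per-variant figures), but they show how a seemingly visible gap at a ~2% click rate fails to reach significance at this sample size:

```python
import math

def two_proportion_z_test(clicks_a, n_a, clicks_b, n_b):
    """Two-sided z-test for the difference between two click-through rates."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    # Pool the rates under the null hypothesis (no real difference)
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical: ~6,500 visits per variant, 2.0% vs 2.3% click rates
z, p = two_proportion_z_test(130, 6500, 150, 6500)
print(f"z = {z:.2f}, p = {p:.3f}")  # p well above 0.05: not significant
```

A 15% relative lift in clicks sounds meaningful, but with a 2% base rate it is statistically indistinguishable from noise at this scale.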