Interview home assessment - site review


#1

Hi all, I am interviewing for my first UX job ever, as a UX consultant, and after the first round I have been given a design assignment. I’d like to run my understanding of it by you, if I may :speak_no_evil: Any pointers in terms of method, presentation, etc. will be much appreciated.

BRIEF:
You now need to do a rapid expert design review of the https://www.o2.co.uk/ site.

Focus on walking through the following scenario:

“The user is approaching the site with the objective of finding a good deal on a preferred phone, they
are currently using a Samsung S7, they are thinking about the Samsung S9, but are open to a bit of
comparison. They are not currently on contract but would definitely consider this as an option.”

Presentation Requirements:
• Please review the site and build a discussion pack for a 1-hour chat – 30 mins
presentation, 30 mins discussion
• This presentation needs to be no more than 15 content slides, covering:
o How you did the assessment
o What you learned from the assessment (problems, successes and clear
recommendations)
o What we should do next to fix the site
• Please use screenshots from the website to illustrate key points
• Use the insights gained on personas and stakeholders to guide your commentary
• It is NOT necessary to comment on every screen etc. Pick comments based on value
• You will be presenting back to an executive and senior design team from the client

“You would need to finish the assessment by Wednesday, and then during the slot / session do a playback of your assessment”

MY QUESTION:

What is a rapid expert review deliverable exactly?

Is my personally evaluating the user task mentioned above against a set of heuristics, and documenting the good, the bad, and my recommendations for improvement, enough?


#2

It’s not in my UX Methods book so it doesn’t exist :rofl:

I think it is a Heuristic Evaluation?


#3

I think that’s how i will approach it?


#4

Yeh, exactly this - an expert review and a heuristic evaluation are pretty much the same thing.

In terms of format, what I find works is:

  1. take screenshot of page
  2. highlight the problematic areas, number them
  3. make notes/review per number - remember to refer back to user goal/objective

keep strictly to their guidelines

good luck


#5

awesome challenge!

A rapid expert evaluation means nothing.

It’s just a heuristic review based around a set of user needs.

Go through the task as requested and document any problems you find with screen grabs. Don’t worry about categorising them yet.

When you’ve finished set up a bunch of slides:

• Very brief description of method, which is heuristic evaluation: “I clicked through the site and looked for problems. When I encountered a problem I assigned it to one of these areas: xxxx”
• Description of main findings and what the implications are for the user
• Recommendations and why they benefit the user

edit: the methodology really is that simple: I just clicked through the website looking for problems. If you said that to me at interview I would be totally cool with that, because it’s a good method!


#6

I agree with LordMolesbury, but would add a bit more in Methodology. I would use that area to explain what criteria I used, such as Nielsen’s Heuristics, and include a brief summary of them (with a link to them at the bottom of that slide). This will also set you up for the rest of the presentation to reference them by short name instead of having to stop to explain in great detail again.


#7

Whoa, whoa, whoa, slow down.

Are you saying that this is a live project with a direct client that you’re working on?


#8

I think that line was left there accidentally. I can’t imagine I am presenting to a real client - I was not informed of this anywhere other than that line in the brief…


#9

I wouldn’t assume this is accidental. Ask. I’m working on a long-form piece about this kind of BS test, and it’s not unheard of.


#10

@dougcollins is right, smells like a free evaluation to me - do this with several candidates and you get a nice consultation.
Could be wrong though. You should get it cleared up - ask why it’s being presented client-side and not to the hiring team.


#11

I sensed that too, but figured O2 would never put their own site up for eval.

Oh, one more thought: they say ‘good deal’, which implies you need to say why a phone choice is a good choice compared to others. What is a ‘good deal’?


#12

I suppose the company you are applying to isn’t O2, but they could be doing UX work for O2, or developing something similar. I would do some research on that.
On the other hand, they may not be connected with O2 in any way, and the task may be totally innocent.


#13

Does anybody know what exactly is a “a discussion pack for a 1-hour chat”?


#14

No, sounds like a badly worded hand-waving ‘thing’


#15

I’ve seen a lot of these ‘badly worded’ requirements. Even if they are innocent and don’t have bad intentions behind them it’s already a red flag for me. Would you want to work somewhere where things are not clear to you? Where you have to always second guess what is meant? Imagine their project briefs. No thanks.

Also, I’ve done these tests before - not for UX, but back when I was a digital designer. They are mostly scams using your time for free services. Plenty of other options to focus your time on, if you ask me.


#16

Hi all, an update: it is innocent - confirmed.


#17

Thanks Ari, I don’t have the luxury at the moment.


#18

Hi All,

Thanks so much for the guidance thus far, I’m stunned at how helpful everyone here is.

If I may ask one more question:

Without actual data how would you suggest I comment on success metrics of the design?

Overall Success Metrics for the Design
• Volume and Value of Sales
• Cross Sell per Sale
• Low Cost Channel Migration
• Registrations for Self-Service
• NPS (Customer Satisfaction Measure)
• Ease of Use (Speed, Accuracy)
• Reduction in Abandonments

As of now I feel I’ve incorporated them here and there, kind of willy-nilly. My “what next” slide is the following:

"Our task here isn’t to make the interface look pretty — it’s to make sure everything on the screen makes sense. It’s to make sure people can actually use what we build.
So before we jump to redesigning something, we’re going to identify what problems need to be solved in the first place — and determine whether they’re worth solving at all.

  • Expert reviews (heuristic / cognitive walkthrough) covering more scenarios on both the web and mobile interfaces
  • Usability testing with real users on both the web and mobile interfaces
  • I would begin with mobile, as the personas indicate this would most likely be the point of contact and interface of choice
  • Perform an accessibility audit on both the web and mobile interfaces
  • Get deeper insights with Google Analytics"

Any feedback would be much appreciated.


#19

How can you comment on these metrics? Without data?

Overall Success Metrics for the Design:
• Volume and Value of Sales
No data: no comment possible. What do they mean? Can you benchmark?
• Cross Sell per Sale
No data: but I assume you can come up with a target for success
• Low Cost Channel Migration
What does that even mean?
• Registrations for Self-Service
What does that mean?
• NPS (Customer Satisfaction Measure)
No idea
• Ease of Use (Speed, Accuracy)
This is in itself a whole bunch of metrics, from clarity of information architecture (“I can find the things I need quickly”) to aesthetic design (“I know to click on this big button”). Ill-defined metric.
• Reduction in Abandonments
No data: but you’ll need to benchmark. Use the Baymard Institute for figures.
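On NPS specifically: even without O2’s data, the metric itself is well defined, so it’s worth knowing the mechanics. Respondents rate likelihood to recommend on a 0–10 scale; 9–10 are promoters, 0–6 are detractors, and the score is the percentage-point difference between the two groups. A minimal sketch (the example scores are made up, purely illustrative):

```python
def nps(scores):
    """Net Promoter Score from a list of 0-10 survey responses.

    Promoters score 9-10, detractors 0-6, passives (7-8) are ignored.
    Result is % promoters minus % detractors, ranging from -100 to +100.
    """
    if not scores:
        raise ValueError("need at least one response")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Illustrative data: 5 promoters, 3 passives, 2 detractors -> score of 30
print(nps([10, 9, 9, 10, 9, 8, 7, 8, 6, 3]))
```

So the honest answer for the slide is: you can explain how the metric would be measured and what a good benchmark range looks like, but you can’t comment on the current score without survey data.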


#20

So here’s the thing: these design exercises almost never end well.

Question - if you did, would you walk away from the assignment?

EDIT: Just to say that this assignment and employer have raised so many :triangular_flag_on_post::triangular_flag_on_post::triangular_flag_on_post:. Personally there’s not a chance in hell I’d complete the assignment, but I understand where @bryce_chaikin is coming from. A shrinking bank account is a hell of a drug.