What would you say will be the next big trend in the UX Design industry?
From what I’ve read thus far, I would say voice control of websites - particularly due to Siri and Alexa.
My gut says the next big challenge for UX people will be designing experiences for smart TVs.
More and more brands are developing UIs to allow their apps and devices to connect to smart TVs.
Those same smart TV UIs have plenty of room for improvement in terms of UX.
I don’t know if this will be a big trend, but there have been two threads recently about designing for children and seniors. The reason I think this may be a trend is that there doesn’t seem to be much information out there about that.
@Piper_Wilson seniors are normally neglected when it comes to anything technological, which is a shame because I’m sure they were tech-savvy when they were younger. I often wonder what it may be like for us (relatively) younger folk when we get older. I think this is more evident now that even toddlers know how to use iPads better than their parents.
I don’t fully agree with that. In my opinion, there’s a specific pattern in this scenario:
Elderly people need a real scope/purpose to approach tech.
A couple of weeks ago I read some data from the Apple training program (the one available at the Apple Stores). I was impressed by the number of over-60s attending the courses (how to use a specific device, how to post content on social networks, how to take pictures with mobile devices, and so on). They have great motivation to learn and to approach tech because they feel it’s the only way to strengthen their connections with family and friends.
Millennials and tech-savvy users, on the other hand, often try a new app or a new device simply because it’s trendy and/or viral.
Just think about the motivation you need to buy a new device (e.g. a new smartphone or a new activity tracker), to register for a new social network, or to download a new messaging app.
I think we’ll be moving more and more away from screens - the sales of the Echo in the US, the launch of Google Home, and the reported new tech surrounding Siri suggest that voice will be something we need to pay serious attention to. Conversational interfaces too - it’s very early days, but it’s hard to ignore… VR… AR…
With regards to ageing, it will have a big impact for sure. This is talked about a bit in accessibility circles, https://en.wikipedia.org/wiki/Aging_of_Japan - a few months ago I had a joint meeting with a company here in Amsterdam whose entire product line focuses on the ageing population. It’s also quite interesting to see a company such as Philips, which 10 years ago you would have called a TV manufacturer, now being looked upon as more of a health-sector company.
I know they’re saying this about every industry right now, but it’s machine learning. Not just in terms of products and interfaces that UX designers work on, but also in terms of the UX process. One of the biggest problems with UX as an org function is that it doesn’t scale very well. I think ML can fix that, and I wrote this article about it on LinkedIn this morning:
By now, most companies of a certain size have made at least one attempt to institutionalise UX. But few (if any) have managed to truly make it a fluid part of their everyday design processes.
Oftentimes UX initiatives are doomed from the beginning because the company’s culture simply lacks sufficient interest in the user’s experience. Sometimes UX fails because of functional silos that prevent the designated “UX Team” from doing their job. Other times management just doesn’t equip its UX team with enough resources to be effective.
But what’s always ignored in the post mortem of every failed UX initiative is a simple economic flaw that underpins all of the above problems: UX doesn’t scale. At least not in its current paradigm.
Today, UX is all about qualitative research and user testing.
But the more users you have, and the more universal your products are, the harder it is to extrapolate general insights from the feedback of a few users. It gets harder to define people through a persona. This problem is multiplied by the number of products you offer and the number of design projects you have at any given time. The bigger the company, the harder it gets to amalgamate or summarise feedback. And recruiting 500 user research participants per year is a heck of a lot harder than recruiting 5 for a small, limited-scope study.
To make things worse, UX research is essentially a separate function from the actual design process. It’s a function owned by one or a small group of people, and for them, organizing and scheduling user research gets significantly more complex as research activities get ramped up across different projects.
Perhaps the most painful part: the feedback loop between research and design is too long, putting companies in an unfortunate position: if they do as much user research as they should, they’ll never ship on time. Every missed deadline adds to the backlog.
So where does that leave us? There’s no doubt that the basic concept of UX is incredibly powerful. Focus relentlessly on the user, and business success will follow.
And UX research is fundamentally needed because corporate designers and decision-makers are biased. Without objective user data, the design process quickly devolves into Groupthink, Design-By-Committee, Stakeholder Bargaining, and Convenient Assumptions.
So how can we capture all the nuance and complexity of the user in such a way that we can seamlessly insert it into the design process and make it fundamentally work at “BigCo” scale?
The answer, in my opinion, is machine learning.
Every time we collect user feedback, we should use that to “learn” and predict how users will respond to future design initiatives. Not based on our gut feel or “expertise”, but an objective understanding of users’ preferences.
It seems counter-intuitive, but the future of UX research might be an AI. That’s certainly how we’re approaching things at EyeQuant, where our web service offers instant, personalised feedback on website UI based on past user research.
Using predictive algorithms based on past research could enable us all to get meaningful user experience insights almost instantly, with extremely low marginal cost, while radically reducing the need to co-ordinate and plan live user research sessions.
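To make the idea concrete, here is a deliberately minimal sketch of that kind of predictive approach - not EyeQuant’s actual method, just a toy nearest-neighbours model. It assumes each past-tested design can be summarised as a small feature vector (the features and ratings below are entirely made up), and it estimates a new design’s user rating as the average rating of the most similar designs already tested:

```python
# Toy sketch: predicting user feedback for a new design from past research.
# Each past design is a hypothetical feature vector (e.g. contrast, text
# density, call-to-action prominence) paired with an observed user rating.
import math

def euclidean(a, b):
    """Distance between two design feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def predict_rating(new_design, past_designs, past_ratings, k=3):
    """Estimate a rating as the mean rating of the k most similar past designs."""
    ranked = sorted(zip(past_designs, past_ratings),
                    key=lambda pair: euclidean(pair[0], new_design))
    nearest = ranked[:k]
    return sum(rating for _, rating in nearest) / len(nearest)

# Hypothetical past research: feature vectors and average user ratings (1-5).
designs = [[0.9, 0.2, 0.8], [0.4, 0.7, 0.3], [0.8, 0.3, 0.7], [0.2, 0.9, 0.1]]
ratings = [4.5, 2.8, 4.2, 2.1]

# Instant, zero-marginal-cost "feedback" on an untested design.
score = predict_rating([0.85, 0.25, 0.75], designs, ratings, k=2)
```

A real system would obviously use far richer features and a properly trained model, but the economics are the point: once past research is encoded this way, each additional prediction costs essentially nothing.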
There’s already a niche market for this, and I think it’s now a question of when, not if, this type of UX research goes mainstream.
What do you think: is machine learning the key to making UX scale?
Following current trends and usage, designs must be creative and user-oriented, so I expect design to move to another level that really attracts users.
Actually, this trend has sort of passed. I worked at Philips TV a few years ago, and we were looking into this, especially with Android as the OS. The software is alright, but the hardware doesn’t have enough performance (apart from Samsung’s). The trend in TV is to get rid of the smart TV environment and provide companion apps instead, which are much easier to use (remotes kind of suck for that sort of input) and run on much more performant devices. In the end, you’ll be casting content to your TV screen, like you already do with Netflix etc…
@glenn I have noticed this too - there are a few products like the Amazon Fire Stick and similar devices that are media centres crossed with Android gaming capabilities. Really interesting; I tend to look at games UI as a starting point for web trends.
Interesting to hear about this topic from someone who worked on such products.
Actually, this was my point, maybe not well explained.
As a user, I’m noticing more and more effort, from the maker side, invested in the TV Entertainment sector.
The new iOS will have specific apps to better control and configure smart TVs with Apple TV.
Having listened to the UX podcast lately, watched some of the implosion at Uber, and seen hints of the same at other big tech startups, I hate to be the wet blanket, but I think the next big trend will be a serious discussion of professional ethics.
Engineers have a code of ethics and psychologists have a code of ethics. They both affect public health, but we’re somewhat of a derivative of both and we really don’t have that conversation about ethics frequently enough.
We have a way to strongly impact how people structure their lives, approach their finances and health, and even what information they consider true. I think we’ve probably been approaching it with our eyes closed and I’m afraid we’re going to have some definitive examples of what it looks like when we don’t take responsibility.
I think it’s AI.