With the Rise of Integrated Ecosystems, What's the Future of UX?
The online world has always offered astonishing possibilities, but in recent years a number of once-nascent technologies have matured and achieved mainstream appeal, showing that the rich potential of a global network is ready to be realised. We can see this in the advent of complex digital ecosystems: Google, for instance, expanded from a basic search engine into the sprawling suite of services that’s so dominant today.
After all, it’s one thing to have countless powerful options, but it’s another thing entirely to use them in combination. Now that standards have been agreed upon and APIs have been established, developers are free to get creative and find fresh ways to turn disparate elements into compelling systems. ERP software, for instance, is valued precisely for its power to unify.
These systems have been highly effective at improving productivity, yes, but they’ve also fundamentally changed standards of UX (user experience). So many software frustrations stem from clumsy seams between functions, and since tightly-woven software ecosystems came along, there’s been a marked improvement in this area.
But as the density of digital connections continues to scale up, how can we expect UX to change further?
How will smartphone experiences change?
Think back to when the first iteration of the iPhone was released. Early cell phones had classic tactile buttons that worked very well, but the first “smartphones” shipped with clunky resistive touchscreens. Whether operated by stylus or finger, they were extremely awkward, and even the best implementations were lateral UX moves.
The iPhone changed the game and heralded the future of the consumer electronics industry by bringing the capacitive touchscreen to the mainstream — but even more than the technology, it was the user interface that mattered. Apple was the first company to understand how to make touch interfaces intuitive enough to suit the average consumer.
Today, smartphones of all stripes (whether they run iOS, Android, or even a third-party OS) have very slick and fast interfaces, so the challenges aren’t mechanical but operational. Users rely on large selections of apps, and they want those apps to work together more seamlessly.
To that end, I anticipate smartphones of the future having tightly integrated functions and storage options, with broad contextual analysis allowing operating systems to figure out what users are trying to achieve and call upon relevant services and/or apps to assist. With Moore’s law faltering, it will become ever harder to stand out through superior mobile hardware, leaving it to software developers to find ways to help devices outperform their rivals.
As a result, ten years from now, there will be vastly fewer mobile actions that require more than one or two inputs. Biometrics in combination with other sensors and contextual awareness will provide near-prescient levels of personalisation. Users will expect their devices to know what they mean and act accordingly. The age of manually toggling between apps will likely have come to an end.
Will VR/AR tech finally become mainstream?
For the mobile world, I view the current touchscreen interface as the natural endgame — at least, from a design standpoint — but that doesn’t mean it can’t be extended. Through the success of devices built around Amazon’s Alexa voice assistant, we’ve seen that people are willing to use other input methods as long as they work consistently and the circumstances are practical.
It bears asking, then, what role VR and AR will play in the UX of the future. Despite having been in development for many years, these technologies are not truly mainstream, still relying on prohibitively expensive hardware and offering too few use cases to be ubiquitous in the way that the touchscreen now is. Are we ever going to see the dream of the full-immersion VR experience become a reality?
Well, possibly — but not in the foreseeable future. Even if costs came down massively and there were VR content and UIs available for every system, the hardware would remain a problem. Headsets have come a long way, but they are still bulky, isolating, and impractical to wear for everyday use.
AR, however, has a much brighter future. The costs are already sufficiently low (only requiring a basic smartphone), and the applications are there, primarily in the virtual previewing of physical products and in personalised navigational and informational overlays. Even if we never see a mainstream-ready set of AR glasses (delivering on the promise of Google Glass), everything is ready for smartphone AR to grow into an integral part of UX standards.
Particularly in the eCommerce world, the future of mobile UX has a significant AR component. Instead of trekking to conventional retail stores, users will have choices: they’ll be able to preview 3D models in their home environments, or go to physical stores and benefit from the aforementioned AR overlays to find products, check stock, preview colours, consult support, and generally save time and effort.
How do businesses need to adapt to survive?
Having considered what the UX of the future is going to involve, we must now think about what businesses must do now to move with the times. At a minimum, I’m confident in making the two following general predictions:
Businesses will need to be fundamentally open to collaboration, even with their competitors.
Users don’t feel much blind loyalty to specific companies these days, and they no longer benefit significantly from having go-to brands. After all, they can find alternatives very quickly through simple searches.
Any company that doesn’t maintain a welcoming API and support its customers in engaging with its products and/or solutions from any system will have a difficult time. Think about how Sony was essentially pressured into supporting cross-play for Fortnite, thus removing an incentive (however small) for users to buy PlayStation hardware.
An eCommerce store that doesn’t provide AR content will be at a severe disadvantage relative to its competitors.
The experiential gap between in-store buying and eCommerce is a sticking point, but options like virtual mirrors and in-home furniture placement mitigate this shortcoming. If store X offered comprehensive product previews and store Y did not, you’d likely go with store X for the peace of mind.
Today’s online retailers need to be looking ahead and thinking about how they can make their UX designs more experiential. If you’re setting up an eCommerce store in 2018, or if you're looking to buy a website to adapt, then make sure you choose a suitable CMS. Shopify has a track record of supporting VR/AR tech, for instance, so it seems a solid bet.
Essentially, depending on the nature of your business, you either need to work with your competitors or find ways to technologically outperform them. If you don’t, you’ll be left behind.
The mixed impact of convenience
On the whole, advances in UX stemming from widespread IoT-style integration are to be celebrated. After all, why wouldn’t you want a superior user experience? However, I do feel it’s important to sound a note of caution about the perils of extensive automation.
There’s been some talk recently about the prospect of automation eroding our skills and leaving us overly dependent on systems we don’t fully understand, and it’s a concern worth taking seriously.
On the whole, though, I’d say it’s a matter of ensuring that we always have the option of taking manual control. When we bring in impressive new software systems to replace their outdated predecessors, the onus is on us to stay aware of how they work, even as we come to rely on their increased convenience.
Patrick Foster is a writer and eCommerce expert for Ecommerce Tips. He hopes to live to see a point at which he no longer has to deal with capricious PDF viewers on his phone. Check out the site for some actionable tips, and follow along on Twitter @myecommercetips.