Google Nest Hub Redesign
Overview
Note: I am not affiliated with Google/Nest, and this exploration is not intended to condemn any existing work. This case study is purely my own and does not reflect Google/Nest's views or the research that informed their design decisions.
Duration
April-May 2019
Skills
User Research, Interaction, Motion, Voice, and Visual Design
Platform
Android
OVERVIEW
After using my Google Nest Hub every day for 4 months, I noticed that its UI could be designed to better display information at varying distances, which led me to conceive three design solutions that I believe would bring greater accessibility to the device.
Update: Google has since introduced an Ultrasound Sensing function that is similar to this concept. To my knowledge, this project was conceived before its release.
THE PROBLEM
I want to properly see and hear my device when I use it from across the room, but sometimes I have to repeat myself or move closer to the device.
Here's what I mean.
Using Google Nest Hub is great when I'm 5 feet or less away from it.
Current UI design of Google Nest Hub when asked about the weather. Simulated view at 5ft from the display. Sequences shortened.
However, when I'm standing further away, the detailed display immediately becomes less relevant to the experience.
Current UI design of Google Nest Hub when asked about the weather. Simulated view at 15ft from the display. Sequences shortened.
INITIAL RESEARCH
In my previous project, LUVAR, I conducted research and explored different ways to enhance the smart speaker user experience. One of the biggest challenges is the distance between the user and their smart speaker: the further away the user is, the more effort is required to communicate with the device.
USER TEST
To challenge my assumptions on what could be improved about the current design, I set up a user testing session and invited five smart speaker users to participate.
Assumptions
- The font size and content are too small to view from far distances.
- The complexity of information displayed should vary depending on how far the user is from the screen.
- The volume level should slightly increase when the user is far away.
Significance of Screen
- Do users still find the screen relevant when communicating from far away?
- How important is it for users to see if the device understood them correctly? Does this importance increase with their distance from the device?
- What is a comfortable font size to see at varying distances?
Complexity of Information
- Do users demand the same amount of information when communicating from different distances?
Volume Level
- Do users speak louder with the device when they are further away? Do they expect the same in return?
- Should the volume increase or decrease to an optimum level depending on how far away the user is from the device?
USER TEST SET UP
First, I made sure all the users I tested had 20/20 vision or wore glasses/contact lenses that corrected their vision to 20/20. It is important that everyone has the same visual acuity in order to obtain consistent and accurate data.
USER TEST RESULTS
Voice and Visual Preferences
As users move further away from the device, they rely mostly on sound. However, when they're closer to the device they shift their focus towards the visuals.
Information Complexity
The responses are designed to sound friendly, with additional and repeated information. In some cases, this makes them too wordy for long-distance communication.
Font Size
The font is too small to be comfortably read from more than 5ft away from the display. All users with 20/20 vision agreed that the optimal font size for reading from any distance within 20ft is approximately 50pt (for Roboto).
Volume
Currently, the device responds with the volume of its most recent setting, which could either be too high or too low. Most users agree that volume 6 or 7 is optimum for any distance within 20 feet of the device in a quiet room.
Importance of Motion
Visual cues are essential to the voice assistant experience. As users stand further away from the device and speak the wake-word, they find themselves looking for any type of motion on the display to confirm that the device heard them.
Current UI design of Google Nest Hub when asked about the weather. Simulated view at 15ft from the display. Sequences shortened.
REDESIGN GOAL
An Equalized Visual and Voice Experience
The redesign should equalize the importance of visual and voice for the Google Nest Hub. Users should not have to shift their focus between one or the other based on how far they are from the device.
REDESIGN #1
"Ok Google"
A voice assistant device's response to the wake-word delights its user and informs them that their voice has been heard.
Problem
The current design features a drop-down box that consumes only roughly 14% of the display. From further away, users begin to look for the drop-down motion rather than the box and its contents for visual cues.
Current UI design of Google Nest Hub responding to the wake word "Ok Google". Simulated view at 15ft from the display.
Solution
By utilizing more screen space, vivid animations, and larger fonts, users can confidently read from and speak with the Google Nest Hub from further distances.
Redesigned UI of Google Nest Hub responding to the wake word "Ok Google". Simulated view at 15ft from the display.
REDESIGN #2
Adaptive Interface & Response (AIR)
The complexity of information relayed each time should vary based on how far away the user is from the device.
Adaptive interface of Google Nest Hub when displaying the weather.
Current UI design of Google Nest Hub when asked about the weather. Simulated view at 15ft from the display. Sequences shortened.
Problem
Here's one of the most popular use cases for any voice assistant: reporting the weather.
From user testing, most users agreed that the final weather screen is too detailed to be viewed from any distance further than 5ft. The overall interaction could also have been shortened.
Solution
An adaptive interface and response for users that are Nearby (within 5ft) and Faraway (more than 5ft away).
Here's how it works.
There will be two similar experiences for users that are Faraway and Nearby. Users that are Faraway will receive the most concise response while users that are Nearby will receive a detailed one.
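As a minimal sketch of this branching, the logic might look like the following. The function name, parameters, and 5ft threshold are my own illustrative assumptions, not an actual Google API:

```python
NEARBY_THRESHOLD_FT = 5  # Nearby: within 5 ft of the device; Faraway: beyond 5 ft

def select_response(distance_ft, concise_response, detailed_response):
    """Pick the response tier based on the user's distance from the device.

    Faraway users get the most concise response; Nearby users get the
    detailed one, as described in the AIR concept above.
    """
    if distance_ft <= NEARBY_THRESHOLD_FT:
        return detailed_response  # Nearby: full details on screen and in speech
    return concise_response       # Faraway: only the essentials, in larger type
```

A weather query from 15ft away would then return the concise response, while the same query from 3ft away would return the detailed one.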
Adaptive Response
It is important that both experiences still deliver all the essential information. Here's how I imagine those responses could be formed.
Instead of using the assistant's "concise answer" for Faraway users and simply switching to the "default answer" for Nearby users, there is an opportunity to create a better narrative and flow by building off the "concise answer".
Here's what it looks like in action.
Adaptive design of Google Nest Hub when asked about the weather. Simulated view at 15ft from the display. Sequences shortened.
Adaptive design of Google Nest Hub when asked about the weather. Simulated view at 5ft from the display. Sequences shortened.
For Faraway users, information is relayed in the most concise manner. The display shows only what users want to know at a quick glance, such as the current weather, temperature, and highs and lows for the day.
For Nearby users, detailed information is displayed and additional information is spoken aloud.
REDESIGN #3
Volume
Volume is evidently important to any voice user experience.
Problem
If the user previously lowered the volume on their Google Nest Hub, they will have difficulty hearing the device when communicating from afar.
Solution
Introducing Speak Up.
Speak Up is an optional function that applies exclusively to the Google Assistant's speaking volume. Regardless of any previous volume setting, the Google Assistant on the Google Nest Hub will always speak at an optimum volume based on how far the user is from the device.
Optimum Volume
From user testing, the preferred volume is 6 for users located within 5ft-10ft of the device and 7 for users located at 10ft or further away in a quiet setting.
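The Speak Up mapping could be sketched as a simple distance-to-volume rule. The function name and the assumption that distances under 5ft also use volume 6 are mine; the thresholds come from the user-test findings above:

```python
def speak_up_volume(distance_ft):
    """Map user distance to the assistant's speaking volume in a quiet room.

    Per the user tests: volume 6 for users within 10 ft, volume 7 at
    10 ft or further. (Distances under 5 ft are assumed to use 6 as well.)
    """
    return 6 if distance_ft < 10 else 7
```

The Assistant would compute this per query, overriding the saved media volume only for its own spoken responses.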
PRIVACY
It doesn't have to be a camera!
Privacy will always be a priority. The proposed redesign doesn't necessarily require a camera to operate. By utilizing distance-measurement or ultrasonic sensors, the Google Nest Hub should be able to detect how far away its user is from the device.
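As a rough illustration of the ultrasonic approach, distance can be estimated from the round-trip time of an emitted pulse. This is a simplified sketch under ideal conditions; a real sensor pipeline would handle noise, multiple echoes, and calibration:

```python
SPEED_OF_SOUND_FT_PER_S = 1125.0  # approximate speed of sound in air at room temperature

def distance_from_echo_ft(round_trip_s):
    """Estimate user distance from an ultrasonic pulse's round-trip time.

    The pulse travels to the user and back, so the one-way distance is
    half the total path: d = v * t / 2.
    """
    return SPEED_OF_SOUND_FT_PER_S * round_trip_s / 2.0
```

A 0.02-second round trip, for example, corresponds to roughly 11ft, enough to classify the user as Faraway without ever capturing an image.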
Update: Google released the Nest Hub Max with a camera that has facial recognition and hand gesture control. The redesign could take advantage of the hardware upgrade, as long as it doesn't violate the user's privacy.
PROTOTYPING PROCESS
From hand sketch, to Sketch, to After Effects
I drew some wireframes and animation ideas and prototyped them in Sketch. Using the AEUX plugin I found in this article by Jonas Naimark, I was able to quickly animate my design in After Effects.