🗣️ I contributed to the design of Voice to Insight at SAP – a machine learning, NLP-based search feature that lets users ask questions about their business data and receive an automatically generated chart
Role + Timeline
As a UX intern, I co-designed the vision and interaction design for mobile voice search. The feature was designed and shipped over a six-month period in close collaboration with a product manager, three developers, and my design manager.
Bringing a More Mobile-Friendly Pattern
On desktop, users can type a question and select the appropriate model directly from suggestions. On mobile, however, typing a long, complex question can be cumbersome.
Our mobile users primarily wanted quick business insights from their data, while desktop users tended toward more in-depth work creating charts and datasets.
So while the mobile app already supported the existing typing-based approach, the top customer request was a voice flow.
Activate a Search
Tapping the microphone button instantly activates a voice search. Users can then speak their question, and a relevant chart is automatically generated from it.
Testing Different Interaction Methods
Throughout the design process, we explored different activation methods. Ultimately, a simple microphone button offered the most seamless way to start a search in real time.
Clear Animation Interaction States
There are three stages to a voice search: starting, listening, and processing the query into a result. Each stage was communicated through a different animation speed.
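The three stages described above could be modeled as a small state machine, with each state mapped to a relative animation speed. This is a hypothetical sketch only: the state names, speed values, and function names are illustrative and not SAP's actual implementation.

```typescript
// Hypothetical model of the three voice-search stages; names and
// speed multipliers are illustrative, not SAP's implementation.
type VoiceSearchState = "starting" | "listening" | "processing";

// Each state maps to a relative animation speed so users can tell
// the stages apart at a glance.
const animationSpeed: Record<VoiceSearchState, number> = {
  starting: 1.0,   // gentle pulse while the mic activates
  listening: 1.5,  // faster pulse to signal active listening
  processing: 0.5, // slower motion while the query resolves
};

// Advance through the stages in order; "processing" is terminal
// until a result (or the keyboard fallback) takes over.
function nextState(state: VoiceSearchState): VoiceSearchState {
  switch (state) {
    case "starting":
      return "listening";
    case "listening":
      return "processing";
    case "processing":
      return "processing";
  }
}
```

Keeping the stage-to-speed mapping in one table like this makes it easy for design and engineering to tune the animation per state without touching the transition logic.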
Building a Fallback
Voice input can sometimes fail. Users could recover by switching back to keyboard input whenever they needed to change part of the query.
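The fallback behavior can be sketched as a simple decision: use the voice transcript when recognition succeeds, otherwise hand any partial text to the keyboard input for editing. The interface and function names below are hypothetical, introduced only to illustrate the pattern.

```typescript
// Hypothetical recognizer result; field names are illustrative.
interface VoiceResult {
  ok: boolean;
  transcript?: string;
}

// Decide whether the query proceeds from voice or falls back to the
// keyboard, preserving any partial transcript for the user to edit.
function resolveQuery(
  result: VoiceResult
): { mode: "voice" | "keyboard"; query: string } {
  if (result.ok && result.transcript && result.transcript.trim().length > 0) {
    return { mode: "voice", query: result.transcript };
  }
  // Recognition failed or returned nothing usable: fall back.
  return { mode: "keyboard", query: result.transcript ?? "" };
}
```

Preserving the partial transcript in the fallback matters: the user only has to correct the misheard part of the query rather than retype the whole question.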
Recognition Post-Launch
The feature was highlighted in the Quarterly Update video, which showcases top features from product management. You can learn more in the video above and in SAP's blog post.