Project
Our team of four conducted a usability evaluation for a client developing a novel third-party iOS keyboard concept called KLOA. The app aimed to minimize typing errors through larger buttons and auto-correct, with a visually and mechanically distinctive design compared to other keyboards available for iOS. Our goal was to assess ease of adoption, perceived utility, and level of interest in the keyboard, and to provide design recommendations based on our findings.
(Note that we conducted testing on the version at top right, although the app’s look and feel changed during the project.)
Challenges
One of the main challenges we faced was determining whether the app's novelty would affect users' ability to provide meaningful feedback. Additionally, we needed to evaluate the keyboard's usability and gauge users' level of interest in using it.
Role & Contributions
As the system manager and client liaison/coordinator, I played a central role in managing various aspects of the project. Specifically, I led the graphic design and editing of our usability study kit, oversaw the usability testing and recording setup, managed data logging, and handled all video editing. I was also responsible for editing the presentation and report documents, and contributed graphics for these materials. Throughout the project, our team collaborated in real time using tools such as G Suite and Skype to plan and edit our documents and reports.
Process
The keyboard consists of two main parts: the alphabet bar and the touch targets. The letters serve only as a visual indicator; typing is done through combinations of taps, double-taps, and swipes in the bottom area.
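To make the two-stage idea concrete, the sketch below models one way such an input scheme could work: each touch target covers a small group of letters from the alphabet bar, and the gesture picks one of them. The letter groupings and gesture assignments are hypothetical placeholders for illustration, not KLOA's actual mapping.

```typescript
// Hypothetical model of a two-stage keyboard: each touch target covers a
// group of letters shown in the alphabet bar, and the gesture chooses one.
// The groups and gesture order below are illustrative, not KLOA's mapping.
type Gesture = "tap" | "doubleTap" | "swipe";

const LETTER_GROUPS: string[][] = [
  ["a", "b", "c"], ["d", "e", "f"], ["g", "h", "i"],
  ["j", "k", "l"], ["m", "n", "o"], ["p", "q", "r"],
  ["s", "t", "u"], ["v", "w", "x"], ["y", "z", " "],
];

// Each gesture selects a different letter within the chosen target's group.
const GESTURE_INDEX: Record<Gesture, number> = { tap: 0, doubleTap: 1, swipe: 2 };

function letterFor(targetIndex: number, gesture: Gesture): string | undefined {
  const group = LETTER_GROUPS[targetIndex];
  return group ? group[GESTURE_INDEX[gesture]] : undefined;
}

// Example: under this placeholder grouping, tapping the second target types "d".
console.log(letterFor(1, "tap")); // "d"
```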
To learn about the product and the business owner's interests, we met with our main stakeholder.
Impromptu Usability Tests: We conducted usability testing with friends and passersby to gauge how much effort a new user would invest in learning how the app works and how much they would understand on their own. In all 10 cases, frustration led to the phone being handed back within 5 minutes. We concluded that onboarding materials were necessary to obtain useful data when testing the full app.
Interaction Map: I created an interaction map document that graphically and textually described the functionality of the app. Since we were not the designers of the app, it was important to understand every feature, option, and interaction available. This let us refer to parts of the design we had not used much and kept us from overlooking lesser-used features or sections. Although almost everything is visible in the app, there are two input modes (one for numbers, one for letters), and switching between keyboards in iOS needs to be explained to anyone purchasing the app.
Interaction Map
Main Usability Study Goals:
Will users dedicate time and energy to learning the interface with a tutorial?
Do users learn the affordances of the letters and the keypad regions?
Would users find the interface an improvement over existing keyboards?
Usability Study Kit: We compiled a usability study kit documenting all of our testing preparations. It included our recruitment ad and screener form, which we used to confirm that participants met our criteria; the screener focused on mobile device familiarity.
Before testing, we had users sign a consent form and fill out a pre-test questionnaire so we could gauge any changes after they tried the app. The questionnaire covered device familiarity, whether they typically typed with one hand or two, how often they triggered auto-correct, and their interest in using third-party keyboards. Participants received a gift card and were told they could leave at any time, to ensure their comfort and prevent frustration.
As part of our preparations, we created a web-based onboarding site. The phone was set to the first page of the site with the KLOA keyboard activated.
Web-Based Onboarding Site
We began each session by reading from our testing script to keep sessions consistent and avoid skewing results through variations in how we described tasks. We also provided participants with a user task list for easy reference as we progressed through the script.
Usability Study Kit
Lab and Study Setup: We used QuickTime on a Mac to record the iOS device screen and GoToMeeting to stream both a USB webcam feed and the full screen. Streaming the audio and screen output allowed another team member to take notes during each session without being in the room. We recorded separate video clips for each feed and later composited them together. The overhead camera view was useful in some cases, but we did not include it in the final composite.
Session images
Post Study: After each session, we asked the participant to fill out a post-test questionnaire. We then edited the multiple video feeds from each session into one combined clip per participant, which all four of us used to enter notes and details about the study. This enabled us to quantitatively determine whether participants were improving over time in speed, completion rate, and other measures.
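As a rough illustration of the kind of per-task comparison this logging enabled, the sketch below averages time on task across participants for each task in session order. The field names and structure are placeholders, not our actual log format.

```typescript
// Illustrative shape for the notes we logged per task; field names and
// structure here are placeholders, not our actual logging format.
interface TaskLog {
  participant: string;
  taskIndex: number;   // order of the task within the session
  seconds: number;     // time on task
  completed: boolean;
  errors: number;
}

// Average time on task for each task index across all participants, so we
// can see whether tasks attempted later (after more exposure) went faster.
function averageSecondsByTask(logs: TaskLog[]): Map<number, number> {
  const totals = new Map<number, { sum: number; count: number }>();
  for (const log of logs) {
    const entry = totals.get(log.taskIndex) ?? { sum: 0, count: 0 };
    entry.sum += log.seconds;
    entry.count += 1;
    totals.set(log.taskIndex, entry);
  }
  const averages = new Map<number, number>();
  for (const [task, { sum, count }] of totals) {
    averages.set(task, sum / count);
  }
  return averages;
}
```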
Presentation & Reporting: We presented our summarized findings to the class and the client and met with the client to present our full findings and answer their questions in depth.
Learnings & Future
Onboarding: Depending on the complexity of a design, some amount of hand-holding may be desired or even required for users to utilize it fully. It is easy to assume that a simple-looking app doesn’t need an introduction, but that isn’t always the case.
Setting and Scenario: Our initial impromptu testing with friends and passersby happened ad hoc, often when these people had something else to do or somewhere to go. The change of venue and scenario to a conference room, with each person scheduled to help with our usability test, likely affected the quality of the data we were able to capture. For an early prototype like this, adding onboarding materials and giving participants a reason to engage was helpful. For a complex app closer to launch and aimed at a broad, casual market, it would be crucial to test with users who were not prepared to sit and learn.
Video Recording & Streaming Setup: What I realized after running a few sessions was that the best setup is often the simplest one. Instead of recording individual streams of the phone screen, the camera aimed at the person, and the overhead camera watching their hands, I would arrange all of these to be visible on one screen at the same time and record that screen. We did not have a chance to do a quick run-through with the testing setup, and doing so would have reduced the number of technical issues we experienced.
I have since realized that the streaming software OBS is well suited to multi-camera usability studies like this. It allows you to record and stream from the same app, save your layouts, and choose which mics are recorded. This could make recording and streaming nearly one-click, and it would have improved our video output, reduced the time needed to process the footage, and produced a setup we could share with others.
Record Longer: In the future, we would choose to record the participant during the pre-test and post-test questionnaires. A fair number of our participants ended up mostly speaking their answers aloud, and we had to track that on paper. It would be good to have audio or possibly video to fall back on.
Study Web Site: It would be ideal, in the future, to modify the usability study website to record “time on task” for each section. This would eliminate a lot of scrubbing through video clips to find when each section started and ended, and the manual subtraction to work out how many seconds each section took to complete.
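A minimal sketch of how the study site could capture time on task in the browser is below; the endpoint and payload shape are assumptions for illustration, not an existing API.

```typescript
// Minimal in-browser time-on-task logger for the study site. The endpoint
// and payload shape are assumptions for illustration, not an existing API.
const sectionStarts = new Map<string, number>();

function startSection(sectionId: string): void {
  sectionStarts.set(sectionId, performance.now());
}

function endSection(sectionId: string): void {
  const start = sectionStarts.get(sectionId);
  if (start === undefined) return;
  const seconds = (performance.now() - start) / 1000;
  // Post the duration to a hypothetical logging endpoint on the study server.
  void fetch("/log-time-on-task", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ sectionId, seconds }),
  });
}

// Usage: call startSection when a section's page loads and endSection when
// the participant advances to the next page.
```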
Outcome
We presented our findings to our client and class in the form of a presentation, including a slide deck with embedded video clips to show examples.
Key Initial Findings:
All six participants failed to grasp that they needed to select the numbered touch targets to input letters.
All six participants were confused by the interface.
Observations:
By the end of the study, all participants were more competent.
Five of the six participants were able to correctly describe how to complete the full range of interactions after the study.
Error rates increased quickly as participants progressed.
Next Steps:
KLOA may be suited to large screens and alternative interfaces, such as gaming, VR, or hardware like the Microsoft Kinect, where the input selectors are much further apart, and keyboards may not be default options.
It is possible that a simplified interface with larger buttons, like this one, could be a useful input method for users with limited hand dexterity, and potentially for use cases such as typing with gloves, scuba diving, or military and astronaut scenarios.
Sample Footage of the Final Task
Areas of Opportunity:
High Cognitive Load: 6 of 6 participants reported low confidence and high stress while using the keyboard.
“Stress level? It’s definitely rising. I have no clue why I would switch to this at this point.”
Recommendations:
Users need a guided tutorial to prevent early loss of interest.
The incentive to adopt (typing speed, error correction, joy of interacting with the app…) needs to outweigh the time, energy, and stress required to learn it.