rachel pollock design

Collector
mobile rock-collecting companion for the novice geologist
fall 2018 · 2 weeks · UI / Interaction Design
how might I leverage the affordances of both a traditional field guide and a mobile phone to help someone classify a rock?
This 2-week exercise in UI design asked me to translate a traditionally analog tool into a mobile application. I chose to design a field guide for classifying rock specimens.

persona
Who might use a traditional field guide? I chose to focus on the novice user and landed on the following persona and use-case scenario:
Dorian is a thirteen-year-old Boy Scout who wants to earn his geology badge.
One of the badge’s requirements is to: “Collect 10 different rocks or minerals. Record where you obtained (found, bought, traded) each one. Label each specimen, identify its class and origin, determine its chemical composition, and list its physical properties. Share your collection with your counselor.”
task analysis
Ah yes, the task analysis. I’m getting excited just thinking about it - this is one of my favorite parts of any UI project.
Anyway, the user’s primary task is to properly label each specimen he collects. But how does one go about classifying a rock? And how could a computer do the same thing? Would the process be different? I dug deep into the world of rock classification (which, surprisingly, I found genuinely interesting) and determined that an AI-powered, image-based rock-classifying tool seemed feasible. Though an AI would do the real classification work, I wanted to ensure the app still "spoke the language" and matched the mental model of someone familiar with traditional field guides. I worked this idea into the app's navigational flow: while the algorithm classifies the rock, the system's status is displayed, alluding to the steps a traditional field guide would walk through.
The user's primary task (and the one I would develop) is highlighted in black.
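To make that field-guide-style status idea concrete, here is a minimal Swift sketch of how the classification steps might be surfaced while the algorithm runs. The step names and types are hypothetical placeholders for illustration, not pulled from the actual design.

```swift
import Foundation

// Hypothetical sketch: the status steps shown while the algorithm runs,
// mirroring the sequence a traditional field guide walks through.
enum ClassificationStep: String, CaseIterable {
    case examiningTexture  = "Examining texture…"
    case checkingGrainSize = "Checking grain size…"
    case comparingColor    = "Comparing color and luster…"
    case matchingClass     = "Matching rock class…"
}

// Publishes the current step so the classification screen can display system status.
final class ClassificationStatus {
    private(set) var currentStep: ClassificationStep = .examiningTexture
    var onStepChange: ((ClassificationStep) -> Void)?

    func advance(to step: ClassificationStep) {
        currentStep = step
        onStepChange?(step)   // the UI updates its status label here
    }
}
```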

device affordance considerations
The phone screen’s modest size led me to weigh carefully what information I presented on each screen. I decided that each screen would focus on a single task until the specimen was classified, minimizing cognitive load.
human factors & context-of-use considerations
One-handed operation proved imperative to the app’s usability, as users might be handling rocks or other materials along with their phones (and some users might not be using their hands at all). Therefore, I designed the app to operate with a few simple, single-handed swipes, taps, and flicks. Alternative navigation would be integrated for those who use accessibility settings or other assistive technologies to operate the app.
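As a rough illustration of what that single-handed navigation could look like in code, here is a minimal UIKit sketch; the gestures and selector names are assumptions for illustration, not the final interaction spec.

```swift
import UIKit

// Hypothetical sketch of the single-handed gestures assumed in the flow:
// a tap to capture the specimen and a swipe to advance through the screens.
final class SpecimenViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        // Single tap anywhere on the viewfinder captures the specimen photo.
        let tap = UITapGestureRecognizer(target: self, action: #selector(captureSpecimen))
        view.addGestureRecognizer(tap)

        // A right-to-left swipe advances to the next screen in the flow.
        let swipe = UISwipeGestureRecognizer(target: self, action: #selector(showNextScreen))
        swipe.direction = .left
        view.addGestureRecognizer(swipe)
    }

    @objc private func captureSpecimen() { /* trigger camera capture */ }
    @objc private func showNextScreen() { /* push the classification screen */ }
}
```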

prototyping
hi-fi wireframing
The following wireframes represent key actions in the task flow for capturing and saving a specimen.

specimen
capture
Users use the phone's camera to capture a photo of the specimen.

specimen
classification
The app's algorithm analyzes and classifies the photo by comparing it to others in its database.
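For a sense of how such a classification step might run on-device, here is a hedged Swift sketch using Apple's Vision framework. RockClassifier is a placeholder for a Core ML model trained on the specimen database; the matching approach is my assumption rather than a finalized technical design.

```swift
import UIKit
import Vision
import CoreML

// Hypothetical sketch of the on-device classification step. "RockClassifier"
// is a placeholder for a model trained on the specimen database.
func classify(_ photo: UIImage, completion: @escaping (String?) -> Void) {
    guard let cgImage = photo.cgImage,
          let coreMLModel = try? RockClassifier(configuration: MLModelConfiguration()).model,
          let visionModel = try? VNCoreMLModel(for: coreMLModel) else {
        completion(nil)
        return
    }

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // Take the highest-confidence label, e.g. "basalt" or "sandstone".
        let best = (request.results as? [VNClassificationObservation])?.first
        completion(best?.identifier)
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```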

specimen
profile
Once the specimen is classified, its profile is added to the user's collection. The profile details the specimen's location and physical properties.
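A profile like this implies a simple data model underneath. Below is a speculative Swift sketch of what a specimen record might hold; the field names are assumptions, not taken from the design files.

```swift
import Foundation

// Speculative sketch of a specimen record: where it was obtained, its class,
// and the physical properties the badge requires. Field names are assumptions.
struct Specimen: Codable, Identifiable {
    enum RockClass: String, Codable {
        case igneous, sedimentary, metamorphic, mineral
    }

    let id: UUID
    var name: String                          // e.g. "Granite"
    var rockClass: RockClass
    var dateCollected: Date
    var locationFound: String                 // e.g. "creek bed near camp"
    var obtainedBy: String                    // found, bought, or traded
    var physicalProperties: [String: String]  // e.g. ["Hardness": "6-7"]
    var photoFilename: String
}
```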
UI motion design considerations
This stage of the project taught me a lot about how the UI’s motion design prompts users to anticipate the app’s hierarchy and functionality (including the gestures needed to complete their task). While I wireframed, I considered each UI element's contribution to the user's understanding of the app's navigation and functionality.

hi-fi UI wireflow
I mapped out each UI element's motion design triggers and feedback within the wireframes for the specimen classification task flow.
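To illustrate one of those trigger/feedback pairs outside of Principle, here is a small hypothetical Swift sketch of the kind of motion the wireflow describes; the duration, easing, and view names are placeholders.

```swift
import UIKit

// Hypothetical sketch of one trigger/feedback pair: when the swipe gesture
// completes, the specimen card slides up and fades in, signalling that the
// next step in the flow has been reached.
func revealSpecimenCard(_ card: UIView) {
    card.alpha = 0
    card.transform = CGAffineTransform(translationX: 0, y: 40)

    UIView.animate(withDuration: 0.3, delay: 0, options: [.curveEaseOut]) {
        card.alpha = 1
        card.transform = .identity
    }
}
```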


hi-fi clickable prototype
I created the following clickable prototype in Principle, following the actions I'd mapped in the wireflow.
reflections & moving forward
With more time, I would love to do deeper research and test the clickable prototype with users. If I did this project today, here's what I would do differently:
- conduct usability tests on the current flow
- redesign the UI to reflect the most recent iOS standards and guidelines
- refine the color palette so that the colors meet proper contrast ratios
- prototype and test flows for alternative navigation methods for those using assistive technologies
- build flows for other personas and their use-case scenarios
Exciting news: I'm currently working on the next iteration of this project!
Stay tuned :)

specimen
properties
If interested, users can read about each of the specimen's properties in greater detail.