A grocery navigation solution to increase store enjoyment while minimizing wasted effort

An Overview

The Logistics

  • Project for HCI Foundations, August 2017 – December 2017
  • Teammates: Jeremy Phillips, Rachel Chen, Ashok Krishna

My Role

  • Created final Physical Prototype
  • Developed the low-fidelity mockups from sketches
  • Created the customer journey map to visualize our Research
  • Participated in User Research and Testing

Understanding the Problem Space

Starting Broad 

We quickly focused on the space of food, and specifically grocery stores. Through our initial research we went back and forth on topics like food waste, in-store retention, and in-store optimization. To begin to understand the pain points our users faced as a result of grocery shopping, we conducted semi-structured interviews and in-store observations.

In our preliminary semi-structured interviews we left our questions relatively open, to understand the scope and complexities of the problem space. The only real constraint we applied was that participants be their household's primary shoppers. We chose to influence the in-store consumer, assuming those who benefit from the primary shopper would be equally impacted.

The questions we asked during the interviews related to:

  • How often they shop
  • Where they shop
  • Their experience with shopping
  • How they store their food once they are home

General Insights from Interviews

We conducted 7 semi-structured interviews. General insights we were able to gather about our users were:

  • Our primary shoppers purchased 3–5 days' worth of groceries at a time
  • They typically strove to get “the best [products] for the family” within their geographic and time constraints
  • They were mindful of the time spent in stores, but often treated the trip as a reprieve from the stresses of everyday life
  • They typically spent 3–4 hours a week in stores

General Insights from the Observations

Based on our initial insights from the interviews, specifically those relating to time wasted in the store, we conducted in-store observations around metro Atlanta. Our team observed shoppers in their environment and made notes of things like body language, what devices they used, and their paths through the store.

Our observations from the 5 grocery stores were:

  • Shoppers looked up at the aisle signs, sometimes scanning back and forth from a specific vantage point.
  • Shoppers sometimes walked back and forth between aisles.
  • Shoppers, especially those with kids, were careful about where they temporarily parked their shopping carts.
  • Shoppers may have had dietary preferences which led them to a different part of the store (e.g., the natural foods section), ultimately breaking their shopping flow.
  • Shoppers utilized lists or smartphones (either as a list or for leisure) while shopping.

Converging on a Problem Area

Mapping the Journey

Grocery shopping is both a large retail market and an everyday task. To illustrate these second-nature behaviors, we created a journey map based on our user research. This allowed us to abstract ourselves from the problem while focusing on the seemingly intuitive task of “grocery shopping.”

Identifying User Needs

In order to synthesize our research we used an affinity mapping technique. This allowed us to create broad categories that mapped to our specific insights. Based on these broad categories, we stack-ranked the insights by need.

What we found was that our primary shoppers enjoyed grocery shopping, but were bothered by the inefficiencies. We reasoned that by providing our shoppers a stress-free, efficient, and quality shopping experience, they would be happier and, as a result, spend more time in stores. We created six user needs to guide us through the forthcoming solutioning stages.

  • Users need to quickly locate their shopping items, so that they don’t waste time and energy on wayfinding and navigation.
  • Users prefer to get through checkout lines quickly so they do not feel that time is wasted in line.
  • Users need better ways to find items at unfamiliar stores to avoid wandering and feeling lost.
  • Users need something to occupy them while idle in lines, so that they don’t feel bored or grow impatient as they wait.
  • Users would like a physically hassle-free shopping experience, so that they don’t have to waste too much time and effort on certain tasks (carrying, moving, lifting, etc.).
  • Users feel stressed by the crowds in stores and would like a way to navigate around them.

Ideation in the Space

Informed Brainstorming

We conducted an informed brainstorming session as a team to begin ideation in the space. Each team member was responsible for coming up with 10 quick solutions. We later curated these sticky-note solutions by grouping them into general categories. The broad solution categories that emerged from this session were:

  • Wayfinding/navigation optimization
  • Pre-filling shopping carts with regularly bought groceries
  • Guided shopping experiences
  • Luxurious in-store experiences
  • Engaging/productive checkout experiences
  • Making grocery shopping physically hassle-free
  • Checkout line optimization
  • Distracting kids with either educational or non-educational content
  • Gamification to involve kids in the shopping process

While each category had merit, due to our tight timeline we narrowed our scope to two categories we could further iterate on. We chose to move forward with the two that excited us most and best matched our initial user needs:

Gamification & Navigation Optimization

Narrowing Scope

We created very simple paper sketches for the two categories, not limiting ourselves to feasibility or medium. In the end we had ideas like in-store quests, children's companion applications, smart carts, and a Google Maps-esque store interface. Since our sketches were crude, we, as a team, narrowed down from the many ideas to one we could continue to iterate on. Based on our assumptions about perceived cognitive load, we chose to move forward with a navigation application; this kept the cognitive load on our users low while still maintaining an enjoyable experience.

Iterative Design

Prototyping 3 Systems

Dynamic Navigation

On-Cart Navigation

Great Shopping Adventure

Iterating on Mobile Navigation Application

During a team meeting to discuss the mobile application for wayfinding, a teammate mentioned that he had recently seen a bike navigation device that mounted on the handlebars, and acted as a small display with directions. We riffed on the idea by combining it with our pre-existing mobile application.

Our reasoning for quickly pivoting to this new idea goes back to our initial user needs. We wanted our users not to waste time, but we also wanted to ensure they enjoyed the shopping experience. By utilizing multiple stimuli, our users could conceivably rely on peripheral vision to receive navigational instructions. This eliminated the need to stare at the phone while navigating through the store.

Our idea was to have the user utilize their phone at home to create a grocery list. Once in the store, their phone transitioned into a navigation device aided by synchronized lights. A path would be devised based on their list, so our users could enjoy their time in the store without having to worry about a new or confusing environment.
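To make the "path devised from the list" idea concrete, here is a minimal sketch of one way it could work, assuming each list item carries a hypothetical store-map coordinate; a greedy nearest-neighbor ordering is enough to turn a grocery list into a walking route. This is an illustration of the concept, not our actual implementation.

```cpp
#include <cmath>
#include <iostream>
#include <string>
#include <vector>

// Hypothetical list item tagged with a position on a simple store map.
struct Item {
    std::string name;
    double x, y;  // location in the store, e.g. meters from the entrance
};

// Greedy nearest-neighbor ordering: from the current position, always walk
// to the closest remaining item. Not optimal, but a cheap way to sketch how
// a shopping list becomes a path.
std::vector<Item> planPath(std::vector<Item> list, double startX, double startY) {
    std::vector<Item> path;
    double cx = startX, cy = startY;
    while (!list.empty()) {
        size_t best = 0;
        double bestDist = std::hypot(list[0].x - cx, list[0].y - cy);
        for (size_t i = 1; i < list.size(); ++i) {
            double d = std::hypot(list[i].x - cx, list[i].y - cy);
            if (d < bestDist) { best = i; bestDist = d; }
        }
        cx = list[best].x;
        cy = list[best].y;
        path.push_back(list[best]);
        list.erase(list.begin() + best);
    }
    return path;
}

int main() {
    // Example list with made-up coordinates.
    std::vector<Item> groceries = {
        {"milk", 18.0, 4.0}, {"bread", 6.0, 10.0}, {"apples", 2.0, 3.0}};
    for (const Item& item : planPath(groceries, 0.0, 0.0)) {
        std::cout << "next: " << item.name << "\n";
    }
}
```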

Physical Prototyping

For the purposes of prototyping this idea, we chose to utilize Arduino capabilities. I had little experience with Arduino, but was eager to learn. A lot of our prototype's fidelity was dictated by our plans for testing. We were really curious about our lighting system, so its fidelity needed to be the highest in the system, to ensure minimal interference with users during testing.

The prototype was created with an Arduino 101 (which has onboard BLE), a NeoPixel strip, a cardboard iPhone holder, and a generous amount of hot glue. The Arduino's onboard BLE allowed us to control the lights of the NeoPixel strip remotely, so we could achieve a synchronized guidance system by having it react to the user.
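For a sense of how the pieces talk to each other, below is a minimal Arduino-style sketch of the light rig, assuming the standard CurieBLE and Adafruit NeoPixel libraries. The pin number, pixel count, service UUIDs, and the single-character "direction" protocol are illustrative assumptions, not our exact prototype code.

```cpp
// Sketch of the cart light rig: the Arduino 101 exposes one BLE characteristic;
// writing 'L', 'R', or 'S' to it lights the left half, the right half, or
// clears the NeoPixel strip mounted on the cart handle.
#include <CurieBLE.h>
#include <Adafruit_NeoPixel.h>

const int PIXEL_PIN = 6;     // data pin the strip is wired to (assumed)
const int PIXEL_COUNT = 16;  // pixels across the cart handle (assumed)

Adafruit_NeoPixel strip(PIXEL_COUNT, PIXEL_PIN, NEO_GRB + NEO_KHZ800);

BLEPeripheral blePeripheral;
BLEService navService("19B10000-E8F2-537E-4F6C-D104768A1214");
BLEUnsignedCharCharacteristic directionChar(
    "19B10001-E8F2-537E-4F6C-D104768A1214", BLERead | BLEWrite);

void showTurn(char dir) {
  strip.clear();
  if (dir == 'L') {                       // light the left half of the handle
    for (int i = 0; i < PIXEL_COUNT / 2; i++)
      strip.setPixelColor(i, strip.Color(0, 255, 0));
  } else if (dir == 'R') {                // light the right half
    for (int i = PIXEL_COUNT / 2; i < PIXEL_COUNT; i++)
      strip.setPixelColor(i, strip.Color(0, 255, 0));
  }
  strip.show();                           // 'S' (or anything else) = all off
}

void setup() {
  strip.begin();
  strip.show();

  blePeripheral.setLocalName("CartNav");
  blePeripheral.setAdvertisedServiceUuid(navService.uuid());
  blePeripheral.addAttribute(navService);
  blePeripheral.addAttribute(directionChar);
  directionChar.setValue('S');            // start with the strip off
  blePeripheral.begin();                  // advertise to the phone
}

void loop() {
  BLECentral central = blePeripheral.central();
  if (central) {
    while (central.connected()) {
      if (directionChar.written()) {      // phone wrote a new direction
        showTurn(directionChar.value());
      }
    }
    showTurn('S');                        // clear lights when phone disconnects
  }
}
```

In this setup the phone simply writes a direction character to the characteristic as the user approaches a turn, and the strip does the rest.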

Intent of the final design: lights embedded in the cart handle act as turn signals for the user.

Construction of the physical rig utilized an iPhone, an Arduino 101, NeoPixels, and a battery pack.

First iteration of the lighting design.

User Testing

Testing Plan

Our application had two main components:

  • At-home list making
  • In-store navigation

Testing the at-home list making was a relatively straightforward process. We developed a testing script and facilitated 6 user tests, focusing specifically on the at-home list-making features.

To ensure the viability of our in-store navigation, we developed an A/B between-subjects testing guide. This allowed us to have a controlled, traditional environment (the current system), which we could then compare to our proposed solution. To compare the two groups, we measured the total distance traveled to find the three items. We measured out the distance between each item prior to testing.

We chose to simulate our grocery store environment in the stacks of the library. This allowed us to mock up convincing shelves and aisles for our participants to wander through. One group was given a paper grocery list of the items they were to find in the store.

Makeshift store signs were created as in-store signifiers for the participants to use while testing.

Our participants could utilize the in-store signifiers like the aisle signs, or simply wander. The second group was given the same list of items, but on our mobile application with the lighting rig attached. This was a more complicated test, as it required a lot of moving parts. The lighting controller walked with the user as they navigated through the store, correcting their path with the color of the lights, while another teammate took note of the total distance walked by the participant (using the floor tiles to keep track).

Map of the testing space a teammate used, which was later utilized to aid in calculating total distance traveled.
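As a rough illustration of how the tile counts could be turned into comparable distances between the two groups, here is a small sketch; the tile size and the counts are made up for the example and are not our study data.

```cpp
#include <iostream>
#include <vector>

// Hypothetical tile size; the library's actual floor tiles were not measured here.
const double TILE_FEET = 1.0;  // assume roughly 1 ft x 1 ft tiles

// Convert per-participant tile counts into distance and average per condition.
double meanDistanceFeet(const std::vector<int>& tileCounts) {
    if (tileCounts.empty()) return 0.0;
    double total = 0.0;
    for (int tiles : tileCounts) total += tiles * TILE_FEET;
    return total / tileCounts.size();
}

int main() {
    // Illustrative counts only.
    std::vector<int> paperListGroup = {210, 185, 240};
    std::vector<int> lightingGroup  = {150, 230, 120};

    std::cout << "paper list mean (ft): " << meanDistanceFeet(paperListGroup) << "\n";
    std::cout << "lighting mean (ft):   " << meanDistanceFeet(lightingGroup) << "\n";
}
```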

Testing Results

To understand the successes of our application, we separated the core functionality into two encompassing categories: the at-home portion and the in-store flow of the application. This allowed us to focus on the usability of specific features rather than the success of the application as a whole.

That being said, in general, the at-home functionalities of the application tested well. Our users understood the concept of creating lists and even our unique functionality of creating a prefill list. While debriefing with users, we discussed their potential uses of the application. All of the participants remarked that they would use a similar application. Therefore, our initial set of features, which set the user up to get into the store, was working.

The majority of users understood the at-home portion of the application.

But of course, what we were really interested in was whether our users used the lighting in the mobile application. To be honest, our results were inconclusive. We had some difficulties with the fidelity of the prototype: while the lights changed dynamically, the map interface did not. Therefore we often had participants playing hot-and-cold with our application, looking at the lights and guessing their way through the store. That being said, we also had users who instinctively understood the lights and quickly navigated their way to the correct location. The graph below shows our control group (the ones with just the list) compared to those with the lighting application.

Since we had such inconsistent results with our main feature, we would need to iterate and test again. I would like to test with both a dynamic map interface and the lighting to understand how the user interacts with the device. This could be achieved by using RFID technology in conjunction with GPS.

Steps taken per participant. Depicts the inconsistency of the lighting feature, as it helped some participants but not all.



Stay focused on the core user need you are trying to solve. I think we wound up there in the end, but we spent some time going back and forth on how we wanted to approach the design problem.

Leverage users' habits to enhance the design. The idea of improving navigation by relying on peripheral vision came from the notion of not interrupting the user's time. We were able to leverage their preexisting habits and mold them in a new way.

Morgan Ott ⛰ ☀️