Adding an App Feature to Google Arts and Culture Mobile App



Adding to the lineup of interactive app features that connect art enthusiasts to detailed information and museum resources.

Google Arts and Culture is a web- and app-based service that gives users access to the world of fine arts in one place. It has interactive features including Art Selfie, art museum VR/AR, and ways to place yourself inside artworks. To attract and engage users, Google Arts and Culture wants to design an art discovery feature that helps users identify artworks and find similar ones from a photo input. Adding compelling features can attract users and build enthusiasm around artworks.



Timeline: 4 weeks


Role: UX, Student Project


Tools: Sketch, InVision


Methods: User Interviews, Market Research, Competitive Analysis


The Challenge

Since the pandemic began, art enthusiasts have had to become a bit more resourceful in how they beat boredom, stay educated, and connect to the art world due to gallery and museum closures. They sought art in alternate locations, such as online databases and in-person encounters outside a formal exhibit. At times, enthusiasts struggled to find the most curated and accurate information on the art around them.

Google Arts and Culture features thousands of art databases and curates collections of art into online exhibits, in addition to a collection of interactive features. Features like "Art Selfie" have gone viral in the past. At this unique point in time, Google wishes to release yet another interactive feature that will compel and draw users into its mobile app.


Create a useful tool for art enthusiasts to connect and stay engaged with fine arts. 

Create a feature that integrates with Google Lens, an existing image recognition app, to identify artworks and show related images.


Use the current structure of the Google Arts and Culture database, integrated Google products, and existing design patterns to add the feature.
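Under the hood, a discovery feature like this typically matches an embedding of the scanned photo against embeddings of catalogued artworks. The sketch below is purely illustrative, not how Google's system actually works: the artwork names and vectors are made up, and it only demonstrates the general technique of nearest-neighbor matching by cosine similarity.

```python
import numpy as np

# Hypothetical embeddings: in a real system these would come from an
# image-recognition model, with thousands of entries from the database.
catalog = {
    "Starry Night": np.array([0.9, 0.1, 0.3]),
    "Water Lilies": np.array([0.2, 0.8, 0.5]),
    "The Scream":   np.array([0.7, 0.2, 0.9]),
}

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def most_similar(query, catalog, top_k=2):
    """Rank catalogued artworks by similarity to the query embedding."""
    scored = [(name, cosine_similarity(query, vec)) for name, vec in catalog.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:top_k]

# The embedding of a scanned photo (made up here) is matched against the catalog;
# the top result identifies the artwork, the rest feed "similar art" recommendations.
photo = np.array([0.85, 0.15, 0.35])
matches = most_similar(photo, catalog)
```

The same ranking serves both halves of the feature: the best match identifies the scanned artwork, and the remaining high-scoring entries become the related-art suggestions.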

Project Goals




People interested in visual art and previous users of Google Arts and Culture.

Sketching out my thoughts...

Sketching out my research while I gathered and analyzed information proved to be a much more organic way to collect my thoughts compared to my on-screen habits. I kicked off my research by thinking off-screen.

Research Questions: 

What art identification tools are already on the market?

What features are already associated with the Google Arts and Culture product, and how can they relate to art discovery tools?

Which Google products already exist and are integrated into other products (but may be less widely known)?


Market Research


Competitive Analysis 

Keeping it interactive...

Since Google is rich in cutting-edge technology and the Material Design system, I knew I had a lot of resources to work from when it came to designing for a Google product. However, I was uninformed about just how far AI and AR features have come, especially since the only AI tools I knew about were Google products. This led me to conduct both market research and a competitive analysis.


I was curious how much users have engaged with Google products, so I conducted a brief survey to collect some quantitative data.


Empathy Findings

Key Findings:


Google has many existing tools to integrate into a feature, such as...
-Google Lens
-Art Selfie
-Art Palette


AI tools aren't as common, but some are being used in gallery and museum contexts to discover and recommend art based on current selections, which can inspire the UI and UX design.


Users rely on art museum sites and internet searches to gather information; they understand how to use photo-scanning extensions but don't usually think to use them to identify art. Most users surveyed don't use Google Arts and Culture.



An art enthusiast who enjoys experiences with art galleries and museums.

I created a persona named Izzy. Initially, I was unsure how Izzy might access or find out about this app, so I created a storyboard to define the tasks involved in engaging with a museum under pandemic limitations, something I was uncertain about. Creating a real-world challenge and access point for Izzy better defined how I could provide site structure.





Integrating a new feature...

App Map

I wondered how I could integrate this feature seamlessly. I used a site structure similar to the existing interactive AR feature called "Pocket Gallery," because it involved zooming into an artwork, viewing it in AR, and listing details after the artwork was scanned and identified. Since this was not an app I created, it was important to get a grasp of what existed by creating a site map and task flow. I thought this would be a straightforward process, but I was humbled by the many small details that were easy to miss.


Task Flow




Exploring Material Design
Integrating into a system library


Taking it slow with minimalist designs...

Before creating wireframes, I did a series of sketches as well as a style tile to focus on the harder-to-notice details of Material Design layouts. Google products appear so sleek and simple, and it was important for me to really practice noticing such a minimalist layout before designing. Sketching most certainly helps me slow down and focus.

Another challenge became apparent: given my level of experience with animation and the tools available, I wasn't sure how to design a prototype involving the AI scan and the AR simulations, so I drew what I observed and left it to annotate during prototyping.

An exciting part of this project was working within the bold, minimal Material Design system and copying from app screenshots to build out the design for the Art Discovery tool. There was no need for a rebrand, so all I needed was wireframes to establish sequence and structure.


Wireframe Sketches



Style Tile


Priority to design...

When creating this design it was important to:
-keep consistent with Material Design for seamless app integration
-create an experience consistent with Google Lens and Art Selfie conventions
-align with Izzy's need for a simple interactive experience that doesn't require a large cognitive load in a busy public setting

High Fidelity Mockups

To wrap up the ideation phase, I developed high fidelity screens. For content, I turned back to my storyboard sequence of events so my user could see a simulation of what they desired to scan. This ended up creating a challenge during prototyping, since the feature depended on user input, including camera functions.

High Fidelity Screens



With all of the simulation challenges accounted for, I annotated a prototype and was ready to test my design on some art enthusiasts.

Google Arts and Culture Art Discovery Mockup


Three target-user participants, tested via Zoom meetings with screen sharing.

User Test Findings

I sought three participants within my personal and professional network: an educator, an artist, and a tech-savvy user, to approximate aspects of my persona Izzy.

During user test interviews, three things stuck out to me:

-If a user is looking at a work of art in person and scanning to identify it, there is no purpose for an AR experience

-Users were more engaged with similar art recommendations

-Users mainly sought more information, connections, and details on art rather than the other functions tested


Participants completed their assigned tasks at roughly an 80% completion rate overall.

This is how my user test aligned with the test goals: 

1. Observe if a user can complete an assigned task

Participant #1: 100% completion rate

Participant #2: 75% completion rate; snagged on the AR prototype

Participant #3: 75% completion rate; couldn't find the museum site in task #4

2. Identify if there are any gaps in task flow. 

-add more navigational links

-prototypes should be tested on a phone, not a desktop

-there was no practical need for the AR simulation, as the artwork was already hypothetically present

-rethink the artwork page flow; users couldn't find the museum link for one of the tasks, due to an existing design I may not have full authority to change

3. Are annotations effective as guidance?

-annotations did not appear as planned; as a designer, I need to build more technical knowledge of InVision prototyping

4. What is known/unknown about Google apps?

-many users had used Google Lens, Google Cardboard, and Art Selfie

-few knew about Google Arts and Culture, which surprised me a bit


Since the original app was not my design, I found it a challenge to focus my design priorities, so I created an affinity map.

What test participants say...



"I would love to use this to find out who the architect of certain houses might be. Sometimes I go on walks and pass well designed homes and want to know the bigger story behind it, the history. 



"I wish I could see more recommendations, that's what I'm looking for here but the zoom and the AR buttons stop my eyes and it makes me think I'm done scrolling and there is no more to see. 



"I'm always looking for artworks I can send the kids I teach. 

Affinity Map


Priority Revisions and Moving Forward

Once these interviews wrapped up I created a priority revision list:

-eliminate the AR screens, because the user is standing in front of the artwork, which was overlooked during the initial design phase

-fix inconsistencies between the Art Discovery feature and current app features

-improve information architecture to encourage users to engage with the page more

-fix the spacing error between the artwork and its details on the artwork details page

-redesign the suggestions page


Priority Revisions


This project was humbling. When I began, I was excited by the simplicity of working within a design system. As I kept working, I noticed I had missed a lot of UI copy work in the initial iteration, which was a large part of the product. Minimalist design appears easy to make, but every detail counts, which earned my newfound respect.

I was surprised by some of my own oversights in how I could integrate the AR functions into this project. It really pays to simulate an experience as closely as possible to the real thing to avoid obvious oversights. I can now see why UX designers involve themselves in field observations to avoid such mix-ups when defining products.

Lastly, creating and testing my prototype was a challenge. I struggled to do proper simulations without needing a verbal walkthrough for my users. I became more curious about prototyping tools, primarily ones that integrate AR animation, and about how I can get better at user testing that maximizes user insights, an area of growth for me.


What's Next?

• Handoff to development

• Integration into the existing app

• Iterations after launch