Redefining Google Photos search with visual, map-based discovery and AI/ML-driven suggestions

Overview

Client and role

I joined the Photos UX team in mid-2020. During my time there, we navigated a large rebrand, a move toward clearer monetization, and preparation of the search experience for a major product launch.

Working within the Engagement product area, I completed projects involving search, photo discovery and editing, and managing user-generated content. As one of only two full-time remote team members, I helped the team transition to remote work during COVID.

I was responsible for design, prototyping, and spec writing in areas that ranged from tactical multi-variant A/B tests to large, strategic, future-looking projects based on computational photography and ML/AI (machine learning and artificial intelligence) research. I also partnered closely with UX research to plan remote research studies and personally ran in-person user research interviews.

Goals and results

To mark the fifth anniversary of Photos' public release, the app would be launching new features, a large information architecture (IA) overhaul, and a complete update of the branding. I was responsible for preparing all Search surfaces and features for this launch and putting structure in place for post-launch improvements. The long-term vision of this launch was to set the stage for Photos to grow to its next billion users.

  • Finalizing the top-level Search tab experience, designing the map-based search feature, and ensuring every search feature had a logical home in the new information architecture led to more than 50% of press headlines mentioning the new map feature
  • Improving entry points to user-generated content features that drive machine intelligence improvements led to a 5.5x increase in feature users and a 4x increase in clicks
  • Partnering closely with product, engineering, marketing, and UX research to ensure new features were useful, enjoyable, and shareable led to a large increase in positive social sentiment and hundreds of users manually sharing their photo maps
  • Partnering closely with the research product area to design future-looking, strategic editing features led to a clearer long-term vision for monetization and growth

Interactive map to quickly find photos and explore your past travels

With a completely updated information architecture, there is now a dedicated top-level tab for everything search-related in Photos. Within the Search tab, a new map view provides a heat map of everywhere someone has taken photos. We took the map view from a rough hack-week demo to production-ready in eight months, which was "the fastest I've ever launched a feature at Google" according to one of the developers. Because of the tight deadline, I collaborated closely with another UX designer and a motion designer on this project.
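
For the technically curious, here is a minimal sketch of how a view like this can work under the hood (with made-up photo dicts and a hypothetical grid size, not Photos' actual implementation): the core idea is bucketing each photo's geotag into a coarse latitude/longitude grid and counting photos per cell.

```python
from collections import Counter

def heatmap_cells(photos, cell_deg=0.05):
    """Bucket geotagged photos into a coarse lat/lng grid.

    Each cell's photo count becomes the heat intensity rendered on
    the map; photos without location data simply never appear.
    """
    counts = Counter()
    for photo in photos:
        if photo.get("lat") is None or photo.get("lng") is None:
            continue
        # Snap the geotag to a grid cell roughly cell_deg degrees wide.
        cell = (round(photo["lat"] / cell_deg), round(photo["lng"] / cell_deg))
        counts[cell] += 1
    return counts

# Example: two photos in the same neighborhood, one without a geotag.
photos = [
    {"lat": 37.42, "lng": -122.08},
    {"lat": 37.43, "lng": -122.09},
    {"lat": None, "lng": None},
]
print(heatmap_cells(photos))  # Counter({(748, -2442): 1, (749, -2442): 1})
```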

Although we had a working demo, this would be the first time Photos had an interactive map, so we wanted to explore all possibilities before confirming that something similar to the demo would be the best approach.

After initial design explorations, we ran user research sessions to identify confusing edge cases and answer specific questions about initial zoom levels and visual design. User research also allowed us to better prioritize and stack-rank follow-up features for after launch.

With a working demo, questions answered from research, and additional visual design explorations completed, each team member specced out the parts of the feature they had been working on most closely. All specs remained in Figma to improve the speed and quality of collaboration and keep everyone on the same page.

Continued collaboration with the Android and iOS engineering teams was all that was left to get the project completed and ready for launch. We had to make some tradeoffs to meet the tight deadline, but seeing hundreds of users screenshot and share the feature with no built-in sharing tools (yet) was strong evidence that people were using and enjoying the map view.

See the current map experience below or try it out with your own photos in Google Photos on iOS or Android (tap Search, then Explore Map)!

Single-tap filtering driven by machine learning and photo identification

Tappable refinements make it easier and more enjoyable to find your photos. This feature needed to meet the competing goals of making targeted search faster and easier (i.e., looking for a specific photo from a specific event) and providing more moments of serendipity and reminiscing while viewing your photos. Long-term, it also needed to grow the number of Everyday Enthusiasts by educating users about the power of compound search (e.g., searching 'James February 2010' instead of just scrolling through all 'James' photos).
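
To make the compound-search idea concrete (an illustrative sketch with hypothetical photo fields, not the production query engine), each term in a query like 'James February 2010' simply becomes another filter that results must pass:

```python
from datetime import date

def compound_search(photos, person=None, year=None, month=None):
    """Naive compound query: every supplied filter must match.

    'James February 2010' parses into person='James', year=2010, month=2.
    """
    matches = []
    for photo in photos:
        if person is not None and person not in photo["people"]:
            continue
        if year is not None and photo["taken"].year != year:
            continue
        if month is not None and photo["taken"].month != month:
            continue
        matches.append(photo)
    return matches

photos = [
    {"people": {"James"}, "taken": date(2010, 2, 14)},
    {"people": {"James"}, "taken": date(2012, 7, 4)},
    {"people": {"Ana"}, "taken": date(2010, 2, 20)},
]
# Only the first photo survives all three filters.
print(compound_search(photos, person="James", year=2010, month=2))
```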

Because this project began with general, overall goals, it required exploring a wide range of divergent approaches. Wireframing many different approaches allowed us to quickly home in on the most meaningful solutions. Some approaches included a contextual, educational bottom banner; a full 'compound query builder' at the beginning of every search; and post-search refinements/filters.

After arriving at a few approaches that seemed promising based on engineering constraints and the long-term vision for this feature, I created two prototypes for in-person user research interviews, which I ran at the Mountain View campus.

With a spreadsheet full of user feedback about the parts of the feature that were most interesting, we were able to scope the MVP version down to focus exclusively on people and content types (videos, screenshots, favorites, etc.). Because this feature relied so heavily on machine intelligence suggestions, there was very little visual design to do, which made outlining and visualizing the overall structure and approach even more important. I did this by creating an expandable, adaptable system that makes it easy to improve the feature in the future without additional UX work (V1 and V2 structure not shown).
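
One way to picture that kind of expandable structure (my own sketch under assumed names, not the shipped architecture) is a registry of refinement categories that a generic chip UI simply renders, so adding a new refinement type later is one more registry entry rather than a new design cycle:

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class RefinementCategory:
    """One row of tappable chips, e.g. people or content types."""
    name: str
    suggest: Callable[[str], List[str]]  # ranked chip labels for a query

# Stand-ins for real, ML-ranked suggesters.
def suggest_people(query: str) -> List[str]:
    return ["James", "Ana", "Mom"]

def suggest_content_types(query: str) -> List[str]:
    return ["Videos", "Screenshots", "Favorites"]

# The chip UI renders whatever the registry contains.
REFINEMENTS = [
    RefinementCategory("People", suggest_people),
    RefinementCategory("Content type", suggest_content_types),
]

for category in REFINEMENTS:
    print(category.name, "->", category.suggest("beach"))
```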

After outlining the core features and structure, I used a high-fidelity prototype to explore possible micro-interactions and animations. This will inform the final implementation of this feature and be used alongside a live demo during another round of user research.

GTC experiments

Guided Things Confirmation (GTC) is fundamental to both the user search experience and the overall quality of Photos' MI (machine intelligence) models. With the new Search tab, correctly classifying photos became even more important, but discovering how to get users to contribute those confirmations was challenging.

Partnering with colleagues from UX Writing and Behavioral Sciences, we created a multi-variant experiment that better communicated the user benefits through both copy and visual design. We ran two experiments, which led to a 5.5x increase in feature users and a 4x increase in taps without significantly impacting guardrail metrics like overall searches. After the success of the Android experiment, similar entry points are being added to both iOS and web.

Although this was a relatively small project, the successful collaboration between different teams and product areas led to something that will have a noticeable impact on search quality for all users.

Untitled future editing feature

I worked closely with the research product area and the computational photography team to create a plan for a new editing feature which would improve user engagement and provide more opportunities for monetization. Working with the UX research team, I designed a study to answer both high-level directional questions about users' personal photos and usability/interaction questions around an initial prototype.

Alongside some early mockups, I created artifacts that defined the multiple layers of technological complexity and their related UX approaches. These artifacts also gave the team a shared vocabulary for communicating about previously undefined areas of the project.

Select details available on request.

The Results

While working with the Photos UX team for just eight months, I helped accomplish the following results:

  • 5.5x increase in feature users and 4x increase in clicks for a feature that will improve MI models, search functionality, and usability
  • Successful launch of a complete product and brand redesign, with large increases in social sentiment and mentions as well as press coverage from Forbes, The Verge, Engadget, and many more (over half of those headlines mentioned the map feature)
  • Strategic and forward-looking design direction for upcoming Search and Editing features through high-fidelity prototyping, in-person user research, and collaboration with product, research, and engineering teams

That's all for now! Jump back to the homepage, check out another project, or send me a message at bryan@zavzen.com