
Context-aware, AI-powered video shooting copilot

ABOUT   ▶︎

My Role   Lead Product Designer

Team   1 designer / 5 engineers / 2 AI scientists / 2 UX researchers

Outcome   2M units sold globally, 25% higher MAU

DJI ShotGuide

AI Product Design
UI/UX
Mobile
WHAT I DID  ▶︎

◆ Owner of product vision and strategy

◆ Interaction and visual design

◆ Led XFN collaboration

◆ Drove UXR and design workshops

◆ End-to-end product design from concept to launch

INTRO

User Problem
Most DJI gimbal users expect a much simpler vlogging experience, but often face a steep learning curve, resulting in reduced user satisfaction.
Solution at a Glance

ShotGuide recognizes context and shooting targets, providing automated camera movement and assistance as a video shooting copilot.

IMPACT MEASUREMENT & PROBLEM RESEARCH

Data Analysis

To quantify the impact of the user problem and define how success would be measured.

User Interviews

Participants: 50+ beginner users in three countries 🇬🇧 🇨🇳 🇯🇵

Findings  ▶︎

70% of users needed significant learning to onboard.

MAU of beginner users declined 35% within 6 months of launch.

65% of participants stopped using the product because it was “hard to learn compositions and camera movement.”

Findings  ▶︎

Problem 1  Confusion around composition and camera movement

“I always find my video doesn’t reproduce how I felt about the environment when I was there.”

Problem 2  Not motivated to learn more about videography

"This shouldn't be more complex than shooting video with my phone."


PROBLEM REFRAMED AFTER RESEARCH

Beginner users struggle with selecting appropriate camera movements and compositions for their shooting target.

WORKSHOP

I hosted several rounds of workshops with engineers, UX researchers, and PMs. By introducing the design sprint process, I led weekly iterations and helped the team land on two concepts within two weeks.


DIRECTION EXPLORATION

The design sprint led to two directions. I worked with UXR to invite 14 users for concept testing with the prototypes I built, and we analysed their reflections and opinions.

▶︎ Direction 1

Scenario-based tutorial and learning resources

Provide easy-to-follow video instructions for camera operator movement and gimbal holding technique, ensuring that users can reproduce high-quality videos with minimal friction.

▶︎ Direction 2  ▶︎ Final direction

A context-aware, AI-powered video shooting assistant

The system recognizes the context and shooting target, and proactively provides shooting assistance, automated composition, and camera movement.


WHY DIRECTION 2

I selected Direction 2, a context-aware, AI-powered video shooting copilot, as the final design direction. This choice was driven by three key factors:

Desirability

An AI co-creation experience gives beginners a superpower.

Feasibility

DJI has existing automated camera movement research efforts.

Strategy

Advances the company strategy of context understanding and co-creation.

I designed a prototype to demonstrate the hero user flow and explore the interaction design; it was also used in a user study to gather feedback:

The camera detects the environment and shooting target via live preview.

AI suggests appropriate shooting techniques with sample footage.

▶︎ 1 Context awareness and AI suggestion

When the user presses the shutter button, the gimbal automatically tracks and adjusts the target's composition while conducting camera movements.

The user can then review the captured video clips and decide whether to save the footage.

▶︎ 2 Video shooting copilot

FINAL DESIGN CONCEPT


Iterated Design

PROTOTYPE & ITERATIONS

Adding a Detailed Video Shooting Guide

User feedback:

Only showing the final result was not enough for them to learn how to hold the gimbal and move in the space.

Iteration:

Provide easy-to-follow video instructions for camera operator movement and gimbal holding technique.

Scenario-based ShotGuide

User feedback:

After getting some cool footage, they didn’t know what to shoot next and needed inspiration.

Iteration:

ShotGuide can also offer shooting guides based on scenario recognition, providing users with related shooting inspirations for future shoots.

▶︎ Context awareness and AI suggestion

FINAL DESIGN

While framing, the camera detects the environment and identifies potential shooting targets, and a pop-up modal suggests appropriate shooting techniques. Users can review sample footage and confirm or reject suggestions.

▶︎ Video shooting copilot

The UI guides the user to hold the gimbal and move through the scene. The gimbal automatically tracks and adjusts the target's composition while filming.

▶︎ Scenario-based ShotGuide Library

The ShotGuide library can be accessed from the main screen, providing users with related shooting inspiration for future shoots.

WHAT I LEARNT

Prototypes as a medium for team alignment

Prototypes and demos can provide tangible representations of abstract ideas. In innovative projects where there are few similar examples to draw from, using demos not only facilitates user testing but also promotes better communication among team members, sparking discussions and collaboration.

Impact is not only about the user

A design decision is a balance between users, business, resources, and the stage of the product. When aligning with leadership to secure resources, focusing on the user problem and user impact is not enough. What does this project bring to the overall and future picture of the business and technology roadmap? And what does it cost to achieve the success metrics? These impacts are as significant as the user impact when proposing a project to leadership.

Use design to inform tech research and investment

The relationship between design and technology goes beyond feasibility; design can also influence technological research and investment. In this project, we identified that human-machine co-creation can be a great direction for the company's future roadmap. Design envisioning can shape a more desirable future and provide insight into a company's future investment and technology roadmap direction.
