TagStream by DCO.ai

UCSC UX Capstone
Project Overview
UCSC partnered with the startup DCO.ai to provide UX support for DCO's growing product, TagStream. Through both group and individual contributions, I conducted research, offered strategy, designed revisions to the platform, and iterated on those designs. Note: My involvement in this project is protected by an NDA, and many elements cannot be disclosed. Please pardon any image blurring or omission of specific research findings.
Research Methods:
Structured Interviews
Unstructured Interviews
Cognitive Walkthrough
Contextual Inquiry
User Testing
Figma Flow Testing
Prototype Testing
A/B Testing
Design Methods:
Heuristic Evaluation
Empathy Maps
Stakeholder Analysis
Competitive Analysis
User Journey Maps
Storyboarding
Paper Prototyping
Digital Prototyping
Wireframing
Agile Methodologies
Software Used:
Figma
FigJam
Miro
Excel
Adobe After Effects
Asana
Zoom
YuJa
Google Slides
My Contributions
Although I contributed to all aspects of the project, my most impactful personal focus was on clarity and intuitiveness. User testing made it apparent that the overall use case of the existing software was confusing to new users: questions like "What does it do?" and "...but what is this?" were common when testing the legacy software. Through testing and iteration, both independently and with a team, I made a series of design recommendations to make the product more efficient, intuitive, and self-apparent.

Contents

  1. Introduction
  2. Product Evaluation
  3. Preliminary Research
  4. Design Direction
  5. Prototype Sketching
  6. First Prototype
  7. User Testing 1
  8. Second Prototype
  9. User Testing 2

Introduction

TagStream is audio and video parsing software that uses AI to identify insightful information within large troves of A/V content. Using proprietary technology, it creates "digital DNA" to identify and navigate to key moments, common items, and important information, quickly providing value to viewers and content creators.

Original TagStream Video Workspace, SaaS Platform

Product Evaluation

Once we onboarded and familiarized ourselves with the software, we conducted a heuristic evaluation, a competitive analysis, and an ethics/inclusivity discussion. These helped us orient around industry standards and identify early areas of research focus.

Team Photo (including course instructors)

Preliminary Research

To begin our formative research, we met with an end-user of the software and conducted a contextual inquiry to examine how the software was being used in practice. This key user was highly knowledgeable about their industry workflow and was able to demonstrate typical use cases for TagStream.

To expand our research across more diverse use cases, we conducted a series of interviews with media professionals and content creators, to learn more about a day in the life of prospective end-users. These semi-structured interviews taught us a great deal about needs, organizational patterns, common tools, and expectations of these professionals.

Reviewing information from a semi-structured interview

Design Direction

After completing the preliminary research, we analyzed our findings to identify common themes to guide the product redesign. The three themes that stood out most strongly were onboarding, optimization, and compatibility.

Onboarding: Improve user flow, user journey, and product exposure
Optimization: Intuitiveness, organization, and consistency with industry standards
Compatibility: Exportability and integration with 3rd party tools

Prototype Sketching

To begin the design process in earnest, we sketched existing and prospective interfaces, annotating them to specify design intentions and flow. We gathered feedback on these sketches from colleagues and stakeholders. (Images have been blurred at stakeholder request.)

First Prototype

Once we had a cohesive idea of the visual design, we used Figma to create a low-fidelity prototype for our first round of user testing. This prototype included revised tag organization and content exportability, along with changes to the general visual layout. The prototype was somewhat interactive but mostly served to provide visual elements and rudimentary navigation. We used it to test participants' understanding of the software's core concepts, points of navigational confusion, and general visual appeal.

User Testing 1

For our first round of user testing, we conducted cognitive walkthroughs of the prototype, prompting participants to complete simple tasks like "export a file" while encouraging them to think aloud as they navigated each page.

Findings in this round of testing indicated that participants still found the interface "cluttered," were visually drawn to the wrong areas of pages, and fundamentally failed to understand the purpose and capability of the product as a whole.

Second Prototype

With the design direction now clearly scoped, each member of my UCSC team individually designed a high-fidelity prototype incorporating our research and iteration findings. My design focused on lowering the learning curve of the software, as well as making the product as a whole more intuitive and its usefulness more immediate and apparent.


To achieve this, I incorporated the following changes:

Mass video uploading:
The legacy software only allowed users to upload one video at a time. In addition to improving efficiency, enabling mass upload also created the affordance of handling large amounts of content.

Workspace organization:
The primary workspace of the software was viewed as cluttered and disorganized. My revision organized the page to place features where expected from industry standards, centralize primary taskbars, and organize those taskbars with a tabs system to reduce clutter.

Tutorial:
New users fundamentally did not understand the software without explicit verbal explanation. For this reason, I created a (skippable) tutorial in the software that presented an easily understandable example use case, and then guided users through a short series of steps to complete a simple task showcasing the value of the software.

User Testing 2

The second iteration of user testing had three phases:
  • First, I tested the prototype, without the tutorial, on three participants already familiar with TagStream.
  • Second, I tested the tutorial on my entire Master's program cohort and the instructors to gather designer feedback.
  • Third, I tested the full prototype, including now-revised tutorial, on participants unfamiliar with TagStream.

Results were excellent:
Familiar users recognized the improvements as such without prompting.
Unfamiliar users completed the tutorial quickly and without error. Most importantly, when asked "What do you suppose TagStream is?", participants who completed the tutorial answered correctly 100% of the time.

A tutorial flow on Figma