Integrating an AI Assistant

Overview

At Symend, users manually create large volumes of branded content that incorporates behavioural tactics for target use cases. This manual process is time-consuming and error-prone because behavioural tactics are tracked separately. To address this, an AI assistant was developed to streamline content creation and improve accuracy. The project spanned 6 months of development and 3 months of integration across Product, Engineering, Behavioural Science, and Data Science.

My Contribution

As the sole designer, I led UX development from an off-platform proof-of-concept prototype to a core platform feature. I collaborated with behavioural scientists, data scientists, engineers, and product peers, adhering to the design system and creating new components when necessary. I worked with tools including Balsamiq, Miro, Figma, Confluence, and Jira.

Learnings & Outcomes

Developed concepts from 0 to 1 within business and design constraints; gained experience with large language models (LLMs), OpenAI's API, and prompt engineering; and learned to identify core value from user testing to cut unnecessary features and effort. Shipped the alpha release in one-third of the estimated time, supporting 3 brand-new use cases.

Key Features

The AI assistant comprises three sub-features: Generate, Analyze, and Improve. Each sub-feature is integrated within the content builder, where all content is created. The text in the email below was generated using the AI assistant.

  1. Generate

Provides a starting point and brainstorming tool for new content by creating complete text from scratch based on user input. Users can select specific behavioural tactics to incorporate and better align the content with their intended use case.
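
As a rough sketch of what a Generate request could look like under the hood: user input and selected tactics folded into a prompt and sent to OpenAI's chat completions endpoint. The prompt wording, tactic names, request shape, and model choice below are assumptions for illustration, not Symend's implementation.

```typescript
// Hypothetical sketch of a Generate request. The interface fields, prompt
// template, and model are illustrative assumptions only.
interface GenerateRequest {
  useCase: string;   // e.g. "payment reminder email"
  brief: string;     // free-text input from the user
  tactics: string[]; // behavioural tactics selected in the UI
}

async function generateContent(req: GenerateRequest, apiKey: string): Promise<string> {
  const prompt =
    `Write ${req.useCase} copy based on this brief:\n${req.brief}\n` +
    `Incorporate the following behavioural tactics: ${req.tactics.join(", ")}.`;

  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: "gpt-4", // model choice is an assumption
      messages: [
        { role: "system", content: "You are a branded-content writing assistant." },
        { role: "user", content: prompt },
      ],
    }),
  });

  const data = await res.json();
  return data.choices[0].message.content; // generated draft shown in the builder
}
```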

  2. Analyze

Enables users to validate existing content through real-time and on-demand analysis, surfacing the top behavioural tactics, tones of voice, and other key metrics used in their content.
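
A minimal sketch of how an analysis call might return structured metrics for the UI, assuming the model is asked for a JSON response. The metric names, schema, and prompt wording are illustrative assumptions, not Symend's actual analysis schema.

```typescript
// Hypothetical sketch of an Analyze call: content is scored by the model and
// returned as structured JSON so the UI can display tactics and tone metrics.
interface AnalysisResult {
  topTactics: { name: string; strength: number }[]; // e.g. "social proof", 0-1
  tones: string[];                                  // e.g. "empathetic", "urgent"
  readingLevel: string;                             // placeholder for other metrics
}

async function analyzeContent(text: string, apiKey: string): Promise<AnalysisResult> {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json", Authorization: `Bearer ${apiKey}` },
    body: JSON.stringify({
      model: "gpt-4",
      messages: [
        {
          role: "user",
          content:
            "Return only JSON with keys topTactics, tones, readingLevel describing " +
            `the behavioural tactics and tone of voice in this text:\n${text}`,
        },
      ],
    }),
  });

  const data = await res.json();
  // Assumes the model returns clean JSON; a production version would validate this.
  return JSON.parse(data.choices[0].message.content) as AnalysisResult;
}
```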

  3. Improve

Combines the functionality of Generate and Analyze to re-generate existing text based on selected modifications. Users can quickly iterate and enhance the intent behind each message.
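
A hypothetical sketch of how an Improve request could fold user-selected modifications into a rewrite prompt that reuses the same completion call as Generate. The modification types and wording are assumptions.

```typescript
// Hypothetical sketch of an Improve request: existing text is re-generated with
// user-selected modifications (e.g. add a tactic, shift the tone).
interface Modification {
  kind: "add-tactic" | "change-tone";
  value: string; // e.g. "loss aversion" or "friendly"
}

function buildImprovePrompt(original: string, mods: Modification[]): string {
  const instructions = mods
    .map((m) =>
      m.kind === "add-tactic"
        ? `incorporate the "${m.value}" behavioural tactic`
        : `shift the tone of voice to "${m.value}"`
    )
    .join("; ");

  return `Rewrite the following message and ${instructions}. Keep the original intent.\n\n${original}`;
}

// The resulting prompt would be sent through the same completion call sketched
// for Generate, keeping the two flows consistent.
```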

Process Work

User Flows

Conducted a comparative analysis of 12 industry AI tools to identify common behaviours and patterns, then combined and refined the shared workflows to fit the key use cases specific to Symend: Generate and Analyze.

Wireframes & Low Fidelity Prototypes

Developed wireframes building on the user flows and iterated with feedback from product and design peers. The goal was to establish the high-level interactions and behaviour between the Generate and Analyze components.

Proof of Concept

Worked directly with a team of engineers to develop a working prototype based on the wireframes and connected to LLMs. This was used to gather feedback and gauge value from users on the workflow, interactions, and quality of output.

Iterations

The prototype was sent to users and incorporated into their day-to-day work. Google Analytics tags were embedded into key events to track adoption, and additional feedback was gathered asynchronously.
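
A minimal sketch of what the adoption tracking could look like with gtag.js; the event and parameter names below are placeholders, not the actual tags used in the prototype.

```typescript
// Hypothetical sketch: key assistant events sent to Google Analytics via gtag.js.
// Assumes gtag.js is loaded on the page; event/parameter names are illustrative.
declare function gtag(command: "event", eventName: string, params?: Record<string, unknown>): void;

function trackAssistantEvent(feature: "generate" | "analyze" | "improve", action: string): void {
  gtag("event", "ai_assistant_used", {
    feature, // which sub-feature was used
    action,  // e.g. "prompt_submitted", "result_inserted"
  });
}

// Example: fired when a user inserts a generated draft into their content.
trackAssistantEvent("generate", "result_inserted");
```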

Key Improvements:

  1. Improved Generate Onboarding: Users initially added minimal text to the Generate prompts, resulting in poor outputs. To address this, placeholder text, help text, and examples were added to guide users toward better input.

  2. Increased Analysis Exposure: The Generate feature was used the most, and some users were unaware of the Analyze feature. To increase its visibility, certain metrics were relocated and computed in real time.

  3. Enhanced Analysis Functionality: There was a need to analyze larger amounts of text. The selection length was increased to allow custom selections that snapped to whole sentences and worked with the real-time analysis.

User Interviews

After several weeks of use, I interviewed users across four departments to gather feedback on how these features were incorporated into their current content creation workflows and to understand where the value lay.

Key Findings:

  1. Separate Generate and Analyze: Users did not use both the generate and analyze features together; they tended to use one or the other independently.

  2. Introduce a New Use Case: A need was identified to quickly iterate on existing content after it had been analyzed.

  3. Feature Scoping: Certain features were identified as adding little to no value and were removed from the integration process to reduce scope and effort.

Integration

Each use case was separated into an individual feature, and I began creating high-fidelity mockups and prototypes to test how they would be incorporated into the content builder. The focus was on ensuring these features fit seamlessly into the existing builder workflow.

Final User Testing & Feedback

After refining each feature, we needed to test how they behaved alongside other areas of the application. A final group of users was gathered for testing sessions to identify gaps in workflows and friction points.

Key Outcomes:

  1. Adjust Generate Workflow: The generate experience was relocated within text blocks instead of having its own block. This change created a seamless user experience when developing text content.

  2. Bring Analysis Closer to Editing: The analysis feature was moved to a hovering popover to provide direct responses beside the text editor. This introduced a new design system component to support this use case.

  3. Reduce Friction in Improve: Users struggled with visual complexity and often got lost in the workflow. The experience was refined and aligned more closely with the generate flow to create consistency across the assistants.

Demo

Last updated 06/2024

© 2024 Ty Summers
