AI Use Case Request Form
Team of 4
My roles/responsibilities:
Project Manager/Facilitator
User Experience Researcher (User Interviews, Usability Testing, User Scenarios)
User Experience Designer (User Interface Design, Interaction Design)
Duration: 9 months (September 2023 - May 2024)
Project Type: Consulting - University of Maryland Capstone Project
The Problem
As technology continues to advance, companies across the nation are using Artificial Intelligence to support employees' workflows and daily activities, and Montgomery County Government is no exception. The County Government is launching an "AI Center of Excellence" and expects an influx of AI use case requests from employees across multiple departments. To handle these requests appropriately, they are seeking a request form that will allow the Center to manage them.
Long Term Goal (LTG)
Create a testable prototype that facilitates employees' AI requests by eliciting a proper AI use case with sufficient detail for the Center of Excellence to understand the problem statement, without unnecessary back-and-forth between the requester and the staff member reviewing the request.
The Sprint Process
For this project, we followed Google's Design Sprint methodology, but instead of a one-week schedule with one day for each phase, we spent one week on each phase and divided the semester into five sprints.
Project Challenges
Scope Creep: The clients discussed many ideas that were outside the original Scope of Work (SOW). We addressed this by reminding them of the agreed-upon scope and enlisting faculty to intervene when necessary.
Lack of AI Use Case Examples: As part of the original agreement, the clients were supposed to provide examples of AI use cases, but they were unable to do so. To address this, we interviewed more employees to determine what they might use AI for in their positions.
Access to ServiceNow: The clients went back and forth on whether the form should live in their ServiceNow platform. However, we could not access the platform due to data privacy concerns, so we did not know what the interface looked like. As a result, we designed the form as best we could, knowing that some features might not be supported. The clients were aware of this and accepted the compromise.
Initial Research
We created a Business Model Canvas (BMC) to analyze MoCo's business structure and identify aspects like key resources and activities, cost structure, and value propositions. From this, we determined that the most costly resources were employees' time and efficiency. Though implementing AI can be expensive, by increasing employees' efficiency and cutting their workload, AI will ultimately save the government money.
We used a Competitive Analysis Matrix (CAM) to discover how AI is currently being used, mainly in the context of the government (via both direct and indirect competitors). This gave us some ideas about how MoCo could take advantage of AI.
Key Insights:
Anne Arundel County school buses are equipped with cameras that detect cars illegally passing while children exit the bus and issue tickets to the offending drivers.
Palantir, a private company that assists both government and commercial organizations, helps integrate and analyze large amounts of data.
GitHub Copilot makes suggestions that increase efficiency by reducing time spent problem solving.
Field Research
We conducted six initial interviews with people who use AI in their positions prior to interviewing MoCo Government employees. Using an affinity diagram to analyze the data, we found the following:
Employees would like a personalized AI system to meet their specific needs.
They are concerned about using sensitive data with AI tools.
Sprints 1 & 2: Research and Dashboard Design
At this stage, the clients had solely expressed interest in a form, but we did not yet have access to interview any target users (Montgomery County Employees).
I determined that the Center of Excellence (CoE) might benefit from a "Dashboard View" that would streamline requests into one general area. I felt this option would elicit more information from the CoE about the details they wanted and needed to see from each request. By designing the dashboard and testing it with the CoE, we were able to determine the base questions the CoE wanted before we even spoke with target users.
This also proved beneficial later, when we interviewed employees who were not very familiar with AI and had not yet heard of the CoE.
Sprint Stage 1: Mapping
We advocated for designing the dashboard as it would help us to determine the information the CoE would want from the form.
After expressing our desires, we used dot voting to determine our target for this sprint.
This helped the clients determine what information they wanted to see from the form.
Sketch - Dashboard
Based on our clients' desires as well as perceived useful features, we created our first round of sketches.
The sketch I developed incorporated a section for to-dos as well as AI-suggested actions, insights, and requests.
The clients determined that they liked features in each of the team members’ sketches.
The clients expressed they wanted something simple and easy to maintain, so we took that into account when designing our prototype.
In the sketches above, each view showcases what would be highlighted when the user clicks a specific section. When one section is clicked, the others collapse, allowing the user to view that section in further detail.
Storyboarding
The team developed a storyboard to map the user's journey through the dashboard. This helped us think through the prototype and identify any gaps in it.
Prototype
In this iteration of the prototype, we incorporated the information we had thus far from our clients as well as some ideas of our own.
Dashboard Landing Page
Action Items: Allows users to assign tasks to themselves and others.
Notifications: View new alerts, requests, etc.
Ticket Dashboard: View multiple requests at one time.
Insights: View trends among requests, such as the number of requests by department and how long each one took to resolve. The charts shown are customizable to the user.
Dashboard Insights & Conclusions
We conducted three focus group sessions with the Center of Excellence to test and gain insights on the prototype.
The CoE wanted to include department, status, and budget (if available); clarifying questions for funded requests; deadlines (legislative, customer, etc.); and a way to separate software/tool requests from more complex requests. With this feedback in mind, we were able to move forward with the form design with a clearer picture of the information the CoE would need when reviewing submissions.
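To make that feedback concrete, here is a minimal sketch of what a submitted request record could look like based on the fields the CoE asked for. The interface name, field names, and types are assumptions for illustration only, not the actual ServiceNow schema.

```typescript
// Hypothetical shape of a submitted AI use case request, reflecting the
// fields the CoE asked to see. Names and types are illustrative only.

type RequestType = "software-tool" | "complex";

interface AIUseCaseRequest {
  department: string;
  status: "new" | "in-review" | "approved" | "rejected";
  requestType: RequestType;         // software/tool requests are triaged separately
  problemStatement: string;
  budget?: number;                  // included only if available
  fundingClarifications?: string;   // clarifying questions apply to funded requests
  deadlines?: {
    kind: "legislative" | "customer" | "other";
    date: Date;
  }[];
}
```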
We also received feedback that members of the CoE did not want a separate platform for action items and notifications, since they already used several platforms to keep track of tasks and communicate with others. This did not surprise or faze us much, as the intention of this platform was mainly to understand the type of data they would want to see from the form. Overall, the CoE found it helpful, and it got the ball rolling on necessary conversations regarding the existence of the request platform.
User Scenarios
Based on our map, interview data, and input from the client, we determined three different user scenarios for employees filling out the form. Some users could embody different scenarios at different points when filling out the form.
1. Employee knows they have a problem and may or may not know how to articulate it, but does not know if or how it can be solved using AI.
2. Employee knows they have a problem and may or may not know how to articulate it, and may have an idea of how their problem might be solved.
3. Employee knows they have a problem, can articulate it well, and knows the solution they would like.
With these three scenarios in mind, our mission was to create a single form that catered to all three of these types of users, and any potential edge case that might occur.
Personas & User Journey
We used the scenarios above to develop personas, which acted as tangible representations of our users. We realized that two scenarios could be encompassed in one persona, and that employees may find themselves in more than one scenario depending on the request they have at the time.
The map, scenarios, and personas helped us visualize the different pathways a user would take in each of the scenarios outlined above. We realized that two of the scenarios could share the same pathway, as the questions would look quite similar for both. This journey map helped us design the final iteration of our form.
Form Design
With the user testing data from the CoE, interview data from prospective users, scenarios, personas, and journey map in mind, we began exploring the form design. Since the goal of the form was to reduce back-and-forth communication caused by a lack of detail, we used the data the Center of Excellence required as the basis for the form and then did further research to understand what users would want to include.
Form Sketch
I chose to make my initial form sketches simple following the clients’ requests. They wanted the form to be as simple as possible so the user did not feel burdened by filling it out. I trimmed the sketch down as much as possible while still incorporating all of the necessary elements.
Form Sketch
Form with necessary elements shown
Alternate View
Horizontal view shown because the form will exist on a desktop.
Initial Form Prototypes
Based on the guidance we had at the time, we created an initial set of prototypes.
We created these prototypes with the goal of keeping the form as simple as possible, but the clients had expressed that they actually would like to see more from the form, so we began adding to it.
Form Design
We began adding more to the form based on the clients' guidance, gathering more details from the respondent. We created a few more screens with probing questions, guided by our research and interviews with users.
We added an introductory screen to establish the context of the form for the user and guide them through their next steps.
From our user testing data, we learned that users wanted some background on what the information presented meant (e.g., a definition of a problem statement and an example of one).
If the user does not have a software/tool request, the form guides the user through creating a problem statement.
On the General Questions page, the user is asked whether the request is for a software or tool. If the user selects yes, they are taken to this page for more information.
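A minimal sketch of that branching logic is shown below. The page identifiers are hypothetical and only illustrate the flow; the real form would implement this within whatever platform ultimately hosts it (e.g., ServiceNow).

```typescript
// Sketch of the conditional routing on the General Questions page.
// Page names are hypothetical and only illustrate the flow described above.

interface GeneralQuestionsAnswers {
  isSoftwareOrToolRequest: boolean;
}

function nextPage(
  answers: GeneralQuestionsAnswers
): "furtherDetails" | "problemStatementGuide" {
  // "Yes" routes to the software/tool details page;
  // "No" walks the user through writing a problem statement.
  return answers.isSoftwareOrToolRequest ? "furtherDetails" : "problemStatementGuide";
}
```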
UI Improvements
As lead designer for Sprints 4 and 5, I decided to improve our existing form UI using visual design principles. As mentioned above, the design may ultimately live in ServiceNow, so we didn't want to create anything too elaborate that might not be supported. However, I felt there was room for improvement in our existing design and wanted our clients to take a deeper interest in our ideas.
When redesigning the landing page, the main goal was to reduce information overload. I decided to incorporate dropdown menus to hide some of the information, which the user can expand at their discretion. I also redesigned the other pages using the same aesthetic principles, but did not make substantial changes to the content of the form.
Notable improvements in the UI include hierarchical font weights and sizes, increased padding on buttons and labels, and a more visually appealing color palette.
Research Informed Form Design
After the redesign, I asked the research team to provide research on form design. Their first suggestion was to add an "identity page" that gives the user some context about the form. Since a typical identity page mainly asks the user to sign in, the client ultimately decided it would not be necessary, as the user would already be signed into ServiceNow to access the form, but they still appreciated seeing the design.
The research team also suggested allowing users to provide feedback. I experimented with a few different iterations of these screens, including adding suggestion help text and separating each step onto its own screen, before the team decided on a final option to present to the client: the version that stayed simple without overwhelming the user with too many pages to complete the process.
Accessibility Audit
We conducted a full accessibility audit of our prototype to ensure ADA compliance. The only failures we encountered related to contrast, so we changed the color palette to ensure all text meets the WCAG 2.1 AA minimum contrast ratio of 4.5:1.
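For reference, the 4.5:1 threshold comes from the WCAG 2.1 relative-luminance formula. Below is a small sketch of how such a pass/fail check can be computed for a text/background pair; the hex colors in the example are placeholders, not our final palette.

```typescript
// Minimal sketch of a WCAG 2.1 contrast-ratio check for normal-size text.
// Colors are "#RRGGBB" hex strings; the 4.5:1 threshold is the AA minimum.

function relativeLuminance(hex: string): number {
  const [r, g, b] = [1, 3, 5].map((i) => {
    const c = parseInt(hex.slice(i, i + 2), 16) / 255;
    // Linearize the sRGB channel per the WCAG definition
    return c <= 0.03928 ? c / 12.92 : ((c + 0.055) / 1.055) ** 2.4;
  });
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

function contrastRatio(foreground: string, background: string): number {
  const l1 = relativeLuminance(foreground);
  const l2 = relativeLuminance(background);
  const [lighter, darker] = l1 > l2 ? [l1, l2] : [l2, l1];
  return (lighter + 0.05) / (darker + 0.05);
}

// Example: dark gray text on white passes AA (ratio ≈ 12.6 ≥ 4.5)
console.log(contrastRatio("#333333", "#FFFFFF") >= 4.5);
```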
Final Iteration
Based on the user testing feedback and the accessibility audit, I made changes to the final iteration of the prototype to improve the experience of filling out the form.
General Questions
Removed the Procurement Request Code field to avoid confusion, as it is not standard across all departments
Ensured all radio buttons were properly aligned
Feedback Button Language
Incorporated a feedback section so that users can share their opinions.
We changed the language of some parts of this user flow, including changing “continue” to “continue to feedback”.
Landing Page
Moved text outlining how much time the form would take to the top of the page
Added a range to account for longer use cases
Redesigned the buttons to make them stand out as interactive UI components
Further Details
Added a dropdown menu for users to pick a tool from a pre-approved list.
This also utilizes recognition rather than recall, one of Nielsen's usability heuristics for designing better interfaces
We also added a tooltip over “third party software”, which would explain to the user that the software is not yet approved or provided by the county.
Design System
As part of my duties as designer for Sprints 4 and 5, I independently crafted and maintained a design system for the proposed solution.
Project Conclusion
Throughout this project, the group collected user and client feedback from MoCo employees to develop two solutions: the AI Use Case Collection platform and the Ticket Dashboard. We used sprint methods such as mapping and sketching to explore a range of workflows and user scenarios, worked closely with MoCo employees through testing sessions and client check-ins to complete several custom prototype iterations, and created a request process, the AI Use Case Collection platform, to support users with varying business needs and levels of AI knowledge.