PlayStation's Kristie J. Fisher + Guide to designing a GenAI product

In this newsletter:

* Podcast episode with Kristie J. Fisher, PhD, Sr. Director of Global User Research at PlayStation Studios.

* Guide to designing a GenAI product: From vision to content strategy

* Poll for the AI community

The biggest challenge facing AI products isn't whether users would use your product; it's whether you're giving them compelling reasons to switch from their existing solution.

This is extra difficult when leveraging an emerging technology like GenAI because of a few key factors:

* GenAI tools ask users to give up control and have faith that the system knows what’s right—the exact opposite of what we’ve been training users to expect from productivity tools

* GenAI is still nascent and doesn’t always get it right, meaning that in some situations it will deliver an inferior output (and need to be re-prompted)

* Users quickly run out of ideas about what to prompt because they don’t know what the tech is capable of

So as much as product teams can focus on the incremental delivery of value to users, those efforts are likely to fail because we're asking users to take a leap of faith, something that users, especially in B2B and enterprise, are reluctant to do.


That's why this week's episode with Kristie J. Fisher, PhD was so fascinating. Having worked on launching new products and features at Xbox, Google, and PlayStation, she has learned how to dive deeper into the psyche of users and gamers. Therein lies the secret to making a product enjoyable: defining metrics to ensure a user's time is well spent.

When building and researching, we must commit not only to delivering value but also to ensuring that the experience is enjoyable and worth changing workflows for.

So when building your GenAI product, always create evaluative metrics for the level of impact. The higher the score, the more likely users are to switch. It also offers an opportunity to qualitatively investigate where and how the impact is happening so you can mine valuable product ideas.
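To make this concrete, below is a minimal sketch of what such an impact scorecard could look like in code. The signals, weights, and thresholds are hypothetical illustrations, not a prescribed framework from the episode; adapt them to whatever "time well spent" means for your users.

```python
# Hypothetical impact scorecard for a GenAI feature (illustrative only).
# The signals and weights are assumptions for this example, not a standard.
from dataclasses import dataclass


@dataclass
class ImpactSignals:
    minutes_saved_per_task: float  # vs. the user's existing workflow
    output_acceptance_rate: float  # 0-1: share of outputs kept without re-prompting
    satisfaction: float            # 0-1: normalized survey score
    repeat_usage_rate: float       # 0-1: share of users returning within a week


def impact_score(s: ImpactSignals) -> float:
    """Return a 0-100 score; higher scores suggest users are more likely to switch."""
    time_component = min(s.minutes_saved_per_task / 10.0, 1.0)  # cap credit at 10 minutes saved
    return 100 * (
        0.35 * time_component
        + 0.25 * s.output_acceptance_rate
        + 0.20 * s.satisfaction
        + 0.20 * s.repeat_usage_rate
    )


# Example: a drafting assistant that saves ~6 minutes per task.
print(round(impact_score(ImpactSignals(6.0, 0.7, 0.8, 0.5)), 1))  # 64.5
```

Paired with qualitative follow-ups on the users who score highest and lowest, a rubric like this surfaces the "where and how" of the impact, not just the number.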

💡 Have questions about your GenAI project? Post them on the Design of AI LinkedIn page.

💡 Or contact me via email to discuss your project privately.

Kristie J. Fisher, PhD, has spent the last 15 years conducting user experience research and building and leading research teams across a variety of product domains, primarily in gaming. She currently leads the global PlayStation Studios User Research team. The mission of her team is to empower PlayStation's studios to get to great faster by being vision-led and data-informed. At Google, she worked on Stadia, Gmail, and Ads, and was a co-author of Google's People + AI Research Guidebook. Prior to Google, she was at Xbox Research, collaborating with game producers and development teams to improve player experience on Xbox, Xbox Kinect, and Windows.

Guide to designing a GenAI product: From vision to content strategy

Working with GenAI requires designers to shift their mental models from deterministic to probabilistic output. Not only are you working with a new material, but the technology is also so new that there aren't any best practices (yet).

This guide is an overview of the technology and lessons I've learned from my own AI consulting projects at PH1 Research and from the amazing experts we've had as guests on the Design of AI podcast (Spotify - Apple).

🎯 Continue reading the guide

Sections in this guide

* Background & reality-check

* Rationale for AI

* AI product vision

* AI product strategy

* AI product principles

* Design's role in crafting GenAI products

* Content strategy

Poll: We want to understand our community better so we can deliver better resources.

We started Design of AI to help teams quickly learn how to best leverage #GenAI. In the coming months, we're launching initiatives to improve knowledge sharing and address concerns we've heard:

* Lack of an archive of products/tools/features others have built

* Lack of best practices

* Lack of visibility into why initiatives have failed

* Lack of mentorship & a sense of doing it all alone

If you have any questions or want to help build out resources for any of these, contact us at info@designof.ai




This is a public episode. If you would like to discuss this with other subscribers or get access to bonus episodes, visit designofai.substack.com

[00:00:00] Whenever we have a technology that can do X, the better response is to pause.

[00:00:05] Now look at these user needs and opportunities, and given this, is there a space for that technology?

[00:00:11] If not, what do we need to evolve about the application of that technology in order to meet this

[00:00:16] need?

[00:00:17] We're exploring the intersection of video games, emerging technologies and product team

[00:00:22] KPIs with Kristie J. Fisher PhD, the senior director of global user research at

[00:00:27] PlayStation Studios.

[00:00:28] The fact that fun is a KPI for games, essentially, is something that I love.

[00:00:33] I think many times as researchers we stand in our own way.

[00:00:38] We don't feel empowered to suggest a new KPI.

[00:00:41] We're just like, well, that's not my job, that's some MBA's job.

[00:00:45] And it absolutely is your job.

[00:00:46] You are the expert on the people, the humans, the players, the users that you are building

[00:00:51] for.

[00:00:53] In Episode 18, we explore Kristie's experience at PlayStation Studios, Google, and Xbox,

[00:00:57] and dive into how teams can build more engaging AI product experiences.

[00:01:03] We discuss similarities, differences and key learnings of working in games and big tech.

[00:01:09] How insights teams should be structured to ensure they're delivering impact.

[00:01:13] How to create immersive experiences by creating KPIs for joy and delight.

[00:01:19] The benefits of AI to researchers and how AI is transforming games.

[00:01:24] Some of the best organizations, whether it's in tech or games,

[00:01:28] are starting to move towards more of this integrated multi-disciplinary insights group.

[00:01:32] Let's have a head of insights and under that let's have user research, analytics,

[00:01:38] some personalization tech, product strategy, let's have all of the data and insights people,

[00:01:43] all talking to each other, all of the time and collaborating because it's only when you're

[00:01:48] really triangulating across these multiple perspectives and multiple types of data that you get the highest-confidence answers.

[00:01:54] Kristie J. Fisher, PhD, has spent the last 15 years conducting user experience research

[00:01:59] and building and leading research teams across a variety of product domains, primarily in gaming.

[00:02:05] She leads the global PlayStation Studios User Research team.

[00:02:09] The mission of her team is to empower PlayStation's studios to get to great faster by being vision-led and data-informed.

[00:02:16] At Google, she worked on Stadia, Gmail, and Ads, and was a co-author of Google's People + AI Research Guidebook.

[00:02:24] Prior to Google, she was at Xbox Research, collaborating with game producers and

[00:02:28] development teams to improve player experience on Xbox, Xbox Kinect, and Windows.

[00:02:34] A cognitive psychologist by training, her research interests center on gaming,

[00:02:38] human computer interaction in the context of AI, and human learning and problem solving more broadly.

[00:02:44] Please note that Kristie's views don't represent those of her employer.

[00:02:49] I'm Brittany Hobbs, and my co-host is Arpy Dragffy. We interview people at the forefront of

[00:02:55] Gen AI to help you better leverage AI in your teams. If you like this episode, remember to like,

[00:03:01] subscribe, and rate our podcast in your favorite podcast app.

[00:03:05] And please find Design of AI on LinkedIn and Substack to join the conversation about GenAI topics.

[00:03:11] I am so excited to talk about video games and research in AI. As a kid who grew up with

[00:03:17] them, I'm particularly interested in how you ended up in the field that I dreamed about

[00:03:22] ending up in. How did your path take you there? I know you went through Google,

[00:03:27] but now you're at PlayStation. Also, as a kid who grew up in the 80s, I didn't even realize for a

[00:03:31] very long time that video games were actually a career. I actually stumbled into it. I feel

[00:03:37] very surprised and fortunate still to this day that I somehow ended up here myself.

[00:03:42] I went to graduate school for psychology, fully intending to be an academic and have my own

[00:03:49] research lab and study the brain. But while I was there, I ended up doing some contract work at

[00:03:55] Microsoft Research. And that's really the first time I had any idea that user research was a field

[00:04:01] within the technology industry. And that got me so excited. And that's when I decided I wanted to

[00:04:08] pivot my career path away from academia and into industry. And as I was finishing my PhD

[00:04:17] and looking for jobs, I saw an opening at Xbox for a games user researcher. And I was like,

[00:04:24] wait a second, not only is user research a field, you can do this in games. Are you kidding me?

[00:04:28] I applied, and here I am. The timing just aligned. I feel very surprised and fortunate to this day.

[00:04:35] Our audience is mainly tech. And what's interesting to me about tech is they keep forgetting that

[00:04:40] video games exist; they keep forgetting that video games are almost always ahead of the curve.

[00:04:45] They did the metaphors, they did avatars, they've done AI, they did gamification. There's always

[00:04:51] something amazing happening in video games. So why do you think tech people tend to overlook it as a

[00:04:56] source of influence? I asked myself that question a lot. And when I was at Google, I tried to

[00:05:04] not make that same mistake, and to point out when there were examples in the gaming sphere that we

[00:05:10] could draw from for some of our product questions. But I think weirdly enough, games is this

[00:05:16] interesting middle ground where it is both technology and it is art. And that is what makes it so

[00:05:23] exciting to me as a researcher where it's one thing to try to investigate questions around

[00:05:30] okay, how can a user accomplish this task as quickly and easily as possible? It's a much more

[00:05:36] challenging and more nuanced question to ask how can this user have fun in this world. But because

[00:05:42] games is kind of a little bit of both right, it's this creative vision that a game creator has,

[00:05:47] that then needs to be translated into a really complex interactive technology. So because it lives

[00:05:54] in between these two worlds, I think a lot of times the tech world has just categorized it as

[00:05:59] entertainment and the entertainment world has just categorized it as tech and both sort of

[00:06:04] miss out on what they can learn from games. I live in Los Angeles, and one of my

[00:06:09] constant soapbox gripes is how I never get any respect in this town working in games when everyone

[00:06:15] else is in film and TV, even though it's a bigger industry in terms of people and money

[00:06:21] than music and movies combined. That's my short answer: I think because it spans two worlds,

[00:06:28] it's easy for it to not really truly be a part of either and get forgotten about.

[00:06:33] I think history is going to respect you in a lot of ways, because when I go watch the Spider-

[00:06:38] Verse movies or the new Deadpool, they feel like video games. They're just video games now.

[00:06:43] Crazy, like you have actors basically representing themselves as digital entities. But to

[00:06:48] your point, video games have to be immersive, and we keep using the expression "user experience

[00:06:53] design," but digital products don't have to be immersive. They don't have to have a relationship

[00:06:58] with you. They don't need to be interesting, exciting, tense, emotional things. What you're building

[00:07:04] is really engaging humans on a level that, for anyone who has studied behavior, is just amazing.

[00:07:09] I know Brittany, as a psychologist, you must think this is so exciting too.

[00:07:13] Yeah, I think it's really fun on a lot of levels. I mean one is especially in cognitive psychology,

[00:07:18] research you do use a lot of games and gaming as a way to understand human behavior and

[00:07:26] especially human behavior in relation to technological stimuli. I think you are

[00:07:31] coming from such a great perspective on that. But two is, video games are kind of like

[00:07:37] the original GenAI, because it is something where you assume that it's incredibly responsive

[00:07:43] to what you input or how you interact and engage with it. So between your gaming

[00:07:50] research and your time at Google, you come from such an early-stage way of approaching this type

[00:07:57] of technology, which I think is really interesting. Between your time at Google and the game

[00:08:03] work that you've been doing, what do you feel has been the thing you've been most proud of?

[00:08:08] It's so hard to say. I've had such privilege to work on a lot of incredible products and

[00:08:16] work with a lot of incredible people. My role right now at PlayStation is really more of

[00:08:20] the leadership role where I'm having the opportunity to guide an entire team and we've been

[00:08:25] really working to evolve the way that we work as the industry evolves and as the technology evolves

[00:08:32] and I'm really proud of just watching the generation of researchers grow, and I feel really

[00:08:38] privileged to be able to lead them. So in terms of pride, that's it. One of the most

[00:08:42] interesting and challenging things I worked on as a researcher earlier on in

[00:08:47] my career was on the Xbox Kinect, which was a full-body motion-sensing gaming experience, and

[00:08:53] that felt really exciting because there was no real template for how to do that, and so we tried new

[00:09:00] interaction models, some of which were successful, some of which were very much not successful, and

[00:09:05] it felt really exciting to be on that cutting edge in that moment of how do you create this.

[00:09:11] We're really good at creating affordances and creating environmental, immersive

[00:09:16] cues to guide a player through an experience in more of a controller-input-based game.

[00:09:22] How do we translate some of that to full body input and what feels fun with one set of inputs

[00:09:30] might be different than what feels fun when you're doing it with your body and I felt like I was in

[00:09:35] a cognitive science sort of research lab asking these really fundamental questions but I was

[00:09:41] still working at a games company. That was a really interesting challenge. I'm proud of the work

[00:09:45] we did do there. This does raise something interesting, which is that we talk about

[00:09:50] AI as this new material. Many people reference it as such, but when you go back historically,

[00:09:54] you just referenced Xbox Kinect, and when the Wii came out, that was when we were introduced

[00:09:59] to the accelerometer, and that became a critical part of the iPhone, which was when we started

[00:10:05] bringing our physical interactions into digital products, and then there are new startups like

[00:10:10] Archetype AI who are trying to work with sensors and predict what's happening. What does it

[00:10:16] look like working with this kind of new material? When was the first time that you really

[00:10:21] encountered that this is something new and I can do something different with it, I need to ask new types

[00:10:26] of questions, and I have to imagine new ways of working with it? There's always something new. It's

[00:10:31] just a degree of how new. But the interesting thing, and I think maybe Jess Holbrook said something

[00:10:38] similar to you when you interviewed him, is something I've learned: that often there's a

[00:10:44] cycle where something new comes out, so whether it's the Kinect, whether it's VR, whether it's

[00:10:49] AI, and the first instinct is like, oh my god, this is so new, this changes absolutely everything,

[00:10:53] we have to throw out everything we knew about design and research and just completely think

[00:10:58] differently and start from scratch. There are definitely new and different considerations and ways

[00:11:03] we have to evolve and adjust, but often the fundamentals stay the same. Thinking about some of

[00:11:09] these very specific, tactical game experiences for the Kinect, or even all the way fast-forward to

[00:11:14] now with some of the things we're thinking about with how to use Gen AI it still comes back to

[00:11:19] what are the core user needs and then how will this be a tool that helps them accomplish those

[00:11:25] key user needs? Whether those needs are accomplishing a specific productivity task, or having

[00:11:30] a deep emotional experience, or enjoying a narrative created by a game developer, it goes back to what

[00:11:36] a user is trying to accomplish. This is just a new type of tool, and the same things still apply, right?

[00:11:42] You need to educate the user, you need onboarding, you need good affordances, you need good feedback. None

[00:11:49] of that changes with these new technologies. So I've seen enough of these waves of innovation

[00:11:55] that I now have this almost mindfulness practice where it's: okay, pause, stop panicking,

[00:12:00] what are the fundamentals, and then just try to translate those fundamentals back to

[00:12:05] this new space? My question on that is: how do you report back when you've discovered a human

[00:12:11] interaction that might infer a new use case or a new capability? Because we talked about the

[00:12:19] accelerometer and VR; the jobs to be done, the use cases, the value created by the game, sometimes

[00:12:26] you only learn about them once you do an ethnographic study of some sort, right? How does any of that flow

[00:12:34] backwards, like, I found something and this might actually change how we view games?

[00:12:40] At Xbox and at PlayStation, and even at Google, whether it's a game product or a different kind

[00:12:45] of product the best product development processes in my experience are really agile and iterative

[00:12:50] and ideally like you're kind of starting out with one set of goals vision hypothesis

[00:12:56] and then as you do that research you might discover like oh there's something else here

[00:13:02] this technology actually enables something we hadn't planned for yet and in my mind that's an

[00:13:07] opportunity to go back and kind of update that original vision hypotheses you know core jobs to be done

[00:13:15] and I welcome that and I think what I am always advocating for is making sure that at the

[00:13:21] beginning of any project we're building in that testing and iteration time so that when we make

[00:13:26] those amazing discoveries that we couldn't have planned for that we have the time and the opportunity

[00:13:31] to now incorporate that new possibility into our design intentions. That's such an important piece of

[00:13:40] advice for people I remember a couple of months ago listening to a keynote for I think it was

[00:13:45] for Dovetail and someone was speaking about how it's like we need to rethink the way that we plan

[00:13:51] a lot of this testing and be more thoughtful about inflection points and being able to act on

[00:13:58] new things that we discover that weren't expected. Arpy and I both as practitioners have always

[00:14:03] worked that way, where we build inflection points and changes into those research processes, and so it's

[00:14:11] really validating to hear that you do the same thing. I think more people would benefit from doing

[00:14:17] that for sure. So yeah, plan for the unplanned, plan for the unexpected. Yeah, absolutely. And to that point,

[00:14:24] when I was reading some of the work that you've done and the papers that you've written, I love the

[00:14:29] fact that you also look at quantifying magic. I thought that was such a wonderful way to inspire

[00:14:34] creativity from what you've learned. But then looking back at the work that you were

[00:14:40] doing with Jess Holbrook at Google on the People + AI Guidebook, the advice and recommendations

[00:14:46] you were giving to people were from five years ago, the Xbox project you mentioned was 10-plus years ago,

[00:14:52] and so much of what you were talking about at that point is still applicable today. We're still

[00:14:59] sort of relearning those same things that you were talking about so many years ago.

[00:15:05] Why do you think that that is? Why do you think that these things that have sort of existed

[00:15:09] for a while still feel novel today as people are doing this research? It goes back to what I was

[00:15:15] saying anytime there's a new technology that really blows our minds and unlocks new possibilities

[00:15:22] the first instinct is to throw everything out, and we have to totally reinvent how we do our jobs.

[00:15:28] But the fundamentals (you're a psychologist, you know this), the fundamentals of humans don't change

[00:15:35] that quickly; our technology does. And so a lot of what I've spoken to in my work over the years

[00:15:41] is the frameworks you can use to approach user-centered design and user research, and a lot of those

[00:15:50] are based on core principles of human psychology and of design. It doesn't change as

[00:15:56] quickly as the rate of technology, or AI specifically. So I think that's why a lot

[00:16:01] of it stays relevant: because it's based on core principles that don't change as fast as the

[00:16:05] rest of the technology around us. This Design of AI episode is brought to you by PH1, a research

[00:16:12] and strategy consultancy that helps clients build AI products that customers want. Contact them about

[00:16:18] product discovery research to answer critical questions about what to build, competitive analysis

[00:16:22] to find out how to gain an advantage, service and customer analysis to identify the best use

[00:16:27] cases and value drivers, and workshops and concept testing to validate what will work and to

[00:16:33] fine-tune products. PH1 has worked on products for Spotify, Microsoft, the National Football League,

[00:16:38] Dell, Mozilla, Bell, and various health and higher education groups. Bring in an expert to make sure

[00:16:44] your products and teams are focused on what customers want. Visit their website to book an

[00:16:49] intro call: PH1.ca. I have a question about how research might evolve, because we were talking about

[00:16:58] how we grew up being super excited about video games, and I first discovered that they had

[00:17:03] game testers who would just look for bugs, right? And then user testing became another generation

[00:17:09] of evaluation, and now with GenAI tools I know there's a whole field of research both in terms

[00:17:15] of training, fine-tuning, evaluating outputs and such. What does that look like in the video game world?

[00:17:23] Yeah I think something that we've talked a lot about in the last few years is being realistic

[00:17:30] about what we can and can't test inside of our labs so whether that's our playtest labs for video

[00:17:36] games or whether that's our usability labs in more of a tech product space. With AI and with these

[00:17:43] emerging user experiences, a lot of the key questions we have can only really be assessed

[00:17:49] in the wild. So for example some of the biggest games right now are these like mega online

[00:17:55] multiplayer, what I would call live-service games, so think of Fortnite, for example, or Roblox,

[00:18:01] and when you're building a game that's supposed to be a game like that there are so many

[00:18:06] questions about how players will interact in the wild, how the game will fit into their day-to-

[00:18:11] day life, how the game will retain them over time, and then the kind of meta experiences that

[00:18:17] will emerge like this idea of emergent gameplay I think goes back to what you were saying

[00:18:22] Brittany, about how games in some ways are sort of the original GenAI. People have found ways

[00:18:28] to play Minecraft and Fortnite and all of these experiences in ways the designers could not

[00:18:33] have envisioned. So we have to be realistic that we can't really predict ahead of time, or we can't

[00:18:38] get reasonable insights that can say for certain this is going to be what the user's experience

[00:18:45] is like; you have to just put it out there in the world and see. So we've been adjusting a lot of

[00:18:51] what we do to work with our teams to do those smaller live tests that eventually grow into

[00:18:56] these more closed betas, open betas, and then allowing the questions that can't be answered in the lab

[00:19:01] to be answered in a more controlled live environment before we're ready to do a big public

[00:19:07] launch. Thinking then about these sort of probabilistic experiences, and you know GenAI is

[00:19:15] very probabilistic, gameplay is very probabilistic, how are you actually testing for that? You were

[00:19:23] describing those closed environments, so people are using the actual game but in a very controlled

[00:19:28] time and space. But let's say I'm a product person, I'm from a product team, and I want

[00:19:35] to understand what I should be doing but I just don't know, especially because this is very new

[00:19:39] for most people. So what are some tangible suggestions that you would have for product teams who do need

[00:19:46] to be testing probabilistic experiences and don't really know where to start or how to get the best,

[00:19:51] least-biased insights? Yeah, plan ahead for a slow, phased live-testing roadmap, where maybe the first

[00:20:01] live test is just among your company, and then maybe the second one is your company and

[00:20:06] friends and family. And when I say controlled, it's like, okay, you're going to have access to this

[00:20:10] product for the next two weeks. Use it or don't use it as you wish, but it might be like, hey,

[00:20:17] you know, we need concurrency of X amount of players in order to do matchmaking, so everyone please play

[00:20:24] on Saturday at this time. At the beginning it's more quasi in-the-wild, right, where you're like,

[00:20:30] okay, I'm just letting people use this on their home devices, on their home networks, as part

[00:20:36] of their day-to-day life, but there are some constraints around it because the product's not fully

[00:20:40] baked yet, it's still in progress. But it gets you a little bit closer to out of the lab, and the

[00:20:45] next phase a little bit closer, and you're just kind of getting closer and closer and closer.

[00:20:50] And you can't ever be absolutely 100% certain whether your product's going to succeed, what your

[00:20:58] user's experience is going to be, until it's fully live, but you can narrow the risk and increase

[00:21:04] your confidence as you go from in-lab testing, to partially live, to a little bit bigger, to really close

[00:21:12] to the full live experience, and then launch. So we're looking at an ongoing, gradual thing? Absolutely.

[00:21:20] And in this situation, do you find that you're getting better results if you're doing just

[00:21:25] free-form, letting them play and coming back to you with what that was, or are you giving them tasks

[00:21:31] and sort of assigning different things that you're trying to QA? It always depends on

[00:21:36] what the key questions and areas of interest are for a specific game. By the time we've gotten to

[00:21:41] being ready to do a live test we've already nailed some of the core experience in the play

[00:21:47] test labs. Usually we're ready to let people just kind of play that being said if there is one

[00:21:52] particular game mode or something like that where we're like this is the place that we feel like

[00:21:56] needs the most feedback and attention and data we might direct people to please play this mode this

[00:22:02] weekend so it really just depends on where the product is and what the key answers are and

[00:22:07] in terms of that ongoing ramp-up, I'm reminded that there's been a lot of reporting on ChatGPT over

[00:22:13] the last couple of years and how the first thing they did is release a beta version to a lot

[00:22:18] of reporters knowing that they would be really incentivized to break it and try to make it do

[00:22:23] insane things and I think that's a really great example of limited live testing right where

[00:22:28] they had a very particular group they had very particular things they wanted to get out of it

[00:22:33] and they let that group really stress-test it, and they discovered a lot that ended

[00:22:38] up being revised when the final public version came out in 2022. Have you found that there's a

[00:22:44] particular method of getting that feedback back or tracking that has been the most effective?

[00:22:51] We always focus on triangulating between the qual and the quant. We've spent a lot of time

[00:22:56] making sure that the product experience in the case of Google or the gaming experience in the

[00:23:01] case of PlayStation is really well instrumented and that we have a dashboard set up ahead of time

[00:23:07] where we know we can make the comparisons we want to make to understand what people are actually

[00:23:12] doing in the product and then we will either with like diary study style kind of ongoing surveys

[00:23:19] or a mix of surveys and follow up post live session interviews try to dig into okay this is

[00:23:29] what we saw people do in the game in the product now why are they doing that what is driving

[00:23:35] it and how satisfied or not satisfied are they with that experience, how are they feeling when

[00:23:39] they did that experience. The exact method again might depend on the product or the game at hand

[00:23:44] but it's usually a mix of qual and quant, and the qual is often open-ended: sort of mini diary

[00:23:51] responses, survey questions, and interviews. I have some questions about that, because I think that's

[00:23:57] very interesting. We're in this era where basically GenAI is in this proof-of-concept stage, people

[00:24:02] are messing around with it, there are all these prompts, we don't really know what we're going to get out of

[00:24:06] it, it's a time of exploration. But I keep telling my clients when I work with them, in terms of

[00:24:11] how they can integrate GenAI, it has to be fun, has to be enjoyable, has to feel like

[00:24:15] there's a bit of magic happening, because otherwise you might as well do it the way you're used to,

[00:24:20] the way you're comfortable with, where you know what you're going to get. There has to be a bit of joy.

[00:24:23] Now this brings me back to games, and when I think of games, you all are experts in the fly-

[00:24:30] wheel effect, as they call it in product land. I was reading an interview years ago with the guy who's

[00:24:35] behind The Oatmeal, talking about how he makes games: he sits with his family members,

[00:24:39] he sees how they react, he makes paper prototypes, it's all about the reaction, the intensity of the reaction,

[00:24:45] and just how people play and how strongly they play and how aggressively they play. That's how you

[00:24:51] can really sense the emotional connection. On the other end, I was listening to an interview with

[00:24:56] Nikita Bier, who was talking about virality and how he makes very social

[00:25:00] apps and how he's built number-one downloaded apps, and he always launched them in very

[00:25:05] controlled small zones and just watched how quickly they explode; it's all about the growth rate, that's

[00:25:10] what matters to him. So I'm wondering if you have any lessons or guidance for people who are

[00:25:17] researching and testing products so that they're not so obsessed with these really abstract,

[00:25:22] boring metrics and instead they can focus on how can they make these things enjoyable because

[00:25:27] the joy matters more than whether it works sometimes. Yeah, great question. A couple of different

[00:25:33] answers to that. One is that my biggest sort of eye-roll moment as a researcher who's worked in

[00:25:39] both games and tech is when somebody comes to me and asks how can we gamify our product, and you know,

[00:25:44] I think their heart is in the right place, where I think what they're really trying to ask is

[00:25:48] what you just said, Arpy, which is how do I add delight and joy to my product, or how do I increase

[00:25:54] engagement with my product. But often what ends up happening is they just sort of layer on the

[00:25:59] sort of surface level elements of gaming hey we added badges to our expense reporting tool so now

[00:26:07] our employees are gonna love it, right? It's like, you're kind of missing the point, where things like

[00:26:11] badges and upgrades and trophies they're the final manifestation of experiences that are deeply

[00:26:18] motivating and that deeply provide that joy and that delight and those emotions so in terms of

[00:26:25] how to think about that, it starts with first principles. It's understanding who are your users or

[00:26:31] your players, what is going to be their context when they come to your product, what are maybe some

[00:26:37] unmet needs, and then looking at, okay, given what we fundamentally understand about the people

[00:26:43] were building this for and the context in which they're gonna be using it what are some ideas that

[00:26:48] we could have that could offer those intense experiences or that might cause them to want to share

[00:26:54] this with a friend and then those are our initial sort of creative hypotheses and then you can

[00:26:59] start with research to concept test and iterate from there and see if indeed there's a spark

[00:27:04] there that you can build from with your users or players. I was looking for inspiration from an

[00:27:09] article about how GenAI is not being leveraged in a smart way, and Kayak has launched this tool

[00:27:14] where you ask it for trip-planning ideas, and it's just a prompt box that tells you, well, it might not give

[00:27:19] you good, useful answers but try it out anyway, and it has a long, long description. It doesn't seem

[00:27:24] fun to me. Now, you worked on Google Flights for a little bit, and I'd love to know how would you

[00:27:31] test or give guidance on how to make a trip-planning tool more useful but also much more enjoyable,

[00:27:36] because trip planning has to be fun. Just a quick caveat: I didn't directly work on Google Flights;

[00:27:42] we did collaborate really closely with that team as we were building the People + AI Guidebook

[00:27:46] because they had a lot of great examples, but I can't take credit for the things about

[00:27:51] Flights that are working. I haven't used that tool on Kayak specifically, but of course trip planning is

[00:27:55] something I'm very familiar with and have, you know, given a lot of thought to as both an end user

[00:27:59] and a designer and researcher in user experience. So again, in this case, I think the opportunity for

[00:28:07] GenAI here is to both reduce pain points and increase delight, and I think you have to

[00:28:13] think about both of those things because if you just focus on well how do we make this more fun

[00:28:19] but you're not still solving the core need or removing core barriers and pain points then that

[00:28:26] fun is going to be blocked a little bit so I'd say first and foremost let's make sure this tool

[00:28:31] is actually solving a real problem so understanding like what is currently challenging about

[00:28:36] trip planning: is it comparing multiple options, is it trying to just balance schedules, is it not

[00:28:42] having inspiration, and then what are the components of that that AI can address? So if I was working on

[00:28:49] that I would maybe start with more some foundational research to really deeply understand

[00:28:54] what are the opportunities here on both removing pain and adding delight and in terms of adding

[00:29:01] delight again it would start with creative hypotheses like to me as as a traveler and again this

[00:29:09] would be like my creative hypothesis is that the most amazing experiences I have is when I just kind

[00:29:15] of stumble across some experience when I'm traveling that I would not have thought to search for

[00:29:21] that seems like a perfect opportunity for Gen AI right helping you find the unknown unknowns about

[00:29:27] the particular places you can visit or even places you might want to visit you might not know

[00:29:32] you want to visit and maybe it's based on what you've enjoyed in the past a trap that people

[00:29:39] in the tech world often fall into is they start with cool we have a technology that can do X

[00:29:44] let's figure out the use case. And I think whenever we have a technology that can do X,

[00:29:50] the better response is to pause, now look at the space of user needs and opportunities, and,

[00:29:57] given this, is there a space for that technology? If not, what do we need to evolve about the

[00:30:03] application of that technology in order to meet this need? Setting delight or fun or certain emotions

[00:30:10] as a core need or core design pillar is a really big missed opportunity in the tech space that I'd

[00:30:16] like to see more of. And I have started seeing it from time to time in little things, like, for example,

[00:30:21] once you clear your inbox in both Microsoft Outlook and in Gmail, there's a little

[00:30:27] cute cartoon with a sunshine: you're done for the day. It's just little moments like

[00:30:32] that. But what would the world look like if every single tech product had fun and delight, or

[00:30:39] certain emotions like relief or excitement or something like that, as a key design pillar or

[00:30:45] product requirement? I would love to see what that could lead to. As someone who's never had

[00:30:51] inbox zero, I'd love to just have one that is like: you made it through the day, it's okay,

[00:30:56] you can go. But I also think, for me, I'm constantly trying to understand, as the technology changes,

[00:31:03] as more use cases come out, how can we be smarter about researching and understanding these things, and

[00:31:11] we always feel like maybe there's some thing in academia that's this brand new method that's

[00:31:17] going to be so smart and so clever and so it's always a little bit funny but also confidence

[00:31:23] building I guess to hear you in this moment saying if you want to put the fun in it let's just go

[00:31:28] back to that evaluative research and understand at the core fundamental principle level what is fun

[00:31:36] about trip planning. And we still keep needing to remind ourselves to take a step back from the technology,

[00:31:41] get to the real human value and then bring it back to the technology.

[00:31:47] I love that, beautifully said. It feels like one of the problems here is that researchers live in this

[00:31:52] bubble where they're beholden to the KPI, but there's no KPI that's about joy or happiness,

[00:31:57] and one of the things I love about behavioral research, behavioral economics especially, is

[00:32:01] they find a way to quantify happiness and emotional reactions so they can create these

[00:32:06] micro-tests that ladder up, and that's really cool. How can we help researchers think outside the

[00:32:14] box a little bit more and think a little beyond the conversion number? I think that's one of

[00:32:18] the things that keeps me in games I started in games I went away from games for a few years and now

[00:32:23] I'm back and the fact that fun is a KPI for a game essentially is something that I love. I think

[00:32:29] many times as researchers we stand in our own way. We don't feel empowered to suggest a new KPI.

[00:32:38] We were just like, well that's not my job that's some MBA's job. It absolutely is your job.

[00:32:42] You are the expert on the people, the humans, the players, the users that you are building for

[00:32:48] and if you think there's a missed opportunity in the things that we're optimizing for and

[00:32:52] the things that we're measuring I do think it's our responsibility to propose those things

[00:32:58] and you can do it in a way that's still data-informed. I think you said it exactly:

[00:33:02] I think the reason why we have such missed opportunities to learn from games to just make

[00:33:08] better products in general is because we don't have enough, or don't have the right, user-centered KPIs.

[00:33:15] We talked about one example, I think from YouTube, in the AI guidebook or in another talk we gave,

[00:33:21] which was how, for a long time with YouTube, and this was many years and iterations of AI ago,

[00:33:28] the video recommendations were only based on did somebody click it, and that led to basically

[00:33:33] a lot of quote unquote trashy videos where there's some like snake eating a cat and you're like,

[00:33:40] ah, that's disgusting, but I somehow can't look away from it; some fundamental human

[00:33:45] part of me has to click on this wild-looking, insane video. But a lot of those videos were low quality

[00:33:51] and didn't really add value to people's lives over time, so even though in the short term

[00:33:57] it was like, cool, we've optimized for clicks, in the long term they saw usage and satisfaction with

[00:34:02] YouTube go down. So they had to change what they were optimizing for: for quality, for likes,

[00:34:08] for extended watching, for things like that, that were better signals that the user was really

[00:34:12] finding value out of the content and I think that's the kind of thinking I think that user

[00:34:18] research could be thought leaders on but we often kind of put ourselves in our own box of our

[00:34:23] job is to test the experience or our job is to answer this specific question versus our job

[00:34:29] is to provide frameworks to the team for how to think about making the best possible products

[00:34:35] for our users. Do you have access to a lot of the telemetry that enables you to get those signals

[00:34:41] very easily? Yes. That must be a real game-changer for you, because so often research is

[00:34:48] separate from data, and the data is a lagging source that you can't get without approval, so it's

[00:34:55] kind of crazy. Now that I've kind of shifted from individual contributor to leader over the last

[00:35:00] few years of my career something I think a lot more about is organizational structures and how that

[00:35:06] relates to ultimately delivering good products and some of the best organizations I've seen

[00:35:12] that are really leading in the user experience space, whether it's in tech or games, are starting

[00:35:16] to move towards more of this integrated multidisciplinary insights group where instead of having

[00:35:23] analytics kind of live over in one maybe an engineering or maybe a marketing or maybe

[00:35:29] in business management or something and then user research may be living in design and

[00:35:34] some other group living in product management let's have a head of insights and under that

[00:35:40] let's have user research let's have analytics let's have some personalization tech let's have

[00:35:47] a product strategy let's have all of the data and insights people all talking to each other

[00:35:52] all of the time and collaborating because it's only when you're really triangulating across

[00:35:56] these multiple perspectives and multiple types of data that you get the highest-confidence answers.

[00:36:02] I'll continue to advocate for that as a good organizational model as all these functions evolve.

[00:36:07] I think one of the challenges that I've faced going into a new organization in the last

[00:36:13] couple of months is also just that: how do we empower more people to take on the role

[00:36:19] of understanding insights so that's also great to hear because I think researchers have historically

[00:36:26] felt like well this is what I do I do research and they don't really understand data or they don't

[00:36:30] really understand the design side of it and how can we get more people doing that work but I want to ask

[00:36:37] really specifically around AI and how do you see AI being transformational to the role

[00:36:44] of researchers in gaming, tech, or beyond? I've already seen really strong examples of how

[00:36:52] AI can really help reduce the time the turnaround time of going from data to insights so

[00:36:59] whether that's helping to summarize data helping to visualize data more quickly helping to find themes

[00:37:07] in a massive set of thousands of open-ended responses. There are so many ways that this is just reducing

[00:37:13] the time it takes for us to go from finishing a study of some kind and then delivering to teams

[00:37:21] actionable insights and I could not be more excited about that because as researchers what I get

[00:37:28] excited about and what I see my team always getting excited about is thinking about the really

[00:37:33] interesting, tricky questions, right, like designing the right studies, actually observing people in real time.

[00:37:38] The unfun part of our jobs is that we'll now have a spreadsheet with 8,000 rows and I have to

[00:37:44] actually turn this into a beautiful product that I can present to my team in three days, and AI is

[00:37:49] cutting that time for us already, and I'm really excited to see what it can do. The huge caveat to that,

[00:37:55] of course, is that, as always a fundamental principle, it is a human-plus-AI collaboration. Even as it gets

[00:38:02] smarter and smarter and more and more capable, you still have to make sure you're pointing it in the

[00:38:06] best direction and doing your own sort of human sensemaking on top of the AI summarization,

[00:38:14] but that's one key way I'm really seeing an evolution and change and it's one I'm really excited about.
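To make the data-to-insights step concrete, here is a minimal sketch of one way to surface candidate themes from thousands of open-ended responses. It assumes pandas and scikit-learn, uses a hypothetical CSV with a "response" column, and is an illustration rather than the team's actual tooling; a researcher still needs to name and sanity-check the themes.

```python
# Minimal sketch: cluster open-ended survey responses into rough themes.
# Illustrative only; assumes scikit-learn/pandas and a hypothetical "response" column.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans


def theme_responses(csv_path: str, n_themes: int = 8) -> dict:
    """Group free-text responses into themes and surface the top terms per theme."""
    responses = pd.read_csv(csv_path)["response"].dropna().tolist()

    # Turn each response into a TF-IDF vector (ignoring very common English words).
    vectorizer = TfidfVectorizer(stop_words="english", max_features=5000)
    matrix = vectorizer.fit_transform(responses)

    # Cluster the responses into candidate themes.
    model = KMeans(n_clusters=n_themes, n_init=10, random_state=0)
    labels = model.fit_predict(matrix)

    # For each theme, list the highest-weight terms so a researcher can name and verify it.
    terms = vectorizer.get_feature_names_out()
    themes = {}
    for theme_id in range(n_themes):
        center = model.cluster_centers_[theme_id]
        top_terms = [terms[i] for i in center.argsort()[::-1][:8]]
        themes[theme_id] = {
            "top_terms": top_terms,
            "n_responses": int((labels == theme_id).sum()),
        }
    return themes
```

The human-plus-AI caveat above still applies: the clusters are a starting point for sensemaking, not the finished insight.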

[00:38:22] Yeah, so you mentioned some examples of allowing researchers to process data a little bit more closely.

[00:38:29] Are there any other examples that you can share of how AI could be leveraged to research maybe

[00:38:35] increasingly complex games in particular? Like, as the complexity of the storytelling and the

[00:38:42] possibilities of what people can do in it grow, how does AI allow for smarter

[00:38:49] research and testing of that? That's the thing I still don't have an answer to. We're still at the very

[00:38:56] early days of thinking about how can Gen AI change what are the game experiences that we're testing

[00:39:03] and I think it's more likely that the Gen AI changes the games first and then in terms of how

[00:39:11] we need to then respond to then properly test those games and whether we use Gen AI in that testing.

[00:39:18] I think it's still really to be determined. I wish I had an answer at something that I think

[00:39:23] will be doing a lot of experimenting with in the next few years. We talked about telemetry earlier,

[00:39:30] you have a perspective that Gen AI can help in terms of analyzing telemetry and such a way

[00:39:36] where you can inform potential optimizations for your teams. Is that something that it could do?

[00:39:43] Yeah that's something we've again been thinking about and and talking about and there is

[00:39:50] kind of a philosophical discussion with that too. So for example we talk about fun and fun can

[00:39:58] have a different definition for different people and increasingly it games are including a

[00:40:04] lot of different options so that people can customize the way that they play, either to account

[00:40:11] for different capabilities and limitations. So a PlayStation for example we're really proud of the

[00:40:16] work we've done with accessibility and gaming but not just that but also preferences. So for example

[00:40:21] I like to play a lot of my games on easy, I'm just going to say that and other people don't find

[00:40:27] that as fun and they prefer for more of a challenge. And also as we look at all those different

[00:40:33] options and settings for every game we always have a fundamental conversation about okay

[00:40:40] let's make sure that whatever options we give people are not fundamentally undermining the

[00:40:46] design vision and the core tenants and pillars of the game that we know drive the fun. So if I was

[00:40:53] able to make a game that was so easy there was then nothing for me to do in the game it was like

[00:40:59] press button to win. That would really not quite be the design intent either and that wouldn't

[00:41:05] be something that would be worth $67 for you to pay for. So I think the ability of AI to kind of maybe

[00:41:12] dynamically optimize the user's experience and maybe adjust the preferences in real time you know that's

[00:41:19] something that we talked about even before a Gen AI and not just in games but in products in general right

[00:41:24] can we leverage AI to kind of see what the user is doing maybe where they're struggling maybe

[00:41:29] where they're not having a great experience or they're not quite getting to their goal and then

[00:41:34] automatically adjust something to then optimize that experience. And I think we would need to be very careful

[00:41:40] with that, because while on paper that sounds really good, without really targeted application it could

[00:41:48] either lead to challenges with habituation, it could undermine the core intent of the experience,

[00:41:54] or it could just be confusing to people, like, well, why did that change all of a sudden? Oh, I

[00:41:59] was really, really close to defeating this boss, and then suddenly I got a magic arrow and the dragon

[00:42:04] is dead, and now I don't have the satisfaction of getting through that challenge. So we

[00:42:10] always love to have these philosophical discussions in our regular user research meetings, but

[00:42:15] I think we don't have a clear guidebook yet on what that can really look like but it's certainly

[00:42:20] a possibility. You mentioned earlier how a video game needs to be valuable enough for someone

[00:42:25] to spend $80 on a triple-A title. Now, a big way to do that is obviously side missions, and you could take

[00:42:32] 20-hour gameplay and turn it into 100-hour gameplay. Now, with a SaaS product or typical

[00:42:38] tech product one of the most consistent findings that I've had in products is how underutilized

[00:42:44] features and tools are and how few users are aware of what you can do with these platforms

[00:42:49] and quite often I've done research for big, big-name tech companies where people are asking for

[00:42:54] features that have existed for three or four years; they have no idea those even exist. So what I'm

[00:43:00] wondering is as someone who works in a field where you need to constantly find ways in the

[00:43:07] gameplay to deliver more value and interest players in longer gameplay, what are some lessons that

[00:43:12] you could share with a product team that perhaps is a little ignorant to the fact that they could deliver

[00:43:17] three times as much value by enabling more side missions within their existing features?

[00:43:23] Yeah, I was laughing a bit about the idea of people requesting things that have

[00:43:27] existed in the product for years, because I worked on the Google Ads team for quite a while, and that

[00:43:31] is one of the most complex products because it has so many hyper-expert users, and despite working

[00:43:36] on it for four years, I didn't even fully understand all of the things that all of our multiple

[00:43:41] product components could do. I think by the time I left, I still hadn't learned it all.

[00:43:46] So in games in particular, again, it's going back to fundamentals, and for some users the core

[00:43:51] value is the breadth or the amount of hours and experience. For other users that's secondary

[00:44:00] and it's more of like the intensity of the experience or the emotional resonance of the

[00:44:05] experience and the story. For other users it's the ability of kind of using the game as almost a

[00:44:11] platform for social connection or for personal expression or creation. So I think in terms of trying

[00:44:17] to assess what is of most value, how can we bring value, how can we make sure we're meeting

[00:44:24] that full scope of needs that we're living up to our price point or our value proposition.

[00:44:29] It's understanding again like what are the core drivers for the individuals and like a lot of

[00:44:34] users never see all of the side missions or all of the collectibles or all of the things in a given

[00:44:39] game just like they don't ever notice how to use all of the features in a product and so in those

[00:44:45] cases I think AI could be a pathway for people to understand what they're missing in those products

[00:44:51] so in a SaaS type product being able to say how do I use this to do X and then just getting having

[00:44:57] more of like a conversational approach to seeing like what you might be missing in the product

[00:45:02] that doesn't require you to just hunt around menus and be confused and frustrated for 20 minutes.

[00:45:08] And in the game space, I think there might be different paths to making sure that players

[00:45:13] are getting the most possible value from their games, and again, I think that's really

[00:45:19] understanding all the different things that are driving the fun for the particular type of

[00:45:24] audience the game is for. You mentioned so many things that I want to ask about 50 million

[00:45:30] questions, but I won't. So we're talking a lot about the ability to leverage GenAI in different,

[00:45:37] very probabilistic ways. And as someone who puts a lot of thought and effort into how

[00:45:44] we test and research and understand the right ways to utilize things like probabilistic

[00:45:50] storytelling in GenAI, what are some parameters or conditions, let's say, that are important

[00:45:57] to you as someone working in a space that is creating very immersive stories and has people who

[00:46:05] play very often and could potentially have addictive qualities to it what are things that are

[00:46:12] important for you to consider as you're going into this work and approaching it. Yeah the most

[00:46:17] important thing to consider is always your target players if you're making a game for children

[00:46:22] what is the standard for success and how to use different kinds of technologies is always going

[00:46:27] to be different than if you're making a game for adults and in any scenario really understanding

[00:46:33] how is the AI or any other new technology going to make this game more fun, more rewarding,

[00:46:40] more time well spent. Time well spent is a metric we talk about a lot, where it's not just about time, it's about:

[00:46:46] is that time valuable, is that time enriching your life, is that time increasing your delight

[00:46:52] in your day-to-day experience. So again, it's always taking it back to understanding the

[00:47:00] person that you are building the experience for and the vision of what experience you want them to

[00:47:07] have and what needs you want to meet. But I think my big takeaway from our conversation today has been

[00:47:12] really that you're looking at their whole self you're really looking at you know the joy, the impact

[00:47:19] that it's had outside of the game as a result of the game and I just keep looking at tech teams

[00:47:24] and wondering why they don't do this. And, I know we've asked this already, but I

[00:47:30] just would love to know how we can instill more of it, or if it's an organizational challenge,

[00:47:35] if it's a leadership challenge, if it's that the principles are not well aligned in these

[00:47:39] product teams like what do you see? It always comes down to incentives and as you said earlier

[00:47:45] when the KPIs are all just about like you know did we move this far on profit and not like

[00:47:52] did we move this far on profit while also delivering an amazing experience that really provides

[00:47:59] long-term value versus short-term revenue? I'm really looking forward to the fact that more and

[00:48:05] more we have user experience professionals in more VP- or C-suite-level positions

[00:48:12] who can maybe have more influence on on setting those incentives and setting those success metrics.

[00:48:18] And I really am excited to see where that goes. Thank you so much, Kristie. I really appreciate your time.

[00:48:26] Yeah, super fascinating. This has been so undervalued; hopefully this goes viral and we can start having

[00:48:30] some new conversations. I love your KPI of time well spent, and if we can get that

[00:48:37] culturally into more organizations, what a treat that would be.

[00:48:42] For all of us and all the things we have to use every day to just get our jobs done.


[00:49:16] Plan for the unplanned, plan for the unexpected. Yeah.

[00:49:21] Thank you for listening to the Design of AI podcast. The show is hosted by Brittany Hobbs and

[00:49:26] Arpy Dragffy. Subscribe on Spotify, YouTube, or Apple to get our latest episodes.

[00:49:32] We speak to leaders at the forefront of AI to learn how great AI products are designed

[00:49:36] and how they're transforming industries. The content is on our website, designof.ai.