Transcripts
1. Trailer: In the last few years, augmented reality has experienced huge growth: from web experiences, to apps and games, and even social media. Now more than ever, it is really important to design good user experiences. And that is why I created this class. Hey everyone, my name is Nora Cato. I am a UX designer specialized in XR and creative technology. When I designed my first augmented reality experience, I was completely lost. I had no idea how to transfer my previous UX knowledge from designing websites, apps and other digital products into this new medium. So there was a lot of trial
and error over the years. I've had the chance to design
interactive experiences for some of the world's biggest
brands, such as Mercedes, the Tokyo 2020 Olympics, Lexus, or Nickelodeon. That has given me a clear idea of the key elements that matter when designing this type of experience. With AR taking more and more space
over the next few years, many designers and
especially UX designers, want to take the jump
into designing for it. And that is why I've created
this course to summarize the fundamental key
UX elements that matter when designing experiences
for augmented reality. First, we'll go over the basics: what augmented reality is, what types of AR we can find, as well as the devices and the technology. Once that is clear, we will go deep into the specific UX processes
for augmented reality, as well as some of
the key concepts that matter when designing
for this technology. At the end of this course,
you will have the concept of your first augmented
reality experience. But don't worry, we'll
take it step-by-step, introducing small
exercises throughout the lessons that will help you define the structure
of your project. I can't wait for you to
discover the wonders of this new technology and
see what you make with it. So let's get started.
2. Project Overview: Before we start with
the actual content, let's talk about the
project for this class. The goal here is to come up with your first Augmented
Reality project. You don't have to worry
about finished designs, about wireframes or prototyping. The goal here is to come
up with an idea and build a structure for it to do
so, we'll go step-by-step. First, you'll come up with your idea, something
very simple. It can be summarized
in a sentence. It can be something
very technical or practical to help you do
something you do every day, like a measuring tape. Or it can be something a bit more creative or complex, like a children's storybook. In second place, we'll define the users: what are their goals and their needs, as you would usually do on a user experience project. In parallel with that, we'll talk about the context: what's the technology and devices that you have in
mind for this project? Then we'll briefly
define the experience. We'll talk about onboarding, the core experience, the error messaging, problems that users can encounter, and how they will solve them. This will be a very
high level user flow, but something that will give
structure to your idea. Finally, and as an
option for yourself, you can create a concept image, something that
visualizes the idea that you have in mind and that clearly in a simple image
summarizes your project. I know this might
sound like a lot now, but over the lessons we'll do small exercises that will help you with each
one of these steps. I encourage you to
share your progress with the rest of the class and myself so I can help you and
support you along the way. And now let's get started.
3. What is AR?: Let's start with the basics. What is augmented reality? Augmented reality
is a digital medium or technology that allows the integration of
virtual content or digital information into
our physical world. So it seems like it's actually there. This integration can be done either by adding something that is not physically there, extending what is there in some way, or actually completely modifying our physical world. It is really important
to remember that augmented reality is
not virtual reality. When we talk about
virtual reality, we are talking about
something that completely covers our world. So it's something
completely artificial. Augmented reality,
on the other hand, uses the real-world environment as part of the experience. So this is the big
thing to remember. Augmented reality
is a combination of real and digital world. So if something completely covers your real environment, your physical environment, so that you can't see or hear anything real, that is not augmented reality. Depending on the
integration between the physical world and
the digital content, there are different types of AR, and we'll go over them
in the next lesson.
4. Types of AR: Depending on the integration between the real world and the digital content, we can talk about different types of augmented reality. The main thing here is that they all share the common feature that they will never completely cover your real world. There will still always be a reference to your physical environment. Some of these types of AR might overlap in some way and some features might be common. But still, you might feel that a specific type of AR is more suitable for your project or for specific use cases. So at a top level, we'll differentiate between two types of augmented reality: marker-based and non marker-based. Let's start with marker-based. Marker-based AR uses
target images or markers to position the digital
content in a given space. These markers tell the application or the technology where to place the digital content within your AR experience. A very good example of this type of AR is the Tate Britain Untold Stories experience. This Instagram effect used the paintings in the gallery to trigger the 3D content. So each painting had its own AR experience, because each was a completely different marker, and the technology understood that a specific image triggers one specific AR experience. On the other hand, we have non marker-based
augmented reality. This is any type of
experience that doesn't use a target or an image to
position the digital content. A good example of non marker-based AR is the IKEA Place app. This app lets you virtually place any object that you can find in the IKEA catalog in your space. So this app understands that there is a floor in your house; it understands the space. And it adds that object, that sofa or that rug or anything that you want, true to scale, so you can understand how it would look in your actual space. Within the non marker-based
augmented reality, I like to think there are
two more types that even though they still don't
require an image to work, they're still their own thing. The first type is location-based AR. And I'm sure that if I say Pokemon Go, you immediately understand what I'm talking about. This type of AR uses the location and orientation sensors in your phone to position any type of virtual object in a specific location or point of interest. So, as Pokemon Go does: you can move around your house, your city, or wherever you go, and you might find specific content in a specific location. In the case of Pokemon Go, you have to catch Pokemon, but it can be applied
to anything else. On the other hand,
we have projection based AR, or superimposition. This type of augmented
reality involves either partially
or fully replacing the original view
of an object or a person with an augmented virtual
view of the same thing. This type of AR has been
really popularized by social platforms
such as Instagram and Facebook and
Snapchat and TikTok. But we can see really cover
examples of these with some Instagram AR for
striking effects. Or for instance with
the snap city painter. A much more complex
example of what this superimposing
based AR can do. In this case, you
are not using a face or an object to
superimpose something, but actual buildings
and streets. And this is something
that Snapchat has been doing over the last few years and perfecting into really complex experiences. As I said at the beginning, it is really important
to remember that these different categories
can overlap at some point. It might be that
one experience has actually two types of
AR and not just one. Remember that this technology is currently evolving; in the future there might be different categories, or even now these types might change depending on what or who you read. But for now, we'll work with this, so that we have a clear structure of what type of AR we're talking about when we see examples or when you tackle your own projects. And now let's warm up
with a small exercise that can help you towards
your final project. In this lesson, we've seen different types of AR, with an example for each one. Now that you understand what different
types we can find, go out there on the web and try to look for
other examples. You can find one for each, or you can find several
for one specific category. Remember, the different categories are marker-based AR on one hand and non marker-based AR on the other hand. Within non marker-based, we can also find location-based AR and superimposition AR. I really encourage you to share what you find with the rest of the class and myself, and maybe this might spark some ideas for your final project. In the next lesson, we'll talk about different devices and technology that can be used for augmented
reality experiences.
5. Devices & Technology for AR: So let's do a quick recap of the main devices that we can use for augmented reality and some of the technologies used to create these experiences. We're gonna do a
very top-level list. And I encourage you, if you are interested
in this area, to explore on your own, to choose one of
these technologies and to do some research
on the types of features and requirements, and even the interactions that they use in terms of hardware. Nowadays, there are many, many companies creating
their own devices, not only for augmented reality, but also virtual reality and mixed reality, which would be like a mix of both. But here, let's introduce
the main ones again. If you're interested
in any of these, I really encourage you
to do some research, to go onto the website
and read more about them. But let's start with one
of the big ones accompany that has been developing
the AR headset for years. And this is the
Microsoft HoloLens. The Microsoft HoloLens or
a mixed reality device, which means they can integrate both AR and VR experiences. They use multiple sensors, optics and holographic
processing that blends seamlessly
with our environment. The interaction is done
using controllers, hands, or even gaze. So depending on where
you're looking, the headset will
interpret that you are selecting or
confirming something. You can use your case to
interact with device. This is a very, very powerful piece of hardware, which means that it's not
really accessible to everyone. In a similar area, we find the Magic Leap, another company that has been developing its AR headset for years. Although they started their journey with more creative applications, nowadays they are very focused on enterprise and companies. It is a very powerful 3D visualization tool, so powerful that it can be used even in health and medicine areas, just like the Microsoft HoloLens. You can interact with it using your hands, using a controller, using your gaze, but also with your head position and orientation, your mobile, or even other inputs like keyboard and mouse. Companies use these AR headsets to improve productivity and to improve some of the tasks that they do on a daily basis. Talking about glasses, we have the Snapchat Spectacles, which are a completely different take on AR headsets. The Snapchat Spectacles allow you to interact with experiences done through Snapchat. They are a different take on headsets, more based on social media. They don't allow massively complex experiences, and in terms of 3D visualization, other devices like the Magic Leap or the HoloLens are much more powerful. But still, it's a much
more accessible piece of hardware that you
can have in your home, and that you can use for much more playful and creative experiences. And now let's talk about software
or specific technologies. There are many, many pieces
of software that you can use to create
your AR experiences. Some of them might even sound familiar to you and you
might choose one or the other depending
on where you want to publish it or the level of
complexity that you want. These are just the main ones, are the ones that in my
personal experience, I've seen or used the most. If you're looking to create
AR experiences that are complex and can be
published on any platform, I really suggest using a game engine like Unity or Unreal. These allow you to create and publish experiences to a variety of different platforms. The problem here is that you might need good coding knowledge. So if you have never developed before, if you've never used Unity or Unreal before, it might be a bit difficult to start, although you can integrate many different tools and developer kits, like Vuforia, or publish AR experiences for the web using WebGL. This requires good coding knowledge, and if you want to use one of these game engines, you might need a developer to help you. If you're looking for something
a bit more accessible, that doesn't require as many coding skills, you can use a tool like Adobe Aero, although it is stuck within the Adobe environment and you still need to have this app to see the experience. It's a good tool to prototype and to get started with AR very quickly. Adobe provides tons of tutorials and videos to help you get started with AR, and you can have something published in no time, really. However, this is only available for iOS, for iPads and iPhones, and only available through the Adobe Creative Cloud. However, if you're already paying for your membership, I really encourage you to download it on your iPhone or on your tablet and just play with it and experiment and see what you can get done. In a similar line, and also limited to the Apple environment, we have Swift Playgrounds. Swift Playgrounds has the goal of making coding accessible. So if you want to
get started with AR, it might be a very good tool. Although still it might require some coding or
some understanding of logic and interaction. But still, if you are an Apple user and you want to give it a go, there are tons of tutorials and videos, and Apple really supports you in creating your experiences, in my personal experience. So if you want to
get started with augmented reality and publish something for anyone, I really suggest you go to the big three social media platforms: Instagram and Facebook, Snapchat, and TikTok. Nowadays everyone has one of these three apps on their phones, or even all of them. So that means that if you create an AR experience for one of these three platforms, or for all of them, chances are your audience will be a lot bigger than if you created it just for Adobe users or for someone with a headset, and it is easy to share it and to publish it. These platforms have made creating these experiences really easy. They all have very visual tools, so even if you want to create something with logic or complex interaction, they support you in doing that, even if you don't know how to code. And although these experiences will only be available through the social media platforms, Instagram, Snapchat, or TikTok, chances are a lot of people will be able to enjoy them, as they already have them on their phone. If you're interested, these are the tools for each platform: for Instagram and Facebook, you can use Spark AR. For Snapchat, you can use Lens Studio. And for TikTok, you can use Effect House. As a quick side note,
there is a company that I would like
to mention because of the impact and
importance that it has on the AR
environment nowadays. And this is Niantic, the company behind Pokemon GO, Wizards Unite, or Ingress. Their approach to AR from a game's point of view has made them a really important player in this new environment. The big strength that they have is that they encourage people to move around, to explore the world, so that games are no longer played sitting down in the living room. And AR is the perfect
medium for that. If there's one main takeaway that you should take from this whole big picture of devices and technology, it is that if you want to get started with AR, you don't need the latest technology to publish something. You don't need the latest headset or to spend thousands of euros or dollars on a piece of hardware. And on the other hand,
you don't need to be a very skilled developer
to create something. You can use a social media
tool to publish something, to share it with your
friends and family. And you will have
your AR experience.
6. Introduction to UX for AR: Now that we have a clear idea of the technology that we're dealing with, let's dive into the UX. The main question here is: what differs from designing a static website or application to designing an augmented reality experience? As a UX designer, what's different for me? Well, to be honest, they
are very similar processes. And once you understand the technology, part of it is exactly the same as what you would do on a traditional UX project. You do your research about the users and their goals, needs and challenges. You understand them by defining the personas, scenarios and experience maps. Then you create wireframes, mood boards and prototypes, and iterate based on user testing until you get to a final result. There are some key questions
that as a UX designer, you would still have to
consider when designing for AR. Who are you designing for, your users? What are their goals and what do they want to achieve? And what do we want them to achieve? What are their motivations and pain points? What's the context of use? This is a very brief overview
of the UX process that you would do on a
traditional project that you would still
have to do in AR. But there are details
in each one of these steps that will
differ from any project. Before any of that, there's one thing that we need to remember: we're talking about a very new technology. Although platforms like Snapchat, TikTok or Instagram have democratized access to it, we can't assume that every user will know how to use it. People might not know what they're dealing with or might not be familiar with it. This is something very new and very important to remember. So, having established that, what are the differences then? Well, the main one, and we've gone over it, is the technology. There is no other way around it. You need to understand
the devices, the hardware, and the
technology that we're using. What are the limitations? What are the things that
you can do with it? What are their constraints? And most importantly,
we have to understand the difference between
headset and mobile. This is not just about interaction or actual testing: we need to understand that there is a limitation in access. When designing for a specific audience, we have to remember that not everyone will have a headset, whereas many people nowadays have a mobile device of their own. So if you're designing, let's say, an augmented reality experience for a classroom, chances are they might not have augmented reality headsets. So you might think
about designing your experience for a
mobile device or a tablet. Apart from the technology, we also have to understand the space that we're
designing for. We're no longer designing for a flat mobile screen
or a desktop. We are designing in 3D and we're designing in our
physical environment. So we are no longer designing for a flat screen in 2D; we are designing for our world in 3D. Don't worry about it, we'll talk about it later on, but it's important to start
considering that this is no longer something
static on your screen. In the next few lessons, we'll explore some of the
key UX processes that matter when designing for AR and that you still have to consider. But before doing that, let's do another small exercise. So far, we've defined what augmented reality is, the types of augmented reality, the devices and technology that you can encounter, and a brief introduction to user experience. Now, you might have a bit of an idea of the environment
that we're dealing with. I think it is a good moment to start thinking about
your final project. But don't worry, let's
go step-by-step. First of all, you just need
to think about an idea, something that you can
summarize in a sentence. It can be coming up
with a problem you have that you think
augmented reality can help with or something more
creative and that you want to explore in a
less practical way. For instance, you might
need a tool to help you lay out your paintings on the wall and you think AR can
help you with it. Or you want an
experience that can help you tell stories to
your children at night. Anything works. I really encourage you to share your ideas so I can
give you a bit of feedback or support you on developing it further
for your final projects.
7. UX for AR: Presence, Agency and Affordance: The first UX concept that we're going to talk about is not one but actually three. These three concepts are three ideas that are very related and have a lot to do with one another. We're talking about presence, we're talking about agency, and we're talking about affordance. And if you have no idea what these words mean, don't worry, let's go. The first concept we're going
to talk about is presence. And presence could be
related to interaction. So it's the degree of actually being there that you can have when experiencing
something. Presence refers to
the feeling of being there or just being
a mere viewer. A very good example of these are movies versus video games. When you're watching a movie, you're watching
something that you might feel very engaged with, but you are not
actually experiencing, you're not actually in there. You don't have any
degree of presence. Whereas when you are
playing a video game, you might have a degree of
presence because things that you do have an
impact on the game. So you are actually
in the game somehow. There are some
things that impact the degree of presence that we have or that we are giving
our users in an experience. The first one is the
degree of control. So how much control are
we giving our users? Can they do anything they want? Depending on how much control we're giving them, they might feel like they have more presence in the game or less. Second is the
immediacy of control. Are the actions that they do immediate, or is there a delay? Because if there is a delay, that can affect or even break the presence. And the last thing is
the mode of control. So the control should
behave naturally. It should be easy to use; if a user needs to learn something new that they don't usually do, then it can break their attention or their presence completely. Very related to this is the second concept that we're going to talk about, which is agency. Agency is the ability
to take action or interact with the world in
a way that has impact. Agency, in simple words, means control. And agency generates presence, because you have the ability to interact and you can see that those actions
have an impact. And last but not least, let's talk about the affordance. Affordance is a quality
of an object that shows how to take action or
interact with that object. So the ideal scenario
is that you know how to use something without anyone telling you
how to use it. A very good example is a chair. The chair is designed in a way that signifies being sat on. So its design is telling you what to do, without anyone telling you what you should do with it. The ideal thing is that users, with their instincts, know how to interact with the world that you have created, or with your experiences, without you having to tell them at every step what
they have to do. But in case it is needed, there are ways you can provide virtual affordances
in an experience. You can provide signifiers. You can add labels to your buttons. You can create maps or location trackers. Obviously, the more names, tags, labels and signifiers you add to your experience, the more cluttered it will feel. You have to hit the right balance between creating an experience that users know how to use, and adding support in case they don't know what to do. These three concepts,
presence, agency, and affordance have to do with interaction,
with engagement. If the content in our experience is not clear, if users don't know what to do with it, they won't feel engaged. They won't feel like they have the agency to interact with it, so they won't feel present in the experience. They will be just mere viewers, like watching a movie. Obviously this can vary and you can play with it. But we are talking about AR: you might be thinking about doing a viewing experience in AR, but at the end of the day, this is about interaction and creating engagement with our experiences.
8. UX for AR: 2D vs 3D: So far we've talked about some aspects that we have to consider when designing for augmented reality. We've talked about presence, we've talked about agency, and we've talked
about affordance. However, these three concepts
are also important when designing other types of
interactive experiences. But the main area where
things differ when designing for AR is
the environment. In augmented reality, we're no longer designing just
for a flat-screen, but our whole world is
the interactive area. In this lesson, we're not
going to talk about how to create 3D content for
these experiences. This would have enough content to fill in a whole
different class. However, we're going
to talk about how this digital content is positioned in our world in relation to our camera or device. To make it easier, we're going to differentiate between two ways of positioning this content: one, when it is attached to our device, and two, when it is attached to the world. This content can be either 2D or 3D, but if we want to take the most advantage of the AR medium, we're going to stick to 2D when it is attached to our phone, and 3D when it is attached to the world. We're going to use 2D content attached to our phone when we want to use it for UI. This content will include the type of things that we want users to have at reach at any time. So this includes menu buttons, settings, profile, and this type of content. Usually these wouldn't be part of the experience per se, but a different way to interact with it that we want users to have at reach at any point. However, the most interesting
area when we talk about AR is the 3D content that is attached to the world. This is the most interesting way to position 3D content, because it forces users to move around and interact with this content. Although these objects can be positioned anywhere, there are different ways that we can attach them to the world. For instance, you can have something attached to a surface, like placing a piece of furniture in your living room. You can also pinpoint it to a specific location, like catching a Pokemon; to your face, through a social media AR experience; or to a print or a painting, like in a marker-based AR experience. Although we are making a difference here between sticking 2D to the phone for UI and 3D for things that are attached to the world, this can be mixed and matched. However, it is the easier way to understand the difference between one and the other. There is one essential thing here to understand about the difference between 2D and 3D. And it is that although
we are interacting with our world and our world
is our interactive area, we're still experiencing
it through a small window, that being our mobile device or a headset. This window still has a limit. We don't want this window, this screen, to be cluttered. We don't want it to have a lot of 2D elements, because the focus of the experience should be the 3D content. So it is a good practice to prioritize the user interface that we want to be there and to hide the rest in menus or other areas of our experience that are not as front and center. Let's do another small exercise
for your final project. If you've been following
the lessons by now, you might have a rough idea
for your AR experience. In previous videos,
we've been talking about context, users and technology. So let's talk about that, but related to your project. With your idea in mind, even if it is just a rough sentence, try to answer the following questions. Who are your users? Is it just yourself, your parents or friends, or is it the whole world? Where and when will they use this experience? Is it in their living room,
when they're walking, when they're traveling, or in a classroom? Which device will they use? Is it their mobile device, a headset, or something else? Write the answers
for these questions and use them to describe your idea in a bit more detail and remember to share it
with the rest of the class.
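The device-attached versus world-attached split from this lesson can be summarized in a small sketch. This is a hedged, hypothetical Python model for illustration only; the names (`Anchor`, `Content`, the sample items) are assumptions and are not tied to any real AR framework.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Anchor(Enum):
    """Where a piece of AR content is attached (illustrative taxonomy)."""
    DEVICE = auto()    # screen-attached 2D UI: menus, settings, profile
    SURFACE = auto()   # world-attached: a floor or wall (e.g. placing furniture)
    LOCATION = auto()  # world-attached: a point of interest (e.g. Pokemon Go)
    FACE = auto()      # world-attached: a tracked face (social media effects)
    MARKER = auto()    # world-attached: a target image (marker-based AR)

@dataclass
class Content:
    name: str
    anchor: Anchor

    @property
    def is_world_attached(self) -> bool:
        # Everything except device-attached UI lives in the 3D world.
        return self.anchor is not Anchor.DEVICE

experience = [
    Content("settings button", Anchor.DEVICE),  # 2D, always at reach
    Content("virtual sofa", Anchor.SURFACE),    # 3D, placed in the room
]

# Good practice from the lesson: keep the screen uncluttered so the
# world-attached 3D content stays the focus of the experience.
hud = [c for c in experience if not c.is_world_attached]
world = [c for c in experience if c.is_world_attached]
```

The design choice mirrors the lesson: a single flag (device vs world) decides whether something belongs to the always-visible HUD or to the 3D scene, and the HUD list should stay as short as possible.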
9. UX for AR: Field of View: Let's talk quickly about another UX concept that matters when designing augmented reality experiences: the field of view. The field of view is the extent of the observable world that is seen at any given point. In AR, basically, it is the area from where you can see an object or an image. If it's not in your field
of view, you can't see it. If you don't really understand what I'm talking about, let's do a quick experiment. Place your finger in front of your face and move it backwards until you can no longer see it. That point where you can no longer see it defines your field of view. So whatever is from that point to the front of your eyes, that's your field of view. And the same if we look up and down. In terms of augmented reality and what we've been talking about, this is important in terms
of content placement. Where are you positioning
things in the space? This is critical in an
experience because if you place something outside
of your field of view, users might not be
able to see it. And I know it sounds very obvious, but you might be surprised how many experiences start with something behind the user's head, and people think it is completely broken. In general terms, although you don't really need to remember this, the field of view is defined as 110-120 degrees. This research comes from the Google Daydream UX team, which also concluded that the optimal view for up and down is 60 degrees. This area, 110-120 degrees horizontally and 60 degrees vertically, is defined as the content zone. The research also talks about the minimum distance that we need to place something away from the user in order for them to see it correctly, and this is about half a meter away from them. Otherwise it might be too blurry and the user might not be able
to focus on it or see it. This might not be absolutely relevant for you when you're working on the UX side of things, but it might be important in terms of user testing and things that might go wrong along the way. The essential thing here to remember is that if you don't place something within the user's field of view, it will be completely lost and might give the feeling that the experience is broken. If you want users to see something outside of their field of view, you need to guide them to it, by using UI or maybe by placing an object just outside the field of view so that users tend to look toward it. It is as easy as this: if you place content outside of the user's field of view and you don't guide them to it, the content might be completely lost and they just won't see it.
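The content-zone numbers from this lesson (roughly 110-120 degrees horizontally, 60 degrees vertically, and at least half a meter away) can be sketched as a simple placement check. This is a hedged illustration, not part of any AR SDK: the function name and the simplified camera model (camera at the origin, looking down the +z axis) are assumptions for this example.

```python
import math

def in_content_zone(x, y, z,
                    h_fov_deg=110.0, v_fov_deg=60.0, min_dist_m=0.5):
    """Check whether a point (in meters, camera at origin looking down +z)
    falls inside the comfortable 'content zone' described in the lesson."""
    dist = math.sqrt(x * x + y * y + z * z)
    if dist < min_dist_m:
        return False  # too close: likely blurry, hard to focus on
    if z <= 0:
        return False  # behind the user: completely invisible
    h_angle = math.degrees(math.atan2(abs(x), z))  # left/right offset
    v_angle = math.degrees(math.atan2(abs(y), z))  # up/down offset
    return h_angle <= h_fov_deg / 2 and v_angle <= v_fov_deg / 2

# An object straight ahead, one meter away, is visible; one directly
# behind the user, or only 30 cm away, fails the check.
print(in_content_zone(0.0, 0.0, 1.0))   # True
print(in_content_zone(0.0, 0.0, -1.0))  # False
```

A check like this could run during user testing to flag content that spawns behind the user's head, which, as the lesson notes, is often mistaken for a broken experience.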
10. UX for AR: Interaction: So far we've discussed many areas where we have to put our attention when designing augmented reality experiences. However, we haven't talked about the main one, which is the interaction. So how do users interact
with our experiences? In previous videos, we've
talked about affordance. And this has a lot to do
with interaction because it refers to the actions that
our users will and can take. The main key thing here is that any interaction that
we allow our users to perform have to be
intuitive and responsive. Although there are
many ways that we can interact with this
type of experience, here we're going to talk about the four most common ways to interact with AR experiences. The first one is
through the screen. When we interact with an AR
experience through a screen, this can be done through
two different ways, via direct manipulation
or via the UI. Direct manipulation means that any action that
we're performing, we are doing it to the object directly through the
UI means that there is some sort of 2D elements
in our screen that allow us to interact with
the digital content, but it's not directly
on the content. Obviously, direct
manipulation is much more immersive because it takes advantage of the
AR and the camera. Although you can
still use the UI, this should be used
for things that are not about the
experience itself, but more about menus are other ways to interact
with this content. If you record,
experience implies interactive with some
sort of digital content, allowing users to perform
actions directly to it. It is far more
immersive and engaging. Let's go back to
ikea plays an app that we've been
referencing over and over. In this app, you can
either scale, rotate, or position the object by touching it directly
and not through the UI. This is a much more natural way to interact with this content. And although it
might need a bit of guidance or instructions
at the beginning, it feels much more intuitive
once you know what to do. Although interaction
through the screen is the most intuitive, due to how familiar it is to us, we can take more advantage of AR to interact with our experience, e.g. through the use of gestures, voice control, or eye gaze. Gestural interaction takes full advantage of AR because it uses the camera as an input. We can differentiate between hand gestures and face gestures, which are the most developed ways to interact. In terms of gestures, hand gestures require the user to hold the phone or the device in order to perform the gestures in front of the camera. On mobile devices, using hand gestures might not be the most intuitive option, because it requires holding the phone in one hand and using the other hand to perform the action. However, on headset devices like the HoloLens, you can use both hands to perform your interactions and use actions like pinching to select. On the other hand, we have face gestures, and social media experiences are thriving with them. Not only can you use blinking or opening your mouth to interact with the experience, but they can understand more complex expressions tied to human emotions, like putting on a sad or a happy face. We can also use voice control to interact with our experiences. And this is a very
interesting way to interact with the system, because it is something that we are already doing in our daily lives with devices like the Google Home or Alexa. It doesn't rely on hands to interact with the system, and when done well, it's a very natural way to interact with it, especially when it comes to commands. We can also use eye gaze, which is a much newer way to interact with this type of experience. We find it in more complex devices like the Magic Leap, where you can interact with things by looking at them. It relies a lot on calibration, and due to its nature, it can't be used for everything. But because it's really new, it's open to experimentation, and in the future it can be used for more complex ways of interaction. Regardless of the method of interaction that you choose to use, there are some things to consider across the board. First, don't rely on just one input, but choose the right one for the task. All of these ways of interaction are great, but some of them might not be suitable for some tasks. For instance, using gestures on a mobile device can be a bit tricky, since you have to use both hands: one to hold the phone and the other one to do the gesture. Second, use multiple inputs together when it makes sense. Combining different inputs when appropriate can make the experience more natural. So if you want users to interact with some digital content in the space, you can use direct manipulation to interact with the objects directly, or you can use the UI for users that are not familiar with AR yet. Third, give feedback. Use clear audio and visual feedback to show that the system is receiving an input. That can be by showing changes right away when users touch the screen, or some visual feedback to let them know that the system is listening when they're giving a voice command. The most important thing to remember here is that regardless of the method of interaction that you choose to use, your interactions have to be clear, intuitive, and responsive. Before we finish this
lesson on interaction, let's do another small exercise towards your final project. I want you to think about your experience, and I want you to think about the core interactions. By now, you might have a rough idea of what you want to do for your project, as well as who your users are and the type of device. And since you have the type of device that you want your users to use, it is important to think about the types of interaction that are available for it. Think about actions that you want your users to perform, like placing an object in the world, rotating it, or scaling it, and think about the type of interaction they can use to do it. Is it via the UI? Is it via direct manipulation? Is it voice control? Think about how these actions relate to an interaction that your users can do. So let's do a quick example. Imagine your experience is an AR tool to help you display paintings on your wall. So let's come up with different actions. For instance, you might have "place the painting"; that's one option. Then you might have "reposition the painting"; second action. A third one could be "scale it". So you have these three actions. Now try to think about the interactions, so the actual input that the user will perform in order to fulfill these actions. What should they do to position it? What should they do to reposition it? And what should they do to scale it? Now, do the same for your experience. Come up with some actions and the interactions that the user will have to do in order to perform them. And don't forget to share with the rest of the class in order for me to give you some feedback and help you towards your final project.
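As a sketch of this exercise, the wall-painting example above could be written down as a simple action-to-interaction table. Every action name and input below is an illustrative assumption, not taken from a real app:

```python
# Hypothetical action-to-interaction mapping for the AR painting tool
# described in the lesson. Names and gestures are invented examples.
interactions = {
    "place painting":      "tap on a detected wall (direct manipulation)",
    "reposition painting": "drag the painting with one finger",
    "scale painting":      "pinch with two fingers",
}

def describe(action):
    """Return how the user performs a given action, if it is defined."""
    return interactions.get(action, "no interaction defined yet")

# Listing the mapping makes gaps obvious: every action needs an input.
for action, gesture in interactions.items():
    print(f"{action} -> {gesture}")
```

Writing the table out, even on paper, quickly shows whether any action is missing an intuitive input for the device you chose.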
11. UX for AR: Onboarding and Error Managing: One of the most important aspects when creating an AR experience is the onboarding. This refers to how a person enters an experience. Is there a tutorial, or is it self-explanatory enough that you don't need to guide your users through the beginning of the experience? This is very important because, depending on how you design this area, a person might completely quit it or feel engaged enough to continue with it. As we've said before, many people have never experienced an AR or 3D environment before. So when users first interact with it, they might need some help or guidance to get them through it, and onboarding plays a key role in designing a good UX. You might be thinking of tutorials or step-by-step guides, but one good rule is to make the onboarding part of the experience. There are three things that we can do to make this initial encounter with our experience much better. The first thing is to avoid teaching users what they have to do beforehand. Instead of teaching them everything at once at the beginning, it is really important to show instructions in a contextual way. You don't need to give them all the information all at once, but whenever they need it. So, e.g., imagine that in your experience users have to scan a floor or a surface. Instead of telling them beforehand that they will need to do this and what to do if they don't manage to do it, show them step by step and guide them through the process. This way, they won't have to remember a series of instructions; they will know how to do it while they are doing it. Related to this, obviously, is to guide them, especially at the beginning. Use cues and UI to allow them to move around and explore the world around them. This way, they will feel more confident when exploring the world, and they won't be expecting instructions all the time. So, in other words, just show instructions when you need to show them. Otherwise, just allow users to explore freely. And the third thing is: don't reinvent the wheel. You don't need to come up with fancy interactions to do things that already work. Users know how to tap, how to drag, how to slide. So if you can use these interactions in your experience, instead of coming up with something completely new
that you have to teach them, just use those interactions. By doing that, by using them, you won't have to be constantly teaching users what to do. Let's see an example. This is Superbloom, an app developed by Niantic and Preloaded. It allows you to fill your space with flowers, in honor of the Queen's Platinum Jubilee. The first time you open the app, it gives you tips like "stay aware of your surroundings so you don't bump into things". But the main interaction of the app is to scan your floor and to add flowers to it by planting some seeds. So at the beginning, when it gives you tips, instead of telling you right away what you'll have to do, like "you will have to scan the floor", it allows you to first open the camera and enter the AR mode. And once it opens, it guides you through scanning the floor. This is a very simple onboarding process, but it works, and it does exactly what we've been talking about. It doesn't give you all the information all at once. It allows you to move around, so you have a sense of exploration, and it doesn't reinvent the wheel; it does something that works. The information is shown at the right moment, and it even tells you when it's working, by showing the circle in a green color, and when it's not working, by showing it in red. The same thing applies to error messaging. This new technology might bring moments when users don't know what to do. As a UX designer, use the same logic: you don't have to reinvent the wheel when showing error messages. The only difference here is that the problems might be different, and users might not be familiar with them or with how to solve them. So again, don't overload users beforehand with everything that could go wrong before it happens, but help them and guide them if there is an error. As a rule of thumb, do these three things: inform them that there was an error, by saying something like "something went wrong"; give them an idea of what went wrong, like "we couldn't scan the floor" or "we couldn't find a surface"; and then help them solve it. What could be the causes of this problem? Maybe the room is not well lit, or maybe the user is not pointing at the floor directly. So give them the steps. By doing these things, although it doesn't sound like a lot, you are taking into consideration that the user might not be familiar with this type of experience and that they might encounter errors that they will need help with.
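The three-step rule above (say that something went wrong, say what went wrong, and help solve it) can be sketched as a tiny message builder. The error codes, wording, and recovery steps here are invented for illustration; they are not from any real SDK or app:

```python
# Hypothetical catalog mapping error codes to a cause and recovery steps.
ERROR_HELP = {
    "no_surface": (
        "We couldn't find a surface.",
        ["Point the camera at the floor", "Move to a better-lit room"],
    ),
    "tracking_lost": (
        "We lost track of your space.",
        ["Move the phone more slowly", "Return to where you started"],
    ),
}

def error_message(code):
    """Build a user-facing message following the lesson's three steps:
    1) generic error line, 2) what went wrong, 3) how to fix it."""
    cause, steps = ERROR_HELP.get(code, ("An unknown error occurred.", []))
    lines = ["Something went wrong.", cause]
    lines += [f"- Try this: {step}" for step in steps]
    return "\n".join(lines)

print(error_message("no_surface"))
```

The point of the structure is that even a user who has never seen an AR error before gets told what happened and what to try next, rather than just a failure state.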
12. Key Takeaways: We've come to the end of it. This is the last lesson of this class, and I know that we've covered a lot. The concepts that we've talked about could each be worth a class of their own. However, I really want to say congratulations on having come this far. AR might be a daunting subject, but now you have the key elements to get started with it and to dive deeper into some of these areas. Before we end, let's do a quick summary of the key takeaways for this class. In it, we've discussed the fundamental areas of AR, that is, the different types as well as the devices and the technology. Remember, AR is a technology that integrates with your real world, with your environment. It can do so either by adding to it, by extending it, or by modifying it. It will never cover your world; otherwise, that would be virtual reality. Remember to define the context and the environment: which devices are your users going to use? If you want a very niche, targeted audience with access to the latest technology, then go for a headset. But if you want anyone to use the experience, you should go for mobile. In terms of technology, where will this experience be published? Is it a website, a bespoke app, or a social media platform? The interaction might differ, as well as the technology that you are going to use. As with any UX project, you need to design for your users. You are designing for people that might have never used one of these experiences, so you should never forget that. Make the experience engaging; allow users to interact and do things. They are not just viewers. Remember, we want users to feel presence in these experiences, to feel like they have agency and impact on what they're seeing, and that they can experience it without difficulties. Don't overload them with information. The focus here is on the 3D world, so if anything is not needed on the screen or on the UI, hide it. But at the same time, don't hide the experience. Experiment with the field of view, but remember, anything outside of it might be completely lost if you don't guide users to it. Finally, remember, onboarding is key. Users might not know what to do, so don't reinvent the wheel. If it works, it works; and guide the users through contextual interactions. In terms of error messaging, inform users of the errors and help them solve them. I hope this class has been helpful for you, and I really encourage you to work on the final project and share it with the rest of the class.
13. Conclusion: Congratulations on having finished this class. Now it's your turn to work on your project. Throughout the lessons, we've introduced some small exercises to rough out the ideas for your experience, so now it's your turn to put everything together. To make it easier, in the class resources you'll find a template to help you summarize the key aspects of your experience. However, let's do a quick recap of what's needed. First of all, define your idea. Summarize the problem that you want to solve, or your unique AR experience, in a short sentence. Then define who your users are. And also define the context: what type of device do you have in mind? Where do you see your users using this experience? And, most importantly, who are your users? Then create a high-level user flow, including the main areas: the onboarding (how do users enter the experience?), the core experience (what are the main interactions in it?), and some error managing (how do you foresee errors happening, and how do you see them being solved?). If you feel like it, add a representative image to visually show your project. This is optional, but it can help you see how this could look visually. Below, you'll also find a project done by myself, so you have a clear idea of what the brief is, and maybe it sparks some inspiration. Remember, your project can be about anything, but it has to be in augmented reality. And don't forget to post your project on the project gallery for everyone to see. And there you go: you have the idea and the structure for your first augmented reality project. At this point, I just want to congratulate you again and thank you for following this class until the very end. If there is one thing that I want you to leave with, it is that anyone, including you, can design for augmented reality. This is a new medium, but it is here to stay. And if we've proven something, it is that you don't really need to be a technical expert or have a lot of experience to design for it.