
Make a SpaceX bot for Google Assistant

Xavier Decuyper, Keep learning!


24 Lessons (1h 59m)
  • 1. Course introduction (1:24)
  • 2. What we're going to build (2:30)
  • 3. Architecture overview (3:02)
  • 4. What is Dialogflow? (4:50)
  • 5. Dialogflow console tour (9:03)
  • 6. Create your first intent (5:59)
  • 7. Creating and using entities (10:04)
  • 8. Training your bot (3:19)
  • 9. Create "Next SpaceX launch" intent (5:29)
  • 10. What are fulfillments? (1:36)
  • 11. Your first fulfillment with Firebase Functions (7:46)
  • 12. Integrate with SpaceX API (12:28)
  • 13. Debugging Firebase Functions (7:50)
  • 14. Handling user input (7:41)
  • 15. Rich responses: send images to user (3:16)
  • 16. Rich responses: send suggestions to the user (3:36)
  • 17. Rich responses: Cards (3:42)
  • 18. Writing code locally on your computer (4:52)
  • 19. Organizing the code nicely (5:24)
  • 20. Integration flexibility (1:12)
  • 21. Integrate with Google Assistant (5:47)
  • 22. Integrate with Facebook Messenger (5:45)
  • 23. Switching to Dialogflow API v2 (1:50)
  • 24. Conclusion and next steps (0:59)

88 Students

-- Projects

About This Class

Chatbots are great for asking quick questions, but how do you create one yourself?

In this course you’ll learn how you can extend the Google Assistant by creating your own “Action on Google”.

The techniques used in this course will also work for other assistants such as Amazon Alexa and Microsoft Cortana.

Your bot will also work on regular chat platforms. Without changing anything, you will be able to link your bot up with Facebook Messenger, Skype, Slack, Telegram, Twitter, ...


Meet Your Teacher


Xavier Decuyper

Keep learning!

Teacher

Hello, I'm Xavier, a passionate developer with a wide range of interests. I have worked on countless projects and I'm always looking for new and exciting stuff. Through Skillshare I want to pass along some of my knowledge to you!

See full profile

Class Ratings

Expectations Met?
  • Exceeded! 0%
  • Yes 0%
  • Somewhat 0%
  • Not really 0%


Transcripts

1. Course introduction: Hi there, and welcome to this course about building your own actions for the Google Assistant. My name is Xavier. I'm a professional developer, tech enthusiast and YouTuber. During this course I will show you how you can extend the Google Assistant by writing your own applications for it. These are called Actions on Google, and they work on Google Home speakers, but also on Android devices and even on iOS devices. To kick off the course, I'll start by showing you what we're going to build. Spoiler alert: it's going to be a bot that knows everything about SpaceX. To do this, we will use Dialogflow to understand what people are asking, and we will use JavaScript to create some custom logic. The architecture that we're going to build is flexible. Building an Action for Google is nice, but we also want to make sure that our architecture can support other platforms in the future. The techniques that I'll show you will make your application compatible with platforms such as Messenger, Alexa, Cortana, Skype and more. When you're finished with this course, you will be able to make your own applications for the Google Assistant while at the same time supporting other platforms. So what are you waiting for? Voice assistants are the next big thing, so take this course right now and learn how you can extend the Google Assistant, or any other service for that matter.

2. What we're going to build: So let's take a look at what we're going to build during this course. As I said in the introduction, it's going to be an Action for Google that is capable of answering questions about SpaceX, questions like "When is the next launch?", "What is being launched?" and so on. Here's a short clip of the end result running on my Google Home. "Hey Google, let me talk to SpaceX bot." "Alright, getting the test version of SpaceX bot." "Hi, I'm SpaceX bot. What question can I answer for you today?" "When is the next launch?" "The next launch is on May 7, 2018." "How many times has a Falcon 9 launched?" "The Falcon 9 has launched a total of 54 times. 52 launches were successful, while two resulted in a failure. That's a success rate of 96.30%." "What's the company's valuation?" "The company is valued at $15 billion." "How tall is the Falcon Heavy?" "The Falcon Heavy has a height of 70 meters." "Thank you." "Bye." It works great, and notice that I don't have to learn how to speak to it: I can just use my own words, and the algorithm will figure out what I meant and how to answer my question. So I can ask the same question in different ways and it will still understand what I mean. I can ask how high the Falcon 9 is, but I can also ask how tall or how large it is. As I mentioned in the introduction, we're not going to limit ourselves to integrating with the Google Assistant. Here's the exact same application running on Facebook Messenger. As you can see, we receive the exact same answers as we did before. However, because Messenger is used on a device with a screen, we can show more information and include images and even suggestions for additional questions. The same thing goes for when you're using the Google Assistant on your phone, for example. So that's what we're going to build during this course. If you're interested, continue with the next video, where we're going to take a look at the architecture that will allow us to build all of this cool stuff.
3. Architecture overview: Now, before we get our hands dirty, let's take a look at the things we need to actually make this work. An Action for Google is basically a chatbot that is controlled by the voice of the user. So the first thing we need is a way to convert speech into text. Then we need something to interpret this text and make sense of it: we need to understand what the user's intention is and what information he is looking for. Once we know that, we have to perform some actions to fetch the information that the user has requested, and finally we have to give the user a response. So let's take a look at what we're going to use to accomplish these three things; we don't have to do them all ourselves. Let's start with speech. Luckily, this is handled by Google itself: the user talks to the Google Assistant, and Google's artificial intelligence will transcribe the speech into text. Now that we know what the user said, we have to figure out the meaning. So we will connect the Google Assistant to a service called Dialogflow. This is a free service that takes text as input and tries to recognize the user's intention. It's pretty awesome, as you will see later on in this course. Fun fact: Dialogflow started as api.ai and was acquired by Google in 2016. Once Dialogflow has identified what the user wants, we need some code to actually give the user an answer. You can write this logic in any programming language you want; the only requirement is that you can put it up on a server somewhere and allow Dialogflow to access it over HTTP. Now, because Dialogflow is a Google product, they recommend using a Firebase Function, which is essentially the same as Google Cloud Functions. This is a platform that takes JavaScript code and runs it all for you: Google will take care of provisioning servers, securing them and scaling them should that be required. If that's not a good enough reason, consider this: it's crazy cheap. They give you your first 2 million requests per month for free. Our custom Firebase Function will then talk to a public API that has a ton of information about SpaceX, and once we've collected the data we need, we'll create a response for the user and pass it back to Dialogflow, which will then pass it back to the Google Assistant, which will pass it on to the user. That all sounds pretty complicated, but we'll take it step by step and slowly build out this architecture. And because we're using Dialogflow, it's actually possible to replace the Google Assistant with any other type of assistant, or even chat services. So we can integrate our application into a platform like Facebook Messenger, for instance. So that's an overview of the architecture that we're going to build during this course. In the next video, we will take our first steps and start exploring the Dialogflow console.
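To make the "any language, any server reachable over HTTP" idea concrete, here is a minimal sketch of what a fulfillment webhook could look like, written with Node.js and Express rather than the Firebase inline editor used later in the course. The field names follow the Dialogflow v2 webhook format (the course itself starts on v1), and the intent name borrows the company-employees intent created later on, so treat this as an illustration of the request/response contract rather than the course's actual code.

```javascript
// Minimal sketch of a Dialogflow fulfillment webhook (v2-style payload).
// Any HTTP server in any language would work the same way.
const express = require('express');
const app = express();
app.use(express.json());

app.post('/webhook', (req, res) => {
  // Dialogflow POSTs the detected intent and its parameters as JSON.
  const intent = req.body.queryResult.intent.displayName;

  let reply = "Sorry, I can't answer that yet.";
  if (intent === 'company-employees') {
    // In the real bot, this is where we would call the SpaceX API.
    reply = 'SpaceX currently has about 7000 employees.';
  }

  // Dialogflow relays this text back to the Google Assistant (or Messenger, Slack, ...).
  res.json({ fulfillmentText: reply });
});

app.listen(3000);
```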
4. What is Dialogflow?: During the introduction I already gave you a quick summary of what Dialogflow is, but let's quickly check out the website to get some more details about the service. So here I am on the Dialogflow website, dialogflow.com. As the website says, Dialogflow allows you to build natural and rich conversational experiences, and it uses machine learning to understand what your users are actually saying. So if a user says "book a flight from Los Angeles to Hawaii for less than $300", Dialogflow can understand the meaning of the sentence: it can extract Los Angeles and Hawaii as two places in the United States, and it can extract the $300 as an amount of money. You can see that here: it's got geo-city Los Angeles, geo-state Hawaii, and you can see the price, which is 300, and the currency, which is of course US dollars. What's more, Dialogflow is backed by Google and it runs on Google's infrastructure. Dialogflow started as api.ai and was acquired by Google in 2016, and running on Google's infrastructure means it can scale to millions of users. Another important benefit is that when you integrate your Dialogflow bot with the Google Assistant, you get very little latency between the two services, because they run on Google's internal network. So that's another very important plus. Now, it's a Google service, but you can also integrate it very easily with other platforms, such as Facebook Messenger, Slack, Skype, Twitter, Cortana and more. In fact, Dialogflow has one-click integrations for these services: if you want to integrate with Slack, there's just one button you have to click, you set up some keys and some security parameters, and Dialogflow will handle all the rest itself. It also works across devices. So whether your users talk to the Google Assistant on their phone, in their car, on a smart speaker or on a smart fridge, it will all work with Dialogflow. And it's available around the world in 20 different supported languages, which is pretty great: you can build one bot in English, and when it's finished you can translate that same bot, with all the same logic, into another language. That's pretty useful. And then, of course, there's the "Who's using Dialogflow" section with some big names that are using the service, but we're not really interested in that right now. Another interesting aspect is the pricing of Dialogflow. If you go to the pricing page, you see that Dialogflow has basically two editions. There's a Standard Edition, which is completely free, and then there's the Enterprise Edition, which offers enterprise support and charges a fee for that support. During this course we will use the free edition. In fact, there's so much included in the free edition that even if you're building a bot for a small business, you'll still be able to use the free version and won't need the Enterprise Edition. If we click on "learn more", you can see the differences between the two in detail. The free Standard Edition is limited by some usage quotas; the Enterprise Edition has much higher quotas and you get support from Google itself, which could be pretty handy if you're rolling this out in a very large company. If we scroll down, you can see an overview of the differences between the editions. What's interesting is that the Standard Edition is free with unlimited text queries, and that's actually all we need: we don't need the voice interaction of Dialogflow at all, we're just going to use text queries, and that's more than enough. You can send three text queries per second through Dialogflow, so that's three requests per second for detecting what a user has said.
That's also plenty for what we're going to build during this course, and it's plenty for small businesses too. With the free edition you don't get a service level agreement, so Google doesn't commit to a certain uptime, and support happens through the community or through email. With the Enterprise Edition you pay per request, you get a higher number of queries per second, you get a service level agreement, and there's support from Google Cloud included. But as I said before, we're just going to stick with the Standard Edition; that's more than enough for what we're going to do. And that was a quick overview of what Dialogflow is. In the next video, we will log into the console, explore it a bit and create our very first agent, so let's get going.

5. Dialogflow console tour: Alright, now it's time to get our hands a bit dirty and take a look at the Dialogflow console. Here I am on the website again, dialogflow.com, and in the top right corner there's a button that says "Go to console". We're going to click on it, and because Dialogflow is a Google product, you of course have to log in with your Google account. If you already have a Google account, you don't have to set anything up: just click sign in, select your Google account, and you're automatically taken to the Dialogflow console. Now, the first thing we're going to do is create a new agent. Projects in Dialogflow are called agents, but during this course I might also call them an app or a bot or something like that. To do that, I'm going to click on this dropdown menu here. It shows a list of all the agents you've already built with Dialogflow, and at the bottom you can see "Create new agent". We're going to click on that and give our agent a name; we're going to call it "SpaceX bot", for instance. Then there are some parameters that we have to set up. The first is the default language of our agent. By default this is set to English and we're going to keep it like that, but in the future you can add more languages if you want to. We're also going to set the time zone; I'm going to pick GMT+1, which is the time zone I live in. Then there's a section called Google Project. Right now it says "Create a new Google project" and we're going to leave it like that. This means that Dialogflow will create a new project in Google Cloud for us, and with that project we can integrate very easily with Actions on Google (the Google Assistant), which is something we want to do, and we get access to Firebase Cloud Functions, which is something we're going to use to add the logic to our chatbot, or to our agent, I should say. So I'm going to leave it at "Create a new Google project". Then there's the API version that you can choose. Right now there's version 1, which is the current stable version, and there's version 2, which is newer and has some new formats for responses and requests, but it's currently in beta. So I will use v1 in this course because that's the stable version, and I might update the course in the future for v2. Alright, I'm going to click Create and let Dialogflow do its magic and create my agent. There we go: my agent has been created, and along with it a Google Cloud project, so I can use Firebase Functions and integrate with the Google Assistant down the road.
Now let me show you around in the Dialogflow console. On the left here you've got a bunch of options and tabs that you can use. The first one, as you've already seen, lets you switch between all the different agents you have created and open the settings of a particular agent. The next thing is a language selector. If I want our bot to support Dutch, for instance, I can add it here and then quickly toggle between languages. Next up is intents, and an intent is a mapping between what a user says and what action should be taken by our agent. This will make sure that a question like "how tall is the Falcon 9" or "what's the height of the Falcon 9" is recognized as the same intent and then gets processed by our logic. Then we've got entities, and entities help you extract certain values from the user's input. There are three types of entities: system entities, developer entities and user entities. System entities are defined by Dialogflow itself and they're pretty generic. To give a concrete example: if a user says that his favorite color is blue, the system will detect "blue" as a color entity. Those are built into the system, ready for you to use. But you can also extend Dialogflow and create your own entities. To answer certain questions about SpaceX, our bot will need to know which rocket the user wants more information about, and to detect that we can create a custom rocket entity that contains the Falcon 1, Falcon 9 and Falcon Heavy, for instance. Once we've done that, Dialogflow will recognize these rockets for us and pass them along to our custom logic. And finally there are user entities, which we won't discuss during this course, but they basically allow you to temporarily assign entities to a user. That could be useful, for instance, when you're allowing a user to manipulate his personal data. Then there's a training tab. Artificial intelligence is great, but it has to be trained in order to be really good. It's possible that your agent won't understand certain sentences straight away, and when that happens you can see those sentences here and teach your bot how it should have responded, or which intent it should have triggered for a given input. By training your bot you can increase its accuracy, so it's quite important to keep an eye on this training tab and go through it from time to time. The next thing is integrations, where you can link your agent up with several services like, for example, the Google Assistant, Facebook Messenger or Slack. Most of these integrations are pretty easy to activate: you click a toggle, fill in some settings like callback URLs, verify tokens, stuff like that, and Dialogflow will take care of communicating with that platform and passing messages around. It's pretty nice that they support all these services straight out of the box. Next up is analytics. This is pretty simple: here you will be able to see how many people have used your bot, how many questions they ask per session, what the most popular intent of your bot is, and so on. Right now this is of course empty, because this is a brand new agent. Then there is fulfillment, and fulfillments allow you to create custom code to answer the user's questions.
This is pretty useful, because Dialogflow itself can only respond with simple text answers. If you need to fetch some information from an API or from a database, you can do that with custom code by setting up a fulfillment. As I said in the introduction, you can write your code, your logic, in whatever language you want, as long as you can put it up on a server somewhere and have it accessible over HTTP. If you have that, you can use this webhook right here: you give it the link to your PHP script or your Ruby on Rails script or whatever, and Dialogflow will use your service to get an answer for the user. Or you can use Firebase Functions, which is what we're going to do during this course. Basically, this is a service that lets you upload some JavaScript code, and Google will take care of provisioning servers, scaling them, securing them and what not. Because Dialogflow is a Google company, they of course recommend this, and it's actually also the fastest option, because your Firebase Function and Dialogflow operate on the same network, which makes your agent a little bit faster. The next thing I want to go over is prebuilt agents. Sometimes it's not necessary to create all the intents from scratch: these prebuilt agents are basically templates that you can import into your project if you want your agent to handle certain actions. So if your agent should support setting alarms, for instance, you can import this agent right here. When you do that, Dialogflow will create a bunch of intents and entities, basically everything it needs to perform these actions, import them into your project, and then you're ready to customize them however you like. The last thing we're going to look at is Small Talk. It's likely that your users will start asking your bot some personal questions, like when it was born, or how old it is, or how it was created, stuff like that. If you want to handle this informal chatting, you don't have to create custom intents for it, and in fact that would be very hard to do. Instead, you can use the Small Talk module to customize how your bot answers certain questions. If I open up this first section here, "About this agent", you can see the kind of questions that users could ask your bot, and you can define some answers here, for things like "how old are you", "you're annoying" or "answer my question". Enabling the Small Talk module makes your agent feel more human, and you can use it to give your bot a unique personality: you could make it funny, or grumpy, or very neutral. So that was a little overview of the Dialogflow console. In the next video, we will create our very first intents.

6. Create your first intent: Alright, so here I am again in the Dialogflow console, and in this video we will create our very first intent. In the console I've opened up the intents, and as you can see we already have two: the Default Fallback Intent and the Default Welcome Intent. The Default Welcome Intent is the one that welcomes your user: it will say hi, hello, how are you, things like that. The Default Fallback Intent is executed every time your bot doesn't really understand what the user wants to do. You get these intents by default, but you can customize them.
If you open up the Default Fallback Intent and scroll down, here are the responses that your bot can give when it doesn't really understand what the user means. That could be "I don't get it", "Can you say that again?", "I missed what you said, say it again please", things like that. As you can see, there are 12 responses that your bot can give when it doesn't understand what the user wants to do. You can add responses to this list, or you can customize the existing ones however you like. What's interesting is that you can give multiple responses for the same intent, and your bot, or your agent, will randomly select one of them to make it feel more human; otherwise, if your bot always responds with the exact same text, it becomes a little boring after a while. So you can add multiple text responses here. If you go back to intents, there's also the Default Welcome Intent, with things like "Hi", "Hello", "Good day", "Greetings", stuff like that. It's the first message your user receives when he opens up your bot. For the welcome intent I'm going to delete all of these, and I'm just going to say: "Hi! My name is SpaceX bot. What question can I answer for you today?". That will be my welcome intent for this bot, and I'm going to click Save. Now let's also create an intent ourselves. This bot will be able to answer multiple questions about SpaceX, and one of those questions will be: how many employees does SpaceX have? So I'm going to start by creating a new intent for that, and I'm going to call my intent "company-employees". "Company" is in there because this is a question related to the company, and we're going to add more questions like these, related to the company, in the future. But you could name this whatever you like; just make sure you know what each intent does. Then we're going to scroll down to training phrases and click "Add training phrases". Here we can teach Dialogflow how the user might ask for the employee count. One way a user can ask that is by saying "how many people work at SpaceX?". But the user could also say "how many people work for SpaceX", or "how many employees does SpaceX have", or "how many engineers are working for SpaceX". What's important here is that you give Dialogflow a bunch of phrases that can trigger this intent, and that they are phrased very differently. It's important to specify as many example questions as you can right here, because they will improve the accuracy of your agent. Now, if I click Save, we will be able to test it. On the right-hand side of Dialogflow you have the option of testing your agent: you can type in here and actually chat with it. You can see it has done some training already because I've added a new intent. So now I can say "how many employees does SpaceX have?", and if I scroll down, you can see there is no response from Dialogflow yet, but it has recognized my intent: it says you probably want company-employees. That's exactly what I want, and I can phrase it however I like. I can also say "how many employees", very short, very to the point, and it also correctly recognizes my intent, even though "how many employees" is not in the training phrases.
The machine learning algorithms of Dialogflow have recognized that by saying this, I most likely want to know how many employees the company has, and that's pretty cool. But obviously we also want our bot to give a response. If we scroll down, here are the responses. I can add a response and say "Well, there are about 7000 SpaceX employees right now", or add another response and say "That would be 7000 people", something like that. But of course this isn't really future-proof. These answers are hard-coded, so if tomorrow SpaceX has 8000 employees, I have to go into my intent and change this text response to say that there are now 8000 people employed by SpaceX instead of 7000. That's not very maintainable. What we'll do later in this course is use an API to fetch this information in real time, and we'll do that through a fulfillment. But again, that's for a later video. So that was it for this video. In the next video, we will create an intent that uses a custom entity, and we will make an entity to detect which rocket the user wants more information about. That could be the Falcon 1, the Falcon 9 or the Falcon Heavy. So continue along with the next video.

7. Creating and using entities: In the last video we created a very simple intent that is capable of answering the question "how many employees does SpaceX have?". This intent doesn't talk to an API yet, but that is something I'll address in the next chapter. This intent, however, is very basic. I also want more advanced intents: I want users to be able to ask the bot for more information about a specific rocket, for instance. To do that, we first have to know which rocket the user wants more information about, and to grab that, we have to create a custom entity. So I'm going to go over here to entities and click the plus icon to make a new entity. I'm going to call this entity "SpaceX rocket", and down here we can define all the rockets, all the items, that fall under this entity. SpaceX currently has three rockets, so I'm going to add one entry for each rocket, starting with the Falcon 9. Here there is the reference value, and then we have to enter some synonyms. So I'm going to call this one "Falcon 9", and then we're going to enter some synonyms: that could be "Falcon 9", "F9", and let's say "falcon nine" written in full. We'll do the same thing for the older one, the Falcon 1 ("F1", "falcon one"), and then we'll create one for the latest and greatest, the Falcon Heavy. There we go. Now, the value on the left here, the reference value, is sort of like an ID. If Dialogflow recognizes this entry, it will pass this reference value to our custom fulfillment, and then we can work with it. So if the user wants more information about the Falcon 9, and he used one of these forms, Dialogflow will pass the reference value "Falcon 9" to our custom fulfillment. That's pretty handy, because the reference value is always the same, no matter which synonyms you have defined for it. So that is our custom rocket entity. I'm going to save it, and now we can create an intent that actually uses this entity. I'm going to click the plus icon here on intents, and we're going to call this intent "rocket details" and make some training phrases for it. This intent is responsible for handling questions like "What is the Falcon 9?".
If I type this training phrase, notice that Dialogflow highlights "Falcon 9", and beneath it, it has created a parameter for this: it has detected that the Falcon 9 is of entity type SpaceX rocket and that the resolved value is "Falcon 9". Pretty awesome, right? So I can add some more training phrases. I can also say "Tell me more about the Falcon Heavy" or "Give me some details about the Falcon 9", and as you can see, with each of these training questions Dialogflow already extracts the rocket type from the sentence. That's pretty awesome: we just created the intent, and Dialogflow did all of the hard work. Now, it sometimes happens that Dialogflow doesn't recognize these entities straight away. When that happens, you can just select a piece of text and manually match it up to an entity, and when you do, Dialogflow will learn from it and improve itself, so that's pretty useful. Now, more towards the bottom there is a section called "Actions and parameters", which gives an overview of all the parameters Dialogflow can detect in this intent. We can see that our SpaceX rocket entity is already here. In this section I can also say whether a parameter is required, or whether a parameter can contain multiple items, in other words whether it's a list. For this intent to work, we always need to know which rocket the user wants more information about, so I'm going to make the SpaceX rocket a required parameter. When I do that, notice that there is now an option called "prompts". Here we can define what Dialogflow should say to the user when it doesn't get this parameter. So if the user asks "give me more information about the rocket", Dialogflow has to prompt the user: hey, which rocket do you want more information about? These prompts can be defined here, so if I click on it, we can define multiple prompts. I could say "For what rocket do you want more information?" and add an alternative, "What rocket do you want more info about?". And that's it; those are the two prompts we're going to define. I'm going to click close, and now our agent should be capable of detecting when a user wants more information about a specific rocket, and also of extracting which rocket he wants details about. But let's make it a little more complex. What if the user wants to know how high a specific rocket is, or what diameter it has? Well, to handle that we'll create another entity. But before we do that, let's just save this intent and go to entities. Let's create a new one, and I'm going to call this one "rocket property". This will contain all the properties that a rocket can have. A rocket has a few properties: it can have a height, a diameter, and a weight. Now we can define some synonyms here. Height is also how tall the rocket is, the length of the rocket, the size, how high it is or how tall it is. The diameter could also be called the width of the rocket, or how wide the rocket is. And the weight can be the mass of the rocket, how much the rocket weighs, or how heavy the rocket is. Now notice that right here we have an option called "Allow automated expansion". When you enable this, Dialogflow will automatically try to recognize values that haven't been explicitly listed here. Let's say that you don't have "tall" here in the list of synonyms for height.
When you enable automated expansion, it's possible that Dialogflow will automatically add "tall" to this list, automatically expanding it. This sounds like a very cool feature, but be careful with it: in the official documentation, Dialogflow says that having multiple entities with the automated expansion option enabled can cause issues and unexpected results. So we're going to keep it off for now. Alright, I'm going to save this entity, and then we're going to go back to our intent and expand it a little bit. I'm going to go to rocket details and add a few training phrases. We're going to say "What's the height of the Falcon 9?", and before I add more training phrases, you can already see that Dialogflow now recognizes two parameters: the rocket property entity and the SpaceX rocket entity. That's awesome. Let's add some more training phrases: "How much does the Falcon 9 weigh?", "What's the diameter of a Falcon 9?", "How heavy is a Falcon 9?". In all of these cases it detected both the rocket and the property that we're asking about. Now let's scroll down again to actions and parameters, and notice that the rocket property parameter has also been added to this list. I could make this one required, but for this intent it's not actually required: if the user asks about the Falcon 9 in general, we'll give him a summary of everything we know about the Falcon 9, but if the user asks specifically for the height, we'll answer only with the height of the rocket. We'll do that later on in the course, when we create a custom fulfillment. Alright, that's it for now. Let's save this intent and actually try it out. I'm going to click Save, and as you can see this starts the agent training, so I'm just going to wait until it completes and then we can test our bot. Alright, training is complete. Let's ask it a question here on the right-hand side. I'm going to ask: "How high is the Falcon 9?". When I submit it, you can see that it correctly recognizes my intent as rocket details, and it has extracted two parameters from my question: the SpaceX rocket, which it has detected as Falcon 9, and the rocket property, which it has detected as height. That is indeed my intent: I want the height of the Falcon 9. Let's try another one, say "What's the height of the Falcon Heavy?", and again it has detected the correct intent and mapped both parameters. As I said at the beginning of this video, we're going to use these parameters later on in this course to answer the user's question more specifically. So that was it for this video. In the next video, we will take a closer look at the training tab to see what we can do there.
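As a small preview of how those extracted parameters eventually reach our code: the dialogflow-fulfillment library used later in the course exposes them on the agent object as `agent.parameters`. The sketch below is only illustrative; the parameter key names ("SpaceXrocket", "rocketproperty") are assumptions and must match whatever names you gave the parameters under "Actions and parameters" in the console.

```javascript
// Illustrative handler sketch: reading the parameters Dialogflow extracted
// for the "rocket details" intent. The parameter key names are assumptions
// and need to match the names defined in the Dialogflow console.
function rocketDetails(agent) {
  const rocket = agent.parameters['SpaceXrocket'];      // e.g. "Falcon 9"
  const property = agent.parameters['rocketproperty'];  // e.g. "height" (optional)

  if (property) {
    agent.add(`You asked for the ${property} of the ${rocket}.`);
  } else {
    agent.add(`You asked for general details about the ${rocket}.`);
  }
}
```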
8. Training your bot: In the last video we created a more complex intent, and we tested it a few times here in the Dialogflow console. All our tests went very well: Dialogflow correctly recognized what our intent was, and it even recognized some parameters. However, there will be times when the system doesn't work perfectly and the artificial intelligence algorithms do something weird. When that happens, you can go over to the training tab and see a list of all the questions that people have asked your bot. If I open it up, you can see that I don't have many questions here yet, but over time this will fill up nicely. When you click on one of these messages, a dialog pops up with more information about it: it shows what the user asked, which parameters Dialogflow extracted from it, and which intent it recognized. If something is wrong, you can correct it right here. Let's say that in this example, "give me some details about the Falcon 9", it has incorrectly recognized the intent. To correct it, I can simply click on the intent and select any other intent from my project. I can also teach Dialogflow about entities: if it has missed an entity, or wrongly recognized one, we can change that right here. So if it has wrongly recognized the Falcon 9 as an entity, I can remove it, or if it didn't detect a certain entity, I can select some text in the user's query and link it up to an entity within my project. Now, in this particular case both the intent and the parameters are correct, and when they are correct, we can tell Dialogflow so: this is reinforcement learning. So I say to Dialogflow, yes, this is correct. When I do that, Dialogflow will add this sentence to the training phrases of our intent, so the next time someone sends a similar message, Dialogflow will know how to handle it. When I'm done reviewing all these questions, I click Approve, and Dialogflow will take my feedback, put it back into my agent and start training, as you can see here. The training tab in Dialogflow is very important when you've just launched your bot. Keep an eye on the "unmatched" column here, because every time Dialogflow doesn't know what to do with the user's input, it marks it as unmatched, and you can then go into the training tab and link that user input up with an intent so that Dialogflow learns how to handle it. So that is a quick overview of the training functionality in Dialogflow, and again, it's important to keep an eye on this when you first deploy your agent; it's likely that in the early days the artificial intelligence will need a helping hand. So now you have had a complete tour of the Dialogflow console. In the next video, I will add a few other intents to our agent, and then we're almost ready to jump to the next chapter, where we'll finally write some custom JavaScript code so that we can answer users' questions based on data served up by an API.

9. Create "Next SpaceX launch" intent: Before we continue with the next section of this course, let's add some more intents. In this video I'm going to add the "next launch" intent, which will be able to tell the user when the next SpaceX launch is scheduled. This intent will be a bit special, because it will have a follow-up intent. So I'm going to create a new intent here in Dialogflow, call it "next launch", and give it some training phrases. The user will be able to ask things like "When is the next launch?", "What's the next launch?" (although that wouldn't be very correct English), and just "next launch", very short. Now, I also want to make it possible to ask for the next launch of a specific rocket, so things like "when is the next Falcon 9 launch?" or "when will the Falcon 9 take off again?". And as you can see, Dialogflow again automatically detects our SpaceX rocket entity, which is great. Then I'm going to scroll down.
I'm going to add a basic response here, and I'm just going to say "The next launch is on ...". The details will be filled in once we create a custom fulfillment to talk to an API, which we'll do in the next section of this course. Alright, I'm going to save this intent, and now the user is able to ask when the next launch is. However, after asking when the next launch is, the user might be interested to know what exactly SpaceX will be launching. This is a special kind of intent, called a follow-up intent, and the biggest difference between normal intents and follow-up intents is that with follow-ups, Dialogflow remembers the context. Here's a simple example: when the user asks "what is the Falcon 9?" and afterwards asks "how tall is it?", our agent should understand that the user is still talking about the Falcon 9. It has to keep track of the context of the conversation, and follow-up intents take care of this automatically. So I'm going to go back to the overview of our intents, and when I hover over the next launch intent, you can see that there is a button here called "Add follow-up intent". Let's click on that, and Dialogflow presents us with a list of common follow-up intents. We want to make it possible for the user to ask for more information, so we will use the "more" intent. If you want to handle a different follow-up intent that is not in this list, you can just create a custom follow-up intent, but for this one we're just going to click "more". Okay, this has created our follow-up intent, and as you can see, it has put it below our original intent. You can even collapse and expand this section right here, so if you have a lot of follow-up intents, you can hide them all at once. I'm going to edit this intent: I'm going to click on it, and the first thing I'll do is remove the spaces in its name. That's not required, but it's a convention of mine to have no spaces in my intent names. Now we're going to scroll down to the training phrases, and notice that Dialogflow has already populated this list with commonly used phrases for "more" intents: "more results", "show me more", "tell me more", "more information", all that kind of stuff. But we're going to add a few that are specific to our bot, to our agent. These can be "what will they be launching?", but the user could also ask "more info about this launch, please", or "what is the payload of the launch?", things like that. That should do it for now. Let's also make it respond with a simple answer: I'm going to scroll down, and here in the text response I'm just going to say "Here is more information about the next launch". Again, this is just a placeholder response; we will make it dynamic later on in this course. So I'm going to save the intent, and this will trigger agent training; we'll just wait until that is finished. When it's finished, we're going to try out our bot. We ask it "When is the next launch?", and it responds with "The next launch is on ...". And as you notice, it now has a context: the next launch follow-up context. So if I now ask "what is SpaceX launching?", it responds with "Here is more information about the next launch", because it remembers that we're in this next launch context.
So it knows that we're not asking for more information about SpaceX in general, but specifically for more information about the next launch. Follow-up intents are a great way for users to drill down and get more details about a specific thing. So that was it for this video and also for this section. In the next section of the course, we will create some custom JavaScript code to actually answer these questions by fetching data from an API.

10. What are fulfillments?: Up until now we have explored the Dialogflow console, and by now you should have a pretty good understanding of how it works and how you can use it to detect the intent of a user. However, Dialogflow alone isn't enough to create a good Action for Google. Here's an overview of the complete architecture that we're going to build. So far we've only looked at Dialogflow, so our agent is only capable of detecting the user's intent and responding with a static answer. In this section of the course we will be looking at the next part: connecting Dialogflow to a Firebase Function so we can grab information from an external API. I touched upon Firebase Functions during the introduction of this course, but let's quickly go over some details. Firebase Functions, which is run by Google, is basically a service that allows you to upload some JavaScript code, and Google will take care of hosting it. This falls under the serverless model: you only provide the code, and Google will make sure that it scales incredibly well. It's also pretty cheap: the free tier of Firebase Functions gives you two million invocations for free each month. Bandwidth costs are also very low. You don't have to pay for traffic between Firebase and Dialogflow, because they're two Google products; you only have to pay for requests made over the internet. But it's not expensive, and we don't need much of it for this app anyway. Alright, that was it for this introduction. In the next video, I will show you how to set up a custom fulfillment and how to link it up with one of our intents.

11. Your first fulfillment with Firebase Functions: Alright, time to roll up our sleeves and write some code. Here I am, back in the Dialogflow console, and let's open up the fulfillment tab. To create a custom fulfillment you basically have two options: you can use a webhook, or you can use a cloud function with the inline editor here. Webhooks are useful when you want to host your code on your own server somewhere, but in this course we will stick to a Firebase Function: it's fast, and Google gives you a very generous free tier. So I'm just going to enable the inline editor, and that activates the code editor right here. Don't worry, we will only use this online editor for a few videos; later on in the course I will show you how to use a regular code editor on your computer and then push that code to Firebase Functions. Now, as you can see, Dialogflow has already provided us with some code. They provide a skeleton, and it's actually a pretty good skeleton, so let's take a look at it. They start by importing the firebase-functions library, and then this library right here: dialogflow-fulfillment. If we take a look at the GitHub page for this project, we can see that this library makes creating fulfillments for Dialogflow agents easy and simple.
What's more, this library allows you to create responses, and Dialogflow will translate them into the format that each of the eight supported chat and voice platforms needs before giving them back to the user. What's so great about this library is that we can create rich responses with it: besides plain text answers, we can also send users cards or pictures, or even suggestions for follow-up questions. But more about those rich responses in a later video. Alright, let's go back to the code. They then set a debug variable, and here they export a function called dialogflowFirebaseFulfillment, which will be executed by Dialogflow every time an intent needs the fulfillment, which for us is going to be almost every intent, because we need information from an API. Inside this function they start by creating an agent, which is basically a WebhookClient. The WebhookClient comes from the library I just showed you on GitHub, and they give it the request and the response that they receive right here from Dialogflow. The request object contains a lot of information, such as which intent was triggered, which parameters Dialogflow has extracted, how confident it is about the intent it has detected, and so on and so on. The response object is something that can be used to give a response back to Dialogflow, and this is actually handled by the WebhookClient itself. Afterwards, they log some things to the console so we can see them: they log the headers and the request body. Then they define two functions, a welcome function and a fallback function, which each receive the agent object defined above. What the welcome function does is add a simple response to the agent that says "Welcome to my agent!", and the fallback function does something similar, although it says "I didn't understand. I'm sorry, can you try again?".
At the bottom of this file we find an intent map, and this basically connects the intents from Dialogflow to our custom functions. A single Firebase Function, what we have right here in its entirety, can actually handle multiple intents. Here we can see that when our fulfillment receives the Default Welcome Intent, it triggers the welcome function, which is of course defined above, and when it receives the Default Fallback Intent, it triggers the fallback function, also defined above. Now that we understand how this code works, let's add our own code to it. Let's say that I want our fulfillment to handle the company-employees intent that we created in the previous section. I'm going to start by creating a new function here, and I'm going to call it getCompanyEmployees; you can call it whatever you want. Just like the other two functions, it receives an agent object, which we can use to give a response, and inside this function we're free to do whatever we want: we can go out to an API and fetch information, or we can connect to a database somewhere. I'm not going to talk to the SpaceX API just yet; that's something for the next video. Here I'm just going to send a simple text response. So I'll say agent.add and then "This is a simple response from our Firebase Cloud Function", and afterwards I'll add another response: agent.add "You wanted the employee count, but I can't fetch that yet." ("Yet" being the important word.) Alright, so that's the function. Now we have to let the library know what can trigger it, and that's our company-employees intent. So I'm going to copy the name of the function, scroll down, and add a new entry here: intentMap.set, then the name of our intent, which is "company-employees" (again, we defined that in the intents section in the previous part of the course), and then the name of the function it should trigger when it receives this intent. I'm also going to remove this commented-out code right here; these are just examples, so that cleans things up a little bit. Then all we have to do is click Deploy to push it live. Because this is the first time we deploy this function, it will take a minute or two to set everything up; in the background it will create a Google Cloud project for us. When we change this function in the future, it will only take a few seconds to deploy. Alright, it has successfully deployed our function, and now we need to configure our intent to actually use our fulfillment, to actually use our function. I'm going to go to intents, open up the company-employees intent, scroll down to the fulfillment section, click it open, and simply enable "webhook call" for this intent, and then save it. This option tells Dialogflow to ignore the static responses here and instead reach out to our Firebase Function and let it respond instead. Agent training is now complete, so let's try whether this actually works. On the right here I'm going to ask the question "How many people work for SpaceX?" and hit enter. As you can see, it took a little while for the cloud function to start up, but now it responds with "This is a simple response from our Firebase Cloud Function. You wanted the employee count, but I can't fetch that yet." This response comes directly from our Firebase Function, and that means that everything works correctly. In the next video, we will expand this function and actually answer the question of how many employees the company has by reaching out to the SpaceX API.
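For reference, here is roughly what the inline-editor code looks like after the changes described above: the Dialogflow skeleton with our extra handler and intent-map entry. The response wording follows the video; treat the rest as a sketch of the generated boilerplate rather than a verbatim copy of it.

```javascript
'use strict';

const functions = require('firebase-functions');
const { WebhookClient } = require('dialogflow-fulfillment');

process.env.DEBUG = 'dialogflow:debug'; // enables the library's debug logging

exports.dialogflowFirebaseFulfillment = functions.https.onRequest((request, response) => {
  const agent = new WebhookClient({ request, response });
  console.log('Dialogflow Request headers: ' + JSON.stringify(request.headers));
  console.log('Dialogflow Request body: ' + JSON.stringify(request.body));

  function welcome(agent) {
    agent.add('Welcome to my agent!');
  }

  function fallback(agent) {
    agent.add("I didn't understand. I'm sorry, can you try again?");
  }

  // Our own handler for the company-employees intent (static answer for now).
  function getCompanyEmployees(agent) {
    agent.add('This is a simple response from our Firebase Cloud Function.');
    agent.add("You wanted the employee count, but I can't fetch that yet.");
  }

  // Map Dialogflow intent names to the handler functions above.
  let intentMap = new Map();
  intentMap.set('Default Welcome Intent', welcome);
  intentMap.set('Default Fallback Intent', fallback);
  intentMap.set('company-employees', getCompanyEmployees);
  agent.handleRequest(intentMap);
});
```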
12. Integrate with SpaceX API: In the last video we configured our fulfillment to answer the question "how many employees does SpaceX have?". However, to answer that question correctly, we need to look up the employee count through an API, and I happen to have a tab open with the API we're going to use. It's an open-source API maintained by the community, so it's not an official API, but it's very accurate and it has a ton of information. Let's quickly look at the API documentation, which is available in the wiki. Here's a list of all the endpoints that this API supports. The first one is company data, and that sounds like the right one to fetch the employee count. Let's open up the documentation for this endpoint, copy the URL, and see what's behind it. Yep, that seems to be the one: the employee count is located right here. Let's take a look at another endpoint while we're at it. I'm going to go back and look at rocket data, for instance. Here we can see a list of all the rockets that SpaceX has, along with a lot of information about each rocket: what the cost per launch is, a description, what the diameter is, what engine configuration the rocket uses, and so on and so on. Just a ton of information on each rocket. Let's take a look at yet another endpoint: launches, for instance. Here we have a bunch of endpoints: we can fetch the latest launch, all the past launches, or all the upcoming launches. Let's do "upcoming" now, just to take a peek. Again we get a lot of information about each launch: where they will be launching from, when they will be launching, what the Reddit campaign is, a mission patch if it's available, a press kit, a link to a video, whether SpaceX will be reusing the capsule, the core, the fairings, which rocket they will be using. Just a ton of information is in this API. So, enough exploring; let's use the API to actually fetch the employee count in our Firebase Function. The employee count, again, sits in the company data, at this endpoint. To make a call to an API, to make an HTTP call in Node.js in general, you can use the built-in HTTP client or any other client you want. I'm going to use the request library; search for "request node" and you'll find it. I'm going to use this one because it's super simple and yet very powerful, but if you're more comfortable with another library, feel free to use that one instead. All you need to do is go out to a URL, fetch the JSON, and then we can work with it. So let's head over to Dialogflow and open up our fulfillment again. Alright, there it is. Now, normally, to install the request library you would run npm install, and that would take care of it. If you don't know what npm is, it's basically a package manager that can install external libraries for you and even keep them up to date. However, the npm command is only available on your computer; it's not available through this web interface. So we have to add the dependency manually to the package.json file. I'm going to scroll down, go to the dependencies, and say that request is a dependency, and we're going to request the latest and greatest version, which at this point is 2.85.0. I looked up this version number beforehand on the GitHub page; if you're installing this later on, please make sure to choose an up-to-date version, or first download this code and run npm on your local computer, which is something we'll do later on in this course. And that's all we have to do: the next time we deploy the function, Firebase will install all the dependencies that we defined here, including the request library.
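Concretely, the change to package.json is one extra line under "dependencies", shown here in isolation; the inline editor's file also lists firebase-functions and dialogflow-fulfillment, which are left out of this fragment. The version shown is the one mentioned in the video and will be outdated by now.

```json
{
  "dependencies": {
    "request": "^2.85.0"
  }
}
```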
We can use callbacks, which tend to become messy when they're nested. We can use async/await, which is a newer feature in JavaScript but isn't supported by Firebase yet. Or we can use promises, and that's what we're going to do here. If you're not familiar with promises, check out the MDN documentation for more information about them; they're not that complex, though, so maybe you can figure them out just by watching this video. When the Dialogflow library calls our function, we instantly return a new Promise, and this promise gives us a resolve and a reject method. Inside the promise, we're free to do whatever we want. The idea is simple: when the Dialogflow library sees that we return a promise, it will wait to send an answer to the user until we call the resolve function. So inside this promise we have all the time in the world to fetch the information we need from the API, and once we have it, we just call resolve. In here we use the request library: we call it with an object containing the URL we want to request, and json set to true. If you set json to true, the request library will fetch the JSON and parse it for you, so you don't have to do that yourself. Let me check the endpoint; I don't have it open yet. Here is the company info endpoint, so I'll just copy that and paste it into the url field. Then we give the library a callback. Again, you could use promises for this step as well, or async/await with a wrapper library, but I'm just going to use a callback. The callback receives an error, a response and a body. Let me add a new line here to make things a little clearer. First, let's deal with any potential errors: we check if there is an error, and if so we add a response saying something like "Whoops, I can't connect to the SpaceX API. Please try again later." Afterwards we call the resolve function with return resolve(), which terminates the function right there, so no further code can run after this error. Now that we've dealt with potential errors, let's actually work with the response of the API. You might think the JSON object from the API is included in the response object, but that's actually not the case: the response object contains things like server headers. The actual JSON we want is in the body. Let's take a look at what that body should look like; I'll copy the endpoint again and look at the JSON. This is the JSON that will be in the body, and we're interested in the employees attribute. So we check if that's available: if body.employees exists, we add a response with agent.add saying "SpaceX currently has", then body.employees, then "employees". And if the body doesn't have an employees attribute, something went wrong and the API responded with something we don't understand, so we say "Oops, I could not get the employee count right now. Try again later." The sketch below shows roughly what the finished handler looks like.
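This is roughly what the finished handler looks like, including the final resolve that I'll get to next. The endpoint URL is again my assumption of what's shown on screen, and the employees field is the one we saw in the company JSON.

```javascript
const requestHttp = require('request'); // declared as a dependency in package.json

function getCompanyEmployees(agent) {
  // Returning a promise tells the dialogflow-fulfillment library to wait
  // with sending an answer until we call resolve().
  return new Promise((resolve, reject) => {
    requestHttp({
      url: 'https://api.spacexdata.com/v2/info', // assumed company-info endpoint
      json: true                                 // fetch and parse the JSON for us
    }, (error, response, body) => {
      if (error) {
        agent.add("Whoops, I can't connect to the SpaceX API. Please try again later.");
        return resolve(); // stop here so no further code runs after the error
      }
      if (body.employees) {
        agent.add(`SpaceX currently has ${body.employees} employees.`);
      } else {
        agent.add('Oops, I could not get the employee count right now. Try again later.');
      }
      resolve(); // always resolve, otherwise Dialogflow keeps waiting
    });
  });
}
```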
And the last thing we have to do is resolve our promise. If we don't, the Dialogflow library will just keep on waiting, forever actually, or until Firebase kills the function. So at the end we simply say return resolve(). All right, that should be it. Let's go over it one more time: when getCompanyEmployees is called, we instantly return a new promise, which allows us to work in the background. We make a call to the company-info endpoint of the SpaceX API, and when the response comes back we receive an error, a response and a body. We start by checking the error: if there is one, we say so to the user and we resolve the promise. Then, if body.employees is defined, we say "SpaceX currently has X employees", and if it's not defined we got some answer from the API that we don't understand, so we say we couldn't get the count right now. And at the very end we resolve our promise as well. Now, there's one more issue we have to correct, and it's one I ran into myself while preparing this course. We used request here, and we defined request right here when we imported the request library. However, the boilerplate code from Dialogflow also has a request variable: it's the request object the library received from Dialogflow, which contains things like the parameters Dialogflow has extracted and so on. These two clash with each other, and the boilerplate one actually wins: if we write request, JavaScript thinks we mean that variable instead of the library we imported. To solve this, I'm just going to call our import requestHttp and change it here as well, and that resolves the name collision. So that's it, our function looks good. Let's scroll down, hit Deploy, and wait a few seconds for it to deploy correctly to Firebase. All right, that's done deploying; let's test our bot again. We ask "How many employees does SpaceX have?", hit Enter, and there you go: it responds with "SpaceX currently has 7000 employees", exactly what we defined right here. Now, it's worth noting that before this can work, you have to enable billing in your Firebase account, or in your Google Cloud console. The first time you run this without doing that, Firebase will give you an error saying you have to enable billing, which involves adding your credit card to your Google account. Don't be afraid: this won't cost you anything, even with some traffic. Google just needs your credit card details once your function starts making requests to other internet services. So that was it for this video, and in the next one I will give you a tour of the Firebase console. 13. Debugging Firebase Functions: So in the last video, we created a custom fulfillment to fetch the number of SpaceX employees. But what happens if something goes wrong? What if we cannot reach the API? What if the JSON is formatted differently than we expect? Or what if we made a mistake in our code? Bugs are inevitable, and it's important that we log as much data about errors as possible, so we can diagnose problems and fix them as fast as we can.
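To recap the naming fix from a moment ago: inside the Firebase boilerplate, the handler code runs within the https.onRequest callback, whose request parameter shadows a top-level import named request. A minimal sketch of the clash and the fix, with everything else trimmed:

```javascript
const functions = require('firebase-functions');
// An import named `request` would be shadowed inside the callback below,
// so we call it requestHttp instead.
const requestHttp = require('request');

exports.dialogflowFirebaseFulfillment = functions.https.onRequest((request, response) => {
  // Here `request` is the incoming HTTP request from Dialogflow (headers, extracted
  // parameters, and so on), not the request library; hence the rename above.
  // ...WebhookClient setup and intent handlers go here, as before...
});
```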
Firebase Functions has built-in logging capabilities, and the documentation says that to use the Stackdriver Error Reporting tool, all you have to do is log a new Error to the console; Firebase will do all the heavy lifting. When you do this, it automatically records things like the file in which the error occurred and at what line. Pretty useful. So let's implement a few of these logging statements in our fulfillment. I'll go back to Dialogflow, and inside our function that fetches the employees we handle the error from the request library: if there is an error, we tell the user we can't connect to the SpaceX API and that they should try again later. Well, if things go wrong at that step, we want to know about it, so I'm going to say console.error and create a new Error object, and in it I can pass along a message; I'll say "Error connecting to the API". It's important that, if you want this error to show up in the Stackdriver console, it has to be of type Error: you have to create a new Error object, you cannot simply console.log a string. Another place where we can add some logging is right here: we check whether the body has a field called employees, and if it does we respond and tell the user SpaceX has that many employees. But when it doesn't have that field, something is wrong; it's possible the format of the API has suddenly changed, and we want to keep track of those issues as well. So here I say console.error and again create a new Error object, this time with the message "No employees field in the response". Now, normally when I test this fulfillment, those two errors won't be triggered, because the API is accessible and the employees field is present on the body. So instead I've added a line that logs an error to the console straight away whenever the getCompanyEmployees function is called, just for testing purposes. I'll hit Deploy, wait until it has successfully deployed to Firebase Functions, and then we'll test it and see where our logs end up. All right, the deployment is done, so let's test our agent and ask it "How many employees does SpaceX have?" Send that, and it of course still answers with "SpaceX currently has 7000 employees", but it should also have logged that test error. Now, where can we review all the errors that have occurred? There are basically two places where you can do this: the Firebase console or the Google Cloud console. Let's start with Firebase. I open a new tab, go to console.firebase.google.com, and open the correct project, my Udemy SpaceX bot. Then, on the left, I go to Functions and then to Logs (the label is still in Dutch on my screen, but it means logs). Here we get an overview of everything that has happened with our function, and there are a bunch of log messages. The flag icons indicate a point in time where our function started or was terminated: here we see a flag that says the function execution started, and later on another flag that says the function execution took 1600 milliseconds and finished with a status code of 200. In between these flags are all the log statements we made within our fulfillment.
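Those statements are the ones we added a moment ago; for reference, here is roughly how they sit in the handler. COMPANY_INFO_URL is just a stand-in for the company-info endpoint, and requestHttp is the request import from before.

```javascript
const requestHttp = require('request');
const COMPANY_INFO_URL = 'https://api.spacexdata.com/v2/info'; // assumed endpoint

function getCompanyEmployees(agent) {
  // Temporary: log an error on every invocation so we can see it appear in the consoles.
  console.error(new Error('This is just a test error'));

  return new Promise((resolve, reject) => {
    requestHttp({ url: COMPANY_INFO_URL, json: true }, (error, response, body) => {
      if (error) {
        // Must be an Error object (not a plain string) to show up in Stackdriver Error Reporting.
        console.error(new Error('Error connecting to the API'));
        agent.add("Whoops, I can't connect to the SpaceX API. Please try again later.");
        return resolve();
      }
      if (body.employees) {
        agent.add(`SpaceX currently has ${body.employees} employees.`);
      } else {
        console.error(new Error('No employees field in the response'));
        agent.add('Oops, I could not get the employee count right now. Try again later.');
      }
      resolve();
    });
  });
}
```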
And here we can see, indeed, our "This is just a test error", and it also notes where it occurred: in which function, in which file, and on what line number. So this is a great way of seeing the log output of your function. Regular log messages, the console.log statements, will also show up here; notice that these two log statements, the body and the headers, are regular console.log statements, so they appear as information logs. However, the other way is maybe a little better. Firebase Functions is deeply integrated with Google Cloud, so we can also use Stackdriver. Remember that when we first created our Dialogflow agent, it also created a Google Cloud project for us, and now is the time we can actually use it. So I go to console.cloud.google.com and make sure I'm on the correct project; right here I'm in the Udemy SpaceX bot project, which has the exact same name as my Firebase project because the two are, obviously, linked together. If you're not in the right project, just click on this and switch to your bot's, or your agent's, project. Then I open up the menu, scroll down to Stackdriver, and open the Error Reporting tool, and now we get an overview of all the errors that have occurred in our application, along with how frequently they have been seen. We can see here that our test error has occurred, that it was first seen two days ago, when I was testing while creating this course, and that it was last seen four minutes ago, when we executed the function by asking how many employees SpaceX has. Here we can see the number of occurrences over time in a nice little graph, and this console doubles up as a simple bug tracker: you can see that the problem is open, and I can mark it as acknowledged or as resolved. You can even link Stackdriver with your own bug tracker, so that every time an error occurs, it automatically creates a new bug in your bug tracker, but that's a bit beyond the scope of this course. You can click on an error message to see more detail. When I do that, I again get to see how many occurrences it has, and I can see a stack trace, so I can see what the exact error was and where it occurred: in the function getCompanyEmployees, at that particular file and that particular line. And you can see the complete stack trace, along with the stack traces of other recent errors. So Stackdriver is a great tool to keep track of errors that happen within your code. The free tier of Stackdriver lets you log 1000 errors per second with a maximum log size of 100 kilobytes, which is quite a lot, and logs are retained for up to seven days. So that was a quick overview of how you can use logging in your fulfillment, and how to use the web console to spot issues with your code. In the next video, I will show you how to create a fulfillment for the next-launch intent, and how we can take the entities that Dialogflow has detected and use them in our code. 14. Handling user input: So in this video we're going to create the fulfillment for the next-launch intent. Let's open it up to refresh our memories. This intent allows us to ask questions like "When will the Falcon 9 take off again?" and "When is the next launch?" So there are two types of requests: a user can ask for the next launch in general, or for the next launch of a specific rocket.
If the user requests the next launch of a specific rocket, Dialogflow will extract the SpaceX-rocket parameter; it's highlighted here in yellow, and we can see it as well under actions and parameters. So we've got an intent that extracts an optional parameter; let's now go ahead and adapt our fulfillment so it can provide an answer to the user's question. I'm going to open up our fulfillment and make a new function similar to the employee-count one: it will also receive an agent, and it will also make a request to the API. Now, I already showed you how to work with promises and how to make a request to the SpaceX API, so I'm not going to repeat that here; instead, I've prepared the code already, and I'm just going to copy and paste it in. Let's also add it to our intent map: intentMap.set, the intent name next-launch, and then our getNextLaunch function. Let's quickly go over this function. We start again by returning a promise, in which we make a request to the upcoming-launches API endpoint. We handle any errors if they're there, then we check whether the body contains more than zero launches. If it doesn't, we say "I could not find the next launch." If it does, we take the first launch, take the date of that launch, and tell the user the next launch is on that particular date, and finally we resolve the promise so that Dialogflow can return the answer to the user. So this function can now answer the question "When is the next launch?", but what about a question like "When is the next Falcon 9 launch?" As I showed you before, our intent is already configured to extract this parameter, and we called it SpaceX-rocket. When Dialogflow extracts this parameter, it attaches it to the agent object that we receive here, and then we can work with it inside our function. So, inside our promise, I'm going to extract it: I create a new variable called rocketId, and this will equal agent.parameters and then the SpaceX-rocket parameter. That's how you use parameters that Dialogflow extracted from the intent. Now, how do we use this rocket ID? We have two options: we can request all the upcoming launches from the API, like we do here, and then filter them based on the rocket ID ourselves, or we can make the API do all the filtering for us, and that sounds a lot easier, so let's go with that option. Here's the documentation of the launches endpoint of the API, and the last example shows us how we can filter launches: it filters launches based on the launch year, the rocket ID, whether the core was reused, and the serial number of the core. So whenever we attach the rocket_id query-string parameter, the API will do the filtering for us automatically, which is pretty awesome. Now let's go back to our code. The first thing I'm going to do is isolate the URL: I'll say let endpointUrl equals the URL, and pass that variable in below. Then, if the rocket ID has been set, we want to append something to the URL: if there is a rocket ID, we take our endpoint URL and add a query-string parameter to it saying rocket_id equals our rocket ID. Here's roughly what the handler looks like at this point.
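A sketch of the handler at this point. The upcoming-launches URL, the rocket_id query parameter and the launch_date_utc field are my assumptions of what the lesson shows on screen, and the SpaceX-rocket parameter name follows the intent we set up earlier; adjust them to whatever your agent and the API actually use.

```javascript
const requestHttp = require('request');

function getNextLaunch(agent) {
  return new Promise((resolve, reject) => {
    // Parameter extracted by Dialogflow (empty if the user didn't name a rocket).
    const rocketId = agent.parameters['SpaceX-rocket'];

    // Assumed upcoming-launches endpoint of the community API.
    let endpointUrl = 'https://api.spacexdata.com/v2/launches/upcoming';
    if (rocketId) {
      // Let the API do the filtering for us.
      endpointUrl += `?rocket_id=${rocketId}`;
    }

    requestHttp({ url: endpointUrl, json: true }, (error, response, body) => {
      if (error) {
        agent.add("Whoops, I can't connect to the SpaceX API. Please try again later.");
        return resolve();
      }
      if (body.length > 0) {
        const launch = body[0];
        // launch_date_utc is an assumed field name for the launch date.
        agent.add(`The next launch is on ${launch.launch_date_utc}.`);
      } else {
        agent.add('I could not find the next launch.');
      }
      resolve();
    });
  });
}

// And in the intent map:
// intentMap.set('next-launch', getNextLaunch);
```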
Now, in reality you'd want to check that the rocket ID has a valid value, like falcon9, falconheavy or falcon1, but for this example this will do. And that's it: we don't have to touch the rest of our code, and it will now respond with the next launch date of a specific rocket if the user asked for one. However, I think we should change the response a little: if a user asks for the next launch of a specific rocket, we want to repeat that rocket in the answer. So I'm going to add an if statement: if there is a rocket ID, we respond with "The next", then the name of the rocket ("the next Falcon 9 launch", for instance), then "is on", and then the date; and when the user just asked for the next launch in general, we of course keep the original answer. This just makes the wording of the response a little nicer. And that's it; time to deploy our fulfillment and test whether it actually works. All right, our fulfillment has deployed correctly, so let's ask a simple question: "When is the next launch?" And whoops, I actually forgot something. Let's go back to Intents, open next-launch, and of course enable our fulfillment for this intent, because otherwise it will just respond with the static message we defined there. So I enable the webhook for this intent, click Save, go back to our fulfillment, and ask the question again once training has completed. All right, let's ask again: "When is the next launch?" And it says the next launch is on the 18th of April 2018. Now let's ask "When is the next Falcon 9 launch?" And it says the next Falcon 9 launch is on the same date, because the next launch also happens to be a Falcon 9 launch. There aren't any Falcon Heavy launches scheduled yet, so I can't show you that one, but you get the idea. Now you might wonder: hold on a second, how come the rocketId variable contains an ID that the API understands? Well, our rocket ID is basically the SpaceX-rocket entity that we created in the previous section of the course, and if we open up that entity, you can see that I created it so that the reference values here on the left are the same as the IDs the API uses. So when Dialogflow detects that the user wants more information about the Falcon 9, it will always pass this value to our fulfillment, and that happens to be the ID the API uses. Pretty clever. That was it for this video: we learned how to use the parameters detected by Dialogflow in our fulfillment. In the next video, we're going to make the response a bit nicer and learn how to send images to the user. 15. Rich responses: send images to user: In the last video, we implemented a fulfillment for the next-launch intent. This was the API endpoint we used to grab information about the next launch, and if I open it up, you can see there's a lot of interesting stuff in the links section: there's a link to the press kit, to the Reddit campaign and to the livestream. What's really interesting is that they also include a mission patch, and that looks something like this. So it would be cool if we could send that to the user as well; that way the user gets a text response and a visual response. Now you might think: hold on a minute, the Google Home speaker cannot show images, and you would be correct.
That is why we will send multiple responses together: we'll send a text response, and we'll send an image. If the user is talking to a Google Home, they'll only hear the text response; but if they're using the Google Assistant on their phone, or Facebook Messenger, or Slack, they'll see both the text response and the image. So let's implement it. I'm going to go back to the Dialogflow console, open up our fulfillment, and scroll down to our function right here, and before we answer with the next launch date, I'm going to send the mission patch if it's available. I'll check that with an if statement; let's look at the JSON again. The mission patch is in the launch object, under links, and then mission_patch. So we say: if the launch contains links, and if launch.links contains the mission patch, we add a response with agent.add. And what do we add? A new Image, and we pass the URL of the image to its constructor, like so. And that's it, actually: sending an image to the user is as simple as creating a new Image object and passing it along to the agent. (I'll include a small sketch of this a bit further down.) Note that you can only send an image if it's already hosted somewhere. In our case, the mission patches are already hosted, and the API gives us an HTTPS URL, so that's perfect. The last thing we need to do is make sure we import the Image class from the Dialogflow library: here, where we require Card and Suggestion, let's also require Image. That's it; let's deploy the function, and when it's done deploying, let's test our agent. We ask it "When is the next launch?", hit Enter, and it doesn't include our image. Let's ask again; maybe it's a caching issue. And sure enough, there is our image: the mission patch, along with the text response "The next launch is on 2018-04-18." So that was it for this lesson. In the next one, we will send some suggestions to the user so we can help them with their next question. 16. Rich responses: send suggestions to the user: All right, so at this point our agent is pretty advanced: it can detect what the user wants, it can fetch data from an API, and it can respond with text or with images. But there's even more we can do. In this lesson I will show you how to add suggestion chips to your answers. This is what a suggestion chip looks like in the Google Assistant on a phone: it allows you to make suggestions to the user, for instance some follow-up questions they can ask, or you can even help the user with their next input. In this example the bot is asking for a number, and you can make a few suggestions of your own. You can also use suggestions to remind the user that your bot is capable of answering certain types of questions. Because we are using the Dialogflow fulfillment library, it's very easy to implement these suggestions and add them to our response. So let's go to the Dialogflow console and open our fulfillment code again, and let's say we want to add a suggestion to the intent that fetches how many employees the company has. Here is that function, getCompanyEmployees, and let's say we want to suggest to the user that they could also ask about future launches. In this function I can add a new suggestion, and I'll add it as the last statement. Just like before, it's agent.add, and I can add text or I can add images.
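As promised, a quick reference for the image response from the previous lesson: links.mission_patch is the field shown in the launch JSON, and the helper wrapper is hypothetical, just for illustration; in the course the check sits inline in the next-launch handler.

```javascript
const { Image } = require('dialogflow-fulfillment');

// Hypothetical helper for illustration; in the course this sits inline in getNextLaunch,
// right before the text answer is added.
function addMissionPatch(agent, launch) {
  // Only send an image if the API actually provides one; it has to be hosted
  // somewhere already, and the API gives us an HTTPS URL.
  if (launch.links && launch.links.mission_patch) {
    agent.add(new Image(launch.links.mission_patch));
  }
}
```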
But I can also add suggestions. How do we do that? Well, you just create a new Suggestion and give it some text. So we're going to suggest to the user that their next question could be "When is the next launch?" Now, there are a few limitations to using suggestions, and they depend on the platform you're catering to: Google Assistant, for instance, allows a maximum of eight suggestions, and each of them can only contain text with a maximum length of 25 characters. We can add multiple suggestions to our response, so I can once again say agent.add with a new Suggestion, and let's also suggest "What is a Falcon 9?" as another question the user could ask. (There's a short sketch of this a bit further down.) All right, let's deploy our fulfillment and test whether it actually works. It's done deploying, so let's try it out and ask our bot "How many employees does the company have?" Send that, and there we have it: the bot responds with "SpaceX currently has 7000 employees" and then makes two suggestions, "When is the next launch" and "What is a Falcon 9". If the user clicks on one of these suggestions, it is sent to our bot. So if the user wonders what a Falcon 9 is, or what the next launch is, they click on it, that question gets sent to the bot, and the bot can answer it instantly. Suggestions greatly improve the user experience, because it's faster to tap on something than to type it all out or to ask it by voice, and they also help make users more familiar with what your bot can do. So that was it for this lesson, and in the next one we'll take a look at the last rich response type, and that is sending cards. 17. Rich responses: Cards: So, up until now I showed you two types of rich responses, images and suggestions. In this video I will show you the last one, and that is cards. In the Google documentation we can see what a card looks like on the Google Assistant: it's basically a box with an image, a title, some text and a clickable button. Other platforms have something similar; here is what a card looks like on Facebook Messenger, and as you can see it's the same idea: it has an image, a title, some text and a few buttons. So how can we use this? Well, let's go back to Dialogflow, and here in our fulfillment I'm going to scroll down to our next-launch intent. Remember that a few videos ago we changed this intent so that it also sends the mission patch to the user. Instead of just sending the mission patch, let's send it in a nice card, together with the link to the livestream on YouTube, for instance; I'm sure users will appreciate that even more. So let's remove that image right here and add a new Card: I say agent.add to add a new response, and I make a new Card object, passing it a regular JavaScript object with all the parameters. This one takes quite a few parameters: an image URL, a title, some text, the text for the button, and the URL the button should link to. All right, let's fill in these parameters.
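Quick reference before we fill those in: the suggestion chips from the previous lesson boil down to the sketch below. The helper wrapper is hypothetical, just for illustration; in the course these two lines sit directly at the end of the getCompanyEmployees handler, after the text answer.

```javascript
const { Suggestion } = require('dialogflow-fulfillment');

// Hypothetical helper for illustration.
function addFollowUpSuggestions(agent) {
  // Google Assistant allows at most 8 chips, each with up to 25 characters of text.
  agent.add(new Suggestion('When is the next launch?'));
  agent.add(new Suggestion('What is a Falcon 9?'));
}
```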
I'll use the mission patch as the image URL, so that's launch.links.mission_patch, and I'll give the card the title "Livestream". The text of the card could be something like "Watch this launch live on YouTube. Live about 15 minutes before launch." The last thing we have to do is define the button: the text for the button could be "Watch", and the link the button should take us to is launch.links.video_link. So again, this all comes from the API. Now let's also change our if statement here: it used to check whether there are links and whether there is a mission patch, but for us to be able to send this card, we also need to check that there is a video link. So we do that here: if those three are present, we can send this card to the user. The last thing to do is make sure we import the Card object correctly, so I scroll up, and yes, here we import the Card class from the dialogflow-fulfillment library. That's it; let's scroll down and deploy our fulfillment so we can test whether it actually works. All right, that's done deploying. Let's ask our bot "When is the next launch?" and hopefully we'll see our beautiful card. And sure enough, it responds with our text response, the next launch is on this date, and it also sends along our card, with an image, a title, a description and, of course, the link to the livestream. So that was it for this lesson, and in the next one we will take a look at how we can stop using this online code editor, write our code locally, and deploy it with the Firebase CLI tool. 18. Writing code locally on your computer: Up until now we have been writing our code in the online editor of Dialogflow, and that is fine for a small project, but it's not enough for bigger and more complex agents. As your code base grows, you want to be able to spread it out over multiple files, and you want to use things like version control systems and continuous integration. So in this video we're going to break free from this editor, and luckily Dialogflow makes that very easy. Here I am in the Fulfillment tab in Dialogflow, and all I have to do to get a copy of my project is click this little download icon right here. When I do, Dialogflow gives me a ZIP file, which Safari automatically unzips for me. If I open it in Sublime Text, you'll notice there's a firebase directory, and inside it a functions directory with two files: an index.js file and a package.json file. These are the same two files we had access to through the online editor. If I open the index.js file, you can see the code we've been working on throughout this course: there's our company-employees handler, and here's our next-launch handler. We also have our package.json file here, with all of the dependencies for our project. Pretty great. So now we can work on our code locally on our computers, but how do we deploy it back to Firebase? For that we're going to need the Firebase CLI tool. It's a Node package that has to be installed through npm, so I'm going to open up the terminal and type sudo npm install -g firebase-tools (the -g stands for global), and then type in my password. For this to work, you will need Node installed on your system.
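For reference, the card response we just built looks roughly like this in code. The option names follow the dialogflow-fulfillment Card class, links.mission_patch and links.video_link are the fields shown in the launch JSON, and the helper wrapper is again hypothetical, just for illustration; in the course this sits inline in the next-launch handler, replacing the bare Image response.

```javascript
const { Card } = require('dialogflow-fulfillment');

// Hypothetical helper for illustration.
function addLivestreamCard(agent, launch) {
  // Only send the card if the API gives us both a mission patch and a video link.
  if (launch.links && launch.links.mission_patch && launch.links.video_link) {
    agent.add(new Card({
      title: 'Livestream',
      imageUrl: launch.links.mission_patch,
      text: 'Watch this launch live on YouTube. Live about 15 minutes before launch.',
      buttonText: 'Watch',
      buttonUrl: launch.links.video_link
    }));
  }
}
```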
But if you have some knowledge of JavaScript and Node, then you probably know what npm is and what this command does. All right, after that, the Firebase CLI has been installed on our computer and we can use it, just like that. The first thing we're going to do is authenticate the CLI tool, so we run firebase login. It asks whether it's allowed to collect some usage and error-reporting information; I'll say yes. It then opens a browser and asks us to log in with our Google account, so I sign in with the same account we use for our Dialogflow project. It asks whether it's allowed to read my Google Cloud projects and to administer my Firebase data and settings; we say yes, the browser confirms that the Firebase CLI login was successful, and we head back to the terminal. Yes, here in the terminal we can see we've logged in successfully. Now that that's done, we can deploy our function back to Firebase. I'll go into the directory we just downloaded: Downloads, then the firebase-fulfillment folder, which is the name of the folder the download gives you, and then inside firebase/functions. Here are our index.js and package.json files, and we need to make sure all the dependencies are correctly installed: package.json lists a bunch of dependencies, and we have to install them locally before we deploy our function. To do that, I run npm install. Again, if you know a bit of Node and npm, you know what this command does and how it works: it installs all the packages listed under dependencies. Once that's done, all we have to do to deploy to Firebase is run npm run deploy, and that's it: this packages up our code and sends it to Firebase. Pretty easy, right? And there we go, our function has been successfully deployed. Just one thing to note: once you deploy your function through the Firebase CLI tool, you cannot use the online editor anymore. But that's not too bad. So that was it for this lesson. And have you noticed, by the way, that our code is starting to look a bit messy? Our index.js file is already getting quite big, and we haven't implemented that many intents yet. So in the next video, I will show you how you can split up the index.js file into multiple files, so that it's easier to maintain and cleaner. 19. Organizing the code nicely: In this video we're going to clean up our index.js file. This file has grown quite large already, even though there isn't that much logic in it, so to prevent issues down the road we're going to split it up into multiple files. One way to clean it up is by creating one file per intent that your agent can handle. In our case, that would be one for welcome, one for fallback, one for the company-employees handler and one for the next-launch handler. I'll start by creating a new folder in my functions folder, and I'll call it intent-handlers. In there I'll make a new file for our next-launch handler, for instance, and call it next-launch.js. Then we go back to our index file, I select the getNextLaunch function, cut it out of this file, and paste it into its own file. Now that it's in its own file, we need to re-import some of the dependencies: requestHttp, for instance, does not exist in this file, so we have to import it.
So at the top of this file we say const requestHttp equals require('request'), and since we also use the Card object here, we import that one as well, from the dialogflow-fulfillment library. All right, that looks good. Now we need to export this function so we can import it in our index file, so I remove the function name right here and say module.exports equals, and then just make it an anonymous function, like that. I save the file and go back to index.js. Here at the bottom we have our intent map, and it says that when the next-launch intent is triggered, it should execute the function getNextLaunch. But because we refactored it, getNextLaunch doesn't exist inside index.js anymore; instead, we have to include it from the new file. All we have to do to make that work is require('./intent-handlers/next-launch'), and that's it: the next-launch intent now has its own file, just for the logic of this intent. That was pretty simple, right? Let's do the same thing for the company-employees handler. The intent name is company-employees, and as a convention I always use the name of the intent as the file name of the handler as well; it just makes it a little easier to switch between intents in Dialogflow and intent handlers in your code base. So I make a new file, save it as company-employees.js, go to the index file, cut out the function and paste it in there, and export it again with module.exports equals and an anonymous function. And of course we need to import the tools we use in it, so I copy the requires from next-launch.js and paste them in; we don't use Card in this handler, though, we use Suggestion, so we import Suggestion from dialogflow-fulfillment instead. And again, that's it: we save it, go back to index, open up the intent map, and say require('./intent-handlers/company-employees'). Now, I could do the same thing for the welcome and fallback intents, but those are really simple, so I'll just leave them in here for now. As you can see, this dramatically cleans up our index.js file; in fact it's just 40 lines long, and we can clean it up further by removing some of the imports, like Card, Suggestion and Image, which we're not using here anymore. So our index.js is now around 30 to 35 lines long, and that's a lot cleaner than it used to be. It's also way more maintainable: now we can work with multiple people on the same code base, each person working on their own intent, without fearing that someone's work is going to interfere with someone else's. (I'll include a small sketch of the resulting layout below.) So that was it for this lesson, and this was also the final lesson in this section about creating a custom fulfillment with Firebase. In the next section, we're going to take a look at how we can integrate our bot with the Google Assistant and Facebook Messenger. 20. Integration flexibility: So, up until now, we have been working on implementing our custom fulfillment with our own JavaScript code.
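As promised, here is a minimal sketch of the layout after the refactor, assuming the folder and file names used in the lesson (intent-handlers/next-launch.js, with the same pattern for company-employees.js).

```javascript
// intent-handlers/next-launch.js
const requestHttp = require('request');
const { Card } = require('dialogflow-fulfillment');

// Export the handler as an anonymous function so index.js can require() it.
module.exports = function (agent) {
  return new Promise((resolve, reject) => {
    // ...the getNextLaunch logic from before goes here...
    resolve();
  });
};
```

```javascript
// index.js, trimmed down to the wiring
const functions = require('firebase-functions');
const { WebhookClient } = require('dialogflow-fulfillment');

exports.dialogflowFirebaseFulfillment = functions.https.onRequest((request, response) => {
  const agent = new WebhookClient({ request, response });

  let intentMap = new Map();
  intentMap.set('next-launch', require('./intent-handlers/next-launch'));
  intentMap.set('company-employees', require('./intent-handlers/company-employees'));
  agent.handleRequest(intentMap);
});
```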
We can make our bot a lot smarter, because we can use data from external sources like APIs or databases. In this section of the course, we're going to integrate our bot with various services like the Google Assistant and Facebook Messenger. Because we use Dialogflow, and because we use the Dialogflow fulfillment library, integrating with these services could not be easier: Dialogflow provides almost one-click integrations with the most popular services. We don't have to change anything; our intents stay the same, and our fulfillment doesn't need any modifications either. Before we start, I just want to mention that I will update this section of the course when new services are added to Dialogflow, so if your favorite service isn't covered yet, let me know and I will add it later. All right, that was it for this introduction. In the next video, we will start by integrating our bot with the Google Assistant, and this will allow us to run it on Google Home devices as well as on Android phones and even iOS devices. 21. Integrate with Google Assistant: All right, so in this video we're going to see how we can integrate our agent with the Google Assistant. To do this, I'm going to go to the Integrations tab right here, and because Dialogflow is a service offered by Google, the Google Assistant integration takes the prime spot. If we click on the integration settings, a panel opens up with all the things we can configure to link our agent with the Google Assistant. We can say what happens when a user starts our bot: with the Google Assistant, if you want to talk to a particular bot, you have to say "Hey Google, talk to" and then the name of the app, and when the user says that, this intent should be invoked. You can also require the user to be logged in if that's necessary, but for our bot that's of course not needed. Now, to link it up, we first have to test it with the Google Assistant, so I'm going to click on the Test button right here. This asks whether it's okay to auto-preview our changes: if we enable this, every time we make a change to our Dialogflow agent, it will be updated for the Google Assistant as well. We obviously want that, so I'll leave it on and click Continue. Dialogflow will now create our Action on Google, and it also lets us use the simulator, which it has opened up right here. This simulator is intended to test your agent and see whether it works with the Google Assistant. On the left you can talk to your agent, and on the right you can see what the response of your agent will look like: some technical information, like the request and response objects, but also what the response will look like on a phone or on a speaker, because those two have different capabilities. The phone can show cards and suggestions, while the speaker can only give the user a spoken answer. So let's try talking to it very briefly. Right here it says "Talk to my test app"; "my test app" is, at the moment, the name of our app. I'll hit Enter, and we can see that our bot answers with "Hi, my name is SpaceX Bot. What question can I answer for you today?" This is basically the same as the tester we have in Dialogflow on the right, except that it simulates the environment of the Google Assistant, so I can ask other questions.
I can ask "When is the next launch?", and you can see that it keeps on responding, and we can see the request and response objects, the audio the Google Assistant will send to the user, and additional debug information as well. Now let's head back to Dialogflow. Once we've hit the Test button, we have an Action on Google, and back in Dialogflow we can now click on "Manage assistant app". When we do that, it opens the console for Actions on Google, and here we have all the information about our app and we can start configuring it. On the left we have our actions, analytics to see how many people are using our agent on the Google Assistant, the simulator I showed you before, some backend services and some connected properties. On the right are all the things you have to configure before you can submit your application to Google and make it usable for other people: things like the app information and which languages the app supports. Let's go ahead and open App information. Here you can set what your app will be called, ours will obviously be SpaceX Bot, and how it's pronounced. You can add a short description and a long description, and you can show some sample invocations, and these will all show up in the Actions on Google directory. Keep in mind that this all has to be filled in correctly before you submit to Google, because otherwise they of course won't allow your application into the directory. Let's go back; I'm not going to save the changes, because I'm not going to submit this agent to the Google directory. But that is basically it: Dialogflow, because it's a Google service, is very tightly integrated with Actions on Google. All you have to do is go to Dialogflow, click on Test, and then you can manage your assistant app, give it all the details it wants, and ultimately submit it for review to Google. Now, an interesting fact is that you don't have to submit your agent to Google if you only want to use it yourself. If I want to test SpaceX Bot right now, all I have to do is click on Test and then Manage assistant app, and all of a sudden SpaceX Bot will work on all the devices connected to my Google account. So here I'm logged in with my Google account, and all my other devices, like my Google Home or my Android phone, can already talk to this test app. The only thing I had to do was give my app an invocation name, so that I can say "Hey Google, launch SpaceX Bot" or "let me talk to SpaceX Bot". So that was it for this video: integration with Actions on Google is very simple because, of course, both are Google services. In the next video, we will take a look at how you can integrate your agent with Facebook Messenger. 22. Integrate with Facebook Messenger: All right, so in this video we're going to take a look at how we can integrate our agent with Facebook Messenger. The first thing we need to do is create an app on the Facebook platform so that we can actually interface with Messenger. To do that, you have to log in to the Facebook developer console at developers.facebook.com. On the start page you can click on "Add a new app", which I'll do right now, and we give it a display name; I'm calling it Udemy SpaceX bot. Then there is your contact email address.
We click on "Create App ID" and solve the captcha, and this takes us to the overview page of our app. Here we can see all the products, or services, that we can add to our little project, and the one I'm going to add is the integration with Facebook Messenger, so I click on Set up right here. Scrolling down: to create a chatbot on Facebook, you actually have to link it to an existing Facebook page, so you can either create a new page for your agent or chatbot right here, or select one of your existing pages from the list. I'll do the latter; I've already created a SpaceX bot page. I click Continue and give it permission, and this generates a page access token that looks like this. It's quite long, actually. Take note of it, or copy it to the clipboard, because we're going to need it when setting up Dialogflow. I'll copy it to the clipboard, go to Dialogflow, and in the Integrations tab open up Facebook Messenger and enable it. It asks for a verify token and a page access token. The page access token we already have, so I paste that right in. Now we need a verify token, and we need to do something with this callback URL. The verify token can be anything you want; it's just a way for Dialogflow and Facebook Messenger to know that they're talking to the right endpoint. You can put anything you want in here, so what I'll do is go to LastPass and just generate a long random string, maybe make it a little more complex, and paste that in here as my verify token. All right, that's it for Dialogflow. I click Start, it checks the status of the bot, and it says it's okay. But now we also have to configure Facebook Messenger, because right now, if someone starts talking to our bot, Facebook will not forward those chat messages to Dialogflow, and that's something we really want. So we go back to Facebook, to the Webhooks section just below the page access token, and click on "Set up webhooks". Here we need to specify which types of events Facebook has to forward to our Dialogflow agent. A few of these aren't actually needed for our agent; the ones we need are "messages" and "messaging_postbacks". Then we need to give Facebook a callback URL: Facebook basically wants to know where it can send the information, so when a user sends a message to our bot, Facebook will forward that message to the URL of our choosing. And again, we have to specify the verify token, which, once more, is used between Dialogflow and Facebook to verify that they're talking to the right endpoint. So I go back to Dialogflow, copy the callback URL, and paste it in there, and then I also copy our verify token and paste it in here. All we have to do now is click on "Verify and save". There we go; we scroll back down, our webhook was added, green checkmark, everything is okay. Now we also have to specify for which page we want to call the webhook, so I just select my SpaceX bot page and click Subscribe, and that's it. Now, our Facebook app is still in development status, and that means that no one can actually talk to it yet,
except for ourselves, of course. If you want everyone to be able to talk to your bot, you have to get it through review, which you can do right here by adding it to a submission. Now, let's head over to our page and see if we can actually talk to our bot. Okay, so here I have opened up the SpaceX bot page that I created on Facebook, and I'm going to click on "Send a message", which opens up this panel right here. As you can see, I've already been testing it a lot, and I can do that again: I can ask "When is the next launch?", and if everything goes well, this reaches out to Dialogflow and, boom, "The next launch is on" that particular date. And that's it: we've integrated our agent with Facebook Messenger. By now you should see how easy it is to integrate Dialogflow with other services, whether that's Facebook Messenger, Slack, the Google Assistant, and so on. 23. Switching to Dialogflow API v2: Now, as I was working on this course, Dialogflow released the second version of their API, on April 17th, 2018. In this short update, I just want to show you how you can switch to the v2 API. Right now, version 2 has no additional benefits for your agent; however, new features will only be released for those who are using the second version of the API, so it's pretty important that we get on board as quickly as possible. Luckily enough, we don't have to change anything about our agent: our intents stay the same, and we don't have to touch the code in our fulfillment either, because we use the Dialogflow fulfillment library. When we switch to v2, the library will automatically recognize it and adapt to it. That's pretty awesome, right? To switch to v2, all we have to do is go to Dialogflow, open up the settings of our agent, and scroll down to the section called API version. We just click on V2, and this opens a little prompt explaining what v2 is all about: it's a redesign of the existing API, and it requires an update of the client and fulfillment code; however, because we use the Dialogflow fulfillment library, we don't need to do that ourselves. There are two things we need to make sure of: that our client code can work with the v2 API, which is already the case, and that our webhook, our cloud function, works with it as well, and that's the case too. So all we have to do is click on "Upgrade to V2" and save our agent, and that's it: our agent will still work as intended, and all the heavy lifting is done by the library we use in our fulfillment. 24. Conclusion and next steps: Thank you very much for purchasing this course and completing it all the way to the end. I hope you enjoyed it and learned a lot from it. You should now be able to make your own intelligent chatbots with Dialogflow, and you'll be able to integrate them with the Google Assistant and various other platforms. If you have any questions, or if something was not clear, don't hesitate to create a new question right here; I'll do my best to answer all of them as thoroughly as I can. I'm also providing the source code used during this course as a free download, along with an export of the Dialogflow agent, and that should give you everything you need to get started as well. And finally, if you liked this course, consider writing a nice review for it.
Also, follow me on Twitter for more updates, and check out my YouTube channel for free explainer videos on a wide variety of topics. Thank you so much for watching, and I hope to see you in one of my other courses.