Blazing Fast GraphQL Backend Development Using Hasura | Dmytro Mezhenskyi | Skillshare


Blazing Fast GraphQL Backend Development Using Hasura

Dmytro Mezhenskyi — Learning is a key to everything

Watch this class and thousands more

Get unlimited access to every class
Taught by industry leaders & working professionals
Topics include illustration, design, photography, and more


Lessons in This Class

78 Lessons (6h 46m)
    • 1. Intro - What we will learn in this Course?

    • 2. Intro - What is Hasura?

    • 3. Start with Hasura in Hasura Cloud

    • 4. Run Hasura as a Docker Container

    • 5. Hasura Console Overview

    • 6. Creating Tables in Hasura Database Manager

    • 7. Let's build the first GraphQL query to the Database

    • 8. Query - Data sorting

    • 9. Query - Data filtering

    • 10. Query - Full-Text Searching

    • 11. Query - Combine Multiple Filters

    • 12. Query - How to do Pagination

    • 13. Query - How to work with Relations between Database Tables

    • 14. Mutations - Create an Item

    • 15. Mutations - Update an Item

    • 16. Mutations - Delete an Item

    • 17. Mutations - Mutate Items with Relations

    • 18. How to use Variables in GraphQL Queries, Mutations and Subscriptions

    • 19. Subscriptions - Realtime Queries

    • 20. How to extend Business Logic?

    • 21. Setup Firebase Cloud Functions

    • 22. Event triggers - Create an Event Trigger

    • 23. Event triggers - Implement Event Logic

    • 24. Actions - Basic Overview

    • 25. Actions - Create an Action

    • 26. Actions - Relations to the Database Tables

    • 27. Remote Schema - What is that?

    • 28. Remote Schema - Create and connect external GraphQL Schema

    • 29. Section recap

    • 30. How to protect Hasura Endpoints

    • 31. Authentication with JWT (JSON WEB Token)

    • 32. Authentication with JWT & Firebase API

    • 33. Role-based Access

    • 34. Anonymous Role

    • 35. Webhook Authentication Mode

    • 36. Migrations - Describe a Problem

    • 37. Migrations - What are Migrations & Metadata

    • 38. Migrations - Meet Hasura CLI

    • 39. Migrations - Create Initial Migration

    • 40. Migrations - How to export Metadata

    • 41. Migrations - How to check Migration Status

    • 42. Migrations - How to apply Metadata

    • 43. Migrations - How to keep your Migrations & Metadata always in Sync

    • 44. Migrations - Migration Squashing

    • 45. Migrations - Seed Migrations

    • 46. Migrations - How to rollback Changes if something went wrong

    • 47. Improvement - Fix Permissions for User Roles

    • 48. Improvement - Create Action "Upload File"

    • 49. Angular DEMO - Create an Angular App & install Angular Material

    • 50. Angular DEMO - Implement SignUp functionality

    • 51. Angular DEMO - Implement SignIn functionality

    • 52. Angular DEMO - Typescript Code and types Autogeneration

    • 53. Angular DEMO - Create Authentication Guards and Create a User Profile

    • 54. Angular DEMO - Perform Secure Queries to GraphQL Endpoints

    • 55. Angular DEMO - Implement File Uploading

    • 56. Angular DEMO - Render the List of Uploaded Files

    • 57. Angular DEMO - Open File Details in Dialog Window

    • 58. Angular DEMO - Add Comments to the File

    • 59. React DEMO - Create React Application

    • 60. React DEMO - Implement SignUp functionality

    • 61. React DEMO - Implement SignIn functionality

    • 62. React DEMO - Typescript Code and types Autogeneration

    • 63. React DEMO - Create Authentication Guards and Create a User Profile

    • 64. React DEMO - Perform Secure Queries to GraphQL Endpoints

    • 65. React DEMO - Implement File Uploading

    • 66. React DEMO - Render the List of Uploaded Files

    • 67. React DEMO - Open File Details in Dialog Window

    • 68. React DEMO - Add Comments to the File

    • 69. Deployment - Configure Hasura for convenient Local Development for a Team

    • 70. Deployment - Configure Cloud Functions for Multi Environments

    • 71. Deployment - Configure React Application for Multi Environment Support

    • 72. Deployment - Create & Configure Firebase Production & Develop Projects

    • 73. Deployment - Create & Configure Hasura Production & Development instances

    • 74. Deployment - Configure GitHub Repo and set up sensitive Data as Secrets

    • 75. Deployment - Deploy Cloud Function with GitHub Actions

    • 76. Deployment - Deploy Hasura Engine with GitHub Actions

    • 77. Deployment - Deploy React App using GitHub Actions

    • 78. Deployment - Deploy the whole Project to Production Environment






About This Class

If you’re a developer struggling with GraphQL because of its complexity… or simply because it takes too much time, then I have awesome news for you.

Hasura can transform your PostgreSQL database into real-time GraphQL endpoints with incredible performance in just a few hours! After just a few hours of this course you will be able to:

  • Transform a Database into a GraphQL Server with built-in Sorting, Full-Text Search & Pagination;

  • Use Subscriptions for real-time queries;

  • Authenticate with JWT tokens & the Firebase provider, or with WebHooks;

  • Set up Role-Based access to the Data (we will create User and Admin roles with different permissions);

  • Build a File Uploading strategy with GraphQL and Firebase Storage;

  • Use Database Migrations & Metadata, which will help your database evolve successfully;

  • Orchestrate & delegate custom Logic to 3rd-party Micro-Services (REST & GraphQL), which we will create using Firebase Cloud Functions;

  • Build a Web App with Angular & React + TypeScript which communicates with our Hasura Server and handles Authentication, File Uploading and basic GraphQL CRUD operations;

  • Deploy your project to different environments like Development or Production using CI/CD pipelines, and GitHub Actions in particular.

I am sure that you will be impressed by the functionality it brings, and these few hours invested in this course will save you weeks in the future. Enjoy watching!

P.S. The course will be updated every time Hasura releases new features.


The information provided by me in this course is for general informational purposes only, and I am not responsible for any unexpected expenses on your side caused by project misconfiguration or other mistakes. All information in the course is provided in good faith; however, I make no representation or warranty of any kind, express or implied, regarding the accuracy, adequacy, validity, reliability, availability or completeness of any information in the course.

Meet Your Teacher


Dmytro Mezhenskyi

Learning is a key to everything


Hello, my name is Dmytro. I have been working as a Lead Frontend Developer. I built my first projects in 2012, and since that time I have been constantly learning new technologies and working on different complex projects. Since March 2020 I have been running my YouTube channel, and back then I realised that I really enjoy teaching cool things to people around the globe.

See full profile




1. Intro - What we will learn in this Course?: Hello again and welcome to this course. Let me quickly introduce what we're going to learn here and how best to work with this course. The main focus here will be on the Hasura GraphQL engine. We'll cover the core functionality, like data sorting and filtering, full-text searching, pagination, mutations, and real-time subscriptions. Of course, we will go beyond that and see how to implement vital functionality like authentication and role-based access to the data. We will create two roles: a user role and an admin role. Then we will learn how to perform file uploading. And we will cover such an important feature as migrations, which allow you to restore your database schema and Hasura metadata. Of course, every application is unique and requires its own logic besides basic CRUD operations, so we will also learn how to extend the basic Hasura functionality with Firebase Cloud Functions and adjust it to our needs. At the end of this course, we will build an application which allows users to create a user profile, sign in, then upload some photos and maybe add descriptions to them. And of course, users will be able to open the photo details and leave some comments on a photo. The full tech stack which we are going to use in this course you can see on your screen right now. There is a small remark though: only Hasura will be covered in detail. Regarding the rest of this tech stack, like Docker, Firebase Cloud Functions, or Angular, I will of course explain how to install them and I will definitely highlight the most important parts, but I will not be diving deep and explaining every line of code, because I assume that you have at least some very basic knowledge of web development. How should you use this course? If you are already familiar with Hasura and you're just interested in some concrete sections, you can jump directly to them.
But if it is your first experience with Hasura, I would recommend you to go from section one to section five. There, the basic Hasura functionality will be covered. And then there will be different examples with different front-end frameworks, and you can choose your favorite one and jump directly to it. This course is still being actively developed, and in the near future examples with React and Vue.js will also be added. Depending on your feedback, I will most probably extend it with other use cases which you may suggest somewhere in the comments. And of course, if Hasura releases some new feature, it will also be reflected in this course as soon as possible. So don't worry, you will always get the latest information. Alright guys, I wish you good luck. I'm pretty sure you will learn a lot of interesting things here, which will save a lot of time for you in the future. So let's move forward, and see you in the next videos. 2. Intro - What is Hasura?: Hello guys. So what is Hasura and how does it make our life easier? Let's get started with GraphQL. GraphQL is a query language for APIs and a runtime for fulfilling those queries with your existing data. To simplify it as much as possible: you describe the data your application needs, and via HTTP you send this query to your server. Your server should parse this query, get maybe some params, and fetch the data. And where do we usually store the data? In the database, right? So our server goes to the database, grabs the data, and returns to the client the data it has requested. And it doesn't matter how complex your application is, everything boils down to creation of data, update of data, and deletion of data, the so-called CRUD operations. Although these operations are very primitive, you need to write code for them anyway: you need to build SQL queries or use ORMs, then create maybe some controllers, GraphQL resolvers and so on.
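As a rough illustration of the idea described above, a GraphQL query a client might send over HTTP could look like the sketch below. The `photos` field and its columns are assumed examples here, matching the table that gets built later in the course; Hasura generates the actual fields from your database schema.

```graphql
# Hypothetical client query: describe the data you need,
# and the server returns exactly those fields for each row.
query GetPhotos {
  photos {
    id
    photo_url
    description
  }
}
```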
And then later, you may realize that you write almost the same code for every entity. You always need to have the list of photos, the list of comments; you need to filter these lists; you always need pagination for both; you need to control permissions and many other things. When we encounter such a situation, we as developers want to somehow automate it, and Hasura does exactly this. Hasura sits between our client and database. Technically, it is our backend, a very smart backend, and very fast, by the way. It just scans your database and, based on its schema, generates GraphQL queries, mutations, and real-time subscriptions which allow you to perform those CRUD operations with data. Literally, you don't even need to write code for it. Besides this, Hasura also handles authentication and role-based access to the data. And if you need more complex logic, it is not an issue, because Hasura can be easily extended with custom business logic, and this business logic can be written in any programming language. I'm already super excited to show you this tool in action, so let's get started and install Hasura. 3. Start with Hasura in Hasura Cloud: Hi there. There are a couple of ways to start a Hasura project, and now we will learn how to do it with Hasura Cloud. The first thing you have to do is to go to hasura.io, then click the Get Hasura button and pick a plan. In my case, I'll take the free tier, then proceed with a quick registration process. The registration here is free and trivial. After successful registration, you will have a page similar to what I have now, and you will be asked to pick either an existing database or create a new free database with Heroku. I will pick the Heroku one. Most probably you will be asked to log in with your Heroku account or to create one. I have one, so I will use it for login. I can see now that my database was created, and I can click the button Create Project.
Now, once the project is created, you will be redirected to the dashboard, where you will see some basic information about your project, for example the GraphQL endpoint, IP address and so on. Let's explore what else we have in the Hasura console. So we have four tabs. The first one is General; this is the general information we see right now. Then below we have the Team tab. For now, you should see your email address only, but if you work in a team, you could invite more people. Just put in the email of your teammate and assign a role like admin or user, where admin has very broad permissions and can do whatever he wants, but user has fewer permissions, and you can specify exactly which permissions you want to grant this user; then just click the Add button below. The next section is called Env Vars. Here you can define variables which will allow you to configure your Hasura engine; some of them we will cover later in this course. As an example, you can see here the already predefined HASURA_GRAPHQL_DATABASE_URL, which points to our Hasura database instance. You can click on it in order to see its current value and update it if necessary. If you want to add a new variable, you can click the New Env Var button, and from a dropdown you can pick an available variable and edit it. As an example, I want to set one to true, and then I click Save in order to add this variable. Now you can see it listed here. Then, last but not least, is the Domains section. You can see that we automatically got a default domain name, but if you have your own domain name, you can add it as an alias in this section by clicking on New Custom Domain and then clicking the Add button. Great, we have checked all sections, and now we can have a look at our GraphQL console and see if it's working. So I just click the button Launch Console, and we can see that our Hasura console works perfectly. 4. Run Hasura as a Docker Container: Hi there.
In this video, I will show you how to set up Hasura locally on your computer as a Docker container. This is a more advanced but heavily used way of running Hasura. If you're working in a team, most probably you change your database schema quite often, and in order not to break the application for your colleagues, it is better to do it locally. In order to run Hasura as a Docker container, you need to have Docker installed. If you do not have it yet, you can follow the link which you see right now on your screen and then just follow the instructions for your operating system. Once you have successfully installed Docker on your machine, you can open your terminal and type docker, and if Docker is installed successfully, you will see something similar to what I have. Now let's create a folder for our project. I will navigate to my courses folder and create it there; you can choose your own location, of course. Here I will create a hasura-course folder, then I will go inside and open it with the VS Code editor. You can use any editor you prefer. Here we're going to create a file, docker-compose.yaml, which will be responsible for the orchestration of our containers. At this moment we have no Docker Compose file here, so let's add one. In order to get some very basic instructions, I will go to hasura.io again, then I will go to Docs. Then, inside the Getting Started section, I will navigate to the Docker Quickstart section and click on step one, get the docker-compose file. Here we are suggested to run one of these two commands to get the docker-compose file. I will use the second command, because I do not want to install wget, and curl comes by default with macOS, and with Windows I believe as well. I will copy this line, then I open my terminal in VS Code, paste this line and hit Enter. You can see that the docker-compose file appeared right there.
If you have any issues with curl or the network, you can create the docker-compose.yaml file manually: just copy the URL which goes after the curl command and access it via the browser like this, then copy this text, insert it inside your docker-compose file and save. Let's quickly have a look at what is inside docker-compose.yaml. So we declare two services here: our database, called postgres, and graphql-engine, which is our Hasura. For the database, we declare the Docker image of Postgres 12, and for our Hasura it is hasura/graphql-engine version 1.3.1. You may have a slightly different version, of course. And because this is not a course about Docker, I will highlight the most interesting parts regarding the Hasura configuration. The first thing is the ports section, where you can define a port for Hasura. If port 8080 on your localhost is already occupied, you can change it, for example to 8081. Then inside the environment section, we see environment variables which can be used to configure our Hasura. These are the same variables we saw in the previous video about installing Hasura on Hasura Cloud. Here are our HASURA_GRAPHQL_DATABASE_URL and HASURA_GRAPHQL_DEV_MODE, and you can see the same ones right now in docker-compose.yaml. If you want to add some new variables, just place them somewhere here. You may wonder where you can find the whole list of available variables. Well, to see this list you can go again to the Hasura Docs, find the Hasura CLI section, click on the configuration reference, and then find environment variables. And here you can see all possible variables and descriptions of what each variable means. Okay, now it's time to run our Hasura containers. To do this, I will go to my VS Code, and in the terminal I will run the command docker-compose up to start the necessary containers. It will take some time to pull the images, and once you see this line in the logs, you can go to the browser and navigate to localhost:8080.
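The compose file described above looks roughly like the following sketch. Treat this as an illustrative outline, not the exact file Hasura ships: image versions, passwords, and the variable set may differ from what the quickstart downloads.

```yaml
# Sketch of a minimal docker-compose.yaml for Hasura + Postgres.
version: '3.6'
services:
  postgres:
    image: postgres:12
    volumes:
      - db_data:/var/lib/postgresql/data    # persist database data between restarts
    environment:
      POSTGRES_PASSWORD: postgrespassword   # example password, change for real use
  graphql-engine:
    image: hasura/graphql-engine:v1.3.1
    ports:
      - "8080:8080"                         # change the left side (e.g. 8081:8080) if 8080 is taken
    depends_on:
      - postgres
    environment:
      HASURA_GRAPHQL_DATABASE_URL: postgres://postgres:postgrespassword@postgres:5432/postgres
      HASURA_GRAPHQL_ENABLE_CONSOLE: "true" # serve the Hasura console at localhost:8080
volumes:
  db_data:
```

With this file in place, `docker-compose up` starts both containers, as the lesson shows.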
And if everything is fine, you will be able to see the Hasura console. If you want to stop your containers, just go to the terminal and press Ctrl+C. Here we go. Now we know how to start our Hasura as a Docker container, so let's move on. 5. Hasura Console Overview: Welcome to this lesson. Now we know how to spin up a Hasura server in Hasura Cloud and locally with Docker. Let's have a closer look at the Hasura console and learn how to work with it. So here's our Hasura console. This first screen is our playground, where we will be building our GraphQL queries and mutations and configuring headers. It is like a lightweight version of Postman, if you have ever worked with it. Let's explore what we have right now. At the very top, you can see the version of our Hasura engine. Then below there is an endpoint which you will need to call in order to perform queries and mutations. Then, in the Request Headers section, you have an option to add headers to your requests or to remove existing ones. As an example, we can add an authentication header and some token; this is just a dummy example, and we will learn more about authentication in the next sections. Let's move forward. And here we see the playground where we will mostly be working. Inside the Explorer, we will have some entry points where we will start to build our queries, but because we have no tables in our database yet, it is empty. But soon we will see the connection. Then in this area we'll be building our GraphQL queries and mutations, and here below we will be defining variables. Then we have a row of buttons here. You can see a big play button which runs our queries and mutations; then Prettify, which works just like a code formatter and makes queries more readable. The History button just shows a history of our queries; for now, of course, it is empty. Copy just copies the query we built to the clipboard, nothing special.
The Explorer button just toggles our sidebar where our queries and mutations are going to be listed. The next feature is called the Code Exporter, and it allows you to generate some boilerplate code based on your query. So far it is empty, but in the future I will show you a use case for it. Voyager is empty in our case for now, but actually it is a tool which represents any GraphQL API as an interactive graph. This is just a live demo example from its official page, and it shows the relations between entities and so on. Then let's move forward: Derive Action. It will help us with code generation for our actions; we will have a detailed look at Hasura actions in the next course sections. After this goes the Analyze section. For now, there's nothing to analyze either, but once we start to build our first queries, we will be able to see here how exactly Hasura translates our GraphQL queries to SQL queries. And last but not least is Docs. There we will see all available query params and fields, their types and descriptions, if provided, of course. It is generated automatically depending on our database schema and metadata. So now we are a little bit more familiar with the Hasura console UI and its functionality. Don't worry, the things which are not available now we will learn a bit later. Cool, let's move on, and in the next lesson we will explore the next tab, called Data. 6. Creating Tables in Hasura Database Manager: Hi there. Well, for now our database is completely empty, so we cannot build any queries. Now, let's go to the Data tab and create some tables. I can click Add Table or Create Table and see the next screen. Let's give this table a name; we'll call it photos. Then I have to add some columns. So I will add id, which has type UUID; the default value will be generated by the function called gen_random_uuid(). Then I will add a field called photo_url, which has type text. Then let's add maybe some, I don't know, description, which is also of text type.
And we also should have real-life created and updated dates. We can grab them from the frequently used columns: I will click this button and pick the created_at and updated_at fields. Looks great. Then I will make the id column the primary key. Foreign key we can skip for now, and unique key as well. There is also a comment; you can leave it empty or just put some description of this table. This description will also be used in the GraphQL docs, so I will leave a short one just for a test. Good. Now I can hit the Add Table button, and my table will be created and will appear in the sidebar. Let's click on it, and we see that there are no rows in the table, so we have to create some. In order to do this, I will navigate to Insert Row and add some values. I will leave id empty, because the ID will be generated automatically, as well as the created_at and updated_at fields. Now let's go back to Browse Rows, and you can see that our rows were successfully inserted. If you need to clone, edit, or delete some row, use these three buttons. If you want to add or remove a column in the table, you can easily do it in the Modify tab: just go there and find Add a new column; or if you want to remove a column, click on Edit and find the Remove button. Alright, now we have some data and we should be able to build our first query. So let's do it in the next lesson. 7. Let's build the first GraphQL query to the Database: Hi, welcome back. In the previous lesson we created the table photos and added some rows. Now let's try to query them in GraphiQL. There are some changes since the last time we were here. First of all, we have three entry points in our Explorer, and if we expand them, we will see a lot of fields which we will see in action in the next few videos. The next change is hidden in Docs. Now, if we click on Docs, we will see root queries, mutations, and subscriptions.
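For reference, the table built through the console is roughly equivalent to the Postgres DDL below. This is a sketch under assumptions: the console may generate slightly different defaults, and it typically maintains updated_at via a trigger rather than just a default.

```sql
-- Approximate equivalent of the photos table created in the console.
CREATE TABLE photos (
  id          uuid PRIMARY KEY DEFAULT gen_random_uuid(),
  photo_url   text,
  description text,
  created_at  timestamptz NOT NULL DEFAULT now(),
  updated_at  timestamptz NOT NULL DEFAULT now()
);
```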
If we go inside query_root, we will see the available fields: photos, photos_aggregate, and photos_by_pk, where pk means primary key. The difference between those three is that the first one, photos, has sorting and filtering capabilities and always returns an array of entities. photos_aggregate also has sorting and filtering capabilities, but it returns aggregated values like count, if we want to count how many photos we have, or if we want to get, for example, maximum or minimum values and so on. And the last one, photos_by_pk, is used if you want to fetch some entity by its primary key, usually an ID. It has neither sorting nor filtering capabilities, and it always returns only one value object, or null if it's not found. Let's investigate one of the fields; I will take photos for this example. Here between the parentheses you can see the values which this field can take as parameters. If a parameter has an exclamation mark at the end, it means that the parameter is required. Next to the name of the parameter, you can see the parameter type. It can be some primitive type, like integer for the limit field, or something more complex, like photos_select_column for the distinct_on field. The types are clickable, so you can click and see more details. Then, after the parentheses, we can see the type which this field will return. In our case, it is a type called photos. If I click it, we will see that its fields are the fields which come from the table we created in the previous video. And by the way, here's the description of our table. Okay, now let's get back to our query builder and build the very first query. I could write it manually, but Hasura allows you to do it a much easier way: I just need to expand an entity I want to query and check a field which I want to fetch. And you can see that the query was generated automatically. You can also give some name to your query and hit the play button. And now you can see that the magic is real.
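The three root fields described above could be exercised with a single query document like this sketch. The field and column names assume the photos table from the previous lesson, and the UUID is a placeholder, not a real ID:

```graphql
query ThreeRootFields {
  photos(limit: 10) {        # returns an array; supports sorting and filtering
    id
    description
  }
  photos_aggregate {         # returns aggregated values, e.g. a row count
    aggregate {
      count
    }
  }
  photos_by_pk(id: "00000000-0000-0000-0000-000000000000") {  # one object, or null
    id
    photo_url
  }
}
```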
Without a single line of code, we have a working GraphQL engine which returns the rows we added to the database before. Now, let's quickly try to perform an aggregate query. I will expand photos_aggregate, then aggregate, and then count. Let's keep it simple right now. We see that the photos_aggregate query was also generated correctly. Now I hit the play button again, and we see that we have three photos, which is actually the correct number. And also, let's try out the query called photos_by_pk. I will expand it. It immediately shows me that I need to provide an ID, so I will copy one and paste it here, and I will select all fields. I hit the play button again, and I see that the query correctly returned the value. Brilliant. If you think that was it, you are wrong, because Hasura provides very powerful sorting and filtering capabilities, which we will learn in the next lesson. So see you there. 8. Query - Data sorting: Hello guys, welcome back. In the previous video, we learned how to make queries to our Hasura and which types of queries exist. Today, we will learn how to perform sorting in our queries. First of all, I will remove the photos_by_pk query and revert to the previous query, which is called photos. The whole sorting actually boils down to ordering by some field and applying an ascending or descending modifier. A great example would be ordering our photos by creation date, so we can sort them from newest to oldest and vice versa. How can we do it? Let's expand in our Explorer a field called order_by and tick the field called created_at. Once we select the field, we see a select box appear where we can choose a modifier, ascending or descending. Because we want to sort from newest to oldest, I pick descending, and we see that the order has changed. If I change to ascending, we see that older photos go first. Of course, you can do sorting by multiple fields if you need, but in our case it would not have any effect.
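The sorting query built in this lesson might look like the following sketch (again assuming the photos table from earlier lessons):

```graphql
query SortedPhotos {
  photos(order_by: { created_at: desc }) {   # newest first; use asc for oldest first
    id
    description
    created_at
  }
}
```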
You might also notice that besides asc and desc, we also have asc_nulls_first and asc_nulls_last, and the same for the desc modifier. The names explain their functionality perfectly: they determine whether nulls appear before or after non-null values in the sort ordering. But how does it behave for plain descending and ascending then? By default, for ascending ordering, null values are returned at the end of the results, and for descending ordering, null values are returned at the start of the results. Now, I would like to say a few words about distinct_on. distinct_on does a very simple thing: to put it simply, it removes duplicates from the query results by taking the first row of each group of duplicates. What do I mean by that? Well, let's add a new row to our table. I will go to Data, to the table photos, and clone the last row, but I change the description a little bit and remove the ID, so Postgres will generate a new one. After this, I just want to make the created_at the same as the previous row has. Now I have another row with the same photo_url and created_at date. Let's go back to GraphiQL and try to fetch the data without distinct_on. Now we see that we have four items, with our clone included. Then, if we activate distinct_on and select created_at, for example, we see that the duplicate is not there anymore, and we see only the first duplicate, with the description "This is a third photo". Also, it is good practice, not only in Hasura but in SQL in general, to use distinct_on along with order_by; just be sure that distinct_on stays before order_by. So now our sorting query is completed, we can run it, and it is working correctly. The next topic will be about filtering results, so see you in the next lesson. 9. Query - Data filtering: In the previous video, we learned which sorting capabilities Hasura provides. Let's talk now about filtering. First of all, I will remove the query about sorting from the previous video.
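The distinct_on deduplication described above might be written like this sketch (assuming the photos table; note that the distinct_on column leads the order_by):

```graphql
query DeduplicatedPhotos {
  photos(
    distinct_on: created_at          # keep only the first row per created_at value
    order_by: { created_at: asc }    # order_by should start with the distinct_on column
  ) {
    id
    description
    created_at
  }
}
```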
And in order to do filtering, we have to expand the field called where. Here we see _and, _not and _or, but let's leave them for now and start with something simpler. Let's try to filter items by some concrete date. Let's say I want to get photos which have a date the same as the third photo's. Let's try to run it. And now we can see that we have only two items: our third photo and its clone, which has the same date. What if we want to fetch items which are not equal to this date? Then we have to use the operator _neq, which is a shortened version of not equal. Let's try it out. Now we see the first and the second photo, because their creation dates are different from the third one's. If, for example, I want to display photos which were created after the second one, I can use the _gt operator, which means greater than. So if I copy the creation date of the second photo, paste it here and then run the query, we see that it works as expected, and now we see the third item and its clone, because their creation dates are greater, meaning they were created later. What if we also need to include the item whose created_at is equal, so equal to or greater than the given date? No problem, just use _gte, which means greater than or equal. Let's try it out: I run the query, and we see that now the second photo is included as well. In case you need to get items which were created earlier than this date, use the operators _lt and _lte respectively; they do the same thing but in the other direction, meaning less than and less than or equal. Sometimes we need to query items by some list of values. For example, I want to get the items with IDs equal to the second and the third item. For this, I would use the operator _in. Unfortunately, the Hasura query builder converts it into a string, which is wrong, so I have to manually adjust this query: I remove the quotes and insert the first ID and the second one.
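A where clause using the comparison operators from this lesson might look like the sketch below; the date is a placeholder, and the column names assume the photos table:

```graphql
query FilteredPhotos {
  photos(where: {
    created_at: { _gte: "2020-01-01" }   # placeholder: rows created on or after this date
    # other comparison operators covered here: _eq, _neq, _gt, _lt, _lte,
    # _in / _nin for lists of values, _is_null for null checks
  }) {
    id
    description
    created_at
  }
}
```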
Let's see if it works. Well, it works perfectly. But what if I want the opposite result and want to exclude these IDs? Then I need to use the _nin operator — "nin", or however you pronounce it correctly — which actually means "not in". I change it here. By the way, check out how Hasura recognized it and updated everything in the Explorer. And if I run the query, we will see that we get all items whose IDs are not in our list. The last operator is _is_null, and it describes itself perfectly: it returns items where some particular field is null. In our case all IDs are non-null, so we get nothing. So those were the basic filtering capabilities, and in the next videos we will see some slightly more advanced things. So, let's move on.

10. Query - Full-Text Searching:
Hello guys, welcome back. In the previous video, we learned how to filter query items and saw dozens of operators. But there are some of them which allow us to perform full-text search, which is a heavily used feature. First of all, I have to mention that the operators I'm going to show you can be applied only to fields which have the type text. That's why you didn't see them when we were filtering by date and IDs. In our example, the text type belongs to the description and photo_url fields, and for this lesson I will use the description field for searching. Let's try to expand the description field under where. Now we see a few more operators than before. The new ones are _ilike, _like, _nilike, _nlike, _similar and _nsimilar. Let's start with _like and _nlike. Let's say I want to find an item whose description contains the word "third". I find the _like operator and pass it my string, but I also need to add a percent sign in front of and behind the word. This percent sign means any sequence of zero or more characters.
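The list-based filter described above might look like this (the UUIDs are placeholders for real photo IDs from your table):

```graphql
query PhotosNotInList {
  photos(
    where: { id: { _nin: ["<second-photo-id>", "<third-photo-id>"] } }
  ) {
    id
    description
  }
}
```

Swapping _nin for _in would return only the listed IDs instead of excluding them.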
So if I left this sign only at the end, it would return only items which start with this word. The _nlike operator, as you have already guessed, does the opposite thing. I revert back to the _like operator, and now let's try to change the first letter of our search term and make it a capital. Hmm, it looks like our _like operator is case-sensitive, but that is not always what we need. Usually we do not care about case sensitivity, and if we want a case-insensitive search, we have to use the _ilike operator. Let's try it out. Well, perfect — now we do a case-insensitive search. Again, if you need the opposite effect, please use the _nilike operator. The _similar operator does almost the same as _like, and even if we replace _like with _similar, we will get the same result. The only difference is that _similar works with the SQL standard's definition of a regular expression, so we can use, for example, the alternation operator in order to find photos whose description contains either the word "first" or "second". If you need the opposite result, use the _nsimilar operator. Great job — now we know how to do full-text search in our database. So let's move on and explore more of Hasura's features.

11. Query - Combine Multiple Filters:
Hi guys. In the previous videos we saw how rich the filtering and searching capabilities in Hasura are. But in this video we'll go one step further, and we will see how to combine multiple filters. Actually, we already saw the operators which can combine multiple filters, but back then I decided to skip them. These three operators are _and, _not and _or. From their names we can understand what they are supposed to do, but let's see them in action. I would like to start with the _and operator. Actually, we have already used _and — or better to say, its simpler version. We did it when we were filtering by multiple fields.
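A hedged sketch of the searches above — the first query does a case-insensitive substring match, the second uses SQL-style alternation:

```graphql
query SearchPhotos {
  photos(where: { description: { _ilike: "%third%" } }) {
    id
    description
  }
}

query SearchPhotosSimilar {
  photos(where: { description: { _similar: "%(first|second)%" } }) {
    id
    description
  }
}
```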
We can achieve the behavior of _and by defining several filter criteria inside the where property. As an example, I want to fetch all photos where photo_url is not null and created_at is greater than the date of the second photo. So if I run this query, we will see the third photo and its clone — exactly what we expected. You can also achieve the same result with _and explicitly. For this, I will just copy the date I need, then uncheck the boxes and expand _and. Here I will say that created_at is greater than my date and photo_url is not null. I run the query, and we see the result stays the same. The only difference the _and operator has is that it can take an array as a parameter, which can be better in some cases — but to be honest, I have never had a case where I needed it. The next one would be _or. This operator also takes an array, but it checks whether any of our filters returns true. If I just change _and to _or and run it, we will see that all items appear. It happens because our _is_null operator, which is set to false, is satisfied by all items, since all of them have some value in that field. Let's modify the query a little bit and say that we want to get photos where the date is greater than the given one, or the description contains the word "first". Now we see that the second photo was excluded, because it doesn't satisfy any of our filters: its creation date is not greater than the given one, and it doesn't contain the word "first" in its description. And the last remark worth mentioning here is that the input parameter should always be an array for this operator; otherwise it will behave like the _and operator, so keep this in mind. Okay, let's move on and have a look at the _not operator. It should be very simple: it works mostly the same way as the _neq operator, which we saw in the previous video. Let's try to build a query with the _not operator.
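The _or query described above might be sketched like this (the timestamp is a placeholder):

```graphql
query RecentOrFirst {
  photos(
    where: {
      _or: [
        { created_at: { _gt: "2021-01-03T00:00:00+00:00" } }
        { description: { _ilike: "%first%" } }
      ]
    }
  ) {
    id
    description
    created_at
  }
}
```

Replacing _or with _and here would return only photos matching both conditions; wrapping the whole where in _not would invert the result.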
So I want to fetch photos where the ID is not equal to this ID. When I run the query, I see all photos whose ID is different from the one I provided. We would get the same result with the _neq operator — see, it stays the same. _not can be useful when you want to invert the result of multiple filters. Let's say we want to take our previous query with _or and completely invert it. Now we have only one item, where there is no word "first" in the description and the creation date is less than the defined one. Whew — I don't know about you, but I'm really impressed by Hasura's filtering capabilities. It is really powerful, and remember that we still haven't written a single line of code. But that is not all the functionality: in the next video we'll learn other things, like pagination. So see you there.

12. Query - How to do Pagination:
Hello again, welcome to this lesson. In this short but useful video, we'll learn how to do pagination in Hasura. Let's query our photos for now. We see all four photos, but let's imagine that we have a hundred photos or even more. We do not want to query all of them, right? Most probably the user is not interested in a hundred photos, but in the first ten or twenty, so we have to limit them somehow. Hasura provides such an option right out of the box. You just need to find the field called limit under the field you are trying to fetch and set some value. I will set it to two. Now you can see that we have got only the first two records, so we can consider them as our first page. Now, let's think about what we need to do in order to get the next two records — namely, the second page. We have to somehow skip the first two, and the field offset does exactly what we need: offset defines how many records should be skipped. In our case, it will be two. Let's try it out. And now we see the next two photos, which come after the first two. If we would like to fetch the third page, we would need to set offset to four, and so on.
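The second page described above can be sketched like this (limit and offset values follow the lesson's example):

```graphql
query PhotosPageTwo {
  photos(order_by: { created_at: asc }, limit: 2, offset: 2) {
    id
    photo_url
    description
  }
}
```

An order_by is included because paginating without a stable sort order can return rows in an unpredictable sequence.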
So the formula you should follow when you implement your pagination is: offset = (page number − 1) × limit. So if we would like to get page number three, according to the formula it would be three minus one, which equals two, multiplied by the limit, which is two — so we get four. So our offset should be set to four, and so on. This is how you can implement pagination for your application, and you can see that it's quite easy to do. So let's move on and explore more Hasura features. See you in the next video.

13. Query - How to work with Relations between Database Tables:
Hi, welcome to this lesson. Until now we have had only one table, called photos. But if you have ever seen any demo about GraphQL before, you most probably saw a use case where you also fetch some nested data, like author information, comments or something else. So let's implement such a relation in this video as well. First of all, I will go to the Data page and create a new table called comments. I add the required fields: I will add an id field, then we need the comment field itself, which is of type text. Then I need a field which will be a reference to our photo, so I will call it photo_id. Keep in mind that the type of this field should be the same as the type of id in the photos table — in our case it is UUID — and we will leave the default field empty. After this, I will just add the created_at field. Then I have to do one thing I haven't done before: in the Foreign Keys section, I have to set our photo_id as a foreign key referencing the photos table, and then I create the relation to the column id. This way, I set up the relation between these two tables. Now we have two tables which have a relation, but this relation exists only on the database level, which means that if we go to GraphiQL, we still cannot fetch comments for our photos; however, we can fetch comments separately. In order to be able to fetch comments for every photo, we need to ask Hasura to expose this relation.
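The pagination formula can be expressed as a tiny helper (the function name is mine, chosen for illustration):

```python
# Offset formula for page-based pagination: offset = (page - 1) * limit.
# Pages are 1-based, matching the lesson's example.
def page_to_offset(page: int, limit: int) -> int:
    return (page - 1) * limit

# Page 3 with limit 2 skips the first four records.
print(page_to_offset(3, 2))  # → 4
```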
We have to go back to Data, then to our photos table, and then navigate to the tab called Relationships. You can see that Hasura already suggests a relation, so we just need to click the Add button. Here you have an option to rename your relation, but the current name is okay for me, so I just click Save. Cool. Now I can go back to GraphiQL, and I see a comments field, so I can fetch the comments for every photo. We see that we now have a comments property, which is an empty array for now, so let's add some data to the comments table. I will copy the ID of the first photo and go back to Data. Then I find the comments table, go to the Insert Row tab and add a couple of comments for the first photo. Now, if we run our query one more time, we can see that we have got two comments for our first photo. By the way, keep in mind that all filtering, sorting and searching features are also applicable to nested queries, so I can easily say that I want to get only the first comment for each photo, as an example. And since we have some relations, we can have a look at the tool called Voyager. If I click on it, we can see our visualized graph and trace how the entities relate to each other, which might be very useful in some cases. Then, I remember we haven't seen the feature called Analyze, so let's have a look at it. I click it here, and I can see which SQL query is being executed for this particular GraphQL query, which can give you a better understanding of what happens under the hood — if you are proficient enough in pure SQL, of course. Alright, that's it regarding relations. I hope it was interesting, and see you in the next video.

14. Mutations - Create an Item:
Welcome, guys. We have learned a lot about querying: relations, filtering, sorting and pagination. But until now we have been adding data directly into the database manually, which is not an option for a modern application. We have to learn how to do it via the GraphQL interface as well.
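Once the relationship is exposed, the nested query with a per-photo limit might look like this sketch:

```graphql
query PhotosWithComments {
  photos {
    id
    photo_url
    comments(limit: 1, order_by: { created_at: asc }) {
      id
      comment
    }
  }
}
```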
For this, we will be using mutations. In GraphQL, mutations are responsible for manipulating data: adding, updating or removing it. And of course, in Hasura it comes out of the box. So how can we create a mutation? In the bottom-left corner, we have a dropdown where we can select Mutation and click the plus button. We see that a new mutation was added, so let's rename it to insertPhoto. First, in the Explorer, Hasura provides us a couple of insert mutations: insert_photos_one, for creating a single photo, and insert_photos, for bulk (multiple) inserts. I will use insert_photos_one for now. I expand it and check the fields I want to insert: photo_url and description. So let's add values for these fields. Now we see an error which says that we must have a selection of subfields, so let's fix it: I want to select the id, photo_url, description and created_at fields of the inserted photo. Okay, the error is gone. Now let's try to execute this mutation. Great, our new photo was created, so if we execute the previous query, we will see that the new photo is also included in the result. Alright, let's copy the ID of this new photo and go back to the mutation. We know that our IDs have to be unique, but what happens if I try to insert a photo with an ID which already exists? I will slightly change the field values, and let's try it out. We get an error, which is expected. But what if I would like to avoid such behaviour and implement behavior called "upsert", which means: update the record if it exists, otherwise just insert it. Fortunately, we can do that, and in order to achieve it we need to configure a field called on_conflict. First, we need to pick which constraint we handle — in our case, it is a primary key violation — and then we can pick which fields should be updated; in this case, we can pick photo_url. The truth is that Hasura supports here not just a single field but an array of fields.
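An upsert sketch along these lines (the constraint name photos_pkey is Postgres's default for a primary key on a photos table — pick the actual one from the console's dropdown; the ID and URL are placeholders):

```graphql
mutation UpsertPhoto {
  insert_photos_one(
    object: {
      id: "<existing-or-new-id>"
      photo_url: "https://example.com/photo-5.jpg"
      description: "This is a fifth photo"
    }
    on_conflict: {
      constraint: photos_pkey
      update_columns: [photo_url, description]
    }
  ) {
    id
    photo_url
    description
    created_at
  }
}
```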
So we can also add description, or, if you want to update nothing, you can leave just an empty array — but I will revert that, and let's try to execute it. Now we see that it works fine: if we query the photos one more time, we see that a new photo was not created; Hasura just updated the existing one. Now, let's go back to our mutation, and I will say just a few words about insert_photos. I will collapse insert_photos_one and expand insert_photos. I will add the new photo and some values, and of course I define the returning values. Unlike with insert_photos_one, you can also get the number of affected rows. So let's check it. If we run this statement, we will get pretty much the same result as for insert_photos_one: a new photo will be added. Only the return type will be different — see, the returning type here is always an array, even if we insert only one item, and it also shows the count of affected rows. This gives us a hint that this mutation is meant for bulk inserts, which means inserting multiple items at one time. And it is true: the only thing we have to change in order to insert multiple values is to turn the object into an array and add multiple values. Then I just change the descriptions and URLs and run it. Okay, looks good. So let's query all photos — and here are our new photos. Let's move forward, and in the next video we'll see how to update photos. See you there.

15. Mutations - Update an Item:
Hello guys, in this short lesson we will see how we can update items in Hasura. Updating items is very similar to the insert mutation. Again, there are two options: update by primary key and bulk update. Let's start with update by primary key. I will copy the ID of this last photo, remove everything and add the mutation update_photos_by_pk. There is one required field, where I have to insert the ID of the photo I want to update. Then I have to expand the _set property and select the fields I want to update.
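A bulk-insert sketch of the kind described above (URLs and descriptions are placeholders):

```graphql
mutation InsertPhotos {
  insert_photos(
    objects: [
      { photo_url: "https://example.com/photo-6.jpg", description: "Sixth photo" }
      { photo_url: "https://example.com/photo-7.jpg", description: "Seventh photo" }
    ]
  ) {
    affected_rows
    returning {
      id
      description
    }
  }
}
```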
In my case, I choose description, but of course you can pick multiple fields. After this, I have to define some fields to return: I want to have the id and, let's say, the description. Now I run it, and I can see that my photo's description was successfully updated. But what about bulk update? With bulk update, it is pretty similar: you have to use the update_photos mutation. Here, in _set, you define which fields to update — in my case, it will again be the description, and let's say I want to change it to "mark for delete". Then, inside the where field, which is required, we have to define exactly which photos should be updated. I want to update all items which were created after, let's say, our cloned photo, which we created earlier in this section. Let's go to Data and check when it was created, and now go back and paste the date here. Then the last step is to define the return values. I would like to see the IDs and descriptions, and affected_rows would also be cool to have. I click run, and we see that the photos created after the third one have got the description "mark for delete". Cool, updating is working. So let's see how to delete items in the next lesson. See you there.

16. Mutations - Delete an Item:
Hello guys, welcome to this lesson. We continue to learn all the CRUD operations in Hasura. Today, in this short video, we will see how to remove items. Again, there are two ways to delete items: by primary key and bulk remove. In order to remove by primary key, you just need to know the ID of the item you want to delete. I will take some ID which was marked for delete. Then I expand delete_photos_by_pk, insert my copied ID into the id field, and, because I must return something, I just return the id. If I run this mutation, this record will be removed from the database. If you want to remove multiple rows, you need the delete_photos mutation. Here, you just need to define the condition on which the rows will be deleted.
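The bulk update described above might be sketched like this (the timestamp is a placeholder for the cloned photo's created_at):

```graphql
mutation MarkOldPhotos {
  update_photos(
    where: { created_at: { _gt: "2021-01-03T00:00:00+00:00" } }
    _set: { description: "mark for delete" }
  ) {
    affected_rows
    returning {
      id
      description
    }
  }
}
```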
In my case, I want to delete all photos where the description equals "mark for delete". Then I define the returning values: I want to see the IDs and photo URLs, and affected_rows as well. And now, if I run this query, all items with the description "mark for delete" will be deleted. Now, if we try to fetch all photos, we will see only the first three items and the clone of the third one. Great — now we know how to remove items in Hasura. So let's move on, and see you in the next lesson.

17. Mutations - Mutate Items with Relations:
Hi there. I think you now have quite a solid understanding of how to work with mutations. Of course, you can also mutate nested objects. In this video, we will build a mutation which inserts a new photo along with a predefined comment. So let's add a new mutation called insertPhotoWithComment. I will use insert_photos_one, but the same would work for any type of mutation. So I will insert the description and photo_url and define some values for them. And then, below, I will also expand comments and add a comment with some text. Now I just define the returning fields: it will be the id, the description, and the comments with their id and comment. And now I run this query, and we can see that both entities were added. And if you noticed, we didn't define the photo ID for that comment — Hasura resolved it automatically for us, which is great. Just to prove it, we can go and fetch all our photos, and we can see that our last photo has its predefined comment. This is how you can also mutate nested objects, and it is relevant for any CRUD operation. Feel free to experiment with it. And now let's move on; see you in the next lecture.

18. How to use Variables in GraphQL Queries, Mutations and Subscriptions:
Hi guys. In this quick lesson, we will find out how to use variables in our queries. Let's have a look at our create photo mutation. We see that all our values for photo_url and description are hard-coded.
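A nested-insert sketch of the kind built above — the comments.data syntax is how Hasura accepts child rows alongside the parent (values are placeholders):

```graphql
mutation InsertPhotoWithComment {
  insert_photos_one(
    object: {
      photo_url: "https://example.com/photo-8.jpg"
      description: "Photo with a comment"
      comments: { data: [{ comment: "First!" }] }
    }
  ) {
    id
    description
    comments {
      id
      comment
    }
  }
}
```

The photo_id on the inserted comment is filled in automatically via the foreign key, as the lesson demonstrates.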
That can be convenient while you work in the console, but for real-world applications we need something more dynamic. This is where variables come in. You may have noticed earlier that we have a section for query variables below. Here we just have to define a JSON object where the keys are variable names and the values are our values. So I will create a JSON with the key photoUrl, whose value will be a photo with ID six. Then I will also add a description with the text "this is a description from a variable". Now it complains that we defined these variables but never used them, so let's fix it. After the name of our mutation, query or subscription, we need to open parentheses and declare the variables there. Each variable starts with the dollar sign, followed by its name, which should map to one of the JSON keys we created just a few seconds ago. Then we have to define the data type of this variable: in our case, both have the type String, and they are required, so we add an exclamation mark, and then we do the same for the description. Alright. The last step is to replace our hard-coded strings with the variables: you just need to copy the variable name and replace the particular string with it, like this. Now our query is dynamic and also looks much cleaner. We can run it to be sure that everything is working. Perfect. And one last small hint about where you can find the types: if you are not sure which data type a variable should have, you can find it out in the Docs. If we find our insert_photos_one mutation, we can see that the type of our description is String, as well as photo_url; but for id it is the uuid type, and for created_at it is a timestamp type. So this is how you can use variables in your GraphQL queries, and we are moving forward to the next lesson.

19. Subscriptions - Realtime Queries:
Hello, welcome to this lesson. Let's talk today about subscriptions. Subscriptions are very similar to queries.
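Put together, the variable-driven mutation and its query-variables JSON might look like this sketch (variable names are mine):

```graphql
mutation InsertPhoto($photoUrl: String!, $description: String!) {
  insert_photos_one(
    object: { photo_url: $photoUrl, description: $description }
  ) {
    id
    photo_url
    description
  }
}
```

```json
{
  "photoUrl": "https://example.com/photo-6.jpg",
  "description": "this is a description from a variable"
}
```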
Everything we have learned for queries — filtering, sorting, pagination and so on — is applicable to subscriptions as well. Let's take our getPhotos query and convert it to a subscription. The only thing I have to do is change query to subscription, and that's it. If we run it, we will get the same result as if we had called a regular query — but afterwards, the request is not destroyed: you can see that it keeps listening for changes. Let's try to add a new photo and check whether the real-time part is working. I will open one more browser tab with the Hasura console, and in this second tab I will add a new photo. Here we go — I run the mutation, and now, if we scroll down a little, we will see our new photo without any page reload. That's cool, isn't it? If we want to stop listening for new values, we can stop our subscription by clicking the stop button. So that's it: if you need some real-time functionality for your application, please use subscriptions — now you know how to do it. And we are moving forward; see you in the next video.

20. How to extend Business Logic?:
Hello guys. Out of the box, Hasura's capabilities are impressive, but sometimes even such rich functionality is not enough, and developers should be able to extend it. Hasura provides you a couple of ways to do that, and we will see all of these features in action in the next videos. We will have a look at event triggers, actions, remote schemas, and other features which may pop up in the future. Besides this, we will work with some serverless technologies, like Firebase Cloud Functions, which allow you to add some business logic very easily, without any additional server configuration. So welcome to this section, and let's explore these features one by one.

21. Setup Firebase Cloud Functions:
Hello, welcome to this lesson. Before we start with Hasura's features, we need to create a Firebase project and deploy at least one cloud function. It is free, so don't worry.
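The converted subscription is literally the same selection with a different keyword — a sketch:

```graphql
subscription WatchPhotos {
  photos(order_by: { created_at: desc }) {
    id
    photo_url
    description
  }
}
```

Unlike a query, this keeps an open connection (over WebSockets in GraphiQL) and pushes a fresh result set whenever the underlying data changes.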
If you do not have an account yet, you have to navigate to the Firebase website, click the Sign In button and sign in with any of your Google accounts. When you sign in, you will be redirected to the Firebase console, which looks like mine. Now we need to create a new project: just click Add project, enter some name, and click Continue. After this, I will disable Google Analytics, because I don't need it, and then I click Create Project. It will take some time to create, and then you can click Continue and jump straight to the project. For now, we are interested only in Functions, so let's go there and click Get Started. We immediately see a dialog window with some instructions on how to start with Firebase functions. According to the instructions, we need to install the Firebase tools, so let's do it: I will copy this line, then go to my terminal in VS Code, paste it there and hit Enter. It doesn't really matter in which folder we run it, because we install firebase-tools globally thanks to the -g flag. Let's go back to the Firebase console one more time and check which next steps we have to do. Alright, we have to initialize our project. We use the command firebase init, so let's copy it, go to VS Code, and run it. If you run it for the first time, you will get an error like I have — I intentionally logged out to show you this error and to walk through authentication with you. In order to fix it, we have to run the firebase login command. It is up to you whether to allow Firebase to collect CLI usage information; I hit Enter. Now I am redirected to the Google login page, and I have to log in with the user who owns the Firebase project we created just a few minutes ago. Now we see that our Firebase login was successful, so we can go back to VS Code. I run firebase init one more time, and now Firebase asks us about the features we want to configure.
You can navigate with the up and down arrows on your keyboard and select or unselect options by pressing the Space key. I will select Functions, and, because later in this course we will be uploading real photos, I will also keep the Storage option, and hit Enter. Now we should pick the project we're going to work with, so I pick "Use an existing project" and then choose the project I created a few minutes ago. It's up to you whether to use JavaScript or TypeScript, but I'm a big fan of TypeScript, so I will choose TypeScript. For sure, I want to have ESLint, and yes, I want to install the required npm dependencies. But before installing, I have to activate my Storage — sorry for that. So let's just go to our Firebase console, then to Storage, and click Get Started, then Next. Here I need to choose a storage location; I choose Europe, because it is closer to me. Now I can get back to VS Code and hit Enter to install the dependencies. Here we are asked which file to use for the storage rules; for me, the default one is fine. So, here we go: our project was successfully set up. Now let's have a look at what was generated inside the functions folder. We see some commented-out code; let's uncomment our helloWorld function, save it, and try to deploy it. In order to deploy functions, I just run the command firebase deploy, and because I want to deploy only the functions, I run it with the flag --only functions. Alright guys, it looks like something has changed in the Firebase policies: in order to use Firebase functions with Node.js version 10, you have to use the pay-as-you-go plan. So you have two options. The first one is to switch to the pay-as-you-go plan. Technically it's a paid plan, but you have two million function invocations for free each month, which is more than enough for testing. Option number two is to use Node.js version 8.
But the problem is that it is deprecated, and it will be turned off on March 15th, 2021. Still, I will choose option two, because I don't want to add my credit card right now. So I will go to package.json and change the Node version to 8. Now I run the deploy one more time, and we see that our Cloud Function was deployed — we can even see a link to it. I can open this link, and we see the message "Hello from Firebase", which proves that our function works. Great, now we have set up the Firebase project. So let's move on and connect it with Hasura. See you in the next lesson.

22. Event triggers - Create an Event Trigger:
Hello guys. In this video we will learn what event triggers are in Hasura and why we would need them. Let's consider a case: your client wants to know when a new comment on a photo is added, and asks you to send an email notification on every new comment; this email should contain the comment text and a link to the photo. This is where Hasura events come into play. To keep it short, an event trigger is just an HTTP endpoint which is called every time some row in the database is created, updated or deleted. The creation of a Hasura event trigger consists of two steps. First, we have to configure it on the Hasura side. The second step is to write the business logic which will send our emails, in a separate cloud function. In this video, we will be working on the first step. So let's go to our Hasura console. At the top you can find the tab called Events, so let's go there. Here, just click the Create button. I will name this trigger notify_about_comment. Then I have to pick the comments table and check "insert", because I want to trigger this event when a new record is inserted. And the last thing we have to do is to insert the webhook URL, which will be our Firebase function. But because I am going to make calls from the Cloud Functions to my local Hasura instance, I will also be serving my functions locally; otherwise the webhook would not be able to reach my local host from outside.
Otherwise I will not reach HTTP Oracle host from outside. So let's serve our function locally and see which URL it will generate. I will go to our VS code and then I will navigate to Functions folder and run NPM, run serve. Let's copy or L where our functions locally were initialized. And I want to just copy the URL of this function and passed it in the web hook input. But instead of hello world, there will be notified about comment. And also one very important moment, you have to replace HTTP URL localhost with HTTP host doped Docker dot internal. The reason is that Docker container has its own localhost or local host of the container. But our functions and our served on the local host of our machine. And exactly these host dot docker dot internal points to the correct host. And now I click Create Event trigger. And awesome. Our event was created and you can see some additional tabs like bending events, processed events, and invocation log. For now, all of them are empty because we executed nothing so far. But before to execute the event, we have to create our notify about common endpoint and to add some logic which will be dispatching our emails. This we will do in the next lecture. So see you there. 23. Event triggers - Implement Event Logic: Hi guys, welcome back. In this lesson, we will create a cloud function which will be dispatching our emails and it will be called by Hazara event. First of all, let's start to install the required dependencies and its types inside functions folder. There will be two dependencies, not fetch in order to call another endpoints and nor the mailer to dispatch emails. After these, let's rename our hello world to notify about command and log the body first in order to see what do we have there. And then let serve the function. Now let's go to the Hazara console, find the table photos, and let's grab some ID. Then let's try to insert our comment for this photo. I will use insert one mutation and add some common text. 
And then I add the copied ID. So let's run it. You see that the comment was created, so let's go to Events and open Processed Events. Now we see that our event was executed. We can even expand the log and see the request details. You can see it passes the session variables and the data we provided: because we did an insert, the value of the property called "old" is null — if it were, let's say, an update, then in this field we would see the previous values — and in the property "new", we see the data we inserted. Now let's implement the email dispatching. I will go to VS Code, and let's go step by step. First of all, let's open a try-catch block and set the response with status code 200 and the message "success" if everything is fine, and code 500 and an error message if something went wrong. Then let's extract the event data from the body, and from the event data we extract the photo ID and the comment. Do you remember? The property "new" contains the data which was inserted into the database. Let's also extract the session variables from the event. Alright, everything looks fine, but there is a small issue: according to our requirements, we also have to send a link and the description of the photo, but the problem is that we only have the photo ID. So we have to call the Hasura GraphQL endpoint and fetch the information about the photo. First of all, we need to build a GraphQL query, so let's go to GraphiQL and build one. I need a photos_by_pk query with an id, which will be a variable; I declare this variable as required, and then I ask for the photo_url and description. Cool. Let's copy it, paste it into our function, and assign it to a constant called getPhotoQuery. Now let's import node-fetch, which does the same thing as fetch in a browser — it just calls some endpoint. Then we call our Hasura endpoint, and it is the POST method, because all GraphQL queries are sent as POST requests.
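The query assigned to getPhotoQuery above might look like this sketch:

```graphql
query GetPhoto($id: uuid!) {
  photos_by_pk(id: $id) {
    photo_url
    description
  }
}
```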
Then our body will be represented by a JSON string, which has a field query, with a valid GraphQL query, and a field variables, with our variables object. We also send the headers and session variables from the request — it is not needed in this particular case, but I just wanted to show you how to do it. This query will return a response, and we need to parse it to JSON first; then we can read the data about our photo, which is the photo URL and description. Great, we now have all the required data, so let's implement the message sending. First, let's import all the required functions from the nodemailer package. Then we need to create a test account for our dummy SMTP server. Now we need to create an SMTP transport. I prepared some code for you — this is an SMTP server for testing purposes, where we are going to use the credentials of our test account. Then we will use our transporter to send an email. Here is a prepared code snippet, which I will explain in a second. Here we just say that the email is from our test account, the receiver is me, the subject is "New comment", and there is some HTML with a link to the correct photo URL, the proper description, and the comment body. Because this mail server is just for testing, the email will not really be dispatched to my mailbox; instead, it generates a URL where we can see how the email would look. So just before the response, I will add a line which logs the generated URL for this email. Now we are ready to go. I save the changes and restart our cloud functions — it can take some time — and now we can go to the Hasura console and run the mutation one more time. We see that the comment was added, so let's check the status of the event. All right, it looks fine. Now, if we go back to VS Code and have a look at the console, we can see the link to the email, so I can follow this link. And here we go — here is our email with the correct link and comment. Now we have a working event handler.
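Putting the mailing pieces of this lesson together, a sketch could look like the following. nodemailer's createTestAccount and getTestMessageUrl are real APIs (backed by the Ethereal test service); the recipient address and the message wording are made up for the example.

```javascript
// Pure helper: compose the notification message from the fetched photo
// and the comment body (all field values are illustrative).
function buildMail(from, to, photo, commentBody) {
  return {
    from,
    to,
    subject: "New comment",
    html:
      `<p>New comment on <a href="${photo.photo_url}">${photo.description}</a>:</p>` +
      `<blockquote>${commentBody}</blockquote>`,
  };
}

async function sendNotification(photo, commentBody) {
  // Required lazily so the helper above stays independently testable.
  const nodemailer = require("nodemailer");
  const testAccount = await nodemailer.createTestAccount(); // throwaway SMTP account
  const transporter = nodemailer.createTransport({
    host: "smtp.ethereal.email",
    port: 587,
    auth: { user: testAccount.user, pass: testAccount.pass },
  });
  const info = await transporter.sendMail(
    buildMail(testAccount.user, "me@example.com", photo, commentBody)
  );
  // Ethereal never delivers mail; it returns a preview URL instead.
  console.log("Preview URL:", nodemailer.getTestMessageUrl(info));
}
```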
You use Hasura events when you need to perform some custom logic once something was added, updated, or removed in your database. Okay, that's it about event triggers. So let's move on, and see you in the next video. 24. Actions - Basic Overview: Hi there. We continue to extend Hasura's capabilities, and in this video we will have a look at Hasura Actions. So what are Actions in Hasura? Actions are a way to extend Hasura's GraphQL schema with custom logic. Quite confusing, right? How about this: Actions in Hasura are HTTP REST endpoints which act as resolvers. Still didn't get it? That's fine — it took me some time to understand too. Let's try to visualize it with a couple of slides, and I hope things will get clearer. Imagine that you have a single-page application which calls a Hasura endpoint in order to fetch some data and do some basic CRUD operations. You are fine and happy, customers are happy, and your manager is happy. But one day your product manager comes and says that your application should be integrated with another microservice — let's name it the account service — which handles authentication, provides user profiles, and so on. But there is a problem: this microservice doesn't support GraphQL; it has only a REST interface. So you think, okay, that's fine, I will just call the REST endpoint directly from the application. And it is fine, until you get more microservices to integrate with. Then managing such a system becomes a mess, because you have to manage different endpoints and mix GraphQL with REST calls. Most probably you then need some state management like Redux for the data which comes from REST — but it turns out you are already using some Apollo Client, which also does state management, and you ultimately get two sources of truth which you have to keep in sync. And that is just the beginning of your problems.
Wouldn't it be great if we could add some custom schema to our Hasura — let's say a sign-up mutation — where we define the input values and also the return type? When we call this mutation, Hasura would delegate everything to some REST endpoint, including the request's input values. That endpoint calls our account service, creates a user, does whatever it needs, and returns a result back to Hasura. Hasura would then deliver the result data back to the client via the GraphQL interface — in our case, it could be some token string. Sounds great, doesn't it? I also think so. Well, this is literally exactly what Hasura Actions do, and in the next lesson we will make the connection. So see you there. 25. Actions - Create an Action: Hi guys, so let's create our first Action. I want to extend our Hasura with an extra GraphQL mutation called createUser; then we'll create one more cloud function, which will represent our action handler. And I'm going to use Firebase Authentication, which represents the account service we want to integrate. So let's get started and create our Action. Let's go to the Actions tab and click the Create button. Here we have to define the name for our Action and the input and output types. There is going to be email, which is a required String; then password, also a required String; and displayName, which is just a String. The SampleOutput type I will rename to User, and User will have three fields: id, email, and displayName. This way we created a kind of contract, or interface, for our resolver — which, remember, is a REST endpoint. We say: hey, we will provide you data like email, password, and displayName; you do whatever you want with it, but we expect that you return us JSON with the fields id, email, and displayName. Then we have to define the handler, and I'm going to call a cloud function again — so I will use the same URL which I used for the Hasura event, but change the path to createUser.
And we can save our Action. Now, if we go back to GraphiQL, we can see our createUser mutation. You see, you can work with it as with any normal GraphQL mutation — but of course it will fail if we try to execute it right now, because there is no resolver yet. So let's go to our cloud functions and create a new one. But first of all, let's move our logic from the previous lessons into a separate file; then we just import the handler function, which I add in the onRequest function. Cool — it looks much better now. Let's create another file called create-user, and here we will export a function called createUserHandler, which does nothing so far: it just logs the request body and returns some hard-coded data which maps to the return type User we defined earlier. Then we need to create a cloud function and assign this handler. Now we can run our cloud functions and try to trigger our Action in Hasura one more time. And wow — we got the values we defined in the cloud function, so Hasura resolved it successfully. Let's have a look at what is inside the body: besides the session variables, Hasura sends an input property which contains the values we provided when we built the mutation. Now let's bring some real logic to our function instead of hard-coded values. Let's save the user in Firebase with custom claims — which later on will be used for authentication — and then return the user we created. First of all, let's import some functions from the firebase-admin package; these functions will allow us to create and manage users. Then I want to extract the email, password, and displayName which Hasura delegates to this function. After this, I can create a user in Firebase by passing an object with email, password, and displayName. Now, having the user created, I have to set up the custom claims which, as I mentioned, will be used later for authentication — in the auth section, I will explain this in more detail.
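The handler walked through here could be sketched like this. The namespace https://hasura.io/jwt/claims is what Hasura's JWT mode reads from Firebase custom claims; the role names and the error handling are my assumptions, not something fixed by the lesson.

```javascript
// Pure helper: custom claims in the namespace Hasura's JWT mode expects.
function buildHasuraClaims(uid) {
  return {
    "https://hasura.io/jwt/claims": {
      "x-hasura-default-role": "user",
      "x-hasura-allowed-roles": ["user"],
      "x-hasura-user-id": uid,
    },
  };
}

async function createUserHandler(req, res) {
  // Required lazily so buildHasuraClaims stays independently testable.
  const admin = require("firebase-admin");
  try {
    // Hasura delegates the mutation arguments under `input`.
    const { email, password, displayName } = req.body.input;
    const user = await admin.auth().createUser({ email, password, displayName });
    await admin.auth().setCustomUserClaims(user.uid, buildHasuraClaims(user.uid));
    // Respond with the shape of the action's User output type.
    res.json({ id: user.uid, email: user.email, displayName: user.displayName });
  } catch (err) {
    res.status(400).json({ message: err.message });
  }
}
```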
And if the custom claims were added successfully, we should respond with real data instead of the hard-coded values. So I replace them here, and that's it: we can restart our functions and try to add a new user. I go back to the Hasura console, add some real email, password, and display name, and execute it. Now we can see that the user was successfully created, and we work with it as with a normal GraphQL mutation — however, we know that behind the scenes it is just a REST endpoint. Okay, that's how it works. Let's move on, and see you in the next lesson. 26. Actions - Relations to the Database Tables: Hello, welcome to this lesson. We have learned what Hasura Actions are, which problems they solve, and how to create them. In the previous video, we created an Action which allows us to create a user profile, but now we should also be able to fetch this profile information. So let's quickly create an Action which we'll call userProfile. It will be a query with a single parameter id, which is required, and it will call the getProfile endpoint in our cloud functions. By the way, if you didn't notice, I didn't declare the type User in the new-types definition section, because we created it before and Hasura resolved it correctly. Now I go back to VS Code in order to create a handler for it. Let's create a separate TypeScript file, and I will just paste a code snippet here: it fetches the id from the input and returns the user profile which belongs to this id. It's very simple. The last step is to register our function in the index.js file, like this. Here we go. Now let's save it and restart our functions. Our functions have started, so I build a query, and we can see that it works like a charm. Now, don't you find our Action a little bit useless? I mean, we can get the profile information, but usually I also want to fetch the photos posted by this user, or maybe the comments, right? Hasura allows you to create such relations, so let's have a look at how to build them.
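For reference, the getProfile handler created in this lesson might be sketched as follows — written as a factory so the Firebase dependency can be injected (and faked in tests); the exact field mapping is an assumption.

```javascript
// Factory form: pass in admin.auth() (or a fake during testing).
function makeGetProfileHandler(auth) {
  return async (req, res) => {
    try {
      const { id } = req.body.input; // Hasura puts action arguments under `input`
      const user = await auth.getUser(id);
      res.json({ id: user.uid, email: user.email, displayName: user.displayName });
    } catch (err) {
      res.status(400).json({ message: err.message });
    }
  };
}

// In index.js, something like (sketch):
// const admin = require("firebase-admin");
// exports.getProfile = functions.https.onRequest(makeGetProfileHandler(admin.auth()));
```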
First of all, we need to extend our photos and comments tables with a new additional field called user_id, which will store the Firebase user ID. Then I will update some rows in both tables with a user ID. Right — now it looks better. Great. Now let's go to our Action userProfile and find the Relationships tab; here we can create one. We just pick what kind of relationship this is — in our case it's an array relationship, which means that a profile can have many photos, so it's a one-to-many relationship. Then I define a name and pick the reference schema. Then I pick the table — in our case, the table photos — and say that the user profile's id relates to the user_id field in the photos table. Then I hit Save, and I repeat the same for the comments. All right, looks good. Here we go — we are done. Now let's go back to GraphiQL and try to fetch some photos and comments as well. I build a query, and we have fetched the photos and comments posted by the user. Isn't it great? But there is one remaining problem: this is a one-way direction, from the Action to the table. Unfortunately, we cannot do the opposite — create a relation from the photos table to the Action userProfile and then fetch a list of photos with user profile information. At least in Hasura 1.3 it is not possible, but there is another way to overcome this issue, and the solution is called remote schemas, which we will learn about in the next lesson. So see you there. 27. Remote Scheme - What is that?: Hello, and welcome to this lesson. So what is a remote schema? The idea behind it is very simple. Imagine that you have multiple microservices you want to integrate into your application. Let's use our case: we now have our microservice powered by Hasura, which is responsible for photos and comments, and we have a separate microservice which we will build later.
It is also going to be a GraphQL server, but it handles everything related to user profiles, authentication, and so on — I will call it the account service. Our photo project will depend on this account service, because we want to fetch photos along with the user information, right? As long as these two microservices stay separated, our client — some React or Angular application — needs to call for the photos first; once the photos have been fetched, it needs to collect the user IDs, then make another call to the account service to fetch the profile info for those IDs, and so on. After all this, it needs to map photo entities to user profiles on the client side, which is not convenient at all — I hope you agree with me. This is where remote schemas can be a better solution: they allow you to, let's say, merge or stitch multiple schemas together and make them behave as one big GraphQL schema. Sounds great, doesn't it? So let's try to implement it in the next lesson. See you there. 28. Remote Scheme - Create and connect external GraphQL Scheme: Hi, welcome. In this video we're going to implement the remote schema, and first of all we need another GraphQL server. I will spin up an Apollo Server in my cloud functions. First, we need to install one npm package, called apollo-server-cloud-functions. It is just a wrapper around Apollo Server which provides some useful helper features to work easily with cloud functions. Then I will create a separate file for our function and paste some code there. The code is very simple: I just define the types and one resolver for userProfile, which takes one parameter called id and returns null if no id was provided, or otherwise the profile information, if it was found. Then I configure the Apollo Server and just export the instance. Here we go. Now I go to the file index.js, and I'm going to create a function from the Apollo Server, just like this.
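The file just described might look roughly like the sketch below. apollo-server-cloud-functions is the real package name; the schema fields and the in-memory profile map are stand-ins for the lesson's Firebase-backed lookup, and the export line in the comment is an assumption about the wiring.

```javascript
// Schema and resolvers kept as plain values so the logic is testable
// without spinning up the server.
const typeDefs = `
  type UserProfile {
    id: ID!
    email: String
    displayName: String
  }
  type Query {
    userProfile(id: ID): UserProfile
  }
`;

// Stand-in data source; the lesson resolves real Firebase users instead.
const profiles = {
  u1: { id: "u1", email: "demo@example.com", displayName: "Demo" },
};

const resolvers = {
  Query: {
    // null when no id was provided (or nothing was found), else the profile
    userProfile: (_parent, { id }) => (id && profiles[id]) || null,
  },
};

function createServer() {
  // Required lazily so the resolver stays independently testable.
  const { ApolloServer } = require("apollo-server-cloud-functions");
  return new ApolloServer({ typeDefs, resolvers });
}

// In index.js, something like (sketch):
// exports.graphql = functions.https.onRequest(createServer().createHandler());
```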
Right, that should be enough to run our Apollo Server. I just need to restart the functions, and then we can navigate to this function's URL. All right guys, if you encounter the same error as I did, just go to the tsconfig.json file, set the property skipLibCheck to true, and it should fix the issue. Let's restart our functions and navigate to the function's URL. All right, cool. I can even build a query: I query the userProfile, asking for email and display name, and when I run it I get null — which makes sense, because I didn't provide an ID. Now let's copy the endpoint of this GraphQL schema and go to the Hasura console. Here we have the Remote Schemas section, so let's go there and add a new schema. I just give it some name, like "Firebase user profiles example", and paste the URL we copied. But remember that you have to replace localhost with host.docker.internal. This is pretty much it, so let's save this remote schema. If we go back to GraphiQL, we see that our userProfile from the Apollo Server was added to the explorer. It may be a little bit confusing, because we have an Action with the same name, so let's rename userProfile to something like firebaseUserProfile. I will go to VS Code and rename it here. Here we go — I save the file, and now I need to redeploy the functions. By the way, keep in mind that if you change a remote schema, you also have to reload it in Hasura in order to reflect the latest changes. So I go to my remote schema, find the Reload button, and click it in order to refresh the remote GraphQL schema. Now we see our firebaseUserProfile in GraphiQL, and we can even fetch something from it. Let me build a query and provide the ID of the Firebase user which we created in the previous lessons. Here we go — I run it, and great, it works fine. But now you can do even more: you can go to the photos table, as an example, and create a relation to this remote schema. So let's do it.
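Once that relationship is in place, a single query can traverse from the table into the remote schema. Something like this sketch becomes possible — the field name userProfile is whatever you choose to call the relationship, and the other field names follow the lesson's schema, so treat them as assumptions:

```graphql
query {
  photos {
    photo_url
    description
    # "userProfile" is the relationship name you pick; Hasura feeds the
    # row's user_id into the firebaseUserProfile resolver behind the scenes
    userProfile {
      email
      displayName
    }
  }
}
```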
Just find the button Add Remote Schema Relationship. Here we go — let's give it some name. I pick the remote schema and configure it so that the value of the field user_id will be used as the input parameter for our firebaseUserProfile resolver, which lives in the Firebase cloud function — this one. Now I can save and test it. I will fetch some photos from our database, and then I want to add information about the user profile. And do you see? Our remote schema is now available, so I can fetch the user profile as well. Then I run this query. All right, let's scroll a little bit — and do you see? Our first photo was resolved with the user profile. It's just amazing. This is how you can easily merge multiple GraphQL schemas together, and it is super useful when you need to extend Hasura's business logic and your microservice infrastructure is represented by multiple GraphQL servers. But if your microservices talk to each other via REST, then it makes sense to have a look at Hasura Actions, which we learned about in the previous lessons. Now we should move on and explore more Hasura features, so see you in the next lessons. 29. Section recap: Hi, welcome. In this section we learned a lot about how to extend business logic in Hasura. You can see that besides CRUD operations, Hasura can also act as a kind of orchestrator between multiple microservices. There are still some restrictions — for example, you cannot create a relation from a remote schema to a Hasura table, and you cannot create a reference from a Hasura table to an Action. But Hasura is being actively developed, and most probably, by the time you watch this video, these issues will already have been solved — and if so, this course will be updated as soon as possible, so don't worry. Let's recap when to use what. We use events when we need to perform some operation after something was inserted, changed, or deleted in the database. Then we have Actions, and Actions are a good choice
if you have to somehow validate data before inserting it into the database; if you have to delegate some logic to a third-party microservice which supports only REST and you want to hide it behind a GraphQL interface, or you do not want to — or cannot — convert it to a GraphQL server; and Actions can also be a good solution if you need to fetch some data from a third-party service and you want to do nested queries to Hasura's tables in the same request. Last but not least, remote schemas: use them if you need to delegate some logic to a third-party microservice which also implements a GraphQL interface, or if you need to create a reference from a Hasura database table to a remote GraphQL schema. So, I would say that's it — we've finished this section, and I hope it was interesting. Good job, and see you in the next section. 30. How to protect Hasura Endpoints: Hello guys, welcome. In order to enable authentication, the first step is to secure our GraphQL endpoint. To do that, we go to our docker-compose YAML file and set one environment variable, called HASURA_GRAPHQL_ADMIN_SECRET, and give it some secret password. Now we have to stop our containers and run them again in order to apply the changes. If we now go to the Hasura console, we will see that it requires our admin secret password. Once we enter the password, we can see the console again. We can even try to fetch some photos, to be sure that everything works fine. But did you notice that we have an additional header, X-Hasura-Admin-Secret? That is actually the reason why we can still make requests — and if we change the value of this header, we get an access-denied error. This admin-secret authentication is always active, even if you explicitly configure JWT or webhook authentication.
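For reference, the docker-compose change described in this lesson might look like the following fragment — the service name and the secret value are examples:

```yaml
# docker-compose.yml (fragment)
services:
  graphql-engine:
    environment:
      HASURA_GRAPHQL_ADMIN_SECRET: my-long-admin-secret
```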
It also takes precedence over them, which means that if you provide X-Hasura-Admin-Secret with the correct password, you get full admin rights and can run any query against any endpoint. So be careful: pick a long password, do not commit it to Git, and do not use this password on the client side. There are other ways to handle authentication in Hasura, which we will learn in the next lesson. So see you there. 31. Authentication with JWT (JSON WEB Token): Hi guys, welcome. Another — and probably the most heavily used — way to implement authentication in Hasura is a JWT token. I think every authentication provider works with JWT, because nowadays it is the easiest and most secure way to authenticate a user. I hope you're already aware of what it is, but if you're hearing about it for the first time, I will quickly explain. JWT, or JSON Web Token, is a string which has been generated from some data — usually on the server — with some cryptographic algorithm and a secret key; in order to verify it, you need to know the secret key and the algorithm. We usually send this token with every request as the value of the Authorization header, and the server takes this string and tries to validate it with its secret. If validation passes, the request proceeds; otherwise it fails, which means the token is not valid. So how can we generate it? If you visit jwt.io and scroll a little bit down, you can find a library for almost any language which can generate and verify these tokens. Also, auth providers like Firebase or Auth0 provide JWT tokens for you, so you don't need to care about it. But in this video, I want to show you how it works without the Firebase magic, in order to understand what happens under the hood; later, in the next video, we will implement authentication with Firebase as well. All right, let's get started. First of all, we need to generate a proper JWT token with a proper payload.
Usually you would do this on the server side using a JWT library, once you have checked that the user provided the correct login and password, but here I will use the online builder. In the payload, you can have any data you need — but do not put any sensitive information there, like passwords, because it will be visible. In order to make it work with Hasura, we have to provide some additional data: it is the https://hasura.io/jwt/claims field, and inside there should be at least these two fields, x-hasura-allowed-roles and x-hasura-default-role. I set the role to admin for now; we'll talk more about roles in another video. You can of course provide your own custom fields as well — the only requirement is that they have to start with the x-hasura- prefix. Then you pick the algorithm you want to use — I will pick the default one — and provide a secret key. Knowing this secret, Hasura will be able to verify whether our token is valid. Now our JWT string is generated, and we can copy it and provide it in the Hasura console. So let's do it. This token we need to provide as an Authorization header, and the value should be "Bearer" followed by our token. The only problem is that for now this token will be ignored, because we haven't activated the JWT authentication mode in Hasura. To do this, we need to provide an environment variable called HASURA_GRAPHQL_JWT_SECRET, and as the value we have to provide an object, as a string, with either a type-and-key pair or a JWK URL. In our case, I will be using the type-and-key pair: I provide the type, which is our algorithm, HS256, and the key, which is our secret. So I go to jwt.io again and copy the secret. These are all the required fields you have to provide in order for Hasura to decode the token. Let's start our Docker containers. Now we can reload the console, disable the X-Hasura-Admin-Secret header, and try to execute a query without it — with only the JWT token.
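In docker-compose, the variable just described might look like this fragment — the key must match whatever secret signed your tokens (the value here is an example):

```yaml
# docker-compose.yml (fragment)
environment:
  HASURA_GRAPHQL_JWT_SECRET: '{ "type": "HS256", "key": "your-256-bit-secret" }'
```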
And you can see that we can still query our database — and it fails if we slightly change the token. I hope you now understand a little bit more about JWT and how Hasura decodes these tokens. In the next video, we'll create a Hasura Action which does the authentication for us. See you in the next video. 32. Authentication with JWT & Firebase API: Hi guys, welcome to this lesson. In this video, we will implement JWT authentication like we did in the previous video — but back then we were generating the token manually, and now we will let Firebase do it for us. Let's get started by creating an Action, and while I'm creating it, I will tell you why I chose to go with an Action rather than use Firebase directly in the app. The first reason is that we do not have any application yet — I didn't want to create an Angular or React application now, so I wouldn't be able to use the Firebase library (the app itself we will create in the next sections of this course, don't worry). Another, more important, reason is that we want some abstraction layer, in order to not depend completely on Firebase. If one day we would like to change the auth provider, we would need to adjust our client application as well, and maybe also the cloud function. But if we hide the auth provider behind an abstraction layer — which in our case is the Action — we would only need to replace our backend, without touching the frontend part at all. All right, our Action is done: it is a mutation which sends a login and password to the login endpoint and expects back an access token. Now let's go to our cloud functions and create one. As always, I create a separate file and paste our handler template. First of all, let's get the email and password from the input. Here we go. Then I have to call Firebase's REST API endpoint in order to authenticate the user, because the Firebase Admin SDK unfortunately doesn't provide this option. So let's import the fetch function.
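Sketched in full, the handler being assembled here might look like the following. The identitytoolkit sign-in route is Firebase's documented REST endpoint; I'm assuming Node 18+ so the global fetch is available, and FIREBASE_WEB_API_KEY is a placeholder for your project's Web API key.

```javascript
// Firebase's REST sign-in endpoint (the Admin SDK can't check passwords).
const SIGN_IN_URL =
  "https://identitytoolkit.googleapis.com/v1/accounts:signInWithPassword";

// Pure helper: the JSON body the endpoint expects.
function buildSignInBody(email, password) {
  return JSON.stringify({ email, password, returnSecureToken: true });
}

async function loginHandler(req, res) {
  try {
    const { email, password } = req.body.input; // action arguments from Hasura
    const response = await fetch(
      `${SIGN_IN_URL}?key=${process.env.FIREBASE_WEB_API_KEY}`,
      {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: buildSignInBody(email, password),
      }
    );
    const { idToken } = await response.json();
    // Matches the action's output type: { accessToken: String }
    res.json({ accessToken: idToken });
  } catch (err) {
    res.status(401).json({ message: "invalid credentials" });
  }
}
```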
And then I do request to this end point. By the way, the web API you can find in the Firebase console, just go to the settings of your project. Alright, then we say that it is a method post and we provide Leguin and buzzword in the body as a JSON string. All right, and then after these, I extract ID token prom response. Nice. And then I return it as an access token. And the last step is to create Cloud Function and assign this handler to it. And let's restart our functions. And meanwhile, we can activate authentication by email and password in Firebase console. So to do this, just go to the authentication and then to sign-in methods and just activate this first option. And I think now we can try to run this action and see if we can login. So I go to our Hazara console. Then I have to build our mutation and I will provide the email. Then of course bus for heavy goal. Now we can run it. And what do we see? We see that Firebase generated that JWT token for us. So cool. And now we can grab it and replace our previous token with a new one. And of course it will be invalid because Firebase has different secret key and most probably algorithm of encoding. So we need to adjust our Hazara graph QL JWT secret in order to make it work. So if you navigate to Hazara dot IO, JWT dash config, you will see this small util which helps you with Firebase configuration. You just need to pick Firebase as a provider and provide Firebase project ID and just click Generate. Then we just copy it and replace current value with the new one. It will go and now we have to restart Docker containers. So my Docker containers were started. And if I reload now, we see the slightly different issue. And this is not because of invalid token. I would say that this is already permission issue because until now we have been doing queries, having a role admin do remember, we set it here. But if we decode our current token, which was generated by Firebase in custom claims, we see their role user. 
And the problem is that we don't have such a role yet — and that's why we see nothing: as a user with the role user, we simply have no permission to do anything. In the next video, we will learn how to create new roles and manage their permissions. See you in the next lesson. 33. Role-based Access: Hello, welcome to this lesson. In the previous video, we implemented authentication with a Firebase JWT token, but unfortunately we got stuck with a permission issue. In this video, we'll create a new role called user and give it permission to read, write, and delete items. Let's get started. Let's go to our photos table first. Here we see a tab called Permissions, and from this table we see that we have only one role, called admin, which has full access to all operations on this table. Let's now create a new role called user — I will just enter it here. Then I click on insert, and here we see "allow role user to insert rows". In our case, I would say that any logged-in user should be able to insert rows, do you agree? Then we can define things in more detail, like exactly which columns we allow values to be inserted into. I would say photo_url and description, because id, created_at, and updated_at are auto-generated. Then let's expand the column presets. Here we can assign a value from session variables, or some static value — and here we say that our user_id value should come from the session variable X-Hasura-User-Id. Do you remember where it comes from? If we look at our decoded JWT token, we see the Hasura claims and the keys inside which start with x-hasura- — those are actually the session variables. Here you can also see our x-hasura-user-id, which is the Firebase user ID. And do you remember where we defined it? We defined it in our cloud function, at the moment we created the user — you see, this is the place where we define it. Okay, let's go back to our permissions.
Here we can also see the "backend only" checkbox — so far we don't need it, but it can be useful for some specific cases, which I may cover in another video. For now, let's keep it simple and skip it. So I click the Save button. Now, if we go back to GraphiQL, we see the insert-photo mutation, and if we expand it, we see that there are only two fields which we can set. Let's add some values and test whether our session variables are also working fine. By the way, have you noticed that we cannot fetch info about the inserted photo? We only have affected_rows. This is because our user role has no read permissions — we'll fix it in a few minutes. But for now, let's execute our mutation. All right, the photo was inserted; let's check in the database whether user_id was set up correctly. And yes, we see that it was. Let's go to Permissions again and add permissions for the rest of the operations. For select, it will be without any checks, so every logged-in user can select photos, can select any field, and can also run aggregation queries. Now I save it. For update, there will be a custom check, because we do not want to allow everybody to edit — only the owner of the photo can do it, right? So I specify that the field user_id should be equal to X-Hasura-User-Id, and the post-update check should be the same. And, as with insert, we can define which columns may be edited — I want to allow editing only the description. Let's save it. The last operation is delete, and this operation is only allowed for the user who inserted the photo — similar to what we have for the update operation, so I can even reuse this logic. And save. Great, now we have everything properly secured. If we go to GraphiQL, we can see all the queries and mutations allowed for our user role. We also need to do similar things for the comments table, but that will be your homework, and in the next video I will show you how I solved it. I wish you good luck with your homework, and see you in the next lesson. 34.
Anonymous Role: Hello again, welcome. Hopefully you handled your homework, so let's compare it to what I have done. For insert, I allow any logged-in user to insert a comment; a user can set the photo_id and the comment text only, and the value of user_id comes from the session variable X-Hasura-User-Id. For select, I have full access without any restrictions. For update, only the user who posted the comment can edit it, and he or she can edit only the comment body. And the same rule I applied for the delete operation, meaning only the comment's owner can delete it. If you have the same or something similar — congratulations, good job. The topic of today's video is public access. Let's describe the issue: once we enabled authentication for our GraphQL endpoints, it means that all of them are protected. But very often we need public access to some operations — like the login mutation, which should be available to anyone; or maybe, if you build some photo-stock service, you want to show photos without making the user log in. So how can we do that? Well, in Hasura you can define a role which will be used when no Authorization token or admin secret header is provided, and grant that role certain permissions. Let's go back to the photo permissions and create such a role. I create an anonymous role and edit it — you can name it however you want, but I will name it anonymous — and I will allow this role to select all photos. Then we have to tell Hasura what we named this role. We do it in our docker-compose file, with an additional variable called HASURA_GRAPHQL_UNAUTHORIZED_ROLE, and we set the value to anonymous, or however you decided to name it. After this, we have to restart our Hasura containers, which will take some time. Now, if we go back to GraphiQL and disable the Authorization header, we see that endpoints like photos, photos-by-primary-key, and firebaseUserProfile are available. Why is the profile available?
Because it is a remote schema and it relates to our photo table, so it was also exposed. But if you need to restrict access there, you have to handle permissions manually in the remote schema. But we still cannot see two important endpoints, which are login and create user. Those are actions, and actions also have permissions, similar to what we had for tables. Let's configure our actions as well. I'm going to open the login action. Here we see the permissions, and let's say that we allow the anonymous user role to execute it. And let's go do the same for the create user action. Alright. Now, if we also open mutations, we see that those two are also available for a non-logged-in user, which is exactly what we wanted. This is how you can configure public access for your application. I hope it was clear, and see you in the next video. 35. Webhook Authentication Mode: Hello guys, welcome to this video. Authentication is not always about login, password, and JWT tokens. Sometimes authentication can be really non-standard. So let's imagine that you need to allow access to the endpoints if the user sends some header, like a secret header with the value "trust me". Hasura definitely doesn't know how to handle that, but it allows you to delegate the authentication process to some other HTTP endpoint which knows how to do it. You can configure it by setting up the HASURA_GRAPHQL_AUTH_HOOK environment variable and disabling HASURA_GRAPHQL_JWT_SECRET and HASURA_GRAPHQL_UNAUTHORIZED_ROLE. So let's set it up. I will comment out these two environment variables and add the variable HASURA_GRAPHQL_AUTH_HOOK, and the value will be my endpoint, which I'll create in my Cloud Functions. So let's save it and restart the containers. It will take some time. Alright, our containers are up and running. Then I will create the auth hook handler. Here the logic will be really simple. I just extract the secret header from the headers which Hasura sends. 
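Putting this walkthrough together, here is a minimal sketch of such an auth hook in plain JavaScript. The header name `secret-header`, its expected value, and the function name are assumptions for this demo, not the course's exact code:

```javascript
// Hypothetical auth webhook logic: Hasura forwards the client's headers,
// and we either return session variables or signal failure with a 401.
function resolveSession(headers) {
  // Assumed custom header and value -- replace with your own secret
  if (!headers['secret-header'] || headers['secret-header'] !== 'trust me') {
    const err = new Error('Unauthorized');
    err.status = 401; // Hasura treats a 401 response as failed authentication
    throw err;
  }
  // On success, respond with any X-Hasura-* session variables you need
  return { 'X-Hasura-Role': 'user' };
}
```

In a real Cloud Function you would wrap this in an HTTP handler that reads `req.headers`, catches the error, and sends the status code and JSON body accordingly.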
Then I check if the custom auth header doesn't exist or has an improper value; in that case, I throw an error. Then we send a response with any Hasura session variables you want. In our case, there will be X-Hasura-Role only, which is equal to user. And Hasura expects that if authentication failed, this endpoint should return a response with status 401. Just don't forget to export this function in the index.js file. Now we can save it and restart our Cloud Functions. Here we go. Then let's go back to the Hasura console, and now let's reload it. And we can see all our queries and mutations which are available for anyone who has the role user. I can even build a query and fetch photos, and you can see that it works as expected. And now if I try to change the value of my header, we see that we have no permissions anymore. So cool, our authentication hook is working. Use it if you need to perform some really custom authentication logic. But for the rest of the course, I will revert to the JWT authentication method, if you don't mind. And that's it about webhook authentication, and we are moving on. So see you in the next videos. 36. Migrations - Describe a Problem: Hello again, welcome to this section. So we have achieved a lot. We can build advanced queries with sorting and filtering capabilities, we can do mutations, and we know how to extend business logic in Hasura and how to enable authentication. Sooner or later we will have to run our project in some other environment, maybe a testing or staging server, or maybe we want to share our project with another developer via a version control system. In order to reproduce this, we can stop our containers. Then I'm going to remove the db_data folder, because we obviously don't want to commit database data to git, right? So once another developer pulls our project, he or she will get the same files and folders except the db_data folder. And let's try to run the Hasura containers again, and let's see what will happen. Alright, it takes some time. 
Now, we can go back to the browser and... oh well, looks bad, right? It turns out we lost everything we were working on. Fortunately, I made a backup and I can easily recover my data. I just need to stop the Docker containers, and this newly created db_data folder I replace with my old one. Now, let's try to restart our Docker containers. And once our Docker containers are up and running, we can go back to our Hasura, and we can see that our data is back. So I hope now you see the problem, and in this section we will try to solve it with Hasura migrations and metadata. So let's move on, and see you in the next lesson. 37. Migrations - What is Migrations & Metadata: Hi guys, welcome. Do you agree that before we create any migrations or metadata snapshots, we have to figure out first what migrations and metadata are? Well, both these things are responsible for making a snapshot of your Hasura state and restoring it somewhere else. Migrations are responsible for re-creating your Postgres database schema, and each migration is represented by two SQL files called up and down. The up file contains the SQL script which should be executed when you apply some particular migration, and the down file contains the SQL script which should be executed when you roll back from some particular migration. Usually you can see that up.sql creates some table and some fields, and the down.sql script drops everything that up.sql has created. It allows you to travel back and forth between your database schema versions and keep the history of your database schema changes. You can imagine it as a git for your database. Besides the database schema, we also have a lot of other things to track, like permissions, relationships, event triggers, and other things which are being used to describe the exposed GraphQL API. But this is not covered by migrations, because it doesn't belong to the database. And this is where metadata comes into play. 
Metadata is represented by several YAML and GraphQL files under the metadata folder, and we will have a look at them a little bit later in this section, where we export metadata from our current state. I hope now you have a little bit better understanding of the difference between those two. But if not, I'm pretty sure that things will get clear once we start to work with it. So let's move on and explore Hasura migration capabilities. See you in the next video. 38. Migrations - Meet Hasura CLI: Hello guys, welcome to this lesson. We could create and manage migrations manually, but trust me, this is a lot of work and it is very error-prone. The better way is to use the Hasura CLI, which will allow you to automate a lot of things. So let's install it. If you are on the Hasura docs page, find in the sidebar the section "Hasura CLI" and there "Installing the Hasura CLI". Here you will find instructions on how to do it for your operating system. Once you successfully install the Hasura CLI, you should be able to run in your terminal the command hasura version, and it should return the version of your CLI. Now, in the root folder of our project, we can run the command hasura init in order to create all necessary files and folders. It will ask you how you would like to call the project folder. I will name it hasura-server. After this you will see a new folder created, and if you expand it, you will see folders for migrations, metadata, seeds, and a config file. At the moment the folders migrations and seeds are empty, but the metadata folder has some files which will store a snapshot of our actions, remote schemas, and so on. They are all empty for now. And we also have config.yaml, where we can set up some settings of our project, like the Hasura server host or the folder path for metadata, and so on. We will see it in action in the next videos of this section. Great, now we have successfully initialized a Hasura project. 
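For reference, a config.yaml of the kind described here might look roughly like the following. The exact keys depend on your CLI version, and the values below (local endpoint, admin secret 123) just mirror this course's setup, so treat this as a sketch:

```yaml
# Hypothetical config.yaml after `hasura init` (keys vary by CLI version)
version: 3
endpoint: http://localhost:8080
admin_secret: "123"
metadata_directory: metadata
migrations_directory: migrations
seeds_directory: seeds
```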
And let's move on and learn what kind of benefit we can get from it. So, see you in the next lesson. 39. Migrations - Create Initial Migration: Hello guys, welcome to this lesson, and let's start to create our first migrations and metadata. Let's get started with migrations. In order to create the initial migration, and I would like to highlight the word initial, we have to navigate to the project folder and run the command hasura migrate create init, where init is actually the name of the migration. Also, we have to say that we want to create the migration from our current server state, so we have to add the flag --from-server. Then we need to provide Hasura's endpoint URL via the --endpoint flag, and there we have to set our HTTP URL, localhost:8080. And because we already use authentication, our console is protected, so we also have to provide the --admin-secret flag with the value of our password, which is 123. So wow, quite a long command, isn't it? But we can simplify it. We can define all these properties in our config.yaml file. If we open this file, we see that the property endpoint was already correctly set up, and right here we can also define our admin secret, like this. Now we can save it, and we can remove the --endpoint and --admin-secret flags. So our command now looks way shorter and cleaner. Now we can hit Enter, and as you can see, our initial migration was created. So let's have a look at how it looks. We can see that a folder was created with a name which consists of an ID and the migration name, and it contains one up.sql file. But if you remember, I told you there would be two files. Because this is the initial migration, there is nothing to roll back, so we simply don't need a down.sql file. And if we open our file, we see that Hasura generated an SQL script which creates our tables photo and comment. 
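As an illustration only, an initial up.sql of this shape might contain statements like the following; the exact columns Hasura emits depend on the schema you built in earlier lessons, so the column list here is an assumption:

```sql
-- Hypothetical excerpt of migrations/<id>_init/up.sql
CREATE TABLE public.photo (
    id uuid DEFAULT gen_random_uuid() NOT NULL,
    description text,
    user_id text NOT NULL
);

CREATE TABLE public.comment (
    id uuid DEFAULT gen_random_uuid() NOT NULL,
    photo_id uuid NOT NULL,
    comment_text text NOT NULL,
    user_id text NOT NULL
);
```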
Then it also creates foreign and primary keys, indexes, and so on. So it is a whole snapshot of our database schema. Great. Now we know how to create our first migration and what it looks like, so let's learn how to export our metadata in the next lesson. See you there. 40. Migrations - How to export Metadata: Hello guys, welcome to this video. In the previous one we learned how to initialize our migrations, but what about metadata? Well, in order to export metadata, you have to run a similar command, which is hasura metadata export. Of course, if you decided not to configure your config.yaml, you have to provide the --endpoint flag and --admin-secret as well. But we have already configured the config.yaml file in previous videos, so we can just hit Enter. Cool, the CLI says that the metadata was exported, so let's have a look at how it looks. I want to start with actions.graphql. Let's open it. Do you remember these types? These are the custom types which we created for our Hasura actions. There is our create user action, login, user profile, sign-up credentials, and so on. In actions.yaml, we can find the configuration of our actions, like the name, handler URL, permissions, and all that stuff. And I think you will find something interesting in each of these files; for example, in tables.yaml you can find metadata which describes the permissions and relationships which we created. Here you can also see event triggers and their webhooks. Everything is there. Next to tables.yaml is the config for the remote schema, and if you remember, we created it earlier in the video about remote schemas. Okay, that was pretty much everything I would like to mention about this topic. It was very short and very easy, but still it is important. We should move on and explore other features. So see you in the next video. 41. Migrations - How to check Migration Status: Hi guys, welcome. 
In the previous lesson we learned how to export metadata and create the initial migration. In this lesson, we will learn how to check the migration status and apply migrations. The Hasura CLI has a command, hasura migrate status, which displays the migration status. Now you can see a table with the columns version, name, source status, and database status. With the first two, everything is pretty much clear, I think, but more interesting are the next two columns. The source status "Present" means that this migration exists in our source code, in the migrations folder. If you see "Not Present", it means that the migration was applied on the server but doesn't exist in the source code. It can happen if multiple people are collaborating on a remote Hasura server and one of the collaborators forgot to pull the latest changes with migrations. The database status says "Not Present", and it means that the migration exists but was not yet applied to the database, because hasura migrate create only creates the migration but doesn't apply it. In order to synchronize our source code with the database, we have to apply our migrations. To do this, usually you only need to run hasura migrate apply. But in our particular case, it will throw an error, because Hasura will run the migration script, which tries to re-create already existing tables. You have two ways to fix this issue. First, you can remove the db_data folder and restart the Docker containers. And the second way is to add the flag --skip-execution, which will mark the migration as applied but will not execute it, like this. And when you run the command with --skip-execution, you also have to define the migration version explicitly via the --version flag. Okay, good. Now we see that the migration was applied. Now if we run hasura migrate status again, we see that the migration is now present in the source code and also in the database. By the way, let me show you one trick. 
Do you remember the Hasura instance in Hasura Cloud which we created at the very beginning of this course? It should be completely empty. And you know what? We can apply migrations from my local computer not only to the local Hasura instance, but also to a remote one. I just need to copy the URL of the cloud Hasura instance and run hasura migrate apply again, but this time with the flag --endpoint, which points to the cloud Hasura instance. It will allow us to override our localhost:8080, which we have in config.yaml. And we can apply the migration to the cloud instance, and here we go, the migrations are applied. And if we reload our Cloud console, we see our tables. Yes, they are not public and not exposed to GraphQL, because this is the task for metadata, which we apply in the next video. So see you there. 42. Migrations - How to apply Metadata: Hello guys, welcome to this lesson. In this video, we'll learn how to apply metadata to our Hasura instance. First of all, you most probably want to know if the metadata definitions in your source code are different from what is applied on the server, right? Something similar to what hasura migrate status does. In order to see the difference, use the command hasura metadata diff. You can also point it at some concrete file, for example actions.yaml, but I will leave it at just hasura metadata diff in order to see the difference across all files. And now I execute it. As you can see, it shows what exactly differs from what is applied on the server. In our case, there are no changes, so let's try to change something in the metadata. Let's say in tables.yaml I want to change allow aggregations for the role user from true to false. Now let's save it and run it again. Then if we scroll a little bit down, we see that our change was highlighted with red and green colors, where the green one shows what is currently on the server. 
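As an illustration, the edited entry in tables.yaml might look roughly like this. The exact layout of the metadata files differs between Hasura versions, so the structure below is an assumption, not the course's exact file:

```yaml
# Hypothetical excerpt of metadata/tables.yaml
- table:
    schema: public
    name: photo
  select_permissions:
    - role: user
      permission:
        columns: "*"
        filter: {}
        allow_aggregations: false   # changed from true for the diff demo
```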
Okay, this is how we can see the status of our metadata. What about applying? In order to apply, you just need to run hasura metadata apply. As easy as that. Also, you can apply metadata to some remote endpoint, like we did for migrations. So I will just add the --endpoint flag, which will point to our cloud Hasura instance, and hit Enter. Yeah guys, I forgot that our remote schema points to our local Firebase instance, and it cannot be accessed from the internet. So we can deploy our Cloud Functions. To do this, I will navigate to my functions folder and run npm run deploy. It will take some time, sorry. Okay, great. Our functions were deployed, and now I just need to apply a temporary fix: in remote_schemas.yaml, I just need to replace the URL with the one of our deployed Cloud Functions. Now, if I apply the metadata, it should work fine. So yeah, let's check. Yeah, do you see? But most probably our actions will be broken, because they still point to the functions which are running on localhost. Well, I think I would rather create a separate video, maybe in some bonus section, where I will explain how to handle these multiple environments. But for now, I revert this change in order to be consistent with our local environment. And yeah, anyway, the good thing is that if we reload the cloud Hasura instance, we'll see that our queries, permissions, and relations are there, and we successfully restored our local state on the remote environment. Okay guys, that was it about applying metadata, and we have to move on. So see you in the next video. 43. Migrations - How to keep your Migrations & Metadata always in Sync: Hi guys, welcome. We have already learned how to create the initial migration and metadata. And it is great, because it allows us to make a snapshot of the current Hasura metadata and database schema state. But once we start to develop new features, our database schema and metadata will definitely change. 
And we should keep these changes always in sync with our migrations, because if we miss at least one migration step, other developers who pull our changes will get a broken application. One option is that you can run the hasura migrate create command, which would generate a new migration with up and down SQL files where you can manually write your migration scripts. But it takes a lot of time, it is annoying and very bug-prone. However, there is a better option. The Hasura CLI has a command called hasura console, and if you run it, you will be redirected to the same Hasura console, but it will be served on a different port. And here a small hint: you can always change this default port by adding the flag --console-port, and then you can define any port you like. And now, what is most important: if your console is opened this way, then every change you make to your database schema or metadata will produce a new migration and a metadata synchronization. This is extremely useful and convenient, and you can be sure that you didn't miss any migration step. So let's try to change something in the photo database schema. I will add a new column called is_published, which is of type Boolean, non-nullable and true by default, and then I will save it. And then let's also try to modify some metadata. In permissions, let's say that we allow selecting this field only for logged-in users. And then I also want to allow users to update this field. Yeah, why not? Cool. Now let's go to VS Code and see what the Hasura console generated for us. Inside the migrations folder, we see a new migration. If we open the up.sql file, we can see the SQL script which creates our is_published column with the type and default values which we defined. And inside the down.sql file, we see that we drop this column. As I said earlier, this down.sql file will be called if we want to roll back changes. 
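To make the generated up/down pair concrete, the files for this change might contain something like the following sketch; the exact SQL Hasura emits can differ:

```sql
-- Hypothetical up.sql for the new column
ALTER TABLE "public"."photo"
  ADD COLUMN "is_published" boolean NOT NULL DEFAULT true;

-- Hypothetical down.sql, reverting the same change:
-- ALTER TABLE "public"."photo" DROP COLUMN "is_published";
```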
Also, if we go to tables.yaml, you can see that it was also updated: the permissions were added here. And now, knowing this super useful feature of the Hasura CLI, the best practice would be to disable the built-in console and always access it via the hasura console command. So I can stop my current instance of the Hasura console, then I stop the containers, and in docker-compose.yaml I set HASURA_GRAPHQL_ENABLE_CONSOLE to false and start my containers again. Now, if I navigate to localhost:8080, I get nothing, because the console is now disabled for direct access. But if I run hasura console, I can use my Hasura console as before, and I can be sure that all my changes to the database and metadata are always being saved in the appropriate files. Okay, that's it for this lesson, we have to move forward, and see you in the next lesson. 44. Migrations - Migration Squashing: Hi guys, welcome to this lesson. In the previous video we explored a nice feature called hasura console, which allows us to create migrations automatically every time we change the database schema. Back then we made a very small change and added just one single field called is_published. But usually, in real work, during feature development we make way more massive changes. So let's also add some change to the comment table. I will go to this table, and I want to add the same is_published field for our comments. Additionally, I maybe want to know when the review of a comment was done, so I will add an additional field reviewed_at, which is going to be a timestamp. And here we go. Now let's get back to our migrations. And wow, we barely touched the schema but already got three migrations. If we continue to work like this, we will end up with hundreds of migration files, which will be really super hard to manage, and it will be hard to understand which migration belongs to which feature. Do you agree that it would be great to squash them somehow into one migration per feature? 
It would allow us to keep fewer files, and this grouping by feature makes it easier both to understand the intention of every concrete migration and to roll changes back if we need to. And as you may have guessed, the Hasura CLI has such a feature, and it is called hasura migrate squash. You just need to run this command with some additional flags, like --name, and the name, let's say, will be feature_is_published. And then you have to tell it from which migration you want to start squashing. In my case, it will be the migration which goes right after the init migration. Now let's try to run it. You can see that Hasura asks if we want to delete those three migrations, and I pick yes. Now you see that our three migrations turned into one. And it is definitely worth mentioning that this feature is experimental, so be careful if you are planning to use Hasura in production. It could be better not to immediately delete the source migrations during the squashing, as we did; instead, you can test the squashed migration first, and only if it works fine, delete the source migrations. But maybe when you are watching this video, it is already stable, so just check the documentation. And the last thing worth mentioning is that we have to apply our new migration without execution, same as we did for the initial migration, because if we run hasura migrate status, we see that it is not present in the database. So I run hasura migrate apply --skip-execution, then --version, and the version of my squashed migration will be this one. And now we see that everything was successfully synchronized. Alright guys, that was it about Hasura migration squashing. So let's move on, and see you in the next video. 45. Migrations - Seed Migrations: Hi guys, welcome. Until now we learned how to restore database schemas by using migrations, and this is an awesome feature. But there is one remaining issue though. 
We cannot restore the data in the tables. Usually we don't need to, because development databases, as well as databases on dev, staging, or especially production environments, should be completely independent. However, sometimes data should be there regardless of which environment we use. Let's say we want to add a new feature to our photo app which allows users to assign some geo-tag or the name of the city where the photo was made. So let's create a separate table for cities with the columns id and name. The id will be, as usual, a unique ID, and the name is just a string. Then we go to Permissions, and we allow reading it for every role. Alright, it looks good. Then in photos, we have to create an additional field called city_id, which is also a foreign key that will have a relation to the cities table. Now let's create this foreign key. Good. And then we should expose this relationship to GraphQL. Good. And now let's go back to the cities table. And there is the question: we know that we support some certain cities, but who will be adding these cities to the database, and how? Manually? No way, we are developers, we should automate it. And this is where seed migrations can help out. As you already understood, seed migrations allow you to fill a database table with some certain data. And this is how we can create one. The first way is to run the command hasura seed create, and then you can give some name to this migration, like, I don't know, cities_feature. And there, interactively, you can write SQL code like INSERT INTO table, blah blah blah, if you want. But there is an easier way. We can go to the console and insert the cities right there. So I will add the following cities: the first one is the best city for living in the world, where I currently live; then the capital of my country; and then the city where I was born and grew up. So hello, countrymen, if you are watching my videos, I hope you are doing fine. 
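A seed script capturing rows like the ones just inserted could look as follows; the city names and the gen_random_uuid() default are placeholders, since the actual values come from the instructor's database:

```sql
-- Hypothetical seeds/<id>_cities_feature.sql
INSERT INTO public.cities (id, name) VALUES
  (gen_random_uuid(), 'City A'),
  (gen_random_uuid(), 'City B'),
  (gen_random_uuid(), 'City C');
```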
Now, once I have inserted them, I can go back to my VS Code, and in addition to our hasura seed create command, I can add the flag --from-table and say that it is my cities table. And if I run it, Hasura will generate an SQL script which will insert our cities into the appropriate table. And if you want to apply it, just run hasura seed apply. But I will not do this, because my cities are already there in the database. So that was seed migrations, and I hope you will find a lot of use cases for them in your application. But we have to move on, so keep watching and see you in the next video. 46. Migrations - How to rollback Changes if something went wrong: Hello guys, welcome to this lesson. Let's imagine the situation that our last migration broke something. Let's say we dropped some field we should not have dropped. A familiar situation, isn't it? In this case, we can roll back our changes, and we can do it with the already familiar CLI command hasura migrate apply, but with an additional flag. This flag is --goto, and then we should tell it to which version we are rolling back. I can take the ID of the migration I want to apply. So let's say I want to revert everything up to our feature_is_published migration. And then, when I hit Enter, Hasura will walk through all our migrations from the last one to our feature_is_published one, and inside each migration it will execute the down.sql script. And if you remember, in down.sql we drop all the tables or fields which the migration created. So once I execute this command and reload the Hasura console, we can see that our cities table is gone. And if I want to go back to our latest state, I can change the ID to the ID of my latest migration, or I can just run the command without any parameters; it will do the same. Now, if we reload the console, we see that our cities table is back, unfortunately without data. But we already know how to fix that: I just run hasura seed apply. Now let's go to the console, and we see that our data is back. 
And if you need, for whatever reason, to roll back only one concrete migration, you just run hasura migrate apply, then --version, and then you define the version of this migration. Let it be, I don't know, feature_is_published, and add the additional flag --type and say that the value is down, which means we want to roll back. If you run this, you will see that the field is_published is gone. If you would like to revert it back, just change --type to up, or run it without the --type flag; the result will be the same. And this is how you can manage your migrations. I hope everything was clear to you, and see you in the next video. 47. Improvement - Fix Permissions for User Roles: Hi guys, welcome to this lesson. In this short video, we will do some preparations and refactoring which we will need in order to start the development of our frontend apps. The first thing we need to adjust is the login action. We have to slightly extend it: besides the token, we also need to return a user ID. So I will go to the actions and find our login action, and then inside the login object I will add the field id, which is a required string as well. Now let's save it and go to the functions and adjust them as well. So I will find the file called login. Here we go. And here, besides the idToken, we can also fetch localId, which is actually the ID of the user. Great, now let's restart our functions. Okay, looks good. Then let's go to our console; I will quickly build the login query, right? And now I run it, and it works fine, so great. Then there is one remaining thing in our actions. So let's get back to the actions and set up permissions for the user profile action as well. Here we need to allow the execution of this action also for the role user. And the next improvement is that we need to create a reference from our comment table to the user Firebase profile, in order to get the user information along with a comment, similar to what we did for the photos table. 
So let's go to the comments table, then to the Relationships tab, and add a remote schema relationship. It will be the Firebase user profile, which is the reference to our remote schema, where id relates to user_id in the table. Alright, looks good, and now I can save it. And the last but not least thing I have to do, I think, is to go to the permissions for our comments table and allow the role user to perform aggregation queries, like this, because in my app I'm going to render a counter of how many comments some particular photo has. Alright, I think that's it, and we can move forward. 48. Improvement - Create Action "Upload File": Hi guys, welcome to this lesson. In this video, we will prepare the Hasura action which we're going to use for file uploading. The idea will be very simple: on the frontend side, we will just take a file and convert it to a base64 string. Then we will send the string to our Cloud Function, which will convert it back to a file and save it in the Firebase storage. So let's implement it. First of all, let's create a new action. This will be a mutation with the operation uploadPhoto, and it will take only one param, let's say base64Image, which is a required string. And it will return UploadResult, which will have only one field called url, and its type is a required string as well. Then I will also add the URL of the handler, which is our Cloud Function with the URL /uploadPhoto. And now we can save it. Also, do not forget to grant permission to execute this action for the role user. Good. Now we have to create a Cloud Function called uploadPhoto. Inside, I will paste an empty upload file handler stub, and then let's go to our index.js file; I'm going to register this function there. Good, let's save it. And in order to upload files from Cloud Functions, we need to configure a service account. In order to do this, we need to adjust our initialization, which we do in the createUser file. 
So first of all, let's extract it and move it into the index.js file, and do not forget to adjust the imports as well. Okay, that looks good. Now we need to generate the right config, and we can do it in our Firebase console. Alright, being here, we go to Settings and find the tab Service accounts. Here you can already see some example code which we can copy. But besides this, we need to click the Generate new private key button in order to download the config. Once it is downloaded, you just need to copy this file somewhere under the functions folder in our project. And let's rename it as well; I'm going to name it, let's say, service-account-key. Here we go. Now let's replace our current initialization with what we copied and just adjust the path to the service account key file, like this. One more thing is that we need to provide the default storage bucket, meaning where our files should be uploaded by default in our Firebase storage. The bucket path we can also find in the Firebase console; we just have to go to our Storage section, and here we can copy the folder path. Now let's go back and paste it here. Alright, so far so good. Do not forget to save it, and let's finish up our Cloud Function. So we go back, and first of all we need to fetch the base64 string which comes from Hasura. Then, using this regex, we extract the content type; in our case it can be image/png. The next step would be to create a reference to the file in the bucket, and here we also define the folder where it will be stored. Then let's give some unique name to the image; in my case, I will use just a timestamp. Now we extract the pure base64 string, then I have to create the image buffer. Now we save our buffered file and define its content type. After this, we have to get a signed URL, which is technically our link to the file, and this piece of code does exactly that. 
So here we request a signed URL and say that the action is read and the expiration time is, well, in my case ten years, but you can of course define your own logic. And in the end, we just send this URL back to Hasura. Okay, this should be enough. So let's save it and try to restart our functions. While they are being restarted, I just would like to mention that this code is only a demonstration. You should not use it in production as-is, because it is insecure and simplified as much as possible, and you have to handle the different edge cases on your own. The functions have restarted, so let's test it. I go to our Hasura console and build the uploadPhoto mutation. Here we go. Now we need to generate a base64 string from some image, and as I said earlier, in the real app this will be done by our front-end client. So for this demo I will use some online converter. I found this one, and here we can pick the image. I also have to pick the output format, which is going to be the data URI. Then I just click Generate and copy the generated string. Alright, now I can run it. So I run the query, and we see that we got a URL, which we can copy and paste into the browser, and we can see our uploaded image. Great, everything works fine. Let's move on, and see you in the next video.

49. Angular DEMO - Create an Angular App & install Angular Material: Welcome to this lesson. Okay, we have built a working draft GraphQL back-end, but it is quite useless for the end user. So let's create a simple application using Angular and build a front-end for our project. If you prefer another JS framework, check the other sections; there is, or there will be, an example with React and Vue in this course, and maybe Svelte will be included in the future as well. So first of all, we need to install the Angular CLI, which allows us to create and manage our application. Here in the terminal, I run npm install -g @angular/cli, where -g means installed globally.
Once it is installed, you should be able to run ng --version and see something similar. Now let's create an Angular application. So I will run ng new, then I give some name to my app, let's say angular-hasura-app, and I skip the initialization of Git, because I want to initialize Git later for the whole project. Now the CLI asks if we need routing for the app; I pick yes. Then we need to pick a style preprocessor. It doesn't really matter for this particular example, but I like SCSS, so I pick this one. And let's wait a little bit until the Angular CLI creates the app. Cool, our application is ready, so we can navigate to this folder and install some additional libraries. First of all, I would like to install the Angular Material library, which provides a set of ready-to-use components implementing Material Design; here we can run ng add @angular/material. Now the CLI asks which theme we want to pick. You can pick whatever you want, but I will pick this one. The next question is also up to you; I will say yes. Set up typography styles globally? Why not. And yes, let's set up browser animations. Now let's wait until the packages are installed. Great, but there is one remaining library we have to install, and this is the Apollo client, apollo-angular. This is a library which simplifies working with GraphQL queries and mutations. Here are the official recommendations, so you can read more about it. Now let's get back to VS Code, and in order to install this library, I will run ng add apollo-angular. It will install all the necessary packages and, again, ask you a few questions. The first one is: what is your GraphQL endpoint? So let's go to our Hasura console; I grab it here, go back, and paste it in. And that's it. You can see now that the Angular schematics created a separate module for our GraphQL setup and already inserted the correct GraphQL endpoint, and inside the app.module.ts file we see that this module was imported. So, awesome.
Now let's try to run our app. So I will run ng serve. It may take some time, and after the compilation process we can navigate to the localhost:4200 port and see that our app is running. Alright, that's it. We have a running application, and in the next video we will perform the first queries. So see you in the next video.

50. Angular DEMO - Implement SignUp functionality: Hi guys, welcome to this lesson. In this video, we will create the registration page and connect it with our Hasura project. Let's follow some best practices and use the module-per-feature pattern. So I will create a folder called features here. And now let's generate a module for this feature. I will run ng g, which means generate, then m, which means module, and inside the features folder I will create a sign-up module. It will generate a module called sign-up. Then I add the -m flag and point it to the app module's ts file, so the CLI will update the app module and add this new module to its imports section. So let's run it. And now you can see that our module was created, and if we have a look at the app module, we see that it was automatically updated as well. Okay, great. Now we need to create a component which will represent our sign-up page, right? So let's do it. I will run ng g c, which means component, and under the folder features, then sign-up, I'll create the sign-up-page component. I will also add the skip-tests flag and set it to true, and run it. Now you can see that our component was created and declared in the sign-up module as well. The next step is the creation of the route for our component. So let's go to our app routing module. Inside the routes array, let's add a new one: we define a path, which is going to be signup, and the component which we would like to render for this route, the sign-up page. Great, now we need to tell Angular where exactly we should render this component. So let's go to our app component template.
First of all, let's clean up the default placeholder content; we don't need it. Then I create a main tag where our app content will live. Inside we can declare the special directive router-outlet, which tells Angular that the components defined in the routing config should be rendered in place of this directive. Now I think we can check if it works. So I go to the browser, and we see an empty screen. Let's try to navigate to the signup route, and we can see that everything works fine. Now let's go back to VS Code and create a template for our sign-up page. This part is the most, let's say, boring and time-consuming, and because the Udemy format doesn't allow long videos, I will just paste already prepared snippets and briefly explain only the most important parts. I will skip the details, because this is not a course about Angular; I hope for your understanding. Okay, let's get started. First of all, let's import all the necessary modules: we need the ReactiveFormsModule, because we will have a form there, and also the Material buttons, inputs, and form field modules. Then, inside the page component's ts file, we create a form property and build a form with the FormBuilder, which I inject in the constructor. This form has some basic validation, and that's it. Below it there will be an onSubmit method, which will trigger a request to the server, but for now it will just console.log the form values. Then let's go to the template and paste some code here. This is just a set of Angular Material form fields and inputs. The only thing to highlight is that the value of the formControlName directive should match the appropriate key of our form inside the ts file; for example, displayName here should be the same as this one, and so on. This way Angular understands which form control belongs where. And the last thing is to add styles to our component.
So I go to the component's style file and paste some. Okay, now let's have a look. What we get here is our form, and if we submit it, we see our values. Cool. Now let's connect some GraphQL to it. So I will go to our Hasura console and build the sign-up operation; it is a mutation, and I will check all fields. Here we go. We should also define the variables, right? I define them here. Done, so let's copy it. Then, somewhere above in our component, I introduce a constant called SIGNUP_MUTATION and paste our query there. Do not forget to add the gql tag in front of the query, something like this. Cool. Now we need to somehow dispatch it, and we will do it with our Apollo client. So I additionally inject the Apollo client, and inside the onSubmit method we do the following: we call the Apollo method mutate, which takes an object with a mutation field, whose value will be our sign-up mutation, and variables, which are the values of our form, represented by a simple object where the keys are the names of our form controls and the values are their values. Pay attention that the key names in our variables should be the same as defined in the query right here. Now, if we subscribe to it, we will execute this mutation, and if everything is fine, we just redirect the user to the login page. So let's inject the router, and here I will redirect to the login page and reset the form. Yeah, you know what, we do not have a login route so far, so it would break our application. Let's redirect to the very root route instead; it's just a temporary solution. Now we can go and test. In the browser, I fill the form out, then click the button, and we see the white screen, which means that we were redirected to the very root, so everything works as expected. As the last check, let's go to the Firebase console and check if our user was indeed created. And here we go.
It was created, so everything works fine. Alright guys, we are pretty much done with registration, so let's move forward and implement the sign-in page.

51. Angular DEMO - Implement SignIn functionality: Hello, welcome to this lesson. In this video, we will implement the login page and connect it with Hasura as well. Let's get started with the creation of the necessary files. First of all, we need to create a module. I run the already familiar command, slightly modified: I just rename it to sign-in, and that's it. Now let's create a component; again a similar command, which I adjust a little bit, and run it. Okay, looks good. Let's add a route as well. I go to the routing module and add a new one: the path is sign-in, and it will render the sign-in page component. Here we go. Let's check it in the browser: we navigate to sign-in, and we see that our component was rendered. Okay, let's go back to VS Code. Again, like we did for the sign-up module, we need to import the necessary modules into the sign-in module's imports array. Then, inside the component, we create the form property and inject the FormBuilder. Now I create a form which consists of email and password, like this. Then we add the onSubmit method, which again just console.logs the form values. After this, let's go to the template, paste a similar template here, and add some styling as well. Okay, so far so good. Let's check how it looks in the browser. Well, yeah, not bad. And it is time to connect it with Hasura. So let's start in the Hasura console and build a similar mutation for our login. We'll modify it and also slightly adjust the response. Okay, cool. Let's copy it and go back to VS Code. Then I create a constant SIGNIN_MUTATION, where we paste this copied mutation. Then we also inject our Apollo and the router. Alright, looks good.
Then, inside our onSubmit method, we write the following: we call the method mutate on our Apollo client, provide our sign-in mutation constant for the mutation key, and the values from our form as variables. Then we subscribe to it, and if everything is fine, we need to save our authentication token somewhere, in this case in the local storage. Not the best place for storing such sensitive information, but as I mentioned earlier, we simplify as much as possible, and for this demo it's not really critical. So we set the item which we get from the response's data field. But Apollo cannot understand which data type we have here, so we need to help Apollo and TypeScript, and let's create an interface for our response. I create an interface called LoginResponse, and I know there will be the field login, because our mutation is named so; inside there will be accessToken, which is a string, and id, which is also a string. Right, here we go. Good. Now, having this interface, we can cast the response like this, and we can see that IntelliSense is working. Right after this, we should actually redirect our user to their profile, which will be our very root path so far. By the way, do you remember that after sign-up we also redirect the user to the root? Let's change that to the login page. Here we go; don't forget to save it. Okay, now let's check if our login is working. I go to the browser, provide the credentials of the user created before, and you can see that we were redirected to the root. If we open the Application tab and go to Local Storage, we can see our token here. Okay, that was the login, and we're moving on. So see you in the next video.
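For reference, the login mutation built in this lesson probably looks something like the following. The operation and field names (login, accessToken, id) are inferred from the narration, so treat the exact spellings as assumptions:

```graphql
mutation SignIn($email: String!, $password: String!) {
  login(email: $email, password: $password) {
    accessToken
    id
  }
}
```

The LoginResponse interface mirrors this shape: a login field containing accessToken and id, both strings.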
If you noticed, we just built an interface for our response by hand, and it means that for every other query we would need to create interfaces describing the data we get back. Can you imagine how annoying that could be? To save your time, I will show you how to generate ready-made types from our GraphQL schema, and even more: we will be able to generate separate services which already encapsulate this logic. So let's get started, with a library called GraphQL Code Generator. Here you can see their website, where you can find more information about this library. First of all, let's install it. I run npm install and add the library as a dev dependency. It can take some time, of course. Once it is installed, we can use the initialization wizard, so I run npx graphql-codegen init. Again, we will be asked some questions. First one: our project is Angular. Where is your schema? Alright, our schema is the Hasura endpoint. Next question: where are your operations and fragments? It means where we store our queries and mutations. So far we keep them in components, right? Remember, we created those two constants where we store our mutations. But that is not really a good place for them; they don't actually belong to the component. So I would like to store them separately in .graphql files under the src folder, and I leave this option at its default. Now it suggests picking plugins. The first two are the basic setup, and the third one is Angular-specific: it will generate a strongly typed Angular service for every defined query, mutation, or subscription. So yeah, let's proceed like this. Next question: where to store generated files? You can pick your own location and file name, but I'm fine with the default. Then the next question asks if we need an introspection file; I would say we don't need it for now, so I pick No.
And hopefully the last one: how to name the config file. For me, the default name is fine, so I just hit enter. There is one more, truly the last question: how to name the npm script which will run our type generation. I will name it generate-types. So here we go; let's install the dependencies and wait a little bit until they are installed. Awesome, everything was installed. Now let's run our code generator. Okay, it says that it could not find any .graphql file, which actually makes sense, yeah. So let's create some. Next to the sign-in page component, I create a file sign-in.graphql, then I copy my mutation and paste it into this newly created .graphql file. I do the same for the sign-up page: I create a similar file, go to my component, copy the mutation, and paste it there. Alright, cool. Let's run the generator one more time. Now we see that everything went through successfully, and we can open the generated file and see that our .graphql documents were scanned and, based on them, typed interfaces were generated. If we scroll to the very bottom, we will also see that services were generated for every query, ready for injection and use. So let's refactor our current implementation to use the services. I go to the sign-in page component; there I can inject our SignInGQL service, which comes from the generated file, right? Inside onSubmit, I replace our Apollo with the new service and remove the mutation constant, because this service already knows about this query; it was generated exactly for this particular mutation. The mutation only needs our variables object, represented by the values from our form. And that's it; now we can remove the injected Apollo and all the other now-unnecessary stuff. Let's do the same for the sign-up page component: I go there and also inject the generated service, but this time SignUpGQL, right.
Then I replace our Apollo with this service and adjust the params as well. Then, of course, I have to clean things up a little bit. So far so good; let's check if our application still works. I go to the sign-up page and fill the form out with some dummy values. Okay, looks good. Now let's try to log in. Cool, login also works fine. So let's recap a little bit. We now have a script which takes our GraphQL endpoint and generates typed interfaces and ready-to-use services out of it. It also allows us to decouple GraphQL queries from components, and we can store them in separate .graphql files. The one remaining caveat is that if you change some .graphql file, you need to rerun the type generator in order to reflect those changes in the generated file. Alternatively, you can run this command in watch mode, and every time you change your query, the code will be regenerated automatically. Okay guys, I hope you find this feature cool and useful, but let's move on and develop our app further.

53. Angular DEMO - Create Authentication Guards and Create a User Profile: Hello guys. We continue to build our Angular application which consumes our Hasura GraphQL backend. In this video, let's create a profile page component and a router guard, which will allow us to protect our profile page from unauthenticated users. So let's generate the profile module first. Now let's create a component which will represent our profile page. Okay, here we go. And now let's also generate the auth guard. I run ng g guard, put it under features/profile, and call it is-logged-in, and of course skip the tests, because we don't need them for this particular case. Perfect. Now let's create an appropriate route for it. I go to the routing module and add a new one here: the path will be the root path, and the component I want to render is the profile page.
And let's add our guard as well, so I will add it here. So far this guard does nothing, but let's change that. I go to the guard, where we should return either an observable or a promise resolving to either true or false, or a UrlTree. So we do the following: from our storage, we get the auth token item, and if it exists, we return true. Otherwise, we return a UrlTree which leads to login, and I build it with the router, which I have to inject. So let's try it out. I reload the profile, and we still see the profile, which is actually correct here, because if we go to Application, Local Storage, we see that our key is there. But if I remove it, we now see that we are redirected to the login page. Alright, let's log in again. And let's also add some template to our profile: I open my profile page and add some. Here we go. Then I add styles, and I also have to import the necessary modules into my feature module. Good. As you can see, we have hardcoded values so far, and it would be great to fetch the real data from our Hasura backend, right? So let's do it. Let's create a profile-page .graphql file, and after this, let's build a query which actually fetches the data. Before we build the query, we have to know the ID of the profile we're going to fetch. So I suggest slightly modifying our route and having something like profile/ plus the ID of the user, instead of our root URL, and then redirecting our user to this new profile route. So let's go to the router and change our route to profile/ plus a colon and id. Then let's adjust the sign-in component as well: I type profile and then fetch and append the ID. Good. Let's check if it works. I go back to the browser, go to the login page, try to sign in one more time, and you can see that we were correctly redirected. So cool.
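The guard's decision just described (token present: allow; token missing: redirect to login) can be pictured as a pure function. This is a sketch only: the storage key name and login path are assumptions, and in the real guard the redirect would be a UrlTree built with the injected router, for example router.parseUrl("/sign-in"):

```typescript
// Simplified stand-in for the Angular guard's return value:
// `true` allows navigation, an object stands in for a UrlTree redirect.
type GuardDecision = true | { redirectTo: string };

// Decide access given whatever was read from local storage.
function decideProfileAccess(storedToken: string | null): GuardDecision {
  return storedToken ? true : { redirectTo: "/sign-in" };
}
```

In the actual guard this logic lives inside canActivate, with localStorage.getItem supplying the token.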
Next, we can create a GraphQL query to fetch the user profile, right? I have already prepared one: it just takes a user ID parameter and returns all values from the user profile. And here we see the first issue: our code generator says that it cannot query field userProfile on type query_root; it looks like it doesn't exist there. This happens because the query is protected and accessible only to a logged-in user, but our tool queries the schema directly, without any headers. So how do we solve it? A JWT token doesn't work, because this is just a tool; it doesn't have any account in our project, right? But if you remember the lessons from our authentication section, we also have the Hasura admin secret header, and it gives us admin permissions. This is exactly what we need: we just need to slightly adjust our codegen.yml file and add the following line, providing the value of our admin secret. Let's have a look: in the docker-compose file, it is 123. So we take it and put it right there. Now, if we restart our code generator, we see that it works fine again. The next step is actually fetching the profile. I go to the profile component, and first of all inject ActivatedRoute, in order to read the id param from the router, and also the auto-generated service, which allows us to fetch the profile info. Then I create a property where I assign the values I get. Inside the ngOnInit lifecycle hook, I assign the following to our profile property: from our route params stream, I take the parameter id; then, having gotten it, I provide it as a variable for the fetch method of our profile query, something like this. Or we can simplify it a little and use the map operator to return only the user profile data. You could also use the pluck operator, which would do the same; it's up to you.
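The codegen.yml adjustment just described might look roughly like this. Only the x-hasura-admin-secret header (with the value 123 from the docker-compose file) is what this lesson adds; the endpoint URL, document glob, and output path shown here are placeholder guesses at the defaults the wizard wrote:

```yaml
schema:
  - http://localhost:8080/v1/graphql:
      headers:
        x-hasura-admin-secret: "123"
documents: "src/**/*.graphql"
generates:
  src/generated/graphql.ts:
    plugins:
      - typescript
      - typescript-operations
      - typescript-apollo-angular
```

With the header in place, the generator introspects the schema with admin permissions, so protected fields like the user profile become visible to it.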
Then we go to the profile template and say that if some value exists in our stream, please render this template, with the value of the stream assigned to a profile variable. Then we replace the hardcoded values for the display name and the email. So here we go. Let's check how it looks in the browser. And we see a familiar error: it says that field userProfile was not found in type query_root. The cause is pretty much the same as it was for the code generator: this endpoint is protected and available only to authenticated users. So we have to provide an authorization token with this request, and we will do it in the next video. See you there.

54. Angular DEMO - Perform Secure Queries to GraphQL Endpoints: Hi guys, welcome to this lesson. In order to finally finish our authentication, we have to do one small thing: add our authentication token to almost every request that goes to our GraphQL backend. In Angular, you can achieve this with HTTP interceptors, but I'm going to show you how to do it with the Apollo client. First of all, we need to install the library apollo-link-context, which allows us to easily create a context for Apollo. Now we go to our GraphQL module and create this context. Let's import the helper function. Then we create a constant, which is actually an arrow function returning the setContext function. This setContext takes as an argument a function which should return an object with the key headers, containing the Authorization header, whose value is Bearer plus our token from the local storage. By the way, just a small hint: sometimes getting the token is an asynchronous operation, and you can of course make the setter function async and use the await operator inside, and so on. Okay, let's move forward. The thing is that we don't want to attach this auth token to every operation.
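The decision of whether to attach the header can be pictured as a pure function. This is a sketch under assumptions: the operation names ("SignIn", "SignUp") are my guesses at what the app's public operations are called, and in the real code the token comes from localStorage inside the setContext callback:

```typescript
// Operations that must go out without an Authorization header.
const PUBLIC_OPERATIONS = ["SignIn", "SignUp"];

// Build the headers object for a given GraphQL operation.
function buildAuthHeaders(
  operationName: string | undefined,
  token: string | null
): Record<string, string> {
  // Public operations, unnamed operations, and missing tokens: no header.
  if (!operationName || PUBLIC_OPERATIONS.includes(operationName) || !token) {
    return {};
  }
  return { Authorization: `Bearer ${token}` };
}
```

In the Apollo setup, this logic lives inside the function passed to setContext, which receives the operation as its first argument.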
For example, our sign-in and sign-up mutations are public, so when we encounter these operations, we should skip the header, right? Let's have a look at how we can achieve this. Our setter function actually receives some arguments, and the first one is the operation. We can create a check listing the operations for which the token should be skipped: if the operation name is neither sign-in nor sign-up, the header will be attached; otherwise, it is simply skipped. You may ask how to know which name a certain operation has: it is the name of the mutation, query, or subscription you define in every GraphQL document, like this one in our sign-up page query. The next step is to get back to our GraphQL module and add this context to our links. I do it using ApolloLink; it has a method from, which takes an array of links and merges them together. So, something like this. And that is pretty much it. Let's check how it works in the browser. I reload, and we see that everything works fine; we see real user data. We can even open the Network tab in the browser console, find our query, and below it should be our Authorization header; you can see it here. Alright guys, that was it, and let's move forward. In the next video we will see how to upload our first photos. See you there.

55. Angular DEMO - Implement File Uploading: Hello guys, welcome to this lesson. In this video, we're going to implement the component which will upload our photos. As usual, let's get started with file creation and quick templating and styling. I generate an upload-file feature module, but in this case I want to declare it not in the app module but in the profile module, because I'm going to use it there. Then we need to create a component and export it from this upload-file module. Then we need the GraphQL mutation, so let's create an appropriate file. Okay?
And I paste this very simple mutation, which calls the Hasura action we created earlier. Alright. That is cool, but not enough, because this component will only upload the file; it doesn't insert it into the database. And I would like to implement the following user flow: I upload the file, and once it is uploaded, I want to show it in a dialog window which also has a textarea where I can add some description, and a Save button, which ultimately saves the photo in the database. That means we need to create one more component, which will be our dialog window with a small form. So let's do it. I run ng generate component, generate this component under the features/upload-file folder, and name it upload-form-dialog. Of course we skip the tests, and we also export this component. Good. Here I add a .graphql file as well and paste in the mutation which adds the photo to the database. Now let's start to implement the logic. I begin with the upload-file component, and as always, I paste a template with an input of type file. Then let's go to its SCSS file and add some styles. Alright, and then let's render this component somewhere on the profile page. So let's go there; I place it somewhere here below. Okay, it looks pretty good. Now let's implement the logic which will trigger the upload once a file is selected, and then open the dialog window. I add an event handler on the change event, which triggers this method and passes the file list. Inside the ts file we create this method, and here we take the file and read it as a data URL. Cool. Then I start a stream from the loaded image, and from there I execute the mutation in order to trigger the upload. So I import the generated service UploadPhotoGQL, and inside the switchMap operator I execute my mutation.
Okay, if everything is fine and the file is uploaded, I would like to display the dialog window. That means I need to import the appropriate module, right? So let's go to our feature module and import it. Now I inject the MatDialog service, and inside the subscribe callback I trigger the dialog window and pass the URL of our file as part of the data object. Then let's go to our upload-form-dialog component and read this dialog data by injecting the MAT_DIALOG_DATA injection token. Now, somewhere in the template, we can render the image: I create the image tag, and the value of the source attribute will be our photo URL. And of course I have to add some styling. Good; don't forget to save everything. Let's try out our code: upload a new file, and here we go, our image is successfully rendered. Great, so let's continue. As I said earlier, we need to add a textarea. I go back, add the textarea, and also do not forget to import the required modules, like MatInput, form control, Material buttons, and so on. Okay, let's go back; it will be bound with ngModel, so we have to create the appropriate property for it, right? Then let's add an actions section in our template, with one single button called Save, and clicking it will trigger the save method. In this method, we trigger the request to the server which adds the photo. So we need to inject the generated service, and in the save method we execute the mutation. Then, if everything was uploaded successfully, we just need to hide the dialog, and in order to do this, we inject a reference to the dialog, something like this. Okay, it is time to test it. Let's go to the browser: I choose a file, add some description, and hit the Save button. Alright, the dialog was closed, which means that everything went fine. Now let's go to the Hasura console and check if the photo is there. Great, here we can see our photo.
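The mutation the dialog's save method runs is only described in passing, but for a Hasura-tracked photos table it would follow Hasura's generated insert naming. The table and column names below (photos, photo_url, description) are guesses for illustration, not confirmed by the video:

```graphql
mutation AddPhoto($url: String!, $description: String) {
  insert_photos_one(object: { photo_url: $url, description: $description }) {
    id
  }
}
```

The url variable would be the signed URL returned by the uploadPhoto action, and description the value bound with ngModel.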
Okay, and that's it about adding photos. Now let's move forward and cover the remaining things. See you in the next video.

56. Angular DEMO - Render the List of Uploaded Files: Hello guys, welcome to this lesson. Alright, now we know how to upload images and how to add them to the database, and now we have to fetch and render some of them, right? Okay, let's create an appropriate module, as always. This module I'm going to call photo-list, and I immediately import all required modules. Now we need a component for it. I run ng generate component features/photo-list, skip the tests, and export it as well. And right away, let's create a query for it. Let's go to the Hasura console and think about what we need. We need photos which are posted by a certain user, so the user ID should be equal to some ID which we provide. We need the photo id, the URL, and the created-at field. Then we also fetch the user profile, which, remember, is our remote schema, and from it we take only the user name. And I would also like to show how many comments each photo has. Good, let's copy it, go to VS Code, and paste our GraphQL query. Yeah, and I need to adjust it slightly: let's rename it first, and then I add a variable userId, which is a string, and we make it required. Good. Our code generator should generate an appropriate service for it, so we can inject that service. Then let's create a property called photos, which is going to be an observable: we take the user photos query type, and we're interested in the particular key called photos. Now let's fetch our photos. Because we fetch photos for a certain user, right, we need to get the user ID from the URL. I inject ActivatedRoute and start building our photos stream: I get the id parameter from the URL, then fetch the photos, passing the user ID as a variable, and then just return the photos from data. Now I think we are ready to bind it with our template.
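Putting together the fields just listed, the photo-list query probably looks close to this. The table and column names (photos, photo_url, created_at, user_profile, comments_aggregate) follow Hasura's conventions but are inferred rather than shown, so treat them as assumptions:

```graphql
query UserPhotos($userId: String!) {
  photos(where: { user_id: { _eq: $userId } }) {
    id
    photo_url
    created_at
    user_profile {
      name
    }
    comments_aggregate {
      aggregate {
        count
      }
    }
  }
}
```

The comments_aggregate count works because we enabled aggregation queries for the role user on the comments table earlier.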
So let's go there. I will add a section. This section will be visible if we have photos, and we are going to show a loading message if not. Then, inside the list section, we will loop through our photos and render a card for each photo. This card will have a header where we are going to render the name of the publisher and the date. Then we are going to render the photo itself, and then we have an actions toolbar where I will render the count of comments and a button which allows adding a new comment, let's say. Yeah, good. Now let's apply some styling here. Now we can save it. We also have to render this somewhere on the profile page, so I will go there and place it here below. OK, let's save it and go to the browser. And cool, now we can see that our photo was perfectly rendered. Okay, that's it for this particular lesson. Let's move forward and implement other features for our application. So see you in the next video. 57. Angular DEMO - Open File Details in Dialog Window: Hi guys, welcome to this lesson. Our application is getting more and more features. Now we can sign in, sign up, render a user's info, upload files and many other things. The next feature I would like to have is this: when I click on a photo, I would like to show more detailed information about it in a dialog window. I would like to additionally see the photo description and maybe some comments. So let's implement it. First of all, we need to create a module called photo-details, and we also need a component which I will name photo-details-dialog. Okay, good. The component was created, but we will get back to it later. For now, let's go to our photo-list module and import MatDialogModule. Now we can go to our photo-list component and inject MatDialog in the constructor. Then let's create a method which we will attach to the click event and which will open our dialog.
So here below I want to create a method onPhotoClick which takes a photoId parameter, and it opens the photo-details-dialog component in a dialog with the photo ID provided as data. Now let's go to the template and bind it to the click event. Okay, let's go and check if it works. Alright, that looks good. Very good. And now let's go to our photo details. I again create a file, and let's go to the Hasura console and build a query. I'm going to query the photo by primary key. It will select id and description, then also created_at and of course the photo URL; then we need comments. Alright, so I check the ID, the comment itself and the creation date of this comment. And yeah, of course we have to check the username. Alright, it's done, so let's copy it, go to VS Code and paste it. And let's adjust the query a little by adding the photoId variable. Cool. Now let's go to our photo details component and inject these two: the dialog data, in order to get the photo ID, and our auto-generated query service GetPhotoGQL. Then let's create a property called photo, and it is going to be a stream typed as an Observable of the GetPhoto query result; we are interested in the particular key photos_by_pk. Then we go to the ngOnInit lifecycle hook, and to this property we assign the stream which returns an observable with the data we requested from Hasura. And now we can go to our template and paste this very simple template here. Here I'm using ng-container, which is a kind of wrapper placeholder. You can use it when you need some, let's say, parent element, but you don't want to wrap it with any HTML element. And the rest is, I would say, pretty straightforward: we just render the photo data within the card component, and then we show the loading template if there is no photo yet. The last thing we have to do is to add styles. Then we also have to import the necessary modules inside the photo-details module.
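A sketch of the by-primary-key query described above; the root field `photos_by_pk`, the `uuid` type and the nested field names are assumptions based on Hasura's naming conventions:

```graphql
# Hypothetical query by primary key; names are assumptions.
query GetPhoto($photoId: uuid!) {
  photos_by_pk(id: $photoId) {
    id
    description
    created_at
    photo_url
    comments {
      id
      comment
      created_at
      user {
        name
      }
    }
  }
}
```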
And I just realized that we forgot to import our photo-details module into our app module, so let's do it right now. Alright, it's time to save our changes and check how it looks. So let's try to click here, and we see that everything worked out fine. Okay, let's go back to VS Code. Now we need to display comments, right? Let's generate a separate module for it. We call it comment and import it automatically into the photo-details module. And right after this we are going to create the comment component, which we are going to export from this module as well. Let's go to the comment component and make it more useful. First of all, let's add the input, which I call comment. Its type will be Comments, which comes from the auto-generated types. But the Comments type has all possible fields there, and in fact we do not require all of them. So I will use the built-in generic called Partial, which will make all fields of the Comments type optional. Then let's go to the template and paste the next simple HTML, and some simple styles as well. Great. And now we have to add this component in the photo-details dialog. So let's go there, and here below the mat-card I'm going to add our component, and in the loop I just render it. And, let's see, maybe wrap it in some container and show comments if there are some; otherwise we just show the ng-template with a "No comments" message. And the ng-template we will place somewhere here. Alright, it should look fine. So let's go and test it. I open the photo and we see that we don't have any comments yet, which actually makes sense, because we will add them in the next video. So see you there. 58. Angular DEMO - Add Comments to the File: Hi guys, welcome to this lesson. In this video we will learn how to add comments to the photo. Okay, let's think about what we would need. I guess we need some form and an appropriate query, right? So let's start this time with the comment form. I'm going to generate the component for it.
I will name it new-comment and export it, because I'm going to use it outside of this module. In this component, what should we do? We need an output, and this output will be emitting the comment form value. I will call it added, and it is an EventEmitter. And the value type will be a little bit tricky: it will be Comments, but we will not emit all fields, only id and comment. So I will use the built-in TypeScript generic called Pick, and from Comments I pick only id and comment, like this. Then I will introduce a new component property called form, which will be a FormGroup, and I'm going to inject the FormBuilder in the constructor. And here I create a form, and this form control is going to be required. Okay, so far so good. Now I open the component's template and paste the HTML. It is very simple: it is just a form where I bind the form group from my TS file to the template. Then let's add some styles, as always. Of course, in order to make it work, we need to import the required modules. And yeah, we also need to handle the form submit event. Alright, so let's go to the new-comment component and create the onAdd method, and inside it we emit the form value, which has only the comment field. And also, because we need to emit some new ID, we add the id as well, which is a timestamp. Okay, we are done with this particular component and we can include it in our details component. Alright, let's save and check it. Looks cool. So the next thing would be to handle our added event from the new-comment component and dispatch the emitted data to the server. So let's do it. Inside the TS file, let's create this method. Then we need to create the appropriate mutation in order to save the comment. I will go to the photo-details-dialog GraphQL file and paste the mutation I have pre-built; it is a very simple one. Now let's go back to the TS file and inject the auto-generated service for this mutation. And then in our onSave method we will execute the mutation.
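The two utility types used in these lessons, Partial and Pick, can be sketched like this. The Comments interface here is an assumption standing in for the auto-generated type:

```typescript
// Stand-in for the code-generated Comments type; the fields are assumptions.
interface Comments {
  id: string;
  comment: string;
  created_at: string;
  user_id: string;
}

// Partial<T> makes every field optional, handy for a display component
// that only receives some of the fields.
const preview: Partial<Comments> = { comment: 'Looks great!' };

// Pick<T, K> keeps only the listed fields, matching what the form emits.
type NewComment = Pick<Comments, 'id' | 'comment'>;
const emitted: NewComment = { id: '42', comment: 'Looks great!' };

console.log(preview.comment, emitted.id);
```

Both generics exist purely at compile time, so they add no runtime cost while keeping the component's inputs and outputs honest.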
Now let's go back to the browser and try it out. I add some comment, then I press the Comment button, and it looks like nothing happened. But that is not true, actually: the comment was created. The problem is that we didn't update the Apollo cache. And I can prove it: just let me reload the page and open it again. Now you can see that my comment is there, because the information about the photo, along with the comments, was re-fetched. Now I will show you how you can update the Apollo cache and re-render the new comment without page reloading. Well, the first option would be to re-trigger the query. How can we do this? We need to slightly refactor our component. I will introduce a new private property called queryWatcher, and as a value I will assign this.getPhoto, but in this case, instead of fetch, I will use the watch method. What is the difference between those two? Well, fetch does the query only once and works very similarly to a regular HTTP call performed via the Angular HTTP module. But the watch method exposes way more control over our query: if you look at it, we see a lot of interesting things like fetchMore, startPolling and so on. So now we can refactor our photo stream a little differently, replace our fetch with watch and subscribe to valueChanges. Then we go to our onSave method, and inside the subscribe I will use our query watcher and call the refetch method, which actually calls our query one more time, pushes the values to valueChanges, and it will automatically re-render our view. Let's check our theory. I go to the browser, open the photo and add some comment, and you can see that my comment has immediately appeared. You should keep in mind that refetch triggers the HTTP call one more time, and if you have some heavy query, it might be bad for your performance. So I will show you how you can update the Apollo cache manually. Our mutate method, besides the object with variables, also takes another config object.
And this config object has a property update, which should be a function, and it will be executed once the mutation has successfully finished. I will use an appropriately named method, updateCache, and bind it to this class, because I want to call this function in the context of this class. And of course I need to create it in my class. This function takes two parameters: the first one is the Apollo cache of our insert-comment mutation, and the second one is the mutation result of this insert-comment mutation, meaning the data which this mutation returns. And having these two, we can already update our cache. So I'm going to create a constant with our query and variables. Then I'm going to read the get-photo query from the cache with the options I have created. And after this, we need to write the already modified data into our cache. This writeQuery requires the query and variables as well, so we will spread our options. But additionally it also requires a data property, and in data we should provide everything we currently have in the cached data, but it should also include our new comment. So there should be the key of our query result, photos_by_pk; here we spread our current values of photos_by_pk. And because we want to override comments, I define comments as well, add my newly created comment, and after it I spread all the other comments from our cache. It can look quite cumbersome, because we are not allowed to mutate the cache, so we need to use spread operators in order to create these shallow copies of the object. But you could, of course, use some libraries like Immer which could simplify these things. So now let's try out our code. I go to the browser, open the photo and add some comment. Then I hit the Comment button, and you can see that the comment was added without any page reloading and without re-fetching the query. And that's it. Thank you for your attention and see you in the next videos. 59.
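The immutable cache update described above can be distilled into a plain function. The cached shape here (photos_by_pk with a comments array) is an assumption mirroring the lesson's query, and the function name is illustrative:

```typescript
// Assumed shape of the cached GetPhoto query result.
interface CachedPhoto {
  photos_by_pk: { id: string; comments: { id: string; comment: string }[] };
}

// Build a new object instead of mutating the cached one: every level we
// change gets a fresh shallow copy via the spread operator.
function addCommentToCache(
  cached: CachedPhoto,
  newComment: { id: string; comment: string }
): CachedPhoto {
  return {
    ...cached,
    photos_by_pk: {
      ...cached.photos_by_pk,
      comments: [newComment, ...cached.photos_by_pk.comments],
    },
  };
}

const before: CachedPhoto = {
  photos_by_pk: { id: 'p1', comments: [{ id: 'c1', comment: 'old' }] },
};
const after = addCommentToCache(before, { id: 'c2', comment: 'new' });

console.log(after.photos_by_pk.comments.length); // 2
console.log(before.photos_by_pk.comments.length); // 1, the original is untouched
```

The output of this function is what you would hand to writeQuery's data property; Apollo then notifies every watcher of that query without another network round trip.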
React DEMO - Create React Application: Hello, welcome to this lesson. Let's try to build a React application and connect it to our Hasura backend. First of all, let's create a React application, and I will use Create React App. In order to do this, I run npx create-react-app in my terminal, and then I give some name to my application, which is react-hasura-app. And because I prefer TypeScript, I will use the TypeScript template, so I will get a React application with pre-configured TypeScript. It will take some time to create it and install all dependencies. Now, once the application is created, we can go to the project folder and install a few more dependencies. The first one is Material UI, which is a component library implementing Material Design, and the second is Apollo Client, which will simplify our work with GraphQL. Let's get started with the Material UI library. I navigate to the official Material UI documentation, go to the installation section and check how we should install it. So we need to run this npm command; let's copy it and run it here. Then let's go back, and we see that we need to set up the Roboto font, so I copy this line, go to our React project and, inside the index.html file, paste it somewhere between the head tags. Do we have something else to install? Let's check. Looks like no. We don't use icons, so this step we can just skip. The next package we need to install is Apollo Client, so let's go to its official documentation page and also check how to install this package. Okay, as usual, let's install the Apollo Client and graphql packages. Now, because we use TypeScript, we need to install the types for the graphql package as well. So I run npm install @types/graphql, and it is going to be a dev dependency because we need it only during development. Awesome.
Now let's go back to our documentation page, and we see that we need to create an Apollo client. So let's copy this piece of code; I go to my application component and paste it somewhere here, right? And now I just replace the URL with our Hasura URL: we remember that our Hasura endpoint is located on localhost:8080. And here we see a complaint from TypeScript. It says: Cannot use JSX unless the '--jsx' flag is provided. It happens because the Create React App tool uses the latest Babel features, which require the latest TypeScript version, but you can see that the TypeScript version used by VS Code is 4.0.3, while the latest one is 4.1. So we need to switch to the latest version. How can we do this? I go to my workspace config file, and then I add the next line to it: it is a pointer to the TypeScript version which was installed by the Create React App tool. Now we save it and then just go to any TSX file. Here we can click on the version and pick our new latest one, and once we pick it, we see that the error is gone. If we go back to the documentation, we see that we need to provide our Apollo client to an ApolloProvider context. So let's do it: I just copy it and adjust it in my App component, and let's remove the auto-generated things. Alright, cool. And yeah, we are almost done. There is only one remaining thing: a router, so we need to install one. Well, I'm going to use the regular React Router DOM, and in order to install it we just need to run npm install react-router-dom in our terminal. As easy as that. And again, because we use TypeScript, let's install the types as well: npm install @types/react-router-dom. OK, we are actually set up and ready to go. But first, let's try to run our application. So I run npm run start, let's wait a little bit, and now we see that the application works absolutely fine. Alright, that's it.
And let's move forward; see you in the next lesson. 60. React DEMO - Implement SignUp functionality: Hello guys, welcome to this lesson. In this video we are going to implement a sign-up page, and we will call our Hasura action CREATE_USER, which we created a few sections before. Alright, let's get started. First of all, let's create a folder called components, and there we will create a file called signup.tsx and generate an empty React component for now. Next we need to build a form, right? I will use a very famous library for handling forms called Formik, so let's install it first. Here we go. Then I will also install a library called Yup, which is used for form validation and works perfectly together with Formik. And of course, let's install the types for the Yup library; for Formik we do not need them, because Formik already comes with proper typings. Okay, first of all, let's give some name to the page, and now let's build the form. Here I add some initial form values. Then I describe the validation schema in a separate constant and add it to the validationSchema property. Formik also requires that we handle the onSubmit event, so let's do it; our onSubmit handler will just console.log the values for now. Then, inside Formik, we should return a function which returns our form. But before we add some fields, let's fetch all the necessary properties which Formik provides us. Then let's handle our form submission. And now I can add a new form field from the Material UI library. Again, this is not a course about React, so I will skip the detailed explanation, because I assume that you have a little experience with React. Okay, let's move forward: I'm going to create the same form field for email, then also for password, and I'm going to place a button which submits my form. OK, now let's maybe try to render the component, what do you think? Okay, let's go to our App.tsx and configure our routing a little bit.
So I need to wrap it with the router. Then I'm going to create a route called signup, and I will render our signup component. Now let's check it: I go to the browser and navigate to the signup route, and it works, but it definitely lacks some styles. So let's go back and create a root container for our application, and our component we will wrap in a Box component, where we will be able to define some widths and margins for our form wrapper. OK, this should make our layout much prettier. Now let's try to focus on our signup logic and build a GraphQL query to our Hasura. Here at the top I create a constant called signup mutation, and I use the gql tag; between its backticks I will describe our mutation. So let's go to our Hasura console and build one. Okay, it is going to be a mutation called signup, and we use the action CREATE_USER, and then we get back all the fields of the new user. Now let's copy it and go to VS Code. I paste it somewhere here, and I will do some refactoring and add variables to the mutation. Cool, now we have to execute it, right? I will use our Apollo Client for that: it has a hook called useMutation, which takes one required parameter, our mutation, and returns an array where the first value is the function which actually triggers the HTTP request, and the second element is an object where you can see different properties which describe the state of this mutation. So let's pick maybe loading and disable our form during the HTTP call. Great, now we need to trigger our mutation. And where do we do it? Right inside our signup handler, which is triggered when we submit the form. So instead of console.log, I call our mutation, and this function takes a config object which has a field variables, which in our case will be our form values. Alright? And here keep in mind that the keys of the variables object should be the same as you defined them in your GraphQL query or GraphQL mutation.
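The signup mutation could look roughly like this; the argument and return-field names of the CREATE_USER action are assumptions based on the lesson:

```graphql
# Hypothetical mutation calling the CREATE_USER action; names are assumptions.
mutation Signup($email: String!, $password: String!, $displayName: String!) {
  CREATE_USER(email: $email, password: $password, display_name: $displayName) {
    id
    email
    display_name
  }
}
```

Note that the variable names declared here ($email, $password, $displayName) are exactly the keys the variables object passed to useMutation must carry.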
So the result of the sign-up function call will be a promise which we need to resolve, and if the promise has been resolved successfully, we would need to redirect the user to the login page. However, even if the promise resolves successfully, the server can still return some errors, because unlike a REST interface, GraphQL always returns HTTP status 200, which means the request completed successfully. So we need to take the errors property and check if there are errors there; if so, we will console.error them, otherwise we should redirect the user to the sign-in page. So I will utilize the useHistory hook, and if there are no errors, I push the user to the sign-in page. And if the promise has been rejected, we should also catch the error and console.error it as well. You can add some of your own custom logic, but for this lesson we will leave it as it is. Okay, it should be enough. So let's go to the browser and test it. Wait, no: first let's create a route for the sign-in page. I go to the App component and create a route to sign-in, and let's just render some placeholder for now, like this. And now we can go and test. So I fill out the form, and here we go: we are successfully redirected, which means that our mutation has been executed successfully. Okay, that's it, guys. In the next video we will implement the sign-in page. So let's move forward and see you there. 61. React DEMO - Implement SignIn functionality: Hello, welcome to this lesson. In this video we will implement the sign-in page. So let's get started. First of all, let's create a separate folder for the component under the components folder, and the component file itself. And of course we should create a functional component called sign-in. Then let's put in the initial values for our sign-in form, which we will create later; it is almost the same as for the signup page, but without the Display Name field. And next to it I will place the validation schema for the form fields.
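The "resolved promise can still carry GraphQL errors" check described above can be distilled into a plain function. The result shape loosely mirrors Apollo's FetchResult, and all names here are illustrative:

```typescript
// Simplified stand-in for Apollo's FetchResult: a 200 response can still
// carry an errors array instead of (or alongside) data.
interface SignupResult {
  data?: { id: string } | null;
  errors?: { message: string }[];
}

function handleSignup(result: SignupResult): 'redirect' | 'error' {
  // GraphQL servers usually answer HTTP 200 even on failure,
  // so we must inspect the errors field explicitly.
  if (result.errors && result.errors.length > 0) {
    console.error(result.errors);
    return 'error';
  }
  return 'redirect'; // e.g. history.push('/signin')
}

console.log(handleSignup({ data: { id: 'u1' } }));             // "redirect"
console.log(handleSignup({ errors: [{ message: 'taken' }] })); // "error"
```

A rejected promise (network failure, non-200 status) is the separate case handled by catch; this function only covers the resolved-with-errors case.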
Then I'm going to create a submit handler function, which will just console.log the values from our form for now. And of course we need the template and some form, right? So I will add the same wrapper as for the sign-up page, and there, below the Typography, I'm going to build my form. Now let's fetch the form state again, and we return the form itself. The form will have the email field, which we just copy from the sign-up page, so let's go there. Okay, and then the password field, which we copy from the sign-up page as well. Here we go. And of course we need a button which will submit our form, so let's place it somewhere here. Great, we are done with our form. Let's now build our GraphQL query to the login action in Hasura. I will introduce a new constant called sign-in mutation, and let's go to our Hasura console and build the mutation. I will compose a sign-in mutation which will return me the user ID and access token. Let's copy it, go to VS Code and paste it here. Now I will add some variables to our mutation and adjust it like this. Great. Now let's execute it. I go to our component and use the useMutation hook again in order to get the function which will actually trigger the HTTP call to the Hasura server, and I'm going to fetch the loading status from our mutation. Then let's disable our form button during the HTTP call execution. And now I go to our sign-in handler and trigger the HTTP call by calling the sign-in function, and then I pass the config object with variables. As we know from the previous lesson, this function returns a promise. So if it resolves successfully, we check if there are any errors and console.error them, right? Otherwise, we should save the user token in our browser storage, and then we redirect the user to the user page. I will do it in a separate function which gets, as a parameter, a successful response from the server.
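A sketch of the sign-in mutation; the action name login and its argument and return fields are assumptions based on the lesson's description:

```graphql
# Hypothetical mutation calling the login action; names are assumptions.
mutation SignIn($email: String!, $password: String!) {
  login(email: $email, password: $password) {
    id
    accessToken
  }
}
```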
And then in this function I save the user token in the browser local storage, and then I'm going to redirect the user to his or her profile page, which we will create in the next lesson. So I will use the already familiar useHistory hook, and then I push the user to his or her profile page, appending the ID, which is equal to the user ID. And, you know, one thing: we forgot to handle the case when our promise fails. So let's add the catch operator here and just console.log the error, like this. OK, so everything is done in this component. I believe there are very few remaining things. The first one is to replace our placeholder for the sign-in route with our new sign-in component. Here we go, good. And now let's create some placeholder for our profile route. Now we can save it, go to the browser and check it. So I fill out the form, click Sign In, and we see that we were successfully redirected to the profile route, which means that our code works exactly as we expect. Alright guys, that's it. Let's move forward and see you in the next lecture. 62. React DEMO - Typescript Code and types Autogeneration: Hello, welcome to this lesson. Before we start with the user profile, let me quickly show one issue that we have; it is better to fix it at the very beginning. Our problem is in our user-logged-in handler, namely in its data type: you can see that we don't know which data we will get in the response, so we need to describe it manually. Of course, if you are using JavaScript, you will not get this issue, because JavaScript doesn't have types. But if you use TypeScript, it is important to keep everything type-safe. In our case we have to do it manually, and it is just one small mutation which returns not so much data, and our application has only two queries, or actually mutations. But can you imagine if we had hundreds of these mutations and queries? We need to somehow automate it, and we are going to do it with a tool called GraphQL Code Generator.
Here you can see its official documentation; let's go to the installation section. Okay, it says that we need to install graphql, but we did that in the very first video of this section when we were setting up Apollo. So we need to install only the codegen CLI, and while it is being installed, let's go back to the documentation and check if we have to do something else. Okay, it says that we need to run the installation wizard, so let's copy it and run it. Here we will be asked several questions; you can navigate with your keyboard arrows up and down, and select and deselect with the Space key. Here I pick React. Our schema is served on localhost:8080, so we can copy it from our App component. Here we go. The next question is about fragments, or, to put it simply, where we will be storing GraphQL queries. So far we stored them in constants within our components, but in my opinion it would be better to separate them and keep queries in dedicated .graphql files. These files could be anywhere under the source folder, so the default value works fine for me, and I just hit Enter. Now the wizard asks if we need to install additional plugins. Well, the default suggestion is more than enough for me; the most interesting plugin is the third one, typescript-react-apollo. This plugin can generate ready-to-use React hooks for every query, so we just call such a hook and it does its job. About the rest you can read on the documentation page. Now we have to define in which file we are going to store all the generated types, hooks and so on; for me the default value is completely fine. An introspection file we don't need here, and the default value for our config file is okay. And now we should give some name to the script which will generate the types; I will name it generate-types. So we are done. Now let's run the npm install command, and then let's try to generate the types by running npm run generate-types.
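The wizard's answers end up in a config file along these lines. This is a minimal sketch, not the exact file the wizard writes; the output path and plugin list are assumptions matching the choices described above:

```yaml
# codegen.yml sketch; paths and plugin names are assumptions.
schema: http://localhost:8080/v1/graphql
documents: ./src/**/*.graphql
generates:
  src/generated/graphql.tsx:
    plugins:
      - typescript
      - typescript-operations
      - typescript-react-apollo
```

With a matching "generate-types" script in package.json, npm run generate-types reads every .graphql file under src and regenerates the typed hooks.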
And it looks like we have an error, so let's check what went wrong. It says: Unable to find any GraphQL type definitions. Which actually makes sense, because we didn't create them yet. So let's go to the sign-in folder and create a new file with the same name but the .graphql extension, and then let's copy the query from the component and paste it into this new GraphQL file. Cool. Now let's do the same for the signup component. Let's create a separate folder for this component first, right? And do not forget to adjust the path to the component in your application. So let's check everything. Alright, we are done, so let's try to run our code generator one more time. And here we go, it looks much better. Now let's go and check what was generated. Unbelievable: this tool has generated types for us based on our queries and mutations and our Hasura GraphQL schema. And if we scroll down, we see that it also generated React hooks, which are just a wrapper around the useMutation React hook from Apollo, and they use an already prepared document, which is actually the GraphQL mutation or query. Now let's try to use our generated hook instead of what we created before. I will go to the sign-in component, and here, instead of useMutation, I will call the useSignInMutation hook. And then we can say that the values should not be FormikValues, because that type is not so accurate; we will be using a way better type which comes from this auto-generated file, and this type is called SignInMutationVariables. And that's it. Now the beauty of this solution is that if you add a dot after data, we see all available fields which we can access here. Of course, now TypeScript complains, because we use our hard-coded type, which is not really the correct one. So let's replace it with the generated one: the name of this type is MutationResult, and we are interested in one specific field called data.
But we can see that it is a generic type, and by default its value is any, not so strict, I would say. So let's define it more accurately by adding here the SignInMutation type, which was also auto-generated. And now TypeScript wants to tell us that the value might be null, so we have to handle it properly. If we add an additional if check, we see that the error is gone. And this is the power of TypeScript: it allows you to catch errors before you even run the code, and with GraphQL Code Generator these types will always be in sync with your GraphQL schema. OK, now we can remove all the unnecessary mutations and imports. Alright, here we go. And there is one remaining place where we have to do refactoring, and it is our signup component. So let's go there and do some refactoring: I replace it with the useSignUpMutation hook, then I refactor the type for the signup handler function. And it looks like we should do nothing else here anymore, so I just clean up my code. I clean up all this stuff, and this is pretty much it. So let's check if our application still works. Let's go to the signup page and try to create a new user. OK, it looks like it works fine. I will fill out this form, and cool: our app is still working and has strong types which were generated automatically. But before we wrap up this lesson, I would like to make a small remark. Every time you change your mutation or query, you have to run this generate-types script one more time in order to reflect the changes in your auto-generated file. Or, if you change your schema quite often, you can modify your script by adding the --watch flag; it will work in watch mode, tracking all your changes and regenerating these types for you automatically. Alright guys, that's it. Let's move forward and see you in the next video. 63. React DEMO - Create Authentication Guards and Create a User Profile: Hello.
In this video we are going to implement a very basic authentication guard and render user information in the user profile. Okay, let's get started. I'm going to start with creating an authentication context, which we will create under the auth folder. Inside this new file I create a context and define an interface which describes it. In my case it will be one function called signOut, then another function called setIsUserSignedIn, which will be updating our authentication state, and then I need the property isUserSignedIn, which indicates whether the user is logged in. Here I would like to highlight that we are not building a production-ready authentication system. Alright, keep in mind it is just for learning purposes; I keep it extremely simple. Now I'm going to create a function component called AuthProvider, which will return the context provider. Of course, we should render its children, and the value of the context will be what we described in our authentication context interface. For isUserSignedIn and setIsUserSignedIn I will use just the useState React hook, where the value will be true if there is a user token in local storage, or false if it doesn't exist there. Then signOut is going to be just a function which removes our user authentication token from local storage, sets our isUserSignedIn property to false, and then redirects the user back to login. Okay, it looks good. Now we need to create another component, which I will call ProtectedRoute. This ProtectedRoute will just check our context: for example, if isUserSignedIn is true, it means we are logged in and it will render its children components; otherwise, it will just redirect the user to the login page. So here is our component. Here, let's get a reference to our context; I'm going to use the useContext React hook.
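The token-based sign-in state described above can be distilled into a few helpers. A Map stands in for window.localStorage so the sketch runs anywhere, and the key name 'token' is an assumption:

```typescript
// Stand-in for window.localStorage; the real code would call
// localStorage.setItem / getItem / removeItem instead.
const storage = new Map<string, string>();

function signIn(token: string): void {
  storage.set('token', token);
}

// The guard's question: is there a stored token?
function isUserSignedIn(): boolean {
  return storage.has('token');
}

// Removing the token flips the guard back to "redirect to login".
function signOut(): void {
  storage.delete('token');
}

signIn('jwt-abc');
console.log(isUserSignedIn()); // true
signOut();
console.log(isUserSignedIn()); // false
```

As the lesson stresses, this is a learning-only scheme: a real guard would also validate the token's signature and expiry rather than just its presence.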
And then, inside the return, we first need to pass through all the props we want to forward to our route. That's why, by the way, we accept the RouteProps: we use this component as a proxy for the Route component, so all props applicable to Route should also be accepted by our ProtectedRoute component. Inside, we do a very simple check: if isUserSignedIn is true, we return the children — meaning the components we define between the ProtectedRoute tags — and otherwise, we redirect the user to the login page. That's the whole logic. Now let's try it out and go to our app.js file. Instead of the Route for the profile, I will use our new ProtectedRoute, and I also need to wrap all of this in my authentication context provider. Alright, the last part: we need to adjust our sign-in page. We should also get a reference to our context, and when the user has successfully logged in, we need to switch the isUserSignedIn flag to true. Here we go. Now I can go to the browser and reload the page — and we stay on the user profile. But if we remove the user token from local storage, in theory we should be redirected. And indeed, we have been redirected to the sign-in page. And if we log back in, we see the profile page. Cool, our authentication guard works. Now we need to render user information inside the profile component. So let's create a folder called user-profile, and inside we're going to create the corresponding component, like this. OK, what will we have inside the profile? Full name, I believe, and email — I think that's everything we have in our profile so far. Now, we definitely don't want to have the user values hardcoded; instead, we would like to fetch them from Hasura, correct? So let's build a query for it. We're going to create a .graphql file, and I'm going to paste the query which I pre-built for you. It is a very simple one.
We just call the userProfile action, and it returns the user ID, email, and display name. And hopefully you remember that we have the code generator running, which generates the React hook for this query. But we see that it has failed, and the error message says that it cannot find userProfile on query_root. Do you have any ideas why this could happen? If not, I will explain. It happens because this action is protected — remember, we allow this action to be executed only by the user or admin role. And from the section about authentication, we remember that there are a couple of ways to authenticate the user: the first one is JWT, or you can also use webhooks. But we can also provide the admin secret, which grants admin access. So we need to configure our code generator in such a way that when it makes a request to our GraphQL schema, it also sends this header. But how do we do that? Well, we go to our codegen YAML file and edit it slightly. I add an additional field called headers and define the x-hasura-admin-secret header, and the value I'm going to copy from our docker-compose YAML — in our case, the password we set there. Now, if I try to run codegen again, we can see that it works fine. Great. Now we can go back to the user profile component, and I'm going to call the generated getProfile query. As you can see, it requires variables, namely the user ID, and this user ID we will fetch from our route params. Remember, we have the route profile/:id — it's this ID we're going to fetch. I will use the useParams React hook for it. Let's define the type, and I'm going to fetch the ID and then just provide it in the variables object. Cool. Now what should we do? First, we need to check if there are any errors, and if so, we should return some error message. Then I want to show a loading indicator while it is loading, and just replace these hardcoded values with the data from the server.
And the last step would be to replace the placeholder in the App component with this new component which we created in this lesson. OK, so let's go to the browser. And unfortunately, we again see the error which says that the field userProfile was not found in type query_root. The reason is the same as for codegen: it happens because there is no authentication header. So we need to attach a header with the authentication token to every query which needs it. And exactly this is what we're going to fix in the next lesson. So see you there. 64. React DEMO - Perform Secure Queries to GraphQL Endpoints: Hello, welcome to this lesson. As you may remember, in the previous video we kind of got stuck trying to fetch the user profile information, and the reason was that we don't send the user authentication token with our requests. So let's fix it. I go to our App component, and we need to provide a properly configured HTTP link and an authentication link, which will be chained in front of every request. In Apollo, a link is a kind of composable piece of Apollo request configuration. So I'm going to create an HTTP link and move our URI inside. In order to create the authentication link, I will use the function called setContext, which returns another Apollo link. And now we need to configure it properly. This function takes a factory function as an argument, which in turn takes several parameters: the first one is the operation, and the second parameter is the previous context. From the operation, I need to get the operationName property. The operation name is how we named our queries and mutations — for example, if we go to the signup .graphql file, the operation name is this Signup string. It actually allows me to distinguish operations where I don't need the auth header from those where I do, so I can attach this header to the right requests. I can create a list of operations where I don't need the auth header — in our case, the sign-in and sign-up operations.
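As a sketch, the decision inside setContext can be written as a pure function. The operation names, the header shape, and the function name are my assumptions based on this app, not the exact course code.

```typescript
// Attach the Bearer token to every operation except the public ones.
// PUBLIC_OPERATIONS and the header names are assumptions from this app.
const PUBLIC_OPERATIONS = ["SignIn", "SignUp"];

function buildAuthHeaders(
  operationName: string | undefined,
  prevHeaders: Record<string, string>,
  token: string | null
): Record<string, string> {
  if (!operationName || PUBLIC_OPERATIONS.includes(operationName) || !token) {
    // Public operation or no token: leave the previous headers untouched.
    return prevHeaders;
  }
  // Keep all previous headers and add the authorization header.
  return { ...prevHeaders, authorization: `Bearer ${token}` };
}
```

In the real link, setContext would return `{ headers: buildAuthHeaders(...) }`, reading the token from local storage.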
Then I can check whether our operation is one of these public operations. If not, I take the token from local storage and return a configuration object with all headers from the previous context, and additionally I provide the authorization header, where the value is a string which starts with Bearer, followed by our token string. OK, our auth context link is created. Now we need to merge our HTTP link and authentication link. I will do it in our client: here, after the cache, I just call the concat method on our auth link, and as a parameter I pass our HTTP link. And that's it — we are ready to go. So I go to the browser and reload it, and we can see that my profile has been loaded successfully, because if we open the network tab, we can see that we now also send our custom header with the authentication token. OK, we are done with this video, so let's move forward and build our app further. See you in the next lesson. 65. React DEMO - Implement File Uploading: Hello guys, welcome to this lesson. In this video, we will implement photo uploading. We're going to upload a photo, then display it in a dialog window with some — I don't know — maybe a field or a text area for the photo description. There will also be a save button, which in the end will save it in our database. OK, let's get started with creating the appropriate folders and files. Then I will initialize a new component. This component will have a heading, and inside there is an input with type file which, on change — or when we pick some photo — will call a handler which we need to create. For now, it will just console.log the event. Now let's try to add this component somewhere in the user profile. Here we go. Let's check in the browser. OK, I pick some photo, and now we see our selected file in the console. That's great. Now what do we need to do? Right: we have to convert the selected file to a data-URL string and send it to the Hasura action called uploadFile.
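To make the data-URL idea concrete: readAsDataURL yields a string of the form `data:<mime>;base64,<payload>`. Here is a small sketch using Node's Buffer to show the same encoding — the browser code in the lesson uses FileReader instead, so this helper is purely illustrative.

```typescript
// Build a data URL by hand, mirroring what FileReader.readAsDataURL produces.
function toDataUrl(bytes: Uint8Array, mime: string): string {
  const base64 = Buffer.from(bytes).toString("base64");
  return `data:${mime};base64,${base64}`;
}
```

The Hasura action then receives this string, decodes the base64 payload, and uploads the bytes to storage.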
So let's get started with converting the file to a string. If the file exists, we take the first value in the list and initialize a FileReader. Then we use this reader to read our file as a data URL, and we need to define a callback which will be executed once our file is loaded. At this point, we need to send the string to our Hasura action uploadFile. So let's create a .graphql file for it. Here I paste the mutation I have prepared for you. This one is really very simple: it just calls the uploadPhoto action, which calls the Cloud Function we built in the previous sections; it will upload the photo to Firebase Storage and return us the URL to this photo. Now let's go to the component and use our new mutation in order to get the trigger function. We will also fetch a loading indicator. Then, in our onload callback, if the result of the reading exists and it is a string, we trigger the upload and pass this result as the value of our variable called base64Image. Immediately, let's catch the error. And if the upload was successful, we need to open the dialog window, right? Then we have to pass the URL to the dialog via some component property. So first of all, let's check that there are no GraphQL errors. Now we need something like a switcher for the dialog, so let's use the useState React hook for this purpose. And while we are here, let's also create a state for the photo URL, which we will need a little bit later. So now, when the photo is uploaded, I set the photo URL to what I got from the server, right? And then I set the dialog visibility to true. But we need to create this window, right? I would suggest creating it in a separate file. There I'm going to create a component, and we will use it as a wrapper for the Material dialog, so we want to apply the same properties.
So I will just extend the dialog's props and add some additional ones: photoUrl, and an onSave callback which will be executed once the user clicks the Save button. This callback will take as a parameter an object which represents the photo. We also need an onCancel callback, which will be executed once the user clicks the Cancel button. Then I create a basic template with a text field where we can write the photo description — and let's create a state for it. OK, good, nice. Below, we will render the photo itself, and underneath there will be the dialog section called Actions, where there will be two buttons: Cancel and Save. Let's also assign the provided callbacks to the appropriate buttons: for the Cancel button it is just onCancel, and if we click Save, we call onSave and build our photo object. OK, that should be fine. So let's go back to the upload-photo component and import our dialog. Then we need to provide the open state, and we need to pass the link to the photo, right? We should also provide the onCancel and onSave callbacks — but onSave we need to create first. For now, let's just console.log our values. And I would suggest checking whether our code works at all before proceeding. So let's go to the browser. I'm going to pick some photo — nice, our dialog popped up — and I can add some description and hit Save. And great, we see our photo data. By the way, regarding this warning: this is a bug on the Material UI library side, and at this time it is not fixed, but if it annoys you, you can just temporarily switch off strict mode inside the index.js file. If we save that, the warning will be gone. Still, we saw that our code works. Now the one remaining thing is to save the photo in our database. So I will go to our GraphQL queries, and I'm going to add a new mutation — the mutation called insertPhoto.
It just inserts the photo URL and description, and returns the number of affected rows, plus the description, URL, and creation date. Now let's go back to the component and use our new mutation. Then, in our onSave method, let's trigger this savePhoto and pass the photo data in the variables. Immediately, let's handle the errors, if there are any. Then, if everything is fine, we just hide our dialog. And what else? Oh, we could also show a snackbar with some message, right? So let's create a state for it. Then I will set openSnackbar to true on successful saving. In the template, I want to actually create the snackbar, and this snackbar will automatically be hidden after four seconds. And the message will be — you know, maybe — actually, let's better create a separate component for it. It will render a small version of the image, so we can see which image we have uploaded, plus some small text. Now let's add it to the message property — and cool. Now the last thing: we have to show some loading indicator, right? I suggest using it to show the user a message that something is happening. So once we upload the file, I will show a small message like "file is being uploaded". And when we save the photo — I don't want to refactor it anymore, so let's just decrease the dialog opacity, OK? It should also look fine. So let's go and check: go to the browser, upload a photo — we see the indicator, nice. Now add a description, save it. Yeah, it doesn't look bad with this opacity, and we see our snackbar. And yeah, that's great — everything works as we expected. Great job, guys. So that's it for now. Let's have a small coffee break, and see you in the next video. 66. React DEMO - Render the List of Uploaded Files: Hi guys. In this lesson, we will relax a little bit and implement a very easy feature which just fetches the photos we uploaded and renders them as a list, right?
So let's jump straight to the implementation. I'm going to create a folder called photo-list and two files: the component and also a .graphql file. Then let's immediately build the GraphQL query. This query is going to be a little bit more complicated, so I will build it in the Hasura console. So, what do we need? We need the photos where the user ID is equal to our user ID in the profile. Then we fetch the ID, photo URL, and created_at field. From the user profile, I believe we need the display name. And then I would also like to see how many comments each photo has, so we'll fetch comments_aggregate, then aggregate, and I check this count field. Perfect. Now I just copy it and paste it inside my .graphql file, and let's quickly adjust it: I will rename it, and I'm going to provide variables as well. Here we go. Now I just go to my ts file and create a component. First of all, I'm going to fetch the photo information by using the auto-generated React hook. And yeah, we also need to provide the user ID, so let's get it from the route params and pass it to our variables object. Now, inside the template, I want to show some message while our photos are being loaded. And when everything is fine, inside a loop we just render every photo as a card. In this card, there will be a header with a title — we don't really have any titles for the photos, so let's maybe render the name of the user who uploaded the photo, and in the sub-header let's put, I don't know, the creation date. Then, in the CardMedia, we will render the photo itself, and below, inside the card content, we will render just the number of comments. Now let's add this component to our profile component, somewhere here. Here we go. Good, let's check this out. I go to the browser. OK, the photos are there; we just need to fix some styling issues. So let's add styles to our CardMedia: I'm going to add the padding-top property and set it to 56%.
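Written as plain style objects, the styling so far might look like this. The 56% padding value comes from the lesson; the grid column sizing is my assumption, since the exact values are not spelled out.

```typescript
// padding-top on CardMedia reserves vertical space relative to the card width,
// so the image area keeps a roughly 16:9 ratio (56.25% would be exact).
const mediaStyle = { paddingTop: "56%" };

// A grid wrapper for the photo list; the column template is an assumption.
const wrapperStyle = {
  display: "grid",
  gridTemplateColumns: "repeat(auto-fill, minmax(250px, 1fr))",
  gap: "1rem",
};
```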
It will allow us to see the picture. I also want to add some, I don't know, wrapper here, and I will apply the CSS grid property to this wrapper in order to have, you know, a nice grid layout for our photos. But again, guys, I put the styles inside the template only because of the lesson — I don't want to jump between files. For your production projects, please keep the styles in separate CSS files. OK, now we can check. I go to the browser, and yep, it looks definitely much better. That's it for this lesson — let's move forward, and see you in the next lesson. 67. React DEMO - Open File Details in Dialog Window: Hello. In this lesson, I would suggest implementing a detail view for our photos: if the user clicks on any photo, we see the photo description and comments in a separate dialog window. So let's get started with creating an appropriate folder. Then we need the file, and inside the file I'm going to create a component. I'm going to do something similar to what I did for the photo upload dialog: it will be just a wrapper around the dialog from Material UI. The props will extend the properties of the Material UI dialog, but of course, it will take some extra properties. These extra properties are the ID of the photo which we're going to show and the onCancel callback. Now I need to return the Material UI dialog, then fetch all required props and spread the dialog props onto our dialog. Then I need some template, so I add a title here. Inside the content there will be our photo description — for now I just place a placeholder here. Below, there will be the image inside a box, and here, underneath, I want to render the comments on the photo. For the comment, I would suggest creating a separate component here in this file.
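The comment component's props can be narrowed with TypeScript's built-in Pick utility, as the next step describes. A sketch — the stand-in Comments interface mirrors what codegen would produce, and its exact field names are assumptions:

```typescript
// Stand-in for the auto-generated Comments type (the real one comes from codegen).
interface Comments {
  id: string;
  comment: string;
  created_at: string;
  user_id: string;
}

// Pick extracts only the fields the component actually renders.
type CommentCardProps = { comment: Pick<Comments, "comment" | "created_at"> };

const exampleProps: CommentCardProps = {
  comment: { comment: "Nice shot!", created_at: "2021-01-01T10:00:00Z" },
};
```

This way, the component compiles against the generated schema types without requiring callers to supply fields it never uses.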
As a property, it will receive the comment, which is described by our auto-generated type for comments. But we do not need all of its fields, so I will use the TypeScript built-in generic called Pick, which allows me to extract only the fields I need: the user profile, the comment itself, and the created_at date. Now we need to define the template of our comment. It will be, I would say, a card with a title for the username; then we have the comment body and the date when the comment was created. Let's format the date somewhere up here, and I'm going to assign it to a separate variable. OK, good. Now we have the comments, so let's go back to the dialog. For now, we place it somewhere here — we will replace this empty array with the real data once we fetch it. Below, I will add the Actions section, where there will be only one button, called Cancel, and on click we will just call the onCancel callback. Alright, now we need to fetch the data from the server, right? So let's create a .graphql file for it, and let's go and build this query inside the Hasura console. I need to fetch the photo by primary key; then I need the ID, description, and created_at field, plus the photo URL, right? Then we need the comments — I'm going to take the ID, comment text, and creation date from this entity, and from the user profile I need the display name. I just copy it and go back to VS Code. As always, I need to adjust it slightly, so I add a name for the query, and I also have to provide variables. Now let's go back to our component, and let's use our query here. But this time we will use another query type, which is actually a lazy query: it will be executed not immediately, as before, but we are going to trigger this call manually. But why would we need that?
Well, because the dialog is not lazy — it will be initialized eagerly. So at the moment when we haven't yet clicked a photo but the dialog is already initialized, the value of the photo ID will be undefined, and we definitely don't want to make an unnecessary call to the database, right? We need to check that we have the photo ID, and only then should we make the HTTP call. But if you have some experience with React hooks, you should know that you cannot use them inside conditional blocks like if/else. So we define a lazy hook which returns us the trigger function, along with the already familiar things like loading and data. As for when to trigger it, I would suggest using the React hook called useEffect, and we will be watching the photoId property and our trigger function: once one of them changes, the call will be re-triggered. This brings another benefit: for example, if we open the same photo two or three times in a row, the dialog window will not execute the HTTP call multiple times, because the photo ID and the trigger function stay the same. OK, now we can replace our placeholders with the data from the Hasura server. Then I'm going to replace the photo URL and also adjust our comments. What else? Yeah, let's add a loading message somewhere, maybe here, and also a message for the case when there are no comments on the photo, like this. You know, this check is quite cumbersome, so we could extract it into a separate function. Yeah, it looks definitely nicer, and it is much easier to read. Now we need to import this dialog into the photo-list component, and we need to provide the ID of the photo which has been selected. So I will create a separate state for it, and when the user clicks, I set the selected photo ID. Now we can provide our dialog with the selected ID of the photo, and in the onCancel callback we just reset the selected ID to an empty string. The dialog is going to be open only if there is a selected photo.
Otherwise, it will just be hidden. Oops, it looks like we have a typo — let's fix it. OK, looks cool. So let's go to our browser and check how it works. And yeah, everything looks fine. Great job — let's move forward, and see you in the next lesson. 68. React DEMO - Add Comments to the File: Hi guys, welcome. In this video, we will add comments to our photo. Let's get started. First of all, we need to create a .graphql file and add a mutation which will insert our comment. I have already pre-built one. It is a very simple mutation, and I hope that at this point in the course it doesn't surprise you: it just takes the photo ID and the comment text, inserts one comment, and then returns some data back. The next step would be to create a state for our comment — this state will be an empty string for now. Then we have to create a reference to our mutation, which will save our comment. And then we need to create the text field with the button. So here, above the list, I will create a box with a TextField inside, and on change it will update our comment state. Below, I create a button, and once I click it, it will call some handler. Now let's create this handler: inside, I call our saveComment function and pass the necessary variables. We handle the promise error in case it happens, and if the promise was successfully resolved, I just reset the comment value to an empty string. It looks like everything is fine, so let's go and check it. I open the photo and add some comment. And it looks like it worked, because our text field was reset — but we don't see the comment. Actually, the comment was successfully added, but what you see now is the Apollo cache, which doesn't know that the data was updated on the server. If I reload the page, you can see that the comment has appeared. But do you agree that it would be better to show the comment without reloading the page? You have two options for that.
The first one: you can get the refetch function from the useGetPhotoLazyQuery hook and call it when the comment has been successfully added. Now, if we go to our browser and add the comment, you can see that it appears instantly. But it might not be the best option, because this method re-fetches the data from the server: the whole getPhoto query will be fetched one more time, and this query could be very heavy, which would have a bad impact on the performance of your app. There is another way to handle it, and that is updating the Apollo in-memory cache. It is a little bit more sophisticated, but still — how can we do it? Well, besides variables, every mutation takes a few more arguments, and one of them is called update. It should be a function which takes two params: the cache and the mutation result — meaning our created comment. Inside this update method, we need to perform the logic which updates our cache. Technically, it consists of three steps: first, we read the query we want to update from the cache; then we modify it with our new data; and then we write this data back to the cache. So let's do it one by one. First, I need to read the getPhoto query. Where can you find it? If you go to our auto-generated types, you can see that there is a constant called GetPhotoDocument, which is our GraphQL query, and this is exactly what we need. So let's copy and import it. Then we need to define the variables, and here pay attention: the values should be the same as in the original query, otherwise you will read the wrong cache entry. OK, now in the cached data we will have the actual cache for this query, and now let's modify it. I just do a deep clone of my cache, because Apollo doesn't allow you to mutate it, and I override the comments array, adding the new comment as the first array element. Now we have to write this modified value back to our Apollo cache.
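The clone-and-prepend step can be sketched as a pure function. The type shapes here are simplified assumptions; in the real app, the cache types come from codegen.

```typescript
interface CacheComment { id: string; comment: string }
interface GetPhotoCache { photo: { id: string; comments: CacheComment[] } }

// Deep-clone the cached result (Apollo cache objects are immutable) and
// prepend the newly created comment; the clone is what gets written back.
function withNewComment(cached: GetPhotoCache, created: CacheComment): GetPhotoCache {
  const clone: GetPhotoCache = JSON.parse(JSON.stringify(cached));
  clone.photo.comments = [created, ...clone.photo.comments];
  return clone;
}
```

Returning a fresh object rather than mutating in place is what keeps Apollo's change detection working.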
We can do it by calling the writeQuery method, and we need to define the variables and query — the same as we defined for readQuery. It would be even better to keep them in a separate constant so we can just reuse them. Additionally, we should provide the data, which will be our updated cache. And that's it. So let's go and check if it works. I open the photo, I add a new comment, and — brilliant — our comment was added without any page reload. OK guys, that's it regarding adding comments. See you in the next videos. 69. Deployment - Configure Hasura for convenient Local Development for a Team: Hi guys, welcome to this lesson. In this video, we will improve the infrastructure of our Hasura project, and we're going to make it more flexible in terms of multi-environment support. The changes which we're going to make here will allow us to easily switch our Hasura action endpoints, as well as remote schemas and Hasura events. We will also see how to handle sensitive data, like the console secret password, without exposing it to a version control system like Git. So let's get started. I'm going to navigate to our Actions tab, and I will open the createUser action as an example. You can see here that the handler is hardcoded, and it points to one of our Cloud Functions, which is being served locally. You can see the same for remote schemas, and if you navigate to the Events tab, we will see the same thing. But if we get back to our actions and open any of them, you can see this small note which says that you can use an environment variable as a part of the handler URL, and then you can just concatenate it. So let's create such a variable in our docker-compose file. I will name it something like ACTION_BASE_URL, and we can go back to the Hasura UI and grab the value from there. So this is going to be our base URL for our actions. Then I will do absolutely the same for the remote schema.
And also for our event, which should notify the user about a new comment on the photo. The next step is to replace our hardcoded URLs with values which come from environment variables. You can do it in the Hasura console right here, but I would like to show you how you can change it directly in the code. You just have to go to the actions YAML file, which is located under the metadata folder, and replace our hardcoded values with the environment variable which we defined just a few moments ago. Do not forget to close the single quotes, because the value should be a valid string. And do not forget to do the same for the remote schema, where we need to replace the url property with url_from_env and replace the URL itself with the environment variable. Last but not least, there is our event, which is located in the tables file: just scroll a little bit down, replace the webhook property with webhook_from_env, and replace the URL with the environment variable as well. Now everything is updated, so we have to restart our Docker containers in order to initialize the new environment variables. Once the containers have been restarted, we can apply this new metadata to our Hasura instance. I hope you remember how to do it: just go to the hasura-server folder, and in your terminal run `hasura metadata apply`. As you can see, it was applied successfully. Now, if we reload the Hasura console, we will see that our endpoints are either concatenated with the base URL variable, as for actions, or come completely from an environment variable, as for the remote schema here. Now, I would like to highlight another problem: the values in the docker-compose YAML are also hardcoded to some extent. I mean that if you need to work in a team, it may be that every developer wants to have their own Firebase project.
Which means that all these endpoints, along with the Hasura JWT secret, will be different for other developers, and every time someone pushes changes to the repo, it will break the project for another developer who has a different setup, right? I hope you agree with me that this should never happen. So we need to somehow exclude this configuration from the version control system. How would we handle it? I would suggest extracting the sensitive data into a separate file called .env, which I'm going to create under the hasura-server folder. I move those variables from docker-compose into this new file, and then I just replace the colons with equal signs. Once I am done with this, I go to the docker-compose file and import that file by using the env_file property, where I just declare the path to my .env file. Here is a very important point: the .env file must be in .gitignore, so we can be sure that the sensitive information from .env will never be exposed on GitHub or anywhere else. But how will other developers know which variables should be provided in order to work with the project? Well, there is a convention to solve this issue: we can create a new file called .env.example and list all the necessary variables there, maybe with some dummy placeholders. You can also leave some comments there explaining what developers should do with it — and exactly this file should be tracked by Git and pushed to the repository. The next thing I would like to highlight is our config.yaml, which the Hasura CLI uses when we run any CLI command like `hasura console`, `hasura metadata apply`, and so on. Here we have a duplication of the Hasura admin secret. But the truth is that once we have this .env file under the hasura-server folder, we don't need to define it here anymore — instead, Hasura will read it from the environment variable HASURA_GRAPHQL_ADMIN_SECRET.
So we can easily remove this field here. We could also remove the endpoint, because by default it will be localhost:8080 anyway. The next part affected by our change is the config of the code generator, which allows us to generate the TypeScript interfaces from the GraphQL endpoint. Here you can see that we provide the Hasura admin secret header, which is also hardcoded, so we have to replace it with our Hasura admin secret environment variable. I can do it like this, and after the colon you can define a default value in case the environment variable was not provided. This endpoint right here will also become dynamic, but we will do that a little bit later, when we build our first pipeline. Now, in order to read the environment variable in our code generation config, we need to slightly adjust the script in our package.json file. So I go there and add the flag -r, which means require, and then the name of the plugin, dotenv/config. Then, at the end of this command, we need to define where the .env file is located: we say that the config path points to the hasura-server folder, where the .env file lives. After this, all the variables from our .env file will be available inside the codegen YAML file. Great. Now let's try to test our changes. I go to our .env file and change the admin secret. Then I have to restart our Docker containers. Alright, they have been restarted, and now I will go to the Hasura React app and just try to run the code generator. You can see that my code generator still works perfectly. Now let's check that we didn't break the Hasura CLI config. I go to the hasura-server folder, and let's try running `hasura metadata apply` — and you can see that it works as well. And what about our web application? OK, it looks like the application works fine as well. Alright guys, that's it. I would say this is a great start.
So let's continue, and see you in the next lesson. 70. Deployment - Configure Cloud Functions for Multi Environments: Hi guys, in this video we will implement support for multiple environments in our Cloud Functions, similar to what we did in the previous video for the Hasura project. Let's get started. If we expand the functions folder, you will see inside something similar to what we have done in previous videos: we have two config files and two config examples. If we open this config example, we will see our API key and storage bucket ID. I did this intentionally, in order to hide my sensitive information from the version control system. Every time I need these values, I just import these JSON files in my TypeScript file and use the values there; as an example, you can see how I have done it for the storage bucket key. But there is actually a better way to configure these environment variables. When you start your Cloud Functions locally, Firebase tries to find the file .runtimeconfig.json, where you can define the environment variables for Cloud Functions, and if it exists, it will read the values from there. One very important thing to mention is that you have to create a kind of namespace — in my case it will be hasura. There I can define further environment variables, like environment, where I declare that it is the local environment, and then we have to create our API key environment variable and assign our web API token to it. And this is how I will read these environment variables. Let's go to the login file, and here, instead of the API key, I will write functions. Then I have to import functions at the top, and then I can call the config method on functions. This method returns my environment variables, so I can access them with dot notation: hasura, then the API key. That is the environment variable which I defined in my runtime configuration file. And now I can just remove this config JSON file import.
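The resulting .runtimeconfig.json could look something like this sketch (the `hasura` namespace and the key names follow the walkthrough; the values are dummy placeholders):

```json
{
  "hasura": {
    "environment": "local",
    "api_key": "<your-firebase-web-api-key>"
  }
}
```

Inside a function, `functions.config().hasura.api_key` then returns that value when the emulator runs locally.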
The next changes we have to make are inside our index.ts file, where we have to slightly modify our initialization process. First of all, we have to get rid of this storage bucket, and because we don't have it inside our environment variables, we have to read this value from another place. That place is the FIREBASE_CONFIG environment variable, which is initialized automatically by Cloud Functions, and this config also contains our bucket ID. So we can just parse this config via JSON.parse, and then I pass our admin config to the initializeApp function. Then we have to handle our service account, because we need it during local development; otherwise we would not be able to upload files to our bucket. As I mentioned, we need it only locally. Once the Cloud Functions are deployed, we will be using another service account, which we will create in the next videos when we create the production and development instances of Firebase. That's why I will create a simple check: if our environment is local, then we will require our service account key, read the values from there, and assign them to the credential property. After this, inside initializeApp, the admin config will contain the required credentials from our service account key, and that will happen only if we run it locally. If it's a deployed environment, this part will simply be skipped. Okay, we're almost done. We just have to rename our config example to .runtimeconfig.example, copy the content from the runtime configuration into the example, and replace the values with placeholders. And last but not least, we have to modify our .gitignore file and add the .runtimeconfig.json file to it, like this. Okay, cool. Now let's just restart our functions in order to initialize our environment variables, and let's check if our application still works. So, the Cloud Functions were started.
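A minimal sketch of that decision logic, written as a plain function so the idea is easy to see (the function name and the HASURA_ENVIRONMENT variable name are assumptions; in the real code the credential would be built with `admin.credential.cert(serviceAccount)` rather than a string):

```javascript
// Derive the firebase-admin options from FIREBASE_CONFIG, which Cloud
// Functions sets automatically in the deployed environment.
function buildAdminConfig(env) {
  const firebaseConfig = JSON.parse(env.FIREBASE_CONFIG || "{}");
  const adminConfig = { storageBucket: firebaseConfig.storageBucket };
  // Only in local development do we attach an explicit service-account key;
  // deployed functions fall back to their default credentials.
  if (env.HASURA_ENVIRONMENT === "local") {
    adminConfig.credential = "service-account.json"; // placeholder for admin.credential.cert(...)
  }
  return adminConfig;
}
```

Called with the real `process.env`, this yields the object passed to `admin.initializeApp(...)`.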
I go to our application, reload it, and let's try to upload a photo. So I pick a photo, and you can see that the photo was successfully uploaded, which means that our Firebase Cloud Functions are working as expected. Okay guys, that's it. Let's move forward, and see you in the next video. 71. Deployment - Configure React Application for Multi Environment Support: Hello. In this video we'll learn how to configure multi-environment support for our React application. So let's get started. First, let's open our App.tsx file. We can see that our GraphQL endpoint is hardcoded, and it would be great to read this value from an environment variable, right? So we would like to have something like this, and then we could create a .env.local file, which will be used as the configuration file for our local environment, and there we could define our GraphQL endpoint. But that alone is not enough: React doesn't know how to read values from this .env.local file. So we have to configure it somehow, and we will use a special library for it called env-cmd. Let's install it as an npm package. Once we have this library installed, we can go to our package.json file and slightly modify our start script. I will add env-cmd, then -f, which means file, and then the name of the file where I defined my environment variables — and only then do we run react-scripts. Oops, I just noticed that we created our .env.local file under the src folder, which is wrong. We have to move it one level up; it should be placed in the root of the React Hasura app folder. Now let's save it, restart our React application, and see if it's still working. I go to my browser, reload the application, and we see that the application is still working. Now let's get back to VS Code and create a .env.local.example file, which will go to the Git repository.
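The modified start script looks roughly like this sketch (assuming Create React App's react-scripts, as used in the course):

```json
{
  "scripts": {
    "start": "env-cmd -f .env.local react-scripts start"
  }
}
```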
There we will tell our colleagues — the other developers — which environment variables they have to configure in order to run our application. Then we have to create a similar file for our development environment; I will call it .env.development, and there I will place a placeholder for now, because I don't know yet which endpoint our development environment will have — we will create it later in this course. Then I will create the same for the production environment. Just a small hint: do not put sensitive information inside these configuration files, because they are supposed to be public. If you have any passwords or tokens, you must not put such sensitive information inside these configuration files. Okay, now let's go back to our package.json file and create a couple of additional scripts for the development and production environments. I will start with development and do pretty much the same, but instead of .env.local I will use the .env.development file, and I'm going to do the same for the production environment. As I mentioned in a previous video, we also have to adjust the endpoint inside our codegen.yml file. I will replace it with an environment variable, which we will create in our CI/CD pipeline, and if we don't have such a variable, the localhost:8080 string will be used by default. We also have to create a similar generate-types script for our CI/CD system, because the current one runs in watch mode, and that doesn't work in a CI/CD pipeline. So we will create a separate one: we just remove the watch flag, and we also don't need the dotenv config flag there. Now let's try to run this generate-types script in a new terminal tab and check if everything works fine. Okay, something has failed, so let's check what was wrong. Here I see our GraphQL endpoint is not complete, so we have to add this part as well.
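Put together, the new scripts might look something like this (the script names mirror the video; the exact flags are assumptions). In codegen.yml the schema endpoint becomes an env var with a fallback, roughly `${HASURA_GRAPHQL_ENDPOINT:http://localhost:8080/v1/graphql}`:

```json
{
  "scripts": {
    "build:development": "env-cmd -f .env.development react-scripts build",
    "build:production": "env-cmd -f .env.production react-scripts build",
    "generate-types:ci": "graphql-codegen --config codegen.yml"
  }
}
```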
So now let's save it and run it one more time, and you can see that our code generator works fine. Alright guys, that was it. Let's move forward, and see you in the next video. 72. Deployment - Create & Configure Firebase Production & Develop Projects: Hi there, welcome to this lesson. In this video we will create and properly configure two Firebase projects: one for the development and one for the production environment. So let's get started. Basically, you have two options for creating a Firebase project. First, you can go directly to the web console and create it there. But I will show you another, more advanced way to create projects, and we will do it using the Firebase CLI. First of all, let's make sure that we are logged in to the Firebase CLI by running the command `firebase login`. After this, you can run the command `firebase projects:create`, and with the --display-name flag you can define the name for your project; I will call it Hasura Course Develop. Once you run this command, the CLI will ask you to provide a unique identifier for your project. I will write here hasura-course-develop, and then I will hit Enter. Once your Firebase project has been successfully created, you will get a direct link to it, so I can click on it and land on the Firebase console for this project. The first thing I have to do is go to the Authentication tab and activate the email/password provider: I just toggle Enable and then save my changes. After this, I have to activate Storage. I go to the Storage tab and click the Get Started button. There you just have to pick a location for the storage bucket, and you can click Done. If you see this page, it means that your storage was configured successfully. So let's move forward: now we have to activate our Cloud Functions, so I go to the Functions tab. And here's an important thing.
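The CLI commands from this step, roughly as typed (the display name and project ID are the ones used in the video):

```shell
firebase login
firebase projects:create --display-name "Hasura Course Develop"
# when prompted for a unique project id, enter:
#   hasura-course-develop
```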
Most probably, by the time you watch this video, Cloud Functions only work on the pay-as-you-go plan. But in fact you get two million Cloud Function invocations for free every month, so it is more than enough to test these Cloud Functions. I will also show you how you can configure budget alerts, so once you get closer to your limit — let's say $1 — you will get an email notification and you will be able to shut down your project. So I will click Upgrade My Project, and here I click Purchase. Most probably you will be asked to provide your credit card details, so you have to fill this out and then just click Purchase. Then I set up the budget alerts, and I will set my budget alert at $1, so once I have spent $0.50 on these Cloud Functions, I will get an email notification. Alright, that's it, and now I can finish the setup of the Cloud Functions. The next step is to configure our service accounts in order to be able to upload files to our storage. I will navigate to the project settings and jump to the Service Accounts tab, and click the Manage Service Account Permissions link in order to open the Google Cloud Console in the right place. First of all, here we have to activate the Identity and Access Management (IAM) API. In order to do this, just start typing "access management" in the search field, and you will see it in the dropdown; you just have to go there and click the Enable button. Alright, here we go. Now we can go back to our service accounts and edit our hasura-course-develop service account. There we have to switch to the Permissions tab, and for the hasura-course-develop service account we will grant some access. I click this button, then paste the email of this service account into the New Members field, then open the Role dropdown and just type Cloud Functions Service Agent. And here we go — that's what we have to select.
Then just click the Save button. Now you can see that our hasura-course-develop service account got the new role called Cloud Functions Service Agent. Again, we need it in order to upload files to our buckets from Firebase Cloud Functions, right? And this is pretty much everything we have to configure for this project. Now we have to create the Firebase project for the production environment and configure it the same way. I will go back to VS Code and run the same command, but instead of develop I will name it Hasura Course Production. Then I will enter the unique project ID, and let's wait a little bit. Now the project has been created, so let's go there and activate the email/password authentication method. Here we go. Then we have to activate our storage as well. Then I will go to the Functions tab, upgrade my project, set up the budget, and activate the Cloud Functions. Then I will go to the settings, then Service Accounts, and again jump to the Google Cloud Platform, where we have to activate the Identity and Access Management API. So let's enable it. Then we go back and grant access to our hasura-course-production service account; here we go, like this. Fine. Now we are no longer depending on Node version 8, so we can go to our package.json file and change the Node engine to, let's say, 12. Alright guys, that's it. I hope it was clear, and yeah, let's move forward and see you in the next video. 73. Deployment - Create & Configure Hasura Production & Development instances: Hello guys, welcome to this lesson. In this video we will create two Hasura instances: one for development and one for production. So let's get started right now. I'm going to create them in Hasura Cloud. On the main Hasura page I just click the Login link, and I will be redirected to the Hasura Cloud console. Here I just click New Project, then pick the free tier and select a region. Then you have to give a name to the project; we'll name it hasura-course-development.
Oops, it looks like that name is too long, so let's rename it to hasura-course-dev, and then I click the button Continue to Database Setup. Then I will create the database instance with Heroku. Let's wait a little bit, and now we can create the project. Once the project has been created, we can navigate to the Env Vars tab, where we have to create some environment variables. So let's go to our VS Code. The first environment variable we have to define is HASURA_GRAPHQL_UNAUTHORIZED_ROLE. I'm just going to copy it and paste it here. Then I also go back and copy the value for this environment variable, and now we can add it. Now we see a configuration error, because it also requires the HASURA_GRAPHQL_ADMIN_SECRET, so let's create this variable as well. Here I just enter some password for this field and click the Add button again. For the next few seconds you will see this config error, but once the Hasura instance has been restarted, this error notification will be gone. Let's proceed with the configuration, go to VS Code, and check what else we have to set up. We also have to set up HASURA_GRAPHQL_JWT_SECRET. Paste it here, and follow this link in order to find the config generator for Firebase projects. Just scroll down a little — here is the link. Just select the provider Firebase and enter your Firebase project ID; in my case this is hasura-course-develop. Then just click Generate Config. Cool. Now let's copy it and paste it here. Because our front end and back end will be served on different hosts, we also need to configure CORS, which means cross-origin request support. Here you can leave just an asterisk, which means that our Hasura server can accept requests from any host, but you can define some concrete host if you need to. In my case I will leave it as the asterisk, meaning every host can request data from our Hasura instance.
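For reference, the development instance ends up with roughly this set of variables — a sketch, not a copy of the course's exact values (the variable names are the standard Hasura ones; the JWT secret is what Hasura's Firebase config generator produces, with your project ID substituted):

```yaml
HASURA_GRAPHQL_UNAUTHORIZED_ROLE: anonymous
HASURA_GRAPHQL_ADMIN_SECRET: <some-strong-password>
HASURA_GRAPHQL_JWT_SECRET: '{"type":"RS256","audience":"<firebase-project-id>","issuer":"https://securetoken.google.com/<firebase-project-id>","jwk_url":"https://www.googleapis.com/service_accounts/v1/jwk/securetoken@system.gserviceaccount.com"}'
HASURA_GRAPHQL_CORS_DOMAIN: "*"
```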
Then we have to create the base URL for our Actions, so let's create the appropriate environment variable, and then we have to paste the appropriate URL. By default, the URL of a Cloud Function consists of the region where our functions are served — in our case us-central1 — then the project ID, and then cloudfunctions.net. So I just have to replace my project ID placeholder, and in my case this is hasura-course-develop. Then we have to do the same for the remote schema URL, so let's copy it here, and we can click Add right now. And last but not least is our email notify URL variable, so let's copy and paste it here. Here we go — we are done with the setup of our development environment. Now we can go back to our projects and create a new one for production. I do it in absolutely the same way as I did for the development environment, but with a different name, of course. And that's it. Now we also have to define the environment variables, so let's do that right now. I define the unauthorized role, which is anonymous; then the admin secret should be defined; then of course we have to generate the JWT secret for our hasura-course-production environment, and we copy and paste it. Don't forget to configure the CORS domain as well, and of course let's configure the actions base URL and also the remote schemas, and of course the email notify URL. And because this is a production environment, let's explicitly set the dev mode to false for exactly this instance. Alright, good. So now we have two environments, production and development, and we have the appropriate GraphQL API URLs respectively. If you remember, in our React application we created placeholders for our development and production environment configs, so let's copy these URLs and replace our placeholders there. I'll open .env.development and replace it here.
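The default Cloud Functions URL pattern looks like this (the region and project ID come from the video; the variable names on the left are the custom ones this course uses, so treat them as assumptions, and the trailing function names are placeholders):

```yaml
ACTION_BASE_URL: https://us-central1-hasura-course-develop.cloudfunctions.net
REMOTE_SCHEMA_URL: https://us-central1-hasura-course-develop.cloudfunctions.net/<function-name>
EMAIL_NOTIFY_URL: https://us-central1-hasura-course-develop.cloudfunctions.net/<function-name>
```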
Then I will do the same for the production environment file. Here we go. Now we are done; let's move forward, and see you in the next video. 74. Deployment - Configure GitHub Repo & Setup Sensitive Data as Secrets: Hello guys, welcome to this lesson. In this video we will configure our GitHub repository in order to work with GitHub Actions. Basically, GitHub Actions is the GitHub feature that allows you to build pipelines to deploy, test, or build your applications. It is free for open-source projects, so I decided to go with GitHub Actions, but the idea behind this approach is similar to other, comparable tools like Bitbucket Pipelines and so on. Before we actually start to build our first pipeline, we have to set up some environment variables. Let's go to the Settings tab and then open the Secrets tab. Basically, action secrets are environment variables which are available in our pipelines, so we can use them in our pipeline scripts and securely manage different tokens and passwords — sensitive information like that. Let's get started with the first secret, which is the Firebase development project ID. This is just the ID of the Firebase project which you created for the development environment; just paste it here. Then we have to create the same for the production environment. Here we go. Then we need to generate the Firebase token; this token allows you to authenticate the Firebase CLI in a CI/CD pipeline. In order to generate this token, you have to go to your terminal — in my case it is the VS Code terminal — and run the command `firebase login:ci`. Once you run this command, you will be asked to grant some permissions to your project; you should accept, and then you will see the "successfully logged in" page. If you get back to your terminal, you will see the generated token right there. So just copy it and paste it into the value field. Cool.
Now we have to create the Firebase Web API key for the development environment as well. In order to find this Web API key, you have to go to your Firebase project — just make sure that you are in the right project — and then go to the project settings; there you can find your Web API key, so just copy and paste it. Let's create the Firebase Web API key for production and do the same operation there. Awesome. The next secret we have to create is the Hasura console secret for dev. Let's navigate to Hasura Cloud and just copy the admin secret for the development environment. Then let's do exactly the same for the production environment: just go there, copy, and paste it. Once you are done with this, create the Hasura endpoint development secret, then go to the Hasura Cloud console and copy the hostname for your development environment. Be sure that you copy only the hostname, without the /v1/graphql part. Cool. And now, again, the same for the production environment. Alright, we're almost done. Now we have to go to VS Code and create the develop branch where we will be working from now on. So I go to VS Code and create the develop branch from my main branch; don't forget to push it to GitHub. The logic will be pretty simple: we'll create a pull request from our develop branch into the main branch, and we will trigger the development build on every push to the develop branch, okay? And once we merge our pull request into the main branch, it will trigger the production build, which will build and deploy our Hasura, our Cloud Functions, and our React application to the production environment. So far, as you can see, there is nothing to merge — those two branches are identical — but in the next videos we will make some changes in the development branch, and then you will see the whole mechanism in action. So let's move forward, and see you there. 75. Deployment - Deploy Cloud Function with GitHub Actions: Hello guys.
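Summing up, the repository ends up with roughly this set of secrets — the exact secret names in the course may differ, so these are illustrative placeholders, one per item described above:

```
FIREBASE_DEV_PROJECT_ID
FIREBASE_PROD_PROJECT_ID
FIREBASE_TOKEN            # output of `firebase login:ci`
FIREBASE_WEB_API_KEY_DEV
FIREBASE_WEB_API_KEY_PROD
HASURA_ADMIN_SECRET_DEV
HASURA_ADMIN_SECRET_PROD
HASURA_ENDPOINT_DEV       # hostname only, without /v1/graphql
HASURA_ENDPOINT_PROD
```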
In this lesson we will create our first GitHub Action, which will deploy our Cloud Functions. So let's get started. In order to create a GitHub Action, we have to create a folder in the root of our project called .github, and inside this folder we have to create another subfolder called workflows. Then, under the workflows folder, you have to create a YAML file, and you can give it any name you like; I will name it development. This is actually our first GitHub Action. Every GitHub Action should have a name, so let's give it one: I will call it "Deploy Hasura project on development environment". Then we have to define when exactly we are going to execute this GitHub Action. I would like to trigger it once we push something into the develop branch, so we say: please trigger on push to the branch develop. Cool. Now let's define some global environment variables which will be available in this development pipeline. The first environment variable I'm going to introduce is the project ID — this is just the Firebase project ID, and I will read it from the secrets context. This secrets context is what we set up in the previous tutorial, when we configured the secrets for our GitHub repository. Then we have to define the Firebase token, which will allow us to deploy our Cloud Functions, right? We also need the Hasura GraphQL endpoint — by the way, this variable is what we will be using in the codegen.yml file; I hope you remember this part. And then the last environment variable: the Hasura GraphQL admin secret, which, as we remember, allows us to access the protected GraphQL endpoints in Hasura. Every GitHub Action has jobs, and a job is the biggest building block of every GitHub Action. Every job runs on a separate virtual machine instance. We will have three jobs.
The first one deploys the Cloud Functions, the second deploys Hasura, and the third one will deploy our React application. But let's start with the first one. I will define the key for this job as deploy-cloud-functions; you can give it any name, it doesn't really matter. Then we can give a more meaningful display name for this job. Then we define which operating system this job should run on; I will choose ubuntu-20.04. Then we will define a strategy, and let me quickly explain what that means. Using this strategy, under the matrix key, you can define which Node versions this pipeline should be executed on. In my case, I would like to run my pipeline in containers set up with Node version 10 and then 12. By default, all these versions would be executed at the same time, but with max-parallel I say: please run only one job at a time, don't run them in parallel — because if we deploy the Cloud Functions at the same time, one of these pipelines will fail because of concurrency. The fail-fast flag means that once one of the jobs fails, the whole pipeline should fail. That's why we allow only one job to run in parallel, but it is not required: I just wanted to show you how you can test on multiple environments, so you could completely remove this strategy and everything beneath it would work without any problem — it is just an example. Alright, let's move forward. Then we have to define the steps for this job, because a job is split into smaller steps. The first step is to check out our source code from the Git repo. I will use a reusable GitHub Action, and this particular reusable action will check out the code for us. In order to define the second step, you start it with a dash and a name, giving some name to this step; then you can set options like the working directory.
We set it to functions, because we want to execute all our scripts in the context of our functions folder, right? Inside this functions folder we have to run `npm install`, in order to install all npm dependencies, and then we have to build our functions, so we will run `npm run build`. In the third step, we need to set the Cloud Functions environment variables. In order to do this, I will use another reusable action called firebase-action, which simplifies working with the Firebase CLI. In the args property, I define what exactly the CLI should execute for me: I execute `functions:config:set` and define the Hasura API key here. If you remember, this Hasura API key is what we defined in the .runtimeconfig.json right there. But as I told you, this runtime config JSON only works for the local environment, only if we serve these Cloud Functions locally. If we want to set it for deployed Cloud Functions, we have to use this CLI command, and it will execute the same command as you see right here. We don't need the word firebase in the pipeline, because the reusable firebase-action adds it for us, so we can drop that part. The last step is to actually deploy our Cloud Functions. I will use the firebase-action again, but in args I will now define deploy, and with the --only flag I will specify: deploy only functions. And this is pretty much it. So we can commit these changes and push them to the develop branch in order to trigger our first pipeline. I'll use the Commitizen plugin for it in order to make conventional commits, but it is not strictly necessary to commit like I do; it's just a best practice to have conventional commits. It is just about the commit message format, and that's it — you can commit however you like.
Then I just push my changes to the develop branch, and let's see what happens. Now, if we go to our repo and open the Actions tab, we'll see our first pipeline actually running. If we expand it, we can see our jobs and steps in action. You can see that it pulls our changes, then we install the dependencies and build our Cloud Functions; right after this, we set the Cloud Functions environment variables, and then we go to the next step, which deploys our Cloud Functions. It takes some time, yeah, but after this you will see "Deploy complete", which means that our Cloud Functions were successfully deployed. You can see that it has been executed for Node version 10, and there will be a second run which will be executed with Node version 12 — this is because of the matrix we defined. Once both jobs have been executed, you will see that your job is successful, and then in the Firebase console, on the Functions tab, you will be able to see the already deployed Cloud Functions. And this is pretty much it — as you can see, nothing special, nothing hard there. So let's move forward and finish up our GitHub Actions. 76. Deployment - Deploy Hasura Engine with GitHub Actions: Hello guys. Let's continue building our GitHub Action, and in this video we will create a job which will deploy our Hasura. Alright, let's get started. I'm going to create the new job — make sure that you create it on the same level as the deploy-cloud-functions job. The key of this job will be deploy-hasura, and I will name it "Sync Hasura metadata and migrations", because actually everything boils down to applying metadata and migrations to our development Hasura instance. Then I will say that this job should run on ubuntu-20.04. And here's the important thing: by default, these jobs are executed in parallel, but in fact our deploy-hasura job depends on the first one, on deploy-cloud-functions.
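A condensed sketch of the workflow described so far. The action versions, secret names, and the specific firebase-action repository (w9jds/firebase-action is one popular choice) are assumptions, so adapt them to your own setup:

```yaml
# .github/workflows/development.yml
name: Deploy Hasura project on development environment
on:
  push:
    branches: [develop]
env:
  PROJECT_ID: ${{ secrets.FIREBASE_DEV_PROJECT_ID }}
  FIREBASE_TOKEN: ${{ secrets.FIREBASE_TOKEN }}
  HASURA_GRAPHQL_ENDPOINT: ${{ secrets.HASURA_ENDPOINT_DEV }}
  HASURA_GRAPHQL_ADMIN_SECRET: ${{ secrets.HASURA_ADMIN_SECRET_DEV }}
jobs:
  deploy-cloud-functions:
    name: Build & deploy Cloud Functions
    runs-on: ubuntu-20.04
    strategy:
      matrix:
        node: [10, 12]
      max-parallel: 1   # avoid two concurrent deploys fighting each other
      fail-fast: true
    steps:
      - uses: actions/checkout@v2
      - name: Install and build functions
        working-directory: functions
        run: |
          npm install
          npm run build
      - name: Set Cloud Functions config
        uses: w9jds/firebase-action@v2.0.0
        with:
          args: functions:config:set hasura.api_key=${{ secrets.FIREBASE_WEB_API_KEY_DEV }}
      - name: Deploy functions
        uses: w9jds/firebase-action@v2.0.0
        with:
          args: deploy --only functions
```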
So we have to use the keyword needs and reference the key of our first job, in order to wait until our Cloud Functions have been deployed; only then will the deploy-hasura job start to execute. Then let's quickly define the steps. The first step, as always, is to check out our source code from the Git repo. Then we have to install the Hasura CLI, right? So we run the command which you can find on the official Hasura documentation page. After this, we will be able to update metadata and migrations, and in order to do this, we first have to navigate to the Hasura server folder; there inside, we will run two commands: `hasura migrate apply` and `hasura metadata apply`. And this is everything you have to do. So now let's commit our changes and push them to the GitHub repository. If we go back to our GitHub repo, we will see that the new workflow was scheduled, and inside we already see two jobs which look like a chain: once our first job has deployed the Cloud Functions, our second job, called "Sync Hasura metadata and migrations", will be triggered. As you can see, our first job has now completed, and the second job is being executed. Here we see all the steps which we defined, in action. And now we see that our pipeline completed successfully. So now we can go to our Hasura Cloud console, and if we launch the console for our hasura-course-dev project, we will see that everything is there: we can see our queries, we can see our actions — everything was applied successfully, all the migrations and metadata are there, and all of this came from our CI/CD pipeline. Cool. Alright guys, that was it; let's move forward, and in the next video we will deploy our React application. See you there. 77. Deployment - Deploy React App using GitHub Actions: Hi there. There is one remaining job to implement, and this job should deploy our web application, right?
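The second job might look roughly like this sketch (folder and job names are assumptions; the install one-liner is the one from Hasura's CLI installation docs):

```yaml
  deploy-hasura:
    name: Sync Hasura metadata and migrations
    runs-on: ubuntu-20.04
    needs: deploy-cloud-functions   # run only after the functions job succeeds
    steps:
      - uses: actions/checkout@v2
      - name: Install Hasura CLI
        run: curl -L https://github.com/hasura/graphql-engine/raw/stable/cli/get.sh | bash
      - name: Apply migrations and metadata
        working-directory: hasura-server
        run: |
          hasura migrate apply
          hasura metadata apply
```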
So let's implement it right now. Here below I will create a new job, and the job key will be deploy-web-app. Then I will give a name to this job, and we will say that it should run on ubuntu-20.04. Because our application depends on the metadata and migration state of Hasura, we have to wait until the deploy-hasura job has updated the metadata and migrations on the Hasura instance. So we use needs and reference the key of the job which is required in order to proceed with this current job. Then let's define the steps. The first one checks out the source code. The next step builds our application: we navigate to the React app folder, and there inside we run a couple of npm scripts. First of all, we install the npm dependencies, and then we generate the types from our Hasura instance; in order to do this, we run the script generate-types:ci, which we created in the previous videos, right here. Then we run the build script called build:development, in order to build our application for the development environment. The last step is to actually deploy the built application to Firebase Hosting. I will use the reusable Firebase action again, and I will define the path build — this is the path which should actually be deployed to Firebase Hosting. This build folder will appear once the build:development command is executed; under the build folder we will find the already compiled JavaScript, the index.html file, and everything we need in order to run this application. Then I will run the deploy command, and with the --only flag I will specify: please deploy only hosting here. And this is actually everything we need in our GitHub Action. But we also need to adjust some other things, and the first one is our hosting. Firebase Hosting is configured in the firebase.json file, so let's go there, create an additional key, and call it hosting.
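The third job could be sketched like this (again, script names mirror the video, while the action choice and folder name are assumptions; the hosting public path itself comes from firebase.json, configured next):

```yaml
  deploy-web-app:
    name: Build & deploy web application
    runs-on: ubuntu-20.04
    needs: deploy-hasura   # the app's generated types depend on Hasura's state
    steps:
      - uses: actions/checkout@v2
      - name: Build React application
        working-directory: react-hasura-app
        run: |
          npm install
          npm run generate-types:ci
          npm run build:development
      - name: Deploy to Firebase Hosting
        uses: w9jds/firebase-action@v2.0.0
        with:
          args: deploy --only hosting
```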
There is one required field called public. It points out where our public files are actually located, and in our case they are located under the React app folder, in the build folder. Again, as I mentioned before, this build folder will be created once we run the npm run build command. Then we have to create the ignore key, and here we list the files which should be ignored, so that we do not deploy them to our hosting: these are firebase.json and node_modules. Then, below, inside the rewrites, we have to configure a pretty common thing for every single-page application: whatever URL we navigate to, it should always redirect us to the index.html page. Okay, it looks fine, we are done with the hosting configuration. The last thing we have to check is whether we have any linter errors. In order to do this, we have to build our application: once we run npm run build, react-scripts will run the linter against our files and show us warnings if there are any. This is important, because in the CI/CD pipeline these warnings will turn into errors and break our deployment, so we have to fix them up front. And indeed, we have a couple of warnings, so let's fix them right now. I will go to our photo details dialog, and here I have to find line number 32 and remove the refetch variable, which is not being used. Then I will navigate to the upload photo component file, and on line 79 I have to add the alt attribute to our image. Alright, cool. So now we are ready to commit and push our changes and check if our deployment is working. So I just commit and then push. Now we can go to our GitHub Actions and have a look if everything works fine. We can see that our Cloud Function has been deployed, and the metadata and migrations were synchronized with the Hasura development instance. And now we can see that our build and deploy web application job is being executed. So far so good, our application is being built.
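For reference, the hosting configuration described above might look like this in firebase.json. The public path assumes the React app lives in a react-app folder:

```json
{
  "hosting": {
    "public": "react-app/build",
    "ignore": ["firebase.json", "**/node_modules/**"],
    "rewrites": [
      { "source": "**", "destination": "/index.html" }
    ]
  }
}
```

The catch-all rewrite is what lets client-side routing work: every URL is served the same index.html, and the React router takes over from there.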
And we can see that the hosting has been deployed. So now we can navigate to our Firebase console and then jump to the hosting page, and we see a couple of URLs we can go to. Here we can see our sign-in page. But because it is a completely new system, we first have to create a new user, and only then can we sign in, right? So let's go to the sign-up page, create a new user, and it looks like the sign-up executed successfully. Now let's try to log in. Login also works fine. And now, the moment of glory: let's try to upload some image here. I will pick the React logo, and you can see that the image is being uploaded. Great, it works. So let's add some photo description here and save our image. It looks like everything is working correctly. We can even open it and leave some comments, and everything works without any issues. Alright guys, that was it. Let's move forward and create the production pipeline in the next lesson. 78. Deployment - Deploy the whole Project to Production Environment: Hi guys, welcome to this lesson. Finally, we are ready to create our production pipeline. So let's get started. First, let's maybe remove Node version 10 from the Node version matrix, because executing on two Node versions just consumes time and we don't really need it, so I will leave only Node version 12. Then, under the workflows folder, I will create another file and call it production. I am just going to copy my development GitHub Action and paste it into the production one, and then we will need to adjust it a little bit. First of all, I have to replace all dev suffixes with production in my environment variables. Then I will go to the deploy web application job, and I have to replace the build script and use build production instead. Then I need to adjust the trigger of the GitHub Action: we are going to trigger our production pipeline not on push, but on pull request.
And this pull request should happen against the branch main. Then we also have to define the type of this pull request, and it should be closed. But closed doesn't mean that it was merged, so we have to add an additional check. I will do this inside the deploy Cloud Functions job, right after the name, using the keyword if. There we check whether github.event.pull_request.merged equals true; only then will the whole pipeline run. And of course, don't forget to rename the GitHub Action from development to production environment. Here we go, this is actually everything we have to change in our production YAML file. Now I can commit my changes and push them to the remote repository. You can see that our implement production pipeline workflow has been triggered, and while it is being executed, let's go to the Pull requests tab and create our first pull request. I click the Compare & pull request button, and you can see this simple form, where you can see that we are going to merge our develop branch into the main branch, and we can give some meaningful title to this pull request and some description. Then you click Create pull request, and our pull request has been created. As you can see, so far we cannot merge it, because GitHub waits until the GitHub Actions are executed. Only once all GitHub Actions are successful can we merge into main, because otherwise it makes no sense to merge broken code, right? So let's wait a little bit. And now you can see that our Merge pull request button is active. Of course, before you merge the pull request, you should go to Files changed and review the code, and maybe discuss some improvements with your colleagues. But if the code looks fine, you can go back and merge it. And once we merge it, you will see that our production workflow has been triggered. And now we have to wait a little bit until our production build is done.
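The production trigger and merge guard described above can be sketched like this; the workflow name and job key are illustrative, not taken verbatim from the course:

```yaml
# Hypothetical sketch of the production trigger; only the event section
# and the merge guard are shown.
name: Production Environment
on:
  pull_request:
    branches: [main]
    types: [closed]

jobs:
  deploy-cloud-functions:
    # "closed" also fires for pull requests closed without merging,
    # so run only when the pull request was actually merged
    if: github.event.pull_request.merged == true
    runs-on: ubuntu-20.04
    # ...remaining steps as in the development workflow
```

Because the two downstream jobs depend on this one through needs, the single if condition on the first job is enough to gate the whole pipeline.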
So after some time, you will hopefully see the three successfully executed jobs. After this, you can navigate to your Firebase console, switch to the production project, and you will see your functions which have been deployed by our production pipeline. And also, if you go to the hosting, you will see that our web application was deployed as well. So let's go to our application and test it out. Okay, so far so good. Now I can sign in, and it looks like our application is working. Let's maybe try to upload some photos. I will choose this nice GraphQL logo, and let's try to upload it. You can see that the file is being uploaded, and perfect, we see our dialog window, which means that the application works absolutely fine. So let's give some description to this logo and save it. Then, if I reload the page, you see that our photos are being fetched from the GraphQL server, from the Hasura GraphQL server. We can even open a photo and leave some comment, and it just works. Alright guys, that was it. I hope you enjoyed the process and you didn't encounter any major problems. So see you in the next lessons.