The Serverless Framework With AWS | Michael Vargas | Skillshare


Lessons in This Class

27 Lessons (1h 54m)
    • 1. Skillshare Serverless Course Trailer

    • 2. Skillshare Serverless Course Lecture1

    • 3. Skillshare Serverless Course Lecture2

    • 4. Skillshare Serverless Course Lecture3

    • 5. Skillshare Serverless Course Lecture4

    • 6. Skillshare Serverless Course Lecture5

    • 7. Skillshare Serverless Course Lecture6

    • 8. Skillshare Serverless Course Lecture7

    • 9. Skillshare Serverless Course Lecture8

    • 10. Serverless Course Section5 Lecture1 Project Setup

    • 11. Serverless Course Section5 Lecture2 Create Project

    • 12. Serverless Course Section5 Lecture3 Importing Project

    • 13. Serverless Course Section5 Lecture4 Creating DynamoDB Resources

    • 14. Serverless Course Section5 Lecture5 Creating DynamoDB Resources

    • 15. Serverless Course Section5 Lecture6 Creating IAM ROLES

    • 16. Serverless Course Section5 Lecture7 Deployment

    • 17. Serverless Course Section5 Lecture8 API Gateway

    • 18. Serverless Course Section5 Lecture9 CORS

    • 19. Serverless Course Section5 Lecture10 POSTMAN

    • 20. Serverless Course Section5 Lecture11 Troubleshooting

    • 21. Serverless Course Section5 Lecture12 Troubleshooting2

    • 22. Serverless Course Section5 Lecture13 Create Products

    • 23. Serverless Course Section5 Lecture14 Update Products

    • 24. Serverless Course Section5 Lecture15 Update Product2

    • 25. Serverless Course Section5 Lecture16 Delete Product

    • 26. Serverless Course Lecture29B CloudWatch Logs

    • 27. Serverless Course Lecture29C Lambda Assuming Roles






About This Class

In this course, I show you how to automate your infrastructure using The Serverless Framework.

You will create AWS resources that will be accessible from anywhere in the world, and we'll do it in a couple of hours.

We'll cover Serverless and AWS resources and touch on some application development.

This Course Covers:

  • Serverless CLI: How To Create Infrastructure from Command Line

  • Serverless Templates: We'll create templates that will stand up infrastructure

  • Develop A Product API using the Simple Web Service Microservice Pattern

Meet Your Teacher


Michael Vargas

AWS Certified Solutions Architect & Developer


Michael Edward Vargas Jr. is an American software engineer and entrepreneur who is best known for his ongoing involvement in the development of federal and private enterprise application systems using best-of-breed technologies. He is currently a member of the UXD Summit Developer Group and Chief Operating Officer of Intellipoint Corporation. He is a huge fan of Douglas Crockford and John Resig for their development in the JavaScript community.

In the mornings, at night, and sometimes on weekends, he is passionately devoted to developing real-world applications and to teaching. He started out working for Motorola and has gone on to contribute to organizations such as ADT Security Services, Interval International, and the Engility Corporation.





1. Skillshare Serverless Course Trailer: Technology has forever changed humanity and the generations to come, everything from the ways we communicate and interact with one another, to smart cities, to the rise of artificial intelligence and machine learning. The cloud makes a lot of this possible. In this course, you will learn to develop apps in the cloud and create something that only you can imagine. 2. Skillshare Serverless Course Lecture1: Gone are the days where IT is considered a luxury. According to the Monthly Labor Review, about 280,000 jobs will open by the year 2022, and out of every 10 jobs that will open within IT, 4 to 6 will be somewhere in the mathematics or programming group. It's a pretty exciting time to be in IT, and whether you're just getting into it or you've been working in it for a while, I can tell you that I don't expect things to change any time soon, especially with the rise of artificial intelligence and machine learning. Although those technologies will affect many industries, many people are going to have to take that leap, start getting more involved in technology, and be a part of it, because at the end of the day, people will need to teach the machines how to do their jobs and, in the end, how to be more like us. Remember, one of the key differences between machines and us is our cognitive reasoning, and that's just the tip of it. Top engineers are making huge salaries: big data engineers have a median salary of $155,000 a year, mobile architects are making $140,000 plus, architects in general are making $140,000 plus as well, and these salaries are expected to increase. So this is a great time to be involved, or to get into the field, as it continues to skyrocket. First, let me introduce myself. My name is Mike, and I'm really excited to talk to you about cloud engineering. I'm a certified AWS Solutions Architect and Developer, and I've been working with AWS since 2014.
I've actually been in the field for about 15 years, and, man, have things changed. IT has had such a profound effect on pretty much every single thing that you can think of, and it's really great. So if you're looking to learn more about cloud computing, and specifically about serverless architecture, I'm going to help you get there. Get excited, because this course is the first step in your journey of engineering with the cloud. Let's take a few minutes to discuss what this course is going to cover. First, we're going to talk about traditional cloud methodologies versus the serverless paradigm, and we'll talk about how you're going to develop APIs that will scale to hundreds of thousands of users with little to no administrator supervision. Best of all, anybody can do it, especially you, sitting there thinking that it's not possible. From there, we'll move on to discussing strategies for gaining and granting access to the cloud provider, which in this course will be AWS. Then we'll move on to methods for using the Serverless Framework: in this section we'll talk about how to configure your files, how to develop your files, and how to put together serverless projects that will stand up infrastructure on your behalf. Finally, for more advanced topics, you can take a look at the bonus section and hear what we have to say there. Now let's take a look at some of the features this course is going to cover. This is an introductory course to the ecosystem of the Serverless Framework, and while I'm not going to cover AWS in detail, I will cover enough information to give you confidence using the platform, particularly the services that are relevant to this course.
Next, we'll also take a look at some of the capabilities and functionality of writing and putting together your serverless projects, and we'll include some interactive exercises and coding exercises throughout the sections. Within those relevant sections, make sure you check out the resources, and head over to an online editor I put together, which will reinforce the concepts. While it's not a full-blown command line, it will definitely make sure that you understand the main points I'm trying to communicate. Finally, we'll take a look at putting this all together, and we'll walk through some examples of standing up infrastructure in detail. Last but not least, we'll have a series of quizzes that you can use to make sure that you're on track with the information that's been presented to you thus far. Just as it's important for me to tell you what this course is, I also want to make sure that you are completely aware of what this course is not going to cover. First, billing: this course does not cover your AWS bill, so pay attention to what you're doing. The good news is that as long as you don't go haywire and jump off the deep end, you shouldn't be racking up a $10,000 bill. In fact, in most cases it's going to be very cheap, or it won't cost you anything at all, because we're going to be operating within the AWS free tier. Second, this is not an AWS certification course. We do cover some overlapping topics, but this course will not be enough to clear the exams. If you're interested in a certification course, there are plenty of them out there; you can Google them, take a look around, and choose one to your liking. There are also many different options in terms of platform. Next, I want to say that this is not really a college-style course, so we're going to rely less on theory and focus much more on application and hands-on exercises.
Not to discredit the theory, because it's very valuable, and actually that's how all of this came to be; it's just that throughout this course we're really going to focus on how we use the Serverless Framework to accomplish the task at hand. And even though Serverless supports multiple cloud providers like Google, Microsoft, and many others, this course is simply going to focus on Amazon. I'm not going to cover how to use Serverless with Google Cloud or Microsoft Azure, but keep in the back of your mind that it is possible to use this framework to do so. All right, so throughout this course, I'll be right here with you. Let's get rid of all of those doubts and focus on making this happen. 3. Skillshare Serverless Course Lecture2: Ladies and gentlemen, welcome back. Let's keep this moving. In the previous section, we talked about installing Node.js and the Serverless CLI. We created an AWS account and configured our provider credentials using the AWS CLI. And finally, we learned a few quick commands to make us a little more familiar with the user experience of deploying apps with the Serverless Framework. So now let's start diving a little bit deeper into the Serverless Framework, its capabilities, and essentially all of the things it can do for you. With our AWS IAM credentials and configuration out of the way, we're now positioned to actually start doing something with Serverless. So let's go ahead and create a function that runs on AWS Lambda, which by default is highly fault tolerant, highly available, and scales to millions of users overnight with zero administration. Are you ready? Well, first, we'll need to open the command prompt. In my case I'm using Windows, so I'll be walking us through that; if you're using a Linux-based OS, BSD, or anything like that, just adjust accordingly. Here, we're going to navigate, in my case, to the C: directory.
Now, in order for us to create the actual service, we're going to run the serverless create command. Let's take a look at what that looks like. Were you expecting something a little more complicated? Yeah, I know, it's pretty incredible, isn't it? The create command does exactly what it sounds like: it instantiates and creates an AWS service using CloudFormation, an AWS service that programmatically provisions infrastructure on your behalf. Serverless itself is an abstraction layer, here in the context of AWS; Serverless actually supports many different cloud providers, but for the scope of this course I'm concentrating on AWS. As you can see here, we can use the Serverless interface to stand up the instances and the infrastructure behind the scenes. Since Serverless supports multiple cloud providers, we need to indicate the type of service we wish to create. To do that, we simply provide an argument, the template argument, and we specify the cloud provider as well as the language implementation, because remember, each of these cloud providers supports many different languages. In our case, we're telling Serverless that not only do we want to use AWS, but we also want to use Node.js to implement our AWS Lambda function. Simplicity is often the best approach when it comes to the Serverless Framework and everything the different cloud providers offer, so our strategy in the context of this course is divide and conquer: we're just going to focus on AWS, as mentioned previously. Otherwise, this course would probably be the longest course you've ever taken in your life. We'll be developing in JavaScript on Node.js, and that's going to run on AWS, so we're going to use the aws-nodejs template. Now, finally, we're going to provide a name and a path.
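Putting the pieces from this lecture together, the terminal session might look like the following sketch (the project name and path are illustrative):

```shell
# Scaffold a new service from the built-in AWS/Node.js template,
# placing the generated files under the given path.
serverless create --template aws-nodejs --path hello-world

# Move into the generated project directory.
cd hello-world
```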
We're going to use the typical hello world as our first example. You can see we just pass the --path option, and then we pass the actual name of the project that we're going to end up creating. So this is how we create our actual service; it is literally one line, pretty cool. Once this command is done running through its build lifecycle, we'll have a directory structure that's created, and in this case it's going to match the path that we passed in our first command. Now, if we don't add any type of path, we're just going to use what's in our current directory. So here, you can see that in order to navigate to the project for this specific example, we will change directories into hello-world. 4. Skillshare Serverless Course Lecture3: You may be wondering how we're able to do all of this work with so little effort, and that's really a good point. We haven't really talked about how we can define services yet, so why don't we do that? In this section we're mostly going to focus on creating services. And yes, you've already done this, but now that you've got an idea of what's involved, we can finally reveal what's behind the curtain. We're going to take a look at three files which we can use to define our services: event.json, handler.js, and serverless.yml. Each of these provides capabilities to the services that we deploy. They all play an important role in the big picture, and we're going to describe each in more detail here. After that, we'll revisit the deploy command with our new information, and finally, we'll talk about the remove command. For this section, make sure you've got an AWS account, the AWS CLI, Node.js, and Serverless installed, and that you've either got a default user or role set up on your computer if you plan on following along, as each of these is a prerequisite for this section. All right, let's talk about services.
The Serverless Framework uses three files when creating a service; two are mandatory and one is optional: event.json, handler.js, and serverless.yml. As mentioned previously, each of these plays an important role in the overall development of your services. So now let's take a look at each of them. The Serverless Framework uses a file called serverless.yml to interpret and provision your service. The serverless.yml file acts as the definition file that describes your service in detail. This is where we can define the AWS resources, events, and Lambda functions that the service will use. This is the file that is used during the deployment process to stand up our infrastructure, and it's how we define our services, our databases, roles, etcetera. It all comes from this very important file. Our next file, equally important, is the handler.js file. This is the file where you write your custom code, and in this course we're writing a Node.js JavaScript-based function. This means that the code we'll be developing goes into this file. It's the handler.js file that contains our JavaScript and what Lambda is going to invoke when it's being executed. This is where we can place code to call other services, trigger events, or even respond to specific events. The next, optional file you might want to know about is called event.json. If you have a function that needs specific data passed to it in order to fulfill some type of request, you can accomplish this by using event.json. This file is used to send information to your service, and what's nice about this is that it makes testing your function clean and even allows you to treat it sort of like a unit test. A design goal and best-practice principle is that your Lambda functions should do one thing and do that one thing really well. In this example, the function acts as a unit of work that is used to accomplish a specific task.
And that's really why I say it's similar to a unit test: unit tests tend to be atomic, and they attempt to do one thing and do that one thing well. Okay, no more hello world. You've got the big picture; now it's time to get our hands dirty and start developing some services in YAML. But wait, wait a minute, what's YAML? Well, perhaps we should cover that first before we get ahead of ourselves. 5. Skillshare Serverless Course Lecture4: With our service declared, we still need to bind it to the code that will execute. So right where it says "previous slide," just make a mental note that that's where the code from the previous slide would be. Now, with that said, we need to define the function. So let's do that. As you already know, this is the function that executes the Lambda code. We do this by declaring the name of our function under the functions block. What am I going to say next? Think about it for a moment. Done thinking? If you said indentation, go ahead and do a backflip right out of your seat, and tell everybody that you know what's up with indentation. That is exactly the right answer: we need to indent here. Here we can define our function. How can we tell the Serverless Framework which function we want to execute? That's right: the handler property defines our entry point, a lot like in Node.js. It points at the file that contains our code and the function that will execute. In addition to that, we can configure a description, so that we can understand the name when we're pulling this up in the Lambda dashboard, as well as configure a timeout. The timeout is basically how long the function should take to execute before giving up. Last but not least, here is where we can define our custom tags and the aliases that we can use in the dashboard. So to recap this section, what we've really talked about is defining the function.
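A sketch of what that functions block might look like in serverless.yml (the function name, description, and tag values are illustrative):

```yaml
# serverless.yml (fragment)
functions:
  hello:
    handler: handler.hello      # file handler.js, exported function "hello"
    description: Returns a greeting (shown in the Lambda console)
    timeout: 10                 # seconds before the invocation gives up
    tags:
      alias: hello-function     # searchable tag for busier environments
```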
This is where we define our function: we give it a name, and we give it a handler, which is the code that's going to execute when it actually hits that entry point. We gave it a description so that when we pull it up in the Lambda console, we can see what it is in human-readable form. We also gave it a timeout and a tag, so that when we look up this particular function in more complex environments, where there are a lot of Lambda functions, we can just type the alias, and that will pull up this specific function. Also, I don't recommend actually doing backflips. I have to say that, just in case you try, because I can't be held liable for it. But this guy has done a front flip in his chair for you. 6. Skillshare Serverless Course Lecture5: So the service is pretty cool, right? It's coming along. But how is it going to actually save anything if we don't define somewhere to save it to? What am I talking about? Well, the contacts: we've got to put them somewhere, right? So why don't we put them into a database? Wouldn't it be cool if we could construct a database as part of our deployment, so we could actually store our contacts as they come in as part of this whole process? If we could automate that entire thing, from setting up the database table, to setting up the read/write capacities, to actually creating the data structure with the keys, etcetera? Man, that'd be great. Wow, that sounds like a good segue for this section. So guess what? That's exactly what we're going to do. We're going to define a resource that our function is going to use, and that resource is going to be DynamoDB. We've been hinting at that throughout the course, and I figured that out of all the services you're likely going to work with, DynamoDB and AWS Lambda just have a really good relationship; they're really good friends. So I figured, why not?
Let's just show that and see how it actually looks. First, we need to define a resource that our application is going to use. We need to define this using the resources block, and we can do this by defining all of the information that you might find in a CloudFormation template, but we're going to put it right here. That's where the resources tag comes in; this is how CloudFormation provisions infrastructure on our behalf. Now, we're using Serverless on top of this, so we're going to embed that piece into our serverless file, and what we're doing here is creating a contacts table. Notice that we're using an AWS data type, the DynamoDB table. As you may suspect, these are the properties that we're going to use to define our contacts table, and this will all be done and orchestrated through the serverless.yml file. Within the properties section is where we actually define the name of the actual table; we're giving it the same name to keep things easy for us. Next, we can define attributes of that table, and here we can even define the data types. Notice here I'm indicating that the attribute name is email, and that it's going to have a type of S, which represents string. All right, so let's take a look at what we've done here. We have defined a resource, and in this particular case we've embedded a CloudFormation template, particularly the Resources section, into this snippet of our serverless.yml file, and this is where we can provision and construct the resources that will be available in the cloud. So here we've defined a contacts table. We know that we're using the DynamoDB table data type, which internally means we're going to provision a DynamoDB table that has the following properties. Within each of these properties, we can define more granular settings, like the name of the table and the attribute definitions.
So here we've defined email, which is going to be our attribute name, and we've also given it a data type of string. This is the beginning of how we can provision our DynamoDB table. Continuing on inside our resources block, at the same indentation level as the properties, we can continue to define the different aspects of our DynamoDB table, the contacts table. Now, you may immediately think about things like primary keys, different kinds of key algorithms, and read/write capacities, all of the common tasks that you would find whenever you're provisioning and creating a DynamoDB table, whether that's through a CloudFormation template or directly in the user interface. So let's take a look at how we would configure that. Next up, we're going to configure the key schema for our table, and notice it sits at the same level as the attribute definitions. Within this, we're going to give it an attribute name, the same as the one defined above: email. In addition to that, we're going to give it a key type. There are different types of hashing algorithms and different ways that you can have a primary key, but we'll be using hash. Following this, we can configure and provision our throughput capacity: we'll configure our read capacity at five units, as well as our write capacity at five units. For more information about calculating these, see the resources section. So to recap, essentially what we've done here is create a contacts table that is a DynamoDB table, and then within the properties block, we define the attributes and definitions that make up the infrastructure, or essentially describe what our table is going to look like. We've included the attribute name and given it an actual value, email.
Then we've given it a data type of string, followed by indicating what type of key schema we would like to have in place for this table, in other words, the ways that we can look up the information. In this case, we're using a key type of hash, which is one of the key types currently supported on DynamoDB; you can check out the resources section for more information on that. And finally, we also implemented and configured our provisioned throughput capacity. Now, this is a pretty important topic. Again, I've included some information on this, but since this isn't an AWS resources course, I decided to keep it high level. There are definitely formulas you can use, and ways that you can calculate those read and write capacities, and that's something you want to pay attention to, because those costs can add up if you're just guessing. And the last thing you want to do, especially on a platform like AWS, is guess, when you have so much data freely available that can tell you exactly how much you're going to need. So definitely check out the resources section, and that pretty much wraps up defining resources. In conclusion, in this section we've defined a Lambda function that uses a DynamoDB table that we've deployed and provisioned as part of the overall infrastructure, and we can do all of this through the Serverless Framework programmatically, in one file and in one place. One last thing I want to mention is that if the table already exists, we're not going to override it, so that's something you don't have to worry about. In the old days, we did have to worry about this kind of thing, but these days, thankfully, a lot of people put a lot of time and consideration into those details, so you won't overwrite your table. If it's already been provisioned, the deployment will simply skip that part of the process, which is pretty cool.
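Putting the whole resources discussion together, here is a sketch of the block (the logical name and table name are illustrative; the attribute, key schema, and throughput values follow the lecture):

```yaml
# serverless.yml (fragment) — embedded CloudFormation resources
resources:
  Resources:
    ContactsTable:
      Type: AWS::DynamoDB::Table
      Properties:
        TableName: contacts
        AttributeDefinitions:
          - AttributeName: email
            AttributeType: S        # S = string
        KeySchema:
          - AttributeName: email
            KeyType: HASH           # hash (partition) key
        ProvisionedThroughput:
          ReadCapacityUnits: 5
          WriteCapacityUnits: 5
```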
So anyways, that's pretty much it for this section; stay tuned for the next one. 7. Skillshare Serverless Course Lecture6: Let's suppose we want to use webpack for our AWS Lambda application. After all, webpack is a monster when it comes to packaging and bundling, and maybe there's a really good use case that fits a specific scenario we're in. Well, we would actually be using something that's outside the Serverless Framework. So how would we go about including custom things like this in our package? You're looking at the answer: we define our custom plugins, and we use those custom plugins within our serverless.yml file. So let's check this out. Within our serverless.yml file, there is a block that we can define called plugins, and plugins, as you might think, is the location where we can extend and add functionality. This includes third-party plugins or even our own that we wish to develop. Under this section is where we can define which plugins our serverless.yml is going to include in its deployment. It just so happens that there is already a serverless-webpack plugin available out there, so we don't have to go and reinvent the wheel. We simply say that we would like to include the serverless-webpack plugin by indicating it as an array element entry. As you might imagine, Serverless plugins may have their own configuration that they require, and you need a place to specify things like the settings, variables, or values that you're going to pass to these plugins. As you might suspect, you can do this within the custom block, where we define the name of the plugin; in the case of the serverless-webpack plugin, the actual reference name is webpack.
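A sketch of the plugins and custom blocks being described (the option names follow the serverless-webpack plugin's documented settings; treat the exact values as illustrative):

```yaml
# serverless.yml (fragment)
plugins:
  - serverless-webpack      # array element entry for the plugin

custom:
  webpack:                  # the plugin's reference name
    webpackConfig: ./webpack.config.js   # path to the webpack config file
    includeModules: true                 # bundle node modules with the deployment
    packager: npm                        # which package manager does the packaging
```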
Notice that we again are using indentation, and we're defining this particular set of configuration settings that belong to the webpack plugin. With webpack, there might be some configurations or settings that we need to supply or override, and this is exactly how we would do that. When you're working with webpack, you often have a webpack configuration file; maybe you've decided to put it in a different location than normal. Well, this is how you can handle that: there's a webpackConfig entry setting where we can define the path to our webpack configuration file, or even change its name from webpack.config.js to something else. Below that, we can also decide whether we want to include modules, the node modules that go along with the webpack serverless deployment. And finally, we can even indicate the type of packaging that's going to be done; in this case we're indicating that the node package manager, npm, is going to do it. These settings are actually the defaults, but I'm showing you how you can override and configure them to tweak things according to your situation and scenario. So this is pretty much it. That's all there is to including and extending the Serverless Framework with the webpack implementation and the bundling capabilities offered by that framework. Pretty cool, right? 8. Skillshare Serverless Course Lecture7: When you start developing your own serverless architecture, one of the terms you're going to come across very frequently is events. Lambda functions are utilized pretty heavily based off of events, and events can range from all sorts of things, from a process, to a file being uploaded, to data being moved around; those are just a few cases. So let's suppose that we have an onboarding sequence where we're going to create a new contact that's joining our community.
And this includes a type of newsletter that we're going to be sending out once in a while. And maybe there's some type of batch processing that needs to happen on the back end. Or maybe we need to kick off a job. All of these types of things are the type of scenarios where Lambda would basically respond to a given event. So this is the different scenarios. Orlando execution comes into play specifically when it comes to events, events essentially trigger executions and can actually trigger these processes all that come from Landau. So in this scenario, let's take a look at how we can use events when we're creating a contact again within the serverless Diame will file within our function section is where we're going to define a create contact function. Next for the create function, we're actually going to define the handler. Nothing new here. The only difference is that wanted to show you that you can actually name a specific function within that file here. I'm actually indicating a create contact within the handler file. Next, it's time to get to the point of this section. The events object. Now notice that we can actually configure within our serverless CMO file events, which will trigger this function to execute. So this is where we can actually start tying things together with events and actually create a very loosely coupled, very powerful infrastructure and very powerful solution. And we do it all through this programmatic, um, a language Pretty cool under the events block actually included in array object Noticed. I've used the http Now, as you might suspect, when there is an http event to this path, contact stuffs create with that that matches and post request that will actually trigger this function to execute. So what are we saying here when contact is created or when this Http request is made to the path of contacts and create, and it's supposed request. And it's on the East TTP protocol. Then that will cause this function to fire. 
Now, before you fall out of your chair about it being HTTP and not HTTPS, I just wanted you to know that it does support HTTPS, and I'm going to leave that to you to figure out how; but I'll give you a hint: it's 99% there. So this is essentially how we go about using an event that fires and causes a Lambda function to execute. 9. Skillshare Serverless Course Lecture8: So that's it. That pretty much wraps up the majority of the course, and we've done a lot of things in this section. You've defined a service, you've provisioned the DynamoDB table, you've included webpack in your deployment, and you've started figuring out how you can coordinate complex workflows using events. I hope that you found this section enjoyable as well as informative and, most of all, useful. Hopefully you can incorporate a lot of these practices into your current role and really start making a huge impact on your automation, your productivity, and your overall effectiveness as a developer or a DevOps engineer, or maybe somewhere in between. Thanks for watching the course, and don't forget to check out the bonus section. 10. Serverless Course Section5 Lecture1 Project Setup: I want to talk about something really important, something that will stop you dead in your tracks from reaching your goal. I'm talking about missing hands-on practice. The good news is that this section is dedicated to giving you a basic project that you're going to build and put together. We're going to apply the serverless framework and create an API, and by the end of this module you'll be able to create one that will get, create, update, and delete products, or informational products. Up to this point, I've talked about commands, concepts, and development in general that will be necessary to complete this section. So if you haven't completed those, make sure to check them out before heading into this section.
After putting together this API, you'll have the fundamentals necessary to apply the serverless paradigm and put together a basic API. For more advanced topics, you can check out the bonus section. But there are a few things we need to figure out first. We need a backend service, but which AWS services are going to be right for what we need, and more importantly, what are those? Let's take a look. The technologies we're going to use for this project are API Gateway, AWS Lambda, and DynamoDB. API Gateway will be used to expose our Lambda functions to the web, and the Lambda functions will contain our delegation logic. DynamoDB is a NoSQL database that will be used to house the product information. Now, this API is going to follow the REST pattern, which essentially leverages HTTP verbs to control the action intended by the client's request. Notice that while the endpoints are the same for some of the methods, the parentheses indicate the HTTP verb that will be used to control how we invoke them. This is essentially following the REST-style implementation. Each Lambda function will perform a single action; that is, it will either read, save, delete, or update a product based on the command. Finally, the data for the delegated operation will be stored in the DynamoDB table, which is an unstructured, NoSQL database; it can contain all sorts of data with varied structure, just so you're aware. So with that said, let's start working on the project. 11. Serverless Course Section5 Lecture2 Create Project: First things first: we need to create the project structure using the serverless command. But I also want to make sure it's been installed properly, so I'm going to run the version command. Once that completes, it should output the version, and if I need to update I can do that, but I'm going to stick with what I've got here. Now I'm going to run the create command as you've seen before, which in this case specifies an AWS Node.js template runtime, and finally a path of store, which will make more sense in a few minutes. With this in place, we can generate the boilerplate template for the serverless project, and after it's been created, we can navigate into that store directory (it actually generates a directory). Once we navigate into there, we can list those files, and you'll see the key files I also mentioned previously. 12. Serverless Course Section5 Lecture3 Importing Project: I'm going to use my favorite IDE for this course, which is Visual Studio. If you haven't used it, stop what you're doing right now and go check it out; you can find the download link in the resources section of this lecture. This IDE is just fantastic, and Microsoft has really done an outstanding job. But before this turns into an infomercial about Visual Studio, let's get back to serverless. So I'm going to select on the left top. 13. Serverless Course Section5 Lecture5 Creating DynamoDB Resources: Next on our list, we'll want to stand up a DynamoDB table that's going to house our product information. To do this, we need to create a resources section defined within the serverless.yml file. All the content beneath this block is going to look exactly like a CloudFormation template, and that's because it is actually a CloudFormation template embedded into our serverless file. That's also why you see that redundant Resources block; notice that there's an uppercase R. So we're going to create the table, and we're going to call it ProductTable; this will be the block we use to stand up the DynamoDB table. Before we can do anything, we need to define the actual type this is going to be. In this case, it's a DynamoDB table; that is the type.
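To sketch where this is headed, the full resources block assembled over this lecture might look like the following (the table name and throughput values are the ones mentioned in the lecture; treat it as an illustrative CloudFormation fragment):

```yaml
# serverless.yml (fragment) - embedded CloudFormation
resources:
  Resources:
    ProductTable:
      Type: AWS::DynamoDB::Table
      Properties:
        TableName: ProductTable
        AttributeDefinitions:
          - AttributeName: productId
            AttributeType: S        # the id is a string
        KeySchema:
          - AttributeName: productId
            KeyType: HASH           # partition (hash) key
        ProvisionedThroughput:
          ReadCapacityUnits: 5
          WriteCapacityUnits: 5
```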
Now we need to define the properties, which describe the characteristics of the table we're creating. One of those things, obviously, is a name, so we're going to give it the same name as the block; I'm going to call it ProductTable. Then I need to give it some attribute definitions that describe the ID of this structure. We'll give it an attribute name, and that's going to be productId, and below that I'm going to give it the characteristics of that primary key: we'll give it the type string, which describes the data type of the ID itself. Once that's defined, the next thing we need to do is describe the characteristics of the ID from an indexing standpoint, so we'll use the KeySchema block. Below that we also give it a name, the same name as the attribute defined above, and a key type; there are different key types you can use, and I'm going to use HASH for this. And finally, to completely describe the table, we need to define the read/write capacity block. I'm going to go with five, but as mentioned in the previous lectures, you can calculate the amount you would like based off of your expected performance and traffic. And that essentially makes up the whole block that will stand up our DynamoDB table. Pretty incredible: a little less than 15 lines. The next step, in order to actually invoke any queries against this table, is granting the ability to do so. In the next section, we're going to start defining roles that have the privileges, so that our Lambda functions can properly communicate with the table. 14. Serverless Course Section5 Lecture5 Creating DynamoDB Resources: 15. Serverless Course Section5 Lecture6 Creating IAM ROLES: By now you're hopefully getting an idea of how we go about creating AWS resources. Next, we need to create a role and policy that have the ability to execute against the resource we just created, because remember, our Lambda functions will be executing, and they're going to need to assume a role that has the permissions to execute against that table. So that's exactly what we're going to do here, and all of these tend to follow the same pattern: there's a type, followed by properties, and the properties essentially describe everything we're going to define and what it will be used against. Essentially, we're describing the permissions that will be used when this role is assumed, and specifically we're looking at AWS Lambda functions, so we need to give our Lambda functions the ability to assume the role, and we'll do that by following these guidelines right here. Here I'm indicating which service has the ability to assume this role, the one we're defining, and I'm indicating that it's the Lambda service. This is basically saying: yes, whenever this Lambda function executes, assume this role.
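A sketch of the role and policy block being built here (names follow the lecture's nomenclature; the DynamoDB actions listed are the ones named in the lecture, so add others such as GetItem or PutItem as your handlers require):

```yaml
# serverless.yml (fragment) - embedded CloudFormation
resources:
  Resources:
    ProductTableRole:
      Type: AWS::IAM::Role
      Properties:
        # Allow the Lambda service to assume this role
        AssumeRolePolicyDocument:
          Version: "2012-10-17"
          Statement:
            - Effect: Allow
              Principal:
                Service:
                  - lambda.amazonaws.com
              Action: sts:AssumeRole
        Policies:
          - PolicyName: ProductTablePolicy
            PolicyDocument:
              Version: "2012-10-17"
              Statement:
                - Effect: Allow
                  Action:
                    - dynamodb:DescribeTable
                    - dynamodb:Query
                    - dynamodb:UpdateItem
                    - dynamodb:DeleteItem
                  # Wildcard ARN so any matching table name is covered
                  Resource: arn:aws:dynamodb:*:*:table/ProductTable*
```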
So I'm just going to go back through here and make sure everything's good to go. I think I see a typo, so I'm going to fix that. Definitely check your spelling on everything; some of the words are a little tricky. Also pay attention to your indentation, like I've been preaching about. Now that our Lambda functions can assume the role, we need to define the policy that makes up the rules. Remember, I said that policies are the building blocks of roles, so here we're going to indicate exactly what this role can do. The very first thing we need to do is give the policy a name. I'm going to keep following the nomenclature I've been using so far, so I'll say ProductTable plus whatever it is, in this case a policy. Here is where we describe the policy and which version we're going to use, so I'll use this version here. After that comes the statement. This is where we essentially allow or deny; remember, by default AWS resources are deny, so we're explicitly granting permissions to perform these actions. And the actions we're interested in are DynamoDB actions. Notice this is an array; this is where we indicate all of the different types of query operations you would expect when working with a database. The first one is DescribeTable, next Query. Following that, it's very tempting to put WriteItem, but that's actually not right; it should be UpdateItem. It can be a little tricky; some of the wording throws me off sometimes. DeleteItem as well. And that pretty much describes what we want to do. Now pay careful attention to the Resource block; this always messes me up as well. This is where we bind this policy to a specific DynamoDB table. Notice I'm indenting out and then back in; here we'll use an ARN with wildcards to match against our table, so that any string that falls into this pattern will match the product table specifically. And now we have a policy and role that can be assumed by a Lambda function, which has the ability to execute against this table. One last pointer: make sure that the type is Role, not Roles; I accidentally made that mistake. 16. Serverless Course Section5 Lecture7 Deployment: All right, it's time for the moment of truth. Let's see the promise of the serverless framework in action. After validating our syntax and working through any issues, we're at the point where we can deploy our serverless application. If you recall, that's as simple as running the serverless deploy command. There are a number of things going on behind the scenes, so let's discuss them first. The serverless framework translates the serverless.yml file into a CloudFormation template that is then uploaded to the AWS cloud. On an initial upload, we create a logical deployment known as a CloudFormation stack. When that stack is completed, this is the template that will execute and stand up the infrastructure behind the scenes. Now, some infrastructure takes more time than others; for example, the DynamoDB table will take much longer than the Lambda function. Since we're creating this initially, we're standing up the entire infrastructure; that said, when we run it again, it will skip the steps that don't need to be recreated. Once the CloudFormation template executes, you can see that we created a store API in the us-east-1 region, and by default we've deployed it to a dev environment. Notice those functions: those are Lambda functions. So let's take a look at the AWS console and see what we just did in action. First, let's look at DynamoDB. I'm going to look at the tables, and at the bottom I should see the table I just created through the serverless framework. If I click on this, you'll see all of the properties, essentially what we've configured and specified in our serverless.yml; where we haven't specified something, the default setting applies. You can see we set up our primary key for our table structure. Over in the Lambda functions, let's take a look; we can see exactly what we put in place. Once I've got the serverless service up, I'm going to filter on the keyword "products," because I do have other stuff going on in there; I've got 37 of them. So this is the plural read products. There you can see the actual service that was specified in the serverless.yml; notice the syntax: the API, slash environment, slash the name of the function we specified. There's the code that we set. I'm going to go back and put "product" as a filter so you can see the rest of them were deployed as well: there's update, there's delete, read all; all of them are essentially there. If you scroll down, you can see these files being loaded, but notice the role that we've set up as well. Remember, we specified this, and using that naming convention it actually put together a role on our behalf. If we click into this, we can see the permissions tied to the policy we just created. If I expand the policy, you can see the permissions for DynamoDB, and that's basically everything we specified in the policy in our serverless.yml file, in less than 200 lines. Not bad. 17. Serverless Course Section5 Lecture8 API Gateway: We've put together the DynamoDB database and also the Lambda functions, but now it's time to expose these Lambda functions and make them accessible across the web. We're going to do this using API Gateway.
So first up, we need to add an events block to our function. Then we're going to specify that the event that triggers this function to execute will be an HTTP event. The path will be the same path we described in the first part of the project. Since we're creating an object here, a product, we're going to specify a POST method. We're going to follow pretty much the same pattern throughout each of these methods, but the difference is that for updating a product we're going to have an ID, so we're going to leverage that ID in the actual URI; here, instead of a POST, we'll do a PUT. Following that, read product will essentially follow the same pattern: we'll have an ID that we'll use to query against the database and return the object, so that's what we're going to pass into this method, with GET as the actual HTTP verb. Following that, we'll also do a read, but notice the difference: we're going to use a plural noun instead of a singular noun, as well as the method GET. And finally, we'll also use the ID to specify the delete operation, and the HTTP verb here will be DELETE. So let's review: we've put together each of these methods and exposed them so that each of the Lambda functions will execute when an HTTP request is made, given that it matches the path and uses the correct HTTP verb to invoke that function. All right, one last thing: I need to fix my syntax here. That http should have a colon at the end of it; in Visual Studio it'll be blue, and if it's red, that means it's not right, so we've got to fix all of these. Also notice the indentation for the path and the method: they're indented under the http array, and I believe that should be that. Now, if it is incorrect, it will show an error message down at the bottom so you can see what that looks like, but essentially that pretty much means you need to indent. All right.
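Assembled, the five methods just described might look like this sketch (the handler and path names are assumed for illustration; the verbs and the `{id}` path parameter follow the lecture):

```yaml
# serverless.yml (fragment) - illustrative names
functions:
  createProduct:
    handler: handler.createProduct
    events:
      - http:
          path: product
          method: post
  updateProduct:
    handler: handler.updateProduct
    events:
      - http:
          path: product/{id}
          method: put
  readProduct:
    handler: handler.readProduct
    events:
      - http:
          path: product/{id}
          method: get
  readProducts:
    handler: handler.readProducts
    events:
      - http:
          path: products        # plural: read them all
          method: get
  deleteProduct:
    handler: handler.deleteProduct
    events:
      - http:
          path: product/{id}
          method: delete
```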
Now, after running the serverless deploy command, we're going to get a whole summary of the new infrastructure that we stood up. Remember, it's going to skip the steps we've done previously, so it won't be setting up the DynamoDB table again. However, we are generating the endpoints for the Lambda functions, so that information will be displayed in the summary right here. I'm going to take this information and put it in a readme file so that I can refer to these endpoints later. You'll notice the names are pretty complicated, and you might not remember them off the top of your head; if you do, you definitely have a great memory, and I'd like to borrow it. With that said, I'm going to put this in the readme so I'll have this information for later, and we will need it, because we're going to be invoking this externally, outside of the AWS cloud as well as outside of the serverless framework project, in an upcoming lecture. So keep these endpoints handy. With that in place, we've got everything up and running. The next issue we need to tackle is dealing with CORS, so let's take a look at how we deal with that. 18. Serverless Course Section5 Lecture9 CORS: Before we continue with the project, I wanted to make sure that you all have a good foundation on an important concept that you're likely to encounter when developing backend APIs, and that's particularly true if you're developing on the AWS platform. I'm talking about CORS. Cross-origin resource sharing, better known as CORS, is a mechanism that uses additional HTTP headers to basically allow a request to happen or not. Now, if a request comes from the same place, the same domain, as where the API request is being issued, then CORS doesn't really apply.
CORS comes into play, say, in a situation where you have a backend deployed to the AWS cloud and you have a front-end application running on a different domain that attempts to make a request; in that case CORS does apply, and we're going to take a look at how the whole workflow works: when we make the request, what happens, what's going on behind the scenes, and how that overall process looks. Let's take a look at how the workflow might happen. Suppose the user is interacting with a client-side application that runs on domain A. If a request is made to another domain, say domain B, a preflight HTTP OPTIONS request is made to the server. If and only if the correct headers are on the request, a second request, in this case the actual HTTP request, is automatically made, and the request continues as normal; otherwise, the browser will prevent the second request from occurring. Now that you're familiar with this workflow, let's look at some common terms you're likely to encounter, especially in the next section. The browser essentially needs a way to keep track of the requests being made, and if you think about it, one of the very important things it needs to track is where the request came from and where it is going; keeping track of this is essentially a way to understand whether the request is valid and intentional. Origin is an HTTP header that indicates where the request came from, and it's possible to whitelist domains so that requests can only come from certain locations; otherwise those requests will be denied. However, it's also possible to use wildcards, and there is one caveat I do want to mention, one I found personally when I was developing some apps for different mobile devices, particularly Samsung devices.
I noticed that the Samsung browser does not natively support wildcards, so it will actually end up breaking your functionality. That's something to keep in mind if you do decide to go down the wildcard route: there are some browsers that support it and some that don't, so it's one of the calls you have to make. In either event, whitelisting is definitely supported by all browsers, and you can put together a list that will allow requests to originate only from specific locations, arriving at particular destinations. Remember, I said on the previous slide that the browser looks at the specific request headers sent in the actual request; well, where are those request headers examined? That is essentially what we're looking at here. This Access-Control-Allow-Headers value is a list of request headers, specified on the server side, used to determine whether a request is valid or not. In this case, when you're dealing with AWS HTTP CORS-type requests, these are the headers that are expected by default when you make a request to the API. In other words, when I make an HTTP request, these are the headers I need to pass by default. Now, the X-Api-Key and the X-Amz-User-Agent are things that I put in, as well as the security token; those are extra layers. But at the end of the day, when you set up a default configuration, there will be a set of default parameters expected in HTTP requests when you make those calls to the AWS cloud, to API Gateway, and it's important that your requests conform to this list of allowed control access headers. Finally, there's a setting called allowCredentials, which, in this lecture's framing, tells the client whether the required request headers are allowed to be shown to the user.
So imagine a scenario where a hacker is attempting to hack a site and trying to figure out what headers they need to pass so they can bypass CORS. One of the first things you'd probably look for is the list of headers that are sent back in the browser. What this flag does is essentially keep that list of header parameters a mystery. For this course it's pretty much overkill, but it's something I want you to keep in mind as you go through developing your own APIs; if you do want that added layer of security, you can consider adding this flag and basically obfuscating your requests. Another thing you could do, instead of using the default OPTIONS handling that's supplied, is override that whole OPTIONS HTTP functionality and implement your own Lambda function. Some of the things you might introduce in these types of requests are a nonce, or time sensitivity; all of those types of things. You can take your HTTP headers a degree further and add additional security so that your requests can be time based or token based, and you can even add additional layers of AWS security onto each individual request; this is where all of that can happen. One last thing I want to mention, because I might not have called it out, or you might not have seen it on the first slide: CORS applies to Ajax requests, so I'm talking about JavaScript-specific requests. But that said, what application isn't using Ajax requests these days? Pretty much all modern applications use them, so it is still very important for you to deal with. All right, let's get back to serverless.yml and the serverless framework and see how we can configure CORS. Are you ready? cors: true. Yep, that's about it.
But you might want to know what's going on behind that true. So what I'm going to do is get rid of this and go for the more expanded version, so you can see how you might add some custom requirements or headers or things like that. Instead of using the shortcut, I'm going to start adding the configuration for CORS requests here. The very first one is origin; remember, origin is where the request comes from. By using the wildcard, I'm basically saying that the request can come from anywhere, so I'm opening that access up. Now, by default, these are some of the headers you might see when you're working with AWS if you deployed manually. I'm going to go into this list, which actually takes an array of header parameters that it expects, so I'm going to indent in and start adding each of these individual parameters. Okay, I made a few typos here, so I'm going to go back and clean those up; this Authorization doesn't look good. And one thing to keep in mind with a lot of the Amazon headers: they actually contain Amz, so you might see a Max here and there, and that's wrong; I need to fix all of this. What it should really be is X dash Amz, and then usually a common name you're likely to have seen in your browser before. If you've ever looked in your browser, you'll see all of these headers that come in with the request; if you go into the network tab and check out the preview of the request, all of this information is listed in Chrome or Firefox or pretty much any browser out there, so you can start seeing what's going on with all of these different requests, if you're curious and want to take it a step further. Okay, and just to emphasize this: remember, I said that we can display to the client what the list of allowed parameters is. I'm going to go ahead right now and keep this at false.
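The expanded block being typed here might look like this sketch (the header names match API Gateway's usual defaults as described in the lecture, but verify against your own deployment):

```yaml
# serverless.yml (fragment) - expanded CORS configuration
functions:
  createProduct:
    handler: handler.createProduct
    events:
      - http:
          path: product
          method: post
          cors:
            origin: '*'              # allow requests from any domain
            headers:
              - Content-Type
              - X-Amz-Date
              - Authorization
              - X-Api-Key
              - X-Amz-Security-Token
              - X-Amz-User-Agent
            allowCredentials: false  # kept at false, per the lecture
```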
We're going to play around with the settings quite a bit, but this essentially hides that list from the client. All right, with all of that in place, we're now ready to deploy and see what it looks like. So let's log into the AWS Management Console and take a look at what we've just done on the deployed cloud. I'm going to go to the API Gateway console, and you'll see that my API gateway, which will be dev-store, is deployed in the center. I'm going to click into it and show you the methods. There are all the methods we've already put together: you can see the GET, the POST, the PUT, the DELETE; all of that is up and running, and there's the OPTIONS request. Now, what I want to do is go to this API here; you can see the OPTIONS request, that's the preflight request, followed by the POST request. These are the things that are going to be happening, similar to what I showed you on that workflow diagram. If you drill into each of these, you can see how those requests look. Right now we've got this mock integration, but this is where you'd essentially override it to implement that nonce or time-sensitivity functionality. In the POST request, you can see where it actually ties to our Lambda function; that Lambda proxy is what ties into our Lambda function, and it's essentially a pass-through. Now let's take a look at CORS. I'm just going to click Enable CORS; I'm not actually going to enable anything, but I want to show you this list of parameters right here. This applies specifically to the OPTIONS and POST methods, and within the Access-Control-Allow-Headers, here are those lists of parameters that we've configured.
So these will be the request headers expected when we make a request, and notice the allow origin is a star. Now, by default, when we deploy, you'll see here it's just a GET request; there is no OPTIONS request here, because we haven't set up CORS for this method yet. That will be the next step. By default, you can see these are the settings that are in place; even though it's not explicitly set up, this is what comes out of the box. For the remainder of these methods, I'm just going to enable the default functionality, because we're going to use this to help us in the next few sections; I do want to touch on troubleshooting using some external tools, and I'm going to show you some logging capabilities, so we're going to explore that in the next few lectures. Essentially, if you're following along, just go through and use the shortcut method for now, and this will at least give you a quick win for the time being. We'll revisit this a little later when we start implementing the logic for each of these different operations. So that pretty much wraps up this lecture on CORS. That was a long one, but I hope you enjoyed it, and I hope it's given you enough foundation for what we're going to need to do in the next two lectures, because we're going to be using Postman, and we're going to configure these headers and these requests so that we can run all of this through a client-side tool. 19. Serverless Course Section5 Lecture10 POSTMAN: All right, so now let's install Postman, and let's use Postman to actually communicate with our API in the cloud. Once you've downloaded and installed it (by the way, it's supported on all platforms), in the upper left corner you're going to see all of these collections that I've been using; if it's your first time, that will be empty, so don't mind those.
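The request we're about to send hits the boilerplate handler generated by the aws-nodejs template. A minimal sketch of what such a handler looks like (the function name readProducts follows the lecture; the message text is illustrative):

```javascript
// handler.js (sketch) - echoes a message plus the incoming event,
// matching the boilerplate behavior described in the lecture
const readProducts = async (event) => {
  return {
    statusCode: 200,
    body: JSON.stringify({
      message: 'Your function executed successfully!',
      input: event, // the event object passed into the execution
    }),
  };
};

module.exports = { readProducts };
```

Invoking it through API Gateway (or Postman) returns that message together with the full event object, which is exactly the response shown in the client.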
What we're gonna do is create our own collection. This is where we can put all of our requests in one place, so that we can categorize them and make them nice and convenient. I'm just gonna go ahead and give it a name. And then, you know, you can share this with your friends, you can run it — you can do all sorts of really cool things, so it's definitely worth spending some time going through. So I'm going to create a request here, and it's gonna be to get the products. Remember, the deployment summary shows us the endpoints. I'm just gonna copy this — but if you don't have that in front of you, hopefully you remembered to copy this content into your README file so you have it available; otherwise, you'll have to look it up in AWS. With that said, I'm gonna paste it in here, and then I'm gonna save this request into that specific collection, give it a friendly name, save it there, and we're ready to go. So I'm gonna go ahead and send this request to the API, and then I get that response — you can see there's a message and an input object that I'm getting back. If we take a look at the handler.js file back in Visual Studio, you'll see up here there's a function called readProducts. Remember, this is the function that just simply returns a message and an input event. This event object is what's passed into the actual function's execution. And there's our message, and you can see it being displayed back in the client. Pretty straightforward. One thing I want to mention, too: since we're invoking this from a standalone HTTP client rather than a browser, CORS doesn't come into play here. CORS, remember, matters when you're invoking on the client side through JavaScript in a browser, so we don't really have to mess around so much with passing the right headers and whatnot just yet.
But we'll keep exploring these different ways that we can invoke the API.

20. Serverless Course Section5 Lecture11 Troubleshooting: We're getting really close to wrapping this whole thing up. And while I don't consider this a development course, we are going to be touching on implementing the logic behind these CRUD operations, so you get to see that in action. But before we get to that, I want to take a deeper dive into some of the most common problems you're likely to face. In this section, we're going to go over those common problems, what those issues are going to look like, and how you go about solving them. So without further ado, let's get started. One of the very first things we're gonna do is actually reset our project. I'm going to remove everything up to this point, and we're gonna start from the beginning. That's the beautiful thing about Serverless: it's really just a simple command, and we can literally tear down all of the work we put in, just like that. So while it's removing all of that content, we're going to start taking a look at a list of things you're likely to encounter. I'm gonna go ahead and back this file up, because I'm actually going to be changing it quite a bit just to deliberately show you incorrect things. So I'm gonna pause the video, and I'll continue after it's all set up. All right, everything's been removed, so let's take a look at our first item. The very first thing I want to talk about is the layout of the serverless.yml file. Now, there are certain sections dedicated to each piece — for example, functions go in one place, the provider goes in another, the name of the service goes in a place, the resources go in a place. Sometimes you might get confused as to where things go. So let's take a look at what happens when we deliberately put things in the wrong order.
So I'm gonna come in here and move the provider — or maybe even the resources, because I think that's probably the more common mistake. Let's try to make it a little realistic. I'm gonna take this block and put it in the wrong place — I'll put it underneath the resources here — and then I'm gonna save it and try to deploy. Let's see what happens. The anticipation is building, folks. I'm gonna pause the video for a little bit, because it is taking a little while here, standing up the entire thing. So far, so good. I'll pause again. And there you go — believe it or not, it actually still worked just fine. The functions are here, everything looks good, the endpoints are up here. Notice that our endpoint names changed — if you go back to Postman and you're following along, you'll need to update that endpoint. But yeah, it worked. How about that? Let's go check out the AWS console and see what's going on there. I'm back in the DynamoDB console, and you can see there's my product table — it's back. Let's go check out API Gateway; we should also see that that's back up and running. Here's dev-store, there are our endpoints — all of that is set up. And then, obviously, we'll check the Lambda functions as well. I'll filter on "store" here, and look at that — they're all back, and they were deployed two minutes ago. So there you have it. It looks like the incorrect order, in this case, did not really matter. Let's try something else — let's make it really bad. Let's put this over here. This should break. Let's see if it does. All right, so we're gonna do a serverless deploy on that and try to hose our infrastructure here. And it's all done. Pretty incredible — it's all working. So I guess if anybody asks you, well, now you know the answer.
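Even though this experiment shows the ordering is flexible, the conventional serverless.yml layout keeps the top-level sections in the implied order below. This is a sketch with placeholder names and an assumed runtime, not the exact file from the course:

```yaml
service: store              # name of the service

provider:                   # cloud provider settings
  name: aws
  runtime: nodejs12.x       # runtime version is an assumption

functions:                  # each Lambda function and its events
  createProduct:
    handler: handler.createProduct

resources:                  # raw CloudFormation, e.g. DynamoDB tables
  Resources:
    ProductTable:
      Type: AWS::DynamoDB::Table
```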
Now, I'm gonna go ahead and follow the implied layout — the implied location of everything — because if you ever look at the README file, things are in a certain place, and I think that's to keep an unspoken convention. But it looks like it is flexible in that sense. Pretty interesting. So now we'll continue on to the next one. All right, next up, let's get into JavaScript modules that are not exported, or that don't match what's in your serverless.yml file. Let's see what happens in that case. The next thing I want to look at is the exported modules, so let's start messing around with this and see if we can break it pretty quickly. The very first thing I want to show you is that, remember, we export the Lambda function using the JavaScript module keyword here. We export whatever functions are available and bind them using this handler setting. So handler actually maps to the name of the function that we define inside this JavaScript file. What if we were to remove this function altogether and deploy? Let's try that and see what happens. What's interesting is that after the deployment — with the function removed entirely — notice that this createProduct is still there. So we go to the AWS console, go back to Functions, and I'm just gonna filter on the word "store", because that's the name of our service, so all of the functions will use that name. We have a createProduct function — but in the JavaScript file, we don't have an actual function named createProduct. So this is going to be interesting. What I expect to see is that it's gonna fail. Now we'll use Postman, so I'll update my endpoint to this new one here. Just make sure that's correct — I definitely have changed that a little bit here and there. So I'm gonna change that; looks good. Just make sure this method is POST.
I'm just gonna pass a dummy body — remember, we're not actually doing anything with this content yet; we're just going to return a response if it works. So I click Send, and I see an internal server error. That makes sense, because there is no create function mapped to it, even though this got deployed properly. So that's something you want to keep an eye on. Next, why don't we start mapping to functions that aren't actually exported? If I go here and change this handler to "create2", save it, and deploy, I'd expect the same thing to happen, because we're now mapping to a function that doesn't actually exist in our handler.js file. Essentially, it's the exact same problem — just a different variation of it. All right, as you can see, it did deploy, and it did create the product function. There we go. So now let's try to invoke it again. I'm gonna run the same endpoint — it is the same endpoint, but remember, that mapping has changed. We invoke it, and we get an internal server error. OK, so these are all 500 errors, and again, that's to be expected, because we're mapping to something that doesn't exist. So even though the deployment somehow, magically, succeeds, it doesn't actually work — one of the things you definitely want to keep in mind. So let's restore that, and let's change the function's name instead. We'll keep the handler correct, and I would imagine this is gonna work, because we do have a binding here that matches our handler.js file. Let's deploy that and see what happens. OK, we're deployed again, with the same endpoint, but this time we've changed the name of the actual function. Notice that it's "create2", so we'd expect to see in the console over here that the name of the function has actually changed.
So I'm gonna go back to our dashboard, and notice that we don't have a createProduct. That makes sense, because we've changed the name that's used to build this whole naming convention. If we drill into it, everything else should be relatively the same, because nothing else has changed from that perspective — the mapping is still intact. So we'll attempt to invoke this function again. Notice we've got a create2 function here. Let's see what happens when we try to invoke it this time. And there it goes — it works as expected. So remember, the key thing here is to make sure that your handler setting matches the name of the function that you export. That's it for this section. We'll continue working through troubleshooting in the next one.

21. Serverless Course Section5 Lecture12 Troubleshooting2: We're going to continue working our way through a list of common issues you're likely to encounter when you put together your serverless.yml files from scratch. The next one is bad indentation. Let's see what happens if we start messing around with that. So here, I'm gonna come into this createProduct, because that's what I've been doing, and let's say I forget that this requires an indentation. I'm gonna move that one space back out, save it, and then try to deploy. Let's see what happens. Interestingly enough, we do get an error, and it tells us that line 14 is the one with the problem. Now, what's interesting about this is that the problem can really be coming from the line above — line 13. As you can see here, if you look very closely, this is directly under http — it actually all belongs to that http object.
So when you see something like this — line 17, with this caret symbol and the colon — something to look out for is the indentation, because right here it flat-out tells you: bad indentation of a mapping entry. It gives you a ballpark as to where the problem is, but it could be coming from the line above — most likely it is. If we fix that indentation, everything will be up and running. Now let's say we deliberately mess up and forget to put a colon in a particular part of a line. The colon is a key element — it's what marks a nested structure that we're trying to put together. What happens in that case? Going back here, I've actually made this mistake, and notice that Visual Studio does highlight it for you — it tries to help, so hopefully you take note of the color change. If we remove that colon and try to deploy again, it's obviously a syntax error; let's see what the error message looks like. Right away it says: bad indentation of a sequence entry. Notice that this error message and the last one are basically the same — you get similar error messages for both missing colons and bad indentation. Here it tells you that the path is the problem, but in actuality the indentation is fine; the real issue is that this line is missing its colon. So that's another step in your troubleshooting list when you're trying to fix these things: first take a look at the indentation, and then take a look at the colons. All right, next, let's specify some invalid versions in the YAML file. I'm going to get out of this section and go down further, where we actually have these different versions. Let's say, for example, I use a version that does not exist — we're just gonna change the month here.
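As a quick recap of those two YAML pitfalls — bad indentation and a missing colon — here's a sketch of a broken versus correct event block. The path and method values are illustrative:

```yaml
# Broken: `method` is outdented one level, so it no longer belongs
# to the `http` mapping — the parser reports something like
# "bad indentation of a mapping entry" near (or just after) this line.
events:
  - http:
      path: product
    method: post

# Correct: every key under `http` sits at the same indent level,
# and each key keeps its colon.
events:
  - http:
      path: product
      method: post
```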
I'm gonna try to do a deployment, and we'll see what happens. So it got through the deployment, but notice that the operation fails. We get this Serverless error — an error encountered at the product table role: the policy must contain a valid version string — along with this huge identifier, and it basically says there's a malformed policy document that's not formatted correctly. All of that is coming from here. Now, that error message is kind of vague — it doesn't really give you an exact idea of where it's coming from — but it's something to keep in mind: check that the version is actually the correct one. It should be 2012-10-17, if you're following this exactly to the letter. Also notice that that version matches the policy document's version as well. If you start intermixing these, you might run into compatibility issues, because each version obviously introduces features, previous releases may have removed some things, some things might not be readily available, or they might require completely different configurations. So you want to make sure this is in line, and that you're using the same version where applicable. Now, there are tons and tons of actions you can use when you're putting together that policy, right? So let's take a look at what happens when we mess around with those. I'm gonna come back in here and change this to something that doesn't exist — like "super fly item", I don't know, that's just what came to my head. All right, I'm gonna try to deploy that. Let's see what happens. And it deploys. Well, isn't that interesting? But I can definitely tell you that this is probably not going to do anything — absolutely nothing. So we'll put this back to what it was.
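Both of these policy pitfalls — the version string and the action names — live in the same role definition. Here's a sketch of a well-formed role for the resources block; the logical names, policy name, and resource ARN are illustrative, not the course's exact values:

```yaml
resources:
  Resources:
    ProductTableRole:
      Type: AWS::IAM::Role
      Properties:
        AssumeRolePolicyDocument:
          Version: '2012-10-17'       # must be exactly this valid version string
          Statement:
            - Effect: Allow
              Principal: { Service: lambda.amazonaws.com }
              Action: sts:AssumeRole
        Policies:
          - PolicyName: product-table-policy
            PolicyDocument:
              Version: '2012-10-17'   # keep versions consistent across documents
              Statement:
                - Effect: Allow
                  Action:
                    # Actions are prefixed with the service name. A nonexistent
                    # action (e.g. dynamodb:SuperFlyItem) is silently ignored,
                    # so the permission you intended simply never applies.
                    - dynamodb:PutItem
                    - dynamodb:Query
                    - dynamodb:DeleteItem
                  Resource: arn:aws:dynamodb:*:*:table/productTable
```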
And that should be good — yeah, that's what we want. OK, so the moral of the story is that you can add action types that don't exist, and those actions will simply be ignored. However, the actions that you intended will then not be applied. So it's important that you apply the correct policies and permission actions to whatever you're defining. If, for example, you create something that doesn't exist but you thought it did, that could also explain why your permission isn't taking effect. So it's definitely a good idea to double-check these. There are a considerable number of them, so I'd recommend going to the AWS website and checking the permissions — you can pretty much Google any of these, and that will bring up an entire list of all the different action types you can configure for a given service. In this case, we're talking about DynamoDB, and the actions are prefixed with "dynamodb", so we have an idea of the service we're looking at. For different services, the naming will obviously be different.

22. Serverless Course Section5 Lecture12 Create Products: With all of the infrastructure we've set up so far, we're now ready to start developing the logic for the Lambda functions. We've set up our endpoints, we've set up the Lambda function mocks, we've set up the DynamoDB table, and we've also created roles that can communicate with those tables and be assumed by our Lambda functions. So now we're going to start actually implementing the logic for these different features and capabilities. We're gonna use the AWS SDK — I'm going to include it at the top of this file, which includes it for all the Lambda functions. The same file, remember, is deployed across all the Lambda functions, but each one only uses the method that's exported and mapped to that particular Lambda function.
But that said, since these are going to be used across all of the Lambda functions, we can declare them at the top here. So I'm gonna create this DynamoDB object, which is going to have the capability to communicate with our DynamoDB table. What's nice about this is it's a nice two-liner: I've got a DynamoDB object that essentially wraps that client functionality and can communicate with the database, as long as I specify the right parameters. So now let's start implementing the logic to actually save a product. Something that really helps me when I'm developing functions is putting in enough tracing information, especially when you're dealing with so many different things going on. I like to log the event object and the context object, because they give you a really good picture of what's going on within the function, and I'll usually put some information around them. Again, this is not necessarily mandatory, but it's something I personally find pretty useful. With that information, I can figure out what was going on at the point of execution and what led to either the bug or the solution — all of that tends to help. Now, you'll see here there's a body object that exists off of the event — event.body. That's actually where all the information we're gonna get comes from. If I expand the CloudWatch logs, you'll see the event object when I expand this, so you can get an idea of just how much information comes through it. It really does give you a very clear picture of what's going on at the time the Lambda function is executing. Below that, you'll see the actual body that exists off of that event object. So what we're doing here is taking that payload from the API call on the client side and getting that information sent over to the Lambda function.
So we want to basically capture that information. I'm gonna use a standard JavaScript JSON.parse here, and I'm gonna parse the event.body, which will return the payload object. That's going to be the object that I insert into the DynamoDB database. Since we're running inside an async function, I can use the await keyword, which makes my code a lot cleaner. I'm going to use the DynamoDB object with a put operation, and also call .promise(), because it does return a promise. Within there is where I start filling out the details for the table: first the table name, followed by the item. Now, for the key right here, I'm just going to use an ISO timestamp for now — we can adjust that later, if necessary — and then I'm going to pass the payload into the item. On the client side, I'll re-run this once it's actually up and running. We'll need to deploy it — it hasn't been deployed yet. Once that's all set... all right, that's ready to go. So now we're gonna go back over to our client, I'm going to invoke this once again, and then we'll take a look at the cloud just to make sure everything happened as expected. I'm gonna go into the CloudWatch logs, go back to this log, and take a look at what's actually going on. You can see everything appears to have processed properly — there are no errors being thrown. Let's take a look at the DynamoDB table. I'll refresh it, and there is the item we were attempting to save. That's what the object looks like stored in DynamoDB, and it does match what we invoked on the client side. So there you have it — that's how we can put together a create function within Lambda.

23. Serverless Course Section5 Lecture14 Update Products: Now we're gonna work on the read product API call.
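The create flow from the last lecture — DocumentClient at the top of the file, JSON.parse on event.body, then an awaited put — can be sketched like this. The table name and helper are assumptions for illustration; the parameter-building is split into a pure function so it can be checked without touching AWS:

```javascript
// Build the DynamoDB put parameters from an API Gateway proxy event.
// Kept as a pure function so it can be tested without an AWS connection.
function buildPutParams(event, now = new Date()) {
  // The client's payload arrives as a JSON string on event.body.
  const payload = JSON.parse(event.body);
  return {
    TableName: 'productTable',        // table name assumed from the walkthrough
    Item: {
      productId: now.toISOString(),   // ISO timestamp as the primary key
      ...payload,
    },
  };
}

// In the real handler (sketch):
//   const AWS = require('aws-sdk');
//   const dynamodb = new AWS.DynamoDB.DocumentClient();
//   module.exports.createProduct = async (event, context) => {
//     console.log('event:', JSON.stringify(event));   // tracing info
//     await dynamodb.put(buildPutParams(event)).promise();
//     return { statusCode: 200, body: event.body };
//   };
```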
And as you can see, this is the method we're gonna use. Now, we need the ID of the product in order to query it properly — in order to retrieve it — so we need to set up a Postman request that's going to do that. I'm gonna come in here and essentially use the same endpoint, but I'm going to put a placeholder ID in for now. I'll save this into the collection that we've created, call it "get a product", and save it there. We'll come back and clean this up. One thing I want to do first is go back into the create product and make a friendlier key. I'm still going to use the ISO date, but I'm just gonna use today's date. The way I'm going to do that is to create the product ID with a little manipulation on the actual date itself. So here, I'm gonna change that back to what it was, and then I'm gonna use some string functions on it, because it is a string at this point. I'll use replace to get rid of the characters — I'm gonna get rid of the "T", and then I'm also gonna get rid of the "Z". I'll go back here and paste this. What I'm doing is just trying to get the date formatted in a somewhat friendlier way. So now I'll come here, and what I want to do is extract today's date — I want everything up to the time portion. Now, obviously, in a real situation you might need it to the minute, or to the second, depending on the volume. Remember, that's one of the key advantages of DynamoDB — you can do a lot of throughput — but in this case, we don't really need to go that extra mile.
So what I'm gonna do is extract the product ID out of the timestamp here, and I'm going to use this as the identifier. We'll test that out and make sure it's good to go. I'm gonna then run this request — and of course, I still need to deploy this, but we can see what it looks like. Here's what I had before, where you can see this "T" and these colons that might not play well with URLs and browsers. So I'm gonna deploy this change, get it up to the cloud, and then we should be able to invoke and create a product using this friendlier timestamp instead. It's deployed now, so I'm gonna go back and invoke this request — first I'll delete the record in the database, though. Then I come back to Postman and invoke it. OK, it looks good. Now let's take a look at the key that's created, and you can see I've now got a date — today's date, when I recorded this. So now I'll be able to query that object based on the date. We'll be using this one object throughout the remainder of the APIs, so it will serve our purposes, and obviously you can adjust accordingly. The last thing I've gotta do is change that placeholder to actually be the ID of the product, because that's what we're going to invoke — it's gonna be part of the URL here. This product, when we invoke it, will return the object. Now we actually have to go to the readProduct method and implement the logic to actually return it — because, remember, we're just returning a mock right now. So I'm gonna copy and paste this, put it into the readProduct method, and then start formatting it so I can get the information I need out of it.
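The key-formatting trick from a moment ago — take the ISO timestamp, strip the characters that don't play well in URLs, then keep just the date portion — can be sketched as below. The exact replacements are an assumption based on the narration (the lecture removes the "T" and "Z"; the colon handling and final split are illustrative):

```javascript
// Turn an ISO timestamp into a URL-friendly product ID.
function makeProductId(now = new Date()) {
  const iso = now.toISOString();   // e.g. '2020-05-01T12:30:45.123Z'
  const friendly = iso
    .replace('T', ' ')             // drop the 'T' separator
    .replace('Z', '')              // drop the trailing 'Z'
    .replace(/:/g, '-');           // colons don't play well in URLs
  // Keep only the date portion, as in the lecture — at real volume
  // you'd keep minutes or seconds too, to avoid key collisions.
  return friendly.split(' ')[0];   // e.g. '2020-05-01'
}
```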
One of the things we're gonna need is to set up the actual DB table parameters, so I'm going to start working on that as well. Just taking a look at everything here: we've got the debugging information and my semicolons — semicolons are optional, but I like to add them. All right, now we're going to start creating those parameters for our DynamoDB table. Just like the last one, this is going to take a table name, and that will be the product table. After that, we need to create what's called a key condition expression. This is gonna match based off of the ID that we pass in — we're gonna be using the product ID. Essentially, this creates a query where the product ID is equal to the value that we pass in. Next, we're gonna add ExpressionAttributeValues — these are just the literals. Now we create that association by using this binding variable here, and map it to the value that we extract. We're gonna extract that from the URL above, and we'll have to do some work to get it out of the URL. There are multiple ways you can do this, but I'm just gonna use pure JavaScript. For our needs, we don't really need anything too complex — though consider that you could use a body parser, which is a more common approach. I'm just gonna extract it from the URI using plain JavaScript, because based on how we're developing the API, I already know the expected inputs and outputs, since we're in full control of that. So we're gonna deploy this and go back up to the cloud. I'm gonna take a look at the CloudWatch logs, just to see the information we have here, and invoke this one more time. We did get a response, so that's a good sign.
It's still a mock response, because we're not actually returning anything just yet, so we can go check out our CloudWatch logs. All right, now we're into the logs, and we'll go find the readProduct one down here and click into it. There you can see our debugging information — this huge object, like I said, which has pretty much every single thing we want on it. Within the requestContext, there's a path variable that essentially has the URI — you can see it right there. What we're gonna do is extract the last value from it. Notice that in this case the body is null, and that makes sense, because we have a GET request, not a POST request — one of the things that makes it a GET request is that you don't actually pass a payload. So we come back here, and you can see I'm just extracting the path from the requestContext, and then grabbing the last value within the URI — the path name — and getting that value out. It's a pretty easy way to extract this. There are utilities out there if you want to use them, but for me it's two lines of code, so I'll just do it that way. Then I'm going to log this information to the console, just so I can see exactly what I'm getting out — in case something is not right, this is a good way to get an idea of what's going on. So I'm gonna log the path, and I'm also gonna log the product ID. So the product ID is logged, the path is logged, the event body is logged. I'll deploy this again, and then I'm going to double-check that what we've got out there is exactly what we expect. We should see the path here — and there it is.
There's the path, and here's the product ID, which matches what we've put into Postman. So we're all set — we have all the data coming in, and we're extracting the values. Now we just need to query this in DynamoDB; that's the next step. We can go ahead and get rid of that mock content, because it's kind of misleading at this point. So I'll come down here and get rid of that, now that you know what's going on there, and then we'll go ahead and read this from the DynamoDB table. I'm just gonna put a little note here so you guys can see: read from DynamoDB table. Now we're going to use dbResponse, like we've been doing — so we assign dbResponse, using the await keyword because we're running in an async function: dynamodb.query, and we pass in those parameters. The query returns either an error or data, and since it returns a promise, we'll just call .promise() on it. Then we'll add an if condition: if there's an error, we return the error; otherwise, we return the data that comes back. We don't really have any particular requirements right now — it's kind of freeform, so it's up to us how we want to format the data in the response. You can obviously adjust this accordingly; this is where you'd do those types of aggregations or data manipulations in your response, happening at the API layer. Maybe we'd add some additional logic to remove some fields or whatnot, but for what we're doing, I think this is fine. So now, instead of returning the mock, we're gonna actually return dbResponse. I think the dbResponse is pretty much all we need, because it should contain essentially everything. So let's go ahead and clean this up.
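The read flow — grab the product ID as the last segment of the request path, then build a Query with a KeyConditionExpression — can be sketched as below. The table and attribute names are assumed from the walkthrough, and the helpers are hypothetical names for illustration:

```javascript
// Extract the product ID: it's the last segment of the URI, available
// on event.requestContext.path for a Lambda proxy integration.
function getProductId(event) {
  const path = event.requestContext.path;   // e.g. '/dev/product/2020-05-01'
  const parts = path.split('/');
  return parts[parts.length - 1];
}

// Build the DynamoDB query parameters for that ID.
function buildQueryParams(productId) {
  return {
    TableName: 'productTable',
    // Match rows where the partition key equals the bound value.
    KeyConditionExpression: 'productId = :productId',
    ExpressionAttributeValues: { ':productId': productId },
  };
}

// In the handler (sketch):
//   const dbResponse = await dynamodb
//     .query(buildQueryParams(getProductId(event)))
//     .promise();
```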
Let me see — I need to get rid of... you know what, we don't need the product ID, so I'll take that out. The error comes back in the response, so we don't need that either. Yeah, that looks good. I just need to fix that semicolon, so let me do that. Here we go. All right, now we've got everything we need. Let's deploy this to the cloud, and we should be able to see this actually work now — it should return real data. We'll send that, and there is our data coming back from the API. So it's really that simple. Hopefully you get an idea of how quick and easy it is to put together a backend API — this took about eleven minutes, I think, if even that. And sure, it's bare bones, but it does give you the fundamentals of putting together a read operation with DynamoDB. So with that said, we've got read and create done. The remainder of these APIs is gonna be pretty straightforward, because we've done the majority of the work with the last two. So let's start working on that next.

24. Serverless Course Section5 Lecture15 Update Product2: Next up, we're gonna work on updating a product, and it's a combination of the last two methods we've done. You can see here that it does take an ID as part of the URI, but for a PUT request we also have to pass a payload. So because of this, we're gonna set up a new Postman request. I'm going to select PUT here, and then for the payload I'll select raw and copy this payload over — here it's slightly modified. For the type, I'm going to change that to application/json, so that it's the right format. Then I'll go over to the POST request, copy the URL from there, put it back in here, and add an ID to the end of that URL. There we go — we have pretty much everything ready to go. We just need to come back.
So I'm going to save that with "update product API" as the name, save it to the collection here, and then come back and update that ID. One key difference between the update and the POST is that I obviously changed some of the fields: if you look at the POST, isNew is true and the price is 10.99. I changed the price, and I also changed the flag, so that we'd have some key features to look for when we're updating. Now let's head back to the Lambda function and start adding our debugger information, like we've been doing so far. This method, as I mentioned before, is a combination of both the read and the create, so we're going to need pieces of both. You can see here that this is the same logic we did with create, but I'm also going to extract the product ID that's coming in on the URI. So we have a payload in the event body, and we're extracting the ID from the URI like in the last method, so that we can query by the ID. When we do an update, we'll have the ID and the payload that we're going to update against. So here I'm going to use the put operation; we use the put operation for both create and update, and the difference here is in the item: I'm actually going to pass in the primary key, so I'm passing in the product ID along with the payload. Finally, at the very tail end of this method, we'll just return the product back to the client. Honestly, that's not exactly REST; if we're really following REST, this should actually be a void method, but I'm just going to go ahead and return it anyway. Keep that in mind: if you really want to follow the spec, just make sure you have a void method here and basically don't return anything. With that said, this is our update, so now let's go ahead and give it a run. All right, we'll head back into Postman and go ahead and update the end of the URL with the key we used in the GET request, so that we're updating a valid item. I'm going to go ahead and send that.
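A rough sketch of the update handler described above, combining the ID extraction from the read with the put from the create. Table and key names are assumptions, and the DocumentClient is injected so the sketch is self-contained:

```javascript
// dynamodb stands in for an AWS.DynamoDB.DocumentClient, injected for testability.
// 'productTable' and the 'id' key name are illustrative.
const updateProduct = async (event, dynamodb) => {
  const productId = event.pathParameters.id; // ID from the URI
  const payload = JSON.parse(event.body);    // updated fields from the request body
  const item = { ...payload, id: productId }; // primary key merged with the payload
  // put() overwrites the whole item, which is why it serves both create and update.
  await dynamodb.put({ TableName: 'productTable', Item: item }).promise();
  // Returning the item is a convenience; a strictly RESTful PUT could return no body.
  return { statusCode: 200, body: JSON.stringify(item) };
};
```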
Then we should see a response here, and the response looks good. So now we'll go ahead and head over to DynamoDB, refresh, click on the item, and expand it, and there we go: the fields have been updated. This pretty much demonstrates how to update an item in DynamoDB, and that pretty much concludes this section. So now on to delete.

25. Serverless Course Section5 Lecture16 Delete Product: We're in the home stretch, so let's go ahead and wrap this thing up with the delete operation. I'm going to come over here into Postman, essentially copy this, and create a delete request; I'll just come over here and select the DELETE method. Delete operations don't have payloads; we're just going to use the URL, and notice that the end of the URL contains the key. So I'm just going to go ahead and save this to the collection, call it "delete a product", and save it here. Then I'll head over to the AWS Lambda function and start updating that. I'll go ahead and do what we've been doing so far: I'm going to put in the debugger information, and since I'm extracting the ID from the URL, I'm going to reuse the code I've used before. Then I'll take a look here just to make sure everything's good to go. Yep, that looks good. So now what we'll do is use the dynamodb object to remove the item from the table. Right about there, I'll just go ahead and add a note: delete the item in the table. There I'll get the DB response like we've been doing so far, and on the right side I'll use the await keyword and then dynamodb.delete, rather than what we've been using so far; that's the method we're going to use, and it also returns a promise, so I'm just going to chain that on. Inside here, as you might imagine, we'll need the name of the table, so I'll put the table name here; that's going to be the product table, followed by a new parameter that we haven't seen yet, which is Key.
Here it's a literal map, and inside the map we use the ID that we're looking to delete, and that's going to be what we just passed in, which is the product ID. All right, with that all set, we've got our method pretty much ready to go. The final step is going to be returning something to the client. In this case, I'm just going to go ahead and return a message that says the product was removed, including the product ID that was used. I used backticks here (template literals) so I can just embed the ID into the message. Obviously, depending on your API, you might want to add more information, return a payload, or maybe not return anything at all since it's a delete operation, but that's really up to you and how you want to design and implement it. So with that all set, we can go ahead and start taking a look at this. I'm going to go ahead and deploy it to the cloud, and as soon as that's done, we'll invoke it. I'm going to go over here and invoke this URL; I'll just double-check to make sure that's all set and that we have data in there, and then I'll invoke it. Notice I get the message back from the cloud saying it was deleted. We'll double-check that in DynamoDB over here; here's the operation where I actually did the deletion. It's under "products", not "product"; that was a mistake, but easy enough to fix. So when I refresh, the item is removed, and the delete operation is working. This concludes the delete operation as well as the product API, and I hope you realize just how easy it is to put something like this together. Once you get the hang of it, you can knock these out pretty quickly, within hours, honestly, which is pretty incredible. Thanks to the Serverless Framework, which does all of these wonderful things for us, working with the cloud is so much easier. 26.
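The delete step described above might be sketched like this. As before, the table name and key name are assumptions, and the DocumentClient is injected so the sketch runs without AWS credentials:

```javascript
// dynamodb stands in for an AWS.DynamoDB.DocumentClient, injected for testability.
// 'productTable' and the 'id' key name are illustrative.
const deleteProduct = async (event, dynamodb) => {
  const productId = event.pathParameters.id; // key taken from the end of the URL
  // delete() takes the table name plus a Key map identifying the item to remove,
  // and .promise() lets us await the operation.
  await dynamodb.delete({
    TableName: 'productTable',
    Key: { id: productId },
  }).promise();
  // Backticks (template literals) embed the ID in the confirmation message.
  return { statusCode: 200, body: `The product ${productId} was removed` };
};
```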
Serverless Course Lecture29B CloudWatch Logs: When you're working with AWS policies, you're going to immediately notice that they are extremely granular. Right now we have a policy that has the ability to communicate with the DynamoDB table; however, we're not logging anything, and we need to explicitly say that we'd like this policy to have the ability to create logs so that we have tracing, to update logs, to write them, and also to rotate them. There are all sorts of permissions that we need to apply here so that we can do that. So I'm going to come into the statement, and notice that the statement is an array, so I'm going to add another element to this statement block, and this next block is going to be specific to logging. What we want to do is allow the ability to create logs. AWS uses the term "log stream", so we're going to create log streams. We want the ability to put log events, meaning that when the Lambda function is executing and logging information, we want the ability to put that log, or those events, into a log on AWS. We also want the ability to describe streams and to get log events. Finally, in order for us to have this capability, we need to identify which resources are going to be creating these log events. So we're going to come in here and give this log stream the actual name, the one we've defined, or that is already defined in the cloud, and then we'll continue to do this for each of the individual functions: one for create product, one for update product, one for read product, and one for delete product. This will basically set us up with the ability not only to communicate with DynamoDB but also to log all the information to CloudWatch, and CloudWatch is going to be the tracer.
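A policy statement along these lines would grant the logging permissions described above. The function names and log-group ARNs here are illustrative placeholders, not the course's exact resource names:

```yaml
# Additional element in the role's policy Statement array, specific to logging.
- Effect: Allow
  Action:
    - logs:CreateLogStream
    - logs:PutLogEvents
    - logs:DescribeLogStreams
    - logs:GetLogEvents
  Resource:
    - arn:aws:logs:*:*:log-group:/aws/lambda/createProduct:*
    - arn:aws:logs:*:*:log-group:/aws/lambda/readProduct:*
    - arn:aws:logs:*:*:log-group:/aws/lambda/updateProduct:*
    - arn:aws:logs:*:*:log-group:/aws/lambda/deleteProduct:*
```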
It's the AWS service that logs all of the traffic that's happening on the cloud. So when we have this policy enabled and created, then as the function executes, we'll have all that information available. It's extremely important that we have that, so that we have traceability and can do debugging and troubleshooting and whatnot. So this is how we can put together a policy that can execute and that will be assumed when the Lambda function is invoked.

27. Serverless Course Lecture 29C Lambda Assuming Roles: Okay, so we've created our product table role, which has the ability to integrate with the DynamoDB table. It also has the permissions for logging, and all the permissions are essentially configured here. The idea is that we want this serverless.yml file to encapsulate our entire deployment, including setting up users and user permissions, as well as provisioning AWS resources and configuring our AWS Lambda functions and our API Gateway, so that we can invoke this function externally from the web. One thing that we have to do, though: if you notice in our actual Lambda function definitions at the top here, we don't define anything in terms of the role; we haven't given any explicit permissions, and what happens in this case is a default role is generated. What we want to do, however, is use the custom role that we've defined within this serverless.yml file, and it's actually pretty easy. All we really need to do is specify the name of the role that we've defined. So here, within each of these functions, we go through and use the role keyword, followed by the name of the role we've defined, in each of the functions.
By doing this, when the Lambda function executes, it will use the role that's been created within serverless.yml, and it will then have the ability to invoke the resources that we stand up and provision. The beauty of this is that it's all in one file; it's all encapsulated as part of the same deployment, and this really does give us a true way to stand up our infrastructure very straightforwardly and very easily, in less than 150 lines of code. Not too shabby.
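Wiring the custom role into a function in serverless.yml looks roughly like this; the handler, path, and role names below are illustrative, with `role` pointing at the logical name of the role defined under the resources section:

```yaml
functions:
  readProduct:
    handler: handler.readProduct
    role: productTableRole   # logical name of the custom role defined in resources
    events:
      - http:
          path: product/{id}
          method: get
```

With a per-function `role` set like this, the Serverless Framework skips the generated default role for that function and attaches the custom one instead.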