Transcripts
1. Introduction: Hello and welcome to my course, Microsoft Azure Functions: Developing Serverless Solutions. I'm your instructor, Trevoir Williams, and I've been a software engineer, lecturer, and code practitioner for the past decade. In this course, we're going to be focusing on Azure Functions and how they work. We'll look at how to create them, test them, and deploy them. We'll look at the different function trigger types and how we can make them interact with external services, provisioned by Azure and otherwise. Going through all of these concepts, we will also interact with different development tools, all geared towards helping us be as productive as possible: tools like Visual Studio, Visual Studio Code, and the Azure Functions Core Tools. We'll look at the Storage Emulator and the Cosmos DB Emulator. Now, you're probably a little fuzzy as to what Azure Functions are. Azure Functions is one of the quickest and easiest ways to get your code running in Azure while maintaining a cost-effective and serverless model. In other words, you can write an application without having to worry about the infrastructure or any servers, and you can deploy it to the cloud and have something up and running within a matter of minutes. It also supports many languages, so even though we will be focusing on .NET and C#, you can also use JavaScript, TypeScript, Python, Java, and other languages right out of the box. Now, throughout the course we will be focusing on Azure Functions, but we will interact with other Azure technologies such as Azure Storage, Azure AD, Azure Cosmos DB, Application Insights, and App Services. Let me say again how excited I am to have you in this course, and I can't wait to get started. See you soon.
2. Introduction to Azure Functions: In this section, we're going to be demystifying some of the mysteries that surround the concept of Azure Functions. One, I'm assuming that you already have an active Azure account and that you have enough credits to get through the activities related to this course. Some of the things will cost money, but we're not at the money-costing part just yet. I just want us to get comfortable with how the Azure portal works, how we get to interact with functions, and some of the concepts and keywords surrounding their operations. So stick around. We're going to get to understand exactly what Azure Functions are and look at the various ways that they can be used.
3. Create a Function App: Alright, so let's get right into understanding Azure Functions. Now, at this point I'm making a few assumptions: one, that you already have an active Microsoft Azure account, and that you have your credit card set up or you have the free credits available, as some of the activities over the duration of this course will cost some money. But we're not really at the money-making part yet, or the money-spending part; we're just looking at the fundamental concepts. I'm already logged into my Azure account. You can get to the Azure portal by going to portal.azure.com and logging in with the account that you used to create your account. Then you'll see a dashboard. It may or may not look like mine; I actually put some time and effort into customizing this one. You see here I have some resources that are deployed from another course. And you will have the ability to customize this: you can always go to New Dashboard, blank dashboard, or you can just edit this one, and you can shift the tiles around and customize them as you wish. Now, we are going to jump over to the Function App service listing and create a function app. A function app is really an App Service. You would have seen the keyword serverless being advertised when you see Azure Functions. It is serverless because there's no physical server involved on your side; it's a service that is being offered that allows us to host functions. So I can actually just go to All Services here, then I can filter and say "function", and from there I will see that I can create a function app. When I do that, I can click Create, Create Function App. To proceed, I'm going to go through a wizard. So firstly, what is your subscription? You may have different subscriptions: you're probably on trial, you're probably on pay-as-you-go dev/test like I am, or you're probably on a special one for your company or your organization. Whatever it is, you choose the one that you're going to be using; I only have one. Next, you would want to create a new resource group, so you can go ahead and hit Create New, and I'll say "functions-rg": function, dash, RG. Click OK.
What do we want this function app to be named? I'm going to name it after the course, which is azure-functions-fundamentals. Now, when you're doing that, as you can see, it's actually going to be a public-facing address. So what you can do to make it more customized, in case that name is not available, is put a dash and your initials, or your name or a part of your name, just to make it a bit more unique. That is going to be a publicly accessible URL that would allow us to browse to the functions that will be deployed, or access the functions that are deployed in this central space, from anywhere in the world. I'm sure we all know how URLs work, right? Alright, so next we want to say Publish: do we want code or a Docker container? For now, I'm going to go with code. What runtime stack do we want? Azure Functions supports many different languages. We see it has support for the .NET framework, or platform; we can use Node.js, Python, Java, PowerShell, or a custom handler, and there are other languages being added fairly regularly. For this course, we're going to be focusing on .NET, so we'll choose that, and then we choose the version. .NET 6 is the latest and the greatest, so we will proceed with that. At the time you are doing this course, they may have a later version; you can feel free to use that also. If you're not up to .NET 6, you can also choose .NET Core 3.1, which might be necessary if you don't have Visual Studio 2022 or higher at the time you are doing this course. That's fine, but I will be showing .NET 6. And then your region would be the datacenter closest to your geographical location. If you're in Australia, you want to choose the one in Australia that's closest to you. I'm in the Caribbean; the best one for me is usually East US, and I'll choose East US 2, which is a newer datacenter near East US.
After doing all of that, we can just click through and see the other potential settings. So we can set up a storage account, because we have to create a link to a storage account for blob, queue, and table storage, or any form of interaction with those. We can choose an operating system; I'll leave this as Windows. We can also choose a plan type. Now, the plan type is going to directly affect the costing and how the resources are spun up to support your function app. By clicking Learn More, you can browse the documentation. So, the Consumption plan will scale automatically, and you only pay for compute resources when your functions are running. The downside to this is that it will actually turn off or deallocate the resources after a period of inactivity, and then when you need them again, it will spin them up, but this may take time. So it's not really always on; it will be torn down so that it saves you money in the long run. However, if it's something that you always need, then this might not necessarily be the best plan for you, right? Now, the Premium plan automatically scales based on demand, and there are no idle periods: it's always on, so when you need it, it is there at all times. Another thing to note also is the execution time. The Consumption plan basically has a time limit, if I'm not mistaken a maximum of about ten minutes, so our function will stop running after that limit. The Premium plan, on the other hand, will not have that time limit, so it can run longer than the maximum execution time given to us by the Consumption plan. Like I said, you have to know what your context is and what you need from it. The Dedicated plan will actually give you that App Service plan kind of runtime, meaning it will have dedicated resources for your function app, and you won't have to second-guess and say, okay, am I overspending? It will scale automatically, and it's far more predictable. If you know that you're going to need functions and you're going to be doing this long-term, you probably want to use the Dedicated plan, because then you can actually properly estimate what your costs will be at a certain kind of load. You can choose the one that you need according to your context. For this course, though, we're going to go ahead with the Consumption plan, which is touted as the serverless plan.
Next, we go over to the Networking tab. There isn't much, or anything, to configure here, because pretty much it is already configured to be public-facing to the internet, so we don't really need to do any virtual network adjustments here. We can go over to Monitoring, where we can turn on Application Insights, and that allows us to see logs and monitor the functions while they're running: the runtime, the errors that they're getting, all of those. So I would recommend that you enable that. It does cost a little bit of money, but I think that it's worth it in the long run. And then Tags: those are good for naming resources, especially in a setting where you may have many resources of a fairly similar nature serving different purposes. So tags can help to isolate them and help you with billing and resource allocation in the long run. But we can skip that for now. And then we go to Review and Create. Once we verify that everything is what we want, we can just hit Create. When that process is completed, you can always just click Go to Resource. Once on the resource, if you've ever deployed an App Service, it's going to have a very similar dashboard, a very similar kind of feel. If not, then that's fine; this is the dashboard that you're getting. So you'll see here that we get some charts, some metrics on runtime notifications. You can do access control to see which user in your organization or in your Azure AD can do what in this particular resource. We can configure security alerts for any intrusion or potential intrusions. We can set up events. And then, of course, the reason we're here: we can start creating functions. So I'm going to leave it there for now. We just looked at how you create a function app and the different types of hosting models, the different pricing models that are available to you. And of course, context determines the decisions that you make. We're going to go through some concepts in the upcoming lessons.
4. Triggers and Bindings: All right guys, in this lesson we're going to be looking at triggers and bindings, and we're not doing any coding just yet. We're still just exploring in the dashboard to see what's what, what the lingo means, and how we can connect the dots between what our needs are and what Azure Functions will provide to us. We already created our function app, and once again, that's really an App Service that exists to host all of the functions that need to be executed. So we can get there by going to this URL, and if I click it, you'll see that it says my function app is up and running. Of course, there are no functions for me to browse through, so that's fine. If I go down to that section that says Functions and click on Functions, then it allows me to see all the functions that I would have, the trigger, and the status. So if I click Create, it brings up this blade that then allows me to define the development environment. You see here that we have a few options, which we will be looking at later on. And then they ask us to select a template. This is pretty good, and you can see that you have quite a few options in that template section. The trigger represents the action that makes the function execute, because a function is just going to sit there. At this point, I'm assuming we're all programmers, so we all know that functions need to be in a program and they need to be called in order to be executed. Now, the thing with Azure Functions is that they are actually more like standalone functions, but they have triggers, and a trigger is what will prompt their execution. So if we look underneath that templates column, you'll see the different trigger types that are available under these templates. You have the HTTP trigger, meaning a function that will be run whenever it receives an HTTP request, and it will respond. If you know about APIs, that's all an API is: you make an HTTP call, maybe you include data, maybe you don't, but based on that call, the API will respond according to how it understood the call relative to the data it may or may not have been given. So that's an HTTP-triggered Azure Function. You also have timer triggers, meaning you can put a function on a timer: execute every five minutes, execute once a day, and so on. So, you know how you have the Windows Task Scheduler, or you have those cron jobs in Linux? You can use timer triggers in the cloud space to do those kinds of executions in the background. I'm not going to go through and explain what every single type is, but you can always just go through and read the descriptions, right? So while Azure allows us to create stuff, it's actually educating you so you can make the correct decision. You have Azure Queue Storage triggers; you have Service Bus triggers, topic or queue triggers based on which one you're using for the Service Bus; you have Cosmos DB triggers, right? And then in the same breath, you have what you call bindings. A binding is basically you subscribing to a non-function-related feature or service and reading or writing data, so you have input and output bindings. Let me give you this scenario so you can appreciate what I'm saying. Say you have a function that is supposed to take an image out of Blob Storage, or take a message off the queue and put something in Blob Storage. So let's say you had an Azure Function that was Service Bus triggered. Now, this function is going to have a binding to the queue, and it's going to have a binding to the blob. This is going to allow you to subscribe to these two third-party services (when I say third-party, I mean they're not exactly related to the function; they are standalone Azure services). So the input binding would be "I'm reading from the queue", and the output binding would be "I'm writing to Blob Storage". If it took me 50 lines of code in a regular program to consume a blob, connect to it, and know exactly where I'm going, here it will probably take three to five lines because of that bindings feature. That's a very powerful feature; it allows you to cut out a lot of the boilerplate code and just get straight to the point and implement the business logic. So it's very useful for implementing functions, or functionality rather, that you need at a snap: you know exactly what you want, and you don't have time to write too much code. That's what these trigger templates bring to the table. And the ability to just bind the function to an external resource makes it very easy to deploy an app that touches many different parts of Azure.
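To make that scenario concrete, here is a minimal sketch of a queue-triggered function with a blob output binding, written in the in-process C# model. This is my own illustration, not code from the portal: the queue name, container name, and function name are all made up, and I'm using a storage queue trigger for simplicity rather than Service Bus.

using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class ArchiveMessage
{
    [FunctionName("ArchiveMessage")]
    public static void Run(
        // Trigger: fires whenever a message lands on this (hypothetical) queue.
        [QueueTrigger("incoming-messages")] string queueMessage,
        // Output binding: the runtime connects to storage and writes the blob for us.
        [Blob("message-archive/{rand-guid}.txt", FileAccess.Write)] out string archivedBlob,
        ILogger log)
    {
        log.LogInformation($"Processing queue message: {queueMessage}");
        archivedBlob = queueMessage; // assigning the out parameter performs the write
    }
}

Notice there is no connection-string plumbing or SDK client code in the body; the trigger and binding attributes take care of all of that, which is exactly the three-to-five-lines point.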
That's really all I wanted to discuss here. We're still not creating a function just yet; we're still warming up to the dashboard and the lingo. When we come back, we'll be talking about the development tools that are available to us. Even though we won't go too much in depth, we've just discussed what the possibilities are.
5. Development Tools: All right, so the last time we were here, we were looking at the different types of triggers that our Azure Functions might have. Now, let us take this a step further and actually look at the tooling that would surround us actually creating and interacting with said functions. Now, option one would be the most obvious option, which is to use the portal, because the portal is a very, very powerful tool. It sits in the browser; it takes very minimal resources on your actual machine. So this would probably be the option that a lot of persons might end up choosing. When you go to Create Function, the first option there is Develop in portal. You can choose your template and then hit Create after you give it a name; let us say "test-portal-http-function" or something, so we know exactly what it is. The authorization level has to do with the levels of authorization that we're going to be giving persons in order to interact with this function. So I'm just going to say Function: you interact at a function level, versus anonymous, versus admin. What happens is that when you use function, you need a function key; when you use admin, you need a master key; and you can actually manage those keys underneath the key management section here. If I just say anonymous, then no keys are needed. So I'm just going to do that for this one; we just hit Create, and that gives us this mini dashboard. We can go and look at the code, and execute and test right here in the portal. And like I said, I'm not getting too much into the code right now; I'm not going to explain what this is. I'm just showing you that this is a text editor right here in the browser, so you can save it, you can test and run it right here. You can do everything you need to do. We can get the function URL. All of that can happen right here in the portal. So that's option A for your tool to write a function. Now, option B would be Visual Studio, and the one I'm using here is Visual Studio 2022 Community Edition. In the installer, you'd want to make sure that you have that Azure development workload selected. So amongst all the other workloads that you may need, make sure you have included the Azure development workload. And then once you have that workload, when you are creating your projects, you will be allowed, or you will be given the option, to create an Azure Functions project. This would come in handy if you have a whole development project, you're working on a team, you want source control management and all these things, and you have that solution with a number of functions that you could deploy to the function app. And if you use Visual Studio regularly, then I don't need to market it to you as being a very good development tool. Another option you would have would be Visual Studio Code, and you can always go to the extensions and install the Azure Tools. The Azure Tools come with a bunch of extensions for all sorts of Azure development, including Azure Functions. If you don't want the entire toolset, of course, you can look specifically for the Azure Functions extension, but why not just install all the tools? Now, this option is perfect for you if you are using Linux or Mac, because Visual Studio Code is open source and cross-platform, so it's not limited to the Windows PC like Visual Studio might be. It would be a good alternative, once again, to have the code on your machine, with source control and so on, outside of it being in the portal. Now, the next option, which admittedly I have never attempted to use, but it is there, so we will discuss it, would be any editor plus the Core Tools, right? So we can go ahead and use Node.js and install these libraries, and then they tell us how we can create a project, how we can test it, all the way up to how you can publish it. So if Node.js is your preference, then you can go ahead and follow these instructions and go through that. But like I said, we're not doing any coding in this particular lesson. I'm actually going to delete this function because I don't need it beyond now. When we come back, we will continue discussing some more concepts, and then we will close out this section and get right into some of the fun that we came here for.
6. Section Review - Azure Function Introduction: All right, so let's look at the section notes; let us review what we accomplished. One, we looked at what it takes to create an Azure function app, and we discussed the fact that the function app is a single unit of deployment: it's an App Service that is going to be hosting a number of functions. A function can be created, and it can have different templates, or different triggers. The template is just the predefined code, but you can always deploy any kind of template and edit the code accordingly. We have different triggers that will set off the functions. We looked at an HTTP-triggered function already; we have timer, we have queue, we have Service Bus triggered; we have a number of them that we can use, durable ones included. Later on, we will look at what Durable Functions really mean, but the point is that we have all of these options available to us. We also looked at some of the development tools that we have available to us. We will be using Visual Studio, or I will be using Visual Studio, but mostly, if not everything, that I do can be accomplished in the portal, as well as using the other tools if you need to use them. When we come back, we're going to delve into interacting with functions using the portal a bit more, and we're going to get our hands dirty with some code: how to test our function, how to look at the execution, and actually see what it takes to create and deploy some Azure Functions.
7. Azure Functions in Azure Web Portal: All right guys, welcome back. In this lesson, we're going to get right into the fun, and we'll be creating a few Azure Functions right here in the web portal. So we're going to be looking at examples of HTTP-triggered functions; we'll be looking at timer-triggered functions; and we're going to look at one with Blob Storage. Notwithstanding that you can use those same steps and deploy any kind of function that you need, according to what you need as the trigger, we'll be focusing on those three examples. And we'll also be looking at how you can test, how we can look at the logs, and the things that you can do with the functions just to make sure that you are doing it properly and covering all the bases. So stick around. We have a lot of fun ahead.
8. Create an HTTP Request Triggered Function: Alright, so let's start off by creating an HTTP request triggered function. We did that already, so we're just going to replay our steps. We're in the function app, and then we go down to Functions, then we click Create up top, and that gives us the list. From that list, we want the HTTP trigger. You click that, and you give it a function name; I'm just going to name it http-request-func, or function, let me just write that out. And then, here for the authorization level: I kind of skimmed over it last time, but I'm going to go much more slowly this time and explain in more detail. When we choose the authorization level of Function, what's going to happen is that any device, any caller to this function, needs to provide a key that is specific to this function. So by creating the function, we get a key, and this key must be present in every request to this function in order for it to actually be triggered. When we say Anonymous, it's open: anybody can access it; there is no need for a key or any additional security constraints. Then when we use Admin, it means the caller needs the master key, which would allow an admin to execute any function whatsoever; functions at the admin level cannot be executed without it. I'm going to leave this at the function level for now, and I'm going to hit Create. Now that our function has been created, we can go down to the Code + Test blade. Once we're in, this blade is going to show us some boilerplate function code, with a lot of things already defined for us. You'll notice that the file extension here is .csx. Based on the template that you're using, meaning whether you're using JavaScript or C# and so on, that file extension will of course be different. And if you drop that down, you'll see that you also have a readme file (.md, for Markdown, similar to what you would have on GitHub or Azure DevOps for documentation purposes). And you also have function.json. This function.json file basically just has configurations for the different bindings and different metadata about the function. So here you can see it specifies that we're using the HTTP trigger, and it's accepting the methods GET and POST. So if we wanted to do PUT, and maybe PATCH and DELETE, and so on, we'd just extend that methods section.
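For illustration, extending the methods array in function.json might look something like this. This is a hedged sketch based on the default HTTP trigger binding, not the exact file from my portal:

{
  "bindings": [
    {
      "authLevel": "function",
      "name": "req",
      "type": "httpTrigger",
      "direction": "in",
      "methods": [ "get", "post", "put", "patch", "delete" ]
    },
    {
      "name": "$return",
      "type": "http",
      "direction": "out"
    }
  ]
}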
There are times when you may need to modify this file, but for now we don't necessarily need to, so we'll just leave that alone and go back to our run.csx. What we're going to do here is look through the code. It's really simple code where we're using a few libraries, and here we're including a third-party library that you would probably have to get with NuGet if you were doing a regular .NET application. But we're using some libraries here, and then we have a method that returns an IActionResult. So if you've ever worked with APIs, or even a regular MVC app, you'd be familiar with the IActionResult. Note that an HTTP-triggered function is usually very useful for acting like an API, and it's also very useful for responding to webhooks. So for anything that is HTTP traffic, an HTTP-triggered function, generally speaking, is equipped to handle that kind of operation. You could actually build out a full API, or full application logic, using a suite of HTTP request triggered functions, each one responding to its specific request accordingly. Here, inside of our method, we see that we get two parameters: an HttpRequest, which we're calling "req", and an instance of our logger. So we're logging information on the first line, where it's saying this function was called and we are processing a request. Then we're getting from the request a query string parameter with the name "name"; so we're getting the value from the query string with the key "name", and we're storing it inside of this variable called name. And then we're saying: take the request body, read it, and deserialize it as JSON; look inside of it and see if you can find "name". So here it is saying that either you're going to find the query string parameter named "name" in the URL itself, or you can look in the request body, in case it may have been a POST request, and look for some data that has the key "name". That's all that's doing. It could have been either/or, because usually you send either a POST or a GET; the code is just showing you that this is where you get a query string value, and this is where you get something from a POST request. That's all that's really happening. Then it's going to formulate a response message, and it's going to say: if this name variable is null or empty, then give them a message saying that you need to pass it either in the query string or in the request body. And if it is not null or empty, then we're going to print "Hello, name. This was executed successfully." Then we return OK, which is a 200 result if you're familiar with HTTP response codes; the Ok function gives back a 200 response code as well as that message. All right, that was a tour of the code.
or a get method call? Secondly, what kind
of key, right, so I can use the master
key monster because in the portal we are the admins and a master key can do
everything anyway. Well, you could just test
it with a function key, since that is what we
deployed it to use. You can add a parameter here or you can change the body here. So I'm going to my name
inside of the body. That is a post request. So when I click Run, then I'm going to get the ALU
output alpha 200 response, hello, the name that I entered. This was triggered successfully. So that's a good test. Now let's jump back over to input and change the parameters. So I'm doing a GET request
this time and I'm putting that query and I'm calling
the variable name. So the parameters in the name and the value is still my name. And I'll just say dash get. And I don't know if
you're experiencing this, but while using the portal, I have to press a
button and then wait for the character to appear. So that's one of the reasons
that don't really like working in the portal
directly like this, at least for this
kind of operation. It's a little slow for me, but you can let me know if
you're experiencing that also. But when I click Run, then we get the same results. This time it was getting it from the query string as opposed to the request body like
it did for the post. Now if I miss name this query
parameter and then run, then we get to our error. This HTTP triggered foam
Sean executed successfully. But please pass in data
as it is expected. That is pretty much hold. The HTTP triggered
function works. I request comes in with data. We write code to parse
through the data, manipulate it to whatever
we need to do with it. And then maybe we send back, we write a log so
that we can actually truck the execution path. But we also just let them
know the kind of response. So I could actually respond NC, if the query parameter
was not null, then return a new maybe bad
request object results, right? Something like that. That is pretty much
holdout works. No, I can actually get the function URL by clicking
this button up top. Then it will say,
give me the default. Well, which link do you want? So it's actually going to
include the key on that URL. Do I want the master key or
do I want a function key? So I went to leave
it on default. And if we take a
look at that URL, and I'm just going to
use Notepad Plus Plus. Here we see that we have
the URL to the function up. And then it says slash API, and then the name
of the function. And then code is equal to, and then this hushed string, which is that key, That's the function key. So if I needed to execute
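So the shape of the URL is something like the following; the host here is made up and the key is a placeholder, purely for illustration:

https://azure-functions-fundamentals.azurewebsites.net/api/http-request-func?code=<function-key>

Everything after code= is the function key, and the host is whatever you named your function app.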
So if I needed to execute this from any kind of application that I am writing, I would have to call it exactly like this; you are looking at, at minimum, a URL like this. Without that code, it would never reach the destination; it would just reject the request altogether. If we go through the other pages that are available to us from this screen, we can go to Integration, which allows us to edit certain things: we can edit the trigger, we can edit the function, and we can modify the inputs and the outputs in a mini flowchart, just to say this is where it's starting, this is the process, and this is what is expected. We can go down to Monitor, which basically shows us all of the executions: how many of them were successful, how many of them failed. Then we can go to Function Keys, which will show us our keys. We can always click to view the value; because we're using the function key, that is our function key if you ever need to retrieve it, if you need to renew it, reset it, or whatever else you need to do. That is pretty much it for setting up an HTTP-triggered function.
9. Create a Timer Triggered Function: The next type of function that we're going to be looking at is a timer-triggered function. So let's jump down to Functions, go ahead and click Create, and this time we want timer triggered. We click Timer trigger, and we can give it a name; I'm going to say TimerTriggeredFunction. Then we can set up the schedule. The schedule is based on a CRON expression. So if you're familiar with Linux, CRON expressions are just the way you represent the interval on which you want something to be repeated. You can see here that it's in the form of a second, then minute, then hour, and so on; each asterisk basically represents that unit of time. Pretty much, here it's saying at second zero of every fifth minute. So if I change this to one, that would be every minute, as opposed to every five minutes. And then the remaining asterisks represent the hour, then the day, then the month, and then the day of the week. So that's how you would schedule it. It may take a bit of getting used to if you're not familiar with how it works, but it's a very powerful construct to understand and appreciate. So let's go ahead and create this timer-triggered function. Once that's created, we can jump down to Code + Test. And you see here that it is a relatively simple bit of code: we have a function called Run, and it has a parameter called myTimer, an object that allows it to glean details about the timer being used, and it has the logging library also. It will be writing an information log each time it is executed, showing the timestamp.
If I look in the logs, every minute you are going to see this populate, so I'm going to leave it for two minutes and then show you the aftermath. Alright, so I waited for two minutes, and there you can see the execution for minute one and the execution for the second minute. All right? That's pretty much what the timer trigger brings to you. I'm sure you're thinking about different applications that this might have: it could be an application where you need to do a cleanup every morning or every night, or you need batch processing to happen every day at a certain time. You have a number of applications for timer-triggered functions. In this situation, though, I don't need it running in the background. If you don't want it running in the background, you can always just disable it, and we already looked at how we can delete a function. So I just disabled this one until maybe in the future when we use it again. But that's really it for how you create timer-triggered functions and how they work.
10. Create a Blob Triggered Function: All right, next up, let's look at what it takes to create a blob-triggered function. So in the Create blade, we're going to look for blob. As soon as it loads up, there we go: Azure Blob Storage trigger, right. You see there are a number of them, and I encourage you to actually try them out; you might have use for them outside of these demonstrations that I'm making. So I encourage you to go ahead and try, see what the code looks like, see what the binding code looks like. But here, when we do the Blob Storage trigger, I can change the name, so that's BlobTriggeredFunction. And then our path: basically, the path is going to be "where should I look?", slash, and then a binding expression, "what should I look for?". So here I'm saying I'm looking for the name of whatever is in the blob container. That's a binding expression: give me the name, so it allows me to access it. And then the storage account connection is pretty much going to ask which storage account it should use. It's going to be defaulted to the one already associated with the app that the function is being hosted in. So I'll go ahead and hit Create. And for this one, let's go to Code + Test, where we will see that we have information coming in from, one, the stream, which is being called myBlob, and then the name, which is that binding expression that we set up, and then we have the logger. Of course, this function is sitting down and watching that blob container. Now let's switch over to the function.json and look again. So we have the name, the type blobTrigger, and we have the path: samples-workitems, slash, the name. So what we need to do is have a container that exists with that name.
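Putting the two files together, a minimal sketch of what this blob-triggered script boils down to; again, this paraphrases the default template rather than quoting it exactly:

// run.csx; myBlob and name are bound via function.json:
//   "type": "blobTrigger", "path": "samples-workitems/{name}"
public static void Run(Stream myBlob, string name, ILogger log)
{
    // Fires once for each new or updated blob in the samples-workitems container.
    log.LogInformation($"C# Blob trigger function processed blob\n Name: {name}\n Size: {myBlob.Length} bytes");
}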
to the storage, I'll go and just in case
you're not familiar, you can go to the dashboard, you may see it under
all resources. If not, then you can just
go to storage accounts. And then you can click
storage accounts. This no-shows all
of your containers. So these containers are really the species that the
files will be stored in. I don't want to have a
container by that name. Whatever name you I'd sit there some dashed work items
was the default. We didn't change that. What CPI change it, whatever
name you put there, you have to make sure
that you have a blob that exists with that name. So I'm going to create
that container and just go ahead and do that and
know that it exists. I can start adding files. So I'm just going to upload a test file that I
have on my computer. So I'll just click
Upload, select the file. And sometimes this
might be faster if you have the
Storage Explorer. So that's actually a third-party
tool that if you wish, you could install Microsoft
Azure Storage Explorer that allows you to handle all of his storage accounts and
everything from your computer, but that's a conversation
for another time. Split the file and then choose my test file and upload it. And you see that I
uploaded it in tab. If I go back over
to the function and look at the log
you see here it is logging that it sees a new
file and that is the path. You see the name there,
blob dash test.txt. You see the creative time insertion time on
all these things. That's just how you
can sit down and watch the storage ear to see. Maybe you wanted to compress images as soon
as they're uploaded. Maybe you want to move them
around or renamed them. There are a number
of things you can do with something like this. Once again, your contexts will always determine
your application.
11. Section Review - Different Azure Functions: So we're at the end of this section, and we're just going to review what we've learned. One, in case you missed it the first time, we did it again, and we did it three more times: we now know how to create a new function. Inside of our function app, we would click Create, and then we choose from the list the template that best suits what we need. It's better to just create it from the template, as opposed to creating one and then customizing it for another purpose; let us just work with what it gives us as templates. We looked at the HTTP trigger, we looked at the timer trigger, and we looked at the Blob Storage trigger. However, based on the different resources and services that you may end up interacting with, you may just start off with the appropriate template from the get-go. In looking at our HTTP request triggered function, we saw that we get a mini dashboard that allows us to see the executions: which ones failed, how many successful calls we got, maybe 400 responses, 500 responses, and so on. All of these are readily available to us. We can get the function URL, which we established has the code, or the key, attached to it, which allows us to plug it into any application. We also discussed that when creating the function, at least the HTTP function, we can create it for anonymous access, or with function-level security, master-key-level security, or admin-level security. We looked at all of that; this one is restricted at the function level as it stands. In Code + Test, we can look at the code, which we can modify. Of course, what this one does is look in both the query string and the request body; based on the type of request that we're doing, we would use one and not the other, or maybe both. It depends on your context. We can always test on our own from this interface, and we can view the logs right here to see what is happening, based on the log information that we write in our code. So this console area gives us all of the data accordingly. Now, when we go to Integration, it gives us a nice little workflow kind of screen that allows us to modify some of the bindings, and we will be looking at bindings later on. So we have input bindings and output bindings; we have the trigger that we can modify; and we can jump back up to the code. If we look in Monitor, we see here that we get access to the Application Insights, because remember, we enabled that when we created the function app. So we can see all the invocations: each time it was executed, what the result was, how long it took, whether it was a successful execution or not. We can also look in the logs for more detailed logs of all that happened, or is happening, with the app at that time. We can also go down to Function Keys, and we can create a new function key if we want, we can revoke one, we can renew; we have complete control over how this function works. And all of that is just the HTTP request function. But we also saw that, while similar, each function type has its own nuances based on what it is there for. The timer-triggered one will do something every so often based on the CRON expression that we specified. The blob one is going to watch a space for a file to be dropped there, and then it can process it. So all of these things are possible. With all of that done, we've completed this section. When we come back, we're going to be looking at using Visual Studio to create a function app, and we will see what the differences are between doing it in the portal and using Visual Studio on our machine.
12. Azure Functions in Visual Studio: Alright, so in this section we're going to be creating some functions, or a function app, using Visual Studio. What I've done here is I've jumped over into the portal, to create, went to Functions, and then I selected Visual Studio from the drop-down list. And they outline everything that you need in order to get that up and running. They still mention Visual Studio 2019, so if that's what you're working with, then no problem, you can follow along; but I will be using Visual Studio 2022 for this set of activities. You can just follow along with these instructions, and really and truly, that's pretty much what I'm going to be going through with you. If you want to just read that and skip the rest of these lessons, then that's up to you. But for now, I will continue, and in the next video we'll jump over to Visual Studio and get started.
13. Create a Function App: All right, so at this point I'm assuming that you've already installed Visual Studio and you've already set up your Azure workload, like we discussed at the beginning of this course. To get started, we're going to jump over and create a new project. And then you can filter: you can always just search through the templates for Azure Functions, or, if you already did it once at some point in time, it would be at the side here. I'm just going to say AzureFunctionsFundamentals; that's the name of the project and the solution, and then I just hit Create. At that point, they're going to ask me a few questions. One, what kind of template would I like? So, similar to the portal where we could choose a template, they're asking me: do you want a Service Bus trigger, and so on? You'll see them in the templates. This time I'm going to go with Empty, though, so that we can do a little exploration. So I'm going to say Empty, and then I can use a storage emulator. If you are doing this without direct access to an Azure subscription, or you haven't gotten that far with Azure as yet, you could actually just use the storage emulator, which will pretend that you have a storage account, but of course with some limitations. So I'm just going to proceed with all of that and go ahead and hit Create. Once that project is created, we can look at some of the files that are available to us. I want to start off with the project file. So in .NET, since .NET Core 3.1 onwards, when you click on the csproj file, you can actually see the XML behind it; you can always right-click and go to Edit Project File, and you will get this view also. This file really just outlines the framework that we're targeting, the .NET 6 framework, and that we're using Azure Functions version 4, which at the time of this recording is the latest and greatest. And since we are going to be working with functions, we have to reference the package Microsoft.NET.Sdk.Functions.
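The csproj that the template generates looks roughly like this; the exact package version will vary over time, so treat it as a sketch:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net6.0</TargetFramework>
    <AzureFunctionsVersion>v4</AzureFunctionsVersion>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="Microsoft.NET.Sdk.Functions" Version="4.*" />
  </ItemGroup>
  <ItemGroup>
    <None Update="host.json">
      <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
    </None>
    <None Update="local.settings.json">
      <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
      <CopyToPublishDirectory>Never</CopyToPublishDirectory>
    </None>
  </ItemGroup>
</Project>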
It also talks about two files that are given to us: host.json and local.settings.json. It's letting us know that they are going to be copied to the output directory, but the local.settings.json should never go to a publish directory, and that's really because it's a local file; we will see why in a few. So if we jump to the host.json file, it's just a configuration file. It's saying, okay, version 2 (we will see what that stands for in a few), but it also has some configurations around logging, where it's saying it should use the Application Insights, which is what we know we enabled in our function app. That's really all that is going on there. When we look at the local.settings.json, it's letting us know that it's not encrypted, and it has some values. So the AzureWebJobsStorage: we know that that is actually supposed to be the storage account that's associated with a function app, but because we chose the development storage, that is set to use development storage; just pretend that it exists, pretty much. And then it is giving us the FUNCTIONS_WORKER_RUNTIME, which could be something else based on the type of language being used; but because we're using C#, we need a dotnet runtime, so that is by default set to dotnet.
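So the whole local.settings.json, at this point, is just this, matching what the template gives you:

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet"
  }
}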
That's a basic tour of this function app project and the two files that we got. When we come back, we'll actually start writing some code to write a function.
14. Test Function in Visual Studio: All right, so now that we have our project up and running, what we don't have is an actual function. Alright, so I'm going to right-click on the project, go to Add, and then I'm going to say New Azure Function. When I do that, I will choose the name of this function, and I'll just leave the default, Function1.cs (note this is not .csx this time), but I'll leave the default name and go ahead and hit Add. Now, this dialog window asks us which template we would like. We got that prompt first when we were creating the project, and we went with an empty one, but it's back for each function that we're going to create. We're seeing the same templates that we would have seen in the portal, and each of them would be asking us the same configuration questions that we would have seen if we were creating it in the portal: with a timer, we still get to set up the schedule; with the blob, we still get to set up the connection (well, this one is a bit different from what we saw before, where we got to set up the path) and a number of things. Well, I'm going to go with the path of least resistance: let's go with the HTTP trigger, and I'm going to choose an anonymous one this time. Let's go ahead and add. Once we do that, we get this function file. And you'll notice all we got was that .cs file; we also don't have a function.json file. What happens is that when we use Visual Studio to create our functions, it's actually going to compile them down into a DLL, and if you've been using .NET long enough, you know what a DLL is. And then that DLL will have all the bits of information that the portal needs to use, once it's deployed, to ascertain any configurations and anything else around it. So it's going to be precompiled by Visual Studio and just deployed to Azure, as opposed to it being in Azure, compiled in Azure, and then used in Azure, like what we were doing in the portal. So you're probably in a more familiar space when you're writing your code, and you have better access to IntelliSense and resources here in Visual Studio. We can also handle debugging much better, because we can actually just run it in debug mode and set our breakpoints, just like we would with any other kind of application that we would be building inside of Visual Studio. Here we have a static class, and then we have the function name. And then we have a static method called Run, which knows it's an HTTP trigger. It knows its authorization level is anonymous, which I can always change here to be function (of course, you'd have to have some more configuration around that with the key and so on), but that's anonymous. We have the different types of requests that are accepted, so GET and POST. And then we can define a route, and a number of other things that are available to us right here. Ultimately, though, the code in the body of the function itself is pretty much like what we had seen previously.
that debugging those work. If I hit F5, it's going
to Visual Studio. Visual Studio things. If it's your first time, you may see it installing some
tooling as it goes along. But here we see
it's open running. So it's letting us know that in order to get to this function, we have to send a request
through this URL. I'm going to borrow that URL
and I'm going to use a tool that we use for EPA
testing called Postman. So if you don't have post-money, can always go ahead
and install it. It's pretty easy to
find POSTs ME n, but it's a good tool
to have just descend random HTTP requests
when you are testing. So I'm just going to go
ahead and put in the URL. I'm going to leave it as I get. And for the parameters I'm going to put in
that name parameter. And I went to see test
from postman as the value. Then when I send, it's going to connect to
our app that's in run time, IACUC taking the breakpoints. So note that it's
setting a breakpoint, I can actually
interrogate my objects. So let's look at
the request itself. The request object has everything about
the HTTP requests. It has a myth that
was being used. The path, everything that I need to know pretty much
if I don't inquiry, I'm going to see that the okayed found one in the list and it has the key name and the
value tests adjustment there. I can ascertain that it is supposed to get the name right. So if I just hit F5, I can remove this
breakpoint know, and I just hit F5
for it to continue. Then the console window came up when you went
to the debug mode. But if I go back to Postman, which is the tool that
made the request, then you're going to see
it responded with a 200. Okay, hello tests from postman. This HTTP triggered function
has executed successfully. That's just hold local
testing can happen, no. So we can actually
use tools like Postman and any other
tool that you might have, maybe Fiddler or
even your browser. And try and do these
kinds of operations. We can debug know, and we can better interrogate our application
or a function, sorry, code as we're building
out or business rules for node,
that's really it. Now we know how to
create a function using Visual Studio Hall
to test it and hold to integrate with
third-party tools in your browser to trigger
HTTP or requests. Know, as we go along, we'll get a bit
more complicated. But for now, that's
it for how we create a function
using Visual Studio.
15. Section Review - Functions in Visual Studio: All right guys, so that's really it for this section. We looked at how we can set up Visual Studio 2022, or 2019, with the Azure-related toolsets: we installed the Azure workload, and that allows us to create our functions. And this is really just a function app. So you see here, each function that we would create in this project would be a standalone file that is a function. When we deploy this, which we will be doing later on (this is just getting our feet wet), we're really deploying this entire project as a function app with different functions. If we have a number of things we want to accomplish, and a function per task, then we have one project with a function for each task, one set of configurations governing how all of these functions interact with each other (and probably any other service that they need to, any third-party service at that), and one deployment is done. When we come back, we'll be looking at how you can use Visual Studio Code to develop your Azure Functions.
16. Using Azure Function Core Tools: All right guys, so
let's get right into it for this section where
we'll be discussing the Azure functions
using Visual Studio Code as well as the core tools. So I'm on a pH here, we're work with a 0
function core tools is the heading so you
can get there by, you can easily just Google
your foam shedding core tools, and it's the Microsoft
documentation Ciceronian, the core tools
allow us to develop and test functions locally. So this is a better option than Visual Studio because
it is cross-platform. So it creates a runtime
on your machine to simulate what the 0 function
runtime needs to look like. And then you can develop
and test locally, of course, before you publish. Of course, once again, visits to the allows you to do that, but that's not always
an option based on your operating system and other limitations
that you might have. What we can do with
the core tools, if we scroll down, we see
the different versions. So right now I'm using version four because I'm using dotnet
six bullets, of course, based on whatever stuck you're on or whatever
version of dotnet, make sure that you're going
to use the correct version of the core tools
for installation. We have the Windows, Mac OS, and Linux
instructions, right? So based on your
operating system, you go ahead and use
whichever one you need to. I'm on Windows, I'm
going to use the 64-bit one that launches the typical Windows installer
that we're used to. And we can just go ahead and hit Next each time and
let it install. And while that's happening
in the background, I'm just going to point
until they can also get to the core tools by GitHub. So you can go to the Azure
project and look forward to Azure Functions or
Azure accounts rather, Andrew for Azure
Function core tools as her Functions core tools. And then that those have installation instructions
according to whatever stuck you're on, probably a little bit more
information than you would readily see on the
Microsoft documentation. So that option also exists. No one says finished, I'm just going to launch
my Visual Studio Code. And what I typically do whenever
I have a new project to build is I just go ahead
and create a new folder. I created and open a new folder
somewhere on my machine. And I'm kind of
assuming that you're familiar with Visual
Studio Code, but if not, what I typically
do is just click Open Folder and
navigate to where I want its create the folder and then use that folder
that I just created. And that's what
creates my project. I'm going to do all of my 0 function unrelated
operations right here. In this interface. Let me just make this a bit
bigger so that we can see it. The next thing that
I wanted to do is open a command prompt window. So if you're in Linux,
if you're in mock, you might be different
from Hawaii, I'm using it, but at
the end of the day, we can all agree that
command prompt looks very similar across
every platform on all the commands that I will
be using here will be usable in whichever operating system
you're using to verify. Well, firstly, what
I did was navigate to the folder where I
know my project is. And then I didn't see. And that's where
we went to verify that the core tools have
been installed successfully. So we get this nice
particular untold. And then we have of
all the documentation around the different commands that we can run. Here we see that init will create a new function app in the current folder and initialize a git repository, so let's try that one. I typed func init, and then it asks me to select the runtime. Of course I want to select the dotnet runtime since we're using C#, and off it goes, doing its thing. I can see that it populated the files in the background while it worked. Now, I did try that same command inside of the terminal that's built into Visual Studio Code, and as you can see, I got an error. I was hoping that we could do everything from the terminal inside Visual Studio Code, but clearly there are some limitations, and that's no problem; we just have to understand which tool can be used for what, when we need it.

Just glance over at the files that were created. We have this .vscode folder, which has an extensions file just to make sure that the Azure tools, or the Azure Functions tools, are installed to give us the best experience possible in Visual Studio Code. We also have the .gitignore, it created that .csproj file we're familiar with, and the other two JSON files which we're also already familiar with. Before we create a new function, just a note: remember that when we did func init, it asked us for a runtime. You may not necessarily want a dotnet runtime; you may want a Node runtime or a Python runtime, based on your skew towards your preferred programming language and environment, and that's completely fine. The fact is that it supports many different languages. And if you're on a Mac instead of Windows, that's not really an excuse either; I'm just saying I'm focusing on dotnet because that's what we're
doing for this course. So, if I want a new function, I use the func new command. If you're not sure, you can always scroll back up and look at the documented commands. func new allows me to select from a list which template I want for my new function. I'm going to choose one that we haven't done before, this Cosmos DB trigger. I can do that, press Enter, and then it's going to ask what the name should be. I'm going to say CosmosDBTriggeredFunction, and when I press Enter it goes ahead and does its magic. In the background, you'll then see the appropriate files pop up in your Visual Studio Code. That sums up how we can use the Azure Functions Core Tools and the command line to start the process.
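To recap, here is the rough sequence of Core Tools commands from this lesson. These are real func commands; the function name is just the one I used, so substitute your own.

```
func --version    # confirm the Core Tools are installed (I'm on 4.x)
func init         # create a function app in the current folder; pick the dotnet runtime
func new          # pick the "CosmosDB trigger" template; name it CosmosDBTriggeredFunction
func start        # build and run the function app locally when you're ready to test
```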
Now, the thing is that you may not be using Visual Studio Code as your editor. You may be using a different editor, but when you're using the Core Tools you can actually do everything right here at the command line. Even if you were using Notepad++, you could just go in and edit whatever files are deposited in the folder once you run your commands here. So, when we come back, we're going to look at managing the project using only Visual Studio Code.
17. Using Visual Studio Code: All right guys, so we're going to be using Visual Studio Code to start up a function app from scratch, and we'll be looking at the different tools and the different nuances that make it different from using the Core Tools and Visual Studio itself. That being said, let's get started. The first thing you want to make sure of (we mentioned this earlier, but I'll mention it again) is that you have the Azure Functions extension installed. Now, what I have is the Azure Tools extension, which came as an extension pack, so it came with a bunch of Azure related extensions. I did that because I do some Azure development and I prefer to have the tools ready when I need them rather than go hunting at the time. However, if you don't want the whole pack, no problem: you can just look for the Azure Functions extension by searching up here, and that one will give you what you need for your Azure Functions development. Just reading through the extension's documentation is actually quite informative; it tells you how to go about installing it, setting it up, running, and deploying when the time is right. The icon in the documentation is slightly different from the icon that I have on screen, at least at the time of this recording, but that's fine; I'm guessing they work very similarly anyway. So I'm just going to click that Azure section, and then it allows me to view
all the functions. I've created a brand new folder, which you can also do if you need to, but let's just work through this. Here I have the option to create a new project, to create a new function when I'm already in a project, and to deploy the function app, which would be the whole project. I'll just start with Create New Project, which then asks me where I want to put this new project; you can go to Browse and create a new folder wherever you want it. I already created a folder called Azure Functions Fundamentals Code, basically in the same place as the previous one where I was using the Core Tools, so let me go ahead and select that. Then it asks me to select a language. I'm going to mix it up a bit: we've been dealing with C# all this time, mainly because I'm on Windows and because I've been using Visual Studio, and we'll be using Visual Studio going forward, but all of these are options you can go with in an open source setting. So I'll choose TypeScript to do something different. Then it wants a template for the first function. We could skip that, which would be akin to creating the empty project in Visual Studio and adding the functions later on, but in this situation I'm just going to start off with an HTTP trigger. So let me do that: I'll leave the default name, press Enter, and I'll
make it anonymous. With all of that done, it starts creating, and when it finished, it had created a project and done a bunch of things: it initialized a git repository for me with all of these default files. When I go through the project listing, I have a .vscode folder that has the extensions and initialization files to make sure I have the best experience while using Visual Studio Code, and I have a folder for the HTTP trigger which has the function.json file. We already know what this file is for; we've seen it before. It also has an index file. Unlike the C# project, which would have a .cs file, I chose TypeScript, so it has a .ts file, which is really just an encapsulated JavaScript file. If you do Node development or Angular and such, TypeScript will look pretty familiar; it's essentially a typed version of the JavaScript we already know. The code takes the request in as an HTTP request, along with the context, which is the context of the function where we do our logging. Then we go and get the name from either the query string or the body, and we print a response according to whether data came back or not.
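For reference, the generated index.ts looks roughly like this. This is the stock HTTP trigger template at the time of recording, so your generated file may differ slightly:

```typescript
import { AzureFunction, Context, HttpRequest } from "@azure/functions";

// Default HTTP trigger template: reads `name` from the query string or body
// and responds with a greeting.
const httpTrigger: AzureFunction = async function (context: Context, req: HttpRequest): Promise<void> {
    context.log("HTTP trigger function processed a request.");
    const name = req.query.name || (req.body && req.body.name);
    const responseMessage = name
        ? `Hello, ${name}. This HTTP triggered function executed successfully.`
        : "This HTTP triggered function executed successfully. Pass a name in the query string or in the request body.";

    context.res = {
        // status defaults to 200
        body: responseMessage
    };
};

export default httpTrigger;
```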
Among the other files, we have a .funcignore file and a .gitignore file, and then we have the host.json file. You'll see here that the runtime is node, unlike last time when it would have been dotnet. There's also a package.json file, and we have a tsconfig file. Those are all the files that we get out of the box. I'm just going to go ahead and run this, so I'll start debugging. You might see it installing some other stuff on your computer; of course, having a recent version of Node already installed is imperative for this to actually run locally, because it depends on Node, so you need Node and npm installed on your machine. We see here in the console that we have the URL to our function. Now, to test this out we can either use a browser or we could use Postman. To spoof a request I'm going to go for Postman. I'll put in the URL, which is whatever they give you here, slash api slash HttpTrigger1, and then I'll put in the query string for the name. So I'll send that. You see that the log in the background went crazy, right? It registered that activity, and of course we got that response back as a 200 OK. That's pretty much it for
using Visual Studio Code to debug the function, and then of course we can deploy it; we looked at all of that already. We're going to go through the development of a whole suite of functions shortly, but we're just taking it step by step for now. Just before I move on (you can follow along if you want, but you don't necessarily have to), a note on the Core Tools runtime. It is imperative that you know which version of the Core Tools runtime you have versus which version of dotnet you're using. .NET 6 is still relatively new at the time of this recording, and the tooling isn't always going to be stable right away. So here I went through the
same steps once again, creating a new project, but I chose C# this time. You're seeing here that you can use .NET Core 3 or .NET 5; I'm going to choose .NET 5, and then I'll do an HTTP trigger just the same way. Of course, I did all of this in a brand new folder; sorry, I just repurposed everything here. That gives us a C# function which looks the same as what we've been seeing. So if I just do a run: I just want to run it and make sure that my runtime is adequate for building a dotnet project using Visual Studio Code. Like I said, the tooling might not necessarily be as stable as it needs to be at this time, but hopefully by the time you're doing this course, all of those nuances are gone. So yes, it was successful: here we see the runtime is up and I am successfully debugging my C# based function.

I'm saying all of that to make this point: when you spin up with the Core Tools you might be on one version, and then you try editing in Visual Studio Code and running in debug mode in Visual Studio Code, and you might not have the best experience if the tooling and the versions aren't the same. You might end up getting weird errors, like missing configurations and such. I'm just pointing that out in case you do end up like that: it's better to start in one place and do everything in one place. If you start with Visual Studio Code, do everything there; same thing with Visual Studio, et cetera. With all of that done and said, I'm going to call it quits on this lesson. I hope you learned something and feel equipped and ready for the project up ahead.
18. Section Review - Azure Functions in Visual Studio Code: All right guys, so for this section we looked at alternative tooling to Visual Studio. Visual Studio is a big, powerful beast, of course, but it's not always the best or the only option for some people. We looked at the fact that you can actually get the Core Tools and spin things up from there: you can use a regular text editor and a few commands in the command prompt, and you can still have a good experience developing your functions, deploying them, and interacting with them. We also looked at Visual Studio Code, which is a very credible second to what Visual Studio offers you. It has excellent tool sets and IntelliSense to help you through your development tasks, and it allows you to debug just the same, barring a few things. Like I said, it is a very good alternative: if you don't have Visual Studio, or that's not an option for you, Visual Studio Code is an excellent resource to use when you want to develop your Azure Functions.
19. Azure Function Bindings and Durable (Orchestrator) Functions: Hey guys, welcome back. In this section we're going to be looking at some other features of functions that we probably haven't covered yet. I'm going to go a bit more in depth into input and output bindings; we touched on those earlier, but now we're going to go through some practical examples. We're also going to look at Durable Functions and orchestration, and we're going to look at setting up handlers. We'll be using Visual Studio Code for most of these activities, and you will see how to make functions a bit more robust, and how we can let them interact with other services beyond just one trigger or one call. So stick around; we have quite a few things to go through.
20. Input and Output Bindings: All right guys, so in this lesson we're going to be looking at input and output bindings. Input and output bindings pretty much do what the names say: they give a function a source of input and a target for output. I already created a brand new project: I went to the Azure tab, said create a new project, and chose a Node/TypeScript template for this particular one, so we'll just be using that here. Now, I do realize that with the tooling, for some of the things we'll be going through, you may see different things based on the project template you're using and the way the tooling works. Remember, a lot of this is brand new, so over time there might be updates, and you might not experience what I point out as a potential gap. However, I am choosing this path of least resistance because with this template we get a wizard when we want to add an input binding. When we get that wizard and put in the appropriate information, what it's going to do is generate a block of configuration code, and that configuration is universal. So no matter which template you use, even if you don't get the wizard itself, you can still watch, and if you can't follow along exactly, follow along based on the template type you've selected. When the configuration is generated and we review it, you can stick it into your own function.json file and proceed just the same; I will point those moments out as we go. Now, another thing that we're going
to need for this activity is the Cosmos DB emulator. You can get that through an easy Google search: just search for Azure Cosmos DB emulator, go to the Microsoft documentation, and you can install it on your Windows PC. Now, if you don't have this option, you can actually go to Azure and provision the Cosmos DB service on your own; that's pretty simple, and the steps that are very similar to what I'll be doing in the emulator can be replicated in the dashboard regardless. So I'll be using the emulator for this activity. Cosmos DB will serve as our input binding, and then when we're ready for the output part of it, we will provision an Azure table, or maybe when we get there we can figure it out based on the activity; I'm just playing it by ear here. When you have installed the Cosmos DB emulator and you launch and start it, it will come up in the browser looking something like this. You'll see it tells you it's running, gives you the URI, gives you the primary key, and gives you the primary connection string; we're going to need all of that. You can also go to the Explorer, where you can create a new database and create different containers in there. I'm going to remove this one because I want to do it from scratch with you guys. So let us start off by getting this connection string, and then jump back over to our project. What we need to do is let the project know about this new connection string: jump over to your local.settings.json file, and here make sure that AzureWebJobsStorage is saying UseDevelopmentStorage=true. Whatever the functions worker runtime is, we already know what the purpose of that is. But let us put in our connection string: I'm just going to call it CosmosDBConnection, then add a colon and put in the value, which is that primary connection string we just got off the emulator.
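Assuming the names used above, local.settings.json ends up looking something like this; the connection string value is the one copied from your emulator, truncated here:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "node",
    "CosmosDBConnection": "AccountEndpoint=https://localhost:8081/;AccountKey=..."
  }
}
```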
Let's jump back over to our emulator and create our database. In the Explorer, choose New Database; we're going to be creating, let's say, a blog. We'll make sure we have the tick for provisioned throughput and leave it on autoscale. A cool thing about the emulator is that, with whatever settings you set up here, you can sometimes get an accurate enough estimation of how much it would cost in a particular region on Azure, so it's a good tool for local development, testing, and proofs of concept. We will call the database blog. Then what we'll do is create the new container: just click New Container and choose the container ID, which will be posts, and a partition key. The partition key here represents the index value to use; that will be blogId. So we'll just click OK. If we drop down posts and look at Items, you'll see each document has the id of the post, and the blogId is the partition key. While we're here, we'll put in a blog post, and you can put in a few blog posts if you want test data. I'm just going to put in an id and a blogId, both being 1, then a title, the content, and a comments block: just something that looks like what a real blog record, or document, would look like in Cosmos DB. You can just go ahead and do that: click New Item, you'll get this editable section, and then click Save when you're done. At the end of the operation your record will get some more metadata; you don't have to worry about that.
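For instance, a test document might look like this; the title, content, and comments values here are just hypothetical filler:

```json
{
  "id": "1",
  "blogId": "1",
  "title": "My First Post",
  "content": "Hello world",
  "comments": []
}
```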
That's what the record should look like in Cosmos DB. Now let's go over to our function.json file and add the configuration. There are two ways to add configurations, and I'm going to do it the manual way, or at least show you the manual way, because based on the tooling and the project template you may have spun up, you may or may not have the wizard option. For instance, I'm using the TypeScript functions project template; when I right-click the function, I see the option there to Add Binding, and what that does is bring up a little wizard that allows me to fill in the different points of configuration based on the type of binding I'm adding. Not all of the project types have that; I've noticed it's not there in a C# functions project, for example. So I'm just going to show you what the configuration looks like. Generally speaking, whether you're using TypeScript, Python, Java, C#, et cetera, function.json is where all the bindings are configured, in the form of JSON. When you see a binding whose direction is in, it means it's an input binding; when you see out, that's an output binding. We already had two by default; now we're adding another one. You can just go ahead and add a comma and start a new object block. The type is cosmosDB, the direction is in, and the name is inputDocument. You could change that, you could call it blogs, whatever you like, but this name is going to play a role in the code itself, so I'm calling it inputDocument; if you were using the wizard, it would have defaulted to the name inputDocument as well. Next, the database name is blog, and we created a collection called posts, so those go in databaseName and collectionName. Then we have connectionStringSetting: this is going to have the same name as the connection string setting that we
had created earlier. And then we have the id and partitionKey. The id here refers to which binding parameter should be used to get the id value. When I say Query, it's like a superglobal at this point: Query is going to be akin to the query string that's coming in. Remember that the default template for an HTTP triggered function reads request.query.<variableName>, or in C# we'd be looking at req.Query, so Query is just what it recognizes as a query string variable. In other words: in the query string, look for a variable called id. For the partition key we're also using id, the same thing. Note that they are both in curly braces because each is a binding parameter, as opposed to a literal value. Those are the binding parameters we want. Now, you could actually specify an SQL query here as one of the keys, sqlQuery, which, in the context of a Cosmos DB input binding (and potentially other database related input bindings), would let you write your own SQL statement if you wish to make a more complicated query. But in this case we're keeping it simple: all we want is to pass in the id, have it go and fetch the document from Cosmos DB for us, and then we can process it afterwards.
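Put together, the new binding block in function.json comes out roughly like this, using the names from this walkthrough:

```json
{
  "type": "cosmosDB",
  "direction": "in",
  "name": "inputDocument",
  "databaseName": "blog",
  "collectionName": "posts",
  "connectionStringSetting": "CosmosDBConnection",
  "id": "{Query.id}",
  "partitionKey": "{Query.id}"
}
```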
Now I'll just give you a quick demo of what the wizard would look like. You just right-click the function in the list and say Add Binding, then you choose the type; let me zoom in so you can see better. Like I said, we're using Cosmos DB. You choose the binding name (I'll leave it as the default), you choose the name of the database, you choose the name of the collection, you choose the connection string from your settings, and then you give it the document ID, which corresponds to the id binding we just wrote, so I'd put Query.id. For the partition key, it asks where I'm getting the partition key value from; because I'm using the id for the partition key as well, that's why I say Query.id again. However, if it were a case where I should pass in the id separately from a partition key, then I could say key, or whatever the name of the parameter is that I'm expecting as the partition key. Then you pretty much just go through; I'll press Escape here to cancel, since I don't want it to proceed. If I had pressed Enter instead, it would have generated that same block regardless. One caution: it would also have put in the sqlQuery key, and it would have been blank. I would advise that you just remove that, because I've seen it cause problems, and the problems aren't reported very clearly. So if you used the wizard and you got a blank sqlQuery, my advice is: if you're not planning on using it, just remove it, and you can proceed with less concern.
So now that we've configured Cosmos DB, configured the binding, and configured our connection string, let us write some code. I've refactored this method, and I'm going to point out all the changes. First, you can just remove everything between the first context.log and the closing brace; just empty out the function so it looks like this. When you finish emptying it, you can leave that first log line if you wish, no problem. The first modification is to our function parameters. We have the context and we have the request; now I'm adding the input document. Because I'm using TypeScript it's very strongly typed, but it does allow you to be loose, so I'm going to say any here; we could specify a data type if we needed to, but I'll just leave it as any. So: context, request, and now I'm adding inputDocument. Then keep whatever code was there from before.

Now, inside the body of our function, what I'm going to do is say: if there's no inputDocument, log that we couldn't find a document with that ID, and then respond. So context.res (I'm assuming res means response) is set to an object, and we can specify a status. In testing in earlier parts of this course, we saw that when we passed in a wrong value we still got a 200 OK, even though it didn't find the value that was supposed to be there, which is wrong. HTTP responses need to be very specific; if you've done API development, you will be able to appreciate that, and if not, just know that they need to be specific. Even my saying 400 here would be wrong: it's more like a 404, because 404 means not found, and after all, I am saying it wasn't found. A 500 would be if it was a fatal error on my side, which is not the case in this particular scenario. So, if there's no inputDocument, nothing was provided. What's happening is that the binding is automatically sitting there watching for a binding value coming in through the request; that's the binding we specified in function.json. Once it sees the value, it's automatically going to go to Cosmos DB and check against whatever it needs to check. So by the time execution gets here, it will know whether it found anything or not. In JavaScript terms, we're basically saying: if it's falsy, then we say not found, and we respond with a 404 and the message that it wasn't found. Then, on the other branch, we respond with a 200, and the body should contain whatever content is in that inputDocument, which, remember, is just a block of JSON.
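Here's a rough sketch of the refactored handler, assuming the binding name inputDocument from our function.json and a hypothetical not-found message:

```typescript
import { AzureFunction, Context, HttpRequest } from "@azure/functions";

const httpTrigger: AzureFunction = async function (
    context: Context,
    req: HttpRequest,
    inputDocument: any   // bound by the cosmosDB input binding in function.json
): Promise<void> {
    context.log("HTTP trigger function processed a request.");

    if (!inputDocument) {
        // The binding ran the lookup before we got here; nothing matched the id.
        context.res = {
            status: 404,
            body: "Blog post not found"
        };
    } else {
        context.res = {
            status: 200,
            body: inputDocument
        };
    }
};

export default httpTrigger;
```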
So let's see what that looks like. I'm going to run it and, as usual, I'll use Postman for the test. I'm passing in that query string parameter, id equals 1, and I'm getting the 200 response with the content of that document. Now let's try an ID that isn't present: I passed in 10, and I get the 404 Not Found with the message that the blog post was not found. With as many documents as you might have added, whatever their IDs, it will just automatically go and look them up. So that's what we mean by an input binding: as soon as the method is executed, it binds to this particular value, for the purpose of querying a particular resource or interacting with a particular service. In this example, our input binding is looking for an id, as per our configuration, and once it sees that ID, once it gets called, it's going to reach out to Cosmos DB to get the matching document. All right, let's
continue along this line of thought and add an output binding. I've already added the configuration, and you can just go ahead and pop it in there: a new object block, where the type this time is queue and the direction is out. We have the name, which is the default name that would have come up in the wizard; then the queueName, where we give the queue a name; and then the connection, which this time needs a storage connection, so we use the AzureWebJobsStorage setting.
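That block, again using the default names the wizard would suggest, looks something like this:

```json
{
  "type": "queue",
  "direction": "out",
  "name": "outputQueueItem",
  "queueName": "outqueue",
  "connection": "AzureWebJobsStorage"
}
```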
Now, for context on what all of that is pointing to: if you bring up the Azure Storage Explorer, you're going to see, under Local & Attached, the storage accounts. A storage account is pretty much where your blobs would go, and it also has queues and tables. We're using the queues that live in a storage account. Of course, we're emulating it right now; if you went and provisioned this on Azure, you would be able to access it down below here. Locally, though, we have it emulated, so we can go to Queues and look for the out queue. You don't have to create this queue: when you execute the code we're about to write, if the queue doesn't already exist, it will actually go ahead and create it for you. That's the beauty of it. So you don't have to create it right now; I'm just showing you where to look out for the creation of the out queue. If you were to go through the wizard for this binding, let me just show you quickly what it would look like. Say we want an output binding, and we tell it we want Queue Storage. Then it asks us for the name, suggesting outputQueueItem or whatever the default is. You could give it a different name; it really doesn't matter, because it will just create a new binding block anyway, but let's give the queue a different name. Then we'd point to the storage connection string, which, because we're working locally, is saying use the development settings. Then you'd be able to proceed, and that entire configuration block would have been generated for you. Once again, you have two ways to do it: if the wizard is not an option, then you can always just get familiar with those configuration keys. In our code we're going
to make one adjustment. Inside of the section where we don't find our blog post, we're just going to send a message over to the queue; it will send that same response message over to the queue. So I'm going to write context.bindings.outputQueueItem. Remember, context.bindings is saying: context, go and look through all of the bindings that you have, and get me the outputQueueItem, which is what we called it in the configuration. Then the response message is what I want to put on that queue; you could put a different message. It's a messaging queue, so it's generally text based. That's really the only change that we need to make.
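In code, the adjustment sits inside the not-found branch of the handler sketched earlier; this is just a fragment to illustrate, using the binding name from our configuration:

```typescript
// Inside the not-found branch of the earlier handler:
const responseMessage = "Blog post not found";
context.bindings.outputQueueItem = responseMessage; // deposited on the "outqueue" queue
context.res = { status: 404, body: responseMessage };
```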
When we run it and try to find a blog post that doesn't exist (so far I've been testing with 10), we see that we get back the response just the same. But if I jump over to the Storage Explorer and look in that out queue, I'm going to see the message, or messages, being deposited on the queue. That's pretty much how an output binding works. So the input binding basically sits and watches, and says: when I get this value, I am automatically going to fire off to retrieve something or do something. And the output binding says: this is what I'm going to send out at this point in the execution. Like we saw, you have a number of binding options: you can do an SMS, you can do a SignalR hub update, you can do SendGrid, which is an emailing service, and you can do HTTP, which we have been doing; that response is an HTTP output binding, in fact. context.res is basically saying: in context, go and get the res, where the name is short for response; I guess if you wanted to rename that binding, that's all it would take. So that is really it for
input and output bindings; we did two very simple examples. In doing these examples, we also looked at different third party tools that can help us in our interactions. Even in Visual Studio Code, I went all the way over to the emulator for Cosmos DB; but once you have that installed, you don't necessarily have to go into the actual management screen. You can actually do it from right here, under the Databases section in Visual Studio Code, because you would have seen Attach Emulator there. If you attach the emulator, it asks you whether it's Core or the MongoDB API; you select Core, and it will automatically connect to the local Cosmos DB. So that's a nice, easy way to handle that operation: you can create a database and the collection from right here. We can also do virtual machine management, but we don't have any VMs in this course, and you can manage your functions from here, et cetera. So there are tools; once you understand how they all work together, then
you should be fine. If you're having any issues with your storage account, which you might, I would just suggest that you update your Storage Explorer and make sure it's the latest version. I'm running 1.22; I was on 1.17 and had some issues, because they've made some updates since then. You should also have Azurite, the emulator for local Azure Storage development, installed. By the time you're doing this course you probably won't have these issues, but if you are working with an older toolset and not installing fresh, then I would advise you to just install fresh, because that caught me out. For one, the storage emulator that would have come with the original version of the Storage Explorer has since been deprecated; Azurite is the new thing, and it should have come with Visual Studio 2022. However, for me it didn't quite work out that way, so for whatever reason I instead used npm to install it. You will already have Node.js and npm installed if you were keeping up with the activities using TypeScript; if not, you can go ahead and get them. You can just bring up your command prompt or your console and run the install; then you'd want to create a folder in a convenient location and run the start command, so you can make sure Azurite is running in the background.
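A minimal sketch of those two steps; the folder path here is just an example:

```
npm install -g azurite          # install Azurite globally via npm
azurite --location c:\azurite   # start it, storing its data in a folder you choose
```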
This takes the place of the previously installed Azure Storage emulator. I'm only showing you these things in case you're encountering issues; issues are natural, but it's always good to have the solutions. You can do all of that, then restart your Storage Explorer, and everything should be able to connect. The symptoms of it not working would be that the function will not execute when you try to test it (it will just pause), and another symptom is that when you try to connect to the emulator, it tells you about connections being timed out or refused. So if you're getting those symptoms, then you definitely need to go ahead and make these updates. That being said, that's really it for input and output bindings. I encourage you to explore and create other functions that you think would be useful: a timer triggered function that sends off an e-mail every morning or every week, stuff like that. So go ahead and explore.
21. Durable Functions and Orchestration Patterns: Welcome back guys. In this lesson we are going to be talking about durable orchestration, or Durable Functions. That's another function type available to us in Azure Functions; we can easily spin one up using the template, and we will be looking at that. But before we get there, I just want to go through the write-up on the Microsoft documentation for Durable Functions. Durable Functions is an extension of Azure Functions, and it basically uses the orchestrator pattern to follow through on things. Just for context: a durable function is one that you can use to define workflows using procedural code. A workflow generally is: you want to do this, then that, then that, then that. You may require manual intervention in between, or not; but the point is that you don't want to get to step three unless steps one and two have been completed successfully. A durable function will actually track the state of the operation, so that it knows what decision to make at each point of reference. They are also capable of calling other functions, or other durable functions, synchronously and asynchronously, and the output is reliably stored in local variables. Orchestrator functions are durable and reliable, and they basically checkpoint each time they await or yield. So each time a function is called and returns its value, it creates a checkpoint; it knows exactly where it is in the operation, and the state is never lost. It's always there, always keeping track of what is happening and where it is in the process. They can also be long-running: seconds, days, months, even never ending. And you have different patterns, where you might be calling an API asynchronously over and over, or polling a data source for something continuously. The most important part of this is that it tracks the state, and it always allows you to know where you are in the operation. If you want a better understanding, at least of the theory, you can read through the documentation for more, and you can see some of the code examples in the different languages. But we're going to be doing it together, and we'll be using Visual Studio Code and a C# template. So, of course, we jump over
to Visual Studio Code. I've already created a folder; azure-durable-functions is what I called it. Now we jump onto the Azure tab and over to the Functions section; we're going to be creating a new function app, so we get our usual wizard. Let me just zoom in so we can see clearly. I choose my folder, then I choose my language, then it wants a runtime. I have .NET 6 installed; of course, you choose the runtime according to your context, and the Core Tools version would be relative to that: version 4 goes with .NET 6. Then I want Durable Functions orchestration, and you can find it easily; you can always just start typing and it will filter the list. So I select Durable Functions orchestration, and I'll leave the default name for now, and the default namespace. Then it wants to know which storage account; I'm going to use the local emulator. Allow it to generate the project, and once it's created we have our new code file. I'm just going to zoom out a bit; we're already familiar with what the project looks like. If I look in the .csproj file, you see here that I have this particular package reference (Microsoft.Azure.WebJobs.Extensions.DurableTask), which is very important for our durable operations. If you're using Visual Studio, just double-check and make sure that this package is being referenced; if not, you can go and fetch it on NuGet before you create the durable function template. That's fine. Now that we have our code file, we can just look through it and do a test. I'm not going to modify
it too much or anything; we just want to understand exactly what is happening here. First, we have the attribute that tells us the function name. This file has a function called RunOrchestrator, which returns a list of strings, and it's passed an orchestration trigger: you can see here the context parameter is of type IDurableOrchestrationContext. It's tracking all the outputs, and for each output it adds the value that is gleaned from calling an activity function with a value passed in. You see here it's only one function that it really calls, and that is DurableFunctionsOrchestrationCSharp1_Hello. It's only one function, which is defined lower down; but imagine if we had multiple functions that we wanted to call one after the other. We could just call them in that specific order. So here it's saying: call the function by that name and give it that value, and the value is going to bind in the function being called. Even though the attribute carries that long name, the method itself is called SayHello, and it takes an activity trigger. The activity trigger is pretty much the parameter that's going to get bound. It could work like an input binding; it could be something for Cosmos DB or something else, whatever it is. We've gone through our input and output bindings, so we have an appreciation of how that could work. Once you pass in that value, the activity carries out its operation based on the value that came in. Here it's only going to say hello to each of these cities; it just returns hello with that value. That's all it's really doing. Now, the last function
that you get here is one that is triggered through an HTTP trigger, and we know about HTTP triggered functions already. That means that when we call it, we kick off this durable function and the whole durable operation through that HTTP trigger. Notice the durable client parameter: it's an IDurableOrchestrationClient, and it's called starter. It's going to start the operation and store the instance ID: all this is doing is starter.StartNewAsync of the durable function, which is the orchestrator we looked at first. So the HTTP trigger is going to trigger the durable function, and the durable function has its operations outlined: trigger this activity and that activity in that order, store their outputs, and make one big return. At the end of all of that, it's going to log and say it started with the instance ID. The instance ID, once again, helps it track whether this operation is currently in progress, or finished, or whatever, and you can use it in your logs for analytics. Then at the end of this whole trigger, it's just going to call the check status response and hand back a status along with the instance ID. The instance ID, once again, is the instance of this whole orchestration that's happening, and the check status response lets you ask: is it finished? Are we okay? So, like I said, it stores the state.
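Putting the three pieces together, the generated template looks roughly like this; the names come from the template default, so yours will match whatever name you chose in the wizard:

```csharp
using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class DurableFunctionsOrchestrationCSharp1
{
    [FunctionName("DurableFunctionsOrchestrationCSharp1")]
    public static async Task<List<string>> RunOrchestrator(
        [OrchestrationTrigger] IDurableOrchestrationContext context)
    {
        var outputs = new List<string>();

        // Each call checkpoints: control returns to the orchestrator after every activity.
        outputs.Add(await context.CallActivityAsync<string>("DurableFunctionsOrchestrationCSharp1_Hello", "Tokyo"));
        outputs.Add(await context.CallActivityAsync<string>("DurableFunctionsOrchestrationCSharp1_Hello", "Seattle"));
        outputs.Add(await context.CallActivityAsync<string>("DurableFunctionsOrchestrationCSharp1_Hello", "London"));

        return outputs;
    }

    [FunctionName("DurableFunctionsOrchestrationCSharp1_Hello")]
    public static string SayHello([ActivityTrigger] string name, ILogger log)
    {
        log.LogInformation($"Saying hello to {name}.");
        return $"Hello {name}!";
    }

    [FunctionName("DurableFunctionsOrchestrationCSharp1_HttpStart")]
    public static async Task<HttpResponseMessage> HttpStart(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post")] HttpRequestMessage req,
        [DurableClient] IDurableOrchestrationClient starter,
        ILogger log)
    {
        // Kick off the orchestration and hand back URLs for checking its status.
        string instanceId = await starter.StartNewAsync("DurableFunctionsOrchestrationCSharp1", null);
        log.LogInformation($"Started orchestration with ID = '{instanceId}'.");
        return starter.CreateCheckStatusResponse(req, instanceId);
    }
}
```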
Enough of me explaining; let us see this in action. I'm going to run this function, debugging it as usual. All right, now that it's running, you're going to see in the logs that it detects the HTTP start, which is the HTTP triggered method; then we have the orchestration; and then we have the activity. That's what we call the function that the orchestrator calls: an activity function. And there's the orchestrator itself, while the starter is a simple HTTP triggered function. So when I make the call to the HTTP start, I'm going to Ctrl+Click, which of course launches it in the browser. My browser launches, it calls the URL, and we know what happens when you call the URL of an HTTP triggered function: it actually went ahead and started the orchestration. You see here where it says Hello was called successfully, and it returned Hello Tokyo, Hello Seattle, and Hello London. That's evidence that it was called successfully. But look at this also; let me see if I can zoom in a bit so we can see those logs better. You'll notice that each time it successfully made a call, it goes right back to the orchestrator function: it calls, it executes the activity, we get the result of the activity, it tells us it executed the activity, and then it goes right back to the orchestrator. Every time it makes a function call, it returns control to the orchestrator. And of course, you are the one authoring the workflow, so you put in your logic and your logging and your decisions based on the output, or the success or lack thereof, of the durable function calling the activity. That is how you can use that
to your advantage. When you called the HTTP triggered function's URL, you'll notice that you got back a block of JSON; whether you use Postman or the browser, it doesn't really matter. What you're getting includes a statusQueryGetUri value, which is a URI that will give us the current state of the orchestration. I'm going to copy that URI and place it in a new tab. You see here it gives a name (we get the name of the orchestration), we get the instance ID, and we get the runtime status, which in this case is Completed. So at any point in the middle of it, if you want to see the status of this durable function (is it done yet? is it still in progress?), you would be able to just ask: give me that status. That's pretty much it for Durable Functions. Of course, it's a really simple example, but I hope you're seeing the possibilities, and how you could actually use this kind of workflow for more complex operations where you need to track: is it done yet? What is the status in the middle of it? And make decisions based on the outcome of one operation going into another.
22. Create Serverless REST API using Azure Functions: Hey guys, in this section we're going to get down to some fun stuff. We've been looking at lots of theory, a lot of concepts, and just getting to understand the inner workings of Azure Functions. In this section, however, we're going to put all of what we've learned to the test, and we'll be building out a REST API, using functions to build all the endpoints. If you're not familiar with REST APIs, you can check out my other courses on REST API development, where we actually go through how to build them using a full API development project template in Visual Studio; but here we're going to be using Azure Functions. In this video, let's just get started. We're going to be using Visual Studio for this one. Let's find the Azure Functions template; once again, if you don't have it to the left, you can always search, and if you don't have it at all, you will have to go get the installer and install the Azure workload. Let's select Azure Functions, and this one I'm going to call ShoppingCartList. I think that's a simple enough example: we have items that belong in a shopping cart, and we tick them off when we have them. We can just go ahead and create that, and for now I'm going to do an empty template. For the storage emulator: yes, we want to use a storage emulator; we already went through how that works. Just for context at this point: the older, traditional Azure Storage emulator tends not to work properly anymore, and as I mentioned, you should use Azurite instead. To make sure it's running after you've installed it, you can always just run the azurite command, and then you'll see it starting up all of the services. So if you encounter any difficulty connecting when you try to use these emulated services, and it's saying it can't connect, just run that azurite command. All right, I'll just have that running in the background. For now, let me go ahead and create this empty template. When we come back, we're going to get started with creating our functions and looking at how we're going to lay things out in terms of building our REST API structure.
23. Setup API Routes: All right, so let's get this started. I'm going to add a function, so New Azure Function, and since we're dealing with an API-like structure, all of my functions need to be HTTP triggered. I'm going to call this ShoppingCartFunction, or rather ShoppingCartApi; let's call it that and click Add. Then I'll go with the HTTP trigger, make it anonymous, at least for now, and click Add. All right, so now we have our function, but I'm not going to start modifying it just yet. Instead, I'm going to put some other things in place first. I'm going to add a new folder and call it Models. The thing with API development is that you typically want models that govern what data you expose, or interact with, for each request. I'm going to have a shopping cart model, first of all, and I'm going to make it internal; internal means it can be used by anything inside the same project, so I can leave it as internal for now. Then I'm going to add some properties. I went ahead and wrote all three classes, and you can just look at them and replicate them accordingly. We have the shopping cart item; actually, it started as a shopping cart in general, but it should be a shopping cart item, so let me just update the name of the file accordingly. The ShoppingCartItem has an Id, which I'm just going to use as a string GUID; a DateCreated; then the ItemName; and then a Collected flag, you know, like on a shopping list where you strike off the item once you've put it into the cart. That's pretty much what that signifies. Then we have one for the create. For the create, I'm not willing to expose everything: I don't want the user to be in charge of the Id, or of DateCreated, and Booleans default to false, so we don't need to specify that or require it from the user; I'm assuming the item is not yet collected. All we need is the name of the item. For the update, you may be changing the name of the item, or you might just be changing Collected; I could keep it to just saying whether it's collected or not. I think I'm going to keep it that simple and just work with that. Later on, when we're connecting to the database, we will have to create some more models, but for now this is all that we need.
In a REST API, these would be what you call DTOs, or data transfer objects. Once again, if those terms aren't familiar to you, you can always check out my course on API development. Now, let's jump back over to our function. What we're going to do here is actually establish our routes. The thing with functions is that by default we get this class, and we get one function which can do GET and can do POST. We don't really need anything to do both, because in a REST API setting it should be very strict: one for posting, one for getting, at the very basic level. Each function should be in charge of one operation at a time. Since we're mimicking that REST API structure and standard, we have to modify this. What I can do is rename the function: if this is supposed to get shopping cart items, I can call it GetShoppingCartItems. All right, so that's the function name. However, for the route, I want to specify a value, because I don't want it so that when you call the function you have to say slash api slash the function name. Once again, REST API standards would have you expose one URI structure, and based on your verb, and the potential parameters or values being passed over, it knows which function exactly you're referring to. So we're going to set this route so it looks a little more like what a REST API would look like: I'm going to call it shoppingcartitem. The URL is already going to be slash api slash and then the route; when I specify this as a route, anything on that route after the slash api will be going to this function, right? The next thing is that I don't want this function to handle POST; I want it to retrieve, so I'm going to remove that "post". And just to make sure everything looks good, I'm also going to rename the method itself to reflect the function name, so everything looks uniform. For now, I'm going to remove all of the boilerplate from the function body, because what we're going to be doing won't require it. That is the one to GET.

Now, once again, we're doing a REST API, so that means I need one to create. I'm just going to copy all of that, and I'm going to call this one CreateShoppingCartItem; we need to give the method that name, and this one is going to be POST, because it's not getting this time, it's creating an entity. For a create you should use POST, and it has the same route: it's going to be a POST request to that HTTP route, where the other is a GET request to that HTTP route. We can do the same thing for PUT and DELETE: PUT would be for updating, and DELETE, well, that's self-explanatory. So UpdateShoppingCartItem and DeleteShoppingCartItem; let me just make sure that my method names are really reflective of what each operation is going to be. This one is a PUT request and this one is a DELETE request. All right. Once again, we're just mapping out routes; they may change as we go along, we might see something we want to change, and of course we're going to be changing the parameters when we focus on each one, but for now I'm just mapping them out so we get the big picture. There is one other kind of GET, because maybe we want specific ones. If we wanted a specific shopping cart item, then it would be GetShoppingCartItem, not items. This would also be a GET, but the route would be shoppingcartitem slash, and then we're going to use a binding parameter, {id}, to say that it should expect something called id afterwards. So we have get-all, and get-by-ID, which is pretty much GetShoppingCartItem; you can qualify the name further if you want to be very clear about what it's for, but all it adds is the ID. We have the create, which is just a POST, and then we have the PUT and DELETE. Now we have a good idea of the big picture; the sketch below recaps the five routes. When we come back, we can start working on them one by one.
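Here is a sketch of the five signatures as mapped out in this lesson; the bodies are omitted since the next lessons fill them in, and the exact parameter order is my assumption:

```csharp
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;
using System.Threading.Tasks;

public static class ShoppingCartApi
{
    [FunctionName("GetShoppingCartItems")]
    public static Task<IActionResult> GetShoppingCartItems(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "shoppingcartitem")] HttpRequest req,
        ILogger log) => throw new System.NotImplementedException(); // implemented in the next lessons

    [FunctionName("GetShoppingCartItem")]
    public static Task<IActionResult> GetShoppingCartItem(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "shoppingcartitem/{id}")] HttpRequest req,
        ILogger log, string id) => throw new System.NotImplementedException();

    [FunctionName("CreateShoppingCartItem")]
    public static Task<IActionResult> CreateShoppingCartItem(
        [HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = "shoppingcartitem")] HttpRequest req,
        ILogger log) => throw new System.NotImplementedException();

    // The update and delete routes gain an {id} segment when we implement them later.
    [FunctionName("UpdateShoppingCartItem")]
    public static Task<IActionResult> UpdateShoppingCartItem(
        [HttpTrigger(AuthorizationLevel.Anonymous, "put", Route = "shoppingcartitem")] HttpRequest req,
        ILogger log) => throw new System.NotImplementedException();

    [FunctionName("DeleteShoppingCartItem")]
    public static Task<IActionResult> DeleteShoppingCartItem(
        [HttpTrigger(AuthorizationLevel.Anonymous, "delete", Route = "shoppingcartitem")] HttpRequest req,
        ILogger log) => throw new System.NotImplementedException();
}
```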
24. Setup POST (Create) HTTP Azure Function: All right, so we're back, and we're going to start off with CreateShoppingCartItem. First, let's update our log message to say "Creating shopping cart item". In the request we have the data coming in; we've already seen that a request could carry data in the body or in the URL, and since it's a POST, we're expecting the data to come in the body. I'm going to start off with a variable, string requestData, where we await a new instance of StreamReader; the stream reader wraps our HTTP request parameter's body, req.Body, and then we want to read it, ReadToEndAsync rather. All right. Now that we have read it, this gives us the JSON body, and we pretty much want to deserialize it into our shopping cart item object. So I'm going to say var data is equal to, and I'll use JsonConvert. JsonConvert comes from the Newtonsoft library, which was already included here; it's basically a requirement for HTTP triggered functions, and we've seen it and used it before. My point is that it's like any other package, just a NuGet package, so if you already know how to get packages, you can go to Manage NuGet Packages and add it without any qualms. We want JsonConvert.DeserializeObject; we want to deserialize into our CreateShoppingCartItem type, and then we give it the request data, which we know is JSON in nature. Then, after we've done all of that, we want to add it to some database. What we add is going to be of type ShoppingCartItem, because we want the defaults. So, having renamed that first variable from item to data, I'm now going to say var item is equal to a new instance of ShoppingCartItem. This new shopping cart item just takes the item name, which is coming in from the request data: data.ItemName. Once we have that object, we'd want to save it to a database, but I'm not yet ready to start the whole connect-to-a-database effort, so what I'm going to do is just use a static list. I'm calling it shoppingCartItems, equal to a new instance of the list. Then down here I'll just say shoppingCartItems.Add with the new item.
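By the end of this lesson, the function comes out roughly like this. It's a sketch that includes the return we add in a moment after a quick digression; it lives in the same class as the earlier skeleton, plus using System.IO and Newtonsoft.Json:

```csharp
private static readonly List<ShoppingCartItem> shoppingCartItems = new List<ShoppingCartItem>();

[FunctionName("CreateShoppingCartItem")]
public static async Task<IActionResult> CreateShoppingCartItem(
    [HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = "shoppingcartitem")] HttpRequest req,
    ILogger log)
{
    log.LogInformation("Creating shopping cart item");

    // Read the JSON body and deserialize into the create DTO.
    string requestData = await new StreamReader(req.Body).ReadToEndAsync();
    var data = JsonConvert.DeserializeObject<CreateShoppingCartItem>(requestData);

    // Build the full item so the Id, DateCreated, and Collected defaults apply.
    var item = new ShoppingCartItem { ItemName = data.ItemName };

    // Stand-in for a real data store; see the caveats discussed below.
    shoppingCartItems.Add(item);

    return new OkObjectResult(item);
}
```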
Now, let us pause for a minute and reflect on the relevance of using this list, and what the pros and cons are. Obviously, in a real setting we wouldn't be using a list like this; we'd be using a database. However, I'm not ready for the database just yet, so we're going to be using the list. Now, the thing with a function app is that while it is running, it's going to keep any data, any variables, in memory during the runtime. While we're testing, this is perfect, because we can call all the different methods and they'll be interacting with the same list. In a production setting, though, remember that functions are able to scale. If you were to deploy this, it may work for some scenarios while your function app is running; but remember that on some plans it will actually be turned off until something has to spin it up again, so that's one danger. And even if we assume that doesn't happen, it can scale to multiple instances, so you would end up with multiple instances of the list, which is another danger. That's of course why we would want a database; but for now, we'll just use our list. After we add to the list, we can then return a new OkObjectResult, and I'm just going to pass in the new item: not the whole list, just the item that was created. That's really it: all of the errors in our CreateShoppingCartItem function are gone, and that is step one. When we come back, we'll be going into our GETs, and we'll do both GETs in one go: get-all versus get-by-ID.
25. Setup GET HTTP Azure Functions: Alright guys, so we're
back in this lesson, we're going to be focusing
on or get methods. The first method,
which is to get all items that one is
fairly straightforward. So my message is
going to say get, getting all shopping cart items. Then all we're really going
to have to do is return. So at this point
would really go and query the database
and say give me, all right, In this case we don't really have a database,
we only have this list. So all I'm going to do
is return on, okay, Object result with the list. That's the list of the
shopping cart items. And once you want, all, you get to all. So that's pretty much it. I have an error here. Sorry, I missed
off my new return, new okay, objects results. There we go. That's really it.
For the get by ID, it takes a bit more finesse, right? So yes, we have the same route, but we know that this one is expecting an ID value coming in. That ID value usually comes in as a parameter in an HTTP request; here it comes in as a route parameter. I'm going to add a new parameter, string id, and that string id is going to represent the ID value that should be passed in when this function call is made. I'm actually going to put it as the last thing; I'll put it after the logger. Once this method is called, we're expecting this ID. Now, once we have that ID, I'm just going to rewrite the logger message, "Getting shopping cart item with ID", and then interpolate the id value that's being passed in (sorry, interpolate, not encapsulate). Now that I have that, I'm going to say var shoppingCartItem is equal to, then go to my shopping cart items list, then dot: we could use Find, but I'm going to stick to what I am more comfortable with, FirstOrDefault, because with FirstOrDefault, if it doesn't find it, it will return null. So FirstOrDefault, and then Ctrl+. to add the using for System.Linq. And then we put in our lambda expression: q such that q.Id of the shopping cart item is equal to the id value that is passed in. Once we have done that, I want to now check (I see Visual Studio is helping me along): if the shopping cart item is null, meaning it obviously didn't return anything, then we would have to return a not-found result; that's a 404. Okay, so here's a difference worth pausing on, because you might notice that you have ObjectResult versus Result. How is OkResult different from OkObjectResult? The difference is that when you return an ObjectResult, you need to put in data; when you're not returning data, you use the plain result. So in this case I have no data to return, and the object version is expecting some value, so I have to say NotFoundResult. They're really the same 404; it's just about which one to use when appropriate. So here we are going to return a NotFoundResult. Otherwise, we're going to return a new OkObjectResult with that particular shopping cart item in hand, so that the caller gets the data. Let's review quickly. To get multiple items, GetShoppingCartItems, all I'm doing is running a query, quote unquote, against the data store and returning everything; we know what our data store is, at least for now, so I'm just returning what is in that list. When we want one by ID, I have to specify that I want the parameter id to be present. They're both GETs; we already know the route, and this one ends in slash id, hence my need for the ID. Then, once I have that ID, I run the query to find that one object in the data store. If I don't find it, I return a NotFoundResult; otherwise, I return the OkObjectResult. That's really it for the GET methods of our REST API.
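To make that shape concrete, here is a minimal sketch of the two GET functions as just described. It's a sketch assuming the static List<ShoppingCartItem> named shoppingCartItems from the create lesson and the route names from my build; adjust the names to match your own project.

    // Assumed usings: System.Linq, Microsoft.AspNetCore.Http, Microsoft.AspNetCore.Mvc,
    // Microsoft.Azure.WebJobs, Microsoft.Azure.WebJobs.Extensions.Http, Microsoft.Extensions.Logging.

    [FunctionName("GetShoppingCartItems")]
    public static IActionResult GetShoppingCartItems(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "shoppingcartitem")] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("Getting all shopping cart items");
        // No database yet, so the in-memory list is our data store.
        return new OkObjectResult(shoppingCartItems);
    }

    [FunctionName("GetShoppingCartItemById")]
    public static IActionResult GetShoppingCartItemById(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "shoppingcartitem/{id}")] HttpRequest req,
        ILogger log,
        string id) // bound automatically from the {id} route segment
    {
        log.LogInformation($"Getting shopping cart item with ID: {id}");

        // FirstOrDefault (from System.Linq) returns null when nothing matches.
        var shoppingCartItem = shoppingCartItems.FirstOrDefault(q => q.Id == id);
        if (shoppingCartItem == null)
        {
            return new NotFoundResult(); // plain 404, no body
        }

        return new OkObjectResult(shoppingCartItem);
    }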
26. Setup PUT (Update) HTTP Azure Function: All right guys. So let us jump over to our PUT; we're looking at updating our shopping cart item. Now, we have to make some more changes, because when we want to update, we have to pass in an ID as well as the data to be updated, right? So we want to do what we did up here, which is include the id parameter. There we go. And we have to update our route, because we also need that slash id on the end of the update route. Then we can change our log message: "Updating shopping cart item with ID", with the id interpolated in. Now, we have to do kind of a hybrid operation. The first thing is that we have to find the data, so we can repeat what we did in the GET. I'll just go ahead and say: get me the shopping cart item that has the id that was passed in as a parameter. If we didn't find it, then return not found. Alright. Next, we are going to have to repeat something like what we did with the create: we have to parse the incoming data, this time into the UpdateShoppingCartItem model. In a bigger operation, we could use something like AutoMapper to map between the incoming data and the shopping cart item to do the update. But this is a very simple operation, so what I'm going to do is simply update the shopping cart item's Collected flag and let it be equal to the incoming data's Collected flag. If you're updating it, it must mean that you're changing the Boolean from true to false or vice versa; that's all we're really going to be updating. Then, at the end of all of that, we're just going to return an OkObjectResult, and I will just return the updated shopping cart item object. That's really all there is to the update.
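Here is the PUT in the same sketch form, with the same assumptions as the GET sketch (plus System.IO, System.Threading.Tasks, and Newtonsoft.Json for the body parsing):

    [FunctionName("UpdateShoppingCartItem")]
    public static async Task<IActionResult> UpdateShoppingCartItem(
        [HttpTrigger(AuthorizationLevel.Anonymous, "put", Route = "shoppingcartitem/{id}")] HttpRequest req,
        ILogger log,
        string id)
    {
        log.LogInformation($"Updating shopping cart item with ID: {id}");

        var shoppingCartItem = shoppingCartItems.FirstOrDefault(q => q.Id == id);
        if (shoppingCartItem == null)
        {
            return new NotFoundResult();
        }

        // With a PUT, the data comes in the request body, just like the POST.
        var requestData = await new StreamReader(req.Body).ReadToEndAsync();
        var data = JsonConvert.DeserializeObject<UpdateShoppingCartItem>(requestData);

        // Manual mapping; a bigger app might use AutoMapper or similar here instead.
        shoppingCartItem.Collected = data.Collected;

        return new OkObjectResult(shoppingCartItem);
    }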
So let's go through it again, because I kind of took code from different pieces; let me just explain what we're doing. One: we're finding the shopping cart item with the matching ID that was passed in via the parameter. Then we're going to say: if nothing was found, we return a 404. Otherwise, we continue and say: get me the data that came in with the request, because with a PUT, the data would be in the body; it's very similar to how it would be in the body for the POST. Give me the data from the body, parse it into the corresponding model for updating, and then we're going to manually update. So I'm doing it manually; in a bigger operation, like I said, you would probably use AutoMapper or some other third-party library to have that done for you. These are very simple operations, so we'll just do it by hand. And then we return the updated object as part of the result. That's really it for setting up the PUT. When we come back, we will do our final one, which is the delete.
27. Setup DELETE HTTP Azure Function: All right, so let's get started with the delete operation. I think we've been doing this long enough that, if you want to hit pause and try to do it yourself, I would encourage that. Otherwise, we can just walk through it; I already wrote the code. We had set up the DeleteShoppingCartItem method from before; what we didn't do was finish the route. The route, once again, will require an ID value there, and we have the matching parameter string id and an updated log message. And once again, we try to find the item. If it is not there, then we return a NotFoundResult. However, if we can continue, we remove it from the shopping cart list and then return an OkResult. Note: not OkObjectResult, just OkResult, since there's no data to return. And that's really it for the delete method.
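And here is a matching sketch of the delete, under the same assumptions as the earlier sketches:

    [FunctionName("DeleteShoppingCartItem")]
    public static IActionResult DeleteShoppingCartItem(
        [HttpTrigger(AuthorizationLevel.Anonymous, "delete", Route = "shoppingcartitem/{id}")] HttpRequest req,
        ILogger log,
        string id)
    {
        log.LogInformation($"Deleting shopping cart item with ID: {id}");

        var shoppingCartItem = shoppingCartItems.FirstOrDefault(q => q.Id == id);
        if (shoppingCartItem == null)
        {
            return new NotFoundResult();
        }

        shoppingCartItems.Remove(shoppingCartItem);

        // Plain OkResult (200) because there is no data to send back.
        return new OkResult();
    }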
So with that, if we just do Ctrl+Shift+B, we can build and make sure that we have no errors anywhere in our code. We can see that we have a successful build, zero errors, and we can continue. So now that we have fleshed out all four of our operations, when we come back, we'll just go through some testing. We'll use Postman to call the different endpoints and make sure that our functions are functioning the way we expect.
28. Test Azure Functions with PostMan: Alright, so it's testing time. We've done all the hard work; let us put it to the test. We can just click the start button here in Visual Studio to start debugging, or, if you're using Visual Studio Code, by now you know how to run that. And then, in the console that accompanies the execution, we get a list of all the functions, so we see all of them by name, thanks to the FunctionName attribute that we added. But we also notice that each one has a route. Yes, the routes would have been based on the function names by default; however, we overrode that and put in our own routes. And there you can see how you can call each method according to its route. If we test with Postman, or any other HTTP testing tool, and I say, okay, call the get-all method: to get all shopping cart items, I will just call that URL, so you can actually just highlight it, copy and paste it into Postman, and click Send. And then you'd see here that you would get that status 200 OK, but nothing back, because we haven't put any data in. Of course, you'd also see the log get updated, "Getting all shopping cart items", and that it was executed at that point in time. Let us try to create a new shopping cart item. I would just change the method to POST, and then in the Body tab make sure that I send over some JSON: change the type to raw, and then change Text to JSON. And what did we specify for the create? We are only accepting the item name. So I'm going to type itemName (and remember to put in the quotation marks), then a colon, and then the value will just be "Test from Postman". When I click Send, let's see: you'd see it replies with the newly created ID, the timestamp, the item name, and the Collected flag, which is false.
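In other words, the request and a representative response look something like this; 7071 is the default local Functions port, the id and timestamp are generated server-side, and the exact property casing depends on your serializer settings:

    POST http://localhost:7071/api/shoppingcartitem
    Body (raw, JSON):
    { "itemName": "Test from Postman" }

    Response: 200 OK
    {
        "id": "<generated guid>",
        "created": "<timestamp>",
        "itemName": "Test from Postman",
        "collected": false
    }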
If I retry my original get-all, just open a new tab, use the same URL but with a GET, and send, then I'm seeing that I'm getting back that one item from the list. If I try my other get, this one requires the ID, so I add a slash and then pass in that ID value: copy it, paste it there, and then I should only get back one object. There we go. I think that this is working well, so let me try the update. I'm just creating tabs accordingly. So once again we have this URL; actually, I should have just duplicated the get-by-ID one, because for the update we do need that ID value. We're going to do a PUT, and then, inside of the request body, raw JSON again, I'm going to say that collected is now true: the property name, a colon, and the value true. Let me send; you see here it returns the updated object, letting you know that it is now true. So if I go back to the get-all tab and send again, then I'm getting back the updated item. Like I said, if you call these functions within the same session (I'm going to call it a session: we're executing, and we're calling one request after the other), it's updating the same list each time. So that is why you're getting those results. Of course, if we stop and restart, then that list is going to be emptied out. And it's the same with a function app once it's deployed: if you're using that list, then no problem while it's running, but once it scales or winds down and then winds back up afterwards, you just know that you're going to lose that data. So you do have to be careful with that. But later on, once again, we're going to be looking at using persistent storage, in the form of Azure Tables or any other storage medium that you prefer. So that is really it. The last one that I'm going to do is the delete. So I'll use this URL over here, and this time the method type is DELETE. We have the ID and everything; I don't need a request body. It responds with a 200 OK and no content. If I go back and try to run a query, nothing comes back. If I recreate, then I get a new ID, and then, if I try to look in there again, I see everything. That is literally all there is, quote unquote "all there is", to our REST API. How much more difficult it becomes is based on your business needs, but at the end of the day, this is how easy it is to create a whole REST API using just Azure Functions.
29. Test Functions with User Interface: Now, a fundamental part of any API's development is how it interacts with client apps. That can be a mobile app, it can be a web app, etc. In this case, I made a simple Blazor app where I wrote code to interact with the different endpoints from our Azure Functions API. While running it, we know that we have all the routes already. All I did was retrofit the FetchData page from the typical Blazor app template. And here's what it does: if I put in an item here, "test from blazor", and say Create, it will actually update. Sometimes what happens is that when you interact with the console here, it will cause it to pause; you just press Enter and it will continue. You see here it actually made the call to CreateShoppingCartItem, and then it does the retrieval, and there's our new shopping cart item. If I say "test from blazor one", and let me just make sure that the console knows that it should operate normally, when I click Create it updates, and now it's in the list. If I click the tick for Collected, you see here that it will call the PUT. Here we go: so the PUT updated it. And then if I remove (once again, I keep on interacting with it and it pauses), when I click Remove, it should remove the item. Let me try that again. There we go. So let me create a few more, and then let me delete them. Each time I click Delete, you'll see it updating the interface. So I'm just going to walk you through the code that I wrote for this. This is not really a Blazor course; I'm not teaching Blazor here, I have other courses for that. But just within this context, I'm going to show you that, from the Blazor standpoint, it doesn't know or care what the underlying infrastructure of this REST API is. It's just going to do its stuff relative to our REST API. I created a brand new project, and it was just a Blazor WebAssembly app. A simple way to do that: you just go to the solution, right-click, Add New Project, and you look for Blazor WebAssembly App. I'm using .NET 6; go ahead and create it. And then, by default, you're going to get a base address here in Program.cs, and that base address is set to where we know our function app will be broadcasting from. All right, so we know that that is the port; we would have done some testing with Postman, and we know that that is our standard API route. So that is the base address for the HttpClient being used in our Blazor app. Another thing that I wanted to point out is that we had to set up CORS. CORS is cross-origin reference... okay, sorry, here's the definition: Cross-Origin Resource Sharing. There we go. That's a protocol, or set of rules, that governs how your API interacts with resources that are not on the same server. While we were running the function app and testing with Postman, we didn't see a problem. However, when I'm sending a request from an external source, that is, this Blazor app: when it's executing, it's executing on a different port. Therefore, it's seen as a different app altogether, and the function app would actually reject it. So this is how we enable CORS on an Azure Functions app: in our local.settings.json file, we go to Host, and we open up a new object block where we set CORS to be star, meaning accept all requests, no restrictions. Of course, when we deploy it to Azure, that will be handled on the Azure side. But for local settings, we have to make sure that it is in place. You can go ahead and do that.
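With that in place, the local.settings.json ends up looking something like this; your Values section will contain whatever settings you already have:

    {
        "IsEncrypted": false,
        "Values": {
            "AzureWebJobsStorage": "UseDevelopmentStorage=true",
            "FUNCTIONS_WORKER_RUNTIME": "dotnet"
        },
        "Host": {
            "CORS": "*"
        }
    }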
I also replicated the models folder, so I just redid the models over here in the Blazor app, and I updated the namespace to ShoppingCartList.Blazor.Models, as the name of the app is ShoppingCartList.Blazor. That way you have access to the models there. Another thing that I did was update the FetchData.razor page. Or, if you wanted to make a new page, that's up to you, but I just did everything on one page, and I'm just going to walk you through the code that was written. The page title, the h1 and the p tags: that's all just semantics; you don't have to do that. Then I have the EditForm at the top, where I have an InputText which I am binding to an object called itemCreate, accepting the item name. So here in the EditForm, the Model is itemCreate, and OnValidSubmit we're going to call the method HandleCreate. Then I have the submit button for my EditForm. Now, itemCreate is an instance of the create model, and I just have an instance of that there. HandleCreate is going to just say: response equals the HttpClient call. What is Http? That's the HttpClient that I'm injecting into my component. That would have already been there, because FetchData is actually a sample of how you can interact with HttpClient, so you don't have to put that there; it should be there already. What I'm doing in HandleCreate is calling Http.PostAsJsonAsync and posting the shopping cart item. Actually, I just noticed that this is wrong: that should be CreateShoppingCartItem. It really doesn't matter in the grand scheme of things, but let's just work with it; that should be CreateShoppingCartItem. So I am posting a JSON body, a JSON serialization of CreateShoppingCartItem, and it's going to the address api/shoppingcartitem. This api/shoppingcartitem just gets appended to our base URL here in Program.cs. Then we're sending over the itemCreate object. Then we say: if it is a success code, OK or NoContent or whatever it is we're expecting (we know the different codes represent different things, and an OK response means success), then we just call and await OnInitializedAsync, which basically just redraws the entire thing. In OnInitializedAsync, we just say: items, which is an array of shopping cart items, is equal to Http.GetFromJsonAsync of ShoppingCartItem in the form of an array. And we have the same route there.
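Pulling those pieces together, the create-and-reload part of the component looks roughly like this. It's a sketch using the names from my build (itemCreate, HandleCreate); PostAsJsonAsync and GetFromJsonAsync come from System.Net.Http.Json.

    @inject HttpClient Http

    <EditForm Model="itemCreate" OnValidSubmit="HandleCreate">
        <InputText @bind-Value="itemCreate.ItemName" />
        <button type="submit" class="btn btn-primary">Create</button>
    </EditForm>

    @code {
        private ShoppingCartItem[] items;
        private CreateShoppingCartItem itemCreate = new CreateShoppingCartItem();

        protected override async Task OnInitializedAsync()
        {
            // Same route we tested with Postman, appended to the base address in Program.cs.
            items = await Http.GetFromJsonAsync<ShoppingCartItem[]>("api/shoppingcartitem");
        }

        private async Task HandleCreate()
        {
            var response = await Http.PostAsJsonAsync("api/shoppingcartitem", itemCreate);
            if (response.IsSuccessStatusCode)
            {
                await OnInitializedAsync(); // re-query so the table redraws
            }
        }
    }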
I'm not going to go all out with architecting and best practices in this particular example; I'm just showing you a quick win. If you want all of that, you can check out my Blazor courses. So, OnInitializedAsync gets all the items and puts them inside of this field, which is designated for the items. HandleCreate basically just says: post whatever data came over as JSON to this endpoint, and then go ahead and update once it's all successful. Now, the rest of it would just be me changing the weather forecast sample code that was there, using items where I'd said forecasts, and changing the table headers according to my new data. Then, for each item in items (remember, items is coming from our API call), I'm just displaying the different values. Now, for the checkbox section: I have an input which is of type checkbox. I just give it a class so it looks kind of nice. The value is going to be bound to whatever item.Collected is, and onchange, which is an event, means that whenever you change the value of the checkbox, I want to call this method, CheckboxClicked, passing in the item and passing in the value of the event. Because it knows it's a checkbox, that event is going to have data: was it changed to true or false, etc. We're passing in that value. When that CheckboxClicked method gets called, it is going to take that shopping cart item (and this could easily be, probably really should be, the update model, whatever object type we had used; there's a bit of a type clash here because this parameter is of type ShoppingCartItem). So I'm just showing you that I'm taking the shopping cart item object here, and I'm taking the check value as my event args value; we have to receive it as an object at this point. Then I formulate my UpdateShoppingCartItem object and set the value that I know needs to be set. And then I'm going to send over, as JSON, that object, calling the URL and, using interpolation, just sticking on item.Id. That's why it was important to pass over the item: in doing the update, I need data from the original item, and I can just use the ID. So I'm passing over that ID and the whole object with the data that represents the update. And then, if it was successful, we redraw. Then, for the delete, we have a simple button; give it some Bootstrap classes, and then onclick we have similar code: an event handler that we pass the item to. So we pass the whole item to be deleted, even though we really only need the ID. We're just going to say: var response = Http.DeleteAsync. Not "as JSON", because there is no JSON to be passed over, no data to parse; it's just a delete. And the URI would be api/shoppingcartitem/ plus item.Id, and then we redraw.
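Continuing in the same component, the row markup and the two handlers come together roughly like this; again a sketch with my names. I'm showing the checked binding here, which is the fix we settle on in a later lesson (my first cut bound value instead):

    <input type="checkbox" class="form-check-input" checked="@item.Collected"
           @onchange="@(e => CheckboxClicked(item, e.Value))" />

    @code {
        private async Task CheckboxClicked(ShoppingCartItem item, object checkedValue)
        {
            // The item gives us the ID for the route; the body carries the new flag.
            var update = new UpdateShoppingCartItem { Collected = (bool)checkedValue };
            var response = await Http.PutAsJsonAsync($"api/shoppingcartitem/{item.Id}", update);
            if (response.IsSuccessStatusCode)
            {
                await OnInitializedAsync();
            }
        }

        private async Task DeleteClicked(ShoppingCartItem item)
        {
            // Plain DeleteAsync: there is no JSON body to send for a delete.
            var response = await Http.DeleteAsync($"api/shoppingcartitem/{item.Id}");
            if (response.IsSuccessStatusCode)
            {
                await OnInitializedAsync();
            }
        }
    }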
That is why you get that sort of real-time look and feel to the UI: I keep on redrawing each time an operation is successful. That's really it for Blazor interacting with the API. Of course, it can get a bit more complicated based on your needs, and of course the API could be a bit more complicated. But once again, we're just taking the quick wins, and once you understand these core concepts, you can always build on them for your future operations. That's it. You can hit pause at the different points and replicate the code; I scrolled through slowly enough that you can hit pause at the different intervals and see all of the code. And I would encourage you: build more endpoints or more behaviors in the API, and build more pages or components that interact with said API. Once again, make sure that you update your base address, and make sure that you enable the CORS setting on the API. To test them simultaneously, you want to go to the solution, right-click, go to Properties, and then select multiple startup projects. You can set both to Start, or both to Start with debugging, whichever one you prefer for your debugging process. That's really it for testing using a UI.
to be going through some of the changes needed to start
talking to the database. The database that are
selected for this exercise is a 0 Cosmos DB database. We already have some
experience dealing with that. We looked at using it as an input trigger or
input binding rather. And whole, we've
just pass the data in and it will communicate
with the Cosmos DB. In this, we're going to be taking a slightly
different approach to just using input bindings
because there are certain operations
that we want to do. I think it's mixed. We can write the code three different
types of waves if we use the input binding, because based on the situation, the court is going
to look differently or we can be a bit
more consistent. So I'm going more for
consistency in this situation. And I'm going to show you
guys how you can one sets up your function up for dependency
injection to Hawaii, use Cosmos DB client for
that total injection, which can be used
all over the place anywhere else in
any other function. To get started, I want us
to go over to new gets. Just right-click your
functions project and the goal to manage
images and get this one. Microsoft built a 0 dot web jobs dot extension dot Cosmos DB. So this package actually
gives you an API for Cosmos DB as well
as Table Storage, but we will be focusing
on Cosmos DB right now. The next thing is to create a startup class known as
The next thing is to create a startup class. Now, a startup class would be similar to what you see in every .NET Core application. We have a Blazor app here; in .NET 6 it looks a bit more condensed than what you're probably used to, but Program.cs is pretty much that startup class, just condensed into one file. It basically bootstraps all of the services to be readily available for dependency injection anywhere in the application once the application is running. So that's what that does. So we create our own; I'm calling it Startup.cs. And I'm just going to show you the code and walk you through each line so that we have a full appreciation of what's happening. Just right-click your function app and add a new class called Startup.cs. You might get an error, or some feedback that you're creating a function type that already exists; that's fine. You can just erase all of that, or anything that doesn't match what I have on my screen, and correct it accordingly. We have the first annotation: assembly, colon, and we're doing a FunctionsStartup, and it's going to be typeof(Startup). You might have to include some missing references as you go along, so just do that accordingly. Our namespace is ShoppingCartList; we know that already. And then we have our class, Startup, which is inheriting from FunctionsStartup. Then we have a private static read-only IConfigurationRoot, configuration, which is equal to a new ConfigurationBuilder, where we're setting the base path to be the environment's current directory (that's the directory of the project), we are adding our JSON file, local.settings.json, with optional set to true (so we know how to access that later on), and then we're adding environment variables, and then we just Build. Then, of course, it would give you an error saying you need to implement the abstract class, which is FunctionsStartup, and doing that would generate this method stub for you: a public override void Configure. All of that is generated for you. Now, inside of that method, what we want to do is say builder.Services and add a singleton. A singleton pretty much means that one instance of whatever you're adding here is going to be accessible throughout your application. So one connection to the Cosmos DB, the same connection, will be used each time we access this dependency, pretty much, right? If you want to know more about dependency injection and how it works, you can go and research it, of course; but this is what I'm going to go through with now. So: var connectionString is equal to, and then it looks in the configuration. Remember, configuration compiled all of these elements, including our settings file. We already have that settings file, and even though the name is different, it will contextually know where to find what we're asking for: configuration, get me the CosmosDBConnection, which is what we called it in our local settings file, right? We looked at this already; I'm using the emulator, and I'll show you in a few minutes the database that we've created. So just make sure you add that key, CosmosDBConnection, with the emulator's connection string as the value. So we're getting that DB connection string, and then we're saying: if it is null or empty, then throw an exception; otherwise, return a new instance of the CosmosClientBuilder with the connection string, dot Build. So that is our startup class that we have configured, once again, so that we can inject whatever dependencies we've defined. We only defined one, which is for the CosmosClient.
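Here is the whole class in one place, as a sketch matching what I just described. Note that FunctionsStartup lives in the Microsoft.Azure.Functions.Extensions package, so pull that in from NuGet if it isn't already referenced; the namespace is whatever your project uses.

    using System;
    using Microsoft.Azure.Cosmos.Fluent;
    using Microsoft.Azure.Functions.Extensions.DependencyInjection;
    using Microsoft.Extensions.Configuration;
    using Microsoft.Extensions.DependencyInjection;

    [assembly: FunctionsStartup(typeof(ShoppingCartList.Startup))]

    namespace ShoppingCartList
    {
        public class Startup : FunctionsStartup
        {
            private static readonly IConfigurationRoot configuration = new ConfigurationBuilder()
                .SetBasePath(Environment.CurrentDirectory)
                .AddJsonFile("local.settings.json", optional: true, reloadOnChange: true)
                .AddEnvironmentVariables()
                .Build();

            public override void Configure(IFunctionsHostBuilder builder)
            {
                // One CosmosClient (one connection) shared across every function.
                builder.Services.AddSingleton<CosmosClient>(s =>
                {
                    var connectionString = configuration["CosmosDBConnection"];
                    if (string.IsNullOrEmpty(connectionString))
                    {
                        throw new InvalidOperationException(
                            "Please specify a valid CosmosDBConnection in local.settings.json or the environment variables.");
                    }
                    return new CosmosClientBuilder(connectionString).Build();
                });
            }
        }
    }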
Now let me jump over to our Cosmos DB. We already have some experience with this; this is our emulator. You can go ahead, open your emulator, and create this new database. So we have a new database I'm calling ShoppingCartItems, and then my collection name is Items. And for my partition key, I'm using the word category. So we're going to be making some changes to the models as we have them. But remember that the partition key is like a high-speed lookup: it kind of indexes your records, or your documents, quote unquote, so that they can be found faster. So we're adding a new property called Category, which makes sense: it's a shopping cart item list, so what is the category? Is it food? Is it clothing? Is it a need? Is it a want? Etc. With that, we can build some structure onto the data and speed up the lookups, speed up operations against this whole database. You can go ahead and do that. After you've created the database, let us look at the changes to be made to our models.
So the only change I'm making is really to the ShoppingCartItem, and I'm making a change to the create model as well. Let me go with the ShoppingCartItem first. One, I'm making it inherit from TableEntity, which is courtesy of this class from Microsoft.WindowsAzure.Storage.Table, which comes in with our new package. So you can let it inherit; you don't necessarily have to, but the benefit, quote unquote, of letting it inherit is that the class, or the model, will now actually have access to all of the properties that get attached whenever our document is stored. You see a lot of other meta tags; you probably noticed them from when we did the blog exercise. There are a number of other meta tags that get attached, so once it's inherited from TableEntity, we'll actually be able to parse them over. Now, I've also added a JsonProperty annotation to our Id property. This JsonProperty basically says: yes, I know I'm capital-I Id in the C# world, but in the JSON world, please name me lowercase id. The relevance of this is that Cosmos DB is looking for a property named exactly id; without it, even when we set that Id value, you will actually get a bunch of errors. You can try it yourself; I actually had a little fight with it before I remembered that it needed this. So, after I show you all the code, try to do a create without it and see the errors that you get. It will complain that it's missing the id column, even though it is there. So we just say: when you're in JSON, please identify as lowercase id, and then Cosmos DB will be able to see it and say, okay, I see the id value present. Of course, we're not allowing Cosmos DB to generate the ID; we're doing that ourselves, so that remains the same. I've also added another property called Category. Like I just said, Category is going to be our partition key. And just like with the ID, we need to let it know that in JSON it is lowercase category; this just needs to match how you named the partition key. So I'm just going to leave it like that: capital C in the C# code, lowercase c for JSON. Then, inside of the create model, I'm also putting in that Category, so when you're creating the item, you send the name and the category. And of course there are better ways to do this: you could use enums, you could use a select list, etc. But for now, we're just going to keep it simple, just so we can understand the exercise at hand. For the update model, that Category is actually kind of optional, so I'm going to take it out, and I'll call it out in the code when we get there. So the update model doesn't change. We're just adding Category here, with its JSON property, as well as the JsonProperty above the ID.
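The resulting models look something like this. It's a sketch: the ItemName, Created, and Collected properties are the ones we defined earlier in the course, and I'm assuming the client-side Guid generation we set up for the Id.

    using System;
    using Microsoft.WindowsAzure.Storage.Table;
    using Newtonsoft.Json;

    public class ShoppingCartItem : TableEntity
    {
        // Cosmos DB expects a lowercase "id" property in the JSON document.
        [JsonProperty("id")]
        public string Id { get; set; } = Guid.NewGuid().ToString();

        public DateTime Created { get; set; } = DateTime.UtcNow;

        public string ItemName { get; set; }

        public bool Collected { get; set; }

        // Must serialize to match the partition key path on the container.
        [JsonProperty("category")]
        public string Category { get; set; }
    }

    public class CreateShoppingCartItem
    {
        public string ItemName { get; set; }
        public string Category { get; set; }
    }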
It's coming along nicely; let's jump over to our function. Now, you'd notice, one, I have removed all of the static keywords. This was a public static class; I removed that because static classes can't have constructors, and we need the constructor for our injection. (Another thing: sorry, I had typed a line and just removed it, so if you started typing it in, disregard it; I apologize.) Another thing is that we can't have static methods, because we have to modify the code in there, and that means we have to use the private fields that are going in. As we explain what's going on here, I'm just showing you the error you'd get if you left a method as static. So: no static class, and no static methods. Alright, let us go through line by line and see what is happening here. So, firstly, I'm injecting the CosmosClient. The CosmosClient is coming over from our Startup; that is returning an instance, the dependency needed for us to access the Cosmos client. I'm just going to write a constructor (the shortcut for constructors is ctor, tab, tab), and then you could actually just write in the CosmosClient parameter, and then, using Ctrl+., it will offer to initialize and inject the field for you. So I'm showing you a keyboard shortcut so you can write it in quickly. We want a private read-only CosmosClient, and that gets initialized inside of our constructor. And then we have a private Container, documentContainer, and we're going to say documentContainer is equal to cosmosClient.GetContainer, with the database name ShoppingCartItems and the container name Items. If you hover over that, you'll see it asks for the database ID and the container ID; those two. With the document container, I can now access the database and the Items container and do all of the operations I need.
consistent to do with a few lines than writing in individual triggers are
binding code per method. What would have happened
is that I would have to write Cosmos DB. Then after the
initializer Cosmos DB to have the connection
string values. I'm going to do that quick
generally what it would look like. That's it. So that's what our notation would look like as
an input bindings. So when we looked at input
bindings previously, we did it with
TypeScript project, where we actually modify the
function dot js JSON file. There is no function
that JSON file that gets generated for us when we're
using the C-sharp projects. So we actually put in the
bindings and triggers right here in our function
As on rotation. So it would say Cosmos DB, what is the database name, what is the collection name, and then the connection string, which of course would just point to the name in the app settings. And then we would call
it a document client, client know based
on the situation, you might end up using
different datatypes here for getting all items. We would use the client and then the query code would
look like this. All right, So firstly, we would have to find a URI or generates a URI for
the collection. So I'll say URI or
Factory dot create, sorry, create document
collection URI, and then we give it those. So here you are seeing that I'm repeating these strings
all over the place. You'd probably want
to think about using a constant plus a constant for the
database name or concern for the
collection name, etc. Not withstanding
that, we would just pass in the database
ID and the collection. And then we would
say I document query relative to the data type
where a boat to query. And then client
documented Client dot create document query. And then we pass in the
URI as document query. And all these can be extended to the hub where clause
is equal to even say like we're and then putting in some condition by which you
wanted to find the documents. It could be that you
want to find or do like a search that would come in handy for something
like a search. Either way, this would return
all of the documents and then we'd say while the
query has more results, we would query through
them and then potentially populates at least that we
would intend to return. So we had that list of top. I've removed that also. And I'm just using
it here locally. So you would actually just
add each item to the list. That was what you were doing, were not taken that
approach I'm just showing you because it's good
to know your options. So I'm going to
comment that out, and I'm going to
comment this out. So at least when you look
at the source code later, you can see that this would have corresponded
with that code. However, because of
injected the client on top, I can now go straight and C. So firstly, I have my list
generated here are declared. And then I say var
items is equal to document container dot get item query I iterator relative to the type
shopping cart item. This is going to be as gotta go off and just get everything and they will see awhile
items has more results. We would just go ahead and add. A three-inch tool or a new list. Alright, so var
response is equal to items that are
read next async. And then we just
drop that range into our drop-down list,
into our list. And then that's
what we'll return. Come to think of it. I could probably even
this line charts. Let me see if this would work. And C items dot read. Next is seeing dots. Okay, so it's an
asynchronous soft course I would love to our width. Well then if I do
that our weights, then I would be able to do a to-do list on
what is returned. There we go. So I probably don't even
need all of this anyway. Two lines. You see, sometimes you write
something and then you realize I could have
written it a bit better. So I'm going to comment that out for now and
then we can come back and test and validate if my
refactor works as expected. But you see goal to get
the items and then we get them is in items dots
read makes the async, which basically the result isn't the items who
were up that's in our briefs or in
parentheses so that we can call the two list on
whatever is returned. All right, that is what we're returning when we
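Here's the get-all against Cosmos DB in sketch form, with the paging loop written out; note that the two-line short form reads only the first page of results, so the loop is the safer general pattern:

    [FunctionName("GetShoppingCartItems")]
    public async Task<IActionResult> GetShoppingCartItems(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "shoppingcartitem")] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("Getting all shopping cart items");

        var shoppingCartItems = new List<ShoppingCartItem>();

        // Query the Items container; no filter, so this returns everything.
        var items = documentContainer.GetItemQueryIterator<ShoppingCartItem>();
        while (items.HasMoreResults)
        {
            var response = await items.ReadNextAsync();
            shoppingCartItems.AddRange(response);
        }

        return new OkObjectResult(shoppingCartItems);
    }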
So let's move on: the get by ID. Here is where it would be like-for-like based on what we did previously, where we actually had that binding, where we looked up by the ID and the partition key, which in this case is category. So here I've modified the route to say: give me the ID and the partition key. Those are the values I'm going to use to run my query. I could actually have used the same kind of binding that we'd used previously inside of this method; that would actually help us to get the one record that matches this ID and partition key. So here's an example of what that binding code would look like: CosmosDB, with the same database name, collection name, and connection string. And then I tell it that the Id is bound to the id coming in from the parameter here, and the PartitionKey, similarly, is bound to the partition key coming in here; and then it is going to go to Cosmos DB, do the lookup, and store the result inside of a ShoppingCartItem. And then, once we have that shopping cart item, it would be as easy as returning an OkObjectResult with the shopping cart item. But like I said, I prefer the consistency, because this binding code is going to look almost entirely different from that binding code, and you have to be very careful. I'm not saying it's bad; I'm not by any means bashing the code. I'm saying that it's not my preference, but I am showing you what your options are. So I'm going to comment this out so that you can keep it as a point of reference. What I have done here instead is to say: var item is equal to await documentContainer.ReadItemAsync. Not GetItemQueryIterator, which is how we query the collection; this one says, I want one, exactly one, ShoppingCartItem, and I give it the ID, and I give it the partition key, which is the category. Of course, I likewise extended the parameter listing to include the id and the category, and I updated the route to have slash id, slash category on it. When you're trying to get, you should pass in the ID of what you're getting, as well as the category name. That is what we do. And then the item actually comes back with a status code. All right, so we say: if the status code is equivalent to HttpStatusCode.NotFound, then we return a NotFoundResult. Nice and simple. Otherwise, we go ahead and return the OkObjectResult with item.Resource. You can see that I had wrapped this in a try-catch, because I was doing something wrong when I wrote it and I needed to see the exception. You can do that if you need to, but it's really and truly not necessary here, so I'm going to remove it, just so I can simplify the code and show you exactly what is required. So that's how you would look up one record from the Cosmos DB data store.
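A sketch of that lookup, before the 404 refinement we'll get to shortly (as we'll see when testing, a missing document actually surfaces as an exception rather than a status code on the response); HttpStatusCode is from System.Net:

    [FunctionName("GetShoppingCartItemById")]
    public async Task<IActionResult> GetShoppingCartItemById(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "shoppingcartitem/{id}/{category}")] HttpRequest req,
        ILogger log, string id, string category)
    {
        log.LogInformation($"Getting shopping cart item with ID: {id}");

        // Point read: one document, addressed by id plus partition key.
        var item = await documentContainer.ReadItemAsync<ShoppingCartItem>(id, new PartitionKey(category));

        if (item.StatusCode == HttpStatusCode.NotFound)
        {
            return new NotFoundResult();
        }

        // The document itself lives on the Resource property of the response.
        return new OkObjectResult(item.Resource);
    }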
Now let's review what happens when we do our create. Alright, so what I'm showing here, really and truly, is the same kind of binding. Of course, I commented it out, but if you look at it, it's the same kind of binding, except this one would be an output binding. The earlier one was an input binding; this is an output binding, because it's going to output something to the Cosmos DB. So, essentially, it is getting the Cosmos DB collection and database (we know all of that), but then the last parameter here is of type IAsyncCollector, relative to the data type we want to process, and then the name for it. Let me put it on the same line so it looks a bit more uniform. That is the data type we have to use because we're operating asynchronously. If you weren't doing it asynchronously, then you would just do an out variable. You cannot use out variables in an asynchronous method, so we have to use the IAsyncCollector relative to the data type, ShoppingCartItem. And then, later on in the code, after we build the shopping cart item and everything, we would actually have to do something like this to add it to the collection. So let's see: shoppingCartItemsOut (I'm just naming it that because we know it's an out; it's outputting to the Cosmos DB, right?), so shoppingCartItemsOut.AddAsync, and then we pass in the item that it should be adding. That's really what would happen at the end of that operation with the commented-out code. What we are doing, however, is this: of course, we know we get the data from the body, we parse it accordingly, and then we create a new item of type ShoppingCartItem, where we pass in the name and the category that came in through our request. And then we say documentContainer.CreateItemAsync: we give it the item, and then we give it the partition key value. The relevance of this partition key value is that we are using the category that was sent over as the data, right? So when I set that Category while creating this item, we're creating it with, or setting, that value. When we did our blog example, I used the ID for both the id and the partition key; in this case, I'm showing a more practical example. Really, you'd want to slice the data based on something like a category or grouping; we're using the category as the grouping. Then, once we do that, we just return the OkObjectResult with the item, accordingly. So either one of those approaches would work: the commented-out binding code or the uncommented client code.
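The client-based create, sketched under the same assumptions as the earlier snippets:

    [FunctionName("CreateShoppingCartItem")]
    public async Task<IActionResult> CreateShoppingCartItem(
        [HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = "shoppingcartitem")] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("Creating shopping cart item");

        var requestData = await new StreamReader(req.Body).ReadToEndAsync();
        var data = JsonConvert.DeserializeObject<CreateShoppingCartItem>(requestData);

        var item = new ShoppingCartItem
        {
            ItemName = data.ItemName,
            Category = data.Category
        };

        // The partition key value must match the document's own category.
        await documentContainer.CreateItemAsync(item, new PartitionKey(item.Category));

        return new OkObjectResult(item);
    }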
Let us look at the update. Because I removed the category from the UpdateShoppingCartItem model, we're getting this error; when we get there, we can look at it. Now, I'm not going to spend time looking at binding alternatives for the PUT and the DELETE; you can go and research those. I'm just going to press ahead and show you how I have done it. So I updated the route; I needed to update the route to include the category, because the way we're doing it, we always need that partition key. So I extend the route with the category; of course, the method is PUT. So we're getting the request data and parsing it over. And then I'm going to go and fetch the item from the database, using the same code that we used in the get-single, right? This time the partition key is coming in from our parameter, category, so I need to make sure I put string category in the parameter list. And then, once it finds the item, or not, it's going to check: okay, if it found it, then we move on to item.Resource. The document data, after the call has finished, is stored in item.Resource; that is why we return item.Resource and not just the item. I don't know if I pointed that out earlier: item.Resource. And then we update whatever we need to update; here, all I want to update is the Collected flag. If you extended it to update the name or whatever else, it would be item.Resource dot that property. And then we await documentContainer.UpsertItemAsync, where we send back the item.Resource. So, after we update item.Resource, we send it back. And that is also what we're going to return, as the updated data.
going to return anyway, as the updated data. In the delete, we once
again update the root. So we're taking the category
as a partition key. And then we're just going to see a weights document container
delete item is seeing. We're seeing we're deleting
a shopping cart item. We want to delete the ID and that particular
partition key. And then we return the result. That's basically it
for this operation. All right guys, so let us
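And the delete, sketched:

    [FunctionName("DeleteShoppingCartItem")]
    public async Task<IActionResult> DeleteShoppingCartItem(
        [HttpTrigger(AuthorizationLevel.Anonymous, "delete", Route = "shoppingcartitem/{id}/{category}")] HttpRequest req,
        ILogger log, string id, string category)
    {
        log.LogInformation($"Deleting shopping cart item with ID: {id}");

        // Delete by id plus partition key; no body comes back to the caller.
        await documentContainer.DeleteItemAsync<ShoppingCartItem>(id, new PartitionKey(category));

        return new OkResult();
    }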
All right guys, so let us test it out. Anything we do, anytime we do something cool, we have to test it. I'm going to just bring up Postman, and I already did a test of the POST: I created a new item and gave it the category of fruit, and here is a successful return. We can go and do a get. So this is getting all, and I have two records in there: a sandwich and apples. One is in the category of food, one is in the category of fruit. Now, earlier, when I was testing, I was actually using needs and wants as my test categories, and what I noticed was that the lookup actually returned a 500, right? I had an ID and partition key combination that no longer exists, so that really should give me a 404, meaning "cannot find what I'm looking for." A 500 indicates that there is a problem on my server, which is not accurate. If you look over here, you'll see that it's actually throwing an exception, because the HTTP client that is making the call to Cosmos DB effectively enforces an ensure-success-status-code behavior: if it sees anything other than a 200, one of those 2xx response codes, it's actually going to throw an exception. So even though it is seeing that it's a 404, it's throwing an exception, which manifests as a 500 to the client, which we don't want. So what I've done is refactor it a bit. I'm showing you the call that is being made and that is throwing the exception. My refactor has me do a try-catch: that call goes inside of the try, and then I return the OkObjectResult. For the catch, I'm going to say: catch the CosmosException, in the form of an object called e. CosmosException is the specific exception type that is being thrown by that method. And then, when the status code is equivalent to, and we can just use the built-in HttpStatusCode enum for, NotFound, then we want to return a new 404. All right, so that is my little refactor.
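The refactored lookup, sketched; I'm writing it with an exception filter here, which is equivalent to checking the status code inside the catch, and it keeps any other Cosmos error behaving as before:

    try
    {
        var item = await documentContainer.ReadItemAsync<ShoppingCartItem>(id, new PartitionKey(category));
        return new OkObjectResult(item.Resource);
    }
    catch (CosmosException e) when (e.StatusCode == HttpStatusCode.NotFound)
    {
        // A missing document surfaces as a CosmosException, not as a status code on the response.
        return new NotFoundResult();
    }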
So I'm going to restart this and try that again. If I execute this again (let me just make sure that this is running) and press Send: now there's no indication of an exception in the log, and we're getting a 404, so that's much better. I had wanted this one as food; let me try a correct request, one that should not give a 404 or a 500. And there we go: we can retrieve the record successfully. Now let us try the PUT. Initially I had the category in the body; I'm going to take that out and just leave it at collected is true. Let us update our ID value; the ID and the partition key are needed for the PUT, so let me put in both and then update. So now you'll see that this is true. Initially, we know it was false; we can see that it is now true. The update works. Then let us delete. And once again, we need to put in our ID and a partition key for the delete. So let me just do that, and that's a 200 OK. So if I try the get again, then I only get one. So all of our code is working with our Cosmos DB. Of course, if you wanted to further validate this, you can bring up the emulator after a create and look in the emulator to make sure the record is there, and then do all of your adjustments accordingly. Play around with it, explore, see what is possible.
31. Updates To User Interface: All right guys, so we just went through quite a lot of code, setting up our application to use Cosmos DB. So let us just quickly go through some of the UI changes that have to be made. Since we updated the route information for some of our endpoints (like we changed the get to also need the partition key, stuff like that), of course the endpoint calls will need to be updated accordingly. In our client app, we can leave the root API endpoint, the base address, as is. But then, in our component, I am going to make some adjustments to the update and the delete. For the update, we had said that, yes, we take the ID, but we also want the category, and we already have access to the item. So I'm just going to say slash and add on the category value. The same for the delete: we're just going to add on the category. And that's really all there is to updating these endpoint calls. Of course, if you are extending it to have a details page, where you click to see just the detail of one record, then you'll need to consider that there as well. Now, let me just delete that piece of code and show you another update that I made. Initially I had this as value equals, and it was bound to the bool, but that should be checked: the input of type checkbox has a checked attribute, and that is what should be bound to the value. And then we have that event, and nothing changes for anything else. I'm just going to quickly test this out. Alright, so that is our user interface. I'm going to test this again: "blazor to assist with Cosmos DB", and the category would be "random". Create shopping cart item; there we go. All right. (Of course, we could always update the code so that it empties out the textboxes once something is added successfully.) Now, before, you would have noticed that, yes, it was calling the PUT, but the check value wasn't showing whether it was true or not; we just updated that. So if I check this one, let me just bring up the console window so you see that it is going to call the update. There it is: the PUT was called successfully. So if I navigate away from that page and come back, it's going to run a fresh query (and sometimes, when you interfere with the console, it pauses). But there it is, that's the fresh query, and now the check value is showing. So if I delete, we can see that the function was called, but the page is not redrawing, so I can probably revisit that. But if I navigate away and come back and allow it to run the get, then we see here that it actually was deleted. So we can update the UI code to redraw properly after a delete. The quick and easy fix for that is to modify our onclick event callback. So, instead of doing it the way that we did, with the event args, where we call the method name here, we're just going to have our at sign, open parenthesis, then an open-and-close parentheses pair: it's more like a delegate function, and we're delegating it to our DeleteClicked with the item. All right, so pretty much that will just solve the redrawing issue, because with the way that the component loads, the way we were doing it before, it would actually try to reload everything before it registered that the items list was updated.
that's really as normal material effect, I just tend to prefer lists. Both areas might
be more efficient. That's besides the point
at this, at this time, That's how you can
get that delete to redraw the page properly. But that's really
it for the changes required to our client up.
32. Section Review - Create REST API: All right, so we're at another milestone, and we're just going to review some of the major talking points from these activities. Major point number one: up until now, we had been creating one function with one file, or one file with one function. Now we realize we can have multiple functions in the same class file, which, in the case of us creating a REST API, would be a bit more efficient, in terms of not spreading all of the related endpoints or behaviors across multiple files, but just having them in one file as a single point of reference. We also looked at the fact that we can define the function names separately from the routes. With us defining the routes, we could actually mimic a RESTful API and maintain the standard it offers: we get the different operations like GET, POST, PUT, and DELETE included, and we also enhanced our routes with parameters that need to be passed in, which get automatically bound, relative to them being present in the method header. So those are some things that we have looked at. We didn't look at any security; we'll do that on the Azure side, when we are publishing this app. So all of them are running at authorization level Anonymous. One of the reasons you wouldn't want to make it Function, or even Admin, is that with each request that key has to be present. We'll look at that in future lessons; I'm just pointing out why we remained Anonymous for all of our API operations. We also looked at integrating with our data. And we saw that you could always add it as a binding, but based on how you integrate with it, the code is going to look different. So I've included the commented snippets that would kind of go hand in hand with the operation at hand. However, I took a different approach, because when we use bindings, each method, based on the type of binding we're doing, may require different data types and different ways to interact with said binding. Instead of having all of those changes all over the place, I instead went with the dependency injection option, where I'm using the CosmosClient. We set up a startup class that bootstraps the one dependency that we want, our CosmosClient, and it's getting that from the CosmosDBConnection setting, which is stored in my local.settings.json. And of course we're emulating all of that, using the emulators, at least for now. Having done that, I can now just inject it everywhere. So, originally, we would have had a public static class and static methods; I had to take out all the static keywords in order to support our dependency injection. With this document container, we can actually rely on a more (in my book) consistent way of writing the code to accomplish certain things. Now, that being said, that's why I gave you the options, because different folks, different strokes: different situations may call for different executions, right? So I'm not necessarily saying that this is the way it needs to be done; I'm just showing you the way that I prefer. Alright. So that is really all there is to our API. We also looked at the fact that, with our Cosmos DB, we have to include our partition key. And in doing that, we had to make sure that our model that represents our Cosmos DB documents has the JsonProperty for the lowercase id, and a JsonProperty name that matches our partition key name. The final thing that we did was just to update our UI, to make sure that it was reflective of the new routes and anything else. That's really it for this module. I hope that you have accomplished a lot, and I want you to feel inspired to go and try to do this on your own. You can see that the footprint is quite light: it's an easy way to start writing code and accomplishing things without all the overhead of too many project files and configurations. With all of that said and done, see you in the next module.
33. Azure Function Deployment: All right guys, welcome back. In this section, we're going to look at publishing our Azure Functions app, pretty much putting it in Azure. There are a number of nuances that we need to pay attention to. We have spoken about the hosting models before, but we're going to take a look at them again, and we're going to explore what thought process needs to be put in when we are going to be publishing our function app. So stay tuned.
34. Publish Azure Functions: Alright guys, so let us get
started with our deployment. So we're going to
right-click or a function up and go to Publish. And then we get this wizard. So we wanted to publish to a 0, we select that and then we
need to specify a target. I may have
inadvertently mentioned earlier in this course that
if you're using Windows, then you may want to
use C-sharp and then you can use other languages
for other platforms. The reality, however, is that.net is actually
cross-platform. C-sharp will work as well on Windows as it will unmarked and Linux, that's the reality. So your function up
the basic level, can be deployed on the
Windows version of this function up or on the Linux version of
this function up. Just the same way that Python, Java and TypeScript
and JavaScript on any of that language
can also be supported. You also have options
to publish them as containers with Azure
Container Registry, which can allow you to
host your Docker image. However, I'm going to
proceed with Windows. After selecting windows
weekend no view or search for Photoshop
app's already have a function up that way I'd
created from the start, but I'm not going to use that. I wanted to create
a brand new one. So I'm going to click
create a new function up. And then it's going to
ask me for the name. Someone just see a
shopping cart hyphen API, the subscription name. Of course, this one is
you may be prompted to sign in with your 0 colon, so you can proceed to do that. I'm going to leave it in one of the existing resource groups. There we go. Function of r g. If you don't have
any resource group, you can actually just click New and then creates a new name. Then we have our plan types. So remember that we have
the consumption plan, the premium plan, and
the App Service plan. So just to recap
what the plan types are and the different
pros and cons of each. The consumption
plan is the server, the spleen is the
most affordable plan. That's pros and con is that it actually has limitations with
the execution time. After about five minutes or so, the whole app will
actually wind dome. So it will deallocate resources. And then when you
make another request, it will take time
to spool up again. So of course, for testing
on for quick wins, that would be a good approach. However, for a real scenario in a production setting where you need a maximum efficiency
and maximum uptime, you would probably want
to use premium where it does not wind
down to Resources. You will have
dedicated resources at all times and you can access VNets and other
resources on network. That's a network level. Of course this costs a bit more. Your other option would be
fused and App Service plan. So maybe you have an app service running our website
already or another API. You could actually just
deploy it onto that plan. Also, I wanted to go with
this consumption plan. After you select
that, you choose your location, of course. And for me that will
be East US tool. And then you choose
your stories or cones. I only have one, so
that can remain. And then we go ahead
and click Create. And after that is completed, where lead back to this
dialog box where we can choose to run
from package file. So that's ticked and it is recommended pretty much
what it's saying is that it will actually
deploy more like a zip file with all of
the required files, the DLLs and resource
files that we need. And it will mount like an image of the app instead of the actual
files being there. With old trend to understand
the intricacies of it, I'm just going to
leave it ticked is stated that it
is recommended, so I'll follow their
recommendation. Then we can hit Next. And then here they
want us to decide. Do you want to publish
like a regular website? Before there was CICD, this is how we used to publish. It would generate
a published file. Deploy all of the files to a particular place within
the network or your machine, and then he can copy and paste
them to the destination. Or do you want to use
CI/CD with GitHub? So I've already added this project to GitHub. If you want to, you can always go ahead and do that: you can add the source control using the option that would have been in the bottom right-hand corner of Visual Studio. After you do that and it's uploaded to GitHub, then you can come back here. All right, so I'm going to use CI/CD with GitHub Actions and then Finish. So that means each time I check in a change, it will actually
trigger a deployment. Now we're led to our final screen, where it is giving us a summary of everything that has taken place. It's actually going to add this new file, which is called a YAML file. The YAML file basically has all the deployment instructions that GitHub is going to use to determine what needs to happen.
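Just so you have a feel for it, here is a trimmed-down sketch of what such a generated workflow can look like. This is illustrative only; the wizard generates the real file for you, and the app name, branch, and versions here are placeholders, not the exact values it will emit.

```yaml
# Illustrative sketch of a GitHub Actions workflow for a Functions app.
# The Visual Studio wizard generates the real one; names here are placeholders.
name: Deploy Function App
on:
  push:
    branches: [ master ]
jobs:
  build-and-deploy:
    runs-on: windows-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-dotnet@v1
        with:
          dotnet-version: '3.1.x'   # match your project's target framework
      - run: dotnet publish --configuration Release --output ./output
      - uses: Azure/functions-action@v1
        with:
          app-name: shopping-cart-api   # hypothetical function app name
          package: ./output
          publish-profile: ${{ secrets.AZURE_FUNCTIONAPP_PUBLISH_PROFILE }}
```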
We also get a preview of the URL that will be generated for our Azure app. So everything that we need, we can access live through that URL; that's where our API will be accessible. And down here we'll see some of the dependencies:
Application Insights needs to be configured.
Let's go ahead and do that. Remember that we would want
to have that for the logging. You can actually just
go ahead and click Next through this wizard
and allow it to finish. Once you've done that, it will show a green tick once it's configured. They're also showing you that they will replace the Azure Storage settings: locally, we're using the emulator, but once we deploy, it will know that it should be using the real storage account. All right, so with
all of that done, if you wanted to modify the YAML file for whatever reason, you can go to More Actions, Edit, and it's showing you what is required. Like I said, this will actually be triggered
when we do a check-in. If I open the git
changes window, you'll see that it's adding
all of these changes. And then I'm going to set a commit message, something like "deployment commit". Then I'm going to just Commit All and Sync, to save all changes and allow it to push to the repository. We can jump over
to the repository in GitHub, and we can just click that little dot that appears beside the check-in, or the commit number. So if I do that and open the details, we'll jump over to the CI/CD screen. So let's give it a few minutes to finish up. After a few minutes,
it should be done. Let us jump over
to our dashboard. Let us look at our resources. So I'm going to go
to All Resources, and I should see my new function app. There we go. Now, before we go any
further with the function, remember that we do need a Cosmos DB or whatever kind of database
you would have used. You do need that to exist in order to have the connection ready for testing. I'm going to jump over to our dashboard; I already created the Cosmos DB, but I'm going to walk you through the process regardless. So you can go to All
Services and then you just go and
search for Cosmos DB. Once you're on the Cosmos DB page, you would click Create; we're using the Core (SQL) API. Now, on this screen, you choose the resource group. So I recommend that you choose the same resource group
as your function app. Of course, you then enter an account name. Now, the naming is very strict, so you can't use any capital letters or anything. So I would have called it shopping-cart-items. Of course, that's really just an account name, so you don't necessarily have to call it that; but if you give it a name and it tells you it's not available, you just try and make it a little more personalized. And then you choose a
location that's best for you. Once again, I recommend choosing a location that is as close as possible to where the
function app was deployed. Once you do that, you can choose whether you want provisioned throughput or serverless. And if you choose serverless, you obviously won't have the options below. If you choose provisioned throughput, then you can choose whether to apply the free tier discount. Now, what I have experienced
in the past is that I wasn't able to proceed with the free tier discount on
the second or third try. So I'm just pointing that out. If you try with it applied and you end up having a failed deployment, you'd have to delete that resource, come here, and try again; in that case I recommend that you choose Do Not Apply. Of course, if it's your first time, then you can leave it on Apply. But if you choose Do Not Apply, then it will start costing you, or you can just choose serverless. You also can change the
backup policy to save costs. Since this is not a production system, you don't necessarily have to go geo-redundant; you can do locally redundant backup storage so it costs less money. But if it's production, then I recommend that you leave that on geo-redundant. Those are really the main settings that you would have to make sure are there. And what I'll do is jump over to my already provisioned
Cosmos DB account. Once you're here, you want
to go to the Data Explorer. Now, this Data Explorer looks just like what we would have experienced with the emulator. So you create your new database according to how you would have written the code, of course. And pretty much whatever you did in the emulator, you just repeat those steps to create your database and the container, as well as the partition key inside of it. All right, so with
all of that set up, we want to make sure that we get the correct connection string. So here's the primary connection string. We can copy that and then go back to our function app. So I'm just going to go over to the dashboard, jump over to the function app. Then, inside of the Configuration, we need to add that
primary connection string. Remember, in our code we set it up so that we'd look in the app settings; locally, that would have been the local.settings.json file. That's what gets recognized as the app settings.
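As a reminder, a minimal local.settings.json for this kind of project might look something like the sketch below. The setting name must match whatever your code actually reads, and the connection string value here is just a placeholder:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "CosmosDbConnection": "AccountEndpoint=https://...;AccountKey=..."
  }
}
```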
However, in a deployed setting, we have to set up the Application Settings. You'll just click New application setting and give it the appropriate name relative to what we wrote in the code, which is the Cosmos DB connection setting name; the sketch below shows where that name appears in the code.
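Just to make that concrete, here is a minimal sketch of how that setting name typically shows up in an HTTP-triggered function with a Cosmos DB input binding. The function, database, container, and route names here are assumptions for illustration, not necessarily the exact ones from our project:

```csharp
using System.Collections.Generic;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class GetShoppingCartItems
{
    [FunctionName("GetShoppingCartItems")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "shoppingcartitem")] HttpRequest req,
        // "CosmosDbConnection" must match the Application Setting / local.settings.json key.
        [CosmosDB("CartsDb", "CartItems",
            ConnectionStringSetting = "CosmosDbConnection",
            SqlQuery = "SELECT * FROM c")] IEnumerable<dynamic> items,
        ILogger log)
    {
        log.LogInformation("Returning all shopping cart items.");
        return new OkObjectResult(items);
    }
}
```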
Then you paste that primary connection string as the value here. I already did that; here it is, the Cosmos DB connection name and the value. Once you have all of that set up, then you can proceed to test. To test, we can
click on functions, and that will give us a list of all the functions that have
been deployed in this app. If I click any one of them (let's start with the shopping cart items function), here you'll see a dashboard
of all the executions. You've seen how this
dashboard looks already. If you go to Code + Test, note that you cannot write code for the function here, because it was actually deployed from Visual Studio; so any edits have to happen in Visual Studio. It's really just mounting it, and it has the bare minimum as far as the configurations are concerned. So here it's showing you that the script is really loading from that DLL; that's the compiled version of the whole source code, right? That is what we would see there. But if we were to go back
and use Get function URL, then we'd have the URL, which we can just paste in the browser to test. If you want to monitor, as we saw before, you can just go to
Monitor and go to Logs. And when that is connected and up and running, you can proceed to test. So I'm just going to open a new browser tab and paste. Here I'm getting that empty array bracket. But then if I look in the log output, I see that it called the function and there
was nothing to share. All right, so that means using Postman or any other API tool, I can actually simulate the same kinds of calls that I'm used to. So here is where we would have created a shopping cart item; instead of posting to the localhost, I'm going to replace all of that with our new URL to our Azure website and send the request. And there I have it: a 200 OK response.
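If you'd rather script that Postman check, a small sketch like the following does the same thing using the System.Net.Http.Json helpers. The URL and the payload shape are placeholders; match them to your own deployed app and model:

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Json;
using System.Threading.Tasks;

class Program
{
    static async Task Main()
    {
        using var http = new HttpClient();

        // Hypothetical payload; match the shape of your ShoppingCartItem model.
        var newItem = new { ItemName = "Bread", Quantity = 1 };

        // Placeholder URL; use your own function app's address and route.
        var response = await http.PostAsJsonAsync(
            "https://shopping-cart-api.azurewebsites.net/api/shoppingcartitem", newItem);

        Console.WriteLine(response.StatusCode); // expect: OK
    }
}
```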
So that means if I refresh the request to get all, I now see the object in the database. That's really it for publishing our function app to Microsoft Azure.
35. Deploy and Test Blazor UI App: All right guys. So in this lesson we are
going to be bringing our client app up to speed. So what I've done is I've
jumped over to Visual Studio. I have the Program.cs file, and I'm going to change this path from the localhost functions path to our deployed functions path. From the dashboard,
I'm just going to grab this URL as that is
our new base URL. And I'm going to replace this line of code. So I'll just duplicate the original, comment that out, and paste in this new address in its place. Then let's test it out: I'm just going to right-click this project and go to Debug, Start Without Debugging. When we navigate to
our fetch data page, you can see here I'm getting an error. So if I go and inspect and look in the console to see why, you're seeing that it failed to fetch; that's all it's really saying. But if you look up above that, you see that it says access to fetch at that address from this origin has been blocked due to a CORS policy. I've mentioned CORS policies earlier; CORS is cross-origin resource sharing. In other words, the two sides have to be on the same origin (the same domain or infrastructure), or the server has to explicitly allow the requesting origin, for access to be granted.
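As a side note, the local Functions host has a similar switch. When testing the Blazor client against the locally running functions, you can allow its origin through the Host section of local.settings.json; a sketch, with placeholder port numbers:

```json
{
  "Host": {
    "LocalHttpPort": 7071,
    "CORS": "https://localhost:5001"
  }
}
```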
What I need to do is jump back over to the function app, go down to the CORS policy. There we go; then add the URL. Once that is added and I hit Save, then I'll come back and try again, and notice there's no error. Now I have data coming from Azure. All right, so now that I've verified that this is working, let's jump back over
to the Program.cs. What I want to do is configure this so that, if we're in production, we use the production link, and in development, the development one. Alright, so that I can just
set it up one time. I'm going to say, give me a var baseAddress: create a variable called baseAddress and set it to be this new URI, or rather, let me set it to be this string, which is our local address. And then I'm going to say: if builder.HostEnvironment is production (this is a nice, easy way to check if you're in production when you're using a Blazor WebAssembly app), then I'm just going to set the baseAddress equal to this other string, which is our published address. Then I can have one line where I say builder.Services should get an HttpClient whose BaseAddress is a new Uri off the baseAddress value. With that done, it will be dynamic. There might be more elegant ways to do that, but this will work.
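Pieced together, the change looks roughly like this sketch. It drops into the standard Blazor WebAssembly Program.cs, where builder is the WebAssemblyHostBuilder the template creates, and both URLs are placeholders for your own local and published addresses:

```csharp
// Default to the local Functions host while developing (placeholder port).
var baseAddress = "http://localhost:7071/";

// Swap in the published address when the app runs in production.
if (builder.HostEnvironment.IsProduction())
{
    baseAddress = "https://shopping-cart-api.azurewebsites.net/"; // placeholder URL
}

builder.Services.AddScoped(sp => new HttpClient { BaseAddress = new Uri(baseAddress) });
```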
Let us publish our app. I'm just going to right-click, go down to Publish, Publish. There we go. Then I'm going to
create a new profile. So the profile that's
there is already for the function app, but this time I'm creating a new profile. I'll choose Azure, and I'm going to publish it to an App Service. And it's going to ask which App Service instance; I have a few here already. I'm not going to use any of those, though I probably could; that's fine. I'm just going to create a new one. This is where the App Service plan comes in. The App Service plan is pretty much the base; then you deploy, on top of that base, the App Service itself. I'm going to create
a new hosting plan. I'm going to call it shopping cart list and put it in the same resource group as our function app. The importance of assigning stuff to the same resource group would be that if you no longer wanted the app and the resources for this app, you can just delete the resource group and everything in it. And that's a nice, easy way to clean up. I'm just going to put
these in there also. And I'm going to place
this on a free plan: by clicking New, I choose my location, and then I can go down to the free plan. Of course, based on your context, you want to choose the plan that best suits your needs; I'm going to go with free for this demo. We're going to click Create. When that is done, I can just go ahead
and hit Next, and I will publish this with the same CI/CD setup as the function app; just hit Finish. Alright, and then this summary page is letting me know what the pending URL for my app is going to be. At that point, I can just go to Git Changes, enter a message like "setup deployment for Blazor", Commit All, and Sync. And then, if you want, again jump over to GitHub and
watch that deployment. It's done. So let's jump
over to the published site. Remember, we can just get that URL from the publish profile, or go to the portal dashboard and get it from the newly deployed app. I just scroll to fetch data just to make sure it can communicate. It cannot. Can you
guess and tell me why? Well, if you guessed that the
CORS policy is not set up, then you guessed correctly, right? Here we can see it: it's still complaining about CORS. Why is it complaining about CORS? Because that's a brand new domain name, separate from the one that we had used initially, right? I have to go back over
to the function app, jump over to the CORS policy configuration, and add this new URL. Let me just make sure I'm using the right format. Save, and enjoy the fruits of our labor: that is, our published client app, which is communicating with our function app. There are other things
that we can do with this, other configurations that we can make along the way to make this more seamless. I will discuss them
later on, but for now, we have successfully published both the API and our client.
36. Setup Security: All right, so now we're
going to discuss security. Let us say that we have our app. We want to secure the function app, and we don't want to allow just anybody to be able to access the data or call anything they want. So as you can see so far, the URLs are quite open to the function app: anybody who gets a hold of the URL can figure out the pattern, or if they've seen it at least once, they just need to put /api and then the endpoint to be able to access it. Now, there are different ways that you can secure an API. At the
very basic level, you can put on an auth key, or request that an auth key be presented in the header. The current industry standard is using JWT, or JSON Web Tokens. But the downside to that approach is that you would actually have to go and configure your function app to look for these tokens or keys in every single request, and you'd have to remember to do that for every single function. That can lead to overhead, especially when you have multiple functions and multiple operations that you're supporting. Another way would be to use the keys that we
would have looked at: the app keys, where we can set the level from anonymous, to function level, versus the admin master level. The downside of this is that once you set the key, that key has to be present in the URL, and that key doesn't change unless you manually change it. Either you're going to have to maintain changing it every time, or you're going to have to expose that key; and the key would be present in the URL coming from our requesting client app, so somebody could still be able to get that information and be able to access your API. That would make more sense at the network level, machine to machine. So if you created a function app and you had another server set up calling the function app, then you're not sending any URL traffic through a browser where it could be monitored or sniffed very easily, and the keys would be a very quick win.
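To make that concrete: calling a function whose trigger is set to AuthorizationLevel.Function means presenting a key, either as a ?code= query string parameter or as the x-functions-key header. A small sketch, with placeholder values:

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

class Program
{
    static async Task Main()
    {
        using var http = new HttpClient();

        // The key can travel in a header instead of cluttering the URL.
        // "<function-key>" is a placeholder for a key copied from the portal.
        http.DefaultRequestHeaders.Add("x-functions-key", "<function-key>");

        var response = await http.GetAsync(
            "https://shopping-cart-api.azurewebsites.net/api/shoppingcartitem");

        Console.WriteLine(response.StatusCode);
    }
}
```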
However, because we have the client-side app, I think it's better to impose security at a client-side level, or make it seem like it's at a client-side level. So when a user tries to come into the app, they will have to authenticate in order to get to the function. An easy way to get started with that is to jump over to the Authentication option in our dashboard for our function app. Then we're going to add
an identity provider. Now, this is going to be like a mini version of Azure AD. If you've used Azure AD, then you know you can use different providers, ones that support the OAuth2 standard. If you are not so familiar, you can check out my Microsoft Azure for dotnet developers course, where we go through setting that up with Azure AD, whether it's for the business or for outward-facing customers. But this is a very scaled-down version, so you don't even need a full understanding of Azure AD in order to accomplish what we're about to do. Firstly, we choose
an identity provider. Now, each one of these might require some amount of setup. Of course, for Twitter, Google, and Facebook, you're going to have to go over there and register your app and all sorts of stuff. So I'm going to choose the path of least resistance: I'm just going to use Microsoft. Then it's just going to ask me, okay, what do I want
for the app registration. Then come the supported account types; you probably want to pay a little attention here to make sure that you are going to choose the one that is best for your operation. I can choose only the current tenant, the tenant being the directory that the app has been deployed onto: any user that has access to that directory, that tenant as far as Azure AD is concerned, then only those. I can also say any Azure AD directory; so if it's multiple tenants, I can choose that. I can say any Azure AD directory and personal Microsoft accounts, and I can say personal
Microsoft accounts only. I'm going to go with this one. Once again, Azure AD does the user management, so the option that you choose is very important; if you want more details on that, you can get them in the other course. We'll go with any Azure AD directory and personal Microsoft accounts. That should allow me
to log in with even my personal, non-Azure Microsoft account if I had one. Then there are the App Service authentication settings: how do we restrict access? Here you can just read it: requiring authentication ensures that everybody will need to authenticate, while if you allow unauthenticated requests, then you'll need your own code to handle that. That's no problem; I will just require
authentication. So once you are hitting up here, that is, calling the API or the function app, it will require authentication. For unauthenticated requests, we can choose the behavior based on how you want your user experience; this will redirect to Microsoft, and we will store the token. So I'll just click Add. With all of that done
and added, or saved, we get this screen, and now we should be able to enjoy some amount of authentication. So if I try to go to the URL for the function app directly, it's just going to redirect me to authorize or authenticate. If I wasn't already logged in, then it would ask me to put in my credentials and then it
would allow me access. Here it's saying that I'm requesting permission to this API, and I would want to accept that. After accepting that, it will record my consent, and then I can get to the function app. Now, the downside of this activity is that it will stop working in our Blazor app; that's to be expected, for a number of reasons. You're going to see a CORS error, and what's happening is that it's saying that it cannot redirect to the login URL, because it's being redirected to from the website URL rather than hitting the API URL directly. So there's a lot
more configuration that needs to happen there, and that's not going to happen in this particular lesson. I really just wanted to show you how you would configure and secure the function app. Getting Blazor ready for an Azure AD configuration is outside of the scope of the Azure Functions content of this course; I'm just showing you some of the ramifications and the things that you need to look out for. You can always go back to your API and modify different aspects of this authentication
that you've added. You actually added Azure AD in a very express way, right? Instead of going to Azure AD and setting it up manually, just by adding authentication on this app you've actually told it that you want Azure AD to secure the app. And you can go back and edit the authentication: you can disable it if you don't want it, or enable it, and you can change the rules here based on what you need. You can also go over to
the identity provider and you can click and
modify other things, those things being the different aspects of the Azure AD settings that have been applied. Again, see the endpoints here: you can get to your client and your tenant IDs, sorry, your client and application IDs, and you can get to your app ID URL. If you'd need to build
out on this, you can. Like I said, though, Azure AD is not in the scope
of this course, so I'm not going to go into too many details of
what is possible. So that's really how easy it is to secure your function app, and pretty much anything else that you deploy: an App Service that you deploy can be secured just as easily.
37. Section Review - Publish Azure Function App: All right guys, so
in this section we covered quite a few things. We looked at how we can
publish our Azure Functions app that we've been working
on in Visual Studio. We also looked at
some of the changes that will be needed for both the published app as well as any other client app that needs to interact with it. We set up publish profiles for both the function and our client app, which was really just there as a demonstrative piece; it's not really the focal point, so I didn't spend too much time on it. But we were able to deploy them using CI/CD pipelines, courtesy of GitHub, and see how that makes everything easy and seamless. We also looked at some security options for our function app using Azure AD, and how everything can just flow. At the end of this, you should be able to get your Azure Functions app up onto Azure and
ready for production.
38. Conclusion: Guys, congratulations for making it to the end of
this course where you have learned all you need
to know about Azure Functions. In this course, we covered setting up an Azure Function using the portal and how we can write functions right there in the portal, plus the different types of triggers that can be used: so we can have a blob-triggered function versus a timer-triggered one, or maybe the most commonly used one, which is an HTTP-triggered function. We also looked at the fact that you have different
deployment models: the serverless consumption plan versus the premium plan versus the dedicated App Service plan, each one with its own benefits. We also went through and built
our own suite of functions, which we deployed in its own app, acting like a RESTful API. Here we were able to avoid writing much code and avoid worrying about any servers or configuring IIS. All we did was write a bunch of functions that interact with a very simple Cosmos DB database and can be called from any client app. Once deployed on Azure, we looked at how we can access them over the Internet, and how we can set up CORS policies to make sure that they are fully accessible; all we need is that asterisk to say anybody can access it. We also looked at how we can
secure it using Azure AD. Notwithstanding the
very specific example that we went through, we also explored different options for functions, like durable functions. We looked at the
different kinds of bindings that can be used. And we looked at all the
different tools that can come together to help
us get to where we are. So we looked at emulators, we looked at the core tools, we looked at Visual Studio
Code versus Visual Studio. Whatever your platform is, there is a tool that caters to you. With all that said and done, thank you for coming along
with me on this journey. And I can't wait to
see what you produce.