Splunk Basics Course | Ahmed Elakwah | Skillshare




Lessons in This Class

17 Lessons (1h 52m)
    • 1. Class trailer

    • 2. Course structure

    • 3. Installing VMplayer

    • 4. Installing Ubuntu virtual machines

    • 5. Assign Static IPs to Ubuntu machines and change default password

    • 6. Downloading Splunk and installing Apache

    • 7. Importing Fortigate Appliance

    • 8. Installing Splunk and Splunk Universal Forwarder

    • 9. Deployment types

    • 10. Configure Splunk to receive logs

    • 11. Collecting logs from remote nodes

    • 12. Configure Syslog source

    • 13. Search and explore data on Splunk

    • 14. Extract fields and add knowledge to data

    • 15. Splunk Search Processing Language (SPL)

    • 16. Creating reports and dashboards

    • 17. Creating alerts






About This Class

Machines are trying to tell us something through their logs. Logs are a very valuable resource for IT departments: they help ensure that everything is working as expected, give us an idea of what is going on in our IT environments, and help us respond faster to incidents.

In this hands-on course, we will learn how to set up a small virtual lab to simulate real-world logging and monitoring scenarios. We will collect logs from an Apache web server and a Fortigate firewall and send them to Splunk for storage, analysis, visualization, and alerting.

I selected these two log sources specifically because they represent the majority of log sources you will find in your environment, so you can follow the same steps from this course to integrate other log sources in the future.

There are more complex log sources to integrate, like logs pulled from databases, but they are not suitable for an introductory course.

After we onboard logs to Splunk, we will search and explore the data we received, then add knowledge to it by extracting interesting fields from these logs.

At this point, our logs will be ready for Splunk's Search Processing Language (SPL), which we will use to create reports, dashboards, and alerts.
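As a taste of what SPL looks like, here is a representative search of the kind built later in the course; the sourcetype and field names below are illustrative placeholders, not necessarily the exact names used in the lessons:

```
sourcetype=access_combined status>=500
| stats count by clientip
| sort - count
```

This counts server-error events per client IP in Apache access logs and lists the busiest clients first.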

This course will prepare you to dig deeper into more advanced topics of Splunk administration, such as:

  • High availability

  • Indexer clusters

  • Search head clusters

  • Deployment servers

  • Splunk Apps

  • Advanced SPL

But you have to walk before you run, so my vision for this course is to help you master the basics first and break the ice.

  • Lesson 1: Course introduction
    • A little bit about me, my story with Splunk, and why I am teaching this course.
  • Lesson 2: Course structure
    • In this lesson, we will have a quick overview of the course structure and a quick look at the lab components.
    • Resources:
  • Lesson 3: Installing VMplayer
    • In this lesson, we will download and install VMware Player as it will host all our virtual machines and virtual appliance in this course.
    • VMplayer download link
  • Lesson 5: Assign Static IPs to Ubuntu machines and change default password
    • In this lesson, we will assign static IPs to our machines and change the default password for the osboxes user.
    • Resources:
  • Lesson 8: Installing Splunk and Splunk Universal Forwarder
    • In this lesson, we will install the Splunk Enterprise software on the Splunk server machine, and the Splunk Universal Forwarder on the Apache server machine.
    • Resources:
  • Lesson 9: Deployment types
    • In this lesson, we will talk about different deployment types of Splunk.
  • Lesson 10: Configure Splunk to receive logs
    • In this lesson, we will upload a sample log file to Splunk and explore this data from the Splunk web interface.
    • We will configure required network ports on Splunk server to receive data from Syslog feed (Fortigate appliance) and from Apache server (using Splunk Universal Forwarder).
    • Resources:
  • Lesson 12: Configure Syslog source
    • In this lesson, we will log in to the web interface of the Fortigate firewall and configure it to send its logs to the Splunk server.
  • Lesson 13: Search and explore data on Splunk
    • In this lesson, we will log in to the Splunk server web interface to explore the data we sent to Splunk in previous lessons.
  • Lesson 14: Extract fields and add knowledge to data
    • In this lesson, we will extract the fields of Apache access logs using:
      • Splunk Interactive Field Extractor (IFX)
      • Regular expressions
    • Resources:
  • Lesson 15: Splunk Search Processing Language (SPL)
    • In this lesson, we will have a quick introduction to Splunk's Search Processing Language (SPL) with some examples.
  • Lesson 16: Creating reports and dashboards
    • In this lesson, we will create reports based on the data we received from Fortigate firewall and Apache web server, then we will place these reports into two different dashboards:
      • Firewall stats
      • Apache stats
  • Lesson 17: Creating alerts
    • In this lesson, we will learn about one of the most important features of Splunk: alerting. Unfortunately, it is not available in the free Splunk license, but we can use it in the trial version for 60 days.
    • We will configure our alert to be sent to "Triggered Alerts" page and also to be sent to an email.
    • Resources:


On 10-09-2022, I validated Splunk Enterprise 9.0.1 in my own test lab; the steps and instructions in this course still apply.

Meet Your Teacher


Ahmed Elakwah

IT Security Consultant


Hi there,

I'm Ahmed, an information technology professional with more than 13 years of experience in open source technology and information security at multinational companies. I enjoy operating and securing mission-critical services.

I am passionate about information security in general and I love to share my experience by creating content on YouTube and online learning platforms.

I spend most of my spare time in my home computer lab, which I am continuously trying to expand.





1. Class trailer: Hello everybody, and welcome to the Splunk Basics course. My name is Ahmed Elakwah, and I'm going to be your instructor in this course. A little bit about me before we begin: I have been in the field of information technology since 2008. I started my career as a Linux administrator, then moved through different fields within IT until I reached the information security field, and currently I am working as a security consultant. Why am I teaching this course? Six years ago, I was lucky enough to be assigned to a project where my company at the time needed to test and validate Splunk as a security and monitoring solution. I started to learn everything about Splunk from the official documentation website and the Splunk Answers website. After a struggle, I was able to build a working Splunk solution in the end, but it took me some time. I then continued my studies by joining the official Splunk courses and earned my official Splunk certificates. Since that time, I have been using Splunk on a daily basis, professionally and in my personal computer lab, and I am still learning new things about Splunk every day. I really love sharing my knowledge about Splunk with my colleagues at work, and with you through this course. I hope you enjoy the course as much as I enjoyed working on the lab during its creation. See you in the next video. 2. Course structure: Hello again everybody. Maybe you already know about Splunk, but in short, Splunk is an amazing tool that will help you get quick answers from your data or logs; it's like Google for logs, as we are going to see in this course. With Splunk, you can visualize your data by creating nice reports and dashboards.
And with the alerting feature, you can keep an eye on your infrastructure easily, by receiving emails, or tickets on your ticketing system, when certain conditions are met. After finishing this course, you'll be able to set up Splunk from scratch by following the simple tasks we go through during the course. Let's talk about these steps in detail. The course has four sections: an installation section, where you are going to set up the lab environment and install Splunk; a second section, where we are going to ingest data from different sources; a third section, where we are going to explore and search this data to make use of it; and a final section, where we are going to create reports, dashboards, and alerts. Here is the distribution of our lectures. In the first section we'll have three lectures on installing Splunk; then we will talk about deployment types and how to configure Splunk to receive logs. In the second section, we are going to collect logs from remote nodes using the Splunk agent, and then we will configure a syslog source. In the third section, we will search and explore the data on Splunk, extract fields and add knowledge to our data, and talk about the Splunk Search Processing Language. In the last section, we are going to talk about reports, dashboards, and creating alerts. Our lab setup is very simple: we need VMware Player to run the VMs (it's free for personal use), two Linux VMs, for Splunk and the Splunk agent, and a Fortigate virtual appliance. Here we have all the lectures and the lab diagram. In the first lectures, we are going to install Splunk on this node, and we'll talk about deployment types. Then we'll configure Splunk to listen on one port to receive logs from remote nodes, and on another port to receive the syslog feed. We are going to collect logs from remote nodes, which in our case means this machine that has Apache installed; Apache writes its logs to these files.
And we'll install the Splunk agent to read these logs and send them to our Splunk instance here. In this lecture, we will configure a syslog source here on the Fortigate appliance, to send its logs to Splunk. Okay, after these five lectures, we have our data sources ingested into Splunk, so here we can log in to Splunk and start searching our data. That requires that we extract fields, to say, for example, this field is the source IP, this field is the hostname. Then we will speak about the Splunk Search Processing Language, which will help us write efficient queries to find data and to create reports. And we get to reports and alerts in the last lectures. That was a quick summary of the course structure. See you in the next video. 3. Installing VMplayer: Hello again everybody. In this lecture we are going to prepare the lab. We will start by downloading and installing VMware Player. Let's go to Google and type "VMware Player download". We will select the first result; here it is, the VMware Player download page, and we'll click Download. The file starts to download. The download is complete, and we'll click on the file to start the installation. We need to restart the machine to install a required Microsoft package, so we'll click OK and reboot the machine. Let's click on the installer again. We will click Next, agree to the license agreement, and keep the defaults, except that I will uncheck this box so as not to send telemetry information. Next, Next, and Install. The installation is complete, and we'll click Finish. Now in our applications we have VMware Player installed; let's click on it. I already had a machine on my computer, so the installer detected it and added it here in the list. The next step is to add the Ubuntu machines to VMware Player. 4. Installing Ubuntu virtual machines: Hello everybody. In this lecture we are going to install the two Ubuntu virtual machines. We have two options.
The first option is to go to ubuntu.com, download the ISO file, and install the operating system by following the normal steps you would use to install any operating system. But that way takes a lot of time until you have the operating system installed. The other option, which I discovered recently, is called OSBoxes. This website contains already-installed operating systems as virtual hard disks: you just get the hard disk, add it to a virtual machine, and you have the operating system up and running in minutes. I'm going to go for the VMware images, as we are using VMware Workstation Player. I will scroll until I find Ubuntu, click on the VMware (VMDK) image for this Ubuntu release, click on VMware, and then click Download. The download should start soon; I will cancel it because I already have this file downloaded. Let's extract the downloaded file. The extraction is complete. As we are going to have two machines in this course, the Splunk server and the Apache server, let's copy the extracted hard disk a second time so that we have two hard disks. Let's have a look inside this folder: we have this VMDK file, which is a virtual machine hard disk, and it has a size of 60 gigabytes. We create another copy for the other machine, so we have two files now. Let's rename them: the first one is Splunk server, the second one is Apache. These names are important because when we create the virtual machine, the hard disk name should be the same; we will see this now. Then we will go to VMware Player and click on Create a New Virtual Machine. We will select the option "I will install the operating system later" and click Next. The operating system is Linux, and the version is Ubuntu 64-bit. This is the machine name, so let's give it a name: Splunk server. And this is the location where the files of this machine will be stored, so choose a place where you have enough space.
So this is the location where I'm going to store the files of this machine, and I will click Next. It asks for the disk size for this machine, but we are going to delete that disk anyway, so let's give it one gigabyte and click Next. Before we finish, let's customize the hardware and remove unnecessary devices like the CD-ROM. The defaults for memory and processor are okay for the lab, but I will change them; then I'll click Close and Finish. So now we have the new machine created. Let's do the same for the Apache server: go to Home, click Create a New Virtual Machine, select "I will install the operating system later", Linux, Ubuntu 64-bit, give it a proper name, which is Apache server, change the location, and click Next. Customize the hardware as you like; I could keep the defaults, but I prefer to remove the unnecessary devices. Then we close and finish. Now we have two machines, but they are empty: they don't have an operating system installed. So we will go to the files of these machines, which are stored here. This is the first one, Splunk server, and it has an empty hard disk, ready for an operating system to be installed. We are going to remove this empty disk and replace it with the one we downloaded from OSBoxes. Let's do this step, making sure that the file names are exactly the same. So this is Splunk server: we delete the empty hard disk and replace it with the downloaded one. We'll do the same for the Apache server: we remove this file and replace it with the one we downloaded from OSBoxes. Okay, now we have two machines. The next step is to test that the machines boot successfully, so let's start the virtual machines; we do this by double-clicking on the machine here. The machine is starting to boot... and it booted successfully. Let's log in; the default password for this machine is osboxes.org. So now we have an Ubuntu machine ready to use in our course. We'll do the same for the Apache server machine.
We launch VMware Player again and double-click on Apache server. I'll click on Take Ownership. I think we got this error because of a leftover lock file, so let's remove this file and start the machine again. The machine is booting now, and we have another machine ready for us to use; the default password is osboxes.org again. So now we have two machines up and running. In the next lecture we are going to do the initial configuration for the machines, like changing the default password, assigning static IPs, and testing connectivity between the two machines. See you in the next video. 5. Assign Static IPs to Ubuntu machines and change default password: Hello everybody. Now we are going to do the initial setup for the two Ubuntu machines we installed in the last video. This is the Splunk server machine and this is the Apache server machine. Let's launch the terminal and add it to favorites, so now we have it here in the favorites. Currently we are logged in with the osboxes user; we can confirm this with the whoami command. To make changes to the machine, we need to switch to the root account; we can do it with su. The default password is osboxes.org. Then we can change the password for the osboxes user and give it any password you want. So the first thing we did is change the password. The next step is to find the IP of this machine with ifconfig. By default, in new Ubuntu versions, these commands are not available, but I like to use the old commands, and there is a package called net-tools that contains all the old network configuration commands, so let's install it. Now if we type ifconfig, it works. So this is the IP: we have a dynamic IP that the machine got when we booted it, but for the lab setup we need static IPs. Let's also check the default gateway assigned by VMware Player; we can type route -n.
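The initial-setup steps just described can be summarized as a short root session; this is an illustration of the lesson's commands, not a script to paste blindly (the user name and default password come from the OSBoxes image):

```shell
su -                        # switch to root; the default password is osboxes.org
passwd osboxes              # set a new password for the default osboxes user
apt-get install net-tools   # provides the classic ifconfig and route commands
ifconfig                    # show the current, DHCP-assigned IP address
route -n                    # show the default gateway assigned by VMware Player
```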
And this is the default gateway. To assign a static IP for this machine, we can do it from the GUI for the moment. We click here, then click on IPv4 and Manual, and put an IP in the same range; let's give it the address ending in .10, plus the netmask, the default gateway, and the DNS. We apply this. To apply the new IP, we have to disable the network interface and enable it again: we turn it off, then turn it on again. The IP has changed. Let's check whether DNS is working or not; DNS is still working, and we can access the internet. Now let's go to the Apache server machine and do the same as we did: launch the terminal and add it to favorites, use su to switch to the root user, type the default password, and change the password for the osboxes user. We also need to download the net-tools package. To download packages on Ubuntu, you use the apt-get command and then install plus the package name. If you don't know the package name, you can do apt-cache search with, for example, ifconfig, or Google it. When we search like this, one of the results we get is net-tools, so we take this one and run apt-get install net-tools (you can paste into the terminal with Ctrl+Shift+V). The package is installed; now I can type ifconfig and see the IP the machine got automatically. We need to change it to a static IP, so we'll do the same: go to the wired connection, then the wired settings, click here, select IPv4 and Manual, and give it a static IP; let's give it the address ending in .11, and the DNS is the same .2 address. Now we still have the old IP, so we turn the interface off and on again. Now it's connected; let's check if the IP changed. We can access the internet, so DNS works. Let's try to ping the other machine; good, we can ping between the machines. Also, we can do a quick test to see if the network ports are accessible between the two machines.
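A handy way to run that port test is Python's built-in web server, as used in the lesson. The sketch below stays on one machine (loopback) so it is self-contained; in the lab you would run the server on the Apache machine and browse to it from the Splunk server. Port 8123 here is arbitrary; the lesson uses 8000.

```shell
mkdir -p /tmp/porttest && cd /tmp/porttest
echo "connectivity ok" > index.html
python3 -m http.server 8123 --bind 127.0.0.1 &   # serve the current directory over HTTP
SRV=$!
sleep 1                                          # give the server a moment to bind
# fetch the file back over the network port, as the browser does in the lesson
RESP=$(python3 -c 'import urllib.request; print(urllib.request.urlopen("http://127.0.0.1:8123/index.html").read().decode().strip())')
kill $SRV
echo "$RESP"
```

If the port were blocked or the server not listening, the fetch would fail instead of returning the page.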
There is a nice Python module that will help you launch a simple web server in any directory, like this. I am on this machine; let's go to the /tmp directory, which has these files in it. We run python3, then -m with the module name http.server, and then the port we want to listen on. Now this machine is acting like a web server on port 8000. Let's go to the other machine, the Splunk server machine, to check if we can access this web server, just to test the network connectivity and the port. On the Splunk server we launch the browser and type the IP of the other machine (the address ending in .11) and the port, 8000. As you see, we can access the files on the other machine. Okay, so from a network point of view everything is working fine. At this point we have done all the initial configuration for the Ubuntu machines; they are ready for us to install Splunk on the Splunk server machine and Apache on the other machine. See you in the next video. 6. Downloading Splunk and installing Apache: Hello everybody. In this lecture we are going to download and install Splunk on the Splunk server machine, and the Splunk forwarder on the Apache server machine. We go to splunk.com; if you don't have an account, create a new one by clicking on Free Splunk and following the steps. I have an account already, so I will log in. Then I will scroll down and click on the free 60-day trial download. In the first 60 days you have all the features of Splunk available, like high availability, clusters, and all this stuff; after 60 days it drops to the free license. So let's click here. Notice the URL: we are on the Splunk Enterprise download page. If we click on Free Splunk instead, it sends us to the same location, so it doesn't matter; it's the same thing.
We are going to install it on Linux, so click here. Among the 64-bit versions there are the .tgz package, the .deb for Ubuntu and Debian distributions, and the .rpm for Red Hat and CentOS. I prefer the .tgz because it's generic and can be installed on any Linux; I don't like to rely on distribution packages here. We could click Download Now, which starts the download automatically, but I prefer to download directly to the server, because our machine has internet access. So I will select Download via Command Line (wget), and we need to copy this command: click here, it selects everything, then press Ctrl+C. We copy everything and go to our machine, the Splunk server machine. We log in with the password we changed, go to the /tmp directory, and paste the command we got from the website. It is downloading now. Splunk Enterprise goes on the Splunk server, but on the other machine, the one that has Apache installed, we will install the Splunk agent that will read the logs and send them to Splunk. So let's explore where on the site to get the agent. Here you see Download Universal Forwarder, so let's click on this one. We need the Universal Forwarder for Linux, so we click on Linux, and again we click on the .tgz package. I'll click Download; you see the download of the Universal Forwarder starts automatically, but I will cancel it as well and click on the option to get it through wget, then Ctrl+C. Then we go to the Apache server machine, log in with the password we changed, go to the /tmp directory, and paste the wget command; it starts downloading. Let's check the other machine: the package is now downloaded; we have a .tgz file in the /tmp directory of the Splunk server. Let's have a look at the Apache server: the same, we have the Splunk Universal Forwarder package, and it is a .tgz package as well.
At this point we have the Splunk server package downloaded and the Splunk agent downloaded. The next step is to install the Splunk services, but as installing Splunk will take a while, let's install Apache quickly on the Apache server machine first. This is a pretty simple step: we run apt-get install apache2. Apache installed successfully. Let's start the service with its init script and check the status: it's running. Let's check from the other machine; web servers normally listen on port 80, so from the other machine we browse to the Apache server's IP. We get the default Apache page, so Apache is installed successfully, and now we are ready to move to the next step, which is installing Splunk and the Splunk agent. See you in the next video. 7. Importing Fortigate Appliance: Hello everybody. This is an evaluation package for the Fortigate virtual appliance; I will give you the link to download this file. It's a trial version, so it expires, I think, after one month, but it's enough for testing our syslog feed. Let's extract this package and open the file. We have an OVF appliance; OVF is a way to package a virtual machine as an appliance, so if you double-click on it, it will automatically add the machine to VMware. Let's double-click on this FortiGate VM64 OVF file; it launches with VMware. We will accept the license agreement and click Next. It asks us for the location where we want to save this VM; I will save the appliance files into this directory, keep the name as it is, and click Import. Now it's importing the files into our VMware. The default user is admin, and by default there is no password, so you just press Enter. Now we are logged in to the Fortigate appliance; to release the mouse from the VM, you press Ctrl+Alt.
Now what we want to do is assign a static IP to this appliance so we can access it from the web interface. To do this, we need to run a few commands; I will put all these commands in the notes so you can find them. Type config system interface; now we are in the interfaces. We will edit port1, because this machine has many ports: type edit port1, then set mode static (I press Tab to autocomplete commands). Then we give it an IP with set ip, using an address in the same range we used for the other two machines, the Splunk server and the Apache server. We used .10 for the Splunk server and .11 for the Apache server, so let's give this one .12, with the netmask. Then we type end to save the configuration. We also need to configure the default gateway; we do this by running config router static, then edit 1 (1 is the entry number), and under entry 1 we set the gateway with set gateway (I press Tab to autocomplete), using the default gateway we know from the other machines, the address ending in .2. Then we try to ping the Fortigate appliance from the Splunk server: it's not reachable. Let's check whether the network adapter is set to NAT or not: go to the virtual machine settings; this is port1, and we need to set the adapter to NAT to be able to see the other machines in our lab. Let's click OK and try again: yes, it works. Now let's check if we can access the web interface of this machine at the .12 address; nice, we can access the machine. Let's change the password: it's usually a good practice to change passwords from the default ones, and to keep track of them using a password manager. See you in the next video. 8. Installing Splunk and Splunk Universal Forwarder: Hello everybody. In this video we are going to install the Splunk server software on the Splunk server machine.
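For reference, the FortiOS CLI session from the appliance setup in lecture 7 looks roughly like this; the 192.168.x.* addresses are placeholders for the lab's VMware NAT range, so substitute your own:

```
config system interface
    edit port1
        set mode static
        set ip 192.168.x.12 255.255.255.0
    next
end
config router static
    edit 1
        set gateway 192.168.x.2
    next
end
```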
Before we start, we need to mention something about best practices for installing services in general. We are going to create a user called splunk and install the service under this user, because it is not recommended, and not good security practice, to install services under the root user: root has the highest privileges on the machine, so if someone manages to compromise the service, the attacker ends up controlling the machine with the highest privileges. The best practice is to install services like Splunk under the lowest-privileged user possible. We start by creating a splunk user; I will put all the commands we use in this video in the notes so you can copy them and paste them directly into your lab machine. We use the useradd command: -s to set the Bash shell for this user, -d for the home directory (I like to put all services under /opt, so under /opt we'll have a directory called splunk for the Splunk service), -m to create that home directory, and then the username. Let's run this command; we have created the user with its home directory. Let's switch to the splunk user, the one we just created, and confirm it by typing whoami. Then we go to the /tmp directory, where the packages are, and extract the .tgz package; we can just run the tar command with the xzf options and the package name, which extracts it in the same location. We have the package extracted here: if we type ls, we see a new directory, which is the extraction of the package, and inside it are the files of the Splunk package. But we need to move this directory to /opt/splunk, and you can do this in one command: instead of extracting the package and then moving the extracted files to /opt/splunk, you can extract the package directly there, with the command we are going to see now. So let's remove the folder we extracted and extract the package directly to /opt/splunk.
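The service-account creation described above can be sketched as follows; it needs root, so treat it as a transcript of the lesson's commands rather than a tested script:

```shell
useradd -s /bin/bash -d /opt/splunk -m splunk   # bash shell, home at /opt/splunk, create it
su - splunk                                     # switch to the new low-privilege account
whoami                                          # confirm which user we are now
```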
We run the tar command with the xvzf options, then the package name, and -C plus the target directory where we want to extract the package. Because we gave the v option, we can see the progress of the command and which files are being extracted at the moment. Now the package is extracted; we can double-check by going to /opt/splunk, where we find all the files; this is the home directory of our user as well. If we press Ctrl+D, we log out from this user and switch back to root. To switch back to splunk, we use su - splunk, and pwd shows that we land in /opt/splunk, the home directory of the user, where we have the Splunk files. The next step is to start the Splunk service. We do this by going to the bin directory inside /opt/splunk, where the binary files are stored, as for any Linux package, and running the splunk binary with the start argument. The first time you run the service you get the license agreement; you can press Space to scroll through the pages, and we say yes. Then it asks for the administrator username you are going to use when you want to make changes to the service; let's make it admin, with a password of your choice. It should be a complex one, and you should save it in a password manager. Now the service is starting for the first time, and it waits for the web server at the local IP on port 8000. The service started; let's double-check by going to 127.0.0.1:8000 in the browser. This is the Splunk interface; we log in with the admin credentials we added during the installation, and now we are looking at our Splunk instance for the first time. This is the main interface of Splunk; we will talk a lot about it in the next lectures. Now we have Splunk installed; let's go to the Apache server to install the Splunk agent, to make it ready for the next lectures where we'll configure it, and to finish all the installation work in this lecture. So let's go to the Apache server machine.
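The extract-directly-to-target trick (tar's -C flag) can be demonstrated with a throwaway archive; the file names below are dummies standing in for the Splunk .tgz package:

```shell
mkdir -p /tmp/tgzdemo/src /tmp/tgzdemo/opt
echo "splunk-files" > /tmp/tgzdemo/src/README
tar -czf /tmp/tgzdemo/pkg.tgz -C /tmp/tgzdemo/src .   # build a small dummy .tgz
tar -xvzf /tmp/tgzdemo/pkg.tgz -C /tmp/tgzdemo/opt    # extract straight into the target directory
cat /tmp/tgzdemo/opt/README
```

The archive's contents land directly under the -C directory, with no intermediate move step.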
Here we are going to do almost the same steps we did to install the Splunk server, but we will change the home directory of the agent, because the package is called splunkforwarder, not splunk. We run useradd with -s for the bash shell, -d for the home directory of the service, which is /opt/splunkforwarder, -m, and then the username. Now we have created the user with its home directory. We will do the same as before: switch to the splunk user we just created, go to the /tmp directory, and run the tar command to extract the splunkforwarder .tgz file into the home directory under /opt. Everything is extracted to the home directory; let's double-check. Here we have everything in the home directory (pwd shows the current directory). We will start the Splunk forwarder service as well: ./bin/splunk start, press Space to scroll through the license pages, accept, and set the admin username and password again. So now we have the Splunk agent installed as well. Everything is ready for us to start configuring the Splunk server and the Splunk agent to read the logs and ingest them. See you in the next video. 9. Deployment types: Hello everybody. Before jumping to the configuration part, we need to talk in this lecture about the different deployment types. We have the standalone deployment and the distributed deployment. In a standalone deployment, like the one we are building in this course, we have everything in one Splunk node; in a distributed deployment, the functions of Splunk are distributed between different nodes, as we are going to see. In the standalone deployment, all the functions of Splunk run on one node: the input arrives at the node, the same node parses the data and extracts the fields, indexes the data and stores it in Splunk's database, and handles searching; everything runs in one node. This is good.
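The forwarder install mirrors the server install; a sketch with a placeholder package name (the archive's top-level directory is splunkforwarder, matching the home directory we set):

```
# As root: service user whose home is the forwarder's install path
useradd -s /bin/bash -d /opt/splunkforwarder -m splunk

# As the splunk user: extract into /opt, then start and accept the license
su - splunk
tar -xvzf /tmp/splunkforwarder-9.x.x-xxxxxxxx-Linux-x86_64.tgz -C /opt
/opt/splunkforwarder/bin/splunk start
```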
In some cases, if you are in a small company and you don't need high availability, you can go for a standalone deployment. But it is not good for heavy use: if many team members are using Splunk at the same time, a standalone deployment is not practical. Standalone deployment is good for testing and for proofs of concept. If you want to explore Splunk, or convince your management to use it, you can create nice dashboards and reports without an external server or a virtual machine; you can install it directly on your laptop. So it is good for proof of concept, for personal use, and for learning. If you want to learn Splunk, as we are doing in this course covering the basics, a standalone deployment is perfect for our case. In a distributed deployment, the functions are distributed across several layers. At the input layer we have multiple heavy forwarders: if one heavy forwarder goes down, we still have the others, so this layer gives us high availability and load balancing. The next layer is indexing, where the data is stored. We have multiple indexers, and there is replication between these nodes: data is copied between them, so if one indexer goes down, the data is still on the other nodes. For the searching layer, we have search heads; we can add as many search heads as our search load requires. If we have many users, we add more. User searches run on these search heads, and we can achieve this by putting a load balancer in front, so all requests come to the load balancer and are then distributed between the nodes. If we have more data, we add more indexers, as many as we wish, and the same goes for every layer. So we can scale horizontally: just add more servers.
Then we can handle any amount of data and search load. As an example environment: we have one heavy forwarder here and a second one here, and there is a syslog feed coming from all the devices in our network. A load balancer distributes this feed between the two heavy forwarders, and the heavy forwarders in turn load-balance across the indexers that store the data. If one indexer is down, all data goes to the remaining node, which is another layer of high availability, and each indexer replicates its data to the other and vice versa, so at any time we have a copy of the data on both indexers. At the search layer, a search head can reach the data on either indexer, and in front of the search heads we have a load balancer to distribute user sessions between them. In this course we are not going to cover these advanced features of Splunk, because we need to focus on the Splunk basics first before going further into high availability concepts, search head clusters, and indexer clusters. I hope this clarifies the different deployment types of Splunk; see you in the next video. 10. Configure Splunk to receive logs: Hello everybody. In this lecture we are going to explore the web interface of Splunk, and then we are going to upload a test log file. After that, we are going to configure Splunk to start receiving logs, as described in the lab preparation, on two ports. The last step in this lecture is to configure indexes. Indexes are the locations where data is stored in Splunk. For example, if you have firewall logs and Apache logs, you can create an index called firewalls for all the firewalls you have in your network, and another index for the web servers.
At this point you just need to know that an index is the location where data is stored in Splunk. The main benefit of indexes is that you can apply certain rules per index, like retention policies: if you want to keep firewall logs for five years, you can apply that without affecting the web server logs. Let's start by running the Splunk server virtual machine. I will launch the virtualization player and double-click on the Splunk server. Once the machine has started and is maximized, we log in with the osboxes user (we set this password in the first part), and I will launch a terminal. My goal now is to check the IP of this machine so I can access the web interface, so I run ifconfig, and this is the IP. To copy it I press Ctrl+Shift+C, and I go to a browser on my host, my laptop, and open the IP on port 8000. It will not work, because the Splunk service is not running. As you remember from the first section, we mentioned that the Splunk service runs as the splunk user, and if we type whoami now, we see we are accessing this machine as the osboxes user, so we need to switch to the splunk user. During the creation of the splunk user we didn't create a password for it; that's not needed, because we can switch to it directly from the root account. We switch to the root user using sudo su (su stands for switch user) and type the osboxes user's password. Now that we are root, we run su splunk to switch to the splunk user. Now we are the splunk user, so we start the Splunk service with ./bin/splunk start. As you will see, port 8000 is for the web interface and port 8089 is for management traffic between nodes. The web server is waiting, and Splunk has started successfully. Let's switch to the browser and refresh; we have the Splunk interface loaded, and we log in with the admin user we set in the beginning. This is the Splunk web interface; under Settings you will find everything you need to configure the system.
Most things can be done from the web interface, but some special configuration should be done from the command line; that is out of the scope of this course. Let's start by uploading a sample log file. We do this by going to Settings, then Add Data, then scrolling down to upload files from your computer. We can drag and drop the file here or upload it via Select File. I have a sample log file I downloaded from the internet: I just typed "Apache log file sample" and used the first result. Let's drag it and drop it here. The file is uploaded; click Next. As you can see, the file is loaded into Splunk and we can explore it here. We will keep the defaults and click Next, which takes us to the index where we want to save this data. We will create a test index called test, just for testing, then save; we keep the defaults, review, and submit. The file is uploaded successfully. If we click Start Searching, Splunk automatically creates a search string to explore this file. It added all these details: the source is the filename, the host is the server name, plus the index and source type. It's nice to know all this, but it's not important now. If we just type index=test, it brings us everything in the test index: we have the log lines, and as you can see the time range selector is set to All time. So now we have managed to import a log file into Splunk. Next, let's configure Splunk to receive logs from external sources, because uploading logs like this is good for proofs of concept, but it doesn't work like this in real life. In real life you have syslog feeds, and logs are sent to Splunk automatically. So we are going to configure the ports in Splunk to accept connections from the different log sources. We do this by going to Settings and then Data Inputs.
As we mentioned in the lab preparation, we have a syslog feed from the FortiGate firewall, which is a syslog source, a UDP feed. So we click on UDP and then New Local UDP. We type the port, 514, the default syslog port, the protocol is UDP, and we click Next. The two most important settings here are the source type and the index. The source type is the identifier for this log feed: anything coming in on UDP port 514 needs a tag assigned to it so we can identify this log source. In our case we click New and name it fortigate, so any log feed coming from this port will have the source type fortigate. We will see the importance of this source type in the next lectures, because we will use it to identify the log feed and then extract the fields from it. For the index, we create a new index and call it firewalls, then click Save and Review: input type UDP, port 514, source type fortigate, index firewalls. Click Submit... and we get an error. This is because we are running the Splunk service as the splunk user, not root, and as a non-root user you are not allowed to bind ports below 1024; port 514 is below that number. There is a trick to work around this using iptables. So let's go back and give it the port 1514 instead, then Next, and repeat: source type fortigate, index firewalls. We click Review and Submit. Now we have configured Splunk to receive logs on port 1514. But the problem is that on the FortiGate you cannot change the port: by default its syslog uses 514. The trick we are going to use is at the Linux layer: we will make use of iptables, since we are running on Linux. The traffic we receive from the FortiGate will arrive on port 514, the default syslog port, and since we cannot change this port on the FortiGate device, we will keep it as it is.
The FortiGate will use 514 to send logs to us, and at the kernel level of our Linux server we will redirect this traffic to port 1514, as we configured in Splunk. It's a trick that lets us keep the default syslog port on our sending device, the FortiGate. I will put this command in the notes section so you can copy it from there. Let's continue: we go to our Linux machine, and we need to switch to the root user. Pressing Ctrl+D logs us out from the current user, splunk, and returns us to root (or you can type logout). Now, as root, I will paste the command. It executes successfully, and we need the rule to survive a reboot, so we save it with iptables-save. Now the rule is saved; it's a workaround, and we have configured the syslog port to receive the FortiGate logs. Next is to configure the port to receive the Apache logs. We go to Settings, and this time to Forwarding and Receiving, because in this case we are receiving logs from the Splunk Universal Forwarder. Click Configure Receiving and then New Receiving Port, and we will listen on port 9997. Notice that we were not asked for an index; it's not like the process for the syslog port, so we are going to create the index ourselves. We click Settings, then Indexes, then New Index, and give the index a name; we can call it web. Click Save and keep the defaults for the moment. Now we have the indexes created: the firewalls index we created earlier and also web. So we created the indexes and enabled the port. That's all for this lecture; see you in the next video. 12. Configure Syslog source: Hello everybody. In this lecture we are going to configure our second log source, the syslog feed coming from the FortiGate firewall.
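The transcript points to the notes section for the exact command, so it isn't shown here; a typical iptables rule that achieves what's described (run as root; the persistence path is an assumption and varies by distribution) would be:

```
# Redirect inbound syslog (UDP 514) to the unprivileged port 1514
# where the Splunk UDP input is listening
iptables -t nat -A PREROUTING -p udp --dport 514 -j REDIRECT --to-ports 1514

# Persist the rule across reboots (on Ubuntu, the iptables-persistent
# package reads rules from this file)
iptables-save > /etc/iptables/rules.v4
```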
Let's start by going to VMware Player to start the FortiGate machine. In VMware Player I will double-click on the FortiGate VM; it is starting now, and I'll maximize it. In VMware Player you press Ctrl+Alt to release the mouse: if you try to move the mouse and it doesn't move, press Ctrl+Alt to release it. The FortiGate appliance is up and ready; we reset the password in the lab preparation section, so we log in with the admin user and the password we used. Now I'm logged in to the FortiGate appliance. We set the IP in the lab preparation section, so we can go directly to the web interface; we don't need to access the shell. I press Ctrl+Alt to release the mouse and go to the browser. I copy the IP, paste it in the browser, and this is the web interface of the FortiGate appliance; we log in with the admin username and the password we set. As you notice, it's a test appliance; we don't have any firewall rules configured. To send some traffic we go to Log & Report, and this applies to any security device you have: you can apply the same idea to proxy servers like Blue Coat, to any firewall device, or to any IPS; it's that simple. That's why I wanted to have a syslog feed in this basics course, because it's a common log source. Then we go to Log Settings and scroll down, and we enable "Send logs to syslog". We type the IP of the Splunk server, and by default the port is 514, which we will leave as it is. Let's see which logs we are going to send: for testing purposes we want everything, just to generate some traffic, so we select All. I click Apply. What happens now? The settings are saved successfully, and the FortiGate will send the logs by default to port 514, the default syslog port, over UDP. As you remember from the previous lecture, we did this trick to get the logs into Splunk even though they arrive on port 514.
The kernel uses iptables to redirect this traffic to port 1514, where Splunk is listening. So now let's go to Splunk, type index=firewalls, and press Enter. As you can see, we are receiving the logs from the FortiGate appliance; the trick works fine. That's all for this lecture. 13. Search and explore data on Splunk: Hello everybody. At this point we have managed to onboard our two log sources for this course, the Apache web server logs and the FortiGate appliance logs. This is the hardest part of the process, because it requires cooperation between different teams: you need someone to open the firewall flows between the devices and Splunk, to configure the devices to send the logs, and to install the agent on many servers. It takes some time in real life, but once you have the logs in Splunk, you can start exploring them and creating your reports and alerts based on your use case. So let's log in to Splunk and explore our data. On the Splunk main page we click on the Search & Reporting app on the left-hand side. On the search page we have the search bar and the time selector. In the search bar we tell Splunk in which index we are looking for data. I will type index=* since, if I don't know the index name, this searches all indexes. With this query I got no results, because we are searching the last 24 hours and we don't have logs from the last 24 hours, so I'll click the time selector and choose All time. Now we are retrieving all the logs in the Splunk instance; as you can see, this number of events is the total number of log lines we have in the instance at the moment. If we scroll down, we find the index field. It contains three values: firewalls is the index for the FortiGate logs (we defined it in a previous lecture), test is for the sample log file, and web is the index for the Apache web server logs, and for each index we can see its number of events.
If we scroll up again, you can see a field called sourcetype; we already defined the source types in previous lectures. It is important because it differentiates the feeds: we know that any log line with the fortigate source type is coming from the FortiGate device, access_combined was assigned to the uploaded log file, and apache_access is assigned to the Apache web server logs. The importance of the source type is that future configuration will rely on it: for example, if you want to extract the fields from a certain feed, you rely on the source type, as we are going to see in the next lectures. Let's click on apache_access; it is added to the search bar automatically. Let's make it more precise: we can put index=web, since we now know the index name, and click Search again. These are the logs from the Apache web server; at the moment we have 115 events. What can we do right now to explore the data? We can do free-text search: we can type any string to search for. If we type test, we have one event with the test string as a request to the web server, and if we search for 123 we have a request with that string as well. If we click on the arrow to list the fields of an event, we find that fields are not extracted from it. So you cannot make full use of these logs and you cannot create reports, because we should have a field called client IP assigned to this value, and a field called HTTP method assigned to the GET method here. At the moment you cannot fully utilize your data, because the fields are not extracted yet, and that's what we are going to do in the next lecture. At this point Splunk is working like a search engine for your logs: you can search for any string, but you cannot create reports. See you in the next lecture. 14.
Extract fields and add knowledge to data: Hello everybody. In this lecture we are going to extract the fields from our logs. If we scroll down, there is a feature of Splunk called the field extractor; let's click on Extract New Fields. The extraction we are going to create will be assigned to the source type, so it will be applied to any log line that has it. To start the extraction, we select one log line as a sample and click Next. We then have to choose whether we are going to use regular expressions or delimiters for the extraction. Apache logs don't have a single delimiter, so we will go with a regular expression. Delimiters can be used for CSV files, or if your logs are separated by a delimiter: if you have a comma, a dash, or a space as a delimiter, you can use delimiters, which is much easier; but in our case we'll use a regular expression for Apache. I'll click Regular Expression and click Next. We start by selecting the field with the mouse, like this, and giving it a name: this is the client IP. We can apply the same to the whole log feed by clicking Add Extraction. As you can see below, the client IP is extracted from all log lines correctly. Then we go to the next field. At the moment this field is empty; that's why we have a hyphen here. If we select it like this (I think for Apache logs it's called the identity field) and click Add Extraction, it may not work if we receive a value in this field in the future, because what Splunk is trying to do here is create the regular expression for you, and it may not be accurate. So let's continue: selecting a field this way sometimes doesn't work, because there are limitations in the regex the Splunk engine tries to create for you. I'm going to remove this field, and in a moment I will show you how we can do it properly. But if we continue like this, we can select this field and name it timestamp, and then I select the GET method and call it method.
This is the URI, or document location, so we give it the name uri. This is the HTTP version: name it version and add the extraction. This is the status code, this is the response, sorry, the request size, this is the referrer, and this string is the user agent; in our case it's the browser, and in the other log lines it was the curl tool we used to query the web server in previous lectures, which is also a user agent. Click Add Extraction. Let's check below: we successfully extracted the fields from these two log lines, but the regular expression automatically generated by Splunk didn't manage to extract the fields from those other log lines. We could continue by creating one extraction for this type of log line and another extraction for those log lines, which is an option if you are not very familiar with regular expressions, but it's not the best approach. We could click Next and Save, but I prefer to create the regular expression myself, because it will be more accurate and it will extract all the fields. So we click Show Regular Expression: that's what Splunk created for us to extract the fields. We can click Edit the Regular Expression and type our own regular expression here. I'm using an online regular-expression tester for field extractions; only use it if the data is not confidential, because you will copy a log line there as a sample to extract from. In our case we will copy the log line and paste it there. If the data is confidential, take care, because it's a cloud service: you would have to use an offline tool or not use it at all; there are also offline plugins for regular expressions. For demonstration we will use this tool and start typing our regular expression. To create a quick regular expression to extract the fields, we can start typing like this: the caret means we are starting from the beginning of the line, and then we take the first word until we reach a space.
So we've captured this field, and in regex syntax, to name an extraction (a named capture group), we put brackets around it with a question mark and angle brackets and give it a name: client_ip. As you can see here, client_ip now equals this IP, so we have extracted the first field. Then we type a space and extract the next field with backslash-S plus (\S+, one or more non-space characters); we select this field as well and give it the name identity, then a space. \S+ selects the next hyphen, or whatever value is there, and the name is assigned to it, so you see here that the user equals the hyphen and identity equals the hyphen. As a quick test, if we replace a hyphen with a real value, it is still selected successfully. So let's go back. Next is the timestamp. We want to skip the opening square bracket; we don't want the square bracket to be in the field, so to escape it we type a backslash and the square bracket. Then we want to capture everything before the closing square bracket, and we do it like this: as you can see, we selected this part, and we assign the name timestamp to it; as you can see below, it's extracted. We continue: we are now stopping before the closing square bracket, so we escape it as well by typing a backslash and the square bracket, then a space. We also want to skip the opening double quote, so backslash and the double quote, and we select GET as the method and give it the name method. Then a space, and we select everything up to the next space as the uri. Then, since we want to skip the closing double quote as well, we escape it too, and before that we select this part: this is the version. We are now stopping before the double quote, so we escape it, then a space. We select this field as the status code, then the next field as the size. A space, then we escape the double quote, and we stop capturing before the closing double quote like this.
We select it with brackets: this is the referrer. Then a space, and we escape the surrounding double quotes the same way and select everything between them: this is the user agent. As you can see, we have successfully extracted all the fields with this regex. I'll press Ctrl+A to select it and Ctrl+C to copy it, go back to Splunk, delete the generated regex, paste ours, and click Preview. As you can see, all log lines are now extracted successfully, so our regex is better than the one generated by Splunk. We can click Save and give it a name: I will remove the generated one and call it apache_access (you can give this extraction any name). I will make this extraction visible to all applications, readable by everyone, and writable only by admin, and I'll click Finish. Let's click Explore the Fields, select All time, and clean up the query a little: remove this and keep the index. Now, as you can see, we have the fields extracted: we have the client IP, we have the identity, and at the moment we have the method (only GET; we don't have POST, for example), we have two referrers, two statuses, and many URIs from the trials we did using curl. We have three user agents: curl, the Chrome browser on my laptop, and the browser of the Ubuntu machine itself. If you click the arrow on any of the events, you can see the fields: the method, the request size, the status, and the rest. At this point you can start creating reports from your data: for example, you can list the top client IPs, list the top request sizes, or do some math on the request size. This is a very important step after you onboard your logs: extracting the fields. Let's have a look at the FortiGate logs: we remove all this and type index=firewalls, since we know the index of the FortiGate logs.
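The finished expression isn't printed in the transcript, but a regular expression for the Apache combined log format built field by field as described above looks like the sketch below; the group names are my reading of the audio. The short shell snippet uses grep -P (GNU grep with PCRE) just to confirm the pattern matches a sample line before pasting it into Splunk; the log line itself is a made-up example:

```shell
# A hypothetical Apache combined-format log line
log='127.0.0.1 - - [10/Oct/2023:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 2326 "-" "curl/7.68.0"'

# PCRE with named capture groups, one per field as in the lecture
regex='^(?P<client_ip>\S+)\s(?P<identity>\S+)\s(?P<user>\S+)\s\[(?P<timestamp>[^\]]+)\]\s"(?P<method>\S+)\s(?P<uri>\S+)\s(?P<version>[^"]+)"\s(?P<status>\S+)\s(?P<size>\S+)\s"(?P<referer>[^"]*)"\s"(?P<user_agent>[^"]*)"'

# Check the pattern against the sample line
if printf '%s\n' "$log" | grep -Pq "$regex"; then result="match"; else result="no match"; fi
echo "$result"
```

Splunk's field extractor accepts the same (?P<name>...) named-group syntax, so a pattern verified this way can be pasted in unchanged.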
As you can see, the fields of the FortiGate logs were extracted automatically by Splunk, because Splunk is smart enough to notice that FortiGate log lines are key-value pairs. As you can see here, date equals this value, so Splunk knows this is the field name and this is the value. Because they are key-value pairs, extraction is easy: devname is like this, type equals traffic, and action equals deny. Key-value pairs are extracted automatically by Splunk, so if we click the arrow here, you can see all the fields are extracted and we don't need to create anything ourselves. It's good for us when all log feeds look like this, and the same goes for JSON objects: if you receive logs as JSON objects, Splunk will extract them automatically, because they are key-value pairs too. So for the FortiGate we don't have to do extra work. In the next lecture we are going to create reports and dashboards using the data we received from our log sources. See you in the next video. 15. Splunk Search Processing Language SPL: Hello everybody. In this lecture we are going to talk about the Splunk Search Processing Language. It is one of the strengths of Splunk, because it allows you to type queries to retrieve your data and to apply logic to the data you are receiving, as you are going to see. Let's click on the Search & Reporting app. Anything we type here in the search bar is SPL, the Splunk Search Processing Language. When we say index=web and select All time, we get all the logs. Now we have a field called status: if we say status=100 we get nothing, because we don't have any status with 100, but if we type 200 we get only five events out of the 115. With this SPL we managed to use the fields to limit the results we get from Splunk. There are also commands that give us statistics about our data. To use these commands, we type a pipe first and then the command we want to apply: for example, the top command followed by a field name.
Let's use the client IP, for example: the top command gives us the top client IPs in order, and if we click on Visualization, it is visualized like this, or we can select a pie chart. Let's try another thing: we can also use Boolean expressions. If we type curl here as a free-text search, it gives us all the log lines with the curl string, but we can also put NOT in capital letters before it; the color changes because it's a keyword in SPL. If we press Enter, it brings us ten events, and none of these events contains the curl string. Let's check the firewalls index, which has more logs. For example, here we have a destination port and I want to remove duplicates, so I type a pipe and the dedup command followed by dst_port, the destination port field. If I press Enter to search, we get only 13 events, because we removed the duplicates. Then I add another pipe and type table dst_port: with this query I know we have only these unique ports in our logs. These were the basics of SPL, the Splunk Search Processing Language; now let's use it to create some reports and dashboards. See you in the next video. 16. Creating reports and dashboards: Hello everybody. In this lecture we are going to talk about reports and dashboards in Splunk. Let's start with the firewall logs, so we set the index to firewalls. We type | top action, press Enter, and click Visualization. If we are satisfied with this report, we click Save As, then Report, and give it a name like Top Actions, plus a description, and choose the content: we can include only the pie chart, the results with the pie chart, or only the results; I'll pick the pie chart only. If we want to have the time range picker in the report, to change it from All time to the last 24 hours or whatever, I will include it as well. Then we click Save and then View.
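Collected in one place, the searches walked through in this lecture look like the sketch below; the FortiGate field name dst_port is my reading of the audio, so confirm it against the field sidebar in your own instance:

```spl
index=web status=200
index=web NOT curl
index=web | top clientip
index=firewalls | dedup dst_port
index=firewalls | dedup dst_port | table dst_port
```

The first two filter by a field value and by a Boolean free-text exclusion; the rest pipe the matching events into the top, dedup, and table commands.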
As you can see, we have a report called Top Actions with the time selector here; we can say Last 7 days and it reruns the report. If we click here to go back to the main page, we can get to the reports by clicking the Search & Reporting app and then Reports. Now we have the default reports plus the report we just created, Top Actions; if you click on it, it runs the report for us. Now let's click on Search again, type index=firewalls, select All time, and do the same: pipe, then top action. But this time I'm not going to save it as a report; I will save it in a dashboard, so we can start building a dashboard with many reports inside it for monitoring. Click Visualization and select the pie chart, then click Save As and select Dashboard Panel. We have the option to add it to a new dashboard or an existing one; this is our first dashboard, so I select New and give it a name, Firewall Stats. I'll make it shared so anyone can see this dashboard, and it's a pie chart. I click Save and then View Dashboard. Now we have our first dashboard with one report. Let's add more reports to this dashboard. We return to Search and create another report, again over All time: we type | top src_ip and click Visualization; I prefer to change it to a column chart. I click Save As, then Dashboard Panel, and this time select Existing, pick Firewall Stats, and click Save; I will not view the dashboard yet. Let's add more reports, exploring the logs first to find the fields. There is also a command called rare; it's the opposite of top, and it's good for security monitoring: for example, searching for rare ports in your logs may indicate suspicious activity. So we try | rare on the destination port: as you can see here, we have only one event with this port, and many events with the other ports; a nice metric to have in a dashboard as well.
And we can remove this percent column: type pipe, then fields, then minus percent, and it will remove the column. We can also limit the number of results by typing limit equals 3, for example. Now we have a table with three entries only, and we can click Save As, Dashboard Panel, pick Existing and firewall stats, give it a title like Rare Destination Ports, and click Save. Let's refresh the dashboard so we can start editing it. We can click Edit here, move this panel here and give it a title, Top Actions; here is the Top Source IPs, and this one is the Rare Destination Ports. And I'll click Save. So we created a quick dashboard for these metrics. For example, if you are working in a security operations center, you might have a dashboard like this on the screen to monitor the activity on firewalls; this is just an example. We can also add a time picker for the dashboard, because at the moment the time selected is All time. Click Edit, then click on Add Input and click on Time. Then we go to every report here: if I click on the magnifier to modify the search and click here on Time Range, I will select Shared Time Picker (field1), which is the time picker we added, select it, and click Apply. For now it finds nothing, because the picker is set to the last 15 minutes. Do the same for all reports. Also, you can change the visualization for any report if you click here, and if you want to add a title for the y-axis and x-axis, you can do this from here as well. Let's save this and select Last 7 days; it starts searching, and we've got our data. Now let's go to the Apache logs to create a dashboard for them as well. I'll type index equals web, over All time. We have the client IP; what other fields do we have? We have the URI and the user agents. So again, the first report: we can query the top client IP, run the search, and create it as a pie chart. And I'll click Save As, Dashboard Panel and New, and it will be called Apache stats.
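Putting the fields and limit options together, the rare-ports panel becomes a trimmed three-row table. This is a sketch using the same assumed field names as before:

```
index=firewalls | rare limit=3 dstport | fields - percent
```

fields with a minus sign removes the listed columns from the results, and limit=3 caps rare at three rows, which keeps the dashboard panel compact.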
I'll click on Shared and Save, and I will not view the dashboard yet. Then let's create another report, on the status codes, which is important if we are operating a website: if all users are getting 400 errors, it means something is not working on your site. So it's a good metric; depending on your use case, it should be 200 at the top, not 400. From a report like that we would know something is wrong on our web server, for example. And I'll click Save As, Dashboard Panel, Existing, and Apache stats. We'll give it a quick title at the top and click Save. Let's explore the logs again. We can create a report about rare user agents, because this is also important: if someone is attacking your web server, they will usually not be coming from a normal browser; they may use Python, a library, or curl, or whatever. So searching for rare user agents is a good use case as well. Let's select only two entries with limit equals 2, and let's remove the percent column by using the fields command. Then click on Save As, Dashboard Panel, Existing, Apache stats, give it a name, and click Save. Now let's click here on Dashboards to check our dashboard. The first one here gives you more information about Splunk itself, so I will skip it. We click on Apache stats, and let's organize our dashboard. Click Edit, then let's put this report here on top; I want them in the same row: the Top Client IPs, the Rare User Agents, and the Top Status Codes. And let's add a time selector and assign it to all the reports here, the shared time picker, and apply it. Let's search over All time. You can also click here to pick the dark theme. Let's save the changes and click Refresh. We managed to quickly create a simple dashboard from the Apache logs. Those were the basics of reports and dashboards in Splunk, and see you in the next video. 17. Creating alerts: Hello everybody.
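The three Apache panels described above can be sketched as the following SPL searches. The field names (clientip, status, useragent) are assumptions based on how the fields were extracted earlier in the course; Apache access-log extractions often name them slightly differently (for example user_agent):

```
index=web | top clientip

index=web | top status

index=web | rare limit=2 useragent | fields - percent
```

Saving each search as a dashboard panel into the same Apache stats dashboard gives the three-panel layout built in the video.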
In this lecture we're going to talk about alerts. An alert starts with a query: if a certain condition is met, it triggers the alert. Let's search on our web index, and let's create an alert that fires if the URI contains a certain string. We have the URI field, so I can say URI equals slash test, and it gives us one result; at the moment we are searching over All time. So let's consider this our condition: if someone accesses slash test on our web server, it will trigger an alert. How do we do this? We click on Save As, then on Alert, and give it a title, something like Test Folder Access. We'll share it in the app and configure it as Scheduled; we could run it every hour or every day, but for testing purposes I will use a cron schedule to force it to trigger a minute from now, so I'll set that in the schedule. The time range is All time at the moment, because we don't have current data; in real life you would make it something like the last hour, because you are monitoring new events coming into Splunk. We also set the trigger condition to fire when the number of results is greater than zero, which means if we have one event or more it will trigger, and I'll select Trigger Once, not For Each Result. For the action, if we click on Add Action we have many options; for now we will use Add to Triggered Alerts, with severity Medium. Let's set the cron minute to the next minute and then click Save. The triggered events appear under Activity, then Triggered Alerts. After waiting for the minute, I click Reload, but it hasn't fired yet. Let's click Reload again. Now we got the alert: Test Folder Access, severity Medium. You can click here on View Results, and we have the matching event. Now let's modify the alert to send an email as well.
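A sketch of the alert's base search and schedule, assuming the field is named uri and the monitored path is /test (both taken from this lab; your extraction may name the field differently):

```
index=web uri="/test*"

cron schedule (example): 27 * * * *   runs at minute 27 of every hour
trigger condition:       number of results > 0
trigger mode:            once per run (not per result)
```

A standard five-field cron expression reads minute, hour, day of month, month, day of week, which is why setting only the minute field forces the alert to run at a specific minute for testing.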
One note: the alerting feature is not included in the free Splunk license. At the moment we are using the trial license, which gives you access to all features for 60 days; after 60 days, the alerting feature will be disabled. The free license also gives you 500 megabytes of indexing per day, so you can use Splunk for free if your data does not exceed 500 megabytes per day. So let's click here on Alerts; we'll find the alert and modify it to send an email as well. We click Edit, then Edit Alert, go to the actions, click Add Action, click Send Email, and I'll type my email address. I'll select everything I want included in the alert email, and I will click Save. So it's saved, but there is one more thing: we need to configure the mail gateway on Splunk itself. In a company, for example, you would put your mail server's gateway in the settings, as you are going to see now; in this course, I will use Gmail as my mail gateway to send emails. So we go to Settings, then click on Server Settings, and click on Email Settings. In Mail Host we will use the Gmail host (I'll put all these details in the notes, so you can copy and test them), we will enable TLS, and for the username you will use the email account that you use to log in to Gmail and send emails. Whatever the destination, Splunk will use this Gmail account to log in to the Gmail service. And I'll click Save. So now, if an alert triggers, it will be sent through Gmail to the email address added to the alert. Let's configure the alert to trigger again: go to the Search & Reporting app, click Alerts, and edit it. But first, one thing needs to change in the Gmail account we used for authentication: Gmail will block the access from Splunk, because it is considered a less secure app that uses an insecure authentication method. To allow Splunk to send emails via Gmail, we go to this URL (I'll put it in the notes, so you can copy and paste it).
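As a rough guide, the email settings for a Gmail relay look like the sketch below. The host and port are standard Gmail SMTP values; the account details are placeholders, and note that Google has since retired the "less secure apps" option in favor of app passwords, so a current setup would use one of those instead:

```
Mail host:       smtp.gmail.com:587
Email security:  Enable TLS
Username:        your-account@gmail.com
Password:        <your account or app password>
Send as:         your-account@gmail.com
```

With these saved under Settings, Server Settings, Email Settings, the Send Email alert action can relay messages through the Gmail account.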
And we will allow less secure apps; now we will receive an email when an alert triggers. Let's go back and trigger the alert: I will click Edit Alert, change the cron minute to the next minute, and click Save. Now, when the alert triggers, we should see it both on the Triggered Alerts page and in Gmail. Let's click Reload on the Triggered Alerts page... and we've got the alert here on this page. Let's check Gmail. Yes, we received the alert, Test Folder Access. Click on it: we got the search string, we got the event, and also a PDF with the results. As I said before, in a real company you would put your own mail gateway server in the configuration, which is easier than the Gmail setup. That's all about alerting. Thanks for watching.