Python Project - Building Web Scraping Bot With Python | Shubham Sarda | Skillshare


Python Project - Building Web Scraping Bot With Python

Shubham Sarda, Python Developer


Lessons in This Class

    1. Course Introduction
    2. Web Scraping Project Demo
    3. How Do We Scrape Data?
    4. Web Scraping - Overview
    5. Project Setup + Installing Libraries
    6. Working With BeautifulSoup
    7. Filtering Required Data
    8. Extracting Current Price
    9. Extracting Table Content - I
    10. Extracting Table Content - II
    11. Extracting All Stock Result
    12. Working with Static File
    13. Storing Stock Content in CSV File
    14. Sending Text Mail Through SMTPLIB
    15. Sending Text Mail Through Email Module
    16. Sending Attachment Through Email Module
    17. Integrating Mail System with Web Scraper
    18. File Name According To Today's Date







About This Class

Welcome to Building Web Scraping Bot With Python and Beautiful Soup. Web scraping is about downloading structured data from the web, selecting some of that data, and passing along what you selected to another process.

This course will help you learn Web Scraping fundamentals through a real freelancing job posting. We will follow everything step by step with Beautiful Soup and the Python email module.

Throughout this course, you will learn about:
- Fundamentals of Web Scraping
- Working with Static Files (CSV, XLS) - Storing Data
- Working with Email Automation (Sending auto-generated mail through a Python script)
- Extracting stock data from a finance website through web scraping
- Ethical practices that should be followed during web scraping

After completing this course you will be ready to expand your knowledge of Automation and Python.
Enrol now, and I will make sure you learn Web Scraping fundamentals thoroughly!

Meet Your Teacher


Shubham Sarda

Python Developer


Hey there!

I've created a complete roadmap to become a developer, with different projects, specifically for the Skillshare audience.

With this roadmap, we will start with Python programming, learn the basics and important concepts, and solve some real-life problems by building projects. Once we are ready with Python, it's time to gain more experience with different field projects in Automation, Data Analysis, GUI Programming, and Web Development.

Here is Complete Python Programming - 
Python A-Z: Learn Python Programming By Building Projects
Python Programming - Advanced Concepts

Python Projects - 
Python Project - Building Twitter Bot With Python and Tweepy
Python GUI Programming - Build a Desktop Application with Tkinter and SQLit...

Level: All Levels




1. Course Introduction: Hey guys, welcome to Web Scraping and Automation for absolute beginners. In this project, we will learn to extract data from a finance website, store it in a CSV file, and mail it to a predefined email address. I've built this project according to a real job posted by a client on a freelancing platform. This project-based tutorial is simple and quick. Start learning web scraping fundamentals with Beautiful Soup and Python, dealing with smtplib and other email modules. It's a pretty fun course, you will learn a lot of stuff, and to begin with, all you need is basic Python knowledge. Enroll in the course now; I'm really excited to see you inside.

2. Web Scraping Project Demo: Hey guys, welcome to this quick demo lecture. Before understanding more about web scraping and other stuff, let me quickly give you a brief idea about the requirement. The client needs certain data related to different stocks daily and has to forward it to a higher department. So we need to extract that stock data, store it in a CSV file, and then mail it to a predefined client email address. Basically, we're going to write a script that does all these steps in one go. If I talk about the demo, this is our current code base. Let me run it: as soon as I call this file, it extracts all our required data, saves it in a CSV file (you can see the CSV file being created in the background), and mails it to the defined email address. I hope now you understand the requirement. From the next lecture, let us start our task and talk about Beautiful Soup, one of the most important libraries to start our web scraping journey. I hope you are excited. Thank you for following this lecture. See you in the next one.

3. How Do We Scrape Data?: Hey guys, welcome back. Now let us understand: how do we scrape data?
The first thing you should know is that every website has a structure, and that structure is in the form of HTML. If I open any webpage online, it has a structure in the form of HTML. Let me open the HTML source code for this particular page: all you have to do is right-click and click on View Source. This source code contains all the information for that particular page. So during our process, we are going to download the source code and then extract data from this particular page. That means if I want to extract the title of this page, all I have to do is search in this HTML source code, and here you can see we have the title. Maybe I need some other information, but the main point is: the page is so big, how am I going to extract the correct information? That's where tags and attributes come into the picture. Every HTML page has tags and attributes that define its information. Let us talk about this with a different example. Here is a sample page, and if I want to extract information from it, I need to get its HTML source code, which has all the information. So if I want to extract this particular paragraph, all I have to do is download the source code and tell my scraper to check for a div tag with an attribute called class with a value of "panel", and my scraper will do its job. We'll be talking about this in the future also. The other thing you should be aware of is Inspect Element. It also gives you the source code directly on your screen, with a developer tool. If you want to access it, just right-click anywhere and click on Inspect. Here, in this Elements view, you will get the whole HTML source code, well formatted. If I want to extract the text of this particular title, all I have to do is search for a div, then an attribute called class, then an h1 tag, and then extract the text. So this is how we are going to proceed.
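The tag-and-attribute targeting described here can be sketched with Beautiful Soup on an invented sample page (the HTML below is made up for illustration; real pages are far larger):

```python
from bs4 import BeautifulSoup

# A tiny stand-in for a real page's HTML source (hypothetical markup).
sample_html = """
<html>
  <head><title>Sample Stock Page</title></head>
  <body>
    <div class="panel"><p>Quarterly results beat expectations.</p></div>
    <div class="footer"><p>Unrelated footer text.</p></div>
  </body>
</html>
"""

soup = BeautifulSoup(sample_html, "html.parser")

# Target a tag by its attribute, exactly as described above:
panel = soup.find("div", attrs={"class": "panel"})
print(soup.title.get_text())   # the page title
print(panel.p.get_text())      # the paragraph inside the "panel" div
```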
This was some basic information regarding HTML and its structure. Most of you might be familiar with this already; if not, don't worry, we'll be covering it. The next thing is the Python libraries which we are going to use: we'll be using Requests, and the second one is Beautiful Soup. Later on, according to our requirements, we'll be downloading others. I hope now you understand this basic HTML structure; it's something we are going to work with throughout the course. If you are not able to follow yet, don't worry, once we start working on the web scraper, you will be familiar with it. I'll give you a simple, basic understanding: every HTML file starts with an opening html tag and ends with a closing html tag. In between, we have a head, which can contain the title of that particular page and some other meta information. Then we have a body, which can contain paragraphs and different other tags. So this is a basic HTML file. If you know about HTML, then that's great; if not, that's fine too. From the next lecture, let's start the coding part and work on web scraping. I hope this lecture was helpful. Thank you for following. See you in the next one.

4. Web Scraping - Overview: Hey guys, now it's time to start our task with web scraping. Web scraping is basically the extraction of data from any online source. In the current era, we depend highly on data: whatever we do, we generate data, and we depend on data. All the digital-world websites depend on data for their advertising, their market strategy, their product development, and everything else, so it is really important for them to collect it. From new startups to Fortune 500 companies, data is something which everyone needs, and this is why web scraping is quite important. In this course we are going to use Beautiful Soup. There are other Python libraries that can be utilized, like Scrapy or Selenium, but we will focus on Beautiful Soup.
We will be scraping data from finance.yahoo.com; basically, we're going to extract prices for different stocks. Before moving forward, I really want to talk about the ethical way, because web scraping is also used unethically: tracking competitors' prices, extracting content from different blogs, and other such activities. Remember, whenever you are going to scrape data from a website, check whether that website provides an API or not. Also, don't forget to check its robots.txt. All you have to do is append /robots.txt to the website link, and there the site will list all the links that are not allowed to be scraped. If I consider finance.yahoo.com, they mention all the links that we cannot scrape. robots.txt is available for each and every website; so suppose you are scraping data from Google, make sure you visit google.com/robots.txt, and here you can see all the disallowed links, which means you cannot access them. This is something important which most of the audience misses. The next thing is our project setup. All you have to do is create a folder (I have named it pyscraper) and then create a .py file (I have named it scraper.py), and use your favorite editor. You can use Atom, Visual Studio Code, PyCharm, or any other; every editor is going to work fine. From the next lecture, let us install all our required libraries and start our work with web scraping. I hope this lecture was helpful. Thank you for watching. See you in the next one.

5. Project Setup + Installing Libraries: Hey guys, welcome back. Now let's start our work with the scraper. The first thing we need to do is install these two libraries, Requests as well as Beautiful Soup. All you have to do is open your terminal or command prompt in the folder in which you are working, and then call this command: pip install requests. And once that is done: pip install beautifulsoup4.
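A robots.txt check like the one described can be sketched with Python's standard-library robot parser. The rules below are invented for illustration; a real check would download the site's actual robots.txt instead:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; a real check would fetch, e.g.,
# https://finance.yahoo.com/robots.txt and read the rules from there.
rules = """
User-agent: *
Disallow: /private/
Allow: /quote/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# can_fetch tells us whether a given user agent may scrape a given URL.
print(rp.can_fetch("*", "https://example.com/quote/GOOG"))    # allowed
print(rp.can_fetch("*", "https://example.com/private/data"))  # disallowed
```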
Once both libraries are installed, we can start working with scraper.py. The first thing I need to do is import both modules: import requests, and then from bs4 import BeautifulSoup. These are the two modules that we are going to use. Requests is used to obtain the information in the form of HTML, and Beautiful Soup is used to parse that information. In the previous lecture I talked about the HTML source code: Requests downloads that HTML source code, and then, with the help of Beautiful Soup, we can target different tags and attributes to extract information. I hope that's clear. Moving forward, let us define our URL. Currently we are going to work with the Google stock on finance.yahoo.com. In the future, when we have multiple links, we can create a list, a tuple, or maybe a txt file; for now, let's just store it in a variable. Next, we need to extract the HTML page. All I have to do is create a variable called html_page, use my requests module with .get, and pass that particular link. That's it; this is going to extract the information of that particular page. Let me print the result and see what output we're getting: all I do is print html_page.content. This content gives us only the HTML source; otherwise, there are a lot more things that this request returns. Let me run my scraper.py, and you can see this is the HTML source code of the web page. Now, the next thing is: if we repeat the same process again and again, the website will understand that we are using a bot to get information. So we need to pass a header along with the request to mimic a browser. All you have to do is create a headers variable, and here we need to pass a User-Agent.
To check your user agent, just open google.com in your browser and search for "my user agent". This is the particular agent for me; it can be different for you. Just copy it and define a User-Agent for your request. This makes each of our queries to this particular page mimic a browser, so we follow the general rules. I hope now you're able to understand everything. From the next lecture, we'll start working with Beautiful Soup and extract some information from our web page. Thank you for following. See you in the next one.

6. Working With BeautifulSoup: Now let us start our real work with Beautiful Soup. The first thing I need to do is remove the print statement and create a soup object. This requires two parameters: the first one is our HTML page content, and the second one is the parser. For our content, just pass html_page.content, and for the parser I will be using lxml. Let me open the Beautiful Soup documentation: if you scroll down a bit, you will see the options regarding parsers. There are four different parsers: html.parser, then lxml, XML, and html5lib. Currently we're using lxml, which is the fastest one and works pretty well. You can also use html.parser, which works similarly. I would not recommend the other two, because the XML parser has different structure requirements and html5lib is quite slow. Remember, to run the lxml parser you need to install it: just pip install lxml. Once that is done, you can use this soup object and print anything that we require. If you scroll around a bit in the documentation, you have options regarding different objects like tags, names, and attributes. So suppose I want to print the title of the page: all I have to do is use the soup object and then .title. Let me run this one, and here you can see we have our title.
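Putting the setup steps together, a minimal sketch of the scraper's opening lines might look like this. The URL is the course's Google quote page; the User-Agent string is only an example, so copy your own from a "my user agent" search:

```python
import requests
from bs4 import BeautifulSoup

url = "https://finance.yahoo.com/quote/GOOG"  # the Google stock page

# A User-Agent header makes our requests look like a normal browser visit.
# This string is an example -- replace it with your own browser's user agent.
headers = {
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36"
}

def get_soup(page_url):
    """Download the page's HTML source and wrap it in a soup object."""
    html_page = requests.get(page_url, headers=headers)
    return BeautifulSoup(html_page.content, "lxml")

# Uncomment to actually fetch the page:
# soup = get_soup(url)
# print(soup.title)
```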
If I open this particular page, right-click, and open its HTML source code, you can see this is the title tag, and this is our result. So this is something really basic, and we were able to do it with the help of tags only. But in a real-life project we need to deal with more complications, because the source code is not readable, not easy to understand, and quite big, so we need different methods to grab this information. Let me clean this up and take a variable called title; now, with the help of my soup object, all I have to do is use a method called find, and then write the HTML tag that I want to find. If I print this now, you can see I still get the same result, but with a different method. If you hover over find, you will see it takes various other parameters too; we'll be using them in the future. There is another method we can also use, called find_all. We know this HTML structure contains only one title tag, but if I want to talk about paragraphs, there are many paragraphs on an HTML page, so we need find_all to find all the paragraphs. The find method extracts only the first result: if I use it for paragraphs, it returns the first object only and ignores all the others, whereas find_all extracts all the results. Now, if you observe the result I'm getting, it still has the title tag in it, so to extract the exact result I need to use the get_text method. Let me run this again; here you can see I'm now able to extract the exact text of my title tag. I hope this was helpful, and now you understand the soup object. In the next lecture, let us play with it and extract a lot more information from the page. Thank you for following. See you in the next one.

7. Filtering Required Data: Welcome back. Let's start working on the information that we require.
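The difference between find, find_all, and get_text can be sketched on a tiny invented page:

```python
from bs4 import BeautifulSoup

# Made-up HTML with one title and two paragraphs, just for illustration.
html = """
<html><head><title>Demo Page</title></head>
<body><p>first paragraph</p><p>second paragraph</p></body></html>
"""
soup = BeautifulSoup(html, "html.parser")

first_p = soup.find("p")      # find returns only the FIRST match
all_p = soup.find_all("p")    # find_all returns every match, as a list

print(first_p)                        # the whole <p> tag
print(len(all_p))                     # 2
print(soup.find("title").get_text())  # just the text, without the tags
```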
The first thing we require is the heading of this page, the stock name. Then we need the current price of the stock, and then all the information that's in this table. We are going to take this information and print it out; later on, we're going to save it in CSV format. So the first thing I need to do is find the respective tags to extract them. Let me refresh the page and select the first value that we want to extract. Let us try to extract this header: just right-click and click on Inspect Element. This is the information that we require. This Chrome developer tool will help you understand everything: when you scroll over a tag, it shows you visually what information that tag contains. This particular tag contains our heading, and if I scroll up a bit, you can see this particular tag, a div with a particular id, contains both pieces of information that we require: our heading as well as our current price. So I just need to use this tag, and then I can extract both pieces of information in one go. Let me copy this one; remember, it is the div tag with this particular id. Let me jump back, remove this, and create a variable called stock_title. I'll use my soup object to find all the objects with this combination of div tag and id: I need to pass my tag first, and then mention the id. Let us try to print this result and see what we get. Now, this is something big. The result is in the form of a list, because find_all returns all the objects which have this div and id combination; we need only the first result. To extract this information, I just need to find something unique inside this particular div. All I have to do is find the first h1 tag inside my div, because there are no other h1 tags.
So all I have to do is find the h1. This is the zeroth element. Let me clear my screen and print out the result: here you can see I'm able to extract the h1 tag, and I can directly print out my text. All I have to use is the get_text method, or directly .text; either should work. I need to do something similar for my current price. If you inspect our current price, you can see the tag is a span, and the attribute is changing in real time. We'll follow a similar approach for our current value in the next lecture. I hope this was helpful, and now you understand how to target tags. In the next lecture we'll start working on getting our current price. Thank you for following. See you in the next one.

8. Extracting Current Price: Welcome back. In this lecture, let us extract our current price. Just right-click over it, click on Inspect, and here let us search for a combination of tag and attribute. In the previous lecture we filtered all the information in this div and then accessed an h1 tag; now we need to find another combination to get this one. One problem here is that this attribute is dynamic: it keeps changing, and every time the page is refreshed, this particular value changes, so we cannot use it. We need some fixed value. You can use another attribute; I would be using this one. I just need to filter on this attribute, and then I can print out the first span. I hope you are following me. So, inside the container that I've already extracted, I just need to find this tag and attribute and then print out my first span. Let me copy this one and create a variable called current_price. Here, instead of my h1, I need to find this div, with my attribute as class equals this value. The first thing I should do is remove the commented part and store this in our header_info variable.
Then I can extract information from it. Here, instead of h1, I should find a div. This div has an attribute called class; let me copy the value of the class. Inside this class, I need to find the first span and then simply get my text. This looks fine; let me print out my current price. OK, we have an error: we cannot use class, because it is a reserved keyword in Python and we are not allowed to use it. Instead of class, you should use class_ (class with an underscore); that is the right keyword for our situation. Yep, it's working. Now we are able to extract both our stock title and our stock's current price. Let me give you a brief recap: we extracted the header with the help of a tag and its attribute. Since we were using find_all, it returns a list, and lists follow indexing, so I'm using the zero index. Then, with the help of this header_info, I can filter out all the information inside it: we have an h1, which I extracted directly. To extract the other information, I need a different combination: a div with this particular class, and then the first span inside that class. I hope you are able to understand this. In the next lecture, let us work on this particular table and extract the different information that we want. I hope this lecture was helpful. Thank you for following. If you have any questions, don't forget to use the question-and-answer section; if not, keep following, and I'll see you in the next one.

9. Extracting Table Content - I: Hey guys, welcome back. Now let us work towards finding all this information which is in the form of a table, and printing it out. First, let's try to find a combination for it. Let me open Inspect, and here you can see this is the value that we are trying to find. It is inside this table row, which is inside this div, and this table is static; the other values can be dynamic.
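The heading and current-price extraction from the last two lectures can be sketched like this. The id and class names below are simplified stand-ins; the real attributes on finance.yahoo.com differ and change over time:

```python
from bs4 import BeautifulSoup

# Simplified stand-in for the quote page's header block (invented markup).
html = """
<div id="quote-header-info">
  <h1>Alphabet Inc. (GOOG)</h1>
  <div class="price-container"><span>2,850.41</span><span>+1.2%</span></div>
</div>
"""
soup = BeautifulSoup(html, "html.parser")

# find_all returns a list, so take the zeroth element.
header_info = soup.find_all("div", id="quote-header-info")[0]

# The heading is the only h1 inside the container.
stock_title = header_info.h1.get_text()

# "class" is a reserved word in Python, so BeautifulSoup uses class_.
current_price = header_info.find("div", class_="price-container").span.get_text()

print(stock_title)
print(current_price)
```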
So I think we should extract this div and then filter out everything inside it. My plan is to extract the div, and then all these values, which, if you look, share a common class attribute. They also have the same tag, which is tr, so we can extract this information easily. Let me copy this and extract the complete table. All I have to do is use a variable table_info, then use my soup object with find_all, since there is a chance there could be multiple tables that use my div and class combination. This returns a list, so let us print the result and see what output we're getting. Here you can see our list has only one element, so I can use the zero index. Now we need to find the right combination between our table_info and the values inside it. Let me remove this print; the first thing I need to find is this Previous Close. Since this tr is common to all rows, I can directly add a tr filter on my table_info and then print out the list itself. Here, let me add a new filter with respect to tr and then print my table_info; I think this should return all our elements in the form of a list. Yeah, this is working fine. So let me do one thing: let me try to print the first object of this list. This is the result of my first object; if I jump over to my Chrome, it should have something. We need to print out both values. What I can do is extract the spans: the first span is my heading, and the second span is my value. I think this should work fine. So this is the zeroth element, and inside it I need to find the spans. Let me print this again. Now we have a list result with both spans: the first span contains our Previous Close heading, and the second span contains its actual value. So if I take the first span and extract the text, and print this again, you can see I'm now able to extract this information. So this is working.
Fine. Let me try to store this in a variable. I'll use heading for this one, and the next one I need to create is previous_close_value. It follows the same approach, but extracts the second element, which is index one. Now let me print both results, and then we can create a loop for all of this. Here you can see our stock title, our current price, our heading from the first extraction, and its value. Let me print both of these on a single line so I can coordinate them. In the next lecture, let us create a loop and extract all the information. This is easy now: the zeroth element was the Previous Close heading, the first element will be Open, the second element will be Bid, the third element will be Ask, and so on; all we need to do is store them. I hope this lecture was helpful. If you have any questions, don't forget to use the question-and-answer section; if not, keep following. Thank you for watching this section. See you in the next one.

10. Extracting Table Content - II: Welcome back. As discussed in the previous lecture, all we need to do is write a loop from 0 to 7 to extract all the information. Let me jump back to my VS Code, and here let me write my loop: for i in range, taking the variable name i, in my range from 0 to 7, and I'm going to push all this information inside it. Instead of calling the values by specific names, I'm going to call them heading and value, and here I replace the index from zero with i. This looks fine. So I should expect our result as Previous Close, then its value; Open, then its value; and so on. And here, if you look, I have a total of eight values, so I should take the range from zero to eight, and this should work. Let me jump back to my terminal and print out my result. I was able to extract results up to Ask, and then we have an error regarding our list index, and an issue with our range. Let me inspect this.
This should work fine, but let me see what the exact error was. Let me comment this out and try to print the result only for our fifth element, which is index four. Here you will observe that we have a problem with our span index: the row doesn't have a second span inside it. That means if I jump here and search for both span tags: OK, this row only has one span tag; the second value sits inside the td itself. So instead of span, if I try to print the td, this should work. This is our first td, and this is our second td; the first td has a span, and the second td doesn't, so td should work fine. Yeah, we are now able to extract this. So instead of span, I should use td, which is more reliable since it is present on both sides, our heading and our value. Change span to td, open my command prompt, clear my screen: yep, now we are able to extract this result in an appropriate format. Let me try a different stock to see if this works with other links. Close this one, and instead of Google, let me search for Amazon. Copy the link, come back to my VS Code, and replace the Google link with Amazon. Come back to my terminal: yeah, this is working fine. Now we are able to extract this information in a particular format according to our requirement. Let me give you a brief revision of what we're doing with Beautiful Soup. We created a BeautifulSoup object called soup: we take the HTML page in the form of content, and we use the lxml parser. Then we extract the header_info with a combination of a div and its id; that's the tag and the attribute. Since we're using find_all, it returns a list, and we take the zeroth index. Then, inside our header_info, we filter out our title with the h1 tag.
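The table extraction can be sketched the same way, again with invented markup standing in for the real summary table (the class name is an assumption for illustration):

```python
from bs4 import BeautifulSoup

# Simplified stand-in for the quote page's summary table.
html = """
<div class="summary-table">
  <table>
    <tr><td>Previous Close</td><td>2,840.03</td></tr>
    <tr><td>Open</td><td>2,845.00</td></tr>
    <tr><td>Bid</td><td>2,849.00</td></tr>
  </table>
</div>
"""
soup = BeautifulSoup(html, "html.parser")

table_info = soup.find_all("div", class_="summary-table")[0]
rows = table_info.find_all("tr")

summary = {}
for i in range(len(rows)):
    tds = rows[i].find_all("td")   # td is present on both sides of every row,
    heading = tds[0].get_text()    # unlike span, which some rows lack
    value = tds[1].get_text()
    summary[heading] = value
    print(heading, ":", value)
```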
And we filter out our current price with a combination of a div and another attribute, and then again with a span. Then we play with table_info: we use find_all with a combination of div and class; since it's find_all, I need to filter with the index. Then I filter again using the tr tag; once that is done, it returns a list, again because we're using find_all. This list has all the results that we want, our heading and our value, so we create a loop and filter them with the help of td. I hope this was helpful. Currently we're just printing for one stock; from the next lecture, let us create a list, use multiple URLs for five or six stocks, and print them all out. Thank you for following. See you in the next one.

11. Extracting All Stock Result: Hey guys, welcome back. Now let me create a list, store different stocks inside that list, and then create a loop to extract results for all of them. Let me jump back to Chrome, search for different stocks, open a list element here, and then just keep adding multiple stocks; for now, let me take three. So this urls variable holds our URLs; we have three elements, and I need to add a loop over all of them. Here I will use a for loop which selects a url from my urls. It takes the first one, uses it in my request, and then all the processing is performed. Now, there is just one problem: if we have 100 links, it is going to send 100 requests to Yahoo Finance in just a couple of seconds, which I really don't want to do, because this would load the website's servers. So we should add a wait time after every loop iteration, maybe five or ten seconds.
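The polite-scraping loop over multiple stock pages can be sketched as a small helper. The ticker URLs are examples, and the scraping itself is left as a placeholder comment:

```python
import time

def scrape_all(urls, delay=5):
    """Visit each URL in turn, pausing `delay` seconds between requests."""
    results = []
    for url in urls:
        # ... download and scrape `url` here (requests + BeautifulSoup) ...
        results.append(url)
        time.sleep(delay)   # be polite: don't hammer the server
    return results

# Hypothetical ticker pages, as used in the course:
urls = [
    "https://finance.yahoo.com/quote/GOOG",
    "https://finance.yahoo.com/quote/AMZN",
    "https://finance.yahoo.com/quote/MSFT",
]
# scrape_all(urls)   # uncomment to run with the 5-second pause
```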
For that, I'm going to use the time module. All you have to do is import time, and then, once we extract information for our first URL, take a wait time of five seconds: just use time.sleep and provide the seconds. I'm taking five; this should work fine. Let me open my terminal: we get our first result, then a wait of five seconds, the second result, another five seconds, the third result, and another five seconds. So this is working fine, and now I think we can move towards our CSV part. I hope this lecture was helpful. Thank you for following. See you in the next one.

12. Working with Static File: Hey guys, welcome back. In this lecture, we are going to write our data into a CSV file. The first thing we need to do is import the CSV module. Then we need to open our CSV file in write mode, initiate a writer, and write rows. If you have used the CSV module before, you should be familiar with this; if not, let us do it now. Let me import my CSV module and create a variable csv_file; now I open my CSV file in write mode. Next, we initiate a writer: I use csv.writer and pass my file. Remember, if you already have a CSV file with this name, that's fine; if not, it is going to be created. Once that is done, we can use this writer to create different rows. If you picture any CSV, it has a heading row and then all the value rows; so we are going to create a heading row, and after that, inside our loops, we'll store the values in the CSV file. Let me create my first row as the headings: all I have to do is use writer.writerow, and here I need to pass all the headings.
So we have ten headings to store in total: the first one is the stock title, then our current price, and then we are extracting eight values from this row. All I have to do is pass a list, and inside this list I need to pass all the values. First I am going to use the stock name, followed by the current price, and then quickly copy all these heading values. Now that's done. So we have created our first row, which is the header row. The next thing we need to do is store the values themselves, which is a bit tricky. Let us continue this in our next lecture. 13. Storing Stock Content in CSV File: Hey guys, welcome back. Now let us continue our task and create a writer row for all the content of our stock. If you look here, we need to pass a list with all the content: stock name, current price, previous close, and all the other information. I think the best approach we can take is to create an empty list and then fill it with all the values that we have. That means at the first position we are going to take the stock title, at the second position the current price, and then with this particular loop we can keep appending new items for all the values. Let me remove this heading now, since the heading will be inside our header row. So all I have to do is create an empty list first. For every new URL, I need to create a new list; I am going to call it stock. Let me remove this print statement here and add both values, the stock title and the current price, into my list. The same I need to follow for the current price, and then inside my for loop I just need to append each value. That's done. And once the loop is over, all we need to do is create a writer row, and inside this we need to pass our stock. I hope this should work fine. Let us test this out. Let me open my sidebar and test this script. You can see we have created a CSV file; it's working. Now let us wait for a few seconds. That's done. Let me open my CSV file. Yeah, this looks fine.
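The CSV flow from these two lectures can be condensed like this. The ten heading labels are assumptions mirroring the lecture's layout, not the exact strings on the Yahoo Finance page, and the stock values are made up for illustration.

```python
import csv

# The ten heading labels below are assumptions standing in for the exact
# Yahoo Finance labels the lecture scrapes.
headings = ["Stock Title", "Current Price", "Previous Close", "Open",
            "Bid", "Ask", "Day's Range", "52 Week Range",
            "Volume", "Avg. Volume"]

csv_file = open("scraped.csv", "w", newline="")
csv_writer = csv.writer(csv_file)
csv_writer.writerow(headings)                  # heading row first

# One scraped stock, built up the way the lecture's loop does: start the
# list with title and current price, then append the eight table values.
stock = ["Apple Inc. (AAPL)", "135.40"]
for value in ["135.10", "136.00", "135.2 x 900", "135.5 x 800",
              "134.9 - 136.4", "53.1 - 137.9", "101,234,567", "95,000,000"]:
    stock.append(value)
csv_writer.writerow(stock)

csv_file.close()   # important: close the CSV once the script is done
```

In the real script the `stock` list is rebuilt inside the URL loop, so each stock becomes its own row under the single header row.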
So now we are able to create a CSV file with all the stocks that we want. We have our stock name and all the other important values that are required. Let me give you a quick revision about CSV. First, we created a CSV file; if it is not there, it gets created. Then we created an object with csv.writer, where we pass our CSV file. Then we start writing rows: the first one is our heading, and then we took the approach of creating a list with all the values matching our header and creating a new row. Now, one thing that is really important is to close our CSV file. So once your complete script is done, just close the CSV file. This looks fine. I hope this lecture was helpful. Thank you for following. See you in the next one. 14. Sending Text Mail Through SMTPLIB: Hey guys, welcome back. Now it's time to start working with our email system. The first thing we need to understand is how to send a mail, and then we need to integrate that mailing system with our web scraper. Now, emails are basically of two types: the first one is a basic text mail, and the second one is text with an attachment, and we are going to cover both. In this lecture, let us understand the first type, in which we are going to send only text, and in the further lectures we will cover the second part. Now, to work with emails, we need to use a library called smtplib, which is one of the most common libraries to send and receive mail. So let us start our task and import smtplib. Here we need to create an object for our server and initiate our connection. The first thing we need to do is use our host and port, then we need to set up our security, and then we need to log in. So first, let me create an object; I will be calling it server. Then let me use smtplib and its SMTP method; here I need to pass the host and the port. Basically, in this lecture my main motive is to send from this email to this email; that means I am going to send a mail from A to B.
My host will be provided by Gmail. If you are using any other service like Yahoo, Outlook, or anything else, you need to check their host. I am sending a mail from my Gmail, so I will be using the host provided by Google, and the port is 587. The second thing we need to do is about encryption, to make our connection secure. All you need to do is use server.starttls. Once that is done, it's time to log in. Now, here is one of the most important things: you need to provide the email from which you are sending your mail, in my case my Gmail address, and then here you need to provide a password. There is one good security pattern that you can follow. All you have to do is open Gmail and log in to the account from which you need to send the mail. The first setting you need to change is about less secure access: you need to allow it. The next thing you need to do is allow two-step verification; that means your account requires two-step verification. Once that is done, the third step is the app password. Just open your My Account page, go to Security, and here you will have an option regarding app passwords. Remember, this app password option will only be available once you have completed the first and second steps. So once you have two-step verification enabled, you will have the option for an app password. Now, once all of that is done, you can create a password for your mail ID here and use it for email purposes. Just create a new password for your mail, and now you can utilize this particular password within your program, within your script, and you can destroy this password any time you want. This is one of the safest methods to use any Google account with a Python script. Let me come back to my VS Code and paste my password here. Once that is done, now it's time to create a message. If I open my mail, you will see an option for To, you will see an option for a Subject, and you will see an option for your body.
So the first thing let us do is create our body, which is our message. The next thing is, we need to initiate a method and pass our to and from emails. So let us use our server object's sendmail; here I need to pass both the emails: the first one is the from address and the second one will be the to address. Here we also need to pass our message. Once that is done, we need to quit our server. This looks fine. Let us run our script and see if this is working. So we are sending an email from this account to this one. Yep, we have received our mail. Let me open this one. We currently don't have any subject, but we have our mail with a particular message. Let me do one small change and convert these emails into variables. This looks much better. You can test this out, and I hope this will work for you. I hope this lecture was helpful. Thank you for following this lecture. See you in the next one. 15. Sending Text Mail Through Email Module: Hey guys, welcome back. In the previous lecture, we learned how to send a mail using smtplib. Now, the smtplib module is useful while communicating with mail without many requirements. If you check our previous mail, we don't have any subject; we cannot add HTML tags, we cannot attach a file, and we cannot send it to multiple people. To include a From, To, Subject header and all the other things, we need the email package, and we are going to use the default email package, which comes with Python. So in this lecture we are going to send a text mail, but with the help of the email module. The first thing we need to do is import classes from the email module. We are going to require two classes, which are MIMEText and MIMEMultipart. So all you have to do is use from email.mime.text import MIMEText, and then from email.mime.multipart import MIMEMultipart. Make sure you are careful with the capital letters.
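For reference, the whole plain-text flow from the smtplib lecture fits in a few lines. The addresses and app password below are placeholders, so the sketch only attempts a real send once you replace them.

```python
import smtplib

# All three values are placeholders -- fill in your own before running.
FROM_ADDR = "your.address@gmail.com"
TO_ADDR = "receiver@example.com"
APP_PASSWORD = "your-16-char-app-password"   # Gmail app password, not your real one

def send_text_mail(from_addr, to_addr, password, message):
    """Send a plain-text mail through Gmail's SMTP server."""
    server = smtplib.SMTP("smtp.gmail.com", 587)  # Gmail host and port
    server.starttls()                             # switch to an encrypted connection
    server.login(from_addr, password)
    server.sendmail(from_addr, to_addr, message)
    server.quit()

# Only attempt a real send once the placeholders have been replaced.
if APP_PASSWORD != "your-16-char-app-password":
    send_text_mail(FROM_ADDR, TO_ADDR, APP_PASSWORD, "Test mail from Python")
```

Other providers use different hosts and sometimes different ports, so check your provider's SMTP settings if you are not on Gmail.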
Once that is done, we can now add the information for the header. A header basically contains From, To, and Subject; these are the three important things for any header. So let me add a subject. Once that is done, now we have to create an object for MIMEMultipart. Basically, with the email module, we have to attach everything to our message. That means once we have created our header, we need to attach it to the message; then we create our body, and we need to attach it to the message; and then later we will have our attachment file, and we need to attach it to the message as well. Then this message is converted into a string, which is in text format, and that string is sent as the mail. So let me create a message object; all I have to do is use MIMEMultipart. Now I need to attach From, To, and Subject. All you have to do is use message, and here you need to mention From; you follow the same for To and Subject. This part belongs to our header. Now the next part is about our body. So let me create a message with my body, and I should remove this one. You can use any other variable name; I am using this one to clear out confusion. Once that is done, we need to attach this to our message, and we need to pass a parameter to our MIMEText explaining whether this body content is plain, HTML, XML, or any other format. So let me use message, and then I need to attach it; here I need to pass my MIMEText, and then you pass the parameter saying whether this body is plain, XML, HTML, or any other format. Once that is done, we need to convert our complete message into string format. So here I am going to take the message, and now I need to convert my message with as_string. This message is going along with our mail. Let me change some text here; otherwise it might go into spam. This looks fine. Let me try and run this file. We haven't got any errors. Let me jump back to my Chrome here.
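The message construction just described, condensed into a sketch. The addresses are placeholders, and you can swap the "html" subtype for "plain" when the body is plain text.

```python
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

message = MIMEMultipart()
message["From"] = "your.address@gmail.com"   # placeholder address
message["To"] = "receiver@example.com"       # placeholder address
message["Subject"] = "Daily Stock Report"

# The second argument picks the subtype: "plain" for text, "html" for markup.
body = "<b>Markets were up today.</b> Full report to follow."
message.attach(MIMEText(body, "html"))

text = message.as_string()   # this string is what server.sendmail(...) receives
```

From here the `text` string is handed to the same `server.sendmail` call as in the previous lecture.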
You can see I got a mail, and this time the mail has a subject and we have a proper message. This time we have used the default email module. This module helps us attach files, helps us create a header, helps us create a body, and we can even send an HTML template. Basically, all we have to do is use our HTML code here and replace plain with html. Let us test this out. So let me replace this content with an HTML tag; I will be using bold, and that is that. Let me replace this plain with html, and also let me edit the subject so I can differentiate between them. Now let me run this file again in my terminal. Yeah, this is done. Let me jump back to my browser, and here you can see I've got a mail, and if I open this particular mail, you can see I have bold text. That means you can play with the header, you can play with your body, you can use HTML, XML, or plain, and you can even attach a file. I hope this lecture was helpful. In the next lecture, let us work with the attachment of a file, and then we need to integrate this with our web scraper. Thank you for following. See you in the next one. 16. Sending Attachment Through Email Module: Hey guys, welcome back. Now, in this lecture we are going to send an attachment through our mailing system. Basically, we need to utilize the same script; we just need to add a new functionality with MIMEBase. The first thing we need to do is read the attachment, that is, our CSV file. You can utilize this system to send any type of attachment, maybe a txt file, maybe a PNG, JPG, or any other file; we will just use our CSV file. So just after my body, let me add code to read the attachment and send it. The first thing you need to do is read the attachment. I am taking a variable name for my file, and then simply read this with open; here you need to pass your file name and then the mode in which you want to read.
I am using the binary mode. After that, I am creating an object named part, and I need to create this object using MIMEBase. This MIMEBase is used to handle the byte stream for uploading our attachment file. Let me also add this one. Once that is done, we need to use set_payload. All you have to do is use part.set_payload, and then you need to pass your attachment. The next thing is a little complicated. What we are going to do is read the byte stream and encode the attachment using the base64 encoding scheme. This is one of the most common schemes; you don't need to read everything about it, you just need to be aware that we are using something like this. Here, you need to add your Content-Disposition header, and then, in the same line, you add filename equals and concatenate your actual file name; in our case, it is scraped.csv. Here we are doing everything needed to send our file as an attachment. Once that is done, like we have previously done, you have attached your header, you have attached your body; now we need to attach our file. So all we need to do is use message.attach, and here we need to attach our part. This looks fine. Before moving forward, make sure you import encoders; all you have to do is from email import encoders. So this looks fine. Once that is done, let us test our file from the command prompt. I haven't got any errors. Let me open my Gmail. Here you can see I have got a mail, and I have a symbol of an attachment. Let me open this file, and here you can see I have a 556-byte CSV file. If I open this one, yeah, this is actually fine. So we have created a program; now we can add an attachment. Remember one thing: we have created an attachment for the header, for our body, and for our file, and then everything is converted into a string, and then we are sending that as a message.
I hope now you are able to understand how to send an attachment through your Python script. In the next lecture, let us integrate this with our script.py. I hope this lecture was helpful. Thank you for following. See you in the next one. 17. Integrating Mail System with Web Scraper: Hey guys, welcome back. Now, in this lecture, let us integrate our sendmail.py with our script.py. The first thing I need to do is convert the complete script into a function so I can call it from script.py. Let me create a function, call it send_mail, and move all the content inside our function. Once that is done, let me move towards my script.py, and here, once we close our CSV, that means our CSV is ready, and now I can work on my email function. So all I have to do is first import the file, and now we can call our function. Let me scroll down to the bottom, import my send mail, and here I can call this send_mail. This should work, but let me do some small changes: our subject should be changed according to our requirement. Let me change my subject as well as the body content, and now let us replace both of these with a file name variable. What we can do is pass a variable from this side. Let me create an argument here; I will be calling it filename, and then let me pass my file name as scraped.csv, and here, let me pass that file name. Now I need to replace both of them. This looks fine. Let me delete my scraped.csv, and now let me run my script. We have successfully created our scraped.csv, and I think our mailing system should work now. This might take some time, because we are now uploading our file, and the uploading process usually takes time according to your file size. That's done. Let me jump back to my browser. Here we have a mail; check this one. We have a finance stock report; this is scraped.csv, and this is our today's finance report. And here, let me check this one. This is working fine.
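Gathering the attachment and integration steps into one sketch: a `send_mail(filename)` that builds the header, body, and base64-encoded CSV attachment, the way the last two lectures describe. The addresses are placeholders, and the function only opens an SMTP connection when a real password is supplied, so it can be exercised safely.

```python
import csv
import smtplib
from email import encoders
from email.mime.base import MIMEBase
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

def send_mail(filename,
              from_addr="your.address@gmail.com",   # placeholder address
              to_addr="receiver@example.com",       # placeholder address
              password=None):
    """Build the mail (header, body, CSV attachment) and return the string
    that would be handed to server.sendmail(...)."""
    message = MIMEMultipart()
    message["From"] = from_addr
    message["To"] = to_addr
    message["Subject"] = "Finance Stock Report"
    message.attach(MIMEText("Today's stock report is attached.", "plain"))

    with open(filename, "rb") as attachment:        # read in binary mode
        part = MIMEBase("application", "octet-stream")
        part.set_payload(attachment.read())
    encoders.encode_base64(part)                    # base64-encode the bytes
    part.add_header("Content-Disposition",
                    "attachment; filename= " + filename)
    message.attach(part)

    text = message.as_string()
    if password is not None:                        # only send with real creds
        server = smtplib.SMTP("smtp.gmail.com", 587)
        server.starttls()
        server.login(from_addr, password)
        server.sendmail(from_addr, to_addr, text)
        server.quit()
    return text

# script.py side: write the CSV, close it, then hand the file name across.
with open("scraped.csv", "w", newline="") as f:
    csv.writer(f).writerow(["Stock Title", "Current Price"])

mail_text = send_mail("scraped.csv")   # real project: from sendmail import send_mail
```

Passing the file name as an argument, as the lecture does, keeps the mailing code reusable for whatever file the scraper produced that day.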
And now here I can add all the stocks which are required. Also, every day, whenever we run this, it is going to replace the new file over our scraped.csv, so you don't have to delete it every time. One other thing we can do is use the date as the name. Basically, if today is 16th November 2019, then the file name will be 16-11-2019; if it is 5th January 2020, then the file name will be 05-01-2020. So we can do something like this: all we have to do is import date and time, and then we can do this. I hope now you are able to integrate and send this one. In the next lecture, let us do some minute changes and wrap up our project. I hope this lecture was helpful. Thank you for watching. See you in the next one. 18. File Name According To Today's Date: Hey guys, welcome back. Now in this lecture let us play with our file name. Currently we are saving our file as scraped.csv; let us replace this and actually use the date. That means every day we are going to create a new file with a new file name, because we are taking the date. So if today is 15th November, our file name will be 15-11-2019; if the date is 5th January, our file name will be 05-01-2020. This will help us to create new files daily. Also, when we send our email, it will have the date, and we can add our date in our body, or maybe in our subject. So all you have to do is from datetime import date. Once that is done, let me take today's date; all I have to use is date.today(). Now, this will give an output of the date, but the type will be a datetime date, and here we need to pass a string. So what you need to do is convert this one into str, and now we can use this and concatenate it with .csv. Now that we have converted this into str, all we need to do is add our .csv; just use concatenation, and we can replace our scraped.csv with this. You also need to do that here.
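The date-based file name can be sketched like this. The lecture simply wraps str() around date.today(); the strftime variant shown here is an assumption that produces the day-month-year order the lecture describes.

```python
from datetime import date

# date.today() returns a date object, not a string, so it has to be
# converted before concatenating ".csv" onto it.
today = date.today()
filename = today.strftime("%d-%m-%Y") + ".csv"   # e.g. "16-11-2019.csv"
```

Using `str(today) + ".csv"` as in the lecture works just as well; it simply gives the ISO year-month-day order instead.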
Now let me run this one and see our changes. You can see I've created a file with today's date, and let me come back to my browser. Here you can see I've got a mail. This one is better, since we can actually have today's date, and we can actually save our data daily. So if I'm creating a new file tomorrow, I'm going to save that also. This looks fine. I hope now you are able to go from web scraping to automation with sending email and a lot more. I hope this project was helpful, and now you can web scrape, you can send an email, attach a file, and do a lot more things. Thank you for following this project. See you in the next one.