Intro to Google Colab, Install Tensorflow 2.0, Free GPU and TPU | Lazy Programmer Inc | Skillshare

Intro to Google Colab, Install Tensorflow 2.0, Free GPU and TPU

Taught by Lazy Programmer Inc

Lessons in This Class

3 Lessons (32m)
  • 1. ColabIntro (12:32)
  • 2. ColabTF2 (7:54)
  • 3. UploadData (11:41)

105 Students, -- Projects

About This Class

In this course, you will learn how to run your own data science and machine learning code in Google Colab (Colaboratory), a Jupyter Notebook environment that runs in your browser.

You will also learn how to install TensorFlow 2.0 and access a GPU and TPU for free.



Transcripts

1. ColabIntro:

In this lecture, we're going to go over a very new and exciting environment for writing deep learning code in Python: Google's Colab, short for Colaboratory. For those of you who like to use Jupyter Notebook, this is an even better option. It's basically the same as Jupyter Notebook, with the following bonuses.

First, it's hosted by Google, which means you don't have to use your own computing power. You'll notice that when you need to download data files, it happens extremely quickly, because Google's network is extremely fast. Second, you get access to a GPU and even Google's new TPU, which is pretty amazing. A TPU is not something you can buy for your personal computer, so it's pretty nice to be able to make use of one. Remember that, because of the way TensorFlow code is written, you don't have to worry about what kind of device you're using; generally speaking, the same code will work whether you're using a CPU, a GPU, or a TPU. Third, Colab notebooks are stored in your Google Drive. They're in the cloud, so you'll never lose them, and they're very easy to share with other people. Fourth, many of the libraries you need for deep learning, machine learning, and data science are already included. In fact, I was surprised that there were many more than I assumed there would be. There are even competing deep learning libraries included, such as Theano and PyTorch. So for those of you who hate doing environment setup (myself included), this is truly awesome.

In this lecture, we're not going to do anything technically complicated. Rather, we're just going to talk about Google Colab and do some short demos, so you know how it works and can see for yourself that it's just like writing Python anywhere else. To start, I'm going to assume you already know how to create a Google Drive account. If you don't have one, go to drive.google.com and sign up. Once you have your Google Drive account and you've logged in, you'll see the Drive interface. From here, you can hit the New menu, which allows you to create all different kinds of files, such as Google Docs, a spreadsheet, a presentation, and so forth. So here's what you want to do: go to the More menu and hit Colaboratory.

As you can see, this brings up a new notebook, and from here you can mostly use it as you would any normal notebook. Now, one thing that might happen is that you don't see Colaboratory in the menu at all. As you can see, I've hit the New menu and I've hit More, but I don't see Colab. In this case, here's what you can do: select "Connect more apps", search for "Colab", and the first result that pops up is Google's Colab app. Add this, and Google Colab will become available from the menu we just looked at. If we go there again, we can see that Colab now appears where it should. So let's go in and rename this notebook to something like "TF2.0 intro".

First, we're going to get right to the good stuff: how can we make use of a GPU or TPU? To do this, go to the Runtime menu and select "Change runtime type". As you can see, there are two select boxes here. The first lets you select which Python version you want to use; we'll be using Python 3 for this course. The second lets you select what kind of hardware device you want to use: either None (the default), GPU, or TPU. Note that sometimes the GPU or TPU might not be available.
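The lecture doesn't show this, but a quick way to confirm which accelerator you actually received is a minimal sketch like the following, assuming the runtime's preinstalled TensorFlow (which Colab ships with):

    # Shell command via the "!" prefix: shows the attached NVIDIA GPU, if any
    # (it prints an error on a CPU-only or TPU runtime, which is itself informative).
    !nvidia-smi

    import tensorflow as tf

    gpu = tf.test.gpu_device_name()  # e.g. '/device:GPU:0', or '' if no GPU
    print("GPU device:", gpu if gpu else "none found")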
This is because these are shared resources: your fellow students taking this course, and other machine learning students and researchers all around the world, might be using Colab at the same time, and we're all sharing these resources. So if our usage hits the limit of what's available, you might not have a GPU or TPU when you need one. For this reason, some of the code in this course may be run on my local machine as well. But remember, Python code works the same anywhere, so it does not make a difference.

Next, you can see that there are two main types of cells we can create in the notebook: code and text. You can click on either of these to create a new cell of that type. Let's click on text, since that's a little easier and not really something we're going to use very often, so we can get it out of the way. I'm actually going to delete the very first cell. As you can see, when I click this, it creates a new cell with what looks like a rich text editor. You'll notice that it's split into two halves: the left half is where you enter your text, and the right half is a preview of what it will look like. So let's enter some text: "This is my title". Now, you can click the big T icon, which changes the text into a header; you can see that it makes the text a little bigger and bolder, appropriate for a title. Next, let's enter some regular text: "This is regular text". Note that there are also these angle brackets, which let us enter code snippets. Let's try that: as you can see, it makes the text monospaced, which is appropriate for code. There are some other options here as well: you can make a link, add images, indent, add a numbered or bulleted list, and so forth. If you're interested, play around with this; otherwise, we won't mention it again.

Next, we have the code cells, so let's create one of those. As mentioned, we're not going to write any fancy code in this lecture; we just want to do something simple to make sure everything works as expected. Let's start by importing NumPy and Matplotlib. Beautiful: as I mentioned earlier, these already come preinstalled. Next, let's create a new code cell and make a sine wave. First, we need to create some x values, so let's make x go from 0 to 10π with 1000 points in between. Next, let's make y the sine of x. Then let's create a new cell and plot what we just created: that's just plt.plot(x, y). Since this is a notebook, there's no need to call plt.show(); the plot will just appear in the notebook itself. Very cool: it works just like a regular notebook. At this point, we've convinced ourselves that Google Colab can do the usual things you'd expect from a Jupyter notebook.
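The lecture only narrates this demo, so the variable names below are assumptions; here is a minimal sketch of the whole thing in one cell:

    import numpy as np
    import matplotlib.pyplot as plt

    # x goes from 0 to 10*pi with 1000 points in between
    x = np.linspace(0, 10 * np.pi, 1000)
    y = np.sin(x)

    # In a notebook the figure renders inline, so no plt.show() is needed.
    plt.plot(x, y)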
Now, as I mentioned earlier, one thing that's very nice about Colab is that it already comes with a bunch of useful libraries preinstalled. In my opinion, this makes Google Colab way better than Jupyter Notebook, and if anyone ever asked me to write in a notebook environment, I would choose Colab by default. I'm not a big fan of notebooks, but I am a big fan of Colab. Here, I've written some code to try to import a bunch of libraries: specifically, libraries that have been used in my courses, some more than others. Some are pretty rarely used, so you might not expect them to be included, like wordcloud, which we've only used once so far. And yet, if we look, we see that nothing I've tried to import here throws an error, which tells us that these libraries are indeed available. What's interesting to me is that some of these libraries are not machine learning related at all. Of course, we've used them in my courses because they're generally useful Python libraries, but it's nice to see that the folks at Google also make use of these same libraries and thought to include them. Here you can see the usual stuff, such as scikit-learn, NumPy, SciPy, Matplotlib, and pandas. We also have PyTorch and Theano, which is surprising, because they're competing deep learning libraries, and development on Theano stopped a while ago. We also have Seaborn; wordcloud; Beautiful Soup, which is for XML and HTML parsing; requests, which is for making HTTP calls; NetworkX, which is for graph functionality; cv2, which is OpenCV; and gym, which is OpenAI Gym. All in all, very impressive and much more than I expected.

There are some final caveats to Colab that I want to mention. The main thing to remember is that this is the cloud, and these are shared resources. One way this affects you: if you leave your notebook alone for a long time, it will become inactive and disconnect, and any computation you ran earlier won't be saved. For example, suppose you define a variable a = 5, and then you come back later, after your notebook was disconnected, and try to print a: it will say a is not defined. So you see this notebook has disconnected; if I reconnect and print a, it says a is not defined. Another way this affects you is that you might run out of memory; if that happens, you might want to try running the code on your local machine instead. And as mentioned earlier, the GPU and TPU might be unavailable, in which case you can either run your code without the GPU or TPU, or run the same code locally. As always, the options you had previously are still available. For example, you can provision a GPU instance on AWS, which, if you choose the correct AMI (Amazon Machine Image), will come with the usual libraries preinstalled.

2. ColabTF2:

Now, there's a reason I didn't mention TensorFlow specifically in the previous lecture, which is that it's what we're going to talk about in this one. This lecture is about how to use TensorFlow 2.0 in Colab. You'll notice that if you import TensorFlow in Colab and check the version, it says 1.14, so let's do that now. Obviously, this depends on when you try it: at the time I am making this course, TensorFlow 2.0 is still in beta, which means it hasn't officially been released yet. So if you use the usual command pip install tensorflow, you will not get TensorFlow 2.0. Of course, this will change when TensorFlow 2.0 is officially released, at which point the usual pip install tensorflow will actually give you TensorFlow 2.0. And as subsequent versions are released, that will change to 2.1, 2.2, and so forth, or whatever version numbers they end up using. Luckily, you can install libraries in a Colab notebook that did not come with it. For example, if Colab didn't come with scikit-learn installed, you would just run the command !pip install scikit-learn inside a code cell.
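As a minimal sketch of that pattern (scikit-learn is just the example the lecture names; Colab already includes it, so this install is a no-op in practice):

    # The "!" prefix tells the notebook to run the rest of the line as a shell command.
    !pip install scikit-learn

    # Any shell command works the same way:
    !ls
    !python --version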
In other words, within the Colab notebook, installing libraries is as simple as running the usual pip commands; you just have to put the bang symbol first. More on that later. For now, we're interested in TensorFlow 2.0. At the time I made this video, the current version of TensorFlow 2.0 is beta 1, so the current command is !pip install -q tensorflow==2.0.0-beta1. Note that the -q option here means "quiet", which just means it prints out less stuff; it doesn't actually modify the functionality of the command. Importantly, you have to keep in mind one of my famous rules here: learn the principles, not the syntax. Why do I say this? Well, inevitably, some lost soul will end up saying, "Why should I use this command when TensorFlow beta 3 is out? Doesn't this mean the lecture is out of date? Shouldn't you update this lecture?" Remember the rule: learn the principles, not the syntax. Today, the latest version is beta 1; tomorrow, it might be beta 2, or beta 3, or beta 500. Who knows? The principle is to look at TensorFlow's website to check what the current command is. That's the principle. Don't try to memorize the install command verbatim, which would be very silly. So be smart, don't be silly: learn the principles, and don't memorize the syntax.

Also note that you can install the GPU version of TensorFlow, which is, as usual, pip install tensorflow-gpu. Interestingly, on Colab I found that using the GPU is not that much faster than using the CPU, so for most small problems it shouldn't matter much which you use. For TPUs, we'll be discussing how that works later in the course. So let's run this. After installing TensorFlow 2.0, you can check the version again: just print out tf.__version__, and you should see 2.0.0 or something similar. So let's run this.

Now, there is one caveat, which is that I found this sometimes doesn't work: even after installing TensorFlow 2.0, I print out the version and it still says 1.14. It seems the problem is that if you import TensorFlow and then try to change the version, it won't work. So if you accidentally do this, and you actually want TensorFlow 2.0, first make sure you are not trying to import TensorFlow before installing it. So let's comment this out, then go to the Runtime menu and select "Restart runtime". Now we run the install, then the import, and it works: we have 2.0.0-beta1. In general, I find this is a bit wonky. If I run this notebook and then try to change the TensorFlow version later, say switching from CPU to GPU or the reverse, things tend to get a little weird. So what I like to do is have everything set from the beginning: know what you want to use, run it like that from the start, and don't try to change things in between, because sometimes the thing you were using before is sort of sticky. Even if you try to change it, it won't actually change.
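Putting that together, here is a minimal sketch of the install-and-check sequence; the version pin reflects the beta period when this was recorded, and, per the lecture's own advice, you should check tensorflow.org for whatever the current command is:

    # Install BEFORE importing tensorflow anywhere in the notebook; if you already
    # imported it, use Runtime -> Restart runtime and run this cell first.
    # -q means "quiet": less output, identical behavior.
    !pip install -q tensorflow==2.0.0-beta1

    import tensorflow as tf
    print(tf.__version__)  # expect something like '2.0.0-beta1'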
Now, there is another important caveat. If you recall, I said previously that if you leave your notebook idle for too long, it will disconnect. If this happens, unfortunately your TensorFlow version will revert back to the default, and you'll need to install TensorFlow 2.0 again. Personally, I don't mind running all the cells each time, since if I really wanted to run everything in one go, I would just run it locally. But if for some reason you would like to have TensorFlow 2.0 beta 1 permanently installed in your Colab, you can try the solution provided in the link I've attached. That's up to you; personally, I didn't have a reason to do it myself.

You'll recall that we discussed this bang command, which, by the way, also exists in regular Jupyter Notebook. So far, you know it can be used to run pip install commands, but in general, you can treat it as a directive that tells the notebook to run the command as you would in a terminal. For example, if I want to list all the files in the current directory, I can use the command !ls. Let's try that. Interestingly, you'll see that there's a folder called sample_data, so we can call !ls sample_data. Here you can see we have the famous MNIST dataset, the California Housing dataset, and a JSON file. We may or may not use these, but they're good if you want to run some simple tests, like trying a simple image classifier on MNIST. In any case, there you have it: that's how you use TensorFlow 2.0 in Colab while it has not yet been officially released.

3. UploadData:

In this lecture, we're going to do a few more tasks in Colab. Specifically, we're going to look at some ways to upload your own dataset to Colab. Let's say, for example, your client or employer gives you a CSV file, or you downloaded a CSV from Kaggle. How can we then make this file accessible from our Colab notebook? We're going to discuss a few different ways of doing this.

The first method is just to use the classic Linux command wget. As mentioned previously, you can run command-line commands by preceding them with the bang symbol, or exclamation mark. So let's go ahead and download the arrhythmia dataset. Now we want to check where the data went, so let's use !ls to see if the data is in our current directory. It looks like it is. Next, let's use the head command to see the first few lines of the data file, and also to check whether or not the file has a header. It looks like it does not. Next, let's load the data using pandas. We're going to pass in header=None, since we know the data does not have a header. Since the data has many columns, we're just going to take the first few. We're also going to rename the columns, because they're currently just integer values. As usual, since this data is from the UCI Machine Learning Repository, you can check the documentation if you want to know more about the data, like what each column is. So let's run this. Next, let's create a histogram of these data columns. Since the notebook by default makes plots pretty small, we're going to import Matplotlib and change the figure size. Once we've done that, we can call df.hist() to create histograms for each column. Note that I've added a semicolon to the end of df.hist(), because otherwise the notebook will print out the last returned value, like it usually does, which we don't want right now. So here are some nice histograms for you to look at.
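A minimal sketch of that workflow follows; the UCI download URL and the column names are assumptions based on the dataset's documentation, since the lecture doesn't read them out:

    import pandas as pd
    import matplotlib.pyplot as plt

    # Download the file, then peek at it from the shell.
    !wget https://archive.ics.uci.edu/ml/machine-learning-databases/arrhythmia/arrhythmia.data
    !ls
    !head arrhythmia.data

    # No header row, so pass header=None; keep only the first few columns.
    df = pd.read_csv('arrhythmia.data', header=None).iloc[:, :5]
    df.columns = ['age', 'sex', 'height', 'weight', 'QRS duration']  # per the UCI docs

    plt.rcParams['figure.figsize'] = (12, 8)  # enlarge the default figure size
    df.hist();  # trailing semicolon suppresses the printed return value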
Next, let's create a common plot for data analysis: the scatter matrix. This does a scatter plot between each feature and every other feature; along the diagonal, it just plots a histogram of the feature, which we've already seen. So far, pretty standard.

Next, let's look at the second method of loading in data, which also applies when you have a URL: using TensorFlow directly, specifically the Keras get_file function. Let's start by assigning the URL to a variable called url. We're going to use the Auto MPG dataset, although it doesn't really matter what you use for this example, as long as you can access it directly via URL. Let's run this. Next, we make sure we have TensorFlow 2.0 installed, so we run pip install tensorflow and then print out the version to check that we have the correct one. Then we call the Keras get_file function. The first argument is the file path we want to save to, and the second argument is the file source. Let's run this. Note that it's possible to save the file to a different directory, but we'll be saving it to Keras's default folder, so you can see from the printout that the file ends up in /root/.keras/datasets. Next, let's call the head command so we can see the first few lines of the file. As you can see, it's not exactly a CSV: instead, each column is separated by whitespace, and there's no header. In order to load this data, we can still use the pandas read_csv function, but we have to pass in two extra arguments. The first says that there's no header, so header=None, and the second tells pandas that the delimiter is whitespace, so we set delim_whitespace=True. Next, we call df.head() just to make sure everything works as expected. As you can see, the result appears to be in the right format, and from here you can process the data using Python code as you normally would.

The third method we're going to look at, in order to add your own files to Colab, is to upload the file directly. To do this, we have to run a special Colab function: we say from google.colab import files, then we call files.upload(). Let's run this. You'll see that this creates an upload button, which we can click to choose a file from the local file system. I'm going to choose the daily minimum temperatures file, and if we print out the return value, you can see that it's a dictionary where the file name is the key and the value is the file contents. If we use the command !ls, we can see that the file has been uploaded to the working directory. Next, let's read in the file using pandas to make sure we get what we expect. Now, this file has some garbage lines near the end, so I've accounted for that by setting the argument error_bad_lines=False. This ignores the bad lines but prints them out as they are encountered. As you can see, the file is loaded in successfully.
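Here is a minimal sketch of methods two and three together; the Auto MPG URL is an assumption (the lecture doesn't read it out), and files.upload() is a Colab-only API:

    import pandas as pd
    import tensorflow as tf

    # Method 2: fetch a file by URL with Keras's get_file helper.
    url = 'https://archive.ics.uci.edu/ml/machine-learning-databases/auto-mpg/auto-mpg.data'
    path = tf.keras.utils.get_file('auto-mpg.data', url)  # lands in /root/.keras/datasets
    df = pd.read_csv(path, header=None, delim_whitespace=True)
    print(df.head())

    # Method 3: upload a local file through the browser.
    from google.colab import files
    uploaded = files.upload()  # opens a file picker; returns {filename: contents}
    print(uploaded.keys())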
To follow up this example, we're going to look at a variation on what we just did. You'll recall that when you're writing code in Python, it's sometimes useful to split your code among several files. This helps to organize your code and keep similar things all in one place, while keeping different things separate. As a simple example, sometimes we'll learn about multiple algorithms in one course, but we'll test all of those algorithms on the same dataset, so there's no point in rewriting the data-loading code multiple times. Instead, we can write the data-loading code once and then import it. Now, you might wonder: since we're working in Colab, how can you import a function from a Python script if that script is on your local hard drive? Luckily, we can take the same approach we've already been using and upload that file to Google Colab. So here I'm going to call files.upload() again, and this time I'm uploading the Python script fake_util.py. fake_util.py contains only one function, called my_useful_function, and all it does is print out "hello world". Once you've uploaded the file, you can see that we can import it just as we would if we were working locally: I can say from fake_util import my_useful_function. Then, when I call my_useful_function, you can see that "hello world" is printed out, just as we expect. By the way, you might be wondering, as I did, what the path of the current directory actually is. To determine this, you can just run the usual Linux command pwd, which prints out /content; so /content is our current working directory.

The last thing I want to cover is something you're probably all wondering about: Google Drive is for storing files, so is it possible to access files on your Google Drive? Of course, the answer is yes. To do this, we have to import drive from google.colab. Then we mount the drive by calling drive.mount and specifying the path /content/gdrive. This is going to give you an authorization code: you go to the URL shown, sign in, accept some terms, and then it gives you a code. You copy this code, paste it back into the box, and hit Enter. Okay, that works. After we've done this, we can call ls again to check what's now in the current directory, and we can see that there is now an extra entry here: gdrive. So let's ls gdrive and see what that gives us. It looks like we now have a folder called "My Drive". Let's ls this as well (remember that you have to add quotes if your path contains whitespace), and now we can see a bunch of files that are in my Google Drive, which is essentially a bunch of VIP content for the VIP versions of my courses.
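Finally, a minimal sketch of the Drive-mounting flow just described; the mount point is whatever path you pass in:

    from google.colab import drive

    # Prompts for an authorization code on first use, then mounts your Drive.
    drive.mount('/content/gdrive')

    # Your files appear under "My Drive"; quote the path, since it contains a space.
    !ls "/content/gdrive/My Drive"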