Gradle Plugin Masterclass | Oliver Spryn | Skillshare

Watch this class and thousands more

Get unlimited access to every class
Taught by industry leaders & working professionals
Topics include illustration, design, photography, and more


Lessons in This Class

32 Lessons (2h 11m)
    • 1. Welcome, What We Will Build (2:41)
    • 2. A Note on IDEs (0:51)
    • 3. Section Introduction - Set up the Build Environment (0:33)
    • 4. Install the Proper JDK (1:05)
    • 5. Create the Project (6:15)
    • 6. Adjust the Build Scripts (13:59)
    • 7. Section Introduction - Our First Gradle Plugin (0:47)
    • 8. The Plugin Skeleton (2:56)
    • 9. Create the Extension (0:56)
    • 10. Create the Task (1:36)
    • 11. Create the Plugin (3:00)
    • 12. Putting It All Together (10:13)
    • 13. Section Introduction - Build a Test Coverage Reporter Plugin (0:30)
    • 14. Understanding the Gradle DSL (5:16)
    • 15. Prepare for Development and Testing (2:27)
    • 16. Accept First Order Configuration (6:59)
    • 17. Accept Higher Order Configuration - Introduction (3:29)
    • 18. Accept Higher Order Configuration - Option 1 (3:41)
    • 19. Accept Higher Order Configuration - Option 2 (4:36)
    • 20. Accept Higher Order Configuration - Option 3 (4:31)
    • 21. Accept Higher Order Configuration - Option 4 (3:07)
    • 22. Implement the Coverage Reader Task (5:01)
    • 23. Section Introduction - Advanced Plugins and Seamless Integration (0:45)
    • 24. Gradle Lifecycle and Automatic Integration (4:28)
    • 25. Add Multiple Events and Extensions to Your Plugin (6:47)
    • 26. Projects with Multiple Plugins (4:15)
    • 27. Integrate Your Plugin Alongside Existing Projects (6:30)
    • 28. Section Introduction - Publish to the Gradle Plugin Repository (0:38)
    • 29. Update the Build Scripts for Publication (4:25)
    • 30. Plugin Publication and Secrets Management (11:32)
    • 31. Continuous Integration and Continuous Deployment (6:34)
    • 32. Course Conclusion (0:35)
  • --
  • Beginner level
  • Intermediate level
  • Advanced level
  • All levels

Community Generated

The level is determined by a majority opinion of students who have reviewed this class. The teacher's recommendation is shown until at least 5 student responses are collected.

10 Students

-- Projects

About This Class

I use Gradle plugins to save my team hundreds of hours of unnecessary work each year, and so can you. If you need to fill a key hole in your Gradle build process, then building one or more Gradle plugins may be your ticket to saving time and boosting your productivity. This course is your one-stop, comprehensive resource to do just that.

In this course, I show you how to extend the Gradle build tool for your needs. Beginning with the project and IDE setup, I quickly move on to show the basic skeleton, which is essential for all Gradle plugins. This foundation is then adapted to fit more complicated workflows for both large and small projects alike.

The patterns I discuss in this course are backed by seasoned advice, which I have garnered and practiced for years as I built and led the development of MyUPMC Android, a hospital patient portal application that uses three custom Gradle plugins every day.

Some of the concepts I discuss in this course include:

  • basic principles of Gradle plugin development
  • tips for optimizing your projects for team-wide development
  • hooking into the Gradle lifecycle to make your plugin work seamlessly in every build
  • packaging multiple plugins and events into a single custom extension
  • publishing your work to the Gradle Plugin Repository
  • building private plugins alongside existing projects

By the end of this course, I am confident that you will have enough knowledge to build Gradle plugins and events from scratch. The patterns taught in this course generally apply to the build tool as a whole, so you can reuse these techniques regularly in your daily development and expect a similarly successful outcome.

Meet Your Teacher

Teacher Profile Image

Oliver Spryn

Lead Android Architect + Content Creator

Teacher

I am the lead software engineer on the MyUPMC Android project, the patient portal for one of America's largest healthcare systems. The software applications I help build are geared just as much to those who are 28 as they are to those who are 88. The difficulty of appealing to such a large age group is often subdued when you make the technology feel invisible, integrated, and working just the way it should.

Throughout my leadership experience, I've gained a particular aptitude for Android software architecture and development, the Gradle build system, and Git. On the web, you can find me blogging about my technical challenges and triumphs and distilling my experience down into courses here on Skillshare.

See full profile

Class Ratings

Expectations Met?
  • Exceeded!: 0%
  • Yes: 0%
  • Somewhat: 0%
  • Not really: 0%
Reviews Archive

In October 2018, we updated our review system to improve the way we collect feedback. Below are the reviews written before that update.

Why Join Skillshare?

Take award-winning Skillshare Original Classes

Each class has short lessons and hands-on projects

Your membership supports Skillshare teachers

Learn From Anywhere

Take classes on the go with the Skillshare app. Stream or download to watch on the plane, the subway, or wherever you learn best.

Transcripts

1. Welcome, What We Will Build: Back in 2017, when I first started using Gradle on a professional Android app, I didn't know that hiding within our burgeoning project was a huge time-saver, collectively costing my team hundreds of hours of lost time. As more engineers, designers, and product leaders jumped in and had a hand in the project, the engineering team was asked to create and deliver more and more one-time customizations for experiments and demos. I knew that much of our time was spent on build-related matters, and I could have saved us those hundreds of hours back then, if only I had known how to build a Gradle plugin to do that work for us. Fast-forward to today: I have saved my much larger team many more hundreds of hours by building and incorporating three custom Gradle plugins into our project. If you'd like to learn how to build Gradle plugins for your project to fill in key holes that could save your team a significant amount of time and effort, you are in the right place. Hi, my name is Oliver Spryn. I'm the lead Android engineer at UPMC Enterprises working on the MyUPMC project, the definitive patient portal for one of America's largest hospital systems. Throughout my years on the project, I've learned quite a bit about how to make Gradle work effectively for large projects, how to keep our scripts clean, and how to build plugins when nothing in the community can meet our needs. In doing so, we've unlocked previously unimaginable productivity and even opened up functionality so that non-technical colleagues can self-service some of their requests without ever needing the help of an engineer. Throughout this course, we will build quite a few plugins, but our flagship project will be the emphasis. Our objective will be to create a plugin which can extract test coverage metrics from a JaCoCo coverage report file, then distill and display a summary of that information in the console.
This implementation will come by way of one plugin containing two separate abilities, called tasks or actions, and two configuration blocks. We will spend a significant amount of time exploring how to create infinitely customizable and flexible configuration blocks. This configuration can be as simple or as complex as you need, and it's one of the ways your end user can interact with your plugin, so it's important to get it right. Finally, we will publish our work to the Gradle Plugin Repository for the world to consume and automate this process with a continuous integration and delivery pipeline. By the time you're done, you will have several professional tools in your tool belt that'll make this process convenient for smaller projects, but also scale well for larger teams working on large-scale projects. You don't have to take these next few steps alone. As you go through this course, I invite you to join a community of learners and builders on the same journey as you on my Discord server. You'll have an opportunity to connect with me and receive assistance if you get stuck at any point. I look forward to the opportunity to work with you in the course, Gradle Plugin Masterclass. Well, let's get started. 2. A Note on IDEs: In this course, you'll see me using the IntelliJ IDEA Ultimate IDE from JetBrains. While this particular edition is a paid professional product, you can accomplish the exact same things I'm doing in this course with their free Community Edition. If you want the best follow-along experience, I would suggest that you go to the JetBrains website and download IntelliJ IDEA Community. However, you don't have to; if you're more comfortable with Visual Studio Code, Eclipse, or even Vim and the command line, feel free to use those tools. There isn't anything special about what IntelliJ is giving me that you can't do yourself. As you'll see later on, even all of the build commands for our project can be encoded into IntelliJ.
The only advantage IntelliJ will give you is a project setup experience more similar to mine, and the ability to follow along exactly as you see in the videos. The choice is up to you, but I wanted to give you that advance notice before we started building our project. 3. Section Introduction - Set up the Build Environment: In this section, we're going to hit the ground running with three major objectives: create our project from scratch, adjust the build scripts for local Gradle plugin development, and create a plugin stub. The bulk of our work will focus on setting up our build system to optimize our development process later on. Out of the box, Gradle plugins aren't overly friendly to local development and testing, so there are a number of configuration changes and tricks this section will teach you to ensure that the rest of your journey is as seamless as possible. Your investment in this effort now will pay off during the rest of your development experience. 4. Install the Proper JDK: For those of you who have worked on Java projects in the past, installing the JDK is probably old hat. Nevertheless, for completeness' sake, I thought I would discuss how to do that in this video. Many times when you look at the Gradle documentation, they recommend that you install the latest LTS version of the JDK, whether you're building plugins or using Gradle to build your own projects. At the time of recording, the latest LTS version of the JDK is Java SE 11. Keep in mind that by the time you watch this video, Java SE 17 or newer may be the latest LTS version. This link here will show you how to go to this page and see what the latest version is from Oracle. In this case, I'll click on the Java SE download link, scroll down to the download section, and download the appropriate installer for my platform. I'm not going to guide you through the installation process for the JDK, since we've done things like that many times before.
But I did want to call out the importance of using the latest LTS version, since Gradle is quick to drop support for older LTS versions as soon as newer ones become available. 5. Create the Project: As I promised at the beginning of the course, we're going to start this project from scratch. There's going to be no magic setup; you're going to see how I do it from the very beginning. I've opened up my IDE and I'm going to press the New Project button on the welcome screen. In the new project dialog, I'll select Gradle from the left side, and for the project SDK, I'll select 11, which is the current LTS version of the JDK. I'm going to make sure that I deselect the Kotlin DSL build script option. In my opinion, the Kotlin DSL community in general doesn't have enough documentation and community support to make it worth my effort just yet. This is totally up to you, but I'll be using straight-up Groovy scripts, or .gradle scripts, to build this. For this project, I've decided not to use Java, but instead I'm going to use Kotlin, so I'm going to select Kotlin/JVM. Make sure that you don't select Kotlin for NodeJS or Kotlin Multiplatform, as that is not compatible with the kind of project that we're building. I'll select the Next button and give my project a name. In this case, I'll call it jacoco-test-coverage-reporter, and I'll provide a location for the project. I'm getting an alert saying that the directory is not empty. I already have a Git repository initialized at that location, which is why I'm getting this warning. This isn't something to be concerned about, and most likely, if you're starting the project on your own, you will not get this warning. I'm going to expand the artifact coordinates section. Inside there, I'm going to provide a group ID, which is going to be the inverse domain name of my organization. Since my organization is leftlane.academy, I'm going to provide a group ID of academy.leftlane.
And since this is a Gradle project, I like to namespace it, ending in gradle, so the entire group ID is academy.leftlane.gradle. The artifact ID I'm going to leave to match the project name, and the version is going to be 0.0.1. I personally like using semantic versioning, which recommends that you use this version whenever you're in the early stages of development. I'll go ahead and select Finish and wait for the IDE to initialize my project. Now that several minutes have elapsed, my project is fully set up and ready to go. As I mentioned earlier, since I already have a Git repository at this location on my disk, I'm going to be getting a few prompts from the IDE related to setting up Git for this project. I'm going to have the IDE automatically add new files to the Git staging section for me. Again, this is one of those things that's totally up to you, but I like the IDE to do this kind of work for me. Now, I'd like to clean up a bit of the Git repository. The first thing I'm going to do is open up my .idea folder and, inside of there, open up the .gitignore file. Depending on your team's culture, you may or may not decide to do what I'm doing here. Personally, I find it helpful to keep certain aspects of the .idea folder present whenever you commit your project to a Git repository. You may prefer to ignore the entire .idea folder; that's totally up to you, but here's what I find has worked best for me. Since I've worked with IntelliJ for a number of years, I've changed this file over time to fit my exact needs. If you'd like to use the configuration that I have here, feel free to take a look at this link, which will take you to a place where you can directly copy and paste it into your own .gitignore file. In general, I'm ignoring most things about this folder, except things which are incredibly relevant to this project. The most important parts that I want to keep around are the Run Configurations.
Now, depending on which version of IntelliJ you are using, that may be either a run configurations XML file located at the root of this folder, or it may be a subfolder with a collection of XML files inside of it. This is the most relevant part of the .idea folder that I recommend you keep. It will be especially relevant later on when we need to share build configurations with our team. Next, I'm going to add a .gitignore file at the root of my project. Since my IDE didn't create this file for me, I'll go ahead and add that file now. I'll right-click on the root of the project, select New and then File, name this file .gitignore, and tell my IDE to add it to the Git staging section. Two of the most important things to ignore whenever you're building Java projects with Gradle are the build folder and the Gradle cache files, so I'll add .gradle and build. The last thing I'm going to do is ensure that the version of Gradle that I'm using is as up to date as possible. I'm going to go to Gradle's documentation site. I notice that at the very top left of their homepage is the current version of Gradle. At the time of recording, that's 7.1.1. I'm going to make a note of that. Then I'm going to go back to my terminal and make sure that the version of Gradle that I'm using matches what they have on their website. First, I'll navigate to my project directory. Then I'll just run gradlew, the wrapper for our Gradle project. I don't need to pass any commands in; all I want it to do is tell me "Welcome to Gradle" and then the Gradle version. As I can see, I'm running a whole generation behind the current version. In this case, I'm going to run gradlew again, passing in the wrapper command and specifying the version that I want to upgrade to; in this case, it's 7.1.1. In a moment, I'll see that the task has completed, and I'll run that same gradlew command again.
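The wrapper check and upgrade just described can be sketched as terminal commands like these (7.1.1 was the current release at the time of recording; substitute whatever the Gradle homepage lists today):

```shell
# From the project root, ask the wrapper which Gradle version it uses
./gradlew --version

# Regenerate the wrapper scripts, pinning them to the newer release
./gradlew wrapper --gradle-version 7.1.1

# Confirm the upgrade took effect
./gradlew --version
```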
And I can see that the version of Gradle has been updated to the modern standard. 6. Adjust the Build Scripts: In the previous video, we created a brand-new Gradle project that can be used for really anything. At this point, I could turn it into an Android project, a Spring Boot project, or some other generic Kotlin project. However, now it's time we turn our focus towards changing these build scripts so that we can build a Gradle plugin. To do that, I'm going to introduce you to two core Gradle plugins. The first one is called java-gradle-plugin, and it serves three different purposes. First, it introduces APIs for Gradle plugin development, so all of the specific SDKs that we're going to need to enhance the Gradle plugin system will be available because we've imported this plugin. Next, it will allow us to add metadata to our plugin. This is important not only when you're using the project and developing it locally, but later on when we go to publish it to the Gradle Plugin Repository. It also has specific validation for this metadata, so that we can't go wrong when it comes time to publish. The next core plugin that we'll use is called maven-publish, and similarly, this also serves specific purposes. First and foremost, it allows us to publish artifacts to a Maven repository. We're going to be specifically using that to declare a local repository and publish to it. That's going to make our job as engineers significantly easier, because we can do all of this on our own machine. I'd like to point out that Gradle is compatible with Maven repositories. So even though Gradle and Maven are, in a sense, build tool competitors, we can still use a lot of the constructs that Maven has set up, because Gradle is built to work with them. We're going to be using that to our advantage. Now that I've shown you the plugins that we're going to be using for our project, let's jump over to the IDE and put them to work.
Here I am back in IntelliJ. The first order of business is going to be to create a stub class. This is going to be important because we're going to start laying out the metadata for our Gradle plugin early on. One of the things that we're required to do is indicate which class is going to be the plugin's entry point. So what I'm going to do here should feel rather familiar. I'll open up my main folder and I'm going to create the package structure using the same group ID that I used earlier on. In this case, that will be academy.leftlane.gradle. With the package structure in place, I'm going to create my stub class. The name of the class doesn't matter, because we're going to pass that information off to the metadata in our build scripts here in a minute. But I'm going to be specific with my name and call it the name of the plugin, so that would be JacocoTestCoverageReporterPlugin. As I mentioned earlier, this is just a stub class; I'm not going to put anything in it in this video. The only thing that matters is its presence in our project. Now that we have our stub class created, let's jump to our build.gradle script and start using those plugins that I presented earlier in the video. At the very top, in the plugins block, I'm going to include both of those plugins: java-gradle-plugin as well as maven-publish. Now I'm going to move on to declaring which version of the Java virtual machine I intend to target. Since I'm using JDK 11, which is the latest LTS version at the time of recording, I'm also going to apply that as a stipulation in my build, like so: I'll set the sourceCompatibility and targetCompatibility properties both to 11. Since I'm using Kotlin, I'll also add a compileKotlin block. Inside of there, I'll set kotlinOptions.jvmTarget, and also set that to 11. I'd like to do a little bit of optimization here. I'm going to replace every instance of 11 with a variable called jvmVersion, and I'll set that variable equal to 11.
That way, whenever I need to move my target version someday, I can just change the variable. Now, I'm going to start declaring my Gradle plugin. Thanks to the java-gradle-plugin that we imported at the top, I have access to a block called gradlePlugin. I'll put that in down below. Inside of that block, I'll create another block called plugins. Inside of the plugins block, I'm going to create another block with the exact same name as my plugin, so in this case it will be jacocoTestCoverageReporter, written out in camelCase. Inside of that block, I need to declare two specific properties. The first property will be an id. This ID needs to be universally unique, especially if you plan on publishing to the Gradle Plugin Repository. So in order to make sure I don't have any clashes, I'm just going to use standard Java conventions: I'm going to use my unique identifier, academy.leftlane.gradle, and end with the project name, jacoco-test-coverage-reporter. The next property will be the implementation class, which is simply the fully qualified package name of the stub class we created at the beginning of the video. Now we're going to move on to using the second plugin which we imported earlier, the maven-publish plugin. This gives me access to a block called publishing. Inside of that block, I'm going to create another one called publications. Inside of there, I'm going to create another block called pluginMaven, pass in a type of MavenPublication, and make that a block. This next part is probably the most dense of our entire setup process. It requires us to create a custom task to gather the sources from our build. I'm going to start by creating a task called sourcesJar, and as its type, I'm going to specify Jar. Inside here, I'm going to say from sourceSets.main.allJava. This will be used to gather all of the source files from our main project.
The last thing we'll have to do is specify an archiveClassifier of sources. This URL provides a link to Gradle's documentation, along with why they recommend this approach, if you're interested in more information. I'll go back to my pluginMaven block and add one more property to it. Inside of there, I'll set the artifact to the outcome of the sourcesJar task. Last, we need to specify where this is going to be published. Maven calls that a repository. Beneath the publications block, I'll create a repositories block and specify the type as maven. Inside of that block, I'll specify a name for this repository; I'm just going to call it local. Finally, I'll set the url property to indicate where this repository is going to live. In this case, I'm going to have it live at my project directory root, /repo. That way, it will be very close to my project, and the output will be very easy to import whenever it comes time to test my plugin. Now, with the most difficult part of our setup out of the way, let's finish up by rounding out our dependencies. I'm going to need to import the Gradle API, and this is going to be done via a function call. This is available to us, again, because we've imported java-gradle-plugin up above. Then I'm just going to specify this next dependency as Kotlin. Let's test our work. I'm going to add a build configuration to our IDE. Up in the top right, I'm going to press the Add Configuration button. In the resulting dialog, I'll press the plus button and select Gradle. I'll give this run configuration a name of just build. This next option is going to be very important, especially if you plan on sharing this project with a team of developers: I'm going to select the "Store as project file" checkbox. It's going to ask me where I want to store this configuration. By default, it's going to be under the .idea folder, which I've already set up to handle it and share that information to Git in a previous video.
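Assembled from the steps in this lesson (including the clean-up addendum added near its end), the build.gradle script looks roughly like the sketch below. The Kotlin plugin line and its exact version are my assumption based on the wizard-generated script, and DSL details may vary slightly across Gradle versions:

```groovy
plugins {
    id 'java-gradle-plugin'
    id 'maven-publish'
    // Assumed: the project wizard adds the Kotlin JVM plugin; the version
    // shown is a guess matching the Kotlin 1.5 release mentioned in the lesson
    id 'org.jetbrains.kotlin.jvm' version '1.5.21'
}

group = 'academy.leftlane.gradle'
version = '0.0.1'

// One variable so the JVM target only ever needs to change in one place
def jvmVersion = '11'

sourceCompatibility = jvmVersion
targetCompatibility = jvmVersion

compileKotlin {
    kotlinOptions.jvmTarget = jvmVersion
}

gradlePlugin {
    plugins {
        jacocoTestCoverageReporter {
            id = 'academy.leftlane.gradle.jacoco-test-coverage-reporter'
            implementationClass = 'academy.leftlane.gradle.JacocoTestCoverageReporterPlugin'
        }
    }
}

// Gathers the main source set's sources into a companion -sources JAR
task sourcesJar(type: Jar) {
    from sourceSets.main.allJava
    archiveClassifier = 'sources'
}

publishing {
    publications {
        pluginMaven(MavenPublication) {
            artifact sourcesJar
        }
    }

    repositories {
        maven {
            name = 'local'
            url = "$projectDir/repo" // publish next to the project for easy testing
        }
    }
}

dependencies {
    implementation gradleApi()
    implementation 'org.jetbrains.kotlin:kotlin-stdlib'
}

// Addendum described at the end of the lesson: wipe the local repo on clean
afterEvaluate {
    task removeRepo(type: Delete) {
        group = 'build'
        delete "$projectDir/repo"
    }

    clean.dependsOn removeRepo
}
```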
Moving down to the Gradle project, I'll select the folder and make sure that the current project is selected. And then for the tasks, I'll simply just type build. I'll press OK, and I can see that my new configuration is in the Run Configurations menu. Let's test that out by pressing the Play button and making sure that our build succeeds. Obviously, at this point, we really have nothing in our project, so it builds relatively quickly. There is one thing I'd like to point out about the build that I've got on my machine, and you may get on yours as well. If I take a look higher up in my build log, I can see there were several warnings regarding different versions of Kotlin. In my project, I'm using Kotlin version 1.5, but Gradle is using version 1.4. It's important to note that Gradle ships with its own version of the Kotlin standard libraries so that it can support both .gradle files as well as .gradle.kts files, the modern Kotlin version of Gradle files. So unless I'm willing to downgrade the version of Kotlin that I'm using to match the version that Gradle is using, I'm going to continue getting this warning. In my personal experience, I've never seen it cause a problem, but if it does, now you know how to rectify it. So right now, we have our build working successfully. Let's see if we can get our publication to work. Just as before, I'll go up to my Run Configurations menu and select Edit Configurations. From the left side, I'll select the plus sign, provide a name of publish local, select "Store as project file", and ensure that I have my current Gradle project selected. For the tasks, I'll just type publish. I'll select OK to close the dialog, and then I'll go ahead and run my new configuration. Let's see what happened. Over on the left, you can see I have a new folder called repo. Inside of there are a series of folders matching my package structure.
And ultimately, I can see the name of my plugin reflected in the folder name. Inside of there, I can see I have a plugin version of 0.0.1, and if I expand that folder, I can see a JAR file, which is where my plugin ultimately lives. We're definitely headed down the right path. But as good citizens, we should make sure that our Git repository is going to be clean when it comes time to publish it. So I'll open up my project root .gitignore file. At the bottom, I'll add repo to make sure that the build artifacts aren't published to the Git repository. Since most projects have a clean function, I'm going to add that as a custom build configuration as well. Up at the top Run Configuration menu, I'll select the button again, select Edit Configurations, and go through the same ceremonies, this time for clean. I'll give it a name of clean, again making sure I store it as a project file, select my current project, and make sure that I specify a task name of clean. By default, Gradle isn't going to know to delete that repo folder when it runs a clean task. To do that, I'm going to add an addendum to my build.gradle file at the root of the project. All the way at the bottom, I'll add an afterEvaluate block and create a custom task. This task I'm going to call removeRepo, and I'm going to specify a task type of Delete. Inside of that block, I'm going to add this task to a group. This is optional, but I like to do it for good citizenship. This group is going to be build. Now, why did I choose build? Let me expand the Gradle panel on the right, and I'll show you that there actually already exists a default set of tasks that Gradle ships with. If I expand the existing build group, you can see there are a number of tasks inside of it, including clean. And since I'm adding an addendum to the clean task, I'm just going to put it in the same group for the sake of organization. Of course, we need our task to actually do the deletion.
So I'm going to specify the exact same path that I specified up above whenever I was creating the repository: I'll say delete, project directory, /repo. It'd be nice if this task ran automatically for me. To do that, I'm going to make some space beneath the removeRepo task and say clean.dependsOn removeRepo. That means every time the clean task runs, the removeRepo task will run first. I'll hit the Gradle synchronize button in my IDE, and in a moment you can see this new task now appears underneath the build group. Let's just test this and make sure that it works. With my clean task selected, I'll press the Play button, and not only does the clean task run, but so does the removeRepo task. As you can see on the left, the repo folder has been deleted. 7. Section Introduction - Our First Gradle Plugin: Now that we have our Gradle scripts configured for local development and publishing, we can focus our attention on building our first plugin. To keep things as straightforward as possible, this first plugin will be very simple. I will focus on getting you familiar with the basic components that comprise what we call a Gradle plugin, and understanding what role each of these pieces plays. In this section, we'll focus on four topics: understanding the components of a Gradle plugin, creating a block to customize its behavior, writing a task to perform some kind of job, and tying it together with the plugin. We'll then be able to form a very basic test to ensure that the entirety of our work functions as we would expect before moving on to a more complex example. In the section that follows this one, we'll put our knowledge to work to build the project I showed you at the beginning of the course. 8. The Plugin Skeleton: Before we can start building our plugin, it's important that we understand some basic terminology and what it's going to allow us to build in the future.
Whenever I refer to a plugin, it's more of a catch-all phrase used to describe a collection of things. The very first component of that is going to be the plugin itself; we'll talk about what that means here in a minute. The next component is the task, and finally, we have the extension. When you have a plugin, typically you have these three things that come together. The plugin serves as the application entry point, much like how a main function in a Java application serves as an entry point for that kind of application. This is the equivalent in the Gradle world. The task is actually what does the work for us. This is what contains the business logic which is going to be unique to your plugin. The extension allows the consumers of your plugin to configure its behavior via a DSL in their Gradle scripts. Technically, the extension is optional; if you have nothing that you want to configure, or your plugin is basic enough, you may decide to forgo this altogether. But at a minimum, you will always have the plugin and at least one task. The way I like to think of this is sort of like a box-in-the-attic analogy. If you're going to your attic to get a box of things, you're probably not going there to get the box itself, but rather the things that are in the box. You don't want the box; you want the contents. It's similar here. The plugin can be thought of as the box: it's the delivery mechanism for everything that's inside of it. So whenever you ask Gradle to import a particular plugin for you in the plugins block, you're asking it to go to the attic, get the box, and return the tasks and the extensions for you. The plugin is just a delivery mechanism. Pictorially, I would think of it as something like this: we have something on the outside, which is the plugin, and inside we have one extension and one task. This is about the most simple use case whenever it comes to Gradle plugins.
This establishes a flow where the input is the extension, with which the user specifies what behavior they want, and the task is the output: what happens as a result of what they requested through the extension. Later on in the course, we'll iterate on this and enhance it a little further. You can have one plugin with multiple tasks, where all of those tasks are configured from a single extension; all a user would have to do is create an extension and then call the corresponding task. Going even deeper, we can have a single plugin with multiple extensions and multiple tasks. And finally, you can even have one project contain multiple plugins, each with their own unique extensions and tasks that are completely independent of each other. This is the power of Gradle, and we're going to explore it starting from the very first example, then work towards more complex examples like what you're seeing here later on in the course. 9. Create the Extension: A while back, we created a stub class for our plugin, but now it's time to move on to the first part of our implementation: the extension. This will have to be its own class, so go ahead and create a new class and call it JacocoTestCoverageReporterExtension. I'm going to have to make this an open class. In Java terms, that essentially means this class is not final; in Kotlin, all classes are final by default unless you mark them as open. Gradle is going to do some work for us behind the scenes in order to make this work as a DSL, and for that to happen, it needs the class to be non-final. Inside of here, I'm going to add one property, a message, and I'm going to specify the default value as "Hello World". And that's it. We've created our simple extension, just as easy as that. Gradle will do all the heavy lifting for us behind the scenes; all we needed to do was create a data model for our extension. 10. Create the Task: Our next step in this journey will be to create a custom task class.
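As a sketch, the extension class just described might look like this in Kotlin (the class and property names follow the transcript):

```kotlin
// Open class: Gradle subclasses extensions at runtime to wire up the
// DSL, so the class must not be final (Kotlin classes are final by default)
open class JacocoTestCoverageReporterExtension {
    // Default value used when the consumer does not configure a message
    var message = "Hello World"
}
```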
I'm going to go ahead and create a new file called JacocoTestCoverageReporterTask. Just like the extension, this will also have to be an open, non-final class. Next, I'll inherit from a class called DefaultTask, and I'll create a function to perform my business logic. The name of this function can be anything I like; this is the function where all of my business logic is going to take place. To indicate that it's something Gradle should pay attention to, I'll annotate it with the @TaskAction annotation. Since I created an extension for my plugin, I'll need some way of injecting that extension into my task. I can do that by creating a constructor and annotating it with @Inject. Then I'll create a parameter on that constructor for the extension I created earlier, and I'll make sure to prepend it with private val. That way, every function in my class will have access to what was injected. Finally, for this simple example, I'm just going to print out the value of the message in my writeMessage function. To do that, I'll simply say println and provide extension.message. For this simple example, that's all we have to do. In the next video, we'll bundle these two things together and create a working plugin. 11. Create the Plugin: Now that we have both an extension and a task ready for us to use in our plugin, let's go ahead and put both of them together. Let's open up that stub class that we created at the beginning of the project. We'll need to make some changes to that class. Notice one big difference here: this one is not an open, non-final class. Gradle doesn't have that requirement for plugins, just for tasks and extensions. But of course, right now this class doesn't do anything, so let's get started on making some changes. The first thing I'm going to do is inherit from a class called Plugin. Now, this class has a generic parameter, and I'm going to specify it as the type Project.
This class requires me to override a function called apply, so we'll go ahead and create that function. Now, you'll notice that as a parameter to this function, I have a property called project. This parameter is effectively an object model of the implementing project that uses this plugin. I'm going to have to make a few changes to this model in order to add both my extension and my task to this project. Notice I didn't say anything about injecting the plugin into the project; really, the only responsibility of the plugin is to inject both the extension and the task into the project. We'll start with my extension. I'll create a variable called extension, and then I'll say project.extensions.create. This is my opportunity to define what the DSL is going to look like. When a project uses this, I would expect the DSL to have some sort of name for this block, and I specify that name right here. This will be jacocoTestCoverageReporter; that's the name of the block. Now, as for the DSL model, I'll specify the extension we created earlier as the second parameter. I can do that by saying JacocoTestCoverageReporterExtension::class.java, or, if you're doing this in Java, this would simply be .class at the end. This extension isn't any good unless I can tie it together with a task. I'll do that using a very similar construct: on the next line, I'll say project.tasks.create. For the first parameter, I'll specify the name of the task as a string; I'm going to call this writeMessage. For the second parameter, I'll pass in the class that implements the task, which of course is going to be JacocoTestCoverageReporterTask. The third, fourth, fifth, sixth, and whatever other arguments I add to this function will be injected as constructor parameters to the task. So, since I only need one parameter, which is going to be the extension, I'll simply pass in the extension from the previous line. And there we have it.
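Taken together, the task and plugin classes described in these two lessons can be sketched like this. The names follow the transcript; treat it as an outline of the approach, not the course's exact code:

```kotlin
import org.gradle.api.DefaultTask
import org.gradle.api.Plugin
import org.gradle.api.Project
import org.gradle.api.tasks.TaskAction
import javax.inject.Inject

// Tasks, like extensions, must be open so Gradle can subclass them
open class JacocoTestCoverageReporterTask @Inject constructor(
    private val extension: JacocoTestCoverageReporterExtension
) : DefaultTask() {
    // The annotation marks this as the work Gradle runs for the task
    @TaskAction
    fun writeMessage() = println(extension.message)
}

// Plugins do not need to be open; this one only wires things together
class JacocoTestCoverageReporterPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        val extension = project.extensions.create(
            "jacocoTestCoverageReporter",            // DSL block name
            JacocoTestCoverageReporterExtension::class.java
        )

        // Arguments after the task class are passed to its constructor
        project.tasks.create(
            "writeMessage",
            JacocoTestCoverageReporterTask::class.java,
            extension
        )
    }
}
```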
My plugin has been fully wired and is ready to go. In the next video, we'll run this plugin and see what happens. 12. Putting It All Together: By now, we have the three major components that we need for a plugin: the plugin itself, the extension, and the task. Now it's time to put all of this together and create a subproject that allows us to run the results of our work. This is going to make the process of developing this plugin much easier. To begin, I'm going to create another subproject inside of our existing project; I'm going to call it demo. To do that, I'm going to create a new folder called demo. Then I'm going to create a build.gradle file, which is necessary to identify this folder as a subproject. Also, by convention, we're going to need a settings.gradle file, so I'll go ahead and create that file now. I'd like to point out the similarities between our base project and our demo subproject: notice how we have build.gradle and settings.gradle files in both locations. One element that is missing inside of our demo project is the .gitignore file. If I open up the one that's inside of our base project, you can see there are two things we're going to need from it: we need to ignore the .gradle folder as well as the build folder. These are going to be automatically generated by the build, and we don't want them pushed up to our Git repo. So from here, I'll create a .gitignore file inside of my demo project, and I'm going to ignore both the .gradle folder as well as the build folder. Now that we have the bones of our demo module set up, let's jump back out to our master build.gradle and take a look at what we see in here. At first glance, I can see towards the top of the file that I have several pieces of versioning information. Now, unfortunately, I'm going to need these pieces of versioning information in both my base project and my demo project.
So for example, you can see I have my version of Kotlin in here, the version of my plugin, and also the JVM version that I'm targeting. It'd be really nice if I could isolate this in some way so that I could refer to it in both my base project and my demo project, and only need to update it in one location when that becomes necessary. So let's take some time and fix that. There are several different ways you could go about tackling this. For me, the easiest way is to create a versions.gradle file at the root of our project, and then use that to distribute the versions out to both of the modules. Once I have this file created, I'm going to go ahead and create an ext block. This is something that we've seen before. All I'm going to do is create three different variables and put each of those versions inside of it. I just copied these versions from the root build.gradle file in our project; nothing special here. Now I'm going to jump back to my root build.gradle file, and I'm simply going to say apply from: 'versions.gradle'. This allows me to import that file. For the most part, this is just going to be a drop-in replacement. Since I already have the JVM version defined inside that file, I can remove this ext block, and I'm also going to replace the version with the variable we defined inside of our versions.gradle file. Unfortunately, because of the way the plugins block and the apply function work inside of Gradle, we can't just use this variable inside the plugins block. The plugins block, as well as its version information, must be declared in advance and at the very top of the file. So really, this apply from that I have on line seven only works on all of the code below it, but not above it. Fortunately, Gradle has a way of working around this through what they call plugin management, and this needs to be done from the settings.gradle file at the root of our project. Let's open that up now.
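The shared version file and its consumption in the root build.gradle might look roughly like this; the variable names and version numbers here are illustrative assumptions, since the transcript doesn't spell out the exact identifiers:

```groovy
// versions.gradle -- single source of truth for version numbers
ext {
    kotlinVersion = '1.6.10'   // illustrative values only
    pluginVersion = '1.0.0'
    jvmVersion = '11'
}

// root build.gradle (excerpt)
apply from: 'versions.gradle'

// The extension variables are now available below this point
version = pluginVersion
```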
We haven't really been in here before, so it might feel a little unfamiliar, but the concepts are pretty simple. At the very top, I'm going to create a block called pluginManagement, and this does need to be at the very top of the settings.gradle file. Inside of there, I can use the apply function, so I'm going to say apply from: 'versions.gradle'. Now, here's the real solution to our problem: after this apply function, I can have a plugins block. This is the real magic of this solution, because it allows me to use apply before plugins. Effectively, what I'm going to do here is create an alias for the Kotlin plugin. I'm going to use it very similarly, but I'm simply going to say: any time I refer to a plugin by this ID, make sure that you resolve this particular version. So let's see how this works. I'm going to go back to my build.gradle file and copy the line that imports the Kotlin plugin, including the version. Then, going back to settings.gradle, I'm going to paste that ID into the plugins block. For now, it looks exactly the same. However, since I have versions.gradle already applied to the script, I can simply replace the version with the variable that's defined inside of versions.gradle. Finally, I can go back to my build.gradle and simply omit the version from the Kotlin plugin, because Gradle already knows the version that's been defined inside of settings.gradle. Perhaps you might feel like this was a bit of a hollow victory, but the concepts that I just showed you here we're going to use one more time inside of the demo subproject. This will allow us to keep the plugin version that we're using for our entire project consistent across both the base project and the demo project. That way, I don't have two different versions to maintain. So to get started with that effort, let's go back to our root settings.gradle file.
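The root settings.gradle arrangement just described might look roughly like this; the Kotlin plugin ID is the standard one, and the version variable name is an assumption carried over from the sketch of versions.gradle:

```groovy
// settings.gradle -- pluginManagement must come before everything else
pluginManagement {
    apply from: 'versions.gradle'

    // Alias: whenever this ID is requested, resolve it at this version,
    // so build.gradle files can omit the version entirely
    plugins {
        id 'org.jetbrains.kotlin.jvm' version "$kotlinVersion"
    }
}
```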
I'm going to copy the entire pluginManagement block, and I'll paste that into the settings.gradle file for our demo project. Now, obviously there are a couple of things that have to change in this file. The first, of course, is where I'm resolving versions.gradle: I'm going to add another dot to the prefix of this path, so that it looks one directory above my current one. Instead of redefining how I access the Kotlin plugin in the plugins block below, I'm going to define how I access my own plugin, using the same constructs, of course. So I'm simply going to replace the ID of the Kotlin plugin with my own plugin's ID, which in this case is going to be academy.leftlane.gradle.jacocoTestCoverageReporter, and the version, of course, is going to be pluginVersion. Now I can jump over to the build.gradle file for my demo subproject and simply import the plugin by its ID; I don't even need to think about the version anymore. Before this whole thing works properly, there's one more thing we need to adjust. By default, Gradle looks for plugins on the plugin repository online, and of course, because we're still in development, we haven't published our plugin anywhere except for the repo folder inside of our project. So we need to tell the plugins block to look there. First, I'm going to jump back to the settings.gradle file, and at the bottom of the pluginManagement block, I'm going to create a new repositories block. Inside of there, I'm going to create a maven block and specify the url property to look inside of the repo folder. And beneath that, I'm going to tell it to also look at Maven Central. This is necessary because we're using Kotlin and other dependencies inside of our plugin that this project will still need to resolve. We're just about ready to start running our project, but first, I'd like to add a new build definition to our IntelliJ project.
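The demo subproject's settings.gradle would then look something like this sketch; the plugin ID and variable names mirror those used above and are assumptions about the exact course code:

```groovy
// demo/settings.gradle
pluginManagement {
    apply from: '../versions.gradle'   // one directory up from demo/

    plugins {
        id 'academy.leftlane.gradle.jacocoTestCoverageReporter' version "$pluginVersion"
    }

    repositories {
        maven { url '../repo' }   // the locally published plugin
        mavenCentral()            // Kotlin and other dependencies
    }
}
```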
So go ahead and do that by going up to the top and selecting Edit Configurations. Then I'm going to add a new Gradle build definition. For this example, I'm just going to name it "Run HelloWorld", and as before, I'm going to make sure I store it as a project file. I'll select the Gradle project from the list. Then, for the task name, I'm simply going to say writeMessage, which is of course the name of the custom task that we just developed. The only difference here is we're going to have to tell it not to run this on the root project, but on our demo project. For the arguments, I'm simply going to say --project-dir and then tell it to look inside of the demo project. So it's going to be running the build.gradle file inside of demo instead of the one at the root of our project. That's an important distinction. Of course, this plugin needs to exist and have been built by our base project before we can run it. So I'm going to scroll down a bit and go through a very similar set of steps: select "Run Gradle task", choose the Gradle project, and then for the task name simply say publish. We've done something like this before. The only difference here is we need to make sure that our IDE enforces this order of build steps, because obviously we need the plugin to be built and existing inside of the repo folder before we can run it. I'll go ahead and hit OK, and then I'm going to run my new task. As you can see, I get the "Hello World" message by default when I run my task. Let's go ahead and customize that. Jumping back to the demo's build.gradle file, I'm going to go ahead and create that extension block, and then I'm going to change the message to say "Hello Earth". Pressing the play button, I can see that my message has been updated. Now, one last test here: I'd like to make sure that the order of the build steps is working as expected. So I'm going to go up to the very top, press clean, and let that run.
And as you can see, the repo folder no longer exists, which means that obviously I can't run this plugin until I build first. So I'm going to go back up to the top, and this time I'm going to run "Run HelloWorld" and press play. After it builds, you can see that not only does the repo folder show up, but I also get the "Hello Earth" message back in my output. As you can see, everything is working as expected. 13. Section Introduction - Build a Test Coverage Reporter Plugin: By now, you should have a solid understanding of how a plugin interacts with an extension and a task. In this section, we'll unwind the simple plugin we just finished and rebuild it to create the JaCoCo test coverage reporter plugin that I opened this course by showing you. This section focuses on two major topics: creating the plugin that I showed you at the beginning of the course, and doing a deep dive on custom extensions to make them as simple or as complex as you'd like. In my opinion, this is one of the areas where Gradle really shines, and I think you'll like what it has to offer. 14. Understanding the Gradle DSL: Before we dive in and start building our plugin, I'd like to level-set our understanding of Gradle DSLs. If you've ever worked with Gradle before, you've certainly used a DSL, but if you don't understand how they work under the hood, they can feel a bit magical and perhaps even a bit cryptic. Since we will be writing a custom DSL for our plugin, I figured we could take the time now to really get a solid understanding of what they are and how they work. To start off, DSL is simply an acronym for domain-specific language. You can think of it as a programming language which is highly specific to your exact scenario. The naming conventions and the constructs that we use to build this DSL and interact with it are often named in a way that is very specific to that use case.
It's highly unlikely that you'd ever take a DSL from one particular area and use it for something completely different in another area of your application. They're typically single-use and single-focused. In many cases, DSLs are designed to leverage the syntactic sugar that the implementing programming language offers. As a result, many of the constructs that we're used to seeing, such as parentheses and equals signs, fall by the wayside, because the syntactic sugar of the language is able to compensate and build those constructs for us without us having to declare them explicitly. So in some ways, it can feel like you're not actually working with a programming language, but you are. One of the advantages of Gradle scripts is that they're actually just Groovy scripts. Groovy is just another JVM programming language, one which happens to have excellent support for things like DSLs. So perhaps this very understanding can take away some of the cryptic feeling of Gradle scripts: there's really nothing special about them; they're just Groovy scripts that tend to use a lot of DSLs. After you take the time to get used to this way of programming, you tend to appreciate what it has to offer. I'd like to give you a quick history lesson on why DSLs make so much sense. To do that, we're going to jump back into the early days of Java and show you how you would do the exact same thing using early Java code. Here is a standalone sample that I've built for illustrative purposes. If you take a look, you can see this is actually a Groovy script, but a lot of the constructs that I'm using in here are familiar to Java developers. Groovy has excellent support for existing Java syntax, which is why part of this looks like a Java program, and the other part, not so much. Don't get too caught up on the details; what I'd like you to understand is how we can evolve this script into a DSL and what the advantages will be. I'm going to call this my terse example.
Imagine that I have a program to send an email from one person to another with a subject. I could decide to implement that with code looking something like this. Now, this is a perfectly acceptable and working example; in fact, if you copy and paste this code into a Groovy interpreter, it will work. So why would we want to improve on it? You can see this example feels very terse. There's a lot of repetition, especially because I have to call the email instance multiple times just to set each property. We can solve part of this repetition by implementing the builder paradigm. If I tweak this example slightly and make it behave like a builder, I get something that looks like this. Now you can see I have a chain from top to bottom, which effectively describes everything I need to do to set the properties on the email and eventually send it out. This definitely feels like a step in the right direction, but I'd like it to feel a little more expressive, a bit more descriptive, and a bit less terse. That is where we can use Groovy's ability to create a DSL from this. If you look at this example, you can see the code towards the top is a little busier than it was before. Let's ignore that for now; I'd like you to focus on the implementation, where I have an email block with a from, a to, and a subject. That feels very natural and very descriptive. And if you notice, I don't even have to call send: I can have the constructs of the DSL do that work for me, so that I won't forget. Perhaps you're thinking: didn't I just trade one difficulty for another? The implementation is very easy to use, but setting it up seems rather difficult. Because Gradle has so heavily invested itself in DSLs, it has created some excellent constructs which allow us to eliminate the boilerplate that you're seeing here. So when it comes time to build our own DSL for our plugin, it'll be very simple.
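The on-screen slides aren't reproduced in the transcript, but the three stages described here can be sketched as a single Groovy script; the class and property names are my own illustrative choices:

```groovy
class Email {
    String from, to, subject
    void send() { println "Sending '$subject' from $from to $to" }
}

class EmailBuilder {
    private final email = new Email()
    EmailBuilder from(String f)    { email.from = f; this }
    EmailBuilder to(String t)      { email.to = t; this }
    EmailBuilder subject(String s) { email.subject = s; this }
    void send() { email.send() }
}

// 1. Terse style: every property set on the instance by hand
def mail = new Email()
mail.from = 'alice@example.com'
mail.to = 'bob@example.com'
mail.subject = 'Hello'
mail.send()

// 2. Builder style: one fluent chain
new EmailBuilder()
    .from('alice@example.com')
    .to('bob@example.com')
    .subject('Hello')
    .send()

// 3. DSL style: the closure's delegate is the builder, so bare
// from/to/subject calls resolve against it, and send() is implicit
def email(Closure body) {
    def builder = new EmailBuilder()
    body.delegate = builder
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body()
    builder.send()
}

email {
    from 'alice@example.com'
    to 'bob@example.com'
    subject 'Hello'
}
```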
Just to show you how easy this DSL is to use, let's expand on the example just a little bit. It would make sense for my email program to be able to send a body with multiple paragraphs and, let's say, at least one attachment. My DSL could look something like this. You can see how easy this is to read; it's almost like reading a book. To be honest, you really don't even need to be well-versed in a programming language to understand what's going on here. That is one of the advantages of a DSL. Again, ignore the complicated code at the top: Gradle will make this incredibly easy to build. Our focus is just going to be on building an excellent DSL so that whenever it comes time for another developer to use our plugin, it'll be very easy for them to understand how to configure it. 15. Prepare for Development and Testing: Now that we have an understanding of how DSLs work, let's jump back to the code and start adapting our project into the JaCoCo test coverage reporter plugin. Instead of creating a brand-new project or a whole bunch of new classes, let's just adapt what we have. The first thing I'm going to do is open up the task and change the action inside of it. Instead of printing out the message that's on the extension, I'm just going to have it print out the word "stub" for now. Obviously, we will come back and fix this later. Since the goal of my plugin isn't just to write a message, but to extract the test report, let's update the name of the function. In this case, I'll call it extractTestReport. The message property on the extension no longer has any value to me, so I'm also going to remove it. And finally, let's jump to the plugin itself and update the task name from writeMessage to extractTestReport. I'm going to jump into the build.gradle file inside of my demo project, because obviously I need to update the properties on my DSL. Since I've removed the message property from the extension, I'll go ahead and remove it from this block as well.
For the IDE run configurations, I'm also going to update the Run HelloWorld task. Instead of saying "Run HelloWorld", I'll call it "Run Extract Test Report", and of course, I'll have to update the task to also be extractTestReport. I'll go ahead and save that. Next, I'm going to copy a few examples of some JaCoCo test coverage reports that were generated in advance for this project. You can download these exact files from the resources section of this video. It doesn't really matter where you put them, but just for ease of access, I'm going to place them inside of my demo project, inside of their own folder called reports. As you can see, this example includes three different coverage reports. We'll take a deeper dive into what these files actually mean later on, but I wanted to at least have them available in our project. And finally, I'm just going to run a sanity check to ensure that my extractTestReport task works as expected. So if I press the play button in the IDE and look at the output, I can see that I am indeed getting that stub message. At this point, we have a clean slate to start building our test coverage reporter plugin. 16. Accept First Order Configuration: Now it's time to turn our attention to our plugin DSL, and we're going to start thinking about what I call first-order properties. This isn't a reference to some sci-fi movie; rather, it's what I call properties which we set one level deep inside of a block in our plugin DSL. Specifically, in this video, we're going to focus on two particular configuration values. The first one is just going to be a flag configuring whether the plugin is enabled or disabled. The second one will set what types of coverage metrics we'll be extracting from our coverage reports. So in essence, I would expect our DSL to look something like this. You can see that the enabled flag is rather self-explanatory; the coverage types we'll get into in just a minute.
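The on-screen DSL isn't captured in the transcript, but based on the two properties described, it presumably resembles this sketch in the demo build.gradle (later lessons refine the syntax, so the assignment form shown here is an assumption):

```groovy
jacocoTestCoverageReporter {
    enabled = true
    coverageTypes = ['INSTRUCTION', 'LINE']
}
```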
But you can see how I feed multiple pieces of data to this single property. The coverage types are simply the various types of metrics that the JaCoCo plugin generates for us whenever it creates a report. Specifically, there are five different summary types which it offers: INSTRUCTION, LINE, COMPLEXITY, METHOD, and CLASS. I'm not going to go into detail about each one of these, but I wanted to at least make you aware of them so you know what it is we're going to be working with. You might be wondering: where do I get these different types of coverage metrics? Well, they simply come from the XML reports that we have inside of our project. For example, if I open up any of the sample XML files included inside of our project, I'm going to see something like this, where we have several counter nodes, each of them having their own type. It's specifically these types that we'll be extracting with our plugin. So now that we have an idea of what our first-order DSL properties are going to look like, let's dive into developing this part of our plugin. Now that I'm back in IntelliJ, we're going to turn our efforts towards our extension class first, so I'm going to open up the JacocoTestCoverageReporterExtension class. Let's focus on the easiest property first, and that is going to be the enabled flag. I'm going to create a property called enabled and simply set it equal to true; by default, I'm assuming that the user will want to run our plugin. Now let's start using this property inside of our task, so I'm going to go ahead and open up the corresponding task class. Inside of the task action, I'm going to replace the line that says "stub" with an if check. In this case, if the plugin is enabled, I'll print "Plugin enabled"; otherwise, I'll print "Plugin disabled". I just want to make sure that our plugin is wired up correctly. Alright, so let's go ahead and test that out. I'm going to open up the build.gradle file inside of our demo project.
You can see that right now I simply have an empty DSL block, and I'd like to see what it looks like when I run it without any special configuration. If I run this, I can see it says "Plugin enabled" in the output, and this is exactly what I was expecting, because our extension sets enabled to true by default. Let's go ahead and change that inside of our block: I'm going to say enabled false and run it one more time. As you'd expect, it now says "Plugin disabled" in the output, so it looks like everything is working as intended. Let's move on to our next property. If you recall, earlier, when I described what I intend our DSL to look like, the coverage types accepted multiple inputs for this single property. To me, that sounds like something which needs to be a list, so let's go ahead and implement that. I'm going to create another variable called coverageTypes. In this case, I'd like to set it to a list of strings, where each string would be one of the coverage metrics available to us in the XML file. As a default assumption, I'm going to provide INSTRUCTION as the preconfigured value. Now, this is totally up to you: of the five coverage types that we can set, you can pick any one of the five, or all five. I'm just going to keep it like this for this simple example. Let's jump back over to our task and print out all of the coverage types which are collected here. In this case, I'm going to need a loop, so I'm going to loop over every coverage type in the extension and simply print them out one at a time. Just for some visual clarity, I'm going to add a new line between the plugin enabled/disabled prompt and all of the coverage type prompts below. Now I'm going to go ahead and run this with the default configuration. If I press play, I can see that I have "Plugin disabled" and "INSTRUCTION" in the output. I'd like to make a couple of changes to this.
Just as a quick refresher, I'm going to open up one of those XML files and look at all the types which I have available to me, and I'm going to add a couple more to my DSL. Since the backing property on this is a list, I'm going to create an array inside of my DSL. For the coverageTypes property, I'm going to set it equal to an array of strings; let's say in this case I pick COMPLEXITY, METHOD, and CLASS, just for starters. If I run that, I can see I have an error. In this case, I can't take advantage of the syntactic sugar that I used on line seven, where I can simply omit the equals sign. I have to be a little more terse and specific when setting this property: after the coverageTypes variable and before the array, I'll add an equals sign. This time when I run it, I can see I'm getting the intended output, so everything is working as expected. If you're happy with the way this looks, you can definitely stop here, as the design of the DSL is totally up to you. But for me, I'm not satisfied. It feels like I'm working with a programming language, and I'm not really utilizing the syntactic sugar that Groovy has made available to us. So let's make some changes on that end. Let's jump back to the extension. In this case, I'm going to create a function with the exact same name as the property I have up above: I'll say fun coverageTypes, and for the parameter, I'm going to use a vararg parameter whose type is going to be String. The coverageTypes variable that I have on line four I'm now going to use as a backing property instead. So in my DSL, I'm actually going to invoke the function on line seven and save the input in the property on line four. To collect all of those vararg values into a single list, I can simply say coverageTypes = types.toList(). Really, we're utilizing the strength of a vararg function to do the exact same work that we were doing earlier, just with the advantage of syntactic sugar.
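The reworked extension described here can be sketched like this in Kotlin (property and function names follow the transcript; the vararg overload legally coexists with the property of the same name):

```kotlin
open class JacocoTestCoverageReporterExtension {
    // Whether the plugin should do any work at all
    var enabled = true

    // Backing property; INSTRUCTION is the default metric
    var coverageTypes = listOf("INSTRUCTION")

    // Vararg overload lets the Groovy DSL read like:
    //   coverageTypes 'COMPLEXITY', 'METHOD', 'CLASS'
    // instead of:
    //   coverageTypes = ['COMPLEXITY', 'METHOD', 'CLASS']
    fun coverageTypes(vararg types: String) {
        coverageTypes = types.toList()
    }
}
```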
Now that means I can omit the equals sign and the square brackets, because instead of setting the property, I'm now actually calling a function. One of the nice things Groovy allows us to do whenever we call a function is to omit the parentheses. So now it looks like I'm just setting a property, and this really feels like a DSL to me. Let's run one last test to make sure this is still working properly. I can see my output in the console is exactly the same as before; everything looks squared away. 17. Accept Higher Order Configuration - Introduction: Now that we have our first-order configuration in place, it's time to turn our attention toward higher-order configuration. What do I mean by that? I'm simply referring to properties which are nested more than one layer deep underneath our DSL block inside of our Groovy script. For the sake of comparison, let's take a look at what we built in the last video. This is our first-order DSL, and if you'll notice, there are only two properties, nested one layer deep inside of the outer DSL block. By contrast, if I take a look at this next example, you can see that I'm setting properties on the DSL which are nested more deeply than the first two I have up above. This is what I would call second-order configuration, particularly when we set the name and the report path underneath each of the module blocks. Likewise, if I take this a step further, you can see that this is an example of third-order configuration, because the name and the report path are three layers removed from the outer block. In this video, we're going to be building both of the higher-order examples that I just showed you, along with two additional examples. You might be wondering: what's the virtue in building the exact same thing four different ways? Well, technically, there's no right or wrong approach to how you can build a DSL.
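The nesting depths being compared aren't shown in the transcript itself; a plausible sketch of the second- and third-order shapes, using the property names from the narration, would be:

```groovy
// Second-order: name/reportPath sit two layers inside the outer block
jacocoTestCoverageReporter {
    module {
        name 'app'
        reportPath 'reports/app.xml'
    }
}

// Third-order: the same properties, now three layers removed
jacocoTestCoverageReporter {
    modules {
        module {
            name 'app'
            reportPath 'reports/app.xml'
        }
    }
}
```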
Each example that I'm going to be showing you along the way will actually end up doing the exact same thing when we go to run our task. There's no functional difference between any of these examples. Instead, I'm showing you the variety that DSLs have to offer and the different tools you can use to leverage them. This will help you greatly expand your knowledge of building DSLs whenever it comes time for you to apply these concepts to your own real-world plugin. So as I mentioned before, whenever we deal with higher-order properties, I'm referring to properties on the DSL which are more than one level removed from the outer block. The examples that I showed you only had second- and third-order properties inside of them. However, these concepts apply recursively. If you ever needed to create a DSL that is, say, four, five, six, ten, or even 20 layers deep, you could use the concepts in this video to go ahead and do that. Now, if you are getting ten or 20 layers deep in your DSL, I'd really encourage you to rethink how you're doing it. But you could still technically do it with the concepts I'll be showing you here. So as we turn our attention back towards our DSL, let's take a quick refresher on what we already built. Right now, our existing configuration allows us to set two properties. The first property enables or disables the plugin's functionality, while the second one sets what coverage types we're going to report on. But as of right now, we don't have any way of specifying where to get these coverage reports. That's where our additional configuration is going to come into play. So as we continue to build our DSL, I would like these additional properties to allow us to specify where the coverage report can be found once it's output from JaCoCo. Another property which would be nice to have would be if we could give each module a user-friendly name.
That way, whenever we refer to the output of our plugin in the console, we can quickly understand what module that particular metric is referring to. And finally, since many larger projects often include multiple modules, we'll need to apply these properties over and over again so that we can gather metrics on each of these modules. Now that we have that out of the way, let's turn our attention towards building the first version of our addition to this DSL. 18. Accept Higher Order Configuration - Option 1: Here is the very first iteration of these additional properties that we're going to be adding to our DSL. You can see in this case I'm simply calling the same function numerous times. The first argument is referring to the name of the module, and the second one is referring to the path of the XML output file. Technically, this isn't actually higher-order configuration, because as you can see, all the properties that we're setting are actually just one layer removed from the outer block. However, in the next iteration we'll quickly turn this into a second-order DSL. Back in IntelliJ, I'm going to open up the extension class. And perhaps the very first step that I'm going to take here won't be a surprise, because as you can see, all I was doing was calling a function. So we're going to go ahead and create a function with the exact same name. In this case, the function is going to be called module. The first parameter is going to be the name and the second one is going to be the report path. In both cases, these are going to be strings. Obviously, as I call this function, I'm going to need a way of being able to capture and save all of the names and report paths that are given to me by the user. There are several ways that you could go about doing this. And it's one of those cases where there's no such thing as really a right or a wrong way. This one is just going to be my personal choice.
I'd like to be able to store all of the information that the user gives me inside of my own data model. So I'm going to create a data class called Module, and I'm going to have two properties on there: the first one being a name, the second one being a report path. And again, those are both going to be strings. Now I'm going to create a backing property so that I can store all of these inside of a list. In this case, it's going to be a mutable list of Module instances. In Kotlin, I need to be sure to make this a mutable list, because if I had it simply as a list, I wouldn't be able to change it once I had the data set. With the backing property in place, I can now simply go inside of my module function, and every single time I have an invocation of that function, I can simply add a new Module to the list. Of course, every time I add that module, I'm simply going to pass the name and the report path straight on through. Well, that about wraps up the effort inside of the extension class. Let's go ahead and put this to work. I'm going to jump over to my task class. And at the very bottom of the action, I'm going to add in a for-loop. In this case, I'm going to loop over all of the names and reports that I have inside of that backing property. And again, since we're not at a point where we're actually implementing the functionality yet, all I'm going to do is print out the name and the report path for each of the items in that list. That way we know it's working properly. Finally, I'm going to open up the build.gradle file inside of my demo subproject, and I'm going to call this module function several times. In this case, I'm simply going to have a module named Alice, another one named Bob, and finally a last one called Charlie. It doesn't really matter what the names or the paths are just yet, because I'm not actually doing anything with this information except printing it out. This is more of just a sanity check at this point. I'll save my changes and run my project.
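Option 1, as described, amounts to a function that appends each call to a data-class-backed list. A minimal sketch in plain Kotlin, with illustrative names rather than the course's exact code:

```kotlin
// Each invocation of module(...) records one module in the backing list.
data class Module(val name: String, val reportPath: String)

open class ReportExtension {
    val namesAndReports = mutableListOf<Module>()   // must be mutable so it can grow

    fun module(name: String, reportPath: String) {
        namesAndReports += Module(name, reportPath)
    }
}
```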
And as you can see in the output, I have everything that I expect to see: the name of the module along with the corresponding path. This is a totally acceptable way of building a DSL. If you're happy with the way it looks and functions, then you can simply stop here and say, I have my DSL the way I want it. However, if you're interested in seeing a couple more options for how you can build the exact same kind of functionality with a different DSL, then keep on watching. We're going to cover the next option now. 19. Accept Higher Order Configuration - Option 2: In this next example, we're going to expand upon the first option and turn it into a second-order DSL. As you can see here, the name and the report path are both two layers removed from the outer block. Let's jump back into our IDE and see what we need to do to make these changes. Here I am back inside of my extension class. The very first thing I need to do is inject what's called an object factory into my class. If you remember back to an earlier video where we talked about how you build a DSL with native Groovy, I said the process can be a bit complicated and that Gradle would smooth over those rough edges for us. Well, here we are. The object factory is what we need in order to help us smooth over those rough edges. Once I inject the object factory, I need to make it available to the entire class. So I'm simply going to say private val. That way it becomes an instance property of this class. We're going to use the object factory to create several instances of the Module class. We have a bit of a conflict of interest here, because the way data classes work inside of Kotlin, you have to set all of the properties in advance before you can fully instantiate the class. Yet the object factory wants to instantiate the class and then allow us to set the properties later on. So I'm going to have to remove the data class here and change it to just a regular open class.
This is one of those cases where the class needs to be open, or non-final, because the object factory is going to be doing some things behind the scenes to extend that class. So in this case, I'm going to replace data class with open class. I'll change all of the vals to vars, and I'll set default values of empty strings. This allows the object factory to create new instances of the Module class, and then later I'll go and set the name and report path manually. I'm going to jump back to the module function and clear out both the body and the argument list. Now here is where Gradle's API really starts to shine. We're going to build this DSL right now. This module function is going to be the module block that we had nested one layer beneath the outer block. The things that go inside of this module block will be referred to as the action. If you recall back to the picture that I showed you earlier, we're simply going to have two properties inside of each of these blocks: the name and the report path. So obviously you can see I'm referring to the Module class on line 17 with each of its properties. So therefore, this action inside of the module block needs to be of type Module. That's the only argument that we need for this function. Inside of the function, we're going to use the object factory to create a new instance of this Module class. Then I'm simply going to take the action which was passed to us from the DSL and execute it on this Module instance. Effectively, what this does is it takes all of the properties that the user set inside the DSL and applies them to the Module instance that I created on line 25. So after the code executes on line 26, I now have a copy of all of the properties that the user handed to me. So now I can simply add the Module instance to the names-and-reports backing property. It has already been prepopulated with the values specified in the DSL. That's all that we need to do to build a second-order DSL.
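Here is a runnable approximation of option 2. In a real plugin the parameter would be Gradle's Action<Module> and the instance would come from the injected ObjectFactory; a Kotlin lambda with receiver and a plain constructor stand in so the sketch works without Gradle, and all names are illustrative:

```kotlin
// Open class with mutable, defaulted properties so it can be instantiated
// first and configured afterwards.
open class Module {
    var name = ""
    var reportPath = ""
}

open class ReportExtension {
    val namesAndReports = mutableListOf<Module>()

    // In Gradle this would be fun module(action: Action<Module>) together
    // with objects.newInstance(Module::class.java).
    fun module(action: Module.() -> Unit) {
        val module = Module()   // stands in for the ObjectFactory call
        module.action()         // applies the properties set in the DSL block
        namesAndReports += module
    }
}
```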
As you can see, that was pretty simple, and we didn't need any of the fancy magic that was necessary whenever you build this with plain Groovy. Let's jump back over to my task. As you can see on line 25, this loop doesn't require any changes, because I didn't change anything about the backing property. I can still refer to the name and the report path just as I was before. I will need to make some changes to the build.gradle file, though. Let's adapt the DSL from functions to blocks. As you can see, the information that I'm providing is the exact same as it was before, but now it feels a whole lot more like a Gradle DSL. So even though there's no such thing as a right and a wrong approach, if it were up to me, I would definitely choose this option over the previous example. Just to ensure everything is working properly, I'm going to go ahead and run this. As you can see in the output console, the information is the exact same as it was before. So obviously, we're capturing the input from the DSL properly. Now, let's move on to the third example. 20. Accept Higher Order Configuration - Option 3: This third example is going to be a bit of a mash-up between the first and the second example. In some ways, I prefer this DSL a bit more because it logically encapsulates each individual module inside of a modules block. For the sake of organization, I think this feels a bit cleaner, but again, that's a matter of personal choice. Let's jump back into the IDE and make the necessary changes for this DSL. This option is going to require a different approach. Each block used to be called module, and that would correspond to the module function on line 24. However, now we have a modules block. So that means I'll need to create a function called modules for that block. I'm going to go ahead and clear out the Module class, the backing property, and the module function. That way we can get a clean slate. So perhaps as you'd expect, I'm going to create a modules function.
Once again, we're going to employ our old friend, the object factory. To do that, I'm going to need a backing property, and the type of it is going to be an instance of a class that I'm going to create here in a moment. I'm going to call that class ModulesContainer. So for this case, I'm going to create a backing property called modulesContainerInstance, and I'll have IntelliJ create that class for me. As before, this is going to need to be an open class. I'm just going to leave it as a stub for now. I'm going to finish with the object factory: I'll have the object factory create a new instance of that class for me, just like I did before. The major difference here is, instead of having a class which simply has two properties, I'm going to create a different class which has a method on it. And if you think of it, this makes perfect sense. We have our modules function, which refers to the outer modules block inside of our DSL. Inside of that block is going to be the modules container. That is what's going to have the function with the name and the report path parameters inside of it. So what we're doing here is taking the concepts from the previous example and basically moving them into a subclass so that they can be one layer deeper than they were before. I'm going to move the ModulesContainer class down to the bottom of the containing class. And now I'm going to add that action back to my modules function. This time, the action is going to be of type ModulesContainer. All I need to do is tell it to execute. I don't need to grab any properties from it, because I'm going to be doing all of that work inside of the modules container. I simply need to invoke it. Let's turn our attention towards the modules container. So just like the previous two examples, I'm going to need a module function in there with its own name and report path. Perhaps by now you're starting to see the pattern play out.
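The delegation pattern described here, an outer modules function handing its block to a container class, can be sketched as follows. Names are illustrative, and a lambda with receiver stands in for Gradle's Action type so the sketch runs without Gradle:

```kotlin
data class Module(val name: String, val reportPath: String)

// The container owns the module(name, path) function and the backing list,
// one layer deeper than in the previous option.
open class ModulesContainer {
    val namesAndReports = mutableListOf<Module>()

    fun module(name: String, reportPath: String) {
        namesAndReports += Module(name, reportPath)
    }
}

open class ReportExtension {
    // In Gradle this instance would come from the injected ObjectFactory
    val modulesContainerInstance = ModulesContainer()

    fun modules(action: ModulesContainer.() -> Unit) {
        modulesContainerInstance.action()   // just invoke; the container stores everything
    }
}
```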
I'm going to add a backing property inside of my ModulesContainer, and I'm simply going to call it names and reports like I had before. This will need to be a mutable list. Once again, I'm going to need a data class to help me model this information. I'll recreate that data class with a name and a report path as the two properties. Once again, every time I invoke the module function on line 31, I'm simply going to add the name and the report path to the list. There we go. Now we have a second-order DSL, which simply applies the properties from the previous two examples in a slightly different manner. Let's jump back over to my task; I'm going to have to change how I access the backing property. Obviously, the extension no longer has a backing property called names and reports, but the instance of the ModulesContainer class does. So I'll adjust the way I access that property on line 25. There we go. I don't have to make any further changes, because the way I model the data is the exact same. Finally, let's jump back into our build.gradle file and make the necessary changes. Once again, I'll run this and see what I get in my output. As you can see here, everything looks the exact same as it was before. So we're still capturing the data correctly. As with some of the other examples, this approach definitely has its pros and cons. If you're not entirely happy with the way this looks, let's take a look at one final example. This final example happens to be my favorite, because it encapsulates the best attributes of each of the previous three examples. 21. Accept Higher Order Configuration - Option 4: Here's a look at the final example of the DSL we're going to be building. As you can see here, now we have third-order properties, where the name and report path are three levels nested from the outer block. This one is a bit of a mix between the second example and the third example, and as I mentioned earlier, it's definitely my favorite.
Let's see what changes we need to make in order to make this a reality. Generally speaking, I'm pretty happy with the way that the modules block worked, but I don't like the fact that I had to call functions inside of that block in order to set my properties. So that means I'm really not going to need to make any changes to the modules block on line 19, because that function is the way I'd like it to be inside of my DSL. Most of my focus is going to be in the ModulesContainer class on line 28. And in fact, the changes I'm going to make are going to be very similar to the changes that I made when I went from example one to example two. As before, I'm going to need to change my Module class on line 23 from a data class to an open class, for the same reasons as before. I'll change the vals to vars on lines 24 and 25 and specify a default value of an empty string. This ModulesContainer class is going to require an object factory of its own, so I can inject one into the constructor the exact same way I did for my extension. Since I'm getting an error inside of the module function on line 31, I'm just going to clear out the body of it right now. My ModulesContainer class is going to require its own instance of the object factory, so I'm going to inject it via constructor injection, just like I did for my outer extension class. We'll replace the arguments on the module function on line 33 with an action, and the type of the action is going to be Module. Inside of the body of this function, I'll create an instance of the Module class with the object factory. Now execute that action. Finally, I'll add the result of it to my backing property. Those are all the modifications that I'll need to make to my extension. Let's check the task real quick. As you can see, I didn't change how I'm storing the information inside of my extension, so I don't need to make any changes to this class.
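Putting those changes together, option 4 can be approximated in plain Kotlin like this. Lambdas with receivers stand in for Gradle's Action type and plain constructors stand in for the ObjectFactory, and all names are illustrative:

```kotlin
open class Module {
    var name = ""
    var reportPath = ""
}

open class ModulesContainer {
    val namesAndReports = mutableListOf<Module>()

    // Each inner module block configures a fresh Module instance
    fun module(action: Module.() -> Unit) {
        val module = Module()
        module.action()
        namesAndReports += module
    }
}

open class ReportExtension {
    val modulesContainerInstance = ModulesContainer()

    // The outer modules block simply delegates to the container
    fun modules(action: ModulesContainer.() -> Unit) {
        modulesContainerInstance.action()
    }
}
```

With this shape, the usage reads as nested blocks: modules { module { name = "…"; reportPath = "…" } }, giving the third-order configuration described above.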
One last time, I'll jump over to my build.gradle file and make the necessary changes to the DSL format. I'll build my project, and in the console you can see I'm still getting the exact same output. This is the format that I'm going to stick with for the rest of this course. To me, it's the cleanest, it makes the most sense, and it feels the most like a Gradle DSL. The choice, of course, is up to you, but now you have four different options under your belt. Now you can see how simple it is to apply these repeating patterns over and over again. That way you have the flexibility to make your DSL as shallow or as deep as you would like. 22. Implement the Coverage Reader Task: By this point, every aspect of our plugin is done except for the task. We've finished the extension, and we've wired everything together in the plugin itself. So let's turn our attention to the task. I'm going to open up the coverage reader task inside of my IDE. At this point, you'll notice it's in a very different state than it was in a previous lesson. Since the point of this course isn't how to read XML files and print out reports based on them, I've included a copy of this class in the resources section of this video. Feel free to copy and paste it into your project. For the sake of completeness, I'll walk you through this implementation just so that you have a thorough understanding of it. At first glance, you can see that pretty much everything happens inside of this task action function. And you already see some familiar aspects, such as the different properties that we've set on the extension itself. If I start at the very first line inside this function, you see that I have a flag that will keep the rest of the function from running if the plugin is disabled. Further on down, I do some basic cleaning on the coverage types by converting everything I'm given to lowercase.
Whenever I extract each of the metrics from the XML report later on, I also convert those to lowercase, just so that I can have a case-insensitive comparison. Now let's move on to the for-loop. As you can see, I'm looping through each of the names and reports that were given to me by the user from our DSL. As you can see on line 25, I'm using a class called SAXReader to help me open up the XML file and parse it. Just to make this a bit easier on ourselves, I'm going to open up one of those XML files and have it on the right side so we can compare it to the code on the left. As you can see on line 26, I'm going through all of the elements that start with the word counter. And then further in, I'm going to loop through each of those counter types. As soon as I'm inside of this inner loop, I grab the type attribute off of this counter node, and then I do a comparison to see whether or not this particular type is listed inside of the types that the user provided to me inside of the DSL. If it isn't, I simply go on to the next iteration of the loop. Otherwise, I'll go ahead and extract the coverage metrics. That is as simple as gathering the covered attribute and the missed attribute and then doing some math to convert them into a percentage. And towards the end of this loop, I print out three things: the name of the module that the user provided, the coverage type that I extracted from the XML report, and finally, the percentage that I calculated. But you'll notice right above it, I have something called addToCache, where I add both the coverage type name as well as the percentage. But what's the point of that? Well, if I scroll down towards the bottom of this action, you can see I have another for-loop. This only runs in the cases where I have more than one module specified by the user in the DSL. In that case, I'll grab the total number of modules that the user has asked me to examine, and then I'll use that to calculate the average coverage for each of the metric types.
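The math referred to here is straightforward: a JaCoCo counter carries covered and missed counts, and the percentage is the covered count over the total. A small helper illustrating it (the function name is illustrative):

```kotlin
// Percentage of covered items for one counter; guards against empty counters.
fun coveragePercent(covered: Int, missed: Int): Double {
    val total = covered + missed
    return if (total == 0) 0.0 else covered * 100.0 / total
}
```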
So you can see, all this addToCache function does is add up all of the percentages that are calculated for each of the metric types. Then once it has added up those metrics, I can simply divide them by the total number of modules that I'm reporting on in this project. And that gives me an overall average for that particular metric type. That pretty much covers the entire implementation of this task. I would like to point out where I got that XML reader, though. If you remember, up above I have the SAXReader class. Let me show you how I import that. I'm going to go to the root build.gradle file of the project, and you can see I'm importing a library called dom4j. This is a relatively popular library for parsing XML files, and I found it to be perfect for this use case. Now that we have everything in place, it's time to put our plugin through its paces. I'm going to open up the build.gradle file inside of our demo subproject, and I'm going to replace all of the test values that we had in there with real values. As you can see in this case, I'm referring to all different kinds of coverage types, all five that are available to me inside those XML files. Just as a quick reference, if I show you that XML file again, you can see that there are five different counter types available here, and there are also five different types being referred to inside of this build.gradle file. So I'm going to get every available metric. And finally, if you look at the modules block, you can see that I have three different modules, each with a name and a report path relative to the root of the project. So if you recall, earlier I placed all of my report files inside of the reports folder in my demo subproject. So with these, I should be getting every available metric from every available report that I have in my project. Now I'm going to go ahead and run my project and see what I get. And there we go.
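The cache-and-average behavior walked through above can be sketched with a small class; the name SummaryCache and its method names are illustrative, not the course's exact code:

```kotlin
// Accumulates percentages per coverage type, then divides by the module
// count to produce the per-type average reported in the summary.
class SummaryCache {
    private val totals = mutableMapOf<String, Double>()

    fun addToCache(type: String, percentage: Double) {
        totals[type] = (totals[type] ?: 0.0) + percentage
    }

    fun averages(moduleCount: Int): Map<String, Double> =
        totals.mapValues { (_, sum) -> sum / moduleCount }
}
```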
I have five coverage types, as well as a summary which averages them all at the very bottom. By now, we've completed our first real-world, useful plugin. And I'll bet that feels pretty good. 23. Section Introduction - Advanced Plugins and Seamless Integration: Thus far, we've built several iterations of our plugin. But since plugins are so essential to the operation of Gradle, it would make sense that there'd be more to explore; much like outer space, we haven't yet hit the final frontier. To be honest, the setup I've shown you may not be flexible or powerful enough, and it may not even work with your project's requirements. But don't despair. Gradle offers a lot more room for flexibility. This section will focus on three key points: how to use Gradle to automatically incorporate and use your plugin inside of your existing builds; large-scale projects with multiple plugins, each containing multiple tasks and extensions; and building plugins alongside your existing projects instead of as a standalone project. By the time we finish up this section, you will likely find that it has filled in a lot of gaps that were left open in the previous chapter. 24. Gradle Lifecycle and Automatic Integration: Now that we've finished building our plugin, I'd like to think about some creative ways that we can use it, utilizing the strengths that Gradle offers to us. Up until this point, every time that I wanted to extract coverage metrics from my report, I would have to run my custom task. While that might be fine for some small scenarios, there are some problems that can arise with this approach. Imagine, for example, that you have a large project, and every time that you run the unit tests on it, you want to be able to extract the coverage metrics from it. Right now, the thing that you'd have to do is first run your unit tests and then, second, remember to extract the coverage metrics with our plugin.
That can certainly be a bit error-prone, especially if you have jobs running on a CI server where you always need to make sure that your unit tests are run and then, immediately after, the metrics appear in the output. It will be up to you as a DevOps engineer to make sure that you never forget to do one and then the other. But let's take a very basic example of what Gradle does to solve this problem. I'm going to build the project for my plugin, and let's see what actually happens under the hood. Whenever I take a look at the task execution log, I can see that the very last task here was build. That is the actual task that I ran from my IDE. But as you can see, this task was preceded by about two dozen other tasks that I didn't have to call. That is because Gradle set up a dependency graph, meaning that whenever I run build, all of those things must precede it before the build step can finish. We can leverage the same techniques inside of our scripts. Let's take a moment and see where these come from. I'm going to go ahead and collapse the build panel, and let's open up the Gradle panel. As you can see, whenever I expand the Tasks folder, there are a bunch of task groupings underneath it. And finally, if I start to expand each of these groups, then I have the individual tasks. These are all of the tasks which are available to us inside of this project. Some of them, you can see, are going to be familiar, such as build or assemble or clean, and other ones we never have to think about because they're called implicitly for us by the build system. We can leverage the same techniques inside of our build scripts to make utilizing the strengths of our plugin even easier. I'm going to open up the build.gradle file inside of my demo subproject. At the very bottom, I'm going to create a task, and I'm just going to call it my custom task. I'm going to put it inside of a group called verification. And then at the very end, I'm going to print, Everything looks good.
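A Groovy build.gradle sketch of such a custom task, wired into the dependency graph the same way the built-in tasks are, might look like this. The task names (myCustomTask, extractTestReport) are illustrative assumptions, not necessarily the names used in the course:

```groovy
// Register a custom task in the "verification" group
tasks.register('myCustomTask') {
    group = 'verification'
    doLast {
        println 'Everything looks good'
    }
}

// Leverage the dependency graph: running myCustomTask now forces the
// plugin's report task to run first
afterEvaluate {
    tasks.named('myCustomTask').configure {
        dependsOn 'extractTestReport'
    }
}
```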
Now let's imagine for a moment that I have a very complex build system, and this custom task is part of the verification procedure that I need to run at the very end. I could decide to have all of my build scripts first call the extract coverage report task from my plugin and then my custom task on line 27. But as I mentioned earlier, that's error-prone. I'd like to have the tool do that work for me. So just like the build step has a bunch of dependencies, I'm going to create an explicit dependency between my custom task and the extract test report task from my plugin. At the very bottom of the script here, I'm going to add an afterEvaluate block, and I'm going to say my custom task depends on extract test report. What this is saying is that every time I run my custom task, the extract test report task must have run beforehand. Let's try this out by adjusting how I run this from my IDE. I'm going to go up to the top and edit my configurations. And for the extract test report, I'm going to replace the task. The task is now going to be my custom task, and I'll also adjust the name to reflect that change. Let's go ahead and run that. In my output, you can see that two things happened. The first thing is my extract test report task ran, and all of its output follows it. Then underneath that, you can see I have my custom task along with its output. Everything here seems to be running in the proper order. This was a simple example, but more practically speaking, you can imagine if you have a large project with a bunch of unit tests, you could set it up so that every time you run extract test report, you have a dependency on the unit test task. That way, whenever you go to extract the test reports, you know that they will always be there. 25. Add Multiple Events and Extensions to Your Plugin: I'd like to take a few moments and expand on some of the existing principles that we've used to build our plugin.
Right now, we have one extension that configures the entire behavior of our plugin, and we have one task which runs the entirety of the job. Although in our simple example this might be practical, for more complex examples it would probably make sense to have multiple extensions and multiple tasks to configure different aspects of your plugin and run them. To illustrate how this works, I'd like to take our plugin and break it down into two parts. As you know, we have certain functionality which allows you to report on the metrics of each module that you are evaluating. And separately, we have a summary at the very end, which gives you a big picture of the overall health of your project. I'd like to break those down into two separate parts. One part will just report on each of the individual modules, and another part will report on the overall summary. Each one of these parts will have its own extension and a corresponding task to execute it. Let's see what we need to do to make this work. To get started, I'm going to start disambiguating some of the names that I've created up to this point. All of the names have been pretty consistent and have worked well for the existing use case, but that won't be the case for much longer. Let's open up the existing extension class. I'm going to rename this to MetricsExtension. I'm going to use this extension to configure the report for each of the individual modules. Similarly, I'm going to rename the task to MetricsTask. And finally, I'm going to open up my plugin and rename some of the information inside of there. I'll change the variable name, extension name, and task name all to align with this new effort. I'll open up my build.gradle file and change the block name as well as the task name to align with these changes. I'm not going to make any changes to the run configuration inside of my IDE, because if you recall, I'm actually running my custom task.
So that means that everything I do inside of here, I can configure directly from the Gradle file. Just as a sanity check, let's go ahead and run this and make sure that everything still operates as it did before. And as you can see, everything looks good. I'm going to focus the next part of my attention on extracting this part, the summary at the very end. So since I'm going to have multiple extensions and multiple tasks, I'm going to put those inside of their own packages so we can keep them together. First, I'll create an extensions package, and then I'll move the existing extension inside of there. I'll repeat this process for the task. With this new extension that I'm building, I'd like to keep it pretty similar to what I have right now. I'm going to go ahead and just copy the existing MetricsExtension and simply call it SummaryExtension. The way I'm going to design this extension, I'm simply going to omit the coverage types property. So in that case, I can remove both the coverageTypes function as well as its backing property. So whenever I go ahead and use this inside of my Gradle scripts, everything else about this block is going to look the same except for the missing coverage types property. On a similar note, I'd also like to create a summary task that will look very similar to the existing metrics task. So I'll go ahead and create a copy of the MetricsTask class and call it SummaryTask. Before I jump into my new task class, I'm going to remove the portion of code inside of my MetricsTask that reports on the summary. That would involve this block of code down here, as well as the private function at the very bottom. With those two things gone, I can also remove the cache property that I have at the very top. It looks like everything for this task is in order. So let's jump over to our new task.
Many of the changes I need to make to this class are just the inverse of what I did to the metrics task, along with a few subtle differences. For example, as you can see on line 11, I'm still referring to a MetricsExtension; I'm going to change that to the SummaryExtension. Since that model doesn't have anything regarding coverage types, I'm going to remove all references to coverageTypes inside of this class. Effectively, that means every single metric type is going to be reported on inside of this task. There's no particular reason I did that other than to make the DSL look a bit different when we use it inside of our demo; functionally speaking, you may prefer to keep them if that makes more sense for your use case. Now, as you can see, I'm also still printing out each of the individual metric types on line 33. I'm going to remove that, as well as the new line beneath it, just for formatting, and keep only the parts that are necessary for the summary. If you recall, earlier I would only ever report the summary if you were using more than one module. For this case, I'm going to remove that if check on line 36, which means you're always going to get a summary even if you're using just one module. Alright, at this point our tasks and our extensions are all in place; let's wire them up inside of our plugin. I'll create a copy of the existing bootstrapping code from lines 11–12 and just make some minor modifications on lines 14–15. Effectively, in this case, all instances of the word metrics get changed to summary. Okay, our plugin is ready for us to use again. Let's try it out inside of our demo project. I'm going to copy the existing block I have here for configuring the metrics portion of the plugin and paste it down beneath. Of course, I'll have to change the block name to jacocoTestSummary.
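Inside the plugin's apply function, the bootstrapping for the two extension/task pairs might look roughly like this. The extension names, task names, and class names are my guesses at the course's conventions, not verbatim code:

```kotlin
import org.gradle.api.Plugin
import org.gradle.api.Project

class CoverageReporterPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        // One extension + task pair per concern (the lines 11-12 and 14-15
        // mentioned in the lesson). Names here are illustrative.
        project.extensions.create("jacocoTestMetrics", MetricsExtension::class.java)
        project.tasks.register("extractMetrics", MetricsTask::class.java)

        project.extensions.create("jacocoTestSummary", SummaryExtension::class.java)
        project.tasks.register("extractSummary", SummaryTask::class.java)
    }
}
```

The second pair is literally a copy of the first with "metrics" swapped for "summary", which is exactly the modification described above.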
And if you recall, I also removed coverageTypes. Everything else here looks pretty much the same. I'm going to add one more dependency to my custom task, so every time I run my custom task, both of the tasks from my plugin run beforehand. Let's run this and see what we get. Now you can see all of these metrics are broken out into two separate parts: the first task reports on the metrics for each individual module, and the second task reports just on the summary of the data above. And of course, all of this runs before my custom task. So there we have it: one plugin with two tasks and two different extensions. This might have been a bit of a silly example, because we had a lot of duplication in how we configure the DSL and how we run our tasks. But I wanted to use it as an illustration to show you that if you have a more complex project with more things to do, this is one possible approach you could take to provide multiple DSLs with multiple tasks to configure all of the functionality your plugin offers.

26. Projects with Multiple Plugins: In the last video, we saw how a plugin can be an umbrella over multiple tasks and extensions. I'd like to take that concept one step further: in a similar sense, a single project can also be an umbrella over multiple plugins. To show you how this works, I'd like to resurrect the HelloWorld plugin that we implemented earlier in the course and put it right alongside the one we've been working on. Let's see what's necessary to make that happen. The very first thing I'm going to do is a little cleanup: I'm going to create a plugins package and move the existing plugin class into it. In just a moment, I'm going to create another plugin class, and the purpose of putting them inside their own package is the exact same purpose that the extensions and the tasks have inside their own packages.
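To recap the demo configuration from the previous lesson, the two DSL blocks and the task dependencies might look something like the sketch below. The block names, coverage type values, and the custom task name are assumptions on my part:

```groovy
// Demo subproject build.gradle — illustrative only.
jacocoTestMetrics {
    coverageTypes 'INSTRUCTION', 'BRANCH'   // only the metrics block accepts coverage types
}

jacocoTestSummary {
    // same shape as above, minus the coverageTypes call
}

// The custom task depends on both plugin tasks, so they run beforehand.
tasks.named('myCustomTask') {
    dependsOn 'extractMetrics', 'extractSummary'
}
```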
I'm going to add a new class to my plugins package called HelloWorldPlugin. This should feel rather familiar. Just like before, I'm going to have it inherit from the Plugin class, and if you recall from before, the Plugin class is also generic, so I'll specify Project as the generic type. This change requires me to override the apply function, so I'll do that and rename the p parameter to something more descriptive, like project. I'll clear out the stub and then move on to creating my own task. Under the tasks folder, I'll create a new class called MessageTask. I'm simply going to have this plugin print out a very simple phrase by executing this task. The first thing I'm going to do is make this an open class and have it inherit from DefaultTask. Then, inside the class body, I'll create a function called writeMessage, annotated with the @TaskAction annotation. And since this is intended to be a very simple example, I'm simply going to have it print out a message; in this case, it will say hello world from the second plugin. That's all I need this task to do. Let's jump back into the plugin. I'm going to add a new task to the Gradle task graph called writeMessage; anytime that task is called, it invokes the MessageTask class. Everything I've done here so far is quite old hat. Just to keep this as simple as possible, I'm not going to wire up an extension to this; it's only going to contain a task. Let's register this new plugin inside of our project. I'm going to open up the build.gradle file at the root of my project, and underneath the plugins block on line 44, I'm going to make a few changes. First, I'll have to adjust the implementation class for the existing plugin. I'll copy the registration block and paste it down below; the name of the block I'll change to helloWorld, and I'll also propagate that change to the ID as well as the implementing class.
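Putting the pieces of this step together, the second plugin might be sketched like this. The package names are my assumption; the class names, task name, and message follow the lesson:

```kotlin
// file: tasks/MessageTask.kt
package academy.leftlane.gradle.tasks   // assumed package

import org.gradle.api.DefaultTask
import org.gradle.api.tasks.TaskAction

open class MessageTask : DefaultTask() {
    @TaskAction
    fun writeMessage() {
        println("Hello world from the second plugin")
    }
}

// file: plugins/HelloWorldPlugin.kt
package academy.leftlane.gradle.plugins   // assumed package

import academy.leftlane.gradle.tasks.MessageTask
import org.gradle.api.Plugin
import org.gradle.api.Project

class HelloWorldPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        // No extension here — this plugin only contributes a single task.
        project.tasks.register("writeMessage", MessageTask::class.java)
    }
}
```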
So now our project knows about this new plugin. Let's register it inside of our demo project and put it to use. Inside the settings.gradle file of my demo project, I'm going to duplicate the line that imports our existing plugin and modify it to also import our HelloWorld plugin. I'd like to clarify why both of these plugins have the exact same version. I'll jump back to the root build.gradle file in my project: since I have two plugins registered inside of this project, they both inherit the version that's specified on line ten. That's why I need to propagate this version all the way down to both of these imports on lines 5–6. And finally, now that we have everything wired up, let's put this to work. I'm going to open up the build.gradle file inside of my demo subproject and import this plugin, just like I have all the others. At the very bottom of this file, I'm going to wire my new task into the existing task graph, so anytime I run my custom task, I'm also going to see that HelloWorld message. The way this is set up right now, I would expect the extractMetrics task to run first, then extractSummary, then writeMessage, and then finally my custom task. Let's see how that works. When I run my project, you can see everything looks just as it did before, except now I have my HelloWorld message. So you can see how easy it is to add multiple plugins to an existing project; it's really just rinse and repeat from a process we've already mastered.

27. Integrate Your Plugin Alongside Existing Projects: Thus far, the projects we've been working on have been working completely in isolation. We haven't used them anywhere else, and we haven't integrated them into any other project. In the next section, we'll work on publishing our plugin to the Gradle plugin repository so that your work can be used by the greater community.
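To recap the registration from the last lesson: with two plugins registered in one project, the root build.gradle's gradlePlugin block ends up with two entries that both inherit the single version property. The IDs and class names below are illustrative placeholders:

```groovy
version = '0.0.1'  // both registered plugins inherit this one version

gradlePlugin {
    plugins {
        coverageReporter {
            id = 'academy.leftlane.jacoco-test-coverage-reporter'   // assumed ID
            implementationClass = 'academy.leftlane.gradle.plugins.CoverageReporterPlugin'
        }
        helloWorld {
            id = 'academy.leftlane.hello-world'                     // assumed ID
            implementationClass = 'academy.leftlane.gradle.plugins.HelloWorldPlugin'
        }
    }
}
```

This is why the demo project's settings.gradle must reference the same version for both plugin imports.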
However, before we do that, I figured it'd be worth exploring one more option that we can use completely inside of our project. In this video, I'm going to show you how you can create a plugin that runs alongside an existing project. You might be wondering when that would be useful. This can be useful in a variety of scenarios. One example would be if you have an existing project that requires a plugin and you'd like to keep all the source code together under one roof; putting your plugin beside an existing project's source code allows you to do that. Another example would be if your plugin is so specific that it wouldn't benefit a larger community, or if your plugin contains proprietary source code that you couldn't publish for others to use. I've run into this scenario before, so I've actually used this in one of my professional projects. Let's explore this option. For this simple example, I'd like to replicate much of what we did with the HelloWorld plugin and just tweak the verbiage slightly so that we know we're running a different plugin. In this case, just to differentiate it from the others, I'm going to call it HelloUniverse, and it's going to have a message to fit that. To start, I'm going to use one of Gradle's conventions to get us off the ground. Under the root of the project, I'm going to create a folder called buildSrc, with a capital S. This is a special folder that Gradle looks for, which can contain a variety of other build scripts and even entire plugins, as we're going to see here. Once I've created that folder, I'm going to add a build.gradle file right underneath it. And because the setup for this plugin is almost exactly the same as it is for the HelloWorld plugin, I'm going to go to my root build.gradle script and copy everything out of it. Then I'm going to make a few necessary modifications. The first thing I'm going to do is remove the maven-publish plugin.
I don't need it anymore, because whenever I have my plugin under the buildSrc folder, Gradle effectively publishes it for me automatically. I'm going to specify the version of Kotlin that I need on line three. This is a bit of a departure from the root build.gradle file, and that's because anything inside buildSrc doesn't automatically inherit the plugin management settings that we set up inside the settings.gradle file at the root of the project. Another slight drawback of the buildSrc folder is that I also can't refer to the versions.gradle file I've been using throughout the rest of the project, so I'm going to remove the reference here and then specify the plugin version as well as the JVM version manually. Continuing further down, I'm going to remove the task as well as the publishing block, because those work in tandem with the plugin we removed up above. Inside the plugins registration block, I'm going to remove the JaCoCo test coverage reporter registration, but I'll keep the HelloWorld plugin for now; we'll change this one in just a minute for the HelloUniverse plugin. This simple example doesn't need dom4j, and since I'm not publishing this to a repo folder anymore, I'll also remove this clean-up task. Now that everything is cleaned up, it's time to register our new plugin. I'll change the name of HelloWorld to HelloUniverse, and I'll change the ID as well as the implementation class accordingly. Now you'll notice that the ID of the plugin is rather simple: it's just called hello-universe and doesn't use a fully qualified package name. Why is that? Unlike the plugin we've been working on for most of this course, this plugin is only going to be published locally; I can't put it on the Gradle plugin repository. The ID doesn't need to be universally unique, then; it just needs to be unique within my project.
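After those edits, a trimmed-down buildSrc/build.gradle might look roughly like this. The Kotlin version shown is a placeholder, since buildSrc cannot read the shared versions.gradle file, and the ID and class name are my own illustrative choices:

```groovy
// buildSrc/build.gradle — sketch; versions are hard-coded on purpose because
// buildSrc does not inherit pluginManagement from the root settings.gradle.
plugins {
    id 'org.jetbrains.kotlin.jvm' version '1.8.21'   // placeholder version
    id 'java-gradle-plugin'
}

repositories {
    mavenCentral()
}

gradlePlugin {
    plugins {
        helloUniverse {
            // A short ID is fine here: it only needs to be unique within this project.
            id = 'hello-universe'
            implementationClass = 'academy.leftlane.gradle.HelloUniversePlugin'
        }
    }
}
```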
So I figured it'd be easier just to shorten the name; whenever I import it later on, I can simply refer to it as hello-universe. Next, let's move on to creating the source code for our new plugin. Underneath the buildSrc folder, I'm going to create a folder structure that looks just like the main one: src, main, then kotlin (or java, if you're doing this in Java). And then I'll keep the package structure as I had it before: academy, leftlane, and then gradle. Now let's create our plugin file. Underneath that very last folder, I'll create a new Kotlin file called HelloUniversePlugin. And because of the similarity of this plugin to the HelloWorld plugin, I'm just going to go to the HelloWorldPlugin in my existing project, copy the source code, and paste it here. I'll just change the name of the class to HelloUniversePlugin instead. And of course, we also need a message task, so I'm going to create another file called MessageTask right beside my existing class. Then I'll refer to the main project and copy the MessageTask from the HelloWorld plugin. Once I paste it into my new plugin, all I need to do is adjust the package name and change the message a bit. Our HelloWorld plugin already has a task called writeMessage, so I'm going to name this one writeMessageToUniverse so that they don't clash. Now let's jump back to the root build.gradle file and import it by its ID. Notice how, whenever I import this plugin, I can simply refer to its ID as hello-universe, which is a lot shorter than a fully qualified package name. Also notice I didn't have to do anything special to import this plugin: because it lives inside the buildSrc folder of our project, Gradle knows automatically how to resolve it. Now we're ready to put our new plugin to work. I'm going to add a new run configuration to my IDE: I go up to the Run Configurations menu, hit Edit Configurations, and then add a new Gradle configuration.
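The buildSrc plugin and its task end up as near-copies of the HelloWorld versions. A sketch, with an assumed package and a placeholder message ("a message to fit that" is not quoted verbatim in the lesson):

```kotlin
// buildSrc/src/main/kotlin/academy/leftlane/gradle/HelloUniversePlugin.kt
package academy.leftlane.gradle   // assumed package

import org.gradle.api.DefaultTask
import org.gradle.api.Plugin
import org.gradle.api.Project
import org.gradle.api.tasks.TaskAction

open class MessageTask : DefaultTask() {
    @TaskAction
    fun writeMessage() {
        // Verbiage tweaked so we can tell the two plugins apart at runtime.
        println("Hello universe from the buildSrc plugin")   // placeholder message
    }
}

class HelloUniversePlugin : Plugin<Project> {
    override fun apply(project: Project) {
        // Renamed so it doesn't clash with HelloWorld's writeMessage task.
        project.tasks.register("writeMessageToUniverse", MessageTask::class.java)
    }
}
```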
This configuration I'll call runHelloUniverse. I'll check "store as project file" to make sure it gets checked into my source code. I'll select the current project as the target, and then the task will be the new task we created, writeMessageToUniverse. I'll save those changes and then run the project. And now we can see what happened. You'll notice I simply ran the writeMessageToUniverse task, but a whole bunch of other things happened inside of the buildSrc folder. Why? Because we created a Kotlin plugin, that Kotlin plugin had to be compiled into a jar before it could run. Gradle did the honors in making that happen for us: it automatically installed the plugin into the task graph and then ran the task. It is worth noting that if you decide to build a plugin and put it inside your buildSrc folder, you will be adding additional overhead to all of your builds. That's because the plugin inside your buildSrc folder needs to be built into a jar before it can be added to your task graph and the rest of your build can run. So there certainly are advantages to keeping the project source code and your plugins together in one spot, but it is worth noting this disadvantage. Nevertheless, in the end, this felt rather familiar, and in my opinion, it's rather convenient that Gradle offers this option.

28. Section Introduction - Publish to the Gradle Plugin Repository: Now it's time to put the cherry on top of our work. This section is all about publishing your plugin to the Gradle plugin repository so that others can use it. Most of this work is essential, but I have some professional advice for you that will make this process more secure for anyone looking to publish their work, and easier for large teams that need to do this at scale. In particular, this section addresses two major subjects:
manually publishing your plugin to the Gradle plugin repository, and creating a continuous delivery job that can do this work automatically. Quite honestly, even if you aren't working on a large team, you'll likely find having a continuous delivery job do this publication work for you to be extremely convenient.

29. Update the Build Scripts for Publication: Our work here is almost done. We've built a plugin that we're proud of, and we're ready to share it with the world. The best place to share it is on the Gradle plugin repository, and thankfully, they've made the process of publishing there rather painless. Let's see what changes we need to make in order to get it published. I'll open up the root build.gradle file of our project, and in the plugins block, I'll import a new plugin. The ID of this plugin is com.gradle.plugin-publish, and I'll also specify the version. However, since this is the only plugin where I'm specifying a version, let's copy this out and move it over to the pluginManagement block of our settings file. I'll extract the version out to a variable called gradlePublishVersion, and that, of course, will live inside of versions.gradle. So now, back to where we started, I can remove the version specification. This plugin is going to allow us to do two things. First, it offers us a block that allows us to specify key metadata about our plugin. Second, it offers us a task that allows us to publish to the plugin repository. Down beneath the gradlePlugin definitions block, I'll create a new block called pluginBundle. Here are the four properties I'm going to specify: description, website, vcsUrl, and tags. I'll explain what these are as I fill them out. Next, it has a plugins block inside of it, which allows us to specify which plugins from our project we're going to publish. In this case, I'm only interested in publishing the JaCoCo test coverage reporter.
So inside of there, I'll create another block with the same name and then specify a display name; in this case, I'll give it a nicely formatted name like this. Now I'm going back up to fill out those properties. Description is rather self-explanatory: it describes what your plugin is supposed to do. The website can be one of two things: either your website or the agency's website that's publishing it, or a link to a publicly accessible Git repo where you can find more information about this plugin, such as GitHub. If you do have a website or a webpage that discusses this plugin, I definitely recommend you use that over a GitHub link. That's because the next property is meant to be just that: where the source of your plugin's code can be found. Here you can paste a full link to the GitHub or GitLab website that hosts your plugin's source code. Keep in mind that this will need to be a public repository; the plugin repository has overseers who will reject your publication if this URL is not publicly accessible. And then, finally, we'll specify a comma-separated list of tags that users can use to search for and locate your plugin on the repository. Let me show you an example of what a similar plugin looks like when it's published on the repository. As you can see, most of the information that we specify to the publishing plugin is available here, including the description, VCS URL, and the tags. Now, I did say earlier that the Gradle plugin repository is protected by a set of overseers. These overseers are responsible for making sure that only high-quality and reusable plugins are published to the repository, and it's important to keep in mind that there are a few standards they look for when accepting a new plugin. Even if you've published a plugin on the repository before, anytime you publish a new one, it will be scrutinized by the maintainers.
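Assembled, the pluginBundle block from the last few steps might look like the sketch below. Every value here, including both URLs, is a placeholder for illustration, and the inner block name must match the one used in the gradlePlugin registration:

```groovy
pluginBundle {
    description = 'Reports JaCoCo test coverage metrics per module, plus an overall summary.'
    website = 'https://example.com/jacoco-test-coverage-reporter'        // placeholder
    vcsUrl = 'https://github.com/example/jacoco-test-coverage-reporter'  // placeholder; must be public
    tags = ['jacoco', 'coverage', 'testing', 'reporting']                // placeholder tags

    plugins {
        // Same block name as in the gradlePlugin { plugins { ... } } registration.
        coverageReporter {
            displayName = 'JaCoCo Test Coverage Reporter'
        }
    }
}
```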
In my experience, here are a few reasons why they could potentially reject your plugin when you publish it for the first time. Keep in mind that this is not an exhaustive list, but hopefully it provides you with a roadmap. First, the maintainers will look at either the website URL or the VCS URL to make sure that you have proper documentation. This can be as simple as a readme on your GitHub repository, but if your plugin lacks documentation altogether, they'll likely reject it, saying that it's not very usable by the broader community. In that case, they may recommend making it a private plugin, like I showed you in a previous video with the buildSrc folder; that way, it can be specific to just you and your project. I also mentioned that the VCS URL has to be a public repository, which means that every single plugin you see on the Gradle plugin repository is actually open source. So keep that in mind whenever you publish yours. We'll soon be setting up an account with the Gradle plugin repository; the email address that you use to sign up should vaguely resemble the group ID of your plugin. If there's a total mismatch, they may also reject your plugin, because they say the plugin or the group should be related to the organization that's responsible for publishing it. They also recommend that the plugin should be useful and created for the purpose of sharing its functionality, so if you're publishing a plugin just for practice, keep in mind it may be rejected. Again, this is not an exhaustive list, but you can see the premise of it: make sure it's documented, open source, and generally useful. As long as you check those three boxes, you should be good to go.

30. Plugin Publication and Secrets Management: Now we're at the point where we're ready to publish our plugin. Currently, our project has three different plugins embedded inside of it: two in the main project files and one in the buildSrc directory.
I'm going to pick just one of these plugins to publish, and that will be the JaCoCo test coverage reporter plugin. So before I go any further, I'm going to remove references to the HelloWorld plugin, just like this. I'll de-register the HelloWorld plugin by removing it from the plugins registration block. And then, of course, that means I need to remove all usages of that plugin, so I'll go to my demo subproject and remove the import. That means I also have to remove the task, as well as the reference to it inside of the pluginManagement block in the settings file. Now that I've officially de-registered the HelloWorld plugin, let's turn our attention back to the plugin we're publishing. The plugin we've been focusing on in this course has been at version 0.0.1 the entire time, and that's fine for development purposes, but now it's time to publish it, so let's boost the version number. I'm going to change my plugin's version to 0.1.0; if your plugin is ready for public release, you should probably put 1.0.0 here instead. At this point, it looks like I've wrapped up everything that's necessary to publish my plugin. The only things that are missing are the authorization keys to allow me to do the publication. Let's go fetch those from our plugins account. As you can see here, I'm on the Gradle plugins website. I've already created an account, and this is my profile; I already have a number of plugins published that I use professionally. Let's ignore those for now and instead look at the API keys tab. Here are the secret values that Gradle provides to allow me to publish my plugin to the repository. Regardless of which operating system you use, the directions they have above the secrets block apply. On my system, I've already created a .gradle folder under my home directory, and inside there, I created a gradle.properties file. Inside that file, these two lines were added.
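Those two lines follow the property names Gradle's own instructions give for plugin portal publication. With placeholder values, the file looks like this:

```properties
# ~/.gradle/gradle.properties — values are placeholders; never commit real keys
gradle.publish.key=<your-api-key>
gradle.publish.secret=<your-api-secret>
```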
If you're not familiar with this file, it's effectively a global Gradle configuration file. It allows you to keep important information outside of your repository, yet still have information available that Gradle may need during the scope of its build. In practice, any secrets that Gradle needs to be aware of typically go inside of this file. It's unique to you and not tied to any particular project, so if I create another project with another Gradle plugin in the future, as long as I haven't changed my gradle.properties file, I'll be able to publish it very easily. With that information in place, I'm going to go over to my terminal and publish the plugin. As you can see, I've already switched to the project directory, so all I need to do is type gradlew and then publishPlugins. Now, if I were to hit Enter, this would work just fine: my plugin would be published within a few minutes on the Gradle plugin repository, and I would see it show up under my profile. I'd like to hit pause just for a second. In my professional opinion, what we just did to store the Gradle plugin secrets isn't really keeping them secret: all the information needed to authorize a critical operation is still stored in plain text on my disk. Whenever I'm creating professional Gradle plugins, I never actually use the gradle.properties convention that they recommend on their site. Instead, I use a real secrets manager to help me with this. Let me introduce you to the tool that I prefer to use. It's called Doppler, and it's available from doppler.com. In most cases, the free version of this tool will be more than enough for what you want to use it for, and that will especially be the case for plugin publication. This is a reputable tool that's used by many companies worldwide, and it acts as a central location for all of your secrets management.
It works on any platform and under pretty much any scenario, and if you decide to use some of its more advanced features, it even has automatic integrations with other secrets managers. This tool also allows you to separate your secrets per environment: for example, you may have different secrets in your development flow than you would in, say, your production flow. Doppler helps you manage all of that as well, keeping all your secrets straight across your entire team. This is the tool I use to keep my Gradle plugin repository secrets safe. Professionally speaking, if you're developing a plugin with a larger team, it wouldn't make sense to have everybody log in to the same shared account so they can access the API keys on that profile. Instead, you can use Doppler's collaboration and team features to share this information safely with everyone involved. So from here on out, I'm going to show you how to use Doppler in conjunction with Gradle to get your plugin published. To begin, you can hit the Get Started button and sign up for a free account; I already have one created. Once I'm inside the Doppler dashboard, you can see in the very top left I have the name of my organization selected: Left Lane Academy. You can break up your account into separate organizations to separate, say, personal and professional projects if necessary. I haven't yet created a project for this plugin, so let's do that. I'll give my project a name, and keep in mind that they recommend this name be CLI-friendly, which means you can't have spaces anywhere in the name; in this case, I decided to use kebab case. I'll create the project, and by default, it gives me three different environments: development, staging, and production. Well, in this case, there is no such thing as a Gradle plugin repository for development or staging; it's just production at that point.
So I'm going to delete the development and staging environments and just keep production. With the other environments gone, I'm going to click on the prd config. Here's the interface where I can manage all of my secrets. I'll add in both the publish key and the publish secret, as provided to me on the Gradle plugin repository profile page. Now, if you notice, the names that I gave these don't exactly match the names suggested on the Gradle website. That's okay; we'll take care of that in a minute. Keep in mind that the way Gradle suggested you do it was really specific to Gradle, while this approach works with environment variables. Doppler is effectively just a secure way of storing environment variables: it offers you a CLI that can take your secrets from an online secure vault and load them into your environment variables on a case-by-case basis. We can use the environment variables it provides and send them directly to Gradle; that's how we'll be using it for this project. Keep in mind that Doppler works on all platforms: Windows, Linux, and macOS. It also has the added advantage of working in pretty much any CI environment; we'll use that to our advantage in the next video. So the names I'm giving each of these secrets need to be compliant with environment variable names as you would see them on Windows, Linux, and macOS. I'll save these changes, and Doppler is ready to go. Now I just need to get it onto my system. I'll head over to Doppler's documentation page and see what I need to do to get it onto my machine. Interfacing with this tool comes by way of installing a CLI, and since I'm on Windows, I'll follow the installation steps for Windows. Here, they're using the Scoop package manager to install Doppler. Well, I don't have Scoop, so I'm first going to go to scoop.sh and copy their installation steps.
This installation requires PowerShell. Keep in mind that even though I'm using PowerShell to install the tool, I'll actually have it available inside the Command Prompt as well; all it's doing is installing an application that all of the native shells on my system can access. Once I have Scoop and Doppler installed, I can simply log in with the CLI: in my console, I type the doppler login command and follow the instructions there. In most cases, this will open up a browser tab and have you log in through your browser. Then I change to my project's directory and type doppler setup. Doppler will guide you through the setup necessary to tie your project to a particular directory. Earlier, in the Doppler dashboard, we created the jacoco-test-coverage-reporter project; the doppler setup command ties that project to a particular folder on my machine. For security purposes, Doppler scopes these particular secrets to that folder. That way, not just anybody on your machine can access all of your secrets in Doppler, and your project can't access the secrets of another project in your Doppler account. At this point, I have Doppler set up; let's see how we can use it. I'll scroll down to their usage section, and I can see an example of how to run a command right there. All I need to do is prepend my command with doppler run and a double dash. In my experience, this works the same across all shells except one: PowerShell has a bit of a different way of doing this, and I'll show you how to do that later on. Whether you're using Command Prompt on Windows or pretty much any shell on Linux or macOS, this is the way you use it. All Doppler does is inject the command that you give it after the double dash with environment variables. Those environment variables effectively didn't exist on your system before the doppler command, and they won't exist after.
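The local setup described above boils down to three CLI steps. These commands are shown sh-style; the same commands work in Command Prompt, and the project tie-in assumes the Doppler project name used earlier in the lesson:

```shell
# One-time authentication (typically opens a browser tab)
doppler login

# Run from the project folder: ties this directory to a Doppler project/config
doppler setup

# Inject secrets as environment variables only for the lifetime of one command
doppler run -- ./gradlew publishPlugins
```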
That way, you don't have long-lived environment variables exposing secrets on your system that shouldn't be there long-term; they're scoped only to the lifetime of your command. I'll jump back to Command Prompt and start using Doppler. Notice I'm already inside of my project's folder, and if you remember, the doppler setup command tied my folder to a particular project. Let's see what secrets I have available. I'll type doppler secrets. As you can see, I have both the publish key and the publish secret available to me. Now, obviously, this information is sensitive, so I'll clear the screen as soon as possible. Let's turn our attention back to actually running the publication command. Just as before, I'm going to type gradlew publishPlugins. Now I need to pass some environment variables from my system to Gradle. The way you do that is with a -P flag: immediately after the -P, I specify the name of the Gradle variable, then the value. If you remember from the Gradle profile page, the first one was gradle.publish.key, and we're going to set that variable equal to the value of the environment variable GRADLE_PUBLISH_KEY, in snake case. So now you can see how I'm correlating the Gradle variable with the environment variable. I'll repeat the same process for gradle.publish.secret. And as Doppler recommends, I prepend the entire command with doppler run --. That's the entirety of the command, and in my opinion, this is much safer than the option we had before: I don't need to keep these publication secrets in plain text on my disk anymore; they're stored, encrypted, by the secrets manager. I'll hit Enter to run that command, and in just a few seconds, you can see it's already published online. Let's go see what this looks like. I'll go back to my profile and refresh the page, and there we go: it is now the topmost plugin on my profile.
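For reference, the complete Command Prompt invocation described above has this shape. The environment variable names match what I stored in Doppler and are specific to my setup; adjust them to whatever names you chose:

```shell
doppler run -- gradlew publishPlugins -Pgradle.publish.key=%GRADLE_PUBLISH_KEY% -Pgradle.publish.secret=%GRADLE_PUBLISH_SECRET%
```

On a POSIX shell, the same command uses `$GRADLE_PUBLISH_KEY` syntax instead of `%GRADLE_PUBLISH_KEY%`.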
As you can see, it says it's pending approval; in just a few hours, this flag should fall away. Once a maintainer takes a look at your plugin, as long as it fits the criteria that I mentioned in a prior video, they should publish it. If you'll notice, when I click on it, it says the plugin is not found. This will be fixed once it's approved by a maintainer. I would like to make one more note about Doppler. I mentioned earlier that the command to run Doppler is the same across all shells except PowerShell. Let's see what a PowerShell command looks like. I'll run this exact same command inside of PowerShell and see what happens. I'll change to my project's directory, copy the command, and paste it in directly. As you can see, it gave me an error. This isn't so much a Doppler error; it's more of a syntax error within PowerShell itself. Before I run the proper command to republish my plugin, let me bump the version number of my Gradle plugin really quick. I'll change the plugin version to 0.1.1. The plugin repository won't allow you to publish the same version of the same plugin twice, so I'll need to bump the version number to allow my update to go through. Let's fix that PowerShell command. I'll change the double quotes after each of these arguments to single quotes. And instead of having a double dash right after `run`, I'll change it to `--command`; the rest of the command after that goes inside of double quotes. Now this is the proper way to run a Doppler command inside of PowerShell. If I execute this command, you can see it now publishes here as well, and it also reflects the version bump. Well, congratulations! Now your plugin is online, and the culmination of all your hard work has come to fruition. It would be great if we could get this process automated. In the next video, we'll explore having a continuous delivery pipeline do this work for us. 31. 
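For reference, the PowerShell-friendly shape described above looks roughly like the following. Quoting details vary between PowerShell versions, so treat this as a sketch: the key point is using `--command` with quoting that stops PowerShell itself from interpolating the variable references, leaving Doppler's child shell to expand them once the secrets are injected:

```shell
# In PowerShell, single quotes make a literal string, so $GRADLE_PUBLISH_KEY
# and $GRADLE_PUBLISH_SECRET are passed through to Doppler's child shell intact
doppler run --command './gradlew publishPlugins -Pgradle.publish.key=$GRADLE_PUBLISH_KEY -Pgradle.publish.secret=$GRADLE_PUBLISH_SECRET'
```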
Continuous Integration and Continuous Deployment: In the last video, I showed you how you can use your local command line to release your plugin to the Gradle plugin repository. However, if you're working on a professional project or with a larger team, it would make sense to have a continuous integration and continuous delivery pipeline do this work for you. In this video, I'm going to show you how to do all of that with the tools we've already selected, namely Gradle and Doppler. Thankfully, the tools we've chosen to use are highly portable, so even though this example shows you how to use GitLab CI, this would work incredibly well with GitHub Actions or Azure DevOps Pipelines, for instance. Since I used GitLab to host my repository when I was building this project, I'm going to use GitLab CI/CD to do the build and release process for me. Gradle will run on pretty much any machine that can run Java, thanks to the convenience of the Gradle wrapper script. As I can see here, Doppler has first-class support for GitLab CI/CD. This isn't really by chance: if I scroll down on their documentation site, you can see they have a host of integrations, including GitHub Actions, Bitbucket Pipelines, Azure DevOps Pipelines, and so forth. Basically, as long as you have a continuous integration provider that can run a shell script and a program from the CLI, you should be able to run Doppler. So the first thing I'm going to do to build up a new pipeline is make sure I can get Doppler running. The only thing that's necessary to authorize my pipeline with Doppler is an environment variable called `DOPPLER_TOKEN` that's scoped to my project. So inside the Doppler dashboard, I'll open up my project, select the appropriate environment, and then go to the Access tab, generate a service token, give it a name, and make sure that I save this token somewhere safe; I won't be able to see it again once I close the window. 
Then I'll go over to GitLab. I'll go to the Settings panel, then CI/CD, and expand the Variables section. Here I'll add a variable with a key of `DOPPLER_TOKEN`, and I'll paste in the value. I'll also check the "Mask variable" checkbox; just in case it would happen to show up in a log, GitLab should be able to strip it out for me automatically. I'll also ensure that the "Protect variable" checkbox is unchecked, and then I can click Add variable. That's all I need to do inside of the GitLab interface to set up Doppler, so now let's move on to the scripting. Keep in mind, the scripting that I'm going to be showing you here is specific to GitLab CI. If you're using another vendor for this, you'll want to follow their documentation to get these steps in order, but generally speaking, they're going to feel very similar to what I'm doing here. I'll open up my IDE, and at the root of the project I'm going to create a `.gitlab-ci.yml` file. I specify the base image as Gradle with JDK 11; that way I know that both of the dependencies that I need will already be preloaded inside the environment. Then I'm going to create two stages: build and deploy. The build stage will be my continuous integration pipeline: every time I push code up to GitLab, the build stage should run and ensure that there are no build errors inside my project. The deploy stage will always be a manual step for whenever I'm ready to deploy my project to the Gradle plugin repository. Let's start with the build stage. Under the build stage, I'm going to run two separate scripts. The first line will be the `publish` task; if you remember, it builds the project and then publishes it to a local folder for use as a local repository. On the following line, I'm going to run the two tasks that are available inside my demo subproject: extract metrics and extract summary. This will make sure that my project builds and is then able to run the two designated tasks successfully. 
There's nothing terribly fancy about this stage. However, with the deploy stage, we will need to use Doppler to help us publish the plugin. I'm going to add a `before_script` section under this new stage; this allows me to install any prerequisites before I run the actual build process. As of yet, the pipeline doesn't have any notion of Doppler, so I'll need to make sure I get that installed. Let's go over to their documentation and see how to do that. Back on Doppler's GitLab CI/CD page, let's see how they recommend you install it in this kind of environment. I'll scroll down and copy the line of code they give me to install the CLI. In their example, they have it in a `script` section. Personally, I prefer to put it in a `before_script` section because it's a prerequisite for the build. Functionally, it'll be the exact same thing; for me, `before_script` just made more sense conceptually. I'll paste in their suggested installation script. Then I'll create my `script` section, and I'm going to run the exact same command that we ran in the previous video to publish our plugin. I'll say `gradlew publishPlugins`, and then I'll pass in both the publish key and the publish secret as environment variables. And of course, all of this will need to be prepended by `doppler run --`. Since this project is coming to maturity, I'm going to open up the Gradle file that declares the version and bump this plugin up to 1.0.0. Now I'll push my changes up to GitLab and let's see what happens. I've created a merge request for this effort, and as you can see, this section indicates that the pipeline is already running for our job. Let's see what's going on. If I open up the build stage, I can see the build log as it's running. And pretty soon after it starts, I can see I get an error. If you get this error whenever you're running your builds, you need to adjust the execution permission of the Gradle wrapper script. 
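Putting the whole pipeline together, the `.gitlab-ci.yml` built up across these steps might look roughly like this. It is a reconstruction from the narration, not the course's verbatim file: the task names (`publish`, `extractMetrics`, `extractSummary`), the `demo` subproject path, and the secret names are assumptions, and you should copy the current Doppler install one-liner from Doppler's own GitLab CI/CD documentation:

```yaml
image: gradle:jdk11

stages:
  - build
  - deploy

build:
  stage: build
  script:
    - ./gradlew publish
    - ./gradlew :demo:extractMetrics :demo:extractSummary

deploy:
  stage: deploy
  when: manual   # deploy only when triggered by hand from the pipeline UI
  before_script:
    # Install the Doppler CLI (check Doppler's docs for the current one-liner)
    - (curl -Ls https://cli.doppler.com/install.sh || wget -qO- https://cli.doppler.com/install.sh) | sh
  script:
    # Single quotes keep the runner's shell from expanding the variables
    # before Doppler injects them for the child command
    - doppler run --command './gradlew publishPlugins -Pgradle.publish.key=$GRADLE_PUBLISH_KEY -Pgradle.publish.secret=$GRADLE_PUBLISH_SECRET'
```

The `DOPPLER_TOKEN` CI/CD variable configured earlier is what authorizes the `doppler run` call inside the runner.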
In the command line, I'm referring to this particular script right here. You can see Linux identifies it as being executable; however, when I pushed it up to Git, it does not. So I'm going to use Git to change the permissions on this file. I'll say `git update-index`, and then `--chmod=+x` to indicate that I'm adding the execution permission, and then I'll specify the file, `gradlew`. If I check my status, you can see this file was indeed changed, even though all I did was change the execution permission on it. I'll push up those changes, and you can see Git indicates that the mode changed: previously, the permission mode was 644; now it's 755. This should allow the CI environment to run the Gradle wrapper script. Let's try it again. I'll push up my changes and refresh the merge request, and I can see the pipeline is running again. Going into the build stage, I can see the process is continuing further than it was before, so that's a good sign. And after a few minutes, I can see the entire process finished. If I scroll down to the bottom of my job log, I can see that the job succeeded and that I'm getting the familiar metrics we've been seeing throughout this entire project. So it looks like everything is working. I'll go back to my merge request, and you can see I have a green checkmark for my build stage. Let's try the deploy stage. Since this is a manual action, I'll start it by clicking the Play button beside that stage. Once again, after just a few minutes, I can see the job has now published this plugin for me under version 1.0.0. So if I go to the Gradle plugin repository, I should see a new version of my plugin, 1.0.0, available for everyone to use. In my experience, pushing the work of publication and releases from your local machine to a continuous delivery pipeline makes total sense. That's how I do it professionally, and it's how I recommend you do it in a professional setting. 32. 
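The permission fix described above can be reproduced as follows; the demonstration below runs in a throwaway repository, with an empty `gradlew` file standing in for the real wrapper script:

```shell
# Demonstration in a throwaway repository: mark a file as executable
# in Git's index so CI checkouts receive mode 755 instead of 644
cd "$(mktemp -d)"
git init -q
git config user.email "demo@example.com"
git config user.name "Demo"

touch gradlew                      # stand-in for the Gradle wrapper script
git add gradlew
git commit -qm "Add wrapper"       # stored with mode 100644

git update-index --chmod=+x gradlew
git ls-files --stage gradlew       # now shows mode 100755
git commit -qm "Make Gradle wrapper executable"
```

After pushing a commit like the last one, the CI runner's checkout gets an executable `gradlew` and can run it directly.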
Course Conclusion: In these past few hours, we have covered a lot of ground, and I want to congratulate you on finishing your journey. You don't have to take these next few steps alone: I invite you to join a community of learners and builders on the same journey as you on my Discord server. I hope you found this course insightful. If you have an opportunity to leave a review, I'd be very grateful for it. I take your feedback seriously and personally respond to each one as I look to improve on future coursework that you may encounter. Again, stay curious, stay sharp, and as always.