Dell Boomi AtomSphere - IPaaS Beginner Training | Saad Qureshi | Skillshare

Dell Boomi AtomSphere - IPaaS Beginner Training

Saad Qureshi, Online Instructor

Watch this class and thousands more

Get unlimited access to every class
Taught by industry leaders & working professionals
Topics include illustration, design, photography, and more

Lessons in This Class

26 Lessons (5h 25m)
    • 1. Introduction and Architecture

      4:29
    • 2. Dell Boomi: Advantages and Disadvantages

      3:22
    • 3. ETL Framework

      2:07
    • 4. Dell Boomi Sign Up

      3:06
    • 5. MySQL Installation

      6:57
    • 6. Runtime Agent in Dell Boomi

      7:19
    • 7. Analyse your Data Set First

      3:27
    • 8. First Integration Process

      6:43
    • 9. Data Transformation and Mapping to Database

      24:00
    • 10. Data Loading and Process Deployment

      20:00
    • 11. Amazon S3 Integration With FTP Server Part-01

      19:43
    • 12. Amazon S3 Integration With FTP Server Part-02

      11:50
    • 13. Salesforce Integration With FTP Server Part-01

      17:02
    • 14. Salesforce Integration With FTP Server Part-02

      21:40
    • 15. Cleanse Shape and Return Document Shape Example

      19:51
    • 16. Add to Cache Shape Example 1

      13:12
    • 17. Add to Cache Shape Example 2

      12:31
    • 18. Properties in Dell Boomi Part-01

      18:44
    • 19. Properties in Dell Boomi Part-02

      8:09
    • 20. Decision and Exception Shape Example

      11:12
    • 21. Process Route Shape Example

      27:53
    • 22. Business Rule Shape Example

      13:30
    • 23. Consume SOAP Service

      9:39
    • 24. Consume REST Service

      9:39
    • 25. Build SOAP Service

      15:23
    • 26. Build REST Service in Dell Boomi

      13:19
  • Beginner level

Community Generated

The level is determined by a majority opinion of students who have reviewed this class. The teacher's recommendation is shown until at least 5 student responses are collected.

8

Students

--

Projects

About This Class

  • What is Dell Boomi AtomSphere?

    Dell Boomi is a cloud integration platform which is used to integrate different applications.

    What are we learning in this course?

    • Disk connector

    • Database connector

    • HTTP connector

    • Web services SOAP client connector

    • Web services server connector

    • Set properties

    • Map

    • Message

    • Process Call

    • Data Process

    • Add to Cache / Load from Cache

    • Branch

    • Cleanse

    • Return Document

    • Stop

    • Decision

    • Exception Handling

    • Web Services

    • Consuming SOAP and REST Services

    • Building SOAP and REST in Boomi

    • Deploy and Un-deploy process

    • Scheduling the process

    • Processing of Documents

    • and much more

    After this Course:

    Once you are done with the course, you will have solid knowledge of the Dell Boomi cloud integration platform and can easily apply these concepts to create many different integrations.

    Cheers!

    Have a great learning!

Meet Your Teacher

Saad Qureshi

Online Instructor

Teacher

Background:

Computer scientist with 5 years of industry experience overall and several years of freelancing experience. Other than this, I am passionate about teaching and guiding students who are learning programming languages.

Life Philosophy

Do not dwell in the past, do not dream of the future, concentrate the mind on the present moment.

Service to others is the rent you pay for your room here on earth.

 

See full profile

Class Ratings

Expectations Met?
  • Exceeded!
    0%
  • Yes
    0%
  • Somewhat
    0%
  • Not really
    0%
Reviews Archive

In October 2018, we updated our review system to improve the way we collect feedback. Below are the reviews written before that update.

Why Join Skillshare?

Take award-winning Skillshare Original Classes

Each class has short lessons and hands-on projects

Your membership supports Skillshare teachers

Learn From Anywhere

Take classes on the go with the Skillshare app. Stream or download to watch on the plane, the subway, or wherever you learn best.

Transcripts

1. Introduction and Architecture: Welcome to my course on Dell Boomi AtomSphere. What is Dell Boomi? Dell Boomi is a cloud integration platform used to integrate multiple applications. These applications can be cloud-based as well as on-premise, and Dell Boomi AtomSphere itself is hosted on a cloud server. It also has ETL capabilities. ETL stands for extract, transform, load: we can extract data from multiple heterogeneous sources, transform that data, and once the data is transformed, load it into a target. My second point is that Dell Boomi is an iPaaS, which stands for integration platform as a service: it is hosted on a cloud and consumed as a service. These are some of the applications you can integrate with the help of Dell Boomi AtomSphere: you can integrate Salesforce with an Amazon S3 bucket, an Amazon S3 bucket with Workday, or Oracle NetSuite with Google Storage. So you can integrate many different applications with the help of Dell Boomi AtomSphere. Now let's look at the architecture of Dell Boomi AtomSphere. Dell Boomi has a multi-tenant architecture. What does that statement mean? It means there is a single instance, and that single instance is accessed by multiple customers: Dell Boomi, hosted on a cloud server, is shared by many customers. Whenever we create any process in Dell Boomi, we need a runtime agent. The Atom Cloud and the Molecule are runtime agents hosted on a cloud, while the local Atom is a runtime engine installed locally. So, let me repeat this: to execute any process in Dell Boomi, you need an agent.
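The extract-transform-load idea described above can be sketched in a few lines of plain Python. This is purely illustrative: the data is inline and the "warehouse" is just a list, standing in for the heterogeneous sources and targets a Boomi process would actually connect to.

```python
# Toy extract-transform-load pipeline mirroring the ETL steps described above.
# All data is inline; no external systems are involved.

def extract():
    # "Extract": pull rows from a (here, hard-coded) source.
    return [{"product": "laptop", "sales": 100.0},
            {"product": "mouse", "sales": 25.5}]

def transform(rows):
    # "Transform": normalize the data, e.g. uppercase product names.
    return [{**r, "product": r["product"].upper()} for r in rows]

def load(rows, target):
    # "Load": write the transformed rows into a target store (a list here).
    target.extend(rows)

warehouse = []
load(transform(extract()), warehouse)
print(warehouse[0]["product"])  # LAPTOP
```

In Boomi the same three steps map onto connector shapes (extract/load) and the Map shape (transform), as the later lectures show.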
The agent is going to execute your process. The first kind is an agent hosted on a cloud; the local Atom is an agent (a runtime engine) that is installed locally; and Dell Boomi AtomSphere is the platform itself, hosted on a cloud server. Let's say we have three companies using Dell Boomi AtomSphere: company A, company B, and company C, and each has multiple customers. All of these customers use the Dell Boomi AtomSphere instance hosted on the cloud; that is the reason it is known as a multi-tenant architecture. Some of these customers use a local Atom. So that was a brief look at the Dell Boomi architecture. Now, what are we learning in the course? We are going to learn how to configure runtime agents, then profiles, processes, and different connectors. In Dell Boomi we have connectors such as the disk connector and the database connector: to communicate with a database we need the database connector, and to communicate with the disk we need the disk connector. Then we have the web services SOAP client connector and the web services server connector. We also have process design shapes such as the branch shape, process call shape, data process shape, add to cache shape, load from cache shape, cleanse shape, return documents shape, and so on. Then there is the concept of exception handling, and web services: in the web services section we are going to learn how to consume SOAP and REST services, and then how to build SOAP and REST services in Dell Boomi. Scheduling processes is also important, as is the processing of documents: we are going to learn how to reprocess a document. That is what we are going to learn in this course. Thank you so much, and have a great learning. 2. Dell Boomi: Advantages and Disadvantages: Hello everyone.
In this particular lecture, we are going to discuss the advantages and disadvantages of Dell Boomi AtomSphere. First, the advantages. My first point is that it integrates multiple technologies and applications. My second is that no programming language is required to use Dell Boomi: it provides a very simple and easy user interface, so you don't need to be an expert in any programming language. My third is that since Dell Boomi is hosted on a cloud server, no maintenance is required. My last point is that it supports both cloud-based and on-premise applications, so with the help of Dell Boomi you can integrate both kinds. Now let's discuss the disadvantages. My first is that cost is an issue, as the licensing is expensive: you can get a trial version for only 30 days, and if you want to use Dell Boomi for more than 30 days, you will have to pay, and the cost is high. My second is that there are not many shapes available for data transformation: if you want to transform your data, there are not many options built into Dell Boomi, and you often need to write custom scripts. My last point is that logging and debugging are hard in Dell Boomi. So those are some of the advantages and disadvantages of Dell Boomi. Now let's look at what expertise is required to use it. First of all, let me explain my own background: I am a computer scientist with several years of industry experience, and other than this, I am passionate about teaching and guiding students who are learning programming languages.
You don't need to be an expert in any programming language; basic knowledge of any programming language is good enough. Very basic SQL expertise is required, along with basic knowledge of ETL and cloud-based tools, and very little programming knowledge. So no major programming expertise is required to use Dell Boomi. These are the basic certifications you can pursue: the Associate Integration Developer certification and the Professional Integration Developer certification. You need to know the listed topics well in order to attempt these two certifications. 4. Dell Boomi Sign Up: Hello everyone. In this particular lecture, we are going to create a Dell Boomi free trial account. First of all, in the Google search bar I will search for the Dell Boomi AtomSphere training account, then click the link boomi.com/services/training. There you have to click Sign Up. In the sign-up form you provide information such as first name and last name, and then your company email address; a Gmail address will not work, so I am providing my company email address. Then you specify a password, phone number, country, state, city, and postal code, and at the end you provide the company name. Then click the Sign Up button. After this, Dell Boomi is going to send you the account details, such as your account ID and password. Since I have already created my Dell Boomi account, what I am going to do next is sign in: first of all, I open the Google search bar, search for Dell Boomi AtomSphere, and click the Dell Boomi login link.
I open my Dell Boomi login screen, provide my email address and password, and sign in. Next I go to the setup section. Let me open the Dell Boomi GUI. In this interface there are three important tabs: the Build tab, the Deploy tab, and the Manage tab. In the Build tab we are going to create our processes. Once a process is created, the next step is to deploy it, and for that we use the Deploy tab. To deploy a process we need an environment: generally, in Dell Boomi AtomSphere we create two types of environments, a test environment and a production environment, and once the process is created we deploy it to one or the other. The third tab is the Manage tab, which is used for process management and process reporting. Note that this trial account is accessible for only 60 days; it is going to expire after that. I hope you understood the concept. Thank you so much, and have a great learning. 5. MySQL Installation: Hello everyone. In this particular lecture, we are going to learn how to download and install MySQL on your computer system. First of all, in the Google search bar, I search for "download MySQL" and click the first link, dev.mysql.com/downloads. Next, I click MySQL Community Server. Then you have to select the operating system: if you have a Mac, you select macOS and download the installer from there. Since I have a Windows operating system, I select Microsoft Windows, go to the download page, select the installer, click Download, and then choose "No thanks, just start my download".
This starts the download process. Since I already have the installer on my system, I am going to cancel the download and show you the installer I downloaded earlier. Now I run the installer, and the installation process starts; click Yes. For the setup type, I select the Custom option, since I want to select only a few products rather than installing the full MySQL suite. Click Next. I select the MySQL Server (the latest version), then the MySQL Workbench application (again the latest), and then the ODBC connector. My machine is 64-bit, so I select the 64-bit version of the connector. Click Next. Now I provide the directory where I want to install MySQL: in my E: drive I create a folder called MySQL and set that as the install path. Click Next. The selected products show a "ready to install" status; click the Execute button. The installation process starts: first it installs MySQL Server, then it installs MySQL Workbench, and last but not least the ODBC connector. The installation completes and it is ready to configure. Now I configure MySQL: click Next, select the standalone MySQL server, and click Next. The port number is 3306; click Next. Use a strong password: I select admin123 as my password and repeat it to confirm. Click Next. Then comes the Windows service name.
MySQL90 is the service name. Click Next, then Execute, then Finish, Next, Finish. Now let me open Workbench: in MySQL Workbench, for the local instance, the username is root and the password is admin123, which I set while installing MySQL. That is how you install MySQL. Now let's look at Windows services: search for MySQL and confirm that the MySQL90 service is up and running. You have to make sure this service is up and running, otherwise the MySQL database connection is not going to work. Thank you so much, and have a great learning. 6. Runtime Agent in Dell Boomi: Hello everyone. In this particular lecture, we are going to talk about runtime agents in Dell Boomi. The job of an agent is to execute your Dell Boomi processes; without an agent, you cannot execute them. If I give you an analogy, an agent is like a driver: without a driver, you cannot drive a car. Likewise, in Dell Boomi, unless you have an agent configured, you cannot execute your processes; you need an agent in place. There are three types of runtime agents in Dell Boomi. The first is the Atom, which we call the local Atom and which we install on our local computer system. The second is the Molecule, which is hosted on the Dell Boomi cloud server. If I click the Molecule option, it shows that Molecule setup is not available in our account: since we are using the trial account, we cannot use the Molecule. The third is the Atom Cloud, which is also hosted on the Dell Boomi cloud server. So both the Molecule and the Atom Cloud are hosted on the Dell Boomi cloud server.
First of all, in this particular lecture, I am going to configure an Atom Cloud, which is hosted on the cloud server. To configure it, I go to the Atom Management section, click New, then click Atom. The first option is the local Atom; I will show you shortly how to install that on a local computer system. The second option is "in the cloud", which has two types: the Test Cloud and the Atom Cloud. Since we are using a test account, I choose the Test Cloud option; the other option is for production environments. Then I give it a name, say "Boomi Cloud": my Atom Cloud, hosted on a cloud server, is named Boomi Cloud. I click OK, and the new Atom is set up. In order to see it, go to Atom Management: it shows an up-and-running, online status, so you have configured your first Atom, hosted on the cloud server. Next I am going to configure my local Atom. To do that, you go to the Atom Management section, click New, then Atom, then choose the local option and select your operating system; I select Windows 64-bit, since that is what I am using, and click Download Installer. The download takes some time to complete. Then I show the downloaded file in its folder and run it as administrator ("More info", "Run anyway", then Yes). The installation process starts, and I am going to install it in my local directory. Click Next. I provide my username and password here. Then comes the name of my Atom, which has a default value.
If you want to change the Atom name you can, but I am going to keep the default. Click Next. I select my account and click Next. Then comes the directory where the local Atom will be installed. I change the directory: I don't want a path with a space in my C: drive, so I install it in my E: drive instead. Click Next, Next, Next. It now installs into my E: directory; let me show you the folder where it is being installed. The installation takes some time to complete. Once the local Atom has been installed on my system, I click Finish. Let me show you the directory where it is installed: it has installed into the Boomi AtomSphere folder. The installation is complete, and next I click Close. The local Atom is now configured, but it is in offline mode. Go back to Build, then to the Atom Management section: now it shows a "starting" status, and after some time it comes online. The local Atom has started. So that is how you configure a local Atom and an Atom hosted on the cloud: the first Atom is hosted on the Boomi cloud server and is in online mode, and the one I installed on my local system is also in online mode. I hope you understood the concept. Thank you so much, and have a great learning. 7. Analyse your Data Set First: Hello everyone. In this particular lecture, we are going to prepare for our first integration process in Dell Boomi. First of all, we are going to analyze our requirements. My first requirement is to analyze the data set I have got from the source system.
Let me show you my data set. The next step will be to read the products CSV data set using the disk connector. After reading the data set, we have to apply different transformation techniques to the data. Then, finally, we have to insert the data into the MySQL table as well as into a flat file. After developing the integration process, once the process is done, we deploy it to either the test environment or the production environment. Now let me walk through the data set. In this particular file I have details related to products: I have a product ID and a product name. If you look at the product name column, there are values like "computer" for different products; as per our requirement, we have to convert the product name to uppercase. Then we have the country column. I want the country to be in abbreviated form: for Australia it should be AU, for United Kingdom it should be UK, and for United States it should be US. Next we have the city field. In the city column there are some null values, and I want to replace them with a placeholder such as "N/A": I have two null values, and I will replace both with that value. Then we have the sales field. I want to round the sales column to two decimal places: some values are currently in three decimal places, and I want all values in two. Then we have quantity and discount. I want to show the discount as a percentage, so I will multiply the values by 100: 0.2 becomes 20%, 0.4 becomes 40%, and 0.3 becomes 30%.
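The transformations described so far can be sketched in plain Python. The column names and the abbreviation table below are assumptions based on the walkthrough, not an exact copy of the course data set; in Boomi itself these would be map functions inside the Map shape.

```python
# Illustrative versions of the column transformations described in this
# lecture. Column names and COUNTRY_ABBREV are assumptions based on the
# walkthrough; Boomi would do this inside the Map shape.

COUNTRY_ABBREV = {"Australia": "AU", "United Kingdom": "UK", "United States": "US"}

def transform_row(row):
    out = dict(row)
    out["product_name"] = row["product_name"].upper()       # uppercase
    out["country"] = COUNTRY_ABBREV.get(row["country"], row["country"])
    out["city"] = row["city"] if row["city"] else "N/A"     # replace nulls
    out["sales"] = round(row["sales"], 2)                   # two decimals
    out["discount"] = f"{row['discount'] * 100:g}%"         # 0.2 -> 20%
    return out

row = {"product_name": "computer", "country": "United States",
       "city": None, "sales": 119.992, "discount": 0.2}
print(transform_row(row))
```

Running this on the sample row yields an uppercase product name, the "US" abbreviation, "N/A" in place of the null city, a two-decimal sales value, and "20%" for the discount.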
Then we have the profit field, which I will also round to two decimal places: 10.849 becomes 10.85, for example. These are the transformations I am going to apply to my data set. The next step is to create the Dell Boomi process; in the next lecture, we start building our first integration process in Dell Boomi. 8. First Integration Process: Hello everyone. In this particular lecture, we will do the next step, which is to read the products CSV data using the disk connector. In Dell Boomi there are different types of connectors: for example, if you want to read data from your local computer system, you use the disk connector; if you want to connect to a database, you use the database connector; and if you want to connect to an FTP server, you use the FTP connector. In this particular example, I am going to use the disk connector, since I am reading the products CSV data from my local computer system. First of all, I create a new component inside my folder: I click New Component, and among the different component types in Dell Boomi, I choose the Process component and give it the name "Product CSV Transformation". I click the Create button to create this component inside the folder. Now, whenever you create a Dell Boomi integration process, the first step is the Start shape; this shape is mandatory. The Start shape is the main shape: it begins the process flow, it is automatically added to each new process, and it cannot be removed. There are different types of Start shape: a Start shape with a connector type, a Start shape with data passthrough, and a Start shape with no data.
If you want to connect to a data source within the Start shape, you choose the connector type; if you don't, you choose the no data type. Since I don't want to connect to any data source within the Start shape, I choose the no data type. The next step is to choose the connector. Since we are reading data from a directory on my local computer system, I choose the disk connector. Where do I get it? I go into the Connect section, where there are different types (general connectors and technology connectors), and I select the disk connector from the technology section. Now I connect the Start shape to the disk connector and configure it: click Configure, and choose the connector action. Since I am reading data from my local system, I choose the Get action. For the connection, I create a new one, a disk file connection, where I provide the path from which I am going to read the data; I paste the path here, then save and close. The connection name is "disk file connection". Now, which operation do I want to perform? I want a read operation, so I create a read disk connector operation. Rather than saving it right away, I configure the file filter: I specify the exact file name I want to read, and for the file matching type I select Exact Match. Then I click Save and Close. Both the connection and the operation have been configured, and I click OK. The disk connector is configured. Finally, I add a Stop shape, which indicates the end of my integration process.
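The read operation configured above boils down to "look in a directory and keep only the file whose name exactly matches the filter". A toy Python version (the directory and file names below are illustrative stand-ins, not the course's actual paths):

```python
# Toy version of the disk connector's read operation: list a directory and
# keep only the file whose name exactly matches the filter ("Exact Match").
# The directory and file names are illustrative stand-ins.
import os
import tempfile

directory = tempfile.mkdtemp()
for name in ("source_product.csv", "other.txt"):
    with open(os.path.join(directory, name), "w") as f:
        f.write("data")

file_filter = "source_product.csv"  # the exact-match file filter
matches = [n for n in os.listdir(directory) if n == file_filter]
print(matches)  # ['source_product.csv']
```

A wildcard matching type would instead compare against a pattern (e.g. `*.csv`); exact match is the strictest option and is what the lecture selects.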
Right now, I have completed the step of reading the products CSV data set using the disk connector. In the next lecture I will do the remaining two steps: perform the transformations on the data, then use the database connector to insert the data into the MySQL table, and use the disk connector again to write the data to a flat file. The next step now is to execute the integration process. As I said, whenever you need to execute an integration process, you need an agent; it is like a driver, and without a driver you cannot drive a car. Likewise, in Dell Boomi, without a runtime agent you cannot execute your integration process. I first save the process, then click the Test button, then choose an Atom: I select my local Atom and click Run Test, which executes the Dell Boomi process. You can watch it execute: the Start shape with no data runs first, then the disk connector shape, then the Stop shape at the end. The process has executed successfully. In the next lecture we will look at the flow of this process and how it is executed; after that, we will apply the different transformation techniques to the data set, and then insert the data into the MySQL table as well as the flat file. I hope you understood the concept. Thank you so much, and have a great learning. 9. Data Transformation and Mapping to Database: Hello everyone. In this particular lecture, I am going to explain how this process is executed, and then we will do some data transformation steps. First of all, let me explain the concept of a document.
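The shape-by-shape execution of the process built above can be modeled as a toy pipeline: each "shape" is a function that receives a document and returns a document for the next shape. Purely illustrative (Boomi documents are not Python strings, and the file content here is inlined):

```python
# Toy model of the process: Start -> disk read -> Stop, with a "document"
# handed from each shape to the next. The CSV content is inlined.

def start_shape(_doc):
    return ""  # Start shape with no data: emits an empty document

def disk_read_shape(_doc):
    # Stands in for the disk connector reading source_product.csv.
    return "product_id,product_name\n1,computer\n"

def stop_shape(doc):
    return doc  # end of the process; the last document is the result

doc = None
for shape in (start_shape, disk_read_shape, stop_shape):
    doc = shape(doc)  # a shape runs when the previous one hands it a document
print(doc.splitlines()[0])  # header row of the "CSV"
```

This mirrors the rule explained in the lecture: a shape executes when it receives a document from the shape before it, and its output document powers the next shape.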
If you look at the documentation, in Dell Boomi a document is something that powers the execution in a process flow. A second definition is that a document contains the actual data. For example, if you are reading data from a CSV file, as we are in this example, a document is produced that holds the content of the CSV file; likewise, if you were reading data from a MySQL table, a document would be produced that holds the content of the MySQL table. Now let me explain the execution of this process. First, the Start shape with no data executes. It does not do anything, so it produces an empty document and passes it to the next shape. If you want to view the document a shape produced, go to that shape and open its shape source data: for the Start shape, you will see that an empty document was produced. The disk connector shape also produces a document; since it reads data from a CSV file, its document holds the content of the file source_product.csv. In order to view that document, go to the shape, click Shape Source Data, and open it: after it loads, you will see the content of the CSV file. The next important point I want to explain is this line: "a document is something that powers the execution in a process flow". What is the meaning of this line?
So when this shape is executed, it passes its document to the next shape. If a shape receives a document, that shape will be executed; otherwise it will not be executed. That is the meaning of the line: a document is something that powers the execution. So this shape receives a document and executes, then passes a document to the next shape, which in turn executes, and so on. That is the process flow. Now we are going to perform the data transformation steps. Since we will insert data into a MySQL table, I will use the database connector: go to the Connect section and choose the Database connector. Before the database connector, I will add a Map shape, which you will find in the Execute section. Then I connect the previous shape to the Map shape, and the Map shape to the database connector. Next I will configure these two shapes. The purpose of the Map shape is this: here I am reading CSV data, and in the database shape I will insert that data into the MySQL table, so I have to map the source data to the target table, and this shape does that mapping. I will configure the database shape first. For the connector action I choose Send, since I will insert data into the table. I will create a new database connection. For the database type I select MySQL, for the username I enter root, and I specify my password here.
So I enter the MySQL password here and click Apply. For the host I write localhost, and the port number is 3306. Let me close this and open MySQL Workbench again. Okay, this is root at localhost, port 3306. So to establish the database connection, I enter the port number 3306 and the database name, which is dice. Next, I will download the MySQL Connector/J JAR file from dev.mysql.com: I choose the Platform Independent download and start it. The download has completed. Now I copy this mysql-connector-java JAR file and paste it into the lib folder of the directory where my Atom is installed. Look, the file has been added. Next I am going to test my connection: I choose my local Atom and click Next, and it tests the connection to the MySQL database. Okay, this gives me an error, so let me click Save and Close and then restart my Atom; without a restart it will keep giving the error, because the Atom has to pick up the newly added driver. After the restart, I open the database connection configuration again: root, localhost, port number, database name, everything is defined.
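For reference, the connection settings dictated above can be summarized like this. This is only a sketch for readability, not Boomi configuration code; the password is omitted, and the JDBC URL shown is the standard MySQL Connector/J form that these fields correspond to:

```javascript
// Database connection settings from the lecture (password omitted).
const dbConnection = {
  driver: "MySQL",   // selected in the connection's driver/URL drop-down
  host: "localhost",
  port: 3306,        // MySQL's default port
  database: "dice",  // the schema that will hold the product table
  user: "root",
};

// These fields correspond to a standard Connector/J JDBC URL:
const jdbcUrl =
  `jdbc:mysql://${dbConnection.host}:${dbConnection.port}/${dbConnection.database}`;
console.log(jdbcUrl); // jdbc:mysql://localhost:3306/dice
```

Note that this is also why the Connector/J JAR has to sit in the Atom's lib folder: the Atom is the process that actually opens this JDBC connection.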
Test Connection: I choose my local Atom and click Next. Look, the connection has been successfully established. Click Finish, then Save and Close. Next I am going to configure the operation. I will create a new operation, and the database operation I want to perform is Insert. For that I need a database profile, and to create it I want to import the metadata of the target table. So first of all I will create that table in my database. The database name is dice and the table name is product, with the columns Product ID, Product Name, Country, City, Sales (note that the Sales field has two decimal places), Quantity, Discount, and Profit. Let me execute this command; the table has been created. Now, back in the operation, I select Dynamic Insert and click Import. I choose my local Atom as the runtime, and instead of creating a new database connection I choose the connection I have already created. Click Next; it connects to the Atom and loads the database tables, and I choose the product table from the list. Click Next. So what I am doing here is importing the metadata of this table: the field names such as Product ID, Product Name, and so on, along with their data types. I will include all fields, click Next, then Finish. Look, the fields have been imported into Dell Boomi. I will name this profile "product database profile". Okay, the profile has been created; Save and Close. The connection as well as the operation have now been configured: the connection name is "database connection", the operation is "insert database operation", and the action is Send. Click OK; the database shape has been configured. Next I am going to configure the Map shape. I will name the map "CSV to database". For the source profile I will create a new profile, not a database profile this time but a flat file profile, named "CSV file profile". Under the record options I choose "Use Column Headers", because if you open the file in Notepad you can see it contains a header row and is comma delimited, so I select the comma delimiter and save. Then, under Data Elements, I import the metadata of the file: click Import, browse to where my file is present, select the file, click Next, then Finish. Look, Product ID, Product Name, Country, and all the other attributes have been added in Dell Boomi. Now if you look at the data types, the default data type of all these columns is Character. Product ID should be Number, so let me change it. Product Name stays Character, Country stays Character, City stays Character, Sales should be Number, Quantity should be Number, and Discount should be Number.
Likewise, Profit should be Number. Save and Close; the "CSV file profile" has been added. Next I am going to choose the database profile, which I have already created: the "product database profile". So now I am going to map my source data to my target data. On one side is my source, on the other is my target, and in between is the section where I perform the transformations. Let me open the source file and explain what I want. I want the Product Name field in uppercase; I want the Country field in the form of an abbreviation; and in the City field, wherever there is a null value, I want to replace it with N/A. First, for the uppercase conversion, I click the plus sign and choose the string function String To Upper. I map Product Name into the function, and the result, the uppercase string, goes to the target Product Name. Product ID I map directly, as it is. For Country, I will use the Custom Scripting option and write JavaScript. Looking at the source file, the Country values are United States, United Kingdom, and Australia, so the script is: if the country equals "United States", set the country to "USA", with a semicolon at the end; else if the country equals "Australia", set it to "AUS"; else set it to "UK". As the input I provide country as a character value, and as the output I get country. Click OK, then map the source Country into the script and the script output to the target Country, so the abbreviation is what gets stored in the destination Country field. What about City? For City I want to replace null values with N/A, so again I write a JavaScript custom script: if the city is an empty value, replace the city with "N/A". The input is city and the output is the transformed city value, so if there is any null value, it will be replaced with N/A. Click OK, and map City through the script to the target City field. As far as the Discount field is concerned, I am going to multiply it by 100, so I use the numeric function Math Multiply with the value to multiply set to 100, and map Discount through it to the target Discount. Finally, Profit maps to Profit, Sales maps to Sales, and Quantity maps to the destination Quantity field. Okay, Save and Close; the product CSV to database mapping has been completed and the Map shape is configured. Now I save the mapping, and to execute the process I click Test, choose my local Atom as the runtime agent, and run the test. It is executing. 10. Data Loading and Process Deployment: So the process has been executed successfully. Now let me check my table to see whether the data was inserted. So this is my table.
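The three in-map transformations described above can be written out in plain JavaScript. This is a sketch for readability, not the exact Boomi custom-scripting boilerplate: in Boomi, country and city are declared as script inputs and outputs rather than function parameters, and the Discount scaling is done by the built-in Math Multiply function rather than by code.

```javascript
// Country: full name -> abbreviation. Any other value falls through to "UK",
// since this source data only contains these three countries.
function abbreviateCountry(country) {
  if (country === "United States") {
    country = "USA";
  } else if (country === "Australia") {
    country = "AUS";
  } else {
    country = "UK";
  }
  return country;
}

// City: replace empty or blank values with "N/A".
function fillCity(city) {
  if (city.trim() === "") {
    city = "N/A";
  }
  return city;
}

// Discount: multiplying the stored fraction by 100 yields a percentage,
// which is what the Math Multiply function with value 100 does.
function discountToPercent(discount) {
  return discount * 100;
}

console.log(abbreviateCountry("United States")); // USA
console.log(fillCity(" "));                      // N/A
console.log(discountToPercent(0.25));            // 25
```

The same three functions describe the second map later in the lecture as well, since the flat-file branch reuses identical logic.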
Let me just query it. So the data has been inserted. Look, the Product Name field has been converted to uppercase. If you look at the Country field, it is represented in the form of an abbreviation: United States is represented as USA, United Kingdom as UK, and Australia as AUS. If you look at the City field, null values have been replaced with N/A. The Sales field is represented with two decimal places, and likewise the Profit field. And the Discount field now shows the value in the form of a percentage: 20%, 70%, and so on. Next, we are going to insert the data into a flat file as well: not only into the MySQL table, but also into a flat file. So I am going to add one more shape here. I go to the Connect section and add a disk shape; for now I am not configuring it. Next I will add a shape called the Branch shape, and one more Map shape, which we have already discussed. First, where am I going to put the Branch shape? Right before the existing Map shape, so this Branch shape will have two branches. What is the purpose of this shape? The Branch shape is used when you have several actions that you want to execute in sequence. Right now I have two branches; the maximum number of branches is 25, and the minimum is two. The Branch shape executes its branches in sequential order, so after connecting it, it will execute this first branch, and then it will execute my second branch.
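Conceptually, the Branch shape's sequential behavior can be sketched like this. This is an illustration only, not Boomi code; the branch labels and document value are made up:

```javascript
// Each branch receives the same incoming document; branch 1 runs to
// completion before branch 2 starts (sequential, not parallel).
function runBranches(doc, branches) {
  for (const branch of branches) {
    branch(doc);
  }
}

const order = [];
runBranches("csv-document", [
  (doc) => order.push(`branch 1 (map + insert into MySQL): ${doc}`),
  (doc) => order.push(`branch 2 (map + write flat file): ${doc}`),
]);
console.log(order); // branch 1's entry is recorded before branch 2's
```

This ordering matters: if branch 2 depended on something branch 1 wrote, the sequential guarantee is what makes that safe.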
So in my second branch, what I am doing is inserting the data into the flat file. Before the disk shape I am going to use a Map shape, then the disk shape, and at the end a Stop shape again: just add a Stop shape here and connect it. Now I will configure the disk shape first. Which action do I want to perform? The Send action. For the connection, I will use this same disk connection; let me open it. It will create and write a file in this particular directory. Save and Close. Which operation do I want to perform? I will create a new operation and choose the Write operation. Save and Close; this is the operation I am going to perform. Click OK; this shape has been configured. Next, the Map shape. I will create a new mapping for this, named "disk to disk", because I am reading data from a disk file and writing it back to disk, but the data I write will be the transformed data. That is the reason we use the Map shape here: in this section we perform the transformation steps. First I choose the source profile: I select the flat file profile I have already created, the "CSV file profile". For the target profile I choose the same profile again; I am reusing the profile I already created. So this is my source side and this is my target side, and in between I am going to perform the transformations. Product ID maps directly to Product ID.
For Product Name, I will convert it to uppercase using the String To Upper function, and map it to the target Product Name. For Country and City I am going to use the same JavaScript as before, so first let me map the straightforward fields: Profit to Profit, Quantity to Quantity, and Sales to Sales. Then, for Country, I quickly create the custom script: choose Custom Scripting, click OK, and select JavaScript. Let me open the source file again; I have to represent the country in the form of an abbreviation. So the script is: if the country equals "United States", set it to "USA", with a semicolon; else if the country equals "Australia", set it to "AUS"; else set it to "UK". As the input I take country, and the output value will be country. Click OK. Now I map the source Country into the script, and the script output to the target Country, so in this field I will get the abbreviation. What about City? I will do the same with City: Custom Scripting, click OK, Edit, and change the language to JavaScript. If the city is an empty value, replace the city with "N/A", with a semicolon at the end. The input will be city and the output will be city. Click OK, and City is mapped through the script: the source City is the input, and the output is the transformed value. So Product ID has been mapped, Product Name has been mapped, and Country, City, Sales, and Quantity have been mapped. For Discount, I am going to multiply it by 100.
So I will use the numeric function Math Multiply, click OK, and set the value to multiply to 100, so the result for Discount will be in the form of a percentage. Save and Close. Okay, let me save and arrange the shapes; now it is looking nice. Remember, the Branch shape executes the branches in sequential order: first this branch will be executed, and after that, this branch. Now let me execute the test: Save, choose the Atom, and run the test. First of all I have to truncate the table so the earlier rows are cleared. Look, a file has been generated, so the process has been executed successfully. Let me edit this file: look, USA, UK, the data has been transformed, and the null cities have been replaced with N/A. But if you look at the file name, it is showing something like a .dat extension. And in the table, the rows have been inserted again. Okay? Now, let's understand the flow of this process. This Start step produces a document with no content, then the document is passed to the disk step, which executes, reads the file, and passes its document to the Branch shape. The Branch shape first passes the document to the first branch; once that branch completes, it passes the document to the second branch. It executes the branches in sequential order.
Okay, so that important point has been cleared. Now, if you look at the document name, let me open a document and download the original document: you will see the name is in this .dat format. What I want is to change the name of this document. It should be product.csv, not like this. So I am going to use the Set Properties shape to set the name of the document. Where does the Set Properties shape go? I will introduce it right before the disk write shape, to change the name of the document. Let me configure the Set Properties shape: click plus, and for the property type choose Document Property. There are other options as well, but since we are changing the name of a document, I choose Document Property. Then I choose the Disk connector and the File Name property. I am not changing the directory, only the file name. Now I give the name I want to assign to this document: target_products.csv. Click OK, click OK; this shape has been configured. Save and execute. Now a document will be produced with the name target_products.csv. Look, and if you analyze the process flow, you will see that this Stop shape is not executed, because here we are performing the write operation. When a write operation is performed, no document is produced. That is the reason no document is passed to this shape, and as a result the shape is not executed; these two shapes are not executed. Okay?
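The rule that a document powers the execution, including why the shapes after the disk write never run, can be sketched as follows. This is a conceptual illustration only, not Boomi code, and the shape names are just labels:

```javascript
// Each shape takes a document and returns one; a write operation returns
// null because it consumes the document and produces none.
function runPath(shapes) {
  const executed = [];
  let doc = ""; // the Start shape with no data emits an empty document
  for (const [name, shape] of shapes) {
    if (doc === null) break; // no incoming document, so the shape never executes
    executed.push(name);
    doc = shape(doc);
  }
  return executed;
}

const path = runPath([
  ["disk read",  () => "ProductID,ProductName\n1,Chair"], // stands in for the CSV read
  ["disk write", () => null],  // Send/write: emits no document
  ["stop",       (doc) => doc],
]);
console.log(path); // logs the two executed shapes; "stop" never runs
```

The same model explains the earlier trace views: every shape you could inspect with Shape Source Data was one that had received a document.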
So now, this is the target products file. Let me open it; the file has been transformed. Look, the country is represented in the form of an abbreviation, the cities show N/A, and the product names are uppercase. Okay, what am I going to do next? I am going to deploy the process we have created. The last step of an integration process is to deploy it, either to a test environment or to a production environment. So first I do Save and Close. Then I will create an environment: I go to the Manage section, then Atom Management. We create an environment so that we can deploy our process into it, so I click the plus to add an environment. There are two classifications in Dell Boomi: production environment and test environment. Right now I am going to create a test environment, name it "Test Environment", and save. A new environment has been created. Right now there is no Atom assigned to this environment; its ID is shown, and its classification is Test. Okay, I am already in the Atom Management section: this is the test environment, and no Atom is assigned to it yet. So what am I going to do next? I am going to deploy my process to this environment. This is my process, and to deploy it I create a packaged component: I click Next, the deployment version is 1.0, I don't want to give any details here, and I click Create Packaged Component. "Your packaged component was successfully created. To deploy your package, click the Deploy button." So I click Deploy and choose the environment: we are going to choose our test environment.
We choose the test environment which we have just created, click Next, Next, and Deploy. Deployment successful; View Deployment. So the process has been deployed to the test environment: the version is 1.0, the component type is Process, it was deployed from this account, and the deployment date is shown. Okay, now I go back to Atom Management and open Attachments, and I assign my Atom to this environment. Okay, the Atom has been assigned, and under Deployed Processes you can see the process we deployed; if you want to execute it, you click Execute Process. So our first integration process has been completed: we have finally deployed our process to the test environment and the data is transformed. I hope you understood the concept. Thank you so much and have a great learning. 11. Amazon S3 Integration With FTP Server Part-01: Hello everyone. In this particular lecture, we are going to create our second integration process in Dell Boomi. The second integration process is this: we are going to integrate an Amazon S3 bucket with an FTP server. After that, we are going to integrate Salesforce with an FTP server. First of all, I will create a new component to implement this logic: in Dell Boomi, inside this folder, I choose New Component, select Process as the type, name it "S3", and create it. The component has been created. Next, I select "Start shape with no data" and click OK. Then I choose the connector: since we are reading data from an Amazon S3 bucket, I will be using the Amazon S3 connector. If you want to see where this connector is located, go to the Connect section.
You will see the Amazon S3 REST connector, and I choose this particular connector. The next step is to create the S3 bucket. In Google's search bar I write "Amazon S3" and click the link, then go to the My Account section and choose AWS Management Console. I sign in as an IAM user and provide my account ID; I have already created my account, and if you do not have an Amazon account, you can create one from here with "Create a new AWS account". I provide my Amazon password and sign in. Next, I search for S3. Right now I am creating the S3 bucket, and inside the bucket I will upload this product.json file. So this is the JSON file. Inside a JSON file, the data is represented in the form of elements: if you talk about tables, data is represented in the form of rows, but in JSON it is represented as elements. This is my first element, this is my second element, this is my third element, and likewise all of these elements sit inside the enclosing brackets. Now I am going to upload this product.json file. First I create a bucket: Create Bucket, and for the bucket name I write "boomitraining". The bucket name already exists, so I make it "boomitraining2020". Click Next, Next, Next, Create Bucket. The bucket has been created with that name. Next I upload a file into the bucket: click Upload, Add Files, select this product.json file, then Next, Next, Next, Upload. Look, the file has been uploaded. So the bucket name is boomitraining2020, and inside this bucket I have uploaded the JSON file.
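To make the "elements" idea concrete, here is the shape of a file like product.json. The field names and values below are illustrative, since the lecture does not show the file's exact contents, but the structure is the same: one JSON object per record, collected in an array, where each element plays the role of one table row:

```javascript
// Illustrative stand-in for product.json: an array of elements.
const products = [
  { productId: 1, productName: "Chair", country: "United States", city: "New York" },
  { productId: 2, productName: "Desk",  country: "Australia",     city: "" },
  // ...one element per product record
];

console.log(products.length);     // 2
console.log(products[1].country); // Australia
```

Reading this file through the S3 connector produces a single document whose content is exactly this JSON text.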
So now I am going to configure the Amazon S3 connector. Since we are reading data from the S3 bucket, I connect it to the Start shape. For the connector action, because we are reading, we choose the Get action. For the connection, I create a new Amazon S3 REST connection. Here I have to provide the AWS access key and secret key. To get an access key, I go to My Security Credentials, then Access Keys. Let me first delete the old keys I already had. Now I create a new access key: this gives me the access key ID and the secret access key, and I paste them into the connection and click Apply. Then I close this; the key status is Active. Now I click Test Connection, choose my Atom, and click Next to test the connection. The connection is tested successfully; click Finish, then Save and Close, and I name it "Amazon connection". So the connection is created; next I create the operation. I click Import, select my local Atom, and select the connection we just created, this Amazon connection. Click Next; it connects to the Atom and loads the buckets. Right now I have three buckets in my account, and I select this particular bucket, boomitraining2020. Click Next.
Okay, the object shown is the bucket name; click Finish. Save and Close, and I name this one "Amazon operation". So the operation is "Amazon operation" and the connection is "Amazon connection"; click OK. Next I set the parameter value: click plus, choose the ID parameter with type Static, and for the value I give the name of the file inside the bucket which we are reading, product.json; so the ID parameter will be the file name. Click OK; the Amazon S3 REST connector has been configured. Next I click Save, then the Test button, and select my local Atom. Now the process is executing; it is going to read the JSON file that is inside the bucket. The process has been executed successfully, so let me check the output. Look, this is the JSON file which this process has read. So this step reads the JSON file located inside the bucket and then passes a document to the next shape; if you want to see the result of that shape, you click the document and you can see the file's content. So my first step, configuring the Amazon S3 side, is done. The next step is to configure the FTP server. I choose the FTP connector, connect it after the S3 connector, and at the end I add a Stop shape. Before configuring the FTP connector, I will first download the local FTP server I am going to use: FileZilla Server. I have to install FileZilla Server on my computer system.
From here you can download it. I have already downloaded and installed it, so you just have to install it on your computer system. Let me open my FileZilla Server interface: this is localhost with the admin port number, and I connect. Next I am going to create a new user: I click Users, and from here I add a user. I have already created a user named "boomy", so I will add another user, "boomy2020", and provide its password. Then, under Shared Folders, I provide the directory where I want the files to be placed: I want to read the JSON file and place it in that directory, so I mention the folder path here. I browse into my directory and select the "FTP files" folder which I have already created. Click OK, make sure this user is selected, and click OK. Now, first let me check Windows Services to show you that the FTP server is up and running; yes, it is running. Now for the host name: I need to check my host name, and it is 127.0.0.1. So the host is 127.0.0.1, the username is boomy2020, and I provide my password, then hit Quick Connect. Okay, it reports a problem, and here I also have to provide the port number. The login failed, and I have to check something.
Let me first check the users in my FileZilla server — I have to see my username here. Let me just look at the username: boomi2020 is my username. Oh, the username I typed was incorrect; boomi2020 is my username. Now I'm going to connect again — I have successfully established the connection. Okay, so this is my root directory. Click OK. Now the server is up and running. The next step is to configure the FTP connector. Here I'm going to choose the FTP connector. Action: I'm going to perform the Send action. Connection: I will create a new connection, the Boomi FTP connection. Host name: I will provide the host name, which is 127.0.0.1. Username: I will provide boomi2020, and I will provide that user's password. Okay, boomi2020. Port number: leave it as it is — the port number is 21, or we could set a different port, say 14147 or something. I will first set the port number to 21. Save. Next I'm going to configure the operation. I will create a new operation; Boomi FTP operation will be the name. Remote directory: I don't want to provide any directory. Fine, save this. Next I'm going to test my connection — run Test. It is running right now; it should create a file inside this particular directory. Okay, it seems there is an error: FTP server, permission denied. The problem is that — one more important thing — I have to provide the permissions as well. There is a problem here. (I could also switch to the other port if I'm having an issue with port 21.) Okay, connect to this, then I will go to the users, select this particular user, and open the shared folder settings. Then I'm going to click OK. Select this — I will have to provide the permissions: read, write, delete, append — all permissions. So I will select all, then click OK. Now I'm going to execute this process again; save this process.
Now I am going to execute this process. It should create a file here inside this particular directory — this is the FTP remote directory. Look, a file has been created; a JSON file has been created. Right now the name is different. In order to change the name, I will have to use the Set Properties shape, so I will quickly change the name of this JSON file. I will click the plus sign, choose the document property, then choose which connector — I will choose the FTP connector — and then choose File Name. Here I'm going to provide the file name; let's say product.json will be the file name. Save your process. Now the file will be created inside this particular folder with the name product.json. I just deleted the old file. Click the Test button and execute the process. Now you will see the file here — look, a file has been created inside the remote directory. This is the remote directory path. Okay, look, this file has been created. So in the next lecture, we're going to implement the logic in between this connector and that connector; we will use the Route shape to implement our logic. So I hope you understood the concept. Thank you so much and have a great learning. 12. Amazon S3 Integration With FTP Server Part-02: Hello everyone. In this particular lecture, I'm going to include a few more shapes in our process. First I'm going to add the Route shape. The Route shape is located in the Logic section — you go into the Logic section and you will get the Route shape from there. Now drag and drop the Route shape. Basically, the Route shape routes your document to different paths based on the condition you provide here. Right now what I'm going to do is provide a condition on a particular field, the sales field. Before the Set Properties shape, I will include the Route shape.
So in the Route shape, I will provide the condition — a condition on the sales field. Here, for Route By, I have to select Profile, and for the profile type, since we're reading a JSON file, I will select JSON profile. I have to create a new JSON profile here for this particular file, so let me just create one. For the sample JSON, let me add this one element inside the profile and remove the comma here. Okay, everything is fine; now save this. Now I have to create the profile: select Import, and I will import this particular file which I've just created. Next — this is going to import the fields only: the product array element, with product ID, name, city, country, and sales. The sales amount would be a number, quantity would be a number, the data type would be number, and discount would be a number. Let me just save it as the product JSON profile — this is the name. Save and close. I will select this particular profile which I've just created. Element: I'm going to choose the element I will apply the condition on, the sales field, so I will select sales. It has been selected. Now, Route: equals to — no, what I want is greater than 20, on the sales amount. If the sales amount is greater than 20, it will go to this branch; otherwise it will go to the default branch. Let me just add the Stop shape on this default branch. So what is it going to do? The Route shape is going to evaluate only the first element. If you are reading data from a database table, it will evaluate only the first row of the table; if you are reading data from a JSON file, it is going to evaluate only the first element. So in the first element it is going to evaluate this sales field. As I have provided the condition, the sales amount should be greater than 20; if the sales amount is greater than 20, it will go to this particular branch.
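The Route shape's first-element behavior described above can be made concrete with a small sketch. This is not Boomi code — just a Python analogy, with the field names assumed from the lesson — showing why one unsplit document goes entirely down a single branch:

```python
import json

def route(document: str, threshold: float = 20.0) -> str:
    """Mimic the Route shape on an UNSPLIT document: only the FIRST
    array element's Sales value is inspected, and the whole document
    is sent to one branch accordingly."""
    first = json.loads(document)["product"][0]
    return "match branch" if first["Sales"] > threshold else "default branch"

doc = json.dumps({"product": [{"ID": "P1", "Sales": 18.032},
                              {"ID": "P2", "Sales": 45.5}]})
print(route(doc))  # the first element (18.032) decides for the whole file
```

Even though the second element's sales exceed 20, the whole document still routes on the first element — which is exactly why the Data Process (split) shape is introduced next.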
So in this particular scenario, the sales amount is not greater than 20 — it is less than 20 — so the document will be passed to this particular Stop shape. Now I'm going to save this and execute the process. Let me just select the atom and run the test. The document will go to this particular Stop shape — look, if you look at the document here, it went to this shape. Now let me just open this document (loading the file from the atom). Look, this is my document. Now what I'm going to do is change the condition here. The first value of the sales amount is 18.032, so let me just change the condition to 18.032 and save. Now the control will go to this branch, and the other branch will not be executed, because I've mentioned the first value, which is 18.032. Oops — okay, I have to change the operator; not greater than, I have to change it to equal to. Now test. Now it will go to the first match. It is executing the process — look, it goes to the first branch. Now, what am I going to do next? If I want all elements to be evaluated, I will introduce one more shape, which will be the Data Process shape. What will the Data Process shape do? It will split my document into multiple documents. Look, in this particular example, in this particular file, how many elements do I have? 1, 2, 3 … 12 elements. In this particular JSON file I have 12 elements. So what this Data Process shape is going to do is split my single document into 12 documents. Now I'm going to connect this to the Data Process shape: before the Route shape, I will place the Data Process shape, and I will connect the Data Process shape to the Route shape. Now I'm going to configure the Data Process shape. With the help of the Data Process shape, I can split a single document into multiple documents.
I can also combine multiple documents into a single document; I can do various different operations. Right now I'm going to split a single document into multiple documents, so that each and every element in my file gets evaluated. For the profile, I will select the JSON profile which I have just created, the product JSON profile. Split element: I want to split on the basis of ID — the product ID, because the product ID is unique. Click OK. Now it is going to split my document into multiple documents and then pass them to this particular shape. Now I will change my condition: the condition will be greater than 20. If the value is greater than 20, it goes to this branch; otherwise it goes to the other branch. Save. So in this particular branch, I will have the documents where the sales amount is less than 20. Look — let me just open this document. In this document the sales amount is less than 20; look, it is 18.032. All the files have been created. Let me just delete these and execute the process again. Now you will see multiple documents in this particular directory — and look, multiple documents have been produced. If I just randomly open a document, you will see the sales amount is greater than 20 in every document. And if I select the other branch here, multiple documents will be produced in which the sales amount is less than 20 in each document — look, 13.0773, which is less than 20. Next, I'm going to introduce the Data Process shape again, this time in order to combine my documents. Before the Set Properties shape, I will introduce this Data Process shape, so that I can combine my documents. That way I will have a single document instead of multiple files — only one file will be produced, which will have all the content. Okay, I will choose Combine Documents. Profile: I will choose the JSON profile I've just created; I will reuse this profile.
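The full split → route → combine pipeline being assembled here has a simple data-level meaning. As a sketch (again a Python analogy, not Boomi code, with assumed field names): split one JSON document into one document per array element, route each on its Sales value, then recombine the matching ones into a single document.

```python
import json

def split_route_combine(document: str, threshold: float = 20.0) -> str:
    """Sketch of the three-shape pipeline: Data Process (split by ID),
    Route (Sales > threshold), Data Process (combine array elements)."""
    elements = json.loads(document)["product"]
    # split: each element becomes its own document;
    # route: each split document is evaluated individually
    kept = [e for e in elements if e["Sales"] > threshold]
    # combine: merge the surviving documents back into one array
    return json.dumps({"product": kept})

doc = json.dumps({"product": [{"ID": "P1", "Sales": 18.032},
                              {"ID": "P2", "Sales": 45.5},
                              {"ID": "P3", "Sales": 13.0773}]})
print(split_route_combine(doc))
```

With splitting in place, every element is evaluated — not just the first — which is the whole point of adding the Data Process shape before the Route shape.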
I don't want to create a new one. In the combine-elements section, I will choose the array element — I want to combine on the basis of the array element, so all the array elements will be combined. Click OK, save, and now execute the process. Now you will have one file: one file will be generated with all the content, and the sales amount will be greater than 20 throughout. Look, greater than 20. Now let me just check the file which has been created inside the remote directory. Look — the sales amounts are greater than 20; all values are greater than 20. So I hope you understood the concept. In the next lecture, we are going to integrate Salesforce with the FTP server. Thank you so much and have a great learning. 13. Salesforce Integration With FTP Server Part-01: Hello everyone. In this particular lecture, we are going to integrate Salesforce with the FTP server. But before that, we're going to discuss a few more things in Dell Boomi. Right now what we are doing is reading the product.json file from the Amazon S3 bucket and delivering it to a remote server — an FTP remote server. This is the directory where the file will be created. So what if this FTP server is down? There is a possibility that the FTP server is down; then we will not be able to send the product file to it. The FTP server will generate an error — it will not allow me to create a file inside this particular remote directory. Right now, let me show you the status of this FTP server: it is up and running. The FileZilla FTP Server status is Running. Let me just execute the process. This is going to generate a product.json file inside this particular remote directory. I just deleted the old file, so this will generate a new one — this is the product.json file. Okay, now let me bring my server down.
This is the FTP server; right now it is up and running. Right-click, Stop — I'm going to stop it. The FTP server status is Stopped. Okay, now let me open this: status is Stopped, service status Stopped. Click OK. Now what I am going to do is execute this process again. Let me delete this particular file first, then execute the process again. I will get an error here, because this FTP server is down and I will not be able to create a file inside the remote directory. The process is executing — look: connection closed, caused by "connection closed without indication". Now, if I want to see the log of this process, I go here to View Log, where I can see all the process details. I want to see the debug info, all the details. Look — I am getting this particular error: unexpected error executing the process, error setting up the FTP connection. The reason it is giving me this error is that the server is down and I am not able to communicate with the FTP server. So what I want is to capture this particular error message and save it in a separate error folder which I am going to create inside this directory. I will create a folder here: the error log folder. I'm going to generate a text file, and inside the text file I will have the error details: this particular error, the error time, the process name, the process ID, and so on. Next, I'm going to introduce one more important shape, which is the Try/Catch shape. The Try/Catch shape is going to capture process-level as well as document-level errors. There are two types of errors in Dell Boomi: process-level errors and document-level errors. Right now the server is down — this is an example of a process error.
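The process-error vs. document-error distinction maps loosely onto ordinary exception handling: a malformed document can be caught per-document and the run continues, while a dead connection aborts the whole run. A loose Python analogy (not Boomi behavior verbatim, just the idea):

```python
import json

def process_documents(documents):
    """Analogy for document-level errors: a bad document (e.g. malformed
    JSON) fails only that one document; the rest still get processed.
    A process-level error, by contrast, would abort the whole loop."""
    results, errors = [], []
    for doc in documents:
        try:
            results.append(json.loads(doc))
        except json.JSONDecodeError as exc:   # document-level failure
            errors.append(str(exc))
    return results, errors

ok, bad = process_documents(['{"ID": 1}', '{not valid json}'])
print(len(ok), len(bad))  # -> 1 1
```

Selecting "All Errors" on the Try/Catch shape is like catching both kinds: per-document failures and failures of the run itself.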
An example of a document-level error could be, for instance, reading a JSON file where the JSON is not in the correct format — that would be a document-level error. But if your FTP server is down and you are not able to communicate with it, that error is an example of a process error. If I select All Errors, the shape is going to capture both process errors and document errors. So right now, before the FTP connector, I'm going to introduce this Try/Catch shape. The Try/Catch shape is going to capture this particular error and, after capturing it, pass the error to this catch branch. After this, I am going to introduce one more shape, the Message shape. In the Message shape, I'm going to include this particular error. The Message shape is going to produce a document, and inside the document it will have the error which was captured by the Try/Catch shape. So here I'm going to add the error message. Let me just connect this to this shape. At the end, I'm going to introduce the same FTP connector — let me just add it — and after that, the Stop shape. Connect these. First of all, I will configure the FTP connector. Action: since I am going to perform a write operation into a folder, I'm going to select the Send action. Connection: I'm going to use the FTP connection. Operation: I will select the Boomi FTP operation. Here I could specify the remote directory — but no, I don't want to specify the remote directory here, because I am using the same connection and the same Boomi FTP operation, this particular operation I've used here.
So if I change the directory here, the directory will be changed there as well. I should create a new operation — actually, let me just select the same operation, this FTP operation. Save and close. Now I'm going to show you how I'm going to write a file here inside this error log folder. Fine, let me just connect the Message shape to this shape. Here I'm going to select a variable — so how do you define a variable? This is how you define a variable, and here I'm going to give the variable value. For the first variable, I will select Document Property; then, in the document property, I will select the Meta Data option, then the Base option, and then the Try/Catch Message. Click OK. So this is my error message. Along with the error message, I will add a timestamp — the process failed at this particular time, so I will provide the date as well. The date would be the current date, or date and time. Okay, current date — let me select this particular option. For the third variable, I will select Execution Property and pick the process name. So the process name will be my third variable. Here I'm going to get the variable value: I select the More option, go to Execution Property, and the process name has been selected. Let me also select the process ID — the process ID will be the fourth one. Process ID, fine. Display name: the error details. Also, before this FTP connector, I'm going to introduce one more shape, which will be the Set Properties shape. With the help of the Set Properties shape, I'm going to change the name of the file which will be created inside this particular folder. Okay? Also, about where it gets created right now:
It is going to create a file inside this particular directory — but what I want is to create the file inside this other directory, the error log folder. The reason it would create the file in the default location is that here I have selected this particular operation with the Send action, and in this operation the remote directory selected is the default one, this remote directory. So right now I'm going to change the path where the file will be generated. Configure this shape: not only am I going to change the path, I am also going to change the file name. I will select the FTP File Name property — the file name will be error.txt. Then I will select one more property, Remote Directory. The remote directory will be this path, a backslash, and then the error log folder name — let me just give the folder name. This is going to be my remote directory where the error file will be generated. Click OK. The file name will be error.txt, and this is the remote directory where the error.txt file will be generated. Now let me connect this to the FTP connector and save. Now I'm going to execute the process again. My server is down — let me just refresh to confirm. How do I refresh this? Right now the server status is Stopped. So this should generate the error file inside the error log folder. Save, run test. The FTP server is down. Okay — here is the mistake we made: instead of the FTP connector, I should use the Disk connector here, since the FTP server itself is down. So remove this, and use the Disk connector instead of the FTP connector. Now I'm going to quickly configure this. This is going to generate the file in this particular folder. Save and close. Now let me go to the directory: here I will create a folder, error log. The operation is a write operation, and the path is this, okay?
Now, with the help of the Set Properties shape, I will create a file inside this particular directory, not in that other directory. So instead of that directory, I will create the file in this directory. Now save and close. Fine, a write operation — click OK. With the help of the Set Properties shape, let me just change it: instead of FTP, I will use the Disk connector's file name property — fine — and the remote directory. Edit this: instead of FTP, I will use the Disk connector's directory property. Edit this; here I'm going to provide the complete directory path, like this. Okay, copy this — I will provide the complete directory path. Fine. Click OK, click OK. Now save this and test — execute the process. Now you will see the error file generated inside this particular folder. Look, a text file has been generated; open it with Notepad++. Look: this is the process name, this is the process ID, this is the error message, and this is the time when the process failed. The Try/Catch shape captured this particular error, and the Message shape generated a document, and inside the document it has the error details — these particular details. Now, in order to see the document of this particular Message shape, you go here and view the document; inside the document you will have the error details, like the process name, the error message which we mentioned, and then the time when the process failed. And this then gets written to a separate file in a separate folder, the error log folder. 14. Salesforce Integration With FTP Server Part-02: Hello everyone. In this particular lecture, we are going to integrate Salesforce with the FTP server. So what am I going to do next? I'm going to create a new process, and in the new process I'm going to implement this scenario.
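Before moving on: the error document assembled at the end of the previous lecture combined four values — the Try/Catch message, a timestamp, the process name, and the process ID. A sketch of the equivalent string building (the label layout here is made up; in Boomi the format is whatever you type into the Message shape):

```python
from datetime import datetime

def build_error_log(message: str, process_name: str, process_id: str,
                    when: datetime) -> str:
    """Combine the Try/Catch message, a timestamp, and the process
    name/ID into one log line, like the Message shape's variables."""
    return (f"Process: {process_name} | ID: {process_id} | "
            f"Time: {when:%Y-%m-%d %H:%M:%S} | Error: {message}")

line = build_error_log("Error setting up FTP connection",
                       "S3 to FTP", "exec-123",
                       datetime(2020, 5, 1, 10, 30, 0))
print(line)
# This line is what would land in error.txt in the error log folder.
```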
This will be my main process, and from this main process I'm going to call a new process which I'm going to create. To call a process, which shape am I going to use? The Process Call shape. With the help of this shape, I'm going to call my subprocess. Here, before the Amazon S3 REST connector, I'm going to include one more shape, the Branch shape. It will have two branches: one will go to this branch, and the other will go to the Data Process shape. And this is the Process Call shape. So this is going to be my second branch — this is my first branch and this is my second branch. Now, first of all, let me create a new process. I will call it "Salesforce to FTP server integration". Create. This is going to be my subprocess — I will call this subprocess from the main process. Next I'm going to configure the Start shape: right now I'm going to select the Start shape with No Data. So what is the Start shape's Data Passthrough option? You select that option when you have to pass values from the main process to your subprocess — in your subprocess, you'd then use the Start shape with data passthrough. Right now we're not passing anything to the subprocess, so I will select the type No Data. You can also select the type Connector if you want to configure a connector in the Start shape; right now I don't want to configure it in the Start shape, I will use another shape. So first I will select the Start shape, No Data. In this particular subprocess, I will implement the logic of the Salesforce-to-FTP-server integration. Save this, save and close. Then, in the Process Call configuration, I will choose "Salesforce to FTP server integration".
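The Process Call shape works like a function call in a programming language — the transcript itself makes this analogy later. A tiny sketch (names are made up for illustration):

```python
def salesforce_to_ftp():
    """Stand-in for the 'Salesforce to FTP server integration' subprocess."""
    return "subprocess ran"

def main_process():
    # The Process Call shape: the main process invokes the subprocess
    # by name, waits for it to finish, and then continues.
    return salesforce_to_ftp()

print(main_process())  # -> subprocess ran
```

Like a function, the subprocess can be reused from several main processes, and (via the data-passthrough Start shape) can optionally receive the caller's documents as its input.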
So this is my subprocess, which I'm going to call from the main process. Now let me configure the subprocess — the one I'm calling from the main process. Next I'm going to use the Salesforce connector; I will connect this shape with this shape. First of all, I need to have a Salesforce account. If you do not have an account, you can get the free trial version; since I have an account, I'm just going to enter my username and password. This is my username which I got from Salesforce, so I will just copy it, paste it into the username field, provide my Salesforce password, and click Log In. The second step, after creating the Salesforce account and logging in, is to get the security token, because you need this security token in the Salesforce connection configuration. To get the Salesforce security token, you go into the Settings. Let me just open this first and refresh it. Now I'm going to go into Settings, then to the security token section: Reset My Security Token. You have to click Reset My Security Token, and after this Salesforce is going to send you the security token. You are going to use the security token in the Salesforce configuration in Dell Boomi. Okay, so now I'm going to configure the Salesforce connector. In order to configure it, I will choose the connector type Salesforce. Action: since we are going to read data from Salesforce, I will choose the Get action. Connection: I will create a new Salesforce connection; I will call it "SF account connection". Here I'm going to provide the username — I will use the username which I got from Salesforce. (If you want to reset your password, you go into Change My Password.)
Then next, go to Reset My Security Token. Okay — so this is my username. Next I'm going to provide the password here. The password that you're going to provide here is the combination of your security token and your Salesforce password. For example, let me show you: suppose your Salesforce password is ABC, and your security token is this string. What you are going to do is concatenate your password and your security token, and then provide that combined value here in the password field. So I will first enter my Salesforce password, then my security token — this will be my security token; copy and paste it here. Make sure there is no space. Apply, then save and close — this is your account connection. Next I'm going to create a new operation; I will call it "SF account operation". Then I'm going to import my profile: I will select the connection which I've just created, SF account connection, and click Next. This is going to give me the objects — these are the different objects: Account, Contact, Lead. I'm going to fetch the information of the Account object. Okay, it seems there is an error: invalid username and password. I have to check whether I have given the correct connection details or not — I need to check my password, and provide my password and security token again; maybe I've made a mistake. Let me just provide it again. Okay, save and close. Now let me create my operation again — first let me create it and give it a name. I will choose the Salesforce account connection, then click Next, and it logs into Salesforce. Okay, now I'm going to select the object, Account. Action: I'm going to perform a Query, since I'm going to get the account details from Salesforce.
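As described above, the credential rule is plain string concatenation: the value pasted into Boomi's password field is the Salesforce password immediately followed by the security token, with no separator. A trivial sketch (both values are obviously made up):

```python
salesforce_password = "ABC"        # your normal Salesforce login password
security_token = "Xy9QrT2kLm"      # emailed after "Reset My Security Token"

# The Salesforce connection expects password + token, no space between:
connector_password = salesforce_password + security_token
print(connector_password)  # -> ABCXy9QrT2kLm
```

Note that resetting the security token invalidates the old one, so the connector password has to be updated every time the token is reset.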
Then I'm going to click Next. Okay, it is just processing, so I will click Next. The object type is Account, and this is my profile; click Finish. Now select this and uncheck this option, so that I only get the attributes of my choice. I will select Id, Name, Type, and then a few other details: I will select Billing Country, then Shipping City and Shipping Country, and then the phone number — only these details I am going to get. I will click Save and Close. Here I have to provide the operation name — let me just provide it: I will call it "Salesforce account operation", a read operation. So my Salesforce connector has been configured. Next, I'm going to use the FTP connector here. Before using the FTP connector, I need to check whether my FTP server is running — it is stopped, so I will have to start the server. Now it is up and running. Now I'm going to quickly configure the FTP connector. Connection: I'm going to choose the FTP connection. Action: the Get operation — no, I have to choose Send; since I am going to perform a write operation, I will select Send, not Get. Connection: the FTP connection. Operation: the FTP connector operation — let me just check this operation; its action is Send, okay. Now save and close. Before this, let me add the Stop shape. And before the FTP connector, I'm going to use the Set Properties shape. Here's what I want: in this particular directory, I'm going to create another folder — "Salesforce account" is the name of the folder — and in that folder I want to create the file. For this, I need to set the properties: I need to change the directory, and to set these properties we need to use the Set Properties shape.
So before this FTP connector, I'm going to use the Set Properties shape, so that the file which I want to write is written in this particular folder. First of all, I will choose the FTP connector and then choose File Name. Then I'm going to choose the FTP connector again, and this time choose Remote Directory. For the remote directory, I'm going to give a slash and then this directory, Salesforce account — I have to provide a backslash and then the name of the folder. The file name should be a static value, and it should be an XML file, so I'm going to provide salesforce_account.xml — the file which I'm going to read from Salesforce will be in XML format. Now save this, save and close. So the process has been configured; the next step is to execute it. Now you will see a file generated inside this folder. I'm going to select my runtime agent. A file will be generated here — in fact, multiple files will be generated, since I have multiple accounts. Not only this one account; I have many accounts, so it is going to give me all the account details. It is running: first this branch is going to be executed, then this process. Now, if you want to see the subprocess, you can click the Process Call shape and go directly to your subprocess. Now you can see multiple files have been generated. This is the first file which was generated — my first account. And let me show you my second account: the name of the second account is this, the type is this, the shipping country is USA, and the phone number is this. Now, what I want is to convert this XML file into a CSV file.
So what can I do here? Let me go to my subprocess. Before this, I am going to add the Map shape in order to convert the XML file to CSV. I have to connect this, then configure: create a map, "XML to CSV". For the source side, I will choose a profile first. I need to create a new profile — for the file type, I select XML, not flat file. Create a new profile, "Salesforce XML profile"; do not mention a file name yet. Import: choose the file to import — this is going to import the attribute fields. Finish. Look, the fields have been imported. Save and close. On the other side, I'm going to provide the profile for the CSV. First of all, I need to create a sample file, so I will just create one here: I need to provide the column headers, like account ID, then name, then type, then billing country, then shipping country, and last the phone number. I don't need to specify any data, since I only want to import the headers. I'm going to save this file as account.csv — I will save it as a CSV (comma delimited) file. Save this and close it. The file has been saved; remove the old file. Now I'm going to create the profile — this time a flat file profile. Create a new profile; I will choose Use Column Headers, delimited, comma delimited — this file is comma delimited. Why is it called comma delimited? Because, look, it's a comma-separated file. Let me just import my file here — I have to import the attributes. Finish. Look, the fields have been imported. Now save and close; I need to give it a name: the name will be "Salesforce CSV file". Save and close. So next I'm going to map it.
I am going to map ID to ID, name to name, type to type, billing country to billing country, shipping country to shipping country, and phone number to phone number. Save and Close, then click OK. Now in this particular directory multiple files will be generated, but these files will be in CSV format. First of all, let me delete the old files. I have to save my process: Save and Close. Now back to my main process. This Process Call shape is going to call that subprocess; it's like a function. In high-level programming languages we use the concept of a function and call it from the main body; similarly, here we are calling a subprocess from the main process, and the Process Call shape is what we use for that. Now I am going to execute my process. You will see multiple files generated, in CSV format; the reason multiple files are generated is that we are reading multiple accounts from Salesforce, not just one. But look, it is still giving me a .xml extension. I know the error: in the Set Properties shape I set the file name to .xml, so I need to change that to .csv as well. Now it will give me the proper format. Save and Close, delete the old files, and execute the process again. Now you will get the files in the proper CSV format. The process has been executed. If you want to go from the main process to the subprocess, click the Process Call shape and then click through; you will go directly to the subprocess. Multiple files have been generated, and the files are in CSV format.
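Conceptually, what the map shape does here is read each XML account document and emit one comma-delimited row under a column-header line. A minimal sketch in Python, with a hypothetical account record shaped like the files written in this lesson:

```python
import csv
import io
import xml.etree.ElementTree as ET

# Hypothetical account document, shaped like the salesforce_account.xml
# files written by the FTP connector in this lesson.
ACCOUNT_XML = """
<Account>
  <Id>0015g00000XYZ</Id>
  <Name>Saad Analyst</Name>
  <Type>Customer</Type>
  <BillingCountry>USA</BillingCountry>
  <ShippingCountry>USA</ShippingCountry>
  <Phone>992-2342</Phone>
</Account>
"""

# Same field order as the CSV profile's column headers.
FIELDS = ["Id", "Name", "Type", "BillingCountry", "ShippingCountry", "Phone"]

def xml_account_to_csv(xml_text: str) -> str:
    """Map one XML account document to one CSV document:
    a header line followed by one comma-delimited data row."""
    root = ET.fromstring(xml_text)
    row = {f: (root.findtext(f) or "") for f in FIELDS}
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerow(row)
    return buf.getvalue()

csv_text = xml_account_to_csv(ACCOUNT_XML)
print(csv_text)
```

Like the process above, this would run once per account document, producing one CSV file per XML file.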
So this is how you convert one file format to another using the Map shape. I hope you understood the concept. Thank you so much, and have a great learning. 15. Cleanse Shape and Return Document Shape Example: Hello everyone. In this particular lecture, we are going to create an account in Salesforce. This is my Salesforce account, and in the Account object I will create a new account. Let me select All Accounts; these are all the accounts present in this Account object. First of all, I'm going to read an account file which is present in an Amazon S3 bucket. I have uploaded this particular file, account.xml, to the bucket, and I'm going to read it from there. In this account file I have account ID, name, type, billing country, shipping country, and phone number. I'm not providing all details, just a few for understanding purposes. This is the bucket I created, and I uploaded the file into it; if you want to upload a file, click Upload and select the file you want to upload. So this is the file I'm going to read. First I'm going to create a new component: click the plus sign, and the name of the component will be "S3 bucket to Salesforce". Click Create; a new component has been created. Next I'm going to select the Start shape with "No Data" and click OK. After that, I'm going to select the Amazon S3 connector, since we're reading data from the Amazon S3 bucket, and configure it.
I'm going to perform the Get action. I've already created my connection, so there is no need to create a new one if you already have it. To be on the safe side I'm going to test the connection first, just to check whether it works or not. It connected successfully: Save and Close. For the operation, I've already created an Amazon S3 operation too, with this object, so no need to create a new one. Save and Close. Next I'm going to specify the parameter value: since we are reading the account.xml file, I will set the ID parameter to the static value "account.xml". Click OK. So the first step is done; this step is going to read the account.xml file. For the next step, I'm going to call a subprocess, and for that I will use the Process Call shape. This will be my main process, and I will create a new subprocess in which I will do some data cleansing. Let me save first, then create the subprocess and call it "data cleansing". In the subprocess, I will select the Start shape with the "Data Passthrough" option, because we are going to pass a document, the account.xml document, from the main process into this subprocess. I have to click "make the recommended change for me": it recommends disabling "capture run dates" in order to execute the process effectively. Make the recommended change and click OK.
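The main-process-plus-subprocess pattern described above can be sketched as an ordinary function call. This is only an analogy, with hypothetical document contents; "data pass through" means the subprocess receives the caller's document rather than fetching its own data:

```python
def cleanse_subprocess(document: dict) -> dict:
    """Subprocess with Start shape set to 'Data Passthrough':
    it receives the caller's document instead of reading its own input."""
    # A trivial stand-in for the cleansing work done inside the subprocess.
    cleaned = {k: v.strip() if isinstance(v, str) else v
               for k, v in document.items()}
    # Return Documents shape: hand the result back to the caller.
    return cleaned

def main_process() -> dict:
    # The main process reads the document (here a hypothetical record) ...
    document = {"Id": " 123 ", "Name": "George "}
    # ... then the Process Call shape passes it into the subprocess.
    return cleanse_subprocess(document)

result = main_process()
print(result)
```

The Process Call shape plays the role of the function call, and the Return Documents shape plays the role of the `return` statement.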
And next I am going to select the Cleanse shape. This shape has two branches: the first is the cleansed branch and the second is the rejected branch. What is the purpose of the Cleanse shape? The Cleanse shape enables you to validate document field values and either repair or reject the document before further processing. Let me repeat that: it validates your document field values. To better understand the meaning of this sentence, let me give you an example. First of all, let me configure the shape. For the profile type, since we are getting XML files, I will select XML. For "cleanse data using", I've already created a profile, so no need to create another one. Right now I have these elements in my profile; let me edit the profile and define some criteria. For ID, I'm going to select field length validation, specifying that the minimum length of the ID should be 0 and the maximum length should be 50 characters. So I'm defining this criteria on the ID field. Save it. Next, I'm going to define a criteria on the phone field: the minimum length will be 0 and the maximum 30 characters. Save and Close. Now back in the Cleanse shape you will see options for each of these fields.
For ID, the element has a maximum length restriction; if the value is too long you have multiple options: you can reject the document, or repair it by trimming leading characters or trimming trailing characters. For ID, if it is too long, I will select "trim leading characters". What about the phone number? If it is too long, you again choose between rejecting the document and repairing it (trim leading or trim trailing characters). I will select reject: I will completely reject the document if the phone number length doesn't meet the criteria, because that means the phone number is not valid. Click OK. Next I'm going to add the Return Documents shape: after performing the data cleansing task, I want to return the document to my main process, and for that I use the Return Documents shape. After this, on the rejected branch, I'm going to use one more shape, the Message shape, to capture the rejection reason, the error message. Here I'm going to write "rejected reason: {1}"; this is how you define a variable, and the value of this variable will be a document property. Under document property, select the Metadata option, and then select "Cleanse Result Message" to fetch the exact error message. Click OK. So this Message shape will produce a document containing the rejection error message. Next, I'm going to use the Stop shape. I could instead use the Disk shape to write the error message to my local directory, or the Mail connector to send the error message to the concerned team, but I'm using the Stop shape for now. So the subprocess is completed.
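The repair-or-reject behavior of the Cleanse shape can be sketched as a small validation function. This is an illustrative model under the field rules set above (ID: max 50, trim leading; phone: max 30, reject), not Boomi's implementation:

```python
class Rejected(Exception):
    """Raised when a document fails validation and goes to the rejected branch."""

def cleanse_field(value: str, max_len: int, on_too_long: str) -> str:
    """Validate one field the way the Cleanse shape does: if the value
    exceeds max_len, either repair it or reject the whole document."""
    if len(value) <= max_len:
        return value
    if on_too_long == "trim_leading":
        # Drop characters from the front, keeping the last max_len.
        return value[len(value) - max_len:]
    if on_too_long == "trim_trailing":
        # Drop characters from the end, keeping the first max_len.
        return value[:max_len]
    raise Rejected(f"field value exceeds maximum length {max_len}")

# ID rule: max 50, repair by trimming leading characters.
assert cleanse_field("X" * 60, 50, "trim_leading") == "X" * 50
# Phone rule: max 30, too long means reject the document.
try:
    cleanse_field("9" * 40, 30, "reject")
except Rejected as err:
    print("rejected:", err)
```

In the real shape the rejection reason ends up in the Cleanse Result Message metadata, which is what the Message shape reads on the rejected branch.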
This section is completed. Let me open the Cleanse shape configuration again to confirm: for ID, if the value is too long, trim leading characters; for phone, if too long, reject the document. Save and Close. Back in the main process, in the Process Call shape I'm going to choose the data cleansing process I've just created and click OK. After this, I'm going to choose the Salesforce connector to create an account in Salesforce, and configure it. I will perform the Send action. For the connection, I'm going to choose my Salesforce connection: this is my username, and in the password field I provide my Salesforce password combined with the security token, that is, the concatenation of the Salesforce password and the security token. Save and Close. Next I'm going to create a new operation named "Salesforce account operation" and click Import. I will select the connection I just chose, click Next, and log in to Salesforce. For the object type I will select Account, since we are creating an account. For the action, I will select Create, since I'm creating an account in Salesforce. Click Next, then Next on the object tree, then Finish, then Save and Close, then OK. After this, I'm going to add the Stop shape at the end of the process. Now I am going to save and execute the process with Run Test. The process is executing; this is going to create an account in Salesforce. The account has been created successfully. Now let me check whether the account was created or not.
I will go to All Accounts, and the account has been created. Let me show you the file I used: the name is George, the type is Customer. There are some details I did not provide, like the billing state, so those fields will simply be empty. It gives me the name George, this exact phone number, and the account owner name, Saad. So the account has been created. Now let me go to my subprocess to check the document flow; it is loading. Look: the Cleanse shape has sent the document down the cleansed branch. Next, I'm going to change the logic so that the document goes to the rejected branch instead. In the Cleanse shape's profile, let me count the length of the ID value: it is 12 characters. I'm going to set the minimum length to 50 and the maximum length to 100. Now this document is going to fail, because the length of the ID is less than 50, which is the minimum; the document will be rejected and control will go to the rejected branch. If I execute the process now, I will get the exact error message explaining why the flow went to that branch. Save and Close, and execute the process again with Run Test. The process has been executed successfully, and look: control has not gone to the Salesforce branch. If I look at my subprocess, I will see the exact rejection reason for why the document went to the rejected branch.
To see it, I click this shape and view the rejection reason from the Cleanse shape. This is the error message: "Element is shorter than the minimum length required." Now I'm going to change the logic back so that the document no longer goes to the rejected branch: not on ID this time, but on the phone number, I specify a minimum of 0 and a maximum of 30. Now control will go to the cleansed branch, the Return Documents shape will return the document to the main process, and the main process will pass the document to the Salesforce connector, which will create the account. Let me execute the process one final time with Run Test; but before that, let me delete the account I created earlier. Now execute the process. It reads the account.xml file from the Amazon S3 bucket, and when control reaches the Process Call shape, this account.xml document is passed to the subprocess. Since we are passing something to the subprocess, that is the reason we used the Start shape with the "Data Passthrough" option there. The Cleanse shape cleanses the document and passes it to the cleansed branch, and the Return Documents shape returns the document back to the main process. And look: one account has been created, with the name George. I hope you understood the concept and the flow of this process. Thank you so much, and have a great learning. 16. Add to Cache Shape Example 1: Hello everyone.
In this particular lecture, we are going to discuss different shapes in Dell Boomi. These are the shapes we're going to cover, and we will start with caches: we are going to add a document to a cache with the help of the Add to Cache shape, and then retrieve a document from the cache with the Retrieve from Cache shape. First of all, we'll discuss these two shapes. I'm going to create a new component of type Process called "Shapes Examples". The first shape will be the Start shape with "No Data". After this, I will use the Branch shape, which is going to have two branches, and connect the Start shape to it. In the first branch, I'm going to use the Message shape, and in the Message shape I'm going to add JSON content. So basically, with the help of the Message shape, I'm going to generate a document that contains JSON content. This JSON content has multiple elements: a first element, a second element, a third element, and so on. To add it, let me copy it; inside the message body I start with a single quote, write a single quote at the end, and paste the JSON content in between the single quotes. That is how you add JSON content to a message body. Click OK. The Message shape will now produce a document with the JSON content. Next I'm going to add one more shape, the Add to Cache shape, to put the JSON content into cache memory. So instead of fetching data from external sources every time, I'm going to fetch it from a cache; but first I have to save this JSON content into the cache.
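A quick sketch of what the Message shape does here: its body is a string literal (the content between the single quotes), and the shape emits one document whose content is that JSON. The field names below are illustrative, inferred from the lookup.json profile used later (customer ID, name, phone):

```python
import json

# The Message shape's body: JSON content pasted between the quotes.
# The values are hypothetical sample data.
message_body = '''
{
  "contacts": [
    {"customer_id": "123", "name": "Alice", "phone": "555-0101"},
    {"customer_id": "124", "name": "Bob",   "phone": "555-0102"}
  ]
}
'''

# The shape produces one document carrying this content.
document = json.loads(message_body)
print(len(document["contacts"]))
```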
Then, with the help of Retrieve from Cache, I'm going to retrieve the document. First, let me configure Add to Cache: I will create a new document cache, since the Add to Cache shape is used to add documents to a document cache. Now I'm going to create a new document cache profile. For the profile type, since we're reading a JSON file, I will select JSON and create a new profile. The file I'm going to import is lookup.json, so I will import this file's structure into the profile. Click Finish; the elements have been added: customer ID, name, phone. Let me give it a name, "contact JSON profile", then Save and Close. So the profile has been added. Now, what is this option, "Enforce one index entry per document"? If I uncheck it, that means a single document can have multiple index entries. Let me uncheck it for now, then Save and Close. Next I have to add an index here: customer_id will be my index value, so for the key element I select customer ID and set it as the index. Save; customer ID has been set as the index. With the option unchecked, I will have multiple index entries in one document. Save and Close. Next I'm going to use the Retrieve from Cache shape; I will call it "retrieve customer contact details" and select this document cache. I'm going to retrieve data on the basis of the index, customer_id, and the value I'm going to provide is 123. So a document will be produced, and customer ID is the index into it: this is my first value, this is my second value, this is the third value, and so on; we have several index values pointing into the same document.
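A document cache behaves much like a store indexed by a key element. A minimal sketch, with the customer_id index configured above and hypothetical sample records:

```python
# The document cache: documents indexed by the key element customer_id,
# as configured on the Add to Cache shape.
cache: dict[str, dict] = {}

def add_to_cache(doc: dict) -> None:
    """Add to Cache shape: store the document under its index value."""
    cache[doc["customer_id"]] = doc

def retrieve_from_cache(customer_id: str) -> dict:
    """Retrieve from Cache shape: fetch the document by index value."""
    return cache[customer_id]

# Hypothetical contacts from the lookup.json content.
add_to_cache({"customer_id": "123", "name": "Alice", "phone": "555-0101"})
add_to_cache({"customer_id": "124", "name": "Bob", "phone": "555-0102"})

print(retrieve_from_cache("123")["name"])
```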
Now let me add the Stop shape at the end, then save, and test-execute the process. You will get a single document: no matter which index value you provide, you will get the whole document, because every index entry you provided is linked to the same single document. Execute it, and you get the same document with all the contents. (While the process is executing: to speed up processing we will later use the Flow Control shape.) Look, a single document. Now I'm going to change the setup: I'm going to check the "enforce one index entry per document" option, which means each document will have exactly one index entry. That means I have to split my single document into multiple documents, and for splitting I'm going to use the Data Process shape. Let me configure it: I select Split Documents, and for the profile I select the JSON profile, the "contact JSON profile" I just created (the old profile had been deleted, so I reselect this one). Save and Close. I'm going to split on the basis of the array elements: this is the name of the array in my JSON profile, and these are its elements, the first array element, the second array element, and so on. So the split will produce multiple documents, and each document will contain a single array element.
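The split step described above can be sketched as follows: one document carrying a whole JSON array becomes one document per array element. The array and field names are illustrative:

```python
import json

# One document holding the whole array.
single_document = json.dumps({
    "contacts": [
        {"customer_id": "123", "name": "Alice"},
        {"customer_id": "124", "name": "Bob"},
        {"customer_id": "125", "name": "Carol"},
    ]
})

def split_on_array(doc_text: str, array_name: str) -> list[str]:
    """Data Process shape, Split Documents step: emit one document
    per element of the named array."""
    payload = json.loads(doc_text)
    return [json.dumps(element) for element in payload[array_name]]

documents = split_on_array(single_document, "contacts")
print(len(documents))  # three documents, each with a single element
```

With one element per document, each document now carries exactly one index value, satisfying "enforce one index entry per document".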
So, in order to speed up the process and get better performance, I'm going to use the Flow Control shape (not the Process Call shape, the Flow Control shape). Let me connect it up. The default number of threads is 0; I will increase the number of threads to improve performance, so I'm going to set it to 4. Now four threads will work in parallel to process your documents, which will definitely improve performance. Next, the batch options: if you want to run your documents in batches, you select the batching option. Say I want to run my documents in batches of two, so I give it the value 2. What about "run each document individually"? You select that option when there is a dependency, when one document depends on another, but it decreases performance. And if you select "no document batching", it means you are not going to run your documents in batches at all. I'm going to select batching with a value of 2, so the documents run as batches of two. Click OK. This is going to improve performance, as it uses four threads to process your documents. Now save this. One more thing about the Retrieve from Cache shape: since "enforce one index entry per document" is now checked, a single document has one and only one index entry; no more than one is allowed. Open the shape, and you have two options: All Documents, if you want to retrieve every document, or retrieval on the basis of customer ID, the index value, in which case you provide the index value here.
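The thread-and-batch behavior of the Flow Control shape can be sketched with a thread pool: split the documents into batches and process the batches on parallel workers. This is an analogy under the settings chosen above (4 threads, batches of 2), not Boomi internals:

```python
from concurrent.futures import ThreadPoolExecutor

documents = [f"doc-{i}" for i in range(8)]

def process_batch(batch: list[str]) -> list[str]:
    # Stand-in for the downstream shapes processing one batch of documents.
    return [doc.upper() for doc in batch]

def flow_control(docs: list[str], threads: int = 4,
                 batch_size: int = 2) -> list[str]:
    """Rough analogue of the Flow Control shape: group the documents
    into batches and run the batches on a pool of worker threads."""
    batches = [docs[i:i + batch_size] for i in range(0, len(docs), batch_size)]
    results: list[str] = []
    with ThreadPoolExecutor(max_workers=threads) as pool:
        # pool.map preserves the batch order while running them in parallel.
        for processed in pool.map(process_batch, batches):
            results.extend(processed)
    return results

out = flow_control(documents)
print(out[:2])
```

Setting `batch_size=1` corresponds to running each document individually, which is slower but needed when one document depends on another.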
Say the index value is 123. Click OK and save. Now I'm going to retrieve a single document which has customer ID 123. The process is executing, and it gives me a single document with customer ID 123. Now let me change the value: if I select 124, it gives me the document with customer ID 124. Each document has a single index value. Next, instead of a customer ID, I'm going to retrieve all the documents: I don't provide any index value, so it retrieves all documents, each with its own single index value. Look, multiple documents have been retrieved. So with the help of the Retrieve from Cache shape, I can retrieve a document that is saved in cache memory. 17. Add to Cache Shape Example 2: Hello everyone. In this particular lecture, we are going to use the cache as a lookup. First of all, we are going to read the account.xml file from an Amazon S3 bucket. This is my bucket, "boomi training 2020", and I'm going to read this particular file, account.xml. Let me show you the file I uploaded: in this particular file, the phone number is missing. So what I'm going to do is take the phone number from the document cache, where I'm storing customer ID, name, and phone number; against the customer ID, I'm going to fetch the phone number I stored. After that, I'm going to perform a Salesforce upsert operation. What is an upsert operation? An upsert operation is this:
if the record already exists in the Salesforce account, update that record; if the record doesn't exist, create a new record. That is the upsert operation. So I'm going to use the Salesforce connector, in which we are going to create the upsert operation. First of all, I'm going to add one more branch here, and then a few connectors: the first one is the Amazon S3 connector, then I'm going to use the Map shape, after that the Salesforce connector, and at the end the Stop shape. First I'm going to configure the Amazon S3 connector: I am performing the Get action, and for the connection I'm going to use the one I have already created, so no need to create another. For the operation, I'm going to use the Amazon S3 operation I have already created; from this bucket I'm going to get the file. Save and Close. Now I'm going to set the parameter value: I choose the ID parameter and provide the file name, account.xml. That is configured. Next, I'm going to configure the Salesforce side, and for that I first need to create an external ID in Salesforce. What is the meaning of an external ID? With the help of the external ID, we determine whether a record is already present in the Salesforce account or not: if the record doesn't exist, create a new record; if it already exists, update the record. First of all, I need to go to the Setup section.
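The upsert semantics just described can be sketched in a few lines. This is a toy model, not the Salesforce API; the external ID field name `Ext_Id__c` is a hypothetical stand-in for the custom field created below:

```python
# A toy Salesforce "org": records keyed by the external ID field value.
org: dict[str, dict] = {}

def upsert(record: dict, external_id_field: str) -> str:
    """Upsert: if a record with the same external ID already exists,
    update it; otherwise create a new one."""
    key = record[external_id_field]
    action = "updated" if key in org else "created"
    org[key] = record
    return action

first = upsert(
    {"Ext_Id__c": "George555-0100", "Name": "George", "Phone": "555-0100"},
    "Ext_Id__c")
second = upsert(
    {"Ext_Id__c": "George555-0100", "Name": "George", "Phone": "555-0000"},
    "Ext_Id__c")
print(first, second)  # created updated
```

The first call creates the record; the second call, carrying the same external ID, updates it in place, which is exactly what we will observe when re-running the process.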
Then I go to the Object Manager, then to the Account object, then Fields & Relationships. From here I'm going to create a new external ID field. It is going to be a text field; click Next. For the field label I provide something like "Ext_ID", with a length of 100; this is the field name. I add a description, check the "External ID" option and the unique option, then Next, Next, and Save. The external ID has been created; let me show you the field I just created. This is the ID I am going to use. Now I'm going to configure the Salesforce connector: the action is Send. For the connection, I've already created my Salesforce connection, so I'm going to reuse it. For the operation, I'm going to create a new one named "Salesforce upsert operation", then click Import. I choose the connection, click Next, and log in to Salesforce. The object type is Account, and for the action, this time I am performing the Upsert action. Click Next. For an upsert action you need to have an external ID: the external ID is going to determine whether the record already exists in the Salesforce account or not. Click Finish. Let me edit the request profile and give it a name, "Salesforce account upsert profile", then Save and Close. For the external ID field, I choose the one I've just created, click OK, then Save and Close. That is configured. Next, I'm going to quickly configure my Map shape; I will call it "Amazon to Salesforce". I'm going to choose a profile here.
For the source I will choose the XML profile "Salesforce account XML", which I've already created. For the target I'm going to choose the profile I've just created, "Salesforce account upsert request". The attributes have been imported. Now, in the map, the first function I'm going to use is a lookup, specifically a document cache lookup. Which document cache am I using? Document cache 3; let me verify it. Yes, document cache 3. Let me edit it: on the basis of customer ID, I'm going to fetch the phone number value. So from customer ID, I get the phone number. That is configured. Next I'm going to create a concat function, a string concat. As inputs, I'm going to use name and phone number, and the result of these two fields I'm going to put in the external ID. The external ID is going to determine whether a record is already present in the Salesforce account or not, and that will be decided on the basis of these two fields, name and phone number. So: I map customer ID into the cache lookup and fetch the phone number against it, writing the result to the phone field; I map name to the name field, type to type, billing country to billing country, and shipping country to shipping country.
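Putting the two map functions together, here is a sketch of the mapping: a document cache lookup fills the missing phone number, and a string concat of name plus phone builds the external ID value. The field names are illustrative, not the exact profile element names:

```python
# Document cache from the earlier lesson: phone numbers keyed by customer_id.
phone_cache = {"123": "992-2342"}

def map_account(source: dict) -> dict:
    """Sketch of the map shape for the upsert request."""
    # Document cache lookup: customer_id in, phone number out.
    phone = phone_cache[source["customer_id"]]
    return {
        "Name": source["name"],
        "Type": source["type"],
        "BillingCountry": source["billing_country"],
        "ShippingCountry": source["shipping_country"],
        "Phone": phone,
        # String concat function: name + phone forms the external ID.
        "ExternalId": source["name"] + phone,
    }

target = map_account({
    "customer_id": "123", "name": "George", "type": "Customer",
    "billing_country": "USA", "shipping_country": "USA",
})
print(target["ExternalId"])
```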
The phone number from the lookup and the name also feed the concat function, and I map the concat result to the external ID; on the basis of name and phone number, it will be determined whether the record is unique or not. Save and Close, click OK, and save the process. Now let me show you the accounts: these are all the accounts in my Salesforce. I'm creating a new account, George. Look, this is my file: I am creating a new account, George, and this George doesn't have a phone number. I'm going to get the phone number from the document cache, on the basis of the customer ID. Now let me execute the process, and you are going to see a new account created in Salesforce with the name George. The process is executing; this is going to create the new account. The process has been executed; let me check. Look: George, with the phone number filled in. Next, I'm going to update the phone number. In the cache, at the end of the current phone number, I'm going to replace the last digits with five zeros, 00000, and save it. Now when I execute the process again, this record will be updated, and I will get the updated phone number here. The process has been executed; let me refresh and go to All Accounts. Look, George: the phone number has been updated. So this is how you use the Salesforce upsert operation. You need to have an external ID, otherwise you will not be able to update the record.
The external ID serves as the unique ID. In this particular example we have built the external ID from the combination of name and phone number, and that combination must be unique. I hope you understood this concept. Thank you so much, and have a great learning. 18. Properties in Dell Boomi Part-01: Hello everyone. In this particular lecture we are going to discuss Dell Boomi properties. In Dell Boomi there are different properties: some are related to a document and some are related to a process. The document-related properties are the standard document property and the dynamic document property; the process-related properties are the dynamic process property and the process property. First, in this lecture, we are going to look at the difference between the standard document property and the dynamic document property. So what is the standard document property? It contains runtime information related to a document: inbound connector details, outbound connector details, metadata, and trading partner information. Second, some information in the document property cannot be changed. With the help of the Set Properties shape you can change a document's properties, but only the outbound connector details. Inbound connector information cannot be changed — for example, you cannot change the source file name.
You also cannot change the source application response code. What you can change are the outbound connector details. So we are going to look at an example of the standard document property, and then we will discuss the dynamic document property. What is a dynamic document property? It is something defined by the process developer, whereas the standard document property is already defined — that is one of the differences between the two. Now we are going to build examples of both. First, I will create a new component; I will call it "Properties explanation". The component has been created. Next, I select the Start shape with the No Data option. After this I use the Disk connector to read the employee.json file. Let me just open this file: it has two elements — this is my first element and this is my second element. Now let me configure the Disk connector: action Get, the disk connection, and a Read operation. In this particular operation I am reading the employee.json file; save and close. That is configured. Next I am going to use the FTP connector — first I am giving you an example of the standard document property, which is also simply called the document property. Then I use the Set Properties shape to configure the document property, and at the end a Stop shape. The FTP connector is configured with action Send, the FTP connection, and the FTP operation; this operation is going to create a file in the remote directory.
Let me show you the remote directory path — this is where the file will be created. Next I configure the Set Properties shape. There are different property types; I am going to choose Document Property, which is the standard document property. With the document property I am going to change two things: the file name and the directory. Since we are using the FTP connector, I select FTP. For the FTP connector, the document property lets me change the file name, change the remote directory, or move a file to a directory. First the file name: it is currently employee.json, and I will set it to EMP.json. The file name has been changed. Now for the remote directory: I am going to create another folder inside it called EMP, and then change the path as well — Document Property, FTP connector, Remote Directory — to a backslash followed by the folder name, \EMP. Click OK. So with the Set Properties shape I have changed two things: the file name and the remote directory. Now, remember that some information in the document property cannot be changed: information related to the inbound connector is fixed, while outbound connector details can be changed. If I open the properties, you can see that for the FTP connector I can change the file name and the remote directory.
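As a rough illustration of that rule — inbound connector details are fixed at read time, while outbound details can be overridden by the Set Properties shape — here is a small Python sketch. The property names and paths are made up for the example; this is not Boomi API code.

```python
# Inbound connector details: fixed once the document is read
inbound = {"source_file": "employee.json"}

# Outbound (FTP) connector details: Set Properties may override these
outbound = {"file_name": "employee.json",
            "remote_directory": "/remote"}

# Equivalent of the Set Properties shape configured above
outbound["file_name"] = "EMP.json"
outbound["remote_directory"] = "/remote/EMP"

# The FTP send now targets the overridden path, while the
# inbound source file name is untouched
upload_path = outbound["remote_directory"] + "/" + outbound["file_name"]
```

The effective upload path ends up inside the EMP folder with the new file name, exactly like the result shown in the lecture, while the source file name we read from never changes.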
I can also move a file to another directory. For the Disk connector I can change the file name and the directory path. For the Mail connector I can change the body, the file name, the subject, the from address, and the to address. All of these are outbound connector details. But I cannot change information related to the inbound connector — the inbound connector is the one we are reading the data from, so I cannot change the file name of the source I am reading; that is logical. The Set Properties shape is now configured, so let me execute the process (the FTP server must be up and running). This is the example of the standard document property: look — instead of creating the file directly in the remote directory, it has created the file inside the EMP directory, named EMP.json. With the help of the Set Properties shape we changed two things, the file name and the remote directory. The next concept we are going to discuss is the dynamic document property, which is set by the process developer; it is not predefined. Let me give you an example. I will use a Branch shape: the first branch is for the standard document property, and the second branch is for the dynamic document property. That is the scenario we are going to implement. In the second branch we use a Disk shape to read the same employee.json file; let me configure it with the disk connection and the Read operation — save and close.
Next, we are going to use the Data Process shape, which splits a single document into multiple documents. Inside the Data Process shape I select the Split Documents functionality with Profile JSON. I have already created the profile — the employee JSON profile — so there is no need to create a new one. Let me just check the attributes to be safe: id, firstName, lastName, gender, country. Fine. We split on the basis of the array element, and this particular file has two elements, so it will produce two documents. Next I use the Set Properties shape; this time around I am going to define a dynamic document property. Under Document Property I select Dynamic Document Property and define a property with the name ID. Next I need to give it a parameter value, but before that, let me open MySQL Workbench. This is the table I am going to use, and from this table I am going to delete the two rows with ID values 3 and 6. So the dynamic document property ID has been defined — this is the property the developer defines. Before providing its value, I am going to introduce one more shape, a Program Command shape, at the end, followed by a Stop shape. Click OK. Now let me configure the Program Command shape: type SQL Command, and for the connection I provide the database connection.
These are the connection details — the database name and so on — save and close; I have already created this database connection. In the script section I provide the SQL. Since we are going to delete two rows from the table, I write a delete script: DELETE FROM the employee backup table WHERE id = ?. The question mark represents a variable — "use question marks to insert variables." So here I am going to supply the variable value. I click the plus sign; I am not going to select Dynamic Process Property — instead I select Document Property, then Dynamic Document Property, and provide the name ID, the same dynamic document property name I defined earlier. This ID and that ID must be the same. Now I need to define the value for the property itself: I select Profile Element, profile type JSON, profile name employee JSON profile, and then I select the ID field. Since the split produces two documents, this ID will carry two values. Let me just show you the document I am reading, from this particular directory: two documents will be produced, so the first ID value will be 3 and the second will be 6.
So right now I am going to delete the two rows with ID values 3 and 6. In the process, the dynamic document property ID receives those values, and the SQL command — DELETE FROM the employee backup table WHERE id equals this particular value — runs once per document, first with 3 and then with 6, so two rows will be deleted. Click OK; it has been configured. Let me verify the property name ID once more, save the process, and execute it. As per the definition, the dynamic document property is defined by the process developer: we defined the property ID, it received the values 3 and 6, and we delete those rows from the table. Let me query the table to confirm — look, the rows have been deleted. Likewise, if I change employee.json to use 7 and 8, it will delete the two rows with IDs 7 and 8. So that was an example of the dynamic document property, alongside the earlier example of the standard document property: the dynamic document property is defined by the process developer, while the standard document property is predefined by Dell Boomi. 19. Properties in Dell Boomi Part-02: Hello everyone. In this particular lecture we are going to look at the dynamic process property and the process property, and also at the difference between the dynamic document property and the dynamic process property. So what is a dynamic process property? It is related to a particular process — its scope is limited to that process. The process property, by contrast, is defined globally: once it is defined, you can use it in any other process.
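The dynamic document property example above — split employee.json into one document per array element, then run the parameterized DELETE once per document — can be sketched with Python's sqlite3 standing in for MySQL. Table contents and field names are illustrative, not taken from the course files.

```python
import json
import sqlite3

# In-memory stand-in for the MySQL backup table from the lecture
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE employee_backup (id INTEGER, name TEXT)")
con.executemany("INSERT INTO employee_backup VALUES (?, ?)",
                [(3, "Ann"), (6, "Bob"), (9, "Cid")])

# employee.json holds an array with two elements, so the Data Process
# split produces two documents
documents = json.loads('[{"id": 3}, {"id": 6}]')

for doc in documents:
    doc_id = doc["id"]  # dynamic document property "ID": one value per document
    # "use question marks to insert variables" -- the ? placeholder
    con.execute("DELETE FROM employee_backup WHERE id = ?", (doc_id,))

remaining = [row[0] for row in con.execute("SELECT id FROM employee_backup")]
# rows 3 and 6 are deleted; only row 9 remains
```

Because the property is evaluated per document, the DELETE runs twice — once with 3 and once with 6 — which mirrors the two rows disappearing from the table in MySQL Workbench.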
There is no limitation on the process property — its scope is global, so you can use it in any other process. We have already discussed the dynamic document property; now we are going to look at the difference between the dynamic document property and the dynamic process property. What you see right now is the dynamic document property example: I defined the dynamic document property named ID and used it to delete rows from the table. Since we are splitting the file into multiple documents, multiple instances of the property are created — one per document. For example, if I gave the property a static value, say 100, each document would carry the value 100. Instead of a static value, I set it from the Profile Element — the employee JSON profile, field ID — so with two documents the property takes two values, 7 and then 8, and executing this particular process deletes two rows from my table, because multiple instances have been created. Look — two rows have been deleted. Now let me drop this table, create the backup table again, and run the query, so the same rows are back. Next, instead of using the dynamic document property, I am going to use the dynamic process property, again with the name ID. Despite there being multiple documents, since we have defined a dynamic process property, only one instance will be created. So what value will ID hold? A single value.
If the latest document received by Dell Boomi is the one with ID 7, the property value will be 7; if the latest is the one with ID 8, it will be 8. Save. Now when I execute this process, only one row will be deleted — either 7 or 8, whichever came last, most probably 8. In the SQL Command parameter I change the variable: instead of Document Property I now select Dynamic Process Property and give the name ID. Previously this value was defined as a dynamic document property; now it is a dynamic process property with the value ID, and since only one instance is created it is going to delete only a single row. Let me save and execute the process. Look — one row has been deleted, the one matching the latest document Dell Boomi received. Next I am going to explain the concept of the process property. The process property is something defined globally — you can use it in multiple processes. From here I create a process property; it is a component type. I name it "File Connector" and create it. With a process property you can define multiple key-value pairs, whereas a dynamic process property is only one key-value pair. So I define two keys: the first is the file name, with the value EMP.json, and the second is the directory, with the value of this Boomi file directory.
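The scope difference just described can be sketched in a few lines: a dynamic document property yields one value per document, while a dynamic process property is a single slot that each document overwrites, so the last document received wins. A hypothetical Python sketch:

```python
documents = [{"id": 7}, {"id": 8}]   # the two split documents

# Dynamic document property: evaluated per document -> two delete calls
document_property_values = [doc["id"] for doc in documents]

# Dynamic process property: one value for the whole execution
process_property = None
for doc in documents:
    process_property = doc["id"]     # each document overwrites the slot

# Two rows would be deleted in the first case; only one (id 8) in the second
```

This matches the lecture's result: with the document property, rows 7 and 8 are both deleted; with the process property, only the row for the latest document goes away.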
Let me just delete the existing file from that directory first. The file names are set; save and close — "File Connector" is the property name. I have configured a Disk connector with action Send, the disk connection, and a Write operation. In the Set Properties shape, instead of the FTP connector I now use the Disk connector's file name and directory, and for their values I select Process Property: the property name is File Connector, and I choose the key — first the file name, then the directory. Click OK and save the process. Now when I execute it, a file will be created inside that directory. Because this process property is defined globally, you can use it in any other process as well — look, the file is generated; you can create a new process and reuse this defined process property. 20. Decision and Exception Shape Example: Hello everyone. In this particular lecture we are going to look at an example of the Decision shape in Dell Boomi. First of all I will create a new component; I will call it "Decision Shape Example". I start with the Start shape with No Data. After that I am going to use the Amazon S3 REST connector, so let me quickly configure it: the action is Get, and I have already created the connection and the operation, so I select the Amazon operation. Let me show you the operation: we are going to read the account.xml file located inside the "boomy training 2020" Amazon S3 bucket. Save and close — this has been configured.
Next I define a parameter value: ID, with the value account.xml. Click OK. After that I introduce the Decision shape. The Decision shape compares two values: if the condition is true, the flow goes to the true branch; otherwise it goes to the false branch. The value I am going to compare is the phone number field in the account.xml file. I select Profile Element, profile type XML, and choose the account XML profile — the profile I have already created, so there is no need to make a new one. I select the phone element, comparison Equal To, and for the second value I specify nothing. So if the phone number value is blank, the condition is true and the flow goes to the true branch. In the true branch I add an Exception shape, and inside it I provide an exception message: "Null value found in the phone number. Please specify a value." Click OK. In the false branch I introduce the Salesforce connector — if the control goes to the false branch, an account gets created. Before the Salesforce connector I add a Map shape, and at the end a Stop shape. Now let me configure the Map shape and the Salesforce connector, starting with the Salesforce connector: action Send, the Salesforce connection, and the operation "Salesforce account creation" — this is the operation I choose, and this is its request profile.
This is the request profile I select in the Map shape. I configure the map "S3 to Salesforce", which I have already created, and choose the XML profiles — the account create request mapped to the SF account create request. Then the mappings: name to name, type to type, billing country to billing country, shipping country to shipping country, and phone number to phone number. Right now the phone number value is blank. There is no need to map the ID — that ID is different — so save and close. This has been configured. When I execute the process, it is going to throw an exception, because the phone number value is null. The process is executing... look, I got the exception message: "Null value found in the phone number. Please specify a value." What I want to do next is capture this exception message. To capture it, I introduce the Try/Catch shape, which will catch the error message and pass it to the next branch. I connect the Try/Catch to the Amazon S3 connector, then introduce a Message shape on the catch path; after that I could use a Disk shape to save the message to my local disk, but let us just use a Stop shape. In the Message shape I define a variable whose value will be the exception message: I select Document Property, then Meta Data, then Base, and there I select Try/Catch Message. This is going to capture the exact exception message.
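The whole flow — a Decision on a blank phone number, an Exception shape on the true branch, and a Try/Catch feeding the Message shape — can be mimicked with ordinary Python exception handling. This is only an analogy; the function names and messages are invented for the sketch.

```python
def process_account(account):
    # Decision shape: is the phone number blank?
    if account.get("Phone", "") == "":
        # Exception shape on the true branch
        raise ValueError("Null value found in the phone number. "
                         "Please specify a value.")
    # False branch: map the fields and send to Salesforce
    return "account created"

def run(account):
    try:                              # Try/Catch shape
        return process_account(account)
    except ValueError as exc:
        # Message shape reading the captured try/catch message
        return "caught: " + str(exc)

blank_result = run({"Name": "George", "Phone": ""})        # goes to catch path
ok_result = run({"Name": "George", "Phone": "0299223400"}) # false branch
```

With a blank phone number the run does not blow up with a pop-up; the error is caught and surfaced as a message document, just like the Try/Catch and Message shapes do in the process.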
Now when I execute the process I will not get the pop-up error: the Try/Catch shape catches the exception first and passes it to the Message shape. One thing to note — the reason I was getting the pop-up before is that the Try/Catch needs "All Errors" selected, which covers both document-level errors and process-level errors. With that set, I will not get the pop-up again: control goes to the Try/Catch, the captured exception is passed to the Message shape, the Message shape produces a document containing the exception message, and that document goes to the Stop shape. You can now see the exception message in the output: "Null value found in the phone number. Please specify a value." Next, I am going to change the phone number value: instead of leaving it blank, I give it a value. I delete the account.xml file from the S3 bucket and upload the corrected file again — the old file has been deleted, and the new account.xml has been uploaded. Save the process. Now the control will go to the false branch, because the blank-phone-number condition is no longer satisfied. Before executing the process, let me verify the mapping again: I still have to map the ID to the external ID, otherwise I will get an error — let me show you.
Let me execute the process. We get an error, and the Try/Catch shape captures that error message. Checking the captured message, it says required fields are missing — I need to map the external ID I created. Let me show you the mapping: this is the external ID which I created, and I need to map the ID to it. Save and close, save the process, and run the test again. Now there is no error and a new account is created. I had just deleted the old accounts — the previous accounts created through the mapping — so if I check my accounts now, look: a new account has been created with this phone number, and in the document you can see the external ID, 123. 21. Process Route Shape Example: Hello everyone. In this particular lecture we are going to discuss the Process Route shape in Dell Boomi. First of all I create a new component; I will call it "Process Route Example". I start with the Start shape with No Data, and after that I introduce the Amazon S3 REST connector. What we are going to do is read the product.csv file, which is located in an Amazon S3 bucket, so let me connect the Start shape to this connector. After this shape I am going to introduce the Process Route shape. What is the concept of the Process Route shape? It calls a process based on some value. For example: in the product.csv file there is a field called segment. If the value is Corporate, the Process Route shape will call the corporate subprocess; if the value is Consumer, it will call the consumer subprocess. So the Process Route shape decides, based on a value, which process should be called. Now let me first configure the Amazon S3 REST connector: action Get, the Amazon connection, and the Amazon operation.
Let me just edit the operation: we are going to read the product.csv file from the "boomy training 2020" bucket. Next I configure the parameter value: ID, with the value product.csv. Click OK — that is defined. Next I introduce the Data Process shape, which we use to split a single document into multiple documents. I configure it with Split Documents: profile type Flat File, split by line, and "Remove first line as column headers" set to yes. Click OK. Now I configure the Process Route shape. For Route By I choose Process Route and create a new process route component; I will name it "Segment Routes". After this I select Process Routing, where I define the keys — based on a key, it decides which process should be called. For the first key I give the value Corporate: if the value is Corporate, we call a process, which I define here. I create a new process — a subprocess, not a main process — called "Corporate Subprocess". Since we are passing data into a subprocess, it uses the Start shape with the Data Passthrough option. Save this. Now, if the value is Corporate, the Process Route shape is going to call this particular process.
Then I define another key: if the value is Consumer, the Process Route shape calls another process I create, the "Consumer Subprocess". Since we are going to pass values into this subprocess as well, it also uses the Start shape with the Data Passthrough option; accept "Make the recommended changes for me", select the option, and click OK to save. So: if the value is Corporate it calls the corporate subprocess, and if the value is Consumer it calls the consumer subprocess. After that there is one more important setting: the pass-through-to-subprocess option, which determines whether documents are passed through to a subprocess for further processing. Yes, we want to pass documents to the subprocesses — that is the reason this option is selected. Next, configure the return paths: do you want something returned to your main process? Yes — I add two return paths, the first named Success and the second named Failure, and save. Then there is the Route Parameter: here I select the Profile Element option, profile type Flat File, and the product profile I have already created. Let me show you the product profile — these are its elements — and the element I choose is segment. Back on the canvas, the Process Route shape now has three branches coming from the "Segment Routes" process route: Success, Failure, and a Default branch.
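Putting the pieces together — split the CSV by line with the header removed, then route each document by its segment value to the matching subprocess, falling back to the default path — here is a hypothetical Python dispatch sketch. The rows, keys, and file names are illustrative only.

```python
import csv
import io

def corporate_subprocess(row):
    return ("success", "corporate.csv")   # would upload corporate.csv

def consumer_subprocess(row):
    return ("success", "consumer.csv")    # would upload consumer.csv

def default_route(row):
    return ("default", None)              # unmatched segment values

# The process-routing keys defined in "Segment Routes"
routes = {"Corporate": corporate_subprocess,
          "Consumer": consumer_subprocess}

data = "product,segment\npen,Corporate\ndesk,Consumer\nlamp,Home\n"

results = []
# Data Process shape equivalent: split by line, first line as column headers
for row in csv.DictReader(io.StringIO(data)):
    handler = routes.get(row["segment"], default_route)
    results.append(handler(row))
```

Each row becomes one document; its segment value picks the subprocess, and a value with no matching key falls through to the default branch — the same three outcomes (Success, Failure, Default) the Process Route shape exposes.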
To recap the process routing: segment is the key we defined. If the segment value is Corporate, the Process Route shape calls the corporate subprocess; if the segment value is Consumer, it calls the consumer subprocess. Now let me configure the corporate subprocess. After the Start shape I introduce one more shape, the Set Properties shape; after that a Branch shape and a Return Documents shape. In branch number one I use the Amazon S3 REST connector, because in this subprocess I am going to upload a file to the Amazon S3 bucket. Across the two subprocesses I will upload two files, corporate.csv and consumer.csv; this is the corporate subprocess, so here I upload corporate.csv — that is the reason I am using the Amazon S3 REST connector. First, let me configure the Set Properties shape: this time around I select Amazon S3 REST and the file name property, and set it to corporate.csv — the name of the file we are going to upload. Basically, with the Set Properties shape we are defining the file name. Next I configure the connector: this time the action is Upsert, the connection is the Amazon connection I have already created, and the operation is a new one — "S3 Upsert". I import, select the atom and the Amazon connection, and click Next to choose where the file will be created — inside this bucket.
So this is the name of the bucket — or you can call it the object name. Connecting to the atom... Finish. This has been configured; Save and Close. After that, at the end I'm going to introduce the Stop shape. One more thing: I'm going to introduce a Try/Catch shape, right after the start, and then a Message shape — I will explain the flow of this process shortly. After that, I'm going to introduce the Return Documents shape. So this one is the failure message, and this one is the success message. Before that, I'm going to introduce the same Message shape here for producing the success message: inside the Message shape, I'm going to produce the success message 'Process is executed successfully'. Click OK. Now, for the failure message, I'm going to configure the exact error message. What is my error message? I need to configure a document property here: I will select the Meta Data option, and as the base value I will select the try/catch message. Click OK to save this. So what is the flow of this process? Let me copy all of this and paste it — I'm going to do the same for the consumer subprocess; I just need to reconfigure it. So this is the consumer subprocess. Here, in the Set Properties shape, I'm going to change the file name this time: the file name will be consumer.csv, not corporate.csv, because we're going to upload the consumer data into the Amazon S3 bucket. So two files will be uploaded: consumer.csv and corporate.csv. A file is already present, so I need to remove it first — delete. Now Save and Close. So for the 'consumer' key value, I'm going to call this process. Now let me configure the first key value: if the process is successful, then the success message.
If it fails, then the failure message. I'm going to do the same here: a success message and a failure message. We have two branches, success and failure, plus the default branch. At the end, I'm going to use the Stop shape. Now let me explain the process. Actually, let me quickly introduce one more thing: a Try/Catch shape here, right before the Amazon S3 connector, catching all errors. So this is our complete process, ending in the Stop shape; save it. First of all, this Amazon S3 REST connector is going to read the product.csv file, which is located in the S3 bucket. Then control goes to the Data Process shape, which splits the single document into multiple documents. After this, control goes to the Process Route shape. The Process Route shape is going to call a process based on a value: for example, if the value is 'corporate', it is going to call the corporate subprocess. Now, let me open this corporate subprocess. The corporate subprocess gets the document from the main process and passes it to the Branch shape. The Branch shape first executes branch number one — the Amazon S3 REST connector. If this executes successfully, control goes to the Stop shape. If this step fails, the second branch will not execute; instead control goes to the Try/Catch shape, which captures the error message and passes it to the failure Return Documents shape. If branch number one executes successfully, then control goes to branch number two, and its Return Documents shape passes the success message back to the main branch.
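The subprocess flow just described — attempt the upload inside a try/catch, return the error on the failure path, otherwise return the success message — maps naturally onto Python's exception handling. A minimal sketch; `upload_to_s3` is a stand-in for the Amazon S3 REST connector, not a real client call:

```python
def upload_to_s3(doc):
    # Stand-in for branch 1's Amazon S3 REST connector. A "fail" flag on
    # the test document simulates a connector error.
    if doc.get("fail"):
        raise RuntimeError("S3 upload failed")

def subprocess_flow(doc):
    try:
        upload_to_s3(doc)                      # branch 1: the upload
    except RuntimeError as err:
        # Try/Catch shape: capture the error and send it down the
        # "failure" return path; branch 2 never runs.
        return ("failure", str(err))
    # branch 2: Return Documents with the success message
    return ("success", "Process is executed successfully")

print(subprocess_flow({}))
print(subprocess_flow({"fail": True}))
```

The same pattern is duplicated for the consumer subprocess in the lecture; only the file name property differs.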
Let me just repeat the concept — let me repeat the flow. Control goes to the Branch shape, and the Branch shape first executes branch number one. If the Amazon S3 REST connector fails, control goes to the Try/Catch shape; the Try/Catch shape captures the error and passes it to this Return Documents shape. The second branch will not be executed if we get an error in the Amazon S3 REST connector. The second scenario is that if branch one executes successfully, then the second branch is executed, and in the second branch the Return Documents shape returns the success message back to the main process. Now I need to show you my second process — the flow of my second process is exactly the same. Back in the main process, here are the segment routes. Let me show you the success and failure branches: these are the two branches, and we have the default branch. In the default branch, we are going to get the documents whose segment values are something other than 'consumer' and 'corporate' — values other than corporate and consumer go to the default branch. In the success branch, I am going to get all documents with the success message 'Process is executed successfully' — that includes the message from this process and from my second process, the consumer one. In the failure branch, I am going to get the failure messages — from this process and from my second process. And in the default branch, the values other than 'corporate' and 'consumer'. Now, first of all, let me open the Data Process shape — I need to change one thing here before I execute the process: split by profile, keep the header.
For the profile, I'm going to select the product profile, and I'm going to set the link element — I'm going to separate documents on the basis of the product ID. Fine, now execute the process. So this is the Amazon S3 bucket; after executing the process, you are going to see two files in it: one file will be corporate.csv and the other will be consumer.csv. The process is executing... the Process Route shape is executing... look, success. Now let us look at the documents of the success branch. This one is the default branch; let me show you the documents of the default branch. In the default branch, what am I going to get? Look: 'Home Office'. This one is also 'Home Office' — all documents with the value 'Home Office'. Now, in the success branch, I am going to get the success message 'Process is executed successfully' — all the messages will be the same. What are we going to do next? I'm going to check my Amazon S3 bucket; refresh it. Now you're going to see two files here. One is named consumer.csv; let me open this file — this is the consumer.csv file, and look, the segment is 'Consumer'. My second file is corporate.csv; open it. But only one record was produced — only one. Let me check one thing: let me delete these files, consumer and corporate, and execute the process again. So why is it giving me only a single row, only a single document? We need to look at this. The process is executing... we need to look at why the Process Route shape is giving me this. Okay — I know the reason why it is giving me only a single row, a single value.
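The two Data Process operations used in this lecture — "split", which turns one CSV into one document per row, and "combine", which merges documents back into a single flat file while retaining the first line as the header — can be sketched offline like this. The field names are illustrative, based on the product profile described above:

```python
# Model of Boomi's Data Process shape: split a flat file into one document
# per row, then combine a set of documents back into one CSV with a single
# header line ("retain first line as header").
import csv
import io

source = "product_id,segment\r\n1,Corporate\r\n2,Consumer\r\n3,Consumer\r\n"

# Split: one document (a list of rows) per CSV row
reader = csv.DictReader(io.StringIO(source))
documents = [[row] for row in reader]

def combine(docs):
    """Merge documents into one CSV, writing the header exactly once."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["product_id", "segment"])
    writer.writeheader()
    for doc in docs:
        writer.writerows(doc)
    return buf.getvalue()

# e.g. all the documents routed down the "Consumer" path, recombined
consumer_docs = [d for d in documents if d[0]["segment"] == "Consumer"]
print(combine(consumer_docs))
```

Without the combine step, each split document would be written on its own — which, with an upsert-style destination, is exactly how a multi-row input can collapse to a single stored record, as the next passage diagnoses.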
The reason it is giving me a single value is that we are performing the Upsert operation. The upsert operation updates the value: the first time, the file is created; then, when the second document comes, the value is updated. That is the reason it is giving me only a single value. If you want to have multiple rows, what can you do? You can again use the Data Process shape — you can use the Data Process shape here. This one splits; now here I'm going to combine — combine documents, flat file profile, with 'remove first line as header' on the split and 'retain first line as header' on the combine. Click OK to save this. I'm going to do the same in the corporate subprocess: use a Data Process shape, combine documents, retain first line as header, click OK. Now Save and Close both. Now you need to execute the process again — but first, the files have to be deleted, so delete all the files here in the bucket. Now execute the process again; save the process first, just to be safe, then Run Test. Now you are going to see a single file created with multiple records. Let me show you the file which was created here: the consumer.csv file. I need to open this file first — it should have multiple records in it. Look, this time multiple records have been generated in the file; the reason is that we used the Data Process shape to combine multiple documents. Now, the same way, let me open the corporate.csv file. Open it — you are going to see multiple records have been generated, with the header as well. 22. Business Rule Shape Example: Hello everyone.
In this particular lecture, we are going to look at an example of the Business Rules shape in Dell Boomi. First of all, we are going to create a new component. I'm going to call this component 'Business Rule Shape'; create the component. I'll start with the Start shape with connector type 'Connector', using the FTP connector. I will select the FTP connector, the Get action, the FTP connection, and this FTP Get operation — let me check the path here. FTP Get, OK. The file which I'm going to read is this employee.json file; let me provide the exact file name here, which is this one. OK, Save and Close. So the first step is configured: we are going to read this employee.json file, and we are going to apply these two business rules to the data set. My first business rule is that the salary should be greater than a threshold value, so first we need to define this threshold value. The second rule is that the department should be HR. So let me define the threshold value. First of all, save this process; then I'm going to create a new component, and this time I'm going to choose the component type 'Process Property'. Since we are going to define multiple key/value pairs, I'm going to use this property type called a process property. I will name this property 'Threshold' — that's the name of the property — and create it. Inside this property I'm going to create multiple key/value pairs. The first key is 'threshold', and the value I'm going to set is, let's say, 12,000 — that's the threshold value. Save this. The second value I'm going to provide is the HR department, and the value of HR is 1.
Third, I'm going to define the sales department, and the value of the sales department is 2. Save and Close. So there are three key/value pairs defined for the process property called 'Threshold'. That is the process property. Now, after this I'm going to use the Business Rules shape — let me include it. Inside this Business Rules shape, I'm going to define these two business rules. First, I need to select the profile: I will select the JSON profile. I have already created this profile, the employee profile; let me open it so you can see its attributes: id, firstName, lastName, gender, country, age, salary, and department id — these are the attributes. Save and Close. So this has been selected. Next, I'm going to define the business rules. But before the Business Rules shape, I'm going to include one more shape, the Data Process shape, so that I create multiple documents. I will configure the Data Process shape with the process type Split: the single file is going to be split into multiple documents. The JSON profile is the employee profile, and I'm going to split on the basis of the array element — so this file is going to be split into multiple documents, and the Business Rules shape will then evaluate each document one by one. Now, inside the Business Rules shape, I'm going to define the first business rule: the salary should be greater than the threshold value — I need to give it that title. First, I set the field: I'm going to select the salary field, and for the alias I'm going to provide 'salary'. So that's the salary field. Next, I'm going to set the threshold value via a function property.
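Two of the setup steps above are easy to model offline: the process property is just a named bag of key/value pairs that any shape can read, and the Data Process split on the array element yields one document per array entry. A minimal sketch — the dict, the helper name, and the sample JSON are all illustrative, not Boomi APIs:

```python
import json

# The "Threshold" process property: three key/value pairs, as defined above
THRESHOLD_PROPERTY = {
    "threshold": 12000,
    "HR": 1,
    "sales": 2,
}

def get_process_property(prop, key):
    # analogue of the "Get Process Property" function used in the rule
    return prop[key]

# The Data Process shape's split on the array element: one document per entry
employees_json = (
    '{"employees": ['
    ' {"id": 1, "salary": 13000, "department_id": 1},'
    ' {"id": 2, "salary": 9000,  "department_id": 2}'
    ']}'
)
documents = json.loads(employees_json)["employees"]

print(get_process_property(THRESHOLD_PROPERTY, "threshold"))
print(len(documents))
```

Each entry in `documents` then plays the role of one document flowing into the Business Rules shape.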
Since the process property is already defined, what I'm going to do is use Get Process Property. I select this option, then I select the process property which has been defined — 'Threshold', that's the name of the process property. The first value is the threshold; I will select the threshold value and give it the alias 'threshold value'. Click OK. Now I'm going to define the condition. You can define multiple conditions — look, there are options to combine them with an AND operation or an OR operation. I'm going to have only a single condition here for this particular example: select the salary field, which should be greater than or equal to the threshold value. Specify the value — 'threshold value' — and save this. So this is my first condition; again, you can define multiple conditions with the AND operator or with the OR operator. So this is my first business rule. Next, I'm going to define another rule with the plus sign: the department should be HR. This is my second rule, which I am going to define now. First, I'm going to select the field: I'm going to select the department id — on the basis of the department id, I am going to compare against the HR value. So select department id; that is my first field. Second, I'm going to select another field via a function: this time I select Get Process Property again, select the 'Threshold' property name, and this time I will select the HR department value. Click OK. The condition I'm going to provide is: department id equals HR. Let me just edit this — the HR value is 1. Save, Save and Close. So the condition is: department id equals HR — the department should be HR. Save this.
So there are two business rules defined here, and both conditions — both rules — should be satisfied. If they are not satisfied, control goes to the Rejected branch; if they are satisfied, control goes to the Accepted branch. At the end, I'm going to use the Stop shape, and in the Rejected branch I am going to use the Message shape, again ending in the same Stop shape. Let me connect this with the Stop shape. Now, let me open this — one more important thing: if a condition is not met, I'm going to provide an error message. What error message am I going to provide? 'Salary value must be greater than threshold value' — the threshold value is this one. So if this condition is not satisfied, I will get this particular error message. What about the second rule? 'Department is not HR' — if that condition is not met, I'm going to get this error message. Click OK. If both conditions — both rules — are satisfied, then control goes to the Accepted branch; otherwise it goes to the Rejected branch. Now, let me evaluate these elements. Let me evaluate the first element: the condition is that the department should be HR — yes, the value is 1, which means the department is HR. And the threshold value is set as 1,200; is the salary value greater than or equal to the threshold value? No, it is not. Let me see some other documents... no. I'll change one salary to 1,300. Now I am going to get only a single document in the Accepted branch; all other documents will go to the Rejected branch. Now, save this. Only this document is going to go to the Accepted branch. Now save it and execute the process. Hmm — maybe this Message shape is not configured; I think it is not configured yet. In the Accepted branch, I am going to get only a single document: the first document, whose department id is 1 and whose salary is greater than the threshold.
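The evaluation just walked through — each document is checked against both rules, passing documents go to the Accepted branch, failing documents go to the Rejected branch carrying the matching error messages — can be sketched as a small function. The threshold of 1,200 and the sample documents mirror the demo values; none of this is Boomi's actual rule engine:

```python
# Sketch of the Business Rules shape: both rules must hold for a document
# to be accepted; otherwise it is rejected with the rules' error messages.
THRESHOLD = {"threshold": 1200, "HR": 1}

def evaluate(doc):
    errors = []
    if not (doc["salary"] >= THRESHOLD["threshold"]):
        errors.append("Salary value must be greater than threshold value")
    if doc["department_id"] != THRESHOLD["HR"]:
        errors.append("Department is not HR")
    if errors:
        return ("rejected", errors)
    return ("accepted", doc)

docs = [
    {"id": 1, "salary": 1300, "department_id": 1},  # passes both rules
    {"id": 2, "salary": 900,  "department_id": 1},  # fails the salary rule
    {"id": 3, "salary": 5000, "department_id": 2},  # fails the HR rule
]
for d in docs:
    print(d["id"], evaluate(d)[0])
```

Raising the threshold (as the lecture does at the end) makes every document fail the salary rule, so all of them land in the rejected branch.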
Now, I need to configure the Message shape first. I'm going to get the error message via a document property — but I need to select this particular option in order to get the business rule result message. Save this, and execute the process. In order to go to the Accepted branch, both conditions — both rules which we have defined here — should be satisfied. Look, I'm getting the exact result message: the business rule says the department should be HR, and 'Department is not HR'. This is in HTML format, and this is the message we are getting. Now, what are we going to do next? Let me change the value here: I'm going to set the value to 10,000. Now all documents will go to the Rejected branch. Execute the process — look, all documents go to the Rejected branch. So I hope you understood the concept. Thank you so much, and have a great learning. 23. Consume SOAP Service: Hello everyone. In this particular lecture, we are going to consume a SOAP service in Dell Boomi. First of all, I'm going to create a new component from here: 'Consume SOAP'. I will create the new component. I'll start with the Start shape with 'No Data'. After that, I'm going to use the Web Services SOAP Client connector — to consume a SOAP service, I need to use this particular connector — and I'm going to connect it. We are going to consume these two services: the first one is the Country Info service, and the second one is the Celsius-to-Fahrenheit converter. Before that, I'm going to use the Branch shape. In my first branch we are going to get the country info, and in the second branch we are going to consume the Celsius-to-Fahrenheit service. First of all, let me add one more connector — again the Web Services SOAP Client connector; I need to search for it.
After this, I'm going to use the Stop shape — copy and paste it so I can use a Stop shape here as well, then connect everything. Now I'm going to configure the first Web Services SOAP Client connector. To consume a SOAP service, you need to have the WSDL URL — this is my WSDL URL. Now configure the connector: the connector type is set, and there is only one action, Execute. For the connection, I'm going to create a new one: 'SOAP Client Connection 01' — this is my first connection. Here I'm going to provide the WSDL URL — this URL. Second is the SOAP endpoint URL: for the endpoint URL, I need to remove the trailing '?WSDL' from the address. Username and password: since this service is publicly available, I don't need to provide a username and password. Save and Close — the 'SOAP Client Connection 01' connection has been established. Now, the operation: I need to create a new one. I'll name the Web Services SOAP Client connector operation 'Operation 01'. Next, I'm going to import: I will choose a local atom (you can also select the cloud atom), select my connection — SOAP Client Connection 01, which I've just created — and click Next; it connects to the atom. So I'm using this particular service first. Now, what do I want? I want the country currency: I will provide the ISO code as an input, and against the ISO code I will get the country currency. Click Next, connecting to the atom... Finish. So this is my profile — Finish. The request profile is something that we give as an input; the response is something that we get back from the service. So as the request we're going to provide the ISO code, and in the response we are going to get the country currency. Save and Close — I need to Save and Close.
Now next, click OK. Next I'm going to configured parameter as parameter value. I'm going to provide ISO code, country ISO code. So I will provide USA against USA, I will get dollar. Okay, next I'm going to configure my second soap service, which is this one. I need to open it first here. So this is my second soap service, and this is my first soap service which I am going to consume. So sought service supports only the XML format. No other file type is supported by soap connection. I'm going to create a new connection and name of connection this time around name of connection would be so prank connection, 0-2, URL. I'm going to provide this particular wisdoms URL, okay, endpoint URL. I'm going to remove this del end question mark, username and security. I'm not going to provide since this is the public soap service which is publicly available now, Save and Close. Now next I'm going to provide the operation. I will create an operation, soap client operations 02. I will call this operations 02. Then I'm going to import the profile. I'm going to select this connection, this connection 0-2, which I've just created late next. So I'm going to provide a Celsius temperature in Celsius. And in result, I'm going to get D Fahrenheit Celsius to Fahrenheit. You can also choose the Fahrenheit to Celsius. I'm going to choose Celsius to Fahrenheit. Click next. Connecting to act them. Finish. This has been configured Save and Close. Click OK. Find, save this. Now let me just provide d parameter value. So what parameter value I'm going to provide, I'm going to provide the parameter value temperature in Celsius. So let's check 35 degree Celsius. So in return, I'm going to get the value in Fahrenheit, safe. Now execute the process. In the first branch, you will get the country currency. In the second Brad, you will get the temperature in Fahrenheit. Okay? Now, the second branch is executing. Okay? As I said, soap supports only the XML formats result I will get in XML format. 
Look: USD, Dollar. And here I'm getting the temperature in Fahrenheit — 35 degrees Celsius has been converted to Fahrenheit, which is 95 degrees Fahrenheit. You can also provide some other values. Let me change one value here: instead of USA, let me provide UK. And here, instead of 35, let's say 40 degrees Celsius. Let me execute the process, save and execute, and see the result — it is in XML format: look, 104 degrees Fahrenheit; and for UK, 'Country not found in the database'. So instead of 'UK' I need to give a valid ISO code. Let's say AF, which is the ISO code of Afghanistan — I need to provide AF. Save, and against AF I will get the currency of Afghanistan. Let me see: look — Afghanistan, ISO code AF, and I get the Afghani currency. So thank you so much, and have a great learning. 24. Consume REST Service: Hello everyone. In this particular lecture, we are going to consume a REST service in Dell Boomi. In order to consume a REST service, first of all, let me create a new component: 'Consume REST'. Create this component. First I'm going to use the Start shape with 'No Data'. To consume a REST service, I need to use the HTTP Client connector, so I'm going to join the Start shape with this connector. So what am I going to do? I'm going to read from this API: if you want to read the list of users, you use this particular link with GET, and you use another link if you want to get the information of a specific user — that one gives you only one user. I'm going to read multiple users, so I will use the list link in order to consume the REST service. Let me configure it.
After consuming the REST service, we are going to insert the data into a database. This is the table into which we are going to insert data, and these are the fields that we are going to insert into my user_table. Now let me configure the connector. The action is Get, since we are going to read data from the service. For the connection, I will create a new one named 'HTTP Client'; I need to mention this particular URL here. Authentication: no need to provide any authentication details, because the service is publicly available. Save and Close — 'HTTP Client Connection' is the name. For the operation, I need to create a new one: 'HTTP Client Operation'. Then select the request profile type and the response profile type: since we are not going to provide any input value, I'm selecting 'None' for the request, and in the response we are going to get a JSON file. A REST service supports multiple file types — it supports JSON, XML, etc. — and I will select JSON, since in the response I'm going to get JSON. Now, the response profile: I need to create it. Create a new response profile, 'REST Response JSON Profile'. Let me copy the response: in my Documents folder, in a 'Dell Boomi' directory, I will create a text file named response_profile.json, paste the response into it, and save. I'm going to use this particular file in order to import the metadata: Import, choose a file — which file? The file I've just created, response_profile.json. Click Next, then Finish. So the profile has been created. The fields I require are id, email, first name, last name, and company name. Save and Close — the file has been added.
For the content type, I want application/json. The HTTP method: which method am I using? Right now I'm using the GET method. Save and Close — so this has been configured. Next, I'm going to use the Database connector, since we are going to insert data into a database. I will configure it with the Send action and the database connection — let me just check this particular connection; it is already created, so no need to create a new one. For the operation, I'm going to create a new one, and for the profile I will create a new one too: here I'm going to select 'Dynamic Insert', then Import, select the atom and the database connection, and click Next. I'm going to use the table name to import the metadata of this table. Connecting to the atom... the atom is connecting to the database. Now I am going to select my table name. Let me first execute a query to check whether the table is present or not — the table is already present. So what I'm going to do is just truncate it: TRUNCATE TABLE. The table has been truncated; SELECT * FROM user_table now returns no data. Next: select all fields, Next, Finish. So the fields have been added — id as a number, first name and email as character, and so on. Save and Close — the profile has been added. Let me edit the profile name: I am going to call this the 'User Database Profile'. Save and Close. Now, next I am going to use the Map shape. Let me configure it: connect it, and at the end I'm going to use the Stop shape. Now let's quickly configure the Map shape: create a new map, which I'm going to call 'HTTP Client DB'. I will choose a profile: on the source side, I will select the JSON profile.
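The pipeline being wired up here amounts to: take each user object from the JSON response, keep the five mapped fields, and insert one row per user. A minimal offline sketch of that pattern — sqlite3 stands in for the MySQL table, and the payload is a hand-made sample, not a live API call:

```python
# Offline model of the REST-to-database flow: parse a JSON user list,
# map each entry to the five profile fields, insert rows into a table.
import json
import sqlite3

response = json.loads(
    '{"data": ['
    ' {"id": 1, "email": "a@x.com", "first_name": "Amy",'
    '  "last_name": "Ng", "company": "Acme"},'
    ' {"id": 2, "email": "b@x.com", "first_name": "Bob",'
    '  "last_name": "Lee", "company": "Beta"}'
    ']}'
)

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE user_table (id INTEGER, email TEXT,"
    " first_name TEXT, last_name TEXT, company TEXT)"
)

# Map shape analogue: keep only the mapped fields, in table-column order
rows = [
    (u["id"], u["email"], u["first_name"], u["last_name"], u["company"])
    for u in response["data"]
]
conn.executemany("INSERT INTO user_table VALUES (?, ?, ?, ?, ?)", rows)
print(conn.execute("SELECT COUNT(*) FROM user_table").fetchone()[0])
```

In Boomi the same mapping is drawn visually between the JSON profile and the database profile, which is what the next step configures.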
The profile I've just created — the REST JSON profile, this one — is what I'm going to choose. Fine. Now, from here I'm going to choose the database profile: the User Database Profile, which I've just created. Now I'm going to map. Let me open the array list. I am going to map id to id, email to email, first name to first name, last name to last name, and company to company. So this side is the database profile, and this side is the HTTP profile. Save and Close. So what am I doing? I'm consuming this REST service, and whatever data I'm consuming, I'm going to insert into the database — these fields: id, email, first name, last name, company. I'm not going to insert all the fields. Now save, and next I'm going to execute the process. Now you're going to see the data inserted into the table; let me execute, and I need to check it first. Look — the data has been inserted: 12 rows have been inserted. So the process has been executed successfully. If you want to see the logs, you can see them from here — this is how you can see the logs of your process. So thank you so much, and have a great learning. 25. Build SOAP Service: Hello everyone. In this particular lecture, we're going to build a SOAP service in Dell Boomi. First of all, I'm going to create a new component, 'Build SOAP' — this is the name of the component; create the new component. Now I'm going to select the Start shape with connector type, and for the connector I'm going to select 'Web Services Server'. For the action I choose Listen, and for the operation I will create a new one. For the operation type I will select Execute (or you can select the type Get). What we are going to do is provide an id as an input, and against the id we're going to retrieve this information: id, email, first name, last name, and company information.
This data we are going to get from the user_table — let me just execute this query again. Now, the operation type: since we are getting information, I can select Get or Execute; I'm going to select Execute. If you're creating something, you select Create; if you're updating a record, you select Update; if you're performing an upsert, you select the Upsert operation — upsert means that if the record doesn't exist, a new record is created, and if the record exists on the server, the record is updated. So I'm going to select Execute. For the object name, since we are getting the user details, I will give 'user_details' as the name of the object. For the expected input type, I'm going to give XML — I'm going to provide XML as an input — and in the response I'm going to get a single XML object. Now, I need to create a profile. What value am I going to provide as an input? Let me show you — first of all, I need to create the input profile. Let me go to input.xml: this is my input profile, and in this particular XML profile I have only a single value. Now let me create a new profile, the request profile: 'Input XML Profile'. Import a file — the XML file, input.xml — Next, then Finish. Look, the profile has been added, with only one attribute in it. Save and Close. Fine. Now, the response profile: in the response, I'm going to get these values — id, email, first name, last name, and the company name — so I need to create the response profile: 'Response XML Profile'. Import, choose a file — which file am I going to import? Let me show you the profile: this one, user.xml. This is the profile which I'm going to import, so I need to import this user.xml. Open, Next, Finish.
Okay, save and close. Now, fine. Single XML object, okay, everything is okay. Save and close. So this has been configured. I need to change the name of this operation. So instead of "New", let me just name it Web Service Operation. Save and close. So this has been configured. Okay, "Make the recommended changes for me": click this option. This is going to improve your performance, so you need to check this option. Okay, so the first step is done and dusted. In the next step, I'm going to use the map shape. After that, I'm going to use the return documents shape. Why am I using the return documents shape? I am going to tell you the reason. Before that, I'm going to open one more tool, which is known as the SoapUI tool. This is a testing tool. From this tool, I'm going to provide an input value. So what are we doing? We are consuming the SOAP service which is built in Dell Boomi. First of all, we are going to build our SOAP service in Dell Boomi. Then, from outside, we are going to consume the SOAP service which has been built in Dell Boomi. Okay. Now I need to configure my map shape first. Let me just cancel this; we are going to use this SoapUI tool later. If you do not have this tool, you can search for it on Google, then download and install the SoapUI tool. You'll get it for free. You need to download and install SoapUI Open Source: click this option, and afterwards you have to install it. I already have it on my system, so I'm not going to install it again. Now, let me just close this, and I'm going to configure the map. Next, I will create a new mapping, SOAP XML Mapping. Choose a profile. First I'm going to select XML, and I'm going to import the input XML profile which I've created, with the single id. Next: this is the destination section. In the destination section, I'm going to choose this user database profile.
So I have already created this database profile. If you do not have it, you can create it from scratch. You can go here: Database, then create a new profile, then go to Import and select. Okay, let me just show you again: select the database connection, click Next, connecting to atom. So I'm going to import the attributes of this table; this is going to import the metadata of the table. Now I need to select the table name. This is my table, user_table. Click Next, connecting to atom: the atom is connecting to the database. Now select all, Next, Finish. Fine, okay, save and close. Next I'm going to create a SQL lookup. I will go to the lookup section, SQL Lookup, click OK. For the connection, I'm going to select the database connection. For the type, I'm not going to select Stored Procedure; I will select Standard. So I'm going to write a SQL query, a SELECT ... FROM query. Let me just copy and paste my SQL query here. For the input value, I'm going to provide the ID as an input, and against that ID, I'm going to fetch the different fields. The first field I'm going to fetch is ID, then I'm going to get the email, firstName, lastName, and then company. Okay, click OK. Now I'm going to map ID with ID. This ID, I'm going to map to this ID; email, I'm going to map to email; firstName to firstName, then lastName and company. So I'm going to use this ID: against the ID, I'm going to fetch ID, email, first name, last name, company. For example, if I'm providing the value 7, I'm going to get this specific row, okay? Only this row. So now save and close this. Save and close. SOAP XML Mapping, click OK. The return documents shape is going to return the data to the SoapUI tool. So from this SoapUI tool, I'm consuming the SOAP service which is built in Dell Boomi.
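The SQL lookup pasted into the map function is only partly dictated in the lecture ("select ... from"). A plausible sketch of a parameterized lookup, assuming the table and column names used elsewhere in the course (the real query may use SELECT * or different column names), is:

```sql
-- hypothetical SQL lookup: fetch one user row by the incoming id
SELECT id, email, first_name, last_name, company
FROM user_table
WHERE id = ?;
```

The `?` placeholder is bound to the id from the input profile at runtime, and the five selected columns feed the five output fields of the lookup function.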
So before consuming this web service, I need to deploy this service, either in the test environment or the production environment. So I will click this, then I will click Next. I'm going to provide the version, let's say 1.1. Then create this and deploy. For the environment, I'm going to select the test environment which I have created. Next, click Next, Deploy. Okay, deployment successful. View deployment. Now you can look at the deployments. So this is the deployment, Build SOAP; this is the version, the deployment date, and deployed by this ID. Now I need to go to the Atom Management section. Here, I need to go to Shared Web Server, and I need to copy this URL. What you are going to do is use this URL. And one more thing: go to Build SOAP, and you are going to use this particular URL path as well; I need to show you. You have to use this one. Okay? Now combine the base URL with this path using a slash, and then append a question mark. This gives you the WSDL URL that you are going to use in order to consume the SOAP service. So this process has been deployed. Now, the second step is to consume the SOAP service. How are you going to consume it? You are going to consume this SOAP service from a third-party tool, which is the SoapUI tool, the testing tool. So first I'm going to just paste it in. Okay? This is the WSDL URL; I need this URL in order to consume the SOAP service. Now, in SoapUI, for the initial WSDL, I need to provide this. This is the project name, click OK. Now user_table, user_details: this is the object name I need to use. Request: I need to provide one request. So what value am I going to give in my request? Let me just open it. I need to give an ID value; against this ID value, I'm going to fetch the complete information. Now let's say I'm going to provide the ID value nine. Open the SoapUI tool, ID value nine.
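Under the hood, SoapUI builds the endpoint URL from the shared web server base and posts a SOAP envelope wrapping the single-id input profile. A minimal Python sketch of that request, where the host, path, and element names are all assumptions for illustration (your Atom Management screen shows the real base URL):

```python
from xml.sax.saxutils import escape

# Hypothetical base URL copied from Atom Management > Shared Web Server,
# combined with the process path shown on the Build SOAP operation.
BASE_URL = "http://localhost:9090/ws/soap"
WSDL_URL = BASE_URL + "?wsdl"  # the contract URL given to SoapUI

def build_envelope(user_id: int) -> str:
    """Wrap the single-id input profile in a SOAP 1.1 envelope."""
    return (
        '<soapenv:Envelope '
        'xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">'
        "<soapenv:Body>"
        f"<UserRequest><id>{escape(str(user_id))}</id></UserRequest>"
        "</soapenv:Body></soapenv:Envelope>"
    )

envelope = build_envelope(9)
print(envelope)

# Actually sending it would require the deployed Atom to be running:
# import urllib.request
# req = urllib.request.Request(BASE_URL, data=envelope.encode(),
#                              headers={"Content-Type": "text/xml"})
# response = urllib.request.urlopen(req).read()
```

SoapUI does the same thing for you once it reads the WSDL: it generates a sample request where you only fill in the id value.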
Now execute. In response, I'm going to get, look: the ID, email, firstName, lastName, and the company information. I need to check this information. Let's say instead of nine, ten. Execute again. So you are going to get this information, okay? So this is how you consume the SOAP service from the SoapUI tool. So I have built this SOAP service in Dell Boomi. Now let's say you have made some changes to the mapping, let's say the SOAP XML mapping. If you want to deploy this process again, you will click Create, then Next, and here you are going to provide a different version. This time around, I'm going to provide 1.2. Create this and deploy. For the environment, I need to select my environment, which is the test environment. Next. So this is the second time I'm deploying this process. Okay, deployment successful. So this is the latest deployment. "Are you sure you want to navigate away from this screen?" Yes. Okay. Next, I'm going to go to my Atom Management section. So this is my atom's deployed processes; these are the deployed processes, and my latest deployed process is this one. Now, let me go to this. So I hope you understood the concept. Thank you so much and have a great learning. 26. Build Rest Service in Dell Boomi: Hello everyone. In this particular lecture, we are going to build a REST service in Dell Boomi. This is what we are going to build in Dell Boomi: a percentage-increase calculator. So this is my question: for example, John works in a store for $1,000 per month. After one year, John's pay is increased by $79. How much is his pay increase in percentage? I need to solve this question. Okay, so first of all, I'm going to build the percentage-increase calculator in Dell Boomi, and this is the formula that we are going to implement. So first of all, I need to create a new component in order to build a REST service in Dell Boomi.
So for the component type, I'm going to select Process; the name of the component will be Build Rest Service. Create a new component. Now I'm going to select the Start shape with the connector type. After this, I need to configure this connector type. I will select Web Services Server as the connector type. For the action, I'm going to select Listen. For the operation, I will create a new operation; the name of the operation will be Rest Connector Operation, with a simple URL path. This is the URL path. Okay, now for the operation type, I'm going to select the option Execute. For the object, I'm going to name it Percentage Calculator. Expected input: what input do I want to provide? I'm going to provide these input values, the new amount and the original amount; I'm going to provide these two values as an input. For that, I will select single JSON object. The REST service supports multiple formats: it supports XML format, JSON format, et cetera. So I will select single JSON object. For the response output: I am going to get a single value in response, and that will be the percentage increase. So for the response type, I'm going to select single JSON object as well. Okay? Now, the request profile. I need to show you the request profile which I'm going to provide. This is the request profile, and against this request, I'm going to get a single output, the percentage increase. So now I'm going to create the profiles first. So this one is not the response profile; I will create the request profile first. Request profile, okay: Request Percentage, a request JSON profile. Now import this, choose a file. For the request, I will select request.json. Now click Next, Finish. So the attributes have been added. Now I need to check my attributes.
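The request.json and response.json files imported into the two JSON profiles are only described, not shown in full. A plausible sketch, assuming key names derived from the fields the lecture mentions (the actual course files may spell them differently), is:

```json
{
  "newAmount": 1079,
  "originalAmount": 1000
}
```

```json
{
  "percentageIncrease": 7.9
}
```

As with the XML profiles earlier, importing a sample document lets Boomi derive the profile attributes automatically, which is why checking the imported attributes is the next step.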
New amount, original amount: save and close. Now I'm going to configure the response profile. I will call it Response, a response JSON profile. Import response.json, click Next, Finish. Percentage response profile, okay. This is the value I'm going to get, the percentage increase. The name is the same. Save and close. So this has been configured: the request input type and the output type have been configured, and the profiles have been configured. Save and close. Now this connector has been configured. Next, I'm going to use the map shape. After this, I'm going to use the return documents shape. Percentage increase: I will get the return value, the percentage increase. Now I'm going to map it: create a new mapping. The name will be Percentage Increase Map. Okay, choose a profile. I'm going to select the JSON profile, the profile I have just created. The name of the profile is Percentage Request Profile. This is the profile which I am going to provide. Click OK. In the response, I'm going to get a single value, the percentage increase: the Percentage Response JSON profile. So I'm going to get the percentage increase, okay. Here I'm going to perform some calculation, so I have to create a new function in order to implement this formula. I need to build this formula first. First I will provide the input values. The first input will be the new amount, and the second input value will be the original amount. Okay? These are the two inputs. Now I'm going to build this formula: new minus original. First I will use the numeric function Subtract: new minus original. Okay, new minus original: the value to subtract is the original amount. So, new minus original. Now I want to divide the result, whatever result I'm going to get, by the original amount. So I'm going to use the divide function, Math Divide. Okay, click OK.
So, the result divided by the original amount: the result value, and the value to divide by is the original amount. The original amount is this one. Now, the next step: we're going to multiply the result by 100. So I'm going to use one more numeric function, Multiply, click OK. Multiply by 100: the result value, and the value to multiply by is 100. Now I'm going to define the output value, the percentage increase. This is going to be my output value. That result, I'm going to map to this value. Okay? So this has been defined: new amount minus original amount, the result is divided by the original amount, then I'm multiplying the value by 100, and finally I'm storing the result in the percentage output. So this has been configured. Let me just name it Percentage Calculation. Save and close. I need to save and close. Now I'm going to map the new amount and the original amount here, and the percentage, I'm going to map with the percentage increase. Next, I'm going to save and close this. So this mapping has been configured. The return documents shape is going to return me this value, the percentage increase value. I showed you this value earlier. Okay, now, next, what I am going to do is deploy this process in either the test environment or the production environment. I need to save this first; it has been saved. Create packaged component, Next. The version is 1. Create the packaged component and deploy this process. I need to choose the environment, so I'm going to deploy this in the test environment. Click Next, Next, deploy the process. So the process has been deployed; view deployment. Now you can see your deployments. So this is the process. Next, I'm going to open the SoapUI tool. As an input value, I'm going to provide this input value, request.json; I need to pass some values in it. So now let me just open the SoapUI tool, it is opening. This is the process which I've deployed. Let me just open it again.
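The chain of map functions (subtract, divide, multiply) implements the standard percentage-increase formula. A minimal Python sketch of the same calculation the map performs (the function name here is just for illustration):

```python
def percentage_increase(new_amount: float, original_amount: float) -> float:
    """Same steps as the map function chain:
    subtract the original, divide by the original, multiply by 100."""
    return (new_amount - original_amount) / original_amount * 100

# John's pay went from $1,000 to $1,079:
print(round(percentage_increase(1079, 1000), 1))  # -> 7.9
```

So a $79 raise on a $1,000 salary is a 7.9% increase, which is the value the deployed service should return.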
In SoapUI, cancel this. Okay, REST: here I'm going to provide the URL. So which URL am I going to use? I need to go to Atom Management first. Okay, I need to provide a URL here. So go to Atom Management, Shared Web Server. I'm going to use this, and then go to the process, Build Rest Service. Then I'm going to go here. After that, I'm going to use this complete URL: combine this and this, and let me just use this particular URL. Okay, now, create a new REST project and open it. Here I'm going to provide this URL, click OK. Now, this is the request. Go to the request: I need to select the POST method. Here I'm going to select application/json and provide the value in the form of JSON. Let me just copy the same text. I will provide two values, the new amount and the original amount, and the result I'm going to get is the percentage increase value. So, the new amount: as per my question, after one year John's pay is increased by $79, so now John's pay is $1,079. I will provide the value 1079 here. The new amount is 1079, and the original amount was $1,000. Okay, execute. Okay, JSON: I need to view the response in the form of JSON. Look, the value is 7.9%. Let's say if it is $1,099, then what value am I going to get? I'm going to get the percentage increase, 9.9%. Okay, so this is how you develop a REST service in Dell Boomi. So I hope you understood the concept. Thank you so much and have a great learning.
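What SoapUI sends here is an HTTP POST with a small JSON body. A Python sketch of the same request, where the endpoint URL and key names are assumptions for illustration (copy the real URL from Atom Management > Shared Web Server):

```python
import json

# Hypothetical REST endpoint from the shared web server plus the
# process's URL path; yours will differ.
URL = "http://localhost:9090/ws/rest/percentage"

# The request body matching the request JSON profile (key names assumed).
body = json.dumps({"newAmount": 1079, "originalAmount": 1000})
print(body)

# Sending it would require the deployed Atom to be running:
# import urllib.request
# req = urllib.request.Request(URL, data=body.encode(), method="POST",
#                              headers={"Content-Type": "application/json"})
# result = json.loads(urllib.request.urlopen(req).read())
# result should then hold the percentage increase, e.g. 7.9 for these inputs.
```

Any HTTP client (SoapUI, curl, a script like this) can exercise the service the same way, as long as it POSTs `application/json` to the deployed endpoint.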