Real World Vagrant For Distributed Computing - Part I

Toyin Akin, Geek BigData Engineer, Geek Financial Programmer

19 Videos (1h 10m)
    • Introduction to the course

    • My Backstory

    • Virtualization Tooling - Part I

    • Virtualization Tooling - Part II

    • Virtualization Tooling - Part III

    • Enterprise Hardware - Part I

    • Enterprise Hardware - Part II

    • Download VirtualBox and GitBash

    • Download Vagrant, Sublime Text

    • Whiteboard Virtual Machines

    • Startup Vagrant Tooling

    • Startup the simplest VM

    • Building out the vagrant file

    • Location of Vagrant CentOS boxes

    • Boot up our first CentOS VM

    • Log into our Vagrant CentOS VM

    • Navigate Vagrant Environment

    • Vagrant Commands

    • Summary of Vagrant Part I


About This Class

Note: We have now added a "pocd_config.vm.box_version" setting to the Vagrantfile so that the course uses a static version of the CentOS box.

In Part I of "Vagrant for Distributed Computing" we download the tooling needed to build out a multi-VM virtual environment on our desktop. We quickly boot up a CentOS Linux OS using only six lines of Ruby code!

The class project contains a copy of the final Vagrantfile used in Part I.
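The final Vagrantfile lives in the class project; as a rough sketch, a configuration along these lines is enough to boot a CentOS box. The box name "centos/7" and the pinned version below are illustrative assumptions — only the pocd_config variable and the box_version setting are confirmed by the note above.

```ruby
# Minimal Vagrantfile sketch -- box name and version are illustrative,
# not the exact values used in the course.
Vagrant.configure("2") do |pocd_config|
  pocd_config.vm.box = "centos/7"          # a CentOS box from Vagrant Cloud
  pocd_config.vm.box_version = "2004.01"   # pin a static box version
end
```

Running vagrant up in the directory containing this file downloads the box on first use and boots the VM.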

## #######################################################################

"NoSQL", "Big Data", "DevOps" and "In Memory Database" technologies are hot, highly valuable skills to have – and this course will teach you how to quickly create a distributed environment on which to deploy them.

A combination of VirtualBox and Vagrant will transform your desktop machine into a virtual cluster. However, this needs to be configured correctly: simply enabling multiple nodes within Vagrant is not good enough; it needs to be tuned. Developers and operators within large enterprises, including investment banks, use Vagrant to simulate production environments.

After all, if you are developing against or operating a distributed environment, it needs to be tested – both in terms of the code deployed and the deployment code itself.

You'll learn the same techniques these enterprise teams use, on your own Microsoft Windows computer or laptop.

Vagrant provides easy to configure, reproducible, and portable work environments built on top of industry-standard technology and controlled by a single consistent workflow to help maximize the productivity and flexibility of you and your team.

This course will use VirtualBox to carve out your virtual environment. However the same skills learned with Vagrant can be used to provision virtual machines on VMware, AWS, or any other provider.

If you are a developer, this course will help you isolate dependencies and their configuration within a single disposable, consistent environment, without sacrificing any of the tools you are used to working with (editors, browsers, debuggers, etc.). Once you or someone else creates a single Vagrantfile, you just need to run vagrant up and everything is installed and configured for you to work. Other members of your team create their development environments from the same configuration. Say goodbye to "works on my machine" bugs.
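That developer workflow boils down to a handful of commands. The repository URL below is a placeholder; the vagrant subcommands themselves are standard:

```shell
# Hypothetical project URL -- substitute your team's repository.
git clone https://example.com/team/project.git
cd project

vagrant up    # build and boot the VM described by the shared Vagrantfile
vagrant ssh   # log in to the running VM and start working
```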

If you are an operations engineer, this course will help you build a disposable environment and a consistent workflow for developing and testing infrastructure management scripts. You can quickly test your deployment scripts and more using local virtualization such as VirtualBox or VMware (VirtualBox for this course). Ditch your custom scripts to recycle EC2 instances, stop juggling SSH prompts to various machines, and start using Vagrant to bring sanity to your life.
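For that disposable test cycle, the standard Vagrant lifecycle commands look roughly like this:

```shell
vagrant up           # create and boot the environment
vagrant provision    # re-run provisioning scripts against the running VM
vagrant halt         # shut the VM down, keeping its state on disk
vagrant destroy -f   # throw the environment away entirely
vagrant up           # rebuild from scratch to test scripts on a clean slate
```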

If you are a designer, this course will help you with the distributed installation of software so that you can focus on doing what you do best: design. Once a developer configures Vagrant, you do not need to worry about how to get that software running ever again. No more bothering other developers to help you fix your environment so you can test designs. Just check out the code, run vagrant up, and start designing.







Toyin Akin

Geek BigData Engineer, Geek Financial Programmer

LinkedIn :

" ...A financial developer is a high-end programmer in a financial institution.  By comparison, a regular "software developer" in the same institution is a low status position, but if you can get people to call you a "financial developer" you suddenly become more valuable ... "

Toyin Akin spent 6 years at the Royal Bank of Scotland and 5 years at the investment bank BNP Paribas, developing and managing Interest Rate Derivatives services as well as engineering and deploying in-memory databases (Oracle Coherence), NoSQL, and Hadoop clusters (Cloudera) into production.

In 2016, Toyin left to start his own training company, POC-D ("Proof Of Concept - Delivered"), which focuses on delivering training in In Memory Database, NoSQL, BigData and DevOps technology.

I have started a YouTube channel, publishing free distributed computing videos. These are not courses – simply ad-hoc videos discussing various distributed computing ideas.

Within Royal Bank of Scotland ...

Responsible for engineering DEV and PRODUCTION stacks to support Cloudera Hadoop and Oracle Coherence, providing deployment and real-time monitoring.

I successfully engineered an Oracle Coherence stack (deployment/monitoring/managing) which has been deployed globally within the bank.

I worked with more than 20 production deployed clusters in either architecting or troubleshooting. This involved not only best practice for Coherence JVM process placement, cluster configuration and monitoring but O/S tuning, hardware selection and configuration, JVM tuning and log analysis.

With Cloudera Manager (CM v5.x), CM management services as well as CDH5 services were deployed into PRODUCTION via automation (calling the CM API).

I also utilized CM custom service descriptors (CSDs) and the parcels feature to deploy, monitor and manage in-house, custom, distributed Hadoop services.

NoSQL - Responsible for leading a team in building NoSQL POC stacks with the aim of creating a tenanted key/value store. POC stacks were created for Couchbase, DataStax Cassandra, MongoDB, Riak, Redis and HBase.
In terms of hardware, HP Blade/Rack as well as Cisco UCS were considered.

I was a member of the Data Faculty team. Responsibilities included certifying RDBMS / Hadoop / NoSQL / IMDG products that are to be officially used within the bank.

Further information can be obtained from my LinkedIn profile ...