How to Configure K8S Multi-Node Cluster over AWS Cloud via Ansible Role

INTRODUCTION:

FIRST STEP

Launch three EC2 instances on the AWS cloud

In this task, we will deploy a Kubernetes cluster on the AWS cloud via Ansible roles.

I created two roles for this purpose:

aws_provision launches the three EC2 instances, and cluster_setup configures the Kubernetes cluster on the launched instances.
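A top-level playbook ties the two roles together. The structure below is a hedged sketch (the host group names and the node_type variable are illustrative; the actual playbook is in the linked repo):

```yaml
# setup.yml -- illustrative top-level playbook
# aws_provision runs locally against the AWS API;
# cluster_setup then runs on the freshly launched instances.
- hosts: localhost
  roles:
    - aws_provision

- hosts: master
  roles:
    - role: cluster_setup
      node_type: master

- hosts: workers
  roles:
    - role: cluster_setup
      node_type: worker
```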

This is the main task file for AWS provision.
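A minimal sketch of such a task file, assuming the amazon.aws collection is installed and credentials are configured (the variable names here are placeholders defined in the vars file):

```yaml
# roles/aws_provision/tasks/main.yml -- illustrative sketch
- name: Launch EC2 instances for the cluster
  amazon.aws.ec2_instance:
    name: "{{ item }}"
    key_name: "{{ key_name }}"
    instance_type: "{{ instance_type }}"
    image_id: "{{ ami_id }}"
    region: "{{ region }}"
    security_group: "{{ sg_name }}"
    network:
      assign_public_ip: true
    state: running
  loop:
    - master
    - worker1
    - worker2
```

Looping over the three names gives one master and two workers from a single task.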

This is the vars file to store variables for the aws_provision.
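The vars file simply holds the values the tasks above consume. All values below are placeholders, not the ones used in the actual repo:

```yaml
# roles/aws_provision/vars/main.yml -- example values only
region: ap-south-1
ami_id: ami-0123456789abcdef0   # an Amazon Linux 2 AMI for your region
instance_type: t2.micro          # kubeadm prefers 2 vCPU / 2 GiB or more
key_name: mykey                  # existing EC2 key pair
sg_name: k8s-sg                  # security group allowing the cluster ports
```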

Now we run the ansible-playbook command to execute the aws_provision role.

We can see now that we have launched three ec2 instances on AWS.

SECOND STEP

Configure a Kubernetes cluster consisting of one Master and two Worker Nodes.

The steps involved are:

FOR BOTH MASTER AND WORKER NODES:

Configure the yum repository for Kubernetes

Install Docker

Install kubeadm, kubelet and kubectl

Start the kubelet service

Pull the required images

Edit /etc/docker/daemon.json to switch Docker to the systemd cgroup driver

Restart the Docker service

Install iproute-tc

FOR MASTER ONLY:

Pull the control-plane images via kubeadm

Configure the Master via kubeadm init

FOR WORKER NODES ONLY:

Copy the join command (token) generated on the Master

Paste it into a file on the worker

Run that file so the worker joins the cluster
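The common steps listed above could be sketched as an Ansible task file like this. Module names and URLs are my assumptions about a typical CentOS/RHEL setup; the real files are in the linked repo:

```yaml
# roles/cluster_setup/tasks/common.yml -- sketch of steps run on every node
- name: Configure the Kubernetes yum repository
  yum_repository:
    name: kubernetes
    description: Kubernetes
    baseurl: https://packages.cloud.google.com/yum/repos/kubernetes-el7-x86_64
    gpgcheck: yes
    gpgkey: https://packages.cloud.google.com/yum/doc/yum-key.gpg

- name: Install Docker and iproute-tc
  package:
    name:
      - docker
      - iproute-tc
    state: present

- name: Install kubeadm, kubelet and kubectl
  yum:
    name:
      - kubeadm
      - kubelet
      - kubectl
    state: present
    disable_excludes: kubernetes

- name: Switch Docker to the systemd cgroup driver (daemon.json)
  copy:
    dest: /etc/docker/daemon.json
    content: |
      { "exec-opts": ["native.cgroupdriver=systemd"] }

- name: Restart and enable Docker and kubelet
  service:
    name: "{{ item }}"
    state: restarted
    enabled: yes
  loop:
    - docker
    - kubelet
```

The cgroup-driver change is why Docker is restarted after daemon.json is edited: kubelet expects systemd as the driver, and a mismatch makes kubeadm init fail.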

Now, this is the task file for the cluster_setup role.
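The master- and worker-specific parts might look like the sketch below. The CIDR, the ignore-preflight flags, and the join_cmd wiring are illustrative assumptions, not necessarily what the repo uses:

```yaml
# roles/cluster_setup/tasks/master.yml -- sketch of master-only steps
- name: Pull the control-plane images
  command: kubeadm config images pull

- name: Initialise the control plane
  command: kubeadm init --pod-network-cidr=10.244.0.0/16 --ignore-preflight-errors=NumCPU,Mem

- name: Set up kubectl for the admin user
  shell: |
    mkdir -p $HOME/.kube
    cp /etc/kubernetes/admin.conf $HOME/.kube/config

- name: Generate the join command for the workers
  command: kubeadm token create --print-join-command
  register: join_cmd

# roles/cluster_setup/tasks/worker.yml -- sketch of the worker-only step
# join_command would be passed down from the master's join_cmd.stdout
- name: Join the node to the cluster
  command: "{{ join_command }}"
```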

This is the folder structure for the cluster_setup role.

Now we run the ansible-playbook to configure the cluster.

Now we would check the configuration by logging into the Master node and running: kubectl get nodes

Finally, our cluster is set up!

Refer to the code below for any confusion.

https://github.com/jayesh49/Task-19.git


B.Tech ECE student, ARTH Learner
