
CI/CD Pipeline from Bitbucket to AWS EC2 using DevOps tools

March 31, 2020

INTRODUCTION

This article describes how to deploy a Node.js project from a Bitbucket repository to an EC2 instance in AWS, showing the flow and integration of the different resources involved: AWS CloudFormation, AWS CodeBuild, AWS CodePipeline, and AWS CodeDeploy.

CodeBuild is a continuous integration service that compiles the code coming from the Bitbucket repository, packages it with the necessary dependencies, and stores the resulting artifact in an S3 bucket.

With the CodePipeline service, the build and deployment phases of the release process are automated and run each time the code is modified. This allows code updates to ship quickly and reliably.

And finally, the CodeDeploy service automates the delivery of the code to the EC2 instance.

OBJECTIVE

Build a continuous integration (CI) / continuous delivery (CD) flow that deploys a Node.js application from a Bitbucket repository to an AWS EC2 instance.

PREREQUISITES

  • An AWS account and a Bitbucket account
  • Basic knowledge of CI/CD and AWS services

OVERVIEW

The solution diagram below shows what we will be building throughout the article.

ci cd 01

Bitbucket repository structure:

  • config: Folder where the “buildspec.yml” file is stored
  • nodeapp: The Node.js application
  • scripts: Bash files used by “appspec.yml”
  • appspec.yml: Defines how the application is deployed on the EC2 instance
ci cd 02

Node.js application structure:

ci cd 03

The “main.js” application used in this demonstration responds with the message: "Hello Web server!"

    const http = require('http');
    const port = 3000;

    const requestHandler = (request, response) => {
        if (request.method === 'GET' && request.url === '/') {
            response.writeHead(200, {"Content-Type": "text/html"});
            response.end('Hello Web server!');
        } else {
            // Close any other request instead of leaving it hanging
            response.writeHead(404);
            response.end();
        }
    };

    const server = http.createServer(requestHandler);

    server.listen(port, (err) => {
        if (err) {
            return console.log('server.listen: something bad happened: ' + err);
        }
    });

STEPS TO CREATE THE INFRASTRUCTURE

1. Create an IAM Role to the EC2 instance

Enter the AWS console and go to the IAM service. From the left panel, select the “Roles” option and create a new ROLE for the EC2 service; it will interact with CodeDeploy and will be associated with the instance that we create later.

Make sure the following policies are included in the ROLE:

  • AmazonS3FullAccess
  • AWSCodeDeployRole
ci cd 04

After creating the ROLE, edit the “Trust Relationship” and add the service principal “codedeploy.us-east-1.amazonaws.com”, as shown below, so that the CodeDeploy service can also assume the ROLE.

If necessary, change the region to the one you are working in; in this example we are in N. Virginia (us-east-1).

ci cd 05
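The trust relationship that results from this edit can be sketched as a policy document like the one below; this is a minimal illustration built in Python, not the exact document the console generates (the regional CodeDeploy principal follows the article, so adjust it to your region):

```python
import json

# Sketch of the role's trust policy after adding the CodeDeploy principal.
# "codedeploy.us-east-1.amazonaws.com" matches the region used in the
# article (N. Virginia); change it if you work in another region.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Service": [
                    "ec2.amazonaws.com",
                    "codedeploy.us-east-1.amazonaws.com"
                ]
            },
            "Action": "sts:AssumeRole"
        }
    ]
}

print(json.dumps(trust_policy, indent=2))
```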

2. Create an EC2 instance and install CodeDeploy agent

Inside the AWS console, go to the EC2 service and create a new instance with the desired operating system. In this example we will use Ubuntu Server 18.04 on a “t2.micro” instance. In the configuration details, under the “IAM role” option, select the ROLE created in the previous step and leave the other options at their defaults.

ci cd 06

AWS CodeDeploy will search for the instance based on tags; therefore, the instance(s) must be tagged so they can be identified at deployment time. In this example we will use a single instance, tagged with Key=Name and Value=InstanceToDeploy.

ci cd 05
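The tag-based selection can be sketched as follows; this is a minimal illustration with made-up instance data, not the actual CodeDeploy logic:

```python
# Sketch of KEY_AND_VALUE tag matching: an instance is selected only if it
# carries the exact Key/Value pair. Instance IDs below are hypothetical.
def matches_tag_filter(instance_tags, key, value):
    """Return True if the instance carries the exact Key/Value pair."""
    return instance_tags.get(key) == value

instances = [
    {"id": "i-0abc", "tags": {"Name": "InstanceToDeploy"}},
    {"id": "i-0def", "tags": {"Name": "OtherInstance"}},
]

# Only the first instance matches Key=Name, Value=InstanceToDeploy
targets = [i["id"] for i in instances
           if matches_tag_filter(i["tags"], "Name", "InstanceToDeploy")]
print(targets)
```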

Create a new security group and allow SSH access, as well as the ports needed to reach the application from the internet (port 3000 in this example).

ci cd 08

Once you have launched the instance and it is running, connect to the instance via SSH and install the CodeDeploy agent following the Official instructions for the selected Operating System.

If you choose the same OS as in this example, run the following command to ensure that the agent was installed successfully:

# service codedeploy-agent status

The following message should be displayed:

ci cd image
NOTE: From this point on, the entire CI / CD infrastructure will be created from a CloudFormation template.

3. Create S3 Bucket

This Bucket will store the artifacts generated by CodeBuild. The CodePipeline service will also access the Bucket to take the artifact and, through the CodeDeploy service, deploy the code to the EC2 instance(s).

    "s3Bucket": {
      "Type": "AWS::S3::Bucket",
      "Properties": {
        "AccessControl": "Private",
        "BucketName": "poc-workflow-martin",
        "Tags": [
          {
            "Key": "Description",
            "Value": "Created from CloudFormation template"
          }
        ],
        "VersioningConfiguration": {
          "Status": "Enabled"
        }
      },
      "DeletionPolicy": "Delete"
    },

4. Create CodeBuild

When creating a CodeBuild project, it is necessary to indicate the source repository and the authentication method. These properties are referenced as parameters, so they will be requested when creating the CloudFormation stack.

Some important CodeBuild parameters are described below:

  • Artifacts -> Location: Where the artifact will be stored; references the Bucket created in the previous step.
  • Artifacts -> Name: The name the artifact will have.
  • Filter Groups: A build is performed every time a “git push” is executed on the Bitbucket master branch.
  • Source -> BuildSpec: Indicates the path of the “buildspec” file, which describes the build details. In this example, the file is stored at: config/buildspec.yml
    "WorkflowBuild": {
      "Type": "AWS::CodeBuild::Project",
      "Properties": {
        "Artifacts": {
          "ArtifactIdentifier": "work-poc",
          "EncryptionDisabled": true,
          "Location": {
            "Ref": "s3Bucket"
          },
          "Name": "workflow.zip",
          "NamespaceType": "NONE",
          "OverrideArtifactName": true,
          "Packaging": "ZIP",
          "Type": "S3"
        },
        "BadgeEnabled": false,
        "Description": "Build for Workflow",
        "Environment": {
          "ComputeType": "BUILD_GENERAL1_SMALL",
          "Image": "aws/codebuild/standard:2.0",
          "PrivilegedMode": false,
          "Type": "LINUX_CONTAINER"
        },
        "LogsConfig": {
          "CloudWatchLogs": {
            "Status": "ENABLED"
          }
        },
        "Name": "POC-Workflow",
        "QueuedTimeoutInMinutes": 30,
        "ServiceRole": {
          "Ref": "BuildRole"
        },
        "Source": {
          "Auth": {
            "Resource": {
              "Ref": "RepoCredentials"
            },
            "Type": "OAUTH"
          },
          "BuildSpec": "config/buildspec.yml",
          "GitCloneDepth": 1,
          "Location": {
            "Ref": "Repository"
          },
          "ReportBuildStatus": true,
          "Type": "BITBUCKET"
        },
        "Tags": [],
        "TimeoutInMinutes": 60,
        "Triggers": {
          "FilterGroups": [
            [{
              "Pattern": "refs/heads/master",
              "Type": "HEAD_REF"
            },
            {
              "Pattern": "PUSH",
              "Type": "EVENT"
            }]
          ],
          "Webhook": true
        }
      }
    },
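As a rough sketch of how the “FilterGroups” above behave: every filter in a group must match the incoming webhook event for a build to trigger. The event shape below is a simplified assumption, not the real Bitbucket payload:

```python
import re

# Each filter group is a list of filters combined with AND; CodeBuild
# patterns are regular expressions, so re.fullmatch approximates them here.
def group_matches(event, filters):
    for f in filters:
        if f["Type"] == "EVENT" and not re.fullmatch(f["Pattern"], event["type"]):
            return False
        if f["Type"] == "HEAD_REF" and not re.fullmatch(f["Pattern"], event["head_ref"]):
            return False
    return True

# Same group as in the template: PUSH events on the master branch
filter_group = [
    {"Pattern": "refs/heads/master", "Type": "HEAD_REF"},
    {"Pattern": "PUSH", "Type": "EVENT"},
]

push_to_master = {"type": "PUSH", "head_ref": "refs/heads/master"}
push_to_feature = {"type": "PUSH", "head_ref": "refs/heads/feature-x"}
print(group_matches(push_to_master, filter_group))   # True
print(group_matches(push_to_feature, filter_group))  # False
```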

5. Create CodeDeploy Application and Deployment Group

In the CodeDeploy Application, the “ComputePlatform” property indicates the type of platform where the deployment will take place. In this example, since the target is an EC2 instance, the value is “Server”.

    "WorkFlowDeploy": {
      "Type": "AWS::CodeDeploy::Application",
      "Properties": {
        "ApplicationName": "poc-Workflow-deploy",
        "ComputePlatform": "Server"
      }
    },

In the case of Deployment Group, the following properties should be indicated:

  • DeploymentType: Type of deployment, “in-place” or “blue/green”. In this example, we indicate the value "in-place"
  • Ec2TagFilters: CodeDeploy will search for and include all the EC2 instances identified with a specific tag. Therefore, all instances with Key=Name and Value=InstanceToDeploy will be included in the deployment
  • ServiceRoleArn: Here we indicate the ARN of the ROLE created in step 1
  • ApplicationName: Name of the CodeDeploy Application created at the beginning of this step
    "WorkFlowDeployGroup": {
      "Type": "AWS::CodeDeploy::DeploymentGroup",
      "Properties": {
        "ApplicationName": "poc-Workflow-deploy",
        "AutoRollbackConfiguration": {
          "Enabled": true,
          "Events": ["DEPLOYMENT_FAILURE"]
        },
        "DeploymentConfigName": "CodeDeployDefault.OneAtATime",
        "DeploymentGroupName": "poc-group-codedeploy",
        "DeploymentStyle": {
          "DeploymentOption" : "WITHOUT_TRAFFIC_CONTROL",
          "DeploymentType": "IN_PLACE"
        },
        "Ec2TagFilters": [{
          "Key": "Name",
          "Type": "KEY_AND_VALUE",
          "Value": "InstanceToDeploy"
        }],
        "ServiceRoleArn": "arn:aws:iam::030618954727:role/POC-codedeploy-role"
      },
      "DependsOn": "EC2instance"
    },

6. Create CodePipeline

The following two stages will be created for the CodePipeline:

  • Source: The pipeline takes the artifact "workflow.zip" from the Bucket created in step 3
  • Deploy: CodeDeploy delivers the artifact to the EC2 instance. At this stage, our application is deployed according to the “appspec.yml” file
    "codepipeline": {
      "Type": "AWS::CodePipeline::Pipeline",
      "Properties": {
        "ArtifactStore": {
          "Type": "S3",
          "Location": {
            "Ref": "s3Bucket"
          }
        },
        "Name": "WorkflowDeploy_poc",
        "RestartExecutionOnUpdate": true,
        "RoleArn": { 
          "Fn::GetAtt" : [ "CodePipelineServiceRole", "Arn" ]
        },
        "Stages": [{
            "Actions": [{
              "ActionTypeId": {
                "Category": "Source",
                "Owner": "AWS",
                "Provider": "S3",
                "Version": 1
              },
              "Configuration": {
                "S3Bucket": {
                  "Ref": "s3Bucket"
                },
                "S3ObjectKey": {
                  "Ref": "SourceObjectKey"
                },
                "PollForSourceChanges": false
              },
              "Name": "SourceAction",
              "OutputArtifacts": [{
                "Name": "SourceArtifact"
              }],
              "RunOrder": 1
            }],
            "Name": "Source"
          },
          {
            "Actions": [{
              "ActionTypeId": {
                "Category": "Deploy",
                "Owner": "AWS",
                "Provider": "CodeDeploy",
                "Version": 1
              },
              "Configuration": {
                "ApplicationName": {
                  "Ref": "WorkFlowDeploy"
                },
                "DeploymentGroupName": {
                  "Ref": "WorkFlowDeployGroup"
                }
              },
              "Name": "DeployAction",
              "InputArtifacts": [{
                "Name": "SourceArtifact"
              }],
              "RunOrder": 2
            }],
            "Name": "Deploy"
          }
        ]
      }
    },
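The execution order encoded in the template can be sketched with a simplified model: stages run sequentially, and the actions within a stage run by ascending “RunOrder”:

```python
# Simplified model of the pipeline above; stage and action names match the
# template, but this is an illustration, not how CodePipeline runs internally.
pipeline = [
    {"name": "Source", "actions": [{"name": "SourceAction", "run_order": 1}]},
    {"name": "Deploy", "actions": [{"name": "DeployAction", "run_order": 2}]},
]

execution_log = []
for stage in pipeline:
    for action in sorted(stage["actions"], key=lambda a: a["run_order"]):
        execution_log.append(f'{stage["name"]}:{action["name"]}')

print(execution_log)
```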

7. Other resources needed for the infrastructure

The following resources are needed to complement the CI/CD infrastructure; they are included in the CloudFormation template “infra-workflow.json”, which you can obtain in the next step.

  • AWS CloudWatch: To detect upload events in the S3 Bucket and trigger the AWS pipeline automatically.
  • AWS CloudTrail: To record those changes. This service is linked to CloudWatch.
  • IAM Roles: Appropriate permissions for each of the resources created in this infrastructure.

8. Create the CloudFormation Stack

Download the CloudFormation template “infra-workflow.json” from the following repository:

  • Link to repository: https://bitbucket.org/mflemate/poc-infra-workflow.git

Within the AWS console, go to the CloudFormation service and select the option to create a stack. Select the template previously downloaded and upload it to the console.

stack 1

Enter the following parameters:

  • RepoCredentials: Enter the ARN of the Bitbucket credentials; if you do not have one, follow this Guide to generate it
  • Repository: Bitbucket repository URL
NOTE: Remember to follow the same repository structure shown in the "Overview" section.
stack details

The stack should be created successfully:

stack 2

The diagram of the “Stack” created in CloudFormation can be found below:

ci cd 12

PERFORM DEPLOY

1. Validate and complete the repository structure with the following files:

  • “buildspec.yml” file: This file is used by CodeBuild to create the artifact that will be stored in S3, including the necessary dependencies for the Node.js application.
version: 0.2

#env:
 #variables:
    # key: "value"
 #parameter-store:
    # key: "value"
phases:
 install:
   runtime-versions:
     nodejs: 10
   commands:
     - echo "Setting up NodeJS and Core..."
   #finally:
     # - command
 #pre_build:
   #commands:
     # - command
   #finally:
     # - command
 build:
   commands:
     - echo "Installing dependencies..."
     - cd nodeapp/
     - npm install
     - npm install express
     - echo "Build Done!"
   #finally:
     # - command
 #post_build:
   #commands:
     # - command
   #finally:
     # - command
     # - command
artifacts:
 files:
   - appspec.yml
   - 'nodeapp/*'
   - 'scripts/*'
   # - location
   # - name
 #discard-paths: yes
 #base-directory: 
#cache:
 #paths:
  • “appspec.yml” file: This file is used by CodeDeploy, which will execute three scripts during the deployment process inside the EC2 instance
version: 0.0
os: linux 
files:
  - source: /
    destination: /home/ubuntu/myapp
hooks:
  BeforeInstall:
    - location: scripts/before_install.sh
      timeout: 300
      runas: root
  AfterInstall:
    - location: scripts/after_install.sh
      timeout: 300
      runas: root
  ApplicationStart:
    - location: scripts/application_start.sh
      timeout: 300
      runas: root
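The hooks above run in a fixed lifecycle order. As a sketch (covering only the subset of the in-place lifecycle used in this example; the Install step itself is handled by the CodeDeploy agent):

```python
# Order in which CodeDeploy runs the hooks declared in appspec.yml above.
hooks = {
    "BeforeInstall": "scripts/before_install.sh",
    "AfterInstall": "scripts/after_install.sh",
    "ApplicationStart": "scripts/application_start.sh",
}

lifecycle = ["BeforeInstall", "Install", "AfterInstall", "ApplicationStart"]
plan = [(step, hooks.get(step, "(handled by the agent)")) for step in lifecycle]
for step, script in plan:
    print(step, "->", script)
```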

Following the structure of the repository, the scripts are stored in the “scripts” folder.

  • Script “before_install.sh”
#!/bin/bash
npm install forever -g
apt install ruby-commander -y
  • Script “after_install.sh”
#!/bin/bash
mkdir /home/ubuntu/myapp
cd /home/ubuntu/myapp
  • Script “application_start.sh”
#!/bin/bash
# Start app
forever stopall
forever start /home/ubuntu/myapp/nodeapp/main.js
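One detail worth verifying is that the artifact produced from the “artifacts” section of the buildspec keeps “appspec.yml” at the root of the bundle, since CodeDeploy looks for it there. A self-contained sketch of that check, using placeholder file contents:

```python
import io
import zipfile

# Build a tiny in-memory zip shaped like the CodeBuild artifact and check
# that appspec.yml sits at the root. File contents are placeholders.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("appspec.yml", "version: 0.0\nos: linux\n")
    zf.writestr("nodeapp/main.js", "// app code")
    zf.writestr("scripts/application_start.sh", "#!/bin/bash\n")

with zipfile.ZipFile(buf) as zf:
    names = zf.namelist()

print("appspec.yml" in names)  # True: CodeDeploy can find its instructions
```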

2. git push or merge to the “Master Branch”

Once the repository structure has been validated, perform a git push or merge to the “Master Branch” and validate the CI / CD flow.

ci cd 13

ci cd 14

Open any browser and enter the public IP of the EC2 instance specifying the port 3000 (for example: http://3.89.7.165:3000). The following output should be displayed.

hello server

Now, make a change to the application code, for example, change the message to "The deploy was successful!". Perform the git push or merge to the "Master Branch" again and refresh the browser.

As shown in the following image, the deploy was performed successfully following the CI / CD flow.

Conclusion

In this article we showed, step by step, how to create and validate a CI/CD infrastructure targeting an EC2 instance, including the creation of the CloudFormation template. In addition, with a CI/CD infrastructure, deploying code to production takes less effort and inspires more confidence: code is released immediately, and fault isolation is simpler and faster, which greatly improves teamwork, among many other advantages that can be adjusted to our needs.

 
