Customer Success Story

Implementing a DevSecOps Pipeline in the Healthcare Sector

Taashee helps a multinational healthcare giant create an end-to-end AWS DevSecOps CI/CD pipeline with open-source SCA, SAST and DAST tools.

Overview

The client approached us to create a complete DevSecOps pipeline for their existing infrastructure, which would include continuous integration (CI), continuous delivery and deployment (CD), continuous testing, continuous logging and monitoring, auditing and governance, and operations. Our DevSecOps experts created an architecture on AWS that covered the aforementioned practices, including SCA (Software Composition Analysis), SAST (Static Application Security Testing), DAST (Dynamic Application Security Testing), and an aggregation of vulnerability findings.


About the Client

Our client is an established healthcare services provider headquartered in India, with global footprints in South Asia and the Middle East. Besides maintaining a massive hospital chain infrastructure across multiple cities, our client has also ventured heavily into related verticals like pharmacies, diagnostic centers, home care and online doctor consultations.

Client
Multinational healthcare services provider
Industry
Healthcare
Services
Design and implementation of an end-to-end AWS DevSecOps CI/CD pipeline

Client Issues

The client required an end-to-end DevSecOps solution for their existing infrastructure with a few special features. The DevSecOps pipeline had to include CI/CD, continuous testing, continuous logging and monitoring, auditing and governance, and operations. It had to be integrated with various open-source scanning tools, such as SonarQube, PHPStan, and OWASP Zap for SAST and DAST analysis. The vulnerability findings had to be aggregated in the Security Hub as a single pane of glass. The security of the pipeline had to be implemented using AWS cloud native services and the DevSecOps pipeline was to be provided as code using AWS CloudFormation.

Our Solution

Taashee’s DevSecOps experts first appraised the client’s unique requirements and created a solution from scratch using AWS services and other open-source tools as described below.

Services and tools

In this section, we discuss the various AWS services and third-party tools used in this solution.

CI/CD services

For CI/CD, we use the following AWS services:

  • AWS CodeBuild – A fully managed continuous integration service that compiles source code, runs tests and produces software packages that are ready to deploy.
  • AWS CodeCommit – A fully managed source control service that hosts secure Git-based repositories.
  • AWS CodeDeploy – A fully managed deployment service that automates software deployments to a variety of compute services such as Amazon Elastic Compute Cloud (Amazon EC2), AWS Fargate, AWS Lambda, and on-premises servers.
  • AWS CodePipeline – A fully managed continuous delivery service that helps automate release pipelines for fast and reliable application and infrastructure updates.
  • AWS Lambda – A service that helps run code without provisioning or managing servers. Consumers pay only for their compute time.
  • Amazon Simple Notification Service – Amazon SNS is a fully managed messaging service for both application-to-application (A2A) and application-to-person (A2P) communication.
  • Amazon Simple Storage Service – Amazon S3 is storage for the internet. One can use Amazon S3 to store and retrieve any amount of data at any time, from anywhere on the web.
  • AWS Systems Manager Parameter Store – Provides secure, hierarchical storage for configuration data and secrets, such as the scanning-tool API tokens used by this pipeline.

Continuous testing tools

The following are open-source scanning tools that were integrated into the pipeline to meet the client’s specific requirements.

  • OWASP Dependency-Check – A Software Composition Analysis (SCA) tool that attempts to detect publicly disclosed vulnerabilities contained within a project’s dependencies.
  • SonarQube (SAST) – Catches bugs and vulnerabilities in the app, with thousands of automated Static Code Analysis rules.
  • PHPStan (SAST) – Focuses on finding errors in the code without actually running it.
  • OWASP Zap (DAST) – Helps to automatically find security vulnerabilities in web applications while they are being developed and tested.
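These scanners are typically wired into the pipeline through a CodeBuild buildspec. The following is a minimal, illustrative sketch rather than the client's actual build file; the report paths and the /devsecops/sonar-token Parameter Store key are hypothetical, and SONAR_HOST_URL is assumed to be set on the CodeBuild project:

```yaml
version: 0.2
env:
  parameter-store:
    SONAR_TOKEN: /devsecops/sonar-token   # hypothetical parameter name
phases:
  build:
    commands:
      # SCA: check third-party dependencies for publicly disclosed CVEs
      - dependency-check.sh --scan . --format JSON --out reports/
      # SAST: static analysis of the application code
      - sonar-scanner -Dsonar.host.url=$SONAR_HOST_URL -Dsonar.login=$SONAR_TOKEN
artifacts:
  files:
    - reports/**/*
```

A separate CodeBuild project would run the DAST stage after the staging deployment, for example with `zap-baseline.py -t <application URL> -J zap-report.json`.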

Pipeline Architecture
The following diagram shows the architecture of the solution.

 

Auditing and governance services

The following AWS auditing and governance services were used for this implementation:

  • AWS CloudTrail – Enables governance, compliance, operational auditing, and risk auditing of the AWS account.
  • AWS Identity and Access Management – Enables one to manage access to AWS services and resources securely. With IAM, developers can create and manage AWS users and groups, and use permissions to allow and deny their access to AWS resources.
  • AWS Config – Allows assessment, audit, and evaluation of AWS resource configurations.

Operations services

The following AWS operations services were used for this implementation:

  • AWS Security Hub – Gives a comprehensive view of the security alerts and security posture across AWS accounts. This solution used Security Hub to aggregate all the vulnerability findings.
  • AWS CloudFormation – Provides an easy way to model a collection of related AWS and third-party resources, provision them quickly and consistently, and manage them throughout their lifecycles, by treating infrastructure as code.
  • AWS Systems Manager Parameter Store – Provides secure, hierarchical storage for configuration data management and secrets management. One can store data such as passwords, database strings, Amazon Machine Image (AMI) IDs, and license codes as parameter values.
  • AWS Elastic Beanstalk – An easy-to-use service for deploying and scaling web applications and services developed with Java, .NET, PHP, Node.js, Python, Ruby, Go, and Docker on familiar servers such as Apache, Nginx, Passenger, and IIS. This solution used Elastic Beanstalk to deploy the LAMP stack with WordPress and Amazon Aurora MySQL.

Continuous logging and monitoring services

The following AWS services were used for continuous logging and monitoring:

  • Amazon CloudWatch Logs – Allows monitoring, storing, and accessing log files from EC2 instances, AWS CloudTrail, Amazon Route 53, and other sources.
  • Amazon CloudWatch Events – Delivers a near-real-time stream of system events that describe changes in AWS resources.

Fig. AWS DevSecOps CI/CD pipeline architecture

The main workflow was as follows:

  1. When a user commits code to a CodeCommit repository, a CloudWatch event is generated, which triggers CodePipeline.
  2. CodeBuild packages the build and uploads the artefacts to an S3 bucket. CodeBuild retrieves the authentication information (for example, scanning tool tokens) from Parameter Store to initiate the scanning.
  3. CodeBuild scans the code with an SCA tool (OWASP Dependency-Check) and SAST tool (SonarQube or PHPStan).
  4. If there are any vulnerabilities either from SCA analysis or SAST analysis, CodeBuild invokes the Lambda function. The function parses the results into AWS Security Finding Format (ASFF) and posts it to the Security Hub. Security Hub helps aggregate and view all the vulnerability findings in one place. The Lambda function also uploads the scanning results to an S3 bucket.
  5. If there are no vulnerabilities, CodeDeploy deploys the code to the staging Elastic Beanstalk environment.
  6. After the deployment succeeds, CodeBuild triggers the DAST scanning with the OWASP ZAP tool.
  7. If there are any vulnerabilities, CodeBuild invokes the Lambda function, which parses the results into ASFF and posts it to Security Hub. The function also uploads the scanning results to an S3 bucket.
  8. If there are no vulnerabilities, the approval stage is triggered, and an email is sent to the approver for action.
  9. After approval, CodeDeploy deploys the code to the production Elastic Beanstalk environment.
  10. During the pipeline run, CloudWatch Events captures the build state changes and sends email notifications to subscribed users through SNS notifications.
  11. CloudTrail tracks the API calls and sends notifications on critical events on the pipeline and CodeBuild projects, such as UpdatePipeline, DeletePipeline, CreateProject, and DeleteProject for auditing purposes.
  12. AWS Config tracks all the configuration changes of AWS services. The following AWS Config rules were added to this pipeline as security best practices:
  • CODEBUILD_PROJECT_ENVVAR_AWSCRED_CHECK – Checks whether the project contains the environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY. The rule is NON_COMPLIANT when the project environment variables contain plaintext credentials.
  • CLOUD_TRAIL_LOG_FILE_VALIDATION_ENABLED – Checks whether CloudTrail creates a signed digest file with logs. AWS recommends that file validation be enabled on all trails. The rule is NON_COMPLIANT if validation is not enabled.
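As an illustration of how steps 4 and 7 can post findings, the sketch below builds a minimal ASFF record and sends it through the Security Hub BatchImportFindings API. This is a hedged sketch, not the client's actual Lambda code; the to_asff and post_findings helper names are ours.

```python
from datetime import datetime, timezone

def to_asff(title, description, normalized_severity, account_id, region,
            generator_id="scanner"):
    """Build a minimal AWS Security Finding Format (ASFF) record.

    Only a core subset of ASFF fields is filled in; a real integration
    would also populate resource details and remediation guidance.
    """
    now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
    return {
        "SchemaVersion": "2018-10-08",
        "Id": f"{generator_id}/{title}",
        # Default product ARN for findings imported by the account itself
        "ProductArn": f"arn:aws:securityhub:{region}:{account_id}:product/{account_id}/default",
        "GeneratorId": generator_id,
        "AwsAccountId": account_id,
        "Types": ["Software and Configuration Checks/Vulnerabilities"],
        "CreatedAt": now,
        "UpdatedAt": now,
        "Severity": {"Normalized": normalized_severity},
        "Title": title,
        "Description": description,
        "Resources": [{"Type": "Other", "Id": generator_id}],
    }

def post_findings(findings):
    """Send findings to Security Hub in batches of up to 100, the
    BatchImportFindings limit. Requires boto3 and AWS credentials."""
    import boto3  # imported lazily so to_asff stays testable offline
    client = boto3.client("securityhub")
    for i in range(0, len(findings), 100):
        client.batch_import_findings(Findings=findings[i:i + 100])
```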

Security of the pipeline is implemented by using IAM roles and S3 bucket policies to restrict access to pipeline resources. Pipeline data at rest and in transit is protected using encryption and SSL secure transport. We used Parameter Store to store sensitive information such as API tokens and passwords.

Security in the pipeline is implemented by performing the SCA, SAST, and DAST security checks.
As a best practice, encryption was enabled for the code and artefacts, both at rest and in transit.

Deploying the pipeline
To deploy the pipeline, we implemented the following steps:

  1. Download the CloudFormation template and pipeline code from the GitHub repo.
  2. Log in to the AWS account.
  3. On the CloudFormation console, choose Create Stack.
  4. Choose the CloudFormation pipeline template.
  5. Choose Next.
  6. Provide the stack parameters:
  • Under Code, provide code details, such as the repository name and the branch to trigger the pipeline.
  • Under SAST, choose the SAST tool for code analysis, and enter the API token and the SAST tool URL.
  • Under DAST, choose the DAST tool for dynamic testing and enter the API token, DAST tool URL, and the application URL to run the scan.
  • Under Lambda functions, enter the Lambda function S3 bucket name, filename, and handler name.
  • Under STG Elastic Beanstalk Environment and PRD Elastic Beanstalk Environment, enter the Elastic Beanstalk environment and application details for staging and production to which this pipeline deploys the application code.
  • Under General, enter the email addresses to receive notifications for approvals and pipeline status changes.

Fig. CloudFormation template deployment

Note: The CloudFormation template provided in this solution was formatted for AWS GovCloud.

Running the pipeline

To trigger the pipeline, one has to commit changes to the application repository files. That generates a CloudWatch event and triggers the pipeline. CodeBuild scans the code and, if there are any vulnerabilities, invokes the Lambda function to parse the results and post them to Security Hub.

When posting the vulnerability finding information to Security Hub, we need to provide a vulnerability severity level. Based on the provided severity value, Security Hub assigns the label as follows. The client could adjust the severity levels in the code based on their requirements.

  • 0 – INFORMATIONAL
  • 1–39 – LOW
  • 40–69 – MEDIUM
  • 70–89 – HIGH
  • 90–100 – CRITICAL
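This mapping can be expressed directly in the Lambda code; the helper below is a sketch of the thresholds listed above (the function name is ours):

```python
def security_hub_label(normalized):
    """Return the Security Hub severity label for a 0-100
    normalized severity value, per the thresholds above."""
    if not 0 <= normalized <= 100:
        raise ValueError("normalized severity must be between 0 and 100")
    if normalized == 0:
        return "INFORMATIONAL"
    if normalized <= 39:
        return "LOW"
    if normalized <= 69:
        return "MEDIUM"
    if normalized <= 89:
        return "HIGH"
    return "CRITICAL"
```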

The following screenshot shows the progression of the pipeline.


Fig. CodePipeline stages

SCA and SAST scanning

In our architecture, CodeBuild triggers the SCA and SAST scanning in parallel.

Scanning with OWASP Dependency-Check (SCA)

The following is the code snippet from the Lambda function, where the SCA analysis results are parsed and posted to Security Hub. Based on the results, the equivalent Security Hub severity level (normalized_severity) is assigned.

Fig. Lambda code snippet for OWASP Dependency-check
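The snippet itself appears only as an image; as a hedged reconstruction of the idea, assuming Dependency-Check's JSON report layout (a dependencies array whose entries carry vulnerabilities with CVSS scores), the parsing might look like this:

```python
def parse_dependency_check(report):
    """Flatten a Dependency-Check JSON report into findings.

    CVSS base scores run 0-10; multiplying by 10 gives the 0-100
    normalized severity that Security Hub expects.
    """
    findings = []
    for dep in report.get("dependencies", []):
        for vuln in dep.get("vulnerabilities", []):
            score = vuln.get("cvssv3", {}).get("baseScore")
            if score is None:  # fall back to CVSSv2 for older CVEs
                score = vuln.get("cvssv2", {}).get("score", 0)
            findings.append({
                "title": vuln.get("name", "unknown"),
                "dependency": dep.get("fileName", "unknown"),
                "normalized_severity": int(float(score) * 10),
            })
    return findings
```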

The results can be seen in the Security Hub, as in the following screenshot.


Fig. SecurityHub report from OWASP Dependency-check scanning

Scanning with SonarQube (SAST)

The following is the code snippet from the Lambda function, where the SonarQube code analysis results are parsed and posted to Security Hub. Based on SonarQube results, the equivalent Security Hub severity level (normalized_severity) is assigned.


Fig. Lambda code snippet for SonarQube
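The snippet appears only as an image; a sketch of the same idea follows. SonarQube grades issues as BLOCKER, CRITICAL, MAJOR, MINOR, or INFO, and the mapping to 0-100 values below is an illustrative policy choice, not the client's actual thresholds:

```python
# Illustrative mapping from SonarQube issue severities to the
# 0-100 normalized scale used by Security Hub.
SONAR_SEVERITY = {
    "BLOCKER": 90,
    "CRITICAL": 80,
    "MAJOR": 60,
    "MINOR": 30,
    "INFO": 0,
}

def parse_sonarqube(issues):
    """Convert issues from SonarQube's /api/issues/search response
    into (message, normalized_severity) pairs."""
    return [
        (issue.get("message", ""), SONAR_SEVERITY.get(issue.get("severity"), 0))
        for issue in issues
    ]
```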

The following screenshot shows the results in Security Hub.

Fig. SecurityHub report from SonarQube scanning

Scanning with PHPStan (SAST)

The following is the code snippet from the Lambda function, where the PHPStan code analysis results are parsed and posted to Security Hub.

Fig. Lambda code snippet for PHPStan
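Again the snippet appears only as an image; the sketch below assumes PHPStan's --error-format=json layout (a files map of per-file messages). PHPStan does not grade its errors, so a single normalized severity is assigned; the default of 40 (MEDIUM) here is our assumption, not the client's setting:

```python
def parse_phpstan(report, normalized_severity=40):
    """Flatten a PHPStan JSON report into findings, assigning one
    fixed normalized severity to every reported error."""
    findings = []
    for path, details in report.get("files", {}).items():
        for msg in details.get("messages", []):
            findings.append({
                "title": f"{path}:{msg.get('line', 0)}",
                "description": msg.get("message", ""),
                "normalized_severity": normalized_severity,
            })
    return findings
```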

The following screenshot shows the results in Security Hub.

Fig. SecurityHub report from PHPStan scanning

DAST scanning
In our architecture, CodeBuild triggers the DAST scanning with the configured DAST tool (OWASP Zap).
If there are no vulnerabilities in the DAST scan, the pipeline proceeds to the manual approval stage and an email is sent to the approver. The approver can review and approve or reject the deployment. If approved, the pipeline moves to the next stage and deploys the application to the provided Elastic Beanstalk environment.

Scanning with OWASP Zap

After the deployment is successful, CodeBuild initiates the DAST scanning. When scanning is complete, if there are any vulnerabilities, it invokes the Lambda function, as in the SAST analysis. The function parses the results and posts them to Security Hub. The following is the code snippet of the Lambda function.


Fig. Lambda code snippet for OWASP-Zap
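As with the other scanners, the snippet appears only as an image; the sketch below assumes the ZAP JSON report layout (site entries carrying alerts with a riskcode of 0-3) and uses an illustrative risk-code-to-severity mapping of our own choosing:

```python
# Illustrative mapping from ZAP risk codes (0=informational, 1=low,
# 2=medium, 3=high) to the 0-100 normalized scale.
ZAP_RISK = {"0": 0, "1": 30, "2": 50, "3": 80}

def parse_zap(report):
    """Flatten a ZAP JSON report (e.g. zap-baseline.py -J) into findings."""
    findings = []
    for site in report.get("site", []):
        for alert in site.get("alerts", []):
            findings.append({
                "title": alert.get("alert", "unknown"),
                "normalized_severity": ZAP_RISK.get(str(alert.get("riskcode", "0")), 0),
            })
    return findings
```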

The following screenshot shows the results in Security Hub.


Fig. SecurityHub report from OWASP-Zap scanning

Aggregation of vulnerability findings in Security Hub provided opportunities to automate the remediation. For example, based on the vulnerability finding, the client could trigger a Lambda function to take the needed remediation action. This also reduced the burden on operations and security teams because they could now address the vulnerabilities from a single pane of glass instead of logging into multiple tool dashboards.


    Our Unique Features

    Taashee’s technical team helps organizations that require increased developer productivity, higher quality applications, and lower maintenance costs. Taashee programmers specialize in multiple technologies with add-on features and advanced support. The biggest advantage for customers approaching Taashee is that they do not need to approach multiple vendors to implement different technologies.


    Client Testimonial

    Taashee's unique vision of combining enterprise and open-source offerings to create our entire DevSecOps pipeline has truly benefitted our day-to-day operations without being too hard on the pocket.
    Head of IT
    Healthcare Company, India
