Automating workflows for AWS IoT Greengrass V2 components


Introduction

The AWS IoT Greengrass V2 Development Kit Command-Line Interface (GDK CLI) was introduced at AWS re:Invent 2021. With the GDK CLI you can easily create AWS IoT Greengrass V2 components, flexibly define recipes, and publish those components to AWS IoT Greengrass V2. However, whenever the AWS IoT Greengrass V2 component recipe changes, you typically have to provision it manually. For example, every new version of the component must be re-built and re-published, resulting in redundant tasks. Moreover, if the component is part of an automated workflow, the re-building and re-publishing work becomes an obstacle to overall development efforts.

To overcome these challenges, you can create an automated workflow using AWS CodePipeline together with AWS CodeCommit and AWS CodeBuild. The automated workflow builds and publishes the components whenever a new change to the source is detected. The solution presented in this blog demonstrates the workflow with an example.

The following image shows an outline of the AWS services used in the automated workflow. AWS CodeCommit can be replaced with other Git repositories such as GitHub or GitLab, which can be mirrored into AWS CodeCommit repositories.

1. Getting Started

This section highlights basic requirements such as setting up AWS Identity and Access Management (IAM) policies for the different services being used. IAM policies define the access granted to a resource. For example, AWS CodeBuild must have read/write access to AWS IoT Greengrass V2 in order to publish components.

Prerequisites

The following are requirements to proceed with the build solution:

1.1 AWS CodePipeline

AWS CodePipeline is used to create and manage a continuous delivery pipeline. You can use it to manage the processes by accessing AWS CodeCommit logs. Based on the changes pushed to AWS CodeCommit, the pipeline triggers AWS CodeBuild to run the build commands as specified. To store the build artifacts, you will need Amazon S3 access, which can be granted through the IAM policies.

1.2 AWS CodeCommit

AWS CodeCommit is the source control service used to host Git repositories. This can be achieved in a couple of different ways:

  1. Create a Git repository in AWS CodeCommit directly – there are no additional IAM policy requirements
  2. Mirror Git repositories currently hosted in GitLab or GitHub into AWS CodeCommit – you need to configure your GitLab or GitHub repository to mirror into AWS CodeCommit, or migrate the Git repository to AWS CodeCommit

1.3 AWS CodeBuild

AWS CodeBuild uses the source in AWS CodeCommit to build the project, so you must configure the default IAM policy to allow access to AWS CodeCommit in order to perform a git pull. Additionally, access to Amazon S3 is required to store build artifacts. This is optional, but it is good practice to store the artifacts for future access. To build and publish AWS IoT Greengrass V2 components, additional permissions must be added to list and create components:
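For illustration, a policy statement along the following lines could be attached to the CodeBuild service role. The exact action list shown here is an assumption for this example; scope the `Resource` down to specific components and buckets for production use:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "greengrass:CreateComponentVersion",
        "greengrass:ListComponentVersions",
        "s3:GetObject",
        "s3:PutObject"
      ],
      "Resource": "*"
    }
  ]
}
```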

1.4 AWS IoT Greengrass V2

Once the component is built and published to AWS IoT Greengrass V2, you will be able to access it from the AWS IoT Greengrass V2 console or CLI. AWS IoT Greengrass V2 deployments can then be made as required once the right component is published and available.

2. Managing Source and Build

To build and publish a component with GDK you can use Python and Bash scripts. This section demonstrates how a GDK Python sample can be used to build and publish a component.

2.1 GDK

Step 2.1.1: Using a GDK Python sample

For a Python component, use GDK to create a basic sample. The command will create the following files:

- README.md - A standard Readme file
- gdk-config.json - Used to define the GDK build parameters
- recipe.yaml - Used to define the component run processes and related parameters
- main.py - An example Python script that will run once the component is deployed
- src/ - Directory with a supporting script for main.py
- tests/ - Directory with test scripts

Commands to create a default Python-based component:

$ mkdir HelloWorldPython
$ cd HelloWorldPython/
$ gdk component init -l python -t HelloWorld

Step 2.1.2: Modifying the GDK Python sample

Next, modify the default main.py script and the src/greeter.py script as shown below, and add a sample run.sh Bash script. Currently, GDK provides samples for Python and Java. However, if your applications require running binaries, Bash scripts, or other terminal/CMD commands, you can use a run.sh Bash script. Hence, instead of running the main.py Python script directly, the run.sh Bash script can be used to execute it.

Here is an example of the modified main.py script:

import sys
import src.greeter as greeter

def main():
    args = sys.argv[1:]
    if len(args) == 2:
        print(greeter.get_greeting(args[0], args[1]))

if __name__ == "__main__":
    main()

Here is an example of the modified src/greeter.py script:

def get_greeting(msg1, msg2):
    """
    Returns greeting string

    Parameters
    ----------
        msg1 (string): msg1 to include in the greeting.
        msg2 (string): msg2 to include in the greeting.

    Returns
    -------
        string: Returns the greeting built from the messages
    """

    print('The message is {} and {}!'.format(msg1, msg2))
    return '{} {}!'.format(msg1, msg2)
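Before wiring the scripts into the pipeline, the logic can be checked locally. The following self-contained sketch re-declares the greeting function and the argument handling from the two scripts above (inlined here only so the snippet runs on its own):

```python
import sys

def get_greeting(msg1, msg2):
    # Mirrors src/greeter.py: print the combined message, return the greeting
    print('The message is {} and {}!'.format(msg1, msg2))
    return '{} {}!'.format(msg1, msg2)

def main(argv):
    # Mirrors main.py: act only when exactly two arguments are given
    if len(argv) == 2:
        print(get_greeting(argv[0], argv[1]))

if __name__ == "__main__":
    main(sys.argv[1:] or ['Hello', 'World'])  # demo arguments when none are passed
```

Run with no arguments, the script prints the demo messages and the greeting "Hello World!".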

Here is an example of what is contained in the run.sh script:

#!/bin/bash

print_usage() { printf "Usage: run.sh -a message1 -b message2\n"; }

while getopts a:b: flag; do
    case "${flag}" in
        a) message1=${OPTARG} ;;
        b) message2=${OPTARG} ;;
        *) print_usage
           exit 1 ;;
    esac
done

echo "Message #1 = $message1"
echo "Message #2 = $message2"

echo "Running main.py script ..."
python3 -u main.py "$message1" "$message2"

Here is an example of the updated gdk-config.json file:

{
  "component": {
    "com.example.HelloWorldPython": {
      "author": "<PLACEHOLDER_AUTHOR>",
      "version": "0.0.1",
      "build": {
        "build_system": "zip"
      },
      "publish": {
        "bucket": "<PLACEHOLDER FOR BUCKET>",
        "region": "<PLACEHOLDER FOR REGION>"
      }
    }
  },
  "gdk_version": "1.0.0"
}

Here is an example of the updated recipe.yaml file:

---
RecipeFormatVersion: "2020-01-25"
ComponentName: "{COMPONENT_NAME}"
ComponentVersion: "0.0.1"
ComponentDescription: "This is a simple Hello World component written in Python."
ComponentPublisher: "{COMPONENT_AUTHOR}"
ComponentConfiguration:
  DefaultConfiguration:
    configMessage1: "Hello"
    configMessage2: "World"
Manifests:
  - Platform:
      os: all
    Artifacts:
      - URI: "s3://BUCKET_NAME/COMPONENT_NAME/COMPONENT_VERSION/HelloWorldPythonComponent.zip"
        Unarchive: ZIP
    Lifecycle:
      Run: "/bin/bash {artifacts:decompressedPath}/HelloWorldPythonComponent/run.sh -a {configuration:/configMessage1} -b {configuration:/configMessage2}"

Add a buildspec.yml file that will be used by AWS CodeBuild to run commands for the pre-build, build, and post-build phases. Here is an example of a buildspec.yml file with the necessary commands:

version: 0.2

phases:
  install:
    commands:
      - apt-get update && apt-get install -y zip unzip build-essential wget git curl software-properties-common python3.7 python3-pip
      - curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip" && unzip awscliv2.zip && ./aws/install && rm awscliv2.zip
  build:
    commands:
      - python3 -m pip install -U git+https://github.com/aws-greengrass/[email protected]
      - export PATH=$PATH:~/.local/bin
      - CURRDIR=$(basename "$PWD")
      - cd ../ && mv $CURRDIR HelloWorldPythonComponent && cd HelloWorldPythonComponent
      - gdk component build
      - gdk component publish
      - mkdir package && cp -r greengrass-build package/. && cp -r zip-build package/.
      - pwd && ls -al && ls -al ..
artifacts:
  files:
    - package/**/*
  name: gg-component-$(date +%Y-%m-%d-%H-%M-%S).zip

2.2. AWS CodeCommit

To create the AWS CodeCommit repository, from the Developer Tools console, select CodeCommit and then Create repository. This will prompt for details such as Repository name, Tags, etc. Once created, you can push the code created previously.
The following image shows an example of an AWS CodeCommit repository with the files required for the GDK CLI component build and publish commands. It also contains the modified scripts, namely run.sh, main.py, src/greeter.py, recipe.yaml, gdk-config.json, and buildspec.yml.
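Assuming the repository has been created as described, pushing the local component project can look like the following sketch (the remote URL is a placeholder; use the clone URL shown in the CodeCommit console):

```shell
git init
git add run.sh main.py src/ tests/ recipe.yaml gdk-config.json buildspec.yml
git commit -m "Initial GDK component"
git remote add origin https://git-codecommit.<region>.amazonaws.com/v1/repos/<repository-name>
git push origin master
```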

2.3. AWS CodeBuild

The next step is to set up AWS CodeBuild to use the above AWS CodeCommit repository as a source and run the build commands provided in the buildspec.yml file. For this, select CodeBuild from Developer Tools and create a project. The process to set up AWS CodeBuild is as follows:

Step 2.3.1: Setting up the build environment

To set the build environment for AWS CodeBuild, use the Amazon Elastic Container Registry (ECR) service with Ubuntu 18.04: public.ecr.aws/ubuntu/ubuntu:18.04. The following shows how the build environment is set up:

Step 2.3.2: Selecting the source for the build

For the source, connect the AWS CodeCommit repository and point it to the right branch/tag/commit ID. In this example it is connected to the master branch. Select the IAM policies that were provisioned earlier:

Step 2.3.3: Selecting the commands to run the build

Next, define the buildspec that contains the commands to run the actual build. This buildspec is defined in the buildspec.yml file that is part of the source, so provide that file name here. Optionally, build commands can be entered here directly if you are not using a buildspec.yml file.

Step 2.3.4: Storing the build artifacts

In order to store the build artifacts, connect the right Amazon S3 bucket. Select Zip as an option to save the build artifacts as a compressed package in the Amazon S3 location:

2.4 Creating the Pipeline

To manage the artifacts, the GDK build, and the publishing of changes, you can create the build pipeline and automate the build processes.

Step 2.4.1: Selecting pipeline settings

From the Developer Tools, select CodePipeline and create a new pipeline. For the service role, select the role that was defined earlier.

Step 2.4.2: Add source stage

Next, choose the AWS CodeCommit repository and branch created earlier. Select Amazon CloudWatch Events in this section so that the pipeline starts whenever a change to the specified AWS CodeCommit repository is detected.

Step 2.4.3: Add build stage

Now, connect the AWS CodeBuild project in this stage, which will trigger the build from changes to the AWS CodeCommit source.

Step 2.4.4: Add deploy stage

If you are connecting the pipeline with AWS CodeDeploy, you can use this section to add that part. Skip the AWS CodeDeploy stage here, as it is not demonstrated in this example.

Step 2.4.5: Review the pipeline

Now that all the pieces are connected, you can create your pipeline. When invoked via Amazon CloudWatch Events, the pipeline triggers the build. The following image shows the flow defined in AWS CodePipeline. Here, the source is connected to the build; hence, the pipeline pulls from the source first and then runs the build commands specified in the AWS CodeBuild buildspec.yml.

3. Deploy the component

3.1. Check the logs of AWS CodePipeline

  • Once AWS CodePipeline runs successfully, the component is built and published.
  • To check the logs, go to the AWS CodeBuild project and select the build logs from the build history.
  • Upon checking the logs, you can verify that the component is saved in the Amazon S3 bucket and is also published to AWS IoT Greengrass V2 Components.

3.2. Checking the component in AWS IoT Greengrass V2

  • Once the component is available in AWS IoT Greengrass V2, it can be deployed to IoT things. You can do this by revising existing deployments to use the updated component, or by creating a new deployment for the IoT thing or thing groups.

  • Upon checking the component in the AWS IoT Greengrass V2 console, you can see details such as the 'Default configuration,' 'Lifecycle' details, and 'Artifacts' location in the Amazon S3 bucket, all of which are based on the recipe.yaml file.
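As an alternative to the console, a new deployment revision can also be created with the AWS CLI. This sketch assumes the component name and version from this example; the target thing-group ARN is a placeholder:

```shell
aws greengrassv2 create-deployment \
  --target-arn arn:aws:iot:<region>:<account-id>:thinggroup/<thing-group-name> \
  --components '{"com.example.HelloWorldPython": {"componentVersion": "0.0.1"}}'
```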

4. Cleanup

AWS CodePipeline is provisioned to listen to Amazon CloudWatch Events, so every small update to the AWS CodeCommit repository triggers the pipeline to build and publish components. The pipeline can therefore be stopped by selecting Stop execution.
This also prevents further updates to the build artifacts as well as the component artifacts in Amazon S3.
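For reference, a running execution can also be stopped from the CLI (the pipeline name and execution ID below are placeholders):

```shell
aws codepipeline stop-pipeline-execution \
  --pipeline-name <pipeline-name> \
  --pipeline-execution-id <execution-id> \
  --abandon
```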

5. Conclusion

AWS IoT Greengrass V2 services are often used in an automated framework where deployments of components are provisioned based on certain events. The GDK CLI adds flexibility for creating AWS IoT Greengrass V2 components using Python, Java, or Bash. However, instead of manually provisioning the build and publish tasks every time the component changes, automation is ideal. This build solution highlights the use of AWS CodePipeline for building and publishing AWS IoT Greengrass V2 components, reducing development effort as well as manual intervention. Furthermore, for continuous integration and continuous deployment (CI/CD), versioning is an important aspect that can be simplified and automated by this build solution.

To learn more about AWS IoT Greengrass V2, visit the Documentation and Development Tools. To get started with automated workflows, visit the Blog.

About the Authors

Romil Shah is an IoT Edge Data Scientist in AWS Professional Services. Romil has over six years of industry experience in computer vision, machine learning, and IoT edge devices. He is involved in helping customers optimize and deploy their machine learning models for edge devices.
Fabian Benitez-Quiroz is an IoT Edge Data Scientist in AWS Professional Services. He holds a PhD in Computer Vision and Pattern Recognition from The Ohio State University. Fabian is involved in helping customers run their machine learning models with low latency on IoT devices.
