Forked from sapessi/README.md
Created August 2, 2019

Continuous deployment of React websites to Amazon S3
This sample includes a continuous deployment pipeline for websites built with React. We use AWS CodePipeline, CodeBuild, and SAM to deploy the application. To deploy the application to S3 using SAM, we use a custom CloudFormation resource.

Files included

  • buildspec.yml: YAML configuration for CodeBuild; this file should be in the root of your code repository
  • configure.js: Script executed in the build step to generate a config.json file for the application; this is used to include values exported by other CloudFormation stacks (separate services of the same application)
  • index.js: Custom CloudFormation resource that publishes the website to an S3 bucket. As you can see from the buildspec and SAM template, this function is located in an s3-deployment-custom-resource sub-folder of the repo
  • app-sam.yaml: Serverless Application Model YAML file. This configures the S3 bucket and the custom resource
  • serverless-pipeline.yaml: The pipeline template to deploy the whole caboodle. The pipeline is currently configured to pick up the source from GitHub
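
Putting the file descriptions together, the repository layout looks roughly like this (the exact tree is an assumption based on the paths mentioned in this gist):

```
.
├── buildspec.yml
├── configure.js
├── app-sam.yaml
├── src/                              # React sources; configure.js writes src/config.json here
└── s3-deployment-custom-resource/
    └── index.js                      # custom CloudFormation resource
```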

The pipeline workflow includes the following stages:

Source

  • Pick up source files from the git repository; the source includes the unmodified React code

Beta

CodeBuild

Using a CodeBuild container (aws/codebuild/nodejs:6.3.1) we create the deployment packages for the application:

  • Pull down all dependencies with npm for the website and the custom CloudFormation resource
  • Run the configure.js script. This script reads output values from other CloudFormation stacks and generates a config.json file used by the React app; an example value could be the Cognito User Pool Id. I use the stage name in the stack name; the stage is passed to the build as an environment variable of the CodeBuild container, which is why we have one build project per stage of the pipeline
  • Run the npm run build command
  • Run the aws cloudformation package command to prepare our SAM template for deployment
  • Zip up the build directory from the website
  • Outputs from the container are the processed SAM template and the zip file with the website content

The buildspec.yml file included in this gist contains all of the commands above. I've also included a sample of the configure.js script (the code needs a lot of cleanup).

CreateChangeSet

CloudFormation step that uses our SAM template to start the deployment of the website

  • This step creates the CloudFormation ChangeSet required to deploy our SAM template.
  • The template receives two additional parameters: The source artifact bucket and source artifact key in the bucket. These parameters are needed because our custom CloudFormation resource will read the CodeBuild output file, extract the zipped contents from the React build folder, and copy them to an S3 bucket

To override the parameters passed to CloudFormation we use the ParameterOverrides property in CodePipeline with the following JSON:

{
  "SourceBucket" : { "Fn::GetArtifactAtt" : ["BetaBuiltZip", "BucketName"]},
  "SourceArtifact" : { "Fn::GetArtifactAtt" : ["BetaBuiltZip", "ObjectKey"]}
}
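
CodePipeline resolves Fn::GetArtifactAtt against the named output artifact at deployment time, so CloudFormation receives plain parameter values. Conceptually it works like this toy resolver (the bucket and key are made up for illustration):

```javascript
'use strict'

// The ParameterOverrides value is a JSON string embedded in the pipeline
// template; CodePipeline resolves Fn::GetArtifactAtt before invoking
// CloudFormation.
const overrides = JSON.parse(
  '{ "SourceBucket" : { "Fn::GetArtifactAtt" : ["BetaBuiltZip", "BucketName"]},' +
  '  "SourceArtifact" : { "Fn::GetArtifactAtt" : ["BetaBuiltZip", "ObjectKey"]} }'
)

// A stand-in for what CodePipeline does internally: look up the named
// artifact's S3 location and substitute the requested attribute.
const artifacts = {
  BetaBuiltZip: { BucketName: 'my-artifact-bucket', ObjectKey: 'builds/abc123.zip' }
}
const resolved = {}
for (const [param, value] of Object.entries(overrides)) {
  const [artifactName, attribute] = value['Fn::GetArtifactAtt']
  resolved[param] = artifacts[artifactName][attribute]
}
console.log(resolved)
// { SourceBucket: 'my-artifact-bucket', SourceArtifact: 'builds/abc123.zip' }
```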

ExecuteChangeSet

This step executes the change set we created in the previous step, deploying the SAM template

Gamma

Rinse and repeat. For gamma and prod we simply repeat the steps above.

app-sam.yaml

AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Parameters:
  SourceBucket:
    Type: String
    Description: S3 bucket name for the CodeBuild artifact
  SourceArtifact:
    Type: String
    Description: S3 object key for the CodeBuild artifact
Resources:
  WebsiteBucket:
    Type: AWS::S3::Bucket
    Properties:
      AccessControl: PublicRead
      WebsiteConfiguration:
        IndexDocument: index.html
        ErrorDocument: error.html
  WebsiteBucketPolicy:
    Type: AWS::S3::BucketPolicy
    Properties:
      Bucket: !Ref WebsiteBucket
      PolicyDocument:
        Statement:
          - Effect: Allow
            Principal: "*"
            Action:
              - s3:GetObject
            Resource: !Sub 'arn:aws:s3:::${WebsiteBucket}/*'
  WebsiteDeployment:
    Type: Custom::WebsiteDeployment
    Properties:
      ServiceToken: !GetAtt DeploymentCustomResourceLambda.Arn
      Options:
        SourceBucket: !Ref SourceBucket
        SourceArtifact: !Ref SourceArtifact
        DestinationBucket: !Ref WebsiteBucket
  ##########################################
  #     Custom resources for S3 Upload     #
  ##########################################
  DeploymentCustomResourceLambdaExecutionRole:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:
        Version: '2012-10-17'
        Statement:
          Effect: Allow
          Principal:
            Service: lambda.amazonaws.com
          Action: sts:AssumeRole
      Path: '/'
      Policies:
        - PolicyName: root
          PolicyDocument:
            Version: '2012-10-17'
            Statement:
              - Effect: Allow
                Action:
                  - logs:CreateLogGroup
                  - logs:CreateLogStream
                  - logs:PutLogEvents
                Resource: arn:aws:logs:*:*:*
              - Effect: Allow
                Action:
                  - s3:GetObject
                Resource: !Sub 'arn:aws:s3:::${SourceBucket}/${SourceArtifact}'
              - Effect: Allow
                Action:
                  - s3:PutObject
                Resource: !Sub 'arn:aws:s3:::${WebsiteBucket}/*'
  DeploymentCustomResourceLambda:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: ./s3-deployment-custom-resource
      Handler: index.handler
      MemorySize: 128
      Role: !GetAtt DeploymentCustomResourceLambdaExecutionRole.Arn
      Runtime: nodejs4.3
      Timeout: 300
  ############################################
  #   / Custom resources for S3 Upload       #
  ############################################
Outputs:
  WebsiteURL:
    Value: !GetAtt WebsiteBucket.WebsiteURL
    Description: URL for the website hosted on S3
buildspec.yml

version: 0.1
phases:
  install:
    commands:
      - cd $CODEBUILD_SRC_DIR; npm install
      - cd $CODEBUILD_SRC_DIR/s3-deployment-custom-resource; npm install
  pre_build:
    commands:
      - cd $CODEBUILD_SRC_DIR; node configure.js $STAGE
  build:
    commands:
      - cd $CODEBUILD_SRC_DIR; npm run build
  post_build:
    commands:
      - cd $CODEBUILD_SRC_DIR/build; zip -r ../package.zip *
      - aws cloudformation package --template-file app-sam.yaml --s3-bucket $BUILD_OUTPUT_BUCKET --output-template-file app-output_sam.yaml
artifacts:
  files:
    - app-output_sam.yaml
    - package.zip
  discard-paths: yes
configure.js

'use strict'

const fs = require('fs')
const AWS = require('aws-sdk')

const stage = process.argv[2]
const STACKS = [
  {
    "StackName": "MyService-Stack-" + stage,
    "ConfigName": "MY_SERVICE",
    "Vars": [ "ApiUrl", "CognitoUserPoolId", "CognitoUserPoolClientId" ]
  }
]

const cf = new AWS.CloudFormation()
let configData = {}

for (let s = 0; s < STACKS.length; s++) {
  let config = STACKS[s]
  if (!configData[config.ConfigName]) {
    configData[config.ConfigName] = {}
  }
  console.log("describe stack")
  let isLast = (s == STACKS.length - 1)
  cf.describeStacks({ StackName: config.StackName }, function (err, data) {
    if (err) {
      console.log(`could not describe stack ${config.StackName}`)
      stop(1)
    }
    if (data.Stacks.length == 0) {
      console.log(`No stack ${config.StackName} found`)
      stop(1)
    }
    console.log("Looping over outputs")
    console.log(JSON.stringify(data.Stacks))
    let stackDetails = data.Stacks[0]
    for (let o = 0; o < stackDetails.Outputs.length; o++) {
      let curOutput = stackDetails.Outputs[o]
      console.log(`looking at var ${curOutput.OutputKey}`)
      for (let v = 0; v < config.Vars.length; v++) {
        if (curOutput.OutputKey == config.Vars[v]) {
          configData[config.ConfigName][curOutput.OutputKey] = curOutput.OutputValue
        }
      }
    }
    if (isLast) {
      console.log("Writing data")
      let configFilePath = __dirname + "/src/config.json"
      fs.writeFileSync(configFilePath, JSON.stringify(configData))
    }
  })
}

function stop(code) {
  process.exit(code)
}
s3-deployment-custom-resource/index.js

'use strict'

const https = require('https')
const url = require('url')
const fs = require('fs')
const AWS = require('aws-sdk')
const AdmZip = require('adm-zip')
const async = require('async')
const mime = require('mime')

const constants = {
  SUCCESS: 'SUCCESS',
  FAILED: 'FAILED',
  UPDATE: 'Update',
  CREATE: 'Create',
  DELETE: 'Delete'
}

const s3 = new AWS.S3({ "signatureVersion": "v4" })

exports.handler = (event, context, callback) => {
  console.log(event)
  const requestType = event.RequestType
  const resourceOptions = requestType === constants.DELETE ? {} : event.ResourceProperties.Options

  if (event.LogicalResourceId != "WebsiteDeployment") {
    return sendCloudFormationResponse(constants.FAILED, { message: `Invalid LogicalResourceId: ${event.LogicalResourceId}` })
  }

  switch (requestType) {
    case constants.CREATE:
    case constants.UPDATE:
      return uploadArtifacts(resourceOptions)
    case constants.DELETE:
      return cleanBucket(event.PhysicalResourceId)
    default:
      return sendCloudFormationResponse(constants.FAILED, { message: `Invalid request type ${requestType}` })
  }

  function cleanBucket(resourceId) {
    if (!resourceId || resourceId == "") {
      return sendCloudFormationResponse(constants.FAILED, { message: `Invalid physical resource id: ${resourceId}` })
    }
    const bucketName = resourceId.split("::")[1]
    s3.listObjects({ Bucket: bucketName }, function (err, data) {
      if (err) {
        return sendCloudFormationResponse(constants.FAILED, { message: `Could not list bucket objects: ${err}` })
      }
      let items = data.Contents
      for (let i = 0; i < items.length; i += 1) {
        let deleteParams = { Bucket: bucketName, Key: items[i].Key }
        s3.deleteObject(deleteParams, function (err, data) {
          if (err) {
            return sendCloudFormationResponse(constants.FAILED, { message: `Could not delete object: ${items[i].Key}` })
          }
        })
      }
    })
    return sendCloudFormationResponse(constants.SUCCESS, { message: 'OK' }, resourceId)
  }

  function uploadArtifacts(resourceOptions) {
    if (!resourceOptions || !resourceOptions["SourceBucket"] ||
        !resourceOptions["SourceArtifact"] || !resourceOptions["DestinationBucket"]) {
      return sendCloudFormationResponse(constants.FAILED, {
        message: 'Missing required options: SourceBucket, SourceArtifact, DestinationBucket'
      })
    }
    const sourceBucket = resourceOptions.SourceBucket
    const sourceArtifact = resourceOptions.SourceArtifact
    const destinationBucket = resourceOptions.DestinationBucket
    const physicalResourceId = 'Deployment::' + resourceOptions.DestinationBucket
    const tmpSourceArtifact = '/tmp/artifact.zip'
    const tmpPackageZip = '/tmp/package.zip'

    // get source artifact
    s3.getObject({ Bucket: sourceBucket, Key: sourceArtifact }, function (err, data) {
      if (err) {
        return sendCloudFormationResponse(constants.FAILED, { message: `Could not fetch artifact: ${sourceBucket}/${sourceArtifact}: ${err}` })
      }
      try {
        fs.writeFileSync(tmpSourceArtifact, data.Body, { encoding: 'binary' })
      } catch (ex) {
        return sendCloudFormationResponse(constants.FAILED, { message: `Could not save artifact to disk: ${ex}` })
      }
      let artifactZip = new AdmZip(tmpSourceArtifact)
      let packageFound = false
      let zipEntries = artifactZip.getEntries()
      zipEntries.forEach(function (zipEntry) {
        if (zipEntry.entryName == "package.zip") {
          console.log("Found package.zip file")
          packageFound = true
          try {
            artifactZip.extractEntryTo(zipEntry, '/tmp', true, true)
          } catch (ex) {
            return sendCloudFormationResponse(constants.FAILED, { message: `Could not save package to disk: ${ex}` })
          }
        }
      })
      if (!packageFound) {
        return sendCloudFormationResponse(constants.FAILED, { message: 'Could not find package.zip in artifact' })
      }
      const deploymentDir = '/tmp/dist'
      if (fs.existsSync(deploymentDir)) {
        deleteFolderRecursive(deploymentDir)
      }
      fs.mkdirSync(deploymentDir)
      let packageZip = new AdmZip(tmpPackageZip)
      let packageEntries = packageZip.getEntries()
      let asyncTasks = []
      packageEntries.forEach(function (entry) {
        console.log("Processing entry " + entry.entryName)
        if (entry.isDirectory) {
          return
        }
        asyncTasks.push(function (callback) {
          packageZip.extractEntryTo(entry, deploymentDir, true, true)
          let fileName = deploymentDir + "/" + entry.entryName
          let fileData = fs.readFileSync(fileName)
          let s3FileProperties = {
            Bucket: destinationBucket,
            Key: entry.entryName,
            ContentLength: fileData.length,
            Body: fileData,
            ContentType: mime.lookup(fileName)
          }
          s3.putObject(s3FileProperties, function (err, data) {
            if (err)
              callback(err, entry.entryName)
            else
              callback(null, data.Key)
          })
        })
      })
      async.parallel(asyncTasks, function (err, result) {
        if (err)
          return sendCloudFormationResponse(constants.FAILED, { message: `Error while uploading ${result} to destination bucket: ${err}` })
        else
          return sendCloudFormationResponse(constants.SUCCESS, { message: 'OK' }, physicalResourceId)
      })
    })

    function deleteFolderRecursive(path) {
      if (fs.existsSync(path)) {
        fs.readdirSync(path).forEach(function (file, index) {
          var curPath = path + "/" + file
          if (fs.lstatSync(curPath).isDirectory()) { // recurse
            deleteFolderRecursive(curPath)
          } else { // delete file
            fs.unlinkSync(curPath)
          }
        })
        fs.rmdirSync(path)
      }
    }
  }

  function sendCloudFormationResponse(responseStatus, responseData, physicalResourceId) {
    const responseBody = JSON.stringify({
      Status: responseStatus,
      Reason: `See the details in CloudWatch Log Stream: ${context.logStreamName}`,
      PhysicalResourceId: physicalResourceId || context.logStreamName,
      StackId: event.StackId,
      RequestId: event.RequestId,
      LogicalResourceId: event.LogicalResourceId,
      Data: responseData
    })
    console.log(`Response body:
${responseBody}`)
    const parsedUrl = url.parse(event.ResponseURL)
    const requestOptions = {
      hostname: parsedUrl.hostname,
      port: 443,
      path: parsedUrl.path,
      method: 'PUT',
      headers: {
        'content-type': '',
        'content-length': responseBody.length
      }
    }
    return new Promise((resolve, reject) => {
      const request = https.request(requestOptions, resolve)
      request.on('error', e => reject(`http request error: ${e}`))
      request.write(responseBody)
      request.end()
    })
      .then(() => callback(responseStatus === constants.FAILED ? responseStatus : null, responseData))
      .catch(callback)
  }
}
serverless-pipeline.yaml

---
AWSTemplateFormatVersion: 2010-09-09
Parameters:
  GitHubToken:
    Description: "GitHub OAuth token"
    Type: String
  RepositoryName:
    Default: frontend
    Description: "GitHub repository name"
    Type: String
  RepositoryOwnerName:
    Default: github_user
    Description: "GitHub user"
    Type: String
  ServiceName:
    Default: Frontend
    Description: "Name for the service, used in the Lambda function and pipeline names"
    Type: String
Resources:
  BuildArtifactsBucket:
    Type: "AWS::S3::Bucket"
  CFNPipelinePolicy:
    Type: "AWS::IAM::ManagedPolicy"
    Properties:
      Description: "CloudFormation Pipeline Execution Policy"
      Path: /
      PolicyDocument:
        Version: "2012-10-17"
        Statement:
          Action:
            - "cloudformation:CreateStack"
            - "cloudformation:DescribeStacks"
            - "cloudformation:DeleteStack"
            - "cloudformation:UpdateStack"
            - "cloudformation:CreateChangeSet"
            - "cloudformation:ExecuteChangeSet"
            - "cloudformation:DeleteChangeSet"
            - "cloudformation:DescribeChangeSet"
            - "cloudformation:SetStackPolicy"
            - "cloudformation:ValidateTemplate"
            - "codebuild:StartBuild"
            - "codebuild:BatchGetBuilds"
          Effect: Allow
          Resource: "*"
  CloudFormationExecutionRole:
    Type: "AWS::IAM::Role"
    Properties:
      AssumeRolePolicyDocument:
        Version: "2012-10-17"
        Statement:
          Action:
            - "sts:AssumeRole"
          Effect: Allow
          Principal:
            Service:
              - cloudformation.amazonaws.com
      ManagedPolicyArns:
        - "arn:aws:iam::aws:policy/AdministratorAccess"
      Path: /
  CodeBuildBetaProject:
    Type: "AWS::CodeBuild::Project"
    Properties:
      Artifacts:
        Type: CODEPIPELINE
      Environment:
        ComputeType: BUILD_GENERAL1_SMALL
        EnvironmentVariables:
          - Name: BUILD_OUTPUT_BUCKET
            Value: !Ref BuildArtifactsBucket
          - Name: STAGE
            Value: Beta
        Image: "aws/codebuild/nodejs:6.3.1"
        Type: LINUX_CONTAINER
      Name: !Sub "${ServiceName}_beta_build"
      ServiceRole: !GetAtt CodeBuildServiceRole.Arn
      Source:
        Type: CODEPIPELINE
  CodeBuildGammaProject:
    Type: "AWS::CodeBuild::Project"
    Properties:
      Artifacts:
        Type: CODEPIPELINE
      Environment:
        ComputeType: BUILD_GENERAL1_SMALL
        EnvironmentVariables:
          - Name: BUILD_OUTPUT_BUCKET
            Value: !Ref BuildArtifactsBucket
          - Name: STAGE
            Value: Gamma
        Image: "aws/codebuild/nodejs:6.3.1"
        Type: LINUX_CONTAINER
      Name: !Sub "${ServiceName}_gamma_build"
      ServiceRole: !GetAtt CodeBuildServiceRole.Arn
      Source:
        Type: CODEPIPELINE
  CodeBuildProdProject:
    Type: "AWS::CodeBuild::Project"
    Properties:
      Artifacts:
        Type: CODEPIPELINE
      Environment:
        ComputeType: BUILD_GENERAL1_SMALL
        EnvironmentVariables:
          - Name: BUILD_OUTPUT_BUCKET
            Value: !Ref BuildArtifactsBucket
          - Name: STAGE
            Value: Prod
        Image: "aws/codebuild/nodejs:6.3.1"
        Type: LINUX_CONTAINER
      Name: !Sub "${ServiceName}_prod_build"
      ServiceRole: !GetAtt CodeBuildServiceRole.Arn
      Source:
        Type: CODEPIPELINE
  CodeBuildServiceRole:
    Type: "AWS::IAM::Role"
    Properties:
      AssumeRolePolicyDocument:
        Version: "2012-10-17"
        Statement:
          - Action:
              - "sts:AssumeRole"
            Effect: Allow
            Principal:
              Service:
                - codebuild.amazonaws.com
      Path: /
      Policies:
        - PolicyName: CodeBuildAccess
          PolicyDocument:
            Version: "2012-10-17"
            Statement:
              - Action:
                  - "logs:CreateLogGroup"
                  - "logs:CreateLogStream"
                  - "logs:PutLogEvents"
                Effect: Allow
                Resource:
                  - !Sub "arn:aws:logs:${AWS::Region}:${AWS::AccountId}:log-group:/aws/codebuild/*"
              - Action:
                  - "s3:GetObject"
                  - "s3:GetObjectVersion"
                  - "s3:PutObject"
                Effect: Allow
                Resource:
                  - !Sub "arn:aws:s3:::${BuildArtifactsBucket}/*"
              - Action:
                  - "cloudformation:DescribeStacks"
                Effect: Allow
                Resource: "*"
  Pipeline:
    Type: "AWS::CodePipeline::Pipeline"
    Properties:
      ArtifactStore:
        Location: !Ref BuildArtifactsBucket
        Type: S3
      Name: !Sub "${ServiceName}_pipeline"
      RoleArn: !GetAtt PipelineExecutionRole.Arn
      Stages:
        - Name: Source
          Actions:
            - Name: GitHubRepo
              ActionTypeId:
                Category: Source
                Owner: ThirdParty
                Provider: GitHub
                Version: 1
              Configuration:
                Branch: master
                OAuthToken: !Ref GitHubToken
                Owner: !Ref RepositoryOwnerName
                Repo: !Ref RepositoryName
              OutputArtifacts:
                - Name: SourceZip
              RunOrder: 1
        - Name: Beta
          Actions:
            - Name: CodeBuild
              ActionTypeId:
                Category: Build
                Owner: AWS
                Provider: CodeBuild
                Version: 1
              Configuration:
                ProjectName: !Ref CodeBuildBetaProject
              InputArtifacts:
                - Name: SourceZip
              OutputArtifacts:
                - Name: BetaBuiltZip
              RunOrder: 1
            - Name: CreateChangeSet
              ActionTypeId:
                Category: Deploy
                Owner: AWS
                Provider: CloudFormation
                Version: 1
              Configuration:
                ActionMode: CHANGE_SET_REPLACE
                Capabilities: CAPABILITY_IAM
                ChangeSetName: !Sub "${ServiceName}-ChangeSet-Beta"
                ParameterOverrides: "{ \"SourceBucket\" : { \"Fn::GetArtifactAtt\" : [\"BetaBuiltZip\", \"BucketName\"]}, \"SourceArtifact\" : { \"Fn::GetArtifactAtt\" : [\"BetaBuiltZip\", \"ObjectKey\"]} }"
                RoleArn: !GetAtt CloudFormationExecutionRole.Arn
                StackName: !Sub "${ServiceName}-Stack-Beta"
                TemplatePath: "BetaBuiltZip::app-output_sam.yaml"
              InputArtifacts:
                - Name: BetaBuiltZip
              RunOrder: 2
            - Name: ExecuteChangeSet
              ActionTypeId:
                Category: Deploy
                Owner: AWS
                Provider: CloudFormation
                Version: 1
              Configuration:
                ActionMode: CHANGE_SET_EXECUTE
                ChangeSetName: !Sub "${ServiceName}-ChangeSet-Beta"
                RoleArn: !GetAtt CloudFormationExecutionRole.Arn
                StackName: !Sub "${ServiceName}-Stack-Beta"
              OutputArtifacts:
                - Name: !Sub "${ServiceName}BetaChangeSet"
              RunOrder: 3
        - Name: Gamma
          Actions:
            - Name: CodeBuild
              ActionTypeId:
                Category: Build
                Owner: AWS
                Provider: CodeBuild
                Version: 1
              Configuration:
                ProjectName: !Ref CodeBuildGammaProject
              InputArtifacts:
                - Name: SourceZip
              OutputArtifacts:
                - Name: GammaBuiltZip
              RunOrder: 1
            - Name: CreateChangeSet
              ActionTypeId:
                Category: Deploy
                Owner: AWS
                Provider: CloudFormation
                Version: 1
              Configuration:
                ActionMode: CHANGE_SET_REPLACE
                Capabilities: CAPABILITY_IAM
                ChangeSetName: !Sub "${ServiceName}-ChangeSet-Gamma"
                ParameterOverrides: "{ \"SourceBucket\" : { \"Fn::GetArtifactAtt\" : [\"GammaBuiltZip\", \"BucketName\"]}, \"SourceArtifact\" : { \"Fn::GetArtifactAtt\" : [\"GammaBuiltZip\", \"ObjectKey\"]} }"
                RoleArn: !GetAtt CloudFormationExecutionRole.Arn
                StackName: !Sub "${ServiceName}-Stack-Gamma"
                TemplatePath: "GammaBuiltZip::app-output_sam.yaml"
              InputArtifacts:
                - Name: GammaBuiltZip
              RunOrder: 2
            - Name: ExecuteChangeSet
              ActionTypeId:
                Category: Deploy
                Owner: AWS
                Provider: CloudFormation
                Version: 1
              Configuration:
                ActionMode: CHANGE_SET_EXECUTE
                ChangeSetName: !Sub "${ServiceName}-ChangeSet-Gamma"
                RoleArn: !GetAtt CloudFormationExecutionRole.Arn
                StackName: !Sub "${ServiceName}-Stack-Gamma"
              OutputArtifacts:
                - Name: !Sub "${ServiceName}GammaChangeSet"
              RunOrder: 3
        - Name: Prod
          Actions:
            - Name: DeploymentApproval
              ActionTypeId:
                Category: Approval
                Owner: AWS
                Provider: Manual
                Version: 1
              RunOrder: 1
            - Name: CodeBuild
              ActionTypeId:
                Category: Build
                Owner: AWS
                Provider: CodeBuild
                Version: 1
              Configuration:
                ProjectName: !Ref CodeBuildProdProject
              InputArtifacts:
                - Name: SourceZip
              OutputArtifacts:
                - Name: ProdBuiltZip
              RunOrder: 2
            - Name: CreateChangeSet
              ActionTypeId:
                Category: Deploy
                Owner: AWS
                Provider: CloudFormation
                Version: 1
              Configuration:
                ActionMode: CHANGE_SET_REPLACE
                Capabilities: CAPABILITY_IAM
                ChangeSetName: !Sub "${ServiceName}-ChangeSet-Prod"
                ParameterOverrides: "{ \"SourceBucket\" : { \"Fn::GetArtifactAtt\" : [\"ProdBuiltZip\", \"BucketName\"]}, \"SourceArtifact\" : { \"Fn::GetArtifactAtt\" : [\"ProdBuiltZip\", \"ObjectKey\"]} }"
                RoleArn: !GetAtt CloudFormationExecutionRole.Arn
                StackName: !Sub "${ServiceName}-Stack-Prod"
                TemplatePath: "ProdBuiltZip::app-output_sam.yaml"
              InputArtifacts:
                - Name: ProdBuiltZip
              RunOrder: 3
            - Name: ExecuteChangeSet
              ActionTypeId:
                Category: Deploy
                Owner: AWS
                Provider: CloudFormation
                Version: 1
              Configuration:
                ActionMode: CHANGE_SET_EXECUTE
                ChangeSetName: !Sub "${ServiceName}-ChangeSet-Prod"
                RoleArn: !GetAtt CloudFormationExecutionRole.Arn
                StackName: !Sub "${ServiceName}-Stack-Prod"
              OutputArtifacts:
                - Name: !Sub "${ServiceName}ProdChangeSet"
              RunOrder: 4
  PipelineExecutionRole:
    Type: "AWS::IAM::Role"
    Properties:
      AssumeRolePolicyDocument:
        Version: "2012-10-17"
        Statement:
          - Action:
              - "sts:AssumeRole"
            Effect: Allow
            Principal:
              Service:
                - codepipeline.amazonaws.com
      ManagedPolicyArns:
        - "arn:aws:iam::aws:policy/AWSCodeCommitFullAccess"
        - "arn:aws:iam::aws:policy/AmazonS3FullAccess"
        - !Ref CFNPipelinePolicy
      Path: /
      Policies:
        - PolicyName: CodePipelineAccess
          PolicyDocument:
            Version: "2012-10-17"
            Statement:
              - Action:
                  - "iam:PassRole"
                  - "lambda:InvokeFunction"
                  - "lambda:ListFunctions"
                  - "lambda:InvokeAsync"
                Effect: Allow
                Resource: "*"