bitbucket_pipeline.yml
#
# TODO: Need to account for multi-commit history in the current push rather than just merges.
# This could possibly be done with BITBUCKET_BUILD_NUMBER - 1, but that doesn't account
# for broken builds or previous commits. Perhaps incorporate the BUILD status? Reminder: we're
# looking to pull a list of JSON files that require publishing. Not sure if we should tell
# people to squash commits before pushing? :)
# TODO: We could build a new array of files to push by checking all commits since the upstream head
# with something like: git log @{u}.. | grep -iE "commit [[:alnum:]]{40}" | cut -f2 -d" "
# (see the sketch just below).
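#
# A rough sketch of that idea (an assumption, not part of this pipeline; @{u} requires an upstream
# tracking ref, which may not be configured in the pipeline clone, so this is untested here):
#   for c in $(git log @{u}.. --format="%H"); do
#     git diff-tree --no-commit-id --name-only -r "$c";
#   done | sort -u
#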
pipelines:
  branches:
    master:
      # We should only be deploying from master, albeit there are more checks below.
      - step:
          name: Deploy to WebFlow
          deployment: test
          # trigger: manual # Uncomment to make this a manual deployment.
          script:
            # - apt-get update && apt-get install -y jq
            - |
              # Unless overridden we are dealing with the master branch.
              # There may still be some merge artifacts in master, depending on the approach/strategy.
              branch="master";
              echo "Token: $GIT_LFS_SKIP_SMUDGE"
              echo "BITBUCKET_PR_DESTINATION_BRANCH is $BITBUCKET_PR_DESTINATION_BRANCH";
              echo "BITBUCKET_BRANCH is $BITBUCKET_BRANCH";
              if [[ ${#BITBUCKET_PR_DESTINATION_BRANCH} -gt 0 ]] && [[ $BITBUCKET_PR_DESTINATION_BRANCH != 'master' ]]; then
                echo "This is not a PR onto master so we're exiting early, just in case!" && exit 1;
              fi
              if [[ ${#BITBUCKET_BRANCH} -gt 0 ]]; then
                branch=$BITBUCKET_BRANCH;
              fi
              echo "We will work on the branch $branch and the initial commit $BITBUCKET_COMMIT";
            - commit_to_check=$BITBUCKET_COMMIT;
            # We need to see whether this is a merge and, if so, get the commits it contains.
            - merge_check=$(git show $commit_to_check | head -n4 | grep "Merge" || true);
            - |
              if [[ ${#merge_check} -gt 0 ]]; then
                echo "This was a merge, so we're checking the commit a little deeper";
                commit_to_check=$(echo $merge_check | cut -d" " -f2,3 | sed "s/ /../g");
                # The above sets commit_to_check to a commit range such as "29bce6e..cc5b326": the first hash
                # is the merge's first parent (the previous tip of master) and the second is the last commit
                # on the merged branch. This is fine, but if there has been a lot of churn on master in
                # between, the following logic will slurp many files into the array to re-publish...
                # (See the commented alternative just below.)
              fi
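            # A hedged alternative (an assumption, not what this pipeline does): diffing the merge against
            # its first parent only picks up the files the merge itself brought into master, which avoids
            # re-publishing unrelated churn, e.g.:
            #   git diff --name-only "${BITBUCKET_COMMIT}^1" "${BITBUCKET_COMMIT}"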
- echo "Checking commit hash(es) ${commit_to_check}"
- declare -a files_array && files_array=( $(git diff-tree --no-commit-id --name-only -r $commit_to_check) )
# Above will grab a lot of files if there have been commits that touch multiple files in the mean time... BE WARNED
- echo ${files_array[@]}
- echo "Current working directory is ${PWD}"
- ls -al
- |
files_array_length=${#files_array[@]};
for (( i=0; i<${files_array_length}; i++)); do
file_slug=${files_array[$i]};
if [[ "${files_array[$i]}" == *_metadata.json ]] && [[ "${files_array[$i]}" != *"example_metadata.json"* ]] && [[ "${files_array[$i]}" != *"example_lightweight_metadata.json"* ]] && [[ -e ${files_array[$i]} ]]; then
echo "${files_array[$i]}"
#test -e ${PWD}/${files_array[$i]}
cat ${PWD}/${files_array[$i]} | jq . -rM;
echo "Result of cat'ing file: $?";
echo "JSON FILE URL: https://api.bitbucket.org/2.0/repositories/<org>/<repo>/src/$branch/${files_array[$i]}";
echo "Sending webhook...with parameter that includes: $branch/${files_array[$i]}";
curl -k -d json_file_url="https://api.bitbucket.org/2.0/repositories/<org>/<repo>/src/$branch/${files_array[$i]}" -s "https://webhook";
echo "File update for: ${files_array[$i]}";
sleep 3;
else
echo "${files_array[$i]} is not an integration file (or might be an example integration file, script, private doco, or deleted file).";
fi
done
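              # Note (an assumption, not part of the original flow): jq's exit status is only echoed above.
              # A stricter variant could validate each file inside the loop before sending the webhook, e.g.:
              #   jq -e . "${PWD}/${files_array[$i]}" > /dev/null || continue;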
          after-script:
            - echo $BITBUCKET_EXIT_CODE
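#
# Rough local smoke test of the publish step (for illustration only; <org>, <repo> and https://webhook
# are the same placeholders used above, and the file name is hypothetical):
#   file="some_integration_metadata.json"
#   jq . -rM "$file" && curl -k -s -d json_file_url="https://api.bitbucket.org/2.0/repositories/<org>/<repo>/src/master/$file" "https://webhook"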