I’ve been building more CI/CD pipelines with GitLab to automate deployments at work. Here’s a useful one for building a React app and deploying it to Amazon S3.
You’ll need to add a CI/CD variable called S3_BUCKET_NAME to your repository’s settings, or replace the variable in the script below with your bucket name.
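If the bucket name isn’t a secret, GitLab also lets you define variables directly in the pipeline file instead of the project settings UI. A minimal sketch (the bucket name here is a placeholder):

```yaml
# Pipeline-level variables in .gitlab-ci.yml — fine for non-secret values.
# Keep actual secrets in Settings > CI/CD > Variables instead.
variables:
  S3_BUCKET_NAME: "my-example-bucket"  # placeholder — use your own bucket name
```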
```yaml
stages:
  - build
  - deploy

build react-app:
  # I'm using node:latest, but be sure to test or pin a version you know works.
  # Sometimes node updates break the npm script.
  image: node:latest
  stage: build
  only:
    - master
  script:
    # Set PATH
    - export PATH=$PATH:/usr/bin/npm
    # Install dependencies
    - npm install
    # Build app
    - CI=false npm run build
  artifacts:
    paths:
      # Build folder
      - build/
    expire_in: 1 hour

deploy master:
  image: python:latest
  stage: deploy
  only:
    - master
  script:
    - pip3 install awscli
    - aws s3 sync ./build s3://$S3_BUCKET_NAME --acl public-read
```
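If you’re on a recent GitLab release, note that `rules:` is now preferred over the older `only:` keyword. The branch filter in the jobs above could be written like this instead (a sketch with the same effect, running the job only on master):

```yaml
build react-app:
  rules:
    # $CI_COMMIT_BRANCH is a predefined GitLab CI variable
    - if: $CI_COMMIT_BRANCH == "master"
```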
I just spent a few hours setting up a GitLab pipeline to deploy a Storybook.js site. Of course, the end result turned out to be much simpler than I’d made it out to be. Like everything else on my blog, I’m sharing it in case it saves someone else some time.
Just put this in your .gitlab-ci.yml and it’ll take care of caching the node modules and building a static version of Storybook to deploy.
```yaml
image: node:latest

cache:
  paths:
    - node_modules/

stages:
  - build
  - deploy

build:
  stage: build
  script:
    - npm install
    - npm run build-storybook -- -o storybook-static
  artifacts:
    paths:
      - storybook-static
  only:
    - qa
    - develop
    - master

deploy:
  # The stage name must match one declared under stages above
  stage: deploy
  script:
    # add your deploy code here
    - echo "deploying..."
```
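The deploy job above is left as a stub. If your target is also S3, a minimal sketch reusing the awscli approach from the first pipeline might look like this (it assumes an S3_BUCKET_NAME variable is defined, and that the job’s stage matches a name declared under stages):

```yaml
deploy:
  image: python:latest
  stage: deploy
  only:
    - master
  script:
    - pip3 install awscli
    # Sync the static Storybook build produced by the build job's artifacts
    - aws s3 sync ./storybook-static s3://$S3_BUCKET_NAME --acl public-read
```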