@sats17
Last active December 23, 2024 15:50

############################################################################################################

              This Gist collection contains all LocalStack-related examples

############################################################################################################

version: '3.8'
services:
    redis:
      container_name: "redis"
      image: redis:alpine
      ports:
        - "6379:6379" # To connect to redis from a local cmd use: docker exec -it redis redis-cli
    localstack:
      container_name: "localstack" # Container name in your docker
      image: localstack/localstack:latest # Will download the latest version of localstack
      #image: localstack/localstack-full:latest # Full image that supports the WebUI
      ports:
        - "4566:4566" # Default edge port forward
        - "9200:4571" # Elasticsearch port forward
        #- "8080:8080" # WebUI port forward
      environment:
        - SERVICES=es,s3,ec2,dynamodb,elasticache,sqs # AWS services that you want in your localstack
        - DEBUG=1 # Debug level: 1 to enable logs, 0 to disable
        - START_WEB=0 # Flag to control whether the Web UI should be started in Docker
        - LAMBDA_REMOTE_DOCKER=0
        - DATA_DIR=/tmp/localstack/data # Local directory for saving persistent data (example: es storage)
        - DEFAULT_REGION=us-east-1
      volumes:
        - './.localstack:/tmp/localstack'
        - '/var/run/docker.sock:/var/run/docker.sock'
  1. S3 connection -

AmazonS3ClientBuilder.standard()
    .withEndpointConfiguration(new EndpointConfiguration("http://localhost:4566/", "us-east-1")) // localstack endpoint configuration
    .withCredentials(new DefaultAWSCredentialsProviderChain())
    .withPathStyleAccessEnabled(true) // Disable virtual-host-style access and enable path-style s3 bucket access
    .build();

  2. SQS connection -

AmazonSQSClientBuilder.standard()
    .withEndpointConfiguration(new EndpointConfiguration("http://localhost:4566", "us-east-1"))
    .withCredentials(new DefaultAWSCredentialsProviderChain())
    .build();

  3. SNS connection -

AmazonSNSClientBuilder.standard()
    .withEndpointConfiguration(new EndpointConfiguration("http://localhost:4566", "us-east-1"))
    .build();

  1. Create es domain -

aws --endpoint-url=http://localhost:4566 es create-elasticsearch-domain --domain-name dev-domain --elasticsearch-version 6.8.1

(Note: it takes around 5-10 minutes for the domain to come up in localstack)

  1. Check whether the domain is up (the default Elasticsearch port 4571 is forwarded to 9200) -

curl -X GET "http://localhost:9200"

  1. Create a new index in your domain -

curl -X PUT "localhost:9200/newindex?pretty"

  1. List all indices from your domain -

curl -X GET "localhost:9200/_cat/indices?pretty"

  1. List all elasticsearch domains -

aws --endpoint-url=http://localhost:4566 es list-domain-names

  1. Delete domain from elasticsearch -

aws --endpoint-url=http://localhost:4566 es delete-elasticsearch-domain --domain-name dev-domain

  1. Create s3 bucket -

aws --endpoint-url=http://localhost:4566 s3 mb s3://my-test-bucket

  1. List s3 buckets -

aws --endpoint-url="http://localhost:4566" s3 ls

  1. Upload files to s3 bucket (sync a local directory) -

aws --endpoint-url="http://localhost:4566" s3 sync "myfiles" s3://my-test-bucket

  1. List files from AWS CLI -

aws --endpoint-url="http://localhost:4566" s3 ls s3://my-test-bucket

  1. Access file via URL (file name was download.jpg) -

http://localhost:4566/my-test-bucket/download.jpg

  1. Delete object from bucket -

aws --endpoint-url=http://localhost:4566 s3api delete-object --bucket <bucket-name> --key <file-name.format>
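Because the S3 client above is configured for path-style access, object URLs follow the pattern endpoint/bucket/key. A minimal sketch of that rule (the build_object_url helper is hypothetical, not part of any AWS SDK):

```python
# Hypothetical helper (not part of boto3/the AWS SDK): builds a path-style
# S3 object URL for the localstack edge endpoint.
from urllib.parse import quote

def build_object_url(endpoint: str, bucket: str, key: str) -> str:
    """Path-style URL: <endpoint>/<bucket>/<url-encoded key>."""
    return f"{endpoint.rstrip('/')}/{bucket}/{quote(key)}"

print(build_object_url("http://localhost:4566", "my-test-bucket", "download.jpg"))
# → http://localhost:4566/my-test-bucket/download.jpg
```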

  1. Create bucket notification configuration -

aws --endpoint-url=http://localhost:4566 s3api put-bucket-notification-configuration --bucket <bucket-name> --notification-configuration file://<notification-config>.json
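The notification configuration file passed via file:// might look like the following (a sketch; the queue ARN assumes the default localstack account id 000000000000 and a queue named my-test-queue):

```json
{
  "QueueConfigurations": [
    {
      "QueueArn": "arn:aws:sqs:us-east-1:000000000000:my-test-queue",
      "Events": ["s3:ObjectCreated:*"]
    }
  ]
}
```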

  1. Get bucket notification configuration -

aws --endpoint-url=http://localhost:4566 s3api get-bucket-notification-configuration --bucket <bucket-name>

  1. Set bucket policy -

aws --endpoint-url=http://localhost:4566 s3api put-bucket-policy --bucket <bucket-name> --policy file://<policy-file>.json
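For example, a minimal policy file granting public read on my-test-bucket might look like this (a sketch, not taken from the original gist):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicRead",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::my-test-bucket/*"
    }
  ]
}
```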

  1. Get bucket policy -

aws --endpoint-url=http://localhost:4566 s3api get-bucket-policy --bucket <bucket-name>

  1. Create sns topic -

aws --endpoint-url=http://localhost:4566 sns create-topic --name my-test-topic

  1. List all sns topics -

aws --endpoint-url=http://localhost:4566 sns list-topics

  1. List subscriptions -

aws --endpoint-url=http://localhost:4566 sns list-subscriptions

  1. Publish message -

aws --endpoint-url=http://localhost:4566 sns publish --topic-arn "arn:aws:sns:us-east-1:000000000000:ingest-topic" --message file://message.txt --message-attributes file://attributes.json

-- message.txt

my message to publish

-- attributes.json

{
  "key": {
    "DataType": "String",
    "StringValue": "value"
  }
}

  1. Create queue -

aws --endpoint-url=http://localhost:4566 sqs create-queue --queue-name my-test-queue

  1. Get queue url -

aws --endpoint-url=http://localhost:4566 sqs get-queue-url --queue-name <queue-name>

  1. List queues -

aws --endpoint-url=http://localhost:4566 sqs list-queues

  1. Send message -

aws --endpoint-url=http://localhost:4566 sqs send-message --queue-url <queue-url> --message-body <message>

  1. Receive message -

aws --endpoint-url=http://localhost:4566 sqs receive-message --queue-url <queue-url>

  1. Purge queue -

aws --endpoint-url=http://localhost:4566 sqs purge-queue --queue-url <queue-url>

  1. Delete queue -

aws --endpoint-url=http://localhost:4566 sqs delete-queue --queue-url <queue-url>

  1. Set queue attributes -

aws --endpoint-url=http://localhost:4566 sqs set-queue-attributes --queue-url <queue-url> --attributes file://<attributes-file>.json
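The attributes file is a flat map of attribute name to string value; for example (the values below are illustrative, not from the original gist):

```json
{
  "VisibilityTimeout": "60",
  "MessageRetentionPeriod": "86400"
}
```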

  1. Get queue attributes -

aws --endpoint-url=http://localhost:4566 sqs get-queue-attributes --queue-url <queue-url>

provider "aws" {
    region     = "us-east-1"
    access_key = "test"
    secret_key = "test"  
    skip_credentials_validation = true
    skip_requesting_account_id = true
    skip_metadata_api_check = true
    s3_force_path_style = true
    # Starting with localstack version 0.11.0, all APIs are exposed via a single edge service, 
    # which is accessible on http://localhost:4566 by default 
    endpoints = {
        s3 = "http://localhost:4566" 
        sqs = "http://localhost:4566"
    }
}
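With this provider configuration, ordinary Terraform resources are created against localstack instead of real AWS; a minimal sketch (the resource names are illustrative):

```hcl
resource "aws_s3_bucket" "test" {
  bucket = "my-test-bucket"
}

resource "aws_sqs_queue" "test" {
  name = "my-test-queue"
}
```

After terraform init and terraform apply, the results can be verified with the aws --endpoint-url commands above.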
  1. DynamoDB -

import AWS = require("aws-sdk");

let options = {
    apiVersion: '2012-08-10',
    region: 'us-east-1',
}
options['endpoint'] = new AWS.Endpoint("http://localhost:4566")
console.log(options)

const dynamoDB = new AWS.DynamoDB(options);
console.log(dynamoDB)
dynamoDB.listTables({Limit: 10}, function(err, data) {
    if (err) {
        console.log("Error", err.code);
    } else {
        console.log("Table names are ", data.TableNames);
    }
});
