#helpers

Pineapple Payments

BI Testing Query

select
	fp.period_start_date,
	fbu.business_unit_name,
	fv.vendor_name,
	sum(revenue) as revenue,
	sum(fees) as fees,
	sum(icda) as icda,
	sum(processor_split) as split,
	sum(sales_volume) as vol,
	sum(transactions) as trans,
	sum(revenue) - sum(fees) - sum(processor_split) as gross_profit
from
	facts.upstream_residuals fur
	inner join facts.periods fp
		on fp.period = fur.period
	inner join facts.business_units fbu
		on fbu.id = fur.business_unit_id
	inner join facts.vendors fv
		on fv.id = fur.vendor_id
	inner join facts.merchant_accounts fma
		on fma.id = fur.merchant_account_id 
	inner join facts.products fpr
		on fpr.id = fur.product_id
group by
	fp.period_start_date,
	fbu.business_unit_name,
	fv.vendor_name
order by
	fbu.business_unit_name,
	fv.vendor_name

IEx shell tips

Localhost background color

<body style="<%= @conn.host == "localhost" && "background: #ff00001c" || "" %>">

Kubernetes and Docker

Our Elixir Format on Save

  • ensure Auto Save is enabled
    • File > Auto Save or
    • settings.json
      • "files.autoSave": "onFocusChange"
  • build new task with command
    • from menu bar
      • Terminal > Configure Tasks > Create tasks.json from template
        • if tasks already exist, the dropdown will list those existing tasks, and the new task can be added alongside them
      • Add this to the .json
        • {
                "label": "formatter",
                "type": "shell",
                "command": "docker-compose exec web mix format",
                "presentation": {
                  "reveal": "silent"
                }
            }
          
      • Download the Trigger Task on Save extension for VSCode
      • Add this to settings.json
        • "triggerTaskOnSave.on": true,
          "triggerTaskOnSave.showStatusBarToggle": true,
          "triggerTaskOnSave.showNotifications": true,
          "triggerTaskOnSave.tasks": {
              "formatter": [
                  "*.exs",
                  "*.ex"
              ]
          }
          
      • Might need to disable any other format-on-save formatters/linters
      • Notifications, and whether the VSCode terminal is shown when the task runs, can be customized in the JSON
      • May need to add .vscode to the repo's .git > info > exclude (instead of .gitignore)
      • Additional reading:

Format with hooks

To format on pre-commit

  • find existing .git/hooks/pre-commit.sample file
  • duplicate and remove .sample
  • delete the contents and add this single line: docker-compose run --rm web mix format
  • the commit will still go through; any formatting changes the hook makes are left unstaged in the working tree

Docker space issue

https://forums.docker.com/t/docker-doesnt-release-disk-space-used-by-library-containers-com-docker-docker/15194

New Data Manipulations

Comment out this line

Example migration

Data manipulation source
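The linked example isn't reproduced in this gist. As a rough sketch only, following the Finance.Manipulations.*.run() convention used in the staging section further down (the module, table, and column names here are hypothetical):

defmodule Finance.Manipulations.ExampleBackfill do
  alias Finance.Repo

  # One-off data manipulation: backfills a (hypothetical) column in bulk.
  # Run manually from IEx, or via the staging rpc command shown later.
  def run do
    Repo.update_all("merchant_accounts", set: [status: "active"])
  end
end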

Repo Options (.get, .one, etc.)

https://hexdocs.pm/ecto/Ecto.Repo.html#callbacks

Dump Data from Prod

Two Tabs

  • First tab
    • ssh pineapple-dev-rds-tunnel -N
      • try this first
    • ssh pineapple-prod-rds-tunnel -N
      • this is least desirable because it's Prod
  • Second tab
docker-compose exec -T db pg_dump -h docker.for.mac.localhost -p 5433 -U nworden -d finance > prod_dump
  • password is in 1Pass
    • Pineapple Prod DB Connection
    • Other Password
  • drop db and create new one (no migration or seeding)

  • docker-compose exec -T db psql -h localhost -U postgres -d finance_dev < prod_dump

    • If the file was grabbed recently, it will be in the Pineapple_Stuff folder. Drag it into finance and run the above command
  • Local dump

docker-compose exec -T db pg_dump -h localhost -U postgres -d finance_dev >

Merge conflicts on fetched branch

  • git pull
    • see that there are conflicts
  • git reset --merge
    • should set all the conflicts to the most recent work
  • git checkout --theirs .

Upload document

  • Put the already-cleaned file into source_files/clean
  • Ensure the file source_files/upload exists
#!/usr/bin/env zsh
set -eo pipefail
filter="$1"
cd "${0:a:h}/clean"
files=(**/*$~filter*.csv)
echo "About to upload $#files file(s):"
print -l $files
echo -n "continue? (y/n) "
read -q || exit 0
echo
for f in $files; do
  aws --endpoint-url=http://localhost:4572 s3 cp "$f" "s3://demo-bucket/import/$f"
  sleep 1
done
  • in terminal, run the script with source_files/upload
  • the file will be in the database as raw uploads (or something like that)
    • nothing will show in the web interface
  • go to generate link (/exports/new)
  • click the Export button
  • files should show in Exports link (/exports)

Pry

  • spin up everything, with docker-compose up
  • in a new terminal tab, stop web with docker-compose stop web
  • same tab run docker-compose run --rm --service-ports web iex -S mix phx.server
  • Insert the following into the code
    require IEx;
    IEx.pry
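For placement, a minimal sketch of the pry sitting inside a Phoenix controller action (the module and action names here are hypothetical):

defmodule FinanceWeb.PageController do
  use FinanceWeb, :controller

  def index(conn, _params) do
    require IEx
    # execution stops here; accept the pry prompt in the iex tab started above
    IEx.pry()

    render(conn, "index.html")
  end
end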
    

Other Pry (IEx)

  • make sure web is running: dc restart web
  • start IEx: dc exec web iex -S mix
  • example: Finance.Repo.all Finance.SourceFiles.unmapped_import_products
  • IEx.Helpers.recompile (or just recompile() inside the session)

File Handling

  • pull files from Egnyte
  • Shared > Residuals > 2019 > (Month) > Raw Source Files
  • Download that entire Raw Source Files folder
  • Rename the folder to YYYYMM format (e.g. 201909) and pull it into the project
    • source_files > raw
  • in the source_files directory run this script
    • docker run --rm -it -v "$PWD:/source_files" finance-clean bash -ic '/source_files/clean-source-files'
  • upload clean files to DB (1 of 2 ways)
    • run the source_files/upload script
    • use the new UI
  • once uploaded, check the raw_imports table in the DB to confirm upload

Spin up script

  • from Eric's Slack. Step 4 in the process
curl -X PUT -d '<NotificationConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/"><TopicConfiguration><Id>some-optional-id</Id><Topic>arn:aws:sns:us-east-1:000000000000:new-source-files</Topic><Event>s3:ObjectCreated:*</Event><Filter><S3Key><FilterRule><Name>prefix</Name><Value>import/</Value></FilterRule></S3Key></Filter></TopicConfiguration></NotificationConfiguration>' -s 'http://localhost:4572/demo-bucket/?notification'

Setting up schema and changesets in IEx Console

defmodule Post do
  use Ecto.Schema
  import Ecto.Changeset

  schema "post" do
    field :title
    field :body
  end

  def changeset(post, params \\ %{}) do
    post
    |> cast(params, [:title, :body])
  end
end

changeset = Post.changeset(%Post{}, %{title: "first", body: "something else more"})
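From there the changeset can be poked at directly in the shell; since no validations were added it will be valid:

changeset.valid?
# => true

Ecto.Changeset.apply_changes(changeset)
# returns a %Post{} struct with the cast title and body set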

Format before PR

dc exec web mix format

SH into prod build

dc exec web sh

Prune unused Docker images from the hard drive

docker image prune

Executing scripts on staging

  • Utilize the first two steps of the Kubernetes Cluster doc
  • Confirm --context finance-staging, so that it does not get run on Production
  • Test the script by running the dry version first:
    kubectl --context finance-staging exec -it $pod -- ./prod/rel/finance/bin/finance rpc "Finance.Manipulations.AddAuthNetAgents.run()"
    
  • If possible, run the script locally and save the logs to a file, then do the dry run, save that log, and diff the two saved files
  • If happy with the results, run the non-dry version of the script

Staging logs with k8s

  • kubectl --context finance-staging get pods
  • pod=<kube id>
  • kubectl logs --tail=20 $pod

Ecto

Spin up console

  • docker-compose run api "iex -S mix"

Necessary to point it in the right direction

  • alias Api.Repo

Example "all" search

  • Repo.all(Api.Users.User)

Example "selective" search

  • Repo.all(Api.Organizations.Organization) |> List.first

Find by ID

  • Repo.get(Api.Organizations.Organization, 24)

Querying

  • import Ecto.Query, only: [from: 2]
  • Create a query
    • query = from u in "users", where: u.age > 18, select: u.name
  • Repo.all(query)

For Entire Records

  • from u in Api.Users.User, where: u.age > 18
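Putting those together, a sketch of a schema-based query that returns full structs (assumes the User schema defines the age and name fields used above, and that Ecto.Query is imported as shown earlier):

query = from u in Api.Users.User, where: u.age > 18, order_by: u.name
Repo.all(query)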

Preloading

  • Api.Repo.all(Api.Organizations.Organization) |> Api.Repo.preload(:job_types)
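The preload can also live in the query itself; a small sketch using the same :job_types association:

query = from o in Api.Organizations.Organization, preload: [:job_types]
Api.Repo.all(query)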

Docs

Elixir

Give new scripts permissions

  • chmod +x bin/reset_db

Formatter for Elixir files

  • docker-compose run api "mix format priv/repo/dev_data.exs"

Rake Routes equivalent

  • dc run api "mix phx.routes"

Update Swagger docs: dc run api "mix compile && mix phx.swagger.generate"

Inspect with IO.inspect

  • pipe it in:
    • |> IO.inspect
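A quick self-contained sketch of dropping it mid-pipeline; the label option helps tell multiple inspects apart:

[1, 2, 3]
|> IO.inspect(label: "input")
|> Enum.map(&(&1 * 2))
|> IO.inspect(label: "doubled")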

#iex:break (break out of a stuck/incomplete expression) and recompile (recompile the current Mix project)

Bash Session

  • docker-compose run <container> bash

  • example: docker-compose run api bash

  • find available mix commands:

    • mix help
  • format of Sheets URL to an import

    • 'https://docs.google.com/spreadsheets/d/1XZw_7nLfKRiSYE7lBithc3eV2t0oXkfw2MRpWAe8toE/export?gid=0&format=csv'

notes

import { getEntityFromResource } from '../../selectors/resources';
import { path } from 'ramda';

export const currentUserSelector = (state) => {
  debugger;
  return getEntityFromResource(state, 'user', path(["session", "user", "id"], state));
};

pgAdmin

  • in API repo -> api/docs/PGADMIN.md

Web API tests

bin yarn test
