# THIS EXAMPLE REQUIRES THE FOLLOWING CLI/SESSION EXPORT
# STATEMENTS IN ORDER TO WORK CORRECTLY:
# export TABLEAU_API_BASE_URL=prod-useast-a.online.tableau.com
# export TABLEAU_API_VERSION=3.22
# export TABLEAU_PAT_NAME="your tableau personal access token name"
# export TABLEAU_PAT_SECRET="your tableau personal access token secret"
# export TABLEAU_PROJECT_NAME="Name of one Tableau Cloud Project inside your Tableau Site"
# export OLD_SERVER_ADDRESS=abc12345.us-east-1.snowflakecomputing.com
# export NEW_SERVER_ADDRESS=abc12345riwkgq.us-east-1.a.p0.satoricyber.net
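As a sketch, the exports above might be consumed in Python like this. The variable names match the export statements; the placeholder defaults and the derived sign-in URL are only illustrative so the snippet runs standalone:

```python
import os

# Collect the session exports into one config dict. Placeholder
# defaults are used so this sketch runs standalone; in real use the
# export statements above must be set first.
config = {
    "base_url": os.environ.get("TABLEAU_API_BASE_URL", "prod-useast-a.online.tableau.com"),
    "api_version": os.environ.get("TABLEAU_API_VERSION", "3.22"),
    "pat_name": os.environ.get("TABLEAU_PAT_NAME", ""),
    "pat_secret": os.environ.get("TABLEAU_PAT_SECRET", ""),
    "project_name": os.environ.get("TABLEAU_PROJECT_NAME", ""),
    "old_server": os.environ.get("OLD_SERVER_ADDRESS", ""),
    "new_server": os.environ.get("NEW_SERVER_ADDRESS", ""),
}

# Tableau REST API endpoints are versioned in the URL path, e.g. sign-in:
signin_url = f"https://{config['base_url']}/api/{config['api_version']}/auth/signin"
```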
# This example deliberately excludes error handling and try/except blocks
# so that it stays short and legible. We recommend adding these, as well as
# any other custom logic you may want.
# Goal: create an array of runnable queries by iterating through
# all tables in the allowed schemas, so that the Satori Data Inventory becomes populated.
# This process may not introspect all nested field types, e.g. JSON or other blob types.
import mysql.connector
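The query-building step described above can be sketched as follows. The schema allow-list, the sample table pairs, and the LIMIT clause are illustrative assumptions; the real script derives the table list from `information_schema.tables` via `mysql.connector`:

```python
# In the real script the (schema, table) pairs come from a query like:
#   SELECT table_schema, table_name FROM information_schema.tables
#   WHERE table_schema IN (...allowed schemas...)
# Hypothetical sample data is used here so the sketch runs standalone.
ALLOWED_SCHEMAS = ["sales", "hr"]  # assumption: your schema allow-list
tables = [("sales", "orders"), ("sales", "customers"), ("hr", "employees")]

# Build one runnable SELECT per table in an allowed schema.
queries = [
    f"SELECT * FROM {schema}.{table} LIMIT 100"
    for schema, table in tables
    if schema in ALLOWED_SCHEMAS
]
```

Running each of these queries through the datastore lets Satori observe the traffic and populate the Data Inventory.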
import snowflake.connector
from snowflake.connector.errors import DatabaseError

# BEGIN PARAMETERS, CHANGE THESE AS NEEDED:
# Should we actually run the SELECT statements, or just print them to the console?
# If DRYRUN is set to False, the SELECT statements will actually run on all the tables and views.
# Note: DRYRUN only applies to the SELECT statements; this script will still run
# queries to find database and schema names.
DRYRUN = True
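A minimal sketch of how the DRYRUN flag might gate execution (the query list and the cursor call are placeholders, not the script's actual names):

```python
DRYRUN = True  # from the parameter above

# Hypothetical query list; the real script builds this from Snowflake metadata.
queries = ["SELECT * FROM db1.schema1.table1 LIMIT 1"]

printed, executed = [], []
for q in queries:
    if DRYRUN:
        printed.append(q)
        print(q)            # just show what would run
    else:
        executed.append(q)  # in the real script: cursor.execute(q)
```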
# USAGE
# Given a full location path in a Satori Dataset like "db.schema.tablename",
# this utility lets you search across all Datasets for any partial substring.
# E.g. using the above example, searching for "hema.tabl" would match "db.schema.tablename".
# At a command line: python searchtable.py <searchstring>
# Save this file, then fill in the account id, service id and key values below.
import json
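The partial-substring match described above reduces to a simple `in` check over location paths. A sketch with placeholder data (the real script reads the locations from the Satori API using the account values and takes the search string from `sys.argv`):

```python
# Hypothetical dataset location paths; the real script fetches these
# from the Satori REST API with the account id, service id and key.
locations = [
    "db.schema.tablename",
    "analytics.public.events",
]

searchstring = "hema.tabl"  # would come from sys.argv[1]

# Keep every location whose path contains the partial substring.
matches = [loc for loc in locations if searchstring in loc]
```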
# Simple command line utility to group all downstream Satori Datastores and Datasets by each Satori DAC.
# Tested with Python 3.
# Usage:
# At a command line, run:
# python get_all_dac_info.py
# You must fill in the three Satori account variables below.
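The grouping step can be sketched as a dictionary keyed by DAC ID. The sample rows are hypothetical; the real script fetches them from the Satori API using the three account variables:

```python
from collections import defaultdict

# Hypothetical (dac_id, datastore_name) pairs returned by the Satori API.
rows = [
    ("dac-1", "snowflake-prod"),
    ("dac-1", "postgres-dev"),
    ("dac-2", "mysql-analytics"),
]

# Group every downstream datastore under its DAC.
by_dac = defaultdict(list)
for dac_id, datastore in rows:
    by_dac[dac_id].append(datastore)
```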
# Simple command line utility to group all Satori Datastores by each Satori DAC.
# Tested with Python 3.
# Usage:
# At a command line, run:
# python find_datastores.py
# You must fill in the three Satori account variables below.
# Simple command line utility to find all Satori Datasets associated with a Satori Datastore.
# Tested with Python 3.
# Usage:
#
# To find ONE Satori Datastore across all Satori Datasets,
# first get its ID from the UI and then, at a prompt, run:
# python find_datasets.py DATASTORE_ID
#
# To find ALL the Satori Datastores across all Satori Datasets,
# This is a lightweight integration prototype that sends
# Satori audit data to PagerDuty. In this example we
# specify that any query blocked by Satori should also
# create a new incident in PagerDuty.
# We also check for any access or self-service rules; if we find any,
# we do not trigger PagerDuty, as this is a type of false positive.
# We further refine this integration by using the Satori Audit ID
# as our PagerDuty "incident ID" - this prevents duplicate incidents.
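The deduplication idea maps naturally onto the `dedup_key` field of the PagerDuty Events API v2. A sketch of building the trigger payload (the audit record shape is a simplified assumption, and the routing key is a placeholder for a real integration key):

```python
import json

# Simplified, hypothetical Satori audit record; the real integration
# reads these from the Satori audit export.
audit_record = {
    "id": "audit-12345",
    "result": "BLOCKED",
    "query": "select * from payroll",
}

event = {
    "routing_key": "YOUR_PAGERDUTY_ROUTING_KEY",  # placeholder
    "event_action": "trigger",
    # Reusing the Satori audit ID as the dedup key means re-sending the
    # same audit entry updates the existing incident rather than
    # opening a duplicate one.
    "dedup_key": audit_record["id"],
    "payload": {
        "summary": f"Satori blocked query: {audit_record['query']}",
        "source": "satori-audit",
        "severity": "warning",
    },
}

body = json.dumps(event)
# In the real integration this body is POSTed to
# https://events.pagerduty.com/v2/enqueue
```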