name | cusip | |
---|---|---
INVESCO QQQ TR | 46090E103 | InvescoUS
NVIDIA CORPORATION | 67066G104 | NVIDIAGeForce
MICROSOFT CORP | 594918104 | Microsoft365
AMAZON COM INC | 023135106 | amazonnews
ISHARES TR | 464288513 | iSharesCanada
ALPHABET INC | 02079K305 | Alphabetlnc
ALPHABET INC | 02079K107 | Alphabetlnc
SELECT SECTOR SPDR TR | 81369Y506 | SectorSPDRs
ADOBE INC | 00724F101 | AdobeCare
https://github.com/iperov/DeepFaceLab
# Utils:
# For downloading YouTube videos: https://www.y2mate.com/en19
# For splitting videos into images: https://www.online-convert.com/result/74a551a3-0134-4119-90d9-fb7361028642
# For trimming videos (Windows): right-click the video and "Open with Photos"
# 1. Find source video/images
# 2. Find destination video/images
# Extract images from dst .mp4
chdir Documents\GitHub\DeepFaceLab-DF.wf.288res.384.92.72.22\DeepFaceLab-DF.wf.288res.384.92.72.22
# https://www.youtube.com/watch?v=nWeW3sCmD2k
# Topics:
# 1. Creating DB and Tables
# 2. Inserting Data
# 3. Updating Data
# 4. Deleting Data
# 5. Alter Tables
# 6. Basic Querying
# 7. SQL Operators
# 8. Indexes
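The topics above can be sketched end to end with Python's built-in `sqlite3` module; the `jobs` table and its columns are illustrative, not from the video:

```python
import sqlite3

# In-memory database; pass a file path instead to persist it
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# 1. Create a table
cur.execute("CREATE TABLE jobs (id INTEGER PRIMARY KEY, name TEXT, status TEXT)")

# 2. Insert data (parameterized to avoid SQL injection)
cur.executemany("INSERT INTO jobs (name, status) VALUES (?, ?)",
                [("load_claims", "RUNNING"), ("build_index", "FAILED")])

# 3. Update data
cur.execute("UPDATE jobs SET status = 'SUCCEEDED' WHERE name = 'load_claims'")

# 4. Delete data
cur.execute("DELETE FROM jobs WHERE status = 'FAILED'")

# 5. Alter the table
cur.execute("ALTER TABLE jobs ADD COLUMN comments TEXT")

# 6./7. Basic querying with the LIKE operator
rows = cur.execute("SELECT name, status FROM jobs WHERE status LIKE 'S%'").fetchall()
print(rows)  # → [('load_claims', 'SUCCEEDED')]

# 8. Create an index
cur.execute("CREATE INDEX idx_jobs_status ON jobs (status)")
conn.close()
```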
###
# 1. Read an index from Elasticsearch with PySpark as an RDD
###
es_read_conf = {
    "es.nodes": "node-1.XXX.YYY,node-2.XXX.YYY,node-3.XXX.YYY,node-4.XXX.YYY",
    "es.port": "9200",
    "es.resource": "INDEX/TYPE",
    "es.net.http.auth.user": "USERNAME",
    "es.net.http.auth.pass": "PASSWORD",
    "es.nodes.wan.only": "false",
}

# Read via the es-hadoop connector (requires the elasticsearch-hadoop jar on the classpath)
es_rdd = sc.newAPIHadoopRDD(
    inputFormatClass="org.elasticsearch.hadoop.mr.EsInputFormat",
    keyClass="org.apache.hadoop.io.NullWritable",
    valueClass="org.elasticsearch.hadoop.mr.LinkedMapWritable",
    conf=es_read_conf)
'''
For the examples below, assume a table with the following schema:
root
 |-- jobgroup: string (nullable = true)
 |-- jobname: string (nullable = true)
 |-- starttime: timestamp (nullable = true)
 |-- endtime: timestamp (nullable = true)
 |-- status: string (nullable = true)
 |-- comments: string (nullable = true)
 |-- no_of_recs_processed: integer (nullable = true)
#!/bin/bash
#
# 1. Create log folders for each day in the new location:
#
LOGPATH='/data/wsolomon/workspace/projects/Claim_Extractor_Pipeline/logs' # LOGPATH='/data/bdr/logs/bdr-ptab'
RUNDATE=$(date +'%Y%m%d')
JOB=$(basename "$0")
mkdir -p "$LOGPATH"
'''
1. Loading XMLs into PySpark dataframes
'''
import os
os.environ['PYSPARK_SUBMIT_ARGS'] = '--packages com.databricks:spark-xml_2.10:0.4.1 pyspark-shell'
import findspark
findspark.init('/location/of/spark2-client/')
import pyspark
from pyspark.sql import SQLContext, SparkSession, HiveContext
spark = SparkSession.builder.appName('NAME_OF_JOBS').enableHiveSupport().getOrCreate()

# Load an XML file, one row per rowTag element (path and tag are placeholders)
df = spark.read.format('com.databricks.spark.xml') \
    .option('rowTag', 'record').load('/path/to/file.xml')