rchamarthi
rchamarthi / typescript-extending-interfaces.ts
Last active July 31, 2023 22:34
TypeScript: extending interfaces
interface TestInterface {
  name: string;
  description: string;
}
interface MyTestInterface extends TestInterface {
  id: number;
}
// Example values (assumed): the extended interface requires the inherited fields plus id
const myEvent: MyTestInterface = {
  id: 1,
  name: 'my-event',
  description: 'an object that satisfies both interfaces'
};
rchamarthi / axios-validate-error.ts
Last active July 30, 2023 17:40
Axios call with status code validation and exception handling
import axios from 'axios';

const print_response = async (url: string) => {
  console.log(`requesting url : ${url}`);
  try {
    const response = await axios.get(url, {
      validateStatus: function (status) {
        return status === 200; // only a 200 counts as success (stricter than the axios default of any 2xx)
      }
    });
    console.log(`status: ${response.status}`, response.data);
  } catch (error) {
    // Non-200 responses and network failures are rejected and land here
    console.error(`request failed for ${url}:`, error);
  }
};
rchamarthi / terraform_aws_multiregion_data.tf
Created September 9, 2021 02:34
Terraform module config to get data from two different regions
# Assumes AWS credentials are exported in the environment
provider "aws" {
  region = "us-east-1"
  alias  = "us-east-1"
}

provider "aws" {
  region = "us-west-2"
  alias  = "us-west-2"
}

# Example (assumed): reference an aliased provider from a data source
data "aws_caller_identity" "east" {
  provider = aws.us-east-1
}
rchamarthi / s3_copy_operator.py
Created April 5, 2018 21:42
Airflow S3 cross-account copy operator
import logging
from tempfile import NamedTemporaryFile
from airflow.exceptions import AirflowException
from airflow.hooks.S3_hook import S3Hook
from airflow.models import BaseOperator
from airflow.utils.decorators import apply_defaults
class S3CopyOperator(BaseOperator):
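    # Sketch, not part of the original gist preview: the preview ends at the
    # class declaration above. A minimal guess at the body, using the imports
    # shown above: one Airflow connection per AWS account, download the key to
    # a temp file with the source credentials, then re-upload it with the
    # destination credentials. The constructor arguments (src_conn_id,
    # src_bucket, dest_conn_id, dest_bucket, key) are hypothetical, not taken
    # from the original operator.

    @apply_defaults
    def __init__(self, src_conn_id, src_bucket, dest_conn_id, dest_bucket,
                 key, *args, **kwargs):
        super(S3CopyOperator, self).__init__(*args, **kwargs)
        self.src_conn_id = src_conn_id
        self.src_bucket = src_bucket
        self.dest_conn_id = dest_conn_id
        self.dest_bucket = dest_bucket
        self.key = key

    def execute(self, context):
        src_hook = S3Hook(aws_conn_id=self.src_conn_id)
        dest_hook = S3Hook(aws_conn_id=self.dest_conn_id)
        if not src_hook.check_for_key(self.key, bucket_name=self.src_bucket):
            raise AirflowException('%s not found in %s' % (self.key, self.src_bucket))
        # Download with the source account's credentials, upload with the destination's
        with NamedTemporaryFile() as tmp:
            src_hook.get_key(self.key, bucket_name=self.src_bucket).download_file(tmp.name)
            dest_hook.load_file(tmp.name, key=self.key,
                                bucket_name=self.dest_bucket, replace=True)
            logging.info('copied %s from %s to %s',
                         self.key, self.src_bucket, self.dest_bucket)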
rchamarthi / text_to_csv.py
Created December 2, 2017 15:57
Writing a string to a CSV file in Python
import csv

row = 'abc,xyz'.split(',')
print(row)

# newline='' is recommended by the csv module docs to avoid extra blank lines on Windows
with open('eggs.csv', 'w', newline='') as csvfile:
    spamwriter = csv.writer(csvfile, delimiter='|',
                            quotechar='"', quoting=csv.QUOTE_MINIMAL)
    spamwriter.writerow(row)
rchamarthi / ally_to_morningstar.py
Created November 26, 2017 03:56
Script to convert stock portfolio data from the Ally/TradeKing export format to the format expected by Morningstar
# Expected format by Morningstar:
# Symbol,Quantity,Price,Action,TradeDate,Amount,Commission
# ABT,200,49.1475,Buy,8/31/2010,9836.5,7
# Current export format from Ally/TradeKing:
# "Symbol","Description","Qty","Underl.Stock","CostBasis","Avg Price","Price<sup>*</sup>","Change","Change<br />%","TotalG/L","MarketValue","","",
# "AGN","Allergan Plc","10",,"$2,448.25","$244.83","$173.75","-$1.03","-0.59","-$710.75","$1,737.50","",
import csv
output_columns = "Symbol,Quantity,Price,Amount"
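# Sketch, not part of the original gist preview: the preview stops after
# output_columns. One plausible continuation, based only on the headers shown
# in the comments above: read the Ally/TradeKing export with csv.DictReader,
# strip the $ signs and thousands separators, and write just the four output
# columns. The file names and the column mapping (Qty -> Quantity,
# Avg Price -> Price, CostBasis -> Amount) are assumptions, not taken from the
# full script.

def clean_number(value):
    return value.replace('$', '').replace(',', '').strip()

with open('ally_export.csv') as infile, open('morningstar.csv', 'w', newline='') as outfile:
    reader = csv.DictReader(infile)
    writer = csv.DictWriter(outfile, fieldnames=output_columns.split(','))
    writer.writeheader()
    for row in reader:
        if not row.get('Symbol'):  # skip blank or footer rows in the export
            continue
        writer.writerow({
            'Symbol': row['Symbol'],
            'Quantity': clean_number(row['Qty']),
            'Price': clean_number(row['Avg Price']),
            'Amount': clean_number(row['CostBasis']),
        })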
rchamarthi / spark_udf_sql.scala
Created July 16, 2017 20:33
Spark: calling a UDF from Spark SQL
scala> import org.apache.spark.sql.functions.{input_file_name, udf}
import org.apache.spark.sql.functions.{input_file_name, udf}
scala> def extract_file_name(path: String): String =
     |   path.split("/").last
extract_file_name: (path: String)String
scala> spark.sqlContext.udf.register("extract_file_name", extract_file_name _)
res4: org.apache.spark.sql.expressions.UserDefinedFunction = UserDefinedFunction(<function1>,StringType,Some(List(StringType)))
// Example (assumed): call the registered UDF from a Spark SQL query; the table path is a placeholder
scala> spark.sql("SELECT extract_file_name(input_file_name()) AS file_name FROM parquet.`/path/to/data`").show()