Dhyanendra Singh Rathore (DhyanRathore)

DhyanRathore / azure-databricks-mount-adlsGen2.py
Last active October 28, 2020 21:50
Mounting Azure Data Lake Storage Gen2 Account to Azure Databricks with Service Principal and OAuth
# Python code to mount and access Azure Data Lake Storage Gen2 Account to Azure Databricks with Service Principal and OAuth
# Author: Dhyanendra Singh Rathore
# Define the variables used for creating connection strings
adlsAccountName = "dlscsvdataproject"
adlsContainerName = "csv-data-store"
adlsFolderName = "covid19-data"
mountPoint = "/mnt/csvFiles"
# Application (Client) ID
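The preview above cuts off before the OAuth configuration is assembled. A minimal sketch of how the mount configuration and ABFS source URL are typically built for `dbutils.fs.mount` (the service-principal values here are placeholders, not taken from the gist):

```python
# Sketch: assemble the OAuth configs and source URL for an ADLS Gen2 mount.
# applicationId, authenticationKey, and tenantId are placeholder inputs.
def build_mount_config(adlsAccountName, adlsContainerName, adlsFolderName,
                       applicationId, authenticationKey, tenantId):
    endpoint = "https://login.microsoftonline.com/" + tenantId + "/oauth2/token"
    configs = {
        "fs.azure.account.auth.type": "OAuth",
        "fs.azure.account.oauth.provider.type":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        "fs.azure.account.oauth2.client.id": applicationId,
        "fs.azure.account.oauth2.client.secret": authenticationKey,
        "fs.azure.account.oauth2.client.endpoint": endpoint,
    }
    source = ("abfss://" + adlsContainerName + "@" + adlsAccountName
              + ".dfs.core.windows.net/" + adlsFolderName)
    return configs, source

configs, source = build_mount_config(
    "dlscsvdataproject", "csv-data-store", "covid19-data",
    "<application-id>", "<client-secret>", "<tenant-id>")
# In a Databricks notebook the mount would then be created with:
# dbutils.fs.mount(source=source, mount_point="/mnt/csvFiles", extra_configs=configs)
```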
DhyanRathore / azure-databricks-connect-SQLServer.py
Last active July 26, 2023 14:03
Connecting to Azure SQL Databases from Azure Databricks with Secrets Scope
# Python code to connect to Azure SQL Databases from Azure Databricks with Secrets Scope
# Author: Dhyanendra Singh Rathore
# Declare variables for creating JDBC URL
jdbcHostname = "sql-csv-data-server.database.windows.net" # Replace with your SQL Server name
jdbcPort = 1433 # Replace with your SQL Server port number
jdbcDatabase = "syn-csv-data-dw" # Replace with your database name
# Connection secrets from vault
jdbcUsername = dbutils.secrets.get(scope="CSVProjectKeyVault",key="SQLAdmin") # Replace the scope and key accordingly
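From the variables above, the JDBC URL follows the standard SQL Server format. A small sketch of the URL construction (the table name in the trailing comment is a placeholder, and the password lookup mirrors the username line):

```python
# Sketch: build the JDBC URL that spark.read.jdbc consumes.
jdbcHostname = "sql-csv-data-server.database.windows.net"
jdbcPort = 1433
jdbcDatabase = "syn-csv-data-dw"

jdbcUrl = "jdbc:sqlserver://{0}:{1};database={2}".format(
    jdbcHostname, jdbcPort, jdbcDatabase)

# In Databricks, credentials come from the secrets scope and the read looks like:
# df = spark.read.jdbc(url=jdbcUrl, table="<schema.table>",
#                      properties={"user": jdbcUsername, "password": jdbcPassword})
```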
DhyanRathore / adb-schema-drifted-incremental-load.py
Last active July 7, 2021 08:13
Cleansing and transforming schema drifted csv files into relational data with incremental loads in Azure Databricks
# Python/PySpark code for cleansing and transforming schema drifted csv files into relational data with incremental loads in Azure Databricks
# Author: Dhyanendra Singh Rathore
# Define the variables used for creating connection strings
adlsAccountName = "dlscsvdataproject"
adlsContainerName = "csv-data-store"
adlsFolderName = "covid19-data"
mountPoint = "/mnt/csvFiles"
# Application (Client) ID
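A common building block when handling schema drift is working out which target columns a newly arrived file is missing, so they can be added (for example with `lit(None)`) before the incremental load. A minimal pure-Python sketch, with illustrative column names:

```python
def missing_columns(target_columns, incoming_columns):
    """Return target columns absent from the incoming file, preserving order."""
    incoming = set(incoming_columns)
    return [c for c in target_columns if c not in incoming]

target = ["date", "country", "confirmed", "deaths", "recovered"]
incoming = ["date", "country", "confirmed"]
missing_columns(target, incoming)  # -> ["deaths", "recovered"]
```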
DhyanRathore / adf-pl-copy-files-based-on-URL-pattern.json
Created September 23, 2020 06:46
Using Azure Data Factory to copy multiple files based on URL pattern over HTTP
{
"name": "pl_autoCopyCsvFiles",
"properties": {
"activities": [
{
"name": "ac_checkAllAvailableFiles",
"type": "GetMetadata",
"dependsOn": [],
"policy": {
"timeout": "7.00:00:00",
DhyanRathore / index.js
Last active December 14, 2020 19:24
Azure Function: Node.js app to read PostgreSQL data and return JSON
// Azure Function: Node.js code to read PostgreSQL data and return results as JSON
// Author: Dhyanendra Singh Rathore
// Import the pg (node-postgres) library
const pg = require('pg');
// Entry point of the function
module.exports = async function(context, req) {
// Define variables to store connection details and credentials
DhyanRathore / index.js
Last active September 10, 2021 18:59
Azure Function: Node.js app to read PostgreSQL data with query parameter and return JSON
// Azure Function: Node.js code to read PostgreSQL data with query parameter and return results as JSON
// Author: Dhyanendra Singh Rathore
// Import the pg (node-postgres) library
const pg = require('pg');
// Entry point of the function
module.exports = async function(context, req) {
// Define variables to store connection details and credentials
DhyanRathore / index.js
Last active April 17, 2023 11:46
Azure Function: Node.js code to read Environment Variables during function execution and return variables as JSON
// Azure Function: Node.js code to read Environment Variables during function execution and return variables as JSON
// Author: Dhyanendra Singh Rathore
// Entry point of the function
module.exports = async function(context, req) {
// Fetch environment variables during execution
const postgresServerName = process.env["POSTGRES_SERVER_NAME"];
const postgresUserName = process.env["POSTGRES_USER"];
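The point of the snippet is that `process.env` is read at execution time, so app settings changed in the portal take effect without redeploying. The same pattern in a Python analogue for comparison (variable names follow the snippet above; this is not the gist's code):

```python
import json
import os

def read_pg_settings():
    """Read connection settings from the environment at call time, as the function does."""
    return {
        "postgresServerName": os.environ.get("POSTGRES_SERVER_NAME"),
        "postgresUserName": os.environ.get("POSTGRES_USER"),
    }

# Demo values so the sketch runs outside Azure:
os.environ["POSTGRES_SERVER_NAME"] = "demo-server"
os.environ["POSTGRES_USER"] = "demo-user"
print(json.dumps(read_pg_settings()))
```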
DhyanRathore / index.js
Last active December 13, 2020 17:22
Azure Function: Node.js app to query Azure Synapse Analytics data warehouse and return JSON results
// Azure Function: Node.js code to read data from Azure Synapse Analytics with query parameter and return results as JSON
// Author: Dhyanendra Singh Rathore
// Import the tedious library
const Connection = require('tedious').Connection;
const Request = require('tedious').Request;
const TYPES = require('tedious').TYPES;
// Entry point of the function
module.exports = function(context, req) {
DhyanRathore / ClaimsDemo.cs
Last active March 15, 2021 23:15
Get user Identity and Claims from HTTP Request Headers
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;
using System.Security.Claims;
using System.IdentityModel.Tokens.Jwt;
// DEMO: Get user Identity and Claims from the Token Headers
namespace ClaimsDemo.Function
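The C# gist pulls claims out of the bearer token that arrives in the request headers. The core mechanic, reading a JWT's payload segment without signature verification, can be sketched in Python (the toy token below is fabricated for illustration; in production the signature must be validated):

```python
import base64
import json

def jwt_claims(token):
    """Decode the payload segment of a JWT without verifying the signature."""
    payload = token.split(".")[1]
    payload += "=" * (-len(payload) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload))

# A toy, unsigned token for illustration only:
header = base64.urlsafe_b64encode(b'{"alg":"none"}').decode().rstrip("=")
body = base64.urlsafe_b64encode(b'{"name":"demo user"}').decode().rstrip("=")
jwt_claims(header + "." + body + ".")  # -> {"name": "demo user"}
```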
DhyanRathore / ClaimsDemo.cs
Last active March 19, 2021 15:15
Get ClaimsPrincipal as a Binding Parameter
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;
using System.Security.Claims;
// DEMO: Get ClaimsPrincipal as a binding parameter
namespace ClaimsDemo.Function
{