Rachid Azgaou (RachidAZ)

RachidAZ / Elte_Detector.yara
Created January 17, 2022 13:39
YARA file that contains useful rules to detect malicious files
import "pe"
import "math"
private rule IsPE
{
condition:
uint16(0) == 0x5A4D and // MZ
uint32(uint32(0x3C)) == 0x00004550 // PE
}
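To apply the rules file, a minimal sketch using the yara-python package (the rules path and sample path are placeholders):

import yara

# compile the rules file and scan a sample; each match carries the rule name
rules = yara.compile(filepath="Elte_Detector.yara")
matches = rules.match("sample.exe")
for m in matches:
    print(m.rule)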
RachidAZ / ExceptionHandler.cs
Created January 17, 2022 13:37
Catch unhandled exceptions in C#
using System;

private static void SetGlobalExceptionHandler()
{
    AppDomain currentDomain = AppDomain.CurrentDomain;
    currentDomain.UnhandledException += new UnhandledExceptionEventHandler(Handler);
}

private static void Handler(object sender, UnhandledExceptionEventArgs e)
{
    // log the exception before the process exits (assumed body; the gist is truncated here)
    Exception ex = (Exception)e.ExceptionObject;
    Console.WriteLine("Unhandled exception: " + ex.Message);
}
RachidAZ / SeleniumRemote.cs
Created January 17, 2022 13:26
Selenium remote web driver using C#
// package used: Selenium.WebDriver 3.11.2
using System;
using OpenQA.Selenium;
using OpenQA.Selenium.Remote;

// connect to a remote Selenium server driving PhantomJS
IWebDriver driver = new RemoteWebDriver(new Uri("http://{remote_ip:port}/"), DesiredCapabilities.PhantomJS());
driver.Navigate().GoToUrl("http://www.google.com");
Console.WriteLine(driver.Title);
driver.Quit();
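For comparison, a rough Python equivalent, assuming the selenium 3.x client (which still shipped the PhantomJS capability); the /wd/hub suffix is the Python client's default hub endpoint:

from selenium import webdriver
from selenium.webdriver.common.desired_capabilities import DesiredCapabilities

# {remote_ip:port} is a placeholder, as in the C# snippet above
driver = webdriver.Remote(command_executor="http://{remote_ip:port}/wd/hub",
                          desired_capabilities=DesiredCapabilities.PHANTOMJS)
driver.get("http://www.google.com")
print(driver.title)
driver.quit()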
RachidAZ / misc.py
Created January 16, 2022 23:12
quick access to a data lake (ADLS Gen2) from Databricks; save a DataFrame as one partition
# set access info. Disclaimer: hardcoding the account key is not the best way
# to access your data from a security perspective (prefer a secret scope).
spark.conf.set(
    "fs.azure.account.key.{storage_account_name}.dfs.core.windows.net",
    "{storage_key_here}"
)

# timestamp (used below to build the output path)
import datetime
now = datetime.datetime.now()
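The snippet is truncated before the save itself; a minimal sketch of the one-partition write the title describes, assuming an existing DataFrame df and the same placeholder account/container names:

# coalesce(1) forces a single output partition, i.e. one part file
out_path = ("abfss://{container_name}@{storage_account_name}"
            ".dfs.core.windows.net/output/" + now.strftime("%Y%m%d_%H%M%S"))
df.coalesce(1).write.mode("overwrite").csv(out_path, header=True)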
RachidAZ / DF_View.py
Created January 16, 2022 23:03
DataFrame <--> View in PySpark
# create a DataFrame from a table/view:
df = spark.sql("select * from {table_view_name}")

# create a view from a DataFrame:
df.createOrReplaceTempView("{view_name}")
RachidAZ / varilable_mutation.py
Last active January 16, 2022 23:14
PySpark: use a Python variable in a SQL command
# set the variable in Python
filePath = 'abfss://{container_name}@{storage_account_name}.dfs.core.windows.net/curated/Dims/Dim_X/*.csv'
spark.conf.set('f.filePath', filePath)

# use the variable in a SQL command via ${f.filePath} substitution
%sql
CREATE OR REPLACE TEMPORARY VIEW V_SomeView
USING CSV
OPTIONS (path "${f.filePath}", header "true")  -- assumed continuation; the gist is truncated here
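A quick way to verify the substitution worked is to read the view back:

# should list rows from the CSV files at filePath
spark.sql("select * from V_SomeView").show(5)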
RachidAZ / DF_with_schema.py
Created January 16, 2022 22:52
create a DataFrame with a defined schema in PySpark
import json
from pyspark.sql.types import StructType

# set the schema with the following json structure
jsonStringFromFile = """
{
  "type": "struct",
  "fields": [
    {
      "name": "id",
      "type": "integer",
      "nullable": true,
      "metadata": {}
    }
  ]
}
"""
# assumed continuation (the gist is truncated mid-JSON): build the schema and an empty DataFrame
schema = StructType.fromJson(json.loads(jsonStringFromFile))
df = spark.createDataFrame([], schema)
RachidAZ / UploadBlob.py
Created January 16, 2022 22:45
upload a file as a blob to an Azure storage account using Python
# sync client (the async variant lives in azure.storage.blob.aio)
from azure.storage.blob import BlobClient

account_name = "{account_name}"
target_conn_str = "{connection_string}"
target_container_name = "{container_name}"
target_blob_name = "{file_to_upload}"
blob = BlobClient.from_connection_string(conn_str=target_conn_str, container_name=target_container_name, blob_name=target_blob_name)

# upload the local file (assumed continuation; the gist is truncated here)
with open(target_blob_name, "rb") as data:
    blob.upload_blob(data, overwrite=True)
RachidAZ / CosmosDB_Throughput_Get_Set.cs
Created January 16, 2022 21:50
Scale Cosmos DB throughput up/down (manual or autoscale, at shared DB or collection level)
using Microsoft.Azure.Cosmos;
using System.Threading.Tasks;
// DB-level throughput (shared among collections); returns both manual and autoscale values
public static async Task<int?[]> GetDBThroughput(string connection, string key, string db)
{
    using (var clientcos = new CosmosClient(connection, key))
    {
        // assumed continuation; the gist is truncated here
        ThroughputResponse r = await clientcos.GetDatabase(db).ReadThroughputAsync(requestOptions: null);
        return new int?[] { r.Resource.Throughput, r.Resource.AutoscaleMaxThroughput };
    }
}
RachidAZ / GetBlobBySAS.cs
Last active January 16, 2022 21:31
Get an Azure blob as a stream using a SAS key
using Azure.Storage.Blobs;
using System;
using System.IO;

static internal MemoryStream GetBlobStreamUsingSAS(string AccountName, string SAS, string Container, string blobName)
{
    // rebuild the URI so the SAS query string comes after the container/blob path
    Uri uriAddress = new Uri("https://" + AccountName + ".blob.core.windows.net/" + SAS);
    var newurl = uriAddress.Scheme + "://" + uriAddress.Host + "/" + Container + "/" + blobName + uriAddress.Query;
    BlobClient blobClient = new BlobClient(new Uri(newurl), null);
    var memoryStream = new MemoryStream();
    // download into the stream and rewind it (assumed continuation; the gist is truncated here)
    blobClient.DownloadTo(memoryStream);
    memoryStream.Position = 0;
    return memoryStream;
}