# sep/16/2022 12:50:26 by RouterOS 7.5
# software id = 79VK-VRAH
#
# model = RB5009UPr+S+
# serial number = HCY08E7SYTZ
/interface bridge
add name=bridge-local
/interface vlan
add interface=ether1 name=vlan1.4 vlan-id=4
add interface=ether1 loop-protect=off name=vlan1.6 vlan-id=6
OPENAI_API_HOST=https://endpointname.openai.azure.com/
OPENAI_API_KEY=xxxxxxxxxxxxxxxxxxxxxx
AZURE_DEPLOYMENT_ID=text-davinci-003
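A minimal sketch of how these settings could be consumed from Python. The variable names match the `.env` file above; the URL shape and the `api-version` value are assumptions about the Azure OpenAI REST endpoint, not part of the original file.

```python
import os

# Read the Azure OpenAI settings defined in the .env file above.
# A real app would typically load them with a library such as python-dotenv;
# here we fall back to the file's own example values.
api_host = os.environ.get("OPENAI_API_HOST", "https://endpointname.openai.azure.com/")
api_key = os.environ.get("OPENAI_API_KEY", "xxxxxxxxxxxxxxxxxxxxxx")
deployment_id = os.environ.get("AZURE_DEPLOYMENT_ID", "text-davinci-003")

# Build a completions URL for the deployment; the api-version is an assumption.
url = (
    f"{api_host.rstrip('/')}/openai/deployments/{deployment_id}"
    f"/completions?api-version=2022-12-01"
)
print(url)
```

The key would then be sent in an `api-key` request header rather than embedded in the URL.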
#!/usr/bin/env python
# coding: utf-8

# In[1]:

# Set arguments
dfDataOriginalPath = "/processedzone/"
dfDataChangedPath = "/changedzone/"
cw_database = "AdventureWorks"
# sep/15/2022 08:47:33 by RouterOS 7.5
# software id = 79VK-VRAH
#
# model = RB5009UPr+S+
# serial number = HCY08E7SYTZ
/interface bridge
add name=bridge-local
/interface vlan
add interface=ether1 name=vlan1.4 vlan-id=4
add interface=ether1 loop-protect=off name=vlan1.6 vlan-id=6
# Set arguments
SourceSystemName = "AdventureWorks"
FlowName = "SalesLTAddress"
SourceStorageAccount = "synapsepiethein"
SourceContainer = "synapsedata"
SourcePath = "/landingzone/AdventureWorks/"
TargetStorageAccount = "synapsepiethein"
TargetContainer = "synapsedata"
TargetPath = "/processedzone/AdventureWorks"
SinkOperation = "merge"
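The account/container/path triplets above are the pieces of a fully qualified ADLS Gen2 location. As a sketch, they can be combined into an `abfss://` URI; the helper function name is hypothetical, but the URI scheme (`abfss://<container>@<account>.dfs.core.windows.net/<path>`) is the standard one for Azure Data Lake Storage Gen2.

```python
# Hypothetical helper (not part of the original notebook) that turns the
# parameters above into a fully qualified ADLS Gen2 path.
def build_abfss_path(storage_account: str, container: str, path: str) -> str:
    # abfss://<container>@<account>.dfs.core.windows.net/<path>
    return f"abfss://{container}@{storage_account}.dfs.core.windows.net/{path.strip('/')}"

SourceStorageAccount = "synapsepiethein"
SourceContainer = "synapsedata"
SourcePath = "/landingzone/AdventureWorks/"

source_uri = build_abfss_path(SourceStorageAccount, SourceContainer, SourcePath)
print(source_uri)
# abfss://synapsedata@synapsepiethein.dfs.core.windows.net/landingzone/AdventureWorks
```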
# Prepare for merge: rename columns of newly loaded data, prepending 'src_'
from datetime import date
from pyspark.sql import functions as F

# Rename all columns in dataChanged, prepend src_, and add SCD2 bookkeeping columns
df_new = dataChanged.select([F.col(c).alias("src_" + c) for c in dataChanged.columns])
src_columnNames = df_new.schema.names
df_new2 = (
    df_new.withColumn("src_current", F.lit(True))
    .withColumn("src_effectiveDate", F.current_date())
    .withColumn("src_endDate", F.lit(date(9999, 12, 31)))
)
df_new2.printSchema()

import hashlib
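The trailing `import hashlib` suggests the next step computes a per-row hash so changed rows can be detected cheaply during the merge. A minimal sketch of that idea in plain Python (independent of Spark; the function name and the `|` separator are assumptions):

```python
import hashlib

# Hypothetical sketch of a row-hashing step: join the column values with a
# separator and hash them, so an incoming row can be compared to the stored
# row by hash instead of column by column.
def row_hash(values) -> str:
    joined = "|".join("" if v is None else str(v) for v in values)
    return hashlib.sha256(joined.encode("utf-8")).hexdigest()

h1 = row_hash(["1", "Main Street", "Seattle"])
h2 = row_hash(["1", "Main Street", "Portland"])
print(h1 != h2)  # differing rows yield differing hashes
```

Inside Spark the equivalent would be a derived column (e.g. hashing a concatenation of the `src_` columns) rather than a Python loop.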
var Kafka = require('node-rdkafka');
var producer = new Kafka.Producer({
  //'debug' : 'all',
  'metadata.broker.list': 'atlas-004133bc-3c87-4862-bf9d-b0ea6ae351f5.servicebus.windows.net:9093', //REPLACE
  'dr_cb': true, //delivery report callback
  'security.protocol': 'SASL_SSL',
  'sasl.mechanisms': 'PLAIN',
  'sasl.username': '$ConnectionString', //do not replace $ConnectionString
  'sasl.password': 'Endpoint=sb://atlas-004133bc-3c87-4862-bf9d-b0ea6ae351f5.servicebus.windows.net/;SharedAccessKeyName=AlternateSharedAccessKey;SharedAccessKey=WrIVbXQnYutxKXsvmfP+Wz4G4OLKHjDtuKH&6=' //REPLACE
});
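The `sasl.password` above is an Azure Event Hubs connection string: semicolon-separated `Key=Value` segments (`Endpoint`, `SharedAccessKeyName`, `SharedAccessKey`). A small sketch of splitting one apart, shown in Python for consistency with the other examples here; the helper name and the dummy connection string are assumptions.

```python
# Hypothetical helper that splits an Event Hubs connection string (the value
# used as sasl.password above) into its parts. partition("=") keeps any "="
# characters inside the value, which matters for base64-padded access keys.
def parse_connection_string(conn: str) -> dict:
    parts = {}
    for segment in conn.split(";"):
        if not segment:
            continue
        key, _, value = segment.partition("=")
        parts[key] = value
    return parts

conn = (
    "Endpoint=sb://example.servicebus.windows.net/;"
    "SharedAccessKeyName=AlternateSharedAccessKey;"
    "SharedAccessKey=abc123="
)
print(parse_connection_string(conn)["SharedAccessKeyName"])
```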