# Set up a JDBC connection to your database.
# Get the credentials from the Azure Key Vault that a secret scope was defined for
# using the Databricks CLI. See: https://docs.azuredatabricks.net/security/secrets/secret-scopes.html
# Alternatively, hard-code the credentials here.
jdbcUrl = dbutils.secrets.get(scope="JDBC", key="url")
jdbcUsername = dbutils.secrets.get(scope="JDBC", key="username")
jdbcPassword = dbutils.secrets.get(scope="JDBC", key="password")

# Create a dictionary with the credentials that we can pass to the Spark DataFrame reader.
# Test the connection by counting the result of a trivial one-row query.
rc = spark.read.jdbc(url=jdbcUrl, table='(select 1 as nr) as t', properties=connectionProperties).count()
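The one-row query above doubles as a cheap connectivity check. A minimal sketch of wrapping it so that a bad URL or bad credentials fail fast with a readable message; the `check_jdbc_connection` helper and its `run_probe` argument are hypothetical names, and in a notebook the callable would be the `spark.read.jdbc(...).count()` expression above:

```python
def check_jdbc_connection(run_probe):
    """Run a trivial probe query; return True on success, False otherwise.

    run_probe: a zero-argument callable that executes the probe, e.g.
        lambda: spark.read.jdbc(url=jdbcUrl,
                                table='(select 1 as nr) as t',
                                properties=connectionProperties).count()
    """
    try:
        rows = run_probe()
        # The probe selects exactly one row, so anything else is suspicious.
        return rows == 1
    except Exception as exc:
        print(f"JDBC connection failed: {exc}")
        return False
```

Keeping the probe as a callable means the same helper can be reused for several databases without re-reading secrets each time.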
# Run custom SQL on each database session before Spark issues its queries,
# via the sessionInitStatement JDBC connection property.
connectionProperties['sessionInitStatement'] = '<insert custom SQL code here>'
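`sessionInitStatement` is just another string-valued entry in the properties dictionary, so it can be assembled alongside the credentials. A sketch of one way to build the full property set; the `build_jdbc_properties` helper and the `SET LOCK_TIMEOUT` statement are illustrative, not part of the original notebook:

```python
def build_jdbc_properties(user, password, session_init=None):
    """Assemble the properties dict for spark.read.jdbc.

    session_init: optional SQL to run once per database session
    before any queries execute (sessionInitStatement).
    """
    props = {"user": user, "password": password}
    if session_init is not None:
        props["sessionInitStatement"] = session_init
    return props

# Example: a SQL Server-style init statement capping lock waits.
props = build_jdbc_properties("me", "secret",
                              session_init="SET LOCK_TIMEOUT 5000")
```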
# Get the credentials from the Azure Key Vault that a secret scope was defined for
# using the Databricks CLI. See: https://docs.azuredatabricks.net/security/secrets/secret-scopes.html
# Alternatively, hard-code the credentials here.
jdbcUrl = dbutils.secrets.get(scope="JDBC", key="url")
jdbcUsername = dbutils.secrets.get(scope="JDBC", key="username")
jdbcPassword = dbutils.secrets.get(scope="JDBC", key="password")

# Create a dictionary with the credentials that we can pass to the Spark DataFrame reader.
connectionProperties = {
    "user": jdbcUsername,
    "password": jdbcPassword
}