@sagar-synergenie
Created July 2, 2018 11:46
Simba Spark Driver
# .bashrc
# Make the Simba Spark ODBC driver libraries visible to the dynamic linker.
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/lib:/opt/simba/spark/lib/64
# Point unixODBC at the DSN definitions and the driver registration.
export ODBCINI=/etc/odbc.ini
export ODBCINSTINI=/usr/local/odbc
# Driver-wide configuration file for the Simba Spark ODBC driver.
export SIMBASPARKINI=/etc/simba.sparkodbc.ini
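With the environment sourced, the DSN can be checked from the shell before wiring it into PHP. This is a minimal sketch that assumes unixODBC's isql tool is installed; the DSN name and credentials match the ones used further down in this gist.
source ~/.bashrc
# -v prints the full driver error message if the connection attempt fails
isql -v "Simba Spark 64-bit" cassandra cassandra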
[Simba Spark 64-bit]
# Description: DSN Description.
# This key is not necessary and is only to give a description of the data source.
Description=Simba Spark ODBC Driver (64-bit) DSN
# Driver: The location where the ODBC driver is installed.
Driver=/opt/simba/spark/lib/64/libsparkodbc_sb64.so
# The host name or IP of the Thrift server.
# I have used my IP here
HOST=127.0.0.1
# The TCP port the Thrift server is listening on.
PORT=9042
# The name of the database schema to use when a schema is not explicitly specified in a query.
Schema=hdb_new
# The Spark Server Type
# 1 - Shark Server 1 for Shark 0.8.1 and earlier
# 2 - Shark Server 2 for Shark 0.9.*
# 3 - Spark Thrift Server for Shark 1.1 and later
SparkServerType=3
# The authentication mechanism to use for the connection.
#   Set to 0 for No Authentication
#   Set to 1 for Kerberos
#   Set to 2 for User Name
#   Set to 3 for User Name and Password
# Note only No Authentication is supported when connecting to Shark Server 1.
AuthMech=0
# The Thrift transport to use for the connection.
#  Set to 0 for Binary
#  Set to 1 for SASL
#  Set to 2 for HTTP
# Note for Shark Server 1 only Binary can be used.
ThriftTransport=0
# When this option is enabled (1), the driver does not transform the queries emitted by an
# application, so the native query is used.
# When this option is disabled (0), the driver transforms the queries emitted by an application and
# converts them into an equivalent form in Spark SQL.
UseNativeQuery=1
# Set the UID with the user name to use to access Spark when using AuthMech 2 or 3.
UID=
# The following settings are used when using Kerberos authentication (AuthMech=1).
# The fully qualified host name part of the Spark Thrift Server Kerberos service principal.
# For example, if the service principal name of your Spark Thrift Server is:
#   spark/myhs2.mydomain.com@EXAMPLE.COM
# Then set KrbHostFQDN to myhs2.mydomain.com
KrbHostFQDN=_HOST
# The service name part of the Spark Thrift Server Kerberos service principal.
# For example, if the service principal name of your Spark Thrift Server is:
#   spark/myhs2.mydomain.com@EXAMPLE.COM
# Then set KrbServiceName to spark
KrbServiceName=spark
# The realm part of the Spark Thrift Server Kerberos service principal.
# For example, if the service principal name of your Spark Thrift Server is:
#   spark/myhs2.mydomain.com@EXAMPLE.COM
# Then set KrbRealm to EXAMPLE.COM
KrbRealm=
# Set to 1 to enable SSL. Set to 0 to disable.
SSL=0
# Set to 1 to enable two-way SSL. Set to 0 to disable. You must enable SSL in order to
# use two-way SSL.
TwoWaySSL=0
# The file containing the client certificate in PEM format. This is required when using two-way SSL.
ClientCert=
# The client private key. This is used for two-way SSL authentication.
ClientPrivateKey=
# The password for the client private key. A password is only required if the client
# private key is password protected.
ClientPrivateKeyPassword=
# The partial URL corresponding to the Spark server; used when connecting over the
# HTTP transport (ThriftTransport=2), for example to Spark on HDInsight.
# I have used my IP here
HTTPPath=spark://127.0.0.1:7077
[Driver]
ErrorMessagesPath=/opt/simba/spark/ErrorMessages/
LogLevel=0
LogPath=
SwapFilePath=/tmp
<?php
// Connect to the "Simba Spark 64-bit" DSN defined in /etc/odbc.ini.
// The second and third arguments are the ODBC username and password.
$conn = odbc_connect("Simba Spark 64-bit", "cassandra", "cassandra");
// Dumps a connection resource on success, or bool(false) on failure.
var_dump($conn);
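Once odbc_connect() returns a valid resource, queries can be issued through the same handle. The following is a minimal sketch, not part of the original gist; my_table is a hypothetical table name and should be replaced with a table that actually exists in the hdb_new schema.
<?php
// Hypothetical usage sketch: run a query through the DSN and print the rows.
$conn = odbc_connect("Simba Spark 64-bit", "cassandra", "cassandra");
if ($conn === false) {
    die("Connection failed\n");
}
// my_table is a placeholder; substitute a table that exists in hdb_new.
$result = odbc_exec($conn, "SELECT * FROM my_table LIMIT 10");
while ($row = odbc_fetch_array($result)) {
    print_r($row);
}
odbc_close($conn);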