## Send Data from Logstash to Azure SQL Database
This walkthrough assumes the ELK stack (Elasticsearch, Logstash, and Kibana) is already set up.
- Create an Azure SQL database
- Get the following information:
  - Server name
  - Database name
  - Server username
  - Server password
- Create the Azure SQL table:

  ```sql
  CREATE TABLE dbo.sensordata (
      id int IDENTITY(1,1) PRIMARY KEY NOT NULL,
      deviceip char(20),
      logtime datetime,
      sensor1 decimal,
      sensor2 decimal,
      sensor3 decimal,
      sensor4 decimal,
      sensor5 decimal,
      sensor6 decimal
  )
  GO
  ```

  Note that a bare `decimal` defaults to `decimal(18,0)` in SQL Server, which stores no fractional digits; use e.g. `decimal(18,4)` if your sensor readings carry decimals.
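Later, once the rest of the pipeline is running, a quick query from any SQL client connected to the database confirms that rows are landing in this table. This is an illustrative sketch; the column selection is arbitrary:

```sql
-- Illustrative sanity check: the newest rows should appear shortly
-- after Logstash begins forwarding events.
SELECT TOP 10 id, deviceip, logtime, sensor1, sensor2
FROM dbo.sensordata
ORDER BY logtime DESC;
```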
- Get `jdbc.rb`, theangryangel's JDBC Logstash output plugin, and put it into your Logstash output plugin folder, e.g. `/opt/logstash/lib/logstash/outputs/jdbc.rb`
- Get the Microsoft JDBC Driver for SQL Server from Microsoft's download page
- Extract the download, and then get the `sqljdbc4.jar` file
- Create a new folder in your Logstash installation. If your Logstash installation is at `/opt/logstash`, use the command below:

  ```shell
  mkdir -p /opt/logstash/vendor/jar/jdbc/
  ```
- Put the `sqljdbc4.jar` file into the `jdbc` folder you created above
- Set up your Logstash config file to output to JDBC:

  ```
  input {
    udp {
      port => 5005
      tags => ["udp"]
      type => "sensorData"
    }
    tcp {
      port => 5005
      tags => ["tcp"]
      type => "sensorData"
    }
  }

  filter {
    if [type] == "sensorData" {
      json {
        add_tag => [ "sensorData" ]
        source => "message"
      }
    }
  }

  output {
    stdout {}
    jdbc {
      driver_class => 'com.microsoft.sqlserver.jdbc.SQLServerDriver'
      connection_string => "jdbc:sqlserver://SERVER_NAME.database.windows.net;databaseName=DATABASE_NAME;user=SERVER_USERNAME;password=SERVER_PASSWORD;autoReconnect=true;"
      statement => [ "INSERT INTO dbo.sensordata (deviceip, logtime, sensor1, sensor2, sensor3, sensor4, sensor5, sensor6) VALUES(?, ?, ?, ?, ?, ?, ?, ?)", "host", "@timestamp", "Sensor1", "Sensor2", "Sensor3", "Sensor4", "Sensor5", "Sensor6" ]
    }
    elasticsearch {
      host => "127.0.0.1"
      protocol => "http"
    }
  }
  ```
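To smoke-test the pipeline end to end, you can send a single JSON event to the UDP input. This is a sketch under a few assumptions: the field names `Sensor1` through `Sensor6` are taken from the `statement` mapping in the config above, and Logstash is assumed to be listening locally on port 5005:

```shell
# Build a test event whose field names match the statement mapping in the
# config above (Sensor1..Sensor6 are assumptions, not a fixed schema).
PAYLOAD='{"Sensor1":1.5,"Sensor2":2.5,"Sensor3":3.5,"Sensor4":4.5,"Sensor5":5.5,"Sensor6":6.5}'

# Send it over UDP to the input defined in the config; /dev/udp is a bash
# feature, and a UDP send succeeds even before a listener is attached.
echo "$PAYLOAD" > /dev/udp/127.0.0.1/5005

echo "sent: $PAYLOAD"
```

If the event arrives, it should appear on Logstash's stdout (via the `stdout {}` output) and as a new row in `dbo.sensordata`.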