From database into Watson Discovery
The code shown below is a simple Python example that pulls data from a SQL database supported by SQLAlchemy and injects the rows as new JSON documents into a collection of the Watson Discovery service on IBM Cloud.
```python
# Simple test of the Db2 on Cloud REST API
# Written by Henrik Loeser, firstname.lastname@example.org
import requests, json, sys, time

# Read credentials from file
filename = "credentials.json"  # adapt to where your credentials file is stored
with open(filename) as data_file:
    credentials = json.load(data_file)
```
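The database-to-Discovery step itself could be sketched as below. This is only an illustration under several assumptions: it uses the stdlib `sqlite3` driver as a stand-in for whichever SQLAlchemy-supported database is actually used, and the table name, credential keys, environment/collection IDs, and API version date are made-up placeholders, not values from the original sample.

```python
# Sketch: pull rows from a SQL database and push each one as a JSON
# document into a Watson Discovery collection. All names below
# (table, credential keys, environment/collection IDs, version date)
# are illustrative placeholders.
import json
import sqlite3

def rows_to_documents(conn, query):
    """Run a query and turn every result row into a JSON-serializable dict."""
    cur = conn.execute(query)
    columns = [col[0] for col in cur.description]
    return [dict(zip(columns, row)) for row in cur]

def upload_document(doc, credentials, environment_id, collection_id):
    """POST one JSON document to the Discovery v1 REST API."""
    import requests  # imported here so the offline part above runs without it
    url = ("{0}/v1/environments/{1}/collections/{2}/documents"
           .format(credentials["url"], environment_id, collection_id))
    return requests.post(
        url,
        params={"version": "2019-04-30"},       # assumed API version date
        auth=("apikey", credentials["apikey"]),
        files={"file": ("row.json", json.dumps(doc), "application/json")},
    )

# Tiny in-memory database standing in for the real source
conn = sqlite3.connect(":memory:")
conn.execute("create table news(id integer primary key, title text)")
conn.execute("insert into news(title) values ('Db2 meets Discovery')")
docs = rows_to_documents(conn, "select id, title from news")
# for doc in docs:
#     upload_document(doc, credentials, "my-env-id", "my-collection-id")
```

The upload loop is commented out because it needs live Discovery credentials; the row-to-document conversion runs as-is.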
```python
# for the full example visit https://github.com/IBM-Cloud/github-traffic-stats/blob/master/backend/ghstats.py
# as part of this tutorial: https://console.bluemix.net/docs/tutorials/serverless-github-traffic-analytics.html
# import all kinds of modules
# this one is needed for the OIDC client
from flask_pyoidc.flask_pyoidc import OIDCAuthentication
# initialize Flask, etc.
```
A flow that automatically responds to tweets on Twitter. If the tag "archive" is used, the incoming tweet is stored in the Db2-based SQLDB service. A simple web service is provided to retrieve the archived tweets from Db2.
For the flow to work, the table "twitarchive" needs to be created first:
```sql
create table twitarchive(
  id int generated always as identity,
  tstamp timestamp,
  tweet varchar(200),
  username varchar(100)
)
```
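The archiving step of the flow runs inside Node-RED, but its logic can be sketched in Python: check the incoming tweet for the "archive" tag, then build the parameters for an insert into the table above. The function names and the exact tag-matching rule are illustrative assumptions, not code from the flow.

```python
# Illustrative sketch of the flow's archive decision and insert step.
# The real implementation lives in Node-RED nodes; names here are made up.
from datetime import datetime

def should_archive(tweet_text):
    """The flow only stores tweets that carry the 'archive' tag."""
    return "#archive" in tweet_text.lower()

# The id column is generated by Db2, so only the other columns are supplied.
INSERT_SQL = "insert into twitarchive(tstamp, tweet, username) values (?, ?, ?)"

def archive_params(tweet_text, username, now=None):
    """Build the parameter tuple, truncated to the column lengths above."""
    now = now or datetime.utcnow()
    return (now.isoformat(), tweet_text[:200], username[:100])
```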
This flow uses the sipgate.io service of the German phone company Sipgate to get notified about incoming calls. The from/to information (who is calling which number) is logged in a Cloudant database, and an SMS is sent out using the Twilio service.
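What the flow does with one incoming-call event can be sketched as below: build a Cloudant log document and a Twilio SMS payload. This is a sketch under assumptions only, since the actual flow is Node-RED; the event field names, credential keys, and phone numbers are made up, while the Twilio Messages endpoint and its `From`/`To`/`Body` form fields are the real REST API shape.

```python
# Sketch of the flow's per-call processing. Event field names and
# credential keys are assumptions; only the Twilio endpoint is real.
from datetime import datetime

def call_log_document(event, now=None):
    """Document to store in the Cloudant database (one per call)."""
    now = now or datetime.utcnow()
    return {
        "type": "incoming_call",
        "from": event["from"],
        "to": event["to"],
        "tstamp": now.isoformat(),
    }

def sms_payload(event, notify_number, twilio_number):
    """Form fields for Twilio's Messages endpoint."""
    return {
        "From": twilio_number,
        "To": notify_number,
        "Body": "Incoming call from {0} to {1}".format(event["from"], event["to"]),
    }

def send_notification(event, creds, notify_number):
    """POST the SMS payload to Twilio (needs real credentials to run)."""
    import requests  # imported here so the offline parts run without it
    url = ("https://api.twilio.com/2010-04-01/Accounts/{0}/Messages.json"
           .format(creds["account_sid"]))
    return requests.post(
        url,
        data=sms_payload(event, notify_number, creds["from_number"]),
        auth=(creds["account_sid"], creds["auth_token"]),
    )
```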
Everything was created on the IBM Bluemix platform (http://bluemix.net) with a free trial account.
More information is available in this blog post: http://blog.4loeser.net/2014/10/nodered-simple-phoney-json-entries-in.html