This is a hands-on way to pull down a set of MySQL dumps from Amazon S3 and restore your database from them.
Sister document: Backup MySQL to Amazon S3 - read that first.
# Set our variables
export mysqlpass="ROOTPASSWORD"
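With the root password exported, the restore itself is a short pipeline. The sketch below assumes a bucket name and a dated filename convention (both hypothetical - match them to whatever your backup script uploads), and that s3cmd has already been configured with `s3cmd --configure`:

```shell
# Hypothetical bucket and filename convention -- adjust to your backup setup
bucket="s3://your-mysql-backups"
filename="mysql-backup-$(date +%d-%m-%Y).sql.gz"

# Pull the dump down from S3
s3cmd get "${bucket}/${filename}" "/tmp/${filename}"

# Decompress and pipe the dump straight back into MySQL as root
gunzip < "/tmp/${filename}" | mysql -u root -p"${mysqlpass}"
```

If you backed up individual databases rather than a full dump, pass the database name as the last argument to `mysql`.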
This is a simple way to back up your MySQL tables to Amazon S3 as a nightly backup - all of this is done on your server :-)
Sister document: Restore MySQL from Amazon S3 - read that next.
This is for CentOS 5.6; see http://s3tools.org/repositories for other systems such as Ubuntu.
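The nightly backup boils down to one pipeline: dump, compress, upload. A minimal sketch, assuming a hypothetical bucket name and that `mysqlpass` is exported as above and s3cmd is already configured:

```shell
# Hypothetical bucket name -- replace with your own
bucket="s3://your-mysql-backups"
filename="mysql-backup-$(date +%d-%m-%Y).sql.gz"

# Dump all databases, compress on the fly, then push the archive to S3
mysqldump -u root -p"${mysqlpass}" --all-databases | gzip > "/tmp/${filename}"
s3cmd put "/tmp/${filename}" "${bucket}/${filename}"
```

Dropped into a script and called from cron (e.g. `0 3 * * * /root/mysql-s3-backup.sh`, an example schedule), this gives you a dated dump in S3 every night.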
CREATE TABLE `carriers` (
  `id` int(11) NOT NULL AUTO_INCREMENT,
  `type` varchar(13) CHARACTER SET utf8 DEFAULT NULL,
  `countryName` varchar(56) CHARACTER SET utf8 DEFAULT NULL,
  `countryCode` varchar(14) CHARACTER SET utf8 DEFAULT NULL,
  `mcc` int(11) DEFAULT NULL,
  `mnc` varchar(9) CHARACTER SET utf8 DEFAULT NULL,
  `brand` varchar(48) CHARACTER SET utf8 DEFAULT NULL,
  `operator` varchar(75) CHARACTER SET utf8 DEFAULT NULL,
  `status` varchar(21) CHARACTER SET utf8 DEFAULT NULL,
# Delete the possibly existing autocomplete test index
curl -X DELETE localhost:9200/autocomplete_test
# Put the config of the autocomplete index
curl -X PUT localhost:9200/autocomplete_test -d '
{
  "settings" : {
    "index" : {
      "analysis" : {
        "analyzer" : {
A deploy key is an SSH key set on your repo to grant a client read-only (or read/write, if you want) access to that repo.
As the name suggests, its primary use is in the deploy process, where only read access is needed; scoping the key that narrowly keeps the repo safe even if the deploying server is compromised.
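Creating one is just a normal key-pair generation on the deploying server; the key path below is an example, not a requirement:

```shell
# Generate a passphrase-less key pair for the deploy user (path is an example)
ssh-keygen -t ed25519 -N "" -f /tmp/deploy_key -q

# The public half is what you paste into the repo's "Deploy keys" settings page;
# the private half stays on the server
cat /tmp/deploy_key.pub
```

Because the key has no passphrase, unattended deploys can use it non-interactively; that is exactly why it should be read-only unless you have a reason otherwise.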
// Collect the text of every cell in the striped result table, row by row
var temp1 = [];
$(".resultTable.striped tr").each(function () {
  var k = [];
  $(this).find("td").each(function () {
    k.push($(this).text());
  });
  temp1.push(k);
});
// Drop the header row
temp1.splice(0, 1);
FROM ubuntu
RUN apt-get update && apt-get upgrade -y
RUN mkdir -p /tensorflow/models
RUN apt-get install -y git python-pip curl unzip
RUN curl -OL https://github.com/google/protobuf/releases/download/v3.2.0/protoc-3.2.0-linux-x86_64.zip
RUN unzip protoc-3.2.0-linux-x86_64.zip -d protoc3
RUN mv protoc3/bin/* /usr/local/bin/
RUN mv protoc3/include/* /usr/local/include/
RUN protoc --version
RUN pip install tensorflow
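Once the Dockerfile is saved, building and smoke-testing the image looks like the sketch below; the image tag is an example, not anything the Dockerfile itself mandates:

```shell
# Example image tag
tag="tf-object-detection"

# Build the image from the Dockerfile in the current directory
docker build -t "$tag" .

# Spot-check that protoc and TensorFlow made it into the image
docker run --rm "$tag" protoc --version
docker run --rm "$tag" python -c "import tensorflow; print(tensorflow.__version__)"
```

The `RUN protoc --version` step in the Dockerfile already fails the build if the protobuf compiler did not install cleanly, so the run-time checks here are belt-and-braces.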
package core;

import java.lang.reflect.Field;
import java.util.Collection;
import java.util.Date;
import java.util.HashMap;
import java.util.Map;

public class TinyJsonMapper {
/* Attribution: http://techslides.com/how-to-parse-and-search-json-in-javascript */
// Return an array of object paths and values matching the given key
function getValues(obj, key, path) {
  var objects = [];
  for (var i in obj) {
    if (!obj.hasOwnProperty(i)) continue;
    if (typeof obj[i] == 'object') {
      objects = objects.concat(getValues(obj[i], key, path + "/" + i));
    } else if (i == key) {