import org.apache.spark.ml.Pipeline
import org.apache.spark.ml.evaluation.BinaryClassificationEvaluator
import org.apache.spark.ml.feature.MinMaxScaler
import org.apache.spark.ml.tuning.{CrossValidator, ParamGridBuilder}
import org.apache.spark.sql.functions._
import org.apache.spark.sql.types.{IntegerType, LongType}
import org.apache.spark.sql.SaveMode
import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.ml.feature.VectorAssembler
[{"sap": {
  "latest": {
    "type": "INTERPRETER",
    "name": "sap",
    "version": "0.8.0",
    "published": "2018-06-24T01:29:37+00:00",
    "artifact": "sap@0.8.0",
    "description": "Zeppelin SAP support",
    "license": "Apache-2.0",
    "icon": "<i class=\"fa fa-rocket\"></i>",
@two8g
two8g / docker-compose.yml
Last active May 10, 2018 04:53 — forked from samklr/docker-compose.yml
Confluent Docker Compose
version: "2"
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:4.1.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
    ports:
      - 2181:2181
    logging:
/**
 * Provides the JDBC connection for the current tenant, choosing among the
 * configured per-tenant datasources.
 */
public class MultiTenantConnectionProviderImpl extends AbstractDataSourceBasedMultiTenantConnectionProviderImpl
{
    private static final long serialVersionUID = 14535345L;

    private final Log logger = LogFactory.getLog(getClass());
/**
 * Specifies which tenant identifier should be used when the Hibernate session
 * is created.
 * @author jm
 */
public class CurrentTenantIdentifierResolverImpl implements CurrentTenantIdentifierResolver {

    private final Logger logger = Logger.getLogger(getClass());
@Override
/**
 * Looks up the correct datasource to use; there is one per tenant.
 *
 * A tenant's datasource takes its default properties from database.properties,
 * and properties in database.{tenantId}.properties override those defaults.
 *
 * @author jose.mgmaestre
 *
 */
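The resolver snippets above are truncated previews, so the actual resolution strategy is not shown. As a minimal, dependency-free sketch of the same idea, the class below mimics `CurrentTenantIdentifierResolver#resolveCurrentTenantIdentifier` using a hypothetical `ThreadLocal`-based `TenantContext` (both the holder class and the `"default"` fallback tenant are assumptions, not taken from the gist):

```java
import java.util.Optional;

/** Hypothetical request-scoped tenant holder; the gist's real strategy is not shown. */
final class TenantContext {
    private static final ThreadLocal<String> CURRENT = new ThreadLocal<>();

    static void set(String tenantId) { CURRENT.set(tenantId); }
    static void clear() { CURRENT.remove(); }
    static Optional<String> get() { return Optional.ofNullable(CURRENT.get()); }
}

/** Mirrors CurrentTenantIdentifierResolver#resolveCurrentTenantIdentifier without the Hibernate dependency. */
public class TenantResolverSketch {
    static final String DEFAULT_TENANT = "default";

    public String resolveCurrentTenantIdentifier() {
        // Fall back to a default tenant when no request-scoped tenant is set,
        // so Hibernate can still open a session outside a tenant-aware request.
        return TenantContext.get().orElse(DEFAULT_TENANT);
    }

    public static void main(String[] args) {
        TenantResolverSketch resolver = new TenantResolverSketch();
        System.out.println(resolver.resolveCurrentTenantIdentifier()); // default
        TenantContext.set("acme");
        System.out.println(resolver.resolveCurrentTenantIdentifier()); // acme
        TenantContext.clear();
    }
}
```

The returned identifier is what the connection provider would then use to pick the matching per-tenant datasource.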
http://www.sohu.com/a/131128039_505813
E-commerce Back-End Design
This article covers the following parts:
An overview of e-commerce back-end systems
E-commerce back-end product design: the product center
E-commerce back-end product design: order splitting
E-commerce back-end product design: analyzing promotions
E-commerce back-end product design: coupon design and clever uses
@two8g
two8g / BaseDao.java
Created May 10, 2017 15:17 — forked from pengju/BaseDao.java
How to write a BaseDao in SSH (Struts + Spring + Hibernate)
package com.kaishengit.dao.core;
import java.io.Serializable;
import java.lang.reflect.ParameterizedType;
import java.lang.reflect.Type;
import java.util.List;
import java.util.Map;
import org.hibernate.Criteria;
import org.hibernate.Query;
@two8g
two8g / gist:b016ada5ebfaf81d7561f7654077679e
Last active March 16, 2017 03:59
maven filter .properties encoding error

Wrong encoding after activating resource filtering

http://stackoverflow.com/questions/14327003/wrong-encoding-after-activating-resource-filtering

http://docs.oracle.com/javase/6/docs/api/java/util/Properties.html#load(java.io.Reader)

Because Properties inherits from Hashtable, the put and putAll methods can be applied to a Properties object. Their use is strongly discouraged as they allow the caller to insert entries whose keys or values are not Strings. The setProperty method should be used instead. If the store or save method is called on a "compromised" Properties object that contains a non-String key or value, the call will fail. Similarly, the call to the propertyNames or list method will fail if it is called on a "compromised" Properties object that contains a non-String key.

The load(Reader) / store(Writer, String) methods load and store properties from and to a character based stream in a simple line-oriented format specified below. The load(InputStream) / store(OutputStream, String) methods work the same way, except that the stream is encoded in ISO 8859-1; characters that cannot be represented in that encoding must be written as Unicode escapes (\uXXXX).
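The encoding difference quoted above is exactly what causes the Maven filtering mojibake: a UTF-8 properties file read through `load(InputStream)` is decoded as ISO 8859-1. A small self-contained demonstration (the key `greeting` and the Chinese value are made-up sample data):

```java
import java.io.ByteArrayInputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.Properties;

public class PropertiesEncodingDemo {
    public static void main(String[] args) throws Exception {
        // A properties entry containing Chinese text ("ni hao"), encoded as UTF-8.
        byte[] utf8Bytes = "greeting=\u4f60\u597d".getBytes(StandardCharsets.UTF_8);

        // load(InputStream) decodes the bytes as ISO 8859-1, so each byte of
        // the multi-byte UTF-8 sequence becomes a separate mojibake character.
        Properties latin1 = new Properties();
        latin1.load(new ByteArrayInputStream(utf8Bytes));
        System.out.println("via load(InputStream): " + latin1.getProperty("greeting"));

        // load(Reader) lets the caller choose the charset, so the value
        // round-trips intact.
        Properties utf8 = new Properties();
        utf8.load(new InputStreamReader(new ByteArrayInputStream(utf8Bytes),
                StandardCharsets.UTF_8));
        System.out.println("via load(Reader):      " + utf8.getProperty("greeting"));
    }
}
```

In the Maven case the analogous fix is declaring `project.build.sourceEncoding` (and, for the filtering step, `maven-resources-plugin`'s `encoding` setting) so the resource filter reads the files with the charset they were actually saved in.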

@two8g
two8g / xvfb
Created October 8, 2016 03:47 — forked from rsanheim/xvfb
/etc/init.d/xvfb
root@ci:/etc/init.d# cat Xvfb
#! /bin/sh
### BEGIN INIT INFO
# Provides: Xvfb
# Required-Start: $local_fs $remote_fs
# Required-Stop:
# X-Start-Before:
# Default-Start: 2 3 4 5
# Default-Stop: