@wepe
Last active December 9, 2018 17:43
### 1. Custom activation functions

Any of the following approaches can be used:

from keras import backend as K
from keras import initializations
from keras.layers.core import Lambda
from keras.engine import Layer

# Write the activation directly with Theano/TensorFlow backend ops
def sigmoid_relu(x):
    """
    f(x) = x for x>0
    f(x) = sigmoid(x)-0.5 for x<=0
    """
    return K.relu(x)-K.relu(0.5-K.sigmoid(x))

# Use a Lambda layer
def lambda_activation(x):
    """
    f(x) = max(relu(x),sigmoid(x)-0.5)
    """
    return K.maximum(K.relu(x),K.sigmoid(x)-0.5)

# Write your own layer (here theta, alpha1, alpha2 are fixed hyperparameters,
# not trained, so no build() is defined)
class My_activation(Layer):
    """
    f(x) = x for x>0
    f(x) = alpha1 * x for theta<x<=0
    f(x) = alpha2 * x for x<=theta
    """
    def __init__(self, theta=-5.0, alpha1=0.2, alpha2=0.1, **kwargs):
        self.theta = theta
        self.alpha1 = alpha1
        self.alpha2 = alpha2
        super(My_activation, self).__init__(**kwargs)

    def call(self, x, mask=None):
        fx_0 = K.relu(x)  # for x>0
        fx_1 = self.alpha1*x*K.cast(x > self.theta, K.floatx())*K.cast(x <= 0.0, K.floatx())  # for theta<x<=0
        fx_2 = self.alpha2*x*K.cast(x <= self.theta, K.floatx())  # for x<=theta
        return fx_0 + fx_1 + fx_2

    def get_output_shape_for(self, input_shape):
        # we don't change the input shape
        return input_shape

# alpha1 and alpha2 are learnable parameters
class Trainable_activation(Layer):
    """
    f(x) = x for x>0
    f(x) = alpha1 * x for theta<x<=0
    f(x) = alpha2 * x for x<=theta
    """
    def __init__(self, init='zero', theta=-5.0, **kwargs):
        self.init = initializations.get(init)
        self.theta = theta
        super(Trainable_activation, self).__init__(**kwargs)

    def build(self, input_shape):
        # initialize alpha1 and alpha2 with the chosen initializer (default 'zero')
        self.alpha1 = self.init(input_shape[1:], name='alpha1')
        self.alpha2 = self.init(input_shape[1:], name='alpha2')
        self.trainable_weights = [self.alpha1, self.alpha2]

    def call(self, x, mask=None):
        fx_0 = K.relu(x)  # for x>0
        fx_1 = self.alpha1*x*K.cast(x > self.theta, K.floatx())*K.cast(x <= 0.0, K.floatx())  # for theta<x<=0
        fx_2 = self.alpha2*x*K.cast(x <= self.theta, K.floatx())  # for x<=theta
        return fx_0 + fx_1 + fx_2

    def get_output_shape_for(self, input_shape):
        # we don't change the input shape
        return input_shape
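As a sanity check, the piecewise formula implemented by My_activation can be mirrored in plain NumPy (my_activation_np is a hypothetical helper, not part of the gist); the K.cast masking trick translates directly to boolean multiplication:

```python
import numpy as np

def my_activation_np(x, theta=-5.0, alpha1=0.2, alpha2=0.1):
    """NumPy mirror of My_activation's piecewise formula."""
    fx_0 = np.maximum(x, 0.0)                     # f(x) = x          for x > 0
    fx_1 = alpha1 * x * (x > theta) * (x <= 0.0)  # f(x) = alpha1 * x for theta < x <= 0
    fx_2 = alpha2 * x * (x <= theta)              # f(x) = alpha2 * x for x <= theta
    return fx_0 + fx_1 + fx_2

print(my_activation_np(np.array([3.0, -2.0, -10.0])))
# expected: [3., -0.4, -1.]
```

Exactly one of the three masks is nonzero for any input, so the three branches can safely be summed.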

Usage:

model.add(Activation(sigmoid_relu))
model.add(Lambda(lambda_activation))
model.add(My_activation(theta=-5.0, alpha1=0.2, alpha2=0.1))
model.add(Trainable_activation(init='normal', theta=-5.0))

For more examples, see the Keras source: activations, advanced_activations

wepe commented Jul 9, 2016

  • Installing the latest Keras

While using Keras today I hit a small bug (an error related to compute_mask). It was fixed only a few days ago, so the version installed from pip does not include the fix. The workaround is to download the master branch of Keras, cd to the directory containing setup.py, and run sudo python setup.py install.

wepe commented Oct 13, 2016

  • Custom metrics, e.g. top-1 accuracy + top-2 accuracy.

from keras import backend as K

def my_metric(y_true, y_pred):
    # note: tensor.argsort() here relies on the Theano backend
    y_pred_sort = y_pred.argsort(axis=-1)
    top1 = K.mean(K.equal(K.argmax(y_true, axis=-1), y_pred_sort[:, -1]))
    top2 = K.mean(K.equal(K.argmax(y_true, axis=-1), y_pred_sort[:, -2]))
    return top1 + 0.1*top2

Pass metrics=[my_metric] when calling compile.
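The metric can be checked outside a Keras session with a plain NumPy sketch (my_metric_np is a hypothetical helper, not part of the gist); argsort sorts ascending, so the last column is the top-1 class and the second-to-last is the top-2 class:

```python
import numpy as np

def my_metric_np(y_true, y_pred):
    """top-1 accuracy + 0.1 * top-2 accuracy, NumPy version."""
    order = y_pred.argsort(axis=-1)           # ascending class order per sample
    true_cls = y_true.argmax(axis=-1)         # index of the one-hot label
    top1 = np.mean(order[:, -1] == true_cls)  # best prediction correct
    top2 = np.mean(order[:, -2] == true_cls)  # second-best prediction correct
    return top1 + 0.1 * top2

y_true = np.array([[0, 1, 0], [1, 0, 0]])
y_pred = np.array([[0.1, 0.7, 0.2], [0.3, 0.5, 0.2]])
# sample 1 is a top-1 hit, sample 2 only a top-2 hit: 0.5 + 0.1 * 0.5
print(my_metric_np(y_true, y_pred))
```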
