Keras AttributeError: 'Tensor' object has no attribute 'log'

I am getting the error 'Tensor' object has no attribute 'log' while applying a custom loss function to a network I am building in Keras. I think I somehow need to get rid of np.log, but I'm not sure how. Please help, thanks.

Import numpy

import numpy as np

Custom loss function

def rmsle(y_pred, y_test):
    return np.sqrt(np.mean((np.log(1 + y_pred) - np.log(1 + y_test))**2))

My network

def base_model():
    model = Sequential()
    model.add(Dense(50, input_dim=X_train.shape[1], init='normal', activation='sigmoid'))
    model.add(Dropout(0.5))

    model.add(Dense(1, init='normal'))
    sgd = SGD(lr=0.01, momentum=0.8, decay=0.1, nesterov=False)
    model.compile(loss=rmsle, optimizer=sgd)  # or optimizer='adam'
    return model

keras = KerasRegressor(build_fn=base_model, nb_epoch=80, batch_size=1, verbose=1)
keras.fit(X_train, y_train)

When I check the error message in detail, it shows:

424         """
425         # score_array has ndim >= 2
--> 426         score_array = fn(y_true, y_pred)
427         if mask is not None:
428             # Cast the mask to floatX to avoid float64 upcasting in theano
2     #return np.sqrt(np.mean(np.square( np.log( (np.exp(a)) + 1 ) - np.log((np.exp(b))+1) )))
----> 4     return np.sqrt(np.mean((np.log(1+y_pred) - np.log(1+y_test))**2))

2 answers

  • answered 2018-07-11 05:58 Milind Deore

    Lambda layers in Keras let you implement functionality that is not prebuilt and that does not require trainable weights, so you are free to implement your own logic, in this case a log.

    This can also be done using a Keras Lambda layer, as below:

    from keras.layers import Lambda
    import keras.backend as K
    

    Define your function here:

    def logFun(x):
        return K.log(x)
    

    And later create a lambda layer:

    model.add(Lambda(logFun, ...))
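
    For illustration, a minimal sketch of where such a Lambda layer could sit in a Sequential model (the layer sizes and activation here are placeholders, not taken from the question):

    from keras.models import Sequential
    from keras.layers import Dense, Lambda
    import keras.backend as K

    def logFun(x):
        return K.log(x)

    model = Sequential()
    # 'softplus' keeps the outputs strictly positive, so the log is well defined
    model.add(Dense(50, input_dim=10, activation='softplus'))
    # element-wise log of the previous layer's output; no trainable weights
    model.add(Lambda(logFun))
    model.add(Dense(1))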
    

  • answered 2018-07-11 06:48 rvinas

    You must use valid tensor operations from your backend (i.e. from keras.backend) in order to define a custom loss function. For example, your loss function could be defined as follows:

    import keras.backend as K
    
    def rmsle(y_test, y_pred):
        return K.sqrt(K.mean(K.square(K.log(1 + y_pred) - K.log(1 + y_test))))
    

    NOTE: Keras expects the first argument to be y_test (i.e. the ground truth).
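
    As a quick sanity check, the loss can be evaluated on constant tensors; the values below are made up purely for illustration:

    import numpy as np
    import keras.backend as K

    # Evaluate the backend-based loss eagerly on small constant tensors
    y_true = K.constant(np.array([1.0, 2.0, 3.0]))
    y_pred = K.constant(np.array([1.1, 1.9, 3.2]))
    print(K.eval(rmsle(y_true, y_pred)))  # prints a small positive scalar

    The rest of the question's code can stay as it is; the loss is still passed to compile via loss=rmsle.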