How to pass epochs and batch size when using LabelPowerset in Keras

I have a multi-label problem, and with some research I was able to use LabelPowerset in conjunction with ML algorithms. Now I want to use LabelPowerset with a neural network, and according to the official website this is possible, but I am not able to understand how to modify my existing code to use it.

I want to know how to pass epochs, batch_size, or any other parameter that is normally passed to the model's fit function.

Since I have a multi-label problem, I have used sklearn's MultiLabelBinarizer, so each of my target rows looks like this: [1,0,0,1,0,0,0,0,0,0,0,0].
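
For reference, the binarization step looks roughly like this (a small sketch with made-up label names, not my actual data, which has 12 possible labels):

from sklearn.preprocessing import MultiLabelBinarizer

# hypothetical label sets, just for illustration
mlb = MultiLabelBinarizer()
y = mlb.fit_transform([['a', 'd'], ['b'], ['a', 'c']])
# y is a binary indicator matrix with one column per label
labels = mlb.inverse_transform(y)  # recovers the original label sets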

And lastly, could someone explain what KERAS_PARAMS and Keras() are in the code below:

def create_model_multiclass(input_dim, output_dim):
    # create model
    model = Sequential()
    model.add(Dense(8, input_dim=input_dim, activation='relu'))
    model.add(Dense(output_dim, activation='softmax'))
    # Compile model
    model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
    return model
clf = LabelPowerset(classifier=Keras(create_model_multiclass, True, KERAS_PARAMS), require_dense=[True, True]), y_train)
y_pred = clf.predict(X_test)

Below is my existing neural network code

cnn_model = Sequential()
cnn_model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['acc'])
history =, y_train, validation_data=(X_test, y_test), batch_size=32, epochs=180, verbose=1)
predictions = cnn_model.predict(X_test)

I want my output rows to look exactly like this, [1,0,0,1,0,0,0,0,0,0,0,0], because later I will use my MultiLabelBinarizer to inverse-transform them.

1 answer

  • answered 2019-06-25 06:41 Pedro Marques

    KERAS_PARAMS are parameters to the Keras scikit wrapper. The documentation for it is rather sparse.

    Basically, it seems to be the params that you would normally pass to the model's fit call, for instance:

    KERAS_PARAMS = dict(epochs=10, batch_size=100, verbose=0)
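
    Putting that together with the snippet from your question, the wiring would look roughly like this. This is a sketch, not a verified example: I am assuming the Keras wrapper is imported from skmultilearn.ext, which may differ between scikit-multilearn versions.

    from skmultilearn.problem_transform import LabelPowerset
    from skmultilearn.ext import Keras  # assumption: import path may vary by version

    # fit parameters (epochs, batch_size, verbose, ...) go into KERAS_PARAMS
    KERAS_PARAMS = dict(epochs=180, batch_size=32, verbose=1)

    clf = LabelPowerset(
        classifier=Keras(create_model_multiclass, True, KERAS_PARAMS),
        require_dense=[True, True]), y_train)
    y_pred = clf.predict(X_test)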

    From reading the docs, it seems to me that LabelPowerset transforms a multi-label problem into a multi-class problem by treating each unique combination of labels as a single class. You may consider just using a native Keras solution for a multi-label problem rather than using a wrapper.

    The following tutorial seems reasonable:

    The key differences are that your output layer should have a sigmoid activation rather than softmax, and the loss should be binary_crossentropy rather than categorical_crossentropy.
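
    A minimal sketch of that approach, assuming 12 labels and that X_train/X_test are already prepared (the hidden-layer size is illustrative):

    from keras.models import Sequential
    from keras.layers import Dense

    n_labels = 12  # one output unit per label

    model = Sequential()
    model.add(Dense(64, input_dim=X_train.shape[1], activation='relu'))
    model.add(Dense(n_labels, activation='sigmoid'))  # sigmoid, not softmax
    model.compile(optimizer='adam',
                  loss='binary_crossentropy',  # not categorical_crossentropy
                  metrics=['accuracy'])

    # epochs and batch_size are passed directly to fit, as usual, y_train, validation_data=(X_test, y_test),
              batch_size=32, epochs=180, verbose=1)

    # the network outputs per-label probabilities; threshold them to get the
    # 0/1 rows that MultiLabelBinarizer.inverse_transform expects
    y_pred = (model.predict(X_test) > 0.5).astype(int)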