Modify layer parameters in Keras

erap129

I am interested in updating existing layer parameters in Keras (not removing a layer and inserting a new one instead, rather just modifying existing parameters).

I will give an example of a function I'm writing:

import random

def add_filters(self, model):
    # indices of all convolutional layers (all my conv layers have 'convolution' in their name)
    conv_indices = [i for i, layer in enumerate(model.layers)
                    if 'convolution' in layer.get_config()['name']]
    random_conv_index = random.choice(conv_indices)
    factor = 2
    conv_layer = model.layers[random_conv_index]
    conv_layer.filters = conv_layer.filters * factor
    print('new conv layer filters after transform is:', conv_layer.filters)
    print('just to make sure, its:', model.layers[random_conv_index].filters)
    return model

So what's happening here is that I take a random convolutional layer from my network (all my conv layers have 'convolution' in their name) and try to double its number of filters. As far as I know, this shouldn't cause any compilation issues with input/output size compatibility in any case.

The thing is, my model doesn't change at all. The two print-outs I added at the end print the correct number (double the previous number of filters), but when I compile the model and print model.summary(), I still see the previous filter count.

BTW, I'm not restricted to Keras. If anyone has an idea how to pull this off with PyTorch, for example, I'll also buy it :D


Well, if you would like to create the architecture of a new model based on an existing model, though with some modifications, you can use the to_json() and model_from_json() functions. Here is an example:

from keras.models import Sequential
from keras.layers import Conv2D

model = Sequential()
model.add(Conv2D(10, (3, 3), input_shape=(100, 100, 3)))
model.add(Conv2D(40, (3, 3)))

model.summary()

Model summary:

Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_12 (Conv2D)           (None, 98, 98, 10)        280       
_________________________________________________________________
conv2d_13 (Conv2D)           (None, 96, 96, 40)        3640      
=================================================================
Total params: 3,920
Trainable params: 3,920
Non-trainable params: 0
_________________________________________________________________

Now we modify the number of filters of the first layer and create a new model based on the modified architecture:

from keras.models import model_from_json

model.layers[0].filters *= 2
new_model = model_from_json(model.to_json())
new_model.summary()

New model summary:

Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_12 (Conv2D)           (None, 98, 98, 20)        560       
_________________________________________________________________
conv2d_13 (Conv2D)           (None, 96, 96, 40)        7240      
=================================================================
Total params: 7,800
Trainable params: 7,800
Non-trainable params: 0
_________________________________________________________________

You can also modify the output of model.to_json() directly without modifying the model instance.
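Since the JSON string is plain text, one way to do this is to edit it with Python's json module before rebuilding the model. The sketch below is a minimal illustration, not the answer's own code; the nesting it assumes (a "config" dict holding a "layers" list, each layer with its own "config" containing "name" and "filters") matches what recent Keras versions emit for a Sequential model, but the exact structure varies between versions, so inspect your own model.to_json() output first. The mock_json string is a hand-written stand-in for that output:

```python
import json

def scale_filters_in_json(model_json, layer_name, factor=2):
    """Multiply the `filters` entry of the named layer in a Keras
    model JSON string, assuming recent-Keras nesting:
    {"config": {"layers": [{"config": {"name": ..., "filters": ...}}]}}
    """
    config = json.loads(model_json)
    for layer in config['config']['layers']:
        layer_config = layer.get('config', {})
        if layer_config.get('name') == layer_name and 'filters' in layer_config:
            layer_config['filters'] *= factor
    return json.dumps(config)

# Hand-written stand-in for model.to_json() output:
mock_json = json.dumps({
    "class_name": "Sequential",
    "config": {"layers": [
        {"class_name": "Conv2D", "config": {"name": "conv2d_12", "filters": 10}},
        {"class_name": "Conv2D", "config": {"name": "conv2d_13", "filters": 40}},
    ]},
})

modified = scale_filters_in_json(mock_json, "conv2d_12")
# With Keras available, you would then rebuild the model:
# new_model = model_from_json(modified)
```

The weights of the rebuilt model are freshly initialized either way; to_json() only carries the architecture.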


You can use the get_weights() method to get the current weights of a convolution layer. It returns a list of two numpy arrays: the first one holds the filter (kernel) weights and the second one holds the bias parameters. Then you can use the set_weights() method to set the new weights:

conv_layer = model.layers[random_conv_index]
weights = conv_layer.get_weights()
weights[0] *= factor  # multiply filter weights by `factor`
conv_layer.set_weights(weights)

As a side note, the filters attribute of a convolution layer, which you used in your code, corresponds to the number of filters in that layer, not to their weights.
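Note also that set_weights() only accepts arrays matching the layer's existing shapes, so it cannot change the number of filters by itself. For a Conv2D layer, the kernel array has shape (kernel_h, kernel_w, in_channels, filters) and the bias has shape (filters,). The numpy-only sketch below (my illustration, with made-up random weights standing in for get_weights() output) shows what "doubling the filters" means at the weight level, e.g. by duplicating the existing filters along the last axis:

```python
import numpy as np

# Stand-in for conv_layer.get_weights() on a Conv2D(10, (3, 3)) layer
# with 3-channel input: kernel (3, 3, 3, 10) and bias (10,).
kernel = np.random.randn(3, 3, 3, 10)
bias = np.random.randn(10)

# Duplicate every filter along the last axis to get twice as many.
wide_kernel = np.concatenate([kernel, kernel], axis=-1)
wide_bias = np.concatenate([bias, bias])

print(wide_kernel.shape)  # (3, 3, 3, 20)
print(wide_bias.shape)    # (20,)
```

These widened arrays could then be passed to set_weights() on a *new* layer built with filters=20; the original 10-filter layer would reject them because the shapes no longer match its variables.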
