By Z. Ye


2019-01-07 13:31:16 8 Comments

Usually the input to Conv2D in Keras is a 4D tensor of shape batch_size * n * n * channel_size. Now I have a 5D tensor of shape batch_size * N * n * n * channel_size, and I want to apply a 2D convolutional layer to the last three dimensions for each i in N. For example, if the kernel size is 1, I expect the output to have shape batch_size * N * n * n * 1.

Does anyone know an easy way to implement this with Keras?

For example, for a fully-connected layer Keras does this automatically: if the input has shape batch_size * N * n, the Dense layer applies an FC layer for each i in N, so with Dense(m) we get an output of shape batch_size * N * m.
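For reference, a minimal sketch of the Dense behaviour I mean, with purely illustrative shapes (N=10, n=16, m=4):

from keras.models import Sequential
from keras.layers import Dense

# Dense acts on the last axis only: (batch, N, n) -> (batch, N, m)
model = Sequential()
model.add(Dense(4, input_shape=(10, 16)))  # N=10, n=16, m=4
model.summary()  # output shape: (None, 10, 4)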

1 Answer

@today 2019-01-07 13:38:09

You can use the TimeDistributed layer wrapper to apply the same convolution layer on all the images in the 5D tensor. For example:

from keras.models import Sequential
from keras.layers import TimeDistributed, Conv2D

model = Sequential()
model.add(TimeDistributed(Conv2D(5, (3, 3), padding='same'),
                          input_shape=(10, 100, 100, 3)))

model.summary()

Model summary:

Layer (type)                 Output Shape              Param #   
=================================================================
time_distributed_2 (TimeDist (None, 10, 100, 100, 5)   140       
=================================================================
Total params: 140
Trainable params: 140
Non-trainable params: 0
_________________________________________________________________
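As a quick check (the batch size of 2 below is arbitrary), a dummy forward pass confirms the output shape:

import numpy as np

x = np.random.rand(2, 10, 100, 100, 3).astype('float32')
print(model.predict(x).shape)  # (2, 10, 100, 100, 5)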

@Z. Ye 2019-01-07 13:39:17

Very quick answer! I will try that at once :)

@Z. Ye 2019-01-07 14:16:26

It actually works, but not exactly in the way I expected. It seems that the weights are the same for every temporal index i in N. However, I wanted to set different weights for each i.

@today 2019-01-07 14:26:43

@Z.Ye Of course, and I mentioned that in my answer. Further, that's also the case for the Dense layer example you provided, i.e. the weights are shared. If you want different weights and N is known, you can easily write a for loop to do that.
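For example, a rough sketch of such a loop with the functional API (N, the spatial size, and the filter count below are just illustrative assumptions):

from keras.models import Model
from keras.layers import Input, Conv2D, Lambda, Concatenate
import keras.backend as K

N = 10  # assumed to be known in advance
inp = Input(shape=(N, 100, 100, 3))

outputs = []
for i in range(N):
    # take the i-th image and give it its own Conv2D, i.e. separate weights
    x_i = Lambda(lambda x, i=i: x[:, i])(inp)               # (batch, 100, 100, 3)
    y_i = Conv2D(5, (3, 3), padding='same')(x_i)            # (batch, 100, 100, 5)
    y_i = Lambda(lambda y: K.expand_dims(y, axis=1))(y_i)   # (batch, 1, 100, 100, 5)
    outputs.append(y_i)

out = Concatenate(axis=1)(outputs)                          # (batch, N, 100, 100, 5)
model = Model(inp, out)
model.summary()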

@Z. Ye 2019-01-07 14:33:24

Yes, you are right. The Dense layer also does what you described. Then I will try the loop.

@Z. Ye 2019-01-08 14:19:49

I'm sorry that I cannot upvote your answer. I tried the for loop a bit, but it didn't work perfectly. I posted a question here: stackoverflow.com/questions/54093755/… Could you give me a hint? Thank you very much.
