By yadong sun

2018-08-30 05:10:58 8 Comments

This is my test code:

from keras import layers
input1 = layers.Input((2,3))
output = layers.Dense(4)(input1)

The output is:

<tf.Tensor 'dense_2/add:0' shape=(?, 2, 4) dtype=float32>

But what happened here?

The documentation says:

Note: if the input to the layer has a rank greater than 2, then it is flattened prior to the initial dot product with kernel.

While the output is reshaped?


@today 2018-08-30 08:12:14

Currently, contrary to what is stated in the documentation, the Dense layer is applied on the last axis of the input tensor:

Contrary to the documentation, we don't actually flatten it. It's applied on the last axis independently.

In other words, if a Dense layer with m units is applied to an input tensor of shape (n_dim1, n_dim2, ..., n_dimk), the output has shape (n_dim1, n_dim2, ..., n_dim(k-1), m): only the last axis changes.
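A quick NumPy sketch of this behaviour (the shapes here are illustrative, not from the question): applying a kernel of shape (last_dim, m) on the last axis is just a matrix product broadcast over the leading axes, so only the last dimension changes:

```python
import numpy as np

# Illustrative rank-3 input and a "Dense" layer with m = 4 units
x = np.random.rand(2, 3, 5)    # (n_dim1, n_dim2, n_dim3) = (2, 3, 5)
kernel = np.random.rand(5, 4)  # (input_last_dim, units)
bias = np.random.rand(4)       # one bias per unit

# Dense on the last axis: a matmul broadcast over the leading axes
y = x @ kernel + bias
print(y.shape)  # (2, 3, 4) -- only the last axis changed
```

This is exactly why the question's `Dense(4)` on an input of shape `(2, 3)` produced `shape=(?, 2, 4)`.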

As a side note: this makes TimeDistributed(Dense(...)) and Dense(...) equivalent to each other.

Another side note: be aware that this implies the weights are shared across the leading axes. For example, consider this toy network:

from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(10, input_shape=(20, 5)))


The model summary:

Layer (type)                 Output Shape              Param #
=================================================================
dense_1 (Dense)              (None, 20, 10)            60
=================================================================
Total params: 60
Trainable params: 60
Non-trainable params: 0

As you can see, the Dense layer has only 60 parameters. How? Each of the 10 units is connected to the 5 elements of every row of the input with the same weights, so the count is 10 * 5 weights + 10 biases = 60, independent of the 20 rows.
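To make the weight sharing concrete, here is a NumPy sketch using the toy model's shapes: applying one shared (5, 10) kernel to each of the 20 rows gives the same result as a single matmul on the last axis, and the parameter count stays at 5 * 10 + 10 = 60:

```python
import numpy as np

rows, features, units = 20, 5, 10
x = np.random.rand(rows, features)        # one sample of shape (20, 5)
kernel = np.random.rand(features, units)  # shared by all 20 rows
bias = np.random.rand(units)

# Applying the same kernel row by row ...
row_by_row = np.stack([x[i] @ kernel + bias for i in range(rows)])
# ... is identical to one matmul on the last axis
all_at_once = x @ kernel + bias
assert np.allclose(row_by_row, all_at_once)

# Parameter count: kernel weights + biases, independent of the 20 rows
print(kernel.size + bias.size)  # 60
```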

Update: here is a visual illustration of the example above:

[Image: visual illustration of applying a Dense layer to an input with two or more dimensions in Keras]

@desertnaut 2018-08-30 10:24:07

Nice catch (+1) ;)

@Zero 2019-03-14 05:27:26

Could someone draw me a picture? As a newcomer to the library, I find it tricky to wrap my head around precisely where the connections and shared weights are.

@today 2019-04-12 23:01:07

@Zero Yes! I hope it's not late though ;)
