2018-12-07 13:15:07 8 Comments

I am trying to understand the Keras layers better. I am working on a sequence-to-sequence model where I embed a sentence and pass it to an LSTM that returns sequences. After that, I want to apply a Dense layer to each timestep (word) in the sentence, and it seems like TimeDistributed does the job for three-dimensional tensors like this one.

In my understanding, Dense layers only work on two-dimensional tensors, and TimeDistributed just applies the same Dense layer to every timestep of a three-dimensional tensor. Could one then not simply flatten the timesteps, apply a Dense layer, and reshape to obtain the same result, or are these not equivalent in some way that I am missing?


## 3 comments

## @yuvaraj8blr 2019-09-09 14:16:52

Adding to the above answers, here are a few pictures comparing the output shapes of the two layers. So using one of these layers after an LSTM, for example, would behave differently.

## @Andrey Kite Gorin 2019-02-02 13:35:30

A Dense layer can act on any tensor, not necessarily one of rank 2. And I think that the TimeDistributed wrapper does not change anything in the way the Dense layer acts. Just applying a Dense layer to a tensor of rank 3 will do exactly the same as applying a TimeDistributed wrapper around the Dense layer. Here is an illustration:
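A small NumPy sketch of this claim (NumPy standing in for Keras here, with made-up shapes): a Dense layer's kernel applied to the last axis of a rank-3 tensor gives the same result as applying that kernel to each timestep slice separately, which is what `TimeDistributed(Dense)` does.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((2, 4, 3))   # (batch, timesteps, features)
W = rng.standard_normal((3, 5))      # Dense kernel: 3 inputs -> 5 outputs
b = rng.standard_normal(5)           # Dense bias

# Dense on a rank-3 input: matmul over the last axis, shape (2, 4, 5).
dense_out = x @ W + b

# TimeDistributed-style: apply the same kernel to each timestep slice.
td_out = np.stack([x[:, t, :] @ W + b for t in range(4)], axis=1)

print(np.allclose(dense_out, td_out))  # True: the two are identical
```

The key point is that the same `W` and `b` are reused at every timestep in both cases, so the parameter count is identical too.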

## @jdehesa 2018-12-09 22:59:42

Imagine you have a batch of inputs with 4 time steps, each containing a 3-element vector. Let's represent that with this:

Now you want to transform this batch using a dense layer, so you get 5 features per time step. The output of the layer can be represented as something like this:

You consider two options: a `TimeDistributed` dense layer, or reshaping to a flat input, applying a dense layer, and reshaping back to time steps.

In the first option, you would apply a dense layer with 3 inputs and 5 outputs to every single time step. This could look like this:

Each blue circle here is a unit in the dense layer. By doing this with every input time step you get the total output. Importantly, these five units are the same for all the time steps, so you only have the parameters of a single dense layer with 3 inputs and 5 outputs.
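As a quick arithmetic check (using the example's numbers), the shared layer's parameter count is just kernel plus bias, regardless of how many time steps there are:

```python
# Shared (TimeDistributed-style) Dense layer: 3 inputs, 5 outputs.
inputs, units = 3, 5
shared_params = inputs * units + units  # kernel weights + biases
print(shared_params)  # 20, whether there are 4 time steps or 400
```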

The second option would involve flattening the input into a 12-element vector, applying a dense layer with 12 inputs and 20 outputs, and then reshaping that back. This is how it would look:

Here the input connections of only one unit are drawn for clarity, but every unit would be connected to every input. Here, obviously, you have many more parameters (those of a dense layer with 12 inputs and 20 outputs), and also note that each output value is influenced by every input value, so values in one time step would affect outputs in other time steps. Whether this is good or bad depends on your problem and model, but it is an important difference from the previous option, where each time step's input and output were independent. In addition, this configuration requires you to use a fixed number of time steps in each batch, whereas the previous option works independently of the number of time steps.
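A NumPy sketch (again standing in for Keras, with the example's shapes) of this flatten-then-dense option, showing both the larger parameter count and the cross-timestep dependence:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal((1, 4, 3))   # one sample, 4 timesteps, 3 features
W = rng.standard_normal((12, 20))    # 12 * 20 + 20 = 260 parameters
b = rng.standard_normal(20)

def flat_dense(x):
    flat = x.reshape(x.shape[0], -1)                    # (1, 12)
    return (flat @ W + b).reshape(x.shape[0], 4, 5)     # back to timesteps

y1 = flat_dense(x)

# Perturb only the first timestep: outputs at *all* timesteps change,
# because every output is connected to every input.
x2 = x.copy()
x2[:, 0, :] += 1.0
y2 = flat_dense(x2)
print(np.allclose(y1[:, 1:], y2[:, 1:]))  # False: later timesteps affected too
```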

You could also consider the option of having four dense layers, each applied independently to its own time step (I didn't draw it, but hopefully you get the idea). That would be similar to the previous one, except each unit would receive input connections only from its respective time step's inputs. I don't think there is a straightforward way to do that in Keras; you would have to split the input into four, apply a dense layer to each part, and merge the outputs. Again, in this case the number of time steps would be fixed.
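This third option can also be sketched in NumPy (the per-timestep kernels here are made up for illustration; this is not a single Keras layer): one independent `(3 -> 5)` kernel per time step, each applied only to its own slice.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal((2, 4, 3))   # (batch, timesteps, features)

# One independent (kernel, bias) pair per timestep:
# 4 * (3 * 5 + 5) = 80 parameters, between the 20 and 260 of the other options.
Ws = rng.standard_normal((4, 3, 5))
bs = rng.standard_normal((4, 5))

# Each timestep sees only its own kernel and its own inputs.
y = np.stack([x[:, t, :] @ Ws[t] + bs[t] for t in range(4)], axis=1)
print(y.shape)  # (2, 4, 5)
```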