What does it mean to "unroll an RNN dynamically"? I've seen this mentioned specifically in the TensorFlow source code, but I'm looking for a conceptual explanation that extends to RNNs in general.

In the TensorFlow `rnn` method, it is documented:

> If the sequence_length vector is provided, dynamic calculation is performed. This method of calculation does not compute the RNN steps past the maximum sequence length of the minibatch (thus saving computational time).

But the `dynamic_rnn` method mentions:

> The parameter sequence_length is optional and is used to copy-through state and zero-out outputs when past a batch element's sequence length. So it's more for correctness than performance, unlike in rnn().

So does this mean `rnn` is more performant for variable-length sequences? What is the conceptual difference between `dynamic_rnn` and `rnn`?
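To make the two documented behaviors concrete, here is a hedged NumPy sketch of what "dynamic calculation with `sequence_length`" means conceptually. It is not TensorFlow's implementation; the function names (`rnn_step`, `unroll`) are made up for illustration. The time loop stops at the longest sequence in the minibatch (the `rnn` doc's performance point), and past each element's own length the state is copied through and the outputs are zeroed (the `dynamic_rnn` doc's correctness point):

```python
import numpy as np

def rnn_step(x, h, Wx, Wh, b):
    # One vanilla RNN step: h' = tanh(x @ Wx + h @ Wh + b)
    return np.tanh(x @ Wx + h @ Wh + b)

def unroll(inputs, seq_len, Wx, Wh, b):
    # inputs: (batch, max_time, n_in); seq_len: (batch,) true lengths
    batch, max_time, _ = inputs.shape
    n_h = Wh.shape[0]
    h = np.zeros((batch, n_h))
    outputs = np.zeros((batch, max_time, n_h))
    # Only step up to the longest sequence in this minibatch
    for t in range(int(seq_len.max())):
        h_new = rnn_step(inputs[:, t], h, Wx, Wh, b)
        active = (t < seq_len)[:, None]               # elements still running
        h = np.where(active, h_new, h)                # copy-through state past length
        outputs[:, t] = np.where(active, h_new, 0.0)  # zero-out outputs past length
    return outputs, h
```

Finished sequences cost nothing extra beyond the masking, and their final state is exactly the state at their last real step.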
I'm trying to write a simple RNN in TensorFlow, based on the tutorial here: https://danijar.com/introduction-to-recurrent-networks-in-tensorflow/ (I'm using a simple RNN cell rather than GRU, and not using dropout).

I'm confused because the different RNN cells in my sequence appear to be assigned separate weights. If I run the following code

```python
import tensorflow as tf

seq_length = 3
n_h = 100  # Number of hidden units
n_x = 26   # Size of input layer
n_y = 26   # Size of output layer

inputs = tf.placeholder(tf.float32, )
cells = []
for _ in range(seq_length):
    cell = tf.contrib.rnn.BasicRNNCell(n_h)
    cells.append(cell)
multi_rnn_cell = tf.contrib.rnn.MultiRNNCell(cells)
initial_state = tf.placeholder(tf.float32, )
outputs_h, output_final_state = tf.nn.dynamic_rnn(multi_rnn_cell, inputs, dtype=tf.float32)

sess = tf.Session()
sess.run(tf.global_variables_initializer())

print('Trainable variables:')
for v in tf.trainable_variables():
    print(v)
```

in Python 3, I get the following output:

```
Trainable variables:
```

Firstly, ...
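The distinction this question turns on is that RNN weights are shared across *time steps*, while each *layer* of a stack gets its own weights; `MultiRNNCell(cells)` stacks `cells` as layers, so building `seq_length` cells produces `seq_length` weight sets. A hedged NumPy sketch of that sharing (all names here are illustrative, not TensorFlow's):

```python
import numpy as np

rng = np.random.default_rng(0)
n_x, n_h, seq_length, n_layers = 26, 100, 3, 2

# One weight set *per layer*, reused at every time step.
layers = [
    {"Wx": rng.normal(size=(n_x if i == 0 else n_h, n_h)),
     "Wh": rng.normal(size=(n_h, n_h)),
     "b": np.zeros(n_h)}
    for i in range(n_layers)
]

def step(x, states):
    # Apply each layer once; the same `layers` weights serve every time step.
    new_states = []
    inp = x
    for layer, h in zip(layers, states):
        h = np.tanh(inp @ layer["Wx"] + h @ layer["Wh"] + layer["b"])
        new_states.append(h)
        inp = h
    return new_states

states = [np.zeros(n_h) for _ in range(n_layers)]
xs = rng.normal(size=(seq_length, n_x))
for t in range(seq_length):  # time loop: no new weights are created here
    states = step(xs[t], states)
```

The number of trainable parameters scales with `n_layers`, not with `seq_length`: the time loop only reapplies existing weights.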
I'm building an RNN model to do image classification. I used a pipeline to feed in the data. However it returns

```
ValueError: Variable rnn/rnn/basic_rnn_cell/weights already exists, disallowed. Did you mean to set reuse=True in VarScope? Originally defined at:
```

I wonder what I can do to fix this, since there are not many examples of implementing an RNN with an input pipeline. I know it would work if I used a placeholder, but my data is already in the form of tensors. Unless I can feed the placeholder with tensors, I prefer to just use the pipeline.

```python
def RNN(inputs):
    with tf.variable_scope('cells', reuse=True):
        basic_cell = tf.contrib.rnn.BasicRNNCell(num_units=batch_size)
    with tf.variable_scope('rnn'):
        outputs, states = tf.nn.dynamic_rnn(basic_cell, inputs, dtype=tf.float32)
    fc_drop = tf.nn.dropout(states, keep_prob)
    logits = tf.contrib.layers.fully_connected(fc_drop, batch_size, activation_fn=None)
    return logits

# Training
with tf.name_scope("cost_function") as scope:
    cost = ...
```
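The error arises whenever graph-building code that calls `tf.get_variable` under the same scope runs more than once without opting in to reuse. Here is a deliberately toy, pure-Python model of the TF 1.x variable-store semantics (hypothetical names; not TensorFlow's real implementation) showing why the second call fails and how reuse fixes it:

```python
# Toy model of TF 1.x get_variable()/variable_scope reuse semantics.
_store = {}

def get_variable(name, reuse=False):
    if name in _store:
        if not reuse:
            raise ValueError(
                "Variable %s already exists, disallowed. "
                "Did you mean to set reuse=True in VarScope?" % name)
        return _store[name]
    if reuse:
        raise ValueError("Variable %s does not exist" % name)
    _store[name] = object()  # stand-in for a real variable tensor
    return _store[name]

def build_rnn(reuse=False):
    # Building the RNN twice (e.g. once per pipeline branch) asks the
    # store for the same variable name both times.
    return get_variable("rnn/basic_rnn_cell/weights", reuse=reuse)

w1 = build_rnn()            # first call creates the variable
w2 = build_rnn(reuse=True)  # second call must explicitly reuse it
assert w1 is w2
```

In real TF 1.x, the analogous fix is to wrap the calls in `tf.variable_scope(..., reuse=tf.AUTO_REUSE)` (or call `scope.reuse_variables()` before the second build) so repeated graph construction shares one set of weights instead of colliding.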
We are only using the RNN decoder (without an encoder) for text generation. How is the RNN decoder different from a pure RNN operation?

RNN decoder in TensorFlow: https://www.tensorflow.org/api_docs/python/tf/contrib/seq2seq/dynamic_rnn_decoder

Pure RNN in TensorFlow: https://www.tensorflow.org/api_docs/python/tf/nn/dynamic_rnn

Thanks for your time.