TensorFlow BasicLSTMCell initialization parameters

num_units: int, the number of units in the LSTM cell.
forget_bias: float, the bias added to the forget gate.
state_is_tuple: if True, accepted and returned states are 2-tuples of a c_state and an m_state. If False, they are concatenated along the column axis; the latter behavior will soon be deprecated.
activation: activation function of the inner states. Default: tanh.
reuse: Python boolean describing whether to reuse variables in an existing scope. If not True and the existing scope already has the given variables, an error is raised.
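To make num_units, forget_bias, and the (c_state, m_state) tuple concrete, here is a minimal NumPy sketch of one step of a basic LSTM cell. The function and variable names are illustrative only, not TensorFlow's internals; the gate ordering (i, j, f, o) follows the usual BasicLSTMCell convention.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, c, h, W, b, forget_bias=1.0, activation=np.tanh):
    """One step of a basic LSTM cell (no peepholes, no cell clipping).

    x:    input, shape (batch, input_size)
    c, h: previous cell and hidden state, each (batch, num_units)
    W:    kernel, shape (input_size + num_units, 4 * num_units)
    b:    bias, shape (4 * num_units,)
    """
    # One matmul produces the pre-activations of all four gates at once.
    z = np.concatenate([x, h], axis=1) @ W + b
    i, j, f, o = np.split(z, 4, axis=1)  # input, candidate, forget, output

    # forget_bias is added to the forget gate's pre-activation, which
    # reduces the scale of forgetting at the beginning of training.
    new_c = c * sigmoid(f + forget_bias) + sigmoid(i) * activation(j)
    new_h = activation(new_c) * sigmoid(o)

    # With state_is_tuple=True, this 2-tuple is the returned state:
    # new_c is the c_state, new_h the m_state.
    return new_c, new_h
```

Note how num_units fixes the width of both halves of the state, while the kernel packs all four gates into a single matrix of width 4 * num_units.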

Args:
  num_units: int, The number of units in the LSTM cell.
  forget_bias: float, The bias added to forget gates (see above).
  state_is_tuple: If True, accepted and returned states are 2-tuples of
    the c_state and m_state. If False, they are concatenated
    along the column axis. The latter behavior will soon be deprecated.
  activation: Activation function of the inner states. Default: tanh.
  reuse: (optional) Python boolean describing whether to reuse variables
    in an existing scope. If not True, and the existing scope already has
    the given variables, an error is raised.

class BasicLSTMCell(RNNCell):
  """Basic LSTM recurrent network cell.

  The implementation is based on: http://arxiv.org/abs/1409.2329.

  We add forget_bias (default: 1) to the biases of the forget gate in order to
  reduce the scale of forgetting in the beginning of the training.

  It does not allow cell clipping, a projection layer, and does not
  use peep-hole connections: it is the basic baseline. For advanced
  models, please use the full LSTMCell that follows.
  """

  def __init__(self, num_units, forget_bias=1.0,
               state_is_tuple=True, activation=None, reuse=None):
    """Initialize the basic LSTM cell.

    Args:
      num_units: int, The number of units in the LSTM cell.
      forget_bias: float, The bias added to forget gates (see above).
      state_is_tuple: If True, accepted and returned states are 2-tuples of
        the c_state and m_state.  If False, they are concatenated
        along the column axis.  The latter behavior will soon be deprecated.
      activation: Activation function of the inner states.  Default: tanh.
      reuse: (optional) Python boolean describing whether to reuse variables
        in an existing scope.  If not True, and the existing scope already has
        the given variables, an error is raised.
    """