First, the input is squashed between -1 and 1 using a tanh activation function. This can be expressed by:

$$g = \tanh(b^g + x_t U^g + h_{t-1} V^g)$$

Where $U^g$ and $V^g$ are the weights for the input and previous cell output, respectively, and $b^g$ is the input bias. Note that the exponents $g$ are not a raised power, but rather signify that these are the input weights and bias values (as opposed to the input gate, forget gate, output gate etc.).

This squashed input is then multiplied element-wise by the output of the input gate, which, as discussed above, is a series of sigmoid activated nodes:

$$i = \sigma(b^i + x_t U^i + h_{t-1} V^i)$$

The output of the input section of the LSTM cell is then given by:

$$g \circ i$$

Where the $\circ$ operator expresses element-wise multiplication.
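As a minimal sketch, the input section described above could be computed in NumPy as follows. The dimensions, random weight initialization, and variable names here are illustrative assumptions, not part of the original text:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical sizes: input dimension 3, hidden/cell dimension 4
rng = np.random.default_rng(0)
input_dim, hidden_dim = 3, 4

# Weights and bias for the squashed input (the "g" superscript)
U_g = rng.standard_normal((input_dim, hidden_dim))
V_g = rng.standard_normal((hidden_dim, hidden_dim))
b_g = np.zeros(hidden_dim)

# Weights and bias for the input gate (the "i" superscript)
U_i = rng.standard_normal((input_dim, hidden_dim))
V_i = rng.standard_normal((hidden_dim, hidden_dim))
b_i = np.zeros(hidden_dim)

x_t = rng.standard_normal(input_dim)   # current input
h_prev = np.zeros(hidden_dim)          # previous cell output

g = np.tanh(b_g + x_t @ U_g + h_prev @ V_g)   # squashed input, in (-1, 1)
i = sigmoid(b_i + x_t @ U_i + h_prev @ V_i)   # input gate, in (0, 1)

input_section = g * i   # element-wise (Hadamard) multiplication
```

Because the gate values lie strictly between 0 and 1, the element-wise product lets the network scale how much of the squashed input flows into the cell state.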