Unlike RNNs that recurrently process the tokens of a sequence one by one, self-attention ditches sequential operations in favor of parallel computation. To use the sequence order information, we can inject absolute or relative positional information by adding a positional encoding to the input representations. Positional encodings can be either learned or fixed. In the following, we describe a fixed positional encoding based on sine and cosine functions (Vaswani et al., 2017).

Suppose that the input representation $\mathbf{X} \in \mathbb{R}^{n \times d}$ contains the $d$-dimensional embeddings for the $n$ tokens of a sequence. The positional encoding outputs $\mathbf{X} + \mathbf{P}$ using a positional embedding matrix $\mathbf{P} \in \mathbb{R}^{n \times d}$ of the same shape, whose element in the $i$-th row and the $(2j)$-th or $(2j+1)$-th column is

$$
\begin{aligned}
p_{i,2j} &= \sin\!\left(\frac{i}{10000^{2j/d}}\right),\\
p_{i,2j+1} &= \cos\!\left(\frac{i}{10000^{2j/d}}\right).
\end{aligned}
\tag{10.6.2}
$$

At first glance, this trigonometric-function design may look odd. Before explaining it, let us first implement it in the following PositionalEncoding class.
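The excerpt names the PositionalEncoding class but its body is not included. Below is a minimal sketch in PyTorch, assuming a hidden size `num_hiddens` (the $d$ above), a `dropout` rate, and a precomputed maximum length `max_len`; these parameter names and the dropout step are illustrative choices, not taken verbatim from the source.

```python
import torch
from torch import nn

class PositionalEncoding(nn.Module):
    """Fixed sine/cosine positional encoding (Vaswani et al., 2017)."""

    def __init__(self, num_hiddens, dropout, max_len=1000):
        super().__init__()
        self.dropout = nn.Dropout(dropout)
        # Precompute a long-enough P of shape (1, max_len, num_hiddens);
        # assumes num_hiddens is even so sine/cosine columns pair up.
        self.P = torch.zeros((1, max_len, num_hiddens))
        # angles[i, j] = i / 10000^(2j / d), per Eq. (10.6.2)
        angles = torch.arange(max_len, dtype=torch.float32).reshape(-1, 1) / torch.pow(
            10000, torch.arange(0, num_hiddens, 2, dtype=torch.float32) / num_hiddens)
        self.P[:, :, 0::2] = torch.sin(angles)  # even columns: p_{i,2j}
        self.P[:, :, 1::2] = torch.cos(angles)  # odd columns: p_{i,2j+1}

    def forward(self, X):
        # Add the encodings for the first X.shape[1] positions, broadcasting
        # over the batch dimension.
        X = X + self.P[:, :X.shape[1], :].to(X.device)
        return self.dropout(X)
```

For instance, `PositionalEncoding(num_hiddens=32, dropout=0.1)(torch.zeros(2, 60, 32))` returns a tensor of the same (2, 60, 32) shape with position-dependent offsets added. Applying dropout to $\mathbf{X} + \mathbf{P}$ is a common regularization choice; returning the plain sum would also match Eq. (10.6.2).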
