SequenceAttentionLayer
SequenceAttentionLayer[]
is equivalent to AttentionLayer[] with a single "Input" port instead of ports "Key" and "Value". It should no longer be used.
SequenceAttentionLayer[net]
is equivalent to AttentionLayer[net] with a single "Input" port instead of ports "Key" and "Value". It should no longer be used.
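The stated equivalence suggests a direct migration path: feed the same sequence to both the "Key" and "Value" ports of AttentionLayer. The following is a minimal sketch, not taken from this page; the graph name, the port wiring, and the dimensions (a length-10 sequence of size-8 vectors, queried by 5 size-8 vectors) are illustrative assumptions.

(* Sketch: reproduce the single-"Input" behavior of the deprecated layer
   by routing one input to both the "Key" and "Value" ports. *)
attendGraph = NetGraph[
  <|"attention" -> AttentionLayer[]|>,
  {NetPort["Input"] -> {NetPort[{"attention", "Key"}], NetPort[{"attention", "Value"}]},
   NetPort["Query"] -> NetPort[{"attention", "Query"}]},
  "Input" -> {10, 8},  (* assumed shape: sequence of 10 size-8 vectors *)
  "Query" -> {5, 8}    (* assumed shape: 5 size-8 query vectors *)
]

(* Usage: each query vector attends over the shared input sequence. *)
attendGraph[<|"Input" -> RandomReal[1, {10, 8}], "Query" -> RandomReal[1, {5, 8}]|>]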
Text
Wolfram Research (2017), SequenceAttentionLayer, Wolfram Language function, https://reference.wolfram.com/language/ref/SequenceAttentionLayer.html.
CMS
Wolfram Language. 2017. "SequenceAttentionLayer." Wolfram Language & System Documentation Center. Wolfram Research. https://reference.wolfram.com/language/ref/SequenceAttentionLayer.html.
APA
Wolfram Language. (2017). SequenceAttentionLayer. Wolfram Language & System Documentation Center. Retrieved from https://reference.wolfram.com/language/ref/SequenceAttentionLayer.html
BibTeX
@misc{reference.wolfram_2025_sequenceattentionlayer, author="Wolfram Research", title="{SequenceAttentionLayer}", year="2017", howpublished="\url{https://reference.wolfram.com/language/ref/SequenceAttentionLayer.html}", note={Accessed: 23-May-2025}}
BibLaTeX
@online{reference.wolfram_2025_sequenceattentionlayer, organization={Wolfram Research}, title={SequenceAttentionLayer}, year={2017}, url={https://reference.wolfram.com/language/ref/SequenceAttentionLayer.html}, note={Accessed: 23-May-2025}}