SequenceAttentionLayer

SequenceAttentionLayer has been superseded by AttentionLayer, which was introduced in Version 12.0.

SequenceAttentionLayer[]

is equivalent to AttentionLayer[], with a single "Input" port taking the place of the ports "Key" and "Value". It should no longer be used.

SequenceAttentionLayer[net]

is equivalent to AttentionLayer[net], with a single "Input" port taking the place of the ports "Key" and "Value". It should no longer be used.
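For migration purposes, the port correspondence can be made concrete. The following is a minimal sketch, not taken from this page: the "Dot" scoring specification, the sample arrays, and the assumption that a parameter-free layer can be applied without NetInitialize are all illustrative. The sequence that old code would have fed to the single "Input" port is supplied to both the "Key" and "Value" ports of AttentionLayer.

  (* AttentionLayer with fixed dot-product scoring; it has no learnable
     parameters, so it is applied here directly to concrete data *)
  attention = AttentionLayer["Dot"];

  seq = {{1., 0.}, {0., 1.}, {1., 1.}};  (* illustrative length-3 sequence of 2-vectors *)
  query = {{0.5, 0.5}};                  (* illustrative sequence containing one query *)

  (* where old code fed seq to the single "Input" port, seq now plays both roles *)
  attention[<|"Key" -> seq, "Value" -> seq, "Query" -> query|>]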

Details

  • SequenceAttentionLayer[net] evaluates to AttentionLayer[net] with a single "Input" port, as the sketch below illustrates.
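In a version where the deprecated symbol is still available, this evaluation behavior can be checked directly; a brief sketch:

  layer = SequenceAttentionLayer[];  (* evaluates to an AttentionLayer *)
  Head[layer]                        (* AttentionLayer, per the note above *)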

Text

Wolfram Research (2017), SequenceAttentionLayer, Wolfram Language function, https://reference.wolfram.com/language/ref/SequenceAttentionLayer.html.

CMS

Wolfram Language. 2017. "SequenceAttentionLayer." Wolfram Language & System Documentation Center. Wolfram Research. https://reference.wolfram.com/language/ref/SequenceAttentionLayer.html.

APA

Wolfram Language. (2017). SequenceAttentionLayer. Wolfram Language & System Documentation Center. Retrieved from https://reference.wolfram.com/language/ref/SequenceAttentionLayer.html

BibTeX

@misc{reference.wolfram_2024_sequenceattentionlayer, author="Wolfram Research", title="{SequenceAttentionLayer}", year="2017", howpublished="\url{https://reference.wolfram.com/language/ref/SequenceAttentionLayer.html}", note={Accessed: 22-November-2024}}

BibLaTeX

@online{reference.wolfram_2024_sequenceattentionlayer, organization={Wolfram Research}, title={SequenceAttentionLayer}, year={2017}, url={https://reference.wolfram.com/language/ref/SequenceAttentionLayer.html}, note={Accessed: 22-November-2024}}