Apr 16, 2018 07:34
English term

run at and for each sequence time-step

English to Chinese Tech/Engineering IT (Information Technology) AI
About training RNN/LSTM: RNN and LSTM are difficult to train because they require memory-bandwidth-bound computation, which is the worst nightmare for hardware designers and ultimately limits the applicability of neural-network solutions. In short, LSTM requires 4 linear layers (MLP layers) per cell to {run at and for each sequence time-step}. Linear layers require large amounts of memory bandwidth to be computed; in fact, they often cannot use many compute units because the system does not have enough memory bandwidth to feed the computational units.
简而言之,LSTM需要每个单元4个线性层(MLP层),以便每个顺序时间步运行一次? (≈ "In short, LSTM needs 4 linear layers (MLP layers) per cell, so as to run once at each sequential time-step?")
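To make the phrase being translated concrete, here is a minimal NumPy sketch of one LSTM cell step. The four gate pre-activations (input, forget, candidate, output) are the "4 linear layers (MLP layers)" the source text refers to, and the loop shows that they run at (and for) every sequence time-step. All names and dimensions are illustrative, not from the source.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_cell_step(x, h_prev, c_prev, W, U, b):
    """One sequence time-step of an LSTM cell.

    The four linear layers are fused into one matrix multiply producing
    a (4 * hidden,) vector, then split into the four gates.
    """
    hidden = h_prev.shape[0]
    z = W @ x + U @ h_prev + b                 # the 4 linear layers, fused
    i = sigmoid(z[0 * hidden:1 * hidden])      # input gate
    f = sigmoid(z[1 * hidden:2 * hidden])      # forget gate
    g = np.tanh(z[2 * hidden:3 * hidden])      # candidate cell state
    o = sigmoid(z[3 * hidden:4 * hidden])      # output gate
    c = f * c_prev + i * g                     # new cell state
    h = o * np.tanh(c)                         # new hidden state
    return h, c

# Toy dimensions, purely illustrative.
rng = np.random.default_rng(0)
input_dim, hidden = 8, 16
W = rng.standard_normal((4 * hidden, input_dim)) * 0.1
U = rng.standard_normal((4 * hidden, hidden)) * 0.1
b = np.zeros(4 * hidden)

h = np.zeros(hidden)
c = np.zeros(hidden)
# The weight matrices W and U are re-read from memory at *every*
# time-step: this is the memory-bandwidth bottleneck described above.
for t in range(10):                            # a sequence of 10 time-steps
    x = rng.standard_normal(input_dim)
    h, c = lstm_cell_step(x, h, c, W, U, b)
```

Note that the loop over `t` is inherently sequential (each step consumes the previous `h` and `c`), which is why the per-step linear layers cannot simply be batched away.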
Proposed translations (Chinese)
5 FYI
3 针对每次时间步长运行 ("run for each time-step")
Change log

Apr 16, 2018 07:34: changed "Kudoz queue" from "In queue" to "Public"

Proposed translations

17 mins
Selected

FYI

I can only read it as "以便在每个顺序时间步运行,并为了每个顺序时间步而运行" ("to run at each sequential time-step, and for each sequential time-step").

The number of runs per unit is not specified here.

--------------------------------------------------
Note added at 2 hrs (2018-04-16 10:13:04 GMT)
--------------------------------------------------

"at and for" carries two layers of meaning; neither can be ignored.

--------------------------------------------------
Note added at 2 hrs (2018-04-16 10:14:55 GMT)
--------------------------------------------------

The number of runs is not specified here.
4 KudoZ points awarded for this answer. Comment: "谢谢!" ("Thanks!")
1 hr

针对每次时间步长运行 ("run for each time-step")

FYI

--------------------------------------------------
Note added at 1 hr (2018-04-16 09:07:43 GMT)
--------------------------------------------------

针对每次顺序时间步长运行 ("run for each sequential time-step")