Neural Network Operations

The Wolfram Language makes it easy to operate on neural networks through their symbolic representation. A new architecture can be built from an existing one by taking or adding layers, selecting and combining subgraphs, or replacing specific parts or patterns. Networks can then be trained on a variety of data and optimized for a specific task.
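A minimal sketch of this symbolic style, using a hypothetical base network: an existing chain is modified by replacing a layer or appending new ones, and the unspecified shapes are re-inferred automatically.

```wolfram
(* hypothetical base network for illustration *)
base = NetChain[{LinearLayer[64], Ramp, LinearLayer[10]}, "Input" -> 20];

(* replace the first layer with a wider one; downstream shapes are re-inferred *)
wider = NetReplacePart[base, 1 -> LinearLayer[128]];

(* extend the chain with additional layers *)
deeper = NetAppend[base, {Ramp, LinearLayer[5]}];
```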

Network Training

NetTrain train parameters in a net from examples

NetTrainResultsObject represent the results of a training session
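As a sketch, NetTrain can be applied to a single linear layer on synthetic data; passing All as the third argument returns a NetTrainResultsObject instead of just the trained net. The data here is invented for illustration.

```wolfram
(* toy task: learn y = 2x from generated examples *)
net = NetChain[{LinearLayer[1]}, "Input" -> 1];
data = Table[{x} -> {2 x}, {x, -1., 1., 0.1}];

trained = NetTrain[net, data];          (* returns the trained net *)
results = NetTrain[net, data, All];     (* returns a NetTrainResultsObject *)

trained[{0.5}]                          (* should be close to {1.} *)
```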

Network Operations

NetInitialize randomly initialize parameters for a net

NetInsertSharedArrays convert all arrays in a net into shared net arrays
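These two operations can be sketched together: NetInitialize fills in the learnable arrays of an untrained net, and NetInsertSharedArrays converts them to shared arrays so that copies of the net refer to the same parameters. The architecture below is arbitrary.

```wolfram
(* an arbitrary untrained chain *)
net = NetChain[{LinearLayer[10], Ramp, LinearLayer[1]}, "Input" -> 5];

(* randomly initialize all learnable parameters *)
initialized = NetInitialize[net];

(* convert the arrays to shared net arrays *)
shared = NetInsertSharedArrays[initialized];
```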

Network Surgery

NetExtract extract a specific layer or subgraph

NetUnfold expose recurrent states of a net

NetReplacePart replace layers or layer properties

NetFlatten flatten nested net structures like subgraphs

NetJoin combine a series of nets

NetRename rename layers or subparts of a net

NetAppend, NetPrepend add one or more layers at the end or beginning of a net

NetTake  ▪  NetDrop  ▪  NetInsert  ▪  NetDelete  ▪  NetReplace
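A few of the surgery operations above, sketched on an arbitrary five-layer chain; each call returns a new net, leaving the original unchanged.

```wolfram
net = NetChain[
  {LinearLayer[32], Ramp, LinearLayer[16], Ramp, LinearLayer[1]},
  "Input" -> 10];

NetExtract[net, 1]                         (* the first LinearLayer *)
NetTake[net, 3]                            (* layers 1 through 3 *)
NetDrop[net, -2]                           (* remove the last two layers *)
NetReplacePart[net, 5 -> LinearLayer[2]]   (* swap the output layer *)
```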

Network Composition

NetChain chain composition of net layers

NetGraph graph of net layers
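The difference between the two composition forms can be sketched as follows: NetChain handles purely linear pipelines, while NetGraph allows arbitrary wiring, such as two parallel branches summed by a TotalLayer. The layer sizes are arbitrary.

```wolfram
(* linear pipeline *)
chain = NetChain[{LinearLayer[8], Tanh, LinearLayer[1]}, "Input" -> 4];

(* two parallel branches, summed, then projected *)
graph = NetGraph[
  {LinearLayer[8], LinearLayer[8], TotalLayer[], LinearLayer[1]},
  {1 -> 3, 2 -> 3, 3 -> 4},
  "Input" -> 4];
```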

Higher-Order Network Construction

NetMapOperator map over a sequence

NetMapThreadOperator map over multiple sequences

NetFoldOperator recurrent network that folds in elements of a sequence

NetBidirectionalOperator bidirectional recurrent network

NetNestOperator apply the same operation multiple times
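These higher-order constructors take a net and lift it to operate over sequences or repeated applications; two small sketches with arbitrary layer sizes:

```wolfram
(* apply a linear layer independently to each element of a sequence *)
mapper = NetInitialize[NetMapOperator[LinearLayer[3, "Input" -> 2]]];
mapper[{{1., 2.}, {3., 4.}, {5., 6.}}]   (* a length-3 sequence of size-3 vectors *)

(* apply the same layer three times in succession *)
nested = NetNestOperator[LinearLayer[4, "Input" -> 4], 3];
```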

Special Training Operators

NetPairEmbeddingOperator train a Siamese neural network

NetGANOperator train generative adversarial networks (GAN)
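A sketch of the Siamese case: NetPairEmbeddingOperator wraps an embedding net (here an arbitrary chain) so that it maps pairs of inputs to a distance, which can then be trained on pairs labeled True (similar) or False (dissimilar).

```wolfram
(* arbitrary embedding net for illustration *)
embedding = NetChain[{LinearLayer[16], Ramp, LinearLayer[8]}, "Input" -> 10];

(* takes a pair of inputs and outputs a distance between their embeddings *)
siamese = NetPairEmbeddingOperator[embedding];
```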