@tmsimont
Created May 9, 2016 16:12
// Assumes DL4J 0.4.x-era packages; `builder` is an existing
// NeuralNetConfiguration.Builder configured elsewhere.
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration.ListBuilder;
import org.deeplearning4j.nn.conf.layers.GravesLSTM;
import org.deeplearning4j.nn.conf.layers.RnnOutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction;

// Two-layer network: one hidden LSTM layer plus an RNN output layer.
ListBuilder listBuilder = builder.list(2);

// Hidden layer: Graves-style LSTM, 100 inputs -> 100 outputs, tanh activation.
GravesLSTM.Builder hiddenLayerBuilder = new GravesLSTM.Builder();
hiddenLayerBuilder.nIn(100);
hiddenLayerBuilder.nOut(100);
hiddenLayerBuilder.activation("tanh");
listBuilder.layer(0, hiddenLayerBuilder.build());

// Output layer: softmax over 100 outputs with multi-class cross-entropy loss.
RnnOutputLayer.Builder outputLayerBuilder = new RnnOutputLayer.Builder(LossFunction.MCXENT);
outputLayerBuilder.activation("softmax");
outputLayerBuilder.nIn(100);
outputLayerBuilder.nOut(100);
listBuilder.layer(1, outputLayerBuilder.build());

// Supervised training only: no layerwise pretraining, standard backprop.
listBuilder.pretrain(false);
listBuilder.backprop(true);

MultiLayerConfiguration conf = listBuilder.build();
MultiLayerNetwork net = new MultiLayerNetwork(conf);
net.init();

// some time later... feed input (an INDArray) through the network,
// carrying the RNN's internal state forward between calls.
net.rnnTimeStep(input);
tmsimont commented May 9, 2016

How can I swap out RnnOutputLayer.Builder for a custom builder class, and have it map to a custom Layer model, without rewriting the factory code here: https://github.com/deeplearning4j/deeplearning4j/blob/b1ffaddb67b6cf89ee97d40db931dd95f6cab742/deeplearning4j-core/src/main/java/org/deeplearning4j/nn/layers/factory/DefaultLayerFactory.java#L69
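For context, the factory linked above dispatches on the layer-configuration class with hard-coded checks, so adding a new layer type means editing that file. One common way to make such a factory extensible is a registry that maps configuration classes to constructor functions. Below is a minimal, self-contained sketch of that pattern; the LayerConf, Layer, and LayerRegistry names are hypothetical stand-ins, not DL4J API:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

public class LayerRegistry {
    // Hypothetical stand-ins for DL4J's layer-configuration and layer types.
    interface LayerConf {}
    interface Layer { String describe(); }

    static class LstmConf implements LayerConf {}
    static class CustomOutputConf implements LayerConf {}

    static class LstmLayer implements Layer {
        public String describe() { return "lstm"; }
    }
    static class CustomOutputLayer implements Layer {
        public String describe() { return "custom-output"; }
    }

    // Registry: configuration class -> factory function. New layer types are
    // registered at startup instead of being added to an if/else chain.
    private final Map<Class<? extends LayerConf>, Function<LayerConf, Layer>> factories =
            new HashMap<>();

    <C extends LayerConf> void register(Class<C> confClass, Function<LayerConf, Layer> factory) {
        factories.put(confClass, factory);
    }

    Layer create(LayerConf conf) {
        Function<LayerConf, Layer> factory = factories.get(conf.getClass());
        if (factory == null) {
            throw new IllegalArgumentException("No factory registered for " + conf.getClass());
        }
        return factory.apply(conf);
    }

    public static void main(String[] args) {
        LayerRegistry registry = new LayerRegistry();
        registry.register(LstmConf.class, c -> new LstmLayer());
        // A user plugs in a custom layer without touching the factory source:
        registry.register(CustomOutputConf.class, c -> new CustomOutputLayer());

        System.out.println(registry.create(new LstmConf()).describe());
        System.out.println(registry.create(new CustomOutputConf()).describe());
    }
}
```

With this shape, a custom builder/configuration pair only needs one register(...) call at setup time; the create(...) dispatch never changes.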
