
# Integration 

 # What to do with the Data 
Now that you have successfully trained your `MLP`, you might want to integrate it with other tools.
 ## Evaluating from Java 
First of all, you'll need to grab a copy of [`libai`](https://github.com/kronenthaler/libai) and add it to your classpath.
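
For instance, a minimal sketch of compiling and running against the library from the command line, assuming you built the library into a jar called `libai.jar` and your code lives in a class called `YourApp` (both names are placeholders; `:` is the classpath separator on Unix-like systems):
```
$ javac -cp libai.jar YourApp.java
$ java -cp libai.jar:. YourApp
```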

Then here's a code sample you can use as a guide:
 ~~~Java 
 import libai.nn.supervised.MLP; 
 import libai.common.Matrix; 

 ... 
 //The rest of your code 
 ... 

 public double f(double x) { 
     MLP mlp = MLP.open("weights.dat"); //<- the weights you want to use 
     Matrix m = new Matrix(1, 1);         //<- we only use single neuron inputs 
     m.position(0, 0, x);                 //<- set the value in the matrix 
     return mlp.simulate(m).position(0, 0); //<- we only use single neuron output 
 } 
 ~~~ 
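
If you need to evaluate many points, it's cheaper to open the network once and reuse it. Here's a minimal sketch along those lines, using only the calls shown above (the weights file name, range and step are placeholders):

~~~Java
import libai.nn.supervised.MLP;
import libai.common.Matrix;

public class EvaluateMany {
    public static void main(String[] args) {
        MLP mlp = MLP.open("weights.dat"); // load the trained weights once
        Matrix input = new Matrix(1, 1);   // single neuron input, as above

        for (double x = 0.0; x <= 1.0; x += 0.1) {
            input.position(0, 0, x);                       // set the input value
            double y = mlp.simulate(input).position(0, 0); // read the single output
            System.out.println(x + "\t" + y);
        }
    }
}
~~~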

 ## Evaluating from CLI 
 ``` 
 $ java -jar mrft-VERSION.jar FILENAME VALUES 
 ``` 
So, for instance, if you trained the MLP with `cos(x)`, it should output:
 ``` 
 $ java -jar mrft-VERSION.jar weights.dat 0 
   1.0 
 ``` 
You can also use the `-csv` or `-tsv` flags, in which case the output will be:
 ``` 
 $ java -jar mrft-VERSION.jar weights.dat -csv 0 
   0.0, 1.0 
 ``` 
The output always goes to `stdout`, so you can pipe it through something like `tee` to create a file:
 ``` 
 $ java -jar mrft-VERSION.jar -csv FILENAME VALUES | tee OUT.csv 
 ``` 

 ## GNU Octave 
The current version of `libai` supports exporting matrices as Octave level-1 binary matrices, but it doesn't do that for `MLP` yet. So if someone actually wants to collaborate with some code, that would be welcome; otherwise, wait a bit until I have some spare time.
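
In the meantime, a workaround is to sample the trained network from Java and load the samples in Octave (essentially the same thing the `-csv` route above gives you, but from within Java). A minimal sketch, reusing only the calls from the snippet above; the file names, range and step are placeholders:

~~~Java
import libai.nn.supervised.MLP;
import libai.common.Matrix;

import java.io.IOException;
import java.io.PrintWriter;

public class ExportSamples {
    public static void main(String[] args) throws IOException {
        MLP mlp = MLP.open("weights.dat"); // the trained weights, as above
        Matrix input = new Matrix(1, 1);   // single neuron input

        try (PrintWriter out = new PrintWriter("samples.csv")) {
            for (double x = -Math.PI; x <= Math.PI; x += 0.01) {
                input.position(0, 0, x);
                double y = mlp.simulate(input).position(0, 0);
                out.println(x + "," + y); // one "x,f(x)" pair per line
            }
        }
    }
}
~~~

The resulting file can then be read in Octave with something like `data = dlmread("samples.csv", ",")`.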