rparedespalacios / layers
Neural Network toolkit
License: MIT License
bison -oasin.c -d asin.y
gcc -c asin.c -pedantic
asin.c: In function ‘yyparse’:
asin.c:1404:16: warning: implicit declaration of function ‘yylex’ [-Wimplicit-function-declaration]
yychar = yylex ();
^
asin.y:41:7: warning: implicit declaration of function ‘begin_experiment’ [-Wimplicit-function-declaration]
{ begin_experiment(); }
^
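These implicit-declaration warnings can be silenced by forward-declaring `yylex` and the action helpers in the grammar's prologue. A hedged sketch; the real signature of `begin_experiment` should be taken from the project's headers:

```yacc
%{
/* asin.y prologue: forward declarations so -pedantic compiles cleanly.
   These signatures are assumptions; match them to the project headers. */
int  yylex(void);
void yyerror(const char *msg);
void begin_experiment(void);
%}
```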
flex -oalex.c alex.l
gcc -c alex.c -pedantic
In file included from alex.l:7:0:
nettable.h:59:14: error: conflicting types for ‘yyleng’
extern int yyleng;
^
alex.c:304:11: note: previous declaration of ‘yyleng’ was here
yy_size_t yyleng;
^
Makefile:17: recipe for target 'alex.o' failed
make: *** [alex.o] Error 1
No multithreading in MLP
Compiling with gcc installed from Homebrew on OSX gives the error:
suffix or operands invalid for `movq'
To solve this error, remove the links to the Homebrew gcc with
brew unlink gcc
and use the default gcc compiler.
Do not use bias in linear projection if batch normalization is activated
const{
seed=1 //Fix random number generation for every random parameter:
//-different neural network parameter initialization
//-same Gaussian noise generation for reproducing a fixed experiment
threads=4
batch=100
log="B_net.log"
}
data {
D1 [filename=$env:"/pathToFile/training", binary] //Read the data file path from an environment variable
D2 [filename=$env:"/pathToFile/test", binary] //Read the data file path from an environment variable
}
//NETWORK
network N1 {
//data
data tr D1 //mandatory
data ts D2
// Fully Connected Input
FI in
// Fully connected
F f1 [numnodes=1000]
F f2 [numnodes=500]
F f3 [numnodes=250]
F f4 [numnodes=250]
F f5 [numnodes=250]
// Fully Connected Output
FO out [classification]
// Connections
in->f1
f1->f2
f2->f3
f3->f4
f4->f5
f5->out
}
//RUN SCRIPT
script {
// To zero one range
D1.minus(D1.min)
D1.div(D1.max-D1.min)
//To -1 to 1 range
D1.minus(D1.median)
D1.div(D1.min.abs)
//Also built-in functions
// To zero one range
D1.zero-one.(D1.min,D1.max)
//
// D1.min.abs (Absolute value of min value)
// D1.max.sqr (Square value of max value)
N1.in.noisesmean=D1.mean //Use data statistics as possible parameters
N1.in.noisesd=D1.std
N1.f1.noisemean=f1In.mean //Data statistics at each layer's input
N1.f1.noised=f1In.std
N1.mu=0.01
N1.bn=1
N1.mmu=0.9
N1.train(100)
N1.mu=0.001
N1.train(50)
N1.save("B_net_v7")
//Evaluate a model
N1.load("B_net_v7")
N1.evaluate(D2)
}
Add the possibility of using data statistics, for example the mean of the data, for specifying the Gaussian noise mean.
Add a Layers data path environment variable, e.g. Layers_DATA_PATH. Easier for both the programmer and the toolkit user.
- Consider adding a seed flag to change the random weight initialization, to perform different experiments with the same test set. An option to automatically perform cross-validation could also be added.
After loading a pre-trained network, the test error cannot be obtained. For example, N1.testout("test.txt") could print the test error at the top of the output.
The library "fl" gives me a compilation error on Mac. I need to change line 6 of the Makefile:
--- CC_LIB = -lfl
+++ CC_LIB = -ll
Making this change solved the compilation error.
div(x) gives a 0-1 normalization only if min(data)=0. Consider adding div(y,x), where y is the minimum of the data and x the maximum.
Add the possibility of decaying the learning rate following a function of the number of epochs of the experiment.
I get the error "#error The Eigen/Array header does no longer exist in Eigen3."
I think including the Eigen folder in the CXXFLAGS is not a good practice (see example http://eigen.tuxfamily.org/dox/GettingStarted.html).
Instead, include the "Eigen" path in all the .cpp/.h files. For instance, in Layers.cpp (line 5):
--- #include "Dense"
+++ #include "Eigen/Dense"
Making this change in all the files solved the compilation error.
Segmentation fault during training; error data in the attached file:
error_001.txt
Corpus: MNIST
NET file:
MNIST_cnn.txt
Bison is now a prerequisite of the tool
Multithreading is not done in MLP