Comments (21)

robertleeplummerjr commented on April 28, 2024

@riatzukiza I'd like to continue convolutional neural nets here. You showed interest, I'd love to work together to achieve this.

After researching this, I've come to the conclusion that a "convolutional" network isn't really a type of network; it is more an added intermediate layer between the layers of an already defined type of network (I'll note here that I've been wrong before, will be wrong again, and could be wrong now). That being said, I believe we can achieve this fairly easily with our existing codebase by embracing the ability to process data as either an array or a matrix (arguably just a larger flat array). What does this mean? Any network we have could be convolutional with minimal effort.

Speaking naively here...
At this point we have the following networks:

  • Feedforward - new brain.NeuralNetwork()
  • RNN - new brain.recurrent.RNN()
  • LSTM - new brain.recurrent.LSTM()
  • GRU - new brain.recurrent.GRU()
  • More networks yay!

Imagine rather than having to do this (which increases our technological footprint exponentially):

  • Feedforward - new brain.NeuralNetwork()
  • RNN - new brain.recurrent.RNN()
  • LSTM - new brain.recurrent.LSTM()
  • GRU - new brain.recurrent.GRU()
  • More networks... insert jaws theme here!
  • Feedforward Convolutional - new brain.convolutional.NeuralNetwork()
  • RNN Convolutional - new brain.convolutional.RNN()
  • LSTM Convolutional - new brain.convolutional.LSTM()
  • GRU Convolutional - new brain.convolutional.GRU()
  • More networks... insert jaws theme here!

We simply do this:

  • Feedforward - new brain.NeuralNetwork({ convolutional: true || false })
  • RNN - new brain.recurrent.RNN({ convolutional: true || false })
  • LSTM - new brain.recurrent.LSTM({ convolutional: true || false })
  • GRU - new brain.recurrent.GRU({ convolutional: true || false })
  • More networks yay!

Here we gain an optional on/off switch, and a clear distinction of what gives a neural network this capability.
Thoughts? More to come!

robertleeplummerjr commented on April 28, 2024

Not yet. We'd love help, and we're currently working on it.

robertleeplummerjr commented on April 28, 2024

Good points!
In other libraries it was as if they were saying: "Neural networks are so easy! All you have to do is know how they work, and then you can create one!"

When I first saw brain.js, I was shocked at the simplicity of:

var net = new brain.NeuralNetwork();
net.train();

Instantly I was enamored. Then I dug further and found the toFunction method, which compiles the entire network into a single standalone function. Some time later, I stumbled upon recurrent.js, and I thought: "Maybe we could offer the same API in brain.js for a recurrent neural net!"

And after nearly a year, we have:

var net = new brain.recurrent.RNN();
net.train();
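To make those snippets concrete, here is a complete run of the existing feedforward API on toy XOR data, including the toFunction method mentioned above (assuming the published package name):

// train the feedforward net on XOR and run it
var brain = require('brain.js'); // assumption: the published package name
var net = new brain.NeuralNetwork();

net.train([
  { input: [0, 0], output: [0] },
  { input: [0, 1], output: [1] },
  { input: [1, 0], output: [1] },
  { input: [1, 1], output: [0] }
]);

net.run([0, 1]); // close to 1

// toFunction compiles the whole trained network into one standalone function
var run = net.toFunction();
run([0, 1]); // same result, without the library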

The reason neural networks are hard to understand, I find, is that nearly every library abstracts the building of the neural network, when that can be done fairly easily once and then reused. The only thing that distinguishes each neural net from another (more or less) is what goes in the hidden layers.

In the case of the LSTM: https://github.com/harthur-org/brain.js/blob/master/src/recurrent/lstm.js
Composing a new neural net is simply a matter of knowing its math and extending RNN to use it. If you were interested in the architecture of the neural net, you could look at the internals of RNN and see it there.

The idea I think we are working towards is composition over inheritance. At this stage I would like to keep things as simple, and as close to that principle, as possible.
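As a sketch of what that composition could look like (a hypothetical subclass; the getEquation hook and its signature are modeled on the linked lstm.js and may differ from the actual internals):

import RNN from './rnn';

// hypothetical net: all that changes from RNN is the hidden-layer math
export default class MyNet extends RNN {
  getEquation(equation, input, previousResult, hiddenLayer) {
    // weigh the current input, weigh the previous timestep's result,
    // add the bias, and squash with a sigmoid
    return equation.sigmoid(
      equation.add(
        equation.add(
          equation.multiply(hiddenLayer.weight, input),
          equation.multiply(hiddenLayer.transition, previousResult)
        ),
        hiddenLayer.bias
      )
    );
  }
}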

robertleeplummerjr commented on April 28, 2024

There was a streaming api, can these networks be piped into and out of one another?

tl;dr: no
Long answer: there still is one. As it stands, we are streaming data in, not delegating a hidden layer's output to another hidden layer (or to the output) via a stream.

robertleeplummerjr commented on April 28, 2024

@ddsol is working on a formula composer that can be reused across the nets. From what he has shown me, it will consist of simple functional math, and before it runs it compiles to a loop-less reusable function. He is also working on a GPU counterpart.

In the meantime, to get a simple proof-of-concept convolutional neural net running with minimal effort, I went ahead and did the following locally, though it is not yet fully working:

export default class CNN extends NeuralNetwork {
  /**
   * Forward pass: standard feedforward layers, with a convolution applied
   * to each layer's output before it feeds the next layer
   * @param input
   * @returns {*}
   */
  runInput(input) {
    this.outputs[0] = input;  // set output state of input layer

    let output = null;
    for (let layer = 1; layer <= this.outputLayer; layer++) {
      for (let node = 0; node < this.sizes[layer]; node++) {
        let weights = this.weights[layer][node];

        // weighted sum of the previous layer's (convolved) output, plus bias
        let sum = this.biases[layer][node];
        for (let k = 0; k < weights.length; k++) {
          sum += weights[k] * input[k];
        }
        // sigmoid activation
        this.outputs[layer][node] = 1 / (1 + Math.exp(-sum));
      }
      // convolve this layer's activations; the result becomes the next layer's input
      output = input = magicConvolutionFunction(this.outputs[layer]);
    }
    return output;
  }
}

The magicConvolutionFunction (not really its name) I'm still working on locally; I should have something pushed either tomorrow or Tuesday. I'm essentially reworking https://github.com/karpathy/convnetjs/blob/master/src/convnet_layers_dotproducts.js#L43, understanding the logical flow, and simplifying it to brain.js's standards. More will come soon!
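In the meantime, here is roughly what such a convolution step does in plain, loop-based form. This is a naive single-channel sketch with an assumed row-major flat-array layout; the function name and signature are illustrative, not the actual implementation:

// naive 2D convolution over flat arrays: a loop-based stand-in for the
// compiled version being reworked from convnetjs; single channel, row-major
function convolve(input, inputWidth, inputHeight, filter, filterSize, stride, pad) {
  const outWidth = Math.floor((inputWidth + 2 * pad - filterSize) / stride) + 1;
  const outHeight = Math.floor((inputHeight + 2 * pad - filterSize) / stride) + 1;
  const output = new Float32Array(outWidth * outHeight);
  for (let outY = 0; outY < outHeight; outY++) {
    for (let outX = 0; outX < outWidth; outX++) {
      let sum = 0;
      for (let fy = 0; fy < filterSize; fy++) {
        for (let fx = 0; fx < filterSize; fx++) {
          const inY = outY * stride - pad + fy;
          const inX = outX * stride - pad + fx;
          // skip positions that fall in the zero padding
          if (inY >= 0 && inY < inputHeight && inX >= 0 && inX < inputWidth) {
            sum += filter[fy * filterSize + fx] * input[inY * inputWidth + inX];
          }
        }
      }
      output[outY * outWidth + outX] = sum;
    }
  }
  return output;
}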

robertleeplummerjr commented on April 28, 2024

I worked out the logical flow and simplified the naming convention so almost anyone can make out what is going on in there.

You'll notice that it is essentially the same logic; the main difference is that I'm implementing a means of compiling a couple of functions that contain all the math, with no loops, no instantiation of new objects/arrays, and no function calls within them. The upside is that this is immensely faster.

Another positive is that the iteration used to build the forward function (runKernel) can now be reused to build the backpropagation function (runBackpropagateKernel).
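To illustrate the compiling idea with a toy example (illustrative only, not the actual runKernel builder): unroll the math for one fixed-size node into source text once, then compile it with new Function so the hot path has no loops, no allocation, and no inner function calls.

// build a loop-less "kernel" for one sigmoid node with fixed weights
function compileNodeKernel(weights, bias) {
  const terms = [];
  for (let k = 0; k < weights.length; k++) {
    terms.push(weights[k] + ' * input[' + k + ']');
  }
  const src = 'var sum = ' + bias + ' + ' + terms.join(' + ') + ';' +
    ' return 1 / (1 + Math.exp(-sum));';
  return new Function('input', src); // compiled once, reused every run
}

const node = compileNodeKernel([0.2, -0.5, 0.1], 0.05);
node([1, 0, 1]); // straight-line math: no loops, no allocation, no calls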

As for the main implementation of the convolutional neural net: I've basically just extended the base NeuralNetwork, and I'll attach it to the Convolution when those methods/constructors materialize.

robertleeplummerjr commented on April 28, 2024

@riatzukiza, I believe I have seen the light of your statement. But I think we can make things easily composable, easily configurable, and still have a means of auto-configuration. After a bit of tinkering, I can see how convolutions offer a much wider range of configurability than vanilla neural networks, and because of that we need to be able to define configurations. I've pondered this for a while, and it may be pure trash if the community doesn't like it. I do plan on coming back around to the convolutional: true | false idea, but it may not be practical now that I've dug into the idea more. Just playing around with syntax definitions, what do you think of this?

new CNN({
  input: () => new Input(35, 35, 3),
  hiddenLayers: [
    (input) => new Convolution(input, {
      depth: 1,
      filters: 6,
      padding: 1,
      stride: 1
    }),
    Relu,
    (input) => new Pool(input, {
      padding: 1,
      stride: 1,
      width: 4,
      height: 4
    }),
    40, //eventually backward compatible flat array, just for visuals
  ],
  output: (input) => new Output(tbd)
});

In this way we get:

  • a flat list
  • hierarchy of the structure of the hidden layers
  • composability we like with js/es6
  • a means of creating defaults for hiddenLayers when auto-configured via the train() method, if so desired
  • potential backward compatibility with default layer types
  • the ability to define your own type of layer

The reason for lambdas and bare constructors: constructors will be instantiated with new and passed the previous layer's output. The more advanced lambda usage receives the previous layer's output, which can be passed into the layer being defined along with the options that configure it; the resulting layer is returned from the lambda and passed on to the next constructor or lambda. Output is yet to be determined.
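A minimal sketch of how the constructor could walk that list (a hypothetical connectLayers helper; the numeric backward-compatible entries are skipped here for brevity):

// hypothetical wiring for the syntax above: thread the previous layer through
// each entry, instantiating bare constructors with `new` and calling lambdas
// directly (arrow functions have no prototype, which distinguishes the two)
function connectLayers(input, hiddenLayers, output) {
  let previous = input();
  for (const definition of hiddenLayers) {
    previous = definition.prototype
      ? new definition(previous)  // bare constructor, e.g. Relu
      : definition(previous);     // lambda receiving the previous layer
  }
  return output(previous);
}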

thoughts?

riatzukiza commented on April 28, 2024

I like the functional arguments

ddsol commented on April 28, 2024

By passing functions as arguments, you can't figure out what's in them until you call them with actual data. This means you can't transcribe or optimize the entire network function, only the parts.

I think it's better to be declarative and describe what the whole thing should do using POJOs or strings or something else inert (non-code). The functions won't be doing anything fancy anyway. All the "custom code" in those functions will ever do is call one of the predefined constructors.

robertleeplummerjr commented on April 28, 2024

@riatzukiza, ty for the feedback! @ddsol, I really like where you are going with this as well. Do you by chance have any recommendations or pseudocode? I know you are doing some work on the RNN; care to share sometime soon, as I believe it follows a similar path?

ddsol commented on April 28, 2024

One possibility, for instance, would be:

theCnn = new CNN({
  input: {
    size: 3 // <== only size is known, the actual input comes later. Maybe flexible size is possible instead?
  },
  hiddenLayers: [
    {
      type: Convolution,
      depth: 1,
      filters: 6,
      padding: 1,
      stride: 1
    },
    Relu,
    {
      type: Pool,
      padding: 1,
      stride: 1,
      width: 4,
      height: 4
    },
    40 //eventually backward compatible flat array, just for visuals (???)
  ] //output is implicit
}); // <== this compiles a NN and possibly optimizes it

theCnn.run([35, 35, 3]); // <== pass the data after construction, possibly multiple times
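A sketch of how those inert descriptors might be interpreted internally (the buildLayer helper and FlatLayer type are hypothetical, not actual brain.js API):

// turn one inert descriptor into a layer instance
function buildLayer(descriptor, previousLayer) {
  if (typeof descriptor === 'number') {
    // assumed backward-compatible flat layer of that many nodes
    return new FlatLayer(previousLayer, descriptor);
  }
  if (typeof descriptor === 'function') {
    // a bare constructor such as Relu, needing no options
    return new descriptor(previousLayer);
  }
  // a POJO: fully inspectable before any data flows, so the whole
  // network can be transcribed and optimized up front
  return new descriptor.type(previousLayer, descriptor);
}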

I'm not currently working on any RNN or CNN or any other NN.

robertleeplummerjr commented on April 28, 2024

@ddsol thank you for the suggestion and clarification, I really like what you did there.

robertleeplummerjr commented on April 28, 2024

As a side note, the algorithms used (at least at this time) will compile a simple kernel rather than running all the math directly, for performance. If you're interested in why, check this out: https://jsperf.com/loop-vs-loopless

While this isn't exactly the same thing and the size is substantially smaller, on my system a loopless or "compiled" algorithm comes out 85 times faster. If we can get twice as fast as other current libraries, I'm happy.
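A quick way to see the effect locally (an illustrative micro-benchmark, not the linked jsperf case; exact numbers vary by engine):

// compare a looped dot product with an unrolled ("compiled") one
const w = [0.1, 0.2, 0.3, 0.4];
const input = [1, 2, 3, 4];

function looped(input) {
  let sum = 0;
  for (let k = 0; k < w.length; k++) sum += w[k] * input[k];
  return sum;
}

const unrolled = new Function('input',
  'return 0.1 * input[0] + 0.2 * input[1] + 0.3 * input[2] + 0.4 * input[3];');

console.time('looped');
for (let i = 0; i < 1e7; i++) looped(input);
console.timeEnd('looped');

console.time('unrolled');
for (let i = 0; i < 1e7; i++) unrolled(input);
console.timeEnd('unrolled');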

Dok11 commented on April 28, 2024

@robertleeplummerjr was a convolutional NN ever published in brain.js, or is this only a discussion?

robertleeplummerjr commented on April 28, 2024

It is partially implemented here: https://github.com/harthur-org/brain.js/tree/convolutional/src. I'll commit what I have locally later today; it isn't fully working yet, but it is substantially further along than what is there.

robertleeplummerjr commented on April 28, 2024

I forgot to mention, on the loop vs. loopless comparison: that 85-times speedup is CPU only, not even GPU yet.

robertleeplummerjr commented on April 28, 2024

Here is the output of the not-yet-working convolutional kernel: https://gist.github.com/robertleeplummerjr/ea2a4cd06012ccbca47a8597522759c4

from https://github.com/harthur-org/brain.js/tree/convolutional

exciting stuff!

niembro64 commented on April 28, 2024

was a CNN ever achieved here?

(i.e. a convolutional layer / pooling layer)

RezaErfani67 commented on April 28, 2024

was a CNN ever achieved here?
