
colgreen / sharpneat

SharpNEAT - Evolution of Neural Networks. A C# .NET Framework.

Home Page: https://sharpneat.sourceforge.io/

License: Other

C# 99.92% Batchfile 0.01% Smalltalk 0.07%
ai evolutionary-computation reinforcement-learning neural-network neat sharpneat evolution evolutionary-algorithm neuroevolution neural-networks

sharpneat's Introduction

SharpNEAT - Evolution of Neural Networks

NEAT is an evolutionary algorithm devised by Kenneth O. Stanley.

SharpNEAT is a complete implementation of NEAT written in C# and targeting .NET 8.

What is SharpNEAT?

SharpNEAT provides an implementation of an Evolutionary Algorithm (EA) with the specific goal of evolving a population of neural networks towards solving some problem task (known as the objective function).

The EA uses the evolutionary mechanisms of mutation, recombination, and selection to search for a neural network that 'solves' a given problem task, with each neural net assigned a fitness score that represents the quality of its solution.
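As a miniature illustration of that mutate/select loop (toy types and names here are illustrative, not SharpNEAT's API), evolving a single number towards a target value:

```csharp
using System;
using System.Linq;

// Minimal generational loop illustrating mutation + selection.
// All names are illustrative; this is not SharpNEAT's API.
static class TinyEa
{
    static readonly Random Rng = new Random(1);

    // Toy objective: maximise fitness = -|x - 3|, i.e. evolve x towards 3.
    static double Fitness(double x) => -Math.Abs(x - 3.0);

    public static double Evolve(int popSize, int generations)
    {
        // Random initial population.
        double[] pop = Enumerable.Range(0, popSize)
                                 .Select(_ => Rng.NextDouble() * 10.0)
                                 .ToArray();
        for (int g = 0; g < generations; g++)
        {
            // Selection: keep the fitter half as parents (implicit elitism).
            double[] parents = pop.OrderByDescending(Fitness)
                                  .Take(popSize / 2).ToArray();

            // Reproduction: each child is a mutated copy of a parent.
            pop = parents.Concat(
                      parents.Select(p => p + (Rng.NextDouble() - 0.5) * 0.5))
                  .ToArray();
        }
        return pop.Max(Fitness);
    }
}
```

A real NEAT population also mutates network structure, not just a scalar parameter, but the generational shape of the search is the same.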

Some example problem tasks:

  • How to control the limbs of a simple biped or quadruped to make it walk.
  • How to control a rocket to maintain vertical flight.
  • Finding a network that implements some desired digital logic, such as a multiplexer.

A notable point is that NEAT and SharpNEAT search both neural network structure (the set of network nodes and how they are connected) and connection weights. This is distinct from algorithms such as backpropagation that attempt to find good connection weights for a given structure.

SharpNEAT is a framework, or 'kit of parts', that facilitates research into evolutionary computation and specifically evolution of neural networks. The framework provides a number of example problem tasks that demonstrate how it can be used to produce a complete working EA.

This project aims to be modular, e.g. an alternative genetic coding or entirely new evolutionary algorithm could be used alongside the other parts/classes provided by SharpNEAT. The provision for such modular experimentation was a major design goal of SharpNEAT, and is facilitated by abstractions made in SharpNEAT's architecture around key concepts such as 'genome' (genetic representation / encoding) and 'evolutionary algorithm' (mutations, recombination, selection strategy, etc.).

Motivation for the development of SharpNEAT derives from a broader interest in biological evolution, and curiosity around the limits of neuro-evolution, in terms of the types of problems, and the level of problem complexity, for which it can produce satisfactory solutions.


sharpneat's People

Contributors

colgreen, cyberboss, monkeywithacupcake, naigonakoii, pushad, radsimu, ziachap


sharpneat's Issues

NuGet package of SharpNeatLib

Thanks for this great project!

It would be nice to be able to include SharpNeatLib in projects using NuGet. Currently, I must either include the source code or the library in my repository for projects that use SharpNeat.

I would appreciate the library being packaged separately because it can then be used in .NET Standard 2.0 projects (thanks to your recent updates to RedZen and this project).

Cannot load a saved CPPN network.

NeatGenomeXmlIO cannot read/create instances of CppnNeatGenome, therefore saved CPPN networks cannot be loaded - i.e. the wrong type will be loaded and may therefore fail silently.

This is on the dev branch.

k-means speciation is not handling empty clusters/species.

The standard k-means algorithm may result in empty species, e.g. consider this example in 1D space:

Initial clusters

      {2, 3, 3, 3}
      {3, 7, 7, 7}
      {7, 8, 8, 8}.

For k=3, the first update will empty the middle cluster.

Once the cluster is empty it has no centroid, and therefore will not gain any genomes in later iterations, i.e. it will remain empty at completion of the k-means iterations.

This is currently somewhat/partly handled by NeatEvolutionAlgorithm.IntegrateOffspringIntoSpecies(), which 're-speciates' the full population if one or more species are empty, but that re-speciation uses k-means clustering, so it can still produce empty species.

Other code may then assume that all species obtain at least one genome, and thus other problems may result from this.

Also see https://stackoverflow.com/questions/11075272/k-means-empty-cluster
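One common repair, sketched below on plain 1D points rather than SharpNEAT genome types, is to re-seed each empty cluster with the point that is currently furthest from its assigned centroid:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// 1D k-means assignment step with an empty-cluster repair pass.
// Sketch only; SharpNEAT speciates on a genome distance metric, but the
// repair idea is the same: re-seed an empty cluster with the point that
// is furthest from its current centroid.
static class KMeansRepair
{
    public static List<double>[] Assign(double[] points, double[] centroids)
    {
        var clusters = centroids.Select(_ => new List<double>()).ToArray();
        foreach (double p in points)
        {
            int nearest = Enumerable.Range(0, centroids.Length)
                .OrderBy(i => Math.Abs(p - centroids[i])).First();
            clusters[nearest].Add(p);
        }

        // Repair: for each empty cluster, steal the point that is
        // currently furthest from its own cluster's centroid.
        for (int c = 0; c < clusters.Length; c++)
        {
            if (clusters[c].Count != 0) continue;
            var (srcIdx, point) = Enumerable.Range(0, clusters.Length)
                .Where(i => clusters[i].Count > 1)
                .SelectMany(i => clusters[i].Select(p => (i, p)))
                .OrderByDescending(t => Math.Abs(t.p - centroids[t.i]))
                .First();
            clusters[srcIdx].Remove(point);
            clusters[c].Add(point);
        }
        return clusters;
    }
}
```

Applied to the 1D example above, the middle cluster empties on the first assignment pass, and the repair pass then moves one outlying point back into it, so every species retains at least one genome.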

Revise use of Math.FusedMultiplyAdd()

Previously I arrived at the conclusion that this method was slow because it did not get substituted with an fma CPU instruction by the JITter. This appears to be wrong (or maybe things have improved since I checked, or perhaps my old Intel Core i7 6700T didn't have the required CPU instruction(s)).

Anyway, this definitely does result in an FMA-like CPU instruction when the CPU supports it. E.g. see:

dotnet/runtime#34450

sharplab

 public static double Foo(double a, double b, double c)
 {
     return Math.FusedMultiplyAdd(a, b, c);   
 }

C.Foo(Double, Double, Double)
    L0000: vzeroupper
    L0003: vfmadd213sd xmm0, xmm1, xmm2
    L0008: ret

FeatureRequest: Allow min fitness so that zero is the perfect answer

I'm trying to evolve something and my fitness function is based on penalty points, such that a fitness of zero would be no penalty, i.e. a perfect score. However, it looks like the newest implementation still only supports fitness maximisation. There should be a property that can be passed into the NEAT class that makes the fitness comparison a less-than rather than a greater-than. It's a minor feature, but it is very annoying for me to try to translate around, as it is really tough to know the actual maximum possible penalty in my case.
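As a workaround under a maximising-only API, a penalty score can be mapped onto a fitness score with any strictly decreasing function; notably, this needs no knowledge of the maximum possible penalty. A minimal sketch (names are illustrative):

```csharp
using System;

// Workaround for minimisation objectives: map a penalty score (lower is
// better, zero is perfect) onto a fitness score (higher is better), so an
// unmodified maximising EA can be used. No known maximum penalty needed.
static class PenaltyFitness
{
    // Strictly decreasing in penalty; fitness 1.0 is reached only at
    // zero penalty, so "perfect" remains detectable.
    public static double ToFitness(double penalty) => 1.0 / (1.0 + penalty);
}
```

Ranking order is preserved (lower penalty always means higher fitness), which is all that selection needs.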

Store Extra Data in Genome

Hi Colin,
I'd like to run some convolution filters on my input before I call Brain.Activate().
What would be the best way to extend the genome to store (and possibly cross over) an array of doubles?
(The doubles would store values for the convolutions.)
Thanks!!!

Make use of ArrayPool<T> to reduce memory allocs for neural net arrays.

When we decode a genome to a neural net we tend to allocate a few arrays. It would be good to try and use ArrayPool (probably, actually MemoryPool.Shared ?) where possible to minimise allocation and GC activity.

This work can be guided by the memory and GC tracking info from the performance profiler - we can use the efficacy sampler project to generate the workload for this.

Using arrays from a pool also means we avoid the memory clearing/zeroing overhead that occurs when getting a new array from the heap.
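For illustration, a minimal sketch of the rent/return pattern with ArrayPool<double>.Shared; the summing function here is just a stand-in workload, not SharpNEAT code:

```csharp
using System;
using System.Buffers;

// Sketch of renting a working array from the shared ArrayPool instead of
// allocating a fresh one per genome decode. Note: the pool may hand back
// an array LONGER than requested, and its contents are not cleared.
static class PooledActivation
{
    public static double Sum(double[] inputs)
    {
        double[] buf = ArrayPool<double>.Shared.Rent(inputs.Length);
        try
        {
            Array.Copy(inputs, buf, inputs.Length);
            double total = 0.0;
            for (int i = 0; i < inputs.Length; i++) // NB: not buf.Length
                total += buf[i];
            return total;
        }
        finally
        {
            // Return without clearing; pass clearArray: true if stale
            // values could cause bugs or the data is sensitive.
            ArrayPool<double>.Shared.Return(buf);
        }
    }
}
```

The two caveats in the comments (over-sized buffers, uncleared contents) are the main correctness hazards when migrating code from plain `new double[n]` allocations to the pool.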

Update/reboot Holger Ferstl's 'Genetic Art with SharpNEAT' application

Holger Ferstl created this app back in 2006, which would have been targeting a 1.x era version of SharpNEAT.

The original blog post link (currently offline):

http://oldblog.holgerferstl.de/2006/02/08/GeneticArt.aspx

Most recent internet archive snapshot:

https://web.archive.org/web/20130514084644/http://oldblog.holgerferstl.de/

There are copies of the source code and compiled binaries, here:

https://www.cs.ucf.edu/~kstanley/GenArtSolution.zip
https://www.cs.ucf.edu/~kstanley/GenArt.zip

(I also have local copies at D:\home\archive)

It would be good to update/refresh/reboot this app, once the SharpNEAT v4 refactor/reboot is released.

Why is CreateGenomeDecoder method private

When you use this framework to train an algorithm and want to use it later, you need some way to serialize IBlackBox, or get it from serialized data.
In previous versions the way to do this was to serialize the NeatGenome structure and decode it later, but to do that you need an instance of IGenomeDecoder. There is no CreateGenomeDecoder on IExperiment, but there is one in NeatUtils; for some reason it's private.

Is there some other way to use "train->store->reuse trained algorithm" pipeline?

Your commit of connectionWeight seems to be wrong (this commit won't affect anything)

Hey,
I've seen that you added this connectionWeight calculation in 0645b9a:

double connectionWeight = _genomeFactory.GenerateRandomConnectionWeight();
if(_genomeFactory.Rng.NextBool()) {
    connectionWeight *= 0.01;
}

But you don't use it; you just generate a new one afterwards:

newConnectionGene = new ConnectionGene(existingConnectionId.Value, sourceId, targetId, _genomeFactory.GenerateRandomConnectionWeight());

instead of:

newConnectionGene = new ConnectionGene(existingConnectionId.Value, sourceId, targetId, connectionWeight);

Also, I think this behaviour should be in the GenerateRandomConnectionWeight function itself.

Getting blackbox for evaluation from persisted genome

Hi,
I was able to create a custom experiment, train the network and persist the best genome. Now I want to put the trained genome to real world use, but I fail to see an easy way to do it. I had to change the visibility of NeatUtils.CreateGenomeDecoder to public and then use this code to get the black box:

using CustomTasks.MarketPredictor;
using SharpNeat.IO;
using SharpNeat.Neat;
using SharpNeat.Neat.Genome.IO;

var factory = new MarketPredictorFactory();
var config = JsonUtils.LoadUtf8("......\\src\\SharpNeat.Windows.App\\config\\experiments-config\\market-predictor.config.json");
var experiment = factory.CreateExperiment(config.RootElement);

var metaNeatGenome = NeatUtils.CreateMetaNeatGenome(experiment);

var loader = NeatGenomeLoaderFactory.CreateLoaderDouble(metaNeatGenome);
var genome = loader.Load("my.genome");

var decoder = NeatUtils.CreateGenomeDecoder(experiment);
var box = decoder.Decode(genome);

Is there an easier or otherwise recommended way to do this?
Thank you in advance.

Inconsistent License Information

The current license on the SharpNEAT project is indeterminable from the current repository. In the root folder of the directory, the LICENSE file states that the project is under the MIT license. In the folder containing the SLN, the COPYING file states that the project is under the GNU license.

Since these licenses are conflicting, the project needs to specify which license is actually governing the project.

Multi-Objective Support for fitness metrics?

Hello!

I was wondering if you:

a) had any plans to include MOEA support as a replacement for the speciation/fitness metric built into the original NEAT algorithm? I've had very good results in an implementation I wrote in ANJI, but I am looking to get out of that horrible code base. Yours is very cleanly written and I much prefer to work in it.
b) if not, would you accept a pull request that adds the functionality in? I think that it would merely require an alternative to NeatEvolutionAlgorithm.cs that has a _genomerListMOEAEvaluator, which would evaluate on a list of IPhenomeEvaluators and then treat the results in an NSGA (or other MOEA) fashion. Because your code base is so much nicer, I don't see it being a major difficulty to add in...

Thoughts?

If you're interested in seeing some of the results I had with this, read http://ieeexplore.ieee.org/document/7424447/. It is a paper mostly focused on the benefits of MOEAs when working with Interactive Evolutionary Computing, but I did run an automated MOEA solution that evolved NEAT controllers for a robot in a maze domain, and it did very well compared to standard fitness and to novelty search.

State of the project

Hello :)
I am planning to use sharpneat in a project I am working on for my bachelor thesis, though I am unsure whether the refactored repo is already production ready. I wasn't able to find any information regarding the state of this repository, only the pointer towards it from the original sharpneat repo.

Do you recommend using it as it is?

Awesome project, keep it up!

README.md improvements

Chat GPT has the following suggestions for improving the existing README.md:

  1. Provide a brief introduction to NEAT and explain how it differs from other evolutionary algorithms, such as genetic algorithms.
  2. Include more specific information about the different problem tasks that SharpNEAT can be used to solve.
  3. Provide a more detailed explanation of how SharpNEAT uses mutation, recombination, and selection to search for a neural network that solves a given problem task.
  4. Highlight how SharpNEAT's modular design allows for experimentation with different genetic coding and evolutionary algorithms.
  5. Provide some information on the limitations of SharpNEAT, as well as its potential applications in the field of neuro-evolution.
  6. Provide some examples of the kind of problems that SharpNEAT can be used to solve and the results it has achieved in the past.
  7. Provide some information about the current status of the project, such as whether it's actively being developed and maintained, any recent updates or new features, and how the community can contribute.

Can I kill the 10% of bottom performing genomes?

Hi Colin,
is there a way to kill X% of the bottom performing genomes and replace them with totally random genomes each generation?

For example, I'd like to kill 10% of the worst performers and replace them with new random ones (not offspring).

I feel like this would prevent stagnation in my case. Is this functionality already there?
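The issue itself suggests SharpNEAT has no built-in hook for this; a generic sketch of the idea, using a plain double as a stand-in genome type (all names illustrative):

```csharp
using System;
using System.Linq;

// Sketch: replace the bottom fraction of a population with freshly
// random individuals each generation. The genome type here is just a
// double for illustration; SharpNEAT does not expose this exact hook.
static class CullAndReseed
{
    public static double[] Apply(
        double[] population,
        Func<double, double> fitness,
        Func<double> createRandomGenome,
        double cullFraction)
    {
        int cullCount = (int)(population.Length * cullFraction);

        // Keep the fitter (length - cullCount) genomes...
        var survivors = population.OrderByDescending(fitness)
                                  .Take(population.Length - cullCount);

        // ...and top the population back up with random genomes.
        var fresh = Enumerable.Range(0, cullCount)
                              .Select(_ => createRandomGenome());
        return survivors.Concat(fresh).ToArray();
    }
}
```

This is essentially random-restart pressure against stagnation; note that freshly random genomes will usually score poorly and may be culled again immediately unless speciation or a grace period protects them.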

ZigguratGaussianSampler [No overload for method 'NextDouble' takes 2 arguments]

It seems that there is some issue with Redzen 3.0.1 (updated via Nuget)

I got

double ZigguratGaussianSampler.NextDouble()
No overload for method 'NextDouble' takes 2 arguments

@sharpneat\src\SharpNeatLib\Network\ActivationFunctions\RadialBasis\RbfGaussian.cs
line 123, line 136
@sharpneat\src\SharpNeatLib\Genomes\Neat\NeatGenomeFactory.cs
line 558

How can I set up genome vs genome competitive evaluation/evolutionary algorithm?

I am new to this repo, so please forgive my lack of understanding.
Looking through the samples, they all appear to be experiments whereby a genome can be evaluated on its own.
But what if the genome is designed to play some 2 (or n) player competitive game? In this case, each evaluation would involve 2 (or n) genomes.
Is this repo suitable for setting up such experiments? Any advice would be appreciated.

Where does the initial genome come from?

All of the examples I've seen read a genome from an XML file.... How did that genome ever get created? If I have a new experiment I want to run, how do I just generate a default NeatGenome to start it off? Or can I just create one with an empty network and SharpNeat will do the rest over the generations?

Insufficient NetworkXmlIO GetActivationFunction

Hi there! Thank you for that amazing project.
Looks like NetworkXmlIO.cs#L534 GetActivationFunction(string name) needs more registered functions, such as TanH, LeakyReLU, etc. Maybe it would be suitable to create a single place to register all supported IActivationFunctions, with their aliases, as some type of map?
Thank you!
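A minimal sketch of the suggested single registration map, from function name to implementation (the names, aliases, and `Func<double,double>` shape here are illustrative, not SharpNEAT's actual IActivationFunction registry):

```csharp
using System;
using System.Collections.Generic;

// Sketch of a single name -> function map, so XML loading can resolve
// any registered activation function rather than a hard-coded list.
// Names/aliases are illustrative, not SharpNEAT's actual registry.
static class ActivationFnRegistry
{
    static readonly Dictionary<string, Func<double, double>> Map =
        new(StringComparer.OrdinalIgnoreCase)
        {
            ["TanH"] = Math.Tanh,
            ["ReLU"] = x => Math.Max(0.0, x),
            ["LeakyReLU"] = x => x >= 0.0 ? x : 0.01 * x,
        };

    // Custom functions can be registered before loading a saved network.
    public static void Register(string name, Func<double, double> fn) => Map[name] = fn;

    public static Func<double, double> Get(string name) =>
        Map.TryGetValue(name, out var fn)
            ? fn
            : throw new ArgumentException($"Unknown activation function '{name}'.");
}
```

A `Register` entry point would also address the later issue in this list where a custom activation function saved to XML cannot be loaded back, because the loader only knows the hard-coded names.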

Binary N-multiplexer task revamp

Consider replacing the exhaustive evaluation of all input combinations (e.g. 2^11 = 2048 evals for the binary-11 multiplexer) with a scheme such as the following:

E.g. for binary-11, with 3 address inputs and 8 data inputs:

  • Cycle through all combinations of the 3 address inputs; 2^3=8 combinations.

  • For each address combination:

    • Set the selected data input to zero:
      • Set all other data inputs to zero.
      • Set all other data inputs to one.
      • Set the other data inputs alternately on/off.
    • Set the selected data input to one:
      • Repeat the above scheme.

This gives a total of 8*6 = 48 evals of the neural net per overall fitness evaluation.

Hence, overall this should be approx 42.7 times faster than the exhaustive eval scheme (2048 / 48).

This introduces the risk of missing failure cases for some input combinations, so this should be carefully considered. However, the above scheme is a high bar, so it is still a good test for efficacy sampling even if there are failures.

This scheme doesn't really offer a meaningful benefit for the binary-6 and binary-3 multiplexer tasks (the latter being a toy task, similar to logical XOR), so those can continue to use exhaustive evaluation.

If the above scheme works, then perhaps we can apply it to a binary-20 multiplexer task:
* 4 address inputs, 16 data inputs
* 16 address combos * 6 evals per combo = 96 evaluations per fitness eval

This might be a good candidate for a next generation standard efficacy sampling task.
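The reduced scheme above can be expressed as a small test-case generator; the following is an illustrative sketch for the binary-11 case, not the actual task code:

```csharp
using System;
using System.Collections.Generic;

// Sketch of the reduced test-case generator described above, for the
// binary-11 multiplexer (3 address inputs, 8 data inputs):
// 8 address combinations x 6 data patterns = 48 cases, vs 2^11 = 2048.
static class Mux11Cases
{
    // Each case: (address value, data input values, expected output).
    public static IEnumerable<(int addr, bool[] data, bool expected)> Generate()
    {
        for (int addr = 0; addr < 8; addr++)
        {
            foreach (bool selected in new[] { false, true })
            {
                // Three background patterns for the non-selected inputs:
                // all-zero, all-one, alternating.
                foreach (var background in new Func<int, bool>[]
                         { _ => false, _ => true, i => (i & 1) == 0 })
                {
                    var data = new bool[8];
                    for (int i = 0; i < 8; i++)
                        data[i] = (i == addr) ? selected : background(i);

                    // A correct multiplexer must output the selected bit.
                    yield return (addr, data, selected);
                }
            }
        }
    }
}
```

A fitness evaluator would then iterate these 48 cases, feed the address bits and data bits into the network inputs, and score the output against `expected`.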


Task naming convention:

  • binary N-multiplexer exhaustive
  • binary N-multiplexer non-exhaustive

Further thoughts on this task:

Consider using two binary outputs: one representing a signal when high, the other no signal when high, rather than a single output with a threshold activation level. Which of these two schemes is easier to solve? Should we choose the easier one as the efficacy sampling task? There is probably good motivation for selecting binary outputs instead of using a threshold on a single continuous output, i.e. from biological neural nets.

Memory leak

Hello.

First of all, congratulations on the great library you're making. I am using this library to find a network that gives me the best regression on a series of training data, and then validates the network settings on other data. The problem is that I need to create many new NEAT networks, and as I create them the memory usage keeps going up and is not released. Visual Studio tells me the culprit is the "KeyedCircularBuffer" structure, but I do not see that there is a "Dispose". Can you help me? Thank you.

SUPG for Quadruped Locomotion - 4-legged Walker

Last year, I was learning about ANN and EA by playing with existing projects, and I particularly liked this 3D 4-legged walker based on a research paper by Gregory Morse. You can watch videos of the walker on his website.

So, I updated the source code a bit, and uploaded it on bitbucket (because of my other interest in physics simulation).

I am mentioning it now after reading the issue #16 by @asimaranov and @drallensmith.
Sadly, right now, it is using a very old version of SharpNeatLib and I don't have enough mastery to update it to the current version of sharpneat.

Would you be interested in it?
I would help as much as I can.

Performance tune activation functions

See the recent performance tuning tweaks made to:

SharpNeat.NeuralNets.Double.ActivationFunctions.Vectorized.LeakyReLU

for hints.

Consider whether there are any performance tuning opportunities for the scalar implementations - these are the ones that are currently used, as they actually run faster for the neural net topologies that tend to get evolved, i.e. highly irregular connectivity.

GetActivationFunction(string) in NetworkXmlIO.cs unable to load existing run with a custom activation function

I have the following code in my main function:

IActivationFunction activation = new SigmoidActivation();
NeatGenomeParameters genomeParams = new NeatGenomeParameters
{
    ActivationFn = activation,
};
NeatGenomeFactory genomeFactory = new NeatGenomeFactory(8, 1, genomeParams);

Which works fine for starting a new run. If I then save the run xml:

var doc = NeatGenomeXmlIO.SaveComplete(neat.GenomeList, nodeFnIds: true);
doc.Save("d:\\neat\\generation.xml");

The xml is saved as expected. However, loading the xml crashes because GetActivationFunction(string) throws since it only maintains a hard-coded list.

Any tutorial?

Is there any tutorial or documentation I can consult?

Out of bounds SignalArray access

I'm creating my own CreateGenomeDecoder and getting Out of bounds SignalArray access in CreateNetworkDefinition during
// Read bias connection weight from output 1.
double weight = outputSignalArr[1];

Is that a bug, or am I doing something wrong?

I've changed the index from 1 to 0 and it seems to be working. Is this a bug?
double weight = outputSignalArr[0];

Here's my code

    public override IGenomeDecoder<NeatGenome, IBlackBox> CreateGenomeDecoder()
    {
        uint neuronID = 1;

        SubstrateNodeSet inputLayer = new SubstrateNodeSet(1 + 3 + CHART_HEIGHT * CHART_WIDTH);

        AddNode(inputLayer, neuronID++, -1, 0, -1.0); //
        AddNode(inputLayer, neuronID++, 0, 0, -1.0);
        AddNode(inputLayer, neuronID++, +1, 0, -1.0);

        AddNeuralGrid(inputLayer, CHART_WIDTH, CHART_HEIGHT, ref neuronID, -1);

        //-- Output layer node.
        SubstrateNodeSet outputLayer = new SubstrateNodeSet(1);
        AddNode(outputLayer, neuronID++, 0, 0, +1.0);
        AddNode(outputLayer, neuronID++, 0, 0, +1.0);


        SubstrateNodeSet HiddenInputLayer = new SubstrateNodeSet(1 + 3 + CHART_HEIGHT * CHART_WIDTH);
        AddNode(HiddenInputLayer, neuronID++, 0, 0, -1.0); // feedback from output here


        //-- Hidden layer nodes.
        SubstrateNodeSet h1Layer = new SubstrateNodeSet(4);
        AddNeuralGrid(h1Layer, 5, 5, ref neuronID, 0);



        List<SubstrateNodeSet> nodeSetList = new List<SubstrateNodeSet>(4);
        nodeSetList.Add(inputLayer); //0
        nodeSetList.Add(outputLayer); //1
        //nodeSetList.Add(h1Layer);  //2
        //nodeSetList.Add(HiddenInputLayer); //3

        // Define a connection mapping from the input layer to the output layer. Using INDEXES
        List<NodeSetMapping> nodeSetMappingList = new List<NodeSetMapping>(4);
        nodeSetMappingList.Add(NodeSetMapping.Create(0, 1/*2*/, (double?)null));
        //nodeSetMappingList.Add(NodeSetMapping.Create(2, 1, (double?)null));
        //nodeSetMappingList.Add(NodeSetMapping.Create(3, 2, (double?)null));
        //nodeSetMappingList.Add(NodeSetMapping.Create(1, 3, (double?)null));

        // Construct the substrate using a steepened sigmoid as the phenome's
        // activation function. All weights under 0.2 will not generate 
        // connections in the final phenome.
        Substrate substrate = new Substrate(nodeSetList,
            CreateLibraryCppn(),
            0, 0.2, 5, nodeSetMappingList);

        // Create genome decoder. Decodes to a neural network packaged with
        // an activation scheme that defines a fixed number of activations per evaluation.
        IGenomeDecoder<NeatGenome, IBlackBox> genomeDecoder =
            new HyperNeatDecoder(substrate, _activationSchemeCppn, _activationScheme, false);

        return genomeDecoder;
    }

New Mutation Type

Hi Colin,
I wrote a new mutation method that disconnects one input and reconnects it to another input.
Works ok, but I'm getting "CorrelationResults failed integrity check" assertion error.
What do I need to do?

    private void Mutate_ReconnectInputs()
    {

        NeuronGene new_source = _neuronGeneList[_genomeFactory.Rng.Next(_genomeFactory.InputNeuronCount)];

        //try to find source and target suitable for reconnecting 10 times max
        for (int i = 0; i < 10; i++) 
        {
            ConnectionGene con = _connectionGeneList[_genomeFactory.Rng.Next(_connectionGeneList.Count)];

            if (con.SourceNodeId < _genomeFactory.InputNeuronCount && con.SourceNodeId != new_source.Id)
            {
                NeuronGene old_source = _neuronGeneList.GetNeuronById(con.SourceNodeId);
                NeuronGene old_target = _neuronGeneList.GetNeuronById(con.TargetNodeId);

                if (old_target.SourceNeurons.Contains(new_source.Id)) continue;
                if (new_source.TargetNeurons.Contains(old_target.Id)) continue;


                old_source.TargetNeurons.Remove(old_target.Id);
                old_target.SourceNeurons.Remove(old_source.Id);

                new_source.TargetNeurons.Add(old_target.Id);
                old_target.SourceNeurons.Add(new_source.Id);

                con.IsMutated = true;
                con.SourceNodeId = new_source.Id;
                con.TargetNodeId = old_target.Id;

                break;
            }
        }
        
    }

Add Task visualizations

  • Prey capture
  • Function regression
  • Any others that are relatively easy to do and can be included in the 4.0 release as example task visualisations.

Anything Box2d related will probably have to wait - but maybe we can just lift the old code for now.

Parsing default values fails in cultures that are not en-us...

If the GUI program is run as-is on a pc with different culture it may throw an unhandled exception with an index out of range - this is caused by selecting more genomes than are available, but the root cause is a parsing error.

From SharpNeatGUI.MainForm (and MainForm.Designer):

InitializeComponent()

this.txtParamSelectionProportion.Text = "0.2"; //either en-us or generic


ReadAndUpdateExperimentParams()

eaParams.SelectionProportion = ParseDouble(txtParamSelectionProportion, eaParams.SelectionProportion);


ParseDouble()

if(double.TryParse(txtBox.Text, out val)) // No culture specified... sets SelectionProportion to 2 in countries using 0,2 instead of 0.2
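A sketch of the fix, assuming the helper keeps its fallback-on-failure behaviour: parse with an explicit invariant culture so "0.2" yields 0.2 under every OS locale (under e.g. a German culture, culture-sensitive parsing of "0.2" treats '.' as a group separator, which produced the SelectionProportion of 2 in the reported bug):

```csharp
using System.Globalization;

// Fix sketch for the culture-sensitive parsing bug: the GUI's default
// parameter strings are written in invariant format ("0.2"), so they
// must be parsed with the invariant culture, not the OS culture.
static class SafeParse
{
    public static double ParseDouble(string text, double fallback) =>
        double.TryParse(text, NumberStyles.Float, CultureInfo.InvariantCulture, out double val)
            ? val
            : fallback;
}
```

An alternative design is to both format and parse with `CultureInfo.CurrentCulture`, so the text boxes display locale-appropriate separators; the bug arises only because the two sides currently disagree.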

Project status and code stability.

I hope my broad question is ok given this specialized project.

Looking at the GitHub project pulse for this repository and the original colgreen/sharpneat, I get the impression both projects have been in concurrent active development for a few years by the founding main contributor. Periodic formal release notes indicate that colgreen/sharpneat has reached a go-live type status whereas sharpneat-refactor has an experimental feel.

After studying both codebases I would prefer to implement my new (and first) NEAT application on this sharpneat-refactor implementation. Would that be wise?

f.y.i. I am more of a software engineer than a data scientist. My plan is to apply the NEAT algorithm to a new category of problem, and I intend to make substantial source code revisions to either sharpneat or sharpneat-refactor to achieve this objective.

SharpNeatLib target .NET framework

Did the recent commit that upgraded the .NET framework to 4.7.1 break the solution? Can I get your advice please. Thank you very much.

I am getting this error framework reference warning when loading .sln file in Visual Studio 2017

Severity Code Description Project File Line Suppression State
Warning Project 'D:\SVN\github\sharpneat\src\SharpNeatLib\SharpNeatLib.csproj' targets '.NETStandard,Version=v2.0'. It cannot be referenced by a project that targets '.NETFramework,Version=v4.7.1'. SharpNeatDomains

And I have a bunch of compile errors in SharpNeatLib:

Severity Code Description Project File Line Suppression State
Error CS0246 The type or namespace name 'ParallelOptions' could not be found (are you missing a using directive or an assembly reference?) SharpNeatLib D:\SVN\github\sharpneat\src\SharpNeatLib\SpeciationStrategies\ParallelKMeansClusteringStrategy.cs 38 Active

Any tutorial?

Hi,
sorry for opening an issue for this, but I don't know how else to contact you.
Is there any documentation on how to actually implement SharpNeat? There is a collection of examples somewhere on the internet using SharpNeat, which I cannot find again, and there is that 8-year-old tutorial from someone using it for TicTacToe, which is incompatible with the current NuGet release. Outside of that, how do I use the SharpNeat classes to give it input and get outputs?

Best regards

RbfFunctionRegressionExperiment

Hi,

SharpNeat.Domains.FunctionRegression.RbfFunctionRegressionExperiment
does not load from the GUI.
I did not manage to find a matching class.
Was it renamed or removed?

BackgroundThreadMethod() failed with exception

Hello, @colgreen! Thank you for that second sharpneat generation.
Architecture, APIs, and ease of use are greatly enhanced! 🥇
I'm getting the following frequent exceptions:

BackgroundThreadMethod() failed with exception [Non-negative number required. (Parameter 'index')]
System.ArgumentOutOfRangeException: Non-negative number required. (Parameter 'index')
   at System.Collections.Generic.List`1.RemoveRange(Int32 index, Int32 count)
   at SharpNeat.Neat.EvolutionAlgorithm.NeatEvolutionAlgorithm`1.TrimSpeciesBackToElite(Boolean& emptySpeciesFlag) in D:\sharpneat-refactor\src\SharpNeat\Neat\EvolutionAlgorithm\NeatEvolutionAlgorithm.cs:line 268
   at SharpNeat.Neat.EvolutionAlgorithm.NeatEvolutionAlgorithm`1.PerformOneGeneration() in D:\sharpneat-refactor\src\SharpNeat\Neat\EvolutionAlgorithm\NeatEvolutionAlgorithm.cs:line 226
   at SharpNeat.EvolutionAlgorithm.Runner.EvolutionAlgorithmRunner.BackgroundThreadMethodInner() in D:\sharpneat-refactor\src\SharpNeat\EvolutionAlgorithm\Runner\EvolutionAlgorithmRunner.cs:line 263
   at SharpNeat.EvolutionAlgorithm.Runner.EvolutionAlgorithmRunner.BackgroundThreadMethod() in D:\sharpneat-refactor\src\SharpNeat\EvolutionAlgorithm\Runner\EvolutionAlgorithmRunner.cs:line 242

Pointing to this NeatEvolutionAlgorithm TrimSpeciesBackToElite loop:

for(int i=0; i < speciesCount; i++)
{
    Species<T> species = _pop.SpeciesArray[i];
    int eliteSizeInt = species.Stats.EliteSizeInt;
    int removeCount = species.GenomeList.Count - eliteSizeInt;
    species.GenomeList.RemoveRange(eliteSizeInt, removeCount);

    if(eliteSizeInt == 0) {
        emptySpeciesFlag = true;
    }
}

Does it seem like species.Stats.EliteSizeInt may somehow become negative?

Can mutation be safely ignored after sexual crossover?

With reference to method public NeatGenome CreateOffspring(NeatGenome parent, uint birthGeneration) in NeatGenome.cs @here I observed that there is no call to offspring.Mutate(); post sexual crossover!

Is it safe to completely ignore mutation after sexual crossover?

Nuget version is older than the current latest stable version

Hi,

I don't know how else to contact you, so first, thanks for this amazing library. I wanted to use it in a game that I am building with Godot, but when I checked, the latest version on NuGet was older than the current stable version. It would be helpful if you could update it.

Thanks.

Review genome and network IO situation.

E.g.

  • Can we init a population from a seed genome?
  • Do we have/want a generic network IO format?
  • Do we have a way of transforming between the old (XML) and new (text/tsv) formats? Perhaps with a separate command line tool?
