🧬 NEAT Neural Network for DenoJS
This project is a practical implementation of a neural network based on the NEAT (NeuroEvolution of Augmenting Topologies) algorithm, written in DenoJS using TypeScript, with additional features such as error-guided discovery, memetic evolution, and distributed workflows.
For project terminology, coding conventions, and development guidelines, see AGENTS.md.
✨ Feature Highlights
Extendable Observations: Input and output features are identified by stable UUIDs in the exported representation, rather than only by positional indices. This avoids having to restart the evolution process as new observations are added, and makes it practical to evolve creatures on multiple machines and then recombine them, much like NEAT's historical markings for genes (Stanley & Miikkulainen, 2002).
Distributed Training: Training and evolution can be run on multiple independent nodes. The best-of-breed creatures can later be combined on a centralised controller node, mirroring the island model used in evolutionary algorithms.
Lifelong Learning: Designed for continuous learning in changing environments. The same population can keep training and adapting as new data arrives over weeks or months, supporting continual learning while still relying on your training data to keep past knowledge represented.
Efficient Model Utilisation: Once trained, the current best model can be used efficiently by calling the `activate` function, which runs a single forward pass mapping inputs to outputs.
Note
Activation uses WASM (required). The library initialises the WASM backend automatically; callers do not need to call any init function or set environment variables.
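The forward pass behind an activation can be illustrated conceptually. This is a simplified sketch only, assuming a toy layered network; the library itself runs its forward pass in WASM and its internal neuron representation is not shown here:

```typescript
// Conceptual sketch of a single forward pass: each neuron computes a
// weighted sum of its inputs plus a bias, then applies a squash function.
// The `Neuron` shape below is illustrative, not the library's data model.

type Neuron = {
  bias: number;
  weights: number[]; // one weight per incoming value
  squash: (x: number) => number;
};

function forwardPass(inputs: number[], layers: Neuron[][]): number[] {
  let values = inputs;
  for (const layer of layers) {
    values = layer.map((n) =>
      n.squash(n.weights.reduce((sum, w, i) => sum + w * values[i], n.bias))
    );
  }
  return values;
}

// Tiny network: one hidden neuron (tanh), one identity output neuron.
const hidden: Neuron[] = [{ bias: 0, weights: [1, 1], squash: Math.tanh }];
const output: Neuron[] = [{ bias: 0.5, weights: [2], squash: (x) => x }];
console.log(forwardPass([0.5, -0.5], [hidden, output])); // [0.5]
```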
Unique Squash Functions: Supports unique squash functions such as IF, MAX and MIN, offering a wider range of potential solutions. More about Activation Functions.
Neuron Pruning: Neurons whose activations don’t vary during training are removed, and the biases in associated neurons are adjusted. More about Pruning (Neural Networks).
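The bias adjustment can be sketched in isolation. This is a conceptual illustration, not the library's pruning code: if a hidden neuron's activation is effectively constant, its contribution `weight * constant` can be folded into each downstream neuron's bias before the neuron and its synapses are removed:

```typescript
// Conceptual neuron pruning: fold a constant neuron's contribution into
// downstream biases, then drop the neuron's synapses. The `Net` shape is
// illustrative only.

type Synapse = { from: number; to: number; weight: number };
type Net = { biases: number[]; synapses: Synapse[] };

function pruneConstantNeuron(net: Net, neuron: number, constant: number): Net {
  const biases = [...net.biases];
  for (const s of net.synapses) {
    if (s.from === neuron) biases[s.to] += s.weight * constant; // fold into bias
  }
  return {
    biases,
    synapses: net.synapses.filter((s) => s.from !== neuron && s.to !== neuron),
  };
}

// Neuron 1 always outputs 0.8; neuron 2 receives it with weight 0.5.
const net: Net = {
  biases: [0, 0, 0.1],
  synapses: [{ from: 1, to: 2, weight: 0.5 }],
};
const pruned = pruneConstantNeuron(net, 1, 0.8);
console.log(pruned.biases[2]); // 0.5
```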
CRISPR: Allows injection of genes into a population of creatures during evolution. More about CRISPR.
Grafting: If parents aren’t “genetically compatible”, the grafting algorithm enables cross-island interbreeding, preserving diversity in the same spirit as island-model evolution.
Memetic Evolution: Records and utilises the biases and weights of the fittest creatures to fine-tune future generations. Learn more about Memetic Algorithms.
Error-Guided Structural Evolution: Dynamically identifies and creates new synapses by analysing neuron activations and errors. A dedicated Rust module performs GPU-accelerated analysis and proposes structural candidates. Discovery runs typically find improvements of 0.5-3% per run that accumulate over many iterations.
Warning
Relies entirely on the NEAT-AI-Discovery Rust extension library. If the library is not available, the discovery phase is skipped; there is no TypeScript fallback.
Adaptive Mutation Rate: Automatically adjusts the mutation strategy based on creature size: for large creatures, mutation focuses on weight/bias modification rather than topology expansion.
Adaptive Mutation Rate Based on Fitness Progress: Mutation rate is automatically adjusted based on whether evolution is improving, stagnating, or stable, helping balance exploration and exploitation.
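A minimal sketch of the idea, assuming a simple multiplicative controller (the library's actual heuristics and thresholds are not shown): raise the mutation rate when fitness stagnates to encourage exploration, lower it when fitness improves to exploit the current direction:

```typescript
// Illustrative fitness-driven mutation-rate controller (not the library's
// internal logic): multiply the rate up on stagnation, down on improvement,
// clamped to a configured range.

function adaptMutationRate(
  rate: number,
  previousBest: number,
  currentBest: number,
  options = { min: 0.01, max: 0.5, step: 1.25 },
): number {
  const improving = currentBest > previousBest;
  const next = improving ? rate / options.step : rate * options.step;
  return Math.min(options.max, Math.max(options.min, next));
}

let rate = 0.1;
rate = adaptMutationRate(rate, 1.0, 1.0); // stagnating -> rises to 0.125
rate = adaptMutationRate(rate, 1.0, 1.2); // improving -> falls back to 0.1
```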
Continuous Incremental Discovery: For distributed, multi-machine discovery workflows that accumulate small improvements over time, see the Discovery Guide.
Training Data Fuzzing: Noise injection during training prevents creatures from memorising exact training examples. Gaussian or uniform perturbations are added to inputs (and optionally outputs for label smoothing) each iteration, encouraging robust generalisation.
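The Gaussian case can be sketched in plain TypeScript. This is a conceptual illustration of the noise injection, not the library's configuration surface:

```typescript
// Illustrative training-data fuzzing: add small Gaussian noise to each input
// so creatures cannot memorise exact training examples.

function gaussianNoise(stdDev: number): number {
  // Box-Muller transform: two uniform samples -> one standard normal sample.
  const u = Math.random() || Number.MIN_VALUE; // avoid log(0)
  const v = Math.random();
  return stdDev * Math.sqrt(-2 * Math.log(u)) * Math.cos(2 * Math.PI * v);
}

function fuzzInputs(inputs: number[], stdDev = 0.01): number[] {
  return inputs.map((x) => x + gaussianNoise(stdDev));
}

const example = [0.2, 0.7, -0.1];
console.log(fuzzInputs(example)); // each value perturbed slightly
```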
K-Fold Cross-Validation: Built-in k-fold cross-validation evaluates creatures on held-out data folds during evolution, reducing overfitting to a single train/test split.
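The underlying partitioning can be sketched as follows (an illustrative round-robin split, not the library's API):

```typescript
// Illustrative k-fold split: partition indices into k folds; each fold
// serves once as the held-out validation set.

function kFoldSplits(
  n: number,
  k: number,
): { train: number[]; validation: number[] }[] {
  const indices = Array.from({ length: n }, (_, i) => i);
  const folds: number[][] = Array.from({ length: k }, () => []);
  indices.forEach((i) => folds[i % k].push(i)); // round-robin assignment
  return folds.map((validation) => ({
    validation,
    train: indices.filter((i) => !validation.includes(i)),
  }));
}

const splits = kFoldSplits(10, 5);
console.log(splits[0].validation); // [0, 5]
console.log(splits[0].train.length); // 8
```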
Hyperparameter Self-Adaptation: Each creature carries its own learning rate, mutation rates, and regularisation strength. These evolve alongside topology and weights — creatures with better-suited hyperparameters achieve higher fitness and propagate their settings, inspired by self-adaptive evolution strategies.
Transfer Learning: Export trained creatures as checkpoints with metadata, import them into new tasks with UUID mapping for different input/output configurations, and seed populations with pre-trained creatures for transfer learning across related problems.
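The UUID-mapping idea can be sketched conceptually. The data shapes and names below are hypothetical, not the library's checkpoint format; the point is that stable UUIDs let a checkpoint trained on one input layout be remapped to another:

```typescript
// Illustrative UUID-based observation mapping for transfer learning.
// Because observations carry stable UUIDs rather than fixed positions,
// a checkpoint can be aligned with a new task's input ordering.

const checkpointInputs = [
  { uuid: "a1b2", name: "temperature" },
  { uuid: "c3d4", name: "humidity" },
];

// The new task presents the same observations in a different order,
// plus one the checkpoint has never seen ("e5f6").
const newTaskInputs = ["c3d4", "e5f6", "a1b2"];

// Map each checkpoint input to its position in the new task (-1 = unseen).
const mapping = checkpointInputs.map((input) => ({
  uuid: input.uuid,
  newIndex: newTaskInputs.indexOf(input.uuid),
}));
console.log(mapping); // temperature -> index 2, humidity -> index 0
```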
ONNX Export: Export trained creatures to the ONNX (Open Neural Network Exchange) format for deployment in standard ML inference pipelines, bridging the gap between neuroevolution and production deployment.
Synthetic Synapse Training: Temporarily densifies inter-layer connectivity during backpropagation by adding zero-weight synapses between adjacent topological layers. After training, near-zero synapses are pruned and only the useful connections are retained — addressing NEAT’s inherent weakness of sparse connectivity compared to conventional dense layers. Opt-in via
`syntheticSynapses: true` in the training configuration.
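The post-training cleanup step can be sketched conceptually (this is an illustration of the pruning idea, not the library's code):

```typescript
// Conceptual post-training step for synthetic synapses: connections whose
// weights stayed near zero during backpropagation are pruned, keeping only
// synapses that learned a meaningful weight.

type Synapse = { from: number; to: number; weight: number };

function pruneNearZeroSynapses(
  synapses: Synapse[],
  threshold = 1e-3,
): Synapse[] {
  return synapses.filter((s) => Math.abs(s.weight) >= threshold);
}

const afterTraining: Synapse[] = [
  { from: 0, to: 2, weight: 0.7 }, // learned a useful connection
  { from: 1, to: 2, weight: 0.0004 }, // synthetic synapse that stayed near 0
];
console.log(pruneNearZeroSynapses(afterTraining).length); // 1
```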
🚀 Quick Start
```ts
// Single discovery iteration
const result = await creature.discoveryDir(dataDir, {
  discoveryRecordTimeOutMinutes: 1,
  discoveryAnalysisTimeoutMinutes: 10,
});

if (result.improvement) {
  console.log(`Found ${result.improvement.changeType} improvement!`);
  // Use improved creature for next iteration
}
```

Tip
For distributed, multi-machine workflows that accumulate small improvements over time, see the Discovery Guide for a complete walkthrough.
💻 Usage
This project is designed to be used in a DenoJS environment. Please refer to the DenoJS documentation for setup and usage instructions.
📚 Documentation
For detailed documentation, see the docs/ directory:
🚀 Getting Started
- CONTRIBUTING.md: First-time contributor guide with development setup and workflow
- Configuration Guide: Complete reference of all configuration options and presets
🧠 Core Concepts
- COMPARISON.md: How NEAT compares to traditional neural networks, CNNs, RNNs, and modern LLMs
- Discovery Guide: Complete guide to distributed, multi-machine discovery workflows, including failure/success caches, replay, candidate category limits, focus overrides, and the cost-of-growth gate
- Intelligent Design: Systematic squash function optimisation for hidden neurons
🔧 API & Reference
- API Reference: Comprehensive public API documentation
- DiscoveryDir API: Technical API reference for
`Creature.discoveryDir()` and data preparation
- Activation Functions Guide: Complete guide to all 30+ activation functions with selection guidance
🔬 Advanced Topics
- Predictive Coding: Neuroscience-inspired predictive coding training mode
- Predictive Coding Benchmarks: Benchmark results for predictive coding
- Elastic Backpropagation: Why we prefer minimum-change weight updates and avoid pushing saturated squashes further into saturation
- GPU Acceleration: GPU acceleration for discovery on macOS using Metal
- WASM Resident Topology: Feasibility analysis for WASM-resident creature topology
⚡ Operations
- Performance Tuning: Tuning WASM caches, thread pools, memory management, and scaling for large-scale training
- Performance Research: WASM migration research and benchmark learnings
- Troubleshooting: Common issues and solutions for WASM, discovery, memory, CI, and configuration
🤝 For Contributors
- AGENTS.md: Coding conventions, terminology, and development guidelines
- Discovery Architecture: Internal discovery pipeline architecture
🤝 Contributions
Contributions are welcome! See CONTRIBUTING.md for development setup, workflow, and guidelines. Please submit a pull request or open an issue to discuss potential changes/additions.
⚖️ Licence
This project is licensed under the terms of the Apache Licence 2.0. For the full licence text, please see LICENSE