NEAT Neural Network for DenoJS
This project is a unique implementation of a neural network based on the NEAT (NeuroEvolution of Augmenting Topologies) algorithm, written in DenoJS using TypeScript.
Feature Highlights
Extendable Observations: The observations can be extended over time as the indexing is done via UUIDs, not numbers. This prevents the need to restart the evolution process as new observations are added, providing flexibility and scalability.
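To illustrate the idea, here is a minimal, self-contained sketch of UUID-keyed observations; the ObservationMap type and resolveInputs helper below are hypothetical and not part of the library's API.

```ts
// Hypothetical sketch of UUID-keyed observations (not the library's API).
// Because inputs are looked up by UUID rather than by position, adding a
// new observation later does not invalidate creatures evolved before it.
type ObservationMap = Record<string, number>; // UUID -> observed value

const knownObservations = [
  "550e8400-e29b-41d4-a716-446655440000", // original observation
  "6fa459ea-ee8a-3ca4-894e-db77e160355e", // added later; no reindexing needed
];

function resolveInputs(observations: ObservationMap): number[] {
  // A UUID that a given sample does not supply simply defaults to zero.
  return knownObservations.map((uuid) => observations[uuid] ?? 0);
}
```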
Distributed Training: Training and evolution can be run on multiple independent nodes. The best-of-breed creatures can later be combined on a centralized controller node. This feature allows for distributed computing and potentially faster training times, enhancing the efficiency of the learning process.
Life Long Learning: Unlike many pre-trained neural networks, this project is designed for continuous learning, making it adaptable and potentially more effective in changing environments. This feature ensures the model remains relevant and accurate over time.
Efficient Model Utilization: Once trained, the current best model can be utilized efficiently by calling the activate function. This allows for quick and easy deployment of the trained model.
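As a hedged sketch of that deployment step (the interface shape below is assumed from this description, not taken from the library's actual typings):

```ts
// Assumed minimal shape for a trained creature; check the library's
// exports for the real Creature type and activate signature.
interface TrainedCreature {
  activate(input: number[]): number[];
}

function predict(model: TrainedCreature, input: number[]): number[] {
  // A single forward pass: no evolution or training involved.
  return model.activate(input);
}
```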
Unique Squash Functions: The neural network supports unique squash functions such as IF, MAX and MIN. These provide more options for the activation function, which can lead to different network behaviours, offering a wider range of potential solutions. More about Activation Functions.
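The exact definitions live in the Activation Functions documentation; the snippets below are only plausible interpretations of how such squash functions could behave:

```ts
// Plausible interpretations only; the library's definitions may differ.
// IF: select between two incoming activations based on a condition input.
const IF = (condition: number, whenTrue: number, whenFalse: number): number =>
  condition > 0 ? whenTrue : whenFalse;

// MAX / MIN: pass through the extreme of all incoming activations,
// rather than squashing a single weighted sum.
const MAX = (...inputs: number[]): number => Math.max(...inputs);
const MIN = (...inputs: number[]): number => Math.min(...inputs);
```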
Neuron Pruning: Neurons whose activations don't vary during training are removed, and the biases of the associated downstream neurons are adjusted to compensate. This optimizes the network by reducing redundancy and computational load. More about Pruning (Neural Networks).
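The following sketch shows the general idea using hypothetical Neuron and Synapse shapes; the library's real pruning pass is more involved:

```ts
// Hypothetical data shapes, for illustration only.
interface Synapse { from: number; to: number; weight: number }
interface Neuron { bias: number }

// Remove a neuron whose activation stayed constant during training by
// folding its constant contribution into every downstream neuron's bias.
function pruneConstantNeuron(
  neurons: Neuron[],
  synapses: Synapse[],
  prunedIndex: number,
  constantActivation: number,
): Synapse[] {
  for (const s of synapses) {
    if (s.from === prunedIndex) {
      neurons[s.to].bias += s.weight * constantActivation;
    }
  }
  // Drop every synapse that touches the pruned neuron.
  return synapses.filter((s) => s.from !== prunedIndex && s.to !== prunedIndex);
}
```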
CRISPR: Allows injection of genes into a population of creatures during evolution. This feature can introduce new traits and potentially improve the performance of the population. More about CRISPR.
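Conceptually, the injection resembles the sketch below, using hypothetical genome shapes (the real gene format is described in the CRISPR documentation):

```ts
// Hypothetical genome shapes; the library's real DNA format differs.
interface Genome { neurons: unknown[]; synapses: unknown[] }

// Splice the same gene (extra neurons plus their synapses) into every
// creature in the population during evolution.
function injectGene(population: Genome[], gene: Genome): void {
  for (const creature of population) {
    creature.neurons.push(...gene.neurons);
    creature.synapses.push(...gene.synapses);
  }
}
```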
Grafting: If parents aren't "genetically compatible", the "grafting" algorithm is used to graft structure from one parent onto the other to produce the child. This allows species from different islands to interbreed.
Memetic Evolution: The algorithm can now record and utilize the biases and weights of the fittest creatures to fine-tune future generations. This process, inspired by the concept of memes, allows the system to "remember" and build upon successful traits, enhancing the evolutionary process. Learn more about Memetic Algorithms.
Error-Guided Structural Evolution: Dynamically identifies and creates new synapses by analyzing neuron activations and errors. This targeted structural adaptation improves performance by explicitly reducing neuron-level errors, blending evolutionary topology adjustments with error-driven learning.
Note: Error-Guided Structural Evolution requires the NEAT-AI-Discovery Rust extension library to be built and installed. If the Rust library is not available, the discovery process is skipped gracefully, but no discovery functionality is possible without it.
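As a rough illustration of the idea (the Rust module's actual heuristics are not documented here, so the scoring below is an assumption): a candidate synapse can be proposed when a source neuron's activations correlate strongly with a target neuron's errors.

```ts
// Assumed illustration of error-guided synapse discovery: score a candidate
// connection by the Pearson correlation between the source neuron's
// activations and the target neuron's residual errors across samples.
function shouldProposeSynapse(
  sourceActivations: number[],
  targetErrors: number[],
  threshold = 0.5,
): boolean {
  const n = sourceActivations.length;
  const mean = (xs: number[]) => xs.reduce((a, b) => a + b, 0) / n;
  const mA = mean(sourceActivations);
  const mE = mean(targetErrors);
  let cov = 0, varA = 0, varE = 0;
  for (let i = 0; i < n; i++) {
    const dA = sourceActivations[i] - mA;
    const dE = targetErrors[i] - mE;
    cov += dA * dE;
    varA += dA * dA;
    varE += dE * dE;
  }
  const denom = Math.sqrt(varA * varE);
  if (denom === 0) return false; // constant signal: nothing to learn from
  return Math.abs(cov / denom) >= threshold;
}
```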
Discovery Integration Guide: Step-by-step instructions for running discovery via Creature.discoveryDir() are available in the DiscoveryDir guide.
Usage
This project is designed to be used in a DenoJS environment. Please refer to the DenoJS documentation for setup and usage instructions.
Discovery Integration
Discovery is now documented in detail in docs/DiscoveryDir.md. The guide covers data preparation, orchestration patterns, and safe-write practices for Creature.discoveryDir().
Enabling the Rust Discovery Module
The Rust FFI extension shipped via NEAT-AI-Discovery provides the accelerated structural hints used by discoveryDir(). To enable it:
- Clone the repository alongside this project and build the library:

```bash
git clone https://github.com/stSoftwareAU/NEAT-AI-Discovery.git
cd NEAT-AI-Discovery
cargo build --release
```
- Expose the compiled artefact to Deno by either copying it into ~/.cargo/lib or exporting an explicit path:

```bash
export NEAT_AI_DISCOVERY_LIB_PATH="/absolute/path/to/NEAT-AI-Discovery/target/release/libneat_ai_discovery.$(uname | tr '[:upper:]' '[:lower:]' | sed 's/darwin/dylib/;s/linux/so/;s/windows/dll/')"
```
- Grant FFI permissions and validate the installation:

```bash
deno run --allow-env --allow-ffi --allow-read scripts/check_discovery.ts
```
- In your application, guard discovery calls with isRustDiscoveryEnabled() so that controllers fail fast when the module is unavailable (a sketch of this guard pattern appears below).
When the library cannot be resolved, set NEAT_RUST_DISCOVERY_OPTIONAL=true in
environments where skipping discovery should not abort the worker. Otherwise,
treat a missing module as a deployment error and halt the job.
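Here is the promised sketch of that guard pattern. The availability check and discovery call are passed in explicitly because this README does not show the real import path of isRustDiscoveryEnabled():

```ts
// Guard pattern sketch: fail fast when the Rust module is missing, unless
// NEAT_RUST_DISCOVERY_OPTIONAL=true marks discovery as safely skippable.
function runDiscoveryStep(
  dir: string,
  isRustDiscoveryEnabled: () => boolean, // inject the library's real check
  discover: (dir: string) => Promise<void>, // e.g. wraps Creature.discoveryDir()
): Promise<void> {
  if (!isRustDiscoveryEnabled()) {
    if (Deno.env.get("NEAT_RUST_DISCOVERY_OPTIONAL") === "true") {
      console.warn("Rust discovery module unavailable; skipping discovery.");
      return Promise.resolve();
    }
    throw new Error(
      "NEAT-AI-Discovery library not found: treating as a deployment error.",
    );
  }
  return discover(dir);
}
```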
Contributions
Contributions are welcome. Please submit a pull request or open an issue to discuss potential changes/additions.
License
This project is licensed under the terms of the Apache License 2.0. For the full license text, please see LICENSE.