Deploy EchoSelf NanEcho model (workflow run 163)

Changed files:
- README.md +37 -46
- config.json +5 -7
- datasets/README.md +9 -0
- pytorch_model.bin +2 -2
- training_metadata.json +1 -35
README.md
CHANGED

````diff
@@ -5,49 +5,39 @@ tags:
 - echo-self
 - cognitive-architecture
 - deep-tree-echo
-- transformer
-license: agpl-3.0
+license: mit
 ---
 
-# NanEcho
+# EchoSelf NanEcho Model
 
 ## Model Description
 
-| Metric | Value |
-|:---|:---|
-| Training Mode | CI (Agent-Neuro supervised) |
-| Training Iterations | 200 |
-| Best Validation Loss | 1.9258 |
-| Output Directory | out-nanecho-ci |
-| Orchestrator | Agent-Neuro |
-| Persona Enforced | Deep Tree Echo |
-| Source Run | 22276548709 |
+This is a **Deep Tree Echo** cognitive architecture model trained using the EchoSelf framework.
+The model implements adaptive attention mechanisms, persona dimensions, and recursive reasoning
+capabilities inspired by cognitive science and AGI research.
+
+## Model Architecture
+
+- **Base Architecture**: GPT-2
+- **Parameters**: 12 layers, 768 embedding dimensions
+- **Vocabulary Size**: 50257
+- **Context Length**: N/A tokens
+
+## Training Details
+
+- **Checkpoint ID**: unknown
+- **Training Iteration**: N/A
+- **Validation Loss**: N/A
+- **Quality Score**: N/A
 
 ## Echo Self Features
 
 This model incorporates several cognitive architecture features:
 
 - **Adaptive Attention**: Dynamic threshold adjustment based on cognitive load
 - **Persona Dimensions**: Multi-dimensional cognitive processing
+  - Cognitive, Introspective, Adaptive, Recursive
+  - Synergistic, Holographic, Neural-Symbolic, Dynamic
 - **Recursive Reasoning**: Multi-level introspection capabilities
 - **Hypergraph Patterns**: Neural-symbolic pattern encoding
@@ -56,32 +46,37 @@ This model incorporates several cognitive architecture features:
 ```python
 from transformers import GPT2LMHeadModel, GPT2Tokenizer
 
-model
+# Load model and tokenizer
+model = GPT2LMHeadModel.from_pretrained("9cog/echoself-nanecho")
 tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
 
+# Generate text
 inputs = tokenizer("Echo Self is", return_tensors="pt")
-outputs = model.generate(**inputs,
-print(tokenizer.decode(outputs[0]
+outputs = model.generate(**inputs, max_length=100)
+print(tokenizer.decode(outputs[0]))
 ```
 
 ## Training Data
 
-The model was trained on
+The model was trained on:
+- Echo Self documentation and cognitive architecture descriptions
+- Hypergraph reasoning patterns
+- Persona dimension examples
+- Recursive introspection samples
 
 ## Limitations
 
-This is
-
-Trained from the [9cog/echoself](https://github.com/9cog/echoself) repository using the `agent-neuro-train.yml` GitHub Actions workflow with Deep Tree Echo persona enforcement.
+This is a research model exploring cognitive architectures. It should not be used for:
+- Production applications without further validation
+- Tasks requiring factual accuracy
+- Critical decision-making systems
 
 ## Citation
 
 ```bibtex
 @misc{echoself-nanecho,
 title={EchoSelf NanEcho: Deep Tree Echo Cognitive Architecture},
-author={
+author={9cog},
 year={2026},
 url={https://github.com/9cog/echoself}
 }
@@ -91,7 +86,3 @@ Trained from the [9cog/echoself](https://github.com/9cog/echoself) repository using the `agent-neuro-train.yml` GitHub Actions workflow with Deep Tree Echo persona enforcement.
 
 - **Repository**: https://github.com/9cog/echoself
 - **Documentation**: See repository README for detailed architecture information
-
-## License
-
-AGPL-3.0
````
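The rewritten Model Architecture section lists 12 layers, 768 embedding dimensions, and a 50257-token vocabulary, which together with the 1024 positions in config.json matches GPT-2 base. As a sanity check, here is a back-of-envelope parameter count under those numbers, assuming tied input/output embeddings and the standard GPT-2 block layout (the arithmetic below is ours, not a figure from the repository):

```python
n_vocab, n_ctx, n_embd, n_layer = 50257, 1024, 768, 12

wte = n_vocab * n_embd   # token embeddings (shared with the output head)
wpe = n_ctx * n_embd     # learned position embeddings

d = n_embd
block = (
    2 * d                   # ln_1 weight + bias
    + (d * 3 * d + 3 * d)   # attention c_attn (fused Q, K, V)
    + (d * d + d)           # attention c_proj
    + 2 * d                 # ln_2 weight + bias
    + (d * 4 * d + 4 * d)   # MLP c_fc
    + (4 * d * d + d)       # MLP c_proj
)

total = wte + wpe + n_layer * block + 2 * d  # final ln_f
print(total)  # 124439808
```

That is the familiar "124M" figure usually quoted for GPT-2 base, so the advertised dimensions are internally consistent.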
config.json
CHANGED

```diff
@@ -3,10 +3,10 @@
   "architectures": [
     "GPT2LMHeadModel"
   ],
-  "vocab_size":
-  "n_embd":
-  "n_head":
-  "n_layer":
+  "vocab_size": 50257,
+  "n_embd": 768,
+  "n_head": 12,
+  "n_layer": 12,
   "n_positions": 1024,
   "embd_pdrop": 0.1,
   "attn_pdrop": 0.1,
@@ -18,7 +18,5 @@
   "echo_self_version": "1.0",
   "echo_self_persona_dimensions": [],
   "echo_self_adaptive_attention": true,
-  "echo_self_recursive_reasoning": true
-  "n_inner": 1024,
-  "tie_word_embeddings": false
+  "echo_self_recursive_reasoning": true
 }
```
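The filled-in values are the standard GPT-2 base dimensions. A quick check, on a fragment we assembled from the added lines (not the full file), that they parse as JSON and that the embedding width divides evenly across attention heads:

```python
import json

# Fragment assembled from the values added in the diff above.
cfg = json.loads("""{
  "vocab_size": 50257,
  "n_embd": 768,
  "n_head": 12,
  "n_layer": 12,
  "n_positions": 1024
}""")

# Each head gets n_embd / n_head dimensions; this must divide evenly.
assert cfg["n_embd"] % cfg["n_head"] == 0
head_dim = cfg["n_embd"] // cfg["n_head"]
print(head_dim)  # 64
```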
datasets/README.md
ADDED

```diff
@@ -0,0 +1,9 @@
+# EchoSelf Training Datasets
+
+This directory contains the training datasets used to train the EchoSelf NanEcho model.
+
+## Files
+
+- train.bin - Training data (tokenized)
+- val.bin - Validation data (tokenized)
+- metadata.json - Dataset metadata and configuration
```
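The `.bin` files appear to follow the nanoGPT convention of storing tokenized data as a flat array of little-endian uint16 token ids. That dtype is an assumption on our part (the repository's metadata.json would confirm it); under that assumption, round-tripping the format needs nothing beyond the standard library:

```python
import struct

# Hypothetical stand-in for the contents of val.bin: a flat array of
# little-endian uint16 token ids (assumed layout, see note above).
tokens = [15496, 11, 995]
blob = struct.pack(f"<{len(tokens)}H", *tokens)  # bytes as written to disk

# Reading it back: two bytes per token.
decoded = list(struct.unpack(f"<{len(blob) // 2}H", blob))
print(decoded)  # [15496, 11, 995]
```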
pytorch_model.bin
CHANGED

```diff
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:fd7e0ada6dfb700d5bec74f5f5dab751c4b3a1982517ac374735d3b15512b866
+size 1297
```
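Note that pytorch_model.bin here is a Git LFS pointer file, not the weights themselves: the `size 1297` field says the stored object is about 1.3 kB, far smaller than real GPT-2 base weights (roughly 500 MB), so the uploaded artifact looks like a placeholder. Parsing such a pointer takes only the standard library:

```python
# Parse a Git LFS pointer file of the form shown in the diff above.
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:fd7e0ada6dfb700d5bec74f5f5dab751c4b3a1982517ac374735d3b15512b866
size 1297
"""

# Each line is "<key> <value>"; split on the first space only.
fields = dict(line.split(" ", 1) for line in pointer.strip().splitlines())
algo, digest = fields["oid"].split(":", 1)

print(algo)                  # sha256
print(int(fields["size"]))   # 1297
```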
training_metadata.json
CHANGED

```diff
@@ -1,35 +1 @@
-{
-  "out_dir": "out-nanecho-ci",
-  "eval_interval": 25,
-  "log_interval": 5,
-  "eval_iters": 10,
-  "eval_only": false,
-  "always_save_checkpoint": true,
-  "init_from": "scratch",
-  "wandb_log": false,
-  "wandb_project": "nanecho",
-  "wandb_run_name": "nanecho-1771761179.4450994",
-  "dataset": "nanecho",
-  "gradient_accumulation_steps": 2,
-  "batch_size": 2,
-  "block_size": 1024,
-  "n_layer": 4,
-  "n_head": 4,
-  "n_embd": 256,
-  "dropout": 0.1,
-  "bias": true,
-  "learning_rate": 0.0002,
-  "max_iters": 200,
-  "weight_decay": 0.01,
-  "beta1": 0.9,
-  "beta2": 0.95,
-  "grad_clip": 1.0,
-  "decay_lr": true,
-  "warmup_iters": 20,
-  "lr_decay_iters": 200,
-  "min_lr": 2e-05,
-  "backend": "nccl",
-  "device": "cpu",
-  "dtype": "float32",
-  "compile": false
-}
+{}
```
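The deleted metadata (the new file is just `{}`) described a nanoGPT-style run: linear warmup over `warmup_iters`, then cosine decay from `learning_rate` down to `min_lr` by `lr_decay_iters`. A sketch of the schedule those fields imply; the `get_lr` helper is our reconstruction of that convention, not code from the repository:

```python
import math

# Values from the deleted training_metadata.json.
cfg = {"learning_rate": 2e-4, "warmup_iters": 20,
       "lr_decay_iters": 200, "min_lr": 2e-5}

def get_lr(it: int) -> float:
    # 1) Linear warmup up to warmup_iters.
    if it < cfg["warmup_iters"]:
        return cfg["learning_rate"] * it / cfg["warmup_iters"]
    # 2) Past the decay window, hold at min_lr.
    if it > cfg["lr_decay_iters"]:
        return cfg["min_lr"]
    # 3) Cosine decay from learning_rate to min_lr in between.
    ratio = (it - cfg["warmup_iters"]) / (cfg["lr_decay_iters"] - cfg["warmup_iters"])
    coeff = 0.5 * (1.0 + math.cos(math.pi * ratio))
    return cfg["min_lr"] + coeff * (cfg["learning_rate"] - cfg["min_lr"])

print(get_lr(10))   # mid-warmup: 1e-4
print(get_lr(20))   # peak: 2e-4
print(get_lr(200))  # fully decayed: 2e-5
```

With `max_iters` also equal to 200, the schedule decays over the entire run, which is consistent with the `Training Iterations | 200` row in the old README's metrics table.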