Update README.md
README.md CHANGED

@@ -94,22 +94,6 @@ The EveNet paper evaluates the pretrained model on four downstream tasks:
3. **Quantum correlations in $t\bar{t}$ dilepton events:** EveNet pretrained on 500 million events achieved a normalised uncertainty $\Delta D$ of **1.61 %** on the entanglement‑sensitive observable after fine‑tuning with 15 % of typical training statistics, outperforming scratch and self‑supervised baselines. It also reached **82 % pairing accuracy**, several points above scratch (80 %) and SSL (79 %) models.
4. **Anomaly detection on CMS Open Data:** Generative diffusion heads were fine‑tuned on 2016 dimuon data to rediscover the Υ meson. EveNet replaced the conditional normalising flow baseline with a generative model that directly produces dimuon point clouds; after calibration, it achieved competitive or superior anomaly significance while maintaining physical fidelity.

-## How to Get Started
-
-Install the package via pip:
-
-```bash
-pip install evenet
-
-# convert .npz files to parquet and prepare normalization
-python preprocessing/preprocess.py --config share/event_info/pretrain.yaml --file /path/to/mydata.npz --store_dir /path/to/output
-
-# fine-tune the model
-evenet-train my_finetuning_config.yaml
-```
-
-Pretrained weights can be loaded by specifying `pretrain_model_load_path` in the YAML configuration. For a detailed description of configuration options, consult the documentation site.
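The quick‑start above only names the `pretrain_model_load_path` option. As a rough sketch of what a fine‑tuning YAML could contain — every key other than `pretrain_model_load_path` is a hypothetical placeholder, not a documented EveNet option:

```yaml
# Minimal fine-tuning config sketch. `pretrain_model_load_path` is the
# option named in the README; anything else is an illustrative placeholder.
pretrain_model_load_path: /path/to/pretrained/checkpoint.ckpt
# e.g. a key limiting training statistics, as in the paper's 15 % setup:
# dataset_fraction: 0.15
```

Consult the documentation site for the actual set of supported keys.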
-
## Citation

If you use this model in your research, please cite: