<!doctype html>
<html lang="en">
<head>
  <meta charset="utf-8" />
  <meta name="viewport" content="width=device-width, initial-scale=1" />
  <title>NewsReX &ndash; Pre-trained News Recommendation Models</title>
  <link rel="stylesheet" href="style.css" />
</head>
<body>
  <div class="container">
    <header>
      <h1>NewsReX</h1>
      <p class="subtitle">Pre-trained News Recommendation Models</p>
      <div class="badges">
        <a href="https://arxiv.org/abs/2508.21572"><img src="https://img.shields.io/badge/arXiv-2508.21572-b31b1b.svg" alt="arXiv"></a>
        <a href="https://github.com/igor17400/NewsReX"><img src="https://img.shields.io/badge/GitHub-NewsReX-blue.svg" alt="GitHub"></a>
        <a href="https://www.python.org/downloads/release/python-3120/"><img src="https://img.shields.io/badge/python-3.12+-blue.svg" alt="Python 3.12+"></a>
      </div>
    </header>

    <section>
      <p>This organization hosts pre-trained weights for <strong>10 neural news recommendation models</strong> trained on the <a href="https://msnews.github.io/">MIND-small</a> dataset using the <a href="https://github.com/igor17400/NewsReX">NewsReX</a> framework. All models are trained with 3 random seeds (42, 123, 456) and evaluated on the standard MIND test split.</p>
    </section>
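The benchmark tables below report per-impression ranking metrics averaged over the test set. For readers unfamiliar with them, here is a minimal pure-Python sketch of MRR and nDCG@k as conventionally computed for MIND-style impressions (one binary click label per candidate); this is an illustration, not the NewsReX evaluation code:

```python
import math

def mrr(y_true, y_score):
    """Reciprocal rank averaged over all clicked items in one impression."""
    ranked = sorted(range(len(y_score)), key=lambda i: -y_score[i])
    rr = [1.0 / (rank + 1) for rank, i in enumerate(ranked) if y_true[i] == 1]
    return sum(rr) / sum(y_true)

def ndcg_at_k(y_true, y_score, k):
    """nDCG@k with exponential gain (2^rel - 1), the usual MIND convention."""
    ranked = sorted(range(len(y_true)), key=lambda i: -y_score[i])[:k]
    dcg = sum((2 ** y_true[i] - 1) / math.log2(rank + 2)
              for rank, i in enumerate(ranked))
    ideal = sorted(y_true, reverse=True)[:k]
    idcg = sum((2 ** rel - 1) / math.log2(rank + 2)
               for rank, rel in enumerate(ideal))
    return dcg / idcg
```

Dataset-level numbers are the mean of these per-impression scores; AUC is computed the standard way over the same labels and scores.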

    <section>
      <h2>Benchmark Results (MIND-small, mean &pm; std over 3 seeds)</h2>

      <h3>JAX Models</h3>
      <div class="table-wrapper">
        <table>
          <thead>
            <tr><th>Model</th><th>AUC</th><th>MRR</th><th>NDCG@5</th><th>NDCG@10</th><th>Weights</th></tr>
          </thead>
          <tbody>
            <tr><td><strong>CROWN</strong></td><td>0.6778&pm;0.0030</td><td>0.3246&pm;0.0018</td><td>0.3619&pm;0.0022</td><td>0.4233&pm;0.0022</td><td><a href="https://huggingface.co/newsrex/CROWN-JAX-MIND-small">Download</a></td></tr>
            <tr><td><strong>DIGAT</strong></td><td>0.6760&pm;0.0021</td><td>0.3245&pm;0.0021</td><td>0.3594&pm;0.0035</td><td>0.4220&pm;0.0027</td><td><a href="https://huggingface.co/newsrex/DIGAT-JAX-MIND-small">Download</a></td></tr>
            <tr><td><strong>CAUM</strong></td><td>0.6734&pm;0.0013</td><td>0.3202&pm;0.0009</td><td>0.3546&pm;0.0009</td><td>0.4185&pm;0.0006</td><td><a href="https://huggingface.co/newsrex/CAUM-JAX-MIND-small">Download</a></td></tr>
            <tr><td><strong>TCCM</strong></td><td>0.6734&pm;0.0055</td><td>0.3208&pm;0.0034</td><td>0.3574&pm;0.0046</td><td>0.4194&pm;0.0043</td><td><a href="https://huggingface.co/newsrex/TCCM-JAX-MIND-small">Download</a></td></tr>
            <tr><td><strong>PP-Rec</strong></td><td>0.6676&pm;0.0040</td><td>0.3182&pm;0.0033</td><td>0.3544&pm;0.0041</td><td>0.4164&pm;0.0036</td><td><a href="https://huggingface.co/newsrex/PPREC-JAX-MIND-small">Download</a></td></tr>
            <tr><td><strong>LSTUR</strong></td><td>0.6672&pm;0.0020</td><td>0.3177&pm;0.0033</td><td>0.3525&pm;0.0037</td><td>0.4156&pm;0.0033</td><td><a href="https://huggingface.co/newsrex/LSTUR-JAX-MIND-small">Download</a></td></tr>
            <tr><td><strong>NAML</strong></td><td>0.6639&pm;0.0014</td><td>0.3130&pm;0.0022</td><td>0.3456&pm;0.0033</td><td>0.4097&pm;0.0025</td><td><a href="https://huggingface.co/newsrex/NAML-JAX-MIND-small">Download</a></td></tr>
            <tr><td><strong>GLORY</strong></td><td>0.6624&pm;0.0030</td><td>0.3152&pm;0.0038</td><td>0.3483&pm;0.0041</td><td>0.4119&pm;0.0040</td><td><a href="https://huggingface.co/newsrex/GLORY-JAX-MIND-small">Download</a></td></tr>
            <tr><td><strong>MINER</strong></td><td>0.6579&pm;0.0024</td><td>0.3117&pm;0.0027</td><td>0.3444&pm;0.0035</td><td>0.4080&pm;0.0025</td><td><a href="https://huggingface.co/newsrex/MINER-JAX-MIND-small">Download</a></td></tr>
            <tr><td><strong>NRMS</strong></td><td>0.6561&pm;0.0006</td><td>0.3075&pm;0.0008</td><td>0.3394&pm;0.0003</td><td>0.4039&pm;0.0007</td><td><a href="https://huggingface.co/newsrex/NRMS-JAX-MIND-small">Download</a></td></tr>
          </tbody>
        </table>
      </div>

      <h3>PyTorch Models</h3>
      <div class="table-wrapper">
        <table>
          <thead>
            <tr><th>Model</th><th>AUC</th><th>MRR</th><th>NDCG@5</th><th>NDCG@10</th><th>Weights</th></tr>
          </thead>
          <tbody>
            <tr><td><strong>CROWN</strong></td><td>0.6705&pm;0.0045</td><td>0.3183&pm;0.0049</td><td>0.3553&pm;0.0056</td><td>0.4173&pm;0.0056</td><td><a href="https://huggingface.co/newsrex/CROWN-PYTORCH-MIND-small">Download</a></td></tr>
            <tr><td><strong>CAUM</strong></td><td>0.6656&pm;0.0053</td><td>0.3176&pm;0.0028</td><td>0.3504&pm;0.0040</td><td>0.4149&pm;0.0035</td><td><a href="https://huggingface.co/newsrex/CAUM-PYTORCH-MIND-small">Download</a></td></tr>
            <tr><td><strong>NAML</strong></td><td>0.6654&pm;0.0015</td><td>0.3105&pm;0.0009</td><td>0.3464&pm;0.0027</td><td>0.4097&pm;0.0018</td><td><a href="https://huggingface.co/newsrex/NAML-PYTORCH-MIND-small">Download</a></td></tr>
            <tr><td><strong>PP-Rec</strong></td><td>0.6631&pm;0.0044</td><td>0.3130&pm;0.0024</td><td>0.3487&pm;0.0041</td><td>0.4111&pm;0.0033</td><td><a href="https://huggingface.co/newsrex/PPREC-PYTORCH-MIND-small">Download</a></td></tr>
            <tr><td><strong>TCCM</strong></td><td>0.6616&pm;0.0019</td><td>0.3088&pm;0.0022</td><td>0.3428&pm;0.0031</td><td>0.4057&pm;0.0024</td><td><a href="https://huggingface.co/newsrex/TCCM-PYTORCH-MIND-small">Download</a></td></tr>
            <tr><td><strong>NRMS</strong></td><td>0.6534&pm;0.0025</td><td>0.3052&pm;0.0021</td><td>0.3367&pm;0.0019</td><td>0.4017&pm;0.0022</td><td><a href="https://huggingface.co/newsrex/NRMS-PYTORCH-MIND-small">Download</a></td></tr>
            <tr><td><strong>LSTUR</strong></td><td>&mdash;</td><td>&mdash;</td><td>&mdash;</td><td>&mdash;</td><td><a href="https://huggingface.co/newsrex/LSTUR-PYTORCH-MIND-small">Download</a></td></tr>
            <tr><td><strong>DIGAT</strong></td><td>&mdash;</td><td>&mdash;</td><td>&mdash;</td><td>&mdash;</td><td><a href="https://huggingface.co/newsrex/DIGAT-PYTORCH-MIND-small">Download</a></td></tr>
            <tr><td><strong>GLORY</strong></td><td>&mdash;</td><td>&mdash;</td><td>&mdash;</td><td>&mdash;</td><td><a href="https://huggingface.co/newsrex/GLORY-PYTORCH-MIND-small">Download</a></td></tr>
          </tbody>
        </table>
      </div>
    </section>
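Each cell in the tables above is a mean and standard deviation over the three training seeds. A small helper for reproducing that formatting from per-seed scores (whether the tables use the sample or population standard deviation is an assumption; the sample standard deviation is shown here):

```python
import statistics

def summarize(per_seed_scores):
    """Format seed-level scores as 'mean±std' with four decimals."""
    mean = statistics.mean(per_seed_scores)
    std = statistics.stdev(per_seed_scores)  # sample std (n-1); an assumption
    return f"{mean:.4f}±{std:.4f}"
```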

    <section>
      <h2>Supported Models</h2>
      <div class="table-wrapper">
        <table>
          <thead>
            <tr><th>Model</th><th>Paper</th><th>Venue</th></tr>
          </thead>
          <tbody>
            <tr><td>NRMS</td><td>Neural News Recommendation with Multi-Head Self-Attention</td><td>EMNLP 2019</td></tr>
            <tr><td>NAML</td><td>Neural News Recommendation with Attentive Multi-View Learning</td><td>IJCAI 2019</td></tr>
            <tr><td>LSTUR</td><td>Neural News Recommendation with Long- and Short-term User Representations</td><td>ACL 2019</td></tr>
            <tr><td>CROWN</td><td>Intent Disentanglement and Feature Self-Supervision for News Recommendation</td><td>WWW 2025</td></tr>
            <tr><td>PP-Rec</td><td>PP-Rec: News Recommendation with Personalized User Interest and Time-aware News Popularity</td><td>ACL 2021</td></tr>
            <tr><td>DIGAT</td><td>DIGAT: Modeling News Recommendation with Dual-Graph Interaction</td><td>EMNLP 2022 (Findings)</td></tr>
            <tr><td>GLORY</td><td>Going Beyond Local: Global Graph-Enhanced Personalized News Recommendations</td><td>RecSys 2023</td></tr>
            <tr><td>MINER</td><td>MINER: Multi-Interest Matching Network for News Recommendation</td><td>ACL 2022 (Findings)</td></tr>
            <tr><td>CAUM</td><td>News Recommendation with Candidate-aware User Modeling</td><td>SIGIR 2022</td></tr>
            <tr><td>TCCM</td><td>Topic-Centric Conversational Collaborative Model for News Recommendation</td><td>CIKM 2022</td></tr>
          </tbody>
        </table>
      </div>
    </section>

    <section>
      <h2>Quick Start</h2>
      <pre><code>git clone https://github.com/igor17400/NewsReX.git
cd NewsReX && uv sync --extra jax

# Evaluate a pre-trained model
uv run python src/train.py experiment=mind/nrms framework=jax \
    weights=hf://newsrex/NRMS-JAX-MIND-small/model.safetensors

# Train from scratch (3 seeds)
uv run python src/train.py experiment=mind/nrms framework=jax \
    multi_seed.enabled=true</code></pre>
    </section>
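The `hf://` weights path in the Quick Start follows a uniform naming scheme across every repo in this organization, as the tables above show. A tiny illustrative helper (not part of NewsReX) that builds the repo id and the `weights=` override for any model/framework pair:

```python
def repo_id(model: str, framework: str = "JAX",
            dataset: str = "MIND-small") -> str:
    """Build the Hugging Face repo id, e.g. newsrex/NRMS-JAX-MIND-small."""
    return f"newsrex/{model}-{framework}-{dataset}"

def weights_override(model: str, framework: str = "JAX") -> str:
    """Build the key=value weights override used in the Quick Start."""
    return f"weights=hf://{repo_id(model, framework)}/model.safetensors"
```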

    <section>
      <h2>Repository Structure</h2>
      <pre><code>newsrex/{MODEL}-{FRAMEWORK}-MIND-small/
├── model.safetensors              &lt;- best seed (default download)
├── test_results.json
├── training_run_summary.json
├── seed_42/model.safetensors
├── seed_123/model.safetensors
├── seed_456/model.safetensors
└── README.md</code></pre>
    </section>
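The top-level model.safetensors mirrors the best-performing seed. If per-seed metrics were also stored alongside each checkpoint, selecting the best seed locally could look like the sketch below; note that the per-seed test_results.json path and the "auc" key are hypothetical assumptions about the layout, not documented above:

```python
import json
from pathlib import Path

def best_seed(repo_dir, metric="auc"):
    """Return the seed_* directory name with the highest value for `metric`.

    Assumes each seed_*/ directory carries its own test_results.json holding
    a flat {metric: value} dict -- a hypothetical layout, not documented.
    """
    scores = {}
    for seed_dir in Path(repo_dir).glob("seed_*"):
        results = json.loads((seed_dir / "test_results.json").read_text())
        scores[seed_dir.name] = results[metric]
    return max(scores, key=scores.get)
```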

    <section>
      <h2>Citation</h2>
      <pre><code>@misc{azevedo2025newsrex,
  title={NewsReX: A More Efficient Approach to News Recommendation with Keras 3 and JAX},
  author={Igor L. R. Azevedo and Toyotaro Suzumura and Yuichiro Yasui},
  year={2025},
  eprint={2508.21572},
  archivePrefix={arXiv},
  primaryClass={cs.IR},
  url={https://arxiv.org/abs/2508.21572},
}</code></pre>
    </section>

    <footer>
      <p><strong>Authors:</strong> Igor L. R. Azevedo (U. Tokyo) &middot; Toyotaro Suzumura (U. Tokyo) &middot; Yuichiro Yasui (Nikkei Inc.)</p>
    </footer>
  </div>
</body>
</html>