---
license: mit
tags:
- outfit-compatibility-modeling
- fashion-recommendation
- multimodal
- pytorch
- graph-neural-networks
---

<a id="top"></a>
<div align="center">
<h1>👔 OCMCF: Complementary Factorization towards Outfit Compatibility Modeling</h1>

<p>
<b>Tianyu Su</b>
<b>Xuemeng Song</b>
<b>Na Zheng</b>
<b>Weili Guan</b>
<b>Yan Li</b>
<b>Liqiang Nie</b>
</p>
</div>

These are the official pre-trained model weights for **OCMCF** (Complementary Factorization towards Outfit Compatibility Modeling), an approach to outfit compatibility modeling that combines complementary factorization with graph neural networks.

📄 **Paper:** [ACM MM 2021](https://dl.acm.org/doi/10.1145/3474085.3475537)
💻 **GitHub Repository:** [iLearn-Lab/MM21-OCMCF](https://github.com/iLearn-Lab/MM21-OCMCF)
🎬 **Video Introduction:** [Demo](https://files.atypon.com/acm/6091e3d23a3433ff198ae9d4119200ea)

---

## 📋 Model Information

### 1. Model Name
**OCMCF** (Complementary Factorization towards Outfit Compatibility Modeling) checkpoints.

### 2. Task Type & Applicable Tasks
- **Task Type:** Outfit Compatibility Modeling / Fashion Recommendation / Multimodal Learning
- **Applicable Tasks:** Predicting compatibility scores between fashion items, outfit recommendation, and fill-in-the-blank fashion tasks on the Polyvore dataset.
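
As a toy illustration of the task itself, outfit-level compatibility is often reduced to an aggregate over pairwise item scores. The sketch below shows that reduction in plain Python; it is *not* OCMCF's actual model — the item names, the score table, and the `outfit_score` helper are all invented for illustration:

```python
from itertools import combinations

def outfit_score(pairwise_score, items):
    """Average a pairwise compatibility function over all item pairs.

    Generic illustration only: `pairwise_score` stands in for a
    hypothetical learned model mapping an item pair to a value in [0, 1].
    """
    pairs = list(combinations(items, 2))
    if not pairs:
        return 0.0
    return sum(pairwise_score(a, b) for a, b in pairs) / len(pairs)

# Toy example with a hand-made symmetric score table (hypothetical values).
table = {
    frozenset({"shirt", "jeans"}): 0.9,
    frozenset({"shirt", "boots"}): 0.7,
    frozenset({"jeans", "boots"}): 0.8,
}
score = outfit_score(lambda a, b: table[frozenset({a, b})],
                     ["shirt", "jeans", "boots"])
print(round(score, 2))
```

A real compatibility model replaces the lookup table with learned multimodal item representations; the fill-in-the-blank task then amounts to picking the candidate item that maximizes the outfit score.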

### 3. Project Introduction
Outfit compatibility modeling is a crucial task in fashion recommendation systems. **OCMCF** introduces a complementary factorization approach that captures compatibility relationships between clothing items through graph neural networks and transformer architectures. The method decomposes outfit compatibility into complementary factors to better model item relationships.
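
To make the decomposition idea concrete, a factorized score can be viewed as a weighted fusion of per-factor compatibility estimates. The snippet below is only a schematic sketch of that fusion — OCMCF learns its complementary factors end-to-end, and the factor names and weights here are invented:

```python
def combine_factor_scores(factor_scores, weights):
    """Fuse per-factor compatibility scores into one outfit score.

    Illustrative only: the real model's factors are learned, not
    hand-assigned; `weights` must sum to 1 so the result stays in [0, 1].
    """
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights should sum to 1"
    return sum(weights[f] * s for f, s in factor_scores.items())

# Hypothetical per-factor scores and fusion weights.
factors = {"color": 0.9, "style": 0.6, "category": 0.8}
weights = {"color": 0.3, "style": 0.4, "category": 0.3}
print(round(combine_factor_scores(factors, weights), 2))
```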

### 4. Training Data Source
The model was trained and evaluated on:
- **Polyvore Outfits** (full dataset)
- **Polyvore Outfits-D** (disjoint split for a more challenging evaluation)

---

## 🚀 Usage & Basic Inference

These weights are designed to be used directly with the official OCMCF GitHub repository.

### Step 1: Prepare the Environment
Clone the GitHub repository and install dependencies:

```bash
git clone https://github.com/iLearn-Lab/MM21-OCMCF.git
cd MM21-OCMCF/codes
conda create -n ocm-cf python=3.8
conda activate ocm-cf

# Install PyTorch
conda install pytorch==1.6.0 torchvision==0.7.0 cudatoolkit=10.1 -c pytorch

# Install PyTorch Geometric
CUDA=cu101
TORCH=1.6.0
pip install torch-scatter -f https://pytorch-geometric.com/whl/torch-${TORCH}+${CUDA}.html
pip install torch-sparse -f https://pytorch-geometric.com/whl/torch-${TORCH}+${CUDA}.html
pip install torch-geometric

# Install other dependencies
conda install scikit-learn tensorboard
```

### Step 2: Download Model Weights & Data
Download the checkpoint files from this Hugging Face repository and place them in your local `checkpoints/` directory:

```text
checkpoints/
├── disjoint_best.pt
└── nondisjoint_best.pt
```

Ensure you also download and structure the Polyvore dataset as specified in the GitHub repository's Data Preparation section.
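
Once the files are in place, a quick sanity check is to load a checkpoint and group its parameter names by top-level module. The helper below is a stdlib-only sketch; it assumes the `.pt` files hold a plain PyTorch state dict (the commented `torch.load` call), and the toy keys are invented:

```python
def group_params(state_dict):
    """Group state-dict parameter names by their top-level module prefix."""
    groups = {}
    for name in state_dict:
        groups.setdefault(name.split(".", 1)[0], []).append(name)
    return groups

# With the real checkpoint (assuming it stores a plain state dict):
#   import torch
#   sd = torch.load("checkpoints/disjoint_best.pt", map_location="cpu")
#   print(sorted(group_params(sd)))

# Toy illustration with made-up parameter names:
toy = {"gnn.conv1.weight": 0, "gnn.conv1.bias": 0, "head.fc.weight": 0}
print(group_params(toy))
```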

### Step 3: Run Inference
To run inference with the downloaded checkpoints:

```bash
cd inference
bash inference_all_tasks.sh ${gpu_id}
```

To train the model from scratch:

```bash
# For the Polyvore Outfits dataset
python main.py --polyvore-split nondisjoint

# For the Polyvore Outfits-D dataset
python main.py --polyvore-split disjoint
```

---

## 📦 Model Checkpoints

This repository contains the following pre-trained checkpoints:

| Checkpoint | Description | Dataset |
|------------|-------------|---------|
| `disjoint_best.pt` | Best model for the disjoint split | Polyvore Outfits-D |
| `nondisjoint_best.pt` | Best model for the non-disjoint split | Polyvore Outfits |

---

## ⚠️ Limitations & Notes

**Disclaimer:** This framework and its pre-trained weights are intended for **academic research purposes only**.
- The model requires access to the original Polyvore dataset for full evaluation.
- Performance may vary depending on the dataset split and evaluation metrics used.

---

## ✍️ Citation

If you find our work or these model weights useful in your research, please consider leaving a **Star** ⭐ on our GitHub repo and citing our paper:

```bibtex
@inproceedings{743aced44b004a3dac16da3feb57edbd,
  title     = "Complementary Factorization towards Outfit Compatibility Modeling",
  author    = "Tianyu Su and Xuemeng Song and Na Zheng and Weili Guan and Yan Li and Liqiang Nie",
  booktitle = "MM 2021 - Proceedings of the 29th ACM International Conference on Multimedia",
  publisher = "Association for Computing Machinery, Inc",
  pages     = "4073--4081",
  year      = "2021",
  doi       = "10.1145/3474085.3475537",
}
```

---

## 📜 License

This project is released under the MIT License.