Sentence Similarity
sentence-transformers
PyTorch
Transformers
Korean
bert
feature-extraction
TAACO
text-embeddings-inference
How to use KDHyun08/TAACO_STS with sentence-transformers:

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("KDHyun08/TAACO_STS")
sentences = [
    "The weather is lovely today.",
    "It's so sunny outside!",
    "He drove to the stadium.",
]
embeddings = model.encode(sentences)

similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)  # [3, 3]
```

How to use KDHyun08/TAACO_STS with Transformers:

```python
# Load model directly
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("KDHyun08/TAACO_STS")
model = AutoModel.from_pretrained("KDHyun08/TAACO_STS")
```
README.md
---
tags:
- sentence-transformers
- sentence-similarity
- transformers
language: ko
---
# TAACO_Similarity

This model is based on [Sentence-transformers](https://www.SBERT.net) and was trained on the KLUE STS (Sentence Textual Similarity) dataset. It was created to measure semantic cohesion between sentences, one of the metrics of K-TAACO (working title), a tool the author is developing to measure cohesion between Korean sentences. Training on the sentence-pair similarity dataset of the Modu Corpus (모두의 말뭉치) is also planned.
## Usage (Sentence-Transformers)

To use this model, you must first install [Sentence-transformers](https://www.SBERT.net):

```
pip install -U sentence-transformers
```
## Usage (comparing similarity between real sentences)

After installing [Sentence-transformers](https://www.SBERT.net), you can compare the similarity between sentences as shown below. The `query` variable holds the source sentence that serves as the basis for comparison, and the sentences to be compared against it go into `docs` as a list.
## Evaluation Results

Running the Usage example above produces the results below. The closer the score is to 1, the more similar the sentences.

```
Input sentence: For the birthday, I said I would prepare breakfast and started preparing the food from 8:30 a.m.

<The 10 sentences most similar to the input sentence>

1: For the birthday, I said I would prepare breakfast and started preparing the food from 8:30 a.m. The main menu was steak, stir-fried octopus, seaweed soup, japchae, soya, and more (similarity: 0.6687)

2: Every year on my wife's birthday I should prepare a birthday breakfast like this. I hope today turns out to be another pleasant day (similarity: 0.6468)

3: My wife's 40th birthday was prepared successfully (similarity: 0.4647)

4: Since it was my wife's birthday, I wanted to grill it deliciously, but an absurd situation occurred (similarity: 0.4469)

5: Because it's her birthday~ (similarity: 0.4218)

6: Yesterday was my wife's birthday (similarity: 0.4192)

7: Early in the morning I prepared the steak my wife likes and wanted to watch her enjoy it, but a completely unexpected situation came up... Still, I pulled myself together and switched to a different menu right away (similarity: 0.4156)

8: I was also grateful to my wife for enjoying the meal (similarity: 0.3093)

9: I am not sure whether my wife likes it, but seeing the pink sausage in the refrigerator made me think I should prepare the birthday meal right away. The food turned out successfully (similarity: 0.2259)

10: My wife likes that kind of steak, too. But then something completely unexpected happened (similarity: 0.1967)
```
## Training

The model was trained with the parameters: