openai/privacy-filter

Tags: Token Classification · Transformers · ONNX · Safetensors · Transformers.js
Instructions for using openai/privacy-filter with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.

  • Libraries
  • Transformers

    How to use openai/privacy-filter with Transformers:

    # Use a pipeline as a high-level helper
    from transformers import pipeline

    pipe = pipeline("token-classification", model="openai/privacy-filter")

    # Or load the tokenizer and model directly
    from transformers import AutoTokenizer, AutoModelForTokenClassification

    tokenizer = AutoTokenizer.from_pretrained("openai/privacy-filter")
    model = AutoModelForTokenClassification.from_pretrained("openai/privacy-filter")
  • Transformers.js

    How to use openai/privacy-filter with Transformers.js:

    // npm i @huggingface/transformers
    import { pipeline } from '@huggingface/transformers';
    
    // Allocate pipeline
    const pipe = await pipeline('token-classification', 'openai/privacy-filter');
  • Notebooks
  • Google Colab
  • Kaggle
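
A token-classification pipeline returns one dict per detected entity, including "start" and "end" character offsets into the input text; a small helper can then mask those spans. A minimal sketch of that post-processing step, assuming the standard Transformers output format with aggregation enabled; the label names and sample spans below are hypothetical and not taken from this model's card:

```python
# Sketch: redacting character spans flagged by a token-classification
# pipeline. The dict shape ("start"/"end"/"entity_group") matches the
# Transformers pipeline with aggregation_strategy="simple"; the labels
# below are hypothetical -- check the model card for the real label set.

def redact(text, entities, mask="[REDACTED]"):
    """Replace each flagged character span in `text` with `mask`."""
    out = []
    cursor = 0
    # Walk spans left to right so the surviving text stays in order.
    for ent in sorted(entities, key=lambda e: e["start"]):
        out.append(text[cursor:ent["start"]])
        out.append(mask)
        cursor = ent["end"]
    out.append(text[cursor:])
    return "".join(out)

# Hand-written example spans standing in for pipeline output:
text = "Contact Jane Doe at jane@example.com."
entities = [
    {"entity_group": "NAME", "start": 8, "end": 16},
    {"entity_group": "EMAIL", "start": 20, "end": 36},
]
print(redact(text, entities))  # Contact [REDACTED] at [REDACTED].
```

In practice the `entities` list would come from `pipe(text)`; sorting by `start` keeps the rebuild correct even if the model emits spans out of order.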
privacy-filter (17.4 GB)
5 contributors (including mihaimaruseac and Xenova), history: 8 commits
Latest commit: 7ffa9a0, 15 days ago: Add Transformers.js usage/sample code (#5) by Xenova (HF Staff)
  • onnx/: Add ONNX weights + Transformers.js support (#3), 15 days ago
  • original/: Upload folder using huggingface_hub, 15 days ago
  • .gitattributes (2.06 kB): Add ONNX weights + Transformers.js support (#3), 15 days ago
  • README.md (11.1 kB): Add Transformers.js usage/sample code (#5), 15 days ago
  • config.json (3.04 kB): Add ONNX weights + Transformers.js support (#3), 15 days ago
  • model.safetensors (2.8 GB): Upload folder using huggingface_hub, 15 days ago
  • model.sig (8.69 kB): Upload folder using huggingface_hub, 15 days ago
  • tokenizer.json (27.9 MB): Upload folder using huggingface_hub, 15 days ago
  • tokenizer_config.json (234 Bytes): Upload folder using huggingface_hub, 15 days ago
  • viterbi_calibration.json (372 Bytes): Upload 4 files, 18 days ago