falcons-ai / nsfw_image_detection

Fine-Tuned Vision Transformer (ViT) for NSFW Image Classification

replicate.com
Total runs: 46.3M
24-hour runs: 200.0K
7-day runs: 1.8M
30-day runs: 8.4M
GitHub
Last updated: November 21, 2023

Introduction of nsfw_image_detection

Model Details of nsfw_image_detection

Readme

Model Description

The Fine-Tuned Vision Transformer (ViT) is a transformer encoder architecture, similar to BERT, adapted for image classification tasks. This model is based on "google/vit-base-patch16-224-in21k", which was pre-trained in a supervised manner on the large ImageNet-21k dataset. Images in the pre-training dataset are resized to a resolution of 224x224 pixels, making the model suitable for a wide range of image recognition tasks.
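The checkpoint name encodes the architecture: 16x16-pixel patches over a 224x224 input. The resulting token sequence length follows from standard ViT arithmetic (a sketch, not taken from the model card itself):

```python
# ViT patch arithmetic for google/vit-base-patch16-224-in21k:
# a 224x224 image is split into non-overlapping 16x16 patches.
image_size = 224
patch_size = 16

patches_per_side = image_size // patch_size   # 14
num_patches = patches_per_side ** 2           # 196
seq_len = num_patches + 1                     # +1 for the [CLS] token

print(patches_per_side, num_patches, seq_len)  # 14 196 197
```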

During fine-tuning, careful attention was paid to hyperparameter settings. The model was trained with a batch size of 16, which balances computational efficiency against the diversity of images the model sees in each update.

A learning rate of 5e-5 was used. The learning rate controls the magnitude of parameter updates during training; a value of 5e-5 is small enough for steady optimization while still allowing reasonably fast convergence, so the model learns quickly yet refines its weights stably throughout training.

Fine-tuning was performed on a proprietary dataset of 80,000 highly varied images, curated into two classes: "normal" and "nsfw." This variety lets the model learn the nuanced visual patterns needed to differentiate safe content from explicit content.

The objective of this training process was a model with a robust understanding of the relevant visual cues, able to handle the specific task of NSFW image classification and to support content safety and moderation with high accuracy and reliability.
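With the batch size of 16 stated above, these 80,000 images imply a fixed number of optimizer steps per epoch. A quick sanity check of the training setup (simple arithmetic, not a figure from the card):

```python
# Steps per epoch implied by the model card's figures.
dataset_size = 80_000  # images in the fine-tuning dataset
batch_size = 16        # fine-tuning batch size

steps_per_epoch = dataset_size // batch_size
print(steps_per_epoch)  # 5000
```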

Intended Uses & Limitations

NSFW Image Classification: The primary intended use of this model is for the classification of NSFW (Not Safe for Work) images. It has been fine-tuned for this purpose, making it suitable for filtering explicit or inappropriate content in various applications.

The model returns one of two labels: "normal" or "nsfw".

Note: the model assumes input images are RGB.

Runs of falcons-ai nsfw_image_detection on replicate.com

Total runs: 46.3M
24-hour runs: 200.0K
3-day runs: 700.0K
7-day runs: 1.8M
30-day runs: 8.4M

More Information About the nsfw_image_detection replicate.com Model

For more models under the nsfw_image_detection license (Apache-2.0), visit:

https://huggingface.co/models?license=license:apache-2.0

nsfw_image_detection replicate.com

nsfw_image_detection is an AI model hosted on replicate.com that provides the model's effect (Fine-Tuned Vision Transformer (ViT) for NSFW Image Classification) and can be used instantly. replicate.com supports a free trial of the nsfw_image_detection model as well as paid use, and the model can be called through an API from Node.js, Python, or plain HTTP.
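As one example of the HTTP route, Replicate's predictions endpoint (POST https://api.replicate.com/v1/predictions) accepts a JSON body with a model version and an input object. The sketch below only builds that request body; the version ID is a placeholder and the "image" input field name is an assumption — copy the real values from the Replicate model page, and supply an API token before actually sending the request:

```python
import base64
import json

# Hypothetical version ID -- copy the real one from the Replicate model page.
MODEL_VERSION = "<version-id>"

def build_prediction_body(image_bytes: bytes) -> str:
    """Build the JSON body for POST https://api.replicate.com/v1/predictions."""
    data_uri = "data:image/jpeg;base64," + base64.b64encode(image_bytes).decode()
    body = {
        "version": MODEL_VERSION,
        # "image" is an assumed input field name for this model.
        "input": {"image": data_uri},
    }
    return json.dumps(body)

body = build_prediction_body(b"\xff\xd8fake-jpeg-bytes")
print(body[:60])
```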

nsfw_image_detection replicate.com Url

https://replicate.com/falcons-ai/nsfw_image_detection

falcons-ai nsfw_image_detection online free

replicate.com is an online trial and API platform that integrates nsfw_image_detection's model effects, including API services, and provides a free online trial of nsfw_image_detection. You can try nsfw_image_detection online for free via the link below.

falcons-ai nsfw_image_detection online free url in replicate.com:

https://replicate.com/falcons-ai/nsfw_image_detection

nsfw_image_detection install

nsfw_image_detection is an open-source model whose code is available on GitHub, where any user can find and install it for free. In addition, replicate.com hosts a ready-to-use deployment of nsfw_image_detection that users can run directly for debugging and trial, with free API access.

nsfw_image_detection install url in replicate.com:

https://replicate.com/falcons-ai/nsfw_image_detection

nsfw_image_detection install url in github:

https://github.com/lucataco/cog-nsfw_image_detection
