Detect Unsafe & NSFW Content

Demo to preview the settings

Introduction

Quickly identify whether a given image or GIF contains sensitive or NSFW (not safe for work) content using a pre-trained AI model.
The plugin classifies images across the following 5 categories:
  1. Drawing - safe-for-work drawings (including anime)
  1. Hentai - hentai and pornographic drawings
  1. Neutral - safe-for-work neutral images
  1. Porn - pornographic images, sexual acts
  1. Sexy - sexually explicit images, not pornography
Each category receives a probability from 0% to 100% that the image belongs to that category.
Check user-uploaded images/GIFs before using them.
This plugin is a tool for identifying NSFW content, but Bubble's terms still apply to acceptable content. You may not use this plugin in violation of Bubble's acceptable use policy (https://bubble.is/acceptable-use-policy) or branding guidelines (https://bubble.is/brand-guidelines).

Disclaimer

⚠️
Keep in mind that NSFW detection isn't perfect, but it's fairly accurate (~90% on a test set of 15,000 images). It is normal for some images to receive a few percent in irrelevant categories.
The best practice is to combine results. For example, if the Hentai, Porn, and Sexy categories are each below 20%, the image can be considered safe to use, even if it scores, say, 60% in the Drawing category while not actually being a drawing. Don't run more than one check at a time; otherwise the processor may be overloaded and the page may freeze or other unexpected errors may appear.
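The "combine results" rule above can be sketched in code. This is a minimal, hypothetical sketch: the `Probabilities` shape and `isSafe` function are illustrative names (not part of the plugin's API), mirroring the element states that the plugin exposes.

```typescript
// Probabilities as percentages (0–100), mirroring the plugin's element states.
interface Probabilities {
  drawing: number;
  hentai: number;
  neutral: number;
  porn: number;
  sexy: number;
}

// An image is treated as safe when each explicit category stays below the
// threshold, regardless of how the Drawing/Neutral scores come out.
function isSafe(p: Probabilities, threshold = 20): boolean {
  return p.hentai < threshold && p.porn < threshold && p.sexy < threshold;
}

// Example: high Drawing score but low explicit scores → safe to use.
const result: Probabilities = { drawing: 60, hentai: 5, neutral: 30, porn: 3, sexy: 2 };
console.log(isSafe(result)); // true
```

In a Bubble workflow you would express the same check as a condition on the NSFW element's probability states rather than in code.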

Plugin Elements Properties

This plugin has one visual element which can be used on the page: NSFW.

NSFW

Identify sensitive or NSFW (not safe for work) content.

Element Actions

  1. Analyze Image - Check content from a static image.
      Fields:
        Image (type: image) - Upload an image or insert a link to it.
  1. Analyze Gif - Check content from a dynamic image.
      Fields:
        Gif (type: file) - Upload a gif file or insert a link to it.

Element Events

  • Error Occurred - This event is triggered when an error occurs.
  • Image Checked - This event is triggered when the image has been checked.

Element States

  • Instance Error (text) - The error that occurred during the checking process.
  • Sexy Probability (number) - The sexy content probability in percentages.
  • Hentai Probability (number) - The hentai content probability in percentages.
  • Porn Probability (number) - The porn content probability in percentages.
  • Neutral Probability (number) - The neutral content probability in percentages.
  • Drawing Probability (number) - The drawing content probability in percentages.
  • Is Loading (yes / no) - Indicates whether checking is in progress.

Workflow example

How to check an image

This example shows how the NSFW element is used to check an image.
  1. Place an NSFW element on the page.
  1. Use Input and Picture Uploader elements for the file input.
  1. Use a Button element to start the check.
  1. In the workflow, when the button is clicked, call the Analyze Image action.
  1. To display the results, add Text elements on the page that reference the NSFW element states.

Changelogs

Update 22.11.22 - Version 2.4.0

  • Description update

Update 31.10.22 - Version 2.3.0

  • Description update

Update 25.10.22 - Version 2.2.0

  • Demo update

Update 11.11.20 - Version 2.1.0

  • Fixed cross-origin errors

Update 26.07.20 - Version 2.0.1

  • Minor fixes

Update 03.10.19 - Version 2.0.0

  • Description changes

Update 29.08.19 - Version 1.0.0

  • Initial release