
License: GNU General Public License v3.0


SensiSafe

SensiSafe is an API for detecting inappropriate content on the Internet, specialized in detecting the upload of pornographic content by users. It uses machine learning algorithms to analyze images uploaded to a website and identify those that contain inappropriate content. The SensiSafe API can be easily integrated into an existing website to enhance user security and protect users from pornographic content. It also allows site owners to control their content and act quickly in the event of a breach of the site's terms and conditions of use. SensiSafe is an innovative solution for improving online security and protecting users of all ages from unwanted content.

How to install

  • First, make sure that Node.js is installed on your system. To check, open a console and type `node -v`. If it prints a version number, you are ready for the next step. Otherwise, you will need to install Node.js.
  • Once Node.js is installed, open a console and navigate to your project directory.
  • Then run `npm i` to install the project's dependencies. This command downloads the required modules to your machine.
  • Set up your service:
    • Change the port in main.js (default: 3000)
    • Change the ban image: public/banned.jpg

Run

Use pm2 to keep the script running on your backend server (e.g. `pm2 start main.js`).

API

| URL | Description | Body |
| --- | --- | --- |
| `/api/detect` | Returns the API's raw detection data as JSON | `file -> picture_to_analyse` |
| `/api/cadre` | Creates a frame annotated with a description of everything the system detects | `file -> picture_to_analyse` |
| `/api/blur` | Returns the censored image if necessary, or the original if no censorship was applied | `file -> picture_to_analyse` |
| `/api/ban` | Returns the "banned" image if needed, or the original if no pornographic content was detected | `file -> picture_to_analyse` |

Example

All examples are available in ./examples/

Contributors

Script based on this masterpiece of code ❤️

sensisafe's People

Contributors: sn0walice