This sample demonstrates how to automatically moderate offensive images uploaded to Firebase Storage. It uses the Google Cloud Vision API to detect whether the image contains adult or violent content and, if so, uses ImageMagick to blur the image.
See `functions/index.js` for the moderation code.
Adult and violent content is detected using the Google Cloud Vision API's SafeSearch feature.
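The detection step could look roughly like the following minimal sketch, which uses the official `@google-cloud/vision` Node.js client. The helper name `isOffensiveImage` and the exact likelihood thresholds are assumptions for illustration, not the sample's actual code:

```js
// Sketch only: detect adult/violent content with the Vision API's SafeSearch.
const vision = require('@google-cloud/vision');

const client = new vision.ImageAnnotatorClient();

// Returns true when SafeSearch rates the image as likely adult or violent.
async function isOffensiveImage(bucketName, filePath) {
  const [result] = await client.safeSearchDetection(`gs://${bucketName}/${filePath}`);
  const safeSearch = result.safeSearchAnnotation;
  const likely = (rating) => rating === 'LIKELY' || rating === 'VERY_LIKELY';
  return likely(safeSearch.adult) || likely(safeSearch.violence);
}

module.exports = { isOffensiveImage };
```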
The image blurring is performed using ImageMagick, which is installed by default on all Cloud Functions instances. The image is first downloaded from the Firebase Storage bucket to the instance's local `tmp` folder using the `google-cloud` SDK.
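As an illustration of that flow, here is a minimal sketch that uses `firebase-admin` for Storage access and `child_process` to invoke ImageMagick. The helper name `blurImage` and the `-blur 0x8` setting are assumptions, not the sample's exact code:

```js
// Sketch only: download the image to /tmp, blur it, and re-upload in place.
const admin = require('firebase-admin');
const path = require('path');
const os = require('os');
const { exec } = require('child_process');
const { promisify } = require('util');

const execAsync = promisify(exec);

// `object` mirrors the Storage event metadata (bucket, name, contentType...).
async function blurImage(object) {
  const bucket = admin.storage().bucket(object.bucket);
  const tempPath = path.join(os.tmpdir(), path.basename(object.name));

  // Download from the bucket to the instance's local tmp folder.
  await bucket.file(object.name).download({ destination: tempPath });

  // ImageMagick's `convert` blurs the local copy in place.
  await execAsync(`convert "${tempPath}" -channel RGBA -blur 0x8 "${tempPath}"`);

  // Overwrite the original image with the blurred version.
  await bucket.upload(tempPath, { destination: object.name });
}

module.exports = { blurImage };
```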
The dependencies are listed in `functions/package.json`.
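For reference, the dependency list looks roughly like this; the exact packages and version ranges below are assumptions:

```json
{
  "dependencies": {
    "@google-cloud/vision": "^3.1.0",
    "firebase-admin": "^11.0.0",
    "firebase-functions": "^4.0.0"
  }
}
```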
The function triggers on upload of any file to your Firebase project's default Cloud Storage bucket.
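A minimal sketch of such a trigger, assuming the first-generation `firebase-functions` Storage API; the export name `blurOffensiveImages` is illustrative, and `isOffensiveImage` / `blurImage` are the hypothetical helpers sketched above:

```js
// Sketch only: run moderation whenever a file lands in the default bucket.
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const { isOffensiveImage } = require('./vision'); // hypothetical module paths
const { blurImage } = require('./blur');

admin.initializeApp();

exports.blurOffensiveImages = functions.storage.object().onFinalize(async (object) => {
  // Skip non-image uploads such as text or video files.
  if (!object.contentType || !object.contentType.startsWith('image/')) {
    return null;
  }
  if (await isOffensiveImage(object.bucket, object.name)) {
    await blurImage(object);
  }
  return null;
});
```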
Create a Firebase project in the Firebase Console. Enable billing on your project by switching to the Blaze plan, then visit the Storage tab.
In the Google Cloud Console, enable the Google Cloud Vision API for your project.
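If you prefer the command line, the API can also be enabled with the `gcloud` CLI, assuming the Cloud SDK is installed and configured for your project:

```
gcloud services enable vision.googleapis.com
```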
To test the sample:
- Deploy your project using `firebase deploy`.
- Go to the Storage tab in the Firebase Console and upload an image that contains adult or violent content. After a short time, the image will be replaced by a blurred version of itself.
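Alternatively, you can upload a test image from the command line with `gsutil`; the command below assumes the default `<project-id>.appspot.com` bucket, and `test-image.jpg` is a placeholder file name:

```
gsutil cp test-image.jpg gs://<project-id>.appspot.com/
```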