From alex's docs:

> Whether your own or someone else's writing, alex helps you find gender favouring, polarising, race related, religion inconsiderate, or other unequal phrasing in text.
>
> For example, when `We've confirmed his identity` is given to alex, it will warn you and suggest using `their` instead of `his`.
This repo hosts the code for a browser extension that uses alex to check the text you write in your browser.
The extension scans the contents of all `<input>` and `<textarea>` elements (including password fields) on the webpage and analyzes them using alex.
Since alex does its text processing locally, your data will never leave the browser.
The Profanity Likeliness Threshold specifies how sure alex needs to be that a word is a profanity before warning you. From alex's docs:
| Rating | Use as a profanity | Use in clean text | Example |
| --- | --- | --- | --- |
| 2 | likely | unlikely | asshat |
| 1 | maybe | maybe | addict |
| 0 | unlikely | likely | beaver |
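When running alex directly (outside this extension), the same three-level scale is exposed through alex's `profanitySureness` configuration option, which the extension's threshold setting presumably maps onto. A minimal `.alexrc` that only warns about words alex rates as likely profanities (rating 2) would look like this:

```json
{
  "profanitySureness": 2
}
```

Lowering the value to 1 also reports "maybe" words such as `addict`; the default of 0 reports everything in the list.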