Content Moderation Projects

  • Ethical Sourcing of Benchmarks

    Developers of nudity detection technology used in content moderation need datasets of nude content for training, testing, and benchmarking their detection algorithms.

    Our ongoing research finds that many, if not all, open-source and academic datasets lack the consent of the people depicted in them.

    Our team is investigating approaches to consensually sourcing and governing the use of nude datasets.

    Our initial investigation into the use of nude datasets in machine learning and computer vision research identifies several ethical challenges.

  • Trauma-Sensitive Content Takedown

    The process of content takedown from online platforms is time-consuming, labor-intensive, and emotionally taxing.

    To ease that burden, we are building the IBSA (image-based sexual abuse) takedown tracker.

    Many online platforms fail victim-survivors by:

    - Not promptly removing non-consensual intimate imagery (NCII)
    - Not providing transparency about the status of removal requests
    - Not notifying victim-survivors when content has been removed

    This harms victim-survivors because the longer content remains visible, the more likely it is to be stored and shared further.

    Victim-survivors and the organizations that support them must manually check whether content has been removed. Repeatedly seeing the same abusive content can be re-traumatizing.
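
    As a rough illustration, the sketch below shows one way an automated status checker could spare people from re-viewing content. This is a hypothetical sketch, not the IBSA takedown tracker's actual design: the names (TakedownRequest, Status, check) are ours, and it assumes a removed URL answers with HTTP 404 or 410.

    ```python
    """Hypothetical status checker for takedown requests.

    Assumptions (not from the source): a removed URL answers 404/410,
    and the platform honors HEAD requests. HEAD fetches headers only,
    so the abusive content itself is never downloaded or displayed.
    """
    from dataclasses import dataclass
    from datetime import datetime, timezone
    from enum import Enum

    import requests  # third-party: pip install requests


    class Status(Enum):
        REPORTED = "reported"      # takedown request filed with the platform
        STILL_LIVE = "still_live"  # URL still resolves; not yet removed
        REMOVED = "removed"        # URL no longer resolves


    @dataclass
    class TakedownRequest:
        url: str
        status: Status = Status.REPORTED
        last_checked: datetime | None = None


    def check(request: TakedownRequest, timeout: float = 10.0) -> TakedownRequest:
        """Update a request's status using a HEAD request only."""
        try:
            response = requests.head(request.url, timeout=timeout,
                                     allow_redirects=True)
            gone = response.status_code in (404, 410)
        except requests.RequestException:
            gone = False  # network error: treat as still live, re-check later
        request.status = Status.REMOVED if gone else Status.STILL_LIVE
        request.last_checked = datetime.now(timezone.utc)
        return request


    if __name__ == "__main__":
        # Hypothetical URL; a real tracker would persist these records and
        # notify the victim-survivor on the STILL_LIVE -> REMOVED transition.
        req = check(TakedownRequest(url="https://example.com/reported-item"))
        print(req.status.value, req.last_checked)
    ```

    Using HEAD rather than GET means no media bytes are ever fetched: the status check itself never requires a person, or the tracker, to load the abusive content.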