Every decision in our sensitive content workflow shapes whether people feel safe, respected, and empowered when they show up on Bumble. This role sits at the heart of our mission to build a world where all relationships are healthy and equitable, by ensuring our moderation operations are consistent, scalable, and grounded in clear policy. You’ll lead the Human-in-the-Loop layer of our image classification pipeline, partnering closely with Policy, Product, and Engineering to improve how we detect and reduce harm. You’ll also role-model our Bumble values as you support teams working with difficult material and drive high-quality outcomes at scale.
Please note: this position involves exposure to sensitive and potentially graphic content.