Integrity Institute


We are a community of integrity professionals protecting the social internet

About us

The Integrity Institute is a nonprofit organization run by a community of integrity professionals working towards a better social internet. The group has decades of combined experience in the integrity space across numerous platforms. We are here to explore, research, and teach how regulators and platforms can do better.

Website
http://integrityinstitute.org
Industry
Think Tanks
Company size
11-50 employees
Type
Nonprofit

Employees at Integrity Institute

Updates

  • Integrity Institute

    5,549 followers

    🌟 Announcing Our 2023 Annual Report! 🌟 2023 presented challenges to both our profession and the Integrity Institute’s larger goal of building a social internet that helps people, societies, and democracies thrive. It was also an important year for the Integrity Institute to prove itself and its model to the world. Our first-ever annual report highlights our work, the organizations and individuals we've influenced, and our citations. These are key indicators of our impact. Most importantly, we unite hundreds of people committed to the Integrity Oath and advancing a healthy social internet. We are grateful for the support from foundations and donors that make this possible. Thank you for your continued support as we strive to create a safer, healthier social internet. Together, we can make a difference. 📥 Read the Full Report Here: https://bit.ly/3KYEB7r

  • Integrity Institute reposted this

    Fraud, harassment, sexual abuse, misinformation - these are just a few of the risks of going online. In our latest podcast, Grady Ward talks about how gaming and social media platforms can be DESIGNED to minimize harm. In the outsourcing world, we mostly talk about content moderation and Trust & Safety. Both of these involve addressing these issues AFTER they begin. But what if platforms were designed to be safer spaces to begin with? Grady created a project at the Integrity Institute called "Focus on Features." You can pick any type of harm and learn what platform features encourage or minimize it. It's an incredible resource for anyone who works to make the Internet a safer space. Links to the podcast and to the Focus on Features website are in the comments. Take a listen!

    Peak Support

    10,484 followers

    In this episode of the Peak Experience, we are exploring ways to make the internet a safer place with our guest, Grady Ward, visiting fellow at the Integrity Institute. We're discussing how smarter platform design can curb online misbehavior and enhance content moderation—an issue close to our hearts in the outsourcing industry. Grady shares his journey from Google to the Integrity Institute, revealing insights on building a better social internet. Discover how small design tweaks can significantly reduce online harm and why robust customer feedback is vital for progress. As a fellow at the Integrity Institute, Grady developed “Focus on Features” - a groundbreaking resource that demonstrates how the design of social media, gaming, and various other platforms can mitigate issues like harassment, fraud, and numerous other online threats. Tune in to learn how to make the web safer for everyone! Link in the comments section. #TechTalk #InternetSafety #ContentModeration #Innovation #GamingIndustry #CustomerSupport #PeakSupport

  • Integrity Institute reposted this

    🚨 “There’s no more a pressing policy challenge at the moment than the impact that technology is having on democracy, on society and on people’s lives” - Tim Hughes of the Open Government Partnership, speaking at the 5th in our series of Civil Society Roundtable events, hosted jointly with the Open Government Partnership. There could not have been a better moment to hold an event like this on the Digital Services Act. With the landmark regulation having recently come into force, this was a unique opportunity to bring together over 90 participants from civil society, EU institutions, national regulatory bodies and academia.

    👉 “(In the DSA) we may finally have a regulation at EU level that can bring a more human rights-respecting digital ecosystem. So that means concrete rules on challenging things like illegal content, but more specifically due diligence obligations for platforms to better respect our rights online and make sure that it’s safer for us all.” - Asha Allen of the Centre for Democracy & Technology Europe

    👉 As Eliška Pírková from Access Now points out, “this is the first time that we have a chance to look in the kitchen of platforms and fully realise what is happening there and what kind of impact their systems and processes are having on our rights.”

    👉 Now, given its welcome focus on rights, it’s all about making sure “the DSA lives up to its promise,” highlights Chantal Joris of ARTICLE 19. “It is key that experts in human rights online that have been working on these issues for a long time get closely involved in the implementation and enforcement process.” Chantal emphasises why such roundtable events - as a way of bringing together civil society experts and policy makers - are so vital.

    🤝 An overarching conclusion from the day’s technical workshops was that DSA implementation, compliance monitoring and enforcement will require much deeper collaboration between regulatory authorities, civil society and public interest groups. These dedicated, limited stakeholder spaces will continue to be essential in ensuring the legislation lives up to its potential in creating a safer, more equitable digital ecosystem.

    🎥 WATCH the video here: https://lnkd.in/eRjAa8NV
    Read our blog for more key takeaways from the Roundtable event: https://lnkd.in/e-8BMcEQ

    Our thanks again to all our speakers, to the Open Government Partnership and to our Civil Society DSA Coordination Group (see more below, in comments) #DigitalServicesAct #DSA #PlatformRegulation #TechPolicy

  • Integrity Institute

    Announcing a New Resource: Best Practices for Developing & Launching Content Policies for your Platform

    The Integrity Institute is proud to announce a new member-created white paper. In this resource, the authors share their experience from platforms like Google, Meta, TikTok, Grindr, and Discord on the policy creation process, and specifically six key elements for launching platform policies:
    ✔️ Research and development
    ✔️ Getting cross-functional buy-in
    ✔️ Training content moderators
    ✔️ Training machine-learning (ML) models
    ✔️ Launching your policy
    ✔️ Quality assurance and post-launch reporting

    The authors note that this resource applies across the board for different social platforms: “Whether you’re working for a start-up and are a small but mighty policy team of one, or you’re in Big Tech and are part of a team focused on a specific area of safety policy, or you’re transitioning into a new role within policy, we hope this will be a useful guide.”

    About the Authors
    This white paper was carefully crafted by the following Integrity Institute members, each of whom has experience in policy creation, implementation, and enforcement across multiple social platforms: Sabrina Puls, Cathryn W., Alice Goguen Hunsberger, Nathalia Watkins, Abhi Chaudhuri, Natsuki Y., and Júlia Henriques Souza.

    Best Practices for Developing & Launching Content Policies for your Platform — Integrity Institute

    integrityinstitute.org

  • Integrity Institute

    New article: "We Worked on Election Integrity at Meta. The EU -- and All Democracies -- Need to Fix the Feed Before It's Too Late"

    Resident Fellow Matt Motyl, Ph.D. and Chief Research Officer Jeff Allen published a new article in Tech Policy Press on their experiences working in tech on safeguarding elections and combating threats to democracies around the world. This piece highlights platform vulnerabilities and the protective actions platforms can take, actions platforms know work because they have worked in past elections. With 4.1 billion people voting in 64+ countries this year, it is critical for platforms to take protective actions.

    In this article, we discuss:
    - The role of algorithms in spreading harmful civic content
    - The risks posed by fake accounts and engagement-based ranking systems
    - Insights from our work on mitigating algorithmic risks
    - The need for robust regulatory actions, like those under the EU's Digital Services Act, to ensure platforms are accountable and transparent

    Social media companies have the tools to make elections safer. With dozens of elections on the horizon, now is the time to demand change and protect our democracies. The link to the full text may be found in the comments section.

  • Integrity Institute reposted this

    Checkstep

    2,401 followers

    What are the steps to writing a great Content Policy for your platform? We've recently shared our Content Policy template. But how should you organize the writing and launch of a policy for your platform? The Integrity Institute just published a very interesting white paper on "Best Practices for Developing & Launching Content Policies for your Platform". Congratulations to the experienced writers - Alice Goguen Hunsberger, Sabrina Pascoe, Cathryn W., Nathalia Watkins, Abhi Chaudhuri, Natsuki Y., and Júlia Henriques Souza - for sharing their experience from Google, Meta, TikTok, Grindr, and Discord.

    In the white paper, you'll find the 6 key elements for launching platform policies:
    1. Research and development
    2. Getting cross-functional buy-in
    3. Training content moderators
    4. Training machine-learning (ML) models
    5. Launching your policy
    6. Quality assurance and post-launch reporting

    Find the links to the white paper and our free template in the comments to get started on your policy journey 👇

  • Integrity Institute reposted this

    Jess Weaver

    Research + writing on public interest tech

    Which platform design solutions best minimize risk to election integrity and social cohesion going into Election 2024? Check out the insights from Prosocial Design Network's last event on election integrity, featuring the brilliant minds Glenn E. from Integrity Institute, Kaili L. from Accountable Tech, Nicole Schneidman from Protect Democracy, and Ravi Iyer from the Neely Center for Ethical Leadership and Decision Making. Very much co-authored by the fantastic Julia Kamin! Thanks, Julia 🚀 I tell everyone, but if you're not subscribed to PDN, do it -- their events and community are incredibly valuable to those working in the deliberative democracy and public interest technology spaces.

    Designing for Election 2024, a Pro-Social Recap

    prosocialdesign.org
