
Rich Barton-Cooper

@richb_c

AI safety researcher @ MATS

Rich Barton-Cooper reposted

Want to be mentored by one of the TIME100? At an org with a cool distinctive red logo? I have the program for you...

Honored and humbled to be in @TIME's list of the TIME100 AI of 2025! time.com/collections/ti… #TIME100AI

[Image attached to @MariusHobbhahn's tweet]


MATS has been an incredible experience! I've just completed 8.0, about to start the extension. If you want to work on important and urgent AI safety research under excellent mentorship, within a highly supportive ecosystem, I can't recommend it highly enough - apply by Oct 2nd!

MATS 9.0 applications are open! Launch your career in AI alignment, governance, and security with our 12-week research program. MATS provides field-leading research mentorship, funding, Berkeley & London offices, housing, and talks/workshops with AI experts.

[Image attached to @ryan_kidd44's tweet]


Excited to be working on this. Any feedback is much appreciated!

My current MATS stream is looking into black box monitoring for scheming. We've written a post with early results. If you have suggestions for what we could test, please lmk. If you have good ideas for hard scheming datasets, even better.

[Image attached to @MariusHobbhahn's tweet]

