The ecosystem has a lot of people who are great at thinking about ideas. We need more people who are great at thinking about people and organizations.
Generator is a 3-month program where residents pitch, build, and ship AI safety infrastructure projects, then find full-time roles at impactful organizations.
Apply here by April 27.
You identify gaps and fill them without being asked.
Mission-driven, comfortable with ambiguity, willing to wear multiple hats.
Any background: engineering, design, operations, writing, policy.
What matters is your track record of execution and genuine concern about x-risk.
In the first 1–2 weeks, you'll pitch your own project while meeting the Constellation network. Then you execute with generous budgets, mentorship, and support for placement after the program.
Some ideas of what residents might build:
Events that set up new collaborations and catch attendees up on the latest research, like ControlConf or Action Potential.
Robust systems that help impactful orgs scale their teams, replacing the ad-hoc pipelines many rely on today.
Fast, scalable suites of human baselines: time-horizon data, persuasion RCTs, and uplift studies serving multiple organizations.
These are examples. You pitch anything that builds capacity and infrastructure across the AI safety ecosystem.
Generous funding to execute your project: events, contractors, tools, travel.
Full access to the Berkeley space, including meals on working days.
1:1s with successful generalists and deep dives on the state of the field.
J-1 visa sponsorship for international residents. Active support finding full-time roles after the program.
Write your project pitch, meet the Constellation network, build context on the field and its gaps.
Execute individually or in groups with generous budgets and mentorship from generalists across six partner organizations.
Find roles at impactful orgs, spin up a new org, or get acqui-hired. The majority of residents are placed within 12 months.
Residents are mentored by experienced generalists across core AI safety organizations.
Here are a few examples:
Workshops and conferences. Events have historically been extremely valuable for setting up new collaborations and bringing attendees up to speed on the latest research in a field. Work with an organization to create a workshop focused on an area of policy or technical research (e.g., ControlConf), or run a retreat bringing talent into AI safety (e.g., Action Potential).
Better recruiting pipelines. Multiple impactful organizations in the ecosystem have quite ad-hoc and inefficient recruiting pipelines, bottlenecking growth and wasting researcher time. Work with them to set up robust yet flexible systems that can be used to rapidly expand organizations' headcount.
Mass human data collection. There are many tasks for which it is useful, but somewhat expensive, to get data on how humans complete them: time-horizon data, persuasion RCTs, and novice uplift studies, among others. Work with one or more organizations to develop a fast suite of human baselines that takes advantage of economies of scale.
Build what the field needs most.
Apply by April 27