We are looking for projects that have a chance of substantially changing humanity's future trajectory for the better. The areas in which this kind of change seems most likely to us are:
Applications that fall outside of these focus areas are still welcome, and we expect to fund a substantial number of projects that don't narrowly fall into any of the categories above.
For more information and discussion, see our LessWrong launch post.
The basic goal of the process is to allow multiple funders to coordinate on funding decisions, while maintaining maximum freedom for each funder to fund whatever projects they are excited about. The process works as follows:
We've been doing various forms of grantmaking for many years, including involvement in projects like the Survival and Flourishing Fund and the Long-Term Future Fund, and we think we can do better, both in grant quality and in applicant experience.
In our experience running projects aimed at reducing existential risk from AI, we've found that speed is a key variable that often makes or breaks a project. Waiting on funding, and the uncertainty that comes with it, is often the breaking point. We aim to do better.
The Lightspeed Grants process also aims to be a way for new funders who want to contribute to humanity's long-term survival and flourishing to learn about good funding opportunities. This seems particularly important given the recent uptick in interest in existential risks from Artificial Intelligence.
If you are a funder interested in funding applications to Lightspeed Grants, or want to use our process more broadly, send us an email at firstname.lastname@example.org and we will get back to you within 24 hours.
Lightspeed Grants is run by Lightcone Infrastructure. Applications are evaluated by grant evaluators chosen by us in collaboration with our funders, selected for their general reasoning ability and their connections to people doing promising work. Our primary funder for this round is Jaan Tallinn.