The engineering stack and infrastructure at Meta enable us to build communities and connect billions of people around the world. To keep pace with rapid innovation in AI, and to support our AI workloads at scale, we are adopting a different approach to the design of our AI stack and the infrastructure it runs on.
As AI models continue to scale rapidly, the design search space becomes prohibitively large for human effort alone to explore proactively. Our goal is to use AI itself to shrink design and development timeframes and to efficiently navigate design search spaces toward high-potential regions, with the promise of efficient generalization and fine-tuning to constantly evolving AI workloads. We want to proactively make our AI stack more power- and compute-efficient, and to enhance the reliability of our infrastructure.
We invite the academic community to partner with us and build strong collaborations focusing on making our world-class AI stack, from silicon to models’ output, even better with AI.
We are interested in using AI and ML approaches, such as reinforcement learning, Bayesian modeling, and graph representation learning, to automate and improve the whole AI stack, from silicon to AI models’ output. Specific areas of interest include, but are not limited to: chip design, ASIC development, ML-guided compiler optimization, automatic kernel generation and selection, network topology design, network engineering and optimization, data center optimization, training cluster and system design, data center cooling, and AI for understanding AI workloads.
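To make the idea of navigating a design search space toward high-potential regions concrete, the following is a minimal, illustrative sketch of a black-box search loop. The design space, the toy `evaluate` objective (a stand-in for an expensive evaluation such as a simulated power/performance model), and all parameter values are assumptions for illustration only, not part of Meta’s actual stack or this call.

```python
import random

def evaluate(design):
    # Hypothetical stand-in for an expensive design evaluation, e.g. a
    # simulated power-efficiency score of a candidate configuration.
    # This toy objective peaks at (0.6, 0.3); higher is better.
    x, y = design
    return -((x - 0.6) ** 2 + (y - 0.3) ** 2)

def search(iterations=500, step=0.05, seed=0):
    """Explore/exploit search over a 2-D unit-square design space."""
    rng = random.Random(seed)
    best = (rng.random(), rng.random())
    best_score = evaluate(best)
    for _ in range(iterations):
        if rng.random() < 0.9:
            # Exploit: perturb the current best design locally.
            cand = (min(1.0, max(0.0, best[0] + rng.uniform(-step, step))),
                    min(1.0, max(0.0, best[1] + rng.uniform(-step, step))))
        else:
            # Explore: random restart elsewhere in the design space.
            cand = (rng.random(), rng.random())
        score = evaluate(cand)
        if score > best_score:
            best, best_score = cand, score
    return best, best_score
```

In practice, the approaches named above would replace this naive proposal step with a learned model, e.g. a Bayesian surrogate choosing candidates via an acquisition function, or a reinforcement-learning policy trained on past evaluations.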
To foster further innovation in this area, and to deepen our collaboration with academia, Meta is pleased to invite faculty to respond to this call for research proposals pertaining to the aforementioned topics. We anticipate making a total of eight awards, each of approximately $50,000. Payment will be made to the proposer's host university as an unrestricted gift.