Evolution through Large Models

This talk highlights how large language models can be used to instantiate a new kind of evolutionary algorithm, and how such an algorithm can in turn benefit the language models themselves. In particular, evolution through large models uses a language model to generate new data, which can bootstrap the model into competence in a new domain for which very little training data is available.

The promise of this method is demonstrated with language models trained on code, which learn to invent in a 2D simulated physical environment by bootstrapping from a single example program. The conclusion is that large language models may enable new kinds of evolutionary algorithms, which in turn can be leveraged to generate diverse training data.
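For a sense of the mechanics, below is a minimal Python sketch of such a loop, with the language model serving as the mutation operator of an evolutionary algorithm. The call_code_model and fitness functions are hypothetical stand-ins (a real system would prompt a code-generating model and evaluate candidates in the simulated environment); this is an illustration of the idea, not the speaker's implementation.

```python
import random

# Hypothetical stand-in for a code-generating language model: a real
# system would prompt the model to produce a modified variant of the
# program. Here we append a random command so the loop runs end to end.
def call_code_model(program: str) -> str:
    return program + f"\n    move({random.randint(-5, 5)})"

# Hypothetical stand-in for evaluating a candidate in the simulated
# 2D physical environment; here, longer programs simply score higher.
def fitness(program: str) -> float:
    return float(len(program.splitlines()))

def evolve(seed_program: str, generations: int = 50) -> list[str]:
    population = [seed_program]  # bootstrapped from a single example
    new_data = []                # accepted variants become training data
    for _ in range(generations):
        parent = random.choice(population)
        child = call_code_model(parent)        # LLM as mutation operator
        if fitness(child) >= fitness(parent):  # keep non-regressing children
            population.append(child)
            new_data.append(child)             # later: fine-tune the model
    return new_data

if __name__ == "__main__":
    data = evolve("def invent():\n    move(1)")
    print(f"generated {len(data)} new candidate programs")
```

In the full approach described in the abstract, the accepted programs would be fed back as training data, closing the loop between the evolutionary search and the language model.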

About the speaker
Joel Lehman

Machine Learning Resident at Stochastic Labs

Joel Lehman is a machine learning resident at Stochastic Labs. He was previously a research scientist at OpenAI, a founding member of Uber AI Labs, and an assistant professor at the IT University of Copenhagen. He co-wrote the popular science book “Why Greatness Cannot Be Planned,” on what AI search algorithms imply for individual and societal accomplishment. His research focuses on machine creativity and safety.

NLP Summit

When

Sessions: October 4 – 6
Trainings: October 11 – 14

Contact

nlpsummit@johnsnowlabs.com

Presented by

John Snow Labs