OptFormer: Towards Universal Hyperparameter Optimization With Transformers - ai.googleblog.com

## Metadata
- Author: **ai.googleblog.com**
- Full Title: OptFormer: Towards Universal Hyperparameter Optimization With Transformers
- Category: #articles
- URL: https://ai.googleblog.com/2022/08/optformer-towards-universal.html
## Highlights
- The OptFormer is one of the first Transformer-based frameworks for hyperparameter tuning, learned from large-scale optimization data using flexible text-based representations. While numerous works have previously demonstrated the Transformer's strong abilities across various domains, few have touched on its optimization-based capabilities, especially over text space. Our core findings demonstrate for the first time some intriguing algorithmic abilities of Transformers: 1) a single Transformer network is capable of imitating highly complex behaviors from multiple algorithms over long horizons; 2) the network is further capable of predicting objective values very accurately, in many cases surpassing Gaussian Processes, which are commonly used in algorithms such as Bayesian Optimization.
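- To make the "text-based representation" idea concrete, below is a minimal Python sketch of how a tuning study (metadata plus past trials) could be flattened into a text prompt that a sequence model completes with the next trial's parameters. The field names, separators, and `serialize_study` helper are illustrative assumptions for this sketch, not the serialization format actually used by OptFormer.
```python
# Illustrative sketch only: the exact token format in the OptFormer work
# differs; names and separators here are assumptions.

def serialize_study(metadata: dict, trials: list[dict]) -> str:
    """Flatten study metadata and past trials into one text prompt."""
    header = ", ".join(f"{k}: {v}" for k, v in metadata.items())
    lines = [f"metadata: {header}"]
    for i, trial in enumerate(trials, start=1):
        params = ", ".join(f"{k}={v}" for k, v in trial["params"].items())
        lines.append(f"trial {i}: {params} | objective={trial['objective']:.4f}")
    # Leave the next trial line open; the model decodes the parameter tokens.
    lines.append(f"trial {len(trials) + 1}:")
    return "\n".join(lines)

metadata = {"name": "cifar10_tuning", "goal": "maximize"}
trials = [
    {"params": {"learning_rate": 0.01, "dropout": 0.2}, "objective": 0.871},
    {"params": {"learning_rate": 0.003, "dropout": 0.1}, "objective": 0.904},
]

print(serialize_study(metadata, trials))
```
  Because everything is plain text, one network trained on many such studies can both imitate the suggestion behavior of different algorithms and, by decoding an objective value for a proposed trial, act as the surrogate model that Gaussian Processes typically provide in Bayesian Optimization.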