Yeah strange that ODEnets were left off, and I’m glad you mentioned it. That has the opportunity to be a transformative approach to more efficient training and much better performance on time-series problems.
- BERT and natural language processing,
- Tesla Autopilot Hardware v2+: NN at scale,
- AdaNet: AutoML with Ensembles,
- AutoAugment,
- Training Deep Networks with Synthetic data,
- Segmentation Annotation with Polygon-RNN++,
- DAWNBench: Training fast and cheap,
- BigGAN: state of the art in image synthesis,
- Video to Video Synthesis,
- Semantic segmentation,
- AlphaZero and OpenAI Five,
- Deep learning frameworks
Was good to hear Lex Fridman's take on where we're at.
Honest question: what have people's experiences with OpenAI Five been so far? I haven't had time to check it out in detail, so I'm paying close attention to what others are saying.
* Normalizing flows - e.g., https://arxiv.org/abs/1605.08803 , https://arxiv.org/abs/1807.03039 , among many others
* ODEnets and continuous normalizing flows - https://arxiv.org/abs/1806.07366
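For anyone who hasn't read the ODEnet paper: the core observation is that a residual block `h_{t+1} = h_t + f(h_t)` is one Euler step of the ODE `dh/dt = f(h)`, so you can replace stacked layers with an ODE solver and trade depth for solver steps. A toy sketch (my own, not from the paper's code; `f` here is a stand-in linear decay rather than a trained network):

```python
import math

def f(h):
    # stand-in for the learned layer; a real ODEnet would use a neural net here
    return -0.5 * h

def euler_integrate(h0, t0, t1, steps):
    """Integrate dh/dt = f(h) from t0 to t1 with fixed-step Euler."""
    h = h0
    dt = (t1 - t0) / steps
    for _ in range(steps):
        h = h + dt * f(h)  # steps=1 is exactly one residual block
    return h

coarse = euler_integrate(1.0, 0.0, 1.0, steps=1)     # one "ResNet block": 0.5
fine = euler_integrate(1.0, 0.0, 1.0, steps=1000)    # approaches exp(-0.5) ~ 0.607
```

The paper uses adaptive solvers (and the adjoint method for memory-efficient gradients) instead of fixed-step Euler, which is part of why it's appealing for irregularly sampled time series: evaluation times don't have to line up with layer indices.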