ICLR (the International Conference on Learning Representations) is one of the most important international machine learning conferences. Its popularity is growing fast, putting it on a par with conferences such as ICML, NeurIPS, and CVPR.
The 2020 conference is slated to begin on April 26th, but the submission deadline has already come and gone, with 2,585 publicly available papers submitted. That's about a thousand more than were featured at the 2019 conference.
We analyzed abstracts and keywords of all the ICLR papers submitted within the last three years to see what’s trending and what’s dying out. Brace yourselves! This year, 28% of the papers used or claimed to introduce state-of-the-art algorithms, so be prepared for a great deal of solid machine learning work!
“Deep learning” – have you heard about it?
To say you use deep learning in Computer Vision or Natural Language Processing is like saying fish live in water. Deep learning has revolutionized machine learning and become its underpinning. It's present in almost all fields of ML, including less obvious ones like time series analysis or demand forecasting. This may be why the number of references to deep learning in keywords actually fell, from 19% in '18 to just 11% in '20: it's simply too obvious to acknowledge.
A revolution in network architecture?
One of the hottest topics this year turned out to be Graph Neural Networks (GNNs), a deep learning architecture for graph-structured data. These networks have proved tremendously helpful in applications such as medicine, social network classification, and modeling the behavior of dynamically interacting objects. The rise of GNNs is unprecedented: from 12 papers mentioning them in '18 to 111 in '20!
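To give a flavor of what "a deep learning architecture for graph-structured data" means, here is a minimal sketch of one message-passing layer in plain NumPy, assuming the simple mean-aggregation variant; the toy graph, feature sizes, and random weights are illustrative, not taken from any particular paper.

```python
import numpy as np

def gnn_layer(adjacency, features, weights):
    """One message-passing step: aggregate neighbour features, then transform."""
    # Add self-loops so each node also keeps its own information.
    a_hat = adjacency + np.eye(adjacency.shape[0])
    # Row-normalize: average over each node's neighbourhood.
    a_norm = a_hat / a_hat.sum(axis=1, keepdims=True)
    # Aggregate neighbour features, apply a learned transform and a ReLU.
    return np.maximum(a_norm @ features @ weights, 0.0)

# Toy graph: 4 nodes in a chain, edges 0-1, 1-2, 2-3 (symmetric adjacency).
adjacency = np.array([[0, 1, 0, 0],
                      [1, 0, 1, 0],
                      [0, 1, 0, 1],
                      [0, 0, 1, 0]], dtype=float)

rng = np.random.default_rng(0)
features = rng.normal(size=(4, 3))  # 3 input features per node
weights = rng.normal(size=(3, 2))   # project each node to 2 output features

hidden = gnn_layer(adjacency, features, weights)
print(hidden.shape)  # one 2-dimensional embedding per node
```

Stacking several such layers lets information flow along edges, which is what makes these networks a natural fit for graphs like molecules or social networks.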
All Quiet on the GAN Front
The next topic has been extremely popular in recent years. But what has been called 'the coolest idea in machine learning in the last twenty years' has quickly become thoroughly explored. Generative Adversarial Networks can learn to mimic any distribution of data, creating impressive, never-before-seen artificial images. Yet they are on the decline, despite remaining prevalent in the media (deep fakes).
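The adversarial idea can be sketched with a deliberately tiny example: a one-parameter "generator" that shifts Gaussian noise, and a logistic-regression "discriminator" trained to tell its output from samples of N(4, 1). This is a toy illustration of the training dynamic, not a practical GAN; every choice below (learning rate, parameterization, data) is an assumption made for the sketch.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
REAL_MEAN, LR, STEPS, BATCH = 4.0, 0.02, 1500, 32

b = 0.0          # generator parameter: fake = noise + b
w, c = 0.1, 0.0  # discriminator parameters: D(x) = sigmoid(w*x + c)

for _ in range(STEPS):
    z = rng.normal(size=BATCH)                   # generator input noise
    real = rng.normal(REAL_MEAN, 1.0, size=BATCH)
    fake = z + b

    # Discriminator step: ascend log D(real) + log(1 - D(fake)).
    d_real, d_fake = sigmoid(w * real + c), sigmoid(w * fake + c)
    w += LR * np.mean((1 - d_real) * real - d_fake * fake)
    c += LR * np.mean((1 - d_real) - d_fake)

    # Generator step: ascend log D(fake), i.e. try to fool the discriminator.
    d_fake = sigmoid(w * (z + b) + c)
    b += LR * np.mean(1 - d_fake) * w

print(b)  # the learned shift should have drifted toward the real mean
```

Even in this one-dimensional case you can see the characteristic push-and-pull: the generator's parameter chases the real distribution only because the discriminator keeps pointing out the difference.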
Leave designing your machine learning to… machines
Finding the right architecture for your neural network can be a pain in the neck. Fear not, though: Neural Architecture Search (NAS) will save you. NAS is a method of building network architecture automatically rather than handcrafting it. It has been used in several state-of-the-art algorithms, improving image classification, object detection, and segmentation models. The number of papers on NAS increased from a mere five in '18 to 47 in '20!
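The core loop of NAS can be illustrated with the simplest possible strategy, random search over a space of candidate architectures. Real systems use far richer search spaces and strategies (reinforcement learning, evolutionary algorithms, or differentiable relaxations) and actually train each candidate; here a mock scoring function stands in for "train this network and report validation accuracy", and the search space itself is hypothetical.

```python
import random

# A hypothetical, tiny architecture search space.
SEARCH_SPACE = {
    "num_layers": [2, 4, 8],
    "width": [64, 128, 256],
    "activation": ["relu", "gelu"],
}

def mock_evaluate(arch):
    """Stand-in for training the candidate and measuring validation accuracy."""
    score = 0.5 + 0.01 * arch["num_layers"] + 0.0005 * arch["width"]
    return score + (0.02 if arch["activation"] == "gelu" else 0.0)

def random_search(n_trials=10, seed=0):
    """Sample random candidates and keep the best-scoring one."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(n_trials):
        arch = {name: rng.choice(options) for name, options in SEARCH_SPACE.items()}
        score = mock_evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

best, score = random_search()
print(best)
```

The expensive part in practice is that each trial means training a network, which is why so much NAS research goes into making the evaluation step cheaper.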
Reinforcement learning – holding steady
The percentage of papers on reinforcement learning has remained more or less constant. Interest in the topic remains significant – autonomous vehicles, AlphaStar's success in playing StarCraft, and advances in robotics were all widely discussed this year. RL is a stable branch of machine learning, and for good reason: future progress is widely anticipated.
That was just a sample of machine learning trends. What will be on top next year? Even the deepest neural network cannot predict it. But interest in machine learning is still on the rise, and researchers are nothing if not creative. We shouldn't be surprised to hear about groundbreaking discoveries next year, or even a 180-degree change in the trends.
To see a full analysis of paper trends across the last three conferences, click the image below: