@machine_learn
A Survey on The Expressive Power of Graph Neural Networks
This is the best survey on the theory of GNNs I'm aware of. It provides so many illustrative examples of what GNNs can and cannot distinguish.
It's funny: it's by Ryoma Sato, whom I already knew from other work on GNNs. I assumed he was one of those old Japanese professors with a long beard and strict habits, but he turned out to be a first-year MSc student 🇯🇵
"Deep learning for Computer Vision by Jason brownlee"
Please share it with me
@raminmousa
https://machinelearningmastery.com/deep-learning-for-computer-vision/
@Machine_learn
MaxUp: A Simple Way to Improve Generalization of Neural Network Training
A new approach to augmenting both images and text. The idea is to generate a set of augmented examples with random perturbations or transforms and minimize the maximum, i.e. worst-case, loss over the augmented copies. By doing so, the authors implicitly introduce a smoothness or robustness regularization against the random perturbations, and hence improve generalization performance. Tested on a range of tasks, including image classification, language modeling, and adversarial certification, MaxUp consistently outperforms the existing best baseline methods without introducing substantial computational overhead.
paper: https://arxiv.org/abs/2002.09024
#augmentations #SOTA #ml
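A minimal PyTorch sketch of the worst-case-loss idea, assuming augment is any stochastic transform you supply; the names here are placeholders, not the authors' code:

import torch
import torch.nn.functional as F

def maxup_loss(model, x, y, augment, m=4):
    # Compute the loss on m augmented copies of each example,
    # keep only the worst-case (maximum) loss per example,
    # and average that over the batch.
    per_sample_losses = []
    for _ in range(m):
        logits = model(augment(x))
        per_sample_losses.append(F.cross_entropy(logits, y, reduction='none'))
    worst_case = torch.stack(per_sample_losses).max(dim=0).values  # [batch]
    return worst_case.mean()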
@Machine_learn
Softmax Splatting for Video Frame Interpolation
Paper:
https://arxiv.org/pdf/2003.05534.pdf
Github:
https://github.com/sniklaus/softmax-splatting
Short Summary:
https://www.marktechpost.com/2020/03/14/softmax-splatting-for-video-frame-interpolation/
@Machine_learn
Grid Search Optimization Algorithm in Python
https://stackabuse.com/grid-search-optimization-algorithm-in-python/
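Not the article's exact example, but a minimal sketch of the idea with scikit-learn's GridSearchCV: exhaustively score every combination in a parameter grid via cross-validation and keep the best.

from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC
from sklearn.datasets import load_iris

X, y = load_iris(return_X_y=True)

# Try every (C, gamma) combination with 5-fold cross-validation.
param_grid = {'C': [0.1, 1, 10], 'gamma': [0.01, 0.1, 1]}
search = GridSearchCV(SVC(), param_grid, cv=5)
search.fit(X, y)

print(search.best_params_, search.best_score_)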
@Machine_learn
Fast and Easy Infinitely Wide Networks with Neural Tangents
https://ai.googleblog.com/2020/03/fast-and-easy-infinitely-wide-networks.html
Colab notebook: https://colab.research.google.com/github/google/neural-tangents/blob/master/notebooks/neural_tangents_cookbook.ipynb#scrollTo=Lt74vgCVNN2b
Code: https://github.com/google/neural-tangents
Paper: https://arxiv.org/abs/1912.02803
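A minimal sketch using the library's stax API; the layer widths and toy inputs below are arbitrary, purely illustrative:

import jax.numpy as jnp
from neural_tangents import stax

# Define a finite architecture; kernel_fn gives the corresponding
# infinite-width NNGP and NTK kernels in closed form.
init_fn, apply_fn, kernel_fn = stax.serial(
    stax.Dense(512), stax.Relu(),
    stax.Dense(512), stax.Relu(),
    stax.Dense(1),
)

x1 = jnp.ones((10, 32))  # toy inputs, purely illustrative
x2 = jnp.ones((5, 32))
nngp, ntk = kernel_fn(x1, x2, ('nngp', 'ntk'))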
@Machine_learn
Meta-Transfer Learning for Zero-Shot Super-Resolution
Code: https://github.com/JWSoh/MZSR
Paper: https://arxiv.org/abs/2002.12213v1
A new paper from Samsung AI Center (Moscow) on unpaired image-to-image translation, now without any domain labels, even at training time!
▶️ youtu.be/DALQYKt-GJc
📝 arxiv.org/abs/2003.08791
📉 @Machine_learn
@Machine_learn
Anomaly detection with Keras, TensorFlow, and Deep Learning
In this tutorial, you will learn how to perform anomaly and outlier detection using autoencoders, Keras, and TensorFlow.
https://www.pyimagesearch.com/2020/03/02/anomaly-detection-with-keras-tensorflow-and-deep-learning/
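A minimal sketch of the reconstruction-error approach in Keras; the dense architecture, random placeholder data, and the 99th-percentile threshold are illustrative choices, not the tutorial's code:

import numpy as np
from tensorflow import keras

# A tiny dense autoencoder; anomalies are flagged by high reconstruction error.
inputs = keras.Input(shape=(784,))
encoded = keras.layers.Dense(32, activation='relu')(inputs)
decoded = keras.layers.Dense(784, activation='sigmoid')(encoded)
autoencoder = keras.Model(inputs, decoded)
autoencoder.compile(optimizer='adam', loss='mse')

# Train on normal data only (x_normal is a placeholder array).
x_normal = np.random.rand(1000, 784).astype('float32')
autoencoder.fit(x_normal, x_normal, epochs=5, batch_size=64, verbose=0)

# Score samples: reconstruction error above a chosen quantile marks an outlier.
recon = autoencoder.predict(x_normal, verbose=0)
errors = np.mean((x_normal - recon) ** 2, axis=1)
threshold = np.quantile(errors, 0.99)
is_anomaly = errors > threshold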
@Machine_learn
Graph Machine Learning research groups: Le Song
Le Song (~1981)
- Affiliation: Georgia Institute of Technology;
- Education: Ph.D. at U. of Sydney in 2008 (supervised by Alex Smola);
- h-index: 59;
- Awards: best papers at ICML, NeurIPS, AISTATS;
- Interests: generative and adversarial graph models, social network analysis, diffusion models.
@Machine_learn
Rethinking Image Mixture for Unsupervised Visual Representation Learning
Code: https://github.com/szq0214/Rethinking-Image-Mixture-for-Unsupervised-Learning
Paper: https://arxiv.org/abs/2003.05438v1
@Machine_learn
An AI program that plays Flappy Bird using reinforcement learning.
Code: https://github.com/taivu1998/FlapAI-Bird
Model: https://stanford-cs221.github.io/autumn2019-extra/posters/18.pdf
Paper: https://arxiv.org/abs/2003.09579
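As a generic illustration of the technique named in the title, here is a tabular Q-learning update; the state encoding, action set, and hyperparameters are placeholders, not FlapAI-Bird's code:

import random
from collections import defaultdict

# Generic tabular Q-learning sketch, not the repo's implementation.
Q = defaultdict(float)
ACTIONS = ('flap', 'noop')
alpha, gamma, epsilon = 0.1, 0.99, 0.1

def choose_action(state):
    # Epsilon-greedy exploration over the action set.
    if random.random() < epsilon:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])

def q_update(state, action, reward, next_state):
    # Move Q(s, a) toward the reward plus the best discounted next-state value.
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    Q[(state, action)] += alpha * (reward + gamma * best_next - Q[(state, action)])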