A Survey of Data Augmentation Approaches for NLP

Data augmentation has become an increasingly popular and important task in NLP. In contrast to computer vision, where the standard methods are well known and already implemented in common libraries, the situation in NLP is far less settled.

Now a nice survey paper has been published that collects the techniques, models, and applications of data augmentation for text known to date:
https://arxiv.org/abs/2105.03075

In the appendix you can find a list of open-source implementations that may be useful for your task.
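As a taste of the token-level techniques such surveys catalog, here is a minimal sketch of two classic "Easy Data Augmentation" operations, random swap and random deletion. The function names and defaults are my own illustration, not taken from the paper:

```python
import random

def random_swap(tokens, n_swaps=1, seed=None):
    """EDA-style random swap: exchange two random token positions, n_swaps times."""
    rng = random.Random(seed)
    out = list(tokens)
    for _ in range(n_swaps):
        i, j = rng.sample(range(len(out)), 2)
        out[i], out[j] = out[j], out[i]
    return out

def random_deletion(tokens, p=0.1, seed=None):
    """EDA-style random deletion: drop each token with probability p,
    keeping at least one token so the example never becomes empty."""
    rng = random.Random(seed)
    kept = [t for t in tokens if rng.random() > p]
    return kept if kept else [rng.choice(list(tokens))]
```

Both operations preserve the label of a classification example while perturbing its surface form, which is the core idea behind most rule-based text augmentation.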
@Machine_learn
Topic : Sudoku solver (SolSudo)

Abstract : SolSudo is a Sudoku solver built with deep learning that can solve sudokus from images. It uses an iterative solution method: the model predicts the digits for the blank cells, and the predictions are committed one at a time. Each time a digit is filled in, the updated sudoku is fed back to the solver to determine the next digit, and this repeats until no blanks remain. One feature of this project is detecting the sudoku grid from an image and filling in the blanks; this requires tesseract-ocr, however, which can cause problems. I therefore also added a mode in which the sudoku digits are entered one by one, with 0 used for the empty cells. Below is an example of a sudoku, its detection, and its solution.

Github Link : https://github.com/AryaKoureshi/SolSudo
Linkedin Link : https://www.linkedin.com/posts/arya-koureshi_deeplearning-python-tensorflow-activity-6711641409658716160-kdSD

@Machine_learn
Hello everyone. Friends who need original-language (English) books can place an order by sending the book's title and publisher to my ID. All books are available at a 50% discount on the dollar price, for all fields of study.
@Raminmousa
Fresh picks from ArXiv
This week on ArXiv: 1000-layer GNN, solutions to OGB challenge, and theory behind GNN explanations 🤔

If I forgot to mention your paper, please shoot me a message and I will update the post.


Deep GNNs
* Training Graph Neural Networks with 1000 Layers ICML 2021
* Very Deep Graph Neural Networks Via Noise Regularisation with Petar Veličković, Peter Battaglia

Heterophily
* Improving Robustness of Graph Neural Networks with Heterophily-Inspired Designs with Danai Koutra

Knowledge graphs
* Query Embedding on Hyper-relational Knowledge Graphs with Mikhail Galkin

OGB-challenge
* Fast Quantum Property Prediction via Deeper 2D and 3D Graph Networks
* First Place Solution of KDD Cup 2021 & OGB Large-Scale Challenge Graph Prediction Track

Theory
* Towards a Rigorous Theoretical Analysis and Evaluation of GNN Explanations with Marinka Zitnik
* A unifying point of view on expressive power of GNNs

GNNs
* Stability of Graph Convolutional Neural Networks to Stochastic Perturbations with Alejandro Ribeiro
* TD-GEN: Graph Generation With Tree Decomposition
* Unsupervised Resource Allocation with Graph Neural Networks
* Equivariance-bridged SO(2)-Invariant Representation Learning using Graph Convolutional Network
* GemNet: Universal Directional Graph Neural Networks for Molecules with Stephan Günnemann
* Optimizing Graph Transformer Networks with Graph-based Techniques

Survey
* Systematic comparison of graph embedding methods in practical tasks
* Evaluating Modules in Graph Contrastive Learning
* A Survey on Mining and Analysis of Uncertain Graphs

@Machine_learn