LLM Zoo: democratizing ChatGPT

The "Phoenix" model achieves competitive performance among open-source English and Chinese models while excelling in low-resource languages.

🖥 Github: https://github.com/freedomintelligence/llmzoo

Paper: https://arxiv.org/abs/2304.10453v1

⭐️ Parameters: https://huggingface.co/FreedomIntelligence/phoenix-chat-7b

@Machine_learn
Transfer Knowledge from Head to Tail: Uncertainty Calibration under Long-tailed Distribution

🖥 Github: https://github.com/jiahaochen1/calibration

Paper: https://arxiv.org/pdf/2304.06537v1.pdf

💨 Dataset: https://paperswithcode.com/dataset/cifar-10

@Machine_learn
Improving Segmentation of Objects with Varying Sizes in Biomedical Images using Instance-wise and Center-of-Instance Segmentation Loss Function

🖥 Github: https://github.com/brainimageanalysis/ici-loss

Paper: https://arxiv.org/pdf/2304.06229v1.pdf

💨 Dataset: https://paperswithcode.com/dataset/atlas-v2-0

@Machine_learn
Packt.Applied.Machine.Learning.and.High.Performance.pdf
20.5 MB
Book: Applied Machine Learning and High-Performance Computing on AWS
Authors: Mani Khanuja, Farooq Sabir, Shreyas Subramanian, Trenton Potgieter
ISBN: 978-1-80323-701-5
year: 2022
pages: 383
Tags: #ML #AWS
@Machine_learn
🔍 Unleashing Infinite-Length Input Capacity for Large-scale Language Models with Self-Controlled Memory System

A Self-Controlled Memory (SCM) system that unlocks unlimited-length input for large-scale language models.
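
In the paper the LLM itself acts as the controller, deciding when to consult an archived memory stream of past turns. A toy pure-Python sketch of the store-and-recall loop, with keyword overlap standing in for the real relevance scoring (all names here are hypothetical, not the repo's API):

```python
from collections import Counter

class MemoryStream:
    """Toy SCM-style memory: archive past turns, recall the most
    relevant ones for the current query."""
    def __init__(self):
        self.turns = []  # list of (text, token Counter)

    def add(self, text):
        self.turns.append((text, Counter(text.lower().split())))

    def recall(self, query, k=2):
        q = Counter(query.lower().split())
        # score by token overlap (stand-in for embedding similarity)
        scored = sorted(self.turns,
                        key=lambda t: sum((t[1] & q).values()),
                        reverse=True)
        return [text for text, _ in scored[:k]]

mem = MemoryStream()
mem.add("user asked about pruning large language models")
mem.add("user shared a recipe for pasta")
mem.add("we discussed quantized finetuning of language models")
print(mem.recall("finetune language models", k=2))
```

Only the recalled turns would be re-injected into the prompt, which is how a bounded context window can cover an unbounded conversation.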


🖥 Github: https://github.com/toufunao/SCM4LLMs

Paper: https://arxiv.org/abs/2304.13343v1

📌 Tasks: https://paperswithcode.com/task/language-modelling

@Machine_learn
ZipIt! Merging Models from Different Tasks without Training

ZipIt combines completely distinct models with different initializations, each solving a separate task, into one multi-task model without any additional training.
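
The key step is aligning redundant features between the two models before merging. A toy sketch of that idea for a single weight matrix, using greedy cosine matching in place of ZipIt's correlation-based "zip" operation (illustrative only, not the paper's algorithm):

```python
import numpy as np

def match_and_average(Wa, Wb):
    """Align each hidden unit (row) of Wb to its most similar unit of Wa
    by cosine similarity, then average the aligned matrices."""
    A = Wa / np.linalg.norm(Wa, axis=1, keepdims=True)
    B = Wb / np.linalg.norm(Wb, axis=1, keepdims=True)
    sim = A @ B.T
    perm = np.full(len(Wb), -1)
    used = set()
    for i in np.argsort(-sim.max(axis=1)):     # most confident rows first
        for j in np.argsort(-sim[i]):          # best unused partner
            if j not in used:
                perm[i] = j
                used.add(j)
                break
    return 0.5 * (Wa + Wb[perm])

rng = np.random.default_rng(0)
Wa = rng.normal(size=(4, 8))
Wb = Wa[[2, 0, 3, 1]]   # same units, shuffled: alignment should undo the shuffle
merged = match_and_average(Wa, Wb)
print(np.allclose(merged, Wa))  # True: the permuted copy aligns back onto Wa
```

Naively averaging unaligned weights would destroy both models; matching first is what makes training-free merging plausible.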


🖥 Github: https://github.com/gstoica27/zipit

Paper: https://arxiv.org/abs/2305.03053v1

📌 Dataset: https://paperswithcode.com/dataset/nabirds

@Machine_learn
pymbook.pdf
1.1 MB
Book: Python for you and me
Release 0.5.beta1
Authors: Kushal Das
ISBN: Null
year: 2023
pages: 175
Tags: #Python #Code
@Machine_learn
Discover and Cure: Concept-aware Mitigation of Spurious Correlation

🖥 Github: https://github.com/wuyxin/disc

Paper: https://arxiv.org/pdf/2305.00650v1.pdf

💨 Dataset: https://paperswithcode.com/dataset/metashift

@Machine_learn
Deep-Learning-for-Natural-Language-Processing.pdf
7.3 MB
Book: Deep Learning for Natural Language Processing (Creating Neural Networks with Python)
Authors: Palash Goyal, Sumit Pandey, Karan Jain
ISBN: 978-1-4842-3685-7
year: 2018
pages: 290
Tags: #NLP #DL #Python #Code
@Machine_learn
LLM-Pruner: On the Structural Pruning of Large Language Models

Structurally prune LLMs to compress them to any target size.
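
Structural pruning removes whole coupled rows and columns rather than individual weights, so the pruned model stays dense and fast. A toy magnitude-based sketch on one MLP block (LLM-Pruner itself uses gradient-based importance over dependency-coupled structures; everything here is illustrative):

```python
import numpy as np

def prune_hidden_units(W1, b1, W2, keep):
    """Toy structural pruning of y = W2 @ relu(W1 @ x + b1):
    drop the hidden units with the smallest combined weight norm,
    shrinking both layers consistently."""
    importance = np.linalg.norm(W1, axis=1) * np.linalg.norm(W2, axis=0)
    idx = np.argsort(-importance)[:keep]   # keep the strongest units
    return W1[idx], b1[idx], W2[:, idx]

rng = np.random.default_rng(1)
W1 = rng.normal(size=(16, 8))
b1 = rng.normal(size=16)
W2 = rng.normal(size=(4, 16))
W1p, b1p, W2p = prune_hidden_units(W1, b1, W2, keep=8)
print(W1p.shape, W2p.shape)  # (8, 8) (4, 8)
```

Because the removed units disappear from both the producing and consuming layer, no sparse kernels are needed to realize the speedup.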


🖥 Github: https://github.com/horseee/llm-pruner

Paper: https://arxiv.org/abs/2305.11627v1

📌 Dataset: https://paperswithcode.com/dataset/piqa

@Machine_learn
QLoRA: Efficient Finetuning of Quantized LLMs

The Guanaco model family outperforms all previously released open models on the Vicuna benchmark, reaching 99.3% of ChatGPT's performance level while requiring only 24 hours of finetuning on a single GPU.
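
The recipe: freeze the pretrained weights in 4-bit form and backpropagate only through small low-rank adapters. A numpy sketch of that decomposition, with plain absmax quantization standing in for QLoRA's NF4 + double quantization (illustrative only):

```python
import numpy as np

def quant4_dequant(W):
    """Absmax 4-bit quantize/dequantize (QLoRA uses NF4; this is uniform)."""
    scale = np.abs(W).max() / 7.0
    q = np.clip(np.round(W / scale), -8, 7)
    return q * scale

rng = np.random.default_rng(0)
W = rng.normal(size=(6, 6))           # frozen pretrained weight
r, alpha = 2, 4
A = rng.normal(size=(r, 6)) * 0.01    # trainable LoRA factor
B = np.zeros((6, r))                  # B starts at zero: adapter begins as a no-op
W_eff = quant4_dequant(W) + (alpha / r) * B @ A
print(np.allclose(W_eff, quant4_dequant(W)))  # True: adapter initially inactive
```

Only A and B receive gradients, so optimizer state shrinks from the full weight matrix to two thin factors, which is what fits the finetune on one GPU.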


🖥 Github: https://github.com/artidoro/qlora

Paper: https://arxiv.org/abs/2305.14314

⭐️ Demo: https://huggingface.co/spaces/uwnlp/guanaco-playground-tgi

📌 Dataset: https://paperswithcode.com/dataset/ffhq

@Machine_learn
Hiera: A Hierarchical Vision Transformer without the Bells-and-Whistles

Hiera is a hierarchical vision transformer that is fast, powerful, and, above all, simple. It outperforms the state-of-the-art across a wide array of image and video tasks while being much faster.

pip install hiera-transformer
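
A shape-only sketch of the hierarchical staging: each stage halves the token grid and widens the channels (real Hiera interleaves this with mask-unit attention; the random projection below is just a stand-in):

```python
import numpy as np

rng = np.random.default_rng(0)

def stage_pool(x, d_out):
    """2x2 mean-pool the token grid, then widen channels with a
    (randomly initialized, stand-in) linear projection."""
    h, w, c = x.shape
    x = x.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))
    return x @ rng.normal(size=(c, d_out)) / np.sqrt(c)

x = rng.normal(size=(56, 56, 96))   # stage-1 token grid for a 224px image
for d in (192, 384, 768):           # stages 2-4: halve the grid, grow the width
    x = stage_pool(x, d)
print(x.shape)  # (7, 7, 768)
```

This coarse-to-fine pyramid is what lets a plain ViT backbone serve both image and video tasks without the usual hierarchical-transformer machinery.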

🖥 Github: https://github.com/facebookresearch/hiera

Paper: https://arxiv.org/abs/2306.00989v1

📌 Dataset: https://paperswithcode.com/dataset/inaturalist

@Machine_learn
25_Awesome_Python_Scripts.pdf
171.4 KB
A Collection of 25 Awesome Python Scripts (mini projects)
#Python #Mini_Projects
@Machine_learn
Hello everyone, we have put a 50% discount on the machine learning and deep learning packages for those who need them. If you are interested, let me know.
@Raminmousa