Packt.Applied.Machine.Learning.and.High.Performance.pdf
20.5 MB
Book: Applied Machine Learning and High-Performance Computing on AWS
Authors: Mani Khanuja, Farooq Sabir, Shreyas Subramanian, Trenton Potgieter
ISBN: 978-1-80323-701-5
year: 2022
pages: 383
Tags: #ML #AWS
@Machine_learn
🔍 Unleashing Infinite-Length Input Capacity for Large-scale Language Models with Self-Controlled Memory System

A Self-Controlled Memory (SCM) system that unleashes infinite-length input capacity for large-scale language models.
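
For intuition only, not the paper's exact controller: an SCM-style loop keeps a memory stream of past turns, retrieves the most relevant ones for each new input, and prepends them to the prompt. In this sketch, `embed` and `llm` are hypothetical placeholder callables.

```python
import numpy as np

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

class SelfControlledMemory:
    """Minimal sketch of a self-controlled memory loop: a memory stream plus a
    controller that retrieves the most relevant past turns before each LLM call."""

    def __init__(self, embed, llm, top_k=3):
        self.embed = embed      # hypothetical text -> vector function
        self.llm = llm          # hypothetical prompt -> completion function
        self.top_k = top_k
        self.memory = []        # list of (text, embedding) pairs

    def _retrieve(self, query_vec):
        scored = sorted(self.memory, key=lambda m: cosine(m[1], query_vec), reverse=True)
        return [text for text, _ in scored[: self.top_k]]

    def chat(self, user_input):
        q = self.embed(user_input)
        context = "\n".join(self._retrieve(q))
        prompt = f"Relevant memory:\n{context}\n\nUser: {user_input}\nAssistant:"
        reply = self.llm(prompt)
        # Store both sides of the turn so future inputs can retrieve them.
        for text in (user_input, reply):
            self.memory.append((text, self.embed(text)))
        return reply
```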


🖥 Github: https://github.com/toufunao/SCM4LLMs

Paper: https://arxiv.org/abs/2304.13343v1

📌 Tasks: https://paperswithcode.com/task/language-modelling

@Machine_learn
ZipIt! Merging Models from Different Tasks without Training

ZipIt combines completely distinct models with different initializations, each solving a separate task, into one multi-task model without any additional training.
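
For intuition only: the snippet below merges two same-architecture checkpoints by plain parameter averaging, which is a far simpler stand-in for ZipIt's feature-matching "zip" operation; see the repo for the actual algorithm.

```python
import torch

def average_merge(state_dict_a, state_dict_b, alpha=0.5):
    """Naive weight interpolation between two same-architecture models.
    This is NOT ZipIt's matching-based merge; it only illustrates the
    'combine two checkpoints into one' idea in the simplest possible way."""
    merged = {}
    for name, wa in state_dict_a.items():
        wb = state_dict_b[name]
        merged[name] = alpha * wa + (1.0 - alpha) * wb
    return merged

# Usage sketch (checkpoint paths are hypothetical):
# model.load_state_dict(average_merge(torch.load("task_a.pt"), torch.load("task_b.pt")))
```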


🖥 Github: https://github.com/gstoica27/zipit

Paper: https://arxiv.org/abs/2305.03053v1

📌 Dataset: https://paperswithcode.com/dataset/nabirds

@Machine_learn
pymbook.pdf
1.1 MB
Book: Python for you and me
Release 0.5.beta1
Authors: Kushal Das
ISBN: Null
year: 2023
pages: 175
Tags: #Python #Code
@Machine_learn
Discover and Cure: Concept-aware Mitigation of Spurious Correlation

🖥 Github: https://github.com/wuyxin/disc

Paper: https://arxiv.org/pdf/2305.00650v1.pdf

💨 Dataset: https://paperswithcode.com/dataset/metashift

@Machine_learn
Deep-Learning-for-Natural-Language-Processing.pdf
7.3 MB
Book: Deep Learning for Natural Language Processing (Creating Neural Networks with Python)
Authors: Palash Goyal, Sumit Pandey, Karan Jain
ISBN: 978-1-4842-3685-7
year: 2018
pages: 290
Tags: #NLP #DL #Python #Code
@Machine_learn
LLM-Pruner: On the Structural Pruning of Large Language Models

Compress your LLMs to any size.


🖥 Github: https://github.com/horseee/llm-pruner

Paper: https://arxiv.org/abs/2305.11627v1

📌 Dataset: https://paperswithcode.com/dataset/piqa

@Machine_learn
QLoRA: Efficient Finetuning of Quantized LLMs

The resulting model, named Guanaco, outperforms all previously released open models on the Vicuna benchmark, reaching 99.3% of ChatGPT's performance level while requiring only 24 hours of finetuning on a single GPU.
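
A minimal finetuning sketch in the spirit of QLoRA: load the base model in 4-bit NF4 with bitsandbytes and attach LoRA adapters via PEFT. The model id and hyperparameters below are placeholders, not the paper's exact recipe.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

model_id = "huggyllama/llama-7b"  # placeholder base model

# 4-bit NF4 quantization with double quantization, as used by QLoRA.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, quantization_config=bnb_config, device_map="auto"
)

# Attach low-rank adapters; only these small matrices are trained.
lora_config = LoraConfig(
    r=64, lora_alpha=16, lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # a small fraction of the full model's weights
```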


🖥 Github: https://github.com/artidoro/qlora

Paper: https://arxiv.org/abs/2305.14314

⭐️ Demo: https://huggingface.co/spaces/uwnlp/guanaco-playground-tgi

📌 Dataset: https://paperswithcode.com/dataset/ffhq

@Machine_learn
Hiera: A Hierarchical Vision Transformer without the Bells-and-Whistles

Hiera is a hierarchical vision transformer that is fast, powerful, and, above all, simple. It outperforms the state-of-the-art across a wide array of image and video tasks while being much faster.

pip install hiera-transformer
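
A minimal usage sketch; the constructor name and checkpoint id follow the repository README and are assumptions that may differ by package version.

```python
import torch
import hiera  # installed via `pip install hiera-transformer`

# Load a pretrained Hiera-Base model (constructor/checkpoint names assumed from the README).
model = hiera.hiera_base_224(pretrained=True, checkpoint="mae_in1k_ft_in1k")
model.eval()

# Classify a dummy 224x224 image with the ImageNet-1k head.
x = torch.randn(1, 3, 224, 224)
with torch.no_grad():
    logits = model(x)
print(logits.shape)  # expected: torch.Size([1, 1000])
```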

🖥 Github: https://github.com/facebookresearch/hiera

Paper: https://arxiv.org/abs/2306.00989v1

📌 Dataset: https://paperswithcode.com/dataset/inaturalist

@Machine_learn
25_Awesome_Python_Scripts.pdf
171.4 KB
A Collection of 25 Awesome Python Scripts (mini projects)
#Python #Mini_Projects
@Machine_learn
Hello, we have put a 50% discount on the machine learning and deep learning packages for friends who need them. Let me know if you are interested.
@Raminmousa
🦍 Gorilla: Large Language Model Connected with Massive APIs

Gorilla is a finetuned LLaMA-based model that surpasses the performance of GPT-4 at writing API calls.


🖥 Github: https://github.com/ShishirPatil/gorilla

📕 Paper: https://arxiv.org/abs/2305.15334

🔗 Demo: https://drive.google.com/file/d/1E0k5mG1mTiaz0kukyK1PdeohJipTFh6j/view?usp=share_link

👉 Project: https://shishirpatil.github.io/gorilla/

⭐️ Colab: https://colab.research.google.com/drive/1DEBPsccVLF_aUnmD0FwPeHFrtdC0QIUP?usp=sharing

@Machine_learn
Segment Anything 3D

SAM-3D: a toolbox that transfers 2D SAM segmentation masks onto 3D scene-level point clouds.
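
Conceptually, the transfer projects each 3D point into an RGB frame using the camera intrinsics and extrinsics and copies the 2D SAM segment label it lands on. The sketch below shows only that projection step and is not the toolbox's actual API.

```python
import numpy as np

def project_labels_to_points(points, mask, K, world_to_cam):
    """Assign 2D segment ids to 3D points by pinhole projection.

    points       : (N, 3) xyz in world coordinates
    mask         : (H, W) integer SAM segment ids for one RGB frame
    K            : (3, 3) camera intrinsics
    world_to_cam : (4, 4) extrinsic matrix
    Returns (N,) segment id per point, -1 if the point falls outside the image.
    """
    N = points.shape[0]
    homo = np.hstack([points, np.ones((N, 1))])    # (N, 4) homogeneous coords
    cam = (world_to_cam @ homo.T).T[:, :3]          # (N, 3) camera coordinates
    valid = cam[:, 2] > 0                           # keep points in front of the camera
    uv = (K @ cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]                     # perspective divide -> pixel coords
    labels = np.full(N, -1, dtype=np.int64)
    H, W = mask.shape
    u, v = uv[:, 0].astype(int), uv[:, 1].astype(int)
    inside = valid & (u >= 0) & (u < W) & (v >= 0) & (v < H)
    labels[inside] = mask[v[inside], u[inside]]
    return labels
```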

🖥 Github: https://github.com/pointcept/segmentanything3d

Paper: https://arxiv.org/abs/2306.03908v1

📌 Dataset: https://paperswithcode.com/dataset/scannet
@Machine_learn
Data Science Interview (en).pdf
849.5 KB
Book: Data Science Interview Guide (ACE-PREP)
Authors: null
ISBN: 978-1-915002-10-5
year: 2022
pages: 136
Tags: #Data_Science
@Machine_learn