Topic: attention-mechanisms (Goto Github)
Something interesting about attention-mechanisms
Neat (Neural Attention) Vision is a visualization tool for the attention mechanisms of deep-learning models for Natural Language Processing (NLP) tasks (framework-agnostic).
User: cbaziotis
🦖 PyTorch implementations of popular attention mechanisms, Vision Transformers, MLP-like models, and CNNs. 🔥🔥🔥
User: changzy00
A comprehensive paper list on Vision Transformers and attention, including papers, code, and related websites.
User: cmhungsteve
Code for my master's thesis project, whose main goal was to study and improve attention mechanisms for trajectory prediction of moving agents.
User: elbuco1
Learning YOLOv3 from scratch (a from-scratch study of the YOLOv3 code).
Organization: giantpandacv
Home Page: https://w.url.cn/s/ApNXPJX
Multi-head attention for image classification.
User: johnsmithm
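To make entries like the one above concrete, here is a minimal, framework-agnostic sketch of multi-head attention in plain NumPy. It is a sketch under simplifying assumptions (a single sequence, no learned projection matrices); the name `multi_head_attention` is illustrative and does not come from any repository listed here.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stabilized softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(q, k, v, num_heads):
    # q, k, v: (seq_len, d_model); d_model must be divisible by num_heads
    seq_len, d_model = q.shape
    d_head = d_model // num_heads
    # split the model dimension into heads: (num_heads, seq_len, d_head)
    split = lambda x: x.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    qh, kh, vh = split(q), split(k), split(v)
    # scaled dot-product attention, computed independently per head
    scores = qh @ kh.transpose(0, 2, 1) / np.sqrt(d_head)
    out = softmax(scores) @ vh
    # merge the heads back into the model dimension
    return out.transpose(1, 0, 2).reshape(seq_len, d_model)

x = np.random.default_rng(0).standard_normal((4, 8))
y = multi_head_attention(x, x, x, num_heads=2)
print(y.shape)  # (4, 8)
```

Real implementations add learned linear projections per head and an output projection; the split/attend/merge structure is the same.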
Implementation of the paper "LongRoPE: Extending LLM Context Window Beyond 2 Million Tokens".
User: jshuadvd
Home Page: https://arxiv.org/pdf/2402.13753.pdf
PyTorch dual-attention LSTM autoencoder for multivariate time series.
User: julesbelveze
PyTorch implementation of Jamba: "Jamba: A Hybrid Transformer-Mamba Language Model".
User: kyegomez
Home Page: https://discord.gg/7VckQVxvKk
Implementation of the plug-and-play attention from "LongNet: Scaling Transformers to 1,000,000,000 Tokens".
User: kyegomez
Home Page: https://discord.gg/qUtxnK2NMf
Integrating Mamba/SSMs with Transformers for enhanced long-context, high-quality sequence modeling.
User: kyegomez
Home Page: https://discord.gg/GYbXvDGevY
Official PyTorch implementation of "Rotate to Attend: Convolutional Triplet Attention Module" [WACV 2021].
User: landskape-ai
Implementation of Agent Attention in PyTorch.
User: lucidrains
Implementation of AlphaFold 3 in PyTorch.
User: lucidrains
Implementation of AudioLM, a SOTA language-modeling approach to audio generation out of Google Research, in PyTorch.
User: lucidrains
Implementation of Block Recurrent Transformer in PyTorch.
User: lucidrains
Implementation of Band-Split RoFormer, a SOTA attention network for music source separation out of ByteDance AI Labs.
User: lucidrains
Implementation of CALM from the paper "LLM Augmented LLMs: Expanding Capabilities through Composition", out of Google DeepMind.
User: lucidrains
Implementation of the conditionally routed attention from the CoLT5 architecture, in PyTorch.
User: lucidrains
Implementation of the transformer proposed in "Building Blocks for a Complex-Valued Transformer Architecture".
User: lucidrains
Implementation of Diffusion Policy, Toyota Research's purported breakthrough in leveraging DDPMs to learn policies for real-world robotics.
User: lucidrains
Implementation of the Equiformer, an SE(3)/E(3)-equivariant attention network that reaches a new SOTA and was adopted by EquiFold for protein folding.
User: lucidrains
Implementation of Flash Attention in JAX.
User: lucidrains
Implementation of fused cosine-similarity attention in the same style as Flash Attention.
User: lucidrains
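As a rough illustration of the cosine-similarity variant in the entry above (the unfused math, not the fused kernel itself): l2-normalizing queries and keys bounds each logit to [-scale, scale], which tames the numerics. A minimal NumPy sketch, assuming a single sequence and a fixed scale:

```python
import numpy as np

def cosine_sim_attention(q, k, v, scale=10.0):
    # l2-normalize queries and keys so each logit is a cosine
    # similarity in [-1, 1], then multiply by a fixed scale
    qn = q / np.linalg.norm(q, axis=-1, keepdims=True)
    kn = k / np.linalg.norm(k, axis=-1, keepdims=True)
    scores = scale * (qn @ kn.T)
    # standard (numerically stabilized) softmax over the keys
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = e / e.sum(axis=-1, keepdims=True)
    return weights @ v

rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((5, 8)) for _ in range(3))
out = cosine_sim_attention(q, k, v)
print(out.shape)  # (5, 8)
```

In the actual repo the scale is learned per head and the computation is fused for memory efficiency; this sketch only shows the normalization idea.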
Unofficial implementation of iTransformer, SOTA time-series forecasting using attention networks, out of Tsinghua / Ant Group.
User: lucidrains
An implementation of local windowed attention for language modeling.
User: lucidrains
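Local windowed attention, as in the entry above, restricts each position to a fixed-size causal window of recent tokens, reducing the quadratic cost to O(n * window). A minimal sketch in NumPy (single sequence, single head; real implementations bucket the sequence into blocks for efficiency):

```python
import numpy as np

def local_attention(q, k, v, window):
    # causal local mask: position i may attend to j iff i - window < j <= i
    n = q.shape[0]
    i, j = np.arange(n)[:, None], np.arange(n)[None, :]
    allowed = (j <= i) & (j > i - window)
    scores = q @ k.T / np.sqrt(q.shape[-1])
    # disallowed positions get -inf, so softmax gives them zero weight
    scores = np.where(allowed, scores, -np.inf)
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))
    return (e / e.sum(axis=-1, keepdims=True)) @ v

rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((6, 4)) for _ in range(3))
out = local_attention(q, k, v, window=3)
print(out.shape)  # (6, 4)
```

Every row has at least one allowed position (its own), so the softmax is always well defined.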
Implementation of the MagViT2 tokenizer in PyTorch.
User: lucidrains
Implementation of Make-A-Video, Meta AI's new SOTA text-to-video generator, in PyTorch.
User: lucidrains
Implementation of ChatGPT, but tailored towards primary-care medicine, with the reward being the ability to collect patient histories thoroughly and efficiently and come up with a reasonable differential diagnosis.
User: lucidrains
Implementation of Mega, the single-head attention with multi-headed EMA architecture that currently holds SOTA on Long Range Arena.
User: lucidrains
Implementation of MEGABYTE, "Predicting Million-byte Sequences with Multiscale Transformers", in PyTorch.
User: lucidrains
Implementation of MeshGPT, SOTA mesh generation using attention, in PyTorch.
User: lucidrains
Some personal experiments around routing tokens to different autoregressive attention branches, akin to mixture-of-experts.
User: lucidrains
Implementation of a single layer of MMDiT, proposed in Stable Diffusion 3, in PyTorch.
User: lucidrains
Implementation of Muse, "Text-to-Image Generation via Masked Generative Transformers", in PyTorch.
User: lucidrains
Implementation of MusicLM, Google's new SOTA model for music generation using attention networks, in PyTorch.
User: lucidrains
Implementation of RLHF (Reinforcement Learning from Human Feedback) on top of the PaLM architecture; essentially ChatGPT, but with PaLM.
User: lucidrains
Implementation of Phenaki Video, which uses MaskGIT to produce text-guided videos of up to 2 minutes in length, in PyTorch.
User: lucidrains
Implementation of Q-Transformer, "Scalable Offline Reinforcement Learning via Autoregressive Q-Functions", out of Google DeepMind.
User: lucidrains
Implementation of the Recurrent Interface Network (RIN), for highly efficient generation of images and video without cascading networks, in PyTorch.
User: lucidrains
Implementation of the Recurrent Memory Transformer (NeurIPS 2022 paper) in PyTorch.
User: lucidrains
Implementation of RT-1 (Robotics Transformer) in PyTorch.
User: lucidrains
Experiments around a simple idea for inducing multiple hierarchical predictive models within a GPT.
User: lucidrains
Explorations into the recently proposed Taylor-series linear attention.
User: lucidrains
Implementation of Toolformer, "Language Models That Can Use Tools", by Meta AI.
User: lucidrains
Implementation of Transframer, DeepMind's U-Net + Transformer architecture for generating videos of up to 30 seconds, in PyTorch.
User: lucidrains
Implementation of Zorro, a masked multimodal Transformer, in PyTorch.
User: lucidrains
This repository contains various attention mechanisms (Bahdanau, soft, additive, hierarchical attention, etc.) in PyTorch, TensorFlow, and Keras.
User: monk1337
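The Bahdanau (additive) variant named in the entry above scores each key against the query with a small feed-forward network instead of a dot product. A minimal NumPy sketch, with randomly initialized stand-ins for the learned parameters `W_q`, `W_k`, and `v_a`:

```python
import numpy as np

def additive_attention(query, keys, values, W_q, W_k, v_a):
    # Bahdanau-style score: v_a . tanh(W_q q + W_k k_j) for each key k_j
    scores = np.tanh(query @ W_q + keys @ W_k) @ v_a   # (seq_len,)
    e = np.exp(scores - scores.max())
    weights = e / e.sum()
    return weights @ values, weights  # context vector, attention weights

rng = np.random.default_rng(0)
d, seq_len = 4, 5
ctx, w = additive_attention(
    rng.standard_normal(d),             # query
    rng.standard_normal((seq_len, d)),  # keys
    rng.standard_normal((seq_len, d)),  # values
    rng.standard_normal((d, d)),        # W_q (stand-in for a learned matrix)
    rng.standard_normal((d, d)),        # W_k (stand-in for a learned matrix)
    rng.standard_normal(d),             # v_a (stand-in for a learned vector)
)
```

The weights form a probability distribution over the keys; in a trained model the three parameters are learned jointly with the rest of the network.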
Awesome list of attention modules and plug-and-play modules in computer vision.
User: pprp
Sparse and structured neural attention mechanisms.
User: vene
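Sparse attention mechanisms like those in the last entry replace softmax with projections such as sparsemax (Martins & Astudillo, 2016), which can assign exactly zero weight to irrelevant positions. A minimal NumPy sketch of sparsemax for a single score vector:

```python
import numpy as np

def sparsemax(z):
    # sort scores in descending order and find the support size k_z:
    # the largest k with 1 + k * z_(k) > sum of the top-k sorted scores
    z_sorted = np.sort(z)[::-1]
    cum = np.cumsum(z_sorted)
    k = np.arange(1, len(z) + 1)
    support = k * z_sorted > cum - 1
    k_z = support.sum()
    # threshold tau shifts the scores so the result sums to 1
    tau = (cum[k_z - 1] - 1) / k_z
    return np.maximum(z - tau, 0.0)

p = sparsemax(np.array([2.0, 1.0, -1.0]))
print(p)  # [1. 0. 0.] (only the top-scoring entry keeps probability mass)
```

Like softmax, the output is a probability distribution, but low-scoring entries are clipped to exactly zero rather than receiving small positive mass.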