Eternal Reclaimer's Projects
Transform Transformers Into Liquid Transformers
Experiments with Liquid Neural Nets
Implementation of "LM-Infinite: Simple On-the-Fly Length Generalization for Large Language Models"
Plug-and-play implementation of "Certified Reasoning with Language Models" that improves model reasoning by 40%
Plug-and-play implementation of the attention mechanism from "LongNet: Scaling Transformers to 1,000,000,000 Tokens"
All mathematics research papers sourced from arXiv and meticulously curated for LLM pretraining.
Democratization of Med-Flamingo, "Med-Flamingo: A Multimodal Medical Few-shot Learner"
Towards Generalist Biomedical AI
Tree of Thoughts with a meta prompt for a 50% boost in model reasoning
Democratization of "Cinematic Mindscapes: High-quality Video Reconstruction from Brain Activity" in PyTorch
Implementation of Minerva from "Minerva: Solving Quantitative Reasoning Problems with Language Models"
"Multimodal Instruction Tuning", An SOTA video dataset
A template for deploying ultra powerful LLMs effortlessly with the best optimizations
Boilerplate template to train AI models on AWS with CUDA
An experimental inquiry into Monte Carlo Tree-of-Thoughts algorithmic systems
MOSS-RLHF
🔥 chat with over 10K frames of video!
MPBS - Model PreTraining Benchmarking Suite to measure various metrics for PyTorch models before pretraining, training, fine-tuning, or inference.
Multi-Modality Trees of Thoughts
Implementation of the all-new paper "GQA: Training Generalized Multi-Query Transformer Models from Multi-Head Checkpoints"
A generalized self-supervised training paradigm for unimodal and multimodal alignment and fusion.
A simple PyTorch implementation of high-performance Multi-Query Attention (see the sketch after this list)
NanoForge is a powerful Rust package specifically crafted for designing and manufacturing ultra-high-performance nanomachines. It is positioned to be the PyTorch equivalent in the nanotechnology domain, merging simplicity with high performance.
open-source nanotech CAD
A Transformer GAN with reinforcement learning for generating novel, ready-to-mass-manufacture nanomachines
1 Loss Function For Everything
A Multi-Modality Foundation Model for humanoid robots
The open source implementation of "NeVA: NeMo Vision and Language Assistant"
🧬 Nucleotide Transformer: Building and Evaluating Robust Foundation Models for Human Genomics
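For the Multi-Query Attention entry above, here is a minimal, illustrative PyTorch sketch of the core idea; the class and variable names are hypothetical and not taken from the repository. The key point is that every query head keeps its own projection while a single key/value head is shared across all of them, which shrinks the KV footprint during decoding.

```python
import torch
import torch.nn as nn

class MultiQueryAttention(nn.Module):
    """Illustrative Multi-Query Attention: many query heads share one K/V head."""

    def __init__(self, dim: int, num_heads: int):
        super().__init__()
        assert dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        # Queries get one projection per head; keys/values use a single shared head.
        self.q_proj = nn.Linear(dim, dim, bias=False)
        self.k_proj = nn.Linear(dim, self.head_dim, bias=False)
        self.v_proj = nn.Linear(dim, self.head_dim, bias=False)
        self.out_proj = nn.Linear(dim, dim, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, t, _ = x.shape
        # (b, heads, t, head_dim)
        q = self.q_proj(x).view(b, t, self.num_heads, self.head_dim).transpose(1, 2)
        # (b, 1, t, head_dim) -- broadcast across all query heads
        k = self.k_proj(x).unsqueeze(1)
        v = self.v_proj(x).unsqueeze(1)
        scores = q @ k.transpose(-2, -1) / self.head_dim ** 0.5
        attn = scores.softmax(dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(b, t, -1)
        return self.out_proj(out)

# Usage example (hypothetical shapes)
x = torch.randn(2, 16, 512)
mqa = MultiQueryAttention(dim=512, num_heads=8)
print(mqa(x).shape)  # torch.Size([2, 16, 512])
```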