Topic: pretrained-language-model Goto Github
Something interesting about pretrained-language-model
pretrained-language-model,Code for the NAACL 2022 paper "Efficient Hierarchical Domain Adaptation for Pretrained Language Models".
User: alexandra-chron
Home Page: https://aclanthology.org/2022.naacl-main.96.pdf
pretrained-language-model,Code associated with the Don't Stop Pretraining ACL 2020 paper
Organization: allenai
pretrained-language-model,Chinese legal LLaMA (LLaMA for the Chinese legal domain)
User: andrewzhe
pretrained-language-model,A comprehensive resource hub compiling all LLM papers accepted at the International Conference on Learning Representations (ICLR) 2024.
User: azminewasi
pretrained-language-model,"Unsupervised Paraphrase Generation using Pre-trained Language Model."
User: bh-so
pretrained-language-model,Implementation of "TransPolymer: a Transformer-based language model for polymer property predictions" in PyTorch
User: changwenxu98
pretrained-language-model,This repository contains the code for the paper in Findings of EMNLP 2021: "EfficientBERT: Progressively Searching Multilayer Perceptron via Warm-up Knowledge Distillation".
User: cheneydon
pretrained-language-model,CEHR-BERT: Incorporating temporal information from structured EHR data to improve prediction tasks
Organization: cumc-dbmi
pretrained-language-model,The official code for "TEMPO: Prompt-based Generative Pre-trained Transformer for Time Series Forecasting" (ICLR 2024). TEMPO is one of the first open-source time series foundation models for forecasting (v1.0).
Organization: dc-research
pretrained-language-model,Official implementation of the ACL 2024 paper: Scientific Inspiration Machines Optimized for Novelty
User: eaglew
Home Page: https://arxiv.org/abs/2305.14259
pretrained-language-model,Code for Stage-wise Fine-tuning for Graph-to-Text Generation
User: eaglew
pretrained-language-model,CoditT5: Pretraining for Source Code and Natural Language Editing
Organization: engineeringsoftware
pretrained-language-model,Implementation of the ICLR 2021 paper: Probing BERT in Hyperbolic Spaces
User: franxyao
pretrained-language-model,BioBART: Pretraining and Evaluation of A Biomedical Generative Language Model [ACL-BioNLP 2022]
User: ganjinzero
Home Page: https://arxiv.org/abs/2204.03905
pretrained-language-model,CODER: Knowledge infused cross-lingual medical term embedding for term normalization. [JBI, ACL-BioNLP 2022]
User: ganjinzero
Home Page: https://www.sciencedirect.com/science/article/pii/S1532046421003129
pretrained-language-model,word2vec, sentence2vec, machine reading comprehension, dialog systems, text classification, pretrained language models (e.g., XLNet, BERT, ELMo, GPT), sequence labeling, information retrieval, information extraction (e.g., entity, relation, and event extraction), knowledge graphs, text generation, network embedding
User: gaoisbest
pretrained-language-model,BERT4ETH: A Pre-trained Transformer for Ethereum Fraud Detection (WWW23)
Organization: git-disl
pretrained-language-model,Translate natural language to SPARQL queries and vice versa
User: heraclex12
pretrained-language-model,EMNLP'23 survey: a curated collection of papers and resources on refreshing large language models (LLMs) without expensive retraining.
Organization: hyintell
pretrained-language-model,ACL'2023: DiffusionBERT: Improving Generative Masked Language Models with Diffusion Models
User: hzfinfdu
pretrained-language-model,WideMLP for Text Classification (see the sketch below)
User: lgalke
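The WideMLP entry above refers to a simple bag-of-words baseline for text classification. Below is a minimal, hedged sketch of that idea in PyTorch; the hidden width, dropout rate, and vocabulary size are illustrative assumptions, not the repository's actual configuration.

```python
# Minimal sketch of a wide MLP text classifier over bag-of-words features.
# Hidden width, dropout, and vocabulary size are illustrative assumptions,
# not the configuration used in the repository above.
import torch
import torch.nn as nn

class WideMLP(nn.Module):
    def __init__(self, vocab_size: int, num_classes: int, hidden: int = 1024, dropout: float = 0.5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Dropout(dropout),            # input dropout acts like word dropout on BoW vectors
            nn.Linear(vocab_size, hidden),  # one wide hidden layer
            nn.ReLU(),
            nn.Dropout(dropout),
            nn.Linear(hidden, num_classes),
        )

    def forward(self, bow: torch.Tensor) -> torch.Tensor:
        return self.net(bow)                # raw logits; pair with nn.CrossEntropyLoss

# Usage: classify a batch of 8 documents over a 30k-term vocabulary into 4 classes.
model = WideMLP(vocab_size=30_000, num_classes=4)
logits = model(torch.rand(8, 30_000))
print(logits.shape)  # torch.Size([8, 4])
```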
pretrained-language-model,MWPToolkit is an open-source framework for math word problem (MWP) solvers.
User: lyh-yf
pretrained-language-model,🥥 Code & data for Comparative Opinion Summarization via Collaborative Decoding (Iso et al.; Findings of ACL 2022)
Organization: megagonlabs
Home Page: https://aclanthology.org/2022.findings-acl.261.pdf
pretrained-language-model,[ICLR 2022] Pretraining Text Encoders with Adversarial Mixture of Training Signal Generators
Organization: microsoft
pretrained-language-model,[NeurIPS 2021] COCO-LM: Correcting and Contrasting Text Sequences for Language Model Pretraining
Organization: microsoft
pretrained-language-model,Foundation Architecture for (M)LLMs
Organization: microsoft
Home Page: https://aka.ms/GeneralAI
pretrained-language-model,Live Training for Open-source Big Models
Organization: openbmb
pretrained-language-model,[EMNLP 2022] Code repo for the paper "COCO-DR: Combating Distribution Shifts in Zero-Shot Dense Retrieval with Contrastive and Distributionally Robust Learning".
Organization: openmatch
Home Page: https://arxiv.org/abs/2210.15212
pretrained-language-model,Papers and Datasets on Instruction Tuning and Following. ✨✨✨
User: renzelou
Home Page: https://arxiv.org/abs/2303.10475
pretrained-language-model,This repository is the official implementation of our EMNLP 2022 paper ELMER: A Non-Autoregressive Pre-trained Language Model for Efficient and Effective Text Generation
Organization: rucaibox
pretrained-language-model,A curated list of pretrained sentence and word embedding models
User: separius
pretrained-language-model,Bamboo-7B Large Language Model
Organization: sjtu-ipads
pretrained-language-model,ELECTRA-based Korean conversational language model
Organization: skplanet
Home Page: https://github.com/SKplanet/Dialog-KoELECTRA
pretrained-language-model,Awesome LLM Self-Consistency: a curated list of work on self-consistency in large language models (see the sketch below)
User: superbrucejia
Home Page: https://github.com/SuperBruceJia/Awesome-LLM-Self-Consistency
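Self-consistency, the topic of the list above, samples several reasoning chains for the same prompt and takes a majority vote over the extracted final answers. The sketch below illustrates only the voting step; `sample_chain` and `extract_answer` are hypothetical placeholders for a model call and an answer parser.

```python
# Sketch of self-consistency decoding: sample several reasoning chains with
# temperature, extract each final answer, and return the most common one.
# `sample_chain` and `extract_answer` are hypothetical placeholders, not a real API.
from collections import Counter
from typing import Callable

def self_consistent_answer(
    prompt: str,
    sample_chain: Callable[[str], str],    # returns one sampled chain-of-thought
    extract_answer: Callable[[str], str],  # pulls the final answer out of a chain
    num_samples: int = 10,
) -> str:
    answers = [extract_answer(sample_chain(prompt)) for _ in range(num_samples)]
    return Counter(answers).most_common(1)[0][0]  # majority vote over sampled answers
```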
pretrained-language-model,Data collator for UL2 and U-PaLM (see the sketch below)
User: theblackcat102
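A UL2-style collator builds denoising examples by corrupting spans of the input and asking the model to reconstruct them. The sketch below is a heavily simplified, T5-style span-corruption routine; the sentinel format, span length, and corruption rate are assumptions for illustration and do not reflect the repository's exact behavior (UL2 additionally mixes several denoisers and prepends a mode token).

```python
# Simplified sketch of span corruption: replace random token spans in the input
# with sentinel tokens and emit the removed spans as the target sequence.
# Sentinel format, span length, and corruption rate are illustrative assumptions.
import random

def corrupt_spans(tokens, noise_density=0.15, mean_span_len=3, sentinel="<extra_id_{}>"):
    budget = max(1, int(len(tokens) * noise_density))  # rough token budget to corrupt
    inputs, targets = [], []
    i, sentinel_id = 0, 0
    while i < len(tokens):
        if budget > 0 and random.random() < noise_density / mean_span_len:
            span = min(mean_span_len, len(tokens) - i, budget)
            tok = sentinel.format(sentinel_id)
            inputs.append(tok)                  # sentinel marks the removed span
            targets.append(tok)
            targets.extend(tokens[i:i + span])  # target reconstructs the span
            i += span
            budget -= span
            sentinel_id += 1
        else:
            inputs.append(tokens[i])
            i += 1
    return inputs, targets

# Usage; a full UL2 collator would also prepend a denoiser/mode token to the input.
inp, tgt = corrupt_spans("the quick brown fox jumps over the lazy dog".split())
print(inp, tgt)
```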
pretrained-language-model,An optimized deep prompt tuning strategy comparable to fine-tuning across scales and tasks (see the sketch below)
Organization: thudm
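Deep prompt tuning keeps the pretrained backbone frozen and trains only small prompt vectors that are injected at every layer. The sketch below illustrates that idea with a toy frozen Transformer encoder; the layer count, dimensions, and the way prompts are concatenated are illustrative assumptions, not the P-Tuning v2 implementation.

```python
# Minimal sketch of deep prompt tuning: learnable prompt vectors are prepended
# to the hidden states at every layer of a frozen transformer. The toy encoder
# and all sizes below are illustrative assumptions, not the repository's code.
import torch
import torch.nn as nn

class DeepPromptEncoder(nn.Module):
    def __init__(self, num_layers=4, d_model=256, prompt_len=16, nhead=4):
        super().__init__()
        self.layers = nn.ModuleList([
            nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
            for _ in range(num_layers)
        ])
        for p in self.layers.parameters():
            p.requires_grad = False          # backbone stays frozen
        # one trainable prompt per layer: the only parameters that get updated
        self.prompts = nn.ParameterList([
            nn.Parameter(torch.randn(prompt_len, d_model) * 0.02)
            for _ in range(num_layers)
        ])

    def forward(self, x):                    # x: (batch, seq, d_model)
        for layer, prompt in zip(self.layers, self.prompts):
            p = prompt.unsqueeze(0).expand(x.size(0), -1, -1)
            # prepend the prompt, run the layer, then drop the prompt positions
            x = layer(torch.cat([p, x], dim=1))[:, prompt.size(0):]
        return x

model = DeepPromptEncoder()
out = model(torch.randn(2, 10, 256))
print(out.shape)  # torch.Size([2, 10, 256])
```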
pretrained-language-model,CokeBERT: Contextual Knowledge Selection and Embedding towards Enhanced Pre-Trained Language Models
Organization: thunlp
pretrained-language-model,A plug-and-play library for parameter-efficient tuning (delta tuning); see the sketch below
Organization: thunlp
Home Page: https://opendelta.readthedocs.io
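Delta tuning libraries attach small trainable "delta" modules to a frozen backbone. One common delta is a LoRA-style low-rank update around a linear layer, sketched generically below; this is not OpenDelta's API, only an illustration of the underlying idea.

```python
# Generic sketch of one kind of "delta" module used in parameter-efficient tuning:
# a LoRA-style low-rank update wrapped around a frozen linear layer.
# This is not OpenDelta's API; names and defaults here are illustrative.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # freeze the pretrained weight
        self.lora_a = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        self.lora_b = nn.Parameter(torch.zeros(base.out_features, rank))
        self.scale = alpha / rank

    def forward(self, x):
        # frozen path plus a trainable low-rank correction
        return self.base(x) + (x @ self.lora_a.T @ self.lora_b.T) * self.scale

layer = LoRALinear(nn.Linear(768, 768))
print(layer(torch.randn(2, 768)).shape)  # torch.Size([2, 768])
```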
pretrained-language-model,On Transferability of Prompt Tuning for Natural Language Processing
Organization: thunlp
Home Page: https://aclanthology.org/2022.naacl-main.290/
pretrained-language-model,EMNLP'2022: BERTScore is Unfair: On Social Bias in Language Model-Based Metrics for Text Generation
User: txsun1997
Home Page: https://arxiv.org/abs/2210.07626
pretrained-language-model,YAYI 2 is a new generation of open-source large language models developed by Zhongke Wenge (中科闻歌), pretrained on more than 2 trillion tokens of high-quality, multilingual corpora. (Repo for YaYi 2 Chinese LLMs)
User: wenge-research
pretrained-language-model,[KDD22] Official PyTorch implementation for "Towards Unified Conversational Recommender Systems via Knowledge-Enhanced Prompt Learning".
User: wxl1999
pretrained-language-model,Summarization Papers
User: xcfcode
pretrained-language-model,AiSpace: Better practices for deep learning model development and deployment for TensorFlow 2.0
User: yingyuankai
Home Page: https://aispace.readthedocs.io/en/latest/index.html
pretrained-language-model,[NeurIPS 2023] This is the code for the paper "Large Language Model as Attributed Training Data Generator: A Tale of Diversity and Bias".
User: yueyu1030
Home Page: https://arxiv.org/abs/2306.15895
pretrained-language-model,[NeurIPS 2022] Generating Training Data with Language Models: Towards Zero-Shot Language Understanding
User: yumeng5
pretrained-language-model,[WWW 2022] Topic Discovery via Latent Space Clustering of Pretrained Language Model Representations
User: yumeng5
pretrained-language-model,The source code for the paper "Empower Entity Set Expansion via Language Model Probing", published at ACL 2020.
User: yzhan238
pretrained-language-model,Worth-reading papers and related resources on attention mechanisms, the Transformer, and pretrained language models (PLMs) such as BERT.
User: zhengzixiang
pretrained-language-model,[Paper][AAAI 2023] DUET: Cross-modal Semantic Grounding for Contrastive Zero-shot Learning
Organization: zjukg
Home Page: https://arxiv.org/abs/2207.01328