
πŸ”₯ Korean GPT-2, KoGPT2 fine-tuning, cased. Trained on Korean lyrics data πŸ”₯

Home Page: https://hipgyung.tistory.com/110

License: Apache License 2.0

nlp gpt2 kogpt2 finetuning korean-nlp fine-tuning korean text-generation lyrics-generator language-model

kogpt2-finetuning's Introduction

KoGPT2-FineTuning


SKT-AIμ—μ„œ μ•½ 20GB의 ν•œκ΅­μ–΄ 데이터λ₯Ό Pre-Training μ‹œν‚¨ KoGPT2λ₯Ό μ‚¬μš©ν–ˆμŠ΅λ‹ˆλ‹€. 첫 번째둜 가사 μž‘μ‚¬λ₯Ό μœ„ν•΄μ„œ, μ €μž‘κΆŒμ΄ 만료된 μ •μ œλœ 가사 데이터, μ†Œμ„€, 기사 등을 Dataλ³„λ‘œ weightλ₯Ό λ‹€λ₯΄κ²Œ μ£Όλ©° Fine-tuning ν•˜μ˜€μŠ΅λ‹ˆλ‹€. λ˜ν•œ μž₯λ₯΄λ„ λ°›μ•„μ„œ μŒμ•… μž₯λ₯΄λ³„ 가사 ν•™μŠ΅ κ²°κ³Όλ₯Ό λ³Ό 수 μžˆμŠ΅λ‹ˆλ‹€.

λ˜ν•œ Colabμ—μ„œλŠ” μ›ν™œν•œ ν•™μŠ΅μ„ μœ„ν•΄μ„œ Google Drive와 Dropbbox을 μ—°λ™ν–ˆμŠ΅λ‹ˆλ‹€. ν•™μŠ΅ν•œ 쀑간 κ²°κ³Όλ₯Ό Google Driveμ—μ„œ Dropbbox둜 μ΄λ™μ‹œν‚¨ ν›„, Google Driveμ—μ„œ ν•΄λ‹Ή κ²°κ³Όλ₯Ό μ‚­μ œν•˜κ²Œ ν•©λ‹ˆλ‹€. 이와 κ΄€λ ¨λœ Code

μŒμ•… μž₯λ₯΄λ³„λ‘œ, CSV ν˜•μ‹μ˜ Dataset을 λ°›λŠ” 바뀐 Version 2의 Code둜 KoGPT2-FineTuning μž‘μ—…μ„ ν•˜κΈ° μ–΄λ ΅λ‹€λ©΄, Version 1.1을 μ΄μš©ν•˜κΈΈ λ°”λžλ‹ˆλ‹€.

μ•„λž˜μ—μ„œ, λ‹€μ–‘ν•œ ν•œκ΅­μ–΄ 가사λ₯Ό ν•™μŠ΅ν•œ κ²°κ³Όλ₯Ό 확인 ν•  수 μžˆμŠ΅λ‹ˆλ‹€. μš°λ¦¬λŠ” 이외에도 λ‹€μ–‘ν•œ ν”„λ‘œμ νŠΈλ₯Ό 진행할 κ²ƒμž…λ‹ˆλ‹€.

Sample

Data structure

weight Genre lyrics
1100.0 λ°œλΌλ“œ (ballad) 'λ‚΄ λ§˜μ„ μ•Œμž–μ•„μš”\n\n\nλ°”λ‘œμ²˜λŸΌ λ©ν•˜λ‹ˆ μ„œ μžˆλŠ” λͺ¨μŠ΅λ§Œ\n\n\n바라보닀\n\n\n포기할 수 밖에 μ—†μ–΄μ„œ...'
...
3 columns x 200,000 rows
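Assuming the column names shown above (weight, Genre, lyrics) and a standard CSV header row, the dataset can be read with the standard library. This is an illustrative sketch, not the repository's actual loader:

```python
import csv
import io

def load_lyrics_rows(fp):
    """Read the weight/Genre/lyrics CSV into a list of (float, str, str)
    tuples. Column names are assumed from the table above."""
    reader = csv.DictReader(fp)
    rows = []
    for rec in reader:
        rows.append((float(rec["weight"]), rec["Genre"], rec["lyrics"]))
    return rows

# Tiny in-memory example; quoted fields may span multiple lines,
# which the csv module handles transparently.
sample = io.StringIO(
    "weight,Genre,lyrics\n"
    '1100.0,ballad,"line one\nline two"\n'
)
rows = load_lyrics_rows(sample)
```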

Fine Tuning

python main.py --epoch=200 --data_file_path=./dataset/lyrics_dataset.csv --save_path=./checkpoint/ --load_path=./checkpoint/genre/KoGPT2_checkpoint_296000.tar --batch_size=1

parser

parser.add_argument('--epoch', type=int, default=200,
					help="Number of training epochs.")
parser.add_argument('--save_path', type=str, default='./checkpoint/',
					help="Path where training checkpoints are saved.")
parser.add_argument('--load_path', type=str, default='./checkpoint/Alls/KoGPT2_checkpoint_296000.tar',
					help="Path of the trained checkpoint to load.")
parser.add_argument('--samples', type=str, default="samples/",
					help="Path where generated samples are saved.")
parser.add_argument('--data_file_path', type=str, default='dataset/lyrics_dataset.txt',
					help="Path of the training data to load.")
parser.add_argument('--batch_size', type=int, default=8,
					help="Batch size.")
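To see how the Fine Tuning command above maps onto these flags, the parser can be reconstructed (help strings omitted) and fed the same arguments. This is a self-contained sketch mirroring the snippet above, not the repository's main.py:

```python
import argparse

# Reconstruction of the repository's argument parser, showing how the
# fine-tuning command line maps onto parsed values and defaults.
parser = argparse.ArgumentParser()
parser.add_argument('--epoch', type=int, default=200)
parser.add_argument('--save_path', type=str, default='./checkpoint/')
parser.add_argument('--load_path', type=str,
                    default='./checkpoint/Alls/KoGPT2_checkpoint_296000.tar')
parser.add_argument('--samples', type=str, default='samples/')
parser.add_argument('--data_file_path', type=str,
                    default='dataset/lyrics_dataset.txt')
parser.add_argument('--batch_size', type=int, default=8)

# The same flags used in the "Fine Tuning" command above.
args = parser.parse_args([
    '--epoch=200',
    '--data_file_path=./dataset/lyrics_dataset.csv',
    '--batch_size=1',
])
```

Flags not given on the command line (like --save_path here) keep their defaults.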

Use Colab

Open In Colab

Colab을 μ΄μš©ν•΄μ„œ Fine-tuning Codeλ₯Ό μ‹€ν–‰ν•  수 μžˆμŠ΅λ‹ˆλ‹€.

Runtime Disconnection Prevention

function ClickConnect() {
    // "Could not allocate backend."
    // "Cannot use a backend with a GPU. Would you like to use a runtime without an accelerator?"
    // Find the cancel button of that dialog and click it.
    var buttons = document.querySelectorAll("colab-dialog.yes-no-dialog paper-button#cancel");
    buttons.forEach(function(btn) {
        btn.click();
    });
    console.log("Reconnecting every minute");
    document.querySelector("#top-toolbar > colab-connect-button").click();
}
setInterval(ClickConnect, 1000 * 60);

Clear the screen every 10 minutes

function CleanCurrentOutput() {
    // The [title$='...'] selector below matches the Korean-locale Colab UI
    // ("currently running..."); adjust it for other UI languages.
    var btn = document.querySelector(".output-icon.clear_outputs_enabled.output-icon-selected[title$='ν˜„μž¬ μ‹€ν–‰ 쀑...'] iron-icon[command=clear-focused-or-selected-outputs]");
    if (btn) {
        console.log("Clearing output every 10 minutes");
        btn.click();
    }
}
setInterval(CleanCurrentOutput, 1000 * 60 * 10);

GPU Memory Check

nvidia-smi.exe

generator

python generator.py --temperature=1.0 --text_size=1000 --tmp_sent=""

ν‘œμ ˆ μ—†μŒ

python generator.py --temperature=5.0 --text_size=500 --tmp_sent=""

parser

parser.add_argument('--temperature', type=float, default=0.7,
					help="Controls the creativity of the generated text via the sampling temperature.")
parser.add_argument('--top_p', type=float, default=0.9,
					help="Controls the range of expression via nucleus (top-p) sampling.")
parser.add_argument('--top_k', type=int, default=40,
					help="Controls the range of expression via top-k sampling.")
parser.add_argument('--text_size', type=int, default=250,
					help="Adjusts the length of the output.")
parser.add_argument('--loops', type=int, default=-1,
					help="How many times to repeat generation; -1 repeats indefinitely.")
parser.add_argument('--tmp_sent', type=str, default="μ‚¬λž‘",
					help="The opening sentence of the text.")
parser.add_argument('--load_path', type=str, default="./checkpoint/Alls/KoGPT2_checkpoint_296000.tar",
					help="Path of the trained checkpoint to load.")

Use Colab

Open In Colab

Colab을 μ΄μš©ν•΄μ„œ generatorλ₯Ό μ‹€ν–‰ν•  수 μžˆμŠ΅λ‹ˆλ‹€.

tensorboard

ν•™μŠ΅μ— λ”°λ₯Έ λ³€ν™”λ₯Ό ν™•μΈν•˜κΈ° μœ„ν•΄μ„œ, tensorboard둜 μ ‘κ·Όν•˜μ—¬ loss와 textλ₯Ό ν™•μΈν•©λ‹ˆλ‹€.

tensorboard --logdir=runs

loss

text

Citation

@misc{KoGPT2-FineTuning,
  author = {gyung},
  title = {KoGPT2-FineTuning},
  year = {2020},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/gyunggyung/KoGPT2-FineTuning}},
}

Output

μžμ„Έν•œ 결과물은 samplesμ—μ„œ 확인 ν•  수 μžˆμŠ΅λ‹ˆλ‹€. ν•™μŠ΅μ— λŒ€ν•΄μ„œλŠ” κ΄€λ ¨ ν¬μŠ€νŒ…μ—μ„œ 확인할 수 μžˆμŠ΅λ‹ˆλ‹€.

Reference

https://github.com/openai/gpt-2
https://github.com/nshepperd/gpt-2
https://github.com/SKT-AI/KoGPT2
https://github.com/asyml/texar-pytorch/tree/master/examples/gpt-2
https://github.com/graykode/gpt-2-Pytorch
https://gist.github.com/thomwolf/1a5a29f6962089e871b94cbd09daf317
https://github.com/shbictai/narrativeKoGPT2
https://github.com/ssut/py-hanspell
https://github.com/likejazz/korean-sentence-splitter

kogpt2-finetuning's People

Contributors

dependabot[bot], elenassun, gyunggyung


kogpt2-finetuning's Issues

Error during fine-tuning

Hello.

Thanks to the good material you provided, I was able to run a variety of tests.
Thank you very much.

The reason I am writing is that fine-tuning raises the same error on several different datasets.

The error is shown below. If you happen to know what causes it... I would appreciate some guidance.
Traceback (most recent call last):
  File "main.py", line 187, in <module>
    main(args.epoch, args.save_path, args.load_path, args.samples, args.data_file_path, args.batch_size)
  File "main.py", line 144, in main
    outputs = model(data, labels=data)
  File "/opt/conda/lib/python3.7/site-packages/torch/nn/modules/module.py", line 550, in __call__
    result = self.forward(*input, **kwargs)
  File "/root/data/KoGPT2-FineTuning/kogpt2/model/torch_gpt2.py", line 588, in forward
    inputs_embeds=inputs_embeds,
  File "/opt/conda/lib/python3.7/site-packages/torch/nn/modules/module.py", line 550, in __call__
    result = self.forward(*input, **kwargs)
  File "/root/data/KoGPT2-FineTuning/kogpt2/model/torch_gpt2.py", line 474, in forward
    hidden_states, layer_past=layer_past, attention_mask=attention_mask, head_mask=head_mask[i]
  File "/opt/conda/lib/python3.7/site-packages/torch/nn/modules/module.py", line 550, in __call__
    result = self.forward(*input, **kwargs)
  File "/root/data/KoGPT2-FineTuning/kogpt2/model/torch_gpt2.py", line 227, in forward
    self.ln_1(x), layer_past=layer_past, attention_mask=attention_mask, head_mask=head_mask
  File "/opt/conda/lib/python3.7/site-packages/torch/nn/modules/module.py", line 550, in __call__
    result = self.forward(*input, **kwargs)
  File "/root/data/KoGPT2-FineTuning/kogpt2/model/torch_gpt2.py", line 190, in forward
    attn_outputs = self._attn(query, key, value, attention_mask, head_mask)
  File "/root/data/KoGPT2-FineTuning/kogpt2/model/torch_gpt2.py", line 146, in _attn
    w = w * b - 1e4 * (1 - b)
RuntimeError: The size of tensor a (1031) must match the size of tensor b (1024) at non-singleton dimension 3
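The mismatch in the traceback (1031 vs. 1024) indicates an input longer than KoGPT2's 1024-token context window, so the attention bias tensor no longer lines up. One common workaround, sketched here as a hypothetical helper rather than the repository's fix, is to window over-long token sequences before batching:

```python
def window_tokens(token_ids, max_len=1024, stride=512):
    """Split an over-long token sequence into overlapping chunks of at most
    max_len tokens so every batch fits the model's context window.

    Hypothetical helper for illustration; stride < max_len gives the chunks
    some overlap so no context is lost at the boundaries.
    """
    if len(token_ids) <= max_len:
        return [list(token_ids)]
    chunks = []
    for start in range(0, len(token_ids) - max_len + stride, stride):
        chunks.append(list(token_ids[start:start + max_len]))
    return chunks
```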

토큰 μƒμ„±μ‹œ 였λ₯˜

I am the NLP student who contacted you by email a while ago.
At first, generation worked well, matching Text_size = 100, as in the picture below.

ok

After a certain point, however, fewer than 10 tokens are generated and output is no longer produced properly. If you have seen the same behavior or know a fix, I would like to hear it. I tried changing the shape of the dataset, and re-cloned the code in case I had broken the sample_sequence function, but the same thing keeps happening. After the problem starts, generation looks like the picture below.

not-ok

Basic usage questions

Hello :)
I am a student who started deep learning not long ago.
First of all, thank you for sharing such a good resource.
I have many questions, and I would be grateful for your answers.

  1. In the fine-tuning step, is the dataset among the parameters below something I have to build myself?
    python main.py --epoch=200 --data_file_path=./dataset/All_make_lyrics_dataset.txt --save_path=./checkpoint/ --load_path=./checkpoint/auto_enter/KoGPT2_checkpoint_18500.tar --batch_size=8

  2. This repository has no checkpoint directory, yet it reads data from checkpoint. Is this also something I have to set up myself? Additionally, if I run generate.py without training, the model is loaded from checkpoint as shown below. Is this the same situation?
    pytorch_kogpt2 = { 'url': 'checkpoint/pytorch_kogpt2_676e9bcfa7.params', 'fname': 'pytorch_kogpt2_676e9bcfa7.params', 'chksum': '676e9bcfa7' }

  3. In the Colab environment you run it as below, not as !python main.py. I am curious why that works.
    main(temperature=0.9, tmp_sent = "μ‚¬λž‘", text_size = 500, loops = 5, load_path = load_path, samples = samples)

Thank you for reading this long post.

Add EminemGPT

Add code and links. If possible, also rename the repository.

About the memory leak

A memory leak occurs during inference; did you find a way to solve it?
Deleting objects and emptying the cache makes no difference. It is the server's system memory, not GPU memory.

About training newline characters

Hello,

I read issue SKT-AI/KoGPT2#11 with interest.
Could you tell me how you actually handled this when you trained, in the end?

In version 1.1, as described in that issue, training appears to have been done using code like


vocab.token_to_idx["\n"] = vocab.token_to_idx[""]
del vocab.token_to_idx[""]

but that code does not exist in version 2.0, so I am asking.

Or did you perhaps modify the .spiece file instead?

I also need to define new tokens and want them tokenized automatically.
I would really appreciate an answer :)
