
ollama's Introduction

Ollama

This repo collects numerous use cases built on the open-source Ollama project.

Step 1

Go to https://ollama.ai/ and download the setup file

Step 2

Install and start the software

Step 3

Clone the entire repo to your local device using the command `git clone https://github.com/PromptEngineer48/Ollama.git`

Step 4

The repo contains numerous working use cases as separate folders. You can open any folder to test the corresponding use case

Step 5

Join me on my journey on my YouTube channel: https://www.youtube.com/@PromptEngineer48/

ollama's People

Contributors

promptengineer48


ollama's Issues

Embedded data cannot be deleted and keeps reappearing

Hi, please help with this.

I deleted everything in the db/xxxxxxx index folder and replaced the source_documents folder with a new set of files.

However, the old embedded data still pops up in query results.

For example, my old files were .html and my new files are .txt,

but when I run a query it answers from both > source_documents/xxx.txt: and > source_documents/xxx.html:

And even when I delete the database index folder and run again, it gets messier:
now duplicated > source_documents/xxx.txt: answers come out.

How can we clean the database or embedded data properly?

Thanks.
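A full reset generally means deleting the entire persist directory while no process has it open, then re-running ingest.py so the index is rebuilt from scratch. A minimal sketch, assuming the persist directory is named `db` as in the repo layout (adjust the path to your setup):

```python
import shutil
from pathlib import Path

# Assumed persist directory used by ingest.py; adjust to your setup.
DB_DIR = Path("db")

def reset_vectorstore(db_dir: Path) -> bool:
    """Delete the whole Chroma persist directory so the next ingest
    starts from an empty index. Returns True if something was removed."""
    if db_dir.exists():
        shutil.rmtree(db_dir)  # removes index files and metadata together
        return True
    return False

# After this, re-run `python ingest.py` to rebuild from source_documents.
```

Deleting only some files inside the index folder can leave the metadata and the vectors out of sync, which matches the "even messier" behavior described above; removing the whole directory avoids that.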

Using requests?

I thought that when using Ollama you'd be able to run purely locally, but when I run your ingest.py it uses the requests package, which is blocked by my IT.

Your script relies on environment variables. Who sets those variables?

Hi,

https://github.com/PromptEngineer48/Ollama/blob/main/2-ollama-privateGPT-chat-with-docs/privateGPT.py reads a couple of environment variables, like MODEL. Nothing sets those variables.

So when I run `ollama pull codellama` (note the intentional change from mistral to codellama) and then continue following your steps, your script goes bananas. Nothing tells your program that I'm now using the codellama model, so it still wants to use the mistral model, which I don't have, so it fails.
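For reference, the usual pattern for such scripts (not necessarily what privateGPT.py does verbatim) is to read MODEL with a fallback default, so the caller picks the model per run:

```python
import os

# Hypothetical snippet mirroring the script's MODEL variable; the real
# default may differ. The environment variable, if set, wins.
model = os.environ.get("MODEL", "mistral")
print(f"using model: {model}")
```

With that pattern, running `MODEL=codellama python privateGPT.py` selects codellama without editing the script.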

Unable to ingest .docx files

I've been following the steps in the readme and the video tutorial. However, I'm unable to successfully ingest a .docx file. It works fine with .pdf. Anything I need to look into?
This is what I get when I type `python3 ingest.py`:

Creating new vectorstore
Loading documents from source_documents
Loading new documents: 0%| | 0/2 [00:02<?, ?it/s]
multiprocessing.pool.RemoteTraceback:
"""
Traceback (most recent call last):
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/multiprocessing/pool.py", line 125, in worker
result = (True, func(*args, **kwds))
^^^^^^^^^^^^^^^^^^^
File "/Users/rehan.arif/Documents/Chat with docs/Ollama/2-ollama-privateGPT-chat-with-docs/ingest.py", line 84, in load_single_document
return loader.load()
^^^^^^^^^^^^^
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/langchain/document_loaders/unstructured.py", line 86, in load
elements = self._get_elements()
^^^^^^^^^^^^^^^^^^^^
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/langchain/document_loaders/word_document.py", line 122, in _get_elements
from unstructured.partition.docx import partition_docx
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/unstructured/partition/docx.py", line 6, in
import docx
ModuleNotFoundError: No module named 'docx'
"""

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/Users/rehan.arif/Documents/Chat with docs/Ollama/2-ollama-privateGPT-chat-with-docs/ingest.py", line 161, in
main()
File "/Users/rehan.arif/Documents/Chat with docs/Ollama/2-ollama-privateGPT-chat-with-docs/ingest.py", line 151, in main
texts = process_documents()
^^^^^^^^^^^^^^^^^^^
File "/Users/rehan.arif/Documents/Chat with docs/Ollama/2-ollama-privateGPT-chat-with-docs/ingest.py", line 113, in process_documents
documents = load_documents(source_directory, ignored_files)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/rehan.arif/Documents/Chat with docs/Ollama/2-ollama-privateGPT-chat-with-docs/ingest.py", line 102, in load_documents
for i, docs in enumerate(pool.imap_unordered(load_single_document, filtered_files)):
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/multiprocessing/pool.py", line 873, in next
raise value
ModuleNotFoundError: No module named 'docx'
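The traceback bottoms out at `import docx`. That module is distributed under a different name on PyPI: `import docx` is provided by the `python-docx` package, so `pip install python-docx` typically resolves this. A small sketch of that check, with the module-to-package mapping being the key point (the helper name is made up):

```python
import importlib.util
from typing import Optional

# The import name and the pip distribution name differ here:
# `import docx` is provided by the `python-docx` package.
PIP_NAME = {"docx": "python-docx"}

def missing_install_hint(module: str) -> Optional[str]:
    """Return a pip hint when `module` is not importable, else None."""
    if importlib.util.find_spec(module) is None:
        return f"pip install {PIP_NAME.get(module, module)}"
    return None
```

Note that installing the similarly named `docx` package from PyPI is an older, incompatible project; `python-docx` is the one unstructured's .docx partitioner expects.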

What if I want to delete documents? (vector DBs)

Hi, with the current setup for importing documents:
I understand that if I add new documents, it will load the new documents.
What if I want to delete or replace existing embedded documents?
Also, what do I have to change to use a Qdrant database running locally in Docker instead of Chroma?

requests.exceptions.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

Hello,

When I run your project, I get the following error message. Can you help me fix this issue?

(privateGPT) (base) mac@localhost 2-ollama-privateGPT-chat-with-docs % python privateGPT.py

Enter a query: hello
Traceback (most recent call last):
  File "/Users/mac/Ollama/2-ollama-privateGPT-chat-with-docs/privateGPT/lib/python3.11/site-packages/requests/models.py", line 971, in json
    return complexjson.loads(self.text, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/anaconda3/lib/python3.11/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/anaconda3/lib/python3.11/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/anaconda3/lib/python3.11/json/decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/mac/Ollama/2-ollama-privateGPT-chat-with-docs/privateGPT.py", line 74, in <module>
    main()
  File "/Users/mac/Ollama/2-ollama-privateGPT-chat-with-docs/privateGPT.py", line 46, in main
    res = qa(query)
          ^^^^^^^^^
  File "/Users/mac/Ollama/2-ollama-privateGPT-chat-with-docs/privateGPT/lib/python3.11/site-packages/langchain/chains/base.py", line 282, in __call__
    raise e
  File "/Users/mac/Ollama/2-ollama-privateGPT-chat-with-docs/privateGPT/lib/python3.11/site-packages/langchain/chains/base.py", line 276, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/Users/mac/Ollama/2-ollama-privateGPT-chat-with-docs/privateGPT/lib/python3.11/site-packages/langchain/chains/retrieval_qa/base.py", line 139, in _call
    answer = self.combine_documents_chain.run(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/mac/Ollama/2-ollama-privateGPT-chat-with-docs/privateGPT/lib/python3.11/site-packages/langchain/chains/base.py", line 480, in run
    return self(kwargs, callbacks=callbacks, tags=tags, metadata=metadata)[
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/mac/Ollama/2-ollama-privateGPT-chat-with-docs/privateGPT/lib/python3.11/site-packages/langchain/chains/base.py", line 282, in __call__
    raise e
  File "/Users/mac/Ollama/2-ollama-privateGPT-chat-with-docs/privateGPT/lib/python3.11/site-packages/langchain/chains/base.py", line 276, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/Users/mac/Ollama/2-ollama-privateGPT-chat-with-docs/privateGPT/lib/python3.11/site-packages/langchain/chains/combine_documents/base.py", line 105, in _call
    output, extra_return_dict = self.combine_docs(
                                ^^^^^^^^^^^^^^^^^^
  File "/Users/mac/Ollama/2-ollama-privateGPT-chat-with-docs/privateGPT/lib/python3.11/site-packages/langchain/chains/combine_documents/stuff.py", line 171, in combine_docs
    return self.llm_chain.predict(callbacks=callbacks, **inputs), {}
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/mac/Ollama/2-ollama-privateGPT-chat-with-docs/privateGPT/lib/python3.11/site-packages/langchain/chains/llm.py", line 255, in predict
    return self(kwargs, callbacks=callbacks)[self.output_key]
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/mac/Ollama/2-ollama-privateGPT-chat-with-docs/privateGPT/lib/python3.11/site-packages/langchain/chains/base.py", line 282, in __call__
    raise e
  File "/Users/mac/Ollama/2-ollama-privateGPT-chat-with-docs/privateGPT/lib/python3.11/site-packages/langchain/chains/base.py", line 276, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/Users/mac/Ollama/2-ollama-privateGPT-chat-with-docs/privateGPT/lib/python3.11/site-packages/langchain/chains/llm.py", line 91, in _call
    response = self.generate([inputs], run_manager=run_manager)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/mac/Ollama/2-ollama-privateGPT-chat-with-docs/privateGPT/lib/python3.11/site-packages/langchain/chains/llm.py", line 101, in generate
    return self.llm.generate_prompt(
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/mac/Ollama/2-ollama-privateGPT-chat-with-docs/privateGPT/lib/python3.11/site-packages/langchain/llms/base.py", line 467, in generate_prompt
    return self.generate(prompt_strings, stop=stop, callbacks=callbacks, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/mac/Ollama/2-ollama-privateGPT-chat-with-docs/privateGPT/lib/python3.11/site-packages/langchain/llms/base.py", line 602, in generate
    output = self._generate_helper(
             ^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/mac/Ollama/2-ollama-privateGPT-chat-with-docs/privateGPT/lib/python3.11/site-packages/langchain/llms/base.py", line 504, in _generate_helper
    raise e
  File "/Users/mac/Ollama/2-ollama-privateGPT-chat-with-docs/privateGPT/lib/python3.11/site-packages/langchain/llms/base.py", line 491, in _generate_helper
    self._generate(
  File "/Users/mac/Ollama/2-ollama-privateGPT-chat-with-docs/privateGPT/lib/python3.11/site-packages/langchain/llms/ollama.py", line 220, in _generate
    final_chunk = super()._stream_with_aggregation(
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/mac/Ollama/2-ollama-privateGPT-chat-with-docs/privateGPT/lib/python3.11/site-packages/langchain/llms/ollama.py", line 156, in _stream_with_aggregation
    for stream_resp in self._create_stream(prompt, stop, **kwargs):
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/mac/Ollama/2-ollama-privateGPT-chat-with-docs/privateGPT/lib/python3.11/site-packages/langchain/llms/ollama.py", line 140, in _create_stream
    optional_detail = response.json().get("error")
                      ^^^^^^^^^^^^^^^
  File "/Users/mac/Ollama/2-ollama-privateGPT-chat-with-docs/privateGPT/lib/python3.11/site-packages/requests/models.py", line 975, in json
    raise RequestsJSONDecodeError(e.msg, e.doc, e.pos)
requests.exceptions.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
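`Expecting value: line 1 column 1 (char 0)` means the response body was not JSON at all. With Ollama this usually indicates the server is not running (`ollama serve`) or the requested model has not been pulled (`ollama pull mistral`), so the client receives plain text instead of a JSON payload. A small sketch of the failure mode and a defensive parse:

```python
import json

def parse_body(text: str):
    """Parse an HTTP body as JSON; return None instead of raising when
    the server answered with plain text (e.g. an error page)."""
    try:
        return json.loads(text)
    except json.JSONDecodeError:
        return None

print(parse_body('{"error": "model not found"}'))  # parses fine
print(parse_body("Ollama is running"))             # plain text -> None
```

The traceback above shows LangChain calling `response.json()` on exactly such a non-JSON body, which is why the error surfaces from inside `requests` rather than from Ollama itself.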

When I install the requirements for your second privateGPT

ERROR: Ignored the following versions that require a different python version: 0.5.12 Requires-Python >=3.7,<3.12; 0.5.13 Requires-Python >=3.7,<3.12
ERROR: Could not find a version that satisfies the requirement pulsar-client>=3.1.0 (from chromadb) (from versions: none)
ERROR: No matching distribution found for pulsar-client>=3.1.0
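The "Requires-Python >=3.7,<3.12" lines mean the pinned packages (here pulsar-client, pulled in by chromadb) published no wheels for the interpreter in use, typically Python 3.12 or newer. Creating an environment on Python 3.11 is the usual fix; a small pre-install check, with the version bounds taken from the error text:

```python
import sys

# Bounds taken from the pip error: "Requires-Python >=3.7,<3.12"
def version_ok(info=sys.version_info, lower=(3, 7), upper=(3, 12)) -> bool:
    """True when the interpreter falls inside the supported range."""
    return lower <= tuple(info[:2]) < upper

if not version_ok():
    print("Create a Python 3.11 environment before installing requirements")
```

With conda, for example, `conda create -n ollama-env python=3.11` followed by `pip install -r requirements.txt` inside that environment sidesteps the error.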

Please remove your database from the repo

Hi,

I was a bit surprised that the output of my query showed this too:

> source_documents/030413.hill.think.and.grow.rich.pdf:
do it,” the uncle retorted, “Now you run on home.” “Yas sah,” the child replied. 
But she did not move. The uncle went ahead with his work, so busily engaged that 
he did not pay enough attention to the child to observe that she did not leave. 
When he looked up and saw her still standing there, he yelled at her, “I told you 
to go on home! Now go, or I’ll take a switch to you.” The little girl said “yas sah,” 
but she did not budge an inch. The uncle dropped a sack of grain he was about to

> source_documents/030413.hill.think.and.grow.rich.pdf:
a bank official, “borrowed” a large sum of the bank’s money, without the consent 
of the directors. He lost the money through gambling. One afternoon, the Bank 
Examiner came and began to check the accounts. Grant left the bank, took a room 
in a local hotel, and when they found him, three days later, he was lying in bed, 
wailing and moaning, repeating over and over these words, “My God, this will kill 
me! I cannot stand the disgrace.” In a short time he was dead.  The doctors pro-

> source_documents/030413.hill.think.and.grow.rich.pdf:
does  not necessarily mean no. 
One afternoon he was helping his uncle grind wheat in an old fashioned mill. 
The uncle operated a large farm on which a number of colored sharecrop farmers 
lived. Quietly, the door was opened, and a small colored child, the daughter of a 
tenant, walked in and took her place near the door. 
The uncle looked up, saw the child, and barked at her roughly, “what do you 
want?” Meekly, the child replied, “My mammy say send her fifty cents.” “I’ll not

> source_documents/030413.hill.think.and.grow.rich.pdf:
a powerful foe, whose men outnumbered his own. He loaded his soldiers into 
boats, sailed to the enemy’s country, unloaded soldiers and equipment, then gave 
the order to burn the ships that had carried them. Addressing his men before the 
first battle, he said, “You see the boats going up in smoke. That means that we 
cannot leave these shores alive unless we win! We now have no choice-we win-or 
we perish!  They won.

Your indexed documents are in the repo (db folder). You should probably remove that.

Some errors during the requirements.txt installation

ERROR: Ignored the following versions that require a different python version: 0.5.12 Requires-Python >=3.7,<3.12; 0.5.13 Requires-Python >=3.7,<3.12; 1.21.2 Requires-Python >=3.7,<3.11; 1.21.3 Requires-Python >=3.7,<3.11; 1.21.4 Requires-Python >=3.7,<3.11; 1.21.5 Requires-Python >=3.7,<3.11; 1.21.6 Requires-Python >=3.7,<3.11
ERROR: Could not find a version that satisfies the requirement onnxruntime>=1.14.1 (from chromadb) (from versions: none)
ERROR: No matching distribution found for onnxruntime>=1.14.1

Error when I typed the query

I ran `python privateGPT.py` and hit this error. Could you help take a look? Thanks.

python privateGPT.py

Enter a query: give me a summary
Traceback (most recent call last):
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/requests/models.py", line 971, in json
    return complexjson.loads(self.text, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/json/decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/polly/Downloads/Sublime_Workspace/GitHub_Workspace/ollama/examples/langchain-python-rag-privategpt/privateGPT.py", line 80, in <module>
    main()
  File "/Users/polly/Downloads/Sublime_Workspace/GitHub_Workspace/ollama/examples/langchain-python-rag-privategpt/privateGPT.py", line 51, in main
    res = qa(query)
          ^^^^^^^^^
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/chains/base.py", line 282, in __call__
    raise e
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/chains/base.py", line 276, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/chains/retrieval_qa/base.py", line 139, in _call
    answer = self.combine_documents_chain.run(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/chains/base.py", line 480, in run
    return self(kwargs, callbacks=callbacks, tags=tags, metadata=metadata)[
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/chains/base.py", line 282, in __call__
    raise e
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/chains/base.py", line 276, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/chains/combine_documents/base.py", line 105, in _call
    output, extra_return_dict = self.combine_docs(
                                ^^^^^^^^^^^^^^^^^^
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/chains/combine_documents/stuff.py", line 171, in combine_docs
    return self.llm_chain.predict(callbacks=callbacks, **inputs), {}
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/chains/llm.py", line 255, in predict
    return self(kwargs, callbacks=callbacks)[self.output_key]
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/chains/base.py", line 282, in __call__
    raise e
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/chains/base.py", line 276, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/chains/llm.py", line 91, in _call
    response = self.generate([inputs], run_manager=run_manager)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/chains/llm.py", line 101, in generate
    return self.llm.generate_prompt(
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/llms/base.py", line 467, in generate_prompt
    return self.generate(prompt_strings, stop=stop, callbacks=callbacks, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/llms/base.py", line 602, in generate
    output = self._generate_helper(
             ^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/llms/base.py", line 504, in _generate_helper
    raise e
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/llms/base.py", line 491, in _generate_helper
    self._generate(
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/llms/ollama.py", line 220, in _generate
    final_chunk = super()._stream_with_aggregation(
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/llms/ollama.py", line 156, in _stream_with_aggregation
    for stream_resp in self._create_stream(prompt, stop, **kwargs):
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/llms/ollama.py", line 140, in _create_stream
    optional_detail = response.json().get("error")
                      ^^^^^^^^^^^^^^^
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/requests/models.py", line 975, in json
    raise RequestsJSONDecodeError(e.msg, e.doc, e.pos)
requests.exceptions.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
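This is the same root cause as the earlier JSONDecodeError report: the Ollama server did not return JSON, most often because it is not running on its default port. A quick reachability probe using only the standard library (the helper name is made up):

```python
from urllib.request import urlopen

def ollama_reachable(base: str = "http://localhost:11434",
                     timeout: float = 2.0) -> bool:
    """True when the local Ollama server answers on its default port."""
    try:
        with urlopen(base, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:  # connection refused, timeout, DNS failure
        return False
```

If this returns False, start the server with `ollama serve` and pull the model (`ollama pull mistral`) before retrying privateGPT.py.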
