
tnikolla / robot-grasp-detection

234 stars · 13 watchers · 84 forks · 164.85 MB

Detecting robot grasping positions with deep neural networks. The model is trained on the Cornell Grasping Dataset. This is an implementation based mainly on the paper 'Real-Time Grasp Detection Using Convolutional Neural Networks' by Redmon and Angelova.

License: Apache License 2.0

Python 100.00%
grasp grasping detection imagenet-classifier image-classification deep-learning deep-neural-networks tensorflow

robot-grasp-detection's People

Contributors

tnikolla



robot-grasp-detection's Issues

What is the rectangle metric detection accuracy?

Hi,

I have a question about the paper (Real-Time Grasp Detection Using Convolutional Neural Networks). In Table I, what is the rectangle metric detection accuracy, and how is the value computed? I have searched many papers, but none of them explain it.

Thanks.
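
For context, the "rectangle metric" used in these papers (introduced by Jiang et al. and reused by Redmon and Angelova) counts a predicted grasp as correct when its orientation is within 30 degrees of a ground-truth rectangle and the Jaccard index (intersection over union) with that rectangle exceeds 25%; the accuracy in Table I is the fraction of test images whose predicted grasp matches at least one ground-truth rectangle under these two checks. A minimal sketch, assuming rectangles given as 4x2 corner arrays plus an angle in degrees, and using shapely for the polygon overlap:

import numpy as np
from shapely.geometry import Polygon

def rectangle_metric(pred_corners, pred_angle, gt_corners, gt_angle):
    # Orientation check: within 30 degrees, modulo 180, since a grasp
    # rectangle is symmetric under a half turn.
    diff = abs(pred_angle - gt_angle) % 180.0
    if min(diff, 180.0 - diff) > 30.0:
        return False
    # Overlap check: Jaccard index (IoU) above 0.25.
    p, g = Polygon(pred_corners), Polygon(gt_corners)
    return p.intersection(g).area / p.union(g).area > 0.25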

Hi, author

Excuse me, what is the meaning of "in filename[:49] adapt the number 49 (you can contribute, or I will program it better someday)"?
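
A plausible reading (an inference, not confirmed by the author): the loader slices filename[:49] to turn an image path into the matching rectangle-file path, and 49 happens to be the right character count for the author's dataset location, so the number must be adapted to the length of your own path. A hedged, path-length-independent sketch of the same idea, assuming the standard Cornell naming in which pcd0100r.png pairs with pcd0100cpos.txt:

import os

def pos_rects_for_image(image_path):
    # Hypothetical helper, not the repository's own code: derive
    # '.../pcd0100cpos.txt' from '.../pcd0100r.png' without counting
    # the characters in the directory prefix.
    stem, _ = os.path.splitext(image_path)  # '.../pcd0100r'
    return stem[:-1] + 'cpos.txt'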

About the models

Hi, I am wondering whether the grasp model is the one pretrained on ImageNet. Can I use it to train on the training set? I tried, but the loss is NaN.

Error when running grasp_det.py

When I run grasp_det.py, I get the following error:
ValueError: Tensor conversion requested dtype int32 for Tensor with dtype float64: 'Tensor("truediv:0", shape=(), dtype=float64, device=/device:CPU:0)'
How can I solve this error?

I used "build_cgd_dataset.py" to generate the tfrecords.

I made small changes in "grasp_det.py" as follows:

TRAIN_FILE = r'F:\RGB-D_grasp_datasets\data01\01\train-cgd'
VALIDATE_FILE = r'F:\RGB-D_grasp_datasets\data01\01\validation-cgd'

parser.add_argument(
    '--data_dir',
    type=str,
    default=r'F:\RGB-D_grasp_datasets\data01\01',
    help='Directory with training data.'
)
parser.add_argument(
    '--log_dir',
    type=str,
    default=r'F:\robot-grasp-detection-master',
    help='Tensorboard log_dir.'
)
parser.add_argument(
    '--model_path',
    type=str,
    default=r'F:\robot-grasp-detection-master\models\grasp\m4.ckpt',
    help='Variables for the model.'
)
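
The fuller traceback in a later issue on this page traces this error to parse_example_proto in grasp_img_proc.py, where tf.size(bboxes, out_type=tf.int32)/8 is Python 3 true division and therefore produces a float64 tensor that tf.random_uniform cannot accept as an int32 maxval. A sketch of the likely one-line fix, assuming that is indeed the offending line:

import tensorflow as tf

def sample_offset(bboxes):
    # Floor division ('//') keeps the maxval tensor int32; plain '/'
    # yields float64 under Python 3 and triggers the ValueError.
    n_boxes = tf.size(bboxes, out_type=tf.int32) // 8
    return 8 * tf.random_uniform((1,), minval=0, maxval=n_boxes,
                                 dtype=tf.int32)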

Training parameters for grasping

Hi, I am trying to test the code and want to reproduce your results, but I do not know the training parameters you used with grasp_det.py. Could you tell me the exact parameters?

About multiplication factors 0.35 & 0.47

Hi author,

In your program there are the values 0.35 and 0.47. What do they mean?

Thank you!

def bboxes_to_grasps(bboxes):
    # converting and scaling bounding boxes into grasps, g = {x, y, tan, h, w}
    box = tf.unstack(bboxes, axis=1)  # bboxes: <tf.Tensor 'batch_join:1' shape=(64, 8) dtype=float32>
    x = (box[0] + (box[4] - box[0])/2) * 0.35
    y = (box[1] + (box[5] - box[1])/2) * 0.47
    tan = (box[3] - box[1]) / (box[2] - box[0]) * 0.47/0.35
    h = tf.sqrt(tf.pow((box[2] - box[0])*0.35, 2) + tf.pow((box[3] - box[1])*0.47, 2))
    w = tf.sqrt(tf.pow((box[6] - box[0])*0.35, 2) + tf.pow((box[7] - box[1])*0.47, 2))
    return x, y, tan, h, w
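
Almost certainly (an inference from the numbers, not an author statement), these are the per-axis scale factors from the 640x480 Cornell images down to the network's 224x224 input:

# 224/640 is exactly 0.35; 224/480 is about 0.467, rounded to 0.47.
print(224 / 640)  # 0.35
print(224 / 480)  # 0.4666...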

How to convert the coordinates

Hi, I've obtained the grasping rectangle in my Jupyter notebook, but I want to use a real robotic arm to grasp the item. I know that inverse kinematics can convert the arm's coordinates into rotation angles for its joints, but the coordinate systems differ: in your project the coordinates are image-based, while in practice they are based on the robotic arm. How can I convert between the two systems? Thanks a lot.
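A minimal sketch of the usual pipeline (none of this is part of the repository): deproject the grasp-center pixel to a 3D point using the camera intrinsics and the depth at that pixel, then transform it into the robot base frame with a hand-eye calibration matrix. Here fx, fy, cx, cy and T_base_cam are assumed to come from your own camera calibration:

import numpy as np

def pixel_to_base(u, v, depth, fx, fy, cx, cy, T_base_cam):
    # Pinhole deprojection: pixel (u, v) plus depth -> camera-frame point.
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    p_cam = np.array([x, y, depth, 1.0])
    # 4x4 hand-eye transform: camera frame -> robot base frame.
    # Feed the result to your inverse-kinematics solver.
    return (T_base_cam @ p_cam)[:3]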

Working with a D435 camera in real time

Hi, thank you for your project.
I want to use it with a D435 camera in real time. What do I need to modify to run your project in real time with the D435? Thank you so much.
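
A hedged sketch (not part of this project) of grabbing color frames from a D435 with Intel's pyrealsense2 bindings and handing each frame to the model; run_inference is a hypothetical placeholder for whatever wrapper you build around the trained network:

import numpy as np
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
pipeline.start(config)
try:
    while True:
        frames = pipeline.wait_for_frames()
        color = np.asanyarray(frames.get_color_frame().get_data())
        # grasp = run_inference(color)  # hypothetical model wrapper
finally:
    pipeline.stop()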

Converting grasping rectangle to grasp pose

Hi, I wish to execute the proposed grasp using my own robot arm, and I am wondering if you have implemented code to convert the rectangle to a 6-DOF grasp pose.
I read in the paper 'Deep Learning for Detecting Robotic Grasps' that the final pose is determined by using the rectangle center as the grasping center, its surface normal as the approach vector, and the rectangle angle to rotate the gripper around the approach vector, but I don't know how to implement this.
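
A sketch of the construction described in that paper, not code from this repository: the approach axis is the inward surface normal at the rectangle center, and the gripper closing axis is obtained by rotating an arbitrary perpendicular direction about the approach axis by the rectangle angle. It assumes you already have the 3D center p and the unit surface normal n in the robot frame:

import numpy as np

def grasp_pose(p, n, theta):
    a = -n / np.linalg.norm(n)  # approach vector points into the surface
    # Build an orthonormal basis perpendicular to the approach axis.
    seed = np.array([1.0, 0.0, 0.0]) if abs(a[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    x = np.cross(seed, a)
    x /= np.linalg.norm(x)
    y = np.cross(a, x)
    # Rotate the gripper closing axis about the approach axis by theta.
    closing = np.cos(theta) * x + np.sin(theta) * y
    R = np.column_stack([closing, np.cross(a, closing), a])
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = p
    return T  # 4x4 pose; convert to your controller's representation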

About Regression + Classification and Pretraining

Hi!
In the paper 'Real-Time Grasp Detection Using Convolutional Neural Networks', part IV (GRASP DETECTION WITH NEURAL NETWORKS), sections C (Regression + Classification) and D (MultiGrasp Detection): do you have a program for these?
For part V (EXPERIMENTS AND EVALUATION), section C (Pretraining): are the saved pretraining weights models/imagenet/m2/m2.ckpt or models/imagenet/m4/m4.ckpt?
Thank you!
Shu Guo

imagenet_classifier.py

1. Please excuse me: does the output "2021-06-21 15:36:13.057525: I C:\tf_jenkins\workspace\rel-win\M\windows\PY\36\tensorflow\core\platform\cpu_feature_guard.cc:137] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX AVX2" mean that it succeeded?
2. 'Done training for 100 epochs, 0 steps, 0.0 min.' - what's wrong? Thank you!

New to tensorflow and TFRecords - error in running build_cgd_dataset.py

I get this error; I'm using Spyder/Anaconda with Python 3.6, and f.read() just doesn't seem to like opening images:

File "C:/Users/Elijah/Desktop/Masters and Machine Learning folder/Cornell Dataset/robot-grasp-detection-master/build_cgd.py", line 31, in _process_image
image_data = f.read()

File "C:\Users\Elijah\Anaconda3\envs\mle_gpu\lib\encodings\cp1252.py", line 23, in decode
return codecs.charmap_decode(input,self.errors,decoding_table)[0]

UnicodeDecodeError: 'charmap' codec can't decode byte 0x9d in position 50: character maps to <undefined>

Any help would be appreciated, thanks
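
The usual cause (inferred from the cp1252 codec in the traceback): the image file is being opened in text mode, so Python tries to decode raw PNG bytes with the platform codec. Opening the file in binary mode avoids the decode entirely; in the repo's TF1-era style that would look roughly like:

import tensorflow as tf

def read_image_bytes(filename):
    # 'rb' returns raw bytes; 'r' (text mode) decodes with the platform
    # codec and fails on binary image data.
    with tf.gfile.FastGFile(filename, 'rb') as f:
        return f.read()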

Ordering of the four rectangle vertices

Hi!
I have thought about this for a long time without reaching a clear answer. The ordering of the four rectangle vertices does not follow a fixed rule - for example, clockwise versus counterclockwise, or which vertex comes first. Doesn't this affect the results? The formulas seem to assume an ordering: tan and h are computed from the edge between the first and second vertices, and w from the edge between the first and fourth, so starting at a different vertex or reversing the winding would swap which edge is treated as h and which as w.
This function:
def bboxes_to_grasps(bboxes):
    box = tf.unstack(bboxes, axis=1)
    x = (box[0] + (box[4] - box[0])/2) * 0.35
    y = (box[1] + (box[5] - box[1])/2) * 0.47
    tan = (box[3] - box[1]) / (box[2] - box[0]) * 0.47/0.35
    h = tf.sqrt(tf.pow((box[2] - box[0])*0.35, 2) + tf.pow((box[3] - box[1])*0.47, 2))
    w = tf.sqrt(tf.pow((box[6] - box[0])*0.35, 2) + tf.pow((box[7] - box[1])*0.47, 2))
    return x, y, tan, h, w
Thank you!

When I run grasp_det.py for training in PyCharm, some errors occur

Hi!
When I run grasp_det.py for training in PyCharm, I get the following errors:

/home/xiaoshuguo/anaconda3/bin/python /home/xiaoshuguo/PycharmProjects/untitled2/robot-grasp-detection/grasp_det.py
validation
inputs
False
False
Traceback (most recent call last):
File "/home/xiaoshuguo/PycharmProjects/untitled2/robot-grasp-detection/grasp_det.py", line 162, in
tf.app.run(main=main, argv=[sys.argv[0]] + unparsed)
File "/home/xiaoshuguo/anaconda3/lib/python3.6/site-packages/tensorflow/python/platform/app.py", line 48, in run
_sys.exit(main(sys.argv[:1] + flags_passthrough))
File "/home/xiaoshuguo/PycharmProjects/untitled2/robot-grasp-detection/grasp_det.py", line 115, in main
run_training()
File "/home/xiaoshuguo/PycharmProjects/untitled2/robot-grasp-detection/grasp_det.py", line 47, in run_training
images, bboxes = grasp_img_proc.inputs([data_files
])
File "/home/xiaoshuguo/PycharmProjects/untitled2/robot-grasp-detection/grasp_img_proc.py", line 158, in inputs
num_readers=1)
File "/home/xiaoshuguo/PycharmProjects/untitled2/robot-grasp-detection/grasp_img_proc.py", line 122, in batch_inputs
image_buffer, bbox = parse_example_proto(examples_serialized)
File "/home/xiaoshuguo/PycharmProjects/untitled2/robot-grasp-detection/grasp_img_proc.py", line 26, in parse_example_proto
r = 8*tf.random_uniform((1,), minval=0, maxval=tf.size(bboxes, out_type=tf.int32)/8, dtype=tf.int32)
File "/home/xiaoshuguo/anaconda3/lib/python3.6/site-packages/tensorflow/python/ops/random_ops.py", line 229, in random_uniform
maxval = ops.convert_to_tensor(maxval, dtype=dtype, name="max")
File "/home/xiaoshuguo/anaconda3/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 639, in convert_to_tensor
as_ref=False)
File "/home/xiaoshuguo/anaconda3/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 704, in internal_convert_to_tensor
ret = conversion_func(value, dtype=dtype, name=name, as_ref=as_ref)
File "/home/xiaoshuguo/anaconda3/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 577, in _TensorTensorConversionFunction
% (dtype.name, t.dtype.name, str(t)))
ValueError: Tensor conversion requested dtype int32 for Tensor with dtype float64: 'Tensor("truediv:0", shape=(), dtype=float64, device=/device:CPU:0)'

I don't know why this happens. How can I resolve it?
Thank you very much! (This looks like the same true-division dtype issue as in the grasp_det.py error above; the floor-division fix sketched there applies.)

Where is inference_redmon.py

Hi, I downloaded the code several days ago, but I can't find the file 'inference_redmon.py' in the directory. Where is it?

Trying to implement validation on grasp_det.py

Traceback (most recent call last):
File "build_cgd_dataset.py", line 115, in
main()
File "build_cgd_dataset.py", line 97, in main
image_buffer, height, width = _process_image(filename, coder)
File "build_cgd_dataset.py", line 29, in _process_image
image_data = f.read()
File "/home/harsh/anaconda3/envs/py36/lib/python3.6/codecs.py", line 321, in decode
(result, consumed) = self._buffer_decode(data, self.errors, final)
UnicodeDecodeError: 'utf-8' codec can't decode byte 0x89 in position 0: invalid start byte

(This is the same text-mode read problem as in the build_cgd_dataset.py issue above; opening the image file in binary mode fixes it.)

May I ask: if I train the model, should I use RGB-D images or just RGB images?

Hello, I have read the paper (Real-Time Grasp Detection Using Convolutional Neural Networks), and it says the model is trained on RGB-D images. For the code in this project, should I use RGB-D images or RGB images, or are both fine? Thank you very much; this question is very important to me!
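
For background, from the paper rather than this repository's code: Redmon and Angelova keep a three-channel input by substituting the blue channel with depth (RG-D), which lets them reuse ImageNet-pretrained weights. A hedged illustration of that substitution:

import numpy as np

def to_rgd(rgb, depth):
    # rgb: HxWx3 uint8 image; depth: HxW float array.
    # Normalize depth to 0..255 and write it into the blue channel
    # (illustrative only, not project code).
    d = 255.0 * (depth - depth.min()) / max(float(np.ptp(depth)), 1e-6)
    rgd = rgb.copy()
    rgd[..., 2] = d.astype(np.uint8)
    return rgd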

datasets

Hi, how do you split the Cornell dataset image-wise and object-wise? Thanks!
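
For background (standard practice in the grasping papers, not necessarily how this repository does it): an image-wise split shuffles individual images into train and test, while an object-wise split keeps every image of the same object in the same fold, which tests generalization to unseen objects. A minimal sketch, where object_id is a hypothetical callable mapping a sample to its object label:

import random

def split(items, object_id, ratio=0.8, object_wise=False, seed=0):
    rng = random.Random(seed)
    items = list(items)
    if object_wise:
        # Shuffle object labels, then keep all images of an object together.
        ids = sorted({object_id(i) for i in items})
        rng.shuffle(ids)
        train_ids = set(ids[:int(ratio * len(ids))])
        train = [i for i in items if object_id(i) in train_ids]
        test = [i for i in items if object_id(i) not in train_ids]
    else:
        # Shuffle individual images.
        rng.shuffle(items)
        k = int(ratio * len(items))
        train, test = items[:k], items[k:]
    return train, test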

I have problems when training on the grasping dataset after obtaining the train and validation files

/robot-grasp-detection-master$ python grasp_det.py
train
distorted_inputs
True
True
pass
keep_prob = 1.0
2018-01-29 14:48:20.935569: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use SSE4.1 instructions, but these are available on your machine and could speed up CPU computations.
2018-01-29 14:48:20.935592: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use SSE4.2 instructions, but these are available on your machine and could speed up CPU computations.
2018-01-29 14:48:20.935613: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX instructions, but these are available on your machine and could speed up CPU computations.
2018-01-29 14:48:20.935632: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX2 instructions, but these are available on your machine and could speed up CPU computations.
2018-01-29 14:48:20.935637: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use FMA instructions, but these are available on your machine and could speed up CPU computations.
Traceback (most recent call last):
File "grasp_det.py", line 164, in
tf.app.run(main=main, argv=[sys.argv[0]] + unparsed)
File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/platform/app.py", line 48, in run
_sys.exit(main(_sys.argv[:1] + flags_passthrough))
File "grasp_det.py", line 117, in main
run_training()
File "grasp_det.py", line 113, in run_training
coord.join(threads)
File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/training/coordinator.py", line 389, in join
six.reraise(*self._exc_info_to_raise)
File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/training/queue_runner_impl.py", line 238, in _run
enqueue_callable()
File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/client/session.py", line 1063, in _single_operation_run
target_list_as_strings, status, None)
File "/usr/lib/python2.7/contextlib.py", line 24, in exit
self.gen.next()
File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/framework/errors_impl.py", line 466, in raise_exception_on_not_ok_status
pywrap_tensorflow.TF_GetCode(status))
tensorflow.python.framework.errors_impl.FailedPreconditionError: /home/zhaojie/zhaojie/robot-grasp-detection-master
[[Node: ReaderReadV2 = ReaderReadV2[_device="/job:localhost/replica:0/task:0/cpu:0"](TFRecordReaderV2, input_producer)]]

(Note that the FailedPreconditionError names a directory, which suggests the TFRecord reader was pointed at the repository directory rather than at the train-cgd file itself.)

Multiple bboxes: positive and negative bboxes?

It seems that only the positive bounding boxes are loaded - is that the case?

Also, how are the varying numbers of bboxes handled and structured? The code for that part is split across several locations, so the design isn't easy to tell. Judging from the traceback quoted in another issue (parse_example_proto draws a random index in the range tf.size(bboxes)/8), the rectangles appear to be stored as one flat list of eight coordinates per box, with one box sampled at random per example.

downloading the code

How large is the downloaded code file? Is it around 16 MB? I am unable to download the code; please help.

Could you add a LICENSE.md?

@tnikolla this repository looks very useful, thanks for putting it up!

Would you mind adding a license to it? Without a license it is impossible to legally clone or run this code. If you're not sure and would like a suggestion, the Apache 2.0 license is a good option; a quick summary is available at tl;dr legal, and it is the same license used by TensorFlow. Here is the license text:

Apache License, Version 2.0, January 2004, http://www.apache.org/licenses/

TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

1. Definitions.

"License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document.

"Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License.

"Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity.

"You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License.

"Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files.

"Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types.

"Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below).

"Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof.

"Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution."

"Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work.

2. Grant of Copyright License.

Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form.

3. Grant of Patent License.

Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed.

4. Redistribution.

You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions:

You must give any other recipients of the Work or Derivative Works a copy of this License; and You must cause any modified files to carry prominent notices stating that You changed the files; and You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License.

5. Submission of Contributions.

Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions.

6. Trademarks.

This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file.

7. Disclaimer of Warranty.

Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License.

8. Limitation of Liability.

In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages.

9. Accepting Warranty or Additional Liability.

While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability.

END OF TERMS AND CONDITIONS

APPENDIX: How to apply the Apache License to your work

To apply the Apache License to your work, attach the following boilerplate notice, with the fields enclosed by brackets "[]" replaced with your own identifying information. (Don't include the brackets!) The text should be enclosed in the appropriate comment syntax for the file format. We also recommend that a file or class name and description of purpose be included on the same "printed page" as the copyright notice for easier identification within third-party archives.

Copyright [yyyy] [name of copyright owner]

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.

About depth information

Hi,

The paper you mention, "Real-Time Grasp Detection...", includes image depth information when training the CNN, while the title of your repository "robot-grasp-detection" is "Detecting grasping... using RGB images".

Does this mean that when training your network you used only RGB images and no depth information?
Furthermore, the Cornell Grasping Dataset has point-cloud information for all images. If your network is trained on this dataset, shouldn't depth information be included? Am I correct?
Thanks.

grasp_det.py

ValueError: Tensor conversion requested dtype int32 for Tensor with dtype float64: 'Tensor("truediv:0", shape=(), dtype=float64, device=/device:CPU:0)'

May I ask how to solve this problem? (It appears to be the same true-division error as in the issues above; the floor-division fix sketched there applies.)
