robo-touch / taxim
License: MIT License
Hi!
I found that the 'Unable to load OpenGL library' error when running taxim-robot was fixed for me by editing the PyOpenGL file OpenGL/platform/ctypesloader.py and changing the line

    fullName = util.find_library( name )

to

    fullName = '/System/Library/Frameworks/OpenGL.framework/OpenGL'
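If you want to keep the dynamic lookup for other libraries, a less invasive variant of the same edit (just a sketch, assuming macOS, since that is the macOS system framework path) is to fall back to the hardcoded path only when the lookup fails:

    fullName = util.find_library( name )
    if fullName is None and name == 'OpenGL':
        # sketch: fall back to the macOS system framework path
        fullName = '/System/Library/Frameworks/OpenGL.framework/OpenGL'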
Hope it helps!
Hi,
I believe there is an indexing error in the shadow generation. The for loop at simOptical.py line 206 iterates the variable s from 1 to num_step - 1 and indexes the shadow table v at position s:
    for s in range(1,num_step):
        cur_x = int(cx_origin + pr.shadow_step * s * ct)
        cur_y = int(cy_origin + pr.shadow_step * s * st)
        # check boundary of the image and height's difference
        if cur_x >= 0 and cur_x < psp.w and cur_y >= 0 and cur_y < psp.h and heightMap[cy_origin,cx_origin] > heightMap[cur_y,cur_x]:
            frame[cur_y,cur_x] = np.minimum(frame[cur_y,cur_x],v[s])
However, this means that v[0] is never used. Looking at generateShadowMasks.py, it seems to me that for s = 1, v[0] should be retrieved instead of v[1]. Hence, I think this code should be:
    for s in range(num_step):
        cur_x = int(cx_origin + pr.shadow_step * (s + 1) * ct)
        cur_y = int(cy_origin + pr.shadow_step * (s + 1) * st)
        # check boundary of the image and height's difference
        if cur_x >= 0 and cur_x < psp.w and cur_y >= 0 and cur_y < psp.h and heightMap[cy_origin,cx_origin] > heightMap[cur_y,cur_x]:
            frame[cur_y,cur_x] = np.minimum(frame[cur_y,cur_x],v[s])
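An equivalent fix (the same correction, keeping the original loop variable and shifting only the table index) would be to iterate s from 1 to num_step and read v[s - 1]:

    for s in range(1,num_step + 1):
        cur_x = int(cx_origin + pr.shadow_step * s * ct)
        cur_y = int(cy_origin + pr.shadow_step * s * st)
        # check boundary of the image and height's difference
        if cur_x >= 0 and cur_x < psp.w and cur_y >= 0 and cur_y < psp.h and heightMap[cy_origin,cx_origin] > heightMap[cur_y,cur_x]:
            frame[cur_y,cur_x] = np.minimum(frame[cur_y,cur_x],v[s - 1])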
Best,
Tim
Hi,
I tried to run

    python simMarkMotionField.py -obj square -dx 0.3 -dy 0.4 -dz 0.5

However, the output image does not contain any markers.
Am I missing something?
I am unable to see the simulation of markers in the taxim-robot simulation model. Is there any way to add markers in the simulation?
Hi, why doesn't the GUI display when I run demo.py? The console output looks normal:
    tool0TomatoSoupCan height: 0.101815
    height_list: [0.2205]
    gripForce_list: [8]
    dx_list: [0.0]
    sample 1 is collected. h0.220-rot-1.57-f8.0-l1. 0:0:28 is taken in total. 1 positive samples.
    Finished!
Is the gelmap5.npz used for the GelSight sensor the same as the one used for the DIGIT sensor?
If not, is it possible to provide us with the gelmap belonging to the DIGIT?
Hi! I noticed that you have digit&gelsightR1.5 sensor calibration files, but what is the meaning of real_bg.npy under that folder?
Could you explain this in detail?
Thanks!
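My current guess is that real_bg.npy is just a no-contact frame captured from the real sensor. For illustration, this sketch is roughly what I imagine produces it (the OpenCV device index and the file format are my assumptions, not the repo's actual capture code):

    import cv2
    import numpy as np

    cap = cv2.VideoCapture(0)   # DIGIT exposed as a UVC camera (assumption)
    ok, frame = cap.read()      # grab one frame with nothing touching the gel
    cap.release()
    if ok:
        np.save('real_bg.npy', frame)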
@Si-Lynnn
Hi Zilin,
Thanks for sharing such nice work, "Grasp Stability Prediction with Sim-to-Real Transfer from Tactile Sensing", and for releasing the simulation part of the code.
However, I cannot find the training and testing part (e.g. the networks presented in Fig. 6: grasp stability prediction networks). Would you mind offering more information on how to implement this part?
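In the meantime, here is a minimal sketch of what I have been trying on my side (just my assumption of a standard CNN binary classifier over tactile images, certainly not your actual Fig. 6 architecture):

    import torch
    import torch.nn as nn

    class GraspStabilityNet(nn.Module):
        """Hypothetical classifier: tactile image -> stable / unstable logits."""
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 16, 5, stride=2, padding=2), nn.ReLU(),
                nn.Conv2d(16, 32, 5, stride=2, padding=2), nn.ReLU(),
                nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            self.classifier = nn.Linear(64, 2)

        def forward(self, x):
            return self.classifier(self.features(x).flatten(1))

    net = GraspStabilityNet()
    logits = net(torch.zeros(1, 3, 240, 320))  # dummy tactile image batch
    print(logits.shape)                        # torch.Size([1, 2])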
Hi all,
Thanks for sharing the simulator.
I am trying to calibrate the simulator with another sensor (the DIGIT), but it seems we require a gelmap.npz file to reproduce the shape of the gel. In this repo there is the file for the GelSight, but I was wondering how I can create this kind of file for my sensor.
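For illustration, here is a minimal sketch of what I imagine such a file might contain (the array key 'gelmap', the resolution, and the dome depth are all guesses on my part, not the repo's actual format):

    import numpy as np

    h, w = 480, 640      # sensor image resolution (assumption)
    max_depth = 1.0      # gel curvature depth at the center (assumption)
    yy, xx = np.mgrid[0:h, 0:w]
    r2 = ((yy - (h - 1) / 2) / h) ** 2 + ((xx - (w - 1) / 2) / w) ** 2
    gelmap = max_depth * np.clip(1.0 - 4.0 * r2, 0.0, 1.0)  # smooth paraboloid dome
    np.savez('gelmap_digit.npz', gelmap=gelmap)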
Thanks
Hello! I've collected about 100 ball tactile images with my own DIGIT. Then I followed the steps in the README to calibrate. However, there are some problems, as the following pictures show.
The first is the simulated tactile image with background, and the second is the one without background. We can see that the light rendering of the non-contact area of the tactile image is not correct. I have tried changing some parameters in Basic.params and sensorParams, but it still didn't work. At the same time, I also found that my DIGIT's brightness is set higher than in the DIGIT file you provided; is this the key to my problem?
Hi! I have read your paper and know that you first calibrate the deformation of the gelpad in ANSYS, under a unit pin of 0.5 mm diameter indenting it, for the marker motion field simulation. When I tried to get the dense nodal displacement results with my own model in ANSYS, the results were not satisfactory. What material settings or other necessary parts do you use in ANSYS? The answer would really help me!
Hi Zilin,
I tried to run taxim-robot with the GUI option.
There seems to be no error, but the "color and depth" window stays black the whole time.
Please kindly look into the matter.
Thanks!
Hi all,
Thanks for sharing the simulator.
I followed your usage tutorial video in order to calibrate the simulator with a different sensor. When I run the generateDataPack.py script, giving the folder with the images as data_path, the GUI opens correctly (I think).
But when I try to push the "open" button, it raises an exception at this line of code, giving this output:

    Caught exception in event handler: AttributeError: module 'nanogui' has no attribute 'directory_dialog'

This method does not exist in nanogui; when I run help(nanogui), it is not listed.
What can I do to run the calibration?
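As a stopgap I am considering bypassing the missing dialog and picking the folder with tkinter's stock directory chooser, then feeding that path to the script (just a sketch using the standard library, not part of Taxim):

    import tkinter as tk
    from tkinter import filedialog

    root = tk.Tk()
    root.withdraw()  # hide the empty root window
    data_path = filedialog.askdirectory(title='Select calibration image folder')
    print(data_path)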
Thanks