Comments (19)

borntoleave commented on August 18, 2024

There's a simple rule to limit a single joint's range in OpenCat.h. 

int angleLimit[][2] = {  // {min, max} range for each of the 16 joints
  { -120, 120 }, { -30, 80 }, { -120, 120 }, { -120, 120 },
  { -90, 60 }, { -90, 60 }, { -90, 90 }, { -90, 90 },
  { -200, 80 }, { -200, 80 }, { -80, 200 }, { -80, 200 },
  { -80, 200 }, { -80, 200 }, { -80, 200 }, { -80, 200 },
};

You will need to calculate the exact states of the legs to evaluate collisions with other body parts. The calculation could be done on the BiBoard or on the master computer that sends the movement instructions. It's very challenging to fit that algorithm onto the NyBoard.
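
As a starting point on the master side, each commanded angle can at least be clamped to its per-joint range before it is sent; note that this does not handle self-collision, which depends on the combination of joint positions. A minimal Python sketch mirroring the angleLimit table above (the joint ordering is assumed to match the table):

# One (min, max) pair per joint, in the same order as the angleLimit table.
ANGLE_LIMIT = [
    (-120, 120), (-30, 80), (-120, 120), (-120, 120),
    (-90, 60), (-90, 60), (-90, 90), (-90, 90),
    (-200, 80), (-200, 80), (-80, 200), (-80, 200),
    (-80, 200), (-80, 200), (-80, 200), (-80, 200),
]

def clamp_angles(angles):
    # Clamp each of the 16 commanded joint angles to its per-joint range.
    return [max(lo, min(hi, a)) for a, (lo, hi) in zip(angles, ANGLE_LIMIT)]

# Example: a request of 150 on the first joint is clamped to 120.
print(clamp_angles([150] + [0] * 15))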


slerman12 commented on August 18, 2024

Oof, this may be beyond my expertise.

I'm gonna really need your input if I'm to do this.

Okay, I've looked at example.py.

"You can send interpolated positions at different time intervals"

  1. Is there a line in example.py that sends a "position"? I'm not sure what is meant by this.
  2. Is it possible to send positions via Bluetooth or WiFi? There is a library called pyBittle that makes sending commands to Bittle via Bluetooth/WiFi really easy, but they only by default send full pre-programmed gaits.

I've taken a look at opencat.h and imu.h, but this is where I really don't have the expertise since I don't program outside of Python often.

"you can print the full 6-axis data (including the acceleration) by modifying the print6Axis() function in OpenCat/src/imu.h"

  3. I see the following code in imu.h. How can I modify it to output accelerometer data in addition to gyroscope data? Any hints?
void print6Axis() {
  PT(ypr[0] );
  PT('\t');
  PT(ypr[1] );
  PT('\t');
  PT(ypr[2] );
  PTL();
}
  4. Is it possible to receive these print signals via Bluetooth or WiFi?

Thank you so much!

borntoleave commented on August 18, 2024

  1. In OpenCat/serialMaster/demos, you can find an example, moveBySteps.py, that moves specific joints at set time intervals.
  2. Yes, you can connect the Bluetooth dongle to your computer and then run any of the examples that import ardSerial.py. There's an algorithm in the code that automatically detects connected robots.
  3. To get more IMU data, simply uncomment the lines after the ypr[] prints in print6Axis():
void print6Axis() {
  PT(ypr[0]);            // yaw
  PT('\t');
  PT(ypr[1]);            // pitch
  PT('\t');
  PT(ypr[2]);            // roll
  PT('\t');
  PT(aaWorld.x);         // world-frame acceleration
  PT('\t');
  PT(aaWorld.y);
  PT('\t');
  PT(aaWorld.z / 1000);
  PT('\t');
  PTL(aaReal.z);         // sensor-frame z acceleration; PTL ends the line
}

If you send a 'v' token, the angles will be printed to the serial port once, and the Bluetooth or WiFi dongle can transfer them to your computer. If you send a 'V' token, the angles will be printed to the serial port continuously.

Currently, the Bluetooth code is fully incorporated in ardSerial.py.
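
For example, over a plain serial connection the exchange looks roughly like this (a minimal sketch; the port name is a placeholder, and the number of printed fields depends on which lines are uncommented):

import time
import serial

ser = serial.Serial('/dev/ttyUSB0', 115200, timeout=1)   # placeholder port; use the dongle's serial port
time.sleep(2)                                            # give the board a moment after the port opens
ser.write(b'v')                                          # 'v' prints the data once; 'V' streams it
line = ser.readline().decode(errors='ignore').strip()    # you may need to skip echoed tokens first
values = []
for field in line.split('\t'):
    try:
        values.append(float(field))
    except ValueError:
        pass                                             # ignore non-numeric fields such as echoes
print(values)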

slerman12 commented on August 18, 2024

Hi Rz, this is Sam. Currently, I'm testing Bluetooth bidirectional commands. Here is the output when I send the following command:

b'v'

0000ffe1-0000-1000-8000-00805f9b34fb (Handle: 15): Vendor specific: bytearray(b'G\r\n-21.26759\t10.8250')
0000ffe1-0000-1000-8000-00805f9b34fb (Handle: 15): Vendor specific: bytearray(b'7\t-11.13954\r\nv\r\n')

Here is the output when I send the following command:

b'k'

0000ffe1-0000-1000-8000-00805f9b34fb (Handle: 15): Vendor specific: bytearray(b'G\r\n')
0000ffe1-0000-1000-8000-00805f9b34fb (Handle: 15): Vendor specific: bytearray(b'k\r\n')

In this case, Bittle briefly moves his head.

Here is the output when I send the following command:

b'-9'

0000ffe1-0000-1000-8000-00805f9b34fb (Handle: 15): Vendor specific: bytearray(b'G\r\n-\r\n')

Bittle beeps, but nothing happens. (Same for other variations).

Can you show me how to send individual commands and bytearrays of commands? I'm using a special library because my M1 MacBook was having issues with the Bluetooth code.
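
For reference, the pattern I'm using looks roughly like the sketch below (shown here with bleak, which is an assumption about the library; the device address and the wait time are placeholders):

import asyncio
from bleak import BleakScanner, BleakClient

CHAR_UUID = "0000ffe1-0000-1000-8000-00805f9b34fb"   # the characteristic from the output above

def on_notify(_, data: bytearray):
    print(data)                                       # replies arrive as notifications

async def main():
    device = await BleakScanner.find_device_by_address("XX:XX:XX:XX:XX:XX")  # placeholder address
    async with BleakClient(device) as client:
        await client.start_notify(CHAR_UUID, on_notify)
        await client.write_gatt_char(CHAR_UUID, b'k')   # send a single-token command
        await asyncio.sleep(2)                          # wait for the echoed reply

asyncio.run(main())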

slerman12 commented on August 18, 2024

This seems to work:

str('m 0 45 0 -45 0 45 0 -45').encode('utf-8')

But this doesn't:

str('l 20 0 0 0 0 0 0 0 45 45 45 45 36 36 36 36').encode('utf-8')

And neither does this:

str('m 1 45 2 -45 3 45').encode('utf-8')

Here is my best solution. Is there a better way than this that doesn't require calling time.sleep(1) and can automatically detect when a command has finished executing?

import time


def encode(rotations: list):
    # Build an 'i' (simultaneous joint) command: "i 0 r0 1 r1 ... 15 r15"
    return ('i ' + ' '.join(map(str, [*sum(zip(range(16), rotations), ())]))).encode('utf-8')


def send(data):
    ...  # send the bytes over Bluetooth


commands = list(map(encode, [[45, 0, 0, 0, 0, 0, 0, 0, 45, 45, 45, 45, 36, 36, 36, 36],
                             [-45, 0, 0, 0, 0, 0, 0, 0, 45, 45, 45, 45, 36, 36, 36, 36],
                             [45, 0, 0, 0, 0, 0, 0, 0, 45, 45, 45, 45, 36, 36, 36, 36],
                             [-45, 0, 0, 0, 0, 0, 0, 0, 45, 45, 45, 45, 36, 36, 36, 36]]))


for command in commands:
    # Send command via Bluetooth
    send(command)
    time.sleep(1)
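
One idea to avoid the fixed sleep, based on the echoes shown above (the board seems to echo the command token, e.g. b'k\r\n', once it has processed a command): buffer the incoming notifications and wait for that echo. A rough sketch, assuming the Bluetooth notification callback appends incoming bytes to a received buffer:

received = bytearray()          # appended to by the Bluetooth notification callback

def wait_for_echo(token: bytes, timeout: float = 2.0) -> bool:
    # Poll the notification buffer until the command token is echoed back.
    deadline = time.time() + timeout
    while time.time() < deadline:
        if token in received:
            received.clear()
            return True
        time.sleep(0.01)
    return False

for command in commands:
    send(command)
    wait_for_echo(b'i\r\n')     # assuming 'i' commands are echoed like 'k' and 'v' above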

slerman12 commented on August 18, 2024

Since I'm reading the IMU regularly, it's vital that I disable the beeping. Is it possible to send Bittle a command like b'v' soundlessly?

slerman12 commented on August 18, 2024

Is it possible to get IMU data while Bittle is performing actions?

borntoleave commented on August 18, 2024

(Screenshot attached: "Screenshot 2023-02-13 at 17 07 09")

slerman12 commented on August 18, 2024

I need to define constraints on Bittle's movements so that random actions don't cause him to twist a leg into his body or collide with himself. What are some of the red-zone ranges, do you know? Depending on where the other joints are, different movements are acceptable. So I may have to come up with this myself, but do you have an intuition for a general rule of thumb that can be used to constrain his movements depending on his other movements? In my scenario, he starts out taking random actions and this can be too demanding on the build.

slerman12 commented on August 18, 2024

Hi Rz, I'm returning to this. Is it alright if I ask some questions that I stumbled into last time?

  1. Reproducing the pre-programmed gaits via the kind of position control above seems to be missing something. I think the pre-programmed gaits have different velocities, right? I want to give the robot an action space from which it could feasibly learn to reproduce the pre-programmed gaits.
  2. Are there any high-level examples of miniature movements, like taking just one step forward, or one step to the right, etc.?
  3. Is it possible to disable the automatic trigger movements, like flipping upright when Bittle collapses? It's precisely this kind of movement that I have a hard time imagining the above action space reproducing.

Thank you.

slerman12 commented on August 18, 2024

I haven't tried the SkillComposer yet. It confuses me. I'm looking to do everything in Python.

Is it possible to send velocities through the serial API? It would be really helpful to see some examples of complex behaviors like flipping upright or running done with the serial API commands listed in the link you sent me.

Thanks,
Sam

slerman12 commented on August 18, 2024

Is there a simple way to read/verify that Bittle is (a) standing upright and (b) has a positive forward velocity? I retrieve 6 values from the IMU, 3 from the gyroscope and 3 from the accelerometer.

borntoleave commented on August 18, 2024

Hi, you can use the token v or V to retrieve the 6-axis IMU data. Integrating the acceleration can give you speed information.

borntoleave commented on August 18, 2024

The 6-axis data are defined in OpenCat/src/imu.h.
print6Axis() prints out yaw, pitch, and roll in degrees, and the acceleration in the x, y, and z directions.

The sensor can only measure acceleration, not the current velocity. Velocity is acquired by integrating the acceleration along a specific direction, so you may lose the initial speed, and the error can accumulate. Because a quadruped always wobbles and shakes during motion, you can assume that once the acceleration in all directions has been stable for one second, the robot is still, and use that moment as the integration starting point.
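
A rough sketch of that approach (the sample rate, the thresholds, and the choice of forward axis are assumptions, and "upright" is approximated by small pitch and roll):

import collections

DT = 0.05                      # seconds between IMU samples (assumed)
STILL_WINDOW = int(1.0 / DT)   # "stable for one second"
STILL_THRESHOLD = 0.05         # max variation treated as "still" (units depend on the firmware scaling)

recent = collections.deque(maxlen=STILL_WINDOW)
forward_velocity = 0.0

def is_upright(pitch, roll, tol=20.0):
    # Assumption: standing upright roughly means small pitch and roll angles (degrees).
    return abs(pitch) < tol and abs(roll) < tol

def update(yaw, pitch, roll, ax, ay, az):
    # Feed one (yaw, pitch, roll, ax, ay, az) reading; returns (upright, forward_velocity).
    global forward_velocity
    recent.append((ax, ay, az))
    stable = len(recent) == STILL_WINDOW and all(
        max(abs(a - b) for a, b in zip(sample, recent[0])) < STILL_THRESHOLD for sample in recent)
    if stable:
        forward_velocity = 0.0          # reset integration while still, to limit drift
    else:
        forward_velocity += ax * DT     # integrate the assumed forward axis
    return is_upright(pitch, roll), forward_velocity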
