
A robotic fruit picking demo project for O3DE with ROS 2 Gem

License: Other

CMake 13.22% C++ 58.45% Lua 7.90% Dockerfile 3.65% Python 16.78%
agriculture o3de robotics-simulation ros2 ros2-humble

roscondemo's Introduction

O3DE Apple Kraken Demo Project

Video demo

O3DE_Robotics_Agriculture_Demo.mp4

This project demonstrates an example application of O3DE working with ROS 2. The integration is realized through ROS 2 Gem.

What does it look like

The project includes

  • Apple Orchard, a simulation scene with many rows of apple trees.
  • Apple Kraken, a robot tasked with apple picking. It is ready to use and also included as a URDF.
    • Multiple Apple Krakens are supported
    • .. and you can spawn them using ROS 2 messages!
  • Custom components for picking apples, which benefit from direct integration with ROS 2.
    • Yes, you can write ROS 2 code in O3DE!
  • Autonomous operation based on the ROS 2 navigation stack and ground truth.
    • Ground truth can be replaced with detectors based on sensor data. Give it a try!
  • Apples
    • Thousands of apples!

Simulation scenes (levels)

Main Level

The main scene of the demo is set in an apple orchard surrounded by countryside. The orchard is managed by the Apple Kraken.

The main level is rather performance-intensive.

The Apple Kraken is a four-wheeled robot assigned the task of navigating around the orchard, collecting apples and storing them in its basket.

Playground Level

The playground scene is much lighter and can be used to quickly prototype with the Kraken. It contains only a couple of apple trees and the robot itself.

Requirements

Platforms

The project runs on Ubuntu 22.04 with ROS 2 Humble or ROS 2 Iron. If you wish to run this demo in a Docker environment, please use the instructions in the Docker folder.

💡 Note: This demo is not supported on Windows!

O3DE

Refer to the O3DE System Requirements documentation to make sure that the system/hardware requirements are met.

The following commands should prepare O3DE (assuming ${WORKDIR} is your working directory):

cd ${WORKDIR}
git clone --branch main --single-branch https://github.com/o3de/o3de.git
cd o3de
git lfs install
git lfs pull
python/get_python.sh
scripts/o3de.sh register --this-engine

In case of any problems, please refer to the instructions to set up O3DE from GitHub.

ROS 2 Gem

This project uses the ROS 2 Gem, which is included in the O3DE extras bundle. Please install ROS 2 first.

The following commands should prepare o3de-extras into your ${WORKDIR}:

cd ${WORKDIR}
git clone --branch main --single-branch https://github.com/o3de/o3de-extras
cd o3de-extras
git lfs install
git lfs pull

And register the required Gem:

cd ${WORKDIR}
./o3de/scripts/o3de.sh register --gem-path o3de-extras/Gems/ROS2

Please make sure to use the same version of o3de and o3de-extras. This demo was successfully tested with the 2310.1 release.

More information about installing the ROS 2 Gem can be found in the installation guide in ROS 2 Project Configuration. Note that the Gem instructions include the installation of ROS 2 with some additional packages.

To learn more about how the Gem works, check out Robotics in O3DE. The Gem is open to your contributions!

Additional ROS 2 packages

These additional packages need to be installed. Use the following command:

sudo apt install ros-${ROS_DISTRO}-vision-msgs ros-${ROS_DISTRO}-nav-msgs ros-${ROS_DISTRO}-rmw-cyclonedds-cpp ros-${ROS_DISTRO}-cyclonedds

💡 Note: These packages are required in addition to the ones already installed for the ROS 2 Gem.

Required environment settings

Some commands and environment variables are necessary for ROS 2 systems, including this demo, to function properly. It is best to add these commands and settings to your ~/.bashrc or an equivalent file.

ROS 2 distribution should always be sourced when building and running the demo and its command line interfaces. For a typical ROS 2 Iron installation, this would mean running the following for each console:

source /opt/ros/iron/setup.bash

💡 Note: ROS 2 Humble is also supported. In that case, the command would be source /opt/ros/humble/setup.bash.

Currently, we are observing issues when running navigation with FastDDS (the default middleware for ROS 2 Humble and ROS 2 Iron). While the exact cause is yet to be investigated, there are no such issues when running with CycloneDDS. Thus, please set the following:

export RMW_IMPLEMENTATION=rmw_cyclonedds_cpp
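
If you use these settings in every console, a convenient option is to append them to ~/.bashrc. A minimal sketch, assuming a ROS 2 Iron installation in the default location (the ROS_DOMAIN_ID value is only an illustrative example; adjust both to your setup):

# Example additions to ~/.bashrc
source /opt/ros/iron/setup.bash                # source the ROS 2 distribution in every new console
export RMW_IMPLEMENTATION=rmw_cyclonedds_cpp   # use CycloneDDS instead of the default FastDDS (see above)
export ROS_DOMAIN_ID=0                         # optional: keep the same domain ID in every console that talks to the simulator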

Building this project

Build steps

  1. Clone this project:
cd ${WORKDIR}
git clone https://github.com/o3de/ROSConDemo.git
  2. Register this project in the O3DE engine. In the O3DE directory:
cd ${WORKDIR}/o3de
scripts/o3de.sh register -pp ${WORKDIR}/ROSConDemo/Project
  3. Ensure your ROS 2 is sourced:
echo $ROS_DISTRO
> iron

💡 Note: If ROS 2 Humble is sourced, the output is humble.

  4. Configure the build:
cd ${WORKDIR}/ROSConDemo/Project
cmake -B build/linux -G"Ninja Multi-Config" -DLY_DISABLE_TEST_MODULES=ON -DLY_STRIP_DEBUG_SYMBOLS=ON
  5. Execute the build (this will take a while the first time):
cd ${WORKDIR}/ROSConDemo/Project
cmake --build build/linux --config profile --target ROSConDemo Editor AssetProcessor ROSConDemo.Assets

Building the Navigation package

To build the ROS 2 navigation stack configured for this Project, please follow this detailed document. Do not run it yet if you wish to follow the demo scenario.

Launching the Editor

Launch the O3DE Editor:

cd ${WORKDIR}/ROSConDemo/Project
build/linux/bin/profile/Editor

Running the demo scenario

You can try out the demo scenario as presented during ROSCon 2022. Take the following steps:

  1. Launch the Editor and select the Main level. Allow it to load.
  2. Start the simulation with Ctrl-G or by pressing the Play button in the Editor.
  3. When it loads, spawn your first Apple Kraken using the following command: ros2 service call /spawn_entity gazebo_msgs/srv/SpawnEntity '{name: 'apple_kraken_rusty', xml: 'line1'}'.
    1. You can learn more about spawning in this section
  4. Once the simulation is running, start the navigation stack. If you followed all the instructions for setting it up, do the following:
    1. Launch the stack for the first robot with ros2 launch o3de_kraken_nav navigation_multi.launch.py namespace:=apple_kraken_rusty_1 rviz:=True.
    2. You should see a new RViz2 window.
    3. Note that the index suffix _1 was appended to the namespace automatically by the Spawner.
  5. Using RViz2, set the navigation goal with the 2D Goal Pose widget in the toolbar. Click and drag to indicate the direction the robot should face. Set the goal next to an apple tree so that the tree is on the robot's right side, neither too close nor too far. You can set subsequent goals for the robot to move around.
    1. As configured in our package, RViz2 has additional 2D Goal Pose buttons which are hard-set to work with specific robot namespaces.
    2. Use the first button from the left.
  6. Once the robot arrives and stops next to the tree, you can trigger apple gathering.
  7. Either wait for the robot to complete its job (gather all reachable apples) or cancel the gathering through the /apple_kraken_rusty_1/cancel_apple_gathering service.
  8. Select another navigation goal for the robot.
  9. Spawn three other Krakens:
    1. ros2 service call /spawn_entity gazebo_msgs/srv/SpawnEntity '{name: 'apple_kraken_shiny', xml: 'line2'}' && ros2 service call /spawn_entity gazebo_msgs/srv/SpawnEntity '{name: 'apple_kraken_rusty', xml: 'line3'}' && ros2 service call /spawn_entity gazebo_msgs/srv/SpawnEntity '{name: 'apple_kraken_shiny', xml: 'line4'}'
    2. You can also navigate with them using the remaining 2D Goal Pose buttons and trigger gathering events. Follow the instructions in this section to launch the navigation stack for each Kraken.
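
For reference, a hypothetical sketch of launching the navigation stack for all four Krakens, one command per console. The namespaces below assume the Spawner indexed the robots in the order they were spawned; verify the actual names with ros2 node list before launching:

ros2 launch o3de_kraken_nav navigation_multi.launch.py namespace:=apple_kraken_rusty_1 rviz:=True
ros2 launch o3de_kraken_nav navigation_multi.launch.py namespace:=apple_kraken_shiny_1 rviz:=True
ros2 launch o3de_kraken_nav navigation_multi.launch.py namespace:=apple_kraken_rusty_2 rviz:=True
ros2 launch o3de_kraken_nav navigation_multi.launch.py namespace:=apple_kraken_shiny_2 rviz:=True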

💡 Note: If you would like to start the scenario over, remember to close all navigation stacks. You can do this by pressing Ctrl-C in each console where you started the ros2 launch o3de_kraken_nav (..) command.

Controlling the Apple Kraken

Navigation

Please refer to Kraken navigation for instructions.

Triggering Apple Gathering

Check available services in a terminal using this command:

  • ros2 service list

If your simulation is running, you should be able to see the apple gathering service(s) listed there.

  • It should be named /apple_kraken_rusty_1/trigger_apple_gathering. It might appear under another namespace.
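
For example, to show only the gathering services (assuming a standard shell with grep available):

ros2 service list | grep apple_gathering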

If the Apple Kraken is in position next to a tree, you can trigger apple gathering with this command:

  • ros2 service call /apple_kraken_rusty_1/trigger_apple_gathering std_srvs/srv/Trigger

You can also cancel a gathering operation in progress by calling another service:

  • ros2 service call /apple_kraken_rusty_1/cancel_apple_gathering std_srvs/srv/Trigger

Spawning Krakens

Please read the following section on Robot Spawner.

To spawn a new Apple Kraken, you can use named points (provided by a Spawner Component) or custom poses.

Available spawn aliases

You can use the spawn service with the following robot names:

  • apple_kraken_rusty
  • apple_kraken_shiny
  • apple_kraken (defaults to shiny). The rusty and shiny variants are functionally identical.

Available named spawn poses

There are several named poses (line1 through line4) conveniently placed at the entrances to the apple orchard rows.

Example calls:

Named point:

ros2 service call /spawn_entity gazebo_msgs/srv/SpawnEntity '{name: 'apple_kraken', xml: 'line1'}'

Free pose:

ros2 service call /spawn_entity gazebo_msgs/srv/SpawnEntity '{name: 'apple_kraken', initial_pose: {position:{ x: 4, y: 4, z: 0.2}, orientation: {x: 0.0, y: 0.0, z: 0.0, w: 1.0}}}'

Troubleshooting

Check-list

  • Is O3DE running ok with an empty or default project?
  • Is ROS 2 installation ok? (check with ros2 topic pub etc.)
  • Is ROS 2 workspace sourced? (check ROS_DISTRO, AMENT_PREFIX_PATH)
    • Note this needs to be true before cmake is run. Re-run configuration and build when in doubt.
  • Do you have compatible settings for crucial ENV variables when running the navigation / orchestration stack in the console and when running the simulator?
    • check RMW_IMPLEMENTATION, ROS_DOMAIN_ID etc.
  • Check the console for errors as well as logs. From the Project folder, check user/log/Editor.log.
  • Are simulation topics up when you play the simulation?
    • ros2 node list should include /o3de_ros2_node
    • ros2 topic list should include /clock, /tf and /tf_static regardless of robot presence.
    • topic list should also include /pc, /ackermann_vel and /ground_truth_3D_detection if there is a robot in the scene and the simulation is running.
      • note that these topics will be namespaced.
    • ros2 service list should also show several simulation and robot services such as spawning and apple gathering.
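
A quick pass over these topic and service checks from a sourced console might look like the sketch below (with a robot spawned, its topics will carry the robot namespace, e.g. /apple_kraken_rusty_1/pc):

ros2 node list                                # should include /o3de_ros2_node
ros2 topic list                               # should include /clock, /tf and /tf_static
ros2 topic echo /clock --once                 # confirms the simulation clock is being published
ros2 service list | grep -e spawn -e apple    # spawning and apple gathering services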

License

For terms please see the LICENSE*.TXT files at the root of this repository.

roscondemo's People

Contributors

adamdbrw, adamsj-ros, alek-kam-robotec-ai, amzn-alexpete, amzn-changml, amzn-pratikpa, amzn-tommy, antoni-robotec, antonmic, arturkamieniecki, hultonha, j-rivero, jhanca-robotecai, mbalfour-amzn, michalpelka, pawelbudziszewski, pawellech1, pijaro, rainbj, shawstar, smurly, spham-amzn

roscondemo's Issues

Identify Orchestration Needs

Identify the scripts necessary for the demo. For example, we may need a script to harvest the apples from the tree (if the robot uses suction to pick the apple, then make the apples disappear when they are harvested, and make them reappear when dropped into the bin).

Acceptance Criteria:

  • A list of scripts necessary for the demo is documented
  • A set of GitHub issues for each needed script is created

Create the Skybox

Create the skybox, based on the accepted design.

Acceptance Criteria

  • Skybox is created

Linked Issues

  • Predecessor Task: [#13]

Orchestration Needs: apple storage management (spawning crates)

This subroutine handles apple storage, including:

  1. Holds a counter of stored apples and other queryable counts of interesting events, such as the global count of apples picked by this robot, apples picked in manual mode, etc.
  2. Is parametrized for capacity and crate count.
  3. Handles the storage-threshold-reached event by spawning crate(s).

Relevant issues:

  • #44. This subroutine realizes the store the apple (..) block
  • #41 - a crate to spawn

Execute QA

Execute the test plan.

Acceptance Criteria

  • All test cases are executed
  • No critical or blocker issues exist

Linked Issues

  • Predecessor Task: [#23]

Plan Logistics

Enumerate logistics needs.

Examples:

  • Determine if we can get the required hardware in Japan, and submit rental agreement if so
  • Ensure the machine will display English
  • Order adapters
  • Design and acquire signage (e.g. standing banners to place around booth)
  • etc.

Acceptance Criteria

  • GHI tasks are created for each logistic item, so they can be actioned

Document AWS Robomaker Setup

Document all the steps needed to set up the demo and get it running on AWS Robomaker.

Acceptance Criteria

  • Steps are documented and posted in an externally-accessible location

Upload Initial Project to Demo Repo

Upload an initial project to the demo repo.

Acceptance Criteria:

  • An O3DE project is created
  • The ROS2 gem is added to the project as a dependency
  • The project is uploaded to the repo

Robot platform

Create a platform which measures X by Y metres. An arm, wheels, a basket (and a motor?) will be attached to it.

Create Robot Vehicle Asset

Create the robot vehicle asset, based on the accepted design.

Acceptance Criteria

  • A URDF robot vehicle asset, defined with O3DE-supported meshes (and no <gazebo> tags), is created

Linked Issues

  • Predecessor Task: [#4]
  • Successor Task: [#8]

Additional context

Consult with O3DE ROS2 Gem URDF import developers. Some features in URDF are more difficult to support within the current state of Colliders and Joints. Example:

  • do not use origin in <collision> (it is fine in <visual>).

AWS Sizzle Reel - Document Requirements

To support showing a looped video outside of the ROSCon booths, document the shot list for the sizzle reel, and the technical requirements (e.g. high-def, 4k, etc.).

Super rough shot list example:

  • Camera flies over orchard showing robot picking apples
  • Camera flies through loft showing robot maneuvering
  • etc.

Acceptance Criteria

  • The shot list can be used to produce screen recordings

Linked Issues

  • Successor Task: [#26]

Create Robot Vehicle Prefab

Create a robot vehicle prefab using imported asset files and components from the ROS2 gem.

Acceptance Criteria

  • A prefab is created that can be imported into the level and communicated with over the ROS2 bridge.
  • Initial component values are set (Mass, Power, etc.)

Linked Issues

  • Predecessor Task: [#7]

Document Apple Tree Functional Design

Document the apple tree functionality requirements. Consider the force required to detach the apples from the tree, etc.

Acceptance Criteria

  • Design is reviewed and agreed upon with all parties (Robotec.ai, Open Robotics, AWS)

Linked Issues

  • Successor Task: [#16]

Live Demo: manual control of manipulator

Demo participants are going to control the manipulator to pick apples.

  1. Define inputs for keyboard and game pad (which can be supported through a ROS 2 node even if O3DE does not support it yet).
  2. Adjust the dimensions of the manipulator, the frame and the trees to make sure this task can be completed and makes sense (there is about 1 minute of work).
  3. Work on inputs so that the manipulator movements are smooth and controllable. Make sure the task is easy enough and not frustrating.
  4. Visualize that an apple is picked and perhaps automate returning to the storage position (show text to explain).
  5. Add a camera which is suitable for picking.

Logistics:
If a Game Pad works, a couple of devices could be taken by Robotec.ai (let me know if you prefer to take them yourself).
@forhalle could we make sure we have items to periodically disinfect gamepads / keyboard?

AWS Sizzle Reel - Produce Screen Recordings

Produce all screen recordings identified in the sizzle reel design, including any logo fade in/outs.

Acceptance Criteria

  • All screen recordings exist, and can be assembled into a single video

Linked Issues

  • Predecessor Task: [#25]
  • Successor Task: [#27]

Build level

Assemble all assets (robot vehicle prefab, plants with fruit, terrain, lighting, etc.) into a level.

Acceptance Criteria:

  • Level is built and is usable

Linked Issues

  • Predecessor Task: [#8]
  • Predecessor Task: [#14]
  • Predecessor Task: [#15]
  • Predecessor Task: [#18]
  • Predecessor Task: [#17]

Document Environment Design

Document the environment design, including the terrain requirements (ground, number of tree rows, height of trees, fences, mountains, skybox, etc.), and lighting design.

Acceptance Criteria

  • Design is reviewed and agreed upon with all parties (Robotec.ai, Open Robotics, AWS)
  • GitHub issues are created for all required assets (Note: Some are already created - check GitHub)

Out of scope:
Apple tree design is a separate task

Linked Issues

  • Successor Task: [#14]
  • Successor Task: [#18]

Inspiration:
Image

AWS Live Demo - Document Script (demo steps + talking points)

To support manually executing a live demo inside the ROSCon booths, document each step of the user story to be told through the demo, including the estimated amount of time dedicated to each step.

For example:
0:00 - 2:00: Import robot into software
2:01 - 2:30: Insert image recognition software
2:31 - 5:00: Manually drive robot, identify fruit, move the robot arm to pick the fruit, and place fruit in the vehicle's container
etc.

Acceptance Criteria:

  • Script is reviewed and agreed upon with all parties (Robotec.ai, Open Robotics, AWS)

Create other environment assets

Create other environment assets, as outlined in the environment design.

Acceptance Criteria

  • All other environment assets (e.g. fences, hedges, etc.) are created

Linked Issues

  • Predecessor Task: [#12]
  • Predecessor Task: [#13]

Live Demo: a gamified experience of picking apples

We would like to make apple picking with Kraken a fun experience for one minute.

  1. Prepare a script for a short explanation of the task, how to control the robot etc.
  2. Have a timer start on a key/button press: "Start" (not the same key/button as manual control takeover, since we need to explain things). 1 minute is good.
  3. Keep score, include nice visuals like increasing score, apples left etc - design the UX here.
  4. Have a mini-window which shows a camera view from manipulator with a crosshair in the middle.
  5. Optionally: decide on elements such as high score etc.
  6. Decide on reward (stickers etc.)

Create QA Test Plan

Create a QA test plan.

Acceptance Criteria

  • Test Plan includes testing of the Gem, in addition to tests defined by the demo script
  • Test Plan is agreed upon with all parties

Linked Issues

  • Predecessor Task: [#6]
  • Predecessor Task: [#3]

Create Apple Tree Prefab

Create an apple tree prefab using imported asset files.

Acceptance Criteria

  • A prefab is created that can be imported into the level.

Linked Issues

  • Predecessor Task: [#16]

Orchestration Needs: move manipulator to a desired x,y,z

Realize this task through Manipulation Component in O3DE.

  1. Try to reach a given position. Inform on completion, or return an error if the position is unreachable or the attempt timed out.
  2. Can be queried for reachable positions (a cube?).
  3. Can be queried whether a given position is reachable.

Orchestration Needs: Apple detector and pick task planner

This script / node is responsible for:

  1. Determining positions of all apples in a certain viewport (parametrized). Vision / ground truth.
  2. Determining which of these apples are reachable. Vision / ground truth.
  3. Publishing or returning a queue of positions of all reachable apples.

Relevant issue: #44. This orchestration realizes the "Queue all reachable apples" block.

Document how to set up / configure the agricultural demo (i.e. Update the project readme)

Make the Readme ready for Demo:

  1. Describe the purpose of this project, what it demonstrates.
  2. Add instructions on how to download, build and run. Link to the Gem, user guide.
  3. Describe level(s).
  4. Instructions on how to import Apple Kraken.
  5. Instructions on how to operate the robot manually.
  6. Instructions on how to run it with navigation stack and global automation.
  7. Troubleshooting section

Live Demo - Document Demo FAQ

Document a list of hypothetical questions (an FAQ) we may receive with agreed upon answers.

Acceptance Criteria:

  • FAQ is reviewed and agreed upon with all parties (Robotec.ai, Open Robotics, AWS)

Orchestration Needs - apple gathering

Related to #42 since it realizes the Gather all apples block.

Some characteristics of this orchestration:

  1. Does not care about any kind of movement or navigation.
  2. Assumes it is always at a gathering point when run.
  3. It is finished when all reachable apples have been gathered (optionally, also on a timeout).
  4. It will work well when started in the middle of picking or even with the manipulator holding an apple (in a little box).
  flowchart TD;
      A[Wait for work]-->B{Task received?};
      B -- No --> A;
      B -- Yes --> Q[Queue all reachable apples];
      Q --> C{Is apple held by/in manipulator?};
      C -- Yes --> D{Is manipulator in storage position?};
      D -- Yes --> E[Store the apple / handle full storage condition];
      D -- No --> F[Move manipulator to storage position];
      F --> D;
      C -- No --> H{Apple queue empty?};
      E --> H;
      H -- Yes --> Z[Notify that task is completed];
      H -- No --> I[Pop apple pick task from the queue];
      I --> J{Is manipulator in apple pick up position for current apple?};
      J -- No --> K[Move manipulator to picking position for current apple];
      K --> J;
      J -- Yes --> L[Attempt to pick apple];
      L --> M{Was picking successful?};
      M -- Yes / timeout / attempts limit --> C;
      M -- No --> L;
      Z --> A;

Orchestration Needs: Global state machine for gathering apples in orchard

This could be realized by scripting or by a custom (purpose-built) ROS 2 node.
We will handle only one robot with this orchestration, but it should be done in a flexible way to accommodate multiple robots (and not run into conflicts, at least for some time, e.g. for an entire row).

It would be best if this orchestration were robust enough to pick up operation from any valid point.

Is Simulation running -> Await simulation start

  flowchart TD;
      A[Start]-->B{Is simulation running?};
      B -- No --> C[Wait for simulation start]
      C --> B
      B -- Yes -->D{Is robot spawned?};
      D -- No -->E[Spawn Robot];
      D -- Yes --> F{Is robot in a gathering position?};
      E --> D;
      F -- No --> G[Navigate to the closest gathering position - Start Point];
      F -- Yes --> H[Gather all apples];
      G --> F;
      H --> I{Is finished globally?};
      I -- Yes --> Z[Terminate or reset];
      I -- No --> G;

Notes:
Run apples task - it is useful to add a timeout in case we can't gather all apples.

Document Robot Vehicle Design

Document the robot vehicle design, including the specification (size, number, etc.) of all parts (wheels, motor, container, robotic arm, sensors, etc.), as well as scale, mesh, joint, and rig requirements.

Acceptance Criteria

  • Design is reviewed and agreed upon with all parties (Robotec.ai, Open Robotics, AWS)

Linked Issues

  • Successor Task: [#7]

Create Apple Tree Asset

Create the apple tree asset (including apples), based on the accepted design.

Acceptance Criteria

  • FBX apple tree asset(s) are created

Linked Issues

  • Predecessor Task: [#13]
  • Successor Task: [#15]

Validation mini level

Prepare a level which is a mini cutout of the main scene, which:

  1. Is a strict subset of the main demo level (no new or changed objects, all coordinates are the same as in original scene).
  2. Contains only 6 apple trees (3 each side, mini row) - selected close to the dirt road.
  3. Contains a very small area of the dirt road attached to the mini row - so we can put a robot there for the navigation scene.
  4. Uses the same dirt/grass ground material (and a similar ground challenge) as the main scene.
  5. Includes an imported robot in a starting position (which can be next to the first apple tree - for Apple Picking tasks, or on the Dirt Road - for navigation).
  6. Should have no content outside of its bounds (e.g. 8x8 meters or whatever fits all the points)

This would increase our ability to prototype (with increased performance and decreased loading time).

Live Demo: manual takeover and return

While the Orchestration global automation task is running, we would like to be able to take over.

  1. For simplicity, only when the robot is in a gathering position, not in between.
  2. Define inputs for takeover (For example, "M" key for manual, Pad "start").
  3. Visualize state (Show "Manual mode" or "Automated mode" in UI).
  4. Make sure orchestration is supportive of takeover and just waits until control is returned.

Create Terrain

Create the terrain based on the agreed upon environment design.

Acceptance Criteria

  • Noise map is created to generate the terrain height
  • Textures are created (soil, grass, etc.)

Linked Issues

  • Predecessor Task: [#12]

Identify Animation Needs

Identify the animations necessary for the demo.

Acceptance Criteria:

  • A list of animations necessary for the demo is documented
  • A set of GitHub issues for each needed animation is created

Document Apple Tree Visual Design

Document the apple tree visual design. Consider the number of tree sizes, the number of apples per tree, apple color variation requirements, apple variety (granny smith, red delicious, etc.), etc.

Acceptance Criteria

  • Design is reviewed and agreed upon with all parties (Robotec.ai, Open Robotics, AWS)

Linked Issues

  • Successor Task: [#16]

Orchestration Needs - navigation

This task only handles the semantics of navigation to specific points.

The behavior would likely be better if we limit targets to gathering points (a specified, ordered map).

  1. It should be aware whether it is at gathering point n, gathering point n+1, or on the way between them.
  2. Abstract "get to next gathering spot" and realize it regardless of whether it is in the middle of a row or at the end of the row.
  3. Cares about a good map of gathering points in terms of world poses (position, orientation). We can assume traffic within a row is always ordered (e.g. we always gather on the right side of the robot and move forward to the next one).
  4. Does not care about apples at all.
  5. Communicates with nav2 stack

Related to #42 since it would be called for "navigate to" blocks.

Robot wheels

Create wheels for the robot. The radius must be X metres.
