frc2023's People

Contributors

baconman125, bcow25, cybertron51, ebay-kid, kulkarnisoham833, maxchen132, seanson2005, siiverfish, the-big-z, thedoughnutman, troyfrc3952, yavko


frc2023's Issues

Calibrate RobotGyro

The RobotGyro has a slight amount of drift, so the error should be measured over a set amount of time and stored for future corrections (or find a built-in function that accomplishes this).
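One way to sketch the drift measurement (the class and method names below are made up for illustration, not existing code): sample the heading over a window while the robot sits still, derive a drift rate in degrees per second, and subtract the accumulated drift from later readings.

```java
/**
 * Sketch of a gyro drift calibration routine. While the robot is stationary,
 * record the heading at the start and end of a fixed window, compute the
 * drift rate, and correct future readings by the drift accumulated since.
 */
public class GyroDriftCalibrator {
    private double driftRatePerSec = 0.0;      // measured drift, degrees per second
    private double calibrationEndTime = 0.0;   // seconds

    /** Record start/end samples of a stationary window and derive the drift rate. */
    public void calibrate(double startAngle, double startTimeSec,
                          double endAngle, double endTimeSec) {
        driftRatePerSec = (endAngle - startAngle) / (endTimeSec - startTimeSec);
        calibrationEndTime = endTimeSec;
    }

    /** Corrected heading: raw reading minus drift accumulated since calibration. */
    public double correctedAngle(double rawAngle, double nowSec) {
        return rawAngle - driftRatePerSec * (nowSec - calibrationEndTime);
    }
}
```

Many gyro classes in WPILib also expose a built-in `calibrate()` that does something similar at startup, so it is worth checking for that first.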

Autonomous Planner

Be able to plan out trajectories from the Driver Station computer via a GUI, with parsing code on the robot.

Implement automatic claw rotation for cones

We will have a camera mounted on the claw, pointing in the direction which the claw faces. Assume we can get the orientation angle (in degrees) of the cone from camera vision, where the angle is 0 when the tip of the cone points vertically upwards from the camera's perspective. Implement automatic claw rotation in which the claw automatically rotates to match the orientation angle of the cone.
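The core of this is the angle math: given the cone angle from vision, compute the shortest signed rotation the claw needs, wrapping into (-180, 180] so the claw never takes the long way around. A minimal sketch (the class name is a placeholder):

```java
/**
 * Sketch of the claw-rotation math. Angles are in degrees, with 0 meaning
 * the cone tip points straight up from the camera's perspective.
 */
public final class ClawRotationMath {
    /** Shortest signed rotation (degrees) from current claw angle to the cone angle. */
    public static double shortestDelta(double clawAngleDeg, double coneAngleDeg) {
        double delta = (coneAngleDeg - clawAngleDeg) % 360.0;
        if (delta > 180.0) delta -= 360.0;
        if (delta <= -180.0) delta += 360.0;
        return delta;
    }
}
```

Each control loop, the result would be fed into whatever PID or motion control drives the claw's rotation motor until the delta is within tolerance.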

Comment if you are taking this issue, thanks!

Implement Automatic Object Placement

You may have to look up or ask for information about the game field to properly do this one.

You can assume that we can get the robot to any specific apriltag (see game manual for game field). When the robot is at any specific apriltag, we want preset configurations for the robot to place objects on the poles/platforms for scoring. These measurements should be the same for each apriltag because their surrounding grids are identical. There is a setIntendedCoordinates() method that sets the arm to a specific coordinate relative to the center of the robot, measured in inches. Find the coordinates of each pole/platform relative to the center of the robot, assuming it is up against an apriltag, and implement a command to be able to move the arm to each position (using setIntendedCoordinates()). You can store the numerical coordinate values in the Constants.java file.
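The presets could be structured roughly like this. All numeric values below are placeholders to be replaced with real field measurements (and moved into Constants.java); the enum and method names are also assumptions, not existing code. Coordinates are (x, y) in inches relative to the center of the robot, matching what setIntendedCoordinates() expects.

```java
/**
 * Sketch of preset arm placement coordinates, assuming the robot is squared
 * up against an AprilTag. PLACEHOLDER numbers only; measure the real field.
 */
public final class PlacementPresets {
    public enum Node { LOW_POLE, HIGH_POLE, LOW_PLATFORM, HIGH_PLATFORM }

    /** Returns {x, y} in inches relative to robot center. PLACEHOLDER values. */
    public static double[] coordinatesFor(Node node) {
        switch (node) {
            case LOW_POLE:      return new double[] {22.0, 34.0};
            case HIGH_POLE:     return new double[] {40.0, 46.0};
            case LOW_PLATFORM:  return new double[] {22.0, 23.0};
            default:            return new double[] {40.0, 35.0}; // HIGH_PLATFORM
        }
    }
}
```

A placement command would then look up the preset and call setIntendedCoordinates(coords[0], coords[1]) on the arm subsystem.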

Comment on this issue if you are working on it, thanks!

AprilTag Pose2d Localization

Be able to return a Pose2d object, given the output from AprilTags code. Pose2d should reflect robot's position.

Also hard-code the Pose2ds of all 8 AprilTags on the field.
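WPILib's Pose2d and Transform2d already provide these operations, but the underlying algebra is worth seeing: given a tag's hard-coded field pose and the tag's pose as observed from the robot, the robot's field pose is tagInField composed with the inverse of tagInRobot. A plain-Java sketch (the class and a simple pose struct stand in for the WPILib types):

```java
/** Sketch of AprilTag-based localization math in plain Java. */
public final class TagLocalization {
    /** Minimal pose: x, y (pick one unit consistently) and heading in radians. */
    public static final class Pose {
        public final double x, y, theta;
        public Pose(double x, double y, double theta) { this.x = x; this.y = y; this.theta = theta; }
    }

    /** Express b (given in a's frame) in a's parent frame. */
    public static Pose compose(Pose a, Pose b) {
        double c = Math.cos(a.theta), s = Math.sin(a.theta);
        return new Pose(a.x + c * b.x - s * b.y, a.y + s * b.x + c * b.y, a.theta + b.theta);
    }

    /** Inverse pose, so compose(p, inverse(p)) is the identity. */
    public static Pose inverse(Pose p) {
        double c = Math.cos(p.theta), s = Math.sin(p.theta);
        return new Pose(-(c * p.x + s * p.y), s * p.x - c * p.y, -p.theta);
    }

    /** Robot's field pose from a known tag pose and the observed tag-in-robot pose. */
    public static Pose robotInField(Pose tagInField, Pose tagInRobot) {
        return compose(tagInField, inverse(tagInRobot));
    }
}
```

In the real code, the hard-coded tag poses would live next to the other field constants, and the equivalent WPILib call chain (Transform2d, Pose2d.transformBy) should be preferred over hand-rolled math.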

Make sure arm code works

ArmSubsystem, InverseKinematicsUtil, ForwardKinematicsUtil, ArmCommand -> make sure these work correctly together

(This is probably not very easy to do but you know I'll make this an issue anyways)

Global Coordinate System

We need a standard coordinate system to base all coordinates on, regardless of which side of the field we are on, to avoid math errors.

Arm "Master Plan"

From Sean's instructions written on the whiteboard (also posted in discord):

Step 1: Joystick will determine if the cone is on its side or straight up ("Pickup Position").

Step 2: Based on the chosen "Pickup Position":
2a) If "cone on side": set flipped in InverseKinematicsUtil to true, so that the arm will make an upside-down U shape (no code needed for the actual movement), and the claw will approach the cone from the top.
2b) If "cone upright" (THIS OPTION ALSO FOR CUBE): set flipped in InverseKinematicsUtil to false, so that the arm will approach the game piece (cone OR cube) from the side.

Step 3: Use the Limelight to precisely position the arm for piece grabbing.
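Step 2 above boils down to a single decision, which could be sketched as a small helper (the enum and method names are assumptions, not existing code): only the cone-on-side pickup uses the flipped, upside-down U arm shape.

```java
/** Sketch of the flipped-flag selection from Step 2 of the arm plan. */
public final class PickupConfig {
    public enum PickupPosition { CONE_ON_SIDE, CONE_UPRIGHT, CUBE }

    /** true => InverseKinematicsUtil should use the flipped (upside-down U) solution. */
    public static boolean flippedFor(PickupPosition pos) {
        return pos == PickupPosition.CONE_ON_SIDE;
    }
}
```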

Big to-do list

This is just a to-do list of the stuff we have to do (it's not an actual issue, just a list)

  • System Identification (SysId)
  • Test trajectories
  • Find accurate april tag end poses for trajectory generation (where the center of the robot needs to be, not the actual location of the april tag) (almost have this)
  • Make Point class for coordinates maybe (Ivan idea)
  • Ensure that arm code works (important!)
  • Figure out detecting both cones and cubes at once
  • Find area of each, and test implementation of automatic arm movement to each
  • For the above, we might need to be able to tell whether the limelight is seeing a cone or a cube (maybe through color) (this would go in the Python codebase linked in the README)
  • Figure out implementation of gui and how we want to use it
  • Implement keyboard control through python gui codebase
  • More stuff later probably

Implement smooth travel and turning onto a Tank Drive (field oriented drive)

Normally, a tank drive can only go forward and backward, with turning on separate controls. We want to be able to turn the robot and move it with the same joystick. For example: if the robot is currently pointed forward and the joystick is pushed 90 degrees to the right, the robot keeps moving while turning right until it faces 90 degrees, then continues straight.
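One way to sketch the joystick-to-heading math (class name and gain are assumptions): treat the stick direction as a target heading, compute the wrapped heading error against the gyro, and produce arcade-style forward/turn outputs so the robot arcs toward the stick direction while moving.

```java
/** Sketch of field-oriented tank drive math. Angles in degrees. */
public final class FieldOrientedTank {
    /** Wrap an angle into (-180, 180]. */
    public static double wrapDeg(double deg) {
        double d = deg % 360.0;
        if (d > 180.0) d -= 360.0;
        if (d <= -180.0) d += 360.0;
        return d;
    }

    /** Target heading from the stick: 0 = stick forward, 90 = stick right. */
    public static double targetHeadingDeg(double stickX, double stickY) {
        return Math.toDegrees(Math.atan2(stickX, stickY));
    }

    /** Arcade outputs {forward, turn}: drive at stick magnitude, turn toward the target. */
    public static double[] outputs(double stickX, double stickY,
                                   double gyroHeadingDeg, double turnKp) {
        double error = wrapDeg(targetHeadingDeg(stickX, stickY) - gyroHeadingDeg);
        double magnitude = Math.hypot(stickX, stickY);
        double turn = Math.max(-1.0, Math.min(1.0, turnKp * error)); // clamped P control
        return new double[] { magnitude, turn };
    }
}
```

The tank sides would then be left = forward + turn and right = forward - turn, with the usual clamping to [-1, 1].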

Add claw limit switch

There's a limit switch on the claw. When it is pressed, the claw should stop closing and the encoder value should be reset to the initial value (aka MIN_GRIP_ENCODER_VALUE).
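The logic could be sketched like this (the class name, the sign convention that positive power closes the claw, and the placeholder constant value are all assumptions): cut closing power once the switch trips, and re-seed the encoder at that moment.

```java
/** Sketch of the claw limit-switch handling. */
public final class ClawLimitLogic {
    public static final double MIN_GRIP_ENCODER_VALUE = 0.0; // placeholder value

    /** Clamp the requested power: no further closing once the limit trips. */
    public static double safeClawPower(double requestedPower, boolean limitPressed) {
        // Assumed convention: positive power closes the claw, negative opens it.
        if (limitPressed && requestedPower > 0.0) return 0.0;
        return requestedPower;
    }

    /** The encoder should be re-seeded to MIN_GRIP_ENCODER_VALUE when the limit is pressed. */
    public static boolean shouldResetEncoder(boolean limitPressed) {
        return limitPressed;
    }
}
```

In the periodic loop of the claw subsystem, the motor would be set to safeClawPower(...) and, when shouldResetEncoder(...) is true, the encoder position set to MIN_GRIP_ENCODER_VALUE.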

Code autonomous modes

The autonomous mode is the 15 second period at the beginning of the match in which the robot operates without any input from the drivers. We want to have different autonomous commands to run. Your task is to code different autonomous mode commands, according to instructions (probably from Jaci) that I will post here once I receive them.

There are multiple things you'll want to understand before you start this. First, you'll want to know how to use PathWeaver, since we will be using PathWeaver to move the robot efficiently along trajectories during autonomous. Whenever we want to move the robot from point A to point B, most of the time we will want to use PathWeaver to draw a trajectory between the two points. Documentation of PathWeaver can be found in issue #35.

Second, you'll want to understand our autonomous command structure and the way commands work. Documentation on commands can also be found in the Read documentation issue.

Explanation for autonomous command structure:
Our autonomous command structure is made up of several parts:

  1. Code in RobotContainer.java: the code here initializes and passes the selected autonomous command to the robot to be run in autonomous mode. We select the different autonomous commands through SmartDashboard (a dashboard which we use to help control the robot), which is implemented in RobotContainer through SendableChooser<Command>. Furthermore, all trajectory commands generated by PathWeaver are retrieved from their JSON files and stored in RobotContainer.

  2. Code in Autons.java: Autons.java can be found in the commands.autocommands folder and contains multiple factory methods that create and return Commands to be run in RobotContainer (factory methods are methods that generate objects). Each factory method returns a Command that is passed through SendableChooser to be an option on SmartDashboard, so that we can choose different autonomous commands to run depending on the situation during competition. In other words, each method in Autons.java represents one autonomous command.

  3. How to actually code autonomous commands: To create our autonomous commands, we will be using command compositions, which is the combination of smaller commands/actions to create larger commands (you can read about command compositions in the documentation). In order to generate instant actions/commands, we use this:

Commands.runOnce(() -> {
    // Autonomous scenario code
}, subsystem1, subsystem2);

If the -> operator is new to you, look up Java lambdas. subsystem1 and subsystem2 can be any number of subsystems that the command needs to access. You will most likely use this structure to code all of the arm instructions, as the drive instructions will be handled by the trajectory commands stored in RobotContainer. These instant commands can be concatenated with other commands using the .andThen() and .alongWith() decorators, which is how we will call both the instant commands and trajectory commands together as one command. Additionally, you will need to add the subsystems that you want to use as parameters to the factory method you are working on (see comments in Autons.java for more details).
TLDR: use command compositions to create autonomous commands to be passed to the robot and run during autonomous modes.

As always, if there are any questions feel free to ask. Sorry for the wall of text, some of it might be hard to understand because I'm tired.

Implement keyboard as controller into the code base

We want to be able to use the laptop keyboard to help with controlling the robot. The code should be able to take inputs from different pressed keys on the keyboard. Any implementations should be added in the controllers folder in a class named KeyboardController.java or something similar. I'm not sure if there's documentation on this out there, but if anyone finds anything or has any ideas for implementation feel free to try it.
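One possible shape for the class (everything here is an idea, not existing code): the Python GUI codebase publishes the currently held keys as a string, for example over NetworkTables, and the robot-side KeyboardController parses it into a key-state set that command bindings can query like joystick buttons.

```java
import java.util.HashSet;
import java.util.Set;

/**
 * Sketch of a KeyboardController: holds the set of currently pressed keys,
 * updated from a comma-separated string supplied by an external GUI.
 */
public final class KeyboardController {
    private final Set<String> pressed = new HashSet<>();

    /** Update from a comma-separated list of currently held keys, e.g. "w,a". */
    public void update(String heldKeysCsv) {
        pressed.clear();
        for (String key : heldKeysCsv.split(",")) {
            String k = key.trim().toLowerCase();
            if (!k.isEmpty()) pressed.add(k);
        }
    }

    /** Query like a joystick button, e.g. isPressed("w") for forward. */
    public boolean isPressed(String key) {
        return pressed.contains(key.toLowerCase());
    }
}
```

The update() call would run each robot loop from whatever transport the GUI uses; the transport choice (NetworkTables or otherwise) is still open.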

Arm Movement Specification (controller)

Driver relative

  • Forward stick: extend arm (as power)
  • Button 6: down to pick up
  • Trigger: close claw (hold down to keep closing)
  • Button 7: open claw
  • Pull back stick: return to start config (N)
  • Left/right stick: turn turret
  • Button 5: claw palm moves to left side (90deg cap)
  • Button 4: claw palm moves to right side (90deg cap)
  • Button 3: claw horizontal

Do Whatever

If anyone has any ideas of stuff to do with the robot or the code base or anything else, comment here what you are doing, and feel free to code it up :)
