Kenneth Hoffmann, Julia Di
Project code repository for the affiliate partnership between Stanford's Center for Design Research and EpiSci Systems Inc.
The perching system flight experiments involve three components:
- Perception
  - The drone is first in a hover state while live camera images are collected.
  - The user may click on the tree in the camera image.
  - The perception system segments the tree and detects its centroid.
  - The tree pose is continuously tracked.
- Planner (see the interface sketch after this list)
  - Takes in the tree pose, tree velocity, drone pose, and drone velocity.
  - Outputs a trajectory and setpoints.
- Perching sequence, once the drone is ready to initiate it:
  - A Time-of-Flight sensor on the gripper checks the distance between the gripper and the tree.
  - The "elbow" hinge is triggered to release the drone vertically.
  - The drone throttles down.
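As a rough illustration of the planner's data flow, the sketch below shows a rospy node wired to the inputs and outputs listed above. The topic names, message types, and the placeholder setpoint logic are assumptions for illustration only; the actual interfaces are defined by the nodes and launch files in this repository.

```python
# Minimal sketch of the planner's topic interface. Topic names are illustrative,
# not the actual topics used by the launch files in this repository.
import rospy
from geometry_msgs.msg import PoseStamped, TwistStamped


class PlannerSketch:
    def __init__(self):
        self.tree_pose = None
        self.tree_vel = None
        self.drone_pose = None
        self.drone_vel = None

        # Inputs: tree pose/velocity from perception, drone pose/velocity from mocap/autopilot.
        rospy.Subscriber("/tree/pose", PoseStamped, lambda m: setattr(self, "tree_pose", m))
        rospy.Subscriber("/tree/velocity", TwistStamped, lambda m: setattr(self, "tree_vel", m))
        rospy.Subscriber("/drone/pose", PoseStamped, lambda m: setattr(self, "drone_pose", m))
        rospy.Subscriber("/drone/velocity", TwistStamped, lambda m: setattr(self, "drone_vel", m))

        # Output: position setpoints sampled from the generated trajectory.
        self.setpoint_pub = rospy.Publisher("/drone/setpoint", PoseStamped, queue_size=1)

    def update(self, _event):
        if self.tree_pose is None or self.drone_pose is None:
            return  # wait until both poses have been received
        setpoint = PoseStamped()
        setpoint.header.stamp = rospy.Time.now()
        setpoint.header.frame_id = "world"
        # A real planner would sample a smooth trajectory toward the tree here;
        # this placeholder simply commands the current tree position.
        setpoint.pose = self.tree_pose.pose
        self.setpoint_pub.publish(setpoint)


if __name__ == "__main__":
    rospy.init_node("planner_sketch")
    node = PlannerSketch()
    rospy.Timer(rospy.Duration(0.05), node.update)  # 20 Hz setpoint stream
    rospy.spin()
```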
User input is required to (1) click on a perching location, (2) start the perching sequence, and (3) verify the perch state.
In addition, the system can detect perch failures from IMU data.
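The exact failure criterion lives in the state machine code; the sketch below shows one plausible way to flag a perch failure from IMU data, by detecting a near-free-fall accelerometer reading after the drone has perched. The topic name and threshold are assumptions, not the values used in this repository.

```python
# Sketch of IMU-based perch-failure detection. The actual criterion used by the
# state machine is in percher/src/state_machine.cpp; the topic name and the
# free-fall threshold below are assumptions for illustration.
import math

import rospy
from sensor_msgs.msg import Imu
from std_msgs.msg import Bool

FREE_FALL_ACCEL = 3.0  # m/s^2; well below gravity, suggests the drone dropped off the tree


def imu_callback(msg, pub):
    a = msg.linear_acceleration
    accel_norm = math.sqrt(a.x ** 2 + a.y ** 2 + a.z ** 2)
    # While firmly perched, the accelerometer should read roughly 9.81 m/s^2 (gravity).
    # A sustained near-zero reading indicates free fall, i.e. a failed perch.
    pub.publish(Bool(data=accel_norm < FREE_FALL_ACCEL))


if __name__ == "__main__":
    rospy.init_node("perch_failure_monitor")
    failure_pub = rospy.Publisher("/perch/failure", Bool, queue_size=1)
    rospy.Subscriber("/mavros/imu/data", Imu, imu_callback, callback_args=failure_pub)
    rospy.spin()
```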
- This work uses ROS 1.
- The drone used for experiments is a Uvify IFO-S modified with microspine gripper hardware.
- To use the deep learning tree detectors (e.g. Segment Anything), processing must be done on a CUDA-enabled device; see the segmentation sketch below.
- Drone pose is tracked with an OptiTrack motion capture system.
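For reference, here is a minimal sketch of the click-to-centroid step using the public `segment_anything` API. The model type, checkpoint filename, and image handling are assumptions; the perception node in this repository may invoke the detector differently.

```python
# Sketch of the click -> segment -> centroid step using Segment Anything.
# The checkpoint filename, model type, and image source are assumptions; the
# repository's perception node may load and call the model differently.
import cv2
import numpy as np
from segment_anything import sam_model_registry, SamPredictor


def tree_centroid(image_bgr, click_xy, checkpoint="sam_vit_h_4b8939.pth"):
    """Segment the tree at the clicked pixel and return the mask centroid (u, v)."""
    sam = sam_model_registry["vit_h"](checkpoint=checkpoint)
    sam.to(device="cuda")  # SAM is impractically slow without a CUDA-enabled device
    predictor = SamPredictor(sam)

    # SAM expects an RGB image; camera drivers typically provide BGR via OpenCV.
    predictor.set_image(cv2.cvtColor(image_bgr, cv2.COLOR_BGR2RGB))

    masks, scores, _ = predictor.predict(
        point_coords=np.array([click_xy]),   # the user's click, (x, y) in pixels
        point_labels=np.array([1]),          # 1 = foreground point
        multimask_output=True,
    )
    best_mask = masks[np.argmax(scores)]     # keep the highest-scoring mask

    # Centroid of the segmented tree in pixel coordinates.
    vs, us = np.nonzero(best_mask)
    return float(us.mean()), float(vs.mean())
```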
The autonomy state machine is detailed in `percher/src/state_machine.cpp`.
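The state machine itself is a C++ ROS node; the Python sketch below only illustrates the kind of state progression described above. The state names, inputs, and thresholds are assumptions rather than the actual ones in `state_machine.cpp`.

```python
# Illustrative sketch of a perching state machine. The actual implementation is
# the C++ node in percher/src/state_machine.cpp; the states and transition
# conditions here are assumptions chosen to mirror the workflow described above.
from enum import Enum, auto


class PerchState(Enum):
    HOVER = auto()          # hold position, collect live camera images
    TRACK_TREE = auto()     # user has clicked; perception tracks the tree pose
    APPROACH = auto()       # follow the planner's trajectory toward the tree
    PERCH = auto()          # ToF range is small enough; trigger the elbow hinge, throttle down
    PERCHED = auto()        # gripper engaged; user verifies the perch state
    FAILED = auto()         # IMU indicates the drone dropped off the tree


def next_state(state, tree_clicked, user_started, tof_range_m, perch_failed):
    """Return the next state given the current inputs (all thresholds illustrative)."""
    if perch_failed:
        return PerchState.FAILED
    if state == PerchState.HOVER and tree_clicked:
        return PerchState.TRACK_TREE
    if state == PerchState.TRACK_TREE and user_started:
        return PerchState.APPROACH
    if state == PerchState.APPROACH and tof_range_m < 0.15:
        return PerchState.PERCH
    if state == PerchState.PERCH:
        return PerchState.PERCHED
    return state
```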
To run this:
- Set the `ROS_MASTER_URI` environment variable to the XML-RPC URI of the ROS master node.
- Run the vision and trajectory launch files with `roslaunch object_tracking target_vision_tree.launch` and `roslaunch mav_trajectory_generation_example example.launch`
- Run the state machine with `rosrun percher percher_node` and `rosrun mav_trajectory_generation_example dummy_topic_code.py`
This work was conducted by Kenneth Hoffmann and Julia Di as Stanford graduate students.