Using WPILib Pose Estimation, Simulation, and PhotonVision Together
The following example comes from the PhotonLib example repository (Java/C++/Python). Full code is available at that link.
Knowledge and Equipment Needed
Everything required in Combining Aiming and Getting in Range, plus some familiarity with WPILib pose estimation functionality.
Background
This example demonstrates integration of swerve drive control, a basic swerve physics simulation, and PhotonLib’s simulated vision system functionality.
Walkthrough
Estimating Pose
The Drivetrain class includes functionality to fuse multiple sensor readings (including PhotonVision) into a best-guess of the robot's pose on the field. Please reference the WPILib documentation on using the SwerveDrivePoseEstimator class.
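Under the hood, the pose estimator weights each measurement by how much it is trusted, via the standard deviations you supply. A simplified one-dimensional sketch of that inverse-variance weighting idea (the `FusionSketch` class and `fuse` helper are hypothetical, not WPILib code):

```java
// Sketch of the idea behind fusing odometry with a vision measurement:
// each estimate is weighted by the inverse of its variance, so noisier
// sources pull the result less. SwerveDrivePoseEstimator applies this
// per-axis using the std devs passed to addVisionMeasurement().
class FusionSketch {
    /** Fuse two 1-D estimates by inverse-variance weighting. */
    static double fuse(double odom, double odomStdDev,
                       double vision, double visionStdDev) {
        double wOdom = 1.0 / (odomStdDev * odomStdDev);
        double wVision = 1.0 / (visionStdDev * visionStdDev);
        return (wOdom * odom + wVision * vision) / (wOdom + wVision);
    }

    public static void main(String[] args) {
        // Odometry says x = 2.0 m (trusted, std dev 0.1 m); vision says
        // x = 2.4 m (noisier, std dev 0.3 m). The fused result stays much
        // closer to the odometry value.
        System.out.println(fuse(2.0, 0.1, 2.4, 0.3));
    }
}
```

Tightening the vision standard deviations pulls the fused pose toward the camera measurement; loosening them makes the estimator lean on odometry.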
We use the 2024 game's AprilTag locations:

Java:
```java
visionSim.addAprilTags(kTagLayout);
```

C++:
```cpp
inline const frc::AprilTagFieldLayout kTagLayout{
    frc::LoadAprilTagLayoutField(frc::AprilTagField::k2024Crescendo)};
```

Python:
```python
self.camPoseEst = PhotonPoseEstimator(
```
To incorporate PhotonVision, we need to create a PhotonCamera:

Java:
```java
camera = new PhotonCamera(kCameraName);
```

C++:
```cpp
photon::PhotonCamera camera{constants::Vision::kCameraName};
```

Python:
```python
self.swerve = drivetrain.Drivetrain()
```
During periodic execution, we read back camera results. If we see AprilTags in the image, we calculate the camera-measured pose of the robot and pass it to the Drivetrain.
Java:
```java
// Correct pose estimate with vision measurements
var visionEst = vision.getEstimatedGlobalPose();
visionEst.ifPresent(
    est -> {
      // Change our trust in the measurement based on the tags we can see
      var estStdDevs = vision.getEstimationStdDevs();

      drivetrain.addVisionMeasurement(
          est.estimatedPose.toPose2d(), est.timestampSeconds, estStdDevs);
    });
```

C++:
```cpp
auto visionEst = vision.GetEstimatedGlobalPose();
if (visionEst.has_value()) {
  auto est = visionEst.value();
  auto estPose = est.estimatedPose.ToPose2d();
  auto estStdDevs = vision.GetEstimationStdDevs(estPose);
  drivetrain.AddVisionMeasurement(est.estimatedPose.ToPose2d(), est.timestamp,
                                  estStdDevs);
}
```

Python:
```python
camEstPose = self.camPoseEst.update()
if camEstPose:
    self.swerve.addVisionPoseEstimate(
```
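The "trust" adjustment above scales the reported standard deviations by how many tags are visible and how far away they are: multi-tag results are far more reliable, and all results degrade with distance. A rough, self-contained sketch of that heuristic (the `StdDevSketch` class, constants, and scaling rule here are illustrative, not PhotonLib's exact implementation):

```java
// Sketch of a tag-count and distance based trust heuristic for vision
// pose measurements. Larger std devs mean the pose estimator trusts the
// measurement less.
class StdDevSketch {
    /** Returns an illustrative XY standard deviation in meters. */
    static double estimateXYStdDev(int numTags, double avgDistMeters) {
        if (numTags == 0) {
            return Double.MAX_VALUE; // no tags visible: reject the measurement
        }
        // Multi-tag solves are much more trustworthy than single-tag ones.
        double base = (numTags > 1) ? 0.5 : 4.0;
        // Grow uncertainty quadratically with average distance to the tags.
        return base * (1.0 + (avgDistMeters * avgDistMeters) / 30.0);
    }
}
```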
Simulating the Camera
First, we create a new VisionSystemSim to represent our camera and coprocessor running PhotonVision, moving around our simulated field.
Java:
```java
// Create the vision system simulation which handles cameras and targets on the field.
visionSim = new VisionSystemSim("main");
// Add all the AprilTags inside the tag layout as visible targets to this simulated field.
visionSim.addAprilTags(kTagLayout);
```

C++:
```cpp
visionSim = std::make_unique<photon::VisionSystemSim>("main");

visionSim->AddAprilTags(constants::Vision::kTagLayout);
```

Python:
```python
# Coming Soon!
```
Then, we configure the simulated vision system to match the camera system being simulated.
Java:
```java
// Create simulated camera properties. These can be set to mimic your actual camera.
var cameraProp = new SimCameraProperties();
cameraProp.setCalibration(960, 720, Rotation2d.fromDegrees(90));
cameraProp.setCalibError(0.35, 0.10);
cameraProp.setFPS(15);
cameraProp.setAvgLatencyMs(50);
cameraProp.setLatencyStdDevMs(15);
// Create a PhotonCameraSim which will update the linked PhotonCamera's values with visible
// targets.
cameraSim = new PhotonCameraSim(camera, cameraProp);
// Add the simulated camera to view the targets on this simulated field.
visionSim.addCamera(cameraSim, kRobotToCam);

cameraSim.enableDrawWireframe(true);
```

C++:
```cpp
cameraProp = std::make_unique<photon::SimCameraProperties>();

cameraProp->SetCalibration(960, 720, frc::Rotation2d{90_deg});
cameraProp->SetCalibError(.35, .10);
cameraProp->SetFPS(15_Hz);
cameraProp->SetAvgLatency(50_ms);
cameraProp->SetLatencyStdDev(15_ms);

cameraSim =
    std::make_shared<photon::PhotonCameraSim>(&camera, *cameraProp.get());

visionSim->AddCamera(cameraSim.get(), constants::Vision::kRobotToCam);
cameraSim->EnableDrawWireframe(true);
```

Python:
```python
# Coming Soon!
```
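The latency settings above mean each simulated frame is timestamped in the past, just like a real coprocessor pipeline. A minimal sketch of that model, using the 50 ms average and 15 ms standard deviation configured above (the `LatencySketch` class is hypothetical; PhotonLib's exact sampling may differ):

```java
import java.util.Random;

// Illustrative latency model: each frame's processing latency is drawn
// from a normal distribution, and the frame is stamped that far in the
// past so the pose estimator can match it to historical odometry.
class LatencySketch {
    private final Random rng = new Random(42); // seeded for repeatability
    private final double avgLatencySec = 0.050;
    private final double latencyStdDevSec = 0.015;

    /** Returns the capture timestamp for a frame processed at nowSec. */
    double captureTimestamp(double nowSec) {
        double latency = avgLatencySec + latencyStdDevSec * rng.nextGaussian();
        return nowSec - Math.max(0.0, latency); // latency cannot be negative
    }
}
```

This is why `addVisionMeasurement` takes a timestamp: the correction is applied at the moment the image was captured, not when the result arrived.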
Updating the Simulated Vision System
During simulation, we periodically update the simulated vision system.
Java:
```java
@Override
public void simulationPeriodic() {
  // Update drivetrain simulation
  drivetrain.simulationPeriodic();

  // Update camera simulation
  vision.simulationPeriodic(drivetrain.getSimPose());

  var debugField = vision.getSimDebugField();
  debugField.getObject("EstimatedRobot").setPose(drivetrain.getPose());
  debugField.getObject("EstimatedRobotModules").setPoses(drivetrain.getModulePoses());

  // Update gamepiece launcher simulation
  gpLauncher.simulationPeriodic();

  // Calculate battery voltage sag due to current draw
  RoboRioSim.setVInVoltage(
      BatterySim.calculateDefaultBatteryLoadedVoltage(drivetrain.getCurrentDraw()));
}
```

C++:
```cpp
void Robot::SimulationPeriodic() {
  launcher.simulationPeriodic();
  drivetrain.SimulationPeriodic();
  vision.SimPeriodic(drivetrain.GetSimPose());

  frc::Field2d& debugField = vision.GetSimDebugField();
  debugField.GetObject("EstimatedRobot")->SetPose(drivetrain.GetPose());
  debugField.GetObject("EstimatedRobotModules")
      ->SetPoses(drivetrain.GetModulePoses());

  units::ampere_t totalCurrent = drivetrain.GetCurrentDraw();
  units::volt_t loadedBattVolts =
      frc::sim::BatterySim::Calculate({totalCurrent});
  frc::sim::RoboRioSim::SetVInVoltage(loadedBattVolts);
}
```

Python:
```python
# Coming Soon!
```
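The battery-sag calculation at the end models loaded voltage as the nominal battery voltage minus total current draw times internal resistance. A self-contained sketch with typical FRC figures (the `BatterySagSketch` class and its constants are illustrative, not WPILib's exact defaults):

```java
// Sketch of a simple battery-sag model: V_loaded = V_nominal - I * R.
// High current draw from the drivetrain pulls the simulated bus voltage
// down, which in turn limits what the simulated motors can do.
class BatterySagSketch {
    static final double NOMINAL_VOLTS = 12.0;
    static final double INTERNAL_RESISTANCE_OHMS = 0.02;

    /** Returns the simulated battery voltage under the given load. */
    static double loadedVoltage(double totalCurrentAmps) {
        return NOMINAL_VOLTS - totalCurrentAmps * INTERNAL_RESISTANCE_OHMS;
    }
}
```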
The rest is done behind the scenes.