Merged
42 changes: 23 additions & 19 deletions docs/source/docs/additional-resources/best-practices.md
@@ -3,27 +3,31 @@
## Before Competition

- Ensure you have spares of the relevant electronics if you can afford it (switch, coprocessor, cameras, etc.).
- Download the latest release .jar onto your computer and update your Pi if necessary (only update if the release is labeled "critical" or similar; we do not recommend updating right before an event in case there are unforeseen bugs).
- Test out PhotonVision at your home setup.
- Ensure that you have set up SmartDashboard / Shuffleboard to view your camera streams during matches.
- Follow all the recommendations under the Networking section in installation (network switch and static IP).
- Use high quality ethernet cables that have been rigorously tested.
- Set up port forwarding using the guide in the Networking section in installation.
- Stay on the latest version of PhotonVision until you have verified that your full robot system is functional.
- Some time before the competition, lock down the version you are using and do not upgrade unless you encounter a critical bug.
- Have a copy of the installation image for the version you are using on your programming laptop, in case re-imaging (without internet) is needed.
- Extensively test at your home setup. Practice tuning from scratch under different lighting conditions.
- Use SmartDashboard / Shuffleboard to view your camera streams during practice.
- Confirm you have followed all the recommendations under the Networking section in installation (network switch and static IP).
- Only use high quality ethernet cables that have been rigorously tested.
- Set up RIO USB port forwarding using the guide in the Networking section in installation.

## During the Competition

- Make sure you take advantage of the field calibration time given at the start of the event:
- Bring your robot to the field at the allotted time.
- Turn on your robot and pull up the dashboard on your driver station.
- Point your robot at the AprilTag(s) and ensure you get consistent tracking (each AprilTag is tracked consistently, the ceiling lights aren't detected, etc.).
- If you have problems with your pipeline, go to the pipeline tuning section and retune the pipeline using the guide there.
- Move the robot close, far, angled, and around the field to ensure no extra AprilTags are found.
- Go to a practice match to ensure everything is working correctly.
- Use the field calibration time given at the start of the event:
- Bring your robot to the field at the allotted time.
- Make sure the field has match-accurate lighting conditions active.
- Turn on your robot and pull up the dashboard on your driver station.
- Point your robot at the targets and ensure you get consistent tracking (each target is tracked consistently, the ceiling lights aren't detected, etc.).
- If you have problems with your pipeline, go to the pipeline tuning section and retune the pipeline using the guide there.
- Move the robot close, far, angled, and around the field to ensure no extra targets are found.
- Monitor camera feeds during a practice match to ensure everything is working correctly.
- After field calibration, use the "Export Settings" button in the "Settings" page to create a backup.
- Do this for each coprocessor on your robot that runs PhotonVision, and name your exports with meaningful names.
- This will contain camera information/calibration, pipeline information, network settings, etc.
- In the event of software/hardware failures (e.g. a lost SD card or broken device), you can then use the "Import Settings" button and select "All Settings" to restore your settings.
- This effectively works as a snapshot of your PhotonVision data that can be restored at any point.
- Before every match, check that the ethernet connection going into your coprocessor is seated fully.
- Ensure that exposure is as low as possible and that you don't have the dashboard up when you don't need it to reduce bandwidth.
- Before every match:
- Check the ethernet and USB connectors are seated fully.
- Close streaming dashboards when you don't need them to reduce bandwidth.
- Stream at as low of a resolution as possible while still detecting AprilTags to stay within field bandwidth limits.
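The bandwidth advice above can be made concrete with a back-of-the-envelope estimate. The numbers below are illustrative assumptions (the MJPEG compression factor varies with scene content, and the exact field bandwidth budget depends on the season's rules), not measured values:

```python
# Rough bandwidth estimate for an MJPEG camera stream.
width, height = 320, 240           # stream resolution
fps = 30                           # stream frame rate
compressed_bytes_per_pixel = 0.25  # assumed MJPEG compression ratio

bits_per_second = width * height * compressed_bytes_per_pixel * 8 * fps
mbps = bits_per_second / 1e6       # 4.608 Mbps -- a big slice of the field budget
```

Halving the resolution in one dimension or halving the frame rate scales the estimate linearly, which is why dropping the stream resolution is the usual first fix when bandwidth is tight.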
3 changes: 3 additions & 0 deletions docs/source/docs/additional-resources/config.md
@@ -48,3 +48,6 @@ A variety of files can be imported back into PhotonVision:
- {code}`hardwareSettings.json`
- {code}`networkSettings.json`
- Useful for simple hardware or network configuration tasks without overwriting all settings.


@@ -32,7 +32,7 @@ Next, from the SSH terminal, run `sudo nano /home/pi/runCamera` then arrow down

```

After the Romi reboots, you should be able to open the PhotonVision UI at: [`http://10.0.0.2:5800/`](http://10.0.0.2:5800/). From here, you can adjust {ref}`Settings <docs/settings:Settings>` and configure {ref}`Pipelines <docs/pipelines/index:Pipelines>`.
After the Romi reboots, you should be able to open the PhotonVision UI at: [`http://10.0.0.2:5800/`](http://10.0.0.2:5800/). From here, you can adjust settings and configure {ref}`Pipelines <docs/pipelines/index:Pipelines>`.

:::{warning}
In order for settings, logs, etc. to be saved / take effect, ensure that PhotonVision is in writable mode.
4 changes: 3 additions & 1 deletion docs/source/docs/apriltag-pipelines/2D-tracking-tuning.md
@@ -21,7 +21,9 @@ AprilTag pipelines come with reasonable defaults to get you up and running with

### Target Family

Target families are defined by two numbers (before and after the h). The first number is the number of bits the tag is able to encode (which means more tags are available in the respective family) and the second is the Hamming distance. Hamming distance describes the capacity for error correction when identifying tag IDs. A high Hamming distance generally means that it will be easier for a tag to be identified even if there are errors. However, as Hamming distance increases, the number of available tags decreases. The 2024 FRC game will be using 36h11 tags, which can be found [here](https://github.com/AprilRobotics/apriltag-imgs/tree/main/tag36h11).
Target families are defined by two numbers (before and after the h). The first number is the number of bits the tag is able to encode (which means more tags are available in the respective family) and the second is the Hamming distance. Hamming distance describes the capacity for error correction when identifying tag IDs. A high Hamming distance generally means that it will be easier for a tag to be identified even if there are errors. However, as Hamming distance increases, the number of available tags decreases.

The 2025 FRC game will be using 36h11 tags, which can be found [here](https://github.com/AprilRobotics/apriltag-imgs/tree/main/tag36h11).
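The error-correction idea can be sketched numerically. The codes below are made-up 8-bit values for a toy family, not real tag36h11 codewords:

```python
def hamming_distance(a: int, b: int) -> int:
    """Count the bits that differ between two tag codes."""
    return bin(a ^ b).count("1")

# Hypothetical codewords in a toy family (well separated from each other):
codewords = [0b00000000, 0b11110000, 0b00001111, 0b11111111]

# A decoded bit pattern with one bit flipped by image noise:
observed = 0b11110001

# Decode by picking the nearest codeword; a large minimum distance
# between codewords is what makes this correction unambiguous.
nearest = min(codewords, key=lambda c: hamming_distance(c, observed))
# nearest == 0b11110000: the single-bit error was corrected.
```

The trade-off in the paragraph above falls out of this picture: keeping codewords far apart (high Hamming distance) leaves room to correct more bit errors, but fewer codewords fit in the same number of bits.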

### Decimate

2 changes: 1 addition & 1 deletion docs/source/docs/apriltag-pipelines/detector-types.md
@@ -12,4 +12,4 @@ The AprilTag pipeline type is based on the [AprilTag](https://april.eecs.umich.e

## AruCo

The AruCo pipeline is based on the [AruCo](https://docs.opencv.org/4.8.0/d9/d6a/group__aruco.html) library implementation from OpenCV. It is ~2x higher fps and ~2x lower latency than the AprilTag pipeline type, but is less accurate. We recommend this pipeline type for teams that need to run at a higher framerate or have a lower powered device. This pipeline type is new for the 2024 season and is not as well tested as AprilTag.
The AruCo pipeline is based on the [AruCo](https://docs.opencv.org/4.8.0/d9/d6a/group__aruco.html) library implementation from OpenCV. It offers ~2x higher FPS and ~2x lower latency than the AprilTag pipeline type, but is less accurate. We recommend this pipeline type for teams that need to run at a higher framerate or have a lower-powered device. This pipeline type was new for the 2024 season.
12 changes: 4 additions & 8 deletions docs/source/docs/description.md
@@ -2,7 +2,7 @@

## Description

PhotonVision is a free, fast, and easy-to-use vision processing solution for the _FIRST_ Robotics Competition. PhotonVision is designed to get vision working on your robot _quickly_, without the significant cost of other similar solutions.
PhotonVision is a free, fast, and easy-to-use vision processing solution for the _FIRST_ Robotics Competition. PhotonVision is designed to get vision working on your robot _quickly_, but with lower cost than other solutions.
Using PhotonVision, teams can go from setting up a camera and coprocessor to detecting and tracking AprilTags and other targets by simply tuning sliders. With an easy-to-use interface, comprehensive documentation, and a feature-rich vendor dependency, no experience is necessary to use PhotonVision. No matter your resources, using PhotonVision is easy compared to its alternatives.

## Advantages
@@ -19,19 +19,15 @@ The PhotonVision user interface is simple and modular, making things easier for

### PhotonLib Vendor Dependency

The PhotonLib vendor dependency allows you to easily get necessary target data (without having to work directly with NetworkTables) while also providing utility methods to get distance and position on the field. This helps your team focus less on getting data and more on using it to do cool things. This also has the benefit of having a structure that ensures all data is from the same timestamp, which is helpful for latency compensation.
The PhotonLib vendor dependency allows you to easily get necessary target data (without having to work directly with NetworkTables) while also providing utility methods to get distance and position on the field. A serialization strategy is used to guarantee data coherency (all data in a result comes from the same capture timestamp), which is helpful for latency compensation. This helps your team focus less on getting data and more on using it to do cool things.
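Why a coherent capture timestamp matters for latency compensation can be sketched as follows. All names and numbers here are illustrative, not the PhotonLib API: the idea is simply to look up where the robot was when the image was captured, rather than when the result arrived:

```python
import bisect

# (timestamp_seconds, heading_degrees) samples, sorted by time. A real robot
# would record these from its gyro every control loop.
pose_history = [
    (0.00, 0.0), (0.02, 2.0), (0.04, 4.0), (0.06, 6.0),
]

def heading_at(t: float) -> float:
    """Linearly interpolate the stored robot heading at capture time t."""
    times = [ts for ts, _ in pose_history]
    i = bisect.bisect_left(times, t)
    if i == 0:
        return pose_history[0][1]
    if i >= len(pose_history):
        return pose_history[-1][1]
    (t0, h0), (t1, h1) = pose_history[i - 1], pose_history[i]
    return h0 + (h1 - h0) * (t - t0) / (t1 - t0)

# A target yaw measured at the frame's capture time, not its arrival time:
capture_time, target_yaw = 0.03, 5.0
field_relative_bearing = heading_at(capture_time) + target_yaw
```

Because the yaw and the timestamp describe the same frame, the interpolated heading and the measurement line up. If the yaw were paired with the robot's current (later) heading instead, the bearing would be wrong by however far the robot turned during the processing delay.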

### User Calibration

Using PhotonVision allows the user to calibrate for their specific camera, which will get you the best tracking results. This is extremely important, as every camera (even of the same model) will have its own quirks, and user calibration allows those to be accounted for.

### High FPS Processing
### Low Latency, High FPS Processing

Compared to alternative solutions, PhotonVision boasts higher frames per second which allows for a smoother video stream and detection of targets to ensure you aren't losing out on any performance.

### Low Latency

PhotonVision provides low latency processing to make sure you get vision measurements as fast as possible, which makes complex vision tasks easier. We guarantee that all measurements are sent from the same timestamp, making life easier for your programmers.
PhotonVision takes advantage of specialized hardware on select coprocessors to maximize processing speed. This allows for lower-latency detection of targets to ensure you aren't losing out on any performance.

### Fully Open Source and Active Developer Community

2 changes: 2 additions & 0 deletions docs/source/docs/hardware/picamconfig.md
@@ -1,5 +1,7 @@
# Pi Camera Configuration

This page covers specifics about the _Raspberry Pi_ CSI camera configuration.

## Background

The Raspberry Pi CSI Camera port is routed through and processed by the GPU. Since the GPU boots before the CPU, it must be configured properly for the attached camera. Additionally, this configuration cannot be changed without rebooting.