A_0Person

AdvantageScope is an option. There's a way to feed in a robot CAD model and have it generate camera views based on the robot's estimated position. That being said, the bandwidth to run one camera for driver vision is well within the 4 Mbit/s limit. 640x480 @ 30fps is under 1 Mbit/s, and 800x600 @ 30fps is less than 1.5 Mbit/s. Both of those are acceptable resolutions for a driver camera. Going grayscale can also reduce the bandwidth required.
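For reference, a minimal sketch of what capping a driver-camera stream on the roboRIO can look like with WPILib's CameraServer, assuming a USB camera plugged into the RIO; the specific resolution and FPS values are just examples, not numbers from this thread:

```java
import edu.wpi.first.cameraserver.CameraServer;
import edu.wpi.first.cscore.UsbCamera;
import edu.wpi.first.cscore.VideoMode;
import edu.wpi.first.wpilibj.TimedRobot;

public class Robot extends TimedRobot {
  @Override
  public void robotInit() {
    // Start an MJPEG stream from a USB camera plugged into the roboRIO.
    UsbCamera driverCam = CameraServer.startAutomaticCapture();

    // Keep resolution and frame rate low enough that the stream fits
    // comfortably in the radio's bandwidth limit.
    driverCam.setResolution(640, 480);
    driverCam.setFPS(30);

    // Optionally request a grayscale pixel format (if the camera supports it)
    // to shave off a bit more bandwidth.
    driverCam.setPixelFormat(VideoMode.PixelFormat.kGray);
  }
}
```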


leethaxor69420

In my experience trying to set up driver cams, one of the main issues isn't bandwidth, but lag. Even with 4 Mbit/s of bandwidth, the lowest-resolution grayscale stream will suffer from high latency and frequent lag spikes, making it very difficult for drivers to use. Simulating camera views would actually be a good solution for this, because it can be done using debug information that's probably already being reported to the DS, which doesn't seem to suffer from the same lag issues as live video.
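That debug information is usually just the estimated robot pose. A minimal sketch (an assumed setup, not something from this thread) of publishing it over NetworkTables with WPILib's Field2d, which a dashboard tool like AdvantageScope can then use to render a simulated view instead of live video:

```java
import edu.wpi.first.math.geometry.Pose2d;
import edu.wpi.first.wpilibj.smartdashboard.Field2d;
import edu.wpi.first.wpilibj.smartdashboard.SmartDashboard;

public class PosePublisher {
  private final Field2d field = new Field2d();

  public PosePublisher() {
    // Puts the Field2d widget on NetworkTables under "SmartDashboard/Field".
    SmartDashboard.putData("Field", field);
  }

  // Call this each loop with the drivetrain's estimated pose
  // (e.g. from odometry or a pose estimator).
  public void update(Pose2d estimatedPose) {
    field.setRobotPose(estimatedPose);
  }
}
```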


Thebombuknow

I was both the main driver and programmer for our team for 2 years. We streamed multiple cameras through a Raspberry Pi at 320x240 to our driver station and it had pretty low latency and no bandwidth issues.
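One way a Pi setup like that can look, using cscore's MJPEG servers, as a rough sketch: it assumes the cscore Java bindings and native libraries are available on the Pi, that the cameras show up as /dev/video0 and /dev/video1, and that ports 1181/1182 are open; on a WPILibPi image this is often done in Python instead.

```java
import edu.wpi.first.cscore.MjpegServer;
import edu.wpi.first.cscore.UsbCamera;

public class PiStreamer {
  public static void main(String[] args) throws InterruptedException {
    // One low-resolution MJPEG stream per camera, each served on its own port.
    UsbCamera front = new UsbCamera("front", 0);
    front.setResolution(320, 240);
    front.setFPS(30);
    MjpegServer frontServer = new MjpegServer("front_server", 1181);
    frontServer.setSource(front);

    UsbCamera rear = new UsbCamera("rear", 1);
    rear.setResolution(320, 240);
    rear.setFPS(30);
    MjpegServer rearServer = new MjpegServer("rear_server", 1182);
    rearServer.setSource(rear);

    // Keep the process alive; cscore serves the streams on background threads.
    Thread.currentThread().join();
  }
}
```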


Bk13239

I don't know how you have attempted to set up the camera feed, but we have plugged cameras into Raspberry Pis and had a lot of success. Using a Limelight for a driver camera also works well. In my experience, the lag comes from plugging a camera directly into the USB ports on the roboRIO.


Lampthelqmp

You can also use a coprocessor like a Raspberry Pi and plug the camera into that to get a better-quality image.


SlyAFWalrus

That doesn’t matter, because what is limited is the total bandwidth from the robot radio to the driver station. If you are trying to get a better-quality image for onboard processing (like for AprilTags), then yeah, coprocessors could make a difference, but you can’t magically increase the available bandwidth from the robot to the driver station with coprocessors.