I hacked together my "IP cameras" using some webcams for which I was able to track down FOSS i/f implementation documentation. That was fine for a proof of concept, but now I have to settle on "production" hardware.
There seem to be a variety of i/f's for cameras:
- RS170 (requires hardware to digitize the signal; inflexible)
- parallel (video is digitized, just needs to be captured at pixel rate)
- USB (digitized but packed in proprietary transport protocols over USB)
- CSI (digitized with specialty hardware required in host)
- IP (digitized but packed in protocols over IP)
The USB option is the easiest from the host's *hardware* point of view. But it seems to limit the devices that could be supported, as camera vendors are loath to publish details that only *their* driver implementers should need.
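One mitigating point on the driver-details issue: on a Linux host, cameras that follow the USB Video Class (UVC) spec show up through the same kernel V4L2 API as CSI or capture-card devices, so the application side can stay interface-agnostic. Below is a minimal, hedged sketch (Linux-only; the ioctl constant and 104-byte struct layout come from `linux/videodev2.h`, and `list_cameras` is just an illustrative name) that probes `/dev/video*` nodes and reports which kernel driver claimed each one:

```python
# Sketch: enumerate V4L2 video nodes and ask each which driver owns it.
# Linux-only; assumes the VIDIOC_QUERYCAP ioctl and struct layout from
# linux/videodev2.h. Function/variable names here are illustrative.
import fcntl
import glob

VIDIOC_QUERYCAP = 0x80685600  # _IOR('V', 0, struct v4l2_capability)

def list_cameras():
    """Return (device, driver, card) tuples for each video node that answers."""
    found = []
    for dev in sorted(glob.glob("/dev/video*")):
        cap = bytearray(104)          # sizeof(struct v4l2_capability)
        try:
            with open(dev, "rb", buffering=0) as f:
                fcntl.ioctl(f, VIDIOC_QUERYCAP, cap)
        except OSError:
            continue                  # node exists but isn't a usable camera
        # First two fields of v4l2_capability: driver[16], card[32]
        driver = bytes(cap[0:16]).split(b"\0")[0].decode()   # e.g. "uvcvideo"
        card = bytes(cap[16:48]).split(b"\0")[0].decode()    # human-readable name
        found.append((dev, driver, card))
    return found

if __name__ == "__main__":
    for dev, driver, card in list_cameras():
        print(f"{dev}: {driver} / {card}")
```

A UVC-compliant webcam typically reports `uvcvideo` here with no vendor code involved, which is the practical test of whether a given "production" camera avoids the proprietary-driver trap.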
RS170 is... "passé"
Parallel requires lots of signals to/from the host/camera.
CSI seems to be primarily supported on hosts intended to address the mobile market.
As USB is essentially "free" (hardware-wise), adding support for it as an *alternative* interface seems prudent.
Relying on CSI seems like it will restrict my choice of processor(s) going forward (I'd like to standardize on *a* host platform and not have to support a variety) -- though that's where the volume lies (think cameras in cell phones). OTOH, it may make getting components difficult (small fish, etc.)
IP seems to have more costs than USB with little/no gain.
Anyone been down this road who can share experiences? Probably only looking at 10K units/yr...