Hi all
I've decided that I want to build a pair of night vision goggles, since I have most of the stuff lying around and it sounds like it might be fun. My current plan is this:
I'll have a night vision camera with IR lights attached feeding into a Pi Zero 2 W. The Pi will process the video and send it to a smartphone over USB Ethernet (if possible) or WiFi (likely over HTTP as an MJPEG stream or some such). All of this will be attached to a phone VR headset (something similar to Google Cardboard).
After thinking it over, I see a few issues:
1. The framerate might be too low. I think the cameras I currently have can provide about 24fps, but I'm worried about streaming overhead. My hope is to do this over USB Ethernet to the phone, and then make a simple web client or app that processes the stream from there. I might be able to do some sort of interpolation, but I figure low latency is more important than high framerate.
2. Ethernet connection to the phone might not work. My hope is that I can have the Pi host an Ethernet network with a captive portal, so that as soon as the phone is connected, it'll open up a page with everything necessary. However, I'm not sure if this is possible with the Pi.
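For what it's worth, the Pi Zero 2 W's USB port does support device/gadget mode, and the usual recipe for USB Ethernet on Raspberry Pi OS is the `dwc2` overlay plus the `g_ether` gadget module (a sketch of the standard setup; on newer Bookworm images `config.txt` lives under `/boot/firmware/` instead of `/boot/`):

```
# /boot/config.txt (or /boot/firmware/config.txt on newer images)
dtoverlay=dwc2

# /boot/cmdline.txt — append to the end of the single existing line,
# do NOT add a second line:
modules-load=dwc2,g_ether
```

The "open a page automatically" part is a separate problem: phones pop a captive-portal page only when their connectivity-check probes get intercepted, so on top of the gadget interface you'd also need a DHCP server on the Pi handing the phone an address, plus a DNS/HTTP setup that answers the phone's probe requests with a redirect to your page. It's doable, but whether a given phone treats a USB Ethernet link the same way it treats Wi-Fi for captive-portal detection is something you'd want to test.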
3. I'm not quite sure what the best way to stream the camera is. When it comes to camera streaming, I've always used Pikrellcam, since it provides low-latency video with high-quality recording and audio support. However, I'm probably going to want a more stripped-down version, and I'll probably want to make it myself. What's the best way to get high-framerate, low-latency streaming from the camera over HTTP?
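For a stripped-down DIY version: MJPEG over HTTP is just a `multipart/x-mixed-replace` response where each part is one JPEG frame, so a minimal server fits in the Python standard library. A sketch, where `get_frame()` is a placeholder you'd replace with your actual camera capture (e.g. a Picamera2 JPEG capture loop):

```python
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

BOUNDARY = b"FRAME"

def mjpeg_part(jpeg_bytes: bytes) -> bytes:
    """Wrap one JPEG frame as a single part of a multipart/x-mixed-replace body."""
    return (b"--" + BOUNDARY + b"\r\n"
            + b"Content-Type: image/jpeg\r\n"
            + b"Content-Length: " + str(len(jpeg_bytes)).encode("ascii")
            + b"\r\n\r\n"
            + jpeg_bytes + b"\r\n")

def get_frame() -> bytes:
    """Placeholder frame source -- swap in your camera's JPEG capture here."""
    raise NotImplementedError("hook up the camera capture")

class StreamHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path != "/stream":
            self.send_error(404)
            return
        self.send_response(200)
        self.send_header("Content-Type",
                         "multipart/x-mixed-replace; boundary=FRAME")
        self.end_headers()
        try:
            while True:
                # Each loop iteration pushes one frame; the browser replaces
                # the previous image as each new part arrives.
                self.wfile.write(mjpeg_part(get_frame()))
        except (BrokenPipeError, ConnectionResetError):
            pass  # the phone disconnected

# To run on the Pi:
# ThreadingHTTPServer(("0.0.0.0", 8000), StreamHandler).serve_forever()
```

MJPEG keeps per-frame latency low because every frame is independent (no buffering for inter-frame prediction), at the cost of bandwidth; over a local USB Ethernet link that trade-off usually works out fine. If you later want better compression, H.264 over something like WebRTC is the usual next step, but it's a lot more moving parts.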
If you've got any tips, advice, suggestions, etc for this project, please let me know!
Statistics: Posted by TSelden1209 — Mon May 12, 2025 3:44 pm