AurorEye is a portable, autonomous all-sky imager optimized for timelapse Aurora photography
AurorEye is designed to be carried by Aurora Chasers and to capture calibrated, all-sky timelapses of auroral events for use in research and citizen science. Each unit uses off-the-shelf camera hardware and lenses. All control, telemetry, networking, and cloud upload functions are handled by an integrated single-board Linux computer. AurorEye can capture a full night of timelapse photos (1000-5000+ images) at up to 24MP (8K) resolution. A unique feature of the timelapse imagery is the minimal time gap between image exposures, resulting in smoother timelapses with minimal missing motion.
- APS-C sized imager, 2K and 4K resolution settings
- 2 to 30 second exposure
- Minimal time gap between exposures
- Glove-friendly controls
- Battery operated, 8 hour life in cold conditions
- GPS location and timestamps
- Assisted levelling and setup
- Single button start and stop
- Automatic timelapse & keogram generation and upload to cloud
- Wifi internet connectivity and in-field Wifi hotspot
Locations where AurorEye units have captured data and list of Operators
AurorEye units have campaigned in these locations, but not yet simultaneously. The goal is to have multiple deployments to Aurora Chasers and simultaneous all-sky timelapse imagery of the aurora.
- UNIT01 - Retired, test article
- UNIT02 - Retired, test article
- UNIT03 - Retired, data collected in Whitehorse, YT, Dawson City, YT
- UNIT04 - Jeremy Kuzub, Ottawa, ON, September 2022 - March 2023, data collected in Edmonton, AB, Yellowknife, NWT
- UNIT04 - Vincent Ledvina, Fairbanks, AK, Mar-May 2023, Data collected in Fairbanks, PFRR
- UNIT04 - Andy Whitteman, Fairbanks, AK, April 2023
- UNIT05 - Jeremy Kuzub, Halifax, NS, Mar-Apr 2023, data collected in Yellowknife, NWT and Nova Scotia
- UNIT05 - Justin Anderson, Manitoba, Canada Jan-Aug 2024
- UNIT07 - Vincent Ledvina Fairbanks, AK, Jan-Apr 2024
- UNIT09 - Jeremy Kuzub Jan-Apr 2024
Watch the 2021 AGU presentation
AurorEye timelapse videos are hosted on a dedicated YouTube channel at 4K resolution with radial keograms. If you are a student, citizen scientist, or researcher interested in the content of these timelapses, please get in touch at firstname.lastname@example.org
Visit the timelapse archive at YouTube
A Reminder why we do this: Citizen Science
These photos were posted in the Alberta Aurora Chasers' Facebook group and were assembled into an animation of flying under the aurora. Citizen Science has a pretty broad definition, including data visualization and image processing in novel ways.
Deployment plans for 2023/2024
Deployment highlights from 2023:
- Tethered testing under a dome at Poker Flat Research Range (Thanks to Vincent Ledvina and Dr. Donald Hampton)
- Deployment to Yellowknife in harsher winter conditions (-20degC)
- Testing at lower latitudes, including Nova Scotia, during Kp8/G4 geomagnetic storm conditions
For Spring 2024
- Deployment to Alaska
- Deployment to Yellowknife
- Deployment to Manitoba
Further plans include geometric calibration of the sensor and lens combination to allow projection onto a map and potential to combine these with images from other ASI networks.
April 30, 2023
AurorEye timelapse videos are now hosted on a dedicated YouTube channel at 4K resolution with radial keograms.
Visit the timelapse archive at YouTube
If you are a student, citizen scientist, or researcher interested in the content of these timelapses, please get in touch at email@example.com
Why not a dedicated, cloud powered video server and custom React front end and so on?
YouTube is the cloud-based, free video hosting service that suits AurorEye best at this time. Setting up our own cloud server for these summary 4K timelapses would be potentially costly (if there were a sudden rise in views) and we would have to reinvent the video playback wheel. YouTube offers some great advantages like visibility, bandwidth, playback speed, multiple resolutions, and linkable chapter markers for specific auroral events within timelapses.
April 17, 2023
Unit 05 - some new tricks and good conditions
Unit 05 was built to rely on its wireless access point and web app for Operator control. This worked well, allowing the Operator to set up and then monitor status, logs, and images from a distance, and even back in the vehicle while warming up. Clear skies in Yellowknife allowed capture of SAR/STEVE, the red aurora of the Kp8 March 23/24 event, and several substorms and large formations ("Churns"). This data was captured within 20-30km of the AuroraMAX ASI, which could allow comparison of the captured data and some parallax experiments.
Unit 04 - Fairbanks and PFRR
Unit 04 arrived in Fairbanks in good order and was tested by two Operators (see the Deployments section for credits).
Unit 04 also had the opportunity to be hosted under one of the domes at Poker Flat Research Range, demonstrating remote control and multi-sequence capture over two nights. The unit used its external power option and tied into the local WLAN network, allowing remote operation via VNC and the terminal interface.
Video processing workflow using free video editing software
Using the free version of DaVinci Resolve allows image processing pipelines from image sequences to 4K and 8K timelapses using the Fusion node network system. This promises flexibility and rapid iteration when combining data before committing to a coded solution.
Deployment Update: March 28, 2023
This is an interim update with data still being analyzed. Units 04 and 05 were both deployed for testing in northern latitudes. More to come, but this is the beginning of testing with Aurora Chasers.
Development Update: March 2023
The theme of this month's update is "Ship it": we are getting some units out to the field for testing:
Unit 04 to Fairbanks
This should be in good hands (with the aurora chaser who will be operating it, not the journey through the mail). This test campaign has a few milestones:
- Overall user experience: does the unit work well enough to not burden an aurora chaser on their chase, while capturing the data it was designed to capture?
- Survival through the mail - special packaging is needed for electronics: modular polyethylene foam extrusions (aka pool noodles) have the density and impact absorption needed for shipping and are much easier to shape around electronics. Hopefully they will do the trick.
- Survival through international customs - shipping science equipment without incurring extra fees or customs scrutiny
- Home base functionality - can the user set up the AurorEye unit on their local network successfully, like any internet-connected household device, using the built-in wireless access point setup tool?
- Field functionality - can the unit be set up and capturing imagery with low overhead?
- Cloud functionality - can the unit successfully offload a night's image sequences to the cloud storage facility?
- Robustness - can the unit stand up to the unknown unknowns of aurora chasing with a new user?
Unit 05 to Yellowknife
Unit 05 was an exercise in minimalism - there was very little time to build it before it was due in Yellowknife for a week of testing. This meant cutting it to the bare bones, which was also a great learning opportunity:
- The Raspberry Pi disk image was successfully cloned over to the new unit, demonstrating that we can potentially provision the units quite quickly
- The software was updated to just "roll with" missing sensors. This increased the robustness of the software to errors from bad connections and sensor data dropout. It also means that various levels of AurorEye can be built with varying sensor loads and controls
- This was a 'headless' unit - no buttons, no built-in display, no power management switchgear - everything was controlled via the wireless access point's web interface, which shows the latest imagery and telemetry, while allowing control of sequence shooting and exposure, as well as viewing the AurorEye log to debug in the field.
Stretch Goal: simultaneous image capture from two AurorEye units
A stretch goal in late March will be: if both units are operational and in position, can they simultaneously capture imagery on the same night? This would be a demonstration of an ad-hoc all-sky camera network. Fairbanks and Yellowknife are about 1600km apart, so there would likely not be any imagery overlap, but showing two animated image circles on a map would be a significant step in proving out this project.
Development Update: February 2023
The theme of this month's update is "Doing What You Do Best". What this means is splitting the computation between the onboard computer and the Cloud differently than originally planned, due to limitations of the Raspberry Pi and the increase in available offload bandwidth for Aurora Chasers:
The tl;dr is this: we know we have more upload bandwidth than originally planned, so why not use it, leaving the AurorEye out of the pixel-level image processing pipeline in order to maximize the image resolution, quality and imaging cadence of the AurorEye hardware.
Let's look at three key changes that resulted:
Move away from 1080 locally encoded video with "HUD" overlay
This is computationally intensive for the RPI at two stages:
- During image capture, when the RPi applies the data overlay - this requires a re-save of the original camera JPG, which has a penalty of double image compression, and up to 1.5 seconds of compression time
- During video encoding from JPEG to MP4 1080P via the RPI hardware encoder, which takes time and has absolute hardware limits to video quality.
This approach was "traditional", meaning a standard all-sky camera approach of outputting a data-overlaid timelapse movie, reducing the data from the Gigabytes to the Megabytes range. In reality, we do have the upload bandwidth and the time for the Gigabytes range, and that network rate will only get better with time. For this reason, we are...
Moving ahead with 4K+ image sequences with Metadata embedded in the EXIF header
The camera can shoot up to 4000x4000px (8K) JPGs at a rate of 30 frames per minute (2-second exposures), and maybe twice that.
Rather than overlaying metadata like GPS location and timestamp on the image, it can be more efficiently and quickly embedded in the JPG file's EXIF tags. This embedding step can keep up with the file size and cadence for at least 2-second exposures at 4K.
This allows us to quadruple the resolution (not to mention quality, due to less compression) versus the old 1080p MP4 video method, since we are freed of RPI video hardware limitations.
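As a sketch of the EXIF embedding step above: EXIF GPS tags store coordinates as degrees/minutes/seconds rational pairs, so the GPS fix has to be converted before being written into the JPG header. The `to_exif_rationals` helper below is illustrative, not the actual AurorEye code, and a library such as piexif (an assumption, not confirmed by the project) would then insert the triplet without re-encoding the image data:

```python
def to_exif_rationals(decimal_deg):
    """Convert a decimal coordinate into the ((deg,1), (min,1), (sec*100,100))
    rational triplet used by EXIF GPSLatitude/GPSLongitude tags."""
    d = abs(decimal_deg)
    degrees = int(d)
    minutes_f = (d - degrees) * 60
    minutes = int(minutes_f)
    seconds_centi = round((minutes_f - minutes) * 60 * 100)
    return ((degrees, 1), (minutes, 1), (seconds_centi, 100))

# The hemisphere is carried in a separate GPSLatitudeRef tag (b'N' or b'S');
# writing only header tags avoids the double-compression penalty of the
# old overlay approach.
example_lat = to_exif_rationals(45.5)
```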
In order to efficiently get thousands of JPGs uploaded to the cloud, they are combined into a standard TAR file. This is not compressed (since the JPGs are already compressed), bypassing the intensive video creation step and the arcane ffmpeg encoding step. This minimizes onboard processing time and maximizes quality, allowing video to be created in the cloud at a later date from higher quality images.
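A minimal sketch of that bundling step, using Python's standard tarfile module (the function name and flat-archive layout are illustrative assumptions, not the actual AurorEye code):

```python
import pathlib
import tarfile

def bundle_night(image_dir, tar_path):
    """Pack a night's JPGs into one uncompressed TAR for upload.
    Mode "w" (no gzip/bz2) is deliberate: the JPGs are already compressed,
    so recompressing would cost CPU time for almost no size reduction."""
    with tarfile.open(tar_path, "w") as tar:
        for jpg in sorted(pathlib.Path(image_dir).glob("*.jpg")):
            tar.add(jpg, arcname=jpg.name)  # flat archive, no directory prefix
    return tar_path
```

A single large archive also makes resumable cloud uploads simpler than tracking thousands of small objects individually.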
Ultimately, the first few units will need software updates, log reviews, and fixes done remotely.
After considering several methods such as a custom zip file update system or software update via the Python package manager PIP, the most flexible system was VNC remote access using a cloud connection service. This has the advantage of not committing to a single software update method (which has limited access), while providing maximum access to the RPI in the AurorEye units remotely. Using a cloud connection service such as RealVNC also leverages their security protocols and does not require AurorEye users to set up their local Internet/Local Router with port forwarding or other customization.
While all these changes have required some rewrites of the AurorEye software, they should better match the strengths and limitations of the onboard RPI, the Cloud, and the internet connection between them.
Development Update: 🌎 AGU 2022
AurorEye got some exposure as a poster in the MacGyver Sessions at the American Geophysical Union Fall 2022 Meeting. Here is the poster link. Thanks to Vincent Ledvina of the University of Alaska (Fairbanks) Geophysical Institute and HamSCI for helping citizen science efforts get some attention, and a sincere thanks to Dr. Liz Macdonald of NASA Goddard and Laura Brandt of Aurorasaurus.
Some of the new developments presented included version 2 of AurorEye: 1/8th the volume of the original, using a custom Raspberry Pi HAT.
Parts are harder to find this year; however, the cameras, PCBs, cases, lenses, and Raspberry Pis needed for several units are now in and ready for assembly.
Field Update: 👾 Dark Skies and Leonid meteors in Nova Scotia
AurorEye can also work as a meteor capture timelapse imager.
- Low inter-shot blackout period: meteors will have a hard time slipping in between exposures. In this case, the exposure time was 30 seconds per frame, subsequently stacked.
- AurorEye is designed so that the Burst Mode of the Canon camera is used, allowing shots to be continuously channelled through the high-speed camera internal buffer, then to the internal SD card, and finally to USB storage mounted on the Raspberry Pi module. This results in smoother timelapses since the shutter is almost always open.
- Cameras like the Canon have a failsafe feature to prevent the shutter or battery from wearing out if the shutter button is inadvertently depressed during storage: the burst mode stops after 999 shots. This had to be accounted for in AurorEye control software to allow it to take advantage of the minimal blackout time between frames that burst mode enables. Along with that, dark frame-based noise reduction must be turned off in the camera. Noise reduction using dark frames takes a second exposure with the shutter closed to detect and cancel hot pixels and other artifacts.
Overall, we were pleased with the capability to capture meteors, and it adds to the possible uses and future extensions of AurorEye.
Development Update: 🎱 Deciding where to do the number crunching
AurorEye can shoot all night long at 1800 frames per hour at 4K resolution. That's quite a bit of data that needs to be processed and uploaded.
Some of the tools we have at our disposal to shoot all night and process during the non-shooting periods:
- 50,000+ frames of onboard storage at 4K JPG
- Onboard hardware accelerated h.264 movie encoding (but for HD 1080p only)
- Wifi connectivity to the Aurora Chaser's Home Base internet connection
- Cloud based storage - Google Cloud services
Onboard encoding and processing of frames into h.264 MP4 movies and PNG format keograms is a feature of AurorEye. Since it will be idling in the late morning after the Aurora Chaser stumbles wearily back through their door and shakes the snow off their boots, it makes sense that it processes the captured footage that night and uploads it - essentially it has nothing else to do during the day.
But there are options:
- The 1080p movie and keogram are uploaded to cloud storage automatically, along with log files with associated metadata for that night's sequences
- If the night's images were quite extraordinary, the individual 4K source frames can be uploaded as well
The h.264 movie for a night of shooting is about 200-1000MB in size, depending on JPG scene complexity and number of frames; processing takes about 10 minutes per hour of timelapse onboard the AurorEye.
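To put those sizes in perspective, a back-of-the-envelope upload calculation (the 25 Mbps uplink figure is a hypothetical Home Base connection, not a measured one):

```python
movie_mb = 1000      # worst-case night: ~1000 MB of h.264
uplink_mbps = 25     # hypothetical Home Base upload bandwidth

# megabytes -> megabits, divided by link rate, converted to minutes
upload_minutes = movie_mb * 8 / uplink_mbps / 60
# ~5.3 minutes for the largest nightly movie, so the overnight upload
# window is more than generous even before the unit starts idling all day.
```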
AurorEye also includes the ability to purge internal storage of any 4K frames and uploaded sequences, freeing up space. Alternatively the aurora chaser can simply remove the SD Card and mail it to AurorEye Home Base using snail mail, and slip in a new SD card of any size.
Providing maximum flexibility is part of getting the best data we can out of the system.
Field Update: 🥶 -20degC and Hand Warmers for Robots
AurorEye has to be able to shoot in cold weather.
Fortunately, it has performed well so far. It is close to a sealed unit with a small internal volume, so the electronics' power dissipation of about 5 Watts is enough to keep the temperature elevated versus the environment.
It also has a humidity and internal temperature sensor that can be used to monitor and track changes in performance with these atmospheric variables.
Testing in Alberta and Yellowknife in fall and spring conditions (-15degC or so) shows that a stable internal temperature is reached because of electronics power dissipation. This is a guide on how to use AurorEye in cold weather:
- Keep AurorEye on - don't bother conserving battery power. A Li-ion battery needs to be warm to work (and can be damaged if charged cold) - leaving the unit on self-heats all the electronics and the battery
- It is most likely not a great idea to take the AurorEye back into the heated, humid comfort of your vehicle when relocating to another area to shoot - lens condensation may result. TODO: track humidity and temperature over time to be able to provide a condensation warning
- If temperatures are really low, consider adding a few hand warmers to the internal space of the AurorEye. An increase of a few degrees has been noted by doing this.
- Thermal insulation may be a good addition to the units
- The Raspberry Pi can act as a heater: the CPU can create a significant amount of thermal energy when running. Adding additional tasks to the 4 cores of the CPU may effectively increase thermal output!
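A hedged sketch of that last idea: load every core with a busy loop to turn CPU cycles into heat. This is illustrative only - the real unit would gate it on the onboard temperature sensor, and on a Raspberry Pi the effect can be checked with `vcgencmd measure_temp` (not called here):

```python
import multiprocessing as mp
import time

def _burn(stop_at):
    # Pointless floating-point work: converts CPU cycles into heat.
    x = 0.1
    while time.time() < stop_at:
        x = (x * x + 1.0) % 1e6

def heat_boost(seconds):
    """Run one busy-loop worker per core for `seconds`, then stop.
    Returns the number of workers used."""
    stop_at = time.time() + seconds
    workers = [mp.Process(target=_burn, args=(stop_at,))
               for _ in range(mp.cpu_count())]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    return len(workers)
```

Processes rather than threads are used so the load actually reaches all four cores despite CPython's GIL.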
Development Note: 🕹 interfaces, menus, and mittens
AurorEye is Mitten-Friendly. This is important, because the users will often be in -XXdegree temperatures for hours on end.
Modern camera interfaces usually use tiny buttons, touchscreens, or WiFi phone interfaces to automate timelapse shooting. Generally, they are not that fun to use in real-world aurora shooting conditions.
While AurorEye can communicate and do advanced configuration via its local WiFi hotspot and the user's phone, out in the field the design is meant for use while the user stays fully protected from the cold.
Constraints force simplicity and usability. Having only 4 buttons streamlines one's thinking about the user interface. A small monochrome OLED screen (which does well in the cold) enforces the hands-off functionality of AurorEye.
Development Note: ⚡🔋 Power and Voltage Supply
AurorEye has some thirsty devices on its power bus that all need to be fed their specific diet of voltage and current.
5V can be supplied with an external barrel-jack adapter, which can charge the main battery pack and run the primary power bus. Consumption is typically about 1.5 Amps at 5VDC.
Most bus devices such as storage and ethernet are powered through the USB-A connectors on the Raspberry Pi.
I2C connected devices typically require a small amount of current at either 5VDC or 3.3VDC, which can be supplied by the RPI interface header pins via the custom PiHat.
A high-sensitivity magnetometer may need a separate, cleaner power supply, although this has yet to be determined in relation to other electromagnetic noise sources, such as the DC/DC converter and the camera shutter.
The camera requires a dedicated step-up converter to supply 7.2V, as it is typically powered by its own internal Li-ion battery pack. Step-up converters in this application have some special requirements. Since the camera is forced to be "always on", the converter cannot supply any residual voltage to the camera, or it will not truly be 'off'. Nor can it supply any less than the 7.2V during power-up. Either of these states may make the camera's own internal power-up sequence misbehave.
To improve the behavior of the camera DC/DC converter, a load resistor of about 5 kOhm can be attached across the output of the converter. This will bleed residual power out of any capacitors in the converter circuit, preventing residual voltage from keeping the camera from powering up and down cleanly.
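The cost of that bleed resistor is tiny, as a quick calculation shows (the 100 uF output capacitance is a hypothetical figure for illustration, not a measured value from the converter used):

```python
V_OUT = 7.2       # camera rail voltage (V)
R_BLEED = 5_000   # bleed resistor across the converter output (ohms)
C_OUT = 100e-6    # hypothetical converter output capacitance (F)

bleed_current_ma = V_OUT / R_BLEED * 1000     # steady drain while powered
bleed_power_mw = V_OUT ** 2 / R_BLEED * 1000  # dissipated as heat in the resistor
discharge_tau_s = R_BLEED * C_OUT             # RC time constant after power-off

# ~1.44 mA and ~10.4 mW while running - negligible against the ~5 W budget -
# and the output caps discharge with a ~0.5 s time constant, so the camera
# sees a clean 'off' within a few seconds.
```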
Another consideration is the off-the-shelf LiPo battery pack, usually used to charge mobile devices through USB-A. This has to be ready to supply power when AurorEye is hard-started - this means it cannot require a separate button push, a special power draw, or signalling from the USB-connected loads. It also cannot auto-power-off if insufficient current draw is detected, which can be an issue with some supplies. Finally, the battery cannot cease supplying power if it detects some input current on its charging port, since the charging port is always connected to the rest of the power supply bus. This means that any off-the-shelf power pack needs to be checked out for proper behavior during power-up, charging, and continuous running, especially in cold conditions.
Development Note: 📷 Thoughtful choices in ISO, aperture and shutter speed
AurorEye is a light bucket over space and time. We have to get the most out of consumer-grade imagers (which are very good) by not over- or under-exposing the sky and the aurora.
Aurora have large changes in brightness and apparent motion between active and quiet times. A sane choice from experience is to let the Aurora Chaser set AurorEye to one of the following configurations, all of which use the fisheye lens wide open (f/2.0):
- 2 second exposure at ISO 800, 1600, 3200
- 5 second exposure at ISO 800, 1600, 3200
- 15 second exposure at ISO 800, 1600, 3200
- 30 second exposure at ISO 800, 1600, 3200
This covers the approximate range of visible aurora exposure latitudes we can hope for at 8 bits per pixel per channel. The operator can set the timelapse sequence to any of these.
Motion was deemed an important kind of data to capture, so the default is 5 seconds at ISO 1600.
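The cadence each of these configurations implies is easy to sanity-check. In this small sketch the `gap_s` parameter is an assumed per-frame overhead (buffer write, shutter re-arm), not a measured value:

```python
EXPOSURES_S = (2, 5, 15, 30)

def frames_per_hour(exposure_s, gap_s=0.0):
    """Frames captured per hour for a given exposure; with burst mode the
    inter-frame gap is close to zero, so cadence is roughly 3600/exposure."""
    return int(3600 / (exposure_s + gap_s))

cadence = {e: frames_per_hour(e) for e in EXPOSURES_S}
# {2: 1800, 5: 720, 15: 240, 30: 120} - the 2-second setting matches the
# 1800 frames/hour figure quoted elsewhere in these notes.
```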
Development Note: 📸 Choosing a camera control interface
AurorEye, at its heart, is a camera control system that must interface with a high-resolution, high-sensitivity, off-the-shelf consumer camera.
Consumer cameras generally have a few methods of being controlled externally:
- With a remote shutter release - very basic control, no offload or settings adjustment
- With a proprietary WiFi communication protocol. For example, some of Canon's cameras have a REST API that can be interacted with by a WiFi-connected, HTTP-capable interface. This tends to be quite slow for image transfer, however.
- With a Picture Transfer Protocol (PTP) compatible communication stack (often over USB). The Python library PTPy is one implementation.
- With vendor-specific extensions to the Picture Transfer Protocol, or fully proprietary protocols that allow partial or full access to the camera's features, settings, configuration, and setup.
PTP seems the best way forward, as it is reasonably standardized and has provision for faster file transfer over a robust(ish) physical layer (USB) that does not take up an additional WiFi connection on the communicating system (computer, e.g. Raspberry Pi).
But it is low level, and does not define the communication stack (i.e. the physical communication layer). Some vendor-specific extensions and behavior have been mapped out in the gphoto project. gphoto has a command-line client called gphoto2, and a C library called libgphoto2 for finer control (also used by the CLI).
Most relevant to AurorEye, Jim Easterbrook has created a Python wrapper for this C API, called python-gphoto2.
Note: The gphoto project organization on GitHub has a Python wrapper repo for libgphoto2, but it seems not to be maintained as frequently as the above repo.
libgphoto2 is frequently updated to support new cameras via USB connection. The full list is here.
libgphoto2 (and by extension the python-gphoto2 wrapper) supports different levels of compatibility with different cameras and manufacturers. It is not supported by the manufacturers themselves, so functionality is generally a subset of the full camera settings. If a camera's proprietary interface is not accessible, the PTP layer (described above) is used as an intermediary.
For AurorEye, Canon cameras were chosen because of reasonable support by libgphoto2, including burst mode shooting, basic camera setting control, and image offload during burst mode shooting at a sufficient rate.
Some of the functions used:
- Get and Set camera configuration (ISO, Shutter speed)
- Virtually press and hold/release shutter button
- Inspection of the onboard SD Card file system including download and deletion
- Camera event message monitoring (and associated callback registry) to act as a polling system, as the communication interface is asynchronous to commands issued (sometimes multiple times) and data transferred
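A sketch of those functions using the python-gphoto2 wrapper is below. The widget names ('iso', 'shutterspeed', 'eosremoterelease') are the ones libgphoto2 typically exposes for Canon EOS-family cameras, but they vary by model, so treat this as illustrative rather than the exact AurorEye code:

```python
try:
    import gphoto2 as gp      # python-gphoto2 wrapper around libgphoto2
except ImportError:           # sketch stays importable without a camera attached
    gp = None

def start_burst(camera, iso="1600", shutter="5"):
    """Set exposure config and virtually press-and-hold the shutter button."""
    config = camera.get_config()
    config.get_child_by_name("iso").set_value(iso)
    config.get_child_by_name("shutterspeed").set_value(shutter)
    # Canon-specific widget: holding 'Press Full' keeps burst mode running.
    config.get_child_by_name("eosremoterelease").set_value("Press Full")
    camera.set_config(config)

def stop_burst(camera):
    """Virtually release the shutter button to end the burst."""
    config = camera.get_config()
    config.get_child_by_name("eosremoterelease").set_value("Release Full")
    camera.set_config(config)

def wait_for_new_image(camera, timeout_ms=1000):
    """Poll the asynchronous event stream for a file-added event."""
    event_type, data = camera.wait_for_event(timeout_ms)
    return data if event_type == gp.GP_EVENT_FILE_ADDED else None
```

Polling `wait_for_new_image` is what lets the control software notice each frame landing on the SD card, since the PTP interface is asynchronous to the commands issued.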
Some of the issues to contend with using this communication approach:
- There is no command queue - the AurorEye software must keep checking that a command has been accepted, with a reasonable number of retries and a failure handling mechanism
- The burst mode method of gathering timelapse data with minimum blackout between images runs up against a safety feature of the Canon camera: a maximum of 999 shots for a given burst. The shutter up and down commands must be successfully issued before 999 frames are shot
- Unable to set capture quality/movie mode: at least for the M100 Canon camera, these cannot be accessed or set, and must be configured one time using the camera's own interface
- In burst mode, inability to offload images directly through USB to RPI mass storage. The SD Card must be used as an intermediary.
- Image storage to SD card from Camera Buffer takes time, as does the event system that notes an image has been stored. This means there is a delay between the end of an exposure and the moment that image ends up on the RPI USB mass storage system
- Cannot access the internal timelapse functionality (it does not really meet our needs anyway, but if it did, for example by allowing arbitrary timelapse settings, that would reduce much of the RPI software duties)
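Two of these issues (the missing command queue and the 999-shot burst limit) can be handled together by a small supervisor loop. This is a pure-Python sketch: the `press`, `release`, and `frames_captured` callables stand in for the real gphoto2 calls, and the 990-frame restart margin is an assumed safety buffer, not a measured value:

```python
MAX_BURST = 999    # Canon's hard per-burst frame limit
RESTART_AT = 990   # assumed margin: re-arm before the camera stops itself

def with_retries(command, retries=5):
    """There is no command queue, so each command is retried until accepted."""
    for attempt in range(retries):
        try:
            return command()
        except RuntimeError:
            if attempt == retries - 1:
                raise

def supervise_burst(press, release, frames_captured, target_frames):
    """Keep the shutter held, restarting the burst before the 999-shot
    safety limit, until target_frames have been captured."""
    with_retries(press)
    burst_start = frames_captured()
    while frames_captured() < target_frames:
        if frames_captured() - burst_start >= RESTART_AT:
            with_retries(release)   # end this burst...
            with_retries(press)     # ...and immediately start the next one
            burst_start = frames_captured()
    with_retries(release)
    return frames_captured()
```

The brief release/press cycle is the only blackout the timelapse sees, which is how burst mode keeps the inter-frame gap minimal over thousands of frames.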
These limitations may not be present on other Canon camera models, or there may be different limitations. For example, the M100 stores RAW images in the lossless/uncompressed CR2 file format, which results in files of around 30MB each. The newer M200 supports both the REST WiFi API (too slow for image transfer for our needs) and the CR3 lossy RAW format, which would be better for mass storage and transfer of ASI timelapse data.
Overall, necessity results in creative thinking, and these limitations and capabilities have been woven together (and sometimes bypassed) to allow AurorEye to meet its imaging requirements. A huge thanks to all the maintainers who have made these projects what they are today.