AurorEye is a portable, autonomous all-sky imager optimized for timelapse Aurora photography

AurorEye is designed to be carried by Aurora Chasers and to supplement their own photography with calibrated, all-sky timelapses of auroral events

Features

Watch the 2021 AGU presentation

Sample Timelapses

LATEST UPDATES

Development Update: 🌎 AGU 2022

AurorEye got some exposure as a poster under the MacGyver Sessions at the American Geophysical Union Fall 2022 Meeting. Here is the poster link. Thanks to Vincent Ledvina of the University of Alaska (Fairbanks) Geophysical Institute and HamSCI for helping citizen science efforts get some attention, and a sincere thanks to Dr. Liz Macdonald of NASA Goddard and Laura Brandt of Aurorasaurus.

Some of the new developments presented included version 2 of AurorEye: 1/8th the volume of the original, and using a custom Raspberry Pi Hat:

Printed circuit board for simplicity: custom Raspberry Pi Hat using the free KiCAD design software and manufactured by JLCPCB

Parts are harder to find this year; however, the cameras, PCBs, cases, lenses, and Raspberry Pis needed for several units are now in and ready for assembly:

Row of parts awaiting assembly into AurorEye ASI units

Field Update: 👾 Dark Skies and Leonid meteors in Nova Scotia

AurorEye timelapse sequence compressed into a single frame, a bright green meteor trail is visible in the lower left

AurorEye can also work as a meteor capture timelapse imager.

Overall, we were pleased with its ability to capture meteors, which adds to the possible uses and future extensions of AurorEye.

Development Update: 🎱 Deciding where to do the number crunching

AurorEye is designed to strike a balance between endurance, image quality, processing time, and upload bandwidth requirements

AurorEye can shoot all night long at 1800 frames per hour at 4K resolution. That's quite a bit of data that needs to be processed and uploaded.
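
As a rough back-of-envelope check of that data volume (the ~4 MB average 4K JPG size and the 10-hour shooting night are assumed values for illustration, not measurements):

```python
# Rough nightly data-volume estimate for AurorEye.
# The 1800 frames/hour rate is from the text; the ~4 MB per 4K JPG frame
# and the 10-hour shooting night are illustrative assumptions.
FRAMES_PER_HOUR = 1800
HOURS_PER_NIGHT = 10          # assumed length of a winter shooting night
MB_PER_FRAME = 4.0            # assumed average 4K JPG size

frames = FRAMES_PER_HOUR * HOURS_PER_NIGHT
raw_gb = frames * MB_PER_FRAME / 1024

print(f"{frames} frames ~ {raw_gb:.0f} GB of JPG frames per night")
# -> 18000 frames ~ 70 GB of JPG frames per night
```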

Some of the tools we have at our disposal to shoot all night and process during the non-shooting periods:

Onboard encoding and processing of frames into h.264 MP4 movies and PNG-format keograms is a feature of AurorEye. Since it will be idling in the late morning after the Aurora Chaser stumbles wearily back through their door and shakes the snow off their boots, it makes sense that it processes the footage captured that night and uploads it - essentially it has nothing else to do during the day.
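
For illustration, a keogram can be built by slicing the centre column out of each all-sky frame and stacking the slices in time order. This minimal sketch assumes a folder of JPG frames and uses Pillow and NumPy; it is not the actual AurorEye pipeline:

```python
# Minimal keogram sketch: take the centre column of each all-sky frame
# and stack the columns left-to-right in time order.
# The "frames" folder, Pillow, and NumPy are assumptions for illustration.
from pathlib import Path
import numpy as np
from PIL import Image

frames = sorted(Path("frames").glob("*.jpg"))     # hypothetical frame folder
columns = []
for f in frames:
    img = np.asarray(Image.open(f))               # H x W x 3 array
    columns.append(img[:, img.shape[1] // 2, :])  # centre vertical slice

keogram = np.stack(columns, axis=1)               # H x N_frames x 3
Image.fromarray(keogram).save("keogram.png")
```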

But there are options:

The h.264 movie for a night of shooting is about 200-1000 MB in size, depending on JPG scene complexity and the number of frames. Processing takes about 10 minutes per hour of timelapse onboard the AurorEye.

AurorEye also includes the ability to purge its internal storage of any 4K frames and uploaded sequences, freeing up space. Alternatively, the Aurora Chaser can simply remove the SD card, mail it to AurorEye Home Base using snail mail, and slip in a new SD card of any size.
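
A sketch of what the purge step might look like (the directory layout and the "uploaded" marker file are hypothetical, not the actual AurorEye storage scheme):

```python
# Hypothetical cleanup: delete the 4K frames for any night that has
# already been encoded and uploaded, keeping the much smaller MP4 and
# keogram. Paths and the marker file are assumptions for illustration.
from pathlib import Path
import shutil

for night in Path("captures").iterdir():          # e.g. captures/2022-11-18/
    if (night / "uploaded.ok").exists():          # assumed upload marker
        shutil.rmtree(night / "frames", ignore_errors=True)

total, used, free = shutil.disk_usage("/")
print(f"Free space: {free / 2**30:.1f} GB")
```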

Providing maximum flexibility is part of getting the best data we can out of the system.

Field Update: 🥶 -20degC and Hand Warmers for Robots

AurorEye during a tropical vacation indoors. Temperature and Humidity sensors are active

AurorEye has to be able to shoot in cold weather.

Fortunately it has performed well so far. It is close to a sealed unit with a small internal volume, so the electronics' power dissipation of about 5 watts is enough to keep the internal temperature elevated above the environment.

It also has internal humidity and temperature sensors that can be used to monitor and track changes in performance with these atmospheric variables.

Testing in Alberta and Yellowknife in fall and spring conditions (-15degC or so) shows that a stable internal temperature is reached because of electronics power dissipation. Here is a guide on how to use AurorEye in cold weather:

  1. Keep AurorEye on - don't bother conserving battery power. A Li-ion battery needs to be warm to work (and can be damaged if charged cold) - leaving the unit on self-heats all the electronics and the battery
  2. It is most likely not a great idea to take AurorEye back into the heated, humid comfort of your vehicle when relocating to another area to shoot - lens condensation may result. TODO: track humidity and temperature over time to be able to provide a condensation warning (see the sketch after this list)
  3. If temperatures are really low, consider adding a few hand warmers to the internal space of the AurorEye. An increase of a few degrees has been noted by doing this.
  4. Thermal insulation may be a good addition to the units
  5. The Raspberry Pi can act as a heater: The CPU can create a significant amount of thermal energy when running. Adding additional tasks to the 4 cores of the CPU may effectively increase thermal output!
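
Picking up on the TODO in item 2, here is a minimal sketch of a condensation warning based on the dew point computed from the internal temperature and humidity readings. The Magnus-formula constants are standard; the example readings and the 2degC margin are illustrative assumptions:

```python
# Dew-point based condensation warning sketch. The sensor readings and
# the 2 degC safety margin are assumptions for illustration.
import math

def dew_point_c(temp_c: float, rh_percent: float) -> float:
    a, b = 17.62, 243.12          # Magnus formula constants
    gamma = math.log(rh_percent / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

temp_c, rh = 4.0, 90.0            # example readings after a warm car ride
if temp_c - dew_point_c(temp_c, rh) < 2.0:
    print("Condensation risk: lens temperature is close to the dew point")
```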

Development Note: 🕹 interfaces, menus, and mittens

AurorEye has 4 interactive buttons, tactile and glove-friendly. Compare to a modern ILC camera with touchscreen and buttons and joysticks that are hard to use with any hand protection

AurorEye is Mitten-Friendly. This is important because users will often be out in -XXdegree temperatures for hours on end.

Modern camera interfaces usually use tiny buttons, touchscreens, or WiFi phone interfaces to automate timelapse shooting. Generally, they are not much fun to use in real-world aurora shooting conditions.

While AurorEye can communicate and do advanced configuration via its local WiFi hotspot and the user's phone, out in the field the design is meant to be usable while the operator stays fully protected from the cold.

Constraints force simplicity and usability. Having only 4 buttons streamlines one's thinking about the user interface, and a small monochrome OLED screen (which handles the cold well) reinforces the hands-off functionality of AurorEye.
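
As a sketch of how little code a 4-button menu needs, here is a minimal button-driven loop using the gpiozero library. The GPIO pin numbers, the menu entries, and printing in place of the OLED are assumptions for illustration, not the actual AurorEye firmware:

```python
# Sketch of a 4-button, mitten-friendly menu loop.
# Pin numbers and menu entries are illustrative assumptions.
from gpiozero import Button
from signal import pause

MENU = ["Start timelapse", "Stop", "Exposure preset", "Upload now"]
index = 0

def show():
    # stand-in for drawing on the small monochrome OLED
    print(f"> {MENU[index]}")

def move(step):
    global index
    index = (index + step) % len(MENU)
    show()

def select():
    print(f"Selected: {MENU[index]}")

btn_up, btn_down, btn_ok, btn_back = Button(5), Button(6), Button(13), Button(19)
btn_up.when_pressed = lambda: move(-1)
btn_down.when_pressed = lambda: move(+1)
btn_ok.when_pressed = select
btn_back.when_pressed = show      # back/refresh placeholder

show()
pause()
```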

Development Note: ⚡🔋 Power and Voltage Supply

AurorEye has some thirsty devices on its power bus that all need to be fed their specific diet of voltage and current.

5V can be supplied with an external barrel-jack adapter, which can charge the main battery pack and run the primary power bus. Consumption is typically about 1.5 Amps at 5VDC.
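
A rough runtime estimate from that figure (the 20,000 mAh power-bank capacity and the 90% boost-converter efficiency are assumptions for illustration):

```python
# Rough runtime estimate from the ~1.5 A @ 5 V consumption above.
# The 20,000 mAh (3.7 V nominal) pack and 90% converter efficiency
# are illustrative assumptions.
bus_watts = 5.0 * 1.5                     # ~7.5 W on the 5 V bus
pack_wh = 20.0 * 3.7                      # 20 Ah at 3.7 V nominal = 74 Wh
usable_wh = pack_wh * 0.90                # converter efficiency assumption

print(f"Estimated runtime: {usable_wh / bus_watts:.1f} hours")
# -> roughly 8.9 hours, i.e. about one long winter night
```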

Most bus devices such as storage and ethernet are powered through the USB-A connectors on the Raspberry Pi.

I2C connected devices typically require a small amount of current at either 5VDC or 3.3VDC, which can be supplied by the RPI interface header pins via the custom PiHat.

A high-sensitivity magnetometer may need a separate, cleaner power supply, although this has yet to be determined in relation to other electromagnetic noise sources such as the DC/DC converter and the camera shutter.

The camera requires a dedicated step-up converter to supply 7.2V, as it is normally powered by its own internal Li-ion battery pack. Step-up converters in this application have some special requirements. Since the camera is forced to be "always on", the converter cannot supply any residual voltage to the camera, or it will not truly be 'off'. Nor can it supply any less than the 7.2V during power-up. Either of these states may make the camera's own internal power-up sequence misbehave.

To improve the behavior of the camera DC/DC converter, a load resistor of about 5 kOhm can be attached across the output of the converter. This bleeds residual charge out of any capacitors in the converter circuit, so that residual voltage does not keep the camera from powering up and down cleanly.
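
A quick sanity check of that bleed resistor (the 470 uF converter output capacitance is an assumed value, not a measured one):

```python
# Quick check of the 5 kOhm bleed resistor on the 7.2 V camera rail.
# The 470 uF output capacitance is an assumed value for illustration.
V, R, C = 7.2, 5_000.0, 470e-6

bleed_current_ma = V / R * 1000           # steady-state load on the converter
bleed_power_mw = V ** 2 / R * 1000        # wasted power while running
tau_s = R * C                             # RC time constant after power-off

print(f"{bleed_current_ma:.2f} mA, {bleed_power_mw:.1f} mW, "
      f"~{5 * tau_s:.0f} s to fully discharge (5 tau)")
# -> 1.44 mA, 10.4 mW, ~12 s to fully discharge
```

So the resistor costs almost nothing while the camera runs, but drains the converter's output within seconds of power-off.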

Another consideration is the off-the-shelf LiPo battery pack, usually used to charge mobile devices through USB-A. It has to be ready to supply power when AurorEye is hard-started - this means it cannot require a separate button push or a special power draw or signalling from the USB-connected loads. It also cannot auto-power-off if insufficient current draw is detected, which can be an issue with some supplies. Finally, the battery cannot cease supplying power if it detects some input current on its charging port, since the charging port is always connected to the rest of the power supply bus. This means that any off-the-shelf power pack needs to be checked for proper behavior during power-up, charging, and continuous running, especially in cold conditions.

Development Note: 📷 Thoughtful choices in ISO, aperture and shutter speed

AurorEye has to capture as much light as possible. But without too much sensor noise, or sensor/lens cost, or image blurring from aurora motion. These frames are part of a sequence near Yellowknife, NWT in spring of 2022 during a nearly full moon

AurorEye is a light bucket over space and time. We have to get the most out of consumer grade imagers (which are very good) by not over or under exposing the sky and the aurora.

Aurorae have large changes in brightness and apparent motion between active and quiet times. A sane choice from experience is to let the Aurora Chaser set AurorEye in one of the following configurations, all of which use the fisheye lens wide open (f/2.0):

This covers the approximate range of visible aurora exposure latitudes we can hope for at 8 bits per pixel per channel. The operator can set the timelapse sequence to any of these.

Motion was deemed to be an important kind of data to capture, so the default is 5 seconds at ISO 1600

Development Note: 📸 Choosing a camera control interface

Ultimately, AurorEye's core is a robust camera control interface (photo by author)

AurorEye, at its heart, is a camera control system that must interface with a high resolution, high-sensitivity, off the shelf consumer camera.

Consumer cameras generally have a few methods of being controlled externally:

  1. With a remote shutter release - very basic control, no offload or settings adjustment
  2. With a proprietary WiFi communication protocol. For example, some of Canon's cameras have a REST API that can be accessed by any WiFi-connected, HTTP-capable client. This tends to be quite slow for image transfer, however.
  3. With a Picture Transfer Protocol (PTP) compatible communication stack (often over USB). The PTPy library is a Python implementation.
  4. With vendor-specific extensions to the Picture Transfer Protocol, or fully proprietary protocols that allow partial or full access to the cameras features, settings, configuration, and setup.

PTP seems the best way forward, as it is reasonably standardized and has provision for faster file transfer over a robust(ish) physical layer (USB) that does not take up an additional WiFi connection on the communicating system (computer, e.g. Raspberry Pi).

But PTP is low level and does not define the full communication stack (i.e. the physical communication layer). Some vendor-specific extensions and behaviors have been mapped out in the gphoto project. gphoto has a command-line client called gphoto2, and a C library called libgphoto2 for finer control (also used by the CLI).

Most relevant to AurorEye, Jim Easterbrook has created a Python wrapper for this C API called Python-gphoto2.
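
A minimal capture-and-download sketch in the style of the python-gphoto2 examples looks like this (the save path is illustrative; AurorEye's actual capture loop layers burst shooting and settings control on top of this):

```python
# Minimal python-gphoto2 capture: trigger one exposure on the USB-connected
# camera and download the resulting file. The target path is illustrative.
import gphoto2 as gp

camera = gp.Camera()
camera.init()                                     # find and open the USB camera

file_path = camera.capture(gp.GP_CAPTURE_IMAGE)   # trigger one exposure
camera_file = camera.file_get(
    file_path.folder, file_path.name, gp.GP_FILE_TYPE_NORMAL)
camera_file.save('/tmp/aurora_frame.jpg')         # download to local storage

camera.exit()
```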

Note: The gphoto project organization on Github has a python wrapper repo for libgphoto2, but it seems to not be maintained as frequently as the above repo.

libgphoto2 is frequently updated to support new cameras via USB connection. The full list is here.

libgphoto2 (and by extension the python-gphoto2 wrapper) supports different levels of compatibility with different cameras and manufacturers. It is not supported by the manufacturers themselves, so functionality is generally a subset of the full camera settings. If a camera's proprietary interface is not accessible, the PTP layer (described above) is used as an intermediary.

For AurorEye, Canon cameras were chosen because of reasonable support by libgphoto2, including burst-mode shooting, basic camera setting control, and image offload during burst-mode shooting at a sufficient rate.

Some of the functions used:

Some of the issues to contend with using this communication approach:

These limitations may not be present on other Canon camera models, or there may be different limitations. For example, the M100 stores RAW images in the lossless/uncompressed CR2 file format, which results in files of around 30 MB each. The newer M200 supports both the REST WiFi API (too slow for image transfer for our needs) and the CR3 lossy RAW format, which would be better for mass storage and transfer of ASI timelapse data.

Overall, necessity results in creative thinking, and these limitations and capabilities have been woven together (and sometimes bypassed) to allow AurorEye to meet its imaging requirements. A huge thanks to all the maintainers who have made these projects what they are today.