AurorEye is a portable, autonomous all-sky imager optimized for timelapse Aurora photography.
AurorEye is designed to be carried by Aurora Chasers and to supplement their own photography with calibrated, all-sky timelapses of auroral events.

Features
- APS-C sized imager, 2K and 4K resolution settings
- 2 to 30 second exposures
- Minimal time gap between exposures
- Glove-friendly controls
- Battery operated, 8 hour life in cold conditions
- GPS location and timestamps
- Assisted levelling and setup
- Single button start and stop
- Automatic timelapse & keogram generation and upload to the cloud
- Wifi internet connectivity and in-field Wifi hotspot
Watch the 2021 AGU presentation
Sample Timelapses
LATEST UPDATES
Development Update: 🌎 AGU 2022
AurorEye got some exposure as a poster under the MacGyver Sessions at the American Geophysical Union Fall 2022 Meeting. Here is the poster link. Thanks to Vincent Ledvina of the University of Alaska (Fairbanks) Geophysical Institute, and to HamSCI, for helping citizen science efforts get some attention, and a sincere thanks to Dr. Liz Macdonald of NASA Goddard and Laura Brandt of Aurorasaurus.
Some of the new developments presented included version 2 of AurorEye: 1/8th the volume of the original, and using a custom Raspberry Pi HAT:

Parts are harder to find this year; however, the cameras, PCBs, cases, lenses, and Raspberry Pis needed for several units are now in and ready for assembly:

Field Update: 👾 Dark Skies and Leonid meteors in Nova Scotia

AurorEye can also work as a meteor capture timelapse imager.
- Low inter-shot blackout period: meteors will have a hard time slipping in between exposures. In this case, the exposure time was 30 seconds per frame, subsequently stacked.
- AurorEye is designed so that the Burst Mode of the Canon camera is used, allowing shots to be continuously channelled through the camera's high-speed internal buffer, then to the internal SD card, and finally to USB storage mounted on the Raspberry Pi module. This results in smoother timelapses since the shutter is almost always open.
- Cameras like the Canon have a failsafe feature to prevent the shutter or battery from wearing out if the shutter button is inadvertently depressed during storage: the burst mode stops after 999 shots. This had to be accounted for in AurorEye control software to allow it to take advantage of the minimal blackout time between frames that burst mode enables. Along with that, dark frame-based noise reduction must be turned off in the camera. Noise reduction using dark frames takes a second exposure with the shutter closed to detect and cancel hot pixels and other artifacts.
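The control loop for staying inside that 999-shot failsafe looks roughly like the following. This is a simplified sketch of the approach described above; the helper methods (press_shutter, release_shutter, count_new_frames) are hypothetical stand-ins for the python-gphoto2 calls AurorEye actually issues.

```python
# Sketch of the burst-restart logic: hold the shutter down, but release and
# re-press it before the camera's 999-shot failsafe ends the burst on its own.
BURST_LIMIT = 999          # Canon failsafe: a burst stops after 999 shots
RESTART_MARGIN = 10        # restart the burst well before the limit

def run_burst_timelapse(camera, stop_requested):
    frames_in_burst = 0
    camera.press_shutter()                           # hypothetical helper
    while not stop_requested():
        frames_in_burst += camera.count_new_frames() # hypothetical helper
        if frames_in_burst >= BURST_LIMIT - RESTART_MARGIN:
            camera.release_shutter()                 # end this burst cleanly
            camera.press_shutter()                   # start the next one
            frames_in_burst = 0
    camera.release_shutter()
```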
Overall, we were pleased with the capability to capture meteors, and it adds to the possible uses and future extensions of AurorEye.
Development Update: 🎱 Deciding where to do the number crunching

AurorEye can shoot all night long at 1800 frames per hour at 4K resolution. That's quite a bit of data that needs to be processed and uploaded.
Some of the tools we have at our disposal to shoot all night and process during the non-shooting periods:
- 50,000+ frames of onboard storage at 4K JPG
- Onboard hardware accelerated h.264 movie encoding (but for HD 1080p only)
- Wifi connectivity to the Aurora Chaser's Home Base internet connection
- Cloud based storage - Google Cloud services
Onboard encoding and processing of frames into h.264 MP4 movies and PNG format keograms is a feature of AurorEye. Since it will be idling in the late morning after the Aurora Chaser stumbles wearily back through their door and shakes the snow off their boots, it makes sense that it processes the footage captured that night and uploads it - essentially it has nothing else to do during the day.
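As an illustration of that processing pass, here is a minimal sketch of keogram generation (stacking the centre pixel column of every frame left to right over time) and of handing the frames to ffmpeg for encoding. The directory layout, frame naming, scaling, and the h264_v4l2m2m hardware-encoder selection are assumptions, not the exact AurorEye pipeline.

```python
import glob
import subprocess
import numpy as np
from PIL import Image

def build_keogram(frame_dir, out_path="keogram.png"):
    """Stack the centre pixel column of each frame left to right over time."""
    columns = []
    for path in sorted(glob.glob(f"{frame_dir}/*.jpg")):
        img = np.asarray(Image.open(path))              # H x W x 3 array
        columns.append(img[:, img.shape[1] // 2, :])    # centre column, H x 3
    keogram = np.stack(columns, axis=1)                 # H x Nframes x 3
    Image.fromarray(keogram).save(out_path)

def encode_movie(frame_dir, out_path="night.mp4", fps=30):
    """Encode frames to 1080p H.264, using the Pi's hardware encoder."""
    subprocess.run([
        "ffmpeg", "-y", "-framerate", str(fps),
        "-pattern_type", "glob", "-i", f"{frame_dir}/*.jpg",
        "-vf", "scale=1920:1080",
        "-c:v", "h264_v4l2m2m", "-b:v", "8M",
        out_path,
    ], check=True)
```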
But there are options:
- The 1080p movie and keogram are uploaded to the cloud storage automatically, along with log files containing associated metadata for that night's sequences
- If the night's images were quite extraordinary, the individual 4K source frames can be uploaded as well
The h.264 movie for a night of shooting is about 200-1000MB in size, depending on JPG scene complexity and number of frames; processing takes about 10 minutes per hour of timelapse onboard the AurorEye.
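A minimal sketch of that automatic upload, assuming Google Cloud Storage with service-account credentials already configured on the unit; the bucket and file names here are illustrative only.

```python
# Upload the night's products (movie, keogram, log) to a cloud bucket.
from google.cloud import storage

def upload_night(bucket_name, local_files, night_id):
    client = storage.Client()
    bucket = client.bucket(bucket_name)
    for path in local_files:
        blob = bucket.blob(f"{night_id}/{path}")   # group objects by night
        blob.upload_from_filename(path)

upload_night("auroreye-uploads",
             ["night.mp4", "keogram.png", "capture.log"],
             "2022-11-18")
```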
AurorEye also includes the ability to purge internal storage of any 4K frames and uploaded sequences, freeing up space. Alternatively, the Aurora Chaser can simply remove the SD card, mail it to AurorEye Home Base by snail mail, and slip in a new SD card of any size.
Providing maximum flexibility is part of getting the best data we can out of the system.
Field Update: 🥶 -20degC and Hand Warmers for Robots

AurorEye has to be able to shoot in cold weather.
Fortunately it has performed well so far. It is close to a sealed unit with a small internal volume, so the electronics' power dissipation of about 5 Watts is enough to keep the internal temperature elevated above that of the environment.
It also has a humidity and internal temperature sensor that can be used to monitor and track changes in performance with these atmospheric variables.
Testing in Alberta and Yellowknife in fall and spring conditions (-15degC or so) shows that a stable internal temperature is reached because of electronics power dissipation. This is a guide on how to use AurorEye in cold weather:
- Keep AurorEye on - don't bother conserving battery power. A LiIon battery needs to be warm to work (and can be damaged if charged while cold) - leaving the unit on self-heats all the electronics and the battery
- Most likely not a great idea to take AurorEye back into the heated, humid comfort of your vehicle when relocating to another area to shoot - lens condensation may result. TODO: track humidity and temperature over time to be able to provide a condensation warning (see the sketch after this list)
- If temperatures are really low, consider adding a few hand warmers to the internal space of the AurorEye. An increase of a few degrees has been noted by doing this.
- Thermal insulation may be a good addition to the units
- The Raspberry Pi can act as a heater: The CPU can create a significant amount of thermal energy when running. Adding additional tasks to the 4 cores of the CPU may effectively increase thermal output!
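The condensation-warning TODO mentioned above could be built on the internal temperature and humidity readings using the Magnus dew point approximation. A sketch, with the sensor and lens temperatures passed in as plain numbers and an assumed warning margin:

```python
import math

def dew_point_c(temp_c, rel_humidity_pct):
    """Magnus approximation for dew point in degrees C."""
    a, b = 17.62, 243.12
    gamma = (a * temp_c) / (b + temp_c) + math.log(rel_humidity_pct / 100.0)
    return (b * gamma) / (a - gamma)

def condensation_risk(air_temp_c, air_rh_pct, lens_temp_c, margin_c=2.0):
    """True if the lens is within `margin_c` of the dew point of the air."""
    return lens_temp_c <= dew_point_c(air_temp_c, air_rh_pct) + margin_c

# Example: warm humid cabin air (20 degC, 60% RH) meeting a -15 degC lens
print(dew_point_c(20.0, 60.0))               # roughly 12 degC
print(condensation_risk(20.0, 60.0, -15.0))  # True -> expect fogging
```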
Development Note: 🕹 interfaces, menus, and mittens

AurorEye is Mitten-Friendly. This is important, because the users will often be in -XXdegree temperatures for hours on end.
Modern camera interfaces usually use tiny buttons, touchscreens, or WiFi phone interfaces to automate timelapse shooting. Generally they are not that fun to use in real-world Aurora Shooting conditions.
While AurorEye can communicate and do advanced configuration via its local WiFi hotspot and the user's phone, out in the field the design is meant for use while the hands stay fully protected from the cold.
Constraints force simplicity and usability. Having only 4 buttons streamlines one's thinking about the user interface. A small monochrome OLED screen (which is good in the cold) enforces the hands-off functionality of AurorEye.
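A minimal sketch of what a four-button, mitten-friendly menu loop can look like, using gpiozero for debounced button input. The GPIO pin numbers, menu entries, and the print-based stand-in for the OLED driver are all assumptions for illustration.

```python
from gpiozero import Button
from signal import pause

MENU = ["Start/Stop", "Exposure", "Level assist", "Status"]
state = {"index": 0, "running": False}

def show(text):
    print(text)                       # stand-in for the OLED display driver

def move(step):
    state["index"] = (state["index"] + step) % len(MENU)
    show(MENU[state["index"]])

def select():
    if MENU[state["index"]] == "Start/Stop":
        state["running"] = not state["running"]
        show("Recording" if state["running"] else "Stopped")

def back():
    state["index"] = 0
    show(MENU[0])

# Four physical buttons: up, down, select, back (pin numbers are illustrative)
btn_up, btn_down = Button(5), Button(6)
btn_select, btn_back = Button(13), Button(19)
btn_up.when_pressed = lambda: move(-1)
btn_down.when_pressed = lambda: move(+1)
btn_select.when_pressed = select
btn_back.when_pressed = back

show(MENU[0])
pause()                               # wait for button callbacks
```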
Development Note: ⚡🔋 Power and Voltage Supply

AurorEye has some thirsty devices on its power bus that all need to be fed their specific diet of voltage and current.
5V can be supplied with an external barrel-jack adapter, which can charge the main battery pack and run the primary power bus. Consumption is typically about 1.5 Amps at 5VDC.
Most bus devices such as storage and ethernet are powered through the USB-A connectors on the Raspberry Pi.
I2C connected devices typically require a small amount of current at either 5VDC or 3.3VDC, which can be supplied by the RPI interface header pins via the custom PiHat.
The high-sensitivity magnetometer may need a separate, cleaner power supply, although this has yet to be determined in relation to other electromagnetic noise sources, such as the DC/DC converter and the camera shutter.
The camera requires a dedicated step-up converter to supply 7.2V, as it is typically powered by its own internal LiIon battery pack. Step-up converters in this application have some special requirements. Since the camera is forced to be "always on", the converter cannot supply any residual voltage to the camera, or it will not truly be 'off'. Nor can it supply any less than the 7.2V during power-up. Either of these states may make the camera's own internal power-up sequence misbehave.
To improve the behavior of the camera DC/DC converter, a load resistor of about 5 kOhm can be attached across the output of the converter. This will bleed residual power out of any capacitors in the converter circuit, preventing residual voltage from keeping the camera from powering up and down cleanly.
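For a feel of the numbers involved, a quick back-of-the-envelope calculation for the 5 kOhm bleed resistor (the converter output capacitance is an assumed, illustrative value):

```python
import math

V_OUT = 7.2         # volts, camera supply from the step-up converter
R_BLEED = 5_000     # ohms, bleed resistor across the output
C_OUT = 1000e-6     # farads, assumed total output capacitance

power_w = V_OUT ** 2 / R_BLEED               # steady-state waste, ~10 mW
tau_s = R_BLEED * C_OUT                      # RC time constant, ~5 s
t_below_1v = tau_s * math.log(V_OUT / 1.0)   # ~10 s to bleed below 1 V

print(f"bleed dissipation: {power_w * 1000:.1f} mW")
print(f"time constant: {tau_s:.1f} s, below 1 V after ~{t_below_1v:.0f} s")
```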
Another consideration is the off-the-shelf LiPo battery pack, usually used to charge mobile devices through USB-A. This has to be ready to supply power when AurorEye is hard-started - this means it cannot require a separate button push or a special power draw or signalling from the USB-connected loads. It also cannot auto-power-off if insufficient current draw is detected, which can be an issue with some supplies. Finally, the battery cannot cease supplying power if it detects some input current on its charging port, since the charging port is always connected to the rest of the power supply bus. This means that any off-the-shelf power pack needs to be checked out for proper behavior during power-up, charging, and continuous running, especially in cold conditions.
Development Note: 📷 Thoughtful choices in ISO, aperture and shutter speed

AurorEye is a light bucket over space and time. We have to get the most out of consumer grade imagers (which are very good) by not over or under exposing the sky and the aurora.
Aurora have large changes in brightness and apparent motion between active and quiet times. A sane choice from experience is to let the Aurora Chaser set AurorEye to one of the following configurations, all of which use the fisheye lens wide open (f/2.0):
- 2 second exposure at ISO 800, 1600, 3200
- 5 second exposure at ISO 800, 1600, 3200
- 15 second exposure at ISO 800, 1600, 3200
- 30 second exposure at ISO 800, 1600, 3200
This covers the approximate range of visible aurora exposure latitudes we can hope for at 8 bits per pixel per channel. The operator can set the timelapse sequence to any of these.
Motion was deemed an important kind of data to capture, so the default is 5 seconds at ISO 1600.
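For reference, the preset table expressed as data, along with the frame cadence each exposure implies when the inter-shot blackout is negligible (the 2 second setting corresponds to the 1800 frames per hour quoted earlier):

```python
# Exposure presets and the per-hour frame count each one implies.
EXPOSURES_S = [2, 5, 15, 30]
ISOS = [800, 1600, 3200]
DEFAULT = {"exposure_s": 5, "iso": 1600}   # favours capturing motion

for exp in EXPOSURES_S:
    print(f"{exp:>2} s exposure -> up to {3600 // exp} frames/hour "
          f"(ISO options: {ISOS})")
```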
Development Note: 📸 Choosing a camera control interface

AurorEye, at its heart, is a camera control system that must interface with a high resolution, high-sensitivity, off the shelf consumer camera.
Consumer cameras generally have four methods of being controlled externally:
- With a remote shutter release - very basic control, no offload or settings adjustment
- With a proprietary WiFi communication protocol. For example, some of Canon's cameras have a REST API that can be interacted with by a WiFi-connected, HTTP-capable client. This tends to be quite slow for image transfer, however.
- With a Picture Transfer Protocol compatible communication stack (often over USB). The Python library PTPy is one implementation.
- With vendor-specific extensions to the Picture Transfer Protocol, or fully proprietary protocols that allow partial or full access to the cameras features, settings, configuration, and setup.
PTP seems the best way forward as it is reasonably standardized and has provision for faster file transfer over a robust(ish) physical layer (USB) that does not take up an additional WiFi connection for the communicating system (computer, e.g. Raspberry Pi).
But it is low level, and does not define the communication stack (i.e. the physical communication layer). Some vendor-specific extensions and behavior have been mapped out in the gphoto project. gphoto has a command line client interface called gphoto2, and a C library called libgphoto2 for finer control (also used by the CLI).
Most relevant to AurorEye, Jim Easterbrook has created a Python wrapper for this C API called python-gphoto2.
Note: The gphoto project organization on Github has a python wrapper repo for libgphoto2, but it seems to not be maintained as frequently as the above repo.
libgphoto2 is frequently updated to support new cameras via USB connection. The full list is here.
libgphoto2 (and by extension the python-gphoto2 wrapper) supports different levels of compatibility with different cameras and manufacturers. It is not supported by the manufacturers, so functionality is generally a subset of the full camera settings. If a camera's proprietary interface is not accessible, the PTP layer (described above) is used as an intermediary.
For AurorEye, Canon cameras were chosen because of reasonable support by libgphoto2, including burst mode shooting, basic camera setting control, and image offload during burst mode shooting at a sufficient rate.
Some of the functions used:
- Get and Set camera configuration (ISO, Shutter speed)
- Virtually press and hold/release shutter button
- Inspection of the onboard SD Card file system including download and deletion
- Camera event message monitoring (and associated callback registry) to act as a polling system, as the communication interface is asynchronous to commands issued (sometimes multiple times) and data transferred
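A hedged sketch of what those calls look like through python-gphoto2. The 'eosremoterelease' widget name and its values are Canon-specific assumptions that vary between models, the ISO and shutter speed choice strings depend on the camera, and error handling is omitted:

```python
import gphoto2 as gp

camera = gp.Camera()
camera.init()

# Get and set camera configuration (ISO, shutter speed)
config = camera.get_config()
config.get_child_by_name('iso').set_value('1600')
config.get_child_by_name('shutterspeed').set_value('5')
camera.set_config(config)

# Virtually press the shutter button (assumed Canon remote-release widget)
config = camera.get_config()
config.get_child_by_name('eosremoterelease').set_value('Immediate')
camera.set_config(config)

# Poll the asynchronous event stream and pull new images off the SD card
while True:
    event_type, event_data = camera.wait_for_event(1000)   # timeout in ms
    if event_type == gp.GP_EVENT_FILE_ADDED:
        camera_file = camera.file_get(event_data.folder, event_data.name,
                                      gp.GP_FILE_TYPE_NORMAL)
        camera_file.save('/mnt/usb/' + event_data.name)     # RPI USB storage
```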
Some of the issues to contend with using this communication approach:
- There is no command queue - AurorEye software must keep checking that a command has been accepted, and implement a reasonable number of retries and a failure-handling mechanism (see the retry sketch after this list)
- The burst mode method of gathering timelapse data with a minimum blackout between images runs up against a safety feature of the Canon camera: a maximum of 999 shots for a given burst. The shutter up and down command must be successfully issued before 999 frames are shot
- Unable to set capture quality/movie mode: at least for the Canon M100 camera, these cannot be accessed or set and must be configured one time using the camera's own interface
- In burst mode, inability to offload images directly through USB to RPI mass storage. The SD Card must be used as an intermediary.
- Image storage to SD card from Camera Buffer takes time, as does the event system that notes an image has been stored. This means there is a delay between the end of an exposure and the moment that image ends up on the RPI USB mass storage system
- Cannot access the internal timelapse functionality (it does not really meet our needs anyway, but if it did, for example by allowing arbitrary timelapse settings, that would reduce much of the RPI software duties)
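Since there is no command queue, commands are wrapped in a simple retry helper along these lines (a sketch; set_iso in the usage example is a hypothetical helper):

```python
import time
import gphoto2 as gp

def issue_with_retries(command, retries=5, delay_s=0.5):
    """Call `command()` until it succeeds, or raise after `retries` attempts."""
    for attempt in range(1, retries + 1):
        try:
            return command()
        except gp.GPhoto2Error:
            if attempt == retries:
                raise                  # hand off to the failure-handling path
            time.sleep(delay_s)        # brief pause before trying again

# Example: retry setting the ISO (set_iso is a hypothetical helper)
# issue_with_retries(lambda: set_iso(camera, '1600'))
```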
These limitations may not be present on other Canon camera models, or there may be different limitations. For example, the M100 stores RAW images in the lossless/uncompressed CR2 file format which results in files of around 30MB each. The newer M200 supports both the REST WiFi API (too slow for image transfer for our needs) and the CR3 lossy RAW format, which would be better for mass storage and transfer of ASI timelapse data.
Overall, necessity results in creative thinking, and these limitations and capabilities have been woven together (and sometimes bypassed) to allow AurorEye to meet its imaging requirements. A Huge Thanks to all the maintainers who have made these projects what they are today.