AurorEye is a portable, autonomous all-sky imager optimized for timelapse Aurora photography

AurorEye is designed to be carried by Aurora Chasers and to capture calibrated, all-sky timelapses of auroral events for use in research and citizen science. Each unit uses off-the-shelf camera hardware and lenses. All control, telemetry, networking, and cloud upload functions are handled by an integrated single-board Linux computer. AurorEye can capture a full night of timelapse photos (1,000-5,000+ images) at up to 24MP resolution. A unique feature of the timelapse imagery is the minimal time gap between image exposures, resulting in smoother timelapses with minimal missing motion.

Features

Locations where AurorEye units have captured data and list of Operators

AurorEye units have campaigned in these locations, but not yet simultaneously. The goal is to have multiple deployments with Aurora Chasers and simultaneous all-sky timelapse imagery of aurora.

Operators

Watch the 2021 AGU presentation

Timelapse Archive

AurorEye timelapse videos are hosted on a dedicated YouTube channel at 4K resolution with radial keograms. If you are a student, citizen scientist, or researcher interested in the content of these timelapses, please get in touch at jeremy@jufaintermedia.com

Visit the timelapse archive at YouTube

LATEST UPDATES

A Reminder why we do this: Citizen Science

These photos were posted in the Alberta Aurora Chasers Facebook group and were assembled into an animation of flying under the aurora. Citizen Science has a pretty broad definition, including data visualization and image processing in novel ways.

Tweet about the contribution of citsci

Deployment plans for 2023/2024

Some frames for the Spring 2023 season
3 stills from the Spring 2023 season. From left to right: anomalous SAR/STEVE at high latitude during quiet conditions; capture of a Falcon 9 staging event over Alaska; Kp7/8 geomagnetic conditions

Deployment highlights from 2023:

For Spring 2024

Further plans include geometric calibration of the sensor and lens combination to allow projection onto a map, and the potential to combine these images with those from other ASI networks.

Data Update:
April 30, 2023

AurorEye timelapse videos are now hosted on a dedicated YouTube channel at 4K resolution with radial keograms.

Visit the timelapse archive at YouTube

If you are a student, citizen scientist, or researcher interested in the content of these timelapses, please get in touch at jeremy@jufaintermedia.com

Why not a dedicated, cloud powered video server and custom React front end and so on?

YouTube is the cloud-based, free video hosting service that suits AurorEye best at this time. Setting up our own cloud server for these summary 4K timelapses would be potentially costly (if there were a sudden rise in views) and we would have to reinvent the video playback wheel. YouTube offers some great advantages like visibility, bandwidth, playback speed, multiple resolutions, and linkable chapter markers for specific auroral events within timelapses.

Deployment Update:
April 17, 2023

Unit 05 - some new tricks and good conditions

Unit 05 was built to rely on its wireless access point and web app for Operator control. This worked well, allowing the Operator to set up and then monitor status, logs, and images from a distance, even back in the vehicle while warming up. Clear skies in Yellowknife allowed capture of SAR/STEVE, the red aurora of the Kp8 March 23/24 event, and several substorms and large formations ("churns"). This data was captured within 20-30km of the AuroraMAX ASI, which could allow comparison of captured data and some parallax experiments.

Unit 04 - Fairbanks and PFRR

Unit 04 arrived in Fairbanks in good order and was tested by two Operators (see Deployments section for credits)

Unit 04 also had the opportunity to be hosted under one of the domes at Poker Flat Research Range, demonstrating remote control and multi-sequence capture over two nights. The unit used its external power option and connected to the local WLAN, allowing remote operation via VNC and the terminal interface.

Unit 04 installed in its temporary location under a dome at PFRR. Thanks to V. Ledvina and Dr. Hampton for this opportunity (photo by V. Ledvina)

Video processing workflow using free video editing software

Using the free version of DaVinci Resolve allows image processing pipelines from image sequences to 4K and 8K timelapses using the Fusion node network system. This promises flexibility and rapid iteration when combining data, before committing to a coded solution.

Davinci Resolve Fusion workflows for processing image sequences and metadata

Deployment Update: March 28, 2023

Unit 04 has been sent to Fairbanks, AK for testing. Unit 05 was up in Yellowknife for testing and was deployed during the Kp8 geomagnetic storm of March 23/24, 2023

This is an interim update with data still being analyzed. Units 04 and 05 were both deployed for testing in northern latitudes. More to come, but this is the beginning of testing with Aurora Chasers.

Development Update: March 2023

Unit 05 in a nice 558nm-green case. Note that all control is through a phone rather than onboard controls, to test the idea of trading hardware effort for software effort in time- or resource-constrained builds

The theme of this month's update is "Ship it": we are getting some units out to the field for testing:

Unit 04 to Fairbanks

This should be in good hands with the aurora chaser who will be operating it - the journey through the mail is the riskier part. This test campaign has a few milestones:

  1. Overall user experience: does the unit work well enough to not burden an aurora chaser on their chase, while capturing the data it was designed to capture?
  2. Survival through the mail - special packaging is needed for electronics: modular polyethylene foam extrusions (aka pool noodles) have the density and impact absorption needed for shipping and are much easier to shape around electronics. Hopefully they will do the trick
  3. Survival through international customs - shipping science equipment without incurring extra fees or customs scrutiny
  4. Home base functionality - can the user set up the AurorEye unit on their local network successfully, like any internet-connected household device, using the built-in wireless access point setup tool?
  5. Field functionality - can the unit be set up and capturing imagery with low overhead?
  6. Cloud functionality - can the unit successfully offload a night's image sequences to the cloud storage facility?
  7. Robustness - can the unit stand up to the unknown unknowns of aurora chasing with a new user?

Unit 05 to Yellowknife

Unit 05 was an exercise in minimalism - there was very little time to build it before it was due in Yellowknife for a week of testing. This meant cutting it to the bare bones, which was also a great learning opportunity:

Stretch Goal: simultaneous image capture from two AurorEye units

A stretch goal in late March: if both units are operational and in position, can they simultaneously capture imagery on the same night? This would be a demonstration of an ad-hoc all-sky camera network. Fairbanks and Yellowknife are about 1600km apart, so there would likely not be any imagery overlap, but showing two animated image circles on a map would be a significant step in proving out this project.

Development Update: February 2023

The theme of this month's updates is "Doing What You Do Best". What this means is splitting the computation between the onboard computer and the Cloud differently than originally planned, due to limitations of the Raspberry Pi and the increase in available offload bandwidth for Aurora Chasers:

The tl;dr is this: we know we have more upload bandwidth than originally planned, so why not use it? Leaving AurorEye out of the pixel-level image processing pipeline maximizes the image resolution, quality, and imaging cadence of the AurorEye hardware.

Let's look at three key changes that resulted:

Move away from 1080 locally encoded video with "HUD" overlay

Data overlay on each and every frame. This simply was too much for the RPI to manage, especially re-encoding the JPG, which took 1+ seconds
So long, per-image HUD overlay, you're moving to the EXIF headers

This is computationally intensive for the RPI at two stages:

  1. During image capture, when the RPi applies the data overlay - this requires a re-save of the original camera JPG, which carries a double-image-compression penalty and up to 1.5 seconds of compression time
  2. During video encoding from JPEG to 1080p MP4 via the RPi hardware encoder, which takes time and has absolute hardware limits on video quality.

This approach was "traditional", meaning the standard all-sky camera approach of outputting a data-overlaid timelapse movie and reducing the data from the gigabyte to the megabyte range. In reality, we do have the upload bandwidth and the time for the gigabyte range, and that network rate will only get better with time. For this reason, we are...

Moving ahead with 4K+ image sequences with Metadata embedded in the EXIF header

The camera can shoot up to 4000x4000px (4K) JPGs at a rate of 30 frames per minute (2-second exposures), and possibly twice that.

Rather than overlaying metadata like GPS location and timestamp on the image, it can be more efficiently and quickly embedded in the JPG file's EXIF tags. This embedding step can keep up with the file size and cadence for at least 2-second exposures at 4K.
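To see why this is cheap: rewriting the EXIF header touches only a few kilobytes, whereas the HUD overlay forced a full re-encode of the frame. A minimal sketch using the piexif library (the helper, field choices, and coordinates below are illustrative assumptions, not AurorEye's actual code):

```python
# Embed capture metadata in a JPG's EXIF header without re-encoding the image.
import piexif

def embed_metadata(jpg_path, timestamp, lat_deg, lon_deg):
    exif_dict = piexif.load(jpg_path)              # keep the camera's own EXIF
    exif_dict["Exif"][piexif.ExifIFD.DateTimeOriginal] = timestamp.encode()

    def to_dms(deg):
        # EXIF stores GPS angles as degree/minute/second rationals
        d = int(deg); m = int((deg - d) * 60)
        s = int(round(((deg - d) * 60 - m) * 60 * 100))
        return ((d, 1), (m, 1), (s, 100))

    exif_dict["GPS"][piexif.GPSIFD.GPSLatitudeRef] = b"N" if lat_deg >= 0 else b"S"
    exif_dict["GPS"][piexif.GPSIFD.GPSLatitude] = to_dms(abs(lat_deg))
    exif_dict["GPS"][piexif.GPSIFD.GPSLongitudeRef] = b"E" if lon_deg >= 0 else b"W"
    exif_dict["GPS"][piexif.GPSIFD.GPSLongitude] = to_dms(abs(lon_deg))

    # Rewrites only the header bytes in place; the compressed pixel data is
    # untouched, so there is no second JPG compression pass.
    piexif.insert(piexif.dump(exif_dict), jpg_path)

embed_metadata("IMG_0001.JPG", "2023:03:24 02:15:30", 62.454, -114.372)
```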

This approach quadruples the resolution (and improves quality due to less compression) versus the old 1080p MP4 video method, since we are freed of RPi video hardware limitations.

In order to efficiently get thousands of JPGs uploaded to the cloud, they are combined into a standard TAR file. This is not compressed (since the JPGs are already compressed), bypassing the intensive video creation step and the arcane ffmpeg encoding step. This minimizes onboard processing time and maximizes quality, allowing video to be created in the cloud at a later date from higher quality images.
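The bundling step is a few lines with Python's standard tarfile module; a sketch with illustrative paths:

```python
# Bundle a night's JPGs into one uncompressed TAR for upload. Mode "w"
# (no gzip/bzip2 suffix) skips compression, since the JPGs are already
# compressed and would not shrink further.
import glob
import tarfile

with tarfile.open("night_2023-03-24.tar", "w") as tar:
    for jpg in sorted(glob.glob("captures/2023-03-24/*.jpg")):
        tar.add(jpg)
```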

Remote management

Ultimately, the first few units will need software updates, log reviews, and fixes performed remotely.

After considering several methods, such as a custom zip file update system or software updates via the Python package manager pip, the most flexible system was VNC remote access using a cloud connection service. This has the advantage of not committing to a single software update method (which would have limited access), while providing maximum remote access to the RPi in the AurorEye units. Using a cloud connection service such as RealVNC also leverages their security protocols and does not require AurorEye users to set up their local Internet/local router with port forwarding or other customization.

While all these changes have required some rewrites of the AurorEye software, they should better match the strengths and limitations of the onboard RPI, the Cloud, and the internet connection between them.

Development Update: 🌎 AGU 2022

AurorEye got some exposure as a poster in the MacGyver Sessions at the American Geophysical Union Fall 2022 Meeting. Here is the poster link. Thanks to Vincent Ledvina of the University of Alaska (Fairbanks) Geophysical Institute and HamSCI for helping citizen science efforts get some attention, and a sincere thanks to Dr. Liz Macdonald of NASA Goddard and Laura Brandt of Aurorasaurus.

Some of the new developments presented included version 2 of AurorEye: 1/8th the volume of the original, and using a custom Raspberry Pi HAT:

Printed circuit board for simplicity: custom Raspberry Pi HAT using the free KiCad design software and manufactured by JLCPCB

Parts are harder to find this year; however, the cameras, PCBs, cases, lenses, and Raspberry Pis needed for several units are now in and ready for assembly:

Row of parts awaiting assembly into AurorEye ASI units

Field Update: 👾 Dark Skies and Leonid meteors in Nova Scotia

AurorEye timelapse sequence compressed into a single frame; a bright green meteor trail is visible in the lower left

AurorEye can also work as a meteor capture timelapse imager.

Overall, we were pleased with the capability to capture meteors, and it adds to the possible uses and future extensions of AurorEye.

Development Update: 🎱 Deciding where to do the number crunching

AurorEye is designed to strike a balance between endurance, image quality, processing time, and upload bandwidth requirements

AurorEye can shoot all night long at 1800 frames per hour at 4K resolution. That's quite a bit of data that needs to be processed and uploaded.
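For a rough sense of scale (the average JPG size below is an assumption for illustration, not a measured AurorEye figure):

```python
# Back-of-envelope nightly data budget.
frames_per_hour = 1800      # continuous 2-second exposures
hours = 8                   # a long night of shooting
mb_per_jpg = 3.0            # assumed average 4K JPG size
total_gb = frames_per_hour * hours * mb_per_jpg / 1000
print(f"~{total_gb:.0f} GB of frames per night")   # ~43 GB
```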

Some of the tools we have at our disposal to shoot all night and process during the non-shooting periods:

Onboard encoding and processing of frames into h.264 MP4 movies and PNG-format keograms is a feature of AurorEye. Since it will be idling in the late morning, after the Aurora Chaser stumbles wearily back through their door and shakes the snow off their boots, it makes sense for it to process that night's captured footage and upload it - essentially it has nothing else to do during the day.

But there are options:

The h.264 movie for a night of shooting is about 200-1000MB in size, depending on JPG scene complexity and number of frames; processing takes about 10 minutes per hour of timelapse onboard the AurorEye.

AurorEye also includes the ability to purge internal storage of any 4K frames and uploaded sequences, freeing up space. Alternatively, the aurora chaser can simply remove the SD card, mail it to AurorEye Home Base using snail mail, and slip in a new SD card of any size.

Providing maximum flexibility is part of getting the best data we can out of the system.

Field Update: 🥶 -20degC and Hand Warmers for Robots

AurorEye during a tropical vacation indoors. Temperature and Humidity sensors are active

AurorEye has to be able to shoot in cold weather.

Fortunately, it has performed well so far. It is close to a sealed unit with a small internal volume, so the electronics' power dissipation of about 5 watts is enough to keep the temperature elevated versus the environment.

It also has humidity and internal temperature sensors that can be used to monitor and track changes in performance with these atmospheric variables.

Testing in Alberta and Yellowknife in fall and spring conditions (-15degC or so) shows that a stable internal temperature is reached because of electronics power dissipation. This is a guide on how to use AurorEye in cold weather:

  1. Keep AurorEye on - don't bother conserving battery power. A Li-ion battery needs to be warm to work (and can be damaged if charged cold); leaving the unit on self-heats all the electronics and the battery
  2. Most likely it is not a great idea to take the AurorEye back into the heated, humid comfort of your vehicle when relocating to another area to shoot - lens condensation may result. TODO: track humidity and temperature over time to be able to provide a condensation warning (see the sketch after this list)
  3. If temperatures are really low, consider adding a few hand warmers to the internal space of the AurorEye. An increase of a few degrees has been noted by doing this.
  4. Thermal insulation may be a good addition to the units
  5. The Raspberry Pi can act as a heater: The CPU can create a significant amount of thermal energy when running. Adding additional tasks to the 4 cores of the CPU may effectively increase thermal output!
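As a sketch of the condensation-warning idea from item 2, using the standard Magnus dew-point approximation (the function names and threshold logic are illustrative, not shipped AurorEye code):

```python
# Warn when a cold-soaked lens is at or below the dew point of warmer,
# more humid air (e.g. a heated vehicle) and will fog up.
import math

def dew_point_c(temp_c, rel_humidity_pct):
    b, c = 17.62, 243.12    # Magnus coefficients, valid roughly -45..60 degC
    gamma = math.log(rel_humidity_pct / 100.0) + (b * temp_c) / (c + temp_c)
    return c * gamma / (b - gamma)

def condensation_risk(air_temp_c, air_rh_pct, lens_temp_c):
    return lens_temp_c <= dew_point_c(air_temp_c, air_rh_pct)

# A -15 degC lens carried into a 20 degC, 40% RH vehicle: dew point ~6 degC
print(condensation_risk(air_temp_c=20.0, air_rh_pct=40.0, lens_temp_c=-15.0))  # True
```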

Development Note: 🕹 interfaces, menus, and mittens

AurorEye has 4 interactive buttons, tactile and glove-friendly. Compare to a modern ILC camera with touchscreen and buttons and joysticks that are hard to use with any hand protection

AurorEye is Mitten-Friendly. This is important because the users will often be in -XXdegree temperatures for hours on end.

Modern camera interfaces usually use tiny buttons, touchscreens, or WiFi phone interfaces to automate timelapse shooting. Generally, they are not that fun to use in real-world aurora shooting conditions.

While AurorEye can communicate and do advanced configuration via its local WiFi hotspot and the user's phone, out in the field the design is meant for use while fully protected from the cold.

Constraints force simplicity and usability. Having only 4 buttons streamlines one's thinking about the user interface. A small monochrome OLED screen (which works well in the cold) enforces the hands-off functionality of AurorEye.
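A 4-button event loop on the RPi really can be this small; an illustrative sketch with the gpiozero library (pin numbers and action names are assumptions, not AurorEye's actual wiring or menu code):

```python
# Four tactile, glove-friendly buttons driving a simple menu.
from gpiozero import Button
from signal import pause

BUTTONS = {
    "up":     Button(5),    # scroll the OLED menu up
    "down":   Button(6),    # scroll the OLED menu down
    "select": Button(13),   # start a sequence / confirm
    "back":   Button(19),   # stop / cancel
}

def make_handler(name):
    def on_press():
        print(f"button pressed: {name}")   # real code would update the OLED
    return on_press

for name, button in BUTTONS.items():
    button.when_pressed = make_handler(name)

pause()   # block forever, waiting for button events
```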

Development Note: ⚡🔋 Power and Voltage Supply


AurorEye has some thirsty devices on its power bus that all need to be fed their specific diet of voltage and current.

5V can be supplied with an external barrel-jack adapter, which can charge the main battery pack and run the primary power bus. Consumption is typically about 1.5 Amps at 5VDC.

Most bus devices such as storage and ethernet are powered through the USB-A connectors on the Raspberry Pi.

I2C connected devices typically require a small amount of current at either 5VDC or 3.3VDC, which can be supplied by the RPI interface header pins via the custom PiHat.

A high-sensitivity magnetometer may need a separate, cleaner power supply, although this has yet to be determined in relation to other electromagnetic noise sources such as the DC/DC converter and the camera shutter.

The camera requires a dedicated step-up converter to supply 7.2V, as it is normally powered by its own internal Li-ion battery pack. Step-up converters in this application have some special requirements. Since the camera is forced to be "always on", the converter cannot supply any residual voltage to the camera, or it will not truly be 'off'. Nor can it supply any less than the 7.2V during power-up. Either of these states may make the camera's own internal power-up sequence misbehave.

To improve the behavior of the camera DC/DC converter, a load resistor of about 5 kOhm can be attached across the output of the converter. This will bleed residual power out of any capacitors in the converter circuit, preventing residual voltage from stopping the camera from powering up and down cleanly.
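A quick back-of-envelope check of that bleed resistor (the 7.2V rail and ~5 kOhm resistor come from the text above; the output capacitance is an assumed value for illustration):

```python
# Bleed resistor sanity check: tiny steady-state cost, fast discharge.
V, R = 7.2, 5_000.0
I = V / R                   # steady bleed current: ~1.4 mA
P = V * I                   # wasted power: ~10 mW, negligible
C = 470e-6                  # assumed converter output capacitance (farads)
tau = R * C                 # discharge time constant: ~2.4 s
print(f"{I*1e3:.1f} mA, {P*1e3:.0f} mW, tau = {tau:.1f} s")
```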

Another consideration is the off-the-shelf LiPo battery pack, of the kind usually used to charge mobile devices through USB-A. This has to be ready to supply power when AurorEye is hard-started - this means it cannot require a separate button push or a special power draw or signalling from the USB-connected loads. It also cannot auto-power-off if insufficient current draw is detected, which can be an issue with some supplies. Finally, the battery cannot cease supplying power if it detects some input current on its charging port, since the charging port is always connected to the rest of the power supply bus. This means that any off-the-shelf power pack needs to be checked for proper behavior during power-up, charging, and continuous running, especially in cold conditions.

Development Note: 📷 Thoughtful choices in ISO, aperture and shutter speed

AurorEye has to capture as much light as possible. But without too much sensor noise, or sensor/lens cost, or image blurring from aurora motion. These frames are part of a sequence near Yellowknife, NWT in spring of 2022 during a nearly full moon

AurorEye is a light bucket over space and time. We have to get the most out of consumer-grade imagers (which are very good) by not over- or under-exposing the sky and the aurora.

Aurora have large changes in brightness and apparent motion between active and quiet times. A sane choice from experience is to let the Aurora Chaser set AurorEye to one of the following configurations, all of which use the fisheye lens wide open (f/2.0):

This covers the approximate range of visible aurora exposure latitudes we can hope to capture at 8 bits per channel. The operator can set the timelapse sequence to any of these.

Motion was deemed an important kind of data to capture, so the default is 5 seconds at ISO 1600.

Development Note: 📸 Choosing a camera control interface

Ultimately, AurorEye's core is a robust camera control interface (photo by author)

AurorEye, at its heart, is a camera control system that must interface with a high resolution, high-sensitivity, off the shelf consumer camera.

Consumer cameras generally have four methods of being controlled externally:

  1. With a remote shutter release - very basic control, no offload or settings adjustment
  2. With a proprietary WiFi communication protocol. For example, some of Canon's cameras have a REST API that can be accessed by any WiFi-connected, HTTP-capable client. This tends to be quite slow for image transfer, however.
  3. With a Picture Transfer Protocol (PTP) compatible communication stack (often over USB). The Python library PTPy is one implementation.
  4. With vendor-specific extensions to the Picture Transfer Protocol, or fully proprietary protocols that allow partial or full access to the cameras features, settings, configuration, and setup.

PTP seems the best way forward, as it is reasonably standardized and has provision for faster file transfer over a robust(ish) physical layer (USB) that does not take up an additional WiFi connection on the controlling computer (e.g. a Raspberry Pi).

But it is low level, and does not define the communication stack (i.e. the physical communication layer). Some vendor-specific extensions and behavior have been mapped out in the gphoto project. gphoto has a command-line client called gphoto2, and a C library called libgphoto2 for finer control (also used by the CLI).

Most relevant to AurorEye, Jim Easterbrook has created a Python wrapper for this C API called python-gphoto2.
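As a minimal sketch of what the wrapper's API looks like (illustrative usage, not AurorEye's actual control loop):

```python
# Capture one frame and download it over USB with python-gphoto2.
import gphoto2 as gp

camera = gp.Camera()
camera.init()                                   # open the first USB camera

# Trigger one exposure; the camera reports where the file landed on-card
path = camera.capture(gp.GP_CAPTURE_IMAGE)

# Pull the JPG off the camera over USB and save it locally
camera_file = camera.file_get(path.folder, path.name, gp.GP_FILE_TYPE_NORMAL)
camera_file.save(path.name)
camera.exit()
```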

Note: The gphoto project organization on Github has a python wrapper repo for libgphoto2, but it seems to not be maintained as frequently as the above repo.

libgphoto2 is frequently updated to support new cameras via USB connection. The full list is here.

libgphoto2 (and by extension the python-gphoto2 wrapper) supports different levels of compatibility with different cameras and manufacturers. It is not supported by the manufacturers themselves, so functionality is generally a subset of the full camera settings. If a camera's proprietary interface is not accessible, the PTP layer (described above) is used as an intermediary.

For AurorEye, Canon cameras were chosen because of reasonable support by libgphoto2, including burst-mode shooting, basic camera setting control, and image offload during burst-mode shooting at a sufficient rate.

Some of the functions used:

Some of the issues to contend with using this communication approach:

These limitations may not be present on other Canon camera models, or there may be different limitations. For example, the M100 stores RAW images in the lossless/uncompressed CR2 file format, which results in files of around 30MB each. The newer M200 supports both the REST WiFi API (too slow for image transfer for our needs) and the lossy CR3 RAW format, which would be better for mass storage and transfer of ASI timelapse data.

Overall, necessity results in creative thinking, and these limitations and capabilities have been woven together (and sometimes bypassed) to allow AurorEye to meet its imaging requirements. A huge thanks to all the maintainers who have made these projects what they are today.

Frequently Asked Questions

About AurorEye Imagery:

Where can I view AurorEye timelapses?

This dedicated YouTube playlist has all videos in 4K resolution (https://youtu.be/cMFbIgYoFqg?si=eRWC28COtkQwhWa3)

What is a Keogram?

AurorEye timelapse summaries use a radial keogram to act as a visual clock of overhead aurora activity - the more green and red, the more aurora is visible at that time. See the Keogram article on Wikipedia.
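For the curious, a conventional keogram can be built from an image sequence in a few lines; AurorEye's radial variant additionally wraps the time axis into a circle. A sketch assuming Pillow and NumPy and same-sized frames (illustrative, not AurorEye's pipeline):

```python
# Stack one vertical slice per frame into a keogram: x = time, y = sky position.
import glob
import numpy as np
from PIL import Image

columns = []
for path in sorted(glob.glob("frames/*.jpg")):
    img = np.asarray(Image.open(path))
    columns.append(img[:, img.shape[1] // 2])   # slice through the zenith

keogram = np.stack(columns, axis=1)
Image.fromarray(keogram).save("keogram.png")
```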

Where can I request specific frames or sequences from AurorEye videos for non-commercial research use?

Please contact us using the email in the Contact Section

Can AurorEye all-sky timelapses and images be seen as immersive media (i.e. YouTube 360)?

Yes, please see this sample YouTube 360 video for an immersive first-person view. It works on phones and laptops/desktops.

What aspects of AurorEye are considered calibrated?
  • Location (fuzzed to keep aurora chaser locations discreet)
  • Orientation to zenith (part of setup and levelling by aurora chaser)
  • Image timestamps to within about 1 second
  • Lens parameters (standard lens used)
  • Camera parameters (camera manufacturer model and associated sensor size, pixel pitch, filter mosaic pattern)
What aspects of AurorEye are considered not calibrated?
  • Mapping of image pixel brightness in RGB channels value to actual flux - this is determined by the camera internally, and could potentially be calibrated with standard sources
  • Please check out this link about calibration considerations for commercial RGB sensors with Bayer pattern filters

For Operators or Educators:

Are AurorEye units available for sale?

At this time they are lent out to qualified aurora chaser participatory researchers and are not for sale

I run an education program on Aurora, can I use AurorEye timelapses or imagery?

AurorEye timelapses on the YouTube playlist can be used in non-profit educational settings with permission. Please get in touch via email and let us know how you wish to use the content

Can I operate an AurorEye unit next season?

Please get in touch with us using the Contact Section

Is there a mailing list for updates?

Please sign up for the newsletter using the Contact Section

Could this camera be adapted to record meteor showers or watch for red sprites?

AurorEye cameras capture the whole night sky and have recorded meteors before, so this is a potential use for them

How is AurorEye different than Raspberry Pi "All Sky Camera" projects I have seen online?

There are several wonderful projects to democratize all-sky imagery, typically focusing on fixed installations with cameras like the Raspberry Pi HQ Camera. AurorEye is designed to gather high quality imagery and be portable, self contained, offline-capable, and very easy and quick to set up. The project also includes cloud upload and a software pipeline to publish 4K timelapses online.

Can I see what AurorEye is capturing in real time?

AurorEye is designed to be out in the field with aurora chasers, away from internet connectivity. However, any smartphone within 10m or so of an AurorEye unit can connect to its WiFi and view the images being captured in near-real-time

How long does it take to set up AurorEye in the field?

About 2 minutes: turn the unit on, put it on a standard camera tripod, level it so it is pointing straight up (using the onboard levelling sensor), and press the Start button

What exposure times and frame rates can AurorEye capture?

From as short as 1-2 seconds to as long as 30 seconds per exposure. Each exposure starts as soon as the previous one ends. This covers situations with fast motion as well as dimmer aurora

Can I leave AurorEye out for multiple days?

AurorEye cameras are designed to travel with the aurora chaser during their chase. Units should be brought inside after a nightly chase

How many images can AurorEye capture in a session?

Typical AurorEye sequences will be several hours and several thousand frames. Onboard storage can hold about 60,000 exposures

How do I offload images from AurorEye?

Press the Offload button and images are automatically uploaded to the cloud for processing and sharing. 50,000-60,000 images can be stored on AurorEye before offload is necessary.

How good is the image quality?

The camera in an AurorEye unit uses a consumer CMOS APS-C sized sensor. Images are captured in a circular fisheye format covering a 4000-pixel diameter with 8-bit RGB channels. Pixel pitch is approximately 3.7μm

How much photography experience is required?

None, however most people that might want to operate an AurorEye unit already have some aurora photography experience

Can I fill out survey allowing me to provide feedback about AurorEye features?

Here is a link to the AurorEye feature vote form. Your feedback is welcome

Is AurorEye Hardware and Software Open Source?

At this time the source code and hardware are under active development, and have not been released as OSS or OSH

Is AurorEye designed for fixed locations?

The current version of AurorEye is focused on portability - a feature of the project is the ability for aurora chasers to chase to the best locations, rather than staying at home where conditions may be non-ideal (poor visibility, cloud, light pollution)

Is there an #AurorEye hashtag across social media?

Yes, the #AurorEye hashtag is used for content on several platforms including YouTube, Twitter, and Instagram.

For obtaining, using, and citing imagery:

Where can I request specific frames or sequences from AurorEye videos for non-commercial research use?

Please contact us using the email in the Contact Section

Citing AurorEye or using AurorEye content in a conference poster, paper or other publication
  • For individual frames or special timelapse sequences, please contact jeremykuzub@jufaintermedia.com
  • Bibcode on ADS for the AGU 2022 presentation: https://ui.adsabs.harvard.edu/abs/2022AGUFMSH51D..04K/abstract
  • APA style website citation: AurorEye automated portable aurora all-sky camera system. (2024). AurorEye.ca. Retrieved April 12, 2024, from https://AurorEye.ca
  • APA parenthetical: (AurorEye Automated Portable Aurora All-sky Camera System, 2024)
  • Author contact: Jeremy Kuzub, Jufa Intermedia (jeremykuzub@jufaintermedia.com)
I am doing some image processing. Where can I find specific information like lens and camera parameters for specific AurorEye content?

Current Units use Canon APS-C 24MP sensors and Meike 6.5mm Fisheye f/2.0 lenses

Please contact jeremykuzub@jufaintermedia.com for more details

How can I use one or more AurorEye units in a specific upcoming research campaign or proposal?

Please contact jeremykuzub@jufaintermedia.com