Dark Sky Network

DSN

The Dark Sky Network (DSN) monitors night sky brightness over Southern Arizona in a ten-year campaign that began in 2025. We use SQM (Sky Quality Meter) and TESS units. Several units were already in place before the campaign began, some running for up to 7 years; their data are periodically incorporated into our data space. We took delivery of SQM units on 2/6/25 and TESS units on 5/19/25. As of 10/19/25, the DSN comprises 17 units.

Caution

This code is a work in progress.

The Process

The GitHub workflow DSN-process_data runs weekly. If it finds data files in the GitHub directory DSNdata/NEW, it processes them (see the steps below).

Step 1

SQM/TESS raw data are uploaded to DSNdata/NEW. Upload may be manual (e.g. for SQMs without internet access) or automatic, using a Raspberry Pi 4B. Files are named for the site and sensor, in the format DSNnnn-U_SiteName_yy-sss.dat.

Two GitHub workflows harvest data for processing into DSNdata/NEW. In addition, a shell script outside GitHub, running on DSN-imac, harvests data into DSNdata/NEW.
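The file naming convention above can be checked mechanically before processing. A minimal sketch in Python; the field names (`unit`, `sensor`, `site`, `year`, `seq`) are our own illustrative labels for the nnn, U, SiteName, yy and sss parts, not necessarily what DSN_python uses:

```python
import re

# Hypothetical parser for the DSNnnn-U_SiteName_yy-sss.dat convention.
# Field names below are illustrative labels, not DSN_python's own.
FILENAME_RE = re.compile(
    r"^DSN(?P<unit>\d{3})-(?P<sensor>[A-Z])"   # DSNnnn-U
    r"_(?P<site>[A-Za-z0-9]+)"                 # _SiteName
    r"_(?P<year>\d{2})-(?P<seq>\d+)\.dat$"     # _yy-sss.dat
)

def parse_dat_name(name: str):
    """Return the filename's parts as a dict, or None if it doesn't match."""
    m = FILENAME_RE.match(name)
    return m.groupdict() if m else None
```

A harvest step could use this to skip (and flag) any file that does not follow the convention.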

Step 2

DSN-process_data looks for data in DSNdata/NEW. If it finds any, it runs DSN_python on each file to calculate chisquared, moonalt (the Moon's altitude) and LST (local sidereal time).

  1. For each input file, DSN_python writes a .csv file to DSNdata/INFLUX, in the format DSNnnn-U_SiteName_yy-nn.csv.
  2. For each input file, DSN_python also writes a .csv file with UTC, SQM, lum, chisquared, moonalt and LST columns to DSNdata/BOX. These files form an archive of the processed data.
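As a worked example of one of these quantities, LST can be computed from UTC and the site longitude using the standard low-precision GMST polynomial; DSN_python's actual implementation may differ in method and precision:

```python
from datetime import datetime, timezone

def lst_hours(utc: datetime, longitude_deg: float) -> float:
    """Approximate local sidereal time in hours.

    Uses the common low-precision GMST polynomial; longitude is
    degrees, east-positive (so Southern Arizona is negative).
    """
    # Julian Date from a UTC timestamp
    jd = utc.timestamp() / 86400.0 + 2440587.5
    # Greenwich Mean Sidereal Time in degrees
    gmst = (280.46061837 + 360.98564736629 * (jd - 2451545.0)) % 360.0
    # Shift to the local meridian and convert degrees to hours
    return ((gmst + longitude_deg) % 360.0) / 15.0

# Example: a site near longitude -111 deg (Southern Arizona)
lst = lst_hours(datetime(2025, 6, 1, 6, 0, tzinfo=timezone.utc), -111.0)
```

A useful sanity check on any LST routine: sidereal time advances about 1.0027 hours per clock hour.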

Step 3

The .csv format is suitable for ingestion into InfluxDB, which feeds Grafana for visualization. Each .csv file that DSN_python writes is tagged with its site label, DSNnnn-U_SiteName, so that InfluxDB can route it to the appropriate site-specific "dashboard" for Grafana to display. Each .csv file is uploaded into InfluxDB and then deleted from DSNdata/INFLUX.
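InfluxDB ingests points in its line protocol, where the site label rides along as a tag. A sketch of what one record could look like; the measurement name `skybrightness` and the field names here are illustrative, not necessarily what the DSN pipeline writes:

```python
def to_line_protocol(site: str, sqm: float, lum: float, ts_seconds: int) -> str:
    """Build one InfluxDB line-protocol record.

    Format: measurement,tag_key=tag_value field=value[,field=value] timestamp
    The 'site' tag (DSNnnn-U_SiteName) is what lets InfluxDB route the
    point to the right site-specific Grafana dashboard.
    """
    return f"skybrightness,site={site} sqm={sqm},lum={lum} {ts_seconds}"

line = to_line_protocol("DSN001-A_Tucson", 21.3, 0.00017, 1735689600)
# e.g. 'skybrightness,site=DSN001-A_Tucson sqm=21.3,lum=0.00017 1735689600'
```

Because every point carries the site tag, one InfluxDB bucket can serve all site dashboards with a simple tag filter.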

Step 4

Once each .dat file in DSNdata/NEW has been processed, it is deleted.

Step 5

Each file in DSNdata/BOX is uploaded to the DSNdata/ARCHIVE folder of the Box repository and then deleted from DSNdata/BOX. Archived files are named in the format DSNnnn-U_SiteName_yy.csv. This is intended as the long-term archive of the processed data.
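If the archive name simply drops the trailing batch number (our reading of the yy-nn to yy change; the actual rule may differ), the rename can be sketched as:

```python
import re

def archive_name(csv_name: str) -> str:
    """Map DSNnnn-U_SiteName_yy-nn.csv to DSNnnn-U_SiteName_yy.csv.

    Assumes (our reading of the naming convention) that archiving drops
    the trailing '-nn' batch number; adjust if the actual rule differs.
    """
    return re.sub(r"_(\d{2})-\d+\.csv$", r"_\1.csv", csv_name)
```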

Step 6

A record of the file operations above is written to a running LOG.

Visualizing data

The processed data may be visualized with DSNweb (SHIFT-click to open it in a new window).

Analyzing and downloading data

The visualization web pages include buttons that trigger an analysis of the data (histograms, a heatmap or a jellyfish plot) or a download of the data within the selected time range. The analysis may take over 3 minutes: it runs on GitHub, which communicates with the Grafana visualization engine via InfluxDB on AWS, with Cloudflare relaying the analysis data from the website to GitHub.