Author ORCID Identifier

https://orcid.org/0009-0005-9903-2532

Semester

Fall

Date of Graduation

2023

Document Type

Thesis

Degree Type

MS

College

Statler College of Engineering and Mineral Resources

Department

Mechanical and Aerospace Engineering

Committee Chair

Piyush Mehta

Committee Member

Hang Woon Lee

Committee Member

Jason Gross

Abstract

Commonly utilized space weather indices and proxies drive predictive models for thermosphere density, directly impacting objects in low Earth orbit (LEO) by influencing atmospheric drag forces. A set of solar proxies and indices (drivers), F10.7, S10.7, M10.7, and Y10.7, is created from a mixture of ground-based radio observations and satellite instrument data. These solar drivers represent heating in various levels of the thermosphere and are used as inputs by the JB2008 empirical thermosphere density model. The United States Air Force (USAF) operational High Accuracy Satellite Drag Model (HASDM) relies on JB2008, and on forecasts of solar drivers made by a linear algorithm, to produce forecasts of density. Density forecasts are useful to the space traffic management community and can be used to determine the orbital state and probability of collision for space objects. In this thesis, we aim to provide improved, probabilistic forecasting models for these solar drivers, with a focus on providing the first probabilistic models for S10.7, M10.7, and Y10.7. We introduce autoregressive methods to forecast solar drivers using neural network ensembles with multi-layer perceptron (MLP) and long short-term memory (LSTM) models in order to improve on the current operational forecasting methods. We investigate input data manipulation methods such as backwards averaging, varied lookback, and PCA rotation for multivariate prediction. We also investigate the differences between multi-step and dynamic prediction methods. A novel method for splitting data, referred to as striped sampling, is introduced to produce statistically consistent machine learning data sets. We further investigate the effects of the loss function on forecasting performance and uncertainty estimates, as well as novel ensemble weighting methods. We show that the best models for univariate forecasting are ensemble approaches using multi-step predictions or a combination of multi-step and dynamic predictions.
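The striped sampling idea mentioned above can be sketched in miniature. The stripe length and the cyclic train/validation/test pattern below are illustrative assumptions, not the configuration used in the thesis; the point is only that contiguous stripes of the time series are assigned in a repeating pattern, so each split samples all phases of the solar cycle:

```python
import numpy as np

def striped_split(n_samples, stripe_len=100, pattern=("train", "train", "val", "test")):
    """Assign contiguous stripes of a time series cyclically to splits.

    Hypothetical sketch: stripe_len and the assignment pattern are
    illustrative, not the thesis's actual settings.
    """
    labels = np.empty(n_samples, dtype=object)
    for start in range(0, n_samples, stripe_len):
        stripe_idx = start // stripe_len
        labels[start:start + stripe_len] = pattern[stripe_idx % len(pattern)]
    idx = np.arange(n_samples)
    return {name: idx[labels == name] for name in set(pattern)}

splits = striped_split(1000, stripe_len=100)
# With a 4-entry pattern, half the stripes go to training and one
# quarter each to validation and test.
```

Because the stripes interleave across the whole record, the empirical distributions of the three splits stay close to one another, which is the stated goal of the method.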
Nearly all univariate approaches offer an improvement, with the best models improving relative mean squared error (MSE) by 48% to 59% with respect to persistence, which is used as the baseline model in this work. We also show that a stacked neural network ensemble approach significantly outperforms the operational linear method. When using MV-MLE (multivariate multi-lookback ensemble), we see improvements in performance error metrics over the operational method on all drivers. The multivariate approach also yields improvements in root mean squared error (RMSE) for F10.7, S10.7, M10.7, and Y10.7 of 17.7%, 12.3%, 13.8%, and 13.7%, respectively, over the current operational method. We additionally provide the first probabilistic forecasting models for S10.7, M10.7, and Y10.7. Ensemble approaches are leveraged to provide a distribution of predicted values, allowing an investigation into the robustness and reliability (R&R) of uncertainty estimates using the calibration error score (CES) metric and calibration curves. Univariate models provide uncertainty estimates similar to those of other works while improving on performance metrics. We also produce probabilistic forecasts using MV-MLE, which are well calibrated for all drivers, providing an average CES of 5.63%.
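Since persistence is the stated baseline, the relative-MSE improvement figures above can be illustrated with a short sketch. The function name, the `horizon` parameter, and the alignment convention are assumptions for illustration only; persistence here simply repeats the value observed `horizon` steps earlier:

```python
import numpy as np

def relative_mse_improvement(y_true, y_pred, horizon=1):
    """Percent MSE improvement of a forecast over the persistence baseline.

    Illustrative sketch: persistence predicts y(t) = y(t - horizon).
    Returns 100 * (1 - MSE_model / MSE_persistence), so a perfect
    forecast scores 100% and matching persistence scores 0%.
    """
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    persistence = np.roll(y_true, horizon)[horizon:]  # y(t - horizon)
    truth = y_true[horizon:]
    mse_model = np.mean((y_pred[horizon:] - truth) ** 2)
    mse_persist = np.mean((persistence - truth) ** 2)
    return 100.0 * (1.0 - mse_model / mse_persist)
```

Under this convention, the thesis's reported 48% to 59% improvements mean the best models roughly halve the squared forecast error relative to repeating the last observed driver value.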

Comments

Revision which addressed the word "by" on the title and abstract pages.
