Many numerical astrodynamics analyses are characterized by a large input space with dispersions on those inputs. They also require numerical integration to propagate orbital trajectories, as well as the spacecraft attitude and actuator states, forward in time. Often, Monte Carlo simulations are used, where each sample point is propagated numerically. These features all contribute to long Monte Carlo simulation times. Furthermore, the underlying input-output relationships are nonlinear, with many variables interacting with one another. Hence, it is difficult to study how the output responses behave as a function of the inputs, as that requires testing a wide range of input values. Traditional methods of varying one factor at a time and re-running the whole simulation each time are excessively time consuming. Varying one factor at a time also means the end user of the simulation's results cannot be certain that the full range of possible input values has been captured. The aim of this paper is to adapt a method for astrodynamics simulations from industrial statistics and empirical modeling, to achieve the following outcomes: 1) significantly reduce the run time of large-scale Monte Carlo simulations; 2) ensure the simulation covers a wider range of values and worst-case scenarios in significantly fewer runs than required under standard Monte Carlo methods; 3) increase the efficiency of sensitivity analysis and optimization by using a fast, computationally cheap approximate model of the simulation, thus avoiding the need to re-run the simulation to test the effect of alternative input values on the output. To achieve outcomes 1-3, we propose to adapt the techniques of Design & Analysis of Computer Experiments (DACE) to astrodynamics simulation and illustrate them with two case studies.
To be presented at the AAS/AIAA Astrodynamics Specialist Conference, August 2018, AAS/AIAA 18-296
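The space-filling design at the heart of a DACE approach can be illustrated with a short sketch. The code below generates a Latin hypercube design in pure Python: each input dimension is split into as many strata as there are runs, and exactly one point is drawn per stratum, so a handful of runs covers the full input range. The two input ranges (a semi-major axis in km and an eccentricity dispersion) and the function name are illustrative assumptions, not values from the paper.

```python
import random

def latin_hypercube(n_samples, bounds, seed=0):
    """Generate a Latin hypercube design: each input dimension is divided
    into n_samples equal strata and exactly one point is drawn per stratum,
    guaranteeing coverage of the full input range with few runs."""
    rng = random.Random(seed)
    dims = len(bounds)
    design = []
    for lo, hi in bounds:
        # one random point inside each of the n_samples strata, then shuffle
        col = [lo + (hi - lo) * (i + rng.random()) / n_samples
               for i in range(n_samples)]
        rng.shuffle(col)
        design.append(col)
    # transpose the columns into a list of sample points
    return [tuple(design[d][i] for d in range(dims)) for i in range(n_samples)]

# e.g. 10 runs over two dispersed inputs (hypothetical ranges)
pts = latin_hypercube(10, [(6778.0, 7078.0), (0.0, 0.05)])
```

Each of the 10 points can then be propagated once through the full simulation, and a cheap surrogate fitted to the results for sensitivity analysis or optimization.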
Design & Development of an Optimized Sensor Scheduling & Tasking Programme for Tracking Space Objects
The Industrial Sciences Group (ISG) and SERC (Research Programme 3) collaborated on the development & analysis of a mathematical model and software to optimize sensor scheduling and tasking for a network of sensors over a 24-hour look ahead horizon.
The program uses a catalogue of space objects as inputs, with initial state vectors and covariances, and a set of active and passive optical sensors across Australia. Each object is propagated forward using an Unscented Kalman filter over a 24-hour period to determine visible passes that may be tracked by each sensor. The program outputs a schedule for each sensor detailing the tracking times of selected objects over a 24-hour period.
Object selection is made using an Information Gain criterion (Renyi Divergence) that compares the reduction in each object’s covariance that can be achieved by making observations during an object’s visible pass. Each sensor simulates multiple observations with 5-second spacings for each visible object over a 2-minute assignment window. An auction-style algorithm is used to select a single (unique) object for each sensor over the assignment window to maximize the total information gain.
The program now accounts for asynchronous assignment windows, computes the cumulative information gain from multiple observations, scales the information gain for high-priority targets, and applies constraints on laser measurements. The program and algorithms were successfully tested by SERC on a catalogue of 2000 objects with 6 sensors.
The success of this project required a multi-disciplinary team with expertise in astrodynamics, statistics, information theory and software engineering.
To be presented at the International Workshop on Space Debris Management and Mitigation, Canberra, November 2018
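The selection logic described above can be sketched in simplified form. Here the information gain is approximated as half the log-determinant ratio of prior to posterior covariance (a diagonal-covariance stand-in for the Renyi divergence criterion), and a greedy pass stands in for the auction algorithm; the function names and numbers are illustrative assumptions.

```python
import math

def info_gain(prior_var, post_var):
    """Gain from observing an object, sketched as half the log-determinant
    ratio of prior to posterior covariance (diagonal approximation; the
    program's actual criterion is a Renyi divergence)."""
    return 0.5 * sum(math.log(p / q) for p, q in zip(prior_var, post_var))

def assign_objects(gain):
    """Greedy stand-in for the auction algorithm: repeatedly pick the
    highest remaining (sensor, object) gain, enforcing a unique object
    per sensor over the assignment window."""
    pairs = sorted(((g, s, o) for s, row in enumerate(gain)
                    for o, g in enumerate(row)), reverse=True)
    used_s, used_o, schedule = set(), set(), {}
    for g, s, o in pairs:
        if s not in used_s and o not in used_o:
            schedule[s] = o
            used_s.add(s)
            used_o.add(o)
    return schedule
```

A true auction algorithm iterates bids until prices stabilize and can escape the locally greedy choice, but the interface (a gain matrix in, a unique sensor-to-object schedule out) is the same.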
Lunar X Prize teams are competing to be the first non-governmental spacecraft to soft-land on the Moon. All the teams have small budgets that severely restrict mission designers. Hence it is necessary to rely heavily on historical data analysis and simulation to characterize and quantify the expected performance of mission components. Statistical methods such as Exploratory Data Analysis (EDA), Time Series Analysis and Design & Analysis of Computer Experiments (DACE) are ideally suited to the task of delivering maximum information on the operating windows of expected performance at minimum cost. A case study is presented from a Lunar X Prize team (SpaceIL) using statistical methods to characterize the expected performance of the Universal Space Network (USN) tracking stations to be used in the mission, using residuals data from the NASA Lunar Reconnaissance Orbiter (LRO) mission. A moving-window Time Series method was used to model the occurrence and duration of jumps in residuals. A feature of our method is the ability to isolate transient signals (e.g. jumps) from the usual noise for improved characterization of tracking performance. The EDA process revealed features such as a bimodal distribution of data at some stations, and periodic patterns in the autocorrelation between residual values by day and by pass. These actual tracking performance measures will be used as inputs to a simulation tool for performance analysis of SpaceIL's orbit determination capabilities. To maximize the information from the minimum number of simulation runs we outline the use of statistical DACE, a method adapted from industrial experiments that is highly efficient at determining input/output functional relationships in complex multivariate systems. The case study indicates a way forward for increased use of statistical tools and approaches in Mission Design and Analysis, by adapting methods from other disciplines such as econometrics and industrial experimentation.
AAS/AIAA Astrodynamics Specialist Conference, August 2017, AAS/AIAA 17-664
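A minimal sketch of a moving-window jump detector in the spirit of the residuals analysis above (the paper's exact method may differ): a point is flagged as a jump when it deviates from the trailing window's median by more than k robust standard deviations, with the scale estimated from the median absolute deviation (MAD) so that the jumps themselves do not inflate the noise estimate.

```python
import statistics

def detect_jumps(residuals, window=10, k=5.0):
    """Flag indices where a residual departs from the trailing window's
    median by more than k robust standard deviations. The MAD-based scale
    (MAD * 1.4826 approximates sigma for Gaussian noise) keeps transient
    signals from contaminating the noise estimate."""
    jumps = []
    for i in range(window, len(residuals)):
        w = residuals[i - window:i]
        med = statistics.median(w)
        mad = statistics.median(abs(x - med) for x in w) or 1e-12
        if abs(residuals[i] - med) > k * 1.4826 * mad:
            jumps.append(i)
    return jumps
```

On a synthetic series of small alternating noise with a single large excursion, only the excursion is flagged, which is the separation of transient signal from usual noise described above.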
The process of calibrating and validating all types of transport simulation models (micro, meso or macro) is a labour intensive and costly process. There have been a number of theoretical advances in methods for efficient testing and running of simulation experiments that could be applied to the calibration and validation process. However, they have not yet penetrated actual transport modelling or simulation practice.
Our detailed review of the transport research literature showed that many of the state-of-the-art methods for validation and calibration are focused on the following: improving the Origin-Destination (OD) matrix estimation process; optimization algorithms to automate calibration; and Sensitivity Analysis on the input-output relationships of the model's parameters. While these methods are scientifically correct, they are complex to understand and apply in practice, as the modeller needs an advanced level of mathematics to use them. Our consultation with simulation practitioners and our own experience in implementing changes to modelling practice [Shteinman 2010, 2011, 2012] led us to expect a low probability of acceptance of these methods by practitioners.
Shteinman, D (2014) "Two Methods to Improve the Quality and Reliability of Calibrating & Validating Simulation Models", Road & Transport Research, Vol. 23, No. 3, September 2014, pp. 65-78
Motorway owners and operators are now using improvements in travel time reliability (a reduction in travel time variability) as an added benefit of motorway use and a key performance indicator (KPI). This paper introduces a novel use of the Fundamental Diagram (FD) to monitor travel times and performance on a motorway. The methodology is tested in a case study on a new motorway, in collaboration with Leighton Contractors. KPIs on travel times are to be used to financially incentivise the operator to maintain a high level of travel time reliability. The KPI definitions that were initially proposed were based on criteria that would have led to penalties being applied even under normal operation. Hence a more robust and equitable approach to KPI calculation was required.
We report on the design and development of such an approach, which has been accepted by all stakeholders in the project. It is based on the Fundamental Diagram, which is used to define a baseline for equilibrium traffic conditions. Calculation of the Fundamental Diagram is based on a recently proposed model. We then construct an envelope around the FD. The boundaries of the envelope signify normal or "healthy" traffic conditions. Travel time KPI penalties are only triggered for traffic flow data points that fall outside the agreed "healthy" flow envelope.
Shteinman, D, Lazarov, Z, Daly, B (2014) "Using The Fundamental Diagram To Monitor Travel Times And Motorway Performance: Methodology & Case Study", 26th Australian Road Research Board Conference
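The envelope idea above can be sketched as follows, assuming a simple triangular flow-density fundamental diagram and a relative tolerance band. The FD parameters and the envelope width are illustrative assumptions, not the values calibrated in the project.

```python
def fd_flow(k, vf=100.0, w=20.0, kj=150.0):
    """Triangular fundamental diagram (illustrative parameters):
    flow in veh/h at density k veh/km, with free-flow speed vf,
    congested wave speed w, and jam density kj."""
    return min(vf * k, w * (kj - k))

def healthy(k, q, tol=0.2):
    """A (density, flow) observation is 'healthy' if it lies inside a
    relative envelope of width tol around the FD curve. Travel time KPI
    penalties would only be considered for points outside the envelope."""
    q_ref = fd_flow(k)
    return abs(q - q_ref) <= tol * max(q_ref, 1e-9)
```

A detector reading near the equilibrium curve (e.g. 1050 veh/h at 10 veh/km against a reference of 1000) stays inside the envelope, while a reading far below it signals abnormal conditions worth assessing against the KPI.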
The main benefit to engineering practice of using industrial mathematics is in the application of mathematical or statistical thinking to a problem. Industrial mathematics is the field that explores industrial processes and systems, seeks the mathematical or statistical components to explain the underlying structure of a process, and then works to improve or optimise the process under study. Industrial mathematics, when combined with engineering expertise in a specific domain, deals with complicated or complex problems like optimising the calibration schedule of a nuclear reactor, or modelling and predicting the leakage of sequestered CO2 in an offshore gas field. Industrial mathematics has been called a doubly invisible discipline: It is invisible to industry as companies often label the activity of mathematically trained staff as something else, such as modelling, analytics or simply “research”. It is invisible to the academy as university mathematicians do not widely teach industrial mathematics as a specific standalone discipline.
Shteinman, D (2012) "Industrial Mathematics", Engineers Australia Magazine, October 2012
Engineers responsible for road traffic engineering should consider the use of industrial mathematics to solve complex problems, particularly where conventional software programs are not available or do not work.
At the Australian Centre for Commercial Mathematics (ACCM) we recently worked with the NSW Roads and Traffic Authority, now Roads and Maritime Services (RMS), to develop a statistical framework to guide traffic simulation studies. Due to the huge expense of designing and building new road infrastructure or testing alternative traffic scenarios, all design changes are first assessed in micro-simulation, the simulation of individual vehicles in a traffic system. In collaboration with RMS, ACCM identified a need for rigorous statistical analysis of the outputs of micro-simulation. The main objective was to increase confidence in the results of simulation models.
Shteinman, D (2012) "Mathematics for Traffic Engineering", Engineers Australia Magazine (Civil), November 2012
The current valley upsidence and closure prediction methods were first published in 2002, following completion of ACARP Research Projects C8005 and C9067. These methods use conservative empirical prediction curves, which were drawn above all of the observed upsidence and closure data. The data was extracted from subsidence surveys that had been carried out in valleys at most of the collieries in the Southern Coalfield. Little site-specific surface geological data was available at that time for the monitored valley sites and the considerable scatter that existed under the prediction curves indicated that many factors probably influenced the extent of the upsidence and closure movements.
Extensive monitoring has been carried out in valleys since 2002 and the observed upsidence and closure movements have shown that the current methods for predicting upsidence and closure movements are predominantly conservative. Reviews of the few exceedance cases have indicated that various local geology and landform factors at these monitored sites may have also influenced the magnitude of the observed upsidence or closure movements.
ACARP Research Project C18015, which commenced in 2009, seeks to improve the accuracy of upsidence and closure predictions and impact assessments, by collecting and studying additional upsidence and closure data and gathering geological and topographical data at all previously monitored valley sites. This additional information is being used to develop a more comprehensive database of valley-related ground movements. Studies based on the increased quantity and quality of data are progressing well, with a view to developing a revised prediction method, based on multivariate statistical analyses, with the assistance of the Centre of Excellence for Mathematics and Statistics of Complex Systems (MASCOS) at the University of New South Wales. This research project is to be completed later this year and this paper presents the background to the project and some of the preliminary findings.
Triennial Conference of the Institution of Engineers Mine Subsidence Technical Society, Pokolbin, NSW, May 2011
Ground strain comprises two components, normal strain and shear strain, which can be interrelated using Mohr's Circle. The magnitudes of the normal strain and shear strain components are, therefore, dependent on the orientation in which they are measured. The maximum normal strains, referred to as the principal strains, are those in the direction where the corresponding shear strain is zero.
Normal strains along ground monitoring lines can be measured using 2D and 3D techniques, by taking the change in horizontal distance between two marks on the ground and dividing by the original horizontal distance between them. This provides the magnitude of normal strain along the orientation of the monitoring line and, therefore, this strain may not necessarily be the maximum (i.e. principal) normal strain. It should be noted that observed strains are dependent on the method of measurement, including the orientation of the monitoring lines, the spacing of the survey marks and the survey tolerance.
Triennial Conference of the Institution of Engineers Mine Subsidence Technical Society, Pokolbin, NSW, May 2011
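The relationships above admit a compact sketch. The strain-transformation formulas are the standard Mohr's circle results; the helper names and the example values are illustrative.

```python
import math

def normal_strain(d_before, d_after):
    """Normal strain along a monitoring line: change in horizontal
    distance between two survey marks divided by the original distance."""
    return (d_after - d_before) / d_before

def line_strains(e1, e2, theta_deg):
    """Mohr's circle transformation: normal strain and engineering shear
    strain measured along a line at angle theta (degrees) to the major
    principal strain direction. e1, e2 are the principal strains, i.e.
    the normal strains in the orientation where shear strain is zero."""
    t = math.radians(theta_deg)
    mean = (e1 + e2) / 2.0
    half_diff = (e1 - e2) / 2.0
    normal = mean + half_diff * math.cos(2.0 * t)
    shear = -2.0 * half_diff * math.sin(2.0 * t)
    return normal, shear
```

At theta = 0 the measured normal strain recovers the major principal strain e1 with zero shear; at any other orientation the monitoring line under-reads the principal strain, which is why observed strains depend on line orientation.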
This paper describes a statistical framework to be used in a set of simulation and modelling guidelines. The framework was developed out of the need for defensible study results through the application of rigorous statistics. It is the result of a collaborative RTA-ACCM research project and provides a framework for analyzing simulation outputs, and also informs the design stage of a simulation study. The project was based on an analysis of thirty-six PARAMICS models supplied by the RTA. The guidelines apply Exploratory Data Analysis (EDA) techniques to the design and analysis of traffic micro-simulations, and include the graphing of output distributions to expose salient features and rigorous methods to detect and handle outliers in output data.
The framework includes methods to quantify and correct biases that result from the phenomenon of unreleased vehicles or incomplete trips. Diagnostic tests are described for discriminating between running error, model error and extreme sensitivity to congested conditions.
A model to predict the run cost of a simulation as a function of critical network features is also described. A regression model was built, based on an index of model complexity and the combination of critical input factors. The regression model was limited by the small sample size of models used (36). Further research is continuing on cost prediction, using a larger set of simulation outputs and model types. The guidelines developed have provided value to RTA traffic control modelling practice and can be used by simulation modellers regardless of the micro-simulation package used.
Shteinman, D, Chong-White, C, Millar, G (2011) "Advanced development of a statistical framework to guide traffic simulation studies", Proceedings of the 34th Australasian Transport Research Forum, Adelaide
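The run-cost model described above is a regression; a minimal single-predictor ordinary least squares fit can be sketched as below. The actual model combined an index of model complexity with several critical input factors, so treat this as a one-variable illustration with hypothetical names.

```python
def fit_linear(x, y):
    """Ordinary least squares for a single predictor (e.g. a network
    complexity index) against run cost; returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # slope = covariance(x, y) / variance(x); intercept passes through means
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    return my - b * mx, b
```

With only 36 models in the sample, the confidence intervals on such a fit are wide, which is the limitation noted above and the reason further research uses a larger set of simulation outputs.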
We present the results of a collaborative RTA-MASCOS research project that is working to provide (1) a methodological framework for analyzing simulation outputs, and (2) a framework to inform the design stage of a simulation study. The project aims to improve the statistical rigor and defensibility of study results. We present a high-level update of the project results that indicates the contribution this project will make to the design, planning and evaluation of traffic simulation studies. We adapt exploratory data analysis (EDA) techniques, traditionally used in industrial quality control, to the analysis and design of traffic micro-simulations. This includes graphing the output distributions to expose their salient features, and screening the data for errors, missing values and, most importantly, outliers. Outlier analysis is used as a diagnostic tool to distinguish between model errors and genuine rare events. The salient features of the data revealed by EDA are used to build a functional relationship between changes in the complexity of simulated network features, the range of confidence intervals, precision and simulation run size.
Shteinman, D, Clarke, S, Chong-White, C, Millar, G and Johnson, F (2010) "Development of a statistical framework to guide traffic simulation studies", Proceedings of the 17th ITS World Congress, Busan
In a recent edition of the Gazette, Graeme Wake wrote of the success of industrial mathematics as something that, like the crest of a wave, is about to ‘break through’. In this article I would like to inform AustMS members of projects already underway in industrial mathematics and statistics through MASCOS (the Australian Research Council's Centre of Excellence for Mathematics and Statistics of Complex Systems). To extend the nautical metaphor: we are surfing down a wave now, and there are more waves coming!
Since 2008 MASCOS has conducted 14 projects in industry. Industry sectors include transport (NSW Roads and Traffic Authority, and VicRoads), defence (Defence Science and Technology Organisation), coal mining (MSEC Consulting Group), medical devices (Cochlear), mental health (the Mental Health Research Institute) and nuclear science (ANSTO). For project details see the MASCOS annual reports at www.complex.org.au.
MASCOS projects are not like typical consulting projects, where a specific problem is solved using existing mathematics, and recommendations are made. Rather, each project starts with an open problem set by the client. For example ‘design a new traffic control system to reduce congestion’, or ‘design a statistical model to predict the cost of road network simulations based on network complexity’ or ‘propose a new theoretical framework to improve the confidence in risk modelling of ground movement due to underground mining’. These projects require original applied research in a combination of mathematics, statistics and engineering to fill the gap identified by the open problem. Hence, in addition to the commercial value to the industry client, each project has research value to the professional mathematician or statistician. Research areas covered include statistical mechanics of non-equilibrium systems, extreme value theory, classification of high-dimensional data, risk modelling and more.
Shteinman, D (2010) "Industrial mathematics: Here and now . . . positive in all directions", Gaz. Aust. Math. Soc., 37, 213-219