
Published Papers

Space Traffic Management

Decision Support Tool for Risk Assessment & Maneuver Planning in Collision Avoidance

As the quantity of orbital debris continues to grow, so too does the rate of conjunction messages that suggest possible collisions between high-value payloads and debris. The abundance of these conjunction messages, and eventual misses, has led to a culture of ignored alerts, and an increase in satellite operation costs as a result of the frequent need to plan resources for maneuver planning and execution. The loss of “trust” in conjunction alerts is due to the poorly characterized evolution of the probability of collision (Pc) as time approaches the time of closest approach (TCA) between two objects, as well as the interpretation of Pc in the context of maneuver planning. To address these problems, and in collaboration with the NASA Conjunction Assessment Risk Analysis (CARA) program, the Industrial Sciences Group has developed a novel Maneuver Decision Support System (MDSS) to assist satellite operators in conjunction assessment and maneuver planning. It provides a meaningful and intuitive Urgency metric for actionable maneuver decisions, based on the physical dynamics of conjunctions. It is based on a forecast of the evolution of Pc over time and represents an advance over current methods in use for satellite conjunction monitoring and planning. The result gives satellite operators a validated decision support system to plan for maneuver execution, mitigation or monitoring up to three days before TCA.
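The Urgency metric and Pc forecast are the paper's contribution and are not reproduced here. For background only, a minimal sketch of the standard short-encounter 2D Pc integral that such tools build on: the relative-position density at TCA is treated as a bivariate Gaussian in the encounter plane and integrated over the combined hard-body circle (the numbers in the example are illustrative).

```python
import numpy as np

def pc_2d(miss, cov, hbr, n=400):
    # miss: relative position at TCA, projected into the encounter plane (m)
    # cov:  2x2 combined position covariance in that plane (m^2)
    # hbr:  combined hard-body radius (m)
    xs = np.linspace(-hbr, hbr, n)
    dx = xs[1] - xs[0]
    X, Y = np.meshgrid(xs, xs)
    inside = X**2 + Y**2 <= hbr**2          # grid points within the hard-body disc
    u, v = X - miss[0], Y - miss[1]         # offsets from the mean miss vector
    ci = np.linalg.inv(cov)
    quad = ci[0, 0]*u**2 + 2*ci[0, 1]*u*v + ci[1, 1]*v**2
    pdf = np.exp(-0.5*quad) / (2*np.pi*np.sqrt(np.linalg.det(cov)))
    return float(np.sum(pdf[inside]) * dx * dx)   # Riemann sum over the disc

# Example: 50 m miss, 200 m x 100 m (1-sigma) covariance, 20 m combined radius
print(f"Pc = {pc_2d(np.array([50.0, 0.0]), np.diag([200.0**2, 100.0**2]), 20.0):.2e}")
```

Forecasting how this quantity evolves as new tracking updates arrive before TCA is the harder problem the MDSS addresses.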

Presented at the Advanced Maui Optical and Space Surveillance Technologies (AMOS) Conference, Hawaii, September 2022. [Presentation]

Space Situational Awareness

Design & Development of an Optimized Sensor Scheduling & Tasking Programme for Tracking Space Objects

The Industrial Sciences Group (ISG) and the Space Environment Research Centre (SERC) collaborated on the development & analysis of a mathematical model and software to optimize sensor scheduling and tasking for a network of sensors monitoring objects in orbit around the Earth. The program generates a schedule that maximises the utilisation of the total sensor network. The optimised schedule also reduces sensor idle time and automatically maintains an object catalog. The success of this project required a multi-disciplinary team with expertise in astrodynamics, statistics, information theory and software engineering.
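The abstract does not spell out the optimisation model, so the following is only a schematic of the tasking idea: a greedy pass that keeps each sensor busy on the highest-value visible object. The `visibility` and `value` inputs are assumptions for the sketch, not the project's data structures.

```python
def greedy_schedule(visibility, value):
    """Greedy sensor-tasking sketch (illustrative, not the ISG/SERC optimiser).

    visibility: dict mapping (sensor, time_slot) -> set of visible object ids
    value:      dict mapping object id -> tasking priority, e.g. expected
                information gain or time since the last catalogue update
    """
    observed = set()
    schedule = {}
    for key in sorted(visibility, key=lambda st: st[1]):  # walk slots in time order
        candidates = visibility[key] - observed
        if candidates:                                    # reduce sensor idle time
            best = max(candidates, key=lambda o: value.get(o, 0.0))
            schedule[key] = best
            observed.add(best)          # simplification: one observation per object
    return schedule
```

A real scheduler would re-task objects as their orbit uncertainty grows; the one-observation-per-object rule above only keeps the sketch short.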

Presented at the Advanced Maui Optical and Space Surveillance Technologies (AMOS) Conference, Hawaii, September 2019. [Poster]

Also presented at the International Workshop on Laser Ranging (IWLR), Australia, November 2018. [Poster]

GNSS & Positioning

Use of advanced Kalman filtering and statistical techniques for error correction and positioning accuracy in Geoscience Australia’s Ginan software toolkit

The ‘Ginan’ software toolkit developed by Geoscience Australia delivers corrections to GNSS observations, allowing end-users to achieve position accuracies of under 5 cm in real time using precise point positioning (PPP). Ginan can be used to generate corrections or calculate user positions, in both real-time and post-processing modes.

The Industrial Sciences Group has been actively involved in the software development of Ginan, applying advanced Kalman filtering and statistical techniques to the toolkit. Added features include Rauch-Tung-Striebel (RTS) smoothing, fixed-lag smoothing, outlier detection and exclusion, first-order Gauss-Markov (FOGM) modelling and Joseph stabilisation. ISG has also enabled Ginan to process satellite laser ranging (SLR) observations alongside GNSS data. These features allow Ginan to generate more accurate products and corrections, as well as improving the accuracy and stability of user positions.
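Ginan is open source, so its repository is the authoritative implementation; purely as a generic reminder of the first technique listed above, an RTS smoother runs a backward pass over stored Kalman filter outputs (NumPy sketch, not Ginan code; a constant transition matrix is assumed for brevity):

```python
import numpy as np

def rts_smooth(x_f, P_f, x_p, P_p, F):
    """Generic Rauch-Tung-Striebel backward pass.

    x_f, P_f: filtered states and covariances at each epoch
    x_p, P_p: one-step predicted states and covariances (x_p[k] predicts epoch k)
    F:        state transition matrix (assumed constant here)
    """
    n = len(x_f)
    x_s, P_s = list(x_f), list(P_f)       # last epoch: smoothed = filtered
    for k in range(n - 2, -1, -1):
        C = P_f[k] @ F.T @ np.linalg.inv(P_p[k + 1])      # smoother gain
        x_s[k] = x_f[k] + C @ (x_s[k + 1] - x_p[k + 1])
        P_s[k] = P_f[k] + C @ (P_s[k + 1] - P_p[k + 1]) @ C.T
    return x_s, P_s
```

The Joseph stabilisation also mentioned above refers to the measurement-update form P = (I - KH) P (I - KH)^T + K R K^T, which preserves covariance symmetry and positive semi-definiteness in finite precision.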

ISG has also extended Ginan’s compatibility to the latest data standards and formats, including RINEX 4, the Orbit Exchange Format (Orbex), and PPP correction messages delivered via an NTRIP caster in the standard RTCM3 format. These corrections will include clocks, orbits and biases, for multi-GNSS constellations including GPS, GLONASS, Galileo, Beidou and QZSS. By applying these corrections in real-time PPP processing, users can achieve position accuracies of under 5 cm.

Presented at the Locate22 Conference, Australia, May 2022. [Presentation]

Also presented at the IGNSS2022 Conference, Australia, December 2022. [Presentation + Industry Spotlight]

Astrodynamics and Simulation

Statistical Approaches To Increase Efficiency Of Large-Scale Monte-Carlo Simulations

Many numerical astrodynamics analyses are characterized by a large input space with dispersions on those inputs. They also require numerical integration to propagate orbital trajectories, as well as the spacecraft attitude and actuator states, forward in time. Often, Monte Carlo simulations are used, where each sample point is propagated numerically. These features all contribute to long Monte Carlo simulation times. Furthermore, the underlying input-output relationships are nonlinear, with many variables interacting with one another. Hence, it is difficult to study the behavior of simulated output responses as a function of the inputs, as that requires testing a wide range of input values. Using traditional methods of varying one factor at a time and re-running the whole simulation each time is excessively time consuming. Varying one factor at a time also means the end user of the simulation’s results cannot be certain they have captured the full range of possible input values. The aim of this paper is to adapt a method for astrodynamics simulations from industrial statistics and empirical modeling, to achieve the following outcomes: 1) significantly reduce the run time of large-scale Monte Carlo simulations; 2) ensure the simulation covers a wider range of values and worst-case scenarios in significantly fewer runs than required under standard Monte Carlo methods; 3) increase the efficiency of sensitivity analysis and optimization by using a fast, computationally cheap approximate model of the simulation, thus avoiding the need to re-run the simulation to test the effect of alternate input values on the output. To achieve outcomes 1-3, we propose to adapt the techniques of Design & Analysis of Computer Experiments (DACE) to astrodynamics simulation and illustrate the approach with two case studies.
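The abstract names DACE but not a specific construction; as a minimal illustration of the workflow, a space-filling Latin hypercube design followed by a cheap Gaussian-process surrogate. The toy simulator, bounds, dimension and sample size are placeholders, not values from the paper.

```python
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Space-filling design: 64 Latin hypercube samples over a 3-input space.
sampler = qmc.LatinHypercube(d=3, seed=1)
X = qmc.scale(sampler.random(n=64),
              l_bounds=[0.0, -1.0, 100.0],
              u_bounds=[1.0, 1.0, 500.0])

def expensive_sim(x):
    # Stand-in for a numerically propagated trajectory/attitude run
    return np.sin(3 * x[0]) + 0.5 * x[1] ** 2 + 1e-3 * x[2]

y = np.array([expensive_sim(x) for x in X])

# Cheap surrogate ("emulator") fitted to the design. Sensitivity analysis
# and optimization then query the surrogate instead of re-running the
# simulator for every candidate input.
gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(),
                              normalize_y=True)
gp.fit(X, y)
mean, std = gp.predict(X[:5], return_std=True)  # predictions with uncertainty
```

A single space-filling design of modest size replaces both the one-factor-at-a-time sweeps and a large share of the Monte Carlo runs.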

Presented at the AAS/AIAA Astrodynamics Specialist Conference, August 2018, AAS/AIAA 18-296

Ground System Tracking and Mission Analysis

Metric Tracking Data Analysis – Diagnosing Anomalies in Tracking Data for Improved Orbit Determination & Ground Station Performance: Case Studies from Three Lunar Missions

To enable spacecraft missions to perform their own quality checks and diagnostics of tracking systems and data for accurate orbit determination and navigation, the Industrial Sciences Group (ISG), in collaboration with Space Exploration Engineering (SEE), has developed a set of analytics and software tools for the detection and diagnosis of anomalies in radiometric tracking data. Metric Tracking Data Analysis (MTDA) combines modern statistical and graphical tools with selected Kalman filter parameters to perform causal analysis of tracking data anomalies, characterize ground station performance, and pre-process residuals. It also determines correlations between tracking data behaviour and satellite and orbit parameters. We provide case studies from three lunar missions.
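The MTDA diagnostics themselves are not detailed in the abstract. One standard example of combining filter by-products with statistical screening, offered here only as an illustration, is the normalised innovation squared (NIS) test on measurement residuals:

```python
import numpy as np

def nis_flags(innovations, innovation_covs, threshold=9.21):
    """Flag anomalous tracking measurements via the NIS chi-square test.

    innovations:     sequence of Kalman filter residual vectors (nu)
    innovation_covs: matching innovation covariance matrices (S)
    threshold:       chi-square 99% point; 9.21 applies to 2-D measurements
    """
    flags = []
    for nu, S in zip(innovations, innovation_covs):
        nis = float(nu @ np.linalg.solve(S, nu))   # nu^T S^-1 nu
        flags.append(nis > threshold)
    return flags
```

Clusters of flagged passes at a single ground station, rather than isolated hits, point toward a station-side cause, which is the kind of causal separation MTDA targets.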

Presented at the AAS/AIAA Astrodynamics Specialist Conference, August 2023 [Presentation].

Use of advanced statistical techniques for mission analysis: Case study from a Google Lunar X Team

Lunar X Prize teams are competing to be the first to soft-land a non-governmental spacecraft on the Moon. All teams operate on small budgets that severely restrict mission designers. Hence it is necessary to rely heavily on historical data analysis and simulation to characterize and quantify the expected performance of mission components. Statistical methods such as Exploratory Data Analysis (EDA), Time Series Analysis and Design & Analysis of Computer Experiments (DACE) are ideally suited to the task of delivering maximum information on the operating windows of expected performance at minimum cost. A case study is presented from a Lunar X Prize team (SpaceIL) using statistical methods to characterize the expected performance of the Universal Space Network (USN) tracking stations to be used in the mission, using residuals data from the NASA Lunar Reconnaissance Orbiter (LRO) mission. A moving-window time series method was used to model the occurrence and duration of jumps in residuals. A feature of our method is the ability to isolate transient signals (e.g. jumps) from the usual noise for improved characterization of tracking performance. The EDA process revealed features such as bimodal distributions of data at some stations, and periodic patterns in the autocorrelation between residual values by day and by pass. These actual tracking performance measures will be used as inputs to a simulation tool for performance analysis of SpaceIL’s orbit determination capabilities. To maximize the information from the minimum number of simulation runs, we outline the use of statistical DACE, a method adapted from industrial experiments that is highly efficient at determining input/output functional relationships in complex multivariate systems. The case study indicates a way forward for increased use of statistical tools and approaches in Mission Design and Analysis, by adapting methods from other disciplines such as econometrics and industrial experimentation.
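The paper's moving-window method is specified in the full text; as a rough stand-in illustrating the idea of separating transient jumps from the usual noise, a rolling median/MAD screen over the residual series (window length and threshold are illustrative tuning choices, not the paper's values):

```python
import numpy as np

def flag_jumps(resid, window=50, k=5.0):
    # Flag residuals more than k robust sigmas away from the local level.
    resid = np.asarray(resid, dtype=float)
    flags = np.zeros(resid.size, dtype=bool)
    for i in range(resid.size):
        seg = resid[max(0, i - window):i + 1]
        med = np.median(seg)
        mad = np.median(np.abs(seg - med)) or 1e-12   # guard against zero MAD
        flags[i] = abs(resid[i] - med) > k * 1.4826 * mad
    return flags
```

Runs of consecutive flags then give the occurrence and duration of each jump, which is what feeds the tracking-performance characterization.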
Presented at the AAS/AIAA Astrodynamics Specialist Conference, August 2017, AAS/AIAA 17-664

Transport

Two Methods to Improve the Quality and Reliability of Calibrating & Validating Simulation Models 

The process of calibrating and validating all types of transport simulation models (micro, meso or macro) is labour-intensive and costly. There have been a number of theoretical advances in methods for efficient testing and running of simulation experiments that could be applied to the calibration and validation process. However, they have not yet penetrated actual transport modelling and simulation practice.

Our detailed review of the transport research literature showed that many of the state-of-the-art methods for validation and calibration are focused on the following: improving the Origin-Destination (OD) matrix estimation process; optimization algorithms to automate calibration; and sensitivity analysis on the input-output relationships of the model’s parameters. While these methods are scientifically correct, they are complex to understand and apply in practice, as the modeller needs an advanced level of mathematics to use them. Our consultation with simulation practitioners and our own experience in implementing changes to modelling practice [Shteinman 2010, 2011, 2012] lead us to expect a low probability that practitioners will accept these methods.
Shteinman, D (2014) “Two Methods to Improve the Quality and Reliability of Calibrating & Validating Simulation Models”, Road & Transport Research, Vol. 23, No. 3, September 2014, pp. 65-78

Using the fundamental diagram to monitor travel times and motorway performance: Methodology and case study

Motorway owners and operators are now using improvements in travel time reliability (a reduction in travel time variability) as an added benefit of motorway use and a key performance indicator (KPI). This paper introduces a novel use of the Fundamental Diagram (FD) to monitor travel times and performance on a motorway. The methodology is tested in a case study, in collaboration with Leighton Contractors, on a new motorway. KPIs on travel times are to be used to financially incentivise the operator to maintain a high level of travel time reliability. The KPI definitions that were initially proposed were based on criteria that would have led to penalties being applied even under normal operation. Hence a more robust and equitable approach to KPI calculation was required.

We report on the design and development of such an approach, which has been accepted by all stakeholders in the project. It is based on the Fundamental Diagram, which is used to define a baseline for equilibrium traffic conditions. Calculation of the Fundamental Diagram is based on a recently proposed model. We then construct an envelope around the FD. The boundaries of the envelope signify normal or “healthy” traffic conditions. Travel time KPI penalties are only triggered for traffic flow data points that fall outside the agreed “healthy” flow envelope.
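The paper's envelope is built on a specific, recently proposed FD model; the sketch below substitutes a generic polynomial fit and residual quantiles purely to illustrate the "healthy envelope" logic (the degree and quantile are arbitrary choices, not the paper's):

```python
import numpy as np

def healthy_flow_mask(density, flow, deg=3, q=0.02):
    # Fit a smooth fundamental-diagram curve to flow vs density, then keep
    # the central (1 - 2q) residual band as the "healthy" envelope.
    coeffs = np.polyfit(density, flow, deg)
    resid = flow - np.polyval(coeffs, density)
    lo, hi = np.quantile(resid, [q, 1.0 - q])
    return (resid >= lo) & (resid <= hi)    # True = inside the envelope

# Travel time KPI penalties would then be assessed only on intervals whose
# flow-density observations fall inside the envelope.
```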
Shteinman, D, Lazarov, Z, Daly, B (2014) “Using the Fundamental Diagram to Monitor Travel Times and Motorway Performance: Methodology & Case Study”, 26th Australian Road Research Board Conference

Mathematics for traffic engineering

Engineers responsible for road traffic engineering should consider the use of industrial mathematics to solve complex problems, particularly where conventional software programs are not available or do not work.

At the Australian Centre for Commercial Mathematics (ACCM) we recently worked with the NSW Roads and Traffic Authority, now Roads and Maritime Services (RMS), to develop a statistical framework to guide traffic simulation studies. Due to the huge expense of designing and building new road infrastructure or testing alternative traffic scenarios, all design changes are first assessed in micro-simulation, the simulation of individual vehicles in a traffic system. In collaboration with RMS, ACCM identified a need for rigorous statistical analysis of the outputs of micro-simulation. The main objective was to increase confidence in the results of simulation models.
Shteinman, D (2012) “Mathematics for Traffic Engineering”, Engineers Australia Magazine (Civil), November 2012

Development of a statistical framework to guide traffic simulation studies

This paper describes a statistical framework to be used in a set of simulation and modelling guidelines. The framework was developed out of the necessity for defensible study results through the application of rigorous statistics. It is the result of a collaborative RTA-ACCM research project and provides a framework for analyzing simulation outputs, and also informs the design stage of a simulation study. The project was based on an analysis of thirty-six PARAMICS models supplied by the RTA. The guidelines apply Exploratory Data Analysis (EDA) techniques to the design and analysis of traffic micro-simulations, and include the graphing of output distributions to expose salient features and rigorous methods to detect and handle outliers in output data.

The framework includes methods to quantify and correct biases that result from the phenomenon of unreleased vehicles or incomplete trips. Diagnostic tests are described for discriminating between running error, model error and extreme sensitivity to congested conditions.

A model to predict the run cost of a simulation as a function of critical network features is also described. A regression model was built, based on an index of model complexity and the combination of critical input factors. The regression model was limited by the small sample size of models used (36). Further research is continuing on cost prediction, using a larger set of simulation outputs and model types. The guidelines developed have provided value to RTA traffic control modelling practice and can be used by simulation modellers regardless of the micro-simulation package used.
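The paper's regression model and complexity index are not reproduced here; the following schematic, with entirely synthetic data standing in for the 36 PARAMICS models, shows the shape of such a run-cost model:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 36                                    # same sample size as the study
complexity = rng.uniform(1.0, 10.0, n)    # hypothetical model-complexity index
n_factors = rng.integers(5, 60, n)        # hypothetical critical input factor count
cost = 2.0 + 3.5 * complexity + 0.4 * n_factors + rng.normal(0.0, 1.5, n)

# Ordinary least squares: cost ~ 1 + complexity + n_factors
A = np.column_stack([np.ones(n), complexity, n_factors])
beta, *_ = np.linalg.lstsq(A, cost, rcond=None)
print("fitted coefficients:", beta)       # intercept and slopes
```

With only 36 models, interval estimates on the coefficients would be wide, which matches the paper's caveat about sample size.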
Shteinman, D, Chong-White, C, Millar, G (2011) “Advanced development of a statistical framework to guide traffic simulation studies”, Proceedings of the 34th Australasian Transport Research Forum, Adelaide

ITS Korea – Statistical framework to guide traffic simulation studies

We present the results of a collaborative RTA-MASCOS research project that is working to provide (1) a methodological framework for analyzing simulation outputs, and (2) a framework to inform the design stage of a simulation study. The project aims to improve the statistical rigor and defensibility of study results. We present a high-level update of the project results that indicates the contribution this project will make to the design, planning and evaluation of traffic simulation studies. We adapt exploratory data analysis (EDA) techniques, traditionally used in industrial quality control, to the analysis and design of traffic micro-simulations. This includes graphing the output distributions to expose their salient features, and screening the data for errors, missing values and, most importantly, outliers. Outlier analysis is used as a diagnostic tool to distinguish between model errors and genuine rare events. The salient features of the data revealed by EDA are used to build a functional relationship between changes in the complexity of simulated network features, the range of confidence intervals, precision and simulation run size.
Shteinman, D, Clarke, S, Chong-White, C, Millar, G and Johnson, F (2010) “Development of a statistical framework to guide traffic simulation studies”, Proceedings of the 17th ITS World Congress, Busan

Mining

Influence of Geology on Valley Upsidence and Closure

The current valley upsidence and closure prediction methods were first published in 2002, following completion of ACARP Research Projects C8005 and C9067. These methods use conservative empirical prediction curves, which were drawn above all of the observed upsidence and closure data. The data was extracted from subsidence surveys that had been carried out in valleys at most of the collieries in the Southern Coalfield. Little site-specific surface geological data was available at that time for the monitored valley sites and the considerable scatter that existed under the prediction curves indicated that many factors probably influenced the extent of the upsidence and closure movements.

Extensive monitoring has been carried out in valleys since 2002 and the observed upsidence and closure movements have shown that the current methods for predicting upsidence and closure movements are predominantly conservative. Reviews of the few exceedance cases have indicated that various local geology and landform factors at these monitored sites may have also influenced the magnitude of the observed upsidence or closure movements.

ACARP Research Project C18015, which commenced in 2009, seeks to improve the accuracy of upsidence and closure predictions and impact assessments, by collecting and studying additional upsidence and closure data and gathering geological and topographical data at all previously monitored valley sites. This additional information is being used to develop a more comprehensive database of valley-related ground movements. Studies based on the increased quantity and quality of data are progressing well, with a view to developing a revised prediction method, based on multivariate statistical analyses, with the assistance of the Centre of Excellence for Mathematics and Statistics of Complex Systems (MASCOS) at the University of New South Wales. This research project is to be completed later this year and this paper presents the background to the project and some of the preliminary findings.
Triennial Conference of the Institution of Engineers Mine Subsidence Technical Society, Pokolbin, NSW, May 2011

Analysis of Mining Induced Strains

Ground strain comprises two components, normal strain and shear strain, which can be interrelated using Mohr’s Circle. The magnitudes of the normal strain and shear strain components are, therefore, dependent on the orientation in which they are measured. The maximum normal strains, referred to as the principal strains, are those in the directions where the corresponding shear strain is zero.

Normal strains along ground monitoring lines can be measured using 2D and 3D techniques, by taking the change in horizontal distance between two marks on the ground and dividing by the original horizontal distance between them. This provides the magnitude of normal strain along the orientation of the monitoring line and, therefore, this strain may not necessarily be the maximum (i.e. principal) normal strain. It should be noted that observed strains are dependent on the method of measurement, including the orientation of the monitoring lines, the spacing of the survey marks and the survey tolerance.
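For reference, the relations described in the two paragraphs above are the textbook plane-strain results (the notation here is generic, not taken from the paper):

```latex
% Strain along a monitoring line: change in horizontal distance between
% two survey marks, divided by the original horizontal distance
\varepsilon = \frac{L - L_0}{L_0}

% Normal strain in a direction at angle \theta to the x-axis (Mohr's circle)
\varepsilon_\theta = \tfrac{1}{2}(\varepsilon_x + \varepsilon_y)
                   + \tfrac{1}{2}(\varepsilon_x - \varepsilon_y)\cos 2\theta
                   + \tfrac{1}{2}\gamma_{xy}\sin 2\theta

% Principal strains: the extreme normal strains, attained in the
% directions where the corresponding shear strain vanishes
\varepsilon_{1,2} = \tfrac{1}{2}(\varepsilon_x + \varepsilon_y)
  \pm \sqrt{ \left(\tfrac{1}{2}(\varepsilon_x - \varepsilon_y)\right)^2
           + \left(\tfrac{1}{2}\gamma_{xy}\right)^2 }
```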
Triennial Conference of the Institution of Engineers Mine Subsidence Technical Society, Pokolbin, NSW, May 2011

Industrial Mathematics

Industrial mathematics

The main benefit to engineering practice of using industrial mathematics is in the application of mathematical or statistical thinking to a problem. Industrial mathematics is the field that explores industrial processes and systems, seeks the mathematical or statistical components to explain the underlying structure of a process, and then works to improve or optimise the process under study. Industrial mathematics, when combined with engineering expertise in a specific domain, deals with complicated or complex problems like optimising the calibration schedule of a nuclear reactor, or modelling and predicting the leakage of sequestered CO2 in an offshore gas field. Industrial mathematics has been called a doubly invisible discipline: It is invisible to industry as companies often label the activity of mathematically trained staff as something else, such as modelling, analytics or simply “research”. It is invisible to the academy as university mathematicians do not widely teach industrial mathematics as a specific standalone discipline.
Shteinman, D (2012) “Industrial Mathematics”, Engineers Australia Magazine, October 2012

Industrial mathematics: here and now . . . positive in all directions

In a recent edition of the Gazette, Graeme Wake [1] wrote of the success of industrial mathematics as something that, like the crest of a wave, is about to ‘break through’. In this article I would like to inform AustMS members of projects already underway in industrial mathematics and statistics through MASCOS (the Australian Research Council’s Centre of Excellence for Mathematics and Statistics of Complex Systems). To extend the nautical metaphor — we are surfing down a wave now, and there are more waves coming!

Since 2008 MASCOS has conducted 14 projects in industry. Industry sectors include transport (NSW Roads and Traffic Authority, and VicRoads), defence (Defence Science and Technology Organisation), coal mining (MSEC Consulting Group), medical devices (Cochlear), mental health (the Mental Health Research Institute) and nuclear science (ANSTO). For project details see MASCOS annual reports at www.complex.org.au.

MASCOS projects are not like typical consulting projects, where a specific problem is solved using existing mathematics, and recommendations are made. Rather, each project starts with an open problem set by the client. For example ‘design a new traffic control system to reduce congestion’, or ‘design a statistical model to predict the cost of road network simulations based on network complexity’ or ‘propose a new theoretical framework to improve the confidence in risk modelling of ground movement due to underground mining’. These projects require original applied research in a combination of mathematics, statistics and engineering to fill the gap identified by the open problem. Hence, in addition to the commercial value to the industry client, each project has research value to the professional mathematician or statistician. Research areas covered include statistical mechanics of non-equilibrium systems, extreme value theory, classification of high-dimensional data, risk modelling and more.
Shteinman, D (2010) “Industrial mathematics: Here and now . . . positive in all directions”, Gaz. Aust. Math. Soc., 37, 213-219