PhD Student Projects
ExaGEO equips students with the skills, knowledge, and principles of exascale computing — drawing from geoscience, computer science, statistics, and computational engineering — to tackle some of the most pressing challenges in Earth and environmental sciences and computational research. Students will work under expert supervision in the following fields:
- Atmosphere, hydrosphere, cryosphere, and ecosystem processes and evolution
- Geodynamics, geoscience and environmental change
- Geologic hazard analysis, prediction and digital twinning
- Sustainability solutions in engineering, environmental, and social sciences
Each student will be supported by a multidisciplinary supervisory team: one computational supervisor, one domain expert, and one supervisor from an Earth, environmental, or social science research background. This ‘team-based’ supervisory approach is designed to enhance multidisciplinary training.
Please note that some of the projects listed below currently have incomplete supervisory teams; however, the full teams will be finalised before the start of the PhD.
Project Selection and Information
- You must apply for three projects. Each project comprises two variations, known as teaser projects. During your first year, you will work on both teaser projects (under the same supervisory team) and then select the one that best aligns with your interests. For further information on how this process will work, please see the FAQs section on our Apply page.
- Your PhD institution will be determined by the Principal Supervisor’s institutional affiliation.
- You can apply for projects at different institutions.
- Projects are grouped by research field.
- If you have any queries regarding a specific project, please contact the supervisor listed first (this will be the Principal Supervisor).
- Projects are funded via ExaGEO; this includes fees, stipends and a Research Training Support Grant. For further information, please see our Apply page.
Below is a list of our currently available projects. We encourage applications from students from diverse cultural and disciplinary backgrounds who are eager to take ownership of their research journey while benefiting from ExaGEO’s cutting-edge resources and interdisciplinary expertise. ExaGEO is committed to advancing Equality, Diversity and Inclusion (EDI) in our training.
Projects with a focus on Atmosphere, Hydrosphere, Cryosphere, and Ecosystem Processes and Evolution:
Advancing prediction at the soil-water interface through data assimilation and exascale computing
Project institution: Lancaster University
Project supervisor(s): Prof Jess Davies (Lancaster University), Prof Lindsay Beevers (University of Edinburgh), Dr Simon Moulds (University of Edinburgh) and Prof Gordon Blair (Lancaster University)
Overview and Background
Soil-water interactions are fundamental to a number of environmental processes and play a pivotal role in flood management, plant growth, and nutrient cycling. However, these interactions are highly complex, and we currently rely on computationally intensive process-based models to help understand them and predict their influence on ecosystem services. With recent advances in satellite imagery and sensing, soil moisture data and other relevant data products are now available at spatial and temporal scales suitable for enhancing these models. However, integrating large volumes of data with these complex models is challenging. This studentship focuses on taking advantage of new exascale computing approaches to facilitate data assimilation, exploring how the fusion of big data with soil-water process models can help unlock new insights and understanding.
Teaser Project 1: Improve process-based model representation of the long-term effects of extreme weather on soil carbon and nutrient cycling through remote sensing data assimilation
The lack or over-abundance of water can have large effects on plants, especially on annual crops where water conditions can severely affect the plant’s growth and survival. With changing water patterns and increasing frequency of heat waves and extreme rainfall events, the effects on plant productivity are expected to be large, and there will be knock-on effects for soil carbon storage and nutrient cycling in the longer-term. Remote sensing offers many data products that can provide us with data-based insights into plant productivity and soil moisture conditions. However, remote sensing of soil carbon is much more difficult, and understanding of the long-term response to changes in plant productivity still requires process-based models. In this project we will experiment with combining remote sensing data and process-based models to better understand the long-term effects of extreme weather on soil carbon and nutrient cycling.
Methods and PhD Pathway:
- The process-based model N14CP, which simulates plant-soil carbon, nitrogen and phosphorus cycling and the export of dissolved nutrients to waterways, will be adapted to assimilate Gross or Net Primary Productivity (GPP or NPP) and soil moisture remote sensing data products.
- The student will explore a range of approaches, from direct insertion to machine learning methods. The first teaser will begin with NPP, as this has a direct proxy in the model. Freely available datasets, for example from MODIS and SMAP, that match the spatial resolution of the model will provide a starting point.
- To develop this path into a full PhD: multiple methods for assimilation will be explored; two-way learning between data and models will be considered; and methods for making data-assimilation real-time will be explored, with the use of exascale/GPU computing, helping move towards a digital twin.
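As a concrete illustration of the simplest end of that spectrum, the sketch below shows direct insertion (and its softer variant, nudging) of an NPP observation into a toy annual carbon balance. The model, parameter values, and observation series are hypothetical stand-ins for illustration, not the real N14CP or MODIS data.

```python
def step_soil_carbon(soil_c, npp, decay_rate=0.02, alloc_to_soil=0.3):
    """One annual step of a toy carbon balance (not the real N14CP)."""
    return soil_c + alloc_to_soil * npp - decay_rate * soil_c

def assimilate_npp(model_npp, obs_npp, weight=1.0):
    """Direct insertion (weight=1) or nudging (0 < weight < 1)."""
    if obs_npp is None:                  # no satellite retrieval this step
        return model_npp
    return (1 - weight) * model_npp + weight * obs_npp

soil_c, model_npp = 100.0, 5.0           # illustrative initial states
observations = [5.4, None, 6.1]          # MODIS-style NPP series, with a gap
for obs in observations:
    model_npp = assimilate_npp(model_npp, obs, weight=0.8)
    soil_c = step_soil_carbon(soil_c, model_npp)
```

Even this toy version shows why observation gaps and the choice of insertion weight matter, which is where the more sophisticated assimilation methods above come in.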
Teaser Project 2: Estimate the contribution of soils to mitigating or increasing flood risk in a case study catchment by combining remote sensed soil moisture data and hydrological models
Antecedent soil saturation conditions can play a significant role in mitigating or increasing flood risk. If the soil already holds significant water in storage, then its capacity to act as a store during times of high rainfall is reduced.
Soil moisture is an important component of semi-distributed and distributed hydrological models; however, it is not routinely updated dynamically during a simulation. With newly available satellite observations, soil moisture estimates now have sufficient temporal and spatial resolution that they could be used to improve flood routing and water balance within catchment hydrological models.
Combining remote sensing and soil water probes offers an opportunity to develop real-time estimation for soil water, across catchments. The student will explore different approaches to data assimilation and upscaling to the catchment scale.
Methods and PhD pathway:
- Combining soil moisture estimates into hydrological, and eventually hydraulic, modelling for flood inundation estimation will entail significant challenges: first, the assimilation of the data; second, the computational burden; and third, the coupling of models in an online, dynamic manner.
- Each of these challenges requires different innovative study and a range of methods.
- The student will be able to pick one or more of these three challenges to explore and develop into a full PhD, should they choose this pathway.
- For example, the coupling between hydrological and hydraulic models requires the exploration of different coupling approaches, and will require consideration of standards such as the Basic Model Interface (BMI).
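A minimal sketch of that coupling pattern: two models exchanging state through a BMI-like interface (initialize / update / get_value / set_value, in the spirit of the Basic Model Interface convention). Both models here are invented toy stand-ins, and the variable names and rainfall figure are assumptions for illustration only.

```python
class ToySoilMoistureModel:
    """Stand-in for a soil moisture model exposing a BMI-like interface."""
    def initialize(self):
        self.soil_moisture = 0.30            # volumetric fraction (assumed)
    def update(self):
        self.soil_moisture = min(1.0, self.soil_moisture + 0.05)  # wetting-up
    def get_value(self, name):
        assert name == "soil_moisture"
        return self.soil_moisture

class ToyRunoffModel:
    """Stand-in for a hydrological model receiving soil state each step."""
    def initialize(self):
        self.saturation, self.runoff = 0.0, 0.0
    def set_value(self, name, value):
        assert name == "soil_saturation"
        self.saturation = value
    def update(self):
        rainfall = 10.0                      # mm per step (assumed)
        # more saturated soil stores less water, so more rain becomes runoff
        self.runoff = rainfall * self.saturation

soil, runoff = ToySoilMoistureModel(), ToyRunoffModel()
soil.initialize(); runoff.initialize()
for _ in range(3):                           # online, dynamic coupling loop
    soil.update()
    runoff.set_value("soil_saturation", soil.get_value("soil_moisture"))
    runoff.update()
```

The real BMI specification standardises exactly this kind of exchange, which is what makes online coupling of independently developed models tractable.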
ArctExa: Towards Exascale Computing for Monitoring Arctic Ice Melt
Project institution: Lancaster University
Project supervisor(s): Prof Mal McMillan (Lancaster University), Dr Dave McKay (University of Edinburgh), Dr Jenny Maddalena (Lancaster University) and Dr Israel Martinez Hernandez (Lancaster University)
Overview and Background
This project offers the exciting opportunity to be at the forefront of research exploiting the potential of exascale computing for satellite monitoring of Earth’s polar regions at scale.
The Arctic is one of the most rapidly warming regions on Earth, with ongoing melting of the Greenland Ice Sheet and Arctic ice caps making a significant contribution to global sea level rise. As Earth’s climate continues to warm throughout the 21st Century, ice melt in the Arctic is expected to accelerate, leading to large-scale social and economic disruption.
Satellites provide a unique tool for monitoring the impact of climate change upon the Arctic, and are key to tracking the ongoing contribution that ice masses make to sea level rise. With recent increases in data volumes, computing power and the use of data science, comes huge potential to rapidly advance our ability to monitor and predict changes across this vast and inaccessible region. However, currently this potential is not fully realised.
This project will place you at the forefront of this research, working to advance our current capabilities towards exascale computing, through a combination of state-of-the-art satellite datasets, high performance compute, and innovative data science methods. You will be supported by a multidisciplinary supervisory team of statisticians, computer scientists and environmental scientists, with opportunities to contribute to projects run by the European Space Agency. Specifically, this project aims to develop new large-scale estimates of surface meltwater fluxes from all Arctic ice sheets and ice caps into the ocean and, in doing so, better constrain their contribution to sea level rise over the past two decades.
Methodology and Objectives
Project Aim: This project aims to utilise new streams of satellite data, alongside advanced statistical algorithms and compute, to transform our ability to monitor glacier melt at the pan-Arctic scale. More specifically, the successful candidate will develop new estimates of ice cap and ice sheet melt using high-volume, high-resolution datasets from the latest NASA and ESA satellite altimeters. These will be used to determine the first large-scale estimates of meltwater run-off into the Arctic Ocean.
Methods Used: This project will build upon recent proof-of-concept work developing Kalman Smoothing data assimilation techniques to create and analyse a unique record of ice melt. The focus of this PhD will be to apply these methods to the latest high-volume satellite altimetry datasets, and to do so at a massive scale. Fully exploiting these big data streams at the pan-Arctic scale will necessitate the use of Graphics Processing Units (GPUs) on High Performance Computing (HPC) clusters. As such, developing the code to work on this computing architecture will be a key element of the project. Within the first year of the PhD, the successful candidate will have the opportunity to explore two teaser projects, one of which will then be taken forward into subsequent years.
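The underlying idea can be sketched in miniature: a scalar Kalman filter with a Rauch-Tung-Striebel (RTS) backward smoothing pass over a noisy elevation series. The real processing chains run vast numbers of such filters over high-dimensional states across the ice sheet, which is what motivates GPUs; the random-walk model, noise parameters, and observations below are illustrative only.

```python
def kalman_rts(obs, q=0.01, r=0.25, x0=0.0, p0=1.0):
    """Scalar local-level Kalman filter + RTS smoother (toy parameters)."""
    xs, ps, xps, pps = [], [], [], []
    x, p = x0, p0
    for z in obs:                       # forward filtering pass
        xp, pp = x, p + q               # predict: random-walk elevation
        k = pp / (pp + r)               # Kalman gain
        x, p = xp + k * (z - xp), (1 - k) * pp
        xs.append(x); ps.append(p); xps.append(xp); pps.append(pp)
    xs_s = xs[:]                        # backward RTS smoothing pass
    for t in range(len(obs) - 2, -1, -1):
        g = ps[t] / pps[t + 1]          # smoother gain
        xs_s[t] = xs[t] + g * (xs_s[t + 1] - xps[t + 1])
    return xs_s

# noisy elevation anomalies (metres, invented for illustration)
smoothed = kalman_rts([0.1, -0.2, 0.05, -0.1, 0.0])
```

Because the smoother at each grid cell is independent of its neighbours, millions of such passes can in principle be batched across GPU threads, which is the refactoring challenge at the heart of this project.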
Teaser Project 1: High Resolution Measurements of Greenland Ice Melt over the past 15 years
This teaser project will develop novel estimates of Greenland ice melt over the past 15 years, based upon state-of-the-art CryoSat-2 swath altimetry satellite data. Specifically, a Kalman Smoothing approach, which has recently been tested within our group at several small-scale sites, will be further developed and deployed at scale, with the aim of mapping elevation changes across the entire ice sheet at high resolution. Achieving this will require the current prototype code to be refactored and then deployed for the first time on GPU-enabled systems. Depending on the progress made, there will also be the opportunity to integrate other data streams, for example complementary measurements from the Sentinel-3 high-resolution Synthetic Aperture Radar altimeters.
Teaser Project 2: Towards pan-Arctic Monitoring of Ice Melt
The second teaser project will make use of the same Kalman Smoothing approach introduced above. This will ensure close synergy and complementarity between the two first-year teaser projects, so that the student reaps maximum benefit from developing their technical skills in this area. Here, the student will deploy, for the first time, the Kalman Smoothing approach to monitor a small, highly sensitive Arctic ice cap, such as Austfonna in the Svalbard Archipelago. These smaller ice caps represent a more challenging target for satellite-based monitoring, and so alongside CryoSat-2 the student will also test the use of complementary ICESat-2 photon-counting altimetry within the Kalman framework. Because of the high data volumes and the longer-term ambition to operate at the pan-Arctic scale, this project will again work to deploy the processing chain on GPUs.
In later years of the PhD, depending upon the student’s interests, there will be the opportunity to extend this work to integrate output from Regional Climate Model simulations, to build more sophisticated machine learning elements into the processing chains, or to utilise other diverse streams of high-volume data, such as ultra-high-resolution Digital Elevation Models or historical satellite missions.
Informal enquiries are welcome; please contact Prof Mal McMillan.
References and Further Reading
- Antarctica’s ice is melting 5 times faster than in the 90s
- Climate change: Satellite fix safeguards Antarctic data
- Greenland lost a staggering 1 trillion tons of ice in just four years
- CPOM
- CEEDS
Creating Sustainable Conservation-AI
Project institution: Lancaster University
Project supervisor(s): Dr Alex Bush (Lancaster University), Dr Tom August (Lancaster University), Prof Rachel McCrea (Lancaster University), and Prof Claire Miller (University of Glasgow)
Overview and Background
New technologies for automating ecological sensing and measurement are producing unprecedented volumes of data that will continue to grow in size and complexity. A myriad of AI applications are emerging to harness the data produced, building a greater understanding of ecosystem change at local to global scales and better informing conservation actions. However, relying on increasingly powerful computing infrastructure and larger and larger data storage is only practical for the richest nations, and the growing environmental costs may contribute to energy demands that undermine our original conservation interests. A new data science is needed to serve biodiversity conservation globally; one that scales to the biodiverse Global South, and which considers the environmental impacts of data storage and large-scale compute.
Methodology and Objectives
Supporting and evaluating conservation actions with data has been a longstanding challenge, and it is therefore understandable why there has been a rush to embrace new technologies. However, the global footprint of computing and data centres is already so great it threatens climate action. If we are to scale Big Biodiversity Data tools globally, we must ask how those costs can be strategically reduced so they don’t outweigh the improvements new information makes to conservation outcomes.
The solutions to the Big Biodiversity Data Challenge draw from knowledge across the domains of statistics and computer science, with the need for underpinning support from the biological sciences. There are four key challenges which should be addressed to ensure the continued sustainability of automated monitoring methods. First, targeted data collection should ensure that the optimal amount of data is generated in the first instance. Second, the data should be handled, analysed and stored in a way that is optimal for its interpretation and reuse. Third, after analysis the information content should be used to optimize data retention to remain sustainable in the long-term. Raw data that has minimal information content could be considered for deletion, or placed in cold storage. Finally, strategies should be developed for how archived data can be re-used and integrated with new data to support biodiversity monitoring and prediction.
Using the high-performance computing resources of JASMIN, the successful candidate will develop cutting-edge solutions to the biodiversity big data challenge on some of the world’s largest acoustic and imagery datasets, collected by UKCEH and Lancaster University.
Teaser Project 1: Statistical approaches to the biodiversity big data problem
The easiest way to reduce the environmental impact and burden of Big Data is to collect and process only what is required. Methods to guide such decisions are available, but are computationally demanding, and therefore tools to accelerate prioritisation are critical.
- How should we best characterize ‘information content’ in our raw data? What are the common use cases for biodiversity data and how do these vary in their characterization of ‘information content’?
- What sampling strategies could be used to subset the data at the point of collection, to maximise the information gained per unit of data collected? What system designs would need to become common practice to enable this type of dynamic data collection?
- What are the perils of ‘business as normal’? Under future scenarios of data collection, what can these statistical methods bring to help alleviate the Big Biodiversity Data Challenge?
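One hedged way to make ‘information content’ operational is sketched below: score each monitoring site by the Shannon entropy of a naive occupancy estimate from its detection history, so that ambiguous sites rank highest for continued sampling. The scoring rule and the retention threshold are illustrative assumptions, not an established protocol.

```python
import math

def entropy(p):
    """Shannon entropy (bits) of a Bernoulli probability p."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def information_value(detections):
    """Entropy of the naive occupancy estimate for one site's history."""
    p = sum(detections) / len(detections)
    return entropy(p)

# detection histories (1 = species detected) for three hypothetical sites
sites = {"A": [1, 1, 1, 1], "B": [1, 0, 1, 0], "C": [0, 0, 0, 0]}
scores = {s: information_value(d) for s, d in sites.items()}
# keep sampling only sites whose outcome is still genuinely uncertain
keep = [s for s, v in sorted(scores.items(), key=lambda kv: -kv[1]) if v > 0.5]
```

Sites with consistent histories (always or never detected) score zero and could be moved to sparser sampling or cold storage, while ambiguous sites justify continued data collection.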
Teaser Project 2: Computational approaches to the big data problem
Where GPU-ready models exist, parallel processes are required to assess the value of the raw data for future use and optimize data storage. That understanding could also guide how to efficiently reanalyse archive data to improve the revision and reuse of stored data.
- ‘Active Learning’ techniques have been developed to train AI models more quickly by identifying the samples with the greatest potential information content. Can this approach be reapplied to optimize data storage and reuse?
- Can tools such as automated AI compression and knowledge distillation be generalised to different AI monitoring systems, and reduce resource demands to make deployment affordable in low-income countries?
- Are visions of Exa-scale computing compatible with conservation priorities? How will our use of GPUs and storage need to adapt for that uptake to remain sustainable?
The supervision team have considerable expertise in statistics and data science of automated biodiversity monitoring, and your research will be embedded in Lancaster’s Data Science Institute and Centre of Excellence in Environmental Data Science.
Image: AI-enabled monitoring, like this automated moth camera trap, generates terabytes of data. What do we keep, and what is redundant information?
References and Further Reading
- The real climate and transformative impact of ICT: A critique of estimates, trends, and regulations
- Deep learning and computer vision will transform entomology
- Measuring the Carbon Intensity of AI in Cloud Instances
- Recursive Bayesian computation facilitates adaptive optimal design in ecological studies
Decoding biological colour: leveraging AI to analyse big data on animal images in a changing world
Project institution: Lancaster University
Project supervisor(s): Dr Sally Keith (Lancaster University), Prof Christopher Nemeth (Lancaster University), Dr David Roy (Lancaster University) and Dr Christopher Cooney (University of Sheffield)
Overview and Background
Understanding biodiversity is crucial to predict the impacts of global change and inform conservation strategies. One aspect of biodiversity that has been largely overlooked is biological colour, due to its complex quantification compared to other facets of biodiversity such as species richness. We therefore have little understanding of why species are the colours they are, the role colour plays in mediating interactions with other species (reproduction, competition, predation) and with the environment (e.g., thermoregulation), and the implications of environmental change for this role (e.g., the effect of a changing background on camouflage ability). One reason research in this area is limited is that quantification of colour and pattern is conceptually challenging and computationally intensive. To advance our understanding of biological colour, new, more efficient techniques must be developed for its quantification, visualisation, and interpretation.
Methodology and Objectives
Biodiversity is entering the realms of ‘big data’ with the rapid growth in Citizen Science, emerging sensors for more automated monitoring, and progress towards digital twins. These approaches are generating large image datasets in near-real-time. Leveraging recent advancements in machine learning (ML) and artificial intelligence (AI), we can unlock unprecedented insights into these datasets. One area of interest is in enabling detailed analysis of biological colouration and patterning. State-of-the-art techniques such as convolutional neural networks (CNNs) and vision transformers (ViTs) can allow precise quantification of colour distributions across species and ecological communities. Through such approaches, it becomes possible to monitor and predict the effects of environmental change on biological colouration, informing conservation strategies and advancing our understanding of ecological processes.
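Before any deep model, colour can already be quantified directly from pixels. The sketch below bins an image’s (R, G, B) values into a coarse normalised histogram, the kind of per-image colour distribution that a CNN or ViT pipeline would refine with segmentation and calibration; the toy “image” is invented for illustration.

```python
def colour_histogram(pixels, bins_per_channel=4):
    """pixels: iterable of (r, g, b) tuples with values in 0..255.

    Returns a normalised histogram over coarse colour-space bins.
    """
    step = 256 // bins_per_channel
    hist = {}
    for r, g, b in pixels:
        key = (r // step, g // step, b // step)   # coarse colour bin
        hist[key] = hist.get(key, 0) + 1
    total = sum(hist.values())
    return {k: v / total for k, v in hist.items()}

# toy "image": half saturated red, half dark green
pixels = [(250, 10, 10)] * 50 + [(10, 120, 10)] * 50
hist = colour_histogram(pixels)
```

Distances between such histograms (or richer learned embeddings) are what allow colour to be compared across species, assemblages, and time.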
Teaser Project 1: Have community colourscapes shifted over time in response to altered environmental conditions?
Objectives:
- Quantify colour across insect species from citizen science images.
- Reveal the diversity of colour and pattern across UK insect species.
- Follow-on. Create new methods to determine assemblage-scale colourscapes.
- Follow-on. Determine if and how insect assemblage colouration has changed over time and space, and explore the implications of that change for ecosystem function.
This project draws on the concept of bioacoustic “soundscapes” to explore assemblage-scale shifts in the context of environmental change. For the teaser project, the student will apply CNNs and ViTs to quantify the colour of multiple insect species through access to two main image datasets: (1) images submitted via the iRecord citizen science platform, which includes >5 million photos spanning >20,000 UK species; and (2) standardised images of nocturnal insects captured by the UKCEH Automated Monitoring of Insects (AMI) sensor network in Central America, Africa, and Asia, which involves 40 devices generating ~2 TB of image data per year. This quantification will provide the foundation for a larger project coupling these colour quantifications with temporal assemblage data derived from decades of species observations across the UK (UKCEH Biological Records Centre; GBIF). This integration would enable analysis of how colourscapes – the frequency and distribution of colour across co-occurring species – have changed nationally over time, and the ecological implications of such changes. To engage conservation stakeholders, the student could later develop interactive geospatial visualisations of colour changes over time. Once calibrated, the project could be expanded to other regions and taxa, particularly where AMI sensors are deployed, and/or to coral reef fishes and birds, leveraging supervisory expertise.
Teaser Project 2: Does habitat degradation disrupt the evolutionary match between animal colouration and its background?
Objectives:
- Quantify coral reef background colouration using computational methods applied to 3D photogrammetry data.
- Reveal the relationship between habitat colour diversity and reef degradation.
- Follow-on. Assess potential mismatch between reef fish colouration and habitat colour in degraded versus healthy reefs, and its implication for ecological function.
This project explores how coral reef habitat colour diversity is affected by environmental degradation using 3D models built from photogrammetry. The teaser project will build towards a broader exploration of how environmental change impacts the ecological and evolutionary functions of biological colouration. While current research on biological colouration focuses mainly on organisms, understanding their ecological context requires quantifying habitat background – the visual stage on which ecology plays out. Animal perception of colour depends on background contrast, hue, and brightness, which affect clarity and can induce colour shifts. Environmental change altering habitat background colour may heighten risks such as predation, disrupted sexual selection, or intensified competition. Coral reefs are an ideal model system because degradation shifts habitats from diverse, colourful coral to relatively uniform algae. Coral reefs are increasingly mapped in 3D via photogrammetry, creating a growing repository of models for analysis (e.g., Operation Wallacea, MARS project) and facilitating new model generation. 3D models overcome the limitations of standard photographs, which often fail to capture the structural complexity and dynamic colouration of reef habitats. Reef fishes offer a rich testbed for investigating how shifts in habitat colour affect evolved colouration. Coral reefs also provide essential ecosystem services for billions yet are among the most threatened globally, with stressors like bleaching altering habitat colour and reducing background diversity.
Image: Quantifying colour and pattern of both organisms and their background from real-world images is a significant challenge in complex environments, such as this image of Chaetodon guttatissimus on a coral reef in Bali, Indonesia.
References and Further Reading
- Hemingson et al (2024) Analysing biological colour patterns from digital images: An introduction to the current toolbox. Ecology & Evolution
- Koneru & Caro (2022) Animal coloration in the Anthropocene. Frontiers in Ecology & Evolution
- Cuthill et al (2017) The biology of color. Science
- Cooney et al (2022) Latitudinal gradients in avian colourfulness. Nature Ecology & Evolution
- Caves et al (2024) Backgrounds and the evolution of visual signals. Trends in Ecology & Evolution
Detecting hotspots of water pollution in complex constrained domains and networks
Project institution: University of Glasgow
Project supervisor(s): Dr Mu Niu (University of Glasgow), Dr Craig Wilkie (University of Glasgow), Prof Cathy Yi-Hsuan Chen (University of Glasgow) and Dr Michael Tso (Lancaster University)
Overview and Background
Technological developments with smart sensors are changing the way that the environment is monitored. Many such smart systems are under development, with small, energy efficient, mobile sensors being trialled. Such systems offer opportunities to change how we monitor the environment, but this requires additional statistical development in the optimisation of the location of the sensors.
The aim of this project is to develop a mathematical and computational inferential framework to identify optimal sensor deployment locations within complex, constrained domains and networks for improved water contamination detection. Methods for estimating covariance functions in such domains rely on computationally intensive diffusion process simulations, limiting their application to relatively simple domains and small-scale datasets. To address this challenge, the project will employ accelerated computing paradigms with highly parallelized GPUs to enhance simulation efficiency. The framework will also address regression, classification, and optimization problems on latent manifolds embedded in high-dimensional spaces, such as image clouds (e.g., remote sensing satellite images), which are crucial for sensor deployment and performance evaluation. As the project progresses, particularly in the image cloud case, the computational demands will intensify, requiring advanced GPU resources or exascale computing to ensure scalability, efficiency, and performance.
Methodology and Objectives
The idea of on-site sensors to detect water contaminants has a rich history. Since water flows at finite speeds, placing sensors strategically reduces the time until detection. The mathematical analysis is often made difficult by the need to model the nonlinear dynamical systems of hydraulics within a non-Euclidean space, such as a constrained domain (lake or river; Wood et al., 2008) or a network (pipe network; Oluwaseye et al., 2018). It requires solving large nonlinear systems of differential equations on the complex domain and is difficult to apply to even moderate-sized problems.
This proposed PhD project aims to develop new methods to improve environmental sampling, enabling improved estimation of water pollution and associated uncertainty that appropriately accounts for the geometry and topology of the water body.
Methods Used:
Intrinsic Bayesian Optimization (BO) on complex constrained domains and networks uses the prediction and uncertainty quantification of intrinsic Gaussian processes (GPs) (Niu et al., 2019, 2023) to direct the search for water pollution hotspots. Once a new detection is observed, the search for a hotspot can be sequentially updated.
The key ingredient of BO is the Gaussian process (GP) prior, which captures beliefs about the behaviour of the unknown black-box function on the complex domain. The student will develop intrinsic BO on non-Euclidean spaces, such as complex constrained domains and networks, using state-of-the-art GPs on manifolds and GPs on graphs. Extending the idea of estimating covariance functions on manifolds, the project aims to estimate the heat kernel of the point cloud, allowing the incorporation of the intrinsic geometry of the data and of a potentially complex interior structure.
The application areas are water quality in lakes with complex domains (such as the Aral Sea) and pollution sources in a city’s sewage network. The methods would have the potential to inform about emergent water pollution events like algal blooms, providing an early warning system, and help to identify pollution sources.
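The sequential search logic can be sketched as follows, with a kernel-regression surrogate and an ad hoc uncertainty proxy standing in for the intrinsic GP (whose heat-kernel covariance is beyond a short example), combined in an upper-confidence-bound acquisition rule. Locations, pollution readings, and parameters are all illustrative assumptions.

```python
import math

def surrogate(x, obs, lengthscale=0.1):
    """Kernel-weighted mean and a crude uncertainty proxy at location x."""
    w = [math.exp(-((x - xi) ** 2) / (2 * lengthscale ** 2)) for xi, _ in obs]
    total = sum(w)
    if total < 1e-12:                    # far from all data: fully uncertain
        return 0.0, 1.0
    mean = sum(wi * yi for wi, (_, yi) in zip(w, obs)) / total
    return mean, 1.0 / (1.0 + total)     # more nearby data -> less uncertainty

def acquisition(x, obs, beta=2.0):
    """Upper-confidence-bound: exploit high pollution, explore sparse areas."""
    mean, unc = surrogate(x, obs)
    return mean + beta * unc

def next_location(obs):
    grid = [i / 100 for i in range(101)]  # discretised 1-D "river" domain
    return max(grid, key=lambda x: acquisition(x, obs))

# (location, measured pollution) pairs from sensors already deployed
observations = [(0.2, 0.3), (0.5, 0.9), (0.8, 0.2)]
x_next = next_location(observations)      # proposes an unexplored stretch
```

A true intrinsic GP would replace the Euclidean distance in the kernel with a geometry-aware (heat-kernel) covariance, so that two points on opposite banks of a river are not treated as neighbours; the sequential update structure remains the same.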
Teaser Project 1:
This first teaser project will introduce the student to parallel computing using GPUs, with a focus on applying intrinsic Gaussian Processes to water quality data for efficient GPU implementation. The project will lay the foundation for the PhD by enabling the understanding of complex water quality patterns in non-Euclidean spaces, including continuous domains with complex boundaries and network domains, while developing efficient stochastic diffusion process simulations leveraging GPU power. The student will begin by applying existing methods to datasets ranging from small to large scales. This work has the potential to evolve into a full PhD project centred on the development of computationally intensive methods for modelling water quality and detecting contamination hotspots over complex domains. The use of GPU parallelization will enable scalable modelling across large spatial areas and handle the high data volumes characteristic of high-resolution water quality datasets.
Teaser Project 2:
In the second teaser project, the student will expand their work to the spatio-temporal (or manifold-temporal) setting, incorporating both complex spatial and temporal structures to fully explain the changing nature of water quality patterns. Again, this teaser project will involve applying existing methods to small-scale datasets. Due to the high computational complexity of spatio-temporal models, this project has the potential to evolve into a PhD focused on developing highly computationally efficient methods, with an emphasis on parallelisation on GPUs.
The student will benefit from the extensive expertise of the supervisory team. Dr Niu specializes in statistical inference in Non-Euclidean spaces, with application in ecology and environmental science.
Dr Wilkie has a background in developing spatiotemporal data fusion approaches for environmental data, focussing on satellite and in-lake water quality data. Prof Chen specializes in network modeling, statistical inference, data science, machine learning and economics. Dr Tso is an environmental data scientist with strong computational background and a portfolio of work on water quality monitoring, including adaptive sampling.
Figure 1: Examples of complex constrained domains: Chlorophyll concentrations in the Aral Sea (Wood et al., 2008).
References and Further Reading
- Niu et al. (2019). Intrinsic Gaussian processes on complex constrained domain. Journal of the Royal Statistical Society: Series B, 81(3)
- Niu et al. (2023). Intrinsic Gaussian processes on unknown manifold with probabilistic geometry. Journal of Machine Learning Research, 24(104)
- Oluwaseye et al. (2018). A state-of-the-art review of an optimal sensor placement for contaminant warning system in a water distribution network. Urban Water Journal, 15(10), 985–1000
- Giudicianni et al. (2020). Topological Placement of Quality Sensors in Water-Distribution Networks without the Recourse to Hydraulic Modeling. Journal of Water Resources Planning and Management, 146(6)
- Wood, S. N., Bravington, M. V. and Hedley, S. L. (2008). Soap film smoothing. Journal of the Royal Statistical Society: Series B, 70, 931–955
Developing GPU-accelerated digital twins of ecological systems for population monitoring and scenario analyses
Project institution: University of Glasgow
Project supervisor(s): Prof Colin Torney (University of Glasgow), Prof Juan Morales (University of Glasgow), Prof Rachel McCrea (Lancaster University), Dr Tiffany Vlaar (University of Glasgow) and Prof Dirk Husmeier (University of Glasgow)
Overview and Background
This PhD project focuses on advancing ecological research by using high-resolution datasets and GPU computing to develop digital twins of ecological systems. The study will concentrate on a population of free-roaming sheep in Patagonia, Argentina, examining the relationship between individual decision-making and population dynamics. Using data from state-of-the-art GPS collars, the research will investigate the impact of an individual’s condition on activity budgets and space use, and the dual influence of parasites on behaviour and energy balance. The digital twins will enhance the accuracy of population-level predictions and offer a versatile and transferable framework for ecosystem monitoring, providing critical insights for environmental policy, conservation strategies, and sustainable food systems.
Methodology and Objectives
A digital twin is a virtual replica of a physical system that can be used to investigate the system’s dynamics, predict potential failures, and optimise decision-making processes. What distinguishes digital twins from other simulation models is their ability to continuously update with real-time data. This feature allows them to represent the current state of the system accurately, ensuring that the model is consistently learning from empirical data.
The focus of this PhD is to develop a digital twin of an ecological system. This digital twin will serve as a platform for exploring methods of capturing the emergent distribution of vital rates. It will also facilitate updates based on new data and will therefore function as both a learning tool and a forecasting tool, predicting future states of the system under different scenarios. One of the central objectives of this project is to determine the extent to which the individual-level dynamics can be simplified without compromising prediction accuracy.
The digital twin developed will be an individual-based model, with rules for individual behaviour and space use informed by empirical data and theoretical principles. The landscape in which the animals move will be composed of GIS layers derived from remote sensing data and vegetation maps, while movement and activity data will be supplied by GPS collar devices.
The project will employ statistical inference techniques and GPU-based simulations of multiple individuals and populations in parallel, using a process of importance sampling and resampling based on data. This process will ensure that only model parameters for which the simulations are consistent with observations are retained, and the resulting posterior distributions of parameter values are iteratively refined. In this way, the digital twin will efficiently represent animals transitioning between different behaviours and landscape usage, while monitoring their energy gains and losses.
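The simulate / weight / resample loop described above can be sketched with a toy stand-in for the individual-based simulator. Here a one-parameter logistic model plays the role of the GPU simulation, and the parameter value, kernel width, and particle count are illustrative only.

```python
import numpy as np

# Sketch of simulation-based inference by importance sampling/resampling:
# draw parameters from a prior, simulate, weight by agreement with data,
# then resample parameters in proportion to their weights.
rng = np.random.default_rng(1)

def simulator(theta, n_steps=5):
    # Toy stand-in for the individual-based model: logistic growth whose
    # rate is the unknown parameter.
    x = 10.0
    for _ in range(n_steps):
        x += theta * x * (1 - x / 100.0)
    return x

true_obs = simulator(0.3) + rng.normal(0, 1.0)   # synthetic "observation"

n_particles = 500
theta = rng.uniform(0.0, 1.0, n_particles)       # prior draws
sims = np.array([simulator(t) for t in theta])

# Importance weights: how consistent is each simulation with the data?
w = np.exp(-0.5 * ((sims - true_obs) / 2.0) ** 2)
w /= w.sum()

# Resample parameters in proportion to their weights: an approximation of
# the posterior, which the digital twin would refine as new data arrive.
theta_post = rng.choice(theta, size=n_particles, replace=True, p=w)
```

On GPUs the inner `simulator` calls would run for many particles in parallel, which is what makes the scheme feasible for realistic individual-based models.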
Teaser Project 1: Predicting individual behaviour, space use, and condition
The first project will focus on the individual level. Using telemetry data and periodic measurements of individual body condition, models will be developed to predict individual behaviour, space use, and changes to condition. The project will explore how individuals transition between behaviours and how this influences, and is influenced by, their internal state and condition. The impact of parasites on behaviour and energy balance will also be investigated. The project will develop recharge models which capture the internal state of an animal as several state variables that are either depleted or replenished depending on the activities of the animal. Individual-based models will be used to simulate complex decision-making processes as a function of internal states and environmental features. Model predictions will be compared to empirical data, and the mismatch between the predictions and the observations will be used to refine the model and update estimates of parameter uncertainty.
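A minimal recharge model might look like the following sketch, in which a single internal state is depleted while moving and replenished while foraging. All thresholds, rates, and the switching rule are invented for illustration; the project's models would track several state variables and data-informed transition rules.

```python
import numpy as np

# Minimal recharge model: one internal state ("reserves") that is depleted
# while moving and replenished while foraging; behaviour switches when the
# state crosses illustrative thresholds (all numbers are made up).
rng = np.random.default_rng(2)

gain, cost = 1.0, 0.6          # foraging gain / movement cost per step
reserves, state = 5.0, "forage"
trajectory = []
for step in range(200):
    if state == "forage":
        reserves += gain * rng.uniform(0.5, 1.5)
        if reserves > 20.0:    # sated -> start moving
            state = "move"
    else:
        reserves -= cost * rng.uniform(0.5, 1.5)
        if reserves < 5.0:     # depleted -> resume foraging
            state = "forage"
    trajectory.append(reserves)

trajectory = np.array(trajectory)
```

Parasite load could enter such a model as an extra drain on the reserve variable, which is one way the behaviour-energetics feedback described above could be formalised.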
Teaser Project 2: Connecting individuals to population dynamics
The second project will explore different potential modelling approaches to capture the dynamics of wildlife populations. The digital twin will serve as a benchmark model that encapsulates multiple facets of the complex dynamics of the system. More abstract population models will be developed that ignore or summarize individual variation, spatial heterogeneity, and the feedbacks between individual behaviour and condition. These models will be compared to the digital twin to assess the capacity of coarser representations of the system to accurately predict population declines depending on environmental conditions. The comparison will consider the trade-offs between model accuracy and the time and energy required to run the models.
References and Further Reading
- Blair, G. S. (2021). Digital twins of the natural environment. Patterns, 2(10)
- Hooten, M. B., Johnson, D. S., McClintock, B. T., & Morales, J. M. (2017). Animal movement: statistical models for telemetry data. CRC press
- Torney, C. J., Morales, J. M., & Husmeier, D. (2021). A hierarchical machine learning framework for the analysis of large scale animal movement data. Movement ecology, 9, 1-11
- Kavwele, C. M., Hopcraft, J. G. C., Morales, J. M., Nyafi, G., Kimuya, N., & Torney, C. J. (2024). Real‐time classification of Serengeti wildebeest behaviour with edge machine learning and a long‐range IoT network. Canadian Journal of Zoology
Downscaling and Prediction of Rainfall Extremes from Climate Model Outputs (RainX)
Project institution: University of Glasgow
Project supervisor(s): Dr Sebastian Gerhard Mutz (University of Glasgow) and Dr Daniela Castro-Camilo (University of Glasgow)
Overview and Background
Over the last decade, Scotland’s annual rainfall increased by 9% and its winter rainfall by 19%, with more of that water arriving in extreme events, posing risks to the environment, infrastructure, health, and industry [Sniffer, 2021]. Urgent issues such as flooding, mass wasting, and water quality are closely tied to rainfall extremes [Sniffer, 2021]. Reliable predictions of extremes are, therefore, critical for risk management. Prediction of extremes, one of the main focuses of extreme value theory [Friederichs, 2010], is still considered one of the grand challenges by the World Climate Research Programme [Alexander et al., 2016]. This project will address this challenge by developing novel, computationally efficient statistical models that can predict rainfall extremes from the output of GPU-optimised climate models.
Methodology and Objectives
General Circulation Models (GCMs) are the primary tools for predicting future climate change [IPCC, 2023; and references therein]. While these GCM simulations are suitable for studies investigating climate dynamics and changes on coarse spatiotemporal scales, their skill in predicting local-scale climate and extremes remains very limited [IPCC, 2023]. Statistical Downscaling (SD) addresses this problem by linking coarse climate information to local-scale observational data [e.g., Hewitson et al., 2014; Mutz et al., 2021] using statistical models, which enables us to “translate” GCM output to predictions that are more relevant for regional impact studies and adaptation measures. This project will leverage recent advances in SD for extremes [Cuba et al., 2024+] to develop a set of algorithms for predicting rainfall extremes in Scotland from the output of the latest GPU-optimised GCM ICON [Giorgetta et al. 2018]. These will be integrated into the user-focused, open-source tool pyESD [Boateng and Mutz, 2023]. Both teaser projects will rely on two datasets: 1) meteorological observations that capture rainfall extremes in Scotland (i.e., the “predictand” dataset), and 2) a dataset used for SD model fitting (i.e., the “predictor” dataset).
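To give a flavour of the extreme-value machinery involved, the sketch below fits a generalised Pareto distribution (GPD) to synthetic rainfall exceedances by the method of moments, a simple stand-in for the likelihood-based fits used in practice. All data are simulated; nothing here is ERA5 or station data.

```python
import numpy as np

# Peaks-over-threshold building block: exceedances above a high threshold
# are approximately generalised Pareto. Here we draw exceedances from a
# known GPD (shape xi=0.1, scale sigma=5) via the inverse CDF and recover
# the parameters by the method of moments.
rng = np.random.default_rng(3)

xi_true, sigma_true = 0.1, 5.0
u = rng.uniform(size=20000)
exceedances = sigma_true / xi_true * ((1 - u) ** (-xi_true) - 1)

m, v = exceedances.mean(), exceedances.var()
xi_hat = 0.5 * (1 - m**2 / v)     # method-of-moments GPD shape
sigma_hat = m * (1 - xi_hat)      # method-of-moments GPD scale

# Level exceeded once per 100 exceedances (return level above the threshold).
ret_100 = sigma_hat / xi_hat * (100 ** xi_hat - 1)
```

In an SD model the GPD parameters would additionally be regressed on coarse-scale predictors, so that the tail behaviour responds to the large-scale atmospheric state.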
Teaser Project 1: “Perfect Prognosis” Approach
In the perfect prognosis approach, SD models are constructed from observation-based datasets for both the predictand and predictors. These models, therefore, capture real-world conditions and relationships, aiding model validation and improving our physical understanding of local-scale predictand variability. Another strength of this approach is the ability to couple the SD models to any GCM or dataset (e.g., Ramon et al., 2021), making them highly transferable. The predictor dataset for Teaser Project 1 will be ERA5 reanalysis data [Hersbach et al., 2020]. The SD models will then be coupled to 21st century simulations conducted with the GCM ICON [Giorgetta et al. 2018] to predict future changes in rainfall extremes in Scotland.
Teaser Project 2: “Model Output Statistics” Approach
Model Output Statistics also uses an observation-based predictand dataset, but the predictor dataset is simulated with climate models (e.g., GCMs). The relationships captured in the resulting SD models do not reflect physical processes as much as in the perfect prognosis approach, and the SD models are fine-tuned to a specific climate model. However, when used in tandem with this climate model, the approach often produces more accurate results and excels at climate model bias correction (e.g., Sachindra et al., 2014). The predictor dataset for Teaser Project 2 will be ICON simulations for the present-day climate. The SD models will then be coupled to 21st century ICON simulations to predict future changes in rainfall extremes in Scotland.
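A standard bias-correction ingredient in the Model Output Statistics setting is empirical quantile mapping: each simulated value is mapped to the observed value with the same quantile rank. The sketch below applies it to synthetic "observed" and "modelled" rainfall; the gamma distributions are placeholders, not ICON output or station records.

```python
import numpy as np

# Empirical quantile mapping for bias correction: look up the model value's
# position in the model distribution and replace it with the observed value
# at the same quantile.
rng = np.random.default_rng(4)

obs = rng.gamma(shape=2.0, scale=3.0, size=5000)   # "station" rainfall
gcm = rng.gamma(shape=2.0, scale=4.5, size=5000)   # wet-biased "model" rainfall

q = np.linspace(0.01, 0.99, 99)
obs_q = np.quantile(obs, q)
gcm_q = np.quantile(gcm, q)

def correct(x):
    # Interpolate the model value's quantile onto the observed distribution;
    # np.interp clamps beyond the outermost quantiles.
    return np.interp(x, gcm_q, obs_q)

corrected = correct(gcm)
```

Note the clamping behaviour at the tails: for extremes specifically, quantile mapping is usually combined with a parametric tail model such as the GPD rather than relied on alone.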
References and Further Reading
- Alexander, L.V., Zhang, X., Hegerl, G. & Seneviratne, S.I. (2016). Implementation Plan for WCRP Grand Challenge on Understanding and Predicting Weather and Climate Extremes – the “Extremes Grand Challenge”. Version, June 2016
- Boateng, D. & Mutz, S. G. (2023). pyESDv1.0.1: an open-source Python framework for empirical-statistical downscaling of climate information. Geosci. Model Dev., 16, 6479–6514
- Coles, S. G. (2001). An Introduction to the Statistical Modeling of Extreme Values. London: Springer
- Cuba, M.D., Wilkie, C., Scott, M. & Castro-Camilo, D. (2024+). Data fusion for threshold exceedances using a censored Bayesian hierarchical model. To appear
- Friederichs, P. (2010). Statistical downscaling of extreme precipitation events using extreme value theory. Extremes, 13, 109-132
- Giorgetta, M. A., Brokopf, R., Crueger, T., Esch, M., Fiedler, S., Helmert, J., et al. (2018). ICON-A, the atmosphere component of the ICON Earth system model: I. Model description. Journal of Advances in Modeling Earth Systems, 10, 1613–1637
- Hersbach, H., Bell, B., Berrisford, P., Hirahara, S., Horányi, A., Muñoz-Sabater, J., et al. (2020). The ERA5 global reanalysis. Q. J. Roy. Meteor. Soc., 146, 1999–2049
- Hewitson, B. C., Daron, J., Crane, R. G., Zermoglio, M. F. & Jack, C. (2014). Interrogating empirical-statistical downscaling. Clim. Change, 122, 539–554
- IPCC (2023). Geneva, Switzerland, 35-115
- Mutz, S. G., Scherrer, S., Muceniece, I. & Ehlers, T. A. (2021). Twenty-first century regional temperature response in Chile based on empirical-statistical downscaling. Clim. Dynam., 56, 2881–2894
- Ramon, J., Lledó, L., Bretonnière, P.-A., Samsó, M. & Doblas-Reyes, F. J. (2021). A perfect prognosis downscaling methodology for seasonal prediction of local-scale wind speeds. Environ. Res. Lett., 16, 054010
- Sachindra, D. A., Huang, F., Barton, A. & Perera, B. J. C. (2014). Statistical downscaling of general circulation model outputs to precipitation – part 2: bias-correction and future projections. Int. J. Climatol., 34, 3282–3303
- Sniffer (2021): ‘Third UK Climate Change Risk Assessment Technical Report: Summary for Scotland’
Foundational models for Ecology
Project institution: Lancaster University
Project supervisor(s): Dr Kit Macleod (Lancaster University), Dr Alex Bush (Lancaster University) and Dr Clare Rowland (Lancaster University)
Overview and Background
Satellite missions generate data on the scale of exabytes (millions of terabytes), making efficient data analysis paramount. Geospatial Foundation Models (GFMs) are currently revolutionizing our approach to machine learning by using vast unlabelled databases of satellite images to self-supervise model training. The completed GFMs then require minimal additional labelled data to be trained for new applications such as flood and wildfire mapping. In this project you will explore opportunities for GFMs to improve habitat mapping in support of conservation and ecosystem restoration. This project will combine detailed long-term monitoring datasets collected by UKCEH to understand how such state-of-the-art GFMs can benefit ecology and support land management under a changing climate.
Methodology and Objectives
The latest GFM (December 2024), developed by NASA and IBM, is called Prithvi-EO-2.0 and combines multi-spectral data from millions of Landsat and Sentinel images collected over a decade. The remote-sensing and software-engineering expertise behind the Prithvi model projects is state-of-the-art, and the tools to fine-tune GFMs, including TerraTorch, are open-source (available on Hugging Face and GitHub), so there is enormous scope for this research. For example, one recent advance is the integration of multi-satellite sensor data, e.g. ICESat-2 and GEDI (Global Ecosystem Dynamics Investigation), to fine-tune the GFM to predict above-ground biomass.
The basic structure of these teaser projects is to take highly detailed UK-specific data sets, use them to fine-tune the GFM, and assess whether the outputs exceed standard modelling approaches and data sets.
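One cheap approximation of GFM fine-tuning is a linear probe: freeze the foundation model, extract embeddings for labelled samples, and train only a small classifier head on top. The sketch below uses random vectors as stand-ins for GFM embeddings and a hand-rolled logistic-regression head; it illustrates the structure of the approach, not TerraTorch's actual API.

```python
import numpy as np

# Linear probe on frozen "GFM embeddings": only the small head is trained,
# which is why little labelled data is needed. Embeddings and habitat labels
# here are synthetic stand-ins.
rng = np.random.default_rng(5)

n, d = 400, 16
X = rng.normal(size=(n, d))                  # frozen embeddings, one per patch
w_true = rng.normal(size=d)
y = (X @ w_true + 0.3 * rng.normal(size=n) > 0).astype(float)  # 2 habitat classes

# Logistic-regression head trained by plain gradient descent.
w = np.zeros(d)
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - y) / n             # gradient step on mean log-loss

accuracy = ((X @ w > 0).astype(float) == y).mean()
```

Full fine-tuning (updating the backbone as well) generally outperforms a linear probe but needs far more compute and labelled data, which is exactly the trade-off the teaser projects would explore.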
Teaser Project 1: Can fine-tuning GFMs be used for improved habitat mapping of habitat type and condition?
Accurate data on habitat type and habitat condition are crucial for making informed decisions about the UK’s land. This project will explore how GFMs could be used to improve the quality of our habitat data, enabling more informed decisions; habitat condition in particular is difficult to monitor.
Objectives:
- Fine-tuning the GFM – Can species records from structured surveys, like the National Plant Monitoring Scheme (NPMS), or the National Forest Inventory (NFI) be used to fine-tune GFMs to improve habitat assessment? Can unstructured citizen-science data also be incorporated?
- Assessing the output – How do GFM habitat classifications compare to traditional machine learning approaches such as random forest? Do they provide more insight, or different insights, to traditional approaches?
- Developing the role of GFMs in ecology – Review 1.1 and 1.2. to develop an approach for how to use GFMs in ecology, including the type of field survey data that is most beneficial for GFM fine-tuning, and the ecological use cases where GFMs are most (and least) beneficial.
Teaser Project 2: Can fine-tuned GFMs be used for detecting ecological change of habitats?
Thousands of satellite images are freely available for every site of interest in the UK. But converting this data into actionable information is complex. This project will explore the use of fine-tuning GFMs to extract useful information from these massive time-series of satellite data to better understand ecological change.
Objectives:
- Fine-tuning the GFM – Monitoring and understanding change over time is important, but data sets are limited. The first step will be to review available data sets, to see which are most suitable for detecting ecological change. The most suitable data sets will then be used to train the GFM to detect ecological change for key case study sites where known changes have occurred e.g. rewilding and other forms of habitat restoration.
- Case studies – Ecological changes are often slow, and the impact of management actions may take years to be observed. Can GFMs track changes within, or between closely related, habitats that resulted from degradation or restoration? Habitat quality is as important as quantity and is a key indicator for many national and local policies. Which indicators of ecosystem condition can GFMs predict accurately?
- Understanding the role of GFMs in ecology – Review 2.1 and 2.2 to understand how GFMs can be used to help ecologists, including identification of outlier sites for further investigation, and selection of where new labelled training data is expected to add most value to train the GFMs.
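As a toy illustration of detecting ecological change in a satellite time series, the sketch below applies a cumulative-sum (CUSUM) statistic to a synthetic vegetation-index series with a known step change. Real change detection on GFM-derived indicators would be considerably richer, but the logic of locating a shift is the same.

```python
import numpy as np

# CUSUM change-point check on a synthetic vegetation index: a step change
# (e.g. habitat restoration) makes the cumulative sum of deviations from
# the overall mean peak near the change point.
rng = np.random.default_rng(6)

n, change_at = 120, 80
ndvi = np.concatenate([
    0.55 + 0.03 * rng.normal(size=change_at),       # before restoration
    0.70 + 0.03 * rng.normal(size=n - change_at),   # after restoration
])

cusum = np.cumsum(ndvi - ndvi.mean())
est_change = int(np.argmin(cusum))   # most negative just before an upward shift
```

For slow, gradual change (rather than a step), trend tests or piecewise regression would replace the CUSUM statistic.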
References and Further Reading
- Hunt, M. L., Blackburn, G. A., Siriwardena, G. M., Carrasco, L., & Rowland, C. S. (2023). Using satellite data to assess spatial drivers of bird diversity. Remote Sensing in Ecology and Conservation, 9(4), 483-500
- Jakubik, J., Roy, S., Phillips, C.E et al. (2023). Foundation Models for Generalist Geospatial Artificial Intelligence. arXiv, 2310.18660
- Marston, C. G., O’Neil, A. W., Morton, R. D., Wood, C. M., & Rowland, C. S. (2023). LCM2021–the UK land cover map 2021. Earth System Science Data, 15(10), 4631-4649
- Szwarcman, D., Roy, S., Fraccaro, P. et al. (2024). Prithvi-EO-2.0: A Versatile Multi-Temporal Foundation Model for Earth Observation Applications. arXiv, 2412.02732
- Artificial Intelligence for Science, NASA Science
- Expanded AI Model with Global Data Enhances Earth Science Applications, NASA Science
- IBM/terratorch: a Python toolkit for fine-tuning Geospatial Foundation Models (GFMs)
GPU-accelerated multiscale modelling for glacier sliding
Project institution: University of Glasgow
Project supervisor(s): Dr Andrei Shvarts (University of Glasgow), Dr Jingtao Lai (University of Glasgow), Prof Lukasz Kaczmarczyk (University of Glasgow) and Prof Todd Ehlers (University of Glasgow)
Overview and Background
Global climate and environmental change are increasingly resulting in weather extremes that impact society and infrastructure. These extremes include stormier climates with increased wind speeds, heavier precipitation events, droughts, and temperature extremes, amongst other things. A team of University of Glasgow researchers is developing an Earth systems digital twin for exascale computing that runs on GPU computers and uses weather forecasts to predict the cascading effects of climate change events on environmental systems. Our goal is to provide predictions, at the national or large scale, of the impacts of environmental extremes on natural and urban settings. This project is one stand-alone component of that larger effort.
In this project, you will focus on the glacier-bedrock interface. Mountain glaciers are changing rapidly worldwide in response to climate change. Glacier changes affect global trends in freshwater availability, contribute to recent sea level changes, and affect regional water resources over the twenty-first century. However, uncertainties remain in projecting such impacts in future climate change scenarios. A major source of these uncertainties is the lack of understanding of glacier sliding – the relative motion between glacial ice and underlying rocks (Zoet, L. K., & Iverson, N. R., 2020). In this project, you will develop a GPU-accelerated multiscale modelling framework for glacier sliding to tackle this problem.
Your job while working on this project will involve software development for simulating the relevant multiphysical processes, applying the model to historic data for validation, working in a team/workgroup environment, attending regular research group seminars, integrating diverse environmental and satellite data into your software, and learning new techniques through ExaGEO training workshops.
Methodology and Objectives
The subglacial system consists of ice, water, and rocks. Among these components, different processes and feedbacks operate at different spatial and temporal scales, making the system a challenging computational problem to simulate. To tackle this problem, this project will adopt a multiscale modelling approach combining microscale modelling of the ice-bedrock interface with macroscale simulation of glacier dynamics and subglacial hydrology. A particular focus will be developing models for GPU architectures to enable high-resolution and scalable simulation.
Methods used in this project will involve numerical simulation using the open-source parallel finite element library MoFEM developed and supported at the University of Glasgow (Kaczmarczyk, Ł., et al, 2020).
Teaser Project 1:
This teaser project, conducted during the first year, will focus on developing a microscale model of the contact interface between glacier ice and bedrock. When the ice is compressed by its own weight against the bedrock, the roughness of both the ice and bedrock surfaces creates a complex contact problem where only isolated regions of the interface are in actual contact. Simultaneously, the remaining areas form free volumes that can be occupied by flowing or stagnant (trapped) water mixed with sediments. The sub-project will build upon a previously developed finite-element framework (Shvarts, A.G. et al., 2021) by enabling its application in a distributed-memory parallel computing environment and providing further GPU acceleration using the functionality available in the MoFEM library. Additionally, the framework will be enhanced to incorporate friction between the ice and bedrock. Using available data to calibrate the model, the extended framework will predict the shear strength of the interface as a function of various parameters, including the weight of the ice, surface roughness, sediment density, and water pressure. These predictions will be compared with existing phenomenological models of glacial sliding to refine and improve the latter.
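The flavour of the microscale contact problem can be conveyed with a classical asperity-model sketch in the Greenwood–Williamson style: summit heights drawn from a Gaussian distribution, with each contacting summit carrying a Hertzian load. All material and roughness parameters below are illustrative, and this is emphatically not the MoFEM finite-element formulation itself.

```python
import numpy as np

# Asperity-model sketch of rough contact: only summits taller than the
# current separation touch, so real contact area is a small fraction of the
# nominal area, echoing the "isolated regions of actual contact" above.
rng = np.random.default_rng(7)

n_asperities = 10000
sigma_s = 1.0e-6        # rms summit height (m) -- illustrative
R = 1.0e-4              # summit tip radius (m) -- illustrative
E_star = 1.0e8          # effective contact modulus (Pa) -- illustrative
heights = rng.normal(0.0, sigma_s, n_asperities)

def contact(separation):
    # Summits taller than the separation are compressed by delta = z - d;
    # each carries a Hertzian contact area and load.
    delta = np.clip(heights - separation, 0.0, None)
    area = np.pi * R * delta
    load = (4.0 / 3.0) * E_star * np.sqrt(R) * delta**1.5
    return area.sum(), load.sum()

# Pressing the surfaces closer increases real contact area and total load.
a_far, p_far = contact(2.0 * sigma_s)
a_near, p_near = contact(0.5 * sigma_s)
```

The finite-element framework replaces these independent-asperity assumptions with a full deformable-interface solve, which is what makes GPU acceleration worthwhile.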
Teaser Project 2:
This sub-project, also conducted during the first year, addresses the macroscale problem and focuses on the behaviour of the glacier as a whole. It will leverage the finite-element model implemented in MoFEM, constructed using available topological and geological data. The model will incorporate the results from the microscale simulations of the first sub-project, which map ice properties to interfacial shear strength, to accurately inform the macroscopic interface behaviour. Utilizing parallel computing with GPU acceleration, this sub-project will simulate large-scale glacier dynamics under varying environmental conditions, including changes in ice thickness, surface temperature, and basal water pressure. These simulations will provide critical insights into the glacier’s flow patterns, deformation, and sliding behaviour, enabling predictions of its response to climate change scenarios. The outcomes will also help validate and refine existing phenomenological models, improving their applicability to real-world glacier systems.
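At the macroscale, the simplest glacier-dynamics model is the shallow-ice approximation (SIA). The sketch below evolves a 1-D ice dome on a flat bed with an explicit finite-difference SIA step, using textbook parameter values rather than anything calibrated; the project's MoFEM model would be far more complete, but the nonlinear-diffusion structure is the same.

```python
import numpy as np

# 1-D shallow-ice approximation: ice thickness evolves as a nonlinear
# diffusion, dH/dt = -d(flux)/dx with flux = -D * dH/dx and
# D ~ H^(n+2) * |dH/dx|^(n-1) from Glen's flow law (n = 3).
n, Lx = 51, 50e3                   # grid points, domain length (m)
dx = Lx / (n - 1)
rho, g, nglen = 910.0, 9.81, 3
A = 1e-16 / 3.15e7                 # Glen rate factor, Pa^-3 s^-1 (typical)

x = np.arange(n) * dx
H = np.maximum(0.0, 1500.0 * (1 - ((x - Lx / 2) / (Lx / 2)) ** 2))  # ice dome

dt = 21600.0                       # quarter-day step (explicit stability)
for _ in range(400):               # ~100 days of flow
    dsdx = np.diff(H) / dx         # surface slope at staggered midpoints
    Hm = 0.5 * (H[1:] + H[:-1])
    D = (2.0 * A / (nglen + 2)) * (rho * g) ** nglen \
        * Hm ** (nglen + 2) * np.abs(dsdx) ** (nglen - 1)
    flux = -D * dsdx
    H[1:-1] -= dt * np.diff(flux) / dx
    H = np.maximum(H, 0.0)         # thickness cannot go negative
```

In the project, the basal boundary condition of such a model is exactly where the microscale interfacial shear-strength map from the first sub-project would enter.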
Image: Fluid flow through contact interface between a solid with a fractal rough surface and a rigid flat. Left: bulk view with colour representing contact pressure and streamlines with colour showing the fluid flux intensity. Right: interface view with colour representing the fluid pressure, contact patches are shown in grey and all trapped fluid zones are purple.
References and Further Reading
- Zoet, L. K., & Iverson, N. R. (2020). A slip law for glaciers on deformable beds. Science, 368(6486), 76–78
- Kaczmarczyk, Ł., et al. (2020). MoFEM: An open source, parallel finite element library. The Journal of Open Source Software, 5(45)
- Shvarts, A.G., Vignollet, J. and Yastrebov, V.A. (2021). Computational framework for monolithic coupling for thin fluid flow in contact interfaces. Computer Methods in Applied Mechanics and Engineering, 379, p.113738
How will climate change affect stratospheric ozone recovery in the Arctic?
Project institution: Lancaster University
Project supervisor(s): Dr James Keeble (Lancaster University), Prof Michèle Weiland (University of Edinburgh), Prof Ryan Hossaini (Lancaster University) and Dr Luke Abrahams (University of Cambridge)
Overview and Background
The stratospheric ozone layer is expected to recover over the course of the 21st century due to the controls the Montreal Protocol places on ozone depleting substances. As a result, the Montreal Protocol is considered by many to be the most successful environmental treaty of all time, and for some serves as a blueprint for how to tackle the climate crisis. However, the Arctic continues to see years with large ozone depletion, and a recent study has suggested that under future scenarios that assume large greenhouse gas emissions polar ozone depletion in the Arctic might get worse. If polar ozone depletion is worsening, this has significant implications for not just the stratosphere, regional climate, and human health, but also on how we interpret the success of the Montreal Protocol and its role as an example for other environmental policy efforts. This project will use recent advances in exascale computing and GPU hardware to model the Arctic stratosphere at unprecedented resolution in a coupled Earth system framework to address these scientific questions.
Methodology and Objectives
This project’s key scientific aim is to examine variability in Arctic stratospheric ozone and explore whether Arctic stratospheric ozone is recovering as expected. This will be achieved through examination of satellite observations and model simulations of the recent past, and high resolution, coupled Earth system modelling to explore atmospheric processes and year-to-year variability in this important region of the atmosphere. This project will use advances in exascale computing and GPU technology to run coupled Earth system models at much higher resolution and for much longer (many centuries) than has been done in the past, and to use that output, alongside other large datasets, to build new models of stratospheric ozone through emulation and machine learning processes.
Teaser Project 1: The drivers of year-to-year variability in Arctic stratospheric ozone over the recent past
This research project focuses on our understanding of the physical processes driving year-to-year Arctic ozone variability, and the impacts these have on surface climate and extreme weather. Key objectives are:
Objective 1: Examine historical Arctic ozone values using observations and model datasets from activities such as CMIP6 and CCMI-2022 to explore extreme years. Particular focus will be on the large Arctic ozone depletion events observed in the winters of 2010/11 and 2019/20, and the record high ozone levels observed in March 2024. Key questions include: are these events linked to atmospheric variability, or evidence of an emerging trend related to climate change? To what extent do models and observations agree?
This teaser project will be developed into a full PhD by using very high-resolution model simulations performed with the UKESM1 model, and exploring the following objectives:
Objective 2: Perform long, high-resolution UKESM1 simulations to further explore atmospheric processes. To what extent does higher resolution allow us to better model processes such as orographic gravity waves, dynamical asymmetries in the polar vortex, polar stratospheric cloud formation, and chemistry-climate interactions?
Objective 3: Explore the impact of extreme high and low Arctic polar ozone events on regional weather and climate in high resolution model runs, with a focus on the UK and Europe, and how this might contribute to regional climate change signals.
Teaser Project 2: Using large datasets to develop faster, computationally inexpensive projections of future ozone change in the Arctic
This research project uses the huge amount of data we have from observations and recent model intercomparison projects to develop simple models of Arctic ozone that can provide reliable projections of future ozone recovery without the need to run complex, expensive climate models. Key objectives are:
Objective 1: Analyse past and future changes to Arctic ozone in CCMI-2022 and CMIP6 model simulations to get a sense of how Arctic ozone has changed in the past and is expected to change in the future. Identify the extent to which models and observations agree.
Objective 2: Using machine learning approaches, develop a computationally inexpensive emulator trained on the multi model CMIP6 and CCMI-2022 dataset that can make projections of Arctic stratospheric column ozone under different climate states.
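In its simplest form, the emulator of Objective 2 could be a regularised regression from climate-state descriptors to column ozone. The sketch below fits a ridge regression on entirely synthetic data: the predictors, coefficients, and noise level are all invented, chosen only to show the shape of such an emulator.

```python
import numpy as np

# Toy ozone emulator: map climate-state descriptors (synthetic stand-ins for
# quantities like CO2 forcing, polar temperature, ODS loading) to Arctic
# column ozone by ridge regression on "model output".
rng = np.random.default_rng(8)

n = 300
X = np.column_stack([
    rng.uniform(350, 900, n),     # CO2-like forcing (ppm)
    rng.uniform(190, 230, n),     # polar stratospheric temperature (K)
    rng.uniform(0.5, 3.5, n),     # ozone-depleting-substance loading (arb.)
])
# Synthetic "truth": ozone falls with ODS loading, rises weakly with temperature.
ozone = 300.0 + 0.4 * (X[:, 1] - 210) - 25.0 * X[:, 2] + rng.normal(0, 5, n)

# Ridge regression with standardised features and an intercept column.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
Xd = np.column_stack([np.ones(n), Xs])
lam = 1.0
beta = np.linalg.solve(Xd.T @ Xd + lam * np.eye(4), Xd.T @ ozone)
pred = Xd @ beta
rmse = np.sqrt(np.mean((pred - ozone) ** 2))
```

A real emulator trained on CMIP6/CCMI-2022 output would use richer nonlinear learners, but the training loop, held-out evaluation, and feature standardisation carry over directly, and extending the output from a scalar column to a 3-D field (Objective 3) mainly changes the target dimension.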
This teaser project will be developed into a full PhD by developing the capabilities of the emulator model beyond 1-dimensional stratospheric ozone column projections:
Objective 3: Develop the emulator model of Objective 2 so that it can make projections of three dimensionally resolved (latitude-longitude-altitude) ozone in the Arctic stratosphere.
Objective 4: Perform high resolution UKESM1 model simulations looking at polar ozone under a range of future climate states. Key areas to explore will be (1) changes to stratospheric dynamics and temperature in response to greenhouse gas emissions, (2) the role of stratospheric water vapour increases in driving changes to polar ozone depletion, and (3) the role of large wildfires and associated soot particles as sites of heterogeneous chemistry. Use these new simulations to further refine the emulator developed in Objective 3.
Image: Total column ozone, March 2011, as simulated by the UKESM1 model.
References and Further Reading
- von der Gathen et al., Climate change favours large seasonal loss of Arctic ozone. Nature Communications, 12(1), 3886, 2021
- Polvani et al., No evidence of worsening Arctic springtime ozone losses over the 21st century. Nature Communications, 14, 1608, 2023
- Newman et al., Record High March 2024 Arctic Total Column Ozone. Geophysical Research Letters, 51, e2024GL110924, 2024
- Chapter 4 of the WMO/UNEP Scientific Assessment of Ozone Depletion: 2022
Measuring Biodiversity from volunteer generated ecological data sources
Project institution: University of Glasgow
Project supervisor(s): Prof Ana Basiri (University of Glasgow), Joseph Shingleton (University of Glasgow), Dr Stuart Sharp (Lancaster University) and Dr Lydia Bach (University of Glasgow)
Overview and Background
Citizen science has given researchers access to unprecedentedly large-scale, affordable, rich, and diverse data. However, many question the inherent biases and quality issues of community/citizen-generated data. Within the field of ecology, communities such as iNaturalist and eBird collate hundreds of millions of georeferenced species observations from users around the world.
In order to address the challenges of quantifying the quality (accuracy, completeness, representation) of these data sources, this PhD uses state-of-the-art modelling and data science techniques, along with other data sources, such as high resolution remotely sensed raster data, to build a foundational understanding of data quality within volunteered ecological data. To achieve this, advanced techniques in computer vision, animal behaviour modelling and geo-AI will be employed, leveraging the considerable computational resources available to the ExaGEO project.
Methodology and Objectives
The two teaser projects will focus on a single aspect of data quality within crowd-sourced and community-generated ecological data: the identification of repeat observations. Datasets such as iNaturalist, Movebank, and eBird provide some indication of the spatial distribution of a wide variety of animal species, with some steps taken to ensure reasonable data quality. However, there is currently no protocol for identifying multiple observations of the same individual within a species.
The two teaser projects outlined below take different and complementary approaches to estimating the likelihood that two observations of animals from the same species are in fact of the same individual. The outputs of these models may be used in downstream tasks later in the PhD project to assess the overall quality of the datasets and to quantifiably measure the reliability of biodiversity estimates derived from them.
Methods Used:
- Advanced computer vision (e.g. vision transformer models)
- Spatially explicit AI models (e.g. Graph Attention Networks)
- Animal behaviour modelling (e.g. ODEs/PDEs)
- Remotely sensed data processing/analysis (e.g. pixel- or object-oriented classification of satellite data)
- Advanced statistical analysis (e.g. Markov chain Monte Carlo (MCMC), Hidden Markov, etc.)
Teaser Project 1
The first teaser project uses advanced computer vision techniques to identify similarities between photographs of animals and distinguish between individuals of the same species. To achieve this, a set of species-specific keypoint models will be developed that can locate key identifiable features within an image (e.g. facial landmarks, joints, limbs). The relative locations of these features, along with the geo-location of the observation and other data, will then be used within an unsupervised clustering model to identify likely observations of the same individual.
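As a rough illustration of this pipeline, the sketch below normalises keypoint configurations so they can be compared across photographs, then combines that similarity with geographic proximity into a crude same-individual rule. All names, thresholds and the decision rule itself are illustrative assumptions, not project code.

```python
# Illustrative sketch only: names, thresholds and the decision rule are
# assumptions for demonstration, not project code.
import math

def normalise(keypoints):
    """Centre and scale (x, y) keypoints so comparisons are invariant to
    image translation and zoom."""
    n = len(keypoints)
    cx = sum(x for x, _ in keypoints) / n
    cy = sum(y for _, y in keypoints) / n
    scale = math.sqrt(sum((x - cx) ** 2 + (y - cy) ** 2
                          for x, y in keypoints) / n)
    return [((x - cx) / scale, (y - cy) / scale) for x, y in keypoints]

def keypoint_distance(a, b):
    """Mean Euclidean distance between two normalised keypoint sets."""
    a, b = normalise(a), normalise(b)
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def haversine_km(p, q):
    """Great-circle distance between two (lat, lon) points in km."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(h))

def same_individual(obs_a, obs_b, kp_thresh=0.2, km_thresh=5.0):
    """Crude rule: similar markings AND nearby locations."""
    kp = keypoint_distance(obs_a["keypoints"], obs_b["keypoints"])
    geo = haversine_km(obs_a["latlon"], obs_b["latlon"])
    return kp < kp_thresh and geo < km_thresh
```

A real system would replace the threshold rule with an unsupervised clustering model over learned keypoint embeddings, but the structure (shape similarity plus spatial context) is the same.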
The success of this part of the project will rely on careful consideration of the taxon/a of study. Factors such as animal physiology, data availability and quality, and image processing and analysis techniques will play an important role in deciding this. Ultimately, the researcher will decide their area of study, after careful deliberation with the supervisory team.
Teaser Project 2
The second teaser project involves the development of a spatio-temporal model for animal behaviour. A single observation of an individual animal consists of (at minimum) species information, a geolocation and a timestamp. By combining these with other data sources (e.g., land-cover data, remotely sensed data, estimated animal populations) and animal behaviour expertise, the researcher will construct a spatio-temporal model capable of estimating the likelihood that an observation of the same species at a different geo-location and time is the same individual animal.
The approach taken can use either mechanistic models (e.g. ODEs, PDEs) or statistical machine learning models (e.g. MCMC, Hidden Markov, GeoAI), or indeed may employ a combination of both (e.g. Particle/Kalman filtering, approximate Bayesian computation).
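As a hedged, minimal example of the mechanistic route, a Brownian-motion movement kernel (a stand-in for the project's eventual model; the sigma, population and area values are placeholders) gives the density of an individual's displacement after time t, and comparing it with the density of encountering any other conspecific yields a rough likelihood ratio:

```python
# Illustrative sketch, not the project's model: Brownian displacement
# kernel vs. a uniform background of other conspecifics.
import math

def displacement_density(d_km, t_hours, sigma=1.0):
    """2-D Gaussian (Brownian) displacement density at distance d after t."""
    var = (sigma ** 2) * t_hours
    return math.exp(-d_km ** 2 / (2 * var)) / (2 * math.pi * var)

def background_density(population, area_km2):
    """Density of observing some other individual, uniform over the area."""
    return population / area_km2

def same_individual_odds(d_km, t_hours, population, area_km2, sigma=1.0):
    """Likelihood ratio: same-individual kernel vs. any-other-individual."""
    return (displacement_density(d_km, t_hours, sigma)
            / background_density(population, area_km2))
```

Odds above 1 favour the same-individual hypothesis; a nearby re-sighting a day later in a sparse population scores high, while a distant one scores near zero.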
While these two projects have the same overall aim – namely, estimating the likelihood that two observations are of the same individual – their methodologies are very different. Beyond the first year of the PhD, the models created in these projects may be combined, resulting in a single model that uses positional, temporal and visual information to identify repeat observations. Later work may use this model to investigate other aspects of data quality within volunteered ecological data in even more detail, and will feed into ongoing related projects on balancing the quantity and quality of crowdsourced data.
References and Further Reading
- Lauer et al. (2022), Multi-animal pose estimation, identification and tracking with DeepLabCut, Nature Methods
- Hou et al. (2020), Identification of animal individuals using deep learning: A case study of giant panda, Biological Conservation
- Vidal et al. (2021), Perspectives on Individual Animal Identification from Biology and Computer Vision, Integrative and Comparative Biology
- Wahltinez, O. and Wahltinez, S. J. (2024) An open-source general purpose machine learning framework for individual animal re-identification using few-shot learning, Methods in Ecology and Evolution
- Laxton, M. R. et al. (2022) Balancing structural complexity with ecological insight in spatio-temporal species distribution models, Methods in Ecology and Evolution
- Karppinen, S. et al. (2022) Identifying territories using presence-only citizen science data: An application to the Finnish wolf population, Ecological Modelling
- Dorazio, R. M. and Karanth, K. U. (2017) A hierarchical model for estimating the spatial distribution and abundance of animals detected by continuous-time recorders, PLOS ONE
- Supp et al. (2021) Estimating the movements of terrestrial animal populations using broad-scale occurrence data, Movement Ecology
- iNaturalist
- Movebank
- eBird
Pace and style of glacial erosion in the Patagonian Andes
Project institution:University of GlasgowProject supervisor(s):Dr Jingtao Lai (University of Glasgow), Dr Katie Miles (Lancaster University), Dr Sarah Falkowski (University of Glasgow), Dr Sebastian Mutz (University of Glasgow) and Dr Mirjam Schaller (University of Glasgow)Overview and Background
Glacial erosion plays a critical role in the feedback mechanisms between different Earth systems. Rates and patterns of glacial erosion are controlled by climate variations, and glacial erosion can, in turn, influence the climate by modulating the carbon cycle through chemical weathering and ecosystem changes. Despite its importance, significant uncertainties remain regarding how climate affects the rates and spatial patterns of glacial erosion. The Patagonian Andes, with its broad latitudinal range and rich observational data, offers a valuable natural laboratory to address these questions. This project aims to integrate glacial landscape evolution models with thermochronology data to explore the pace and style of glacial erosion in the Patagonian Andes over the past 10 million years.
Methodology and Objectives
This project will integrate glacial landscape evolution modelling with thermochronology data to investigate glacial erosion in the Patagonian Andes. The student will use the Fastscape landscape evolution model and the Instructed Glacier Model (IGM), a glacier dynamics model that employs a Physics-Informed Neural Network (PINN) approach. Dr Lai, the project supervisor, has successfully integrated IGM with Fastscape, and the student will use this modelling framework to simulate glacial landscape evolution in the Patagonian Andes.
Low-temperature thermochronology provides valuable insights into erosion history by recording the time a rock sample takes to travel from a given depth to the Earth’s surface. In this project, the student will integrate results from the landscape evolution simulations with thermochronology data. Using the simulated evolution of glacial topography as input, the student will use the Pecube model to generate synthetic thermochronological datasets. These will be compared with existing thermochronology data from the Patagonian Andes, offering new perspectives on the region’s glacial erosion history.
The overall technical objective of this project is to develop a robust and scalable GPU-based modelling framework for landscape evolution in glacial environments. A key aspect of the project will be optimising the existing code for efficient multi-GPU simulations, enabling large-scale landscape evolution experiments. The project will also involve incorporating climate models and other Earth surface process models into this framework, including orographic precipitation, landslides, and sediment transport.
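To make the modelling concrete, a widely used form of glacial erosion law relates erosion rate to basal sliding speed, E = K·u_s^l. The sketch below is a deliberately simplified 1-D illustration of that law; the coefficient K, exponent l and the explicit update scheme are illustrative assumptions, not the Fastscape/IGM implementation.

```python
# Minimal sketch of a sliding-velocity erosion law (E = K * u_s**l).
# K and l are illustrative values, not calibrated model parameters.
def erode(elevation, sliding_speed, dt_years, K=1e-4, l=2.0):
    """One explicit step of glacial erosion over a 1-D profile.

    elevation      : list of bedrock heights (m)
    sliding_speed  : list of basal sliding speeds (m/yr), same length
    Returns a new elevation profile after dt_years.
    """
    return [z - K * (u ** l) * dt_years
            for z, u in zip(elevation, sliding_speed)]
```

Where the ice slides fastest the bed lowers most, and where it is frozen to the bed (zero sliding) no erosion occurs; the full framework couples this kind of law to IGM's ice dynamics and Fastscape's surface processes on GPUs.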
Teaser Project 1: Investigate valley-scale temporal evolution of glacial erosion rates in the Patagonian Andes
The hypothesis of a global increase in erosion rates due to the expansion of glaciation since the Late Cenozoic (~25 million years ago) remains a topic of intense debate and controversy. While modern glaciers are indeed more erosive than rivers, the response time of glacial erosion — specifically, how long elevated erosion rates persisted following the onset of glaciation — remains uncertain. In the Patagonian Andes, low-temperature thermochronology studies suggest that the onset of glaciation triggered a transient pulse of rapid erosion, followed by a gradual decline in erosion rates toward preglacial levels over response timescales spanning millions of years.
The objective of this teaser project is to understand the transient evolution of a glacial valley after the onset of glaciation and quantify the response time of glacial erosion. The student will connect glacial landscape evolution models with other Earth surface process models, including models for landslides and sediment transport. The student will focus on understanding the potential feedback mechanisms during glacial landscape evolution and exploring response times of glacial erosion in various climatic and tectonic conditions.
Teaser Project 2: Investigate regional-scale spatial patterns of glacial erosion in the Patagonian Andes
The Patagonian Andes span a broad latitudinal range, offering a unique natural laboratory to study how glacial erosion responds to varying climatic conditions. Previous research suggests that glacial erosion rates are influenced by factors such as temperature, precipitation, and the basal thermal conditions of glaciers. However, a comprehensive quantitative assessment of the climatic impact on glacial erosion is still missing. This teaser project aims to integrate glacial landscape evolution models with existing climate reconstructions to simulate the regional history of glacial erosion in the Patagonian Andes. The focus will be on evaluating how latitudinal variations in temperature and precipitation shape the spatial patterns of basal thermal regimes and glacial erosion. Additionally, the model will be coupled with an orographic precipitation model to explore feedback mechanisms between topographic evolution and climate. The simulated spatial patterns of glacial erosion will be compared with those inferred from thermochronology data, providing new insights into the climatic controls on glacial erosion.
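The basal thermal-regime logic referenced above can be illustrated with a crude conductive estimate: the bed is potentially erosive ("warm-based") where the basal temperature reaches the pressure-melting point. All constants below are rough illustrative values, not the model's actual parameterisation.

```python
# Hedged sketch of basal thermal regime; constants are rough
# illustrative values, not model parameters.
def basal_temperature(surface_temp_c, ice_thickness_m,
                      geothermal_gradient=0.025):
    """Crude conductive estimate: surface temperature warmed by the
    geothermal gradient (deg C per m) across the ice column."""
    return surface_temp_c + geothermal_gradient * ice_thickness_m

def pressure_melting_point(ice_thickness_m, beta=8.7e-4):
    """Melting point lowered by overburden pressure (~deg C per m of ice)."""
    return -beta * ice_thickness_m

def is_warm_based(surface_temp_c, ice_thickness_m):
    """Warm-based (sliding, erosive) where the bed reaches melting."""
    return (basal_temperature(surface_temp_c, ice_thickness_m)
            >= pressure_melting_point(ice_thickness_m))
```

Even this toy rule captures the latitudinal logic of the teaser project: colder, thinner ice in the south-polar-facing interior tends to be cold-based and protective, while warmer, thicker ice can slide and erode.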
References and Further Reading
- Herman, F., Seward, D., Valla, P. G., Carter, A., Kohn, B., Willett, S. D., & Ehlers, T. a. (2013). Worldwide acceleration of mountain erosion under a cooling climate. Nature, 504(7480), 423–426 (click here)
- Herman, F., De Doncker, F., Delaney, I., Prasicek, G., & Koppes, M. (2021). The impact of glaciers on mountain erosion. Nature Reviews Earth & Environment (click here)
- Jouvet, G., & Cordonnier, G. (2023). Ice-flow model emulator based on physics-informed deep learning. Journal of Glaciology, 1–15 (click here)
- Lai, J., & Anders, A. M. (2021). Climatic controls on mountain glacier basal thermal regimes dictate spatial patterns of glacial erosion. Earth Surface Dynamics, 9(4), 845–859 (click here)
- Willett, C. D., Ma, K. F., Brandon, M. T., Hourigan, J. K., Christeleit, E. C., & Shuster, D. L. (2020). Transient glacial incision in the Patagonian Andes from ~6 Ma to present. Science Advances, 6(7), eaay1641 (click here)
Scalable approaches to mathematical modelling and uncertainty quantification in heterogeneous peatlands
Project institution:University of GlasgowProject supervisor(s):Dr Raimondo Penta (University of Glasgow), Dr Vinny Davies (University of Glasgow), Prof Jessica Davies (Lancaster University), Dr Lawrence Bull (University of Glasgow) and Dr Matteo Icardi (University of Nottingham)Overview and Background
While only covering 3% of the Earth’s surface, peatlands store >30% of terrestrial carbon and play a vital ecological role. Peatlands are, however, highly sensitive to climate change and human pressures, and therefore understanding and restoring them is crucial for climate action. Multiscale mathematical models can represent the complex microstructures and interactions that control peatland dynamics but are limited by their computational demands. GPU and Exascale computing advances offer a timely opportunity to unlock the potential benefits of mathematically-led peatland modelling approaches. By scaling these complex models to run on new architectures, or by directly incorporating mathematical constraints into GPU-based deep learning approaches, scalable computing will deliver transformative insights into peatland dynamics and their restoration, supporting global climate efforts.
Teaser Project 1: Scalable Mathematical Modelling of Peatlands
Objectives: This project will explore how we can perform scalable modelling and inference on mathematical models of peatlands. The project will take existing microscale models for peatlands and look at how we can use mathematical optimisation to learn the complex parameters of the model. The focus will then be on how the model can be upscaled and improved, concentrating on computational inference methods that will remain applicable as the model is expanded and becomes more computationally demanding.
Methods: The project will use scalable mathematical processing and optimisation techniques, looking at how they compare to computational statistical inference methods such as Bayesian optimisation. The peatland model will be used for simulations and analysed to understand how it can be improved to model the complex non-linear processes. Future work as part of a potential PhD project would involve extending the model, adapting it to run in high-performance computing environments, and extending the optimisation techniques to work in this scenario.
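A minimal sketch of this calibration loop is shown below, using a toy exponential-saturation accumulation model and a brute-force 1-D search; both are stand-ins for the real microscale peatland model and the optimisation or Bayesian-optimisation machinery the project would actually use.

```python
# Toy calibration sketch: the model, data and search are illustrative
# stand-ins for the project's microscale peatland model and optimiser.
import math

def peat_depth(t_years, growth=2.0, decay=0.01):
    """Toy accumulation model: growth balanced by decay saturates depth."""
    return (growth / decay) * (1 - math.exp(-decay * t_years))

def misfit(decay, observations):
    """Sum of squared differences between model and observed depths."""
    return sum((peat_depth(t, decay=decay) - d) ** 2
               for t, d in observations)

def calibrate(observations, lo=1e-4, hi=0.1, steps=1000):
    """Brute-force 1-D search; Bayesian optimisation would replace this
    once the real model is too expensive to evaluate densely."""
    grid = [lo + (hi - lo) * i / steps for i in range(steps + 1)]
    return min(grid, key=lambda k: misfit(k, observations))
```

With the real model, each misfit evaluation is an expensive simulation, which is exactly why the project emphasises inference methods that stay feasible as the model grows.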
PhD Project: The main purpose of the PhD project will be scaling up the peatland models, adding more features and scaling them to run across computer clusters and on GPUs, with the eventual aim of extending this to Exascale computing. Advanced mathematical techniques will be used to upscale from micro- to macroscale models, incorporating nonlinear instabilities such as wrinkling and surface patterning. Computational methods will then be extended to focus on predicting long-term peatland behaviour under restoration scenarios and climatic stressors. The integration of experimental data for validation and refinement of models will ensure practical applicability.
Teaser Project 2: Mathematically Informed Machine Learning for Scalable Peatland Modelling
Objectives: This project will explore how we can use emulation techniques for scalable parameter inference and uncertainty quantification in an existing model for peatlands. We will use parallelised computing to run multiple simulations of the peatland model and then use GPU-based deep learning methods to build an emulator. The emulator will provide a computationally cheaper version of the original model, allowing us to use Bayesian inference in previously computationally infeasible scenarios and giving us the ability to estimate the model parameters and their associated statistical uncertainty.
Methods: The project will make use of deep learning architectures that are designed to be specifically scalable to GPUs and eventually Exascale-type infrastructures. Specifically, the emulation methods will use and compare deep neural networks and deep Gaussian processes to link model parameters to observed model outcomes. Optimisation and Bayesian inference will then be carried out using the emulator within the context of an inverse problem. Future work as part of a potential PhD project would involve extending these methods into more complex deep learning frameworks, e.g. physics-informed machine learning or graph neural networks.
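The emulation workflow can be sketched with a deliberately tiny surrogate: a piecewise-linear interpolant stands in for the deep NN/GP emulator, and a grid posterior stands in for full Bayesian inference. The simulator, noise level and grid are all illustrative assumptions.

```python
# Tiny stand-in for the emulation workflow: run the (expensive) simulator
# at a few design points, fit a cheap surrogate, then infer the parameter
# through the surrogate. Everything here is an illustrative toy.
import math

def simulator(theta):
    """Placeholder for an expensive peatland simulation."""
    return theta ** 2 + 0.5 * theta

def build_emulator(design_points):
    """Piecewise-linear surrogate through (theta, output) pairs."""
    pts = sorted(design_points)
    def emulate(theta):
        for (t0, y0), (t1, y1) in zip(pts, pts[1:]):
            if t0 <= theta <= t1:
                w = (theta - t0) / (t1 - t0)
                return (1 - w) * y0 + w * y1
        raise ValueError("theta outside design range")
    return emulate

def posterior(emulate, y_obs, noise_sd, grid):
    """Normalised Gaussian likelihood over a parameter grid."""
    like = [math.exp(-0.5 * ((emulate(t) - y_obs) / noise_sd) ** 2)
            for t in grid]
    z = sum(like)
    return [l / z for l in like]
```

The project's versions replace the interpolant with GPU-trained deep networks or deep Gaussian processes, and the grid with proper Bayesian inference, but the logic (cheap surrogate inside an inverse problem) is the same.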
PhD Project: A potential follow-on PhD project would focus on incorporating the mathematical models directly into the deep learning structures and linking the model to real data. This could be achieved by making the models more scalable, replacing the finite element methods with GPU-trained deep learning alternatives or with methods from the physics-informed machine learning literature. Linking the model to real data will also be computationally challenging, building either directly on the emulation methods from the initial project or on the mathematically informed machine learning methods that have been developed. Essentially, this project will aim to link the model to real-world applications that can help us gain a better understanding of the structure of peatlands.
Scalable Inference and Uncertainty Quantification for Ecosystem Modelling
Project institution:University of GlasgowProject supervisor(s):Dr Vinny Davies (University of Glasgow), Prof Richard Reeve (University of Glasgow), Prof David Johnson (Lancaster University), Prof Christina Cobbold (University of Glasgow) and Dr Neil Brummitt (Natural History Museum)Overview and Background
Understanding the stability of ecosystems and how they are impacted by climate and land use change can allow us to identify sites where biodiversity loss will occur and help to direct policymakers in mitigation efforts. Our current digital twin of plant biodiversity provides functionality for simulating species through processes of competition, reproduction, dispersal and death, as well as environmental changes in climate and habitat, but it would benefit from enhancement in several areas. The three areas this project would most likely target are the introduction of a soil layer (and the improvement of the modelling of soil water); improving the efficiency of the code to handle a more complex model and to allow stochastic and systematic Uncertainty Quantification (UQ); and developing techniques for scalable inference of missing parameters.
Teaser Project 1: Computational: Port core EcoSISTEM code to GPU
This project, led by Davies, will analyse the core CPU routines in EcoSISTEM and port them to GPU. This will use packages from the JuliaGPU ecosystem, in particular CUDA.jl, a Julia package that provides a relatively easy user interface to the NVIDIA A100 GPUs available on UG’s MARS HPC system, to which the student will have access. The main branch of the EcoSISTEM code is already efficiently parallelised for CPUs, and a preliminary assessment has suggested that the porting task should be feasible within a teaser project. This work will require support from Reeve, both for his understanding of EcoSISTEM and for his general Julia and HPC experience. This teaser project can be extended in a variety of ways to a full PhD:
On the one hand, once the GPU port speed-ups have been realised, the student can add two major new components to EcoSISTEM. First, Uncertainty Quantification of both the variability across stochastic realisations and the systematic variability from parametric uncertainty can feasibly be added to the code, to allow us to better understand the uncertainty in possible outcomes. Second, the student can investigate scalable inference techniques for parameter inference within EcoSISTEM. These are both areas in which Davies has extensive experience.
On the other hand, there is a more sophisticated development (dev) branch of EcoSISTEM that is not currently well optimised but allows greater flexibility in how interactions can occur between components of the model. Porting this to GPUs will be a significantly harder task, but will allow richer interactions between ecosystem components to be modelled more easily. Incorporating additional ecological components into the GPU port of EcoSISTEM, such as those described in the ecological teaser project below, would also be possible. These are areas where Johnson, Cobbold and Brummitt’s expertise will be critical.
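The two layers of uncertainty mentioned above can be illustrated with a toy stochastic model, sketched here in Python for brevity (the actual work would be in Julia against EcoSISTEM): variability across stochastic realisations at fixed parameters, and systematic variability from sampling the parameters themselves. All numbers are illustrative.

```python
# Toy UQ sketch: a noisy logistic-growth model stands in for an
# EcoSISTEM run; all parameter values are illustrative.
import random
import statistics

def toy_ecosystem(growth_rate, steps=50, rng=None):
    """Stochastic logistic growth; a placeholder for a full simulation."""
    rng = rng or random.Random()
    pop = 10.0
    for _ in range(steps):
        pop += growth_rate * pop * (1 - pop / 1000.0) + rng.gauss(0, 1)
    return pop

def stochastic_uq(growth_rate, n_runs=200, seed=0):
    """Spread across realisations with the parameter held fixed."""
    rng = random.Random(seed)
    runs = [toy_ecosystem(growth_rate, rng=rng) for _ in range(n_runs)]
    return statistics.mean(runs), statistics.stdev(runs)

def parametric_uq(rate_mean, rate_sd, n_runs=200, seed=1):
    """Spread when the parameter itself is uncertain."""
    rng = random.Random(seed)
    runs = [toy_ecosystem(rng.gauss(rate_mean, rate_sd), rng=rng)
            for _ in range(n_runs)]
    return statistics.mean(runs), statistics.stdev(runs)
```

On GPUs, the point is that hundreds of such realisations can run in parallel, making both kinds of ensemble cheap enough to use routinely.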
Teaser Project 2: Ecological: Incorporate key soil properties and relevant plant traits into EcoSISTEM
This project, led by Johnson, will add a preliminary model of the interaction between plant species, the species’ soil-specific traits (including fungal and microbial associations), and key soil properties, covering not just broad chemical, physical and biological measures but also soil microbiomes. It builds on existing work by Johnson, with help from Cobbold (for developing the soil-plant ecological model), Brummitt (for botanical expertise) and Reeve (for integration into EcoSISTEM). This aspect of the project can be extended in many ways: by enhancing the plant-soil modelling with additional aspects of the interaction; by improving the soil-water modelling to allow consideration of infiltration and soil water release characteristics, which are more important than the existing spot-measure of moisture content; or by porting aspects of the model to GPU and adding the other aspects (UQ, inference) referred to in the computational teaser project above. All of these extensions will require Davies’s expertise.
References and Further Reading
- Digital twins of the natural environment (click here)
- Dynamic virtual ecosystems as a tool for detecting large-scale responses of biodiversity to environmental and land-use change (click here)
- Effective extensible programming: Unleashing Julia on GPUs (click here)
- Strong phylogenetic signals in global plant bioclimatic envelopes (click here)
- Land management shapes drought responses of dominant soil microbial taxa across grasslands (click here)
Smart-sensing for systems-level water quality monitoring
Project institution:University of GlasgowProject supervisor(s):Dr Craig Wilkie (University of Glasgow), Dr Lawrence Bull (University of Glasgow), Prof Claire Miller (University of Glasgow) and Dr Stephen Thackeray (Lancaster University)Overview and Background
Freshwater systems are vital for sustaining the environment, agriculture, and urban development, yet in the UK only 33% of rivers and canals meet ‘good ecological status’ (JNCC, 2024). Water monitoring is essential to mitigate the damage caused by pollutants (from agriculture, urban settlements, or waste treatment), and while sensors are increasingly affordable, coverage remains a significant issue. New techniques for edge processing and remote power offer one solution, providing alternative sources of telemetry data. However, methods that combine such information into systems-level sensing for water are not as mature as in other application areas (e.g., the built environment). In response, procedures for computation at the edge, decision-making, and data/model interoperability are considerations of this project.
Methodology and Objectives
Initially, the student will investigate the trade-off between edge computation at the smart sensor and cloud computation. While cloud computation is powerful, it comes at a high cost for analytics and data storage. When computation is conducted on GPUs at the edge (especially preprocessing), this greatly reduces data loads (raw data are typically high-resolution) and enables analytics in near real-time. Some skills that will be developed:
- Machine learning (ML): especially embedded/TinyML, from simple novelty detection to more complex models using embedded GPU/TPUs
- Statistics: there will be a focus on statistical and interpretable ML with uncertainty quantification, to aid decision making
- Software skills, programming (Python)
At-sensor GPU computing is integral to this project: without the efficiency of GPU computation, the demanding workloads would lead to impractical power requirements at the edge.
Teaser Project 1: Tiny ML for embedded sensing
The project will develop a smart monitoring system, designed to be embedded within sensing devices (such as NVIDIA Jetson, or Google’s Coral AI). Tools will include signal processing, monitoring algorithms, or more advanced machine-learning techniques. The required data collection and analytics will be scoped with project partners, and the student will develop models/software for edge implementation using GPUs. This would likely involve building models in Python (Tensorflow, Keras, or Jax) and then converting them into an edge-AI device format (e.g. LiteRT).
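As a minimal illustration of the edge-processing pattern, the sketch below summarises a high-resolution window locally and transmits raw data only when a simple novelty score exceeds a threshold. The thresholds and window sizes are illustrative assumptions; the deployed version would be a trained model exported to an edge format such as LiteRT.

```python
# Hedged sketch of edge filtering: summarise locally, send raw data
# only when anomalous. Thresholds are illustrative assumptions.
import statistics

def edge_filter(window, baseline_mean, baseline_sd, z_thresh=3.0):
    """Return a compact summary, plus the raw window only when the
    window mean deviates from the baseline by more than z_thresh sigmas."""
    m = statistics.mean(window)
    z = abs(m - baseline_mean) / baseline_sd
    summary = {"mean": m, "max": max(window), "novel": z > z_thresh}
    return summary, (window if summary["novel"] else None)
```

Under normal conditions only the few-number summary leaves the device, which is precisely the data-load reduction that makes battery- or remote-powered deployment practical.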
Areas of focus:
- Monitoring of water quality indicators
- Monitoring of the sensing system itself (batteries, remote power generation)
- Both current and future sensing technologies
Teaser Project 2: Systems-level analysis of aggregate models and data
The second stage of this project considers how one aggregates information from smart sensors, to inform a whole water-systems analysis. At this stage, the analytics consider interconnected sensors, and how distributed information can inform systems-level decision-making. For example, as smart sensors allow for active control, the study might consider how data collection activities, power schedules, and maintenance can be modified given the ‘bigger picture’. Some relevant topics include:
- Adaptive experimental design
- Model fusion and federation
- Policy learning
- Decision analysis
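As one concrete, much-simplified instance of adaptive design, the sketch below chooses the next sensor site (or the next sensor to wake) by a space-filling max-min rule; this is a stand-in for selection by predictive uncertainty in a full statistical model, and the coordinates are illustrative.

```python
# Illustrative max-min (space-filling) design rule: a stand-in for
# adaptive design driven by predictive uncertainty.
import math

def next_sensor_site(current_sites, candidate_sites):
    """Pick the candidate maximising distance to its nearest sensor."""
    def score(c):
        return min(math.dist(c, s) for s in current_sites)
    return max(candidate_sites, key=score)
```

In a real deployment the score would come from a spatial model's predictive variance, so that sensing effort flows to where the system is least certain.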
A Hierarchy of GPU Computation for Exascale Systems
This project will develop a proof-of-concept, federated GPU computation network, where smart sensors (or alternative computing devices) are integrated as low-power accelerators within the wider exascale system. These hierarchical architectures are referred to as multi-tiered exascale systems (Navaridas et al., 2019), where a modular approach is designed to scale from the bottom up, for better flexibility. In our case, bottom-tier nodes are naturally suited to pre-processing, real-time analytics, and in-situ image processing. Within exascale systems, these sensing units must complement high-power CPUs and GPUs, where different tiers of computation are designed/conducted in view of the whole – from centralised computation to distributed GPU resources. This approach will demonstrate the potential to develop a federated network as the sensor network expands.
References and Further Reading
- JNCC (2024). UKBI – B7. Surface water status. Published here. Accessed 10th December 2024. Last updated 10th December 2024.
- Dutta, Lachit, and Swapna Bharali. “Tinyml meets IoT: A Comprehensive Survey.” Internet of Things 16, 2021 (click here)
- NVIDIA examples of embedded ML
- Navaridas, Javier, et al. “Design exploration of multi-tier interconnection networks for exascale systems.” Proceedings of the 48th International Conference on Parallel Processing. 2019
Projects with a focus on Geodynamics, Geosciences and Environmental Change:
Building the Next Generation of Numerical Landscape and Species Evolvers to Quantify Geo-Biological Linkages in Changing Environments
Project institution:University of GlasgowProject supervisor(s):Dr Paul Eizenhöfer (University of Glasgow), Prof Jason Matthiopoulos (University of Glasgow), Dr Shan Huang (University of Birmingham) and Prof Kathryn Elmer (University of Glasgow)Overview and Background
The dynamics of biodiversity are driven by a concert of biotic and abiotic factors. The complexity of their interaction remains a primary challenge when interpreting empirical data that record both biotic and environmental changes. These include habitat creation, landscape and climate change, and the rates at which they take place. The PhD project aims to develop a next-generation integrative, mechanistic model taking advantage of highly parallelised computational infrastructure and the latest GPU capabilities. This novel tool will allow the evaluation of explicit links among the evolution of landscapes, climatic change and the evolution of life, to identify key biotic responses to an ever-changing environment.
Methodology and Objectives
By integrating a range of first-order geological and biological processes within a unified modelling framework, the PhD project will investigate the fundamental question: What drives the diversification and turnover of biodiversity in evolving landscapes to eventually form present-day ecological communities?
This model is specifically designed to be driven by both geoscientific and (palaeo-)biological data. It will provide a highly transferable platform for disentangling the abiotic and biotic drivers responsible for the dynamics of biodiversity and biomass flux across the world, from deep-time history to the future.
Suggested Headings: Establishing a geobiological data standard; developing the next generation coupled landscape/species evolvers.
Methods: The PhD project will use the latest data frameworks and structures (e.g., xarray, NetCDF). Existing code originally written in Fortran, C and/or Python will need to be parallelised and made compatible with GPU-supported environments utilising optimised programming languages (e.g., Julia).
Teaser Project 1:
Teaser Project 1 (TP1) will employ artificial neural networks (ANNs) to identify spatial and temporal patterns in (palaeo-)biological and geomorphological big data for the Himalaya / Tibetan Plateau region, to drive existing landscape and species evolution models. The primary objectives are to develop new ANNs and efficient data pipelines that directly link open-access databases such as NOW, eflora: Floras of China, PANGAEA, and remote sensing physiographic data sources with coupled landscape / species evolution models. In doing so, the PhD student will establish a standardised geobiological data framework consistent with both biological and geoscientific data formats. The new data framework will be used to develop and train new ANNs, taking advantage of GPU capacities, to link biological events (e.g., the timing of floral/faunal turnovers) with geological processes (e.g., the timing of surface uplift periods). This will facilitate the evaluation of predicted against empirical geobiological data. Evolutionary parameters such as dispersal variability, mutation probability, trait variability, and competition-related parameters, inspired by a suite of target species (e.g., the conifer Cupressus gigantea across the Three Rivers region), will be tied to geological and geomorphological constraints in space and time. As a result, the inversion of coupled landscape / species evolution models against observed data, combined with ANN-driven spatial and temporal pattern recognition of empirical geobiological data, will identify abiotic root causes for the evolution of biodiversity and present-day ecological communities across the Himalaya / Tibetan Plateau region, and eventually in regions of similar physiography worldwide, such as the Andes.
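As a toy stand-in for these ANNs, the sketch below trains a single-neuron logistic classifier to map a simple geological feature (say, an uplift-rate proxy) to the probability of a recorded turnover event. The features, labels and task framing are all illustrative assumptions; the project's networks would be far richer and GPU-trained.

```python
# Toy stand-in for the pattern-recognition ANNs: logistic regression
# trained by gradient descent on synthetic features/labels.
import math

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

def train(features, labels, lr=0.5, epochs=2000):
    """features: list of feature vectors; labels: 0/1 event indicator."""
    w = [0.0] * len(features[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(features, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y  # gradient of log-loss w.r.t. the logit
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Probability that the feature vector x coincides with an event."""
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
```

The real pipelines would feed multi-variate, spatio-temporal features from the linked databases into deep networks, but the supervised link from geological drivers to biological events is the same in miniature.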
Teaser Project 2:
Teaser Project 2 (TP2) will advance existing coupled landscape / species evolution models to be used on next-generation computing infrastructure. Geo-biological linkages evolved through a long history of interacting processes, which are difficult to disentangle through a conventional correlation-based approach. Therefore, TP2 will build a mechanistic, data-driven computational modelling framework to address this issue. Most current landscape and species evolution models lack the capacity to be operated on modern high-performance, parallelised computing facilities, including those equipped with high-processing-power GPUs. This shortcoming slows the development of future, more holistic Earth System dynamics models. For example, most numerical geodynamics and climate models have already been optimised for modern computational infrastructure, underscoring the importance of TP2 achieving the same in the field of geobiology. In combination with the new data standards developed in TP1, this novel geo-bio integrative model will provide a particularly useful tool for studying biodiversity globally and for identifying major tipping points in the evolution of species under palaeoenvironmental and physiographic changes.
TP1 and TP2 will both promote new avenues of research in the field of geobiology and for evaluating the impact of habitat loss due to anthropogenic factors.
References and Further Reading
- Acevedo-Trejos, E., Braun, J., Kravitz, K., Raharinirina, N. A., & Bovy, B. (2023). AdaScape 1.0: a coupled modelling tool to investigate the links between tectonics, climate, and biodiversity. Geoscientific Model Development Discussions, 2023, 1-25
- Eizenhöfer, P. R., McQuarrie, N., Shelef, E., & Ehlers, T. A. (2019). Landscape response to lateral advection in convergent orogens over geologic time scales. Journal of Geophysical Research: Earth Surface, 124(8), 2056-2078
- Hoorn, C., Wesselingh, F. P., Ter Steege, H., Bermudez, M. A., Mora, A., Sevink, J., … & Antonelli, A. (2010). Amazonia through time: Andean uplift, climate change, landscape evolution, and biodiversity. Science, 330(6006), 927-931
- Huang, S., Meijers, M. J., Eyres, A., Mulch, A., & Fritz, S. A. (2019). Unravelling the history of biodiversity in mountain ranges through integrating geology and biogeography. Journal of Biogeography, 46(8), 1777-1791
- Irwin, D. E. (2012). Local adaptation along smooth ecological gradients causes phylogeographic breaks and phenotypic clustering. The American Naturalist, 180(1), 35-49
- Jablonski, D., & Edie, S. M. (2023). Perfect storms shape biodiversity in time and space. Evolutionary Journal of the Linnean Society, 2(1), kzad003
- Rahbek, C., Borregaard, M. K., Antonelli, A., Colwell, R. K., Holt, B. G., Nogues-Bravo, D., … & Fjeldså, J. (2019). Building mountain biodiversity: Geological and evolutionary processes. Science, 365(6458), 1114-1119
- Salles, T., Husson, L., Lorcery, M., & Hadler Boggiani, B. (2023). Landscape dynamics and the Phanerozoic diversification of the biosphere. Nature, 624(7990), 115-121
Chasing fluid pathways: using multiscale modelling of subduction zones to unravel the role of fluids and volatiles on topography
Project institution: University of Glasgow
Project supervisor(s): Dr Antoniette Greta Grima (University of Glasgow) and Dr Tobias Keller (University of Glasgow)
Overview and Background
Overview: This PhD studentship focuses on developing GPU-accelerated models of subduction dynamics and surface evolution with fluid release, volatile transport and melt dynamics, with implications for volcanic hazard and critical resource formation potential. Subduction processes, fluid release and flow, and the resulting surface response of our planet operate across scales from grain size to regional dynamics, and across timescales from quasi-instantaneous to millions of years. This is a stand-alone project that will contribute one component of a multi-scale framework of independent projects using advanced GPU-based techniques to investigate the influence of fluids on subduction and surface processes. In this project, you will focus on the interaction between fluid transport and topographic evolution on the continental overriding plate at subduction zones, with implications for magmatic eruption location, landform deformation preceding volcanic unrest, and continental break-up.
Your work will include software development, integrating and interpreting field and experimental data sets, attending regular seminars, collaborating within a research team, and receiving training through ExaGEO workshops.
Background: Subducting slabs transport altered near-surface rocks into the Earth’s mantle, introducing volatiles that sustain the deep water and carbon cycles and are crucial in generating melt and magmatic processes. Subduction links shallow and deep Earth systems, maintains conditions essential for a habitable planet (e.g., Tian et al., 2019), and at shallow depths has a critical influence on mineral resource emplacement.
Furthermore, the volatiles and fluids carried by the subducting slab influence subduction style, weaken and fracture the overriding plate, induce fluid release from the slab and into the mantle wedge, and control the location, timing, composition, and volume of arc magmatism (Nakao et al., 2016). These processes in turn govern volcanic hazards and the formation of critical metal deposits (Faccenda, 2014). Despite the key role of fluids in subduction zones, fluid release and the mechanisms controlling volatile and melt dynamics, and reactive fluid transport in subduction dynamics remain poorly understood. This project will utilise high-resolution, GPU-accelerated simulations to investigate the interaction between fluid and subduction dynamics and their expression at the Earth’s surface.
Methodology and Objectives
Modelling subduction processes is a complex scientific and computational challenge, especially when coupled with free surface deformation and fluid release and transport. This complexity stems from the interplay of multi-scale, multi-component, and multi-phase processes with feedback operating across varying timescales and rheologies. To tackle these challenges, this project employs a multiscale modelling approach that integrates small-scale reactive fluid transport and lithospheric dynamics with large-scale 2D and 3D subduction models incorporating free surface evolution.
The candidate will develop new computational tools in Julia and Python, leveraging GPU architectures and Exascale computing to couple advanced, high-resolution 2D and 3D subduction simulations with ultra-high-resolution crustal-scale models. These simulations will bridge processes operating over regional scales, and geological timescales with grain boundary-scale dynamics occurring over short timescales. Traditional CPU-based systems are insufficient for such a computationally intensive approach. GPUs enable the parallel processing necessary for adaptive mesh refinement (AMR), ensuring high resolution where intricate interactions such as those between fluid phases, rock deformation, and thermal processes occur. By utilizing Exascale capabilities, this project will be the first to dynamically couple multi-scale subduction models, capturing fine-scale details to inform system-scale dynamics while maintaining computational efficiency for large-scale simulations.
This project is part of a broader suite investigating subduction dynamics (e.g., slab-fluid interactions, fluid-fracture transport in the overriding plate). However, it stands alone in its focus on how subduction-driven reactive flow and fluid presence influence the rheology of the continental overriding plate, guide melt focusing, and shape the topographic evolution of continents.
The project begins with two introductory “teaser” projects designed to familiarise the candidate with key techniques and datasets, followed by a tailored research focus.
Teaser Project 1: GPU-Optimized Two-Phase Flow Model
Develop a GPU-optimized two-phase flow model in Julia to simulate fluid migration and solid matrix interactions in subduction zones, based on Keller & Suckale (2019). This model will leverage Exascale GPU computing to handle large, high-resolution grids and complex boundary conditions efficiently. Accelerated computation will enable parameter sweeps and real-time analysis of melt presence, mobility, and deformation patterns.
The GPU-based approach will significantly reduce simulation times, allowing systematic validation against benchmarks while exploring key parameters such as fluid mobility and viscosity contrasts. By linking microscale fluid behaviour to macroscale tectonic processes, this model will advance understanding of how subduction-driven fluid dynamics influence seismicity, magmatism, basin formation and mantle convection.
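The kind of parameter sweep described above can be illustrated with a minimal Darcy-style segregation-speed calculation, in which permeability follows an assumed power law in melt fraction. This is a toy sketch with illustrative constants and a common textbook scaling, not the Keller & Suckale (2019) two-phase formulation itself.

```python
import numpy as np

def segregation_speed(phi, k0=1e-7, n=3.0, mu=10.0, drho=500.0, g=9.81):
    """Darcy-style fluid segregation speed relative to the solid matrix,
    w = k(phi) * drho * g / (phi * mu), with a power-law permeability
    k = k0 * phi**n. All constants here are illustrative, not calibrated."""
    phi = np.asarray(phi, dtype=float)
    k = k0 * phi**n
    return k * drho * g / (np.maximum(phi, 1e-12) * mu)

# A simple CPU parameter sweep over melt fraction, of the kind that GPU
# acceleration would make feasible at high resolution and in many dimensions:
phis = np.linspace(0.01, 0.2, 20)
speeds = segregation_speed(phis)
```

For n > 1 the segregation speed grows with melt fraction, which is the basic feedback linking fluid mobility to viscosity and density contrasts mentioned above.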
Teaser Project 2: Multiphase Thermo-Mechanical Subduction Processes
Model thermo-mechanical subduction processes in 2D and 3D Cartesian geometries using ASPECT, an open-source finite element software (Heister et al., 2017). This project will incorporate visco-plastic rheology and free surface boundary conditions to simulate time-dependent thermal structures and topographic evolution. It builds on the methods of Douglas et al. (2024), parameterizing slab dehydration via phase diagrams coupled with reactive transport processes.
This teaser project will extend ASPECT’s capabilities by developing GPU-accelerated solvers for finite element assembly and solution, enhancing resolution, AMR, and computational speed. These advancements enable multi-scale simulations that capture small-scale features such as fluid migration pathways and their impact on the rheology of the overriding plate and surface deformation. This real-time coupling of fluid dynamics and deformation is critical for exploring how subduction-related stresses are influenced by dynamic fluid transport to shape topography.
References and Further Reading
- Faccenda, M. (2014). Water in the slab: A trilogy. Tectonophysics, 614, 1–30
- Heister, T., Dannberg, J., Gassmöller, R., & Bangerth, W. (2017). High accuracy mantle convection simulation through modern numerical methods – II: Realistic models and problems. Geophysical Journal International, 210(2), 833–851
- Keller, T., & Suckale, J. (2019). A continuum model of multi-phase reactive transport in igneous systems. Geophysical Journal International, 219(1), 185–222
- Nakao, A., Iwamori, H., & Nakakuki, T. (2016). Effects of water transportation on subduction dynamics: Roles of viscosity and density reduction. Earth and Planetary Science Letters, 454, 178–191
Chasing fluid pathways: using multiscale modelling of subduction zones to unravel the role of fluid dynamics on fluid flow across slab-wedge-crust domains
Project institution: University of Glasgow
Project supervisor(s): Dr Chun Hean Lee (University of Glasgow), Dr Antoniette Greta Grima (University of Glasgow), Prof Antonio J. Gil (Swansea University) and Dr Tobias Keller (University of Glasgow)
Overview and Background
Overview: This PhD studentship focuses on developing GPU-accelerated models of subduction dynamics and surface evolution, emphasising fluid release at the slab-mantle interface, volatile transport, and melt dynamics, with implications for volcanic hazard and critical resource formation potential. Subduction processes, fluid release and flow, and the resulting surface response of our planet operate across scales from grain size to regional dynamics, and across timescales from quasi-instantaneous to millions of years. This project will contribute to a multiscale framework by using advanced GPU-based techniques to better understand how fluids released at the slab-mantle interface migrate into the mantle wedge and influence subduction processes, melt generation, and interactions with the overriding crust. This flow is driven by significant pressure gradients, thermal gradients, and mechanical deformation. The study will involve phase changes (i.e., solid-to-liquid transitions) and multi-phase flow interactions between fluids and (viscoplastic) solids (plates and mantle wedge). From a modelling perspective on fluid flow across slab-wedge-crust domains, we will build upon the Smoothed Particle Hydrodynamics (SPH) methods recently developed by the supervisory team and extend them to simulate two-phase fluid-solid, multi-scale non-Newtonian flows. For large-scale simulations, these SPH developments will be integrated into the GPU-accelerated, open-source SPH code “DualSPHysics”. Importantly, these developments will be carefully linked to subduction models generated using ASPECT, from which the initial and boundary conditions for the problem will be extracted. A feedback loop between ASPECT and DualSPHysics will be established to iteratively refine the modelling approach.
This project will include software development, integrating and interpreting field and experimental data sets, attending regular seminars, collaborating within a research team, and receiving training through ExaGEO workshops.
Background: Subducting slabs transport altered near-surface rocks into the Earth’s mantle, introducing volatiles that sustain the deep water and carbon cycles and are crucial in generating melt and magmatic processes. Subduction links shallow and deep Earth systems, maintains conditions essential for a habitable planet (e.g., Tian et al., 2019), and at shallow depths has a critical influence on mineral resource emplacement.
Furthermore, the volatiles and fluids carried by the subducting slab influence subduction style, weaken and fracture the overriding plate, induce fluid release from the slab and into the mantle wedge, and control the location, timing, composition, and volume of arc magmatism (Nakao et al., 2016). These processes in turn govern volcanic hazards and the formation of critical metal deposits (Faccenda, 2014). Despite the key role of fluids in subduction zones, fluid release and the mechanisms controlling volatile and melt dynamics, and reactive fluid transport in subduction dynamics remain poorly understood. This project will utilise high-resolution, GPU-accelerated simulations to investigate the interaction between fluid and subduction dynamics and their expression at the Earth’s surface.
Methodology and Objectives
Modelling subduction processes is a challenging scientific and computational problem, especially when coupled with free surface deformation and fluid release and transport. This is due to the multi-scale, multi-component and multi-phase processes and feedbacks which operate across different timescales and variable rheologies. To tackle this complexity, this project will adopt a multiscale modelling approach combining small-scale simulations of reactive fluid transport and lithospheric dynamics with large-scale 2D and 3D subduction models with free surface evolution. These models will capture the effects of dynamic mechanisms, strong thermal gradients, phase changes, and two-phase flow, aiming to provide a reliable representation of fluid migration from the slab-mantle interface into the mantle wedge. GPU architectures will be used to couple local- and regional-scale models with adaptive refinement to ensure sufficiently high resolution at the various process scales.
In this project, the candidate will build upon the explicit Smoothed Particle Hydrodynamics (SPH) formulation for Newtonian fluid flows proposed by the supervisory team (Low et al., 2021). This formulation has demonstrated the ability to mitigate long-standing SPH numerical artefacts, such as tensile instability, through the introduction of a novel entropy-stable stabilisation method (Lee et al., 2023). The candidate will first extend the formulation to model single-phase non-Newtonian fluid flow. To improve computational efficiency, an implicit formulation will be implemented. The formulation will then be extended to two-phase flows, focussing on non-Newtonian fluids interacting with viscoplastic solid materials. To enhance the accuracy and robustness of the algorithm, thermodynamically compliant particle-shifting techniques based on the recently developed ALE framework (Lee et al., 2024) will be introduced. Nonlinear Riemann solvers (Runcie et al., 2022) will be developed to accurately resolve sharp phase transitions, as well as pressure and thermal gradients. Additionally, well-posed material models, such as thermo-viscoplasticity, valid across any coupled deformation state will be incorporated. Finally, the method will be implemented in the GPU-accelerated, open-source SPH code “DualSPHysics” to effectively handle a wide range of spatial and temporal scales. A feedback loop will be established between ASPECT (subduction models) and DualSPHysics (fluid migration) to iteratively exchange data, enabling continuous refinement of the modelling approach and ensuring consistency between the subduction models and the SPH simulations.
Within this framework, the student will start by working on two “teaser” projects to gain familiarity with the underlying theory, SPH techniques and relevant data. The student will then decide how to further develop their research, focus their efforts, and implement their work on GPUs.
Teaser Project 1:
The objective of teaser project 1 is to enable the candidate to understand the mechanics and physical behaviour of both Newtonian and non-Newtonian flows through a simple implementation. This sub-project, conducted during months 1–6 of the first year, will focus on two primary areas: (1) computational mechanics, including conservation laws (e.g., Total Lagrangian, Updated Lagrangian, and Arbitrary Lagrangian Eulerian (ALE)), interface resolution using Riemann solvers, hyperbolicity/stability, and material modelling; and (2) SPH schemes, covering stabilisations, consistency, stability, and convergence. The student will begin by working with a simple explicit SPH MATLAB code for single-phase Newtonian flows. The candidate will then extend the code to handle non-Newtonian single-phase flows using an implicit formulation. The student will also identify challenges in implementing SPH on GPUs, focussing specifically on aspects such as the scalability of the neighbour search, force calculations, and SPH kernel and gradient corrections.
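As a flavour of the SPH starting point, the sketch below implements the standard 1D cubic-spline smoothing kernel and a summation-density estimate in Python. It is a minimal illustration of the method's building blocks, not the supervisory team's MATLAB code.

```python
import numpy as np

def cubic_spline_kernel(r, h):
    """Standard 1D cubic-spline SPH smoothing kernel with support radius 2h."""
    q = np.abs(r) / h
    sigma = 2.0 / (3.0 * h)  # 1D normalisation so the kernel integrates to 1
    return sigma * np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
                            np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))

def summation_density(x, m, h):
    """SPH density estimate rho_i = sum_j m_j W(x_i - x_j, h)."""
    r = x[:, None] - x[None, :]  # all pairwise separations (O(N^2) brute force;
                                 # a neighbour search makes this scalable on GPUs)
    return (m[None, :] * cubic_spline_kernel(r, h)).sum(axis=1)

# Uniformly spaced particles of mass dx each should recover rho ~ 1 in the interior.
dx = 0.1
x = np.arange(0.0, 10.0 + dx / 2, dx)
rho = summation_density(x, np.full(x.size, dx), h=1.3 * dx)
```

The brute-force pairwise loop above is exactly the part whose scalability (neighbour search, force evaluation, kernel corrections) becomes the focus when porting SPH to GPUs.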
Teaser Project 2:
The objective of teaser project 2 is to enable the candidate to understand the capabilities of DualSPHysics in simulating multi-phase flows on CPU and GPU platforms, which will be essential for understanding flow along the slab-mantle wedge-continental crust interface. This sub-project, conducted during months 7–12 of the first year, will focus on familiarisation with the open-source SPH code “DualSPHysics”. The candidate will test and evaluate the main functionalities of the code, including the single-phase free surface fluid solver and the Newtonian/Newtonian multi-phase solver. Several benchmark test cases will be run and compared on both CPU and GPU platforms. Performance metrics, such as accuracy and computational efficiency, will be assessed to understand the advantages and limitations of GPU acceleration.
The proposed project is designed as a standalone initiative, but it includes a cohort-based learning experience for linked projects related to subduction modelling.
References and Further Reading
- Low, K., Lee, C.H., Gil, A.J., Haider, J. & Bonet, J. (2021). A parameter-free Total Lagrangian SPH algorithm applied to problems with free surfaces. Computational Particle Mechanics, 8, 859–892
- Lee, C.H., de Campos, P.R.R., Gil, A.J., Giacomini, M. & Bonet, J. (2023). An entropy-stable Updated Reference Lagrangian SPH algorithm for thermo-elasticity and thermo-visco-plasticity. Computational Particle Mechanics, 10, 1493-1531
- Lee, C.H., Gil, A.J., de Campos, P.R.R., Bonet, J., Jaugielavicius, T., Joshi, S. & Wood, C. (2024). A novel ALE SPH algorithm for nonlinear solid dynamics. CMAME, 427, 117055
- Runcie, C.J., Lee, C.H., Haider, J., Gil, A.J. & Bonet, J. (2022). An acoustic Riemann solver for large strain computational contact dynamics. IJNME, 123, 5700-5748
- Faccenda, M. (2014). Water in the slab: A trilogy. Tectonophysics, 614, 1–30
- Heister, T., Dannberg, J., Gassmöller, R., & Bangerth, W. (2017). High accuracy mantle convection simulation through modern numerical methods – II: Realistic models and problems. Geophysical Journal International, 210(2), 833–851
- Nakao, A., Iwamori, H., & Nakakuki, T. (2016). Effects of water transportation on subduction dynamics: Roles of viscosity and density reduction. Earth and Planetary Science Letters, 454, 178–191
Investigate the response of proglacial fluvial systems to glacier retreat using GPU-accelerated numerical simulations
Project institution: University of Glasgow
Project supervisor(s): Dr Amanda Owen (University of Glasgow), Dr Jingtao Lai (University of Glasgow), Prof Richard Williams (University of Glasgow) and Prof Todd Ehlers (University of Glasgow)
Overview and Background
Recent climate change has driven glacier retreat worldwide, releasing increasing volumes of sediment and meltwater into proglacial environments (Zhang et al., 2022). Combined with an increase in extreme weather events, this has triggered rapid geomorphic changes in proglacial fluvial systems (Heckmann et al., 2016). These changes pose significant risks to downstream areas, threatening infrastructure, food security, and ecological stability. Despite their importance, our understanding of 1) how proglacial rivers respond to increased meltwater and sediment influx from retreating glaciers and 2) how the glacier-river system collectively responds to long-term climate trends and short-term weather extremes remains limited.
This PhD project seeks to address these knowledge gaps through advanced GPU-based numerical simulations. By developing and coupling models for sediment dynamics and glacier evolution, the research will explore the interplay between glaciers and proglacial rivers under varying climatic and environmental conditions. The student will focus on developing and validating the numerical model, designing and conducting simulations to understand key interactions and mechanisms, and applying the model in selected field locations to provide predictions.
Methodology and Objectives
This project aims to use GPU-based numerical simulations to investigate the responses of proglacial fluvial systems to both long-term climate changes and short-term weather variations. The work will involve developing new, efficient code for simulating sediment dynamics in rivers on GPU devices and/or coupling existing sediment dynamics models with a GPU-based glacial landscape evolution model. The simulations will be validated against field observations and integrated with environmental datasets to provide robust predictions of how proglacial river systems may evolve under future climate scenarios.
Teaser Project 1: Investigate sediment dynamics and geomorphic changes in proglacial fluvial system
As glaciers continue to retreat due to climate change, their downstream river systems experience dynamic and complex adjustments in sediment transport, channel morphology, and erosional/depositional patterns. The first teaser project aims to simulate and understand the rapid geomorphic changes in proglacial fluvial systems driven by variations in upstream meltwater and sediment fluxes.
The student will build upon existing sediment dynamics models such as SPACE (Shobe et al., 2017) and Eros (Davy et al., 2017) to create a GPU-based high-resolution 2D model tailored for proglacial rivers. This model will explicitly simulate sediment entrainment, transport, and deposition processes, allowing for detailed exploration of fluvial responses to varying upstream inputs. Advanced computational techniques, including parallel processing, will be employed to ensure efficiency and scalability.
Using this model, the student will design and conduct scenario-based simulations to explore geomorphic changes in proglacial rivers under different conditions of meltwater and sediment input. A sensitivity analysis will then be performed to identify key parameters, such as slope gradients, channel geometry, and sediment grain size distribution, that influence sediment dynamics and channel evolution. Finally, the model will be calibrated and validated against field or experimental data to ensure accuracy and robustness, providing a foundation for broader application to various proglacial systems.
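For intuition on the class of erosion law that models such as SPACE and Eros build on, the sketch below applies a minimal explicit detachment-limited stream-power update to a 1D river profile. Parameter values are illustrative, and this is not the SPACE or Eros implementation (both handle 2D flow and explicit sediment entrainment and deposition).

```python
import numpy as np

def stream_power_step(z, A, dx, dt, K=1e-5, m=0.5, n=1.0):
    """One explicit step of detachment-limited stream power, dz/dt = -K A^m S^n.
    z: elevations along a 1D profile (flow to the right), A: drainage area.
    The last node is held fixed as base level. Constants are illustrative."""
    S = np.maximum((z[:-1] - z[1:]) / dx, 0.0)  # local downstream slope
    z_new = z.copy()
    z_new[:-1] -= dt * K * A[:-1]**m * S**n     # erode all nodes except base level
    return z_new

# A linear ramp profile with downstream-increasing drainage area.
dx, dt = 100.0, 100.0
z = np.linspace(100.0, 0.0, 51)
A = np.linspace(1e4, 1e7, 51)
for _ in range(100):
    z = stream_power_step(z, A, dx, dt)
```

In a GPU version, the per-node update above becomes an embarrassingly parallel kernel over the grid, which is what makes high-resolution 2D proglacial simulations tractable.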
Teaser Project 2: Investigate the coupled evolution of glacier-river system driven by climate change and weather extremes
Recent climate change and increased weather extremes significantly impact both glacier dynamics and proglacial fluvial systems. By coupling glacier and sediment dynamics models, the second teaser project seeks to understand the coevolution of retreating glaciers and proglacial river systems under these influences, providing insights into their interconnected responses to climatic and extreme weather events.
The student will couple a GPU-accelerated glacier model (IGM; Jouvet and Cordonnier 2023) with a fluvial sediment dynamics model, such as SPACE or Eros. This integrated model will be able to simulate the response of the coupled glacier-river system to climate and weather variations.
The student will design and conduct a group of simulations covering a range of climate scenarios and investigate how climate-driven glacier retreat impacts proglacial river processes. The student will analyse the simulations to explore various scenarios of meltwater and sediment flux release under different climatic conditions, identifying distinct geomorphic responses of proglacial rivers and determining whether glacier retreat results in sedimentation, erosion, or a combination of both. Finally, the model outputs will be validated against field observations or experimental data, and sensitivity analyses will be conducted to identify the primary controls on river responses, providing robust insights into the coevolution of glaciers and proglacial river systems.
References and Further Reading
- Davy, P., Croissant, T., & Lague, D. (2017). A precipiton method to calculate river hydrodynamics, with applications to flood prediction, landscape evolution models, and braiding instabilities. Journal of Geophysical Research: Earth Surface, 122(8), 1491–1512
- Heckmann, T., McColl, S., & Morche, D. (2016). Retreating ice: research in pro-glacial areas matters. Earth Surface Processes and Landforms, 41(2), 271–276
- Jouvet, G., & Cordonnier, G. (2023). Ice-flow model emulator based on physics-informed deep learning. Journal of Glaciology, 1–15
- Shobe, C. M., Tucker, G. E., & Barnhart, K. R. (2017). The SPACE 1.0 model: a Landlab component for 2-D calculation of sediment transport, bedrock erosion, and landscape evolution. Geoscientific Model Development, 10(12), 4577–4604
- Zhang, T., Li, D., East, A. E., Walling, D. E., Lane, S., Overeem, I., et al. (2022). Warming-driven erosion and sediment transport in cold regions. Nature Reviews Earth & Environment, 1–20
Multi-scale modelling of volcanoes and their deep magmatic roots: phase-field modelling of fluid exsolution during magma solidification
Project institution: University of Glasgow
Project supervisor(s): Prof Andrew McBride (University of Glasgow), Dr Tobias Keller (University of Glasgow) and Prof Lukasz Kaczmarczyk (University of Glasgow)
Overview and Background
Overview: This PhD studentship focuses on using state-of-the-art open-source scientific software to develop a GPU-accelerated finite element model of magmatic processes at the mesoscale. This is part of an effort to better understand the complex and multiscale problems of volcanic hazards and magmatic resource formation. These problems span sub-millimetre mineral-fluid-melt interactions up to kilometre-scale magma dynamics and crustal deformation. Magma is a multi-phase mixture of solids, silicate melts, and volatile-rich fluids, interacting in complex thermo-chemical-mechanical ways.
In this project, you will focus on modelling silicate crystal growth and fluid bubble formation during magma solidification to better understand the dynamics of subvolcanic magma reservoirs arising from the granular-scale coupling between fluid mechanics and chemical thermodynamics.
Your work will include model, algorithm and software development, attending regular seminars, collaborating within a research team, and receiving training through ExaGEO workshops.
Background: Volcanic eruptions originate from shallow crustal magma reservoirs built up over long periods. As magma cools and crystallizes, it releases fluid phases—aqueous, briny, or containing carbonates, metal oxides, or sulfides—whose low viscosity and density contrasts drive fluid segregation. This fluid migration can trigger volcanic unrest or concentrate metals into economically valuable deposits. The distribution of fluids—discrete droplets versus interconnected drainage networks—crucially depends on crystal and melt properties. Direct observations are challenging, so high-fidelity, GPU-accelerated simulations will help us understand these complex and dynamic systems. The phase-field finite element method (Steinbach & Salama, 2023) will be used to simulate the melt process at the mesoscale. We will build on developments in the Glasgow-born open-source code MoFEM.
Methodology and Objectives
The focus of this project is at the mesoscale and forms part of an initiative in ExaGEO to link disparate length and time scales via (i) direct numerical simulations of granular-scale phase interactions, (ii) deep-learning-based computational homogenisation to extract effective properties and constitutive relations, and (iii) system-scale mixture continuum models applying these relations to problems. All components leverage GPU-accelerated computing to handle direct simulations at local scales, train effective constitutive models, and achieve sufficient resolution at the system scale.
The objective of this project is to develop robust and scalable algorithms and methods for phase-field modelling of the magma solidification process. In this project you will:
- Develop a discontinuous Galerkin finite element framework for the (spatial) approximation of the Cahn-Hilliard equations for single and multiple species and phases.
- Implement implicit-explicit time integrators.
- Develop and implement error indicators to drive mesh adaptivity to accurately resolve interphases.
- Couple the phase-field problem with buoyancy-driven flow (Boussinesq equations).
- GPU-ready code optimisation, solver and preconditioner selection.
- Representative volume element simulation in 3D with periodic boundaries.
- Calibration of the chemical potential / Gibbs free energy and interfacial energy for silicate magma systems using the MAGEMin software (Riel et al., 2022).
- Test with a single mineral phase (olivine) growing from a silicate melt of idealised basaltic composition in (SiO2+Al2O3+MgO+FeO+CaO) compositional space.
- Progress to two mineral phases (olivine+plagioclase) co-precipitating from basaltic melt (SiO2+Al2O3+MgO+FeO+CaO+Na2O).
- Systematically study the growth of chemical zonation and the mixing of chemically heterogeneous melt as crystals grow in evolving residual melt under imposed cooling/reheating conditions.
- Finally, add H2O with a P-dependent saturation point and the exsolution of H2O-fluid droplets during growth of anhydrous mineral phases.
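The core phase-field ingredient in the objectives above, the Cahn-Hilliard equation, can be illustrated with a minimal 1D explicit finite-difference solver. This is a sketch with an idealised double-well free energy and illustrative parameters, not the MoFEM discontinuous Galerkin formulation or a MAGEMin-calibrated energy that the project will develop.

```python
import numpy as np

def laplacian(f, dx):
    """Second-difference Laplacian with periodic boundaries."""
    return (np.roll(f, 1) - 2.0 * f + np.roll(f, -1)) / dx**2

def cahn_hilliard_step(c, dx, dt, M=1.0, kappa=0.5):
    """One explicit Euler step of dc/dt = M * lap(mu), with chemical potential
    mu = c**3 - c - kappa * lap(c) from an idealised double-well free energy."""
    mu = c**3 - c - kappa * laplacian(c, dx)
    return c + dt * M * laplacian(mu, dx)

# Spinodal decomposition from a small random perturbation; mean composition is
# conserved exactly because the update is the Laplacian of a potential (a flux form).
rng = np.random.default_rng(0)
c = 0.05 * rng.standard_normal(128)
c0_mean = c.mean()
for _ in range(2000):
    c = cahn_hilliard_step(c, dx=1.0, dt=0.01)
```

The small stable time step of this explicit scheme (set by the fourth-order term) is precisely why the project pursues implicit-explicit integrators, adaptivity, and GPU-ready solvers.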
Within this framework, you will start by working on two “teaser” projects to gain familiarity with different techniques and data, then choose how to further develop and focus your research. These projects will benefit greatly from the elective training offered in ExaGEO.
Teaser Project 1:
This teaser project, conducted over the first year, will focus on solving the Cahn-Hilliard equation using traditional mixed continuous Galerkin finite element approaches. The purpose is to gain familiarity and confidence with the phase-field method and the MoFEM library by implementing a series of benchmark problems. You will explore the use of error indicators to assess the quality of the approximate solution, together with hp adaptivity and refinement and coarsening strategies to compute the solution to a defined accuracy for a given computational cost. High-order spatial approximation (p) is a key ingredient of the matrix-free methods that underpin modern GPU-aware finite element algorithms. You will review and implement the framework for a matrix-free solver in MoFEM following, e.g., DeWitt et al. (2020).
A key part of the teaser project will be to develop an appreciation of the mathematical foundations of the finite element method.
Teaser Project 2:
This teaser project, conducted over the first year, will focus on the discontinuous Galerkin (DG) finite element method. DG relaxes the classical continuity requirements of the finite element method and is well suited to GPUs. You will begin with a simple second-order partial differential equation (i.e., Poisson’s problem) and compare various DG formulations against classical continuous Galerkin approximations. Once your DG method is validated, you will begin exploring how to exploit DG for the more complex Boussinesq problem.
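As a baseline for such comparisons, the sketch below solves Poisson's problem in 1D with classical continuous Galerkin linear elements in plain NumPy. It is illustrative only; the project itself will work in MoFEM with DG discretisations.

```python
import numpy as np

def poisson_cg_1d(n_elems, f=1.0):
    """Classical continuous Galerkin (linear elements) for -u'' = f on (0, 1)
    with u(0) = u(1) = 0: the baseline against which DG variants are compared."""
    h = 1.0 / n_elems
    n = n_elems - 1                         # number of interior nodes
    # Assembled tridiagonal stiffness matrix (1/h) * tridiag(-1, 2, -1):
    K = (np.diag(np.full(n, 2.0)) + np.diag(np.full(n - 1, -1.0), 1)
         + np.diag(np.full(n - 1, -1.0), -1)) / h
    b = np.full(n, f * h)                   # consistent load vector for constant f
    u = np.linalg.solve(K, b)
    return np.concatenate(([0.0], u, [0.0]))  # append Dirichlet boundary values

# For f = 1 the exact solution is u(x) = x(1 - x)/2; linear elements with a
# consistent load reproduce it exactly at the nodes.
u = poisson_cg_1d(16)
x = np.linspace(0.0, 1.0, 17)
```

A DG version replaces the globally coupled, continuous basis with element-local polynomials glued by numerical fluxes, trading a larger system for the element-local structure that maps well onto GPUs.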
These two teaser projects will provide you with a solid grounding to tackle the main project.
References and Further Reading
- Steinbach, I., & Salama, H. (2023). Lectures on phase field. Springer
- Wells, G. N., Kuhl, E., & Garikipati, K. (2006). A discontinuous Galerkin method for the Cahn–Hilliard equation. Journal of Computational Physics, 218(2), 860–877
- Riel, N., Kaus, B.J., Green, E.C.R. and Berlie, N., 2022. MAGEMin, an efficient Gibbs energy minimizer: application to igneous systems. Geochemistry, Geophysics, Geosystems, 23(7), p.e2022GC010427
- DeWitt, S., Rudraraju, S., Montiel, D. et al. (2020). PRISMS-PF: A general framework for phase-field modeling with a matrix-free finite element method. npj Comput Mater 6, 29
Statistical Emulation Development for Landscape Evolution Models
Project institution: University of Glasgow
Project supervisor(s): Dr Benn Macdonald (University of Glasgow), Dr Mu Niu (University of Glasgow), Dr Paul Eizenhöfer (University of Glasgow), and Dr Eky Febrianto (University of Glasgow)
Overview and Background
Many real-world processes, including those governing landscape evolution, can be effectively described mathematically via differential equations. These equations describe how processes, e.g. the physiography of mountainous landscapes, change with respect to other variables, e.g. time and space. Conventional approaches to statistical inference involve repeatedly solving the equations numerically: every time the parameters of the equations are changed within a statistical optimisation or sampling procedure, the equations must be re-solved numerically. The associated large computational cost limits advancement when scaling to more complex systems, the application of statistical inference and machine learning approaches, and the implementation of more holistic approaches to Earth System science. This creates the need for an accelerated computing paradigm involving highly parallelised GPUs for evaluating the forward problem.
Beyond advanced computing hardware, emulation is becoming a more popular way to tackle this issue. The idea is that first the differential equations are solved as many times as possible and then the output is interpolated using statistical techniques. Then, when inference is carried out, the emulator predictions replace the differential equation solutions. Since prediction from an emulator is very fast, this avoids the computational bottleneck. If the emulator is a good representation of the differential equation output, then parameter inference can be accurate.
Methodology and Objectives
Methods Used: Gaussian process interpolation (for building the emulator), Bayesian inference (for parameter inference), geomorphological analyses, surface processes modelling.
Teaser Project 1: GPU-accelerated differential equation solver
Geodynamic models in Earth Science are used to simulate a range of natural processes. Landscape evolution models specifically contain, amongst others, equations that describe surface processes such as erosion and sediment deposition, as well as rock/surface uplift and aspects of climate change. However, the numerical solver executes sequentially, rather than generating solutions in parallel. This first teaser project will commence at the beginning of the PhD project (semester 1) and will focus on familiarising the student with parallel computing via GPUs, including the optimisation of existing landscape evolution models for GPU use. At the same time, the student will take training from ExaGEO, equivalent to 20 UoG credits, in GPU programming and exascale principles. This teaser project will support the PhD project in developing robust, reliable and efficient emulators for landscape evolution models, utilising GPU power, which will allow for a denser training set and the inclusion of a broader variety of geomorphological scenarios. It will also give insight into possible GPU acceleration of the emulation process itself.
Teaser Project 2: Emulator development
The second teaser project will look at creating an emulator for a simple mathematical model describing elevation change as a function of spatial and temporal variations in surface uplift and the efficiency of erosion. This will take place in semester 2, during which the student will also undergo ExaGEO training in statistical and numerical methods in computing, complementing the student’s research aims at this stage. The skills the student develops during this teaser project, combined with those attained from teaser project 1, will set them up well to develop efficient emulators for more complex landscape evolution models as the PhD project evolves.
The student will be well supported by the supervisory team. Dr Eizenhöfer has expertise in landscape evolution modelling and reconstruction; Dr Macdonald and Dr Niu have expertise in developing statistical methodology in the area of statistical emulation; and Dr Febrianto has expertise in highly parallelised architectures for scientific computing and will be able to advise on software development and design with an open-source vision, as well as on aspects of the GPU software development.
Image: Landscape evolution model of Central Nepal including its range of input parameter types.
References and Further Reading
- Rasmussen, C.E., & Williams, C.K.I. (2006). Gaussian Processes for Machine Learning. The MIT Press. ISBN 0-262-18253-X
- Donnelly, J., Abolfathi, S., Pearson, J., Chatrabgoun, O., & Daneshkhah, A. (2022). Gaussian process emulation of spatio-temporal outputs of a 2D inland flood model. Water Research. Volume 225. ISSN 0043-1354
- Clark, M. K., Royden, L. H., Whipple, K. X., Burchfiel, B. C., Zhang, X., & Tang, W. (2006). Use of a regional, relict landscape to measure vertical deformation of the eastern Tibetan Plateau. Journal of Geophysical Research: Earth Surface, 111(F3)
- Eizenhöfer, P. R., McQuarrie, N., Shelef, E., & Ehlers, T. A. (2019). Landscape response to lateral advection in convergent orogens over geologic time scales. Journal of Geophysical Research: Earth Surface, 124(8), 2056-2078
- Mutz, S. G., & Ehlers, T. A. (2019). Detection and explanation of spatiotemporal patterns in Late Cenozoic palaeoclimate change relevant to Earth surface processes. Earth Surface Dynamics, 7(3), 663-679
- Whipple, K. X., Forte, A. M., DiBiase, R. A., Gasparini, N. M., & Ouimet, W. B. (2017). Timescales of landscape response to divide migration and drainage capture: Implications for the role of divide mobility in landscape evolution. Journal of Geophysical Research: Earth Surface, 122(1), 248-273
- Whittaker, A. C., & Boulton, S. J. (2012). Tectonic and climatic controls on knickpoint retreat rates and landscape response times. Journal of Geophysical Research: Earth Surface, 117(F2)
- Yang, R., Willett, S. D., & Goren, L. (2015). In situ low-relief landscape formation as a result of river network disruption. Nature, 520(7548), 526-529
- Zachos, J. C., Dickens, G. R., & Zeebe, R. E. (2008). An early Cenozoic perspective on greenhouse warming and carbon-cycle dynamics. Nature, 451(7176), 279-283
Towards exa-scale simulations of slabs, core-mantle heterogeneities and the geodynamo
Project institution: University of Glasgow
Project supervisor(s): Prof Radostin Simitev (University of Glasgow), Dr Antoniette Greta Grima (University of Glasgow) and Dr Kevin Stratford (University of Edinburgh)
Overview and Background
Scientific computing is crucial for understanding geophysical fluid flows, such as the geodynamo that sustains Earth’s magnetic field. This project will adapt an existing pseudo-spectral geodynamo code for magnetohydrodynamic simulations in rotating spherical geometries to GPU architectures, improving efficiency on modern computing systems and enabling simulations of more realistic regimes. This will advance our understanding of Earth’s geomagnetic field and its broader interactions, such as those with mantle heterogeneities. Evidence from seismology and geodynamics shows that the core-mantle boundary (CMB) is highly heterogeneous, influencing heat transport and geodynamo dynamics. By combining compressible, thermochemical convection with geodynamo simulations, this project will further investigate how deep slab properties affect the CMB heat flux, mantle heterogeneity, and the geodynamo.
Teaser Project 1: What is the impact of ancient slabs on core-mantle boundary heterogeneities and the geodynamo?
Evidence from seismology and geodynamics reveals that the lowermost mantle and the core-mantle boundary (CMB) are highly heterogeneous due to the presence of post-perovskite, large low shear wave velocity provinces and ancient, subducted slab material. CMB heterogeneity results in variable heat transport from the core and plays a key role in core and mantle dynamics, the geodynamo, and ultimately the Earth’s habitability. Previous work shows that the spatiotemporal evolution of CMB heterogeneity is closely linked to deep slab dynamics (e.g., Heron et al., 2024, 2025); however, these dynamics remain poorly understood. This teaser project will investigate the role of deep slab properties in the temporal evolution of deep mantle heterogeneity, the CMB heat flux and the geodynamo. This will involve modelling compressible, multiphase, thermochemical convection in a 3D spherical shell following the approach of Dannberg et al. (2024) and Heron et al. (2024, 2025), using the state-of-the-art, open-source, adaptive-mesh-refinement finite element software ASPECT (Heister et al., 2017). These models will include the subduction history over the last 1 billion years from Merdith et al. (2021) and will be supported by high-resolution 3D regional models investigating the role of end-member slab properties (e.g., weak vs. strong slabs) on CMB heterogeneity. Temporal variations in CMB heat flux from these models will then be analysed using spherical harmonics across the first 4 harmonic degrees, similar to the approach of Dannberg et al. (2024), and used as a thermal boundary condition for the geodynamo simulations. The goal is to expand teaser project 1 to investigate the influence of the deep slab on core-mantle dynamics and the implications this has for magnetic field generation and the strength and frequency of polarity reversals.
Objectives:
- Use global convection models to calculate the temporal evolution of heat flux at the CMB.
- Investigate the influence of end member slab rheologies and geometries on the heat flux heterogeneity at the CMB.
- Apply the calculated heat flux across the CMB from geodynamic models as a boundary condition to geodynamo simulations to investigate heterogeneity in magnetic field strength and the timing and frequency of magnetic field reversals.
- Use GPU architecture to couple finite element mantle convection with geodynamo simulations.
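To illustrate the spherical-harmonic analysis step in its simplest (zonal, m = 0) form, the sketch below projects an axisymmetric heat-flux pattern onto Legendre polynomials up to degree 4 using Gauss–Legendre quadrature. A real analysis would include all orders m at each degree; the synthetic pattern here is our own.

```python
import numpy as np
from numpy.polynomial import legendre

# Gauss-Legendre nodes/weights in x = cos(theta); 32 points integrate the
# degree <= 8 polynomial products below exactly.
x, w = legendre.leggauss(32)

# Synthetic axisymmetric heat-flux pattern containing exactly degrees 0-4.
true_coeffs = np.array([1.0, 0.3, -0.5, 0.1, 0.2])
f = legendre.legval(x, true_coeffs)

# Project back: a_l = (2l + 1)/2 * integral of f(x) P_l(x) over [-1, 1].
recovered = np.array([
    (2 * l + 1) / 2.0 * np.sum(w * f * legendre.legval(x, np.eye(5)[l]))
    for l in range(5)
])
```

Because the quadrature is exact for these degrees, the projection recovers the input coefficients to machine precision, which is a useful correctness check before applying the same machinery to model output.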
Teaser Project 2: Spectral expansion transforms in spherical geometry
Modelling the geodynamo involves solving the coupled 3D, time-dependent, nonlinear Navier-Stokes equations, pre-Maxwell electrodynamics, and heat transfer equations for a rotating fluid. At present, the pseudo-spectral method is the most accurate and widely used numerical discretisation method in this context. The method requires applying physical to spectral space transforms which are generally in integral form and have been difficult to adapt to GPU architectures. With GPUs becoming increasingly powerful and accessible, this sub-project aims to port an existing versatile pseudo-spectral code for magnetohydrodynamic simulations in rotating spherical geometries to GPU systems.
Objectives:
- Investigate alternative orthogonal polynomial basis function families that can be used to expand fields in spherical geometry, including Legendre, Jones-Worland, Jacobi and Galerkin.
- Implement alternatives in and assess/compare convergence, stability and consistency of the resulting discretisations as well as their efficiency for GPU acceleration.
Figure 1: Simulation of magnetic field generation by thermal convection in a rotating and electrically conducting fluid. From Silva, L., Gupta, P., MacTaggart, D., & Simitev, R. (2020). Effects of Shell Thickness on Cross-Helicity Generation in Convection-Driven Spherical Dynamos. Fluids, 5(4), 245.
Figure 2: The evolution of the CMB heat flux from Dannberg et al., 2024, from 800 Myr ago (top) to present (bottom). Blue colours indicate a large heat flux into the mantle and represent cold CMB regions; yellow colours indicate low heat flux into the mantle and represent hot CMB regions such as LLVPs or plumes. The rightmost column shows the location of the plates in grey lines, continents in green and subduction zones in black, from the plate reconstruction of Merdith et al., 2021.
References and Further Reading
- Dannberg, J., Gassmoeller, R., Thallner, D., LaCombe, F., & Sprain, C. (2023). Changes in core-mantle boundary heat flux patterns throughout the supercontinent cycle. arXiv preprint arXiv:2310.03229
- Roberts, P.H., & King, E.M. (2013). On the genesis of the Earth’s magnetism. Reports on Progress in Physics, 76, 096801
- Glatzmaier, G.A. (2014). Introduction to Modeling Convection in Planets and Stars: Magnetic Field, Density Stratification, Rotation. Princeton University Press
- Heister, T., Dannberg, J., Gassmöller, R., & Bangerth, W. (2017). High accuracy mantle convection simulation through modern numerical methods – II: Realistic models and problems. Geophysical Journal International, 210(2), 833–851
- Heron, P.J., Dannberg, J., Gassmöller, R., Shephard, G.E., & Pysklywec, R.N. (2025). The impact of Pangaean subducted oceans on mantle dynamics: passive piles and the positioning of deep mantle plumes. Gondwana Research
- Heron, P.J., Gün, E., Shephard, G.E., Dannberg, J., Gassmöller, R., Martin, E., Sharif, A., Pysklywec, R.N., Nance, R.D., & Murphy, J.B. (2025). The role of subduction in the formation of Pangaean oceanic large igneous provinces. Geological Society London, Special Publications, 542(1)
- Merdith, A.S., Williams, S.E., Brune, S., Collins, A.S., & Müller, D.R. (2021). Extending full-plate tectonic models into deep time: linking the Neoproterozoic and the Phanerozoic. Earth-Science Reviews, 214
- Silva, L., & Simitev, R. (2018). Pseudo-spectral code for numerical simulation of nonlinear thermo-compositional convection and dynamos in rotating spherical shells. Zenodo, 1311203
Projects with a focus on Geologic Hazard Analysis, Prediction and Digital Twinning:
Developing large-scale hydrodynamic flood forecasting models for exascale GPU systems
Project institution: University of Edinburgh
Project supervisor(s): Dr Mark Bull (University of Edinburgh), Dr Maggie Creed (University of Glasgow), Prof Simon Mudd (University of Edinburgh) and Dr Declan Valters (British Geological Survey)
Overview and Background
Flood forecasting at regional and national scales is imperative for predicting the scale and distribution of floodwaters during extreme weather events, mitigating the impact on the communities most at risk from flooding. The LISFLOOD family of surface water models has proved amenable to parallelisation at scale, allowing research and forecasting communities to take advantage of the previous generation of supercomputers, such as ARCHER.
The increasing availability of high resolution topographic and meteorological data provides an opportunity to extend the capability of the LISFLOOD modelling framework to produce large-scale or high resolution flood forecasts at operational timescales – i.e., producing model runs at sufficient lead-in times to alert communities to impending flood risk from forecasted extreme weather events. GPU-based exascale HPC systems provide the technological basis to develop forecast models delivering at operational timescales.
Methodology and Objectives
LISFLOOD is a family of hydrological models based on a 2D grid simulating rainfall-runoff. The water routing across a flood basin/river catchment is based on a simplified version of the shallow water (St Venant) equations. The model is process (physics) based, and there have been several implementations (see below), usually in C or C++, using a cellular automaton approach. These have been parallelised for CPU using OpenMP and, in one spin-off project, MPI.
The stencil-code library used in the previous CSE project, LibGeoDecomp, purports to support NVIDIA GPUs and CUDA.
Teaser Project 1:
- Implement the hydrodynamic core of the LISFLOOD model on GPU hardware to demonstrate proof-of-concept that the current CPU parallelised code is portable to GPU hardware.
- Methods for GPU parallelisation would include OpenMP offloading as an initial approach to verify the proof of concept. The project could then be extended to investigate the CUDA bindings available in the LibGeoDecomp library.
Teaser Project 2:
- Profile, then optimise, the GPU-ported code and test it using case studies of UK extreme flood events, to indicate the potential for near-realtime flood forecasting with the GPU-enabled LISFLOOD code.
- This objective would aim to deliver a minimum working example of the GPU-ported flood model that produces forecasts/re-analyses of a historic UK flood event within an operational timescale.
Development into a full PhD would involve further profiling and optimisation of the GPU code using either the LibGeoDecomp library or another suitable GPU parallelisation framework. Delivering a proof-of-concept working flood forecast model at a regional scale would be a key aim of this project, demonstrating the potential for use in operational flood forecasting systems. The full PhD may therefore look at workflow tools to integrate the various stages of forecast production, such as ingestion and pre-processing of data (e.g. from rainfall forecast/nowcasting data products), model scheduling on HPC systems, and post-processing of the outputs.
Image: Rendering of total flood induced erosion and deposition of riverbed material during a flash flood event in the Rye catchment, North Yorkshire, UK, used as a case-study when testing the development of the LISFLOOD model. Image source: Declan Valters, British Geological Survey.
References and Further Reading
- Coulthard, T.J., Neal, J.C., Bates, P.D., Ramirez, J., de Almeida, G.A. and Hancock, G.R., 2013. Integrating the LISFLOOD‐FP 2D hydrodynamic model with the CAESAR model: implications for modelling landscape evolution. Earth Surface Processes and Landforms, 38(15), pp.1897-1906
- LISFLOOD model high-level overview
- Stencil code for LibGeoDecomp
- Open-source version of the C++ code developed by Declan Valters
- Overview of an earlier project that developed an experimental multi-node (CPU) version of the code using stencil code
Earth system twin for landscape evolution processes
Project institution: University of Glasgow
Project supervisor(s): Prof Todd Ehlers (University of Glasgow), Dr Jingtao Lai (University of Glasgow), Prof Lukasz Kaczmarczyk (University of Glasgow) and Dr Adam Smith (University of Glasgow)
Overview and Background
Global climate and environmental change increasingly result in weather extremes that impact society and infrastructure. These extremes include stormier climates with increased wind speeds, precipitation events or drought, and temperature extremes (amongst other things). A team of University of Glasgow researchers are developing an Earth systems digital twin for exascale computing that works on GPU computers and uses weather forecasts to predict the cascading effect of climate change events on environmental systems. Our goal is to provide predictions at the national or larger scale for the impacts of environmental extremes on natural and urban settings. This project is one stand-alone component of this larger-scale project.
In this project you will develop and apply a landscape evolution model component of the Earth system model. We seek a student interested in surface water hydrology and landscape evolution modelling of rivers and hillslopes across Scotland. The student will develop software for investigating how weather forecasts and extreme weather events interact with geomorphic, hydrologic, and biosphere processes. Students from diverse backgrounds (e.g., geo- or hydrological sciences, engineering, maths, computer science) are welcome to apply to this project. The supervision team will take your background into account when setting the dissertation goals and provide mechanisms to learn the background information needed to fill in knowledge gaps.
Your job while working on this project will involve software development for simulating the relevant physical processes, applying the model to historic data for model evaluation, working in a team/workgroup environment, attending regular research group seminars, integrating diverse environmental and satellite data into your software, and learning new techniques through ExaGEO training workshops.
Methodology and Objectives
Methods used in the first year of this project include the development of a GPU-based numerical model that calculates surface water budgets (runoff, infiltration, etc.) and applies the model to understand erosion, transport, and deposition of sediments as a function of fluvial and hillslope processes. The model will use meteorological forecasts, digital topography, vegetation cover, and soil/rock cover as inputs and will forecast river discharge and erosion/sedimentation. The final years of the project involve improving the model to incorporate different environmental data, such as remote sensing data for land use, biota, and hydrology.
Teaser Project 1:
This teaser project, conducted in the first year, will focus on development of a GPU-based flow routing algorithm for application to Digital Elevation Model (DEM) data. The focus of the project is on understanding how precipitation that falls on a landscape during different weather events will influence the amount and rate of water moving over that landscape and the resulting river discharge. The calculation of the overland flow of water and river discharge is important for understanding (see teaser project 2) what types of rainfall events lead to the mobilisation of sediment and river transport, or saturation of hillslope regolith and mass wasting (landslide) events. This project will be done for large geographic regions and optimised for domain decomposition on a GPU cluster. Open-source (non-GPU) software exists for addressing this problem and can provide a template for development of a GPU-based version.
Initial efforts will focus on the identification of catchment boundaries and calculation of river runoff for different spatial and temporal distributions of precipitation. Time permitting, additional components of the hydrologic cycle will be added including infiltration rates as a function of different soil types, and evaporation, evapotranspiration, and snowpack melting processes. After implementing one or more of the previous configurations, the program will be applied to past meteorological events in Scotland and compared to observed river discharge.
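A minimal sketch of the kind of flow-routing building block involved is D8 steepest-descent drainage directions followed by flow accumulation — shown here in plain loop-based NumPy purely to fix ideas; a GPU version would recast this with parallel-friendly data structures. The function name and tilted-plane test grid are our own.

```python
import numpy as np

def d8_accumulation(z):
    """D8 flow accumulation: each cell drains to its steepest-descent neighbour;
    contributing areas are summed by visiting cells from highest to lowest."""
    rows, cols = z.shape
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]
    target = -np.ones((rows, cols, 2), dtype=int)   # receiver cell of each cell
    for i in range(rows):
        for j in range(cols):
            best = 0.0
            for dr, dc in offs:
                r, c = i + dr, j + dc
                if 0 <= r < rows and 0 <= c < cols:
                    slope = (z[i, j] - z[r, c]) / np.hypot(dr, dc)
                    if slope > best:
                        best, target[i, j] = slope, (r, c)
    acc = np.ones((rows, cols))                     # each cell contributes itself
    for idx in np.argsort(z, axis=None)[::-1]:      # high ground first
        i, j = divmod(idx, cols)
        r, c = target[i, j]
        if r >= 0:
            acc[r, c] += acc[i, j]
    return acc

# Plane tilted down toward column 0: every cell drains straight left,
# so column 0 accumulates one full row each.
z = np.add.outer(np.zeros(5), np.arange(6, dtype=float))
acc = d8_accumulation(z)
```

Processing cells in descending elevation order guarantees every upstream contribution is complete before a cell passes its total downstream, which is the serial dependency a GPU implementation has to restructure.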
Teaser Project 2:
This teaser project, also conducted in the first year, focuses on development of GPU software to calculate how water flowing over landscapes (overland flow) and in rivers (discharge) entrains and erodes the underlying soil, sediment, or bedrock. This project is important because projected climate change will result in more intense rainfall events that could lead to increased erosion rates and higher sediment concentrations in rivers. For example, increased soil erosion removes nutrients needed by the biosphere and impacts agricultural practices; too much soil erosion could therefore impact biodiversity and food security. At the start of this project you will work through learning tutorials from existing (non-GPU) software to acquire an overview of the ‘big picture’ of processes you will address. Initial development efforts will focus on calculating the shear stress exerted by different amounts and velocities of water moving over a digital elevation model. These calculations will be used to determine, for different intensities of rainfall, how much sediment and rock is entrained in the flow and moved downslope. The goal is the fast and efficient calculation of erosion rates across a landscape for different distributions of precipitation. Time permitting, the next steps would include consideration of detachment- and transport-limited conditions within the model and identification of where and when erosion or deposition occurs. Additional factors that can be taken into account are how different vegetation and soil types influence erosion, and the use of remote sensing data as model inputs for the selection of erosion-related model parameters.
An example movie of the different components of this project and how a landscape evolution model works is available here.
References and Further Reading
- Sharma, H. and Ehlers, T. A.: Effects of seasonal variations in vegetation and precipitation on catchment erosion rates along a climate and ecological gradient: insights from numerical modeling, Earth Surf. Dynam., 11, 1161–1181, 2023
- Schmid, M., Ehlers, T. A., Werner, C., Hickler, T., and Fuentes-Espoz, J.-P.: Effect of changing vegetation and precipitation on denudation – Part 2: Predicted landscape response to transient climate and vegetation cover over millennial to million-year timescales, Earth Surface Dynamics, 6, 859–881, 2018
- Hobley, D. E. J., Adams, J. M., Nudurupati, S. S., Hutton, E. W. H., Gasparini, N. M., Istanbulluoglu, E., and Tucker, G. E.: Creative computing with Landlab: an open-source toolkit for building, coupling, and exploring two-dimensional numerical models of Earth-surface dynamics, Earth Surface Dynamics, 5, 21–46, 2017
Earth system twin for landslides along UK coasts with soil-rock mixtures
Project institution: University of Glasgow
Project supervisor(s): Dr Zhiwei Gao (University of Glasgow), Dr Jingtao Lai (University of Glasgow), Prof Lukasz Kaczmarczyk (University of Glasgow), Dr Martin Hurst (University of Glasgow) and Hassan Al-Budairi (QTS Group)
Overview and Background
Global climate and environmental change are increasingly resulting in weather extremes that impact society and infrastructure. These extremes include stormier climates with increased wind speeds, precipitation events or drought, and temperature extremes (amongst other things). A team of University of Glasgow researchers are developing an Earth systems digital twin for exascale computing that works on GPU computers and uses weather forecasts to predict the cascading effect of climate change events on environmental systems. Our goal is to provide predictions, at the national or larger scale, of the impacts of environmental extremes on natural and urban settings. This project is one stand-alone component of this larger-scale project.
In this project, you will develop and apply one component of the Earth system model. We are seeking a student interested in GPU-accelerated large deformation modelling of landslides in soil-rock mixtures (SRM) along UK coasts. SRM is a naturally occurring material composed of high-strength rock fragments embedded in a matrix of low-strength soil. It is a common geological formation found in mountainous regions, river valleys and coasts. One example is the glacial till widely seen in the UK. SRM exhibits significant heterogeneity and anisotropy due to the random distribution and varying proportions of rock blocks and soil. In this project, we will develop a GPU-accelerated multifield plasticity simulation for modelling landslides in SRM.
Your job while working on this project will involve software development for simulating the relevant physical processes, applying the model to historic data for model evaluation, working in a team/workgroup environment, attending regular research group seminars, integrating diverse environmental and satellite data into your software, and learning new techniques through ExaGEO training workshops.
Methodology and Objectives
Methods used in this project involve multiscale modelling of SRM at the element level and large deformation modelling using multifield plasticity. The result from the multiscale modelling will be used to develop a constitutive model for SRM.
Teaser Project 1: Multiscale modelling of SRM
The mechanical behaviour of SRM is governed by the interaction between its components, with rock blocks contributing to structural stability and the soil matrix often controlling deformation and failure. Its unique characteristics, such as non-uniform strength, variable permeability, a wide range of particle sizes, and complex stress distribution, make SRM challenging to test and model. For instance, measuring the stress-strain relationship of SRM requires large equipment to accommodate the rock fragments in the testing cell; developing such equipment is time-consuming and expensive. Therefore, we will use a multiscale approach to model the element response of SRM. At the mesoscale, elements of the SRM will be modelled to capture the detailed microstructure, including the distribution and properties of rock fragments and soil. This will be done using the finite element code MoFEM, which is GPU-compatible. The rock fragments will be modelled as non-deformable solids and the soils will be modelled using a suitable elastoplastic model. The mechanical properties obtained from the mesoscale models will then be upscaled to the macroscale using homogenisation techniques. The multiscale modelling results will be validated and calibrated using experimental data to ensure accuracy. This involves comparing simulation results with laboratory tests reported in the literature. These simulations provide effective material properties that can be used in developing constitutive models for the SRM that are needed in large-scale modelling.
Teaser Project 2: Large deformation modelling of landslides using multifield plasticity
The multifield plasticity developed at the Glasgow Computation Engineering Centre (GCEC) is a numerical method suitable for modelling large deformation problems, which eliminates the need for local integration of the elastoplastic model and can effectively exploit the computation power of GPUs. In the multifield framework, the balance of linear momentum, the flow rule, and the Karush–Kuhn–Tucker (KKT) constraints are formulated together within a variational framework. Beyond deformation, the plastic strain and the consistency parameter are treated as global degrees of freedom in the spatially discretised problem. To manage the increased number of global degrees of freedom, the method leverages the block sparse structure of the algebraic system and employs a customised block matrix solver designed to take advantage of modern hardware architectures. A constitutive model for the SRM will be implemented following the multifield plasticity and then used to model landslides in MoFEM. We will collaborate with research teams working on field observations and large-scale modelling in this development.
References and Further Reading
- Lewandowski, K., Barbera, D., Blackwell, P., Roohi, A. H., Athanasiadis, I., McBride, A., … & Kaczmarczyk, Ł. (2023). Multifield finite strain plasticity: Theory and numerics. Computer Methods in Applied Mechanics and Engineering, 414, 116101
- Gao, W. W., Gao, W., Hu, R. L., Xu, P. F., & Xia, J. G. (2018). Microtremor survey and stability analysis of a soil-rock mixture landslide: a case study in Baidian town, China. Landslides, 15, 1951-1961
- Gao, W., Yang, H., & Hu, R. (2022). Soil–rock mixture slope stability analysis by microtremor survey and discrete element method. Bulletin of Engineering Geology and the Environment, 81(3), 121
- Qiu, Z., Liu, Y., Tang, S., Meng, Q., Wang, J., Li, X., & Jiang, X. (2024). Effects of rock content and spatial distribution on the stability of soil rock mixture embankments. Scientific Reports, 14(1), 29088
- Li, J., Wang, B., Wang, D., Zhang, P., & Vardon, P. J. (2023). A coupled MPM-DEM method for modelling soil-rock mixtures. Computers and Geotechnics, 160, 105508
Exploring Hybrid Flood modelling leveraging GPU/Exascale computing
Project institution: University of Glasgow
Project supervisor(s): Dr Andrew Elliott (University of Glasgow), Prof Lindsay Beevers (University of Edinburgh), Prof Claire Miller (University of Glasgow) and Prof Michèle Weiland (University of Edinburgh)
Overview and Background
Flood modelling is crucial for understanding flood hazards, both now and in the future as the climate changes. Modelling provides inundation extents (or flood footprints) that outline areas at risk, helping us manage an increasingly complex infrastructure network as our climate changes. Our ability to make fast, accurate predictions of fluvial inundation extents is important for disaster risk reduction. Simultaneously capturing uncertainty in forecasts or predictions is essential for efficient planning and design. Both aims require methods that are computationally efficient while maintaining accurate predictions. Current Navier–Stokes physics-based models are computationally intensive; this topic would therefore explore approaches to hybrid flood models that utilise GPU compute and ML fused with physics-based models, as well as investigating the scaling of the numerical models to large-scale HPC resources.
Methodology and Objectives
Methods Used: Machine learning, statistical modelling, optimised process models using GPU computation.
Teaser Project 1
This project will explore the advantages and limitations of GPU-enabled approaches to flood modelling in contrast to traditional process-based flood modelling. Key considerations will be characterising the computational advantage of different ML approaches (especially physics-informed machine learning models), considering both training and inference, and the corresponding accuracy in comparison to traditional process-based models. In addition, we will explore enhancing traditional process-based models by investigating opportunities for exploiting large-scale, GPU-accelerated HPC. Using data available from a range of sources (e.g. satellite, sensor networks and model outputs), different ML approaches will be explored to represent the complex hydrodynamics that a process-based model would capture.
This project will naturally extend to a full PhD exploring hybrid modelling approaches, with a key focus on understanding the level of accuracy these models can achieve.
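As a minimal illustration of the benchmarking described above, the sketch below times a toy "process model" (a 1-D diffusive water-depth solver, purely illustrative) against a cheap polynomial emulator fitted to its outputs, and reports the emulator's accuracy. Both models are assumptions for demonstration, not real flood codes.

```python
import time
import numpy as np

def process_model(rain, nx=200, nt=500, dt=1e-3, dx=0.05):
    """Toy diffusive water-depth model forced by uniform rainfall."""
    h = np.zeros(nx)
    for _ in range(nt):  # explicit time stepping (dt below stability limit)
        h[1:-1] += dt * ((h[2:] - 2 * h[1:-1] + h[:-2]) / dx**2 + rain)
    return h.max()  # peak inundation depth as a scalar summary

# "Training" data from the slow model.
rains = np.linspace(0.5, 5.0, 20)
t0 = time.perf_counter()
peaks = np.array([process_model(r) for r in rains])
t_train = time.perf_counter() - t0

# Cheap emulator: quadratic fit of peak depth vs rainfall rate.
coef = np.polyfit(rains, peaks, 2)
t0 = time.perf_counter()
pred = np.polyval(coef, rains)
t_infer = time.perf_counter() - t0

rmse = np.sqrt(np.mean((pred - peaks) ** 2))
print(f"train {t_train:.3f}s, infer {t_infer:.6f}s, RMSE {rmse:.2e}")
```

The same training/inference split underlies the real comparison: emulator training amortises many expensive simulator runs, while inference is nearly free.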
Teaser Project 2
Uncertainty quantification is becoming increasingly important, as binary predictions give at best a limited outlook on the model and at worst can mislead policy makers who may not consider the implications of enforcing a binary outcome on flood forecasting models or on adaptation development. However, with particularly slow high-fidelity models, gaining accurate and meaningful uncertainty estimates via Monte Carlo is either incredibly time-consuming or simply impossible. There are multiple solutions to this, including surrogate/ML models (which can run the simulation faster) or improved Monte Carlo procedures (e.g. see Aitken et al. 2024). While this is computationally useful, it is important to understand the implications for the calibration of the uncertainty quantification produced by these approaches.
Thus, following from Aitken et al. (2024), in this teaser project we will consider a large range of possible approaches, use them to obtain uncertainty quantifications, and compare them to the uncertainty estimates obtainable from a high-fidelity model, e.g. using LISFLOOD-FP or Telemac2D. Due to the computational requirements of this approach, this is likely to require large-scale compute, on both traditional and GPU architectures. Comparisons will then be made between the UQ relying on large compute and those developed in this teaser project, allowing an understanding of the trade-offs between these approaches.
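The trade-off described above can be sketched with a toy exceedance-probability estimate: Monte Carlo on a "high-fidelity" depth model versus the same Monte Carlo on a polynomial surrogate built from a handful of expensive runs. The Manning-style depth law is an assumption for illustration only.

```python
import numpy as np

rng = np.random.default_rng(42)

def high_fidelity_depth(n):
    """Stand-in for a slow hydrodynamic run: depth vs Manning roughness n."""
    return 2.0 * n ** 0.6  # illustrative power law, not a real solver

# Build a surrogate from a few "expensive" runs.
n_design = np.linspace(0.02, 0.08, 8)
coef = np.polyfit(n_design, high_fidelity_depth(n_design), 3)
surrogate_depth = lambda n: np.polyval(coef, n)

# Uncertain roughness; threshold depth for "flooded" set at the median roughness.
samples = rng.uniform(0.02, 0.08, 100_000)
threshold = 2.0 * 0.05 ** 0.6

p_hifi = np.mean(high_fidelity_depth(samples) > threshold)
p_surr = np.mean(surrogate_depth(samples) > threshold)
print(p_hifi, p_surr)  # both near 0.5 for this symmetric setup
```

Comparing `p_surr` against `p_hifi` is exactly the calibration check discussed above: a surrogate that is accurate on average can still distort tail exceedance probabilities.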
This teaser project naturally expands into a wider PhD designing and developing novel GPU-enabled methods to obtain well-calibrated uncertainty estimates via a combination of statistical and machine learning techniques, giving rapid outputs to decision and policy makers.
References and Further Reading
- Aitken, G., Beevers, L. and Christie, M.A. (2024). Advanced uncertainty quantification for flood inundation modelling. Water, 16, 1309
- Andersson, T.R., Hosking, J.S., Pérez-Ortiz, M. et al. (2021). Seasonal Arctic sea ice forecasting with probabilistic deep learning. Nat. Commun., 12, 5124
- Aitken, G., Beevers, L. and Christie, M.A. (2022). Multi-level Monte Carlo models for flood inundation uncertainty quantification. Water Resources Research, 58, e2022WR032599
- Fraehr, N., et al. (2024). Assessment of surrogate models for flood inundation: The physics-guided LSG model vs. state-of-the-art machine learning models. Water Research, 252, 121202
Extreme air pollution during European heatwaves: detangling the drivers through ultra-high-resolution modelling
Project institution: Lancaster University
Project supervisor(s): Prof Ryan Hossaini (Lancaster University), Dr Andrea Mazzeo (Lancaster University), Dr Lily Gouldsbrough (Lancaster University), Dr Helen Macintyre (UK Health Security Agency) and Prof Oliver Wild (Lancaster University)
Overview and Background
While heatwaves (sustained periods of hot weather) are a well-recognized public health hazard, growing evidence highlights an emerging risk from the co-occurrence of extreme temperature and air pollution[1,2]. The 2022 European heatwave, when the UK recorded its first ever temperature above 40°C, was accompanied by a widespread deterioration in air quality, with surface levels of ozone and other air pollutants exceeding safe limits across much of the continent[3]. The causal relationship between extreme temperature and extreme air pollution levels is complex, involving synoptic weather patterns affecting air movement, atmospheric chemistry, and pollutant emissions (e.g. from wildfires)[4,5]. In combination, these factors are not adequately understood or quantified, but they are important to detangle because summer heatwaves will become more frequent and intense under climate change, meaning this ‘climate penalty’ for air quality could worsen[6]. This project will provide powerful new insight into the drivers of European extreme air pollution events during heatwaves and the associated health effects. This will be achieved by combining ultra-high-resolution model simulations of air pollutant behaviour with satellite observations and other big observational datasets.
The successful candidate will join LEC’s vibrant atmospheric science research group, AtMOS.
Methodology and Objectives
The two teaser projects are linked by an overarching theme (extreme air pollution during heatwaves), though are distinct in focus and employ different numerical modelling approaches. Teaser #1 involves a European-scale assessment with emphasis on improving scientific understanding of the underpinning processes responsible for elevating ozone during heatwaves. Teaser #2 provides a UK-scale assessment with emphasis on forecasting of extreme events and assessment of health effects. Both teasers will equip the student with key transferable skills around the acquisition/manipulation of atmospheric measurement and model datasets and the application of policy-relevant metrics to assess model performance.
Teaser Project 1:
Focussing on recent summer heatwaves, a Europe-wide assessment of the drivers of extreme ozone events will be performed. During the ‘teaser’, the temperature-ozone relationship will first be quantified by analysing measurement records from a large number of European monitoring sites, including the newly-available, extensive TOAR-II database of surface ozone observations (Year 1). The project’s principal modelling tool will be the FRSGC/UCI chemical transport model (CTM) that is developed and maintained in Lancaster. During the teaser, the ability of the model to capture elevated summertime ozone will be examined using the 2022 heatwave as a case study (Year 1).
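The temperature-ozone quantification described above can be sketched as a per-site regression; the sketch below estimates a "climate penalty" slope (ppb per °C) and a heatwave ozone enhancement from synthetic daily data standing in for TOAR-II monitoring records.

```python
import numpy as np

# Synthetic daily-maximum series for one site; real input would be TOAR-II data.
rng = np.random.default_rng(1)
n_days = 365
temp = (15 + 10 * np.sin(2 * np.pi * np.arange(n_days) / 365)
        + rng.normal(0, 3, n_days))                        # temperature, deg C
ozone = 30 + 2.2 * (temp - 15) + rng.normal(0, 5, n_days)  # true slope 2.2 ppb/C

# Ordinary least squares slope of ozone on temperature ("climate penalty").
slope, intercept = np.polyfit(temp, ozone, 1)
print(f"climate penalty: {slope:.2f} ppb per deg C")

# Heatwave subset (hottest 5% of days) vs all days: mean ozone enhancement.
hot = temp >= np.quantile(temp, 0.95)
print(f"heatwave ozone enhancement: {ozone[hot].mean() - ozone.mean():.1f} ppb")
```

In the real analysis the same two summaries would be computed per monitoring site and season, giving maps of the penalty slope across Europe.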
If developed into a full project, the work will be expanded in scope to cover other notable European heatwaves (e.g. summer 2018). In addition to surface measurements, the behaviour of ozone, and the model’s ability to capture it, will be further evaluated with satellite measurements of atmospheric composition (e.g. ozone from the GOME-2 and IASI instruments). The focus of the project in Years 2/3 will be to detangle the drivers of elevated ozone using carefully designed model sensitivity experiments. These will allow, for example, the significance of temperature-induced increases in ozone precursor emissions to be explored, including biogenic volatile organic compound emissions from vegetation (e.g. isoprene) and wildfires (CO, NOx), along with assessment of the long-range transport of ozone from outside continental Europe (including from the stratosphere).
Objectives:
- Characterize the European ozone-heatwave response across multiple summers using a suite of surface and satellite measurements.
- Assess the ability of the FRSGC/UCI atmospheric chemical transport model to capture extreme ozone events and the observed ozone-temperature relationship.
- Interpret the observed ozone-heatwave response using high resolution model simulations. Explore multiple factors, including the relative role of meteorological versus chemical drivers, the effects of model horizontal resolution, and other assumptions.
Teaser Project 2:
Process-based air quality models are frequently used to ‘hindcast’ the state of air quality over a given region, providing information required to assess the impact of changing air pollutant levels on public exposure and health. Additionally, such models are now increasingly used to alert the public and health care providers in advance of upcoming air pollution episodes (i.e. ‘forecast’). Like weather forecasts, air quality forecasts may be provided up to several days ahead, with forecast confidence generally decreasing with increasing lead time. As forecast skill is often inadequate, particularly for the most ‘extreme’ episodes, a body of literature on possible approaches to ‘bias correct’ forecasts (before they are issued) has emerged, some of which involve near real-time data assimilation[7-9].
This project will examine the ability of WRF-Chem to simulate UK air quality in both hindcast and forecast modes, with an emphasis on heatwave periods. WRF-Chem is a well-evaluated and widely adopted model suitable for high-resolution simulations at the country scale. During the project’s ‘teaser’ part (Year 1), the skill of WRF-Chem in forecasting surface ozone will be assessed across a range of forecast lead times (24 to 96 hours) and by applying a range of key metrics (e.g. hit rate, false alarm rate). For model evaluation, we will utilise the UK’s extensive ‘AURN’ network of air pollutant measurements. If developed into a full project, the student will explore the effectiveness of a range of bias correction techniques (Year 2), with emphasis on developing and implementing a scheme that improves the tail of the ozone distribution in both hindcast and forecast setups. Bias-corrected hindcasts will be produced and the annual mortality burden attributable to long-term air pollutant exposure will be quantified[10] (Year 3). This analysis will be performed in time slices from ~1990 to present, allowing the effectiveness of air quality legislation on health burdens over time to be quantified.
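The forecast verification step can be sketched with the categorical metrics named above (hit rate and false alarm rate) computed against an exceedance threshold; the observed and forecast ozone series below are synthetic stand-ins for AURN data and WRF-Chem output.

```python
import numpy as np

# Synthetic paired series: "observed" ozone and an imperfect "forecast".
rng = np.random.default_rng(7)
obs = rng.gamma(shape=4.0, scale=15.0, size=1000)   # observed ozone, ug/m3
fcst = obs + rng.normal(0, 15, 1000)                # forecast with random error
threshold = 100.0                                   # episode threshold

# 2x2 contingency table for threshold exceedance.
obs_ex, fcst_ex = obs > threshold, fcst > threshold
hits = np.sum(obs_ex & fcst_ex)
misses = np.sum(obs_ex & ~fcst_ex)
false_alarms = np.sum(~obs_ex & fcst_ex)
correct_neg = np.sum(~obs_ex & ~fcst_ex)

hit_rate = hits / (hits + misses)                               # POD
false_alarm_rate = false_alarms / (false_alarms + correct_neg)  # POFD
print(f"hit rate {hit_rate:.2f}, false alarm rate {false_alarm_rate:.2f}")
```

Sweeping the forecast threshold traces out a relative operating characteristic (ROC) curve, a standard way to compare lead times.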
Objectives:
- Evaluate the skill of the WRF-Chem model in reproducing surface ozone and other air pollutants over the UK during recent heatwaves.
- Assess the efficacy of a range of bias correction techniques (e.g. ‘quantile mapping’) applied to WRF-Chem hindcasts and forecasts.
- Produce bias-corrected hindcasts of key air pollutants and quantify the associated human health effects of air pollution in the UK over time.
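Of the bias-correction techniques mentioned in the objectives, empirical quantile mapping is straightforward to sketch: map each modelled value onto the observed distribution at the same quantile. The series below are synthetic, with a deliberate bias built into the "model".

```python
import numpy as np

rng = np.random.default_rng(3)
obs = rng.gamma(4.0, 12.0, 5000)                 # "observed" training ozone
model = 0.6 * rng.gamma(4.0, 12.0, 5000) + 40    # biased, variance-compressed model

def quantile_map(x, model_train, obs_train):
    """Map model values onto the observed distribution via empirical CDFs."""
    # Rank each value within the model training distribution, then look up
    # the observed value at the same quantile.
    q = np.searchsorted(np.sort(model_train), x) / len(model_train)
    return np.quantile(obs_train, np.clip(q, 0.0, 1.0))

corrected = quantile_map(model, model, obs)
print(f"obs mean {obs.mean():.1f}, raw model {model.mean():.1f}, "
      f"corrected {corrected.mean():.1f}")
```

Because the mapping matches full distributions rather than just means, it also reshapes the upper tail, which is where episode forecasts are most biased.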
References
- Schnell, J.L., and Prather, M.J. (2017). Co-occurrence of extremes in surface ozone, particulate matter, and temperature over eastern North America. Proc. Natl. Acad. Sci., 114, 2854-2859
- Gouldsbrough, L., Hossaini, R., Eastoe, E., & Young, P.J.Y. (2022). A temperature-dependent extreme value analysis of UK surface ozone, 1980-2019. Atmos. Env., 273, 118975
- Copernicus scientists warn of very high ozone pollution as heatwave continues across Europe
- Pope, R. J., et al. (2023). Investigation of the summer 2018 European ozone air pollution episodes using novel satellite data and modelling, Atmos. Chem. Phys., 23, 13235-13253
- Otero, N., Jurado, O. E., Butler, T., and Rust, H. W. (2022). The impact of atmospheric blocking on the compounding effect of ozone pollution and temperature: a copula-based approach, Atmos. Chem. Phys., 22, 1905-1919
- Doherty, R.M., Heal, M.R., and O’Connor, F.M. Climate change impacts on human health over Europe through its effect on air quality, Environ. Health, 16
- Neal, L.S., Agnew, P., Moseley, S., Ordóñez, C., Savage, N.H. and Tilbee, M. (2014). Application of a statistical post-processing technique to a gridded, operational, air quality forecast, Atmos. Env., 98, 385-393
- Staehle, C., et al. (2024). Technical note: An assessment of the performance of statistical bias correction techniques for global chemistry–climate model surface ozone fields, Atmos. Chem. Phys., 24, 5953-5969
- Gouldsbrough, L., Hossaini, R., Eastoe, E., Young, P.J.Y. & Vieno, M. (2024). A machine learning approach to downscale EMEP4UK: analysis of UK ozone variability and trends. Atmos. Chem. Phys., 24, 3163-3196
- Macintyre, H.L., et al. (2023). Impacts of emissions policies on future UK mortality burdens associated with air pollution. Environ. Int., 174, 107862
Further Reading
- How UK’s record heatwave affected air pollution
- World meteorologists point to ‘vicious cycle’ of heatwaves and air pollution
Investigating the rheology of volcanic granular flows with GPU-based Discrete Element Method
Project institution: University of Edinburgh
Project supervisor(s): Dr Kevin Stratford (University of Edinburgh), Dr Eric Breard (University of Edinburgh) and Prof Jin Sun (University of Glasgow)
Overview and Background
Geophysical flows, including those produced by landslides, volcanic eruptions, and extreme weather events, are among the most pervasive natural hazards, posing significant risks to society. Despite their profound impact, our understanding of their complex rheology, which drives the extent and speed of these flows, remains limited, posing significant challenges to the development of accurate hazard models. Major challenges lie in capturing the role of particle non-sphericity (particulate mixtures are generally assumed to be spheres) and transient behaviour (often oversimplified as steady-state), and in unravelling how these flows interact with substrates, driving sedimentation or substrate entrainment that influences their reach and dynamics. Leveraging the advanced GPU capabilities of the new open-source solver MFIX-Exa, we aim to simulate these complex multiphase processes with unparalleled precision. Our work will derive simplified constitutive equations that account for mass and momentum changes due to sedimentation or substrate entrainment, transforming predictive models and enhancing hazard mitigation strategies.
Teaser Project 1: Capturing flow initiation and arrest (jamming transition) in particle geophysical flows
Objective: This project focuses on gas-particle geophysical flows, with implications for landslides, pyroclastic flows, and debris flows. Our aim is to uncover how transient processes govern the transition from inertial flows to jamming (e.g., the sudden stopping of particles), which leads to deposition and momentum loss—an aspect currently missing from existing models. Using simulations, we will investigate the effects of grain size distribution and pore fluid pressure on deposition rates. The ultimate goal is to derive sedimentation and erosion rate laws that can be integrated into depth-averaged models, enabling more accurate hazard predictions.
Methods: Our approach combines high-resolution simulations using the discrete element method (DEM) coupled with computational fluid dynamics (CFD) to unravel the complex dynamics of geophysical granular flows. We will utilise the novel open-source solver MFIX-Exa, into which we will add the missing physics necessary to describe water-particle interactions, including lubrication forces, added mass, and Saffman lift forces. These additions are essential for accurately modelling bedload transport and debris flows.
We will then simulate the interaction of granular media, both with and without excess pore pressure, as it impacts a loose substrate. This will allow us to track flow-substrate dynamics as the base of the flow transitions to a jammed state. Post-processing of the DEM-CFD data will be conducted using a coarse-graining approach to derive continuum fields (e.g., stress tensors, velocity, granular temperature). These fields will facilitate the derivation of constitutive equations describing flow rheology and flow-substrate interactions, specifically the sedimentation processes (mass and momentum loss) and substrate erosion/entrainment (mass and momentum gain) in the granular flowing layer.
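The coarse-graining step described above can be sketched in one dimension: bin discrete particle data into continuum fields (mass per unit height and mass-weighted mean velocity). Real coarse-graining typically uses smooth kernels; simple binning is used here for brevity, and all particle data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 10_000
z = rng.uniform(0.0, 1.0, n)                 # particle heights, m
v = 2.0 * z + rng.normal(0, 0.1, n)          # sheared velocity profile, m/s
mass = np.full(n, 1e-3)                      # particle masses, kg (monodisperse)

edges = np.linspace(0.0, 1.0, 21)            # 20 coarse-graining bins
idx = np.digitize(z, edges) - 1              # bin index per particle
bin_mass = np.bincount(idx, weights=mass, minlength=20)
lin_density = bin_mass / np.diff(edges)      # mass per unit height, kg/m
mean_v = np.bincount(idx, weights=mass * v, minlength=20) / bin_mass

print(f"total mass {bin_mass.sum():.2f} kg, "
      f"basal/top mean velocity {mean_v[0]:.2f}/{mean_v[-1]:.2f} m/s")
```

Once velocity and stress fields are available on the coarse grid, rheological quantities (e.g. shear rate and effective friction) follow by differencing these fields.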
Teaser Project 2: Understanding the role of particle shape on the rheology of geophysical flows
Objective: Our ultimate goal is to improve the physics underlying current hazard assessments, particularly for processes such as pyroclastic density currents, debris flows, landslides, mudflows, and turbidity currents. We aim to better understand how these flows shape landscapes and become so destructive. Natural particles in geophysical systems have complex, non-spherical shapes, but they are often approximated as spheres for simplicity. However, experimental observations reveal that the rheology of spherical and non-spherical granular media can differ significantly. The aim of this project is to implement a glued-sphere model into the open-source, GPU-based flow solver MFIX-Exa, a massively parallel DEM solver capable of simulating a wide range of granular flow problems and thus ideally suited as the foundation for implementing non-spherical DEM.
Methods: This project employs DEM-CFD simulations of granular flows to investigate the role of particle non-sphericity in driving the remarkable complexity of geophysical flows. Once the glued-sphere approach is implemented, we will systematically explore the effects of particle shape (e.g., roundness, sphericity) and examine its impact on flow rheology across quasi-static, intermediate, and inertial flow regimes (ranging from gas-like to solid-like behaviours). Discrete simulation data will be transformed into a continuum framework using a coarse-graining code. This will enable us to describe the influence of particle shape on granular flow rheology for integration into reduced-order numerical models of geophysical flows.
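A glued-sphere primitive of the kind to be implemented can be sketched by rigidly composing overlapping spheres along an axis and summing mass and inertia via the parallel-axis theorem. Double-counting of the overlap volume is ignored here for brevity, and the geometry and material values are illustrative.

```python
import numpy as np

def glued_rod(n_spheres=5, radius=0.5e-3, density=2500.0):
    """Rod of overlapping spheres: centres, total mass, inertia about the COM."""
    spacing = radius                          # 50% overlap between neighbours
    centres = np.array([[i * spacing, 0.0, 0.0] for i in range(n_spheres)])
    centres -= centres.mean(axis=0)           # put the centre of mass at the origin
    m_sphere = density * 4.0 / 3.0 * np.pi * radius ** 3
    mass = n_spheres * m_sphere               # overlap double-counting ignored
    # Inertia about a COM axis perpendicular to the rod: solid-sphere term
    # plus parallel-axis shift for each component sphere.
    inertia = sum(0.4 * m_sphere * radius ** 2 + m_sphere * float(c @ c)
                  for c in centres)
    return centres, mass, inertia

centres, mass, inertia = glued_rod()
length = centres[-1, 0] - centres[0, 0] + 2 * 0.5e-3
aspect_ratio = length / (2 * 0.5e-3)
print(f"mass {mass:.2e} kg, inertia {inertia:.2e} kg m^2, "
      f"aspect ratio {aspect_ratio:.1f}")
```

In a DEM integration loop the composite is advanced as one rigid body; contact detection still runs on the cheap component spheres, which is the appeal of the glued-sphere approach on GPUs.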
Image: A visualisation of various granular media simulations using MFIX (Breard).
References & Further Reading
- Musser, J., Almgren, A. S., Fullmer, W. D., Antepara, O., Bell, J. B., Blaschke, J., … & Syamlal, M. (2022). MFIX-Exa: A path toward exascale CFD-DEM simulations. The International Journal of High Performance Computing Applications, 36(1), 40-58
- Lu, L., Gao, X., Shahnam, M., & Rogers, W. A. (2021). Simulations of biomass pyrolysis using glued-sphere CFD-DEM with 3-D intra-particle models. Chemical Engineering Journal, 419, 129564
- Exascale Project: MFIX-EXA
- NREL/BDEM
Mechanisms for and predictions of occurrence of ocean rogue waves
Project institution: Lancaster University
Project supervisor(s): Dr Suzana Ilic (Lancaster University), Prof Aneta Stefanovska (Lancaster University), Michael Thomas (Reliable Insights Ltd), Dr Paul Bartholomew (University of Edinburgh) and Dr Bryan Michael Williams (Lancaster University)
Overview and Background
Rogue waves are exceptionally high ocean waves whose height is at least twice the significant wave height. They are rare, transient phenomena that pose serious risks to shipping, fishing and maritime infrastructure, including offshore platforms and wind turbines. Understanding how they form, and predicting them accurately and in good time, is vital for assessing risks to marine operations.
Despite advances in theoretical and experimental research on rogue waves, the physical conditions and mechanisms that lead to their formation in the real ocean remain poorly understood, and predictions are still challenging. The proposed PhD projects will address these gaps by using ever-growing databases of field data, developing and refining novel data science approaches, and exploiting developments in high-performance computing.
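The defining criterion above (wave height exceeding twice the significant wave height) can be sketched directly: extract zero-upcrossing wave heights from an elevation record and flag waves with H > 2Hs. The record below is synthetic; real input would be buoy or radar measurements.

```python
import numpy as np

# Synthetic sea-surface elevation: three wave components plus measurement noise.
rng = np.random.default_rng(11)
t = np.arange(0, 3600, 0.5)                       # one hour sampled at 2 Hz
eta = (0.5 * np.sin(2 * np.pi * 0.08 * t)
       + 0.3 * np.sin(2 * np.pi * 0.11 * t + 1.0)
       + 0.2 * np.sin(2 * np.pi * 0.15 * t + 2.0)
       + rng.normal(0, 0.05, t.size))             # elevation, m

# Zero-upcrossing analysis: one wave per interval between upcrossings.
up = np.where((eta[:-1] < 0) & (eta[1:] >= 0))[0]
heights = np.array([eta[i:j].max() - eta[i:j].min()
                    for i, j in zip(up[:-1], up[1:])])

Hs = 4.0 * eta.std()          # spectral-style estimate of significant height
rogues = heights > 2.0 * Hs   # the rogue-wave criterion used above
print(f"Hs = {Hs:.2f} m; {int(rogues.sum())} rogue wave(s) among "
      f"{heights.size} waves")
```

A near-Gaussian sea like this one rarely produces rogues, which is precisely why detecting and explaining the real events requires the large field datasets described below.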
Teaser Project 1:
This data-intensive project aims to accelerate novel time-localised analysis methods to investigate physical mechanisms underlying rogue waves and predict their occurrence.
Objectives:
- Use GPU Accelerated Computing to parallelise algorithms for time-localised phase coherence and couplings between waves recorded at many points in space, enabling scaling to higher-resolution and near real-time analysis.
- Isolate the mechanisms leading to the formation of rogue waves using algorithms developed in Objective 1.
- Develop in-situ feature detection for automated analyses exploiting GPU and assess the relationship between the occurrence of rogue waves and their characteristics from time-series measured under different physical conditions.
- Develop a time-series-based prediction model, using the relationships identified in Objectives 2 and 3 and assess its ability to predict the occurrence of rogue waves.
Methods:
The numerical modelling and algorithms for time-series analysis will exploit GPU Accelerated Computing; exascale will then allow near real-time practical applications. The Multiscale Oscillatory Dynamics Analysis (MODA) toolbox for non-linear and time-localised phenomena in time-series (e.g. phase coherence, coupling and wave energy exchange [3&4]) will be parallelised and used to identify rogue wave mechanisms. Pattern analysis and automated featurisation will be developed to detect anomalies in the measured sea surface elevations. The methods will first be applied to laboratory data (e.g. [1]) and then to publicly available field measurements (e.g. the Free Ocean Wave Dataset, with more than 1.4 billion wave measurements). The newly developed prediction model will be systematically validated against measured data.
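A time-localised phase-coherence calculation of the kind MODA performs can be sketched at a single frequency, using complex demodulation as a lightweight numpy stand-in for MODA's wavelet transforms. The two records below share a common oscillation buried in noise; everything here is synthetic.

```python
import numpy as np

rng = np.random.default_rng(2)
fs, f0 = 10.0, 0.1                      # sample rate (Hz), analysis frequency
t = np.arange(0, 2000, 1 / fs)
x = np.sin(2 * np.pi * f0 * t) + 0.5 * rng.standard_normal(t.size)
y = np.sin(2 * np.pi * f0 * t - 0.8) + 0.5 * rng.standard_normal(t.size)

def phase_at(sig, f, fs, win=501):
    """Instantaneous phase at frequency f via complex demodulation."""
    z = sig * np.exp(-2j * np.pi * f * np.arange(sig.size) / fs)
    kernel = np.ones(win) / win         # crude moving-average low-pass filter
    return np.angle(np.convolve(z, kernel, mode="same"))

dphi = phase_at(x, f0, fs) - phase_at(y, f0, fs)
coherence = np.abs(np.mean(np.exp(1j * dphi)))   # 1 = perfectly phase-locked
print(f"phase coherence at {f0} Hz: {coherence:.2f}")
```

The demodulation and averaging are embarrassingly parallel over frequencies and sensor pairs, which is what makes the GPU parallelisation in Objective 1 attractive.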
Teaser Project 2:
This is a data-intensive project focused on the computational optimisation of time series analyses for dynamic systems and the relationship between rogue wave properties and environmental conditions.
Objectives:
- Assess the current performance of the numerical tools included in MODA in terms of their relevance for detecting the mechanisms of rogue waves and their computational efficiency.
- Optimise the algorithms of the tools identified in Objective 1 with multiple Graphics Processing Units (GPUs) to improve time to results and experimental throughput, enabling large-scale ensemble time-series analyses.
- Develop and apply a GPU version of MODA to field measured data to isolate mechanisms that lead to the formation of rogue waves.
- Assess the relationship between the occurrence of rogue waves and concurrent ocean and atmospheric data.
Methods:
The Multiscale Oscillatory Dynamics Analysis (MODA) toolbox offers several high-order methods for time-series analysis, some based on wavelets. The high computational demands of uncertainty evaluation methods limit their use for operational purposes. Optimised algorithms, GPU-acceleration and exascale facilities will open up higher resolution and practical applications. MODA will identify the mechanisms underlying rogue wave formation using field-measured time-series of surface elevations (e.g. the Free Ocean Wave Dataset). Concurrent environmental data (e.g. surface ocean currents, wind and atmospheric pressure) will be collated either from field measurements or from the operational forecast models provided by meteorological offices. The correlation between the occurrence of rogue waves and environmental parameters will be investigated, as well as ‘causal’ relationships between the identified mechanisms and the environmental conditions, which can be incorporated into predictions in the future.
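The association analysis in Objective 4 can be sketched as a point-biserial correlation between a binary rogue-occurrence series and a concurrent environmental variable (wind speed here). All series are synthetic, with occurrence deliberately made more likely in storms.

```python
import numpy as np

rng = np.random.default_rng(9)
wind = rng.gamma(3.0, 4.0, 5000)                  # concurrent wind speed, m/s
p_rogue = 1 / (1 + np.exp(-(wind - 20) / 3))      # occurrence likelier in storms
rogue = rng.random(5000) < p_rogue                # binary rogue-occurrence series

# Point-biserial correlation = Pearson correlation with the binary series.
r = np.corrcoef(wind, rogue.astype(float))[0, 1]
print(f"correlation between wind speed and rogue occurrence: {r:.2f}")
```

In the full analysis this would be repeated across many environmental covariates and lag times, which is where ensemble-scale GPU throughput becomes necessary.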
References and Further Reading
- Luxmoore, J.F., Ilic, S. and Mori, N., 2019. On kurtosis and extreme waves in crossing directional seas: a laboratory experiment. Journal of Fluid Mechanics, 876, pp.792-817
- Mori, N., Waseda, T. and Chabchoub, A. (eds.) (2023). Science and Engineering of Freak Waves, Elsevier
- Newman, J., Pidde, A. and Stefanovska, A., 2021. Defining the wavelet bispectrum. Applied and Computational Harmonic Analysis, 51, pp.171-224
- Stankovski, T., Pereira, T., McClintock, P.V. and Stefanovska, A., 2017. Coupling functions: universal insights into dynamical interaction mechanisms. Reviews of Modern Physics, 89(4), p.045001.
- Yang X., Rahmani H., Black S., Williams B. M. Weakly supervised co-training with swapping assignments for semantic segmentation. In European Conference on Computer Vision 2025 (pp. 459-478). Springer, Cham
- Jiang Z., Rahmani H., Black S., Williams B. M. A probabilistic attention model with occlusion-aware texture regression for 3D hand reconstruction from a single RGB image. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition 2023 (pp. 758-767)
- Jiang Z., Rahmani H., Angelov P., Black S., Williams B. M. Graph-context attention networks for size-varied deep graph matching. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition 2022 (pp. 2343-2352)
Mitigating geohazards through coupled multiphysics modelling and uncertainty analysis
Project institution: Lancaster University
Project supervisor(s): Dr Michael Tso (Lancaster University), Prof Andrew Binley (Lancaster University), Prof Andrew Curtis (University of Edinburgh), Dr Elizabeth Cooper (Lancaster University) and Dr Paul Wilkinson (British Geological Survey)
Overview and Background
Many near-surface geohazards, e.g. landslides and failure of earthen dams, result from changes in flow and storage of subsurface fluids. Understanding the likelihood of such hazards is essential but often challenging because we lack suitable subsurface monitoring technology. A range of geophysical sensors can provide a 3D time-lapse ‘movie’ below ground, but analysing such data is computationally intensive and typically done in isolation, making interpretation ambiguous—yet this uncertainty is often ignored. Computational limitations of traditional paradigms also limit the amount of data that can be used to improve the model, and the extent to which the solution space can be explored. This project aims to overcome these challenges by developing new approaches that couple multiphysics simulators, tailored for GPUs, to derive improved subsurface models with an assessment of uncertainty, thus significantly improving our ability to identify hazards and risk.
Methodology and Objectives
We will focus on application to two geohazards for which rich time-lapse geophysical datasets currently exist. Candidates include 3D electrical resistivity monitoring of dynamic moisture changes and mass movements on a hillslope or earthen dam, and distributed acoustic sensing of unstable embankments. Each problem consists of determining the temporal evolution of an image of the subsurface based on many tens or hundreds of thousands of unknown parameters. Inversion of data from a single geophysical modality (e.g. electrical resistivity) can be challenging using conventional computational approaches. Furthermore, we often require near real-time execution of such inversions to assess the risk of possible hazards occurring, with adequate time for mitigating actions to take place. In this PhD we wish to solve the inverse problem coupled with a fluid flow simulator and evaluate all relevant sources of uncertainty (ambiguity) in the solution. This cannot be achieved with conventional computers and computational methods.
We will develop geophysical and fluid flow simulators based on existing tools (including those developed by members of the supervisory team). Each simulator will require enhancement for application on GPUs. We will consider different strategies ranging from discretised grid-based partial differential equation solvers to surrogate modelling approaches.
We will then explore optimum strategies for coupling the geophysical and flow simulators, and any necessary data pre-processing strategies. Unlike conventional inversion of geophysical data, we will use the geophysical data (not the models) to constrain the parameters of the flow simulator. Methods for computation of predictive uncertainty in the coupled inversion will be explored, e.g. Markov chain Monte Carlo, data assimilation methods or variational inference. Insights derived from this research will benefit from GPU-acceleration of model coupling in general (e.g. coupling weather and land surface models, or different ecosystem services): this is extremely important because it is a core component in the analysis of the interdependencies within and between complex real-world environmental systems.
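The coupled-inversion idea, constraining a flow-simulator parameter directly with geophysical data, can be sketched with a random-walk Metropolis-Hastings sampler over a toy forward chain: a sigmoid "flow model" feeding an Archie-like petrophysical relation. Both models and all numbers are illustrative assumptions, not real simulators.

```python
import numpy as np

rng = np.random.default_rng(4)

def forward(logK):
    """Toy coupled model: flow gives saturation, petrophysics gives resistivity."""
    saturation = 0.4 / (1.0 + np.exp(-(logK + 5.0)))   # stand-in flow response
    return 100.0 * saturation ** -2                     # Archie-like relation

true_logK, noise = -5.3, 1000.0
data = forward(true_logK) + rng.normal(0, noise, 30)    # 30 noisy "observations"

def log_post(logK):
    log_prior = -0.5 * ((logK + 5.0) / 2.0) ** 2        # N(-5, 2^2) prior
    misfit = np.sum((data - forward(logK)) ** 2)
    return log_prior - 0.5 * misfit / noise ** 2

# Random-walk Metropolis-Hastings over the flow parameter.
chain, cur = [], -4.0
lp = log_post(cur)
for _ in range(20_000):
    prop = cur + rng.normal(0, 0.05)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:             # accept/reject step
        cur, lp = prop, lp_prop
    chain.append(cur)

post = np.array(chain[5_000:])                          # discard burn-in
print(f"posterior logK: {post.mean():.2f} +/- {post.std():.2f} "
      f"(true {true_logK})")
```

Each MCMC step costs one forward run, which is why the GPU-accelerated simulators and surrogates described above are prerequisites for this approach at field scale.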
Teaser Project 1: Surrogate modelling of geophysical data
Simulations of many geophysical problems rely on spatio-temporal discretisation of an approximation to the governing equations. This can be computationally restrictive for many large-scale problems. Surrogate modelling (or emulators) offers a computationally attractive alternative, particularly when focussed on GPU-based architecture. A range of surrogate modelling approaches exist, e.g. those based on deep learning, which may be able to address significant nonlinearity of the problem. In this element of the PhD, we will explore surrogate modelling as an alternative approach to conventional geophysical simulation.
This project will involve the development of a GPU-accelerated simulator for forward simulations, inversions, and the training of surrogate models. Additionally, it will focus on creating and utilising GPU-based software frameworks to enhance the training of these surrogate models. Such advances would enable efficient coupled hydro-geophysical simulations for robust uncertainty quantification and long-term training scenarios (e.g. different CMIP scenarios).
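As a lightweight stand-in for the deep-learning emulators mentioned above, the sketch below fits a Gaussian radial-basis-function surrogate to a toy geophysical forward model (an Archie-style response to porosity and saturation, assumed purely for illustration) and checks its accuracy on unseen parameter sets.

```python
import numpy as np

rng = np.random.default_rng(6)

def forward(theta):
    """Toy forward model: nonlinear response to (porosity, saturation)."""
    por, sat = theta[..., 0], theta[..., 1]
    return por ** -1.5 * sat ** -2.0     # Archie-style apparent resistivity

# Training set from the "expensive" simulator.
X = rng.uniform([0.1, 0.2], [0.4, 1.0], size=(200, 2))
y = forward(X)

def rbf_kernel(A, B, length=0.1):
    """Gaussian RBF kernel matrix between two point sets."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * length ** 2))

K = rbf_kernel(X, X) + 1e-6 * np.eye(len(X))   # jitter for conditioning
w = np.linalg.solve(K, y)                       # interpolation weights
surrogate = lambda Xq: rbf_kernel(Xq, X) @ w

# Accuracy on unseen parameter sets inside the training domain.
Xt = rng.uniform([0.12, 0.25], [0.38, 0.95], size=(500, 2))
rel_err = np.abs(surrogate(Xt) - forward(Xt)) / forward(Xt)
print(f"median relative error: {np.median(rel_err):.3f}")
```

The kernel evaluations here are dense matrix products, exactly the workload GPUs accelerate; deep-learning surrogates replace the fixed kernel with learned features but keep the same train-once, evaluate-cheaply structure.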
Teaser Project 2: Estimation of information content
Data collection comes at a cost. This includes the provision, maintenance and operation of sensors, and the computational demands for interpretation. A widely overlooked question is, how much data do we need to develop a reliable model of the subsurface? In addition, given that we may have access to multiple geophysical modalities, how do we decide where to focus resources? And how will the uncertainty in our model reduce (or not) as we add more data? In this element of the project, we will investigate methods for assessing the value of information, such as through entropy measures, with a focus on developing techniques and GPU-based software. We will apply a suitable approach to the field study problems, to optimise data acquisition to best constrain the subsurface parameters of interest.
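The entropy-based value-of-information idea can be sketched for a Gaussian linear inverse problem, where the expected information gain of a measurement has a closed form. The candidate modalities, sensitivities and noise levels below are illustrative assumptions, not calibrated values.

```python
import numpy as np

# One unknown subsurface parameter with prior N(0, sigma0^2).
sigma0 = 1.0
candidates = {                  # modality: (sensitivity g, noise std)
    "resistivity": (2.0, 1.0),
    "seismic":     (0.5, 0.2),
    "gravity":     (0.1, 0.1),
}

def info_gain(g, noise, prior_var):
    """Entropy reduction (nats) from one observation d = g*m + noise."""
    post_var = 1.0 / (1.0 / prior_var + g ** 2 / noise ** 2)
    return 0.5 * np.log(prior_var / post_var)

gains = {k: info_gain(g, n, sigma0 ** 2) for k, (g, n) in candidates.items()}
best = max(gains, key=gains.get)
for k, v in sorted(gains.items(), key=lambda kv: -kv[1]):
    print(f"{k:12s} expected information gain {v:.2f} nats")
print("most informative next measurement:", best)
```

Sequentially updating the prior variance after each chosen measurement gives a greedy experimental design loop; for the nonlinear, high-dimensional field problems the expectation must be estimated by sampling, which is where GPU-based software comes in.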
Image: 3D resistivity model from the BGS Hollin Hill landslide observatory in North Yorkshire, UK. Cooler colours indicate lower resistivities characteristic of a mudstone formation, warmer colours are higher resistivities associated with a sandstone formation. The dashed white line highlights active lobes of slipped mudstone.
References and Further Reading
- Boyd, J. P., Chambers, J. E., Wilkinson, P. B., Meldrum, P. I., Bruce, E., & Binley, A. (2024). Coupled Hydrogeophysical Modeling to Constrain Unsaturated Soil Parameters for a Slow‐Moving Landslide. Water Resources Research, 60(10)
- Cooper, E. S., Dance, S. L., Garcia-Pintado, J., Nichols, N. K., & Smith, P. J. (2018). Observation impact, domain length and parameter estimation in data assimilation for flood forecasting. Environmental Modelling & Software, 104, 199–214
- Lu, D., Liu, Y., Zhang, Z., Bao, F., & Zhang, G. (2024). A Diffusion‐Based Uncertainty Quantification Method to Advance E3SM Land Model Calibration. Journal of Geophysical Research: Machine Learning and Computation, 1(3)
- Raissi, M., Perdikaris, P., & Karniadakis, G. E. (2019). Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations. Journal of Computational Physics, 378, 686–707
- Strutz, D. & Curtis, A. (2024). Variational Bayesian experimental design for geophysical applications: seismic source location, amplitude versus offset inversion, and estimating CO2 saturations in a subsurface reservoir. Geophysical Journal International, 236(3), 1309–1331
- Tso, C.-H. M., Johnson, T.C., Song, X., Chen, X., Kuras, O., Wilkinson, P., Uhlemann, S., Chambers, J. & Binley, A. (2020). Integrated hydrogeophysical modelling and data assimilation for geoelectrical leak detection. Journal of Contaminant Hydrology, 234
- Zhang, X., Nawaz, M. A., Zhao, X., & Curtis, A. (2021). An introduction to variational inference in geophysical inverse problems. Advances in Geophysics, 62, 73–140
- Information Theory in the Geosciences: A hub for the emerging community of practice for Information Theory in the Geosciences
Multi-scale modelling of volcanoes and their deep magmatic roots: Constitutive model development using data-driven methods
Project institution: University of Glasgow
Project supervisor(s): Dr Ankush Aggarwal (University of Glasgow), Dr Tobias Keller (University of Glasgow) and Prof Jin Sun (University of Glasgow)
Overview and Background
Overview: This PhD studentship focuses on developing GPU-accelerated models of magmatic processes that underpin volcanic hazards and magmatic resource formation. These processes span sub-millimetre mineral-fluid-melt interactions up to kilometre-scale magma dynamics and crustal deformation. Magma is a multi-phase mixture of solids, silicate melts, and volatile-rich fluids, interacting in complex thermo-chemical-mechanical ways.
This is a standalone PhD project that is part of a larger framework of magmatic systems research by the wider team. The project will contribute one component of a hierarchical, multi-scale modelling framework using advanced GPU-based techniques. Specifically, in this project, the PhD student will develop constitutive relationships between stresses and strains/strain-rates of various phases at the magmatic system-scale based on granular-scale mechanical simulations. The result will enable accurate, large-scale simulations of magma dynamics that capture the complexity of micro-scale constituents and their interactions.
Your work will include software development, integrating and interpreting field and experimental data sets, attending regular seminars, collaborating within the wider research team, and receiving training through ExaGEO workshops.
Background: Volcanic eruptions originate from shallow crustal magma reservoirs built up over long periods. As magma cools and crystallizes, it releases fluid phases—aqueous, briny, or containing carbonates, metal oxides, or sulfides—whose low viscosity and density contrasts drive fluid segregation. This fluid migration can trigger volcanic unrest or concentrate metals into economically valuable deposits. The distribution of fluids—discrete droplets versus interconnected drainage networks—crucially depends on crystal and melt properties. Direct observations are challenging, so high-resolution, GPU-accelerated simulations provide a way to understand these complex and dynamic systems.
Methodology and Objectives
Modelling volcanic systems is challenging due to the multi-scale nature of their underlying physical and chemical processes. System-scale dynamics (100 m to 100 km) emerge from interactions involving crystals, melt films, and fluid droplets or channels on micro- to centimetre scales. To link these scales, this project uses a hierarchical approach: (i) direct numerical simulations of granular-scale phase interactions, (ii) deep learning-based computational homogenisation to extract effective constitutive relations, and (iii) system-scale mixture continuum models applying these relations to problems. All components leverage GPU-accelerated computing and deep learning to handle direct simulations at local scales, train effective constitutive models, and achieve sufficient resolution at the system scale.
In this project the candidate will extract effective constitutive relations by computationally homogenising the micro-scale mechanical simulations. The effective constitutive properties will then be used in the macro-scale models to accurately capture the multi-scale effects. The project will leverage recent advances in the use of neural networks [1,2] and Gaussian processes [3,4] for constitutive model development. A range of micro-scale simulations (developed by a partner project) will be run to generate data covering the different deformation regimes. The results from these simulations will be used to train a deep-learning-based constitutive model. Approaches based on neural networks and Gaussian processes will be explored and compared. The trained model will then be used in macro-scale simulations, and its results will be compared to those using the constitutive relations currently assumed in the literature. Lastly, the variability resulting from this homogenisation process will be quantified and its propagation into macro-scale simulations will be assessed to ensure confidence in the results. The focus of model applications will be the proposed regime transition from disconnected bubble migration to interconnected channelised seepage of fluids from crystallising magma bodies [5].
Within this project, the student will start by working on two “teaser” projects to gain familiarity with different techniques and data, then choose how to further develop and focus their research.
Teaser Project 1:
This teaser project, conducted over the first year, will focus on neural networks for constitutive modelling. Neural networks (NNs) are the most popular choice of deep learning model. Recent works have used NNs for constitutive model development, identification, and discovery [1,2]. With great flexibility in modelling wide-ranging phenomena, NNs also bring a large number of tunable parameters (weights), associated uncertainty, and a requirement for large training datasets. This teaser project will explore the use of NNs for constitutive model development based on simplified one- and two-phase micro-scale systems. This will include finding a suitable architecture, training hyperparameters, and the required training dataset. A GPU-based implementation will be developed to make the training of high-dimensional neural networks feasible. This teaser project will pave the path towards a neural-network-based approach for the overall project over the next three years, wherein the initial implementation will be extended to complex micro-scale simulations modelling four phases. Additionally, in the full project, the uncertainty related to neural networks will be quantified, and the required training data will be optimised. These additions will further increase the computational cost, thus necessitating a GPU-accelerated framework.
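The data-driven constitutive idea can be illustrated without a deep network: below, a linear-in-parameters least-squares fit over nonlinear basis functions stands in for the NN, learning an effective stress/strain-rate relation from synthetic "micro-scale" data. The rheology, noise level, and basis are invented for the sketch:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic 'micro-scale' data: shear stress vs strain rate for a
# shear-thinning mixture (hypothetical ground truth plus noise).
gdot = np.linspace(0.01, 10.0, 200)                          # strain rate
tau = 2.0 * gdot / (1.0 + gdot) + 0.02 * rng.normal(size=gdot.size)

# Data-driven constitutive fit: least squares over a nonlinear basis,
# standing in for a neural network's learned representation.
basis = np.column_stack([gdot, gdot / (1 + gdot), np.sqrt(gdot)])
coef, *_ = np.linalg.lstsq(basis, tau, rcond=None)

def stress(g):
    """Effective constitutive relation tau(gdot), as a macro-scale
    solver would query it at every quadrature point."""
    b = np.column_stack([g, g / (1 + g), np.sqrt(g)])
    return b @ coef
```

Swapping the basis expansion for a trained NN (and the least-squares solve for GPU-based stochastic training) gives the workflow described above; the interface seen by the macro-scale solver, a cheap `stress(strain_rate)` query, is unchanged.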
Teaser Project 2:
This teaser project, conducted over the first year, will focus on Gaussian processes for constitutive modelling. Gaussian processes (GPs) are rigorous statistical tools that are an attractive alternative to neural networks [3]. The main advantage of GPs is that, in addition to the mean, they also capture the variation/confidence in the results, which can in turn inform which micro-scale simulations must be run to improve their accuracy. Recently, GPs have been used for constitutive model development for hyperelastic solids [4]. This teaser project will explore GPs for modelling the effective constitutive relationships of simplified one- and two-phase micro-scale systems, using the results to also select the required micro-scale simulations. Thermodynamic constraints on the constitutive model will be added by extending the GP framework [4], which will increase the training cost. Thus, a GPU-based implementation will be required to make the computation feasible. If this approach is selected for the rest of the PhD, it will be extended to the fully-complex micro-scale model over the next three years. Moreover, the GP approach will be leveraged to develop a robust framework for design of experiments, such that there is high confidence in the resulting constitutive properties. Design of experiments brings an exponentially higher computational cost, thus necessitating a GPU-accelerated framework.
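A minimal sketch of the GP workflow described above: fit a GP to a few "micro-scale simulation" results, then use the predictive variance to choose where to run the next simulation. The kernel, length-scale, and the `tanh` stand-in for the simulator are all assumptions of the sketch:

```python
import numpy as np

def rbf(A, B, ell=0.2, sf=1.0):
    """Squared-exponential kernel (length-scale chosen for the demo)."""
    d2 = (A[:, None] - B[None, :]) ** 2
    return sf**2 * np.exp(-0.5 * d2 / ell**2)

def gp_posterior(Xtr, ytr, Xte, noise=1e-3):
    """Exact GP regression: posterior mean and pointwise variance."""
    K = rbf(Xtr, Xtr) + noise * np.eye(len(Xtr))
    Ks = rbf(Xtr, Xte)
    alpha = np.linalg.solve(K, ytr)
    mean = Ks.T @ alpha
    var = np.diag(rbf(Xte, Xte) - Ks.T @ np.linalg.solve(K, Ks))
    return mean, var

# 'Micro-scale simulation': effective stress vs solid fraction (toy).
f = lambda phi: np.tanh(3.0 * phi)

Xtr = np.array([0.1, 0.9])          # two simulations run so far
ytr = f(Xtr)
Xte = np.linspace(0.0, 1.0, 101)

mean, var = gp_posterior(Xtr, ytr, Xte)
# Active learning: run the next micro-scale simulation where the GP
# is least confident about the constitutive relation.
next_phi = Xte[np.argmax(var)]
```

This variance-driven selection is exactly the design-of-experiments loop described above; at scale the kernel algebra becomes the GPU-critical step.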
Image: A neural-network-based constitutive modelling framework (left), and Gaussian-process-based simulation results (right).
References and Further Reading
- Linka, K., Hillgärtner, M., Abdolazizi, K. P., Aydin, R. C., Itskov, M., & Cyron, C. J. (2021). Constitutive artificial neural networks: A fast and general approach to predictive data-driven constitutive modeling by deep learning. Journal of Computational Physics, 429, 110010
- Liu, X., Tian, S., Tao, F., & Yu, W. (2021). A review of artificial neural networks in the constitutive modeling of composite materials. Composites Part B: Engineering, 224, 109152
- Williams, C. K., & Rasmussen, C. E. (2006). Gaussian processes for machine learning (Vol. 2, No. 3, p. 4). Cambridge, MA: MIT press
- Aggarwal, A., Jensen, B. S., Pant, S., & Lee, C. H. (2023). Strain energy density as a Gaussian process and its utilization in stochastic finite element analysis: Application to planar soft tissues. Computer methods in applied mechanics and engineering, 404, 115812
- Degruyter, W., Parmigiani, A., Huber, C. and Bachmann, O., 2019. How do volatiles escape their shallow magmatic hearth?. Philosophical Transactions of the Royal Society A, 377(2139), p.20180017
Multi-scale modelling of volcanoes and their deep magmatic roots: Fluid release from subvolcanic magma bodies
Project institution: University of Glasgow
Project supervisor(s): Dr Tobias Keller (University of Glasgow), Prof Andrew McBride (University of Glasgow), Prof Jin Sun (University of Glasgow) and Dr Ankush Aggarwal (University of Glasgow)
Overview and Background
This PhD studentship focuses on developing GPU-accelerated models of magmatic processes that underpin volcanic hazards and magmatic resource formation. These processes span sub-millimetre mineral-melt-fluid interactions up to kilometre-scale magma dynamics and crustal deformation. Magma is a multi-phase mixture of solids, silicate melts, and volatile-rich or other fluids, interacting in complex thermo-chemical-mechanical ways. The project will contribute one component of a hierarchical, multi-scale modelling framework using advanced GPU-based techniques. In this project, you will focus on developing a system-scale model of fluid exsolution and extraction from a crystallising magma body with implications for volcanic unrest preceding eruptions and the genesis of magmatic-hydrothermal deposits of critical metal ore.
Your work will include software development, integrating and interpreting field and experimental data sets, attending regular seminars, collaborating within a research team, and receiving training through ExaGEO workshops.
Volcanic eruptions originate from shallow crustal magma reservoirs built up over long periods. As magma cools and crystallises, it releases fluid phases—aqueous, briny, or containing carbonates, metal oxides, or sulfides—whose low viscosity and pronounced density contrasts drive fluid segregation. This fluid migration can trigger volcanic unrest or concentrate metals into economically valuable deposits. The micro- to meso-scale distribution of fluids—discrete droplets versus interconnected drainage networks—crucially depends on crystal and melt properties. Direct observations are challenging, so high-resolution, GPU-accelerated simulations provide a way to understand these complex and dynamic systems.
Methodology and Objectives
Modelling volcanic systems is challenging due to the multi-scale nature of their underlying physical and chemical processes. System-scale dynamics (100 m to 100 km) emerge from interactions involving crystals, melt films, and fluid droplets or channels on micro- to centimetre scales. To link these scales, this project uses a hierarchical approach: (i) direct numerical simulations of granular-scale phase interactions, (ii) deep learning-based computational homogenisation to extract effective constitutive relations, and (iii) system-scale mixture continuum models applying these relations to problems. All components leverage GPU-accelerated computing and deep learning to handle direct simulations at local scales, train effective constitutive models, and achieve sufficient resolution at the system scale.
In this project the candidate will develop and apply a novel system-scale three-phase flow model informed by effective constitutive models derived from granular-scale simulations and computational homogenisation. The model will build on a recent multi-phase reaction-transport theory framework [1] and numerical treatment [2], which will be implemented in a GPU-accelerated algorithm built on cutting-edge Julia packages [3]. To inform system-scale reaction and transport rates, the simulations will utilise constitutive models derived from granular-scale direct simulations and computational homogenisation (delivered by partner projects). The simulations will be used to systematically investigate the role of a regime transition in the transport of fluid phases through subvolcanic magma bodies from disconnected bubble migration to interconnected channelised drainage [4].
Within this framework, the student will start by working on two “teaser” projects to gain familiarity with different techniques and data, then choose how to further develop and focus their research.
Teaser Project 1:
This teaser project, conducted over the first year, will focus on the implementation of a mechanical three-phase (solid+liquid+liquid/vapour) transport model using GPU-accelerated algorithms in Julia. The implementation will follow from previous work demonstrated in a serial Matlab prototype [2] and use a staggered-grid finite-difference method to discretise the set of underlying PDEs in combination with a matrix-free iterative solution approach demonstrated to be highly efficient when run on massively parallel GPU infrastructure. The model will be structured such that it can switch between using traditional constitutive models as analytical functions of system variables [1,2] and querying trained neural nets which output flux and transfer rates given gradients and phase deviations in system variables.
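The matrix-free iterative approach can be illustrated on the simplest possible stand-in, a 1D Poisson problem solved by pseudo-transient stencil updates with no assembled matrix. The sketch is in Python for brevity; the actual implementation would use Julia with GPU arrays, where each stencil sweep maps onto one massively parallel kernel:

```python
import numpy as np

# Matrix-free pseudo-transient iteration for d2p/dx2 = f on [0, 1]
# with p = 0 at both ends. No global matrix is ever assembled; each
# iteration is a local stencil update, which is what maps so well
# onto massively parallel GPU hardware.
n = 101
x = np.linspace(0.0, 1.0, n)
dx = x[1] - x[0]
f = np.sin(np.pi * x)                    # source term
p = np.zeros(n)                          # solution with p(0)=p(1)=0

dtau = 0.4 * dx**2                       # pseudo-time step (stability limit dx^2/2)
for it in range(50000):
    res = np.zeros(n)
    # Residual of the PDE at interior points (3-point stencil).
    res[1:-1] = (p[2:] - 2 * p[1:-1] + p[:-2]) / dx**2 - f[1:-1]
    p += dtau * res                      # march residual to zero
    if np.max(np.abs(res)) < 1e-6:
        break

# Analytical solution of p'' = sin(pi x), p(0) = p(1) = 0.
p_exact = -np.sin(np.pi * x) / np.pi**2
```

In practice convergence is accelerated with damping and multi-resolution strategies, but the structure, repeated cheap local updates with no linear solve, is the one the Julia packages exploit.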
Teaser Project 2:
This teaser project, conducted over the first year, will focus on implementing and calibrating a multi-component petrological model to represent fractional crystallisation and fluid exsolution in a chosen volcanic context (to be determined). The petrological model will follow an approach utilising thermodynamics-inspired fitting functions [5] and will be calibrated against the output of an energy-minimising thermodynamic equilibrium solver [6] using machine learning tools. This established approach for formulating an approximate but robust and efficient form of phase equilibrium model will be compared to a novel approach of training a neural network to take pressure, temperature, and element composition of magmatic materials and return the proportions and compositions of stable phase assemblages.
Figure: 2-D simulation of a three-phase magma comprising melt, solids, and fluid bubbles at high melt fraction. A buoyant layer with lower melt fraction and higher fluid fraction near the base of the domain is initiating convection. Temporal evolution of (a) solid fraction, (b) liquid fraction and (c) fluid fraction. Panels (i)–(iii) have the respective phase velocity vectors overlain, while panel (iv) shows the last time step again with the respective phase segregation flux overlain. Figure from Wong & Keller, GJI (2022).
References and Further Reading
- Keller, T. and Suckale, J., 2019. A continuum model of multi-phase reactive transport in igneous systems. Geophysical Journal International, 219(1), pp.185-222
- Wong, Y.Q. and Keller, T., 2023. A unified numerical model for two-phase porous, mush and suspension flow dynamics in magmatic systems. Geophysical Journal International, 233(2), pp.769-795
- ParallelStencil and ImplicitGlobalGrid (Julia packages)
- Degruyter, W., Parmigiani, A., Huber, C. and Bachmann, O., 2019. How do volatiles escape their shallow magmatic hearth?. Philosophical Transactions of the Royal Society A, 377(2139), p.20180017
- Riel, N., Kaus, B.J., Green, E.C.R. and Berlie, N., 2022. MAGEMin, an efficient Gibbs energy minimizer: application to igneous systems. Geochemistry, Geophysics, Geosystems, 23(7), p.e2022GC010427
Multi-scale modelling of volcanoes and their deep magmatic roots: Granular dynamics of magma as a three-phase suspension
Project institution: University of Glasgow
Project supervisor(s): Prof Jin Sun (University of Glasgow), Dr Tobias Keller (University of Glasgow), Prof Andrew McBride (University of Glasgow), Dr Ankush Aggarwal (University of Glasgow) and Dr Kevin Stratford (University of Edinburgh)
Overview and Background
Overview: This PhD studentship focuses on developing GPU-accelerated models of magmatic processes that underpin volcanic hazards and magmatic resource formation. These processes span sub-millimetre mineral-fluid-melt interactions up to kilometre-scale magma dynamics and crustal deformation. Magma is a multi-phase mixture of solids, silicate melts, and volatile-rich fluids, interacting in complex thermo-chemical-mechanical ways. The project will contribute one component of a hierarchical, multi-scale modelling framework using advanced GPU-based techniques. In this project, you will focus on studying the detailed mineral-melt-fluid interactions and the resulting flow (rheology) of this mixture and the mobility of fluids migrating through it. This will shed light on the fundamental mechanisms that determine the large-scale multiphase flow and provide crucial data for macroscopic constitutive modelling of phase properties and interactions. The project in itself can thus enhance our understanding of the granular-scale physics and mechanisms of magmatic flows. The work can be performed independently but will benefit from interactions with other partner projects under the same magma multi-scale modelling theme.
Your work will include software development, integrating and interpreting field and experimental data sets, attending regular seminars, collaborating within a research team, and receiving training through ExaGEO workshops.
Background: Volcanic eruptions originate from shallow crustal magma reservoirs built up over long periods. As magma cools and crystallises, it releases fluid phases—aqueous, briny, or containing carbonates, metal oxides, or sulfides—whose low viscosity and density contrasts drive fluid segregation. This fluid migration can trigger volcanic unrest or concentrate metals into economically valuable deposits. The distribution of fluids—discrete droplets versus interconnected drainage networks—crucially depends on crystal and melt properties. Direct observations are challenging, so high-resolution, GPU-accelerated simulations provide a way to understand these complex and dynamic systems.
Methodology and Objectives
Modelling volcanic systems is challenging due to the multi-scale nature of their underlying physical and chemical processes. System-scale dynamics (100 m to 100 km) emerge from interactions involving crystals, melt films, and fluid droplets or channels on micro- to centimetre scales. To link these scales, this project uses a hierarchical approach: (i) direct numerical simulations of granular-scale phase interactions, (ii) deep learning-based computational homogenisation to extract effective constitutive relations, and (iii) system-scale mixture continuum models applying these relations to problems. All components leverage GPU-accelerated computing and deep learning to handle direct simulations at local scales, train effective constitutive models, and achieve sufficient resolution at the system scale.
In this project, the candidate will simulate multiphase magma flow within a representative volume containing several hundred to thousands of crystal particles, under various flow conditions and material compositions. The melt and fluid dynamics will be resolved using the Lattice Boltzmann Method (LBM) [1], a highly efficient numerical technique well-suited for GPU parallelisation. The crystal particle dynamics will be modelled using the Discrete Element Method (DEM) [2], which captures the motion and interactions of individual particles.
The simulations will explore how granular-scale dynamics influence macroscopic phase transport properties, addressing key research questions such as: How are phase dynamics affected by fluids present as discrete droplets or as interconnected drainage networks? How does this depend on the amount and properties of crystals present? The data generated will also contribute to the development of constitutive models in a related project.
Within this framework, the student will start by working on two “teaser” projects to gain familiarity with different techniques and data, then choose how to further develop and focus their research.
Teaser Project 1:
This teaser project, conducted during the first year, will focus on the simple shear flow of melt-crystal mixtures. Shear flow within a 3D periodic box will be simulated across various shear rates and solid volume fractions. Under these conditions, the crystals are assumed to move together with the melt. The simulations will establish a baseline for the shear rheology of simplified mixtures, providing valuable insights into volume-dependent shear viscosity and the jamming transition. During this project, the candidate will be trained in using the coupled LBM-DEM code to simulate suspensions with freely moving particles. To resolve the detailed flow fields around particles, billions of lattice nodes are required. GPU-accelerated LBM and DEM codes will be employed to handle the large computational demand. The candidate will further optimise the coupling efficiency for GPU and run simulations on GPU computers.
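For orientation, the kind of volume-fraction-dependent viscosity and jamming behaviour these shear simulations will characterise is often summarised by empirical relations such as Krieger–Dougherty; the parameter values below are illustrative textbook choices, not project results:

```python
import numpy as np

# Krieger-Dougherty relation for the effective viscosity of a
# suspension: the relative viscosity diverges as the solid fraction
# phi approaches the jamming fraction phi_m.
phi_m = 0.64                     # random close packing of spheres (illustrative)
intrinsic = 2.5                  # Einstein coefficient for dilute spheres

def relative_viscosity(phi):
    """Suspension viscosity relative to the pure melt."""
    return (1.0 - phi / phi_m) ** (-intrinsic * phi_m)

phi = np.array([0.0, 0.2, 0.4, 0.6])
eta_r = relative_viscosity(phi)
```

The LBM–DEM simulations in this teaser project would effectively measure such a curve from first principles, including the steep rise and jamming transition as the crystal fraction approaches `phi_m`, rather than assuming a fitted form.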
Teaser Project 2:
This teaser project, also conducted during the first year, will focus on inter-phase drag forces between melt-fluid and crystals. Simulations will involve flows of melt-fluid mixtures through a fixed matrix of crystal solids, where there is no relative motion between the melt and fluid phases. The overall drag forces between the mixtures and the solid phase will be calculated. The regimes of discrete and continuous fluid phases and the transitions between them will be mapped. The candidate will be trained in multiphase LBM methodology and further optimise the implementation of this method for GPU within the GPU-accelerated LBM code structure.
Image: A 3D sheared suspension at solid fraction φ = 0.05, illustrated on a slice through the domain. The distribution of particles, indicated by the solid fraction, is shown on the left, and the horizontal component of the fluid velocity on the right.
References and Further Reading
- Najuch, T. & Sun, J. Analysis of two partially-saturated-cell methods for lattice Boltzmann simulation of granular suspension rheology. Computers & Fluids, 189, 1–12 (2019)
- Ness, C. & Sun, J. Flow regime transitions in dense non-Brownian suspensions: Rheology, microstructural characterization, and constitutive modeling. Physical Review E 91, 012201 (2015)
Uncertainty determination and visualisation of volcanic co-PDC ash plume dispersal
Project institution: Lancaster University
Project supervisor(s): Dr Thomas Jones (Lancaster University), Dr Frances Beckett (UK Met Office), Charlie Bates (UK Met Office) and Professor Mike James (Lancaster University)
Overview and Background
Substantial progress has been made in modelling the dispersion of volcanic plumes from explosive eruptions, but plumes formed from pyroclastic density currents (i.e., co-PDC plumes) have been largely neglected. They comprise fine-grained ash particles and hot gas, and can reach heights of tens of kilometres, potentially dispersing large volumes of ash over continental-scale areas, impacting the environment, and posing a risk to aviation. This project, alongside the Met Office, will quantify the uncertainties of modelling co-PDC ash dispersion using NAME (Numerical Atmospheric-dispersion Modelling Environment), which is used to generate forecasts for the London Volcanic Ash Advisory Centre. You will also construct workflows that can post-process NAME outputs from large ensemble runs into graphics/forecasts for multiple end-users (e.g., aviation industry, meteorologists, research scientists) at operational speed.
Methodology and Objectives
Ensemble forecasting; Monte Carlo analysis; Numerical Weather Prediction models; Lagrangian and Eulerian particle dispersion models (NAME); Parallel computing; probability mapping; JASMIN; High Performance Computing.
Teaser Project 1: Evaluating uncertainties associated with co-PDC ash dispersion
The Met Office is home to the London Volcanic Ash Advisory Centre (VAAC). The role of the London VAAC is to provide advice, forecasts and guidance to the aviation authorities on the presence of volcanic ash in the atmosphere, especially for eruptions originating in Iceland. Going forward, the VAACs will be required to issue quantitative and probabilistic volcanic ash concentration information that incorporates uncertainties in both the weather data and the eruption source parameters (e.g., mass eruption rate, plume height, particle size, shape). Currently, the UK Met Office uses the Numerical Atmospheric-dispersion Modelling Environment (NAME) to provide operational forecasts based on a single set of meteorological data and a specific set of eruption source parameters. However, to present probabilistic outputs or outputs with quantitative estimates of uncertainty, ensemble or Monte Carlo runs are required. This increases computational time and cost, which need to be minimised for real-time operational forecasting during an emergency.
In this teaser project you will address this upcoming challenge for the specific case of co-PDC ash plumes. You will develop code to use ensemble meteorological data with NAME for co-PDC plumes. You will optimise this approach on high performance computing infrastructure (e.g., JASMIN) such that it can be used over timescales appropriate for real-time eruption response. This teaser project could be further developed by exploring the uncertainty in eruption source parameters that are unique to co-PDCs (e.g., vent location, aspect ratio, particle shape) and, if time allows, expansion to other types of volcanic eruption.
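The Monte Carlo idea can be sketched as follows: sample uncertain eruption source parameters, push each member through the dispersion model (here a crude Gaussian-plume stand-in, not NAME), and report percentiles instead of a single deterministic value. All distributions and parameter values below are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(4)
n_members = 500

# Sample uncertain eruption source parameters (distributions illustrative).
mer = rng.lognormal(mean=np.log(1e6), sigma=0.5, size=n_members)  # mass eruption rate, kg/s
height = rng.normal(12.0, 2.0, size=n_members)                    # plume height, km
u = rng.normal(20.0, 5.0, size=n_members)                         # wind speed, m/s

def toy_concentration(mer, height, u, x=100e3):
    """Crude Gaussian-plume stand-in for a NAME run: relative ash
    concentration at distance x downwind (not a real dispersion model)."""
    sigma = 0.1 * x
    return mer / (2 * np.pi * sigma**2 * np.maximum(u, 1.0)) * np.exp(-height / 10.0)

# Each member is an independent model run, so the ensemble is
# embarrassingly parallel across HPC nodes.
conc = toy_concentration(mer, height, u)

# Report the ensemble as percentiles rather than one deterministic value.
p05, p50, p95 = np.percentile(conc, [5, 50, 95])
```

The operational challenge described above is precisely that each member is a full NAME run rather than a one-line formula, which is why the ensemble must be scheduled efficiently on infrastructure such as JASMIN.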
Teaser Project 2: Visualisation and interaction with large, complex ash dispersion datasets
Due to the real-time operational nature of the London VAAC, individual model runs can be executed within minutes; however, they generate large datasets that need to be post-processed and visualised quickly (e.g., ash concentration at different flight levels, total plume mass loadings, embedded wind fields and precipitation data, all as a function of time since the eruption). With the growing use of ensemble model forecasts these datasets are expected to grow by several orders of magnitude in the coming years. Thus, generating computationally cheap methods to post-process these big datasets and provide effective visualisation presents a key challenge. In this teaser project you will use a large ensemble volcanic ash dispersion dataset and develop a set of parallel workflows and robust data structures to post-process these data and display the required VAAC graphics whilst minimising computational time. This teaser project could be further developed by tailoring these workflows and outputs to different end-users (e.g., aviation industry, meteorologists, research scientists), who will have different requirements and different knowledge bases for data visualisation and analysis.
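Post-processing an ensemble into probabilistic graphics reduces to cheap vectorised passes over the member axis. A sketch with a synthetic ensemble (the threshold, grid dimensions, and lognormal loadings are illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical ensemble of ash-column loadings: (member, lat, lon).
ensemble = rng.lognormal(mean=0.0, sigma=1.0, size=(50, 40, 60))

threshold = 2.0  # illustrative contamination threshold, g m^-2

# Probability-of-exceedance map: the fraction of members above the
# threshold in each grid cell -- one vectorised pass over the whole
# ensemble, no per-member loop.
p_exceed = (ensemble > threshold).mean(axis=0)

# Percentile maps for 'likely worst case' style graphics.
p95_map = np.percentile(ensemble, 95, axis=0)
```

Real VAAC outputs add flight levels and time as extra axes, so the data structures must keep these reductions contiguous in memory; that layout question is one focus of the teaser project.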
Image: Pyroclastic flows at Mayon Volcano.
References and Further Reading
- Jones, T.J., Beckett, F., Bernard, B., Breard, E.C., Dioguardi, F., Dufek, J., Engwell, S. and Eychenne, J., 2023. Physical properties of pyroclastic density currents: relevance, challenges and future directions. Frontiers in Earth Science, 11, p.1218645
- Madankan, R., Pouget, S., Singla, P., Bursik, M., Dehn, J., Jones, M., Patra, A., Pavolonis, M., Pitman, E.B., Singh, T. and Webley, P., 2014. Computation of probabilistic hazard maps and source parameter estimation for volcanic ash transport and dispersion. Journal of Computational Physics, 271, pp.39-59
- Leadbetter, S.J., Jones, A.R. and Hort, M.C., 2022. Assessing the value meteorological ensembles add to dispersion modelling using hypothetical releases. Atmospheric Chemistry and Physics, 22(1), pp.577-596
- Capponi, A., Harvey, N.J., Dacre, H.F., Beven, K., Saint, C., Wells, C. and James, M.R., 2022. Refining an ensemble of volcanic ash forecasts using satellite retrievals: Raikoke 2019. Atmospheric Chemistry and Physics, 22(9), pp.6115-6134
- Beckett, F., Barsotti, S., Burton, R., Dioguardi, F., Engwell, S., Hort, M., Kristiansen, N., Loughlin, S., Muscat, A., Osborne, M. and Saint, C., 2024. Conducting volcanic ash cloud exercises: practising forecast evaluation procedures and the pull-through of scientific advice to the London VAAC. Bulletin of Volcanology, 86(7), p.63
- Beckett, F.M., Witham, C.S., Hort, M.C., Stevenson, J.A., Bonadonna, C. and Millington, S.C., 2015. Sensitivity of dispersion model forecasts of volcanic ash clouds to the physical characteristics of the particles. Journal of Geophysical Research: Atmospheres, 120(22), pp.11-636
- Met Office Dispersion Model
- JASMIN
Projects with a focus on Sustainability Solutions in Engineering, Environmental, and Social Sciences:
AI Meets Glasgow’s Trees: Metrics Prediction, 3D Mapping, and Socio-Ecosystem Impact Simulations
Project institution: University of Glasgow
Project supervisor(s): Dr Meiliu Wu (University of Glasgow), Dr Davide Dominoni (University of Glasgow), Dr Luigi Cao Pinna (University of Glasgow), Dr Dominic McCafferty (University of Glasgow), Dr Alex Bush (Lancaster University), Doug McNeil (EOLAS Insights Ltd) and Gillian Dick (Glasgow City Council)
Overview and Background
The project aims to harness the power of AI to advance sustainable forestry and woodland management in Glasgow, focusing on predicting tree metrics (e.g., species, height, canopy area, and tree health), creating a detailed and interactive 3D tree map, and assessing tree impacts on the socio-ecosystem. This research will combine cutting-edge deep learning techniques and statistical inference with diverse data sources, including high-resolution LiDAR data, remote sensing imagery, street view images, environmental sensor data, citizen-science text, and social media. By leveraging exascale GPU computing, the project will develop scalable, real-time models and tools to support sustainable urban planning, biodiversity conservation, and climate resilience efforts.
The project will comprise two teaser projects: (1) AI-powered prediction of tree metrics, and development of a 3D visualisation of Glasgow’s forestry and woodland; and (2) socio-ecosystem simulations to evaluate trees’ impacts on biodiversity, climate mitigation, and human well-being. These components will enable researchers, city planners, and policymakers to make data-driven decisions for a greener, more sustainable Glasgow.
Teaser Project 1: AI for Glasgow’s Greenness: Predicting and Visualising Tree Metrics with 3D Insights
The objective is to develop advanced AI models that predict key metrics of Glasgow’s trees, including species, height, canopy area, and tree health, and visualise these metrics on an interactive 3D map.
Data Sources:
- Ground truth data: Collect and label ground truth data for tree species, height, canopy area, and age.
- LiDAR and Remote Sensing (RS): High-resolution 3D point clouds and spectral imagery.
- Street View Images: Ground-level visual perspectives of trees.
- Citizen-Science Text and Social Media: Descriptive keywords and local observations about tree conditions.
Models and Approach:
- In Year 1 you will train or fine-tune deep learning models, such as convolutional neural networks (CNNs), for image and LiDAR data analysis to predict tree height, canopy area, and age.
- In Year 2 you will fine-tune vision-language models (VLMs, e.g., OpenAI’s CLIP[1] and GPT-4[2]) to seamlessly integrate visual inputs (e.g., remote sensing imagery, street view photos, citizen-science images, and social media visuals) with textual data (e.g., citizen-science descriptions and social media posts), to predict tree species and assess tree health (Wu et al., 2023; Wu & Huang, 2022). Specifically, pre-trained VLMs will be fine-tuned using image-text pairs as input. These pairs will consist of images associated with species IDs from citizen-science databases (e.g., Treezilla, GBIF, and iNaturalist), supplemented with collaboratively sourced community data on Glasgow trees, including their health and overall condition.
- In Year 3 you will map the collected and predicted attributes of Glasgow’s forestry and woodland as a web-based 3D visualisation.
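To make the Year 2 step concrete, the sketch below shows how a fine-tuned CLIP-style vision-language model might be queried for zero-shot species prediction: the species whose text embedding is most similar to the image embedding wins. The labels and 4-dimensional embeddings are toy placeholders standing in for real model outputs, not project data.

```python
import numpy as np

def predict_species(image_emb, text_embs, labels):
    """CLIP-style zero-shot prediction: choose the label whose text
    embedding has the highest cosine similarity to the image embedding."""
    img = image_emb / np.linalg.norm(image_emb)
    txt = text_embs / np.linalg.norm(text_embs, axis=1, keepdims=True)
    sims = txt @ img                      # cosine similarity per label
    return labels[int(np.argmax(sims))], sims

# Toy 4-d embeddings standing in for real CLIP image/text encoder outputs
labels = ["oak", "birch", "rowan"]
text_embs = np.array([[1.0, 0.0, 0.0, 0.1],
                      [0.0, 1.0, 0.0, 0.1],
                      [0.0, 0.0, 1.0, 0.1]])
image_emb = np.array([0.9, 0.1, 0.0, 0.1])   # closest to "oak"

species, sims = predict_species(image_emb, text_embs, labels)
```

In the real pipeline the embeddings would come from the fine-tuned encoders, and the text side would encode prompts built from citizen-science species labels.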
Outputs:
- A detailed, fine-grained, and interactive 3D map of Glasgow’s trees, showing tree metrics (e.g., height, canopy area, species, health, and diversity), serving as a visualisation platform for city planners, ecologists, and researchers.
Teaser Project 2: Assessing Socio-Ecological Benefits of Glasgow’s Trees: AI-Powered Semantic Insights and Simulations
The objective is to measure the socio-ecological benefits of Glasgow’s trees, focusing on public sentiment, biodiversity, and climate resilience using AI-powered simulation models and semantic analysis.
Data Sources:
- Tree data: location, canopy cover, height, etc.
- Biodiversity data (e.g., insect, bird and mammal data collected at 40 sites in Glasgow from NERC GALLANT project).
- Environmental sensor data (e.g., temperature and air quality).
- Natural hazard data (e.g., flood-prone areas).
- Social media data and citizen-science descriptive text.
Models and Approach:
- Tree indices: Measure tree diversity and abundance/biomass indices by area/zone.
- Semantic Analysis: Use large language models (e.g., BERT[3]) and statistical topic modelling methods (e.g., Latent Dirichlet Allocation (LDA)) to analyse public sentiment and attitudes toward urban trees from citizen-science text and social media posts (e.g., from Flickr and Twitter) (e.g., Cao Pinna et al., 2024).
- AI-powered Simulation: Associating Tree Indices with Environmental Variables
- Cooling Effects: By integrating tree attributes, land surface temperatures, and urban heat island patterns, deep learning models (e.g., CNNs) can predict temperature reductions at fine spatial-temporal scales and identify optimal tree placement.
- Flood Mitigation: By using time-series rainfall data, deep learning techniques (e.g., RNNs) can predict and analyse how tree roots, soil permeability, and canopy interception influence stormwater absorption and surface runoff, and evaluate how tree planting strategies mitigate flood risks.
- Carbon Sequestration: Deep learning models (e.g., reinforcement learning) can leverage tree growth data, biomass estimates, and carbon flux measures to predict carbon sequestration rates, by identifying patterns of carbon uptake across diverse tree species and environmental conditions.
- AI-enhanced Impact Assessment of Tree Abundance, Diversity, and Species
- Tree Abundance and Diversity: Deep learning techniques (e.g., unsupervised clustering) can identify patterns in tree abundance (e.g., total biomass, canopy area) and diversity (species richness, evenness) that promote insect diversity.
- Role of Specific Tree Species: Deep learning-powered species interaction models, such as graph neural networks (GNNs), can identify keystone tree species that sustain insect and bird populations across trophic levels.
- Uncertainty Measurement: Develop ensemble models to quantify uncertainties in predictions and simulate multiple future scenarios.
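As a minimal illustration of the tree-index step above, the sketch below computes standard diversity metrics (species richness, Shannon diversity, Pielou evenness) for one zone from species counts; the counts are hypothetical, not GALLANT data.

```python
import math
from collections import Counter

def diversity_indices(species_counts):
    """Species richness, Shannon diversity H', and Pielou evenness J'
    from a mapping of species name -> stem count for one zone."""
    total = sum(species_counts.values())
    props = [c / total for c in species_counts.values() if c > 0]
    shannon = -sum(p * math.log(p) for p in props)
    richness = len(props)
    evenness = shannon / math.log(richness) if richness > 1 else 0.0
    return {"richness": richness, "shannon": shannon, "evenness": evenness}

# Hypothetical counts for one city zone
zone = Counter({"oak": 40, "birch": 30, "rowan": 20, "ash": 10})
idx = diversity_indices(zone)
```

Indices like these, computed per area/zone, would then serve as response or predictor variables in the simulation and impact-assessment models listed above.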
Outputs:
- Year 1: Semantic maps visualising public sentiment across the city.
- Year 2: Predictive simulations of urban trees’ roles in mitigating urban heat islands, reducing flood risks, enhancing air quality, and improving urban biodiversity in insects and birds.
- Year 3: Interactive dashboards for policymakers to evaluate tree-ecosystem trade-offs and benefits. For example, similar to NYC Tree Map, the dashboard could display ecological benefits: (1) Energy conserved each year (kWh); (2) Stormwater intercepted each year (gallons); and (3) Air pollutants removed each year (pounds).
Image: Project framework
References and Further Reading
- Cao Pinna, L., Miller, C., & Scott, M. (2024). Latent Dirichlet allocation and hidden Markov models to identify public perception of sustainability in social media data. International Workshop on Statistical Modelling, 14–20
- Wu, M., & Huang, Q. (2022). IM2City: image geo-localization via multi-modal learning. Proceedings of the 5th ACM SIGSPATIAL International Workshop on AI for Geographic Knowledge Discovery, 50–61
- Wu, M., Huang, Q., Gao, S., & Zhang, Z. (2023). Mixed land use measurement and mapping with street view images and spatial context-aware prompts via zero-shot multimodal learning. International Journal of Applied Earth Observation and Geoinformation, 125, 103591
- AI initiatives in silviculture, especially concerning woodland health detection (click here)
- Glasgow City Council’s forestry and woodland strategy (click here)
- Tree Map Prototype in NYC
- Tree Map Prototype in Singapore
-
Exploring solver approaches for climate and fluid simulations
Project institution: University of Edinburgh
Project supervisor(s): Adrian Jackson (University of Edinburgh) and Dr Sergio Campobasso (Lancaster University)
Overview and Background
Numerical solvers are the core of many computational simulation approaches. Weather and climate simulations are prime examples of these, with weather/climate simulation packages such as the Met Office’s Unified Model (UM) containing various “dynamical core” implementations. Modern computer models of the atmosphere include many complex physical processes that each have local influences and feed back into the general circulation. At the heart of these models, however, is the solution of the dynamical equations of motion (Newton’s laws applied to a gas). For this reason, the model component that solves these equations is called the “dynamical core”[1].
Whilst these solvers are the subject of continual upgrades and improvements, they all follow similar approaches: implicit solvers, spatial discretisation, similar grid construction, advection schemes, and so on. However, alternative numerical approaches do exist, such as contour-based schemes[2], which move away from grid-based spatial discretisation towards potentially more functional implementations. In theory these should be significantly more efficient and handle complex phenomena such as turbulence in a more natural manner. However, they have yet to replace existing methods in production simulation.
This project aims to investigate the computational benefits and drawbacks of these different types of numerical approaches, especially with respect to the modern computing hardware (GPUs and other accelerators) that will make up future Exascale systems. We aim to understand the underlying computational requirements of different numerical approaches, how they map to computational systems, and what other functionality is required to enable upgrading the numerical approaches of our weather, climate, and environmental modelling systems. The promise is a 100x+ improvement in computational efficiency if this can be achieved.
Methodology and Objectives
These projects would undertake in-depth analysis of software and numerical approaches, current computing hardware, and the mapping of software to hardware. There is scope to dive deeply into the full range of numerical computation and become an expert in areas such as program optimisation, computational hardware, GPU programming, and parallel computing.
Teaser Project 1:
Implementation of core solver approaches in Python to provide demonstrator functionality for the main simulation methods currently used in research. This will allow investigation of the functionality and scale of each approach, and of its suitability for various simulations. The work will build on an initial literature review of similar activities in these areas, and will finish with the implementation of larger-scale simulation functionality for direct comparison between the different approaches. Getting hands-on with the simulation methods and solver approaches will let you understand first-hand the trade-offs, issues, and benefits of each solver approach and why it was chosen, giving insight into the challenges of replacing older solver approaches with newer options. The simulation targets for these experiments will be atmospheric and Earth-model scenarios that commonly use such solvers for weather or climate experiments.
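As a flavour of the kind of Python demonstrator meant here, the sketch below implements a first-order upwind scheme for 1D linear advection, the simplest member of the grid-based solver family that contour-based schemes would be compared against. Grid size, Courant number, and the Gaussian tracer are illustrative choices, not project specifications.

```python
import numpy as np

def upwind_advect(q, c, dx, dt, steps):
    """First-order upwind scheme for dq/dt + c*dq/dx = 0 on a periodic
    grid (c > 0): a minimal grid-based solver demonstrator."""
    nu = c * dt / dx                   # Courant number, must be <= 1
    assert nu <= 1.0
    for _ in range(steps):
        q = q - nu * (q - np.roll(q, 1))
    return q

n = 100
x = np.linspace(0.0, 1.0, n, endpoint=False)
q0 = np.exp(-200 * (x - 0.3) ** 2)     # Gaussian tracer blob at x = 0.3
q1 = upwind_advect(q0.copy(), c=1.0, dx=1.0 / n, dt=0.005, steps=100)
# The blob advects a distance of 0.5; periodic upwinding conserves mass
# exactly but smears the peak -- exactly the kind of trade-off a
# contour-advective scheme aims to avoid.
```

Comparing the smeared peak here against an analytic or contour-based solution is a small-scale version of the solver comparison the project would carry out.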
Teaser Project 2:
Computational hardware varies in memory bandwidth, computational intensity, and the specific functionality different types of hardware provide. For example, there is now considerable variation in floating-point hardware across GPUs and machine-learning accelerators (such as the Cerebras and Graphcore systems), with reduced precision and specialised instructions (such as Tensor operations on Nvidia GPUs) offering significant potential performance benefits, but also trade-offs in accuracy and in support for particular types of computation. This project would explore the hardware available in GPUs and other accelerators to characterise and classify hardware performance and functionality, building performance models that allow evaluation of applications across a range of hardware and hardware configurations. Hand in hand with the software-side work in the first teaser project, there is scope to build a comprehensive understanding of computing hardware for Exascale systems and how software maps to it. As with Teaser Project 1, the simulations used for benchmarking and comparing performance will be climate and weather simulation approaches that are key to natural and Earth science research.
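The simplest performance model of the kind described is the roofline model: a kernel's runtime is bounded by whichever of peak compute throughput and memory bandwidth it saturates first. The sketch below uses hypothetical accelerator figures (100 TFLOP/s, 2 TB/s) purely for illustration.

```python
def roofline_time(flops, bytes_moved, peak_flops, peak_bw):
    """Roofline runtime estimate: the kernel is limited by whichever of
    compute (flops/peak_flops) and memory traffic (bytes/peak_bw) takes
    longer."""
    return max(flops / peak_flops, bytes_moved / peak_bw)

# Hypothetical accelerator: 100 TFLOP/s peak compute, 2 TB/s bandwidth
PEAK_FLOPS, PEAK_BW = 100e12, 2e12

# Stream-like kernel (1 flop per 8 bytes): memory bound
t_stream = roofline_time(1e9, 8e9, PEAK_FLOPS, PEAK_BW)
# Dense kernel (100 flops per byte): compute bound
t_dense = roofline_time(1e12, 1e10, PEAK_FLOPS, PEAK_BW)
```

A real performance model for this project would refine this with cache behaviour, precision-dependent peak rates, and instruction mix, but the same limiting-resource logic underpins it.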
Figure 1: Example turbulent flow that requires significant computation resources to simulate. Copyright D. G. Dritschel, 2014.
References and Further Reading
- ENDGame: A new dynamical core for seamless atmospheric prediction (Met Office)
- D. G. Dritschel and M. H. P. Ambaum. A contour-advective semi-lagrangian algorithm for the simulation of fine-scale conservative fields. Quart. J. Roy. Meteor. Soc., 123:1097–1130, 1997
-
GPU-Accelerated High-Fidelity Hydrodynamics Modelling for Tidal Energy Resource and Environmental Impact Assessment
Project institution: University of Edinburgh
Project supervisor(s): Dr Joseph O’Connor (University of Edinburgh), Dr Brian Sellar (University of Edinburgh) and Dr Athanasios Angeloudis (University of Edinburgh)
Overview and Background
Tidal energy offers a predictable and sustainable energy source, driving increased interest in its development. However, as tidal energy deployments are scaled up, this places a greater burden on the local environment/ecosystem. High-fidelity hydrodynamic modelling tools are essential for predicting and mitigating environmental impacts, while also maximising energy extraction, to ensure this limited resource is used in a responsible way. However, these models are computationally demanding. Moreover, many existing tools are developed for traditional (CPU-based) HPC systems. With the advent of large GPU-based exascale machines, there is a need to prepare existing codes for this new HPC paradigm. As well as futureproofing existing codebases, this will provide a step change in computational capacity, unlocking new types of simulations (e.g. high-fidelity multi-physics) and workflows (e.g. optimisation, uncertainty quantification, data assimilation) not currently possible with today’s methods.
Methodology and Objectives
Both projects will use advanced computational techniques to improve hydrodynamic modelling capability for tidal energy resource assessment, as well as for predicting and mitigating environmental impacts. This will involve porting existing open-source hydrodynamic modelling tools to GPU to enable faster, larger, and more detailed simulations and workflows than is currently achievable. The focus is specifically on high-fidelity models, where the 3D hydrodynamic equations are solved numerically. Initially, this project will focus on tools already used throughout the supervisory team (e.g. TELEMAC-3D, Thetis). However, a survey period is envisaged to identify the most suitable tools to take forward to exascale applications.
Teaser Project 1: Accelerating high-fidelity hydrodynamic models for large-scale ensemble-based workflows
In most real-world applications, running a single simulation is often insufficient. Tasks such as optimisation and uncertainty quantification typically require hundreds/thousands of model evaluations. This is extremely computationally demanding, making it impractical with today’s high-fidelity methods. This project will enable these types of workflows on emerging GPU-based exascale machines by porting an existing high-fidelity hydrodynamics model to GPU. The objectives for the teaser project are:
- Survey and profile existing high-fidelity hydrodynamic modelling tools (starting with TELEMAC-3D and Thetis) to identify computational bottlenecks and evaluate the potential for porting to GPU.
- Build a proxy application replicating one of the identified bottlenecks (e.g. advection-diffusion equation) to test and evaluate different programming models/frameworks for porting to GPU (e.g. CUDA, OpenMP, SYCL).
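A proxy application of the kind named in the second objective can start very small. The sketch below is a CPU (NumPy) kernel for 1D advection-diffusion on a periodic grid, the sort of stencil loop one would then re-express in CUDA, OpenMP or SYCL; the grid sizes and coefficients are illustrative, not taken from TELEMAC-3D or Thetis.

```python
import numpy as np

def adv_diff_step(q, c, kappa, dx, dt):
    """One explicit step of 1D advection-diffusion (periodic boundaries):
    dq/dt + c*dq/dx = kappa*d2q/dx2, using upwind advection (c > 0) and
    centred diffusion. A CPU proxy kernel for GPU-porting experiments."""
    adv = -c * (q - np.roll(q, 1)) / dx
    diff = kappa * (np.roll(q, -1) - 2 * q + np.roll(q, 1)) / dx ** 2
    return q + dt * (adv + diff)

n = 200
dx, dt = 1.0 / n, 1e-4
x = np.linspace(0.0, 1.0, n, endpoint=False)
q = np.exp(-100 * (x - 0.5) ** 2)      # initial tracer blob
m0 = q.sum()                           # total tracer mass
for _ in range(500):
    q = adv_diff_step(q, c=1.0, kappa=1e-3, dx=dx, dt=dt)
# Both stencils telescope under periodic boundaries, so mass is
# conserved to round-off -- a useful correctness check to carry over
# to any GPU port.
```

Profiling a loop like this, then reproducing it kernel-by-kernel in each candidate programming model, gives a cheap testbed for the porting decisions before touching the full codebase.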
Following the first year, if this project is selected the remaining objectives will be to:
- Port selected components of the chosen model to GPU to improve computational performance. This will involve implementing kernels for GPU execution and optimising memory management.
- Benchmark the GPU-accelerated implementation to compare the performance against the existing CPU implementation and identify areas for optimisation. This will also involve testing the scalability across multiple GPUs for large-scale exascale applications.
- Integrate the GPU-accelerated model within a large-scale ensemble-based framework to enable workflows that require hundreds/thousands of model evaluations. This will be demonstrated on real-world cases by performing large-scale optimisation (e.g. for marine spatial planning) and uncertainty quantification (e.g. for model reliability) campaigns.
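When testing scalability across multiple GPUs, a useful first-order expectation comes from Amdahl's law: any non-parallelisable fraction of the runtime (halo exchange, I/O, serial setup) caps the achievable speedup. The sketch below is a back-of-envelope estimate with an assumed 5% serial fraction, not a measured figure for any of these codes.

```python
def strong_scaling_speedup(serial_frac, n_gpus):
    """Amdahl's-law estimate of strong-scaling speedup when a fixed
    fraction of the runtime does not parallelise across GPUs."""
    return 1.0 / (serial_frac + (1.0 - serial_frac) / n_gpus)

# Even 5% non-parallel work caps speedup below 20x, however many GPUs
for n in (1, 8, 64, 1024):
    print(n, strong_scaling_speedup(0.05, n))
```

Comparing measured multi-GPU benchmarks against this curve quickly shows whether a port is limited by serial sections or by something else (e.g. communication growing with GPU count).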
Teaser Project 2: Accelerating data assimilation for enhanced model calibration and predictive capability
Data assimilation (DA) combines real-world observational data with numerical simulations to enhance model predictions, leading to more reliable simulations for real-world problems. However, DA is computationally intensive, requiring sophisticated large-scale modelling and data processing techniques. This project will develop a GPU-accelerated DA framework tailored for tidal resource assessment to enable these workflows on emerging GPU-based exascale machines. The objectives for the teaser project are:
- Survey existing methods (e.g. 3D/4D variational DA, ensemble Kalman filter) and libraries for combining observational data with high-fidelity hydrodynamic models. This should also consider the form of the observational data (e.g. satellite, acoustic doppler current profiler, etc.).
- Build and profile a small CPU-based example of the DA framework to identify computational bottlenecks and determine priority components for porting to GPU (e.g. model, data processing, or a mix of both).
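To ground the survey objective, the sketch below implements one analysis step of a stochastic ensemble Kalman filter (one of the methods listed above) in NumPy, on a toy two-variable state with a single observed variable. All dimensions and values are illustrative; a real tidal DA framework would use the hydrodynamic model state and ADCP/satellite observation operators.

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(ensemble, obs, obs_op, obs_var):
    """Stochastic EnKF analysis step: nudge each ensemble member towards
    a perturbed observation using the ensemble-estimated covariances.
    ensemble: (n_state, n_ens); obs_op: (n_obs, n_state)."""
    n_ens = ensemble.shape[1]
    Hx = obs_op @ ensemble                              # members in obs space
    X = ensemble - ensemble.mean(axis=1, keepdims=True)
    HX = Hx - Hx.mean(axis=1, keepdims=True)
    P_xy = X @ HX.T / (n_ens - 1)                       # state-obs covariance
    P_yy = HX @ HX.T / (n_ens - 1) + obs_var * np.eye(len(obs))
    K = P_xy @ np.linalg.inv(P_yy)                      # Kalman gain
    perturbed = obs[:, None] + rng.normal(0.0, np.sqrt(obs_var), Hx.shape)
    return ensemble + K @ (perturbed - Hx)

# Toy setup: 2-variable state, observe the first variable only
ens = rng.normal(5.0, 2.0, size=(2, 200))    # prior ensemble
H = np.array([[1.0, 0.0]])
obs = np.array([1.0])
post = enkf_update(ens, obs, H, obs_var=0.1)
# The posterior ensemble mean moves towards the observation and its
# spread shrinks -- the behaviour the full DA framework scales up.
```

The matrix products in this update are exactly the operations that dominate at scale, which is why they are natural first candidates for GPU porting in the full-year objectives.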
Following the first year, if this project is selected the remaining objectives will be to:
- Port selected components of the DA framework to GPU to improve computational performance. This will involve selecting a suitable programming model/framework, implementing kernels for GPU execution, and optimising memory management.
- Benchmark and profile the GPU-accelerated implementation to compare the performance against the existing CPU implementation, as well as identifying areas for further development and optimisation.
- Demonstrate the new GPU-accelerated DA framework on real-world tidal energy applications (e.g. resource assessment and environmental impact). This will enable enhanced model calibration for uncertain parameters (e.g. bottom friction, turbulence parameters), as well as improve predictive accuracy by optimally combining model solutions with observational data.
References and Further Reading
- Almoghayer, M. A., Lam, R., Sellar, B., Old, C., & Woolf, D. K. (2024). Validation of tidal turbine wake simulations using an open regional-scale 3D model against 1MW machine and site measurements. In Ocean Engineering (Vol. 299, p. 117402). Elsevier BV (click here)
- Old, C., Sellar, B., & Angeloudis, A. (2024). Iterative dynamics-based mesh discretisation for multi-scale coastal ocean modelling. In Journal of Ocean Engineering and Marine Energy (Vol. 10, Issue 2, pp. 313–334). Springer Science and Business Media LLC (click here)
- TELEMAC-MASCARET
- Thetis