Geophysics/Meteorology Honours Projects 2014-2015

Course organiser: David Stevenson ([email protected]), Crew 314
Course secretary: Ken O'Neill ([email protected]), Grant 332

This document lists the projects on offer for senior honours (4th year) students registered on the Geophysics, Geophysics and Geology, and Geophysics and Meteorology programmes. The meteorology projects (and potentially some of the others too) are also on offer to Physics with Meteorology students (if Physics with Meteorology students are unsure whether projects are suitable, they should contact the course organiser (CO) or the primary supervisor). If you are interested in a particular project, please contact the main supervisor to discuss what is involved in more detail.

You need to choose a project and also a second choice (and third choice, etc., if possible) for each semester (Semester two (S2) choices can be revised later). In some cases you won't be able to do your first choice (for example if it is chosen by multiple people, and the supervisor cannot run several variants of the same project). If you don't get your first choice, we will try to make sure you do get your first choice in the other semester. Where projects are oversubscribed, the decision of the CO (generally in consultation with the supervisor) will be final. Any project or combination of projects can be taken, irrespective of whether you are a Geophysics or a Geophysics and Meteorology/Geology student. Project choices for S1 (and provisional choices for S2) should be emailed to the CO (email address above) by Tuesday in Week one (16th Sept), and finalised allocations will be made by the end of Week one (19th Sept) so that you have sufficient time to fully tackle projects. Students can propose their own projects, but will need to identify a suitable supervisor (normally amongst the geophysics/meteorology staff), and convince that supervisor and the CO that the project is sensible and feasible. Students should do this as soon as possible, to fit with the above timetable.

Projects listed here are a mixture of one-semester, 20-credit projects and two-semester, 40-credit projects. In some cases, 20-credit semester one projects can be extended to 40-credit projects. If students wish to extend their semester one project, they will need to get the supervisor's and the CO's agreement before week seven of S1. Single-semester projects should typically be 20-25 pages of A4 (and a maximum of 30 pages, including all diagrams, references and appendices; 12 point font, 2.5 cm margins, 1.0 line spacing, space between paragraphs); two-semester projects should typically be 40-50 pages (maximum 60 pages). Projects should be spiral bound (e.g., at the JCMB copy shop). Students doing 40-credit projects also need to hand in an interim report; this will be marked and you will be given feedback, as for 20-credit projects. The mark for the interim report makes up 25% of the overall mark (this is new for 2014/15). Projects are independently marked by the primary supervisor and one other staff member, using the criteria laid out on the mark sheets, see: http://xweb.geos.ed.ac.uk/~dstevens/teaching/GP_Projects_marking_protocol.pdf.

Hand in S1 project reports and 40-credit interim reports by 12 noon on Tuesday January 13th 2015. Hand in S2 project reports and 40-credit final reports by 12 noon on Friday April 4th 2015.
As part of the introduction to year four (S1, Week 0), projects will be introduced by the CO, along with examples of good (and bad) practice in how students should tackle their projects, including writing them up. It is up to students to contact potential project supervisors to discuss projects (supervisors' contact details should be included in the project descriptions). If supervisors cannot be contacted, please let the CO know. In the middle of S2 (during 'Innovative Learning Week'), students will be expected to give a short presentation on their semester one project or, if they are doing a single 40-credit project, on that. This presentation will not count towards your final project mark, but does contribute 20% towards the final mark on the 'Transferable Skills for Geophysicists' course.

How thick is the ionosphere? Computing density from induction coil data and ionosonde measurements
Ciarán Beggan [[email protected]] (British Geological Survey) and Kathy Whaler (School of GeoSciences)

Induction coils measure small and very rapid changes of the magnetic field. A set of induction coils at the Eskdalemuir observatory record magnetic field changes over a frequency range of 0.1–10 Hz, which encompasses geophysical wave phenomena related to the conductive upper atmosphere (called the ionosphere). The ionosphere can act as a partial 'barrier' off which electromagnetic (EM) waves can bounce, become trapped and resonate back and forwards for a short time. The so-called Schumann resonances (trapped between the Earth's surface and the bottom of the ionosphere) and Ionospheric Alfvén Resonances (trapped between the bottom and the top of the ionosphere) are two such examples. However, to observe these phenomena, we have to look in the frequency domain of the measured magnetic time-series.

Figure 1: A spectrogram (energy at each frequency over time) of induction coil data for 14 Feb 2014. The width between the black dotted lines (∆f) gives information about the density of the ionosphere (from Beggan, 2014).

In spectrogram plots (i.e. a plot of the energy at a particular frequency versus time), the Ionospheric Alfvén Resonances are observed as a set of 4-12 fringes, occurring during local night time and disappearing during the daylight hours. Their behaviour and occurrence in frequency (f), and the difference in frequency between fringes (∆f), vary throughout the year. Figure 1 gives an example spectrogram showing what the resonances look like and when they occur (overlain with dotted lines). Research has shown that the frequency interval (∆f) between the fringes is related to the density of the upper atmosphere: a 'thinner' atmosphere allows the EM waves to propagate more quickly, leading to wider frequency intervals. As an added bonus, it turns out that if we measure another property of the ionosphere (from an instrument called an ionosonde1) we can quantify these relationships exactly and infer the density (e.g. Hebden et al., 2005). A new method of automatic detection based on signal and image processing techniques has been developed to identify the fringes and to quantify the values of ∆f; previously, this required manual identification of the fringes every day. Figure 1 shows an example of the detected fringes in the spectrogram (Beggan, 2014).
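To illustrate the spectrogram step on which the fringe detection is based, the sketch below computes a spectrogram from a synthetic induction-coil record and reads off the spacing between spectral peaks. This is a minimal Python/SciPy illustration only (the project's actual detection code is in Matlab); the sampling rate, synthetic fringe frequencies and peak-picking threshold are invented assumptions, not the BGS processing chain.

```python
import numpy as np
from scipy.signal import spectrogram, find_peaks

fs = 100.0                       # assumed sampling rate (Hz) of the induction-coil record
t = np.arange(0, 3600, 1/fs)     # one hour of synthetic data

# Synthetic signal: a few narrow resonance "fringes" below 5 Hz plus noise,
# standing in for the real Eskdalemuir time series
sig = sum(np.sin(2*np.pi*f0*t) for f0 in (0.8, 1.6, 2.4, 3.2)) + 0.5*np.random.randn(t.size)

# Spectrogram: power at each frequency as a function of time
f, tt, Sxx = spectrogram(sig, fs=fs, nperseg=4096, noverlap=2048)

# Average the spectrogram over the hour and pick the resonance peaks below 5 Hz
band = f < 5.0
mean_power = Sxx[band].mean(axis=1)
peaks, _ = find_peaks(mean_power, prominence=mean_power.max()*0.1)
fringe_freqs = f[band][peaks]

# Delta-f: median spacing between adjacent fringes, the quantity related to ionospheric density
delta_f = np.median(np.diff(fringe_freqs))
print("fringe frequencies (Hz):", np.round(fringe_freqs, 2))
print("median fringe spacing delta-f (Hz): %.2f" % delta_f)
```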
The project will look at two years of induction coil data from Eskdalemuir and use the automatic detection code to produce hourly values of ∆f. These will be combined with hourly values of the Chilton ionosonde data to compute various ionospheric parameters, such as density, cavity length and wave velocity, and to see how they vary seasonally and annually. The student will be required to process time-series data from a number of different data sets, to produce a small piece of comparison code and to plot the data in a suitable format. The project will use Matlab to process and plot the data.

References:
[1] Beggan, C., Automatic detection of Ionospheric Alfvén Resonances using signal and image processing techniques, Annales Geophysicae, in press, 2014.
[2] Hebden, S. R., Robinson, T. R., Wright, D. M., Yeoman, T., Raita, T., and Bosinger, T.: A quantitative analysis of the diurnal evolution of Ionospheric Alfvén resonator magnetic resonance features and calculation of changing IAR parameters, Annales Geophysicae, 23, 1711-1721, 2005.

1 Ionosondes are essentially radio transmitters that record values related to the reflectance of the mid-ionosphere at the 5.4 MHz radio or foF2 frequency; in the UK there is an instrument at Chilton. http://www.ukssdc.ac.uk/ionosondes/ionosondes.html

Using induction coils for noise reduction in fluxgate magnetometers
Ciarán Beggan [[email protected]], Chris Turbitt (British Geological Survey) and Kathy Whaler (School of GeoSciences)

Modern geomagnetic satellites such as the ESA Swarm mission provide data at a time resolution of one second. Traditionally, ground-based magnetic observatories have produced minute-mean values, though they are now moving towards producing one-second values. For a variety of technical and physical reasons, traditional fluxgate vector magnetometers often have a high noise level at this cadence, though newer instruments are beginning to achieve lower noise levels. High-frequency induction coil instruments can give us a measure of the difference between consecutive values of the magnetic field at high temporal sampling rates. This can be used to 'correct' the one-second data from a lower cadence instrument. We will use the data from a two-component induction coil system operated at the Eskdalemuir observatory and use it to improve the signal-to-noise ratio of the nearby standard one-second fluxgate readings.

Figure 1: (Top) Example one-second magnetic data from a vector magnetometer. (Bottom) Corrected magnetic data using an induction coil magnetometer to reduce the noise (from Brunke & Korte, 2013).

The proposed method to reduce the noise in one-second fluxgate magnetometer data is to combine them with induction coil data (Brunke and Korte, 2013). The induction coil data are integrated over time, resulting in three unknown parameters to be solved for: (a) a scaling factor linking the measured induced values to the time derivative, (b) a constant offset in the induced data and (c) the integration constant. These are solved for by forming a series of simultaneous equations and performing a least squares inversion. The project will look at data from several instruments, including the Eskdalemuir standard fluxgate magnetometers, a newly acquired LEMI fluxgate magnetometer and data from a set of magnetometers from a relatively low-resolution 'school' magnetometer system.
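To give a flavour of the inversion involved, the sketch below solves the three-parameter least-squares problem described above on synthetic data: the coil signal is integrated over time and regressed against the noisy fluxgate record. It is a hedged Python illustration (the project itself will use Matlab) and is not the Brunke & Korte code; the synthetic scale factor, offset and noise levels are invented for demonstration.

```python
import numpy as np

dt = 1.0                          # one-second cadence (s)
t = np.arange(0, 3600.0, dt)

# Synthetic "truth": a slowly varying magnetic field component (nT)
B_true = 50.0 + 2.0*np.sin(2*np.pi*t/1800.0)

# Fluxgate record: truth plus relatively large instrument noise
B_flux = B_true + 0.5*np.random.randn(t.size)

# Induction-coil record: proportional to dB/dt, plus a small offset and low noise
true_scale, true_offset = 4.0, 0.02
coil = true_scale*np.gradient(B_true, dt) + true_offset + 0.001*np.random.randn(t.size)

# Integrate the coil signal over time (cumulative trapezoidal integration)
I = np.concatenate(([0.0], np.cumsum(0.5*(coil[1:] + coil[:-1])*dt)))

# Linear model: B(t) ~ a*I(t) + b*t + c, where a absorbs the coil scale factor,
# b the (scaled) constant offset and c the integration constant
G = np.column_stack([I, t, np.ones_like(t)])
m, *_ = np.linalg.lstsq(G, B_flux, rcond=None)
B_corrected = G @ m               # low-noise reconstruction of the field variation

print("recovered parameters a, b, c:", np.round(m, 4))
print("rms of fluxgate noise     : %.3f nT" % np.std(B_flux - B_true))
print("rms after coil correction : %.3f nT" % np.std(B_corrected - B_true))
```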
The student will be required to process time-series data from a number of different data sets, to produce a small piece of inversion code and to plot the data in a suitable format. The project will use Matlab to process and plot the data.

References:
[1] Brunke, H.-P., & Korte, M. (2013). Noise reduction of fluxgate data by joint interpretation with induction coil data. In P. Hejda (Ed.), Proceedings of the XVth IAGA Workshop on Geomagnetic Observatory Instruments, Data Acquisition and Processing: extended abstract volume (pp. 34-37).

Identifying low-frequency earthquakes in laboratory deformation data
Andrew Bell ([email protected]) and Phil Benson (Portsmouth)
1 semester project (S1 only)

Low frequency (LF) earthquakes are a ubiquitous feature of active volcanic areas, and are a key indicator of increased unrest. In many instances, LF earthquakes are thought to be generated by the movement of fluids at depth (gas, hydrothermal fluids, or magma). However, this origin is controversial, and alternative models have been proposed in which fluids play a much less important role. Laboratory deformation tests provide an opportunity to discriminate between these models, and allow an evaluation of the extent to which LF signals can be used to forecast eruption onset. An important element of these tests is the analysis of micro-seismicity (acoustic emission, or AE) data recorded in the lab, and in particular, the ability to identify LF events. This project will investigate different methods to identify LF events in AE data. The student will use functions in the Python-based ObsPy library to evaluate the performance of different event identification routines, using data collected in laboratory experiments run by colleagues at the University of Portsmouth. Standard amplitude-based techniques (e.g. STA/LTA) will be compared against cross-correlation techniques aimed at identifying repeating signals.

Figure 1: LF AE data from laboratory simulations.

References: Benson, P.M., et al., 2008, Laboratory Simulation of Volcano Seismicity, Science, 322, 249, doi: 10.1126/science.1161927. Burlini, L., et al., 2007, Seismicity preceding volcanic eruptions: new experimental insights, Geology, 35, 183-186.

Seasonality of earthquake catalogues in northern Iceland
Andrew Bell ([email protected]) and Ian Main
1 semester project (S1 only)

Seismicity is an important tool for understanding the physical processes occurring at depth within the Earth. In northern Iceland, large numbers of earthquakes are recorded on the local seismic monitoring network, and provide an opportunity to study the complex interactions between magmatism and faulting in this rift-transform system. The earthquake catalogue is influenced both by natural processes and by those associated with the monitoring network. Winters are harsh in this part of the world, and storm noise can raise the magnitude detection threshold; unpicking these monitoring effects from natural processes is important to avoid artefacts. This project will look at the seasonal properties of the earthquake catalogue in northern Iceland, using established Python routines to determine magnitude completeness thresholds and seismic "b-values". It will develop a "best-practice" methodology for managing these effects when undertaking more complex analyses.

Figure 2: Locations of earthquakes in N Iceland (left) and magnitudes of earthquakes in this region, 2012-2014 (right).

References:
Woessner, J., and S. Wiemer, 2005, Assessing the Quality of Earthquake Catalogues: Estimating the Magnitude of Completeness and Its Uncertainty, Bulletin of the Seismological Society of America, 95, 684-698.
Wiemer, S., and M. Wyss, 2000, Minimum Magnitude of Completeness in Earthquake Catalogs: Examples from Alaska, the Western United States, and Japan, Bulletin of the Seismological Society of America, 90, 859-869.

Footprint of the South Asian monsoon on African climate
Supervisors: Massimo Bollasina ([email protected]), Simon Tett, and Gabi Hegerl
20 credits, semester 1 or 2, or extendable to 40 credits across 2 semesters

South Asia is home to the immense South Asian monsoon, a key component of the global water and energy cycles with profound worldwide influences. Its remarkable seasonal shift of precipitation is the life-blood of more than 30% of the world's population and their agrarian societies. A link between the S. Asian monsoon and African climate is expected based on the underlying dynamics: at lower levels, the African southwesterlies over the S. Atlantic reach the Horn of Africa, converging with the Somali jet in the western Indian Ocean; at upper levels, the tropical easterly jet resulting from regional heating is one of the main characteristics of the African monsoon. Many features of the above link are however still unclear, especially at subseasonal time scales. How the link works in current and future climate, with rapidly varying emissions of greenhouse gases and aerosols, has important implications for regional climate projections over Africa. In tackling this question, a significant source of uncertainty is internal climate variability, the unforced component arising from internal processes. This project will use data from the novel NCAR CESM Large Ensemble (LE) project to explicitly identify the role of internal climate variability in the link between the S. Asian monsoon and African climate for the recent past and the near future. The data consist of a 30-member ensemble of 1920-2080 experiments with a state-of-the-art global climate model. This project involves data analyses and plotting, hence knowledge of programming in IDL, Matlab or Python is highly desirable. More sophisticated utilities for reading, analysing and plotting the data will be provided.

Figure 1: Difference in standardized rainfall anomalies between post- and pre- S. Asian monsoon onset pentads. Positive anomalies are shaded. Figure from Camberlin et al. (2010).

Background reading:
Camberlin, P., B. Fontaine, S. Louvet, P. Oettli, and P. Valimba, 2010: Climate adjustments over Africa accompanying the Indian monsoon onset. J. Climate, 23, 2047-2064.
Kay, J. E., and coauthors, 2014: The Community Earth System Model (CESM) Large Ensemble Project: A Community Resource for Studying Climate Change in the Presence of Internal Climate Variability. Bull. Amer. Met. Soc., submitted.

Long-term global climate variability modulated by the Southern Ocean
Supervisors: Massimo Bollasina ([email protected]) and Simon Tett
20 credits, semester 1 or 2, or extendable to 40 credits across 2 semesters

The ocean plays a major role in driving climate variability on decadal to centennial time scales. Due to its much larger heat capacity and dynamical inertia, the ocean's memory greatly exceeds that of the atmosphere, making it a pacemaker for long-term variability in the climate system.
Natural decadal to centennial timescale variability has the potential to mask global warming signals arising from anthropogenic greenhouse gas emissions in observational data records. Hence, it is important to identify the sources and mechanisms of long-term natural climate variability. It is, however, difficult to obtain an estimate of long-term internal variability (i.e., that due to feedback processes within the climate system) purely from observations. A further perspective is provided by long control simulations (i.e., with no changes in external forcing agents) of climate models. The Southern Ocean has been identified as a driver of enhanced centennial variability through regional sea ice-oceanic heating coupled feedbacks. A footprint of the Southern Ocean on global quantities such as surface temperature has also been found (Figure 1). This project will analyse the output from a 5000-year control simulation of a state-of-the-art climate model, the US NOAA/GFDL CM3.0 model, to assess decadal to centennial simulated internal climate variability in the coupled climate system. This project involves data analyses and plotting, hence knowledge of programming in IDL, Matlab or Python is highly desirable. More sophisticated utilities for reading, analysing and plotting the data will be provided.

Figure 1: (Left) Time series of annual mean sea surface temperature (SST) over the Southern Ocean. (Right) Pattern of global SST corresponding to high minus low values of the index shown to the left. Figure from Latif et al. (2013).

Background reading:
Delworth, T. L., and F. Zeng, 2012: Multicentennial variability of the Atlantic Meridional Overturning Circulation and its climatic influence in a 4000 year simulation of the GFDL CM2.1 climate model. Geophys. Res. Lett., 39, L13702, doi:10.1029/2012GL052107.
Latif, M., T. Martin, and W. Park, 2013: Southern Ocean Sector Centennial Climate Variability and Recent Decadal Trends. J. Climate, 26, 7767-7782.

Analysis of Vertical Seismic Profile data from a shale gas reservoir
Mark Chapman ([email protected]) and Xiaoyang Wu
2 semester project

Production of oil and gas from shale reservoirs has increased rapidly in recent years. Shales have very low permeability, so production depends on the process of hydraulic fracturing, typically in horizontal wells, to increase the flow of hydrocarbons. The success of the method is extremely uneven, with most production coming from fracturing a small number of "sweet spots". Identification of such sweet spots from seismic data is a major current goal in the petroleum industry. Factors which are thought to influence productivity include the brittleness of the shale, the mineralogy and the existence of a natural fracture network. Each of these factors can influence seismic velocity. Shales are known to exhibit very strong seismic anisotropy, which means that their seismic velocities change significantly with direction. Understanding how this anisotropy varies within shale reservoirs is likely to be a key to identifying sweet spots. A Vertical Seismic Profile (VSP) survey is one in which sources are placed on the surface and receivers are placed down in the well. This geometry allows accurate measurements of seismic anisotropy to be made. A combination of "walkaway" and "walkaround" VSPs allows velocity to be measured as a function of both polar and azimuthal angle. In this project, we will analyse VSP data from a producing shale gas field in North America with a view to providing estimates of the in-situ seismic anisotropy.
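To give a flavour of what such an anisotropy estimate involves, the sketch below fits weak-anisotropy (Thomsen-style) parameters to synthetic P-wave velocities measured over a range of polar angles, as a walkaway VSP geometry would provide. It is a hedged Python illustration only; the velocity model, noise level and linearised fit are invented assumptions and not the actual processing flow for the North American data set.

```python
import numpy as np

# Synthetic "observed" P-wave velocities versus polar angle, generated from Thomsen's
# weak-anisotropy approximation:
#   Vp(theta) ~ Vp0 * (1 + delta*sin^2(theta)*cos^2(theta) + eps*sin^4(theta))
vp0_true, delta_true, eps_true = 3000.0, 0.10, 0.25    # assumed shale-like values
theta = np.radians(np.linspace(0, 60, 25))             # angles sampled by a walkaway VSP
vp_obs = vp0_true*(1 + delta_true*np.sin(theta)**2*np.cos(theta)**2
                     + eps_true*np.sin(theta)**4) + 15.0*np.random.randn(theta.size)

# Linear least-squares fit for [Vp0, Vp0*delta, Vp0*eps]
G = np.column_stack([np.ones_like(theta),
                     np.sin(theta)**2*np.cos(theta)**2,
                     np.sin(theta)**4])
m, *_ = np.linalg.lstsq(G, vp_obs, rcond=None)
vp0, delta, eps = m[0], m[1]/m[0], m[2]/m[0]
print("Vp0 = %.0f m/s, delta = %.3f, epsilon = %.3f" % (vp0, delta, eps))
```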
We will then compare these estimates to independent well-log and core data. The aim is to understand the factors controlling anisotropy in the field, which should provide important guidance for sweet spot identification from surface seismic data. The student will gain experience in seismic data processing, seismic anisotropy and the issues surrounding geophysical characterization of shale reservoirs. The project would suit a student who has a particular interest in expanding their computing skills.

Estimating CO2 saturation in the Sleipner field through seismic attenuation measurements
M. Chapman ([email protected]) and G. Papageorgiou ([email protected])

Capturing and injecting CO2 in depleted oil/gas reservoirs has been recognised as a potentially significant contribution to reducing carbon emissions. Carbon capture and storage (CCS) projects have been running for some time, but they have yet to be implemented at larger scales. Before they are, one needs to establish the safety and feasibility of such projects. To that end, seismic imaging has been used extensively to monitor the mobility of CO2 in injection fields. One of the most well-studied fields is the Sleipner field in the North Sea, and this project will utilise time-lapse data from this field to deduce correlations between seismic attenuation and CO2 saturation. In rock physics theory, such a correlation should be observed, and indeed some of the preliminary results from the Sleipner field are encouraging (Figure 1). For this project you will be using an appropriate spectral method implemented in MATLAB (or a language of your choice) to estimate seismic attenuation. You will apply this method to estimate Q from pre-stack data for two different year vintages obtained in the Sleipner field and, finally, you will use appropriate rock physics models to interpret your results.

Figure 1: AVO intercept maps (crossline versus inline) for the 1994 and 2010 vintages, showing the two superbins for which Q estimation has been done. The picture on the left shows AVO attributes before CO2 injection, the one on the right after. The estimated Q differs greatly between the vintages for the superbin within the CO2-saturated area, whereas the Q estimate for superbin 1 remains unchanged.

Does the jet stream position strongly influence our weather?
Supervisors: Ruth Doherty ([email protected]), Simon Tett
20 credits, semester 1 or 2

The jet stream is a narrow, variable band of very strong, predominantly westerly air currents encircling the globe several miles above the Earth. The location of the jet stream determines whether storms track over the UK or to the north or south, and hence it exerts a major influence on the weather we experience. Typically the jet stream moves northward in summer with the annual cycle of heating1. However, the summer of 2010 was characterised by the "frozen jet stream"2, which was in turn related to extreme weather events worldwide, most notably flooding in Pakistan and a heatwave in Russia. More recently, the position of the jet stream over the UK has again received attention, as its northerly location led to heatwave conditions across the UK3. This project will use meteorological reanalysis data4 to define the jet stream and determine its position and its variability over recent summers, and subsequently to examine its impact on UK and European weather.
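One common, simple way of defining the jet position from reanalysis winds is to take the zonal wind at an upper-tropospheric level, average it over a longitude sector, and record the latitude of its maximum for each day. The sketch below shows that idea in Python with xarray; the file name, variable and coordinate names, pressure level and longitude sector are placeholders and would need to be matched to the actual reanalysis download.

```python
import xarray as xr
import pandas as pd

# Hypothetical file of daily reanalysis zonal wind on pressure levels; the file and
# variable/coordinate names ('u', 'latitude', 'longitude', 'level') are assumptions
ds = xr.open_dataset("era_interim_u_daily.nc")

# Zonal wind at 250 hPa, averaged over a North Atlantic / European sector (60W-20E)
u250 = ds["u"].sel(level=250).sel(longitude=slice(-60, 20)).mean(dim="longitude")

# For each day, take the jet latitude as the latitude of maximum westerly wind
lats = u250["latitude"].values
jet_lat = lats[u250.values.argmax(axis=u250.dims.index("latitude"))]

# Summer (JJA) mean jet latitude for each year
jet = pd.Series(jet_lat, index=u250["time"].to_index())
jja = jet[jet.index.month.isin([6, 7, 8])]
print(jja.groupby(jja.index.year).mean())
```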
This study will hence determine how strong the relationship is between the variation in jet stream position and temperature and rainfall across the UK and Europe. This project involves data analyses and plotting maps, hence some knowledge of programming in IDL, Matlab or Python is desirable.

Figure 1: Description of the "frozen jet stream" in July 2010 and its effect on global weather2.

Background reading:
1. http://www.metoffice.gov.uk/learning/wind/what-is-the-jet-stream
2. http://www.newscientist.com/article/mg20727730.101-frozen-jet-stream-links-pakistanfloods-russian-fires.html
3. http://www.bbc.co.uk/weather/features/23382924
4. http://www.ecmwf.int/research/era/do/get/era-interim

Comparing airmass origins across UK cities
Supervisors: Ruth Doherty ([email protected]), Hugh Pumphrey
20 credits, semester 1 or 2, extendable to 40 credits across 2 semesters, or an MPhys project

We have recently studied the effect of weather and air pollution on human health across UK cities1. We find that the health effects related to weather and air pollution differ across these cities, with the largest health impact occurring in London. We hypothesise that this is because London is a more "continental-like" city, in that it receives much of its weather and pollution from continental Europe compared to more northerly or westerly cities in the UK, but there are other possibilities, such as its larger urban heat island effect. This project will test this first hypothesis by performing trajectory modelling. Trajectory modelling is an established technique for studying the origin of pollution. The widely-used trajectory model FLEXTRA2 is freely available and relatively easy to use. The wind fields needed to drive the model are also freely available. The model can either trace air parcels forwards in time, in order to see where polluted air might have gone, or backwards in time, in order to see where an airmass observed to be polluted might have come from.

Figure 1: Trajectories of air arriving at Mace Head, Ireland.

Trajectory modelling will be used to characterise the origin of air (by location and altitude) arriving at different UK cities and some selected European locations over a 5-10 year period. We will be able to answer questions (depending on the student's interest) such as: Are the airmasses over London more similar to those over Paris than those over Manchester? Where does the weather/pollution that affects our east coast cities mainly come from? Does the UK receive much pollution from Europe and across the Atlantic? The project would require experience in a data-analysis language such as MATLAB, R, IDL, Python etc.

Background reading:
1. Pattenden, S., … Doherty, R. M., Heal, (2010) Ozone, heat and mortality in fifteen British conurbations, Occup Environ Med, 67, 699-707, doi: 10.1136/oem.2009.051714.
2. http://transport.nilu.no/flexpart
Carslaw, D. C.: On the changing seasonal cycles and trends of ozone at Mace Head, Ireland, Atmos. Chem. Phys., 5, 3441-3450, doi:10.5194/acp-5-3441-2005, 2005.

East windy and west endy: Extent and persistence of haar
Richard Essery ([email protected])
20 credits, semester 1 or 2

Haar, fog formed by cooling of a warm air mass as it passes over cooler sea water and then carried onshore by wind, is a common feature of the east coast of Scotland in spring and summer, sometimes giving cold and damp conditions in Edinburgh while Glasgow is enjoying warm sunshine.
The aim of this project is to use data from Met Office weather stations across Scotland to identify occurrences of low visibility confined to the eastern side of the country. Having identified a number of such events, synoptic charts can be examined to determine the conditions in which they occur, and satellite images can be examined to determine their extent. The persistence of haar events could be compared with simple methods for forecasting fog clearance; if there is a haar during the course of the project (most likely early in semester 1 or late in semester 2), you could compare your forecast with the official forecast from the Met Office. Apart from giving insight into atmospheric processes and measurements, this project will provide experience with the common geophysical requirement of handling large datasets. The data analysis would be possible in Excel, but IDL, Matlab, Python or R would provide more sophisticated utilities for reading, analysing and plotting the data. The Edinburgh branch of the Met Office will collaborate in the supervision of this project.

Background reading:
Chapter 5 in Ahrens, Meteorology Today.
Findlater, Roach and McHugh, 1989. The haar of north-east Scotland. Quarterly Journal of the Royal Meteorological Society, 115, 581-608. http://onlinelibrary.wiley.com/doi/10.1002/qj.49711548709/pdf
Lewis, Koračin and Redmond, 2004. Sea fog research in the United Kingdom and United States. Bulletin of the American Meteorological Society, 85, 395-408. http://journals.ametsoc.org/doi/pdf/10.1175/BAMS-85-3-395

Impact of the North Atlantic Oscillation on UK weather
Richard Essery ([email protected])
20 credits, semester 1 or 2

The North Atlantic Oscillation (NAO), which can be indexed by pressure differences between Iceland and the Azores, is the dominant mode of winter climate variability over the North Atlantic. The strength of the NAO varies from year to year, but it has a tendency to stay in one phase for a few years at a time. When the NAO is in its positive phase (stronger than usual pressure gradient), more storms cross the Atlantic on a more northerly track, giving warm and wet winters for the UK. In the negative phase, the UK is more vulnerable to cold air outbreaks from the Continent, giving cold and snowy winters; the winter of 2009-2010, as an example, had the most negative NAO index of the 190-year record maintained by the University of East Anglia Climatic Research Unit. The Met Office operates a network of surface observing stations across the UK which record many meteorological variables at daily or more frequent intervals. These data have been used to generate 5 km gridded datasets showing variations of important weather variables in time and space across the UK. The aim of this project is to analyse NAO and weather records to determine the strength and persistence of NAO impacts on UK weather for different parts of the country and different seasons. Some programming will be required to analyse and visualize the gridded datasets. Example code can be provided in IDL or Python, but this could easily be adapted to Matlab or R.

Background reading:
Jung, Vitart, Ferranti and Morcrette, 2011. Origin and predictability of the extreme negative NAO winter of 2009/10. Geophysical Research Letters, 38, L07701. http://onlinelibrary.wiley.com/doi/10.1029/2011GL046786/pdf
Perry and Hollis, 2005. The generation of monthly gridded datasets for a range of climatic variables over the UK. International Journal of Climatology, 25, 1041-1054.
http://onlinelibrary.wiley.com/doi/10.1002/joc.1161/pdf
http://www.cru.uea.ac.uk/~timo/datapages/naoi.htm

Structural and parameter uncertainty in snow modelling
Richard Essery ([email protected])
20 credits, semester 1 or 2

Models of land-surface energy balance are required to provide flux boundary conditions for atmospheric models in medium-range to seasonal weather forecasting and in climate modelling. Because of computational constraints and poor understanding of some processes, models generally use highly simplified representations of complex land surface processes. Simple models may actually be preferable to complex models if they rely on fewer uncertain parameters and more clearly demonstrate causal relationships between processes. Snow on the ground, due to its high albedo, low thermal conductivity and high latent heat of fusion, presents a particularly marked modification of the surface energy balance. This project will use a snow model in which representations of several processes (absorption of solar radiation, thermal conduction, compaction and percolation of melt water in snow, turbulent exchanges of heat between snow and the atmosphere) can be switched on or off and parameter values can be adjusted, to investigate how they influence snow melt and fluxes to the atmosphere. The model will be driven with, and evaluated against, data from one or more highly instrumented research sites. Fortran code and instructions for running the model will be supplied. Basic analyses of the model outputs could be performed in Excel, but use of IDL or Python would enable much more sophisticated and scalable analyses.

Background reading:
Larsen, Thomas, Eppinga and Coulthard, 2014. Exploratory modelling: Extracting causality from complexity. Eos, Transactions American Geophysical Union, 95, 285-292. http://onlinelibrary.wiley.com/doi/10.1002/2014EO320001/pdf
Essery, Morin, Lejeune and Ménard, 2013. A comparison of 1701 snow models using observations from an alpine site. Advances in Water Resources, 55, 131-148. http://www.sciencedirect.com/science/article/pii/S0309170812002011

Monitoring of CO2 and other pollutants on Edinburgh City streets
Jim Jack ([email protected])
1 or 2 semester project

There is a great deal of concern about rising levels of CO2 and the effect this may have on long-term climate change. While there are many space-based instruments probing the atmosphere and collecting data on a global scale, there are few experiments attempting to measure and map CO2 concentrations at street level. We have a number of portable CO2 sensors and are about to integrate these with GPS receivers. The CO2 sensor is based on a commercial sensor made by Gas Sensing Solutions (GSS), located in Cumbernauld near Glasgow. With the current battery, approximately 3 hours of data may be recorded. The data are subsequently downloaded and, after some processing, plotted on a map using Google Maps. During March 2014, the instrument was carried on a route within Edinburgh City and the data plotted. The concentrations measured were then correlated with the local environment, traffic volume and weather. The first step in the project is to collect the instrument and understand what is available and how calibration may be addressed. A trial data collection and analysis could be carried out on the King's Buildings campus by collecting data on a defined path [to be defined by the student] at different times of the day and perhaps in different weather conditions.
The student would have to decide this, and the spatial separation of the observation points. The data, in the form of ppm CO2, can then be plotted on a map of the KB campus. Features may emerge which are worth exploring in more detail. The data recordings give CO2 concentration, time and location. The student will then decide how to present the data in the most useful manner and whether correlations are to be investigated. Of more interest would be to carry out a survey along certain Edinburgh streets, including Princes Street, at different times of the day and in different weather conditions, and to plot a concentration profile superimposed on a map. If there is time after this, an additional set of measurements at right angles to Princes Street, say along Frederick Street from Queen Street to the Gardens, would be very interesting. Alternative routes across the city would also have value. The student will be expected to plan and execute the data gathering campaign. The project is somewhat open-ended, as one could collect data for the next ten years, so the student will have to limit the project by the amount of time to be invested. It is also difficult to predict the value of the results or their potential civic interest, but Edinburgh City Council are interested to know if they are complying with any relevant legislation covering both CO2 levels and pollution. The student may also have to interact with the general public when making measurements in the city, if engaged in conversation.

Monitoring of CO2 Using Integrated Path Absorption
Jim Jack ([email protected])
1 or 2 semester project

There is a great deal of concern about rising levels of CO2 and the effect this may have on long-term climate change. While there are many space-based instruments probing the atmosphere and collecting data on a global scale, there are few experiments attempting to measure and map CO2 concentrations at low level within the atmosphere. Within the School of GeoSciences, we are building a large laser-based instrument to measure trace gases in the lower troposphere. The concept uses the absorption of a specific laser wavelength by an absorption line of the trace gas of interest. We are currently developing a Mark 2 instrument which will measure CO2 along a sightline several kilometres long. As part of the development and testing, it is intended to measure CO2 concentration using the simpler technique of integrated path absorption. With the instrument installed in a room on Level 6 of the JCMB, and a receiver mounted a short distance from the instrument, likely on the roof of the Murray Library, the absorption of the laser beam over the separation will be measured. Using standard absorption theory, the average concentration of CO2 within the path will be calculated. An opportunity is offered to take part in the operation of the instrument, the collection of data and the calculation of trace gas concentration. This is an opportunity for a student to become involved in an on-going research project, and would be attractive to a student with interests in the engineering of instruments to measure our environment, working with an existing team. While the long-term project objective is to measure CO2 concentration along a path of several kilometres using Differential Absorption Lidar, involvement in the initial development and testing of functions such as integrated path absorption offers an introduction to the problems of accurately monitoring our environment remotely.
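The "standard absorption theory" referred to above is essentially the Beer-Lambert law: the received power falls off exponentially with the number of absorbing molecules along the path. The sketch below inverts a measured on-line/off-line transmission ratio for a path-averaged CO2 mixing ratio. It is a hedged Python illustration; the transmission value, path length and absorption cross-section are made-up numbers purely for demonstration (in practice the cross-section would come from a spectroscopic database for the exact laser wavelength, pressure and temperature).

```python
import numpy as np

# Made-up measurement: ratio of received power with the laser tuned onto the CO2
# absorption line to the power with it tuned just off the line (removes common losses)
transmission_ratio = 0.85          # dimensionless, illustrative value only
path_length = 400.0                # assumed instrument-to-receiver distance in metres

# Assumed differential absorption cross-section of the chosen CO2 line (m^2 per molecule)
sigma = 4.0e-26

# Beer-Lambert: T = exp(-sigma * N * L)  =>  N = -ln(T) / (sigma * L)
N_co2 = -np.log(transmission_ratio) / (sigma * path_length)   # molecules per m^3

# Convert to a volume mixing ratio using the ideal-gas number density of air
k_B, pressure, temperature = 1.380649e-23, 1.013e5, 288.0
n_air = pressure / (k_B * temperature)                          # molecules per m^3
print("path-averaged CO2 mixing ratio: %.0f ppm" % (1e6 * N_co2 / n_air))
```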
The student project activity would need to match the availability of the instrument and its successful operation, and could include simple modelling, data gathering, analysis and understanding of the basic operation of lidar.

Understanding the geoelectric tides of Hartland
Gemma Kelly [[email protected]], Ciarán Beggan (British Geological Survey) and Kathy Whaler (School of GeoSciences)

Tidal signals in magnetic and electric field measurements are generated by the current systems in the upper atmosphere, by the motion of the conductive ocean through the Earth's magnetic field and, to a lesser extent, by the solar-quiet magnetospheric field surrounding the Earth (see Love and Rigler (2014) and references therein). The BGS operates a set of highly sensitive magnetometers and electric field instruments at its observatory in Hartland, Devon. The site is located close to the Bristol Channel, which has an enormous tidal range (typically > 8 m). The ebb and flow of a large body of conductive (salty) sea water through the Earth's magnetic field generates an electric field and a secondary magnetic field. These fields are detectable at the Hartland observatory, being particularly obvious in the electric field measurements (see Figure 1). We wish to understand these signals in more detail and to produce a model, based upon standard tidal predictions or local tide gauge data, to remove the signal from the electric field data. We will use two years' worth of electric field measurements sampled every ten seconds at the Hartland observatory. We wish to construct a high-resolution, frequency-domain power spectrum across periods from 0.1 to 100 days using the Fast Fourier Transform. The aim of the project is then to identify the dominant periods, and to figure out how to model and remove them, so that we can examine the other signals of interest, such as those from geomagnetic storms.

Figure 1: Thirty days of electric field data from the instrument at Hartland, Devon. The periodic tidal signal is obvious and dominates the other signals.

The student will be required to process time-series data from a number of different data sets, to produce a small piece of inversion code and to plot the data in a suitable format. The project will use Matlab to process and plot the data.

References:
[1] Love, J. J., and E. J. Rigler (2014). The magnetic tides of Honolulu, Geophys. J. Int., 197 (3), 1335-1353, doi: 10.1093/gji/ggu090.

Coronal Mass Ejections – how can we tell if they're headed our way?
Gemma Kelly (British Geological Survey) [[email protected]] and Kathy Whaler (School of GeoSciences)

As well as providing the phenomenal light show we call the northern lights (or aurora borealis), geomagnetic storms can pose a real threat to technological infrastructure. During a geomagnetic storm, strong electric currents can flow in the atmosphere and in turn induce currents in the ground. These currents can then flow through the power grid, communications cables, railways and pipelines, and have the potential to cause damage and disruption. The largest geomagnetic storms are usually associated with coronal mass ejections (CMEs), in which large clouds of charged particles, and the solar magnetic field, are thrown from the surface of the Sun. CMEs take between 19 hours and 5 days to travel to Earth, whereupon they transfer large amounts of energy to the geomagnetic field environment.
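As an aside on the Hartland tides project above, the sketch below illustrates the kind of FFT power-spectrum step it describes: building a long periodogram from a ten-second record and picking out the dominant periods between 0.1 and 100 days. It is a minimal Python sketch on synthetic data; the record length, synthetic tidal lines and peak selection are assumptions, not the project's Matlab workflow.

```python
import numpy as np

dt = 10.0                                  # sampling interval in seconds (as at Hartland)
n_days = 60
t = np.arange(0, n_days*86400.0, dt)

# Synthetic electric-field record: semi-diurnal (12.42 h) and diurnal (24 h) tidal lines
# plus broadband noise, standing in for the real two-year Hartland series
e_field = (1.0*np.sin(2*np.pi*t/(12.42*3600))
           + 0.3*np.sin(2*np.pi*t/(24.0*3600))
           + 0.5*np.random.randn(t.size))

# FFT power spectrum; frequencies converted to periods in days for convenience
spec = np.abs(np.fft.rfft(e_field - e_field.mean()))**2
freq = np.fft.rfftfreq(t.size, d=dt)             # Hz
periods_days = 1.0 / (freq[1:] * 86400.0)        # skip the zero-frequency bin

# Report the strongest spectral peaks with periods between 0.1 and 100 days
mask = (periods_days > 0.1) & (periods_days < 100.0)
order = np.argsort(spec[1:][mask])[::-1][:5]
for p, s in zip(periods_days[mask][order], spec[1:][mask][order]):
    print("period %.3f days, relative power %.2g" % (p, s))
```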
Although we can observe CMEs occurring from satellite imagery, it is still very difficult to predict whether the CME will actually hit the Earth, and how geo-effective it will be (i.e. will it cause a geomagnetic storm?). Moon et al. (2005) have developed a simple parameter, called the direction parameter (DP), which can be directly determined from satellite coronagraph images (e.g. Figure 1). The DP is intended to be a better measure of whether a CME may trigger a geomagnetic storm than just using the location of the source region.

Figure 1: Coronagraph image from the LASCO instrument on board SOHO (NASA), showing a halo Coronal Mass Ejection.

We wish to calculate the direction parameters (DPs) for historical CMEs (1998-2010) which were identified as having the potential to hit the Earth. The aim of the project is then to investigate whether the DP is a useful predictor of CME geo-effectiveness by comparing it with the geomagnetic storm record, and whether it can be used in operational space weather forecasting. This includes assessing the ease, accuracy and reproducibility of calculating the DP, and whether it is a better predictor of geo-effectiveness than CME source location. The student will be required to calculate the DP for a range of CMEs, and to produce and test some code to perform a statistical analysis of the results. The project will use some pre-existing Python code to calculate the DPs, and Python or Matlab to analyse and plot the results. This is a one-semester project available in Semester 1 or 2. The results of this project will feed in to space weather forecast operations at the British Geological Survey.

References:
Moon, Y.-J., K.-S. Cho, M. Dryer, Y.-H. Kim, Su-chan Bong, Jongchul Chae, and Y. D. Park (2005). New geoeffective parameters of very fast halo coronal mass ejections, The Astrophysical Journal, 642, 414-419.

Water in the Earth's lower mantle
Area: Solid Earth geophysics and high-pressure petrology
Supervisor: Tetsuya Komabayashi ([email protected])
Semester 1 or 2, or across 2 semesters

Project summary
The subduction of oceanic lithosphere transports water into the mantle (Fig. 1). The concentration of water has significant effects on mantle properties such as melting temperature, rheology, and electrical conductivity. Understanding the mechanisms of water circulation in the mantle will therefore provide vital information about how water is related to the dynamics and evolution of the solid Earth. For water subduction, the stability relations of hydrous phases in subducting slabs are essential, because these phases carry water into the deep mantle. Such phase relations down to the mantle transition zone (410-660 km depth) have been extensively studied, while those in the lower mantle remain poorly understood. This project is aimed at elucidating the stability relations of hydrous minerals and fluids under lower mantle conditions. You will analyse published experimental papers and construct a phase diagram which is applicable to lower mantle conditions. The results of this project will contribute to our understanding of the behaviour of water transported into the deep Earth. Basic knowledge of petrology and chemical thermodynamics is beneficial.

References
Komabayashi, T. et al. (2004) Petrogenetic Grid in the System MgO-SiO2-H2O up to 30 GPa, 1600°C: Applications to Hydrous Peridotite Subducting Into the Earth's Deep Interior. Journal of Geophysical Research, 109, B03206, doi:10.1029/2003JB002651.
Nishi, M. et al. (2014) Stability of hydrous silicate at high pressures and water transport to the deep lower mantle. Nature Geoscience, doi: 10.1038/ngeo2074.

Observed variations in greenhouse gases and reactive trace gases in the background troposphere
Paul Palmer ([email protected])
Single semester project

The Global Atmosphere Watch (GAW) programme of the World Meteorological Organization is a partnership involving 80 countries, which provides reliable scientific data and information on the chemical composition of the atmosphere and its natural and anthropogenic change, and helps to improve the understanding of interactions between the atmosphere, the oceans and the biosphere. It collects data on greenhouse gases (CO2, CH4, CFCs, N2O, surface ozone, etc.) and related gases (CO, NOx, SO2, VOCs, etc.). The GAW measurement stations tend to be located in remote geographical regions away from urban environments. The student will develop a set of robust statistical metrics that describe observed variations in gases at the GAW stations, and provide a scientific interpretation of the results by understanding the sources, sinks, and resulting atmospheric lifetimes of the gases. The student will be responsible for selecting a collection of, possibly interrelated, gases from the available measurement suite. Ideal candidates will have knowledge of IDL or Python.

Title: The composition of the mesosphere using ground-based mm-wave remote sensing
Area: Atmospheric physics
Contact: Hugh Pumphrey (Room 313, Crew Building, extn. 50 6026, email: [email protected])

The mesosphere, lying at altitudes between 50 and 80 km, is one of the least-understood regions of the atmosphere. One way to study its composition is to use a millimetre-wave receiver (essentially a radio telescope) sited on the ground (preferably on a high mountain). The spectra from such an instrument can provide information on the mixing ratios of a variety of chemical species of interest. Recent improvements in technology are permitting easier access to higher frequencies. This, then, poses these questions: which species might one usefully measure with this technique? What characteristics (bandwidth, resolution, noise level) would a spectrometer require in order to make the measurement? How badly affected would the measurement be by a wet troposphere (and hence, how high a mountain would you need)? The basic technique of the project is to simulate a measurement using a readily-available radiative-transfer model (ARTS: see http://www.sat.ltu.se/arts) and apply the standard techniques of inverse theory[1] to the simulation to determine what information the measurements would contain. Several projects along these lines would be possible, to answer such questions as: Which of the various absorption lines of HCN is most suitable for sounding the mesosphere? Is it possible to use ground-based sensing of HCl to track the chlorine loading of the middle atmosphere?

Figure: Two years of measurements of the 230 GHz carbon monoxide spectral line taken from the Norwegian Antarctic base using the British Antarctic Survey's microwave radiometer.

The project was run in 2011-12 to study CO. If run again, it would target different species. These projects would probably be 20-point projects available in either semester. They would be suitable for students on any physics or geophysics-based degree programme. The ARTS output would be analysed using a data analysis language such as R, MATLAB, Octave or python/matplotlib.
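A central quantity in the "standard techniques of inverse theory" mentioned above is the averaging kernel matrix, which measures how much independent information a simulated measurement contains about the profile. The sketch below computes it for a toy linear problem following Rodgers' optimal-estimation formulation; the weighting functions and covariances are invented for illustration and bear no relation to a real ARTS simulation.

```python
import numpy as np

n_levels, n_channels = 30, 20
z = np.linspace(50, 80, n_levels)                   # mesospheric altitude grid (km)

# Invented weighting functions: each spectrometer channel is a Gaussian in altitude
centres = np.linspace(52, 78, n_channels)
K = np.exp(-0.5*((z[None, :] - centres[:, None]) / 6.0)**2)    # (channels x levels)

# Assumed a priori and measurement-noise covariances (diagonal, illustrative values)
S_a = (0.5**2) * np.eye(n_levels)       # 50% a priori uncertainty in mixing ratio
S_e = (0.05**2) * np.eye(n_channels)    # 5% radiometric noise

# Rodgers gain matrix G and averaging kernel A = G K
G = np.linalg.solve(K.T @ np.linalg.inv(S_e) @ K + np.linalg.inv(S_a),
                    K.T @ np.linalg.inv(S_e))
A = G @ K

# Trace of A ~ number of independent pieces of profile information in the measurement
print("degrees of freedom for signal: %.1f (out of %d levels)" % (np.trace(A), n_levels))
```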
[1] Inverse Methods for Atmospheric Sounding: Theory and Practice, by Clive D. Rodgers (World Scientific, ISBN 981-022740-X).

Title: Using a trajectory model to track SO2 from volcanoes
Area: Atmospheric physics
Contact: Hugh Pumphrey (Room 313, Crew Building, extn. 50 6026, email: [email protected])

Trajectory modelling is an established technique for studying pollution from isolated point sources such as volcanoes, chemical plants, disasters at nuclear power stations, and so forth. The widely-used trajectory model FLEXTRA is freely available (http://transport.nilu.no/flexpart) and relatively easy to use. The wind fields needed to drive the model are also freely available. The model can either trace air parcels forwards in time, in order to see where polluted air might have gone, or backwards in time, in order to see where an airmass observed to be polluted might have come from. The MLS instrument on NASA's Aura satellite has been observing sulphur dioxide (SO2) in the stratosphere since 2004. Although there have been no large volcanic eruptions in that period, there have been a number of moderate-sized ones which have injected measurable amounts of SO2 into the stratosphere. The basic idea of the project would be to identify these events and then to run back trajectories from the observations of high SO2 to locate the volcano from which the SO2 came.

Figure: The map on the left shows an example of trajectories. These are run backwards from locations where MLS observed unusual amounts of CO; the trajectories go past the site of the Black Saturday bush fires of February 2009 [1]. The map on the right shows MLS observations of SO2 a few days after the 2009 eruption of the volcano Sarychev in the Kuril Islands. This eruption was analysed by a student in 2011-12. The 2008 eruption of Kasatochi has also been done, but there are a number of other eruptions in the record available for study [2].

An extension of the project would be to use the FLEXPART particle dispersion model to attempt to make more detailed simulations of the plume. The basic project could be a 20-point honours project for either the Physics or Geophysics degree programme groupings. If suitably extended, it would be suitable for a 40-point geophysics project or an M.Phys/M.EarthPhys project. It should be added that, because trajectory modelling has wide applicability, it could provide a basis for students to design their own projects. The project would require competence in a data-analysis language such as MATLAB, R, Octave, IDL, python/matplotlib etc., and also good general computing competence.

[1] Microwave Limb Sounder observations of biomass-burning products from the Australian bush fires of February 2009, H. C. Pumphrey, M. L. Santee, N. J. Livesey, M. J. Schwartz, and W. G. Read, Atmos. Chem. Phys., 11, 6285-6296, 2011.
[2] Observations of volcanic SO2 from MLS on Aura, H. C. Pumphrey, W. G. Read, N. J. Livesey, and K. Yang, Atmos. Meas. Tech. Discuss., 7, 7883-7922, 2014.

Title: Gravity surveying projects
Area: Solid-Earth Geophysics: Gravity
Contact: Hugh Pumphrey (Room 313, Crew Building, extn. 50 6026, email: [email protected])

Two of the School of GeoSciences' classic LaCoste & Romberg gravimeters have recently been serviced and are available for use in final year projects. The whole of the UK has been surveyed at a resolution of about 1 km, but there are many possible targets which are smaller than this.
A typical project would consist of a number of days of fieldwork to collect gravity data, followed by mapping of the data and modelling of the possible underground density contrasts which might give rise to it. A GIS package (such as the freely available Quantum GIS from http://qgis.org/) is useful for mapping the data. Possible targets include (but are not restricted to) the following:

(1) Density of volcanic intrusions by Nettleton's method. Edinburgh and East Lothian contain a number of isolated volcanic hills, of which Arthur's Seat, North Berwick Law and Traprain Law are the most obvious. The density of the basalt which makes them up could be estimated by making a profile of gravity measurements across the hill and then finding the density value which provides the best correction for the effect of altitude on gravity. Using a surveyor's level for the altitude surveying is difficult in these cases; an alternative method (such as a total station, of which we have one available) would need to be used.

(2) The gravity signal of a railway tunnel. A tunnel causes a mathematically simple gravity anomaly. Edinburgh has several examples, including the Innocent Railway tunnel under Pollock Halls / Holyrood Park. (Students did this in 2012-13 and 2013-14, but I think that better results could be obtained using more closely-spaced points. It may also be possible to obtain a better value for the density of the surrounding rock by taking a measurement inside the tunnel.)

(3) The ancient volcanoes of Dunbar. The geology under the town of Dunbar is a mixture of sandstones (grey, beige) and volcanic rocks, which are in turn a mixture of (presumably) dense basalts (green) and (presumably) less dense tuffs (orange). The town is not too hilly and has several long, straight east-west streets, which would expedite a gravity survey. (This project has the potential to be expanded to a 40-credit project or an M.EarthPhys project.)

As a gravity survey requires a levelling survey, which is a 2-person job, a single student would require assistance in the field from the supervisor. It would also be possible for two students to work together in the field to carry out two different surveys, and then for each student to write up one of the surveys.

Title: Time series analysis of gravity data
Area: Solid-Earth Geophysics: Gravity
Contact: Hugh Pumphrey (Room 313, Crew Building, extn. 50 6026, email: [email protected])

One of the School of GeoSciences' gravity meters has been left running and recording gravity for the last 9 months; the data have been archived at a rate of one point every 5 seconds. The data for the last week can (usually) be seen at http://www.geos.ed.ac.uk/~hcp/gravity/. This record contains a number of features including the daily gravity tide, signals from earthquakes, and longer noisy periods which might be associated with storms and ocean waves. The figure shows an example of the data, taken from July 2014; note the earthquake on 7 July and the period of non-earthquake noise during 3-4 July. The project would consist of a detailed analysis of this time series with the intention of answering some of these questions (see the sketch after this list for one possible starting point):
1. How strong (and how close) does an earthquake have to be in order to prevent gravity surveying from being carried out?
2. Are the bursts of non-earthquake noise correlated with periods of high wind? If not, what does cause them?
3. Are there any other features in the data beyond those I have identified?
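As one possible starting point for question 2, the sketch below removes the slow tidal variation from the 5-second gravity record with a rolling median, computes a running "noisiness" index from the residual, and correlates it against a wind-speed record. It is a hedged Python/pandas sketch; the file names, column layout and window lengths are assumptions about data the student would have to inspect first.

```python
import pandas as pd

# Hypothetical archive of the gravimeter record: a time column and a gravity column,
# sampled every 5 seconds (the real file layout should be checked before use)
g = pd.read_csv("gravity_record.csv", parse_dates=["time"], index_col="time")["gravity"]

# Remove the slowly varying part (dominated by the daily tide) with a 1-hour rolling
# median, leaving the short-period signal from earthquakes and microseisms
residual = g - g.rolling("1h").median()

# "Noisiness" index: standard deviation of the residual in 3-hour windows
noise = residual.resample("3h").std()

# Compare with a (hypothetical) wind-speed record from a nearby weather station
wind = pd.read_csv("wind_speed.csv", parse_dates=["time"], index_col="time")["speed"]
wind_3h = wind.resample("3h").mean()

both = pd.concat([noise.rename("gravity_noise"), wind_3h.rename("wind")], axis=1).dropna()
print("correlation between gravity noise and wind speed: %.2f"
      % both["gravity_noise"].corr(both["wind"]))
```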
This project would require competence in a programming language designed for data analysis, such as MATLAB, R, python/matplotlib, IDL etc.

Analysing variations in solar radiation measured in Edinburgh
Supervisors: David Stevenson ([email protected]) & Hugh Pumphrey ([email protected])
Single semester project, potentially extendable to two semesters

The University of Edinburgh's weather station, located on the top of the James Clerk Maxwell Building, has been recording several meteorological variables near-continuously at 1-minute resolution since May 2006. One of these variables is the direct flux of solar radiation (Figure 1). This flux follows an annual and diurnal variation, due to the Earth's rotation and orbit around the Sun, that is well understood, albeit with numerous subtleties2. However, deviations from this predictable sinusoidally-varying behaviour occur due to atmospheric scattering, mainly from clouds (and also, to a lesser extent, from aerosols). Scattering can either decrease or increase the measured direct solar flux compared to the theoretical 'clear-sky' value.

Figure 1: Solar flux data from the JCMB weather station (August 6-13, 2013)1.

The data can be analysed in several ways. The predicted clear-sky annual and diurnal variations can be calculated and compared to the actual measurements. Daily measurements nearly always show some cloud cover (this is Edinburgh, after all!), but there are occasional completely clear days in the record, and many almost clear days. These allow a direct comparison to the theoretical clear-sky predictions. The scattering behaviour on cloudy days is also interesting, and can be expected to vary with zenith angle (i.e. the height of the Sun in the sky), and may also show some coherent diurnal variation and/or relationships with other measured variables. For example, we would expect to find more cloud in afternoons, as convective cloud should peak when surface temperatures reach a maximum. The student would be expected to test various hypotheses using the data. This is a data analysis project, and will require high-level programming (e.g., Matlab, IDL, R or similar).

References
1 http://www.geos.ed.ac.uk/abs/Weathercam/station/latestweek.html [Accessed 13th August 2013]
2 http://www.esrl.noaa.gov/gmd/grad/solcalc/calcdetails.html [Accessed 13th August 2013]

Has Scottish Extreme Precipitation increased?
20 credit project with possible extension to 40 credits
Prof. Simon Tett ([email protected]) & Prof. Gabi Hegerl

One of the expected impacts of climate change is an increase in extreme precipitation. Changes in extreme precipitation can have large societal impacts, largely through flooding and the consequent impact on property and infrastructure. The aim of this project is to investigate existing records of precipitation extreme indices for Scotland to see if they show change that is consistent with expectations of an increase in extreme precipitation. However, as the climate system is chaotic, there will also be substantial "noise" which might mask any change in extremes. One possible source of "noise" is changes in the North Atlantic Oscillation (Osborn, 2004).

Figure 1: Image of Balcarres Street flooding in October 2011, from http://www.flickr.com/photos/chdot/6254520074/in/set-72157627921523918

Recently the HadEX2 global gridded dataset of extreme indices has been developed (Donat et al., 2013). This provides a gridded standard set of temperature and precipitation indices, and the data are available from http://www.climdex.org.
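To give a flavour of the sort of extraction and analysis involved (the tasks are set out in detail below), the sketch below pulls a single extreme-precipitation index out of a gridded netCDF file, averages it over a rough Scottish bounding box, and computes a post-1950 trend and a correlation with a winter NAO series. It is a hedged Python sketch; the file names, variable names and coordinate conventions are placeholders to be checked against the real HadEX2 and CRU downloads.

```python
import numpy as np
import xarray as xr
import pandas as pd
from scipy import stats

# Hypothetical local copies of an HadEX2 index and the CRU NAO index; the file and
# variable names are placeholders, not the actual download structure
rx1day = xr.open_dataset("HadEX2_Rx1day.nc")["Rx1day"]
nao = pd.read_csv("nao_index_winter.csv", index_col="year")["nao"]   # DJF means

# Average the index over grid boxes covering Scotland (rough bounding box)
scotland = rx1day.sel(lat=slice(54.5, 59.5), lon=slice(-8.0, 0.0)).mean(dim=("lat", "lon"))
series = scotland.sel(time=slice("1951", "2010")).to_series()
series.index = series.index.year

# 1) Linear trend since 1950
slope, intercept, r, p, se = stats.linregress(series.index, series.values)
print("trend: %.2f mm/decade (p = %.3f)" % (10*slope, p))

# 2) Correlation with the winter NAO index over the common years
common = series.index.intersection(nao.index)
print("correlation with NAO: %.2f" % np.corrcoef(series[common], nao[common])[0, 1])
```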
The student would extract data from this dataset for several different precipitation extreme indices and investigate: 1) Is there any change in extreme precipitation since 1950? 2) Is this change, particularly any winter change, related to changes in the North Atlantic Oscillation?

Extensions: The project could be extended to 40 credits in a variety of different ways: 1) by taking digitised precipitation data from 1900 to 1950, quality controlling it, computing the precipitation extreme indices and carrying out an analysis from 1900 to the present; 2) by comparing the results from 1950-2004 with several different model simulations to see if climate models are correctly simulating the observed changes in Scotland; 3) by carrying out some more detailed analysis for the UK using the HadEX2 gridded dataset.

References
Donat, M. G., et al. (2013), Updated analyses of temperature and precipitation extreme indices since the beginning of the twentieth century: The HadEX2 dataset, J. Geophys. Res. Atmos., 118, 2098-2118, doi:10.1002/jgrd.50150.
Osborn, T. J. (2004) Simulating the winter North Atlantic Oscillation: the roles of internal variability and greenhouse gas forcing. Clim. Dyn., 22, 605-623.
Extremes data are available from http://www.climdex.org, while North Atlantic Oscillation data are available from http://www.cru.uea.ac.uk/~timo/datapages/naoi.htm.

Climate Physics: Maximum Entropy Production in the Earth System
Simon Tett ([email protected]) & Ian Main

Zonal-mean energy balance models have been used to study such things as ice-albedo feedbacks, the climate of the early Earth and how the climate might respond to changes in greenhouse gases. Such models normally have simple parameterisations of heat transport. More complex General Circulation Models, which explicitly solve the equations of motion, are used to predict the weather a few days ahead and how climate might change over the next few centuries. These models are computationally expensive and so cannot investigate timescales of millennia or more. In the 1970s, Paltridge (1975, 1978) published a set of papers in which he built a simple model of the Earth's climate. He parameterised the horizontal transports of heat using the assumption that the transports would act to maximise entropy production (MEP) by the Earth system, and found good agreement with observations. Recently, Dewar (2003, 2005) claimed to have proved that MEP arises from statistical-mechanical arguments. There is controversy over whether or not MEP applies to the Earth system (Goody, 2007; Ozawa et al., 2003; Paltridge, 2007; Kleidon, 2009). Recent work (Niven, 2009) suggests that MEP acts locally for systems in steady state. The aim of the project is to build a simple model of the ocean/atmosphere/ice system using MEP, balancing outgoing radiation against incoming radiation from the Sun. The model will then be applied to a variety of cases, including increasing greenhouse gases and volcanic eruptions. The student should have some programming experience, an understanding of entropy/thermodynamics and an interest in climate.

References
Dewar, R. C., 2005: Maximum entropy production and the fluctuation theorem. Journal of Physics A, doi: 10.1088/0305-4470/38/21/L01.
Dewar, R. C., 2003: Information theory explanation of the fluctuation theorem, maximum entropy production and self-organized criticality in non-equilibrium stationary states. Journal of Physics A, 36, 631-641.
Goody, R., 2007: Maximum Entropy Production in Climate Theory. Journal of the Atmospheric Sciences, 64, 2735-2739.
Kleidon, A., 2009: Nonequilibrium thermodynamics and maximum entropy production in the Earth system. Naturwissenschaften, 96, 653-677, doi: 10.1007/s00114-009-0509-x.
Niven, R., 2009: Steady state of a dissipative flow-controlled system and the maximum entropy production principle. Phys. Rev. E, doi: 10.1103/PhysRevE.80.021113.
Ozawa, H., et al., 2003: The second law of thermodynamics and the global climate system: A review of the maximum entropy production principle. Reviews of Geophysics, doi: 10.1029/2002RG000113.
Paltridge, G. W., Farquhar, G. D., and Cuntz, M., 2007: Maximum entropy production, cloud feedback, and climate change. Geophysical Research Letters, 34, L14708.
Paltridge, G. W., 1978: Steady-state format of global climate. Quarterly Journal of the Royal Meteorological Society, 104, 927-945.
Paltridge, G. W., 1975: Global dynamics and climate - system of minimum entropy exchange. Quarterly Journal of the Royal Meteorological Society, 101, 475-484.
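To make the MEP closure used in the project above concrete, the sketch below applies it to the simplest possible case: a two-box (low-latitude/high-latitude) energy balance model with linearised outgoing longwave radiation, in which the poleward heat transport is chosen to maximise the entropy production associated with that transport. This is a minimal, hedged Python illustration of the principle, not the model the student would build; the absorbed-solar values and Budyko-type OLR constants are conventional textbook numbers.

```python
import numpy as np

# Equal-area two-box model: box 1 = low latitudes, box 2 = high latitudes
S = np.array([300.0, 170.0])      # absorbed solar radiation in each box (W m-2), assumed
A, B = 203.3, 2.09                # linearised OLR = A + B*T(degC), Budyko-type constants

def temperatures(F):
    """Box temperatures (K) for a given poleward heat transport F (W m-2)."""
    T1 = (S[0] - A - F) / B       # energy balance of box 1: S1 = A + B*T1 + F
    T2 = (S[1] - A + F) / B       # energy balance of box 2: S2 = A + B*T2 - F
    return T1 + 273.15, T2 + 273.15

# Entropy production associated with the transport: sigma = F * (1/T2 - 1/T1)
F_values = np.linspace(0.0, 60.0, 601)
sigma = np.array([F * (1/temperatures(F)[1] - 1/temperatures(F)[0]) for F in F_values])

F_mep = F_values[np.argmax(sigma)]
T1, T2 = temperatures(F_mep)
print("MEP transport: %.1f W m-2" % F_mep)
print("box temperatures: %.1f K (low lat), %.1f K (high lat)" % (T1, T2))
```

In this toy setting, the equator-to-pole temperature contrast at the entropy-production maximum falls between the extremes of no transport and complete temperature equalisation, which is the kind of behaviour Paltridge exploited in his original papers.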