Brimicombe, A.J. (2009) GIS, Environmental Modelling and Engineering (2nd Edition). CRC Press, Boca Raton, FL, USA.

Li, Y. (2007) "Control of spatial discretisation in coastal oil spill modelling" International Journal of Applied Earth Observation and Geoinformation, 9: 392-402

Spatial discretisation plays an important role in many numerical environmental models. This paper studies the control of spatial discretisation in coastal oil spill modelling with a view to assuring the quality of modelling outputs for given spatial data inputs. Spatial data analysis techniques are effective for investigating and improving the spatial discretisation in different phases of the modelling. Proposed methods are implemented and tested with experimental models. A new "automatic search" method based on GIS zone design principles is shown to significantly improve the discretisation of bathymetric data and hydrodynamic modelling outputs. The concepts and methods developed in the study are expected to have general relevance for a range of applications in numerical environmental modelling.
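The paper's "automatic search" method is not reproduced here, but the general idea of searching for an acceptable spatial discretisation can be sketched as follows. The synthetic bathymetry surface, the block-averaging scheme and the variance tolerance are all illustrative assumptions, not the paper's actual zone-design algorithm.

```python
import numpy as np

# Illustrative smooth synthetic bathymetry surface (depths in metres);
# a stand-in for real survey data.
n = 64
x = np.linspace(0.0, 1.0, n)
X, Y = np.meshgrid(x, x)
fine = 20.0 + 15.0 * np.sin(2 * np.pi * X) * np.cos(2 * np.pi * Y)

def aggregate(grid, f):
    """Block-average a square grid by an integer factor f."""
    m = grid.shape[0]
    return grid.reshape(m // f, f, m // f, f).mean(axis=(1, 3))

def worst_block_variance(grid, f):
    """Largest within-block variance lost by aggregating by factor f."""
    m = grid.shape[0]
    blocks = grid.reshape(m // f, f, m // f, f)
    return blocks.var(axis=(1, 3)).max()

# A much-simplified "automatic search": accept the coarsest
# discretisation whose worst within-cell variance stays below
# an assumed tolerance.
tolerance = 25.0  # m^2, illustrative
chosen = 1
for f in (2, 4, 8, 16):
    if worst_block_variance(fine, f) <= tolerance:
        chosen = f
    else:
        break

coarse = aggregate(fine, chosen)
print("aggregation factor:", chosen, "grid:", coarse.shape)
```

The search here trades grid coarseness against information loss in a single pass; the paper's zone-design approach operates on the same trade-off but with a more sophisticated search.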

Li, Y. (2006) "Spatial data quality analysis with Agent technologies" Proceedings GISRUK2006, Nottingham: 250-254

Agent technologies have attracted increasing attention from GIS research and applications. Intelligent agents have shown considerable potential for spatial data analysis. This paper proposes an agent-based solution for spatial data quality analysis. A collection of collaborating agents is constructed in a multi-agent framework. These remote intelligent agents can be shared as spatial data quality analysis tools over a network. The proposed solution therefore offers a decentralised, distributed and service-oriented system. It is expected to aid users in designing their own data quality test procedures for environmental simulation models.

Li, Y. (2005) "Agent technologies for spatial data quality analysis in environmental modelling" Proceedings 8th International Conference on GeoComputation, Ann Arbor, University of Michigan (CD)

The study shows the potential of an agent-based Geo-data Quality Analysis engine. With this interoperable engine, spatial data quality analysis can be carried out within environmental modelling. Ultimately, through such agents, the Geo-data Quality Analysis engine could be distributed over the Internet for use by the scientific and professional community.

Brimicombe, A.J. (2003) GIS, Environmental Modelling and Engineering. Taylor & Francis, London. 

The significance of modelling in managing the environment is well recognised from scientific and engineering perspectives as well as in the political arena. Environmental concerns and issues of sustainability have permeated both public and private sectors, particularly the need to predict, assess and mitigate adverse impacts that arise from continuing development and use of resources. Environmental modelling is an inherently spatial activity well suited to taking advantage of Geographical Information Systems (GIS) functionality, whether this be modelling aspects of the environment within GIS or linked to external simulation models. In doing so, a number of issues become important: environmental models need to consider space-time processes, often in three dimensions, whereas GIS are largely two-dimensional and static; environmental models are often computational simulations, as distinct from cartographic modelling and map algebra in GIS; what should be the degree of integration between GIS and environmental models on any project; how does uncertainty in spatial data from GIS and the choices of parameterisation in environmental models combine and propagate through analyses to affect outputs; by what means can decisions be made with uncertain information. These issues inevitably arise in every scientific and engineering project to model and manage our environment. Students need to be made aware of these issues. Practitioners should enrich their knowledge and skills in these areas. This book focuses on the modelling, rather than on data collection or visualisation - other books adequately cover these areas - and aims to develop critical users of both GIS and environmental models.

Li, Y. and Brimicombe, A.J. (2003) "A Spatial Data Quality Analysis Engine for Coastal Oil-Spill Modelling" in Oil Spill, Oil Pollution and Remediation, Istanbul: 43-54

Oil-spill models play an important role in prevention, preparation, response and impact analysis for coastal oil-spill pollution. The reliability of an oil-spill model depends on the accuracy of spatial data entry, integration and computation, and there is a wide range of data quality problems affecting both the data and model operations. This paper presents research on a newly-developed Geo-data Quality Analysis (GQA) engine, constructed as a tightly-coupled collection of GI tools in which off-the-shelf tools work together in an interoperable framework for a specified oil-spill model. The engine overcomes both the lack of spatial data quality functionality in GIS software and the limitations of a fully-integrated package. A conceptual prototype of the GQA engine has been developed, which includes some basic analysis tools and also provides help in designing test procedures for a specific model. The prototype has been applied to coastal oil-spill modelling and, given its flexible structure and compatible tools, should be generally applicable.

Li, Y. and Brimicombe, A.J. (2002) "Assessing the quality implications of accessing spatial data: the case for GIS and environmental modelling" Proceedings GISRUK 2002, Sheffield: 68-71

For the spatial sciences, the 1990s were a period of transition from data-poverty to data-richness. Digital spatial data sets have grown rapidly in scope, coverage and volume (Miller & Han, 2001). This state change has been facilitated by: improved technology and wider use of GPS, remote sensing and digital photogrammetry for data collection; the introduction of new technologies such as LiDAR and radar interferometry; the operation of Moore's Law, resulting in increased computing power to process raw data coupled with the falling cost of data storage; the advent of data warehousing technologies; and increasingly efficient ways of accessing and delivering data on-line. The technical advances in hardware, software and data have been so profound that their effect on the range of problems studied and the methodologies used has been fundamental. At the same time, however, problems have arisen in identifying and locating relevant data and in evaluating choices of resolution, coverage, provenance, cost and conformance with the models to be used. Furthermore, for environmental models it may not be so much the characteristics of the raw data that are most critical but their characteristics once converted, aggregated and implemented in the model. Given that a modelling task may access data from multiple sources, there is the added difficulty of assessing combined performance in relation to the implementation of the simulation such that outputs have fitness-for-use. Clearly, front-end tools are required to resolve these data issues both prior to purchase and in specifying new data collection. A conceptual prototype Geo-Quality Analysis (GQA) engine is developed, which includes off-the-shelf quality analysis tools and an experiment design program. Data-richness has led to choice and the need to evaluate that choice from the outset in terms of its implications for fitness-for-use in environmental modelling. The application of the GQA prototype in coastal oil spill modelling has shown the feasibility of doing this.

Li, Y.; Brimicombe, A.J. and Ralphs, M. (2000) "Spatial data quality and sensitivity analysis in GIS and environmental modelling: the case of coastal oil spills" Computers, Environment & Urban Systems 24: 95-108

Integration with environmental modelling, concern for spatial data quality issues and the rise of the geocomputation paradigm have been three important areas of GIS research. In this paper they are brought together in the context of coastal oil spill modelling. Spatial data quality analyses of data and model elements for output sensitivity and error propagation highlight the need to revise the coupling strategies of GIS and environmental modelling to include a geo-data quality analysis (GQA) engine. The implementation of comprehensive geospatial data quality analysis procedures within existing GIS appears unlikely. Nevertheless, as the complexity of data sets and the modelling capability of computer systems increase, the need to address the quality of both data and models becomes increasingly important. With the growing availability of proprietary and public domain software suitable for spatial data quality analysis, GQA engines will evolve from the assembly of these tools external to GIS. GIS, environmental models and GQA will form a tightly-coupled modelling network in which, because of the importance of quality issues and the need for systematic testing, the dominant interaction will be between the GQA engine and the environmental model.

Brimicombe, A.J. (2000) "Encoding expert opinion in geo-information systems: a fuzzy set solution" Transactions in International Land Management 1: 105-121 

Since much of the relevant information will be derived from digital spatial data, geo-information systems (GIS) and related IT will be key tools. An area where these tools currently perform poorly is in the encoding, storage and analytical handling of expert opinion or linguistic statements regarding the data content of GIS. This paper proposes a solution - fuzzy expectation (Ẽ). Fuzzy expectation is derived from a small number of stylised fuzzy sets which, using an intuitive probability interface, are building blocks for 'translating' expert opinion into a compact fuzzy set representation. Once encoded, expert opinion about the data is embedded in the data structure and can be combined and propagated through GIS analyses such as overlay. A worked example in a land management context is provided as a means of illustrating the implementation of Ẽ.

Li, Y.; Ralphs, M. and Brimicombe, A.J. (2000) "Error propagation for complex environmental modelling - the case of spatial data quality in coastal oil spill modelling" Proceedings Accuracy 2000: 409-416

Error propagation analysis can now offer a large amount of information about both spatial data inputs and modelling processes. This paper reports on the results of an error propagation study that takes coastal oil spill modelling as its operational context. In coastal oil spill modelling, any data error or uncertainty will be propagated through the sub-models because of the model configuration. There is also considerable scope for operationally-induced error or uncertainty. The analyses are carried out for spatial data inputs to the hydrodynamic model and initially focus on hydrodynamic modelling with knock-on error effects for trajectory and fate modelling. The paper also considers the use of error propagation techniques in assisting the user with the issue of algorithm choice in the oil spill modelling process. This paper gives the results of the experiments in full and provides an improved overall methodology for the study of error propagation for spatial data in environmental modelling.

Li, Y.; Brimicombe, A.J. and Ralphs, M. (1999) "Sensitivity to spatial data quality in numerical modelling coastal oil spill" in Proceedings GISRUK'99, Southampton, Vol. 7: 104-107

The simulation of coastal oil spills is an important form of environmental modelling because of the seriousness of both the physical and the socio-economic impacts of such spills. Spatial data quality has come to be recognised as a critical issue in coastal oil spill modelling and is beginning to attract the attention of developers and users alike. The initial focus of this research has been on the influence of spatial data quality on the hydrodynamic modelling (which is treated as a grey box), with a consideration of knock-on effects for trajectory and fates modelling. A geocomputational approach has therefore been adopted whereby data exploration software, a statistical package, geostatistics, fractal software and visualisation software are coupled using GIS as a hub for data storage and handling. The sensitivity to a range of data qualities for shoreline representation, seabed bathymetry, sampling strategies, tidal data and so on can thus be systematically researched. A synthetic modelling approach is consequently being employed for the next stage of the research and for the spatial data quality tests presented in this paper; in this way the data in the experiments can be carefully and systematically controlled. Error propagation analysis offers much more information than error measurement alone. An overall model is derived for the propagation of spatial data quality given that the hydrodynamic model is a "grey box". Error propagation techniques, such as the Monte Carlo method, are suitable for the complex, dynamic and distributed models used in hydrodynamic modelling. By studying the effects of spatial data quality on the numerical modelling of coastal oil spills, optimum strategies for managing and improving spatial data quality using appropriate surveying techniques can be explored.
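The Monte Carlo style of error propagation referred to above can be illustrated with a minimal sketch: perturb the spatial data input with an assumed error model, re-run the model, and summarise the spread of the outputs. The toy depth grid, the Gaussian measurement-error model and the stand-in "model" function are all hypothetical, standing in for a real hydrodynamic model treated as a grey box.

```python
import numpy as np

rng = np.random.default_rng(42)

def toy_model(bathymetry):
    # Stand-in for a hydrodynamic model treated as a grey box:
    # here just a smooth nonlinear function of the depth grid.
    return np.sqrt(9.81 * bathymetry).mean()

# Illustrative "true" bathymetry grid (depths in metres).
depths = rng.uniform(5.0, 50.0, size=(20, 20))

# Monte Carlo error propagation: perturb the input with noise
# representing survey error, re-run the model, and collect the
# distribution of outputs.
sigma = 0.5  # assumed std dev of depth measurement error (m)
outputs = []
for _ in range(1000):
    perturbed = depths + rng.normal(0.0, sigma, size=depths.shape)
    outputs.append(toy_model(perturbed))

outputs = np.array(outputs)
print(f"output mean = {outputs.mean():.4f}")
print(f"output std  = {outputs.std():.4f}")
```

The output standard deviation is the propagated effect of the assumed input error; with a real model each run would be a full simulation, which is what makes the method computationally expensive but broadly applicable.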

Li, Y.; Brimicombe, A.J. and Ralphs, M. (1998) "Spatial data quality and coastal spill modelling" in Oil and Hydrocarbon Spills: Modelling, Analysis and Control (eds. Garcia-Martinez & Brebbia), Computational Mechanics Publications, Southampton: 53-62

A growing number of numerical models have been applied to oil spill research, contingency planning and risk assessment. Whilst model quality has been a critical theme in their development and utilization, data quality in both quantitative and qualitative terms is also critical to the fitness-for-use of the outputs of the oil spill modelling process. Given the importance of factors such as shoreline and bathymetric representation, data resolution and method of interpolation in modelling coastal spills, it is essential that spatial data quality modelling is applied to hydrodynamic, trajectory and fates modelling. This paper presents initial results of research in progress that will nevertheless emphasize to modeller and manager alike the practical issues of spatial data quality for coastal oil spill modelling. It is centred around a case study of Jiao Zhou Bay in the People's Republic of China. After summarizing the issue of spatial data quality in GIS, a taxonomy useful to oil spill modelling is presented. Some strategies for managing spatial data quality in oil spill modelling are explored using GIS functionality for spatial data analysis and geostatistical modelling.

Brimicombe, A.J. (1998) "A fuzzy co-ordinate system for locational uncertainty in space and time", Innovations in GIS 5 (ed. Carver), Taylor & Francis: 143-152

Uncertainty is an unavoidable characteristic of thematic maps derived from digital spatial data. Boundaries drawn as pixel-thin lines to universally represent sharp, gradual or vague changes in a theme are an inevitable consequence of the lack of stored data with which to interpret these boundaries otherwise. Although some recent research has focused on the uncertainty of boundaries deriving from the purity of the polygons on either side, there is scope for explicitly recording the locational extent of uncertainty for individual points and lines that make up polygon boundaries. Fuzzy numbers are introduced and used to construct a fuzzy co-ordinate system in two-, three- and four-dimensions. The paper discusses the geometry of fuzzy lines and their inclusion within the traditional GIS topological data structure. By using a C language struct, the overhead of including data on boundary uncertainty may be minimal. Fuzzy number co-ordinate geometry offers opportunities in GIS applications where boundaries are inherently inexact to have more representative data that better reflect the true situation.
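The paper stores fuzzy co-ordinates compactly in a C language struct; a minimal analogue can be sketched in Python, assuming the simple triangular form of fuzzy number ("about x", with a support interval and a peak). This is an illustration of the idea only, and the paper's actual representation may differ.

```python
from dataclasses import dataclass

@dataclass
class FuzzyNumber:
    """A triangular fuzzy number (an assumed simple form):
    'about peak', with support [left, right]."""
    left: float
    peak: float
    right: float

    def __add__(self, other):
        # Interval-style addition of triangular fuzzy numbers:
        # add supports and peaks component-wise.
        return FuzzyNumber(self.left + other.left,
                           self.peak + other.peak,
                           self.right + other.right)

@dataclass
class FuzzyCoord2D:
    """A 2-D fuzzy co-ordinate: each ordinate carries its own
    locational uncertainty as a fuzzy number."""
    x: FuzzyNumber
    y: FuzzyNumber

# Usage: a boundary vertex at roughly (100, 200), each ordinate
# uncertain by +/- 2 map units, translated by an uncertain shift.
p = FuzzyCoord2D(FuzzyNumber(98, 100, 102), FuzzyNumber(198, 200, 202))
shift = FuzzyCoord2D(FuzzyNumber(4, 5, 6), FuzzyNumber(-1, 0, 1))
q = FuzzyCoord2D(p.x + shift.x, p.y + shift.y)
print(q.x.peak, q.y.peak)  # prints: 105 200
```

Because each vertex only adds a few extra numbers per ordinate, the storage overhead of recording boundary uncertainty stays small, which is the point the paper makes about its struct-based encoding.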

Brimicombe, A.J. (1997) "A universal translator of linguistic hedges for the handling of uncertainty and fitness-for-use in Geographical Information Systems" in Innovations in GIS 4 (ed. Kemp), Taylor & Francis: 115-126

Spatial data quality has been attracting much interest. Much of the problem lies in the degree to which current data structures are unable to model the real world and the way imperfections in the data may propagate during analyses and cast doubt on the validity of the outcomes. Much of the research has concentrated on the quantitative accuracy of spatial data, the derivation of indices and their propagation through analyses. Geographical data invariably includes an element of interpretation for which linguistic hedges of uncertainty may be generated. The paper presents a new technique of handling such expressions in a GIS through fuzzy expectation - intuitive probabilities linked to stylized fuzzy sets. This can be achieved without adversely affecting the size of the database. By using fuzzy expectation as linguistic building blocks, many of the difficulties in using fuzzy set descriptors in GIS have been overcome. The stylized fuzzy sets can be propagated using Boolean operators to give a resultant fuzzy set which can be 'translated' back into a linguistic quality statement. For the first time, linguistic criteria of fitness-for-use can be derived for GIS outputs regardless of the language being used.

Brimicombe, A.J. (1994) "Uncertainty and fitness-for-use: a research and implementation paradigm" Proceedings FIG XX Congress, Melbourne, Vol. 3: 374-381 

Spatial data are often inexact with levels of inherent and perceived uncertainty. Studying local, regional and global change using geo-information systems requires a paradigm for managing uncertainty and assessing fitness-for-use in the specific context of an application. A model is presented which provides the basis for recording uncertainty, propagation through data transformation and assessment of fitness-for-use either through direct visualisation or through sensitivity analysis.

Law, J. and Brimicombe, A.J. (1994) "Integrating primary and secondary data sources in land information systems" Proceedings FIG XX Congress, Melbourne, Vol. 3: 478-489 

Spatial databases usually include significant amounts of secondary data. This is inevitable as database creation proceeds from existing maps and plans rather than from new field surveys. In subsequent updating cycles, change is detected and recorded using primary data collection techniques. The paper presents current research on resolving the problems which arise when integrating primary and secondary spatial data within a layer and ensuring consistency and compatibility with other layers in a Land Information System.

Brimicombe, A.J. (1993) "Combining positional and attribute uncertainty using fuzzy expectation in a GIS" Proceedings GIS/LIS'93, Minneapolis, Vol. 1: 72-81

Spatial data of natural resources and other aspects of the environment are frequently inexact with levels of inherent and perceived uncertainty. The literature mostly treats positional and attribute uncertainty separately. A model has been developed which provides the basis for recording uncertainty, for propagation of uncertainty during data transformation and for assessing fitness-for-use of outcomes. In the context of this model, a metric - fuzzy expectation - has been developed as an extension of fuzzy sets and fuzzy numbers into a coordinate system and linguistic uncertainty descriptor. Using this metric, thematic uncertainty can be modelled over vague boundaries.

Centre for Geo-Information Studies

The Centre for Geo-Information Studies is an established research centre specialising in all aspects of geo-information science.
