Rupasinghe, K.A.B.S.; Brimicombe, A.J. and Li, Y. (2012) "An approach to data enrichment of building features using Delaunay triangulation for automatic map generalization" In Proceedings of the GIS Research UK 20th Annual Conference, Lancaster, Lancaster University: 235-240
Automatic map generalization is a difficult task due to the contextual nature of the spatial objects represented on maps and has been the focus of much research. Understanding such contextual spatial relationships is critical in determining how map generalization should be applied to spatial features on the map, considering their role, meaning and context. Existing geographic databases lack functionality for extracting such spatial relationships in the form of auxiliary data, although researchers have explored spatial structures using various algorithms from computational geometry to enhance the spatial relations between features. The process of adding such auxiliary data to a database is called data enrichment. This paper introduces a reliable geometrical data structure using Delaunay triangulation as a means of enriching databases of polygonal building features with the necessary auxiliary data. [back to the publication list]
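The paper above does not include code; purely as a hedged illustration of the underlying idea, the sketch below derives candidate neighbour relations between building features from a Delaunay triangulation of representative points and stores them as auxiliary (enrichment) data. The feature identifiers, coordinates and the use of scipy.spatial.Delaunay are illustrative assumptions, not the authors' data structure.

```python
# Illustrative sketch (not the authors' implementation): derive candidate
# neighbour relations between building features from a Delaunay triangulation
# of their representative points, as a simple form of data enrichment.
import numpy as np
from scipy.spatial import Delaunay

# Hypothetical building centroids (x, y) keyed by feature id
buildings = {
    "B1": (0.0, 0.0), "B2": (4.0, 1.0), "B3": (1.0, 5.0),
    "B4": (6.0, 6.0), "B5": (9.0, 2.0),
}
ids = list(buildings)
points = np.array([buildings[i] for i in ids])

tri = Delaunay(points)           # triangulate the representative points
neighbours = {i: set() for i in ids}
for simplex in tri.simplices:    # each triangle links three features
    for a in simplex:
        for b in simplex:
            if a != b:
                neighbours[ids[a]].add(ids[b])

# The neighbour sets could be stored as auxiliary (enrichment) data
for fid, nbrs in neighbours.items():
    print(fid, sorted(nbrs))
```

In practice the triangulation would be built on polygon vertices and filtered (for example by edge length) before the derived relations are written back to the database.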
Li, Y. and Brimicombe, A.J. (2012) "Mobile Geographic Information Systems" In Ubiquitous Positioning and Mobile Location-based Services in Smart Phones, IGI Global, 230-253
The concept of Mobile Geographical Information Systems (Mobile GIS) is introduced as the evolution of conventional GIS onto wireless mobile devices such as smart phones. The evolution of the technology and its applications are charted in this Chapter. The main elements of Mobile GIS are then discussed, focusing on: GIS servers; wireless mobile telecommunication networks; wireless mobile devices; location-awareness technology; and gateway services. This is followed by a discussion of the main features in terms of the services and usage of Mobile GIS: mobility; real-time connectivity; location-awareness; and broadened usage. Mobile GIS are an important facilitating technology for Location-Based Services (LBS). A range of applications of Mobile GIS for smart phones are described. The Chapter closes with a discussion of the prospects and challenges for Mobile GIS. Challenges derive from four broad areas: limitations of the technologies being used; areas of GIScience that still need to be adequately researched; users; and business models for a sustainable presence. [back to the publication list]
Brimicombe, A.J. and Li, Y. (2012) "Open Data and the Monitoring of the Sustainability of a London 2012 Legacy" Researching and Evaluating the Games Conference, London, UK Department for Culture, Media and Sport (DCMS)
This study was competitively commissioned by the ESRC on behalf of the International Olympic Committee (IOC) and London 2012. Due to the open data policy and web dissemination of data tables, no new primary data collection was required to carry out the study. This is not the case for other host cities: in Vancouver, and now in Sochi and Rio, large amounts of primary data collection have been necessary because fundamental data on the environment, economy and society are not readily available at sufficient granularity. The performance of London 2012 can continue to be monitored on an annual basis from open data updates, which act as a barometer of legacy outcomes. This is a testament to the accessible time-series data infrastructure that has been created in the United Kingdom, which for most data sets spans a decade. [back to the publication list]
Brimicombe, A.J. (2012) "Did GIS Start a Crime Wave? SatNav Theft and Its Implications for Geo-information Engineering" The Professional Geographer (in press)
SatNavs are the first mass consumer product containing GIS and GPS technologies. The engineering of the product as an easily detachable device without login or other secured access meant that SatNavs quickly became a target of theft and imparted to their owners (at the time) an unrecognised level of vulnerability. Spatial clustering analyses show that SatNav thefts in the London Borough of Newham are significantly different from other thefts from vehicles, reflecting in part visitor patterns and predatory, prolific offenders. [back to the publication list]
Li, Y. and Brimicombe, A.J. (2011) "A New Variable for Spatial Accessibility Measurement in Social Infrastructure Planning" Proceedings 11th International Conference on Geocomputation, London, University College London (CD)
A new variable (Average Weighted Distance) is developed to measure and analyse spatial accessibility by small area geography. It will support rapid assessments of inequalities and ‘what-if’ analyses in local social infrastructure planning. The approach can use both Euclidean distance and network distance, with postcode centroids as the atomic spatial unit. However, it is found that these two approaches have a high correlation and therefore produce similar patterns of relative inequality. The Euclidean distance approach has less computational load and is generally applicable, particularly where rapid ‘what-if’ analyses are required for decision support in a planning context. Local organisations are then able to interpret and further analyse relative local spatial accessibility for specific services/facilities as well as monitor changes in accessibility over time. [back to the publication list]
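The abstract does not spell out the computation; one plausible reading of an Average Weighted Distance by small area is a population-weighted mean of the distances from postcode centroids to their nearest facility. The sketch below assumes Euclidean distance and invented coordinates, weights and area codes; it is not the published method.

```python
# Sketch under assumptions: population-weighted average Euclidean distance
# from postcode centroids to the nearest facility, aggregated per small area.
import numpy as np

# Hypothetical postcode centroids: (x, y, population weight, small-area code)
postcodes = [
    (0.5, 0.5, 120, "LSOA1"), (1.5, 0.7, 80, "LSOA1"),
    (5.0, 4.0, 200, "LSOA2"), (6.2, 4.5, 150, "LSOA2"),
]
facilities = np.array([(1.0, 1.0), (5.5, 5.0)])  # hypothetical facility sites

awd = {}
for x, y, w, area in postcodes:
    d = np.min(np.hypot(facilities[:, 0] - x, facilities[:, 1] - y))
    num, den = awd.get(area, (0.0, 0.0))
    awd[area] = (num + w * d, den + w)

for area, (num, den) in awd.items():
    print(area, "average weighted distance:", round(num / den, 3))
```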
Brimicombe, A.J.; Li, Y. and Li, C. (2009) "Evidencing Population Change for Sustainable Community" Urban Design and Planning 162: 159-167
A hallmark of sustainable communities is their ability to adapt to demographic change. Fundamental to sustainable communities is the quality of services and opportunities afforded by the social infrastructure provision. Where the needs of residents change rapidly due to (im)migration, social and economic mobility and transience, there need to be robust mechanisms for compiling and updating the evidence base on which policy and planning changes must necessarily be founded. A key component of such an evidence base is up-to-the-moment population estimates at small-area geographies. Current debates around the ability of official statistics to reflect actual population size and demographics, their lag in release and the geographical scale at which they are made available have prompted an investigation into a novel approach to population modelling using administrative data. This paper provides an insight into the population models for lower super output area (LSOA) level estimates developed for ten boroughs within the London Thames Gateway based on council tax, child benefit and schools census data. The multi-stage multiple regression models are initially constructed using 2001 data and tested against official statistics. The estimates are then moved forward with successive annual data sets to provide an understanding of year-on-year population change. This approach is not meant to displace official statistics but to provide another view through a different route; the data can be set alongside each other for evidential decision support in social infrastructure planning. This approach is now being applied, for example, by a number of Primary Care Trusts in growth areas in London and the south Midlands in order to inform decisions on health infrastructure planning. [back to the publication list]
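As a rough, hedged illustration of what a single stage of such regression-based small-area estimation can look like (not the actual multi-stage models, variables or coefficients), the sketch below fits an ordinary least-squares model of population on synthetic administrative counts for a base year and rolls it forward with later counts.

```python
# Hedged sketch: a single-stage analogue of regression-based small-area
# population estimation from administrative counts (synthetic data only).
import numpy as np

rng = np.random.default_rng(0)
n = 200  # hypothetical LSOAs
council_tax = rng.integers(400, 1200, n)       # dwellings on council tax
child_benefit = rng.integers(50, 400, n)       # children on child benefit
pop_2001 = 2.1 * council_tax + 1.3 * child_benefit + rng.normal(0, 50, n)

# Fit ordinary least squares on the base year (2001 census benchmark)
X = np.column_stack([np.ones(n), council_tax, child_benefit])
beta, *_ = np.linalg.lstsq(X, pop_2001, rcond=None)

# Roll the model forward with a later year's administrative counts
council_tax_2006 = council_tax + rng.integers(0, 60, n)
child_benefit_2006 = child_benefit + rng.integers(-20, 40, n)
X_2006 = np.column_stack([np.ones(n), council_tax_2006, child_benefit_2006])
pop_2006_est = X_2006 @ beta
print("estimated 2006 population, first 5 LSOAs:", np.round(pop_2006_est[:5]))
```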
Brimicombe, A.J.; Li, Y.; Al-Zakwani, A. and Li, C. (2009) "Agent-Based Distributed Component Services in Spatial Modeling" In Computational Science and its Applications - ICCSA 2009 (eds. Gervasi et al.), Springer, Berlin: 300-312
Agent technologies have been increasingly applied to spatial simulation and modeling in silico. Where multi-agent systems have been used for spatial simulation, agents have tended to be deployed as spatial objects in order to study emergent patterns from micro-level behaviors. Many of these applications only deploy a weak notion of agency. More recently, the concept has emerged in the spatial domain that agents can be deployed as services to assist in complex modeling tasks. Agent-based distributed component services bring a stronger notion of agency to spatial modeling and are particularly suited to achieving interoperability in heterogeneous computational environments. Two case studies are presented. In the first, agent-based services are deployed over a network for spatial data quality analysis. In the second, a variogram agent component is used to demonstrate how a collaborating multi-agent system can provide intelligent, autonomous services to carry out complex operations. [back to the publication list]
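The variogram component itself is not specified in the abstract; as a reminder of the computation such an agent service would wrap, the sketch below estimates an empirical semivariogram on synthetic point data. All values and binning choices are assumptions for illustration.

```python
# Minimal empirical semivariogram on synthetic data, illustrating the kind of
# computation a 'variogram agent' service might encapsulate (assumption only).
import numpy as np

rng = np.random.default_rng(1)
xy = rng.uniform(0, 100, size=(300, 2))                 # sample locations
z = np.sin(xy[:, 0] / 20.0) + rng.normal(0, 0.1, 300)   # attribute values

# Pairwise distances and squared differences
d = np.hypot(xy[:, None, 0] - xy[None, :, 0], xy[:, None, 1] - xy[None, :, 1])
g = 0.5 * (z[:, None] - z[None, :]) ** 2
iu = np.triu_indices(len(z), k=1)
d, g = d[iu], g[iu]

# Bin by separation distance (lag) and average the semivariance
bins = np.linspace(0, 60, 13)
for lo, hi in zip(bins[:-1], bins[1:]):
    mask = (d >= lo) & (d < hi)
    if mask.any():
        print(f"lag {lo:4.0f}-{hi:4.0f}: gamma = {g[mask].mean():.3f}")
```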
Brimicombe, A.J. and Li, C. (2009) Location-Based Services and Geo-Information Engineering, Wiley, Chichester.
Location-Based Services (LBS) are the delivery of data and information services where the content of those services is tailored to the current location and context of a mobile user. This is a new and fast-growing technology sector incorporating GIS, wireless technologies, positioning systems and mobile human-computer interaction. Geo-Information (GI) Engineering is the design of dependably engineered solutions to society's use of geographical information and underpins applications such as LBS. These are brought together in this comprehensive text that takes the reader through from source data to product delivery. This book will appeal to professionals and researchers in the areas of GIS, mobile telecommunications services and LBS. It provides a comprehensive view and in-depth knowledge for academia and industry alike. It serves as essential reading and an excellent resource for final year undergraduate and postgraduate students in GIScience, Geography, Mobile Computing or Information Systems who wish to develop their understanding of LBS. [back to the publication list]
Li, Y.; Brimicombe, A.J. and Li, C. (2008) "Agent-based services for the validation and calibration of multi-agent models" Computers, Environment and Urban Systems 32: 464-473
Agent-based modelling in the form of multi-agent models has been increasingly applied to the simulation of spatial phenomena in silico. Validation and calibration are recurrent problems. The complexity of these models with large numbers of parameters can make validation procedures intractable. In this paper, the novel concept of using agent-based technologies to create services that assist in the validation and calibration of multi-agent models is developed. Such agent-based services offer an efficient solution where large numbers of model runs need to be carried out. In this paper, the agent-based services are collaborative sets of agents that perform calibration and sensitivity analysis as a key task in model validation. In a case study, the prototype agent-based validation services are implemented for a multi-agent wayfinding model as a means of proof-of-concept. The case study demonstrates how agent-based services can be deployed for testing the robustness of emergent patterns through sensitivity analyses and used for model calibration. [back to the publication list]
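The abstract describes sensitivity analysis and calibration as services rather than giving an algorithm. The sketch below shows only the generic pattern such a service automates, repeated stochastic runs across a parameter sweep, with a toy function standing in for the multi-agent model; every name and value here is a placeholder assumption.

```python
# Generic parameter-sweep sensitivity pattern that a validation/calibration
# service could automate; the 'model' is a toy placeholder, not the authors'.
import numpy as np

def toy_model(attraction_weight, n_agents=500, seed=0):
    """Stand-in simulation returning a single emergent output statistic."""
    rng = np.random.default_rng(seed)
    scores = attraction_weight * rng.random(n_agents) + rng.normal(0, 0.2, n_agents)
    return float((scores > 0.5).mean())   # e.g. share of agents reaching a goal

observed = 0.62                           # hypothetical target for calibration
best = None
for w in np.linspace(0.2, 2.0, 10):       # one-at-a-time sweep over a parameter
    runs = [toy_model(w, seed=s) for s in range(20)]   # repeated stochastic runs
    mean_out = float(np.mean(runs))
    error = abs(mean_out - observed)
    print(f"weight={w:.2f} mean output={mean_out:.3f} |error|={error:.3f}")
    if best is None or error < best[1]:
        best = (w, error)
print("best-fitting weight (toy calibration):", best[0])
```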
Li, Y.; Brimicombe, A.J. and Li, C. (2008) "Scenario-based Small Area Population Modelling for Social Infrastructure Planning" Proceedings GISRUK2008, Manchester: 348-353
In recent years, the geodemographic makeup of some areas in the UK has been rapidly changing. For example, immigration has put more pressure on child services, education and health care in places such as Slough, Peterborough and the Thames Gateway. Other factors affecting the Thames Gateway are housing development as part of the massive regeneration and the development and legacy of the Olympic site. This region is also experiencing high population churn, uncertainty in its demographic composition and issues in matching service delivery. There are also increasing demands for building sustainable communities that can adapt to change. A key to maintaining sustainable communities is the quality of services and opportunities afforded by the social infrastructure. Where the needs of residents rapidly change due to (im)migration, social and economic mobility and transience, there need to be robust mechanisms for compiling and updating the evidence base on which policy and planning changes must necessarily be founded. This paper proposes scenario-based small area population modelling with multiple administrative data sources as a means of evidencing change. It is being implemented in the Thames Gateway London boroughs, with funding from UrbanBuzz (www.urbanbuzz.org) to support local social infrastructure planning. [back to the publication list]
Brimicombe, A.J. (2008) “Location-Based Services and GIS” In The Handbook of Geographical Information Science (eds. Wilson & Fotheringham), Blackwell, Oxford: 581-595
In this chapter, I set the context for the emergence of location-based services (LBS) as an application of GIS. LBS can then be defined and placed alongside other GIS-based technologies. I explore some of the data implications of LBS, how LBS users are positioned so that the system knows where they are and how queries can be expedited. I then explore some applications of LBS and conclude by reading some of the signposts as to what lies on the road ahead. LBS is a newly emerging technology and as with most other technologies we cannot be sure where it will lead us. All I can do here is reveal what is known, cut through the inevitable hype and scan the horizon of our possible futures. But one thing is for sure, LBS is an application of exciting potential which integrates nearly all aspects of geo-information science presented in the other chapters of this book. [back to the publication list]
Brimicombe, A.J. (2007) “Ethnicity, religion and residential segregation in London: evidence from a computational typology of minority communities” Environment & Planning B: Planning & Design 34: 904-924
Within the context of growing polarisation and fragmentation of the urban landscape, this paper presents a computational typology applicable to the study of minority communities, both ethnic and religious, useful in understanding their spatial distribution and juxtaposition at neighbourhood levels. The typology has been applied to multicultural London using the 2001 census in which there were questions on ethnicity and religion. The landscape of religion is found to be more highly segregated in contrast to the landscape of ethnicity. Furthermore, on the basis of a preliminary analysis of indicator variables, minorities seem on aggregate to be in an improved situation given a level of residential segregation with the exception of residents of segregated Asian-Bangladeshi areas for ethnicity and residents of segregated Muslim areas for religion. This questions the generally held view that segregation in a multicultural society is undesirable per se and suggests that a ‘one size fits all’ government policy towards residential segregation is insufficiently perceptive. The typology introduced here should facilitate a more critically informed approach to multiculturalism and the contemporary city. [back to the publication list]
Li, Y. (2007) "Control of spatial discretisation in coastal oil spill modelling" International Journal of Applied Earth Observation and Geoinformation 9: 392-402
Spatial discretisation plays an important role in many numerical environmental models. This paper studies the control of spatial discretisation in coastal oil spill modelling with a view to assure the quality of modelling outputs for given spatial data inputs. Spatial data analysis techniques are effective for investigating and improving the spatial discretisation in different phases of the modelling. Proposed methods are implemented and tested with experimental models. A new “automatic search” method based on GIS zone design principles is shown to significantly improve discretisation of bathymetric data and hydrodynamic modelling outputs. The concepts and methods developed in the study are expected to have general relevance for a range of applications in numerical environmental modelling. [back to the publication list]
Li, Y.; Brimicombe, A.J. and Li, C. (2007) "Agent-Based Services for Validating Multi-Agent Models" in Proceedings 9th International Conference on GeoComputation, Maynooth, National University of Ireland
Agent-based modelling has been increasingly applied to the simulation of spatial phenomena in silico. In an agent-based spatial simulation, agents have tended to be defined as spatial objects to computationally represent the behaviour of individuals in order to study emergent patterns arising from micro-level interactions. More recently, agents have been used to represent spatial processes as the modelling primitives in order to focus on process information in dynamic models. However, a recurrent problem in agent-based modelling is the validation of outcomes. Thus a third approach, investigated here, is to harness the mobility and intelligence of agents to create tools that offer agent-based services for the validation of agent-based modelling. A typical agent-based service for spatial simulation might be quality analyses of both data and models. In this paper we specifically investigate agent-based services for sensitivity analysis and calibration of multi-agent models. In order to develop an interoperable and distributed system, remote multi-agent technology is deployed. A collection of collaborating agents can then be shared as services across a network. [back to the publication list]
Brimicombe, A.J.; Brimicombe, L.C. and Li, Y. (2007) "Improving geocoding rates in preparation for crime data analysis" International Journal of Police Science and Management 9: 80-92
Problem-oriented policing
requires quality analyses of patterns and trends in crime incidences. A common
form of analysis is the identification of geographical clusters or 'hot spots'.
For such analyses, crime incident records must first be geocoded, that is,
address-matched so as to have geographical co-ordinates attached to each
record. The address fields in crime databases typically have omissions and
inaccuracies whilst a good proportion of crimes occur at non-address locations.
Consequently, geocoding can have an unacceptably low hit rate. We present and
test an improved automated and consistent approach to batch geocoding of crime
records that raises the hit rate by an additional 65% to an overall rate of
91%. This is based on an actual implementation for a UK Police Force. Kernel
density surfaces used to visualise the results of the test show that the
additional geocoded records have distinct spatial patterning. This would
indicate that without the improved hit rate, geocoded crime records are likely
to be spatially biased and that 'hot spots' of crime tend also to be 'hot
spots' of otherwise un-geocoded data.[back to the
publication list]
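The published workflow is specific to UK address data and a police gazetteer; purely to illustrate the general idea of raising hit rates by cascading match rules (exact address, cleaned address, then a street-level fallback), here is a toy sketch with invented reference data and records.

```python
# Toy cascade geocoder: try exact match, then a cleaned match, then a
# street-level fallback, mirroring the general idea of raising hit rates.
# All reference data and records are invented for illustration.
import re

address_index = {"10 HIGH STREET": (530100, 181200), "2 PARK ROAD": (530400, 181900)}
street_index = {"HIGH STREET": (530110, 181210), "PARK ROAD": (530410, 181910)}

def clean(text):
    """Uppercase, strip punctuation and redundant whitespace."""
    return re.sub(r"\s+", " ", re.sub(r"[^\w\s]", " ", text.upper())).strip()

def geocode(raw_address):
    if raw_address in address_index:                     # 1. exact match
        return address_index[raw_address], "exact"
    cleaned = clean(raw_address)
    if cleaned in address_index:                         # 2. cleaned match
        return address_index[cleaned], "cleaned"
    street = re.sub(r"^\d+\w?\s+", "", cleaned)          # 3. drop house number
    if street in street_index:
        return street_index[street], "street fallback"
    return None, "unmatched"

for record in ["10 High Street", "10, high street.", "2 PARK ROAD", "canal towpath"]:
    print(record, "->", geocode(record))
```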
Brimicombe, A.J. (2007) "A dual approach to cluster discovery in point event data sets" Computers, Environment and Urban Systems 31: 4-18
Spatial data mining seeks to discover meaningful patterns in data where a prime
dimension of interest is geographical location. Consideration of a spatial
dimension becomes important where data either refer to specific locations
and/or have significant spatial dependence which needs to be considered if
meaningful patterns are to emerge. For point event data there are two main
groups of approaches to identifying clusters. One stems from the statistical
tradition of classification which assigns point events to a spatial
segmentation. A popular method is the k-means algorithm. The other broad
approach is one which searches for hot spots which can be loosely
defined as a localised excess of some incidence rate. Examples of this approach
are GAM and kernel density estimation. This paper presents a novel variable
resolution approach to hot spot cluster discovery which acts to
define spatial concentrations within the point event data. Hot spot centroids are then used to establish additional distance variables and initial
cluster centroids for a k-means classification that produces a segmentation,
both spatially and by attribute. This dual approach is effective in quickly
focusing on rational candidate solutions to the values of k and choice of
initial candidate centroids in the k-means clustering. This is demonstrated
through the analysis of a business transactions database. The overall dual
approach can be used effectively to explore clusters in very large point event
data sets.[back to the publication list]
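The variable resolution (Geo-ProZones) step is not reproduced here; as a loose analogue of the dual approach, the sketch below finds point concentrations with a crude grid density count and uses them as initial centroids for k-means, on synthetic data with scikit-learn. It illustrates the seeding idea only, not the paper's algorithm.

```python
# Loose analogue of the dual approach: find point concentrations first (here
# via a crude grid density count), then use them as initial k-means centroids.
# Synthetic data and a grid count stand in for the variable-resolution method.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
pts = np.vstack([rng.normal(c, 0.4, size=(150, 2)) for c in [(1, 1), (4, 4), (7, 1)]])

# Crude density surface: count points in grid cells, take the densest cells
cell = 0.5
ij = np.floor(pts / cell).astype(int)
cells, counts = np.unique(ij, axis=0, return_counts=True)
top = cells[np.argsort(counts)[-3:]]                 # three densest cells
seeds = (top + 0.5) * cell                           # cell centres as seeds

km = KMeans(n_clusters=len(seeds), init=seeds, n_init=1, random_state=0).fit(pts)
print("seed centroids:\n", np.round(seeds, 2))
print("final k-means centroids:\n", np.round(km.cluster_centers_, 2))
```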
Brimicombe, A.J. and Li, Y (2006) "Mobile
Space-Time Envelopes for Location-Based Services" Transactions in GIS
10(1): 5-23
The convergence and miniaturisation of a range of
information and communication technologies, together with increasing bandwidth
availability and near ubiquity of mobile phones, are giving rise to a
technological environment in which location-based services (LBS) can
realistically develop. In this paper we review the nature of location-based
services and the implications for data and spatial queries. In doing so, we put
forward a research agenda that arises for geographical information science and
engineering. Central to LBS are problems of response time and the information
utility of responses to queries and any pushed alerts, where information
utility refers to content, timeliness and geographical footprint. Within a
publish/subscribe model of LBS provision, we propose mobile space-time
envelopes as a novel approach to event brokerage. These envelopes
simultaneously provide soft clip pruning of candidate data sets in
anticipation of queries, and provide the trigger that subscribers are
pertinently in-range for alerts. We present the geometrical, algebraic and
algorithmic concepts of mobile space-time envelopes and provide an example of
these mobile envelopes in action. We conclude with a discussion of how this
initial implementation could be further developed to incorporate added
spatio-temporal intelligence.[back to the
publication list]
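The geometry of the published envelopes is richer than this, but the basic idea of soft-clipping candidate data against a mobile user's reachable region can be sketched as a circular space-time buffer centred on the projected position, with a radius that grows with speed and look-ahead time. All parameters and points of interest below are invented for illustration.

```python
# Sketch of soft-clip pruning against a simple space-time envelope: a circle
# centred on the user's projected position whose radius grows with speed and
# look-ahead time. The real envelopes are more elaborate; values are invented.
import math

def in_envelope(user_xy, heading_deg, speed_mps, lookahead_s, poi_xy, slack_m=200.0):
    """Return True if a point of interest falls inside the projected envelope."""
    dx = speed_mps * lookahead_s * math.cos(math.radians(heading_deg))
    dy = speed_mps * lookahead_s * math.sin(math.radians(heading_deg))
    cx, cy = user_xy[0] + dx / 2.0, user_xy[1] + dy / 2.0   # envelope centre
    radius = speed_mps * lookahead_s / 2.0 + slack_m        # reachable radius
    return math.hypot(poi_xy[0] - cx, poi_xy[1] - cy) <= radius

user = (0.0, 0.0)
pois = {"cafe": (400.0, 120.0), "museum": (2500.0, -300.0), "fuel": (900.0, 150.0)}
for name, xy in pois.items():
    hit = in_envelope(user, heading_deg=10, speed_mps=13.9, lookahead_s=120, poi_xy=xy)
    print(name, "in envelope" if hit else "pruned")
```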
Brimicombe, A.J. (2006) "Location-Based Services and GIS" in Handbook of Geographical Information Science (eds. Wilson and Fotheringham), Blackwell, Oxford: Chapter 38 (in press).
In this chapter, I set the context for
the emergence of location-based services (LBS) as an application of GIS. LBS is
then defined and placed alongside other GIS-based technologies. I explore some
of the data implications of LBS, how LBS users are positioned so that the
system knows where they are, and how queries can be expedited. I then explore
some applications of LBS and conclude by reading some of the signposts as to
what lies on the road ahead. LBS is a newly emerging technology and as with
most other technologies we cannot be sure where it will lead us. All I can do
here is reveal what is known, cut through the inevitable hype and scan the
horizon of our possible futures. But one thing is for sure, LBS is an
application of exciting potential which integrates nearly all aspects of
geo-information science presented in the other chapters of this book. [back to the publication list]
Brimicombe, A.J. (2006) "Modelling spatial variation in street crime: an inductive learning approach" Proceedings GISRUK2006, Nottingham: 59-64
A key dimension in crime
analysis is geographical location - the characteristics and juxtapositions of
where crimes happen. Not surprisingly then, Geographical Information Systems
(GIS) have been in use since the early 1990s to assist in the identification of
geographical clusters of crime and are now routinely used to determine such
'hot spots'. The research focus in crime mapping has now shifted towards the
development of analytical models that provide an understanding of the
underlying determinants that can then inform policy and crime prevention
initiatives.[back to the publication list]
Li, Y. (2006) "Spatial data quality analysis with Agent technologies" Proceedings GISRUK2006, Nottingham: 250-254
Agent technologies have attracted increasing
attention from GIS research and applications. Intelligent agents have shown
considerable potential for spatial data analysis. This paper proposes an
agent-based solution for spatial data quality analysis. A collection of
collaborating agents is constructed in a multi-agent framework. These remote
intelligent agents can be shared as spatial data quality analysis tools over a
network. The proposed solution therefore offers a decentralised, distributed
and service-oriented system. It is expected to aid users in designing their own
data quality test procedures for environmental simulation models.[back to the publication list]
Brimicombe, A.J. (2005) "Cluster detection in point event data having tendency towards spatially repetitive events" Proceedings 8th International Conference on GeoComputation, Ann Arbor, Michigan (CD)
The analysis of point event patterns in geography, ecology and epidemiology has a long tradition. Of particular interest are
patterns of clustering or 'hot spots' and such cluster detection lies at the
heart of spatial data mining. Certain classes of point event patterns exhibit a
tendency towards spatial repetitiveness (within the resolution of
geo-positioning) although with a temporal separation. Examples are crime and
traffic accidents. Spatial superimposition of point events challenges many
existing approaches to cluster detection. In this paper a variable resolution
approach, Geo-ProZones, is applied to residential burglary data exhibiting a
high level of repeat victimisation. This is coupled with robust normalisation
as a means of consistently defining and visualising the 'hot spots'. [back to the publication list]
Brimicombe, A.J. (2005) "La détection des concentrations des événements ponctuels ayant des répétitions spatiales" Actes du Colloque International de Géomatique et d'Analyse Spatiale, Avignon, France ISBN 2-910545-06-7 (CD)
Abstract. [back to the publication list]
Li, Y. (2005) "Agent technologies for spatial data quality analysis in environmental modelling" Proceedings GeoComputation 05, Ann Arbor, Michigan (CD)
The study shows the potential of an agent-based Geo-data
Quality Analysis engine. With this interoperable engine, spatial data quality
analysis can be achieved in environmental modelling. Ultimately, in the longer
term, through such agents, the Geo-data Quality Analysis engine would be
distributed on the Internet to be used by the scientific and professional
community.[back to the publication list]
Miller, N. and Brimicombe, A.J. (2004) "Mapping research journeys across complex terrain with heavy baggage" Studies in Continuing Education 26: 405-417
In this
article we review our experience of collaborating in the design and delivery of
multidisciplinary training and support programmes for doctoral students, and
our attempts to locate models and metaphors for research planning and
implementation which travel well across disciplines. We extend the metaphor of
the journey to conceptualise a mapping of the PhD process, and examine the
extent to which research students from widely divergent backgrounds may travel
together and help each other navigate towards their destinations. We explore
some issues of culture and communication involved in working in an
interdisciplinary context, showing how we have been provoked to reflect
critically on our own research identities and locations in the process of
working together. We also identify some tensions between the assumptions about
research development embodied in recent government policy documents and the
lived experience of the research students with whom we work.[back to the publication list]
Brimicombe, A.J. (2003) GIS, Environmental Modelling and Engineering. Taylor & Francis, London.
The
significance of modelling in managing the environment is well recognised from
scientific and engineering perspectives as well as in the political arena.
Environmental concerns and issues of sustainability have permeated both public
and private sectors, particularly the need to predict, assess and mitigate
against adverse impacts that arise from continuing development and use of
resources. Environmental modelling is an inherently spatial activity well
suited to taking advantage of Geographical Information Systems (GIS)
functionality whether this be modelling aspects of the environment within GIS
or linked to external simulation models. In doing so, a number of issues become
important: environmental models need to consider space-time processes often in
three-dimensions whereas GIS are largely two-dimensional and static;
environmental models are often computational simulations, as distinct from
cartographic modelling and map algebra in GIS; what should be the degree of
integration between GIS and environmental models on any project; how does
uncertainty in spatial data from GIS and the choices of parameterisation in
environmental models combine and propagate through analyses to affect outputs;
by what means can decisions be made with uncertain information. These issues
inevitably arise in every scientific and engineering project to model and
manage our environment. Students need to be made aware of these issues.
Practitioners should enrich their knowledge and skills in these areas. This
book focuses on the modelling, rather than on data collection or visualisation
- other books adequately cover these areas - and aims to develop critical users
of both GIS and environmental models.[back to the
publication list]
Li, Y. (2004) "Control of Spatial Discretisation in Coastal Oil Spill Modelling" Proceedings GISRUK2004, East Anglia: 64-68
Spatial discretisation is an effective means of spatial data modelling. Data
aggregation or division over space for census data, social or economic
modelling has attracted much attention in GIS research. For environmental
problems, the impact of environmental change also has a spatial dimension. In
most environmental simulation modelling, the numerical computation or
manipulation has to work with discretised spatial data rather than continuous
data or point survey data. Spatial discretisation has therefore been widely
used for numerical environmental modelling. Through spatial discretisation, a
tessellation for numerical modelling is established to regroup spatial data to
match specific criteria. The diversity and complexity of environmental
modelling raises a number of interesting issues as well as demands for proper
study. In coastal oil spill modelling, spatial discretisation is the basis for
numerical computation and simulation. Discretisation is used to construct the
modelling mesh for Finite Element or Finite Difference computation in
hydrodynamic modelling. It will also generate the modelling grid for trajectory
and fate simulations. Current de facto industry procedures for such
discretisations are pragmatic. However in many cases, they lack quality control
and depend on the modeller's experience. With spatial analysis techniques, this
paper will study the control of spatial discretisation in coastal oil spill
modelling with a view to assure the quality of modelling for given spatial data
inputs. [back to the publication list]
Li, Y.; Grainger, A.; Hesley, Z.; Hofstad, O.; Sankhayan, P.; Diallo, O. and O'Kting'Ati, S. (2004) "Using GIS techniques to evaluate community sustainability in open forestlands in Sub-Saharan Africa" in Methodologies, Models and Instruments for Rural and Urban Development (ed. Dixon-Gough), Ashgate Publishing, Aldershot.
Community
sustainability in villages in developing countries is a meaningful concept to
their inhabitants, whose livelihoods heavily depend on renewable natural
resources in the immediate vicinity. This paper describes how the integrated
use of optimization models and Geographic Information System (GIS) models can
give insights into community sustainability. The case studies are carried out
for villages in open forestlands of three African countries - Senegal, Tanzania
and Uganda. GIS efficiently manage various data in this project and provide the
basis for practical planning techniques. [back to
the publication list]
Li, Y. and Brimicombe, A.J. (2003) "A Spatial Data Quality Analysis Engine for Coastal Oil-Spill Modelling" Proceedings of Conference on Oil Spill, Oil Pollution and Remediation, Istanbul: 43-54
Oil-spill models play an
important role in prevention, preparation, response and impact analysis for
coastal oil-spill pollution. The reliability of an oil-spill model depends on
the accuracy of spatial data entry, integration and computation. There are a
wide range of data quality problems for both data and model operations. This
paper presents research work for a newly-developed Geo-data Quality Analysis
(GQA) engine, which is constructed as a tightly-coupled collection of GI tools.
Off-the-shelf tools are intended to work together in an interoperable framework
for a specified oil-spill model. It overcomes the lack of spatial data quality
functionality in GIS software as well as the limitations of a fully-integrated package. A conceptual prototype of the GQA engine has been developed, which includes some basic analysis tools and also provides help in designing test procedures for the specific model. This prototype has been applied to coastal oil-spill modelling and should be generally applicable given its
flexible structure and compatible tools.[back to
the publication list]
Brimicombe, A.J. (2003) "A variable resolution approach to cluster discovery in spatial data mining" in Computational Science and Its Applications (eds. Kumar et al.), Springer-Verlag, Berlin, Vol. 3: 1-11
Spatial data
mining seeks to discover meaningful patterns from data where a prime dimension
of interest is geographical location. Consideration of a spatial dimension
becomes important when data either refer to specific locations and/or have
significant spatial dependence which needs to be considered if meaningful
patterns are to emerge. For point data there are two main groups of approaches.
One stems from traditional statistical techniques such as k-means clustering in
which every point is assigned to a spatial grouping and results in a spatial
segmentation. The other broad approach searches for 'hotspots' which can be
loosely defined as a localised excess of some incidence rate. Not all points
are necessarily assigned to clusters. This paper presents a novel variable
resolution approach to cluster discovery which acts in the first instance to
define spatial concentrations within the data thus allowing the nature of
clustering to be defined. The cluster centroids are then used to establish
initial cluster centres in a k-means clustering and arrive at a segmentation on
the basis of point attributes. The variable resolution technique can thus be
viewed as a bridge between the two broad approaches towards knowledge discovery
in mining point data sets. Applications of the technique to date include the
mining of business, crime, health and environmental data.[back to the publication list]
Brimicombe, A.J. (2002) "GIS - Where are the frontiers now?" Proceedings GIS 2002, Bahrain: 33-45
Geographical Information Systems (GIS) have undergone a state change. The
discipline can now differentiate activities of science and engineering from the
more narrow focus of just systems. There has also been a paradigm shift towards
geocomputation as an appropriate approach towards both scientific investigation
and building engineering solutions. This paper discusses these issues and goes
on to identify three areas at the current forefront of GIS: spatial data
mining, computational modelling of spatial processes and location-based
services.[back to the publication list]
Brimicombe, A.J. (2002) "Cluster discovery in spatial data mining: a variable resolution approach" In Data Mining III (eds. Zanasi et al.), WIT Press, Southampton: 625-634
Spatial
data mining seeks to discover meaningful patterns from data where a key
dimension of the data is geographical location. This spatial dimension becomes
important when data either refer to specific locations and/or have significant
spatial dependence and which needs to be taken into consideration if meaningful
patterns are to emerge. For point data there are two main groups of approaches.
One stems from traditional statistical techniques such as k-means clustering in
which every point is assigned to a spatial grouping and results in a spatial
segmentation. The segmentation has k sub-regions, is usually space filling and
non-overlapping (i.e. a tessellation) in which all points fall within a spatial
segment. The difficulty with this approach is in defining k centroid locations
at the outset of any data mining. The other broad approach searches for
'hotspots' which can be loosely defined as a localised excess of some incidence
rate. In this approach not all points are necessarily assigned to clusters. It
is the mainstay of those approaches which seek to identify any significantly
elevated risk above what might be expected from an at-risk background
population. Definition of the population at risk is clearly critical and in
some data mining applications is not possible at the outset. This paper
presents a novel variable resolution approach to cluster discovery which acts
in the first instance to define spatial concentrations in the absence of
population at risk. The cluster centroids are then used to establish initial
centroids for techniques such as k-means clustering and arrive at a
segmentation on the basis of point attributes. The variable resolution
technique can thus be viewed as a bridge between the two broad approaches
towards knowledge discovery in mining point data sets. The technique is equally
applicable to the mining of business, crime, health and environmental data. A
business-oriented case study is presented here.[back to the publication list]
Brimicombe, A.J. and Li, Y. (2002) "Dynamic space-time envelopes for location-based services" CGIS Working Paper.
An important determinant for success of location-based services
(LBS) will be the speed of response for information. Databases for LBS are
likely to be networked and very large with response times for spatial queries
from mobile devices orders of magnitude longer to transact than non-spatial
queries. This paper proposes dynamic space-time envelopes as a way of
geographically partitioning databases in anticipation of requests for
information from individuals on the move. The dynamics of these envelopes is
illustrated using data from a real journey and pseudo code for the creation of
envelopes is provided.[back to the publication
list]
Li, Y. and Brimicombe, A.J. (2002) "Assessing the quality implications of accessing spatial data: the case for GIS and environmental modelling" Proceedings GISRUK 2002, Sheffield: 68-71
For the spatial sciences, the 1990s were a period of
transition from data-poverty to data-richness. Digital spatial data sets have
grown rapidly in scope, coverage and volume (Miller & Han, 2001). This
state change has been facilitated by: improved technology and wider use of GPS,
remote sensing and digital photogrammetry for data collection; the introduction
of new technologies such as LiDAR and radar interferometry; the operation of
Moore's Law resulting in increased computing power to process raw data coupled
with the falling cost of data storage; the advent of data warehousing
technologies; increasingly efficient ways of accessing and delivering data
on-line. The technical advances in hardware, software and data have been so
profound that their effect on the range of problems studied and the
methodologies used have been fundamental. At the same time however, problems
have arisen in identifying and locating relevant data and in evaluating choices
of resolution, coverage, provenance, cost and conformance with the models to be
used. Furthermore, for environmental models it may not be so much the
characteristics of the raw data that are the most critical but their
characteristics once converted, aggregated and implemented in the model. Given
that a modelling task may access data from multiple sources, there is the added
difficulty of assessing combined performance in relation to the implementation
of the simulation such that outputs have fitness-for-use. Clearly front-end
tools are required in order to resolve these data issues both prior to purchase
and also in specifying new data collection. A conceptual prototype for a Geo-Quality Analysis (GQA) engine is developed, which includes off-the-shelf quality analysis tools and an experiment design program. Data-richness has led to choice and the need to evaluate that choice from the outset in terms of its implications for fitness-for-use in environmental modelling. The application of
a GQA prototype in coastal oil spill modelling has shown the feasibility of
doing this.[back to the publication list]
Brimicombe, A.J.; Ralphs, M.; Sampson, A. and Tsui, P. (2001) "An exploratory analysis of the role of neighbourhood ethnic composition in the geographical distribution of racially motivated incidents" British Journal of Criminology 41: 293-308
This paper
explores the use of statistical and Geographical Information Systems mapping
techniques in producing a preliminary assessment of geographical patterns of
racially motivated crimes and harassment in a given area. The geographical
distribution of allegations of racially motivated incidents reported to the
police in the London Borough of Newham is investigated. The results of the
analysis suggest that the ethnic composition of an area appears to have a
significant effect on the rate of incidents. Correlation and regression
analyses are carried out and support the preliminary finding that rates of
incidence are significantly higher where there is a large white majority and
smaller groups of other ethnicities.[back to the
publication list]
Li, Y.; Brimicombe, A.J. and Ralphs, M. (2000) "Spatial data quality and sensitivity analysis in GIS and environmental modelling: the case of coastal oil spills" Computers, Environment & Urban Systems 24: 95-108
Integration with environmental modelling, concern for spatial data quality issues and the rise of the geocomputation paradigm have been three important areas of GIS research. In this paper they are brought together in the context of coastal oil spill modelling. Spatial data quality analyses of data and model elements for output sensitivity and error propagation highlight the need to revise the coupling strategies of GIS and environmental modelling to include a geo-data quality analysis (GQA) engine. The implementation of comprehensive geospatial data
quality analysis procedures within existing GIS appears unlikely. Nevertheless,
as the complexity of data sets and the modelling capability of computer systems
increase, the need to address the quality of both data and models is
increasingly important. With growing availability of proprietary and public
domain software suitable for spatial data quality analysis, GQA engines will
evolve from the assembly of these tools external to GIS. GIS, environmental
models and GQA will form a tightly-coupled modelling network, which, because of
the importance of quality issues and the need for systematic testing, will see
the dominant interaction between the GQA and the environmental modelling.[back to the publication list]
Li, Y.; Ralphs, M. and Brimicombe, A.J. (2000) "Error propagation for complex environmental modelling - the case of spatial data quality in coastal oil spill modelling" Proceedings Accuracy 2000: 409-416
Error propagation analysis can now offer a large
amount of information about both spatial data inputs and modelling processes.
This paper reports on the results of an error propagation study that takes
coastal oil spill modelling as its operational context. In coastal oil spill
modelling, any data error or uncertainty will be propagated through the
sub-models because of the model configuration. There is also considerable scope
for operationally-induced error or uncertainty. The analyses are carried out
for spatial data inputs to the hydrodynamic model and initially focus on
hydrodynamic modelling with knock-on error effects for trajectory and fate
modelling. The paper also considers the use of error propagation techniques in
assisting the user with the issue of algorithm choice in the oil spill
modelling process. This paper gives the results of the experiments in full and
provides an improved overall methodology for the study of error propagation for
spatial data in environmental modelling.[back to the publication list]
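As a generic reminder of what Monte Carlo error propagation involves (not the study's oil spill sub-models or error models), the sketch below perturbs an uncertain input field many times and summarises the spread of a derived output.

```python
# Generic Monte Carlo error propagation: perturb an uncertain input (here a
# toy bathymetry grid) many times and summarise the spread of a derived
# output. A toy depth-dependent formula stands in for the real sub-models.
import numpy as np

rng = np.random.default_rng(3)
bathymetry = rng.uniform(5.0, 30.0, size=(20, 20))   # synthetic depths (m)
depth_sigma = 0.5                                     # assumed input error (m)

def derived_output(depth):
    """Toy stand-in for a model output that depends on depth."""
    return float(np.mean(np.sqrt(9.81 * depth)))      # mean shallow-water wave speed

runs = []
for _ in range(1000):
    noisy = bathymetry + rng.normal(0.0, depth_sigma, bathymetry.shape)
    runs.append(derived_output(noisy))

runs = np.array(runs)
print(f"output mean = {runs.mean():.3f}, std = {runs.std():.4f} (propagated uncertainty)")
```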
Brimicombe, A.J. and Tsui, P. (2000) "A variable resolution, geocomputational approach to the analysis of point patterns" Hydrological Processes 14: 2143-2155
A geocomputational approach to the solution of applied spatial
problems is being ushered in to take advantage of ever increasing computer
power. The move is seen widely as a paradigm shift allowing better solutions to
be found for old problems, solutions to be found for previously unsolvable
problems and the development of new quantitative approaches to geography. This
paper uses geocomputation to revisit point pattern analysis as an objective,
exploratory means of evaluating mapped distributions of landforms and/or
events. A new variable resolution approach is introduced and tested alongside
more traditional approaches of nearest neighbour distance and quadrat analysis
and against another geocomputational approach, the K function. The results
demonstrate that firstly, the geocomputational paradigm allows new and more
useful solutions to be found for old problems. Secondly, a variable resolution
approach to geographical data analysis goes some way towards overcoming the
problem of scale inherent in such analyses. Finally, the technique facilitates
spatio-temporal analyses of event data, such as landslides, thus offering new
lines of enquiry in areas such as hazard mitigation.[back to the publication list]
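For readers unfamiliar with the more traditional approaches mentioned, the nearest neighbour index (observed mean nearest-neighbour distance divided by that expected under complete spatial randomness) takes only a few lines; this is the textbook statistic on synthetic points, not the paper's variable resolution method, and it omits edge correction.

```python
# Textbook nearest neighbour index: ratio of the observed mean nearest-
# neighbour distance to that expected under complete spatial randomness
# (R < 1 suggests clustering, R > 1 dispersion). Synthetic points only.
import numpy as np

rng = np.random.default_rng(4)
pts = rng.uniform(0, 1000, size=(200, 2))             # events in a 1 km square
area = 1000.0 * 1000.0

d = np.hypot(pts[:, None, 0] - pts[None, :, 0], pts[:, None, 1] - pts[None, :, 1])
np.fill_diagonal(d, np.inf)                           # ignore self-distances
observed = d.min(axis=1).mean()                       # mean nearest-neighbour distance
expected = 0.5 / np.sqrt(len(pts) / area)             # expectation under CSR
print(f"R = {observed / expected:.3f}")
```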
Brimicombe, A.J. (2000) "Constructing and evaluating contextual indices using GIS: a case of primary school performance" Environment & Planning A 32: 1909-1933
The current
political agenda has a firm focus on primary school education as one, among a
number of critical public services, that determine electorate opinion.
Performance tables which emphasise aggregate examination scores have become an
entrenched feature of the educational landscape for parents, teachers and
policy makers. Yet it is widely accepted that these types of tables of
aggregate examination scores provide a problematic, even flawed guide to the
performance of schools. Given the recognised broad link between pupil
performance and the social and economic environment in which they live and are
brought up, there is continued interest in being able to contextualise school
examination scores so as to better reflect relative achievement. Inequalities
are inherently spatial phenomena and with the use of census-based indices to
measure them, it is not surprising that geographical information systems (GIS)
are increasingly being used in the task of creating contextual measures. This
paper explores a methodology for creating and analysing a contextual index of
ambient disadvantage centred on robust normalisation of data and is illustrated
using census variables, pupil numbers and test scores for 3687 primary schools
in the north of England. Relevant census variables are interpolated using
ordinary kriging with an element of smoothing so as to simulate to some extent
the effect of school catchment areas. Key features of using robust
normalisation are that variable weights can be tested and the internal level of
support for an index, weighted absolute deviation, can be calculated and
mapped. This latter provides a quality measure for an index. The methodology is
critically assessed in relation to other recent approaches.[back to the publication list]
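The paper gives its own robust normalisation and weighted absolute deviation formulas; the sketch below shows one common robust scaling (median and interquartile range) combined into a weighted index with a simple spread measure, as a plausible reading of the general idea rather than the published definitions.

```python
# Plausible reading of a robustly normalised, weighted contextual index:
# scale each census variable by median and IQR, then combine with weights.
# This mirrors the general idea, not the paper's published formulas.
import numpy as np

rng = np.random.default_rng(5)
# Hypothetical census variables per school catchment (rows = schools)
variables = {
    "unemployment": rng.normal(8, 3, 100),
    "overcrowding": rng.normal(5, 2, 100),
    "no_car":       rng.normal(30, 10, 100),
}
weights = {"unemployment": 0.4, "overcrowding": 0.3, "no_car": 0.3}

def robust_z(x):
    """Median/IQR scaling: less sensitive to outliers than mean/std."""
    q1, med, q3 = np.percentile(x, [25, 50, 75])
    return (x - med) / (q3 - q1)

scaled = {k: robust_z(v) for k, v in variables.items()}
index = sum(weights[k] * scaled[k] for k in variables)
deviation = sum(weights[k] * np.abs(scaled[k] - index) for k in variables)

print("ambient disadvantage index, first 5 schools:", np.round(index[:5], 2))
print("weighted absolute deviation (index support):", np.round(deviation[:5], 2))
```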
Brimicombe, A.J. (2000) "Encoding expert opinion in geo-information systems: a fuzzy set solution" Transactions in International Land Management 1: 105-121
Since much of the relevant
information will be derived from digital spatial data, geo-information systems
(GIS) and related IT will be key tools. An area where these tools currently
perform poorly is in the encoding, storage and analytical handling of expert
opinion or linguistic statements regarding the data content of GIS. This paper
proposes a solution - fuzzy expectation (E). Fuzzy expectation is derived
from a small number of stylised fuzzy sets which, using an intuitive
probability interface, are building blocks for 'translating' expert opinion
into a compact fuzzy set representation. Once encoded, expert opinion about the
data is embedded in the data structure and can be combined and propagated
through GIS analyses such as overlay. A worked example in a land management
context is provided as a means of illustrating the implementation of E.[back to the publication list]
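The stylised fuzzy sets behind fuzzy expectation are defined in the paper itself; purely to illustrate how a linguistic statement might be encoded as a compact fuzzy set and combined through overlay, here is a small sketch with invented trapezoidal memberships and a minimum operator for the fuzzy AND.

```python
# Illustration only: encode linguistic confidence statements as trapezoidal
# fuzzy sets on [0, 1] and combine two layers' memberships in an overlay with
# a fuzzy AND (minimum). The paper's stylised sets differ in detail.
import numpy as np

x = np.linspace(0.0, 1.0, 101)   # degree-of-belief axis

def trapezoid(x, a, b, c, d):
    """Trapezoidal membership function rising a..b, flat b..c, falling c..d."""
    return np.clip(np.minimum((x - a) / (b - a), (d - x) / (d - c)), 0.0, 1.0)

# Invented 'stylised' sets for two expert statements about data reliability
probably_reliable = trapezoid(x, 0.5, 0.7, 0.9, 1.0)
possibly_reliable = trapezoid(x, 0.2, 0.4, 0.6, 0.8)

combined = np.minimum(probably_reliable, possibly_reliable)   # fuzzy AND in overlay
support = x[combined > 0]
print(f"combined support: {support.min():.2f}-{support.max():.2f}")
print(f"peak membership after overlay: {combined.max():.2f}")
```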
Brimicombe, A.J. (2000) "GIS as science, engineering and technology - do we have a model?" Proceedings EUGISES 2000, Budapest (CD).
Pertinent issues of geographical information systems as
technology, science and engineering are revisited sequentially using the
metaphor of the peg, clothes hanger and valet stand. These three organisational
phases map out the development of the GI domain and in particular the emergence
of a science. The view put forward in the paper is that we are on the threshold
of the third phase when a mature science emerges to influence both the
technology and engineering aspects. From an educational perspective, a clearer
view of the domain and its constituent parts is required. A linguistic model is
proposed for wider discussion, that attempts to bring together the identities
of GI as science, engineering and technology and give coherence to educational
programmes.[back to the publication list]
Brimicombe, A.J. (1999) "Small may be beautiful - but is simple sufficient?" Geographical and Environmental Modelling 3: 9-33
The title of this paper is inspired by
two trends in spatial analysis: a shift in perspective from global to local,
and the growing sophistication of analytical techniques employed to do so.
While simple techniques tend to be trivialized, they are robust without being
brittle. Thus, this paper proposes a modified, robust tool for exploratory
spatial data analysis - the normalized boxplot - alongside other robust
measures of distribution. This tool can be used both to explore the presence of spatial non-stationarity at the global level and to recognize zones at the local level within which adequate spatial stationarity exists to develop meaningful
hypotheses concerning causal relationships. The method is used in a case study
of limiting long-term illness in 595 wards in North-East England. The analysis
results, through the detection of spatial non-stationarity, in a spatial
partitioning of the area into five locality types for which different
relationships for the possible explanatory variables exist. Thus, the
detection, description and explanation of spatial differentiation at the local
level is clearly an important goal of spatial analysis, but as is shown through
tools such as normalized boxplots, some simple, safe and easily understandable
approaches are adequate to the task.[back to the publication list]
Brimicombe, A.J. (1999) "Geographical information systems and environmental modelling for sustainable development" in Land Reform for Sustainable Development (ed. Dixon-Gough), Ashgate Publishing, Aldershot: 77-92
This paper
focuses on land use planning issues of sustainable development in the face of
escalating fatalities and economic losses due to naturally occurring
environmental hazards. Incremental land use change and urban development can
adversely affect the magnitude and frequency of geohazards resulting in greater
risk for future generations. Hong Kong is cited as an example. Geographic
Information Systems (GIS) are proving an effective tool in land use planning.
In evaluating geo-hazards and in making decisions about the relative merits of
structural and non-structural solutions, planning teams need to engage in
environmental modelling. GIS can be used for 'spatial coexistence' and
'source-pathway characterization' modelling of geohazards. Particular emphasis
is given to the construction of spatial decision support systems. GIS, however,
are not a panacea. Some of the drawbacks in the context of land use planning
for sustainable development are discussed.[back to the publication list]
Li, Y.; Brimicombe, A.J. and Ralphs, M. (1999) "Sensitivity to spatial data quality in numerical modelling coastal oil spill" in Proceedings GISRUK'99, Southampton, Vol. 7: 104-107
The simulation of coastal oil
spills is an important form of environmental modelling due to the seriousness
of both physical and social-economical impacts of such spills. Spatial data
quality has come to be recognised as a critical issue in coastal oil spill
modelling and is beginning to attract the attention of developers and users
alike. The initial focus of this research has been on the influences of spatial
data quality on the hydrodynamic modelling (which is treated as a grey box)
with a consideration of knock-on effects for trajectory and fates modelling. A
geocomputational approach has therefore been adopted whereby the use of data
exploration software, a statistical package, geostatistics, fractal software,
and visualisation software are coupled using GIS as a hub for data storage and
handling. The sensitivity to a range of data qualities for shoreline
representation, seabed bathymetry, sampling strategies, tidal data and so on
can thus be systematically researched. A synthetic modelling approach is
consequently being employed for the next stage of the research and for the
spatial data quality tests presented in this paper. In this way the data in the
experiments can be carefully and systematically controlled. Error propagation
analysis offers much more information than error measurement. An overall model is derived for spatial data quality propagation given that the hydrodynamic model is a "grey box". Error propagation techniques, such as the Monte Carlo method, are suitable for the complex, dynamic, distributed models utilised in hydrodynamic
modelling. By studying the effects of spatial data quality on numerical
modelling of coastal oil spills, optimum strategies for managing and improving
the spatial data quality using appropriate surveying techniques can be
explored.[back to the publication list]
Chen, X.H.; Brimicombe, A.J. and Hu, R. (1999) "Forecasting of flood and sediment with an improved time-variant diffusive model" Hydrological Sciences Journal 44: 583-595
Floods are a significant barrier in the exploitation and
management of water resources. Hydrological forecasting is an important tool
for flood control and can provide the basis for dealing with the conflict
between flood control and water resources. This paper proposes a hydrological
forecasting method for rivers by deriving an energy equation which describes
the conservation of potential slope, kinetic slope, and friction slope of a
river reach. The equation contains an additional term, the rate of change of
stage, which has a significant effect on the propagation of flood waves, as
compared with the Saint-Venant momentum equation. The energy equation has been
found to provide more accurate results in the hydraulic calculation of the Han
River, China, than the Saint-Venant momentum equation. Based on this energy
equation, an improved diffusive model (IDM) was derived. By modelling the flood
process for the Yellow River, China, the IDM was compared with the classical
diffusive model (CDM) showing that the IDM gives better results. The time
series form of the IDM with time-variant parameters was used for real-time
hydrological forecasting of the Yellow River. The recursive ordinary
least-squares (ROLS) approach, improved by introducing an effective matrix, was
used for on-line identification of parameters. The model presented in this
paper was found to have the ability to model the nonlinear and time-variant
behaviours of the hydrological process.[back to the publication list]
Chen, X.H.; Brimicombe, A.J.; Whiting, B. and Wheeler, C. (1998) "Enhancement of three-dimensional reservoir quality modelling by Geographic Information Systems" Geographical and Environmental Modelling 2: 125-139
Two- and three-dimensional (3D) mathematical modelling of the
hydro-dynamics and pollution distributions in large water bodies is generally
used on a grid with a relatively large cell size due to hardware limitations.
Geographic information systems (GIS) have advanced capability for interpolation
and visualization. Although GIS have been used for direct modelling, they are
not designed for independent 3D modelling of large water bodies. This paper
loosely couples a grid-based GIS with a 3D reservoir water quality model
developed by the authors. Using this 3D model which is written in FORTRAN and
run on a Pentium-133 computer, the 3D flow fields and distribution of Fe and Mn
for the Arha Reservoir, China, are computed on a coarse grid with a cell
size of 50m square. Based on these results, a GIS grid-cell kriging function is
used to obtain the distributions of Fe and Mn for the continuous water body on
a finer grid of 25m cells. The results are compared with a rerun of the 3D
reservoir quality model using the 25m grid, and this confirms the efficacy of
kriging in the 3D reservoir quality model. Even finer grids (2 m square cells) of Fe
and Mn concentrations in the area of the water intake can be obtained by this GIS
kriging, and these are suitable for a detailed study of the quality of the water
supply. The use of kriging in a grid-based GIS in this paper greatly enhanced
the 3D reservoir quality modelling.[back to the publication list]
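The sketch below illustrates the principle of kriging values from a coarse grid onto a finer grid, assuming ordinary kriging with a simple exponential variogram. It does not reproduce the GIS grid-cell kriging function or the variogram used in the paper; the grid sizes, variogram parameters and toy concentration field are all illustrative.

import numpy as np

def variogram(h, sill=1.0, rng_=100.0, nugget=0.0):
    # Assumed exponential semivariogram.
    return nugget + sill * (1.0 - np.exp(-h / rng_))

def ordinary_krige(xy_obs, z_obs, xy_new):
    n = len(xy_obs)
    d = np.linalg.norm(xy_obs[:, None, :] - xy_obs[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = variogram(d)
    A[n, n] = 0.0                                     # Lagrange multiplier row/column
    z_new = np.empty(len(xy_new))
    for i, p in enumerate(xy_new):
        b = np.ones(n + 1)
        b[:n] = variogram(np.linalg.norm(xy_obs - p, axis=1))
        w = np.linalg.solve(A, b)[:n]                 # kriging weights
        z_new[i] = w @ z_obs
    return z_new

# Toy usage: values on a 50 m grid kriged onto a 25 m grid.
xs, ys = np.meshgrid(np.arange(0, 200, 50), np.arange(0, 200, 50))
coarse_xy = np.column_stack([xs.ravel(), ys.ravel()])
coarse_z = np.sin(coarse_xy[:, 0] / 80.0) + 0.1 * coarse_xy[:, 1] / 100.0
fx, fy = np.meshgrid(np.arange(0, 200, 25), np.arange(0, 200, 25))
fine_z = ordinary_krige(coarse_xy, coarse_z, np.column_stack([fx.ravel(), fy.ravel()]))
print(fine_z.reshape(fx.shape).round(2))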
Li, Y.; Brimicombe,
A.J. and Ralphs, M. (1998) "Spatial data quality and coastal spill modelling" in Oil and Hydrocarbon Spills: Modelling, Analysis and Control (eds.
Garcia-Martinez & Brebbia), Computational Mechanics Publications,
Southampton: 53-62
A growing number of numerical models have
been applied to oil spill research, contingency planning and risk assessment.
Whilst model quality has been a critical theme in their development and
utilization, data quality in both quantitative and qualitative terms is also
critical to the fitness-for-use of the outputs from the oil spill modelling
process. Given the importance of such factors as shoreline and bathymetric
representation, data resolution and method of interpolation in modelling
coastal spills for example, it is essential that modelling of spatial data
quality is applied to hydrodynamic, trajectory and fates modelling. This paper
presents initial results of research in progress that will nevertheless
emphasize to modeller and manager alike the practical issues of spatial data
quality for coastal oil spill modelling. It is centred around a case study of
Jiao Zhou Bay in the People's Republic of China. After summarizing the issue of
spatial data quality in GIS, a taxonomy useful to oil spill modelling is
presented. Some strategies for managing the spatial data quality in oil spill
modelling are explored using GIS functionality for spatial data analysis and
geostatistical modelling.[back to the publication list]
Tsui, P. and Brimicombe, A.J.
(1998) "A conceptual framework of spatio-temporal process models for GIS: a
planning perspective" Proceedings XX1 FIG Congress, Vol. 3: 383-397
Current development of GIS applications has been hindered by the
lack of functionality to handle and analyse temporal geographical phenomena. A
fundamental obstacle in developing temporal GIS is the absence of comprehensive
and generalised conceptual models of spatio-temporal processes suitable for
implementation in a GIS environment. This paper proposes a conceptual framework
of spatio-temporal process models which can cover a wide range of temporal
geographical phenomena and integrate with spatio-temporal data models for GIS.
Finally, the use of this framework in modelling dynamic spatial processes in a
planning perspective is illustrated.[back to the publication list]
Brimicombe, A.J.
(1998) "A fuzzy co-ordinate system for locational uncertainty in space and
time", Innovations in GIS 5 (ed. Carver), Taylor & Francis:
143-152
Uncertainty is an unavoidable characteristic of
thematic maps derived from digital spatial data. Boundaries drawn as pixel-thin
lines to universally represent sharp, gradual or vague changes in a theme are
an inevitable consequence of the lack of stored data with which to interpret
these boundaries otherwise. Although some recent research has focused on the
uncertainty of boundaries deriving from the purity of the polygons on either
side, there is scope for explicitly recording the locational extent of
uncertainty for individual points and lines that make up polygon boundaries.
Fuzzy numbers are introduced and used to construct a fuzzy co-ordinate system
in two-, three- and four-dimensions. The paper discusses the geometry of fuzzy
lines and their inclusion within the traditional GIS topological data
structure. By using a C language struct, the overhead of including data on
boundary uncertainty may be minimal. Fuzzy number co-ordinate geometry offers
opportunities in GIS applications where boundaries are inherently inexact to
have more representative data that better reflect the true situation.[back to the publication list]
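The sketch below is a Python analogue of the compact per-coordinate record the paper achieves with a C struct. A symmetric triangular fuzzy number (core value plus spread) is assumed purely for illustration; the paper's exact fuzzy-number representation may differ.

from dataclasses import dataclass

@dataclass
class FuzzyNumber:
    value: float      # core (most plausible) value
    spread: float     # half-width of the support, i.e. the locational uncertainty

    def membership(self, x: float) -> float:
        # Triangular membership: 1 at the core, falling to 0 at the support edges.
        if self.spread == 0.0:
            return 1.0 if x == self.value else 0.0
        return max(0.0, 1.0 - abs(x - self.value) / self.spread)

@dataclass
class FuzzyCoordinate2D:
    x: FuzzyNumber
    y: FuzzyNumber

# Usage: a boundary vertex known only to within roughly +/- 2.5 m in each direction.
vertex = FuzzyCoordinate2D(FuzzyNumber(815230.0, 2.5), FuzzyNumber(819410.0, 2.5))
print(vertex.x.membership(815231.0))   # 0.6 -> plausibly part of the boundary

Because only one extra value per ordinate is stored, the storage overhead relative to crisp coordinates stays small, which is the point made in the paper about the C struct.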
Tsui, P. and Brimicombe, A.J. (1997) "Hierarchical
tessellations model and its use in spatial analysis" Transactions in GIS
2: 267-279
The hierarchical tessellation model is a class of
spatial data models based on recursive decomposition of space. The quadtree is one
such tessellation and is characterised by square cells and a 1:4 decomposition
ratio. To relax these constraints in tessellation, a generalised hierarchical
tessellation data model, called Adaptive Recursive Tessellations (ART), has
been proposed. ART increases flexibility in tessellation by the use of
rectangular cells and variable decomposition ratios. In ART, users can specify
cell sizes which are intuitively meaningful to their applications, or can
reflect the scales of data. ART is implemented in a data structure, called
Adaptive Recursive Run-Encoding (ARRE), which is a variant of two-dimensional
run-encoding whose running path can vary with different tessellation structures
of the ART model. Given the recognised benefits of implementing statistical
spatial analysis in GIS, the use of hierarchical tessellation models, such as
ART, in spatial analysis is discussed. Firstly, ART can be applied to the
quadrat-size problem in quadrat analysis of point patterns by using
variable-size quadrats. ART can also act as the data model in a
variable-resolution block kriging technique for geostatistical data to reduce
variation in the kriging error. Finally, the ART model can facilitate the
evaluation of spatial autocorrelation for areal data at multiple map
resolutions, and the construction of a connectivity matrix for calculating
spatial autocorrelation indices based on ARRE is also illustrated.[back to the publication list]
Tsui, P. and Brimicombe, A.J. (1997) "Adaptive recursive tessellations (ART) for Geographical Information Systems"
International Journal of Geographical Information Science 11:
247-263
Adaptive Recursive Tessellations (ART) is a
conceptual and generalised framework for a series of hierarchical tessellation
models characterised by a variable decomposition ratio and rectangular cells.
ART offers more flexibility in cell size and shape than the quadtree which is
constrained by its fixed 1:4 decomposition ratio and square cells. Thus the
variable resolution storage characteristic of the hierarchical tessellations
can be fully utilised. A data structure for the implementation of the ART,
called Adaptive Recursive Run-Encoding (ARRE), is proposed. A spatial
database management system specifically for ART, the Tessellation Manager, is
then constructed based on the ARRE. A space efficiency analysis of three ART
models is conducted using the Tessellation Manager. The results show that ART
models have space efficiency similar to that of the quadtree model. ART also has many
potential applications in GIS and is suitable as a spatial data model for
raster GIS.[back to the publication list]
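The sketch below illustrates the core ART idea of a recursive decomposition with a user-specified (rows, columns) ratio at each level, merging homogeneous blocks into single large cells. It assumes the ratios divide the grid evenly and uses a simple quadtree-style homogeneity test; the ARRE run-encoding and the Tessellation Manager are not reproduced.

import numpy as np

def tessellate(grid, ratios, origin=(0, 0)):
    # Recursively decompose a raster; return (origin, shape, value) leaf cells.
    if grid.size == 0:
        return []
    if np.all(grid == grid.flat[0]):          # homogeneous block: keep one large cell
        return [(origin, grid.shape, grid.flat[0])]
    if not ratios:                            # no more levels: emit unit cells
        return [((origin[0] + i, origin[1] + j), (1, 1), grid[i, j])
                for i in range(grid.shape[0]) for j in range(grid.shape[1])]
    r, c = ratios[0]                          # this level's decomposition ratio
    h, w = grid.shape[0] // r, grid.shape[1] // c
    leaves = []
    for i in range(r):
        for j in range(c):
            sub = grid[i * h:(i + 1) * h, j * w:(j + 1) * w]
            leaves += tessellate(sub, ratios[1:], (origin[0] + i * h, origin[1] + j * w))
    return leaves

# Usage: a 12 x 8 raster decomposed 3 x 2 at level 1 and 2 x 2 at level 2.
raster = np.zeros((12, 8), dtype=int)
raster[6:, 4:] = 1
print(len(tessellate(raster, [(3, 2), (2, 2)])))   # 9 leaf cells rather than 96 unit cells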
Brimicombe, A.J. (1997) "A universal
translator of linguistic hedges for the handling of uncertainty and
fitness-for-use in Geographical Information Systems" in Innovations in GIS
4 (ed. Kemp), Taylor & Francis : 115-126
Spatial data
quality has been attracting much interest. Much of the problem lies in the
degree to which current data structures are unable to model the real world and
the way imperfections in the data may propagate during analyses and cast doubt
on the validity of the outcomes. Much of the research has concentrated on the
quantitative accuracy of spatial data, the derivation of indices and their
propagation through analyses. Geographical data invariably includes an element
of interpretation for which linguistic hedges of uncertainty may be generated.
The paper presents a new technique of handling such expressions in a GIS
through fuzzy expectation - intuitive probabilities linked to stylized fuzzy
sets. This can be achieved without adversely affecting the size of the
database. By using fuzzy expectation as linguistic building blocks, many of the
difficulties in using fuzzy set descriptors in GIS have been overcome. The
stylized fuzzy sets can be propagated using Boolean operators to give a
resultant fuzzy set which can be 'translated' back into a linguistic quality
statement. For the first time, linguistic criteria of fitness-for-use can be
derived for GIS outputs regardless of the language being used.[back to the publication list]
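The sketch below illustrates the general mechanism of propagating linguistic hedges through Boolean overlay operators and translating the result back into a hedge. The hedge vocabulary, its mapping to membership grades, and the use of min/max for AND/OR are assumptions for illustration only; the stylized fuzzy sets of fuzzy expectation are not reproduced here.

HEDGES = {                    # hypothetical hedge -> membership grade mapping
    "certain": 1.0,
    "probable": 0.8,
    "possible": 0.5,
    "doubtful": 0.2,
    "impossible": 0.0,
}

def to_hedge(mu: float) -> str:
    # 'Translate' a membership grade back to the nearest linguistic hedge.
    return min(HEDGES, key=lambda h: abs(HEDGES[h] - mu))

def fuzzy_and(*hedges: str) -> str:
    return to_hedge(min(HEDGES[h] for h in hedges))   # Boolean AND -> min

def fuzzy_or(*hedges: str) -> str:
    return to_hedge(max(HEDGES[h] for h in hedges))   # Boolean OR -> max

# Usage: intersecting a layer interpreted as "probable" with one interpreted as "possible".
print(fuzzy_and("probable", "possible"))   # -> "possible"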
Chen, X.H. and Brimicombe, A.J. (1997) "Temporal and
spatial variations of Fe and Mn in Arha Reservoir" Water Pollution IV,
Computational Mechanics Publications, Southampton: 95-104
Hydrological forecasting can provide the basis for dealing with the conflict
between flood control and water use. This paper proposes a flood and sediment
forecasting method for rivers by deriving an energy equation which describes
the conservation of potential slope, kinetic slope, and friction slope of a
river reach. The equation contains an additional term, the rate of change of
stage, which has a significant effect on the propagation of flood waves, as
compared with the Saint-Venant momentum equation. The energy equation has been
found to provide more accurate results in the hydraulic calculation of the Han
River, China, than the Saint-Venant momentum equation. Based on this energy
equation, an improved diffusive model (IDM) was derived. By modelling the flood
processes for the Yellow River, China, the IDM was compared with the classical
diffusive model (CDM) showing that the IDM gives better results. The time
series form of IDM with time-variant parameters was used for real-time flood
and sediment forecasting of the Yellow River. The recursive ordinary
least-squares (ROLS) approach, improved by introducing an effective matrix, was
used for on-line identification of parameters.[back
to the publication list]
Brimicombe,
A.J. and Bartlett, J. (1996) "Linking geographic information systems with
hydraulic simulation modeling for flood risk assessment: the Hong Kong
approach" in GIS and Environmental Modelling: Progress and Research
Issues (eds. Goodchild et al.), GIS World Inc.: 165-168
Hong Kong's northern lowland basins have undergone substantial urban and
sub-urban development over a 20 year period. This has been associated with
worsening recurrent flood problems. An approach has been adopted whereby
hydraulic modeling has been used in conjunction with geographic information
systems (GIS) to produce 1:5,000 scale Basin Management Plans. GIS has a dual
role: in data integration and quantification as an input to hydraulic modeling;
in data interpolation, visualization and assessment of flood hazard and flood
risk using the outputs from the hydraulic modeling. By using current land use
and various development scenarios to be modeled over a range of rainstorm
events, 'what if' decision support can be used in devising Basin Management
Plans. Linking with hydraulic modeling requires a different approach to GIS
data modeling than with the more traditional linkage with hydrological modeling
only. The methodology developed in Hong Kong is presented as a case study.[back to the publication list]
Li, Y. (1996) "The study and development of oil spill model
for Jiao Zhou Bay" Environmental Protection in Communications, 6:
26-38
Oil transport in Jiao Zhou Bay is considerable
because several large oil export terminals are located within the bay. Around
Jiao Zhou Bay there are dense residential areas, important economic zones and
rich tourist resources, to which an oil spill could cause serious damage. In
this paper an oil spill model is studied and developed for Jiao Zhou Bay and
its adjacent waters. Hydrodynamic modelling is carried out using the finite
element method and is further locally calibrated for accuracy. Trajectory and
fate modelling is then carried out to simulate oil dispersion and weathering
processes. Various data are input at different phases of the modelling, most of
which are dynamic and distributed. A geographic information system (GIS) has a
crucial role in this study for data integration, manipulation and display. The
GIS and the oil spill model are fully coupled for ease of use. This oil spill
model can efficiently support the contingency plan and emergency response
system for Qing Dao Port, which is located in Jiao Zhou Bay.[back to the publication
list]
Tsui, P. and Brimicombe, A.J.
(1996) "Hierarchical tessellations model and its use in spatial analysis"
First International Conference on Geocomputation, Leeds, Vol2:
815-825
Quadtree, a classical hierarchical tessellation
model, has been widely adopted as a GIS spatial data model for its ability to
compress raster data. Nevertheless, its tessellation structure is rigid, and a
generalised hierarchical tessellation model, called Adaptive Recursive
Tessellations (ART), is therefore proposed. It has a more flexible tessellation structure
than quadtree as a result of the use of rectangular cells and variable
decomposition ratios. Thus users can assign specific sizes of cells to
different levels of an ART model which are intuitively useful or meaningful to
their applications, or can reflect the scales of data. ART is represented by a
special type of two-dimensional run-encoding, Adaptive Recursive Run-Encoding
(ARRE), whose running path can vary with the tessellation structure of an
individual ART model. The benefits of implementing statistical spatial analysis
in GIS have been recognised. As a spatial data model, the use of hierarchical
tessellation models in spatial analysis is discussed in this paper. The
formulae for mean and variance statistics in terms of addresses of ARRE are
stated. A connectivity matrix which is an essential element in calculating
spatial autocorrelation can also be constructed based on the ARRE. Finally, the
application of the ART model in addressing the modifiable areal unit problem is
discussed. The flexibility of the tessellation structure and the ART model's
inherent capability for spatial aggregation have been found useful in studying
scale and aggregation effects of areal units on spatial analysis.[back to the publication list]
Brimicombe, A.J. and Yeung, D. (1995) "An object oriented
approach to spatially inexact socio-cultural data" Proceedings 4th
International Conference on Computers in Urban Planning & Urban
Management, Melbourne, Vol 2: 519-530
Culture, as a
system of shared beliefs within a society, has an important influence on the
use of land and attitudes towards the environment. Socio-cultural data
describing the spatial manifestations and influences of culture should
therefore be fundamental to the processes of planning and conservation.
Socio-cultural data, however, are inherently inexact and topology may not be
strictly determined by geometric adjacency. Thus geographical information
systems currently used by planners are largely unable to represent
socio-cultural phenomena, leading to their exclusion from many analyses and hence
from much of the decision-making process. This paper argues for an
object-oriented approach in which socio-cultural objects and classes of
phenomena can be modelled initially as a-spatial. This is illustrated using a
Hong Kong example of the influence of traditional landscape beliefs (feng shui)
on village layout. Location and extent, as attributes of objects, can then be
defined using raster and/or vector in absolute or relative space. The outcome
of this approach provides a new GIS perspective of space, place and
landscape.[back to the publication list]
Hansen, A.; Brimicombe, A.J.; Franks, C;
Kirk, P. and Tung, F. (1995) "Application of GIS to hazard assessment, with
particular reference to landslides in Hong Kong" in Geographical Information
Systems in Assessing Natural Hazards (eds. Carrara & Guzzetti), Kluwer,
Netherlands: 273-298
This paper reviews some of the factors
relevant to the implementation of various GIS facilities with respect to
terrain-related hazards prevailing in Hong Kong, with particular reference to
landslides. Examples of the current state of practice are described, together
with some considerations for future developments with regard to landslide
hazard assessment and emergency response. Although Hong Kong faces the combined
hazards of landslides, flooding, tidal surges and typhoon winds, the emphasis of
this paper is placed on the approach to reducing landslide risks to the public.
The approach is discussed within the framework of hazard mitigation (prevention
and preparedness), disaster response and to a lesser extent, recovery. The key
programmes within the Geotechnical Engineering Office of the Hong Kong
Government are given as case studies. These systems continue to be developed to be
more responsive to the pressing needs of urban development and to an increasing
community awareness of the hazards of slope failures.[back to the publication list]
Brimicombe, A.J. (1994) "Uncertainty and fitness-for-use: a
research and implementation paradigm" Proceedings FIG XX Congress,
Melbourne, Vol. 3: 374-381
Spatial data are often inexact
with levels of inherent and perceived uncertainty. Studying local, regional and
global change using geo-information systems requires a paradigm for managing
uncertainty and assessing fitness-for-use in the specific context of an
application. A model is presented which provides the basis for recording
uncertainty, propagation through data transformation and assessment of
fitness-for-use either through direct visualisation or through sensitivity
analysis.[back to the publication list]
Law, J. and Brimicombe, A.J. (1994) "Integrating primary and secondary data sources in land information systems"
Proceedings FIG XX Congress, Melbourne, Vol. 3: 478-489
Spatial databases usually include significant amounts of
secondary data. This is inevitable as database creation proceeds from existing
maps and plans rather than carrying out new field surveys. In subsequent
updating cycles, change is detected and recorded using primary data collection
techniques. The paper presents current research on resolving problems which
arise when integrating primary and secondary spatial data within a layer and
ensuring consistency and compatibility with other layers in a Land Information
System.[back to the publication list]
Ashworth, J.; Tang, W. and Brimicombe, A.J.
(1993) "Hong Kong: safeguarding ecology in a dynamic environment" Mapping
Awareness 7(6): 34-36
Rapid urbanisation presents a
challenge to those concerned with nature conservation in Hong Kong. World Wide
Fund for Nature Hong Kong is using GIS technology to produce an Ecological
Information System for the Territory to help ensure that future land
development is carried out in an environmentally sensitive way.[back to the publication list]
Brimicombe, A.J. (1993) "Combining positional and attribute
uncertainty using fuzzy expectation in a GIS" Proceedings GIS/LIS'93,
Minneapolis, Vol. 1: 72-81
Spatial data of natural resources
and other aspects of the environment are frequently inexact with levels of
inherent and perceived uncertainty. The literature mostly treats positional and
attribute uncertainty separately. A model has been developed which provides the
basis for recording uncertainty, for propagation of uncertainty during data
transformation and for assessing fitness-for-use of outcomes. In the context of
this model, a metric - fuzzy expectation - has been developed as an extension
of fuzzy sets and fuzzy numbers into a coordinate system and linguistic
uncertainty descriptor. Using this metric, thematic uncertainty can be modeled
over vague boundaries.[back to the publication
list]
Brimicombe, A.J. and Tsui, P.
(1993) "Adaptive recursive tessellations: a versatile approach to data modeling
for GIS applications in planning and engineering" Third International Conference
on Computers in Urban Planning and Urban Management, Atlanta, Vol. 1:
435-450
Physical planning and engineering feasibility studies
progress through stages of refinement from initial site search and selection to
detailed zoning or layout proposals. Whilst GIS is an appropriate tool, current
data models do not easily support the refinement process. Adaptive recursive
tessellations (ART) are raster-based models developed to intuitively match the
usual approach to the task. Specific features include the ability to: define
appropriate levels of resolution by varying the decomposition ratio; conform to
existing map sheet series; locally increase the resolution of a layer as more
refined data becomes available; segment the database over a distributed network
between Planning Offices. A tessellation management system (TMS) has been
developed to create and manage ARTs according to planners' requirements. Data
input, through the TMS, can be either in vector or raster. The data structure
should allow for efficient analytical operations such as overlay, Boolean
mapping and multicriteria evaluation.[back to the
publication list]
Tsui, P. and
Brimicombe, A.J. (1993) "Adaptive recursive tessellations: an approach to data
modeling in civil engineering feasibility studies" 3rd International
Workshop on GIS, Beijing, Vol. 1: 41-55
Practical
approaches to civil engineering pre-feasibility and feasibility studies
progress through stages of refinement from initial site search and selection to
outline design. Whilst GIS is an appropriate tool, current data models do not
easily support the refinement process. Adaptive recursive tessellations (ART)
are raster-based models developed to intuitively match the usual approach to
the task. Specific features include the ability to: define appropriate levels
of resolution by varying the decomposition ratio; conform to existing map sheet
series; locally increase the resolution of a layer as more refined data becomes
available; segment the database, if necessary, over a distributed network
between site or regional offices. A tessellation manager (TM) has been
developed to create and manage ARTs according to project requirements. The data
structure should allow for efficient analytical operations usually used in
opportunities and constraints mapping and site selection.[back to the publication list]
Brimicombe, A.J. and Bartlett, J. (1993) "Spatial decision
support in flood hazard and flood risk assessment: a Hong Kong case study"
3rd International Workshop on GIS, Beijing, Vol. 2: 173-182
Hong Kong's northern New Territories has undergone rapid urban
and sub-urban development which has worsened recurrent flooding problems. An
approach has been adopted in Hong Kong whereby hydraulic modelling has been
used in conjunction with geographic information systems to produce 1:5,000
scale Basin Management Plans. GIS has a dual role: in data integration and
quantification as an input to hydraulic modelling; in data interpolation,
visualisation and assessment of flood hazard and flood risk using the outputs
from the hydraulic modelling. By using current land use and various development
scenarios to be modelled over a range of rainstorm events, 'what if' decision
support can be used in the design of the Basin Management Plans. Linking with
hydraulic modelling requires a different GIS approach than with the more
traditional linkage with hydrological modelling. The methodology developed in
Hong Kong is presented as a case study.[back to the
publication list]
Tang, W.; Ashworth,
J. and Brimicombe, A.J. (1993) "Implementation of the Hong Kong ecological
information system" 3rd International Workshop on GIS, Beijing, Vol.
2
Despite its small area, Hong Kong has a very diverse flora
and fauna which is continually under threat both from the relentless expansion
of urban areas and associated infrastructure and from the increasingly heavy
demands for recreational outlets. Whilst the ecology of Hong Kong has been
studied and documented, the information is generally unpublished and scattered
amongst local researchers and naturalists. In order to have a more centralised
information base with which to direct conservation efforts and review impacts
of new developments, the World Wide Fund for Nature Hong Kong (WWFN HK) has
adopted Geographic Information System technology. A vegetation layer,
interpreted by WWFN HK from aerial photos, relevant conservation and
administrative boundary layers and climatic summary layers are now complete.
Topographic, drainage, roads and ecological information layers are partially
complete. A 1:50,000 scale colour map and handbook are being prepared for
printing and wide dissemination. Use of the database for environmental impact
assessment and research has already begun.[back to
the publication list]
Brimicombe,
A.J. (1992) "Responsibility, Uncertainty and Fitness-for-use: Implications
for Geo-Information Theory" International Archives of Photogrammetry and
Remote Sensing, Vol. 29 B3: 759-767
The accuracy and level of
uncertainty in GIS products has been a growing concern within the user
community. The issue of uncertainty in GIS is explored and current strategies
for the measurement, modelling and management of error are reviewed. Current
approaches have significant limitations for imprecise spatial data. A general
model is proposed as a basis for developing workable solutions. An example of
how the model operates is given. Implications are that uncertainty measures
should be embedded within GIS data rather than applied as global measures.
Users must take responsibility for assessing fitness-for-use of their data for
the particular context of an analysis.[back to the
publication list]
Tsui, P. and
Brimicombe, A.J. (1992) "Adaptive Recursive Tessellations" Proceedings
GIS/LIS'92, San Jose, Vol. 2: 777-786.
The quadtree, a
recursive tessellation data model, can compress the volume of raster data by
representing a large area of the same characteristic with a single larger cell
instead of a vast number of small cells. However, the deficiencies of the
quadtree are its ubiquitous use of square cells and its fixed decomposition
ratio (1:4). These make the quadtree a very inflexible data model, since the
sizes of cells at different levels are fixed once the dimensions of the whole
area are known. A new Adaptive Recursive Tessellation (ART) data model, which
allows the use of rectangular cells and variable decomposition ratios, is
presented. ART offers users much greater flexibility in data modelling to cope
with the needs of different applications. A modified two-dimensional
run-encoding technique is also implemented on ART to further reduce the storage
volume. In order to construct and update an ART, a Tessellation Management
System (TMS) has been developed.[back to the
publication list]
Brimicombe, A.J.
(1992) "Flood risk assessment using spatial decision support systems"
Simulation 59: 379-380.
Urbanizing drainage basins are
complex, dynamic man-environment systems. All too frequently the carefully
balanced rural coexistence with the river and its floodplain, developed by
trial and error over many years (often centuries), is severely disrupted by the
geomorphic response to urban development within a drainage basin. Flooding can
become both more frequent and more severe in terms of depth and duration. The
consequences can be costly with disrupted livelihoods, damage to property and
essential services and even loss of life. The integration of geographical
information systems (GIS) and hydraulic simulation of flood events for
different development scenarios is proving an effective tool in planning and
operational management.[back to the publication
list]
Brimicombe, A.J. (1989) "Commercialisation: a south-east Asian viewpoint" International Journal of
Remote Sensing 10: 429-430
Some south-east Asian
fundamentals of commercialisation of remote sensing - namely attitudes, open
skies, products and accessibility - are discussed.[back to the publication list]
Brimicombe, A.J. (1989) "Boolean maps and the
rationalization of opportunities and constraints in physical planning"
Proceedings International Conference on Computers in Urban Planning and
Urban Management, Hong Kong : 13-22
The recognition of
opportunities and constraints offered by the physical landscape and its cover
is a necessary precursor to development and planning. Whilst aerial
photographic interpretation and a geomorphological evaluation of landform can
rapidly furnish most of the data, conclusions on opportunities and constraints
can be of a generalised nature. If the paper products instead form a digital
database, then Boolean mapping can provide a more rigorous and communicable
approach. Boolean maps are the outcome of overlay analyses incorporating
logical operators that allow the direction of analysis to be controlled so that
outcomes are rationalized for specific proposed land uses or activities.[back
to the publication list]
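As an illustration of Boolean overlay mapping, the sketch below combines synthetic terrain layers with logical operators to produce an opportunities-and-constraints map for one hypothetical proposed land use. The layer names and threshold values are invented for illustration only.

import numpy as np

rng = np.random.default_rng(1)
slope_deg   = rng.uniform(0, 40, size=(5, 5))   # synthetic slope layer (degrees)
flood_prone = rng.random((5, 5)) < 0.3          # synthetic flood-prone cells
landslide   = rng.random((5, 5)) < 0.2          # synthetic landslide-susceptible cells

# Boolean map for, say, low-rise residential development:
# suitable = gentle slope AND NOT flood-prone AND NOT landslide-susceptible.
suitable = (slope_deg < 15) & ~flood_prone & ~landslide

print(suitable.astype(int))   # 1 = opportunity, 0 = constraint

Changing the logical expression steers the same layers towards a different proposed land use, which is the rationalization the paper describes.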
Brimicombe,
A.J. (1987) "Geomorphology terrain evaluation for solid wastes disposal in
tropical and sub-tropical climates" in The Role of Geology in Urban
Development (ed. Whiteside), Geological Society, Hong Kong : 457-462
Two important considerations in formulating environmentally sound
policies towards solid wastes disposal are land use planning and engineering
feasibility. Regulated zoning based on land capability, political desirability
and other criteria is frequently adopted to resolve competing land uses, to
conserve valuable agricultural and recreational areas and to reserve prime
development land. If potential landfill sites are purposely sought, then strict
criteria of land suitability need to be met. However, the very nature of waste
disposal inevitably results in these sites being consigned to marginal lands
possessing environmental and engineering problems. A detailed evaluation of the
problems will be required for a sound assessment of engineering feasibility.
Emphasis in the literature is on the economic and social aspects of waste
management and, though generalised guidelines for the siting of disposal
facilities may be listed, too few authors consider how to assess the physical
setting in any detail. In the tropics and sub-tropics, increasing volumes of
waste are coincident with a frequent lack of information about the landscape
whilst high seasonal rainfall and landscape sensitivity produce special design
problems. Modern geomorphological methods of terrain evaluation using aerial
photography and other remote sensing imagery offer a cost-effective solution.
Landform is the key to interpreting materials and processes which can be
assessed in relation to the design criteria and design limitations for
landfills to arrive at decisions on land suitability that reflect engineering
feasibility. A terrain evaluation also highlights at an early stage the
problems and constraints that would have to be overcome through investigation
and design to ensure that sites are environmentally acceptable and can be
safely operated.[back to the publication
list]
Brimicombe, A.J. (1985) "Geomorphology: an aid to solid wastes management in the tropics" in Pollution in the Urban Environment (eds. Chan et al), Elsevier, London :
421-426
The problems which need to be solved for effective
solid waste management within the Tropics have long been recognised as being
different from those traditionally encountered in the industrialised nations of
the West. There are however, no universal solutions and "every city needs a
system tailor-made to its own .... environment" (Flintoff, 1984). All too often
the literature concentrates on the economic and social aspects of waste
management and though generalised guidelines for the siting of waste disposal
facilities may be listed, too few authors consider how to assess the physical
setting in any detail. This paper introduces geomorphology, the study of
landforms, as a cost-effective means of evaluating the physical environment of
the Tropics within the appropriate economic and social context for the
reconnaissance, feasibility and design stages of waste disposal facilities.[back to the publication list]
Brimicombe, A.J. (1984) "Computer-stored databases and the
analysis of superficial deposits" in Geology of Surficial Deposits in Hong
Kong (ed. Yim), Geological Society, Hong Kong : 19-24
The
two-stage study of superficial materials where a site evaluation is followed by
a site investigation represents a logical and planned progression of data
collection. At the end of each stage a synthesis is carried out to present the
data as useable information for engineers, planners and architects. This rarely
presents any great difficulty at the end of site evaluation as the data are
usually qualitative and two-dimensional. On completion of the site
investigation stage, however, geologists and geomorphologists are faced with
having to analyse and present large quantitative data sets that are
three-dimensional for superficial materials and frequently four-dimensional
(space and time) for groundwater studies. Manual procedures of drawing
cross-sections, isopachs or isometric block diagrams can be both tedious and
time consuming and therefore detrimental to the extent of analysis. The
computer-stored database approach described in this paper is a rapid means of
graphically analysing site investigation data in relation to site topography
and is an effective means of communicating the results.[back to the publication list]
Brimicombe, A.J. (1984) "Aerial photographic evidence for
environmental change in Hong Kong" in The Evolution of the East Asian
Environment (ed. Whyte), University of Hong Kong, Vol. I: 128-136
In 1948 Belcher stated the principle that "similar soils are
developed on similar slopes under the action of weathering of similar
materials". Thirty-five years later this principle, though slightly modified,
is still the basis for engineering site evaluation. Landforms created from
geologically similar materials and subjected to the same evolutionary processes
operating within identical external environments for the same period of time
will have a similar range of engineering properties. A thorough knowledge of
the geology and evolution in cyclic time would therefore not only enable the
geologist and geomorphologist more easily to interpret site investigation data,
but also extrapolate properties from one landform to another. Furthermore, a
knowledge of a landform's present tendency would permit an assessment of its
probable reaction under human impact.[back to the
publication list]
Brimicombe, A.J.
(1982) "Engineering site evaluation from aerial photographs" Proceedings of
the Seventh Southeast Asian Geotechnical Conference, Hong Kong, Vol. II:
139-148
Two contrasting sites in Hong Kong are evaluated
using aerial photographs to illustrate the approach and technique of
interpretation and the range of information that can be extracted. The approach
towards interpretation is related to the engineer's information requirement
whilst the skill of the interpreter, scale and quality of the aerial
photographs and the landscape itself define the range of information that can
be extracted.[back to the publication list]