Development and Evolution of a Site Survey System: Groundhog

Author(s):  
Mike Davies ◽  
Robert Murley ◽  
Ian Adsley

Traditional techniques for the assessment of pollutants in contaminated land, notably brown-field sites, may not yield the speed and accuracy now required for estimates of risk and remediation cost. Detailed site investigation is often limited by the time and cost of laboratory-based analysis techniques and time-consuming data collation phases. Thus, relatively straightforward technical issues, such as the mapping of priority areas of a site, can be unnecessarily delayed and expensive. The Groundhog system was developed to address these problems and to provide a platform for the development of a range of techniques for the radiological survey of potentially contaminated land. The system brings together the best of well-established and recent technologies: visualisation of the survey results is improved by the use of Geographical Information Systems, and database systems allow an audit trail to be maintained as part of a Quality Assurance programme. Development of the Groundhog system has continued: increasing the sensitivity of the system for some applications, using gamma radiation spectrometry systems to provide qualitative measurements, and constructing ruggedised systems for surveys of areas where the risks associated with manual surveys are deemed unacceptable. In recent years, ‘conventional’ Groundhog surveys have been performed on many nuclear and non-nuclear sites for a wide range of reasons: de-licensing nuclear facilities; pre- and post-remediation surveys of contaminated land; and reducing waste volume during the remediation of contaminated land. Specialised versions of the system have been developed and used to locate discrete nuclear fuel ‘particles’ on beaches; sub-surface measurements have been made to estimate waste volume; and a submarine survey has been conducted. This paper describes some of the projects completed and the technologies used to perform the work.

Author(s):  
B. L. Turner II ◽  
D. R. Foster

Frontiers advance and retreat, both figuratively and literally. At this moment they are advancing in three ways relevant to the subject of this book and the ongoing project on which it is based. First, after more than a century of reductionist hegemony, various science communities worldwide increasingly recognize the need to improve complementary, synthesis understanding—a way of putting the reductionist pieces of the problem back together again in order to understand how the ‘whole’ system works and to identify the emergent properties that follow from the complex interactions of the pieces. Synthesis understanding is not, of course, new. In the late eighteenth century, Immanuel Kant argued for it as one of the pillars of science in the reorganization of knowledge in the European academy (Turner 2002a) and designated geography as one of the ‘synthesis sciences’. Its contemporary rediscovery, however, rests in the science of global environmental change (Lawton 2001; Steffen et al. 2002), especially efforts to model complex systems, such as those in ocean–atmosphere–land interactions, and has been expanded by emerging research agendas seeking to couple human and environment systems, often registered under the label of ‘sustainability science’ (e.g. Kates et al. 2001; NRC 1999). Second, within these developments land-use and land-cover change (or, simply, land change) is singled out because of its centrality to a wide range of environmental concerns, including global climate change, regional–local hydrological impacts, biodiversity, and, of course, human development and ecosystem integrity (e.g. Brookfield 1995; NRC 2000; Watson et al. 2001). The need to advance an integrated land-change science is also increasingly recognized, one in which human, ecological, and remote sensing and geographical information systems (GIS) sciences are intertwined in problem-solving (Liverman et al. 1998; Klepeis and Turner 2001; Turner 2002b).
And central to this effort is the need to advance geographically (spatially) explicit land-change models that can explain and project coupled human-ecological systems, and thus serve a wide range of research and assessment constituencies, from carbon to biodiversity to human vulnerability (IGBP 1999; Irwin and Geoghegan 2001; Kates et al. 2001; Liverman et al. 1998; Veldkamp and Lambin 2001). These two developments—synthesis science and integrated land science directed towards geographically explicit land-change models—constitute the broader intellectual and research frontiers to which this work contributes.


2000 ◽  
Vol 51 (3) ◽  
pp. 255 ◽  
Author(s):  
N. L. Andrew ◽  
A. L. O'Neill

Aerial photography was used to estimate the representation of shallow subtidal habitats in New South Wales. Sixty sites, each between 4 and 5 hectares, were mapped with Geographical Information Systems software using ortho-rectified images digitized from 1:8000-scale photographs and ‘ground truthed’ in the field by divers. Barrens habitat covered an estimated 50% (s.e. = 3.9) of nearshore reefs between Port Stephens and Disaster Bay. Coverage of barrens habitat was greatest in Disaster Bay (68%, s.e. = 6.7) and least south of Disaster Bay (1%, s.e. = 0.3). There were clear differences among localities in the area of reef within the mapped sites; those at Cape Howe, Nadgee, and Turingal were significantly smaller in area than all others. There was no clear latitudinal trend in these differences but there was evidence of sand inundation at a site at Nadgee, where the reef was small. Differences in the densities and size-structure of the sea urchin Centrostephanus rodgersii at 27 of the mapped sites provide a basis for testing relationships between the demography of this species and the persistence of the barrens habitat. The extensive coverage of the barrens habitat in New South Wales is likely to limit the productivity of the abalone industry. The development of a sea urchin fishery may have large impacts on habitat representation on nearshore reefs.
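The coverage figures quoted above (e.g. 50%, s.e. = 3.9) are of the usual form mean ± standard error of the mean across the mapped sites. A minimal sketch of that calculation, using hypothetical site values rather than the study's data:

```python
import math

def coverage_stats(site_percentages):
    """Mean percent cover across sites and its standard error."""
    n = len(site_percentages)
    mean = sum(site_percentages) / n
    # Sample variance (n - 1 denominator), then s.e. = s / sqrt(n)
    var = sum((p - mean) ** 2 for p in site_percentages) / (n - 1)
    se = math.sqrt(var / n)
    return mean, se

# Hypothetical barrens coverage (%) at five mapped sites
mean, se = coverage_stats([45.0, 55.0, 50.0, 40.0, 60.0])
print(f"mean = {mean:.1f}%, s.e. = {se:.2f}")  # prints: mean = 50.0%, s.e. = 3.54
```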


Author(s):  
Johan Andersson ◽  
Kristina Skagius ◽  
Anders Winberg ◽  
Anders Ström ◽  
Tobias Lindborg

The Swedish Nuclear Fuel and Waste Management Co., SKB, is currently finalizing its surface based site investigations for the final repository for spent nuclear fuel in the municipalities of Östhammar (the Forsmark area) and Oskarshamn (the Simpevarp/Laxemar area). The investigation data are assessed into a Site Descriptive Model, constituting a synthesis of geology, rock mechanics, thermal properties, hydrogeology, hydrogeochemistry, transport properties and a surface system description. Site data constitute a wide range of different measurement results. These data both need to be checked for consistency and to be interpreted into a format more amenable for three-dimensional modeling. The three-dimensional modeling (i.e. estimating the distribution of parameter values in space) is made in a sequence where the geometrical framework is taken from the geological models and in turn used by the rock mechanics, thermal and hydrogeological modeling. These disciplines in turn are partly interrelated, and also provide feedback to the geological modeling, especially if the geological description appears unreasonable when assessed together with the other data. Procedures for assessing the uncertainties and the confidence in the modeling have been developed during the course of the site modeling. These assessments also provide key input to the completion of the site investigation program.


Author(s):  
Angela Bartlett ◽  
Mike Davies ◽  
Peter Burgess ◽  
Gavin Coppins

The United Kingdom nuclear research programme started in the 1940s. Research Sites Restoration Limited (RSRL) is responsible for the restoration of two sites which were at the forefront of this research, under a programme funded by the UK Nuclear Decommissioning Authority (NDA). These are the 100 hectare Harwell site in Oxfordshire and the 84 hectare Winfrith site on the south coast of England. The work performed on these sites covered a huge range of nuclides, combinations of nuclides, and chemical and physical processes, far more complicated than, for example, a power station. The sites have a complex history, with records of hundreds of buildings, many kilometres of drainage systems, groundwater contamination issues and land areas which require remediation. Formal work towards site release began in the 1990s, but demolition and clearance for re-use started many years earlier. An efficient restoration programme requires appropriate quality data: it is vital to decide what you need to know and how well you need to know it. As part of this, a challenging number of factors need to be considered in its design. This paper discusses these factors using the examples of the approach used at the Harwell and Winfrith sites, including:
• historical knowledge and associated uncertainties;
• relevant clearance criteria;
• availability and limitations of surveying equipment;
• effective targeted and validation sampling with appropriate analytical methods;
• data capture and analysis techniques;
• effective communication between RSRL and the relevant technical teams;
• mapping technologies (Global Positioning Systems, Geographical Information Systems);
• use of Babcock’s IMAGES land quality software tool;
• integration of the above over long time scales.
The RSRL programme of works at the Harwell and Winfrith sites is producing large volumes of different types of information from decommissioning, site investigation and remediation projects. This information will need to be accessible and understandable to support the process of site release, which will continue over many years. The paper illustrates the methods by which RSRL is using effective knowledge management to compile a verifiable record to support site release as the site restoration works progress.


Author(s):  
Laura Beltz Imaoka

This study situates geospatial technology within the platform economy and constructs its brand culture, making it visible as a for-profit business rather than a utility. A critical lens is turned on the macroscopic economic and micro-social processes of the geospatial industry that result in the hegemonic relations and discursive regimes that legitimize and naturalize a common geospatially equipped, data-driven world. The annual user conventions and platform marketing of Esri, the global market leader in geographical information systems (GIS), act as a site to observe how an imagined geospatial community of practitioners and investors is constructed. Branded content is unpacked to understand how the company’s image-making cultivates power relations between the company and the public at large while negating itself as gatekeeper. These symbolic processes and collective practices help influence the uncritical investment and growth of the geospatial industry.


2021 ◽  
pp. 36-55
Author(s):  
Karel Charvat ◽  
Runar Bergheim ◽  
Raitis Bērziņš ◽  
František Zadražil ◽  
Dailis Langovskis ◽  
...  

For the purpose of exploiting the potential of cloud connectivity in geographical information systems, the Map Whiteboard technology introduced in this article does for web mapping what Google Docs does for word processing: it creates a shared user interface where multiple parties can collaboratively develop maps and map data while seeing each other work in real time. To develop the Map Whiteboard concept, we applied a methodology whereby we collected technical and functional requirements through a series of hackathons, implemented a prototype in several stages, and subjected it to rigorous testing in a lab environment and with selected users from relevant environments at intermediate scale. The work has resulted in a fully functional prototype that exploits WebSockets via a cloud service to reflect map and data changes between multiple connected clients. The technology has a demonstrated potential for use in a wide range of web GIS applications, facilitated by the interfaces already implemented towards mainstream mapping frameworks like OpenLayers and QGIS, two of the most popular frameworks for web GIS solutions. Further development and testing are required before operationalization in mission-critical environments. In conclusion, the Map Whiteboard concept offers a starting point for exploiting cloud connectivity within GIS to facilitate the digitalization of common processes within the government and private sector. The technology is ready for early adopters and welcomes the contribution of interested parties.
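The core pattern described above, reflecting one client's map edits to every other connected client, can be sketched independently of the actual Map Whiteboard implementation. A minimal illustration of the broadcast logic (all class and field names here are hypothetical, and the real system carries messages over WebSockets rather than in-process lists):

```python
import json

class WhiteboardSession:
    """Toy model of a shared map session: an edit submitted by one
    client is reflected to every other connected client."""

    def __init__(self):
        self.clients = {}  # client_id -> inbox of received messages

    def connect(self, client_id):
        self.clients[client_id] = []

    def apply_edit(self, sender_id, edit):
        """Serialize a map edit and deliver it to all clients except the sender."""
        message = json.dumps(edit)
        for client_id, inbox in self.clients.items():
            if client_id != sender_id:
                inbox.append(message)

session = WhiteboardSession()
session.connect("alice")
session.connect("bob")
session.apply_edit("alice", {"type": "add-feature", "geometry": "POINT(10 20)"})
print(session.clients["bob"])  # bob sees alice's edit; alice's own inbox stays empty
```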


2011 ◽  
Vol 6 (2) ◽  
Author(s):  
Sarah Cornelius ◽  
Ian Heywood

Geographical Information Systems (GIS) are computer-based tools for the input, management, analysis, modelling and display of geographical data. GIS are applied in a wide range of organizations and disciplines, including central and local government, environmental agencies, transport planning and vehicle navigation, education and research, utilities management, resource management, and the financial and retail sectors. GIS is a field of constantly changing technology, and it has been recognized that GIS education needs to be more than a 'once in a lifetime' event (Muller, 1993). Consequently, GIS teachers have developed computer-based materials for learners at all levels, from school students to postgraduates, and for the independent professional updating their skills and knowledge. To date, these materials have followed a number of approaches. Initially, demonstrations of GIS software and its capabilities addressed the need to increase awareness of GIS and its applications (DoE, 1987). Arcdemo (Green, 1987) was an innovative early example, providing a demonstration of the software package Arc/Info online. Training in particular software, and the need for hands-on experience, have been addressed with the production of software-specific educational materials which use primarily traditional text-based instructions for exercises with specially prepared data. Examples include Getting started in GIS (Langford, 1993), the Unitar workbooks for Idrisi (for example McKendry et al, 1992) and Understanding GIS: the Arc/Info Method (ESRI, 1990). These have proved particularly popular, and by directing learners through structured exercises allow new users to become familiar with complex software in a relatively short time. More recently, they have migrated to CD-ROM, with training materials such as Getting to know Arcview (ESRI, 1995) provided in this format, combining software, data and demonstrations.
DOI: 10.1080/0968776980060204


2011 ◽  
Vol 26 (S1) ◽  
pp. s46-s46
Author(s):  
K.M. Simon-Agolory ◽  
K.Z. Watkins

It is common knowledge that having an individual or family disaster plan is vital for saving lives and property before, during and after a disaster. First responders have the daunting task of helping many people during a disaster, and their jobs would be easier if people had disaster plans in place beforehand. However, for a variety of reasons, few people do. People often do not develop disaster plans due to the time required to devise a plan, a lack of knowledge of the benefits of having a plan, or the effort required for the primarily manual process of developing one. Wilberforce University has designed a solution called Wilberforce's Information Library Boosting Emergency Recovery (WILBER), a customized online tool to quickly and automatically generate disaster plans to help save lives and property as well as mitigate the impacts of a potential disaster. WILBER utilizes an interdisciplinary approach to automatically generate a basic disaster preparedness plan. The system addresses a wide range of disasters but focuses on floods, earthquakes and technological disasters such as terrorism and nuclear disasters. WILBER automatically processes locally relevant data and combines mathematical analysis; distributed computing; individual and business risk management; current and historical information from a comprehensive Geographical Information Systems (GIS) resource that includes imagery, infrastructure, demographic, and environmental data; and wireless sensors for real-time condition assessment. Not planning for a disaster only increases its potential magnitude. WILBER allows citizens to quickly establish immediate procedures in the event of an emergency, which in turn can lessen the burden on first responders and reduce the likelihood of loss of life.
This research is funded by the Department of Energy's National Nuclear Security Administration and conducted by the Wilberforce University Disaster Recovery Center in Wilberforce, Ohio, USA.


2011 ◽  
Vol 1 (2) ◽  
pp. 263-270 ◽  
Author(s):  
H. K. Watson ◽  
R. A. Diaz-Chavez

This paper synthesizes lessons learnt from research that aimed to identify land in the dryland regions of eight sub-Saharan African study countries where bioenergy feedstocks production has a low risk of detrimental environmental and socio-economic effects. The methodology involved using geographical information systems (GISs) to interrogate a wide range of datasets, aerial photograph and field verification, an extensive literature review, and obtaining information from a wide range of stakeholders. The GIS work revealed that Africa's drylands potentially have substantial areas available and agriculturally suitable for bioenergy feedstocks production. The other work showed that land-use and biomass dynamics in Africa's drylands are greatly influenced by the inherent ‘disequilibrium’ behaviour of these environments. This behaviour challenges the sustainability concept and perceptions regarding the drivers, nature and consequences of deforestation, land degradation and other factors. An assessment of the implications of this behaviour formed the basis for the practical guidance suggested for bioenergy feedstock producers and bioenergy policy makers.

