When most people think of GIS, they think of maps, and rightfully so. For decades, typical consumers of spatial data were cities, municipalities, and other organisations that used GIS to manage and visualise information about assets and environments.
That role continues, of course, even as the use of geospatial information moves into new private, commercial, and industrial segments. But as GIS data flows from the field to end users, there are opportunities to develop information that goes well beyond the traditional positions and attributes.
Three components for GIS data delivery
To understand this potential, let’s look at how GIS data moves through an organisation. There are three components to the process. The first step is checking and quality assurance of the incoming data. This process begins in the field, where software guides technicians in collecting positions and attributes required for a specific application. In-field quality control helps to ensure that all needed information is collected and recorded.
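The in-field completeness check described above can be sketched in a few lines. This is a minimal illustration only; the attribute names are hypothetical and a real field-data collector would enforce its own schema.

```python
# Required attributes for a hypothetical asset-collection job.
REQUIRED = {"asset_type", "condition", "install_date"}

def missing_attributes(feature: dict) -> set:
    """Return the required attributes absent from a field-collected feature."""
    return REQUIRED - feature.keys()

# A technician has recorded a feature but skipped one attribute.
print(missing_attributes({"asset_type": "valve", "condition": "fair"}))
# -> {'install_date'}
```

Running a check like this before the crew leaves the site is far cheaper than discovering the gap in the office.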
Quality assurance continues when field data is transferred to a back-end system, typically office- or cloud-based software, where it can be examined to catch outliers and other potential errors. Technicians can also merge the GIS data with geospatial information from other sources (field notes, images, or environmental sensors to name a few) before passing it along for other analyses. An aggressive approach to quality control taken early in the process is important to prevent or reduce problems for downstream users.
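One common back-end check for outliers is a simple spatial sanity test: does each point fall inside the project's expected area? The sketch below assumes a rectangular project boundary; the coordinates are illustrative only.

```python
# Hypothetical project bounding box: min_lon, min_lat, max_lon, max_lat
BBOX = (-123.0, 37.0, -122.0, 38.0)

def is_outlier(lon: float, lat: float) -> bool:
    """Flag a point that falls outside the expected project area."""
    min_lon, min_lat, max_lon, max_lat = BBOX
    return not (min_lon <= lon <= max_lon and min_lat <= lat <= max_lat)

points = [(-122.4, 37.8), (-121.1, 37.8), (-122.5, 37.2)]
flagged = [p for p in points if is_outlier(*p)]
print(flagged)  # the second point lies east of the project area
```

Real QA pipelines layer many such tests (accuracy estimates, attribute ranges, duplicate detection), but the principle is the same: catch errors before they reach downstream users.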
The second step leverages post-processing software to refine and improve the precision and accuracy of positions measured using GNSS. While some solutions can produce sub-meter or decimeter precision in the field, many users rely on post-processing to simplify field operations and conduct additional review and checking in the office. Obtained with little time and effort, the resulting higher accuracy can increase the value of the GIS data.
The third step relies on industry-standard methods that provide an open interface from the field to the system of record. The system of record is the repository for data that the customer considers authoritative. Whether it is a database, a post-processing system, or a real-time analysis and reporting solution, it contains the most current version of the data and a snapshot of current system status. The system of record serves as the basis for myriad decisions and activities based on the geospatial data, so it is critical to exchange information quickly and accurately between the various field solutions and the system of record.
The open interface (such as the standards defined by the Open Geospatial Consortium, formerly the Open GIS Consortium) enables GIS professionals to manage and exchange rich data from multiple sources. Tightly defined protocols and formats ensure complete and consistent data transfer. The open interface is an essential component in productive information management. It also opens the door to using valuable, non-spatial information for GIS.
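GeoJSON (standardised as RFC 7946) is one widely used open interchange format and illustrates the idea: a feature's position, attributes, and metadata travel together in a form any compliant system of record can ingest without proprietary tooling. The feature below is hypothetical.

```python
import json

# A hypothetical field-collected feature expressed in GeoJSON.
feature = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [-122.4194, 37.7749]},
    "properties": {
        "asset_type": "fire_hydrant",  # feature attribute
        "condition": "good",
        "collected_by": "crew_07",     # metadata rides alongside attributes
        "collected_at": "2024-05-14T09:32:00Z",
    },
}

# Serialise for transfer, then parse as a receiving system would.
decoded = json.loads(json.dumps(feature))
print(decoded["properties"]["asset_type"])  # fire_hydrant
```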
The value of metadata
As discussed above, the typical back-end system is designed to interface with any number of systems. The system of record often may be a GIS database, and that's where the field-gathered features go. But the field data is not just fire hydrants and their attributes, and not all of it ends up in a shapefile or geodatabase. A great amount of useful information can be extracted from the feature metadata. In addition to positions and attributes, the results from the field include information such as who collected the data, the working environment, what equipment was used and how it was configured. All of that information can be extracted and included in reports and quality control.
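Extracting that metadata for a report can be as simple as tallying records by the fields of interest. The record schema below is hypothetical; a real system would read these values from the features' metadata.

```python
from collections import Counter

# Hypothetical field records: each carries attributes plus collection metadata.
records = [
    {"feature_id": "F-101", "collected_by": "crew_07", "receiver": "unit_A"},
    {"feature_id": "F-102", "collected_by": "crew_07", "receiver": "unit_A"},
    {"feature_id": "F-103", "collected_by": "crew_09", "receiver": "unit_B"},
]

# Tally collections per (crew, receiver) pair for a QC or equipment report.
summary = Counter((r["collected_by"], r["receiver"]) for r in records)
print(summary[("crew_07", "unit_A")])  # 2
```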
The metadata from the field can support project management as well. For example, if users need to get an idea of fieldwork or productivity, the field data can be examined with that in mind. The time elapsed between recorded points can help gauge the speed at which a crew is working or reveal difficulties in moving from one feature to the next.
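The elapsed-time analysis described above amounts to computing the gaps between successive collection timestamps and flagging the unusually long ones. The timestamps here are invented for illustration.

```python
from datetime import datetime, timedelta

# Hypothetical collection timestamps for successive features.
times = [
    datetime(2024, 5, 14, 9, 0),
    datetime(2024, 5, 14, 9, 4),
    datetime(2024, 5, 14, 9, 7),
    datetime(2024, 5, 14, 9, 45),  # a long gap worth investigating
]

# Gaps between consecutive features, the mean gap, and outliers.
gaps = [b - a for a, b in zip(times, times[1:])]
mean_gap = sum(gaps, timedelta()) / len(gaps)
slow_moves = [g for g in gaps if g > 2 * mean_gap]
print(mean_gap, slow_moves)  # 0:15:00 [datetime.timedelta(seconds=2280)]
```

A long gap might mean difficult terrain, an equipment problem, or simply a lunch break; the metadata raises the question, and the project manager supplies the context.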
GIS data can be important to an enterprise resource planning (ERP) system. As an example, GIS field crews may be evaluated or compensated based on the number of features they collect. That sort of exchange of data provides value well beyond the features and attributes that the field workers gather. GIS organisations can use the field data to create a productivity matrix in which performance parameters can be developed to help optimise the use of human and technological resources.
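A productivity matrix of the kind described above can be built by aggregating the collection log by crew and day. The log entries below are invented; a real system would derive them from the feature metadata or an ERP feed.

```python
from collections import defaultdict

# Hypothetical collection log: (crew, date, features_collected)
log = [
    ("crew_07", "2024-05-13", 42),
    ("crew_07", "2024-05-14", 38),
    ("crew_09", "2024-05-13", 55),
]

# Build a crew-by-day matrix of features collected.
matrix: dict = defaultdict(dict)
for crew, day, count in log:
    matrix[crew][day] = matrix[crew].get(day, 0) + count

print(matrix["crew_07"]["2024-05-14"])  # 38
```

From a matrix like this, performance parameters (features per crew-day, say) fall out directly.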
GIS software also enables managers to view project progress at a glance. Users and project managers want to know where data has been collected and where it hasn’t. The software provides visual information needed to assess coverage. Managers can use the system to develop progress information and plan field activities to collect missing data efficiently. Metadata can also contain information useful for quality control in assessing positioning accuracy.
Open exchange of information
It's important to reiterate the value of open information exchange. The most valuable gains in the past few years have come from improvements in how spatial information is exchanged. The geospatial industry has access to new, comprehensive means by which GIS data can be exchanged between the field and a stakeholder decision point, or multiple decision points.
Thanks to the widespread adoption of standardised web protocols, many systems and databases are no longer proprietary. Even a proprietary system can have standard published inputs that encourage the exchange of features, positions, and metadata between systems. GIS users should understand the importance of the office side of GIS data collection. The concepts and tools for quality control, increased accuracy, metadata, and open information exchange can be leveraged to increase the value and utilisation of field-gathered information.
Ron Bisio at Trimble