The Google Maps API has provided a basis for many of the business and consumer applications developed for the web and mobile over this time. Whilst 10 years ago most people didn't even have easy access to a base map, now we have myriad services. The speed of updates to mapping sources such as Google has made access to accurate base map services easy for developers. From familiar Google maps to those used as backdrops for digital geospatial artworks, having a base map in your application is now simple; it is the data on top of this that is now the most distinguishing feature and of most meaning to users.
Increasingly, that data isn't just a set of simple static points showing the location of an ATM, but the result of processing massive amounts of geospatial data captured and analysed in real time.
The 'Flyby' functionality of the running app Strava packs a massive amount of geospatial functionality into what is, for most users, a 'nice to have' feature. The basic app tracks how far and fast you run and shares it with friends. You can track yourself (geospatial storage) on runs and compare yourself with other people (geospatial visualisation), not just at the macro scale of a whole run but also at the micro scale of run 'segments' that have been added by other runners (geospatial editing). Recently, though, Strava's developers added another feature that allows you to play back your run in the context of other people exercising at the same time and see who has 'flown by' you while you have been exercising (geospatial analysis).
Not only is Strava tracking your run, it is able to post-process the data, work out who was running in the area at the same time, and let you visualise them on top of a familiar basemap. The fact that this sort of analysis and visualisation is readily adopted by consumers shows the sophisticated way they now interact with geospatial technology, caring only about the outcome, not the process.
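The analysis behind a feature like Flyby can be thought of as matching timestamped GPS tracks: for each point in your run, find other athletes' points recorded at roughly the same time within a small distance. A minimal sketch of that idea follows; the function names and thresholds here are illustrative, not Strava's actual implementation:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def flybys(my_track, other_track, max_dt_s=30, max_dist_m=50):
    """Return (my_point, other_point) pairs that are close in both time and space.

    Each track is a list of (unix_time, lat, lon) tuples sorted by time.
    """
    hits = []
    for t1, la1, lo1 in my_track:
        for t2, la2, lo2 in other_track:
            if abs(t1 - t2) <= max_dt_s and haversine_m(la1, lo1, la2, lo2) <= max_dist_m:
                hits.append(((t1, la1, lo1), (t2, la2, lo2)))
    return hits
```

At consumer scale this naive pairwise comparison would be replaced by a spatio-temporal index, which is exactly the kind of platform capability discussed below.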
Providing this functionality for a few tens or even hundreds of users has previously been the remit of traditional geospatial companies providing enterprise software or solutions, developing niche applications or cloud-based platforms mainly for existing users. But processing data and developing applications for millions of users requires a whole new set of technologies, which has only become available in the past few years.
With applications needing to answer more and more spatially and temporally contextual questions for users of mobile devices, the requirements for data storage and processing speed go beyond what is possible with traditional software.
At Google, having access to platforms that can rapidly store and process highly scalable datasets has allowed us to provide services such as the Google Maps Directions API. This provides the ability not only to find the quickest way to get somewhere by car, based on distance, but also to answer the question, 'How long will it take to get there now, in the current traffic conditions?' Capturing the road network, processing real-time edits and additions, and gathering up-to-the-minute road-speed signals from a number of different sources globally requires a special platform, one that historically only companies such as Google could build.
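From a developer's point of view, asking that question is a single HTTP request. A sketch of building such a request is below; the `departure_time=now` parameter is what asks the Directions API to factor in live traffic, and the API key is a placeholder you would supply yourself:

```python
from urllib.parse import urlencode

def directions_url(origin, destination, api_key):
    """Build a Google Maps Directions API request for a traffic-aware route.

    Setting departure_time to "now" asks the service to estimate duration
    under current traffic conditions.
    """
    base = "https://maps.googleapis.com/maps/api/directions/json"
    params = {
        "origin": origin,
        "destination": destination,
        "departure_time": "now",
        "key": api_key,  # placeholder: use your own API key
    }
    return base + "?" + urlencode(params)

url = directions_url("London", "Cambridge", "YOUR_API_KEY")
```

Fetching that URL (with a valid key) returns JSON routes; the traffic-aware estimate appears alongside the plain distance-based duration in the response.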
Putting these services behind a simple REST service or JavaScript API helps developers build new applications that go beyond planning, enabling people to react in the moment. This is crucial for the myriad new services that have grown up over the past few years, such as taxi or delivery services, whose information is then accessible to consumers on their shiny new smartwatches.
Processing data at the speed of now
The Google Cloud Platform contains conventional services, such as Google Compute Engine, which provides the ability to run standard virtual machines, and Google Cloud SQL, which enables scalable SQL databases (based on MySQL) to be run in the cloud, paying only for what you use. For those wishing to move their existing in-house geospatial services to the cloud, these provide the building blocks of any traditional solution.
Other products, such as the newly released Google Cloud Bigtable (used internally at Google since 2004, including as data storage for Google Earth), now provide the platform to store and query large, complex datasets in real time without the hassle of managing and scaling servers or databases, even if they are in the cloud.
As it is a generic, highly scaled storage platform, other technology can be layered on top of Bigtable to provide whatever services are required. GeoMesa is an open source solution that can be integrated with Bigtable to store, process and query billions of geospatial features for display or analysis. One of the launch partners was CCRi, which has used its geospatial expertise to make Bigtable the backend of open source products including GeoMesa and GeoServer, providing GIS access to big data services in a simple manner. GeoMesa adds spatio-temporal support to Bigtable and exposes the data through familiar services such as GeoTools, so it can be used in more traditional products like GeoServer. The fact that Bigtable launched with a geospatial-focused solution shows that, as the market moves towards big data analysis and the internet of things, there are now low-cost tools and pay-as-you-go platforms on which to base these services.
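The core trick behind indexing geospatial data in a key-value store like Bigtable is encoding space (and time) into a single lexicographically sortable row key, so that a range scan over keys covers a bounding box. A toy Z-order (Morton) encoding of latitude and longitude illustrates the idea; this is a deliberate simplification of the space-filling-curve indexes GeoMesa actually uses, not its real key format:

```python
def z_encode(lat, lon, bits=16):
    """Interleave the bits of normalised lat and lon into one Z-order value.

    Nearby points tend to get nearby keys, so a spatial query becomes a
    small set of contiguous key-range scans on the underlying store.
    """
    # Normalise each coordinate to an unsigned integer of `bits` bits.
    y = int((lat + 90.0) / 180.0 * ((1 << bits) - 1))
    x = int((lon + 180.0) / 360.0 * ((1 << bits) - 1))
    z = 0
    for i in range(bits):
        z |= ((x >> i) & 1) << (2 * i)       # even bit positions: longitude
        z |= ((y >> i) & 1) << (2 * i + 1)   # odd bit positions: latitude
    return z
```

In GeoMesa proper, a time dimension is interleaved as well for spatio-temporal queries, and GeoTools filters are translated into the corresponding sets of key ranges behind the scenes.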
Everyone's a geospatial expert
This will enable another step-change in the types of functionality consumers will be comfortable with and demand in future applications.
Services that were once only achievable by Google-like companies will now be available to anyone
Matt Toon is geospatial sales engineering manager (EMEA) for Google (www.google.com)