Geospatial data has become increasingly critical in today's world. Location is important for all kinds of businesses, from small businesses whose discoverability depends on Google Maps, to larger enterprises with several stores, big warehouses and intricate distribution routes.
Mapify helps these businesses handle large volumes of location-based data seamlessly and in real time, coming from large numbers of IoT devices and sensors, and process it on the fly according to their business rules, with built-in support for geospatial functions.
The data can then be streamed to BigQuery, where after-the-fact analysis and reporting can be performed at very large scale, with the added context of other relevant data your business may also host in BigQuery. BigQuery’s built-in machine learning lets you create machine learning models using standard SQL functions, adding predictive capabilities on top of your data.
How to build a Mapify solution with real-time IoT sensor data
This article will walk you through quickly building a working solution in which Mapify handles incoming real-time sensor data, processes it through a simplified workflow and streams it to BigQuery. You will then be able to use your own knowledge of BigQuery to generate insights and build your own predictive analytics (outside the scope of this article).
If you want to follow this step-by-step guide and build this solution yourself, this is what you'll need:
Mapify account (feel free to request a Mapify trial here)
Some programming experience
In this article, we'll be using publicly available data from the amazing city of Helsinki. Information regarding the location of the city's public transportation vehicles can be subscribed to through Digitransit's high-frequency positioning (HFP) API. The open HFP API can be used to subscribe to vehicle movements in soft real-time.
Since the data is published through MQTT, making it available in Mapify is as easy as creating a Datafeed. Let's see how!
Example of a Mapify Datafeed subscribing to Helsinki's high-frequency positioning MQTT topic to obtain the real-time location of the public buses.
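For readers curious about what flows through such a Datafeed, here is a minimal, dependency-free sketch of parsing an HFP v2 vehicle-position ("VP") message. The broker and topic mentioned in the comment come from the public HFP documentation; the sample values are illustrative, and Mapify's Datafeed handles the actual MQTT subscription for you.

```python
import json

# A Datafeed (or any MQTT client) would subscribe to a topic such as
# /hfp/v2/journey/ongoing/vp/# on the public broker mqtt.hsl.fi.
# Below is an abbreviated VP payload; field names follow the HFP docs,
# values are illustrative.
sample_payload = """
{"VP": {"desi": "550", "dir": "2", "oper": 22, "veh": 1324,
        "tst": "2023-05-15T07:32:02.000Z", "tsi": 1684135922,
        "spd": 8.49, "hdg": 82, "lat": 60.224065, "long": 24.834437}}
"""

def parse_vehicle_position(raw: str) -> dict:
    """Extract the fields we care about from an HFP VP message."""
    event = json.loads(raw)["VP"]
    return {
        "route": event["desi"],     # route number shown on the bus
        "vehicle": event["veh"],    # vehicle number within the operator
        "timestamp": event["tst"],  # UTC timestamp of the position fix
        "speed": event["spd"],      # speed in metres per second
        "lat": event["lat"],
        "lon": event["long"],
    }

position = parse_vehicle_position(sample_payload)
print(position["route"], position["lat"], position["lon"])
```

The same handful of fields will reappear later as columns of the BigQuery table.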
Setting up a Mapify layer showing the bus location data in real time is easy, as we have shown in previous articles, but it is not required in order to send data to an external system such as BigQuery. Nevertheless, this is how the Helsinki buses' location data can look in a Mapify real-time layer:
Small video showing the real-time location of Helsinki's public buses.
Your BigQuery dataset and table
Before you can create a workflow to process and send data to BigQuery, you should have already set up a dataset and at least one table which will be used to hold your data.
Continuing with the Helsinki public transport example above, we will create a mapify_bq_demo dataset containing a new helsinki-public-buses-location-data table.
BigQuery explorer view of the created dataset and table which will hold data sent from Mapify.
For this example, the helsinki-public-buses-location-data table schema will simply mirror the fields from the real-time message payloads that the Helsinki public buses' positioning devices provide to Mapify.
The helsinki-public-buses-location-data schema reflects the fields provided by the vehicles' positioning devices.
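As a sketch of what such a schema could look like, here is one possible definition in BigQuery's JSON schema format, mirroring a subset of the HFP vehicle-position fields. The field names and types are illustrative (and renamed from the raw HFP keys); adapt them to whichever payload fields you actually keep.

```python
import json

# A possible schema for the helsinki-public-buses-location-data table.
# Field choices are illustrative, not a definitive mapping of the HFP payload.
schema = [
    {"name": "route",     "type": "STRING",    "mode": "REQUIRED"},
    {"name": "vehicle",   "type": "INTEGER",   "mode": "REQUIRED"},
    {"name": "timestamp", "type": "TIMESTAMP", "mode": "REQUIRED"},
    {"name": "speed",     "type": "FLOAT",     "mode": "NULLABLE"},
    {"name": "lat",       "type": "FLOAT",     "mode": "REQUIRED"},
    {"name": "lon",       "type": "FLOAT",     "mode": "REQUIRED"},
]

# Written to a file, this JSON is the format accepted by the bq CLI, e.g.:
#   bq mk --table mapify_bq_demo.helsinki-public-buses-location-data schema.json
print(json.dumps(schema, indent=2))
```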
The important thing is that we now have a BigQuery table to which Mapify can send large sets of data, and which BigQuery can use for large-scale analytics and reporting, as well as for machine-learning-based solutions.
Creating a workflow
To send data in real time from Mapify to BigQuery, we'll add an output node to a workflow that streams the Helsinki buses' real-time locations to BigQuery. This way, the data sent to our data lake has already been filtered and processed according to our business rules.
Simple workflow to illustrate the ability to include data export according to business rules.
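Workflow nodes are configured inside Mapify itself, but to illustrate the kind of business rule such a workflow might apply before streaming, here is a hypothetical filter in plain Python: keep only positions that have a GPS fix, fall inside a rough bounding box around Helsinki, and report a plausible speed. The rule and thresholds are invented for illustration.

```python
# Hypothetical business rule of the kind a workflow node might apply
# before data is streamed to BigQuery.
HELSINKI_BBOX = (59.9, 24.5, 60.5, 25.5)  # min_lat, min_lon, max_lat, max_lon

def passes_business_rules(pos: dict) -> bool:
    min_lat, min_lon, max_lat, max_lon = HELSINKI_BBOX
    if pos.get("lat") is None or pos.get("lon") is None:
        return False  # no GPS fix: nothing useful to store
    in_bbox = min_lat <= pos["lat"] <= max_lat and min_lon <= pos["lon"] <= max_lon
    plausible_speed = pos.get("speed") is None or 0 <= pos["speed"] <= 40  # m/s
    return in_bbox and plausible_speed

print(passes_business_rules({"lat": 60.22, "lon": 24.83, "speed": 8.5}))  # inside Helsinki
print(passes_business_rules({"lat": 51.5, "lon": -0.1, "speed": 8.5}))    # far outside
```

Filtering at this stage keeps noise out of the data lake, so downstream queries never have to re-apply these rules.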
Your data in BigQuery
Once your workflow is up and running and bus locations are being received and processed, data will be streamed to BigQuery and quickly become accessible for you to obtain insights using familiar SQL.
You can query large quantities of data in BigQuery using simple SQL queries.
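For example, a query along the following lines could summarise recent bus activity per route. The column names are assumptions from the illustrative schema above; adjust them to match your own table.

```sql
-- Average speed and position count per route over the last hour.
-- Table and column names are illustrative.
SELECT
  route,
  COUNT(*) AS position_count,
  AVG(speed) AS avg_speed_m_s
FROM `mapify_bq_demo.helsinki-public-buses-location-data`
WHERE timestamp > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 HOUR)
GROUP BY route
ORDER BY avg_speed_m_s DESC;
```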
Now that your data is being streamed into your BigQuery instance from Mapify, you can explore it and generate valuable business insights with just your browser. You can then quickly add artificial intelligence capabilities to your business by creating and executing machine learning models in BigQuery using your own data.
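As a sketch of that last step, BigQuery ML lets you train a model with a single SQL statement. The model below, its features and its label are all illustrative, assuming the same hypothetical column names as before.

```sql
-- Illustrative BigQuery ML model predicting bus speed from position
-- and time of day.
CREATE OR REPLACE MODEL `mapify_bq_demo.bus_speed_model`
OPTIONS (model_type = 'linear_reg', input_label_cols = ['speed']) AS
SELECT
  lat,
  lon,
  EXTRACT(HOUR FROM timestamp) AS hour_of_day,
  speed
FROM `mapify_bq_demo.helsinki-public-buses-location-data`
WHERE speed IS NOT NULL;
```

Once trained, such a model can be queried with ML.PREDICT directly from SQL, with no separate machine learning infrastructure to operate.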
We hope this article provided you with a quick overview of how Mapify can help you obtain even more value from your data by giving you the best of two worlds: world-class location intelligence processing and integration with a global-scale data lake such as BigQuery.
Mapify has much more to offer! Our team is looking forward to exploring all the amazing services which will make our product better and our customers happier, while also making things easier for your development team. Sounds like a win-win situation 🤝