We have a project using Google IoT Core to manage devices, and we want to create a monitoring system for those devices.
We have a few thousand devices, and for each individual device we want to record a few (~10) metrics (total size around 50 bytes). We will dynamically add/remove devices over time.
We are thinking of using a Pub/Sub job / Cloud Function to listen to all devices' states. For each device, we would create a set of custom metrics on Google Cloud Monitoring and write the received device states into those custom metrics.
I'm wondering if this is a practical, scalable solution -- I'm worried that the number of metrics will be too large for Google Cloud Monitoring. If it's not, what's the recommended way to monitor a large number of devices managed by Google IoT Core? Thanks!
I see two different discussions here:
Cloud Functions: in the official documentation you can see IoT listed among its use cases. However, it's worth noting that the main drawback you could face using Functions is that the execution environment is often initialized from scratch. This creates some invocation latency. You can mitigate this by following some performance tips and tricks. Also, depending on your use case, you could prefer using Cloud Dataflow (which implements Apache Beam) or Cloud Dataproc (Apache Spark, Hadoop, etc.).
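To make the Functions side concrete, here is a minimal sketch of a background function triggered by the devices' Pub/Sub topic. The payload shape and the `device_state_handler` name are assumptions for illustration; the `(event, context)` signature matches the first-generation Python runtime, and the `deviceId` attribute is one of the attributes IoT Core attaches to telemetry messages.

```python
import base64
import json


def parse_device_state(event):
    """Decode the base64-encoded Pub/Sub payload into a dict.

    Assumes devices publish their state as a JSON object
    (a hypothetical message shape for this sketch).
    """
    raw = base64.b64decode(event["data"]).decode("utf-8")
    return json.loads(raw)


def device_state_handler(event, context):
    """Background Cloud Function entry point (1st-gen Python signature).

    IoT Core puts the device identity in the Pub/Sub message
    attributes, so we don't need it inside the payload itself.
    """
    device_id = event.get("attributes", {}).get("deviceId", "unknown")
    state = parse_device_state(event)
    # At this point you would write `state` to Cloud Monitoring
    # as custom metrics, one value per measurement.
    return device_id, state
```

A cold start pays the import cost once per new instance, so keeping any Monitoring client construction at module level (rather than inside the handler) is one of the standard mitigation tricks mentioned above.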
Monitoring: there are some Cloud Monitoring quota limits that are worth considering when designing your use case. You can find more information about monitoring IoT environments here.
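Regarding those quota limits: the usual way to keep the metric count bounded is to create one custom metric type per measurement (~10 types total in your case) and distinguish devices with a `device_id` metric label, rather than creating a separate set of metrics per device. That way the number of metric descriptors stays constant as devices come and go; only the time-series cardinality grows with the fleet. Below is a sketch that builds the time-series payloads as plain dicts. The metric naming scheme and label layout are assumptions; the actual write would go through `monitoring_v3.MetricServiceClient().create_time_series(...)` from the `google-cloud-monitoring` client library.

```python
import time

# One custom metric type per measurement, NOT per device: the device
# identity lives in a label, so adding or removing devices never
# creates new metric descriptors.
METRIC_PREFIX = "custom.googleapis.com/iot"  # assumed naming scheme


def build_time_series(device_id, state, project_id="my-project"):
    """Turn a device-state dict into Cloud Monitoring time-series payloads.

    `state` maps metric name -> numeric value (hypothetical shape).
    Each entry becomes one TimeSeries with the device id as a label.
    """
    now = int(time.time())
    series = []
    for name, value in state.items():
        series.append({
            "metric": {
                "type": f"{METRIC_PREFIX}/{name}",
                "labels": {"device_id": device_id},  # cardinality lives here
            },
            "resource": {"type": "global", "labels": {"project_id": project_id}},
            "points": [{
                "interval": {"end_time": {"seconds": now}},
                "value": {"double_value": float(value)},
            }],
        })
    return series


# With google-cloud-monitoring installed, the write would look roughly like:
#   client = monitoring_v3.MetricServiceClient()
#   client.create_time_series(name=f"projects/{project_id}",
#                             time_series=series)
```

Note that per-label-value time series still count against the active-time-series limits, so it's worth checking the current cardinality quotas against your device count before committing to this design.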
As a final note, let me link 3 use cases that might be relevant to you and help you decide how to proceed:
Using Cloud Logging with IoT Core devices
Using Prometheus and Grafana for IoT monitoring
Real-time data processing with IoT Core