
I use AWS IoT Core as an MQTT broker to collect sensor data. This part works fine and everything looks correct when testing it in the test environment. IoT Core passes the data to a DynamoDB table, and this is where it goes wrong: the table is not receiving all the data that IoT Core does. New data should arrive every 10-40 seconds, but DynamoDB sometimes registers it with a gap of 5 minutes, and other times with gaps of over a couple of hours. Is there some kind of setting I need to change, or how can I fix this so that I'm not losing any data on the way to DynamoDB?

Data comes in this format in the test explorer:

> {
>   "GEG_BUF2_LT01_M3": "477.4305",
>   "GEG_BIO1_TT_01": "8.709491",
>   "STATUS_BIO2_P_02": "0",
>   "STATUS_BIO1_S_01": "0",
>   "STATUS_BIO2_P_01": "0",
>   "M_alarm": "0",
>   "M_nieuw_alarm": "0",
>   "time": "2022-12-04 20:48:01"
> }

[Picture of the DynamoDB table]

2 Answers


  1. What is the primary key for your DynamoDB table?

    Have you checked your DynamoDB table metrics for throttling, which can result in writes not being persisted?

    How are you ensuring the IoT sensor actually emitted data during the times that are missing from the table?

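One quick way to act on the throttling check above: DynamoDB publishes a `WriteThrottleEvents` metric to CloudWatch per table. The sketch below builds the query parameters for that metric (the table name `my-sensor-table` is a placeholder, and the actual AWS call is left in a comment since it needs credentials):

```python
from datetime import datetime, timedelta, timezone

def throttle_metric_query(table_name):
    """Build the CloudWatch GetMetricStatistics parameters for DynamoDB
    write-throttle events over the last 24 hours. WriteThrottleEvents is
    a standard metric in the AWS/DynamoDB namespace."""
    now = datetime.now(timezone.utc)
    return {
        "Namespace": "AWS/DynamoDB",
        "MetricName": "WriteThrottleEvents",
        "Dimensions": [{"Name": "TableName", "Value": table_name}],
        "StartTime": now - timedelta(hours=24),
        "EndTime": now,
        "Period": 300,           # one datapoint per 5 minutes
        "Statistics": ["Sum"],
    }

# With credentials configured you would run:
#   import boto3
#   cloudwatch = boto3.client("cloudwatch")
#   resp = cloudwatch.get_metric_statistics(**throttle_metric_query("my-sensor-table"))
# Any throttling shows up as non-zero "Sum" values in resp["Datapoints"].

params = throttle_metric_query("my-sensor-table")
print(params["MetricName"], params["Dimensions"][0]["Value"])
```

If every datapoint is zero, throttling is probably not the cause and the primary-key question becomes more important: writes that reuse an existing key silently overwrite the earlier item.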
  2. Since your data sometimes ends up in Dynamo, it's possible your query fails to match the data, or, more likely, that some malformed data is causing parsing to fail. Try adding an error action that writes to S3 or SNS so you can see what is failing.

    Alternatively, enable IoT logging and use CloudWatch to debug.

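The error-action suggestion can be sketched with boto3: an IoT topic rule payload accepts an `errorAction` alongside its normal actions, and any message the rule fails to deliver goes there instead of being silently dropped. All names, the SQL statement, the role ARN, and the bucket below are placeholders; the actual `replace_topic_rule` call is left in a comment since it needs credentials:

```python
from pprint import pprint

def rule_payload_with_error_action(sql, table_name, role_arn, error_bucket):
    """Build an IoT topic-rule payload with a DynamoDBv2 action plus an
    S3 errorAction, so failed deliveries are captured in the bucket."""
    return {
        "sql": sql,
        "awsIotSqlVersion": "2016-03-23",
        "actions": [{
            "dynamoDBv2": {
                "roleArn": role_arn,
                "putItem": {"tableName": table_name},
            },
        }],
        "errorAction": {
            "s3": {
                "roleArn": role_arn,
                "bucketName": error_bucket,
                # Substitution template: one object per failed message
                "key": "iot-rule-errors/${timestamp()}",
            },
        },
    }

payload = rule_payload_with_error_action(
    sql="SELECT * FROM 'sensors/#'",
    table_name="my-sensor-table",
    role_arn="arn:aws:iam::123456789012:role/my-iot-rule-role",
    error_bucket="my-iot-error-bucket",
)
pprint(payload)

# With credentials configured you would apply it with:
#   import boto3
#   boto3.client("iot").replace_topic_rule(
#       ruleName="sensor_to_dynamo", topicRulePayload=payload)
```

After a few of the usual 10-40 second intervals, any objects appearing under `iot-rule-errors/` show exactly which payloads failed to reach DynamoDB and why.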