Azure SQL database row limit and table refresh


Currently I'm visualizing data in Grafana, flowing from Azure IoT Hub through Stream Analytics into a SQL database.
I can read and visualize the data, but the table only shows the last set of affected rows. New data stops appearing in the table; only a limited number of rows exist, even though Stream Analytics is still sending new data to the database.

What could be the cause, and is it on Grafana's side? I set a 5 s refresh for each panel. Is there a way to refresh the table for new data, or to update it continuously?

In Stream Analytics the query is:

SELECT
      data1,
      data2,
      data3
INTO
     database_name
FROM
     Hub_name

SELECT *
FROM table_name
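
If the SQL table has a timestamp column (for example, `EventEnqueuedUtcTime` selected into the output by Stream Analytics), the panel query can follow the dashboard's time range so the 5 s refresh actually pulls new rows. A minimal sketch, assuming the table and column names below:

```sql
-- Hypothetical Grafana panel query against the Azure SQL data source.
-- $__timeFilter() is Grafana's SQL macro; it expands to a predicate
-- covering the dashboard's currently selected time range.
SELECT
    EventEnqueuedUtcTime AS time,  -- assumed timestamp column
    data1,
    data2,
    data3
FROM table_name
WHERE $__timeFilter(EventEnqueuedUtcTime)
ORDER BY EventEnqueuedUtcTime ASC;
```

With a plain `SELECT * FROM table_name`, the panel re-reads the whole table on every refresh and is not tied to the dashboard time range, which can make it look as if the data stopped updating.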

1 Answer

Answered by Sampath

Steps to send data from Azure IoT Hub through Stream Analytics to a SQL database:

  • Create Event Hubs Namespace:

    • In the Azure Portal, go to the Home page.
    • Search for "Event Hubs" and click on it.
    • Press the "Create" button.
    • Provide a name and location for the namespace, choose the Basic pricing tier, and click "Review + Create."
  • Save Connection String:

    • After the namespace is created, go to the 'Shared access policies' tab.
    • Click on the RootManageSharedAccessKey link.
    • Copy the 'Connection string - primary key' and save it for later use.
  • Create Event Hub:

    • Go to the Overview tab of the Event Hubs namespace.
    • Click on the '+ Event Hub' button to create an Event Hub under the namespace.
    • Provide a name and set the partition count.
    • Click "Create."
  • Steps taken from DOC and DOC2.


  • Configure Event Hubs as Data Export Destination:

    • In the IoT Central Home Page, go to the Destination tab of Data Export.
    • Select Destination Type as Azure Event Hubs.
    • Provide the Connection String of Event Hubs saved earlier.
    • Select the name of the Event Hub created in the previous step.
    • Press the Save button.
  • Create Data Export:

    • Go to the Exports tab of Data Export.
    • Provide a name for the export.
    • Select the type of data as Telemetry.
    • Choose the destination (Event Hub) created earlier.
    • Press the Save button.
  • Create Stream Analytics Job:

    • In the Azure Portal, search for "Stream Analytics job" and click on it.
    • Provide a name for the new Stream Analytics job.
    • Select the number of streaming units.
    • Click "Create."

  • Add Stream Input:

    • After the Stream Analytics job is created, go to the Overview page.
    • Select the arrow link for Inputs.
    • Choose Event Hub as the stream input type.
    • Provide details for the Event Hub, including the namespace and Event Hub name.
    • Click "Save."

  • Create Azure SQL Database:

    • Go back to the Azure portal Home page.
    • Create a SQL database in the same region as the Stream Analytics job.
    • Provide the required details and click 'Review + create.'
    • Ensure that the logical server for Azure SQL database is in the same region as the Stream Analytics job.
  • Add Stream Output:

    • Go back to the Stream Analytics job.
    • Go to the Outputs link.
    • Select SQL Database as an output type.
    • Provide details for the SQL database, including the table name.
    • Click "Save."

  • Define SQL Table:

    • Ensure that the SQL table (e.g., t1) is already created in the Azure SQL database.
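
      As a sketch, a table matching the three columns in the question's query could be created like this; the column types and the extra timestamp column are assumptions to adjust to the actual telemetry:

      ```sql
      -- Hypothetical schema for the Stream Analytics output table.
      CREATE TABLE t1 (
          data1 NVARCHAR(64),
          data2 FLOAT,
          data3 FLOAT,
          EventEnqueuedUtcTime DATETIME2  -- handy for time-series panels in Grafana
      );
      ```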

  • Edit Stream Analytics Query:

    • Press the Edit Query link in the Stream Analytics job.
    • Modify the SQL query to retrieve details from the telemetry data.
    • Test the query and save it.
  • Start Stream Analytics Job:

    • Press the Start button in the Stream Analytics job.
  • Monitor Stream Analytics Job:

    • Check the Stream Analytics job Monitoring page to understand how data streams are transferred from the input Event Hub to the output Azure SQL Database.
  • Verify SQL Table:

    • Check the SQL table to verify how streams of telemetry data are getting populated in the SQL Database.
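
Putting the steps together: in a Stream Analytics query, `INTO` takes the job's output alias, not the database name. A sketch of the question's query using assumed alias names (`[sql-output]`, `[eventhub-input]`):

```sql
SELECT
    data1,
    data2,
    data3,
    EventEnqueuedUtcTime  -- system column with each event's enqueue time
INTO
    [sql-output]          -- output alias pointing at the SQL table (e.g. t1)
FROM
    [eventhub-input]      -- input alias for the Event Hub / IoT Hub stream
```

Selecting the enqueue timestamp into the table lets downstream tools such as Grafana filter and order by event time.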