Provision a Stream Analytics job

  1. From the Management portal, click NEW, DATA SERVICES, STREAM ANALYTICS, QUICK CREATE.
  2. Specify the following values, and then click CREATE STREAM ANALYTICS JOB:
  • JOB NAME: Enter a job name. For example, lot-sensor-stream.
  • REGION: Select the region where you want to run the job. Azure Stream Analytics is currently only available in two regions during preview. For more information, see Azure Stream Analytics limitations and known issues. Consider placing the job and the Event Hub in the same region for better performance and to avoid paying to transfer data between regions.
  • STORAGE ACCOUNT: Choose the Storage account that you would like to use to store monitoring data for all Stream Analytics jobs running within this region. You have the option to choose an existing Storage account or to create a new one.
  3. Click STREAM ANALYTICS in the left pane to list the Stream Analytics jobs.


The new job will be listed with a status of CREATED. Notice that the START button at the bottom of the page is disabled. You must configure the job input, output, query, and so on before you can start the job.

Specify job input

  1. Click the job name.
  2. Click INPUTS from the top of the page, and then click ADD INPUT. The dialog that opens will walk you through a number of steps to set up your input.
  3. Select DATA STREAM, and then click the right arrow button.
  4. Select EVENT HUB, and then click the right arrow button.
  5. Type or select the following values on the third page:
    • INPUT ALIAS: Enter a friendly name for this job input; here we will call it Input. Note that you will use this name in the query later on.
    • EVENT HUB: If the Event Hub you created is in the same subscription as the Stream Analytics job, select the namespace the Event Hub is in.
      If your Event Hub is in a different subscription, select Use Event Hub from Another Subscription and manually enter the SERVICE BUS NAMESPACE, EVENT HUB NAME, EVENT HUB POLICY NAME, EVENT HUB POLICY KEY, and EVENT HUB PARTITION COUNT. NOTE: This sample uses the default number of partitions, which is 16.
    • EVENT HUB NAME: Select the name of the Azure Event Hub you created earlier.
    • EVENT HUB POLICY NAME: Select the Event Hub policy created earlier, which is streamaccess.


Then specify the following serialization settings:

    • EVENT SERIALIZER FORMAT: JSON
    • ENCODING: UTF8

Click the check button to add this source and to verify that Stream Analytics can successfully connect to the Event Hub.
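
For reference, a single event arriving on the Event Hub, serialized as JSON, might look something like the following. The field names and values here are an assumption modeled on the columns used in the query later in this walkthrough; your sender may emit different or additional fields.

{
    "HomeHubID": "home-hub-01",
    "SensorName": "LivingRoomThermostat",
    "SensorRole": "Temperature",
    "SensorData": "71.5",
    "EntryDateTime": "2015-02-20T18:17:00Z"
}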

Specify job output

  1. Click OUTPUT from the top of the page, then click ADD OUTPUT.
  2. Select SQL DATABASE, and then click the right button.
  3. Type or select the following values.
  • SQL DATABASE: Choose the SQL Database you created earlier. If it is in the same subscription, select the database from the dropdown menu. If not, manually enter the Server Name and Database fields.
  • USERNAME: Enter the SQL Database login name.
  • PASSWORD: Enter the SQL Database login password.
  • TABLE: Specify the table you created earlier to send output to: dbo.LotSensorStream.


Click the check button to create your output and verify that Stream Analytics can successfully connect to the SQL Database as specified.
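
If you have not created the output table yet, a definition along the following lines matches the columns produced by the pass-through query in the next section. The column types and the clustered index are assumptions modeled on the AvgReadings table shown at the end of this page; adjust them to your data.

CREATE TABLE [dbo].[LotSensorStream] (
    [HomeHubID]     NVARCHAR (50)  NULL,
    [SensorName]    NVARCHAR (150) NULL,
    [SensorRole]    NVARCHAR (150) NULL,
    [SensorData]    NVARCHAR (150) NULL,
    [EntryDateTime] DATETIME2 (6)  NULL
);
GO
-- Azure SQL Database requires a clustered index before rows can be inserted.
CREATE CLUSTERED INDEX [IX_LotSensorStream]
    ON [dbo].[LotSensorStream]([HomeHubID] ASC);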

Specify job query

Stream Analytics supports a simple, declarative query model for describing transformations.

This sample solution provides a simple pass-through query that outputs sensor readings to a SQL Database table.

  1. Click QUERY from the top of the page.
  2. Add the following to the code editor:
    SELECT HomeHubID, SensorName, SensorRole, SensorData, EntryDateTime FROM Input
  3. Make sure that the name of the input source matches the name of the input you specified earlier.
  4. Click SAVE from the bottom of the page and YES to confirm.

Note: the simple query we are using here is meant to demonstrate how to output data to the SQL database. To learn more about writing queries for more advanced processing and transformation, see the Azure Stream Analytics Query Language Reference.

 

Start the job

By default, Stream Analytics jobs start reading incoming events from the time that the job starts. Because the Event Hub already contains data to process, we need to configure the job to consume this historical data.

  1. Click DASHBOARD from the top of the page.
  2. Click START from the bottom of the page.
  3. Click CUSTOM TIME, and specify a start time. Make sure that the start time is sometime before the time that you ran BasicEventHubSample.
  4. Click the check button on the bottom of the dialog. In the quick glance pane, the STATUS will change to Starting; it may take a couple of minutes for the job to finish starting and move into the Running state.

View job output

In Visual Studio or SQL Server Management Studio, connect to your SQL Database and run the following query:

SELECT HomeHubId, CAST(DATEADD(hour, -8, EntryDateTime) AS datetime) AS 'datetime',
       SensorRole, SensorName, CAST(SensorData AS float) AS 'data'
FROM [dbo].[LotSensorStream]
ORDER BY EntryDateTime DESC

(time converted from GMT to PST)

You will see records corresponding to the reading events from the Event Hub.


 

Another Example


Here is an example of a Stream Analytics job whose query does a little more computation.


This query uses the time that the event was pushed to the Event Hub as the timestamp, and finds the average temperature reading and the number of events that fall within each one-hour window.
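
A Stream Analytics query along these lines would produce the behavior just described; the exact query used in the sample may differ, and this sketch leaves the SensorData column of the table below unpopulated. When no TIMESTAMP BY clause is specified, Stream Analytics uses the event's Event Hub arrival time as its timestamp.

SELECT
    -- System.Timestamp is the end of the tumbling window; subtract an hour for the start.
    DateAdd(hour, -1, System.Timestamp) AS WinStartTime,
    System.Timestamp AS WinEndTime,
    HomeHubID,
    SensorName,
    SensorRole,
    AVG(CAST(SensorData AS float)) AS AvgReading,
    COUNT(*) AS EventCount
FROM Input
GROUP BY HomeHubID, SensorName, SensorRole, TumblingWindow(hour, 1)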

For this query, you would need an output SQL table that looks like this:

CREATE TABLE [lotsensors].[AvgReadings] (
    [WinStartTime] DATETIME2 (6)  NULL,
    [WinEndTime]   DATETIME2 (6)  NULL,
    [HomeHubId]    NVARCHAR (50)  NULL,
    [SensorName]   NVARCHAR (150) NULL,
    [SensorRole]   NVARCHAR (150) NULL,
    [SensorData]   NVARCHAR (150) NULL,
    [AvgReading]   FLOAT (53)     NULL,
    [EventCount]   BIGINT         NULL
);

GO
CREATE CLUSTERED INDEX [AvgReadings]
    ON [lotsensors].[AvgReadings]([HomeHubId] ASC);


The results of this query will contain one row per HomeHubId and sensor for each one-hour window, with the window start and end times, the average reading, and the event count.

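To inspect the aggregated output, a query along these lines against the table above lists the hourly windows, newest first:

SELECT WinStartTime, WinEndTime, HomeHubId, SensorName, AvgReading, EventCount
FROM [lotsensors].[AvgReadings]
ORDER BY WinStartTime DESC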
