This repository uses the datagen connector to mock clickstream data and send it to Kafka. The Azure SQLDW Sink connector consumes the data and creates a table in SQLDW. Instead of using the mssql
command line tool to query SQLDW, I will provide instructions for selecting the data from the Azure portal.
This demo requires the az
command line tool. Please follow the instructions here to install it.
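If you are not sure whether the CLI is already set up, a quick check (this will error on `az account show` until you have run `az login`) is:

```bash
# confirm the Azure CLI is installed and which subscription you are logged into
az --version
az account show
```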
Create an env.sh file and set these variables:
```bash
#!/bin/bash
export ADMIN_USER=# changeme, example: jay
export ADMIN_PASS=# changeme, example: confluent10!
export IP_ADDRESS=# go to google and ask "what is my ip"
export RESOURCE_GROUP=quickstartResourceGroup # change this if you want
export SERVER=# changeme-sql-server
export DWNAME=quickstartDataWarehouse # change this if you want
export REGION=eastus # change this if you want
```
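These variables are consumed when the demo provisions Azure resources (the `make sqldw` step below). As a rough sketch only, assuming a standard setup rather than the exact Makefile target, the underlying az calls look something like this:

```bash
# hypothetical expansion of `make sqldw`; the Makefile is the source of truth
az group create --name "$RESOURCE_GROUP" --location "$REGION"
az sql server create --name "$SERVER" --resource-group "$RESOURCE_GROUP" \
  --location "$REGION" --admin-user "$ADMIN_USER" --admin-password "$ADMIN_PASS"
az sql server firewall-rule create --resource-group "$RESOURCE_GROUP" --server "$SERVER" \
  --name allowMyIp --start-ip-address "$IP_ADDRESS" --end-ip-address "$IP_ADDRESS"
az sql dw create --name "$DWNAME" --resource-group "$RESOURCE_GROUP" --server "$SERVER"
```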
```bash
az login
source env.sh # set environment variables
make build    # builds the containers
make cluster  # creates the CP cluster
make ps       # lists the containers
make sqldw    # creates the Azure SQLDW
make topic    # creates the clickstream topic
make connect  # creates the source and sink connectors (datagen and SQLDW sink); see the sketch below
```
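For reference, `make connect` posts connector configurations to the Kafka Connect REST API. A minimal sketch of the datagen source portion is below; the connector name, port 8083, and the `max.interval`/`iterations` values are illustrative assumptions, and the SQLDW sink configuration (connection URL, user, password) follows the Azure SQL DW Sink connector documentation rather than anything shown here:

```bash
# illustrative only; the repository's connector JSON files are the source of truth
curl -s -X POST -H "Content-Type: application/json" http://localhost:8083/connectors -d '{
  "name": "datagen-clickstream",
  "config": {
    "connector.class": "io.confluent.kafka.connect.datagen.DatagenConnector",
    "kafka.topic": "clickstream",
    "quickstart": "clickstream",
    "max.interval": 1000,
    "iterations": 10000000,
    "tasks.max": "1"
  }
}'
```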
- Go to https://portal.azure.com and type sqldw in the search bar.
- Find and click on the resource group you created.
- Use your ADMIN_USER and ADMIN_PASS to log into the SQL server.
- Open the Tables folder and click on dbo.kafka_clickstream.
- Paste `select * from dbo.kafka_clickstream` into Query 1 and view the results below.
```bash
make down # destroys the cluster and deletes the Azure resource group
```
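Under the hood this is roughly a Docker teardown plus a resource-group delete; as a sketch, assuming the same variable names as env.sh:

```bash
# approximate equivalent of `make down`; check the Makefile for the exact commands
docker-compose down --volumes
az group delete --name "$RESOURCE_GROUP" --yes --no-wait
```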