The flow of the application is captured in this diagram:

## Pre-requisites

1. An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio).
2. The [Azure Functions Core Tools](https://docs.microsoft.com/en-us/azure/azure-functions/functions-run-local#v2) version 4.x.
3. The [Azure CLI](https://docs.microsoft.com/en-us/cli/azure/install-azure-cli) version 2.4 or later.
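As a quick sanity check before starting, you can verify that the required tools are on your `PATH`. This sketch assumes the standard command names (`az` for the Azure CLI, `func` for the Functions Core Tools, plus `git` and `mvn`, which later steps use to clone the repository and build the Java producer):

```shell
# Report which of the required command-line tools are installed.
for tool in az func git mvn; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: missing"
  fi
done
```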
- [Create an instance of Apache Kafka and the topics needed in a Confluent Cloud managed Kafka cluster](https://docs.microsoft.com/en-us/azure/partner-solutions/apache-kafka-confluent-cloud/create)
- [Create an instance of Azure functions](https://docs.microsoft.com/en-us/azure/azure-functions/functions-premium-plan)

## Build and deploy the producer and consumer applications to Azure functions

1. [Set up the schema registry for the wallet\_event topic](https://docs.confluent.io/cloud/current/get-started/schema-registry.html#quick-start-for-schema-management-on-ccloud).
2. Set up the producer code to write Avro-serialized messages to a Kafka topic.
3. Set up Kafka trigger apps to track the events arriving at the various Kafka topics.
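For orientation, an Avro value schema registered for the wallet\_event topic might look like the sketch below. The record and field names here are illustrative assumptions, not the sample's actual schema; use the schema that ships with the repository.

```json
{
  "type": "record",
  "name": "WalletEvent",
  "namespace": "com.example.wallet",
  "fields": [
    {"name": "walletId", "type": "string"},
    {"name": "eventType", "type": "string"},
    {"name": "amount", "type": "double"},
    {"name": "timestamp", "type": "long"}
  ]
}
```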

## Execute the Kafka producer to generate events

1. Run the producer program to produce serialized messages into the Kafka topic.
Here is how the header data looks:
The next section provides a summary of how to set up the Azure services and infrastructure to build the entire solution. References to the documentation and other resources are provided where applicable.

## Setup the Kafka cluster using Confluent Cloud on Azure offering

Provision a [Confluent Cloud cluster on Azure Marketplace](https://docs.microsoft.com/azure/partner-solutions/apache-kafka-confluent-cloud/create?WT.mc_id=data-28802-abhishgu).

**Provide configuration details** for creating a Confluent Cloud organization on Azure. Once the creation steps complete, a Confluent organization is created; you can find it in the Azure portal, and its single sign-on URL lets you log in directly to the Confluent Cloud UI.

Create a new Kafka cluster in the default environment, or create a new environment.

## Setup the schema registry

Confluent Cloud [Schema Registry](https://docs.confluent.io/cloud/current/get-started/schema-registry.html#quick-start-for-schema-management-on-ccloud) helps you manage schemas in Confluent Cloud. After logging in to the cluster, enable the Schema Registry and select the registry region closest to the cluster. To use Confluent Cloud Schema Registry for managing Kafka clusters, you need an API key specific to Schema Registry: click the **Schema Registry** tab, then click **Edit** on the **API credentials** card, create a new API key and secret, and note down the values.
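Confluent clients pass these Schema Registry credentials as a single `key:secret` string (the `basic.auth.user.info` client setting). The placeholders below stand in for your actual values:

```shell
# Combine the Schema Registry API key and secret into the key:secret form
# expected by basic.auth.user.info (placeholder values shown).
SR_API_KEY="SR_KEY_PLACEHOLDER"
SR_API_SECRET="SR_SECRET_PLACEHOLDER"
BASIC_AUTH_USER_INFO="${SR_API_KEY}:${SR_API_SECRET}"
echo "$BASIC_AUTH_USER_INFO"
```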

## Create the topics

Create new [Kafka Topics](https://docs.confluent.io/cloud/current/client-apps/topics/manage.html) as follows using the default topic settings.

Here is the view of the topic wallet\_event once the schema is mapped:

Create a new API key and secret, which can be found under **Cluster > Data Integration > API keys**, and note down these values.

## Update the code

- Clone [this repository](https://github.com/Azure/azure-functions-kafka-extension/tree/dev/samples/WalletProcessing_KafkademoSample) using Git to a folder.
- Update the local.settings.json file to point to your Kafka cluster that you set up in the previous step.
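A sketch of what local.settings.json might contain is shown below. The `IsEncrypted`/`Values` layout, `AzureWebJobsStorage`, and `FUNCTIONS_WORKER_RUNTIME` are standard local.settings.json entries, but the Kafka-related key names and the runtime value are assumptions here; the setting names must match what the sample's function code references, so keep the keys the repository already defines.

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "java",
    "BrokerList": "<your-cluster>.confluent.cloud:9092",
    "ConfluentCloudUsername": "<cluster-api-key>",
    "ConfluentCloudPassword": "<cluster-api-secret>"
  }
}
```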

Update the producer-config.properties file. These values will be leveraged by the producer.

Update the **schemaRegistryUrl** in pom.xml with the API endpoint of the Schema registry.
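The producer-config.properties values follow the standard Confluent Java client layout. A sketch with placeholder credentials (the property names are the standard Confluent client ones; keep whichever keys the sample file already defines):

```properties
# Kafka cluster connection (Confluent Cloud uses SASL_SSL with PLAIN auth)
bootstrap.servers=<your-cluster>.confluent.cloud:9092
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="<cluster-api-key>" password="<cluster-api-secret>";

# Schema Registry (for Avro serialization)
schema.registry.url=https://<your-schema-registry-endpoint>
basic.auth.credentials.source=USER_INFO
basic.auth.user.info=<sr-api-key>:<sr-api-secret>
```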

## Now let's see things in action!

Now that we have all the components in place, we can test the end-to-end functionality. Let's build and run the function code by starting the local Functions runtime host from the cloned folder:
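With the Core Tools from the prerequisites installed, starting the host is a single command from the cloned folder. The sketch below only prints the commands rather than executing them, since `func start` blocks until the host is stopped (the folder path is the sample's location in the repository):

```shell
# Change into the cloned sample and start the local Functions host.
echo "cd samples/WalletProcessing_KafkademoSample"
echo "func start"
```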

Run the WalletProducer.java to generate messages into the Kafka topic wallet\_event.

## Deploy the app to Azure Functions Premium Plan

Now you are ready to deploy this Function app to an [Azure Functions Premium Plan](https://docs.microsoft.com/en-us/azure/azure-functions/functions-premium-plan). Use the following [link](https://docs.microsoft.com/en-us/azure/azure-functions/functions-premium-plan#create-a-premium-plan) for instructions on how to first create an Azure Functions Premium plan Function app. Note the name of the Function app.
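The deployment itself can be done with the Core Tools publish command. The app name below is a placeholder for the Function app you just created, and the sketch guards the call so it is a no-op on machines without the Core Tools:

```shell
# Publish the local project to the Premium plan Function app
# (replace the placeholder with the name you noted above).
APP_NAME="<your-function-app-name>"
if command -v func >/dev/null 2>&1; then
  func azure functionapp publish "$APP_NAME"
else
  echo "Azure Functions Core Tools (func) not found on PATH"
fi
```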

Finally, you can head over to the portal and, for example, use the Live Metrics view to monitor the function app.

## Clean up the resources

Once you're done, delete the services so that you do not incur unwanted costs. If they are in the same resource group, simply [deleting the resource group](https://docs.microsoft.com/azure/azure-resource-manager/management/delete-resource-group?tabs=azure-portal&WT.mc_id=data-14444-abhishgu#delete-resource-group) will suffice. You can also delete the resources (Confluent Cloud organization, Azure functions) individually.

## References

[Apache Kafka bindings for Azure Functions | Microsoft Docs](https://docs.microsoft.com/en-us/azure/azure-functions/functions-bindings-kafka?tabs=in-process%2Cportal&pivots=programming-language-csharp)