
Commit 6c2981b

fixing the # issue
1 parent 347e65e commit 6c2981b

File tree

  • samples/WalletProcessing_KafkademoSample

1 file changed: +15 −13 lines changed

samples/WalletProcessing_KafkademoSample/ReadMe.md

Lines changed: 15 additions & 13 deletions

@@ -19,7 +19,8 @@ The flow of the application is captured in this diagram:
 
 ![alt text](img/arch_diag.png)
 
-Pre-requisites:
+## Pre-requisites
+
 1. An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio).
 2. The [Azure Functions Core Tools](https://docs.microsoft.com/en-us/azure/azure-functions/functions-run-local#v2) version 4.x.
 3. The [Azure CLI](https://docs.microsoft.com/en-us/cli/azure/install-azure-cli) version 2.4 or later.
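
A quick way to confirm the pre-requisites above before continuing is to check the tool versions from a terminal; a minimal sketch, assuming the tools are already on the PATH:

```bash
# Verify the tooling called out in the pre-requisites.
func --version   # Azure Functions Core Tools, expect 4.x
az --version     # Azure CLI, expect 2.4 or later
az login         # sign in to the Azure subscription that will host the resources
```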

@@ -31,13 +32,13 @@ Pre-requisites:
 - [Create an instance of Apache Kafka and the topics needed in a Confluent Cloud managed Kafka cluster](https://docs.microsoft.com/en-us/azure/partner-solutions/apache-kafka-confluent-cloud/create)
 - [Create an instance of Azure functions](https://docs.microsoft.com/en-us/azure/azure-functions/functions-premium-plan)
 
-# Build and deploy the producer and consumer applications to Azure functions
+## Build and deploy the producer and consumer applications to Azure functions
 
 1. [Setup schema registry for the wallet\_event topic](https://docs.confluent.io/cloud/current/get-started/schema-registry.html#quick-start-for-schema-management-on-ccloud).
 2. Setup a producer code to write Avro serialized messages to a Kafka topic.
 3. Setup Kafka trigger apps to track the events arriving at various Kafka topics.
 
-# Execute the Kafka producer to generate events
+## Execute the Kafka producer to generate events
 
 1. Run the producer program to produce serialized messages into the Kafka topic
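
For context on steps 2 and 3 in this hunk: the producer is the WalletProducer.java program referenced later in the README, and it writes Avro-serialized records to the wallet\_event topic through the Confluent Cloud Schema Registry. A minimal, self-contained sketch of such a producer follows; the class name, record fields, and all angle-bracket values are illustrative assumptions, not the sample's actual schema or configuration.

```java
import java.util.Properties;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class WalletProducerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Confluent Cloud cluster connection (placeholders for the API key/secret created in the portal).
        props.put("bootstrap.servers", "<BOOTSTRAP_SERVER>");
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                        + "username=\"<CLUSTER_API_KEY>\" password=\"<CLUSTER_API_SECRET>\";");
        // Avro serialization backed by the Confluent Cloud Schema Registry.
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "<SCHEMA_REGISTRY_URL>");
        props.put("basic.auth.credentials.source", "USER_INFO");
        props.put("basic.auth.user.info", "<SR_API_KEY>:<SR_API_SECRET>");

        // Hypothetical wallet event schema, used only to illustrate the Avro flow.
        Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"WalletEvent\",\"fields\":["
                        + "{\"name\":\"walletId\",\"type\":\"string\"},"
                        + "{\"name\":\"amount\",\"type\":\"double\"}]}");

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            GenericRecord event = new GenericData.Record(schema);
            event.put("walletId", "wallet-001");
            event.put("amount", 42.0);
            // Serialized against the registered schema and published to the topic from step 1.
            producer.send(new ProducerRecord<>("wallet_event", "wallet-001", event));
            producer.flush();
        }
    }
}
```

The Kafka trigger apps in step 3 consume these events on the Functions side; their broker and credential settings come from the local.settings.json file that the "Update the code" section below points at the same cluster.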


@@ -77,15 +78,16 @@ Here is how the header data looks:
 
 ![alt text](img/img11.png)
 
-**Clean Up**
+## Clean Up
 
 1. Delete the resources (Azure function's instance – Kafka trigger apps, Kafka clusters hosted on Confluent Cloud – topics, schema registry)
 
 The next section provides a summary of how to set up the Azure services and infrastructure to build the entire solution. References to the documentation and other resources are provided where applicable.
 
-# Setup the Kafka cluster using Confluent Cloud on Azure offering
+## Setup the Kafka cluster using Confluent Cloud on Azure offering
 
-Provision a [Confluent Cloud cluster on Azure Marketplace](https://docs.microsoft.com/azure/partner-solutions/apache-kafka-confluent-cloud/create?WT.mc_id=data-28802-abhishgu).
+Provision a [Confluent Cloud cluster on Azure Marketplace](https://docs.microsoft.com/azure/partner-solutions/apache-kafka-confluent-cloud/create?WT.mc_id=data-28802-abhishgu).
+
 **Provide configuration details** for creating a Confluent Cloud organization on Azure. Post completion of creation steps a Confluent organization is created which can be identified through Azure portal and using single sign-on url you can directly login to Confluent cloud UI portal.
 
 ![alt text](img/img12.png)

@@ -94,13 +96,13 @@ Create a new Kafka cluster in the default environment or create a new environmen
 
 ![alt text](img/img13.png)
 
-# Setup the schema registry
+## Setup the schema registry
 
 Confluent Cloud [Schema Registry](https://docs.confluent.io/cloud/current/get-started/schema-registry.html#quick-start-for-schema-management-on-ccloud) helps to manage schemas in Confluent Cloud. Enable schema registry once you login to the cluster. Select the schema registry region closer to the cluster. To use Confluent Cloud Schema Registry for managing Kafka clusters, you need an API key specific to Schema Registry. Click the **Schema Registry** tab, then click **Edit** on the **API credentials** card. Create a new set of API key and secret and note down the values.
 
 ![alt text](img/img14.png)
 
-# Create the topics
+## Create the topics
 
 Create new [Kafka Topics](https://docs.confluent.io/cloud/current/client-apps/topics/manage.html) as follows using the default topic settings.
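
The topics can also be created from the Confluent CLI rather than the Cloud UI; a sketch for the wallet\_event topic (the only topic named in this excerpt), using default settings and placeholder environment/cluster IDs:

```bash
confluent login                              # authenticate against Confluent Cloud
confluent environment use <ENVIRONMENT_ID>   # select the environment created earlier
confluent kafka cluster use <CLUSTER_ID>     # select the Kafka cluster
confluent kafka topic create wallet_event    # default topic settings
```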


@@ -118,7 +120,7 @@ Here is the view of the topic wallet\_event once the schema is mapped:
 
 Create a new API Key and Secret which can be found under the Cluster-> Data Integration -> API keys - note these values
 
-# Update the code
+## Update the code
 
 - Clone [this repository](https://github.com/Azure/azure-functions-kafka-extension/tree/dev/samples/WalletProcessing_KafkademoSample) using Git to a folder.
 - Update the local.settings.json file to point to your Kafka cluster that you set up in the previous step.
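
As a rough sketch of what "point local.settings.json to your Kafka cluster" involves: Kafka-extension samples usually keep the broker address and the cluster/Schema Registry credentials in the Values section so the trigger bindings can reference them. The key names below, and the Java worker runtime, are assumptions to reconcile with the file in the cloned repository rather than the sample's confirmed settings.

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "java",
    "BrokerList": "<BOOTSTRAP_SERVER>",
    "KafkaUserName": "<CLUSTER_API_KEY>",
    "KafkaPassword": "<CLUSTER_API_SECRET>",
    "SchemaRegistryUrl": "<SCHEMA_REGISTRY_URL>"
  }
}
```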

@@ -142,7 +144,7 @@ Update the producer-config.properties file. These values will be leveraged by th
 
 Update the **schemaRegistryUrl** in pom.xml with the API endpoint of the Schema registry.
 
-# Now lets see things in action!!
+## Now lets see things in action!!
 
 Now that we have all the components in place, we can test the end-to-end functionality. Lets build and run the function's code by starting the local Functions runtime host from the folder which has been cloned :
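
The producer-config.properties file mentioned in this hunk's header typically mirrors the connection values used by the producer; a sketch using standard Kafka/Confluent client property names (the exact keys in the sample's file may differ):

```properties
# Confluent Cloud cluster connection (placeholders).
bootstrap.servers=<BOOTSTRAP_SERVER>
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="<CLUSTER_API_KEY>" password="<CLUSTER_API_SECRET>";

# Schema Registry endpoint and credentials (the same endpoint goes into schemaRegistryUrl in pom.xml).
schema.registry.url=<SCHEMA_REGISTRY_URL>
basic.auth.credentials.source=USER_INFO
basic.auth.user.info=<SR_API_KEY>:<SR_API_SECRET>
```

With these files updated, the local Functions runtime referred to at the end of the hunk is typically started from the cloned folder with the Core Tools (`func start`) once the project has been built.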


@@ -159,7 +161,7 @@ Run the WalletProducer.java to generate messages into the Kafka topic wallet\_ev
 
 ![alt text](img/img22.png)
 
-# Deploy the app to Azure Functions Premium Plan
+## Deploy the app to Azure Functions Premium Plan
 
 Now you are ready to deploy this Function app to a [Azure Functions Premium Plan](https://docs.microsoft.com/en-us/azure/azure-functions/functions-premium-plan). Use the following [link](https://docs.microsoft.com/en-us/azure/azure-functions/functions-premium-plan#create-a-premium-plan) for instructions on how to first create an Azure Functions Premium plan Function app. Note the name of the Function app.
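
As a sketch of the deployment flow with the Azure CLI and Core Tools (resource names are placeholders, and the Java runtime is an assumption about the function app's language; the linked documentation remains the authoritative walkthrough):

```bash
# Create an Elastic Premium plan, create a function app on it, then publish the project.
az functionapp plan create --resource-group <RESOURCE_GROUP> --name <PLAN_NAME> \
  --location <REGION> --sku EP1
az functionapp create --resource-group <RESOURCE_GROUP> --name <APP_NAME> \
  --plan <PLAN_NAME> --runtime java --functions-version 4 \
  --storage-account <STORAGE_ACCOUNT>
func azure functionapp publish <APP_NAME>
```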


@@ -201,11 +203,11 @@ Finally, you can head over to the portal and for example use the [Live Metrics v
 
 ![alt text](img/img28.png)
 
-# Clean up the resources
+## Clean up the resources
 
 Once you're done, delete the services so that you do not incur unwanted costs. If they are in the same resource group, simply [deleting the resource group](https://docs.microsoft.com/azure/azure-resource-manager/management/delete-resource-group?tabs=azure-portal&WT.mc_id=data-14444-abhishgu#delete-resource-group) will suffice. You can also delete the resources (Confluent Cloud organization, Azure functions) individually.
 
-# References
+## References
 
 [Apache Kafka bindings for Azure Functions | Microsoft Docs](https://docs.microsoft.com/en-us/azure/azure-functions/functions-bindings-kafka?tabs=in-process%2Cportal&pivots=programming-language-csharp)
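
If all of the Azure resources were created in one resource group, the clean-up step in this hunk reduces to a single CLI call (the resource group name is a placeholder); the Confluent Cloud organization and its topics are removed separately on the Confluent side.

```bash
# Delete the resource group and everything in it, without prompting.
az group delete --name <RESOURCE_GROUP> --yes --no-wait
```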
