Kafka Streams Quick Start for Confluent Platform

Confluent for VS Code provides project scaffolding for many different Apache Kafka® clients, including Kafka Streams. The generated project has everything you need to compile and run a simple Kafka Streams application that you can extend with your code.

This guide shows you how to build a Kafka Streams application that connects to a Kafka cluster. You’ll learn how to:

  • Create a Kafka Streams project using Confluent for VS Code
  • Process streaming data with Kafka Streams operations
  • Run your application in a Docker container

Confluent for VS Code generates a project for a Kafka Streams application that consumes messages from an input topic and produces messages to an output topic by using the following code.

builder.stream(INPUT_TOPIC, Consumed.with(stringSerde, stringSerde))
       .peek((k, v) -> LOG.info("Received raw event: {}", v))
       .mapValues(value -> generateEnrichedEvent())
       .peek((k, v) -> LOG.info("Generated enriched event: {}", v))
       .to(OUTPUT_TOPIC, Produced.with(stringSerde, stringSerde));
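
Boilerplate like the following typically surrounds such a topology in a Kafka Streams application. This is a hedged sketch, not the generated file: the application ID and bootstrap address shown here are illustrative placeholders, and the actual boilerplate lives in the generated KafkaStreamsApplication.java.

```java
// Sketch only: configuration values are illustrative, not the generated ones.
Properties props = new Properties();
props.put(StreamsConfig.APPLICATION_ID_CONFIG, "kafka-streams-simple-example");
props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "vscode-confluent-local-broker-1:45579");

StreamsBuilder builder = new StreamsBuilder();
Serde<String> stringSerde = Serdes.String();
// ... the five-line topology shown above goes here ...

KafkaStreams streams = new KafkaStreams(builder.build(), props);
Runtime.getRuntime().addShutdownHook(new Thread(streams::close)); // close cleanly on exit
streams.start();
```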

Tip

In this guide, you run shell commands for tasks like starting Docker containers, and you can run these commands in VS Code’s Terminal pane. For more information, see Terminal Basics.

Prerequisites

  • Confluent for VS Code: Follow the steps in Installation
  • Docker installed and running in your development environment

Step 1: Create the Kafka Streams project

Create the Kafka Streams project by using the Kafka Streams Application template and filling in a form with the required parameters.

Open the template in VS Code directly

To go directly to the Kafka Streams Application template in VS Code, click this button:

Open template in VS Code

The Kafka Streams Application form opens.

Skip the manual steps and proceed to Step 2: Fill in the template form.

Open the template in VS Code manually

Follow these steps to open the Kafka Streams Application template manually.

  1. Open VS Code.

  2. In the Activity Bar, click the Confluent icon. If you have many extensions installed, you may need to open the Additional Views menu and select Confluent from it.

  3. In the extension’s Side Bar, locate the Support section and click Generate Project from Template.

    The palette opens and shows a list of available project templates.

  4. Click Kafka Streams Application.

    The Kafka Streams Application template opens.

Step 2: Fill in the template form

The project needs a few parameters to connect with your Kafka cluster.

  1. In the Kafka Streams Application form, provide the following values.
    • Kafka Bootstrap Server: One or more comma-separated host and port pairs that represent the addresses where Kafka brokers accept client bootstrap requests. Leave this field blank, because you add the host:port string in a later step.
    • Kafka Cluster API Key: Leave this field blank.
    • Kafka Cluster API Secret: Leave this field blank.
    • Input Topic: The name of a topic that the Kafka Streams application consumes messages from. Enter input_topic. You create this topic in a later step.
    • Output Topic: The name of a topic that the Kafka Streams application produces messages to. Enter output_topic. You create this topic in a later step.
  2. Click Generate & Save, and in the save dialog, navigate to the directory in your development environment where you want to save the project files and click Save to directory.

Confluent for VS Code generates the project files.

  • The Kafka Streams code is saved in the src/main/java/examples directory, in a file named KafkaStreamsApplication.java.
  • A docker-compose.yml file declares how to build the Kafka Streams code.
  • Configuration settings, like bootstrap.servers, are saved in a file named config.properties.
  • Secrets, like the Kafka cluster API key, are saved in a file named .env.
  • A README.md file has instructions for compiling and running the project.

Step 3: Create topics

Confluent for VS Code lets you create Kafka topics directly in VS Code.

  1. In the extension’s Side Bar, open Local in the Resources section and click cluster-local.

    The Topics section refreshes, and the cluster’s topics are listed.

  2. In the Topics section, click the plus icon to create a new topic.

    The palette opens with a text box for entering the topic name.

  3. In the palette, enter input_topic. Press ENTER to confirm the default settings for the partition count and replication factor properties.

    The new topic appears in the Topics section.

  4. Repeat the previous steps for another new topic named output_topic.

Step 4: Configure the project

The Kafka Streams project runs in a Docker container and requires some configuration to communicate with the container that’s running the Kafka broker, which is named vscode-confluent-local-broker-1.

  1. In the project’s docker-compose.yml file, declare an external connection to the Kafka container’s network, which is named vscode-confluent-local-network.

    services:
      kafka-streams-app:
        build:
          context: .
          dockerfile: Dockerfile
        container_name: kafka-streams-app
        volumes:
          - ./build:/app/build
          - ./config.properties:/app/config.properties
          - ./.env:/app/.env
        environment:
          LOG_LEVEL: INFO
        networks:
          - vscode-confluent-local-network
    
    networks:
      vscode-confluent-local-network:
        external: true
    
  2. In the project’s .env file, comment out the following line:

    # sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username='' password='';
    
  3. Run the following command to get the port number for the Kafka bootstrap server from the container.

    docker container inspect vscode-confluent-local-broker-1 | grep -i KAFKA_ADVERTISED_LISTENERS
    

    Your output should resemble:

    "KAFKA_ADVERTISED_LISTENERS=PLAINTEXT://vscode-confluent-local-broker-1:45579,PLAINTEXT_HOST://localhost:38177",
    

    In the current example, the bootstrap server host:port setting is vscode-confluent-local-broker-1:45579.
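
The PLAINTEXT listener is the one to use here, because the Kafka Streams container reaches the broker over the shared Docker network; the PLAINTEXT_HOST listener is the address for clients running on your host machine. If you’d rather extract the value programmatically, here is a small self-contained sketch (the listener string is the example output above; the port numbers change on every broker start):

```java
public class BootstrapFromListeners {
    public static void main(String[] args) {
        // Example value from `docker container inspect`; your ports differ per run.
        String env = "KAFKA_ADVERTISED_LISTENERS="
            + "PLAINTEXT://vscode-confluent-local-broker-1:45579,"
            + "PLAINTEXT_HOST://localhost:38177";

        // Keep the PLAINTEXT listener: that's the in-network address the
        // Kafka Streams container uses. PLAINTEXT_HOST is for host clients.
        String value = env.split("=", 2)[1];
        for (String listener : value.split(",")) {
            if (listener.startsWith("PLAINTEXT://")) {
                System.out.println(listener.substring("PLAINTEXT://".length()));
            }
        }
    }
}
```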

  4. Edit the config.properties file and change these settings.

    • Set bootstrap.servers to the Kafka container’s name and port, for example, vscode-confluent-local-broker-1:45579.
    • Set security.protocol to PLAINTEXT.
    • Remove the sasl.mechanism=PLAIN line.

    Your config.properties file should resemble the following.

    bootstrap.servers=vscode-confluent-local-broker-1:45579
    security.protocol=PLAINTEXT
    client.id=vscode-kafka-streams-simple-example-...
    

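The application typically reads these settings with java.util.Properties. As a quick sanity check on the file format, this stand-alone sketch parses the same key=value syntax; the keys and values mirror the example above (in the real application, the properties are loaded from the config.properties path instead of an in-memory string):

```java
import java.io.StringReader;
import java.util.Properties;

public class ConfigCheck {
    public static void main(String[] args) throws Exception {
        // Mirrors the config.properties contents from Step 4; the port is per-run.
        String cfg = String.join("\n",
            "bootstrap.servers=vscode-confluent-local-broker-1:45579",
            "security.protocol=PLAINTEXT");

        Properties props = new Properties();
        props.load(new StringReader(cfg));
        System.out.println(props.getProperty("security.protocol"));
    }
}
```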
Step 5: Compile and run the project

Your Kafka Streams project is ready to build and run in a Docker container.

  1. In your terminal, navigate to the directory where you saved the project.

  2. The Confluent for VS Code extension saves the project files in a subdirectory named kafka-streams-simple-example. Run the following command to navigate to this directory.

    cd kafka-streams-simple-example
    
  3. Run the following command to build and run the Kafka Streams application.

    docker compose up --build
    

    Docker downloads the required images and starts a container that compiles the project.

Step 6: Produce messages to the input topic

Confluent for VS Code lets you produce messages to Kafka topics from within VS Code.

In this step, you create a file that has an example message that you send to the input topic.

  1. Copy the following example message into a file named test-message.json and save the file.

    {
      "headers": [
        {
          "key": "task.generation",
          "value": "350"
        },
        {
          "key": "task.id",
          "value": "0"
        },
        {
          "key": "current.iteration",
          "value": "39067914"
        }
      ],
      "key": 39067914,
      "value": {
        "id": "123e4567-e89b-12d3-a456-426614174000",
        "timestamp": 1638360000000,
        "customer": {
          "name": "John Smith",
          "email": "john.smith@example.com",
          "address": "123 Main St, Suite 456, Anytown, ST 12345",
          "phone": "(555) 123-4567"
        },
        "order": {
          "orderId": "AB123456",
          "product": "Ergonomic Steel Keyboard",
          "price": "199.99",
          "department": "Electronics",
          "quantity": 2
        }
      }
    }
    
  2. In the extension’s Side Bar, hover over input_topic and click Send Message(s) to Topic.

    The palette opens with a text box for entering the path to the message file.

  3. In the palette, navigate to the test-message.json file, click Select a File, and click OK.

    A notification reports that you have successfully produced a message to input_topic.

  4. In the Side Bar, hover over input_topic and click View Messages.

    The message viewer opens and shows the message you sent in the previous step.

  5. In the Side Bar, hover over output_topic and click View Messages.

    The message viewer opens and shows the message that was produced by the Kafka Streams application in response to the message sent to input_topic.

Step 7: Clean up

To clean up your development environment, stop the Docker containers that are running the Kafka Streams application and the Kafka broker.

  1. In the terminal, run the following command to stop the Kafka Streams application.

    docker compose down
    

    Your output should resemble:

    ✔ Container kafka-streams-app  Removed
    
  2. In the extension’s Side Bar, click Stop Resources on the Local resource.

    A notification appears reporting that 1 Kafka container is stopping.

To resume this quick start, click Start Local Resources to run the Docker container. Your Kafka broker starts again, with your input and output topics intact.