CCDAK - Confluent Certified Developer for Apache Kafka Practice Questions

Serkan SAKINMAZ
Apr 25, 2021
CCDAK Confluent Certified Developer for Apache Kafka

CCDAK is one of the most popular exams for Apache Kafka developers. In this section, I have listed some example questions.

Question 1

Kafka Connect can be run in which of these modes? (Select two options)

  • Distributed Mode
  • Vertical mode
  • Batch mode
  • Standalone mode

Answer 1

Kafka Connect can be run in standalone mode and distributed mode. Standalone mode is useful for developing and testing Kafka Connect on a local machine.

Distributed mode runs Connect workers on multiple machines (nodes).
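
As a rough sketch (the .properties file names here are placeholders), a standalone worker is started by passing a worker config plus one or more connector configs to connect-standalone.sh, while a distributed worker is started with only a worker config and receives its connectors through the Connect REST API:

connect-standalone.sh worker.properties my-connector.properties

connect-distributed.sh worker.properties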

Question 2

Adding a field without a default value is a ….. compatible change

  • Backward
  • Forward
  • Full
  • None

Answer 2

Adding a field without a default value is a forward-compatible change (as is deleting an optional field, i.e. a field that has a default value).
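
If you use Confluent Schema Registry, you can test whether a new schema is compatible with the latest registered version before using it. A minimal sketch, assuming a Schema Registry on localhost:8081 and a subject named test-value (both hypothetical), checking a User record to which an age field without a default has been added:

curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" \
--data '{"schema": "{\"type\": \"record\", \"name\": \"User\", \"fields\": [{\"name\": \"name\", \"type\": \"string\"}, {\"name\": \"age\", \"type\": \"int\"}]}"}' \
"http://localhost:8081/compatibility/subjects/test-value/versions/latest"

The response reports whether the change is compatible under the subject's configured compatibility level.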

Question 3

In order to push data from a source to Kafka, you need to implement

  • Kafka Producer
  • Kafka Consumer
  • Kafka Connect Sink
  • Kafka API

Answer 3

The correct answer is to implement a Kafka Producer. The Kafka Producer is a Kafka client that publishes records to the Kafka cluster.

Question 4

To export data from Kafka to S3, which Kafka connector do you need to use?

  • Amazon S3 source connector
  • Amazon S3 Sink connector
  • Kafka Streams S3 Connector
  • CDC Connector

Answer 4

You can use the Kafka Connect Amazon S3 sink connector to export data from Apache Kafka® topics to S3 objects in either Avro, JSON, or Bytes formats.

https://docs.confluent.io/current/connect/kafka-connect-s3/index.html
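
A minimal sketch of registering the S3 sink connector through the Kafka Connect REST API (the connector name, topic, bucket, and region below are placeholders; see the documentation above for the full list of settings):

curl -X POST -H "Content-Type: application/json" \
--data '{"name": "s3-sink", "config": {"connector.class": "io.confluent.connect.s3.S3SinkConnector", "topics": "my-topic", "s3.bucket.name": "my-bucket", "s3.region": "us-east-1", "storage.class": "io.confluent.connect.s3.storage.S3Storage", "format.class": "io.confluent.connect.s3.format.json.JsonFormat", "flush.size": "1000"}}' \
"http://localhost:8083/connectors"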

Question 5

Which protocol is used in Kafka?

  • UDP
  • TTLS
  • TCP
  • HTTP

Answer 5

Kafka uses a binary protocol over TCP

Question 6

What is the command to produce a message to Kafka from console?

  • kafka-topics.sh --zookeeper localhost:9092 --topic my-topic
  • kafka-topics.sh --broker-list localhost:9092 --topic my-topic
  • kafka-console-producer.sh --broker-list localhost:9092 --topic my-topic
  • kafka-console-consumer.sh --broker-list localhost:9092 --topic my-topic --from-beginning

Answer 6

In order to publish a message to Kafka, you need to use

kafka-console-producer.sh --broker-list localhost:9092 --topic my-topic
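
Each line typed on the console is then sent to the topic as a separate message. Note that on newer Kafka versions the console producer accepts --bootstrap-server instead of --broker-list, so the equivalent command would be:

kafka-console-producer.sh --bootstrap-server localhost:9092 --topic my-topic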

Question 7

In order to create a topic with 3 partitions, you need to execute

  • kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 3 --topic test
  • kafka-topics.sh --create --zookeeper localhost:9092 --replication-factor 1 --partitions 3 --topic test
  • kafka-producer-topics.sh --create --zookeeper localhost:9092 --replication-factor 1 --partitions 3 --topic test
  • kafka-topics.sh --create --zookeeper localhost:9092 --replication 1 --partitions 3 --topic test

Answer 7

kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 3 --topic test

The --partitions parameter is used to define the number of partitions. In addition, you need to use --bootstrap-server to define the list of brokers to connect to.
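
You can then verify the partition count by describing the topic:

kafka-topics.sh --describe --bootstrap-server localhost:9092 --topic test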

Question 8

What are the default values of the replication factor and the number of partitions? (Select two answers)

  • The default replication factor for new topics is 1
  • The default partition number is 1
  • The default replication factor for new topics is 3
  • The default partition number is 3

Answer 8

For partitions, the default value is 1 (the broker config num.partitions=1). The default replication factor is also 1 (the broker config default.replication.factor=1).
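
These defaults apply to automatically created topics and come from the broker configuration, typically server.properties. A minimal sketch of the relevant lines:

# default number of partitions for auto-created topics
num.partitions=1
# default replication factor for auto-created topics
default.replication.factor=1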

Question 9

In order to send data to Kafka without implementing any producer code, you need to use

  • Kafka Producer
  • Kafka Rest Proxy
  • Kafka Sink
  • Kafka API

Answer 9

The Kafka REST Proxy allows you to produce and consume messages over HTTP without writing any producer or consumer code.

Example of producing a message:

$ curl -X POST -H "Content-Type: application/vnd.kafka.avro.v1+json" \
--data '{"value_schema": "{\"type\": \"record\", \"name\": \"User\", \"fields\": [{\"name\": \"name\", \"type\": \"string\"}]}", "records": [{"value": {"name": "testUser"}}]}' \
"http://localhost:8082/topics/avrotest"
{"offsets":[{"partition":0,"offset":0,"error_code":null,"error":null}],"key_schema_id":null,"value_schema_id":21}

Question 10

Which of the sentences below explains at-least-once semantics?

  • Once the message is processed properly, the consumer is going to send an acknowledgement
  • Once the message is received by the consumer, acknowledgement is sent accordingly
  • The message must be delivered only once and no message should be lost

Answer 10

Once the message is processed properly, the consumer sends an acknowledgement.

Consumers will receive and process every message, but they may process the same message more than once.
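
In practice this means the consumer commits offsets only after the records have been processed. A minimal sketch of the relevant consumer setting (using the standard consumer configuration property):

# disable automatic offset commits so the application decides when to commit
enable.auto.commit=false

The application then commits offsets (for example with commitSync()) after processing each batch; if it crashes before the commit, the same records are delivered again on restart, which is why duplicates are possible.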

Do you want more questions like this?

CCDAK — Certified Developer Apache Kafka Practice Test

3 tests with 150 exam questions to prepare for the Confluent Certified Developer for Apache Kafka certification

Go to the course with this link!
