
Confluent CCDAK: Confluent Certified Developer for Apache Kafka Certification Examination Practice Test

Total 61 questions

Confluent Certified Developer for Apache Kafka Certification Examination Questions and Answers

Question 1

Which two statements about Kafka Connect Single Message Transforms (SMTs) are correct?

(Select two.)

Options:

A.

Multiple SMTs can be chained together and act on source or sink messages.

B.

SMTs are often used to join multiple records from a source data system into a single Kafka record.

C.

Masking data is a good example of an SMT.

D.

SMT functionality is included within Kafka Connect converters.
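For reference, SMTs are configured per connector and can be chained in order; below is a minimal sketch of a sink connector configuration that masks a field and then inserts a timestamp. The connector name, topic, and field names are illustrative, and the connector class assumes Confluent's S3 sink is installed:

    name=orders-s3-sink
    connector.class=io.confluent.connect.s3.S3SinkConnector
    topics=orders
    transforms=mask,insertTs
    transforms.mask.type=org.apache.kafka.connect.transforms.MaskField$Value
    transforms.mask.fields=credit_card
    transforms.insertTs.type=org.apache.kafka.connect.transforms.InsertField$Value
    transforms.insertTs.timestamp.field=ingested_at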

Question 2

Which two statements are correct about transactions in Kafka?

(Select two.)

Options:

A.

All messages from a failed transaction will be deleted from a Kafka topic.

B.

Transactions are only possible when writing messages to a topic with a single partition.

C.

Consumers can consume both committed and uncommitted transactions.

D.

Information about producers and their transactions is stored in the __transaction_state topic.

E.

Transactions guarantee at least once delivery of messages.
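As background, a minimal transactional producer sketch (the topic, key, value, and transactional.id below are illustrative). Note that records from an aborted transaction remain in the log; consumers configured with isolation.level=read_committed simply skip them, and producer/transaction state is tracked in the internal __transaction_state topic:

    import java.util.Properties;
    import org.apache.kafka.clients.producer.*;
    import org.apache.kafka.common.KafkaException;
    import org.apache.kafka.common.serialization.StringSerializer;

    Properties props = new Properties();
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
    props.put(ProducerConfig.TRANSACTIONAL_ID_CONFIG, "payments-tx-1");

    KafkaProducer<String, String> producer = new KafkaProducer<>(props);
    producer.initTransactions();
    try {
        producer.beginTransaction();
        producer.send(new ProducerRecord<>("payments", "order-42", "captured"));
        producer.commitTransaction();
    } catch (KafkaException e) {
        // Aborted records stay in the log; read_committed consumers skip them.
        producer.abortTransaction();
    } finally {
        producer.close();
    }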

Question 3

You have a Kafka client application that has real-time processing requirements.

Which Kafka metric should you monitor?

Options:

A.

Consumer lag between brokers and consumers

B.

Total time to serve requests to replica followers

C.

Consumer heartbeat rate to group coordinator

D.

Aggregate incoming byte rate
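For context, consumer lag (how far a consumer's committed offset trails a partition's log end offset) can be inspected with the stock tooling; the group name below is illustrative:

    kafka-consumer-groups --bootstrap-server localhost:9092 \
      --describe --group realtime-app

The output lists CURRENT-OFFSET, LOG-END-OFFSET, and LAG per partition; the Java consumer also exposes the records-lag-max metric over JMX.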

Question 4

You need to consume messages from Kafka using the command-line interface (CLI).

Which command should you use?

Options:

A.

kafka-console-consumer

B.

kafka-consumer

C.

kafka-get-messages

D.

kafka-consume
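For reference, a typical invocation (the broker address and topic name are illustrative):

    kafka-console-consumer --bootstrap-server localhost:9092 \
      --topic t1 --from-beginning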

Question 5

Match each configuration parameter with the correct deployment step in installing a Kafka connector.

(The configuration parameters and deployment steps to match are presented as an image in the original exam and are not reproduced here.)

Question 6

You are creating a Kafka Streams application to process retail data.

Match the input data streams with the appropriate Kafka Streams object.

(The input data streams and Kafka Streams objects to match are presented as an image in the original exam and are not reproduced here.)
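Since the matching pairs are not reproduced here, a general sketch of the three Kafka Streams abstractions (topic names are illustrative):

    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.kstream.GlobalKTable;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.KTable;

    StreamsBuilder builder = new StreamsBuilder();
    // Unbounded flow of events (e.g., purchases as they happen) -> KStream
    KStream<String, String> purchases = builder.stream("purchases");
    // Latest value per key, partitioned across app instances -> KTable
    KTable<String, String> inventory = builder.table("inventory");
    // Small reference data fully replicated to every instance -> GlobalKTable
    GlobalKTable<String, String> products = builder.globalTable("product-catalog");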

Question 7

What is accomplished by producing data to a topic with a message key?

Options:

A.

Messages with the same key are routed to a deterministically selected partition, enabling order guarantees within that partition.

B.

Kafka brokers allow you to add more partitions to a given topic, without impacting the data flow for existing keys.

C.

It provides a mechanism for encrypting messages at the partition level to ensure secure data transmission.

D.

Consumers can filter messages in real time based on the message key without processing unrelated messages.
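As a sketch (topic and key are illustrative, producer setup omitted): with the default partitioner, a non-null key is hashed with murmur2 and mapped to a partition, so every record carrying the same key lands in the same partition and keeps its relative order there.

    // All records keyed "customer-42" go to the same partition of "orders".
    producer.send(new ProducerRecord<>("orders", "customer-42", "order-created"));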

Question 8

Which partition assignment minimizes partition movements between two assignments?

Options:

A.

RoundRobinAssignor

B.

StickyAssignor

C.

RangeAssignor

D.

PartitionAssignor
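For reference, the assignor is selected through consumer configuration; a minimal sketch:

    props.put(ConsumerConfig.PARTITION_ASSIGNMENT_STRATEGY_CONFIG,
              "org.apache.kafka.clients.consumer.StickyAssignor");

The CooperativeStickyAssignor applies the same sticky placement while allowing incremental (cooperative) rebalancing.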

Question 9

Your Kafka cluster has five brokers. The topic t1 on the cluster has:

    Two partitions

    Replication factor = 4

    min.insync.replicas = 3

You need strong durability guarantees for messages written to topic t1. You configure the producer with acks=all, and all the replicas for t1 are in sync.

How many brokers need to acknowledge a message before it is considered committed?

Options:

A.

2

B.

3

C.

4

D.

5
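As background on the semantics involved: with acks=all the leader waits for every replica currently in the in-sync replica set to acknowledge, while min.insync.replicas only sets the floor below which the leader rejects writes. Here all four replicas of each partition are in sync, so every one of them must acknowledge before the message counts as committed.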

Question 10

You have a topic t1 with six partitions. You use Kafka Connect to send data from topic t1 in your Kafka cluster to Amazon S3. Kafka Connect is configured for two tasks.

How many partitions will each task process?

Options:

A.

2

B.

3

C.

6

D.

12
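As background: the Connect framework spreads a sink connector's topic partitions as evenly as possible across its tasks, so six partitions divided between two tasks works out to three partitions per task. A sketch of the relevant connector settings:

    tasks.max=2
    topics=t1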

Question 11

What is the default maximum size of a message the Apache Kafka broker can accept?

Options:

A.

1MB

B.

2MB

C.

5MB

D.

10MB
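For reference, the broker-side limit is the message.max.bytes setting, which defaults to roughly 1 MB (1048588 bytes in recent Kafka versions, including record batch overhead); it can be overridden per topic via max.message.bytes.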

Question 12

You are building a system for a retail store selling products to customers.

Which three datasets should you model as a GlobalKTable?

(Select three.)

Options:

A.

Inventory of products at a warehouse

B.

All purchases at a retail store occurring in real time

C.

Customer profile information

D.

Log of payment transactions

E.

Catalog of products
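As a rule of thumb: a GlobalKTable is fully replicated to every application instance, so it fits relatively small, slowly changing reference data used for lookups and joins, whereas high-volume real-time event feeds are better modeled as a KStream (see the sketch under Question 6).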

Question 13

Match the testing tool with the type of test it is typically used to perform.

(The testing tools and test types to match are presented as an image in the original exam and are not reproduced here.)

Question 14

Which statement describes the storage location for a sink connector’s offsets?

Options:

A.

The __consumer_offsets topic, like any other consumer

B.

The topic specified in the offset.storage.topic configuration parameter

C.

In a file specified by the offset.storage.file.filename configuration parameter

D.

In memory which is then periodically flushed to a RocksDB instance
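For background: a sink connector consumes from Kafka through an ordinary consumer group (named connect-<connector-name> by default), so its progress is tracked like any other consumer's. By contrast, offset.storage.topic and offset.storage.file.filename hold source connector offsets, in distributed and standalone mode respectively.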

Question 15

A producer is configured with the default partitioner. It is sending records to a topic that is configured with five partitions. The records do not contain keys.

What is the result of this?

Options:

A.

Records will be dispatched among the available partitions.

B.

Records will be sent to partition 0.

C.

An error will be raised and no record will be sent.

D.

Records will be sent to the least used partition.
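For context: since Apache Kafka 2.4, keyless records are handled with sticky partitioning, filling a batch for one partition before switching to another, so records still spread across the available partitions over time (earlier versions spread them round-robin per record). A sketch, with the producer setup omitted and the topic name illustrative:

    // Null key: the default partitioner chooses the partition itself.
    producer.send(new ProducerRecord<>("events", null, "payload"));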

Question 16

Your application is consuming from a topic with one consumer group.

The number of running consumers is equal to the number of partitions.

Application logs show that some consumers are leaving the consumer group during peak time, triggering a rebalance. You also notice that your application is processing many duplicates.

You need to stop consumers from leaving the consumer group.

What should you do?

Options:

A.

Reduce max.poll.records property.

B.

Increase session.timeout.ms property.

C.

Add more consumer instances.

D.

Split consumers in different consumer groups.
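For reference, the knobs involved (values here are illustrative, not recommendations): heartbeats are sent from a background thread, and a member that misses session.timeout.ms is evicted, while a consumer that goes longer than max.poll.interval.ms between poll() calls leaves the group on its own; reducing max.poll.records shortens each processing cycle.

    props.put(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, "45000");
    props.put(ConsumerConfig.MAX_POLL_INTERVAL_MS_CONFIG, "600000");
    props.put(ConsumerConfig.MAX_POLL_RECORDS_CONFIG, "100");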

Question 17

You need to explain the best reason to implement the consumer callback interface ConsumerRebalanceListener before a consumer group rebalance takes place.

Which statement is correct?

Options:

A.

Partitions assigned to a consumer may change.

B.

Previous log files are deleted.

C.

Offsets are compacted.

D.

Partition leaders may change.
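As a sketch of the callback (topic name illustrative, consumer setup omitted): the listener gives the application a hook to commit finished work before its partitions are reassigned, and to initialize state once the new assignment arrives.

    import java.util.Collection;
    import java.util.Collections;
    import org.apache.kafka.clients.consumer.ConsumerRebalanceListener;
    import org.apache.kafka.common.TopicPartition;

    consumer.subscribe(Collections.singletonList("orders"), new ConsumerRebalanceListener() {
        @Override
        public void onPartitionsRevoked(Collection<TopicPartition> partitions) {
            // Invoked before partitions are taken away: commit offsets for
            // completed work so another consumer does not reprocess it.
            consumer.commitSync();
        }

        @Override
        public void onPartitionsAssigned(Collection<TopicPartition> partitions) {
            // Invoked after the new assignment: seek or rebuild local state here.
        }
    });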

Question 18

You create a topic named IoT-Data with 10 partitions and a replication factor of three.

A producer sends 1 MB messages compressed with Gzip.

Which two statements are true in this scenario?

(Select two.)

Options:

A.

Compression type will be stored in batch attributes.

B.

By default, compression is the producer’s responsibility.

C.

The message is already compressed so it will not be serialized.

D.

All compressed messages will be stored in the same topic partition.
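For reference: compression is applied on the producer by default (the producer's compression.type defaults to none, and the broker/topic setting defaults to "producer", meaning the broker keeps whatever codec the producer used), and the codec in use is recorded in the record batch's attributes field. A sketch of enabling it:

    props.put(ProducerConfig.COMPRESSION_TYPE_CONFIG, "gzip");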
