BigData - Interview Questions and Answers for 'Kafka' - 14 question(s) found

Q1. What is Apache Kafka?
Ans. Kafka is a distributed, partitioned, replicated commit log service. It provides the functionality of a messaging system: published messages are appended to logs, from which consumers read them.
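
For illustration, a minimal sketch of publishing a message with the Java producer client is shown below; the broker address (localhost:9092), topic name ("orders"), and key/value are assumed values, not part of the original answer.

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class SimpleProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // assumed broker address
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Each message is appended to the commit log of the "orders" topic (assumed name).
            producer.send(new ProducerRecord<>("orders", "order-1", "created"));
            producer.flush();
        }
    }
}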


Q2. What is a broker in Apache Kafka?
Ans. Kafka runs as a cluster of one or more servers, each of which is called a broker.


Q3. What is a Topic in Apache Kafka?
Ans. A topic is a category or feed name to which messages are published.
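
As a hedged illustration, a topic can also be created programmatically with the Java AdminClient; the topic name, partition count, and replication factor below are assumptions.

import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.NewTopic;

public class CreateTopic {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // assumed broker address

        try (AdminClient admin = AdminClient.create(props)) {
            // Topic "orders" with 3 partitions and replication factor 1 (assumed values).
            NewTopic topic = new NewTopic("orders", 3, (short) 1);
            admin.createTopics(Collections.singletonList(topic)).all().get();
        }
    }
}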


Q4. Have you used Kafka in your project? If yes, for what?
Ans. We used Kafka as a replacement for a JMS message queue to get better throughput. We used a simple multithreaded Java client, as the order of message consumption didn't matter to us.
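
A minimal sketch of such a multithreaded Java client is shown below, assuming a topic named "orders" and a consumer group "order-processors"; because KafkaConsumer is not thread-safe, each thread owns its own consumer instance. With all threads sharing one group.id, Kafka spreads the topic's partitions across them, which is fine when ordering does not matter.

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class MultiThreadedConsumer {
    public static void main(String[] args) {
        int threads = 3;                                   // assumed thread count
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        for (int i = 0; i < threads; i++) {
            pool.submit(() -> {
                Properties props = new Properties();
                props.put("bootstrap.servers", "localhost:9092");   // assumed broker address
                props.put("group.id", "order-processors");          // same group => partitions are shared
                props.put("key.deserializer", StringDeserializer.class.getName());
                props.put("value.deserializer", StringDeserializer.class.getName());
                try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                    consumer.subscribe(Collections.singletonList("orders"));
                    while (true) {
                        ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                        for (ConsumerRecord<String, String> record : records) {
                            System.out.println(Thread.currentThread().getName() + " -> " + record.value());
                        }
                    }
                }
            });
        }
    }
}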


Q5. Does Kafka use ZooKeeper?
Ans. Yes. Kafka uses ZooKeeper for cluster coordination, for example to track broker membership and elect partition leaders.


Q6. Can we have multiple threads consuming the message stream from a single partition?
Ans. Yes, by having multiple Consumer Groups.


Q7. Can we have multiple threads consuming messages from a single partition if we have a single Consumer Group?
Ans. No. Within a single Consumer Group, a partition is consumed by at most one thread.


Q8. If we have more threads than partitions in a Kafka consumer, how can we model it efficiently?
Ans. We have to use multiple consumer groups in that case, as the extra threads would remain idle with a single consumer group. A more sophisticated algorithm may be required across groups if we have to preserve the order of consumption.
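
A hedged sketch of the idea: consumers that share a group.id split the topic's partitions among themselves, while each additional group receives its own full copy of the messages, so extra threads can be kept busy by placing them in a second group. The broker address and group names below are assumptions.

import java.util.Properties;

public class ConsumerGroupConfigs {
    // Consumers built from configFor("billing") share that group's partition assignment;
    // consumers built from configFor("analytics") form a second group that independently
    // receives every message, allowing more consuming threads than partitions overall.
    static Properties configFor(String groupId) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // assumed broker address
        props.put("group.id", groupId);
        props.put("key.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");
        return props;
    }

    public static void main(String[] args) {
        Properties billing = configFor("billing");       // group 1: one thread per partition
        Properties analytics = configFor("analytics");   // group 2: its own copy of the stream
        System.out.println(billing.getProperty("group.id") + " / "
                + analytics.getProperty("group.id"));
    }
}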


Q9. Have you ever heard of Kafka Connect?
Ans. Yes, it's a framework (popularized by Confluent and shipped with Apache Kafka) that provides a built-in mechanism for streaming records between external data sources and Kafka topics. It provides a variety of source and sink connectors to achieve this.


Q10. What are source and sink connectors in Kafka Connect?
Ans. Source connectors pull data from an external system and publish it to Kafka topics, whereas sink connectors read data from Kafka topics and write it out to a destination system.
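
For illustration, the file connectors that ship with Apache Kafka can be configured roughly as in the standalone quickstart below: a source connector that streams lines of a file into a topic, and a sink connector that writes the topic back out to another file. The file and topic names are the quickstart defaults, shown here as assumptions.

connect-file-source.properties (source: file -> topic)
    name=local-file-source
    connector.class=FileStreamSource
    tasks.max=1
    file=test.txt
    topic=connect-test

connect-file-sink.properties (sink: topic -> file)
    name=local-file-sink
    connector.class=FileStreamSink
    tasks.max=1
    file=test.sink.txt
    topics=connect-test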


Q11. How can we make sure that we don't receive duplicate messages from a Kafka topic?
This question is still unanswered. Can you please provide an answer?


Q12. How can we make sure that a message has been consumed properly from Kafka?
Ans. We can send a confirmation (an offset commit) back to Kafka, so that Kafka advances the consumer's offset only after receiving the confirmation.
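
A minimal sketch of this confirmation pattern, assuming a topic named "orders" and a local broker: disable auto-commit and commit the offset explicitly only after the record has been processed.

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class AcknowledgingConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // assumed broker address
        props.put("group.id", "order-processors");          // assumed group id
        props.put("enable.auto.commit", "false");           // we confirm manually
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("orders"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    process(record);        // hypothetical processing step
                }
                // The "confirmation": commit offsets only after processing succeeded,
                // so an unprocessed message will be redelivered after a failure.
                consumer.commitSync();
            }
        }
    }

    private static void process(ConsumerRecord<String, String> record) {
        System.out.println("Processed " + record.key() + " = " + record.value());
    }
}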


Q13. Kafka's default delivery policy is at-least-once. What are the other policies that can be configured with Kafka?
Ans. Exactly-once and at-most-once.


Q14. What are the different delivery policies in Apache Kafka?
Ans. Exactly-once, at-least-once, and at-most-once.
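
As a hedged sketch, these semantics largely come from configuration and offset-commit placement; the settings below are illustrative assumptions rather than the only way to achieve each guarantee.

import java.util.Properties;

public class DeliverySemanticsConfigs {
    public static void main(String[] args) {
        // At-least-once: retry sends durably and commit consumer offsets only after
        // processing, accepting possible duplicates.
        Properties atLeastOnceProducer = new Properties();
        atLeastOnceProducer.put("acks", "all");
        atLeastOnceProducer.put("retries", "2147483647");

        // At-most-once: commit offsets before (or regardless of) processing,
        // accepting possible loss instead of duplicates.
        Properties atMostOnceConsumer = new Properties();
        atMostOnceConsumer.put("enable.auto.commit", "true");

        // Exactly-once (within Kafka): idempotent, transactional producer paired with
        // consumers that read only committed data.
        Properties exactlyOnceProducer = new Properties();
        exactlyOnceProducer.put("enable.idempotence", "true");
        exactlyOnceProducer.put("transactional.id", "orders-tx-1");   // assumed id
        Properties exactlyOnceConsumer = new Properties();
        exactlyOnceConsumer.put("isolation.level", "read_committed");

        System.out.println("at-least-once / at-most-once / exactly-once configs prepared");
    }
}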
