Bigdata - Interview Questions and Answers for 'At' - 23 question(s) found

 Q1. Is there any schema in MongoDB ? MongoDB
Ans. MongoDB does have a schema, but it is dynamic: it does not need to be defined before creating a collection, and documents in the same collection can have different fields.
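For illustration, a minimal sketch using the MongoDB Java driver (assuming a local mongod on the default port and the mongodb-driver-sync artifact; the database, collection, and field names are hypothetical) inserts two documents with different fields into the same collection without declaring any schema first:

import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import org.bson.Document;

public class FlexibleSchemaDemo {
    public static void main(String[] args) {
        // Assumes a local mongod on the default port
        try (MongoClient client = MongoClients.create("mongodb://localhost:27017")) {
            MongoCollection<Document> users = client.getDatabase("testdb").getCollection("users");
            // Two documents in the same collection with different fields - no schema declared up front
            users.insertOne(new Document("name", "Alice").append("age", 30));
            users.insertOne(new Document("name", "Bob").append("email", "bob@example.com"));
        }
    }
}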

Tags: mongodb, nosql, bigdata | Asked in 1 Companies


 Q2. What is the difference between NameNode and DataNode in Hadoop? BigData
Ans. The NameNode stores the metadata (e.g. the file-to-block mapping, the number of blocks, and which DataNode on which rack holds each block), whereas the DataNodes store the actual data blocks.

Tags: hadoop, at&t | Asked in 2 Companies


 Q3. What is Apache Kafka ? BigData
Ans. Kafka is a distributed, partitioned, replicated commit log service. It provides the functionality of a messaging system, with producers writing messages to these logs and consumers reading from them.

Tags: kafka, bigdata, message queue | Asked in 4 Companies


 Q4. What is a broker in Apache Kafka ?
Ans. Kafka runs as a cluster consisting of one or more servers, each of which is called a broker.

Tags: kafka, bigdata, message queue


 Q5. What is a Topic in Apache Kafka ?
Ans. A topic is a category or feed name to which messages are published.
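As a sketch of how topics and brokers fit together (the broker address, the topic name "orders", and the key/value are assumptions), a producer publishes messages to a named topic hosted by the broker cluster:

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class TopicProducerDemo {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");  // assumed local broker
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        // Messages are published to a named topic; the brokers in the cluster store and replicate its log
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("orders", "order-1", "created"));
        }
    }
}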

Tags: kafka, bigdata, message queue


 Q6. Have you used Kafka in your project ? If yes, for what ?
Ans. We were using Kafka as a replacement for a JMS message queue to get better throughput. We used a simple multi-threaded Java client, as the order of message consumption didn't matter to us.
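A minimal single-threaded version of such a consumer might look like the sketch below (the broker address, group id, and topic name are assumptions; a multi-threaded client would typically run one such consumer per thread):

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class SimpleConsumerDemo {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "order-processors");   // hypothetical consumer group
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("orders"));
            while (true) {
                // Poll the brokers for new records; ordering across partitions is not guaranteed
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.println(record.key() + " -> " + record.value());
                }
            }
        }
    }
}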

Tags: kafka, bigdata, message queue, java


 Q7. Does Kafka use ZooKeeper ?
Ans. Yes. Kafka uses ZooKeeper for coordination between brokers, e.g. for controller election and for storing cluster metadata.

Tags: kafka, bigdata, yes-no


 Q8. What is a Sequence File?

a. A Sequence File contains a binary encoding of an arbitrary number of homogeneous writable objects.
b. A Sequence File contains a binary encoding of an arbitrary number of key-value pairs. Each key must be of the same type, and each value must be of the same type.
c. A Sequence File contains a binary encoding of an arbitrary number of heterogeneous writable objects.
d. A Sequence File contains a binary encoding of an arbitrary number of WritableComparable objects, in sorted order.
Ans. A Sequence File contains a binary encoding of an arbitrary number of key-value pairs. Each key must be of the same type, and each value must be of the same type.
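A minimal sketch of writing such a file with the Hadoop API (the output path and the key/value contents are hypothetical) uses one key type and one value type for the whole file:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;

public class SequenceFileWriteDemo {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Path path = new Path("counts.seq");   // hypothetical output path
        // Every key is a Text and every value an IntWritable - one key type and one value type per file
        try (SequenceFile.Writer writer = SequenceFile.createWriter(conf,
                SequenceFile.Writer.file(path),
                SequenceFile.Writer.keyClass(Text.class),
                SequenceFile.Writer.valueClass(IntWritable.class))) {
            writer.append(new Text("apple"), new IntWritable(3));
            writer.append(new Text("banana"), new IntWritable(5));
        }
    }
}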

Tags: hadoop, bigdata


 Q9. What is the input to the Reduce function ?

a. One key and one value
b. Multiple keys and multiple associated values
c. Multiple keys, each with one associated value
d. One key and all of its associated values
Ans. One key and all of its associated values.
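In Hadoop's Java API this shows up directly in the reduce() signature; a minimal sum reducer (the class name and type choices here are illustrative) receives one key together with an Iterable over all values grouped under that key:

import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

// reduce() is invoked once per key, with an Iterable over all values grouped under that key
public class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        int sum = 0;
        for (IntWritable value : values) {
            sum += value.get();
        }
        context.write(key, new IntWritable(sum));
    }
}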

Tags: hadoop, bigdata, map-reduce, reduce function


 Q10. Which of the following is the implementation language for Map Reduce Framework ?

a. Big Data
b. Hadoop
c. Java
d. C++
Ans. Java

Tags: hadoop, bigdata, map-reduce framework


 Q11. Can we have multiple threads consuming the message stream from a single partition ?
Ans. Yes, by having multiple Consumer Groups.

Tags: apache kafka, kafka consumer, kafka topic partitions, bigdata, consumer group


 Q12. Can we have multiple threads consuming messages from a single partition if we have a single Consumer Group ?
Ans. No. Within a single Consumer Group, at most one consumer thread can be assigned to a given partition.

Tags: apache kafka, kafka consumer, kafka topic partitions, bigdata, consumer group


 Q13. If we have more threads than partitions in a Kafka consumer, how can we model it efficiently ?
Ans. We would have to use multiple consumer groups, because with a single consumer group the extra threads would remain idle. A more sophisticated algorithm could be required with multiple groups if we have to ensure the order of consumption.
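A rough sketch of that model (the broker address, topic, and group ids are hypothetical) runs one KafkaConsumer per thread, each under its own group id, since a KafkaConsumer instance itself is not thread safe:

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class GroupPerThreadDemo {

    // Each thread owns its own consumer, subscribed under its own group id, so every
    // group independently receives the full stream of the topic, even from one partition.
    static Runnable consumerTask(String groupId) {
        return () -> {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");
            props.put("group.id", groupId);
            props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(Collections.singletonList("orders"));
                while (true) {
                    consumer.poll(Duration.ofMillis(500))
                            .forEach(r -> System.out.println(groupId + " got " + r.value()));
                }
            }
        };
    }

    public static void main(String[] args) {
        new Thread(consumerTask("group-a")).start();
        new Thread(consumerTask("group-b")).start();
    }
}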

Tags: apache kafka, kafka consumer, kafka topic partitions, bigdata, consumer group


 Q14. What are the two main components of the Hadoop system ? BigData
Ans. The distributed file system (HDFS) and the MapReduce processing framework.

Tags: hadoop, bigdata, hadoop system components | frequent


 Q15. What is the Hadoop Framework ? BigData
Ans. Hadoop is an open source framework, written in Java by the Apache Software Foundation. It is used to write applications that process vast amounts of data. Processing happens in parallel on large clusters, which can have thousands of computers, in a reliable and fault-tolerant manner.

Tags: hadoop framework, bigdata


 Q16. Can you narrate some sample usage for Bigdata ? BigData
Ans. One common usage is predictive analytics over huge volumes of current or past data. For example, using recent medical data (diagnoses and procedures), one can identify the pattern of diseases or the procedures that eventually have to be applied for a certain diagnosis. This analysis might help in predicting the diseases a patient is likely to develop. Another usage could be identifying the future spending patterns of a population by analyzing its past and current habits.

Tags: bigdata


 Q17. What is the use of Combiners ? BigData
Ans. Combiners are used to increase the efficiency of a MapReduce program. They aggregate intermediate map output locally on each individual mapper, which reduces the amount of data that needs to be transferred across the network to the reducers.
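As a sketch of where the combiner plugs in, the classic word-count driver reuses its reducer class as the combiner. Here TokenizerMapper and IntSumReducer are assumed word-count mapper and reducer classes (e.g. a reducer like the SumReducer sketched under Q9 above), and the input/output paths come from the command line:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCountDriver {
    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCountDriver.class);
        job.setMapperClass(TokenizerMapper.class);   // assumed mapper emitting (word, 1)
        // The combiner runs on each mapper's local output, pre-summing counts so that
        // far less intermediate data is shuffled across the network to the reducers.
        job.setCombinerClass(IntSumReducer.class);   // assumed reducer-style class
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}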

Tags: hadoop systems, bigdata, hadoop combiners | Asked in 1 Companies


 Q18. How would you process a 20 GB file with an application only having access to 4 GB of memory ? BigData
Ans. Load and process the file in chunks. If we need to do analytics, we can compute partial results for each chunk and then combine the per-chunk results.

For example, if we need to average all the marks in the file, we can process the file in 5 chunks, keep the sum and count of marks for each chunk, and then combine the 5 partial sums and counts to calculate the final average.
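A minimal single-pass sketch of the same idea in plain Java (assuming a hypothetical marks.txt with one numeric value per line) streams the file so that only a small buffer is in memory at any time:

import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

public class ChunkedAverage {
    public static void main(String[] args) throws IOException {
        long count = 0;
        double sum = 0;
        // Stream the file line by line, accumulating a running sum and count
        // instead of loading the whole 20 GB into memory at once.
        try (BufferedReader reader = Files.newBufferedReader(Paths.get("marks.txt"))) {
            String line;
            while ((line = reader.readLine()) != null) {
                sum += Double.parseDouble(line.trim());
                count++;
            }
        }
        System.out.println("Average: " + (sum / count));
    }
}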

Tags: bigdata, processing big data | Asked in 1 Companies


 Q19. How many reducer tasks run on a Hadoop cluster ? BigData
Ans. By default a single reducer runs for the output of all mappers, but the number of reducers can be increased as per requirements.
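For example, the job driver can request more reducers explicitly (the job name and the count of 4 below are arbitrary choices for illustration):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class ReducerCountDemo {
    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "reducer-count-demo");
        // Hadoop defaults to a single reduce task; this requests four parallel reducers instead
        job.setNumReduceTasks(4);
        System.out.println("Reducers requested: " + job.getNumReduceTasks());
    }
}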

Tags: hadoop, bigdata | Asked in 1 Companies


 Q20. Have you ever heard of Kafka Connect ? BigData
Ans. Yes. Kafka Connect is a framework that ships with Apache Kafka (with many connectors maintained by Confluent and the community) and provides a built-in mechanism for streaming records from external data sources into Kafka topics and vice versa. It provides a variety of source and sink connectors to achieve this.

Tags: kafka, kafka connect


 Q21. What are the source and sink connectors in Kafka Connect ? BigData
Ans. Source connectors pull data from an external system into Kafka topics, whereas sink connectors push data from Kafka topics out to a destination system.
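As a sketch (the file paths, connector names, and topic are hypothetical; FileStreamSource and FileStreamSink are the simple example connectors bundled with Kafka), a standalone-mode deployment is configured with one properties file per connector:

# connect-file-source.properties - hypothetical standalone source connector
# reads lines from a local file and publishes them to a Kafka topic
name=local-file-source
connector.class=FileStreamSource
tasks.max=1
file=/tmp/input.txt
topic=file-lines

# connect-file-sink.properties - hypothetical standalone sink connector
# consumes the same topic and writes each record out to a destination file
name=local-file-sink
connector.class=FileStreamSink
tasks.max=1
file=/tmp/output.txt
topics=file-lines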

Tags: kafka connect


 Q22. Can In-Memory Data Management play a significant role in Big Data Analytics ? BigData
 This question was recently asked at 'Jean Martin' and is still unanswered.


Asked in 1 Companies


 Q23. Why are Hadoop serialized data types not used in Spark serialization ? BigData
 This question was recently asked at 'Tavant Technology' and is still unanswered.


Asked in 1 Companies


 Q24. How can we create objects if we make the constructor private ? Core Java
a. We can't create objects if the constructor is private
b. We can only create objects if we follow the singleton pattern
c. We can only create one object
d. We can create a new object through a static method or static block

Ans.d. We can create a new object through a static method or static block
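A short illustration of option d (the class name ConfigHolder is made up): the private constructor blocks direct instantiation, and a static factory method hands out the instance created inside the class:

public class ConfigHolder {
    private static final ConfigHolder INSTANCE = new ConfigHolder();

    // Private constructor prevents direct instantiation from outside the class
    private ConfigHolder() {}

    // Static factory method is the only way callers can obtain an instance
    public static ConfigHolder getInstance() {
        return INSTANCE;
    }
}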

 Q25. What will be the output of executing the following class ?

public class BuggyBread {

    static {
        System.out.println("Static Block");
    }

    {
        System.out.println("Instance Initialization Block");
    }

    BuggyBread(){
        System.out.println("Constructor");
    }

    public static void main(String[] args){
        System.out.println("Main Method");
    }
}
Core Java
a. Static Block
Main Method
b. Static Block
Instance Initialization Block
Main Method
c. Static Block
Constructor
Main Method
d. Static Block
Instance Initialization Block
Constructor
Main Method

Ans.a. Static Block
Main Method

 Q26. What will be the output upon executing the following class ?

public class BuggyBread {

    static {
        System.out.println("Static Block");
    }

    {
        System.out.println("Instance Initialization Block");
    }

    BuggyBread(){
        System.out.println("Constructor");
    }

    public static void main(String[] args){
        System.out.println("Main Method");
        new BuggyBread();
    }
}
Core Java
a. Instance Initialization Block
Constructor
Static Block
Main Method
b. Static Block
Instance Initialization Block
Constructor
Main Method
c. Main Method
Static Block
Instance Initialization Block
Constructor
d. Static Block
Main Method
Instance Initialization Block
Constructor

Ans.d. Static Block
Main Method
Instance Initialization Block
Constructor

 Q27. With the following code, which is a valid way to initialize ?
public class BuggyBread {

   private String element1;

   private String element2;

   private BuggyBread(String element1, String element2){
      this.element1 = element1;
      this.element2 = element2;
   }

   public static class Builder {
   
      private String element1;

      private String element2;

      Builder(String element1, String element2){
         this.element1 = element1;
         this.element2 = element2;
      }

      Builder withElement1(String element1){
         this.element1 = element1;
         return this;
      }

      Builder withElement2(String element2){
         this.element2 = element2;
         return this;
      }

      BuggyBread build(){
         BuggyBread buggybread = new BuggyBread(element1,element2);
         return buggybread;
      }
   }
}
Core Java
a. BuggyBread buggybread = new BuggyBread();
b. BuggyBread buggybread = new BuggyBread("element1","element2");
c. BuggyBread.Builder builder = new BuggyBread.Builder();
d. BuggyBread.Builder builder = new BuggyBread.Builder("element1","element2");

Ans.d. BuggyBread.Builder builder = new BuggyBread.Builder("element1","element2");

 Q28. What will be the output of the following ?

public class BuggyBread {

   private int x;
   private Integer y;

   BuggyBread(int x,int y){};

   public static void main(String[] args){
      BuggyBread buggybread = new BuggyBread(1,2);
      System.out.println(buggybread.x);
      System.out.println(buggybread.y);
   }
}
Core Java
a. 0 0
b. 0 null
c. null 0
d. null null

Ans.b. 0 null

 Q29. Which of the following is true for == operator ?Core Java
a. For primitives, == checks if the variables on left and right have same data type
b. For primitives, == checks if the variables on left and right have same value
c. For Objects, == checks if the references on left and right have same data type
d. For Objects, == checks if the references on left and right have same value

Ans.b. For primitives, == checks if the variables on left and right have same value

 Q30. Which of the following is equivalent to following logic ?

Not X && Not Y
Core Java
a. X || Y
b. Not(X || Y)
c. Not(X && Y)
d. Not X && Y

Ans.b. Not(X || Y)
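A quick check of the equivalence (De Morgan's law) in Java, enumerating all four truth combinations:

public class DeMorganDemo {
    public static void main(String[] args) {
        boolean[] values = {false, true};
        for (boolean x : values) {
            for (boolean y : values) {
                // De Morgan's law: !x && !y always has the same value as !(x || y)
                System.out.printf("x=%b y=%b : %b == %b%n", x, y, (!x && !y), !(x || y));
            }
        }
    }
}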

