Name | Description | Type | Package | Framework |
ActorHelper | A receiver trait to be mixed in with your Actor to gain access to the API for pushing received data into Spark Streaming for processing. | Interface | org.apache.spark.streaming.receiver | Apache Spark |
ActorSupervisorStrategy | A helper object providing a default supervisor strategy for actor-based receivers. | Class | org.apache.spark.streaming.receiver | Apache Spark |
BatchInfo | Class having information on completed batches. | Class | org.apache.spark.streaming.scheduler | Apache Spark |
Broker | Represents the host and port info for a Kafka broker. | Class | org.apache.spark.streaming.kafka | Apache Spark |
ConstantInputDStream | An input stream that always returns the same RDD on each timestep. | Class | org.apache.spark.streaming.dstream | Apache Spark |
DStream | A Discretized Stream (DStream), the basic abstraction in Spark Streaming, is a continuous sequence of RDDs (of the same type) representing a continuous stream of data. | Class | org.apache.spark.streaming.dstream | Apache Spark |
Duration | Represents a time interval, used to specify batch intervals, window lengths, and slide durations in Spark Streaming. | Class | org.apache.spark.streaming | Apache Spark |
Durations | Helper object providing factory methods (milliseconds, seconds, minutes) for creating Duration instances. | Class | org.apache.spark.streaming | Apache Spark |
FlumeUtils | Object with factory methods for creating Flume input streams. | Class | org.apache.spark.streaming.flume | Apache Spark |
HasOffsetRanges | Represents any object that has a collection of OffsetRanges. | Interface | org.apache.spark.streaming.kafka | Apache Spark |
InputDStream | This is the abstract base class for all input streams. | Class | org.apache.spark.streaming.dstream | Apache Spark |
JavaDStream | A Java-friendly interface to DStream, the basic abstraction in Spark Streaming that represents a continuous stream of data. | Class | org.apache.spark.streaming.api.java | Apache Spark |
JavaDStreamLike | Interface defining the operations shared by all Java-friendly DStreams. | Interface | org.apache.spark.streaming.api.java | Apache Spark |
JavaInputDStream | A Java-friendly interface to InputDStream. | Class | org.apache.spark.streaming.api.java | Apache Spark |
JavaMapWithStateDStream | DStream representing the stream of data generated by mapWithState operation on a JavaPairDStream. | Class | org.apache.spark.streaming.api.java | Apache Spark |
JavaPairDStream | A Java-friendly interface to a DStream of key-value pairs, which provides extra methods like reduceByKey and join. | Class | org.apache.spark.streaming.api.java | Apache Spark |
JavaPairInputDStream | A Java-friendly interface to InputDStream of key-value pairs. | Class | org.apache.spark.streaming.api.java | Apache Spark |
JavaPairReceiverInputDStream | A Java-friendly interface to ReceiverInputDStream, the abstract class for defining any input stream that receives data over the network. | Class | org.apache.spark.streaming.api.java | Apache Spark |
JavaReceiverInputDStream | A Java-friendly interface to ReceiverInputDStream, the abstract class for defining any input stream that receives data over the network. | Class | org.apache.spark.streaming.api.java | Apache Spark |
JavaStreamingContext | A Java-friendly version of StreamingContext which is the main entry point for Spark Streaming functionality. | Class | org.apache.spark.streaming.api.java | Apache Spark |
KafkaUtils | Object with factory methods for creating Kafka input streams, including the receiver-based (createStream) and direct (createDirectStream) approaches. | Class | org.apache.spark.streaming.kafka | Apache Spark |
KinesisUtils | Object with factory methods for creating Amazon Kinesis input streams. | Class | org.apache.spark.streaming.kinesis | Apache Spark |
KinesisUtilsPythonHelper | A helper class that wraps the methods of KinesisUtils in a more Python-friendly form, so that they can be easily instantiated and called from Python. | Class | org.apache.spark.streaming.kinesis | Apache Spark |
MapWithStateDStream | DStream representing the stream of data generated by the mapWithState operation on a pair DStream. Additionally, it gives access to the stream of state snapshots, that is, the state data of all keys. | Class | org.apache.spark.streaming.dstream | Apache Spark |
Milliseconds | Helper object that creates instances of Duration representing a given number of milliseconds. | Class | org.apache.spark.streaming | Apache Spark |
Minutes | Helper object that creates instances of Duration representing a given number of minutes. | Class | org.apache.spark.streaming | Apache Spark |
MQTTUtils | Object with factory methods for creating MQTT input streams. | Class | org.apache.spark.streaming.mqtt | Apache Spark |
OffsetRange | Represents a range of offsets from a single Kafka TopicAndPartition. | Class | org.apache.spark.streaming.kafka | Apache Spark |
OutputOperationInfo | Class having information on output operations. | Class | org.apache.spark.streaming.scheduler | Apache Spark |
PairDStreamFunctions | Extra functions available on DStream of (key, value) pairs through an implicit conversion. | Class | org.apache.spark.streaming.dstream | Apache Spark |
Receiver | Abstract class of a receiver that can be run on worker nodes to receive external data. | Class | org.apache.spark.streaming.receiver | Apache Spark |
ReceiverInfo | Class having information about a receiver. | Class | org.apache.spark.streaming.scheduler | Apache Spark |
ReceiverInputDStream | Abstract class for defining any InputDStream that has to start a receiver on worker nodes to receive external data. | Class | org.apache.spark.streaming.dstream | Apache Spark |
Seconds | Helper object that creates instances of Duration representing a given number of seconds. | Class | org.apache.spark.streaming | Apache Spark |
SparkFlumeEvent | A wrapper class for AvroFlumeEvent's with a custom serialization format. | Class | org.apache.spark.streaming.flume | Apache Spark |
State | Abstract class for getting and updating the state in the mapping function used in the mapWithState operation of a pair DStream (Scala) or a JavaPairDStream (Java). | Class | org.apache.spark.streaming | Apache Spark |
StateSpec | Abstract class representing all the specifications of the DStream transformation mapWithState operation of a pair DStream (Scala) or a JavaPairDStream (Java). | Class | org.apache.spark.streaming | Apache Spark |
Statistics | Statistics for querying the supervisor about state of workers. | Class | org.apache.spark.streaming.receiver | Apache Spark |
StatsReportListener | A simple StreamingListener that logs summary statistics across Spark Streaming batches. The parameter numBatchInfos sets the number of most recent batches to consider when generating statistics (default: 10). | Class | org.apache.spark.streaming.scheduler | Apache Spark |
StreamingContext | Main entry point for Spark Streaming functionality. | Class | org.apache.spark.streaming | Apache Spark |
StreamingContextState | Enum representing the state of a StreamingContext. | Class | org.apache.spark.streaming | Apache Spark |
StreamingListener | Interface for receiving events about an ongoing streaming computation, such as batch and receiver status updates. | Interface | org.apache.spark.streaming.scheduler | Apache Spark |
StreamingListenerBatchCompleted | Listener event indicating that processing of a batch has completed. | Class | org.apache.spark.streaming.scheduler | Apache Spark |
StreamingListenerBatchStarted | Listener event indicating that processing of a batch has started. | Class | org.apache.spark.streaming.scheduler | Apache Spark |
StreamingListenerBatchSubmitted | Listener event indicating that a batch of jobs has been submitted for processing. | Class | org.apache.spark.streaming.scheduler | Apache Spark |
StreamingListenerOutputOperationCompleted | Listener event indicating that an output operation has completed. | Class | org.apache.spark.streaming.scheduler | Apache Spark |
StreamingListenerOutputOperationStarted | Listener event indicating that an output operation has started. | Class | org.apache.spark.streaming.scheduler | Apache Spark |
StreamingListenerReceiverError | Listener event indicating that a receiver has reported an error. | Class | org.apache.spark.streaming.scheduler | Apache Spark |
StreamingListenerReceiverStarted | Listener event indicating that a receiver has started. | Class | org.apache.spark.streaming.scheduler | Apache Spark |
StreamingListenerReceiverStopped | Listener event indicating that a receiver has stopped. | Class | org.apache.spark.streaming.scheduler | Apache Spark |
StreamInputInfo | Track the information of input stream at specified batch time. | Class | org.apache.spark.streaming.scheduler | Apache Spark |
Time | This is a simple class that represents an absolute instant of time. | Class | org.apache.spark.streaming | Apache Spark |
TwitterUtils | Object with factory methods for creating Twitter input streams. | Class | org.apache.spark.streaming.twitter | Apache Spark |
WriteAheadLog | This abstract class represents a write ahead log (aka journal) that is used by Spark Streaming to save the received data (by receivers) and associated metadata to reliable storage, so that they can be recovered after driver failures. | Class | org.apache.spark.streaming.util | Apache Spark |
WriteAheadLogRecordHandle | This abstract class represents a handle that refers to a record written in a WriteAheadLog. It must contain all the information necessary for the record to be read back and returned when requested. | Class | org.apache.spark.streaming.util | Apache Spark |
ZeroMQUtils | Object with factory methods for creating ZeroMQ input streams. | Class | org.apache.spark.streaming.zeromq | Apache Spark |
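
Several of the classes in this table compose into a typical streaming program: StreamingContext is the entry point, the Seconds helper builds the batch Duration, socketTextStream yields a ReceiverInputDStream, and reduceByKey comes from PairDStreamFunctions via implicit conversion. The following is a minimal sketch, assuming a local master and a placeholder socket source on localhost:9999; it requires a Spark runtime to execute.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object StreamingSketch {
  def main(args: Array[String]): Unit = {
    // StreamingContext is the main entry point; Seconds(5) creates the
    // batch Duration via the Seconds helper object listed above.
    val conf = new SparkConf().setAppName("sketch").setMaster("local[2]")
    val ssc = new StreamingContext(conf, Seconds(5))

    // socketTextStream returns a ReceiverInputDStream[String]; the host
    // and port here are placeholders, not a recommended source.
    val lines = ssc.socketTextStream("localhost", 9999)

    // flatMap and map are core DStream operations; reduceByKey is added
    // to key-value DStreams by PairDStreamFunctions.
    val counts = lines.flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _)
    counts.print()

    ssc.start()             // start the streaming computation
    ssc.awaitTermination()  // block until the context is stopped
  }
}
```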