# Classes and Interfaces in Apache Spark - 732 results found.

| Name | Description | Type | Package |
| --- | --- | --- | --- |
| AbsoluteError | Class for absolute error loss calculation (for regression). | Class | org.apache.spark.mllib.tree.loss |
| Accumulable | A data type that can be accumulated, i.e. has a commutative and associative "add" operation, but where the result type, R, may be different from the element type being added, T. | Class | org.apache.spark |
| AccumulableInfo | Information about an Accumulable modified during a task or stage. | Class | org.apache.spark.scheduler |
| AccumulableInfo |  | Class | org.apache.spark.status.api.v1 |
| AccumulableParam | Helper object defining how to accumulate values of a particular type. | Interface | org.apache.spark |
| Accumulator | A simpler value of Accumulable where the result type being accumulated is the same as the types of elements being merged (usage sketch below). | Class | org.apache.spark |
| AccumulatorParam | A simpler version of AccumulableParam where the only data type you can add in is the same type as the accumulated value. | Interface | org.apache.spark |
| ActorHelper | A receiver trait to be mixed in with your Actor to gain access to the API for pushing received data into Spark Streaming for being processed. | Interface | org.apache.spark.streaming.receiver |
| ActorSupervisorStrategy |  | Class | org.apache.spark.streaming.receiver |
| AFTAggregator |  | Class | org.apache.spark.ml.regression |
| AFTCostFun |  | Class | org.apache.spark.ml.regression |
| AFTSurvivalRegression | Fit a parametric survival regression model named accelerated failure time (AFT) model. | Class | org.apache.spark.ml.regression |
| AFTSurvivalRegressionModel | Model produced by AFTSurvivalRegression. | Class | org.apache.spark.ml.regression |
| AggregatedDialect | AggregatedDialect can unify multiple dialects into one virtual Dialect. | Class | org.apache.spark.sql.jdbc |
| AggregatingEdgeContext |  | Class | org.apache.spark.graphx.impl |
| Aggregator | A set of functions used to aggregate data. | Class | org.apache.spark |
| Aggregator | A base class for user-defined aggregations, which can be used in DataFrame and Dataset operations to take all of the elements of a group and reduce them to a single value. | Class | org.apache.spark.sql.expressions |
| Algo | Enum to select the algorithm for the decision tree. | Class | org.apache.spark.mllib.tree.configuration |
| AlphaComponent | A new component of Spark which may have unstable APIs. | Class | org.apache.spark.annotation |
| ALS | Alternating Least Squares (ALS) matrix factorization. | Class | org.apache.spark.ml.recommendation |
| ALS |  | Class | org.apache.spark.mllib.recommendation |
| ALSModel | Model fitted by ALS. | Class | org.apache.spark.ml.recommendation |
| AnalysisException | Thrown when a query fails to analyze, usually because the query itself is invalid. | Class | org.apache.spark.sql |
| And | A filter that evaluates to true iff both left and right evaluate to true. | Class | org.apache.spark.sql.sources |
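
The Accumulable/Accumulator entries above are Spark's driver-aggregated shared variables. A minimal sketch of the Accumulator usage pattern, assuming the Spark 1.x Java API, Java 8 lambdas, and a local master (the class and variable names here are illustrative only):

```java
import java.util.Arrays;
import org.apache.spark.Accumulator;
import org.apache.spark.api.java.JavaSparkContext;

public class AccumulatorExample {
    public static void main(String[] args) {
        JavaSparkContext sc = new JavaSparkContext("local[2]", "AccumulatorExample");

        // Created on the driver; tasks may only add to it.
        final Accumulator<Integer> accum = sc.accumulator(0);

        sc.parallelize(Arrays.asList(1, 2, 3, 4))
          .foreach(x -> accum.add(x));   // add() runs on the executors

        // value() is only readable back on the driver
        System.out.println("Sum = " + accum.value());
        sc.stop();
    }
}
```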

| Name | Description | Type | Package |
| --- | --- | --- | --- |
| ApplicationAttemptInfo |  | Class | org.apache.spark.status.api.v1 |
| ApplicationInfo |  | Class | org.apache.spark.status.api.v1 |
| ApplicationStatus | Enum of application statuses. | Class | org.apache.spark.status.api.v1 |
| ArrayType |  | Class | org.apache.spark.sql.types |
| AskPermissionToCommitOutput |  | Class | org.apache.spark.scheduler |
| AssociationRules | Generates association rules from an RDD[FreqItemset[Item]]. | Class | org.apache.spark.mllib.fpm |
| AsyncRDDActions | A set of asynchronous RDD actions available through an implicit conversion. | Class | org.apache.spark.rdd |
| Attribute | Abstract class for ML attributes. | Class | org.apache.spark.ml.attribute |
| AttributeGroup | Attributes that describe a vector ML column. | Class | org.apache.spark.ml.attribute |
| AttributeType | An enum-like type for attribute types. | Class | org.apache.spark.ml.attribute |
| BaseRelation | Represents a collection of tuples with a known schema. | Class | org.apache.spark.sql.sources |
| BaseRRDD |  | Class | org.apache.spark.api.r |
| BatchInfo | Class having information on completed batches. | Class | org.apache.spark.streaming.scheduler |
| BernoulliCellSampler | A sampler based on Bernoulli trials for partitioning a data sequence. | Class | org.apache.spark.util.random |
| BernoulliSampler | A sampler based on Bernoulli trials. | Class | org.apache.spark.util.random |
| Binarizer | Binarize a column of continuous features given a threshold (usage sketch below). | Class | org.apache.spark.ml.feature |
| BinaryAttribute | A binary attribute. | Class | org.apache.spark.ml.attribute |
| BinaryClassificationEvaluator | Evaluator for binary classification, which expects two input columns: rawPrediction and label. | Class | org.apache.spark.ml.evaluation |
| BinaryClassificationMetrics | Evaluator for binary classification. | Class | org.apache.spark.mllib.evaluation |
| BinaryLogisticRegressionSummary | Binary Logistic regression results for a given model. | Class | org.apache.spark.ml.classification |
| BinaryLogisticRegressionTrainingSummary | Logistic regression training results. | Class | org.apache.spark.ml.classification |
| BinarySample | Class that represents the group and value of a sample. | Class | org.apache.spark.mllib.stat.test |
| BinaryType | The data type representing Array[Byte] values. | Class | org.apache.spark.sql.types |
| BisectingKMeans | A bisecting k-means algorithm based on the paper "A comparison of document clustering techniques" by Steinbach, Karypis, and Kumar, with modification to fit Spark. | Class | org.apache.spark.mllib.clustering |
| BisectingKMeansModel | Clustering model produced by BisectingKMeans. | Class | org.apache.spark.mllib.clustering |
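
Most org.apache.spark.ml.feature transformers in this listing, Binarizer included, follow the same setInputCol/setOutputCol builder pattern. A hedged sketch, assuming a Spark 1.6-style DataFrame df that already has a numeric column named "feature":

```java
import org.apache.spark.ml.feature.Binarizer;
import org.apache.spark.sql.DataFrame;

class BinarizerSketch {
    // df is assumed to have a double column named "feature".
    static DataFrame binarize(DataFrame df) {
        Binarizer binarizer = new Binarizer()
            .setInputCol("feature")
            .setOutputCol("binarized_feature")
            .setThreshold(0.5);   // values above 0.5 map to 1.0, the rest to 0.0
        return binarizer.transform(df);
    }
}
```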

| Name | Description | Type | Package |
| --- | --- | --- | --- |
| BlockId | Identifies a particular Block of data, usually associated with a single file. | Class | org.apache.spark.storage |
| BlockManagerId | This class represents a unique identifier for a BlockManager. | Class | org.apache.spark.storage |
| BlockMatrix | Represents a distributed matrix in blocks of local matrices. | Class | org.apache.spark.mllib.linalg.distributed |
| BlockNotFoundException |  | Class | org.apache.spark.storage |
| BlockStatus |  | Class | org.apache.spark.storage |
| BlockUpdatedInfo | Stores information about a block status in a block manager. | Class | org.apache.spark.storage |
| BooleanParam | Specialized version of Param[Boolean] for Java. | Class | org.apache.spark.ml.param |
| BooleanType | The data type representing Boolean values. | Class | org.apache.spark.sql.types |
| BoostingStrategy | Configuration options for GradientBoostedTrees. | Class | org.apache.spark.mllib.tree.configuration |
| BoundedDouble | A Double value with error bars and associated confidence. | Class | org.apache.spark.partial |
| Broadcast | A broadcast variable (usage sketch below). | Class | org.apache.spark.broadcast |
| BroadcastBlockId |  | Class | org.apache.spark.storage |
| BroadcastFactory | An interface for all the broadcast implementations in Spark (to allow multiple broadcast implementations). | Interface | org.apache.spark.broadcast |
| Broker | Represents the host and port info for a Kafka broker. | Class | org.apache.spark.streaming.kafka |
| Bucketizer | Bucketizer maps a column of continuous features to a column of feature buckets. | Class | org.apache.spark.ml.feature |
| BufferReleasingInputStream | Helper class that ensures a ManagedBuffer is released when the wrapping InputStream is closed. | Class | org.apache.spark.storage |
| ByteType | The data type representing Byte values. | Class | org.apache.spark.sql.types |
| CalendarIntervalType | The data type representing calendar time intervals. | Class | org.apache.spark.sql.types |
| CatalystScan | An interface for experimenting with a more direct connection to the query planner. | Interface | org.apache.spark.sql.sources |
| CategoricalSplit | Split which tests a categorical feature. | Class | org.apache.spark.ml.tree |
| ChiSqSelector | Chi-Squared feature selection, which selects categorical features to use for predicting a categorical label. | Class | org.apache.spark.ml.feature |
| ChiSqSelector |  | Class | org.apache.spark.mllib.feature |
| ChiSqSelectorModel |  | Class | org.apache.spark.ml.feature |
| ChiSqSelectorModel | Chi Squared selector model. | Class | org.apache.spark.mllib.feature |
| ChiSqTestResult | Object containing the test results for the chi-squared hypothesis test. | Class | org.apache.spark.mllib.stat.test |
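
Broadcast, listed above, is the read-only counterpart of an accumulator: a value shipped once to each executor. A small sketch assuming an existing JavaSparkContext; the lookup-table contents are made up for illustration:

```java
import java.util.Arrays;
import java.util.Map;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.broadcast.Broadcast;

class BroadcastSketch {
    // Ships a small lookup table to every executor once instead of with every task.
    static void run(JavaSparkContext sc) {
        Map<Integer, String> lookup = new java.util.HashMap<>();
        lookup.put(1, "one");
        lookup.put(2, "two");

        Broadcast<Map<Integer, String>> table = sc.broadcast(lookup);

        sc.parallelize(Arrays.asList(1, 2, 1, 2))
          .map(id -> table.value().get(id))   // read-only access on the executors
          .collect()
          .forEach(System.out::println);
    }
}
```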

| Name | Description | Type | Package |
| --- | --- | --- | --- |
| ClassificationModel | Model produced by a Classifier. | Class | org.apache.spark.ml.classification |
| ClassificationModel | Represents a classification model that predicts to which of a set of categories an example belongs. | Interface | org.apache.spark.mllib.classification |
| Classifier | Single-label binary or multiclass classification. | Class | org.apache.spark.ml.classification |
| CleanAccum |  | Class | org.apache.spark |
| CleanBroadcast |  | Class | org.apache.spark |
| CleanCheckpoint |  | Class | org.apache.spark |
| CleanRDD |  | Class | org.apache.spark |
| CleanShuffle |  | Class | org.apache.spark |
| CleanupTaskWeakReference | A WeakReference associated with a CleanupTask. | Class | org.apache.spark |
| CoGroupedRDD | An RDD that cogroups its parents. | Class | org.apache.spark.rdd |
| CoGroupFunction |  | Interface | org.apache.spark.api.java.function |
| Column | A column that will be computed based on the data in a DataFrame. | Class | org.apache.spark.sql |
| ColumnName | A convenient class used for constructing schema. | Class | org.apache.spark.sql |
| ColumnPruner | Utility transformer for removing temporary columns from a DataFrame. | Class | org.apache.spark.ml.feature |
| ComplexFutureAction | A FutureAction for actions that could trigger multiple Spark jobs. | Class | org.apache.spark |
| CompressionCodec | CompressionCodec allows the customization of choosing different compression implementations to be used in block storage. | Interface | org.apache.spark.io |
| ConnectedComponents | Connected components algorithm. | Class | org.apache.spark.graphx.lib |
| ConstantInputDStream | An input stream that always returns the same RDD on each timestep. | Class | org.apache.spark.streaming.dstream |
| ContinuousSplit | Split which tests a continuous feature. | Class | org.apache.spark.ml.tree |
| CoordinateMatrix |  | Class | org.apache.spark.mllib.linalg.distributed |
| CountVectorizer | Extracts a vocabulary from document collections and generates a CountVectorizerModel (usage sketch below). | Class | org.apache.spark.ml.feature |
| CountVectorizerModel | Converts a text document to a sparse vector of token counts. | Class | org.apache.spark.ml.feature |
| CreatableRelationProvider |  | Interface | org.apache.spark.sql.sources |
| CrossValidator | K-fold cross validation. | Class | org.apache.spark.ml.tuning |
| CrossValidatorModel | Model from k-fold cross validation. | Class | org.apache.spark.ml.tuning |
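
CountVectorizer and CountVectorizerModel above form a fit/transform pair, like most ml.feature estimators. A hedged sketch assuming a DataFrame with an array-of-strings column named "words"; the vocabulary-size and minimum-document-frequency settings are arbitrary examples:

```java
import org.apache.spark.ml.feature.CountVectorizer;
import org.apache.spark.ml.feature.CountVectorizerModel;
import org.apache.spark.sql.DataFrame;

class CountVectorizerSketch {
    // df is assumed to have an array-of-strings column named "words".
    static DataFrame vectorize(DataFrame df) {
        CountVectorizerModel model = new CountVectorizer()
            .setInputCol("words")
            .setOutputCol("features")
            .setVocabSize(1000)   // keep at most the 1000 most frequent terms
            .setMinDF(2)          // ignore terms seen in fewer than 2 documents
            .fit(df);
        return model.transform(df);
    }
}
```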

| Name | Description | Type | Package |
| --- | --- | --- | --- |
| DataFrame | A distributed collection of data organized into named columns. | Class | org.apache.spark.sql |
| DataFrameHolder | A container for a DataFrame, used for implicit conversions. | Class | org.apache.spark.sql |
| DataFrameNaFunctions | Functionality for working with missing data in DataFrames. | Class | org.apache.spark.sql |
| DataFrameReader | Interface used to load a DataFrame from external storage systems (usage sketch below). | Class | org.apache.spark.sql |
| DataFrameStatFunctions | Statistic functions for DataFrames. | Class | org.apache.spark.sql |
| DataFrameWriter | Interface used to write a DataFrame to external storage systems. | Class | org.apache.spark.sql |
| Dataset | A Dataset is a strongly typed collection of objects that can be transformed in parallel using functional or relational operations. | Class | org.apache.spark.sql |
| DatasetHolder | A container for a Dataset, used for implicit conversions. | Class | org.apache.spark.sql |
| DataSourceRegister | Data sources should implement this trait so that they can register an alias to their data source. | Interface | org.apache.spark.sql.sources |
| DataType | The base type of all Spark SQL data types. | Class | org.apache.spark.sql.types |
| DataTypes | To get/create specific data types, users should use singleton objects and factory methods provided by this class. | Class | org.apache.spark.sql.types |
| DataValidators | A collection of methods used to validate data before applying ML algorithms. | Class | org.apache.spark.mllib.util |
| DateType | A date type, supporting "0001-01-01" through "9999-12-31". | Class | org.apache.spark.sql.types |
| DB2Dialect |  | Class | org.apache.spark.sql.jdbc |
| DCT | A feature transformer that takes the 1D discrete cosine transform of a real vector. | Class | org.apache.spark.ml.feature |
| Decimal | A mutable implementation of BigDecimal that can hold a Long if values are small enough. | Class | org.apache.spark.sql.types |
| DecimalType |  | Class | org.apache.spark.sql.types |
| DecisionTree | A class which implements a decision tree learning algorithm for classification and regression. | Class | org.apache.spark.mllib.tree |
| DecisionTreeClassificationModel | Decision tree model for classification. | Class | org.apache.spark.ml.classification |
| DecisionTreeClassifier | Decision tree learning algorithm for classification. | Class | org.apache.spark.ml.classification |
| DecisionTreeModel | Decision tree model for classification or regression. | Class | org.apache.spark.mllib.tree.model |
| DecisionTreeRegressionModel | Decision tree model for regression. | Class | org.apache.spark.ml.regression |
| DecisionTreeRegressor | Decision tree learning algorithm for regression. It supports both continuous and categorical features. | Class | org.apache.spark.ml.regression |
| DefaultSource | The libsvm package implements the Spark SQL data source API for loading LIBSVM data as a DataFrame. | Class | org.apache.spark.ml.source.libsvm |
| DenseMatrix | Column-major dense matrix. | Class | org.apache.spark.mllib.linalg |
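
DataFrame, DataFrameReader and DataFrameWriter above are the Spark 1.x entry points for structured data. A hedged sketch of reading and filtering JSON through SQLContext.read(); the file path is only a placeholder:

```java
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SQLContext;

class DataFrameReaderSketch {
    // Loads a JSON file (path is a placeholder) and runs a simple filter.
    static void run(SQLContext sqlContext) {
        DataFrame people = sqlContext.read().json("path/to/people.json");
        people.printSchema();
        people.filter(people.col("age").gt(21)).show();
    }
}
```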

| Name | Description | Type | Package |
| --- | --- | --- | --- |
| DenseVector | A dense vector represented by a value array (usage sketch below). | Class | org.apache.spark.mllib.linalg |
| Dependency | Base class for dependencies. | Class | org.apache.spark |
| DerbyDialect |  | Class | org.apache.spark.sql.jdbc |
| DeserializationStream | A stream for reading serialized objects. | Class | org.apache.spark.serializer |
| DeveloperApi | A lower-level, unstable API intended for developers. | Class | org.apache.spark.annotation |
| DistributedLDAModel | Distributed model fitted by LDA. | Class | org.apache.spark.ml.clustering |
| DistributedLDAModel |  | Class | org.apache.spark.mllib.clustering |
| DistributedMatrix | Represents a distributively stored matrix backed by one or more RDDs. | Interface | org.apache.spark.mllib.linalg.distributed |
| DoubleArrayParam | Specialized version of Param[Array[Double]] for Java. | Class | org.apache.spark.ml.param |
| DoubleFlatMapFunction | A function that returns zero or more records of type Double from each input record. | Interface | org.apache.spark.api.java.function |
| DoubleFunction | A function that returns Doubles, and can be used to construct DoubleRDDs. | Interface | org.apache.spark.api.java.function |
| DoubleParam | Specialized version of Param[Double] for Java. | Class | org.apache.spark.ml.param |
| DoubleRDDFunctions | Extra functions available on RDDs of Doubles through an implicit conversion. | Class | org.apache.spark.rdd |
| DoubleType | The data type representing Double values. | Class | org.apache.spark.sql.types |
| DStream | A Discretized Stream (DStream), the basic abstraction in Spark Streaming, is a continuous sequence of RDDs (of the same type) representing a continuous stream of data. | Class | org.apache.spark.streaming.dstream |
| DummySerializerInstance | Unfortunately, we need a serializer instance in order to construct a DiskBlockObjectWriter. | Class | org.apache.spark.serializer |
| Duration |  | Class | org.apache.spark.streaming |
| Durations |  | Class | org.apache.spark.streaming |
| Edge | A single directed edge consisting of a source id, target id, and the data associated with the edge. | Class | org.apache.spark.graphx |
| EdgeActiveness | Criteria for filtering edges based on activeness. | Class | org.apache.spark.graphx.impl |
| EdgeContext | Represents an edge along with its neighboring vertices and allows sending messages along the edge. | Class | org.apache.spark.graphx |
| EdgeDirection | The direction of a directed edge relative to a vertex. | Class | org.apache.spark.graphx |
| EdgeRDD | EdgeRDD[ED, VD] extends RDD[Edge[ED]] by storing the edges in columnar format on each partition for performance. | Class | org.apache.spark.graphx |
| EdgeRDDImpl |  | Class | org.apache.spark.graphx.impl |
| EdgeTriplet | An edge triplet represents an edge along with the vertex attributes of its neighboring vertices. | Class | org.apache.spark.graphx |
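
DenseVector above (and its sparse counterpart) are normally constructed through the Vectors factory in org.apache.spark.mllib.linalg. A short sketch:

```java
import org.apache.spark.mllib.linalg.Vector;
import org.apache.spark.mllib.linalg.Vectors;

class VectorSketch {
    static void run() {
        // Dense vector: every entry is stored.
        Vector dense = Vectors.dense(1.0, 0.0, 3.0);

        // Sparse vector: size 3, non-zero entries at indices 0 and 2.
        Vector sparse = Vectors.sparse(3, new int[]{0, 2}, new double[]{1.0, 3.0});

        System.out.println(dense.apply(2));   // 3.0
        System.out.println(sparse.size());    // 3
    }
}
```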

| Name | Description | Type | Package |
| --- | --- | --- | --- |
| ElementwiseProduct | Outputs the Hadamard product (i.e., the element-wise product) of each input vector with a provided weight vector. | Class | org.apache.spark.ml.feature |
| ElementwiseProduct | Outputs the Hadamard product (i.e., the element-wise product) of each input vector with a provided weight vector. | Class | org.apache.spark.mllib.feature |
| EMLDAOptimizer | Optimizer for the EM algorithm which stores data + parameter graph, plus algorithm parameters. | Class | org.apache.spark.mllib.clustering |
| Encoder | Used to convert a JVM object of type T to and from the internal Spark SQL representation. | Interface | org.apache.spark.sql |
| Encoders | Methods for creating an Encoder. | Class | org.apache.spark.sql |
| Entropy | Class for calculating entropy during binary classification. | Class | org.apache.spark.mllib.tree.impurity |
| EnumUtil |  | Class | org.apache.spark.util |
| EnvironmentListener |  | Class | org.apache.spark.ui.env |
| EqualNullSafe | Performs equality comparison, similar to EqualTo. | Class | org.apache.spark.sql.sources |
| EqualTo | A filter that evaluates to true iff the attribute evaluates to a value equal to value. | Class | org.apache.spark.sql.sources |
| Estimator | Abstract class for estimators that fit models to data. | Class | org.apache.spark.ml |
| Evaluator | Abstract class for evaluators that compute metrics from predictions. | Class | org.apache.spark.ml.evaluation |
| ExceptionFailure | Task failed due to a runtime exception. | Class | org.apache.spark |
| ExecutionListenerManager | Manager for QueryExecutionListener. | Class | org.apache.spark.sql.util |
| ExecutorInfo | Stores information about an executor to pass from the scheduler to SparkListeners. | Class | org.apache.spark.scheduler.cluster |
| ExecutorLostFailure | The task failed because the executor that it was running on was lost. | Class | org.apache.spark |
| ExecutorRegistered |  | Class | org.apache.spark |
| ExecutorRemoved |  | Class | org.apache.spark |
| ExecutorsListener |  | Class | org.apache.spark.ui.exec |
| ExecutorStageSummary |  | Class | org.apache.spark.status.api.v1 |
| ExecutorSummary |  | Class | org.apache.spark.status.api.v1 |
| ExpectationSum |  | Class | org.apache.spark.mllib.clustering |
| Experimental | An experimental user-facing API. | Class | org.apache.spark.annotation |
| ExperimentalMethods | Holder for experimental methods for the bravest. | Class | org.apache.spark.sql |
| ExponentialGenerator | Generates i.i.d. samples from the exponential distribution with the given mean. | Class | org.apache.spark.mllib.random |
| FeatureType | Enum to describe whether a feature is "continuous" or "categorical". | Class | org.apache.spark.mllib.tree.configuration |
| FetchFailed | Task failed to fetch shuffle data from a remote node. | Class | org.apache.spark |
| Filter | A filter predicate for data sources. | Class | org.apache.spark.sql.sources |
| FilterFunction | Base interface for a function used in Dataset's filter function. | Interface | org.apache.spark.api.java.function |
| FlatMapFunction | A function that returns zero or more output records from each input record. | Interface | org.apache.spark.api.java.function |
| FlatMapFunction2 | A function that takes two inputs and returns zero or more output records. | Interface | org.apache.spark.api.java.function |
| FlatMapGroupsFunction | A function that returns zero or more output records from each grouping key and its values. | Interface | org.apache.spark.api.java.function |
| FloatParam | Specialized version of Param[Float] for Java. | Class | org.apache.spark.ml.param |
| FloatType | The data type representing Float values. | Class | org.apache.spark.sql.types |
| FlumeUtils |  | Class | org.apache.spark.streaming.flume |
| ForeachFunction | Base interface for a function used in Dataset's foreach function. | Interface | org.apache.spark.api.java.function |
| ForeachPartitionFunction | Base interface for a function used in Dataset's foreachPartition function. | Interface | org.apache.spark.api.java.function |
| FPGrowth | A parallel FP-growth algorithm to mine frequent itemsets. | Class | org.apache.spark.mllib.fpm |
| FPGrowthModel | Model trained by FPGrowth, which holds frequent itemsets. | Class | org.apache.spark.mllib.fpm |
| Function | Base interface for functions whose return types do not create special RDDs (usage sketch below). | Interface | org.apache.spark.api.java.function |
| Function2 | A two-argument function that takes arguments of type T1 and T2 and returns an R. | Interface | org.apache.spark.api.java.function |
| Function3 | A three-argument function that takes arguments of type T1, T2 and T3 and returns an R. | Interface | org.apache.spark.api.java.function |
| Function4 | A four-argument function that takes arguments of type T1, T2, T3 and T4 and returns an R. | Interface | org.apache.spark.api.java.function |
| functions |  | Class | org.apache.spark.sql |
| FutureAction | A future for the result of an action to support cancellation. | Interface | org.apache.spark |
| GammaGenerator | Generates i.i.d. samples from the gamma distribution with the given shape and scale. | Class | org.apache.spark.mllib.random |
| GaussianMixture | This class performs expectation maximization for multivariate Gaussian Mixture Models (GMMs). | Class | org.apache.spark.mllib.clustering |
| GaussianMixtureModel | Multivariate Gaussian Mixture Model (GMM) consisting of k Gaussians. | Class | org.apache.spark.mllib.clustering |
| GBTClassificationModel | Gradient-Boosted Trees (GBTs) model for classification. | Class | org.apache.spark.ml.classification |
| GBTClassifier | Gradient-Boosted Trees (GBTs) learning algorithm for classification. | Class | org.apache.spark.ml.classification |
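
The org.apache.spark.api.java.function interfaces in this stretch of the listing (Function, Function2, FlatMapFunction, and so on) are the Java equivalents of Scala closures. A sketch using Function and Function2 as anonymous classes, assuming an existing JavaSparkContext:

```java
import java.util.Arrays;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.api.java.function.Function2;

class FunctionSketch {
    // Squares each element with Function, then sums with Function2.
    static int sumOfSquares(JavaSparkContext sc) {
        JavaRDD<Integer> nums = sc.parallelize(Arrays.asList(1, 2, 3, 4));

        JavaRDD<Integer> squares = nums.map(new Function<Integer, Integer>() {
            public Integer call(Integer x) { return x * x; }
        });

        return squares.reduce(new Function2<Integer, Integer, Integer>() {
            public Integer call(Integer a, Integer b) { return a + b; }
        });
    }
}
```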

| Name | Description | Type | Package |
| --- | --- | --- | --- |
| GBTRegressionModel | Gradient-Boosted Trees (GBTs) model for regression. | Class | org.apache.spark.ml.regression |
| GBTRegressor | Gradient-Boosted Trees (GBTs) learning algorithm for regression. | Class | org.apache.spark.ml.regression |
| GeneralizedLinearAlgorithm | GeneralizedLinearAlgorithm implements methods to train a Generalized Linear Model (GLM). | Class | org.apache.spark.mllib.regression |
| GeneralizedLinearModel | GeneralizedLinearModel (GLM) represents a model trained using GeneralizedLinearAlgorithm. | Class | org.apache.spark.mllib.regression |
| Gini | Class for calculating the Gini impurity during binary classification. | Class | org.apache.spark.mllib.tree.impurity |
| Gradient | Class used to compute the gradient for a loss function, given a single data point. | Class | org.apache.spark.mllib.optimization |
| GradientBoostedTrees | A class that implements Stochastic Gradient Boosting. | Class | org.apache.spark.mllib.tree |
| GradientBoostedTreesModel | Represents a gradient boosted trees model. | Class | org.apache.spark.mllib.tree.model |
| GradientDescent | Class used to solve an optimization problem using Gradient Descent. | Class | org.apache.spark.mllib.optimization |
| Graph | The Graph abstractly represents a graph with arbitrary objects associated with vertices and edges. | Class | org.apache.spark.graphx |
| GraphGenerators | A collection of graph generating functions. | Class | org.apache.spark.graphx.util |
| GraphImpl | An implementation of Graph to support computation on graphs. | Class | org.apache.spark.graphx.impl |
| GraphKryoRegistrator | Registers GraphX classes with Kryo for improved performance. | Class | org.apache.spark.graphx |
| GraphLoader | Provides utilities for loading Graphs from files. | Class | org.apache.spark.graphx |
| GraphOps | Contains additional functionality for Graph. | Class | org.apache.spark.graphx |
| GraphXUtils |  | Class | org.apache.spark.graphx |
| GreaterThan | A filter that evaluates to true iff the attribute evaluates to a value greater than value. | Class | org.apache.spark.sql.sources |
| GreaterThanOrEqual | A filter that evaluates to true iff the attribute evaluates to a value greater than or equal to value. | Class | org.apache.spark.sql.sources |
| GroupedData | A set of methods for aggregations on a DataFrame, created by DataFrame.groupBy. | Class | org.apache.spark.sql |
| GroupedDataset | A Dataset that has been logically grouped by a user specified grouping key. | Class | org.apache.spark.sql |
| HadoopFsRelation | A BaseRelation that provides much of the common code required for relations that store their data to an HDFS compatible filesystem. | Class | org.apache.spark.sql.sources |
| HadoopFsRelationProvider | Implemented by objects that produce relations for a specific kind of data source with a given schema and partitioned columns. | Interface | org.apache.spark.sql.sources |
| HadoopRDD | An RDD that provides core functionality for reading data stored in Hadoop. | Class | org.apache.spark.rdd |
| HashingTF | Maps a sequence of terms to their term frequencies using the hashing trick (usage sketch below). | Class | org.apache.spark.ml.feature |
| HashingTF | Maps a sequence of terms to their term frequencies using the hashing trick. | Class | org.apache.spark.mllib.feature |
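
The two HashingTF entries above (ml and mllib) implement the same hashing trick; the ml version is DataFrame-based. A sketch assuming a DataFrame with an array-of-strings column "words"; the feature count is an arbitrary choice:

```java
import org.apache.spark.ml.feature.HashingTF;
import org.apache.spark.sql.DataFrame;

class HashingTFSketch {
    // wordsData is assumed to have an array-of-strings column named "words".
    static DataFrame toTermFrequencies(DataFrame wordsData) {
        HashingTF hashingTF = new HashingTF()
            .setInputCol("words")
            .setOutputCol("rawFeatures")
            .setNumFeatures(1 << 18);   // number of hash buckets
        return hashingTF.transform(wordsData);
    }
}
```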

| Name | Description | Type | Package |
| --- | --- | --- | --- |
| HashPartitioner | A Partitioner that implements hash-based partitioning using Java's Object.hashCode (usage sketch below). | Class | org.apache.spark |
| HasOffsetRanges | Represents any object that has a collection of OffsetRanges. | Interface | org.apache.spark.streaming.kafka |
| HingeGradient | Compute gradient and loss for a Hinge loss function, as used in SVM binary classification. | Class | org.apache.spark.mllib.optimization |
| HiveContext | An instance of the Spark SQL execution engine that integrates with data stored in Hive. | Class | org.apache.spark.sql.hive |
| HttpBroadcastFactory | A BroadcastFactory implementation that uses an HTTP server as the broadcast mechanism. | Class | org.apache.spark.broadcast |
| Identifiable | Trait for an object with an immutable unique ID that identifies itself and its derivatives. | Interface | org.apache.spark.ml.util |
| IDF | Compute the Inverse Document Frequency (IDF) given a collection of documents. | Class | org.apache.spark.ml.feature |
| IDF | Inverse document frequency (IDF). | Class | org.apache.spark.mllib.feature |
| IDFModel |  | Class | org.apache.spark.ml.feature |
| IDFModel | Represents an IDF model that can transform term frequency vectors. | Class | org.apache.spark.mllib.feature |
| Impurity | Trait for calculating information gain. | Interface | org.apache.spark.mllib.tree.impurity |
| In | A filter that evaluates to true iff the attribute evaluates to one of the values in the array. | Class | org.apache.spark.sql.sources |
| IndexedRow | Represents a row of IndexedRowMatrix. | Class | org.apache.spark.mllib.linalg.distributed |
| IndexedRowMatrix |  | Class | org.apache.spark.mllib.linalg.distributed |
| IndexToString |  | Class | org.apache.spark.ml.feature |
| InformationGainStats | Information gain statistics for each split. | Class | org.apache.spark.mllib.tree.model |
| InnerClosureFinder |  | Class | org.apache.spark.util |
| InputDStream | This is the abstract base class for all input streams. | Class | org.apache.spark.streaming.dstream |
| InputFormatInfo | Parses and holds information about inputFormat (and files) specified as a parameter. | Class | org.apache.spark.scheduler |
| InputMetricDistributions |  | Class | org.apache.spark.status.api.v1 |
| InputMetrics |  | Class | org.apache.spark.status.api.v1 |
| InsertableRelation | A BaseRelation that can be used to insert data into it through the insert method. | Interface | org.apache.spark.sql.sources |
| IntArrayParam | Specialized version of Param[Array[Int]] for Java. | Class | org.apache.spark.ml.param |
| IntegerType | The data type representing Int values. | Class | org.apache.spark.sql.types |
| Interaction | Implements the feature interaction transform. | Class | org.apache.spark.ml.feature |
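
HashPartitioner above is the standard hash-based Partitioner for key-value RDDs. A small generic sketch that repartitions a JavaPairRDD by key hash (the partition count of 8 is arbitrary):

```java
import org.apache.spark.HashPartitioner;
import org.apache.spark.api.java.JavaPairRDD;

class PartitionerSketch {
    // Repartitions a pair RDD by the hash of its keys into 8 partitions,
    // so that later co-partitioned operations can avoid extra shuffles.
    static <K, V> JavaPairRDD<K, V> byKeyHash(JavaPairRDD<K, V> pairs) {
        return pairs.partitionBy(new HashPartitioner(8));
    }
}
```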

| Name | Description | Type | Package |
| --- | --- | --- | --- |
| InternalNode | Internal Decision Tree node. | Class | org.apache.spark.ml.tree |
| InterruptibleIterator | An iterator that wraps around an existing iterator to provide task killing functionality. | Class | org.apache.spark |
| IntParam | Specialized version of Param[Int] for Java. | Class | org.apache.spark.ml.param |
| IsNotNull | A filter that evaluates to true iff the attribute evaluates to a non-null value. | Class | org.apache.spark.sql.sources |
| IsNull | A filter that evaluates to true iff the attribute evaluates to null. | Class | org.apache.spark.sql.sources |
| IsotonicRegression |  | Class | org.apache.spark.ml.regression |
| IsotonicRegression |  | Class | org.apache.spark.mllib.regression |
| IsotonicRegressionModel | Model fitted by IsotonicRegression. | Class | org.apache.spark.ml.regression |
| IsotonicRegressionModel | Regression model for isotonic regression. | Class | org.apache.spark.mllib.regression |
| JavaDoubleRDD |  | Class | org.apache.spark.api.java |
| JavaDStream | A Java-friendly interface to DStream, the basic abstraction in Spark Streaming that represents a continuous stream of data. | Class | org.apache.spark.streaming.api.java |
| JavaDStreamLike |  | Interface | org.apache.spark.streaming.api.java |
| JavaFutureAction |  | Interface | org.apache.spark.api.java |
| JavaHadoopRDD |  | Class | org.apache.spark.api.java |
| JavaInputDStream | A Java-friendly interface to InputDStream. | Class | org.apache.spark.streaming.api.java |
| JavaIterableWrapperSerializer | A Kryo serializer for serializing results returned by asJavaIterable. | Class | org.apache.spark.serializer |
| JavaMapWithStateDStream | DStream representing the stream of data generated by the mapWithState operation on a JavaPairDStream. | Class | org.apache.spark.streaming.api.java |
| JavaNewHadoopRDD |  | Class | org.apache.spark.api.java |
| JavaPairDStream | A Java-friendly interface to a DStream of key-value pairs, which provides extra methods like reduceByKey and join. | Class | org.apache.spark.streaming.api.java |
| JavaPairInputDStream | A Java-friendly interface to InputDStream of key-value pairs. | Class | org.apache.spark.streaming.api.java |
| JavaPairRDD | (word-count usage sketch below) | Class | org.apache.spark.api.java |
| JavaPairReceiverInputDStream | A Java-friendly interface to ReceiverInputDStream, the abstract class for defining any input stream that receives data over the network. | Class | org.apache.spark.streaming.api.java |
| JavaParams | Java-friendly wrapper for Params. | Class | org.apache.spark.ml.param |
| JavaRDD |  | Class | org.apache.spark.api.java |
| JavaRDDLike | Defines operations common to several Java RDD implementations. | Interface | org.apache.spark.api.java |
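
JavaRDD, JavaPairRDD and the related Java*DStream wrappers above make the Scala RDD and DStream APIs usable from Java. A hedged word-count sketch against the Spark 1.x Java API, where FlatMapFunction returns an Iterable (in 2.x it returns an Iterator, so this would need adjusting):

```java
import java.util.Arrays;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import scala.Tuple2;

class WordCountSketch {
    // lines is any RDD of text lines, e.g. produced by JavaSparkContext.textFile.
    static JavaPairRDD<String, Integer> count(JavaRDD<String> lines) {
        return lines
            .flatMap(line -> Arrays.asList(line.split(" ")))   // split lines into words
            .mapToPair(word -> new Tuple2<>(word, 1))          // (word, 1) pairs
            .reduceByKey((a, b) -> a + b);                     // sum counts per word
    }
}
```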

| Name | Description | Type | Package |
| --- | --- | --- | --- |
| JavaReceiverInputDStream | A Java-friendly interface to ReceiverInputDStream, the abstract class for defining any input stream that receives data over the network. | Class | org.apache.spark.streaming.api.java |
| JavaSerializer | A Spark serializer that uses Java's built-in serialization. | Class | org.apache.spark.serializer |
| JavaSparkContext | A Java-friendly version of SparkContext that returns JavaRDDs and works with Java collections instead of Scala ones. | Class | org.apache.spark.api.java |
| JavaSparkListener | Java clients should extend this class instead of implementing SparkListener directly. | Class | org.apache.spark |
| JavaSparkStatusTracker | Low-level status reporting APIs for monitoring job and stage progress. | Class | org.apache.spark.api.java |
| JavaStreamingContext | A Java-friendly version of StreamingContext which is the main entry point for Spark Streaming functionality. | Class | org.apache.spark.streaming.api.java |
| JdbcDialect | Encapsulates everything (extensions, workarounds, quirks) to handle the SQL dialect of a certain database or jdbc driver. | Class | org.apache.spark.sql.jdbc |
| JdbcDialects | Registry of dialects that apply to every new jdbc DataFrame. | Class | org.apache.spark.sql.jdbc |
| JdbcRDD | An RDD that executes an SQL query on a JDBC connection and reads results. | Class | org.apache.spark.rdd |
| JdbcType | A database type definition coupled with the jdbc type needed to send null values to the database. | Class | org.apache.spark.sql.jdbc |
| JobData |  | Class | org.apache.spark.status.api.v1 |
| JobExecutionStatus | Enum of job execution statuses. | Class | org.apache.spark |
| JobLogger | A logger class to record runtime information for jobs in Spark. | Class | org.apache.spark.scheduler |
| JobProgressListener | Tracks task-level information to be displayed in the UI. | Class | org.apache.spark.ui.jobs |
| JobSucceeded |  | Class | org.apache.spark.scheduler |
| KafkaUtils |  | Class | org.apache.spark.streaming.kafka |
| KernelDensity | Kernel density estimation. | Class | org.apache.spark.mllib.stat |
| KillTask |  | Class | org.apache.spark.scheduler.local |
| KinesisUtils |  | Class | org.apache.spark.streaming.kinesis |
| KinesisUtilsPythonHelper | A helper class that wraps the methods in KinesisUtils into a more Python-friendly class and function so that they can be easily instantiated and called from Python's KinesisUtils. | Class | org.apache.spark.streaming.kinesis |
| KMeans |  | Class | org.apache.spark.ml.clustering |
| KMeans | K-means clustering with support for multiple parallel runs and a k-means++ like initialization mode (the k-means\|\| algorithm by Bahmani et al). Usage sketch below. | Class | org.apache.spark.mllib.clustering |
| KMeansDataGenerator | Generate test data for KMeans. | Class | org.apache.spark.mllib.util |
| KMeansModel | Model fitted by KMeans. | Class | org.apache.spark.ml.clustering |
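
For the mllib KMeans entry above, training goes through the static KMeans.train helper on an RDD of Vectors. A sketch with a tiny hard-coded dataset; k and the iteration count are arbitrary:

```java
import java.util.Arrays;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.mllib.clustering.KMeans;
import org.apache.spark.mllib.clustering.KMeansModel;
import org.apache.spark.mllib.linalg.Vector;
import org.apache.spark.mllib.linalg.Vectors;

class KMeansSketch {
    static KMeansModel cluster(JavaSparkContext sc) {
        JavaRDD<Vector> points = sc.parallelize(Arrays.asList(
            Vectors.dense(0.0, 0.0), Vectors.dense(0.1, 0.1),
            Vectors.dense(9.0, 9.0), Vectors.dense(9.1, 9.1)));

        // k = 2 clusters, 20 iterations; train() expects the underlying Scala RDD.
        KMeansModel model = KMeans.train(points.rdd(), 2, 20);

        System.out.println("Within-set sum of squared errors: " + model.computeCost(points.rdd()));
        return model;
    }
}
```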

| Name | Description | Type | Package |
| --- | --- | --- | --- |
| KMeansModel | A clustering model for K-means. | Class | org.apache.spark.mllib.clustering |
| KolmogorovSmirnovTestResult | Object containing the test results for the Kolmogorov-Smirnov test. | Class | org.apache.spark.mllib.stat.test |
| KryoRegistrator |  | Interface | org.apache.spark.serializer |
| KryoSerializer | A Spark serializer that uses the Kryo serialization library. | Class | org.apache.spark.serializer |
| L1Updater | Updater for L1 regularized problems. | Class | org.apache.spark.mllib.optimization |
| LabelConverter | Label to vector converter. | Class | org.apache.spark.ml.classification |
| LabeledPoint | Class that represents the features and labels of a data point. | Class | org.apache.spark.mllib.regression |
| LabelPropagation | Label Propagation algorithm. | Class | org.apache.spark.graphx.lib |
| LassoModel | Regression model trained using Lasso. | Class | org.apache.spark.mllib.regression |
| LassoWithSGD | Train a regression model with L1-regularization using Stochastic Gradient Descent. | Class | org.apache.spark.mllib.regression |
| LBFGS | Class used to solve an optimization problem using Limited-memory BFGS. | Class | org.apache.spark.mllib.optimization |
| LDA | Latent Dirichlet Allocation (LDA), a topic model designed for text documents. | Class | org.apache.spark.ml.clustering |
| LDA | Latent Dirichlet Allocation (LDA), a topic model designed for text documents. | Class | org.apache.spark.mllib.clustering |
| LDAModel | Model fitted by LDA. | Class | org.apache.spark.ml.clustering |
| LDAModel | Latent Dirichlet Allocation (LDA) model. | Class | org.apache.spark.mllib.clustering |
| LDAOptimizer | An LDAOptimizer specifies which optimization/learning/inference algorithm to use, and it can hold optimizer-specific parameters for users to set. | Interface | org.apache.spark.mllib.clustering |
| LeafNode | Decision tree leaf node. | Class | org.apache.spark.ml.tree |
| LeastSquaresAggregator | LeastSquaresAggregator computes the gradient and loss for a Least-squared loss function, as used in linear regression, for samples in sparse or dense vector format in an online fashion. | Class | org.apache.spark.ml.regression |
| LeastSquaresCostFun | LeastSquaresCostFun implements Breeze's DiffFunction[T] for Least Squares cost. | Class | org.apache.spark.ml.regression |
| LeastSquaresGradient | Compute gradient and loss for a Least-squared loss function, as used in linear regression. | Class | org.apache.spark.mllib.optimization |
| LessThan | A filter that evaluates to true iff the attribute evaluates to a value less than value. | Class | org.apache.spark.sql.sources |
| LessThanOrEqual | A filter that evaluates to true iff the attribute evaluates to a value less than or equal to value. | Class | org.apache.spark.sql.sources |
| LinearDataGenerator | Generate sample data used for Linear Data. | Class | org.apache.spark.mllib.util |
| LinearRegression | The learning objective is to minimize the squared error, with regularization. | Class | org.apache.spark.ml.regression |
| LinearRegressionModel | Model produced by LinearRegression. | Class | org.apache.spark.ml.regression |
| LinearRegressionModel | Regression model trained using LinearRegression. | Class | org.apache.spark.mllib.regression |
| LinearRegressionSummary |  | Class | org.apache.spark.ml.regression |
| LinearRegressionTrainingSummary |  | Class | org.apache.spark.ml.regression |
| LinearRegressionWithSGD | Train a linear regression model with no regularization using Stochastic Gradient Descent. | Class | org.apache.spark.mllib.regression |
| Loader | Trait for classes which can load models and transformers from files. | Interface | org.apache.spark.mllib.util |
| LocalLDAModel | Local (non-distributed) model fitted by LDA. | Class | org.apache.spark.ml.clustering |
| LocalLDAModel | This model stores only the inferred topics. | Class | org.apache.spark.mllib.clustering |
| Logging | Utility trait for classes that want to log data. | Interface | org.apache.spark |
| LogisticAggregator | LogisticAggregator computes the gradient and loss for the binary logistic loss function, as used in binary classification, for instances in sparse or dense vector format in an online fashion. | Class | org.apache.spark.ml.classification |
| LogisticCostFun | LogisticCostFun implements Breeze's DiffFunction[T] for a multinomial logistic loss function, as used in multi-class classification (it is also used in binary logistic regression). | Class | org.apache.spark.ml.classification |
| LogisticGradient | Compute gradient and loss for a multinomial logistic loss function, as used in multi-class classification (it is also used in binary logistic regression). | Class | org.apache.spark.mllib.optimization |
| LogisticRegression | Logistic regression (usage sketch below). | Class | org.apache.spark.ml.classification |
| LogisticRegressionDataGenerator | Generate test data for LogisticRegression. | Class | org.apache.spark.mllib.util |
| LogisticRegressionModel | Model produced by LogisticRegression. | Class | org.apache.spark.ml.classification |
| LogisticRegressionModel | Classification model trained using Multinomial/Binary Logistic Regression. | Class | org.apache.spark.mllib.classification |
| LogisticRegressionSummary | Abstraction for Logistic Regression results for a given model. | Interface | org.apache.spark.ml.classification |
| LogisticRegressionTrainingSummary | Abstraction for multinomial Logistic Regression training results. | Interface | org.apache.spark.ml.classification |
| LogisticRegressionWithLBFGS | Train a classification model for Multinomial/Binary Logistic Regression using Limited-memory BFGS. | Class | org.apache.spark.mllib.classification |
| LogisticRegressionWithSGD | Train a classification model for Binary Logistic Regression using Stochastic Gradient Descent. | Class | org.apache.spark.mllib.classification |
| LogLoss | Class for log loss calculation (for classification). | Class | org.apache.spark.mllib.tree.loss |
| LogNormalGenerator | Generates i.i.d. samples from the log normal distribution with the given mean and standard deviation. | Class | org.apache.spark.mllib.random |
| LongParam | Specialized version of Param[Long] for Java. | Class | org.apache.spark.ml.param |
| LongType | The data type representing Long values. | Class | org.apache.spark.sql.types |
| Loss | Trait for adding "pluggable" loss functions for the gradient boosting algorithm. | Interface | org.apache.spark.mllib.tree.loss |
| Losses |  | Class | org.apache.spark.mllib.tree.loss |
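
The ml LogisticRegression / LogisticRegressionModel pair above follows the usual estimator/model split. A hedged sketch assuming DataFrames with the conventional "label" and "features" columns; the hyperparameter values are placeholders:

```java
import org.apache.spark.ml.classification.LogisticRegression;
import org.apache.spark.ml.classification.LogisticRegressionModel;
import org.apache.spark.sql.DataFrame;

class LogisticRegressionSketch {
    // training/test are assumed to have "label" (double) and "features" (Vector) columns.
    static DataFrame trainAndPredict(DataFrame training, DataFrame test) {
        LogisticRegression lr = new LogisticRegression()
            .setMaxIter(10)
            .setRegParam(0.01);

        LogisticRegressionModel model = lr.fit(training);
        return model.transform(test);   // adds rawPrediction, probability, prediction columns
    }
}
```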

| Name | Description | Type | Package |
| --- | --- | --- | --- |
| LZ4CompressionCodec | LZ4 implementation of CompressionCodec. | Class | org.apache.spark.io |
| LZFCompressionCodec | LZF implementation of CompressionCodec. | Class | org.apache.spark.io |
| MapFunction | Base interface for a map function used in Dataset's map function. | Interface | org.apache.spark.api.java.function |
| MapGroupsFunction | Base interface for a map function used in GroupedDataset's mapGroup function. | Interface | org.apache.spark.api.java.function |
| MapPartitionsFunction | Base interface for a function used in Dataset's mapPartitions. | Interface | org.apache.spark.api.java.function |
| MapType | The data type for Maps. | Class | org.apache.spark.sql.types |
| MapWithStateDStream | DStream representing the stream of data generated by the mapWithState operation on a pair DStream. Additionally, it gives access to the stream of state snapshots, that is, the state data of all keys. | Class | org.apache.spark.streaming.dstream |
| Matrices | Factory methods for Matrix. | Class | org.apache.spark.mllib.linalg |
| Matrix | Trait for a local matrix. | Interface | org.apache.spark.mllib.linalg |
| MatrixEntry | Represents an entry in a distributed matrix. | Class | org.apache.spark.mllib.linalg.distributed |
| MatrixFactorizationModel | Model representing the result of matrix factorization. | Class | org.apache.spark.mllib.recommendation |
| MemoryEntry |  | Class | org.apache.spark.storage |
| Metadata | Metadata is a wrapper over Map[String, Any] that limits the value type to simple ones: Boolean, Long, Double, String, Metadata, Array[Boolean], Array[Long], Array[Double], Array[String], and Array[Metadata]. | Class | org.apache.spark.sql.types |
| MetadataBuilder | Builder for Metadata. | Class | org.apache.spark.sql.types |
| MethodIdentifier | Helper class to identify a method. | Class | org.apache.spark.util |
| MFDataGenerator | Generate RDD(s) containing data for Matrix Factorization. | Class | org.apache.spark.mllib.util |
| Milliseconds | Helper object that creates instances of Duration representing a given number of milliseconds. | Class | org.apache.spark.streaming |
| MinMaxScaler | Rescale each feature individually to a common range [min, max] linearly using column summary statistics, which is also known as min-max normalization or rescaling. | Class | org.apache.spark.ml.feature |
| MinMaxScalerModel |  | Class | org.apache.spark.ml.feature |
| Minutes | Helper object that creates instances of Duration representing a given number of minutes. | Class | org.apache.spark.streaming |
| MLPairRDDFunctions | Machine learning specific Pair RDD functions. | Class | org.apache.spark.mllib.rdd |
| MLReadable | Trait for objects that provide MLReader. | Interface | org.apache.spark.ml.util |
| MLReader | Abstract class for utility classes that can load ML instances. | Class | org.apache.spark.ml.util |
| MLUtils | Helper methods to load, save and pre-process data used in MLlib (usage sketch below). | Class | org.apache.spark.mllib.util |
| MLWritable | Trait for classes that provide MLWriter. | Interface | org.apache.spark.ml.util |
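
MLUtils above is the usual way to load LIBSVM-formatted data for the mllib algorithms. A sketch; the file path is only a placeholder:

```java
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.mllib.regression.LabeledPoint;
import org.apache.spark.mllib.util.MLUtils;

class MLUtilsSketch {
    // Loads a LIBSVM-formatted file into labeled points; the path is a placeholder.
    static JavaRDD<LabeledPoint> load(JavaSparkContext sc) {
        return MLUtils.loadLibSVMFile(sc.sc(), "data/sample_libsvm_data.txt").toJavaRDD();
    }
}
```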

| Name | Description | Type | Package |
| --- | --- | --- | --- |
| MLWriter | Abstract class for utility classes that can save ML instances. | Class | org.apache.spark.ml.util |
| Model | A fitted model, i.e., a Transformer produced by an Estimator. | Class | org.apache.spark.ml |
| MQTTUtils |  | Class | org.apache.spark.streaming.mqtt |
| MsSqlServerDialect |  | Class | org.apache.spark.sql.jdbc |
| MulticlassClassificationEvaluator | Evaluator for multiclass classification, which expects two input columns: score and label. | Class | org.apache.spark.ml.evaluation |
| MulticlassMetrics | Evaluator for multiclass classification. | Class | org.apache.spark.mllib.evaluation |
| MultilabelMetrics | Evaluator for multilabel classification. | Class | org.apache.spark.mllib.evaluation |
| MultilayerPerceptronClassificationModel | Classification model based on the Multilayer Perceptron. | Class | org.apache.spark.ml.classification |
| MultilayerPerceptronClassifier | Classifier trainer based on the Multilayer Perceptron. | Class | org.apache.spark.ml.classification |
| MultivariateGaussian | This class provides basic functionality for a Multivariate Gaussian (Normal) Distribution. | Class | org.apache.spark.mllib.stat.distribution |
| MultivariateOnlineSummarizer | MultivariateOnlineSummarizer implements MultivariateStatisticalSummary to compute the mean, variance, minimum, maximum, counts, and nonzero counts for instances in sparse or dense vector format. | Class | org.apache.spark.mllib.stat |
| MultivariateStatisticalSummary | Trait for multivariate statistical summary of a data matrix. | Interface | org.apache.spark.mllib.stat |
| MutableAggregationBuffer | A Row representing a mutable aggregation buffer. | Class | org.apache.spark.sql.expressions |
| MutablePair | A tuple of 2 elements. | Class | org.apache.spark.util |
| MySQLDialect |  | Class | org.apache.spark.sql.jdbc |
| NaiveBayes | Naive Bayes Classifiers. | Class | org.apache.spark.ml.classification |
| NaiveBayes | (usage sketch below) | Class | org.apache.spark.mllib.classification |
| NaiveBayesModel | Model produced by NaiveBayes. | Class | org.apache.spark.ml.classification |
| NaiveBayesModel | Model for Naive Bayes Classifiers. | Class | org.apache.spark.mllib.classification |
| NarrowDependency | Base class for dependencies where each partition of the child RDD depends on a small number of partitions of the parent RDD. | Class | org.apache.spark |
| NewHadoopRDD | An RDD that provides core functionality for reading data stored in Hadoop. | Class | org.apache.spark.rdd |
| NGram | A feature transformer that converts the input array of strings into an array of n-grams. | Class | org.apache.spark.ml.feature |
| Node | Decision tree node interface. | Class | org.apache.spark.ml.tree |
| Node | Node in a decision tree. | Class | org.apache.spark.mllib.tree.model |
| NominalAttribute | A nominal attribute. | Class | org.apache.spark.ml.attribute |
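
For the mllib NaiveBayes entry above, training takes an RDD of LabeledPoint plus a smoothing parameter. A sketch (the smoothing value is an arbitrary example):

```java
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.mllib.classification.NaiveBayes;
import org.apache.spark.mllib.classification.NaiveBayesModel;
import org.apache.spark.mllib.regression.LabeledPoint;

class NaiveBayesSketch {
    // training is an RDD of LabeledPoint, e.g. loaded with MLUtils.loadLibSVMFile.
    static NaiveBayesModel train(JavaRDD<LabeledPoint> training) {
        double lambda = 1.0;   // additive (Laplace) smoothing
        return NaiveBayes.train(training.rdd(), lambda);
    }
}
```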

| Name | Description | Type | Package |
| --- | --- | --- | --- |
| NoopDialect | NOOP dialect object, always returning the neutral element. | Class | org.apache.spark.sql.jdbc |
| Normalizer | Normalize a vector to have unit norm using the given p-norm. | Class | org.apache.spark.ml.feature |
| Normalizer | Normalizes samples individually to unit L^p norm. | Class | org.apache.spark.mllib.feature |
| Not | A filter that evaluates to true iff child is evaluated to false. | Class | org.apache.spark.sql.sources |
| NullType | The data type representing NULL values. | Class | org.apache.spark.sql.types |
| NumericAttribute | A numeric attribute with optional summary statistics. | Class | org.apache.spark.ml.attribute |
| NumericType | Numeric data types. | Class | org.apache.spark.sql.types |
| OffsetRange | Represents a range of offsets from a single Kafka TopicAndPartition. | Class | org.apache.spark.streaming.kafka |
| OneHotEncoder | A one-hot encoder that maps a column of category indices to a column of binary vectors, with at most a single one-value per row that indicates the input category index (usage sketch below). | Class | org.apache.spark.ml.feature |
| OneToOneDependency | Represents a one-to-one dependency between partitions of the parent and child RDDs. | Class | org.apache.spark |
| OneVsRest | Reduction of Multiclass Classification to Binary Classification. | Class | org.apache.spark.ml.classification |
| OneVsRestModel | Model produced by OneVsRest. | Class | org.apache.spark.ml.classification |
| OnlineLDAOptimizer | An online optimizer for LDA. | Class | org.apache.spark.mllib.clustering |
| Optimizer | Trait for optimization problem solvers. | Interface | org.apache.spark.mllib.optimization |
| Or | A filter that evaluates to true iff at least one of left or right evaluates to true. | Class | org.apache.spark.sql.sources |
| OracleDialect |  | Class | org.apache.spark.sql.jdbc |
| OrderedRDDFunctions | Extra functions available on RDDs of (key, value) pairs where the key is sortable through an implicit conversion. | Class | org.apache.spark.rdd |
| OutputMetricDistributions |  | Class | org.apache.spark.status.api.v1 |
| OutputMetrics |  | Class | org.apache.spark.status.api.v1 |
| OutputOperationInfo | Class having information on output operations. | Class | org.apache.spark.streaming.scheduler |
| OutputWriter | OutputWriter is used together with HadoopFsRelation for persisting rows to the underlying file system. | Class | org.apache.spark.sql.sources |
| OutputWriterFactory | A factory that produces OutputWriters. | Class | org.apache.spark.sql.sources |
| PageRank | PageRank algorithm implementation. | Class | org.apache.spark.graphx.lib |
| PairDStreamFunctions | Extra functions available on DStreams of (key, value) pairs through an implicit conversion. | Class | org.apache.spark.streaming.dstream |
| PairFlatMapFunction | A function that returns zero or more key-value pair records from each input record. | Interface | org.apache.spark.api.java.function |
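
OneHotEncoder above turns a column of category indices into sparse binary vectors. A hedged sketch assuming the index column was produced upstream (for example by a StringIndexer, which falls outside this A-P portion of the listing):

```java
import org.apache.spark.ml.feature.OneHotEncoder;
import org.apache.spark.sql.DataFrame;

class OneHotEncoderSketch {
    // indexed is assumed to have a numeric column "categoryIndex".
    static DataFrame encode(DataFrame indexed) {
        OneHotEncoder encoder = new OneHotEncoder()
            .setInputCol("categoryIndex")
            .setOutputCol("categoryVec");
        return encoder.transform(indexed);
    }
}
```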

PairFunctionA function that returns key-value pairs (Tuple2), and can be used to construct PairRDDs.Interfaceorg.apache.spark.api.java.functionApache Sparkjavadoc
PairRDDFunctionsExtra functions available on RDDs of (key, value) pairs through an implicit conversion.Classorg.apache.spark.rddApache Sparkjavadoc
PairwiseRRDDForm an RDD[(Int, Array[Byte])] from key-value pairs returned from R.Classorg.apache.spark.api.rApache Sparkjavadoc
Param A param with self-contained documentation and optionally default value.Classorg.apache.spark.ml.paramApache Sparkjavadoc
ParamGridBuilder Builder for a param grid used in grid search-based model selection.Classorg.apache.spark.ml.tuningApache Sparkjavadoc
ParamMap A param to value map.Classorg.apache.spark.ml.paramApache Sparkjavadoc
ParamPair A param and its value.Classorg.apache.spark.ml.paramApache Sparkjavadoc
ParamsInterfaceorg.apache.spark.ml.paramApache Sparkjavadoc
ParamValidators Factory methods for common validation functions for Param.Classorg.apache.spark.ml.paramApache Sparkjavadoc
PartialResultClassorg.apache.spark.partialApache Sparkjavadoc
PartitionAn identifier for a partition in an RDD.Interfaceorg.apache.sparkApache Sparkjavadoc
PartitionCoalescerCoalesce the partitions of a parent RDD (prev) into fewer partitions, so that each partition of this RDD computes one or more of the parent ones.Classorg.apache.spark.rddApache Sparkjavadoc
PartitionerAn object that defines how the elements in a key-value pair RDD are partitioned by key.Classorg.apache.sparkApache Sparkjavadoc
PartitionGroupClassorg.apache.spark.rddApache Sparkjavadoc
PartitionPruningRDD An RDD used to prune RDD partitions so we can avoid launching tasks on all partitions.Classorg.apache.spark.rddApache Sparkjavadoc
PartitionStrategyInterfaceorg.apache.spark.graphxApache Sparkjavadoc
PCA PCA trains a model to project vectors to a low-dimensional space using PCA.Classorg.apache.spark.ml.featureApache Sparkjavadoc
PCAA feature transformer that projects vectors to a low-dimensional space using PCA.Classorg.apache.spark.mllib.featureApache Sparkjavadoc
PCAModelClassorg.apache.spark.ml.featureApache Sparkjavadoc
PCAModelModel fitted by PCA that can project vectors to a low-dimensional space using PCA.Classorg.apache.spark.mllib.featureApache Sparkjavadoc
Pipeline A simple pipeline, which acts as an estimator.Classorg.apache.spark.mlApache Sparkjavadoc
PipelineModel Represents a fitted pipeline.Classorg.apache.spark.mlApache Sparkjavadoc
PipelineStage A stage in a pipeline, either an Estimator or a Transformer.Classorg.apache.spark.mlApache Sparkjavadoc
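A minimal sketch of chaining stages into a Pipeline; the `tokenizer`, `hashingTF`, and `lr` stages and the `training`/`test` DataFrames are hypothetical:

```java
import org.apache.spark.ml.Pipeline;
import org.apache.spark.ml.PipelineModel;
import org.apache.spark.ml.PipelineStage;
import org.apache.spark.sql.DataFrame;

// Fit all stages in order on the training data, then apply the fitted pipeline.
Pipeline pipeline = new Pipeline()
    .setStages(new PipelineStage[]{tokenizer, hashingTF, lr});
PipelineModel model = pipeline.fit(training);
DataFrame predictions = model.transform(test);
```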
PMMLExportable Export model to the PMML format Predictive Model Markup Language (PMML) is an XML-based file formatInterfaceorg.apache.spark.mllib.pmmlApache Sparkjavadoc
PoissonGenerator Generates i.i.d. samples from the Poisson distribution with the given mean.Classorg.apache.spark.mllib.randomApache Sparkjavadoc

PoissonSampler A sampler for sampling with replacement, based on values drawn from Poisson distribution.Classorg.apache.spark.util.randomApache Sparkjavadoc
PolynomialExpansion Perform feature expansion in a polynomial space.Classorg.apache.spark.ml.featureApache Sparkjavadoc
PortableDataStreamA class that allows DataStreams to be serialized and moved around by not creating them until they need to be readClassorg.apache.spark.inputApache Sparkjavadoc
PostgresDialectClassorg.apache.spark.sql.jdbcApache Sparkjavadoc
PowerIterationClusteringClassorg.apache.spark.mllib.clusteringApache Sparkjavadoc
PowerIterationClusteringModelModel produced by PowerIterationClustering.Classorg.apache.spark.mllib.clusteringApache Sparkjavadoc
PrecisionInfoPrecision parameters for a DecimalSee Also:Serialized FormClassorg.apache.spark.sql.typesApache Sparkjavadoc
PredictPredicted value for a node param: predict predicted valueClassorg.apache.spark.mllib.tree.modelApache Sparkjavadoc
PredictionModel Abstraction for a model for prediction tasks (regression and classification).Classorg.apache.spark.mlApache Sparkjavadoc
Predictor Abstraction for prediction problems (regression and classification).Classorg.apache.spark.mlApache Sparkjavadoc
PrefixSpan A parallel PrefixSpan algorithm to mine frequent sequential patterns.Classorg.apache.spark.mllib.fpmApache Sparkjavadoc
PrefixSpanModelModel fitted by PrefixSpan param: freqSequences frequent sequencesClassorg.apache.spark.mllib.fpmApache Sparkjavadoc
Pregel Unlike the original Pregel API, the GraphX Pregel API factors the sendMessage computation over edges, enables the message sending computation to read both vertex attributes, and constrains messages to the graph structure.Classorg.apache.spark.graphxApache Sparkjavadoc
PrivateA class that is considered private to the internals of Spark -- there is a high-likelihood they will be changed in future versions of Spark.Classorg.apache.spark.annotationApache Sparkjavadoc
ProbabilisticClassificationModel Model produced by a ProbabilisticClassifier.Classorg.apache.spark.ml.classificationApache Sparkjavadoc
ProbabilisticClassifier Single-label binary or multiclass classifier which can output class conditional probabilities.Classorg.apache.spark.ml.classificationApache Sparkjavadoc
PrunedFilteredScan A BaseRelation that can eliminate unneeded columns and filter using selected predicates before producing an RDD containing all matching tuples as Row objects.Interfaceorg.apache.spark.sql.sourcesApache Sparkjavadoc
PrunedScan A BaseRelation that can eliminate unneeded columns before producing an RDD containing all of its tuples as Row objects.Interfaceorg.apache.spark.sql.sourcesApache Sparkjavadoc
QRDecompositionClassorg.apache.spark.mllib.linalgApache Sparkjavadoc
QuantileDiscretizer QuantileDiscretizer takes a column with continuous features and outputs a column with binned categorical features.Classorg.apache.spark.ml.featureApache Sparkjavadoc
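A minimal sketch, assuming a DataFrame `df` with a continuous `hour` column:

```java
import org.apache.spark.ml.feature.QuantileDiscretizer;
import org.apache.spark.sql.DataFrame;

// Bin the continuous "hour" column into 3 approximately equal-frequency buckets.
QuantileDiscretizer discretizer = new QuantileDiscretizer()
    .setInputCol("hour")
    .setOutputCol("hourBucket")
    .setNumBuckets(3);
DataFrame bucketed = discretizer.fit(df).transform(df);
```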
QuantileStrategyEnum for selecting the quantile calculation strategySee Also:Serialized FormClassorg.apache.spark.mllib.tree.configurationApache Sparkjavadoc
QueryExecutionListener The interface of query execution listener that can be used to analyze execution metrics.Interfaceorg.apache.spark.sql.utilApache Sparkjavadoc
RandomDataGenerator Trait for random data generators that generate i.i.d. data.Interfaceorg.apache.spark.mllib.randomApache Sparkjavadoc
RandomForestA class that implements a Random Forest learning algorithm for classification and regression.Classorg.apache.spark.mllib.treeApache Sparkjavadoc
RandomForestClassificationModel Random Forest model for classification.Classorg.apache.spark.ml.classificationApache Sparkjavadoc

RandomForestClassifier Random Forest learning algorithm for classification. It supports both binary and multiclass labels, as well as both continuous and categorical features.Classorg.apache.spark.ml.classificationApache Sparkjavadoc
RandomForestModelRepresents a random forest model.Classorg.apache.spark.mllib.tree.modelApache Sparkjavadoc
RandomForestRegressionModel Random Forest model for regression.Classorg.apache.spark.ml.regressionApache Sparkjavadoc
RandomForestRegressor Random Forest learning algorithm for regression.Classorg.apache.spark.ml.regressionApache Sparkjavadoc
RandomRDDsGenerator methods for creating RDDs comprised of i.i.d. samples from some distribution.Classorg.apache.spark.mllib.randomApache Sparkjavadoc
RandomSampler A pseudorandom sampler.Interfaceorg.apache.spark.util.randomApache Sparkjavadoc
RangeDependency Represents a one-to-one dependency between ranges of partitions in the parent and child RDDs.Classorg.apache.sparkApache Sparkjavadoc
RangePartitionerA Partitioner that partitions sortable records by range into roughly equal ranges.Classorg.apache.sparkApache Sparkjavadoc
RankingMetrics Evaluator for ranking algorithms.Classorg.apache.spark.mllib.evaluationApache Sparkjavadoc
RatingA more compact class to represent a rating than Tuple3[Int, Int, Double].Classorg.apache.spark.mllib.recommendationApache Sparkjavadoc
RDDA Resilient Distributed Dataset (RDD), the basic abstraction in Spark.Classorg.apache.spark.rddApache Sparkjavadoc
RDDBlockIdClassorg.apache.spark.storageApache Sparkjavadoc
RDDDataDistributionClassorg.apache.spark.status.api.v1Apache Sparkjavadoc
RDDFunctionsMachine learning specific RDD functions.Classorg.apache.spark.mllib.rddApache Sparkjavadoc
RDDInfoClassorg.apache.spark.storageApache Sparkjavadoc
RDDPartitionInfoClassorg.apache.spark.status.api.v1Apache Sparkjavadoc
RDDStorageInfoClassorg.apache.spark.status.api.v1Apache Sparkjavadoc
Receiver Abstract class of a receiver that can be run on worker nodes to receive external data.Classorg.apache.spark.streaming.receiverApache Sparkjavadoc
ReceiverInfo Class having information about a receiverSee Also:Serialized FormClassorg.apache.spark.streaming.schedulerApache Sparkjavadoc
ReceiverInputDStreamAbstract class for defining any InputDStream that has to start a receiver on worker nodes to receive external data.Classorg.apache.spark.streaming.dstreamApache Sparkjavadoc
ReduceFunctionBase interface for function used in Dataset's reduce.Interfaceorg.apache.spark.api.java.functionApache Sparkjavadoc
RegexTokenizer A regex based tokenizer that extracts tokens either by using the provided regex pattern to split the text (default) or repeatedly matching the regex (if gaps is false).Classorg.apache.spark.ml.featureApache Sparkjavadoc
RegressionEvaluator Evaluator for regression, which expects two input columns: prediction and label.Classorg.apache.spark.ml.evaluationApache Sparkjavadoc
RegressionMetricsEvaluator for regression.Classorg.apache.spark.mllib.evaluationApache Sparkjavadoc
RegressionModel Model produced by a Regressor.Classorg.apache.spark.ml.regressionApache Sparkjavadoc

RegressionModelInterfaceorg.apache.spark.mllib.regressionApache Sparkjavadoc
RelationProvider Implemented by objects that produce relations for a specific kind of data source.Interfaceorg.apache.spark.sql.sourcesApache Sparkjavadoc
Resubmitted A ShuffleMapTask that completed successfully earlier, but we lost the executor before the stage completed.Classorg.apache.sparkApache Sparkjavadoc
ReturnStatementFinderClassorg.apache.spark.utilApache Sparkjavadoc
ReviveOffersClassorg.apache.spark.scheduler.localApache Sparkjavadoc
RFormula Implements the transforms required for fitting a dataset against an R model formula.Classorg.apache.spark.ml.featureApache Sparkjavadoc
RFormulaModel A fitted RFormula.Classorg.apache.spark.ml.featureApache Sparkjavadoc
RidgeRegressionModelRegression model trained using RidgeRegression.Classorg.apache.spark.mllib.regressionApache Sparkjavadoc
RidgeRegressionWithSGDTrain a regression model with L2-regularization using Stochastic Gradient Descent.Classorg.apache.spark.mllib.regressionApache Sparkjavadoc
RowRepresents one row of output from a relational operator.Interfaceorg.apache.spark.sqlApache Sparkjavadoc
RowFactoryA factory class used to construct Row objects.Classorg.apache.spark.sqlApache Sparkjavadoc
RowMatrixRepresents a row-oriented distributed Matrix with no meaningful row indices.Classorg.apache.spark.mllib.linalg.distributedApache Sparkjavadoc
RpcUtilsClassorg.apache.spark.utilApache Sparkjavadoc
RRDDAn RDD that stores serialized R objects as Array[Byte].Classorg.apache.spark.api.rApache Sparkjavadoc
RuntimePercentageClassorg.apache.spark.schedulerApache Sparkjavadoc
Saveable Trait for models and transformers which may be saved as files.Interfaceorg.apache.spark.mllib.utilApache Sparkjavadoc
SaveModeSaveMode is used to specify the expected behavior of saving a DataFrame to a data source.Classorg.apache.spark.sqlApache Sparkjavadoc
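A minimal sketch of specifying a SaveMode when writing a DataFrame; the DataFrame `df` and the output path are hypothetical:

```java
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SaveMode;

// Overwrite any existing data at the target location.
df.write().mode(SaveMode.Overwrite).parquet("hdfs:///tmp/output");
```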
SchedulingMode"FAIR" and "FIFO" determine which policy is used to order tasks amongst a Schedulable's sub-queues.Classorg.apache.spark.schedulerApache Sparkjavadoc
SchemaRelationProvider Implemented by objects that produce relations for a specific kind of data source with a given schema.Interfaceorg.apache.spark.sql.sourcesApache Sparkjavadoc
ScriptTransformationWriterThreadClassorg.apache.spark.sql.hive.executionApache Sparkjavadoc
SecondsHelper object that creates instance of Duration representing a given number of seconds.Classorg.apache.spark.streamingApache Sparkjavadoc
SequenceFileRDDFunctionsExtra functions available on RDDs of (key, value) pairs to create a Hadoop SequenceFile, through an implicit conversion.Classorg.apache.spark.rddApache Sparkjavadoc
SerializableWritableClassorg.apache.sparkApache Sparkjavadoc
SerializationStream A stream for writing serialized objects.Classorg.apache.spark.serializerApache Sparkjavadoc
Serializer A serializer.Classorg.apache.spark.serializerApache Sparkjavadoc

SerializerInstance An instance of a serializer, for use by one thread at a time.Classorg.apache.spark.serializerApache Sparkjavadoc
ShortestPathsComputes shortest paths to the given set of landmark vertices, returning a graph where each vertex attribute is a map containing the shortest-path distance to each reachable landmark.Classorg.apache.spark.graphx.libApache Sparkjavadoc
ShortType The data type representing Short values.Classorg.apache.spark.sql.typesApache Sparkjavadoc
ShuffleBlockIdClassorg.apache.spark.storageApache Sparkjavadoc
ShuffleDataBlockIdClassorg.apache.spark.storageApache Sparkjavadoc
ShuffleDependency Represents a dependency on the output of a shuffle stage.Classorg.apache.sparkApache Sparkjavadoc
ShuffledRDD The resulting RDD from a shuffle (e.g. repartitioning of data).Classorg.apache.spark.rddApache Sparkjavadoc
ShuffleIndexBlockIdClassorg.apache.spark.storageApache Sparkjavadoc
ShuffleReadMetricDistributionsClassorg.apache.spark.status.api.v1Apache Sparkjavadoc
ShuffleReadMetricsClassorg.apache.spark.status.api.v1Apache Sparkjavadoc
ShuffleWriteMetricDistributionsClassorg.apache.spark.status.api.v1Apache Sparkjavadoc
ShuffleWriteMetricsClassorg.apache.spark.status.api.v1Apache Sparkjavadoc
SignalLoggerHandlerClassorg.apache.spark.utilApache Sparkjavadoc
SimpleFutureActionA FutureAction holding the result of an action that triggers a single job.Classorg.apache.sparkApache Sparkjavadoc
SimpleUpdater A simple updater for gradient descent *without* any regularization.Classorg.apache.spark.mllib.optimizationApache Sparkjavadoc
SingularValueDecompositionRepresents singular value decomposition (SVD) factors.Classorg.apache.spark.mllib.linalgApache Sparkjavadoc
SizeEstimator Estimates the sizes of Java objects (number of bytes of memory they occupy), for use in memory-aware caches.Classorg.apache.spark.utilApache Sparkjavadoc
SnappyCompressionCodec Snappy implementation of CompressionCodec.Classorg.apache.spark.ioApache Sparkjavadoc
SnappyOutputStreamWrapperWrapper over SnappyOutputStream which guards against write-after-close and double-close issues.Classorg.apache.spark.ioApache Sparkjavadoc
SparkAppHandleA handle to a running Spark application.Interfaceorg.apache.spark.launcherApache Sparkjavadoc
SparkConfConfiguration for a Spark application.Classorg.apache.sparkApache Sparkjavadoc
SparkContextMain entry point for Spark functionality.Classorg.apache.sparkApache Sparkjavadoc
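A minimal sketch of configuring and starting a Spark application from Java; the application name and master URL are placeholder values:

```java
import java.util.Arrays;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

SparkConf conf = new SparkConf().setAppName("example").setMaster("local[2]");
JavaSparkContext sc = new JavaSparkContext(conf);

JavaRDD<Integer> nums = sc.parallelize(Arrays.asList(1, 2, 3, 4));
long count = nums.count();   // 4

sc.stop();
```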
SparkEnv Holds all the runtime environment objects for a running Spark instance (either master or worker), including the serializer, Akka actor system, block manager, map output tracker, etc.Classorg.apache.sparkApache Sparkjavadoc
SparkExceptionClassorg.apache.sparkApache Sparkjavadoc
SparkFilesResolves paths to files added through SparkContext.Classorg.apache.sparkApache Sparkjavadoc

SparkFirehoseListenerClass that allows users to receive all SparkListener events.Classorg.apache.sparkApache Sparkjavadoc
SparkFlumeEventA wrapper class for AvroFlumeEvent's with a custom serialization format.Classorg.apache.spark.streaming.flumeApache Sparkjavadoc
SparkJobInfoExposes information about Spark Jobs.Interfaceorg.apache.sparkApache Sparkjavadoc
SparkJobInfoImplClassorg.apache.sparkApache Sparkjavadoc
SparkLauncherLauncher for Spark applications.Classorg.apache.spark.launcherApache Sparkjavadoc
SparkListener Interface for listening to events from the Spark scheduler.Interfaceorg.apache.spark.schedulerApache Sparkjavadoc
SparkListenerApplicationEndClassorg.apache.spark.schedulerApache Sparkjavadoc
SparkListenerApplicationStartClassorg.apache.spark.schedulerApache Sparkjavadoc
SparkListenerBlockManagerAddedClassorg.apache.spark.schedulerApache Sparkjavadoc
SparkListenerBlockManagerRemovedClassorg.apache.spark.schedulerApache Sparkjavadoc
SparkListenerBlockUpdatedClassorg.apache.spark.schedulerApache Sparkjavadoc
SparkListenerEnvironmentUpdateClassorg.apache.spark.schedulerApache Sparkjavadoc
SparkListenerExecutorAddedClassorg.apache.spark.schedulerApache Sparkjavadoc
SparkListenerExecutorMetricsUpdatePeriodic updates from executors.Classorg.apache.spark.schedulerApache Sparkjavadoc
SparkListenerExecutorRemovedClassorg.apache.spark.schedulerApache Sparkjavadoc
SparkListenerJobEndClassorg.apache.spark.schedulerApache Sparkjavadoc
SparkListenerJobStartClassorg.apache.spark.schedulerApache Sparkjavadoc
SparkListenerStageCompletedClassorg.apache.spark.schedulerApache Sparkjavadoc
SparkListenerStageSubmittedClassorg.apache.spark.schedulerApache Sparkjavadoc
SparkListenerTaskEndClassorg.apache.spark.schedulerApache Sparkjavadoc
SparkListenerTaskGettingResultClassorg.apache.spark.schedulerApache Sparkjavadoc
SparkListenerTaskStartClassorg.apache.spark.schedulerApache Sparkjavadoc
SparkListenerUnpersistRDDClassorg.apache.spark.schedulerApache Sparkjavadoc
SparkMasterRegexA collection of regexes for extracting information from the master string.Classorg.apache.sparkApache Sparkjavadoc
SparkShutdownHookClassorg.apache.spark.utilApache Sparkjavadoc

SparkStageInfoExposes information about Spark Stages.Interfaceorg.apache.sparkApache Sparkjavadoc
SparkStageInfoImplClassorg.apache.sparkApache Sparkjavadoc
SparkStatusTrackerLow-level status reporting APIs for monitoring job and stage progress.Classorg.apache.sparkApache Sparkjavadoc
SparseMatrixColumn-major sparse matrix.Classorg.apache.spark.mllib.linalgApache Sparkjavadoc
SparseVectorA sparse vector represented by an index array and an value array.Classorg.apache.spark.mllib.linalgApache Sparkjavadoc
SpecialLengthsClassorg.apache.spark.api.rApache Sparkjavadoc
SpillListenerA SparkListener that detects whether spills have occurred in Spark jobs.Classorg.apache.sparkApache Sparkjavadoc
Split Interface for a "Split," which specifies a test made at a decision tree node to choose the left or right path.Interfaceorg.apache.spark.ml.treeApache Sparkjavadoc
Split Split applied to a feature param: feature feature indexClassorg.apache.spark.mllib.tree.modelApache Sparkjavadoc
SplitInfoClassorg.apache.spark.schedulerApache Sparkjavadoc
SQLContextThe entry point for working with structured data (rows and columns) in Spark.Classorg.apache.spark.sqlApache Sparkjavadoc
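A minimal sketch of using SQLContext as the entry point for structured data; the JavaSparkContext `sc` and the JSON file path are hypothetical:

```java
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SQLContext;

SQLContext sqlContext = new SQLContext(sc);
DataFrame people = sqlContext.read().json("people.json");
people.filter(people.col("age").gt(21)).show();
```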
SQLImplicitsA collection of implicit methods for converting common Scala objects into DataFrames.Classorg.apache.spark.sqlApache Sparkjavadoc
SQLTransformer Implements the transformations which are defined by SQL statement.Classorg.apache.spark.ml.featureApache Sparkjavadoc
SQLUserDefinedType A user-defined type which can be automatically recognized by a SQLContext and registered.Classorg.apache.spark.sql.typesApache Sparkjavadoc
SquaredError Class for squared error loss calculation.Classorg.apache.spark.mllib.tree.lossApache Sparkjavadoc
SquaredL2Updater Updater for L2 regularized problems.Classorg.apache.spark.mllib.optimizationApache Sparkjavadoc
StageDataClassorg.apache.spark.status.api.v1Apache Sparkjavadoc
StageInfo Stores information about a stage to pass from the scheduler to SparkListeners.Classorg.apache.spark.schedulerApache Sparkjavadoc
StageStatusClassorg.apache.spark.status.api.v1Apache Sparkjavadoc
StandardNormalGenerator Generates i.i.d. samples from the standard normal distribution.Classorg.apache.spark.mllib.randomApache Sparkjavadoc
StandardScaler Standardizes features by removing the mean and scaling to unit variance using column summary statistics on the samples in the training set.Classorg.apache.spark.ml.featureApache Sparkjavadoc
StandardScalerStandardizes features by removing the mean and scaling to unit std using column summary statistics on the samples in the training set.Classorg.apache.spark.mllib.featureApache Sparkjavadoc
StandardScalerModelClassorg.apache.spark.ml.featureApache Sparkjavadoc
StandardScalerModelRepresents a StandardScaler model that can transform vectors.Classorg.apache.spark.mllib.featureApache Sparkjavadoc
StatCounterA class for tracking the statistics of a set of numbers (count, mean and variance) in a numerically robust way.Classorg.apache.spark.utilApache Sparkjavadoc

State Abstract class for getting and updating the state in the mapping function used in the mapWithState operation of a pair DStream (Scala) or a JavaPairDStream (Java).Classorg.apache.spark.streamingApache Sparkjavadoc
StateSpec Abstract class representing all the specifications of the DStream transformation mapWithState operation of a pair DStream (Scala) or a JavaPairDStream (Java).Classorg.apache.spark.streamingApache Sparkjavadoc
StatisticsClassorg.apache.spark.mllib.statApache Sparkjavadoc
Statistics Statistics for querying the supervisor about state of workers.Classorg.apache.spark.streaming.receiverApache Sparkjavadoc
StatsReportListenerClassorg.apache.spark.schedulerApache Sparkjavadoc
StatsReportListener A simple StreamingListener that logs summary statistics across Spark Streaming batches param: numBatchInfos Number of last batches to consider for generating statistics (default: 10)Classorg.apache.spark.streaming.schedulerApache Sparkjavadoc
StatusUpdateClassorg.apache.spark.scheduler.localApache Sparkjavadoc
StopCoordinatorClassorg.apache.spark.schedulerApache Sparkjavadoc
StopExecutorClassorg.apache.spark.scheduler.localApache Sparkjavadoc
StopWordsRemover A feature transformer that filters out stop words from input.Classorg.apache.spark.ml.featureApache Sparkjavadoc
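A minimal sketch, assuming a DataFrame `tokenized` with an array-of-strings `words` column:

```java
import org.apache.spark.ml.feature.StopWordsRemover;
import org.apache.spark.sql.DataFrame;

// Drop common English stop words from the tokenized text.
StopWordsRemover remover = new StopWordsRemover()
    .setInputCol("words")
    .setOutputCol("filtered");
DataFrame filtered = remover.transform(tokenized);
```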
StorageLevel Flags for controlling the storage of an RDD.Classorg.apache.spark.storageApache Sparkjavadoc
StorageLevelsExpose some commonly useful storage level constants.Classorg.apache.spark.api.javaApache Sparkjavadoc
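A minimal sketch of caching an RDD with an explicit storage level from Java; the JavaRDD `lines` is a hypothetical input:

```java
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.StorageLevels;

// Keep the RDD in memory, spilling partitions to disk when memory is tight.
JavaRDD<String> cached = lines.persist(StorageLevels.MEMORY_AND_DISK);
```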
StorageListener A SparkListener that prepares information to be displayed on the BlockManagerUI.Classorg.apache.spark.ui.storageApache Sparkjavadoc
StorageStatus Storage information for each BlockManager.Classorg.apache.spark.storageApache Sparkjavadoc
StorageStatusListener A SparkListener that maintains executor storage status.Classorg.apache.spark.storageApache Sparkjavadoc
StrategyStores all the configuration options for tree construction param: algo Learning goal.Classorg.apache.spark.mllib.tree.configurationApache Sparkjavadoc
StreamBlockIdClassorg.apache.spark.storageApache Sparkjavadoc
StreamingContextMain entry point for Spark Streaming functionality.Classorg.apache.spark.streamingApache Sparkjavadoc
StreamingContextState Represents the state of a StreamingContext.Classorg.apache.spark.streamingApache Sparkjavadoc
StreamingKMeansStreamingKMeans provides methods for configuring a streaming k-means analysis, training the model on streaming data, and using the model to make predictions on streaming data.Classorg.apache.spark.mllib.clusteringApache Sparkjavadoc
StreamingKMeansModelStreamingKMeansModel extends MLlib's KMeansModel for streaming algorithms, so it can keep track of a continuously updated weight associated with each cluster.Classorg.apache.spark.mllib.clusteringApache Sparkjavadoc
StreamingLinearAlgorithm StreamingLinearAlgorithm implements methods for continuously training a generalized linear model on streaming data, and using it for prediction on (possibly different) streaming data.Classorg.apache.spark.mllib.regressionApache Sparkjavadoc
StreamingLinearRegressionWithSGDTrain or predict a linear regression model on streaming data.Classorg.apache.spark.mllib.regressionApache Sparkjavadoc
StreamingListenerInterfaceorg.apache.spark.streaming.schedulerApache Sparkjavadoc
StreamingListenerBatchCompletedClassorg.apache.spark.streaming.schedulerApache Sparkjavadoc

StreamingListenerBatchStartedClassorg.apache.spark.streaming.schedulerApache Sparkjavadoc
StreamingListenerBatchSubmittedClassorg.apache.spark.streaming.schedulerApache Sparkjavadoc
StreamingListenerOutputOperationCompletedClassorg.apache.spark.streaming.schedulerApache Sparkjavadoc
StreamingListenerOutputOperationStartedClassorg.apache.spark.streaming.schedulerApache Sparkjavadoc
StreamingListenerReceiverErrorClassorg.apache.spark.streaming.schedulerApache Sparkjavadoc
StreamingListenerReceiverStartedClassorg.apache.spark.streaming.schedulerApache Sparkjavadoc
StreamingListenerReceiverStoppedClassorg.apache.spark.streaming.schedulerApache Sparkjavadoc
StreamingLogisticRegressionWithSGDTrain or predict a logistic regression model on streaming data.Classorg.apache.spark.mllib.classificationApache Sparkjavadoc
StreamingTestClassorg.apache.spark.mllib.stat.testApache Sparkjavadoc
StreamInputInfo Track the information of input stream at specified batch time.Classorg.apache.spark.streaming.schedulerApache Sparkjavadoc
StringArrayParam Specialized version of Param[Array[String]] for Java.Classorg.apache.spark.ml.paramApache Sparkjavadoc
StringContainsA filter that evaluates to true iff the attribute evaluates to a string that contains the string value.Classorg.apache.spark.sql.sourcesApache Sparkjavadoc
StringEndsWithA filter that evaluates to true iff the attribute evaluates to a string that ends with value.Classorg.apache.spark.sql.sourcesApache Sparkjavadoc
StringIndexer A label indexer that maps a string column of labels to an ML column of label indices.Classorg.apache.spark.ml.featureApache Sparkjavadoc
StringIndexerModel Model fitted by StringIndexer.Classorg.apache.spark.ml.featureApache Sparkjavadoc
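A minimal sketch, assuming a DataFrame `df` with a string `category` column:

```java
import org.apache.spark.ml.feature.StringIndexer;
import org.apache.spark.ml.feature.StringIndexerModel;
import org.apache.spark.sql.DataFrame;

// Assign 0-based indices to the category labels, most frequent label first.
StringIndexerModel indexerModel = new StringIndexer()
    .setInputCol("category")
    .setOutputCol("categoryIndex")
    .fit(df);
DataFrame indexed = indexerModel.transform(df);
```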
StringRRDDAn RDD that stores R objects as Array[String].Classorg.apache.spark.api.rApache Sparkjavadoc
StringStartsWithA filter that evaluates to true iff the attribute evaluates to a string that starts with value.Classorg.apache.spark.sql.sourcesApache Sparkjavadoc
StringType The data type representing String values.Classorg.apache.spark.sql.typesApache Sparkjavadoc
StronglyConnectedComponentsStrongly connected components algorithm implementation.Classorg.apache.spark.graphx.libApache Sparkjavadoc
StructFieldA field inside a StructType.Classorg.apache.spark.sql.typesApache Sparkjavadoc
StructType A StructType object can be constructed by StructType(fields: Seq[StructField])Classorg.apache.spark.sql.typesApache Sparkjavadoc
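A minimal sketch of building a schema programmatically from Java; the field names are placeholders:

```java
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.Metadata;
import org.apache.spark.sql.types.StructField;
import org.apache.spark.sql.types.StructType;

// A two-column schema: a required string and a nullable integer.
StructType schema = new StructType(new StructField[]{
    new StructField("name", DataTypes.StringType, false, Metadata.empty()),
    new StructField("age", DataTypes.IntegerType, true, Metadata.empty())
});
```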
SuccessClassorg.apache.sparkApache Sparkjavadoc
SVDPlusPlusClassorg.apache.spark.graphx.libApache Sparkjavadoc
SVMDataGenerator Generate sample data used for SVM.Classorg.apache.spark.mllib.utilApache Sparkjavadoc
SVMModelModel for Support Vector Machines (SVMs).Classorg.apache.spark.mllib.classificationApache Sparkjavadoc

SVMWithSGDTrain a Support Vector Machine (SVM) using Stochastic Gradient Descent.Classorg.apache.spark.mllib.classificationApache Sparkjavadoc
TaskCommitDenied Task requested the driver to commit, but was denied.Classorg.apache.sparkApache Sparkjavadoc
TaskCompletionListener Listener providing a callback function to invoke when a task's execution completes.Interfaceorg.apache.spark.utilApache Sparkjavadoc
TaskContextContextual information about a task which can be read or mutated during execution.Classorg.apache.sparkApache Sparkjavadoc
TaskDataClassorg.apache.spark.status.api.v1Apache Sparkjavadoc
TaskFailedReason Various possible reasons why a task failed.Interfaceorg.apache.sparkApache Sparkjavadoc
TaskInfo Information about a running task attempt inside a TaskSet.Classorg.apache.spark.schedulerApache Sparkjavadoc
TaskKilled Task was killed intentionally and needs to be rescheduled.Classorg.apache.sparkApache Sparkjavadoc
TaskKilledException Exception thrown when a task is explicitly killed (i.e., task failure, with no retry).Classorg.apache.sparkApache Sparkjavadoc
TaskLocalityClassorg.apache.spark.schedulerApache Sparkjavadoc
TaskMetricDistributionsClassorg.apache.spark.status.api.v1Apache Sparkjavadoc
TaskMetricsClassorg.apache.spark.status.api.v1Apache Sparkjavadoc
TaskResultBlockIdClassorg.apache.spark.storageApache Sparkjavadoc
TaskResultLost The task finished successfully, but the result was lost from the executor's block manager before it was fetched.See Also:Serialized FormClassorg.apache.sparkApache Sparkjavadoc
TaskSortingClassorg.apache.spark.status.api.v1Apache Sparkjavadoc
TestResultTrait for hypothesis test results.Interfaceorg.apache.spark.mllib.stat.testApache Sparkjavadoc
TimeThis is a simple class that represents an absolute instant of time.Classorg.apache.spark.streamingApache Sparkjavadoc
TimestampType The data type representing java.sql.Timestamp values.Classorg.apache.spark.sql.typesApache Sparkjavadoc
TimeTrackingOutputStreamIntercepts write calls and tracks total time spent writing in order to update shuffle write metrics.Classorg.apache.spark.storageApache Sparkjavadoc
Tokenizer A tokenizer that converts the input string to lowercase and then splits it by white spaces.Classorg.apache.spark.ml.featureApache Sparkjavadoc
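A minimal sketch, assuming a DataFrame `sentences` with a string `text` column:

```java
import org.apache.spark.ml.feature.Tokenizer;
import org.apache.spark.sql.DataFrame;

// Lowercase each sentence and split it on whitespace into a "words" array column.
Tokenizer tokenizer = new Tokenizer()
    .setInputCol("text")
    .setOutputCol("words");
DataFrame tokenized = tokenizer.transform(sentences);
```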
TorrentBroadcastFactoryA Broadcast implementation that uses a BitTorrent-like protocol to do a distributed transfer of the broadcasted data to the executors.Classorg.apache.spark.broadcastApache Sparkjavadoc
TrainValidationSplit Validation for hyper-parameter tuning.Classorg.apache.spark.ml.tuningApache Sparkjavadoc
TrainValidationSplitModel Model from train validation split.Classorg.apache.spark.ml.tuningApache Sparkjavadoc
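A minimal sketch of hyper-parameter tuning with a single train/validation split; the `estimator`, the `paramGrid` array, and the `training` DataFrame are hypothetical:

```java
import org.apache.spark.ml.evaluation.RegressionEvaluator;
import org.apache.spark.ml.tuning.TrainValidationSplit;
import org.apache.spark.ml.tuning.TrainValidationSplitModel;

// Use 80% of the data for training and 20% for validating each candidate.
TrainValidationSplit tvs = new TrainValidationSplit()
    .setEstimator(estimator)          // hypothetical Estimator, e.g. a LinearRegression
    .setEvaluator(new RegressionEvaluator())
    .setEstimatorParamMaps(paramGrid) // hypothetical ParamMap[] built with ParamGridBuilder
    .setTrainRatio(0.8);
TrainValidationSplitModel model = tvs.fit(training);
```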
Transformer Abstract class for transformers that transform one dataset into another.Classorg.apache.spark.mlApache Sparkjavadoc
TriangleCountCompute the number of triangles passing through each vertex.Classorg.apache.spark.graphx.libApache Sparkjavadoc

TripletFieldsRepresents a subset of the fields of an EdgeTriplet or EdgeContext.Classorg.apache.spark.graphxApache Sparkjavadoc
TwitterUtilsClassorg.apache.spark.streaming.twitterApache Sparkjavadoc
TypedColumnA Column where an Encoder has been given for the expected input and return type.Classorg.apache.spark.sqlApache Sparkjavadoc
UDF10A Spark SQL UDF that has 10 arguments.Interfaceorg.apache.spark.sql.api.javaApache Sparkjavadoc
UDF11A Spark SQL UDF that has 11 arguments.Interfaceorg.apache.spark.sql.api.javaApache Sparkjavadoc
UDF12A Spark SQL UDF that has 12 arguments.Interfaceorg.apache.spark.sql.api.javaApache Sparkjavadoc
UDF13A Spark SQL UDF that has 13 arguments.Interfaceorg.apache.spark.sql.api.javaApache Sparkjavadoc
UDF14A Spark SQL UDF that has 14 arguments.Interfaceorg.apache.spark.sql.api.javaApache Sparkjavadoc
UDF15A Spark SQL UDF that has 15 arguments.Interfaceorg.apache.spark.sql.api.javaApache Sparkjavadoc
UDF16A Spark SQL UDF that has 16 arguments.Interfaceorg.apache.spark.sql.api.javaApache Sparkjavadoc
UDF17A Spark SQL UDF that has 17 arguments.Interfaceorg.apache.spark.sql.api.javaApache Sparkjavadoc
UDF18A Spark SQL UDF that has 18 arguments.Interfaceorg.apache.spark.sql.api.javaApache Sparkjavadoc
UDF19A Spark SQL UDF that has 19 arguments.Interfaceorg.apache.spark.sql.api.javaApache Sparkjavadoc
UDF20A Spark SQL UDF that has 20 arguments.Interfaceorg.apache.spark.sql.api.javaApache Sparkjavadoc
UDF21A Spark SQL UDF that has 21 arguments.Interfaceorg.apache.spark.sql.api.javaApache Sparkjavadoc
UDF22A Spark SQL UDF that has 22 arguments.Interfaceorg.apache.spark.sql.api.javaApache Sparkjavadoc
UDF3A Spark SQL UDF that has 3 arguments.Interfaceorg.apache.spark.sql.api.javaApache Sparkjavadoc
UDF4A Spark SQL UDF that has 4 arguments.Interfaceorg.apache.spark.sql.api.javaApache Sparkjavadoc
UDF5A Spark SQL UDF that has 5 arguments.Interfaceorg.apache.spark.sql.api.javaApache Sparkjavadoc
UDF6A Spark SQL UDF that has 6 arguments.Interfaceorg.apache.spark.sql.api.javaApache Sparkjavadoc
UDF7A Spark SQL UDF that has 7 arguments.Interfaceorg.apache.spark.sql.api.javaApache Sparkjavadoc
UDF8A Spark SQL UDF that has 8 arguments.Interfaceorg.apache.spark.sql.api.javaApache Sparkjavadoc
UDF9A Spark SQL UDF that has 9 arguments.Interfaceorg.apache.spark.sql.api.javaApache Sparkjavadoc
UDFRegistrationFunctions for registering user-defined functions.Classorg.apache.spark.sqlApache Sparkjavadoc
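A minimal sketch of registering a Java UDF for use in SQL; the SQLContext `sqlContext` and the registered table `people` are hypothetical:

```java
import org.apache.spark.sql.api.java.UDF1;
import org.apache.spark.sql.types.DataTypes;

// Register a one-argument UDF that returns the length of a string.
sqlContext.udf().register("strLen", new UDF1<String, Integer>() {
  @Override
  public Integer call(String s) {
    return s == null ? 0 : s.length();
  }
}, DataTypes.IntegerType);

sqlContext.sql("SELECT strLen(name) FROM people").show();
```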
UnaryTransformer Abstract class for transformers that take one input column, apply transformation, and output the result as a new column.Classorg.apache.spark.mlApache Sparkjavadoc

UniformGenerator Generates i.i.d. samples from the uniform distribution on [0.0, 1.0].Classorg.apache.spark.mllib.randomApache Sparkjavadoc
UnionRDDClassorg.apache.spark.rddApache Sparkjavadoc
UnknownReason We don't know why the task ended -- for example, because of a ClassNotFound exception when deserializing the task result.Classorg.apache.sparkApache Sparkjavadoc
UnresolvedAttribute An unresolved attribute.Classorg.apache.spark.ml.attributeApache Sparkjavadoc
Updater Class used to perform steps (weight update) using Gradient Descent methods.Classorg.apache.spark.mllib.optimizationApache Sparkjavadoc
UserDefinedAggregateFunction The base class for implementing user-defined aggregate functions (UDAF).Classorg.apache.spark.sql.expressionsApache Sparkjavadoc
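A minimal sketch of a user-defined aggregate function in Java; the LongSum class name is hypothetical. Once compiled, it could be registered with something like sqlContext.udf().register("longSum", new LongSum()) and used from SQL or DataFrame aggregations.

```java
import org.apache.spark.sql.Row;
import org.apache.spark.sql.expressions.MutableAggregationBuffer;
import org.apache.spark.sql.expressions.UserDefinedAggregateFunction;
import org.apache.spark.sql.types.DataType;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.StructType;

// Hypothetical UDAF that sums a long column.
public class LongSum extends UserDefinedAggregateFunction {
  @Override public StructType inputSchema() {
    return new StructType().add("value", DataTypes.LongType);
  }
  @Override public StructType bufferSchema() {
    return new StructType().add("sum", DataTypes.LongType);
  }
  @Override public DataType dataType() { return DataTypes.LongType; }
  @Override public boolean deterministic() { return true; }
  @Override public void initialize(MutableAggregationBuffer buffer) {
    buffer.update(0, 0L);
  }
  @Override public void update(MutableAggregationBuffer buffer, Row input) {
    if (!input.isNullAt(0)) {
      buffer.update(0, buffer.getLong(0) + input.getLong(0));
    }
  }
  @Override public void merge(MutableAggregationBuffer buffer1, Row buffer2) {
    buffer1.update(0, buffer1.getLong(0) + buffer2.getLong(0));
  }
  @Override public Object evaluate(Row buffer) {
    return buffer.getLong(0);
  }
}
```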
UserDefinedFunctionA user-defined function.Classorg.apache.spark.sqlApache Sparkjavadoc
UserDefinedType The data type for User Defined Types (UDTs).Classorg.apache.spark.sql.typesApache Sparkjavadoc
Variance Class for calculating variance during regressionSee Also:Serialized FormClassorg.apache.spark.mllib.tree.impurityApache Sparkjavadoc
VectorRepresents a numeric vector, whose index type is Int and value type is Double.Interfaceorg.apache.spark.mllib.linalgApache Sparkjavadoc
VectorClassorg.apache.spark.utilApache Sparkjavadoc
VectorAssembler A feature transformer that merges multiple columns into a vector column.Classorg.apache.spark.ml.featureApache Sparkjavadoc
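A minimal sketch, assuming a DataFrame `df` with numeric `age` and `income` columns:

```java
import org.apache.spark.ml.feature.VectorAssembler;
import org.apache.spark.sql.DataFrame;

// Combine the listed numeric columns into a single "features" vector column.
VectorAssembler assembler = new VectorAssembler()
    .setInputCols(new String[]{"age", "income"})
    .setOutputCol("features");
DataFrame assembled = assembler.transform(df);
```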
VectorAttributeRewriterUtility transformer that rewrites Vector attribute names via prefix replacement.Classorg.apache.spark.ml.featureApache Sparkjavadoc
VectorIndexer Class for indexing categorical feature columns in a dataset of Vector.Classorg.apache.spark.ml.featureApache Sparkjavadoc
VectorIndexerModel Transform categorical features to use 0-based indices instead of their original values.Classorg.apache.spark.ml.featureApache Sparkjavadoc
VectorsClassorg.apache.spark.mllib.linalgApache Sparkjavadoc
VectorSlicer This class takes a feature vector and outputs a new feature vector with a subarray of the original features. The subset of features can be specified with either indices (setIndices()) or names (setNames()).Classorg.apache.spark.ml.featureApache Sparkjavadoc
VectorTransformerInterfaceorg.apache.spark.mllib.featureApache Sparkjavadoc
VectorUDT:: AlphaComponent :: User-defined type for Vector which allows easy interaction with SQLClassorg.apache.spark.mllib.linalgApache Sparkjavadoc
VertexRDD Extends RDD[(VertexId, VD)] by ensuring that there is only one entry for each vertex and by pre-indexing the entries for fast, efficient joins.Classorg.apache.spark.graphxApache Sparkjavadoc
VertexRDDImplClassorg.apache.spark.graphx.implApache Sparkjavadoc
VocabWordClassorg.apache.spark.mllib.featureApache Sparkjavadoc
VoidFunction2A two-argument function that takes arguments of type T1 and T2 with no return value.Interfaceorg.apache.spark.api.java.functionApache Sparkjavadoc
WeibullGenerator Generates i.i.d. samples from the Weibull distribution with the given shape and scale parameters.Classorg.apache.spark.mllib.randomApache Sparkjavadoc
Window Utility functions for defining window in DataFrames.Classorg.apache.spark.sql.expressionsApache Sparkjavadoc

WindowSpec A window specification that defines the partitioning, ordering, and frame boundaries.Classorg.apache.spark.sql.expressionsApache Sparkjavadoc
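A minimal sketch of defining a window and ranking rows within it; the DataFrame `df` with `dept` and `salary` columns is hypothetical:

```java
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.expressions.Window;
import org.apache.spark.sql.expressions.WindowSpec;
import org.apache.spark.sql.functions;

// Rank rows by salary within each department.
WindowSpec w = Window.partitionBy("dept").orderBy(df.col("salary").desc());
DataFrame ranked = df.withColumn("rank", functions.rank().over(w));
```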
Word2Vec Word2Vec trains a model of Map(String, Vector), i.e. it transforms a word into a code for further natural language processing or machine learning.Classorg.apache.spark.ml.featureApache Sparkjavadoc
Word2VecClassorg.apache.spark.mllib.featureApache Sparkjavadoc
Word2VecModel Model fitted by Word2Vec.Classorg.apache.spark.ml.featureApache Sparkjavadoc
Word2VecModelClassorg.apache.spark.mllib.featureApache Sparkjavadoc
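A minimal sketch of the ML-pipeline Word2Vec; the DataFrame `tokenized` with an array-of-strings `words` column is hypothetical:

```java
import org.apache.spark.ml.feature.Word2Vec;
import org.apache.spark.ml.feature.Word2VecModel;
import org.apache.spark.sql.DataFrame;

// Learn 50-dimensional word vectors and average them per document.
Word2Vec word2Vec = new Word2Vec()
    .setInputCol("words")
    .setOutputCol("docVector")
    .setVectorSize(50)
    .setMinCount(0);
Word2VecModel model = word2Vec.fit(tokenized);
DataFrame vectors = model.transform(tokenized);
```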
WriteAheadLog This abstract class represents a write ahead log (aka journal) that is used by Spark Streaming to save the received data (by receivers) and associated metadata to a reliable storage, so that they can be recovered after driver failures.Classorg.apache.spark.streaming.utilApache Sparkjavadoc
WriteAheadLogRecordHandle This abstract class represents a handle that refers to a record written in a WriteAheadLog. It must contain all the information necessary for the record to be read and returned by an implementation of the WriteAheadLog class.Classorg.apache.spark.streaming.utilApache Sparkjavadoc
ZeroMQUtilsClassorg.apache.spark.streaming.zeromqApache Sparkjavadoc
