# org.apache.spark.sql Classes and Interfaces - 123 results found
| Name | Description | Type | Package | Framework |
|------|-------------|------|---------|-----------|
| AggregatedDialect | AggregatedDialect can unify multiple dialects into one virtual Dialect. | Class | org.apache.spark.sql.jdbc | Apache Spark |
| Aggregator | A base class for user-defined aggregations, which can be used in DataFrame and Dataset operations to take all of the elements of a group and reduce them to a single value. | Class | org.apache.spark.sql.expressions | Apache Spark |
| AnalysisException | Thrown when a query fails to analyze, usually because the query itself is invalid. | Class | org.apache.spark.sql | Apache Spark |
| And | A filter that evaluates to true iff both left and right evaluate to true. | Class | org.apache.spark.sql.sources | Apache Spark |
| ArrayType | | Class | org.apache.spark.sql.types | Apache Spark |
| BaseRelation | Represents a collection of tuples with a known schema. | Class | org.apache.spark.sql.sources | Apache Spark |
| BinaryType | The data type representing Array[Byte] values. | Class | org.apache.spark.sql.types | Apache Spark |
| BooleanType | The data type representing Boolean values. | Class | org.apache.spark.sql.types | Apache Spark |
| ByteType | The data type representing Byte values. | Class | org.apache.spark.sql.types | Apache Spark |
| CalendarIntervalType | The data type representing calendar time intervals. | Class | org.apache.spark.sql.types | Apache Spark |
| CatalystScan | An interface for experimenting with a more direct connection to the query planner. | Interface | org.apache.spark.sql.sources | Apache Spark |
| Column | A column that will be computed based on the data in a DataFrame. | Class | org.apache.spark.sql | Apache Spark |
| ColumnName | A convenient class used for constructing schema. | Class | org.apache.spark.sql | Apache Spark |
| CreatableRelationProvider | | Interface | org.apache.spark.sql.sources | Apache Spark |
| DataFrame | A distributed collection of data organized into named columns. | Class | org.apache.spark.sql | Apache Spark |
| DataFrameHolder | A container for a DataFrame, used for implicit conversions. | Class | org.apache.spark.sql | Apache Spark |
| DataFrameNaFunctions | Functionality for working with missing data in DataFrames. | Class | org.apache.spark.sql | Apache Spark |
| DataFrameReader | Interface used to load a DataFrame from external storage systems (e.g. file systems, key-value stores). | Class | org.apache.spark.sql | Apache Spark |
| DataFrameStatFunctions | Statistic functions for DataFrames. | Class | org.apache.spark.sql | Apache Spark |
| DataFrameWriter | Interface used to write a DataFrame to external storage systems (e.g. file systems, key-value stores). | Class | org.apache.spark.sql | Apache Spark |
| Dataset | A Dataset is a strongly typed collection of objects that can be transformed in parallel using functional or relational operations. | Class | org.apache.spark.sql | Apache Spark |
| DatasetHolder | A container for a Dataset, used for implicit conversions. | Class | org.apache.spark.sql | Apache Spark |
| DataSourceRegister | Data sources should implement this trait so that they can register an alias to their data source. | Interface | org.apache.spark.sql.sources | Apache Spark |
| DataType | The base type of all Spark SQL data types. | Class | org.apache.spark.sql.types | Apache Spark |
| DataTypes | To get/create specific data types, users should use the singleton objects and factory methods provided by this class. | Class | org.apache.spark.sql.types | Apache Spark |
| DateType | A date type, supporting "0001-01-01" through "9999-12-31". | Class | org.apache.spark.sql.types | Apache Spark |
| DB2Dialect | | Class | org.apache.spark.sql.jdbc | Apache Spark |
| Decimal | A mutable implementation of BigDecimal that can hold a Long if values are small enough. | Class | org.apache.spark.sql.types | Apache Spark |
| DecimalType | | Class | org.apache.spark.sql.types | Apache Spark |
| DerbyDialect | | Class | org.apache.spark.sql.jdbc | Apache Spark |
| DoubleType | The data type representing Double values. | Class | org.apache.spark.sql.types | Apache Spark |
| Encoder | Used to convert a JVM object of type T to and from the internal Spark SQL representation. | Interface | org.apache.spark.sql | Apache Spark |
| Encoders | Methods for creating an Encoder. | Class | org.apache.spark.sql | Apache Spark |
| EqualNullSafe | Performs equality comparison, similar to EqualTo. | Class | org.apache.spark.sql.sources | Apache Spark |
| EqualTo | A filter that evaluates to true iff the attribute evaluates to a value equal to value. | Class | org.apache.spark.sql.sources | Apache Spark |
| ExecutionListenerManager | Manager for QueryExecutionListener. | Class | org.apache.spark.sql.util | Apache Spark |
| ExperimentalMethods | Holder for experimental methods for the bravest. | Class | org.apache.spark.sql | Apache Spark |
| Filter | A filter predicate for data sources. | Class | org.apache.spark.sql.sources | Apache Spark |
| FloatType | The data type representing Float values. | Class | org.apache.spark.sql.types | Apache Spark |
| functions | | Class | org.apache.spark.sql | Apache Spark |
| GreaterThan | A filter that evaluates to true iff the attribute evaluates to a value greater than value. | Class | org.apache.spark.sql.sources | Apache Spark |
| GreaterThanOrEqual | A filter that evaluates to true iff the attribute evaluates to a value greater than or equal to value. | Class | org.apache.spark.sql.sources | Apache Spark |
| GroupedData | A set of methods for aggregations on a DataFrame, created by DataFrame.groupBy. | Class | org.apache.spark.sql | Apache Spark |
| GroupedDataset | A Dataset that has been logically grouped by a user-specified grouping key. | Class | org.apache.spark.sql | Apache Spark |
| HadoopFsRelation | A BaseRelation that provides much of the common code required for relations that store their data to an HDFS compatible filesystem. | Class | org.apache.spark.sql.sources | Apache Spark |
| HadoopFsRelation.FakeFileStatus | | Class | org.apache.spark.sql.sources.HadoopFsRelation | Apache Spark |
| HadoopFsRelation.FakeFileStatus$ | | Class | org.apache.spark.sql.sources.HadoopFsRelation | Apache Spark |
| HadoopFsRelationProvider | Implemented by objects that produce relations for a specific kind of data source with a given schema and partitioned columns. | Interface | org.apache.spark.sql.sources | Apache Spark |
| HiveContext | An instance of the Spark SQL execution engine that integrates with data stored in Hive. | Class | org.apache.spark.sql.hive | Apache Spark |
| In | A filter that evaluates to true iff the attribute evaluates to one of the values in the array. | Class | org.apache.spark.sql.sources | Apache Spark |
| InsertableRelation | A BaseRelation that can be used to insert data into it through the insert method. | Interface | org.apache.spark.sql.sources | Apache Spark |
| IntegerType | The data type representing Int values. | Class | org.apache.spark.sql.types | Apache Spark |
| IsNotNull | A filter that evaluates to true iff the attribute evaluates to a non-null value. | Class | org.apache.spark.sql.sources | Apache Spark |
| IsNull | A filter that evaluates to true iff the attribute evaluates to null. | Class | org.apache.spark.sql.sources | Apache Spark |
| JdbcDialect | Encapsulates everything (extensions, workarounds, quirks) to handle the SQL dialect of a certain database or jdbc driver. | Class | org.apache.spark.sql.jdbc | Apache Spark |
| JdbcDialects | Registry of dialects that apply to every new jdbc DataFrame. | Class | org.apache.spark.sql.jdbc | Apache Spark |
| JdbcType | A database type definition coupled with the jdbc type needed to send null values to the database. | Class | org.apache.spark.sql.jdbc | Apache Spark |
| LessThan | A filter that evaluates to true iff the attribute evaluates to a value less than value. | Class | org.apache.spark.sql.sources | Apache Spark |
| LessThanOrEqual | A filter that evaluates to true iff the attribute evaluates to a value less than or equal to value. | Class | org.apache.spark.sql.sources | Apache Spark |
| LongType | The data type representing Long values. | Class | org.apache.spark.sql.types | Apache Spark |
| MapType | The data type for Maps. | Class | org.apache.spark.sql.types | Apache Spark |
| Metadata | Metadata is a wrapper over Map[String, Any] that limits the value type to simple ones: Boolean, Long, Double, String, Metadata, Array[Boolean], Array[Long], Array[Double], Array[String], and Array[Metadata]. | Class | org.apache.spark.sql.types | Apache Spark |
| MetadataBuilder | Builder for Metadata. | Class | org.apache.spark.sql.types | Apache Spark |
| MsSqlServerDialect | | Class | org.apache.spark.sql.jdbc | Apache Spark |
| MutableAggregationBuffer | A Row representing a mutable aggregation buffer. | Class | org.apache.spark.sql.expressions | Apache Spark |
| MySQLDialect | | Class | org.apache.spark.sql.jdbc | Apache Spark |
| NoopDialect | NOOP dialect object, always returning the neutral element. | Class | org.apache.spark.sql.jdbc | Apache Spark |
| Not | A filter that evaluates to true iff child is evaluated to false. | Class | org.apache.spark.sql.sources | Apache Spark |
| NullType | The data type representing NULL values. | Class | org.apache.spark.sql.types | Apache Spark |
| NumericType | Numeric data types. | Class | org.apache.spark.sql.types | Apache Spark |
| Or | A filter that evaluates to true iff at least one of left or right evaluates to true. | Class | org.apache.spark.sql.sources | Apache Spark |
| OracleDialect | | Class | org.apache.spark.sql.jdbc | Apache Spark |
| OutputWriter | OutputWriter is used together with HadoopFsRelation for persisting rows to the underlying file system. | Class | org.apache.spark.sql.sources | Apache Spark |
| OutputWriterFactory | A factory that produces OutputWriters. | Class | org.apache.spark.sql.sources | Apache Spark |
| PostgresDialect | | Class | org.apache.spark.sql.jdbc | Apache Spark |
| PrecisionInfo | Precision parameters for a Decimal. | Class | org.apache.spark.sql.types | Apache Spark |
| PrunedFilteredScan | A BaseRelation that can eliminate unneeded columns and filter using selected predicates before producing an RDD containing all matching tuples as Row objects. | Interface | org.apache.spark.sql.sources | Apache Spark |
| PrunedScan | A BaseRelation that can eliminate unneeded columns before producing an RDD containing all of its tuples as Row objects. | Interface | org.apache.spark.sql.sources | Apache Spark |
| QueryExecutionListener | The interface of query execution listener that can be used to analyze execution metrics. | Interface | org.apache.spark.sql.util | Apache Spark |
| RelationProvider | Implemented by objects that produce relations for a specific kind of data source. | Interface | org.apache.spark.sql.sources | Apache Spark |
| Row | Represents one row of output from a relational operator. | Interface | org.apache.spark.sql | Apache Spark |
| RowFactory | A factory class used to construct Row objects. | Class | org.apache.spark.sql | Apache Spark |
| SaveMode | SaveMode is used to specify the expected behavior of saving a DataFrame to a data source. | Class | org.apache.spark.sql | Apache Spark |
| SchemaRelationProvider | Implemented by objects that produce relations for a specific kind of data source with a given schema. | Interface | org.apache.spark.sql.sources | Apache Spark |
| ScriptTransformationWriterThread | | Class | org.apache.spark.sql.hive.execution | Apache Spark |
| ShortType | The data type representing Short values. | Class | org.apache.spark.sql.types | Apache Spark |
| SQLContext | The entry point for working with structured data (rows and columns) in Spark. | Class | org.apache.spark.sql | Apache Spark |
| SQLImplicits | A collection of implicit methods for converting common Scala objects into DataFrames. | Class | org.apache.spark.sql | Apache Spark |
| SQLUserDefinedType | A user-defined type which can be automatically recognized by a SQLContext and registered. | Class | org.apache.spark.sql.types | Apache Spark |
| StringContains | A filter that evaluates to true iff the attribute evaluates to a string that contains the string value. | Class | org.apache.spark.sql.sources | Apache Spark |
| StringEndsWith | A filter that evaluates to true iff the attribute evaluates to a string that ends with value. | Class | org.apache.spark.sql.sources | Apache Spark |
| StringStartsWith | A filter that evaluates to true iff the attribute evaluates to a string that starts with value. | Class | org.apache.spark.sql.sources | Apache Spark |
| StringType | The data type representing String values. | Class | org.apache.spark.sql.types | Apache Spark |
| StructField | A field inside a StructType. | Class | org.apache.spark.sql.types | Apache Spark |
| StructType | A StructType object can be constructed by StructType(fields: Seq[StructField]). | Class | org.apache.spark.sql.types | Apache Spark |
| TimestampType | The data type representing java.sql.Timestamp values. | Class | org.apache.spark.sql.types | Apache Spark |
| TypedColumn | A Column where an Encoder has been given for the expected input and return type. | Class | org.apache.spark.sql | Apache Spark |
| UDF10 | A Spark SQL UDF that has 10 arguments. | Interface | org.apache.spark.sql.api.java | Apache Spark |
| UDF11 | A Spark SQL UDF that has 11 arguments. | Interface | org.apache.spark.sql.api.java | Apache Spark |
| UDF12 | A Spark SQL UDF that has 12 arguments. | Interface | org.apache.spark.sql.api.java | Apache Spark |
| UDF13 | A Spark SQL UDF that has 13 arguments. | Interface | org.apache.spark.sql.api.java | Apache Spark |
| UDF14 | A Spark SQL UDF that has 14 arguments. | Interface | org.apache.spark.sql.api.java | Apache Spark |
| UDF15 | A Spark SQL UDF that has 15 arguments. | Interface | org.apache.spark.sql.api.java | Apache Spark |
| UDF16 | A Spark SQL UDF that has 16 arguments. | Interface | org.apache.spark.sql.api.java | Apache Spark |
| UDF17 | A Spark SQL UDF that has 17 arguments. | Interface | org.apache.spark.sql.api.java | Apache Spark |
| UDF18 | A Spark SQL UDF that has 18 arguments. | Interface | org.apache.spark.sql.api.java | Apache Spark |
| UDF19 | A Spark SQL UDF that has 19 arguments. | Interface | org.apache.spark.sql.api.java | Apache Spark |
| UDF20 | A Spark SQL UDF that has 20 arguments. | Interface | org.apache.spark.sql.api.java | Apache Spark |
| UDF21 | A Spark SQL UDF that has 21 arguments. | Interface | org.apache.spark.sql.api.java | Apache Spark |
| UDF22 | A Spark SQL UDF that has 22 arguments. | Interface | org.apache.spark.sql.api.java | Apache Spark |
| UDF3 | A Spark SQL UDF that has 3 arguments. | Interface | org.apache.spark.sql.api.java | Apache Spark |
| UDF4 | A Spark SQL UDF that has 4 arguments. | Interface | org.apache.spark.sql.api.java | Apache Spark |
| UDF5 | A Spark SQL UDF that has 5 arguments. | Interface | org.apache.spark.sql.api.java | Apache Spark |
| UDF6 | A Spark SQL UDF that has 6 arguments. | Interface | org.apache.spark.sql.api.java | Apache Spark |
| UDF7 | A Spark SQL UDF that has 7 arguments. | Interface | org.apache.spark.sql.api.java | Apache Spark |
| UDF8 | A Spark SQL UDF that has 8 arguments. | Interface | org.apache.spark.sql.api.java | Apache Spark |
| UDF9 | A Spark SQL UDF that has 9 arguments. | Interface | org.apache.spark.sql.api.java | Apache Spark |
| UDFRegistration | Functions for registering user-defined functions. | Class | org.apache.spark.sql | Apache Spark |
| UserDefinedAggregateFunction | The base class for implementing user-defined aggregate functions (UDAF). | Class | org.apache.spark.sql.expressions | Apache Spark |
| UserDefinedFunction | A user-defined function. | Class | org.apache.spark.sql | Apache Spark |
| UserDefinedType | The data type for User Defined Types (UDTs). | Class | org.apache.spark.sql.types | Apache Spark |
| Window | Utility functions for defining windows in DataFrames. | Class | org.apache.spark.sql.expressions | Apache Spark |
| WindowSpec | A window specification that defines the partitioning, ordering, and frame boundaries. | Class | org.apache.spark.sql.expressions | Apache Spark |
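
Many of the classes listed above are meant to be used together: SQLContext is the entry point, DataFrameReader and DataFrameWriter handle loading and saving, Column and functions build expressions, GroupedData drives aggregation, and UserDefinedAggregateFunction with MutableAggregationBuffer supports custom aggregates. The Java sketch below shows one way these pieces can fit against the Spark 1.x API reflected in this listing; it is illustrative only, and the input file orders.json with its category and amount columns is a hypothetical example, not part of the listing.

```java
import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SQLContext;
import org.apache.spark.sql.SaveMode;
import org.apache.spark.sql.expressions.MutableAggregationBuffer;
import org.apache.spark.sql.expressions.UserDefinedAggregateFunction;
import org.apache.spark.sql.types.DataType;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.StructType;

import static org.apache.spark.sql.functions.callUDF;
import static org.apache.spark.sql.functions.col;

// A minimal UDAF (UserDefinedAggregateFunction + MutableAggregationBuffer)
// that sums a double-valued column. Schemas are built with the DataTypes,
// StructType and StructField factories from org.apache.spark.sql.types.
public class SumAmountUdaf extends UserDefinedAggregateFunction {

  @Override
  public StructType inputSchema() {
    return DataTypes.createStructType(Arrays.asList(
        DataTypes.createStructField("value", DataTypes.DoubleType, true)));
  }

  @Override
  public StructType bufferSchema() {
    return DataTypes.createStructType(Arrays.asList(
        DataTypes.createStructField("sum", DataTypes.DoubleType, false)));
  }

  @Override
  public DataType dataType() {
    return DataTypes.DoubleType;
  }

  @Override
  public boolean deterministic() {
    return true;
  }

  @Override
  public void initialize(MutableAggregationBuffer buffer) {
    buffer.update(0, 0.0);                       // start the running sum at zero
  }

  @Override
  public void update(MutableAggregationBuffer buffer, Row input) {
    if (!input.isNullAt(0)) {                    // ignore null inputs
      buffer.update(0, buffer.getDouble(0) + input.getDouble(0));
    }
  }

  @Override
  public void merge(MutableAggregationBuffer buffer1, Row buffer2) {
    buffer1.update(0, buffer1.getDouble(0) + buffer2.getDouble(0));
  }

  @Override
  public Object evaluate(Row buffer) {
    return buffer.getDouble(0);
  }

  public static void main(String[] args) {
    SparkConf conf = new SparkConf().setAppName("udaf-sketch").setMaster("local[*]");
    JavaSparkContext sc = new JavaSparkContext(conf);
    SQLContext sqlContext = new SQLContext(sc);  // SQLContext: the entry point

    // DataFrameReader: load a DataFrame from external storage
    // (orders.json and its columns are hypothetical).
    DataFrame orders = sqlContext.read().json("orders.json");

    // UDFRegistration: register the UDAF under a name usable in expressions.
    sqlContext.udf().register("sum_amount", new SumAmountUdaf());

    // Column + functions build expressions; groupBy returns GroupedData.
    DataFrame totals = orders
        .filter(col("amount").gt(0))
        .groupBy("category")
        .agg(callUDF("sum_amount", col("amount")).alias("total"));

    // DataFrameWriter + SaveMode: persist the result.
    totals.write().mode(SaveMode.Overwrite).parquet("totals.parquet");

    sc.stop();
  }
}
```

For per-row logic that needs no buffer across rows, a plain UDF implemented against one of the UDF1-UDF22 interfaces in org.apache.spark.sql.api.java and registered through UDFRegistration is usually simpler; the UDAF route shown here is specifically for aggregations that accumulate state.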