[SQL] Minor fix for doc and comment #3533

Closed · wants to merge 1 commit

docs/sql-programming-guide.md (3 changes: 2 additions & 1 deletion)

@@ -994,7 +994,7 @@ Several caching related features are not supported yet:
 ## Compatibility with Apache Hive
 
 Spark SQL is designed to be compatible with the Hive Metastore, SerDes and UDFs. Currently Spark
-SQL is based on Hive 0.12.0.
+SQL is based on Hive 0.12.0 and 0.13.1.
 
 #### Deploying in Existing Hive Warehouses
 
@@ -1033,6 +1033,7 @@ Spark SQL supports the vast majority of Hive features, such as:
 * Sampling
 * Explain
 * Partitioned tables
+* View
 * All Hive DDL Functions, including:
   * `CREATE TABLE`
   * `CREATE TABLE AS SELECT`
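
The features listed above are exercised through a HiveContext. A minimal sketch of a few of them, including the view support this PR documents (table and view names are illustrative, not from the PR; assumes the Spark 1.2-era API):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object HiveFeaturesSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("HiveFeaturesSketch"))
    val hiveContext = new HiveContext(sc)
    import hiveContext._

    // Hive DDL from the list above: CREATE TABLE and CREATE TABLE AS SELECT.
    sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
    sql("CREATE TABLE src_copy AS SELECT key, value FROM src")

    // Views (newly listed by this PR) and EXPLAIN are supported as well.
    sql("CREATE VIEW IF NOT EXISTS src_view AS SELECT key FROM src")
    sql("EXPLAIN SELECT COUNT(*) FROM src_copy").collect().foreach(println)
  }
}
```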

@@ -28,9 +28,10 @@ object HiveFromSpark {
     val sparkConf = new SparkConf().setAppName("HiveFromSpark")
     val sc = new SparkContext(sparkConf)
 
-    // A local hive context creates an instance of the Hive Metastore in process, storing
-    // the warehouse data in the current directory. This location can be overridden by
-    // specifying a second parameter to the constructor.
+    // A hive context adds support for finding tables in the MetaStore and writing queries
+    // using HiveQL. Users who do not have an existing Hive deployment can still create a
+    // HiveContext. When not configured by the hive-site.xml, the context automatically
+    // creates metastore_db and warehouse in the current directory.
     val hiveContext = new HiveContext(sc)
     import hiveContext._
 
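
The rewritten comment describes the fallback behavior when no hive-site.xml is present. A small sketch of observing that behavior (object and table names are made up for illustration):

```scala
import java.io.File

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object LocalMetastoreSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("LocalMetastoreSketch"))

    // With no hive-site.xml on the classpath, the context falls back to an
    // embedded metastore and a local warehouse, both created in the current
    // working directory, as the updated comment says.
    val hiveContext = new HiveContext(sc)
    hiveContext.sql("CREATE TABLE IF NOT EXISTS records (key INT, value STRING)")

    // After the first DDL statement, ./metastore_db and ./warehouse should exist.
    new File(".").listFiles().map(_.getName)
      .filter(name => name == "metastore_db" || name == "warehouse")
      .foreach(println)
  }
}
```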

@@ -49,7 +49,7 @@ class DefaultSource extends RelationProvider {
       sqlContext: SQLContext,
       parameters: Map[String, String]): BaseRelation = {
     val path =
-      parameters.getOrElse("path", sys.error("'path' must be specifed for parquet tables."))
+      parameters.getOrElse("path", sys.error("'path' must be specified for parquet tables."))
 
     ParquetRelation2(path)(sqlContext)
   }
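
The corrected message is what a caller sees when the required `path` option is omitted. A sketch of how that option normally reaches `createRelation` through the data sources API (table name and file path are placeholders; assumes the Spark 1.2-era `CREATE TEMPORARY TABLE ... USING ... OPTIONS` syntax):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object ParquetPathSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("ParquetPathSketch"))
    val sqlContext = new SQLContext(sc)

    // The OPTIONS map becomes the `parameters` argument of createRelation;
    // omitting `path` here would fail with the corrected error message.
    sqlContext.sql("""
      CREATE TEMPORARY TABLE parquet_events
      USING org.apache.spark.sql.parquet
      OPTIONS (path '/data/events.parquet')
    """)

    sqlContext.sql("SELECT COUNT(*) FROM parquet_events").collect().foreach(println)
  }
}
```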