[SPARK-6337][Documentation, SQL] Spark 1.3 doc fixes
Author: vinodkc <vinod.kc.in@gmail.com>

Closes apache#5112 from vinodkc/spark_1.3_doc_fixes and squashes the following commits:

2c6aee6 [vinodkc] Spark 1.3 doc fixes
vinodkc authored and srowen committed Mar 22, 2015
1 parent 7a0da47 commit 2bf40c5
Showing 3 changed files with 7 additions and 4 deletions.
7 changes: 5 additions & 2 deletions docs/sql-programming-guide.md
@@ -509,8 +509,11 @@ val people = sc.textFile("examples/src/main/resources/people.txt")
 // The schema is encoded in a string
 val schemaString = "name age"
 
-// Import Spark SQL data types and Row.
-import org.apache.spark.sql._
+// Import Row.
+import org.apache.spark.sql.Row;
+
+// Import Spark SQL data types
+import org.apache.spark.sql.types.{StructType,StructField,StringType};
 
 // Generate the schema based on the string of schema
 val schema =
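For context, the corrected imports feed the guide's programmatic-schema example, where the hunk above is truncated at `val schema =`. A minimal sketch of how they are used, following the surrounding guide's Spark 1.3 example (illustrative only: it assumes a running SparkContext `sc` and SQLContext `sqlContext`, so it is not runnable standalone):

```scala
// Sketch of the guide's programmatic-schema flow; assumes `sc` and
// `sqlContext` already exist, as in the surrounding documentation.
import org.apache.spark.sql.Row
import org.apache.spark.sql.types.{StructType, StructField, StringType}

// The schema is encoded in a string.
val schemaString = "name age"

// Generate the schema based on the string of schema: one nullable
// StringType field per space-separated name.
val schema = StructType(
  schemaString.split(" ").map(fieldName =>
    StructField(fieldName, StringType, nullable = true)))

// Convert each line of the text file into a Row matching the schema.
val rowRDD = sc.textFile("examples/src/main/resources/people.txt")
  .map(_.split(","))
  .map(p => Row(p(0), p(1).trim))

// Apply the schema to the RDD to obtain a DataFrame.
val peopleDataFrame = sqlContext.createDataFrame(rowRDD, schema)
```

The point of the doc fix is that `Row` now lives in `org.apache.spark.sql` while the type classes (`StructType`, `StructField`, `StringType`) live in `org.apache.spark.sql.types`, so the single wildcard import no longer covers both.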
2 changes: 1 addition & 1 deletion mllib/src/main/scala/org/apache/spark/ml/Pipeline.scala
@@ -33,7 +33,7 @@ import org.apache.spark.sql.types.StructType
 abstract class PipelineStage extends Serializable with Logging {
 
   /**
-   * :: DeveloperAPI ::
+   * :: DeveloperApi ::
   *
   * Derives the output schema from the input schema and parameters.
   * The schema describes the columns and types of the data.
@@ -89,7 +89,7 @@ private[sql] object DataFrame {
 *   val people = sqlContext.parquetFile("...")
 *   val department = sqlContext.parquetFile("...")
 *
-*   people.filter("age" > 30)
+*   people.filter("age > 30")
 *     .join(department, people("deptId") === department("id"))
 *     .groupBy(department("name"), "gender")
 *     .agg(avg(people("salary")), max(people("age")))
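The fix matters because `filter` accepts either a whole SQL expression string or a `Column` expression; the old docstring mixed the two, comparing the string literal `"age"` against `30`. A hedged sketch of the two valid forms (illustrative only: assumes a `people` DataFrame from a live SQLContext, as in the doc comment above):

```scala
// Sketch; `people` is a DataFrame as in the surrounding scaladoc example.
// String form: the entire condition is one SQL expression string.
val adults1 = people.filter("age > 30")

// Column form: people("age") yields a Column, compared with `>`.
val adults2 = people.filter(people("age") > 30)
```

The broken original, `people.filter("age" > 30)`, would not even compile, since Scala's `String` has no `>` operator taking an `Int`.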
