12 changes: 12 additions & 0 deletions docs/building-spark.md
@@ -190,6 +190,18 @@ or
Java 8 tests are automatically enabled when a Java 8 JDK is detected.
If you have JDK 8 installed but it is not the system default, you can set JAVA_HOME to point to JDK 8 before running the tests.
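For instance, a hypothetical way to do that for a single run (the JDK path below is an assumption, not from the docs):

    export JAVA_HOME=/usr/lib/jvm/java-8-openjdk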

+# Running Docker-based Integration Test Suites
+
+To run only the Docker-based integration tests and nothing else:
+
+    mvn install -DskipTests
+    mvn test -Pdocker-integration-tests -pl :spark-docker-integration-tests_2.11
+
+or
+
+    sbt docker-integration-tests/test
+
+
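For a single suite, sbt's standard `testOnly` task should also work — a sketch, assuming the fully qualified suite name (here one of the suites touched by this PR):

    sbt "docker-integration-tests/testOnly org.apache.spark.sql.jdbc.MySQLIntegrationSuite"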
# Packaging without Hadoop Dependencies for YARN

The assembly directory produced by `mvn package` will, by default, include all of Spark's dependencies, including Hadoop and some of its ecosystem projects. On YARN deployments, this causes multiple versions of these to appear on executor classpaths: the version packaged in the Spark assembly and the version on each node, included with `yarn.application.classpath`. The `hadoop-provided` profile builds the assembly without including Hadoop-ecosystem projects, like ZooKeeper and Hadoop itself.
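For illustration, a build using that profile might be invoked like this (a sketch; combining it with `-Pyarn` is an assumption about a typical YARN build, not text from this PR):

    ./build/mvn -Pyarn -Phadoop-provided -DskipTests clean package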
external/docker-integration-tests/src/test/scala/org/apache/spark/sql/jdbc/MySQLIntegrationSuite.scala
@@ -21,12 +21,9 @@ import java.math.BigDecimal
import java.sql.{Connection, Date, Timestamp}
import java.util.Properties

-import org.scalatest.Ignore
-
import org.apache.spark.tags.DockerTest

@DockerTest
-@Ignore
class MySQLIntegrationSuite extends DockerJDBCIntegrationSuite {
override val db = new DatabaseOnDocker {
override val imageName = "mysql:5.7.9"
external/docker-integration-tests/src/test/scala/org/apache/spark/sql/jdbc/OracleIntegrationSuite.scala
@@ -20,8 +20,6 @@ package org.apache.spark.sql.jdbc
import java.sql.Connection
import java.util.Properties

-import org.scalatest.Ignore
-
import org.apache.spark.sql.test.SharedSQLContext
import org.apache.spark.tags.DockerTest

@@ -46,12 +44,11 @@ import org.apache.spark.tags.DockerTest
* repository.
*/
@DockerTest
-@Ignore
class OracleIntegrationSuite extends DockerJDBCIntegrationSuite with SharedSQLContext {
import testImplicits._

override val db = new DatabaseOnDocker {
-override val imageName = "wnameless/oracle-xe-11g:latest"
+override val imageName = "wnameless/oracle-xe-11g:14.04.4"
[Review comment] srowen (Member): I think this is probably a good idea, to fix the version, but just checking this was intentional?

[Reply] Author (Member): @srowen Yes, the other tests were already using a fixed Docker image version, but the Oracle one, which was introduced recently, was not. Having a fixed version gives us consistency in the backend version and also avoids timeout issues on builds trying to download a new version of the Docker image. Please let me know if you want me to use a different JIRA to make this change more explicit.
override val env = Map(
"ORACLE_ROOT_PASSWORD" -> "oracle"
)
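For context, the pattern being discussed — pinning the image tag inside a suite's `DatabaseOnDocker` definition — looks roughly like this (a sketch; the `jdbcPort` and `getJdbcUrl` members and the JDBC URL format are assumptions about the helper, not lines from this diff):

```scala
// Sketch: a DatabaseOnDocker definition with a pinned image tag.
override val db = new DatabaseOnDocker {
  // Pin an explicit tag rather than :latest for reproducible test backends.
  override val imageName = "wnameless/oracle-xe-11g:14.04.4"
  override val env = Map("ORACLE_ROOT_PASSWORD" -> "oracle")
  // Assumed members of the helper trait; the URL below is illustrative.
  override val jdbcPort: Int = 1521
  override def getJdbcUrl(ip: String, port: Int): String =
    s"jdbc:oracle:thin:system/oracle@//$ip:$port/xe"
}
```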
external/docker-integration-tests/src/test/scala/org/apache/spark/sql/jdbc/PostgresIntegrationSuite.scala
@@ -20,15 +20,12 @@ package org.apache.spark.sql.jdbc
import java.sql.Connection
import java.util.Properties

-import org.scalatest.Ignore
-
import org.apache.spark.sql.Column
import org.apache.spark.sql.catalyst.expressions.Literal
import org.apache.spark.sql.types.{ArrayType, DecimalType}
import org.apache.spark.tags.DockerTest

@DockerTest
-@Ignore
class PostgresIntegrationSuite extends DockerJDBCIntegrationSuite {
override val db = new DatabaseOnDocker {
override val imageName = "postgres:9.4.5"
8 changes: 7 additions & 1 deletion pom.xml
@@ -101,7 +101,6 @@
<module>sql/core</module>
<module>sql/hive</module>
<module>sql/hivecontext-compatibility</module>
-<module>external/docker-integration-tests</module>
<module>assembly</module>
<module>external/flume</module>
<module>external/flume-sink</module>
@@ -2381,6 +2380,13 @@
</build>
</profile>

+<profile>
+  <id>docker-integration-tests</id>
+  <modules>
+    <module>external/docker-integration-tests</module>
+  </modules>
+</profile>

<!-- A series of build profiles where customizations for particular Hadoop releases can be made -->

<!-- Hadoop-a.b.c dependencies can be found at
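With the module moved behind a profile, it no longer builds by default; it has to be activated explicitly, mirroring the docs change above:

    mvn test -Pdocker-integration-tests -pl :spark-docker-integration-tests_2.11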
3 changes: 2 additions & 1 deletion project/SparkBuild.scala
@@ -382,7 +382,8 @@ object SparkBuild extends PomBuild {

enable(Java8TestSettings.settings)(java8Tests)

-enable(DockerIntegrationTests.settings)(dockerIntegrationTests)
+// SPARK-14738 - Remove docker tests from main Spark build
+// enable(DockerIntegrationTests.settings)(dockerIntegrationTests)

/**
* Adds the ability to run the spark shell directly from SBT without building an assembly