Add integration tests that run against a real Redshift cluster #41
Changes from 16 commits
`.travis.yml`:

```diff
@@ -8,10 +8,25 @@ cache:
   directories:
     - $HOME/.ivy2
 env:
-  - HADOOP_VERSION="1.0.4"
-  - HADOOP_VERSION="1.2.1"
-  - HADOOP_VERSION="2.2.0"
+  matrix:
+    - HADOOP_VERSION="1.0.4"
+    - HADOOP_VERSION="1.2.1"
+    - HADOOP_VERSION="2.2.0"
+  global:
+    # AWS_REDSHIFT_JDBC_URL
+    - secure: "RNkxdKcaKEYuJqxli8naazp42qO5/pgueIzs+J5rHwl39jcBvJMgW3DX8kT7duzdoBb/qrolj/ttbQ3l/30P45+djn0BEwcJMX7G/FGpZYD23yd03qeq7sOKPQl2Ni/OBttYHJMah5rI6aPmAysBZMQO7Wijdenb/RUiU2YcZp0="
+    # AWS_REDSHIFT_PASSWORD
+    - secure: "Bzre/ohanBt6wrj5asn8+iaIU5qm2QBZ+P/PiAeg55R5sqfyI/pwCYZKdtKSG7SuKzsoiAOtnjvcXMD2hickTLIDz3GmrvFcpx7yn3PEKoLQfT4Ry1/RMOsqa1+sj6zJ7J2dl4w0AURJ7Jb9/7GRylNnL0jkUvqUnWet8PBb7R8="
+    # AWS_REDSHIFT_USER
+    - secure: "LIkY/ZpBXK3vSFsdpBSRXEsgfD2wDF52X8OZOlyBJOiZpS4y1/obj8b3VQABDPyPH95bGX/LOpM0vVM137rYgF0pskgVEzLMyZOPpwYqNGPf/d4BtQhBRc8f7+jmr6D4Hrox4jCl0cCKaeiTazun2+Y9E+zgCUDvQ8y9qGctR2k="
+    # AWS_ACCESS_KEY_ID
+    - secure: "CDlql+nrgdi7sUr7bYyXF4CFoOUCiJG9WEYNRV4k/lC37eS/al3iVYicnXqF+6UrPv5a4kHulG4d3g78J4hzn4ZVJuEhn6v8beoOBUoJJ7W/J05hVwGiQFxUq86wT3tIaBrAuDmOXaAnPEvDmPfJGNZL9ZG1CaQJo70R/HkbbVA="
+    # AWS_SECRET_ACCESS_KEY
+    - secure: "V/Ac0ZkTslNpNc8wszalFqZYWnl910PgSORlA2tyTUCC/xfqX+CdtN9RNuVb3LBrvrkYiOBKF7ANMGOxnc/yazLNFBUmByf+rwEfR7NDCCz+SKXSNwIOPpDraOpNVd1KLyrJ9uKivFojW/IweN9bsJAEji8ql/Lpeb7qKfDbVWY="
+    # AWS_S3_SCRATCH_SPACE
+    - secure: "LvndQIW6dHs6nyaMHtblGI/oL+s460lOezFs2BoD0Isenb/O/IM+nY5K9HepTXjJIcq8qvUYnojZX1FCrxxOXX2/+/Iihiq7GzJYdmdMC6hLg9bJYeAFk0dWYT88/AwadrJCBOa3ockRLhiO3dkai7Ki5+M1erfaFiAHHMpJxYQ="
 script:
   - sbt -Dhadoop.version=$HADOOP_VERSION coverage test
```
Contributor: Nit: is there a reason to not test style and other fast stuff first?

Author: I wanted the tests to still be able to run even if style checks failed. I figured this wasn't a huge deal here compared to what we do in Spark because the test time is really fast.
```diff
+  - if [ "$TRAVIS_SECURE_ENV_VARS" ]; then sbt -Dhadoop.version=$HADOOP_VERSION coverage it:test; fi
 after_success:
   - bash <(curl -s https://codecov.io/bash)
```
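For context, here is a minimal sketch of how an integration check might consume the encrypted settings declared above. The object name and the exact checks are hypothetical, not taken from this PR, which only shows the CI wiring.

```scala
// Hypothetical smoke check (not from this PR): read the secure Travis variables and
// verify that a Redshift connection can be opened. The Redshift JDBC driver declared
// in the build must be on the classpath (some driver versions may also need explicit
// registration before DriverManager can resolve the URL).
import java.sql.DriverManager

object RedshiftConnectionCheck {
  def main(args: Array[String]): Unit = {
    val required = Seq("AWS_REDSHIFT_JDBC_URL", "AWS_REDSHIFT_USER", "AWS_REDSHIFT_PASSWORD")
    val values = required.flatMap(sys.env.get)
    if (values.size < required.size) {
      // Mirrors the CI behaviour: integration tests only run when the secure variables exist.
      println("Redshift credentials not available; skipping connection check.")
    } else {
      val Seq(url, user, password) = values
      val conn = DriverManager.getConnection(url, user, password)
      try {
        val rs = conn.createStatement().executeQuery("SELECT 1")
        rs.next()
        println(s"Connected to Redshift; SELECT 1 returned ${rs.getInt(1)}")
      } finally {
        conn.close()
      }
    }
  }
}
```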
This file was deleted.
New file (diff `@@ -0,0 +1,66 @@`), the sbt build definition:

```scala
/*
 * Copyright 2015 Databricks
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

import sbt._
import sbt.Keys._
import sbtsparkpackage.SparkPackagePlugin.autoImport._
import scoverage.ScoverageSbtPlugin

object SparkRedshiftBuild extends Build {
  val hadoopVersion = settingKey[String]("Hadoop version")

  // Define a custom test configuration so that unit test helper classes can be re-used under
  // the integration tests configuration; see http://stackoverflow.com/a/20635808.
  lazy val IntegrationTest = config("it") extend Test

  lazy val root = Project("spark-redshift", file("."))
    .configs(IntegrationTest)
    .settings(net.virtualvoid.sbt.graph.Plugin.graphSettings: _*)
    .settings(Defaults.itSettings: _*)
    .settings(Seq(
      name := "spark-redshift",
      organization := "com.databricks",
      version := "0.4.1-SNAPSHOT",
      scalaVersion := "2.10.4",
```
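To illustrate what the custom `it` configuration defined above buys (helpers compiled for the regular `test` configuration stay visible to integration suites), here is a rough sketch. The file locations assume the default `src/it/scala` layout that `Defaults.itSettings` provides, and all class and package names are made up rather than taken from this PR.

```scala
// src/test/scala — hypothetical helper shared by unit and integration tests.
package com.databricks.spark.redshift

object IntegrationEnv {
  /** Fetches a required environment variable, failing with a descriptive error. */
  def requireEnv(name: String): String =
    sys.env.getOrElse(name, sys.error(s"$name must be set to run the integration tests"))
}
```

```scala
// src/it/scala — hypothetical integration suite; it can reference IntegrationEnv because
// `config("it") extend Test` puts the test classes on the integration classpath.
package com.databricks.spark.redshift

import org.scalatest.FunSuite

class RedshiftEnvSuite extends FunSuite {
  test("JDBC URL is configured") {
    assert(IntegrationEnv.requireEnv("AWS_REDSHIFT_JDBC_URL").nonEmpty)
  }
}
```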
Contributor (on the `scalaVersion` setting): Doesn't have to be here, but in a follow-up can we make sure to cross-publish 2.11?

Author: Yep, I plan to add Scala versions to the matrix build configuration.
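A rough sketch of what that follow-up could look like, expressed as plain `build.sbt`-style settings; the 2.11 version number is illustrative and none of this is part of the PR. The new build file continues below.

```scala
// Hypothetical cross-building settings: with crossScalaVersions declared,
// `sbt +test` and `sbt +publish` run against each listed Scala version.
scalaVersion := "2.10.4"

crossScalaVersions := Seq("2.10.4", "2.11.7")
```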
```scala
      sparkVersion := sys.props.get("spark.version").getOrElse("1.4.1"),
      hadoopVersion := sys.props.get("hadoop.version").getOrElse("2.2.0"),
      spName := "databricks/spark-redshift",
      sparkComponents += "sql",
      licenses += "Apache-2.0" -> url("http://opensource.org/licenses/Apache-2.0"),
      credentials += Credentials(Path.userHome / ".ivy2" / ".credentials"),
      resolvers +=
        "Sonatype OSS Snapshots" at "https://oss.sonatype.org/content/repositories/snapshots",
      libraryDependencies ++= Seq(
        "com.amazonaws" % "aws-java-sdk-core" % "1.9.40" % "provided",
        // We require spark-avro, but avro-mapred must be provided to match Hadoop version:
        "com.databricks" %% "spark-avro" % "1.0.0",
```
Contributor (on the `spark-avro` dependency): Another comment so I don't forget: we should update this when we publish a new version of avro (and I guess this creates a cross-publishing ordering dependency).

Author: Yes. The fact that …
```scala
        "org.apache.avro" % "avro-mapred" % "1.7.6" % "provided" exclude("org.mortbay.jetty", "servlet-api"),
        // A Redshift-compatible JDBC driver must be present on the classpath for spark-redshift to work.
        // For testing, we use an Amazon driver, which is available from
        // http://docs.aws.amazon.com/redshift/latest/mgmt/configure-jdbc-connection.html
        "com.amazon.redshift" % "jdbc4" % "1.1.7.1007" % "test" from "https://s3.amazonaws.com/redshift-downloads/drivers/RedshiftJDBC4-1.1.7.1007.jar",
        "com.google.guava" % "guava" % "14.0.1" % "test",
        "org.scalatest" %% "scalatest" % "2.1.5" % "test",
        "org.scalamock" %% "scalamock-scalatest-support" % "3.2" % "test"
      ),
      ScoverageSbtPlugin.ScoverageKeys.coverageHighlighting := {
        if (scalaBinaryVersion.value == "2.10") false
        else false
      },
      // Display full-length stacktraces from ScalaTest:
      testOptions in Test += Tests.Argument("-oF")
    ): _*)
}
```
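Judging from the build definition and the CI script above, the same version properties can presumably be overridden for local runs (for example `sbt -Dspark.version=1.4.1 -Dhadoop.version=2.2.0 test`), and the Redshift-backed suites invoked with `sbt it:test` once the corresponding `AWS_*` environment variables are exported; the PR itself only shows the Travis wiring.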
Author (on the secure environment variable names in `.travis.yml`): I wonder if I should purposely give these different names than the ones expected by the Amazon SDKs; this might be necessary in order to be able to test credential mechanisms (e.g. for writing a regression test for #32).
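To make that concern concrete, here is a hypothetical sketch (variable and object names invented, not from this PR or from #32): if CI keeps the SDK-standard names, the AWS SDK's default credential chain would find the keys on its own, so a regression in explicit credential forwarding could pass unnoticed; renaming the CI variables forces tests to hand credentials over explicitly.

```scala
// Hypothetical helper for a credential-mechanism regression test. The environment
// variable names are deliberately non-standard so that the AWS SDK's default
// provider chain cannot discover them; the test must pass them explicitly.
import com.amazonaws.auth.BasicAWSCredentials

object ExplicitTestCredentials {
  def fromEnv(): BasicAWSCredentials = {
    val accessKey = sys.env.getOrElse("TEST_AWS_ACCESS_KEY_ID",
      sys.error("TEST_AWS_ACCESS_KEY_ID must be set"))
    val secretKey = sys.env.getOrElse("TEST_AWS_SECRET_ACCESS_KEY",
      sys.error("TEST_AWS_SECRET_ACCESS_KEY must be set"))
    new BasicAWSCredentials(accessKey, secretKey)
  }
}
```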