Commit a20bfc7

sarutak authored and HyukjinKwon committed
[SPARK-56356][BUILD] Fix an issue in release build caused by error on fetching artifacts
### What changes were proposed in this pull request?

This PR fixes a release build failure that recently occurs at the document generation phase due to an error fetching artifacts.

https://github.com/apache/spark/actions/runs/23083193766/job/67055792458

```
Error: lmcoursier.internal.shaded.coursier.error.FetchError$DownloadingArtifacts: Error fetching artifacts:
Error: file:/home/spark-rm/.m2/repository/io/netty/netty-codec-protobuf/4.2.10.Final/netty-codec-protobuf-4.2.10.Final.jar: not found: /home/spark-rm/.m2/repository/io/netty/netty-codec-protobuf/4.2.10.Final/netty-codec-protobuf-4.2.10.Final.jar
Error: file:/home/spark-rm/.m2/repository/io/netty/netty-codec-marshalling/4.2.10.Final/netty-codec-marshalling-4.2.10.Final.jar: not found: /home/spark-rm/.m2/repository/io/netty/netty-codec-marshalling/4.2.10.Final/netty-codec-marshalling-4.2.10.Final.jar
Error:
Error: 	at lmcoursier.internal.shaded.coursier.Artifacts$.$anonfun$fetchArtifacts$9(Artifacts.scala:365)
Error: 	at lmcoursier.internal.shaded.coursier.util.Task$.$anonfun$flatMap$extension$1(Task.scala:14)
Error: 	at lmcoursier.internal.shaded.coursier.util.Task$.$anonfun$flatMap$extension$1$adapted(Task.scala:14)
Error: 	at lmcoursier.internal.shaded.coursier.util.Task$.wrap(Task.scala:82)
Error: 	at lmcoursier.internal.shaded.coursier.util.Task$.$anonfun$flatMap$2(Task.scala:14)
Error: 	at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307)
Error: 	at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:51)
Error: 	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:74)
Error: 	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
Error: 	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
Error: 	at java.base/java.lang.Thread.run(Thread.java:840)
Error: Caused by: lmcoursier.internal.shaded.coursier.cache.ArtifactError$NotFound: not found: /home/spark-rm/.m2/repository/io/netty/netty-codec-protobuf/4.2.10.Final/netty-codec-protobuf-4.2.10.Final.jar
Error: 	at lmcoursier.internal.shaded.coursier.cache.internal.Downloader.$anonfun$checkFileExists$1(Downloader.scala:603)
Error: 	at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:659)
Error: 	at scala.util.Success.$anonfun$map$1(Try.scala:255)
Error: 	at scala.util.Success.map(Try.scala:213)
Error: 	at scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
Error: 	at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:42)
Error: 	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:74)
Error: 	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
Error: 	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
Error: 	at java.base/java.lang.Thread.run(Thread.java:840)
Error: (streaming-kinesis-asl / update) lmcoursier.internal.shaded.coursier.error.FetchError$DownloadingArtifacts: Error fetching artifacts:
Error: file:/home/spark-rm/.m2/repository/io/netty/netty-codec-protobuf/4.2.10.Final/netty-codec-protobuf-4.2.10.Final.jar: not found: /home/spark-rm/.m2/repository/io/netty/netty-codec-protobuf/4.2.10.Final/netty-codec-protobuf-4.2.10.Final.jar
Error: file:/home/spark-rm/.m2/repository/io/netty/netty-codec-marshalling/4.2.10.Final/netty-codec-marshalling-4.2.10.Final.jar: not found: /home/spark-rm/.m2/repository/io/netty/netty-codec-marshalling/4.2.10.Final/netty-codec-marshalling-4.2.10.Final.jar
Error: Total time: 368 s (0:06:08.0), completed Mar 14, 2026, 8:40:17 AM
------------------------------------------------
Jekyll 4.4.1  Please append `--trace` to the `build` command for any additional information or backtrace.
------------------------------------------------
```

This issue is similar to SPARK-34762 and SPARK-37302 in that, for some dependencies, pom files exist under `.m2` but the corresponding jar files do not.

In this case, the following command is executed through [make-distribution.sh](https://github.com/apache/spark/blob/d9c8eda57e22f65d0443cab7078c632462c11272/dev/make-distribution.sh#L183) and downloads pom files (but not the corresponding jars) for `xz:1.10`, `netty-codec-protobuf`, and `netty-codec-marshalling`:

```
build/mvn clean package -DskipTests -Dmaven.javadoc.skip=true -Dmaven.scaladoc.skip=true -Dmaven.source.skip -Dcyclonedx.skip=true -B -Pyarn -Pkubernetes -Phadoop-3 -Phive -Phive-thriftserver
```

Then, when building the documents, the following command is executed through [build_api_docs.rb](https://github.com/apache/spark/blob/d9c8eda57e22f65d0443cab7078c632462c11272/docs/_plugins/build_api_docs.rb#L48) and tries to download those dependencies:

```
NO_PROVIDED_SPARK_JARS=0 build/sbt -Phive -Pkinesis-asl clean package
```

Regarding `xz`, version `1.12` is declared in `pom.xml`, so this PR fixes `SparkBuild.scala` to pin that version. Regarding `netty-codec-protobuf` and `netty-codec-marshalling`, they are declared as excluded in `pom.xml`, so this PR fixes `SparkBuild.scala` to exclude them as well.

### Why are the changes needed?

To recover the release build.

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

The following commands successfully finish on my laptop:

```
$ build/mvn clean package -DskipTests -Dmaven.javadoc.skip=true -Dmaven.scaladoc.skip=true -Dmaven.source.skip -Dcyclonedx.skip=true -B -Pyarn -Pkubernetes -Phadoop-3 -Phive -Phive-thriftserver
$ SKIP_SCALADOC=1 SKIP_RDOC=1 SKIP_SQLDOC=1 bundle exec jekyll build
```

Note that to build the documents, please follow the instructions in `docs/README.md`.

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #55198 from sarutak/fix-dependency-resolution-issue.

Authored-by: Kousuke Saruta <sarutak@amazon.co.jp>
Signed-off-by: Hyukjin Kwon <gurwls223@apache.org>
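The fix relies on two standard sbt mechanisms: `dependencyOverrides`, which forces a single version for an artifact wherever it appears in the transitive graph, and `excludeDependencies`, which drops an artifact from resolution entirely. The following is a minimal standalone sketch of those two mechanisms in a plain `build.sbt` (the hard-coded `1.12` here is illustrative; the actual Spark build reads the version from the effective Maven POM instead):

```scala
// build.sbt -- minimal sketch, not the actual Spark build definition.

// Whenever org.tukaani:xz is pulled in transitively, sbt resolves it
// to the pinned version instead of whatever the transitive POM asks for.
dependencyOverrides += "org.tukaani" % "xz" % "1.12"

// Artifacts that the Maven POM marks as excluded can be dropped
// globally on the sbt side the same way:
excludeDependencies ++= Seq(
  ExclusionRule("io.netty", "netty-codec-protobuf"),
  ExclusionRule("io.netty", "netty-codec-marshalling")
)
```

Keeping these settings aligned with `pom.xml` means the sbt doc build resolves the same artifact set as the Maven build, so it reuses the jars already present in `.m2` rather than looking for jars Maven never downloaded.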
1 parent 842eb7b commit a20bfc7

2 files changed

Lines changed: 8 additions & 2 deletions


pom.xml

Lines changed: 2 additions & 1 deletion
```diff
@@ -239,6 +239,7 @@
     <leveldbjni.group>org.fusesource.leveldbjni</leveldbjni.group>
     <kubernetes-client.version>7.6.1</kubernetes-client.version>
     <vertx.version>4.5.26</vertx.version>
+    <xz.version>1.12</xz.version>
 
     <test.java.home>${java.home}</test.java.home>
 
@@ -1602,7 +1603,7 @@
       <dependency>
         <groupId>org.tukaani</groupId>
         <artifactId>xz</artifactId>
-        <version>1.12</version>
+        <version>${xz.version}</version>
       </dependency>
       <dependency>
         <groupId>org.apache.zookeeper</groupId>
```

project/SparkBuild.scala

Lines changed: 6 additions & 1 deletion
```diff
@@ -1200,11 +1200,14 @@ object DependencyOverrides {
       SbtPomKeys.effectivePom.value.getProperties.get("avro.version").asInstanceOf[String]
     val slf4jVersion =
       SbtPomKeys.effectivePom.value.getProperties.get("slf4j.version").asInstanceOf[String]
+    val xzVersion =
+      SbtPomKeys.effectivePom.value.getProperties.get("xz.version").asInstanceOf[String]
     Seq(
       "com.google.guava" % "guava" % guavaVersion,
       "jline" % "jline" % jlineVersion,
       "org.apache.avro" % "avro" % avroVersion,
       "org.slf4j" % "slf4j-api" % slf4jVersion,
+      "org.tukaani" % "xz" % xzVersion,
       "org.scala-lang" % "scalap" % scalaVersion.value
     ) ++ jacksonDeps.key.value
   }
@@ -1222,7 +1225,9 @@ object ExcludedDependencies {
       ExclusionRule(organization = "ch.qos.logback"),
       ExclusionRule("org.lz4", "lz4-java"),
       ExclusionRule("org.slf4j", "slf4j-simple"),
-      ExclusionRule("javax.servlet", "javax.servlet-api"))
+      ExclusionRule("javax.servlet", "javax.servlet-api"),
+      ExclusionRule("io.netty", "netty-codec-protobuf"),
+      ExclusionRule("io.netty", "netty-codec-marshalling"))
     )
   }
```
