Spline producer url unreachable from spark #1355
Replies: 7 comments 9 replies
-
Try following the steps from this guide: #1225
-
In your config you use port
-
Yeah, I used the commands with both configs. With 9090: With 8080: In both scenarios I get the same error.
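For reference, the two variants tried presumably looked like this (reconstructed from the full command quoted later in this thread; only the port in the producer URL differs):

```shell
# Variant with port 9090 (as quoted later in the thread)
spark-submit \
  --packages za.co.absa.spline.agent.spark:spark-2.4-spline-agent-bundle_2.12:0.5.2 \
  --conf "spark.sql.queryExecutionListeners=za.co.absa.spline.harvester.listener.SplineQueryExecutionListener" \
  --conf "spark.spline.producer.url=http://localhost:9090/producer" \
  AB_2.py

# Variant with port 8080: identical, except for the port in the producer URL:
#   --conf "spark.spline.producer.url=http://localhost:8080/producer"
```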
-
In the log I see output from
-
Yeah, I tried the following command:
So the config as of now is: But the error still persists!
-
If everything else works, it must be some networking issue. Are you sure the server is reachable? Is Spark running locally on the same machine as the server? If Spark is somewhere else, localhost will not work. Also, if the server or agent is running in a Docker container, the container networking may block the connection.
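The reachability question above can be answered with a plain TCP connect before involving Spark at all. A minimal sketch (the function name is illustrative; the host and port match the producer URL used in this thread):

```python
import socket

def is_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        # create_connection handles DNS resolution and the connect() call
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers connection refused, timeouts, and unresolvable hosts
        return False

# Example: check the Spline server port used in this thread
print(is_reachable("localhost", 9090))
```

If this prints False on the machine (or inside the container) where the Spark driver runs, the problem is networking, not the Spline agent configuration.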
-
I used the TL;DR configuration and packages given in https://absaoss.github.io/spline/: za.co.absa.spline.agent.spark:spark-2.4-spline-agent-bundle_2.12:0.5.2, with 'spark.spline.producer.url' set to 'http://localhost:9090/producer'.
But I get this error in my spark job:
Configurations:
Spark -> 2.4.2
Scala -> 2.12
Commands used:
spark-submit --packages za.co.absa.spline.agent.spark:spark-2.4-spline-agent-bundle_2.12:0.5.2 --conf "spark.sql.queryExecutionListeners=za.co.absa.spline.harvester.listener.SplineQueryExecutionListener" --conf "spark.spline.producer.url=http://localhost:9090/producer" AB_2.py
I have also tried Spark 3.0.0 with compatible artifacts and Java version, but I get the same error.
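One hedged sketch of the fix suggested earlier in the thread: if the Spline server runs in a Docker container while Spark runs on the host (or vice versa), "localhost" refers to a different machine in each. The host names and port mapping below are illustrative assumptions, not taken from this thread; host.docker.internal works on Docker Desktop but not on every setup:

```shell
# If Spark runs inside a container and the Spline server on the host,
# replace localhost in the producer URL with an address the container can reach:
spark-submit \
  --packages za.co.absa.spline.agent.spark:spark-2.4-spline-agent-bundle_2.12:0.5.2 \
  --conf "spark.sql.queryExecutionListeners=za.co.absa.spline.harvester.listener.SplineQueryExecutionListener" \
  --conf "spark.spline.producer.url=http://host.docker.internal:9090/producer" \
  AB_2.py

# If the Spline server runs in a container instead, make sure its port is
# published to the host, e.g. (example mapping, adjust to your image):
#   docker run -p 9090:8080 ...
```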