TypeError: 'JavaPackage' object is not callable error despite linking jars into spark successfully #9
Comments
@NatMzk The …
from my understanding …
@NatMzk Could you provide the full stack trace of the exception above? Also, try restarting the kernel before loading the model.
I don't have the Databricks Runtime, but when I remove the links created by the script …

For details about the following configuration options, see the official doc: https://spark.apache.org/docs/latest/configuration.html. All of them can be specified in the conf file or on the command line; check the doc for your environment. I'd recommend options 1 and 2.

Take the command line used to launch pyspark as an example:
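A minimal sketch of that approach, assuming hypothetical jar paths; the spark.jars setting shown here is one of the standard options covered in the linked configuration doc:

```python
# Sketch only: the jar paths below are placeholders, not taken from the thread.
# Equivalent command line:
#   pyspark --jars /path/to/pmml4s_2.12.jar,/path/to/pmml4s-spark_2.12.jar
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .config("spark.jars", "/path/to/pmml4s_2.12.jar,/path/to/pmml4s-spark_2.12.jar")
    .getOrCreate()
)
```

Note that spark.jars only takes effect when the session (and its backing JVM) is created, which is one reason restarting the kernel before loading the model matters.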
@NatMzk Did the methods above resolve your issue?
Another relatively simple way for Databricks is to copy the jar files to …
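A hypothetical sketch of that approach; the destination directory in the comment above is cut off, so both paths below are assumptions (/databricks/jars is the directory whose jars end up on the cluster's JVM classpath):

```python
# Hypothetical sketch: both paths are assumptions, since the destination in the
# comment above is cut off. Jars under /databricks/jars are picked up on the
# cluster's JVM classpath, so this is usually run from a cluster init script
# before the JVM starts.
import glob
import shutil

for jar in glob.glob("/dbfs/FileStore/jars/pmml4s*.jar"):
    shutil.copy(jar, "/databricks/jars/")
```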
I have run the link_pmml4s_jars_into_spark.py script successfully, and the pmml4s jar files are present in the SPARK_HOME location. However, TypeError: 'JavaPackage' object is not callable still occurs. I am running Java version 1.8.0_302 and Spark version 3.2.1.

I would kindly appreciate any suggestion about what is missing.
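For reference, a minimal sketch of the call where this error typically surfaces, assuming the pypmml-spark ScoreModel API; the PMML file path is hypothetical, and the reporter's actual code is not shown above:

```python
# A minimal sketch, assuming the pypmml-spark ScoreModel API;
# "model.pmml" is a hypothetical path, not taken from the issue.
from pypmml_spark import ScoreModel

# TypeError: 'JavaPackage' object is not callable is typically raised here
# when the pmml4s jars are not on the classpath of the JVM backing the session.
model = ScoreModel.fromFile("model.pmml")
```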