Traceback (most recent call last):
  File "/home/orestis/Desktop/feast_project/feature_streaming/test_spark.py", line 40, in <module>
    start_spark()
  File "/home/orestis/Desktop/feast_project/feature_streaming/test_spark.py", line 27, in start_spark
    ingestion_config = SparkProcessorConfig(mode="spark", source="kafka", spark_session=spark, processing_time="5 seconds", query_timeout=None)
  File "pydantic/main.py", line 342, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for SparkProcessorConfig
query_timeout
  none is not an allowed value (type=type_error.none.not_allowed)
Specifications
Version: 0.34.1
Platform: Linux
Subsystem: Ubuntu
Possible Solution
Allow None as a valid value for query_timeout.
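A minimal sketch of what that could look like, assuming SparkProcessorConfig is a pydantic model whose query_timeout field is currently declared as a plain int (the model below is illustrative and omits fields such as spark_session):

```python
from typing import Optional

from pydantic import BaseModel


class SparkProcessorConfig(BaseModel):
    """Illustrative stand-in for the real Feast config model."""

    mode: str
    source: str
    processing_time: str
    # Declaring the field as Optional[int] with a None default lets callers
    # pass query_timeout=None (e.g. to mean "no timeout") instead of hitting
    # the type_error.none.not_allowed validation error.
    query_timeout: Optional[int] = None
```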
Expected Behavior
The Spark Kafka Processor's query_timeout can be set to None.
Current Behavior
The Spark Kafka Processor's query_timeout cannot be set to None; pydantic validation rejects it.
Steps to reproduce
Use query_timeout=None in SparkProcessorConfig
Execute with spark-submit
Error: the pydantic ValidationError shown in the traceback above is raised.
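For reference, a minimal reproduction sketch based on the call shown in the traceback (the import path is an assumption and may vary between Feast versions):

```python
from pyspark.sql import SparkSession

# Import path assumed from Feast's contrib Spark/Kafka stream processor; adjust if needed.
from feast.infra.contrib.spark_kafka_processor import SparkProcessorConfig

spark = SparkSession.builder.appName("feast_kafka_repro").getOrCreate()

# Passing query_timeout=None triggers the pydantic ValidationError above.
ingestion_config = SparkProcessorConfig(
    mode="spark",
    source="kafka",
    spark_session=spark,
    processing_time="5 seconds",
    query_timeout=None,
)
```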