Convert to Delta #328
Comments
Assuming you are using open-source Delta Lake (i.e., not Databricks Delta Lake), the Hive Metastore will still consider it to be an internal table in Parquet format, so any Hive query will attempt to read the table as Parquet and fail (the JSON files in the _delta_log directory will cause problems). And it is pretty weird if an internal Hive table cannot be read from Hive. Hence, it is generally not recommended to convert an internal Hive table to a Delta table. Rather, copy the internal table to an external location and convert that to a Delta table. After the conversion, you can query it using Spark, Presto/Athena, Snowflake, etc. See the list of connectors: https://docs.delta.io/latest/integrations.html

We do have an experimental Hive connector that will allow Hive to read Delta tables. However, the table definition in the metastore will need to be explicitly changed from being a Parquet table to a Delta table (i.e., …)
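The recommended flow above can be sketched in Spark SQL. This is only a sketch: the path, table name, and column names are hypothetical, and the partitioned variant is needed only when the source data is partitioned:

```sql
-- Step 1: copy the internal table's data to an external location
-- (e.g. via Spark), then convert the Parquet files there in place.
CONVERT TO DELTA parquet.`/mnt/external/events`;

-- Partitioned tables must declare their partition schema explicitly:
CONVERT TO DELTA parquet.`/mnt/external/events` PARTITIONED BY (dt DATE);
```

For the experimental Hive connector mentioned above, the metastore entry is redefined to use the Delta storage handler rather than the Parquet SerDe (again, table name, schema, and path are illustrative):

```sql
CREATE EXTERNAL TABLE events_delta (id INT, payload STRING)
STORED BY 'io.delta.hive.DeltaStorageHandler'
LOCATION '/mnt/external/events';
```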
@zhangxinjian123 were you able to work around the issue?
Following the Delta Lake API, I successfully converted a Hive internal table to a Delta Lake table. My first question is whether the converted table is a Hive internal table or a Hive external table, because Hive cannot read the data of a Delta Lake table directly; you have to access it through the delta-io/connectors Hive connector. My second question: after a Hive internal table has been converted to a Delta Lake table, can new Delta Lake data be written directly into that table?
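On the second question: once the table is a Delta table, new data should go through a Delta-aware writer (e.g. Spark), which updates both the Parquet data files and the _delta_log transaction log. A minimal sketch in Spark SQL, assuming the hypothetical path and two-column schema used above:

```sql
-- Appending through Spark keeps the _delta_log consistent, so any
-- Delta-aware reader (Spark, Presto/Athena, the Hive connector)
-- will see the new rows.
INSERT INTO delta.`/mnt/external/events` VALUES (1, 'first event');
```

Writing raw Parquet files into the directory by hand would bypass the transaction log and is not how Delta tables are meant to be updated.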