Properly reject Iceberg tables in Hive connector #8693

Closed
Tracked by #1324
findepi opened this issue Jul 28, 2021 · 4 comments · Fixed by #10441
findepi (Member) commented Jul 28, 2021

Currently

SELECT * FROM hive.default.an_iceberg_table

fails with

Query 20211207_145628_00264_6i3st failed: Unable to create input format org.apache.hadoop.mapred.FileInputFormat
io.trino.spi.TrinoException: Unable to create input format org.apache.hadoop.mapred.FileInputFormat
	at io.trino.plugin.hive.util.HiveUtil.getInputFormat(HiveUtil.java:342)
	at io.trino.plugin.hive.BackgroundHiveSplitLoader.loadPartition(BackgroundHiveSplitLoader.java:388)
	at io.trino.plugin.hive.BackgroundHiveSplitLoader.loadSplits(BackgroundHiveSplitLoader.java:352)
	at io.trino.plugin.hive.BackgroundHiveSplitLoader$HiveSplitLoaderTask.process(BackgroundHiveSplitLoader.java:276)
	at io.trino.plugin.hive.util.ResumableTasks$1.run(ResumableTasks.java:38)
	at io.trino.$gen.Trino_dev____20211207_122558_2.run(Unknown Source)
	at io.airlift.concurrent.BoundedExecutor.drainQueue(BoundedExecutor.java:80)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: java.lang.RuntimeException: java.lang.InstantiationException
	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:135)
	at io.trino.plugin.hive.util.HiveUtil.getInputFormat(HiveUtil.java:339)
	... 9 more
Caused by: java.lang.InstantiationException
	at java.base/jdk.internal.reflect.InstantiationExceptionConstructorAccessorImpl.newInstance(InstantiationExceptionConstructorAccessorImpl.java:48)
	at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
	... 10 more

and

SELECT * FROM hive.default."an_iceberg_table$properties"

does not fail at all.
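
For context, Iceberg tables registered in the Hive metastore carry a table_type=ICEBERG entry in their table parameters, so the Hive connector can recognize them before attempting to instantiate an input format. Below is a minimal Java sketch of such a check; the class and method names are illustrative, and this is not necessarily how the eventual fix in #10441 is implemented.

import io.trino.spi.TrinoException;
import io.trino.spi.connector.SchemaTableName;

import java.util.Map;

import static io.trino.spi.StandardErrorCode.NOT_SUPPORTED;

// Illustrative helper; not the actual Trino source.
final class IcebergTableRejection
{
    // Iceberg's Hive metastore integration sets this parameter on every table it creates.
    private static final String TABLE_TYPE_PROPERTY = "table_type";
    private static final String ICEBERG_TABLE_TYPE = "ICEBERG";

    private IcebergTableRejection() {}

    static boolean isIcebergTable(Map<String, String> tableParameters)
    {
        return ICEBERG_TABLE_TYPE.equalsIgnoreCase(tableParameters.get(TABLE_TYPE_PROPERTY));
    }

    // Called before planning splits or resolving system tables such as "$properties",
    // so both queries from the description fail with a clear message.
    static void verifyNotIcebergTable(Map<String, String> tableParameters, SchemaTableName tableName)
    {
        if (isIcebergTable(tableParameters)) {
            throw new TrinoException(NOT_SUPPORTED,
                    String.format("Cannot query Iceberg table '%s' with the Hive connector; use the Iceberg connector instead", tableName));
        }
    }
}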

findepi added the bug label on Jul 28, 2021
HiLany (Contributor) commented Dec 10, 2021

My understanding is that if the table is an Iceberg table, both of the above queries should fail with a friendly error message. Is that right?

findepi (Member, Author) commented Dec 13, 2021

correct

findinpath (Contributor) commented Dec 31, 2021

Reproduction steps for the issue

Spin up a dockerized Trino Hadoop environment:

testing/bin/ptl env up --environment 'singlenode'  --config 'config-default'

Create the tables via the Trino CLI:

create table hive.default.hivetable (id bigint);

create table iceberg.default.icebergtable (id bigint);

Select from an Iceberg table via the Hive connector:

select * from hive.default.icebergtable;

This operation fails with the above-mentioned exception.

Select from a Hive table via the Iceberg connector:

select * from iceberg.default.hivetable;

The operation fails with the exception (as expected):

[2021-12-31 11:45:45] [84148224] Query failed (#20211231_104545_00014_c7yj6): Not an Iceberg table: default.hivetable
[2021-12-31 11:45:45] io.trino.plugin.iceberg.UnknownTableTypeException: Not an Iceberg table: default.hivetable
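
The friendly "Not an Iceberg table" error above is the behavior the issue asks the Hive connector to mirror: the Iceberg connector inspects the metastore table parameters and refuses tables that are not marked with table_type=ICEBERG. A rough sketch of that inverse check, with illustrative names (the real connector surfaces it as io.trino.plugin.iceberg.UnknownTableTypeException, as the log shows):

import java.util.Map;

// Illustrative sketch only; not the actual Trino source.
final class IcebergTableTypeCheck
{
    private static final String TABLE_TYPE_PROPERTY = "table_type";
    private static final String ICEBERG_TABLE_TYPE = "ICEBERG";

    private IcebergTableTypeCheck() {}

    static void verifyIcebergTable(String schemaTableName, Map<String, String> tableParameters)
    {
        if (!ICEBERG_TABLE_TYPE.equalsIgnoreCase(tableParameters.get(TABLE_TYPE_PROPERTY))) {
            throw new IllegalArgumentException("Not an Iceberg table: " + schemaTableName);
        }
    }
}

With the corresponding check added on the Hive side, both reproduction queries above would fail with an equally clear message.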

shubhamg931 commented

Is there a thread that explains why Iceberg tables can't be queried via the Hive connector? Thanks in advance!
