[Bug] Flink Decimal(20, 0) cannot be converted to the StarRocks bigint unsigned data type.
Environment
My StarRocks table has a column of type BIGINT UNSIGNED. I created a Flink sink table for StarRocks with the following structure:
CREATE TABLE starrocks_sink (
    userId DECIMAL(20, 0)
)
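For reference, a fuller sink definition along the same lines hits the same check when the statement set is explained. This is only a sketch: the WITH options use standard StarRocks connector keys, but every endpoint, database, table, and credential value below is a placeholder, not taken from the report.

-- Hypothetical reproduction sketch; all connection values are placeholders.
CREATE TABLE starrocks_sink (
    userId DECIMAL(20, 0)  -- the corresponding StarRocks column is BIGINT UNSIGNED
) WITH (
    'connector' = 'starrocks',
    'jdbc-url' = 'jdbc:mysql://fe-host:9030',
    'load-url' = 'fe-host:8030',
    'database-name' = 'example_db',
    'table-name' = 'example_table',
    'username' = 'root',
    'password' = ''
);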
Error information
java.lang.IllegalArgumentException: Flink and StarRocks types are not matched for column userId, flink type is DECIMAL(20, 0), starrocks type is bigint unsigned
    at com.starrocks.connector.flink.manager.StarRocksSinkTable.validateTableStructure(StarRocksSinkTable.java:148)
    at com.starrocks.connector.flink.table.sink.StarRocksDynamicSinkFunctionV2.<init>(StarRocksDynamicSinkFunctionV2.java:99)
    at com.starrocks.connector.flink.table.sink.SinkFunctionFactory.createSinkFunction(SinkFunctionFactory.java:99)
    at com.starrocks.connector.flink.table.sink.StarRocksDynamicTableSink.getSinkRuntimeProvider(StarRocksDynamicTableSink.java:44)
    at org.apache.flink.table.planner.plan.nodes.exec.common.CommonExecSink.createSinkTransformation(CommonExecSink.java:158)
    at org.apache.flink.table.planner.plan.nodes.exec.stream.StreamExecSink.translateToPlanInternal(StreamExecSink.java:176)
    at org.apache.flink.table.planner.plan.nodes.exec.ExecNodeBase.translateToPlan(ExecNodeBase.java:161)
    at org.apache.flink.table.planner.delegation.StreamPlanner.$anonfun$translateToPlan$1(StreamPlanner.scala:85)
    at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:233)
    at scala.collection.Iterator.foreach(Iterator.scala:937)
    at scala.collection.Iterator.foreach$(Iterator.scala:937)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1425)
    at scala.collection.IterableLike.foreach(IterableLike.scala:70)
    at scala.collection.IterableLike.foreach$(IterableLike.scala:69)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
    at scala.collection.TraversableLike.map(TraversableLike.scala:233)
    at scala.collection.TraversableLike.map$(TraversableLike.scala:226)
    at scala.collection.AbstractTraversable.map(Traversable.scala:104)
    at org.apache.flink.table.planner.delegation.StreamPlanner.translateToPlan(StreamPlanner.scala:84)
    at org.apache.flink.table.planner.delegation.PlannerBase.getExplainGraphs(PlannerBase.scala:543)
    at org.apache.flink.table.planner.delegation.StreamPlanner.explain(StreamPlanner.scala:103)
    at org.apache.flink.table.planner.delegation.StreamPlanner.explain(StreamPlanner.scala:51)
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.explainInternal(TableEnvironmentImpl.java:727)
    at org.apache.flink.table.api.internal.StatementSetImpl.explain(StatementSetImpl.java:103)
    at org.apache.flink.table.api.Explainable.explain(Explainable.java:40)
    at com.dlink.executor.Executor.explainStatementSet(Executor.java:351)
    at com.dlink.explainer.Explainer.explainSql(Explainer.java:199)
    at com.dlink.job.JobManager.explainSql(JobManager.java:388)
    at com.dlink.job.JobManagerExplainApi.main(JobManagerExplainApi.java:64)
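The check fails in StarRocksSinkTable.validateTableStructure while the planner builds the sink, so it surfaces during explain, before any data is written. To double-check what type the connector sees on the StarRocks side, the column can be inspected over the MySQL protocol; the database and table names below are placeholders, not values from the report.

-- Placeholder names; run against the StarRocks FE.
DESC example_db.example_table;
SHOW CREATE TABLE example_db.example_table;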