Hi,

I'm trying to use group() and count() on a table, as described at https://www.rethinkdb.com/api/java/group/. The goal is simply to group rows by a field and count the occurrences in each group.

In my code, this looks like:

First problem: to make the code compile, the function must return a Future[Double]; I would expect something like Future[Map[String, Double]] or Future[Seq[(String, Double)]].

Then, when I run the code, I get the following (which makes sense, since the return type is wrong):

RethinkRuntimeError: Can not deserialize instance of java.lang.Double out of START_OBJECT token at
[Source: {"t":1,"r":[{"$reql_type$":"GROUPED_DATA","data":[["foo",143]]}]}; line: 1, column: 13]
(through reference chain: com.rethinkscala.net.JsonResponse["r"]->com.fasterxml.jackson.module.scala.deser.BuilderWrapper[0])]

I think the query itself is fine, since I can see the grouped and counted data in the error message.

BTW, I'm using: "com.rethinkscala" %% "core" % "0.4.8-SNAPSHOT"

Any help appreciated!

Group and ungroup have always been a problem to support, due to the ability to group by different data types. I'll take another look into this to see if I can at least provide some casting methods, so the user can instruct the driver of the correct type when it fails to pick it automatically.
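As a workaround until the driver handles this, the GROUPED_DATA payload visible in the error can be unpacked by hand from the raw JSON. This is a minimal sketch, assuming jackson-databind is on the classpath (the stack trace shows the driver already uses Jackson); the object and method names here are my own illustration, not part of the rethinkscala API:

```scala
import com.fasterxml.jackson.databind.ObjectMapper

object GroupedDataDecode {
  // Raw wire payload copied from the error message above.
  val raw = """{"t":1,"r":[{"$reql_type$":"GROUPED_DATA","data":[["foo",143]]}]}"""

  /** Unpack a GROUPED_DATA pseudo-type into group -> count pairs. */
  def decode(json: String): Map[String, Double] = {
    val root = new ObjectMapper().readTree(json)
    // "r" holds the results; its first element carries the pseudo-type,
    // whose "data" field is an array of [group, reduction] pairs.
    val data = root.get("r").get(0).get("data")
    (0 until data.size()).map { i =>
      val pair = data.get(i)
      pair.get(0).asText() -> pair.get(1).asDouble()
    }.toMap
  }

  def main(args: Array[String]): Unit =
    println(decode(raw))
}
```

For the payload in the error this yields a single entry mapping "foo" to 143.0, i.e. the Future[Map[String, Double]] shape the question expected.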