PostgreSQL: Statistics broken #62
Analogous to the SQL, the DQL used to generate it is: SELECT SUM(p.stockLevel) AS stockLevel, pu FROM de\RaumZeitLabor\PartKeepr\Part\PartUnit pu LEFT JOIN pu.parts p GROUP BY pu.id. If we omit "pu" from the SELECT, it works (but then we have no information about the unit).
This was a bug in Doctrine, which has been fixed. Verified and closing.
It seems that the issue is related to PostgreSQL 8 (it is confirmed to work on PostgreSQL 9).
The same problem occurs when trying to create a project report: SQLSTATE[42803]: Grouping error: 7 ERROR: column "p1_.id" must appear in the GROUP BY clause or be used in an aggregate function LINE 1: SELECT SUM(p0_.quantity) AS sclr0, p1_.id AS id1, s2_.name A...
Another, probably related, error occurs in the "CreateStatisticSnapshot" cron job: tbruese@saturn:/var/www/devpartkeepr/cronjobs$ php CreateStatisticSnapshot.php
I believe all of these are done now, but to be sure, can you retest when you find time?
All mentioned issues seem to be fixed.
I'm glad :) Closing this one and preparing release now.
Statistics fail on PostgreSQL because of the following SQL statement:
SELECT SUM(p0_.stockLevel) AS sclr0, p1_.id AS id1, p1_.name AS name2, p1_.shortName AS shortName3, p1_.is_default AS is_default4 FROM PartUnit p1_ LEFT JOIN Part p0_ ON p1_.id = p0_.partUnit_id GROUP BY p1_.id;
ERROR: column "p1_.name" must appear in the GROUP BY clause or be used in an aggregate function
LINE 1: ...LECT SUM(p0_.stockLevel) AS sclr0, p1_.id AS id1, p1_.name A...
If we just retrieve the SUM() without any field of p1_, it works.
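A sketch of a possible fix, not taken from the thread: on PostgreSQL 8, every non-aggregated column in the SELECT list must also appear in GROUP BY, so the generated query can be repaired by grouping on all PartUnit columns rather than only the id (PostgreSQL 9.1 and later accept grouping by the primary key alone, because the remaining columns are functionally dependent on it, which would explain why PostgreSQL 9 works). The schema and data below are invented for illustration, and SQLite is used only as a harness to exercise the query shape; it does not enforce PostgreSQL's grouping rule.

```python
import sqlite3

# Minimal stand-in schema for the PartUnit/Part tables (assumed names
# taken from the generated SQL in the report above).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE PartUnit (id INTEGER PRIMARY KEY, name TEXT,
                       shortName TEXT, is_default INTEGER);
CREATE TABLE Part (id INTEGER PRIMARY KEY, stockLevel INTEGER,
                   partUnit_id INTEGER REFERENCES PartUnit(id));
INSERT INTO PartUnit VALUES (1, 'Pieces', 'pcs', 1), (2, 'Meters', 'm', 0);
INSERT INTO Part VALUES (1, 10, 1), (2, 5, 1), (3, 7, 2);
""")

# Corrected query: every selected non-aggregated column is listed in
# GROUP BY, which PostgreSQL 8 requires.
rows = conn.execute("""
SELECT SUM(p0_.stockLevel) AS sclr0,
       p1_.id AS id1, p1_.name AS name2,
       p1_.shortName AS shortName3, p1_.is_default AS is_default4
FROM PartUnit p1_
LEFT JOIN Part p0_ ON p1_.id = p0_.partUnit_id
GROUP BY p1_.id, p1_.name, p1_.shortName, p1_.is_default
ORDER BY p1_.id
""").fetchall()
print(rows)  # [(15, 1, 'Pieces', 'pcs', 1), (7, 2, 'Meters', 'm', 0)]
```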