When a query (especially a long-running one) is interrupted while using a connection from a Pool, that connection is returned to the pool almost immediately, even though the query may still be running on it.
Another async consumer then picks the connection up from the pool to do its work and gets:
libpq: failed (another command is already in progress
The gist of it is:
- Create a connection pool of size 1
- In another thread, acquire the connection from the pool and run something long, e.g. 10000 inserts
- Kill that thread before it finishes
- Immediately try running another query against the same pool
I've created a reproducible repo here that just requires stack and docker (for the postgres server):
https://github.com/codygman/persistent-postgresql-query-in-progress-repro/blob/master/src/Main.hs
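A minimal sketch of those steps (not the repro repo itself; it assumes a local Postgres reachable with the connection string below and an illustrative `scratch` table) might look like this:

```haskell
{-# LANGUAGE OverloadedStrings #-}
module Main where

import           Control.Concurrent          (forkIO, killThread, threadDelay)
import           Control.Monad               (forM_)
import           Control.Monad.IO.Class      (liftIO)
import           Control.Monad.Logger        (runStderrLoggingT)
import           Database.Persist.Postgresql (ConnectionString, rawExecute,
                                              runSqlPool, withPostgresqlPool)

-- Connection string is an assumption; adjust for your local Postgres.
connStr :: ConnectionString
connStr = "host=localhost port=5432 user=postgres dbname=postgres password=postgres"

main :: IO ()
main = runStderrLoggingT $
  -- Step 1: a pool with a single connection.
  withPostgresqlPool connStr 1 $ \pool -> liftIO $ do
    -- A throwaway table for the inserts (name is illustrative).
    runSqlPool (rawExecute "CREATE TABLE IF NOT EXISTS scratch (n int)" []) pool
    -- Step 2: a worker thread runs many inserts on the pool's only connection.
    tid <- forkIO $
      runSqlPool
        (forM_ [1 .. 10000 :: Int] $ \_ ->
           rawExecute "INSERT INTO scratch (n) VALUES (1)" [])
        pool
    -- Step 3: kill the worker while its query is still in flight.
    threadDelay 100000
    killThread tid
    -- Step 4: the pool immediately hands the same (still busy) connection to the
    -- next consumer, which fails with "another command is already in progress".
    runSqlPool (rawExecute "INSERT INTO scratch (n) VALUES (0)" []) pool
```

The timing of `killThread` matters: the exception has to land while a statement is in flight on the connection for the failure to show up, which is why the repro repo uses a long run of inserts.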