The tool creates new tasks for the incoming requests. Do I have to create a new DB connection in every task, or can all tasks re-use an existing connection?
Creating a new connection for each task would probably add too much overhead. You should use a connection pool.
pgx has a companion library called pgxpool that you should be able to use easily.
I found the connection pooling feature, but it looks like I have to specify the size of the pool? It is not well documented at all…
The docs for pgxpool config are here: pgxpool · pkg.go.dev. Note that this library is not owned by Cockroach Labs, so questions about it should go to the pgx issue tracker.
But I don't know the required size, as I don't know how many requests are coming in (echo creates new tasks automatically).
Your pool size should not really be related to the number of incoming requests. What matters more is the ability of the underlying system to parallelize the requests. For CockroachDB, see these docs for a pool size recommendation. If possible, you should run experiments to find the best size for your workload, using that as a starting point.
So what happens if the pool is 100 and the echo framework starts the 101st thread?
Each pool can behave differently, so you should read the docs for the one you choose. The way pgxpool works is that the 101st request will block until a connection becomes available in the pool, i.e. when one of the other 100 threads returns its connection to the pool.
Any advice about how to do that right would be great! Maybe some Go example of how to use connection pools with pgx in general?
I hope this helped. The link I posted above also has examples.