PostgreSQL - Too Many Clients

The internal database running on Mitto installations is PostgreSQL. By default, the maximum number of client connections allowed is 100 (max_connections in /etc/postgresql/{version}/main/postgresql.conf).

Any remote person or system using Mitto’s database opens a new connection, and any Mitto job that uses Mitto’s internal database opens a connection as well. If many jobs are running at the same time, a lot of people on your team are using the database, or a service like Tableau is using it as a data source, it is possible to max out the allowed number of connections.

When that happens you might see an error like:

FATAL:  sorry, too many clients already

To see details about the connections, connect to the mitto database in your database client (DataGrip, for example) and try the following queries:

-- See the number of connections per database at a summary level
-- (the numbackends column is the current connection count)
SELECT * FROM pg_stat_database;

-- See the details of each individual connection
SELECT * FROM pg_stat_activity;
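
To see which users or applications are holding the most connections, a grouped query over pg_stat_activity can help. This is a sketch using standard pg_stat_activity columns (usename, application_name, state):

-- Count current connections by user, application, and state
SELECT usename, application_name, state, count(*) AS connections
FROM pg_stat_activity
GROUP BY usename, application_name, state
ORDER BY connections DESC;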

If this is an ongoing issue and Zuar is hosting your Mitto, contact Zuar Support for assistance in changing the limit.

(1) How can I check this: (max_connections in /etc/postgresql/{version}/main/postgresql.conf ).

(2) I ran both pg_stat_database and pg_stat_activity queries noted above – but how do you interpret where the issues are?

(3) Are there any best practices you would suggest to ensure we avoid these errors? We are using DBeaver and Tableau.

@BlueFlamingo

(1) How can I check this: (max_connections in /etc/postgresql/{version}/main/postgresql.conf ).

For this you need SSH access to the Mitto. You would need to edit the file and then restart the PostgreSQL server. For Zuar-hosted infrastructure, you’re welcome to create a support ticket to check or edit max_connections.
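
If you only need to check the current value (not change it), you can also read it from any SQL client connected to the Mitto database. SHOW and pg_settings are standard PostgreSQL, so this should work from DBeaver or DataGrip as well:

-- Show the current connection limit
SHOW max_connections;

-- Or with more detail about where the setting comes from
SELECT name, setting, source
FROM pg_settings
WHERE name = 'max_connections';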

(2) I ran both pg_stat_database and pg_stat_activity queries noted above – but how do you interpret where the issues are?

I believe the first query (pg_stat_database) shows summary info per database, including the current number of connections in its numbackends column, while the second query (pg_stat_activity) shows one row per client connection, with details such as the user, the application name, and the query currently running.
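
To see how close you are to the limit at any moment, you can compare the current connection count against max_connections. This is a sketch combining the two standard views:

-- Current connections vs. the configured limit
SELECT (SELECT count(*) FROM pg_stat_activity) AS current_connections,
       (SELECT setting::int FROM pg_settings WHERE name = 'max_connections') AS max_connections;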

(3) Are there any best practices you would suggest to ensure we avoid these errors? We are using DBeaver and Tableau.

It’s a good idea to keep the connection limit in mind when scheduling jobs and sequences so that too many aren’t running at the same time, and if we need to raise the limit we can.
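
One thing worth checking with desktop tools like DBeaver is whether they are holding idle connections open. A query like the following (a sketch, with an illustrative one-hour threshold) lists connections that have been idle for a while so they can be closed from the client side, or terminated by a superuser:

-- List connections that have been idle for more than an hour
SELECT pid, usename, application_name, state_change
FROM pg_stat_activity
WHERE state = 'idle'
  AND state_change < now() - interval '1 hour';

-- A superuser can close a specific connection by pid, for example:
-- SELECT pg_terminate_backend(12345);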