I have a backend written in Node.js that uses a PostgreSQL database, and I run everything in Docker containers. The problem I am encountering is that the memory consumption of the backend container increases over time, and I have narrowed the cause down to read/write operations against the database.

Here is a snippet of an example database read function; all the others are written the same way. I couldn't come up with a reason why this is happening.

Any input would be appreciated.

Thanks in advance.

async getAllUsers() {
  const client = await pool.connect(); // Acquire a client from the pool
  const query = `
    SELECT * FROM Users;
  `;

  try {
    const queryResult = await client.query(query);
    console.log(queryResult.rows); // Log all users
    return queryResult.rows; // Return all users for further processing
  } catch (err) {
    console.error("Error executing select query", err.stack);
    throw err; // Rethrow the error to handle it in the caller function
  } finally {
    client.release(); // Release the client back to the pool
  }
}

I tried solving the memory leak by adding client.release(), but it doesn't change the behaviour. The backend's memory consumption shouldn't increase over time.

  • Is there anything specific to Docker in this setup? Do you see the same behavior if you run the Node application using a plain non-container Node environment?
    – David Maze
    Commented Jul 7 at 15:02
  • Did you check pool's props related to clients it provides?
    – Anatoly
    Commented Jul 7 at 18:02
