#StackBounty: #javascript #node.js #stream #node-fetch #node-streams Prevent memory/connection leaks when piping fetched response to cl…

Bounty: 50


I have some code that looks roughly like this:

const express = require("express");
const app = express();
const fetch = require('node-fetch');

app.get("/file/:path", async function(request, response) {
  let path = request.params.path;
  let r = await fetch(`https://cdn.example.com/${encodeURIComponent(path)}`, {timeout: 10000}).catch(e => false);
  if(r === false || !r.ok) {
    response.sendStatus(502); // upstream fetch failed
  } else {
    r.body.pipe(response); // stream the CDN response body through to the client
  }
});

So you can see that I’m simply fetching a response with node-fetch and then piping it to the client response stream.

Note that I’m handling the errors better than .catch(e => false) in the real code. It’s extremely rare that there are errors, so I figured I’d leave out those details.


What are the possible ways in which this code could leak TCP connections or memory? How do I "cover" these cases?

Extra Info:

I’d have thought that node-fetch/express would apply default timeouts (based on time since the last chunk was received) that would prevent leaks, but that doesn’t seem to be the case.

My first (potentially naive) thought is something like:



let timeout = setTimeout(() => {
  // pseudocode - not sure what the correct abort/cancel/destroy method to use is:
  r.body.destroy(); // or response.destroy()? or abort the fetch itself?
}, 1000*120);

response.on("finish", () => {
  clearTimeout(timeout);
});
But I’m not sure whether that should be r.abort() instead? E.g. if the CDN sends data too slowly (or stops altogether, but doesn’t error). I don’t know what’s going on within the stream API (re: timeouts, failure modes, etc.), so I don’t know how to debug this. It would help if I could quickly try things out, but this leak only becomes apparent/significant over the course of days, so it’s hard to test my "guesses".

