It's 10:37PM and my file table just overflowed.

Safia Abdalla - Dec 19 '18 - Dev Community

So, I was running a build on a lerna-managed monorepo recently when I ran into this strange error about halfway through the build.

(libuv) kqueue(): Too many open files in system
net.js:271
    err = this._handle.open(fd);
                       ^

Error: ENFILE: file table overflow, uv_pipe_open
    at new Socket (net.js:271:24)
    at createWritableStdioStream (internal/process/stdio.js:191:18)
    at process.getStdout [as stdout] (internal/process/stdio.js:20:14)
    at console.js:469:19
    at NativeModule.compile (internal/bootstrap/loaders.js:339:7)
    at Function.NativeModule.require (internal/bootstrap/loaders.js:200:18)
    at setupGlobalConsole (internal/bootstrap/node.js:444:41)
    at startup (internal/bootstrap/node.js:132:7)
    at bootstrapNodeJSCore (internal/bootstrap/node.js:826:3)
error Command failed with exit code 1.

I've never seen this before. File table overflows?!!? That sounds interesting. I decided to dig around to figure out what might be going on. Some Googling revealed that this was a result of having too many open files on my system. This seemed super suspicious though. I've run this command hundreds of times before without a hitch. Why now?

I did some more Googling to figure out the maximum number of open file descriptors allowed on my system and got the following.

By the way, for those of you who might be unfamiliar, file descriptors are used to uniquely identify open files on our operating system. Although the name contains the word "file," they are also used to uniquely identify any file-like object, which in the Unix world includes things like hardware devices. If you're curious about this, I think this article does a good job of diving into it.
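If you want to see one in the wild, here's a quick sketch (assuming a Unix-like shell, where $$ expands to the shell's own process ID): ask lsof about your current shell and the FD column will list descriptors 0, 1, and 2 for stdin, stdout, and stderr, which usually point at a terminal device rather than an ordinary file.

$ lsof -p $$ | head    # first few entries for the current shell, FD column included

Anyway, back to that limit: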

$ launchctl limit maxfiles
        maxfiles    256            unlimited

The 256 represents the soft limit on the number of files that can be open at once; this is probably the limit I reached. The unlimited is the hard limit, the one that absolutely cannot be surpassed.
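For context, macOS also has kernel-wide ceilings on open files on top of the per-process limits that launchctl reports, and since ENFILE ("file table overflow") refers to the system-wide file table, those are worth a glance too. They can be read with sysctl; the knob names below are the standard macOS ones, though the default values vary between versions.

$ sysctl kern.maxfiles kern.maxfilesperproc    # system-wide cap and per-process cap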

I decided to run the build command again to see if I would run into the same error. This time, I tracked down the process ID of the build command and used the lsof command to get the list of open file descriptors associated with that process.

$ lsof -c node

This ended up being quite a noisy command because lsof printed out all the file descriptors for every process running Node on my machine. For me, this included things like Hyper, my terminal emulator.
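If you want to narrow that down, lsof can also filter by a specific process ID rather than a command name. Something like the line below gives a rough count of open descriptors for the newest node process; treat it as a sketch, since pgrep -n may not pick out the exact build process you care about, and the count is approximate because lsof also lists entries like cwd and txt that aren't real descriptors.

$ lsof -p "$(pgrep -n node)" | wc -l    # rough count of entries for the newest node process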

In any case, the second time I ran the command, the builds completed successfully without any errors bubbling up from the operating system. Perhaps the build was just a red herring, and something else running on my system was opening up too many file descriptors? Maybe there's a dependency in my build that's causing the issue? Perhaps the bug in that dependency is fairly transient, which is why I only saw it on that one occasion?

Several of the online posts regarding this issue across different packages recommended increasing the maximum number of file descriptors that can be open on your system. I decided not to do this, since the issue seemed to be more of a fluke than anything else.
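In case you're wondering what that advice usually looks like in practice (I didn't end up doing this myself), the quickest variant is to bump the soft limit for just your current shell session with ulimit before kicking off the build. The persistent launchctl and sysctl routes exist too, but the exact steps vary between macOS versions, so treat these two lines as a sketch, with yarn build standing in for whatever command runs your build.

$ ulimit -n 4096    # raise the soft limit for this shell session only
$ yarn build        # stand-in for whatever kicks off the lerna build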

Anyway, no real conclusion or moral to this blog post. I was just doing some work and noticed something interesting and decided to annotate my process as I worked through it. If I run into this issue again, I'll post a follow-up blog post.

I should go to bed. Good night, fellow developers!
