=^.^=

Find and Delete the Largest Files on a File System

The fastest way to find large or runaway files across a whole filesystem or within a specific directory is to run:

find / -type f -follow -print0 | xargs -0 ls -l | sort -r -n -k 5,5 | head -n 20

Where / is the target and 20 is the number of results you would like to see (sparing yourself a flooded terminal buffer). The -print0/-0 pair keeps filenames containing spaces or newlines from breaking the pipeline. The output looks something like:

-rw-r--r-- 1 karma  karma  358826880 Jan 22  2011 dist/clear-trollup.tar.lzma
-rw-r--r-- 1 karma  karma  273621974 Oct 20  2010 dist/clear-foxpaws.hvm.hdd.tar.lzma
-rw-r--r-- 1 karma  karma  273399504 Oct 20  2010 dist/clear-foxpaws.tar.lzma
-rw-r--r-- 1 karma  karma   22299432 Dec  9 19:21 dist/megaupload.mp4
-rw-r--r-- 1 root   root     2815800 Nov 27 19:45 dist/kernel-domU-2.6.38
-rw-r--r-- 1 root   root     2569440 Apr 17 19:13 dist/kernel-domU-3.2.12
...
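If your find supports -printf (GNU findutils does), here is a sketch that skips parsing ls output entirely and sorts on raw byte counts instead of column 5:

```shell
# Print "<size-in-bytes> <path>" for every file, then sort largest first.
# -printf is a GNU findutils extension; BSD/macOS find does not have it.
find . -type f -printf '%s %p\n' | sort -rn | head -n 20
```

Because this never runs ls, it is also immune to awkward filenames and spawns far fewer processes.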

There is a much cooler but far less efficient way: the "File Size View" graphical file manager for KDE's Konqueror. You may need to install the konqueror-plugins package if it is not already available.

File Size View does not work over kio abstractions (ssh/sftp/fish/ftp etc) but works fine (excruciatingly slowly) over NFS.

LiteStar says:

$ find . -type f -size +30M

This will give you a list of all the files over 30MB.
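Since the point is to delete as well as find, -size pairs naturally with GNU find's -delete action (or -exec rm for non-GNU systems). A sketch, assuming you have already reviewed what matches:

```shell
# Print each file over 30MB, then remove it.
# -delete is a GNU extension; keep it LAST, or find will
# delete everything it visits before the tests are applied.
find . -type f -size +30M -print -delete
```

Run the command with only -print first to confirm the list before adding -delete.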

UPDATE Your woes may not be over. A deleted file continues to occupy disk space until every process holding it open has closed it or terminated. Please continue reading Find the Largest Open Files and Their Owner(s) on Linux with lsof if you are experiencing problems with "ghost files."
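As a quick preview of that technique, lsof can list open-but-unlinked files directly. A sketch, assuming a reasonably recent lsof:

```shell
# +L1 selects open files with a link count below one --
# that is, files that have been deleted but are still held
# open by some process (and so still consume disk space).
lsof +L1
```

Killing or restarting the listed processes releases the space for good.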

Comments

• litestar

the nice thing about -size is to use +/-, signifying greater than/less than:

find . -type f -size +30M

find . -type f -size 30M

find . -type f -size -30M

same goes for create/access time, &c. nice utility, all said.
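The same +/- convention carries over to the time predicates mentioned above; for example, with -mtime (days since last modification), a sketch:

```shell
# Files last modified more than 7 days ago;
# -mtime -7 would instead match files modified within the last 7 days.
find . -type f -mtime +7
```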