
Unix find tutorial

Prev | Contents | Next

Part 8: Finding files based on size: largest, empty, and within a certain range


Introduction

Among the categories of files with specific sizes, the most popular are probably zero-length files and "huge" files. You might also be interested in files of approximately known size within specific categories, such as tar files, TGZ files, ISO images, etc.

Do not try to delete the files found in the same statement using -exec. Write the output of find into a file, analyze it, and only then run xargs against this file. If you are operating on system directories, do a backup first and run xargs second. You can do a lot of damage using -exec unless you understand what you are doing, and if you use -exec you typically realize that you missed something when it is already too late.
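A minimal sketch of this two-step workflow (GNU xargs assumed; /home/project1 and the list file name are illustrative):

find /home/project1 -type f -size +500M > /tmp/candidates.lst
# inspect /tmp/candidates.lst and remove any lines you are not sure about, then:
xargs -d '\n' rm -- < /tmp/candidates.lst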

The -size predicate has the following syntax:

-size [+|-]n[bckwMG]

True if the file uses n units of space, rounding up. The units are 512-byte blocks by default, but they can be changed by adding a one-character suffix to n: b for 512-byte blocks (the default), c for bytes, w for two-byte words, k for kilobytes, M for megabytes, and G for gigabytes.

The + and - prefixes mean above and below the given size, respectively. If you use two -size predicates in the same expression, you can find files within a certain interval of sizes. This is useful for finding "missing" files: when you know that a file exists but forgot its exact name and path.

Zero length files

Zero-length files might formally be considered a subclass of abnormal files, but in reality they are often legitimately used in software installations. Another common task is finding the largest files in a filesystem; in that case you can also sort the results in descending order to make it easier to decide which files to delete and which to move in order to free space.

To find all zero-length files:

find /home -type f -size 0 -ls 

You can also use the special predicate -empty:

$ find /home -empty -exec ls {} \;
After finding empty files, you might choose to delete them by replacing the ls command with rm. But it is better to verify the list before jumping the gun. Never do it from the root directory, only from second-level directories where you know such files are safe to delete. Some zero-length files are legitimate entries in /etc and other system directories.

Finding files within certain size range

Another unfortunate but rather common situation: you destroyed certain files due to a bug in a script, and they all have a certain size. In this case searching for that size can give you a list that defines the scope of the damage and allows you to restore the files from backup.

You can combine several size predicates to specify a "range", which is useful if you forgot the name of a file but know its approximate size and some other parameters (for example, the age of the file):

find /home -type f -size +12k -size -15k -mtime -3 -ls

You can also specify the size in bytes (suffix c), not only in larger units (see above). For example:

find /home -type f -size +1024c -size -2048c

Finding "huge" files

Excluding certain parts of the tree from searching

As with the abnormal files discussed in the previous section, it is important to exclude certain parts of the directory tree from the search. For example, some zero-length files are legitimate entries in /etc and other system directories. You can use all the approaches discussed in section 6, but it is often more practical and simpler to use a small Perl script to weed out "junk" directories. For example:

#!/usr/bin/perl
# Report zero-length files, excluding known "junk" directories.
open(SYSLOG,"find / -type f -size 0 -ls|") or die "Cannot run find: $!";
@pattern=(
 "/var/spool/",
 "/var/tmp/",
 "/var/locks/",
 "/var/ct/",
 "/var/ifor/",
 "/usr/lpp/",
 "/usr/omni/log/debug.log",
 "/tmp/",
 "/proc/", 
 "/.swdis/",
 "/etc/Tivoli/tec/",
 "/usr/omni/lib/perl",
 "/opt/TMF/db/nti2171.db/",
 "/opt/tivoli/EPProxy/",
 "/opt/tivoli/TWS/stdlist/",
 "/var/adm/logs/",
 "/oracle/app/oracle/product/9.2.0/",
 "/oracle/app/oracle/product/9.2.0_OLD/",
 "/usr/opt/perl5/lib/5.8.2/aix-thread-multi/",
 "/usr/opt/perl5/lib64/5.8.2/aix-thread-multi-64all/",
 "/usr/opt/perl5/lib/site_perl/5.8.2/",
 "/usr/lib/objrepos/",
 "/usr/aix/_AIX_print_subsystem",
 "/etc/ipsec/inet/DB/",
 "/etc/objrepos/"
 );
while(<SYSLOG>) {
   $data=$_;
   # The file path starts at a fixed column in "find -ls" output;
   # adjust the offset (65) if your version of find formats columns differently.
   $line=substr($data,65);
   chomp($line);
   $found=0;
   foreach $string (@pattern) {
      if ( index($line,$string) > -1 ) {
         $found=1;
         last;
      }
   }
   if ($found == 0) { print $data; }
}
close(SYSLOG);

Cleaning the filesystem

The most typical situation for a professional system administrator is an overflowing filesystem. Typically users put too many files into one of the filesystems, and such a filesystem needs a cleanup, an extension of its size, or both. This is especially typical for servers that serve research groups.

If you have spare disk capacity, extend the size and think about cleanup later. The problem arises when you no longer have spare disk capacity. Then you need to resort to cleanup, which is a dangerous operation that needs to be planned and executed very carefully. You should be extremely careful not to create a "man-made disaster" instead of a cleanup ;-). Here are some steps that might help:

First you need to find the largest files and directories in the particular filesystem. For example:

du -a /home | sort -n -r | head -n 50 | tee /tmp/largest
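If your system has GNU coreutils, you can work with human-readable sizes instead; sort -h understands the suffixes that du -h produces:

du -ah /home | sort -rh | head -n 50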

To find the largest files in a particular directory you can use ls (-S sorts by file size, largest first):

ls -lSh . | head -12

Typically cleanup starts with finding tar and TGZ files downloaded by users. If such files are packages downloadable from the net, they can be deleted first, as this is clearly a safe thing to do:

find . -name "*\,tar" -ls

As users tend to download the same packages again and again, you can create a list of well-known software package names and automate the process:

find . -name "*\.tgz"   | xargs ls -l | tee /tmp/current_tgz.txt
for f in /root/Config/known_tgz.txt ; do  
   if grep $f  /tmp/current_tgz.txt ; then  
      echo "Known software package $f is present and should be deleted"
      known=`grep $f  /tmp/current_tgz.txt`
      if [ -f $known ] ;  then
         rm known
      fi
   fi 
done

After that you need to find the "oldest" large files on the filesystem. Files that are more than a year old are often good candidates for moving to backup storage. For example:

find /home -size +500M -mtime +365 -ls

If the "huge" files found are compressible, they can be compressed with gzip or bzip2, but such a command should be run only within a particular directory, not globally:

find . -name "*.tar" -exec gzip {} \;

You can also use du on selected directories:

du --max-depth=1 -x -h /

Often some space can be recovered by finding and eliminating duplicate files, especially if such files are huge.
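One common approach to spotting duplicates is to hash candidate files and print groups with identical checksums (a sketch assuming GNU coreutils; the +100M threshold is arbitrary):

find /home -type f -size +100M -exec md5sum {} + | sort | uniq -w32 --all-repeated=separate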

Cleaning filesystems without backup can lead to SNAFU

WARNING: Using find for cleaning filesystems can lead to a SNAFU if files in different directories have identical names. For example, if the following two files are selected by find, only the second one will survive the move:

/Apps/London/150722/car_traffic 
/Apps/London/150723/car_traffic

In other words, the command

find /Apps -size +500M -mtime +365 -exec mv {} /Backup \;

will destroy files in different directories that have identical names, preserving only the one that was moved last. It "flattens" the directory structure into a single directory, so files with identical names overwrite each other. Note that the obvious workaround using dirname

find /Apps -size +500M -mtime +365 -exec mv {} /Backup/`dirname {}` \;

does not work either: the backticks are expanded by the shell once, before find runs, so dirname {} evaluates to "." and everything still lands in /Backup.
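A working variant invokes a shell for each file and recreates the directory path under /Backup first (a sketch assuming GNU find; adjust /Backup to your layout):

find /Apps -size +500M -mtime +365 -exec sh -c 'mkdir -p "/Backup$(dirname "$1")" && mv "$1" "/Backup$(dirname "$1")"' _ {} \;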

If there are not too many such directories, it is much better and safer to use Midnight Commander to move older directories to another location. It automatically prevents this SNAFU.

In any case, if you use a find command like the one above, after detecting the files you need to move you must first ensure that there are no files with identical names in different directories. The simplest way to avoid collisions is to add a suffix with the file creation date to each file name.

You can also use rsync to copy selected directories to a remote location.
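For example, rsync with --relative (-R) recreates the full source path under the destination, which sidesteps the name-collision problem (backuphost is a placeholder):

rsync -aR /Apps/London/150722/car_traffic backuphost:/Backup/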

Or you can post-process the list, selecting the directories you need, and then feed this list to xargs for copying.
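A minimal sketch of that approach (GNU cp and xargs assumed; --parents recreates each file's directory path under /Backup):

find /Apps -size +500M -mtime +365 > /tmp/tomove.lst
# review /tmp/tomove.lst, then copy preserving directory structure:
xargs -d '\n' cp --parents -t /Backup < /tmp/tomove.lst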



Old News ;-)

https://twitter.com/climagic

Command Line Magic @climagic, Feb 23

find music -name '*.mp3' -mtime +365 -a -size +10M -ls # Find and long list mp3 files in Music dir older than a year and larger than 10MB.

find . -empty -type d # List of empty subdirectories of current directory.


Prev | Contents | Next


