dormant_user_stats is a tiny and very basic utility, essentially a more intelligent, more polished Perl implementation of a three-line Bash script:
find /home -maxdepth 1 -type d -mtime +365 | xargs -L 1 du -sh
du -sh /home
find /home | wc -l
Instead of basic output, dormant_user_stats produces a neat table with all kinds of additional stats needed to decide whether reclaiming the space is worth the candle and, if so, for which top 20% of users in the list it should be done :-).
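To illustrate the idea, here is a hedged sketch (not the actual dormant_user_stats, whose exact output is not shown here; the function name `dormant_stats` is hypothetical, and GNU find and du are assumed) that prints, for each top-level directory under a base path, its size, file count, and age in days since the newest file inside was modified:

```shell
# Hedged sketch, not the real dormant_user_stats: summarize each top-level
# directory under a base path with size, file count, and age in days
# (days since the newest file inside was modified). Assumes GNU find/du.
dormant_stats() {
    base="$1"
    now=$(date +%s)
    printf '%-20s %10s %10s %10s\n' DIRECTORY SIZE FILES AGE_DAYS
    for dir in "$base"/*/; do
        [ -d "$dir" ] || continue
        size=$(du -sh "$dir" | cut -f1)
        files=$(find "$dir" -type f | wc -l)
        # newest modification time anywhere in the subtree (GNU find)
        newest=$(find "$dir" -type f -printf '%T@\n' | sort -rn | head -n 1)
        if [ -n "$newest" ]; then
            age=$(( (now - ${newest%.*}) / 86400 ))
        else
            age='-'   # directory contains no regular files
        fi
        printf '%-20s %10s %10s %10s\n' "$(basename "$dir")" "$size" "$files" "$age"
    done
}
```

Run as, e.g., `dormant_stats /home` and sort on the AGE_DAYS column to see which home directories have been untouched the longest.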
It is useful mainly for large installations and for maintainers of web sites like Softpanorama that store their content in a Unix directory tree rather than in MySQL or another database. In that case, knowing which branches are stale and need to be either updated or archived (saving space and i-nodes) is as important as knowing which users have abandoned a particular server so that the space of their home directories can be reclaimed on large enterprise servers. When the number of users exceeds 300, a primitive Bash script like the one above can run for several hours, and you can benefit from some of the optimizations that dormant_user_stats implements (which, first of all, means more careful filtering).
But in any case such utilities can run for several hours, or even several days (on petabyte storage), especially if multiple users have multi-terabyte directories with millions of files.
I have another utility called dir2tar which does this safely, with all kinds of verifications along the way; it can be used together with dormant_user_stats for the top 20% of dormant users or, more precisely, dormant directories.
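The exact checks dir2tar performs are not described here; as a hedged sketch of the general archive-and-verify idea, the hypothetical function below tars a directory and compares the entry count inside the archive against the entry count on disk before declaring the directory safe to remove:

```shell
# Hedged sketch of the archive-and-verify idea (not the actual dir2tar):
# tar a directory, then verify the archive lists the same number of
# entries as the directory tree contains before reporting success.
archive_dir() {
    dir="$1"; out="$2"
    tar -czf "$out" -C "$(dirname "$dir")" "$(basename "$dir")" || return 1
    expected=$(find "$dir" | wc -l)      # directory itself + all contents
    actual=$(tar -tzf "$out" | wc -l)    # entries recorded in the archive
    if [ "$expected" -eq "$actual" ]; then
        echo "OK: $out verified ($actual entries); $dir can be removed"
    else
        echo "MISMATCH: $expected entries on disk, $actual in archive" >&2
        return 1
    fi
}
```

Only after the verification passes would a real tool go on to delete the original directory; this sketch deliberately stops at reporting.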