Softpanorama
May the source be with you, but remember the KISS principle ;-)


uniq command


The uniq command eliminates or counts duplicate lines in a presorted file. It reads lines and compares each line to the previous one. Depending on the options specified on the command line, it may display only unique lines, one occurrence of repeated lines, or both. The common idioms are

sort | uniq  

and

sort | uniq -c

For example

grep 'hysteresis' * | awk -F: '{print $1}' | sort | uniq | wc -l  

or, in a more complex pipeline:

cut -d '"' -f 2 $1 | cut -d '/' -f 3 | tr '[:upper:]' '[:lower:]' \
    | sort | uniq -c | sort -r > $1_sites
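With some made-up inline data, the counting idiom looks like this; note that a numeric reverse sort (sort -rn) on the leading counts is slightly more robust than a plain sort -r, which compares the padded text:

```shell
# Count duplicate lines and list the most frequent first.
# uniq -c prefixes each line with its occurrence count;
# sort -rn then orders numerically, highest count first.
printf 'b\na\nb\nb\na\n' | sort | uniq -c | sort -rn
```

This prints the line `b` with count 3 above the line `a` with count 2 (uniq right-pads the counts).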

The default output displays lines that appear only once, plus one copy of lines that appear more than once. uniq is also useful for squeezing runs of blank lines out of the unsorted output of other commands. For example, the dircmp command formats its output with pr, so the output usually scrolls off your screen before you can read it. If you pipe the output of dircmp through uniq, the blank lines are collapsed and the output is more compact.
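A minimal sketch of the blank-line squeeze, using inline data rather than dircmp output:

```shell
# Adjacent identical lines collapse to a single copy, so a run of
# blank lines becomes one blank line.
printf 'total 42\n\n\n\ndir1\n' | uniq
```

This prints `total 42`, a single blank line, and `dir1`.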

Syntax

uniq [ options ] [ input [ output ] ]

Options

The most commonly used options are:

-c	Precede each output line with a count of the number of times it occurred in the input.
-d	Display one copy of each repeated line; suppress lines that appear only once.
-u	Display only lines that are not repeated; suppress duplicates entirely.
-f N	Skip the first N whitespace-separated fields when comparing lines.
-s N	Skip the first N characters when comparing lines.

Arguments

The following arguments may be passed to the uniq command.

input	The name of the file containing the input data.
output	The name of the file to hold the output data. If no output file is specified, the output is displayed on the standard output.

The input and output files must not be the same file; if they are, the contents of the file are destroyed. If no input file is specified, uniq reads from the standard input and writes to the standard output. You cannot specify an output file without also specifying an input file.
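A quick sketch of the -d and -u options on tiny inline data (presorted, as uniq requires):

```shell
# -d keeps one copy of each line that is repeated; -u keeps only
# the lines that appear exactly once.
printf 'a\na\nb\n' | uniq -d    # prints: a
printf 'a\na\nb\n' | uniq -u    # prints: b
```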


Old News ;-)

TTTT uniq

uniq(1)  takes a stream of lines and collapses adjacent duplicate lines into one copy of the lines. So if you had a file called foo  that looked like:
davel
davel
davel
jeffy
jones
jeffy
mark
mark
mark
chuck
bonnie
chuck

You could run uniq  on it like this:
% uniq foo
davel
jeffy
jones
jeffy
mark
chuck
bonnie
chuck

Notice that there are still two jeffy  lines and two chuck  lines. This is because the duplicates were not adjacent. To get a true unique list you have to make sure the stream is sorted:
% sort foo | uniq
bonnie
chuck
davel
jeffy
jones
mark

That gives you a truly unique list. However, it's also a useless use of uniq, since sort(1) has a flag, -u, for this very common operation:
% sort -u foo
bonnie
chuck
davel
jeffy
jones
mark

That does exactly the same thing as "sort | uniq", but only takes one process instead of two.

uniq  has other arguments that let it do more interesting mutilations on its input:
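Among those arguments, the field-skipping option is the least obvious; here is a small sketch with made-up data:

```shell
# -f 1 ignores the first whitespace-separated field when comparing,
# so "1 apple" and "2 apple" compare equal and only the first of
# the run is printed.
printf '1 apple\n2 apple\n3 pear\n' | uniq -f 1
```

uniq always emits the first line of each run, so this prints `1 apple` and `3 pear`.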


Tuesday Tiny Techie Tip -- 03 December 1996

Emulation of uniq Unix command in ksh93 - AIX Expert

	#!/usr/bin/ksh93
################################################################
#### Program: uniq_k93
#### 
#### Description: Emulation of the Unix "uniq" command
#### 
#### Author: Dana French (dfrench@mtxia.com)
####         Copyright 2004
#### 
#### Date: 07/22/2004
#### 
################################################################
function uniq_k93 {
  typeset TRUE="0"
  typeset FALSE="1"
  typeset OLDPREV=""
  typeset FNAME=""
  # If any file arguments were given, read from them (STDIN=FALSE);
  # otherwise read from standard input (STDIN=TRUE).
  typeset STDIN="${1:+${FALSE}}"
  STDIN="${STDIN:-${TRUE}}"

  if (( STDIN == TRUE ))
  then
    getUniqLine_k93 OLDPREV
  else
    for FNAME in "${@}"
    do
      if [[ -f "${FNAME}" ]]
      then      
        exec 0<"${FNAME}"
          getUniqLine_k93 OLDPREV
        exec 0<&-
      else
        print -u 2 "ERROR: \"${FNAME}\" does not exist or is not a regular file."
      fi
    done
  fi
  return 0
}
################################################################
function getUniqLine_k93 {
  nameref OLDPREV="${1}"
  typeset PREV="${OLDPREV}"
  typeset CURR
  IFS=""

  # First call (OLDPREV still empty): read the first line and always print it.
  [[ "_${OLDPREV}" = "_" ]] &&
    read -r -- PREV &&
    print -r -- "${PREV}"

  while read -r -- CURR
  do
    [[ "_${PREV}" = "_${CURR}" ]] && continue
    print -r -- "${CURR}"
    PREV="${CURR}"
  done
  OLDPREV="${PREV}"
  return 0
}
################################################################

uniq_k93 "${@}"

Perl reimplementation from Perl Power Tools

#!/usr/bin/perl -w
# uniq - report or filter out repeated lines in a file

use strict;

my $VERSION = '1.0';

END {
  close STDOUT || die "$0: can't close stdout: $!\n";
  $? = 1 if $? == 255;  # from die
}

sub help {
  print "$0 [-c | -d | -u] [-f fields] [-s chars] [input files]\n";
  exit 0;
}

sub version { print "$0 (Perl Power Tools) $VERSION\n"; exit 0; }

# options
my ($optc, $optd, $optf, $opts, $optu);

sub get_numeric_arg {
  # $_ contains current arg
  my ($argname, $desc, $opt) = @_;
  if    (length) { $opt = $_ }
  elsif (@ARGV)  { $opt = shift @ARGV }
  else           {die "$0: option requires an argument -- $argname\n"}
  $opt =~ /\D/ && die "$0: invalid number of $desc: `$opt'\n";  
  $opt;
}

while (@ARGV && $ARGV[0] =~ /^[-+]/) {
  local $_ = shift;
  /^-[h?]$/    && help();        # terminates
  /^-v$/       && version();     # terminates
  /^-c$/       && ($optc++, next);
  /^-d$/       && ($optd++, next);
  /^-u$/       && ($optu++, next);
  /^-(\d+)$/   && (($optf = $1), next);
  /^\+(\d+)$/  && (($opts = $1), next);
  s/^-f//      && (($optf = get_numeric_arg('f', 'fields to skip')), next);
  s/^-s//      && (($opts = get_numeric_arg('s', 'bytes to skip')), next);

  die "$0: invalid option -- $_\n";
}

my ($comp, $save_comp, $line, $save_line, $count, $eof);

# prime the pump
$comp = $line = <>;
exit 0 unless defined $line;
if ($optf) {($comp) = (split ' ', $comp, $optf+1)[$optf] }
if ($opts) { $comp  =  substr($comp, $opts) }

LINES: 
while (!$eof) {
  $save_line = $line;
  $save_comp = $comp;
  $count = 1;
 DUPS: 
  while (!($eof = eof())) {
    $comp = $line = <>;
    if ($optf) {($comp) = (split ' ', $comp, $optf+1)[$optf] }
    if ($opts) { $comp  =  substr($comp, $opts) }
    last DUPS if $comp ne $save_comp;
    ++$count;
  }
  # when we get here, $save_line is the first occurrence of a sequence
  # of duplicate lines, $count is the number of times it appears
  if    ($optc) { printf "%7d $save_line", $count }
  elsif ($optd) { print $save_line if $count >  1 }
  elsif ($optu) { print $save_line if $count == 1 }
  else          { print $save_line }
}

exit 0;

__END__

=head1 NAME 

uniq - report or filter out repeated lines in a file

=head1 SYNOPSIS

uniq [B<-c> | B<-d> | B<-u>] [B<-f> I<fields>] [B<-s> I<chars>] [I<input_files>]

=head1 DESCRIPTION

The uniq utility reads the standard input comparing adjacent lines and
writes a copy of each unique input line to the standard output.  The
second and succeeding copies of identical adjacent input lines are not
written.  Repeated lines in the input will not be detected if they are
not adjacent, so it may be necessary to sort the files first.

The following options are available:

=over

=item -c

Precede each output line with the count of the number of times the
line occurred in the input, followed by a single space.

=item -d

Don't output lines that are not repeated in the input.

=item -f I<fields>

Ignore the first I<fields> fields in each input line when doing
comparisons.  A field is a string of non-blank characters separated
from adjacent fields by blanks.  Field numbers are one based,
i.e. the first field is field one.

=item -s I<chars>

Ignore the first I<chars> characters in each input line when doing
comparisons.  If specified in conjunction with the B<-f> option, the
first I<chars> characters after the first I<fields> fields will be
ignored.  Character numbers are one based, i.e. the first character is
character one.

=item -u

Don't output lines that are repeated in the input.

=back

If additional arguments are specified on the command line, they are
used as the names of input files.

The uniq utility exits 0 on success or >0 if an error occurred.

=head1 COMPATIBILITY

The historic B<->I<number> and B<+>I<number> options are supported as
synonyms for B<-f> I<number> and B<-s> I<number>, respectively.

This version accepts 0 as a valid argument for the B<-f> and
B<-s> switches; some implementations of uniq do not.

=head1 SEE ALSO

sort(1)

=head1 BUGS

I<uniq> has no known bugs.

=head1 AUTHOR

The Perl implementation of I<uniq> was written by Jonathan Feinberg.

=head1 COPYRIGHT and LICENSE

This program is copyright (c) Jonathan Feinberg 1999.

This program is free and open software. You may use, modify, distribute,
and sell this program (and any modified variants) in any way you wish,
provided you do not restrict others from doing the same.



Recommended Links


uniq - Wikipedia, the free encyclopedia

Emulation of uniq Unix command in ksh93 - AIX Expert

Ad Hoc Data Analysis From The Unix Command Line - Wikibooks, collection of open-content textbooks

Copyright © 1996-2014 by Dr. Nikolai Bezroukov. www.softpanorama.org was created as a service to the UN Sustainable Development Networking Programme (SDNP) in the author's free time. This document is an industrial compilation designed and created exclusively for educational use and is distributed under the Softpanorama Content License. The site uses AdSense, so you need to be aware of the Google privacy policy. Original materials' copyright belongs to their respective owners. Quotes are made for educational purposes only in compliance with the fair use doctrine. This is a Spartan WHYFF (We Help You For Free) site written by people for whom English is not a native language. Grammar and spelling errors should be expected. The site contains some broken links as it develops like a living tree...


Created: May 16, 1997; Last modified: February 19, 2014