
Logs Security & Integrity


It is very important that the information that comes from syslog not be compromised. Making the files in /var/log readable and writable by only a limited number of users is a good start.

Be sure to keep an eye on what gets written there, especially under the auth facility. Multiple login failures, for example, can indicate an attempted break-in.
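
A quick way to do this is to count authentication failures directly in the log file. This is only a sketch: the file name differs between distributions (/var/log/secure on Red Hat, /var/log/auth.log on Debian-style systems) and the exact message text depends on the service, so adjust both to your setup.

bash# grep -c "Failed password" /var/log/secure
bash# grep -i "failed" /var/log/secure | tail -20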

Syslog configuration is distribution dependent. Both Red Hat and SUSE use /var/log for storing messages, mail.log, and other logs.

You can find out where your distribution is logging to by looking at your /etc/syslog.conf file. This is the file that tells syslogd (the system logging daemon) where to log various messages.
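
For example, to see only the active rules, stripped of comments and blank lines:

bash# grep -v '^#' /etc/syslog.conf | grep -v '^$'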

Log rotation is also implemented differently; Red Hat uses the logrotate package.
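
As a rough illustration (the stanza below is a generic sketch, not taken from any particular distribution; adjust paths, retention, and permissions to your own policy), a logrotate entry dropped into /etc/logrotate.d/ could look like this:

# Hypothetical /etc/logrotate.d/authlog
/var/log/authlog {
    weekly
    rotate 8
    compress
    missingok
    notifempty
    create 0600 root root
    postrotate
        /bin/kill -HUP `cat /var/run/syslogd.pid 2>/dev/null` 2>/dev/null || true
    endscript
}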

If your log files have been tampered with, see if you can determine when the tampering started, and what sort of things appeared to be tampered with. Are there large periods of time that cannot be accounted for? Checking backup tapes (if you have any) for untampered log files is a good idea.

Intruders typically modify log files in order to cover their tracks, but they should still be checked for strange happenings. You may notice the intruder attempting to gain entrance, or exploit a program in order to obtain the root account. You might see log entries before the intruder has time to modify them.

You should also be sure to separate the auth facility from other log data, including attempts to switch users using su, login attempts, and other user accounting information.

If possible, configure syslog to send a copy of the most important data to a secure system. This will prevent an intruder from covering his tracks by deleting his login/su/ftp/etc attempts. See the syslog.conf man page, and refer to the @ option.
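
A minimal syslog.conf sketch that does both -- it keeps authentication messages in their own file and sends a copy of the most important data to a central log server (the name loghost below is a placeholder for your own logging server):

# keep authentication data separate from the rest
auth.*;authpriv.*                               /var/log/authlog
# send a copy of the most important data to the central log server
auth.*;authpriv.*;*.warn                        @loghost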

There are several more advanced syslogd programs out there. Take a look at http://www.core-sdi.com/ssyslog/ for Secure Syslog. Secure Syslog allows you to encrypt your syslog entries and make sure no one has tampered with them.

Another syslogd replacement with more features is syslog-ng. It gives you a lot more flexibility in your logging and can also hash your remote syslog streams to detect tampering.
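
For orientation only, a minimal syslog-ng configuration along these lines forwards everything to a central server over TCP (loghost and the port are placeholders; check the syslog-ng documentation of your version for the exact option names):

# /dev/log is a datagram socket on Linux; some other platforms use unix-stream()
source src            { unix-dgram("/dev/log"); internal(); };
destination d_loghost { tcp("loghost" port(514)); };
log { source(src); destination(d_loghost); };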

Finally, log files are much less useful when no one is reading them. Take some time out every once in a while to look over your log files, and get a feeling for what they look like on a normal day. Knowing this can help make unusual things stand out.

Log monitoring is now covered in a separate page.  The simplest protection of logs integrity is to use central  logserver.  See Remote Syslog for more information.
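
On the receiving side, the stock sysklogd daemon only accepts messages from the network when it is started with the -r flag. On Red Hat-style systems this is usually set in /etc/sysconfig/syslog (the file name and mechanism are distribution dependent):

# /etc/sysconfig/syslog on the central loghost
# -r  accept remote messages, -m 0  disable periodic mark lines
SYSLOGD_OPTIONS="-r -m 0"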

The information that is routinely collected about specific events occurring within information technology (IT) systems and networks can be used by organizations to improve the security of their operations. This information is recorded as an entry in a log, and each log entry can be linked to one or several events. Log entries provide valuable information to personnel responsible for the operations and security of systems.  

Log entries are recorded by OS components and applications. The entries containing information related to system security are produced by several sources. Some log entries are created by security software, such as antivirus software, firewalls, intrusion detection systems, and intrusion prevention systems. Other sources of security-related log entries are the operating systems on an organization’s servers, workstations, and networking equipment, and the applications on the systems.

NIST’s Information Technology Laboratory recently issued Special Publication (SP) 800-92, Guide to Computer Security Log Management, by Karen Kent and Murugiah Souppaya, to help organizations develop, implement, and maintain effective processes for managing logs with security-related information. It contains basic information about computer security logs, the usefulness of these logs, and the challenges of managing them. Briefly mentioned are the components of the log management infrastructure; the planning processes that enable the organization to carry out consistent, reliable, and efficient log management practices; and the operational processes that aid organizations in successfully managing logs.  See  http://csrc.nist.gov/publications/nistpubs/index.html.

Logs can be used for many purposes:  

  1. optimizing system and network performance and recording the actions of users;

  2. identifying security incidents, policy violations, fraudulent activities, and operational problems;

  3. performing audits and forensic analyses;

  4. supporting internal investigations;

  5. establishing baselines; and

  6. identifying operational trends and long-term problems.

 NIST’s guide focuses on helping organizations manage the use of logs to improve IT security. While many logs are created by IT systems and could provide data that is useful for security, NIST SP 800-92 focuses on the logs that are closely related to computer security. For example, audit logs track user authentication attempts, and security device logs record possible attacks on systems. In managing computer security-related log data, organizations have to create, transmit, store, analyze, and dispose of the data correctly. The computer security records should be stored in sufficient detail for an appropriate period of time and be available for routine log analysis. Federal organizations have to take into account the requirements of laws, regulations, and organizational policies; for example, they may need to analyze log information to demonstrate compliance with applicable federal legislation and regulations.

One of the challenges to the effective management of computer security logs is balancing the availability of large amounts of log information with the limited availability of organizational resources for analysis of the data. A large amount of information is collected daily by a large number of logs, and there are increasing threats to networks and systems. Organizations could realize benefits in using the data to reduce risks, but the staff time and resources needed to perform the analyses and to manage the log information have to be taken into consideration. 

The large number of log information sources may produce inconsistent and incompatible content, formats, and time stamp information, making it difficult for analysts to understand the meaning of the data collected. Organizations may have to utilize automated methods to convert logs with different content and formats to a single standard format with consistent data field representations.

Another challenge is protecting the confidentiality, integrity, and availability of log information. Information such as users’ passwords and the content of e-mails may be captured by logs. This raises security and privacy concerns involving both the individuals that review the logs and others that might be able to access the logs through either authorized or unauthorized means. Logs that are secured improperly in storage or in transit might also be susceptible to alteration and destruction by both intentional and unintentional techniques. As a result, malicious activities might go unnoticed and evidence could be manipulated to conceal the identity of a malicious party. 

Log information should be analyzed on a regular basis and in a timely fashion by security, system, and network administrators. These staff members need support for their exacting tasks. They especially need training on how to carry out the log analysis procedures and how to prioritize their activities effectively. They should also be provided with tools that can automate portions of the analysis process, such as scripts and security software tools. These tools can be helpful in finding patterns that humans cannot easily perceive, such as correlating entries from multiple logs that are related to the same event.  Analysis of logs by staff members has to be an ongoing activity so that organizations can predict future problems and prevent them. In the past, many logs have not been analyzed in a timely manner. When organizations do not institute sound processes for analyzing logs, the value of the logs is significantly reduced.

Organizations also need to protect the availability of their logs. Many logs have a maximum size; for example, the software is limited to storing the 10,000 most recent events, or keeping 100 megabytes of log data. When the size limit is reached, the log might overwrite old data with new data or completely stop collecting log information.  Both of these outcomes result in the loss of availability of log data. To meet data retention requirements, organizations might need to keep copies of log files for a longer period of time than the original log sources can support. It may be necessary to establish processes to archive the log information. 

Because of the volume of logs and the costs of archiving log data, it can be appropriate in some cases to reduce the logs by filtering out log entries that do not need to be archived.  The confidentiality and integrity of the archived logs also need to be protected.

NIST recommends that organizations carry out the following actions for more effective and efficient log management processes: 

Establish policies and procedures for log management. Organizations should develop standard processes for performing log management. In the planning process, logging requirements and goals should be defined. Based on those goals and requirements, an organization can then develop policies that clearly define mandatory requirements and suggested recommendations for log management activities, including log generation, transmission, storage, analysis, and disposal. An organization should also ensure that related policies and procedures incorporate and support the log management requirements and recommendations. The organization’s management should provide the necessary support for the efforts involving log management planning, policy, and procedures development.

Policies and procedures help to assure a consistent approach and implementation of laws and regulatory requirements throughout the organization. Audits, testing, and validation procedures can help to assure that the logging standards and guidelines are being followed. Requirements and recommendations for logging should be created in conjunction with an analysis of the technology and resources needed to implement the log management process. Generally, organizations should require logging and analysis of the data that is of the greatest importance, and should also have non-mandatory recommendations for the other types and sources of data that should be logged and analyzed if time and resources permit. In some cases, organizations can choose to have all or nearly all of their log data generated and stored for at least a short period of time in case it is needed. This policy gives greater weight to security considerations than to usability and resource usage, and it can also support better decision making in some cases. When establishing requirements and recommendations, organizations should strive to be flexible, since each system is different and will log different amounts of data than other systems within the organization.

The organization’s policies and procedures should also address the preservation of original logs. Many organizations send copies of network traffic logs to centralized devices. In addition, they may use tools that analyze and interpret network traffic. In cases where logs may be needed as evidence in proceedings, organizations may wish to acquire copies of the original log files, the centralized log files, and interpreted log data. This policy is useful in case there are any questions regarding the fidelity of the copying and interpretation processes. Retaining logs for evidence may involve the use of different forms of storage and different processes, such as putting additional restrictions on access to the records.

Prioritize log management appropriately throughout the organization. After an organization defines its requirements and goals for the log management process, it should then prioritize its requirements and goals based on the organization’s perceived reduction of risk and the expected time and resources needed to perform log management functions.  An organization should also define roles and responsibilities for log management for key personnel throughout the organization, including establishing log management duties at both the individual system level and the log management infrastructure level.

 Create and maintain a log management infrastructure. A log management infrastructure consists of the hardware, software, networks, and media used to generate, transmit, store, analyze, and dispose of log data. Log management infrastructures normally perform several functions that support the analysis and security of log data. After establishing an initial log management policy and identifying roles and responsibilities, an organization should develop one or more log management infrastructures that can effectively support the policy and roles. Organizations should consider implementing log management infrastructures that include centralized log servers and log data storage. When designing infrastructures, organizations should plan for both the current and future needs of the infrastructures and the individual log sources throughout the organization. Major factors that should be considered in the design include the volume of log data to be processed, network bandwidth, online and offline data storage, the security requirements for the data, and the time and resources needed for staff to analyze the logs.

Provide proper support for all staff with log management responsibilities. To ensure that log management for individual systems is performed effectively throughout the organization, the administrators of those systems should receive adequate support.  This should include disseminating information to log management staff, providing training, designating points of contact to answer questions, providing specific technical guidance, and making tools and documentation available.

Establish standard log management operational processes. The major log management operational processes include configuring log sources, performing log analysis, initiating responses to identified events, and managing long-term storage. In addition, administrators have other ongoing responsibilities, such as keeping the logging software up to date and ensuring that the clocks of all log sources remain synchronized.

 Several other NIST publications, available from the same NIST web site, also support log management processes.

 

NEWS CONTENTS

Old News

Securing Linux, Part 2

  1. System logs and logging servers
    Moving your logging server to a separate, isolated machine adds another level of security to your system; replacing the logging utility with one of its secure counterparts makes for an even better solution.

    Secure logging
    Use a separate logging server for syslog services. This could be a spare box with sufficient disk space for all of your logging needs. There should be no other services running on this system. This will severely limit an intruder's ability to penetrate the logging system and tamper with the logs. Periodically writing the logs to write-once media, for those who have such devices, is also useful to thwart attempts to tamper with logs.

    Secure syslog
    Even with separate log servers, the stock syslog utility is rather insecure. While it is true that prevention is the first line of defense, detection is always the second.

    Bruce Schneier has presented ideas for creating a secure log server in which cryptographic signatures would be incorporated into the logs. This would ensure that intrusions cannot go undetected, even if the logs are tampered with.

    A secure log server replacement for syslog, called ssyslog, is available from the Core SDI site (see Resources). This daemon implements cryptographic security on the logs and enables remote auditing.

    From the Core SDI page on Secure Syslog:

    Designed to replace the syslog daemon, ssyslog implements a cryptographic protocol called PEO-1 that allows the remote auditing of system logs. Auditing remains possible even if an intruder gains superuser privileges in the system, [as] the protocol guarantees that the information logged before and during the intrusion process cannot be modified without the auditor (on a remote, trusted host) noticing.

    syslog-ng
    Another secure alternative to syslog is syslog-ng, the "next generation syslog," available from BalaBit.

    This is a more configurable logging daemon, and it includes cryptographic signatures to detect log tampering. It also allows for directing log traffic based on facility and level, as well as on content and use of regular expressions.

    Remote logging servers, in combination with cryptographically secure logging servers and secure remote audit capability, can make log tampering extremely difficult to perform -- and very easy to detect.

Securing Linux Part 2

Fine tuning your system logs

There are two main logging daemons running on your Linux machine -- "klogd"
and "syslogd". "klogd" is the kernel logging daemon and "syslogd" is the
logging daemon for system related services. The configuration file for
syslog is /etc/syslog.conf. Let's fine-tune this file to get syslogd to record
more useful information in the logs. This additional information can
always be useful in case of a break-in or if there is any other malfunction
of a particular service.

Our /etc/syslog.conf looks as follows.

# Start of the /etc/syslog.conf file
# Monitor authentication attempts
auth.*;authpriv.* /var/log/authlog
#----------
#--> This is to dump all the authentication-related output to the
#file /var/log/authlog.
#----------

# Monitor all kernel messages
kern.* /var/log/kernlog
#--------------
# --> This is to dump all the kernel related messages to the
#file /var/log/kernlog.
#--------------

# Monitor all warning and error messages
*.warn;*.err /var/log/syslog
#-------------
# --> All the errors and warning messages are appended to the file
#/var/log/syslog.
#-------------
# End of /etc/syslog.conf

Most distributions have only one or two log files where all the
information regarding authentication messages, error messages, and kernel
messages is stored. We would certainly like to clean this up and have
different kinds of information stored in specific files. The advantage of
splitting the logged information across several files is that it is easier
to sort through the data at retrieval time, since all the data related to
one particular activity is present in one file only. Having made the
changes to the syslog.conf file, make sure you create an empty file of 0
bytes (touch filename) for each file that is to be logged to. All files
mentioned in /etc/syslog.conf should be created at the locations and with
the filenames mentioned there, as shown below.
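
For the example configuration above, that means something like this before restarting syslogd:

bash# touch /var/log/authlog /var/log/kernlog /var/log/syslog
bash# chmod 640 /var/log/authlog /var/log/kernlog /var/log/syslog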

Only having logs in place to record all the activity is not enough; a good
system administrator also has to think of the devious minds at work trying
to erase all signs of suspicious activity on your machine. This
requires you to ensure that file permissions are properly set.

Restrict access to log directory and syslog files for normal users using
the following commands.

bash# chmod 751 /var/log /etc/logrotate.d
bash# chmod 640 /etc/syslog.conf /etc/logrotate.conf
bash# chmod 640 /var/log/*log

We have briefly covered many aspects that will help you make your Linux box
more secure. We hope in time you develop a concept of security. There are
a few more articles lined up in this series, but the contents of those
depend on the feedback that we get from you. Contact us.

Linux.com Securing a fresh Linux install, part 2

System logs

In order to trace any unwanted activity on your computer, you should keep complete and accurate logs. On Linux machines, logging is handled by the syslog daemon, syslogd. Syslogd reads its configuration from the /etc/syslog.conf file. You can set the facilities to be logged, the log priority, and the files in which to log information here. The default values in most distributions do not give you enough information.

A sensible log policy is to log almost everything in /var/log/messages and /var/log/syslog and then to have each individual facility log to its own separate file, as shown in the example below:

--- Begin Example syslog.conf-----

# Log anything 'info' or higher, but lower than 'warn'.
# Exclude mail.  This is logged elsewhere.
*.info;*.!warn;mail.none                                -/var/log/messages

# Log anything 'warn' or higher.
# Exclude mail.  This is logged elsewhere.
*.warn                                                  -/var/log/syslog

# Debugging information is logged here.
*.=debug                                    	        -/var/log/debug

# Kernel related logs:
kern.*  						-/var/log/kernel

# Private authentication message logging:
authpriv.*                                              -/var/log/secure

# Cron related logs:
cron.*                                                  -/var/log/cron

# Mail related logs:
mail.*                                                  -/var/log/maillog

# Daemon related logs:
daemon.*                                                -/var/log/daemonlog

# User related logs:
user.*                                                  -/var/log/userlog

# Mark logs:
mark.*                                                  -/var/log/marklog

# Emergency level messages go to all users consoles:
*.emerg                                                 *

--- End Example syslog.conf-----

Note the dash before the log files' names. This tells syslogd not to sync after every log. The disadvantage of this is that log information may be lost in the event of a system crash. Removing the dash, however, can cause a performance loss with heavy logging.

If you want to be able to track logs in real time, you can open a log file using the command tail -f /var/log/messages. Alternatively you can have a permanent log console by adding the line

*.* /dev/tty8

to the end of your syslog.conf file. This displays logs in real time on /dev/tty8. (Be sure that tty8 exists, of course!)

In order to keep accurate logs, ensure that your system clock is accurate at all times. You should look to using Network Time Protocol (NTP) to maintain your system clock's accuracy. The easiest way to do this is to regularly run ntpdate some.time.server from a cron job.
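
A crontab entry along these lines is enough (the server name is a placeholder -- use a pool server close to you or your own internal time server, and check the path to ntpdate on your distribution):

# sync the clock once an hour; -s sends ntpdate's output to syslog
0 * * * *    /usr/sbin/ntpdate -s pool.ntp.org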

Finally, although it's not really related to your system logs, make sure you redirect root's mail to your normal user account so you don't miss any important warning mail messages sent to root. You should do this either by placing the line:

root: [email protected]

in /etc/aliases and running the command newaliases, or, alternatively, by creating a file named .forward in /root containing the address you want mail to be forwarded to.

Using Stunnel for logs encryption

Stunnel -- Universal SSL Wrapper

Stunnel is a program that allows you to encrypt arbitrary TCP connections inside SSL (Secure Sockets Layer) available on both Unix and Windows. Stunnel can allow you to secure non-SSL aware daemons and protocols (like POP, IMAP, LDAP, etc) by having Stunnel provide the encryption, requiring no changes to the daemon's code.
- quoted from www.stunnel.org

I have syslog collection hosts in two datacenters that forward using TCP to a central loghost. The TCP connections from each collection host are port forwarded over stunnel.

The collection hosts in each datacenter collect the logs via UDP, so no special software is necessary on any hosts other than the collection hosts.

I wanted the reliability of TCP but know how easy it can be to hijack or otherwise disrupt a TCP stream between two hosts (if you're in the right place). Stunnel solves this for me by making the stream tamper proof. Sure they can block the packets, but they can't easily modify them en route.

I set up my tunnel like this on the satellite servers:

stunnel -c -d 5140 -r loghost:5140

Then I have syslog-ng write to the stunnel port on localhost:

destination loghost {
	tcp("127.0.0.1" port(5140));
};
log {
	source(src);
	destination(loghost);
};

The central loghost listens on port 5140 and redirects that connection to port 514, where syslog-ng is listening:

stunnel -p /etc/stunnel/stunnel.pem -d 5140 -r 127.0.0.1:514
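
The stunnel.pem referenced above is a combined private key and certificate. If you do not already have one, a self-signed certificate can be generated with OpenSSL roughly as follows (adjust the validity period and fill in the certificate subject when prompted):

bash# openssl req -new -x509 -days 365 -nodes \
        -out /etc/stunnel/stunnel.pem -keyout /etc/stunnel/stunnel.pem
bash# chmod 600 /etc/stunnel/stunnel.pem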

Configuring syslog.conf

Don't use spaces; use tabs when configuring syslog.conf. After making changes, send syslogd a HUP signal (kill -HUP <pid of syslogd>) so it rereads syslog.conf.
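
For example (the pid file is /var/run/syslogd.pid on most Linux systems and /var/run/syslog.pid on FreeBSD):

kill -HUP `cat /var/run/syslogd.pid`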

syslog.conf FreeBSD example from manpage:

EXAMPLES
A configuration file might appear as follows:

# Log all kernel messages, authentication messages of
# level notice or higher, and anything of level err or
# higher to the console.
# Don't log private authentication messages!
*.err;kern.*;auth.notice;authpriv.none /dev/console

# Log anything (except mail) of level info or higher.
# Don't log private authentication messages!
*.info;mail.none;authpriv.none /var/log/messages

# Log daemon messages at debug level only
daemon.=debug /var/log/daemon.debug

# The authpriv file has restricted access.
authpriv.* /var/log/secure

# Log all the mail messages in one place.
mail.* /var/log/maillog

# Everybody gets emergency messages, plus log them on another
# machine.
*.emerg *
*.emerg @arpa.berkeley.edu

# Root and Eric get alert and higher messages.
*.alert root,eric

# Save uucp and news errors of level crit and higher in a
# special file.
uucp,news.crit /var/log/spoolerr

# Pipe all authentication messages to a filter.
auth.* |exec /usr/local/sbin/authfilter

# Save ftpd transactions along with mail and news
!ftpd
*.* /var/log/spoolerr

# Log all security messages to a separate file.
security.* /var/log/security

# Log all writes to /dev/console to a separate file.
console.* /var/log/console.log

# Log ipfw messages without syncing after every message.
!ipfw
*.* -/var/log/ipfw
LinuxSecurity -- Secure Logging

Posted By: Chris Parker
11/14/2000

msyslog: For a cracker to successfully hide her intrusion, she must edit the logs.

When a cracker breaks into a computer, the first step to covering his tracks is to delete the log entries that show anything suspicious. If the logs are edited well and not much is done to the system, it may be months before a system administrator notices that the system has been cracked, or it may never be noticed at all. Because of the importance placed on log files to report what is going on in a system, and because of the ease of editing them, they do not help in detecting intrusions as much as they should.

Enter msyslog, an obvious answer to the problem of logs not helping in intrusion detection. Msyslog is a syslogd and klogd replacement that encrypts and hashes the log files. With msyslog, crackers will need significantly more time to hide their tracks, time that they probably do not have. While a cracker can still delete the log files altogether, that is a pretty big sign that the box has been broken into, something they don't want.

Configuration

First, get the msyslog software from the Core SDI site. After unzipping and untarring it, read the README and INSTALL files. Then, edit the modules.conf file to something similar to this:

  UNIX=static
  BSD=
  LINUX=static
  UDP=
  CLASSIC=static
  PEO=static
  REGEX=static
  MYSQL=
  PGSQL=

The module names have the following meanings:

  UNIX    -- receive input from /dev/log.
  BSD     -- receive input from the special BSD logging device, /dev/klog.
  LINUX   -- receive input from the special Linux kernel logging device.
  UDP     -- receive input from other systems on a specific port.
  CLASSIC -- the output tasks that syslogd normally performs.
  PEO     -- hash the logs using the PEO-1 and L-PEO algorithms.
  REGEX   -- allow output redirection based on a set of regular expressions.
  MYSQL   -- output the logs into a MySQL database.
  PGSQL   -- output the logs into a PostgreSQL database.

Now run:

  ./configure --prefix=/usr/local

Installation

For installation, run:

  make clean;make;make install

Setup

After installing msyslog, directions will be given to edit /etc/rc.d/init.d/syslog. After editing and saving it, remove the klogd start-up and shut-down steps, since msyslog can log kernel messages itself. Now move the binary into place with this command:

  mv /usr/local/sbin/syslogd /sbin/syslogd

Assuming everything worked correctly so far, /etc/syslog.conf must be edited. The changes to syslog.conf will be minimal if all that is needed is encryption and hashes of the log files. To do this, these two lines:

  *.info;mail.none;authpriv.none  /var/log/messages
  authpriv.*                      /var/log/secure

become

  *.info;mail.none;authpriv.none %peo -l -m md5 -k /var/syslog/.var.log.messages.key %classic /var/log/messages
  authpriv.* %peo -l -m md5 -k /var/syslog/.var.log.secure.key %classic /var/log/secure

The second set of lines protects the log files with the keys kept under /var/syslog and MD5 hashes of their contents. Now the keys must be generated. Make the keys for the above example like this:

  /usr/local/sbin/peochk -g -f /var/log/messages -i messagekey0 -m md5
  /usr/local/sbin/peochk -g -f /var/log/secure -i securekey0 -m md5

The keys messagekey0 and securekey0 should be stored in a very safe place, like a CD.

Start

After this, kill both klogd and syslogd, then start msyslog using the start-up script:

  /etc/rc.d/init.d/syslog start

Integrity Test

If there is a possibility that someone has been messing with the logs, run this to check their integrity:

  /usr/local/sbin/peochk -m md5 -i messagekey0 -f /var/log/messages
  /usr/local/sbin/peochk -m md5 -i securekey0 -f /var/log/secure

If something comes up, chances are much better than not that the logs have been doctored and the system administrator has a really big problem.

More Information

While there isn't a lot of information (read: none, as far as I can tell) about msyslog setup and use elsewhere, the mailing lists that Core SDI provides for msyslog discussion and help are useful, and msyslog itself comes with excellent documentation. The im_linux.8, om_mysql.8, om_peo.8, om_regex.8, peochk.8, syslog.conf.5, and syslogd.8 man pages more than fill the void of outside documentation.


Iplog

iplog 1.4 -- freshmeat

http://www.ojnk.org/~eric/


iplog is a collection of daemons that log tcp, udp, and icmp traffic. It has features not available in other traffic logging programs, including detecting 'stealth' scans used by port scanners such as nmap, protection against SYN floods, and logging of remote user information.

Changes: Fixed strange byte ordering problems, added some things to avoid a port scan DoS, now logfile can be specified. Consider this the final version.

Autobuse is a Perl daemon which identifies probes and the like in logfiles and automatically reports them via email. It is, in a way, the opposite of logcheck, in that autobuse identifies known badness and deals with it automatically, while logcheck identifies known goodness and leaves you with the rest.


Random Findings

FreeSoft

modular syslog

Secure Syslog is a cryptographically secure system logging tool for UNIX systems. Designed to replace the syslog daemon, ssyslog implements a cryptographic protocol called PEO-1 that allows the remote auditing of system logs. Auditing remains possible even if an intruder gains superuser privileges in the system, as the protocol guarantees that the information logged before and during the intrusion process cannot be modified without the auditor (on a remote, trusted host) noticing.

Ssyslog is the ultimate tool for system logs auditing and is designed to constitute a valuable tool in intrusion detection processes. Ssyslog was developed in the research labs of CORE SDI S.A., and is now placed in the public domain.



