
Entries in Linux (14)

Monday, January 23, 2012

Courier IMAPd and Mail.app warnings

After installing an ISPConfig deployment, everything seemed to work properly, but every now and then I got a weird error that there was something wrong with the mail server configuration. Apple Mail.app showed an exclamation mark with the following message:

The server returned the error: The attempt to read data from the server server.domain.ext failed.

Some research showed that Apple mail clients tend to open several connections for IMAP, while the default setting of the Courier IMAPd server is to allow (only) 4 connections from the same IP address.

Modifying the Courier config file (/usr/lib/courier-imap/etc/imapd) to allow e.g. 20 connections from one IP address solved this problem.

<ORIGINAL CONFIG>
##NAME: MAXPERIP:0
#
#  Maximum number of connections to accept from the same IP address

MAXPERIP=4

<MODIFIED CONFIG>
##NAME: MAXPERIP:0
#
#  Maximum number of connections to accept from the same IP address

MAXPERIP=20

If your company or household runs several IMAP mail clients, you may need to increase the limit even further (65536 is the maximum number of connections for any single IP address).

If you have SSL enabled on the Courier IMAPd server you also need to add the MAXPERIP variable to the imap-ssl config file (/usr/lib/courier-imap/etc/imapd-ssl).

Finally, restart the Courier IMAPd services (/etc/init.d/courier-imap restart).
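If you prefer to script the change, something like the following should work. This is just a sketch; it assumes the config paths mentioned above and the presence of a separate courier-imap-ssl init script:

sed -i 's/^MAXPERIP=.*/MAXPERIP=20/' /usr/lib/courier-imap/etc/imapd
grep -q '^MAXPERIP=' /usr/lib/courier-imap/etc/imapd-ssl || echo 'MAXPERIP=20' >> /usr/lib/courier-imap/etc/imapd-ssl
/etc/init.d/courier-imap restart
/etc/init.d/courier-imap-ssl restart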

Saturday, January 7, 2012

Changing SSL Certificates in an ISPConfig v3 Configuration

When you install a Perfect Server based on CentOS and ISPConfig v3.x, the installer creates self-signed certificates for the various components. All these certificates generate warnings in your browser, mail clients, etc. So, time to eliminate those warnings.

First I needed to find out where all those certificates are located, and what their formats are. In my case, there are three services that use SSL/TLS in some form:

  1. Postfix SMTP service
  2. Courier IMAP service
  3. http / Apache2 webservice

Checking the configuration files will reveal their locations.
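For reference, a few grep commands that should reveal the certificate settings for each service; the exact file locations are assumptions based on a standard Perfect Server layout and may differ on your system:

grep -i 'smtpd_tls' /etc/postfix/main.cf
grep -i 'TLS_CERTFILE' /usr/lib/courier-imap/etc/imapd-ssl
grep -i 'SSLCertificate' /etc/httpd/conf.d/ssl.conf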


Thursday, January 5, 2012

Getting ISPConfig to Work on CentOS

This is not a manual describing the installation (or prerequisites) of the ISPConfig software on a CentOS platform; an excellent manual can be found online. It's just that I ran into a problem when I tried to connect an e-mail client to the (IMAP) mail server (controlled by ISPConfig). All the appropriate ports / listeners were up and running, so it had to be a configuration issue.

Googling around didn't solve my problem. My colleague, Xander (@xmoments / xmoments.nl), came to the rescue with the solution:

yum install cyrus-sasl-plain-2.1.23-13.el6.x86_64

This package provides the SASL plugin that handles cleartext passwords between the mail processes. After the installation, the mail went flying across the Interwebs.
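To verify the plugin is present and make the mail daemons pick it up, something along these lines should do; it assumes the saslauthd/Postfix setup from the Perfect Server guide:

rpm -q cyrus-sasl-plain
service saslauthd restart
service postfix restart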

Thursday, November 10, 2011

Upgrading Splunk on Ubuntu Linux

Just a small post with the instructions on upgrading Splunk on Ubuntu Linux.

First, download the Splunk update. The Splunk website also gives you the wget command, which you can use directly on the Linux command line.
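In rough outline, the upgrade then boils down to the following; the download URL is a placeholder for whatever the Splunk site hands you, and /opt/splunk is the default install location:

wget -O splunk.deb '<download URL from the Splunk site>'
sudo /opt/splunk/bin/splunk stop
sudo dpkg -i splunk.deb
sudo /opt/splunk/bin/splunk start --accept-license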


Friday, August 26, 2011

Configuring Syslog-ng on Ubuntu

Syslog-ng is a replacement for the default syslog daemon you get with most Linux distributions. The advantage of syslog-ng is that the configuration is easier to understand, and it gives the sysadmin numerous advantages, especially in complex environments.

Let's say we have a RADIUS environment that is able to send authentication and accounting information through syslog to external devices. And let's assume that a relevant part of this syslog information is needed by a department within a large corporation.

Installing syslog-ng (on Ubuntu) is done by the following command:

# sudo apt-get install syslog-ng

With syslog-ng we can store and/or forward syslog information based on (but not limited to) the following:

  • source IP address
  • destination IP address
  • syslog level
  • content in the original syslog message by using regular expressions.

All this can be configured in the /etc/syslog-ng/syslog-ng.conf file.
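As an illustration, a fragment for the RADIUS scenario above could look roughly like this; the IP address, port and file path are made up for the example:

source s_net { udp(ip(0.0.0.0) port(514)); };
filter f_radius { netmask("192.168.10.5/32") and match("Accounting" value("MESSAGE")); };
destination d_radius { file("/var/log/radius/$HOST/$YEAR$MONTH$DAY.log" create_dirs(yes)); };
log { source(s_net); filter(f_radius); destination(d_radius); };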


Sunday, March 6, 2011

Wireshark on Ubuntu 10.10

Last week, I acquired a small netbook @ work for testing purposes (HP Mini 5103). You don't want to know how handy a second laptop is when you're testing authentication environments.

For flexibility purposes I installed a dual-boot with Microsoft Windows 7 Enterprise and Ubuntu Linux v10.10 from a USB flash drive (1, 2), erasing all the preinstalled HP/McAfee crap. This all worked like a charm, up to the installation of Wireshark in Ubuntu. Starting the sniffer application resulted in an empty list of network adapters. No network adapters means no capturing capabilities.

This was probably related to the lack of rights while starting the application. Running it from the terminal with sudo resulted in some weird error regarding a display-thingy (which is still the main reason why Linux isn't penetrating the desktop market).

While searching the Interwebs, I found the following solution, which, after some alterations to the commands, worked for me:

In a terminal session, execute these commands:

sudo addgroup --quiet --system wireshark (this command simply didn't work in my case)
sudo chown root:wireshark /usr/bin/dumpcap
sudo setcap cap_net_raw,cap_net_admin=eip /usr/bin/dumpcap

sudo usermod -a -G wireshark <my user name>

Reconfigure the wireshark-common package and answer 'Yes' to the question 'Should non-superusers be able to capture packets?'

sudo dpkg-reconfigure wireshark-common (I needed to add the sudo part on this command)
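To verify that the capability bits actually ended up on dumpcap, you can check with getcap (part of the libcap2-bin package); it should list cap_net_admin and cap_net_raw with the eip flags:

getcap /usr/bin/dumpcap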

After rebooting the laptop, Wireshark started normally with all the network interfaces available for capturing traffic.

UPDATE: I did a reinstall of the OS, and this time Wireshark started 'properly' from the Terminal application by typing (without the quotes) 'sudo wireshark'.

Thursday, July 15, 2010

Vanishing HD Space on Popcorn Hour

Sometimes you think your mind is playing tricks on you. I have that sometimes. For example, I have a Popcorn Hour (Networked Media Tank) which holds movies and series I watch. Bits and bytes come and go on that machine. Things you've seen are deleted and replaced by new content. But over the 'years' it seemed to hold less and less content. Okay, movies have increased in size (10-20 GB per movie is nothing nowadays), so I didn't really think much of it...

Until I started transferring the content with Transmit instead of FileZilla. Transmit was configured to show even the hidden files, and hidden files it showed. I found 4 hidden temporary pure-ftpd upload files of almost 32 GB each. The timestamps on those files ranged from late last year to a couple of months back:

.pureftpd-upload-<some random string>

The problem was that I couldn't remove them from the Popcorn, and I really wanted my 120 GB of free space back. It turned out that the FTP daemon on the Popcorn was locking these files; a reboot of the Popcorn didn't help.

The way to remove them was to stop the FTP daemon on the Popcorn through its menu, access the device through SMB (or another protocol you can use), and make sure that you can see hidden files. Select the files, press delete, and they should be gone. After that you can re-enable the FTP service if you like.
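From a Linux box the cleanup could look something like this; the share name and mount point are just examples:

sudo mount -t cifs //popcorn/share /mnt/popcorn -o guest
ls -la /mnt/popcorn | grep pureftpd-upload
sudo rm -f /mnt/popcorn/.pureftpd-upload-*
sudo umount /mnt/popcorn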

Friday, April 2, 2010

Installing & Configuring CentOS 5.4 (Day 2)

Oké. Day 2. After the successful installation and configuration of CentOS with Adobe Coldfusion, I needed to install MySQL as a database. So, I started the virtual machine, and found out why Linux will (probably) never cut it as a common desktop environment.

X11 - No Desktop (screenshot)
Yesterday I (properly) shut down the system (which had the GNOME desktop), and today it started with some back-to-the-60's desktop. Every icon gone. All I was left with was a terminal window, a clock, and a Firefox window. This environment is the basic X11 desktop.
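As for the MySQL part, a minimal sketch of the installation on CentOS 5.4, assuming the stock repositories:

yum install mysql-server
service mysqld start
chkconfig mysqld on
mysql_secure_installation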


Thursday, April 1, 2010

Adobe Coldfusion 9 on CentOS 5.4 (x64)

Updated on Monday, April 5, 2010 at 18:53 by Willem

A while back I figured out how to install Adobe (it could have been Macromedia back then) Coldfusion MX on an Ubuntu Linux server. That config still runs as it should, but in the meantime several things have changed in the world of software. I've been leaning towards CentOS 5.4, and Adobe released Coldfusion 9 somewhere in 2009. These two 'events' made me decide to combine the two.

Before I continue, I must warn you that the Coldfusion installer is still broken... so there's some manipulation of code involved.

Since the new 'server' is going to run in a virtual environment, I started in VMware Workstation. This way I could take snapshots of my progress, which made it easier to start from scratch. The problem with Linux and me is that I tend to reinstall the OS when things go haywire, so VMware is a safe way out in this case.


Tuesday, March 24, 2009

Why Linux Won't Work

.... for me at this moment (and probably for most others).

I'm still in the phase of migrating my Windows server to a more 'reliable' operating system. The new server should cover the following basic functionalities:

  • Filesharing (either via Samba or NFS)
  • Webserver with PHP and Coldfusion
  • SSH server
  • RADIUS Server
  • Central user database (e.g. an LDAP server)
  • an NZB downloader of some sort
  • etc.

Up till now I've tried several Linux distros (Ubuntu 7, Ubuntu 8, and CentOS 5), and none of them are that easy to configure.

It seems that NFS isn't that easy to configure, since Apple OS X requires some special features (standards, anyone??), especially when you want some sort of user authentication. Just do a search on Linux, NFS and Apple OS X.

So after NFS didn't really work, I tried SMB for filesharing. While CentOS has a GUI to configure the shares, it lacks an interface for user permissions...
You create users within the Linux environment, and if you want to connect to a Samba share you need to configure additional users (in a different user database).
This can be done by synchronization, but that requires some scripting etc.
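For a single account, that manual 'synchronization' boils down to something like this (the username is just a placeholder):

useradd <username>
smbpasswd -a <username>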

So far I've managed to install a working SSH server (with x509 authentication) and a Webserver.

When I compare this with Windows... well, there's no real comparison. Windows has a real central user database (not talking about Active Directory here, but the local user database). Every service I need can talk to this database, and every service recognizes this database. No need to start 'hacking' some configuration files to make it work.

Well, as long as there's no new (read: better) interface for configuring those services and combining them with a shared user database ...

LINUX >> /dev/null

Recommendations can be left in the comments (if there are any)......