Parahanga

Posted on Aug 16 2020

My journey into photography (and life) has gone through many successive waves.

I started learning photography on 35mm film. In my mid-twenties, I found my passion in digital photography when it was fresh on the scene. It offered a new paradigm for photography, moving from a chemical process to an electrical (digital) one. It greatly reduced the barrier to entry: it didn’t require a darkroom; all that was required was a digital camera and a computer (and now, just a digital camera). I loved the simplicity, and when I immigrated to New Zealand, I dived into photography (I even toyed with the idea of turning professional!). My website here at vonnagy.com even ranked first for ‘new zealand photography’, gathering up to 35,000 visitors a month in the early 2000s.

This was a huge period of growth for me as an artist. During this time, I made friends from all over the world and had the opportunity to meet many of them in real life. I travelled to Australia, Canada, Poland, Portugal, and Ireland to meet other photographers. Many of these people, such as my friend Troy Moth, have become close friends.

But after several years, my passion for photography dried up. I was also deeply vested in the startup Online Republic, which was taking up as much as 80 hours or more of my time. In 2007, I took my digital camera on a much-needed holiday to Europe. I remember taking some photos in beautiful Northern Ireland. After that, I didn’t pick up my digital camera for a long time. In fact, those photos from Northern Ireland are still residing on an unviewed flash card somewhere! My WordPress blog also got hacked via a comment plugin I had, which started pumping heaps of spammy links into my site and others. I eventually fixed it, but the damage was already done; my site went from 35,000 visitors a month to about 35. I just figured my time with photography had run its course. It was nearly 2 years before I picked up a camera again.

Then, I heard that Polaroid was no longer going to manufacture their film. That caught my attention.

I bought a Polaroid camera in an auction and took a road trip to the central North Island with my girlfriend to pick it up. After the first photograph I took (which was literally a chicken that had crossed a road), I was hooked. Having no preview of how the film would turn out, along with the quirky nature of Polaroids, gave me unexpected joy in shooting photographs again. I realised that it was the medium of digital photography that had sapped my passion, not photography itself. Film renewed my passion.

With my website in shambles (though I later restored it), I decided to create a new website for my second life as a photographer. I toyed with many new ideas, but the answer came unexpectedly from my father. My father is a photographer as well; he’s taken nature photographs his whole life and has never deviated from that course. When he visited me in New Zealand, he would always point out the rubbish on the side of the road or the flotsam on the beach. In his view, all I photographed was ‘junk’.

At the time, I was thinking about the Austrian word for rubbish (in dialect, ‘Klumpert‘). But since I was in New Zealand, I tried to find a local word with the same meaning. I found the word ‘Parahanga‘ in the online Maori Dictionary and thought it was a suitable reference for my dive into analogue photography:

parahanga 1. (noun) rubbish, litter, scraps, rubbish dump, pollution.

So in 2009, I launched Parahanga.com and started my commitment to analogue photography. This was a deep dive. During this time, the process of learning was very zen-like. Analogue photography forces you to slow down and be patient. You contemplate carefully. In the case of Polaroid film, there are 10 or fewer prints in a pack, it is expensive, and the fickle nature of expired film often means that not every exposure will turn out. Sometimes whole packs could be ruined. But the results could be other-worldly. And unlike digital, completely tangible: you could hold them in your hand (not on a computer screen or iPhone).

My success with Online Republic allowed me to buy all sorts of crazy and wacky cameras. Although I didn’t have space in my home for a darkroom, I soon learned how to at least develop negatives, including colour C-41 negatives, in my kitchen. It was a labour of passion, and I loved that I was involved in nearly every aspect of photography. The process didn’t stop at the rendering of light by a CMOS chip or abstract 0s and 1s that could be manipulated in Photoshop. It was a physical object, meaning no layer of abstraction between me and the final image.

In addition, I found the community of analogue photographers very endearing; I was quite involved on Flickr and Polanoid, and on the latter I won Shot of the Day six times. I relished the days I could go out and take photos again.

But all things change. In 2013, I made only 3 posts on my Parahanga.com site. In early 2014, my mum had a very serious auto accident, which shook my world. Very fortunately for us, my mum turned out to be okay, but we decided that we should spend at least part of the year overseas. Photography took a back seat. I published my last post on Parahanga in June of that year. Over the next couple of years, I sold the majority of my cameras and packs of film.

Previously I had stopped photography because of a lack of passion; now I didn’t have the time. Life happens; however, life does come around in waves.

It’s August of 2020, and I am currently overseas waiting to come back to New Zealand. My wife (my girlfriend when I bought my first Polaroid camera) and I have decided, after trying hard to balance two countries for the last 5 years, to return home to New Zealand for good. I had a look at parahanga.com the other day and decided to migrate it away from Google’s corporate blogging platform to a WordPress site. I am glad I did not let parahanga.com expire; a common practice among search engine spammers is to re-acquire old domains and dress them in spam until the domain is not worth anything anymore. I am not sure if I will continue my photography on parahanga.com, but at least I can keep it from deteriorating on the web.

Although I am still months away from reaching my camera and film, there is still fuel in my tank for photography. Let’s see what happens.


Quick guide to Nginx and the Swift Perfect server on your local Linux machine.

Posted on May 02 2020

The goal is to use the Swift Perfect server for the default “hello world” template, but let Nginx serve the static files.

I am currently testing out the Swift Perfect server (from [Perfect.org][1]) together with Nginx. The idea here is to let Swift Perfect serve the dynamic stuff, and Nginx serve the static stuff (like images, CSS, etc.).

This quick & dirty guide assumes the following:

  • You’ve got [Nginx][2] installed and running.
  • You’ve got [Swift][3] installed and working.
  • You’ve got [Perfect Server][4] installed and running.

This is for a local configuration: since I don’t have a domain yet, I want to run it locally. So open the hosts file

sudo nano /etc/hosts

and add this:

127.0.0.1       swift.local

This will allow you to access swift.local on your laptop or desktop computer.
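To confirm the entry resolves before going further, you can query the local resolver (a quick sanity check; getent is standard on Linux):

getent hosts swift.local

It should echo back 127.0.0.1 followed by swift.local.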

Let’s create a directory for the static files (e.g. images/PDFs/CSS) that your Nginx server will serve.

mkdir -p /var/www/swift.local/html/images

The -p flag makes sure that all directories along the way are created. Let’s download an image. I [used this image][5], renamed it swift.jpeg, and placed it in the images folder from above. I set the permissions to 0755:

sudo chmod 0755 /var/www/swift.local/html/images/swift.jpeg

Now that we have a directory structure, let’s create a virtual hosts file. In my Nginx setup, I go to the conf.d directory and create a file called swift.local.conf:

sudo nano /etc/nginx/conf.d/swift.local.conf

and add the following:

server {
        listen 80;

        root /var/www/swift.local/html;
        server_name swift.local www.swift.local;

        access_log /var/www/swift.local/access.log;
        error_log /var/www/swift.local/error.log;

        location / {
                #swift perfect server access
                proxy_pass        http://localhost:8181;
                proxy_set_header  X-Real-IP  $remote_addr;
        }


        # serve static files

        location ~ ^/(images|javascript|js|css|media|static)/  {
          root /var/www/swift.local/html;
          expires 30d;
        }


}

What’s going on here:

  • Our Nginx server is listening on port 80. If we were on a remote live server, I would change this to 443 for HTTPS.
  • I’ve stuck my access/error logs just above the root directory.
  • location / : this is where Swift does its magic; the proxy pass goes to port 8181, which is the default Perfect server port. All the dynamic goodies will be here.
  • Finally, the static files are all served by Nginx and given a 30-day caching expiry.

Once you’ve saved this, try the following commands. The first will test your Nginx configuration; if anything is wrong with the virtual host you just set up, it should show here. The second will reload Nginx.

sudo nginx -t 
sudo systemctl reload nginx
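If you want to check the virtual host before the Swift side is running, a quick curl against the new hostname should already return a 502 from Nginx (since nothing is listening on port 8181 yet):

curl -i http://swift.local/

A 502 Bad Gateway here is actually a good sign: it means Nginx picked up the server block and tried to reach the upstream.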

Finally, create a new PerfectHTTP project anywhere on your local machine. I have built mine in /home//swift/, but you can build it anywhere that makes sense to you. Run the following commands to create a starter project:

git clone https://github.com/PerfectlySoft/PerfectTemplate.git
cd PerfectTemplate
swift build

Now go and edit the main.swift file. There are 2 changes to make:

  1. Add the image code to the response body.
  2. Remove or comment out the second route handler, because serving static files is the work we want Nginx to do, not PerfectHTTP:

    //routes.add(method: .get, uri: "/**", handler: StaticFileHandler(documentRoot: "./webroot", allowResponseFilters: true).handleRequest)
Here’s the main.swift boilerplate file after the edits:

import PerfectHTTP
import PerfectHTTPServer

func handler(request: HTTPRequest, response: HTTPResponse) {
    response.setHeader(.contentType, value: "text/html")
    response.appendBody(string: "<html><title>Hello, world!</title><body>Hello, world! |<img src='/images/swift.jpeg'></body></html>")
    response.completed()
}

var routes = Routes()
routes.add(method: .get, uri: "/", handler: handler)

try HTTPServer.launch(name: "localhost",
                      port: 8181,
                      routes: routes,
                      responseFilters: [
                        (PerfectHTTPServer.HTTPFilter.contentCompression(data: [:]), HTTPFilterPriority.high)])

Almost there! The final step is to rebuild the project, then launch PerfectHTTP:

    swift build
    .build/debug/PerfectTemplate

Now, if all went well, you can point your browser at http://swift.local and see the “Hello World” served by PerfectHTTPServer and the Swift image served by Nginx!

Quick note:

Another way to test how Nginx is handling images is to simply stop the Perfect server. On the homepage you should get a 502 error from Nginx (because it can’t reach the Swift server at port 8181, which we stopped). However, you should still be able to see the image at http://swift.local/images/swift.jpeg.
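You can also confirm the 30-day expiry is being applied by inspecting the image’s response headers (curl -I issues a HEAD request):

curl -I http://swift.local/images/swift.jpeg

Look for a 200 response along with Expires/Cache-Control headers set roughly 30 days out.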

I had an issue where I couldn’t get any routes working other than root:

     routes.add(method: .get, uri: "/", handler: handler)

For instance, this arbitrary route was giving me an Nginx 404:

    routes.add(method: .get, uri: "/scooby/dooby/doo", handler: handler)

The PerfectHTTPServer code looked good; it turned out I had this try_files directive under my Nginx server block:

        location / {
                #swift perfect server access
                proxy_pass        http://localhost:8181;
                proxy_set_header  X-Real-IP  $remote_addr;
                try_files $uri $uri/ =404;
        }

The “try_files $uri $uri/ =404;” line in Nginx seemed to supersede Perfect’s routing. I deleted the offending line and my Perfect routing worked again. Going forward, error handling like 404s should be done by Perfect, not Nginx.
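With the try_files line removed and Nginx reloaded, a quick way to confirm the fix without a browser (assuming the arbitrary route above is still registered) is:

curl -i http://swift.local/scooby/dooby/doo

This should now return the handler’s “Hello, world!” response rather than the Nginx 404.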

Exporting from CSV format into Markdown – the long way.

Posted on Apr 03 2020

{… or how to export from the Textpattern CMS the hard way}

Kraxn.io recently migrated from Textpattern to a static site generator. *

This was a journey that didn’t happen overnight. In fact, as I couldn’t find any out-of-the-box solutions, it took a couple of weeks before I took a stab at it again. I had over 350 posts, so I could either do it manually or figure out a more automated way. Here are the steps I took:

  • Export a CSV from MySQL (you can use the MySQL command line, but I used phpMyAdmin’s export function since I had it installed).
  • Convert each row of the CSV into a markdown (.md) file. Each row represents a blog entry.

Before I decided on this path, I checked whether I could just do a quick & dirty download with wget, e.g.:

wget -mkEpnp http://example.org

Unfortunately, there was too much “dirty HTML” to clean up after running it through a markdown converter. There are probably ways of doing this using regex, and perhaps even the PHP strip_tags function (in hindsight this could have been a possibility), but I would have had to put in some effort to know all the HTML tags I used on the site. So I opted for the CSV export method.

Exporting blog data as CSV.

From phpMyAdmin:

  • Log in to phpMyAdmin.
  • Open the database you use and click on the table you wish to export.
  • Click export: method = quick; format = CSV ** and download.

If you want to export from the MySQL command line, I suggest you try this link.

Generate md files from CSV rows using PHP

Now that we’ve got a CSV file with our content, let’s output each row as a markdown (.md) file.

Below is the script I used; some notes on data hygiene:

  • Download Markdownify and reference its scripts.
  • The CSV is read into a PHP array for this script.
  • Posted – the date posted; this is essential for retaining the post date. I use PHP’s touch() to revert each file to the original post date.
  • Body_html – this is the main content.
  • Title – the title, which is also used for the URL. It had some funky characters, hence the long chain of PHP string functions. Ideally a regex would be faster!
  • Image – the image file name, which is numeric.

An aside: I had a mix of PNG and JPG images, and I actually wanted all of them to be JPGs. Luckily I had just installed ImageMagick, which can bulk-convert images. By converting them all to JPGs, my script would also work. Here is the command I used inside my images folder:

sudo mogrify -format jpg *.png
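Note that mogrify with -format writes new .jpg files alongside the originals rather than replacing them, so once you’ve spot-checked the conversions, you can clear out the old PNGs:

rm *.png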

So here is the PHP script:

<?php
declare(strict_types = 1);

# https://github.com/Elephant418/Markdownify
require_once ("/Markdownify-master/src/Converter.php");
require_once ("/Markdownify-master/src/Parser.php");

/**
 * Convert a comma separated file into an associative array.
 * The first row should contain the array keys.
 *
 * @param string $filename Path to the CSV file
 * @param string $delimiter The separator used in the file
 * @return array
 * @link http://gist.github.com/385876
 * @author Jay Williams <http://myd3.com/>
 * @copyright Copyright (c) 2010, Jay Williams
 * @license http://www.opensource.org/licenses/mit-license.php MIT License
 */
function csv_to_array(string $filename='', string $delimiter=',')
{
    if(!file_exists($filename) || !is_readable($filename))
        return FALSE;

    $header = NULL;
    $data = array();
    if (($handle = fopen($filename, 'r')) !== FALSE)
    {
        while (($row = fgetcsv($handle, 1000, $delimiter)) !== FALSE)
        {
            if(!$header)
                $header = $row;
            else
                $data[] = array_combine($header, $row);
        }
        fclose($handle);
    }
    return $data;
}

$csv = csv_to_array('textpattern.csv', ",");

foreach ($csv as $item) {

    // collect all data for the html body: heading, body html, and image
    $file_data =
        "<h1>" . $item['Title'] . "</h1>" .
        $item['Body_html'] .
        "<img src='/inc/images/" . $item['Image'] . "t.jpg'>";

    // convert the assembled html to markdown
    $converter = new Markdownify\Converter;
    $file_data = $converter->parseString($file_data);

    // title is a bit janky, but working: strip/replace the funky characters
    $file_name = false;
    if ($item['Title']) {
        $file_name = strtolower(str_replace(" ", "-", strip_tags(str_replace("/", "-", trim($item['Title']))))) . ".md";
    }

    if ($file_name != false) {
        $file_handler = fopen($file_name, 'w');
        fwrite($file_handler, $file_data);
        fclose($file_handler);

        // important: reset the file's modification time to the original post date
        $get_timestamp = strtotime($item['Posted']);
        touch($file_name, $get_timestamp);
    }
}
?>

Using declare(strict_types = 1); may produce some warnings on $data[] = array_combine($header, $row);. These can be ignored, or they can be fixed by deleting any columns you don’t need from the CSV file.


* Of all database-driven CMSs, I prefer Textpattern over WordPress. In fact, I had accumulated over 300 posts on Textpattern and had virtually no issues. I decided to move because I used Luapress on a demo site and fell in love with it. My workflow is much better with Typora and Luapress; it just feels natural. I would still recommend Textpattern over WordPress.

** Alternatively, I noticed phpMyAdmin offers a “PHP array” export format too. I haven’t tested it, but it theoretically might have saved some steps.

Update 2020-09-30: I’ve recently learned of a much easier way to export from MySQL directly into a file just by issuing a few simple SQL commands. See how to save mysql query output to a file.
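As a rough sketch of that approach (the database and table names here are placeholders – substitute your own, and note that MySQL’s secure_file_priv setting restricts where the server may write files):

mysql -u root -p -e "SELECT * FROM mydb.textpattern INTO OUTFILE '/var/lib/mysql-files/textpattern.csv' FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n'"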

Of browsers and sausages

Posted on Mar 18 2020

“Je weniger die Leute wissen, wie Würste und Gesetze gemacht werden, desto besser schlafen sie!” (the less people know how sausages and laws are made, the better they sleep at night!) – Otto von Bismarck

Your web browser may be the most complex piece of software on your computer. The original web browsers simply rendered HTML/text from remote servers: pretty simple. That was 20 years ago. Think about what browsers can do today:

  • Render 3D graphics – with WebGL you can have a 3D engine in your browser. You can play all sorts of games and view 3D content online today.
  • Access your computer – you can drag and drop files directly into the browser to upload them, and your webcam can be used in the browser for web conferencing.
  • Streaming multimedia – this was always pretty clunky in the old days and usually involved downloading files. Today streaming is standard fare on the internet.
  • Geolocation – browsers can detect your location; think of all the online map tools you use.

Browsers can manage memory, hardware, and almost all the computing processes that an operating system (Windows, iOS, Android) can.

To show how complex browsers have become, one developer downloaded all the specifications for web standards, which browsers bear the brunt of implementing. In his article The reckless, infinite scope of web browsers, Drew DeVault counted over 100 million words across the specifications. He stated that:

I conclude that it is impossible to build a new web browser. The complexity of the web is obscene. The creation of a new web browser would be comparable in effort to the Apollo program or the Manhattan project.

Another developer, Casey Muratori, discusses how the duopoly of operating systems (Windows/Mac for desktops, Android/iOS for mobile) has led to less-than-optimal results for software:

In all cases, they are maintained by companies whose revenue does not primarily come from the sale of operating systems, and who have many incentives to pursue other goals than delivering the most stable, reliable, trustworthy experience. The operating system is never a product anymore — it is merely something users are forced to use based on the hardware they have chosen, and it is increasingly treated solely as a vehicle for pursuing the platform holders’ other business goals.

Our browser has morphed from a simple HTML renderer into a multi-tool we use for our lives. What are the consequences of using such a complex piece of software?

Less competition – independent browser makers cannot keep up with the complexity, and Apple’s and Google’s ecosystems make it very difficult to compete.

More software vulnerabilities – DeVault cites over 8,000 common vulnerabilities and exposures (CVEs) for browsers such as Firefox, Chrome, Safari and IE. The more complicated the code, the more likely it is to be exploited. This is a step away from the Unix philosophy axiom: do one thing, and do it well.

Privacy is a second-class citizen – many of Firefox’s default features allow ‘digital breadcrumbs’ to be sent back to servers, while other features let sites disable certain functions (e.g. the right mouse click).

A user on GitHub named 0XDE57 has compiled a large list of under-the-hood settings in Firefox’s about:config, some of which may break certain websites. However, many of these can give you back some degree of control over your browser.

Another software programmer, Andrew Chase, has compiled a similar list of modifications for Chrome and Firefox, also on GitHub.

Closing thoughts.

Complexity isn’t necessarily a bad thing; however, it does make obfuscation easier. The average user will not care how their browser works, so long as it works.

Going back to Bismarck’s comment – the less you know, the better you sleep. While some sleep may be lost knowing what your browser is doing under the hood, we should perhaps use that insomnia to make better software.

Go private, go blockchain, or roll your own email

Posted on Mar 14 2020

Email is an interesting beast. Many people, even those with considerable technical abilities, shy away from using anything but a big tech email provider. The majority of people use Google – either with a classic Gmail address or, with a touch more configuration, a custom domain set up on Google. More than a handful still go with Hotmail or Yahoo. The last smattering goes to various other email providers.


Here I’ll go over the consequences of using that free email account and 3 alternatives to consider.

What are the consequences of using big tech email?

We use big tech email because it’s free and convenient. But what does this mean for us, and for our privacy? Here are 3 simple reasons not to use big tech email:

Your data becomes their fodder. Yes, Google allows you to check and remove your data at any point, but this is a moot point. It’s essentially like giving your personal diary to your sticky-beak auntie for safekeeping just because she’s got a nice mansion to keep it secure. You can retrieve it anytime, and it’s certainly yours, but who’s to say your auntie didn’t read, copy, and distribute your diary! While you might own your data, you don’t control it.

Big players make big targets. We often worry about hackers lurking in dark basements tracking our online behaviour. These days, hackers are less individualistic and have become massive state players: China, Russia, the USA and many other countries have been complicit in massive data breaches. Data is the most valuable commodity, and if there is a dragon hoarding this gold, you can bet there are troupes of rogues systematically looking for ways to pilfer these huge reserves. The biggest breach in history so far has been Yahoo, and the attack, though technical, relied a lot on human error.

Spam – emails from free providers are often trapped in spam filters simply because the accounts are free and can be used by spammers. Even if you have never given out your email address, spammers have a vast toolbase for guessing addresses. For businesses, reaching out to free email accounts can be a problem as well.

There are plenty more reasons. Let’s look at 3 solutions.

Get a Privacy-Focused Email Provider

There are several free and paid privacy-focused email providers. I’ve selected 3 below for having strong security and being in a jurisdiction where user privacy is more protected (this excludes nearly all English-speaking nations). For more on this, read about Lavabit.

In addition, being open source is important: it gives an opportunity to see how your data is handled.

Protonmail is a Swiss-based email provider and is the first name that usually comes up for privacy-focused email. Although Switzerland is not under the jurisdiction of the GDPR, it has a strong legacy of privacy, both in business and culturally. Some features of Protonmail:

  • Encrypted with AES, RSA, and OpenPGP
  • Free Accounts get 500MB of storage and a limit of 150 emails per day
  • Paid accounts get more features starting at 4.00 €/month; this includes using your own domain (eg like mark@asylon.org) for the account.
  • Is open source
  • Has iOS and Android apps

Tutanota is a German-based email provider. Germany is under the jurisdiction of the EU, which means the strong protections of the GDPR are in place. Here are some features:

  • Symmetric (AES 128) and asymmetric encryption (AES 128 / RSA 2048) to encrypt emails end-to-end.
  • Free accounts get 1 Gigabyte of storage.
  • Paid accounts get more features starting at 1.20 € /Month, such as getting your own domain.
  • Is open source.
  • Has an Android app.

Mailfence is a Belgium-based provider, which also puts it under the jurisdiction of the GDPR. Some features:

  • Encrypted with AES-256
  • Free accounts get 500 MB of email storage and 500 MB of document storage
  • Paid accounts get more features starting at 2.50 € /Month
  • Is in part open source, using the OpenPGP.js library.

One trade-off is that the free tiers offer a lot less space than the big tech accounts, but in exchange you get more privacy.

Get on the Blockchain

This is not one I would recommend yet; however, there is a lot of promise here, and contrary to popular belief it’s not just hype. Currently I am not convinced this technology is mature enough for many users.

In short, these emails can be easily accessed through the web, but the data is not stored on a central server. These are known as dApps, or ‘decentralised applications’: your data is distributed, encrypted and secure in the network, instead of being controlled by a central authority like a government or corporation.

I have limited experience with these, but given enough users they will be a huge threat to surveillance capitalists. Currently the top blockchain networks for dApps are the Bitcoin and Ethereum networks.

Two to investigate are Blockstack.org and Dmail. Blockstack uses the Bitcoin network for the creation of blockchain applications, and Dmail is an app that resides on it. What’s great about it is that it has all of the same tools you would expect from Google, ranging from your own online storage to email and even maps.

However, I cannot recommend it at this point, because as much as they talk about privacy, their privacy policy remains spotty at best. As of the time of writing, their actual online privacy policy is an unreadable document (archive.org link).

If you are an early adopter, it would be worth keeping your eye on the different blockchain networks.

Rolling your own server.

There is a stigma that email is hard. Over the years, I found that most of the programmers I knew had their own personal websites, but very few had their own mail server. They no doubt had the technical ability to set one up, but always defaulted to out-of-the-box solutions (e.g. Gmail). Upon querying my programmer friends who didn’t host their own email, the responses were similar across the board:

  • It’s too much maintenance work.
  • I’ve already got a free [gmail/yahoo/hotmail/wundermailingus] account.

These are not actually good excuses. Not to denigrate myself, but I don’t have nearly as good technical chops as the people I work with or have worked with; I have been fortunate to work with some of the most intelligent technical minds around the world over the last 20 years. If I can do it safely and securely, so can anyone.

First, after hosting my own email, I found that if configured correctly, it’s no less work than hosting your own website – maybe a little more, but not that much more. Secondly, people have moved from one free big tech account (yahoo.com, hotmail.com) to another (gmail.com) before, so this kind of migration is nothing new.

There are many tutorials out there about how to set up your own server. If you go this way, I see 2 different paths to take: setting up an email server on your own dedicated server, or using an open source hosting control panel. Let’s look at each option.

Dedicated Email Server.

Below is a very simple overview of what is involved in setting up an email server. As you can see it’s pretty involved, but it can be tackled in a systematic approach (a rough install sketch follows the list):

  1. Getting your own domain name.
  2. A hosting server (more than likely running a flavour of Linux).
  3. MTA – a Mail Transfer Agent, the component that sends the mail. Postfix is commonly used for this.
  4. MDA – a Mail Delivery Agent, which takes emails from the server and delivers them to the users’ inboxes. Dovecot is commonly used for this.
  5. A spam blocker (SpamAssassin).
  6. A database (PostgreSQL/MySQL/MariaDB).
  7. A webserver (Apache/Nginx).
  8. A webmail client (Roundcube).
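As a very rough sketch of what steps 3 to 8 look like on a Debian/Ubuntu box (the package names below are illustrative – each component still needs substantial configuration afterwards):

# core mail stack: MTA (Postfix), MDA/IMAP (Dovecot), spam filtering
sudo apt install postfix dovecot-core dovecot-imapd spamassassin
# supporting pieces: database, webserver, webmail
sudo apt install mariadb-server nginx roundcube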

Although a few years old, this Ars Technica article is an excellent start for setting up your own email server. Even if you don’t go this route, it’s an excellent read to understand each component that is required.

Another very popular option is a bundled software solution that takes care of several of the above steps for you. One of the most popular programs for setting up your own email is Mailinabox.com. This application includes a precise step-by-step guide, an install video, and a discussion forum. There are other solutions out there as well, such as iRedMail and Modoboa.
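For example, at the time of writing, Mail-in-a-Box condenses most of the above into a single bootstrap command (always verify the current instructions on mailinabox.email before piping a script into bash):

curl -s https://mailinabox.email/setup.sh | sudo bash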

Hosting Control Panel.

Unlike the above, a hosting control panel handles nearly every aspect of website & email hosting. The learning curve is slightly gentler; however, there are many more moving parts that can go wrong! If you already run a few websites, then this might be the best solution. There are two open source control panels that I have worked with, both excellent for different reasons:

  • ISPConfig – this control panel runs under a BSD license and can run on several Linux systems. Here is a list of features from ISPConfig: https://www.ispconfig.org/ispconfig/services-and-functions/
  • Virtualmin – this hosting software runs under a GPL license and supports several operating systems, though mileage may vary. Here is a list of Virtualmin features.

Both hosting systems have online communities, and I would highly recommend checking them out before installing to get a feel for what they are like; if you have an issue, the community boards are the best place to solve it.

Regarding the setups, much of the email configuration is automated once you get the system up and running, and most of the configurations are ‘hardened’ by default, erring on the side of security.

Further reading:

These links range from the philosophical to the technical to the practicalities of moving and changing your email account. If you are deciding to make a change in your email lifestyle, ponder some of these reads:

Finally, let me conclude with some final thoughts from legendary computer scientist Don Knuth:

I have been a happy man ever since January 1, 1990, when I no longer had an email address. I’d used email since about 1975, and it seems to me that 15 years of email is plenty for one lifetime.