
Tuesday, 10 May 2016

Conversations about the cloud in Australia

Another day and another chat with a client about cloud computing options. There are some absolute turkeys out there peddling cloud this and cloud that to people. Stop it! ADSL2+ doesn't provide enough bandwidth for your plans - in the war between reality and expectation, reality wins. This particular client is fortunately on the ball enough to realise that pushing all their key applications off their local server and into the cloud isn't a brilliant plan.

So what else do we do for these clients? What clever options can we provide?

It comes down to the application, of course. If they're doing scanning or uploading large files to an offsite location, it's not hard to use a Raspberry Pi or similar to get the data trickling out, or to bulk upload it overnight with a script.
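As a sketch of the overnight option - assuming a Pi with rsync installed and SSH keys already set up, and with made-up paths and hostnames - a script as simple as this does the trick:

#!/bin/sh
# push the day's scans offsite; -z compresses, --partial resumes interrupted transfers
rsync -az --partial /srv/scans/ backup@offsite.example.com:/backups/scans/

Schedule it from cron (0 23 * * * /home/pi/upload.sh) for the overnight bulk run, or add something like --bwlimit=100 to trickle the data out during the day without flooding the uplink.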

If it's email or something like that - then get it into the cloud. Just let 'em know the limitations compared to what their server currently manages - i.e. sending a large email out will take time. Your server used to plod along getting it out the door, but now you have to wait while Chrome sends it to Gmail.

Remote Desktop Services aren't something people like, so what about a microserver with Server 2012 on it, with AD replication and file replication using DFS? Under the right circumstances this will work over ADSL, and people at both sites will see updated information reasonably quickly - depending, of course, on how DFS is configured.

There are options - we just have to be smart about how it's presented and show a path forward if the NBN does ever arrive. Today I showed a router upgrade to a client, then talked about how it's (almost) plug and play for the NBN and how it can leverage great access for VPNs etc. We IT people are typically poor salesmen - we get excited over either the trivialities or the technicalities of a solution and we lose our audience.

The biggest lesson I can give you is simple - use analogies to explain why cloud computing is a challenge. I always describe an ADSL connection as a 4 lane highway in and a goat track out, to represent the data path. People understand that - it's easy. Get yourself a few of these analogies and put them together to form a coherent picture that brings your clients along with you in the discussion. Remember - a client can be a business client, friend, colleague or even your boss. With a little bit of education we can help our clients avoid big mistakes and cut through some of the bullshit around the cloud.

The cloud can be great. We just have to be smart about it and make sure the shysters and bullshit artists out there don't screw up our clients' networks, because then we've failed in our jobs.

In closing - please give us decent NBN! Australia needs it to grow and for businesses to be more agile (and I totally need it at home so I can download movies faster!)

Saturday, 16 April 2016

The rise of ransomware and the devil that is Cryptolocker

Over time, with advancements in anti-virus and anti-spyware, the ne'er-do-wells have inevitably evolved. Their cunning and understanding of human behaviour has resulted in the devil that is ransomware. An innocent email from Australia Post arrives, or a letter from the Australian Federal Police turns up in your inbox - most people are curious, even excited by an unexpected package, or concerned about a letter from the AFP, and so they click the link. Boom! All their data starts being encrypted - strong encryption, with a lovely letter to say pay us or never get your stuff back.

We've seen it live in the field on at least 5 occasions, and have had one or two clients actually pay the ransom - buying their Bitcoins and getting their decryption key back. Sadly, we've also had people try this only to find the website they need to talk to has been shut down by the authorities, leaving their data irretrievably lost - until we restore it from backups, that is... though sometimes even this doesn't work, if people haven't moved data to the servers for backup. It's not good.

It seems inevitable that a new mode of making money from people would emerge. Stealing credit card details and personal information - while profitable - is fraught with the danger of being identified when you try to use them. An anonymous ransom - helpfully supported by an anonymous currency - means a relatively straightforward exchange: a key for your own data back. It's kind of elegant in a way. This doesn't mean I don't think these people should be prosecuted - I've seen and experienced the stress and pain of business owners as their data becomes inaccessible, and I'd love to pass that back to the perpetrators.

There are ways and means of protecting yourself from this, but it comes down to being systematic about it - just like these thieving bastards have been. Consider your vectors of risk:

  • external influences
    • email
    • websites
  • physical influences
    • users
    • dodgy USB keys
  • active attacks
    • hack attempts
    • social engineering
There could be more.

Defending against these requires defence in depth. Some of these things are passive - they do the job all the time, with managed updates - and some are active - people actually need to think about what's happening. Here are a few examples:
  • email scanning (externally if possible)
  • strong firewall
  • internet scanning (if possible)
  • anti-virus and anti-spyware on the machines
  • Chrome instead of good old IE (or awful Edge)
  • user education - the single best form of protection and typically the one with the least amount of time and resources put into it. Honestly, it makes me experience sadness that people aren't trying to get their staff skilled up a bit on the computer. It doesn't take all that much!
  • backups
  • backups of backups offsite
  • restorable backups
Of course, this kind of systematic approach doesn't just cover the awfulness of ransomware, but could be helpful against many other risks - attacks, environmental (fire / flood / famine / Justin Bieber), malicious employee activity (still identified as the number one risk by many security dudes) and other "out of the box" issues.

I haven't mentioned patching or updating your system, although I know this is critical. Updates are a beast of a thing, and anyone caught with a SharePoint update that means a database rebuild will know where I'm coming from - security updates can cause more pain than if the damn thing was hacked or compromised in some way. Understanding the impact of an update on key infrastructure is very important, and it isn't a passive security mechanism like some of the other things I've mentioned.

Reporting is also a tricky thing - yes, it's great to have all this stuff going on, but who wants to plough through 20 or 30 system-generated emails in a morning? I have for the last 12 years and it's bollocks. I don't care for it at all, and to be honest, it becomes so rote and routine that I miss things (sometimes very important things) because I'm just skimming. I urge you to consider some sort of consolidated report or webpage, with information colour coded according to rules so that things going wrong are immediately obvious. We only have a limited attention span (I think I've written about this elsewhere) and so we need to make sure that time is carefully put to use and not wasted on stuff that's all OK and we don't care about. Likewise, with Nagios or whatever other monitoring software I'm using, the thresholds for things going wrong are set quite high - i.e. tell me when something is really going to be in trouble, not that it's just experiencing some sadness now.
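As an illustration, with Nagios' stock check_disk plugin the raised thresholds might look like this (plugin path and percentages are examples only):

# warn at 10% free disk, critical at 5% - alert only when there's real trouble brewing
/usr/lib/nagios/plugins/check_disk -w 10% -c 5% -p /srv

Set the warning level where action is genuinely needed, not where a disk is merely experiencing some sadness.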

This post has diverged somewhat from Cryptolocker, but the principles for keeping that piece of garbage out of your network are similar across a wide range of threats.

Saturday, 4 January 2014

Experiences with the IOmega ix12 NAS

One of my clients has an IOmega ix12 - a rack mounted, 12 x 3.5" bay NAS. The one in question is running 2.6.5.something firmware and has (or should I say had) 8 x 1TB disks in it. Today it dropped another one. The NAS has been set up as a RAID10 with total storage of 2TB (1.5 or so in reality). Although it has this redundancy, I can assure you it's not all it's cracked up to be. In the main storage pool are a number of iSCSI drives shared out to various ESX servers. Sadly, the largest and most important of these iSCSI drives has an unfortunate tendency to vanish whenever there is a disk issue.

So today one of the drives failed. No big deal, right? RAID10 is usually tolerant of such faults, and this one kept going - except for the aforementioned large iSCSI drive that disappeared like magic. After replacing the disk, impatiently waiting for it to rebuild and then attempting to reconnect the iSCSI drives, I realised I still couldn't see it. The fix? Turn the NAS completely off, count to 30 and turn it back on again. Once it booted, just like magic, everything reappeared. Be wary of this. My previous post was the precursor to this one. Finding the actual root cause is still to be done.

I'm concerned about upgrading the firmware, because the backups, while reliable, are never straightforward to recover, and I'd hate to have to recover so many disks. It's a job for next week, I reckon.

Friday, 10 February 2012

A simple script to use either robocopy or xcopy to backup files

Under various circumstances, I've found it useful to cobble together a script to do a sync backup across the network from one Windows server to another. Usually this is for files only, and is either a mirror or a daily full backup of data. Obviously there are some great backup tools available that make something like this largely unnecessary; however, this is quick, simple and gives you an email output of what has happened. The first example below uses robocopy (Robust Copy), which is a very nice bit of kit indeed. It's a bit more useful than xcopy and handles larger numbers of files better. Don't get me wrong, I love xcopy, but it has its limitations. I use rsync a lot on Linux servers, and robocopy gives me many similar options for how I want to handle files.

The destination directory could be anything - another folder on the same PC, a removable disk, a mapped share or even a straight UNC path e.g. \\server\share - flexibility is the key for this script, once the basic variables are right and you've decided to use robocopy or xcopy then off you go.

So to the script:

Open notepad and put this info in - note where things are comments and what variables you'll have to change:

Script start:
echo on
REM Set up some time variables - note date /t output is locale-dependent (this assumes DD/MM/YYYY)
for /f "Tokens=1-4 Delims=/ " %%i in ('date /t') do set dt=%%l-%%k-%%j
for /f "Tokens=1" %%i in ('time /t') do set tm=-%%i
set tm=%tm::=-%
set dtt=%dt%%tm%
REM set up variables for log files, source and destination - change this variable
set log="C:\Users\owner\Documents\Scripts\Logs\%dt%.log"
REM local stuff to be backed up - change this variable
set src="c:\documents"
REM remote location to put backups - change this variable
set dest="I:\backups\server"
REM now for the actual work - change switches as required - explanation of switches is below.
robocopy %src% %dest% /E /Z /MIR /R:1 /LOG:%log%
REM I'd like to know how it went (this file can be big if there are a lot of files copied)
echo Backup Logs attached | blat - -subject "Sync Log Report for %dt%" -to "me@mydomain.com" -attach %log% -f user@domain.com

Use blat to send the email - grab it from www.blat.net (great program!). It sends an email with a subject line that will look like this:
Sync Log Report for 2012-02-10
and an attachment of your log file. You can add different things to this - for example I'll often use a [servername] tag after the date.

The robocopy switches used are:

  • /E = copy sub-directories, including empty ones
  • /Z = copy files in restartable mode (in case the network drops out or something similar)
  • /MIR = MIRror a directory tree (which is /E plus /PURGE)
  • /R:1 = number of retries on failed copies. It's best to set this - by default it's 1 million (!)
I run this from the Windows Scheduler and have a mirrored copy of data files each night (there's a schtasks example after the xcopy notes below). It's quite a useful little tool. If you'd like to use xcopy instead there are a few things to consider:
  • the src and dest variables need to have a trailing backslash and a wildcard
    • set src="c:\documents\*"
    • set dest="i:\backups\server\*"
  • and the command to insert would be:
    • xcopy %src% %dest% /C /D /E /H /Y > %log%
    • where the switches are:
      • /C = continue copying even if there are errors
      • /D = copies files whose source is newer
      • /E = copies directories and sub-directories (even if empty)
      • /H = copies hidden and system files
      • /Y = suppresses prompting to overwrite files
    • the > redirects xcopy's output to the %log% variable we configured earlier in the script, and then blat will email the resulting file out.
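And as promised, to run either version from the Windows Scheduler, a schtasks one-liner from an elevated prompt will do the job (the task name, path and time are just examples):

schtasks /create /tn "NightlySync" /tr "C:\Users\owner\Documents\Scripts\sync.bat" /sc daily /st 23:00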
If you find this useful in any way, please let me know in the comments.

Friday, 28 May 2010

The joys of SPARC

When I was beginning in the IT game, a long time ago, Sun Microsystems, IBM and SGI were all huge names. Microsoft was a beginner, and all the big machines were Unix based - and so were the servers. I was first introduced to the web on an old Sun box running lynx. I couldn't think of anything I wanted to look at on the Internet so I left it alone :-)

As time went on, Sun boxes were still the class act. Ultra 10s came in and I was fortunate enough to have one with an 18.1" LCD on my desk. Ah, the joys of the stability of that machine. Very nice indeed. When the new Sun Blades came out I was too damn busy with dirty old Windows boxes and just starting to get into Linux in a big way, so I didn't get to play with them. As my job has changed over the years, the opportunity to play with these machines has never been renewed... until now!

Recently I managed to pick up from eBay, for a very small amount, both a Sun Blade 150 and a Sun Blade 1500. I haven't had much of a chance to play with the 150, but the 1500 and I have become acquainted over the last few days. It's been a struggle - the Sparc architecture enjoys quite a bit of support, but the XVR-600 video card it carries doesn't. I've really struggled with this, as I want this machine to be a desktop machine. So after trying Debian's Sparc install, Gentoo and OpenSolaris, I've finally settled on giving OpenBSD a crack. Why not, I say? Try everything until you either run out of bandwidth or time!

I haven't really used the BSD range of distributions before, finding them more akin to Unix than Linux, and as often as not the time I need to put into learning them simply isn't there. Fortunately I've now had some time to put into this and I've discovered a few things. Number 1 - if you haven't been actively learning new stuff, getting your brain back into that particular mode of thinking takes a while; assimilating and understanding all of the relevant information takes time. Number 2 - I'm waaay out of touch with the command line and editing conf files. Although I still do a little bit of this, OpenBSD demands attention to the detail in the confs. That being said, the default installation of OpenBSD is simple and easy - you get to a working system in no time. The ample documentation that I've found is quite clear and easy to work with. Number 3 - learning a new operating system on a complicated bit of hardware probably wasn't the *best* idea I've had. OpenBSD is hard enough; making it work on a Sparc is a bit harder again, especially when you hit the sort of trouble I did.

That trouble was namely configuring X. The graphical frontend to Linux/BSD etc is complicated if it doesn't work by default. Most often it does nowadays, and I've been blissfully using it without giving any thought to the old days when I had to hack the X config to get pictures on the screen. So it felt like a return to the days of yore when I had to start crafting an X config to get it to work. There is apparently a bug in OpenBSD whereby the driver for the Sun Blade 1500's XVR-600 (while fully supported) doesn't get automatically assigned. The Sparc OpenBSD mailing list helpfully provided me with the very basics of an X conf and away I went. I'm experimenting with it now - trying to get better resolution and colour depth. The amount of time I have spent has been prodigious, but well worth it. The guys on the mailing list were very helpful and quick to assist, so a big shout out and thanks to them!

I'll add another post when I start to get some traction on what I'm doing. I also have the Blade 150 to install for fun, so I'll post details on that too. I'm disappointed that I couldn't get Gentoo to work on the Blade 1500, especially after the 3 GB download. Ah well - I still have the disk in there, so maybe we'll get it going :-)

Friday, 14 November 2008

Ubuntu 8.10 Review updates

I've been using 8.10 for a while now, at home and at work. Naturally the more demanding of the two has been at work, where I've been working on both ISO9001:2000 documentation and web page development. Two things have struck me as annoying, and I've found one little solution and one big solution.

The first problem I've had is that when I enable the nVidia restricted drivers for the video card in this GX270, OpenOffice has problems with its rendering. Specifically, the menu names, the font name, font style etc all become transparent and I can't read them. When I close a document and the dialogue box for Save/Discard/Cancel comes up, I can't read the options. Firefox is unaffected, which is fine, but I spend a fair bit of time in OpenOffice and it's very inconvenient. The fix? Well, it's the little one - I disable the restricted drivers and problem solved. I don't use a lot of the extra graphics effects, so that's fine.

The other problem is the workspaces control applet in Gnome. There are only 2 workspaces by default and ordinarily you right click on the applet to add extra workspaces. For some reason this doesn't work all the time - there is a bug report about it, I believe. Yesterday I was cranky enough that I re-installed Ubuntu (this was an upgrade machine that I've "tweaked", so a fresh start was a good option - that's the big solution). Voila! The workspaces thing is working again. At home, however, it isn't, and I'm not about to re-install that machine too. For the time being I'll suck it up, find the config setting and manually change it if I get cranky enough. At home the extra workspaces aren't such an issue, but I do like to have 4 at any given moment.
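For reference, when Metacity is the window manager the workspace count lives in GConf, so something like this should set it back to 4 without a re-install (from memory, so treat it as a starting point rather than gospel):

gconftool-2 --type int --set /apps/metacity/general/num_workspaces 4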

Apart from that, everything is going extremely well. The darker theme is very soothing on the eyeballs, and that's a good thing - especially with many hours spent staring at the screen. Well done, good work and keep it up, Linux developers!

Tuesday, 11 November 2008

Review: Windows Server 2008 Small Business Server

Recently I've had the requirement to install and configure Windows Server 2008 SBS. Previously I've deployed 2003 SBS in varying settings and found it to be quite a nice little product. In more recent times though, I've gone the more segmented path of Windows Server 2003 plus Exchange 2003 separately and found that to be quite effective. For a small operation, however, SBS provides many features that are very helpful to the operator who isn't necessarily skilled in IT, nor has a great deal of time to set things up and play with them. Being a Sys Admin with the inclination and the time to play with it, here are some of my thoughts regarding SBS2008.

The management system for accounts and the like is very nicely set up for the non-technical user. An application with common tasks is easily available and allows most casual system changes to be made. I refer primarily to user and group (both distribution and security) creation, file and folder shares, and backup configuration/testing. SBS2008, like its predecessor, has some nice daily reports that can be emailed to the Administrator or a designated person, and these contain excellent information for keeping an eye on the system - disk usage, backup status and the like. Very nice.

SBS2008 likes to take control of the network - it offers to be the gateway, DNS, DHCP, mail and file server all in one. I had problems with this on my network - I already *have* a gateway, DNS and DHCP servers, and it didn't play nicely with them at all. In fact, the installation failed and I had to restart it from scratch. This time I was a bit smarter and had the server plugged into a dummy switch. Once the initial install was past, I added a virtual interface to my gateway and set the SBS server to point at it. This time it was happy, but by gum it wasn't the first time around. It automatically assigned itself the .2 IP address on the network (i.e. 192.168.0.2) and off it went from there. It talked successfully to the gateway, and once I reconfigured squid and my firewall, network traffic flowed as one would expect it to.

Something I learned from SBS 2008 is that you cannot set up mail addresses etc from within Active Directory, as you can in Server 2003 (not SBS) with Exchange. Also, security and distribution groups all had to be set up from the SBS system application. Not realising this, I created groups and used the Exchange Systems Manager to set up the email accounts and groups. These did not appear in the SBS system application. Oops! I'll have to revisit it and see if it's found them or not. Something else I learned - the HP ML350 I installed SBS2008 on is a very nice machine, but the HP setup disks - which cater for nearly every *other* version of Server 2008 - do not cater for SBS 2008. Damn! All those drivers and tools had to be installed after the fact - SBS2008 just would not work with the HP boot CD. Disappointing, but the ML350 was released before SBS2008, so you can't have everything I guess. That being said, the ML350 is quite a nice bit of kit indeed and highly configurable and affordable too. This one had a simple RAID1 mirror and everything ran flawlessly - as you would definitely hope it to in this particular setting.

Overall, I was impressed with SBS2008 and if needed I would happily deploy it elsewhere. I imagine that as one becomes more familiar with it, the ease of installation and configuration would certainly increase - I felt a bit confused at times and had to rely on trusty Google to get me through. At the very least, the problems I encountered were well documented by others. I must say though, the initial problem I had with the installation of the server failing was one that had been noted on the Microsoft webpages but no solution had been offered. If your SBS 2008 install fails - make sure it is the only thing on the network when you do the install - no other servers should be present if you want it to go smoothly. By all means migrate your AD stuff later, or even better (given that it is for Small Businesses) create a fresh new directory and build it in a pristine state for roll out. Also to be noted - it requires a minimum of 60GB of hard disk space for the C drive and a minimum of 4GB of RAM (!). Lucky RAM is cheap :-)

Flexibility, with Dell and Ubuntu

I know I wax rhapsodic about Ubuntu and the general power and wonderfulness that is Linux, but I've just had an experience that adds another ingredient to the mix. I have a bit of a penchant for older hardware - I maintain a gaming system that is generally no more than a year old before I upgrade its innards, but apart from that system, all my hardware is old stuff. Why bother to buy a new machine to run Linux on, when a two or three year old box will do exactly what I want?

I recently picked up a Dell Precision 380 and an Optiplex GX280. Sweet machines - I'm particularly impressed with the Precision, and it makes me wonder what a brand new one would be like. Perhaps I'll have to break with tradition and invest in a new one :-) The Optiplex was purchased on a bit of a whim - only $100 and it's a P4 3.2GHz box with DDR2 RAM etc (I'm using it right now actually). Not bad, and it's also a mini desktop (or whatever you call them) - very small, quiet and easy to tuck away. It's the same as the two GX240s I've got. The only bad thing I've found is that the CDROM appears to be faulty in some fashion. At any rate, installing Ubuntu 8.10 wasn't happening - disk errors abounded, and although the CD itself checked out, it was failing on this system. Rather than piss and moan about it, I pulled the 160GB SATA disk out and popped it into the Precision - all the disk mounting stuff is the same, the cables reached perfectly etc - ran the install there, and after a short interval I shut the box down, retrieved the disk and threw it into the GX280 again. An hour and a half later, I've customised the hell out of my Ubuntu install (mmm, pretty) and it's going great guns - no hardware issues, nothing. Sweet!

This is kind of a review of Ubuntu 8.10 too - I really like the new DarkRoom theme, and as usual the upgrade process (which I performed on another box) went very smoothly. It runs quite nicely, and although I've read elsewhere that 8.10 is slower than other iterations of Ubuntu, it appears quite snappy to me. Earlier I transferred about 10GB of data over my gigabit network and that gave it a real push - the load hit about 8 at one point :-) But the old hardware and the new operating system pushed on quite happily - although I did hear the fan complaining a bit. With the ambient air temperature around 28 degrees C, plus the work it was doing, it's no wonder really.

Back to the Dell PCs - I now have 6, and many of the parts are interchangeable. Naturally there are exceptions to the rule - the power supplies vary somewhat between the little PCs, and the Precision has almost unique innards compared to the others. This is quite alright - I'm very pleased with all of them. I've actually got a reasonably good working installation of Windows Server 2008 Standard on the Precision - Microsoft's TechNet Direct is a terrific resource if you want to stay on top of all the Microsoft software out there. The Precision runs it quite happily, and it's gratifying to note that nVidia even has drivers for the Quadro video card that work happily under 64 bit Server 2008. It's a happy little family of Dell PCs, Ubuntu Linux, Windows Server 2008 and my lone Windows Vista/XP gaming box. The cousins in the lounge room - the Xbox (modded, naturally) and its younger brother the 360 - talk happily to these machines too, and combined it forms a nice little network. If only my power bill wasn't quite so large...

Wednesday, 29 October 2008

Virtualisation in the server space

Recently I've had the chance to view VMWare's (www.vmware.com) and HP's (www.hp.com.au) entries in the virtualisation market - through VMWare's ESXi server and HP's Blade servers and SAN systems. Prior to this presentation I'd only looked into virtualisation a little bit - concerns about server redundancy and the like had stopped me spending much time on it, and besides, the size and scope of the organisation I work for is insufficient for such things. Or so I had thought.

I have about 10 servers under my direct control, and of these, 6 could be virtualised. Thinking about it going forward, in 3 to 5 years when these servers are due for replacement, buying 2 servers and a small SAN will cost less than 6 new servers and provide the same services. At the end of the day, I think that's what it's all about - the same services (or better), with uptime up in the five nines range and cost-effective hardware/software. I've been a fan of VMWare's virtualisation stuff for several years, solely through the VMWare desktop range. Running multiple operating systems and applications on a single development machine was lots of fun. Back in the day (5-odd years ago) RAM was pricey and so was disk space, so the VMs were pretty small.

Now you can run a whole server instance in a file and then slip that file between physical servers depending on load requirements and whatever other fancy things you want to do - such as bringing one physical server down for updates while maintaining all your VMs on another. Absolutely amazing stuff - especially when you combine it with blade technology and have dozens of servers sharing one big SAN, all in the same rackspace. Truly incredible. Pretty complicated too, I think - keeping track of where everything is running.

The new Blade servers from HP look very impressive too - they've done a lot of work on cooling and on cabling, and everything is redundant, with at least one identical module backing each part. Of course, when I saw those big fancy Blade servers I thought to myself - why virtualise? You could have a netboot install of a server ready to rock at a moment's notice and be back up and running in a very short time, especially if you combine the server with Microsoft's ASR technology and tape backup. There is so much chance for overkill it's every enterprise geek's wet dream. There were quite a few geeks at the presentation who looked like they were in their happy place, that's for sure.

The most important thing I took away from it all was that virtualisation in the server space has come a long way since I last looked into it, and the offerings from both VMWare and HP are quite spectacular. Given my quite low funds though, I'm now looking into Linux's Xen virtualisation. If you're more interested in this (any virtualisation for that matter) there is a lot of stuff on the interwebs. Mind you don't clog up your tubes! :-)

Monday, 15 October 2007

Musings on System Administration

I was reading an article discussing forensic preparation for computer systems. Some of the stuff in there I knew the general theory of, but not the specifics of how to perform. As I thought about it, it occurred to me that Systems Administration is such a vast field. There is no way I can know all of this stuff. I made a list of the software and operating systems I currently manage. They include:
- Windows Server 2003, Standard and Enterprise
- Exchange 2003
- Windows XP
- Windows Vista
- Windows 2000
- Ubuntu Linux
- OpenSuSE Linux
- Mac OSX (10.3 and 10.4)
- Solaris 8
- SQL 2005
- Various specialised software for the transport industry

I have specific knowledge of some of this, broad knowledge of all of it, and always think "There's so much I *don't* know". It gets a bit disheartening sometimes. For one thing, I have no clue about SQL 2005 and I need to make it work with another bit of software. All complicated and nothing straightforward - irritating doesn't begin to describe it. As for the Microsoft software - because of its prevalence throughout the world, there is a lot of online information available. Likewise with Linux - it's incredibly rare to encounter a problem someone else hasn't already come up against.

So how to function within such an environment? Understanding, I think, is the key. The more you understand about the process, the easier it is to figure things out. To my reckoning, understanding the "why" of how things work makes it much easier to learn new things quickly when fixing problems. For beginning sysadmins it's probably the most important thing - that and curiosity. I'm always interested in learning about new things and trying new stuff out. Having sufficient hardware to indulge this obsession doesn't hurt either. If you can't have all new stuff, second hand stuff is a good way to play.
