Home server project (part 2)

In an earlier post, I talked about putting together the hardware for my home server project. Next, I needed to load an operating system, and this is where I ran into a few problems…

As my laptop runs Linux Mint day-to-day, a Windows server was never a consideration. Naively, I assumed there would be a raft of Linux-based NAS or home server distros that I could install, and that I'd be streaming episodes of Spaced to our Chromecast within hours.

I had a checklist of non-negotiable requirements:
– NAS file storage
– scheduled backups
– Plex media server
– BitTorrent client
– Minecraft server for our son

I thought the hardest part would be choosing. On the face of it, I could be forgiven for making that assumption. The web is awash with reviews and suggestions – Amahi, OpenMediaVault, FreeNAS, NAS4Free, Rockstor, etc.

Amahi was a clear favourite in a lot of the web reviews I read, so after some moderate-to-severe procrastination I eventually decided to try it first. I expected to find an Amahi ISO to download, but the instructions for Amahi 8 required Fedora 21 to be installed first. Err… OK. Off I popped to Fedora.

The user is then expected to enter bash commands to install Amahi over the top of the Fedora Server install – I'm fine with that, but I imagine it could be very off-putting for someone with limited or no Linux experience.

Whilst waiting for Fedora to install, I signed up to Amahi and created my 'HDA profile' (HDA apparently stands for 'home digital assistant'), which seems like a very useful feature. However, a scan through the Amahi Apps catalogue (only possible after creating an account) revealed that practically every 'essential' app is charged for – a bit of a cheeky surprise, but not a deal breaker. That this is only revealed after you've signed up, though, is somewhat disingenuous and set the old 'rip-off alert' sirens going in my head.

After that, I noticed that several of the apps I wanted were still in beta, and that the beta testing program had been closed to new testers. However, Pro account holders get 'early access' to beta apps for $7.95 per month – this was all starting to smell somewhat fishy, and I realised that the monthly charge plus paid apps would quickly mount up. I don't have a problem paying for software (indeed, I've paid hundreds for pro-audio and office apps over the years), but I was starting to wonder what other hidden costs would materialise after I committed to an Amahi setup. Plus, I had no guarantee that everything would work as I wanted it to.

In the end, the choice was made for me when one of Amahi's install commands generated an error every time I typed it. On to the next…

Rockstor was more promising: built on CentOS, with an ISO image available to download, easy to install and configure, and quick to get up and running. It uses BTRFS as its file system, with all the benefits that brings – snapshots, checksums to guard against bitrot, SSD optimisation and so on. Plus, the web UI is gorgeous!
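To give a flavour of what BTRFS provides under the hood (Rockstor exposes all of this through its web UI, so you don't normally need to touch the command line), snapshots and scrubs look roughly like this – the pool paths and names here are just illustrative:

```
# Take a read-only snapshot of a subvolume – near-instant, thanks to copy-on-write
sudo btrfs subvolume snapshot -r /mnt/pool/media /mnt/pool/.snapshots/media-2015-10-09

# Scrub the pool: re-read every block and verify checksums to catch bitrot early
sudo btrfs scrub start /mnt/pool
sudo btrfs scrub status /mnt/pool
```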

Rockstor also offers plugins, Docker containers called Rock-Ons, which seem to work very well. Rockstor is the only system on which I progressed to the point of getting Plex up-and-running, easily achieved with the Plex Rock-On. The selection is limited right now, but once the Rock-Ons platform is fully stable, I would anticipate a flood of existing Docker container projects being forked, adapted and tweaked to expand the range of Rock-Ons.
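For anyone wondering what a Rock-On actually does behind the scenes: it's essentially wrapping up a Docker container and its settings for you. A hand-rolled equivalent of the Plex Rock-On might look something like this – the image name and host paths are my own illustrative assumptions, not what Rockstor actually uses:

```
# Run a Plex container by hand – roughly what the Plex Rock-On automates.
# --net=host helps with local discovery; /config holds Plex's own database,
# and /media is wherever the media files live on the host.
docker pull linuxserver/plex
docker run -d --name plex --net=host \
  -v /mnt/storage/plex-config:/config \
  -v /mnt/storage/media:/media \
  linuxserver/plex
```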

So, why didn't I stick with Rockstor? The BitTorrent Rock-On on offer is Transmission, which I just can't get along with – a shame, because Transmission seems to be the default torrent client on so many Linux distros. Transmission was an irritation, but the deciding factor was that, try as I might, I just couldn't get Rockstor to see one of my unformatted hard drives, and that there didn't seem to be a way to plug in an external USB3 hard drive to transfer media files across (the external drive being formatted as NTFS rather than BTRFS). Still, I was hugely impressed by Rockstor's features and wiped it somewhat reluctantly – I suspect I'll be back in a few months to give it another try.

So that’s as far as I got this week – next, I’m planning to give OpenMediaVault and FreeNAS a whirl. If neither of those work for me, a DIY NAS on Ubuntu Server will probably be my fall-back.


A fun little home server project

… or so I thought.

It started OK: about a year ago, I inherited an old desktop tower case, a Thermaltake Kandalf, from a neighbour who moved away. This thing is a monster – almost two feet high and 22 kg, with built-in liquid cooling.

Having spent its life in the Arabian desert, it needed a lot of cleaning to get rid of all the sandy dust, and then it sat in a cabinet until I finally got around to doing something with it last week.

It still needed another proper clean – every time we have a sandstorm here, everything in the house gets covered with a layer of yellow dust.

There are seven internal 3.5″ hard drive bays ready to use, as well as four or five additional 5.25″ drive bays that could be used with adapters.

One of the planned uses for this server will be as our NAS and backup machine, so having some space to expand is a plus. I already had some spare hard drives from the desktop PC I gave to a friend when we left the UK.

With the addition of a PSU, a veteran gaming motherboard, an Intel i5-4690 CPU and a stick of RAM, we have pretty well everything we need for our home server.

I don't plan to use the built-in water-cooling radiator and connections for now, as the unit is running cool and very, very quiet, but the option is there if I need better cooling later – the coolant pipes are tucked away neatly at the bottom of the case.

It had been a while since I last built a PC, but everything seemed to fit together as easily as I remember – if not more easily. Whilst fitting the motherboard, I realised that the three fans in the front panel have two-pin connectors, which means the motherboard (with its four-pin fan headers) won't be able to control their speed, so I picked up a cheap four-fan controller as well.


OK, now to install an operating system.

Loads of choice out there…  what could possibly go wrong?


OpenStack

I recently came across this video, which helped me understand the basics of how OpenStack does what it does:

I really love the idea of OpenStack – an open-source cloud platform that’s fast becoming the industry standard, and displacing the proprietary cloud standards in the market – but the underlying technology is not simple, and seems to involve a steep learning curve.

OpenStack started out as a joint project between NASA and Rackspace, combining elements from an older NASA platform called Nebula with Rackspace's existing Cloud Files service. The idea was to make an open-source platform available to install on commodity hardware, thereby bringing down the cost of offering cloud-computing services considerably. OpenStack has already been embraced by Ubuntu and Fedora, making it simpler to deploy a series of servers quickly and (relatively) cheaply.
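To make that a little more concrete, here's roughly what launching a virtual server looks like with the OpenStack command-line client, assuming your credentials have already been loaded (typically by sourcing an 'openrc' file) and that a flavor and image with the names below exist – they're purely illustrative:

```
# Launch a small virtual server from an Ubuntu image
openstack server create --flavor m1.small --image ubuntu-14.04 my-first-server

# Check how it's getting on
openstack server list
```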

There’s a great piece in the OpenStack FAQ to explain what it’s about:

How do you describe OpenStack to your parents?

This is a great question because I have been in that position before, I like to explain to them with simple examples so that they can correlate with their everyday activities, for instance my father loves to fix almost everything and he waste a lot of time looking for the right tool for the job that he is trying to do, so one day I told him that the Cloud was a toolbox always reachable that will give him the right tool at the moment that he needs it, after a couple of minutes later, he smiled to me and said “When can I have it?”  I just smiled back to him!

More info here: https://wiki.openstack.org/wiki/Main_Page


Free at last!

My IT shackles have finally been removed!

After four years of struggling to deliver against the constraints of a locked down PC, I’ve finally managed to persuade the powers that be to allow me a ‘developer build’, along with other members of the team.

This allows us a small measure of additional control over our Windows XP boxes, not least of which is the ability to defrag the hard drive (at last!) and to install applications.

Right off the bat, I installed PowerPivot for Excel, Firefox, Chrome, GIMP, LibreOffice, Notepad++, Eclipse, WinPython (Python 2.7 plus a whole load of useful libraries and applications), Git, MySQL and PostgreSQL.

Just having the tools available to streamline my workflow is already paying dividends. Plus, I’m now enjoying my work more than ever…


Getting my HEAD around Git

Since getting involved in a Python development project at work, I’ve also had to start using Git.

There’s a lot about Git that I don’t fully understand, but I found the term ‘HEAD’ and the concept it refers to particularly opaque until I had a chat with a colleague about it.

Apparently, HEAD can be viewed as a pointer to the currently checked-out branch, i.e. the one that you're working on. My understanding is that HEAD is essentially a reference that tells Git which branch your edits (and your next commit) will be attached to.

Just to confuse matters, it seems there is also a difference between 'HEAD' and 'head': the repository has one 'head' per branch (the tip commit of that branch), while HEAD in uppercase refers to whichever of those is currently checked out.
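A quick way to see this for yourself at the command line – assuming a repository that has a 'master' branch and a 'feature' branch (names invented for the example):

```
# HEAD is literally a small file in the repository pointing at the current branch
cat .git/HEAD
# ref: refs/heads/master

# Each branch has its own head (tip commit); the asterisk marks where HEAD points
git branch
# * master
#   feature

# Switching branches simply repoints HEAD
git checkout feature
cat .git/HEAD
# ref: refs/heads/feature
```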

A quick search of YouTube brings up this talk about Git, including an explanation of key terms like HEAD/head.

Slight tangent, and it may be an old blog post now (2007), but I like Zack Rusin's Git Cheat Sheet too – I'm a visual kind of person, and the command sequence diagram suits my 'scribble it on a whiteboard' thought patterns quite nicely.

Hopefully the information above will help someone who’s where I was a little while ago, struggling to understand what Git is all about, and developing migraines in the process! Of course, if you feel my interpretation is incorrect, please feel free to leave a comment. Any input or clarification is welcome!


Ubuntu Studio documentation

I've been using Ubuntu Studio since I started experimenting with Linux again last November – it's a great, multimedia-focused distribution based on Xubuntu, and it comes with all sorts of audio, video and graphics applications pre-installed. It also has a low-latency kernel, which (in theory) means that real-time audio and video processing is less likely to be interrupted by other, lower-priority processes running in the background.

However, I hate to admit this, but I still don't find it anywhere near as easy to lay down a quick backing track or indulge in some ad-hoc sound design as it is in Windows 7 or OSX. Part of this is bound to be familiarity – I had quite a bit of experience using both Windows and OSX for audio stuff – but the main issue is that the open-source pro-audio applications available, and the Linux audio stack itself, just don't seem quite as well-developed. Recording multi-track audio seems more than adequately covered by Ardour, but the electronic music industry has now pretty well embraced VST/AU plug-ins and virtual instruments, and it's not easy (maybe impossible) to replicate the plug-and-play workflow of applications like Ableton Live, Cubase or Logic Pro using the open-source options currently available (yep, including the all-powerful Ardour).

Luckily, one of Ubuntu Studio's real strengths is its documentation, including the Ubuntu Studio Wiki, so if there's a way to achieve a particular task, you can usually find out how in the docs. Another of the things I love about open-source software is the opportunity to contribute, and I approached the Ubuntu Studio team last month about helping out – they told me they could do with some help reviewing and updating the documentation. We agreed that the perfect place for me to start was with the Pro Audio Intro and the Jack Quick Start. I'm working my way through both of these (when I get the chance alongside work stuff), and I'm having great fun learning about Ubuntu Studio and audio on Linux in general.
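For anyone who hasn't met JACK before, the Quick Start essentially boils down to getting the JACK audio server running before launching your audio applications. On Ubuntu Studio you'd normally do this through QjackCtl, but from a terminal it looks something like the sketch below – the ALSA device name and buffer settings are assumptions, and yours will almost certainly differ:

```
# Start the JACK server against the first ALSA sound card.
# -R requests realtime scheduling; -r sets the sample rate;
# -p (frames per period) and -n (periods per buffer) trade latency against stability.
jackd -R -d alsa -d hw:0 -r 48000 -p 256 -n 2

# In another terminal, list the audio ports JACK now exposes
jack_lsp
```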

UPDATE 27/06/2013: Unfortunately, since I learned that my current work contract will be ending soon, my focus has been (and will be for the foreseeable future) on 'up-skilling' in technology that's more relevant to work, so Ubuntu Studio has had to take a back seat for the time being…


Be cool

I've just been reading a great blog post on Linux Advocates by Ken Starks called 'time to put on our Big Boy pants', and it really struck a chord with me.

The only reason I got anywhere with the various operating systems and applications I've experimented with over the years is that I clicked away at them like a big kid until I broke them (if I could), then had to work out how to fix what I'd broken, and learned a whole lot about how they worked in the process.

Now that we’re seeing the Raspberry Pi in schools and old PCs given a new lease of life with Linux, before being given to children who can’t afford a brand new one, there are ever greater numbers of kids out there doing the same thing – clicking, breaking, fixing, learning.

For this reason, it's all the more important for those of us who populate the Linux and open-source communities to set a better example. Quite frankly, some of the behaviour we all see now is worse than childish.

The other tangent of the article related to the usability and naming of apps, and sometimes it does feel a little like these are specifically designed to be off-putting or intimidating. This is self-defeating. As more talented young coders start to find their feet, those deliberately obstructive, “you’re not worthy” types of packages will be superseded by better and more user-friendly options that those young programmers will create themselves… and ultimately forgotten in the mists of history.

I get that some people feel that running Linux makes them the underdog, and that they love that feeling. However, when the platform you use and the communities that surround it start gaining popularity, surely that should be celebrated – it shows you were right all along in your choice.

If your reaction is to belittle and shame some unfortunate kid who doesn't know the basics, and maybe doesn't yet have the language skills needed to express themselves to your satisfaction… that's not going to stop the influx of new people to your community; it just makes you look a little pathetic.

Surely a guiding hand is going to benefit us all much more than a slap in the face? It will benefit the poor kid (or average joe/joanna, or silver surfer) trying to find their way, it will benefit the community as a whole, and who knows, it might even benefit the person who thinks twice before spouting bile.

If we encourage a new generation of Linux-based hackers, coders and admins, and if a greater proportion of the general population know how to manage their operating system and create elegant applications, surely that benefits all of us? Don't worry, there will still be plenty of work to go around. We're on the cusp of a new era in computing; all the signs are pointing to that. Let's open the doors and really see what we can build together.
