A number of years ago I had the misfortune to be caught in a heavy rain shower on my way to work. Not only did the water penetrate the raincoat I was wearing, leaving me totally saturated, but it also ‘drowned’ my laptop: when I tried to start it up, the hard drive proved completely unusable and nothing could be retrieved from it. Fortunately I worked in an organisation that allowed me to send daily backups of my laptop across the network to be stored on the server. Within a few hours I was working again on a borrowed laptop, with all my files installed, minus just a few things I’d been working on the night before.
That was my first ‘close shave’, and it made me appreciate the absolute importance of backups! Without them, the loss would have been a disaster for me!
I imagine that many schools and teachers in Christchurch are thinking about this after the recent earthquake. Many have either had their laptops or servers destroyed, or have lost access to them as they lie inside condemned buildings. For them the issue of ‘disaster recovery’ takes on new meaning – it is not simply a case of whether things have been ‘backed up’, but also of where those back-ups are located.
The principal of one school I spoke to is distraught because, while his school had invested wisely in a complete back-up server and ensured that comprehensive back-ups were made regularly, the back-up server was located alongside the active server in the school, and together they lie in a condemned building in the city. Their data is undoubtedly safe, but inaccessible.
A teacher from a second school was telling me how ‘lucky’ they were: as the earthquake was happening, their technician had the presence of mind to grab the back-up tapes from the office as he fled the building. Now staff and students are able to continue operating on borrowed computers in borrowed premises, accessing their files from a borrowed server. Certainly a case of good luck rather than good planning – they are the fortunate ones. Their tapes could just as easily have been left inside a condemned building, leaving them in the same situation as the first school.
One of the essential elements of a good disaster recovery plan is ensuring that you have off-site back-up and storage. This doesn’t simply mean taking the back-up tapes home at the end of each day. Effective off-site back-up involves regularly ‘pushing’ data to the off-site server – at least once daily, typically overnight – but with digital data being mission critical for schools, more frequent back-ups or “continuous data protection” should be seriously considered.
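For schools setting this up themselves, the mechanics of a nightly off-site ‘push’ can be surprisingly simple. Below is a minimal sketch in Python that mirrors a local folder to a remote server using rsync over SSH – note that the source path, host name and account are entirely hypothetical, and a real school deployment would more likely rely on a dedicated backup product or a managed service:

```python
#!/usr/bin/env python3
"""Minimal sketch of a nightly off-site backup push.

Hypothetical assumptions: the school's files live under /srv/school-data,
and an off-site server 'backup.example.org' accepts rsync over SSH
for a user called 'school'.
"""
import subprocess
import sys
from datetime import date

SOURCE = "/srv/school-data/"                    # local data to protect
DEST = "school@backup.example.org:/backups/"    # off-site destination
LOG = f"/var/log/offsite-backup-{date.today()}.log"


def push() -> int:
    # rsync only transfers what changed since the last run, so the
    # nightly push stays cheap; --delete keeps the remote an exact mirror.
    result = subprocess.run(
        ["rsync", "-az", "--delete", "-e", "ssh", SOURCE, DEST],
        capture_output=True,
        text=True,
    )
    # Keep a dated log so failures are visible the next morning.
    with open(LOG, "w") as log:
        log.write(result.stdout)
        log.write(result.stderr)
    return result.returncode


if __name__ == "__main__":
    # Schedule via cron, e.g. "0 2 * * *" for a 2am nightly push; anything
    # approaching continuous data protection needs a purpose-built tool
    # rather than a tighter schedule on a script like this.
    sys.exit(push())
```

The detail that matters here is not the tool but the cadence and the destination: the copy leaves the building automatically, every night, without anyone having to remember to carry a tape home.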
This is one of the significant benefits of being connected to Ultrafast Broadband, and as schools look ahead to how they can leverage their investment in UFB, the lessons learned from Christchurch should push a good disaster recovery plan to somewhere near the top of the list.
There were issues with this in the Brisbane floods as well. A friend who managed library and student support services at a TAFE said that they lost a number of Queensland-based services during the power cuts, but (from memory) the library catalogues etc. stayed online because they were part of a web of inter-state library catalogues, and as the others all had power, the system kept the Queensland ones going. In one of the smaller New Zealand organisations I worked for, which had a hugely popular site, we planned to have the back-up stored in another city for disaster recovery purposes – e.g. if your main host is based in Auckland then your DR might be based in Christchurch (or vice versa). The irony was that we avoided Wellington as an option because of the earthquake risk!
It’s not really until you’ve had a failure or a near miss that you appreciate the value of a good back-up.
A few years ago the library computer spat the dummy on a Monday. Fortunately I had done a once-a-week, grandfathered backup on the Friday. A quick post of the disk to Access-It in Wellington and we were back up and running smartly. Thank goodness – imagine re-entering all the barcodes and SCIS data by hand.
Personally, one day my laptop just wouldn’t start – it needed a new hard drive. I had a wireless back-up to a Time Capsule, and the day was saved, along with all my data.
Even off-site backups can be foiled by humans making assumptions. Many years ago I worked for Musac, the school admin software company, and was visiting a rural primary school. In the process of checking things out, I asked about backups and was shown a model system, well ahead of its time. The admin computer had a tape drive and five backup tapes labelled Monday to Friday; each day the secretary backed up to that day’s tape, then took it home in her handbag. “But we only use the Thursday tape,” she said, “the others are broken.”
That aroused my suspicions; it turned out that Thursday’s tape was the faulty one – it spun briefly then stopped, while the others rattled and whirred away for ages as they actually wrote data. The secretary and the principal had decided the noisy ones were broken and stopped using them. They had been backing up absolutely nothing for the last 18 months!
Good story, Gregor – it illustrates the absolute need for a well-thought-through systems approach to back-up and disaster recovery, one that is multi-faceted and not capable of being subverted by human action 🙂
Many years ago, my fellow students thought I was mad when I took several backups of my Masters, keeping one on the uni server, one in the car and one at home. I figured it was unlikely that the server would fail, my home would burn down and my car be stolen all at once. I hadn’t reckoned on an earthquake in Manchester – thankfully they don’t get them there. These days I back up to the cloud overnight, so there’s much less to think about.
Natural disasters always bring to mind what **could** happen. I love the story of the school that thought they were backing up their data when none of the tapes worked! For less than $200 they could have been covered. I’m guessing they were their own IT department.