Tag Archives: Process Control

Loss of Data


It’s probably happened to all of us at one time or another: you’d been working on a spreadsheet or some other document for several hours, so absorbed in your work that you didn’t make intermediate saves, and then the server or your computer went down (say, in a power cut), and you lost a whole afternoon’s work.

This was more likely to happen in the early days of computing, when application software didn’t keep temporary back-ups: if an unsaved document was gone, it was gone for good. One example that stands out clearly in my mind was when the load management system at Allied Steel & Wire cut the mains a quarter of an hour earlier than I had expected, and I lost a Lotus 1-2-3 spreadsheet that I had been working on all afternoon. My expletives must have echoed through the building, because a manager from a few offices down the corridor came to compliment me on my command of the English language.

The situation was even more fraught at Iscor, where many of our documents were stored and edited through a mainframe terminal. In the end you learned to read the warning signs: when your connection started to slow down – often a sign that it was about to cut out – it became a race against time to save whatever you had before you lost your work.

At least I’ve never been in the situation of Steve Carless, who lost all his EngDoc work stored on a common drive when we changed our system of common drives at Welsh Labs. The worst I did was accidentally delete all the emails in my Lotus Notes account – and you know what? I never missed a single one of them!

All of this confirms the adage that data isn’t data until it’s backed up. That was proven when I accidentally deleted more than a year’s worth of blast furnace 4 production records, and they were restored within a few hours from the server back-ups held by Process Control.

Come to think of it, considering that I’ve been working with computers for several decades, I’ve had surprisingly few mishaps. Either I’ve been lucky, or I’ve been doing something right.

Was Perfect Once


From Ben Hammersley’s book “Now for Then – How to Face the Digital Future Without Fear”, Part 60: “Failing Gracefully, or Why Everything is Beta Now”:

“On-line everything is beta because the state of perfection is permanently receding on endless waves of innovation. And an app that is adaptable, or that can deliver a soft landing even when it fails, is far more valuable than the perfect-for-a-moment app that either lacks the flexibility to cope with whatever is coming down the line next, or is too late. Good-Enough Right Now beats Very-Good Later, and completely defeats Was Perfect Once.”

In my innocence, the description above was far closer to the type of development cycle I used for my own IT work than the official line at Process Control or GIS. Both (but especially the latter) were so glacially slow that by the time a solution was ready, the problem had quite often changed substantially. Their approach also seemed to assume that the developer had all the information needed to build the finished article, which from personal experience I found was hardly ever the case – especially since only a select few users were consulted when compiling the needs analysis.

In the end, what such an approach delivers is something that may look snazzy and have a wow factor on day one, but the user soon starts to feel its limitations once it becomes clear that getting new requirements added means a complete new turn of the development cycle – something most users feel (in my view quite rightly) should have been an automatic extension of the initial development.

What I’ve found from personal experience is that, even if the user has a full say over the initial requirements list, early use of the system will either show up gaps in the original understanding of the customer’s needs, or generate further requirements that extend the original design. And that is assuming all relevant stakeholders have been able to have their input, which is not always the case.

The answer: build a continually developing system in which the interaction between user and system converges asymptotically towards stability – that is, until outside factors trigger the next step change.

IT Security


IT security is the health & safety of the IT world. Not that people die when something goes wrong, or have to visit the occupational health nurse, but in the sense that it stops all discussion, because “surely, no-one can be opposed to better health & safety regulation?” Likewise, IT security can be used to squash any dissenting view, because “surely, no-one can be opposed to better IT security?”

Except when the proclaimed added security is cover for introducing whatever IT measure you want without meeting opposition. You don’t like people using Dropbox to keep documents in the cloud? Declare it a security risk and ban its use. You don’t like people accessing their personal emails during work? Declare Gmail unsafe (which, by the way, takes some chutzpah) and ban its use.

But even when the issue is real security, sometimes the medicine is worse than the disease. Take for instance the hard-drive scan scheduled to start at 10am every Tuesday: on a standard XP machine (in short, most people’s), the limited CPU means that for the rest of the working day your machine is slow as hell.

And when there was a breach, did it happen through one of these well-advertised security risks? No. The latest one I encountered came from someone plugging in an infected USB stick, which McAfee hadn’t been set up to vet for anything dodgy. I half expected USB ports to be disabled from then on, but presumably even the security experts thought that a measure too far.

Still, whenever a new measure was announced, my cynical self couldn’t help searching for ulterior motives. Not that complaining would have helped anyway – not when you receive a snotty email identifying you as one of a select group of people who had the audacity to use Dropbox for the simple reason that it was useful.

Fear of Open Source


Big companies tend to go for big companies such as Microsoft or Oracle when it comes to supporting their IT needs. The main reason is exactly that: support. Your contract ensures that whenever something goes badly wrong with your systems and your internal support can’t handle it, there’s always the option of big brother lending a helping hand.

So far, so good.

Most of the time it doesn’t really matter too much which system you go for, provided you strike a sensible balance between fitness for purpose and a small enough number of systems to support. But what I’ve seen happen over time is that anything with a whiff of being “non-standard” gets the evil stare.

Take for instance the Hot Mill Systems guys at Port Talbot’s hot strip mill, who for reasons hidden in the mists of time (but probably to do with Martin Doyle being more familiar with Linux boxes and the languages that go with them, such as Perl and PHP) did not fit in with the Process Control set-up of the time, which consisted of Windows servers, Visual Basic and ASP.

At one point someone from the hot mills management team was approached by the Process Control manager to see whether a switch from their open source set-up to the standard Process Control one could be considered. The question was passed on to me, and my reply was: “Do you realise how many thousands of web pages alone would have to be rewritten to make the conversion?” Not that converting existing PHP pages to ASP would be impossible, but it would be a mammoth task that would take a whole team several months, purely for the sake of tidiness, with no new functionality to show for it.

Mind you, at a later stage, when I was asked to rewrite the morning meeting pages, I did so in ASP.NET – not only because I had got out of the habit of writing web pages in PHP, but also because of the added functionality I could achieve in .NET. That decision, however, was made on practical rather than ideological grounds.

On the other hand, if you spoke to Process Control, and even more so to the business analysts in GIS, the whole concept of open source was anathema, to be avoided at all costs. Maybe this was because they were system administrators rather than coders, and felt they might be out of their depth without the security of a well-supported fallback.

This knee-jerk aversion to open source became clear to me when I was considering whether it would be possible to change from our existing SharePoint wiki to MediaWiki which, being the software behind Wikipedia, I considered the industry standard. At the time I was looking for a wiki solution that might be more user-friendly than the SharePoint set-up, and thought that the ease of adding images, together with the availability of discussion pages and the automatic table of contents at the top of an article, made MediaWiki a strong candidate to migrate to.
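To give a flavour of that user-friendliness, here is a minimal sketch of MediaWiki wikitext (the page content, link target and file name are made up for illustration): section headings are what the automatic table of contents is built from, and embedding an image or linking to another article is a one-liner.

    == Burden preparation ==
    Ordinary text describing the practice goes here, in plain English.
    See also [[Blast furnace 4 production records]] for the raw data.

    [[File:BF4-stockhouse.jpg|thumb|right|The blast furnace 4 stockhouse]]

    == Troubleshooting ==
    Headings like the two above are collected automatically into the
    table of contents shown at the top of the article.

Compare that with editing a SharePoint wiki page of the period, and you can see why the markup itself was part of the attraction.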

So I put some feelers out to see whether there was any chance of trying MediaWiki out as an alternative to our existing SharePoint wiki. No sooner had I broached my proposal than the shutters came down – in the analyst’s words: we don’t do open source. There was no point arguing; it appeared to be dogma that no matter how useful a piece of software might be, if it was open source it did not get past the first hurdle. End of.

Never mind that in my opinion MediaWiki IS the standard when it comes to wikis: because it is open source, you could not even get to the point of discussing the relative merits and demerits of the old and the proposed solutions. Which is a real shame.

I Don’t Do Attachments


“I don’t do attachments” – those were reportedly the words of Ian Hobson when he was Operations Manager at Ebbw Vale. Meaning that if you sent him an email, you had to state what you had to say in the main text, without reference to any attachments, because he would not even look at them. At the time I thought that was a rather odd thing to do, but over time I’ve come to understand his point of view (at least when not carried to extremes).

The reason I’ve come to this understanding is Lotus Notes databases. During the British Steel and subsequent Corus days our email system was IBM’s Lotus Notes, which was only replaced by Microsoft’s Office 365 during the Tata Steel years. Lotus Notes also came with the option of creating “databases”: sometimes these were properly built and useful systems, such as the Next Steps databases or Process Control’s Post-Product Support database. Most of the time, however, they were merely vehicles for clunky text-based documents, and worst of all, some consisted of nothing but a forest of attachments without a word of explanation of what was where.

That’s why, when I started the Strip UK SharePoint wiki, I immediately disabled the facility for adding attachments. I wanted a wiki article to show, on the first click, the information most relevant to it, explained in normal English – not tucked away in an attachment when it should have been in the main body of the article to start with.

Obviously, getting your thoughts together and putting them in writing takes more time and effort than merely attaching a document, but in the end the usefulness of the article is enhanced immeasurably for any user who visits the site at a later stage.

I’m fairly sure there’s a general message in here: a little more effort in advance can make most systems more useful in the end. Or am I over-generalising?