Tag Archives: IT

Writing Good Code

I’ve always liked xkcd, because it’s clear that the author knows firsthand what IT is really about. The strip “Good Code” in particular highlights the conundrum with which every coder is confronted: write fast, or write well? In the first case the danger of spaghetti code is always around the corner, whereas in the latter the requirements may have changed by the time you’ve finished your project.

Those who have read my blog know where my preferences lie. At least when you write at the speed your customer’s needs require, your code stands a chance of having some useful life span.


Coding Block

Stuck with a piece of coding? How about letting your freewheeling mind take the strain while you’re driving home. Or sit in a pub and write out a stream of the ideas revolving in your head. Or get up in the middle of the night because your mind is too full, and put your ideas on paper until your mind is sufficiently at rest to go back to sleep.

It’s amazing how many times a combination of the above circumstances has helped me out of a rut. At times your conscious mind becomes so intent on one type of solution that other options are not even considered. That’s why letting your backseat mind take the strain sometimes helps.

Or just discussing and explaining your problem with another IT specialist, or even a layperson, can make you see the light: merely by making things clearer in your own mind you can often blow away the cobwebs.

People Are Different

People differ from one another, so much is obvious just by looking at them. But until I had to plough through other people’s coding, I did not realise how much more different people are inside their head.

Presumably that’s why software companies insist on one coding standard: to make sure that one person’s job can easily be picked up by another. As things stand in Port Talbot, there is no such standardisation, with everyone writing the type of code they are most comfortable with. On the one hand this makes it easy to establish your own style and be comfortable with your own writing, and in a way it makes it easier to recognise someone else’s writing from their idiosyncrasies.

However, if you’re given someone else’s code and asked to modify it or to develop it for your own needs, very often the best way to progress is to figure out what the code is intended to do and then rewrite it in your own style; otherwise you’ll always be at the mercy of a half-understood piece of code.

One of the hardest pieces of writing was Jim Kyle’s ASP code, where variables and functions were given short and far from meaningful names, and trying to follow the logic was tortuous at best and nigh impossible at worst. In one instance I had to go back to the owner of the page and ask how, from first principles, his page was supposed to be populated, and then forget about Jim’s code and write my own version as if it were a brand new page.

Sometimes even my own early attempts at coding made me cringe when, after many years, I was attempting to add some functionality. In a way that’s not a bad thing: firstly it shows that my coding skills have improved over time, and secondly, by ripping up the original and creating an improved replacement, I had decreased the number of badly coded pages by one.

Still, I’m fairly sure that if I had to go back now, I probably would be lost in my own pages, especially if someone else had modified them. Mind you, as time goes on, a return becomes less and less likely, and that’s probably all for the good. It means that either the code stands the test of time, or whoever is in charge of the various pieces of code has managed to adapt them for their own purposes.

Look at how different people are doing their coding, and you realise that people are more different inside their head than they are on the outside.

The Vision Thing

I remember George Bush Sr. admitting that he wasn’t all that good at “the vision thing”. If my performance appraisals at Tinplate R&D are to be believed, then I’m also lacking in this department. I was never exactly sure what was meant by the term, which may be precisely why I was deemed deficient.

Mind you, if I look at how British Steel went about deciding on its strategy and implementing it, I have my doubts about this vision thing. Was it all that clever to underinvest in your core business and instead spend your money on a mini-mill in Tuscaloosa? Or to close one of the two coke ovens in Port Talbot just when demand from China cranked up the world coke price? For people in a position where you’re supposed to have some degree of vision, those were pretty daft decisions (I know, 20/20 hindsight is a wonderful thing).

Closer to home, I’ve never understood why senior managers never bothered to find out how many man-hours were spent producing the reports they wanted. Hence they never discovered the hidden waste of people going through mindless loops, spending between a quarter and half of their time unproductively producing regular business reports that could and should have been automated. But then again, IT is only advanced plumbing, and whoever heard of a plumber with vision?


Clementine

This is a piece of software to perform data mining (currently owned by IBM and known as SPSS Modeler) for which we’ve had a number of active licences since the British Steel days. I suppose it was an early version of the type of data analysis the likes of Facebook and Google now use to analyse customer habits and preferences.

I haven’t used Clementine for a long time, mostly because my job in the last 15 years in Corus / Tata didn’t really call for it, but also because I’ve seen it used as a data extraction tool where a bespoke .NET application could do the job far more easily, and fully automated as well.

The one time I used it properly, in an attempt to find a pattern determining the flatness of temper rolled blackplate, ended in failure. I thought my sample of about 1,000 records should have been plenty – it definitely took a long time and much blood, sweat and tears to collect the data prior to automated data gathering – but when you split the data set over various reel types, two types of annealing, various gauge ranges and mechanical properties, you find that many of the possible correlations rest on ridiculously small sample sizes, and therefore any conclusions were not worth a sack of beans.
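The arithmetic behind that failure is easy to sketch. Assuming, purely for illustration, four reel types, two annealing routes, five gauge ranges and five mechanical property bands (the exact level counts aren’t stated above), a thousand records spread painfully thin:

```python
# Back-of-the-envelope check: splitting ~1,000 records across every
# combination of the factors mentioned leaves very few records per cell.
# The level counts below are assumptions for illustration only.
factors = {
    "reel_type": 4,
    "annealing": 2,       # two types of annealing, as in the text
    "gauge_range": 5,
    "property_band": 5,
}

n_records = 1000
n_cells = 1
for levels in factors.values():
    n_cells *= levels

avg_per_cell = n_records / n_cells
print(f"{n_cells} cells, ~{avg_per_cell:.0f} records each on average")
# → 200 cells, ~5 records each on average
```

Five records per combination is nowhere near enough to support a correlation, which is exactly the trap described above.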

It definitely cooled my ardour for this type of investigation, and fortunately my subsequent job content no longer called for its use. Maybe that’s all for the better, since I’m not aware that all those years of using Clementine in Operational Research have really led to any new insights – at best they merely confirmed what a knowledgeable practitioner might have suspected prior to the analysis. Our investigation into the issue of PM10s (see Pollution) was a case in point, since it merely confirmed the prime importance of wind direction.

There was also some misuse of Clementine in trying to do jobs that were better performed by a bespoke .NET application. One example was the extraction of screen sales records, which meant someone had to kick off the Clementine job once a week, capture the resulting records in an Excel spreadsheet and send this spreadsheet to the interested parties. Exactly the type of labour-intensive job that I was trying to get rid of – so that’s what I did: I used the logic for collecting the relevant screen sales records to build an extraction application, populated an SQL table, and built a web page to display the results. From that moment onwards no further manual input was required to retrieve and analyse the data.
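For flavour, here is a minimal sketch of that kind of extract-and-load replacement: pull the relevant records, load them into a SQL table, and let a web page (not shown) query that table instead of waiting for a weekly spreadsheet. Every name here is hypothetical – the real thing was a bespoke .NET application against the works databases – and SQLite stands in for the actual SQL server:

```python
# Sketch of an automated extract-and-load pipeline. All record values
# and table/function names are invented for illustration; SQLite stands
# in for the real SQL server.
import sqlite3

def extract_screen_sales():
    # Stand-in for the real extraction logic against the source system.
    return [
        ("2004-01-05", "screen-stock-A", 120),
        ("2004-01-05", "screen-stock-B", 85),
    ]

def load_into_table(rows, db_path=":memory:"):
    # Create the table if needed and append the freshly extracted rows.
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS screen_sales "
        "(sale_date TEXT, product TEXT, tonnes INTEGER)"
    )
    conn.executemany("INSERT INTO screen_sales VALUES (?, ?, ?)", rows)
    conn.commit()
    return conn

conn = load_into_table(extract_screen_sales())
count = conn.execute("SELECT COUNT(*) FROM screen_sales").fetchone()[0]
print(count)  # → 2
```

Scheduled to run unattended (a nightly or weekly job), this removes the manual kick-off, capture and e-mailing steps entirely – the display page simply reads whatever is in the table.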