This is by way of follow-up to the post about pivot tables at reference 1.
Shortly after that post, I was moved to remind myself about writing programs in Visual Basic. In a couple of hours I had 145 lines of code, a single subroutine, which seemed to do the job: converting my weight text column in pounds and ounces to a numeric column in grams. From ‘5lbs 9.5oz’ to 2537.28.
Confident enough that I did not bother with any prior analysis and design, I started out with the premise that I had a text field which was likely to start with a number, give or take a few leading spaces.
The next assumption was that this number should be followed by some variation of ‘lbs’.
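The parsing just described can be sketched in a few lines. This is an illustrative re-implementation in Python, not the original 145 lines of Visual Basic; the function name `to_grams` and the exact regular expression are my own assumptions about the shape of the data (leading spaces, a pounds number, some variation of ‘lbs’, an optional ounces part):

```python
import re

LB_IN_G = 453.59237      # grams per pound
OZ_IN_G = 28.349523125   # grams per ounce

def to_grams(text):
    """Parse strings like '5lbs 9.5oz' into grams, or None on failure.

    Tolerates leading spaces and minor variations of 'lbs'; the ounces
    part is optional.
    """
    m = re.match(r"\s*(\d+(?:\.\d+)?)\s*lb[s.]?\s*(?:(\d+(?:\.\d+)?)\s*oz)?",
                 text, re.IGNORECASE)
    if not m:
        return None
    pounds = float(m.group(1))
    ounces = float(m.group(2)) if m.group(2) else 0.0
    return round(pounds * LB_IN_G + ounces * OZ_IN_G, 2)

print(to_grams('5lbs 9.5oz'))  # 2537.28
```

Anything the pattern does not recognise comes back as None, which in the spreadsheet version would be one of the wrinkles left visible in the neighbouring column.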
At each stage, I wrote the results out to a neighbouring column, then peered at the two columns until a decision emerged from the depths about which wrinkle in the results to tackle next.
After a while, I had code which did the job. I even went to the bother of adding a third column to hold the weight rounded to the nearest hundred grams, expressed in hundreds of grams.
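The third column is a one-liner in the same hedged Python sketch (again my own function name, not the original VBA): the weight is divided by one hundred and rounded to the nearest whole number, so 2537.28 grams becomes 25 hundreds of grams.

```python
def to_hundreds(grams):
    """Round a weight in grams to the nearest hundred grams,
    expressed in hundreds of grams."""
    return round(grams / 100)

print(to_hundreds(2537.28))  # 25
```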
Which gave rise to various thoughts about the coding process. At least, about this one.
It was evolutionary, rather than top down, which meant that half way through the process one realised that there was a better way of doing things – but that it was a bit late, short of starting over. A better way of doing things which might have involved a different way of organising the problem, perhaps a few subroutines taking on bits of the work, rather than just the one – and breaking out these subroutines would have made checking and testing rather easier. On the down side, there would have been the overheads of having those extra subroutines and of passing data between them all.
It also meant that the coding was opportunistic. It sometimes made use of bits which had been put there with one end in mind for some quite different end. I associate to Lieberman in reference 2 talking about the complicated way that function interacts with structure during the evolution of the head, with all kinds of odd bits of pre-existing structure being co-opted for odd bits of new function.
Which added up to the code being good enough, rather than perfect. The code coped well enough with the data that was there, but would quite probably break if some new sort of data turned up, some new way of doing things which this coder did not know or think about.
I would add that crashing through the problem like this, without preamble or forethought, is not recommended, certainly not for problems which are significantly bigger. It might or might not come right in the end, but there are apt to be some travails on the way. Not best practice – but then, evolution wouldn’t know anything about that.
After the code, I moved on to the chart, where I found that things were still not quite right: the business of pivot tables not bothering with codes that are not there cropped up again. So the chart on the left is rather distorted by there being codes missing between the two peaks, peaks which are real enough in themselves.
I dealt with this by hand correction of the chart’s input data, that is to say the pivot table tabulated at the far left. Copy the table to a new part of the worksheet, using paste special to strip out all the pivot table stuff, add in the missing rows and replace the blanks with ‘NA’. Which yields the chart on the right.
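For what it is worth, the same fix could be scripted rather than done by hand. A minimal Python sketch, assuming a made-up tally of counts per weight code with two codes missing between the peaks: walk the full run of codes and fill the gaps with zeros (or an ‘NA’ marker, as in the worksheet version) so the chart’s axis is not distorted.

```python
# Hypothetical counts per weight code; codes 23 and 24 are missing,
# which would squash the gap between the two peaks on the chart.
counts = {21: 3, 22: 7, 25: 6, 26: 2}

# Fill in every code from the lowest to the highest, defaulting
# missing codes to zero, so each code gets its own slot on the axis.
lo, hi = min(counts), max(counts)
full = {code: counts.get(code, 0) for code in range(lo, hi + 1)}

print(full)  # {21: 3, 22: 7, 23: 0, 24: 0, 25: 6, 26: 2}
```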
A reminder of the rather different problem of the appearance of charts of this sort being very dependent on the scales chosen.
PS: I may say that the two loaves of batch 550 turned out very well, having lost a total of eleven ounces in the cooking, and that batch 551 is now under construction.
References
Reference 1: http://psmv4.blogspot.com/2020/02/the-perils-of-pivot-tables.html.
Reference 2: The evolution of the human head - Daniel E Lieberman - 2011.
Reference 3: https://psmv4.blogspot.com/2020/01/no-spatburgunder.html. The last notice of reference 2, from a quite different angle.