Wednesday 7 November 2018

Is science in a bad way?

Last week to the Royal Institution to hear Jeremy Baumberg (reference 1) on the secret life of science. A talk which got the Institution back on its feet after a couple of not so good talks. In part, a promotion for his new book (references 2 and 3), a promotion which I have resisted so far - not because the book does not look interesting, rather because my pile of unread new books is rather high as it is.

In his day job he works in nano science, but he moonlights on the sociology of science, on what he calls the science eco-system - a term which captures what he is on about quite well. He did have a diagram of this system, including all the main players, but I can neither reproduce it from memory nor find a copy on the Internet. Players like the scientists themselves (perhaps 50 million of them), the institutions employing the scientists, the journals publishing their work, the scientific publishing conglomerates (like Elsevier), the governments (and others) that pay for the science and the science journalists who tell us all about it. The industries that use it. And the tools, like the citation metrics, used to measure the players.

But to start at the beginning, the first event of note was wondering what Allen Carr of Raynes Park was. See reference 4 for the solution.

Perhaps next time we will find out what the place snapped left is - it not being at all clear from the pavement of Albemarle Street. But this time I managed a Quickie at the Goat, and then on to a not very good turnout in the main lecture theatre at the Royal Institution. Baumberg himself was very good, with assured delivery of material which I thought was well pitched to the occasion. He came across as rather upbeat about both the state and the future of science, despite the rather downbeat opening of the Project Syndicate article listed at reference 3, as yet unread by me.

He spent quite a lot of time on the citation metrics used to rank scientists, institutions and journals. With the first of these being, roughly speaking, the number of times that your papers have been cited by others - not unlike what I understand Google to do to rank hits to searches when no money has changed hands. A metric which might be a bit crude, but which he claims is quite hard to game: to get a good score you need to do lots of good work. He also pointed out that with the avalanche of papers pouring out of the system, people needed some way of ranking them, and the ranking of the journal in which they are published is such a way. So if the paper gets into one of the family of journals published by the Royal Society, then it must be good. But then, as an FRS, he would say that, wouldn't he... In the same way, if one gets hundreds of applications for some routine research post, one needs some not too time-consuming way of ranking them.
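He did not go into the arithmetic, but the best known of these metrics is probably the h-index: the largest number h such that h of one's papers have been cited at least h times each. A toy illustration in Python, my own confection with made-up numbers rather than anything from the talk:

    # Toy illustration of two citation metrics: total citations and the h-index.
    # The papers and their citation counts are invented for the purposes of the example.
    papers = {
        "Paper A": 52,
        "Paper B": 18,
        "Paper C": 7,
        "Paper D": 3,
        "Paper E": 1,
    }

    total_citations = sum(papers.values())

    # h-index: the largest h such that h papers have at least h citations each.
    # Sorting the counts in descending order means the test (count >= rank)
    # holds for exactly the first h ranks, so counting the hits gives h.
    counts = sorted(papers.values(), reverse=True)
    h_index = sum(1 for rank, count in enumerate(counts, start=1) if count >= rank)

    print("Total citations:", total_citations)  # 81
    print("h-index:", h_index)                  # 3 - the 3rd ranked paper has at least 3 citations, the 4th has fewer than 4

Crude, as he said, but the only reliable way to push either number up is to write papers that other people want to cite.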

He did not mention the custom of most scientists of including lots of references in their papers, in part in the hope that the authors of the papers they cite will return the compliment. Or the custom of some teams of putting all their names on all their papers - which can run to hundreds of names in the larger teams. I seem to recall that there is kudos in being first or last on such lists, so perhaps the citation algorithms score for that too. Or perhaps the more democratic teams insist on alphabetic presentation, algorithms notwithstanding. Change one's name by deed poll to 'Arthur Aardvark' or 'Zachery Zydecos'?

I also recall being told by an animal biologist about the never ending quest for funding, and I dare say that frequent publication is an important part of this. I dare say also that having to knock out papers to impress the authorities can get in the way of actually doing science. It can take up a great deal of time, perhaps a great deal more than the handful of happy readers would justify. There is also a tendency - and not just in the science parts of educational establishments - to rate skill at publishing rather than skill at teaching - skills which do not always reside in the same person.

On the up side is the way in which the struggle to get published in a reputable journal is good for the quality of papers. Giving the thing yet another going over before submitting it yet again almost always results in improvements. Give papers an easy ride and they will get sloppy. Glazer made the same point in a rather more colourful way in the talk noticed at reference 8 - with the paper in question going on to be successful and widely read.

He noted that science was prone to fads and fashions, just like any other endeavour. We just had to learn how to manage them. I associate to claims that, some years ago now, the way to get funding was to work the acronym 'AIDS' into one's proposal. And that, some years before that, the way was to work in some words which might catch the eye of some computer in the US Department of Defense, at that time rather free with its research funds for all sorts of unlikely projects.

He noted that science had become very competitive and that there were some down sides to this. But it is hard to see how it could be otherwise, now that a lot of science is very expensive and a lot of people want to do it.

He claimed that science was unlikely to get things wrong in a big way. It was too open and there were too many people at it for gross error to go undetected for long. But he did worry that science was becoming too homogeneous, too much the same the world over. Diversity is important and if we all do things in the same way we will probably end up missing stuff. An observation that plays in other walks of life, where bringing people together from lots of different backgrounds to tackle something or other often produces unexpected and valuable results.

He claimed that the science system, in the round, worked. Scientists always complained about politicians, but you had to have politicians to hand out the money, of which there would never be enough. And so the politicians had to have tools - that is to say metrics - to help them with that handing out. He thought that here in the UK we did rather well at this - although other countries were catching up fast. I wondered how much science could be done these days without money, and associated to all those Soviet mathematicians and physicists who did great work with chalk and blackboard - because that was all a lot of them had.

He did not mention the open science movement, which raises some of the same issues. A movement which seems to be winning, with a great deal of material now being published open access in the first instance. See, for example, reference 5. Nor did he mention the various services which exist to help with the avalanche. See, for example, reference 6 - a service which I find very useful.

I remember just two observations about nano-engineering. First, that it must be possible because nature does it. Second, that the mechanical appearance of biological gadgets at electron microscope scales is encouraging.

Question time much better than is usual at these sorts of talks.

Downing College (one of the relatively new colleges at Cambridge) was having a London alumni evening downstairs, to which they attracted about a hundred people. Rather more than Baumberg.

On the way home, we called in at the Half Way House at Earlsfield, but were driven out after one by the background music, which was very foreground as far as we were concerned. I mentioned this to the attractive barmaid on the way out, who managed a rather condescending smile. About all that a pensioner rates in what is essentially a young professionals' bar - it having once been a rather sleepy & comfortable working class bar.

With the result that we wound up at the Marquis at Epsom. Back bar busy with a bash for some local lawyers, but their noise was entirely tolerable. As was the wine.

Reference 1: Professor Jeremy Baumberg FRS, Fellow of Jesus College, Cambridge, University Professor of Nano Science and Director of the Nano Doctoral Training Centre.

Reference 2: The Secret Life of Science: How It Really Works and Why It Matters - Jeremy J. Baumberg - 2018.

Reference 3: http://thesciencemonster.com/. A website which appears to exist to promote reference 2.

Reference 4: https://www.allencarr.com/. Daily Telegraph: 12:01AM GMT 30 Nov 2006: Allen Carr [of Raynes Park], who died yesterday aged 71, was a 100-a-day cigarette addict before kicking the habit in 1983 and launching his "Easyway" method of helping smokers to quit ... Carr was philosophical when diagnosed with an inoperable form of lung cancer in July this year. "I estimate I've cured 25 million smokers over the years," he said, "and if my illness is the price for that, it's worth paying".

Reference 5: https://www.plos.org. 'PLOS was founded as a non-profit Open Access publisher, innovator and advocacy organization with a mission to advance progress in science and medicine by leading a transformation in research communication. We believe that OPEN is a mindset that represents the best scientific values, bringing scientists together to share work as rapidly and widely as possible, to advance science faster and to benefit society as a whole'.

Reference 6: http://martinedwardes.webplus.net/eaorc/. 'The EAORC was set up in 2003 to help students and investigators in Evolutionary Anthropology to find out about key papers in their subject area as they were published. It was based on an original concept created by Helena Tuzinska, and has been producing a weekly bulletin (compiled by Martin Edwardes) for over fourteen years now. The interests of the members of the EAORC are varied, but cluster around language evolution, cultural evolution, and cognitive evolution. The papers, articles and news items selected each week reflect these interests'.

Reference 7: http://www.globalblue.com/corporate/. Having now checked out this outfit, I don't think we need abandon the Goat after all. Something to do with support for luxury shopping; not our sort of thing at all.

Reference 8: https://psmv3.blogspot.com/2017/04/bragg-and-son.html.

Group search key: jba.
