Chemistry in the UK – too important to accept such a large gender imbalance

I believe that chemistry and its related activities hold the key to great improvements in both the quality and quantity of human life. As a consequence, it is critical that a vibrant and successful country has a breadth and depth of chemistry expertise. That should mean that the best scientific talent is identified and nurtured to become our leading researchers, and that those brilliant scientists are rewarded and recognised. You might imagine, then, that evidence that this has not happened, and is still not happening, is something you would want to keep quiet, or else make damned certain you were actually addressing, before confessing to it. Sadly, the Royal Society of Chemistry (RSC) has once again announced to the UK that the discipline of chemistry continues to fail the nation. It announced its prizes and awards recently, and here’s the thing: it gives away almost £200K every year in prizes.


If, back in 1980 (when the RSC was formed), you had suggested spending the prize funds in such a way that it would be 2017 before more than 20% of that money went to women, I hope that you would have been sent away with a flea in your ear to think again. And yet here we are: it is 2017 and, for the first time, the proportion of money going to women (among awards made to individuals) has just crept above 20%. You have to believe some strange things about the abilities of the genders not to view this as an admission of failure. As far as I can tell, since 1929 there has been only one female recipient of the Pedler award, none for the Perkin prize (since 2008) and none for the Robert Robinson award (since 1964). While I cannot quibble with the selection of the recipients of these prizes (I have enjoyed working with several of them), this is a damning indictment of the discipline. If the RSC can really think of nothing better to do with £200K every year than to keep on advertising how chemistry is failing the UK, then I am deeply worried.


Back in 2014, when I first took notice of the list of prizes, I wrote an email to the RSC on the subject and received a vaguely reassuring response (both published in the RSC news magazine). I was sufficiently annoyed by the following year’s published list that I threw it away in disgust. I have kept the 2016 and 2017 versions and have analysed them along with the 2014 list; I am happy to make the spreadsheet available. It is interesting that female recipients receive a slightly higher proportion of the money than of the prizes themselves, although I think this simply reflects the over-representation of men across all of the prizes, combined with the relatively small number of prizes worth £1K or less. I also acknowledge that I have assigned genders based on the accompanying photographs and names, and it is true that gender is a more fluid property than has long been assumed. However. To use any of that as an excuse or distraction would be inappropriate.
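For transparency, the kind of analysis behind these figures can be sketched in a few lines of Python: for each gender, count the prizes and sum the money, then take shares of the totals. The entries below are placeholders invented purely for illustration, not the actual RSC award lists.

```python
# Sketch of the spreadsheet analysis: compute each gender's share of the
# prize count and of the total prize money.
from collections import defaultdict

def gender_shares(prizes):
    """prizes: iterable of (value_in_gbp, gender) tuples.
    Returns {gender: (share_of_prize_count, share_of_money)}."""
    counts = defaultdict(int)
    money = defaultdict(float)
    for value, gender in prizes:
        counts[gender] += 1
        money[gender] += value
    n = sum(counts.values())
    total = sum(money.values())
    return {g: (counts[g] / n, money[g] / total) for g in counts}

# Placeholder data (NOT the real award list): 8 prizes, 2 to women.
example = [(2000, "F"), (5000, "M"), (1000, "M"), (3000, "F"),
           (2000, "M"), (2000, "M"), (1000, "M"), (5000, "M")]

shares = gender_shares(example)
# With these made-up numbers, women receive 25% of the prizes
# but only 5000/21000 ≈ 23.8% of the money.
```

The same two-line tally is easy to reproduce in any spreadsheet with a pivot table over value and assigned gender.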


I note that in the UK parliament session beginning in 2001 there were 118 female MPs (18%), slightly down from the previous parliament. This caused sufficient concern and embarrassment that the law was changed to permit positive steps to be taken to change the situation. There has since been a slow but steady increase, to 32% female MPs in the current parliament. I am afraid that unless chemists collectively take positive steps to change their situation, similar governmental intervention might be required. We are not living through a period that can indulge a comfortable club for the boys controlling a vital national resource.



A question about scientific publishing

I don’t really pay enough attention to know whether this is a new thing, or whether I should already be used to it, but I have just received a strange editorial decision from an ACS journal (J. Med. Chem.). The reviewers’ recommendations were:

1 – publish after major revisions [direct cost to author = $0]

2 – publish in JCIM [direct cost to author = $0]

3 – publish after major revisions [direct cost to author = $0]


Hardly a ringing endorsement of the paper, but the editor’s decision is a bit troubling, and I have to admit I didn’t even know it was something the ACS did.

Editor’s decision – transfer to ACS Omega [direct cost to author = $2000]

Is this just a feature of publishing that I need to get used to? Are ACS editors in any way incentivised (even indirectly, via internal promotion of these pay-to-publish journals) to transfer papers in this direction?

This feels like two very different models of what scientific publishing is for, shoe-horned into one organisation. And on this occasion it’s an organisation carrying the name of a learned society, which I think ought to favour one of those business models over the other.

I would be interested to know whether this is a common experience that I should just get used to, or whether it is something to be worried about. Am I missing something?



Science and Democracy; wake up and smell the brexit.

I have been intending to blog for a very long time about an article that I read last year concerning the public funding of science in the UK. Events of this summer reminded me of the article and crystallised some thoughts that I feel like getting off my chest. The article in question is by Prof. Athene Donald in the Guardian (I should confess a tenuous link to Prof. Donald in that she is currently head of the college I attended in Cambridge). The article is entitled “UK science is excelling, but are we funding the wrong projects?”

The tenor of the article is that “excellence”, as judged by the UK research councils, should be more important in deciding how funds are allocated than “geography” as championed by politicians. The politician in this case is George Osborne, former Chancellor of the Exchequer, not a man that I am particularly fond of. However, the attitude in the article is that a cosy bunch of scientists should be permitted to decide how public funds are spent and that scientists have been “bothered” because a democratically accountable politician has taken an interest in this spending. Indeed, Mr Osborne went so far as to be a very prominent and public campaigner for a particular institute and for science in the north-west of England more generally. I am afraid that attitudes like that expressed in the article, which seem to assume that scientists have some inalienable right to spend public funds and that the intrusion of democratic accountability into the process is a bad thing, are just one more contributor to the alienation of much of the population of the UK. Their response to scientists’ contribution to the brexit debate was a great big flick of the finger.

It is pretty clear that the research councils in the UK are a fig leaf to give a veneer of scientific excellence while avoiding the need for politicians to get involved in picking individual scientific projects. They provide an accountant’s sense that the UK taxpayer is getting value for money and in many senses, I am sure they are. While this seems reasonable for some of the funds, which will inevitably go towards smaller, more incremental pieces of research, the line in the article decrying the fact that the research councils are required “to make open-ended commitments to new Treasury-backed projects” is an outrage and an enormous missed opportunity. Taxpayers have every right to scrutinize how any public money is spent and to allocate it however they wish via democratically accountable politicians. To suggest otherwise is, at best, foolish. More importantly, this is a short-sighted, missed opportunity. We should be seeking political support to drive the prominence of science in the public consciousness; I would like the public to be encouraged to celebrate the great feats being undertaken by scientists in the UK and to see our politicians climbing over one another to champion the great science that we do in the UK. I fear that the system we have instead sees a closed group of self-congratulatory scientists supporting far too much work that the public would be unlikely to wish to fund and telling them that they must keep paying for the research but are not entitled to have a say. I think Charles I had much the same attitude and revisiting this article has made me sense the righteous indignation that must have driven the Parliamentarians that he faced. I will try and write something without reverting to references to the 17th century at some point soon…

Obviously the above represents my own personal views and not that of any of the institutions with which I may be affiliated.


Docking Screens for Novel Ligands Conferring New Biology

Recently, we discussed the paper “Docking Screens for Novel Ligands Conferring New Biology” by Irwin and Shoichet. The paper discusses improvements in docking methods and demonstrates how docking campaigns can find new ligand series and new chemistries that, in turn, can reveal new biology.

The paper includes an extensive list of available docking resources – web resources, compound libraries and docking software – together with brief descriptions of their applications.

The authors highlight the danger that often comes with using docking as a black box: there is a great chance of a “garbage in, garbage out” effect, where bad input results in bad output. Even a method that is (deceptively) as simple as docking requires a team of experts who can thoroughly analyse the results and come up with several different potential lead series from the whole output, rather than allowing the algorithm to pick the few best-scoring ones by itself.

Another point was made that is often overlooked: there is an overwhelming amount of literature describing “docking for docking’s sake” – campaigns that produce good in silico hits which are then never confirmed experimentally. The authors briefly discuss the link between docking and experimental artifacts and emphasise the need for orthogonal testing, which, in our opinion, is rarely seen in publications dealing with any computational method.

We think this is a great summary of the available docking resources, one that points out the pitfalls associated with the approach and should inspire users to approach docking in a more thoughtful manner. However, we do feel that the title, although appealing, does not do the paper justice: it covers much more than case studies where new chemotypes reveal new biology, and it is certainly a necessary read for anyone embarking on a docking campaign for the first time, as well as for experienced medicinal chemists.



Journal Club – A Real-World Perspective on Molecular Design

During our recent Journal Club, we discussed the paper “A Real-World Perspective on Molecular Design” by researchers from Roche. The paper presents 10 case studies in which computational methods were successfully applied to different research projects. The examples are diverse enough to cover a broad spectrum of possible approaches that one may take when doing a computational study. It provides a set of guidelines, a checklist of computational things you can do that are compatible with short project deadlines and that can provoke further thinking and generate new ideas. It also reminds users that although computational methods, by their nature, produce numerical values, these should be treated as qualitative input to an ongoing project. Somewhat surprisingly, all the studies were presented prospectively, which is rarely the case with published computational approaches – this in a way highlights their real-life, practical use.

However, it is not clear who the article is intended for: for method users/developers it lacks depth, while for experimentalists it might be overly simplified, giving the impression that these are simple approaches that work every time and can solve anything. But, as we well know, the road to success is paved with failure, and it is a shame these failures are rarely recorded in the public literature. The authors are users/developers of the methods, so we find it surprising that they consider further development of the methods of little value: “further improvements in computational methods may then have less to do with science than with good software engineering and interface design”. We think that science will benefit from improvements in computational methods and vice versa. The two are tightly interlinked: there is no good software engineering without good underlying science, and while good science can exist without good software, good software can surely help the science.

Unfortunately, they mention only cases in which the protein structures and ligand properties are well known. Earlier stages of molecular design lack that sort of information and thus may require different approaches, such as computationally more expensive tools that are not mentioned in this perspective.

We also found that some words are not always used appropriately. Looking at the following sentence – “Sulfonamides are molecular chimeras, which are found to form hydrogen bonds as well as interact with unipolar environments within proteins.” – it seems that rather than being hybrid, chimeric molecules, the sulfonamides described in the article are more of a Dr Jekyll and Mr Hyde type of molecule, which can be either one thing or the other rather than a bit of both at the same time. Quite what the authors really meant by “unipolar” is also slightly confusing; if they mean a single magnetic or electric pole, this seems unlikely to be a feature of many protein binding sites, at least when getting into the interesting detail!


Synthetic chemistry is for roundheads, medicinal chemistry should be for cavaliers

The last blog post on the subject of Roundheads and Cavaliers prompted me to reflect about how good I think the analogy is and whether it is useful. It eventually got me wondering about another, related subject that I have struggled to find the words to describe. This is the divergence between the mindsets of two sub-disciplines of chemistry: synthetic and medicinal chemistry. Although it may no longer show, my PhD training was in synthetic chemistry. One of the things that I enjoyed most during that period was the retrosynthesis challenge that would be set for the group. We would work in small groups to try and concoct a convincing synthetic approach to various fiendish molecules. These would then be presented to the rest of the group who would provide robust criticism of the proposed schemes – it was a terrifying undertaking for a first year graduate student but a satisfying thrill by the time I was finished. The great challenge was to have a synthesis that would get past the immense cumulative knowledge of a world-leading bunch of synthetic chemists. The whole session hinged on the transferability of reactions from one context to another. Those who knew the most detail from the literature would always “win” – and what is more, this was (almost always) because they could reasonably say that they knew what would work and what would not. Hence, their proposed syntheses could be better informed than everybody else’s and they could provide more informed critique of the other syntheses.

This is an appalling preparation for medicinal chemistry.

It has frequently shocked me to hear medicinal chemists pontificating about what will and will not work. It has too often felt like a delusion. Actually, that’s not quite right: in terms of stacking the odds in your favour, it is a pretty good idea to say that everything will not work. This makes medicinal chemistry ideal territory for self-satisfied Roundheads. But how unhelpful, how uninspiring. In an environment in which little is understood definitively, we require persistent sorts who can take the knocks of things not working as hoped, pick themselves up and do it all again. A corrosive presence that will decrease the prospects of success is the one who can only tell you why they think something will not work, or worse, the “told you so” sorts who don’t even make testable predictions. I fear that the puritanical Roundheads of synthetic chemistry are all too often in this category. Is this really the best training for the random bittersweetness of drug discovery? What sort of training can we provide that will bring more joie de vivre to the undertaking? Where do we find Cavaliers, and how do we train them?
