
Wednesday, April 22, 2015

When the description of methods in a scientific paper becomes optional.

I have just read a paper describing some very interesting tailoring of enzyme specificity in a P450 enzyme. I was, however, surprised to find that the experimental methods were not described in the paper itself, but only in the Supporting Information. Upon examining the journal's instructions for authors, I learned that, although the journal is online-only (and therefore free of space constraints), it enforces a 40,000-character limit on published papers and specifically states that the experimental section is optional. Traditionally, Supporting Information holds accessory data that would be cumbersome to include in the paper. In this journal, it functions instead as a cumbersome way to access vital information that should be part of the paper. I cannot even begin to understand why any reputable publisher would, in the absence of any printing costs, force authors to split their manuscripts and "demote" the potentially most useful portion of the paper to the Supporting Information.
That's ACS: proudly claiming to "[publish] the most compelling, important primary reports on research in chemistry and in allied fields" while making it difficult for readers to have access to that same information.

Saturday, February 7, 2015

On the wrong use of expressions such as "evolution's null hypothesis"

A new paper published in PNAS has been in the news lately, claiming to have found 2-billion-year-old fossils of sulfur-metabolizing bacteria indistinguishable from modern specimens. The abstract is somewhat cautious: "The marked similarity of microbial morphology, habitat, and organization of these fossil communities to their modern counterparts documents exceptionally slow (hypobradytelic) change that, if paralleled by their molecular biology, would evidence extreme evolutionary stasis." (emphasis added). In the press release and in their conversations with the media, however, the authors of this study have been much more forceful and hyperbolic: they directly claim that these organisms have not changed at all! As any microbiologist worth their salt will attest, it takes a lot more than morphological similarity to establish that two microbial communities are composed of the same species. Otherwise, metabolic tests with dozens of substrates would not be needed to distinguish microbial species: we would simply throw the little bugs under a microscope and see what they looked like! How could the authors possibly be sure, from their observations alone, that microbial adaptation to the environment had already matched that of modern bacteria by the time their sample fossilized?

More than this extraordinary leap of logic, what grated on me was the authors' claim that such a lack of evolution is in agreement with evolution's null hypothesis of no biological change in the absence of changes in the physico-chemical environment, and that it therefore strengthens the case for evolution... How is it possible to cram so many errors and inaccuracies into so few words? How could the peer reviewers let such inane nonsense appear in the title of the paper? Let us start to unravel the many mistakes in this formulation:

  • What the authors call "evolution's null hypothesis" has NOT (as far as I have been able to ascertain) ever been proposed as such: it has been well known, at least since the seminal work of Kimura, that the strongest driver of genetic variation is not the positive selection of advantageous mutations but the random fixation of neutral (or nearly neutral) mutations. Indeed, in humans only ca. 400 of the estimated 16,500 genes show strong evidence of positive selection, even though all of them differ from those of closely related species. Genomic stasis over a long period of time is therefore NOT at all the expected outcome (a back-of-the-envelope statement of Kimura's result is sketched after this list). Stating, as the authors do, that observing no change in these organisms is a confirmation of the mechanisms of evolution reveals a shocking lack of knowledge of molecular evolution. And the authors have not even proved that there was no change: that would require establishing that the fossil organisms' ATP-producing metabolism was as efficient as that of their modern counterparts, that they could use the same substrates, that they contained the same enzymes, and so on.
  • By claiming that an unchanging environment leads to an immutable species, the authors commit a further logical fallacy: after all, there was a time (let us call it t0) when the ancestor of that community first entered that unchanging environment. If an unchanging environment leads to evolutionary stasis, then the authors are claiming that the species at time t0 + 1 million years would be identical to that at time t0, and likewise at t0 + 2 million years, and so forth. But of course adaptation to an environment is not instantaneous, unless the founding ancestor already possessed all the enzymes needed to thrive there (which is most unlikely, as there had been no selective pressure for that). An unchanging environment therefore still exerts evolutionary pressure, at least on the first cells that venture into it.
  • When an observation is compatible with different theories, it cannot be used to further any of them: after all, seeing no change in 2 billion years could just as well be used to argue for the immutability of species. It is therefore logically fallacious to present it as proof that Darwin was right. Evolutionary theory has also developed a lot since the writing of "On the Origin of Species by Means of Natural Selection, or the Preservation of Favoured Races in the Struggle for Life". Shouldn't other workers, like Kimura, Felsenstein, Farris and Gould, be acknowledged?
  • Why would any scientist need to claim that "the findings therefore provide further scientific proof for Darwin's work"? Do astrophysicists need to state that "the findings therefore provide further scientific proof for heliocentrism" every time a new comet is found, its orbit is computed, and it is found to move around the Sun rather than around the Earth? Do anthropologists working in the Balkans need to point out that "the findings therefore provide further scientific proof that human societies do not all resemble hunter-gatherer groups"? The curious insistence of American-based media on framing biological discoveries as a debate/beauty contest between "evolution" and "creationism/immutability of species/intelligent design" is completely mind-boggling to any European, whether religious or not. The same insistence was on display in Neil deGrasse Tyson's "Cosmos", which, unlike Sagan's masterpiece, seemed more interested in scoring debate points against a sub-section of its domestic audience than in presenting the astounding amount of knowledge mankind has gathered in the few millennia since the dawn of agriculture.
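
For readers unfamiliar with Kimura's result mentioned in the first point above, here is a minimal back-of-the-envelope sketch, under the standard textbook assumptions of a diploid population of constant size N and a neutral mutation rate u per gene per generation (the symbols are mine, not the PNAS authors'):

    new neutral mutations entering the population each generation:   2N × u
    probability that any one neutral mutation eventually drifts to fixation:   1/(2N)
    long-term rate of neutral substitution:   k = (2N × u) × 1/(2N) = u

In other words, neutral molecular change accumulates at the mutation rate, entirely independently of what the environment is doing; a constant environment therefore gives no reason to expect molecular stasis.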



Claims unwarranted by the data, exaggeration and PR stunts: all of these are usually ascribed (rightly or not) to politicians, polemicists, salespeople and shady companies seeking to attract capital. Do we really want science to be tarred with the same brush?

Tuesday, July 22, 2014

Challenges of teaching Biochemistry to Health Sciences students

Had anyone told me, 20 years ago, that I would earn my living as a lecturer, I would have considered it a put-down. I did have a lot of respect and appreciation for (most of) my lecturers at the University of Porto, but I expected to become a full-time scientist, rather than a "lecturer who finds time to do some science in between classes and grading" or a "researcher with mandatory part-time lecturing duties". Real life disabused me of that expectation: owing to the dearth of other scientific jobs in Portugal, I did become a "lecturer who finds time to do some science in between classes and grading" after finishing my PhD.

The culture shock I experienced when I first lectured to Health Sciences students left me unable to speak about much more than the woes of teaching for the best part of a year. A big part of my surprise came from my first contact with students who attended my lectures simply because the University required them to, rather than because they recognized the subject as relevant background for their (mostly) vocational training as Physical Therapists, etc. Being required to take most classes in order to graduate (rather than choosing a large part of the curriculum around a core subset) is a very common feature of university curricula in Portugal. In principle, it is meant to ensure that all students have a balanced curriculum and do not "flee" the hardest subjects. In practice, it also tends to lead to ever larger classes in those same "hard subjects", since students tend to regard such lectures as bureaucratic hurdles thrown at them rather than as valuable knowledge, and therefore feel disengaged and alienated, and fail them in large numbers.

All the classes I have taught (Biochemistry, Organic/General/Analytical Chemistry, Basic Mathematics/Statistics) fall into the "hard subjects" category. In my first years of lecturing, I had a most demotivated cohort of students. My expectations regarding their performance were generally very unrealistic, as my baseline comparison was my own student experience at my alma mater, where I was surrounded by engaged peers who were motivated to learn pure scientific subjects and did not regard them as "filler" or as "bureaucratic hurdles" aimed at winnowing the student body. Moreover, my alma mater, the Faculty of Sciences at the University of Porto, was famous among students for its harsh grading: attrition was relatively high, fewer than 20% of those graduating from its Chemistry or Biochemistry curricula had an average grade of 16/20 or 17/20, and higher final grades were virtually unheard of. In other Portuguese universities, final average grades of 18/20 were common, even though their student bodies were of the same (or even slightly lower) quality, as judged by their entry grades.

I eventually adapted to the students' expectations, and developed a teaching method that engages students and apparently motivates them (judging from the appreciative comments in teacher evaluation forms). However, this only seems to work during class time: students pay attention, seem to make all the right connections (as long as I gently nudge them towards the right path), congratulate me on the quality of the lectures, etc. In tests, quizzes and exams, however, a strong disconnect appears: ca. 50% of my students still struggle with many concepts that I would consider absolutely basic. Why does this happen?

I have just found out that there is a proper name for what is happening in my classes: pseudoteaching (defined as "The concept [...] that even the most outwardly perfect lesson can result in students not actually getting what it is you wanted them to understand"). Along with it come pseudolearning ("going through the 'expected' steps without extracting a solid, working understanding of a topic") and pseudostudying (which I would define as "reading and working through the material to the point where one feels tired, but without actually taking anything from the exercise due to an inability to distill the core concepts into working knowledge"). I cannot prevent students from pseudolearning or pseudostudying (apart from exhorting them to rest properly, keep their blood sugar levels up while studying, and work/study in short daily bursts rather than pulling all-nighters on the eve of the tests). Avoiding pseudoteaching is in my power, but I do not (yet) know how: Jan Jensen (following Mazur) advocates a flipped-classroom model where exposition occurs outside class time (using short video lectures and key exercises), followed by solving exercises in class with free exchange of ideas among students (peer instruction). I do not think this method would help with my students, though: a previous experiment with short (< 10 minutes) in-class quizzes led to class disruption, acutely stressed students during and after the quiz, and minimal improvement in weekly off-class engagement with the study material. What would you suggest I do?

Thursday, January 16, 2014

Moving towards Open Access...

In physics and mathematics, posting preprints on the arXiv is the most common way of distributing scientific papers. All the major journals in those areas have therefore been "forced" to accept papers previously made available as preprints.
In Chemistry and Biology, however, most journals do not accept previously preprinted work, and authors are therefore quite loath to make their work available as preprints. The lack of this "free preprint" culture enables journals to keep increasing their subscription prices well above inflation, which in turn gives publishers an extra incentive to keep refusing sound work that might otherwise be available as costless preprints. This is a classic instance of Catch-22.
I believe that, as authors, we should do our utmost to fight this status quo. Our science should be evaluated on its merits, rather than on the accidental name of the journal where it happens to appear. Therefore, I will henceforth submit all my Biochemistry work to PeerJ / PeerJ PrePrints. PeerJ is an innovative and remarkably inexpensive Open Access publisher with transparent peer review and the option of publishing a paper's reviews alongside the manuscript. The integrity of the reviewing process is therefore above reproach, ensuring that it is both rigorous and fair.
PeerJ does not (yet?) accept submissions outside the field of Biology, so my Chemistry work must continue to be submitted elsewhere. I am thinking of giving the Beilstein Journal of Organic Chemistry a shot: completely free, open access, and rigorous. It does not have a stellar IF (around 2.8, I think), but who cares? Playing the IF game is ultimately detrimental to quick publication, as several journals insist on publishing only the "extra-sexy" work to prevent their IFs from falling, and often even refuse to send manuscripts out for review simply because some editor feels they are not "hot" enough (ACS, I am talking to you...).

The power to change is, after all, in our hands. It may be a very small amount of power, and the odds of effecting any change may be vanishingly small, but if we do not use it, nothing will change for sure.

Thursday, June 20, 2013

Science by press release

I woke up today to the news that researchers at the University of Aveiro had, "for the first time", altered the translational apparatus of an organism. I was outraged by the news: not by the science itself, but by the mindless hype surrounding it: such a modification had in fact already been performed in 2011 in C. elegans. I first thought that the "first time evah" pitch had been added by ignorant journalists, but the hype was already present in the press release from Univ. Aveiro!
The research publicized today is good and interesting, no doubt about that, but the quest for "good press" should never come at the expense of the truth. There is no excuse for that. Every bit of "good press" achieved through hype and exaggeration unfairly benefits those institutions and/or researchers with no moral qualms, leaving researchers who are honest enough not to misrepresent their results at a disadvantage.

I've always disliked "science by press release", because (all other things being equal) it disproportionately benefits those who have access to the mass media, or who can afford publicists. Hyped press releases are even worse. This can only end when science journalists stop relying on press releases to decide what is newsworthy, though I strongly suspect that day will not come in the next 5 × 10^9 years.


Addendum: Previous reports all reassigned a STOP codon to an unnatural amino acid. The report from Univ. Aveiro is indeed the first instance of a non-STOP codon being reassigned in an organism. This distinction is unfortunately absent from the press release. I still stand by all the other points in my post.

Thursday, September 29, 2011

Dividing research into very small chunks...

Research productivity is most often measured by people who lack the ability to distinguish good papers from bad ones. Such measurements therefore tend to devolve into mechanical algorithms that count the number of publications and the impact factor of the journals where the research was published, rather than sensible arguments about the merits (or demerits) of the researcher. Evaluating a researcher thus becomes a "numbers game", where a researcher with a larger number of small papers easily outranks another with a smaller number of longer, more complex publications. The race to the "smallest publishable unit" increases the number of papers (arguably "good" for the researcher who needs a "good" evaluation), but makes following the literature more difficult, as one has to keep track of ever-increasing numbers of papers of dwindling individual importance. It also detracts from the value of the research being reported: in my example today, two papers report computations on very similar compounds; the only difference is the interchange of a nitrogen atom with a phosphorus atom.
A single paper would have been much more useful and important, but research managers would count that as less productive :-(
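
To make the incentive concrete, here is a toy sketch in Python (entirely hypothetical; it does not reproduce any real agency's formula, and the impact factors are made up) of the kind of mechanical, IF-weighted paper counting I am describing, showing why splitting one substantial study into two thin papers "wins" the numbers game:

    # Toy illustration of an IF-weighted paper count (hypothetical formula and numbers).
    def productivity_score(papers):
        # Each paper is a (title, journal_impact_factor) pair; the "score" simply
        # sums the impact factors of the journals the papers appeared in.
        return sum(impact_factor for _, impact_factor in papers)

    # One combined study in a mid-tier journal...
    combined = [("N and P analogues, full study", 4.0)]
    # ...versus the same work split into two minimal papers in lesser journals.
    split = [("N analogue only", 2.5), ("P analogue only", 2.5)]

    print(productivity_score(combined))  # 4.0
    print(productivity_score(split))     # 5.0 -- the split strategy scores higher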


PS: I happen to disagree strongly with the suggestion, in these papers, that intramolecular H-bonding exists, as the angles involved are too small for H-bonds.