Deposit full protocols early!

Disclaimer: I am an ambassador of a protocol-sharing website. Yes, I got a $50 Amazon voucher from them at the end of last year, with which I bought some books for my son Saul. I think pre-prints are awesome but have some caveats. First and foremost, I love free stuff accessible to everyone.


I am currently working on long-read genome assemblies of fungi and plants. All successful long-read genome projects start with months of optimizing DNA extraction methods. This can often involve phenol-chloroform extractions for weeks on end with slight alterations of the same protocols. Mine took about 9+ months to optimize before I finally got some good data back. I was lucky, as I got three protocols to start with from colleagues around the globe. Looking into the scientific literature was no help at all. The one I built on was from INRA and put online by Jason Stajich here, which I only found because I asked Andrii Gryganskyi for help. So overall I didn’t reinvent the wheel but simply built on what others had already created, which is pretty much what 99%+ of all science really is. Notwithstanding, I knew several other groups struggled with the same problem, so I put my working protocol online and publicized it on the 18th of April 2016.

Now, on the 31st of August 2017, we published pretty much the same protocol as a peer-reviewed methods chapter in a book a friend of mine edited.

I would like to celebrate this occasion by sharing some thoughts about sharing methods openly and early. Overall I obviously think it is advantageous for you and everyone else in science to do so. This is in contrast to letting one’s protocols rot on one’s hard disk and only letting them see the light of day once someone offers you a co-authorship.

In semi-random order:

Instantaneous availability

The most obvious one first. It took 17 months to publish pretty much the exact same protocol I already knew was working when I put it up online. That’s nearly half my son’s life. In the meantime the protocol got 2375 views, I got 10+ emails or messaging requests to explain different parts of the protocol better, and some ‘thank yous’ at conferences as others got it to work for their favorite species as well. Overall, putting it up online early enabled others to discover more in less time instead of reinventing the wheel of DNA extraction protocols.

I know some people would not use non-peer-reviewed protocols that they find online (discussion here and here). I personally think it is an illusion to believe that methods get properly reviewed in papers. I only ONCE got a comment on a small technical detail in my methods section across all my 34 publications. Similarly, I’ve hardly ever (never?) seen another reviewer ask for details or even comment on the methods section.

I knew the protocol worked for me, and even the closest peer review would not have changed it.

Discoverability and ease of use

Everyone loves detailed, easily accessible protocols. EVERYONE! Everyone despises short methods sections in the supplement that provide no detail and cite another paper that cites another paper that cites another paper that cites another paper that supposedly did the experiment the exact way the authors did it.

So putting your full detailed protocol on your lab webpage is good (see this awesome long-read DNA extraction protocol), yet putting it up on a publicly accessible and searchable protocol platform is even better. You get a DOI, it has versioning, people can fork it, people can comment on it, people can ask you directly, it is backed up and available forever, and you can even add it to collections/user groups so even more people can discover it. Everyone is happy using detailed protocols they can interact with and cite. Everyone!

Catering to both systems

Of course some people get nervous and fear they cannot publish their methods in esteemed journals anymore. Of course not everyone goes online to novel platforms to hunt down the latest protocol. Many prefer to search the peer-reviewed literature. And that is really the great thing about depositing full methods early: you can do both. You can share your protocol to help others quickly and get the bonus points of a journal publication. That is just what I did! It might well be that more people will cite one’s publication because they already used one’s protocol before it was officially published in a peer-reviewed journal. That is the case for preprints, and future studies will show whether we can observe similar effects for protocols. For sure I will cite both the online protocol and my book chapter in my next genome manuscript preprint.

Reproducibility and rigor

There is a lot of talk about the lack of reproducibility in the life sciences. Much of it is focused on depositing code online. Not much is focused on the actual data generation. Methods sections are often poor, and most supervisors don’t care, a la ‘just write something, no one reads it anyway’. Yet every good project starts with good data, and good data starts with a solid, detailed protocol of how things are done and were done. This not only helps others to reproduce one’s work but also oneself and one’s lab. Writing down great protocols often makes one realize the fine-grained and important details. Sharing those protocols publicly is even better, as it accelerates discoveries, saves money, and makes science a better place for everyone. On top of that, those of us in the Western world sitting at elite universities often forget how hard it is to access high-quality protocols. One only needs to look at ResearchGate to see what I mean.

Come join us, make everyone happy, and share your detailed protocols online early. Imagine a world where you just have to search for a detailed protocol and you find one pretty much immediately. Just like cooking recipes.

Dedicated protocol platforms like bio-protocols are a great starting point.



Today I reached my a-index

Today I finally reached my a-index. Yes, I just made this up to celebrate my 34 accepted publications at the age of 34.

No, the a doesn’t stand for a**hole but for age. The a-index is achieved once you have reached the same number of publications as your age. The convenient fact about this amazing, newly made-up index is that it only requires one publication per year once you have reached it.
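The rule is simple enough to state as arithmetic. A playful sketch in Python (the function names are, of course, as made up as the index itself):

```python
def has_a_index(publications: int, age: int) -> bool:
    """The a-index is reached once one's publication count
    has caught up with one's age (publications >= age)."""
    return publications >= age


def keeps_a_index(publications: int, age: int, years_ahead: int,
                  papers_per_year: int = 1) -> bool:
    """Once reached, one publication per year is enough to keep it,
    since both counters then grow at the same rate."""
    return has_a_index(publications + papers_per_year * years_ahead,
                       age + years_ahead)


print(has_a_index(publications=34, age=34))   # True
print(keeps_a_index(34, 34, years_ahead=10))  # True: one paper/year suffices
```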

So I guess I can take it easy from here on and focus on getting some science done instead of hunting yet another index.

The #FOR community is named Science Careers People of the Year

I am ecstatic. For real, this is amazing! An awesome achievement. We, the Future of Research community, were named Science Careers People of the Year!

Oh yeah! This feels SOOOOO good. All of us put tonnes of unpaid volunteer hours into FOR and other important projects aimed at improving the academic enterprise. Science Careers put it nicely:

“Each December, Science Careers names a “Person of the Year” to recognize those who have made especially significant and sustained contributions to improving the lot of early-career scientists. Up to now, each year’s honoree has been a senior academic figure occupying a prestigious post. This year’s choice, by contrast, is plural; of a different generation; and at the opposite end of the academic status ladder. For their dedicated, creative, and expanding efforts to empower early-career and aspiring scientists with knowledge and awareness so that they can take control of their futures and help bring needed change to the scientific enterprise, we are delighted to name the activists of the Future of Research (FOR) movement as the 2015 Science Careers People of the Year.”

It is great to see an actual community effort by young, upcoming early-career academics rewarded by a world-renowned science staple. Sometimes it simply feels good to get some back-patting from the elite. Clearly, FOR got people active and got the bigger academic community talking about issues important to the academic enterprise (e.g. diversity). We also create community by bringing together people with the same interest in improving our own and other people’s lives. I see this award not only as a recognition of past achievements but as support for our future aspirations. Folks believe that change is possible and that there is an opening in history to make things happen. Hopefully FOR will become a staple itself throughout our careers and I will meet these folks again.

Thanks to each of the countless people that made this happen.

On a personal note, this really speaks to my heart and soul, because the award does not single out an individual established person but a community. Science progresses only by working together; this is true within the lab and outside it. This recognition is soothing to my soul. Like many others I have met over the years [15+], I have been volunteering for all sorts of community actions within and outside of academia for quite a while. Many of us spend countless hours on planes, on hangouts, in meetings… organizing events that are not really accounted for in the status quo career progression pathways. Yet we do it because we believe in it. We think it makes our communities a better place. While seeing things we created happen and communities form, we often get wet handshakes and some muttered ‘well done’. So having the elite say “Excellent job!” loud and in the open is nice!

Anyways, I am sure many of us would keep on doing our projects nonetheless. Yet recognition goes a long long long way!


P.S.: This early-mid career conference at the Australian National University is our next local project spanning the globe.

Discovery and Redemption emerge from a scientific mistake

///This is a slightly delayed post as it was lost in the labyrinth of popular science glamor mag editor desks. We are finally moving on and decided to self publish instead.///


Benjamin Schwessinger, Ofir Bahar, Anna Joe, Mawsheng Chern, and Rory Pruitt

Internal editor:

Pamela Ronald


We were confused and perplexed. Our team in the laboratory of Pamela Ronald at the University of California, Davis could not reproduce previously published results. Instead of building on the reported discovery that a microbial signal triggered the plant innate immune response, we were thrust into trying to figure out what went wrong. It was the most difficult time in our careers. Then, we made a remarkable new discovery.

The story began in the 1970s, when Professor Gurdev Khush and colleagues demonstrated that a wild species of rice was immune to most strains of a serious bacterial disease. This was exciting because such broad-spectrum resistance had not previously been identified. Such traits are agronomically important because farmers can plant resistant varieties rather than spraying pesticides. In 1995, Pam’s lab reported that the broad spectrum resistance was controlled by a single gene, called Xa21, predicted to encode a receptor that senses the microbe and then activates the rice immune response.

In 2009, Pam and her former team members reported the discovery of such a molecule. However, in 2012, while trying to build on those findings, we discovered major errors in this work. Pam contacted the editors to inform them of the identified problems. She also decided to notify the scientific community of the issues at an international research symposium. Ben still remembers the day when Pam announced the errors in front of so many distinguished colleagues, many of whom Pam had known her entire career. “It was remarkable that Pam mustered all this courage to inform the community at this top research symposium. Some in the audience buried their heads in their hands and were clearly uncomfortable but still more people were awed and expressed their support.” Sophien Kamoun, a leader in the field of plant biology, displayed his respect in a tweet:

Finally, in October 2013, when we had accumulated sufficient new experimental evidence definitively proving that key aspects of the study were incorrect, Pam and her former colleagues who co-authored the 2009 study retracted the original Science paper.

Pam and several of us in the lab lost many nights of sleep as we racked our brains to figure out what went wrong. We had many intense discussions on how and why this could have happened. As Ofir remarked, “Pam was on sabbatical in France when we discovered the problem, and so we had these electrifying video conference calls to get everyone on the same page.” It took time and persistence to break down the problems one at a time as we worked backwards, identifying which results were solid and could be built on and which could not. One day Ofir and Ben were having their morning coffee outside in the Californian sun when Ofir announced that he had found that some of the bacterial strains used in the previous study were mixed up. Rory still remembers this day vividly: “Even though this was not happy news, we were so happy; we finally had a partial explanation of what went wrong.” From that day onwards we were getting back to solid ground. The scientific process had pulled us through and pointed us in the right direction. Our team discussed the mistakes and corrections in lectures, blog posts, and a scientific publication. The process by which we addressed the problems was highlighted in an article in Nature magazine and on Retraction Watch, a blog that reports on retractions of scientific papers, as “doing the right thing”. In a dubious claim to fame, the 2009 retraction was included in the top 10 retractions of 2013.

While we were notifying our colleagues, discarding old strains, and correcting the literature – an essential part of science so that others do not waste (more) time trying to build on fatally flawed work – we were also working hard to discover the microbial partner of XA21. In the midst of all the fog, Rory identified a new bacterial mutant strain that was able to infect XA21 rice plants. We were excited but cautious at the same time. Could this really be the correct mutant lacking the microbial partner of XA21? The 15+ year lab veteran Mawsheng was especially skeptical, as he had lived through the whole story from the beginning. He often suggested critical controls and reminded us to be extra critical of our own results. A superb team of collaborators from around the world allowed us to incorporate additional experiments and independent controls performed in former critics’ laboratories. After many independent tests we were certain of our results. In mid-2015, we reported the identification of this long-sought-after molecule that activates XA21-mediated immunity in Science Advances.

We named this small microbial protein RaxX. We found that bacteria that lack RaxX are able to evade detection by the rice XA21 immune system. Bacterial strains found in farmers’ fields in India, which express alternate versions of RaxX, can cause disease on XA21 rice plants.

Wrestling with trying to reproduce flawed experiments and discovering the new molecule in rapid succession was an enormous challenge. In our view, the key to straightening out such a complicated and delicate situation was the persistence and collaboration of our laboratory team. Pam worked hard to keep the team together. We were a dedicated crew of senior scientists, technicians, postdocs, and graduate students (as well as former team members who shared their records with us because they also wanted to know where they had made mistakes) motivated by a supervisor determined to get to the bottom of the situation. Looking back, we find it stunning how this all worked out. “It’s amazing what can happen if you are doing science with a set of respectful and like-minded people. We were focused on setting the record straight rather than blaming others. It clicked. We were on a run”, Ofir still remembers to this day. We felt our careers were on the line. We realized that there was no moving forward without first going backwards. We did not give up, even though at times we wished we could. Working together on this daunting challenge buoyed all of our spirits. “Back then, I [Ben] was living in this lovely trailer with a trellis outside. We had ‘poker-nights’, when everyone came along to talk and be jolly. At times it felt like ‘group therapy’ with drinks.”

We were impressed by the supportive and kind response from the scientific community, our editors, and funding agencies including NIH, BSF, and HFSP. We received many letters of encouragement – even from complete strangers. These conversations helped keep us going. We even had new scientists join Pam’s lab in the midst of the mess. Anna Joe, a postdoc who joined the lab at the time of the retraction, remarked, “Their open mind and transparency attracted me to the lab.”

Pam tells us that there are still hills to climb, “Some scientists may be extra skeptical of results from my lab for a long time to come.” For example, in a critique of our submission, one reviewer asked, “how do we know the strains weren’t mixed up again this time?”

We all know errors are part of the process of scientific discovery (although most of us don’t see ourselves as making these mistakes). The key is to track them down as quickly and efficiently as possible, ideally before publication. To this end Pam has improved and instituted new laboratory practices: generating duplicate stocks of key strains (validated and maintained by the lab manager), mandating electronic notebooks for each lab member, and requiring that all new assays be validated by three independent researchers before publication.

After all this stress and uncertainty, it is still invigorating when a colleague comes by after one of our talks and says, “It is nice to see how you handled and communicated all the errors, corrections, retractions, recoveries, and discoveries. These stories are so important to tell, as they are part of doing science.”

The new finding of RaxX as activator of rice immunity has opened many new research directions. We all are excited to figure out the biological function of this potent novel molecule. We are moving on.


The authors would like to thank Pamela Ronald for critical reading and constructive comments on this manuscript.

Additional reading:

Retraction Watch, What do you do after painful retractions?

Nature News, Rice researchers redress retraction

The Tree of Life, A Phoenix Rises from the Ashes



The true culture of STEM inclusivity

Disclaimer: This is a guest post by a good friend of mine, Shaila Kotadia. I fully endorse this post as it raises many important issues in academia.

As we talk about how to restructure the funding of STEM fields to make them sustainable, the word diversity often arises. Almost everyone states that this is part of [insert name here]’s mission, but how actively is academia really supporting this notion? In particular, over the past several decades there has been little change in the makeup of professor positions.

I recently attended a conference organized by a scholars program for undergraduates. This program has existed for over 20 years, is successful at leveling the playing field for students who are overlooked in and often drop out of STEM majors, and is now evaluating which parts of the program produce this positive outcome. The morning consisted of a series of excellent speakers who had conducted evaluations of programs targeted at underrepresented undergraduates to determine their effectiveness. One study, conducted by Mica Estrada at UC San Francisco, evaluated a national panel of minority science students for their feelings on self-efficacy, scientific identity, and how their values aligned with scientists’ to determine scientific integration (Estrada, et al., J Educ Psychol, 2011). This raised important questions in my mind. Are we selecting for a self-perpetuating personality type that thinks about and approaches problems very similarly regardless of background? Or are we molding these students through formal and informal programming to fit the model of an academician, losing their diverse perspectives in the process?

A town hall discussion concluded the event, where we were challenged to share thoughts that called out the academic system and its support for underrepresented students. This is when the conversation got really interesting. Comments on the culture of STEM shifting with the influx of Ph.D.s, and on the lack of associate professor positions, suggested that we need to adapt. Additionally, it was suggested that the little diversity in these positions might be due to individuals from some backgrounds simply not desiring them. I felt the need to chime in. We are encouraging these students to pursue Ph.D.s with the ultimate goal of retaining them in the STEM fields, with a heavy emphasis on faculty positions. But why would we do that to them? Faculty positions suck now, in my honest opinion. And as these students progress through the different stages of academia with programmatic help, are we selecting for the same types of personalities or molding these students into what faculty positions require? How is that helping anyone? We may end up increasing diversity “by the numbers”, but in the end we are losing the real benefit: diversity of thought. One interesting way to put it is that we need more cognitive diversity (Valantine and Collins, PNAS, 2015).

When I think of diverse populations and those that are underrepresented in STEM that I have worked with or mentored, their values do not align well with what academia values. Those that continue onto faculty positions may feel like they lose parts of their identity and the need to conform in order to be promoted in their career (McGee and Kazembe, Race Ethnicity and Education, 2015). That is going to continuously result in a lack of diversity, whether it be of thought or people leaving the field, despite all of the early programmatic measures.

Additionally, while unconscious bias training will likely decrease microaggressions towards others and might eliminate gender bias in hiring (Carson, TechRepublic, 2015), which is completely necessary and should be required, it appears as though we still select candidates from the same top schools that tend to attract similar personalities (Clauset, et al., Science Advances, 2015). In the end, the change in overall culture will be gradual, but hiring will only become more diverse with a cultural change that people must be willing to enact in the long run.

I do acknowledge that top-notch research is conducted at the highly ranked institutions, but I also want to acknowledge that ground-breaking research happens across all institutions. One can even argue that the same schools that postdocs are being hired from cannot employ all of them, which means that these researchers are being attracted to other programs that might offer a more welcoming environment. And while there is no doubt that the people being hired are incredibly talented, there might well be not a single perfect match for each position but multiple. Hence many brilliant and creative candidates might go home empty-handed simply due to chance.

So, I propose a different approach. Let’s change what we value when hiring faculty. Let’s change the academic environment. Not just the funding structure or implementing training for various STEM career paths or making more permanent research positions. All that is good and well but I predict it won’t increase diversity, cognitive, gender, or racial, at the faculty level. What we need to change is the way we select who continues onto academic positions. Why not use the same “pluck” criteria, or the desire to create change in a spirited or daring manner, and holistic approach, considering more than the candidate’s pedigree and publication record, that many colleges and universities use to admit undergraduates to select faculty? You might be surprised by the results.

More reading: Gibbs and Griffin, CBE LSE, 2013
Warner and Clauset, Slate, 2015

///Edited for three references 28/11/2015///

The Pickpocketing of Postdocs

These are the times when folks all over the USA and beyond are concerned about the current state of academia. Early-career researchers and allies are coming together to work towards improving academia and their situation within it. The Future of Research conference in Boston kicked it all off for real, and now we are seeing it snowball from New York (NYU) and UW-Madison to the Bay (UCSF) and Chicago… hopefully many more will follow. Similar debates are being held in other countries such as India and Australia. Even the glamor mags (here and here) agree that something needs to be done about the current dilemma. The NIH is also thinking about fundamentally changing its granting scheme. And of course there was Tim Hunt; no matter which side you are on, everyone seems to agree that there are problems with sexism and lack of diversity within the Ivory Tower.

In the middle of all this falls the current UC postdoc union (UAW5810) bargaining campaign with the University of California. The content of the second contract will define the working and living conditions of over 6,000 postdocs in California for the next several years. In addition, this new contract will set standards nationwide. So it appears the time is ripe to actually put your money where your mouth is and support early-career researchers by creating progressive work environments for everyone. Work environments in which people from all walks of life can flourish and express their academic creativity.

Well, you would have thought so, but these are the demands UC came up with. For sure they are not progressive; for sure they will not foster diversity.

Article 3 Benefits

UC wants to restrict the current benefits plan and increase the postdoctoral scholars’ share of the cost. In the absence of increased pay, postdocs would take an effective pay cut for worse benefits.

 Article 4 Compensation


UC wants to change the compensation scheme, including introducing salary caps, excluding Paid Directs, and more… WhatWhatWhat, postdocs shall get salary caps? Show me a rich postdoc and I’ll show you a unicorn. Each and every time I read this line I cannot stop laughing. Seriously, an institution that pays its leadership and administration mid-to-high six-figure paychecks wants to restrict pay for ‘lowly’ postdocs. The starting salary of postdocs at UC is about $43,000 per year. Seriously? The most frequent case in which a postdoc makes more than the minimum salary is when she wins her own fellowship. In this case the funding agency provides the salary. Why restrict this when it isn’t UC’s money anyway? Why punish the successful and independent? The only reason I can come up with is that other postdocs might see they are getting paid jack shit and demand more. The actual postdoc salary is less than the living wage for a family at 9 out of 10 UC campuses!

 Article 11 Layoff


UC wants to be able to lay postdocs off more quickly. Well, short one-year contracts are apparently not enough; let’s make the postdoc experience even more precarious and insecure. A postdoc might get too lazy otherwise and sit in the sun the whole day long.

Article 17 Personal Time Off and Article 22 Sick Leave


UC wants to cut postdocs’ sick days and personal time off. Of course, postdocs are too lazy and take too many holidays. They are not productive enough and should work more. Never mind stress and mental health. Also, if a postdoc gets sick and shows some sign of ‘lack of work’, UC can simply kick them out under the proposed changes to the layoff rules (Article 11).

New Article


UC wants to practically do away with the minimum contract length of one year and introduce a one-year probationary period instead. Of course, if a postdoc’s boss doesn’t like her, bad luck: he lays her off, and after three months she and her family can go back to their home country, having just invested thousands of dollars to live the Californian dream. Visas are mostly linked to a postdoc’s employment status in the US, so a postdoc depends practically on the supervisor’s goodwill to stay in the country. From personal experience, I’ve already seen too many postdocs who didn’t dare to speak up against discriminatory actions; this would make their lives even worse. Even just perceiving the threat of contract termination and mandatory removal from a foreign country is very unpleasant, as everyone who has experienced it knows.

Instead of being progressive and implementing measures to increase the diversity of the postdoctoral workforce and reduce the ‘leaky pipeline’, UC’s demands are simply regressive. UC demands effective pay cuts for postdocs and offers worse benefits, fewer rights, and less job security. All this at a time when early-career researchers are stretched to the limit and positive messages are scarce. I hope UC comes around. UC is a great place to work, has a vibrant postdoctoral culture, and has loads to offer to foster a postdoc’s career. Making this experience of a lifetime more accessible for everyone by improving childcare support, parental leave conditions, and fair treatment of all postdocs is what UC really should be aspiring to.

Poverty outside the Ivory Tower

At the beginning of 2014, Tenure, She Wrote published a great post about “Poverty inside the Ivory Tower”. I recommend everyone go read it and appreciate how difficult it is to be poor and part of academia.

It has always bothered me how many of my colleagues look down on poor people and ‘bad neighborhoods’. How can you be so dismissive and disrespectful to your fellow brothers and sisters only because they are poor and need help? In the Bay Area I met many academics who bad-mouthed the Tenderloin in SF or West Oakland. Sometimes I wondered whether these academics were actually talking about human beings or some kind of more primitive specimen. It hurts hearing these derogatory words about your neighbors, fellows, and friends. I bet many never talk with anyone living in these neighborhoods.

So here is a small story to tell you how great the folks living in these neighborhoods are and what a sense of community they embody.

We recently moved from West Oakland to Australia. The last couple of days of our move we were giving away goods we couldn’t move and didn’t need. This included plenty of food, bags of old spare change, towels, and other small household items. Larry and other (homeless) locals would stop by and ask if we have something else to give away. Everyone was happy to be treated with respect and get a little help to get by. Some of the locals greeted my partner and wife at the local store with ‘….thanks for feeding us all’. Great feeling to actually help. So all went well till the last day. My little family already left and I had to hand over the house. Early morning I wanted to go relax a bit going to the gym before a hectic day. Well this didn’t work out as planned. While getting ready I found that two of my bags got stollen that night one including my laptop, missing half of its keyboard, some paperwork and hiking gear for my last California trip with friends of the lab. Of course you can imagine I was pretty pissed and annoyed. How could someone steal from us when we were sharing so much? Off doing several necessary phone calls. At work using a shared computer I was getting the idea to walk through the (neighbor)hood and talk with the locals to get my stuff back. I made some handwritten leaflets with a short description and a phone number. A one-hour walk through the hood ensued and I talked with Larry, the homeless at our playground and half a dozen other folks. I explained everyone that it was annoying that folks stole from us while we gave away plenty. Everyone assured me to get in touch if they found something. Of course I wasn’t really hopeful. But hey was I in for a surprise. Later the day one of the locals gave me a call that he located one of my bags. After having it picked up. He gave me another call that night that he found some more of my stuff. The next morning the guy from our playground called me that he found a bag and inquired if it was mine. 
Two days later the first guy called again, telling me he had located nearly all the remaining stuff. Each and every one of those who returned something actually needed the goods (e.g. shoes, clothes, a bag) more than I did. They could have kept it all, pawned it, or sold it on. They are really in need. But no, the sense of community and respect had bigger value for them. People actually care about you if you show them respect.

So in these situations, when my fellow academics badmouth poor people, I hardly ever find the courage to speak up. In most of these situations people nod and agree. Folks don’t appear to appreciate their own privilege of color, race, class and/or gender, focusing instead on their own hardships and achievements. But hey, everyone has fought their own struggles, and most people never had the opportunities we had.

Respect your brothers and sisters, cause we are actually all in this together.

Preprint at work and at work for preprints

A couple of weeks ago I wrote about our experience of getting rejected during the review process, at the R1-R2 stage, for novelty reasons. Another group had submitted and published a competing manuscript after our initial submission. This all happened even though we had posted a preprint of our manuscript on bioRxiv several months prior. Our story ended well: after some noise the editors reconsidered and invited us to resubmit. You can read the whole story here.

A couple of months have passed since the graceful turnaround by PLoS. We were able to address the remaining reviewers’ concerns. So now our manuscript is improved and finally fully approved by the peer review process. You can read the accepted version of the manuscript here, all previous versions here, and a short media snippet here. ///In case you are interested in improving crops by intergenus transfer of immune receptors, I recommend these papers in wheat, rice and Arabidopsis.///

…this whole story was only possible with PLoS, which actually cares about the community it serves. Imagine an AAAS or Nature group journal; I bet they would have simply swept us aside.

Enough of the background story….Was it worth all the hassle? Do preprints work in my field (of plant biology)? Would I post a preprint of my work again?

Yes, it was worth all the hassle. I truly think we were treated unfairly, as it simply isn’t good manners to reject a manuscript for novelty reasons alone during the review process. The journal already engaged you and asked for more, and now it simply drops you like a hot potato?!? No one wants a dirty potato off the floor! Our preprint simply made our case stronger and public.

Yes, it was worth all the hassle. Our preprint got over 1500 visits and 500 PDF downloads before publication with PLoS. Doesn’t sound like much to you? Well, it is far more than the none it would have gotten while hidden away in the paper pile of a busy reviewer.

I think it is (very) early days for preprints in plant biology. Several (senior) PIs told me that it doesn’t work, it isn’t good for your career, it hurts early career researchers, the field is too competitive… well, that’s their opinion, not mine. Others have written good posts on why to support preprints (here, here, here, here), so I won’t repeat all that jazz here (yet see below). The plant biology community in particular can only profit from more openness, including preprints, post-publication peer review (PPPR), public peer review (PPR), and similar new ways of improving the peer review/publication process. I tend to wonder if many of the high-profile retractions (here and here) and falsification accusations (here and here) could have been avoided in a more transparent culture. Impossible to tell in hindsight, yet well worth striving for in the near future. More transparency and openness is really the only way forward…

Yes, of course! Every time I have the chance I will post my manuscript on a preprint server at the time of submission. I deem it ready to go public, so there is no reason to hold back! ///I even hope to get feedback on a preprint someday./// This decision of course also depends on my co-authors and our journal choice. ///See (not fully up-to-date) policies here. And no, in my experience AAAS does not support preprints./// Also, the American Society of Plant Biologists (ASPB) wasn’t really supportive of preprints until very recently. Thankfully this has changed due to the lobbying efforts of several of its members (including myself), as you can read here and here. ASPB hasn’t publicly announced this change yet; however, I hope they follow in the footsteps of other big societies such as the American Phytopathological Society (APS), the American Society for Microbiology (ASM), and the Genetics Society of America (GSA), to name only a few. It is great to see scientific societies listening to their memberships.

Yes, preprints work as long as we work for them. It will be great to see preprints fully take off in (plant) biology, including pre-publication reviews such as here. I can only agree with others that the whole publication experience needs to be improved. Researchers should not feel like ‘beggars can’t be choosers’ when it comes to publishing and sharing their knowledge. Hidden-away peer review might help improve your manuscript in many, but certainly not all, cases. It doesn’t appear to be the best way to decide on acceptance or rejection of your manuscript (see also here), and thereby on your future in academia.

Be positive! From witch hunts to the new reward culture

Disclaimer: This is a guest blog post by Sophien Kamoun, a highly respected group leader in plant biology at The Sainsbury Laboratory, Norwich. We decided to publish his post as a valuable contribution to the debate on ‘open science’. However, the opinion represented does not necessarily reflect both of our opinions.

I’m a proponent of open science. Science is continuously in flux. Our knowledge, theories and concepts are continuously evolving. The essence of science is to capture new information, integrate it into current models and regurgitate more elaborate concepts. Therefore science cannot thrive without a vibrant culture of discussion and debate. Open science widens the net. Anyone can access the data and comment on it. A tweet by someone you don’t know could lead you to think differently about your science and help you to develop new concepts. We move from elitist old boy clubs to an open door party. This is healthy for science.

As a native of Tunisia, open science matches well with my personality. I grew up in the typical Mediterranean culture of vibrant discussion, constant arguing, and yes, the occasional bickering. These traits are embedded in me. I know they can be irritating to others. But I believe they helped me develop into an engaged citizen and scientist. A paucity of critical thinking and engagement among the citizens of modern and, presumably, well-educated societies is one of the dramas of our age.

It is therefore natural for me to support all efforts of post-publication review. Platforms such as PubPeer aim at extending the discussion and analysis beyond publication in peer-reviewed literature. They are perfect for this era of journal proliferation and internet communication. They also address flaws in pre-publication peer-review that have been well documented.

But PubPeer has evolved, most certainly against the wishes of its anonymous founders, into a modern day “witch hunt” platform. Many comments seem valid. But pointless and frivolous comments are being posted, and have certainly increased in frequency in recent weeks (at least in plant biology). How to address this? How to buffer or eliminate such posts while maintaining the original goal of PubPeer as a vibrant journal club platform? My suggestions are two-fold.

First, PubPeer should encourage and promote the posting of positive reviews. Peer review is not about “Gotcha! I found a flaw!” It is primarily about endorsing excellent science. It does shock me that the great majority of the posts I read only list negative comments. This is not what peer review or journal clubs are about. More often than not, we find positives in the literature we read and discuss. There is plenty of great science out there. Why shouldn’t we acknowledge it and promote it? Why are posters rushing to reveal “vertical lines” in a blot but failing to highlight a flawless figure? PubPeer and related platforms have a role to play here. They could help build the confidence and reputation of young scientists and strengthen their CVs – further shifting the obsession with impact factors and publishing in glam magazines toward a focus on the quality of an individual’s work.

Second, PubPeer may consider recruiting an Editorial Board to help moderate the questionable posts. I expect many reputable scientists, junior and senior, to be willing to volunteer just as we do for scientific journals. An Editorial Board that reflects diversity in gender, geography, career stage, and research topic would improve transparency and credibility. It will also serve to temper criticism and cynicism about PubPeer that is prevalent among many in the scientific community.

The reality is that post-publication peer review is here to stay. The recent episode that my colleagues and I faced was a timely teaching moment. It reminded us of the importance of record keeping, archiving old data, ensuring that pictures include visible labels, and so on. Several members of my lab told me that this sorry episode has prompted them to document and store their data more rigorously. Nobody wants to find out 10 years later that they cannot respond to an allegation about their paper. Mistakes do happen, so we should be prepared to respond and revise.

At my host institution, The Sainsbury Lab, which is currently led by Head of Lab Cyril Zipfel, recent episodes have further justified the misconduct training initiatives that were already undertaken way before the current brouhaha. We need to raise awareness of these important issues. Scrutiny and discussion of the science post-publication should become part of the culture. A shift to a new reward culture is happening. It’s not only where you published but also what you published. Quality indicators other than the journal impact factor are becoming recognized. It’s you, the next generation of scientific leaders, who can ensure that the cultural shift takes hold. And PubPeer and other post-publication peer review platforms have a role to play in this new reward culture.

/// See also our first guest post of today by anonymous Unregistered Submission about the same issue. ///

Don’t judge too fast!

Disclaimer: This is an anonymous blog post submission by Unregistered Submission. We decided to publish it as a valuable contribution to the debate on ‘open science’. However, the opinion represented does not necessarily reflect both of our opinions. To keep with the spirit of this post, we gave Sophien Kamoun a 24-hour heads-up before publishing. Some of his feedback was incorporated into this post by the anonymous author. Slightly modified paragraphs are highlighted in italic.

There you go: find a duplicated figure panel in an article, make a figure, write one sentence, and post it on pubpeer, the online journal club where scientific peers can anonymously place comments on scientific publications. Little effort for one person, yet something that may have a huge effect on people far away from where I am living…

What followed was a tremendously fast response from the authors involved in this manuscript, and I think a new world record in correcting a scientific paper. I absolutely respect and admire the professionalism with which the authors (Mireille van Damme, Cahid Cakir, Sophien Kamoun et al.) handled the probably very unpleasant situation. I posted my concerns regarding one specific figure of the respective article on Saturday evening; already by Sunday the original data had been provided on figshare by the authors to convince any skeptical colleague. Let me be clear: I never believed that the authors purposely published data to mislead the reader. Obviously, this was just a simple mistake, something that could happen to anyone actively involved in science.

Why did the authors rush so much to get the original data online so fast? My thought is that the authors wanted to avoid entering a harmful treadmill, in which other anonymous commenters start to dig further, trying to add additional “evidence” that the authors purposely misled the reader. In fact, this process started almost immediately after I added my concern, with one person adding some more fuel to the starting fire by talking about the authors purposely rotating another panel in the same figure. What if the authors had been unable to provide such a fast response? For example, if the data could not be found immediately, or someone was on holiday for two weeks or more? Would the authors have had a fair chance to defend themselves against a growing group of anonymous commenters?

Over the last couple of weeks I have been following the evolving stories around papers by Olivier Voinnet, David Baulcombe et al. Doubts about figures in a number of papers were posted on pubpeer in September 2014. This was followed by an explosion of posts referring to over 25 papers in January 2015. Worrisome? Yes, but I was a bit shocked by the way colleagues around me spoke with disgust about the multiple scientists involved in any of these papers, and how on social media such as pubpeer and Retraction Watch people carelessly provided their opinions and accused scientists of potential involvement in figure manipulations and duplications, probably knowing little to nothing about the factual situation.

To my mind, the pubpeer website in its current form is too much of a “hunt the scientist” website, a place where scientists can be suspected of publishing falsified data. Not really the “online community that uses the publication of scientific results as an opening for fruitful discussion among scientists” that it claims or wants to be. Why does a comment that I add on a Saturday evening have to appear online, in public and to the authors, on a Sunday? Why can’t the authors be informed far in advance before a comment is made public, giving them ample time to sort things out and reply? The way pubpeer currently works, or better, the way it allows some people to use it, resembles a modern-day witch-hunt.

I would like to stress that intentional figure manipulation is indeed extremely bad; in fact it is fraud, and that is a very serious crime. This is exactly why we should be extremely careful commenting on other scientists’ work. We should not allow ourselves to create a platform whose primary use currently seems to be as a public “scientific execution site”, potentially damaging the reputations of innocent scientists and of co-authors who may have nothing or very little to do with the whole situation. To accuse someone of committing a crime is a big thing, and I wonder whether this should be discussed so directly and openly in public, with maybe little chance for the authors to reply or defend themselves initially. Do we do the same with other types of crime? We don’t put up anonymous notes in supermarkets with the names of customers who are suspected of theft, at least not in the country where I live. No, we go to a respectable authority and let them investigate what is actually going on. Shouldn’t pubpeer have a more stringent editorial filter? An online open journal club is something different from an online open crime-report site.

Don’t get me wrong, I am very much pro “open science”, and the more discussion the better. Pubpeer is a good initiative, but it is currently not working optimally. Authors should have a fair chance to defend themselves, and one should not judge before all the evidence is provided. Scientists, too, have the right to be “innocent until proven guilty”. In fact, how many of the suspected papers are actually truly worrisome? Yes, for a couple of papers it looks bad, but there also seem to be quite a few with only marginal evidence for intentional figure manipulation. Are all these authors and co-authors suspected of fraud, or, alternatively, can we accurately trace all these cases back to one single person?

Pubpeer in its current form is surrounded by a negative and suggestive atmosphere, something you would not like your paper to be associated with. A site where comments seem to be frequently made by over-frustrated scientists. People like sensation: “big names struggling” is always a source of entertainment, whether these big names are famous movie stars, politicians, or scientists. (Un)fortunately scientists are people too, and on pubpeer (and sites like Retraction Watch) it shows that we are often little better than gossip-loving yellow-press readers in a supermarket.

That brings me to why I specifically brought up the following manuscript/figure on pubpeer. Last week, tweets appeared on social media joking about the scientists suspected of fraud (sensation!). Did it bring a smile to my face? Honestly, yes, it was quite funny. However, would I like to be in the same position as any of the co-authors of the 25+ suspected papers, and am I 100% sure that all my papers are spotless? My answer would be “No”. Several scientists actively retweeted this joke, but isn’t that also a tiny little bit hypocritical? Or are these scientists very sure that their own published work is the gold standard?

I decided to look into a number of papers by the retweeting scientists present in my literature archive, searching for troublesome data (I was only looking into the main figures of manuscripts, without the help of any software). A childish “gotcha game”? Maybe, but I guess that’s how the true wannabe “science-detectives” on pubpeer work. I discovered one paper from the Kamoun lab with a “serious” issue. Did I suspect fraud? Not for a single moment. But I felt I should bring this up to show how vulnerable we all are as scientists at this moment. The Kamoun lab is without doubt one of the most active labs on social media within the plant sciences field, something I actually greatly appreciate and respect! I could thus expect that posting a comment on pubpeer would attract a lot of attention (well, it certainly did). Something I hope will make the community aware that at this moment we can all too easily be suspected of being frauds.

Cynical and ironic jokes arise when questions remain unanswered. I understand that the Voinnet lab has had plenty of time to reply to the initial concerns posted. This is in stark contrast to the way the Kamoun lab handled the situation, directly replying to all concerns raised. Nonetheless, with over 25 papers currently in doubt in the Voinnet case, I am aware that the ironic jokes currently target a large group of innocent scientists who may not have a fair chance to reply at this very moment.

I would like to open a discussion in which standards for how to criticize a scientific paper are addressed. Should this always be done so openly and directly? Shouldn’t websites such as pubpeer have a better editing process for certain types of comments, especially when issues such as scientific integrity are at stake? At the very least, scientists should be given sufficient time before they are hunted down by a group of “science-detectives”. Lastly, we as scientists should not judge too fast; suspecting or suggesting that someone is a fraud is a big leap. It is time we start using pubpeer in a much more positive way, by not just posting negative or suggestive comments. To the Kamoun lab: I promise to make a start by now placing my honest, positive, and fair opinion on several of your great manuscripts! Pubpeer should be used as an online journal club highlighting not only the flaws but also the great science out there.

/// See also our second guest post of today by Sophien Kamoun about the same issue. ///