MESSAGE
DATE | 2017-02-28 |
FROM | Ruben Safir |
SUBJECT | Re: [Learn] [Hangout-NYLXS] Peer Review |
The most interesting thing was this, IMO:
Prof. Stark runs a course on the theory and application of statistical models. In his course, groups of students replicate and critique the statistical analyses of published research articles using the article’s publicly available raw data. Obviously, for this course to work, Prof. Stark needs rigorous research articles and the raw data used in the article. In this sense, Pitt and Hill’s article on ScienceOpen was an ideal candidate.
The groups of students started their critical replication of the Hill and Pitt article in the Fall semester of 2016 and finished right before the new year. By actively engaging with research, the students gain the confidence and expertise to critically analyse published work.
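Pitt and Hill’s actual method is the subject of the paper itself, but a rough sketch may help show what this family of fraud checks looks like. The function below is a hypothetical illustration, not their method: it runs a classic terminal-digit uniformity test, on the idea that the last digits of genuinely measured values tend to be roughly uniform, while fabricated numbers often are not. The function name and the critical value are my own choices.

```python
# Hypothetical sketch of a digit-frequency fraud check (NOT Pitt and
# Hill's actual method): test whether the terminal digits of a set of
# recorded values are plausibly uniform, using a chi-square statistic.
from collections import Counter

def terminal_digit_chi2(values):
    """Chi-square statistic for uniformity of the last decimal digits."""
    digits = [int(str(v)[-1]) for v in values]
    n = len(digits)
    expected = n / 10              # uniform: each digit 0-9 equally likely
    counts = Counter(digits)
    return sum((counts.get(d, 0) - expected) ** 2 / expected
               for d in range(10))

# With 9 degrees of freedom, a statistic above ~16.9 is suspicious at
# the 5% level. Here every value ends in 1, which a fabricator might do:
suspicious = [1231, 4561, 7891, 1011, 3141]
stat = terminal_digit_chi2(suspicious)
```

On the suspicious sample above the statistic is far beyond the 5% critical value; on digits that really are uniform it is near zero. A real analysis would of course need far more data and a carefully justified null model, which is exactly the kind of assumption the student reviews scrutinised.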
On 02/28/2017 09:42 AM, Ruben Safir wrote:
> http://blog.scienceopen.com/2017/02/a-post-publication-peer-review-success-story/
>
> A post-publication peer review success story
> February 28, 2017
> Author: Jon Tennant
>
> In 2016, Dr. Joel Pitt and Prof. Helene Hill published an important
> paper in ScienceOpen Research. In their paper, they propose new
> statistical methods to detect fraudulent scientific data. Pitt and Hill
> demonstrate the use of their method on a single case of suspected fraud.
> Crucially, in their excellent effort to combat fraud, Pitt and Hill make
> the raw data on which they tested their method publicly available on the
> Open Science Framework (OSF). Considering that a single case of
> scientific fraud can cost institutions and private citizens a huge
> amount of money, their result is provocative, and it emphasizes how
> important it is to make the raw data of research papers publicly available.
>
> The Pitt and Hill (2016) article has been read and downloaded almost 100
> times a day since its publication on ScienceOpen. More importantly, it
> now has 7 independent post-publication peer reviews and 5 comments.
> Although this is a single paper in ScienceOpen’s vast index of 28
> million research articles (all open to post-publication peer review!),
> the story of how this article got so much attention is worth re-telling.
> Enhanced article-level statistics and context – just one of 28 million
> on our platform!
>
> Peer review history
>
> The manuscript was submitted and published in January 2016, and the
> final typeset version of the article was available for download on March
> 1st. Shortly after this, in May 2016, PhD student Chris Hartgerink
> publicly peer reviewed the article, summarising it as “Interesting
> research, but in need of substantial rewriting.”
>
> It was after this that the article came to the attention of Prof. Philip
> B. Stark, an Associate Dean at the University of California, Berkeley,
> and author of the most highly read article on our platform, with over
> 39,000 views to date!
>
> Prof. Stark runs a course on the theory and application of statistical
> models. In his course, groups of students replicate and critique the
> statistical analyses of published research articles using the article’s
> publicly available raw data. Obviously, for this course to work, Prof.
> Stark needs rigorous research articles and the raw data used in the
> article. In this sense, Pitt and Hill’s article on ScienceOpen was an
> ideal candidate.
>
> The groups of students started their critical replication of the Hill
> and Pitt article in the Fall semester of 2016 and finished right before
> the new year. By actively engaging with research, the students gain
> the confidence and expertise to critically analyse published work.
>
> The Post-Publication Peer Review function on ScienceOpen is usually only
> open to researchers with more than 5 published articles. This would have
> normally barred Stark’s groups from publishing their critical
> replications. However, upon hearing about his amazing initiative,
> ScienceOpen opened their review function to each of Prof. Stark’s vetted
> early career researchers. And importantly, since each peer review on
> ScienceOpen is assigned a CrossRef DOI along with a CC-BY license, after
> posting their reviews, each member of the group has officially shared
> their very own scientific publication.
>
> This also means that each peer review can be easily imported into any
> user’s ORCID, Publons, and even ImpactStory profiles – the choice is yours!
>
> Public, post-publication peer review works
>
> All of the complete peer reviews from the groups of students can be
> found below. They all come with highly detailed statistical analyses of
> the research, and are thorough, constructive, and critical, as we expect
> an open peer review process to be.
>
> Furthermore, unlike almost every other post-publication peer review
> function out there, the peer reviews on ScienceOpen are integrated with
> graphics and plots. This awesome feature was added specifically for
> Prof. Stark’s course, but note that it is now available for any peer
> review on ScienceOpen.
>
> Maurer and Mohanty, who stated that the work was an important
> demonstration of the use of statistical methods for detecting fraud;
> Hejazi, Schiffman and Zhou, who evaluated the work as comprehensible
> but largely incomplete;
> Dwivedi, Hejazi, Schiffman and Zhou, who note that the research is a
> strong advocate for detecting scientific fraud and the use of
> reproducible statistical methods;
> Stern, Gong and Zhou, who call the research clever in the application of
> the techniques it uses to address a pressing problem in science;
> Bertelli, DeGraaf and Hicks, who think the analysis is convincing and
> valuable, but with a methodology that could be refined;
> Hung, Sheehan, Chen and Liu, who evaluated the paper, finding a few minor
> discrepancies between their own results and those of the published research.
>
> A fine example from one of the students
>
> So overall, a large variety of findings was drawn from the critical
> replication project, each of which individually greatly enhances the
> published research.
>
> Aftermath
>
> Many of the peer reviews focused on a specific assumption that the Pitt
> and Hill article made about how sets of numbers are distributed. We
> talked to Dr. Helene Hill for comment. She noted that Dr. Pitt was
> working on the distribution issue raised by many reviewers, and that she
> was happy to see her research receive such critical attention.
>
> She notes that the critical reception was:
>
> just what I was hoping for — for our paper to be a model for posting
> and analyzing data.
>
> Prof. Stark said there are several things that this project accomplished
> for him in terms of getting students actively involved in peer review:
>
> Get students thinking about alternative models for scholarly
> publication;
> Get students thinking about reproducibility and open science;
> Get students to work collaboratively on a data analysis project that
> involves thinking hard about the underlying science;
> Get students to register with ORCID;
> Get students to post their analyses on GitHub so that their own work
> is reproducible/extensible;
> Get students their first scientific publication.
>
> In another step of Open Science brilliance, the reviews themselves
> sought to be completely reproducible, with the code for all the
> students’ calculations available on GitHub (eg here and here)!
>
> Prof. Stark said:
>
> I think it’s remarkable that ScienceOpen extended the platform and
> your process to make this possible, including allowing people with fewer
> than 5 publications associated with their ORCID to submit reviews,
> figuring out how to allow figures in reviews, and that you are working
> on allowing reviews with multiple authors. ScienceOpen was really a
> partner in making this exercise possible.
>
> We also asked some of the students how they found the peer review
> exercise, many of whom praised Pitt and Hill’s efforts on making the
> research as reproducible as possible.
>
> Stephanie DeGraaf: “The Hill and Pitt paper made the data publicly
> available and explained their analyses thoroughly enough so that we
> could reproduce all of their results. The paper focused on a really
> fascinating topic of testing for fraudulent data, and I really enjoyed
> thinking about how to tackle the problem in a statistically valid way. I
> found it really interesting to see that even though all of us in the
> class were reviewing this same paper, we all had different perspectives,
> criticisms, and ideas for other ways to investigate the researchers’
> claims.”
>
> Aaron Stern: “The Hill and Pitt paper was a great choice for the purpose
> of the course; not only did the authors employ interesting and novel
> statistical methods for us to critique, but they also were tackling a
> very important issue in science — namely, fraud. While we agreed with
> the paper’s conclusions, we found a number of scenarios where their
> approach applied to new datasets could result in false positives; i.e.,
> their methods could impugn an innocent researcher. Thus, it’s important
> to validate these methods thoroughly in order to avoid hurting innocent
> scientists.”
>
> Kenneth Hung: “It is not well agreed, among statisticians, what
> constitutes replications and reproductions. Writing this review
> gave me new and broader perspectives, in comparison to the
> post-selection inference background I came from, as well as common tools
> in practice for detecting scientific fraud.”
>
> Nima Hejazi: “Constructing a review of the paper required extensive
> collaboration, the use of open source software tools, and the leveraging
> of statistical and domain knowledge for the purpose of detecting
> fraudulent science – all in all an experience that demonstrated quite
> well the challenges of working with real-world data and making use of
> open-access publishing platforms.”
> Prof. Stark’s profile on our platform. Those stats are looking great!
>
> Great success!
>
> So we definitely count this as a major success story on several levels.
>
> Students gained experience in performing analyses for the sake
> of reproducible research.
> Students also gained the skills and confidence to perform rigorous
> and constructive peer reviews in public.
> Post-publication peer review works just as well as, if not better than,
> traditional peer review.
> Openness facilitates recognition and reward for peer review, which
> is crucial for those just starting their research careers.
> This whole exercise shows that just because research has been
> published, it does not mean that critical evaluation of it should stop.
>
> So, what is the next step? Well, anyone who has an ORCID can peer review
> any of the 28 million research articles on our platform. They don’t have to
> be detailed statistical analyses – they can be critical commentaries,
> additional notes and context, or what your own related research says.
>
> The point is: the choice is yours. By reviewing, you help to improve
> the context and progress of your research field in the open, while
> improving your research skills and receiving recognition and credit for
> doing so.
>
--
So many immigrant groups have swept through our town that Brooklyn, like Atlantis, reaches mythological proportions in the mind of the world - RI Safir 1998 http://www.mrbrklyn.com
DRM is THEFT - We are the STAKEHOLDERS - RI Safir 2002
http://www.nylxs.com - Leadership Development in Free Software
http://www2.mrbrklyn.com/resources - Unpublished Archive
http://www.coinhangout.com - coins!
http://www.brooklyn-living.com
Being so tracked is for FARM ANIMALS and extermination camps, but incompatible with living as a free human being. -RI Safir 2013

_______________________________________________
Learn mailing list
Learn-at-nylxs.com
http://lists.mrbrklyn.com/mailman/listinfo/learn