Important contributions to OA in Europe

There have been some noteworthy contributions to the Open Access (OA) movement in Europe over the past week.

1) In the UK, the government announced that all UK-funded research will be OA within two years. An interesting commentary about this announcement has been provided by Mike Taylor, accessible via the link in his tweet:

2) The Research Councils UK (RCUK) has strengthened its OA policy, summarized here. The announcement is available via the link in this tweet:

3) The European Commission (EC) has backed calls for OA. See:

and:

4) The European Research Council (ERC) has announced that it will participate in the UK PubMed Central (UKPMC) OA repository service, and that the repository will be rebranded as “Europe PMC” by 1 November 2012. A link to information about this announcement is included in this tweet:

Comment: Will these important contributions to the OA movement give rise to analogous contributions in other nations? I hope so.

About Google Scholar Metrics

A blog post by Stephanie Kovalchik, “Google starts ranking journals” (April 10, 2012), is interesting. It’s about Google Scholar Metrics, announced on April 1, 2012.

Google Scholar Metrics can be accessed from Google Scholar, via a “Metrics” icon on the upper right of the home page.

The purpose of Google Scholar Metrics is to “help authors worldwide as they consider where to publish their latest article”. Two more excerpts from the Google Scholar Blog post, “Google Scholar Metrics for Publications”, are:

To get started, you can browse the top 100 publications in several languages, ordered by their five-year h-index and h-median metrics.

And:

Scholar Metrics currently covers many (but not all) articles published between 2007 and 2011. It includes journal articles only from websites that follow our inclusion guidelines as well as conference articles and preprints from a small number of hand-identified sources. For more details, see the Scholar Metrics help page.

Stephanie Kovalchik’s blog post provides a brief description of the h-index. More details are available via the Wikipedia entry for h-index.
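As a concrete illustration (not from Kovalchik's post or Google's documentation), the two metrics that Scholar Metrics reports can be sketched in a few lines of Python. The citation counts below are hypothetical:

```python
import statistics

def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    h = 0
    for rank, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

def h_median(citations):
    """Median citation count of the h-core (the h most-cited papers)."""
    h = h_index(citations)
    top = sorted(citations, reverse=True)[:h]
    return statistics.median(top) if top else 0

# Hypothetical citation counts for one journal's articles
papers = [25, 8, 5, 3, 3, 1]
print(h_index(papers))   # 3 (three papers each have at least 3 citations)
print(h_median(papers))  # 8 (median of the h-core [25, 8, 5])
```

The h-median is reported alongside the h-index because it distinguishes journals whose h-core papers are cited far more heavily than the h threshold from those cited just at it.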

One noteworthy aspect of the top 100 publications in several languages is that RePEc (Research Papers in Economics) and the arXiv repository are included, and achieve high ranks (#4 and #5, respectively; Nature, New England Journal of Medicine and Science are ranked #1 to #3, respectively).

Another interesting aspect of this list of 100 publications, from an Open Access perspective, is that the PLoS journal that’s top-ranked on the basis of h-index is PLoS ONE, ranked #63. Then come PLoS Biology (#83), PLoS Medicine (#88) and PLoS Genetics (#93).

The Eigenfactor method of ranking journals also identifies PLoS ONE as the top-ranked PLoS journal.

Different rankings are obtained using other measures. In particular, the 2010 Journal Impact Factor (JIF) for PLoS ONE is lower than that of the other 6 PLoS journals. And, on the basis of the 2010 Article Influence Score, PLoS ONE has a lower rank than 5 of the other 6 PLoS journals.

The SJR method for ranking journals also gives PLoS ONE a low rank in comparison with the other PLoS journals.

Comment: Which ranking method to believe? My answer is: none of them, if considered by themselves.

In 2009, Johan Bollen and 3 co-authors published “A Principal Component Analysis of 39 Scientific Impact Measures” [PLoS ONE 4(6): e6022]. One of their conclusions was that “scientific impact is a multi-dimensional construct”. Various measures place a different emphasis on the major dimensions of this construct (see Figure 2 of their paper). They also concluded that “the JIF and SJR express a rather particular aspect of scientific impact that may not be at the core of the notion of scientific ‘impact’.” They suggested that the JIF and SJR are indicative of journal “Popularity” rather than “Prestige”. Their results indicated that the h-index places less emphasis on “Popularity” and more on “Prestige” than do the JIF or the SJR.
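The kind of analysis Bollen et al. performed can be sketched very loosely as a principal component analysis of a journals × measures matrix. The data below are randomly generated, not their 39 real measures; this only illustrates the method:

```python
import numpy as np

# Made-up data: rows = journals, columns = impact measures.
# None of these numbers come from Bollen et al.
rng = np.random.default_rng(42)
X = rng.normal(size=(50, 5))

# Standardize each measure, then eigendecompose the correlation matrix.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
corr = np.corrcoef(Xs, rowvar=False)
eigvals, _ = np.linalg.eigh(corr)
explained = np.sort(eigvals)[::-1] / eigvals.sum()

# If "impact" were one-dimensional, the first component would dominate;
# Bollen et al. found that it does not.
print(explained)
```

A first component that explains only a modest share of the variance is what "multi-dimensional construct" means in practice: no single ranking can capture all the dimensions at once.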

Bollen et al. didn’t include the Eigenfactor approach in their set of impact measures, but merely commented that it should be considered for inclusion in comparisons of impact measures. The Article Influence Score was also not included in their set of measures.

So, what to conclude about Google Scholar Metrics? Their current focus on the h-index is useful, but the metrics would be more useful if they included other measures. Of particular interest would be measures of “usage”, analogous to those identified by Bollen et al. as “Rapid” measures of “Prestige”. From this point of view, the editorial by Gunther Eysenbach, “Can Tweets Predict Citations? Metrics of Social Impact Based on Twitter and Correlation with Traditional Metrics of Scientific Impact” [J Med Internet Res 2011;13(4):e123] is especially relevant. Might tweets quickly predict “Popularity” more than they predict “Prestige”?

Case studies: the OA policies of two publishers

The American Society for Biochemistry and Molecular Biology (ASBMB) publishes three journals. They are (with their abbreviations and 2010 Impact Factors): the Journal of Biological Chemistry (JBC, 5.3), Molecular & Cellular Proteomics (MCP, 8.4) and the Journal of Lipid Research (JLR, 6.1).

In the Open Access Policy of the ASBMB, it’s stated that:

We were among the first to introduce a form of open access in 1996 when we started to release all back issues of JBC Online free to everyone at the end of each calendar year. On January 1, 1997, for example, all previous content became available without a subscription. We continue this practice.

And:

Our most innovative advance, however, came in 2001 when we introduced JBC Papers in Press which allows us to publish and provide free access to all papers on the day they are accepted for publication. This innovation has reduced the time from acceptance to publication from 8 weeks to 1 day to the delight of both authors and readers. The JBC Papers in Press system has allowed us to meet the spirit of Open Access publishing yet maintain our ability to meet costs.

And,

Our success with open access to JBC has led to the same policy for other ASBMB journals: Molecular and Cellular Proteomics and Journal of Lipid Research.

The ASBMB journals also offer an Author’s Choice (hybrid OA) option (undated Editorial). The ASBMB policy for NIH-funded research involves a 12-month embargo after deposition of the final accepted version of manuscripts in PubMed Central (undated Editorial). Neither transfer to PubMed Central nor choosing immediate release through the Author’s Choice option affects free access to Papers in Press (item dated Oct. 9, 2009).

Comment: I like the policy of permitting public access to Papers in Press.

Cell Press publishes 15 journals. One of these is Cell Reports, an open access journal. The other 14 journals have Impact Factors that range from 32.4 (Cell) to 4.2 (Biophys J). Cell Press also publishes another 14 journals in the “Trends in…” series (see: Cell Press Journals). These latter journals have 2010 Impact Factors that range from 14.4 (Trends Ecol Evol) to 4.9 (Trends Parasitol).

The current issue of most (but not all) of the Cell Press journals includes at least one article that is marked “Free Featured Article”.

Comment: It’s not clear to me what criteria are used to select Free Featured Articles. However, even a few publicly accessible articles in high-quality journals are better than none.

More from the editor

I’m continuing to post to Twitter any OA-related news items that catch my eye, and am continuing to tag these items with the hashtag #OpenAccess.

I’ve also set up Open Access Chronicle, an online newspaper which is based on all recent tweets identified via the hashtag #OpenAccess (or #openaccess). This Daily, which is hosted by Paper.li, provides a convenient summary of recent OA-related news.

From the editor

Recently, I’ve been finding few news items that are not already available via the OA tracking project. For this reason, there will be no additional posts to this blog during the remainder of 2010. Instead, I will post to Twitter any news items that catch my eye. Such items will then be included among all of those found via a search of the real-time feed aggregator FriendFeed (FF) for the hashtag #openaccess. An advantage of FF is that it permits comments to be added by anyone who joins FriendFeed.

The Science 2.0 Group on FF includes people who are interested in Open Science.

Selected OA news items noted during August 2010

Political theater about public access to federally funded research

On July 29, 2010, the Information Policy, Census, and National Archives Subcommittee of the US House Committee on Oversight and Government Reform held a hearing entitled: “Public Access to Federally-Funded Research”. The hearing was chaired by Subcommittee Chairman Representative William Lacy Clay (D-MO).

The Alliance for Taxpayer Access has posted a news item about the hearings, entitled: Summary: Hearing on Public Access to Federally Funded Research, dated August 12, 2010.  Excerpt from the last paragraph of this summary: “Next steps: Congress will be in recess until September 9, so any further action on this issue or related legislation will happen after that point.”

There was a webcast of the hearings (2 hr 14 min) and a video is available. Copies of the Opening Statement of Chairman Clay and of the Prepared Testimony of the ten panel members are available here.

Some information about the video (the total duration of the hearing was 2:14:00):

  • 3:10 End of Chairman’s Opening Statement.
  • 7:30 End of statement from Representative Jason Chaffetz (R-UT).
  • 7:35 Introduction of Panel I.
  • 9:40 Beginning of reading of Prepared Testimony by each of three members of the first panel. Each member was given 5 minutes to present their testimony. (All had concerns about government-mandated public access to the outputs of federally funded research).
  • 26:10 End of Panel I presentations and beginning of first question period. Representatives Jason Chaffetz (R-UT), Judy Chu (D-CA), Carolyn Maloney (D-NY) and Chairman Clay asked questions.
  • 1:07:25 End of first panel.
  • 1:09:00 Introduction of Panel II.
  • 1:13:35 Beginning of reading of Prepared Testimony by each of six members of the second panel. (All were supporters of public access to the outputs of federally funded research).
  • 1:43:15 End of Panel II presentations and beginning of second question period. Chairman Clay was the only Representative still present, and he asked several questions.
  • 1:59:10 End of second panel.
  • 2:00:15 Introduction of Panel III.
  • 2:01:30 Beginning of reading of Prepared Testimony by the single member of the third panel, Dr. David Lipman (Director, NCBI, NLM, National Institutes of Health).
  • 2:05:50 End of Panel III presentation and beginning of third question period. Again, Chairman Clay was the only Representative still present, and he asked several questions.
  • 2:14:00 End of hearing.

Summaries of Twitter messages (tweets) about the webcast have been posted here and here. The emphasis is on the Panel II session.

Another commentary about the hearings is: House Holds Hearing on Status of Open Access, FASEB Washington Update, August 6, 2010. The emphasis is on the Panel I session.

Comments: How to review this video, as an example of political theater? First impression: it was based on three one-act plays. Each one was nicely staged. Second impression: the model for these plays was one of the “Judge So-and-So” programs that can be seen on television. In such programs, the judge listens while various people present their different versions of a dispute, and tries to decide who is being deceitful and who isn’t. Representative Clay played the role of “Judge Clay” very well. Most of the supporting cast were also excellent (although perhaps Representative Maloney spent more time in the spotlight than was really necessary). There were even some humorous moments.

What was the purpose of this particular example of political theater? It served well as a tutorial about the OA movement. However, Representative Clay was the only member of the House to benefit from the full tutorial. The other three Representatives were present and asked questions only during the first act. Then, they left.

Were these hearings simply a prelude to further legislative action or an executive pronouncement? Stay tuned for the next exciting episode.

CoLab launched

CoLab was launched at Open Science Summit 2010. It’s “Designed for open and massively collaborative science” [FriendFeed entry].

Comments: The focus is on unresolved scientific issues that are identified by members. So far, two issues have been contributed: “Locally optimal scientific research environments”, contributed on July 29 by A Garrett Lisi, and “How do you build an effective social sharing site for scientists?”, contributed on July 30 by Cameron Neylon. Both have attracted multiple comments.

There have been many Twitter trackbacks about CoLab (76 as of August 6th).

Selected OA news items noted during July 2010

UCLA survey of knowledge about the NIH Public Access Policy

Measuring Capacity and Effectiveness of NIH Public Access Policy Programming as a Model for Open Access by Tania Bardyn and 4 co-authors, UCLA Louis M Darling Biomedical Library. Dated July 24, 2010 in the University of New Mexico DSpace repository.

Abstract: This file contains the presentation slides from Ms. Bardyn’s presentation at the Evidence Based Scholarly Communication Conference, March 11-12, 2010, in Albuquerque, NM. [PDF of 44 presentation slides].

This was a “Survey of Translational and Other Researchers’ Knowledge of the NIH Public Access Policy at UCLA”.

  • From Slides 3, 9 & 10: Translational researchers at UCLA (N=?) and attendees at 8 NIH Workshops (N=103) were surveyed. The survey took place between Nov. 30, 2009 and Dec. 15, 2009.
  • From Slide 13: 72.5% of responses (50/69) were from the David Geffen School of Medicine.
  • From Slides 15 & 17: Of 69 respondents, 51% did not attend an NIH Workshop at UCLA. And, 51% were Translational Researchers.
  • From Slide 16: 74% (51/69 respondents) answered “Yes” to the question: “Are you currently involved in any NIH funded research?”
  • From Slide 26: 50% (32/64 respondents) did not know the stated intention of the NIH Public Access Policy.
  • From Slide 36: Of 65 respondents, 43% had successfully submitted an article to PubMed Central.
  • From Slide 37: Of 65 respondents, 95.4% answered “No” to the question: “Have you made any attempts to retain your copyrights when publishing in an academic journal?”
  • From Slide 38: Knowledge Sharing: 89% (24 respondents) of NIH Workshop Attendees answered “Yes”; 49% (17 respondents) of Translational Researchers answered “Yes”.
  • From Slide 42: Quote from survey respondent at UCLA, December 2009 (about future training on the NIH Public Access Policy): “I think it is more efficient for the NIH website or other external website to provide such training. The issues are the same at all universities and it is not clear why each institution should provide this information. Since the NIH requires IDs on papers in biosketches and progress reports, that affects investigators competitiveness on grants which is much stronger motivation to comply with the policy than mandated training by UCLA which will force investigators to know the policy, but not necessarily comply with the policy.”
  • From Slide 43: 57% (37/65 respondents) answered “Yes” to the question: “Do you think you need further training on this issue?”

Comments: The response rate was not high for this survey. Of 103 NIH Workshop Attendees, only 43% of the 69 survey respondents were sure that they had attended a Workshop. So, the response rate from NIH Workshop attendees was about (0.43 × 69)/103 ≈ 29%. It’s not stated how many Translational Researchers were surveyed, but it’s unlikely that the response rate for the Translational Researchers was higher than it was for the NIH Workshop Attendees.
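For transparency, the arithmetic behind that estimate, using only the figures quoted above, is:

```python
# Figures quoted from the survey slides above.
workshop_attendees = 103   # attendees at the 8 NIH Workshops
respondents = 69           # total survey respondents
attended_share = 0.43      # share of respondents sure they attended a Workshop

response_rate = attended_share * respondents / workshop_attendees
print(f"{response_rate:.0%}")  # -> 29%
```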

Might the survey results be biased in a way that yielded an underestimate of knowledge about the NIH Public Access Policy by translational and other researchers at UCLA? This also seems unlikely.

A question that wasn’t answered in the slide presentation: were the 28 respondents who had successfully submitted an article to PubMed Central (see Slide 36/44) also the most knowledgeable about the NIH Public Access Policy? (If not, does it matter?).
