Designing a Meta-Analysis for Data Sharing via Open Science Networks

I’m behind in consulting the literature for references to figshare.

From participating in the 2013 Walter E. Dean Environmental Information Management Institute, I am aware of a trend in research toward using scholarly databases to conduct a “meta-analysis.”

The book about meta-analysis we referenced for the course in Environmental Information Visualization was the “Handbook of Meta-analysis in Ecology and Evolution,” available online from Princeton University Press at <>.

That site describes meta-analysis, so I might as well paste its description here for review:

Meta-analysis is a powerful statistical methodology for synthesizing research evidence across independent studies.

The visualization I did for my class project concerned the frequency of the term “data sharing” by both discipline and year. I created a “heat map” of the term’s prevalence in titles and abstracts indexed by a major online index of scholarly journals. I grouped the journal articles into similar disciplines using Ulrich’s Web as a controlled vocabulary.
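The counting step behind that heat map can be sketched as follows; the records and field names here are invented placeholders, not the actual index data:

```python
from collections import Counter

# Invented placeholder records standing in for index search results;
# each has the fields the real analysis would pull.
records = [
    {"year": 2011, "discipline": "Ecology",
     "abstract": "We discuss data sharing practices in long-term studies."},
    {"year": 2011, "discipline": "Information Science",
     "abstract": "A survey of data sharing attitudes among researchers."},
    {"year": 2012, "discipline": "Ecology",
     "abstract": "Population dynamics of alpine plant species."},
]

# Tally (discipline, year) pairs whose abstract mentions the term.
counts = Counter(
    (r["discipline"], r["year"])
    for r in records
    if "data sharing" in r["abstract"].lower()
)

print(counts[("Ecology", 2011)])  # 1
```

The resulting (discipline, year) tallies are exactly what would feed the cells of the heat map.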

I’m thinking I can do something similar to understand how figshare is being used.

I have two questions about looking for figshare in journal article databases.

First, I’m curious whether figshare appears in social science databases, or in journal articles concerning information science, library science, usability, etc.

Second, I’m curious whether citation and use of figshare will be apparent in journal articles.

From an earlier notebook entry, I established that the various means of sharing content can result in a diverse set of URLs, ranging from “” to “”.

My understanding of the DOI system was that each publisher has a unique code, so it would follow that figshare has its own unique DOI prefix. A quick glance at the Wikipedia entry on DOIs suggests this understanding is somewhat limited: CrossRef assigns DOIs on behalf of some 3,000 publishers of scholarly works, DataCite handles research datasets, and other registration agencies assign DOIs as well, presumably under their own prefixes.
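A DOI is structured as prefix/suffix, where the prefix identifies the registrant. A minimal sketch of pulling the prefix out (the first DOI below is the DOI Foundation’s standing example; the second is a PLOS DOI that turns up later in these notes):

```python
def doi_prefix(doi: str) -> str:
    """Return the registrant prefix (everything before the first slash)."""
    return doi.split("/", 1)[0]

print(doi_prefix("10.1000/182"))                   # 10.1000
print(doi_prefix("10.1371/journal.pone.0078219"))  # 10.1371
```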

I did a Google search for figshare AND doi and found a figshare blog entry dated October 9, 2013, entitled “All research outputs should be citable,” which states: “As of today, all figshare content will have it’s own DOI.” Full article at <>.

Interestingly, figshare has partnered with the California Digital Library on this, which also contributes to DataONE tools such as DataUP.

Unfortunately, I believe this means my supposition that all figshare dx.doi short URLs share a consistent “figshare” publisher code is probably incorrect. On the bright side, items will be trackable going forward, which is probably why “citation coming soon” was noted in a previous post. Figshare comments to this effect:

This means that we will soon be recording how many times each object on figshare has been cited.

I want to get a sense of how some of the older URLs might have been stored. Without seeing a whole dataset I don’t think it’s possible to see a pattern, but maybe I can get an idea of the URLs pre- and post-consistent-DOI.

I navigated to:

Narrowed results to “Showing: Earth Sciences.”

I need to look at some datasets from after October 9, 2013.

Since it’s sorted by “most recent” this isn’t hard.

I’m just going to open every 20th resource in hopes that this will be somewhat random. One of them: 10.1371/journal.pone.0078219.g002.
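Taking every 20th item from an ordered result list is systematic sampling, which can be sketched with placeholder result IDs:

```python
# Pretend the search returned 100 ordered results (placeholder IDs).
results = [f"item-{i}" for i in range(1, 101)]

# Every 20th item: zero-based index 19, then step 20.
sample = results[19::20]
print(sample)  # ['item-20', 'item-40', 'item-60', 'item-80', 'item-100']
```

Worth noting: this is only quasi-random. If the sort order correlates with the property of interest, the sample will be biased, which is exactly the problem the PLOS auto-deposits cause below.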
Alright… that’s proving difficult, as clicking every 20th item has turned up 3 items that appear to have been generated by PLOS and are “Unclaimed.” Content auto-deposited through figshare’s partnership with PLOS reflects a de facto association with figshare rather than a deliberate choice, which is not the point of what I’m looking at. I want to understand people who use figshare because of some perceived value or benefit.
I am now limiting “File Types” to “Poster” in hopes that this will produce surefire examples of hosted content claimed by a user, as opposed to content uploaded by a default setting on PLOS.
There are no posters dated after October 9, 2013. The most recent poster is from April 2013.
I changed File Type to dataset.
Also realized I’m limiting myself to the “Earth Science” category.
Switched to all categories.
Something to note is that in the “links” section of a poster I looked at is a reference to
Obviously 824319 is related to the individual article.
I’m curious if 10.6084 is something unique to figshare.
I do need to look at other poster dx.doi links to see if that’s consistent.
It is.
Now I need to see if older items (pre-October 9, 2013, before figshare’s standardization of URLs) also have 10.6084.
I switched to “most viewed,” since older items would probably have accumulated more views.
No way to refine the results to a specified date range (odd).
I’m just opening the top 5.
Ok, they all have “10.6084” in the dx.doi URL. I’m not sure that was always the case, but if it was, it’s useful for the purposes of the meta-analysis. It’s possible it was changed recently.
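The spot check amounts to asking whether every sampled DOI URL shares one prefix. A sketch, with stand-in URLs (only 824319 is an item number I actually noted above):

```python
# Stand-in dx.doi.org URLs for the sampled "most viewed" items.
sampled = [
    "http://dx.doi.org/10.6084/m9.figshare.824319",
    "http://dx.doi.org/10.6084/m9.figshare.100123",
    "http://dx.doi.org/10.6084/m9.figshare.155555",
]

# Collect the registrant prefix from each URL; a one-element set means
# every sampled item shares the same prefix.
prefixes = {u.split("dx.doi.org/")[1].split("/")[0] for u in sampled}
print(prefixes == {"10.6084"})  # True
```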
I’m going to google figshare AND “10.6084”
This pulls up 39,300 results.
Some are from; some are from figshare.
I tried to limit the search to and got a few results concerning “most popular datasets” statistics.
Suffice to say the search for figshare AND 10.6084 draws from a range of items.
In January 2013, well before the October 2013 date, 10.6084 was used. So it seems to be permanently associated with figshare.
At this point I do not understand what’s going on with the October 9 announcement; the URLs appear pretty stable to me. I’ll need to look at this later.
The point is that I need a set of controlled queries for doing the meta analysis.
The 10.6084 URL is probably one of those queries I can use, along, obviously, with the word “figshare.”
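So a first draft of the controlled query set might look like the list below; the exact syntax each database accepts is an assumption to verify:

```python
# Candidate query strings for the systematic review searches.
queries = [
    '"figshare"',                # the service name itself
    '"10.6084"',                 # figshare's DOI prefix
    '"figshare" AND "10.6084"',  # both together, to gauge overlap
]

for q in queries:
    print(q)
```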
The first step of the meta-analysis is to do a systematic review.
In the “Handbook of Meta-analysis in Ecology and Evolution,” a systematic review is described as follows:
Systematic review is a research synthesis on a precisely defined topic using explicit methods to identify, select, critically appraise, and analyze relevant research. The crucial element of the systematic review that distinguishes it from an ordinary narrative review is an a priori protocol, which describes the methodology, including detailed search strategy and inclusion criteria. This review protocol makes the review process rigorous, transparent, and repeatable.
Essentially, to do a meta-analysis, I must first design a systematic review.
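One way to make that a priori protocol explicit and repeatable is to write it down as structured data up front. A sketch; every field value here is a working assumption, not a final criterion:

```python
# Draft a priori protocol for the systematic review.
protocol = {
    "topic": "Use and citation of figshare in the scholarly literature",
    "search_terms": ['"figshare"', '"10.6084"'],
    "databases": ["(to be selected: science and social science indexes)"],
    "inclusion": [
        "peer-reviewed journal article",
        "mentions figshare in title, abstract, or references",
    ],
    "exclusion": [
        "content auto-deposited via the PLOS partnership ('Unclaimed')",
    ],
}

for field, value in protocol.items():
    print(f"{field}: {value}")
```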

About Tanner Jessel

I am a graduate research assistant funded by DataONE and pursuing a Masters in Information Sciences with an Interdisciplinary Graduate Minor in Computational Science. I assist scholarly research efforts supporting the Sociocultural, Usability and Assessment, and Member Nodes working groups within DataONE. I am based at the Center for Information and Communication Studies at the University of Tennessee School of Information Science in Knoxville, Tennessee.
