Friday, 18 July 2014

Taylor & Francis Open Access Survey: translating values into licences

Guest post from Alex Green, Transformation Project Co-ordinator, Wellcome Trust

Last month saw the publication of the 2014 Taylor & Francis Open Access Survey. Drawing on responses from just over 7,900 authors who published with Taylor & Francis in 2012 (9% of the total), it represents the opinions of authors from across the world in roughly the proportions in which they have published with Taylor & Francis – although my inner data geek really wants to get hold of the full dataset to apply some weighting to the under-represented authors of East and South East Asia.
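
(For the curious, the sort of weighting I have in mind is plain post-stratification: scale each region’s responses by its share of the author population relative to its share of the sample. A minimal sketch in Python – the regional shares below are invented placeholders for illustration only, not figures from the report:)

    # Post-stratification sketch. The regional shares below are
    # illustrative placeholders, NOT figures from the T&F report.
    population_share = {"Europe": 0.40, "North America": 0.25,
                        "East & South East Asia": 0.25, "Other": 0.10}
    respondent_share = {"Europe": 0.45, "North America": 0.32,
                        "East & South East Asia": 0.10, "Other": 0.13}

    # Each region's weight scales its respondents so that the
    # weighted sample matches the author population.
    weights = {region: population_share[region] / respondent_share[region]
               for region in population_share}

    for region, w in sorted(weights.items()):
        print(f"{region}: weight = {w:.2f}")
    # Under-represented regions (here, East & South East Asia) get
    # weights > 1; over-represented regions get weights < 1.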

The Taylor & Francis survey shows strong support among authors for open access (OA) publishing and a clear belief in its benefits compared with traditional publication – support that has increased since the 2013 survey. 70% of respondents disagreed or strongly disagreed with the statement ‘There are no fundamental benefits to open access publication’, up 10 percentage points from 2013.

However, when read alongside this evidence of increasingly widespread positive attitudes to OA publishing, the findings on licensing are striking. In particular, the preference for more restrictive licences seems at odds with the attitudes and values expressed by authors in response to the Section 1 questions on that topic. For example, in Question 5, 71% of authors were happy for their work to be re-used without their prior knowledge or permission for non-commercial gain, provided they receive attribution. This is equivalent to a CC-BY-NC licence, yet only 18% of authors selected this licence as their first or second choice when answering Question 6 (see graphic below).

Question 5 (survey graphic)

Question 6 (survey graphic)

This mismatch between author attitudes and licence preference seems common to nearly all responses. Again in Question 5, authors seem concerned about commercial reuse of their work without prior knowledge, with 65% disagreeing that it is acceptable. However, in Question 6, 47% selected Copyright Assignment as their first or second preference for licensing, making it, along with Exclusive License to Publish, the joint second most popular option. Having assigned copyright away, authors would have little control over any commercial reuse, not least by the publisher to whom they have assigned the copyright.

There are several hypotheses we could explore to account for this seeming contradiction. Looking down the results for Question 6, there is a clear drop-off towards the bottom of the list of ‘preferred licences’. It would be interesting to know whether these options were always presented in the survey in the same order in which they appear in the report. If so, we may be seeing response-order effects (Krosnick and Alwin 1987), but unfortunately full details of the methodology are not openly available alongside the report.

The contradictions between responses may also be partly due to a degree of satisficing, an effect whereby respondents choose adequate answers rather than optimal ones because it is easier to pick the first acceptable, or most familiar, option than to fully evaluate all of them (Simon 1957, Krosnick 1991). This could be leading to the selection of the more familiar Copyright Assignment and Exclusive License to Publish over the various Creative Commons licence combinations.

Finally, it’s probably relevant to note that the definition boxes provided to respondents at the start of the survey gave explanations of different modes of OA publishing, repositories and text- and data-mining, but not of the differences between the licences. These aren’t always easy to grasp, especially when completing a survey at speed – I don’t mind admitting that I had to check with a colleague on the exact differences between the various licence options. Perhaps, as Dr David Green, Global Journals Publishing Director, said in the Taylor & Francis press release, there is still ‘much work left to do in simplifying our policies and documentation so that our author communities are in no doubt as to what their OA options are’.
 

Monday, 7 July 2014

The right to read is the right to mine: Text and data mining copyright exceptions introduced in the UK.

New copyright exceptions for text and data mining for non-commercial research have recently come into effect, and this is welcome news for UK researchers and research, argues Ross Mounce. Here he provides a brief overview of the issues that have discouraged text and data mining in the past, and of what the future holds now that these exceptions have been introduced. But despite legal barriers being removed, many technical barriers still remain. Furthermore, it remains to be decided what formally constitutes ‘non-commercial’ research.

After eight long years, including not one but two expert-led reviews of intellectual property, new copyright exceptions came into force on June 1st 2014, some of which will particularly enable and empower UK academic research. All disciplines are set to benefit from this: the humanities, the social sciences, science, technology and medicine.
Of particular interest to me and other researchers is the ‘Exception for copying of works for use by text and data analytics’. In order to understand why this is so important, let me take you back to how things were before the copyright exception came into force (and how the legal situation still is for researchers in most other European countries):

Content Mining: mining one or more types of media for information; media as data (Image credit: DeclanTM, Flickr, CC BY)
The situation before the copyright exception
Before this exception came into force in the UK, for subscription-access content you’d essentially have to ask permission from the publisher before you started analysing. If you proceeded, without permission, to download electronic copies of ‘their’ copyrighted materials (see author’s note, bottom) en masse for analysis, you would be infringing ‘their’ copyright – it would be illegal, and they could take legal action against you, even if your analysis was undertaken for non-commercial academic research purposes. Depending on the exact subscription-access agreement held with your institution (the details of which your institution may not be able to disclose, because of confidentiality clauses!), the publisher could even ask for additional fees to be paid to cover this ‘additional’ type of usage if it was not covered in the subscription agreement. Many agreements did, and still do, explicitly prohibit text and data mining.

If one did ask for permission, the process was complex and lengthy, involving many employees and much bureaucracy at each publisher. That’s if the publisher agreed to give permission at all. A study by the Publishing Research Consortium found that “only 35% of the respondents [publishers] state that permission is granted in the majority or in 100% of the cases for all requests” (p. 106) – and that sample of publishers included open access publishers that, by definition, allow mining. Thus publishers can and have denied permission for content mining research on ‘their’ works.

The situation after the introduction of the UK copyright exception for TDM
After June 1st 2014, for research conducted in the UK, under the jurisdiction of UK law, for ‘non-commercial’ research purposes (more of which later…), the new copyright exception overrides anything in subscription contracts that prohibits content mining. As Peter Murray-Rust puts it: The Right To Read Is The Right To Mine. Provided you are in the UK and doing ‘non-commercial’ research, that is now true, and legal. This provides welcome and useful protection for researchers against litigious publishers.
No researcher doing ‘non-commercial’ research in the UK needs to agree to, nor abide by, the terms of any text and data mining ‘licence’ that publishers may wish to impose.

Schemes such as CrossRef’s text and data mining services will be heavily advertised to researchers by the major publishers, in order to try to control the way in which researchers do content mining, both through legal means (the licensing) and technical means (the API). The use of such services entails agreeing to detailed and lengthy licensing agreements, which many researchers probably won’t read. If you do read the full terms and conditions you’ll find them disappointingly limiting, which is why organisations such as LIBER have publicly criticised them.

Even with some legal barriers now removed, technical barriers remain
Despite legal barriers being removed, non-trivial technical barriers remain which can be problematic for content mining. Most websites, for instance, have rate limits. If you are detected attempting to crawl or scrape too many pages (i.e. research articles) within too short a time-span, your access to that website may be blocked. Publishers such as BioMed Central (BMC) have a crawl rate limit of one article per second, which is an acceptable rate for researchers. Through Elsevier’s text-mining API there’s a limit of 10,000 articles per week, equivalent to a rate limit of one article every 60 seconds. At that rate it would take ~21 years to go through all 11 million articles that Elsevier control access to through their Science Direct platform – not really feasible! The rate limit imposed is entirely artificial – researchers with good internet connections could crawl many articles per second if they were allowed to. The publisher sets the rate ‘allowed’, and even with this new copyright exception, to get the rate limit changed a researcher would still have to beg permission from the publisher, which the publisher is fully within their rights to grant or refuse.
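
Those numbers are easy to verify. A quick back-of-the-envelope check in Python, using only the figures quoted above:

    # Back-of-the-envelope check of the rate limits quoted above.
    SECONDS_PER_WEEK = 7 * 24 * 60 * 60           # 604,800 seconds

    articles_per_week = 10_000                    # Elsevier API limit
    seconds_per_article = SECONDS_PER_WEEK / articles_per_week
    print(f"{seconds_per_article:.1f} s per article")     # ~60.5 s

    corpus = 11_000_000                           # articles on Science Direct
    weeks = corpus / articles_per_week            # 1,100 weeks
    print(f"{weeks / 52:.1f} years to mine the corpus")   # ~21.2 years

    # By contrast, at BMC's one-article-per-second crawl rate:
    print(f"{corpus / (60 * 60 * 24 * 365):.2f} years at 1/s")  # ~0.35 years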

Open Access publishers tend to be exceedingly helpful to content miners: BMC, Hindawi, and MDPI, to name but a few, make whole content dumps (i.e. everything they publish) openly available for anyone to download for any purpose, which greatly facilitates content mining. For biomedical researchers, the PubMed Central Open Access Subset and Europe PMC also allow downloading of full-text dumps, but these are limited to CC BY papers only (one of many reasons why CC BY is the preferred licence of open access publishers).

Other less-helpful publishers sometimes pay money to employ external firms like Atypon to populate their websites with booby-trapped links that block access for the entire subscribing institution if clicked. These links, called ‘spider-trap’ links, inevitably end up doing more harm than good, as in the recent #ACSgate debacle, whereby over 200 institutions had their access to one publisher’s content blocked after people innocently clicked a DOI-like URL that was openly displayed on the publisher’s website.

Image Source: Shutterstock Copyright: dmitriyGo
Why do these publishers dislike crawling and scraping so much? Scraping the web is a normal, legitimate activity for researchers; even a recent European Commission report says so:
‘Scraping’ the World-wide web for data is today a familiar activity for the digitally literate researcher. (p. 11)

With over 50 million scholarly articles out there, and millions more being published each and every year in popular fields like biomedical science, content mining is fast becoming a necessity. Human eyes can only read so much. Computers, and computational techniques that help us comprehensively and rigorously mine the literature, are a boon for research. One expert report on the state of content mining argues that “European academics are falling behind their Asian and North American counterparts” – this new copyright exception will thus help the competitiveness of UK research in the global sphere.

The only nagging question remaining to address is: what is ‘non-commercial’?
I won’t pretend to give a convincing answer to this. I simply don’t know, and I can see it being a potentially difficult sticking point for many.

For my own research on extracting data from evolutionary tree figures (phylogeny), I can feel fairly safe that this subject and use-case might not readily be definable as ‘commercial’, but for other researchers I can imagine it may not be so easy to safely and surely classify their research as ‘non-commercial’. Indeed, a recent court case in Germany seemed to indicate that ‘non-commercial’ use was only safely equivalent to personal use. The consequences, risks and side-effects of ‘non-commercial’ remain largely untested in case law and can prevent much more usage than you might think. Will publishers be eager to sue academic researchers for what they perceive to be commercial mining? I hope not, but sadly it would not surprise me if they did.

Author’s Note: I feel pained to discuss the copyright owned by publishers over work written by academics, hence the inverted commas when discussing ‘their’ copyright. Part of the reason academia got into this copyright pickle in the first place is that we allowed publishers (and still do for some!) to take copyright away from authors with completely unnecessary copyright transfer agreements (CTAs). Publishers do NOT need a CTA to publish your work, so don’t sign them! You can instead retain copyright over your work and just give them a non-exclusive licence to publish. Keep your copyright!

Disclaimer & Warning: None of this article constitutes formal, vetted legal advice and should not be relied on or treated as a substitute for specific advice relevant to particular circumstances. Academic publishers and even societies can, and do, take legal action against research-related activities if they feel so inclined.

About the Author
Dr Ross Mounce is a BBSRC-funded postdoc at the University of Bath, working on the PLUTo project to liberate phyloinformatic data from the literature. He is working with The Content Mine team to encourage the adoption and use of content mining tools and techniques, including giving a workshop at this year’s Open Knowledge Festival 2014 (Berlin, July). A keen advocate for open scholarship, he can also be found at OpenCon 2014 (Washington D.C., November) – the student and early career researcher conference on Open Access, Open Education and Open Data.


This piece originally appeared on The London School of Economics and Political Science's Impact Blog under a Creative Commons CC BY license. One of the images has been replaced, and another omitted (please see individual images for copyright restrictions). The article gives the views of the author.


Wednesday, 4 June 2014

Let’s (not) Get Physical: the Effect of Spironolactone on Muscle Strength in the Elderly

Image Source: Serial/Trash

Life expectancy has increased continuously over the past several decades, and with it, a host of new age-related ailments have emerged as contemporary medical issues. Muscle function decreases with age, leaving increasing numbers of elderly people incapable of being physically independent. This not only has devastating personal effects but is also a major public health issue. In a recent study, researchers investigated whether blocking the hormone aldosterone may help counter the effects of physical impairment with age.

Aldosterone is a hormone best known for the role it plays in the regulation of blood pressure, but it has also been associated with muscle weakness. This makes sense, considering that aldosterone also lowers levels of magnesium and potassium – two minerals that are essential for muscle contractility. Keeping aldosterone levels low may therefore help to restore muscle function.

How can we keep aldosterone levels low? The production of aldosterone in the body involves a complex interplay between hormones and enzymes, and medication can be targeted to interfere with it at different stages. One way to influence aldosterone levels is to inhibit the creation of a hormone that would otherwise stimulate its production. Drugs that do this, the so-called angiotensin-converting enzyme blockers (ACE blockers), are usually prescribed to treat high blood pressure and heart failure. Interestingly, some studies have reported additional positive effects on muscle strength and physical fitness in the elderly. But the results are not conclusive: some studies did not find any positive effect on muscle strength. Furthermore, ACE blockers seem to fail at reducing aldosterone levels over longer time periods – and long-term efficacy is a crucial factor for improving muscle function.

Image Source: Shutterstock Copyright: Tish1

Another way to interfere with aldosterone is to block aldosterone receptors directly. This is exactly what a drug called spironolactone does. It blocks receptors in the kidneys, heart and blood vessels that are usually targeted by aldosterone. Because aldosterone can only act when it is able to bind to its receptors, any competition for those receptors blocks its action. Recent studies suggest, however, that blocking the aldosterone receptors by means of spironolactone may be superior to ACE inhibitors when it comes to blocking the specific effects of aldosterone. For example, spironolactone seems to be more effective than ACE inhibitors at keeping magnesium and potassium levels in muscles high – a highly desirable effect for improving muscle strength. Spironolactone may thus be a promising medication to counter the decline in muscle function in the elderly.

In line with this, the physical fitness of heart failure patients improves after they’re treated with spironolactone. However, one must consider that spironolactone is well known as an effective medication for high blood pressure and heart failure. Therefore, its positive effect on physical performance may simply be caused by better overall health after treatment rather than by a direct effect of spironolactone on muscle strength. To understand whether the intake of spironolactone may have a beneficial effect on muscle strength in elderly people who do not suffer from heart failure, a research team from the University of Dundee recruited 120 patients aged 65 and older. The patients had self-reported difficulties in performing daily activities, but did not suffer from heart conditions or high blood pressure. Half of the participants received a daily dose of spironolactone over a period of 20 weeks, while the other half received a placebo – a pill that resembled spironolactone but was medically inert. The researchers considered various factors of interest, including objective and subjective physical function. Subjective physical functioning was measured by self-reports. Objective physical function was assessed by the distance that participants could walk within 6 minutes, a simple but widely used, reliable measure of exercise performance in the elderly.

The results were surprising: treatment with spironolactone did not increase the distance that participants could walk within 6 minutes, suggesting that the intake of spironolactone did not enhance their muscle functioning. In contrast to the objective findings, the self-report measures showed a different picture: after 20 weeks, only those participants who had been treated with spironolactone reported an improvement in quality of life. Whether this was caused by a pain-relieving effect or by other pharmacological benefits associated with spironolactone is not known; further studies are required to clarify these results.

Despite the fact that the results could not confirm any positive effect of spironolactone on muscle strength, this should not be interpreted as a definite exclusion of its efficacy as a counteragent for age-related muscle decline. Rather, the findings demonstrate that spironolactone was well tolerated in participants without heart failure and improved their quality of life. This provides encouragement to test the drug at higher doses and for longer treatment durations. Future studies are needed to determine whether these advanced treatments can cause the desired physiological effects, but the current findings help to define them.

This summary by Jenny-Charlotte Baumeister was shortlisted for Access to Understanding 2014 and was commended by the judges. It describes research published in the following article, selected for inclusion in the competition by the Chief Scientist Office of the Scottish Executive:

PMCID: PMC3695565
L.A. Burton, D. Sumukadas, M.D. Witham, A.D. Struthers & M.E.T. McMurdo.
The American Journal of Medicine (2013) 126(7), 590-597.

Access to Understanding entrants are asked to write a plain English summary of a research article. For Access to Understanding 2014 there were 10 articles to choose from, selected by the Europe PMC funders. The articles are all available from Europe PMC, are free to read and download, and were supported by one or more of the Europe PMC funders.

Look out here and on Twitter @EuropePMC_news for further competition news and other Europe PMC announcements.   

Wednesday, 28 May 2014

The TBPH gene – Do neurodegenerative diseases have a fly in the ointment?

Image design: Serial/Trash
A number of genes have been implicated in neurodegenerative diseases such as amyotrophic lateral sclerosis (ALS), also known as motor neurone disease, and Frontotemporal lobar degeneration (FTLD). However, the core biological processes involved in these disorders are extremely difficult to model and this is hampering the effort to develop treatments. If we can resolve this and increase our knowledge about these disease processes it may be possible to develop new and improved treatments.

The gene TDP-43 in ALS and FTLD
The gene TDP-43 has been shown to be involved in the development of the neurodegenerative diseases ALS and FTLD in humans; however, the exact way(s) in which it does this remain unclear. The TDP-43 protein acts as a master regulator of RNA, a precursor to proteins, and can therefore control the expression of genes within cells. A recent study by a group led by Dr. Frank Hirth from King’s College London has shed some light on this by using an interesting choice of organism: Drosophila melanogaster, the common fruit fly.

A fly way to model diseases
Researchers in a wide variety of fields are looking to Drosophila to model complex diseases. But why choose flies – they couldn’t be any more different biologically from us, could they? In fact, fruit flies share many biological processes with us; they have a complex immune system, an intricate brain and can even develop cancer! Genes that are similar in structure and cellular function in flies and humans are called homologues; they are retained from a common ancestor by a process called evolutionary conservation. What makes the fruit fly a great model for studying disease is that it has fewer genes than humans. Where humans often have multiple genes that perform essentially the same function, a feature known as redundancy, flies often have only one gene at each stage of a signalling pathway. This means that if you alter that one gene the effect should be more obvious. Imagine calling in sick for work when no one there is able to cover for you: your work just won’t get done, and others will see exactly what your role is!

Altering the function of fruit fly TBPH models neurodegenerative disease
The fruit fly homologue of the TDP-43 gene is called TBPH, and it too has been implicated in neurodegenerative disease. To investigate TBPH, the researchers from King’s made fruit flies which either lacked the gene or had increased gene expression. Interestingly, both types of fly showed key symptoms of neurological disorders, such as a decreased life span and motor dysfunction – the latter measured by videoing the flies walking and then recording the speed and gait of their movements. TBPH may therefore be considered a ‘Goldilocks gene’: too much or too little is biologically bad, and ‘just the right amount’ is required for normal health.

What mechanism did this study find?
To discover how loss or gain of TBPH leads to altered motility, the researchers began by comparing physical features of normal flies with those of flies lacking TBPH. Firstly, they noted that there were no visible differences in muscle or synapse structure. This led them to dig deeper into the process of motor function. Using a series of complex experiments measuring the electrical currents generated at the gaps between neurons and muscles (neuromuscular junctions) of TBPH-expressing, non-expressing or over-expressing flies, the group were able to suggest that the defects exist at the pre-synaptic stage of signal transmission. The flies have trouble transmitting neurological signals from the neurones to the muscles, and this causes the physical difficulty in movement, as seen in humans with disorders driven by dysfunctional motor neurones.

Modelling the aging fly
Humans typically develop neurological diseases during old age. A fruit fly model in which TBPH loss, or over-expression, was targeted only to a specific subset of neurones allowed flies to age normally. It was not until flies had reached an ‘old age’ of 40 days that they started to lose pre-synaptic function, ultimately leading to symptoms of neurodegeneration – suggesting that TBPH loss or over-expression has a progressive effect on neurological degeneration. Optimal TBPH levels in neurones are therefore critical for maintaining proper motor neuron function during aging, supporting the role of the TDP-43 gene as a driver of motor neuron disorders. This study elegantly highlights the usefulness of the TBPH fruit fly model as a great way to research the underlying mechanisms behind these disorders. If more is discovered about these mechanisms then new and exciting therapeutic agents could be developed for treating neurodegenerative diseases. It is clear that when it comes to neurodegenerative disease research, the flies have it!

This summary by John Foster was shortlisted for Access to Understanding 2014 and was commended by the competition judges. It describes research published in the following article, selected for inclusion in the competition by the Motor Neurone Disease Association:

PMCID: PMC3605831
D.C. Diaper, Y. Adachi, B. Sutcliffe, D.M. Humphrey, C.J.H. Elliot, A. Stepto, Z.N. Ludlow, L. Vanden Broeck, P. Callaerts, B. Dermaut, A. Al-Chalabi, C.E. Shaw, I.M. Robinson & F. Hirth.
Human Molecular Genetics (2013) 22(8), 1539-1557.

Access to Understanding entrants are asked to write a plain English summary of a research article. For Access to Understanding 2014 there were 10 articles to choose from, selected by the Europe PMC funders. The articles are all available from Europe PMC, are free to read and download, and were supported by one or more of the Europe PMC funders.

Look out here and on Twitter @EuropePMC_news for further competition news and other Europe PMC announcements.   

Wednesday, 21 May 2014

A divorce in development: single regulators can raise arteries alone

Image Source: Serial/Trash
Understanding how blood vessels are born and propagated is vital for the treatment of a whole host of diseases including heart disorders, diabetes and cancer. Scientists from Oxford’s Ludwig Institute for Cancer Research have begun to reveal the mechanism by which the switching on of specific genes leads to the development of arteries. 

A vast network of blood-carrying arteries feeds our body with the oxygen and nutrients it needs to survive. Within a young embryo, this network takes its primitive shape in a series of stages. First, the cell type which will later make up the inner walls of all blood vessels, the endothelial cells, is generated. Then simple tube-like structures of these endothelial cells must differentiate into either arteries or veins.

But the story doesn’t end there – the process of sprouting new blood vessels continues throughout life and indeed maintaining just the right distribution is critical to our health. Too few, too many or abnormally-developed blood vessels can all lead to disease. Interestingly, although cancer and Alzheimer’s disease are very different conditions, scientists believe that the underlying molecular processes responsible for the defective blood vessel development that comes with them are very similar and therefore exciting targets for research.

All aspects of our development, from the formation of vital organs within the embryo to the healing of wounds in adulthood, utilise similar molecular tools to lay down the pattern which governs how cells and tissues specialise into one of many types – rather like a blueprint. External signalling molecules are deployed to pass instructions to cells depending on where they lie in the overall blueprint. These signals can be sensed by each cell individually via receptor molecules protruding from their surface.

Scientists are confident about which signalling molecules are released during artery development. Vascular Endothelial Growth Factor (VEGF) spreads diffusely across tissues and is the primary driver of general blood vessel formation. The Notch pathway, which operates when adjacent cells touch, is implicated in deciding which vessels become arteries. However, signalling messages are short-lived – how does an artery know to remain an artery? It is this last link in the chain that, until now, scientists have been most unsure about – how can several signalling pathways be combined inside the cell so that the correct genes are turned on for operating an artery?

All cells carry a copy of the entire genome, but few genes are required in every cell or all the time. Genes lie adjacent to ‘enhancers’, DNA sequence elements that do not encode protein but rather allow control of when, where and how fast a gene is read. Such control is governed by DNA-binding proteins, which sit on the DNA structure and interact with the gene-reading machinery.

Image Source: Shutterstock Copyright: Crystal Eye Studio

Dr Sarah De Val and her colleagues at Oxford have conducted a series of experiments in mice and zebrafish that reveal which DNA-binding proteins are important in the formation of arteries. They first pinpointed which enhancers are most important for the activation of an artery-specific Notch gene before demonstrating which of the known DNA-binding proteins engage them. These included a DNA-binding component of the Notch pathway and three members of the SOX-gene family, utilised during development throughout the animal kingdom.

By fusing copies of the artery enhancers to a bacterial gene that produces a bright blue protein when activated, it was possible for the researchers to trace the pattern of artery formation at different stages during embryo development. Unsurprisingly, when they cut out the binding sites at which the proteins responsible for formation of endothelial cells associate with enhancer DNA, or chemically disabled the VEGF signalling pathway, the normal pattern of Notch gene activation was completely lost. But intriguingly, deleting the binding sites for the SOX and Notch proteins only had a severe effect when carried out in parallel – loss of regulation by either SOX or Notch individually was of little importance.

This finding was echoed by injecting inhibitory DNA molecules into embryos to simultaneously turn off the genes encoding the DNA-binding SOX and Notch proteins. Although endothelial cells were able to form a network of primitive blood vessels, the principal artery, the aorta, was missing and none of the known genes common to arteries were activated.

As a general rule, developmental characteristics tend to emerge in cells located in regions where two or more necessary signals overlap. This research, suggesting that proteins of either the SOX or Notch pathways alone are sufficient for much of artery function without the other, intriguingly contradicts this.

Highlighting the fact that the vascular system is extremely sensitive to genetic fine-tuning, Dr De Val’s study reveals some of the first molecular targets for potential vascular disease therapies. At the same time, it exposes some unusual molecular intricacies that will continue to excite scientists for some time.

This summary by Christopher Waite was shortlisted for Access to Understanding 2014 and was commended by the judges. It describes research published in the following article, selected for inclusion in the competition by the British Heart Foundation:

PMCID: PMC3718163
N. Sacilotto, R. Monteiro, M. Fritzsche, P.W. Becker, L. Sanchez-del-Campo, K. Liu, P. Pinheiro, I. Ratnayaka, B. Davies, C.R. Goding, R. Patient, G. Bou-Gharios & S. De Val.
Proceedings of the National Academy of Sciences USA (2013) 110(29), 11893-11898.

Access to Understanding entrants are asked to write a plain English summary of a research article. For Access to Understanding 2014 there were 10 articles to choose from, selected by the Europe PMC funders. The articles are all available from Europe PMC, are free to read and download, and were supported by one or more of the Europe PMC funders.

Look out here and on Twitter @EuropePMC_news for further competition news and other Europe PMC announcements.    

Friday, 16 May 2014

Now available: use DOIs with our External Links Service

Our External Links Service enables links to be created from articles on Europe PMC to free third-party resources that enrich our articles. Since launching the service last July we've been joined by providers who have set up links to an ever-widening range of useful resources, including data underlying articles, press releases and plain English summaries, and article full text not otherwise held by Europe PMC – to name but a few. We now have 14 providers enriching 33,000 articles with links out to additional information.

A recent development makes setting up these links even easier. Whereas previously you needed a PMID or PMCID associated with the information for which you wanted to create a link, we've now enabled links to be created using DOIs, a commonly used, stable document identifier. The first provider to create links using DOIs is PANGAEA, a data publisher for earth and environmental science.
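
Incidentally, if you hold only a DOI and want to see which Europe PMC record it corresponds to, the Europe PMC RESTful web service can be queried by DOI. Here's a minimal sketch in Python – the endpoint and response field names reflect our reading of the REST API documentation, so do verify them against the current docs; the DOI shown is a made-up placeholder:

    # Look up a Europe PMC record by DOI via the REST search API.
    # Endpoint and field names assumed from the Europe PMC REST docs;
    # verify against the current documentation before relying on them.
    import json
    import urllib.parse
    import urllib.request

    def lookup_doi(doi):
        query = urllib.parse.quote(f'DOI:"{doi}"')
        url = ("https://www.ebi.ac.uk/europepmc/webservices/rest/search"
               f"?query={query}&format=json")
        with urllib.request.urlopen(url) as response:
            data = json.load(response)
        results = data["resultList"]["result"]
        return results[0] if results else None

    record = lookup_doi("10.1234/example.doi")  # made-up placeholder DOI
    if record:
        print(record.get("pmcid"), record.get("title"))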


Image source: Shutterstock Copyright: Thomas Reichhart
As always we welcome your feedback, and we would be delighted to hear from you if you want to get involved in this free service - at labslink@europepmc.org.

To stay up-to-date with Europe PMC news you can also follow us on Twitter @EuropePMC_news.

Wednesday, 14 May 2014

Reforming rheumatoid arthritis treatment: a step in the right direction

Image Source: Serial/Trash
There is good news for rheumatoid arthritis sufferers: scientists are a step closer to predicting which patients will benefit the most from a particular type of drug using just a urine sample.

Imagine being in pain whilst carrying out routine, daily tasks such as opening a door, reaching for something in the cupboard, or writing. On top of this, imagine that this pain often comes on quickly and intensively, but not being able to predict when this will happen. This is what thousands of people with rheumatoid arthritis have to cope with every day.

Usually, the cells of your immune system protect your body from invasion by harmful viruses and bacteria, acting as your body’s army. Rheumatoid arthritis occurs when some of these immune cells start to attack your own cells, specifically in the healthy tissue around your joints. This results in the affected joint and surrounding area becoming swollen, painful and stiff. These symptoms often intensify very quickly and without warning, resulting in severe discomfort for the person concerned. What causes these sudden ‘flares’ and the onset of the disease in general is still largely a mystery.

There are currently several different types of drugs available to treat the symptoms of rheumatoid arthritis. Unfortunately, it can take a period of trial and error to find out which drug works best to reduce symptoms in each patient. Doctors and scientists are currently trying to find out why some drugs work very well in some people, but not in others. This research is important as it will ensure that, in the future, patients are more likely to be given the drug that works best for them, rather than wasting time on trial and error.

TNF-alpha inhibitors are a group of drugs currently used to treat moderate to severe rheumatoid arthritis. TNF-alpha is one of the proteins responsible for creating the swelling and pain in joints affected by rheumatoid arthritis. These drugs block the TNF-alpha protein, and so in turn reduce the symptoms associated with rheumatoid arthritis. They work very well in reducing symptoms in some patients, but do not work for many others. This latest research reveals a potential way to identify which people with rheumatoid arthritis are most likely to respond well to the TNF-alpha inhibitor drugs, by looking at their urine.
Image Source: Shutterstock Copyright: MILA Zed 
The researchers decided to see if they could find any differences in the amounts of certain molecules in the urine of rheumatoid arthritis patients who responded well to the TNF-alpha inhibitors, compared to the urine of those who did not respond well. They chose to look at the patients’ urine as a similar approach has already been successful with other diseases. Obtaining a urine sample is also very quick, easy and not intrusive to the patient.

Sixteen rheumatoid arthritis patients were chosen to participate in the study. They each gave a urine sample before commencing treatment with one of two TNF-alpha inhibitor drugs, which continued for one year. After this time, a doctor assessed the patients and decided, based on their symptoms, who had responded well and who had not. At the same time, scientists analysed the urine samples the patients had given at the beginning of the study. They did this using a technique called nuclear magnetic resonance spectroscopy, which takes advantage of the fact that different molecules behave differently when they come into contact with a magnetic field. When a magnetic field was applied to the urine samples, the different molecules in the urine could be told apart, and a machine identified exactly which molecules were present, and in what amounts.

Three different computer programs analysed the results to make sure the final conclusions were accurate. All three programs agreed that the patients who responded well to the TNF-alpha inhibitor drugs had more histamine, glutamine and xanthurenic acid molecules in their urine, but fewer ethanolamine molecules, than those who did not respond well. These findings mean that, if this test were used in the future, a doctor could take a urine sample from a rheumatoid arthritis patient to help him or her decide whether the TNF-alpha inhibitor drugs are likely to ease the patient's symptoms.

This study was small, so the next step is to test these findings on a larger group of rheumatoid arthritis patients. These initial findings are exciting though, and not just for those patients who are most likely to benefit from the TNF-alpha inhibitors. For patients where the drugs are not likely to work, this will be known quickly, and an alternative can be suggested instead. This means that each patient is more likely to receive a treatment that helps his or her symptoms as quickly as possible; something which is invaluable.

This summary by Elizabeth McAdam was shortlisted for Access to Understanding 2014 and was awarded second prize. It describes research published in the following article, selected for inclusion in the competition by Arthritis Research UK:

PMCID: PMC3715109
S.R. Kapoor, A. Filer, M.A. Fitzpatrick, B.A. Fisher, P.C. Taylor, C.D. Buckley, I.B. McInnes, K. Raza & S.P. Young.
Arthritis & Rheumatism (2013) 65(6), 1448-1456.

Access to Understanding entrants are asked to write a plain English summary of a research article. For Access to Understanding 2014 there were 10 articles to choose from, selected by the Europe PMC funders. The articles are all available from Europe PMC, are free to read and download, and were supported by one or more of the Europe PMC funders.

Look out here and on Twitter @EuropePMC_news for further competition news and other Europe PMC announcements.