Poll results on quality of research: Journals 3-2 H-Index

What a surprise! There were fewer responses than in other polls, and I thought there was more interest in this topic, but the results are clear on quality of research: Journals 3-2 H-Index.

Is Google Scholar a good indicator of the quality of your research activity and influence?


* The poll was posted in October 2013 in many academic discussion groups. Around 470 answers were collected.

Although Google Scholar is open and reliable because it treats scholars equally, it is not considered a good indicator of the quality of research activity and influence. This is hard to believe and difficult to digest, because I had hopes for it. So I can only guess what you think of Altmetric, which is based on a broader idea of impact, not just academic production…

The reasons for these results are implicit in the survey: if, on the one hand, these metrics provide useful public information, on the other hand I understand that, in general, H-Index metrics face the following barriers:

  1. They add pressure on researchers.
  2. No organization seems to be looking seriously at them.
  3. Not many people use them, because their citation counts look poor compared with those of some champions in the sciences, and because they are yet another tool to take care of.
  4. And, mainly, because journals are still considered a better indicator of research quality.

Well okay, I get it: we will look at individual metrics, but the focus stays on publishing in indexed journals. I do not think this will change much in the medium term; peer review will remain the king of research quality assessment, and that makes perfect sense.

For journals, on the other hand, these results are a triumph and a shot of adrenaline. The road ahead for them is to be indexed in well-known databases and to embrace open access, so that authors can be cited and have an impact, isn't it?

POLL: Do you bet your future as a researcher on Google Scholar metrics?

The journals' world is boiling: the Internet and Open Access are questioning the indexed journals' model, from anonymity in peer review to the selection criteria of directories and databases, and the calculation of journal impact indicators.

An alternative model for measuring the quality of research is the personal citation index (H-index), which can be computed with open, accessible and public tools such as Google Scholar; although there are also private, paid metrics such as Altmetric.
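To make the idea concrete, here is a minimal sketch in Python of how a personal citation index such as the h-index is typically computed from a list of citation counts; the citation numbers below are invented for illustration only.

```python
def h_index(citations):
    """Return the largest h such that the author has h papers
    with at least h citations each."""
    # Sort citation counts from highest to lowest.
    sorted_counts = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(sorted_counts, start=1):
        if count >= rank:
            h = rank   # this paper still supports a larger h
        else:
            break      # remaining papers have too few citations
    return h

# Hypothetical citation counts for one researcher's papers.
papers = [25, 8, 5, 3, 3, 1, 0]
print(h_index(papers))  # -> 3 (three papers have at least 4 citations, so h stops at 3)
```

The same definition is what Google Scholar applies to a profile's citation record, which is why the index is so easy to reproduce publicly.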

  • The change is motivated by the fact that the indexed journals' model does not fully convince scientists, because of the power that some journals and private directories hold. But it is producing an undesirable effect: both models are now used to evaluate the quality of research, with the danger of drowning professors and scientists even more.
  • And, paradoxically, journals are also under pressure, because now they have to implicitly 'promise' that articles published with them will be cited more.

Google Scholar metrics are here to stay. What do you think? Do you bet on your citation appeal?

* You may choose 1 or 2 answers.
**Comments are highly encouraged.

Traditional vs Alternative means of dissemination in academia. Poll results

There are many pressures for change regarding the dissemination of research, such as the current Web 3.0 technology environment in education, open access journals and repositories, and the consolidation of citation metrics tools.

Professors and researchers shared their vision of the future of publishing by voting in the poll.


Indexed journals have been adding high value to all academic stakeholders, and they will continue to do so.


In general, the results show a balance between the traditional (48%) and the alternative (52%) means of dissemination in academia, but there are other quite interesting conclusions:

  1. “Open access journals/directories with peer review” was the preferred means of dissemination, with 29% of the votes; it makes sense given the expectations that citation rankings are creating.
  2. Added together, “Indexed journals” + “impact factor” would be the most voted option (39%); the current journal system still prevails.
  3. “Repositories with peer review” + “number of downloads” were chosen by 21% of respondents, opening an interesting path to new alternatives for the dissemination of scientific knowledge in academia.

Traditional vs Alternative

Professors are rational people with common sense: we understand that change is needed in the system, but little by little, as it is working reasonably well. It is as if we are waiting to see how those changes develop and how journals and publishing houses respond to them. Surely they will do well.

* The poll was posted in June 2013 in many academic discussion groups. Around 900 answers were collected. 

Research in Sciences: Pieces of advice from an outstanding researcher

Montserrat Guillén was born in Barcelona in 1964. She received a Master of Science in Mathematics and Mathematical Statistics in 1987 and a PhD in Economics from the University of Barcelona in 1992. She also holds an MSc in Data Analysis from the University of Essex (United Kingdom). She was a visiting research faculty member at the University of Texas at Austin (USA) in 1994. Montserrat also holds a Visiting Professor position at the University of Paris II, where she teaches Insurance Econometrics. Since April 2001 she has been chair professor of the Department of Econometrics at the University of Barcelona. Montserrat was awarded the ICREA Academia distinction.

Her research focuses on actuarial statistics and quantitative risk management. She has published many scientific articles, book chapters and books on insurance and actuarial science. She is an Associate Editor of the Journal of Risk and Insurance – the official journal of the American Risk and Insurance Association – a Senior Editor of the Astin Bulletin – the official journal of the International Actuarial Association – and Chief Editor of SORT – Statistics and Operations Research Transactions. Montserrat has received an award from the Casualty Actuarial Society and the International Insurance Prize. She is a highly cited academic in the field of risk management and insurance. She was elected President of the European Group of Risk and Insurance Economists (the Geneva Association) in 2011. She serves on many scientific boards, international programs and steering committees, and conducts joint R&D programs with many companies.

Gaudeamus. How do you select your research projects, or do they select you? 

Montserrat Guillén. I usually apply to academic institutions for research project funding. The topics are usually basic research with a very long-term and ambitious perspective, which means that the application is not immediate. When private funding comes into play, it is usually because very specific research with direct transfer to industry is expected.

G. You usually collaborate with international scholars; it cannot be easy to coordinate and organize that research. Is there any aspect worth mentioning that could help us researchers with international projects?

MG. There must be a leader. The leader must be open-minded, active and motivating, and has to set short-term and long-term goals for the team. Everyone involved must know his/her role in the project and why his/her contribution is important to the whole group.

G. If you had to prioritize, what would you put in first place: teaching or research?

MG. Both. Even if a lecturer is very good, good teaching is even better with good research. I find that we usually forget that research advances have to be introduced into the syllabuses, and this is essential for high quality education. Research also benefits from teaching, because communicating research results requires many of the skills that are developed when teaching.

G. What is the research activity you like most?

MG. I really enjoy the instants when a new result is obtained. There are some seconds of doubt, and then an explosion of joy when the result is confirmed. Sometimes this happens when working on my own and sometimes it is shared with colleagues. If I obtain a result and no colleague is next to me, I immediately share it with my colleagues.

G. Once you have a draft research document, what key issues should be taken into account until it is published?

MG. Audience, structure, and correctness in every sense.

G. The Internet and open access are changing the scholarly publishing industry; are they also changing research activity?

MG. They do, because searching for information is much easier than it used to be. Reading the essential papers is important when there are so many out there.

G. How do you choose the journals in which to publish? Or, if you prefer, what do you look for in a journal?

MG. The topic and the impact factor. I look for a sign of quality.

G. Finally, what advice would you give to novice researchers (for example, about collaboration, time dedicated to research, making an impact, etc.)?

MG. I would recommend spending a lot of time on how to explain the research result. Some very good contributions get no recognition due to poor presentation. Correctness, clarity and motivation are crucial for the success of a paper.

Back to basics: The role of journal indexes

I wonder about the contribution of journal indexes / databases to the assessment of research quality.

Lately, and against what would seem logical given the major changes the publishing industry is experiencing, professors are increasingly required to publish in journals indexed in Journal Citation Reports (JCR), both at the state level for accreditation and at universities, especially private ones.


If indexes and impact indicators were a kind of accreditation of the quality of journals' processes, particularly peer review quality and the editorial board, I would understand all this alarm about publishing in first-class, reputable indexes. But apparently that is not the case:

  • To be included in JCR, journals have to demonstrate that they publish regularly, are printed in English, have an international editorial board and meet other requirements that have little to do with the quality of the papers within.
  • To have a journal indexed in Scopus and other well-known databases, it is enough to fill out a form giving them permission to use the journal's data.
  • Along the same lines, other similar indexes (generalist, regional or specialist) only require an application form to be filled in.

So, what are the main sources of prestige for a journal? I will point out just a few:

  • Large base of readers.
  • Quality of authors and papers.
  • Sound peer reviewer processes, with good reviewers and feedback.
  • Good Editorial board and clear editorial line, objectives, etc.

If that is true, then what makes the difference with un-indexed peer-reviewed journals? It is not very clear to me; it looks like a kind of complex corporate governance system for journals: different publishing stakeholders (indexes, journals, professors, researchers, universities, departments, accreditation bodies, governments, readers, peer reviewers, editors, journal owners, etc.) taking care of research prestige and reputation.

Many voices in academia call for a change, but is there a better system than journal indexes and impact indicators for assessing the quality of research?

POLL: The future of research quality assessment

The main drivers of change regarding the assessment of research quality and its dissemination are the current Web 3.0 technology environment in education, open access journals and repositories, and the consolidation of citation metrics tools.

Indexed journals have been adding high value to all academic stakeholders: professors, researchers, publishers, editors, professionals, universities, faculties and libraries; but has the time arrived for journals to change?


Shape the future of publishing by voting in the poll. Share your vision with us.

Tomorrow belongs to cites

Over the last decades, journal rankings have moved from something only a few librarians cared about to something that is now critical to the future of professors and researchers. The same thing could happen to individual citation metrics.

  • The Internet and the open access movement are urging academia to reconsider the current model of research assessment, journal rankings and each phase of the publishing process, such as the private citation system, the growing role of repositories, the subscription and payment model, and even peer review and impact indicators.
  • Assessment of the quality of research activity is needed, whether of the journal, of a department or of an individual; no one doubts it. The problem is what type, and the ideal would be all of them. Some countries already rate individual academics by levels, for example the UK (REF), Australia (ERA) or Spain (ANECA), taking into account many more things, such as teaching assignments, research centers or stays at international universities.
  • We now have journal rankings, but they will probably have less relevance in the future with open access, though they could be more necessary in the short term due to the initial confusion around the evaluation of research quality. If the move is towards individual citations, and their calculation is improved, for example with a bias corrector by field of knowledge and years of experience (a sketch of such a correction follows this list), why would we need journal rankings and impact factors? One could go directly to individual citations to estimate the quality and prestige of the researcher. Is there anything more real and tangible than citations?
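As a purely illustrative sketch of the kind of "bias corrector" mentioned above, the Python snippet below normalises a researcher's citations by an assumed field average and by years of experience. The field averages, the researcher data and the formula itself are all invented for illustration; they are one possible choice among many, not an established metric.

```python
# Hypothetical average citations per paper in each field (invented numbers).
FIELD_AVG_CITATIONS_PER_PAPER = {
    "mathematics": 4.0,
    "medicine": 12.0,
    "economics": 6.0,
}

def adjusted_citation_score(total_citations, n_papers, field, years_active):
    """Citations per paper, scaled by field average and career length."""
    field_avg = FIELD_AVG_CITATIONS_PER_PAPER[field]
    citations_per_paper = total_citations / n_papers
    # Relative to the field: 1.0 means "average for this field".
    field_normalised = citations_per_paper / field_avg
    # Spread over the career so junior and senior researchers are comparable.
    return field_normalised / years_active

# Two hypothetical researchers with the same raw citation count:
print(adjusted_citation_score(300, 30, "medicine", 20))    # senior, high-citation field
print(adjusted_citation_score(300, 30, "mathematics", 5))  # junior, low-citation field
```

The point of the sketch is only that the same raw count of 300 citations can mean very different things once field and seniority are taken into account, which is the bias the raw indicators ignore.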

This brings me back to the oldest question of all: publish (and be cited) or perish? That is, the pressure on professors. I wonder whether the same assessments could be applied to other professionals, such as judges, politicians or even bankers. Don't you think so?

Poll: reasons for selecting a journal to submit a paper to


Choosing the journal to which we send our paper is critical for two main reasons:

  • Objective factors. The main thing is that it fits our publication strategy: indexed, with an impact factor, peer reviewed, open access, among other criteria.
  • Subjective factors. Then, and not least, we have to find the journal that is looking for the kind of research and manuscript we have written. This way we can avoid sending our paper to multiple journals, wasting our time and morale.

Worst practices of author misconduct

There are several ideas going around in my head regarding the reasons for the growing plagiarism in academic publications and why someone is willing to get into this game for money:

  • There is great pressure to publish.
  • Capitalism is pervading everything.
  • In general, professors are not well paid.
  • Some publishing activities, such as academic editing or peer review, are not remunerated.

Trying to clarify this issue in blogs and online discussions, I have been able to compile a list of the types of plagiarism that currently exist, which could be seen as the worst practices of pirate authors:

  • Plagiarism: kidnapping or appropriating others' thoughts and ideas without acknowledging their source.
  • Self-plagiarism or recycling fraud: reusing your own texts without acknowledging the previous publication.
  • Ghost writing: writing books, articles or other texts that are credited to another person, generally for money.
  • Honorary authorship: including authors in a publication who added no value or contribution, inflating their credentials.
  • Duplicate publication: using your own publications more than once, changing the title and abstract.
  • Salami slicing: creating several short publications out of material that could have, perhaps more validly, been published as a single article in a journal or review.
  • Remix or mosaic plagiarism: mixing several publications to obtain more publishable units.
  • Image and data manipulation: modifying data and results to obtain another document for publication.

The combination of some of the above activities, such as ghost writing and plagiarism, is amusing and dangerous at the same time: you pay for an article to be written, but it turns out to be plagiarized, so in the end, apart from wasting your money, you run many risks, such as reputational damage.

I am not sure it was like this before, but now, with open access and the Internet, it is becoming easier to detect plagiarism of any of the existing types. Recently in Spain a professor was condemned for plagiarizing a student's chapter. In line with the worst practices above, the article could have been co-authored with the student – that is, the professor adds his name and the student the content – or did he not even remember that it was not his? But I guess believing yourself to be very smart is worse than plagiarism.

Journals that ask for money: poll conclusions


This poll was posted in January 2013 in different academic forums; 250 answers and approximately 75 comments were collected. The question was: would you publish in an indexed journal that charges you fees? The results: 39.60% Yes, 60.40% No.

Regarding the question itself, it was a closed one, with no option to answer 'it depends', because that would have been the preferred one.

My summary of the issue and of your comments is:

1. Journals have no access to funds; they have a lot of expenses to make the journal attractive to authors, and that is why they ask for a fee.
2. Journal editors feel uncomfortable and somehow ashamed about this issue.
3. If the fee is reasonable, not higher than around 300 dollars, it is a worthwhile investment for the author, but only if the journal is well indexed and open access.
4. Authors are able to pay this amount if they share the fees with co-authors and they get some help from their university.

But…
1. Some authors from developing countries find 300 dollars out of their reach, even with co-authorship and a university refund, if any. This upsets scholars in general, because it is quite unfair and incompatible with equal opportunities, and because this system closes the door on some knowledge being disseminated.
2. Moreover, academics feel uneasy about this whole issue; they do not understand why some journal owners earn a profit yet do not pay for content or peer review. It is as if they were taking advantage of scholars' need to publish in indexed journals, and again, that is unfair.
3. Maybe the ethics of some journals/publishers are questionable, but I think the model itself is beneficial for the industry. The challenge is about the knowledge (innovative, reliable, reputable, biased, ethical, etc.), not about who pays for it or its dissemination.

Charging fees to authors is an uncomfortable necessity for journals, and it works because scholars are obliged to publish in journals. Well, it makes sense as a business, but wasn't it all supposed to be about knowledge and its dissemination? Who can improve this system? Governments? The market itself? Technology?…
