Absurd academic cases? Looking for candidates with a minimum h-index

Although the h-index is well known, I didn’t know it could be used as a minimum requirement for certain academic positions; something I had never seen before, despite having reviewed academic job posts from universities all over the world for years!

To illustrate this, I recently received the following academic job offer:

  • “From the Department of International Relations we share with you this call for professors of sociology, economics and history interested in pursuing an academic stay at Saint Petersburg State Polytechnic University (Russian Federation). Candidates should have at least 2 points in the Hirsch index.”

The h-index is, for a given researcher, the number h of publications that have each been cited at least h times in academic journals. For example, my h-index is 2 (in Finance) because I have 2 papers cited 2 or more times each; the rest of my publications have 1 citation (or none) each. To reach an h-index of 3, I would need 3 articles cited at least 3 times each, so some of those other papers would have to gather more citations.
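Since the definition trips people up, here is a minimal sketch in Python of how the h-index is computed; the citation counts are invented for illustration, mirroring the example above:

```python
def h_index(citations):
    """Return the h-index: the largest h such that the researcher
    has h papers with at least h citations each."""
    # Sort citation counts in descending order and walk down the list:
    # position i qualifies while the i-th best paper has >= i citations.
    h = 0
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= i:
            h = i
        else:
            break
    return h

# The example from the post: two papers cited twice, the rest once or not at all.
print(h_index([2, 2, 1, 1, 0]))  # 2
# Reaching h = 3 requires three papers with at least 3 citations each.
print(h_index([4, 3, 3, 1, 0]))  # 3
```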

But this convoluted measure seems to work, at least within the same field of knowledge. So the pirates who wrote the announcement knew what they were doing, and I guess they avoid reviewing and/or analyzing (with love and care) a lot of applications: from those who do not understand the h-index, those who do not maintain it, or those without citations, among others. Why waste time browsing hundreds of academic CVs when they have the miraculous h-index?

Moving forward, Google Scholar calculates your h-index, though you can also compute it on your own; but then you have to document and demonstrate it, indicating journals, dates, articles, authors, etc. The weak point is that Google Scholar also counts self-citations (should they be included in the h-index?) and citations from non-indexed journals, books, and other publications. But it’s what we have.

Therefore, apart from the overwhelming logic of using the h-index to select researchers, don’t you think it is outrageously absurd? You can imagine the working environment and pressure at a place that demands a minimum h-index from its professors/researchers… but it’s up to you and your needs.


Academic networks contest: ResearchGate vs. Academia vs. Mendeley

As a university professor, under great pressure to publish in academic journals, I find generalist academic networks such as ResearchGate, Academia.edu or Mendeley essential; they help me to:

  • Disseminate my published articles on the web, to try to obtain citations and build a name in the scientific community in my field of expertise
  • Find research papers quickly and easily
  • Search for collaboration and international research projects
  • Share ideas and find solutions

I wonder if you can do the same on Facebook or LinkedIn. Facebook clearly can’t, because it’s very focused on leisure and personal life; but what about LinkedIn, with its millions of connected professors and professionals interested in science and research?

But no, it seems that we researchers need a network of our own, with specific functionality for usability and sociability (mainly source credibility), the two main factors for evaluating online communities (Chinthakalaya, 2013).

I don’t intend to make a thorough analysis of the technical and functional characteristics of these platforms; rather, from the point of view of the user or scientist, I’ll offer an outline of their main features, to make the most of our time and to know what to expect from each. And although this is a blog and not a scientific research project, I have also taken into account the views of other users of these academic networks, such as those expressed in this ResearchGate forum.

Obviously, I have created profiles on all three platforms. One important point is that you are practically forced to be on all of them; but if you don’t constantly update your profile and papers, the effort will be useless.

ResearchGate

  • I’d highlight its:
    • Interactivity: collaboration and discovery through its discussions/questions and its publication repository.
    • Intelligence: the statistics and scoring of your work are a great invitation/encouragement to participate and interact, though its administrators are very aware of everything posted on the network, moderating content as if we were small children.
    • And source credibility: only researchers are accepted, and they use it a lot because of the scoring mentioned above.
  • But ResearchGate still has to improve its repository: I find it difficult to upload all my publications, not just papers, and it sometimes fails to find the links to retrieve the data when uploading them.

Academia.edu

  • Its strong point is the repository of publications: it allows you to post a link to your paper, so other researchers can download your papers directly from the original source (SSRN, RePEc, arXiv.org, CiteSeerX or SSOAR), which increases your score on those repositories, if that matters to anyone.
  • But I find it less democratic and, in the end, less interactive: it’s very restrictive when it comes to disclosing your ideas and exchanging views with other researchers. For example, Academia.edu has deleted almost all of my new discussions, and they even closed my first profile there, so I had to open another; this gives me the creepy feeling of censorship and of being watched by a big brother, with spam as the excuse.

Mendeley

  • It works more as a reference management system (organize and search bibliographies, add papers from the web to your library, etc.), with both online and desktop versions that are sometimes difficult to understand. So it isn’t really an academic network, but it has “a social network integrated”, which gives you an idea of its limited social and sharing capabilities.
  • It also has strong corporate and commercial connotations: Mendeley was acquired in 2013 by Elsevier, the publishing house; yes, the one asking scientific social-networking sites and authors to remove papers posted online without its permission.

In conclusion, ResearchGate and Academia.edu are very similar social networks for scientists, each with its particular strengths and software, but I foresee a better future for ResearchGate because of its commitment to sociability; though not as much as LinkedIn, my favorite generalist academic network.

Too bad none of them specializes in socializing the process of publishing in scholarly journals, for both editors and authors.

 

Fishing citations for your papers. An introduction

Actively seeking citations for your papers is frowned upon in our academic community; it’s compared to snake-oil selling. Yet since it’s not a perverse activity in itself, and the circumstances and the current publishing system force us into it, I’m wondering how to approach it so that it might be better accepted.

What I’ve written so far in this blog, and the limited literature I have found on the subject, is based on a process with activities to be performed before and after the publication of our paper (a kind of workout), in order to get better citation ratings, with a focus on results.

On the other hand, I’ve found that seeking citations gains greater acceptance when presented as an additional writing task for your manuscript, one that improves its future search engine rankings (academic SEO); but that seems a bit limited and unattractive as a new conceptual model. So I came up with the idea of looking for a sport that has some parallel with obtaining citations, and I think I’ve found it: citation fishing.

  • Fishing is an activity that is enjoyed, and from which benefits are obtained, simply by doing it; it’s rewarding in itself.
  • So there is, in theory, no pressure for results, since success depends on many external factors: in fishing, the cold waters (field of knowledge), their turbulence (research topic), the time of day (number of authors), the area of the river or the sea (affiliation, experience or academic relevance); all of which must be known and managed anyway.
  • When fishing, we wait patiently for the fish to bite, with all the means and planning we have prepared for them to do so: knowing the prey (the scientists in my field of knowledge), then patiently letting other scientists come, find our work, and finally bite the hook.

There are many types of fishing: trawling, angling, using nets, from a boat, from the shore or standing in the river itself; but I think trolling from a boat (Web 3.0) is the best suited to our academic type of fish: citations.

  • Trolling is a method of fishing where one or more fishing lines (discussions or posts), baited with lures (our papers and background), are drawn through the academic waters (social networks).
  • Trolling is used for both recreational and commercial fishing; it depends on your dedication. Multiple lines are often used (academic websites), and outriggers (the tools: journal platforms, academic tools, social networks, etc.) can spread the lines more widely and reduce their chances of tangling. Downriggers (what you do to reach the scientific community: networking, discussions or communications) can also keep the lures or baits trailing at the desired depth.

Would you like to know all the secrets of fishing citations for your papers? I’ll develop this type of citation fishing further in the future; I hope you’ll enjoy it as much as I do.

The future of scientific research dissemination: Liberalism back again

Last week I presented my book ‘Publish in Journals 3.0’, and the speakers included one of the foremost authorities in Spain on accreditation, the President of ACAP; the Director of the Corporate Finance Department at Universidad Complutense de Madrid; and the Library Director of the Faculty of Business and Economics, who brought their different views on the future of scientific research dissemination.

In the discussion that followed, two particularly relevant issues were addressed, which I thought interesting to comment on here for reflection.

1. We wondered whether it makes sense for a centralized agency to evaluate professors and, in a way, tell universities which of them they may recruit.

  • It would be something similar to university admissions: there is currently a centralized evaluation, to be replaced in the near future by one specific to each college, American style.
  • Accreditation agencies would then focus on certifying degree programs rather than professors, whose evaluation would be seen as a private matter of each university’s quality and vision of teaching.
  • Many professors present at the event, myself included, were somewhat perplexed, since we are working very hard on our accreditations, and this new scenario would make things much harder given its foreseeable lack of transparency and uniformity of criteria.
  • But this change in evaluations doesn’t mean we shouldn’t take care of our academic curriculum; on the contrary, the requirements will not be lower.

2. The other interesting point is that the future of assessing the quality of research dissemination probably lies not in the platform (journals, repositories, or even peer-reviewed books and conferences), but in the number of citations.

  • Although the results of the last poll I conducted on this topic reflected the opposite (researchers’ opinion of their current situation), in the future more emphasis will be placed on the citations obtained than on the relevance of the journals in which research is published, closely interrelated though they are.
  • Moreover, publishing papers in journals is not the only measure of the impact or quality of research; there are other important activities, such as patents and the transfer of knowledge to society through the creation of start-ups.

That is, the conclusion I draw is that the important thing is to do research; publishing is its result, not the goal, or else the system becomes perverted. Either way, we professors can expect troubled times (you know: life is change, change is life), but not necessarily better ones, academically or for the future of society. It is the vision of radical liberalism that now prevails, I guess.

Poll results on quality of research: Journals 3-2 H-Index

What a surprise! There were fewer responses than in other polls; I thought there was more interest in this topic, but the results are clear on quality of research: Journals 3-2 H-Index.

Is Google Scholar a good indicator of the quality and influence of your research activity?


* The poll was posted in October 2013 in many academic discussion groups. Around 470 answers were collected.

Although Google Scholar is open and reliable because it treats scholars equally, it’s not considered a good indicator of the quality and influence of research activity. That’s hard to digest, because I had hopes for it. So you can guess what people must think of Altmetric, which is based on a broader idea of impact, not just on academic production…

The reasons for these results are implicit in the survey: while these metrics provide useful public information, I understand that, in general, h-index metrics face the following barriers:

  1. They add pressure on researchers.
  2. No organization seems to be looking seriously at them.
  3. Not many people use them, because their citations look poor compared with those of some champions in the sciences, and because they are yet another annoying tool to take care of.
  4. And mainly because journals are still considered a better indicator of research quality.

Well okay, I get it: we’ll look at individual metrics, but focus on publishing in indexed journals. I don’t think this will change much in the medium term; peer review will remain the king of research quality assessment, and that makes perfect sense.

For journals, on the other hand, these results are a triumph and a shot of adrenaline. Their road is to be indexed in well-known databases and to embrace open access, allowing authors to be cited and have an impact, isn’t it?

Towards a Corporate Governance system for journals

In the previous post, I suggested applying the corporate governance model to academic journals and research: a kind of Journal Governance system, aligning journal practices with each other and with the scientific environment in which they operate, which would lead the academic publishing industry towards a corporate governance system for journals.

In corporate governance there are two leading models: the Shareholder model (in our case, the journal would seek wealth maximization), monitored by the market, that is, its readers, paper rejection ratios, subscriptions, indexation in highly ranked indexes, publication prestige, etc.; and the Stakeholder model, which takes into account a dense network of journal collaborations. But the trend is towards a mixed model, in which the publishing world could have the following key Journal Governance variables.

The internal forces, those directly responsible for determining both the strategic direction and the execution of the journal’s future:

  1. The journal owner (publishing company, faculty/university, scientists): Maximization of the journal value.
  2. The editorial board: Transparency and international approach.
  3. The editors (Editor-in-Chief, Managing Editor):  Independence and loyalty.
  4. The peer-reviewers: Knowledge and ethics.

The external forces, those interested in the journal’s behavior and success:

  1. Readers, looking for quality, innovation and rigor of published research.
  2. Authors, seeking the prestige of the journal.
  3. Funding institutions, in need of project validation.
  4. Universities and faculties.
  5. Databases and indexes.
  6. Professor accreditation agencies.
  7. Each country’s regulation of education and teaching.

Many of these forces already exist, but in a weak form, not incorporated into or regulated by a comprehensive model; one that would, for example, require journals to publish a sort of Journal Governance Annual Report, among other practices, which would be as compelling as other quality practices, such as peer review or the independence of the editorial board.

Anyway, I’m not naive; I know this hypothetical Journal Governance system wouldn’t be infallible either, but it would be the best we could hope to have in the medium term, don’t you think?

POLL: Do you bet your future as a researcher on Google Scholar metrics?

The journals’ world is boiling: the Internet and Open Access are challenging the indexed-journal model, from anonymity in peer review to the selection criteria of directories and databases, and the calculation of journal impact indicators.

An alternative model for measuring the quality of research is the personal citation index (h-index), enabled by open, accessible and public tools such as Google Scholar; although there are also private, or paid, metrics such as Altmetric.

  • The change is driven by the fact that the indexed-journal model doesn’t quite convince scientists, because of the power some journals and private directories hold. But it is producing an undesirable effect: both models are now used to evaluate the quality of research, with the danger of drowning professors and scientists even more.
  • And, paradoxically, journals are also under pressure in a way, because now they implicitly have to ‘promise’ that articles published with them will be cited more.

Google Scholar metrics are here to stay. What do you think? Do you bet on your citation appeal?

* You can choose 1 or 2 answers.
**Comments are highly encouraged.

Open Access Journals: The model that would be king. Poll results

The topic of Open Access (OA) has already been widely discussed in academia, and it is now a common reality in the publishing world, but there are still some doubts and suspicions on the part of scientists, as we shall see.

“Are you submitting your articles to open access journals?” was the poll question, and at first glance the results are clearly optimistic: we love the OA model.

  1. 62% of respondents would submit their articles to OA journals.
  2. 35% would submit, but only after a good analysis of the OA journal: indexation, impact factor and author fees; which makes sense anyway.
  3. 23% of professors wouldn’t, which is a pretty high percentage.
  4. 15% don’t care about OA, only about journal indexation, so I suppose they care little about their citations.
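Because each respondent could pick one or two answers, the four shares above add up to more than 100%; a quick sanity check on the published percentages (the category labels are my own shorthand, not the poll’s exact wording):

```python
# Published shares from the OA poll; respondents could choose up to two answers,
# so the percentages are not mutually exclusive and need not sum to 100.
shares = {
    "would submit to OA journals": 62,
    "would submit after analysis": 35,
    "would not submit": 23,
    "only care about indexation": 15,
}
total = sum(shares.values())
print(total)  # 135: over 100 precisely because up to two answers were allowed
```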

* The poll was posted in August 2013 in many academic discussion groups. Around 700 answers were collected.

But we have to keep in mind the bias of the sample: it corresponds to professors and researchers who routinely use technology and the internet. That is, even within this cream of the crop, 38% (the 23% + 15% above) remain somewhat skeptical about sending their papers to OA journals.

On the other hand, another reading of the results is that OA journals are acceptable to scientists, but only if they meet certain minimum traditional academic etiquette (and common sense): quality, indexation, peer review and a reputable board.

My impression is that although OA has been with us for decades, publishers are making good use of it, and authors need it because it represents a clear advantage, it has yet to fully establish itself as a model. But who wants to miss the OA journal train?

POLL: The current use of open access journals

Open access (OA) journals have been one of the main drivers of change in the academic publishing world in recent decades, and OA will continue to shape the future of research quality assessment and scientific dissemination thanks to the Internet and Web 3.0 technology.

Behind this situation is the urgent demand of professors and researchers who need to publish in indexed journals (quality of research) but also expect their work to be accessible to a wider audience (citations), pressed by faculties and promotion.

And what are you doing with your papers? Are you using OA journals? What about fake or predatory journals/businesses? There are still some questions to be clarified. Please share your experience as an author of OA journals by participating in the survey.

* OA: Open access

** You can choose 1 or 2 answers.

***Comments are highly encouraged.

Traditional vs Alternative means of dissemination in academia. Poll results

There are many pressures for change in the dissemination of research, such as the current Web 3.0 technology environment in education, open access journals/repositories and the consolidation of citation metrics tools.

Professors and researchers shared with us their vision about the future of publishing, voting in the poll.


Indexed journals have been adding high value for all academic stakeholders, and they will continue to do so.


In general, the results show a balance between traditional (48%) and alternative (52%) means of dissemination in academia, but there are other quite interesting conclusions:

  1. “Open access journals/directories with peer review” was the preferred means of dissemination, with 29% of the votes; it makes sense, given the expectations that citation rankings are creating.
  2. Added together, “indexed journals” + “impact factor” would be the most voted option (39%); the current journal system still prevails.
  3. “Repositories with peer review” + “number of downloads” were chosen by 21% of respondents, opening an interesting path to new alternatives for disseminating scientific knowledge in academia.


Professors are rational people with common sense: we understand that the system needs change, but little by little, as it is working reasonably well. It’s as if we were waiting to see how those changes develop and how journals and publishing houses respond to them. Surely they will do well.

* The poll was posted in June 2013 in many academic discussion groups. Around 900 answers were collected. 
