Avoid the Free Trial Trap: How to Choose Systematic Review Semi-Automation Tools for Yourself, Your Institute, or Your Library?

Farhad Shokraneh
7 min read · Oct 19, 2023


Systematic Review Automation Programmes

The marketing departments of automation programme companies are constantly contacting individuals, research-related libraries, institutes and universities, offering them free trial periods. While a free trial could be used to your advantage, it could become a trap if not used well.

This approach has long been used with databases and other 'information resources'; however, passive decision-makers are in danger of falling into the trap. Passive decision-makers wait for marketing departments to contact them and offer a product instead of actively hunting for the best programmes and the comparative studies about them.

What can go wrong with free trials?

A. Free trial of a single product compared to nothing!

When you get a free trial for a single product, users have no reference or comparison point, so they cannot compare products. The result of such a trial is, in most cases, obvious: the product is good, and certainly better than nothing. So the feedback will tell you to get a subscription.

B. The fate of files and projects post-free-trial period

What happens to the files and projects when the trial period ends?

Let's say a hundred users create projects, start using the programme, and spend hours, days, or months on them. Suddenly, you inform them that the free trial has ended. Panic among users, and pressure on the subscriptions department to secure permanent access, is a predictable outcome; otherwise, their efforts will go to waste.

C. Platform-dependent reviews and exclusivity

In a competitive market, there is always a possibility that a product improves to become better than the others and, in a few years, moves from rank 10 to rank 1, or vice versa. If you are heavily dependent on a single product and there is no way to transfer your reviews to another product, you will be in a self-built prison with no choice but to keep paying the prison guards. If needed, could you give up Microsoft Windows and switch to Linux, or give up EndNote and move to Zotero? Think about automation programmes the same way; be futuristic in your choices.

What to do?

How do you get the best out of free trial periods?

  1. Have a selection committee to set a budget and create short-term, mid-term, and long-term plans.
  2. Short-list the products, narrowing the many candidates down to a few.
  3. Read independent reviews written by librarians and information professionals with clear declarations of conflicts of interest.
  4. Get a free trial agreement with multiple products simultaneously alongside free tools so you and the user community can compare them.
  5. Don't rely only on your users; create a real-world project and get your hands dirty with live experience.
  6. Advertise the products to your user community with a stern warning that they may lose access to their files and reviews when the trial period ends, so they should make sure they have saved records of their work in at least two reliable storage locations (preferably at least one with a major cloud storage provider).
  7. Develop written and video guides and training for using and citing the tools, and set up a temporary in-house support role for your users so you can collect FAQs; otherwise, this information will go directly to the company's support team without you ever learning about the questions and concerns. If you cannot provide support yourself, ask the company to share a copy of user queries with you.
  8. Share your knowledge with others so they can learn and make the best choice.
  9. Ask the automation companies to make changes to their programmes to improve the user experience (including user interface and user journey).
  10. Use triangulation to collect information, and use evidence-based decision-making to make the final choice: ask librarians in other institutes about their experiences, collect user feedback, form an in-institute expert team to weigh that feedback, and gather the published evidence comparing the tools; there are many such papers. A minimal scoring-matrix sketch follows below.
Evidence-Based Decision-Making for Choosing Systematic Review Automation Programmes
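To make step 10 concrete, here is a minimal, hypothetical scoring-matrix sketch in Python. The tool names, criteria, weights, and scores are placeholders rather than recommendations; a selection committee would fill them in from user feedback, published comparisons, and other librarians' experiences, then compare the weighted totals.

```python
# Hypothetical weighted decision matrix for comparing trialled tools.
# Tool names, criteria, weights, and scores are illustrative placeholders.

criteria_weights = {
    "screening_features": 0.30,
    "data_portability":   0.25,  # can projects be exported without loss?
    "ease_of_training":   0.20,
    "cost":               0.15,
    "support_quality":    0.10,
}

# Scores (0-10) gathered during the trial from user feedback, published
# comparisons, and other librarians' experiences.
tool_scores = {
    "Tool A": {"screening_features": 8, "data_portability": 4,
               "ease_of_training": 7, "cost": 5, "support_quality": 6},
    "Tool B": {"screening_features": 6, "data_portability": 9,
               "ease_of_training": 8, "cost": 7, "support_quality": 7},
}

def weighted_total(scores: dict) -> float:
    """Combine the criterion scores for one tool into a single weighted total."""
    return sum(criteria_weights[c] * scores[c] for c in criteria_weights)

# Rank the tools from highest to lowest weighted score.
for tool, scores in sorted(tool_scores.items(),
                           key=lambda kv: weighted_total(kv[1]),
                           reverse=True):
    print(f"{tool}: {weighted_total(scores):.2f}")
```

The weights are where the committee's priorities show up; the numbers matter less than agreeing, in advance, on which criteria count and by how much.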

What can automation programme companies do to help?

A. Don't live in your baby's bubble!

Of course, your baby is the loveliest and the winner of an Oscar, a Nobel, and a few Grammys; however, have you seen other people's babies? If you are a good programme developer, you will have at least one ongoing review in Rayyan, Covidence, EPPI-Reviewer, PICO Portal, DistillerSR, and so on. This way, you can see what other programmes are doing right and learn from them, making your programme better and more competitive.

Burst that bubble to boost your baby. Your tool will not get better if you already believe it is the best. Be realistic about the positives and negatives of the tools you have developed so you can improve them. Learn from other tool developers and share your experience with them to create a positive development environment that benefits the users.

B. Engage with the user community!

Don't develop in a cave; form a competent user testing group from actual daily users and constantly ask for their feedback on any new or possible future features.

C. Hear the futurists even if you ignore them

It tortures me when I read a book written, or watch a film made, 50 or 60 years ago that predicted some present-day innovation so vividly that it makes me believe time travel exists. When I talk to some of the scientists in the fields of evidence synthesis, automation, and open science, my mind is so pleased that no drug could imitate such pleasure. Some nights I wake up and cannot sleep, thinking about those conversations and telling myself that such intelligence can only come from the heavens. So it surprises me how short-sighted some of these companies and programme developers are. Isn't it obvious who leads the waves of automation in the field of evidence synthesis? Shouldn't we be more humble and start learning from each other?

D. Provide security and reliability, so I trust you.

Wouldn't it be great if your programme provided project import and export? If your programme could transfer projects to and from other programmes? If my project could live forever, in my cloud or yours? It would, and I would trust that I am the project's owner, even if I go broke. It would mean you are so assured of your programme's greatness that you are not worried about people taking their projects to another programme, or bringing their reviews from another programme into yours. People migrate, you know. It shows confidence in your programme and respect for the users' freedom. We need more guides like "How to move your project from programme X to programme Y with no data loss". Programmes should also work with each other to create and follow shared standards.
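As an illustration of the kind of portability such a guide might describe, here is a minimal, hypothetical sketch in Python. It assumes a tool exports screening decisions as a CSV file with title, abstract, and decision columns (an assumption, not any real tool's export schema) and rewrites them as a RIS file that another programme could import, keeping the screening decision in a notes field so it is not lost in the move.

```python
# Minimal sketch of a "no data loss" migration: read a hypothetical CSV export
# of screening decisions and write a RIS file a second programme could import.
# The column names (title, abstract, decision) are assumptions for illustration.
import csv

def csv_export_to_ris(csv_path: str, ris_path: str) -> None:
    with open(csv_path, newline="", encoding="utf-8") as src, \
         open(ris_path, "w", encoding="utf-8") as dst:
        for row in csv.DictReader(src):
            dst.write("TY  - JOUR\n")                        # record type
            dst.write(f"TI  - {row['title']}\n")             # title
            dst.write(f"AB  - {row.get('abstract', '')}\n")  # abstract
            # Keep the screening decision in a notes field so it survives the move.
            dst.write(f"N1  - screening_decision={row.get('decision', '')}\n")
            dst.write("ER  - \n")                            # end of record

# Usage (hypothetical file names):
# csv_export_to_ris("tool_x_export.csv", "tool_y_import.ris")
```

Shared, documented export formats like this are what make the "self-built prison" avoidable; the exact fields matter less than the guarantee that everything a team has decided can leave the platform intact.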

E. Be ethical

Not every student can pay.

Not all researchers of low- and middle-income countries can pay.

Many researchers are affected by unfair economic sanctions against their countries, meaning they cannot pay for subscriptions at all, even if they want to.

So, as an ethical developer, what's your solution? Do you have a free version of the tool or a free subscription with fewer features to offer them?

Does the company support charities and good causes?

Can people access their projects after their subscription ends? If not, what happens to the project?

F. Free access for comparative researchers

Provide free access for those who’d like to compare the tools and publish the results, and encourage such research to improve your products.

What can librarians do?

"Get courses on MBA, User Experience, Coding, Data Analysis, and Artificial Intelligence, for God's sake." This was advice from one of my librarian friends who retired at the age of 36. Don't ask why and how; it depresses me as someone who will not have retirement as a choice, probably ever. I hardly find time to do anything, but I do advise my fellow LIS profs to get into such stuff.

A conclusion, why not

Respect the programme developers for making life easy for us; however, be a savvy shopper.

A. Choose the best for your users with a futuristic view on the only constant: change. Change in budget, programme quality (service, support, features), processes, methods, the market, other tools, migration from one programme to another, and so on.

B. Sometimes the best programme is no programme, and sometimes the best solution is two programmes.

Use many sources of information: data from companies, published comparative evidence, independent reviews, user feedback, other librarians' feedback, and your committee's own experience.

Consider the types of reviews and evidence syntheses your user community conducts, as well as their context and complexity; some tools may work better than others depending on the use case.

Consider the complexity of the tools, particularly the machine learning features. Can most of your users use them with little training?

Should your research library start supporting and using a free or paid automation tool? Yes, it should. You have no idea how much time, and how many lives, such tools save. If time allows, I will write an evidence-based post about it.

Librarians are the main pillars of the implementation of systematic review automation. Such importance usually comes in a package with a lot of responsibilities and no pay raise. Enjoy!

If you liked this blog post, please support me by pressing the green Follow button and signing up so I can write more. Thank you :D

Written by Farhad Shokraneh

Evidence Synthesis Manager, Oxford Uni · Post-Doc Research Associate, Cambridge Uni · Senior Research Associate, Bristol Uni · Director, Systematic Review Consultants
