Conducting Systematic Review - Issues with # of Items, Speed, Deletion

Hi All - I am part of a group that is completing a systematic literature review and storing items in a group Zotero library. We are running into severe slowdowns because of the number of items in our library. Some folks are unable to use Zotero at all. I had been using it with severe time delays, but now it seems to have stopped working - I can load the library after some time but can't do anything within it. At this point, we'd like to just delete all the duplicates to see if that fixes the problem (I suspect it will, because many items exist in 10-12 duplicate copies). We have tried the following approaches:

1) We tried merging through the duplicates feature, which worked for a while, albeit very slowly. This no longer works.
2) We tried deleting items from individual libraries, which seemed to work for a bit but has since stopped working.
3) We tried turning off sync - nothing changed.
4) We tried deleting in the Zotero web library, but that only works some of the time and the deletions don't seem to register.
5) We purchased an unlimited storage plan for the group library - nothing changed.
6) I purchased an unlimited storage plan for my personal library - nothing changed.

In the long term, we have amended our search protocols to reduce the number of items imported into Zotero, so I think if we can just deal with what's there now, the problem will be fixed.

Thanks in advance for any suggestions!
  • How many items are you talking about? For comparison, I’ve done several systematic reviews in Zotero with 10-20k items without issues.
  • And just to be clear—syncing, storage space, etc. wouldn’t have any impact on Zotero speed.
  • Hi, thanks for your feedback - it's about 500,000 references... With the revised searches we have many fewer, but I need to get the current ones out of there to free things up; the sheer number of items seems to be what's bogging it down. I've checked memory and whatnot on my own machine as well.
  • Yeah, that’s really big and is going to slow Zotero down noticeably. A few things could speed it up somewhat:

    1. Do the searches one at a time and merge duplicates after each.
    2. Pre-merge duplicates, e.g., by extracting DOIs from the downloaded bibliography files and importing only the thinned list (see the sketch below).
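
    A minimal sketch of the pre-merge idea, assuming RIS exports (the "DO  - " DOI tag is standard RIS, but the filenames and the thinned.ris output name are just placeholders):

    ```python
    # Rough sketch: drop duplicate RIS records by DOI before importing
    # into Zotero. Records without a DOI are kept for manual screening.
    import re
    import sys

    seen = set()
    kept = []

    for path in sys.argv[1:]:
        with open(path, encoding="utf-8") as f:
            text = f.read()
        # RIS records end with an "ER  -" line.
        for rec in re.split(r"^ER  -.*$", text, flags=re.MULTILINE):
            rec = rec.strip()
            if not rec:
                continue
            m = re.search(r"^DO  -\s*(\S+)", rec, flags=re.MULTILINE)
            doi = m.group(1).lower() if m else None
            if doi and doi in seen:
                continue  # already seen this DOI: drop the duplicate
            if doi:
                seen.add(doi)
            kept.append(rec + "\nER  - ")

    with open("thinned.ris", "w", encoding="utf-8") as out:
        out.write("\n\n".join(kept) + "\n")
    ```

    Run it as python dedupe_ris.py search1.ris search2.ris and import the resulting thinned.ris. Matching on normalized titles as a fallback would catch more duplicates, since not every record carries a DOI.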
  • (I think the immediate Q is how they can get their Zotero back to usable...)
  • @bwiernik - thanks for the comment and yes, we will definitely be taking steps to fix the issue as we move forward.

    @adamsmith - yes, 100%. I tried creating a second profile and loading ONLY that group library (because I have several), and then deleting items, but now I'm having issues syncing things up (not a huge surprise).
  • For the past two days I have been worrying about a series of searches for a systematic review that resulted in 500,000 items. My curiosity is literally keeping me awake.

    Clearly, several things have contributed to the large number of items you need to evaluate to arrive at a manageable number of references. I suggest a different approach that involves a less-sensitive search query _and_ a restricted number of years searched. This could lead to an article about your search process.

    Consider restricting your search to a single year and testing your search terms within it to learn which terms are responsible for the most false drops. Test whether you can eliminate those search terms yet still capture the relevant records. Similarly, ask yourself if you are searching too many databases. For example, among the years covered by Scopus, all PubMed records are duplicated there. It is possible to search Scopus and exclude PubMed/MEDLINE records by appending " AND NOT INDEX(medline) " to your query string (this specifies MEDLINE but will also eliminate PubMed articles).
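
    For illustration only (the topic terms here are hypothetical), a single-year test query excluding MEDLINE-indexed records might look like:

    ```
    TITLE-ABS-KEY ( "telehealth" OR "telemonitoring" ) AND PUBYEAR = 2020 AND NOT INDEX ( medline )
    ```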

    If you are willing to share the general topic of your systematic review, I and others here can likely help you with our sad-lessons-learned when we ourselves searched and found an overwhelming number of results.
  • Going back to the topic of fixing the library --
    on what operating system have you tried this? Zotero has access to more RAM on Mac and Linux, and memory is likely the constraining factor here. So if you have access to a powerful Mac with 32+ GB of RAM, that's at least worth a try.

    I think the only-group-library approach you tried (the separate profile) is also promising -- what happens in terms of syncing when you try that?
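
    If the desktop app stays unusable, bulk-deleting the duplicates through the Zotero web API is another avenue. Here's a rough sketch using the third-party pyzotero library, assuming the duplicates share a DOI; LIBRARY_ID and API_KEY are placeholders, and the API accepts at most 50 deletions per request:

    ```python
    # Rough sketch: delete DOI-duplicates from a group library via the
    # Zotero web API with pyzotero (pip install pyzotero).
    from pyzotero import zotero

    LIBRARY_ID = "1234567"    # placeholder group library ID
    API_KEY = "your-api-key"  # placeholder key with write access

    zot = zotero.Zotero(LIBRARY_ID, "group", API_KEY)

    # Fetch every top-level item; with ~500k items this means thousands
    # of requests, so expect it to run for a long while.
    items = zot.everything(zot.top())

    seen = set()
    extras = []
    for item in items:
        doi = item["data"].get("DOI", "").strip().lower()
        if not doi:
            continue  # keep DOI-less items for manual screening
        if doi in seen:
            extras.append(item)
        else:
            seen.add(doi)

    # delete_item accepts a list of up to 50 items per call.
    for i in range(0, len(extras), 50):
        zot.delete_item(extras[i:i + 50])
    ```

    Note that unlike Zotero's duplicate merge, this just deletes the extra copies, so collection assignments and notes on those copies are lost.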
  • You wouldn't even need a powerful Mac — on Windows, Zotero is limited to using 3 GB of RAM, so even an 8 GB Mac could handle a much larger library.