Conducting Systematic Review - Issues with # of Items, Speed, Deletion
Hi all - I am part of a group completing a systematic literature review and storing items in a group Zotero library. We are running into severe slowdowns because of the number of items in the library, and some members are unable to use Zotero at all. I had been using it with long delays, but now it has effectively stopped working: the library eventually loads, but I can't do anything within it. At this point we'd like to delete all the duplicates to see if that fixes the problem (I suspect it will, because many items exist in 10-12 copies). We have tried the following approaches:
1) We tried merging via the duplicates feature, which worked for a while, albeit very slowly. It no longer works.
2) We tried deleting items from individual libraries, which seemed to work for a bit but has since stopped.
3) We tried turning off sync - nothing changed.
4) We tried deleting through the Zotero web library, but that only works some of the time and the deletions don't seem to register.
5) We purchased an unlimited storage plan for the group library - nothing changed.
6) I purchased an unlimited storage plan for my personal library - nothing changed.
In the long term, we have amended our search protocols to reduce the number of items imported into Zotero, so if we can just deal with what's there now, I think the problem will be fixed.
Thanks in advance for any suggestions!
1. Do the searches one at a time and merge duplicates after each.
2. Pre-merge duplicates, such as by extracting DOIs from the downloaded bibliography files, and then import only the thinned list.
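The second suggestion can be sketched in a few lines of Python. This is a minimal, hypothetical example, assuming the exported bibliography has already been parsed into dicts with a "DOI" key; the actual field name will depend on your export format (RIS, CSV, etc.).

```python
# Hypothetical sketch: thin an exported record list by DOI before importing
# into Zotero, so the duplicates never reach the group library.
def dedupe_by_doi(records):
    """Keep the first record for each DOI; pass through records without one."""
    seen = set()
    thinned = []
    for rec in records:
        doi = (rec.get("DOI") or "").strip().lower()  # normalize case/whitespace
        if not doi:
            thinned.append(rec)  # no DOI: keep it, resolve manually later
        elif doi not in seen:
            seen.add(doi)
            thinned.append(rec)
    return thinned

records = [
    {"title": "Article A", "DOI": "10.1000/xyz123"},
    {"title": "Article A (duplicate)", "DOI": "10.1000/XYZ123"},
    {"title": "Article B, no DOI yet", "DOI": ""},
]
print(len(dedupe_by_doi(records)))  # 2
```

Records lacking a DOI are kept rather than dropped, since conference papers and older articles often have none and still need manual screening.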
@adamsmith - yes, 100%. I tried creating a second profile and loading ONLY that group library (because I have several), then deleting items, but now I'm having issues syncing things back up (not a huge surprise).
Clearly, several things have contributed to the large number of items you need to evaluate to arrive at a manageable number of references. I suggest a different approach that involves a less-sensitive search query _and_ a restricted number of years searched. This could lead to an article about your search process.
Consider restricting your search to a single year and, within it, testing your search terms to learn which ones are responsible for the most false drops. Test whether you can eliminate those terms and still capture the relevant records. Similarly, ask yourself whether you are examining too many databases. For example, among the years covered by Scopus, all PubMed records are duplicated there. It is possible to search Scopus and exclude PubMed/Medline records by appending " AND NOT INDEX(medline) " to your query string (this specifies Medline but will also eliminate PubMed articles).
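To make the exclusion concrete, here is a small illustrative snippet that appends the clause to a query string before pasting it into Scopus advanced search. The base query here is an invented placeholder; only the " AND NOT INDEX(medline) " clause comes from the suggestion above.

```python
# Illustrative only: build a Scopus advanced-search string that excludes
# Medline-indexed (and thus PubMed-duplicated) records. The base query
# below is a made-up example; substitute your own search terms.
base_query = 'TITLE-ABS-KEY("example term" AND "another term")'
query = base_query + " AND NOT INDEX(medline)"
print(query)
```

Keeping the exclusion as a separate suffix makes it easy to run the query both with and without it, so you can see exactly how many records the exclusion removes for a test year.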
If you are willing to share the general topic of your systematic review, I and others here can likely help you with our sad-lessons-learned when we ourselves searched and found an overwhelming number of results.
On what operating system have you tried this? Zotero has access to more RAM on Mac and Linux, and that's likely the constraining factor here. So if you have access to a powerful Mac with 32+ GB of RAM, that's at least worth a try.
I think the approach you tried of loading only the group library is also promising -- what happens with syncing when you try that?