PubMed Database Fetching
Hi there,
I asked something about PubMed / Zotero in this forum a while ago, but I still have some problems related to it, so I will ask here again since this forum is at least somewhat related.
I am currently trying to download huge batches from PubMed into my local database. That's all I want right now. My current method works, but it is far too slow: I am fetching each article individually via its ID:
$XML = simplexml_load_file("http://eutils.ncbi.nlm.nih.gov/entrez/eutils/efetch.fcgi?db=pubmed&id=$ID&retmode=xml");
This works, but it is far too slow, since for 500 entries I have to make 500 HTTP calls. Luckily there is a way to retrieve large numbers of entries at once: http://www.ncbi.nlm.nih.gov/books/NBK25498/#chapter3.Application_3_Retrieving_large
So I wrote something like this in PHP:
private function Search() {
    // ESearch stores the matching PMIDs on the NCBI history server (usehistory=y)
    $this->SParameters['term'] = $this->Query;
    $Url = $this->SUrl . http_build_query($this->SParameters);
    $this->SResults = simplexml_load_file($Url);
}

private function Fetch() {
    // EFetch pulls the stored result set back via query_key and WebEnv
    $this->FParameters['query_key'] = (string) $this->SResults->QueryKey;
    $this->FParameters['WebEnv'] = (string) $this->SResults->WebEnv;
    $Url = $this->FUrl . http_build_query($this->FParameters);
    $this->FResults = file_get_contents($Url);
}
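The URLs and default parameters used above are not shown in the snippet; roughly, they are set up like this (SUrl, FUrl, SParameters and FParameters are just my own field names, and the values follow what the E-utilities documentation suggests):

// Rough sketch of the fields used by Search() and Fetch() above
private $SUrl = 'http://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?';
private $FUrl = 'http://eutils.ncbi.nlm.nih.gov/entrez/eutils/efetch.fcgi?';

// ESearch defaults: keep the result set on the NCBI history server
private $SParameters = array(
    'db'         => 'pubmed',
    'usehistory' => 'y',
);

// EFetch defaults: pull the stored records back 500 at a time
private $FParameters = array(
    'db'     => 'pubmed',
    'retmax' => 500,
);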
My problem now is that I want to fetch ALL articles, not just the specific ones matched by a search query. So I tried something like $Query = ''; or $Query = ' ';, but that gives me back:
Warning: file_get_contents(http://eutils.ncbi.nlm.nih.gov/entrez/eutils/efetch.fcgi?db=pubmed&retmax=500&query_key=&WebEnv=) [function.file-get-contents]: failed to open stream: HTTP request failed! HTTP/1.1 400 Bad Request
As the error shows, query_key and WebEnv come back empty, so with an empty term the ESearch step gives EFetch nothing to work with. So what should I do now? How can I download huge batches of PubMed articles without ten million HTTP calls? Is there any way to set $Query = ''; so that I can download the articles from ID 0 to ID 10,000, not just the ones matching $Query = 'test';, in one HTTP call?
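One idea I have not tested yet is to skip ESearch completely and hand a comma-separated block of PMIDs straight to EFetch, so each HTTP call covers a couple of hundred articles. The function below is only a rough sketch of that idea (FetchIdRange is a name I made up, and the chunk size of 200 is to keep the GET URL short; the NCBI docs say bigger ID lists should be sent via HTTP POST):

// Untested sketch: fetch PMIDs $From..$To in chunks of 200 per HTTP call
private function FetchIdRange($From, $To, $ChunkSize = 200) {
    for ($Start = $From; $Start <= $To; $Start += $ChunkSize) {
        $Ids = range($Start, min($Start + $ChunkSize - 1, $To));
        $Url = 'http://eutils.ncbi.nlm.nih.gov/entrez/eutils/efetch.fcgi?'
             . http_build_query(array(
                   'db'      => 'pubmed',
                   'id'      => implode(',', $Ids), // EFetch accepts a comma-separated UID list
                   'retmode' => 'xml',
               ));
        $Batch = file_get_contents($Url);
        // ... parse $Batch and store it in the local database here ...
    }
}

Not every PMID in a range actually exists, so a batch may come back smaller than requested, but at least this would avoid making one call per article.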
Thank you very much!
This sounds like a good question for Stack Overflow or a PHP support forum.