I use google_places() from the googleway package to get a dataframe of places from Google. I am searching for "Blutspende in Deutschland" ("blood donation in Germany"): https://www.google.de/maps/search/Blutspende+in+Deutschland/@51.5019637,6.4650438,12z

The vignette says that each API query returns 20 locations: https://cran.r-project.org/web/packages/googleway/vignettes/googleway-vignette.html

I assume there should be about 300 blood donation places in Germany, so I am trying to build a loop that collects all Google Places results for my search term into one dataframe. A similar post can be found here: next_page_token not working on second attempt (google_places function)
How can I build my loop such that it returns a dataframe of all the Google results?
library(googleway)

# initialize list to collect one dataframe per result page
datalist <- list()

# start first search
key <- "YOUR-KEY"
res <- google_places(search_string = "Blutspende in Deutschland",
                     key = key)

# store first 20 results
datalist[[1]] <- data.frame(Name = res$results$name,
                            Place = res$results$formatted_address)

# set next page token
token <- res$next_page_token

for (i in 1:10) {
  # sleep time
  Sys.sleep(2)
  # next search
  res_n <- google_places(search_string = "Blutspende in Deutschland",
                         page_token = token,
                         key = key)
  # store next results
  datalist[[i + 1]] <- data.frame(Name = res_n$results$name,
                                  Place = res_n$results$formatted_address)
  # set next token again
  token <- res_n$next_page_token
  # print status
  cat(i, res_n$status, '\n')
}

# combine all pages into one dataframe
big_data <- do.call(rbind, datalist)
There are a massive number of duplicates in this search:
library(tidyverse)
big_data %>% distinct() %>% nrow()
For me, only 54 of the 202 entries are distinct, and I don't know why.
The Google Maps Places API limits the response to 60 locations per query, paginated as up to 3 JSON pages of 20 places each (see the Places API docs). That also explains your duplicates: after the third page next_page_token is NULL, so the remaining iterations of your loop simply repeat the first request and collect the same places again.
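For reference, here is a minimal sketch of paging up to that cap. The 3-second sleep is my choice (Google needs a moment before a fresh next_page_token becomes valid), and the result structure is assumed to be the same as in your code:

library(googleway)

key <- "YOUR-KEY"
query <- "Blutspende in Deutschland"

pages <- list()
res <- google_places(search_string = query, key = key)
pages[[1]] <- data.frame(Name = res$results$name,
                         Place = res$results$formatted_address)
token <- res$next_page_token

# token is NULL after the last page, so this runs at most twice more
while (!is.null(token)) {
  Sys.sleep(3)  # give the token time to become valid
  res <- google_places(search_string = query,
                       page_token = token,
                       key = key)
  pages[[length(pages) + 1]] <- data.frame(Name = res$results$name,
                                           Place = res$results$formatted_address)
  token <- res$next_page_token
}

capped_data <- do.call(rbind, pages)  # at most ~60 rows per query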
To get more than ~60 observations, one easy trick with googleway is to query by regions/Länder, or even by municipalities. In the next example I loop through the 16 German Länder (federal states) to get 600+ results.
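A sketch of that loop follows; the query phrasing, the 3-second sleep, and the final deduplication step are my assumptions (a place near a state border can be returned for two Länder):

library(googleway)
library(dplyr)

key <- "YOUR-KEY"
laender <- c("Baden-Württemberg", "Bayern", "Berlin", "Brandenburg",
             "Bremen", "Hamburg", "Hessen", "Mecklenburg-Vorpommern",
             "Niedersachsen", "Nordrhein-Westfalen", "Rheinland-Pfalz",
             "Saarland", "Sachsen", "Sachsen-Anhalt",
             "Schleswig-Holstein", "Thüringen")

datalist <- list()
i <- 0
for (land in laender) {
  query <- paste("Blutspende in", land)
  res <- google_places(search_string = query, key = key)
  repeat {
    if (res$status != "OK") break  # e.g. ZERO_RESULTS: nothing to store
    i <- i + 1
    datalist[[i]] <- data.frame(Name = res$results$name,
                                Place = res$results$formatted_address,
                                Land = land)
    token <- res$next_page_token
    if (is.null(token)) break      # no more pages (max 3) for this Land
    Sys.sleep(3)                   # let the token become valid
    res <- google_places(search_string = query,
                         page_token = token,
                         key = key)
  }
}

big_data <- do.call(rbind, datalist)
# the same place can match two neighbouring Länder, so deduplicate
big_data <- distinct(big_data, Name, Place, .keep_all = TRUE)
nrow(big_data)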