I have a database table with columns storing latitude and longitude coordinates (both use the decimal data type). The table has about 500k rows through which the searches are made.
The problem is that a search for rows within a radius of 100-200 miles takes about 140 seconds, which is not acceptable for production.
The stack:
- Rails 4
- MySQL 5.5
- Geokit-rails gem
I'd like to ask for help on how to speed up searches through geographical data. Should I implement a search similar to kayak.com, where some results are displayed to users while the search continues running in the background?
Thank you in advance.
EDIT: 'stores' schema
create_table "stores", force: true do |t|
  t.string  "store_name"
  t.string  "address"
  t.string  "city"
  t.string  "state"
  t.string  "county"
  t.string  "zip_code"
  t.decimal "latitude",  precision: 10, scale: 6
  t.decimal "longitude", precision: 10, scale: 6
end
There are no indexes on the table. I have a similar table in the database that also has latitude and longitude columns; I tried adding indexes on them, but the search times stayed the same.
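One common way to make an index useful here is a two-step filter: first derive a latitude/longitude bounding box from the search radius, so MySQL can prune rows using a composite index on (latitude, longitude) with simple range predicates, then compute the exact great-circle distance only for the surviving candidates. Below is a minimal Ruby sketch of that idea; the helper names and constants are illustrative, not part of geokit-rails.

```ruby
# Mean Earth radius in miles; one degree of latitude spans roughly 69 miles.
EARTH_RADIUS_MI = 3958.8
MILES_PER_DEG_LAT = 69.0

# Hypothetical helper: a lat/lng box that fully contains the search circle.
# An index on (latitude, longitude) can serve the resulting range query.
def bounding_box(lat, lng, radius_miles)
  delta_lat = radius_miles / MILES_PER_DEG_LAT
  # Degrees of longitude shrink with latitude, hence the cosine correction.
  delta_lng = radius_miles / (MILES_PER_DEG_LAT * Math.cos(lat * Math::PI / 180))
  { min_lat: lat - delta_lat, max_lat: lat + delta_lat,
    min_lng: lng - delta_lng, max_lng: lng + delta_lng }
end

# Exact great-circle (Haversine) distance, used to refine the candidates
# the bounding box lets through.
def haversine_miles(lat1, lng1, lat2, lng2)
  rad  = Math::PI / 180
  dlat = (lat2 - lat1) * rad
  dlng = (lng2 - lng1) * rad
  a = Math.sin(dlat / 2)**2 +
      Math.cos(lat1 * rad) * Math.cos(lat2 * rad) * Math.sin(dlng / 2)**2
  2 * EARTH_RADIUS_MI * Math.asin(Math.sqrt(a))
end
```

In ActiveRecord the prefilter could then look roughly like `Store.where(latitude: box[:min_lat]..box[:max_lat], longitude: box[:min_lng]..box[:max_lng])`, with `haversine_miles` applied to the result. Note that adding an index only helps if the generated SQL actually uses range predicates on the indexed columns; a query that wraps the columns in trigonometric functions cannot use the index, which may be why the earlier indexing attempt changed nothing.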
The best way to solve this is to use a geospatial index. However, neither geokit nor geocoder supports them for relational databases. The rgeo gem does, but it is not well documented. If you are doing serious search work, it probably makes sense to delegate this problem to Elasticsearch.
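If you do hand this off to Elasticsearch, radius search becomes a `geo_distance` filter against a `geo_point` field. As a sketch, here is a Ruby method that builds such a query body; the `location` field name is an assumption about how the index would be mapped, and the exact query DSL shape varies between Elasticsearch versions.

```ruby
# Builds an Elasticsearch query hash for "documents within N miles of a
# point". Assumes the index maps a geo_point field named `location`
# (illustrative; adjust to your mapping and ES version).
def geo_distance_query(lat, lng, miles)
  {
    query: {
      bool: {
        filter: {
          geo_distance: {
            distance: "#{miles}mi",
            location: { lat: lat, lon: lng }
          }
        }
      }
    }
  }
end
```

The body would typically be sent via the elasticsearch-ruby client or an integration gem such as searchkick; Elasticsearch answers this kind of filter from its spatial index in milliseconds even at 500k documents.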