How to Implement Elasticsearch When Developing a Rails Web App
The article was originally published on Codica Blog.
When developing an app, you should take into account two important aspects: how information is analyzed and how it is searched. They largely define the success of your product. The same is true when you aim to build a solid Rails web app.
At Codica, we believe that when it comes to powerful and convenient search algorithms, Elasticsearch saves the day. In this blog post, we want to walk you through building a test Rails application that implements this technology.
Step #1: Setting up the tools
Before jumping into the actual code writing, let’s install the services and tools we are going to apply.
Install Ruby 2.6.1
With RVM, you can manage multiple Ruby versions installed on your system. To check the installed versions, run rvm list, and install the required one with rvm install 2.6.1.
Install Rails 5.2.3
The next step is to set up Rails gem:
gem install rails -v 5.2.3
To make sure you have installed the needed Rails gem version, run rails -v.
Install Elasticsearch 6.4.0
We have saved Elasticsearch to the Downloads folder. Start the service by typing in the request:
~/Downloads/elasticsearch-6.4.0/bin/elasticsearch
Open http://localhost:9200/ to make sure the service is running.
Here, we get the following code snippet:
{
  "name": "v5W3xjV",
  "cluster_name": "elasticsearch",
  "cluster_uuid": "aNCXXbKyTkSIAlNNZTDc3A",
  "version": {
    "number": "6.4.0",
    "build_flavor": "default",
    "build_type": "tar",
    "build_hash": "595516e",
    "build_date": "2018-08-17T23:18:47.308994Z",
    "build_snapshot": false,
    "lucene_version": "7.4.0",
    "minimum_wire_compatibility_version": "5.6.0",
    "minimum_index_compatibility_version": "5.0.0"
  },
  "tagline": "You Know, for Search"
}
Install Kibana 6.4.2
We have saved Kibana to the Downloads folder. To run the service, type in the following:
~/Downloads/kibana-6.4.2-linux-x86_64/bin/kibana
To be sure that Kibana is operating, go to http://localhost:5601/
.
You will see a window like this:
We have installed all the needed tools and services. Let’s proceed to the next step.
Step #2: Initiating a new Rails app
At this stage, we use the PostgreSQL database and Rails in API mode:
rvm use 2.6.1
rails new elasticsearch_rails --api -T -d postgresql
cd elasticsearch_rails
bundle install
First of all, configure config/database.yml to look similar to this:
default: &default
  adapter: postgresql
  encoding: unicode
  pool: <%= ENV.fetch("RAILS_MAX_THREADS") { 5 } %>
  username: postgres

development:
  <<: *default
  database: elasticsearch_rails_development

test:
  <<: *default
  database: elasticsearch_rails_test

production:
  <<: *default
  database: elasticsearch_rails_production
  username: elasticsearch_rails
  password: <%= ENV['DB_PASSWORD'] %>
Then create the database with rails db:create.
Now we are going to build an indexed and searchable model. Create a Location table with two fields: name and level:
rails generate model location name level
After the table is built, run the migration with the rails db:migrate command.
Now we need some test data. Copy the contents of the seed file into db/seeds.rb and run rails db:seed.
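The original seed file is not reproduced here; as an illustration, a hypothetical db/seeds.rb with data consistent with the examples used later in this article might look like this (the level values for san francisco and american samoa are assumptions):

```ruby
# db/seeds.rb -- hypothetical sample data; the levels shown for
# san francisco and american samoa are assumptions for illustration
LOCATIONS = [
  { name: 'san francisco', level: 'city' },
  { name: 'american samoa', level: 'territory' },
  { name: 'new york', level: 'state' },
  { name: 'new york', level: 'city' },
  { name: 'new jersey', level: 'state' },
  { name: 'new hampshire', level: 'state' }
].freeze

# the guard lets the data above be inspected outside a Rails app
LOCATIONS.each { |attrs| Location.create!(attrs) } if defined?(Location)
```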
Step #3: Using Elasticsearch with Rails
To integrate Elasticsearch into the Rails app, add the following gems to the Gemfile:
gem 'elasticsearch-model'
gem 'elasticsearch-rails'
To install these gems, run bundle install.
Now it's time to add the actual functionality to the Location model. We will use so-called concerns.
We create a new app/models/concerns/searchable.rb
file.
Afterward, we add the following code snippet:
module Searchable
  extend ActiveSupport::Concern

  included do
    include Elasticsearch::Model
    include Elasticsearch::Model::Callbacks
  end
end
And include the created module to the location model:
class Location < ApplicationRecord
  include Searchable
end
Here is what this module does:
* The Elasticsearch::Model module adds Elasticsearch integration to the model.
* Elasticsearch::Model::Callbacks adds callbacks that keep the index in sync when records are created, updated, or deleted.
Now, let’s index our model.
Open the Rails console with rails c and run Location.import force: true.
The force: true option creates the index, deleting it first if it already exists. To be sure that the index has been created, open the Kibana dev tools at http://localhost:5601/ and run GET _cat/indices?v.
So, we have built the index with the name locations:
The index was created automatically, which means the default configuration was applied to all fields.
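To see what that default configuration looks like, you can inspect the generated mapping in the Kibana dev tools (assuming the index is named locations, as above):

```
GET locations/_mapping
```

With dynamic mapping, Elasticsearch maps string fields such as name and level to the text type with a keyword sub-field by default.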
Now we are going to develop a test query. Find more information about Elasticsearch Query DSL here.
Open the Kibana dev tools at http://localhost:5601 and insert the snippet:
GET locations/_search
{
  "query": {
    "match_all": {}
  }
}
Pay attention to the hits attribute of the response JSON, especially its _source attribute. Note that all fields of the Location model were serialized and indexed.
To make a test query through the Rails app, open the rails c console and insert the following:
results = Location.search('san')
results.map(&:name) # => ["san francisco", "american samoa"]
Step #4: Creating a custom index with autocomplete functionality
At this step, we have to delete the previous index and then create a new one. To delete it, open the Rails console (rails c) and run Location.__elasticsearch__.delete_index!.
Now it's time to edit the app/models/concerns/searchable.rb file. It should look like this:
module Searchable
  extend ActiveSupport::Concern

  included do
    include Elasticsearch::Model
    include Elasticsearch::Model::Callbacks

    def as_indexed_json(_options = {})
      as_json(only: %i[name level])
    end

    settings settings_attributes do
      mappings dynamic: false do
        indexes :name, type: :text, analyzer: :autocomplete
        indexes :level, type: :keyword
      end
    end

    def self.search(query, filters)
      set_filters = lambda do |context_type, filter|
        @search_definition[:query][:bool][context_type] |= [filter]
      end

      @search_definition = {
        size: 5,
        query: {
          bool: {
            must: [],
            should: [],
            filter: []
          }
        }
      }

      if query.blank?
        set_filters.call(:must, match_all: {})
      else
        set_filters.call(:must,
                         match: {
                           name: {
                             query: query,
                             fuzziness: 1
                           }
                         })
      end

      if filters[:level].present?
        set_filters.call(:filter, term: { level: filters[:level] })
      end

      __elasticsearch__.search(@search_definition)
    end
  end

  class_methods do
    def settings_attributes
      {
        index: {
          analysis: {
            analyzer: {
              autocomplete: {
                type: :custom,
                tokenizer: :standard,
                filter: %i[lowercase autocomplete]
              }
            },
            filter: {
              autocomplete: {
                type: :edge_ngram,
                min_gram: 2,
                max_gram: 25
              }
            }
          }
        }
      }
    end
  end
end
In this snippet, we serialize our model attributes to JSON with the as_indexed_json method.
In fact, we will work only with two fields, i.e. name and level:
def as_indexed_json(_options = {})
  as_json(only: %i[name level])
end
Let’s define the index configuration:
settings settings_attributes do
  mappings dynamic: false do
    # we use our autocomplete custom analyzer that we define below
    indexes :name, type: :text, analyzer: :autocomplete
    indexes :level, type: :keyword
  end
end

def settings_attributes
  {
    index: {
      analysis: {
        analyzer: {
          # we define a custom analyzer named autocomplete
          autocomplete: {
            # the type should be custom for custom analyzers
            type: :custom,
            # we use the standard tokenizer
            tokenizer: :standard,
            # we apply two token filters;
            # autocomplete is the custom filter that we define below
            filter: %i[lowercase autocomplete]
          }
        },
        filter: {
          # we define a custom token filter named autocomplete
          autocomplete: {
            type: :edge_ngram,
            min_gram: 2,
            max_gram: 25
          }
        }
      }
    }
  }
end
We define an autocomplete custom analyzer with the standard tokenizer and the lowercase and autocomplete token filters.
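Once the index has been re-created with these settings (Location.import force: true), you can check the analyzer's output in the Kibana dev tools; for the text below, it should return the tokens ru, rub, and ruby:

```
GET locations/_analyze
{
  "analyzer": "autocomplete",
  "text": "ruby"
}
```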
The autocomplete filter is of the edge_ngram type. The edge_ngram filter splits text into smaller parts (grams) anchored at the beginning of each word. For instance, the word "ruby" will be divided into ["ru", "rub", "ruby"].
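To illustrate, here is a plain-Ruby sketch (not part of the app) of what an edge_ngram token filter with min_gram: 2 and max_gram: 25 produces for a single token:

```ruby
# a toy reimplementation of the edge_ngram token filter, for illustration:
# it emits every prefix of the token between min_gram and max_gram characters
def edge_ngrams(token, min_gram: 2, max_gram: 25)
  upper = [token.length, max_gram].min
  (min_gram..upper).map { |len| token[0, len] }
end

edge_ngrams('ruby') # => ["ru", "rub", "ruby"]
```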
edge_ngram comes in handy when implementing autocomplete functionality. Still, there is one more way to achieve it: the completion suggester approach.
We define mappings for the name and level fields. The keyword data type is applied to the level field. The text data type, together with our custom autocomplete analyzer, is used for the name field.
And now, we are going to explain the search method we apply:
def self.search(query, filters)
  # a lambda function adds conditions to a search definition
  set_filters = lambda do |context_type, filter|
    @search_definition[:query][:bool][context_type] |= [filter]
  end

  @search_definition = {
    # we indicate that there should be no more than 5 documents to return
    size: 5,
    # we define an empty query with the ability to
    # dynamically change the definition
    # Query DSL: https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl.html
    query: {
      bool: {
        must: [],
        should: [],
        filter: []
      }
    }
  }

  # match all documents
  if query.blank?
    set_filters.call(:must, match_all: {})
  else
    set_filters.call(:must,
                     match: {
                       name: {
                         query: query,
                         # fuzziness means you can make one typo and still match your document
                         fuzziness: 1
                       }
                     })
  end

  # the system will return only those documents that pass this filter
  if filters[:level].present?
    set_filters.call(:filter, term: { level: filters[:level] })
  end

  __elasticsearch__.search(@search_definition)
end
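The set_filters lambda simply appends a clause to one of the bool contexts (must, should, or filter), skipping duplicates thanks to |=. Its effect can be seen in plain Ruby, outside the model:

```ruby
# a standalone sketch of how set_filters builds up the search definition
search_definition = { query: { bool: { must: [], should: [], filter: [] } } }

set_filters = lambda do |context_type, filter|
  search_definition[:query][:bool][context_type] |= [filter]
end

set_filters.call(:must, match: { name: { query: 'san', fuzziness: 1 } })
set_filters.call(:filter, term: { level: 'city' })
set_filters.call(:filter, term: { level: 'city' }) # duplicate, ignored by |=

search_definition[:query][:bool][:filter].size # => 1
```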
Re-create the index with Location.import force: true, then run a request in the Rails console to make sure the project works correctly:
rails c
results = Location.search('san francisco', {})
results.map(&:name) # => ["san francisco", "american samoa"]
Let's also verify that the search still performs correctly when the request contains a few typos:
results = Location.search('Asan francus', {})
results.map(&:name) # => ["san francisco"]
We have one filter defined: it filters locations by level. The database contains two objects with the same name, new york, which are of different levels: one is a state and the other is a city:
results = Location.search('new york', { level: 'state' })
results.map { |result| { name: result.name, level: result.level } }
# => [{:name=>"new york", :level=>"state"}]

results = Location.search('new york', { level: 'city' })
results.map { |result| { name: result.name, level: result.level } }
# => [{:name=>"new york", :level=>"city"}]
Step #5: Making the search request available by API
In the final stage, we are going to create a controller through which the search queries will pass:
rails generate controller Home search
For this purpose, open app/controllers/home_controller.rb and insert the following snippet in it:
class HomeController < ApplicationController
  def search
    results = Location.search(search_params[:q], search_params)
    locations = results.map do |r|
      r.merge(r.delete('_source')).merge('id': r.delete('_id'))
    end
    render json: { locations: locations }, status: :ok
  end

  private

  def search_params
    params.permit(:q, :level)
  end
end
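The map block in this controller flattens each raw Elasticsearch hit: it merges the _source fields to the top level and copies _id to id. In plain Ruby, on a hit shaped like the ones in the response below, the transformation looks like this:

```ruby
# a standalone sketch of the hit transformation done in HomeController#search
hit = {
  '_index' => 'locations',
  '_type' => '_doc',
  '_id' => '41',
  '_score' => 3.676841,
  '_source' => { 'name' => 'new york', 'level' => 'state' }
}

# merge the '_source' fields to the top level, then expose '_id' as 'id'
location = hit.merge(hit.delete('_source')).merge('id' => hit.delete('_id'))
location['name'] # => "new york"
location['id']   # => "41"
```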
To see the project in action, run the Rails server with rails s and then navigate to http://localhost:3000/home/search?q=new&level=state.
This request returns all documents whose name contains "new" and whose level equals state.
The response looks like:
{
"locations": [
{
"_index": "locations",
"_type": "_doc",
"_id": "41",
"_score": 3.676841,
"name": "new york",
"level": "state",
"id": "41"
},
{
"_index": "locations",
"_type": "_doc",
"_id": "17",
"_score": 3.5186555,
"name": "new jersey",
"level": "state",
"id": "17"
},
{
"_index": "locations",
"_type": "_doc",
"_id": "10",
"_score": 2.7157228,
"name": "new hampshire",
"level": "state",
"id": "10"
}
]
}
Conclusion
We hope this guide was helpful, and we recommend exploring all the capabilities of Elasticsearch, as it is a great tool for implementing full-text search in your app.
The powerful features to pay attention to:
- Speed: fast search plays an important role in delivering a positive user experience.
- Flexibility: search behavior can be tuned for different datasets and use cases.
- Typo tolerance: even with a typo in the query, Elasticsearch returns relevant results.
- Full-text matching: you can search both for exact keywords and for other matching data kept in your database.
With these powerful features, Elasticsearch is a great match for Ruby on Rails app development.