MERN | Search across 100,000 books | Backend Pagination
- Backend - Node.js
- Frontend - React.js
- Database - MongoDB
- Backend
- cd server
- run "npm install"
- run "node index.js"
- Frontend
- cd client
- run "npm install"
- run "npm start"
Server-side Pagination - The UI displays at most 12 books for any book-name search. To make the search faster, the backend paginates results with mongoose-paginate: only the first 12 records that match the search criteria are fetched and returned to the frontend. To fetch the next 12 records, the frontend makes another API call with the page parameter set to 2.
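A rough sketch of the skip/limit arithmetic behind this page-based pagination (the in-memory array stands in for the Mongo collection, and the 12-per-page size mirrors the UI assumption above; function and variable names are illustrative, not the actual server code):

```javascript
// Page-based pagination: compute which slice of the matching
// records belongs to a given page.
function paginate(items, page, pageSize) {
  const totalPages = Math.ceil(items.length / pageSize);
  const skip = (page - 1) * pageSize; // records to skip before this page
  return { docs: items.slice(skip, skip + pageSize), page, totalPages };
}

// Example: 30 matching books, 12 per page.
const books = Array.from({ length: 30 }, (_, i) => `${i + 1}_book_name`);
const page2 = paginate(books, 2, 12);
console.log(page2.docs.length); // 12
console.log(page2.totalPages);  // 3
```

With mongoose-paginate the same skip/limit work happens inside MongoDB, so only one page of documents ever crosses the wire.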
Indexing on the bookName field - I have created an index on the bookName field to speed up searches.
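In mongoose such an index is typically declared on the schema. A sketch (the schema shape and field names are assumptions inferred from the data script below; the actual model may differ):

```
const mongoose = require('mongoose');

// index: true tells MongoDB to build an index on bookName,
// so name searches avoid a full collection scan.
const bookSchema = new mongoose.Schema({
  bookName: { type: String, index: true },
  authorName: String,
});

const Book = mongoose.model('Book', bookSchema);
```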
Master-Slave architecture of the DB - MongoDB Atlas is used to store the data. The Atlas service provides a replica set with 1 master (primary) and 2 slaves (secondaries). Reads are distributed evenly across the slave nodes, which helps handle concurrent read requests better.
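Routing reads to the slave nodes is controlled by the readPreference option on the MongoDB connection string; a sketch with placeholder credentials (user, password, cluster host, and database name are illustrative):

```
mongodb+srv://<user>:<password>@<cluster>.mongodb.net/books?readPreference=secondaryPreferred
```

secondaryPreferred sends reads to the slaves when they are available and falls back to the master otherwise; writes always go to the master.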
- The application has been tested with 100,000 records due to storage constraints in the cloud DB.
- The books have been named in the format <book_number>_book_name, where book_number lies in the range [1, 100,000].
- Script used to add data:

```python
import requests
import random

url = 'http://localhost:3002'

# Insert 100,000 books: sequential book names, random author names.
for i in range(1, 100001):
    try:
        author_name = "{}_author_name".format(random.randint(1, 50))
        book_name = "{}_book_name".format(i)
        request_data = {
            'authorName': author_name,
            'bookName': book_name,
        }
        requests.post(url, data=request_data)
    except requests.RequestException:
        # Log the index of any record that failed to insert.
        print(i)
```
- Redis Caching - Caching the API request together with its database result to speed up response times for repeated queries.
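A minimal sketch of the cache-aside pattern this describes, using an in-memory Map as a stand-in for Redis (fetchFromDb is a hypothetical placeholder for the real paginated Mongo query; a real deployment would use a Redis client and set a TTL on each key):

```javascript
// Cache-aside: check the cache first, fall back to the DB on a miss,
// then store the result for subsequent identical requests.
const cache = new Map(); // stand-in for Redis

function fetchFromDb(query, page) {
  // Placeholder: a real implementation would run the paginated Mongo query.
  return { query, page, docs: [`${query}_result`] };
}

function searchBooks(query, page) {
  const key = `${query}:${page}`; // cache key = search term + page number
  if (cache.has(key)) {
    return { fromCache: true, ...cache.get(key) }; // cache hit: skip the DB
  }
  const result = fetchFromDb(query, page); // cache miss: query the DB
  cache.set(key, result);                 // store for next time
  return { fromCache: false, ...result };
}
```

Keying on the search term plus the page number means each paginated response is cached independently.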