
Sayan Bhattacharya | Architect & Full-Stack Engineer

We at ShoppinPal deal with retail data of many kinds at scale, and a key application of ours aggregates retailer data in a highly customised manner to power real-time BI/Analytics for effective decision making in these unprecedented times. In this context, as part of our #CovidTaskForce initiatives, I was recently tasked with developing a complex dashboard service that had to be shipped at scale. "Scaling" fazes many DevOps specialists and demands a carefully optimised approach to the problem statement. An additional challenge is that the data to be displayed lives in separate time zones and needs to be synced in real time. Since database aggregations are fairly resource intensive and expensive to run, the idea was to cache responses keyed on the filters applied to each request. After some brainstorming, we addressed these challenges in a streamlined manner that ensured end-user satisfaction.

STEPS:

1. Ensure you have Node.js 10+ installed on your machine. You can verify that with node -v.

2. Run npm init -y to initialise an empty npm package.

3. Add the below to the dependencies section of the package.json:

"dependencies": {
  "body-parser": "1.19.0",
  "express": "4.17.1",
  "object-hash": "2.0.3",
  "mongoose": "5.6.11",
  "redis": "2.8.0"
}

The catch here is object-hash, which helps me hash a given JavaScript object. For every unique filter combination the hash will be the same, so it can be used as a key to store the response in the Redis instance.
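To make that concrete, here is a small sketch (the filter values are purely illustrative). object-hash is designed to produce the same digest for objects with the same contents, regardless of the order in which the keys were set, which is what makes it safe to use as a cache key:

const hash = require('object-hash');

// Two requests carrying the same filter combination, built in different key order
const filterA = { timezone: 'Asia/Kolkata', startDate: '2020-04-01', endDate: '2020-04-30', intervalCount: 7 };
const filterB = { intervalCount: 7, endDate: '2020-04-30', startDate: '2020-04-01', timezone: 'Asia/Kolkata' };

console.log(hash(filterA));                   // a hex digest (SHA1 by default)
console.log(hash(filterA) === hash(filterB)); // true — both map to the same Redis key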

Below is a sample route that caches responses depending on the filters passed in the request body. Each cached response is set to expire with a TTL of 300 seconds; once it expires, the route fetches the data again and stores it back in the Redis instance under the hashed filter key.

const express = require('express');
const bodyParser = require('body-parser');
const hash = require('object-hash');
const redis = require('redis');

const app = express();
// REDIS_HOST is expected to hold a Redis connection URL, e.g. redis://localhost:6379
const redisClient = redis.createClient(process.env.REDIS_HOST);

app.use(bodyParser.urlencoded({ extended: true }));
app.use(bodyParser.json());

const port = 80;
const router = express.Router();

const getDataFromSource = async function (filter) {
  // do something here, i.e. fetch data from an API or the DB
};

router.post('/charts-data', async function (req, res) {
  // Hash the filter combination to get a deterministic cache key
  let filterKey = `${hash(req.body)}`;
  try {
    redisClient.get(filterKey, async function (err, instance) {
      if (err) {
        // Handle Redis errors here; a throw inside this callback would not
        // be caught by the surrounding try/catch
        return res.status(500).send(err);
      }
      if (!instance) {
        // Cache miss: fetch fresh data and cache it for 300 seconds
        let payload = {
          "timezone": req.body.timezone,
          "intervalCount": req.body.intervalCount,
          "startDate": req.body.startDate,
          "endDate": req.body.endDate
        };
        let data = await getDataFromSource(payload);
        redisClient.setex(filterKey, 300, JSON.stringify(data));
        res.send(data);
      } else {
        // Cache hit: return the stored response without touching the source
        res.send(JSON.parse(instance));
      }
    });
  } catch (e) {
    res.send(e);
  }
});

app.use('/', router);
app.listen(port);
console.log('API running on port : ' + port);

This also serves the purpose of rate limiting to some extent, as you are no longer making expensive aggregation calls on every request.
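For completeness, here is one hypothetical way getDataFromSource could be filled in, since mongoose already appears in the dependencies. The Sale model, its fields and the aggregation stages below are illustrative assumptions rather than our actual schema; the point is simply that this is the kind of expensive aggregation the Redis layer shields:

const mongoose = require('mongoose');

// Hypothetical model: the schema and field names are examples only
const Sale = mongoose.model('Sale', new mongoose.Schema({
  amount: Number,
  storeId: String,
  createdAt: Date
}));

const getDataFromSource = async function (filter) {
  // An aggregation like this is what makes uncached requests costly
  return Sale.aggregate([
    { $match: { createdAt: { $gte: new Date(filter.startDate), $lte: new Date(filter.endDate) } } },
    { $group: { _id: '$storeId', totalSales: { $sum: '$amount' } } },
    { $sort: { totalSales: -1 } }
  ]);
};

Hitting the route twice with the same filter body (the values below are illustrative) should make only one trip to the database; the second response comes straight from Redis until the 300-second TTL lapses:

curl -X POST http://localhost:80/charts-data \
  -H "Content-Type: application/json" \
  -d '{"timezone":"Asia/Kolkata","intervalCount":7,"startDate":"2020-04-01","endDate":"2020-04-30"}'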

Share your own Tech Innovation experiences to be featured in this section: email covidtaskforce@shoppinpal.com.

#ShoppinPal #CovidTaskForce #SourceHacks #NodeJS #Redis #DBArchitecture #Analytics #BI #Retail #DevOps #Innovation #Technology #Sustainability #SME #StartUps


