Caching is a fundamental computer science technique: data that has been accessed once is stored in faster storage, such as DRAM, so that subsequent accesses are much quicker.
Today we will show this using Node and Express. First, let’s set up a project.
In an empty folder run
npm init -y
to create an npm project and then run
npm i express node-cache
to install Express and node-cache packages.
Now create a file called index.js to hold our server code, and write the following in it:
const express = require('express');
const NodeCache = require('node-cache');
const https = require('https');
const myCache = new NodeCache({
stdTTL: 10
});
const app = express();
const port = 3000;
Here we have imported node-cache and initialized it with the option stdTTL: 10, which means the cache will keep each entry in memory for ten seconds (the standard time-to-live) before evicting it.
Now type the following code to set up the server.
const todosURL = 'https://jsonplaceholder.typicode.com/todos';

app.get('/', (req, res) => {
  res.send('API is up');
});

app.listen(port, () => {
  console.log('Server started');
});
The todosURL is something that we will use later.
Make a request to localhost:3000 using your browser/Postman to make sure that all is OK. We are expecting a response like this.
API is up
Now that our server is up, let's make a request to the todosURL mentioned above. I will use the built-in https Node module, but you can use whatever HTTP client you want. With https the code looks like this.
app.get("/", (req, res) => {
  try {
    https.get(todosURL, (resp) => {
      let data = "";
      resp.on("data", (chunk) => {
        data += chunk;
      });
      resp.on("end", () => {
        const todos = JSON.parse(data);
        res.send(todos);
      });
    });
  } catch (err) {
    res.send(err);
  }
});
After making the request, I accumulate the response chunks in the data variable, and once the data stream ends I parse it as JSON.
The result will look like this, though the full response will contain about two hundred such todos.
Now, let's cache it. Once caching is in place, whenever a request comes in we will check our cache before making the fetch request. If the data is present there, we will return it from the cache, which is much faster.
It's simple: as soon as you have the data, add it to the cache under a suitable, unique key. Make the following change in the resp.on('end') code block. I have included a GitHub link at the end of the article where you can see the complete code.
resp.on('end', () => {
  const todos = JSON.parse(data);
  myCache.set('todos', todos);
  res.send(todos);
});
Now, if we access the cache for the key “todos”, we will get the stored data.
Note that so far we have stored the data in the cache, but we are still not reading it back. Let's first see how long a request without the cache takes. You can see it here in Postman; in my case it averages around 150 ms. Your numbers will differ depending on variables like network speed.
Before making the request we will check the cache to see if we have the data present there, like this…
app.get("/", (req, res) => {
  if (myCache.has("todos")) {
    res.send(myCache.get("todos"));
  } else {
    try {
      https.get(todosURL, (resp) => {
        let data = "";
        resp.on("data", (chunk) => {
          data += chunk;
        });
        resp.on("end", () => {
          const todos = JSON.parse(data);
          myCache.set("todos", todos);
          res.send(todos);
        });
      });
    } catch (err) {
      res.send(err);
    }
  }
});
Now make the request again.
As you can see, the response time has decreased significantly. Now it’s averaging around 8 ms, which is much faster.
Here is the GitHub link.