Here’s how you can make concurrent requests in Node.js using the `axios` library along with `Promise.all`. This approach is similar to what you would do in Ruby or Python.

Here’s the Node.js code:
```javascript
const axios = require('axios');

const sendRequest = async (query) => {
  const url = "https://piloterr.com/api/v2/website/crawler";
  const xApiKey = "YOUR-X-API-KEY";
  try {
    console.log(`Sending request to ${query}`);
    const response = await axios.get(url, {
      params: {
        'x_api_key': xApiKey,
        'query': query
      }
    });
    console.log(response.data);
  } catch (error) {
    console.error(`HTTP Request failed: ${error.message}`);
  }
};

const urlsToScrape = [
  "https://www.piloterr.com",
  "https://www.piloterr.com/blog"
];

const scrapeConcurrently = async () => {
  // Start every request at once, then wait for all of them to finish
  const requests = urlsToScrape.map(url => sendRequest(url));
  await Promise.all(requests);
  console.log("Process Ended");
};

scrapeConcurrently();
```
- **axios**: The `axios` library is used for making HTTP requests. It’s simple and supports promises, making it easy to work with asynchronous code.
- **Promise.all**: This function runs multiple promises concurrently and waits for all of them to resolve. In our case, we map each URL to a `sendRequest` call and pass the resulting array to `Promise.all` (see the error-handling sketch after this list).
- **async/await**: Used to handle the asynchronous nature of HTTP requests cleanly.
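One subtlety worth noting: `Promise.all` rejects as soon as any one promise rejects. In the code above that never happens, because `sendRequest` catches its own errors. If you instead let errors propagate, `Promise.allSettled` is a common alternative that waits for every request regardless of failures. A minimal sketch (the `fetchPage` helper is hypothetical, standing in for any promise-returning request):

```javascript
const axios = require('axios');

// Hypothetical helper: returns a promise that rejects on HTTP failure
const fetchPage = (url) => axios.get(url).then(res => res.data);

const urls = ["https://www.piloterr.com", "https://www.piloterr.com/blog"];

Promise.allSettled(urls.map(fetchPage)).then(results => {
  results.forEach((result, i) => {
    if (result.status === 'fulfilled') {
      console.log(`${urls[i]} succeeded`);
    } else {
      console.error(`${urls[i]} failed: ${result.reason.message}`);
    }
  });
});
```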
1. Install the `axios` library if you don’t already have it: `npm install axios`
2. Replace `"YOUR-X-API-KEY"` with your actual API key. (Rather than hardcoding it, you can load it from the environment; see the sketch after this list.)
3. Update the `urlsToScrape` array with the URLs you want to scrape.
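As one way to keep the key out of source control, you could read it from an environment variable instead of hardcoding it. The `PILOTERR_API_KEY` name below is just an assumption for illustration:

```javascript
// Assumed variable name; set it before running, e.g.:
//   PILOTERR_API_KEY=abc123 node scrape.js
const xApiKey = process.env.PILOTERR_API_KEY;
if (!xApiKey) {
  throw new Error("Missing PILOTERR_API_KEY environment variable");
}
```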
Using `Promise.all` lets you send all the requests concurrently, which is faster than sending them one at a time.
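To make the difference concrete, here is what the sequential version would look like. Each request waits for the previous one to finish, so the total time is roughly the sum of all response times rather than just the slowest one:

```javascript
// Sequential version: total time ≈ sum of all request durations
const scrapeSequentially = async () => {
  for (const url of urlsToScrape) {
    await sendRequest(url); // next request only starts after this one ends
  }
  console.log("Process Ended");
};
```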
This solution is scalable and easy to adapt for larger projects or more URLs.
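One caveat when adapting this to many URLs: `Promise.all` fires every request at once, which can overwhelm the target server or run into API rate limits. A minimal sketch of one way to cap concurrency without extra dependencies, processing the list in fixed-size batches (the batch size of 5 is an arbitrary assumption):

```javascript
// Process URLs in batches so at most `batchSize` requests run at once
const scrapeInBatches = async (urls, batchSize = 5) => {
  for (let i = 0; i < urls.length; i += batchSize) {
    const batch = urls.slice(i, i + batchSize);
    await Promise.all(batch.map(url => sendRequest(url)));
  }
  console.log("Process Ended");
};

// Usage: scrapeInBatches(urlsToScrape, 5);
```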