Here’s how you can perform concurrent HTTP requests in C# using HttpClient and Task.WhenAll to handle multiple requests asynchronously. The complete code:
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;

class Program
{
    // Reuse a single HttpClient instance for every request.
    private static readonly HttpClient client = new HttpClient();

    static async Task SendRequest(string query)
    {
        try
        {
            string url = "https://piloterr.com/api/v2/website/crawler";
            string xApiKey = "YOUR-X-API-KEY";
            string fullUrl = $"{url}?x_api_key={Uri.EscapeDataString(xApiKey)}&query={Uri.EscapeDataString(query)}";

            Console.WriteLine($"Sending request to {query}");
            HttpResponseMessage response = await client.GetAsync(fullUrl);
            response.EnsureSuccessStatusCode();

            string responseBody = await response.Content.ReadAsStringAsync();
            Console.WriteLine($"Response from {query}:\n{responseBody}");
        }
        catch (HttpRequestException e)
        {
            Console.WriteLine($"Request to {query} failed: {e.Message}");
        }
    }

    static async Task ScrapeConcurrently(List<string> urls)
    {
        // Start one task per URL, then wait for all of them to finish.
        var tasks = new List<Task>();
        foreach (string url in urls)
        {
            tasks.Add(SendRequest(url));
        }
        await Task.WhenAll(tasks);
        Console.WriteLine("Process Ended");
    }

    static async Task Main(string[] args)
    {
        var urlsToScrape = new List<string>
        {
            "https://www.piloterr.com",
            "https://www.piloterr.com/blog"
        };
        await ScrapeConcurrently(urlsToScrape);
    }
}
HttpClient: Used for making HTTP requests. It’s recommended to instantiate HttpClient once and reuse it for better performance.
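In long-running services, the same reuse advice is often handled by IHttpClientFactory, which pools and recycles the underlying handlers for you. A minimal sketch, assuming the Microsoft.Extensions.Http NuGet package is installed (the class name is illustrative):

using System;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Extensions.DependencyInjection;

class FactoryExample
{
    static async Task Main()
    {
        var services = new ServiceCollection();
        services.AddHttpClient(); // registers IHttpClientFactory

        using var provider = services.BuildServiceProvider();
        var factory = provider.GetRequiredService<IHttpClientFactory>();

        HttpClient client = factory.CreateClient(); // handlers are pooled and recycled
        string body = await client.GetStringAsync("https://www.piloterr.com");
        Console.WriteLine($"Fetched {body.Length} characters");
    }
}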
Task.WhenAll: Runs all the tasks concurrently and waits for all of them to complete. Each request is represented by a Task returned by the SendRequest method.
async/await: Used to handle asynchronous operations in a clear and non-blocking way.
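The SendRequest method above only prints each response. If you need the bodies back, the generic overload of Task.WhenAll gathers every task’s result into an array. A minimal sketch (the FetchAllAsync name is illustrative, not part of the example above):

using System.Collections.Generic;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;

static class WhenAllResultsSketch
{
    private static readonly HttpClient client = new HttpClient();

    // Starts one GET per URL; the returned array preserves the input order.
    // Note: if any request fails, awaiting WhenAll rethrows the first exception.
    public static async Task<string[]> FetchAllAsync(IEnumerable<string> urls)
    {
        IEnumerable<Task<string>> tasks = urls.Select(url => client.GetStringAsync(url));
        return await Task.WhenAll(tasks);
    }
}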
Replace "YOUR-X-API-KEY"
with your actual API key.
Modify the urlsToScrape
list to include the URLs you want to scrape.
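To keep the key out of source control, you can read it from an environment variable instead of hard-coding it. A small sketch that replaces the hard-coded line inside SendRequest (the variable name PILOTERR_API_KEY is just an example, not something the API requires):

// Inside SendRequest, instead of the hard-coded key:
string xApiKey = Environment.GetEnvironmentVariable("PILOTERR_API_KEY")
    ?? throw new InvalidOperationException("Set the PILOTERR_API_KEY environment variable first.");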
Using asynchronous programming and Task.WhenAll, you can efficiently run multiple HTTP requests concurrently, improving performance. This approach scales easily to a large number of URLs.
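For very large URL lists you usually want to cap how many requests are in flight at once; one common way is a SemaphoreSlim. A sketch of an alternative ScrapeConcurrently that reuses the SendRequest method from the example (the limit of 10 is an arbitrary choice):

// Requires: using System; using System.Collections.Generic;
// using System.Linq; using System.Threading; using System.Threading.Tasks;
static async Task ScrapeConcurrently(List<string> urls)
{
    using var throttle = new SemaphoreSlim(10); // at most 10 requests in flight
    var tasks = urls.Select(async url =>
    {
        await throttle.WaitAsync();
        try
        {
            await SendRequest(url); // SendRequest from the Program class above
        }
        finally
        {
            throttle.Release();
        }
    }).ToList();

    await Task.WhenAll(tasks);
    Console.WriteLine("Process Ended");
}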
This C# example provides a simple and effective way to handle concurrent HTTP requests, making it suitable for scenarios where performance is crucial.