I have a fairly vanilla web service (old-school ASMX). One of its methods kicks off some async processing that has no bearing on the result returned to the client. Hopefully the little snippet below makes sense:
[System.Web.Services.WebMethod]
public List<Foo> SampleWebMethod(string id)
{
    // sample db query
    var foo = db.Query<Foo>("WHERE id = @0", id);
    // kick off async stuff here - for example, firing off emails.
    // Don't wait for it before returning the result.
    DoAsyncStuffHere();
    return foo;
}
My initial implementation of the DoAsyncStuffHere method made use of ThreadPool.QueueUserWorkItem, so it looks something like:
public void DoAsyncStuffHere()
{
    ThreadPool.QueueUserWorkItem(delegate
    {
        // DO WORK HERE
    });
}
This approach works fine under low-load conditions. However, I need something that can handle a fairly high load, so the producer/consumer pattern seems like the best way to go.
What I am unsure about is how to constrain all work done by the queue to a single thread across all instances of the web service. How would I best go about setting up a single queue that can be accessed by any instance of the web service?
You can use a System.Collections.Concurrent.BlockingCollection<T> with a System.Collections.Concurrent.ConcurrentQueue<T> as the underlying collection. As the namespace name implies, these collections are thread-safe.

Start a consumer thread (or a few) to pull items from the collection using the Take() method. When no items are available, the thread blocks.
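
A minimal sketch of that setup, assuming a static WorkQueue class shared by all requests in the AppDomain and a single background consumer thread (the names here are illustrative, not from your question):

using System;
using System.Collections.Concurrent;
using System.Threading;

public static class WorkQueue
{
    // One queue shared by every instance of the web service in this AppDomain.
    private static readonly BlockingCollection<Action> _queue =
        new BlockingCollection<Action>(new ConcurrentQueue<Action>());

    static WorkQueue()
    {
        // Single background consumer; add more threads if one cannot keep up.
        var consumer = new Thread(Consume) { IsBackground = true };
        consumer.Start();
    }

    public static void Enqueue(Action work)
    {
        _queue.Add(work);
    }

    private static void Consume()
    {
        while (true)
        {
            // Take() blocks until an item is available.
            Action work = _queue.Take();
            work();
        }
    }
}

Because ASMX creates a new service instance per request while all requests share the same AppDomain, a static field like this gives you the single queue across instances that you asked about.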
Your DoAsyncStuffHere method adds items to the BlockingCollection. These items could be unstarted System.Threading.Tasks.Task objects; the consumer thread(s) would in that case Start the tasks after taking them from the collection.
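
For example, a sketch of that Task-based variant with the queue and consumer placed inside the service class (the class and field names are assumptions; RunSynchronously keeps the work on the consumer thread, whereas Start would push it onto the ThreadPool):

using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

public class SampleService
{
    // Shared across all service instances because the field is static.
    private static readonly BlockingCollection<Task> _tasks =
        new BlockingCollection<Task>(new ConcurrentQueue<Task>());

    // Start a single consumer thread the first time the type is used.
    static SampleService()
    {
        new Thread(Consume) { IsBackground = true }.Start();
    }

    public void DoAsyncStuffHere()
    {
        // Add an unstarted Task; the consumer decides when and where it runs.
        _tasks.Add(new Task(() =>
        {
            // DO WORK HERE - e.g. send the emails
        }));
    }

    private static void Consume()
    {
        // GetConsumingEnumerable() blocks internally, like Take().
        foreach (Task task in _tasks.GetConsumingEnumerable())
        {
            // RunSynchronously() keeps the work on this one consumer thread;
            // task.Start() would hand it to the ThreadPool instead.
            task.RunSynchronously();
        }
    }
}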