C# – How to deal with a large amount of in-memory data in a production environment?


We develop a platform centred around data analytics, and much of the data we serve to our users is transient in nature. We also want to serve it at low latency. As a result, the majority of this data (30+ GB) is held in memory on a single VM instance. This wasn't planned; it just turned out this way due to bad design decisions early on. It is causing us a lot of pain, because any time we need to restart the instance, we have to re-initialise all of the data, which can take hours. The application in question is a .NET 7 web API project, where the data is held in long-lived objects. We are looking to refactor/break down this project so we no longer have this issue – what is the best way of going about this? Are there any tools or patterns we should be using?
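To make the current pattern concrete, here is a much-simplified sketch of the kind of long-lived holder we have (all names here are illustrative, not our actual code), plus one idea we have considered: snapshotting the built state to disk on shutdown so a restart can deserialise it instead of rebuilding from source. JSON is used below only for brevity; at 30+ GB a binary format or memory-mapped file would presumably be needed.

```csharp
using System.Text.Json;

// Illustrative only: AnalyticsCache, BuildFromSourceAsync and the snapshot
// path are hypothetical names, not from the real codebase.
public sealed class AnalyticsCache
{
    private Dictionary<string, decimal[]> _data = new();
    private const string SnapshotPath = "analytics-snapshot.json";

    // On startup, prefer reloading a snapshot over the hours-long rebuild.
    public async Task InitialiseAsync()
    {
        if (File.Exists(SnapshotPath))
        {
            await using var stream = File.OpenRead(SnapshotPath);
            _data = await JsonSerializer
                        .DeserializeAsync<Dictionary<string, decimal[]>>(stream)
                    ?? new();
            return;
        }

        _data = await BuildFromSourceAsync(); // the expensive path (hours)

        // Persist what we built so the next restart is cheap.
        await using var outStream = File.Create(SnapshotPath);
        await JsonSerializer.SerializeAsync(outStream, _data);
    }

    // Placeholder for the real, slow initialisation against our data sources.
    private Task<Dictionary<string, decimal[]>> BuildFromSourceAsync()
        => Task.FromResult(new Dictionary<string, decimal[]>());
}
```

Is something along these lines a reasonable stopgap, or is it just papering over the real problem of keeping all state in-process?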

Our current plan is to break the existing large project down into smaller ones without changing the underlying functionality. This is obviously going to be an arduous refactor, and the interaction between the new moving parts will have to be designed from scratch.
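Roughly, the boundary we have in mind (again, names are hypothetical) is to put the data behind an interface so the current in-memory implementation can later be swapped for an out-of-process store such as Redis without touching the callers:

```csharp
// Hypothetical boundary: services depend on IMetricsStore, so the in-memory
// implementation can be replaced by an external cache behind the same API.
public interface IMetricsStore
{
    Task<decimal[]?> GetSeriesAsync(string key);
    Task PutSeriesAsync(string key, decimal[] values);
}

public sealed class InMemoryMetricsStore : IMetricsStore
{
    private readonly Dictionary<string, decimal[]> _data = new();

    public Task<decimal[]?> GetSeriesAsync(string key)
        => Task.FromResult(_data.TryGetValue(key, out var v) ? v : null);

    public Task PutSeriesAsync(string key, decimal[] values)
    {
        _data[key] = values;
        return Task.CompletedTask;
    }
}
```

The idea is to register it as a singleton for now (`builder.Services.AddSingleton<IMetricsStore, InMemoryMetricsStore>();`) and migrate implementations incrementally. Is this interface-first approach sensible, or are there established patterns for this kind of decomposition we should look at first?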
