I/O makes or breaks the system, so storage performance is always a major concern, especially in virtualized environments, where VMs are hungry for IOPS. An all-flash array is extremely expensive, and an all-RAM one even more so; both are generally considered overkill. The industry therefore typically combines a slower spindle tier, a faster flash tier, and a much faster RAM tier.
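The tiered read path can be sketched as follows. This is a minimal illustration, not StarWind's implementation; the names (`RAM_CACHE`, `FLASH_CACHE`, `tiered_read`) and the promote-on-miss policy are assumptions chosen for clarity.

```python
# Illustrative tiered read path: RAM first, then flash, then spinning disk.
# All names and the promotion policy are hypothetical, for explanation only.

RAM_CACHE = {}                    # fastest tier: volatile DRAM
FLASH_CACHE = {}                  # middle tier: flash
DISK = {42: b"block-42-data"}     # slowest tier: spindle (simulated as a dict)

def read_from_disk(block_id):
    return DISK[block_id]

def tiered_read(block_id):
    """Serve a block from the fastest tier that holds it, promoting on miss."""
    if block_id in RAM_CACHE:        # RAM hit: fastest path
        return RAM_CACHE[block_id]
    if block_id in FLASH_CACHE:      # flash hit: promote the block to RAM
        data = FLASH_CACHE[block_id]
        RAM_CACHE[block_id] = data
        return data
    data = read_from_disk(block_id)  # miss in both caches: read the spindle
    FLASH_CACHE[block_id] = data     # populate both cache tiers for next time
    RAM_CACHE[block_id] = data
    return data
```

Subsequent reads of the same block are then served from RAM without touching flash or disk.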
This multi-tier approach brings up a few issues. Faster, more reliable memory costs more: using high-performance or non-volatile memory significantly raises the price of the resulting system.
As for flash, SLC flash memory is very expensive, while cheaper (but still costly) MLC flash has a shorter lifetime because its write/erase cycle limit is up to 10 times lower. Moreover, VM workloads are dominated by random I/O, which is difficult to predict.
Using conventional RAM instead of non-volatile memory makes the cache vulnerable: cached data is lost in case of a hardware malfunction or power outage. Utilizing non-volatile memory, on the other hand, brings back the problem of high price referenced above.
StarWind raises performance by providing a bigger cache for the same money. It uses inexpensive commodity hardware – MLC flash instead of costly SLC flash – so more memory can be bought to meet workload requirements. In addition, starting VMs after migration does not affect performance: all moved VMs start in a "hot state" because caches are kept coherent – synchronized between nodes. The destination node already holds the required data in its cache, so the VM starts with no loss in performance.
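The "hot state" idea can be sketched with a toy model of synchronous cache mirroring. This is an assumption-laden illustration, not StarWind's actual protocol: the `Node` and `DistributedCache` classes and the write-to-all policy are invented here to show why the destination node already has the data after a migration.

```python
# Toy model (not StarWind's implementation) of coherent caches:
# every cached write is mirrored synchronously to all nodes' caches.

class Node:
    def __init__(self, name):
        self.name = name
        self.cache = {}          # each node's local cache

class DistributedCache:
    def __init__(self, nodes):
        self.nodes = nodes

    def write(self, block_id, data):
        # Synchronous mirroring: the write completes only after
        # every node holds the block in its local cache.
        for node in self.nodes:
            node.cache[block_id] = data

    def read(self, node, block_id):
        # Any node serves the block from its own cache ("hot state").
        return node.cache.get(block_id)

node_a, node_b = Node("A"), Node("B")
dc = DistributedCache([node_a, node_b])
dc.write(7, b"vm-disk-block")    # written while the VM runs on node A
# After the VM migrates to node B, B already caches the block locally.
```

A read on `node_b` hits the local cache immediately, which is why the migrated VM suffers no warm-up penalty in this model.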
Reliability is kept at maximum: StarWind mirrors cached data between multiple nodes, creating a distributed cache. Even if one node loses power, the data is safe, because redundant replicas are stored on the other nodes. Besides, cache blocks are digitally signed, negating the possibility of silent data corruption – bit rot. Additionally, space reduction technologies lengthen the life of the flash memory, lowering the risk of failure.
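The bit-rot check can be illustrated with a per-block digest that is verified on every read. This is a hedged sketch: the source does not say which signing or hashing scheme is used, so SHA-256 here (and the `store`/`load` helpers) are assumptions, not StarWind's actual mechanism.

```python
# Illustrative per-block integrity check; SHA-256 is an assumed scheme,
# not necessarily what StarWind uses for its cache block signatures.
import hashlib

def sign_block(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def store(cache: dict, block_id: int, data: bytes) -> None:
    # Keep the digest alongside the data when the block enters the cache.
    cache[block_id] = (data, sign_block(data))

def load(cache: dict, block_id: int) -> bytes:
    # Recompute the digest on read; any silent flip of the data is caught.
    data, digest = cache[block_id]
    if sign_block(data) != digest:
        raise IOError(f"bit rot detected in block {block_id}")
    return data

cache = {}
store(cache, 1, b"payload")
print(load(cache, 1))                    # verifies cleanly

# Simulate bit rot by altering the data behind the digest's back:
data, digest = cache[1]
cache[1] = (b"paylo4d", digest)
try:
    load(cache, 1)
except IOError as exc:
    print("caught:", exc)
```

The point of the sketch is that corruption is detected at read time, before bad data is handed back to a VM.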