Yes, you can run games from a NAS device.
I was considering AMD's StoreMI because most of the systems I assemble use Ryzen processors. I haven't used Optane on any AMD build yet, but I doubt the performance difference would be significant, and I've never thoroughly investigated whether there are real gains or losses in speed. Typically, when I cache an HDD with an SSD, it's mainly for games or media files; the operating system always runs on its own NVMe drive. Comparing game launches from the OS NVMe drive, a cached HDD, and a plain HDD, loading times seem similar overall. The exception was Cyberpunk 2077, and even there, swapping NVMe for a SATA3 SSD didn't change performance much.
Optane is a marketing term that covers several distinct technologies. First, it's a NAND alternative, often called 3D XPoint (apparently a form of phase-change memory), offering speed, latency, and endurance between RAM and NAND. Second, it refers to specialized caching software that can operate outside Intel's usual ecosystem. Third, it stands for storage-class memory, which lets you hold large volumes of data with minimal DRAM usage, commonly found in certain servers (though not recommended for personal use). If you need a detailed example, refer to the case study at the provided link.
Romex's software has shown strong compatibility, making Optane sticks and add-in cards work across a variety of systems. Such products used to sell for around $70 for a 100GB unit on platforms like Newegg, roughly the price of a 480GB NAND SSD at the time, while delivering about ten times better latency. For small-block, low-queue-depth workloads, which is exactly what caching looks like, Optane can be about ten times faster than NAND. However, performance drops significantly when a drive is nearly full, with losses of up to 75%. At extremes, such as 4K random reads, the gap can reach around 20x. Mixed workloads also suffer: a cache device can lose up to 90% of its read speed if it is being updated during reads.
The idea behind caching is to prioritize small data blocks for quick access. Metadata and frequently accessed data are kept in the cache, while rarely used blocks are evicted. This reduces disk I/O by minimizing seeks. However, the approach works best when most reads are served from the fast media; otherwise, other overheads such as data movement and format conversion dominate.
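The keep-hot-blocks, evict-cold-blocks behavior described above can be sketched in a few lines. This is a minimal illustration using LRU eviction, one common policy; real products like PrimoCache or the ZFS ARC use more elaborate schemes (the class and names here are made up for the example):

```python
from collections import OrderedDict

class LRUBlockCache:
    """Toy read cache: keeps the most recently used blocks, evicts the rest."""

    def __init__(self, capacity_blocks):
        self.capacity = capacity_blocks
        self.blocks = OrderedDict()  # block_id -> data, oldest first

    def read(self, block_id, backing_read):
        if block_id in self.blocks:
            self.blocks.move_to_end(block_id)   # mark as recently used
            return self.blocks[block_id], True  # cache hit
        data = backing_read(block_id)           # slow path: go to the disk
        self.blocks[block_id] = data
        if len(self.blocks) > self.capacity:
            self.blocks.popitem(last=False)     # evict least recently used
        return data, False                      # cache miss

cache = LRUBlockCache(capacity_blocks=2)
disk = lambda i: f"block-{i}"
cache.read(1, disk)           # miss, fills cache
cache.read(2, disk)           # miss
_, hit = cache.read(1, disk)  # hit: block 1 was used recently
cache.read(3, disk)           # miss, evicts block 2
_, hit2 = cache.read(2, disk) # miss again: block 2 was evicted
print(hit, hit2)              # → True False
```

The point of the exercise: as long as the working set fits in `capacity_blocks`, almost every read is a hit and the backing disk is barely touched, which is exactly the regime where caching pays off.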
Regarding trade-offs: plenty of fast NAND is generally the better buy for holding hot data, but Optane shines where latency matters most. The choice depends on your workload patterns; for a small, latency-sensitive hot set, Optane can be a strong option despite its higher cost per gigabyte. In practice, once most reads are already served from fast storage, Optane's benefits diminish compared to other solutions. The decision ultimately hinges on how much hot data you have and what your application demands.
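The diminishing-returns point is easy to see with a back-of-the-envelope effective-latency calculation. The numbers below are purely illustrative, not measurements: say 10 µs for an Optane cache, 100 µs for a NAND SSD cache, and 10 ms for an HDD miss:

```python
def effective_latency_us(hit_rate, cache_us, backing_us):
    """Average read latency with a cache in front of slower backing storage."""
    return hit_rate * cache_us + (1 - hit_rate) * backing_us

HDD_US = 10_000  # illustrative: ~10 ms random read on a spinning disk
for hit_rate in (0.50, 0.90, 0.99):
    nand = effective_latency_us(hit_rate, 100, HDD_US)
    optane = effective_latency_us(hit_rate, 10, HDD_US)
    print(f"hit {hit_rate:.0%}: NAND cache {nand:.0f} us, Optane cache {optane:.0f} us")
```

Until the hit rate gets very high, the HDD misses dominate the average, so the cache medium's own latency matters less than the spec sheets suggest; Optane only pulls clearly ahead of NAND once almost everything is already a cache hit.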
That was quite useful information; I hadn't bothered searching for it before, so thanks for sharing. Usually I rely on specs and technical guides to make my decisions. It seems Intel has discontinued the small Optane memory modules, as I only found H10 models in 32GB or 512GB sizes on Amazon. With current shipping fees and import duties, those prices could nearly double. It makes me wonder if I missed out by not buying one when they were widely available. Still, I'm not sure it would make a noticeable difference in a Windows setup, especially compared to a standard NVMe drive for gaming. Once DirectStorage becomes common, I'm confident it will bring significant benefits.
Caching works a drive hard: a cache device sees up to ten times more activity than a standard drive, with frequent updates and many small writes, and it can quickly become a bottleneck. For a single user playing games it should be manageable, though the advertised figures usually represent ideal conditions rather than real-world performance.

At this stage the market leans toward second-hand units. The upside is that these parts are very durable and tend to last a long time. Intel expected Optane to succeed, but production costs stayed high without heavy investment; it never gained traction, so Intel shifted focus to enterprise parts like 1.6TB drives for $3700.

About two years ago, 16GB drives cost around $10 (now $15), 32GB around $30 (now $40), 100GB around $70 (now $200), and 118GB around $80 (now $120), while performance has roughly doubled. The 280GB models are still available near $220, though that's a premium. Around $70 for a cache device remains a solid choice, a bit less if it isn't used as a primary drive.

It's reasonable to expect a noticeable gap if you rely on caching, and a Samsung 980 isn't ideal for it; even a 970 would be better. Ultimately, unless you have extreme needs, most people are fine without it. DirectStorage could improve SSDs, but it adds complexity and isn't built around caching. For now, a single large NAND drive is the simpler, more reliable path, and the future likely favors cheaper, more efficient NAND SSDs over caching setups.
Just to give a general idea about caching, I configured iSCSI on my NAS with a Linux-based host system. These are the cache statistics after roughly 30 minutes of running Just Cause 3 (which streams assets fairly efficiently), right after transferring Batman and Just Cause 3 onto the NAS, so this is a reasonable mid-quality scenario (those fresh transfers would occupy the MRU side of the cache).

root@NAS[~]# arcstat -f read,miss,hit%,l2hit% 30
read miss hit% l2hit%
0 0 0 0
8.9K 1 99 57
13K 20 99 30
9.0K 47 99 9
13K 35 99 16
8.9K 35 99 23
14K 27 99 12
8.9K 21 99 9
13K 19 99 24
17K 21 99 7
9.4K 42 99 7
8.7K 60 99 17
13K 11 99 6
9.1K 69 99 16
13K 1 99 56
8.9K 1 99 8
13K 8 99 24
8.9K 24 99 12
13K 41 99 21
17K 16 99 1
12K 0 100 0
8.8K 24 99 11
12K 0 100 0
8.8K 0 100 0
10.0K 0 100 0
7.6K 3 99 6
12K 1 99 0
8.9K 52 99 4
12K 9 99 0
17K 40 99 4
12K 0 99 50
8.8K 18 99 1
12K 39 99 1
8.8K 10 99 0
15K 15 99 0
8.7K 0 99 4
12K 2 99 2
8.6K 0 99 0
10K 0 99 0
7.1K 1 99 13
13K 61 99 22
17K 7 99 0
12K 0 99 0
8.8K 9 99 1
12K 0 99 0
8.7K 13 99 39
12K 19 99 1
8.8K 42 99 5
15K 2 99 1
9.0K 113 98 11
12K 30 99 5
17K 127 99 15

As you can see, about 25GB of RAM is used for L1 ARC and nearly all hits come from it. The 118GB Optane in L2ARC is only lightly engaged, which is what you want when the data fits in memory (at best it serves ~60% of what isn't in RAM). Each line covers about 30 seconds of operation. Almost everything resides in L1 ARC. I plan to try Batman: Arkham Knight soon to check performance; I expect around 90% ARC hit rates there, though some files may already be cached on my desktop since I opened them hours ago. The hit% column stays at 98-100%: nearly all activity is served from the NAS's RAM, with the remainder flowing through L2ARC.
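To boil a run like that down to a single number, you can aggregate the per-interval counts. This is a quick sketch (function names are mine); it assumes the `read miss hit% l2hit%` column layout shown above and arcstat's K/M/G count suffixes:

```python
def parse_count(tok):
    """Parse an arcstat-style count like '8.9K' or '127' into a number."""
    suffixes = {"K": 1_000, "M": 1_000_000, "G": 1_000_000_000}
    if tok and tok[-1] in suffixes:
        return float(tok[:-1]) * suffixes[tok[-1]]
    return float(tok)

def overall_hit_pct(lines):
    """Aggregate ARC hit rate from 'read miss ...' data rows."""
    reads = misses = 0.0
    for line in lines:
        fields = line.split()
        reads += parse_count(fields[0])   # column 1: total reads
        misses += parse_count(fields[1])  # column 2: ARC misses
    return 100.0 * (1.0 - misses / reads) if reads else 0.0

# three sample rows taken from the output above
sample = ["8.9K 1 99 57", "13K 20 99 30", "9.0K 113 98 11"]
print(f"{overall_hit_pct(sample):.2f}% ARC hits")  # → 99.57% ARC hits
```

Summing reads and misses before dividing weights each interval by its traffic, which avoids the distortion you get from averaging the per-line hit% values directly.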
Another detail I overlooked: if you configure the NAS as an iSCSI target rather than using SMB, you can run PrimoCache or similar tools on the client side. That lets you skip expensive caching hardware on the NAS itself, so a budget-friendly NAS works fine without giving up performance. Games are a good example of this setup. You might even reuse a partition from your main SSD on the client machine, eliminating extra cost. If both sides cache the same data, you end up holding it twice, which isn't ideal but isn't a major issue either. The big win is the bandwidth saved over the 1GbE connection.

I added a benchmark of my NAS without any client-side caching yet; most results align with CrystalDiskMark. Overall speed is on par with an internal SATA SSD, except you get up to 12TB of space plus gains from compression. I've been experimenting and still need to confirm compatibility with Steam. So far, read speeds have noticeably improved, especially during downloads. It took some time adjusting (I used lvmcache instead of EnhanceIO). Since I assume this device will eventually be unmounted, I chose a balance between write speed and durability. I might switch the cache strategy from write-through to write-back, though that could cause problems if the connection drops suddenly or the drive fails; since these are mostly game files, it's not critical.

I rechecked a few times and occasionally hit peak read rates of about 1.3GB/s, but only while downloading games. Write speed dropped during those sessions, which is expected since I'm mainly caching reads. That's the best performance I've achieved so far. I've tried several titles, and my internal (desktop) cache probably holds some of the data, so there's definitely variation here. For reference, a 970 Evo is likely the benchmark to beat for today's games; RND4K Q32T1 might give the closest match to game loading.
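The write-through vs. write-back trade-off mentioned above comes down to this: write-through persists every write to the backing store immediately, while write-back holds dirty blocks in the cache and risks losing them if the device or the iSCSI link dies first. A simplified model (real lvmcache/PrimoCache behavior is far more involved; the class here is invented for illustration):

```python
class WriteCache:
    """Tiny model contrasting write-through and write-back caching."""

    def __init__(self, mode):
        assert mode in ("through", "back")
        self.mode = mode
        self.cache = {}     # block_id -> data (the fast SSD)
        self.dirty = set()  # blocks not yet on the backing store
        self.backing = {}   # stands in for the slow NAS/HDD

    def write(self, block_id, data):
        self.cache[block_id] = data
        if self.mode == "through":
            self.backing[block_id] = data  # safe: persisted immediately
        else:
            self.dirty.add(block_id)       # fast: persisted later, at risk

    def flush(self):
        """Write back all dirty blocks (what a clean unmount would do)."""
        for block_id in self.dirty:
            self.backing[block_id] = self.cache[block_id]
        self.dirty.clear()

wb = WriteCache("back")
wb.write(1, "save-game")
lost_if_crash_now = 1 not in wb.backing  # True: data exists only in cache
wb.flush()
print(lost_if_crash_now, wb.backing[1])  # → True save-game
```

For game files that can simply be re-downloaded, losing the dirty window is an acceptable price for faster writes; for anything irreplaceable, write-through is the safer default.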
Next week's titles with DirectStorage should benefit even more from streaming large amounts of data directly to VRAM.