Fusion Data Store memory requirements
Fusion Data Store requires a peak of approximately 6GB of server memory per 50 million observations:
120MB per million observations.
Notes:
- The balance between series and observations also affects memory requirements: datasets where each series has only a few observations need additional memory.
- The figures quoted above are peak memory requirements when updating or replacing existing datasets. Quiescent memory requirements are lower.
- Services such as Fusion Registry have additional memory requirements, for example for the volume of structural metadata, which need to be taken into account when sizing deployment platforms.
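
As a rough illustration of the sizing rule quoted above (approximately 120MB of peak memory per million observations, or about 6GB per 50 million), the following Java sketch converts an observation count into an estimated peak memory figure. The class and method names are illustrative only and are not part of any Fusion product API; remember that this estimate covers the Fusion Data Store alone and excludes the additional requirements noted above.

    /**
     * Rough sizing helper illustrating the rule of thumb on this page:
     * approximately 120MB of peak server memory per million observations
     * (equivalently, about 6GB per 50 million observations).
     *
     * Illustrative only: not part of any Fusion product API.
     */
    public final class FdsMemoryEstimate {

        /** Peak memory in MB per million observations, per this page. */
        private static final double MB_PER_MILLION_OBS = 120.0;

        /** Estimated peak memory requirement in megabytes. */
        public static double peakMemoryMb(long observations) {
            return (observations / 1_000_000.0) * MB_PER_MILLION_OBS;
        }

        public static void main(String[] args) {
            long obs = 50_000_000L;            // 50 million observations
            double mb = peakMemoryMb(obs);     // 6000 MB, i.e. roughly 6GB
            System.out.printf("%,d observations -> ~%.0f MB (~%.1f GB) peak%n",
                    obs, mb, mb / 1000.0);
        }
    }

The quiescent figure will be lower than this estimate, which applies when datasets are being updated or replaced.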