monitorUsage Function Documentation

Overview

The monitorUsage function inspects and reports on the memory usage of specific objects in a Node.js application, namely those marked with a unique identifier. It gives developers insight into the memory allocated and retained by marked cache objects, which can be invaluable for diagnosing potential memory leaks or for optimizing memory usage.

Rationale

In large-scale applications, memory optimization is crucial for performance and scalability, yet determining which objects consume significant memory can be challenging. By marking key objects (such as cache objects) and analyzing a heap snapshot with a tool like node-heapdump-analyzer, developers can measure those objects' memory footprint directly. The monitorUsage function is a utility that extracts and summarizes this information.

Usage

Prerequisites

  • Ensure the required packages are installed:

    npm install heapdump node-heapdump-analyzer @types/heapdump @types/node
  • Mark the target objects in the application with a unique identifier (a fuller marking sketch follows this list):

    myCacheObject.__MY_UNIQUE_MARKER__ = "CACHE_OBJECT";
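
For illustration, the marker could be attached through a small helper so the key and value stay in one place; everything below (the helper name and the cache shape) is a hypothetical sketch, not part of the original gist:

const MARKER_KEY = '__MY_UNIQUE_MARKER__';
const MARKER_VALUE = 'CACHE_OBJECT';

// Attach the marker to any cache-like object so it can be found in a heap snapshot later.
function markAsCacheObject<T extends object>(obj: T): T {
    (obj as Record<string, unknown>)[MARKER_KEY] = MARKER_VALUE;
    return obj;
}

// Usage: mark the cache when it is created.
const myCacheObject = markAsCacheObject({ entries: new Map<string, unknown>() });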

Function Call

Simply invoke the monitorUsage function:

monitorUsage().then(data => {
    console.log(JSON.stringify(data, null, 2));
}).catch(err => {
    console.error("Error:", err);
});

The function returns a promise that resolves with a JSON object detailing the memory usage of marked objects.
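
Inside an async function, the equivalent await form reads as follows (reportCacheMemory is just an illustrative wrapper name):

async function reportCacheMemory(): Promise<void> {
    try {
        const data = await monitorUsage();
        console.log(JSON.stringify(data, null, 2));
    } catch (err) {
        console.error("Error:", err);
    }
}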

Output

The resulting JSON object contains the following fields (an illustrative example follows the list):

  • totalCacheObjects: The total number of marked cache objects found in the heap snapshot.
  • totalSize: The combined shallow size of all marked objects. Represents the size of the objects themselves.
  • totalRetainedSize: The combined retained size of all marked objects, i.e. the size of the objects plus everything that would be freed if they were garbage collected.
  • details: An array containing detailed information about each individual marked object.
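
For illustration, a result for two marked cache objects might look like the object below; every number is invented, and heap snapshot sizes are typically reported in bytes:

const exampleResult: MonitorUsageResult = {
    totalCacheObjects: 2,
    totalSize: 1024,            // combined shallow size of the two cache objects
    totalRetainedSize: 52480,   // combined retained size (the objects plus what only they keep alive)
    details: [
        { objectId: 12345, size: 512, retainedSize: 40960 },
        { objectId: 12346, size: 512, retainedSize: 11520 }
    ]
};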

Considerations

  • Performance Impact: Generating a heap snapshot can be resource-intensive, especially for large applications. It's recommended to use this function in development or debugging environments and not in production scenarios.

  • Snapshot Storage: The function currently stores the heap snapshot in the file currentSnapshot.heapsnapshot. Ensure there's enough disk space for this file and consider periodically deleting or archiving older snapshots.

  • Marked Objects: This function specifically searches for objects marked with __MY_UNIQUE_MARKER__ = "CACHE_OBJECT". Ensure target objects are marked appropriately.

  • TypeScript Definitions: At the time of this documentation, node-heapdump-analyzer might not have TypeScript definitions available. Custom type definitions or workarounds using the any type might be necessary; a possible stub is sketched after this list.
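
If a local declaration file is preferred over the any workaround, a minimal ambient module stub could look like the following. The shapes here are assumptions inferred only from how the analyzer is used in the code below, not the library's documented API:

// node-heapdump-analyzer.d.ts (hand-written stub; shapes are assumed)
declare module 'node-heapdump-analyzer' {
    interface AnalyzedObject {
        id: number;
        shallowSize: number;
        retainedSize: number;
    }

    class HeapDumpAnalyzer {
        load(path: string, callback: (err: Error | null) => void): void;
        queryObjects(query: { name: string; value: string }): AnalyzedObject[];
    }

    export = HeapDumpAnalyzer;
}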

Conclusion

The monitorUsage function is a valuable tool in a developer's arsenal for understanding memory usage tied to specific marked objects. By providing detailed insights into memory allocation and retention, it aids in the diagnosis and resolution of memory-related issues in Node.js applications.

Actual Code (untested)

import * as heapdump from 'heapdump';
import * as path from 'path';

const SNAPSHOT_PATH = path.join(__dirname, 'currentSnapshot.heapsnapshot');

interface CacheObjectInfo {
    objectId: number;
    size: number;
    retainedSize: number;
}

interface MonitorUsageResult {
    totalCacheObjects: number;
    totalSize: number;
    totalRetainedSize: number;
    details: CacheObjectInfo[];
}

export async function monitorUsage(): Promise<MonitorUsageResult> {
    // node-heapdump-analyzer is loaded via require and typed as any, since it may not ship TypeScript definitions
    const HeapDumpAnalyzer: any = require('node-heapdump-analyzer');

    return new Promise((resolve, reject) => {
        // 1. Generate a heap snapshot
        heapdump.writeSnapshot(SNAPSHOT_PATH, (err: Error | null) => {
            if (err) return reject(err);

            // 2. Load the snapshot into the analyzer
            const analyzer = new HeapDumpAnalyzer();
            analyzer.load(SNAPSHOT_PATH, (err: Error | null) => {
                if (err) return reject(err);

                // 3. Query the snapshot for marked cache objects
                const cacheObjects: any[] = analyzer.queryObjects({
                    name: '__MY_UNIQUE_MARKER__',
                    value: 'CACHE_OBJECT'
                });

                // 4. Extract useful information
                const results: CacheObjectInfo[] = cacheObjects.map(obj => {
                    return {
                        objectId: obj.id,
                        size: obj.shallowSize,     // Size of the object itself
                        retainedSize: obj.retainedSize // Size including objects it references
                    };
                });

                // Calculate overall stats
                const totalSize = results.reduce((sum, obj) => sum + obj.size, 0);
                const totalRetainedSize = results.reduce((sum, obj) => sum + obj.retainedSize, 0);

                const info: MonitorUsageResult = {
                    totalCacheObjects: results.length,
                    totalSize: totalSize,
                    totalRetainedSize: totalRetainedSize,
                    details: results
                };

                resolve(info);
            });
        });
    });
}
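
To address the performance and snapshot-storage considerations above, the call can be wrapped so that it only runs outside production and removes the snapshot file once the analysis is done. This wrapper is a sketch and not part of the original function:

import { promises as fs } from 'fs';

// Hypothetical wrapper: skip the snapshot in production and clean up the file afterwards.
async function monitorUsageSafely(): Promise<MonitorUsageResult | null> {
    if (process.env.NODE_ENV === 'production') {
        return null; // heap snapshots are too expensive to take in production
    }
    try {
        return await monitorUsage();
    } finally {
        // Best-effort removal of the snapshot file written by monitorUsage.
        await fs.unlink(SNAPSHOT_PATH).catch(() => undefined);
    }
}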