Memory Management

Custom allocator without garbage collection

Why No Garbage Collection?

guideXOS uses a manual memory management system instead of .NET's garbage collector. This design choice provides:

  • Predictable Performance - No GC pauses
  • Lower Memory Footprint - No GC metadata overhead
  • Deterministic Cleanup - Explicit Free() calls
  • Real-Time Capable - No stop-the-world pauses

The trade-off is manual management: every allocation must be freed explicitly, which requires careful coding.

⚠ Important:

Every Allocator.Allocate() must have a matching Allocator.Free() or memory will leak!


Page-Based Allocator

The memory allocator manages a pool of 4KB pages. All allocations are rounded up to page boundaries:

Allocator Configuration

  • Page Size: 4,096 bytes (4KB)
  • Total Pages: 131,072 pages (512MB)
  • Start Address: 0x00400000
  • End Address: 0x20400000
  • Metadata: Free list bitmap + owner tracking

Allocation Process

void* Allocator.Allocate(ulong size, string owner)

  1. Calculate pages needed: (size + 4095) / 4096
  2. Find contiguous free pages in the bitmap
  3. Mark those pages as allocated
  4. Record the owner tag for tracking
  5. Return a pointer to the first page

Example Code

// Allocate 16KB buffer
void* buffer = Allocator.Allocate(16384, "NetworkBuffer");

// Use the buffer
byte* data = (byte*)buffer;
data[0] = 0x42;

// Free when done
Allocator.Free(buffer);

Page Alignment

Requested Size    Pages Allocated    Actual Size
1 byte            1 page             4 KB
4 KB              1 page             4 KB
5 KB              2 pages            8 KB
1 MB              256 pages          1 MB
10 MB             2,560 pages        10 MB

Memory Tracking

Every allocation is tagged with an "owner" string for debugging and monitoring:

Owner Tags

  • Purpose: Track which component allocated memory
  • Format: String identifier (e.g., "NetworkStack", "GUI")
  • Visibility: View with mem command
  • Debugging: Find memory leaks by component

Memory Command Output

gx> mem

Memory Usage:
Total Memory:     512 MB (131,072 pages)
Used Memory:      48 MB (12,288 pages)
Free Memory:      464 MB (118,784 pages)
Utilization:      9.4%

Allocations by Owner:
NetworkStack      16 MB (4,096 pages)
GUI               12 MB (3,072 pages)
Framebuffer       8 MB (2,048 pages)
FileSystem        8 MB (2,048 pages)
Applications      4 MB (1,024 pages)

Leak Detection

Monitor memory usage over time to detect leaks:

// Run mem command multiple times
gx> mem
Used: 48 MB

// Do some work...
gx> ping 8.8.8.8

// Check again
gx> mem
Used: 48 MB  ← Good! No leak

// If usage keeps growing:
Used: 52 MB  ← Potential leak!

StringPool - String Interning System

The StringPool is a string caching and interning system designed to prevent memory leaks from repeated string allocations in a non-GC environment. Common strings like percentages, numbers, and memory sizes are cached and reused instead of allocating new strings each frame.

✓ Problem Solved:

Without StringPool, displaying "42%" in Task Manager 60 times per second would allocate 3,600 string objects per minute, causing rapid memory growth!

How It Works

StringPool maintains pre-allocated arrays of commonly used strings. When you request a string, it either returns a cached instance or creates and caches it on first access (lazy initialization).

Cached String Types

String Type          Cache Size                   Example Output
GetPercentage()      101 entries (0-100)          "42%", "95%", "100%"
GetNumber()          10,001 entries (0-10,000)    "0", "42", "1024", "9999"
GetMemorySize()      2,000 entries (bounded)      "42 MB", "512 KB", "2 GB"
GetTransferRate()    Composed from numbers        "150 KB/s", "2 MB/s"
FormatUptime()       Composed from numbers        "01:23:45"

Usage Examples

Before StringPool (Memory Leak):
// BAD: Allocates new string every frame!
void DrawCpuUsage()
{
    int cpuPct = GetCpuUsage();
    string label = cpuPct.ToString() + "%";  // New allocation!
    DrawText(label);
    label.Dispose();  // Manual cleanup required
}

// At 60 FPS, this allocates 3,600 strings per minute!

After StringPool (No Leak):
// GOOD: Reuses cached string
void DrawCpuUsage()
{
    int cpuPct = GetCpuUsage();
    string label = StringPool.GetPercentage(cpuPct);  // Cached!
    DrawText(label);
    // No Dispose() needed - cached strings are never freed
}

// At 60 FPS, this allocates 0 new strings (after warm-up)

API Reference

GetPercentage(int value)
// Returns cached percentage strings (0% to 100%)
string pct = StringPool.GetPercentage(42);  // "42%"
string full = StringPool.GetPercentage(100); // "100%"

// Values outside 0-100 are clamped
string clamped = StringPool.GetPercentage(150); // "100%"

GetNumber(int/ulong value)
// Returns cached number strings (0 to 10,000)
string count = StringPool.GetNumber(42);     // "42"
string large = StringPool.GetNumber(9999);   // "9999"

// Values > 10,000 allocate new strings
string huge = StringPool.GetNumber(50000);   // New allocation

// Works with ulong too
ulong bytes = 1024UL;
string bytesStr = StringPool.GetNumber(bytes); // "1024"

GetMemorySize(ulong bytes)
// Formats bytes as KB/MB/GB and caches result
string kb = StringPool.GetMemorySize(4096);      // "4 KB"
string mb = StringPool.GetMemorySize(4194304);   // "4 MB"
string gb = StringPool.GetMemorySize(1073741824); // "1 GB"

// Cache stores up to 2,000 unique values
// Most common values (like Task Manager updates) are cached

GetTransferRate(int kbps)
// Formats transfer rates as KB/s or MB/s
string slow = StringPool.GetTransferRate(150);   // "150 KB/s"
string fast = StringPool.GetTransferRate(2048);  // "2 MB/s"

// Automatically converts to MB/s when >= 1024 KB/s

FormatUptime(ulong ticks)
// Formats uptime as HH:MM:SS
ulong ticks = Timer.Ticks;
string uptime = StringPool.FormatUptime(ticks); // "01:23:45"

// Efficiently reuses cached number strings for components

GetCacheStats()
// Returns cache statistics for debugging
string stats = StringPool.GetCacheStats();
// Output: "StringPool - Pct: 85/101, Num: 1247/10001, Mem: 342/2000"
// Shows: populated/total entries for each cache

Real-World Impact

StringPool was introduced to fix critical memory leaks in the Task Manager. Before StringPool:

Scenario                            Without StringPool    With StringPool
Task Manager open for 1 minute      ~180 KB leaked        0 KB leaked
Task Manager open for 10 minutes    ~1.8 MB leaked        ~8 KB total
Allocations per frame (60 FPS)      ~50 strings           0 strings
Performance                         Degrades over time    Stable

Implementation Details

  • Lazy Initialization: Strings are only allocated when first requested
  • Thread-Safe: Static readonly strings can be accessed from any thread
  • No Dictionary: Uses simple arrays to avoid Dictionary initialization issues in kernel code
  • Memory Size Cache: Uses parallel arrays (_memorySizeCacheKeys and _memorySizeCache) with a linear O(n) scan - simple, and fast enough for a small cache
  • Fallback: Values outside cache ranges still work (allocate new strings)

Best Practices

📋 Usage Guidelines:
  1. Use for UI/repeated strings: Perfect for Task Manager, status displays, tooltips
  2. Don't dispose pooled strings: They're managed by the pool, not your code
  3. Concatenation creates new strings: GetNumber(42) + " MB" allocates - use GetMemorySize() instead
  4. Check cache ranges: Values outside cached ranges will allocate normally
  5. Monitor with stats: Use GetCacheStats() to verify cache effectiveness

Common Pitfalls

❌ Don't Dispose Pooled Strings:
string pct = StringPool.GetPercentage(50);
pct.Dispose();  // BAD! Corrupts the pool!

❌ String Concatenation Still Allocates:
// This defeats the purpose - allocates new string!
string bad = StringPool.GetNumber(42) + "%";

// Use the dedicated method instead:
string good = StringPool.GetPercentage(42);

⚠ Cache Misses Allocate:
// This is outside cache range - allocates new string
string huge = StringPool.GetNumber(99999); // > 10,000
huge.Dispose(); // Must dispose non-pooled strings!

Future Improvements

  • Larger caches: Could increase MAX_NUMBER_CACHE for more coverage
  • Custom formatters: Add domain-specific string formatters (IP addresses, MAC addresses)
  • LRU eviction: Memory size cache could implement LRU to handle more dynamic values
  • Statistics tracking: Track cache hit rates for optimization

No Garbage Collector Design

guideXOS completely eliminates the .NET garbage collector. Here's how:

Implementation Details

  1. Custom CoreLib - Replaces standard .NET base class library
  2. No Heap - No managed heap or GC metadata
  3. Unsafe Code Only - All allocations via unsafe pointers
  4. Stack Allocation - Local variables on stack
  5. Manual Lifetime - Developer controls allocation/deallocation

Consequences

✓ Benefits
  • Zero GC pauses
  • Lower memory usage
  • Predictable latency
  • Simpler runtime
⚠ Trade-offs
  • Manual Free() calls
  • Potential memory leaks
  • No reference tracking
  • More careful coding

Memory Safety Rules

📋 Best Practices:
  1. Always Free() what you Allocate()
  2. Use owner tags for all allocations
  3. Monitor memory with mem command
  4. Prefer stack variables when possible
  5. Document allocation ownership

Common Usage Patterns

Pattern 1: Temporary Buffer

void ProcessData()
{
    // Allocate temporary buffer
    void* buffer = Allocator.Allocate(8192, "TempBuffer");
    
    // Use buffer...
    DoSomething(buffer);
    
    // Always free when done!
    Allocator.Free(buffer);
}

Pattern 2: Long-Lived Object

static void* networkBuffer;

void NetworkInit()
{
    // Allocate buffer that lives for entire runtime
    networkBuffer = Allocator.Allocate(65536, "NetworkBuffer");
}

void NetworkShutdown()
{
    // Free at shutdown
    Allocator.Free(networkBuffer);
}

Pattern 3: Dynamic Array

struct PacketList
{
    byte** packets;
    int count;
    int capacity;
}

void AddPacket(PacketList* list, byte* data)
{
    if (list->count >= list->capacity)
    {
        // Grow array (start at 4 - doubling a capacity of 0 stays 0!)
        int newCap = list->capacity == 0 ? 4 : list->capacity * 2;
        byte** newArray = (byte**)Allocator.Allocate(
            (ulong)(newCap * sizeof(void*)), "PacketArray");
        
        // Copy old data
        for (int i = 0; i < list->count; i++)
            newArray[i] = list->packets[i];
        
        // Free old array (skip on the first allocation)
        if (list->packets != null)
            Allocator.Free(list->packets);
        list->packets = newArray;
        list->capacity = newCap;
    }
    
    // Add packet
    list->packets[list->count++] = data;
}

Pattern 4: Structs on Stack

void SendPacket()
{
    // Stack-allocated struct (no Allocate/Free needed)
    IPv4Packet packet;
    packet.sourceIP = 0xC0A80101;
    packet.destIP = 0x08080808;
    packet.length = 64;
    
    // Use packet...
    Network.Send(&packet);
    
    // Automatically cleaned up when function returns
}

💡 Optimization Tip:

Prefer stack allocation for small, short-lived data. Reserve heap allocation (Allocator.Allocate) for large buffers or objects with complex lifetimes.


Related Topics