Testing and Test Data Generation
How do I generate test data for LogEvents?
You can use the following T-SQL script to insert test data into your LogEvents table. It generates realistic log events with configurable message sizes and volumes.
-- SQL Server 2022: insert rows where Body is base64(UTF-8 inner JSON)
SET NOCOUNT ON;
DECLARE @RowCount INT = 10000; -- test small, increase as needed (number of log rows)
DECLARE @BodySizeKB INT = 4096; -- target decoded inner JSON size in KB (UTF-8)
DECLARE @MaxOrdersPerBatch INT = 20000; -- safety cap for the inner loop; must be large enough to reach @BodySizeKB (each generated order adds roughly 320 bytes)
DECLARE @NowUtc DATETIME2 = SYSUTCDATETIME();
DECLARE @StartUtc DATETIME2 = DATEADD(SECOND, -@RowCount, @NowUtc);
DECLARE @TargetBytes INT = @BodySizeKB * 1024;
DECLARE @i INT = 1;
WHILE @i <= @RowCount
BEGIN
DECLARE @rn INT = @i;
DECLARE @batchSuffix NVARCHAR(20) = RIGHT('00000' + CAST(@rn AS VARCHAR(10)), 5);
DECLARE @batchId NVARCHAR(100) = N'batch-' + FORMAT(@StartUtc, 'yyyyMMdd') + N'-' + @batchSuffix;
DECLARE @generatedAt NVARCHAR(50) = FORMAT(DATEADD(SECOND, @rn - 1, @StartUtc), 'yyyy-MM-ddTHH:mm:ssZ');
-- Build orders array by appending order objects until UTF-8 size > @TargetBytes
DECLARE @Orders NVARCHAR(MAX) = N'[';
DECLARE @orderIndex INT = 1;
DECLARE @Inner NVARCHAR(MAX);
DECLARE @CurrentBytes INT = 0;
WHILE @CurrentBytes <= @TargetBytes AND @orderIndex <= @MaxOrdersPerBatch
BEGIN
DECLARE @orderId NVARCHAR(50) = N'ORD-' + FORMAT(@rn, '000000') + N'-' + FORMAT(@orderIndex, '0000');
DECLARE @custId NVARCHAR(50) = N'CUST-' + FORMAT(@rn, '000000') + N'-' + FORMAT(@orderIndex, '0000');
DECLARE @OrderFragment NVARCHAR(MAX) =
N'{"orderId":"' + @orderId + N'","customer":{"customerId":"' + @custId + N'","name":"Customer ' + CAST(@orderIndex AS NVARCHAR(10)) + N'","email":"' + @custId + N'@example.com"},"orderDate":"2025-10-14T00:00:00Z","orderRows":[{"rowId":1,"sku":"SKU-ABC-01","description":"Item","quantity":1,"unitPrice":19.95,"lineTotal":19.95}],"orderTotal":19.95,"currency":"USD"}';
IF @orderIndex > 1
SET @Orders = @Orders + N',' + @OrderFragment;
ELSE
SET @Orders = @Orders + @OrderFragment;
-- compose inner JSON and measure its UTF-8 byte length
SET @Inner = N'{"batchId":"' + @batchId + N'","generatedAt":"' + @generatedAt + N'","orders":' + @Orders + N']}';
SET @CurrentBytes = DATALENGTH(CAST(@Inner COLLATE Latin1_General_100_CI_AS_SC_UTF8 AS VARCHAR(MAX)));
SET @orderIndex = @orderIndex + 1;
END
-- At this point @Inner holds the complete inner JSON (orders array closed with ']}') and @CurrentBytes > @TargetBytes, or the @MaxOrdersPerBatch cap was reached
-- Convert inner JSON NVARCHAR -> VARCHAR (UTF-8 collation) -> VARBINARY to get UTF-8 bytes, then base64
DECLARE @bin VARBINARY(MAX) = CAST(CAST(@Inner COLLATE Latin1_General_100_CI_AS_SC_UTF8 AS VARCHAR(MAX)) AS VARBINARY(MAX));
DECLARE @Base64 NVARCHAR(MAX);
SELECT @Base64 = CAST('' AS XML).value('xs:base64Binary(sql:variable("@bin"))','nvarchar(max)');
-- Outer LogDateTime (ISO-8601 Z)
DECLARE @OuterLogDateTime NVARCHAR(50) = FORMAT(DATEADD(SECOND, @rn - 1, @StartUtc), 'yyyy-MM-ddTHH:mm:ssZ');
DECLARE @OuterJson NVARCHAR(MAX) =
N'{"LogAgentValueId":42,"EndPointName":"INT101: Receive Hello World Log Events","EndPointUri":"C:\\temp\\in","EndPointDirection":0,"EndPointTypeId":60,"OriginalMessageTypeName":"Hello.World.File/1.0","EventDirection":17,"LogDateTime":"'
+ @OuterLogDateTime + N'","ProcessingUser":"DOMAIN\\user","SequenceNo":0,"EventNumber":0,"LogText":"File OK","ApplicationInterchangeId":"","LocalInterchangeId":"00000000-0000-0000-0000-000000000000","LogStatus":0,"ProcessName":"My Process","ProcessingMachineName":"localhost","ProcessingModuleName":"INT101-HelloWorld-Application","ProcessingModuleType":"FilePickup","ServiceInstanceActivityId":"00000000-0000-0000-0000-000000000000","ProcessingTime":80,"Body":"'
+ @Base64 + N'","Bodies":null,"Context":{"CorrelationId":"064205E2-F7CF-43A6-B514-4B55536C2B67","ExtendedProperties/1.0#Filename":"HelloWorld.json"}}';
INSERT INTO dbo.LogEvents (LogEvent, ValidationFailed) VALUES (@OuterJson, 0);
SET @i = @i + 1;
END;
The example above generates 10,000 log entries, each containing a batch of orders with an inner JSON payload of approximately 4 MB (before base64 encoding). Adjust `@RowCount` and `@BodySizeKB` to test different volumes and sizes, and keep `@MaxOrdersPerBatch` high enough to reach the target size: each generated order adds roughly 320 bytes, so allow about 3 orders per KB of target.
How do I test with large messages (4MB+)?
The script above is specifically designed for testing very large messages. The `@BodySizeKB` parameter controls the size of the inner JSON payload (before base64 encoding).
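To confirm that generated rows actually reach the target size, you can approximate the decoded payload size from the base64 length of the Body. This is a minimal sketch; it assumes the rows were inserted by the script above and that OPENJSON is available (compatibility level 130 or higher).
-- Approximate the decoded inner JSON size from the base64 Body
SELECT TOP (5)
       DATALENGTH(LogEvent) / 1024  AS OuterJsonKB,   -- stored outer envelope (NVARCHAR bytes)
       (LEN(j.Body) * 3) / 4 / 1024 AS ApproxInnerKB  -- base64 chars * 3/4 is roughly the decoded UTF-8 byte count
FROM dbo.LogEvents
CROSS APPLY OPENJSON(LogEvent) WITH (Body NVARCHAR(MAX) '$.Body') AS j
WHERE LogEvent LIKE '%INT101: Receive Hello World Log Events%';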
Understanding the message structure
Each test LogEvent contains:
- Outer JSON - The Nodinite Log Event envelope (~500 bytes)
- Base64-encoded Body - Contains the inner JSON payload
- Inner JSON - A batch of orders that grows until it reaches `@BodySizeKB`
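To see this structure end to end, you can take one generated row apart again: extract the base64 Body from the outer JSON, decode it to bytes, and cast it back to text. A minimal sketch, assuming rows from the script above (the generated payload is plain ASCII, so a straight VARCHAR cast is safe here):
-- Decode the Body of one generated row back into the inner JSON
DECLARE @evt NVARCHAR(MAX) =
    (SELECT TOP (1) LogEvent FROM dbo.LogEvents
     WHERE LogEvent LIKE '%INT101: Receive Hello World Log Events%');
DECLARE @b64 NVARCHAR(MAX) =
    (SELECT Body FROM OPENJSON(@evt) WITH (Body NVARCHAR(MAX) '$.Body'));
-- base64 -> bytes via XQuery, then reinterpret the bytes as text
DECLARE @bytes VARBINARY(MAX) =
    CAST('' AS XML).value('xs:base64Binary(sql:variable("@b64"))', 'varbinary(max)');
SELECT LEFT(CAST(@bytes AS VARCHAR(MAX)), 500) AS InnerJsonPreview,
       DATALENGTH(@bytes) / 1024 AS InnerJsonKB;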
Configuration parameters
Parameter | Default | Purpose | Example Values |
---|---|---|---|
`@RowCount` | 10,000 | Number of test log entries to create | 100 (quick test), 100,000 (load test) |
`@BodySizeKB` | 4,096 | Target size of inner JSON in KB (before base64) | 1 (1 KB), 10,240 (10 MB), 102,400 (100 MB) |
`@MaxOrdersPerBatch` | 20,000 | Safety cap that stops the inner order loop | Increase for very large messages (each order adds roughly 320 bytes, about 3 per KB of target) |
Example configurations
Small message test (1 KB messages):
DECLARE @RowCount INT = 1000;
DECLARE @BodySizeKB INT = 1;
Medium message test (1 MB messages):
DECLARE @RowCount INT = 5000;
DECLARE @BodySizeKB INT = 1024;
Large message test (10 MB messages):
DECLARE @RowCount INT = 1000;
DECLARE @BodySizeKB INT = 10240;
DECLARE @MaxOrdersPerBatch INT = 40000; -- raise the cap so the 10 MB target can be reached
Extreme large message test (100 MB messages):
DECLARE @RowCount INT = 100;
DECLARE @BodySizeKB INT = 102400;
DECLARE @MaxOrdersPerBatch INT = 400000; -- raise the cap so the 100 MB target can be reached (about 3 orders per KB)
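Whichever configuration you run, a quick sanity check afterwards confirms how many test rows were created and how much space they occupy. A sketch that filters on the endpoint name used by the script:
-- Row count and stored size of the generated test data
SELECT COUNT(*) AS TestRows,
       SUM(DATALENGTH(LogEvent)) / 1048576 AS StoredMB
FROM dbo.LogEvents
WHERE LogEvent LIKE '%INT101: Receive Hello World Log Events%';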
How do I performance test my SQL Server configuration?
Performance testing strategy
- Baseline test - Small messages, low volume
- Volume test - Small messages, high volume
- Size test - Large messages, low volume
- Stress test - Large messages, high volume
Step 1: Baseline test (100 rows, 1 KB each)
DECLARE @RowCount INT = 100;
DECLARE @BodySizeKB INT = 1;
-- Run the test data generation script
Expected results:
- Insert time: <10 seconds
- Pickup Service processing: <30 seconds
- All LogEvents successfully processed
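To capture the insert-time figure above precisely, you can bracket the generation script with timestamps, for example:
-- Capture elapsed insert time around the generation script
DECLARE @t0 DATETIME2 = SYSUTCDATETIME();
-- ... paste the test data generation script here (it contains no GO separators, so it stays in the same batch) ...
SELECT DATEDIFF(SECOND, @t0, SYSUTCDATETIME()) AS InsertSeconds;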
Step 2: Volume test (10,000 rows, 1 KB each)
DECLARE @RowCount INT = 10000;
DECLARE @BodySizeKB INT = 1;
Expected results:
- Insert time: 1-5 minutes
- Pickup Service processing: 5-15 minutes (depends on `MaxRowsLimit`)
- Monitor Pickup Service performance metrics
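While the Pickup Service works through the table, a lightweight query shows the backlog and any flagged rows. This is a sketch and assumes processed rows are removed from LogEvents, which is the usual pattern for this intermediate table:
-- Pending backlog and validation failures while the Pickup Service is running
SELECT COUNT(*) AS PendingRows,
       SUM(CASE WHEN ValidationFailed = 1 THEN 1 ELSE 0 END) AS FlaggedRows
FROM dbo.LogEvents WITH (NOLOCK);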
Step 3: Size test (100 rows, 10 MB each)
DECLARE @RowCount INT = 100;
DECLARE @BodySizeKB INT = 10240;
DECLARE @MaxOrdersPerBatch INT = 40000; -- allow enough orders to reach the 10 MB target
Expected results:
- Insert time: 2-10 minutes (base64 encoding is CPU-intensive)
- Pickup Service processing: 10-30 minutes
- Monitor memory usage and network throughput
Step 4: Stress test (1,000 rows, 10 MB each)
DECLARE @RowCount INT = 1000;
DECLARE @BodySizeKB INT = 10240;
DECLARE @MaxOrdersPerBatch INT = 40000; -- allow enough orders to reach the 10 MB target
Expected results:
- Insert time: 20-60 minutes
- Pickup Service processing: 1-3 hours
- Monitor for:
- SQL Server memory pressure
- Tempdb growth
- Transaction log growth
- Network bandwidth saturation
- Pickup Service memory usage
What should I monitor during testing?
SQL Server metrics
- Transaction log growth - Ensure auto-growth is configured
- Tempdb usage - Base64 encoding uses tempdb heavily
- Memory pressure - Watch for buffer pool pressure
- Disk I/O - Monitor IOPS and latency
- CPU utilization - Expect high CPU during base64 encoding
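The standard DMVs cover most of these metrics. A sketch (run the first query in the database that holds LogEvents):
-- Transaction log usage of the current database
SELECT total_log_size_in_bytes / 1048576 AS LogSizeMB,
       used_log_space_in_percent
FROM sys.dm_db_log_space_usage;
-- Tempdb space consumed by user and internal objects
SELECT SUM(user_object_reserved_page_count) * 8 / 1024     AS UserObjectMB,
       SUM(internal_object_reserved_page_count) * 8 / 1024 AS InternalObjectMB
FROM tempdb.sys.dm_db_file_space_usage;
-- Memory currently used by the SQL Server process
SELECT physical_memory_in_use_kb / 1024 AS SqlServerMemoryMB
FROM sys.dm_os_process_memory;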
Pickup Service metrics
- Processing rate - Messages per minute
- Memory usage - Watch for memory leaks with large messages
- Error rate - Monitor for validation failures
- Batch efficiency - Compare different `MaxRowsLimit` values
Nodinite metrics
- Log API response time - Should remain under 5 seconds per batch
- Database insert performance - Monitor Log Database write latency
- Search performance - Verify search remains responsive with large messages
How do I clean up test data?
After testing, remove test data to avoid cluttering your environment:
-- Delete all test data generated by the script
DELETE FROM dbo.LogEvents
WHERE LogEvent LIKE '%INT101: Receive Hello World Log Events%'
AND LogEvent LIKE '%batch-202%';
-- Or truncate the entire table if this is a test environment
TRUNCATE TABLE dbo.LogEvents;
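For very large test runs, a single DELETE can bloat the transaction log and hold locks for a long time. Deleting in batches keeps both in check; a sketch using the same filter as above:
-- Batched cleanup: repeat small deletes until no matching rows remain
WHILE 1 = 1
BEGIN
    DELETE TOP (5000) FROM dbo.LogEvents
    WHERE LogEvent LIKE '%INT101: Receive Hello World Log Events%'
      AND LogEvent LIKE '%batch-202%';
    IF @@ROWCOUNT = 0 BREAK;
END;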
Tips for realistic testing
Use production-like data sizes
- Analyze your real LogEvents to determine typical message sizes
- Test with the 90th percentile, 95th percentile, and maximum observed sizes (a sample query follows this list)
- Account for seasonal spikes (month-end, year-end processing)
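A percentile query over your existing LogEvents gives those size figures directly. A sketch against the staging table (point it at whichever table holds your real events):
-- Size distribution of existing LogEvents (stored NVARCHAR bytes, reported in KB)
SELECT DISTINCT
       PERCENTILE_CONT(0.90) WITHIN GROUP (ORDER BY DATALENGTH(LogEvent)) OVER () / 1024 AS P90KB,
       PERCENTILE_CONT(0.95) WITHIN GROUP (ORDER BY DATALENGTH(LogEvent)) OVER () / 1024 AS P95KB,
       MAX(DATALENGTH(LogEvent)) OVER () / 1024                                          AS MaxKB
FROM dbo.LogEvents;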
Simulate concurrent writers
Run multiple instances of the test script simultaneously to simulate multiple applications writing to the LogEvents table:
# PowerShell 7+ (required for ForEach-Object -Parallel): run 5 concurrent test data generators
1..5 | ForEach-Object -Parallel {
sqlcmd -S YourServer -d YourDatabase -E -Q "/* Run test script here */"
}
Test failure scenarios
Insert intentionally invalid LogEvents to test monitoring and purge procedures:
-- Insert invalid JSON to test monitoring
INSERT INTO dbo.LogEvents (LogEvent, ValidationFailed)
VALUES (N'{invalid-json', 0);
-- Insert LogEvents that will fail validation
INSERT INTO dbo.LogEvents (LogEvent, ValidationFailed)
VALUES (N'{"missing": "required fields"}', 0);