
SQL Server Configuration Example

Easily integrate your enterprise systems by configuring the Nodinite Pickup Logging Service to fetch JSON Log Events directly from your SQL Server database. This guide empowers you to set up secure, scalable, and reliable log event collection for your organization, with robust validation and table design for compliance and analytics.

  • ✅ Seamless integration with SQL Server for real-time log event collection
  • ✅ Secure and compliant long-term storage of log data
  • ✅ Scalable configuration for enterprise environments
  • ✅ Automated validation and error handling for reliable operations
graph LR
  subgraph "SQL Server"
    roS(fal:fa-database Database Table)
  end
  subgraph "Nodinite instance"
    roPS(fal:fa-truck-pickup Pickup Service)
    roS --> roPS
    roPS --> |Long term storage|roDB(fal:fa-database Log Database)
  end

The diagram above illustrates how the Pickup Service interacts with SQL Server tables, processes messages, and stores them in the Nodinite Log Database.

Considering SQL Server for your log collection? Review SQL Server Pros and Cons to understand performance benefits, infrastructure trade-offs, and when to choose SQL Server vs. alternatives (HTTP Log API, Azure Service Bus, RabbitMQ).

SQL Servers

The SQL Servers section lets you manage all SQL Server database sources for your log events.

You configure these sources in the Settings.json file, which uses JSON format. The default path is:

C:\Program Files\Nodinite\Logging Agent - Pickup Service\Settings.json
{
  ...
  "SQLServers": [
    {
      "Enabled": true,
      "LogApi": "https://localhost/Nodinite/Dev/LogAPI/",
      "UseConfigurationDatabase": false,
      "ConfigurationDatabaseConnectionString": null,
      "ConnectionString": "Server=myServerAddress;Database=myDataBase;Integrated Security=True;Connection Timeout=60;TrustServerCertificate=true",
      "MaxRowsLimit": 500,
      "ReplaceRules": [
        {
          "Name": "Fix Endpoint real customer id to {customerId}",
          "ReplaceName": false,
          "ReplaceUri": true,
          "Pattern": "/([0-9]{4,})$",
          "Group1Value": "{customerId}"
        }
      ]
    }
  ]
  ...                      
}

SQLServers is an array of SQL Server configurations. Each entry defines how the Log Agent connects to and processes messages from a specific SQL Server database. This setup ensures your Nodinite JSON Log Events are reliably collected and managed.

| Property | Description | Value Example | Comment |
| --- | --- | --- | --- |
| ConnectionString | SQL Server connection string (uses Microsoft.Data.SqlClient with modern TLS defaults) | `Server=myServerAddress;Database=myDataBase;Integrated Security=True;Connection Timeout=60;TrustServerCertificate=true` | See SQL Server Connection Strings for certificate validation, AOAG, troubleshooting, and security |
| MaxRowsLimit | Maximum number of rows to fetch on each loop | 500 (default) | Larger values may impact memory usage |
| Enabled | See the Shared Configuration section for more info | | |
| LogApi | See the Shared Configuration section for more info | | |
| UseConfigurationDatabase | See the Shared Configuration section for more info | | |
| ConfigurationDatabaseConnectionString | See the Shared Configuration section for more info | | |
| ReplaceRules | See the shared Replace Rules section for more info | | |

Important

You must restart the Nodinite Pickup Logging Service for configuration changes to take effect.

SQL Server table

If you use a Microsoft SQL Server database to store Log Events, your custom logging solution must define the following LogEvents table:

| Column | Datatype | Purpose |
| --- | --- | --- |
| LogEvent | nvarchar(max) | Your JSON-encoded Log Event |
| Id | bigint | Automatically created identity; the Pickup Service uses it to flag a row as invalid when the value in LogEvent fails validation |
| ValidationFailed | bit | Defaults to 0 (false) on insert; set to 1 (true) during processing if validation errors are detected |
| ValidationFailedText | nvarchar(max) | Updated if errors occur while validating the LogEvent during processing |
| Created | datetimeoffset | Automatically set on insert; useful for troubleshooting |
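A custom logging solution only needs to insert the JSON payload; the column defaults populate ValidationFailed and Created. A minimal sketch follows, using a hypothetical payload (consult the JSON Log Event documentation for the fields your events must actually contain):

```sql
-- Insert a Log Event; ValidationFailed and Created are filled in by their defaults.
-- The payload below is illustrative only; see the JSON Log Event documentation
-- for the required fields.
INSERT INTO [dbo].[LogEvents] ([LogEvent])
VALUES (N'{"LogAgentValueId": 42, "LogText": "Order received"}');
```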

Important

Invalid Log Events will have ValidationFailed set to True. You must remove these entries manually.

-- Table: [LogEvents], used by :Nodinite: Logging

-- DROP TABLE [dbo].[LogEvents];

CREATE TABLE [dbo].[LogEvents](
    [Id] [bigint] IDENTITY(1,1) NOT NULL,
    [LogEvent] [nvarchar](max) NOT NULL,
    [ValidationFailed] [bit] NOT NULL,
    [ValidationFailedText] [nvarchar](max) NULL,
    [Created] [datetimeoffset](7) NOT NULL,
 CONSTRAINT [PK_LogEvents] PRIMARY KEY NONCLUSTERED 
(
    [Id] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, FILLFACTOR = 100, OPTIMIZE_FOR_SEQUENTIAL_KEY = OFF, DATA_COMPRESSION = PAGE) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]

GO

-- Drop existing nonclustered primary key if needed and recreate as clustered
-- Or keep it and add this covering index
CREATE NONCLUSTERED INDEX [IX_LogEvents_ValidationFailed_Created_Id]
ON [dbo].[LogEvents] ([ValidationFailed], [Created], [Id])
INCLUDE ([LogEvent])
WITH (FILLFACTOR = 100, DATA_COMPRESSION = PAGE, ONLINE = ON); -- ONLINE = ON requires Enterprise edition or Azure SQL Database

ALTER TABLE [dbo].[LogEvents] ADD CONSTRAINT [DF_LogEvents_ValidationFailed]  DEFAULT ((0)) FOR [ValidationFailed]
GO

ALTER TABLE [dbo].[LogEvents] ADD CONSTRAINT [DF_LogEvents_Created]  DEFAULT (sysutcdatetime()) FOR [Created]
GO

IF TYPE_ID(N'dbo.IdList') IS NULL EXEC(N'CREATE TYPE dbo.IdList AS TABLE (Id BIGINT NOT NULL);');
GO

GRANT DELETE ON OBJECT::dbo.LogEvents TO [DEV\N-Svc-gMSA$]; -- Replace with proper service account
GO

GRANT EXECUTE ON TYPE::dbo.IdList TO [DEV\N-Svc-gMSA$]; -- Replace with proper service account
GO

-- Do not forget to assign user access rights
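The dbo.IdList table type created above enables set-based operations on batches of row ids. How the Pickup Service uses it internally is not documented here, but a table-valued parameter of this type can drive a batch delete like this sketch (illustrative only):

```sql
-- Illustrative only: delete a batch of rows identified via the dbo.IdList table type.
DECLARE @processed dbo.IdList;
INSERT INTO @processed (Id) VALUES (1), (2), (3);

DELETE le
FROM [dbo].[LogEvents] AS le
INNER JOIN @processed AS p ON p.Id = le.Id;
```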

Important

Grant the Pickup Service account the minimum required database roles. The table below maps each permission to its purpose and the official Microsoft documentation.

| Permission | Purpose | Microsoft Docs |
| --- | --- | --- |
| db_datareader | Grants read access to all tables and views (required to fetch LogEvent rows and related metadata). | db_datareader |
| db_datawriter | Grants write access to all tables and views (used when the Pickup Service updates status or flags rows). | db_datawriter |
| db_ddladmin | Allows running DDL statements and creating table types (required when the Pickup Service uses table-valued parameters or creates temp objects). | db_ddladmin |
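The roles above can be assigned with ALTER ROLE. A minimal sketch, reusing the placeholder account from the script above (replace it with your actual service account):

```sql
-- Replace [DEV\N-Svc-gMSA$] with the Pickup Service's actual service account.
ALTER ROLE [db_datareader] ADD MEMBER [DEV\N-Svc-gMSA$];
ALTER ROLE [db_datawriter] ADD MEMBER [DEV\N-Svc-gMSA$];
ALTER ROLE [db_ddladmin] ADD MEMBER [DEV\N-Svc-gMSA$];
```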

Monitoring and Maintenance

Monitor failed LogEvents

Detect failed LogEvents early to prevent processing backlogs, data loss, and compliance gaps. Use the Nodinite Database Monitoring Agent to monitor the LogEvents table for failed entries with automated, continuous monitoring and instant alerts.

See the complete guide: How to Monitor Failed LogEvents (includes T-SQL stored procedures, configuration steps, threshold settings, and testing recommendations)
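Before wiring up the Database Monitoring Agent, a quick manual check can confirm whether failed entries exist. A simple starting-point query (the linked guide provides production-ready stored procedures and thresholds):

```sql
-- Count Log Events that failed validation.
SELECT COUNT(*) AS FailedCount
FROM [dbo].[LogEvents]
WHERE [ValidationFailed] = 1;
```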

Purge failed LogEvents

Remove invalid LogEvents safely to avoid unbounded table growth and keep your Pickup Service processing healthy. Use batch deletion procedures to minimize locking and transaction log impact.

See the complete guide: How to Purge Failed LogEvents (includes T-SQL batch deletion procedures, dry-run testing, scheduling recommendations, and safety best practices)
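Batched deletion keeps lock duration and transaction log growth bounded. A minimal sketch with an assumed batch size of 1000 (tune for your environment; the linked guide covers dry runs, scheduling, and safety practices):

```sql
-- Delete failed Log Events in small batches to limit locking and log growth.
-- The batch size of 1000 is an assumption; adjust for your workload.
WHILE 1 = 1
BEGIN
    DELETE TOP (1000)
    FROM [dbo].[LogEvents]
    WHERE [ValidationFailed] = 1;

    IF @@ROWCOUNT = 0 BREAK;
END;
```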

Test Data Generation

Generate realistic test data to validate your SQL Server configuration, performance test large message scenarios (4MB+ payloads), and simulate high-volume logging.

See the complete guide: How to Generate Test Data (includes T-SQL scripts for creating test LogEvents with configurable volumes and message sizes)


Next Step