
SQL Server Configuration Example

Easily integrate your enterprise systems by configuring the Nodinite Pickup Logging Service to fetch JSON Log Events directly from your SQL Server database. This guide empowers you to set up secure, scalable, and reliable log event collection for your organization, with robust validation and table design for compliance and analytics.

  • ✅ Seamless integration with SQL Server for real-time log event collection
  • ✅ Secure and compliant long-term storage of log data
  • ✅ Scalable configuration for enterprise environments
  • ✅ Automated validation and error handling for reliable operations
```mermaid
graph LR
  subgraph "SQL Server"
    roS(fal:fa-database Database Table)
  end
  subgraph "Nodinite instance"
    roPS(fal:fa-truck-pickup Pickup Service)
    roS --> roPS
    roPS --> |Long term storage|roDB(fal:fa-database Log Database)
  end
```

The diagram above illustrates how the Pickup Service interacts with SQL Server tables, processes messages, and stores them in the Nodinite Log Database.

Tip

Considering SQL Server for your log collection? Review SQL Server Pros and Cons to understand performance benefits, infrastructure trade-offs, and when to choose SQL Server vs. alternatives (HTTP Log API, Azure Service Bus, RabbitMQ).

SQL Servers

The SQL Servers section lets you manage all SQL Server database sources for your log events.

In the Remote Configuration GUI, navigate to the SQL Servers tab. Here, you can add, edit, or remove SQL Server database configurations. Each configuration defines how the Nodinite Pickup Log Events Logging Service connects to and processes messages from a specific SQL Server database table.

SQL Servers Configuration Tab
Example of the SQL Servers configuration tab in the Remote Configuration GUI.

Click the Add button to create a new SQL Server configuration. Fill in the required fields, such as connection string, database name, and row limit settings. You can also configure Replace Rules for advanced message processing.

SQL Server Configuration Accordion
Example of a SQL Server configuration accordion in the Remote Configuration GUI.

Expand the accordion for each SQL Server configuration to access advanced settings, including Replace Rules for message processing. These rules allow you to modify message content before it's stored in Nodinite, ensuring consistency and compliance with your logging standards.

Important

Ensure that your SQL Server database user has read access to the LogEvents table and update permissions to set the ValidationFailed flag. Test connectivity before enabling a configuration.
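As a quick sanity check, you can verify both rights from a session running as the service account using the standard T-SQL function HAS_PERMS_BY_NAME (the table name is the one defined later in this guide):

```sql
-- Run as the Pickup Service account; 1 = permission granted, 0 = missing
SELECT
    HAS_PERMS_BY_NAME('dbo.LogEvents', 'OBJECT', 'SELECT') AS CanSelect,
    HAS_PERMS_BY_NAME('dbo.LogEvents', 'OBJECT', 'UPDATE') AS CanUpdate;
```

If either column returns 0, fix the grants before enabling the configuration.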

General tab

The General tab provides the fundamental settings for your SQL Server configuration within the Remote Configuration GUI.
General Tab
The General tab allows you to enable/disable the configuration and provide a display name and description.

The following key fields are required:

| Field | Description | Required | Notes |
| --- | --- | --- | --- |
| Enabled | Checkbox to activate or deactivate this SQL Server configuration | No | When unchecked, the Pickup Service will skip this configuration |
| Display Name | User-friendly name for this SQL Server source configuration | Yes | Used in the GUI and logs for easy identification |
| Description | Optional detailed description of the SQL Server configuration | No | Helps document the purpose and details of this specific database setup |

Follow these guidelines when configuring the General tab:

  • Use descriptive names that clearly indicate the database and purpose (e.g., "Production DB - Order Events")
  • In the Description field, note important details such as the database's business purpose, owner contact, or any special processing notes
  • The Display Name appears throughout the Nodinite interface, so keep it concise yet meaningful
  • Leave the configuration disabled during initial setup if you're not ready to start collecting log events

Source tab

The Source tab contains the database connection configuration and query settings.
Source Tab
The Source tab contains settings for connecting to the SQL Server database.

The following key fields must be configured for database connectivity:

| Field | Description | Required | Notes |
| --- | --- | --- | --- |
| Connection String | SQL Server connection string with authentication | Yes | Example: Server=myServerAddress;Database=myDataBase;Integrated Security=True;Connection Timeout=60;TrustServerCertificate=true |
| Max Rows Limit | Maximum number of rows to fetch per query execution | No | Default: 500; higher values may impact memory usage; adjust based on your environment |

Follow these guidelines when setting up the Source tab:

  • Ensure your database user has SELECT permissions on the LogEvents table and UPDATE permissions to flag invalid records
  • Use connection strings with encryption and certificate validation for production environments
  • Test the connection before enabling to verify credentials and database accessibility
  • Set Max Rows Limit based on available server memory and network bandwidth
  • For large datasets, start with 500 rows and adjust upward after testing
  • Include proper timeout and retry settings in your connection string
  • Use strong, unique passwords and store them securely
  • Never embed credentials directly in documentation or version control
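For reference, two common connection string shapes that follow the guidelines above (server names and credentials are placeholders; in production, prefer Encrypt=True with a trusted certificate over TrustServerCertificate=True):

```text
# Integrated (Windows) authentication, encrypted channel with certificate validation
Server=myServerAddress;Database=myDataBase;Integrated Security=True;Encrypt=True;TrustServerCertificate=False;Connection Timeout=60;

# SQL Server authentication; keep the password in a secret store, never in documentation or source control
Server=myServerAddress;Database=myDataBase;User Id=<user>;Password=<secret>;Encrypt=True;TrustServerCertificate=False;Connection Timeout=60;
```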

Destination tab

The Destination tab configures where processed log events are stored. By default, events are sent to the Nodinite Log API where they are written to the Log Database for long-term storage and analysis.
Destination Tab
The Destination tab contains settings for connecting to the Log API and authentication options.

The following key fields are required for destination configuration:

| Field | Description | Required | Notes |
| --- | --- | --- | --- |
| Use Log API | Checkbox to enable sending events to the Log API | Yes | When checked, processed events are written to the Log Database through the Log API |
| Log API Base URL | Base URL for your Log API instance | Yes | Example: http://{host}:{port}/LogAPI/ |
| Protected | Checkbox to enable OAuth 2.0 authentication | No | Check this if you are using an authenticated Log API |
| Client ID | Public identifier for your application (issued by IDP) | Conditional | Required when Protected is checked |
| Client Secret | Confidential key for application authentication | Conditional | Required when Protected is checked; keep this secure |
| Scope | Space-separated list of access permissions | Conditional | Required when Protected is checked (e.g., read write) |
| IDP Token Endpoint | URL where authentication requests are sent | Conditional | Required when Protected is checked; this is where the client obtains access tokens |

Follow these guidelines for the Destination tab:

  • Always ensure Use Log API is checked to maintain proper log event storage
  • For local/development environments, use unprotected Log API URLs (e.g., http://localhost:40002/LogAPI/)
  • For production environments with authentication, check the Protected checkbox and provide valid OAuth 2.0 credentials
  • When using Protected authentication, ensure your Client ID, Client Secret, and Scope are correctly configured with your identity provider (IDP)
  • The IDP Token Endpoint must be accessible from the Pickup Service instance
  • Keep Client Secret values secure and never commit them to version control
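With Client ID, Client Secret, Scope, and a Token Endpoint, this setup matches the OAuth 2.0 client credentials grant. The sketch below is a generic request of that flow (host, path, and all values are placeholders from your IDP, not Nodinite-specific), useful for verifying the Token Endpoint is reachable from the Pickup Service instance:

```http
POST /connect/token HTTP/1.1
Host: idp.example.com
Content-Type: application/x-www-form-urlencoded

grant_type=client_credentials&client_id=<client-id>&client_secret=<client-secret>&scope=<scope>
```

A successful response contains a JSON body with an access_token, which the service then presents to the protected Log API.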

Replace Rules

Read about Replace Rules in the shared configuration section. Replace Rules let you modify message content before it is stored in Nodinite.
Replace Rules Tab
The Replace Rules tab allows you to configure rules for modifying message content.

Configuration file

Note

This section applies to older versions (<7.1.x) of the Nodinite Pickup Log Events Logging Service.

C:\Program Files\Nodinite\Logging Agent - Pickup Service\Settings.json

```json
{
  ...
  "SQLServers": [
    {
      "Enabled": true,
      "LogApi": "https://localhost/Nodinite/Dev/LogAPI/",
      "UseConfigurationDatabase": false,
      "ConfigurationDatabaseConnectionString": null,
      "ConnectionString": "Server=myServerAddress;Database=myDataBase;Integrated Security=True;Connection Timeout=60;TrustServerCertificate=true",
      "MaxRowsLimit": 500,
      "ReplaceRules": [
        {
          "Name": "Fix Endpoint real customer id to {customerId}",
          "ReplaceName": false,
          "ReplaceUri": true,
          "Pattern": "/([0-9]{4,})$",
          "Group1Value": "{customerId}"
        }
      ]
    }
  ]
  ...
}
```

SQLServers is an array of SQL Server configurations. Each entry defines how the Log Agent connects to and processes messages from a specific SQL Server database, ensuring your Nodinite JSON Log Events are reliably collected and managed.

| Property | Description | Value Example | Comment |
| --- | --- | --- | --- |
| ConnectionString | SQL Server connection string (uses Microsoft.Data.SqlClient with modern TLS defaults) | Server=myServerAddress;Database=myDataBase;Integrated Security=True;Connection Timeout=60;TrustServerCertificate=true | See SQL Server Connection Strings for certificate validation, AOAG, troubleshooting, and security |
| MaxRowsLimit | Maximum number of rows to fetch on each loop | 500 (default) | Larger values may impact memory usage |
| Enabled | See the Shared Configuration section for more info | | |
| LogApi | See the Shared Configuration section for more info | | |
| UseConfigurationDatabase | See the Shared Configuration section for more info | | |
| ConfigurationDatabaseConnectionString | See the Shared Configuration section for more info | | |
| ReplaceRules | See the shared Replace Rules section for more info | | |
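The exact query the service issues is internal, but conceptually MaxRowsLimit bounds each polling loop to a batch of unflagged rows, along these lines (illustration only, against the LogEvents table defined later in this guide):

```sql
-- Illustration only: each loop reads at most MaxRowsLimit rows not yet flagged as invalid
SELECT TOP (500) [Id], [LogEvent]
FROM [dbo].[LogEvents]
WHERE [ValidationFailed] = 0
ORDER BY [Id];
```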

Important

When you are done editing the configuration, you must restart the Nodinite Pickup Logging Service for configuration changes to take effect.

SQL Server Table

If you use a Microsoft SQL Server database to store Log Events, your custom logging solution must define the following LogEvents table:

| Column | Datatype | Purpose |
| --- | --- | --- |
| LogEvent | nvarchar(max) | Your JSON encoded Log Event |
| Id | bigint | Automatically created identity; the Pickup Service must know which row to flag as invalid if the value in LogEvent is not valid |
| ValidationFailed | bit | Defaults to 0 (false) on insert; set to 1 (true) during processing if errors are detected |
| ValidationFailedText | nvarchar(max) | Updated if errors occur validating the LogEvent during processing |
| Created | datetimeoffset(7) | Automatically set during insert; useful for troubleshooting |

Important

Invalid Log Events will have ValidationFailed set to True. You must remove these entries manually.
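The failed entries can be reviewed and removed with plain T-SQL, for example (the batch size is an arbitrary choice; see the purge guide referenced later in this article for a production-grade procedure):

```sql
-- Inspect why events failed validation
SELECT TOP (100) [Id], [Created], [ValidationFailedText]
FROM [dbo].[LogEvents]
WHERE [ValidationFailed] = 1
ORDER BY [Created] DESC;

-- Remove failed events in small batches to limit locking and transaction log impact
DELETE TOP (1000)
FROM [dbo].[LogEvents]
WHERE [ValidationFailed] = 1;
```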

```sql
-- Table: [LogEvents], used by :Nodinite: Logging

-- DROP TABLE [dbo].[LogEvents];

CREATE TABLE [dbo].[LogEvents](
    [Id] [bigint] IDENTITY(1,1) NOT NULL,
    [LogEvent] [nvarchar](max) NOT NULL,
    [ValidationFailed] [bit] NOT NULL,
    [ValidationFailedText] [nvarchar](max) NULL,
    [Created] [datetimeoffset](7) NOT NULL,
 CONSTRAINT [PK_LogEvents] PRIMARY KEY NONCLUSTERED
(
    [Id] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, FILLFACTOR = 100, OPTIMIZE_FOR_SEQUENTIAL_KEY = OFF, DATA_COMPRESSION = PAGE) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]

GO

-- Drop existing nonclustered primary key if needed and recreate as clustered
-- Or keep it and add this covering index
CREATE NONCLUSTERED INDEX [IX_LogEvents_ValidationFailed_Created_Id]
ON [dbo].[LogEvents] ([ValidationFailed], [Created], [Id])
INCLUDE ([LogEvent])
WITH (FILLFACTOR = 100, DATA_COMPRESSION = PAGE, ONLINE = ON);
GO

ALTER TABLE [dbo].[LogEvents] ADD CONSTRAINT [DF_LogEvents_ValidationFailed] DEFAULT ((0)) FOR [ValidationFailed]
GO

ALTER TABLE [dbo].[LogEvents] ADD CONSTRAINT [DF_LogEvents_Created] DEFAULT (sysutcdatetime()) FOR [Created]
GO

IF TYPE_ID(N'dbo.IdList') IS NULL EXEC(N'CREATE TYPE dbo.IdList AS TABLE (Id BIGINT NOT NULL);');
GO

GRANT DELETE ON OBJECT::dbo.LogEvents TO [DEV\N-Svc-gMSA$]; -- Replace with proper service account
GO

GRANT EXECUTE ON TYPE::dbo.IdList TO [DEV\N-Svc-gMSA$]; -- Replace with proper service account
GO

-- Do not forget to assign user access rights
```
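With the defaults in place, a custom logging solution only needs to insert into the LogEvent column; identity and default constraints populate the rest. A minimal sketch (the payload shown is a placeholder, not a complete Nodinite JSON Log Event):

```sql
-- Id, ValidationFailed, and Created are filled in by the identity and default constraints
INSERT INTO [dbo].[LogEvents] ([LogEvent])
VALUES (N'{ "comment": "replace with a valid Nodinite JSON Log Event" }');
```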

Important

Remember to grant the service account for the Pickup Service the following database rights:

Grant the Pickup Service account the minimum required roles below. Use the table to map permissions to purpose and official documentation.

| Permission | Purpose | Microsoft Docs |
| --- | --- | --- |
| db_datareader | Grants read access to all tables and views (required to fetch LogEvent rows and related metadata). | db_datareader |
| db_datawriter | Grants write access to all tables and views (used when the Pickup Service updates status or flags rows). | db_datawriter |
| db_ddladmin | Allows running DDL statements and creating table types (required when the Pickup Service uses table-valued parameters or creates temporary objects). | db_ddladmin |
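On SQL Server 2012 and later, these role memberships are granted with ALTER ROLE; the principal below reuses the placeholder account from the table script and must be replaced with your actual service account:

```sql
ALTER ROLE db_datareader ADD MEMBER [DEV\N-Svc-gMSA$]; -- Replace with proper service account
ALTER ROLE db_datawriter ADD MEMBER [DEV\N-Svc-gMSA$];
ALTER ROLE db_ddladmin   ADD MEMBER [DEV\N-Svc-gMSA$];
```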

Monitoring Log Processing Health

This section describes using the Nodinite Database Monitoring Agent (a separate monitoring component) to track the health of log event processing in the Pickup Service. While the Pickup Service handles logging (collecting and storing log events), you use Monitor Agents to ensure the logging infrastructure is functioning properly.

Monitor failed LogEvents

Detect failed LogEvents early to prevent processing backlogs, data loss, and compliance gaps. Use the Nodinite Database Monitoring Agent to monitor the LogEvents table for failed entries with automated, continuous monitoring and instant alerts.

See the complete guide: How to Monitor Failed LogEvents (includes T-SQL stored procedures, configuration steps, threshold settings, and testing recommendations)

Purge failed LogEvents

Remove invalid LogEvents safely to avoid unbounded table growth and keep your Pickup Service processing healthy. Use batch deletion procedures to minimize locking and transaction log impact.

See the complete guide: How to Purge Failed LogEvents (includes T-SQL batch deletion procedures, dry-run testing, scheduling recommendations, and safety best practices)

Test Data Generation

Generate realistic test data to validate your SQL Server configuration, performance test large message scenarios (4MB+ payloads), and simulate high-volume logging.

See the complete guide: How to Generate Test Data (includes T-SQL scripts for creating test LogEvents with configurable volumes and message sizes)

Next Step