
Understanding the Log Event → Message Type → Search Fields Chain

Critical concept: Creating a Log Event is just the first step. To unlock the full power of Nodinite—searchable business data, powerful filtering, and cross-format correlation—you must understand how Message Types and Search Fields work together.

The 3-step chain:

  1. Your code creates a [Log Event][Log Event] with a Message Type name (e.g., FlatFileInvoice/1.0, http://schemas.acme.com/Order/2.0#Order)
  2. The [Logging Service][] determines the Message Type and finds all [Search Field Expressions][Search Field Expressions] bound to it
  3. Search Field Expressions extract business values from your payload/context (invoice numbers, customer IDs, order amounts, etc.), making them searchable

Why this matters for Pickup Service users:

  • No Message Type = No Search Fields extracted - Your event is logged but invisible to business searches
  • Wrong Message Type = Wrong Search Fields extracted - You'll extract values for the wrong format (e.g., XML expressions running on JSON data)
  • Non-XML formats require explicit Message Type - EDI, flat files, JSON without schema identifiers can't be auto-detected

Example:

{
  "LogAgentValueId": 42,
  "EndPointName": "File Pickup - Invoices",
  "EndPointUri": "C:\\Invoices\\In",
  "EndPointDirection": 0,
  "EndPointTypeId": 60,
  "OriginalMessageTypeName": "FlatFileInvoice/1.0",  ← Critical field!
  "LogDateTime": "2025-10-12T14:23:00Z",
  "Body": "SW52b2ljZTEyMzQ1...",  ← Base64 encoded flat file
  "Context": {
    "FileName": "Invoice_12345.txt"
  }
}

Without "OriginalMessageTypeName": "FlatFileInvoice/1.0", the [Logging Service][] has no way to know this is an invoice, and no [Search Field Expressions][Search Field Expressions] are evaluated. With it, all expressions bound to FlatFileInvoice/1.0 run (e.g., RegEx to extract invoice number, customer ID, amount), making your flat file data searchable.

Learn more:

  • [What is a Message Type?][Message Types] - Understand the binding layer
  • [What is a Search Field?][Search Fields] - See how business values are extracted
  • [Log Event Processing Pipeline][Log Event Processing] - The 4-step journey from unprocessed to searchable
  • [Context Options][Context Options] - Advanced control over Message Type determination

Some real-world examples of solutions that produce [Log Events][Log Event] for pickup include:

  • C#/.NET platform – [Hello World - Sample][]
  • Mule – [Custom connector][Mule ESB - Custom Connector]
  • PowerShell
  • IBM Sterling – custom code
  • IBM Cloud – custom code, e.g., to a [PostgreSQL][] database
  • Java-based solutions
  • Azure Functions – using [Serilog][]

Unlike traditional Logging Agents, the Pickup LogEvents Service Logging Agent reads Nodinite [Log Events][Log Event] from sources like disks or queues and forwards them to the Nodinite [Log API][] (RESTful). For high-performance, on-premises solutions, you can even bypass the [Log API][] and write directly to the [Log Database][Log Databases].
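
As a sketch of the disk/folder scenario, the code below writes a serialized Log Event as a .json file into a folder that the Pickup Service is assumed to monitor. The folder path, the one-event-per-file layout, and the file-naming scheme are assumptions for illustration; your actual values come from the Disk / Folder source configuration.

```csharp
using System;
using System.IO;

public static class PickupFolderWriter
{
    // Assumption: this folder is configured as a Disk / Folder source for the Pickup LogEvents Service.
    private const string PickupFolder = @"C:\Nodinite\PickupLogEvents";

    public static void DropLogEvent(string logEventJson)
    {
        Directory.CreateDirectory(PickupFolder);

        // A unique file name avoids collisions; the exact naming convention is up to you.
        var fileName = $"logevent_{Guid.NewGuid():N}.json";
        var tempPath = Path.Combine(PickupFolder, fileName + ".tmp");
        var finalPath = Path.Combine(PickupFolder, fileName);

        // Write to a temporary name first, then rename, so the Pickup Service
        // never reads a half-written file.
        File.WriteAllText(tempPath, logEventJson);
        File.Move(tempPath, finalPath);
    }
}
```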

Features

| Source | Description | Recommended Monitoring Agent | External Link | Configuration |
|---|---|---|---|---|
| ActiveMQ | Fetch [Log Events][Log Event] from [ActiveMQ][Apache NMS]/ActiveMQ Artemis queues | [Message Queuing Agent][] | [Apache NMS ActiveMQ][] | [Configuration][ActiveMQs] |
| AMQP (New 6.2) | Fetch [Log Events][Log Event] from [AMQP v1.0][AMQP]/ActiveMQ Artemis queues | [Message Queuing Agent][] | AMQP.org | [Configuration][AMQP] |
| AnypointMQ | Fetch [Log Events][Log Event] from the MuleSoft Cloudhub AnypointMQ platform | [Message Queuing Agent][] | AnypointMQ | [Configuration][AnypointMQs] |
| Azure Event Hub | Fetch [Log Events][Log Event] from EventHub; create the [Log Event][] using the Nodinite [Serilog EventHub sink][Serilog - Event Hub] | [Azure Monitoring Agent][Azure Agent] | EventHub | [Configuration][EventHubConfig] |
| Azure ServiceBus | Fetch [Log Events][Log Event] from Azure Service Bus; create the [Log Event][] using the Nodinite [Serilog ServiceBus sink][Serilog - Service Bus] | [Message Queuing Agent][] | [Azure Service Bus][] | [Configuration][ServiceBusQueues] |
| Azure Blob (New 6.2) | Fetch [Log Events][Log Event] from an Azure Storage Account | [Azure Monitoring Agent][Azure Agent] | [Azure Blob Storage][] | [Configuration][BlobStorageConfig] |
| Disk / Folder | Fetch [Log Events][Log Event] from file folders and SMB-enabled shares; create the [Log Event][] using the Nodinite [Serilog File sink][Serilog - File] | [File Monitoring Agent][] | | [Configuration][FolderConfig] |
| Microsoft MSMQ<br>NOTE: DEPRECATED - this feature is no longer available with 7.1.x and later | Fetch [Log Events][Log Event] from Microsoft MSMQ | [Message Queuing Agent][] | | [Configuration][MSMQConfig] |
| Microsoft SQL Server | Fetch [Log Events][Log Event] from Microsoft SQL Server | [Database Monitoring Agent][] | | [Configuration][SQLConfig] |
| PostgreSQL | Fetch [Log Events][Log Event] from PostgreSQL database instances | [Database Monitoring Agent][] | [PostgreSQL][] | [Configuration][PostgreSQLConfig] |

Tip

Missing a source? Contact our support at support@nodinite.com

The agent performs logging by sending an HTTP/HTTPS POST of a [Log Event][] to the api/LogEvent/LogEvent endpoint. For high-performance, on-premises solutions, you can bypass the [Log API][] and write [Log Events][Log Event] directly to the proper [Log Database][Log Databases] using multiple threads.
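
For reference, here is a minimal sketch of that POST using HttpClient. The base address is an environment-specific assumption (replace it with your own Log API address), and any authentication your Log API requires is omitted; only the api/LogEvent/LogEvent path comes from the text above.

```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public static class LogApiClient
{
    // Assumption: replace with the address of your Nodinite Log API instance.
    private static readonly HttpClient Http = new HttpClient
    {
        BaseAddress = new Uri("https://nodinite.example.com/LogAPI/")
    };

    public static async Task PostLogEventAsync(string logEventJson)
    {
        using var content = new StringContent(logEventJson, Encoding.UTF8, "application/json");

        // POST the Log Event JSON to the endpoint named above.
        var response = await Http.PostAsync("api/LogEvent/LogEvent", content);
        response.EnsureSuccessStatusCode();
    }
}
```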


Frequently Asked Questions

Find more solutions and answers in the [Troubleshooting][] user guide.