
Troubleshooting

If a pipeline is misconfigured, the data flowing through it might not be delivered to the desired destination. When this happens, it is important to validate both that the device is sending streaming data in the format that you expect, and that the pipeline destination is configured correctly.

Data Destination Metrics

Other than accessing the external data destination directly, the quickest way to validate that data is flowing through a pipeline is to check the real-time usage metrics for the project's organization on Golioth. Usage can be viewed by navigating to the organization's Settings page, then clicking on the Usage tab. Metrics are broken down by Project and Service. Data destinations for Pipelines that incur usage are identified as data_{destination-name} (e.g., data_gcp-pubsub for a GCP Pub/Sub destination).

Validating Device Data

If a device is sending application/json data, or data that can be transformed into application/json, the simplest way to ensure that the data is structured as expected is to create a minimal pipeline that sends it to LightDB Stream.

filter:
  path: "*"
  content_type: application/json
steps:
  - name: step-0
    destination:
      type: lightdb-stream
      version: v1

If data does not arrive at LightDB Stream using this pipeline, it is likely that the device is not communicating with Golioth, or it is sending data that is not valid JSON.
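Before digging into device connectivity, it can be worth confirming on the host side that the payload a device is about to send actually parses as JSON. A minimal sketch (the payload bytes below are hypothetical stand-ins for a device's stream data, not from the Golioth SDK):

```python
import json

def is_valid_json(payload: bytes) -> bool:
    """Return True if payload parses as JSON, False otherwise."""
    try:
        json.loads(payload)
        return True
    except (json.JSONDecodeError, UnicodeDecodeError):
        return False

# A well-formed sensor reading passes...
print(is_valid_json(b'{"temp": 21.5, "humidity": 40}'))  # True
# ...while a truncated payload does not.
print(is_valid_json(b'{"temp": 21.5'))                   # False
```

If the payload passes this check but still does not arrive at LightDB Stream, the problem is more likely connectivity or the declared content type rather than the JSON itself.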

Validating Pipeline Configuration

The Golioth Simulator, which is visible as a widget in the bottom right corner of the console when working with Pipelines, can be used to validate that a given pipeline is configured correctly. The Simulator will send the provided JSON data to the root path for streaming data (/.s).

For example, the following pipeline could be used with the Simulator to verify that the GCP Pub/Sub destination is configured with valid credentials.

Pipeline: Validate Pipeline Configuration
filter:
  path: "*"
  content_type: application/json
steps:
  - name: step-0
    destination:
      type: gcp-pubsub
      version: v1
      parameters:
        service_account: $GCP_SERVICE_ACCOUNT
        topic: my-topic
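When a Pub/Sub delivery fails, one common cause is a malformed service account key stored in the $GCP_SERVICE_ACCOUNT secret. A quick local sanity check before pasting the key into Golioth is to confirm that the JSON parses and contains the fields every GCP service account key file includes. This is a hedged sketch: it validates the key's shape only, not whether the credentials are authorized to publish to the topic.

```python
import json

# Fields present in every GCP service account key file.
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}

def check_service_account_key(raw: str) -> list[str]:
    """Return a list of problems found in the key; an empty list means the shape is OK."""
    try:
        key = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"not valid JSON: {exc}"]
    problems = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - key.keys())]
    if key.get("type") != "service_account":
        problems.append('"type" should be "service_account"')
    return problems

# Hypothetical, redacted key for illustration:
sample = ('{"type": "service_account", "project_id": "my-project", '
          '"private_key": "-----BEGIN PRIVATE KEY-----...", '
          '"client_email": "pipeline@my-project.iam.gserviceaccount.com"}')
print(check_service_account_key(sample))  # []
```

If the key's shape is valid but messages still do not arrive, check in the GCP console that the service account has the Pub/Sub Publisher role on the topic named in the pipeline.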