
signalk-to-influxdb's Introduction

Signal K to InfluxDb Plugin

Signal K Node server plugin to write all simple numeric Signal K values to InfluxDB 1.x, a time series database.

Note: If you're interested in using InfluxDB 2.x, see the signalk-to-influxdb2 repository instead. This is the preferred approach for new installations.

Once the data is in InfluxDB, you can use, for example, Grafana to draw pretty graphs of your data.

The plugin assumes that the database you specify exists. You can create one with

curl -X POST http://localhost:8086/query?q=CREATE+DATABASE+boatdata
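
You can check that the database was created with a query like the following (assuming InfluxDB is listening on localhost:8086):

curl "http://localhost:8086/query?q=SHOW+DATABASES"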

The plugin writes only self data. It converts Signal K paths to InfluxDB measurement keys in camelCase format, e.g. navigationPosition.

Adding support for non-self data would be pretty easy by adding context as InfluxDB tags.
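
As a rough sketch, a point tagged with context (and source) in InfluxDB line protocol could then look like this (the path, tag values and field value below are purely illustrative):

navigationSpeedOverGround,context=vessels.urn:mrn:imo:mmsi:000000000,source=can0.115 value=3.2 1596038643098000000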

Position handling and tracks

If enabled by the black/whitelist configuration, navigation.position updates are written to the database no more frequently than once per second. More frequent updates are simply ignored.

The coordinates are written as [lon, lat] strings for minimal postprocessing in GeoJSON conversion. Optionally, the coordinates can also be written separately to the database. This enables the location data to be used in various ways, e.g. in Grafana (mapping, functions, ...).

The plugin creates a /signalk/vX/api/self/track endpoint that accepts three parameters and returns a GeoJSON MultiLineString.

Parameters:

  • timespan: in the format xxxY (e.g. 1h), where xxx is a Number and Y one of:

    • s (seconds)
    • m (minutes)
    • h (hours)
    • d (days)
    • w (weeks)
  • resolution: (in the same format) specifies the time interval between the returned points. For example, http://localhost:3000/signalk/v1/api/self/track?timespan=1d&resolution=1h will return the data for the last day (24 hours) with one position per hour. The data is simply sampled with InfluxDB's first() function.

  • timespanOffset: (number) Without timespanOffset the end time of the returned data is the current time. Supplying a timespanOffset value changes the end time to current time - timespanOffset. The timespanOffset value uses the same unit ("Y") as timespan.

Examples (where the current time is 14:00):

http://localhost:3000/signalk/v1/api/self/track?timespan=12h&resolution=1m returns data in the time window 2:00 - 14:00

http://localhost:3000/signalk/v1/api/self/track?timespan=12h&resolution=1m&timespanOffset=1 returns data in the time window 1:00 - 13:00.
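
A quick way to inspect the returned GeoJSON on the command line (a sketch; assumes the server runs on localhost:3000, jq is installed, and the timespan/resolution values are just examples):

curl -s "http://localhost:3000/signalk/v1/api/self/track?timespan=1h&resolution=1m" | jq .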

Sources

If you have multiple sources generating the same data / the same Signal K paths, you can distinguish between them by specifying the source in the query, as sketched below:

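For example, with the influx CLI (a sketch; the measurement name and source value are illustrative, so check your own data for the actual names):

influx -database boatdata -execute "SELECT mean(value) FROM navigationSpeedOverGround WHERE source = 'can0.115' AND time > now() - 1h GROUP BY time(1m)"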

To get persistent source data from NMEA 2000 networks, enable "Use Can NAME in source data" in the connection settings. This way all sources get a unique identity that does not change when the NMEA 2000 bus addresses change.

Time Series API

This plugin implements an HTTP API for retrieving historical / time series values with URLs like http://localhost:3000/signalk/v1/history/values?from=2021-05-25T20:00:00.001Z&to=2021-05-25T23:00:00.561Z&paths=navigation.speedOverGround,navigation.courseOverGroundTrue

  • from and to are date-times with a time offset and/or a time zone in the ISO-8601 calendar system
  • paths is a comma delimited list of Signal K paths
  • resolution is the number of seconds between entries, given as an integer
  • context restricts the query to a single Signal K context (vessel)

Additionally, you can retrieve the contexts that the database has data for with a query like http://localhost:3000/signalk/v1/history/contexts?from=2021-05-25T20:00:00.001Z&to=2021-05-25T23:00:00.561Z and the available paths with http://localhost:3000/signalk/v1/history/paths?from=2021-05-25T20:00:00.001Z&to=2021-05-25T23:00:00.561Z
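
For example (a sketch; assumes the server runs on localhost:3000 and has data for the requested window, and uses a 60-second resolution):

curl -s "http://localhost:3000/signalk/v1/history/values?from=2021-05-25T20:00:00.001Z&to=2021-05-25T23:00:00.561Z&paths=navigation.speedOverGround&resolution=60" | jq .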

Provider

If you want to import log files into InfluxDB, this plugin also provides a provider interface that you can include in your input pipeline. First configure your log playback, then stop the server and insert the following entry in your settings.json:

        {
          "type": "signalk-to-influxdb/provider",
          "options": {
            "host": "localhost",
            "port": 8086,
            "database": "signalk",
            "selfId": <your self id here>,
            "batchSize": 1000
          }
        }

Try it out / Development setup

A quick way to get started / try things out / set things up for development is to start InfluxDb and Grafana with docker-compose up. Then you need to configure the plugin to write to localhost:8086 and Grafana to use InfluxDb data.
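
For example (assuming the repository's docker-compose.yml maps InfluxDB to its default port 8086):

docker-compose up -d
curl -X POST "http://localhost:8086/query?q=CREATE+DATABASE+boatdata"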

For a real-world setup you probably want to install these locally; see for example Seabits' step-by-step instructions.

signalk-to-influxdb's People

Contributors

albert-smit, bergie, kegustafsson, mairas, panaaj, ph1l, sbender9, tkurki


signalk-to-influxdb's Issues

Write Data to Influxdb Cloud?

Hi,
Thanks for this awesome package.
Is it possible to write data directly to an InfluxDB bucket on InfluxDB Cloud?
I have tried inputting credentials, but so far have been unable to connect
Many thanks
Alex

timestamp sent to influx uses incorrect timezone

hi! thanks for the great plugin!

i noticed data in influx has timestamps roughly +7 hours in the future. my localtime is -0700 so figured it was a timezone issue:

$ influx -database goldenmean -execute 'select last(*) from "electrical.batteries.1.voltage"' && date +%s%N
name: electrical.batteries.1.voltage
time                last_value
----                ----------
1596060830085000000 11.57       <-- influx timestamp: Wed Jul 29 2020 15:13:50
1596035635156144036             <-- real epoch:       Wed Jul 29 2020 08:13:55

looking at the code, update.timestamp from signalk appears to be a UTC string, but without any timezone indicator:

2020-07-29T15:25:28.695

in skToInflux.js:51, the timestamp is converted to a Date with new Date(update.timestamp) but Date is interpreting this as localtime:

d = new Date('2020-07-29T15:25:28.695');
Wed Jul 29 2020 15:25:28 GMT-0700 (Pacific Daylight Time)   <-- in the future

Adding in a Z at the end of the string timestamp does the trick, as Date interprets this as UTC:

d = new Date('2020-07-29T15:25:28.695Z');
Wed Jul 29 2020 08:25:28 GMT-0700 (Pacific Daylight Time)   <-- correct!

i changed skToInflux.js:51 to new Date(update.timestamp + 'Z') and influx has the correct timestamp now:

$ influx -database goldenmean -execute 'select last(*) from "electrical.batteries.1.voltage"' && date +%s%N
name: electrical.batteries.1.voltage
time                last_value
----                ----------
1596038643098000000 11.56       <-- influx timestamp: Wed Jul 29 2020 09:04:03
1596038651677254700             <-- real epoch:       Wed Jul 29 2020 09:04:11

note i'm just getting started using signalk so not actually sure if this is a signalk-to-influxdb or signalk bug. not sure where update.timestamp is coming from, and if signalk, should it be encoding the Z to indicate UTC?

Batched InfluxDb writes

My hunch is that writing all values individually is way more inefficient than doing batched writes.

We could gather updates for a while and do a batch write to Influx periodically, with a configurable interval.

Info: Signal-to-influxdb - successful setup with influxdb v2 and Raspian Bullseye 64-bit

Dear Teppo,
just wanted to let you know that I have my whole SignalK setup now running on Raspi 64-Bit and influxdb v2.
With influxdb v1 the CPU usage recently was ramping up to more than 90%; now the system is only using 16%, and that has been absolutely stable for 6 days.
I'm using the v1 API of influxdb v2, with no modification to the SignalK influxdb plugin.

Let me know if you would like to see the details of the influxdb setup.

Cheers
Peter
En route, SY Joy

What will be the measurement key in case of multiple values?

What will be the measurement key in case of multiple values, e.g. first and second engine temp?
The case is described thoroughly here: https://signalk.org/specification/1.5.0/doc/data_model_multiple_values.html

It is quite possible for a key value to come from more than one device. Many modern devices have a built in GPS receiver and as a result on any given boat there may be several sources of position, speed over ground, and heading. Multiple depth sounders are also common, often installed to port and starboard on monohull sailboats or in each hull of a catamaran.

SignalK enables subscription of this value by full path including source which looks like this:
navigation.courseOverGroundTrue.values[n2k./dev/ikommunicate.128267]

How will such data be saved in influxdb? Is this case somehow implemented?

Regards

Problem setting up plugin (crashing server)

pi@freeboard:~/signalk-server-node $ ./bin/signalk-server -s settings/freeboard-serial.json
TypeError: Cannot read property 'length' of undefined
    at Object.start (/home/pi/signalk-to-influxdb/index.js:113:39)
    at registerPlugin (/home/pi/signalk-server-node/lib/interfaces/plugins.js:115:12)
    at fs.readdirSync.filter.forEach.e (/home/pi/signalk-server-node/lib/interfaces/plugins.js:93:7)
    at Array.forEach (native)
    at startPlugins (/home/pi/signalk-server-node/lib/interfaces/plugins.js:85:68)
    at Object.start (/home/pi/signalk-server-node/lib/interfaces/plugins.js:26:7)
    at /home/pi/signalk-server-node/lib/index.js:196:30
    at /home/pi/signalk-server-node/node_modules/lodash/lodash.js:4944:15
    at Function.forIn (/home/pi/signalk-server-node/node_modules/lodash/lodash.js:12938:11)
    at startInterfaces (/home/pi/signalk-server-node/lib/index.js:188:5)

Running in Docker - No host available/Request timed out

I am unsure if this is an issue from running with docker or with the plugin. After the server has been running for a few minutes I end up with the following in the log:

Nov 05 22:26:24 signalk-to-influxdb:Error: No host available
Nov 05 22:26:34 signalk-to-influxdb:Error: Request timed out

Has anyone else come across this and a potential fix?

I have tried with docker versions:
1.35.1
v1.36.0-beta.3

I have the batch interval set to 10s. Watching Grafana load the data with auto refresh every 10s, the dashboard starts out refreshing every 10 seconds, then slowly gets delayed: it is still updating the data, but every refresh gets delayed by an extra 5-10 seconds until, after ~5 mins, the errors appear in the signalk debug log.

Separate influx/grafana paths when SK path has >1 source

Hi,

I've got a homemade magnetic compass next to my Raymarine Fluxgate one and I'd like to graph both for comparison. Both sources end up in SK, but they end up in one Grafana path.

Ideally I'd like to preserve both and have e.g. navigation.headingMagnetic_$src as well.

Write to more than one influx database

As suggested by paddyb on Slack on the 3rd of December, it would be a great benefit if the influx-plugin could be configured to write to more than one influx database.

Exemplary use cases:

  • Write to a local influx database as well as to a cloud-based instance.
  • Write to two separate, local influx databases. For example: the first database stores data at a high sampling rate with a retention policy that only keeps e.g. the last week or so, while the second database stores data at a low sample rate with an indefinite retention policy.

Installation fails in rpi

Hi,

I tried to install your signalk-to-influxdb plugin, but the installation fails. I'm not sure if I'm doing it right and both node and Signal K are relatively new to me.

pi@busterls:~/iot/signalk-server-node $ npm install signalk-to-influxdb
npm ERR! Linux 4.4.50+
npm ERR! argv "/opt/node-v6.10.2-linux-armv6l/bin/node" "/usr/bin/npm" "install" "signalk-to-influxdb"
npm ERR! node v6.10.2
npm ERR! npm  v3.10.10

npm ERR! Cannot convert undefined or null to object
npm ERR! 
npm ERR! If you need help, you may report this error at:
npm ERR!     <https://github.com/npm/npm/issues>

npm ERR! Please include the following file with any support request:
npm ERR!     /home/pi/iot/signalk-server-node/npm-debug.log

My environment:
Platform: Raspberry Pi Zero W
Raspbian: Linux busterls 4.4.50+ #970 Mon Feb 20 19:12:50 GMT 2017 armv6l GNU/Linux
Influxdb: https://github.com/hypriot/rpi-influxdb

Signal K has been updated:

pi@busterls:~/iot/signalk-server-node $ npm update
[email protected] /home/pi/iot/signalk-server-node
├── [email protected] 
├── [email protected]
├── [email protected]  (git://github.com/signalk/instrumentpanel.git#eb424cd95f254eccd86e09ce16d17de4b98bb521)
├── [email protected]  (git://github.com/signalk/maptracker.git#d2822bb7c51194d8d006878b850aa6464d29a4c4)
├── [email protected]  (git://github.com/tkurki/marinetrafficreporter.git#e9180fb95d34c804534445d2d97d6c7590cbf16c)
├── [email protected]  (git://github.com/signalk/nmea0183-signalk.git#f79fe6929d7941431b77f787592604a7c40233c6)
├── [email protected]  (git://github.com/signalk/sailgauge.git#7e5f73375fe684b4885a53f583136fd3e349e2fb)
├── [email protected]  (git://github.com/tkurki/set-system-time.git#b45b4c274d18880766abc72a2d5b158cbe434b39)
├── [email protected]  (git://github.com/signalk/signalk-to-nmea0183.git#45003a43a38775ae95a1a8f47da68633ef974963)
├── [email protected]  (git://github.com/signalk/signalk-zones.git#3195e96a93569df5a4f684a8604ee371b14a378d)
├── [email protected]  (git://github.com/signalk/simplegauges.git#76a1bb192da3ff04e10059a21f9902031dfbb81a)
└── [email protected] 

Npm-debug.log attached
npm-debug.zip

Include Depth value in Track data

Is it possible for the returned track data from ./self/track/ to have depth values included as "altitude" in the position data?

e.g. [6.3, 50.23, 12.5]

Failed to connect to localhost port 8086

Hi! Thanks for the great work! I have just installed your plug-in into Signal-K and have configured it as follows:

Host: localhost
Port: 8086
Database: (empty)

In terminal I get the following error when trying to create the database:

pi@openplotter:~ $ curl -X POST http://localhost:8086/query?q=CREATE+DATABASE+boatdata
curl: (7) Failed to connect to localhost port 8086: Connection refused

I have also installed Grafana and I am trying to configure the connection; in there I receive the error:
Network Error: Bad Gateway(502)

I think InfluxDB is not starting on that port; what could I do to fix this?

Multiple subscriptions

It would be useful to be able to subscribe to multiple data points with different rates.

This could also work flexibly in concert with #57 so that one could write to multiple databases (in one Influx instance or several) to produce output with different sampling rates.

Error: Bounds are not valid.

When starting the app with the leaflet chart, I get the error: Error: Bounds are not valid.
It loads the map correctly, also when zooming in on different regions.
Have I misconfigured something?

SignalK server: 1.22

Unit setting in SignalK

Hi Teppo, sorry to bother you here but maybe you can point me in the right direction. I have also raised this question in the openplotter forum. My problem is that I wish to display the signalK keys xxxxx.temperature and xxx.pressure in C and hPa rather than the standardized units. I have changed the private unit setting accordingly in the SignalK diagnostic options of openplotter but fail to change the unit settings now in InfluxDB/Grafana.
In the NMEA0183 generation everything works fine, probably as it's handled in the plugin rather than in SignalK.
Is there already a simple solution, or could you consider some options within SignalK to apply private unit settings before data are sent to InfluxDB?
Thanks- and stay healthy!
Freddie

Venus OS - signal k to influxdb2

I have Venus OS with the signal k to influxdb 2 plugin. I have created a bucket and API tokens, but no data is getting through?

Server log:
Oct 21 17:45:52 x [RequestTimedOutError]: Request timed out at ClientRequest. (/data/conf/signalk/node_modules/@influxdata/influxdb-client/dist/index.js:5:6447) at ClientRequest.emit (node:events:517:28) at Socket.emitRequestTimeout (node:_http_client:847:9) at Object.onceWrapper (node:events:631:28) at Socket.emit (node:events:529:35) at Socket._onTimeout (node:net:598:8) at listOnTimeout (node:internal/timers:569:17) at process.processTimers (node:internal/timers:512:7)

High CPU usage

Hi,

I have the following settings:

[screenshot: plugin settings]

And it has been working perfectly for weeks, however there is now high CPU load. One of the cores is running over 100%.
So, I've been doing some checking.

DB sizes:

[screenshot: database sizes]

And tried to disable monitoring in influxdb.conf:

[screenshot: influxdb.conf monitoring settings]

This did not fix the high CPU issue. I am thinking of applying a data retention period so nothing older than 1 month gets stored, but I don't know how to do that. Someone in the slack channel said it could be done by using chronograf, but although it is installed and running on my openplotter I cannot access it on port 8889.

I was wondering if you might have some advice. The high CPU usage makes the system unusable.

Kind regards,

Jamos

Load SignalK log files into influx

Would be great to be able to load log files into influx. It would help with recovery from general and influx-specific crashes, and also allow uploading data from before the signalk-to-influxdb installation.

Error after disabling the plugin (trackhandling branch)

    at Error (native)
    at Database.prepare (/usr/local/src/signalk-to-influxdb.trackhandling/node_modules/sqlite3/lib/sqlite3.js:20:25)
    at Promise (/usr/local/src/signalk-to-influxdb.trackhandling/node_modules/sqlite/main.js:280:32)
    at Database.prepare (/usr/local/src/signalk-to-influxdb.trackhandling/node_modules/sqlite/main.js:279:12)
    at trackIdP.then.theTrackId (/usr/local/src/signalk-to-influxdb.trackhandling/trackdb.js:76:24)
    at process._tickCallback (internal/process/next_tick.js:103:7) errno: 21, code: 'SQLITE_MISUSE' }

Split navigation.position geojson into separate measurements

Hi Teppo

Many thanks for this- I'm having a lot of fun with Signalk/Influx/Grafana

I was wondering if it is possible to split the navigation.position geojson array into separate longitude and latitude measurements or columns?
Reason being: some of the Grafana panels require separate long/lat values.

I've had a look on the Influx/Grafana end & can't see an obvious way to do this (possibly a flux script?)

i.e. something like:
"name": "navigation.position",
"columns": [
"time",
"context",
"longitude",
"latitude",
"source"
],
"values": [
[
"2020-02-22T01:01:26.961Z",
"vessels.urn:mrn:imo:mmsi:503063560",
"151.3048064",
"-33.65744",
"OPcan.1"
]

thanks

Andrew

Cannot write to telegraf

I could have made a mistake on my end, but after losing my SD card on my pi running openplotter, I reinstalled to the latest version. I can use signalk-to-influxdb to write directly to my influxdb instances; however, I had telegraf set up to buffer the data. I am getting these errors now; I tried downgrading influx and telegraf to no avail.

Error:
May 02 21:37:54 signalk-to-influxdb:Error: node-influx expected the results length to equal 1, but it was 0. Please report this here: https://git.io/influx-err
May 02 21:37:54 Error: node-influx expected the results length to equal 1, but it was 0. Please report this here: https://git.io/influx-err at Object.parseSingle (/home/pi/.signalk/node_modules/influx/lib/src/results.js:128:15) at /home/pi/.signalk/node_modules/influx/lib/src/index.js:303:38 at runMicrotasks () at processTicksAndRejections (node:internal/process/task_queues:96:5)

Add Can Name as a Tag to the Influx Series, not just Source

Problem:
Currently InfluxDB series are created with the Source to uniquely separate series; however, source can, and does, change between network reboots, especially if there are other changes in the network.

navigation.position,context=vessels.urn:mrn:imo:mmsi:000000000,source=can0.115
navigation.position,context=vessels.urn:mrn:imo:mmsi:000000000,source=can0.43

This can cause a horrible mish-mash of series and knock-on effects, especially in dashboards.

The issue shows up most with navigation.position as there are often multiple sources of this on the boat, and more and more things integrate a GPS.

Proposed Solution:

Replace source with CanName, or add another unique identifier to the tag list in Influx, so that series can be uniquely identified regardless of network address. This is required when displays, dashboards etc. need to filter and display a specific value from a specific sensor, without having to edit dashboards if the Source changes.

Add source support

Plugin should support distinguishing data on $source, not just the SK path.
