Send data
Send (ingest), transport, and fetch data from different sources, such as relational databases, web logs, batch data, app logs, and real-time streams, for later use with the Axiom API.
You can also collect, load, group, and move data from one or more sources to Axiom, where it can be stored and further analyzed.
Before ingesting data, generate an API token from the Settings > Tokens page in the Axiom dashboard. See the API tokens documentation for more details.
Once you have an API token, there are different ways to get your data into Axiom:
- Using the Ingest API
- Using a data shipper (Logstash, Filebeat, Metricbeat, Fluentd, etc.)
- Using the Elasticsearch Bulk API, which Axiom supports natively
- Using one of the apps we support
- Using endpoints
Ingest method
Select the method to ingest your data. Each ingest method follows a particular path.
Client libraries
Library extensions
Next.js
The official Next.js library for Axiom.
Tracing
The official Rust tracing layer for Axiom.
Winston
The official Axiom transport for the Winston logger.
Pino
The official Axiom transport for the Pino logger.
Logrus
Axiom Go adapter for sirupsen/logrus.
Apex
Axiom Go adapter for apex/log.
Zap
Axiom Go adapter for uber-go/zap.
Python logging
Python logging helper for Axiom.
Go OTEL
The otel package provides helpers for using OpenTelemetry with Axiom.
Integrations
Vercel
Connect Axiom with Vercel to get the deepest observability experience for your Vercel projects.
Netlify
Connect site traffic logs, function logs, and edge function logs from Netlify’s CDN to Axiom.
Vector
Deliver log events to Axiom using the Axiom sink in Vector.
AWS Lambda
Ingest logs and platform events from your Lambda functions.
Cloudflare Workers
Send logs from Cloudflare Workers to Axiom.
Ingest API
Axiom exposes a simple REST API that accepts any of the following formats:
Ingest using JSON
application/json - a single event or a JSON array of events
Example
curl -X 'POST' 'https://api.axiom.co/v1/datasets/$DATASET_NAME/ingest' \
-H 'Authorization: Bearer $API_TOKEN' \
-H 'Content-Type: application/json' \
-d '[
{
"_time":"2021-02-04T03:11:23.222Z",
"data":{"key1":"value1","key2":"value2"}
},
{
"data":{"key3":"value3"},
"attributes":{"key4":"value4"}
},
{
"tags": {
"server": "aws",
"source": "wordpress"
}
}
]'
Ingest using NDJSON
application/x-ndjson - multiple JSON objects, each on a separate line
Example
curl -X 'POST' 'https://api.axiom.co/v1/datasets/$DATASET_NAME/ingest' \
-H 'Authorization: Bearer $API_TOKEN' \
-H 'Content-Type: application/x-ndjson' \
-d '{"id":1,"name":"machala"}
{"id":2,"name":"axiom"}
{"id":3,"name":"apl"}
{"index": {"_index": "products"}}
{"timestamp": "2016-06-06T12:00:00+02:00", "attributes": {"key1": "value1","key2": "value2"}}
{"queryString": "count()"}'
Ingest using CSV
text/csv - must include a header line with field names separated by commas
Example
curl -X 'POST' 'https://api.axiom.co/v1/datasets/$DATASET_NAME/ingest' \
-H 'Authorization: Bearer $API_TOKEN' \
-H 'Content-Type: text/csv' \
-d 'user,name
foo,bar'
Data Shippers
Configure, collect, and send logs to your Axiom deployment using a variety of data shippers. Data shippers are lightweight agents that collect logs and metrics, enabling you to ship data directly into Axiom.
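For example, a minimal setup for Vector (one of the supported shippers listed below) might look like the following sketch, which uses Vector's axiom sink. The source, file path, dataset name, and token are placeholders; depending on your deployment you may also need options such as org_id or url, so check Vector's documentation for the full set.
cat > vector.toml <<'EOF'
# Tail application log files (placeholder source).
[sources.app_logs]
type = "file"
include = ["/var/log/app/*.log"]

# Ship the collected events to an Axiom dataset.
[sinks.axiom]
type = "axiom"
inputs = ["app_logs"]
dataset = "$DATASET_NAME" # replace with your dataset name
token = "$API_TOKEN"      # replace with your API token
EOF
vector --config vector.toml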
Supported data shippers include AWS CloudFront, AWS CloudWatch Logs, Elastic Beats, Fluent Bit, Fluentd, Heroku Log Drains, Kubernetes, Logstash, Loki Multiplexer, Syslog Proxy, and Vector.
Apps
Send logs and metrics from Vercel, Netlify, and other supported apps.
Endpoints
Endpoints enable you to integrate Axiom into your existing data flow using the tools and libraries you already know. You can create an endpoint for services like Honeycomb, Jaeger, Grafana Loki, or Splunk and send logs from those services directly into Axiom.
Get started with endpoints here.
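As a rough sketch, suppose $ENDPOINT_URL stands for the URL Axiom generates when you create an endpoint, and your payload matches the format the chosen service expects. Sending data is then a matter of pointing your existing tool, or a plain HTTP request, at that URL:
curl -X 'POST' "$ENDPOINT_URL" \
-H 'Content-Type: application/json' \
-d '{"message": "hello from an existing pipeline"}'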
Limits and requirements
Axiom applies certain limits and requirements to ingested data to guarantee good service across the platform. Some of these limits depend on your pricing plan, and some are applied system-wide. For more information, see Limits and requirements.
The most important field requirement concerns the timestamp.
All events stored in Axiom must have a _time timestamp field. If the data you ingest doesn't have a _time field, Axiom assigns the time of ingest to the events. To specify the timestamp yourself, include a _time field in the ingested data.
If you include the _time field, ensure it contains timestamps in a valid time format. Axiom accepts many date strings and timestamps without knowing the format in advance, including Unix Epoch, RFC3339, and ISO 8601.
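For example, building on the JSON ingest request above, the first event below carries an explicit RFC3339 _time value, while the second omits it, so Axiom assigns the ingest time:
curl -X 'POST' 'https://api.axiom.co/v1/datasets/$DATASET_NAME/ingest' \
-H 'Authorization: Bearer $API_TOKEN' \
-H 'Content-Type: application/json' \
-d '[
{
"_time":"2021-02-04T03:11:23.222Z",
"data":{"status":"ok"}
},
{
"data":{"status":"ok"}
}
]'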