The EMnify Data Streamer (EDS) allows you to receive real-time data streams with event and usage data of endpoints and SIMs. Data streams created using this feature can be ingested by any third party analytics application or can be pushed to pre-integrated cloud services.
1. Creating a Data Stream
To create a Data Stream, log in to an EUI account and navigate to the Technical Settings page by clicking on the "link" icon in the top-right navigation menu.
The Technical Settings page contains a panel dedicated to configuring and displaying Data Streams.
1. Click on the action button "Add Data Stream" and choose a stream type. The stream type can be any one of the following:
- Usage Data
- Event Data
- Usage and Event Data combined in a single stream
2. Choose an "API Type". This can be any one of the following:
- EMnify API
- EMnify API (Bulk-mode)
- Salesforce
- keen.io
- DataDog
- AWS S3
- AWS Kinesis
NOTE: Depending on the API type, API keys and configuration parameters may be required.
Details about the types of data streams available are covered in the Integrations section.
3. If past data is required, the checkbox "Stream historic data" should be checked.
The system will send historic data up to 20 days old. It may take time to catch up with live data if this option is enabled.
4. Click `Add Data Stream`
When created, new data streams are instantly active and you can monitor their status on the info panel on the *Technical Settings* page.
The column "Remote Status" displays the HTTP status code from the remote side of the stream.
This should be `200` (OK) under normal conditions.
If the configuration is invalid or the receiving component or server is down, it will show `5XX` errors.
2. Modifying a Data Stream
Pausing a data stream:
A data stream can be paused and resumed at any time by using the action buttons on the right of the stream info. This is useful for actions such as maintenance of the receiving server.
Removing a data stream:
To permanently remove a data stream, click the *remove* icon on the right.
You will be asked to confirm this action.
Parallel data streams:
It is possible to add up to 10 data streams which run in parallel.
This means that one stream may be connected to a network monitoring system, another to a data analytics platform, and a third can sync with S3 for archiving, all at the same time.
Filtering:
It's possible to add filters to a data stream so that the stream contains only certain event types or usage data types. This is covered in more detail in the section Filtering below.
3. Integrations
3.1. EMnify RestAPI
Customers who wish to use the RestAPI data stream must provide an API endpoint which consumes the data stream. If you choose to implement the EMnify API on your server, the EDS will post data in the form of JSON objects as events occur.
This is the most flexible method of processing a data stream as it allows any implementation of analytics, reporting or a pipeline of tools to process usage and event data.
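As a sketch of such an implementation (the port, handler name, and printed fields are illustrative assumptions, not part of the EMnify specification; the JSON keys match the event samples later in this document), a minimal receiver could look like:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def handle_record(body: bytes) -> int:
    """Parse one EDS JSON object and return its unique event id."""
    record = json.loads(body)
    print(record.get("timestamp"), record.get("description"))
    return record["id"]

class StreamHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        handle_record(self.rfile.read(length))
        self.send_response(200)  # a 200 response signals successful delivery
        self.end_headers()

# To run the receiver:
# HTTPServer(("", 8080), StreamHandler).serve_forever()
```

From here, `handle_record` can hand each event to whatever analytics or reporting pipeline you operate.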
3.1.1. EMnify API (Bulk-Mode)
If you select EMnify API in bulk mode, each HTTP POST will include an array of objects. The HTTP POSTs are sent at intervals; use this mode if the receiving system should process multiple events in bulk instead of individual events as they occur.
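A bulk-mode handler therefore expects a JSON array rather than a single object. A minimal parsing sketch (the function name is an assumption for illustration):

```python
import json

def handle_bulk(body: bytes) -> list:
    """Parse a bulk-mode EDS POST: the body is a JSON array of event objects."""
    records = json.loads(body)
    if not isinstance(records, list):
        raise ValueError("bulk-mode payload should be a JSON array")
    return [record["id"] for record in records]

handle_bulk(b'[{"id": 1}, {"id": 2}]')  # [1, 2]
```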
3.2. AWS Kinesis
Amazon Kinesis allows for collecting and processing large streams of data records in real time. Applications created on Kinesis can run on Amazon EC2 instances and typical uses are to send processed records to dashboards, generate alerts, dynamically change pricing or advertising strategies, or send data to other AWS services.
Note: Currently only region "eu-west-1" (EU Ireland) is supported by EDS.
3.3. AWS S3
Amazon S3 allows for storage of the raw event and usage data as it arrives from a data stream. To receive a data stream in S3, access keys from an AWS account are used in the EUI for authorization. Shortly after the stream is created, CSV files are uploaded to the S3 bucket; one for event data and one for usage data. The CSV files can then be sent to other Amazon services or consumed by a third-party analytics or BI tool for generating insights.
3.4. Salesforce
With the data streamer Salesforce configuration, it’s possible to stream your event data directly into a Salesforce account. This is done by setting up a connected app in Salesforce which provides a client id and client secret. These credentials are used in the EMnify User Interface to grant access to the Salesforce platform event system.
NOTE: Consider configuring this data streamer type for selected event types only as there are limits on the number of events supported. Location Updates or PDP Context events can be extremely frequent and are probably not the most typical to be processed in Salesforce. More information on this feature is detailed in the section Filtering below.
4. Configuration by Integration
4.1. AWS S3 Setup
1. In your AWS account, navigate to S3 and create a new bucket.
Data Streamer Add S3 bucket
Currently only region eu-west-1 (EU Ireland) is supported.
Bucket names should be DNS-compliant, see AWS docs: Rules for Bucket Naming.
2. In IAM, select or create a user which has access to this S3 bucket, ideally access limited only to this bucket.
a. Navigate to Security Credentials and select Create access key. Securely store the Access Key ID and Secret Access Key.
3. Log on to EUI and add an S3 Data Stream.
Data Streamer Add S3 bucket
Shortly after the stream is created, .csv files will be uploaded to the S3 bucket:
Data Streamer S3 bucket with data
Event Data
The event data filename follows the format events_YYYYMMDD_HHmmss.csv.
Usage Data
The usage data filename follows the format cdr_YYYYMMDD_HHmmss.csv.
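Both filename formats can be split mechanically into the stream type and the upload timestamp, which is useful when post-processing a bucket. A small sketch (the helper name is illustrative):

```python
from datetime import datetime

def parse_stream_filename(name: str):
    """Split an EDS CSV filename (events_YYYYMMDD_HHmmss.csv or
    cdr_YYYYMMDD_HHmmss.csv) into its stream type and timestamp."""
    stem = name.rsplit(".", 1)[0]
    kind, date_part, time_part = stem.split("_")
    timestamp = datetime.strptime(date_part + time_part, "%Y%m%d%H%M%S")
    return kind, timestamp

kind, ts = parse_stream_filename("events_20171026_074200.csv")
# kind == "events", ts == datetime(2017, 10, 26, 7, 42, 0)
```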
4.1.1. S3 Security Guidelines
Event data that is sent via Data Streams may include usernames, email addresses and other data which can identify users or platform resources. The generated .csv files should therefore be treated as containing sensitive information. Precautions should be taken to ensure that the event and usage data in the destination S3 buckets are adequately secured.
The following three steps should be considered as the minimum security requirements for storing such data in S3:
1. Ensure that the S3 bucket is not publicly accessible. This can be applied in the Permissions tab of the S3 bucket:
Blocking Public Access on the S3 Bucket
2. Server-Side Encryption can be enabled per-bucket and S3 will encrypt objects before they are saved to disk. The decryption is then performed when downloading the objects. This can be enabled in the Properties tab of the S3 bucket:
Enabling Server-Side Encryption
3. The IAM user whose credentials (access key ID and secret access key) are used to access the S3 bucket should have their permissions restricted to writing to the required bucket only. This can be done in the AWS console in Users → {Data Stream User} → Permissions → Add Permissions → Attach Policy Directly → Create Policy. An example JSON Policy would look like:
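For example, a write-only policy of this kind could look like the following (the bucket name `emnify-data-stream` is a placeholder; substitute your own bucket):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DataStreamWriteOnly",
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::emnify-data-stream/*"
    }
  ]
}
```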
AWS provide an online JSON Policy Generator which can be used to create a policy like the example given above.
4.2. AWS Kinesis Setup
1. Log on to your AWS account
2. Navigate to Kinesis and create one or two new streams (e.g. one for event data and one for usage data, depending on the stream type)
Currently only region eu-west-1 (EU Ireland) is supported.
3. Navigate to IAM and select or create a user which has access to this Kinesis stream, ideally with access limited to this stream only.
a. Navigate to Security Credentials and select Create access key. Securely store the Access Key ID and Secret Access Key.
4. Log on to EUI and add a Kinesis Data Stream:
Add Kinesis Data Stream
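Once records arrive, Kinesis consumers such as AWS Lambda receive each record with its payload base64-encoded in the record's data field. Assuming the EDS writes one JSON object per record (an assumption for this sketch), decoding looks like:

```python
import base64
import json

def decode_kinesis_record(record: dict) -> dict:
    """Decode one Kinesis record as delivered to e.g. a Lambda consumer:
    the payload arrives base64-encoded and contains a single JSON object."""
    return json.loads(base64.b64decode(record["Data"]))

# Round-trip demonstration with a synthetic record:
payload = base64.b64encode(json.dumps({"id": 1, "cost": 0.005}).encode()).decode()
decode_kinesis_record({"Data": payload})  # {'id': 1, 'cost': 0.005}
```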
4.3. Salesforce Setup
With the data streamer Salesforce configuration, it’s possible to stream your event data directly into a Salesforce account.
In Salesforce, the EDS uses Platform Events, which are intended for streaming data between external apps and Salesforce. More information about Platform Events and how to implement them can be found in the Salesforce Documentation - Platform Events.
Consider configuring this data streamer type for selected event types only, as there are limits on the number of events supported. Location Updates or PDP Context events can be extremely frequent and are probably not the most typical to be processed in Salesforce. Details of how to achieve this can be found in Filtering Data Streams.
To prepare your Salesforce system to receive the event stream, you need to perform the following steps:
- Set up a connected app in Salesforce, which will provide your client ID and client secret
- Obtain a refresh token via OAuth2 to grant the data streamer access to your Salesforce platform event stream
- Create a new platform event in your Salesforce account
Afterwards, add the following custom fields and assign them to your previously created platform event:
Salesforce fields required for Data Streamer Events
Now you can configure your Salesforce event stream in the EUI. Your platform event name will be reflected as the event stream name.
EDS S3 add stream
4.4. keen.io Setup
keen.io offers APIs for streaming, analyzing, and embedding rich data, and integrates with the EMnify Data Streamer. To integrate the EDS with keen.io, run your first queries, and build dashboards, the following steps are required:
1. Create a keen.io user account
2. Create a new project, e.g. EMnify Data Stream
3. In the project settings go to Access to find your project id and write key
Configuration for keen.io access
4. Log on to the EUI and add a keen.io stream using the Project ID and Write Key
EDS keen.io add stream
Shortly after the stream is created you will see the first data arriving, which you can explore on keen.io in the Streams tab. Depending on the stream type you selected, one or two streams will appear with the names EMnifyEventData and EMnifyUsageData.
Choose a data collection, e.g. EMnifyUsageData, and click on Last 10 Events Streamed to see the live event data coming in.
Streaming data arriving in keen.io
4.5. DataDog Setup
DataDog (https://www.datadoghq.com) provides real-time performance monitoring. In conjunction with the EDS, it allows you to collect and analyze metrics about the usage of your endpoints and SIM cards; you can create dashboards and trigger alerts in certain situations.
- Create a DataDog user account
- Navigate to Integrations → APIs from the left navbar
- Generate a new Application Key
- Log on to the EUI and add a DataDog stream
EDS DataDog add stream
Shortly after the stream is created, data will begin arriving. In the DataDog explorer you can check the incoming data and then start to create dashboards using the metrics endpoint.volume, endpoint.volume_tx, endpoint.volume_rx, and endpoint.cost.
5. Features
5.1. Filtering
One of the most powerful features of a data stream is the capability to apply filters to each individual stream. By default, no filters are added to a data stream and all events are streamed. Multiple filters can be applied to each stream and this creates more granular and targeted data for analysis.
The following screenshot shows filters applied to a data stream via the EMnify User Interface. The data stream that the filters are applied to will only contain Update Location and User Authentication failed events.
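If you post-process a stream instead of (or in addition to) configuring filters in the EUI, the same effect can be sketched client-side. The event structure follows the samples in the Event Data Reference; the function name and the filter set are illustrative assumptions:

```python
def apply_filters(events: list, allowed_types: set) -> list:
    """Client-side equivalent of stream filters: keep only events whose
    event_type description is in the allowed set."""
    return [e for e in events if e["event_type"]["description"] in allowed_types]

events = [
    {"event_type": {"description": "Update location"}},
    {"event_type": {"description": "SIM activation"}},
]
apply_filters(events, {"Update location"})  # keeps only the first event
```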
6. Event Data Reference
6.1. Generic Event Data
The following properties are included in all events (the property names correspond to the keys in the JSON event samples in Example Events):

Property | Format | Description |
---|---|---|
`id` | Numeric | A unique numerical identifier of this event. If multiple events with the same id are received (e.g. due to transmission errors), the receiver should treat them as duplicates |
`timestamp` | Timestamp | Date/time when this event happened |
`event_type` | Nested Object | Type of the event, see Event Types for details |
`event_severity` | Nested Object | Severity of the event, see Severity Levels for details |
`event_source` | Nested Object | Source of the event, see Event Source for details |
`organisation` | Nested Object | Organisation associated with this event, see Organisation Object for details |
`alert` | Boolean | Whether the event is a candidate to be alerted to a user |
`description` | String | Human-readable description of the event |
6.2. Additional Properties
Event types relating specifically to SIMs, endpoints and users include the following additional properties:
Property | Format | Description |
---|---|---|
`imsi` | Nested Object | Details of the IMSI, see IMSI Object for details (in case of a multi-IMSI configuration, multiple different IMSIs may be reported for the same SIM) |
`sim` | Nested Object | Details of the SIM, see SIM Object for details |
`endpoint` | Nested Object | Details of the Endpoint, see Endpoint Object for details |
6.3. Additional Details by Event Type
The data streamer will send additional data, when available, depending on the event type. This data is added as a nested object called detail and contains information on the actually used Mobile Network Operator and country. See Details Object for more information.
6.4. Details Object
Property | Format | Description |
---|---|---|
`id` | Numeric | Unique identifier of the actually used Mobile Network Operator |
`name` | String | Name of the Mobile Network Operator |
`country` | Nested Object | Country of the Mobile Network Operator |
`country.id` | Numeric | Unique identifier of the country |
`country.name` | String | Name of the country |
`country.country_code` | String | Country code |
`country.mcc` | String | Mobile Country Code (MCC) |
`country.iso_code` | String | ISO code |
`pdp_context` | Nested Object | PDP Context details, see PDP Context |
`volume` | Nested Object | Volume consumed in the PDP Context |
`volume.rx` | Number (up to 6 decimal places) | Downstream volume in MB |
`volume.tx` | Number (up to 6 decimal places) | Upstream volume in MB |
`volume.total` | Number (up to 6 decimal places) | Total volume in MB |
6.5. PDP Context
Property | Format | Description |
---|---|---|
 | String | Unique identifier of this PDP context |
 | Timestamp | Date/time when this PDP context was created |
 | String | GTP version (either 1 or 2) |
 | String | IP address of the GGSN/PGW control plane |
 | String | IP address of the GGSN/PGW data plane |
 | String | IP address of the SGSN/SGW control plane |
 | String | IP address of the SGSN/SGW data plane |
 | String | Region where the data plane is located |
 | String | IP address used for Internet breakout |
 | String | Access Point Name (APN) |
 | Integer | Network Service Access Point Identifier (NSAPI) |
 | String | IP address assigned to the endpoint |
 | String | IMEISV |
 | String | Mobile Country Code (MCC) |
 | String | Mobile Network Code (MNC) |
 | Integer | Location Area Code (LAC) |
 | Integer | Service Area Code (SAC) |
 | Integer | Routing Area Code (RAC) |
 | Integer | Cell Identification (CI) |
 | Integer | Radio Access Type (RAT), e.g. 1: 3G |
6.6. Organisation Object
Property | Format | Description |
---|---|---|
`id` | Numeric | Unique identifier of this organisation |
`name` | String | Name of the organisation |
6.7. User Object
Property | Format | Description |
---|---|---|
`id` | Numeric | Unique identifier of this user |
`username` | String | Username, e.g. email address |
`name` | String | Real name of the user |
6.8. IMSI Object
Property | Format | Description |
---|---|---|
`id` | Numeric | Unique identifier of this IMSI |
`imsi` | String | International Mobile Subscriber Identity (IMSI) |
`import_date` | Timestamp | Date/time this IMSI was provisioned |
6.9. SIM Object
Property | Format | Description |
---|---|---|
`id` | Numeric | Unique identifier of this SIM |
`iccid` | String | Integrated Circuit Card Identifier (ICCID) without checksum digit |
`msisdn` | String | MSISDN |
`production_date` | Timestamp | Date/time this SIM chip was produced |
6.10. Endpoint Object
Property | Format | Description |
---|---|---|
`id` | Numeric | Unique identifier of this Endpoint |
`name` | String | Configured name of this endpoint |
`ip_address` | String | IP address assigned to this Endpoint |
`tags` | String | Tags assigned to this Endpoint |
`imei` | String | International Mobile Equipment Identity (IMEI) |
6.11. Event Types
Id | Description |
---|---|
0 | Generic |
1 | Update location |
2 | Update GPRS location |
3 | Create PDP Context |
4 | Update PDP Context |
5 | Delete PDP Context |
6 | User authentication failed |
7 | Application authentication failed |
8 | SIM activation |
9 | SIM suspension |
10 | SIM deletion |
11 | Endpoint blocked |
12 | Organisation blocked |
13 | Support Access |
14 | Multi-factor Authentication |
15 | Purge Location |
16 | Purge GPRS location |
17 | Self-Signup |
18 | Threshold reached |
19 | Quota used up |
6.12. Severity Levels
Id | Description |
---|---|
0 | INFO |
1 | WARN |
2 | CRITICAL |
6.13. Event Source
Id | Description |
---|---|
0 | Network |
1 | Policy Control |
2 | API |
7. Usage Data Reference
7.1. Usage Data Properties
Property | Format | Description |
---|---|---|
`id` | Numeric | Unique identifier of this transaction |
`cost` | Number | Cost calculation of the reported traffic volume. May use up to 6 decimal places. |
`currency.id` | Numeric | Unique identifier of the currency of the indicated cost |
`currency.code` | ISO 4217 | Currency code |
`start_timestamp` | UTC Timestamp | Start time of traffic measurement |
`end_timestamp` | UTC Timestamp | End time of traffic measurement |
`volume.rx` | Number | Downstream traffic (MB) received by the endpoint. May use up to 6 decimal places. |
`volume.tx` | Number | Upstream traffic (MB) sent by the endpoint. May use up to 6 decimal places. |
`volume.total` | Number | Total traffic consumed. May use up to 6 decimal places. |
`imsi` | 15-digit numeric string | Currently used IMSI |
`endpoint.id` | Numeric | Unique identifier of the endpoint |
`endpoint.name` | String | The user-defined name set for this endpoint |
`endpoint.ip_address` | String | The IP address assigned to this endpoint |
`endpoint.tags` | String | User-defined tags (if any) set for this endpoint |
`endpoint.imei` | String | The IMEI of the endpoint hardware |
`sim.id` | Numeric | Unique identifier of the SIM |
`sim.iccid` | 19-digit numeric string | ICCID of the SIM |
`sim.msisdn` | Numeric | MSISDN of the associated SIM |
`sim.production_date` | Timestamp | The production date of the associated SIM |
`organisation.id` | Numeric | Unique identifier of the organisation |
`organisation.name` | String | Name of the organisation |
`operator.id` | Numeric | Unique identifier of the visited operator |
`operator.name` | String | Name of that mobile operator |
`operator.country.id` | Numeric | Unique identifier of the visited country |
`operator.country.name` | String | Name of the visited country |
 | Numeric | Mobile Network Code of the visited operator |
 | Numeric | Mobile Country Code of the visited operator |
`tariff.id` | Numeric | Unique identifier of the applied tariff |
`tariff.name` | String | Name of the tariff |
`tariff.ratezone.id` | Numeric | Unique identifier of the applied ratezone |
`tariff.ratezone.name` | String | Name of the ratezone |
`traffic_type.id` | Numeric | Unique identifier of the traffic type |
`traffic_type.name` | String | Name of the traffic type |
7.2. Usage Data JSON Sample
{
"cost": 0.00558275,
"id": 174321498,
"operator": {
"id": 4,
"name": "EPlus",
"country": {
"id": 74,
"name": "Germany"
}
},
"organisation": {
"id": 399921,
"name": "Test"
},
"tariff": {
"ratezone": {
"id": 83,
"name": "Europe_I"
},
"id": 64,
"name": "Global Pro I"
},
"traffic_type": {
"id": 5,
"name": "Data"
},
"endpoint": {
"id": 8392037
},
"imsi": "901430999910777",
"volume": {
"rx": 0.0138,
"tx": 0.008531,
"total": 0.022331
},
"start_timestamp": "2017-03-19 21:06:33",
"sim": {
"iccid": "8988303000000011085",
"id": 233746
},
"currency": {
"symbol": "€",
"code": "EUR",
"id": 1
},
"end_timestamp": "2017-03-19 21:21:23"
}
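Two relationships in the sample above can be verified programmatically: `volume.total` equals `volume.rx + volume.tx`, and dividing `cost` by the total volume yields the effective per-MB rate. A small check (Python, using only the fields shown in the sample):

```python
# Sanity checks on the usage record above: total volume is the sum of
# downstream (rx) and upstream (tx) volume, and cost / total volume
# gives the effective rate per MB in the record's currency.
record = {
    "cost": 0.00558275,
    "volume": {"rx": 0.0138, "tx": 0.008531, "total": 0.022331},
}
volume = record["volume"]
assert abs(volume["rx"] + volume["tx"] - volume["total"]) < 1e-9
effective_rate = record["cost"] / volume["total"]  # EUR per MB
```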
7.3. Example Events
This section shows complete example events as delivered by the data streamer.
User Authentication Failed
{
"id": 201388127,
"alert": false,
"description": "Failed authentication request from 'user@company.com', Reason: Invalid password from IP 9.9.9.9",
"timestamp": "2017-10-26T07:42:00.000+0000",
"event_type": {
"id": 6,
"description": "User authentication failed"
},
"event_source": {
"id": 2,
"description": "API"
},
"event_severity": {
"id": 1,
"description": "Warn"
},
"organisation": {
"id": 839921,
"name": "Demo Company"
},
"user": {
"id": 84993,
"username": "user@company.com",
"name": "Scott Tiger"
}
}
Update Location
{
"id": 201370709,
"alert": false,
"description": "New location received from VLR for IMSI='90143012345678912345', now attached to VLR='491720013095'.",
"timestamp": "2017-10-26T07:28:00.000+0000",
"event_type": {
"id": 1,
"description": "Update location"
},
"event_source": {
"id": 0,
"description": "Network"
},
"event_severity": {
"id": 0,
"description": "Info"
},
"organisation": {
"id": 839921,
"name": "Demo Company"
},
"endpoint": {
"id": 8638726,
"name": "GPS Tracker",
"ip_address": "100.96.234.249",
"tags": null,
"imei": "3577620833012201"
},
"imsi": {
"id": 205672,
"imsi": "90143012345678912345",
"import_date": "2016-12-27T10:09:23.000+0000"
},
"sim": {
"id": 274887,
"iccid": "8988303001234567890",
"production_date": "2016-12-27T10:09:23.000+0000"
},
"detail": {
"id": 3,
"name": "Vodafone",
"country": {
"id": 74,
"name": "Germany",
"country_code": "49",
"mcc": "262",
"iso_code": "de"
},
"tapcode": [{
"id": 2,
"tapcode": "DEUD2"
}],
"mnc": [{
"id": 3,
"mnc": "02"
}]
}
}