Quick Start Guide
Phase 1: Platform Configuration
Note: For this guide, we will manually set up the structure in the web interface to get started quickly. However, for advanced or scaled integrations, all the steps below (creating groups, defining data sources, and publishing) can be performed programmatically via the API endpoints (e.g., /api/input/survey/create).
Log in to the BoundaryAI platform.
Create a Feedback Group
This serves as your primary container (for example, “Customer Support Logs”).
Create an Endpoint (Data Source)
Within the Feedback Group, create a new endpoint (for example, “Live Chat Logs”).
Click Start Collecting Feedback → Connectors → API Endpoint.
Define the expected payload format for your data.
Save all of the information shown, as you will need it to send data correctly.
Enable Ongoing Collection (Optional)
Locate the newly created Endpoint/Data Source.
Click the three-dot menu (…).
Open Advanced Options.
Toggle Ongoing feedback collection to ON.
Why this matters: Enabling ongoing collection tells BoundaryAI to treat incoming data as a continuous stream rather than a one-time upload. This is required for longitudinal analysis, trend detection, and evolution tracking over time.
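As the note at the top of this guide mentions, the manual steps above can also be performed programmatically via the API. Below is a minimal sketch (Python standard library only) that builds a request against the /api/input/survey/create endpoint named earlier; the payload field ("name") and the Bearer auth scheme are assumptions for illustration, not documented values:

```python
import json
import urllib.request

API_KEY = "inpk_your_key_here"  # created in Phase 2
BASE_URL = "https://app.boundary-ai.com"

def create_survey(name: str) -> urllib.request.Request:
    """Build a POST request to create a data source programmatically.

    The payload schema here ("name") is an assumption; check the API
    reference for the exact fields the endpoint expects.
    """
    body = json.dumps({"name": name}).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/api/input/survey/create",
        data=body,
        headers={
            "Authorization": f"Bearer {API_KEY}",  # auth scheme assumed
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = create_survey("Live Chat Logs")
# To actually send it: urllib.request.urlopen(req)
```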
Phase 2: Authentication
Navigate to the Integrations tab in the sidebar.
Select API.
Click Create API Key.
Select a permission level: Push (sufficient for sending data) or All.
Copy the key immediately (it starts with inpk_). This key will not be shown again.
Phase 3: Integration (Sending Data)
To send data, you need three identifiers: survey_series_id (the Group), survey_id (the Source), and question_id (the specific field for your text and data). These IDs were provided when you created the endpoint, but you can also fetch them with a GET request.
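If you did not save the IDs at creation time, you can retrieve them via GET. The guide does not give the exact path, so the URL below is a hypothetical placeholder; only the authentication pattern carries over:

```python
import urllib.request

API_KEY = "inpk_your_key_here"  # created in Phase 2

# NOTE: this path is a hypothetical placeholder -- the guide only says
# the IDs can be fetched via GET; check the API reference for the real one.
IDS_URL = "https://app.boundary-ai.com/api/input/survey/list"

def build_ids_request() -> urllib.request.Request:
    """Build a GET request for the group/source/question identifiers."""
    return urllib.request.Request(
        IDS_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},  # auth scheme assumed
    )

req = build_ids_request()
# To actually send it: urllib.request.urlopen(req)
```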
Push Data
Now that you have your IDs (e.g., Group 101, Survey 505, Question 888), use the push endpoint to send data.
Endpoint: POST https://app.boundary-ai.com/api/input/content/push
Python Example:
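A sketch of the push call using only the Python standard library. The three identifiers and the Idempotency-Key header come from this guide; the nesting of the payload ("items"/"answers") and the Bearer auth scheme are assumptions, so verify them against the API reference:

```python
import json
import urllib.request
import uuid

API_KEY = "inpk_your_key_here"  # created in Phase 2
PUSH_URL = "https://app.boundary-ai.com/api/input/content/push"

def build_push_request(comments):
    """Build the POST request for a batch of feedback comments.

    Only survey_series_id, survey_id, and question_id are documented in
    this guide; the surrounding "items"/"answers" structure is illustrative.
    """
    payload = {
        "survey_series_id": 101,  # Group
        "survey_id": 505,         # Source
        "items": [
            {"answers": [{"question_id": 888, "text": text}]}
            for text in comments
        ],
    }
    return urllib.request.Request(
        PUSH_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",  # auth scheme assumed
            "Content-Type": "application/json",
            "Idempotency-Key": str(uuid.uuid4()),  # see Best Practices below
        },
        method="POST",
    )

req = build_push_request(["Great support!", "Chat took too long."])
# To actually send it: urllib.request.urlopen(req)
```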
Curl Example:
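An equivalent curl sketch. As above, the payload structure and the Bearer auth scheme are assumptions for illustration; substitute your own key and IDs:

```shell
curl -X POST "https://app.boundary-ai.com/api/input/content/push" \
  -H "Authorization: Bearer inpk_your_key_here" \
  -H "Content-Type: application/json" \
  -H "Idempotency-Key: $(uuidgen)" \
  -d '{
    "survey_series_id": 101,
    "survey_id": 505,
    "items": [
      {"answers": [{"question_id": 888, "text": "Great support!"}]}
    ]
  }'
```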
Response (JSON Snippet):
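A successful push might return something like the following; the field names here are purely illustrative assumptions, so check the response you actually receive:

```json
{
  "status": "ok",
  "items_received": 1
}
```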
Summary of Best Practices
Batching: You can send up to 10,000 items per request. Buffer your user comments and send them in batches rather than one API call per comment.
Idempotency: Always include an Idempotency-Key header. If a network error occurs and you retry the request, the API uses this key to recognize the batch and prevent duplicate data entry.
Rate Limits: The default limit is 60 requests/minute per key; the maximum is 300/minute.
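The batching and idempotency advice above can be sketched as a small helper. The 10,000-item limit comes from this guide; pairing each batch with a UUID is one reasonable way to generate keys (reuse the same key when retrying a failed batch so the API can deduplicate):

```python
import uuid

MAX_BATCH = 10_000  # per-request item limit stated above

def make_batches(comments, size=MAX_BATCH):
    """Split comments into (idempotency_key, batch) pairs.

    Each batch gets a fresh UUID key; on a retry of the *same* batch,
    send the *same* key so the API recognizes it and skips duplicates.
    """
    return [
        (str(uuid.uuid4()), comments[i:i + size])
        for i in range(0, len(comments), size)
    ]

batches = make_batches(list(range(25_000)))
# 25,000 items -> 3 batches of 10,000 + 10,000 + 5,000
```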