Integrations
Integrations connect TDMP with external data sources, destinations, and tools to extend your testing workflows.
Where to find Integrations
Click Integrations in the main navigation menu. The page displays all configured integrations and a catalog of available providers.
What are integrations?
Integrations allow TDMP to:
- Store datasets externally: Save generated data to cloud storage (AWS S3, MinIO, Google Cloud Storage)
- Export to databases: Write test data directly to PostgreSQL, Snowflake, MySQL, or BigQuery
- Send notifications: Alert your team via Slack or custom webhooks when datasets complete or fail
- Stream events: Publish dataset events to Kafka topics for downstream systems
- Create tickets: Automatically file Jira issues when schema validation fails
Integration categories
Storage: AWS S3, MinIO, Google Cloud Storage, Azure Blob Storage
Database: PostgreSQL, MySQL, Snowflake, BigQuery
Messaging: Kafka, RabbitMQ
Notifications: Slack
DevOps: Jira
Other: Custom webhooks
Managing integrations
Your configured integrations
The top section shows integrations you've already set up:
Status indicators:
- Connected (green): Integration is working and tested successfully
- Not tested (gray): Integration configured but not verified
- Error (red): Connection test failed
Available actions:
- Edit: Update configuration details (credentials, URLs, settings)
- Test: Verify the connection works
- Remove: Delete the integration
Provider catalog
Below your configured integrations is a catalog of all available providers. Each card shows:
- Provider name and category
- Brief description
- Configure button to set up a new integration
Filter the catalog:
- Use the search box to find specific providers
- Use the category dropdown to view only Storage, Database, Notifications, etc.
Setting up an integration
Step-by-step
- Find the provider in the catalog
- Click Configure
- Enter a display name (e.g., "S3 Production", "Slack QA Alerts")
- Fill in required configuration fields:
- Storage providers: Bucket name, region, access credentials
- Database providers: Host, port, database name, credentials
- Slack: Webhook URL, channel name
- Webhooks: URL, HTTP method, optional secret
- Click Save
- Test the connection to verify it works
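The required fields listed above vary by provider category. As a rough sketch of that idea, the checker below flags missing keys in a draft configuration; the field names are illustrative, not the exact keys TDMP uses.

```python
# Per-category required fields, based on the list above.
# Key names are illustrative assumptions, not TDMP's actual schema.
REQUIRED_FIELDS = {
    "storage": {"bucket", "region", "access_key_id", "secret_access_key"},
    "database": {"host", "port", "database", "username", "password"},
    "slack": {"webhook_url", "channel"},
    "webhook": {"url", "method"},
}

def missing_fields(provider_type: str, config: dict) -> set:
    """Return the required keys absent from a draft configuration."""
    return REQUIRED_FIELDS[provider_type] - config.keys()

# A draft Slack config that forgot its channel:
draft = {"webhook_url": "https://hooks.slack.com/services/T000/B000/XXXX"}
print(missing_fields("slack", draft))  # {'channel'}
```

Running a check like this before clicking Save mirrors what the built-in connection test catches later, but earlier in the process.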
Authentication types
Different providers use different authentication methods:
API key: Simple token-based authentication (Slack, some webhooks)
Basic auth: Username and password (Jira, some databases)
AWS credentials: Access key ID and secret access key (S3)
GCP credentials: Service account JSON file (Google Cloud Storage, BigQuery)
OAuth: Browser-based authentication flow (some providers)
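GCP credentials are supplied as a service-account JSON file, which has a well-known top-level structure. A quick sanity check before pasting the file into a configuration field might look like this (it verifies the standard fields exist; it does not validate the key itself):

```python
import json

def looks_like_service_account(raw: str) -> bool:
    """Loose structural check for a GCP service-account key file.
    Confirms the standard top-level fields only; a passing result
    does not mean the key is valid or active."""
    try:
        data = json.loads(raw)
    except json.JSONDecodeError:
        return False
    return data.get("type") == "service_account" and {
        "project_id", "private_key", "client_email"
    } <= data.keys()
```

A truncated copy-paste of the JSON file is a common cause of "Error" status on GCS and BigQuery integrations, and fails this check immediately.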
Configuration tips
Give descriptive names: Use names like "S3 Staging Environment" or "Slack QA Team" instead of generic names.
Test before using: Always click Test after configuration to catch errors early.
Update credentials when they change: Edit integrations when rotating keys or passwords.
Remove unused integrations: Keep your list clean by deleting integrations you no longer use.
Using integrations
Once configured, integrations become available throughout your workflows:
- Generating datasets: Select a storage integration as the destination
- Exporting data: Choose a database integration to write test data directly
- Receiving notifications: Slack and webhook integrations trigger automatically on dataset events
Common integration scenarios
Storing datasets in S3
- Configure an AWS S3 integration with your bucket and credentials
- When generating a dataset, select the S3 integration as the storage destination
- Datasets are uploaded to your bucket automatically upon completion
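Conceptually, the upload at the end of this flow is a standard S3 object write under a predictable key. The sketch below assumes an illustrative key layout (prefix/date/name/run); it is not TDMP's actual naming convention.

```python
from datetime import datetime, timezone

def dataset_key(prefix: str, dataset_name: str, run_id: str) -> str:
    """Build a predictable S3 object key for a completed dataset.
    The prefix/date/name/run.csv layout is an assumed convention."""
    date = datetime.now(timezone.utc).strftime("%Y-%m-%d")
    return f"{prefix}/{date}/{dataset_name}/{run_id}.csv"

key = dataset_key("tdmp-exports", "customers_masked", "run-0042")
# The upload itself would be an ordinary boto3 call, e.g.:
#   boto3.client("s3").upload_file(local_path, "my-bucket", key)
```

Date-partitioned keys like this make it easy to apply S3 lifecycle rules that expire old test datasets automatically.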
Getting Slack notifications
- Create a Slack webhook URL in your Slack workspace
- Configure a Slack integration with the webhook URL and channel
- Receive automatic notifications when datasets complete or fail
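Slack incoming webhooks accept a JSON body whose minimal form is a single "text" field. The message wording below is illustrative, not the exact notification TDMP sends:

```python
import json

def slack_payload(dataset: str, status: str, rows: int) -> str:
    """JSON body for a Slack incoming webhook. Slack's minimal
    payload is just {"text": ...}; the wording here is an example."""
    emoji = ":white_check_mark:" if status == "completed" else ":x:"
    return json.dumps({
        "text": f"{emoji} Dataset *{dataset}* {status} ({rows:,} rows)"
    })

body = slack_payload("customers_masked", "completed", 50000)
# Delivered with a plain HTTP POST, e.g.:
#   requests.post(webhook_url, data=body,
#                 headers={"Content-Type": "application/json"})
```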
Exporting to Snowflake
- Configure a Snowflake integration with your account, warehouse, and credentials
- When generating a dataset, choose to export directly to Snowflake
- Test data is written to your specified database and schema
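The export target in Snowflake is addressed as database.schema.table. Since Snowflake folds unquoted identifiers to upper case, a helper like this (an illustrative sketch, not TDMP code) normalizes the target name:

```python
def qualified_table(database: str, schema: str, table: str) -> str:
    """Fully qualified Snowflake table name. Snowflake resolves
    unquoted identifiers in upper case, so we upper-case here."""
    return ".".join(part.upper() for part in (database, schema, table))

target = qualified_table("test_data", "qa", "customers")
# → 'TEST_DATA.QA.CUSTOMERS'
# The write itself would go through snowflake-connector-python,
# using a connection opened with snowflake.connector.connect().
```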
Triggering webhooks on events
- Configure a webhook integration with your endpoint URL
- Choose which events trigger the webhook (dataset.completed, dataset.failed, etc.)
- Your endpoint receives POST requests with event details
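The optional secret mentioned in webhook configuration is typically used to sign each delivery so your endpoint can verify authenticity. A common scheme is HMAC-SHA256 over the raw request body, sketched below; the exact header name and scheme for your deployment may differ, so treat this as an assumption to verify against your webhook settings.

```python
import hashlib
import hmac
import json

def sign_payload(secret: str, body: bytes) -> str:
    """Hex-encoded HMAC-SHA256 signature over a raw webhook body.
    Illustrative scheme; confirm the real one for your deployment."""
    return hmac.new(secret.encode(), body, hashlib.sha256).hexdigest()

event = json.dumps({"event": "dataset.completed",
                    "dataset": "customers_masked"}).encode()
signature = sign_payload("my-shared-secret", event)
# Receivers recompute the HMAC over the raw body and compare with
# hmac.compare_digest() before trusting the event.
```

Always compare signatures with a constant-time function such as `hmac.compare_digest()` rather than `==`, to avoid timing side channels.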
Security notes
Credentials are encrypted: All API keys, passwords, and secrets are encrypted before being stored.
Test connections safely: Connection tests validate configuration without exposing sensitive data.
Tenant isolation: Integrations are private to your tenant and not shared across organizations.
Role requirements: Depending on your deployment, configuring integrations may require Admin or Manager role.
Troubleshooting
Connection test fails:
- Verify credentials are correct and not expired
- Check network connectivity (firewalls, VPNs)
- Confirm bucket names, database names, and URLs are spelled correctly
- Review error messages for specific hints
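Many failed connection tests trace back to a typo in the endpoint itself. A quick structural parse can catch the obvious cases before you start debugging credentials or firewalls; this is a generic sketch, not a TDMP feature:

```python
from urllib.parse import urlsplit

def obvious_problems(url: str) -> list:
    """Flag clearly malformed connection URLs. An empty result means
    the URL parses cleanly, not that the endpoint is reachable."""
    problems = []
    parts = urlsplit(url)
    if not parts.scheme:
        problems.append("missing scheme (e.g. postgresql://)")
    if not parts.hostname:
        problems.append("missing hostname")
    try:
        parts.port  # raises ValueError if non-numeric or out of range
    except ValueError:
        problems.append("invalid port")
    return problems

print(obvious_problems("postgresql://db.internal:5432/testdata"))  # []
```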
Integration not appearing in dropdown:
- Ensure the integration status is "Connected"
- Refresh the page
- Check that the integration is enabled
Notifications not arriving:
- Test the integration manually
- Verify webhook URLs and channels are correct
- Check Slack app permissions
Best practices
Separate environments: Create different integrations for development, staging, and production.
Use service accounts: For cloud providers, create dedicated service accounts with minimal permissions.
Monitor integration health: Regularly test integrations to catch credential expiration or configuration drift.
Document integration purposes: Use clear names and maintain a record of what each integration is used for.