Actions
- API Service Actions
- Bundles Automation-Side Actions
- Bundles Design-Side Actions
- Connection Actions
- Continuous Activity Actions
- Dashboard Actions
- Data Collection Actions
- Data Quality Actions
  - Compute Rules on Specific Partition
  - Create Data Quality Rules Configuration
  - Delete Rule
  - Get Data Quality Project Current Status
  - Get Data Quality Project Timeline
  - Get Data Quality Rules Configuration
  - Get Dataset Current Status
  - Get Dataset Current Status per Partition
  - Get Last Outcome on Specific Partition
  - Get Last Rule Results
  - Get Rule History
  - Update Rule Configuration
- Dataset Actions
  - Compute Metrics
  - Create Dataset
  - Create Managed Dataset
  - Delete Data
  - Delete Dataset
  - Execute Tables Import
  - Get Column Lineage
  - Get Data
  - Get Data - Alternative Version
  - Get Dataset Settings
  - Get Full Info
  - Get Last Metric Values
  - Get Metadata
  - Get Schema
  - Get Single Metric History
  - List Datasets
  - List Partitions
  - List Tables
  - List Tables Schemas
  - Prepare Tables Import
  - Run Checks
  - Set Metadata
  - Set Schema
  - Synchronize Hive Metastore
  - Update Dataset Settings
  - Update From Hive Metastore
- Dataset Statistic Actions
- Discussion Actions
- DSS Administration Actions
Overview
This node integrates with the Dataiku DSS API, enabling a wide range of operations on Dataiku DSS resources. For the Dataset resource, the Prepare Tables Import operation prepares the import of selected tables (from SQL or Hive connections) into a dataset within a specified project.
This operation is useful for staging tables before importing them into Dataiku DSS datasets as part of a data ingestion workflow. For example, if you have external SQL tables that need to be imported and processed in Dataiku DSS, this operation lets you list and prepare those tables before executing the actual import (see Execute Tables Import). A hedged sketch of the kind of API call the node performs is shown below.
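To make the request/response flow concrete, here is a minimal sketch of how such a call could be made directly against the Dataiku DSS REST API. The endpoint path, authentication scheme, server URL, project key, and payload shape are assumptions for illustration, not the node's confirmed implementation; the node performs the equivalent call for you once credentials are configured.

```typescript
// Illustrative sketch only: the endpoint path, auth scheme, and payload shape are
// assumptions about how the Dataiku DSS API is called; check the Dataiku DSS API
// documentation for the authoritative contract.
const baseUrl = 'https://dss.example.com';     // hypothetical DSS server URL
const apiKey = process.env.DSS_API_KEY ?? '';  // hypothetical credential source
const projectKey = 'MY_PROJECT';               // target project key

async function prepareTablesImport(requestBody: object): Promise<unknown> {
  // Dataiku DSS API calls are commonly authenticated with HTTP Basic auth using the
  // API key as username; verify this against your DSS version.
  const auth = Buffer.from(`${apiKey}:`).toString('base64');
  const response = await fetch(
    `${baseUrl}/public/api/projects/${projectKey}/datasets/tables-import/actions/prepare-from-keys`,
    {
      method: 'POST',
      headers: {
        'Authorization': `Basic ${auth}`,
        'Content-Type': 'application/json',
      },
      body: JSON.stringify(requestBody),
    },
  );
  if (!response.ok) {
    throw new Error(`Prepare Tables Import failed: ${response.status} ${response.statusText}`);
  }
  return response.json();
}
```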
Properties
| Name | Meaning |
|---|---|
| Project Key | The key identifier of the Dataiku DSS project where the dataset resides or will be used. |
| Request Body | JSON object containing the parameters required by the "Prepare Tables Import" API endpoint. |
- Project Key: Specifies the target project in Dataiku DSS.
- Request Body: Contains the details for preparing the table import, such as the keys identifying which tables to prepare. It must conform to the JSON structure expected by the Dataiku DSS API for this operation (see the illustrative example below).
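The following is a purely illustrative Request Body. The field names (`tableKeys`, `connectionName`, `schema`, `table`) are assumptions and may not match the actual Dataiku DSS schema; consult the API documentation for the authoritative structure.

```typescript
// Hypothetical Request Body: every field name here is an illustrative assumption,
// not a confirmed part of the Dataiku DSS "Prepare Tables Import" contract.
const requestBody = {
  tableKeys: [
    {
      connectionName: 'my_sql_connection', // name of the SQL or Hive connection in DSS
      schema: 'analytics',                 // source schema containing the table
      table: 'orders',                     // table to prepare for import
    },
  ],
};
```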
Output
The output is a JSON array in which each element corresponds to the response from the Dataiku DSS API call for the Prepare Tables Import action.
- The `json` field contains the parsed JSON response from the API, which typically includes information about the prepared tables, their schemas, or any metadata returned by the preparation step (an illustrative item is sketched below).
- There is no binary output for this operation.
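As an orientation only, an output item might look like the following. Only the n8n `json` wrapper is certain; the contents depend entirely on what the Dataiku DSS API returns, and the fields shown are hypothetical.

```typescript
// Hypothetical output item: everything inside `json` is an illustrative guess at what
// a prepare-tables-import response might contain.
const outputItem = {
  json: {
    tables: [
      {
        name: 'orders',
        schema: { columns: [{ name: 'order_id', type: 'bigint' }] },
      },
    ],
  },
};
```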
Dependencies
- Requires an active connection to a Dataiku DSS instance.
- Requires valid API credentials (an API key token) for authentication with the Dataiku DSS API.
- The node expects the Dataiku DSS server URL and user API key to be configured in the credentials (see the sketch of the expected credential fields below).
- No additional environment variables are needed beyond the API credentials.
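Conceptually, the credential this node consumes can be thought of as the following shape. The interface and property names are assumptions for illustration; the actual definition lives in the node's credentials file.

```typescript
// Hypothetical shape of the Dataiku DSS API credential used by this node.
// The interface and property names are illustrative assumptions.
interface DataikuDssApiCredentials {
  url: string;    // base URL of the Dataiku DSS instance, e.g. https://dss.example.com
  apiKey: string; // API key used to authenticate calls to the DSS API
}
```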
Troubleshooting
- Missing Credentials Error: If the node throws an error about missing credentials, ensure that the Dataiku DSS API credentials are properly set up in n8n.
- Project Key Required: The operation requires a valid project key; omitting it will cause an error.
- Invalid Request Body: The request body must be a valid JSON string matching the API's expected format; malformed JSON or incorrect parameters will result in API errors (see the validation sketch after this list).
- API Endpoint Errors: Errors returned from the Dataiku DSS API (e.g., 4xx or 5xx HTTP status codes) will be surfaced as node errors. Check the API documentation and logs for detailed error messages.
- Network Issues: Ensure that the n8n instance can reach the Dataiku DSS server URL and that there are no firewall or network restrictions.
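If malformed JSON is a recurring problem, a small pre-check such as the following can catch it before the node sends the request. This is a sketch only; the raw string would typically come from an upstream node or expression, and the variable names are hypothetical.

```typescript
// Minimal sketch for pre-validating the Request Body, for example in an n8n Code node
// placed before this one. The `raw` value and error wording are illustrative.
const raw = '{"tableKeys": [{"connectionName": "my_sql_connection", "schema": "analytics", "table": "orders"}]}';

let requestBody: Record<string, unknown>;
try {
  const parsed: unknown = JSON.parse(raw); // throws on malformed JSON
  if (typeof parsed !== 'object' || parsed === null || Array.isArray(parsed)) {
    throw new Error('Request Body must be a JSON object');
  }
  requestBody = parsed as Record<string, unknown>;
} catch (error) {
  throw new Error(`Invalid Request Body: ${(error as Error).message}`);
}
```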
Links and References
- Dataiku DSS API Documentation - Datasets
- Dataiku DSS API - Prepare Tables Import
- n8n Documentation - Creating Custom Nodes
This summary focuses on the Dataset resource's Prepare Tables Import operation and is based on static analysis of the node's source code and property definitions.