These release notes describe enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.
The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.
For On-Premises Connectors, administrators can obtain version information using the Administrator Connectors API. See Obtaining information about On-Premises Connectors for details. In addition, version information can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.
The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:
java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver
For the ODBC driver, see Driver version string for details on obtaining the driver version number.
Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.
After upgrading to build 4.6.1.2529 of the Hybrid Data Pipeline server, the Manage Configuration page in the Web UI did not load.
See Hybrid Data Pipeline known issues for details.
Log management capabilities have been enhanced. Administrators may now specify a centralized location for Hybrid Data Pipeline logs. In addition, administrators may set logging levels for system services, including the web UI, the data access service, the notification server, and the Apache Tomcat server. For details, refer to Log management.
For OData Version 4, Hybrid Data Pipeline now supports long binary and long character types up to 1 MB. Supported long binary types include BLOB and LONGVARBINARY. Supported long character types include CLOB, LONGNVARCHAR, LONGVARCHAR, and NCLOB. Column sizes for long binary types may be managed with the limits ODataBinaryColumnSizeLimit and ODataIncludeBinaryLongData. Column sizes for long character types may be managed with the limits ODataCharacterColumnSizeLimit and ODataIncludeCharacterLongData. Refer to the following documentation resources for details: Entity Data Model (EDM) types for OData Version 4, Manage Limits view, and Limits API.
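A limit such as ODataBinaryColumnSizeLimit would typically be set through the Limits API mentioned above. The sketch below shows one plausible shape of such a request; the endpoint path, host name, and payload field are illustrative assumptions, so consult the Limits API documentation for the actual schema.

```python
import json

# Sketch: setting an OData long-data limit via the Limits API.
# BASE_URL and the payload shape are assumptions for illustration only.
BASE_URL = "https://hdp.example.com/api/admin/limits"  # hypothetical host

def build_limit_request(limit_name, value):
    """Return the URL and JSON body for setting a system-level limit."""
    return f"{BASE_URL}/system/{limit_name}", json.dumps({"value": value})

# Cap long binary columns at 1 MB, the documented maximum.
url, body = build_limit_request("ODataBinaryColumnSizeLimit", 1048576)
```

The same pattern would apply to ODataCharacterColumnSizeLimit and the two Include*LongData limits.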
The Hybrid Data Pipeline server and On-Premises Connector have been upgraded to install and use Tomcat 9.0.95. (On-Premises Connector version 4.6.2.1046)
Google has ended support for Universal Analytics (also referred to as Google Analytics 3). Therefore, Google Analytics 3 support has been removed from Hybrid Data Pipeline. Google Analytics 4 support was added to Hybrid Data Pipeline with the 4.6.1.1854 release of the server. Google Analytics 4 continues to be supported and maintained as a Hybrid Data Pipeline data store. Refer to Google Analytics 4 parameters for details. (On-Premises Connector version 4.6.2.1046)
When executing a SQL query that ends with a semicolon against an Oracle data source, the error "ORA-03137: malformed TTC packet from client rejected" was returned. (JDBC driver version 4.6.2.403)
When using a JDBC third-party connector, the OData schema map could not be refreshed using the Web UI.
Resource leaks occurred with SSL connections to OpenEdge, MySQL, Sybase, Oracle Service Cloud, and Db2 data sources. These leaks resulted in a "Too many open files" exception and caused the server to fail.
On the data source page for the SAP S/4HANA data store, the Connector ID parameter for the On-Premises Connector was missing.
On the advanced tab of the SAP S/4HANA data store page, the "Extended Options" parameter was not exposed.
See Hybrid Data Pipeline known issues for details.
Hybrid Data Pipeline now supports the creation of a custom password policy. Administrators may set an expiration date for passwords and configure the minimum and maximum number of characters allowed in a password. A custom policy may also be configured to require upper case letters, lower case letters, numbers, and special characters. See Password policy for details.
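The policy knobs described above (length bounds plus required character classes) can be illustrated with a small local validator. The field names in this sketch are assumptions, not the product's actual password-policy schema; see Password policy for the real settings.

```python
import re

# Illustrative policy object; field names are hypothetical.
policy = {
    "minLength": 8,
    "maxLength": 64,
    "requireUpper": True,
    "requireLower": True,
    "requireDigit": True,
    "requireSpecial": True,
}

def satisfies(password, policy):
    """Check a candidate password against the custom policy."""
    if not (policy["minLength"] <= len(password) <= policy["maxLength"]):
        return False
    checks = [
        (policy["requireUpper"], r"[A-Z]"),
        (policy["requireLower"], r"[a-z]"),
        (policy["requireDigit"], r"[0-9]"),
        (policy["requireSpecial"], r"[^A-Za-z0-9]"),
    ]
    return all(re.search(pat, password) for required, pat in checks if required)
```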
The new Administrator Connectors API allows administrators to retrieve information about On-Premises Connectors registered with Hybrid Data Pipeline. Administrators may obtain a full list of On-Premises Connectors with this API. They may also use it to filter the list for details such as version number, owner, and tenant. See Obtaining information about On-Premises Connectors for details.
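A filtered request to the Administrator Connectors API might be built as below. The path and filter parameter names are assumptions for illustration; see Obtaining information about On-Premises Connectors for the actual endpoint and query parameters.

```python
from urllib.parse import urlencode

# Hypothetical host and endpoint path.
BASE = "https://hdp.example.com/api/admin/connectors"

def connectors_url(**filters):
    """Build a GET URL, optionally filtered (e.g. by owner or tenant)."""
    return BASE + ("?" + urlencode(filters) if filters else "")

# Unfiltered call returns the full list; filters narrow it.
url = connectors_url(owner="jsmith", tenant="acme")
```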
Hybrid Data Pipeline provides data source logging to record user activity against data sources. Data source logs may now be obtained from the Data Sources view in the Web UI or with the data source logs endpoint. In addition, data source logs may be retrieved by running the getdslogs.sh script on each node in the deployment. See Obtaining the logs for a data source for details.
Hybrid Data Pipeline generates a number of log files to record events, activity, and other information. System logs may now be obtained through the System Configurations view in the Web UI or via the Nodes API. In addition, system logs may be retrieved by running the getlogs.sh script on each node in the deployment. See System logs for details.
For connectivity to MySQL CE, the MySQL CE Connector/J jar must be supplied during the deployment of Hybrid Data Pipeline. With this release, version 8.0 of the MySQL CE Connector/J jar has been certified with Hybrid Data Pipeline. For the latest data source and platform support information, refer to the Product Compatibility Guide.
After the MaxFetchRows limit was set, Hybrid Data Pipeline ignored the limit, and the SQL Editor returned incorrect results.
When querying the api/mgmt/datastores endpoint, Hybrid Data Pipeline returned invalid JSON in the response payload.
When querying the UserMeter table for information about an OData query, the RemoteAddress field contained the Hybrid Data Pipeline server IP address instead of the IP address of the client machine.
When registering a SAML authentication service using Azure as the Identity Provider, Hybrid Data Pipeline returned the error "Value must be a valid URL" even though the IDP entity ID was valid.
When a special character was used for the user password of a MySQL system database, the Hybrid Data Pipeline server installation failed.
When specifying NULL for a SQL_DECIMAL parameter while inserting data with the ODBC driver, the error "[DataDirect][ODBC Hybrid driver][Service]Worker thread error: java.io.EOFException" was returned. (ODBC driver 4.6.1.268)
When attempting to connect to a Salesforce test instance using OAuth, Hybrid Data Pipeline returned the error "There is a problem connecting to the DataSource. REST Status 404 Not Found returned for GET https://login.salesforce.com/services/oauth2/userinfo."
See Hybrid Data Pipeline known issues for details.
The Data Store page has been enhanced. This page lists all supported data stores and is the first stop when creating a data source, or connection, to a data store. The enhancements include a new layout, search functionality, and links to documentation resources.
The Hybrid Data Pipeline product package now includes the update_server_cert.sh shell script to simplify the process of updating SSL certificates in Linux deployments of Hybrid Data Pipeline. After you obtain a new CA certificate, you may run the script to configure the server to use the new certificate. Then, depending on your environment, certificate information must be updated for components such as the ODBC driver, JDBC driver, and On-Premises Connector. See Updating SSL certificates in the Deployment Guide for details.
The curl library files that are installed with the ODBC driver have been upgraded to version 8.4.0, which fixes a number of potential security vulnerabilities. For more information on the vulnerabilities resolved by this enhancement, refer to Vulnerabilities in the curl documentation. (ODBC driver 4.6.1.249)
The default value of the shutdown port has been changed from -1 to 8005.
When attempting to deploy the Hybrid Data Pipeline ODBC driver in a Docker container, Exit Code 1 is returned. (ODBC driver 4.6.1.249)
The curl library files that are installed with the ODBC driver have been upgraded to version 8.4.0 to address the curl Library vulnerabilities CVE-2023-38545 and CVE-2023-38546. (ODBC driver 4.6.1.249)
When contacting the Hybrid Data Pipeline server to open a websocket connection, the On-Premises Connector was not providing the Server Name Indication (SNI) extension for the SSL handshake. (On-Premises Connector 4.6.1.758)
After upgrading to Java 11.0.20 on Windows Server 2019, the installation of the JDBC driver failed with the error "java.util.zip.ZipException: Invalid CEN header (invalid extra data field size for tag: 0x3831 at 0)." (JDBC driver 4.6.1.271)
See Hybrid Data Pipeline known issues for details.
Hybrid Data Pipeline became unresponsive to incoming queries because (1) response times for queries sent to an on-premises data source were slow and (2) threads were not timing out as expected.
An issue that prevented FIPS from being used with Azure CosmosDB and MongoDB connections has been resolved.
An issue that prevented FIPS from being used with SAP HANA connections has been resolved.
After upgrading to version 4.6.1.676 of the On-Premises Connector, the Hybrid Data Pipeline server was unable to connect to the on-premises data source. (On-Premises Connector 4.6.1.709)
See Hybrid Data Pipeline known issues for details.
When fetching data from an OData-enabled Oracle database, Hybrid Data Pipeline returned Date and Time values only in UTC.
When using the SQL Editor to query a SQL Server data source, the SQL Editor was unable to browse tables, views, and procedures under any schema name that included a dot.
When deploying the server as a Docker container, using the HDP_DATABASE_ADVANCED_OPTIONS option to enable SSL (HDP_DATABASE_ADVANCED_OPTIONS=EncryptionMethod=SSL) failed to enable SSL against the system database.
See Hybrid Data Pipeline known issues for details.
Idle Google BigQuery connections did not fully close, causing the Hybrid Data Pipeline server to reach the limit on open files.
After upgrading to the On-Premises Connector, the On-Premises Connector received an HTTP 401 error from the Hybrid Data Pipeline server when attempting to connect. (On-Premises Connector 4.6.1.255)
When attempting to authenticate using SAML, Hybrid Data Pipeline returned the exception "Request header is too large."
After an account lockout occurred, OData queries continued to run successfully.
When using a third-party connector where the database credentials are not included in the Hybrid Data Pipeline data source, the user is prompted to enter credentials. In this scenario, Hybrid Data Pipeline returned the error message "user name is missing." (JDBC driver 4.6.1.77)
Hybrid Data Pipeline was unable to support Azure Database for PostgreSQL as an external database because Azure Database for PostgreSQL requires a unique user naming convention.
When attempting to query an Azure Synapse serverless instance, Hybrid Data Pipeline returned a java.io.IOException worker thread error.
See Hybrid Data Pipeline known issues for details.
Support for connectivity to Snowflake has been added to Hybrid Data Pipeline. Hybrid Data Pipeline can be used to create and manage data sources for JDBC, ODBC, and OData client application access to Snowflake.
Note: Hybrid Data Pipeline does not support FIPS for Snowflake connections. Refer to "FIPS mode" or "Snowflake" in Hybrid Data Pipeline known issues for details.
The MySQL CE data store icon no longer appears by default on the Data Stores page. The icon will only appear if the MySQL Connector/J driver jar has been provided during the deployment process.
The ODBC driver does not include the version metadata required to display the driver version number in the ODBC Administrator and in the driver library properties. (ODBC driver 4.6.1.177)
When using custom authentication to connect to a REST service with the Autonomous REST Connector, the connection failed after an initial connection because the Hybrid Data Pipeline server was not properly storing authentication parameters.
Hybrid Data Pipeline has been updated to use Spring Security version 5.8.3 to address security vulnerabilities described in CVE-2023-20862. (Hybrid Data Pipeline server 4.6.1.1548, On-Premises Connector 4.6.1.570)
When running enable_ssl.sh, the script did not throw an error when an argument was not supplied.
When using the SQL Editor to query Azure SQL Data Warehouse with ActiveDirectoryPassword authentication, the error message "Catalog view 'dm_exec_sessions' is not supported in this version" was returned.
The Autonomous REST Connector was able to access the local file system of the server hosting Hybrid Data Pipeline.
When using the Autonomous REST Connector to connect to a REST service, Hybrid Data Pipeline failed to return results for a query that attempted to use dynamic filtering on a date field.
See Hybrid Data Pipeline known issues for details.
Hybrid Data Pipeline has been enhanced to support case conversions for entity types, entity sets, and properties. The owner of a data source can now change the entity type, entity set, and property names to all uppercase or all lowercase on the OData tab in the Web UI or using the Hybrid Data Pipeline Management API.
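The effect of the case-conversion setting described above can be pictured with a minimal sketch: every entity and property name in the OData schema map is folded to one case. This is an illustration of the behavior, not the product's implementation.

```python
def convert_names(schema_map, mode):
    """Return a copy of an {entity: [properties]} map folded to one case."""
    fold = str.upper if mode == "uppercase" else str.lower
    return {fold(entity): [fold(p) for p in props]
            for entity, props in schema_map.items()}

converted = convert_names({"Employee": ["FirstName", "DeptId"]}, "lowercase")
# converted maps "employee" to ["firstname", "deptid"]
```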
The Web UI has been enhanced to support data source sharing. The owner of a data source can now share access to a data store with Hybrid Data Pipeline users and tenants through the Data Sources view in the Web UI.
The Web UI has been enhanced to fully support the IP address whitelist feature. Administrators can secure access to Hybrid Data Pipeline resources by implementing IP address whitelists through the Web UI. The Web UI can be used to create IP address whitelists at the system level, tenant level, user level, or some combination of these levels.
The navigation bar can be expanded to show the names of the views supported in the Web UI. The icons in the navigation bar have been reordered and updated.
Hybrid Data Pipeline supports exposing stored functions for OData Version 4 connectivity to PostgreSQL data sources. When configuring a PostgreSQL data source, the OData schema map can be configured to expose stored functions.
A new throttling limit has been introduced in the System Limits view. The XdbcMaxResponse limit can be used to set the approximate maximum size of JDBC and ODBC HTTP result data.
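Since XdbcMaxResponse caps the approximate size of JDBC/ODBC HTTP result data, its behavior can be pictured as accumulating rows until the byte cap would be exceeded. The sketch below is purely illustrative of that "approximate maximum" semantic, not the server's actual implementation.

```python
def cap_response(rows, max_bytes):
    """Accumulate serialized rows until the approximate byte cap is hit."""
    out, size = [], 0
    for row in rows:
        encoded = row.encode("utf-8")
        if size + len(encoded) > max_bytes:
            break  # stop before exceeding the configured limit
        out.append(row)
        size += len(encoded)
    return out
```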
The ODBC driver installation program has been enhanced to support branded installations for OEM customers (available in the ODBC driver installer on November 18, 2019). The branded driver can then be distributed with OEM customer client applications. For the Hybrid Data Pipeline ODBC driver distribution guide, visit the Progress DataDirect Product Books page on the Progress PartnerLink website (login required).
The Hybrid Data Pipeline server and On-Premises Connector have been upgraded to install and use Tomcat 9.0.20. (On-Premises Connector version 4.6.1.7)
See Hybrid Data Pipeline known issues for details.
Hybrid Data Pipeline now supports connectivity to PostgreSQL 14 databases. PostgreSQL 14 can also be used as a system database to store account and configuration information for a Hybrid Data Pipeline instance. This functionality is supported in the following component versions.
The default behavior for handling PostgreSQL call escape syntax has changed. Previously, Hybrid Data Pipeline only supported stored functions, and treated the non-standard escape syntax {call function()} the same as the standard escape syntax {? = call function()}. With this latest patch, Hybrid Data Pipeline supports stored functions and stored procedures for JDBC and ODBC connections. Now Hybrid Data Pipeline determines whether a function or procedure is being called based on the call escape syntax. If the return value parameter ?= is used, then the connectivity service calls a stored function. If the return value parameter is not used, then the connectivity service calls a stored procedure. You can change this default behavior by setting the CallEscapeBehavior option as an extended option under the Advanced tab. These are the valid values for the CallEscapeBehavior option:
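Whatever CallEscapeBehavior value is configured, the default dispatch rule described above can be sketched as follows: an escape with a return-value parameter is routed to a stored function, one without it to a stored procedure. This is an illustration of the rule, not the service's actual parser.

```python
import re

def call_escape_kind(sql):
    """Classify a call escape per the default dispatch rule."""
    text = sql.strip()
    if re.match(r"^\{\s*\?\s*=\s*call\b", text, re.IGNORECASE):
        return "function"   # {? = call f()} -> stored function
    if re.match(r"^\{\s*call\b", text, re.IGNORECASE):
        return "procedure"  # {call p()} -> stored procedure
    return "not-a-call-escape"
```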
When using the OData $expand functionality to query an OpenEdge data source, the query failed and an error was returned.
When a SQL query included columns of the same name, the SQL Editor did not display the column values.
The SQL Editor did not display results as expected.
See Hybrid Data Pipeline known issues for details.
Note: This version of Hybrid Data Pipeline has reached end of life. These release notes are for reference purposes.
Hybrid Data Pipeline now supports integration with Active Directory for user authentication over the LDAP protocol. Customers can create an LDAP authentication configuration by supplying the server details and can then configure users to use LDAP authentication instead of the default authentication.
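An LDAP authentication configuration of the kind described above might be expressed as a payload along these lines. Every field name, value, and the overall shape here are illustrative assumptions; refer to the product's authentication documentation for the actual schema.

```python
import json

# Hypothetical LDAP/Active Directory configuration payload.
ldap_config = {
    "name": "corp-active-directory",
    "authType": "LDAP",
    "attributes": {
        "serverUrl": "ldaps://ad.example.com:636",
        "searchBase": "OU=Users,DC=example,DC=com",
        "userAttribute": "sAMAccountName",
    },
}
body = json.dumps(ldap_config)
```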
To get started with LDAP authentication, you need to do the following:
The following aspects of OData Version 4 functions are supported:
The following aspects of OData Version 4 functions are currently NOT supported:
GUI | Console | Definition
D2C_ADMIN_PASSWORD | D2C_ADMIN_PASSWORD_CONSOLE | Specifies the password for the default administrator.
D2C_USER_PASSWORD | D2C_USER_PASSWORD_CONSOLE | Specifies the password for the default user.
• Product Information: When you are using the evaluation version of the product, the Web UI now displays evaluation timeout information as 'xx Days Remaining'.
• Version Information: The product version information now includes details about the license type. This can be seen under the version information section of the UI. The license type is also returned when you query for version information via the version API.
• With the 4.3 release, Hybrid Data Pipeline enables users to plug JDBC drivers into Hybrid Data Pipeline and access data using those drivers. This beta feature supports access via JDBC, ODBC, and OData clients with the Teradata JDBC driver. If you are interested in setting up this feature as you evaluate Hybrid Data Pipeline, please contact our sales department.
Apache Hive
• Enhancements
• Enhanced to optimize the performance of fetches.
• Enhanced to support the Binary, Char, Date, Decimal, and Varchar data types.
• Enhanced to support HTTP mode, which allows you to access Apache Hive data sources using HTTP/HTTPS requests. HTTP mode can be configured using the new Transport Mode and HTTP Path parameters.
• Enhanced to support cookie-based authentication for HTTP connections. Cookie-based authentication can be configured using the new Enable Cookie Authentication and Cookie Name parameters.
• Enhanced to support Apache Knox.
• Enhanced to support Impersonation and Trusted Impersonation using the Impersonate User parameter.
• The Batch Mechanism parameter has been added. When Batch Mechanism is set to multiRowInsert, the driver executes a single insert for all the rows contained in a parameter array. MultiRowInsert is the default setting and provides substantial performance gains when performing batch inserts.
• The Catalog Mode parameter allows you to determine whether the native catalog functions are used to retrieve information returned by DatabaseMetaData functions. In the default setting, Hybrid Data Pipeline employs a balance of native functions and driver-discovered information for the optimal balance of performance and accuracy when retrieving catalog information.
• The Array Fetch Size parameter improves performance and reduces out of memory errors. Array Fetch Size can be used to increase throughput or, alternately, improve response time in Web-based applications.
• The Array Insert Size parameter provides a workaround for memory and server issues that can sometimes occur when inserting a large number of rows that contain large values.
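The multiRowInsert batch mechanism described above can be pictured as collapsing a parameter array into one INSERT with multiple VALUES tuples, rather than one INSERT per row. The sketch below illustrates that rewrite only; the driver's real batching also accounts for statement-size limits.

```python
def multi_row_insert(table, columns, rows):
    """Build a single parameterized INSERT covering all parameter-array rows."""
    placeholders = "(" + ", ".join("?" for _ in columns) + ")"
    values = ", ".join(placeholders for _ in rows)
    return f"INSERT INTO {table} ({', '.join(columns)}) VALUES {values}"

sql = multi_row_insert("emp", ["id", "name"], [(1, "a"), (2, "b")])
```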
• Certifications
• Certified with Hive 2.0.x, 2.1.x
• Apache Hive data store connectivity has been certified with the following distributions:
• Cloudera (CDH) 5.4, 5.5, 5.6, 5.7, 5.8, 5.9, 5.10, 5.11, 5.12
• Hortonworks (HDP) 2.3, 2.4, 2.5
• IBM BigInsights 4.1, 4.2, 4.3
• MapR 5.2
Version and distribution support
• Hive versions 1.0 and higher are supported. Support for earlier versions has been deprecated.
• The HiveServer2 protocol and higher is supported. As a result:
• Support for the HiveServer1 protocol has been deprecated.
• The Wire Protocol Version parameter has been deprecated.
• Support has been deprecated for the following distributions:
• Amazon Elastic MapReduce (Amazon EMR) 2.1.4, 2.24-3.1.4, 3.2-3.7
• Cloudera's Distribution Including Apache Hadoop (CDH) 4.0, 4.1, 4.2, 4.5, 5.0, 5.1, 5.2, 5.3
• Hortonworks (HDP), versions 1.3, 2.0, 2.1, 2.2
• IBM BigInsights 3.0
• MapR Distribution for Apache Hadoop 1.2, 2.0
• Pivotal Enterprise HD 2.0.1, 2.1
IBM DB2
• Certifications
• Certified with DB2 V12 for z/OS
• Certified with dashDB (IBM Db2 Warehouse on Cloud)
Oracle Marketing Cloud (Oracle Eloqua)
• Data type support. The following data types are supported for the Oracle Eloqua data store.
• BOOLEAN
• DECIMAL
• INTEGER
• LONG
• LONGSTRING
• STRING
Oracle Sales Cloud
• Data type support. The following data types are supported for the Oracle Sales Cloud data store.
• ARRAY
• BOOLEAN
• DATETIME
• DECIMAL
• DURATION
• INTEGER
• LARGETEXT
• LONG
• TEXT
• URL
Support for connections to Google Analytics 4 (GA4) has been added to Hybrid Data Pipeline. Hybrid Data Pipeline can be used to create and manage data sources for JDBC, ODBC, and OData client application access to GA4. Refer to [GA4] Introducing the next generation of Analytics, Google Analytics 4 in Google's Analytics Help for information on GA4 and the retirement of Universal Analytics (also referred to as Google Analytics 3 or GA3). Refer to Google Analytics 4 parameters for details. (On-Premises Connector 4.6.1.676)
Hybrid Data Pipeline now supports access to MongoDB and MongoDB-type data stores, such as MongoDB Atlas and Azure CosmosDB for MongoDB. Hybrid Data Pipeline can be used to create and manage data sources for JDBC, ODBC, and OData client application access to MongoDB and MongoDB-type data stores. Refer to MongoDB parameters for details. (On-Premises Connector 4.6.1.676)
Hybrid Data Pipeline now supports access to SAP HANA data stores. Hybrid Data Pipeline can be used to create and manage data sources for JDBC, ODBC, and OData client application access to SAP HANA. Refer to SAP HANA parameters for details. (On-Premises Connector 4.6.1.676)
Note: In this release, Hybrid Data Pipeline does not support SAP HANA in FIPS environments.
Hybrid Data Pipeline now supports access to SAP S/4HANA and S/4HANA-type data stores, such as SAP BW/4HANA and SAP NetWeaver. Hybrid Data Pipeline can be used to create and manage data sources for JDBC, ODBC, and OData client application access to SAP S/4HANA and S/4HANA-type data stores. Refer to SAP S/4HANA parameters for details. (On-Premises Connector 4.6.1.676)
Note: The HTTP Header authentication method is not supported for SAP S/4HANA, SAP BW/4HANA, and SAP NetWeaver in Hybrid Data Pipeline.
The ODBC driver has been updated to use OpenSSL 3.0 to implement TLS protocols for data encryption between client applications and Hybrid Data Pipeline. This enhancement allows the driver to support the Federal Information Processing Standard or FIPS (140-2), regarding cryptographic module security requirements. To support OpenSSL 3.0 and FIPS, the Crypto Protocol Version and Enable FIPS connection options have been added to the driver. (ODBC driver 4.6.1.239)
Previously, when an end user created and saved a Hybrid Data Pipeline data source without providing authentication credentials, the user would be prompted for credentials when using the SQL editor to query the data source. This is no longer the case. Now, when an end user attempts to use the SQL editor to query a data source for which credentials have not been saved, Hybrid Data Pipeline returns the error "INVALID_LOGIN: Invalid username, password, security token; or user locked out."
Hybrid Data Pipeline has been updated to install and use Tomcat 9.0.75. This update addresses the security vulnerability in Tomcat 9.0.73 as described in CVE-2023-28709. (On-Premises Connector 4.6.1.676)
After the installation of the ODBC driver on Linux, the default values in the odbc.ini template installed with the driver did not match the values in the hybridDefaults.properties file. (ODBC driver 4.6.1.239)
When exposing a Microsoft SQL Server table via OData Version 4 and filtering on a BIT/BOOLEAN field, Hybrid Data Pipeline returned the ODataApplicationException "An expression of non-boolean type specified in a context where a condition is expected, near ')'."
With some shell configurations, the Hybrid Data Pipeline shutdown script stop.sh was not shutting down Hybrid Data Pipeline server processes.
Hybrid Data Pipeline has been enhanced to support Google Analytics, Google BigQuery, Salesforce, and REST data store implementations of OAuth 2.0. To integrate Hybrid Data Pipeline with an OAuth 2.0 authorization flow, Hybrid Data Pipeline must be registered as a client application with the given data store. Then, OAuth application and profile objects must be created to manage OAuth endpoints, properties, and tokens. For details, refer to Integrating Hybrid Data Pipeline as a client application with a data store OAuth 2.0 authorization flow. (On-Premises Connector 4.6.1.241)
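As background, the first leg of the OAuth 2.0 authorization-code flow that such a client registration enables is an authorization request against the data store's endpoint. A minimal sketch in Python; the endpoint, client ID, redirect URI, and scope below are illustrative placeholders, not Hybrid Data Pipeline specifics:

```python
from urllib.parse import urlencode

def build_authorization_url(endpoint, client_id, redirect_uri, scope):
    """Build the request that starts an OAuth 2.0 authorization-code
    flow (RFC 6749, section 4.1.1)."""
    params = {
        "response_type": "code",   # authorization-code grant
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": scope,
    }
    return endpoint + "?" + urlencode(params)

# Placeholder values for illustration only.
url = build_authorization_url(
    "https://example.com/oauth2/authorize",
    "my-registered-client-id",
    "https://myserver:8443/oauth2/callback",
    "read",
)
```

The authorization code returned to the redirect URI is then exchanged for access and refresh tokens, which the OAuth profile object stores and manages.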
The procedures for integrating Hybrid Data Pipeline as a client application to enable access to Google Analytics include the ability to select or create an OAuth application in the Web UI. For details, refer to Google Analytics parameters.
When selecting a data source from the dropdown in the SQL Editor, the server returned the error "No suitable driver found."
When performing an insert on an OData-enabled MySQL Community Edition data source, Hybrid Data Pipeline returned an error on a datetime column.
Performing a mergeEntity operation against an OData-enabled MySQL Community Edition data source resulted in a NullPointerException.
Server-side SSL could not be configured because the enable_ssl.sh script was not properly setting the truststore information from the Web UI.
To mitigate the CVE-2022-23181 security vulnerability, the Tomcat context.xml file has been modified such that session persistence is disabled by default.
Hybrid Data Pipeline has added support for a subset of the functionality defined by the OData Version 4 extension for data aggregation. Aggregation functionality is extended with the $apply query parameter. See Aggregation support in the user's guide for details.
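For example, an aggregation request can group rows and aggregate a column entirely through the $apply query parameter. A sketch of building such a request URL in Python; the data source, entity set, and column names are hypothetical:

```python
from urllib.parse import quote

# Hypothetical OData Version 4 endpoint and entity set.
base = "https://myserver:8443/api/odata4/SalesDS/Orders"

# Group orders by Category and sum the Amount column into a Total property.
apply_expr = "groupby((Category),aggregate(Amount with sum as Total))"

# Percent-encode the expression for use in the query string.
url = base + "?$apply=" + quote(apply_expr)
```

The response contains one entity per Category value, each carrying the computed Total, rather than the raw Orders rows.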
The Hybrid Data Pipeline On-Premises Connector, ODBC driver, and JDBC driver have added support for Windows Server 2019. (On-Premises Connector version 4.6.1.48, ODBC driver 4.6.1.12, JDBC driver 4.6.1.9)
The Hybrid Data Pipeline server required the specification of the Server Name parameter, even though Server Name is not required for a TNS connection. In addition, when Server Name was specified, the server returned an inaccurate error message.
Note: This version of Hybrid Data Pipeline has reached end of life. These release notes are for reference purposes.
With the secure change password functionality, a password change request must supply both the current and the new password:

{
"currentPassword": "<mycurrentpassword>",
"newPassword": "<mynewpassword>"
}

Administrators can fall back to the old functionality by setting the configurations API with the new secureChangePassword attribute (specified with the number 2). For example, the following PUT operation would configure the system to use the old functionality where the user must provide only a new password.

PUT https://myserver:port/api/admin/configurations/2

{
"value": "false"
}
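The configuration call above can be composed with any HTTP client. A sketch using Python's standard library; the host and port are placeholders, and the request is only built here, not sent:

```python
import json
import urllib.request

# Serialize the configuration payload.
body = json.dumps({"value": "false"}).encode("utf-8")

# Build the PUT against the configurations API (placeholder host/port).
req = urllib.request.Request(
    "https://myserver:8443/api/admin/configurations/2",
    data=body,
    method="PUT",
    headers={"Content-Type": "application/json"},
)
# Before sending with urllib.request.urlopen(req), the request would also
# need administrator credentials (for example, an Authorization header).
```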
Check for BouncyCastle Approved Only Mode [result=true]

Note: Because the installer program is not capable of regenerating encryption keys for existing users and data sources, we currently recommend a new, clean installation of Hybrid Data Pipeline with FIPS enabled when upgrading from a non-FIPS-compliant server to a FIPS-compliant server. With a new installation, users and data sources must be re-created.
Note: The On-Premises Connector is not currently FIPS compliant. Therefore, any connections made to an on-premises data source through an On-Premises Connector will not be fully FIPS compliant.
The following aspects of OData Version 4 functions are supported:
The following aspects of OData Version 4 functions are currently NOT supported:
GUI | Console | Definition |
D2C_USING_FIPS_CONFIG | D2C_USING_FIPS_CONFIG_CONSOLE | Specifies if you want to configure the server to be FIPS-compliant. |
When translating an OData Version 4 query to a SQL query, Hybrid Data Pipeline did not honor the parentheses in the OData query. This reordered operator precedence and led to incorrect results. Parentheses in OData queries are now reproduced in SQL queries to maintain operator precedence.
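The impact of dropping parentheses is easy to see with plain boolean operators, since AND binds more tightly than OR in both OData and SQL. A small illustration, not Hybrid Data Pipeline code:

```python
a, b, c = True, False, False

# $filter=(a or b) and c -- parentheses preserved
with_parens = (a or b) and c       # evaluates to False

# "a or b and c" without parentheses parses as a or (b and c),
# because AND binds more tightly than OR
without_parens = a or b and c      # evaluates to True
```

With these values the two forms disagree, which is exactly the class of incorrect result the fix prevents.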
When connecting to an on-premises data source using the On-Premises Connector, the error "origin=driver message=PAYLOAD_TYPE_BINARY != opcode, was 10" was returned. The On-Premises Connector now correctly handles pongs sent from load balancers according to the WebSocket protocol.
A production instance of the Hybrid Data Pipeline server can now be deployed using a Docker image. The Hybrid Data Pipeline Docker image is available in the Hybrid Data Pipeline Docker Deployment Package. In addition, the Docker Deployment Package includes demos for a number of deployment scenarios. For details and instructions, see Deploying Hybrid Data Pipeline using Docker in the installation guide.
Hybrid Data Pipeline now supports user authentication using the OIDC protocol. An identity provider and client applications can be configured to authorize users and grant access to the OData endpoints of the Hybrid Data Pipeline server. See Integrating an OIDC authentication service in the user's guide for details.
When using the third-party JDBC Oracle driver, the Hybrid Data Pipeline SQL Editor did not return tables.
When the special character '+' (plus sign) was used in an account password, the user was unable to authenticate with the Hybrid Data Pipeline server.
Hybrid Data Pipeline was unable to access Oracle Cloud Financials REST Endpoints with the Autonomous REST Connector.
When connecting to a Google Analytics data source using an OAuth profile created in a previous version of Hybrid Data Pipeline, the following error was returned: "There is a problem connecting to the data source. Error getting data source. System not available, try again later."
Hybrid Data Pipeline has been updated to use Log4j version 2.15 to address the security vulnerability found in Log4j version 2.13.3 as described in CVE-2021-44228. For details, refer to CVE-2021-44228. (Hybrid Data Pipeline server 4.6.1.306, On-Premises Connector version 4.6.1.85).
Note: This version of Hybrid Data Pipeline has reached end of life. These release notes are for reference purposes.
OpenSSL 1.0.2k addresses vulnerabilities resolved by earlier versions of the library. For more information on OpenSSL vulnerabilities resolved by this upgrade, refer to OpenSSL announcements.
The Oracle Marketing Cloud data store provides access to Oracle Eloqua. Improved features and functionality for this data store are available with this Hybrid Data Pipeline release.
Hybrid Data Pipeline now supports invoking stored procedures for JDBC and ODBC connections. Stored procedures functionality includes support for input parameters, output parameters, and in/out parameters. Stored procedures that return multiple results are also supported. This functionality is supported in the following component versions.
When attempting to enable FIPS, the error message did not state that the required secure random instance could not be created because there was not enough entropy on the host machine.
When an HTTP redirect status was returned, the driver was unable to follow the redirection and returned an error that the HTTP endpoint had been relocated. To resolve this issue, the FollowRedirects connection property has been introduced. When FollowRedirects is enabled, the driver can follow an HTTP redirection instead of returning an error. For details, refer to FollowRedirects.
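The behavior the option toggles can be sketched as follows; this is an illustration of the FollowRedirects semantics, not the driver's implementation:

```python
REDIRECT_STATUSES = {301, 302, 303, 307, 308}

def handle_response(status, headers, follow_redirects):
    """Return the URL to re-issue the request against when the server
    redirects, None when no redirect occurred, or raise when redirects
    are disabled (mirroring the original error behavior)."""
    if status not in REDIRECT_STATUSES:
        return None
    if not follow_redirects:
        raise RuntimeError("The HTTP endpoint has been relocated")
    return headers["Location"]

# FollowRedirects enabled: the client transparently retries at the new URL.
new_url = handle_response(302, {"Location": "https://api.example.com/v2/resource"}, True)
```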
When processing an empty result reply to a query execution request against a ServiceNow REST service, Hybrid Data Pipeline returned an "Unexpected end of stream in statement" error.
To improve performance, the OData startswith() function was changed to use LIKE instead of LOCATE, producing a SQL statement that takes less time to execute.
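For instance, a filter such as startswith(Name,'Sm') can be satisfied with an anchored LIKE pattern, which databases can often serve from an index, unlike a LOCATE(...) = 1 predicate. A sketch of such a translation; the column and prefix are examples, and the driver's actual rewrite may differ:

```python
def startswith_to_like(column, prefix):
    """Translate OData startswith(column,'prefix') into an anchored SQL
    LIKE predicate, escaping LIKE wildcard characters in the prefix."""
    escaped = (prefix.replace("\\", "\\\\")
                     .replace("%", "\\%")
                     .replace("_", "\\_"))
    return f"{column} LIKE '{escaped}%' ESCAPE '\\'"

pred = startswith_to_like("Name", "Sm")
# → Name LIKE 'Sm%' ESCAPE '\'
```

Escaping matters when the prefix itself contains % or _, which are wildcards in LIKE but literal characters in startswith().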
Hybrid Data Pipeline now supports branding of its Web UI. The default branding information, such as the logo, colors, naming, and icons, can be configured before or after installation. Refer to Branding the Web UI for details.
The Autonomous REST Composer is now available on the Configure Endpoints tab of the Autonomous REST Connector data store interface. The Composer allows you to create a REST data source and configure or import a REST Model file using the Web UI. Refer to Creating REST data sources with the Web UI for details.
Hybrid Data Pipeline has been updated to install and use Tomcat 9.0.73. In addition, the following Hybrid Data Pipeline Tomcat configurations have been made to improve security. A value of -1 disables this port.

Oracle 19c has been certified to operate as a Hybrid Data Pipeline system database.
The Microsoft Dynamics 365 data store supports a new connection option Cross Company that allows access to cross company data for users who have access to multiple companies. Refer to Microsoft Dynamics 365 parameters for details.
When querying a SQL Server data source, the JDBC driver returned the "unexpected end of stream reached" error. (JDBC driver 4.6.1.212)
The shipping version of the Tomcat server was upgraded from Tomcat 9.0.65 to 9.0.73 to address the vulnerability described in CVE-2023-24998. (Hybrid Data Pipeline server 4.6.1.1391, On-Premises Connector 4.6.1.524)
After configuring the Hybrid Data Pipeline server to use an external JRE and run in FIPS mode, server-side SSL could not be enabled. (Hybrid Data Pipeline server 4.6.1.1391)
Hybrid Data Pipeline now supports access to Google BigQuery. Hybrid Data Pipeline can be used to create and manage data sources for JDBC, ODBC, and OData client application access to Google BigQuery. OAuth 2.0 and Service Account authentication methods are supported.
Note: This version of Hybrid Data Pipeline has reached end of life. These release notes are for reference purposes.
Hybrid Data Pipeline supports throttling the number of simultaneous OData queries a user may have running against a Hybrid Data Pipeline server at one time. OData query throttling for users may be configured with the ODataMaxConcurrentRequests and ODataMaxWaitingRequests limits. The ODataMaxConcurrentRequests limit sets the maximum number of simultaneous OData requests allowed per user, while the ODataMaxWaitingRequests limit sets the maximum number of waiting OData requests allowed per user. See Throttling in the user's guide for details.
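The semantics of the two limits can be modeled with a counting semaphore: up to ODataMaxConcurrentRequests run at once, up to ODataMaxWaitingRequests queue behind them, and anything beyond that is rejected. A sketch of the limit semantics, not Hybrid Data Pipeline internals:

```python
import threading

class ODataThrottle:
    """Model of per-user OData throttling with a concurrency limit
    and a waiting-queue limit (illustrative only)."""

    def __init__(self, max_concurrent, max_waiting):
        self._slots = threading.BoundedSemaphore(max_concurrent)
        self._lock = threading.Lock()
        self._waiting = 0
        self._max_waiting = max_waiting

    def acquire(self):
        # Run immediately if a concurrency slot is free.
        if self._slots.acquire(blocking=False):
            return
        # Otherwise queue, unless the waiting limit is already reached.
        with self._lock:
            if self._waiting >= self._max_waiting:
                raise RuntimeError("too many waiting OData requests")
            self._waiting += 1
        self._slots.acquire()  # block until a running request finishes
        with self._lock:
            self._waiting -= 1

    def release(self):
        self._slots.release()
```

With max_concurrent=1 and max_waiting=0, a second simultaneous request is rejected immediately rather than queued, which mirrors how a request over both limits would be refused.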
Support for environment variables to specify server and system database credentials during the installation process has been added. The use of environment variables allows you to perform a more secure silent installation, compared to a standard silent installation where credential information must be specified in plain text in the silent installation response file. See Silent installation process in the user's guide for details.
When installing the Hybrid Data Pipeline server using SQL Server as the system database, the use of special characters in admin or user account credentials caused the installation to fail with the error Error in createSchema at Line 266.
Note: While installation no longer fails when special characters are used in system database account credentials, the installer cannot currently validate the necessary database schema objects when any of the following special characters are used in either database user ID or password values: space ( ), quotation mark ("), number sign (#), dollar sign ($), and apostrophe ('). Therefore, in a standard installation where these characters are used in database credentials, database validation must be skipped to proceed with the installation. Similarly, when performing a silent installation in this case, the SKIP_DATABASE_VALIDATION property should be set to true. Note that when skipping database validation in this scenario, the server should install successfully and work with the specified system database.
The specification of system database admin and user passwords in plain text in the response file as part of the silent installation process raised security concerns. Support for environment variables to specify server and system database credentials during the installation process has been added. See Silent installation process in the user's guide for details.
When the HAProxy load balancer was configured with the setting x-content-type-options:nosniff, Firefox, Chrome, and Microsoft Edge browsers rendered the Web UI as text instead of HTML.
Hybrid Data Pipeline now supports user authentication using the SSO/SAML protocol. Customers can configure SAML authentication by providing the details of an identity provider and can then enable SAML authentication for individual users.
When the "FileNotFoundException (Too many open files)" error occurred, the Hybrid Data Pipeline connection was lost and the server had to be restarted.
When fetching invalid date and datetime values from columns or literals, such as SELECT DATE(0), against MySQL data sources, the Hybrid Data Pipeline server returned an error.
Tableau was unable to connect to OData v4 endpoints exposed by Hybrid Data Pipeline.
Users were unable to use special characters for Hybrid Data Pipeline passwords.
When HTTP was disabled on the load balancer, the load balancer did not return OData responses to the client application as would be expected with the configuration of the X-Forwarded-Proto header to manage HTTP and HTTPS traffic.
Hybrid Data Pipeline now supports server-side SSL. Server-side SSL allows you to enable SSL behind the load balancer and secure communication between the load balancer and server nodes, as well as Hybrid Data Pipeline nodes in a cluster deployment. This functionality is supported in the following component versions.
Note:
The curl library files used with the ODBC driver have been upgraded to version 7.80.0.
The default version of the OpenSSL library used with the ODBC driver has been upgraded to version 1.1.1l.
The SQL Editor was not displaying SYNONYM objects.
When queries demanded the return of multiple large result sets, the query failed and the error "Unexpected end of stream" was returned.
Hybrid Data Pipeline now supports access to a number of Microsoft Dynamics 365 apps. Hybrid Data Pipeline can be used to create and manage data sources for JDBC, ODBC, and OData client application access to these Dynamics 365 apps. OAuth 2.0 connectivity is supported. Refer to Microsoft Dynamics 365 parameters for details. (On-Premises Connector 4.6.1.287)
The generally available Hybrid Data Pipeline Docker image now supports a trial Docker deployment. After you obtain the image from the Progress Enterprise Delivery site (ESD) or the Trial Download page, you may perform a trial deployment of Hybrid Data Pipeline as a Docker container on a single node with an internal system database. Refer to the Hybrid Data Pipeline Trial Docker Deployment Getting Started for details.
A Power BI custom connector is now available from the Progress DataDirect Hybrid Data Pipeline Public GitHub repository. This custom connector may be used to implement connectivity from Power BI to Hybrid Data Pipeline resources that use OAuth 2.0 or OIDC authentication. For details, refer to Configuring a Power BI custom connector for OAuth 2.0 or Configuring a Power BI custom connector for OIDC.
The Hybrid Data Pipeline server and On-Premises Connector have been upgraded to install and use Tomcat 9.0.63. This addresses the CVE-2022-23181 security vulnerability that was mitigated with the resolution of Issue HDP-5924. (On-Premises Connector version 4.6.1.287)
The Microsoft Dynamics CRM data store has been deprecated. Connectivity to a number of Dynamics 365 apps is now supported with app-specific Hybrid Data Pipeline data stores. Refer to Microsoft Dynamics 365 parameters for details. (On-Premises Connector 4.6.1.287)
The Docker trial image has been deprecated. A Docker trial deployment of Hybrid Data Pipeline may now be performed using the generally available Hybrid Data Pipeline Docker image. This image may be obtained from the Progress Enterprise Delivery site (ESD) or the Trial Download page. Refer to the Hybrid Data Pipeline Trial Docker Deployment Getting Started for details.
The ODBC driver did not support the GUID data type. (ODBC driver 4.6.1.67)
The shipping version of the Tomcat server was upgraded from Tomcat 9.0.54 to 9.0.63. This addresses the CVE-2022-23181 security vulnerability that was mitigated with the resolution of Issue HDP-5924. (On-Premises Connector 4.6.1.287)
When using the SQL Editor to query datetimeoffset and sql_variant data, a NullPointerException was returned.
When setting HDP_DATABASE_ADVANCED_OPTIONS to use an SSL connection to the external system database, the setting was not propagated correctly.
When performing a Hybrid Data Pipeline server upgrade in an environment using FIPS and an external JRE, the upgrade failed with the error Error in MAIN at line 576.
See Hybrid Data Pipeline known issues for details.
The Microsoft Dynamics CRM data store was recently deprecated, and has now been removed from the product package. Connectivity to a number of Dynamics 365 apps, including CRM and ERP apps, is supported with app-specific Hybrid Data Pipeline data stores. Refer to Microsoft Dynamics 365 parameters for details.
The Rollbase data store has been removed from the product package. If you would like to reintroduce the Rollbase data store, contact Technical Support.
The SugarCRM data store has been removed from the product package. If you would like to reintroduce the SugarCRM data store, contact Technical Support.
The Hybrid Data Pipeline product and its connectors used a version of HyperSQL Database that was vulnerable to the remote code execution described in CVE-2022-41853. All impacted components have been patched to fix this vulnerability. For details on the impacted components and fixed versions, refer to the following KB article:
Note: In addition to updating the Hybrid Data Pipeline server, any On-Premises Connectors used in your environment should be updated to build 4.6.1.395 of the On-Premises Connector.
After an initial connection to Microsoft Dynamics 365 using the OAuth 2.0 client credentials grant, the Authorization URI field automatically populated with the default value when the data source was reopened. The value in the Authorization URI field had to be manually cleared to reconnect with Microsoft Dynamics 365.
Hybrid Data Pipeline was unable to connect to an Azure Synapse serverless database via a SQL Server data source.
See Hybrid Data Pipeline known issues for details.
The curl library files that are installed with the ODBC driver have been upgraded to version 7.88.1, which fixes a number of potential security vulnerabilities. For more information on the vulnerabilities resolved by this enhancement, refer to Vulnerabilities in the curl documentation. (ODBC driver 4.6.1.158)
The default version of the OpenSSL library has been upgraded to version 1.1.1t, which fixes a number of potential security vulnerabilities. For more information on the vulnerabilities resolved by this enhancement, refer to Vulnerabilities: Fixed in OpenSSL 1.1.1 in OpenSSL News. (ODBC driver 4.6.1.158)
When upgrading the Hybrid Data Pipeline server to enable FIPS, the installation failed and the installer returned an account database error.
The JDBC driver was allowing statements to be executed after a connection was terminated, resulting in an "Invalid session token" error. (JDBC driver 4.6.1.194)
On a JDBC data source configured for OAuth and created with the DataDirect Snowflake JDBC driver, the user was prompted for a user ID and password when attempting to test connect.
See Hybrid Data Pipeline known issues for details.
When using the PostgreSQL JDBC driver as a third-party driver to connect to backend data, the Metadata Exposed Schemas dropdown did not load PostgreSQL schemas.
After upgrading to server build 4.6.1.357, the introduction of a new keystore prevented successful login.
Hybrid Data Pipeline deployment failed when using environment variables to deploy the server as a Docker container.
Hybrid Data Pipeline has been updated to use Spring Framework version 5.3.18, Spring Boot version 2.6.6, and Spring Security version 5.6.2 to address the vulnerability described in CVE-2022-22965. (Hybrid Data Pipeline server 4.6.1.417, On-Premises Connector 4.6.1.164, JDBC driver 4.1.6.43)
Hybrid Data Pipeline has been updated to use version 2.13.2.2 of the Jackson library to address the vulnerability described in CVE-2020-36518. (Hybrid Data Pipeline server 4.6.1.417, On-Premises Connector 4.6.1.164, JDBC driver 4.1.6.43)
After upgrading to On-Premises Connector build 4.6.1.120, the On-Premises Connector received an HTTP 401 error from the Hybrid Data Pipeline server when attempting to connect. (On-Premises Connector 4.6.1.164)
See Hybrid Data Pipeline known issues for details.
Note: This version of Hybrid Data Pipeline has reached end of life. These release notes are for reference purposes.
The following issues have been resolved. An asterisk (*) indicates an issue that was resolved in a software patch subsequent to the GA release.
Issue HDP-3974 Installation fails when choosing a Unicode external database*
When a Unicode external database was selected during the installation process, the Hybrid Data Pipeline server failed to install. This fix is available in build 4.5.0.71.
Issue HDP-3989 Validate Server Certificate persistence in the Web UI*
The Web UI was not persisting the value of the Validate Server Certificate parameter after it had been set to OFF. After exiting and reopening the data source, the test connection failed. This fix is available in build 4.5.0.65.
Issue HDP-3785 Data source password replacing plus sign (+) with space*
When creating a password for a MySQL CE data source in the Web UI, the plus sign (+) was incorrectly replaced with a space. This fix is available in build 4.5.0.65.
Issue HDP-3878 OData model creation failure*
OData model creation failed when the connectivity service was building an OData model from a very large database. In addition, if the service was unable to read metadata from unique or unusual tables, the resulting OData model returned either no rows or only partial rows. Hybrid Data Pipeline now builds the OData model from the tables selected for the model, as opposed to all the tables in the database. This enhancement is available in build 4.5.0.61.
Hybrid Data Pipeline now supports multitenancy. Multitenancy allows a system administrator to isolate groups of users, such as organizations or departments, that are being hosted through the Hybrid Data Pipeline service. The provider maintains a physical instance of Hybrid Data Pipeline, while each tenant (group of users) is provided with its own logical instance of the service. In a multitenant environment, the default system tenant contains multiple child tenants. The user accounts that reside in one tenant are isolated from those in other tenants.
Hybrid Data Pipeline now supports data source sharing via the Data Sources API. Data source owners can now share data sources with other users. Standard users can share data sources with individual user accounts. Administrators can share data sources with tenants and individual user accounts. Data source sharing allows administrators to provision users for limited or query-only access to Hybrid Data Pipeline resources.
Hybrid Data Pipeline support for third-party JDBC drivers is now GA. Administrators can use a command line validation tool to determine whether a third-party JDBC driver will work with the Hybrid Data Pipeline server and On-Premises Connector. If validated, a third-party driver can be used to support OData, JDBC, and ODBC connectivity in the Hybrid Data Pipeline environment. Once the driver is integrated with the Hybrid Data Pipeline environment, users can create Hybrid Data Pipeline data sources for the backend data store supported by the third-party JDBC driver.
Administrators can now restrict access to Hybrid Data Pipeline by creating an IP address whitelist to determine which IP addresses (either individual IP addresses or a range of IP addresses) can access resources such as the Data Sources API, the Users API, and the Web UI. IP address whitelists can be implemented at system, tenant, and user levels.
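The effect of such a whitelist can be sketched with Python's standard ipaddress module. This is a conceptual illustration only; the addresses and range below are placeholders, and actual whitelists are managed through the Hybrid Data Pipeline APIs and Web UI:

```python
import ipaddress

def is_allowed(client_ip, whitelist):
    """Return True if client_ip matches any whitelist entry
    (an individual IP address or a CIDR range)."""
    addr = ipaddress.ip_address(client_ip)
    for entry in whitelist:
        if addr in ipaddress.ip_network(entry, strict=False):
            return True
    return False

# Illustrative whitelist: one individual address and one address range
whitelist = ["203.0.113.7", "192.168.10.0/24"]
print(is_allowed("192.168.10.42", whitelist))  # True: inside the range
print(is_allowed("198.51.100.1", whitelist))   # False: not listed
```

The same check applies whichever level the whitelist is defined at; a request is evaluated against the lists in effect for the system, tenant, and user.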
Hybrid Data Pipeline now supports the following features.
The Data Sources API now supports operations to export the relational map files for non-relational data sources. When a data source is created for a web service such as Salesforce, Hybrid Data Pipeline generates files that map the object model to a relational model. These files may be used to resolve issues that can arise when querying such data sources.
The evaluation period for Hybrid Data Pipeline has been changed from 90 to 30 days.
Hybrid Data Pipeline supports changing the catalog of data sources. The setCatalog method can be used to change catalogs in JDBC, while the connection attribute SQL_ATTR_CURRENT_CATALOG can be used in ODBC. Support for changing catalogs includes support for changing the default database on an active connection to a SQL Server data source. This support extends to any data source configured with an underlying JDBC connector that supports the setCatalog method. This enhancement is available in the latest build of the Hybrid Data Pipeline server (4.6.1.132). Components such as the Hybrid Data Pipeline ODBC and JDBC drivers, as well as the On-Premises Connector must be reinstalled to adopt the enhancement (On-Premises Connector version 4.6.1.62, ODBC driver 4.6.1.27, JDBC driver 4.6.1.13).
When an incorrect host name was specified in the connection URL, the Hybrid Data Pipeline JDBC driver defaulted to service.datadirectcloud.cloud.com as the host name and returned an inaccurate error message.
The ODBC driver was not installing on Amazon Linux 2.
See Hybrid Data Pipeline known issues for details.
Hybrid Data Pipeline has been updated to use Log4j version 2.17.1 to address security vulnerabilities found in Log4j versions 2.17 as described in CVE-2021-44832. For details, refer to CVE-2021-44832. (Hybrid Data Pipeline server 4.6.1.325, On-Premises Connector version 4.6.1.99).
Hybrid Data Pipeline has been updated to use Log4j version 2.17 to address security vulnerabilities found in Log4j versions 2.15 and 2.16 as described in CVE-2021-45046 and CVE-2021-45105. For details, refer to CVE-2021-45046 and CVE-2021-45105. (Hybrid Data Pipeline server 4.6.1.311, On-Premises Connector 4.6.1.91).
See Hybrid Data Pipeline known issues for details.
Note: This version of Hybrid Data Pipeline has reached end of life. These release notes are for reference purposes.
The Hybrid Data Pipeline 4.4 release simplifies the deployment of cluster environments with enhanced messaging that removes the external dependency on a Kafka message queue and provides integration with application load balancers in public cloud environments.
Hybrid Data Pipeline has added support for multi-node clusters that integrate with cloud load balancers. Hybrid Data Pipeline supports cloud load balancers that support the WebSocket protocol (such as the AWS Application Load Balancer and Azure Application Gateway).
Hybrid Data Pipeline now has enhanced messaging such that deployments no longer rely on a Kafka cluster for highly available inter-node communication.
Support for OAuth 2.0
Hybrid Data Pipeline now supports OAuth 2.0 authorization for OData API access, in addition to basic authentication. Customers using client applications or third-party applications such as Salesforce Connect and Power BI can invoke Hybrid Data Pipeline OData access endpoints by passing the required tokens instead of storing user credentials in the application.
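In practice, the client sends the access token in the standard Authorization header rather than basic credentials. A minimal sketch follows; the endpoint URI and token are placeholders, not real values:

```python
import urllib.request

def build_odata_request(service_uri, access_token):
    """Attach an OAuth 2.0 bearer token to an OData request
    instead of a basic-auth user name and password."""
    req = urllib.request.Request(service_uri)
    req.add_header("Authorization", "Bearer " + access_token)
    req.add_header("Accept", "application/json")
    return req

# Placeholder endpoint and token for illustration only
req = build_odata_request(
    "https://hdp.example.com/api/odata4/MyDataSource/Customers",
    "eyJhbGciOi...")
print(req.get_header("Authorization").startswith("Bearer "))  # True
```

Sending the request (for example, with urllib.request.urlopen) then works against any OData endpoint for which the token grants access.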
You can now install a single node Hybrid Data Pipeline server for evaluation purposes using a Docker image. Docker is a tool that makes it easier to deploy and run applications. The use of a Docker image means that no prior machine setup is required. You may choose to use this method if you want to get started without spending time on installation and configuration.
The following properties have been removed from the response file for both console and GUI modes:
The following properties have been added to the response file for both console and GUI modes:
Users can now configure the additional property Metadata Exposed Schemas in the data source configuration to restrict the schemas they see in the SQL Editor and the OData Editor.
Added support for exposing the GUID data type as a GUID in OData for SQL Server data sources.
Simplified the On-Premises Connector installation such that the Cloud Access service is no longer installed. Only the single Cloud Connector service is installed.
Note: This version of Hybrid Data Pipeline has reached end of life. These release notes are for reference purposes.
{
  "HDPVersion": "major.minor.service_pack.build_number",
  "WAPVersion": "major.minor.service_pack.build_number",
  "DASVersion": "major.minor.service_pack.build_number"
}
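A monitoring script might consume a payload in this shape as follows. This is a sketch; the version values below are illustrative, and the payload would normally come from the server rather than a literal string:

```python
import json

# Sample payload in the shape documented above (values are illustrative)
payload = '''{
    "HDPVersion": "4.6.1.2529",
    "WAPVersion": "4.6.1.100",
    "DASVersion": "4.6.1.100"
}'''

versions = json.loads(payload)

# Each value follows major.minor.service_pack.build_number
major, minor, service_pack, build = versions["HDPVersion"].split(".")
print(major, build)  # 4 2529
```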
Note: Values for the SKIP_HOSTNAME_VALIDATION and SKIP_PORT_VALIDATION options are now false | true, where false disables and true enables the option. These options have the same name in GUI-generated and console-generated response files.
Note: Values for the SKIP_LB_HOSTNAME_VALIDATION option are now false | true, where false disables and true enables the option. This option has the same name in GUI-generated and console-generated response files.
GUI | Console | Definition |
D2C_USING_KAFKA_CONFIG | D2C_USING_KAFKA_CONFIG_CONSOLE | Specifies whether you are using an Apache Kafka message queue service. |
D2C_MESSAGE_QUEUE_SERVERS | D2C_MESSAGE_QUEUE_SERVERS_CONSOLE | Specifies the servers in your Apache Kafka cluster. |
D2C_HDP_CLUSTER_NAME | D2C_HDP_CLUSTER_NAME_CONSOLE | Specifies a name for your Hybrid Data Pipeline cluster used by the Apache Kafka message queue service. |
D2C_DB_VENDOR_MSSQLSERVER | na | Specifies whether you are using SQL Server as an external systems database. In a console mode response file, the external database is specified with the D2C_DB_VENDOR_CONSOLE option. |
D2C_DB_PORT_MSSQLSERVER | na | Specifies the port number of a SQL Server external systems database. In a console mode response file, the external database port is specified with the D2C_DB_PORT_CONSOLE option. |
D2C_SCHEMA_NAME | D2C_SCHEMA_NAME_CONSOLE | Specifies the name of the schema to be used to store systems information when a SQL Server external systems database is being used. |
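For example, a console mode response file that specifies SQL Server as the external systems database might contain entries such as the following. The property names are taken from the table above; the values shown are illustrative placeholders, not documented syntax, so consult the installation guide for the exact value format:

```
D2C_DB_VENDOR_CONSOLE=MSSQLSERVER
D2C_DB_PORT_CONSOLE=1433
D2C_SCHEMA_NAME_CONSOLE=HDPSYS
```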
When executing SELECT * FROM table against an on-premises SQL Server database using the On-Premises Connector, the ODBC driver sometimes returned the error [HY000] [DataDirect][ODBC Hybrid driver][SQLServer]Unexpected content at the end of chunk.
When the name of the On-Premises Connector host machine was in all uppercase at the time of the installation of the On-Premises Connector, the Connector Label field in the On-Premises Configuration Tool did not populate with the hostname as expected. Then, when attempting to update the Connector Label field with the correct hostname, the On-Premises Configuration Tool returned Error setting connector label for user Request returned Status:404 Message.
When using the Hybrid Data Pipeline ODBC driver to connect to a data source created with a third-party JDBC driver, the following error was returned: ODBC--call failed. [DataDirect][ODBC Hybrid driver]Numeric value out of range. Error in column 16. (#0). This error was returned because the third-party driver diverged from the JDBC specification when describing the data type of CHAR_OCTET_LENGTH for DatabaseMetaData.getColumns(). The ODBC driver has been modified to work with the third-party JDBC driver despite this divergence from the JDBC specification.
Hybrid Data Pipeline has added a SQL statement auditing feature. When SQL statement auditing is enabled, the connectivity service records SQL statements and related metrics in the SQLAudit table on the Hybrid Data Pipeline system database (also referred to as the account database). This information can then be queried directly by administrators. See SQL statement auditing in the user's guide for details.
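Because the audit records live in an ordinary database table, an administrator can query them with any SQL client. The sketch below uses an in-memory SQLite database as a stand-in for the system database; the SQLAudit column names are placeholders, not the documented schema, so see SQL statement auditing in the user's guide for the actual table layout:

```python
import sqlite3

# In-memory stand-in for the Hybrid Data Pipeline system database.
# The column names below are illustrative placeholders only.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE SQLAudit (UserName TEXT, SQLStatement TEXT, DurationMs INTEGER)")
db.execute("INSERT INTO SQLAudit VALUES ('alice', 'SELECT * FROM Orders', 42)")
db.execute("INSERT INTO SQLAudit VALUES ('bob', 'SELECT 1', 3)")

# The kind of ad hoc query an administrator might run against the audit table
rows = db.execute(
    "SELECT UserName, SQLStatement FROM SQLAudit WHERE DurationMs > 10").fetchall()
print(rows)  # [('alice', 'SELECT * FROM Orders')]
```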
The Hybrid Data Pipeline server and On-Premises Connector have been upgraded to install and use Tomcat 9.0.37. (On-Premises Connector version 4.6.1.14)
See Hybrid Data Pipeline known issues for details.