4.6.1.2566

May 9, 2024

Preface

These release notes describe enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

For On-Premises Connectors, administrators can obtain version information using the Administrator Connectors API. See Obtaining information about On-Premises Connectors for details. In addition, version information can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver
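
Alternatively, the version can be read from the driver metadata over an open connection. The following is a minimal sketch; the connection URL, user ID, and password are placeholders for your own deployment:

    import java.sql.Connection;
    import java.sql.DriverManager;

    public class ShowDriverVersion {
        public static void main(String[] args) throws Exception {
            // Placeholder URL: substitute your Hybrid Data Pipeline host,
            // port, and data source name.
            String url = "jdbc:datadirect:ddhybrid://myserver:8080;"
                    + "hybridDataPipelineDataSource=MyDataSource";
            try (Connection con = DriverManager.getConnection(url, "user", "password")) {
                System.out.println(con.getMetaData().getDriverVersion());
            }
        }
    }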

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Resolved Issues

Issue HDP-9319 Manage Configuration page did not load after upgrade to build 4.6.1.2529

After upgrading to build 4.6.1.2529 of the Hybrid Data Pipeline server, the Manage Configuration page in the Web UI did not load.

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.2.2978

November 8, 2024

Preface

These release notes describe enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

For On-Premises Connectors, administrators can obtain version information using the Administrator Connectors API. See Obtaining information about On-Premises Connectors for details. In addition, version information can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Enhancements

Log management capabilities

Log management capabilities have been enhanced. Administrators may now specify a centralized location for Hybrid Data Pipeline logs. In addition, administrators may set logging levels for system services, including the web UI, the data access service, the notification server, and the Apache Tomcat server. For details, refer to Log management.

OData Version 4 Long Types

For OData Version 4, Hybrid Data Pipeline now supports long binary and long character types up to 1 MB. Supported long binary types include BLOB and LONGVARBINARY. Supported long character types include CLOB, LONGNVARCHAR, LONGVARCHAR, and NCLOB. Column sizes for long binary types may be managed with the limits ODataBinaryColumnSizeLimit and ODataIncludeBinaryLongData. Column sizes for long character types may be managed with the limits ODataCharacterColumnSizeLimit and ODataIncludeCharacterLongData. Refer to the following documentation resources for details: Entity Data Model (EDM) types for OData Version 4, Manage Limits view, and Limits API.
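
As an illustration only, a limit such as ODataBinaryColumnSizeLimit might be raised through the Limits API with a request like the following; the endpoint path and payload shape shown here are assumptions, so consult the Limits API documentation for the exact form (1048576 bytes = 1 MB):

    https://myserver:port/api/admin/limits/system/ODataBinaryColumnSizeLimit
    {
    "value": 1048576
    }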

Tomcat upgrade

The Hybrid Data Pipeline server and On-Premises Connector have been upgraded to install and use Tomcat 9.0.95. (On-Premises Connector version 4.6.2.1046)

ODBC driver ICU library upgrade (74.1) for Linux

For Linux, the ICU library files that are installed with the driver have been upgraded to version 74.1. In addition, the ICU library file names have changed. For the 32-bit driver, the ICU file name has changed from libivicu28.so to libivicu.so. For the 64-bit driver, the ICU file name has changed from libddicu28.so to libddicu.so. (ODBC driver version 4.6.2.340, November 21, 2024)

ODBC driver ICU library upgrade (74.1) for Windows

For Windows, the ICU library files that are installed with the driver have been upgraded to version 74.1. As a part of this upgrade, the ICU library file names have changed. For the 32-bit driver, the ICU file name has changed from ivicu28.dll to ivicu.dll. For the 64-bit driver, the ICU file name has changed from ddicu28.dll to ddicu.dll. (ODBC driver version 4.6.2.340, November 21, 2024)

Changed Behavior

Google Analytics support

Google has ended support for Universal Analytics (also referred to as Google Analytics 3). Therefore, Google Analytics 3 support has been removed from Hybrid Data Pipeline. Google Analytics 4 support was added to Hybrid Data Pipeline with the 4.6.1.1854 release of the server. Google Analytics 4 continues to be supported and maintained as a Hybrid Data Pipeline data store. Refer to Google Analytics 4 parameters for details. (On-Premises Connector version 4.6.2.1046)

ODBC driver Windows runtime version upgrade

The driver is now compiled with an upgraded compiler for Windows platforms. As a result, you must have Microsoft Visual C/C++ runtime version 14.40.33810 or higher on your machine to run the driver. (ODBC driver version 4.6.2.340, November 21, 2024)

Resolved Issues

Issue HDP-9081 Error "ORA-03137: malformed TTC packet from client rejected" returned when query ended with semicolon

When executing a SQL query that ends with a semicolon against an Oracle data source, the error "ORA-03137: malformed TTC packet from client rejected" was returned. (JDBC driver version 4.6.2.403)

Issue HDP-10028 OData schema map does not refresh for third-party JDBC connectors

When using a JDBC third-party connector, the OData schema map could not be refreshed using the Web UI.

Issue HDP-10898 Resource leaks resulted in "Too many open files" exception and caused the server to fail

Resource leaks occurred with SSL connections to OpenEdge, MySQL, Sybase, Oracle Service Cloud, and Db2 data sources. These leaks resulted in a "Too many open files" exception and caused the server to fail.

Issue HDP-11034 SAP S/4HANA data source missing Connector ID parameter for On-Premises Connector

On the data source page for the SAP S/4HANA data store, the Connector ID parameter for the On-Premises Connector was missing.

Issue HDP-11081 SAP S/4HANA data source missing the "Extended Options" parameter

On the Advanced tab of the SAP S/4HANA data store page, the "Extended Options" parameter was not exposed.

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.2529

April 18, 2024

Preface

These release notes describe enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

For On-Premises Connectors, administrators can obtain version information using the Administrator Connectors API. See Obtaining information about On-Premises Connectors for details. In addition, version information can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Enhancements

Custom password policy

Hybrid Data Pipeline now supports the creation of a custom password policy. Administrators may set an expiration date for passwords and configure the minimum and maximum number of characters allowed in a password. A custom policy may also be configured to require upper case letters, lower case letters, numbers, and special characters. See Password policy for details.

Collect information about On-Premises Connectors

The new Administrator Connectors API allows administrators to retrieve information about On-Premises Connectors registered with Hybrid Data Pipeline. Administrators may obtain a full list of On-Premises Connectors with this API. They may also use it to filter the list for details such as version number, owner, and tenant. See Obtaining information about On-Premises Connectors for details.
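
For illustration, listing registered On-Premises Connectors might look like the following GET requests; the endpoint path and query parameter names here are hypothetical, so refer to Obtaining information about On-Premises Connectors for the documented form:

    GET https://myserver:port/api/admin/connectors
    GET https://myserver:port/api/admin/connectors?tenant=MyTenant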

Download data source logs using the Web UI or the Hybrid Data Pipeline API

Hybrid Data Pipeline provides data source logging to record user activity against data sources. Data source logs may now be obtained from the Data Sources view in the Web UI or with the data source logs endpoint. In addition, data source logs may be retrieved by running the getdslogs.sh script on each node in the deployment. See Obtaining the logs for a data source for details.

Download system logs using the Web UI or the Hybrid Data Pipeline API

Hybrid Data Pipeline generates a number of log files to record events, activity, and other information. System logs may now be obtained through the System Configurations view in the Web UI or via the Nodes API. In addition, system logs may be retrieved by running the getlogs.sh script on each node in the deployment. See System logs for details.

Changed Behavior

MySQL Community Edition

For connectivity to MySQL CE, the MySQL CE Connector/J jar must be supplied during the deployment of Hybrid Data Pipeline. With this release, version 8.0 of the MySQL CE Connector/J jar has been certified with Hybrid Data Pipeline. For the latest data source and platform support information, refer to the Product Compatibility Guide.

Resolved Issues

Issue HDP-8498 MaxFetchRows limit not specifying the maximum rows allowed to be fetched

After the MaxFetchRows limit was set, Hybrid Data Pipeline ignored the limit. In addition, the SQL Editor returned incorrect results.

Issue HDP-8677 Calls to api/mgmt/datastores endpoint returning invalid JSON

When querying the api/mgmt/datastores endpoint, Hybrid Data Pipeline returned invalid JSON in the response payload.

Issue HDP-8683 UserMeter table RemoteAddress field contains the Hybrid Data Pipeline server IP address instead of the client machine IP address for OData queries

When querying the UserMeter table for information about an OData query, the RemoteAddress field contained the Hybrid Data Pipeline server IP address instead of the IP address of the client machine.

Issue HDP-8690 Error 'Value must be a valid URL' returned when registering a SAML authentication service

When registering a SAML authentication service using Azure as the Identity Provider, Hybrid Data Pipeline returned the error "Value must be a valid URL" even though the IDP entity ID was valid.

Issue HDP-8710 Special characters not supported for external system database user passwords

When a special character was used for the user password of a MySQL system database, the Hybrid Data Pipeline server installation failed.

Issue HDP-8844 Worker thread error: java.io.EOFException with the ODBC driver

When specifying NULL for a SQL_DECIMAL parameter while inserting data with the ODBC driver, the error "[DataDirect][ODBC Hybrid driver][Service]Worker thread error: java.io.EOFException" was returned. (ODBC driver 4.6.1.268)

Issue HDP-9079 OAuth2 not working against the Salesforce test instance

When attempting to connect to a Salesforce test instance using OAuth, Hybrid Data Pipeline returned the error "There is a problem connecting to the DataSource. REST Status 404 Not Found returned for GET https://login.salesforce.com/services/oauth2/userinfo."

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.2057

December 5, 2023

Preface

These release notes describe enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Enhancements

Data Store page

The Data Store page has been enhanced. It lists all supported data stores and is the first stop when creating a data source, or connection, to a data store. The enhancements include a new layout, search functionality, and links to documentation resources.

SSL certificate update script

The Hybrid Data Pipeline product package now includes the update_server_cert.sh shell script to simplify the process of updating SSL certificates in Linux deployments of Hybrid Data Pipeline. After you obtain a new CA certificate, you may run the script to configure the server to use the new certificate. Then, depending on your environment, certificate information must be updated for components such as the ODBC driver, JDBC driver, and On-Premises Connector. See Updating SSL certificates in the Deployment Guide for details.

curl Library Upgrade

The curl library files that are installed with the ODBC driver have been upgraded to version 8.4.0, which fixes a number of potential security vulnerabilities. For more information on the vulnerabilities resolved by this enhancement, refer to Vulnerabilities in the curl documentation. (ODBC driver 4.6.1.249)

Changed Behavior

Shutdown Port

The default value of the shutdown port has been changed from -1 to 8005.
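
For context, this is the port on which Tomcat listens for its shutdown command. In a stock Tomcat server.xml the setting appears on the Server element, where a value of -1 disables the shutdown listener and 8005 is Tomcat's conventional default; the location of this file within a Hybrid Data Pipeline installation may vary:

    <!-- 8005 is Tomcat's conventional default; -1 disables the listener. -->
    <Server port="8005" shutdown="SHUTDOWN">
      ...
    </Server>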

Resolved Issues

Issue HDP-8191 Exit Code 1 returned when deploying the ODBC driver as a Docker container

When attempting to deploy the Hybrid Data Pipeline ODBC driver in a Docker container, Exit Code 1 was returned. (ODBC driver 4.6.1.249)

Issue HDP-8281 ODBC driver curl Library vulnerabilities (CVE-2023-38545 and CVE-2023-38546)

The curl library files that are installed with the ODBC driver have been upgraded to version 8.4.0 to address the curl Library vulnerabilities CVE-2023-38545 and CVE-2023-38546. (ODBC driver 4.6.1.249)

Issue HDP-8307 OPC not sending SNI extension on SSL handshake for websocket connections

When contacting the Hybrid Data Pipeline server to open a websocket connection, the On-Premises Connector was not providing the Server Name Indication (SNI) extension for the SSL handshake. (On-Premises Connector 4.6.1.758)

Issue HDP-8431 JDBC driver installation fails with "Invalid CEN header" error after upgrade to Java 11

After upgrading to Java 11.0.20 on Windows Server 2019, the installation of the JDBC driver failed with the error "java.util.zip.ZipException: Invalid CEN header (invalid extra data field size for tag: 0x3831 at 0)." (JDBC driver 4.6.1.271)

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.1930

October 26, 2023

Preface

These release notes describe enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Resolved Issues

Issue HDP-8003 Server unresponsive to Azure Application Gateway load balancer

Hybrid Data Pipeline became unresponsive to incoming queries (1) due to slow response times associated with queries sent to an on-premises data source and (2) because threads were not timing out as expected.

Issue HDP-8009 Azure CosmosDB and MongoDB data sources in FIPS environment

An issue that prevented FIPS from being used with Azure CosmosDB and MongoDB connections has been resolved.

Issue HDP-8010 SAP HANA data source in FIPS environment

An issue that prevented FIPS from being used with SAP HANA connections has been resolved.

Issue HDP-8183 Unable to connect to on-premises data source after upgrading to OPC version 4.6.1.676

After upgrading to version 4.6.1.676 of the On-Premises Connector, the Hybrid Data Pipeline server was unable to connect to the on-premises data source. (On-Premises Connector 4.6.1.709)

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.1128

January 25, 2023

Preface

These release notes describe enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Resolved Issues

Issue HDP-6464 Service returned only Date and Time values in UTC when fetching data from an OData-enabled Oracle database

When fetching data from an OData-enabled Oracle database, Hybrid Data Pipeline returned Date and Time values only in UTC.

Issue HDP-6539 SQL Editor unable to browse tables, views, or procedures under a schema name that includes a dot

When using the SQL Editor to query a SQL Server data source, the SQL Editor was unable to browse tables, views, and procedures under any schema name that included a dot.

Issue HDP-6623 HDP_DATABASE_ADVANCED_OPTIONS did not enable SSL against the system database

When deploying the server as a Docker container, using the HDP_DATABASE_ADVANCED_OPTIONS option to enable SSL (HDP_DATABASE_ADVANCED_OPTIONS=EncryptionMethod=SSL) failed to enable SSL against the system database.
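
For reference, the option is passed through the container environment at startup. A minimal sketch, in which the image name and any other required settings are placeholders rather than the documented invocation:

    # Image name and other environment settings are placeholders.
    docker run -d \
      -e HDP_DATABASE_ADVANCED_OPTIONS=EncryptionMethod=SSL \
      <your-hybrid-data-pipeline-image>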

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.0607

July 28, 2022

Preface

These release notes describe enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Resolved Issues

Issue HDP-5690 Hybrid Data Pipeline reached open files limit

Idle Google BigQuery connections did not fully close, causing the Hybrid Data Pipeline server to reach the limit on open files.

Issue HDP-5866 On-Premises Connector throwing HTTP 401 error during installation

After an upgrade of the On-Premises Connector, it received an HTTP 401 error from the Hybrid Data Pipeline server when attempting to connect. (On-Premises Connector 4.6.1.255)

Issue HDP-5938 "Request header is too large" exception with HDP SAML authentication

When attempting to authenticate using SAML, Hybrid Data Pipeline returned the exception "Request header is too large."

Issue HDP-6133 Account lockout not working as expected for ODBC and OData data access

After an account lockout occurred, OData queries continued to run successfully instead of being rejected.

Issue HDP-6152 Not passing user credentials when using third-party connector

When using a third-party connector where the database credentials are not included in the Hybrid Data Pipeline data source, the user was prompted to enter credentials. In this scenario, Hybrid Data Pipeline returned the error message "user name is missing." (JDBC driver 4.6.1.77)

Issue HDP-6154 Unable to use Azure Database for PostgreSQL as an external database

Hybrid Data Pipeline was unable to support Azure Database for PostgreSQL as an external database because Azure Database for PostgreSQL requires a unique user naming convention.

Issue HDP-6178 Worker thread error when connecting to Azure Synapse serverless instance

When attempting to query an Azure Synapse serverless instance, Hybrid Data Pipeline returned a java.io.IOException worker thread error.

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.1548

June 23, 2023

Preface

These release notes describe enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Enhancements

Snowflake support

Support for connectivity to Snowflake has been added to Hybrid Data Pipeline. Hybrid Data Pipeline can be used to create and manage data sources for JDBC, ODBC, and OData client application access to Snowflake.

Note: Hybrid Data Pipeline does not support FIPS for Snowflake connections. Refer to "FIPS mode" or "Snowflake" in Hybrid Data Pipeline known issues for details.

Changed Behavior

MySQL Community Edition

The MySQL CE data store icon no longer appears by default on the Data Stores page. The icon will only appear if the MySQL Connector/J driver jar has been provided during the deployment process.

Resolved Issues

Issue HDP-7219 ODBC Driver on Windows is not showing the version number

The ODBC driver did not include the version metadata required to display the driver version number in the ODBC Administrator and in the driver library properties. (ODBC driver 4.6.1.177)

Issue HDP-7510 Connection failed when using custom authentication to connect to a REST service

When using custom authentication to connect to a REST service with the Autonomous REST Connector, the connection failed after an initial connection because the Hybrid Data Pipeline server was not properly storing authentication parameters.

Issue HDP-7541 Address the Spring Security vulnerability (CVE-2023-20862)

Hybrid Data Pipeline has been updated to use Spring Security version 5.8.3 to address security vulnerabilities described in CVE-2023-20862. (Hybrid Data Pipeline server 4.6.1.1548, On-Premises Connector 4.6.1.570)

Issue HDP-7545 enable_ssl.sh did not throw an error when an argument was not supplied

When running enable_ssl.sh, the script did not throw an error when a required argument was not supplied.

Issue HDP-7595 SQL Editor query to Azure SQL Data Warehouse failed when using ActiveDirectoryPassword authentication

When using the SQL Editor to query Azure SQL Data Warehouse with ActiveDirectoryPassword authentication, the error message "Catalog view 'dm_exec_sessions' is not supported in this version" was returned.

Issue HDP-7596 Server-side request forgery with Autonomous REST Connector

The Autonomous REST Connector was able to access the local file system of the server hosting Hybrid Data Pipeline.

Issue HDP-7597 No results returned for a query that attempted to use dynamic filtering on a date field

When using the Autonomous REST Connector to connect to a REST service, Hybrid Data Pipeline failed to return results for a query that attempted to use dynamic filtering on a date field.

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.0014

December 17, 2019

Preface

These release notes describe enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Enhancements

Entity case conversion feature

Hybrid Data Pipeline has been enhanced to support case conversions for entity types, entity sets, and properties. The owner of a data source can now change the entity type, entity set, and property names to all uppercase or all lowercase on the OData tab in the Web UI or using the Hybrid Data Pipeline Management API.

Web UI data source sharing feature

The Web UI has been enhanced to support data source sharing. The owner of a data source can now share access to a data store with Hybrid Data Pipeline users and tenants through the Data Sources view in the Web UI.

Web UI IP address whitelist feature

The Web UI has been enhanced to fully support the IP address whitelist feature. Administrators can secure access to Hybrid Data Pipeline resources by implementing IP address whitelists through the Web UI. The Web UI can be used to create IP address whitelists at the system level, tenant level, user level, or some combination of these levels.

Web UI navigation bar

The navigation bar can be expanded to show the names of the views supported in the Web UI. The icons in the navigation bar have been reordered and updated.

PostgreSQL OData Version 4 stored functions

Hybrid Data Pipeline supports exposing stored functions for OData Version 4 connectivity to PostgreSQL data sources. When configuring a PostgreSQL data source, the OData schema map can be configured to expose stored functions.

JDBC and ODBC throttling

A new throttling limit has been introduced in the System Limits view. The XdbcMaxResponse limit can be used to set the approximate maximum size of JDBC and ODBC HTTP result data.

ODBC driver branded installation

The ODBC driver installation program has been enhanced to support branded installations for OEM customers (available in the ODBC driver installer on November 18, 2019). The branded driver can then be distributed with OEM customer client applications. For the Hybrid Data Pipeline ODBC driver distribution guide, visit the Progress DataDirect Product Books page on the Progress PartnerLink website (login required).

Tomcat upgrade

The Hybrid Data Pipeline server and On-Premises Connector have been upgraded to install and use Tomcat 9.0.20. (On-Premises Connector version 4.6.1.7)

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.0372

March 9, 2022

Preface

These release notes describe enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Enhancements

PostgreSQL 14

Hybrid Data Pipeline now supports connectivity to PostgreSQL 14 databases. PostgreSQL 14 can also be used as a system database to store account and configuration information for a Hybrid Data Pipeline instance. This functionality is supported in the following component versions.

  • Hybrid Data Pipeline Server 4.6.1.372 and higher
  • On-Premises Connector 4.6.1.120 and higher

Changed Behavior

PostgreSQL call escape behavior

The default behavior for handling PostgreSQL call escape syntax has changed. Previously, Hybrid Data Pipeline only supported stored functions, and treated the non-standard escape syntax {call function()} the same as the standard escape syntax {? = call function()}. With this latest patch, Hybrid Data Pipeline supports stored functions and stored procedures for JDBC and ODBC connections. Now Hybrid Data Pipeline determines whether a function or procedure is being called based on the call escape syntax. If the return value parameter ?= is used, then the connectivity service calls a stored function. If the return value parameter is not used, then the connectivity service calls a stored procedure. You can change this default behavior by setting the CallEscapeBehavior option as an extended option under the Advanced tab. These are the valid values for the CallEscapeBehavior option:

  • If set to select, treats the object being called by both the {call...} syntax and the {? = call...} syntax as a stored function and makes the applicable native call to the PostgreSQL database.
  • If set to call, treats the object being called by both the {call...} syntax and the {? = call...} syntax as a stored procedure and makes the applicable native call to the PostgreSQL database.
  • If set to callIfNoReturn (the default), the service determines whether to call a function or stored procedure based on the call escape syntax. If the return value parameter ?= is used, the service calls a function. If not, the service calls a stored procedure.
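
For illustration, the two escape forms look like this from a JDBC client, given an open java.sql.Connection con against a PostgreSQL data source; the procedure and function names below are hypothetical:

    // No return-value parameter: under the default callIfNoReturn
    // behavior, the connectivity service issues a stored procedure call.
    try (java.sql.CallableStatement cs = con.prepareCall("{call refresh_totals(?)}")) {
        cs.setInt(1, 2024);
        cs.execute();
    }

    // Return-value parameter "? =": the connectivity service issues a
    // stored function call and binds the result to parameter 1.
    try (java.sql.CallableStatement cs = con.prepareCall("{? = call order_count(?)}")) {
        cs.registerOutParameter(1, java.sql.Types.INTEGER);
        cs.setInt(2, 42);
        cs.execute();
        System.out.println(cs.getInt(1));
    }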

Resolved Issues

Issue HDP-5459 OData $expand query fails against OpenEdge data source

When using the OData $expand functionality to query an OpenEdge data source, the query failed and an error was returned.

Issue HDP-5605 SQL Editor not displaying values when two columns had the same name

When a SQL query included columns of the same name, the SQL Editor did not display the column values.

Issue HDP-5642 SQL Editor not displaying results

The SQL Editor did not display results as expected.

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.3.0 archive

Note: This version of Hybrid Data Pipeline has reached end of life. These release notes are for reference purposes.

Enhancements

Security

LDAP authentication

Hybrid Data Pipeline now supports integration with Active Directory for user authentication using the LDAP protocol. Customers can create an LDAP authentication configuration by providing the server details and can configure users to use LDAP authentication instead of the default internal authentication.

To get started with LDAP authentication, do the following:

  1. Create an Authentication Service of type 3 using the Authentication APIs. Once your authentication service has been created, note the authentication service ID.
  2. Create users tagged to the authentication service ID. There are several ways to create users. Refer to the User guide for details.

Permissions

• Support for a permissions API has been added. The Permissions API enables administrators to manage permissions through the Users, Roles, and DataSource APIs. In addition, the Permissions API allows administrators to create data sources on behalf of users and manage end user access to data source details. Administrators can also specify whether to expose change password functionality and SQL editor functionality in the Web UI.

Password policy

• Support for a password policy has been added.

Tomcat Upgrade

• The Hybrid Data Pipeline server and On-Premises Connector have been upgraded to install and use Tomcat 8.5.28.

Hybrid Data Pipeline Server
  • OData Version 4 functions. Added OData Version 4 function support for IBM DB2 and Microsoft SQL Server data stores. (Note: This functionality was previously added for Oracle Database.) If the data stores contain stored functions, they can be exposed using an OData Version 4 service. As part of OData function support, the OData schema map version has been changed. The Web UI will automatically migrate the existing OData schema map to a newer OData schema map version when the OData schema is modified for OData Version 4 data sources.

    The following aspects of OData Version 4 functions are supported:

    • Functions that are unbound (static operations)
    • Function imports
    • Functions that return primitive types
    • Function invocation with OData system query options $filter

    The following aspects of OData Version 4 functions are currently NOT supported:

    • Functions that return complex types and entities
    • Functions that are bound to entities
    • Built-in functions
    • Functions with OUT/INOUT parameters
    • Overloaded functions
    • OData system query options using $select
    • OData system query options using $orderby
    • Functions that invoke Parameter value
    • Parameter aliases are not supported. Hence, function invocation with function parameters as URL query parameters is not supported.

  • Installation procedures and response file. The installation program workflow has been modified. The Hybrid Data Pipeline service has two default users, "d2cadmin" and "d2cuser". The installer now prompts you to enter passwords for each default user. When generating a response file to perform a silent installation, the installer will not include values for these properties. Hence, you will need to add the passwords manually to the response file before proceeding with a silent installation. Also, note that a password policy is not enforced during the installation process. The installer only ensures that a value has been specified. The following list provides the new settings. The settings differ depending on whether you generate the response file with a GUI or console installation. Further details are available in the Progress DataDirect Hybrid Data Pipeline Installation Guide.
New response file options:

  • D2C_ADMIN_PASSWORD (GUI) / D2C_ADMIN_PASSWORD_CONSOLE (console): Specifies the password for the default administrator.
  • D2C_USER_PASSWORD (GUI) / D2C_USER_PASSWORD_CONSOLE (console): Specifies the password for the default user.

Web UI

• Product Information: If you are using the evaluation version of the product, the Web UI now displays evaluation timeout information as 'xx Days Remaining'.
• Version Information: The product version information now includes details about the license type. This can be seen under the version information section of the Web UI. The license type is also returned when you query for version information via the version API.

Beta support for third-party JDBC drivers

• With the 4.3 release, Hybrid Data Pipeline enables users to plug JDBC drivers into Hybrid Data Pipeline and access data using those drivers. This beta feature supports access via JDBC, ODBC, and OData clients with the Teradata JDBC driver. If you are interested in setting up this feature as you evaluate Hybrid Data Pipeline, please contact our sales department.

Apache Hive

Enhancements

• Enhanced to optimize the performance of fetches.

• Enhanced to support the Binary, Char, Date, Decimal, and Varchar data types.

• Enhanced to support HTTP mode, which allows you to access Apache Hive data sources using HTTP/HTTPS requests. HTTP mode can be configured using the new Transport Mode and HTTP Path parameters.

• Enhanced to support cookie-based authentication for HTTP connections. Cookie-based authentication can be configured using the new Enable Cookie Authentication and Cookie Name parameters.

• Enhanced to support Apache Knox.

• Enhanced to support Impersonation and Trusted Impersonation using the Impersonate User parameter.

• The Batch Mechanism parameter has been added. When Batch Mechanism is set to multiRowInsert, the driver executes a single insert for all the rows contained in a parameter array. MultiRowInsert is the default setting and provides substantial performance gains when performing batch inserts.

• The Catalog Mode parameter allows you to determine whether the native catalog functions are used to retrieve information returned by DatabaseMetaData functions. In the default setting, Hybrid Data Pipeline employs a balance of native functions and driver-discovered information for the optimal balance of performance and accuracy when retrieving catalog information.

• The Array Fetch Size parameter improves performance and reduces out of memory errors. Array Fetch Size can be used to increase throughput or, alternately, improve response time in Web-based applications.

• The Array Insert Size parameter provides a workaround for memory and server issues that can sometimes occur when inserting a large number of rows that contain large values.

Certifications

• Certified with Hive 2.0.x, 2.1.x

• Apache Hive data store connectivity has been certified with the following distributions:

• Cloudera (CDH) 5.4, 5.5, 5.6, 5.7, 5.8, 5.9, 5.10, 5.11, 5.12

• Hortonworks (HDP) 2.3, 2.4, 2.5

• IBM BigInsights 4.1, 4.2, 4.3

• MapR 5.2

 

Version and distribution support

• Hive versions 1.0 and higher are supported. Support for earlier versions has been deprecated.

• The HiveServer2 protocol and higher is supported. As a result:

• Support for the HiveServer1 protocol has been deprecated.

• The Wire Protocol Version parameter has been deprecated.

• Support has been deprecated for the following distributions:

• Amazon Elastic MapReduce (Amazon EMR) 2.1.4, 2.24-3.1.4, 3.2-3.7

• Cloudera's Distribution Including Apache Hadoop (CDH) 4.0, 4.1, 4.2, 4.5, 5.0, 5.1, 5.2, 5.3

• Hortonworks (HDP), versions 1.3, 2.0, 2.1, 2.2

• IBM BigInsights 3.0

• MapR Distribution for Apache Hadoop 1.2, 2.0

• Pivotal Enterprise HD 2.0.1, 2.1

IBM DB2

Certifications

• Certified with DB2 V12 for z/OS

• Certified with dashDB (IBM Db2 Warehouse on Cloud)

Oracle Marketing Cloud (Oracle Eloqua)

Data type support. The following data types are supported for the Oracle Eloqua data store.

• BOOLEAN

• DECIMAL

• INTEGER

• LONG

• LONGSTRING

• STRING

Oracle Sales Cloud

Data type support. The following data types are supported for the Oracle Sales Cloud data store.

• ARRAY

• BOOLEAN

• DATETIME

• DECIMAL

• DURATION

• INTEGER

• LARGETEXT

• LONG

• TEXT

• URL

 

 

4.6.1.1854

September 12, 2023

Preface

These release notes describe enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Enhancements

Google Analytics 4 support

Support for connections to Google Analytics 4 (GA4) has been added to Hybrid Data Pipeline. Hybrid Data Pipeline can be used to create and manage data sources for JDBC, ODBC, and OData client application access to GA4. Refer to [GA4] Introducing the next generation of Analytics, Google Analytics 4 in Google's Analytics Help for information on GA4 and the retirement of Universal Analytics (also referred to as Google Analytics 3 or GA3). Refer to Google Analytics 4 parameters for details. (On-Premises Connector 4.6.1.676)

MongoDB support (including MongoDB Atlas and Azure CosmosDB for MongoDB)

Hybrid Data Pipeline now supports access to MongoDB and MongoDB-type data stores, such as MongoDB Atlas and Azure CosmosDB for MongoDB. Hybrid Data Pipeline can be used to create and manage data sources for JDBC, ODBC, and OData client application access to MongoDB and MongoDB-type data stores. Refer to MongoDB parameters for details. (On-Premises Connector 4.6.1.676)

Note:

  • In this release, Hybrid Data Pipeline does not support MongoDB-type data stores in FIPS environments.
  • The Kerberos authentication method is not supported for MongoDB in Hybrid Data Pipeline.

SAP HANA support

Hybrid Data Pipeline now supports access to SAP HANA data stores. Hybrid Data Pipeline can be used to create and manage data sources for JDBC, ODBC, and OData client application access to SAP HANA. Refer to SAP HANA parameters for details. (On-Premises Connector 4.6.1.676)

Note: In this release, Hybrid Data Pipeline does not support SAP HANA in FIPS environments.

SAP S/4HANA support (including SAP BW/4HANA and SAP NetWeaver)

Hybrid Data Pipeline now supports access to SAP S/4HANA and S/4HANA-type data stores, such as SAP BW/4HANA and SAP NetWeaver. Hybrid Data Pipeline can be used to create and manage data sources for JDBC, ODBC, and OData client application access to SAP S/4HANA and S/4HANA-type data stores. Refer to SAP S/4HANA parameters for details. (On-Premises Connector 4.6.1.676)

Note: The HTTP Header authentication method is not supported for SAP S/4HANA, SAP BW/4HANA, and SAP NetWeaver in Hybrid Data Pipeline.

OpenSSL 3.0 support

The ODBC driver has been updated to use OpenSSL 3.0 to implement TLS protocols for data encryption between client applications and Hybrid Data Pipeline. In addition, the driver supports the Federal Information Processing Standard (FIPS) 140-2 requirements for cryptographic modules. (ODBC driver 4.6.1.239)

Changed Behavior

OpenSSL 3.0 support

The ODBC driver has been updated to use OpenSSL 3.0 to implement TLS protocols for data encryption between client applications and Hybrid Data Pipeline. This enhancement allows the driver to support the Federal Information Processing Standard (FIPS) 140-2 requirements for cryptographic modules. To support OpenSSL 3.0 and FIPS, the Crypto Protocol Version and Enable FIPS connection options have been added to the driver. (ODBC driver 4.6.1.239)
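
As an illustration, these options would appear as key/value pairs in an odbc.ini data source entry; the key spellings below (CryptoProtocolVersion, EnableFIPS) are assumptions based on common DataDirect naming, so verify them against the driver documentation:

    [MyHDPDataSource]
    # Assumed key names and values; confirm against the ODBC driver documentation.
    CryptoProtocolVersion=TLSv1.2
    EnableFIPS=1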

SQL Editor

Previously, when an end user created and saved a Hybrid Data Pipeline data source without providing authentication credentials, the user would be prompted for credentials when using the SQL editor to query the data source. This is no longer the case. Now, when an end user attempts to use the SQL editor to query a data source for which credentials have not been saved, Hybrid Data Pipeline returns the error "INVALID_LOGIN: Invalid username, password, security token; or user locked out."

Resolved Issues

Issue HDP-7621 Address Apache Tomcat vulnerability CVE-2023-28709

Hybrid Data Pipeline has been updated to install and use Tomcat 9.0.75. This update addresses the security vulnerability in Tomcat 9.0.73 as described in CVE-2023-28709. (On-Premises Connector 4.6.1.676)

Issue HDP-7228 Values in the odbc.ini template not correct

After the installation of the ODBC driver on Linux, the default values in the odbc.ini template installed with the driver did not match the values in the hybridDefaults.properties file. (ODBC driver 4.6.1.239)

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.0023

May 19, 2020

Preface

These release notes describe enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Resolved Issues

Issue HDP-4490 ODataApplicationException returned when filtering on BIT/BOOLEAN field

When exposing a Microsoft SQL Server table via OData Version 4 and filtering on a BIT/BOOLEAN field, Hybrid Data Pipeline returned the ODataApplicationException "An expression of non-boolean type specified in a context where a condition is expected, near ')'".

Issue HDP-4480 Shutdown script not working

With some shell configurations, the Hybrid Data Pipeline shutdown script stop.sh was not shutting down Hybrid Data Pipeline server processes.

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.0558

July 6, 2022

Preface

These release notes describe enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Enhancements

Support for Google Analytics, Google BigQuery, Salesforce, and REST data store implementations of OAuth 2.0

Hybrid Data Pipeline has been enhanced to support Google Analytics, Google BigQuery, Salesforce, and REST data store implementations of OAuth 2.0. To integrate Hybrid Data Pipeline with an OAuth 2.0 authorization flow, Hybrid Data Pipeline must be registered as a client application with the given data store. Then, OAuth application and profile objects must be created to manage OAuth endpoints, properties, and tokens. For details, refer to Integrating Hybrid Data Pipeline as a client application with a data store OAuth 2.0 authorization flow. (On-Premises Connector 4.6.1.241)

Changed Behavior

Google Analytics OAuth 2.0 implementation

The procedures for integrating Hybrid Data Pipeline as a client application to enable access to Google Analytics include the ability to select or create an OAuth application in the Web UI. For details, refer to Google Analytics parameters.

Resolved Issues

Issue HDP-5804 Selecting a data source in the SQL Editor resulted in "No suitable driver found" error

When selecting a data source from the dropdown in the SQL Editor, the server returned the error "No suitable driver found."

Issue HDP-5805 Error on datetime column when using OData to connect to MySQL Community Edition

When performing an insert on an OData-enabled MySQL Community Edition data source, Hybrid Data Pipeline returned an error on a datetime column.

Issue HDP-5836 NPE after receiving a merge request with an empty payload

Performing a mergeEntity operation against an OData-enabled MySQL Community Edition data source resulted in a NullPointerException.

Issue HDP-5881 Unable to configure server-side SSL between HDP nodes

Server-side SSL could not be configured because the enable_ssl.sh script was not properly setting the truststore information from the Web UI.

Issue HDP-5924 Update the context.xml file to disable Session persistence in Tomcat

To mitigate the CVE-2022-23181 security vulnerability, the Tomcat context.xml file has been modified such that session persistence is disabled by default.

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.0092

December 15, 2020

Preface

These release notes describe enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Enhancements

Aggregation support

Hybrid Data Pipeline has added support for a subset of the functionality defined by the OData Version 4 extension for data aggregation. Aggregation functionality is extended with the $apply query parameter. See Aggregation support in the user's guide for details.
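
For example, a client can group and aggregate rows server-side with $apply. A hypothetical query against an OData-enabled data source follows, where the service root, entity set, and property names are illustrative:

    GET https://myserver:port/api/odata4/MyDataSource/Orders?$apply=groupby((Category),aggregate(Amount with sum as Total))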

Windows Server 2019

The Hybrid Data Pipeline On-Premises Connector, ODBC driver, and JDBC driver have added support for Windows Server 2019. (On-Premises Connector version 4.6.1.48, ODBC driver 4.6.1.12, JDBC driver 4.6.1.9)

Resolved Issues

Issue HDP-4478 Unable to connect using TNS connection option for Oracle

The Hybrid Data Pipeline server required the specification of the Server Name parameter, even though Server Name is not required for a TNS connection. In addition, when Server Name was specified, the server returned an inaccurate error message.

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.2.1 archive

Note: This version of Hybrid Data Pipeline has reached end of life. These release notes are for reference purposes.

Changes Since Release 4.2.1

Enhancements

Change password functionality
  • Hybrid Data Pipeline change password functionality has been enhanced. When changing passwords, users must now provide a current password as well as a new password by default. The Administrator's API has been modified to support this functional change. The changepassword API now includes the currentPassword parameter, as well as the newPassword parameter, in the payload.
       {
       "currentPassword": "<mycurrentpassword>",
       "newPassword": "<mynewpassword>"
       }
    Administrators can fall back to the old functionality by setting the new secureChangePassword attribute (configuration ID 2) through the configurations API. For example, the following PUT operation would configure the system to use the old functionality, where the user must provide only a new password.
       https://myserver:port/api/admin/configurations/2
       {
       "value": "false"
       }

Resolved Issues

  • 4.2.1.59. Issue 83987. Resolved an issue where editing of the OData schema map resulted in the addition of "entityNameMode":"pluralize" when a data source had been configured with OData Version 4, the OData schema map version was odata_mapping_v3, and entityNameMode had not been included.
  • 4.2.1.59. Issue 84061. Resolved issues where the Web UI was not displaying function synonyms for read-only users and where the Web UI duplicated function parameters when synonyms were created for read-only users.
  • 4.2.1.59. Issue 84480. Resolved an issue where the data access service, when configured with a delimiter for external authentication, required the user login to contain the user name, a delimiter, and the identifier for the internal authentication service, for any users authenticating with the internal authentication service. For example, if the data access service was configured with the @ character as the delimiter, then authenticating as an internal user might look like user1@Internal. Now the user login only needs to contain the user name for any users authenticating with the internal authentication service, for example, user1. When only the user name is provided, the data access service uses the internal authentication service to authenticate the user.
  • 4.2.1.59. Issue 84496. Resolved an issue where the data access server was not running in FIPS approved mode when FIPS was enabled. The Bouncy Castle BCFIPS security provider now ensures that the data access service is running in FIPS approved mode.
    When the data access and notification services start, they check to see if they are running in FIPS approved mode. You can confirm that the services are running in FIPS approved mode by checking their corresponding log files: das/server/logs/catalina.out and notification/logs/palatte/datestamp-SYSTEM.log. With result=true, the log entry confirms that the service is running in FIPS approved mode:
    Check for BouncyCastle Approved Only Mode [result=true]
    NOTE: Because the installer program is not capable of regenerating encryption keys for existing users and data sources, we currently recommend a new, clean installation of Hybrid Data Pipeline with FIPS enabled when upgrading from a non-FIPS-compliant server to a FIPS-compliant server. With a new installation, users and data sources must be re-created.
  • 4.2.1.59. Issue 84499. Resolved an issue where a log file was created for each external user when the data access service is configured to use an external authentication service. The data access service now produces a single log file for each internal user and data source with logging details for each external user associated with that internal user and data source.
  • 4.2.1.59. Issue 84527. Resolved an issue where the database host name and port numbers were included in an error message when a query was made against the data access service with the database down.

4.2.1 Release Notes

Security

FIPS compliance
  • Hybrid Data Pipeline is now FIPS 140-2 compliant. By default, Hybrid Data Pipeline is installed with FIPS disabled. We recommend a new, clean installation with FIPS enabled for production environments. With a new installation, users and data sources must be re-created. For information on how to enable FIPS, refer to the Progress DataDirect Hybrid Data Pipeline Installation Guide.

    Note: The On-Premises Connector is not currently FIPS compliant. Therefore, any connections made to an on-premises data source through an On-Premises Connector will not be fully FIPS compliant.

Support for external authentication
  • Hybrid Data Pipeline now supports two types of authentication: the Hybrid Data Pipeline internal authentication mechanism and external authentication. The external authentication feature is supported as a Java plugin. Administrators can create their own implementation and plug it into Hybrid Data Pipeline either at installation time or later. After external authentication is set up, administrators can use APIs to configure users to authenticate against an external authentication system. Optionally, multiple external authentication users can be mapped to a single Hybrid Data Pipeline user to gain access to data sources.
Tomcat Upgrade
  • The Hybrid Data Pipeline server and On-Premises Connector have been upgraded to install and use Tomcat 8.5.23.

Enhancements

Hybrid Data Pipeline Server
  • OData Version 4 functions. With 4.2.1, Hybrid Data Pipeline supports OData Version 4 functions for Oracle data sources only. If the Oracle database contains stored functions, they can be exposed using an OData Version 4 service. As part of OData function support, the OData schema map version has been changed. The Web UI automatically migrates the existing OData schema map to the newer OData schema map version when the OData schema is modified for OData Version 4 data sources.

    The following aspects of OData Version 4 functions are supported:

    • Functions that are unbound (static operations)
    • Function imports
    • Functions that return primitive types
     • Function invocation with the OData system query option $filter

    The following aspects of OData Version 4 functions are currently NOT supported:

    • Functions that return complex types and entities
    • Functions that are bound to entities
    • Built-in functions
    • Functions with OUT/INOUT parameters
    • Overloaded functions
    • OData system query options using $select
    • OData system query options using $orderby
    • Functions that invoke Parameter value
     • Parameter aliases; consequently, function invocation with function parameters passed as URL query parameters is not supported
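     For illustration, invoking an unbound function exposed through a function import, directly and within a $filter expression, might look like the following. The function name, parameter, and entity set are hypothetical; substitute the OData service root exposed by your data source.
        <OData service root>/MyFunction(Param1=10)
        <OData service root>/Orders?$filter=Total gt MyFunction(Param1=10)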
  • Log files cleanup. Hybrid Data Pipeline now enables you to configure the number of days for which log files must be stored. This is to prevent log files from completely filling up your directories. You can use the Limits API to specify the number of days for log file retention.
    • Support for Ubuntu. Hybrid Data Pipeline Server now supports Ubuntu Linux version 16 and higher.
     • Installation procedures and response file. The installation procedures have been modified with the introduction of support for FIPS and external authentication. New prompts have been added to the installation process. One of these prompts has a corresponding option that appears in the response file generated by the latest installer for silent installation. If you are using a response file generated by an earlier version of the installer, you should regenerate the response file with the latest installer. The new response file should then be used for silent installations. The following list provides the new setting. The first name is the name of the response file option generated by the GUI installer; the second is the name generated by the console mode installer.
     • D2C_USING_FIPS_CONFIG | D2C_USING_FIPS_CONFIG_CONSOLE - Specifies whether you want to configure the server to be FIPS-compliant.
     

    4.6.1.0020

    April 30, 2020

    Preface

    These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

    The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

    The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

    The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

    java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

    For the ODBC driver, see Driver version string for details on obtaining the driver version number.

    Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

    Resolved Issues

    Issue HDP-4465 Parenthesis in OData query not honored

    When translating an OData Version 4 query to a SQL query, Hybrid Data Pipeline did not honor the parentheses in the OData query. This reordered operator precedence and led to incorrect results. Parentheses in OData queries are now reproduced in SQL queries to maintain operator precedence.
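    For illustration (with hypothetical column names), the grouping in an OData filter such as
       $filter=(Region eq 'East' or Region eq 'West') and Amount gt 100
    is now preserved in the generated SQL:
       WHERE (Region = 'East' OR Region = 'West') AND Amount > 100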

    Issue HDP-4464 Intermittent error "origin=driver message=PAYLOAD_TYPE_BINARY != opcode, was 10" (On-Premises Connector version 4.6.1.8)

    When connecting to an on-premises data source using the On-Premises Connector, the error origin=driver message=PAYLOAD_TYPE_BINARY != opcode, was 10 was returned. The On-Premises Connector now correctly handles pongs sent from load balancers according to the WebSocket protocol.

    Known Issues

    See Hybrid Data Pipeline known issues for details.

     

    4.6.1.0296

    December 6, 2021

    Preface

    These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

    The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

    The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

    The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

    java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

    For the ODBC driver, see Driver version string for details on obtaining the driver version number.

    Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

    Enhancements

    Docker image

    A production instance of the Hybrid Data Pipeline server can now be deployed using a Docker image. The Hybrid Data Pipeline Docker image is available in the Hybrid Data Pipeline Docker Deployment Package. In addition, the Docker Deployment Package includes demos for a number of deployment scenarios. For details and instructions, see Deploying Hybrid Data Pipeline using Docker in the installation guide.

    OpenID Connect (OIDC) support

    Hybrid Data Pipeline now supports user authentication using the OIDC protocol. An identity provider and client applications can be configured to authorize users and grant access to the OData endpoints of the Hybrid Data Pipeline server. See Integrating an OIDC authentication service in the user's guide for details.

    Resolved Issues

    Issue HDP-5395 Third-party JDBC Oracle driver integration does not return tables

    When using the third-party JDBC Oracle driver, the Hybrid Data Pipeline SQL Editor did not return tables.

    Issue HDP-5433 Unable to authenticate when special character '+' (plus sign) in account password

    When the special character '+' (plus sign) was used in an account password, the user was unable to authenticate with the Hybrid Data Pipeline server.

    Issue HDP-5461 Unable to access Oracle Cloud Financials

    Hybrid Data Pipeline was unable to access Oracle Cloud Financials REST Endpoints with the Autonomous REST Connector.

    Known Issues

    See Hybrid Data Pipeline known issues for details.

     

    4.6.1.0032

    July 16, 2020

    Preface

    These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

    The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

    The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

    The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

    java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

    For the ODBC driver, see Driver version string for details on obtaining the driver version number.

    Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

    Resolved Issues

    Issue HDP-4534 Unable to connect to Google Analytics data source

    When connecting to a Google Analytics data source using an OAuth profile created in a previous version of Hybrid Data Pipeline, the following error was returned: There is a problem connecting to the data source. Error getting data source. System not available, try again later.

    Known Issues

    See Hybrid Data Pipeline known issues for details.

     

    4.6.1.0306

    December 15, 2021

    Preface

    These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

    The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

    The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

    The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

    java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

    For the ODBC driver, see Driver version string for details on obtaining the driver version number.

    Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

    Resolved Issues

    Issue HDP-5560 Resolved Log4j security vulnerability

    Hybrid Data Pipeline has been updated to use Log4j version 2.15 to address the security vulnerability in Log4j version 2.13.3, as described in CVE-2021-44228. (Hybrid Data Pipeline server 4.6.1.306, On-Premises Connector version 4.6.1.85)

    Known Issues

    See Hybrid Data Pipeline known issues for details.

     

    4.1.0 archive

    Note: This version of Hybrid Data Pipeline has reached end of life. These release notes are for reference purposes.

    Changes Since Release 4.1.0

    Enhancements

    Hybrid Data Pipeline server
    • Account Lockout Policy (Limits API). Support has been added for implementing an account lockout policy. An account lockout policy allows the administrator to set the number of consecutive failed authentication attempts that result in a user account being locked, as well as the lockout period and the duration of time that failed attempts are counted. When a lockout occurs, the user is unable to authenticate until the specified period of time has passed or until the administrator unlocks the account.
     • Configurable CORS Behavior (Limits API). Support has been added for disabling the cross-origin resource sharing (CORS) filter in environments that do not require it. Because Hybrid Data Pipeline does not currently support filtering of cross-origin requests, disabling the CORS filter can provide added security against cross-site request forgery attacks.
    Apache Hive
    • Certified with Apache Hive 2.0 and 2.1.
    IBM DB2
    • Certified with DB2 for i 7.3
    Oracle Database
    • Certified with Oracle 12c R2 (12.2).

    Resolved Issues

    Hybrid Data Pipeline server
    • Version 4.1.0.44. Bug 71841. Resolved an issue where the Hybrid Data Pipeline server failed to honor the START_ON_INSTALL environment variable to stop and start Tomcat services.
    • Version 4.1.0.44. Resolved an issue where the installer accepted an SSL certificate only in the PEM file format during the installation of the server for a cluster environment. The installer now accepts the SSL certificate (root certificate) in PEM, DER, or base64 encodings for a cluster installation.
    • Version 4.1.0.44. Resolved an issue where an SSL certificate was required for a cluster installation. An SSL certificate is no longer required for a cluster installation.
    • Version 4.1.0.44. Resolved an issue that prevented the installer from supporting a number of upgrade scenarios.
    JDBC driver
    • Version 4.1.0.7. Resolved an issue where the JDBC driver was not connecting to the Hybrid Data Pipeline server by default when running on a UNIX/Linux system.

    4.1.0 Release Notes

    Security

    OpenSSL
    • The default OpenSSL library has been updated to 1.0.2k, which fixes the following security vulnerabilities.
      • Truncated packet could crash via OOB read (CVE-2017-3731)
      • BN_mod_exp may produce incorrect results on x86_64 (CVE-2017-3732)
      • Montgomery multiplication may produce incorrect results (CVE-2016-7055)

      OpenSSL 1.0.2k addresses vulnerabilities resolved by earlier versions of the library. For more information on OpenSSL vulnerabilities resolved by this upgrade, refer to OpenSSL announcements.

    SSL Enabled Data Stores
     • The default value for Crypto Protocol Version has been updated to TLSv1, TLSv1.1, TLSv1.2 for data stores that support the option. This change improves the security of the connectivity service by employing only the most secure cryptographic protocols as the default behavior. At connection, the connectivity service attempts to use the most secure protocol first, TLS 1.2, and then falls back to TLS 1.1 and TLS 1.0.
    On-Premises Connector
    • The On-Premises Connector has been enhanced to resolve a security vulnerability. We strongly recommend upgrading to the latest version to take advantage of this fix.
    Apache Hive Data Store
    • Hybrid Data Pipeline now supports SSL for Apache Hive data stores running Apache Hive 0.13.0 or higher.
    SQL Server Data Store
    • Support for NTLMv2 authentication has been added for the SQL Server data store. NTLMv2 authentication can be specified in the Authentication Method field under the Security tab.

    Enhancements

    Hybrid Data Pipeline server
    • Hybrid Data Pipeline Cluster. To support scalability, the Hybrid Data Pipeline service can be deployed on multiple nodes behind a load balancer. Incoming requests can be evenly distributed across cluster nodes. SSL communication is supported if the load balancer supports SSL termination. Session affinity is supported to bind a client query to a single node for improved performance. (Session affinity must be enabled in the load balancer to support the Web UI and ODBC and JDBC clients.) HTTP health checks are supported via the Health Check API.
    • MySQL Community Edition Data Store. Support for MySQL Community Edition has been added to Hybrid Data Pipeline. During installation of the Hybrid Data Pipeline server and the On-Premises Connector, you provide the location of the MySQL Connector/J driver. After installation, you may then configure data sources that connect to a MySQL Community Edition data store and execute queries with ODBC, JDBC, and OData applications.
    • MySQL Community Edition System Database. Support for MySQL Community Edition as an external system database has been added. During the installation process, you are prompted to select either an internal database or an external database to store system information necessary for the operation of Hybrid Data Pipeline. With this enhancement, you can choose either Oracle or MySQL Community Edition as an external database.
    • Installation Procedures and Response File. The installation procedures have been modified with the introduction of support for the Hybrid Data Pipeline cluster, the MySQL Community Edition data store, and the MySQL Community Edition system database. New prompts have been added to the installation process. Several of these prompts have corresponding settings that must be used in the response file for silent installation of the server. If you are performing silent installations of the server, your response file must be modified accordingly. The following list provides the new settings. The settings may differ depending on whether you generate the response file with a GUI or console installation.
      Note: Values for the SKIP_HOSTNAME_VALIDATION and SKIP_PORT_VALIDATION options have been changed from false | true to 0 | 1. These options have the same name in GUI-generated and console-generated response files.
      Note: Values for the SKIP_LB_HOSTNAME_VALIDATION option are currently 0 for disable and true for enable. In a future release, the values will be 0 for disable and 1 for enable. This option has the same name in GUI-generated and console-generated response files.
      New response file options. The first name in the list is the name of the response file option generated by the GUI installer. The second name in the list is the name generated by the console mode installer. (If only one value is provided, there is no corresponding value for console mode.)
      • USING_LOAD_BALANCING_YES | D2C_USING_LOAD_BALANCING_CONSOLE - Specifies whether you are installing the service on a node behind a load balancer.
      • LOAD_BALANCING_HOST_NAME | LOAD_BALANCING_HOST_NAME_CONSOLE - Specifies the hostname of the load balancer appliance or the machine hosting the load balancer service.
      • USING_LOAD_BALANCING_NO - Specifies whether you are installing the service on a node behind a load balancer. For console installation, only D2C_USING_LOAD_BALANCING_CONSOLE is used.
      • SKIP_LB_HOSTNAME_VALIDATION | SKIP_LB_HOSTNAME_VALIDATION - Specifies whether the installer should validate the load balancer hostname during the installation of a node.
      • D2C_CERT_FILE | D2C_CERT_FILE_CONSOLE - Specifies the fully qualified path of the Certificate Authority certificate that signed the load balancer server certificate. This certificate is used to create the trust store used by ODBC and JDBC clients.
      • D2C_DB_MYSQL_COMMUNITY_SUPPORT_YES | D2C_DB_MYSQL_COMMUNITY_SUPPORT_CONSOLE - Specifies whether the service will support MySQL Community Edition data store.
      • D2C_DB_MYSQL_JAR_PATH | D2C_DB_MYSQL_JAR_PATH_CONSOLE - Specifies the fully qualified path of the MySQL Connector/J jar file to support a MySQL Community Edition data store.
      • D2C_DB_MYSQL_COMMUNITY_SUPPORT_NO - Specifies whether the service will support MySQL Community Edition data store. For console installation, only D2C_DB_MYSQL_COMMUNITY_SUPPORT_CONSOLE is used.
      • D2C_DB_VENDOR_MYSQL - Specifies whether a MySQL Community Edition database will be used as the external system database. For console mode installations, D2C_DB_VENDOR_CONSOLE is used to specify an Oracle or MySQL Community Edition external system database.
      • D2C_DB_PORT_MYSQL - Specifies the port number of the MySQL Community Edition external database. For console mode installations, D2C_DB_PORT_CONSOLE is used to specify the port of either an Oracle or MySQL Community Edition external system database.
      • USER_INPUT_KEY_LOCATION | USER_INPUT_KEY_LOCATION_CONSOLE - Specifies the fully qualified path of the encryption key to be shared by the nodes in a cluster environment.
    • Throttling (Limits API). Support for throttling to prevent a user or group of users from adversely impacting the performance of the connectivity service has been added. The Limits API allows administrators to set limits on how many rows can be returned for ODBC, JDBC, and OData requests. An error is returned if an application fetches rows beyond the specified limit.
    • Refresh Map. The new refresh map button has been added to the Mapping tab. This button allows you to refresh the map without connecting to the data store. This feature is useful when you are in the process of developing your application and you have made changes to the objects in your backend data store. Pressing this button forces the data store to rebuild the map allowing the new objects to show up in the relational map the next time your application connects to the data source. (The map can also be refreshed with a Management API call or when establishing a connection.)
    • SQL Editor. The SQL editor in the SQL Testing view has been upgraded. The functionality of the new editor is similar to that of the previous editor. However, the history panel is not currently supported with the new editor.
    • OpenAccess Server. The OpenAccess server component has been deprecated. The OpenAccess server is no longer required to connect with Oracle Eloqua.
    On-Premises Connector
    • Upgraded to use Tomcat 8.0.41
    • Upgraded to use Java SE 8
    • Support for Windows Server 2003 has been deprecated
    Hybrid Data Pipeline ODBC Driver
    • Certified with CentOS Linux 4.x, 5.x, 6.x, and 7.x
    • Certified with Debian Linux 7.11, 8.5
    • Certified with Oracle Linux 4.x, 5.x, 6.x, and 7.x
    • Certified with Ubuntu Linux 14.04, 16.04
    • Support for Windows Server 2003 has been deprecated
    Apache Hive
    • Added SSL support for Apache Hive 0.13.0 and higher
    • Certified with Apache Hive 0.13, 0.14, 1.0, 1.1, 1.2
    • Certified with Amazon (AMI) 3.2, 3.3.1, 3.7
    • Certified with Cloudera (CDH) 5.0, 5.1, 5.2, 5.3, 5.4, 5.6, 5.7
    • Certified with Hortonworks (HDP) 2.1, 2.2
    • Certified with IBM BigInsights 4.1
    • Certified with Pivotal HD (PHD) 2.1
    Greenplum
    • Made generally available
    • Certified with Greenplum 4.3
    • Certified with Pivotal HAWQ 1.2, 2.0
    IBM DB2
    • Certified with IBM DB2 V11.1 for LUW
    • Certified with DB2 for i 7.2
    Informix
    • Made generally available
    • Certified with Informix 12.10
    • Certified with Informix 11.7, 11.5, 11.0
    • Certified with Informix 10.0
    • Certified with Informix 9.4, 9.3, 9.2
    Oracle Marketing Cloud (Oracle Eloqua)

    The Oracle Marketing Cloud data store provides access to Oracle Eloqua. Improved features and functionality for this data store are available with this Hybrid Data Pipeline release.

    • Write Access
      • Support for INSERT/UPDATE/DELETE operations on CONTACT, ACCOUNT and CustomObjects_XXX
    • Bulk Calls
      • Performance improvement for bulk calls
      • Supports fetching more than 5 million records
      • Supports fetching up to 250 columns for bulk calls
      • Supports pushing OR operators for bulk calls (This does not apply to Activities)
    • REST Calls
      • Some queries with OR and AND operators have been optimized.
    • Metadata
      • The data store now uses null as the catalog name. Previously, ECATALOG was used as the catalog name.
      • The current version of the data store maps columns with integer data to type INTEGER. The previous version mapped the integer type to string.
    • In contrast to the previous version, the current version of the data store cannot split OR queries and push them separately to Oracle Eloqua APIs. Therefore, compared to the previous version, the current version may take longer to return results involving OR queries.
    • The previous version of the data store used the ActivityID field as the primary key for Activity_EmailXXX objects, such as Activity_EmailOpen, Activity_EmailClickthrough, and Activity_EmailSend. In contrast, the current version of the data store uses the ExternalID field as the primary key instead of ActivityID.
    PostgreSQL
    • Certified with PostgreSQL 9.3, 9.4, 9.5, 9.6
    Progress OpenEdge
    • Certified with Progress OpenEdge 11.4, 11.5, 11.6
    Salesforce
    • Certified with Salesforce API 38
    SAP Sybase ASE
    • Made generally available
    • Certified with SAP Adaptive Server Enterprise 16.0
    SQL Server
    • Added support for NTLMv2 authentication. NTLMv2 authentication can be specified in the Authentication Method field under the Security tab.
    • Certified with Microsoft SQL Server 2016

    Resolved Issues

    Web UI
    • Resolved an issue where the SQL editor in the SQL Testing view returned errors when executing SQL commands against Google Analytics data sources
    OData
    • Resolved an issue where OData requests were timing out before application could finish retrieving the results
    Hybrid Data Pipeline Management API
    • Resolved an issue where a 201 was returned when adding members to a group data source through the Management API
    • Resolved an issue where a normal user would receive a 400 instead of a 404 error when using the user query parameter to Management API calls
    • Resolved an issue where user creation API allowed invalid values for the status field
    DB2
    • Resolved an issue where the error "Numeric value out of range" occurs when calling SQLStatistics in DB2 with the ODBC driver
    Google Analytics
    • Resolved an issue where the SQL editor in the SQL Testing view returned errors when executing SQL commands against Google Analytics data sources
     

    4.6.1.0256

    October 13, 2021

    Preface

    These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

    The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

    The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

    The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

    java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

    For the ODBC driver, see Driver version string for details on obtaining the driver version number.

    Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

    Enhancements

    Stored procedures

    Hybrid Data Pipeline now supports invoking stored procedures for JDBC and ODBC connections. Stored procedures functionality includes support for input parameters, output parameters, and in/out parameters. Stored procedures that return multiple results are also supported. This functionality is supported in the following component versions.

    • Hybrid Data Pipeline Server 4.6.1.256 and higher
    • On-Premises Connector 4.6.1.73 and higher
    • JDBC Driver 4.6.1.23 and higher
    • ODBC Driver 4.6.1.31 and higher
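    For illustration, a JDBC client can invoke a stored procedure through the standard CallableStatement API. The connection URL options and the procedure below are hypothetical; consult the JDBC driver documentation for the exact URL format.

       // Hypothetical sketch: calling a stored procedure with an IN and an OUT parameter.
       import java.sql.CallableStatement;
       import java.sql.Connection;
       import java.sql.DriverManager;
       import java.sql.Types;

       public class StoredProcSketch {
           public static void main(String[] args) throws Exception {
               try (Connection con = DriverManager.getConnection(
                       "jdbc:datadirect:ddhybrid://myserver:8443;hybridDataPipelineDataSource=MyDataSource",
                       "user", "password");
                    CallableStatement cs = con.prepareCall("{call GET_ORDER_TOTAL(?, ?)}")) {
                   cs.setInt(1, 1001);                        // IN parameter
                   cs.registerOutParameter(2, Types.DECIMAL); // OUT parameter
                   cs.execute();
                   System.out.println("Total: " + cs.getBigDecimal(2));
               }
           }
       }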

    Resolved Issues

    Issue HDP-5020 Error message did not state reason that the required secure random instance could not be created when enabling FIPS

    When attempting to enable FIPS, the error message did not state that the required secure random instance could not be created because there was not enough entropy on the host machine.

    Issue HDP-5064 JDBC driver not able to follow redirects (JDBC driver 4.6.1.23)

    When an HTTP redirect status was returned, the driver was unable to follow the redirection and returned an error that the HTTP endpoint had been relocated. To resolve this issue, the FollowRedirects connection property has been introduced. When FollowRedirects is enabled, the driver can follow an HTTP redirection instead of returning an error. For details, refer to FollowRedirects.
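    For illustration, the property might be enabled by appending it to the JDBC connection URL; the URL format and data source name below are assumptions for illustration.
       jdbc:datadirect:ddhybrid://myserver:8443;hybridDataPipelineDataSource=MyDataSource;FollowRedirects=true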

    Issue HDP-5412 "Unexpected end of stream in statement" error returned

    When processing an empty result reply to a query execution request against a ServiceNow REST service, Hybrid Data Pipeline returned an "Unexpected end of stream in statement" error.

    Known Issues

    See Hybrid Data Pipeline known issues for details.

     

    4.6.1.0138

    April 16, 2021

    Preface

    These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

    The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

    The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

    The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

    java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

    For the ODBC driver, see Driver version string for details on obtaining the driver version number.

    Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

    Resolved Issues

    Issue HDP-4923 Performance issue querying OpenEdge database

    To improve performance, the OData startswith() function was changed to generate SQL using LIKE instead of LOCATE, producing a SQL statement that takes less time to execute.
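    For illustration (with a hypothetical column), an OData filter such as
       $filter=startswith(Name,'Ab')
    is now translated to SQL of the form
       WHERE Name LIKE 'Ab%'
    rather than an expression based on LOCATE, which typically allows the database to evaluate the predicate more efficiently.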

    Known Issues

    See Hybrid Data Pipeline known issues for details.

     

    4.6.1.1391

    April 27, 2023

    Preface

    These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

    The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

    The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

    The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

    java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

    For the ODBC driver, see Driver version string for details on obtaining the driver version number.

    Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

    Enhancements

    Web UI branding

    Hybrid Data Pipeline now supports branding of its Web UI. Default branding elements, such as the logo, colors, naming, and icons, can be configured before or after installation. Refer to Branding the Web UI for details.

    Autonomous REST Composer

    The Autonomous REST Composer is now available on the Configure Endpoints tab of the Autonomous REST Connector data store interface. The Composer allows you to create a REST data source and configure or import a REST Model file using the Web UI. Refer to Creating REST data sources with the Web UI for details.

    Tomcat updates

    Hybrid Data Pipeline has been updated to install and use Tomcat 9.0.73. In addition, the following Tomcat configuration changes have been made to improve security.

    • The autoDeploy attribute has been set to false. Hence, Tomcat does not check for new or updated web applications.
    • TLS 1.2 is the minimum supported version for TLS encryption.
    • Weak ciphers are no longer supported.
    • For Linux installations, the Shutdown Port permits a REST call to shut down the server. The default value of this port has been changed from 8005 to -1. The new default value of -1 disables the port.

    Oracle 19c certified as a system database

    Oracle 19c has been certified to operate as a Hybrid Data Pipeline system database.

    Microsoft Dynamics 365 Cross Company connection option

    The Microsoft Dynamics 365 data store supports a new connection option Cross Company that allows access to cross company data for users who have access to multiple companies. Refer to Microsoft Dynamics 365 parameters for details.

    Resolved Issues

    Issue HDP-7029 JDBC driver returned the error "unexpected end of stream reached"

    When querying a SQL Server data source, the JDBC driver returned the "unexpected end of stream reached" error. (JDBC driver 4.6.1.212)

    Issue HDP-7147 Resolved Hybrid Data Pipeline vulnerability CVE-2023-24998

    The shipping version of the Tomcat server was upgraded from Tomcat 9.0.65 to 9.0.73 to address the vulnerability described in CVE-2023-24998. (Hybrid Data Pipeline server 4.6.1.1391, On-Premises Connector 4.6.1.524)

    Issue HDP-7495 Unable to configure SSL behind a load balancer when using FIPS and an external JRE

    After configuring the Hybrid Data Pipeline server to use an external JRE and run in FIPS mode, server-side SSL could not be enabled. (Hybrid Data Pipeline server 4.6.1.1391)

    Known Issues

    See Hybrid Data Pipeline known issues for details.

     

    4.6.1.0169

    June 10, 2021

    Preface

    These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

    The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

    The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

    The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

    java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

    For the ODBC driver, see Driver version string for details on obtaining the driver version number.

    Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

    Enhancements

    Google BigQuery support

    Hybrid Data Pipeline now supports access to Google BigQuery. Hybrid Data Pipeline can be used to create and manage data sources for JDBC, ODBC, and OData client application access to Google BigQuery. OAuth 2.0 and Service Account authentication methods are supported.

    Known Issues

    See Hybrid Data Pipeline known issues for details.

     

    4.0.1 archive

    Note: This version of Hybrid Data Pipeline has reached end of life. These release notes are for reference purposes.

    Enhancements

    Hybrid Data Pipeline server
    • Added support for bypassing hostname and port validation when performing a silent installation. When hostname validation fails during the interactive installation process, you are prompted to reenter the hostname or skip validation. If you choose to skip validation, the hostname and port validation properties in your response file will have the following settings.
      SKIP_HOSTNAME_VALIDATION=true
      SKIP_PORT_VALIDATION=true
      Running an installation in silent mode with a response file containing these settings allows the silent installation to continue even if hostname or port validation fails. When validation fails during the silent installation process, the installer generates the file SilentInstallInfo.log in the home directory of the target machine but completes a full installation.
    • Added support for version information to be returned from the /api/admin/version endpoint. This feature is accessible only to administrator accounts. The response is returned in a JSON-style format with the following syntax (a minimal client sketch appears after this list).
      {
      "HDPVersion": "<major>.<minor>.<service_pack>.<build_number>"
      "DASVersion":"<major>.<minor>.<service_pack>.<build_number>"
      }
    • Upgraded to Tomcat 8.0.39
    Oracle Sales Cloud
    • Enhanced to support queries with equality filters on non-indexed fields. Previously, equality filter conditions were passed to the Oracle Sales Cloud server for indexed fields only.
    • Enhanced performance for queries with non-equality filters. Previously, the entire table (or tables) were retrieved before the data from a non-equality operator could be filtered. These filters are now passed to Oracle Sales Cloud for faster processing.
    • Enhanced to support the columns specified in a Select list. Previously, all fields were retrieved from the Oracle Sales Cloud. Now only the fields specified in the Select list are retrieved.
    • Proxy options have been enabled for Oracle Sales Cloud Driver. You define the proxy server options in the data source definition, in the Extended Options field of the Advanced tab. The syntax is shown in the following example:
      ProxyHost=Server1;ProxyPort=5122;ProxyUser=JohnDoe;ProxyPassword=John'sPW;
      • ProxyHost identifies a proxy server to use for the connection, using either the server name or an IP address specified in either IPv4 or IPv6 format.
      • ProxyPort specifies the port number where the proxy server is listening for HTTPS requests.
      • ProxyUser specifies the user name needed to connect to a proxy server when authentication on the proxy server is enabled.
      • ProxyPassword specifies the password needed to connect to a proxy server when authentication on the proxy server is enabled.
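    As referenced in the version endpoint item above, the following minimal Java sketch retrieves version information from the /api/admin/version endpoint. The host, port, and Basic authentication credentials are placeholders for illustration.
       // Minimal sketch: GET /api/admin/version with an administrator account.
       import java.net.URI;
       import java.net.http.HttpClient;
       import java.net.http.HttpRequest;
       import java.net.http.HttpResponse;
       import java.util.Base64;

       public class VersionSketch {
           public static void main(String[] args) throws Exception {
               String auth = Base64.getEncoder().encodeToString("admin:password".getBytes());
               HttpRequest request = HttpRequest.newBuilder()
                       .uri(URI.create("https://myserver:8443/api/admin/version"))
                       .header("Authorization", "Basic " + auth)
                       .GET()
                       .build();
               HttpResponse<String> response = HttpClient.newHttpClient()
                       .send(request, HttpResponse.BodyHandlers.ofString());
               System.out.println(response.body());
           }
       }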

    Resolved Issues

    Hybrid Data Pipeline installer
    • Resolved an issue where the final.log file was generated even though the installation succeeded
    • Resolved an issue with console installation where user was prompted for OpenAccess options even though OpenAccess was not selected
    • Resolved an issue where silent installation failed with a non-standard port for an external Oracle database
    • Resolved an issue where silent installation failed when a response file generated from console mode was used for the silent installation
    • Resolved an issue where silent installation failed when external database was selected and configured
    • Resolved an issue where the schema was not created during a silent installation with an external database configuration
    Hybrid Data Pipeline server
    • Enabled compression
    • Resolved an issue where the HDPVersion in the version API response did not include the build number for the package
    • Resolved an issue where the server would make calls to external resources
    • Resolved an issue where the server would not start in an environment that already had CATALINA_HOME configured
    • Resolved an issue where shutdown scripts would shut down processes not related to the server
    • Resolved an issue where the server did not shut down completely when executing the shutdown script
    Credentials database
    • Resolved an issue where only the DB Admin credentials were being validated when Oracle was selected as an external database
    • Resolved an issue where embedded database was started when environment was configured to use external database
    • Resolved a failure when creating users with external database configured for Oracle version 11.2.0.4 patch 7
    Hybrid Data Pipeline Management API
    • Resolved an issue where a 201 was returned when adding members to a group data source through the Management API
    • Resolved an issue where a normal user would receive a 400 instead of a 404 error when using the user query parameter to Management API calls
    • Resolved an issue where user creation API allowed invalid values for the status field
    OData
    • Resolved an issue where OData requests were timing out before application could finish retrieving the results
    Oracle Sales Cloud
    • Resolved a file input/output error when connecting to an Oracle Sales Cloud data source
    • Resolved an issue where an empty folder named OracleSalesCloud_Schema was created in the installation directory
    • Resolved an issue where decimal values with a precision greater than 7 were not returned correctly when retrieved from an Oracle Sales Cloud data source
     

    4.6.1.0107

    January 13, 2021

    Preface

    These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

    The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

    The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

    The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

    java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

    For the ODBC driver, see Driver version string for details on obtaining the driver version number.

    Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

    Enhancements

    OData query throttling for users

    Hybrid Data Pipeline supports throttling the number of simultaneous OData queries a user may have running against a Hybrid Data Pipeline server at one time. OData query throttling for users may be configured with the ODataMaxConcurrentRequests and ODataMaxWaitingRequests limits. The ODataMaxConcurrentRequests limit sets the maximum number of simultaneous OData requests allowed per user, while the ODataMaxWaitingRequests limit sets the maximum number of waiting OData requests allowed per user. See Throttling in the user's guide for details.
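    For illustration, setting one of these limits might resemble the following PUT operation against the Limits API. The endpoint path and payload shape shown here are assumptions; consult the Limits API documentation for the exact syntax.
       https://myserver:8443/api/admin/limits/system/ODataMaxConcurrentRequests
       {
       "value": 10
       }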

    Environment variables support for silent installation

    Support for environment variables to specify server and system database credentials during the installation process has been added. The use of environment variables allows you to perform a more secure silent installation, compared to a standard silent installation where credential information must be specified in plain text in the silent installation response file. See Silent installation process in the user's guide for details.

    Resolved Issues

    Issue HDP-4853 Installation failed when special characters were used in system database credentials

    When installing the Hybrid Data Pipeline server using SQL Server as the system database, the use of special characters in admin or user account credentials caused the installation to fail with the error Error in createSchema at Line 266.
    NOTE: While installation no longer fails when special characters are used in system database account credentials, the installer cannot currently validate the necessary database schema objects when any of the following special characters are used in either database user ID or password values: space ( ), quotation mark ("), number sign (#), dollar sign ($), and apostrophe ('). Therefore, in a standard installation where these characters are used in database credentials, database validation must be skipped to proceed with the installation. Similarly, when performing a silent installation in this case, the SKIP_DATABASE_VALIDATION property should be set to true. Note that when skipping database validation in this scenario, the server should install successfully and work with the specified system database.
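    For illustration, the corresponding response file entry for a silent installation is a single property setting:
       SKIP_DATABASE_VALIDATION=true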

    Issue HDP-4854 Silent installation process required the specification of system database admin and user passwords in clear text in the response file

    The specification of system database admin and user passwords in plain text in the response file as part of the silent installation process raised security concerns. Support for environment variables to specify server and system database credentials during the installation process has been added. See Silent installation process in the user's guide for details.

    Issue HDP-4859 Firefox, Chrome, and Microsoft Edge browsers not rendering Web UI correctly for load balancer installation

    When the HAProxy load balancer was configured with the setting x-content-type-options:nosniff, Firefox, Chrome, and Microsoft Edge browsers rendered the Web UI as text instead of HTML.

    Known Issues

    See Hybrid Data Pipeline known issues for details.

     

    4.6.1.0233

    September 10, 2021

    Preface

    These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

    The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

    The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

    The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

    java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

    For the ODBC driver, see Driver version string for details on obtaining the driver version number.

    Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

    Enhancements

    SSO/SAML support

    Hybrid Data Pipeline now supports user authentication using the SSO/SAML protocol. Customers can configure SAML authentication by providing the details of an identity provider and can configure users to use SAML authentication.

    Resolved Issues

    Issue HDP-4549 HDP server unreachable due to OS file handle leak

    When the "FileNotFoundException (Too many open files)" error occurred, the Hybrid Data Pipeline connection was lost and the server had to be restarted.

    Issue HDP-5202 Error returned when fetching MySQL zero values for date and datetime columns

    When fetching invalid date and datetime values from columns or literals, such as SELECT DATE(0), against MySQL data sources, the Hybrid Data Pipeline server returned an error.

    Issue HDP-5210 OData v4 Endpoint not compatible with Tableau Desktop

    Tableau was unable to connect to OData v4 endpoints exposed by Hybrid Data Pipeline.

    Issue HDP-5217 Some special characters not allowed in passwords

    Users were unable to use special characters for Hybrid Data Pipeline passwords.

    Issue HDP-5266 Load balancer not returning OData responses from the server

    When HTTP was disabled on the load balancer, the load balancer did not return OData responses to the client application, even though the X-Forwarded-Proto header had been configured to manage HTTP and HTTPS traffic.

    Known Issues

    See Hybrid Data Pipeline known issues for details.

     

    4.6.1.0357

    February 16, 2022

    Preface

    These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

    The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

    The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

    The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

    java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

    For the ODBC driver, see Driver version string for details on obtaining the driver version number.

    Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

    Enhancements

    Server-side SSL

    Hybrid Data Pipeline now supports server-side SSL. Server-side SSL allows you to enable SSL behind the load balancer and secure communication between the load balancer and server nodes, as well as Hybrid Data Pipeline nodes in a cluster deployment. This functionality is supported in the following component versions.

    • Hybrid Data Pipeline Server 4.6.1.357 and higher
    • JDBC Driver 4.6.1.32 and higher
    • ODBC Driver 4.6.1.34 and higher

    Note:

    • Updating On-Premises Connectors is not required to configure server-side SSL.
    • For details on server-side SSL, refer to SSL configuration.

    curl Library update (ODBC driver 4.6.1.34)

    The curl library files used with the ODBC driver have been upgraded to version 7.80.0.

    OpenSSL library update (ODBC driver 4.6.1.34)

    The default version of the OpenSSL library used with the ODBC driver has been upgraded to version 1.1.1l.

    Resolved Issues

    Issue HDP-5587 SYNONYMS not displayed in the Web UI

    The SQL Editor was not displaying SYNONYM objects.

    Issue HDP-5611 "Unexpected end of stream" error returned

    When a query demanded the return of multiple large result sets, the query failed and the error "Unexpected end of stream" was returned.

    Known Issues

    See Hybrid Data Pipeline known issues for details.

     

    4.6.1.0757

    September 14, 2022

    Preface

    These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

    The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

    The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

    The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

    java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

    For the ODBC driver, see Driver version string for details on obtaining the driver version number.

    Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

    Enhancements

    Microsoft Dynamics 365 support

    Hybrid Data Pipeline now supports access to a number of Microsoft Dynamics 365 apps. Hybrid Data Pipeline can be used to create and manage data sources for JDBC, ODBC, and OData client application access to these Dynamics 365 apps. OAuth 2.0 connectivity is supported. Refer to Microsoft Dynamics 365 parameters for details. (On-Premises Connector 4.6.1.287)

    Docker trial deployment

    The generally available Hybrid Data Pipeline Docker image now supports a trial Docker deployment. After you obtain the image from the Progress Enterprise Delivery site (ESD) or the Trial Download page, you may perform a trial deployment of Hybrid Data Pipeline as a Docker container on a single node with an internal system database. Refer to the Hybrid Data Pipeline Trial Docker Deployment Getting Started for details.

    Power BI custom connector

    A Power BI custom connector is now available from the Progress DataDirect Hybrid Data Pipeline Public GitHub repository. This custom connector may be used to implement connectivity from Power BI to Hybrid Data Pipeline resources that use OAuth 2.0 or OIDC authentication. For details, refer to Configuring a Power BI custom connector for OAuth 2.0 or Configuring a Power BI custom connector for OIDC.

    Tomcat upgrade

    The Hybrid Data Pipeline server and On-Premises Connector have been upgraded to install and use Tomcat 9.0.63. This addresses the CVE-2022-23181 security vulnerability that was mitigated with the resolution of Issue HDP-5924. (On-Premises Connector version 4.6.1.287)

    Changed Behavior

    Microsoft Dynamics CRM data store deprecated

    The Microsoft Dynamics CRM data store has been deprecated. Connectivity to a number of Dynamics 365 apps is now supported with app-specific Hybrid Data Pipeline data stores. Refer to Microsoft Dynamics 365 parameters for details. (On-Premises Connector 4.6.1.287)

    Docker trial image

    The Docker trial image has been deprecated. A Docker trial deployment of Hybrid Data Pipeline may now be performed using the generally available Hybrid Data Pipeline Docker image. This image may be obtained from the Progress Enterprise Delivery site (ESD) or the Trial Download page. Refer to the Hybrid Data Pipeline Trial Docker Deployment Getting Started for details.

    Resolved Issues

    Issue HDP-5854 The ODBC driver not supporting the GUID data type

    The ODBC driver did not support the GUID data type. (ODBC driver 4.6.1.67)

    Issue HDP-5925 Upgrade the version of Tomcat shipped with Hybrid Data Pipeline server from Tomcat 9.0.54 to 9.0.63

    The shipping version of the Tomcat server was upgraded from Tomcat 9.0.54 to 9.0.63. This addresses the CVE-2022-23181 security vulnerability that was mitigated with the resolution of Issue HDP-5924. (On-Premises Connector 4.6.1.287)

    Issue HDP-6212 SQL Editor query of datetimeoffset and sql_variant data type columns returns NullPointerException

    When using the SQL Editor to query datetimeoffset and sql_variant data type columns, a NullPointerException was returned.

    Issue HDP-6217 Problem with setting HDP_DATABASE_ADVANCED_OPTIONS setting for Docker deployments

    When setting HDP_DATABASE_ADVANCED_OPTIONS to use an SSL connection to the external system database, the setting was not propagated correctly.

    Issue HDP-6275 Hybrid Data Pipeline server upgrade failed in an environment using FIPS and an external JRE

    When performing a Hybrid Data Pipeline server upgrade in an environment using FIPS and an external JRE, the upgrade failed with the error Error in MAIN at line 576.

    Known Issues

    See Hybrid Data Pipeline known issues for details.

     

    4.6.1.1030

    November 21, 2022

    Preface

    These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

    The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

    The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

    The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

    java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

    For the ODBC driver, see Driver version string for details on obtaining the driver version number.

    Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

    Changed Behavior

    Microsoft Dynamics CRM data store removed

    The Microsoft Dynamics CRM data store was recently deprecated, and has now been removed from the product package. Connectivity to a number of Dynamics 365 apps, including CRM and ERP apps, is supported with app-specific Hybrid Data Pipeline data stores. Refer to Microsoft Dynamics 365 parameters for details.

    Rollbase data store removed

    The Rollbase data store has been removed from the product package. If you would like to reintroduce the Rollbase data store, contact Technical Support.

    SugarCRM data store removed

    The SugarCRM data store has been removed from the product package. If you would like to reintroduce the SugarCRM data store, contact Technical Support.

    Resolved Issues

    Issue HDP-6514 Use of Externally-Controlled Input to Select Classes or Code ('Unsafe Reflection') - (CVE-2022-41853)

    The Hybrid Data Pipeline product and its connectors used a version of HyperSQL Database that was vulnerable to the remote code execution described in CVE-2022-41853. All impacted components have been patched to fix this vulnerability. For details on the components impacted and the fixed versions, refer to the following KB article:

    https://community.progress.com/s/article/DataDirect-Hybrid-Data-Pipeline-Critical-Security-Bulletin-November-2022-CVE-2022-41853

    Note: In addition to updating the Hybrid Data Pipeline server, any On-Premises Connectors used in your environment should be updated to build 4.6.1.395 of the On-Premises Connector.

    Issue HDP-6431 Microsoft Dynamics 365 Authorization URI auto-populated for client credentials auth flow

    After an initial connection to Microsoft Dynamics 365 using the OAuth 2.0 client credentials grant, the Authorization URI field automatically populated with the default value when the data source was reopened. The value in the Authorization URI field had to be manually cleared to reconnect with Microsoft Dynamics 365.

    Issue HDP-6601 Hybrid Data Pipeline unable to connect to an Azure Synapse serverless database via a SQL Server data source

    Hybrid Data Pipeline was unable to connect to an Azure Synapse serverless database via a SQL Server data source.

    Known Issues

    See Hybrid Data Pipeline known issues for details.

     

    4.6.1.1248

    March 30, 2023

    Preface

    These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

    The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

    The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

    The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

    java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

    For the ODBC driver, see Driver version string for details on obtaining the driver version number.

    Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

    Enhancements

    curl Library Upgrade

    The curl library files that are installed with the ODBC driver have been upgraded to version 7.88.1, which fixes a number of potential security vulnerabilities. For more information on the vulnerabilities resolved by this enhancement, refer to Vulnerabilities in the curl documentation. (ODBC driver 4.6.1.158)

    OpenSSL Upgrade

    The default version of the OpenSSL library has been upgraded to version 1.1.1t, which fixes a number of potential security vulnerabilities. For more information on the vulnerabilities resolved by this enhancement, refer to Vulnerabilities: Fixed in OpenSSL 1.1.1 in OpenSSL News. (ODBC driver 4.6.1.158)

    Resolved Issues

    Issue HDP-6444 Hybrid Data Pipeline Server upgrade to enable FIPS is failing

    When upgrading the Hybrid Data Pipeline server to enable FIPS, the installation failed and the installer returned an account database error.

    Issue HDP-6931 JDBC driver allowed statements to be executed even when the connection was dead causing "Invalid session token" error

    The JDBC driver was allowing statements to be executed after a connection was terminated, resulting in an "Invalid session token" error. (JDBC driver 4.6.1.194)

    Issue HDP-6973 User and password properties should be made optional in the JDBC data source

    On a JDBC data source configured for OAuth and created with the DataDirect Snowflake JDBC driver, the user was prompted for a user ID and password when attempting to test connect.

    Known Issues

    See Hybrid Data Pipeline known issues for details.

     

    4.6.1.0417

    April 14, 2022

    Preface

    These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

    The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

    The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

    The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

    java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

    For the ODBC driver, see Driver version string for details on obtaining the driver version number.

    Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

    Resolved Issues

    Issue HDP-5675 The Metadata Exposed Schemas dropdown not loading schemas when using the PostgreSQL driver

    When using the PostgreSQL JDBC driver as a third-party driver to connect to backend data, the Metadata Exposed Schemas dropdown did not load PostgreSQL schemas.

    Issue HDP-5780 Unable to login after upgrading server

    After upgrading to server build 4.6.1.357, the introduction of a new keystore prevented successful login.

    Issue HDP-5792 Unable to deploy as Docker container using environment variables

    Hybrid Data Pipeline deployment failed when using environment variables to deploy the server as a Docker container.

    Issue HDP-5811 Resolved Spring Framework vulnerability

    Hybrid Data Pipeline has been updated to use Spring Framework version 5.3.18, Spring Boot version 2.6.6, and Spring Security version 5.6.2 to address the vulnerability described in CVE-2022-22965. (Hybrid Data Pipeline server 4.6.1.417, On-Premises Connector 4.6.1.164, JDBC driver 4.6.1.43)

    Issue HDP-5813 Resolved Jackson Deserializer vulnerability

    Hybrid Data Pipeline has been updated to use version 2.13.2.2 of the Jackson library to address the vulnerability described in CVE-2020-36518. (Hybrid Data Pipeline server 4.6.1.417, On-Premises Connector 4.6.1.164, JDBC driver 4.6.1.43)

    Issue HDP-5841 On-Premises Connector unable to connect after upgrade

    After upgrading to On-Premises Connector build 4.6.1.120, the On-Premises Connector received an HTTP 401 error from the Hybrid Data Pipeline server when attempting to connect. (On-Premises Connector 4.6.1.164)

    Known Issues

    See Hybrid Data Pipeline known issues for details.

     

    4.5.0 archive

    Note: This version of Hybrid Data Pipeline has reached end of life. These release notes are for reference purposes.

    Resolved Issues

    The following issues have been resolved. An asterisk (*) indicates an issue that was resolved in a software patch subsequent to the GA release.

    Issue HDP-3974 Installation fails when choosing a unicode external database*
    When a Unicode external database was selected during the installation process, the Hybrid Data Pipeline server failed to install. This fix is available in build 4.5.0.71.

    Issue HDP-3989 Validate Server Certificate persistence in the Web UI*
    The Web UI was not persisting the value of the Validate Server Certificate parameter after it had been set to OFF. As a result, after exiting the data source and returning to it, the test connection failed. This fix is available in build 4.5.0.65.

    Issue HDP-3785 Data source password replacing plus sign (+) with space*
    When creating a password for a MySQL CE data source in the Web UI, the plus sign (+) was incorrectly replaced with a space. This fix is available in build 4.5.0.65.

    Issue HDP-3878 OData model creation failure*
    OData model creation was failing when the connectivity service built an OData model from a very large database. Additionally, if the service was unable to read metadata from unique or unusual tables, the resulting OData model returned either no rows or only partial rows. Hybrid Data Pipeline now builds the OData model from the tables selected to be in the model, as opposed to all the tables in the database. This enhancement is available in build 4.5.0.61.

    Enhancements

    Multitenancy

    Hybrid Data Pipeline now supports multitenancy. Multitenancy allows a system administrator to isolate groups of users, such as organizations or departments, that are being hosted through the Hybrid Data Pipeline service. The provider maintains a physical instance of Hybrid Data Pipeline, while each tenant (group of users) is provided with its own logical instance of the service. In a multitenant environment, the default system tenant contains multiple child tenants. The user accounts that reside in one tenant are isolated from those in other tenants.

    Data source sharing

    Hybrid Data Pipeline now supports data source sharing via the Data Sources API. Data source owners can now share data sources with other users. Standard users can share data sources with individual user accounts. Administrators can share data sources with tenants and individual user accounts. Data source sharing allows administrators to provision users for limited or query-only access to Hybrid Data Pipeline resources.

    Third-party JDBC support and validation tool

    Hybrid Data Pipeline support for third-party JDBC drivers is now GA. Administrators can use a command line validation tool to determine whether a third-party JDBC driver will work with the Hybrid Data Pipeline server and On-Premises Connector. If validated, a third-party driver can be used to support OData, JDBC, and ODBC connectivity in the Hybrid Data Pipeline environment. Once the driver is integrated with the Hybrid Data Pipeline environment, users can create Hybrid Data Pipeline data sources for the backend data store supported by the third-party JDBC driver.

    IP address whitelists

    Administrators can now restrict access to Hybrid Data Pipeline by creating an IP address whitelist to determine which IP addresses (either individual IP addresses or a range of IP addresses) can access resources such as the Data Sources API, the Users API, and the Web UI. IP address whitelists can be implemented at system, tenant, and user levels.

    Web UI
    • The Web UI has been refreshed with a modern look and feel to provide an improved user experience. As part of the refresh, the Web UI URL has changed to http(s)://<servername>:<portnumber>/hdpui.
    • The OData Configure Schema editor has been enhanced and now provides a better way to configure an OData schema map.
    • The process for creating Google Analytics data sources has also been improved.

    SQL Server data store

    Hybrid Data Pipeline now supports the following features.

    • Transparent connectivity to Microsoft Azure Synapse Analytics (formerly Microsoft Azure SQL Data Warehouse) and Microsoft Analytics Platform System data sources
    • Always On Availability Groups via the Multi-Subnet Failover, Application Intent, and Server Name options
    • Azure Active Directory authentication (Azure AD authentication) via the Authentication Method, User, Password, Server Name, and Port Number options

    Exporting non-relational data source files

    The Data Source API now supports operations to export the relational map files for non-relational data sources. When a data source is created for a web service such as Salesforce, Hybrid Data Pipeline generates files that map the object model to a relational model. These files may be used to resolve issues that can arise when querying such data sources.

    Evaluation period

    The evaluation period for Hybrid Data Pipeline has been changed from 90 to 30 days. 

     

    4.6.1.0132

    March 22, 2021

    Preface

    These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

    The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

    The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

    The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

    java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

    For the ODBC driver, see Driver version string for details on obtaining the driver version number.

    Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

    Enhancements

    Changing catalog

    Hybrid Data Pipeline supports changing the catalog of data sources. The setCatalog method can be used to change catalogs in JDBC, while the connection attribute SQL_ATTR_CURRENT_CATALOG can be used in ODBC. Support for changing catalogs includes support for changing the default database on an active connection to a SQL Server data source. This support extends to any data source configured with an underlying JDBC connector that supports the setCatalog method. This enhancement is available in the latest build of the Hybrid Data Pipeline server (4.6.1.132). Components such as the Hybrid Data Pipeline ODBC and JDBC drivers, as well as the On-Premises Connector, must be reinstalled to adopt the enhancement (On-Premises Connector 4.6.1.62, ODBC driver 4.6.1.27, JDBC driver 4.6.1.13).
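
    The following is a minimal JDBC sketch of this enhancement, assuming a SQL Server-backed data source; the connection URL format, credentials, and catalog name are placeholders, not values from this release note.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class CatalogSwitch {
        public static void main(String[] args) throws Exception {
            // Placeholder URL; substitute your server, port, and data source name.
            String url = "jdbc:datadirect:ddhybrid://myserver:8080;hybridDataPipelineDataSource=SQLServerDS";
            try (Connection con = DriverManager.getConnection(url, "user", "password")) {
                // Change the default database (catalog) on the active connection.
                con.setCatalog("SalesDB");
                try (Statement stmt = con.createStatement();
                     ResultSet rs = stmt.executeQuery("SELECT DB_NAME()")) {
                    while (rs.next()) {
                        System.out.println("Current database: " + rs.getString(1));
                    }
                }
            }
        }
    }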

    Resolved Issues

    Issue HDP-4463 JDBC driver defaulted to service.datadirectcloud.com host name and returned inaccurate error message (JDBC driver 4.6.1.13)

    When an incorrect host name was specified in the connection URL, the Hybrid Data Pipeline JDBC driver defaulted to service.datadirectcloud.com as the host name and returned an inaccurate error message.

    Issue HDP-4858 ODBC driver not installing on Amazon Linux 2

    The ODBC driver was not installing on Amazon Linux 2.

    Known Issues

    See Hybrid Data Pipeline known issues for details.

     

    4.6.1.0325

    February 3, 2022

    Preface

    These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

    The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

    The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

    The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

    java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

    For the ODBC driver, see Driver version string for details on obtaining the driver version number.

    Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

    Resolved Issues

    Issue HDP-5589 Resolved Log4j 2.17 security vulnerability

    Hybrid Data Pipeline has been updated to use Log4j version 2.17.1 to address the security vulnerability in Log4j version 2.17 described in CVE-2021-44832. (Hybrid Data Pipeline server 4.6.1.325, On-Premises Connector 4.6.1.99)

    4.6.1.311

    Resolved Issues

    Issue HDP-5565 Resolved Log4j 2.15 and 2.16 security vulnerabilities

    Hybrid Data Pipeline has been updated to use Log4j version 2.17 to address the security vulnerabilities in Log4j versions 2.15 and 2.16 described in CVE-2021-45046 and CVE-2021-45105. (Hybrid Data Pipeline server 4.6.1.311, On-Premises Connector 4.6.1.91)

    Known Issues

    See Hybrid Data Pipeline known issues for details.

     

    4.4.0 archive

    Note: This version of Hybrid Data Pipeline has reached end of life. These release notes are for reference purposes.

    Enhancements

    The Hybrid Data Pipeline 4.4 release simplifies the deployment of cluster environments with enhanced messaging that removes the external dependency on a Kafka message queue, and provides integration with application load balancers in public cloud environments.

    Integration with cloud load balancers

    Hybrid Data Pipeline has added support for multi-node clusters that integrate with cloud load balancers. Cloud load balancers that support the WebSocket protocol, such as the AWS application load balancer and the Azure application gateway, are supported.

    Enhanced Messaging

    Hybrid Data Pipeline messaging has been enhanced so that deployments no longer rely on a Kafka cluster for highly available inter-node communication.

    Support for OAuth 2.0

    Hybrid Data Pipeline now supports OAuth 2.0 authorization for OData API access, in addition to basic authentication. Client applications and third-party applications such as Salesforce Connect and Power BI can invoke Hybrid Data Pipeline OData endpoints by passing the required tokens, as opposed to storing user credentials in the application.
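
    As a hedged sketch (not the product's documented client code), the following Java program invokes an OData endpoint with an OAuth 2.0 bearer token rather than stored credentials; the host, endpoint path, entity name, and token value are placeholders.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class ODataOAuthExample {
        public static void main(String[] args) throws Exception {
            // Token obtained from your OAuth 2.0 provider; shown here as a placeholder.
            String accessToken = "<access-token>";
            HttpClient client = HttpClient.newHttpClient();
            // Placeholder OData endpoint for a Hybrid Data Pipeline data source.
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://myserver:8443/api/odata4/MyDataSource/Customers"))
                    .header("Authorization", "Bearer " + accessToken)
                    .header("Accept", "application/json")
                    .GET()
                    .build();
            HttpResponse<String> response =
                    client.send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode());
            System.out.println(response.body());
        }
    }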

    Support for Installation using Docker image

    You can now install a single-node Hybrid Data Pipeline server for evaluation purposes using a Docker image. Docker is a tool that makes it easier to deploy and run applications. Because the Docker image requires no prior machine setup, this method lets you get started without spending time on installation and configuration.

    Response File changes

    The following properties have been removed from the response file for both console and GUI modes:

    • USING_LOAD_BALANCING_YES
    • USING_LOAD_BALANCING_NO
    • D2C_USING_KAFKA_CONFIG
    • D2C_USING_KAFKA_CONFIG_CONSOLE
    • D2C_MESSAGE_QUEUE_SERVERS
    • D2C_MESSAGE_QUEUE_SERVERS_CONSOLE
    • D2C_HDP_CLUSTER_NAME

    The following properties have been added to the response file for both console and GUI modes:

    • D2C_NO_LOAD_BALANCER (GUI and console): specifies that no load balancer is used.
    • D2C_NETWORK_LOAD_BALANCER (or D2C_NETWORK_LOAD_BALANCER_CONSOLE): specifies whether a network load balancer is used.
    • D2C_CLOUD_LOAD_BALANCER (or D2C_CLOUD_LOAD_BALANCER_CONSOLE): specifies whether a cloud load balancer is used.

    Web UI Enhancements

    Limit GetSchema Hybrid Data Pipeline

    Users can now configure the additional property Metadata Exposed Schemas in the data source configuration to restrict the schemas they see in the SQL Editor and the OData Editor.

    GUID for SQL Server

    Added support for exposing the GUID data type as a GUID in OData for SQL Server data sources.

    On-Premises Connector

    The On-Premises Connector installation has been simplified such that the Cloud Access service is no longer installed. Only the single Cloud Connector service is installed.

     

    4.2.0 archive

    Note: This version of Hybrid Data Pipeline has reached end of life. These release notes are for reference purposes.

    Security

    On-Premises Connector in a Hybrid Data Pipeline Cluster
    • Support for the On-Premises Connector in a cluster environment in which multiple Hybrid Data Pipeline nodes run behind a load balancer. The On-Premises Connector allows cloud applications to securely query on-premises data sources without requiring a VPN or other gateway.
    Account Lockout Policy
    • Support for implementing an account lockout policy. An account lockout policy can be implemented with the Administrator Limits API. An account lockout policy allows the administrator to set the number of consecutive failed authentication attempts that result in a user account being locked, as well as the lockout period and the duration of time that failed attempts are counted. When a lockout occurs, the user is unable to authenticate until the specified period of time has passed or until the administrator unlocks the account.
      With Release 4.2.0, the Hybrid Data Pipeline account lockout policy is enabled by default in accordance with Federal Risk and Authorization Management Program (FedRAMP) low- and medium-risk guidelines. The number of failed authentication attempts is limited to 3 in a 15-minute period. Once this limit is met, the user account is locked for 30 minutes. A request sketch appears after this list.
    CORS Filters
    • Support for cross-origin resource sharing (CORS) filters that allow the sharing of web resources across domains. While the default CORS setting is off, CORS filters can be enabled with the Administrator Limits API and a list of trusted origins can be enabled with the Whitelist APIs to fully implement CORS filtering. CORS provides several advantages over sites with a single-origin policy, including improved resource management and two-way integration between third-party sites.
    JVM Upgrade
    • The Hybrid Data Pipeline server and On-Premises Connector have been upgraded to install and use Java SE 8 (8u131).
    Tomcat Upgrade
    • The Hybrid Data Pipeline server and On-Premises Connector have been upgraded to install and use Tomcat 8.0.46.
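
    The following is a minimal sketch of setting an account lockout limit through the Administrator Limits API, assuming basic authentication; the endpoint path and the limit name PasswordLockoutLimit are assumptions to be checked against the Limits API documentation, and the host and credentials are placeholders.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.util.Base64;

    public class LockoutPolicyExample {
        public static void main(String[] args) throws Exception {
            // Hypothetical limit name and endpoint path; consult the Administrator
            // Limits API documentation for the exact resource names.
            String url = "https://myserver:8443/api/admin/limits/system/PasswordLockoutLimit";
            String auth = Base64.getEncoder().encodeToString("admin:password".getBytes());
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create(url))
                    .header("Authorization", "Basic " + auth)
                    .header("Content-Type", "application/json")
                    // Limit failed authentication attempts to 3, per the default policy.
                    .PUT(HttpRequest.BodyPublishers.ofString("{\"value\": 3}"))
                    .build();
            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode() + " " + response.body());
        }
    }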


    Enhancements

    Hybrid Data Pipeline Server
    • OData 4.0 Support. Support for the OData 4.0 specification. OData 4.0 support includes the following:
      • Support for the $search clause
      • Support for $batch requests
      • The $expand clause has been enhanced to support $select, *, $filter, and $top operations
      • The $metadata clause has been enhanced to support the full, minimal, and none arguments
      • Support for the date, dateTimeOffset, and timeOfDay data types
      • Only the JSON format is supported for payloads
    • OData Model Status
      • OData Model status now displays the timestamp of the OData model creation. The timestamp is displayed only when model creation is completed successfully.
      • Users can now view the details of the tables and/or columns that were dropped while generating the OData model for a given schema map of a data source. These warnings can be viewed through the Web UI as well as through the API.
    • Version Information. Support for version information returned from the /api/mgmt/version endpoint. This feature is now accessible via all user accounts. The response is returned in a JSON-style format with the following syntax. A request sketch appears after the response file options below.

     

     

    {
      "HDPVersion": "major.minor.service_pack.build_number",
      "WAPVersion": "major.minor.service_pack.build_number",
      "DASVersion": "major.minor.service_pack.build_number"
    }

     

    • OData Query Throttling. Support for OData query throttling. OData throttling can be implemented with the Administrator Limits API. When OData throttling is enabled, rows are fetched one page in advance of application requests. In addition, administrators can specify a maximum number of concurrent OData queries to prevent users from exhausting system and database resources.
    • Apache Kafka Message Queue. Support for Apache Kafka message queue in a Hybrid Data Pipeline cluster. Apache Kafka allows you to distribute your message queue over multiple nodes for a high level of availability and fault tolerance. Note that Apache Kafka is not included as part of the product installation. For download information, refer to https://kafka.apache.org/.
    • Logging. Support for configuring logging at the data source and user level. Administrators can configure logging using the Web UI or the Administrator Logging API.
    • Microsoft SQL Server System Database. Support for SQL Server as an external system database. During the installation process, you are prompted to select either an internal database or an external database to store system information necessary for the operation of Hybrid Data Pipeline. With this enhancement, you can choose Oracle, MySQL Community Edition, or SQL Server as an external database.
    • Shared Key Location. During the installation of the Hybrid Data Pipeline server, you are prompted to specify a "Key location" for the generated key. The directory specified serves as the location for additional internal files used in the installation and operation of the server. These files include properties files, encryption keys, and system information. In particular, the files located in the redist subdirectory after installation of the server must be used in the installation of the On-Premises Connector, the ODBC driver, and the JDBC driver. See the Progress DataDirect Hybrid Data Pipeline Installation Guide for details.
    • Installation Procedures and Response File. The installation procedures have been modified with the introduction of support for the On-Premises Connector in a Hybrid Data Pipeline cluster, support for Apache Kafka message queue in a Hybrid Data Pipeline cluster, and support for the Microsoft SQL Server system database. New prompts have been added to the installation process. Several of these prompts have corresponding options that appear in a response file generated by the latest installer for silent installation. If you are using a response file generated by an earlier version of the installer, you should regenerate the response file with the latest installer. The new response file should then be used for silent installations. The following table provides the new settings. The settings may differ depending on whether you generate the response file with a GUI or console installation. Further details are available in the Progress DataDirect Hybrid Data Pipeline Installation Guide.

    Note: Values for the SKIP_HOSTNAME_VALIDATION and SKIP_PORT_VALIDATION options are now false | true, where false disables and true enables validation. These options have the same name in GUI-generated and console-generated response files.



    Note: Values for the SKIP_LB_HOSTNAME_VALIDATION option are now false | true, where false disables and true enables validation. This option has the same name in GUI-generated and console-generated response files.


    New response file options

    • D2C_USING_KAFKA_CONFIG (or D2C_USING_KAFKA_CONFIG_CONSOLE): specifies whether you are using an Apache Kafka message queue service.
    • D2C_MESSAGE_QUEUE_SERVERS (or D2C_MESSAGE_QUEUE_SERVERS_CONSOLE): specifies the servers in your Apache Kafka cluster.
    • D2C_HDP_CLUSTER_NAME (or D2C_HDP_CLUSTER_NAME_CONSOLE): specifies a name for your Hybrid Data Pipeline cluster used by the Apache Kafka message queue service.
    • D2C_DB_VENDOR_MSSQLSERVER (GUI only): specifies whether you are using SQL Server as an external systems database. In a console mode response file, the external database is specified with the D2C_DB_VENDOR_CONSOLE option.
    • D2C_DB_PORT_MSSQLSERVER (GUI only): specifies the port number of a SQL Server external systems database. In a console mode response file, the external database port is specified with the D2C_DB_PORT_CONSOLE option.
    • D2C_SCHEMA_NAME (or D2C_SCHEMA_NAME_CONSOLE): specifies the name of the schema used to store systems information when a SQL Server external systems database is being used.
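
    As a minimal sketch of calling the /api/mgmt/version endpoint described above, assuming basic authentication; the host and credentials are placeholders.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.util.Base64;

    public class VersionCheck {
        public static void main(String[] args) throws Exception {
            String auth = Base64.getEncoder().encodeToString("user:password".getBytes());
            HttpRequest request = HttpRequest.newBuilder()
                    // Host and port are placeholders; substitute your server.
                    .uri(URI.create("https://myserver:8443/api/mgmt/version"))
                    .header("Authorization", "Basic " + auth)
                    .GET()
                    .build();
            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            // Prints the JSON body with HDPVersion, WAPVersion, and DASVersion.
            System.out.println(response.body());
        }
    }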

     

    On-Premises Connector
    • Support for the On-Premises Connector in a cluster environment in which multiple Hybrid Data Pipeline nodes run behind a load balancer. The On-Premises Connector allows cloud applications to securely query on-premises data sources without requiring a VPN or other gateway.
    Apache Hive
    • Certified with Hive 2.0 and 2.1.
    IBM DB2
    • Certified with DB2 for i 7.3.
    Oracle Database
    • Certified with Oracle 12c R2 (12.2).
    Oracle Sales Cloud
    • Support for proxy server connections.

    Resolved Issues

    Hybrid Data Pipeline server
    • Issue 71841. Resolved an issue where the Hybrid Data Pipeline server failed to honor the START_ON_INSTALL environment variable to stop and start Tomcat services.
    • Resolved an issue where the installer accepted an SSL certificate only in the PEM file format during the installation of the server for a cluster environment. The installer now accepts the SSL certificate (root certificate) in PEM, DER, or base64 encodings for a cluster installation.
    • Resolved an issue where an SSL certificate was required for a cluster installation. An SSL certificate is no longer required for a cluster installation.
    • Resolved an issue that prevented the installer from supporting a number of upgrade scenarios.
    JDBC driver
    • Resolved an issue where the JDBC driver was not connecting to the Hybrid Data Pipeline server by default when running on a UNIX/Linux system.

     

    4.6.1.0062

    November 19, 2020

    Preface

    These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

    The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

    The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

    The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

    java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

    For the ODBC driver, see Driver version string for details on obtaining the driver version number.

    Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

    Resolved Issues

    Issue HDP-4757 Cannot retrieve data from SQL Server table (On-Premises Connector version 4.6.1.47)

    Sometimes, when executing SELECT * FROM table against an on-premises SQL Server database using the On-Premises Connector, the ODBC driver returned the error [HY000] [DataDirect][ODBC Hybrid driver][SQLServer]Unexpected content at the end of chunk.

    Issue HDP-4574 HTTP error 404 while renaming the connector label (On-Premises Connector version 4.6.1.47)

    When the name of the On-Premises Connector host machine was in all uppercase at the time the On-Premises Connector was installed, the Connector Label field in the On-Premises Configuration Tool did not populate with the hostname as expected. Then, when attempting to update the Connector Label field with the correct hostname, the On-Premises Configuration Tool returned the error Error setting connector label for user Request returned Status:404 Message.

    Issue HDP-4704 Error while accessing linked tables in MS Access application using Hybrid Data Pipeline data source (ODBC driver 4.6.1.12)

    When using the Hybrid Data Pipeline ODBC driver to connect to a data source created with a third-party JDBC driver, the following error was returned: ODBC--call failed. [DataDirect][ODBC Hybrid driver]Numeric value out of range. Error in column 16. (#0). This error was returned because the third-party driver diverged from the JDBC specification when describing the data type of CHAR_OCTET_LENGTH for DatabaseMetaData.getColumns(). The ODBC driver has been modified to work with the third-party JDBC driver despite this divergence from the JDBC specification.

    Enhancements

    SQL statement auditing

    Hybrid Data Pipeline has added a SQL statement auditing feature. When SQL statement auditing is enabled, the connectivity service records SQL statements and related metrics in the SQLAudit table in the Hybrid Data Pipeline system database (also referred to as the account database). Administrators can then query this information directly. See SQL statement auditing in the user's guide for details.
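
    As a minimal sketch of querying the audit data directly, assuming a MySQL external system database; the JDBC URL and credentials are placeholders, and SELECT * is used to avoid assuming column names (see SQL statement auditing in the user's guide for the table schema).

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class SqlAuditQuery {
        public static void main(String[] args) throws Exception {
            // Placeholder JDBC URL for the system (account) database; use the URL,
            // credentials, and JDBC driver appropriate to your system database.
            String url = "jdbc:mysql://accountdbhost:3306/hdpsystem";
            try (Connection con = DriverManager.getConnection(url, "admin", "password");
                 Statement stmt = con.createStatement();
                 // SQLAudit is the table named in this release note.
                 ResultSet rs = stmt.executeQuery("SELECT * FROM SQLAudit")) {
                int columns = rs.getMetaData().getColumnCount();
                while (rs.next()) {
                    for (int i = 1; i <= columns; i++) {
                        System.out.print(rs.getString(i) + "\t");
                    }
                    System.out.println();
                }
            }
        }
    }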

    Tomcat upgrade

    The Hybrid Data Pipeline server and On-Premises Connector have been upgraded to install and use Tomcat 9.0.37. (On-Premises Connector version 4.6.1.14)

    Known Issues

    See Hybrid Data Pipeline known issues for details.

     
