Pyexasol

Latest version: v0.27.0


0.23.3

- SSL certificate verification is now enabled when used with `access_token` or `refresh_token` connection options.
- Updated documentation regarding [encryption](/docs/ENCRYPTION.md).

OpenID tokens are used to connect to Exasol SAAS clusters, which are accessible via public internet addresses. Unlike Exasol clusters running "on-premises" and secured by corporate VPNs and private networks, SAAS clusters are exposed to MITM attacks, so SSL certificates must be verified in such conditions.

Exasol SAAS certificates are issued by a standard certificate authority, so no extra configuration is required.
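A minimal sketch of what this verification means at the TLS layer, using only the stdlib `ssl` constants (the exact internals are pyexasol's own; the commented-out connection call is illustrative and requires a reachable SAAS cluster):

```python
import ssl

# Conceptually, connecting with access_token / refresh_token now requires
# the server certificate to validate against trusted CAs, equivalent to
# demanding cert_reqs=ssl.CERT_REQUIRED in the underlying TLS options.
sslopt = {"cert_reqs": ssl.CERT_REQUIRED}

# Hypothetical call, left commented out since it needs a live cluster;
# access_token is a documented pyexasol connection option:
# import pyexasol
# conn = pyexasol.connect(dsn="<cluster>.clusters.exasol.com:8563",
#                         access_token="<openid-token>",
#                         encryption=True)
print(sslopt)
```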

0.23.2

- Added an initial implementation of certificate fingerprint validation, similar to the standard JDBC / ODBC clients (Exasol 7.1+).
- Replaced most occurrences of the ambiguous word `host` in code with `hostname` or `ipaddr`, depending on context.
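A sketch of what such a fingerprint check computes, assuming the SHA-256-over-DER scheme used by the Exasol 7.1+ drivers (the certificate bytes below are a placeholder, not a real certificate):

```python
import hashlib

# Fingerprint = uppercase SHA-256 hex digest of the server's DER-encoded
# TLS certificate; a client compares it against the expected value.
der_cert = b"placeholder certificate bytes"
fingerprint = hashlib.sha256(der_cert).hexdigest().upper()
print(fingerprint)
```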

0.23.1

- Improved termination logic for the HTTP transport thread while handling an exception. The order of closing pipes now depends on the type of callback (EXPORT or IMPORT), which should help prevent an unresponsive infinite loop on Windows.
- Improved parallel HTTP transport examples with better exception handling.
- Removed the comment about `if __name__ == '__main__':` being required for Windows only. Multiprocessing on macOS uses the `spawn` method in recent Python versions, so the requirement is no longer unique to Windows.
- `pyopenssl` is now a hard dependency; it is required for encrypted HTTP transport to generate an "ad-hoc" certificate. Encryption will be enabled by default for Exasol SAAS in the future.

0.23.0

- Added [`orjson`](https://github.com/ijl/orjson) as a possible option for the `json_lib` connection parameter.
- The default `indent` for JSON debug output is now 2 (was 4), for a more compact representation.
- `ensure_ascii` is now `False` (was `True`), for better readability and a smaller payload size.
- Fixed JSON examples; `fetch_mapper` is now set correctly for the second data set.
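The new debug-output defaults can be illustrated with the stdlib `json` module (the payload is a made-up example):

```python
import json

payload = {"query": "SELECT 1", "note": "Zürich"}

# indent=2 and ensure_ascii=False mirror the new defaults: a more compact
# layout, and non-ASCII characters emitted literally instead of \uXXXX.
text = json.dumps(payload, indent=2, ensure_ascii=False)
print(text)
```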

0.22.0

**BREAKING (!):** HTTP transport was significantly reworked in this version. Now it uses [threading](https://docs.python.org/3/library/threading.html) instead of [subprocess](https://docs.python.org/3/library/subprocess.html) to handle CSV data streaming.

There are no changes to the common **single-process** HTTP transport.

There are some breaking changes in **parallel** HTTP transport:

- The `mode` argument was removed from the [`http_transport()`](/docs/REFERENCE.md#http_transport) function; it is no longer needed.
- The word "proxy", used in the context of HTTP transport, was replaced with "exa_address" in documentation and code. "Proxy" now refers only to connections routed through an actual HTTP proxy.
- The function `ExaHTTPTransportWrapper.get_proxy()` was replaced with the property `ExaHTTPTransportWrapper.exa_address`. `.get_proxy()` is still available for backwards compatibility, but it is deprecated.
- Module `pyexasol_utils.http_transport` no longer exists.
- Constants `HTTP_EXPORT` and `HTTP_IMPORT` are no longer exposed in `pyexasol` module.
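The deprecation of `.get_proxy()` in favour of the `.exa_address` property typically looks like the shim below; the class and member names follow the changelog, but the body is a guess at the pattern, not pyexasol's actual code:

```python
import warnings

class ExaHTTPTransportWrapper:
    """Illustrative stub, not the real pyexasol class."""

    def __init__(self, exa_address):
        self._exa_address = exa_address

    @property
    def exa_address(self):
        # New canonical accessor for the internal Exasol address.
        return self._exa_address

    def get_proxy(self):
        # Kept for backwards compatibility; warns and delegates.
        warnings.warn("get_proxy() is deprecated, use .exa_address",
                      DeprecationWarning, stacklevel=2)
        return self.exa_address

w = ExaHTTPTransportWrapper("10.17.0.5:8563")
print(w.exa_address)
```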

Rationale:

- Threading provides much better compatibility with Windows and various exotic setups (e.g. uWSGI).
- Orphaned "http_transport" processes will no longer be a problem.
- Modern Pandas and Dask can (mostly) release GIL when reading or writing CSV streams.
- HTTP thread is primarily dealing with network I/O and zlib compression, which (mostly) release GIL as well.
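The GIL-release argument can be demonstrated with a small thread that compresses CSV chunks, in the spirit of the new transport (a simplified sketch, not pyexasol's implementation):

```python
import queue
import threading
import zlib

def compress_chunks(chunks, out):
    # zlib releases the GIL while compressing large buffers, so the
    # main thread keeps making progress while this worker runs.
    co = zlib.compressobj()
    for chunk in chunks:
        out.put(co.compress(chunk))
    out.put(co.flush())
    out.put(None)  # sentinel: end of stream

data = [b"a,b,c\n" * 10_000] * 4
out = queue.Queue()
t = threading.Thread(target=compress_chunks, args=(data, out))
t.start()

compressed = b""
while (piece := out.get()) is not None:
    compressed += piece
t.join()

print(len(compressed))
```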

Execution time for small data sets may improve by 1-2s, since a second Python interpreter is no longer started from scratch. Execution time for very large data sets may be ~2-5% worse for CPU-bound workloads and unchanged for network-bound workloads.

Also, the [examples](/docs/EXAMPLES.md) were re-arranged in this version: refactored and grouped into multiple categories.

0.21.2

- Fixed a bug in `ExaStatement` when no rows were fetched. This could happen when a data set has fewer than 1000 rows, but the amount of data exceeds the maximum chunk size.

