clickhouse-connect

Latest version: v0.8.16

0.5.15

Bug Fix
- Remove unnecessary addition of the client database to the table name for inserts. Fixes
https://github.com/ClickHouse/clickhouse-connect/issues/145

Improvement
- The driver should now work with older versions of ClickHouse back to 19.16. Note that older versions are not
officially tested or supported (as with the main ClickHouse database, we officially support the last three monthly ClickHouse
releases and the last two LTS ClickHouse releases). For versions prior to 19.17, you may want to change the new `readonly`
setting in `clickhouse_connect.common` to '1' to allow sending ClickHouse settings with individual queries (if the user has
write permissions). Thanks to [Aleksey Astafiev](https://github.com/aastafiev) for this contribution and for
updating the tests to run against these legacy versions!

0.5.14

Bug Fix
- Remove direct pandas import that caused an unrecoverable error when pandas was not installed.
https://github.com/ClickHouse/clickhouse-connect/issues/139

0.5.13

Improvements
- By default, reading Pandas DataFrames with `query_df` and `query_df_stream` now sets the new QueryContext property
`use_pandas_na` to `True`. When `use_pandas_na` is `True`, clickhouse_connect will attempt to use Pandas "missing"
values, such as `pandas.NaT` and `pandas.NA`, for ClickHouse NULLs (in Nullable columns only), and use the associated
extended Pandas dtype. Closes https://github.com/ClickHouse/clickhouse-connect/issues/132
- There are new low-level optimizations for reading some Nullable columns and for writing Pandas DataFrames.
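The `use_pandas_na` behavior can be sketched with plain Pandas — a minimal illustration of the extended dtypes and "missing" markers that NULLs map onto, not driver code itself:

```python
import pandas as pd

# With use_pandas_na=True, NULLs in Nullable columns come back as Pandas
# "missing" markers rather than None, and the column uses the matching
# extended dtype. A Nullable(Int32) column with a NULL, for example, maps
# onto pandas' nullable Int32 extension dtype:
ints = pd.array([10, None, 30], dtype="Int32")  # extension dtype, not object
df = pd.DataFrame({"qty": ints})

print(df["qty"].dtype)        # Int32
print(df["qty"][1] is pd.NA)  # True

# DateTime NULLs map onto pandas.NaT in a datetime64 column:
times = pd.to_datetime(["2023-03-01", None])
print(times[1] is pd.NaT)     # True
```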

Bug Fixes
- Timezone information from ClickHouse DateTime columns with a timezone was lost. A workaround was implemented
for this issue in v0.5.8 that allowed assigning timezones to the query or to columns on the client side. ClickHouse now
supports sending this timezone data with the column, but only in server versions 23.2 and later. If such a version is
detected, clickhouse-connect will return timezone-aware DateTime values without the workaround. Fixes
https://github.com/ClickHouse/clickhouse-connect/issues/120
- For certain queries with `use_none` set to `False`, an incorrect, non-zero "zero value" was returned for NULLs.
All NULL values are now properly converted.
- Timezone data was lost when a DateTime64 column with a timezone was converted to a Pandas DataFrame. This has been
fixed. https://github.com/ClickHouse/clickhouse-connect/issues/136
- send_progress headers were not being correctly requested, which could result in unexpected timeouts for long-running
queries. This has been fixed.
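The client-side timezone handling used for pre-23.2 servers can be sketched with the standard library: the server sends naive UTC-based DateTime values, and the client attaches the column's known timezone afterwards. The UTC+9 column timezone below is an illustrative assumption, not driver code:

```python
from datetime import datetime, timezone, timedelta

# Column timezone known on the client side (illustrative fixed offset)
col_tz = timezone(timedelta(hours=9))

# DateTime value as decoded from the wire: naive, UTC-based
naive_utc = datetime(2023, 3, 15, 12, 0, 0)

# Attach UTC, then shift into the column's timezone to get an aware value
aware = naive_utc.replace(tzinfo=timezone.utc).astimezone(col_tz)

print(aware.isoformat())  # 2023-03-15T21:00:00+09:00
```

With server 23.2+, the timezone arrives with the column itself and no client-side step like this is needed.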

0.5.12

Improvement
- A new keyword parameter `server_host_name` is now recognized by the `clickhouse_connect.get_client` method. This identifies
the "real" ClickHouse server hostname that should be used for HTTPS/TLS certificate validation in cases where access to
the server is through an SSH tunnel or other proxy with a different hostname. For examples of how to use the new parameter,
see the updated file https://github.com/ClickHouse/clickhouse-connect/blob/main/examples/ssh_tunnels.py.

Bug fix
- The `database` element of a DSN was not recognized when present in the `dsn` parameter of `clickhouse_connect.get_client`.
This has been fixed.
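The `database` element lives in the path component of the DSN. A minimal sketch with `urllib.parse` of how it can be extracted — an illustration, not the driver's actual DSN parser:

```python
from urllib.parse import urlparse

# Example DSN; the path component ("/analytics") carries the database name
dsn = "clickhouse://user:secret@localhost:8123/analytics?secure=false"

parts = urlparse(dsn)
database = parts.path.lstrip("/") or None

print(database)  # analytics
```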

0.5.11

Bug Fix
- Referencing the QueryResult `named_results` property after other properties such as `row_count` would incorrectly
raise a StreamClosedError. Thanks to [Stas](https://github.com/reijnnn) for the fix.

Improvement
- A better error message is returned when trying to read a "non-standard" DateTime64 column into a numpy array
or Pandas DataFrame. "Non-standard" means a DateTime64 precision not corresponding to seconds, milliseconds, microseconds,
or nanoseconds (0, 3, 6, or 9, respectively). These DateTime64 types are not supported for numpy or Pandas because there is
no corresponding standard numpy datetime64 type and conversion would be unacceptably slow (supported numpy types are
`datetime64[s]`, `datetime64[ms]`, `datetime64[us]`, and `datetime64[ns]`). A workaround is to cast the DateTime64 type
to a supported precision, e.g. `SELECT toDateTime64(col_name, 3)` for a millisecond column.
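The precision mapping and the suggested cast can be sketched in a few lines. `cast_to_supported` is an illustrative helper, not a driver function:

```python
# Supported DateTime64 precisions and their numpy datetime64 units
NUMPY_UNITS = {0: "s", 3: "ms", 6: "us", 9: "ns"}

def cast_to_supported(col_name: str, precision: int) -> str:
    """Return a SQL expression that yields a numpy-compatible precision."""
    if precision in NUMPY_UNITS:
        return col_name  # already maps onto datetime64[s|ms|us|ns]
    # Round up to the next supported precision so no digits are lost
    target = min(p for p in NUMPY_UNITS if p >= precision)
    return f"toDateTime64({col_name}, {target})"

print(cast_to_supported("ts", 3))  # ts
print(cast_to_supported("ts", 4))  # toDateTime64(ts, 6)
```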
- The base configuration required for a urllib3 PoolManager has been broken out into its own helper method,
`clickhouse_connect.driver.http_util.get_pool_manager_options`. This makes it simpler to configure a SOCKSProxyManager,
as in the new example file https://github.com/ClickHouse/clickhouse-connect/blob/main/examples/ssh_tunnels.py.

0.5.10

Improvement
- Reading Nullable(String) columns has been optimized and should be approximately 2x faster. (This does not yet include
LowCardinality(Nullable(String)) columns.)
- Extraction of ClickHouse error messages included in the HTTP Response has been improved

Bug Fixes
- When reading native Python integer columns, the `use_none=False` query parameter was not respected,
and ClickHouse NULLs were returned as None instead of 0. `use_none=False` should now work correctly for
Nullable(*Int*) columns.
- Starting with release 0.5.0, HTTP Connection pools were not always cleanly closed on exit. This has been fixed.
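The `use_none=False` conversion described above can be sketched in plain Python — NULLs in a Nullable(Int*) column become the type's zero value instead of None. An illustration of the semantics, not the driver's internal implementation:

```python
def apply_use_none(values, use_none=True, zero_value=0):
    """Convert column NULLs per the use_none query setting."""
    if use_none:
        return list(values)  # NULLs stay as None
    return [zero_value if v is None else v for v in values]

raw = [5, None, -3, None]  # a Nullable(Int32) column as read off the wire
print(apply_use_none(raw))                  # [5, None, -3, None]
print(apply_use_none(raw, use_none=False))  # [5, 0, -3, 0]
```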
