Timeplus-connect

Latest version: v0.8.15

0.5.13

Improvements
- By default, reading Pandas DataFrames with `query_df` and `query_df_stream` now sets a new QueryContext property,
`use_pandas_na`, to `True`. When `use_pandas_na` is True, clickhouse_connect will attempt to use Pandas "missing"
values, such as pandas.NaT and pandas.NA, for ClickHouse NULLs (in Nullable columns only), and use the associated
extended Pandas dtype (see the sketch after this list). Closes https://github.com/ClickHouse/clickhouse-connect/issues/132
- There are new low-level optimizations for reading some Nullable columns and writing Pandas DataFrames
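A minimal sketch of the new default behavior, assuming a local ClickHouse server; the table and column names here are hypothetical:

```python
import clickhouse_connect

client = clickhouse_connect.get_client(host='localhost')  # assumes a local ClickHouse server

# With use_pandas_na enabled by default, NULLs in Nullable columns are returned as
# pandas.NA / pandas.NaT and the columns use the corresponding extended Pandas dtypes.
df = client.query_df('SELECT id, comment, updated_at FROM example_table')
print(df.dtypes)
```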

Bug Fixes
- Timezone information from ClickHouse DateTime columns with a timezone was lost. A workaround was implemented
for this issue in v0.5.8 that allowed timezones to be assigned to the query or to individual columns on the client side. ClickHouse now
supports sending this timezone data with the column, but only in server versions 23.2 and later. If such a version is
detected, clickhouse-connect will return timezone-aware DateTime values without the workaround (see the sketch after this list). Fixes
https://github.com/ClickHouse/clickhouse-connect/issues/120
- For certain queries where `use_none` was set to `False`, an incorrect, non-zero "zero value" would be returned
for NULLs. All NULL values are now properly converted.
- Timezone data was lost when a DateTime64 column with a timezone was converted to a Pandas DataFrame. This has been
fixed. https://github.com/ClickHouse/clickhouse-connect/issues/136
- send_progress headers were not being correctly requested, which could result in unexpected timeouts for long-running
queries. This has been fixed.
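A minimal sketch of the timezone fix, assuming a local ClickHouse server running version 23.2 or later:

```python
import clickhouse_connect

client = clickhouse_connect.get_client(host='localhost')  # assumes a local ClickHouse 23.2+ server

# The column type is DateTime('Asia/Istanbul'); on 23.2+ servers the driver now returns
# timezone-aware datetime.datetime values without any client-side workaround.
result = client.query("SELECT toDateTime(now(), 'Asia/Istanbul') AS ts")
print(result.result_rows[0][0].tzinfo)
```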

0.5.12

Improvement
- A new keyword parameter `server_host_name` is now recognized by the `clickhouse_connect.get_client` method. This identifies
the "real" ClickHouse server hostname that should be used for HTTPS/TLS certificate validation in cases where access to
the server goes through an SSH tunnel or other proxy with a different hostname. For examples of how to use the new parameter,
see the updated file https://github.com/ClickHouse/clickhouse-connect/blob/main/examples/ssh_tunnels.py.
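A minimal sketch of connecting through a local SSH tunnel with `server_host_name`; the port, credentials, and hostnames below are placeholders:

```python
import clickhouse_connect

# Traffic goes to an SSH tunnel listening on localhost:8443, while the TLS
# certificate is validated against the real server hostname.
client = clickhouse_connect.get_client(
    host='localhost',
    port=8443,
    username='default',
    password='secret',
    secure=True,
    server_host_name='clickhouse.example.com',
)
```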

Bug Fix
- The `database` element of a DSN was not recognized when present in the `dsn` parameter of `clickhouse_connect.get_client`.
This has been fixed.
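A minimal sketch of supplying the database through a DSN; the credentials and hostname are placeholders, and the exact URL form shown is an assumption:

```python
import clickhouse_connect

# The trailing path segment ('analytics') is now applied as the client's default database.
client = clickhouse_connect.get_client(
    dsn='clickhouse://user:password@ch.example.com:8123/analytics'
)
```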

0.5.11

Bug Fix
- Referencing the QueryResult `named_results` property after other properties such as `row_count` would incorrectly
raise a StreamClosedError. Thanks to [Stas](https://github.com/reijnnn) for the fix.

Improvements
- A better error message is returned when trying to read a "non-standard" DateTime64 column into a numpy array
or Pandas DataFrame. "Non-standard" means a DateTime64 precision not corresponding to seconds, milliseconds, microseconds,
or nanoseconds (0, 3, 6, or 9 respectively). These DateTime64 types are not supported for numpy or Pandas because there is
no corresponding standard numpy datetime64 type and conversion would be unacceptably slow (supported numpy types are
`datetime64[s]`, `datetime64[ms]`, `datetime64[us]`, and `datetime64[ns]`). A workaround is to cast the DateTime64 column
to a supported precision, e.g. `SELECT toDateTime64(col_name, 3)` for milliseconds (see the sketch after this list).
- The base configuration required for a urllib3 PoolManager has been broken out into its own helper method,
`clickhouse_connect.driver.http_util.get_pool_manager_options`. This makes it simpler to configure a SOCKSProxyManager,
as in the new example file https://github.com/ClickHouse/clickhouse-connect/blob/main/examples/ssh_tunnels.py.
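A minimal sketch of the cast workaround, assuming a local ClickHouse server and a hypothetical `events` table with a non-standard DateTime64(1) column `event_time`:

```python
import clickhouse_connect

client = clickhouse_connect.get_client(host='localhost')  # assumes a local ClickHouse server

# DateTime64(1) has no matching numpy datetime64 precision, so cast the column
# to millisecond precision (3) before reading it into a DataFrame.
df = client.query_df('SELECT id, toDateTime64(event_time, 3) AS event_time FROM events')
```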

0.5.10

Improvements
- Reading Nullable(String) columns has been optimized and should be approximately 2x faster. (This does not yet include
LowCardinality(Nullable(String)) columns.)
- Extraction of ClickHouse error messages included in the HTTP response has been improved

Bug Fixes
- When reading native Python integer columns, the `use_none=False` query parameter would not be respected,
and ClickHouse NULLs would be returned as None instead of 0. `use_none=False` should now work correctly for
Nullable(*Int*) columns (see the sketch after this list)
- Starting with release 0.5.0, HTTP Connection pools were not always cleanly closed on exit. This has been fixed.
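A minimal sketch of the corrected `use_none=False` behavior, assuming a local ClickHouse server; the table and column names are hypothetical:

```python
import clickhouse_connect

client = clickhouse_connect.get_client(host='localhost')  # assumes a local ClickHouse server

# With use_none=False, NULLs in the Nullable(Int32) column come back as 0 instead of None.
result = client.query('SELECT nullable_count FROM example_table', use_none=False)
print(result.result_rows[:5])
```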

0.5.9

Bug Fixes
- Large query results using `zstd` compression incorrectly buffered all incoming data at the start of the query,
consuming an excessive amount of memory. This has been fixed. https://github.com/ClickHouse/clickhouse-connect/issues/122
Big thanks to [Denny Crane](https://github.com/den-crane) for his detailed investigation of the problem. Note that
this affected large queries using the default `compress=True` client setting, as ClickHouse would prefer `zstd` compression
in those cases.
- Fixed an issue where a small query_limit would break client initialization due to an incomplete read of the `system.settings`
table. https://github.com/ClickHouse/clickhouse-connect/issues/123
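A minimal sketch showing that a small `query_limit` no longer breaks client initialization, assuming a local ClickHouse server:

```python
import clickhouse_connect

# Initializing a client with a small query_limit now works; the internal read of
# the system.settings table is no longer truncated by the limit.
client = clickhouse_connect.get_client(host='localhost', query_limit=10)
```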

Improvement
- Stream error handling has been improved so exceptions thrown while consuming a stream should be correctly propagated.
This includes unexpected stream closures by the ClickHouse server. Errors inserted into the HTTP response by ClickHouse
during a query should also be reported as part of a StreamFailureError.

0.5.8

Bug Fix
- Return an empty DataFrame instead of an empty list when no records are returned from the `query_df` method. Fixes
https://github.com/ClickHouse/clickhouse-connect/issues/118
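A minimal sketch of the new empty-result behavior, assuming a local ClickHouse server:

```python
import clickhouse_connect

client = clickhouse_connect.get_client(host='localhost')  # assumes a local ClickHouse server

# A query that matches no rows now returns an empty DataFrame rather than an empty list.
df = client.query_df('SELECT number FROM numbers(10) WHERE number > 100')
print(df.empty)  # True
```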

Default parameter change
- The client `query_limit` now defaults to 0 (unlimited rows returned), since the previous default of 5000 was unintuitive
and led to confusion when limited results were returned.

New Feature
- Allow client-side control of datetime.datetime timezones for query results. The client `query` methods for native
Python results now accept two new parameters: `query_tz` is the timezone to be assigned to any DateTime or DateTime64
objects in the results, while timezones can be set per column using the `column_tzs` dictionary of column names to
timezones. See the [test file](https://github.com/ClickHouse/clickhouse-connect/blob/main/tests/integration_tests/test_timezones.py)
for simple examples. This is a workaround for https://github.com/ClickHouse/clickhouse-connect/issues/120 and the
underlying ClickHouse issue https://github.com/ClickHouse/ClickHouse/issues/40397. Note that this issue only affects DateTime
columns, not DateTime64, although the query context parameters will override the returned DateTime64 timezone as well.
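A minimal sketch of the new timezone parameters, assuming a local ClickHouse server; the table and column names are hypothetical:

```python
import clickhouse_connect

client = clickhouse_connect.get_client(host='localhost')  # assumes a local ClickHouse server

# All DateTime/DateTime64 values are localized to query_tz, except columns listed
# in column_tzs, which use their own assigned timezone.
result = client.query(
    'SELECT created_at, updated_at FROM events',
    query_tz='America/New_York',
    column_tzs={'updated_at': 'Asia/Tokyo'},
)
```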
