Viadot

Latest version: v0.4.24


0.4.0

Added
- Added `custom_mail_state_handler` task that sends an email notification using a custom SMTP server.
- Added `df_clean_column` util task (and underlying function) that removes special characters from pandas DataFrame columns
- Added `MultipleFlows` flow class which enables running multiple flows in a given order.
- Added `GetFlowNewDateRange` task to change date range based on Prefect flows
- Added `check_col_order` parameter in `ADLSToAzureSQL`
- Added new source `ASElite`
- Added KeyVault support in `CloudForCustomers` tasks
- Added `SQLServer` source
- Added `DuckDBToDF` task
- Added `DuckDBTransform` flow
- Added `SQLServerCreateTable` task
- Added `credentials` param to `BCPTask`
- Added `get_sql_dtypes_from_df` and `update_dict` util tasks
- Added `DuckDBToSQLServer` flow
- Added `if_exists="append"` option to `DuckDB.create_table_from_parquet()`
- Added `get_flow_last_run_date` util function
- Added `df_to_dataset` task util for writing DataFrames to data lakes using `pyarrow`
- Added retries to Cloud for Customers tasks
- Added `chunksize` parameter to `C4CToDF` task to allow pulling data in chunks
- Added `chunksize` parameter to `BCPTask` task to allow more control over the load process
- Added support for SQL Server's custom `datetimeoffset` type
- Added `AzureSQLToDF` task
- Added `AzureDataLakeRemove` task
- Added `AzureSQLUpsert` task
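
Several of these utilities revolve around DataFrame cleanup. As a rough sketch of what a column-cleaning helper like `df_clean_column` might do (a hedged illustration, not viadot's actual implementation or signature), control characters such as tabs and newlines can be stripped with pandas string methods:

```python
import pandas as pd

def clean_columns(df: pd.DataFrame, columns=None) -> pd.DataFrame:
    """Remove control characters (\\t, \\n, \\r) from string columns.

    A simplified sketch of a `df_clean_column`-style utility; the real
    viadot task may differ in name, signature, and behavior.
    """
    df = df.copy()
    targets = columns if columns is not None else df.columns
    for col in targets:
        if df[col].dtype == object:  # only touch string-like columns
            df[col] = df[col].str.replace(r"[\t\n\r]", "", regex=True)
    return df

df = pd.DataFrame({"name": ["a\tb", "c\nd"], "n": [1, 2]})
cleaned = clean_columns(df)
```

Non-string columns pass through untouched, so the helper can be applied to a whole frame without dtype checks at the call site.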

Changed
- Changed the base class of `AzureSQL` to `SQLServer`
- `df_to_parquet()` task now creates directories if needed
- Added several more separators to check for automatically in `SAPRFC.to_df()`
- Upgraded `duckdb` version to 0.3.2

Fixed
- Fixed bug with `CheckColumnOrder` task
- Fixed OpenSSL config for old SQL Servers still using TLS < 1.2
- `BCPTask` now correctly handles custom SQL Server port
- Fixed `SAPRFC.to_df()` ignoring user-specified separator
- Fixed temporary CSV generated by the `DuckDBToSQLServer` flow not being cleaned up
- Fixed some mappings in `get_sql_dtypes_from_df()` and optimized performance
- Fixed `BCPTask` - the case when the file path contained a space
- Fixed credential evaluation logic (`credentials` is now evaluated before `config_key`)
- Fixed "$top" and "$skip" values being ignored by `C4CToDF` task if provided in the `params` parameter
- Fixed `SQL.to_df()` incorrectly handling queries that begin with whitespace
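
The leading-whitespace fix above belongs to a common class of bug: a query-type check that fails when the statement starts with whitespace. A minimal sketch of the problem and the fix, using a hypothetical helper (not viadot's code):

```python
def is_select(query: str) -> bool:
    """Return True if the query is a SELECT statement.

    Stripping leading whitespace first avoids misclassifying queries
    such as "\n  SELECT ..." - the class of bug fixed above, where a
    bare startswith() check on the raw string would return False.
    """
    return query.lstrip().upper().startswith("SELECT")

assert is_select("  \n SELECT * FROM t")
assert not is_select("INSERT INTO t VALUES (1)")
```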

Removed
- Removed `autopick_sep` parameter from `SAPRFC` functions. The separator is now always picked automatically if not provided.
- Removed `dtypes_to_json` task (moved to `task_utils.py`)
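
With `autopick_sep` removed, separator detection is implicit. One plausible heuristic for automatic separator picking - choose the first candidate character that appears in none of the data values, so splitting on it cannot corrupt a field - is sketched below (viadot's actual heuristic in `SAPRFC` may differ):

```python
def pick_separator(values, candidates=("|", "\t", ";", "~", "^")):
    """Return the first candidate separator absent from every value.

    A hypothetical sketch of automatic separator selection; the
    candidate list and fallback behavior are assumptions.
    """
    for sep in candidates:
        if not any(sep in v for v in values):
            return sep
    raise ValueError("No safe separator found; pass one explicitly.")

sep = pick_separator(["foo|bar", "baz"])  # "|" appears in the data, so "\t" is chosen
```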

0.3.2

Fixed
- fixed an issue with schema info within the `CheckColumnOrder` class

0.3.1

Changed
- `ADLSToAzureSQL` - added `remove_tab` parameter to remove unnecessary tab separators from data

Fixed
- fixed an issue with the returned DataFrame within the `CheckColumnOrder` class

0.3.0

Added
- new source `SAPRFC` for connecting with SAP using the `pyRFC` library (requires `pyrfc` as well as the SAP NW RFC library, which can be downloaded [here](https://support.sap.com/en/product/connectors/nwrfcsdk.html))
- new source `DuckDB` for connecting with the `DuckDB` database
- new task `SAPRFCToDF` for loading data from SAP to a pandas DataFrame
- new tasks, `DuckDBQuery` and `DuckDBCreateTableFromParquet`, for interacting with DuckDB
- new flow `SAPToDuckDB` for moving data from SAP to DuckDB
- Added `CheckColumnOrder` task
- documentation for the C4C connection with `url` and `report_url`
- `SQLiteInsert` now checks whether the DataFrame is empty or the object is not a DataFrame
- KeyVault support in `SharepointToDF` task
- KeyVault support in `CloudForCustomers` tasks

Changed
- pinned Prefect version to 0.15.11
- `df_to_csv` now creates dirs if they don't exist
- `ADLSToAzureSQL` - removes unnecessary `"\t"` characters when present in CSV column data
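
The `df_to_csv` change above (creating directories on demand) is a small but common pattern. A hedged sketch of that behavior - the function name matches the changelog entry, but the signature and defaults are assumptions, not viadot's actual task:

```python
import os
import tempfile
import pandas as pd

def df_to_csv(df: pd.DataFrame, path: str, **kwargs) -> None:
    """Write a DataFrame to CSV, creating parent directories as needed.

    os.makedirs(..., exist_ok=True) is idempotent, so repeated calls
    with the same path are safe.
    """
    parent = os.path.dirname(path)
    if parent:
        os.makedirs(parent, exist_ok=True)
    df.to_csv(path, index=False, **kwargs)

# Writing into a nested path that does not exist yet succeeds:
out = os.path.join(tempfile.mkdtemp(), "nested", "dir", "out.csv")
df_to_csv(pd.DataFrame({"x": [1, 2]}), out)
```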

Fixed
- fixed an issue with `duckdb` calls seeing the initial db snapshot instead of the updated state (282)
- optimized the C4C connection with `url` and `report_url`
- fixed the column mapper in the C4C source

0.2.15

Added
- new option to `ADLSToAzureSQL` Flow - `if_exists="delete"`
- `SQL` source: `create_table()` already handled `if_exists`; it now supports an additional `if_exists` option
- `C4CToDF` and `C4CReportToDF` tasks are now provided as classes instead of functions
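
An `if_exists="delete"` mode typically clears existing rows while keeping the table, in contrast to `replace` (drop and recreate) or `append` (load on top). A hedged sketch of such a dispatch, using a hypothetical helper rather than viadot's implementation:

```python
def statements_for(if_exists: str, table: str, schema: str = "dbo"):
    """Return the SQL to run before loading, for a given if_exists mode.

    The mode names mirror the changelog entry above; the schema default
    and statement text are illustrative assumptions.
    """
    fqn = f"{schema}.{table}"
    if if_exists == "replace":
        return [f"DROP TABLE IF EXISTS {fqn}"]  # recreate from scratch
    if if_exists == "delete":
        return [f"DELETE FROM {fqn}"]           # keep schema, clear rows
    if if_exists == "append":
        return []                               # load on top of existing rows
    if if_exists == "fail":
        return []                               # caller checks existence first
    raise ValueError(f"Unknown if_exists mode: {if_exists!r}")
```

`delete` is useful when downstream objects (views, permissions) depend on the table and dropping it would invalidate them.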


Fixed
- Appending issue within the `CloudForCustomers` source
- An early-return bug in the `to_df` method of `UKCarbonIntensity`

0.2.14

Fixed
- An authorization issue within the `CloudForCustomers` source


© 2024 Safety CLI Cybersecurity Inc. All Rights Reserved.