Bulk insert operation
The bulk insert operation inserts multiple rows of data into the target table in chunks. The default chunk size is 1000 rows.
If an error occurs, only the original driver error is printed and logged; the failing insert statements are not.
If the driver in use is `pyodbc`, `fast_executemany` is enabled.
```python
from sqlalchemy import Table
import ahjo.operations as op

records = read_from_csv(file_path)  # rows read from a CSV file
table = Table(target_table, metadata, schema=target_schema, autoload=True)
op.bulk_insert_into_database(engine, table, records, chunk_size=500)
```
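The chunking itself happens inside ahjo. As a rough illustration only (not ahjo's actual implementation), rows can be split into chunks of at most `chunk_size` like this, assuming the records are a plain iterable of row dicts:

```python
from itertools import islice

def chunked(records, chunk_size=1000):
    """Yield successive lists of at most chunk_size rows."""
    it = iter(records)
    while True:
        chunk = list(islice(it, chunk_size))
        if not chunk:
            return
        yield chunk

# 1200 rows with chunk_size=500 yields chunks of 500, 500 and 200 rows
rows = [{"id": i} for i in range(1200)]
sizes = [len(c) for c in chunked(rows, 500)]
```

Each yielded chunk would then be passed to a single executemany-style insert, so a failure affects at most one chunk.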
Badges to README
Added badges for the latest PyPI release and the required Python version to the README.