PyDataCore

Latest version: v1.1.2

1.1.2

Implement async data-ready signaling in Data and DataPool classes

- Added support for asynchronous data-ready signaling in the `Data` and `DataPool` classes to streamline the data flow and improve synchronization.
- Now, `Data` objects can signal when they are ready for processing, allowing for better coordination in data pipelines.
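
A minimal sketch of the data-ready pattern this release describes, assuming an asyncio-based interface; the `DataStub` class and the `wait_data_ready()` method below are illustrative stand-ins, with only `mark_data_ready()` appearing in these notes.

```python
import asyncio

# Illustrative stand-in only; PyDataCore's actual Data/DataPool signatures may differ.
class DataStub:
    def __init__(self, data_id):
        self.data_id = data_id
        self._ready = asyncio.Event()
        self.payload = None

    def mark_data_ready(self, payload):
        # Producer side: attach the payload and signal readiness.
        self.payload = payload
        self._ready.set()

    async def wait_data_ready(self):
        # Consumer side: wait until the producer has signalled readiness.
        await self._ready.wait()
        return self.payload

async def main():
    data = DataStub("sig-001")
    # Simulate a producer that marks the data ready shortly after scheduling.
    asyncio.get_running_loop().call_later(0.1, data.mark_data_ready, [1.0, 2.0, 3.0])
    samples = await data.wait_data_ready()
    print(data.data_id, samples)

asyncio.run(main())
```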

Future enhancement:
- Plan to extend async data-ready functionality to chunked data handling within mixins such as `ChunkableMixin` and `FileRamMixin` using `data.mark_data_ready()` for each chunk. This feature is deferred to maintain current test stability.

**Full Changelog**: https://github.com/GuillaumeTrain/PyDataCore/compare/1.1.1...1.1.2

1.1.1

**Full Changelog**: https://github.com/GuillaumeTrain/PyDataCore/compare/1.1.0...1.1.1

FreqLimitsData Enhancements:

- Added `freq_min` and `freq_max` properties to track the minimum and maximum frequencies.
- Improved `add_limit_point` to enforce ascending frequency order, updating `freq_min` and `freq_max` dynamically.
- Added validation to `set_interpolation_type` for accepted values (`linear` or `log`).
- Updated docstrings for clarity and consistency in frequency limit handling.
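
A minimal runnable sketch of the behaviour listed above (ascending-order enforcement, dynamic `freq_min`/`freq_max`, interpolation-type validation); it is a stand-in, not PyDataCore's actual `FreqLimitsData`, and the `(freq_hz, level)` argument pair is an assumption.

```python
# Stand-in illustrating the described behaviour; not the real FreqLimitsData.
class FreqLimitsSketch:
    def __init__(self):
        self._points = []            # (freq_hz, level) tuples, assumed layout
        self.freq_min = None
        self.freq_max = None
        self._interpolation = "linear"

    def set_interpolation_type(self, kind):
        # Only "linear" and "log" are accepted, mirroring the new validation.
        if kind not in ("linear", "log"):
            raise ValueError(f"unsupported interpolation type: {kind}")
        self._interpolation = kind

    def add_limit_point(self, freq_hz, level):
        # Enforce ascending frequency order and update freq_min / freq_max.
        if self._points and freq_hz <= self._points[-1][0]:
            raise ValueError("limit points must be added in ascending frequency order")
        self._points.append((freq_hz, level))
        self.freq_min = self._points[0][0]
        self.freq_max = freq_hz

limits = FreqLimitsSketch()
limits.set_interpolation_type("log")
limits.add_limit_point(20.0, -10.0)
limits.add_limit_point(1000.0, -3.0)
limits.add_limit_point(20000.0, -10.0)
print(limits.freq_min, limits.freq_max)   # 20.0 20000.0
```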

New TempLimitsData Class:

- Created the `TempLimitsData` class to manage temporal limits, storing tuples of `(level, transparency_time, release_time)`.
- Added `add_limit_point` to validate and store temporal limit points, with enforced ascending order for `transparency_time`.
- Included `time_min` and `time_max` properties to store the minimum and maximum time ranges dynamically.
- Implemented `clear_limit_points` to reset temporal data points and `get_limits_in_range` to filter data within a specified time range.
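
A similar stand-in for the temporal-limit behaviour described above; the method names come from the notes, while the argument order and the exact semantics of `time_min`/`time_max` (here: first `transparency_time` to latest `release_time`) are assumptions.

```python
# Stand-in mirroring the described TempLimitsData behaviour; not the real class.
class TempLimitsSketch:
    def __init__(self):
        self._points = []        # (level, transparency_time, release_time) tuples
        self.time_min = None
        self.time_max = None

    def add_limit_point(self, level, transparency_time, release_time):
        # transparency_time must be strictly increasing across points.
        if self._points and transparency_time <= self._points[-1][1]:
            raise ValueError("transparency_time must be added in ascending order")
        self._points.append((level, transparency_time, release_time))
        self.time_min = self._points[0][1]
        self.time_max = max(self.time_max or release_time, release_time)

    def get_limits_in_range(self, t_start, t_end):
        # Keep points whose transparency_time falls inside [t_start, t_end].
        return [p for p in self._points if t_start <= p[1] <= t_end]

    def clear_limit_points(self):
        self._points.clear()
        self.time_min = self.time_max = None

temp = TempLimitsSketch()
temp.add_limit_point(-6.0, 0.10, 0.50)
temp.add_limit_point(-3.0, 0.25, 0.75)
print(temp.time_min, temp.time_max)        # 0.1 0.75
print(temp.get_limits_in_range(0.0, 0.2))  # [(-6.0, 0.1, 0.5)]
```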

Testing:

- Added `test_temporal_data.py` to validate `TempLimitsData` functionality, including:
  - Adding multiple temporal limit points.
  - Checking minimum and maximum time boundaries.
  - Testing `get_limits_in_range` to retrieve points within a specified time range.
- Enhanced existing frequency limit tests to ensure comprehensive validation.

1.1.0

Description:
1.1.0 fixes major mistakes in the project structure and in the syntax of `Data_Type`; it can be seen as the debugged 1.0.9 release.
This release refactors several components of the `DataPool`, `FFTSData`, and `ChunkableMixin` classes to improve data management, streamline object instantiation, and enhance error handling. Key changes include:

FFTSData Refactor:

- The `FFTSData` class now stores only the `data_id` of `FreqSignalData` objects instead of the entire object. This approach enables better memory management and modularity.
- Added a new property, `fft_ids`, to retrieve a list of all `data_id`s associated with `FreqSignalData`.
- Adjusted `add_fft_signal()` to append only the `data_id`, and `fft_signals` to retrieve the full objects from the datapool using `data_id`.
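
A short sketch of the id-based storage pattern described above; `FftsSketch` is a stand-in (with a plain dict standing in for the datapool), not the actual `FFTSData` implementation.

```python
# Stand-in for the pattern: keep only data_ids, resolve objects through the pool.
class FftsSketch:
    def __init__(self, datapool):
        self._datapool = datapool
        self._fft_ids = []

    def add_fft_signal(self, data_id):
        # Store only the identifier, not the FreqSignalData object itself.
        self._fft_ids.append(data_id)

    @property
    def fft_ids(self):
        return list(self._fft_ids)

    @property
    def fft_signals(self):
        # Resolve the full objects lazily via the pool.
        return [self._datapool[data_id] for data_id in self._fft_ids]

pool = {"fft-0": "FreqSignalData(...)", "fft-1": "FreqSignalData(...)"}
ffts = FftsSketch(pool)
ffts.add_fft_signal("fft-0")
ffts.add_fft_signal("fft-1")
print(ffts.fft_ids)      # ['fft-0', 'fft-1']
print(ffts.fft_signals)  # full objects looked up through the pool
```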

DataPool Enhancements:

- Enhanced `register_data()` to handle variable kwargs more flexibly, allowing `data_size_in_bytes` and `number_of_elements` to be passed selectively as needed.
- Refactored the `get_data()` method by removing the auto-acknowledgment functionality, giving subscribers explicit control over data access and processing.
- Added `get_data_object()` to directly retrieve `Data` objects using `data_id`, with comprehensive permission checks and without automatic acknowledgment.
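
A hedged sketch of the explicit-acknowledgment flow these changes enable; only `register_data()`, `get_data()`, and `get_data_object()` are named in the notes, so the parameters and the `acknowledge()` helper below are hypothetical.

```python
# Stand-in sketch; not PyDataCore's DataPool, and acknowledge() is hypothetical.
class PoolSketch:
    def __init__(self):
        self._registry = {}          # data_id -> metadata + payload
        self._acknowledged = set()   # (data_id, subscriber_id) pairs

    def register_data(self, data_type, **kwargs):
        # Optional metadata such as data_size_in_bytes / number_of_elements
        # can be passed selectively through kwargs.
        data_id = f"data-{len(self._registry)}"
        self._registry[data_id] = {"type": data_type, **kwargs, "payload": None}
        return data_id

    def get_data(self, data_id, subscriber_id):
        # No auto-acknowledgment: the subscriber acknowledges explicitly later.
        return self._registry[data_id]["payload"]

    def get_data_object(self, data_id, subscriber_id):
        # Return the whole record (the real method returns the Data object).
        return self._registry[data_id]

    def acknowledge(self, data_id, subscriber_id):
        # Hypothetical helper standing in for whatever explicit-ack call exists.
        self._acknowledged.add((data_id, subscriber_id))

pool = PoolSketch()
did = pool.register_data("temporal_signal", data_size_in_bytes=384_000, number_of_elements=48_000)
_ = pool.get_data(did, subscriber_id="consumer-1")
pool.acknowledge(did, subscriber_id="consumer-1")
print(pool.get_data_object(did, subscriber_id="consumer-1")["number_of_elements"])  # 48000
```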

ChunkableMixin Adjustment:

- Refined `read_chunked_data()` to include additional conditions for reading chunks, improving control over data stored in files.
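
A generic illustration of chunked reading from a file-backed signal; `read_chunked_data()` is named in the notes but its signature is not, so the generator below only sketches the idea.

```python
import numpy as np

# Generic chunked-read sketch, not PyDataCore's read_chunked_data().
def read_chunked(path, chunk_samples=4096, dtype=np.float64):
    """Yield fixed-size chunks of a raw binary signal file as numpy arrays."""
    with open(path, "rb") as fh:
        while True:
            raw = fh.read(chunk_samples * np.dtype(dtype).itemsize)
            if not raw:
                break
            yield np.frombuffer(raw, dtype=dtype)

# Usage (assuming "signal.bin" holds float64 samples):
# for chunk in read_chunked("signal.bin"):
#     process(chunk)
```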

Testing and Validation:

- The test script has been updated to verify the proper storage and retrieval of `FreqSignalData` instances within `FFTSData`.
- Implemented `tabulate` for a structured display of the `data_registry` status, ensuring data storage and associations are correctly maintained.
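
An example of the kind of structured registry dump `tabulate` makes possible; the column names and registry layout here are assumptions, not the test script's actual output.

```python
from tabulate import tabulate

# Hypothetical registry contents, shaped only for this illustration.
registry = {
    "data-0": {"type": "FreqSignalData", "size_bytes": 65536, "acknowledged": True},
    "data-1": {"type": "FreqSignalData", "size_bytes": 65536, "acknowledged": False},
}
rows = [(data_id, meta["type"], meta["size_bytes"], meta["acknowledged"])
        for data_id, meta in registry.items()]
print(tabulate(rows, headers=["data_id", "type", "size (bytes)", "acknowledged"]))
```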

1.0.9

Updated setup.py to version 1.0.9 to reflect this refactoring and functional enhancement.
These changes enhance the flexibility, clarity, and efficiency of data handling within the DataPool and make FFTSData more modular and suitable for FFT-related data management.

1.0.8

Changed the `FFTSData` attribute `freq_step` to `df` for compatibility with methods shared with `FreqSignalData`.
Pushed release 1.0.8 so it can be used in other projects.

**Full Changelog**: https://github.com/GuillaumeTrain/PyDataCore/compare/1.0.7...1.0.8

1.0.7

Refactor and fix data storage and chunk handling for file and RAM-based signals.

- Added proper calculation and assignment of `data_size_in_bytes` and `num_samples` during data storage.
- Refactored methods `store_data_from_object` and `store_data_from_data_generator` to ensure accurate tracking of sample size and data size.
- Removed redundant manual size calculations in the `store_data` method.
- Improved error handling and debug outputs when dealing with file-based data.
- Ensured consistent behavior between file-based and RAM-based data storage and retrieval.
- Updated chunk reading logic to handle large datasets more efficiently.

This commit resolves issues with incorrect chunk handling for file-stored signals and ensures that file size and sample size are correctly calculated and managed.
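
A small sketch of the size bookkeeping described above, deriving `num_samples` and `data_size_in_bytes` from the stored array itself; how `store_data_from_object` actually assigns these values is an assumption.

```python
import numpy as np

# Derive the sizes from the data rather than computing them by hand.
samples = np.zeros(48_000, dtype=np.float64)

num_samples = samples.size
data_size_in_bytes = samples.nbytes      # == num_samples * samples.dtype.itemsize

print(num_samples, data_size_in_bytes)   # 48000 384000
```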

**Full Changelog**: https://github.com/GuillaumeTrain/PyDataCore/compare/1.0.6...1.0.7
