DFFML

Latest version: v0.4.0.post2


0.4.0

Added
- New model for Anomaly Detection
- Ability to specify the maximum number of contexts running at a time
- CLI and Python example usage of Custom Neural Network
- PyTorch loss function entrypoint style loading
- Custom Neural Network, last layer support for pre-trained models
- Example usage of sklearn operations
- Example Flower17 species image classification
- Configloading ability from CLI using "" before filename
- Docstrings and doctestable example for DataFlowPreprocessSource
- XGBoost Regression Model
- Pre-Trained PyTorch torchvision Models
- Spacy model for NER
- Ability to rename outputs using GetSingle
- Tutorial for using NLP operations with models
- Operations plugin for NLP wrapping spacy and scikit functions
- Support for default value in a Definition
- Source for reading images in directories
- Operations plugin for image preprocessing
- `-pretty` flag to `list records` and `predict` commands
- daal4py based linear regression model
- DataFlowPreprocessSource can take a config file as dataflow via the CLI.
- Support for link on conditions in dataflow diagrams
- `edit all` command to edit records in bulk
- Support for Tensorflow 2.2
- Vowpal Wabbit Models
- Python 3.8 support
- binsec branch to `operations/binsec`
- Doctestable example for `model_predict` operation.
- Doctestable examples to `operation/mapping.py`
- shouldi got an operation to run Dependency-check on Java code.
- `load` and `run` functions in high level API (see the sketch after this list)
- Doctestable examples to `db` operations.
- Source for parsing `.ini` file formats
- Tests for noasync high level API.
- Tests for load and save functions in high level API.
- `Operation` inputs and outputs default to empty `dict` if not given.
- Ability to export any object with `dffml service dev export`
- Complete example for dataflow run CLI command
- Tests for default configs instantiation.
- Example ffmpeg operation.
- Operations to deploy docker container on receiving github webhook.
- New use case `Redeploying dataflow on webhook` in docs.
- Documentation for creating Source for new File types taking `.ini` as an example.
- New input modes, output modes for HTTP API dataflow registration.
- Usage example for tfhub text classifier.
- `AssociateDefinition` output operation to map definition names to values
produced as a result of passing Inputs with those definitions to operations.
- DataFlows now have a syntax for providing a set of definitions that will
override the operations default definition for a given input.
- Source which modifies record features as they are read from another source.
Useful for modifying datasets as they are used with ML commands or editing
in bulk.
- Auto creation of `Definition`s for `op` arguments when they have a `spec` or `subspec`.
- `shouldi use` command which detects the language of the codebase given via
path to directory or Git repo URL and runs the appropriate static analyzers.
- Support for entrypoint style loading of operations and seed inputs in `dataflow create`.
- Definition for output of the function that `op` wraps.
- Expose high level load, run and save functions to noasync.
- Operation to verify secret for GitHub webhook.
- Option to modify flow and add config in `dataflow create`.
- Ability to use a function as a data source via the `op` source
- Make every model's directory property required
- New model AutoClassifierModel based on `AutoSklearn`.
- New model AutoSklearnRegressorModel based on `AutoSklearn`.
- Example showing usage of locks in dataflow.
- `-skip` flag to `service dev install` command to let users not install certain
core plugins
- HTTP service got a `-redirect` flag which allows for URL redirection via an
HTTP 307 response
- Support for immediate response in HTTP service
- Daal4py example usage.
- Gitter chatbot tutorial.
- Option to run dataflow without sources from the CLI.
- Sphinx extension for automated testing of tutorials (consoletest)
- Example of software portal using DataFlows and HTTP service
- Retry parameter to `Operation`. Allows setting the number of times an operation
should be retried before its exception is raised.
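
To make the high level API entries above concrete, here is a minimal sketch of loading and saving records through the noasync wrappers. The CSV filename and feature values are invented for the example, and exact signatures may vary between releases.

```python
# Hypothetical usage of the high level load/save functions (noasync variants).
from dffml import Record
from dffml.noasync import load, save
from dffml.source.csv import CSVSource

source = CSVSource(filename="data.csv", allowempty=True, readwrite=True)

# Save one record, then read every record back out of the source.
save(source, Record("example", data={"features": {"x": 1}}))
for record in load(source):
    print(record.export())
```
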
Changed
- Renamed `-seed` to `-inputs` in `dataflow create` command
- Renamed configloader/png to configloader/image and added support for loading JPEG and TIFF file formats
- Update record `__str__` method to output in tabular format
- Update MNIST use case to normalize image arrays.
- `arg_` notation replaced with `CONFIG = ExampleConfig` style syntax
for parsing all command line arguments.
- Moved usage/io.rst to docs/tutorials/dataflows/io.rst
- `edit` command substituted with `edit record`
- `Edit on Github` button now hidden for plugins.
- Doctests now run via unittests
- Every class and function can now be imported from the top level module
- `op` attempts to create `Definition`s for each argument if `inputs` are not
given (see the sketch after this list).
- Classes now use `CONFIG` if it has a default for every field and `config` is `None`
- Models now dynamically import third party modules.
- Memory dataflow classes now use auto args and config infrastructure
- `dffml list records` command prints Records as JSON using `.export()`
- Feature class in `dffml/feature/feature.py` initializes a feature object
- All DefFeatures() functions are substituted with Features()
- All feature.type() and feature.length() are substituted with
feature.type and feature.length
- FileSource takes pathlib.Path as filename
- Tensorflow tests re-run themselves up to 6 times to stop them from failing the
CI due to their randomly initialized weights making them fail ~2% of the time
- Any plugin can now be loaded via its entrypoint style path
- `with_features` now raises a helpful error message if no records with matching
features were found
- Split out model tutorial into writing the model, and another tutorial for
packaging the model.
- IntegrationCLITestCase creates a new directory and changes into it for each test
- Automated testing of Automating Classification tutorial
- `dffml version` command now prints the git repo hash and whether the repo is dirty
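
As a sketch of the `op` behaviour described above: when `inputs` and `outputs` are omitted, the decorator is described as deriving `Definition`s from the function signature. The function below is hypothetical.

```python
from dffml import op

@op  # No explicit inputs/outputs: Definitions come from the annotations.
async def total_cost(price: float, quantity: int) -> float:
    return price * quantity
```
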
Fixed
- `export_value` now converts numpy array to JSON serializable datatype
- CSV source overwriting configloaded data to every row
- Race condition in `MemoryRedundancyChecker` when there are more than 4 possible
parameter sets for an operation.
- Typing of config values for numpy parsed docstrings where type should be tuple
or list
- Model predict methods now use `SourcesContext.with_features`
Removed
- Monitor class and associated tests (unused)
- DefinedFeature class in `dffml/feature/feature.py`
- DefFeature function in `dffml/feature/feature.py`
- load_def function in Feature class in `dffml/feature/feature.py`

0.3.7

Added
- IO operations demo and `literal_eval` operation.
- Python prompts `>>>` can now be enabled or disabled for easy copying of code into interactive sessions.
- Whitespace check now checks .rst and .md files too.
- `GetMulti` operation which gets all Inputs of a given definition (see the sketch after this list)
- Python usage example for LogisticRegression and its related tests.
- Support for async generator operations
- Example CLI commands and Python code for `SLRModel`
- `save` function in high level API to quickly save all given records to a
source
- Ability to configure sources and models for HTTP API from command line when
starting server
- Documentation page for command line usage of HTTP API
- Usage of HTTP API to the quickstart to use trained model
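
A rough sketch of how the `GetMulti` output operation might be wired into a dataflow to collect all Inputs of one definition. The `package` definition and values are hypothetical, and the orchestrator usage follows the dataflow examples in the DFFML docs of roughly this era rather than this changelog, so treat the details as assumptions.

```python
import asyncio

from dffml.df.types import DataFlow, Definition, Input
from dffml.df.memory import MemoryOrchestrator
from dffml.operation.output import GetMulti

# Hypothetical definition whose Inputs we want to collect.
PACKAGE = Definition(name="package", primitive="string")

# Dataflow containing only the GetMulti output operation.
dataflow = DataFlow.auto(GetMulti)

async def main():
    async for _ctx, results in MemoryOrchestrator.run(
        dataflow,
        [
            # Tell GetMulti which definitions to gather.
            Input(value=[PACKAGE.name], definition=GetMulti.op.inputs["spec"]),
            Input(value="dffml", definition=PACKAGE),
            Input(value="shouldi", definition=PACKAGE),
        ],
    ):
        print(results)

asyncio.run(main())
```
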
Changed
- Renamed `"arg"` to `"plugin"`.
- CSV source sorts feature names within headers when saving
- Moved HTTP service testing code to HTTP service `util.testing`
Fixed
- Exporting plugins
- Issue parsing string values when using the `dataflow run` command and
specifying extra inputs.
Removed
- Unused imports

0.3.6

Added
- Operations for taking input from the user (`AcceptUserInput`) and for printing the output (`print_output`)
- PNG ConfigLoader for reading images as arrays to predict using MNIST trained models
- Docstrings and doctestable examples to `record.py` (see the sketch after this list).
- Inputs can be validated using operations
- `validate` parameter in `Input` takes `Operation.instance_name`
- New db source can utilize any database that inherits from `BaseDatabase`
- Logistic Regression with SAG optimizer
- Test tensorflow DNNEstimator documentation examples in CI
- shouldi got an operation to run cargo-audit on Rust code.
- Moved all the downloads to tests/downloads to speed up the CI tests.
- Add Python code for tensorflow DNNEstimator
- Ability to run a subflow as if it were an operation using the
`dffml.dataflow.run` operation.
- Support for operations without inputs.
- Partial doctestable examples to `features.py`
- Doctestable examples for `BaseSource`
- Instructions for setting up debugging environment in VSCode
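
As a sketch of the doctestable `record.py` examples mentioned above, here is roughly how a `Record` is created and queried; the key and feature values are invented, and exact method behaviour may differ between releases.

```python
from dffml.record import Record

record = Record("example", data={"features": {"x": 1, "y": 2}})
print(record.feature("x"))          # Value of a single feature.
print(record.features(["x", "y"]))  # Dict of the requested features.
record.evaluated({"z": 3})          # Merge newly computed feature values.
print(record.export())
```
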
Fixed
- New model tutorial mentions file paths that should be edited.
- DataFlow is no longer a dataclass to prevent it from being exported
incorrectly.
- `operations_parameter_set_pairs` moved to `MemoryOrchestratorContext`
- Ignore generated files in `docs/plugins/`
- Treat `"~"` as the home directory rather than a literal
- Windows support by selecting `asyncio.ProactorEventLoop` and not using
`asyncio.FastChildWatcher`.
- Moved SLR into the main dffml package and removed `scratch:slr`.
Changed
- Refactor `model/tensorflow`

0.3.5

Added
- Parent flows can now forward inputs to active contexts of subflows.
  - `forward` parameter in `DataFlow`
  - `subflow` in `OperationImplementationContext`
- Documentation on writing examples and running doctests
- Doctestable Examples to high-level API.
- shouldi got an operation to run npm-audit on JavaScript code
- Docstrings and doctestable examples for `record.py` (features and evaluated)
- Simplified model API with SimpleModel (see the sketch after this list)
- Documentation on how DataFlows work conceptually.
- Style guide now contains information on class, variable, and function naming.
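
A skeleton sketch of the simplified model API mentioned above; the class and config field names are assumptions drawn from the model tutorials of this era, not from this changelog.

```python
from dffml.base import config
from dffml.feature.feature import Feature, Features
from dffml.model.model import SimpleModel

@config
class MyModelConfig:
    features: Features
    predict: Feature
    directory: str

class MyModel(SimpleModel):
    # Subclasses point CONFIG at their config dataclass and implement the
    # async train/accuracy/predict methods (omitted here).
    CONFIG = MyModelConfig
```
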
Changed
- Restructured contributing documentation
- Use randomly generated data for scikit tests
- Change Core to Official to clarify who maintains each plugin
- Name of output of unsupervised model from "Prediction" to "cluster"
- Test scikit LR documentation examples in CI
- Create a fresh archive of the git repo for release instead of cleaning
existing repo with `git clean` for development service release command.
- Simplified SLR tests for scratch model
- Test tensorflow DNNClassifier documentation examples in CI
- config directories and files associated with ConfigLoaders have been renamed
to configloader.
- Model config directory parameters are now `pathlib.Path` objects
- New model tutorial and `skel/model` use simplified model API.

0.3.4

Added
- Tensorflow hub NLP models.
- Notes on development dependencies in `setup.py` files to codebase notes.
- Test for `cached_download`
- `dffml.util.net.cached_download_unpack_archive` to run a cached download and
unpack the archive, very useful for testing. Documented on the Networking
Helpers API docs page.
- Directions on how to read the CI under the Git and GitHub page of the
contributing documentation.
- HTTP API
  - Static file serving from a directory with `-static`
  - `api.js` file serving with the `-js` flag
  - Docs page for JavaScript example
- shouldi got an operation to run golangci-lint on Golang code
- Note about using black via VSCode
Fixed
- Port assignment for the HTTP API via the `-port` flag
Changed
- `repo`/`Repo` to `record`/`Record`
- Definitions with a `spec` can use the `subspec` parameter to declare that they
are a list or a dict where the values are of the `spec` type, rather than the
list or dict itself being of the `spec` type.
- Fixed the URL mentioned in example to configure a model.
- Sphinx doctests are now run in the CI in the DOCS task.
- Lint JavaScript files with js-beautify and enforce with CI
Removed
- Unused imports

0.3.3

Added
- Moved from TensorFlow 1 to TensorFlow 2.
- IDX Sources to read binary data files and train models on MNIST Dataset
- scikit models
  - Clusterers
    - KMeans
    - Birch
    - MiniBatchKMeans
    - AffinityPropagation
    - MeanShift
    - SpectralClustering
    - AgglomerativeClustering
    - OPTICS
- `allowempty` added to source config parameters.
- Quickstart document to show how to use models from Python.
- The latest release of the documentation now includes a link to the
documentation for the master branch (on GitHub pages).
- Virtual environment, GitPod, and Docker development environment setup notes to
the CONTRIBUTING.md file.
- Changelog now included in documentation website.
- Database abstraction `dffml.db`
  - SQLite connector
  - MySQL connector
- Documented style for imports.
- Documented use of numpy docstrings.
- `Inputs` can now be sanitized using a function passed via the `validate` parameter (see the sketch after this list)
- Helper utilities to take callables with numpy style docstrings and
create config classes out of them using `make_config`.
- File listing endpoint to HTTP service.
- When an operation throws an exception the name of the instance and the
parameters it was executed with will be thrown via an `OperationException`.
- Network utilities to perform cached downloads with hash validation.
- Development service got a new command, which can retrieve an argument passed
to setuptools `setup` function within a `setup.py` file.
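
A loose illustration of sanitizing an `Input` with a function via `validate`, per the entry above. The definition name and sanitizer are hypothetical, and whether the callable replaces the value or merely checks it may differ by release.

```python
from dffml.df.types import Definition, Input

URL = Definition(name="URL", primitive="string")

def ensure_https(value):
    # Hypothetical sanitizer applied through the `validate` parameter.
    return value if value.startswith("https://") else "https://" + value

url_input = Input(value="example.com", definition=URL, validate=ensure_https)
```
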
Changed
- All instances of `src_url` changed to `key`.
- `readonly` parameter in source config is now changed to `readwrite`.
- `predict` parameter of all model config classes has been changed from `str` to `Feature`.
- Defining features on the command line no longer requires that defined features
be prefixed with `def:`
- The model predict operation will now raise an exception if the model it is
passed via its config is a class rather than an instance.
- `entry_point` and friends have been renamed to `entrypoint`.
- Use `FastChildWatcher` when run via the CLI to prevent `BlockingIOError`s.
- TensorFlow based neural network classifier had the `classification` parameter
in its config changed to `predict`.
- SciKit models use `make_config_numpy`.
- Predictions in `repos` are now dictionaries.
- All instances of `label` changed to `tag`
- Subclasses of `BaseConfigurable` will now auto instantiate their respective
config classes using `kwargs` if the config argument isn't given and keyword
arguments are.
- The quickstart documentation was improved as well as the structure of docs.
Fixed
- CONTRIBUTING.md had `-e` in the wrong place in the getting setup section.
- Since moving to auto `args()` and `config()`, BaseConfigurable no longer
produces odd typenames in conjunction with docs.py.
- Autoconvert Definitions with spec into their spec
Removed
- The model predict operation erroneously had a `msg` parameter in its config.
- Unused imports identified by deepsource.io
- Evaluation code from feature.py file as well as tests for those evaluations.
