Added
- New local_executor() to run pywren jobs on the local machine
- New localhost compute backend
- New localhost storage backend
- New docker_executor() to run pywren jobs on the local machine using Docker
- New Docker compute backend
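The new local and Docker executors expose the same map-style interface as the cloud executors, but run the tasks on the local machine. The stand-in class below is a minimal sketch of that pattern (it is not the real pywren `local_executor()` implementation; the class name and internals are illustrative only):

```python
from concurrent.futures import ThreadPoolExecutor

def double(x):
    return x * 2

class LocalExecutorSketch:
    """Illustrative stand-in: run map-style jobs with local workers."""

    def __init__(self, workers=4):
        self._pool = ThreadPoolExecutor(max_workers=workers)
        self._futures = []

    def map(self, func, iterdata):
        # Each element of iterdata becomes one local task.
        self._futures = [self._pool.submit(func, x) for x in iterdata]
        return self._futures

    def get_result(self):
        # Block until every task finishes and collect results in order.
        return [f.result() for f in self._futures]

ex = LocalExecutorSketch()
ex.map(double, [1, 2, 3])
print(ex.get_result())  # [2, 4, 6]
```

The same `map()`/`get_result()` call pattern is what lets jobs move between the localhost, Docker and cloud backends without code changes.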
Changed
- Docs updated
- Code refactored
Fixed
- Internal fixes
- Bumped pillow from 5.4.1 to 6.2.0
1.1.1
Added
- Allowed the partitioner to split files into a given number of chunks
- Missing logic in the Knative backend
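Splitting a file by a number of chunks means dividing its byte range into that many contiguous, near-equal pieces. The function below is a sketch of that idea, not the actual pywren partitioner code:

```python
def split_by_chunks(total_size, n_chunks):
    """Return (start, end) byte ranges covering [0, total_size)
    as n_chunks contiguous, near-equal pieces."""
    base, rem = divmod(total_size, n_chunks)
    ranges, start = [], 0
    for i in range(n_chunks):
        # The first `rem` chunks absorb one extra byte each.
        end = start + base + (1 if i < rem else 0)
        ranges.append((start, end))
        start = end
    return ranges

print(split_by_chunks(10, 3))  # [(0, 4), (4, 7), (7, 10)]
```

Each range can then be fetched independently (e.g. with an HTTP Range request against object storage), so every chunk becomes one map task.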
Changed
- Docs updated
Fixed
- Internal fixes
1.1.0
Added
- knative-serving compute backend
- Dockerfile skeleton for a slim Python 3.6 runtime (only 307 MB)
- CACHE_DIR in ~/.pywren/cache
- knative_executor() and function_executor()
- Support for working in multiple regions at a time
Changed
- Docs updated
- Runtime Dockerfiles updated
- Runtime requirements updated
- Updated cloudpickle lib to version 1.2.2
- Parameters passed to the executor now overwrite the config
- Updated tests
Fixed
- Internal logic to generate runtime_metadata
- Invalid call to the "is_remote_cluster" method
- cloudpickle lib to accept any kind of import
- include_modules option in the serializer
1.0.20
Added
- Storage abstraction for the data partitioner
- 'extra_params' arg to map() and map_reduce() calls
- Logic to reuse IAM API key tokens for 1 hour
- More debug logging
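The idea behind an 'extra_params' argument is to pass a fixed set of additional arguments to every function invocation of a map() call, alongside the per-task element from the iterable. The snippet below illustrates that semantics with a plain-Python stand-in (it is not the real pywren map() implementation):

```python
def map_with_extra_params(func, iterdata, extra_params=()):
    """Call func once per element, appending the same extra
    positional arguments to every invocation."""
    return [func(x, *extra_params) for x in iterdata]

def scale(x, factor, offset):
    return x * factor + offset

# Each task receives its own element plus the shared (10, 1) pair.
print(map_with_extra_params(scale, [1, 2, 3], extra_params=(10, 1)))
# [11, 21, 31]
```

This avoids having to pre-build an iterable of argument tuples when only the first argument varies per task.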
Changed
- Docs updated
- Full support for Python 3.5
Fixed
- Minor issue in config
- Possible issue extracting metadata from large Docker images (runtimes)
1.0.19
Added
- 'obj' as an optional arg for functions that process objects from object storage
- 'rabbitmq' as an optional arg for the functions
- 'id' as an optional arg for the functions
- RabbitMQ example
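With the 'obj' argument, a function receives a single object that bundles what used to be the separate 'bucket', 'key' and 'data_stream' arguments (see the Changed section below). The sketch uses stand-in types to show the shape of such a function; the attribute names mirror the old argument names but the wrapper object itself is illustrative:

```python
import io
from types import SimpleNamespace

def my_function(obj):
    # The object carries its storage location plus an open data stream.
    data = obj.data_stream.read()
    return (obj.bucket, obj.key, len(data))

# Stand-in for the object the partitioner would pass to each task.
obj = SimpleNamespace(bucket='my-bucket', key='file.txt',
                      data_stream=io.BytesIO(b'hello'))
print(my_function(obj))  # ('my-bucket', 'file.txt', 5)
```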
Changed
- Deleted 'bucket', 'key' and 'data_stream' function args in favor of 'obj'
- Internal improvements related to data partitioning
- Renamed create_timeline_plots() method to create_execution_plots()
- Docs updated
- Notebooks updated
- Upgraded cos-sdk Python module version
Added
- CloudObject abstraction
- CloudObject example
- Restored OOM exception
- Allowed getting results when RabbitMQ monitoring is activated
- Allowed RabbitMQ to monitor multiple jobs at a time
- Statuses returned from RabbitMQ to futures
Changed
- Code refactoring of the compute abstraction
- Reorganized libs folder
- Updated cloudpickle lib from 0.6.1 to 1.2.1
- Updated glob2 lib to 0.7
- Updated tests
- Modified job_id format
Fixed
- Minor issue listing CF actions
- Issue when executing pywren inside pywren
- Possible issue with invalid config parameters
- Wrong method name: build_runtime()
- internal_storage parameter in partitioner
- create_timeline_plots() method according to recent changes