### Added
- Storage abstraction for the data partitioner
- 'extra_params' argument to map() and map_reduce() calls
- Logic to reuse IAM API key tokens for 1 hour
- More debug logging

### Changed
- Updated docs
- Full support for Python 3.5

### Fixed
- Fixed minor issue in config
- Fixed possible issue extracting metadata from large Docker images (runtimes)
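The 'extra_params' argument above can be illustrated with a local stand-in. This is a hypothetical sketch, not the library's actual implementation: it assumes 'extra_params' is an iterable of values appended to each invocation's arguments after the per-item data.

```python
# Hypothetical local stand-in for map(..., extra_params=...).
# Assumption: each value in extra_params is passed as an additional
# positional argument to every function invocation.
def map_with_extra_params(func, iterdata, extra_params=None):
    extra = list(extra_params or [])
    return [func(item, *extra) for item in iterdata]

def scale(x, factor, offset):
    # Per-item value 'x' comes from iterdata; 'factor' and 'offset'
    # come from extra_params and are the same for every call.
    return x * factor + offset

results = map_with_extra_params(scale, [1, 2, 3], extra_params=[10, 5])
print(results)  # [15, 25, 35]
```

This keeps the per-item iterable small while sharing fixed parameters across all invocations.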
## 1.0.19
### Added
- Added 'obj' as an optional argument for functions that process objects from object storage
- Added 'rabbitmq' as an optional argument for the functions
- Added 'id' as an optional argument for the functions
- Added rabbitmq example

### Changed
- Deleted 'bucket', 'key' and 'data_stream' function arguments in favor of 'obj'
- Internal improvements related to data partitioning
- Renamed create_timeline_plots() method to create_execution_plots()
- Updated docs
- Updated notebooks
- Upgraded cos-sdk Python module version
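The move from separate 'bucket', 'key' and 'data_stream' arguments to a single 'obj' argument can be sketched locally. The ObjectPart class below is a hypothetical stand-in for whatever object the partitioner actually passes; the attribute names mirror the old argument names listed above.

```python
import io

class ObjectPart:
    """Hypothetical stand-in for the partitioner's 'obj' argument,
    bundling the former 'bucket', 'key' and 'data_stream' arguments."""
    def __init__(self, bucket, key, data):
        self.bucket = bucket
        self.key = key
        self.data_stream = io.BytesIO(data)

def count_lines(obj):
    # New-style map function: a single 'obj' parameter instead of
    # separate bucket/key/data_stream parameters.
    return sum(1 for _ in obj.data_stream)

part = ObjectPart('my-bucket', 'data.txt', b'line1\nline2\nline3\n')
print(count_lines(part))  # 3
```

Grouping the three values into one object keeps user function signatures stable if more per-object metadata is added later.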
## 1.0.18

### Added
- Added CloudObject abstraction
- Added CloudObject example
- Restored OOM exception
- Allowed getting results when rabbitmq monitoring is activated
- Allowed rabbitmq to monitor multiple jobs at a time
- Statuses returned from rabbitmq to futures

### Changed
- Code refactoring around the compute abstraction
- Reorganized libs folder
- Updated cloudpickle lib from 0.6.1 to 1.2.1
- Updated glob2 lib to 0.7
- Updated tests
- Modified job_id format

### Fixed
- Fixed minor issue listing CF actions
- Fixed issue when executing pywren inside pywren
- Fixed possible issue with invalid config parameters
- Fixed wrong method name: build_runtime()
- Fixed internal_storage parameter in partitioner
- Fixed create_timeline_plots() method according to recent changes
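The CloudObject abstraction can be illustrated with a toy in-memory version. Everything below is a hypothetical sketch of the general idea (store a payload once, pass around a lightweight reference), not the library's real API or class names.

```python
# Toy sketch of the CloudObject idea: functions exchange small
# references to stored data instead of the data itself.
class CloudObjectRef:
    """Hypothetical lightweight reference to a stored object."""
    def __init__(self, bucket, key):
        self.bucket = bucket
        self.key = key

class InMemoryStorage:
    """Hypothetical stand-in for the real object storage backend."""
    def __init__(self):
        self._objects = {}

    def put_cloudobject(self, data, bucket, key):
        # Store the payload and hand back only a reference.
        self._objects[(bucket, key)] = data
        return CloudObjectRef(bucket, key)

    def get_cloudobject(self, ref):
        # Resolve a reference back into the stored payload.
        return self._objects[(ref.bucket, ref.key)]

storage = InMemoryStorage()
ref = storage.put_cloudobject(b'payload', 'tmp-bucket', 'part-0')
print(storage.get_cloudobject(ref))  # b'payload'
```

Passing references rather than payloads keeps serialized function arguments and results small when intermediate data lives in object storage.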
## 1.0.17
### Changed
- Simplified invoker
- Moved compute and storage classes to separate files
- Deleted unnecessary files
- Close plots on finish
- Code refactoring around the compute abstraction