Unfurl

1.1.0

<small>[Compare with 1.0.0](https://github.com/onecommons/unfurl/compare/v1.0.0...v1.1.0) </small>

Major changes

Ansible configurator

* A playbook's host is now set automatically if not explicitly specified. See <https://docs.unfurl.run/configurators.html#playbook-processing> for the selection rules.
* If the `playbook` and `inventory` input parameters have a string value, detect whether to treat it as a file path or to parse it as YAML.
* Fix rendering of `inventory.yml` when the `inventory` input parameter is set to inline YAML.
* If the `ansible_connection` host var is not explicitly set, default to "ssh" if we're also setting `ansible_host` to the node's `ip_address`.

1.0.0

<small>[Compare with 0.9.1](https://github.com/onecommons/unfurl/compare/v0.9.1...v1.0.0) </small>

Features

We've strived to maintain backwards compatibility and API stability for a while now, so for this release we decided to go ahead and christen it 1.0 🎉.

Major new features include:

TOSCA namespaces and global type identifiers

This release adds features designed to enable 3rd party type libraries to be shared within the same TOSCA topology and for Unfurl API consumers (such as Unfurl Cloud) to manage them.

* Namespace isolation.

Each imported service template is placed in a separate namespace that is used to resolve type references in the file. It includes the types defined in that file along with the types it imports, with those type names prefixed if the `namespace_prefix` key is set on the import definition. The namespace will be unique to that import unless an explicit namespace is declared (see below). This can be disabled or limited using the new `UNFURL_GLOBAL_NAMESPACE_PACKAGES` environment variable (see _Breaking changes_ below). Note that the Python DSL already behaves this way, as documented [here](https://github.com/onecommons/unfurl/tree/main/tosca-package#imports-and-repositories).

* Support for TOSCA 1.3's `namespace` field

If a service template explicitly declares a namespace using the `namespace` keyword, its namespace will be assigned that name and namespace isolation will be disabled for any templates it imports -- so any import will share the same namespace unless it also declares its own namespace. In addition, any other template that declares the same namespace identifier will be placed in the same namespace. Shared namespaces mean a template can reference types it didn't explicitly import and can overwrite existing type definitions with the same name, so declaring namespaces is not recommended.

* Globally unique identifiers for types.

Namespaces are used to generate globally unique type names, and the Unfurl Server APIs and GraphQL/JSON export format have been updated to use these globally unique names.
Global names follow the format `<typename>@<namespace_id>`, where `namespace_id` is the namespace the type was declared in. If a namespace id isn't explicitly declared using the `namespace` keyword, one is generated from the package id of the type's repository or current project, and optionally a file path if the type isn't defined in the root service template.
For example: `ContainerComputeHost@unfurl.cloud/onecommons/std` (a type defined in `service-template.yaml`) and `EC2Instance@unfurl.cloud/onecommons/std:aws` (a type defined in `aws.yaml`). TOSCA and Unfurl types defined in the core vocabulary (which don't need to be imported) are not qualified. Built-in Unfurl types that do need to be imported use `unfurl` as their package id, for example: `unfurl.nodes.Installer.Terraform@unfurl:tosca_plugins/artifacts`.
Generation of global type names in export and the APIs can be disabled by setting the `UNFURL_EXPORT_LOCALNAMES` environment variable (see _Breaking changes_ below).

* Cross-package interoperability with type metadata.

A package can declare compatibility with types in different packages without having to import those packages by using the `aliases` and `deprecates` keywords in the metadata section of a type definition. The keyword's value can be either a fully qualified type name or a list of fully qualified type names, and it indicates that the type is equivalent to the listed types. This is used both by the parser and by the API used by Unfurl Cloud's UI (export includes those types in the exported type's `extends` section).

Unfurl Server and export APIs

* The Unfurl Server (and Unfurl Cloud front-end) patching API now uses global type names to generate import statements with prefixes as needed to prevent clashes between packages with the same name.

* Support for HTTP caching: improved etag generation and Cache-Control headers that enable browser and proxy caches to serve stale content while slow requests are being processed. Use the `CACHE_CONTROL_SERVE_STALE` environment variable to configure this.

* Add a Dockerfile that includes an nginx caching proxy in front of the Unfurl server. Prebuilt container images are provided as `onecommons/unfurl:v1.0.0-server-cached` on docker.io and ghcr.io.

* Improvements to Redis caching: better dependency tracking, caching of commit dates for more efficient shallow clones, and improved error handling and recovery.

* Local developer mode will now serve content from any local repository tracked by the current project or the home project (in `local/unfurl.yaml`). It also improves handling of local changes and error recovery.

* Improve type annotations for new GraphQL types and the consolidated GraphQL schema.

* Add support for a `property_metadata` metadata key to apply metadata to individual properties on a TOSCA datatype.

For example, this declaration applies the `user_settable` metadata key to the `environment` property of `unfurl.datatypes.DockerContainer`:

```yaml
container:
  type: unfurl.datatypes.DockerContainer
  metadata:
    property_metadata:
      environment:
        user_settable: true
```

Python DSL

Service templates written in Python now have the following integration points with Unfurl's runtime:

* `ToscaType` methods that implement TOSCA operations can return a Python method instead of an artifact or configurator, and that method will be converted to a configurator (see the sketch after this list).
* If a TOSCA operation attempts to execute runtime-only functionality, it will be invoked during the job's render phase instead of being converted to YAML.
* Introduce a `Computed()` field specifier for TOSCA properties whose value is computed at runtime with the given method.
* Add a `unfurl.tosca_plugins.functions` module containing utility functions that can be executed in the safe mode Python sandbox. This allows these functions to be executed in "spec" mode (e.g. as part of a class definition or in `_class_init`).
* Add a `unfurl.expr` module containing type-safe python equivalents to Unfurl's eval expression functions.
* Add methods and free functions providing type-safe equivalents to Unfurl's query expressions: the `find_configured_by`, `find_hosted_on`, `find_required_by`, and `find_all_required_by` APIs.
* Providing these APIs required synchronizing a job's task context as per-thread global runtime state; public APIs were added for querying that global state and retrieving the current `RefContext`.
* Treat non-required instance properties as optional (return None instead of raising KeyError).
* Add a public `unfurl.testing.create_runner()` API for writing unit tests for Python DSL service templates.
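
For illustration, here is a minimal sketch (not taken from the release notes) of the first integration point above: a node type whose `create` operation returns a plain Python method, which Unfurl then wraps in a configurator. The type and method names are invented, and the exact runtime signature expected of the returned callable is an assumption.

```python
import tosca

class HelloNode(tosca.nodes.Root):
    greeting: str = "hello"

    def _say_hello(self):
        # runtime-only logic: runs when the job executes the generated configurator
        print(self.greeting)

    def create(self, **kwargs):
        # returning a method (instead of an artifact or configurator) hands it to
        # Unfurl, which converts it into a configurator per this release
        return self._say_hello
```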

Various improvements to YAML-to-Python and Python-to-YAML conversion, including:

* Better support for artifacts
* Better default operation conversion
* Support for aliased references (multiple variables assigned to the same type).
* Special case module attributes named `__root__` to generate TOSCA `substitution_mappings` (aka `root`). For example:

```python
__root__ = my_template
```

or

```python
__root__ = MyNodeType
```


DSL API improvements:

* Add a `NodeTemplateDirective` string enum for type-safe node template directives.
* Add an `anymethod` decorator for creating methods that can act as both a classmethod and a regular method.
* Revamp the `Options` API for typed and validated metadata on field specifiers.
* Add a `DEFAULT` sentinel value to indicate that a field should have a default value constructed from its type annotation. This helps when using forward references to types that aren't defined yet and avoids a bit of repetition (see the sketch after this list).
* Add a similar `CONSTRAINED` sentinel value to indicate that the property's value will be set in `_class_init`.
* Add a `unfurl.support.register_custom_constraint()` API for registering custom TOSCA property constraint types.
* Add an `UNFURL_TEST_SAFE_LOADER` environment variable to force a runtime exception if the TOSCA loader isn't in safe mode, or to disable safe mode (for testing): set `UNFURL_TEST_SAFE_LOADER=never` to disable safe mode; any other non-empty value enforces it.
* Improve the [API documentation](https://docs.unfurl.run/api.html#api-for-writing-service-templates).
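
To make the `DEFAULT` sentinel above concrete, here is a hedged sketch; the import location of `DEFAULT` is an assumption and the type names are invented:

```python
import tosca
from tosca import DEFAULT  # assumed export location

class Volume(tosca.nodes.Root):
    size: int = 10

class Host(tosca.nodes.Root):
    # DEFAULT asks the DSL to construct the field's default from its type
    # annotation (here, a Volume() template) instead of repeating it by hand,
    # which is handy when the annotation is a forward reference
    volume: "Volume" = DEFAULT
```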

0.9.1

<small>[Compare with 0.9.0](https://github.com/onecommons/unfurl/compare/v0.9.0...v0.9.1) </small>

Features

TOSCA and Python DSL

* Add `ToscaInputs` classes for Unfurl's built-in configurators to enable static type checking of configurator inputs.

* Introduce the `Options` class to enable typed metadata for TOSCA fields and have the Terraform configurator add `tfvar` and `tfoutput` options to automatically map properties and attributes to Terraform variables and outputs, respectively.

This example uses the above features to integrate a Terraform module.

```python
import tosca
from tosca import Attribute, Property, operation  # DSL imports assumed; omitted in the original snippet
from unfurl.configurators.terraform import TerraformConfigurator, TerraformInputs, tfvar, tfoutput

class GenericTerraformManagedResource(tosca.nodes.Root):
    example_terraform_var: str = Property(options=tfvar)
    example_terraform_output: str = Attribute(options=tfoutput)

    @operation(apply_to=["Install.check", "Standard.configure", "Standard.delete"])
    def default(self, **kw):
        return TerraformConfigurator(TerraformInputs(main="terraform_dir"))
```

* Introduce `unfurl.datatypes.EnvironmentVariables`, a TOSCA datatype that converts to a map of environment variables. Subclass this type to enable statically typed environment variables.

* Allow TOSCA data types to declare a "transform" in metadata that is applied as a property transform.

* `node_filter` improvements:
* Recursively merge `requirements` keys in node filters when determining the node_filter for a requirement.
* Allow `get_nodes_of_type` TOSCA function in node_filter `match` expressions.

* Release version 0.0.5 of the Python tosca package.

Packages

* Allow service templates to declare the unfurl version they are compatible with.

They can do this by declaring a repository for the unfurl package like so:

```yaml
repositories:
  unfurl:
    url: https://github.com/onecommons/unfurl
    revision: v0.9.1
```


Unfurl will still resolve imports in the unfurl package using the locally installed version of Unfurl, but it will raise an error if that version isn't compatible with the version declared here.

* Do semver compatibility check for 0.* versions.

Even though pre-1.0 versions aren't expected to provide semver guarantees, the alternative to doing the semver check is to treat every version as incompatible with every other, which would require every version reference to a package to be updated with each package update. That isn't very useful, especially when developing against an unstable package.

* Support file URLs in package rules.

Minor Enhancements and Notable Bug Fixes

* **parser:** allow merge keys to be optional, e.g. `"+?/a/d"`
* **loader**: Add a `UNFURL_OVERWRITE_POLICY` environment variable to guide the loader's Python-to-YAML converter.
* **loader**: Relax restrictions on `from foo import *` and other bug fixes with the DSL sandbox's Python import loader.
* **init**: apply `UNFURL_SEARCH_ROOT` to unfurl project search.
* **packages**: If a package rule specifies a full url, preserve it when applying the rule.

0.9.0

<small>[Compare with 0.8.0](https://github.com/onecommons/unfurl/compare/v0.8.0...v0.9.0) </small>

Features

***Introduce Python DSL for TOSCA***

Write TOSCA as Python modules instead of YAML. Features include:

* Static type checking of your TOSCA model.
* IDE integration.
* The export command now supports conversion from YAML to Python and Python to YAML.
* The Python data model simplifies TOSCA YAML while still allowing advanced constraints that encapsulate verbose relationships and node filters.
* Python executes in a sandbox to safely parse untrusted TOSCA service templates.

See <https://github.com/onecommons/unfurl/blob/main/tosca-package/README.md> for more information.

Unfurl Cloud local development mode

You can now see local changes to a blueprint project under development on [Unfurl Cloud](https://unfurl.cloud) by running `unfurl serve .` in the project directory. If the project was cloned from Unfurl Cloud, it will connect to that local server to render and deploy that local copy of the blueprint (for security, on your local browser only). Use the `--cloud-server` option to specify an alternative instance of Unfurl Cloud.

Embedded Blueprints (TOSCA substitution mapping)

Support for TOSCA substitution mapping has been stabilized and integrated into Unfurl Cloud.

One notable enhancement is an extension to TOSCA's requirements mapping that lets you essentially parameterize an embedded template: the outer (embedding) template can substitute node templates in the embedded template.

When a substituted node template (in the outer topology) declares a requirement whose name matches the name of a node template in the substituted (inner) topology, that node template will be replaced by the node template targeted by the requirement.

For example, if the substituted (inner) topology looked like:

```yaml
node_types:
  NestedWithPlaceHolder:
    derived_from: tosca:Root
    requirements:
      - placeholder:
          node: PlaceHolderType

topology_template:
  substitution_mapping:
    node_type: NestedWithPlaceHolder

  node_templates:
    placeholder:
      type: PlaceHolderType
      ...
```


Then another topology that is embedding it can replace the "placeholder" template like so:

```yaml
node_templates:
  nested1:
    type: NestedWithPlaceHolder
    directives:
      - substitute
    requirements:
      - placeholder: replacement

  replacement:
    type: PlaceHolderType
    ...
```


CLI Improvements

* Add `--skip-upstream-check` global option which skips pulling latest upstream changes from existing repositories and checking remote repositories for version tags.

Improvements to sub commands:

* **init** Add Azure and Kubernetes project skeletons (`unfurl init --skeleton k8s` and `unfurl init --skeleton azure`)
* **clone** Add a `--design` flag which configures the cloned project for blueprint development.
* **serve** Add a `--cloud-server` option to specify the Unfurl Cloud instance to connect to.
* **export** Add support for exporting TOSCA YAML to Python and Python to YAML; add `--overwrite` and `--python-target` options.
* **cloudmap** Also allow host URLs for options that had only accepted a pre-configured name of a repository host.

Dry Run improvements

* When deploying jobs with `--dryrun`, if a `Mock` operation is defined for a node type or template, it will be invoked when the configurator doesn't support dry run mode.
* The **terraform** configurator now supports `dryrun_mode` and `dryrun_outputs` input parameters.

Runtime Eval Expressions

* Eval expressions can now be used in `node_filter`s to query for node matches using the new `match` keyword in the node filter.
* Add a ".hosted_on" key to eval expressions that (recursively) follows ".targets" filtered by the `HostedOn` relationship, even across topology boundaries.
* Add optional `wantList` parameter to the jinja2 `eval` filter to guarantee the result is a list e.g. `{{"expression" | eval(wantList=True)}}`
* The `trace` keyword in eval expressions now accepts `break` as a value. This will invoke Python's `breakpoint()` function when the expression is evaluated.

New Environment Variables

* `UNFURL_SEARCH_ROOT` environment variable: When searching for ensembles and Unfurl projects, Unfurl will stop when it reaches the directory this is set to.

* `UNFURL_SKIP_SAVE` environment variable: If set, skips saving `ensemble.yaml` to disk after a job runs.

Minor Enhancements and Notable Bug Fixes

* **artifacts:** Add an `asdf` artifact for `kompose` and have the **kompose** configurator schedule the artifact if `kompose` is missing.
* **parser:** treat import failures as fatal errors (abort parsing and validation).
* **parser:** better syntax validation of `+include` statements.
* **cloudmap:** allow anonymous connections to GitLab and the Unfurl Cloud API for read-only access
* **plan:** fix spurious validation failures when creating initial environment variable rules
* **tosca:** Add support for the 'unsupported', 'deprecated', and 'removed' property statuses.
* **tosca:** Add support for bitrate scalar units.
* **tosca:** fix built-in storage type definition
* **plan:** fix candidate status check when deleting instances.
* **server:** make sure the generated local unfurl.yaml exists when patching deployments.
* **packages:** warn when remote lookup of version tags fails.
* **repo:** fix matching local paths with repository paths.
* **loader:** fix relative path lookups when processing an `+include` directive during a TOSCA import of a file in a repository.
* **logging:** improve readability when pretty printing dictionaries.

0.8.0

<small>[Compare with 0.7.1](https://github.com/onecommons/unfurl/compare/v0.7.1...v0.8.0)</small>

Breaking Changes

* The Unfurl package now only depends on the [ansible-core](https://pypi.org/project/ansible-core) package instead of the full [ansible](https://pypi.org/project/ansible/) package. Unfurl projects that depend on Ansible modules installed by the full package will not work with new Unfurl installations unless it is installed by some other means -- or, better, the project declares an Ansible collection artifact as a dependency on the template that requires it. For an example, see this usage in [docker-template.yaml](https://github.com/onecommons/unfurl/blob/v0.7.2/unfurl/configurators/docker-template.yaml#L127C29-L127C29).

Features

* Allow Ansible collections to be declared as TOSCA artifacts. Some predefined ones are defined [here](https://github.com/onecommons/unfurl/blob/v0.7.2/unfurl/tosca_plugins/artifacts.yaml#L163).
* Unfurl Server now tracks the dependent repositories accessed when generating cached representations (e.g. for `/export`) and uses that information to invalidate cache items when the dependencies change.
* Unfurl Server now caches more git operations, controlled by these environment variables: `CACHE_DEFAULT_PULL_TIMEOUT` (default: 120s) and
`CACHE_DEFAULT_REMOTE_TAGS_TIMEOUT` (default: 300s)
* Unfurl Server: add a `/types` endpoint that can extract types from a cloudmap.
* API: allow simpler [Configurator.run()](https://docs.unfurl.run/api.html#unfurl.configurator.Configurator.run) implementations (see the sketch after this list).
* cloudmap: The [cloudmap](https://docs.unfurl.run/cli.html#unfurl-cloudmap) command now supports `--import local`.
* eval: Unfurl's Jinja2 filters that are marked as safe can now be used in safe evaluation mode (currently: `eval`, `map_value`, `sensitive`, and the `to_*_label` family).
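
As background for the `Configurator.run()` item above, a bare-bones configurator in the long-standing generator style looks roughly like this (a sketch, not taken from the linked docs; this release's change is about permitting even simpler variants):

```python
from unfurl.configurator import Configurator

class EchoConfigurator(Configurator):
    def run(self, task):
        # log the operation's inputs and report success back to the job
        task.logger.info("running with inputs: %s", task.inputs)
        yield task.done(True)
```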

Bug Fixes

* jobs: Fix issue where some tasks that failed during the render phase were missing from the job summary.
* jobs: Don't apply the `--dryrun` option to the tasks that are installing local artifacts. (Since they will most likely be needed to execute the tasks in the job even in dryrun mode.)
* k8s: Fix evaluation of the `kubernetes_current_namespace()` expression function outside of a job context
* k8s: Filter out data keys with null values when generating a Kubernetes Secret resource.
* helm: Fix `check` operation for `unfurl.nodes.HelmRepository`
* helm: If the Kubernetes environment has the insecure flag set, pass `--kube-insecure-skip-tls-verify` to helm.

Misc

* Introduce CHANGELOG.md (this file -- long overdue!)
* CI: container images will be built and pushed to <https://github.com/onecommons/unfurl/pkgs/container/unfurl> with every git push to CI, regardless of the branch. (In addition to the container images at <https://hub.docker.com/r/onecommons/unfurl>, which are only built from main.)

0.0.8

Release includes the following fixes and enhancements:

* yaml to python: more idiomatic Python when importing ``__init__.yaml``
* yaml to python: use import namespace when following imports
* yaml to python: don't forward reference built-in types
* overwrite policy: don't overwrite if converted contents didn't change
* remove dependency on unfurl package.
* support array and key access in field projections
* allow regular data as arguments to boolean expressions.
* add `fallback(left: Optional[T], right: T) -> T` to `unfurl.tosca_plugins.expr` for type-safe default expressions (see the sketch after this list).
* move the `tfoutput` and `tfvar` Options to `unfurl.tosca_plugins.expr` (this makes them available in safe mode).
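
A tiny usage sketch of the `fallback()` helper listed above, relying only on the signature given there and assuming it returns the left value when it isn't None and the right-hand default otherwise (the surrounding function is invented):

```python
from typing import Optional

from unfurl.tosca_plugins.expr import fallback

def image_tag(requested: Optional[str]) -> str:
    # use the requested tag when provided, otherwise the default
    return fallback(requested, "latest")
```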

Server enhancements and fixes

* Add a `/empty_cache` POST endpoint for clearing the entire cache (with an optional `prefix` parameter). Access requires the `UNFURL_SERVER_ADMIN_PROJECT` environment variable to be set and the `auth_project` URL parameter to match it (see the sketch after this list).
* Patch and export now support the "branch" field on DeploymentTemplate
* Invalidate cached blueprint exports even if the file in the cache key didn't change.
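
For illustration only, a call to the `/empty_cache` endpoint described above might look like this; the server address is a placeholder and the parameter values are examples:

```python
import requests

# clear cached entries under a prefix; auth_project must match the server's
# UNFURL_SERVER_ADMIN_PROJECT setting
response = requests.post(
    "http://localhost:8080/empty_cache",
    params={"auth_project": "my-admin-project", "prefix": "blueprints/"},
)
response.raise_for_status()
```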

Other Notable bug fixes

897c32ba testing: add mypy testing api and make lifecycle api more flexible

3f80e5bf job: fix for TaskView.find_connection()

269f981c runtime: refine "is resource computed" heuristic

253f55aa cloudmap: fix tag refs on gitlab hosts

61da0c6f clone: apply revision if found in fragment of the cloned url

63e2f48a kompose: have KomposeInputs use ``unfurl_datatypes_DockerContainer``

03feb7d7 plan: exclude replaced node templates from relationships

0922b557 logging: smarter truncation of log messages with stack traces

aaabb8a2 packages: fix resolving package compatibility

667c59d9 export: stop clearing out requirement match pointing to nested templates

5bfef1d8 export: fix `get_nodes_of_type` in match filters

a3c8fb54 export: stop hoisting default templates as requirements.

df2d3e97 parser: fix matching when a requirement target is substituted by the outer topology template.

a3f7feae parser: fix typename checking when evaluating 'node' field on requirements

b570246f parser: use the type's namespace when creating properties.

1b965e5b parser: fix ``Namespace.get_global_name_and_prefix()``

Breaking changes

Drop support for Python 3.7
