Pelicun

Latest version: v3.5.1

3.0

* The architecture was redesigned to better support interactive calculation and provide low-level integration across all supported methods. This is the first release with the new architecture; frequent updates are planned over the next few months to provide additional examples, tests, and bug fixes.

* New `assessment` module introduced to replace the `control` module:
  * Provides high-level access to models and their methods.
  * Integrates all types of assessments into a uniform approach.
  * Most methods from the earlier `control` module were moved to the `model` module.

* Decoupled demand, damage, and loss calculations:
  * Fragility functions and consequence functions are stored in separate files. New methods were added to the `db` module to prepare the corresponding data files, and such data were re-generated for FEMA P58 and Hazus earthquake assessments. Hazus hurricane data will be added in a future release.
  * Decoupling removed a large amount of redundant data from the supporting databases and made HDF and JSON files unnecessary for such data. All data are stored in easy-to-read CSV files.
  * Assessment workflows can include all three steps (i.e., demand, damage, and loss) or only one or two of them. For example, damage estimates from one analysis can drive loss calculations in another.

* Integrated damage and loss calculation across all methods and components:
  * Damage calculation covers phenomena such as collapse (including various collapse modes) and irreparable damage.
  * Cascading damage and other interdependencies between components can be introduced using a damage process file.
  * Losses can be driven by damage or by demands. The former supports the conventional damage->consequence function approach, while the latter supports vulnerability functions. The two can be combined within the same analysis if needed.
  * The same loss component can be driven by multiple types of damage. For example, replacement can be triggered by either collapse or irreparable damage.

* Introduced *Options* in the configuration file and in the `base` module:
  * Options handle settings that concern pelicun behavior, general preferences that might affect multiple assessment models, and settings that users would not want to change frequently.
  * Default settings are provided in a `default_config.json` file. They can be overridden by assigning user-defined values to any of the prescribed keys in the configuration file of an analysis.
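As an illustration, a user configuration could override a few defaults like this. The specific keys shown are hypothetical examples for the override mechanism, not the actual `default_config.json` schema:

```json
{
    "Options": {
        "Verbose": true,
        "Seed": 42
    }
}
```
Any key omitted here keeps its value from `default_config.json`.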

* Introduced consistent handling of units. Each CSV table has a standard column describing the units of the data in it. If that column is missing, the table is assumed to use SI units.
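A minimal sketch of what such a table could look like; the component IDs, column names, and values below are invented for illustration and do not reflect pelicun's actual schema:

```csv
ID,Units,Theta_0,Theta_1
CMP.A,g,0.80,0.4
CMP.B,rad,0.02,0.5
```
Here the `Units` column states the unit of the demand values in each row; dropping the column would make the table default to SI units.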

* Introduced consistent handling of pandas MultiIndex objects in headers and indexes. When tabular data are stored in CSV files, MultiIndex objects are converted to simple indexes by concatenating the strings at each level, separated by a `-`. This facilitates post-processing the CSV files in pandas without impeding post-processing in non-Python environments.
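The level-concatenation convention is easy to reproduce in pandas. A minimal sketch (the column labels are made up for illustration, not pelicun's actual headers):

```python
import pandas as pd

# Two-level column MultiIndex, similar in spirit to pelicun's tabular outputs.
df = pd.DataFrame(
    [[0.1, 0.2], [0.3, 0.4]],
    columns=pd.MultiIndex.from_tuples([("PFA", "1"), ("PID", "2")]),
)

# Flatten the MultiIndex before writing CSV: concatenate the strings
# at each level, separated by "-".
df.columns = ["-".join(map(str, levels)) for levels in df.columns]

print(list(df.columns))  # ['PFA-1', 'PID-2']
```
The flattened names can be split back on `-` to rebuild the MultiIndex after reading the CSV.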

* Updated the DL_calculation script to support the new architecture. Currently, only the config file input is used. Other arguments were kept in the script for backwards compatibility; future updates will remove some of those arguments and introduce new ones.

* The log files were redesigned to provide more legible, better-organized information about the assessment.

2.6

* Support EDPs with names longer than 3 characters and/or containing a variable (e.g., SA_1.0 or SA_T1).
* Support fitting a normal distribution to raw EDP data (lognormal was already available).
* Extract key settings to `base.py` to make them more accessible for users.
* Minor bug fixes, mostly related to hurricane storm surge assessment.
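A minimal sketch of what fitting both distribution types to raw EDP data looks like with scipy; the data values are invented, and pelicun's own fitting routines (in its `uq` module) are more involved:

```python
import numpy as np
from scipy import stats

# Hypothetical raw EDP data (e.g., peak floor accelerations); values made up.
edp = np.array([0.21, 0.35, 0.28, 0.41, 0.30, 0.25, 0.38, 0.33])

# Lognormal fit (previously available): fit a normal to the log of the data.
mu_ln, sigma_ln = stats.norm.fit(np.log(edp))

# Normal fit (new in 2.6): fit directly to the raw data.
mu, sigma = stats.norm.fit(edp)

print(round(mu, 3), round(sigma, 3))
```
`stats.norm.fit` returns the maximum-likelihood estimates, i.e., the sample mean and the population (ddof=0) standard deviation.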

2.5

* Extend the `uq` module to support:
  * More efficient sampling, especially when most of the random variables in the model are either independent or perfectly correlated.
  * More accurate and more efficient fitting of multivariate probability distributions to raw EDP data.
  * Arbitrary marginals (beyond the basic Normal and Lognormal) for joint distributions.
  * Latin Hypercube Sampling.
* Introduce external auto-population scripts and provide an example for hurricane assessments.
* Add a script to help users convert HDF files to CSV (`HDF_to_CSV.py` under `tools`).
* Use unique and standardized attribute names in the input files.
* Migrate to the latest versions of Python, numpy, scipy, and pandas (see `setup.py` for the required minimum versions of those tools).
* Bug fixes and minor improvements to support user needs:
  * Add a 1.2 scale factor for EDPs controlling non-directional Fragility Groups.
  * Remove the dependency on scipy's `truncnorm` function to avoid long computation times caused by a bug in recent scipy versions.
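The Latin Hypercube Sampling item above can be illustrated with scipy's `qmc` module; this is a generic sketch of the technique, not pelicun's internal implementation:

```python
import numpy as np
from scipy.stats import qmc

# Latin Hypercube Sampling: n samples in d dimensions, with exactly one
# sample falling in each of the n equal-probability bins per dimension.
sampler = qmc.LatinHypercube(d=2, seed=42)
sample = sampler.random(n=8)  # values in [0, 1)

# Verify the stratification property: per dimension, the bin indices
# floor(n * u) form a permutation of 0..n-1.
bins = np.floor(sample * 8).astype(int)
for col in bins.T:
    assert sorted(col) == list(range(8))
```
Compared with plain Monte Carlo, this stratification covers the marginal ranges evenly with far fewer samples, which is why it helps the sampling efficiency goals listed above.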

2.1.1

- Aggregate DL data from JSON files to HDF5 files. This greatly reduces the number of files and makes it easier to share databases.
- Significant performance improvements in EDP fitting, damage and loss calculations, and output file saving.
- Add log file to pelicun that records every important calculation detail and warnings.
- Add 8 new EDP types: RID, PMD, SA, SV, SD, PGD, DWD, RDR.
- Drop support for Python 2.x and add support for Python 3.8.
- Extend auto-population logic with solutions for HAZUS EQ assessments.
- Several bug fixes and minor improvements to support user needs.

2.0.0

- Migrated to the latest versions of Python, numpy, scipy, and pandas (see setup.py for the required minimum versions of those tools).
- Python 2.x is no longer supported.
- Improved the DL input structure to:
  - make it easier to define complex performance models;
  - make input files easier to read;
  - support custom, non-PACT units for component quantities;
  - support different component quantities on every floor.
- Updated FEMA P58 DL data to use ea (each) for equipment instead of units such as KV, CF, AP, TN.
- Added FEMA P58 2nd edition DL data.
- Support EDP inputs in standard CSV format.
- Added a function that produces SimCenter DM and DV JSON output files.
- Added a differential evolution algorithm to the EDP fitting function to do a better job of finding the global optimum.
- Enhanced DL_calculation.py to handle multi-stripe analysis (significant contributions by Joanna Zou):
  - recognize stripe_ID and occurrence rate in the BIM/EVENT file;
  - fit a collapse fragility function to empirical collapse probabilities;
  - perform loss assessment for each stripe independently and produce corresponding outputs.
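The differential evolution item above can be sketched generically: the global optimizer searches a bounded parameter space for the maximum-likelihood fit, avoiding the local optima that gradient-based fits can get stuck in. This is an illustrative sketch with synthetic data, not pelicun's actual fitting routine:

```python
import numpy as np
from scipy.optimize import differential_evolution
from scipy.stats import norm

rng = np.random.default_rng(0)
# Synthetic "EDP" data drawn from a known normal distribution (values invented).
data = rng.normal(loc=2.0, scale=0.5, size=500)

# Negative log-likelihood of a normal distribution with parameters (mu, sigma).
def nll(theta):
    mu, sigma = theta
    return -np.sum(norm.logpdf(data, loc=mu, scale=sigma))

# Differential evolution explores the bounded parameter space globally
# instead of following a single gradient path from one starting point.
result = differential_evolution(nll, bounds=[(0.0, 5.0), (0.01, 2.0)], seed=1)
mu_hat, sigma_hat = result.x
print(mu_hat, sigma_hat)  # close to the true loc=2.0, scale=0.5
```
The recovered parameters land at the maximum-likelihood estimates (the sample mean and population standard deviation) to within the optimizer's tolerance.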

1.2

- support for HAZUS hurricane wind damage and loss assessment
- add HAZUS hurricane DL data for wooden houses
- move DL resources inside the pelicun folder so that they come with pelicun when it is pip installed
- add various options for EDP fitting and collapse probability estimation
- improve the way warning messages are printed to make them more useful
