
Releases: gem/oq-engine

OpenQuake Engine 3.7.1

25 Oct 09:32

[Michele Simionato (@micheles)]

  • Fixed disaggregation with a parent calculation
  • Fixed the case of implicit grid with a site model: sites could be
    incorrectly discarded
  • Fixed the ShakeMap downloader to also find unzipped uncertainty.xml
    files
  • Fixed the rupture exporters to export the rupture ID and not the
    rupture serial

[Marco Pagani (@mmpagani)]

  • Fixed a bug in the GenericGmpeAvgSA

OpenQuake Engine 3.7.0

26 Sep 09:52

[Michele Simionato (@micheles)]

  • Hiding calculations that fail before the pre-execute phase (for instance,
    because of missing files); they already give a clear error
  • Added an early check on truncation_level in presence of correlation model

[Guillaume Daniel (@guyomd)]

  • Implemented Ameri (2017) GMPE

[Michele Simionato (@micheles)]

  • Changed the ruptures CSV exporter to use commas instead of tabs
  • Added a check forbidding aggregate_by for non-ebrisk calculators
  • Introduced a task queue
  • Removed the cache_XXX.hdf5 files by using the SWMR mode of h5py

[Kris Vanneste (@krisvanneste)]

  • Updated the coefficients table of the atkinson_2015 GMPE to the actual
    values in the paper.

[Michele Simionato (@micheles)]

  • Added an /extract/agg_curves API to extract both absolute and relative
    loss curves from an ebrisk calculation (see the usage sketch after this
    list)
  • Changed oq reset --yes to remove oqdata/user only in single-user mode
  • Now the engine automatically sorts the user-provided intensity_measure_types
  • Optimized the aggregation by tag
  • Fixed a bug with the binning when disaggregating around the date line
  • Fixed a prefiltering bug with complex fault sources: in some cases, blocks
    of ruptures were incorrectly discarded
  • Changed the sampling algorithm for the GMPE logic trees: now it does
    not require building the full tree in memory
  • Raised clear errors for geometry files without quotes or with the wrong
    header in the multi_risk calculator
  • Changed the realizations.csv exporter to export '[FromShakeMap]' instead
    of '[FromFile]' when needed
  • Changed the agg_curves exporter to export all realizations in a single file
    and all statistics in a single file
  • Added rlz_id, rup_id and year to the losses_by_event output for ebrisk
  • Fixed a bug in the ruptures XML exporter: the multiplicity was multiplied
    (incorrectly) by the number of realizations
  • Fixed the pre-header of the CSV outputs to get proper CSV files
  • Replaced the 64 bit event IDs in event based and scenario calculations
    with 32 bit integers, for the happiness of Excel users
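
A minimal usage sketch for the new /extract/agg_curves API, via the Python
Extractor client (not part of the release notes; the calculation ID and the
kind/loss_type query parameters below are illustrative assumptions):

      # hedged sketch: extract aggregate loss curves from a finished ebrisk run
      from openquake.calculators.extract import Extractor

      ex = Extractor(1234)                 # hypothetical local calculation ID
      curves = ex.get('agg_curves?kind=stats&loss_type=structural')
      print(curves.array.shape)            # numpy array of aggregate loss curves
      ex.close()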

[Daniele Viganò (@daniviga)]

  • Numpy 1.16, Scipy 1.3 and h5py 2.9 are now required

[Michele Simionato (@micheles)]

  • Changed the ebrisk calculator to read the CompositeRiskModel directly
    from the datastore, which means 20x less data transfer for Canada

[Anirudh Rao (@raoanirudh)]

  • Fixed a bug in the gmf CSV importer: the coordinates were being
    sorted and new site_ids assigned even though the user-provided sites
    CSV file had site_ids defined

[Michele Simionato (@micheles)]

  • Fixed a bug in the rupture CSV exporter: the boundaries of a GriddedRupture
    were exported with lons and lats inverted
  • Added some metadata to the CSV risk outputs
  • Changed the distribution mechanism in ebrisk to reduce the slow tasks

[Graeme Weatherill (@g-weatherill)]

  • Updates Kotha et al. (2019) GMPE to July 2019 coefficients
  • Adds subclasses to Kotha et al. (2019) to implement polynomial site
    response models and geology+slope site response model
  • Adds QA test to exercise all of the SERA site response calculators

[Michele Simionato (@micheles)]

  • Internal: there is no need to call gsim.init() anymore

[Graeme Weatherill (@g-weatherill)]

  • Adds parametric GMPE for cratonic regions in Europe

[Michele Simionato (@micheles)]

  • Fixed a bug in the agglosses output of scenario_risk: the losses were
    incorrectly multiplied by the realization weight
  • Removed the output sourcegroups and added the output events

[Graeme Weatherill (@g-weatherill)]

  • Adds new meta ground motion models to undertake PSHA using design code
    based amplification coefficients (Eurocode 8, Pitilakis et al., 2018)
  • Adds site amplification model of Sandikkaya & Dinsever (2018)

[Marco Pagani (@mmpagani)]

  • Added a new rupture-site metric: the azimuth to the closest point on the
    rupture

[Michele Simionato (@micheles)]

  • Fixed a regression in disaggregation with nonparametric sources, which
    were effectively discarded
  • The site amplification has been disabled by default in the ShakeMap
    calculator, since it is usually already taken into account by the USGS

[Daniele Viganò (@daniviga)]

  • Deleted calculations are not removed from the database anymore
  • Removed the 'oq dbserver restart' command since it was broken

[Richard Styron (@cossatot)]

  • Fixed YoungsCoppersmith1985MFD.from_total_moment_rate(): due to numeric
    errors it was producing incorrect seismicity rates

[Michele Simionato (@micheles)]

  • Now we generate the output disagg_by_src during disaggregation even in the
    case of multiple realizations
  • Changed the way the random seed is set for BT and PM distributions
  • The filenames generated by the disagg_by_src exporter now contain the site
    ID and not longitude and latitude, consistently with the other exporters
  • Accepted again meanLRs greater than 1 in vulnerability functions of kind LN
  • Fixed a bug in event based with correlation and a filtered site collection
  • Fixed the CSV exporter for the realizations in the case of scenarios
    with parametric GSIMs
  • Removed some misleading warnings for calculations with a site model
  • Added a check for missing risk_investigation_time in ebrisk
  • Drastically reduced (I measured improvements over 40x) the memory
    occupation, data transfer and data storage for multi-site disaggregation
  • Sites for which the disaggregation PoE cannot be reached are discarded
    and a warning is printed, rather than killing the whole computation
  • oq show performance can be called in the middle of a computation again
  • Filtered out the far away distances and reduced the time spent in
    saving the performance info by orders of magnitude in large disaggregations
  • Reduced the data transfer by reading the data directly from the
    datastore in disaggregation calculations
  • Reduced the memory consumption sending disaggregation tasks incrementally
  • Added an extract API disagg_layer
  • Moved max_sites_disagg from openquake.cfg into the job.ini
  • Fixed a bug with the --config option: serialize_jobs could not be overridden
  • Implemented insured losses

OpenQuake Engine 3.6.0

16 Jul 10:15

[Michele Simionato (@micheles)]

  • Fixed a spurious applyToSources error claiming that a source was not in
    the source model even though it actually was

[Chris Van Houtte (@cvanhoutte)]

  • Adds the Van Houtte et al. (2018) significant duration model for New
    Zealand

[Michele Simionato (@micheles)]

  • Added a way to compute and plot the MFD coming from an event based
  • Storing the MFDs in TOML format inside the datastore

[Robin Gee (@rcgee)]

  • Moves b4 constant into COEFFS table for GMPE Sharma et al., 2009

[Graeme Weatherill (@g-weatherill)]

  • Adds functionality to Cauzzi et al. (2014) and Derras et al. (2014)
    calibrated GMPEs for Germany to use either finite or point source distances

[Michele Simionato (@micheles)]

  • Restored the ability to associate site model parameters to a grid of sites
  • Made it possible to set hazard_curves_from_gmfs=true with
    ground_motion_fields=false in the event based hazard calculator
  • Introduced a mechanism to split the tasks based on an estimated duration
  • Integrated oq plot_memory into oq plot
  • Removed NaN values for strike and dip when exporting griddedRuptures
  • Fixed oq reset to work in multi-user mode
  • Extended the source_id-filtering feature in the job.ini to multiple sources
  • Supported WKT files for the binary perils in the multi_risk calculator
  • Added an early check on the coefficients of variation and loss ratios of
    vulnerability functions with the Beta distribution
  • Made sure that oq engine --dc removes the HDF5 cache file too
  • Removed the flag optimize_same_id_sources because it is useless now
  • Introduced a soft limit at 65,536 sites for event_based calculations
  • Fixed a performance regression in ucerf_classical that was filtering
    before splitting, thus becoming extra-slow
  • Improved the progress log, which was delayed for large classical calculations
  • Exported the ruptures as 3D multi-polygons (instead of 2D ones)
  • Changed the aggregate_by exports for consistency with the others
  • Changed the losses_by_event exporter for ebrisk, to make it more
    consistent with scenario_risk and event_based_risk
  • Changed the agglosses and losses_by_event exporters in scenario_risk,
    by adding a column with the realization index
  • Changed the generation of the hazard statistics to consume very little
    memory
  • Fixed a bug with concurrent_tasks being inherited from the parent
    calculation instead of using the standard default
  • Removed the dependency on mock, since it is included in unittest.mock
  • For scenario, replaced the branch_path with the GSIM representation in
    the realizations output
  • Added a check for suspiciously large source geometries
  • Deprecated the XML disaggregation exporters in favor of the CSV exporters
  • Turned the disaggregation calculator into a classical post-calculator
    to use the precomputed distances and speedup the computation even more
  • Fixed the disaggregation calculator by discarding the ruptures outside
    the integration distance
  • Optimized the speed of the disaggregation calculator by moving a statistical
    function outside of the inner loop
  • Changed the file names of the exported disaggregation outputs
  • Fixed an export agg_curves issue with pre-imported exposures
  • Fixed an export agg_curves issue when the hazard statistics are different
    from the risk statistics
  • Removed the disaggregation statistics: now the engine disaggregates only on
    a single realization (default: the closest to the mean)
  • Forbidden disaggregation matrices with more than 1 million elements
  • Reduced the data transfer when computing the hazard curves
  • Optimized the reading of large CSV exposures
  • Fixed the --hc functionality across users
  • Optimized the reduction of the site collection on the exposure sites
  • Made the gsim logic tree parser more robust: lines like
    <uncertaintyModel gmpe_table="../gm_tables/Woffshore_low_clC.hdf5">
    are accepted again
  • Added a check against duplicated values in nodal plane distributions and
    hypocenter depth distributions
  • Changed the support for zipped exposures and source models: now the
    name of the archive must be written explicitly in the job.ini
  • Added support for numpy 1.16.3, scipy 1.3.0, h5py 2.9.0
  • Removed the special case for event_based_risk running two calculations

[Graeme Weatherill (@g-weatherill)]

  • Adds the Tromans et al. (2019) adjustable GMPE for application to PSHA
    in the UK

[Michele Simionato (@micheles)]

  • Optimized src.sample_ruptures for (multi)point sources and area sources
  • Fixed a mutability bug in the DistancesContext and made all context
    arrays read-only: the fix may affect calculations using the GMPEs
    berge_thierry_2003, cauzzi_faccioli_2008 and zhao_2006;
  • Fixed a bug with the minimum_distance feature
  • Fixed a bug in the exporter of the aggregate loss curves: now the loss
    ratios are computed correctly even in presence of occupants
  • Removed the (long time deprecated) capability to read hazard curves and
    ground motion fields from XML files: you must use CSV files instead

[Marco Pagani (@mmpagani)]

  • Implemented a modified GMPE that adds between-event and within-event std
    to GMPEs supporting only the total std

[Michele Simionato (@micheles)]

  • Added the ability to use a taxonomy_mapping.csv file
  • Fixed a bug in classical_damage from CSV: for hazard intensity measure
    levels different from the fragility levels, the engine was giving incorrect
    results
  • Serialized also the source model logic tree inside the datastore
  • Added a check on missing intensity_measure_types in event based
  • Fixed oq prepare_site_model in the case of an empty datadir
  • Added a comment line with useful metadata to the engine CSV outputs
  • Removed the long time deprecated event loss table exporter for event based
    risk and enhanced the losses_by_event exporter to export the realization ID
  • Removed the long time deprecated GMF XML exporter for scenario
  • IMT-dependent weights in the gsim logic tree can be zero, to discard
    contributions outside the range of validity of (some of the) GSIMs
  • Now it is possible to export individual hazard curves from an event
    based calculation
  • Added a view gmvs_to_hazard

OpenQuake Engine 3.5.2

31 May 08:41

[Daniele Viganò (@daniviga)]

  • Fixed packaging issue, the .hdf5 tables for Canada were missing

[Michele Simionato (@micheles)]

  • Fixed regression in the gsim logic tree parser for the case
    of .hdf5 tables

OpenQuake Engine 3.5.1

20 May 14:18

[Michele Simionato (@micheles)]

  • Added a rlzi column to the sig_eps.csv output
  • Accepted GMF CSV files without a rlzi column
  • Accepted a list-like syntax like return_periods=[30, 60, 120, 240, 480]
    in the job.ini, as written in the manual
  • Fixed a bug in the asset_risk exporter for uppercase tags

[Paul Henshaw (@pslh)]

  • Fixed an encoding bug while reading XML files on Windows

OpenQuake Engine 3.5.0

13 May 14:37

[Michele Simionato (@micheles)]

  • Added a view gmvs_to_hazard

[Giovanni Lanzano (@giovannilanzanoINGV)]

  • Lanzano and Luzi (2019) GMPE for volcanic zones in Italy

[Michele Simionato (@micheles)]

  • Now it is possible to export individual hazard curves from an event
    based calculation by setting hazard_curves_from_gmfs = true and
    individual_curves = true (before only the statistics were saved)

[Graeme Weatherill (@g-weatherill)]

  • Adds adaptation of Abrahamson et al. (2016) 'BC Hydro' GMPEs calibrated
    to Mediterranean data and with epistemic adjustment factors

[Chris Van Houtte (@cvanhoutte)]

  • Added new class to bradley_2013b.py for hazard maps
  • Modified test case_37 to test multiple sites

[Marco Pagani (@mmpagani)]

  • Fixed a bug in the logic tree parser and added a check to forbid logic
    trees with applyToSources without applyToBranches, unless there is a
    single source model branch

[Michele Simionato (@micheles)]

  • Removed the experimental parameter prefilter_sources

[Daniele Viganò (@daniviga)]

  • Multiple DbServer ZMQ connections are restored to avoid errors under heavy
    load and/or on slower machines

[Michele Simionato (@micheles)]

  • Removed the ugly registration of custom signals at import time: now they
    are registered only if engine.run_calc is called
  • Removed the dependency on rtree
  • Removed all calls to ProcessPool.shutdown to speed up the tests and to
    avoid non-deterministic errors in atexit._run_exitfuncs

[Marco Pagani (@mmpagani)]

  • Added tabular GMPEs as provided by Michal Kolaj, Natural Resources Canada

[Michele Simionato (@micheles)]

  • Extended the ebrisk calculator to support coefficients of variations

[Graeme Weatherill (@g-weatherill)]

  • Adds Kotha et al. (2019) shallow crustal GMPE for SERA
  • Adds 'ExperimentalWarning' to possible GMPE warnings
  • Adds kwargs to check_gsim function

[Michele Simionato (@micheles)]

  • Fixed problems like SA(0.7) != SA(0.70) in iml_disagg
  • Exposed the outputs of the classical calculation in event based
    calculations with compare_with_classical=true
  • Made it possible to serialize together all kind of risk functions,
    including consequence functions that before were not HDF5-serializable
  • Fixed a MemoryError when counting the number of bytes stored in large
    HDF5 datasets
  • Extended asset_hazard_distance to a dictionary for usage with multi_risk
  • Extended oq prepare_site_model to work with sites.csv files
  • Optimized the validation of the source model logic tree: now checking
    the sources IDs is 5x faster
  • Went back to the old logic in sampling: the weights are used for the
    sampling and the statistics are computed with identical weights
  • Avoided transferring the epsilons by storing them in the cache file
    and changed the event-to-epsilons associations
  • Reduced the data transfer in the computation of the hazard curves, causing
    in some cases huge speedups (over 100x)
  • Implemented a flag modal_damage_state to display only the most likely
    damage state in the output dmg_by_asset of scenario damage calculations
  • Reduced substantially the memory occupation in classical calculations
    by including the prefiltering phase in the calculation phase

[Daniele Viganò (@daniviga)]

  • Added a 'serialize_jobs' setting to the openquake.cfg
    which limits the maximum number of jobs that can be run in parallel

[Michele Simionato (@micheles)]

  • Fixed two exporters for the ebrisk calculator (agg_curves-stats and
    losses_by_event)
  • Fixed two subtle bugs when reading site_model.csv files
  • Added /extract/exposure_metadata and /extract/asset_risk
  • Introduced an experimental multi_risk calculator for volcanic risk

[Guillaume Daniel (@guyomd)]

  • Updating of Berge-Thierry (2003) GSIM and addition of several alternatives
    for use with Mw

[Michele Simionato (@micheles)]

  • Changed the classical_risk calculator to use the same loss ratios for all
    taxonomies and then optimized all risk calculators
  • Temporarily removed the insured_losses functionality
  • Extended oq restore to download from URLs
  • Removed the column 'gsims' from the output 'realizations'
  • Better parallelized the source splitting in classical calculations
  • Added a check for missing hazard in scenario_risk/scenario_damage
  • Improved the GsimLogicTree parser to get the line number information, a
    feature that was lost with the passage to Python 3.5
  • Added a check against misspellings in the loss type in the risk keys
  • Changed the aggregation WebAPI from
    aggregate_by/taxonomy,occupancy/avg_losses?kind=mean&loss_type=structural to
    aggregate/avg_losses?kind=mean&loss_type=structural&tag=taxonomy&tag=occupancy
    (see the sketch after this list)
  • Do not export the stddevs in scenario_damage in the case of 1 event
  • Fixed export bug for GMFs imported from a file
  • Fixed an encoding error when storing a GMPETable
  • Fixed an error while exporting the hazard curves generated by a GMPETable
  • Removed the deprecated feature aggregate_by/curves_by_tag
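
A hedged Python sketch of a client call against the reshaped aggregation
WebAPI; the host, port and calculation ID below are placeholders, and the
URL prefix is an assumption (only the aggregate/avg_losses query string
comes from the release notes):

      # hedged sketch: query the new aggregate/avg_losses endpoint
      import requests

      base = 'http://localhost:8800/v1/calc/1234'   # assumed WebAPI prefix and calc ID
      resp = requests.get(base + '/aggregate/avg_losses',
                          params=[('kind', 'mean'), ('loss_type', 'structural'),
                                  ('tag', 'taxonomy'), ('tag', 'occupancy')])
      resp.raise_for_status()   # the response body carries the aggregated average losses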

OpenQuake Engine 3.4.0

18 Mar 10:33

[Michele Simionato (@micheles)]

  • Compatibility with 'decorator' version >= 4.2

[Giovanni Lanzano (@giovannilanzanoINGV)]

  • Contributed a GMPE SkarlatoudisEtAlSSlab2013

[Michele Simionato (@micheles)]

  • Changed the event loss table exporter to export also rup_id and year
  • Extended the ebrisk calculator to compute loss curves and maps

[Rodolfo Puglia (@rodolfopuglia)]

  • Spectral acceleration amplitudes at 2.5, 2.75 and 4 seconds added

[Marco Pagani (@mmpagani)]

  • Improved the event based calculator to account for cluster-based models

[Michele Simionato (@micheles)]

  • Removed the now redundant command oq extract hazard/rlzs

[Daniele Viganò (@daniviga)]

  • Fixed 'oq abort' and always mark killed jobs as 'aborted'

[Michele Simionato (@micheles)]

  • Made it possible to use tasks without a monitor argument in the Starmap
  • Stored the sigma and epsilon parameters for each event in event based
    and scenario calculations and extended the gmf_data exporter consequently
  • Fixed the realizations CSV exporter which was truncating the names of the
    GSIMs
  • Deprecated the XML exporters for hcurves, hmaps, uhs
  • Introduced a sap.script decorator
  • Used the WebExtractor in oq importcalc
  • Restored validation of the source_model_logic_tree.xml file
  • Raised an early error for missing occupants in the exposure
  • Added a check to forbid duplicate file names in the uncertaintyModel tag
  • Made it possible to store the asset loss table in the ebrisk calculator
    by specifying asset_loss_table=true in the job.ini
  • Added a flag oq info --parameters to show the job.ini parameters
  • Removed the source_name column from the disagg by source output

[Rao Anirudh]

  • Fixed wrong investigation_time in the calculation of loss maps from
    loss curves

[Robin Gee (@rcgee)]

  • Added the capability to optionally specify a time_cutoff parameter for the
    declustering time window

[Michele Simionato (@micheles)]

  • Merged the commands oq plot_hmaps and oq plot_uhs inside oq plot
  • Changed the storage of hazard curves and hazard maps to make it consistent
    with the risk outputs and Extractor-friendly

[Chris Van Houtte (@cvanhoutte)]

  • Added necessary gsims to run the Canterbury Seismic Hazard Model
    in Gerstenberger et al. (2014)
  • Added a new gsim file mcverry_2006_chch.py to have the
    Canterbury-specific classes.
  • Added a new gsim file bradley_2013b.py to implement the
    Christchurch-specific modifications to the Bradley2013 base model.

[Michele Simionato (@micheles)]

  • Added a check on the intensity measure types and levels in the job.ini,
    to make sure they are ordered by period
  • Reduced the number of client sockets to the DbServer that was causing
    (sporadically) the hanging of calculations on Windows
  • Extended the WebAPI to be able to extract specific hazard curves, maps
    and UHS (i.e. IMT-specific and site specific)
  • Removed the realization index from the event loss table export, since
    it is redundant
  • Forced all lowercase Python files in the engine codebase
  • Removed the dependency on nose

[Robin Gee (@rcgee)]

  • Updated GMPE of Yu et al. (2013)

[Michele Simionato (@micheles)]

  • Added an Extractor client class leveraging the WebAPI and enhanced
    oq plot_hmaps to display remote hazard maps (see the sketch after this list)
  • Added a check when disaggregation is attempted on a source model
    with atomic source groups
  • Implemented serialization/deserialization of GSIM instances to TOML
  • Added a check against misspelled rupture distance names and fixed
    the drouet_alpes_2015 GSIMs
  • Changed the XML syntax used to define dictionaries IMT -> GSIM
  • Now GSIM classes have an .init() method to manage nontrivial
    initializations, i.e. expensive initializations or initializations
    requiring access to the filesystem
  • Fixed a bug in event based that made it impossible to use GMPETables
  • Associated the events to the realizations even in scenario_risk: this
    involved changing the generation of the epsilons in the case of asset
    correlation. Now there is a single aggregate losses output for all
    realizations
  • Removed the rlzi column from the GMF CSV export
  • Introduced a new parameter ebrisk_maxweight in the job.ini
  • For classical calculations with few sites, store information about the
    realization closest to the mean hazard curve for each site
  • Removed the max_num_sites limit on the event based calculator
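
A hedged sketch of the new Extractor/WebExtractor client mentioned above;
the server URL, credentials, calculation ID and query string are placeholders,
not values from the release notes:

      # hedged sketch: fetch mean hazard maps from a remote engine via the WebAPI
      from openquake.calculators.extract import WebExtractor

      ex = WebExtractor(1234, 'https://engine.example.org', 'user', 'password')
      hmaps = ex.get('hmaps?kind=mean')    # extract key and query are assumptions
      print(hmaps.array.shape)
      ex.close()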

[Valerio Poggi (@klunk386)]

  • Added an AvgSA intensity measure type and a GenericGmpeAvgSA which is
    able to use it
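
A hedged sketch of how the new GenericGmpeAvgSA might be instantiated from
Python; the keyword names and values below (gmpe_name, avg_periods,
corr_func) are assumptions for illustration, not confirmed by these notes:

      # hedged sketch: build an averaged-SA GMPE wrapping an existing GMPE
      from openquake.hazardlib.gsim.generic_gmpe_avgsa import GenericGmpeAvgSA

      gmpe = GenericGmpeAvgSA(gmpe_name='BooreAtkinson2008',   # wrapped GMPE (assumed)
                              avg_periods=[0.5, 1.0, 2.0],     # periods to average over
                              corr_func='baker_jayaram')       # assumed correlation model
      print(gmpe)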

[Michele Simionato (@micheles)]

  • Introduced the ability to launch subtasks from tasks
  • Stored rupture information in classical calculations with few sites

[Chris Van Houtte (@cvanhoutte)]

  • Adding conversion from geometric mean to larger horizontal component in
    bradley_2013.py

[Michele Simionato (@micheles)]

  • Fixed a bug in applyToSources for the case of multiple sources
  • Moved the prefiltering on the workers to save memory
  • Exported the aggregated loss ratios in avg losses and agg losses
  • Removed the variables quantile_loss_curves and mean_loss_curves: they
    were duplicating quantile_hazard_curves and mean_hazard_curves
  • Only ruptures boundingbox-close to the site collection are stored

[Marco Pagani (@mmpagani)]

  • Added cluster model to classical PSHA calculator

[Michele Simionato (@micheles)]

  • Fixed a bug in scenario_damage from ShakeMap with noDamageLimit=0
  • Avoided the MemoryError in the controller node by speeding up the saving
    of the information about the sources
  • Turned utils/reduce_sm into a proper command
  • Fixed a wrong coefficient in the ShakeMap amplification
  • Fixed a bug in the hazard curves export (the filename did not contain
    the period of the IMT thus producing duplicated files)
  • Parallelized the reading of the exposure

[Marco Pagani (@mmpagani)]

  • Fixed the implementation of mutex ruptures

[Michele Simionato (@micheles)]

  • Changed the aggregated loss curves exporter
  • Added an experimental calculator ebrisk
  • Changed the ordering of the events (akin to a change of seed in the
    asset correlation)

[Robin Gee (@rcgee)]

  • Fixed bug in tusa_langer_2016.py BA08SE model - authors updated b2 coeff
  • Fixed bug in tusa_langer_2016.py related to coeffs affecting Repi models

[Michele Simionato (@micheles)]

  • Added a check to forbid setting ses_per_logic_tree_path = 0
  • Added an API /extract/event_info/eidx
  • Splitting the sources in classical calculators and not in event based
  • Removed max_site_model_distance
  • Extended the logic used in event_based_risk (read the hazard sites
    from the site model, not from the exposure) to all calculators
  • In classical_bcr calculations with a CSV exposure the retrofitted field
    was not read. Now a missing retrofitted value is an error

OpenQuake Engine 3.3.2

22 Jan 10:01

[Robin Gee (@rcgee)]

  • Fixed bug in tusa_langer_2016.py BA08SE model - authors updated b2 coeff

[Michele Simionato (@micheles)]

  • Fixed a bug in scenario_damage from ShakeMap with noDamageLimit=0
  • Avoided the MemoryError in the controller node by speeding up the saving
    of the information about the sources
  • Fixed a wrong coefficient in the ShakeMap amplification
  • Fixed a bug in the hazard curves export (the filename did not contain
    the period of the IMT thus producing duplicated files)

OpenQuake Engine 3.3.1

14 Jan 08:40

[Michele Simionato (@micheles)]

  • Fixed the GMF exporter to export the event IDs and not event indices

[Robin Gee (@rcgee)]

  • Fixed bug in tusa_langer_2016.py related to coeffs affecting Repi models

OpenQuake Engine 3.3.0

07 Jan 14:39

[Graeme Weatherill (@g-weatherill)]

  • Adds GMPE suite for national PSHA for Germany

[Daniele Viganò (@daniviga)]

  • Added a warning box when an unsupported browser is used to view the WebUI
  • Updated Docker containers to support a multi-node deployment
    with a shared directory
  • Moved the Docker containers source code from oq-builders
  • Updated the documentation related to the shared directory
    which is now mandatory for multi-node deployments

[Matteo Nastasi (@nastasi-oq)]

  • Removed tests folders

[Stéphane Drouet (@stephane-on)]

  • Added Drouet & Cotton (2015) GMPE including 2017 erratum

[Michele Simionato (@micheles)]

  • Optimized the memory occupation in classical calculations (Context.poe_map)
  • Fixed a wrong counting of the ruptures in split fault sources with
    a hypo_list/slip_list causing the calculation to fail
  • Made the export of uniform hazard spectra fast
  • Made the std hazard output properly exportable
  • Replaced the ~ in the header of the UHS csv files with a -
  • Restored the individual_curves flag even for the hazard curves
  • Implemented dGMPE weights per intensity measure type
  • Extended --reuse-hazard to all calculators
  • Fixed a bug in event_based_risk from GMFs with coefficients of variations

[Graeme Weatherill (@g-weatherill)]

  • Adds magnitude scaling relation for Germany

[Michele Simionato (@micheles)]

  • Used floats for the GSIM realization weights, not Python Decimals
  • Added a flag fast_sampling, by default False
  • Added an API /extract/src_loss_table/<loss_type>
  • Removed the rupture filtering from sample_ruptures and optimized it in
    the RuptureGetter by making use of the bounding box
  • Raised the limit on ses_per_logic_tree_path from 2 ** 16 to 2 ** 32
  • Added a parameter max_num_sites to increase the number of sites accepted
    by an event based calculation up to 2 ** 32 (the default is still 2 ** 16)
  • Added a command oq compare to compare hazard curves and maps within
    calculations
  • Extended the engine to read transparently zipped source models and exposures
  • Restored the check for invalid source IDs in applyToSources
  • Extended the command oq zip to zip source models and exposures
  • Parallelized the associations event ID -> realization ID
  • Improved the message when assets are discarded in scenario calculations
  • Implemented aggregation by multiple tags, plus a special case for the
    country code in event based risk

[Marco Pagani (@mmpagani)]

  • Added two modified versions of the Bindi et al. (2011) to be used in a
    backbone approach to compute hazard in Italy
  • Added a modified version of Berge-Thierry et al. 2003 supporting Mw

[Michele Simionato (@micheles)]

  • Changed the way loss curves and loss maps are stored in order to unify
    the aggregation logic with the one used for the average losses
  • Now it is possible to compute the ruptures without specifying the sites
  • Added an early check for the case of missing intensity measure types
  • Deprecated the case of exposure, site model and region_grid_spacing all
    set at the same time
  • Implemented multi-exposure functionality in event based risk
  • Changed the event based calculator to store the ruptures incrementally
    without keeping them all in memory
  • Refactored the UCERF event based calculator to work as much as possible
    like the regular calculator
  • Optimized the management and storage of the aggregate losses in the event
    based risk calculation; also, reduced the memory consumption
  • Changed the default for individual_curves to "false", which is the right
    default for large calculations
  • Optimized the saving of the events
  • Removed the save_ruptures flag in the job.ini since ruptures must be saved
    always
  • Optimized the rupture generation in case of sampling and changed the
    algorithm and seeds
  • Fixed a bug with the IMT SA(1) considered different from SA(1.0)
  • Removed the long-time deprecated GMF exporter in XML format for event_based
  • Added a re-use hazard feature in event_based_risk in single-file mode
  • Made the event ID unique also in scenario calculations with
    multiple realizations
  • Removed the annoying hidden .zip archives littering the export directory
  • Added an easy way to read the exposure header
  • Added a way to run Python scripts using the engine libraries via oq shell
  • Improved the minimum_magnitude feature
  • Fixed the check on missing hazard IMTs
  • Reduced substantially the memory occupation in event based risk
  • Added the option spatial_correlation=no correlation for risk calculations
    from ShakeMaps
  • Removed the experimental calculator ucerf_risk
  • Optimized the sampling of time-independent sources for the case of
    prefilter_sources=no
  • Changed the algorithm associating events to SESs and made the event based
    hazard calculator faster in the case of many SESs
  • Reduced substantially the memory consumption in event based risk
  • Made it possible to read multiple site model files in the same calculation
  • Implemented a smart single job.ini file mode for event based risk
  • Now warnings for invalid parameters are logged in the database too
  • Fixed oq export avg_losses-stats for the case of one realization
  • Added oq export losses_by_tag and oq export curves_by_tag
  • Extended oq export to work in a multi-user situation
  • Forbidden event based calculations with more than max_potential_paths
    in the case of full enumeration
  • Saved a large amount of memory in event_based_risk calculations
  • Added a command oq export losses_by_tag/<tagname> <calc_id>
  • Extended oq zip to zip the risk files together with the hazard files
  • Changed the building convention for the event IDs and made them unique
    in the event loss table, even in the case of full enumeration
  • Optimized the splitting of complex fault sources
  • Fixed the ShakeMap download procedure for uncertainty.zip archives
    with an incorrect structure (for instance for ci3031111)
  • Disabled the spatial correlation in risk-from-ShakeMap by default
  • Optimized the rupture sampling where there is a large number of SESs
  • Extended the reqv feature to multiple tectonic region types and
    removed the spinning/floating for the TRTs using the feature
  • Reduced the GMPE logic tree upfront for TRTs missing in the source model
  • Fixed the ShakeMap downloader to use the USGS GeoJSON feed
  • Improved the error message when there are more than 65536 distinct tags
    in the exposure
  • Turned vs30measured into an optional parameter

[Chris Van Houtte (@cvanhoutte)]

  • Added siteclass as a site parameter, and reference_site_class as
    a site parameter that can be specified by the user in the ini file
  • Added new classes to mcverry_2006.py to take siteclass as a predictor
  • Updated comments in mcverry_2006.py
  • Added new mcverry_2006 test tables to account for difference in site
    parameter
  • Added qa_test_data classical case_32

[Michele Simionato (@micheles)]

  • Fixed the rupture exporter for Canada
  • Extended the oq prepare_site_model to optionally generate the
    fields z1pt0, z2pt5 and vs30measured
  • It is now an error to specify both the sites and the site model in the
    job.ini, to avoid confusion about the precedence
  • Implemented a reader for site models in CSV format
  • Made the export_dir relative to the input directory
  • Better error message for ShakeMaps with zero stddev
  • Added a source_id-filtering feature in the job.ini
  • Added a check on non-homogeneous tectonic region types in a source group
  • Fixed the option oq engine --config-file that broke a few releases ago
  • Replaced nodal_dist_collapsing_distance and
    hypo_dist_collapsing_distance with pointsource_distance and made
    use of it in the classical and event based calculators

[Graeme Weatherill (@g-weatherill)]

  • Fixes to hmtk completeness tables for consistent rates and addition of
    more special methods to catalogue

[Michele Simionato (@micheles)]

  • Restricted ChiouYoungs2008SWISS01 to StdDev.TOTAL to avoid a bug
    when computing the GMFs with inter/intra stddevs
  • Raised an error if assets are discarded because they are too far from the
    hazard sites (before it was just a warning)
  • Added an attribute .srcidx to every event based rupture and stored it
  • Fixed an issue with the Byte Order Mark (BOM) for CSV exposures prepared
    with Microsoft Excel
  • Reduced the site collection instead of just filtering it; this fixes
    a source filtering bug and changes the numbers in case of GMF-correlation
  • Added a command oq prepare_site_model to prepare a sites.csv file
    containing the vs30 and changed the engine to use it
  • Added a cutoff when storing a PoE=1 from a CSV file, thus avoiding NaNs
    in classical_damage calculations
  • Reduced the data transfer in the risk model by only considering the
    taxonomies relevant for the exposure
  • Extended oq engine --run to accept a list of files
  • Optimized the saving of the risk results in event based in the case of
    many sites and changed the command oq show portfolio_loss to show
    mean and standard deviation of the portfolio loss for each loss type

[Marco Pagani (@mmpagani)]

  • Added a first and preliminary version of the GMM for the Canada model
    represented in an analytical form.
  • Added a modified version of Atkinson and Macias to be used for the
    calculation of hazard in NSHMP2014.
  • Added support for PGA to the Si and Midorikawa (1999) GMPE.

[Michele Simionato (@micheles)]

  • Made it possible to run the risk over a hazard calculation of another user
  • Worked around the OverflowError: cannot serialize a bytes object larger
    than 4 GiB in event based calculations
  • Started using Python 3.6 features
  • Fixed the check on vulnerability function ID uniqueness for NRML 0.5
  • Ruptures and GMFs are now computed concurrently, thus mitigating the
    issue of slow tasks ...