# The best tool for your research, coursework and final-year project!

Page 7 of the results: 140 digital items found in 0.001 seconds

## A framework to generate synthetic multi-label datasets

Source: Elsevier; Amsterdam
Publisher: Elsevier; Amsterdam

Type: Scientific journal article

Portuguese

Search relevance

17.690776%

#data generator#artificial datasets#multi-label learning#publicly available framework#Java#PHP#ARTIFICIAL INTELLIGENCE

A controlled environment based on known properties of the dataset used by a learning algorithm is useful for empirically evaluating machine learning algorithms. Synthetic (artificial) datasets are used for this purpose. Although there are publicly available frameworks to generate synthetic single-label datasets, this is not the case for multi-label datasets, in which each instance is associated with a set of usually correlated labels. This work presents Mldatagen, a multi-label dataset generator framework we have implemented, which is publicly available to the community. Currently, two strategies have been implemented in Mldatagen: hypersphere and hypercube. For each label in the multi-label dataset, these strategies randomly generate a geometric shape (hypersphere or hypercube), which is populated with randomly generated points (instances). Afterwards, each instance is labeled according to the shapes it belongs to, which defines its multi-label. Experiments with a multi-label classification algorithm on six synthetic datasets illustrate the use of Mldatagen.; São Paulo Research Foundation (FAPESP) (grants 2011/02393-4, 2010/15992-0 and 2011/12597-6); Proceedings of the XXXIX Latin American Computing Conference (CLEI 2013).
Naiguatá...
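The hypersphere strategy described in the abstract can be sketched in a few lines of NumPy. Mldatagen itself is implemented in Java and PHP, so this is only an illustrative sketch, with made-up centre and radius ranges rather than the framework's actual parameters:

```python
import numpy as np

def hypersphere_multilabel(n_instances, n_features, n_labels, seed=0):
    """Illustrative sketch of the hypersphere strategy: one random
    hypersphere per label; an instance's multi-label is the set of
    hyperspheres that contain it."""
    rng = np.random.default_rng(seed)
    # Hypothetical parameters: centres inside the unit cube, modest radii.
    centres = rng.uniform(0.2, 0.8, size=(n_labels, n_features))
    radii = rng.uniform(0.3, 0.6, size=n_labels)
    # Instances sampled uniformly in the unit cube.
    X = rng.uniform(0.0, 1.0, size=(n_instances, n_features))
    # Y[i, j] = 1 iff instance i lies inside hypersphere j.
    dists = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2)
    Y = (dists <= radii[None, :]).astype(int)
    return X, Y

X, Y = hypersphere_multilabel(100, 2, 3)
```

An instance falling inside several spheres receives several labels at once, which is what makes the generated labels correlated.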

Permanent link for citations:

## Development of a method to estimate the energy consumption of commercial buildings through the application of neural networks

Source: Florianópolis
Publisher: Florianópolis

Type: Doctoral thesis
Format: 189 p. | ill., graphs, tables

Portuguese

Search relevance

17.191792%

Thesis (doctorate) - Universidade Federal de Santa Catarina, Centro Tecnológico. Programa de Pós-Graduação em Engenharia Civil; In February 2009, the Technical Quality Regulation for the Energy Efficiency Level of Commercial, Service and Public Buildings (RTQ-C) was approved under Ordinance No. 53. This regulation provides for the energy-efficiency labelling of commercial buildings in Brazil, classifying them by energy efficiency level on the basis of three main requirements: efficiency and installed power of the lighting system; efficiency of the air-conditioning system; and thermal performance of the building envelope, when the building is conditioned. The RTQ-C offers two methods for assessing a building's final efficiency level: the Prescriptive Method, which uses a simplified model, or the Simulation Method. During the development of the simplified model for envelope assessment in the RTQ-C, some limitations were found concerning the building's volumetry and the wall thermal transmittance parameter. After the model was developed, differences were also observed between the efficiency levels of large-volume buildings when assessed with the Prescriptive Method and with the Simulation Method. A further observation concerned the results provided by the simplified model. The results are expressed as a Consumption Indicator...

Permanent link for citations:

## A generalized entropy characterization of N-dimensional fractal control systems

Source: Cornell University
Publisher: Cornell University

Type: Scientific journal article

Published on 21/04/2014
Portuguese

Search relevance

17.191792%

We present the general properties of N-dimensional multi-component or
many-particle systems exhibiting self-similar hierarchical structure. Assuming
there exists an optimal coarse-graining scale at which the quality and
diversity of the (box-counting) fractal dimensions exhibited by a given system
are optimized, we compute the generalized entropy of each hypercube of the
partitioned system and show that its shape is universal: it also exhibits
self-similarity and hence does not depend on the dimensionality N. For certain
systems this shape may also be associated with the large-time stationary
profile of the fractal density distribution in the absence of external fields
(or control).; Comment: 6 pages, 4 figures. This work has been submitted for publication to
the proceedings of The Latin American Congress on Computational Intelligence
(http://la-cci.org/)

Permanent link for citations:

## Parameter space exploration of ecological models

Source: Cornell University
Publisher: Cornell University

Type: Scientific journal article

Portuguese

Search relevance

27.191792%

In recent years, we have seen the formulation and use of elaborate and
complex models in ecological studies. Questions related to the efficient,
systematic and error-proof exploration of parameter spaces are of great
importance for better understanding, estimating confidence in, and making use
of the output from these models. In this work, we investigate some of the
relevant questions related to parameter space exploration, in particular using
the technique known as Latin Hypercube Sampling and focusing on quantitative
output analysis. We present the analysis of a structured population growth
model and contrast our findings with results from previously used techniques,
known as sensitivity and elasticity analyses. We also assess how questions
related to parameter space analysis are currently being addressed in the
ecological literature.
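The Latin Hypercube Sampling technique the abstract centres on is easy to sketch: each axis is cut into n equal strata, one point is drawn per stratum, and the strata are permuted independently per dimension. A minimal NumPy sketch (the function name and seed are illustrative, not from the paper):

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, seed=None):
    """Basic Latin Hypercube Sample on [0, 1)^d: each axis is divided
    into n_samples equal strata and every stratum is used exactly once."""
    rng = np.random.default_rng(seed)
    # One uniform draw inside each stratum, per dimension.
    points = (np.arange(n_samples)[:, None]
              + rng.uniform(size=(n_samples, n_dims))) / n_samples
    # Permute the strata independently along each dimension.
    for d in range(n_dims):
        points[:, d] = rng.permutation(points[:, d])
    return points

sample = latin_hypercube(10, 3, seed=0)
```

Mapping each coordinate through a parameter's inverse CDF then yields a space-filling design over the model's parameter space.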

Permanent link for citations:

## Physical Layer Network Coding for the K-user Multiple Access Relay Channel

Source: Cornell University
Publisher: Cornell University

Type: Scientific journal article

Portuguese

Search relevance

27.191792%

A Physical layer Network Coding (PNC) scheme is proposed for the $K$-user
wireless Multiple Access Relay Channel (MARC), in which $K$ source nodes
transmit their messages to the destination node $D$ with the help of a relay
node $R.$ The proposed PNC scheme involves two transmission phases: (i) Phase 1
during which the source nodes transmit, the relay node and the destination node
receive and (ii) Phase 2 during which the source nodes and the relay node
transmit, and the destination node receives. At the end of Phase 1, the relay
node decodes the messages of the source nodes and during Phase 2 transmits a
many-to-one function of the decoded messages. Wireless networks in which the
relay node decodes suffer from a loss of diversity order if the decoder at the
destination is not chosen properly. A novel decoder is proposed for the PNC
scheme, which offers the maximum possible diversity order of $2,$ for a proper
choice of certain parameters and the network coding map. Specifically, the
network coding map used at the relay is chosen to be a $K$-dimensional Latin
Hypercube, in order to ensure the maximum diversity order of $2.$ Also, it is
shown that the proposed decoder can be implemented by a fast decoding
algorithm. Simulation results presented for the 3-user MARC show that the
proposed scheme offers a large gain over the existing scheme for the $K$-user
MARC.; Comment: More Simulation results added...

Permanent link for citations:

## Monte Carlo Methods and Path-Generation techniques for Pricing Multi-asset Path-dependent Options

Source: Cornell University
Publisher: Cornell University

Type: Scientific journal article

Published on 03/10/2007
Portuguese

Search relevance

27.191792%

We consider the problem of pricing path-dependent options on a basket of
underlying assets using simulations. As an example we develop our studies using
Asian options. Asian options are derivative contracts in which the underlying
variable is the average price of given assets sampled over a period of time.
Due to this structure, Asian options display a lower volatility and are
therefore cheaper than their standard European counterparts. This paper is a
survey of some recent enhancements to improve efficiency when pricing Asian
options by Monte Carlo simulation in the Black-Scholes model. We analyze the
dynamics with constant and time-dependent volatilities of the underlying asset
returns. We present a comparison between the precision of the standard Monte
Carlo method (MC) and the stratified Latin Hypercube Sampling (LHS). In
particular, we discuss the use of low-discrepancy sequences, also known as
Quasi-Monte Carlo method (QMC), and a randomized version of these sequences,
known as Randomized Quasi-Monte Carlo (RQMC). The latter has proven to be a
useful variance reduction technique both for problems of up to 20 dimensions
and for very high-dimensional problems. Moreover, we present and test a new path
generation approach based on a Kronecker product approximation (KPA) in the
case of time-dependent volatilities. KPA proves to be a fast generation
technique and reduces the computational cost of the simulation procedure.; Comment: 34 pages...
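The baseline the survey compares against, a plain Monte Carlo estimator for an arithmetic-average Asian call under Black-Scholes, can be sketched as follows; the parameter values are illustrative assumptions, not the paper's test cases:

```python
import numpy as np

def asian_call_mc(s0, k, r, sigma, t, n_steps, n_paths, seed=None):
    """Plain Monte Carlo price of an arithmetic-average Asian call
    under the Black-Scholes model (geometric Brownian motion)."""
    rng = np.random.default_rng(seed)
    dt = t / n_steps
    z = rng.standard_normal((n_paths, n_steps))
    # Cumulative log-returns give the path at the monitoring dates t_1..t_n.
    log_paths = np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z,
                          axis=1)
    s = s0 * np.exp(log_paths)
    avg = s.mean(axis=1)               # arithmetic average over the dates
    payoff = np.maximum(avg - k, 0.0)  # call payoff on the average price
    return np.exp(-r * t) * payoff.mean()

price = asian_call_mc(100.0, 100.0, 0.05, 0.2, 1.0, 50, 20000, seed=1)
```

Because the payoff averages the price path, its volatility is lower than the terminal price's, which is why the Asian call is cheaper than its European counterpart; LHS, QMC and RQMC then act on how the matrix `z` is generated.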

Permanent link for citations:

## Cumulative physical uncertainty in modern stellar models. II. The dependence on the chemical composition

Source: Cornell University
Publisher: Cornell University

Type: Scientific journal article

Published on 29/04/2013
Portuguese

Search relevance

27.191792%

We extend our work on the effects of the uncertainties on the main input
physics for the evolution of low-mass stars. We analyse the dependence of the
cumulative physical uncertainty affecting stellar tracks on the chemical
composition. We calculated more than 6000 stellar tracks and isochrones, with
metallicity ranging from Z = 0.0001 to 0.02, by changing the following physical
inputs within their current range of uncertainty: 1H(p,nu e+)2H,
14N(p,gamma)15O and triple-alpha reaction rates, radiative and conductive
opacities, neutrino energy losses, and microscopic diffusion velocities. The
analysis was performed using a Latin hypercube sampling design. We examine in a
statistical way the dependence on the variation of the physical inputs of the
turn-off (TO) luminosity, the central hydrogen exhaustion time (t_H), the
luminosity and the helium core mass at the red-giant branch (RGB) tip, and the
zero age horizontal branch (ZAHB) luminosity in the RR Lyrae region. For the
stellar tracks, an increase from Z = 0.0001 to Z = 0.02 produces a cumulative
physical uncertainty in TO luminosity from 0.028 dex to 0.017 dex, while the
global uncertainty on t_H increases from 0.42 Gyr to 1.08 Gyr. For the RGB tip,
the cumulative uncertainty on the luminosity is almost constant at 0.03 dex...

Permanent link for citations:

## Sensitivity Analysis for Computationally Expensive Models using Optimization and Objective-oriented Surrogate Approximations

Source: Cornell University
Publisher: Cornell University

Type: Scientific journal article

Portuguese

Search relevance

27.191792%

In this paper, we focus on developing efficient sensitivity analysis methods
for a computationally expensive objective function $f(x)$ in the case that its
minimization has just been performed. Here "computationally expensive" means
that each of its evaluations takes a significant amount of time, and our main
goal is therefore to use a small number of function evaluations of $f(x)$ to
infer the sensitivity information of the different parameters.
Correspondingly, we consider the optimization procedure as an adaptive
experimental design and re-use its available function evaluations as the
initial design points to establish a surrogate model $s(x)$ (also called a
response surface). The sensitivity analysis is performed on $s(x)$, which
stands in lieu of $f(x)$. Furthermore, we propose a new local multivariate
sensitivity measure, for example around the optimal solution, for
high-dimensional problems. A corresponding "objective-oriented experimental
design" is then proposed in order to make the generated surrogate $s(x)$
better suited to the accurate calculation of the proposed local sensitivity
quantities. In addition, we demonstrate the better performance of the Gaussian
radial basis function interpolator over Kriging in our cases...

Permanent link for citations:

## Refined Stratified Sampling for efficient Monte Carlo based uncertainty quantification

Source: Cornell University
Publisher: Cornell University

Type: Scientific journal article

Published on 11/05/2015
Portuguese

Search relevance

27.191792%

A general adaptive approach rooted in stratified sampling (SS) is proposed
for sample-based uncertainty quantification (UQ). To motivate its use in this
context the space-filling, orthogonality, and projective properties of SS are
compared with simple random sampling and Latin hypercube sampling (LHS). SS is
demonstrated to provide attractive properties for certain classes of problems.
The proposed approach, Refined Stratified Sampling (RSS), capitalizes on these
properties through an adaptive process that adds samples sequentially by
dividing the existing subspaces of a stratified design. RSS is proven to reduce
variance compared to traditional stratified sample extension methods while
providing comparable or enhanced variance reduction when compared to sample
size extension methods for LHS - which do not afford the same degree of
flexibility to facilitate a truly adaptive UQ process. An initial investigation
of optimal stratification is presented and motivates the potential for major
advances in variance reduction through optimally designed RSS. Potential paths
for extension of the method to high dimension are discussed. Two examples are
provided. The first involves UQ for a low dimensional function where
convergence is evaluated analytically. The second presents a study to assess the
response variability of a floating structure to an underwater shock.
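The variance-reduction property of stratification that motivates RSS can be illustrated on a one-dimensional toy integrand; the integrand, sample sizes, and replication counts below are illustrative assumptions, not the paper's examples:

```python
import numpy as np

def srs_estimate(f, n, rng):
    """Simple random sampling estimate of the integral of f over [0, 1]."""
    return f(rng.uniform(size=n)).mean()

def stratified_estimate(f, n, rng):
    """Stratified estimate: one uniform draw in each of n equal strata."""
    u = (np.arange(n) + rng.uniform(size=n)) / n
    return f(u).mean()

rng = np.random.default_rng(0)
f = lambda x: x ** 2                    # true integral over [0, 1] is 1/3
reps = 500
srs = np.array([srs_estimate(f, 50, rng) for _ in range(reps)])
strat = np.array([stratified_estimate(f, 50, rng) for _ in range(reps)])
# Stratification hits every stratum exactly once, so the estimator
# variance drops sharply for this smooth integrand.
```

RSS goes further by splitting existing strata one at a time, so the sample can grow adaptively while keeping this stratified structure.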

Permanent link for citations:

## Validating Sample Average Approximation Solutions with Negatively Dependent Batches

Source: Cornell University
Publisher: Cornell University

Type: Scientific journal article

Portuguese

Search relevance

27.191792%

Sample-average approximations (SAA) are a practical means of finding
approximate solutions of stochastic programming problems involving an extremely
large (or infinite) number of scenarios. SAA can also be used to find estimates
of a lower bound on the optimal objective value of the true problem which, when
coupled with an upper bound, provides confidence intervals for the true optimal
objective value and valuable information about the quality of the approximate
solutions. Specifically, the lower bound can be estimated by solving multiple
SAA problems (each obtained using a particular sampling method) and averaging
the obtained objective values. State-of-the-art methods for lower-bound
estimation generate batches of scenarios for the SAA problems independently. In
this paper, we describe sampling methods that produce negatively dependent
batches, thus reducing the variance of the sample-averaged lower bound
estimator and increasing its usefulness in defining a confidence interval for
the optimal objective value. We provide conditions under which the new sampling
methods can reduce the variance of the lower bound estimator, and present
computational results to verify that our scheme can reduce the variance
significantly, by comparison with the traditional Latin hypercube approach.

Permanent link for citations:

## A central limit theorem for general orthogonal array based space-filling designs

Source: Cornell University
Publisher: Cornell University

Type: Scientific journal article

Published on 23/09/2014
Portuguese

Search relevance

27.191792%

Orthogonal array based space-filling designs (Owen [Statist. Sinica 2 (1992a)
439-452]; Tang [J. Amer. Statist. Assoc. 88 (1993) 1392-1397]) have become
popular in computer experiments, numerical integration, stochastic optimization
and uncertainty quantification. As improvements of ordinary Latin hypercube
designs, these designs achieve stratification in multi-dimensions. If the
underlying orthogonal array has strength $t$, such designs achieve uniformity
up to $t$ dimensions. Existing central limit theorems are limited to these
designs with only two-dimensional stratification based on strength two
orthogonal arrays. We develop a new central limit theorem for these designs
that possess stratification in arbitrary multi-dimensions associated with
orthogonal arrays of general strength. This result is useful for building
confidence statements for such designs in various statistical applications.; Comment: Published at http://dx.doi.org/10.1214/14-AOS1231 in the Annals of
Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical
Statistics (http://www.imstat.org)

Permanent link for citations:

## n-Ary quasigroups of order 4

Source: Cornell University
Publisher: Cornell University

Type: Scientific journal article

Portuguese

Search relevance

27.191792%

We characterize the set of all N-ary quasigroups of order 4: every N-ary
quasigroup of order 4 is permutably reducible or semilinear. Permutable
reducibility means that an N-ary quasigroup can be represented as a composition
of K-ary and (N-K+1)-ary quasigroups for some K from 2 to N-1, where the order
of arguments in the representation can differ from the original order. The set
of semilinear N-ary quasigroups has a characterization in terms of Boolean
functions. Keywords: Latin hypercube, n-ary quasigroup, reducibility; Comment: 10pp. V2: revised

Permanent link for citations:

## Multiprocess parallel antithetic coupling for backward and forward Markov Chain Monte Carlo

Source: Cornell University
Publisher: Cornell University

Type: Scientific journal article

Published on 30/05/2005
Portuguese

Search relevance

27.191792%

Antithetic coupling is a general stratification strategy for reducing Monte
Carlo variance without increasing the simulation size. The use of the
antithetic principle in the Monte Carlo literature typically employs two strata
via antithetic quantile coupling. We demonstrate here that further
stratification, obtained by using k>2 (e.g., k=3-10) antithetically coupled
variates, can offer substantial additional gain in Monte Carlo efficiency, in
terms of both variance and bias. The reason for reduced bias is that
antithetically coupled chains can provide a more dispersed search of the state
space than multiple independent chains. The emerging area of perfect simulation
provides a perfect setting for implementing the k-process parallel antithetic
coupling for MCMC because, without antithetic coupling, this class of methods
delivers genuine independent draws. Furthermore, antithetic backward coupling
provides a very convenient theoretical tool for investigating antithetic
forward coupling. However, the generation of k>2 antithetic variates that are
negatively associated, that is, they preserve negative correlation under
monotone transformations, and extremely antithetic, that is, they are as
negatively correlated as possible, is more complicated compared to the case
with k=2. In this paper...
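The two-strata (k = 2) antithetic quantile coupling that the paper generalizes can be sketched as follows; the integrand and sample sizes are illustrative assumptions:

```python
import numpy as np

def antithetic_mc(f, n_pairs, rng):
    """Two-strata antithetic quantile coupling: each uniform draw u is
    paired with 1 - u, giving negatively correlated evaluations for
    monotone f."""
    u = rng.uniform(size=n_pairs)
    return (0.5 * (f(u) + f(1.0 - u))).mean()

rng = np.random.default_rng(0)
f = np.exp                              # integral of e^x over [0, 1] is e - 1
reps = 500
# Same budget of 200 f-evaluations per replication in both estimators.
plain = np.array([f(rng.uniform(size=200)).mean() for _ in range(reps)])
anti = np.array([antithetic_mc(f, 100, rng) for _ in range(reps)])
```

The paper's k > 2 generalization replaces the pair (u, 1 - u) with k coupled variates that remain negatively associated under monotone transformations, which is exactly the construction that becomes non-trivial.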

Permanent link for citations:

## Improvement of random LHD for high dimensions

Source: Cornell University
Publisher: Cornell University

Type: Scientific journal article

Published on 10/07/2009
Portuguese

Search relevance

27.191792%

Designs of experiments for the multivariate case are reviewed. A fast
algorithm for the construction of good Latin hypercube designs is developed.; Comment: 6 pages, Proceedings of the 6th St. Petersburg Workshop on
Simulation, 1091-1096

Permanent link for citations:

## Determining Key Model Parameters of Rapidly Intensifying Hurricane Guillermo (1997) using the Ensemble Kalman Filter

Source: Cornell University
Publisher: Cornell University

Type: Scientific journal article

Portuguese

Search relevance

27.191792%

#Physics - Geophysics#Computer Science - Systems and Control#Mathematics - Optimization and Control#86A10, 86A22, 35R30

In this work we determine key model parameters for rapidly intensifying
Hurricane Guillermo (1997) using the Ensemble Kalman Filter (EnKF). The
approach is to utilize the EnKF as a tool to only estimate the parameter values
of the model for a particular data set. The assimilation is performed using
dual-Doppler radar observations obtained during the period of rapid
intensification of Hurricane Guillermo. A unique aspect of Guillermo was that
during the period of radar observations strong convective bursts, attributable
to wind shear, formed primarily within the eastern semicircle of the eyewall.
To reproduce this observed structure within a hurricane model, background wind
shear of some magnitude must be specified, and turbulence and surface
parameters must be specified appropriately so that the impact of the shear on
the simulated hurricane vortex can be realized. To identify the complex
nonlinear interactions induced by changes in these parameters, an ensemble of
model simulations has been conducted in which individual members were formulated by
sampling the parameters within a certain range via a Latin hypercube approach.
The ensemble and the data, derived latent heat and horizontal winds from the
dual-Doppler radar observations...

Permanent link for citations:

## The network-level reproduction number and extinction threshold for vector-borne diseases

Source: Cornell University
Publisher: Cornell University

Type: Scientific journal article

Portuguese

Search relevance

27.191792%

The reproduction number of deterministic models is an essential quantity to
predict whether an epidemic will spread or die out. Thresholds for disease
extinction contribute crucial knowledge on disease control, elimination, and
mitigation of infectious diseases. Relationships between the basic reproduction
numbers of two network-based ordinary differential equation vector-host models,
and extinction thresholds of corresponding continuous-time Markov chain models
are derived under some assumptions. Numerical simulation results for malaria
and Rift Valley fever transmission on heterogeneous networks agree with the
analytical results even without those assumptions, suggesting that the
relationships may always hold and posing the mathematical problem of proving
their existence in general. Moreover, numerical simulations show that the
reproduction number is not monotonically increasing or decreasing with the
extinction threshold. Key parameters in predicting uncertainty of extinction
thresholds are identified using Latin Hypercube Sampling/Partial Rank
Correlation Coefficient. Consistent trends of extinction probability observed
through numerical simulations provide novel insights into mitigation strategies
to increase the disease extinction probability. Research findings may improve
understanding of thresholds for disease persistence in order to control
vector-borne diseases.
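The Latin Hypercube Sampling/Partial Rank Correlation Coefficient (PRCC) workflow mentioned above can be sketched as follows; the toy model, its coefficients, and the function name are illustrative assumptions, not the paper's epidemic model:

```python
import numpy as np

def prcc(X, y):
    """Partial Rank Correlation Coefficients: rank-transform the sampled
    parameter matrix X and the model output y, then correlate, for each
    parameter, the residuals left after linearly regressing out the
    other parameters from both its ranks and the output ranks."""
    ranks = lambda a: np.argsort(np.argsort(a, axis=0), axis=0).astype(float)
    Rx, ry = ranks(X), ranks(y)
    n, p = Rx.shape
    out = np.empty(p)
    for j in range(p):
        others = np.column_stack([np.ones(n), np.delete(Rx, j, axis=1)])
        res_x = Rx[:, j] - others @ np.linalg.lstsq(others, Rx[:, j], rcond=None)[0]
        res_y = ry - others @ np.linalg.lstsq(others, ry, rcond=None)[0]
        out[j] = np.corrcoef(res_x, res_y)[0, 1]
    return out

# Toy monotone model: output driven up by the first parameter and down
# by the second; the third is noise-only.
rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 3))
y = 5.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.standard_normal(200)
coeffs = prcc(X, y)
```

A PRCC near ±1 flags a parameter as a key driver of the output (here the extinction threshold) even when the raw relationship is nonlinear but monotone.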

Permanent link for citations:

## Coupling models of cattle and farms with models of badgers for predicting the dynamics of bovine tuberculosis (TB)

Source: Cornell University
Publisher: Cornell University

Type: Scientific journal article

Published on 06/03/2015
Portuguese

Search relevance

27.191792%

#Quantitative Biology - Populations and Evolution#Computer Science - Computational Engineering, Finance, and Science#Mathematics - Dynamical Systems#Statistics - Applications#Statistics - Computation

Bovine TB is a major problem for the agricultural industry in several
countries. TB can be contracted and spread by species other than cattle and
this can cause a problem for disease control. In the UK and Ireland, badgers
are a recognised reservoir of infection and there has been substantial
discussion about potential control strategies. We present a coupling of
individual based models of bovine TB in badgers and cattle, which aims to
capture the key details of the natural history of the disease and of both
species at approximately county scale. The model is spatially explicit: it
follows a very large number of cattle and badgers on a different grid size for
each species, and it also includes winter housing. We show that the model can
replicate the reported dynamics of both cattle and badger populations as well
as the increasing prevalence of the disease in cattle. Parameter space used as
input in simulations was swept out using Latin hypercube sampling and
sensitivity analysis to model outputs was conducted using mixed effect models.
By exploring a large and computationally intensive parameter space we show that
of the available control strategies it is the frequency of TB testing and
whether or not winter housing is practised that have the most significant
effects on the number of infected cattle...

Permanent link for citations:

## Creating a conceptual hydrological soil response map for the Stevenson Hamilton Research Supersite, Kruger National Park, South Africa

Source: Water SA
Publisher: Water SA

Type: Scientific journal article
Format: text/html

Published on 01/04/2014
Portuguese

Search relevance

27.191792%

#Digital soil mapping#terrain analysis#ecosystem services#conceptual hydrological soil responses#SoLIM

The soil water regime is a defining ecosystem service, directly influencing vegetation and animal distribution. The understanding of hydrological processes is therefore a vital building block in managing natural ecosystems. Soils contain morphological indicators of the water flow paths and rates in the soil profile, which are expressed as 'conceptual hydrological soil responses' (CHSRs). CHSRs can greatly aid the understanding of hydrology within a landscape and catchment, so a soil map could improve hydrological assessments by providing both the position and area of CHSRs. Conventional soil mapping is a tedious process, which limits the application of soil maps in hydrological studies. The use of a digital soil mapping (DSM) approach can speed up the mapping process and thereby extend soil map use in the field of hydrology. This research uses an expert-knowledge DSM approach to create a soil map for the Stevenson Hamilton Research Supersite within the Kruger National Park, South Africa. One hundred and thirteen soil observations were made in the 4 001 ha area. Fifty-four of these observations were pre-determined by smart sampling and conditioned Latin hypercube sampling. These observations were used to determine soil distribution rules...

Permanent link for citations:

## Analysis of the economic impact of cystic echinococcosis in Spain

Source: World Health Organization
Publisher: World Health Organization

Type: Scientific journal article
Format: text/html

Published on 01/01/2010
Portuguese

Search relevance

27.191792%

OBJECTIVE: To estimate the overall economic losses due to human and animal cystic echinococcosis (CE) in Spain in 2005. METHODS: We obtained data on annual CE incidence from surveillance and abattoir records, and on CE-related treatment and productivity losses (human and animal) from the scientific literature. Direct costs were those associated with diagnosis, surgical or chemotherapeutic treatment, medical care and hospitalization in humans, and condemnation of offal in livestock (sheep, goats, cattle and pigs). Indirect costs comprised human productivity losses and the reduction in growth, fecundity and milk production in livestock. The Latin hypercube method was used to represent the uncertainty surrounding the input parameters. FINDINGS: The overall economic loss attributable to CE in humans and animals in 2005 was estimated at €148 964 534 (95% credible interval, CI: 21 980 446-394 012 706). Human-associated losses were estimated at €133 416 601 (95% CI: 6 658 738-379 273 434) and animal-associated losses at €15 532 242 (95% CI: 13 447 378-17 789 491). CONCLUSION: CE is a neglected zoonosis that remains a human and animal health concern for Spain. More accurate data on CE prevalence in humans (particularly undiagnosed or asymptomatic cases) and better methods to estimate productivity losses in animals are needed. CE continues to affect certain areas of Spain...

Permanent link for citations:

## Likely effectiveness of pharmaceutical and non-pharmaceutical interventions for mitigating influenza virus transmission in Mongolia

Source: World Health Organization
Publisher: World Health Organization

Type: Scientific journal article
Format: text/html

Published on 01/04/2012
Portuguese

Search relevance

27.191792%

OBJECTIVE: To assess the likely benefit of the interventions under consideration for use in Mongolia during future influenza pandemics. METHODS: A stochastic, compartmental patch model of susceptibility, exposure, infection and recovery was constructed to capture the key effects of several interventions - travel restrictions, school closure, generalized social distancing, quarantining of close contacts, treatment of cases with antivirals and prophylaxis of contacts - on the dynamics of influenza epidemics. The likely benefit and optimal timing and duration of each of these interventions were assessed using Latin-hypercube sampling techniques, averaging across many possible transmission and social mixing parameters. FINDINGS: Timely interventions could substantially alter the time-course and reduce the severity of pandemic influenza in Mongolia. In a moderate pandemic scenario, early social distancing measures decreased the mean attack rate from around 10% to 7-8%. Similarly, in a severe pandemic scenario such measures cut the mean attack rate from approximately 23% to 21%. In both moderate and severe pandemic scenarios, a suite of non-pharmaceutical interventions proved as effective as the targeted use of antivirals. Targeted antiviral campaigns generally appeared more effective in severe pandemic scenarios than in moderate pandemic scenarios. CONCLUSION: A mathematical model of pandemic influenza transmission in Mongolia indicated that...

Permanent link for citations: