Page 2 of results: 140 digital items found in 0.001 seconds

## Técnicas de amostragem inteligente em simulação de Monte Carlo; Intelligent sampling techniques in Monte Carlo simulation

Santos, Ketson Roberto Maximiano dos
Source: Biblioteca Digital de Teses e Dissertações da USP Publisher: Biblioteca Digital de Teses e Dissertações da USP
Type: Master's Thesis Format: application/pdf
Published 26/03/2014 Language: Portuguese
Search relevance: 27.191792%

## Aumento da eficiência dos métodos seqüenciais de simulação condicional; Improving the efficiency of sequential conditional simulation methods

Pilger, Gustavo Grangeiro
Source: Universidade Federal do Rio Grande do Sul Publisher: Universidade Federal do Rio Grande do Sul
Type: Doctoral Thesis Format: application/pdf
Language: Portuguese
Search relevance: 28.013716%
The most widely used sequential stochastic simulation algorithm is sequential Gaussian simulation (ssG). In theory, stochastic methods reproduce the space of uncertainty of the random variable Z(u) better as the number L of realizations grows; however, L sometimes has to be so large that the technique becomes prohibitive. This thesis presents a more efficient strategy: the sequential Gaussian simulation algorithm is modified by replacing the Monte Carlo method with Latin Hypercube Sampling (LHS), so that, for a given precision, the space of uncertainty of Z(u) is characterized more quickly. The proposed technique also guarantees that the whole theoretical model of uncertainty is sampled, especially in its extreme portions.
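The gain described above is easiest to see in one dimension: LHS places exactly one draw in each of n equal-probability strata, so the extremes of the uncertainty model are always represented, a guarantee plain Monte Carlo cannot give. A minimal sketch (illustrative only; `lhs_1d` is a hypothetical helper, not the thesis's implementation):

```python
import random

def lhs_1d(n, rng):
    """Stratified uniforms: one draw per stratum [i/n, (i+1)/n), then shuffled."""
    u = [(i + rng.random()) / n for i in range(n)]
    rng.shuffle(u)
    return u

rng = random.Random(0)
n = 10
lhs = lhs_1d(n, rng)
mc = [rng.random() for _ in range(n)]  # plain Monte Carlo draws for comparison
# LHS puts exactly one point in every stratum, so the extreme strata
# (the tails of the distribution) are always covered; MC gives no such guarantee.
```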

## Modelagem matemática aplicada à precificação de opções; Mathematical modeling applied to option pricing

Soares Júnior, Márcio
Type: Undergraduate Thesis (Trabalho de Conclusão de Curso)
Language: Portuguese
Search relevance: 27.191792%
Mathematical models are critical for determining the theoretical prices of options and for analyzing whether they are overvalued or undervalued. This information strongly influences the operations carried out by the investor. It is therefore necessary that the model employed offer a high degree of reliability and be consistent with the reality of the investment for which it is intended. In this sense, this dissertation aims to apply the steps of mathematical modeling to option pricing for decision making in the investment in a hydroelectric power plant. A Monte Carlo simulation with the Latin Hypercube method was used to determine the volatility of the project's returns. To validate the proposed model, the results were compared with those obtained by the Binomial Model, one of the models most used in this type of investment. The results reinforce the hypothesis that mathematical modeling with the Binomial Model is critical to investment decision making in hydroelectric power.

## Planejamento de reativos em sistemas elétricos de potência multi-área através de modelos estocásticos; Reactive power planning in multi-area power systems via stochastic models

López Quizhpi, Julio César
Type: Doctoral Thesis Format: 130 leaves, ill.
Language: Portuguese
Search relevance: 28.013716%
Graduate Program in Electrical Engineering - FEIS; In this work, the reactive power planning problem is modeled and solved as a two-stage stochastic multi-period convex optimization problem in multi-area power systems. The classical mixed-integer reactive power planning model is reformulated as a multi-period conic convex mixed-integer model considering the taps of transformers as integer variables. In the multi-area power system context the problem is decentralized by Lagrangian relaxation, decomposing the multi-area problem into subproblems associated with each area. The transmission system operators in each area solve their subproblems in coordination with adjacent areas while maintaining the confidentiality of their power system data, exchanging only boundary-bus information. In the stochastic formulation, demand uncertainty in each area is modeled by a Normal distribution function, and scenario generation in each period is carried out with the efficient Latin Hypercube sampling technique. The presence of uncertainty in the problem is analyzed by computing values that quantify the importance of those parameters. Moreover, the stochastic reactive power planning problem is formulated as a multiobjective mathematical programming problem optimizing the expansion cost function and a load-shedding risk function modeled by regret...
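The scenario-generation step described above, Normal demand uncertainty sampled by LHS, reduces to stratified uniforms pushed through the inverse normal CDF. A minimal sketch (illustrative; the function name and parameter values are assumptions, not the thesis's code):

```python
import random
from statistics import NormalDist

def lhs_normal_scenarios(n, mu, sigma, rng):
    """n demand scenarios from N(mu, sigma): one stratified uniform per
    scenario, mapped through the inverse normal CDF (1-D Latin Hypercube)."""
    u = [(i + rng.random()) / n for i in range(n)]
    rng.shuffle(u)
    nd = NormalDist(mu, sigma)
    return [nd.inv_cdf(p) for p in u]

# hypothetical demand: mean 50 MW, standard deviation 5 MW
scenarios = lhs_normal_scenarios(100, mu=50.0, sigma=5.0, rng=random.Random(1))
```

Because each of the 100 probability strata is hit exactly once, the scenario set always contains draws below the 1st and above the 99th percentile of demand.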

## Quadrados latinos e aplicações; Latin squares and applications

Mateus Alegri
Source: Biblioteca Digital da Unicamp Publisher: Biblioteca Digital da Unicamp
Type: Master's Thesis Format: application/pdf
Published 04/08/2006 Language: Portuguese
Search relevance: 37.68389%

## Exploratory ensemble designs for environmental models using k-extended Latin Hypercubes

Williamson, D
Source: Blackwell Publishing Ltd Publisher: Blackwell Publishing Ltd
Type: Journal Article
Language: Portuguese
Search relevance: 37.514731%
In this paper we present a novel, flexible, and multi-purpose class of designs for initial exploration of the parameter spaces of computer models, such as those used to study many features of the environment. The idea applies existing technology aimed at expanding a Latin Hypercube (LHC) in order to generate initial LHC designs that are composed of many smaller LHCs. The resulting design and its component parts are designed so that each is approximately orthogonal and maximises a measure of coverage of the parameter space. Designs of the type advocated for in this paper are particularly useful when we want to simultaneously quantify parametric uncertainty and any uncertainty due to the initial conditions, boundary conditions, or forcing functions required to run the model. This makes the class of designs particularly suited to environmental models, such as climate models, which contain all of these features. The proposed designs are particularly suited to initial exploratory ensembles whose goal is to guide the design of further ensembles aimed at, for example, calibrating the model. We introduce a new emulator diagnostic that exploits the structure of the advocated ensemble designs and allows for the assessment of structural weaknesses in the statistical modelling. We provide illustrations of the method through a simple example and describe a 400-member ensemble of the Nucleus for European Modelling of the Ocean (NEMO) ocean model designed using the method. We build an emulator for NEMO using the created design to illustrate the use of our emulator diagnostic test. © 2015 The Authors. Environmetrics published by John Wiley & Sons Ltd.
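The structural idea, a large Latin Hypercube whose points partition into k smaller Latin Hypercubes that are each space-filling at a coarser resolution, can be sketched as follows (an illustrative construction in the spirit of the paper, not Williamson's algorithm; all names are assumptions):

```python
import random

def k_extended_lhc(k, m, d, rng):
    """Build k sub-designs of m points in [0,1)^d whose union is a Latin
    Hypercube of n = k*m points, while each sub-design is itself an LHC
    at the coarser m-stratum resolution."""
    n = k * m
    # fine[s][i][dim]: fine-stratum index of point i of sub-design s
    fine = [[[0] * d for _ in range(m)] for _ in range(k)]
    for dim in range(d):
        # point_of[s][c]: which point of sub-design s sits in coarse stratum c
        point_of = []
        for s in range(k):
            order = list(range(m))
            rng.shuffle(order)
            point_of.append(order)
        for c in range(m):
            cells = list(range(k))
            rng.shuffle(cells)  # share the k fine cells of coarse stratum c among the subs
            for s in range(k):
                fine[s][point_of[s][c]][dim] = c * k + cells[s]
    return [[tuple((fine[s][i][dim] + rng.random()) / n for dim in range(d))
             for i in range(m)] for s in range(k)]

subs = k_extended_lhc(k=4, m=5, d=3, rng=random.Random(2))
```

Each coarse stratum contributes exactly one fine cell to every sub-design, so each sub-design is an LHC over the m coarse strata and their union covers all k*m fine strata.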

## Analytical techniques of quality and cost : robust design, design of experiments, and the prediction of mean shift

Ruflin, Justin, 1981-
Source: Massachusetts Institute of Technology Publisher: Massachusetts Institute of Technology
Type: Doctoral Thesis Format: 50 leaves; application/pdf
Language: Portuguese
Search relevance: 28.013716%
The quality of a product to a large extent determines the success of that product in competitive markets. Measuring and improving quality is thus a primary objective of the designer. The aim of the following work is to provide an introduction to the methods of quality optimization and to illustrate these techniques through examples. Quality is first defined and quantified. The robust design method, a technique that focuses on improving quality without adding cost, is then described. Particular attention is paid to experiment design, which is a major factor in the effectiveness and efficiency of the robust design process. The effect of product variability on the mean performance of a product is also explained, along with the various ways that can be used to predict a shift in the mean value of the performance. Two examples are then developed. The first focuses on the application of the robust design method to illustrate the steps of the process. The second example primarily focuses on comparing the Monte Carlo, Latin Hypercube, and star pattern sampling methods for predicting mean shift. The benefits of the star pattern sampling method are apparent through the example. The error in the prediction of mean shift of the star pattern is less than 1%...
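The quantity under comparison, the shift of the mean performance away from its nominal value under input variability, has a simple Monte Carlo estimator against which the other sampling schemes are benchmarked. A sketch (illustrative; the helper name and the test function are assumptions, not the thesis's examples):

```python
import random
import statistics

def mean_shift_mc(f, mu, sigma, n, rng):
    """Monte Carlo estimate of the mean shift E[f(X)] - f(E[X]), X ~ N(mu, sigma)."""
    return statistics.fmean(f(rng.gauss(mu, sigma)) for _ in range(n)) - f(mu)

# for f(x) = x**2 the exact mean shift is sigma**2 = 0.25
est = mean_shift_mc(lambda x: x * x, mu=3.0, sigma=0.5, n=100_000, rng=random.Random(3))
```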

## Sampling design and machine learning optimization for the application of soil sensing data in digital soil mapping; Optimierungen von Stichprobenverfahren und Methoden des Data Minings zur Nutzung geophysikalischer Naherkundungsdaten im Rahmen von Bodenprognosen

Ramirez-Lopez, Leonardo
Type: Dissertation
Language: Portuguese
Search relevance: 28.013716%
The general aim of this thesis was to develop innovative methods to build and optimize empirical soil models based on soil sensing data. The combination of effective sampling schemes with geophysical sensing techniques is an active branch of soil scientific research. This approach aims to provide high resolution soil property data for flood forecasting and protection, agricultural management as well as for developing strategies to adapt to global climate change. This thesis comprises four manuscripts. The first two manuscripts are dedicated to calibration sampling strategies. Sampling design is crucial in predictive modeling, since all results and interpretation are based on the selected samples. Hence, the first manuscript investigates the effect of the calibration set size and the calibration sampling strategy on the generalization error of visible and near infrared (vis–NIR) models. Furthermore, a method useful for identifying the optimal sample set size necessary for calibrating vis–NIR models of soil attributes is developed. Within the context of digital soil mapping, the second manuscript focuses on a comparison of different calibration sampling strategies for building predictive models of soil properties based on soil sensing. An improved version of the well-known conditioned Latin hypercube sampling algorithm...

## An exploratory analysis on the effects of human factors on combat outcomes

Wan, Szu Ching.
Source: Monterey, California. Naval Postgraduate School Publisher: Monterey, California. Naval Postgraduate School
Type: Doctoral Thesis
Language: Portuguese
Search relevance: 28.013716%
Approved for public release; distribution is unlimited. The ongoing revolution in military affairs is transforming the nature of warfare. Modern combat systems are increasingly effective yet more complex to operate. Nonetheless, their complexity cannot be compared to human behavior, which remains the most important factor in combat. Within Project Albert, an agent-based model called SOCRATES has been developed to enable users to explore the emergent behaviors of the agents. A deep-operation scenario is developed to explore the effects of human factors on combat outcomes. Two experimental designs are used in this investigation: a Latin Hypercube and a full-factorial design. Using the computing facilities at NPS, MITRE and MHPCC (Maui High Performance Computing Center), a total of 174,960 runs are made. The data suggest the existence of emergent patterns, and provide some insight into the question of how much more capable a smaller force must be in order to effectively battle a larger force. In addition, the analysis shows that the Latin Hypercube design is able to identify the same significant factors in the scenario as are obtained by the full-factorial design, but with far fewer runs.

## Estimation and Potential Improvement of the Quality of Legacy Soil Samples for Digital Soil Mapping

Carré, Florence; McBratney, Alex; Minasny, B.
Source: Elsevier Science BV Publisher: Elsevier Science BV
Type: Journal Article Format: Online
Language: Portuguese
Search relevance: 28.221953%
Legacy soil data form an important resource for digital soil mapping and are essential for calibrating models that predict soil properties from environmental variables. Such data arise from traditional soil survey. Methods of soil survey are generally empirical and based on the mental development of the surveyor, correlating soil with underlying geology, landforms, vegetation and air-photo interpretation. There are no statistical criteria for traditional soil sampling, and this may lead to biases in the areas being sampled. The challenge is to use legacy data for large-area mapping (e.g. national or continental), as funds are limited to resample large areas. The problem is then to assess the reliability and quality of legacy soil databases that have been mainly populated by traditional soil survey and, if additional funding for sampling becomes available, to decide where new sampling units should be located. This additional sampling can be used to improve and validate the prediction model. Latin hypercube sampling (LHS) has been proposed as a sampling design for digital soil mapping when there is no prior sample. We use the principle of hypercube sampling to assess the quality of existing soil data and to guide us to the areas that need to be sampled. First an area is defined and the empirical environmental data layers, or covariates, are identified on a regular grid. The existing soil data are matched with the environmental variables. The HELS (Hypercube Evaluation of a Legacy Sample) algorithm is used to check the occupancy of the legacy sampling units in the hypercube of the quantiles of the covarying environmental data. This determines whether legacy soil survey data occupy the hypercube uniformly or whether there is over- or under-observation in the partitions of the hypercube. It also allows posterior estimation of the apparent probability of sample units being surveyed. From this information we can design further sampling.
The methods are illustrated using legacy soil samples from Edgeroi...
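The occupancy check described above, locating each legacy sample in the hypercube of covariate quantiles and flagging empty or over-full partitions, can be sketched as follows (illustrative only; `hypercube_occupancy` is a hypothetical helper, not the authors' implementation):

```python
from collections import Counter
from statistics import quantiles

def hypercube_occupancy(legacy, grid, q=4):
    """Count legacy samples per cell of the quantile hypercube.
    legacy: covariate tuples at the legacy soil points;
    grid: covariate tuples on the full regular grid (defines the quantiles)."""
    d = len(grid[0])
    # q-1 quantile breakpoints per covariate, taken from the full grid
    cuts = [quantiles([row[j] for row in grid], n=q) for j in range(d)]
    def cell(row):
        return tuple(sum(v > c for c in cuts[j]) for j, v in enumerate(row))
    occ = Counter(cell(row) for row in legacy)
    empty = q ** d - len(occ)  # under-observed partitions with no legacy sample
    return occ, empty

# toy data: two covariates on a 100-point grid, ten "legacy" sampling units
grid = [(i, (7 * i) % 100) for i in range(100)]
legacy = grid[::10]
occ, empty = hypercube_occupancy(legacy, grid, q=4)
```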

## Um modelo para o planejamento anual da operação energética considerando técnicas avançadas de otimização estocástica; A model for annual energy operation planning considering advanced stochastic optimization techniques

Matos, Vitor Luiz de
Source: Florianópolis Publisher: Florianópolis
Type: Doctoral Thesis Format: 267 pp., ill., graphs, tables
Language: Portuguese
Search relevance: 28.013716%

## A semi-random field finite element method to predict the maximum eccentric compressive load for masonry prisms

Moradabadi, Ehsan; Laefer, Debra F.; Clarke, Julie A.; Lourenço, Paulo B.
Source: Elsevier Publisher: Elsevier
Type: Journal Article
Published in 2015 Language: Portuguese
Search relevance: 37.514731%
An accurate prediction of the compressive strength of masonry is essential both for the analysis of existing structures and for the construction of new masonry buildings. Since experimental material testing of individual masonry components (e.g. masonry units and mortar joints) often produces highly variable results, this paper presents a numerical-modelling-based approach to address the associated uncertainty in predicting the maximum compressive load of masonry prisms. The method treats the numerical model of a masonry prism as semi-random by adopting a Latin Hypercube simulation method in conjunction with a parametric finite element model of the individual prism. The proposed method is applied to two types of masonry prisms (using hollow blocks and solid clay bricks), for which experimental testing was conducted as part of the 9th International Masonry Conference held at Guimarães in July 2014. A Class A prediction (presented before the tests were conducted) was generated for the two masonry prisms according to the proposed methodology, and the results were compared to the final experimental testing results. The root mean square deviation of the method for prediction of eccentric compressive strength of both types of prisms differed by only 2.2 kN...

## On completion of latin hypercuboids of order 4

Type: Journal Article
Published 19/01/2011 Language: Portuguese
Search relevance: 38.66979%
A latin hypercuboid of order $N$ is an $N\times...\times N\times k$ array filled with symbols from the set $\{0,...,N-1\}$ in such a way that every symbol occurs at most once in every line. If $k=N$, such an array is a latin hypercube. We prove that any latin hypercuboid of order 4 is completable to a latin hypercube. Keywords: latin hypercube, n-ary quasigroup; Comment: 5 pages
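The defining constraint, each symbol appearing at most once on every axis-parallel line, is straightforward to verify for a concrete array (an illustrative checker, not from the paper):

```python
from itertools import product

def is_latin_hypercuboid(cube, dims):
    """cube maps index tuples to symbols in range(N); dims are the axis
    lengths, e.g. (N, N, k). True iff no symbol repeats on any line."""
    for axis, length in enumerate(dims):
        others = [range(dims[a]) for a in range(len(dims)) if a != axis]
        for fixed in product(*others):
            line = [cube[fixed[:axis] + (t,) + fixed[axis:]] for t in range(length)]
            if len(set(line)) != len(line):
                return False
    return True

# an order-4 hypercuboid (4 x 4 x 2) cut from the addition table mod 4
cuboid = {(x, y, z): (x + y + z) % 4
          for x in range(4) for y in range(4) for z in range(2)}
```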

## Computer experiments with functional inputs and scalar outputs by a norm-based approach

Muehlenstaedt, Thomas; Fruth, Jana; Roustant, Olivier
Type: Journal Article
Published 01/10/2014 Language: Portuguese
Search relevance: 27.514731%
A framework for designing and analyzing computer experiments is presented, built to handle functional and real-valued inputs and real-valued outputs. For designing experiments with both functional and real-valued inputs, a two-stage approach is suggested: the first stage constructs a candidate set for each functional input, and the second stage searches for an optimal combination of the candidate sets found and a Latin hypercube for the real-valued inputs. The resulting designs can be considered generalizations of Latin hypercubes. Gaussian process (GP) models are explored as metamodels. The functional inputs are incorporated into the kriging model by applying norms in order to define distances between two functional inputs. To make the calculation of these norms computationally feasible, the use of B-splines is promoted.
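In its basic, unoptimized form, the Latin hypercube used for the real-valued inputs in the second stage is a set of independently permuted stratified columns (a sketch; `latin_hypercube` is a hypothetical helper, not the authors' design code):

```python
import random

def latin_hypercube(n, d, rng):
    """n points in [0,1)^d with exactly one point in each of the n axis
    strata of every dimension (basic, unoptimized LHS design)."""
    cols = []
    for _ in range(d):
        perm = list(range(n))
        rng.shuffle(perm)  # independent stratum permutation per dimension
        cols.append([(p + rng.random()) / n for p in perm])
    return [tuple(col[i] for col in cols) for i in range(n)]

design = latin_hypercube(n=8, d=3, rng=random.Random(4))
```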

## Implementing Quasi-Monte Carlo Simulations with Linear Transformations

Sabino, Piergiacomo
Type: Journal Article
Published 31/10/2007 Language: Portuguese
Search relevance: 28.013716%
Pricing exotic multi-asset path-dependent options requires extensive Monte Carlo simulations. In recent years, interest in the quasi-Monte Carlo technique has been renewed, and several results have been proposed to improve its efficiency through the notion of effective dimension. To this aim, Imai and Tan introduced a general variance reduction technique in order to minimize the nominal dimension of the Monte Carlo method. Taking these advantages into account, we investigate this approach in detail in order to make it faster from the computational point of view. Indeed, we realize the linear transformation decomposition by relying on a fast ad hoc QR decomposition that considerably reduces the computational burden. This setting makes the linear transformation method even more convenient computationally. We implement a high-dimensional (2500) quasi-Monte Carlo simulation combined with the linear transformation in order to price Asian basket options with the same set of parameters published by Imai and Tan. For the simulation of the high-dimensional random sample, we use a 50-dimensional scrambled Sobol sequence for the first 50 components, determined by the linear transformation method, and pad the remaining ones with Latin Hypercube Sampling. The aim of this numerical setting is to investigate the accuracy of the estimation by giving a higher convergence rate only to those components selected by the linear transformation technique. We also run our simulation experiment using the standard Cholesky and principal component decomposition methods with pseudo-random and Latin Hypercube sampling generators. Finally...

## Risk aggregation with empirical margins: Latin hypercubes, empirical copulas, and convergence of sum distributions

Mainik, Georg
Type: Journal Article
Published 11/08/2015 Language: Portuguese
Search relevance: 37.514731%
This paper studies convergence properties of multivariate distributions constructed by endowing empirical margins with a copula. This setting includes Latin Hypercube Sampling with dependence, also known as the Iman-Conover method. The primary question addressed here is the convergence of the component sum, which is relevant to risk aggregation in insurance and finance. This paper shows that a CLT for the aggregated risk distribution is not available, so that the underlying mathematical problem goes beyond classic functional CLTs for empirical copulas. This issue is relevant to Monte Carlo-based risk aggregation in all multivariate models generated by plugging empirical margins into a copula. Instead of a functional CLT, this paper establishes strong uniform consistency of the estimated sum distribution function and provides a sufficient criterion for the convergence rate $O(n^{-1/2})$ in probability. These convergence results hold for all copulas with bounded densities. Examples with unbounded densities include bivariate Clayton and Gauss copulas. The convergence results are not specific to the component sum and hold also for any other componentwise non-decreasing aggregation function. On the other hand, convergence of estimates for the joint distribution is much easier to prove...
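The Iman-Conover construction referenced above can be sketched in two dimensions: sort each empirical margin, then reorder it by the ranks of a correlated Gaussian reference sample, so that the margins are preserved exactly while the rank dependence is imposed (illustrative code; names and parameters are assumptions):

```python
import random

def iman_conover_2d(x, y, rho, rng):
    """Reorder the empirical samples x and y to inherit the rank dependence
    of a correlated Gaussian pair; each output is a permutation of its input,
    so the empirical margins are kept exactly."""
    n = len(x)
    z1 = [rng.gauss(0, 1) for _ in range(n)]
    z2 = [rho * a + (1 - rho ** 2) ** 0.5 * rng.gauss(0, 1) for a in z1]
    def ranks(v):
        order = sorted(range(n), key=v.__getitem__)
        r = [0] * n
        for pos, i in enumerate(order):
            r[i] = pos
        return r
    xs, ys = sorted(x), sorted(y)
    return [xs[r] for r in ranks(z1)], [ys[r] for r in ranks(z2)]

rng = random.Random(5)
x = [rng.expovariate(1.0) for _ in range(500)]       # empirical margin 1
y = [rng.lognormvariate(0.0, 1.0) for _ in range(500)]  # empirical margin 2
xc, yc = iman_conover_2d(x, y, rho=0.8, rng=rng)
```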

## Analysis of Hepatitis C Viral Dynamics Using Latin Hypercube Sampling

Pachpute, Gaurav; Chakrabarty, Siddhartha P.
Type: Journal Article
Published 27/06/2011 Language: Portuguese
Search relevance: 48.013716%
We consider a mathematical model comprising four coupled ordinary differential equations (ODEs) for studying hepatitis C (HCV) viral dynamics. The model embodies the efficacies of a combination therapy of interferon and ribavirin. A condition for the stability of the uninfected and the infected steady states is presented. A large number of physiologically feasible sample points for the model parameters was generated using Latin hypercube sampling. Analysis of our simulated values indicated approximately 24% of cases as having an uninfected steady state. Statistical tests such as the chi-square test and Spearman's test were also performed on the sample values. The results of these tests indicate a distinctly different distribution of certain parameter values, but not of others, vis-à-vis the stability of the uninfected and the infected steady states.
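Generating the physiologically feasible parameter sets by Latin hypercube sampling amounts to one stratified, independently shuffled draw per parameter range (a generic sketch; the parameter names and ranges shown are hypothetical, not the model's):

```python
import random

def lhs_parameter_sets(ranges, n, rng):
    """ranges: dict name -> (lo, hi). Returns n parameter dicts; each
    parameter is sampled by 1-D LHS over its own range."""
    draws = {}
    for name, (lo, hi) in ranges.items():
        perm = list(range(n))
        rng.shuffle(perm)
        draws[name] = [lo + (p + rng.random()) / n * (hi - lo) for p in perm]
    return [{name: draws[name][i] for name in ranges} for i in range(n)]

# hypothetical ranges for two model parameters
sets = lhs_parameter_sets({"beta": (1e-8, 1e-6), "delta": (0.01, 1.0)},
                          n=50, rng=random.Random(6))
```

Each resulting dict can then be fed to the ODE solver and the steady state classified, sweeping the whole plausible parameter box with only 50 runs.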

## Multidimensional Latin Bitrade

Type: Journal Article
Language: Portuguese
Search relevance: 28.50129%
A subset $S$ of the $k$-ary $n$-dimensional hypercube is called a latin bitrade if $|S\cap F|\in\{0,2\}$ for each 1-face $F$. We find all admissible small (less than $2^{n+1}$) cardinalities of latin bitrades. A subset $M$ of the $k$-ary $n$-dimensional hypercube is called a $t$-fold MDS code if $|M\cap F|=t$ for each 1-face $F$. The symmetric difference of two 1-fold MDS codes is always a latin bitrade; the symmetric difference of two $t$-fold MDS codes may also be a latin bitrade, in which case we say that the latin bitrade is embedded into the $t$-fold MDS code. The intersection of a $t$-fold MDS code and a latin bitrade embedded into it is called a component of the code. We study the questions of embedding latin bitrades into $t$-fold MDS codes and the admissible cardinalities of the components of $t$-fold MDS codes. Keywords: MDS code, latin bitrade, component.; Comment: in Russian

## Maximin design on non hypercube domain and kernel interpolation

Auffray, Yves; Barbillon, Pierre; Marin, Jean-Michel
Type: Journal Article
Language: Portuguese
Search relevance: 38.54489%
In the paradigm of computer experiments, the choice of an experimental design is an important issue. When no information is available about the black-box function to be approximated, an exploratory design has to be used. In this context, two dispersion criteria are usually considered: the minimax and the maximin ones. In the case of a hypercube domain, a standard strategy consists of taking the maximin design within the class of Latin hypercube designs. However, in a non-hypercube context, it does not make sense to use the Latin hypercube strategy. Moreover, whatever the design is, the black-box function is typically approximated by kernel interpolation. Here, we first provide a theoretical justification for the maximin criterion with respect to kernel interpolation. Then, we propose simulated annealing algorithms to determine maximin designs in any bounded connected domain. We prove the convergence of the different schemes.; Comment: 3 figures
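The maximin criterion is simple to state in code: score a design by its smallest pairwise distance and seek a design maximizing that score. Below is a crude random-search stand-in for the paper's simulated annealing, on a non-hypercube domain (the unit disc); all names are assumptions:

```python
import math
import random

def min_pairwise_dist(pts):
    """The maximin criterion scores a design by its smallest pairwise distance."""
    return min(math.dist(a, b) for i, a in enumerate(pts) for b in pts[i + 1:])

def random_disc_design(n, rng):
    """n points uniform on the unit disc: a bounded, non-hypercube domain."""
    pts = []
    while len(pts) < n:
        x, y = rng.uniform(-1, 1), rng.uniform(-1, 1)
        if x * x + y * y <= 1.0:  # rejection sampling into the disc
            pts.append((x, y))
    return pts

rng = random.Random(7)
# keep the best of many random candidates (simulated annealing would refine this)
best = max((random_disc_design(8, rng) for _ in range(200)), key=min_pairwise_dist)
```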

## Constructions of transitive latin hypercubes

Krotov, Denis; Potapov, Vladimir
A function $f:\{0,...,q-1\}^n\to\{0,...,q-1\}$ invertible in each argument is called a latin hypercube. A collection $(\pi_0,\pi_1,...,\pi_n)$ of permutations of $\{0,...,q-1\}$ is called an autotopism of a latin hypercube $f$ if $\pi_0f(x_1,...,x_n)=f(\pi_1x_1,...,\pi_n x_n)$ for all $x_1$, ..., $x_n$. We call a latin hypercube isotopically transitive (topolinear) if its group of autotopisms acts transitively (regularly) on all $q^n$ collections of argument values. We prove that the number of nonequivalent topolinear latin hypercubes growths exponentially with respect to $\sqrt{n}$ if $q$ is even and exponentially with respect to $n^2$ if $q$ is divisible by a square. We show a connection of the class of isotopically transitive latin squares with the class of G-loops, known in noncommutative algebra, and establish the existence of a topolinear latin square that is not a group isotope. We characterize the class of isotopically transitive latin hypercubes of orders $q=4$ and $q=5$. Keywords: transitive code, propelinear code, latin square, latin hypercube, autotopism, G-loop.; Comment: 16 pages. V2: the paper has been completely rewritten; the previous version can contain incorrect statements