Page 1 of results: 605 digital items found in 0.034 seconds

Probability of Type I Error in X and S Shewhart Control Charts under Non-normality

Korzenowski, Andre Luis; Werner, Liane
Source: Universidade Federal do Rio Grande do Sul Publisher: Universidade Federal do Rio Grande do Sul
Type: Journal article Format: application/pdf
Portuguese
Search relevance
36.951042%
The aim of this article is to examine the behavior of the Shewhart mean and standard deviation charts with respect to the probability of Type I error when the normality assumption is violated. A simulation of 500,000 samples (subgroups) of sizes n = 3, 5, 7, 10, 15, 20 and 25 was carried out. The samples were drawn from the normal, Student's t, exponential, chi-square, gamma and Weibull distributions. For non-normal data, an increase in the probability of Type I error was observed in the mean chart for all simulated distributions. The minimum sample size required is related to the degree of skewness of the data distribution; in some cases, not even n = 25 gave satisfactory results. In the S chart, the increase in the probability of Type I error is significantly larger in almost all simulated distributions, and its behavior is influenced not only by the type of distribution but also by the sample size.
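The kind of simulation the abstract describes can be sketched in a few lines. The following is an illustrative Monte Carlo check (my own sketch, not the authors' code), comparing normal and exponential data under the usual 3-sigma X-bar limits with known in-control mean and standard deviation:

```python
import numpy as np

rng = np.random.default_rng(42)

def xbar_type1_rate(sampler, mean, sd, n=5, reps=100_000):
    """Fraction of in-control subgroup means falling outside the
    3-sigma Shewhart limits mean +/- 3*sd/sqrt(n)."""
    xbar = sampler((reps, n)).mean(axis=1)
    half_width = 3 * sd / np.sqrt(n)
    return np.mean(np.abs(xbar - mean) > half_width)

# Normal data: close to the nominal two-sided rate of about 0.0027
p_norm = xbar_type1_rate(lambda s: rng.normal(0.0, 1.0, s), 0.0, 1.0)
# Exponential(1) data (mean 1, sd 1): skewness inflates the rate
p_expo = xbar_type1_rate(lambda s: rng.exponential(1.0, s), 1.0, 1.0)
print(p_norm, p_expo)
```

With n = 5, the exponential rate comes out several times larger than the nominal 0.0027, which is the qualitative effect the article quantifies across distributions and subgroup sizes.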

An empirical power comparison of univariate goodness-of-fit tests for normality

Xavier Romão; Raimundo Delgado; Aníbal Costa
Source: Universidade do Porto Publisher: Universidade do Porto
Type: Journal article
Portuguese
Search relevance
37.515305%
A comprehensive power comparison study of existing tests for normality is proposed. Given the importance of this subject and the widespread development of normality tests, comprehensive descriptions and power comparisons of such tests are of considerable interest. Since recent comparison studies do not include several interesting and more recently developed tests, a further comparison of normality tests is considered to be of foremost interest. The study addresses the performance of 33 normality tests, for various sample sizes, considering several significance levels and for a number of symmetric, asymmetric and modified normal distributions. General recommendations for normality testing resulting from the study are defined according to the nature of the non-normality.
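As a small illustration of how such empirical power figures are produced (not the authors' protocol, and covering only two common tests rather than the 33 in the study), rejection rates can be estimated by simulation with scipy:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def empirical_power(test, sampler, n=50, alpha=0.05, reps=2000):
    """Fraction of simulated samples for which `test` rejects normality."""
    rejections = 0
    for _ in range(reps):
        _, p = test(sampler(n))
        rejections += p < alpha
    return rejections / reps

# Power against a skewed alternative (chi-square with 4 d.o.f.)
power_sw = empirical_power(stats.shapiro, lambda n: rng.chisquare(4, n))
power_dp = empirical_power(stats.normaltest, lambda n: rng.chisquare(4, n))
# Size check under the null (normal data): should be near alpha
size_sw = empirical_power(stats.shapiro, lambda n: rng.normal(0, 1, n))
print(power_sw, power_dp, size_sw)
```

Repeating this over many sample sizes, significance levels and alternative distributions is exactly the grid such comparison studies sweep.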

A Two One-Sided Parametric Tolerance Interval Test for Control of Delivered Dose Uniformity—Part 3—Investigation of Robustness to Deviations from Normality

Novick, Steven; Christopher, David; Dey, Monisha; Lyapustina, Svetlana; Golden, Michael; Leiner, Stefan; Wyka, Bruce; Delzeit, Hans-Joachim; Novak, Chris; Larner, Gregory
Source: Springer US Publisher: Springer US
Type: Journal article
Published 24/06/2009 Portuguese
Search relevance
37.324644%
The robustness of the parametric tolerance interval test, which was proposed by the Food and Drug Administration for control of delivered dose uniformity in orally inhaled and nasal drug products, is investigated in this article using different scenarios for deviations from a univariate normal distribution. The studied scenarios span a wide range of conditions, the purpose of which is to provide an understanding of how the test performs depending on the nature and degree of the deviation from normality. Operating characteristic curves were generated to compare the performance of the test for different types of distributions (normal and non-normal) having the same proportion of doses in the tails (on one or both sides) outside the target interval. The results show that, in most cases, non-normality does not increase the probability of accepting a batch of unacceptable quality (i.e., the test is robust) except in extreme situations, which do not necessarily represent commercially viable products. The results also demonstrate that, in the case of bimodal distributions where the life-stage means differ from each other by up to 24% label claim, the test’s criterion on life-stage means does not affect pass rates because the tolerance interval portion of the test reacts to shifting means as well.

Estimating Latent Variable Interactions With Non-Normal Observed Data: A Comparison of Four Approaches

Cham, Heining; West, Stephen G.; Ma, Yue; Aiken, Leona S.
Source: PubMed Publisher: PubMed
Type: Journal article
Portuguese
Search relevance
37.26747%
A Monte Carlo simulation was conducted to investigate the robustness of four latent variable interaction modeling approaches (Constrained Product Indicator [CPI], Generalized Appended Product Indicator [GAPI], Unconstrained Product Indicator [UPI], and Latent Moderated Structural Equations [LMS]) under high degrees of non-normality of the observed exogenous variables. Results showed that the CPI and LMS approaches yielded biased estimates of the interaction effect when the exogenous variables were highly non-normal. When the violation of non-normality was not severe (normal; symmetric with excess kurtosis < 1), the LMS approach yielded the most efficient estimates of the latent interaction effect with the highest statistical power. In highly non-normal conditions, the GAPI and UPI approaches with ML estimation yielded unbiased latent interaction effect estimates, with acceptable actual Type-I error rates for both the Wald and likelihood ratio tests of interaction effect at N ≥ 500. An empirical example illustrated the use of the four approaches in testing a latent variable interaction between academic self-efficacy and positive family role models in the prediction of academic performance.

Power and Sample Size Determination in the Rasch Model: Evaluation of the Robustness of a Numerical Method to Non-Normality of the Latent Trait

Guilleux, Alice; Blanchin, Myriam; Hardouin, Jean-Benoit; Sébille, Véronique
Source: Public Library of Science Publisher: Public Library of Science
Type: Journal article
Published 10/01/2014 Portuguese
Search relevance
47.1846%
Patient-reported outcomes (PRO) have gained importance in clinical and epidemiological research and aim at assessing quality of life, anxiety or fatigue for instance. Item Response Theory (IRT) models are increasingly used to validate and analyse PRO. Such models relate observed variables to a latent variable (unobservable variable) which is commonly assumed to be normally distributed. A priori sample size determination is important to obtain adequately powered studies to determine clinically important changes in PRO. In previous developments, the Raschpower method has been proposed for the determination of the power of the test of group effect for the comparison of PRO in cross-sectional studies with an IRT model, the Rasch model. The objective of this work was to evaluate the robustness of this method (which assumes a normal distribution for the latent variable) to violations of distributional assumption. The statistical power of the test of group effect was estimated by the empirical rejection rate in data sets simulated using a non-normally distributed latent variable. It was compared to the power obtained with the Raschpower method. In both cases, the data were analyzed using a latent regression Rasch model including a binary covariate for group effect. For all situations...

Detecting ARCH Effects in Non-Gaussian Time Series

Raunig, Burkhard
Source: Oxford University Press Publisher: Oxford University Press
Type: Journal article Format: text/html
Portuguese
Search relevance
37.40493%
Engle's ARCH test has become the standard test for ARCH effects in applied work. Under non-normality, however, the true rejection probability of this test can differ substantially from the nominal level. Bootstrap and Monte Carlo versions of the test may then be used instead. This paper proposes an alternative test procedure. The new test exploits the empirical distribution of the data and an extended probability integral transformation. The test is compared with the former tests in Monte Carlo experiments. Under normality, the new test works as well as the conventional Monte Carlo test and the bootstrap. Under non-normality, the test tends to be more accurate and more powerful than the bootstrapped ARCH test. The procedure is then used to test for ARCH effects in S&P 500 returns sampled at different frequencies. In contrast to the standard and the bootstrapped ARCH tests, the new test detects ARCH effects in the transformed low-frequency returns.
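Engle's original LM test, the benchmark the new procedure is compared against, can be sketched directly (a minimal illustration, not the paper's proposed test): regress the squared series on its own lags and refer T·R² to a chi-square distribution.

```python
import numpy as np
from scipy import stats

def arch_lm_test(x, lags=4):
    """Engle's LM test for ARCH effects: regress the squared series on
    its own lags; under the null of no ARCH, T * R^2 ~ chi2(lags)."""
    e2 = np.asarray(x, dtype=float) ** 2
    T = len(e2) - lags
    y = e2[lags:]
    X = np.column_stack(
        [np.ones(T)] + [e2[lags - k:len(e2) - k] for k in range(1, lags + 1)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    r2 = 1.0 - resid.var() / y.var()
    lm = T * r2
    return lm, stats.chi2.sf(lm, lags)

rng = np.random.default_rng(1)
iid = rng.normal(size=2000)            # no ARCH: large p-value expected
e = np.zeros(2000)                     # ARCH(1) with alpha_1 = 0.7
for t in range(1, 2000):
    e[t] = rng.normal() * np.sqrt(0.2 + 0.7 * e[t - 1] ** 2)
lm_iid, p_iid = arch_lm_test(iid)
lm_arch, p_arch = arch_lm_test(e)
```

The paper's point is that the chi-square reference distribution used here becomes unreliable when the innovations are strongly non-normal, which motivates the bootstrap, Monte Carlo and transformation-based variants.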

Testing Mean-Variance Efficiency in CAPM with Possibly Non-Gaussian Errors : An Exact Simulation-Based Approach

BEAULIEU, Marie-Claude; DUFOUR, Jean-Marie; KHALAF, Lynda
Source: Université de Montréal Publisher: Université de Montréal
Type: Journal article Format: 400691 bytes; application/pdf
Portuguese
Search relevance
47.75769%
In this paper we propose exact likelihood-based mean-variance efficiency tests of the market portfolio in the context of the Capital Asset Pricing Model (CAPM), allowing for a wide class of error distributions which include normality as a special case. These tests are developed in the framework of multivariate linear regressions (MLR). It is well known, however, that despite their simple statistical structure, standard asymptotically justified MLR-based tests are unreliable. In financial econometrics, exact tests have been proposed for a few specific hypotheses [Jobson and Korkie (Journal of Financial Economics, 1982), MacKinlay (Journal of Financial Economics, 1987), Gibbons, Ross and Shanken (Econometrica, 1989), Zhou (Journal of Finance, 1993)], most of which depend on normality. For the Gaussian model, our tests correspond to Gibbons, Ross and Shanken's mean-variance efficiency tests. In non-Gaussian contexts, we reconsider mean-variance efficiency tests allowing for multivariate Student-t and Gaussian mixture errors. Our framework allows us to shed more light on whether the normality assumption is too restrictive when testing the CAPM. We also propose exact multivariate diagnostic checks (including tests for multivariate GARCH and a multivariate generalization of the well-known variance ratio tests) and goodness-of-fit tests, as well as a set estimate for the intervening nuisance parameters. Our results [over five-year subperiods] show the following: (i) multivariate normality is rejected in most subperiods...
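The simulation-based exact-test logic underlying this line of work is simple to sketch: compute the statistic on the data, recompute it on artificial samples drawn under the null, and take the p-value as the observed statistic's rank. The statistic and null model below are illustrative stand-ins, not the paper's CAPM test statistics:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

def mc_pvalue(stat, data, simulate_null, n_sim=999):
    """Monte Carlo test: the p-value is the rank of the observed
    statistic among statistics computed on artificial samples drawn
    under the null hypothesis; it is exact by construction."""
    t_obs = stat(data)
    t_sim = np.array([stat(simulate_null()) for _ in range(n_sim)])
    return (1 + np.sum(t_sim >= t_obs)) / (n_sim + 1)

# Illustrative use: sample excess kurtosis against a heavy-tailed sample
data = rng.standard_t(df=3, size=200)
p = mc_pvalue(stats.kurtosis, data, lambda: rng.normal(size=200))
```

Because the null distribution is simulated rather than approximated asymptotically, the level of the test is controlled exactly in finite samples, which is the key property the paper exploits.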

Exact Skewness-Kurtosis Tests for Multivariate Normality and Goodness-of-fit in Multivariate Regressions with Application to Asset Pricing Models

DUFOUR, Jean-Marie; KHALAF, Lynda; BEAULIEU, Marie-Claude
Source: Université de Montréal Publisher: Université de Montréal
Type: Journal article Format: 225374 bytes; application/pdf
Portuguese
Search relevance
37.324644%
We study the problem of testing the error distribution in a multivariate linear regression (MLR) model. The tests are functions of appropriately standardized multivariate least squares residuals whose distribution is invariant to the unknown cross-equation error covariance matrix. Empirical multivariate skewness and kurtosis criteria are then compared to simulation-based estimates of their expected value under the hypothesized distribution. Special cases considered include testing multivariate normal, Student t, normal mixture and stable error models. In the Gaussian case, finite-sample versions of the standard multivariate skewness and kurtosis tests are derived. To do this, we exploit simple, double and multi-stage Monte Carlo test methods. For non-Gaussian distribution families involving nuisance parameters, confidence sets are derived for the nuisance parameters and the error distribution. The procedures considered are evaluated in a small simulation experiment. Finally, the tests are applied to an asset pricing model with observable risk-free rates, using monthly returns on New York Stock Exchange (NYSE) portfolios over five-year subperiods from 1926-1995.; In this article, we propose tests of the shape of the error distribution in a multivariate linear regression (MLR) model. The tests we develop are functions of the residuals obtained by multivariate least squares...

The effects of non-normality and nonlinearity of the Navier–Stokes operator on the dynamics of a large laminar separation bubble

CHERUBINI, Stefania; Robinet, Jean-Christophe; DE PALMA, Pietro
Source: AIP Publisher: AIP
Portuguese
Search relevance
47.58992%
Publisher version : http://pof.aip.org/resource/1/phfle6/v22/i1/p014102_s1?isAuthorized=no; The effects of non-normality and nonlinearity of the two-dimensional Navier–Stokes differential operator on the dynamics of a large laminar separation bubble over a flat plate have been studied in both subcritical and slightly supercritical conditions. The global eigenvalue analysis and direct numerical simulations have been employed in order to investigate the linear and nonlinear stability of the flow. The steady-state solutions of the Navier–Stokes equations at supercritical and slightly subcritical Reynolds numbers have been computed by means of a continuation procedure. Topological flow changes on the base flow have been found to occur close to transition, supporting the hypothesis of some authors that unsteadiness of separated flows could be due to structural changes within the bubble. The global eigenvalue analysis and numerical simulations initialized with small amplitude perturbations have shown that the non-normality of convective modes allows the bubble to act as a strong amplifier of small disturbances. For subcritical conditions, nonlinear effects have been found to induce saturation of such an amplification, originating a wave-packet cycle similar to the one established in supercritical conditions...

Logic and hypothesis testing: reflections on ill-posed problems in econometrics

DUFOUR, Jean-Marie
Source: Université de Montréal Publisher: Université de Montréal
Type: Journal article Format: 161986 bytes; application/pdf
Portuguese
Search relevance
37.291643%
In this text, we analyze recent developments in econometrics in the light of the theory of statistical tests. We first review some fundamental principles of the philosophy of science and of statistical theory, emphasizing parsimony and falsifiability as criteria for evaluating models, the role of testing theory as a formalization of the falsification principle for probabilistic models, and the logical justification of the basic notions of testing theory (such as the level of a test). We then show that some of the most widely used statistical and econometric methods are fundamentally inappropriate for the problems and models considered, while many hypotheses for which testing procedures are commonly proposed are in fact not testable at all. Such situations lead to ill-posed statistical problems. We analyze some particular cases of such problems: (1) the construction of confidence intervals in structural models that raise identification problems; (2) the construction of tests for nonparametric hypotheses...

Bayesian test of normality versus a Dirichlet process mixture alternative

Tokdar, Surya T.; Martin, Ryan
Source: Cornell University Publisher: Cornell University
Type: Journal article
Portuguese
Search relevance
37.306963%
We propose a Bayesian test of normality of univariate or multivariate data against alternative nonparametric models characterized by Dirichlet process mixture distributions. The alternative models are based on the principles of embedding and predictive matching. They can be interpreted to offer random granulation of a normal distribution into a mixture of normals with mixture components occupying a smaller volume the farther they are from the distribution center. A scalar parametrization based on latent clustering is used to cover an entire spectrum of separation between the normal distributions and the alternative models. An efficient sequential importance sampler is developed to calculate Bayes factors. Simulations indicate the proposed test can detect non-normality without favoring the nonparametric alternative when normality holds.; Comment: 20 pages, 5 figures, 1 table

Non-normality in combustion-acoustic interaction in diffusion flames: a critical revision

Magri, Luca; Balasubramanian, K.; Sujith, R. I.; Juniper, Matthew P.
Source: Cornell University Publisher: Cornell University
Type: Journal article
Published 01/10/2013 Portuguese
Search relevance
47.1846%
Perturbations in a non-normal system can grow transiently even if the system is linearly stable. If this transient growth is sufficiently large, it can trigger self-sustained oscillations from small initial disturbances. This has important practical consequences for combustion-acoustic oscillations, which are a continual problem in rocket and aircraft engines. Balasubramanian and Sujith (Journal of Fluid Mechanics, 2008, 594, 29-57) modelled an infinite-rate chemistry diffusion flame in an acoustic duct and found that the transient growth in this system can amplify the initial energy by a factor, $G_{max}$, of order $10^5$ to $10^7$. However, recent investigations by L. Magri & M. P. Juniper have brought to light certain errors in that paper. When the errors are corrected, $G_{max}$ is found to be of order 1 to 10, revealing that non-normality is not as influential as it was thought to be.; Comment: 4 pages, 2 figures
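The mechanism at stake here, transient energy growth G(t) in a linearly stable but non-normal system, can be illustrated with a toy two-by-two operator (my own construction, unrelated to the flame model): both eigenvalues are negative, yet the norm of the propagator grows by orders of magnitude before decaying.

```python
import numpy as np
from scipy.linalg import expm

# A stable but highly non-normal operator: both eigenvalues are
# negative, yet the large off-diagonal coupling drives transient growth.
A = np.array([[-0.01, 50.0],
              [0.0, -0.02]])

ts = np.linspace(0.0, 200.0, 2001)
# Energy amplification envelope G(t) = ||exp(t A)||_2^2
G = [np.linalg.norm(expm(t * A), 2) ** 2 for t in ts]
G_max = max(G)
print(G_max)
```

An eigenvalue analysis alone would predict monotone decay; it is exactly this gap between spectral stability and transient amplification that makes the magnitude of $G_{max}$ the decisive quantity in the corrected calculation.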

Filtered overlap: speedup, locality, kernel non-normality and Z_A~1

Durr, Stephan; Hoelbling, Christian; Wenger, Urs
Source: Cornell University Publisher: Cornell University
Type: Journal article
Published 28/06/2005 Portuguese
Search relevance
47.10431%
We investigate the overlap operator with a UV filtered Wilson kernel. The filtering leads to a better localization of the operator even on coarse lattices and with the untuned choice $\rho=1$. Furthermore, the axial-vector renormalization constant $Z_A$ is much closer to 1, reducing the mismatch with perturbation theory. We show that all these features persist over a wide range of couplings and that the details of filtering prove immaterial. We investigate the properties of the kernel spectrum and find that the kernel non-normality is reduced. As a side effect we observe that for certain applications of the filtered overlap a speed-up factor of 2-4 can be achieved.; Comment: 30 pp, 23 figs

Multivariate Non-Normality in the WMAP 1st Year Data

Dineen, Patrick; Coles, Peter
Source: Cornell University Publisher: Cornell University
Type: Journal article
Published 29/11/2005 Portuguese
Search relevance
37.681772%
The extraction of cosmological parameters from microwave background observations relies on specific assumptions about the statistical properties of the data, in particular that the p-point distributions of temperature fluctuations are jointly normal. Using a battery of statistical tests, we assess the multivariate Gaussian nature of the Wilkinson Microwave Anisotropy Probe (WMAP) 1st year data. The statistics we use fall into three classes which test different aspects of joint normality: the first set assesses the normality of marginal (one-point) distributions using familiar univariate methods; the second involves statistics that directly assess joint normality; and the third explores the evidence of non-linearity in the relationship between variates. We applied these tests to frequency maps, "foreground-cleaned" assembly maps and all-sky CMB-only maps. The assembly maps are of particular interest since, when combined with the Kp2 mask, they recreate the region used in the computation of the angular power spectrum. Significant departures from normality were found in all the maps. In particular, the kurtosis coefficient, D'Agostino's statistic and the bivariate kurtosis calculated from temperature pairs extracted from all the assembly maps were found to be non-normal at the 99% confidence level. We found that the results were unaffected by the size of the Galactic cut and were evident on either hemisphere of the CMB sky. The latter suggests that the non-Gaussianity is not simply related to previous claims of north-south asymmetry or localized abnormalities detected through wavelet techniques.; Comment: 15 pages...

Do probabilistic medium-range temperature forecasts need to allow for non-normality?

Jewson, Stephen
Source: Cornell University Publisher: Cornell University
Type: Journal article
Published 13/10/2003 Portuguese
Search relevance
47.10431%
The Gaussian spread regression model for the calibration of site-specific ensemble temperature forecasts depends on the apparently restrictive assumption that the uncertainty around temperature forecasts is normally distributed. We generalise the model using the kernel density to allow for much more flexible distribution shapes. However, we do not find any meaningful improvement in the resulting probabilistic forecast when evaluated using likelihood-based scores. We conclude that the distribution of uncertainty is either very close to normal, or, if it is not close to normal, then the non-normality is not being predicted by the ensemble forecast that we test.

The Consequences of Non-Normality

Hip, I.; Lippert, Th.; Neff, H.; Schilling, K.; Schroers, W.
Source: Cornell University Publisher: Cornell University
Type: Journal article
Published 17/10/2001 Portuguese
Search relevance
47.10431%
The non-normality of Wilson-type lattice Dirac operators has important consequences - the application of the usual concepts from the textbook (hermitian) quantum mechanics should be reconsidered. This includes an appropriate definition of observables and the refinement of computational tools. We show that the truncated singular value expansion is the optimal approximation to the inverse operator D^{-1} and we prove that due to the gamma_5-hermiticity it is equivalent to gamma_5 times the truncated eigenmode expansion of the hermitian Wilson-Dirac operator.; Comment: Lattice2001(theorydevelop)
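The truncated singular value expansion referred to can be written down in a few lines of generic linear algebra (an illustrative sketch with a small random matrix, not the lattice Wilson-Dirac operator): keep the terms of D⁻¹ = Σᵢ (1/sᵢ) vᵢ uᵢᵀ with the largest weights 1/sᵢ, i.e. the smallest singular values of D.

```python
import numpy as np

rng = np.random.default_rng(3)
D = rng.normal(size=(8, 8))   # generic (non-normal) operator stand-in

U, s, Vt = np.linalg.svd(D)

def tsvd_inverse(k):
    """Truncated singular value expansion of the inverse: keep the k
    terms of D^{-1} = sum_i (1/s_i) v_i u_i^T with the largest weights
    1/s_i, i.e. the k smallest singular values of D."""
    idx = np.argsort(s)[:k]
    return (Vt[idx].T / s[idx]) @ U[:, idx].T

D_inv = np.linalg.inv(D)
err_full = np.linalg.norm(D_inv - tsvd_inverse(8))  # all terms: exact
err_k4 = np.linalg.norm(D_inv - tsvd_inverse(4))    # rank-4 truncation
```

By the Eckart-Young theorem, this truncation is the best rank-k approximation of the inverse in the spectral and Frobenius norms, which is the optimality property the abstract invokes for approximating D⁻¹.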

Non-normal and Stochastic Amplification in Turbulent Dynamo: Subcritical Case

Fedotov, Sergei
Source: Cornell University Publisher: Cornell University
Type: Journal article
Portuguese
Search relevance
37.26747%
Our attention focuses on the stochastic dynamo equation with a non-normal operator, which gives insight into the role of stochasticity and non-normality in the generation of the galactic magnetic field. The main point of this Letter is a discussion of the generation of a large-scale magnetic field that cannot be explained by traditional linear eigenvalue analysis. We present a simple stochastic model for the thin-disk axisymmetric $\alpha \Omega$ dynamo involving three factors: (a) the non-normality generated by differential rotation, (b) the nonlinearity reflecting how the magnetic field affects the turbulent dynamo coefficients, and (c) stochastic perturbations. We show that, even in the subcritical case, there are three possible mechanisms for the generation of a magnetic field. The first mechanism is a deterministic one that describes an interplay between transient growth and nonlinear saturation of the turbulent $\alpha$-effect and diffusivity. It turns out that the trivial state is nonlinearly unstable to small but finite initial perturbations. The second and third are stochastic mechanisms that account for the interaction of the non-normal effect generated by differential rotation with random additive and multiplicative fluctuations. In particular...

Uniaxial Tension of a Class of Compressible Solids With Plastic Non-Normality

Mohan, Nisha; Cheng, Justine; Greer, Julia R.; Needleman, Alan
Source: American Society of Mechanical Engineers Publisher: American Society of Mechanical Engineers
Type: Article; PeerReviewed Format: application/pdf
Published 07/2013 Portuguese
Search relevance
47.10431%
Motivated by a model that qualitatively captured the response of vertically aligned carbon nanotube (VACNT) pillars in uniaxial compression, we consider the uniaxial tensile response of a class of compressible elastic-viscoplastic solids. In Hutchens et al. [“Analysis of Uniaxial Compression of Vertically Aligned Carbon Nanotubes,” J. Mech. Phys. Solids, 59, pp. 2227–2237 (2011), Erratum 60, 1753–1756 (2012)] an elastic viscoplastic constitutive relation with plastic compressibility, plastic non-normality, and a hardening-softening-hardening hardness function was used to model experimentally obtained uniaxial compression data of cylindrical VACNT micropillars. Complex deformation modes were found in uniaxial compression, which include a sequential buckling-like collapse of the type seen in experiments. These complex deformation modes led to the overall stress-strain signature of the pillar not being of the same form as the input material hardness function. A fundamental question that motivates exploring the deformation of this class of materials—both experimentally and theoretically—is how to extract the intrinsic material response from simple tests. In this study we explore the relation between the input material response and the overall stress strain behavior in uniaxial tension using the constitutive framework of Hutchens et al. A simple one-dimensional analysis reveals the types of instability modes to be expected. Dynamic...

Bayesian Modeling and Computation for Mixed Data

Cui, Kai
Source: Duke University Publisher: Duke University
Type: Dissertation
Published 2012 Portuguese
Search relevance
37.184597%

Multivariate or high-dimensional data with mixed types are ubiquitous in many fields of studies, including science, engineering, social science, finance, health and medicine, and joint analysis of such data entails both statistical models flexible enough to accommodate them and novel methodologies for computationally efficient inference. Such joint analysis is potentially advantageous in many statistical and practical aspects, including shared information, dimensional reduction, efficiency gains, increased power and better control of error rates.

This thesis mainly focuses on two types of mixed data: (i) mixed discrete and continuous outcomes, especially in a dynamic setting; and (ii) multivariate or high dimensional continuous data with potential non-normality, where each dimension may have different degrees of skewness and tail-behaviors. Flexible Bayesian models are developed to jointly model these types of data, with a particular interest in exploring and utilizing the factor models framework. Much emphasis has also been placed on the ability to scale the statistical approaches and computation efficiently up to problems with long mixed time series or increasingly high-dimensional heavy-tailed and skewed data.

To this end...

LANDUSE/LANDCOVER CLASSIFICATION BY NON-PARAMETRIC ALGORITHMS COMPARED TO THE MAXIMUM LIKELIHOOD CLASSIFIER

da Costa, Thomaz Corrêa e Castro; De Marco Júnior, Paulo; Brites, Ricardo Seixas
Source: UFPR Publisher: UFPR
Type: info:eu-repo/semantics/article; info:eu-repo/semantics/publishedVersion; Peer-reviewed article Format: application/pdf
Published 24/04/2006 Portuguese
Search relevance
36.954268%
Landuse/landcover maps produced by non-parametric classifiers of a Landsat TM image were compared with results obtained by the usual maximum likelihood classifier (MAXLIKE). The maximum likelihood classifier is the pixel-by-pixel parametric strategy most widely used in the classification of orbital images. However, Skidmore and Turner (1988) obtained global accuracy results with their non-parametric algorithm on SPOT XS data that, for Pinus spp. age classes, were 14% higher than those of the maximum likelihood algorithm. The Skidmore/Turner algorithm was subsequently modified by Lowell (1989), Gong and Dunlop (1991) and Dymond (1993). The purpose of the non-parametric algorithm developed in this work was to reduce a limitation of the Skidmore/Turner non-parametric classifier, which, compared to MAXLIKE, demands a very large training sample to reduce the unclassified areas in the image. This algorithm, a new supervised non-parametric classifier that relaxes the association among brightness values, was developed to test the assumption that it increases the classified area of the image without losing accuracy, and thereby to overcome the limitation of the Skidmore/Turner classifier, which requires a larger training sample to reduce the number of unclassified pixels in the image. To facilitate the understanding of the algorithms used in this work...