Page 1 of results: 39,612 digital items found in 0.060 seconds

Entropic approach to rational pricing of the simple ordinary European option on stock and bond: an application of information theory in finance under uncertainty

Siqueira, José de Oliveira
Source: Biblioteca Digital de Teses e Dissertações da USP. Publisher: Biblioteca Digital de Teses e Dissertações da USP
Type: Doctoral Thesis. Format: application/pdf
Published on 17/12/1999. Language: Portuguese
Search relevance: 66.646353%
This thesis integrates Finance and Information Theory to create an alternative setting for determining the rational price of the simple ordinary European option on a stock and on a fixed-income asset (bond). One feature of this new rational-pricing setting is that it allows Newtonian calculus to be used instead of stochastic calculus. The thesis develops a precise and complete mathematical notation for Information Theory and integrates it with the theory of Finance under uncertainty. It unifies the entropic approaches to rational pricing of the simple ordinary European option of Gulko (1998 and 1998a) and Yang (1997). It precisely defines the risk-neutral world, the martingale world, the informationally efficient world, and the entropic world, together with their implications for Investment Science and, more specifically, for the rational pricing of underlying and derivative assets. It derives in detail the Black-Scholes-Merton rational-pricing formula for the simple ordinary European option, improving the mathematical notation, simplifying the derivation (eliminating the martingale approach), and complementing the proof given by Baxter & Rennie (1998). It interrupts a succession of works that established a mistaken way of computing the price of the simple ordinary European option. That error originated...
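For reference, the Black-Scholes-Merton rational price of the simple ordinary European call that the thesis re-derives has the standard textbook form (stated here in conventional notation, not the thesis's improved notation):

\[
C = S_0\,\Phi(d_1) - K e^{-rT}\,\Phi(d_2), \qquad
d_{1,2} = \frac{\ln(S_0/K) + \bigl(r \pm \tfrac{1}{2}\sigma^2\bigr)T}{\sigma\sqrt{T}},
\]

where \(S_0\) is the current stock price, \(K\) the strike, \(r\) the risk-free rate, \(\sigma\) the volatility, \(T\) the time to expiry, and \(\Phi\) the standard normal distribution function.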

Algorithmic information theory

Hutter, Marcus
Source: Scholarpedia. Publisher: Scholarpedia
Type: Scientific Journal Article
Language: Portuguese
Search relevance: 66.648286%
This article is a brief guide to the field of algorithmic information theory (AIT), its underlying philosophy, and the most important concepts. AIT arises by mixing information theory and computation theory to obtain an objective and absolute notion of information in an individual object, and in so doing gives rise to an objective and robust notion of randomness of individual objects. This is in contrast to classical information theory that is based on random variables and communication, and has no bearing on information and randomness of individual objects. After a brief overview, the major subfields, applications, history, and a map of the field are presented.
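The "objective and absolute notion of information in an individual object" referred to here is Kolmogorov complexity; as a reminder, its standard definition (not specific to this article) is

\[
K_U(x) = \min\{\, |p| : U(p) = x \,\},
\]

the length of a shortest program \(p\) that makes a fixed universal prefix machine \(U\) output the string \(x\). A string then counts as algorithmically random when \(K_U(x)\) is close to \(|x|\), i.e. when it admits no description shorter than itself.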

Optimization of artificial neural networks through information theory, for use in electronic control units

Froehlich, Michael Helmut
Source: University of Tübingen. Publisher: University of Tübingen
Type: Dissertation
Language: Portuguese
Search relevance: 66.67244%
Artificial neural networks are usually adapted by repeatedly presenting training patterns obtained from measurement data. The network size required to map the training space must generally be specified by the user and depends strongly on the complexity of the data. To determine the complexity of measurement data, an information measure proposed by Claude Shannon as early as 1948 can be used. Building on the so-called mutual information (Transinformation), several methods for data analysis and for generating a network topology matched to that complexity are presented. Using mutual information, a method was first developed that enables an information-theoretic evaluation of Fourier spectra, in which individual frequency bands are weighted according to their share of information relative to the overall spectrum. This weighting provides additional assistance in the design of digital filters. The information-theoretic evaluation of training patterns was then used to develop a method for the targeted selection of those network inputs that are actually necessary. Building on the experience gained in this way, a method for generating complete network topologies was finally developed. In this process, a topology is generated...
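As an illustration of the kind of mutual-information-based input selection the dissertation describes, here is a minimal sketch using a histogram (plug-in) estimate; the function names, binning, and threshold are illustrative assumptions, not the dissertation's actual procedure:

    import numpy as np

    def mutual_information(x, y, bins=16):
        """Plug-in estimate of I(X;Y) in bits from two 1-D sample arrays."""
        pxy, _, _ = np.histogram2d(x, y, bins=bins)
        pxy = pxy / pxy.sum()                    # empirical joint distribution
        px = pxy.sum(axis=1, keepdims=True)      # marginal of X
        py = pxy.sum(axis=0, keepdims=True)      # marginal of Y
        nz = pxy > 0                             # skip empty cells, avoid log(0)
        return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

    def select_inputs(candidates, target, threshold=0.05):
        """Keep only candidate input signals that share information with the
        target; threshold is an illustrative choice, not from the thesis."""
        return [i for i, x in enumerate(candidates)
                if mutual_information(x, target) > threshold]

Inputs whose estimated mutual information with the target is near zero contribute essentially nothing to mapping the training space and can be dropped, which is the effect such a selection method exploits to shrink the required network topology.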

The decoupling approach to quantum information theory

Dupuis, Frédéric
Source: Université de Montréal. Publisher: Université de Montréal
Type: Electronic Thesis or Dissertation
Language: Portuguese
Search relevance: 66.67244%
Quantum information theory studies the fundamental limits that the laws of physics impose on data-processing tasks such as compression and transmission of data over a noisy channel. This thesis presents general techniques that allow several fundamental problems of quantum information theory to be solved within a single framework. The central theorem of this thesis establishes the existence of a protocol for transmitting quantum data that the receiver already partially knows, using a single use of a noisy quantum channel. This theorem moreover yields, as immediate corollaries, several central theorems of quantum information theory. The following chapters use this theorem to prove the existence of new protocols for two other types of quantum channels, namely quantum broadcast channels and quantum channels with side information available at the transmitter. These protocols likewise deal with the transmission of quantum data partially known to the receiver using a single channel use, and they have as corollaries asymptotic versions with and without auxiliary entanglement. The asymptotic versions with auxiliary entanglement can...

Interactive quantum information theory

Touchette, Dave
Source: Université de Montréal. Publisher: Université de Montréal
Type: Electronic Thesis or Dissertation
Language: Portuguese
Search relevance: 66.70172%
Quantum information theory has developed at a breathtaking pace over the past twenty years, with analogues and extensions of the source-coding and noisy-channel coding theorems for unidirectional communication. For interactive communication, a quantum analogue of communication complexity has been developed, in which quantum protocols can perform exponentially better than the best classical protocols for certain classical tasks. However, quantum information is much more sensitive to noise than classical information, so it is imperative to use quantum resources to their full potential. In this thesis, we study interactive quantum protocols from the viewpoint of information theory and investigate the analogues of source coding and noisy-channel coding. The setting considered is that of communication complexity: Alice and Bob wish to perform a bipartite quantum computation while minimizing the amount of communication exchanged, regardless of the cost of local computation. Our results are divided into three distinct chapters, organized so that each can be read independently. Given the central role it plays in the context of interactive compression...

An Introductory Review of Information Theory in the Context of Computational Neuroscience

McDonnell, Mark D.; Ikeda, Shiro; Manton, Jonathan H.
Source: Cornell University. Publisher: Cornell University
Type: Scientific Journal Article
Published on 14/07/2011. Language: Portuguese
Search relevance: 66.648286%
This paper introduces several fundamental concepts in information theory from the perspective of their origins in engineering. Understanding such concepts is important in neuroscience for two reasons. First, simply applying formulae from information theory without understanding the assumptions behind their definitions can lead to erroneous results and conclusions. Furthermore, this century will see a convergence of information theory and neuroscience; information theory will expand its foundations to incorporate biological processes more comprehensively, thereby helping to reveal how neuronal networks achieve their remarkable information-processing abilities.; Comment: 18 pages, 7 figures, to appear in Biological Cybernetics
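The fundamental quantities such introductions start from are Shannon's entropy and mutual information, whose standard definitions for discrete variables are

\[
H(X) = -\sum_{x} p(x)\log_2 p(x), \qquad
I(X;Y) = \sum_{x,y} p(x,y)\log_2 \frac{p(x,y)}{p(x)\,p(y)}.
\]

One example of the kind of assumption the paper warns about: both formulae presuppose that the observed responses are samples from a well-defined joint distribution \(p(x,y)\), which is not automatic for nonstationary neural recordings.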

Information Theory and Statistical Physics - Lecture Notes

Merhav, Neri
Source: Cornell University. Publisher: Cornell University
Type: Scientific Journal Article
Published on 08/06/2010. Language: Portuguese
Search relevance: 66.69823%
This document consists of lecture notes for a graduate course on the relations between Information Theory and Statistical Physics. The course is aimed at EE graduate students in the area of Communications and Information Theory, as well as at graduate students in Physics who have a basic background in Information Theory. Strong emphasis is given to the analogy and parallelism between Information Theory and Statistical Physics, as well as to the insights, analysis tools, and techniques that can be borrowed from Statistical Physics and `imported' to certain problem areas in Information Theory. This is a research trend that has been very active in the last few decades, and the hope is that by exposing the student to the meeting points between these two disciplines, we will enhance their background and perspective to carry out research in the field. A short outline of the course is as follows: Introduction; Elementary Statistical Physics and its Relation to Information Theory; Analysis Tools in Statistical Physics; Systems of Interacting Particles and Phase Transitions; The Random Energy Model (REM) and Random Channel Coding; Additional Topics (optional).; Comment: 176 pages, 26 figures. Lecture notes of a graduate course delivered at the Technion in the Spring of 2010
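A central meeting point that such notes build on is that the Boltzmann-Gibbs distribution is precisely the maximum-entropy distribution under an average-energy constraint; in standard form,

\[
\max_{p}\; H(p) \;\;\text{subject to}\;\; \sum_x p(x)E(x) = \bar{E}
\quad\Longrightarrow\quad
p(x) = \frac{e^{-\beta E(x)}}{Z(\beta)}, \qquad Z(\beta) = \sum_x e^{-\beta E(x)},
\]

where the inverse temperature \(\beta\) arises as the Lagrange multiplier of the energy constraint and \(\ln Z(\beta)\) is the log-partition function (the free energy, up to sign and scale). This is the standard correspondence; the notes develop it much further.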

Foundations of Information Theory

Burgin, Mark
Source: Cornell University. Publisher: Cornell University
Type: Scientific Journal Article
Published on 06/08/2008. Language: Portuguese
Search relevance: 66.69741%
Information is the basic concept of information theory. However, there is no definition of this concept that encompasses all uses of the term information in information theories and beyond, and many question whether such a definition is possible. Nevertheless, the foundations of information theory developed in the context of the general theory of information have made it possible to build a definition that is both relevant and encompassing. These foundations are built in the form of ontological principles, which reflect basic features of information and information processes.

A Perspective on Future Research Directions in Information Theory

Andrews, Jeffrey G.; Dimakis, Alexandros; Dolecek, Lara; Effros, Michelle; Medard, Muriel; Milenkovic, Olgica; Montanari, Andrea; Vishwanath, Sriram; Yeh, Edmund; Berry, Randall; Duffy, Ken; Feizi, Soheil; Kato, Saul; Kellis, Manolis; Licht, Stuart; Soren
Source: Cornell University. Publisher: Cornell University
Type: Scientific Journal Article
Published on 21/07/2015. Language: Portuguese
Search relevance: 66.69823%
Information theory is rapidly approaching its 70th birthday. What are promising future directions for research in information theory? Where will information theory be having the most impact in 10-20 years? What new and emerging areas are ripe for the most impact, of the sort that information theory has had on the telecommunications industry over the last 60 years? How should the IEEE Information Theory Society promote high-risk new research directions and broaden the reach of information theory, while continuing to be true to its ideals and insisting on the intellectual rigor that makes its breakthroughs so powerful? These are some of the questions that an ad hoc committee (composed of the present authors) explored over the past two years. We have discussed and debated these questions, and solicited detailed inputs from experts in fields including genomics, biology, economics, and neuroscience. This report is the result of these discussions.

Proposal new area of study by connecting between information theory and Weber-Fechner law

Choe, HaengJin
Source: Cornell University. Publisher: Cornell University
Type: Scientific Journal Article
Language: Portuguese
Search relevance: 66.68345%
Roughly speaking, information theory deals with data transmitted over a channel such as the internet. Modern information theory is generally considered to have been founded in 1948 by Shannon in his seminal paper, "A mathematical theory of communication." Shannon's formulation of information theory was an immediate success with communications engineers. Shannon defined mathematically the amount of information transmitted over a channel; this amount is not the number of symbols in the data but depends on the occurrence probabilities of those symbols. Psychophysics, meanwhile, is the study of quantitative relations between psychological events and physical events or, more specifically, between sensations and the stimuli that produce them. Shannon's information theory would seem to bear no relation to the psychophysics established by the German scientist and philosopher Fechner. Here I show that, surprisingly, it is possible to combine the two fields, so that we become able to measure mathematically the perception of physical stimuli governed by the Weber-Fechner law. I will define a new concept of entropy, and as a consequence a new field will begin life.; Comment: 8 pages, no figures
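The two logarithmic laws the paper proposes to connect can be stated side by side (their standard textbook forms, not the paper's new entropy):

\[
S = k \ln\frac{I}{I_0} \;\;\text{(Weber-Fechner: sensation vs. stimulus intensity)}, \qquad
h(x) = -\log_2 p(x) \;\;\text{(Shannon: information content of an outcome)}.
\]

Both quantities live on a logarithmic scale relative to a reference, the threshold intensity \(I_0\) in one case and the outcome probability \(p(x)\) in the other, which is presumably the formal resemblance the proposed combination builds on.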

Is Consciousness Computable? Quantifying Integrated Information Using Algorithmic Information Theory

Maguire, Phil; Moser, Philippe; Maguire, Rebecca; Griffith, Virgil
Source: Cornell University. Publisher: Cornell University
Type: Scientific Journal Article
Published on 01/05/2014. Language: Portuguese
Search relevance: 66.652695%
In this article we review Tononi's (2008) theory of consciousness as integrated information. We argue that previous formalizations of integrated information (e.g. Griffith, 2014) depend on information loss. Since lossy integration would necessitate continuous damage to existing memories, we propose it is more natural to frame consciousness as a lossless integrative process and provide a formalization of this idea using algorithmic information theory. We prove that complete lossless integration requires noncomputable functions. This result implies that if unitary consciousness exists, it cannot be modelled computationally.; Comment: Maguire, P., Moser, P., Maguire, R. & Griffith, V. (2014). Is consciousness computable? Quantifying integrated information using algorithmic information theory. In P. Bello, M. Guarini, M. McShane, & B. Scassellati (Eds.), Proceedings of the 36th Annual Conference of the Cognitive Science Society. Austin, TX: Cognitive Science Society
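One standard AIT quantity that lossless-integration arguments can be phrased with is the algorithmic mutual information between two parts of a system (the textbook definition, up to logarithmic precision; the paper's exact formalization may differ):

\[
I(x : y) = K(x) + K(y) - K(x, y),
\]

the number of description bits saved by compressing \(x\) and \(y\) jointly rather than separately. Since the Kolmogorov complexity \(K\) is itself noncomputable, any integration criterion stated exactly in such terms inherits that noncomputability, which is in the spirit of the paper's conclusion.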

The relation between Granger causality and directed information theory: a review

Amblard, Pierre-Olivier; Michel, Olivier J. J.
Source: Cornell University. Publisher: Cornell University
Type: Scientific Journal Article
Published on 13/11/2012. Language: Portuguese
Search relevance: 66.69655%
This report reviews the conceptual and theoretical links between Granger causality and directed information theory. We begin with a short historical tour of Granger causality, concentrating on its closeness to information theory. The definitions of Granger causality based on prediction are recalled, and the importance of the observation set is discussed. We present the definitions based on conditional independence. The notion of instantaneous coupling is included in the definitions. The concept of Granger causality graphs is discussed. We present directed information theory from the perspective of studies of causal influences between stochastic processes. Causal conditioning appears to be the cornerstone for the relation between information theory and Granger causality. In the bivariate case, the fundamental measure is the directed information, which decomposes as the sum of the transfer entropies and a term quantifying instantaneous coupling. We show the decomposition of the mutual information into the sums of the transfer entropies and the instantaneous coupling measure, a relation known for the linear Gaussian case. We study the multivariate case, showing that the useful decomposition is blurred by instantaneous coupling. The links are further developed by studying how measures based on directed information theory naturally emerge from Granger causality inference frameworks as hypothesis testing.
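The central quantities named here have compact standard definitions. Massey's directed information from \(X^n\) to \(Y^n\) is

\[
I(X^n \to Y^n) = \sum_{i=1}^{n} I(X^i ; Y_i \mid Y^{i-1}),
\]

and by the chain rule each term splits as \(I(X^{i-1}; Y_i \mid Y^{i-1}) + I(X_i ; Y_i \mid Y^{i-1}, X^{i-1})\): the first parts sum to the transfer-entropy contribution and the second parts to the instantaneous-coupling term, which is the bivariate decomposition the abstract describes (stated here in one common convention).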

A primer on information theory, with applications to neuroscience

Effenberger, Felix
Source: Cornell University. Publisher: Cornell University
Type: Scientific Journal Article
Language: Portuguese
Search relevance: 66.7163%
Given the constant rise in the quantity and quality of data obtained from neural systems on many scales, ranging from the molecular to the systems level, information-theoretic analyses have become increasingly necessary in the neurosciences during the past few decades. Such analyses can provide deep insights into the functionality of such systems, as well as a rigorous mathematical theory and quantitative measures of information processing in both healthy and diseased states of neural systems. This chapter presents a short introduction to the fundamentals of information theory, especially suited for readers with a less firm background in mathematics and probability theory. To begin, the fundamentals of probability theory, such as the notion of probability, probability distributions, and random variables, are reviewed. Then, the concepts of information and entropy (in the sense of Shannon), mutual information, and transfer entropy (sometimes also referred to as conditional mutual information) are outlined. As these quantities cannot be computed exactly from measured data in practice, estimation techniques for information-theoretic quantities are presented. The chapter concludes with applications of information theory in the field of neuroscience...
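As a minimal illustration of the estimation problem the chapter closes with, here is a plug-in (histogram) entropy estimator together with the classical Miller-Madow bias correction; the function names and binning are illustrative choices, and the chapter itself covers more sophisticated estimators:

    import numpy as np

    def entropy_plugin(samples, bins=32):
        """Plug-in (maximum-likelihood) estimate of H(X) in bits."""
        counts, _ = np.histogram(samples, bins=bins)   # bins: illustrative choice
        p = counts[counts > 0] / counts.sum()
        return float(-(p * np.log2(p)).sum())

    def entropy_miller_madow(samples, bins=32):
        """Miller-Madow correction: plug-in + (K-1)/(2N) nats, reported in bits."""
        counts, _ = np.histogram(samples, bins=bins)
        n, k = counts.sum(), int((counts > 0).sum())   # N samples, K occupied bins
        return entropy_plugin(samples, bins) + (k - 1) / (2 * n * np.log(2))

The plug-in estimate systematically underestimates entropy for small samples (every unseen response bin is treated as impossible), which is exactly why dedicated estimation techniques are needed before information-theoretic quantities computed from spike data can be trusted.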

Algorithmic Information Theory: a brief non-technical guide to the field

Hutter, Marcus
Source: Cornell University. Publisher: Cornell University
Type: Scientific Journal Article
Published on 06/03/2007. Language: Portuguese
Search relevance: 66.68345%
This article is a brief guide to the field of algorithmic information theory (AIT), its underlying philosophy, and the most important concepts. AIT arises by mixing information theory and computation theory to obtain an objective and absolute notion of information in an individual object, and in so doing gives rise to an objective and robust notion of randomness of individual objects. This is in contrast to classical information theory that is based on random variables and communication, and has no bearing on information and randomness of individual objects. After a brief overview, the major subfields, applications, history, and a map of the field are presented.; Comment: 11 LaTeX pages. http://www.scholarpedia.org/article/Algorithmic_information_theory

Information theory and Thermodynamics

Kafri, Oded
Source: Cornell University. Publisher: Cornell University
Type: Scientific Journal Article
Published on 07/02/2006. Language: Portuguese
Search relevance: 66.66717%
A communication theory for a transmitter broadcasting to many receivers is presented. In this case, energetic considerations cannot be neglected, as they are in Shannon's theory. It is shown that, when energy is assigned to the information bit, information theory complies with classical thermodynamics and is part of it. To provide a thermodynamic theory of communication, it is necessary to define equilibrium for informatic systems that are not in thermal equilibrium and to calculate temperature, heat, and entropy in accordance with the Clausius inequality. It is shown that for a binary file the temperature is proportional to the bit energy and that information is thermodynamic entropy. Equilibrium exists in random files that cannot be compressed. Thermodynamic bounds on the computing power of a physical device and on the maximum information that an antenna can broadcast are calculated.; Comment: 16 pages and 5 figures. The paper was submitted to IEEE Transactions on Information Theory
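The asserted link between bit energy, temperature, and entropy is reminiscent of the standard Landauer-Clausius bookkeeping (a textbook relation, not the paper's own derivation):

\[
S_{\text{bit}} = k_B \ln 2, \qquad E_{\min} = k_B T \ln 2,
\]

i.e. one bit carries \(k_B \ln 2\) of thermodynamic entropy, and handling it at temperature \(T\) is associated with at least \(k_B T \ln 2\) of energy. Read in reverse, fixing the energy assigned to a bit fixes an effective temperature, matching the abstract's claim that temperature is proportional to bit energy.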

Cores of Cooperative Games in Information Theory

Madiman, Mokshay
Source: Cornell University. Publisher: Cornell University
Type: Scientific Journal Article
Published on 31/12/2008. Language: Portuguese
Search relevance: 66.646353%
Cores of cooperative games are ubiquitous in information theory, and arise most frequently in the characterization of fundamental limits in various scenarios involving multiple users. Examples include classical settings in network information theory such as Slepian-Wolf source coding and multiple access channels, classical settings in statistics such as robust hypothesis testing, and new settings at the intersection of networking and statistics such as distributed estimation problems for sensor networks. Cooperative game theory allows one to understand aspects of all of these problems from a fresh and unifying perspective that treats users as players in a game, sometimes leading to new insights. At the heart of these analyses are fundamental dualities that have been long studied in the context of cooperative games; for information theoretic purposes, these are dualities between information inequalities on the one hand and properties of rate, capacity or other resource allocation regions on the other.; Comment: 12 pages, published at http://www.hindawi.com/GetArticle.aspx?doi=10.1155/2008/318704 in EURASIP Journal on Wireless Communications and Networking, Special Issue on "Theory and Applications in Multiuser/Multiterminal Communications"...
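The canonical example of this duality is Slepian-Wolf source coding: the achievable rate region for losslessly encoding correlated sources \(X_1,\dots,X_n\) is

\[
\sum_{i \in S} R_i \;\ge\; H\!\left(X_S \mid X_{N \setminus S}\right) \quad \text{for every } S \subseteq N = \{1,\dots,n\},
\]

which has exactly the shape of the core of a cooperative game whose coalition values are conditional entropies: the rate allocations that no coalition \(S\) can improve upon are precisely the points of the region's dominant face. (This is the standard reading of the Slepian-Wolf region; the paper develops it, and further examples, in more generality.)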

Concentration of Measure Inequalities in Information Theory, Communications and Coding (Second Edition)

Raginsky, Maxim; Sason, Igal
Source: Cornell University. Publisher: Cornell University
Type: Scientific Journal Article
Language: Portuguese
Search relevance: 66.68345%
During the last two decades, concentration inequalities have been the subject of exciting developments in various areas, including convex geometry, functional analysis, statistical physics, high-dimensional statistics, pure and applied probability theory, information theory, theoretical computer science, and learning theory. This monograph focuses on some of the key modern mathematical tools that are used for the derivation of concentration inequalities, on their links to information theory, and on their various applications to communications and coding. In addition to being a survey, this monograph also includes various new recent results derived by the authors. The first part of the monograph introduces classical concentration inequalities for martingales, as well as some recent refinements and extensions. The power and versatility of the martingale approach is exemplified in the context of codes defined on graphs and iterative decoding algorithms, as well as codes for wireless communication. The second part of the monograph introduces the entropy method, an information-theoretic technique for deriving concentration inequalities. The basic ingredients of the entropy method are discussed first in the context of logarithmic Sobolev inequalities...
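A representative martingale-based result from the first part is McDiarmid's bounded-differences inequality: if \(f\) changes by at most \(c_k\) when only its \(k\)-th argument changes, then for independent \(X_1,\dots,X_n\)

\[
\Pr\bigl[\, \bigl| f(X_1,\dots,X_n) - \mathbb{E} f \bigr| \ge t \,\bigr]
\;\le\; 2\exp\!\left( -\frac{2t^2}{\sum_{k=1}^{n} c_k^2} \right),
\]

proved via the Azuma-Hoeffding inequality applied to the Doob martingale of \(f\), the same device the monograph exemplifies on codes defined on graphs and iterative decoding.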

Objective Information Theory: A Sextuple Model and 9 Kinds of Metrics

Jianfeng, Xu; Jun, Tang; Xuefeng, Ma; Bin, Xu; Yanli, Shen; Yongjie, Qiao
Source: Cornell University. Publisher: Cornell University
Type: Scientific Journal Article
Language: Portuguese
Search relevance: 66.657744%
In the contemporary era, the importance of information is undisputed, but there has never been a common understanding of information, nor a unanimous conclusion in research on information metrics. Building on previous studies, this paper analyzes the important achievements in research on the properties and metrics of information, as well as their main shortcomings, and explores the essence and connotation of information, its mathematical expressions, and other basic problems. On the basis of an understanding of the objectivity of information, it proposes definitions and a sextuple model of information, discusses the basic properties of information, and puts forward definitions and mathematical expressions for nine kinds of information metrics: extensity, detailedness, sustainability, containability, delay, richness, distribution, validity, and matchability. Through these, the paper establishes a basic theoretical framework of Objective Information Theory to support the systematic and comprehensive analysis of information and information systems.; Comment: 20 pages

Algorithmic information theory

Grunwald, Peter D.; Vitanyi, Paul M. B.
Source: Cornell University. Publisher: Cornell University
Type: Scientific Journal Article
Language: Portuguese
Search relevance: 66.661133%
We introduce algorithmic information theory, also known as the theory of Kolmogorov complexity. We explain the main concepts of this quantitative approach to defining `information'. We discuss the extent to which Kolmogorov's and Shannon's information theory have a common purpose, and where they are fundamentally different. We indicate how recent developments within the theory allow one to formally distinguish between `structural' (meaningful) and `random' information as measured by the Kolmogorov structure function, which leads to a mathematical formalization of Occam's razor in inductive inference. We end by discussing some of the philosophical implications of the theory.; Comment: 37 pages, 2 figures, pdf, in: Philosophy of Information, P. Adriaans and J. van Benthem, Eds., A volume in Handbook of the philosophy of science, D. Gabbay, P. Thagard, and J. Woods, Eds., Elsevier, 2008. In version 1 of September 16 the refs are missing. Corrected in version 2 of September 17
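The Kolmogorov structure function mentioned at the end has the standard definition

\[
h_x(\alpha) = \min_{S} \bigl\{\, \log_2 |S| \;:\; x \in S,\; K(S) \le \alpha \,\bigr\},
\]

which traces, for each model-complexity budget \(\alpha\), the log-size of the best finite set ("model") containing \(x\). The smallest \(\alpha\) at which \(h_x(\alpha) + \alpha\) reaches \(K(x)\) (up to logarithmic terms) separates the string's structural, meaningful information from its random remainder, which is the formalization of Occam's razor referred to above.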

Relating Granger causality to directed information theory for networks of stochastic processes

Amblard, Pierre-Olivier; Michel, Olivier J. J.
Source: Cornell University. Publisher: Cornell University
Type: Scientific Journal Article
Language: Portuguese
Search relevance: 66.680073%
This paper addresses the problem of inferring the circulation of information between multiple stochastic processes. We discuss two frameworks in which the problem can be studied: directed information theory and Granger causality. The main goal of the paper is to study the connection between these two frameworks. In the case of directed information theory, we stress the importance of Kramer's causal conditioning. This type of conditioning is necessary not only in the definition of directed information but also for handling causal side information. We also show how directed information decomposes into the sum of two measures: the first, related to Schreiber's transfer entropy, quantifies the dynamical aspect of causality, whereas the second, termed instantaneous information exchange, quantifies the instantaneous aspect of causality. After recalling the definition of Granger causality, we establish its connection with directed information theory. The connection is studied in particular in the Gaussian case, showing that Geweke's measures of Granger causality correspond to the transfer entropy and the instantaneous information exchange. This allows us to propose an information-theoretic formulation of Granger causality.; Comment: submitted...
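In the Gaussian case referred to at the end, Geweke's measure of Granger causality reduces to a log-ratio of residual variances from two linear regressions and equals twice the transfer entropy (in nats). A minimal sketch under those Gaussian assumptions; the lag-1 model order, coefficients, and function name are illustrative:

    import numpy as np

    def geweke_causality(x, y):
        """Geweke's measure F_{X->Y} = ln(var_restricted / var_full) for
        illustrative lag-1 linear models; equals 2 * transfer entropy for
        Gaussian data."""
        Y, Yp, Xp = y[1:], y[:-1], x[:-1]        # target, own past, other's past
        # restricted model: predict Y_t from Y_{t-1} alone
        r_res = Y - np.polyval(np.polyfit(Yp, Y, 1), Yp)
        # full model: predict Y_t from Y_{t-1} and X_{t-1}
        A = np.column_stack([Yp, Xp, np.ones_like(Yp)])
        r_full = Y - A @ np.linalg.lstsq(A, Y, rcond=None)[0]
        return float(np.log(r_res.var() / r_full.var()))

    rng = np.random.default_rng(0)
    x = rng.normal(size=10_000)
    y = np.zeros_like(x)
    for t in range(1, len(x)):                   # y is driven by the past of x
        y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + rng.normal()
    print(geweke_causality(x, y))                # clearly positive
    print(geweke_causality(y, x))                # near zero: no reverse causation

A positive value in one direction and a near-zero value in the other is the signature Granger causality looks for; the instantaneous information exchange discussed above would additionally require a same-time term, omitted in this sketch.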