Lecture Series 2013 – 2nd Semester

Talks of the Department of Statistical Methods - Institute of Mathematics - UFRJ

2nd semester of 2013
The talks took place in the auditorium of the Laboratório de Sistemas Estocásticos (LSE), room I-044b, at 3:30 pm, apart from a few exceptions duly indicated.

18/12 - (exceptionally at 1:30 pm)

Prior sensitivity analysis plays an important role in Bayesian statistics. This is especially true for Bayesian hierarchical models, where interpretability of the parameters within deeper layers of the hierarchy becomes challenging. In addition, lack of information together with identifiability issues may imply that the prior distributions for such models have an undesired influence on the posterior inference. Despite its relevance, prior sensitivity analysis is currently carried out by informal approaches, which require repetitive re-runs of the model with ad-hoc modifications of the base prior parameter values. Other, formal approaches to prior sensitivity analysis suffer from a lack of popularity in practice, mainly due to their high computational cost and the absence of software implementations. We propose a novel formal approach to prior sensitivity analysis which is fast and accurate. It quantifies sensitivity without the need for a model re-run. We develop a ready-to-use R package, priorSens, for routine prior sensitivity investigation with R-INLA. Through a series of examples we show how our approach can be used to detect high prior sensitivity of some parameters as well as identifiability issues in possibly over-parametrized Bayesian hierarchical models.
Joint work with Malgorzata Roos, Leonhard Held & Havard Rue.
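
The idea of avoiding model re-runs can be sketched with importance reweighting: posterior draws obtained under the base prior are reweighted by the ratio of a perturbed prior to the base prior, and a Hellinger distance between the base and reweighted posteriors then measures sensitivity. The Python sketch below illustrates only this principle; it is not the priorSens implementation, and all priors and parameter values are invented.

import numpy as np

def perturbed_weights(theta, log_prior_base, log_prior_new):
    # Importance weights turning draws from the posterior under the base
    # prior into (approximate) draws from the posterior under a new prior.
    log_w = log_prior_new(theta) - log_prior_base(theta)
    w = np.exp(log_w - log_w.max())
    return w / w.sum()

def hellinger_sensitivity(theta, log_prior_base, log_prior_new, bins=60):
    # Hellinger distance between the base posterior and its reweighted
    # version; larger values indicate higher sensitivity to the prior.
    w = perturbed_weights(theta, log_prior_base, log_prior_new)
    edges = np.histogram_bin_edges(theta, bins=bins)
    p, _ = np.histogram(theta, bins=edges, density=True)
    q, _ = np.histogram(theta, bins=edges, weights=w, density=True)
    dx = np.diff(edges)
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2 * dx))

# Toy check: stand-in posterior draws and a mild perturbation of a
# Gamma prior on a precision parameter (all values invented).
rng = np.random.default_rng(0)
draws = rng.gamma(shape=2.0, scale=1.0, size=20_000)
base = lambda t: 0.0 * np.log(t) - 5e-5 * t   # Gamma(1, 5e-5), unnormalized
new = lambda t: 0.1 * np.log(t) - 6e-5 * t    # Gamma(1.1, 6e-5)
print(hellinger_sensitivity(draws, base, new))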

In this talk we study a class of insurance products where the policyholder has the option to insure k of its annual Operational Risk losses over a horizon of T years. This involves a choice of k out of T years in which to apply the insurance policy coverage by making claims against losses in the given year. The insurance product structure presented can accommodate any kind of annual mitigation, but we present three basic generic insurance policy structures that can be combined to create more complex types of coverage. Following the Loss Distributional Approach (LDA) with Poisson distributed annual loss frequencies and Inverse-Gaussian loss severities, we are able to derive closed-form analytical expressions for the multiple optimal decision strategy that minimizes the expected Operational Risk loss over the next T years.
Joint work with Gareth Peters, Georgy Sofronov & Pavel Shevchenko.
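
Although the paper obtains the optimal strategy in closed form, the structure of the decision problem can be sketched numerically: with i.i.d. compound Poisson-Inverse-Gaussian annual losses and full coverage of any insured year, backward induction yields a threshold rule. The Python sketch below is a Monte Carlo illustration under these simplifying assumptions, with invented parameters; it is not the authors' analytical solution.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def annual_losses(n, lam=5.0):
    # Compound Poisson annual losses with Inverse-Gaussian severities
    # (the LDA set-up of the talk; all parameter values are invented).
    counts = rng.poisson(lam, size=n)
    sev = stats.invgauss.rvs(0.5, scale=2.0, size=counts.sum(), random_state=rng)
    idx = np.insert(np.cumsum(counts), 0, 0)
    csum = np.concatenate([[0.0], np.cumsum(sev)])
    return csum[idx[1:]] - csum[idx[:-1]]

def optimal_policy(T=10, k=3, n_mc=50_000):
    # Backward induction: V[t, j] is the expected uninsured loss from year t
    # on with j insurance exercises remaining; insure year t exactly when
    # the year's loss exceeds the threshold b[t, j].
    L = annual_losses(n_mc)
    V = np.zeros((T + 1, k + 1))
    b = np.zeros((T, k + 1))
    for t in range(T - 1, -1, -1):
        V[t, 0] = L.mean() + V[t + 1, 0]          # no exercises left
        for j in range(1, k + 1):
            b[t, j] = V[t + 1, j - 1] - V[t + 1, j]
            V[t, j] = np.where(L < b[t, j], L + V[t + 1, j],
                               V[t + 1, j - 1]).mean()
    return V, b

V, b = optimal_policy()
print(V[0, 3], b[0])   # expected cost with 3 exercises; year-0 thresholds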

18/12

During the worldwide financial crisis of 2008 some insurance companies experienced solvency problems. A famous case was that of the American International Group (AIG), the world's biggest insurance company for many decades. The United States government decided to spend 85 billion dollars to bail out AIG.
In 2009 the European Union approved the Solvency II Directive (S2D), a regulation scheme aiming to reduce companies' risk of insolvency and to promote consumer protection. The regulation, to be implemented in 2016, concerns the amount of capital insurance companies must hold in order to be solvent. Since the approval of S2D, other non-European countries have followed the initiative to renew their own regulation laws in the same spirit.
S2D is based on three main pillars: quantitative requirements (e.g. the amount of capital an insurer should hold), qualitative requirements concerning the risk management of insurers as well as their supervision (e.g. an internal risk board and its evaluation), and disclosure and transparency requirements.
The objective of this talk is to address the quantitative requirements of the Mexican regulation (to be applied in 2014) through the study of a particular type of insurance: short-period life insurance. This new methodology draws on topics from actuarial science, statistics and stochastic processes. If time allows, we will give a glimpse of more complex insurance examples.
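
The Mexican rules are not reproduced here, but the flavor of a Solvency-II-style quantitative requirement can be sketched: simulate the one-year loss of a short-period life portfolio and hold capital equal to a high quantile of that loss in excess of its mean (S2D calibrates at the 99.5% level). The Python sketch below uses an invented portfolio.

import numpy as np

rng = np.random.default_rng(2)

# Hypothetical portfolio of one-year term life policies: each insured dies
# within the year with probability q, triggering payment of the sum insured s.
n_policies, q, s = 10_000, 0.002, 100_000.0

losses = rng.binomial(n_policies, q, size=100_000) * s  # simulated one-year losses
best_estimate = losses.mean()
# Solvency-II-style capital: the 99.5% one-year value-at-risk in excess of
# the best estimate (the S2D calibration; the Mexican rule may differ).
capital = np.quantile(losses, 0.995) - best_estimate
print(f"best estimate: {best_estimate:,.0f}, capital: {capital:,.0f}")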

04/12

Periodontal disease progression is often quantified by clinical attachment level (CAL), defined as the distance down a tooth's root that is detached from the surrounding bone. Measured at 6 locations per tooth throughout the mouth (excluding the molars), it gives rise to a dependent data set-up. These data are often reduced to a one-number summary, such as the whole-mouth average or the number of observations greater than a threshold, to be used as the response in a regression to identify important covariates related to the current state of a subject's periodontal health. Rather than using a simple one-number summary, we set out to analyze all available CAL data for each subject, exploiting the presence of spatial dependence, non-stationarity, and non-normality. Also, many subjects have a considerable proportion of missing teeth, which cannot be considered missing at random because periodontal disease is the leading cause of adult tooth loss. Under a Bayesian paradigm, we propose a nonparametric flexible spatial (joint) model of observed CAL and the locations of missing teeth via kernel convolution methods, incorporating the aforementioned features of CAL data under a unified framework. Application of this methodology to a data set recording the periodontal health of an African-American population, as well as simulation studies, reveals the gain in model fit and inference, and provides a new perspective into unraveling covariate-response relationships in the presence of the complexities posed by these data.
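
The joint model itself is too rich to reproduce here, but its kernel-convolution building block is simple to sketch: a latent spatial field is induced by smoothing independent knot-level effects with a kernel, and letting the kernel change over space produces non-stationarity. A minimal Python illustration with invented locations:

import numpy as np

rng = np.random.default_rng(3)

def kernel_convolution_field(sites, knots, w, bandwidth=1.0):
    # Discrete kernel convolution: Z(s) = sum_j K(s - u_j) w_j. Making the
    # bandwidth (or the kernel itself) vary with s yields non-stationarity.
    d2 = ((sites[:, None, :] - knots[None, :, :]) ** 2).sum(-1)
    K = np.exp(-0.5 * d2 / bandwidth ** 2)   # Gaussian kernel
    return K @ w

# Invented example: 20 measurement sites along a line, smoothing latent
# effects attached to 5 knots.
sites = np.linspace(0, 10, 20)[:, None]
knots = np.linspace(0, 10, 5)[:, None]
w = rng.normal(size=5)                       # latent knot-level effects
print(kernel_convolution_field(sites, knots, w))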

13/11

Audio recording and playback systems can exhibit imperfections that affect the listener's quality of experience. Restoration algorithms act on the degraded signal so as to make it as close as possible to the originally recorded signal. By using statistical signal processing techniques, which allow incorporating knowledge about the audio signal and about the defects introduced by the system, such algorithms can substantially improve signal quality, in some cases rendering the defects imperceptible. This talk will review existing techniques for some of the most common defects in practical applications, highlighting contributions from recent work I have been carrying out with collaborators.
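
As one concrete instance of this statistical approach, click-type defects are commonly located by fitting a linear predictive (autoregressive) model and flagging samples whose prediction residuals are abnormally large. The Python sketch below shows only this schematic detection step, not any specific published restoration algorithm; the signal and thresholds are invented.

import numpy as np

rng = np.random.default_rng(4)

def detect_clicks(x, order=10, k=4.0):
    # Fit an autoregressive model by least squares, then flag samples whose
    # prediction residual exceeds a robust threshold (a schematic version
    # of a standard first step in statistical click removal).
    X = np.column_stack([x[order - i - 1:-i - 1] for i in range(order)])
    y = x[order:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ a
    sigma = np.median(np.abs(resid)) / 0.6745   # robust scale estimate
    mask = np.zeros(len(x), dtype=bool)
    mask[order:] = np.abs(resid) > k * sigma
    return mask

# Invented test signal: a noisy sinusoid with two artificial clicks.
t = np.arange(2000)
x = np.sin(0.05 * t) + 0.01 * rng.normal(size=2000)
x[500] += 2.0
x[1500] -= 2.0
print(np.flatnonzero(detect_clicks(x)))   # indices flagged near the clicks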

We approach cellular metabolism as a system of interconnected chemical reactions whose fluxes derive from the solution of an optimization problem, in which conditions of steady growth translate into mass-continuity constraints. We describe the mathematical approach that yields these fluxes and, using the reaction map of the bacterium E. coli, we show that the finite cell volume acts as an important constraint, restricting the production of the enzymes that catalyze the reactions at high cell growth rates and causing global reorganizations of the fluxes. Finally, we compare our predictions with experiments and show the success of this approach.
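
The optimization described has the flavor of flux balance analysis: maximize a growth flux subject to the steady-state mass balance S v = 0 and capacity bounds on the fluxes. The toy Python example below uses an invented three-reaction network; the talk itself works with the full E. coli reaction map.

import numpy as np
from scipy.optimize import linprog

# Toy flux-balance problem: maximize the growth flux v3 subject to
# steady-state mass balance S v = 0 and capacity bounds on each flux
# (invented stoichiometry purely for illustration).
S = np.array([
    [1, -1, -1],   # metabolite A: produced by uptake v1, consumed by v2 and v3
    [0,  1, -1],   # metabolite B: produced by v2, consumed by v3
])
c = np.array([0.0, 0.0, -1.0])          # linprog minimizes, so negate growth
bounds = [(0, 10), (0, 10), (0, 10)]    # finite capacities (e.g. volume limits)
res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print(res.x)                            # optimal fluxes (v1, v2, v3)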

We will address the important role of sleep and dreams in the consolidation and restructuring of memories, which are crucial for learning and for the creation of new ideas. Freudian notions will be put into perspective, among them the one postulating the similarity between dreams and psychopathological delirium. How did the human mind evolve? Answers will be sought from an evolutionary perspective, starting from the sleep of our most remote ancestors and reaching the phenomenology of contemporary dreams, using data from genetics, systems neurophysiology and psychology.

23/10

Since the famous Recurrence Theorem of Poincaré, the probabilistic analysis of occurrence times has been a subject of intense study, both in the areas of stochastic processes and dynamical systems. For instance, the no less famous Kac's Lemma was the first quantitative result. In this talk we will review some recent results (from the last decade) giving precise quantitative information on the statistical laws for many different kinds of occurrence times. Motivation comes from physical phenomena, DNA analysis, information theory, compression algorithms, game theory, among others.
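
For reference, Kac's Lemma states that for an ergodic measure-preserving system (X, μ, T) and a measurable set A with μ(A) > 0, the return time τ_A(x) = inf{n ≥ 1 : T^n x ∈ A} satisfies

\[
  \int_A \tau_A \, d\mu = 1,
  \qquad\text{equivalently}\qquad
  \mathbb{E}\left[\tau_A \mid x \in A\right] = \frac{1}{\mu(A)},
\]

so small sets are revisited, on average, after times inversely proportional to their measure.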

02/10 - Inter-institutional colloquium "Stochastic Models and Applications" (exceptionally at 2:00 pm at IME-UFF)

After a few remarks about what we mean by quantization, I will explain the powerful role that operator-valued measures can play in quantizing any set equipped with a measure, for instance a group equipped with its (left) Haar measure. Integral quantizations based on the Weyl-Heisenberg group and on the affine group are compared. I will insist on the probabilistic aspects of such a procedure. An interesting application in quantum cosmology will be presented.
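
In sketch form (the notation here is generic and not necessarily the speaker's), integral quantization starts from operators M(x) on a Hilbert space that resolve the identity with respect to the measure ν on X, and maps classical observables linearly to operators:

\[
  \int_X M(x)\, d\nu(x) = I,
  \qquad
  f \longmapsto A_f = \int_X f(x)\, M(x)\, d\nu(x).
\]

When the M(x) are positive with unit trace, x ↦ tr(ρ M(x)) is a probability density with respect to ν for any state ρ, which is one source of the probabilistic aspects mentioned.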

I plan to discuss examples of stochastic phenomena within the framework of Quantum Mechanics. These will include the uncertainty principle, teleportation and, time permitting, localization. The talk will not assume preliminary knowledge of Quantum Mechanics or Probability Theory.
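
As a small sample of the first topic, the uncertainty principle in its standard (Robertson) form reads

\[
  \sigma_A \,\sigma_B \;\ge\; \tfrac{1}{2}\,\bigl|\langle [A,B] \rangle\bigr|,
  \qquad\text{in particular}\qquad
  \sigma_x \,\sigma_p \;\ge\; \frac{\hbar}{2}.
\]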

18/09

Time series models with time-varying parameters were categorized by Cox (1981) into two classes: observation-driven models and parameter-driven models. In observation-driven models, the temporal variation of the parameters is a function of past observations and exogenous variables. Although the parameters are stochastic, they are perfectly predictable conditional on past information. This simplifies the computation of the likelihood and explains why these models have become popular in the statistical literature. In this work we present some observation-driven models built on the exponential family of distributions, such as the GARMA, GLARMA and GAS models. Some examples and simulation studies are carried out for count data following the Poisson distribution.
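
As a hedged illustration of the observation-driven mechanism (a generic sketch, not code from the talk), the Python snippet below simulates a Poisson model of GAS type with log link, in which the time-varying parameter is updated by the scaled score of the most recent observation:

import numpy as np

rng = np.random.default_rng(6)

def simulate_poisson_gas(T=500, omega=0.05, alpha=0.3, beta=0.9):
    # Poisson GAS with log link: lambda_t = exp(f_t) and
    # f_{t+1} = omega + beta * f_t + alpha * s_t, where s_t is the score of
    # the Poisson log-likelihood scaled by the inverse information.
    # (Illustrative parameter values; GARMA/GLARMA use related recursions.)
    f = np.zeros(T)
    y = np.zeros(T, dtype=int)
    for t in range(T - 1):
        lam = np.exp(f[t])
        y[t] = rng.poisson(lam)
        s = (y[t] - lam) / lam            # scaled score for the log link
        f[t + 1] = omega + beta * f[t] + alpha * s
    y[-1] = rng.poisson(np.exp(f[-1]))
    return y, f

y, f = simulate_poisson_gas()
print(y[:20])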

11/09

First hitting times arise naturally in survival analysis, where the underlying stochastic counting process represents the strength of the health of an individual. The patient experiences a clinical endpoint when this process reaches a critical point for the first time. We propose a very flexible and unified first hitting time density function in a stochastic carcinogenesis counting process, and its mathematical properties are investigated. The Poisson and negative binomial first hitting time models are addressed and two examples with real data are presented.
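
In the simplest illustrative case (not the talk's unified density, which is more general): if the underlying process is a homogeneous Poisson process N(t) with rate λ and the endpoint occurs when N first reaches a critical level k, then the hitting time is Erlang distributed:

\[
  \tau_k = \inf\{t : N(t) = k\},
  \qquad
  f_{\tau_k}(t) = \frac{\lambda^k t^{k-1} e^{-\lambda t}}{(k-1)!}, \quad t > 0.
\]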

In most materials, lowering the temperature implies a monotonic increase in density. This is not the case for water, which exhibits a density maximum at 4°C. In addition, an unexpected increase in compressibility is observed between 0.1 MPa and 190 MPa and, at atmospheric pressure, an increase in the specific heat at constant pressure. Not only the thermodynamics but also the dynamics of water are unusual. The diffusion coefficient of water has a maximum at 4°C and 1.5 atm, while for ordinary liquids diffusion increases as pressure decreases. In this seminar we propose that the anomalies of water can be explained by a two-length-scale potential. We show that this hypothesis can be verified in effective potentials that reproduce the anomalies of water. The link between multiple scales and the presence of anomalies is shown exactly in a one-dimensional model and observed in three-dimensional simulations.

Several classes of highly deterministic dynamical systems can be analyzed by means of a renormalization operator, which typically has chaotic properties that in turn allow a stochastic modeling. We will illustrate this philosophy with a discussion of how the solution of certain questions about rational billiards involves the study of certain fine properties of random products of matrices.

16/08 - (exceptionally on a Friday at 1:30 pm)

In medical studies the diagnosis of a patient is very often based on some characteristic of interest, which may lead to classification errors. These classification errors are calibrated on the basis of two indicators: sensitivity (the probability of diagnosing an ill person as ill) and specificity (the probability of diagnosing a healthy person as healthy).
When the diagnostic variable is continuous, the classification will necessarily be based on a cut-off value: if the variable exceeds the cut-off then the patient is classified as ill, otherwise the patient is classified as healthy. In this situation, of special interest is the geometrical locus obtained by plotting the sensitivity against the complement of the specificity as the cut-off varies. This geometrical locus is called the receiver operating characteristic curve (ROC curve), and it is extensively used to analyse the discriminative power of the diagnostic variable. Some summary indicators, such as the area under the curve or Youden's index, are used to describe the main features of the ROC curve. The first part of the talk will be devoted to giving a general introduction to ROC curves and presenting some nonparametric estimators.
In many studies, a covariate is available along with the diagnostic variable. The information contained in the covariate may modify the discriminatory capability of the diagnostic variable, and therefore it is interesting to study the impact of the covariate on the conditional ROC curve. The second part of the talk will be devoted to the study of a nonparametric estimation procedure for the conditional ROC curve and its associated summary indices (conditional AUC and conditional Youden index). A data set concerning the diagnosis of diabetes will be used as an illustration of the proposed methodology.
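
To fix ideas for the first part, the empirical ROC curve and its two summary indices can be computed in a few lines. The Python sketch below uses invented data and the Mann-Whitney form of the AUC; it is not the estimators studied in the talk.

import numpy as np

def empirical_roc(healthy, ill):
    # For each cut-off c, an "ill" call is made when the marker exceeds c;
    # the ROC curve plots sensitivity against 1 - specificity.
    cuts = np.sort(np.concatenate([healthy, ill]))
    fpr = np.array([(healthy > c).mean() for c in cuts])  # 1 - specificity
    tpr = np.array([(ill > c).mean() for c in cuts])      # sensitivity
    return fpr, tpr

# Invented marker values: higher on average in the ill group.
rng = np.random.default_rng(5)
healthy = rng.normal(0.0, 1.0, 300)
ill = rng.normal(1.2, 1.0, 200)

fpr, tpr = empirical_roc(healthy, ill)
auc = (ill[:, None] > healthy[None, :]).mean()  # Mann-Whitney form of the AUC
youden = (tpr - fpr).max()                      # J = max_c (sens + spec - 1)
print(auc, youden)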

24/07 - (exceptionally in room C116)

We consider recent work of Foss and Baccelli on Poisson rain queuing models: on the lattice, jobs arrive according to Poisson processes. A job requires a random amount of service time and a random collection of servers working simultaneously, and servers must perform jobs in a first-come, first-served manner. For this reason a job can only be started once all the servers concerned are free from prior "commitments", which may leave a server idle even though work is available. We consider when the system may be stable (under sufficiently reduced arrival rates).