We use an unusually rich data set that provides national coverage of housing prices and homicides and exploits within-municipality variations. We find that the impact of violence on housing prices is borne entirely by the poor sectors of the population. An increase in homicides equivalent to 1 standard deviation leads to a 3 percent decrease in the price of low-income housing.
This paper examines foreign direct investment (FDI) in the Hungarian economy in the period of post-Communist transition. Hungary took a quite aggressive approach in welcoming foreign investment during this period and, as a result, had the highest per capita FDI in the region. We discuss the impact of FDI in terms of strategic intent, i.e., resource seeking versus market serving FDI.
The effects of these two kinds of FDI are contrasted by examining the impact of resource seeking FDI in manufacturing sectors and of market serving FDI in service industries. In the case of transition economies, we argue that, owing to its strategic intent, resource seeking FDI implies a short-term impact on economic development, whereas market serving FDI strategically implies a long-term presence with increased benefits for the economic development of a transition economy. Our focus is market serving FDI in the Hungarian banking sector, which has brought improved service and products to multinational and Hungarian firms.
This has been accompanied by the introduction of innovative financial products to the Hungarian consumer, in particular consumer credit including mortgage financing. However, the latter remains an underserved segment with much growth potential. For public policy in Hungary and other transition economies, we conclude that policymakers should consider the strategic intent of FDI in order to maximize its benefits in their economies.

We propose a general framework for extracting rotation invariant features from images for the tasks of image analysis and classification.
Our framework is inspired by the form of the Zernike set of orthogonal functions. It provides a way to use a set of one-dimensional functions to form an orthogonal set over the unit disk by non-linearly scaling their domain and then associating with them an exponential term. When images are projected onto the subspace created with the proposed framework, rotations of the image affect only the exponential term, while the values of the orthogonal functions serve as rotation invariant features.
We exemplify our framework using the Haar wavelet functions to extract features from several thousand images of symbols. We then use the features in an OCR experiment to demonstrate the robustness of the method.

In this paper we explore the use of orthogonal functions as generators of representative, compact descriptors of image content. In Image Analysis and Pattern Recognition such descriptors are referred to as image features, and there are some useful properties they should possess, such as rotation invariance and the capacity to identify different instances of one class of images.
We exemplify our algorithmic methodology using the family of Daubechies wavelets, since they form an orthogonal function set. We benchmark the quality of the image features generated by doing a comparative OCR experiment with three different sets of image features.
Our algorithm can use a wide variety of orthogonal functions to generate rotation invariant features, thus providing the flexibility to identify sets of image features that are best suited for the recognition of different classes of images.

When analyzing catastrophic risk, traditional measures for evaluating risk, such as the probable maximum loss (PML), value at risk (VaR), tail-VaR, and others, can become practically impossible to obtain analytically in certain types of insurance, such as earthquake, and under certain types of reinsurance arrangements, especially non-proportional arrangements with reinstatements.
Given the available information, it can be very difficult for an insurer to measure its risk exposure.
This effect can be assessed mathematically. The PML is defined in terms of a very extreme quantile. The resulting reinsurance structures will then be very complicated to analyze, and evaluating their mitigation or transfer effects analytically may not be feasible, so it may be necessary to use alternative approaches, such as Monte Carlo simulation methods. This is what we do in this paper in order to measure the effect of a complex reinsurance treaty on the risk profile of an insurance company. We compute the pure risk premium and PML, as well as a host of other results: the impact on the insured portfolio, the risk transfer effect of reinsurance programs, the proportion of times reinsurance is exhausted, the percentage of years in which it was necessary to use the contractual reinstatements, etc.
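The Monte Carlo approach can be sketched as follows. This is a deliberately simplified illustration, not the treaty analyzed in the paper: losses follow a compound Poisson process with lognormal severities (all parameters invented), and the reinsurance is a per-risk excess-of-loss layer whose total annual capacity equals the limit plus one reinstatement.

```python
import numpy as np

rng = np.random.default_rng(0)

def apply_xl(losses, retention, limit, reinstatements=1):
    """Per-risk excess-of-loss: the reinsurer pays min(max(x - retention, 0), limit)
    per claim, up to (1 + reinstatements) * limit of annual capacity."""
    capacity = (1 + reinstatements) * limit
    recovered = 0.0
    for x in losses:
        layer = min(max(x - retention, 0.0), limit)
        layer = min(layer, capacity - recovered)   # stop once capacity is used up
        recovered += layer
    return losses.sum() - recovered, recovered >= capacity - 1e-9

years, gross, net, exhausted = 10_000, [], [], 0
for _ in range(years):
    # Compound Poisson frequency with lognormal severities (illustrative parameters).
    losses = rng.lognormal(mean=0.0, sigma=1.5, size=rng.poisson(5.0))
    g = losses.sum()
    n, ex = apply_xl(losses, retention=10.0, limit=30.0)
    gross.append(g)
    net.append(n)
    exhausted += ex

pml_gross = np.quantile(gross, 0.995)   # gross 1-in-200-year loss
pml_net = np.quantile(net, 0.995)       # the same quantile net of reinsurance
share_exhausted = exhausted / years     # proportion of years capacity was exhausted
```

Comparing `pml_gross` with `pml_net` quantifies the risk transfer effect of the layer, and `share_exhausted` estimates how often the contractual capacity, reinstatements included, is used up.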
Since the estimators of quantiles are known to be biased, we explore the alternative of using an Extreme Value approach to complement the analysis.

The need to estimate future claims has led to the development of many loss reserving techniques. There are two important problems that must be dealt with in the process of estimating reserves for outstanding claims: one is to determine an appropriate model for the claims process, and the other is to assess the degree of correlation among claim payments in different calendar and origin years.
We approach both problems here. On the one hand we use a gamma distribution to model the claims process and, in addition, we allow the claims to be correlated. We follow a Bayesian approach for making inference with vague prior distributions. The methodology is illustrated with a real data set and compared with other standard methods.

Consider a random sample X1, X2, ..., Xn. Only the sample size, mean and range are recorded, and it is necessary to estimate the unknown population mean and standard deviation.
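One simple simulation-based way to attack this summary-only problem can be sketched as follows. This is not the paper's algorithm: since the joint density of the sample mean and range has no convenient closed form, the sketch uses approximate Bayesian computation (ABC) by rejection, with entirely hypothetical observed summaries and wide uniform priors.

```python
import numpy as np

rng = np.random.default_rng(1)

# Observed summaries (hypothetical): sample size, sample mean, sample range.
n, xbar_obs, range_obs = 20, 10.0, 6.0

# ABC rejection: draw (mu, sigma) from wide priors, simulate a normal sample
# of size n, and keep draws whose simulated summaries are close to the
# observed ones; the kept draws approximate the joint posterior.
M = 200_000
mu = rng.uniform(0.0, 20.0, M)
sigma = rng.uniform(0.1, 10.0, M)
x = rng.normal(mu[:, None], sigma[:, None], size=(M, n))
keep = (np.abs(x.mean(axis=1) - xbar_obs) < 0.2) & \
       (np.abs(np.ptp(x, axis=1) - range_obs) < 0.4)

mu_post = mu[keep].mean()         # crude posterior mean of the population mean
sigma_post = sigma[keep].mean()   # crude posterior mean of the standard deviation
```

Tightening the acceptance tolerances sharpens the approximation at the cost of fewer accepted draws; an MCMC scheme, as in the paper, avoids that trade-off.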
In this paper the estimation of the mean and standard deviation is made from a Bayesian perspective by using a Markov chain Monte Carlo (MCMC) algorithm to simulate samples from the intractable joint posterior distribution of the mean and standard deviation. The proposed methodology is applied to simulated and real data.

This paper is concerned with the situation that occurs in claims reserving when there are negative values in the development triangle of incremental claim amounts. Typically these negative values will be the result of salvage recoveries, payments from third parties, total or partial cancellation of outstanding claims due to initial overestimation of the loss or to a possible favorable jury decision in favor of the insurer, rejection by the insurer, or just plain errors.
Some of the traditional methods of claims reserving, such as the chain-ladder technique, may produce estimates of the reserves even when there are negative values. Historically the chain-ladder method has been used as a gold-standard benchmark because of its widespread use and ease of application. This paper presents a Bayesian model to handle negative incremental values, based on a three-parameter log-normal distribution. The model presented here allows the actuary to provide point estimates and measures of dispersion, as well as the complete distribution for outstanding claims from which the reserves can be derived.
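For reference, the classical chain-ladder benchmark mentioned above can be sketched in a few lines; the cumulative development triangle below is hypothetical, with rows as origin years, columns as development periods, and `np.nan` marking future cells.

```python
import numpy as np

# Hypothetical cumulative claims triangle (origin years x development periods).
tri = np.array([
    [100., 160., 190., 200.],
    [110., 175., 210., np.nan],
    [120., 185., np.nan, np.nan],
    [130., np.nan, np.nan, np.nan],
])

def chain_ladder(tri):
    """Volume-weighted chain-ladder: development factors from column ratios,
    then projection of the missing cells to ultimate."""
    n = tri.shape[1]
    factors = []
    for j in range(n - 1):
        valid = ~np.isnan(tri[:, j + 1])
        factors.append(tri[valid, j + 1].sum() / tri[valid, j].sum())
    full = tri.copy()
    for j in range(n - 1):
        miss = np.isnan(full[:, j + 1])
        full[miss, j + 1] = full[miss, j] * factors[j]
    # Reserve per origin year = projected ultimate minus latest observed value.
    latest = np.array([row[~np.isnan(row)][-1] for row in tri])
    return full[:, -1] - latest, factors

reserves, factors = chain_ladder(tri)
```

Unlike the Bayesian model of the paper, this produces only point estimates, with no distribution for the outstanding claims; it also breaks down less gracefully when incremental values are negative.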
It is concluded that the method has a clear advantage over other existing methods.

The BMOM (Bayesian method of moments) is particularly useful for obtaining post-data moments and densities for parameters and future observations when the form of the likelihood function is unknown and thus a traditional Bayesian approach cannot be used. Also, even when the form of the likelihood is assumed known, in time series problems it is sometimes difficult to formulate an appropriate prior density. Here, we show how the BMOM approach can be used in two nontraditional problems.
The first one is conditional forecasting in regression and time series autoregressive models. Specifically, it is shown that when forecasting disaggregated data (say, quarterly data) given aggregate constraints (say, in terms of annual data), it is possible to apply a Bayesian approach to derive conditional forecasts in the multiple regression model.
The types of constraints (conditioning) usually considered are that the sum, or the average, of the forecasts equals a given value. This kind of condition can be applied to forecasting quarterly values whose sum must equal a given annual value. Analogous results are obtained for AR(p) models.
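The sum constraint can be illustrated with standard multivariate-normal conditioning. This is a sketch of the generic mechanism with hypothetical quarterly forecasts, not the BMOM machinery itself: given a Gaussian forecast vector, conditioning on a linear combination (here, the sum) shifts the mean and reduces the covariance in closed form.

```python
import numpy as np

def constrain_sum(mu, Sigma, total):
    """Condition a Gaussian forecast vector N(mu, Sigma) on its sum
    equalling `total`, via multivariate-normal conditioning."""
    one = np.ones(len(mu))
    s = Sigma @ one                 # covariance between each forecast and the sum
    denom = one @ s                 # variance of the sum
    mu_c = mu + s * (total - one @ mu) / denom
    Sigma_c = Sigma - np.outer(s, s) / denom
    return mu_c, Sigma_c

# Hypothetical quarterly forecasts whose sum must equal a known annual value.
mu = np.array([25.0, 26.0, 24.0, 27.0])
Sigma = np.eye(4)
mu_c, Sigma_c = constrain_sum(mu, Sigma, total=100.0)
```

With an identity covariance the annual shortfall is spread evenly across the quarters, and the conditional covariance gives the sum zero variance, as the constraint requires.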
The second problem we analyse is the issue of aggregation and disaggregation of data in relation to predictive precision and modelling. Predictive densities are derived for future aggregate values by means of the BMOM, based on a model for disaggregated data. They are then compared with those derived from aggregated data.
The problem of estimating the accumulated value of a positive and continuous variable, for which some partially accumulated data have been observed, usually with only a small number of observations (two years), can be approached by taking advantage of the existence of stable seasonality from one period to another.
For example, the quantity to be predicted may be the total for a period (a year), and the prediction needs to be made as soon as partial information becomes available for given subperiods (months). These conditions appear in a natural way in the prediction of seasonal sales of style goods, such as toys; in the behavior of inventories of goods whose demand varies seasonally, such as fuels; or in banking deposits, among many other examples. In this paper, the problem is addressed within a cluster sampling framework. A ratio estimator is proposed for the total value to be forecasted under the assumption of stable seasonality.
Estimators are obtained for both the point forecast and its variance. The procedure works well when standard methods cannot be applied due to the reduced number of observations. Some real examples are included, as well as applications to previously published data. Comparisons are made with other procedures.

We present a Bayesian solution to forecasting a time series when few observations are available. The quantity to predict is the accumulated value of a positive, continuous variable when partially accumulated data are observed. These conditions appear naturally in predicting sales of style goods and coupon redemption.
A simple model describes the relation between partial and total values, assuming stable seasonality. Exact analytic results are obtained for point forecasts and the posterior predictive distribution. Noninformative priors allow automatic implementation. Examples are provided.
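The ratio-estimator idea under stable seasonality, described above, can be sketched as follows. All monthly figures are hypothetical: two complete past years establish the historical share of the total accumulated by a given month, and the current partial total is scaled up by that share.

```python
import numpy as np

# Monthly totals for two complete past years (hypothetical), exhibiting
# stable seasonality, plus the first four months of the current year.
past = np.array([
    [10., 12., 15., 20., 25., 30., 32., 28., 22., 18., 14., 11.],
    [11., 13., 16., 22., 27., 31., 33., 29., 23., 19., 15., 12.],
])
current_partial = np.array([12., 14., 17., 23.])   # months 1-4 of the current year

def ratio_forecast(past, partial):
    """Ratio estimator: annual total ~= observed partial total divided by the
    historical share of the year accumulated in those same months."""
    m = len(partial)
    share = past[:, :m].sum() / past.sum()
    return partial.sum() / share

total_hat = ratio_forecast(past, current_partial)
```

As more months of the current year accumulate, the historical share approaches one and the forecast converges to the observed total, which is why the method remains usable with as few as two years of history.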
We give a brief description of the Project and of the characteristics of the target population.
We then describe and use the FGT (Foster-Greer-Thorbecke) Index to determine whether the communities included in the Project were correctly chosen. We describe the method of cost-effectiveness analysis used in this article. The procedure for specifying cost-effectiveness ratios is presented next, and their application to measure the impact of PNAS on food expenditures is carried out.
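The FGT index family has a compact form: P_alpha is the mean over the population of ((z - y_i)/z)^alpha, taken only over households below the poverty line z, where alpha = 0 gives the headcount ratio and alpha = 1 the poverty gap. A minimal sketch with hypothetical incomes:

```python
import numpy as np

def fgt(incomes, z, alpha):
    """Foster-Greer-Thorbecke index: mean of ((z - y)/z)**alpha over the poor
    (households with income below the poverty line z), zero for the non-poor."""
    incomes = np.asarray(incomes, dtype=float)
    gap = np.clip((z - incomes) / z, 0.0, None)   # normalized poverty gap
    return np.mean(np.where(incomes < z, gap ** alpha, 0.0))

incomes = [50.0, 80.0, 120.0, 200.0, 300.0]   # hypothetical household incomes
z = 100.0                                      # hypothetical poverty line
headcount = fgt(incomes, z, alpha=0)           # share of households below z
poverty_gap = fgt(incomes, z, alpha=1)         # average normalized shortfall
```

Raising alpha weights the poorest households more heavily, which is what makes the index useful for checking whether the communities selected into a program are in fact the neediest.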
Finally, we present empirical results which show, among other things, that PNAS increased the food expenditures of the participating households by 7.

International financial institutions and local governments are frequently involved in the implementation of development programs.