Statistical Computing
Raheleh Zamini
Abstract
In various statistical models, such as density estimation and the estimation of regression curves or hazard rates, monotonicity constraints can arise naturally. A frequently encountered problem in nonparametric statistics is to estimate a monotone density function f on a compact interval. A well-known estimator of a density f under the restriction that f is decreasing is the Grenander estimator, which is the left derivative of the least concave majorant of the empirical distribution function of the data. Many authors have worked on this estimator and obtained very useful properties from it. The Grenander estimator is a step function and is consequently not smooth. In this paper, we discuss the estimation of a decreasing density function by the kernel smoothing method. Much work has been done on Berry-Esseen bounds for density estimators because of their importance and applicability. In this paper, we study a Berry-Esseen type bound for a smoothed version of the Grenander estimator.
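As a concrete illustration of the two estimators this abstract contrasts, the sketch below computes the Grenander estimator as the slopes of the least concave majorant of the empirical distribution function and then kernel-smooths the resulting step function. It is an illustrative reconstruction under assumed conventions (support starting at zero, a Gaussian kernel, an arbitrary bandwidth), not the authors' code.

```python
import numpy as np

def grenander_slopes(x):
    """Knots of the least concave majorant (LCM) of the ECDF and its segment
    slopes; the slopes are the values of the decreasing density estimate."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    px = np.concatenate(([0.0], x))   # assumes the support starts at 0
    py = np.arange(n + 1) / n         # ECDF heights 0, 1/n, ..., 1
    hull = [0]
    for i in range(1, n + 1):
        # Pop the last hull point while it lies on or below the chord to
        # point i; this keeps the piecewise-linear majorant concave.
        while len(hull) >= 2:
            a, b = hull[-2], hull[-1]
            if (px[b] - px[a]) * (py[i] - py[a]) >= (py[b] - py[a]) * (px[i] - px[a]):
                hull.pop()
            else:
                break
        hull.append(i)
    kx = px[hull]
    slopes = np.diff(py[hull]) / np.diff(kx)  # decreasing step-function values
    return kx, slopes

def smoothed_grenander(t, kx, slopes, h):
    """Gaussian-kernel smoothing of the Grenander step function at points t."""
    grid = np.linspace(kx[0], kx[-1], 2000)
    seg = np.clip(np.searchsorted(kx, grid, side="right") - 1, 0, slopes.size - 1)
    f_hat = slopes[seg]
    w = np.exp(-0.5 * ((t[:, None] - grid[None, :]) / h) ** 2) / (h * np.sqrt(2 * np.pi))
    return (w * f_hat).sum(axis=1) * (grid[1] - grid[0])

rng = np.random.default_rng(0)
kx, sl = grenander_slopes(rng.exponential(size=300))  # true density is decreasing
f_smooth = smoothed_grenander(np.linspace(0.01, 3.0, 50), kx, sl, h=0.3)
```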
Rahim Mahmoudvand; Paulo Canas Rodrigues
Abstract
In a referendum conducted in the United Kingdom (UK) on June 23, 2016, $51.6\%$ of the participants voted to leave the European Union (EU). The outcome of this referendum had a major policy and financial impact on both the UK and the EU, and was seen as a surprise because the predictions had consistently indicated that ``Remain'' would get a majority. In this paper, we investigate whether the outcome of the Brexit referendum could have been predicted from polling data. The data consist of 233 polls conducted between January 2014 and June 2016 by YouGov, Populus, ComRes, Opinium, and others, with sample sizes ranging from 500 to 20,058. We used Singular Spectrum Analysis (SSA), an increasingly popular and widely adopted filtering technique for both short and long time series. We found that the real outcome of the referendum is very close to our point estimate and falls within our prediction interval, which reinforces the usefulness of SSA for predicting polling data.
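For readers unfamiliar with SSA, a minimal sketch of the basic decomposition-reconstruction step is given below; the window length, component count, and synthetic series are assumptions for illustration, not the authors' settings.

```python
# Basic SSA: embed the series into a trajectory (Hankel) matrix, take its SVD,
# and reconstruct a smoothed signal from the leading singular triples by
# averaging over anti-diagonals (Hankelization).
import numpy as np

def ssa_reconstruct(y, window, n_components):
    n = len(y)
    k = n - window + 1
    X = np.column_stack([y[i:i + window] for i in range(k)])  # trajectory matrix
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    X_r = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components]
    rec = np.zeros(n)
    counts = np.zeros(n)
    for j in range(k):                 # average anti-diagonals back to a series
        rec[j:j + window] += X_r[:, j]
        counts[j:j + window] += 1
    return rec / counts

# Hypothetical usage with a noisy trend standing in for poll shares:
t = np.linspace(0, 1, 233)
polls = 45 + 5 * t + np.random.default_rng(1).normal(0, 2, t.size)
trend = ssa_reconstruct(polls, window=60, n_components=2)
```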
Mina Norouzirad; Mohammad Arashi; Mahdi Roozbeh
Abstract
The partial linear model is very flexible, since the relation between the covariates and the response can be either parametric or nonparametric. However, estimation of the regression coefficients is challenging, since one must also estimate the nonparametric component simultaneously. As a remedy, the differencing approach can be used to eliminate the nonparametric component and estimate the regression coefficients. Here, the regression vector-parameter is supposed to lie in a subspace, as specified by a hypothesis. In situations where the use of the difference-based least absolute shrinkage and selection operator (D-LASSO) is desired, we propose a restricted D-LASSO estimator. To improve its performance, LASSO-type shrinkage estimators are also developed. The relative dominance picture of the suggested estimators is investigated. In particular, the suitability of estimating the nonparametric component based on the Speckman approach is explored. A real data example is given to compare the proposed estimators. The numerical analysis shows that the partial difference-based shrinkage estimators perform better than the difference-based regression model in the sense of average prediction error.
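A minimal sketch of the differencing idea, assuming the usual partial linear form $y_i = x_i'\beta + g(t_i) + \varepsilon_i$: after sorting by the nonparametric covariate, first-order differencing largely cancels the smooth component, and a LASSO fit on the differenced data estimates the coefficients. This illustrates plain D-LASSO only; the restricted and shrinkage variants of the paper are not reproduced.

```python
import numpy as np
from sklearn.linear_model import Lasso

def d_lasso(y, X, t, alpha=0.1):
    order = np.argsort(t)                      # sort so g(t) varies slowly
    y, X = y[order], X[order]
    dy, dX = np.diff(y), np.diff(X, axis=0)    # g(t_{i+1}) - g(t_i) ~ 0
    return Lasso(alpha=alpha, fit_intercept=False).fit(dX, dy).coef_

# Hypothetical example with a sparse beta and a smooth nuisance component:
rng = np.random.default_rng(2)
n = 200
t = rng.uniform(0, 1, n)
X = rng.normal(size=(n, 5))
beta = np.array([2.0, 0.0, -1.5, 0.0, 0.0])
y = X @ beta + np.sin(2 * np.pi * t) + rng.normal(0, 0.3, n)
print(d_lasso(y, X, t))
```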
Statistical Computing
Abstract
This paper presents approximate confidence intervals for each function of parameters in a Banach space based on a bootstrap algorithm. We apply a kernel density approach to estimate the persistence landscape. In addition, we evaluate the quality of the distribution function estimator of random variables using the integrated mean square error (IMSE). The results of simulation studies show a significant improvement achieved by our approach compared to the standard version of the confidence interval algorithm. Finally, a real data analysis shows the accuracy of our method compared to that of previous works for computing confidence intervals.
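A generic bootstrap percentile interval, sketched to show the resampling step the abstract builds on; the persistence-landscape and kernel-density machinery of the paper is not reproduced here.

```python
import numpy as np

def bootstrap_ci(data, statistic, n_boot=2000, level=0.95, seed=0):
    """Percentile bootstrap CI for statistic(data)."""
    rng = np.random.default_rng(seed)
    n = len(data)
    stats = np.array([statistic(data[rng.integers(0, n, n)])
                      for _ in range(n_boot)])
    lo, hi = np.quantile(stats, [(1 - level) / 2, (1 + level) / 2])
    return lo, hi

sample = np.random.default_rng(3).normal(size=100)
print(bootstrap_ci(sample, np.mean))
```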
Computational Statistics
Reza Pourtaheri
Abstract
Traditionally, statistical quality control techniques utilize either an attributes or a variables product quality measure. Recently, some methods, such as the three-level control chart, have been developed for monitoring multi-attribute processes. A control chart usually has three design parameters: the sample size (n), the sampling interval (h) and the control limit coefficient (k). The design parameters of the control chart are generally specified according to statistical and/or economic criteria. The variable sampling interval (VSI) control scheme has been shown to improve the detection efficiency of the control chart with a fixed sampling rate (FRS). In this paper, a method is proposed to conduct the economic-statistical design of variable sampling interval three-level control charts. We use the cost model developed by Costa and Rahim and optimize this model by a genetic algorithm approach. We compare the expected cost per unit time of the VSI and FRS three-level control charts. Results indicate that the proposed chart has improved performance.
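A bare-bones genetic algorithm over the design parameters (n, h, k) is sketched below with a placeholder cost function; the actual Costa and Rahim cost model and the statistical constraints of the paper are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(4)

def cost(n, h, k):
    # Hypothetical stand-in for the expected cost per unit time.
    return 10 / h + 0.5 * n + 50 * np.exp(-k) + h

def ga(pop_size=40, generations=200):
    # Individuals are (n, h, k); n is an integer in [2, 30]. No elitism,
    # kept deliberately minimal.
    pop = np.column_stack([rng.integers(2, 31, pop_size).astype(float),
                           rng.uniform(0.1, 8.0, pop_size),
                           rng.uniform(1.0, 4.0, pop_size)])
    for _ in range(generations):
        fit = np.array([cost(int(n), h, k) for n, h, k in pop])
        # Binary tournament selection on cost (lower is better).
        i, j = rng.integers(0, pop_size, (2, pop_size))
        parents = pop[np.where(fit[i] < fit[j], i, j)]
        # Arithmetic crossover between mirrored parent pairs.
        w = rng.uniform(size=(pop_size, 1))
        children = w * parents + (1 - w) * parents[::-1]
        # Gaussian mutation, then clip back into the search box.
        children += rng.normal(0, [0.5, 0.2, 0.1], children.shape)
        children[:, 0] = np.clip(np.round(children[:, 0]), 2, 30)
        children[:, 1] = np.clip(children[:, 1], 0.1, 8.0)
        children[:, 2] = np.clip(children[:, 2], 1.0, 4.0)
        pop = children
    return pop[np.argmin([cost(int(n), h, k) for n, h, k in pop])]

print(ga())  # (n, h, k) minimizing the placeholder cost
```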
Abstract
In this article, we consider the problem of estimating the stress-strength reliability $Pr (X > Y)$ based on upper record values when $X$ and $Y$ are two independent but not identically distributed random variables from the power hazard rate distribution with common scale parameter $k$. When the parameter $k$ is known, the maximum likelihood estimator (MLE), the approximate Bayes estimator and the exact confidence intervals of stress-strength reliability are obtained. When the parameter $k$ is unknown, we obtain the MLE and some bootstrap confidence intervals of stress-strength reliability. We also apply the Gibbs sampling technique to study the Bayesian estimation of stress-strength reliability and the corresponding credible interval. An example is presented in order to illustrate the inferences discussed in the previous sections. Finally, to investigate and compare the performance of the different proposed methods in this paper, a Monte Carlo simulation study is conducted.
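A quick Monte Carlo check of the stress-strength reliability $R = Pr(X > Y)$, with generic samplers standing in for the power hazard rate distribution, whose exact parametrization is not reproduced here.

```python
import numpy as np

def reliability_mc(sample_x, sample_y, n=200_000, seed=5):
    """Estimate Pr(X > Y) by drawing n independent pairs."""
    rng = np.random.default_rng(seed)
    return np.mean(sample_x(rng, n) > sample_y(rng, n))

# Hypothetical example with Weibull stand-ins for stress and strength:
R = reliability_mc(lambda rng, n: 1.5 * rng.weibull(2.0, n),
                   lambda rng, n: rng.weibull(2.0, n))
print(R)  # ~ Pr(X > Y) under the assumed laws
```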
Statistical Simulation
Farzad Eskandari
Abstract
Imprecise measurement tools produce imprecise data. Interval-valued data are usually used to deal with such imprecision, so interval-valued variables are used in estimation methods. They have recently been modeled by linear regression models. If the response variable has a statistical distribution, interval-valued variables are modeled in the generalized linear models framework. In this article, we propose a new consistent estimator of a parameter in generalized linear models with regard to distributions of the response variable in the exponential family. A simulation study shows that the new estimator is better than others on the basis of particular distributions of the response variable. We also present optimal properties of the estimators in this research.
Statistical Computing
Mohammad Hossein Naderi; Mohammad Bameni Moghadam; Asghar Seif
Abstract
A proper method of monitoring a stochastic system is to use the control charts of statistical process control, in which a drift in the characteristics of the output may be due to one or several assignable causes. In the establishment of $\bar{X}$ charts in statistical process control, an assumption is made that there is no correlation within the samples. However, in practice, there are many cases where correlation does exist within the samples. It would be more appropriate to assume that each sample is a realization of a multivariate normal random vector. Using three different loss functions in the concept of quality control charts with economic and economic-statistical design leads to better decisions in industry. Although some research works have considered the economic design of control charts under a single assignable cause and correlated data, the economic-statistical design of the $\bar{X}$ control chart for multiple assignable causes and correlated data under a Weibull shock model with three different loss functions has not been presented yet. Based on the optimization of the average cost per unit of time and taking into account different combinations of the Weibull distribution parameters, optimal design values of the sample size, sampling interval and control limit coefficient were derived and calculated. Then the cost models under non-uniform and uniform sampling schemes were compared. The results revealed that the model under multiple assignable causes with correlated samples and non-uniform sampling, integrated with three different loss functions, has a lower cost than the model with uniform sampling.
Bayesian Computation Statistics
Ehsan Ormoz
Abstract
In the meta-analysis of clinical trials, the data of each trial are usually summarized by one or more outcome measure estimates, which are reported along with their standard errors. In the case that the summary data are multi-dimensional, the data analysis is usually performed as a number of separate univariate analyses, in which case the correlation between the summary statistics is ignored. In contrast, a multivariate meta-analysis model uses these correlations to synthesize the outcomes jointly and estimate the multiple pooled effects simultaneously. In this paper, we present a nonparametric Bayesian bivariate random effects meta-analysis.
Esmaeil Shirazi
Abstract
Estimation of a quantile density function from biased data is a frequent problem in industrial life testing experiments and medical studies. The estimation of a quantile density function in the biased nonparametric regression model is investigated. We propose and develop a new wavelet-based methodology for this problem. In particular, an adaptive hard thresholding wavelet estimator is constructed. Under mild assumptions on the model, we prove that it enjoys powerful mean integrated squared error properties over Besov balls. The performance of the proposed estimator is investigated by a numerical study. In this study, we develop two types of wavelet estimators for the quantile density function when the data come from a biased distribution function. Our wavelet hard thresholding estimator, which is introduced as a nonlinear estimator, has the feature of being adaptive with respect to q(x). We show that these estimators attain optimal and nearly optimal rates of convergence over a wide range of Besov function classes.
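To show the kind of nonlinear estimator involved, here is a standard wavelet hard-thresholding denoiser using PyWavelets with the universal threshold; the biased-data quantile-density construction of the paper is not reproduced.

```python
import numpy as np
import pywt

def hard_threshold_estimate(y, wavelet="db4", level=4):
    coeffs = pywt.wavedec(y, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745   # noise scale, finest level
    t = sigma * np.sqrt(2 * np.log(len(y)))          # universal threshold
    coeffs = [coeffs[0]] + [pywt.threshold(c, t, mode="hard") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[:len(y)]

x = np.linspace(0, 1, 512)
noisy = np.exp(-x) + np.random.default_rng(6).normal(0, 0.05, x.size)
smooth = hard_threshold_estimate(noisy)
```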
Mozhgan Taavoni
Abstract
This paper considers an extension of the linear mixed model, called the semiparametric mixed effects model, for longitudinal data when multicollinearity is present. To overcome this problem, a new mixed ridge estimator is proposed, while the nonparametric function in the semiparametric model is approximated by the kernel method. The proposed approach integrates the ridge method into the semiparametric mixed effects modeling framework in order to account for both the correlation induced by repeatedly measuring an outcome on each individual over time and the potentially high degree of correlation among possible predictor variables. The asymptotic normality of the exhibited estimator is established. To improve efficiency, the estimation of the covariance function is accomplished using an iterative algorithm. The performance of the proposed estimator is assessed through a simulation study and an analysis of CD4 data.
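A minimal sketch of the generalized ridge step, assuming the familiar form $\hat\beta = (X'V^{-1}X + \lambda I)^{-1}X'V^{-1}y$ with a working covariance $V$; the kernel approximation of the nonparametric component and the iterative covariance update are not reproduced.

```python
import numpy as np

def ridge_gls(X, y, V, lam):
    """Generalized ridge estimate under working covariance V (assumed form)."""
    Vi = np.linalg.inv(V)
    p = X.shape[1]
    return np.linalg.solve(X.T @ Vi @ X + lam * np.eye(p), X.T @ Vi @ y)

# Hypothetical usage: compound-symmetry covariance, near-collinear predictors.
rng = np.random.default_rng(9)
n, p = 50, 3
X = rng.normal(size=(n, p))
X[:, 2] = X[:, 0] + 0.01 * rng.normal(size=n)     # induce multicollinearity
V = 0.5 * np.eye(n) + 0.5 * np.ones((n, n))       # compound symmetry
y = X @ np.array([1.0, -1.0, 0.5]) + rng.multivariate_normal(np.zeros(n), V)
print(ridge_gls(X, y, V, lam=1.0))
```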
Bayesian Computation Statistics
Sima Naghizadeh
Abstract
Bayesian variable selection analysis is widely used as a new methodology in air quality control trials and generalized linear models. One of the important and, of course, controversial topics in this area is the selection of the prior distribution of the unknown model parameters. The aim of this study is to present a substitute for mixtures of priors which, besides preserving their benefits and computational efficiencies, obviates the known paradoxes and contradictions. In this research we consider two points of view, the empirical and the fully Bayesian. In particular, a mixture of priors and its theoretical characteristics are introduced. Finally, the proposed model is illustrated with a real example.
Statistical Computing
Hassan Rashidi; Hamed Heidari; Marzie Movahedin; Maryam Moazami Gudarzi; Mostafa Shakerian
Abstract
The purpose of this research is to identify and introduce the factors that are effective in the adoption of e-learning, based on the technology acceptance model. Accordingly, by considering the studies conducted in this field, several variables, such as computer self-efficacy, content quality, system support, interface design, technology tools and computer anxiety, were extracted as factors influencing the adoption of an e-learning system, and a conceptual research model was developed based on them. To measure the model and the relationships between its variables, a questionnaire was designed and given to users of the electronic education system of Qazvin University of Medical Sciences. Analysis of the data with the structural equation modeling method confirmed all of the hypotheses except for the effect of technology tools on the acceptance of the e-learning system. The findings of this study will help university administrators and the professors associated with this system to encourage students to make effective use of the system by creating the necessary conditions for the effective factors.
Mahsa Ghajarbeigi; Hamid Reza Vakely Fard; Ramzanali Roeayi
Abstract
The purpose of this paper was to investigate the impact of audit quality on the reduction of collateral facilities, taking into account the role of major shareholders, in companies listed on the Tehran Stock Exchange during the period 2017 to 2022. Given the research conditions, 179 companies were selected as the statistical sample (from a total of 895 companies). In terms of nature and content, this is descriptive, applied research. The panel data method was used to test the research hypotheses. The findings emphasize that audit quality reduces collateral facilities and that auditor rotation increases them, while the auditor's industry expertise does not have a significant effect on collateral facilities. Moreover, the ownership percentage of major shareholders does not affect the intensity of the impact of audit quality, industry expertise, or auditor rotation on collateral facilities.
Azar Ghyasi; Hanieh Rashidi
Abstract
Due to its inherent complexity and increasing competition, today's business environment requires new approaches to organizing and managing. One of these new approaches is business intelligence, which is the most critical technology for helping to manage and deliver smart services, especially business reporting. Business intelligence enables firms to manage their business efficiently and to meet needs at the macro, middle and even operational levels. In this paper, while investigating the feasibility of implementing business intelligence in firms, the design of business intelligence for reporting and presenting new services is discussed. In order to demonstrate the capabilities of this type of intelligence, an approach based on the concept of Bayesian networks in the application layer of business intelligence is presented. This approach is implemented for one of the companies governed by the Iranian Industrial Development and Renovation Organization, and the effects of important accounting and financial variables on the firm's goals are investigated.
Mathematical Computing
Mohammad Arashi
Abstract
The multilinear normal distribution is a widely used tool in the tensor analysis of magnetic resonance imaging (MRI). Diffusion tensor MRI provides a statistical estimate of a symmetric 2nd-order diffusion tensor for each voxel within an imaging volume. In this article, the tensor elliptical (TE) distribution is introduced as an extension of the multilinear normal (MLN) distribution. Some properties, including the characteristic function and the distribution of affine transformations, are given. An integral representation connecting the densities of the TE and MLN distributions is exhibited and used in deriving the expectation of any measurable function of a TE variate.
Statistical Computing
Farzad Eskandari
Abstract
Interval-valued data are observed as ranges instead of single values and contain richer information than single-valued data. Meanwhile, interval-valued data are used for interval-valued characteristics. An interval generalized linear model is proposed for the first time in this research. Then a suitable model is presented to estimate the parameters of the interval generalized linear model. The two models are built on interval arithmetic. The estimation procedure for the parameters of the suitable model is the same as the estimation procedure for the parameters of the interval generalized linear model. The least-squares (LS) estimation of the suitable model is developed according to a nice distance in the interval space. The LS estimation is resolved analytically through a constrained minimization problem. Then some desirable properties of the estimators are checked. Finally, both the theoretical and the empirical performance of the estimators are investigated.
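One common way to fit least squares to interval-valued data, sketched for orientation only, is to regress midpoints and half-ranges separately with a non-negativity constraint on the spread coefficient; this stands in for, and is not, the paper's constrained LS over a distance in the interval space.

```python
import numpy as np

def interval_ls(lo_x, hi_x, lo_y, hi_y):
    """Fit midpoints by OLS and half-ranges by a non-negative slope."""
    mid_x, spr_x = (lo_x + hi_x) / 2, (hi_x - lo_x) / 2
    mid_y, spr_y = (lo_y + hi_y) / 2, (hi_y - lo_y) / 2
    A = np.column_stack([np.ones_like(mid_x), mid_x])
    beta_mid, *_ = np.linalg.lstsq(A, mid_y, rcond=None)
    # Spread coefficient kept non-negative so predicted intervals stay valid.
    beta_spr = max(0.0, float(spr_x @ spr_y / (spr_x @ spr_x)))
    return beta_mid, beta_spr
```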
Alireza Safariyan; Reza Arabi Belaghi
Abstract
In this paper, the probability of failure-free operation until time t, along with the stress-strength probability, based on progressively censored data, is studied in a family of lifetime distributions. Since the number of observations in a progressive censoring scheme is usually small, shrinkage methods have been used to improve the classical estimator. For estimation purposes, preliminary test and Stein-type shrinkage estimators are proposed and their exact distributional properties are derived. To demonstrate the numerical superiority of the proposed estimation strategies, some improved bootstrap confidence intervals are constructed. The theoretical results are illustrated by a real data example and an extensive simulation study. The simulation evidence reveals that our proposed shrinkage strategies perform well in the estimation of parameters based on progressively censored data.
Bayesian Network
Vahid Rezaei Tabar
Abstract
The aim of this paper is to learn a Bayesian network structure for discrete variables. For this purpose, we introduce a Gibbs sampler method. Each sample represents a Bayesian network; thus, in the process of Gibbs sampling, we obtain a set of Bayesian networks. To achieve a single graph that represents the best graph fitted to the data, we use the mode of the graphs sampled after burn-in: the most frequent edges among these graphs are taken to indicate the best single graph. The results on well-known Bayesian networks show that our method has higher accuracy in the task of learning a Bayesian network structure.
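A sketch of the "most frequent edges" aggregation step: given sampled directed graphs, each represented as a set of (parent, child) edges, keep the edges appearing in more than a chosen fraction of the samples. The threshold of one half is an assumption for illustration.

```python
from collections import Counter

def modal_graph(sampled_graphs, threshold=0.5):
    """Return the edges present in more than `threshold` of the samples."""
    counts = Counter(e for g in sampled_graphs for e in g)
    cutoff = threshold * len(sampled_graphs)
    return {e for e, c in counts.items() if c > cutoff}

samples = [{("A", "B"), ("B", "C")}, {("A", "B")}, {("A", "B"), ("A", "C")}]
print(modal_graph(samples))  # {('A', 'B')}
```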
Statistical Simulation
Zahra Zandi; Hossein Bevrani; Reza Arabi Belaghi
Abstract
In this paper, we consider the problem of parameter estimation in the negative binomial mixed model when it is suspected that some of the fixed parameters may be restricted to a subspace. We propose linear shrinkage, preliminary test, shrinkage preliminary test, shrinkage, and positive shrinkage estimators, along with the unrestricted maximum likelihood and restricted estimators. The random effects are considered as nuisance parameters. We conduct a Monte Carlo simulation study to evaluate the performance of each estimator in the sense of simulated relative efficiency. The results of the simulation study reveal that the proposed estimation strategies perform better than the maximum likelihood method. The proposed estimators are applied to a real dataset to appraise their performance.
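For orientation, the textbook combine rules behind preliminary test and Stein-type shrinkage estimation are sketched below in generic form; the negative binomial mixed model details, test statistics, and shrinkage constants of the paper are not reproduced.

```python
import numpy as np

def preliminary_test(beta_unres, beta_res, test_stat, critical_value):
    # Keep the restricted estimator only if the restriction is not rejected.
    return beta_res if test_stat <= critical_value else beta_unres

def stein_shrinkage(beta_unres, beta_res, test_stat, d):
    # Shrink the unrestricted estimator toward the restricted one.
    return beta_res + (1 - d / test_stat) * (beta_unres - beta_res)

def positive_shrinkage(beta_unres, beta_res, test_stat, d):
    # Truncate the shrinkage factor at zero to avoid over-shooting.
    return beta_res + max(0.0, 1 - d / test_stat) * (beta_unres - beta_res)

b_u, b_r = np.array([1.2, 0.4]), np.array([1.0, 0.0])
print(positive_shrinkage(b_u, b_r, test_stat=5.0, d=2.0))
```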
Mathematical Computing
Ali Moafi; Ali Kheyroddin; Hamid Saberi; Vahid Saberi
Abstract
Due to several reasons, such as the low resistance of constructed concrete and changes in codes or in the application of structures, some concrete frames need to be retrofitted. Adding a steel prop and curb to reinforced concrete changes many parameters, such as ductility, resistance, and stiffness. This study numerically investigates the impact of adding a prop and curb, a slit damper, a gusset plate, and a prop with a ductile ring on the stiffness, resistance, energy dissipation and ductility of RC frames. For this purpose, the effects of the aforementioned methods on the linear and nonlinear moment-frame behavior of reinforced concrete under monotonic loads were numerically investigated using the ABAQUS software. In the present study, 12 samples of reinforced frames with one story and one span were retrofitted by different methods; the novelty of the paper lies in using such props and slit dampers in RC frames. The modeling results showed that the frame retrofitted with a ring, slit damper and gusset plate exhibited better behavior in terms of resistance and stiffness than the bare RC frame, and that the samples with a slit damper and with a prop with a ductile ring showed more ductility and energy dissipation than the sample with the prop and curb.
Mathematical Computing
Mahboubeh Aalaei
Abstract
In this paper, a new adaptive Monte Carlo algorithm is proposed to solve the systems of linear algebraic equations arising from the Black–Scholes model to price European and American options. The proposed algorithm offers several advantages over conventional and previous adaptive Monte Carlo algorithms. The corresponding properties of the algorithm and convergence theory are discussed, and numerical experiments that demonstrate the computational efficiency of the proposed algorithm are presented. The results are also compared with other methods.
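For context, a classical (non-adaptive) Monte Carlo solver for a system $x = Hx + b$ via the von Neumann-Ulam random-walk scheme is sketched below; the paper's adaptive refinements are not reproduced, and the kill probability and test matrix are assumptions.

```python
import numpy as np

def mc_solve_component(H, b, i, n_walks=20_000, survive=0.9, seed=7):
    """Estimate x_i where x = H x + b. Assumes the Neumann series converges
    and every row of H has a nonzero entry."""
    rng = np.random.default_rng(seed)
    n = H.shape[0]
    P = np.abs(H) / np.abs(H).sum(axis=1, keepdims=True)  # transition matrix
    total = 0.0
    for _ in range(n_walks):
        k, w, est = i, 1.0, b[i]
        while rng.random() < survive:              # walk survives w.p. 0.9
            j = rng.choice(n, p=P[k])
            w *= H[k, j] / (P[k, j] * survive)     # unbiasing weight
            est += w * b[j]
            k = j
        total += est
    return total / n_walks

H = np.array([[0.1, 0.3],
              [0.2, 0.1]])
b = np.array([1.0, 2.0])
print(mc_solve_component(H, b, 0))                 # Monte Carlo estimate
print(np.linalg.solve(np.eye(2) - H, b)[0])        # exact value, for comparison
```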
Bahareh Asadi
Abstract
One of the important challenges in Wireless Sensor Networks is to carry out data transmission in a way that increases the lifetime of the network. One of the main issues is the reduction of latency in the nodes and of energy at the sink nodes. Due to the limited energy of the nodes, data transmission has the largest share in energy consumption, so it is important to design a structure that consumes the least energy when sending data to the base station. In this paper, we use fuzzy logic and the Mamdani method for clustering to address this challenge, and the time division multiplexing method to connect the nodes with the cluster head. The proposed clustering is based on the LEACH algorithm, whose capability and reliability are improved by fuzzy systems, and the particle swarm optimization algorithm is used to optimize the network paths. The simulation results show that energy consumption decreases with an increasing number of cycles; for example, energy consumption reached 0.9 in round 2000 and 0.1 in round 5000.
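For reference, the standard LEACH cluster-head election threshold is sketched below; the paper's fuzzy-logic (Mamdani) refinement of this rule is not reproduced.

```python
import random

def leach_is_cluster_head(p, r, eligible):
    """Standard LEACH election: a node that has not served as cluster head in
    the current epoch becomes one with probability
    T(r) = p / (1 - p * (r mod 1/p)).
    p: desired cluster-head fraction; r: current round."""
    if not eligible:
        return False
    threshold = p / (1 - p * (r % int(1 / p)))
    return random.random() < threshold

print(leach_is_cluster_head(0.05, r=7, eligible=True))
```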
Sima Naghizadeh
Abstract
In many natural phenomena, the relationship and the effect of the input variables on the response variable in statistical studies may differ from the model the researcher selects for the study, owing to what is hidden in the structure of the data. This can strongly influence the different distributions considered for the response variable. The optimal properties of the estimators are evaluated and studied for two statistical distributions considered for the response variable and the input variables in the suggested model. A simulation study was carried out and real data were also investigated. The results confirm the superiority of a model that is close to the structure of the data.
Zahra Aghajani; Mostafa Karbasi; Bahareh Asadi
Abstract
Deaf people or people with hearing loss have a major problem in everyday communication. There are many applications available on the market to help blind people interact with the world; voice-based email and chat systems are available for blind people to communicate with each other, which helps them interact with other persons. Also, many attempts have been made with Sign Language (SL) translators to bridge the communication gap between hearing and deaf people and to ease communication for deaf people. In this paper, geometric features are used for feature extraction in static sign recognition. A Support Vector Machine (SVM) classifier is used for training and testing to develop a system using static signs. The accuracy for static signs using geometric features is 62.92\%, which needs to be improved with other feature extraction methods and classifiers.
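A minimal SVM pipeline of the kind the abstract describes, with placeholder feature vectors standing in for the geometric features; the paper's data and features are not reproduced.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(8)
X = rng.normal(size=(300, 10))   # stand-in geometric feature vectors
y = rng.integers(0, 5, 300)      # stand-in static-sign labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
model.fit(X_tr, y_tr)
print("test accuracy:", model.score(X_te, y_te))
```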