Dr. Ravichandran currently serves as Professor in the Department of Mathematics, School of Engineering, Coimbatore.

Earlier, he served in the Statistical Quality Control Department of a manufacturing industry for more than 12 years. His research and subject areas of interest include statistical quality control, statistical inference, Six Sigma, Total Quality Management, statistical pattern recognition and SPSS.

He has published two books and a number of papers on Six Sigma and statistical inference in international journals and websites. His papers have received more than 150 citations.

He regularly reviews research papers for international journals such as The TQM Journal (Emerald Publications) and Total Quality Management and Business Excellence (Taylor and Francis – Routledge). He has guided one Ph.D. scholar.

He has organized a two-day National Conference on Quality Improvement Concepts and their Implementation in Higher Educational Institutions (QICIHEI – 2009).

Dr. Ravichandran has served as a resource person in many workshops and seminars (on Six Sigma, Statistics and SPSS) organized by various educational institutions and universities.

He was a Senior Member of the American Society for Quality (1987 – 2007) and is a Life Member of the Indian Society for Technical Education (ISTE).


Publication Type: Journal Article
2016 Journal Article J. Ravichandran, “Estimation of DPMO and EGPMO for Higher-the-Better and Lower-the-Better Quality Characteristics for Quality Evaluation”, Total Quality Management & Business Excellence, vol. 27, no. 10, pp. 1112-1120, 2016.

Whenever a specification is provided for a measurable quality characteristic, under the normality assumption, the defects per million opportunities (DPMO) can be computed by taking into account both tail probabilities. This implies that a unit of a product is declared defective if a measurement on the quality characteristic falls either below the specified lower limit or above the upper limit. However, there are practical situations where a unit is said to be defective only if the measurement falls below the lower limit (higher-the-better) or only if it falls above the upper limit (lower-the-better). Under these circumstances, the DPMO needs to be computed taking into account the appropriate tail probability (left or right). In this paper, the aspects of Six Sigma are used to decide the upper and lower sigma quality limits for these two cases, and the DPMO is then estimated accordingly. Probabilities of far good units, that is, extremely good parts per million opportunities (EGPMO), of the quality characteristic are also determined. It is discussed how DPMO and EGPMO help in evaluating the overall quality level of the process/product of interest. The procedure is illustrated with numerical examples.

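Under the normality assumption described in the abstract above, a one-sided DPMO reduces to a single tail probability of the normal distribution scaled to a million opportunities. A minimal sketch of that computation, using only the Python standard library (the function name and the numerical figures are illustrative assumptions, not taken from the paper):

```python
from statistics import NormalDist

def dpmo_one_sided(mean, sd, limit, kind):
    """DPMO for a one-sided specification under the normality assumption.

    kind="higher-the-better": a unit is defective below the lower limit;
    kind="lower-the-better":  a unit is defective above the upper limit.
    """
    nd = NormalDist(mean, sd)
    if kind == "higher-the-better":
        p_defect = nd.cdf(limit)        # left-tail probability
    else:
        p_defect = 1.0 - nd.cdf(limit)  # right-tail probability
    return p_defect * 1_000_000

# A process with mean 50 and sd 2 against a lower limit of 44
# (higher-the-better): the limit sits 3 sigma below the mean.
print(round(dpmo_one_sided(50, 2, 44, "higher-the-better")))  # → 1350
```

The two-sided DPMO mentioned at the start of the abstract would simply add both tail probabilities before scaling.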
2016 Journal Article J. Ravichandran, “Six Sigma-based X-bar Control Chart for Continuous Quality Improvement”, International Journal of Quality Research, vol. 10, no. 2, pp. 257-266, 2016.

Introduced by Shewhart, the traditional variable control chart for the mean (X-bar chart) is an effective tool for controlling and monitoring processes. However, the main disadvantage of the X-bar chart is that the population standard deviation is unknown, though the sample mean is an unbiased estimator of the population mean. There are many approaches to estimating the unknown standard deviation, and the varying expertise of researchers and practitioners may lead to varying conclusions. In this paper, an innovative approach is introduced to estimate the population standard deviation from the perspective of Six Sigma quality for the construction of the proposed control chart for the mean, called the Six Sigma-based X-bar control chart. Under the assumption that the process is normal, in the proposed chart the population mean and standard deviation are drawn from the process specification from the perspective of Six Sigma quality. After discussing the aspects of the traditional X-bar control chart, the procedure for the construction of the proposed new Six Sigma-based X-bar control chart is presented. The new chart is capable of maintaining the process mean close to the target through variance reduction, resulting in quality improvement. Also, it may be noted that at a point in time the process, though under statistical control, may be maintaining only a particular sigma quality level, while the goal is the Six Sigma quality level of just 3.4 defects per million opportunities. Hence, as a practice of continuous quality improvement, it is suggested to use the proposed control chart repeatedly, with improvement each time, until the goal of Six Sigma with 3.4 defects per million opportunities is achieved. An innovative cyclic approach for performing the continuous quality improvement activity is also presented. The construction of the proposed Six Sigma-based X-bar control chart is demonstrated using an illustrative example.
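One plausible reading of "drawing the mean and standard deviation from the process specification" is to take the target as the mid-specification and to assume that, at a k-sigma quality level, the specification half-width spans k standard deviations. A sketch under those stated assumptions (the function and all figures are illustrative, not the paper's exact construction):

```python
import math

def six_sigma_xbar_limits(lsl, usl, n, k=6):
    """Control limits for an X-bar chart whose mean and standard deviation
    are derived from the specification rather than estimated from data.

    Illustrative assumptions: the target is the mid-specification, and at
    a k-sigma quality level the specification half-width spans k standard
    deviations, so sigma = (usl - lsl) / (2 * k).
    """
    target = (lsl + usl) / 2
    sigma = (usl - lsl) / (2 * k)
    margin = 3 * sigma / math.sqrt(n)  # usual 3-sigma limits for subgroup means
    return target - margin, target, target + margin

# Specification 44-56, subgroups of size 4, aiming at Six Sigma quality:
lcl, cl, ucl = six_sigma_xbar_limits(44, 56, n=4, k=6)
print(lcl, cl, ucl)  # → 48.5 50.0 51.5
```

Rerunning the computation with a larger k as the process improves mirrors the cyclic continuous-improvement use the abstract describes.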
2015 Journal Article J. Ravichandran, “Control Chart for High Quality Processes Based on Six Sigma Quality”, International Journal of Quality and Reliability Management, 2015.
2015 Journal Article T. Palanisamy and Ravichandran, J., “A wavelet-based hybrid approach to estimate variance function in heteroscedastic regression models”, Statistical Papers, vol. 56, pp. 911-932, 2015.

We propose a wavelet-based hybrid approach to estimate the variance function in a nonparametric heteroscedastic fixed design regression model. A data-driven estimator is constructed by applying wavelet thresholding along with the technique of sparse representation to the difference-based initial estimates. We prove the convergence of the proposed estimator. The numerical results show that the proposed estimator performs better than the existing variance estimation procedures in the mean square sense over a range of smoothness classes.

2015 Journal Article T. Palanisamy and Ravichandran, J., “Variance Estimation in Heteroscedastic Models by Undecimated Haar Transform”, Communications in Statistics-Simulation and Computation, vol. 44, pp. 1532-1544, 2015.

We propose a method to maximize the accuracy of the estimation of piecewise constant and piecewise smooth variance functions in a nonparametric heteroscedastic fixed design regression model. The difference-based initial estimates are obtained from the given observations. An estimator is then constructed by using an iterative regularization method with the analysis-prior undecimated three-level Haar transform as the regularizer term. This method shows better results in the mean square sense than an existing adaptive estimation procedure for all the standard test functions used, in addition to the functions that we target. Some simulations and comparisons with other methods are conducted to assess the performance of the proposed method.

2014 Journal Article T. Palanisamy and Ravichandran, J., “Estimation of variance function in heteroscedastic regression models by generalized coiflets”, Communications in Statistics: Simulation and Computation, vol. 43, pp. 2213-2224, 2014.

A wavelet approach is presented to estimate the variance function in heteroscedastic nonparametric regression model. The initial variance estimates are obtained as squared weighted sums of neighboring observations. The initial estimator of a smooth variance function is improved by means of wavelet smoothers under the situation that the samples at the dyadic points are not available. Since the traditional wavelet system for the variance function estimation is not appropriate in this situation, we demonstrate that the choice of the wavelet system is significant to have better performance. This is accomplished by choosing a suitable wavelet system known as the generalized coiflets. We conduct extensive simulations to evaluate finite sample performance of our method. We also illustrate our method using a real dataset.

2012 Journal Article J. Ravichandran, “A review of preliminary test-based statistical methods for the benefit of Six Sigma quality practitioners”, Statistical Papers, vol. 53, pp. 531-547, 2012.

Ever since Professor Bancroft developed inference procedures using preliminary tests, there has been a great deal of research in this area by various authors across the world, as evidenced by two papers that widely reviewed the publications on preliminary test-based statistical methods. The use of preliminary tests in resolving doubts arising over the model parameters has gained momentum, as it has proven to be more effective and powerful than classical methods. Unfortunately, there has been a downward trend in research related to preliminary tests, as can be seen from the few recent publications. Obviously, the benefits of preliminary test-based statistical methods did not reach Six Sigma practitioners, as the concept of Six Sigma had just taken off and was in a premature state. In this paper, efforts have been made to present a review of the publications on preliminary test-based statistical methods. Though studies on preliminary test-based methods have been done in various areas of statistics, such as theory of estimation, hypothesis testing, analysis of variance, regression analysis and reliability, to mention a few, only a few important methods are presented here for the benefit of readers, particularly Six Sigma quality practitioners, to understand the concept. In this regard, the define, measure, analyze, improve and control methodology of Six Sigma is presented with a link from the analyze phase to preliminary test-based statistical methods. Examples are also given to illustrate the procedures.

2012 Journal Article J. Ravichandran and Palanisamy, T., “Nonparametric Regression Curve Estimation Using Generalized Coiflets”, Bulletin of Calcutta Mathematical Society, vol. 104, no. 2, pp. 139-146, 2012.
2011 Journal Article J. Ravichandran, “A chi-square-based heuristic statistical procedure for approximating bid price in a competitive marketplace: A case study”, Journal of Business and Industrial Marketing, vol. 27, pp. 69-76, 2011.

Purpose: The purpose of this paper is to propose a chi-square-based heuristic statistical procedure that considers the most recent bid prices of self and competitors in proposing an optimum winning bid price. The use of simple regression method to predict the adjusted optimum bid price is also considered. Design/methodology/approach: In order to achieve this objective, the proposed approach uses past bid prices of the case company and its competitors. A heuristic chi-square statistical procedure is then developed to obtain the expected bid prices. The absolute difference between the prices is obtained and the same is adjusted to get the optimal bid price for the forthcoming bid. Findings: It is demonstrated that the proposed heuristic chi-square-based approach is superior to that of the regression method, which is otherwise in practice in the case company, since the latter still gets support from the former. Further, the paper throws light on how a familiar statistical method could be conveniently but effectively applied as a tactic of marketing management. Research limitations/implications: The method is developed based on the experience of the case company in bid participation. Therefore, the study on the performance of the application of the proposed method from the perspective of different types of companies can be taken up as a future work. Also, if appropriate, the time and other factors may be considered as other independent factors along with the tender quantity in predicting the tender price. Practical implications: The proposed heuristic chi-square procedure is simple to implement as the required data on bid prices are usually available once the bids are open. The case study reveals how the method can be successfully implemented to win bids in a competitive marketplace. Also, the procedure is more generic so that it can be adapted as it is or with slight modification as desired by companies of any nature participating in bids.
Originality/value: The proposed approach would have a high value among manufacturing firms and marketing managers who participate in tenders with an aim to increase sales turnover, which in turn will increase profit. The paper is unique of its kind and will help researchers to think of either extending or proposing such new approaches as they need.
2008 Journal Article J. Ravichandran, “The performance of treatments in single-factor experiments using Six Sigma metrics”, International Journal of Six Sigma and Competitive Advantage, vol. 4, pp. 114-132, 2008.

Whether the specification of a process parameter of interest (response) is available or not, experimenters always try to optimise the process by moving the mean close to the target value and simultaneously reducing the variance. The Analysis of Variance (ANOVA) is a powerful method and is useful to know whether the treatment means are significantly different or not. As a part of post-ANOVA analysis, when the hypothesis of equal means is rejected, various types of Multiple Comparison Tests (MCTs) are available to classify the means into different homogeneous groups. Also, in order to identify the significant treatment levels, the Analysis of Means (ANOM), a graphical approach, is found to be useful as it serves as an alternative to ANOVA. This paper concentrates on the analysis of the actual performance of the treatment means from the Six Sigma practitioners' point of view. Therefore, a new Six Sigma-based approach is proposed to study the performance of the treatment levels in single-factor experiments using Six Sigma performance metrics such as 'mean shifts' and 'sigma quality levels'. A new graphical display that uses mean shifts and sigma quality levels is also developed for easy interpretation. This approach will help the experimenters, particularly Six Sigma practitioners, decide on the treatment level(s) that can give the most efficient response. A number of examples are considered for illustration.
2007 Journal Article J. Ravichandran, “Cost-based Process Weights for DPMO and the Overall Performance of an Organization”, The TQM Magazine, vol. 19, pp. 442-453, 2007.

Purpose – The purpose of this paper is to propose a new approach in which cost‐based process weights are used to determine a unique weighted‐defects per million opportunity (DPMO) and its corresponding overall sigma level in order to classify an organization as either “world‐class,” “industry average” or “non‐competitive.” Design/methodology/approach – In order to achieve this objective, the proposed approach uses both internal and external performances of the products and processes in terms of costs involved to determine cost‐based process weights. These weights are then incorporated into the respective DPMOs for computing weighted‐DPMOs. Finally, a unique weighted‐DPMO and its corresponding sigma level are found. Findings – The proposed method is a new one and it involves various costs for determining process weights. The findings reveal that the weight‐based overall sigma level is more realistic than the one that is calculated without weights. Further, the results of this study could provide interesting feedback to Six Sigma practitioners, as they are particular about DPMOs and return on investments in project implementations. Research limitations/implications – The results of this paper are based on the weights of respective processes and their products that are calculated using various cost aspects. Determining such weights by means of any other process and product factors incorporating the effects of various marketing activities, if any, could extend its generality and fill this gap. Practical implications – The proposed method is simple to implement and the required data can be collected without any additional commitments. Also, it is more generic so that it can be adapted by organizations of any nature. This paper recommends a change in practice from simply using the DPMOs with equal importance to using the weight‐based DPMOs for evaluating the overall sigma level (performance) of an organization.
Originality/value – The proposed approach would have a high value among Six Sigma quality practitioners and researchers as it provides a new and more realistic measure for the overall performance of an organization during the evaluation process.
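The weighted-DPMO idea can be sketched numerically: weight each process's DPMO by its cost share, then convert the pooled DPMO to an overall sigma level using the conventional 1.5-sigma shift. The weighting scheme and all figures below are illustrative assumptions, not the paper's exact procedure:

```python
from statistics import NormalDist

def overall_sigma_level(dpmos, costs, shift=1.5):
    """Cost-weighted DPMO pooled across processes, and the corresponding
    overall sigma level under the conventional 1.5-sigma shift.

    Weights proportional to cost share are an illustrative choice.
    """
    total = sum(costs)
    weighted_dpmo = sum(d * c / total for d, c in zip(dpmos, costs))
    p_defect = weighted_dpmo / 1_000_000
    sigma_level = NormalDist().inv_cdf(1.0 - p_defect) + shift
    return weighted_dpmo, sigma_level

# Three processes with DPMOs 3.4, 6210 and 66807 and cost weights 5 : 3 : 2:
wd, sl = overall_sigma_level([3.4, 6210, 66807], [5.0, 3.0, 2.0])
print(round(wd, 1), round(sl, 2))
```

With equal weights the pooled DPMO would be a plain average; the cost weighting is what shifts the overall sigma level toward the costlier processes.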
2006 Journal Article J. Ravichandran, “Setting up a Quality Specification”, Six Sigma Forum Magazine, vol. 5, no. 2, pp. 26 – 30, 2006.
2006 Journal Article J. Ravichandran, “Six Sigma Milestones: An Overall Sigma Level of an Organization”, Total Quality Management & Business Excellence, vol. 17, no. 8, pp. 973 – 980, 2006.
Publication Type: Book
2015 Book J. Ravichandran, Probability and Random Processes. New Delhi: IK International, 2015.
2010 Book J. Ravichandran, Probability and Statistics for Engineers. Wiley India, 2010.

Probability and Statistics for Engineers is written for undergraduate students of engineering and physical sciences. Besides students of B.E. and B.Tech., those pursuing MCA and MCS can also find the book useful. The book is equally useful to Six Sigma practitioners in industries. Comprehensive yet concise, the text is well-organized into 15 chapters that can be covered in a one-semester course in probability and statistics. Designed to meet the requirements of engineering students, the text covers all important topics, emphasizing basic engineering and science applications. Assuming a knowledge of elementary calculus, all solved examples are real-time, well-chosen, self-explanatory and graphically illustrated, helping students understand the concepts of each topic. Exercise problems and MCQs are given with answers, which will help students prepare well for their exams.
Faculty Details

Phone: +91 9865005528