Ph.D. Thesis Abstracts

Linguistic Analysis of Efficiency Using Fuzzy Systems Theory and Data Envelopment Analysis
Ozren Despic


Linguistic Analysis of Efficiency (LAE) is a new theoretical development in the domain of efficiency analysis that brings together elements of performance evaluation and benchmarking. In organizational practice, LAE is a visually appealing tool that, based on comparative efficiency evaluation, assists in choosing the best path for improving productivity while at the same time promoting organizational learning.

LAE has been built upon two important theories in today’s scientific community: Data Envelopment Analysis (DEA), initiated by Charnes et al. [Char78], and Fuzzy Systems Theory (FST), originated by Zadeh [Zade65]. Combining the two, the LAE model produces an easy-to-understand set of natural-language rules that describes the shape and characteristics of the standard DEA production space and its efficiency frontier. Once created, these rules can be easily translated into various forms of informative charts showing the paths toward improving efficiency. LAE thus removes the complexity and abstractness of the DEA process, making it transparent not only to the analyst but, more importantly, to the decision making unit’s (DMU’s) manager or to new DMUs that may be created. Hence, LAE allows the analyst to interact with (and better understand) the inner workings of DEA, opening the door to new insight for DMUs wishing to improve efficiency and providing, for the first time, a map for new DMUs to follow when starting out.
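
The fuzzification step behind such natural-language rules can be illustrated with a small sketch (illustrative only; the labels and partitions below are hypothetical, not taken from the thesis):

```python
# Minimal sketch of the fuzzy-rule idea behind LAE: triangular membership
# functions translate a normalized quantity into linguistic labels.

def triangular(x, a, b, c):
    """Triangular membership function peaking at b on support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical linguistic partitions for a normalized level in [0, 1].
labels = {
    "low":    lambda x: triangular(x, -0.5, 0.0, 0.5),
    "medium": lambda x: triangular(x, 0.0, 0.5, 1.0),
    "high":   lambda x: triangular(x, 0.5, 1.0, 1.5),
}

def fuzzify(x):
    """Return the membership degree of x in each linguistic label."""
    return {name: mu(x) for name, mu in labels.items()}

print(fuzzify(0.7))  # partly "medium", partly "high"
```

A rule base would then combine such labels (e.g. "if input use is high and output is low, reduce input X first") into the plain-language descriptions of the frontier that the abstract refers to.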

Measuring and Incorporating Meaningful Proportional Slacks in Radial DEA Models, and Aspects of Non-Discretionary DEA Models

Barak Edelstein

This thesis comprises theoretical contributions in two distinct but related DEA fields: non-discretionary DEA models, and slack selection methodologies in radial DEA models and their integration into an overall efficiency score. The theoretical developments are illustrated through intermediation efficiency models using real-world data from the retail branch network of one of Canada’s five largest banks. The models examine the loan quality and business generation of the bank’s branches using data on the branches’ business environment, the levels of deposits and loans that they generate, and the resulting levels of bad loans.

Several theoretical contributions are made in the field of non-discretionary DEA models. It is shown that the NCN non-discretionary DEA model will produce artificially high efficiency scores under certain circumstances. The research also expands on the work of Lovell and Pastor dealing with one-sided DEA models with discretionary variables, by examining how non-discretionary variables can be integrated into one-sided DEA models. A new solution technique for solving one-sided DEA models with semi-negative non-discretionary variables is proposed.

The thesis presents a new unit-invariant slack selection methodology for radial DEA models that maximizes the relative slack improvements. Relevant background literature is reviewed, showing the shortcomings of current slack selection approaches in radial DEA models, including the CCR and BCC models. In addition, new models are proposed for incorporating the slacks with the radial θ efficiency score into an overall efficiency score, making the results more usable.
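
The unit-invariance idea can be sketched as follows (a simplified illustration with assumed numbers, not the thesis's exact formulation): slacks remaining after the radial contraction are expressed as fractions of the observed inputs, so the measure is unaffected by the units of measurement, and one possible aggregation folds them into the radial θ score.

```python
# Illustrative sketch: relative slacks after a radial input contraction by
# theta, measured as a fraction of each observed input level.

def relative_slacks(x, theta, target):
    """Slack left after radial contraction by theta, as a fraction of x_i."""
    return [(theta * xi - ti) / xi for xi, ti in zip(x, target)]

def overall_score(theta, rel_slacks):
    """One simple (hypothetical) aggregation: theta minus mean relative slack."""
    return theta - sum(rel_slacks) / len(rel_slacks)

x = [100.0, 40.0]       # observed inputs (e.g. staff hours, dollars)
theta = 0.8             # radial efficiency score
target = [80.0, 24.0]   # frontier target; the second input has extra slack

rs = relative_slacks(x, theta, target)
print(rs, overall_score(theta, rs))
```

Because each slack is divided by its own input, rescaling an input from dollars to thousands of dollars (and the target with it) leaves the relative slack, and hence the overall score, unchanged.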

Determining the Relationship between the Credit and Equity Markets

Angie Elkhodiry

The evolution of credit derivatives has inspired many researchers to investigate the behaviour of credit spreads. Today the growing consensus is that the equity option market provides sufficient information to estimate latent credit parameters. Recently, Hull, Nelken and White [HULL05] proposed a model to estimate credit spreads from the equity option market. This research presents some important theoretical developments that could be profitably adopted by the financial industry. We first test the conjecture of a relationship between the credit and equity options markets by running a time series analysis between market credit spreads and the corresponding implied equity volatility. Different terms and moneyness levels were considered in order to eliminate any presumptions; we find that there exists a strong positive relationship between credit spreads and implied equity volatility. Secondly, we extend Hull et al.'s model by relaxing significant assumptions and introducing our "First-Passage Alternative (FPA)" model. We show that the FPA model produces an accurate estimation of credit spreads while showing sensitivity to implied equity volatility in all ranges. Finally, we use this model to deduce hedging ratios which result in simple closed-form solutions, enabling traders to effectively and economically eliminate their risk exposure using both credit and equity markets. The FPA model provides a useful link between the two markets, and allows the recognition of existing arbitrage opportunities as they occur.
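
For intuition about the credit-equity link, a minimal Merton-type structural calculation shows credit spreads rising with asset volatility (a textbook simplification; the FPA model itself is a first-passage extension and is not reproduced here):

```python
from math import log, sqrt, exp, erf

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def merton_spread(V, F, r, sigma, T):
    """Credit spread (continuous compounding) in Merton's structural model.

    V: firm asset value, F: debt face value, r: risk-free rate,
    sigma: asset volatility, T: maturity in years.
    """
    d1 = (log(V / F) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    debt_value = V * norm_cdf(-d1) + F * exp(-r * T) * norm_cdf(d2)
    bond_yield = -log(debt_value / F) / T
    return bond_yield - r

# The spread rises with asset volatility, mirroring the observed positive
# relationship between credit spreads and implied equity volatility.
print(merton_spread(100, 80, 0.03, 0.20, 2.0))
print(merton_spread(100, 80, 0.03, 0.40, 2.0))
```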


The Corrected Frontier; DEA-Chebyshev model with an application to Financial Resource Management

Kerry Khoo

Productivity analysis is typically based on quantitative approaches. Management’s intangible concerns are often neglected due to the difficulty of quantifying them. Economists term them “nuisance parameters”, which are often treated as fixed effects or ignored entirely. These “nuisance parameters” can be a product of rapidly evolving economic, environmental and sociological conditions that are not explicitly captured in static data. Under these conditions, a corporation may appear more inefficient (or efficient) than it truly is. The issue, and the goal of this thesis, is how these (qualitative) concerns can be incorporated into productivity analysis. The DEA-Chebyshev model (DCF), an appealing technique developed in this research, incorporates qualitative information into a quantitative approach that is simple to use and easily applied, without requiring parameterization, the specification of a functional form, or a substantial amount of data. This model was tested on several test data sets and proved more accurate in approximating the “corrected” frontier than current stochastic methods, even when homogeneity is assumed in DCF.

In the economics literature, a resource-based view suggests that the efficiency of individual businesses, rather than industry structure, determines profitability. In today’s fast-paced global economy, where monetary resources are scarce and talent is in even shorter supply, corporations are forced to optimize the utilization of funds and people. Conventional tools traditionally used to measure productivity, such as ratio analyses, are no longer adequate or even applicable. Ratio analyses cannot evaluate performance on multiple parameters simultaneously and are not unit invariant, restricting the analyst to analyses based on dollar amounts or unit counts. DCF was designed to overcome these limitations and was applied to the analysis of profitability and operational efficiency of a group of corporations in the manufacturing sector. The results show that more corporations are efficient in generating profits than are operationally efficient, indicating that improvements are possible even in strong performance periods. Companies that ranked high on the Business500 list may not rank as high under DCF, due to the multidimensional nature of the production model.
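
The unit-invariance point can be demonstrated with a one-input, one-output CCR-style score (a deliberately simplified sketch with made-up figures): rescaling the input from millions to dollars changes every raw ratio but leaves the efficiency scores unchanged.

```python
# One-input, one-output CCR-style efficiency: each unit is compared against
# the best observed output/input ratio, which makes the score unit invariant.

def ccr_scores(inputs, outputs):
    ratios = [y / x for x, y in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

profit = [12.0, 20.0, 9.0]                    # output, $M (hypothetical)
assets_millions = [100.0, 125.0, 90.0]        # input, $M
assets_dollars = [a * 1e6 for a in assets_millions]   # same input in dollars

print(ccr_scores(assets_millions, profit))
print(ccr_scores(assets_dollars, profit))     # scores unchanged by rescaling
```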

Performance Analysis of Ontario Credit Unions

Peter Pille

The theoretical contributions of this work to DEA science include the development of an algorithm to find all the facets of a VRS DEA frontier. This enables the projection of inefficient units to the closest point on the efficient frontier, or to the point with the minimum sum of slacks. The proportion of full dimensional facets affects the degrees of freedom for movement along the frontier. The dimensionality of any region of the frontier can be determined and is discussed. An efficiency measure is introduced for inefficient units, relative to their closest point on the frontier. This efficiency measure can also be used with any projection to the frontier, including the traditional input- or output-oriented radial projections. With radial projections, slack is accounted for in this measure, and the measure collapses to the traditional radial measures when there is no slack.
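
In two dimensions the closest-point projection reduces to projecting a point onto a facet, i.e. a line segment (a geometric sketch with hypothetical coordinates; the thesis's algorithm enumerates all facets of the VRS frontier):

```python
# Closest point to p on a single frontier facet, modelled as the segment
# from a to b (all 2D tuples). The full algorithm repeats this over every
# facet and keeps the nearest projection.

def closest_point_on_facet(p, a, b):
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))          # clamp so the point stays on the facet
    return (ax + t * dx, ay + t * dy)

# Facet between two efficient units; an inefficient unit below the frontier.
facet_a, facet_b = (2.0, 4.0), (6.0, 8.0)
unit = (5.0, 5.0)
print(closest_point_on_facet(unit, facet_a, facet_b))  # (4.0, 6.0)
```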

The practical application part of this work is based on data from Credit Unions in Ontario. This is an important sector of the Financial Services Industry, with 1996 assets of $13.5 billion and 1.7 million members. The health of the Credit Union industry is monitored by a government agency, the Deposit Insurance Corporation of Ontario (DICO), the collaborating entity that provided data for this research. In this section, DEA models are developed to detect weaknesses in Credit Unions so that potential financial failures can be predicted. DEA with a VRS frontier and input orientation is utilized to create various production models that are compared with each other, and with current models used by DICO. In total, fifteen models are constructed and compared with DICO's current z-score model. A simple equity/asset ratio is shown to provide as good a prediction of failure as any of the other models, and is not improved upon by the z-score model. The best DEA model provides results comparable to the equity/asset ratio when the new slack-adjusted efficiency score is used to measure efficiency, particularly for larger asset sizes. For each Credit Union, a comparison is made with a peer group of efficient entities that the inefficient institution's management can emulate in order to improve its performance.

Applications of DEA to Software Engineering Management

David Reese

Software systems play an increasingly important role in organizational effectiveness, as well as in gaining competitive advantage and in differentiating organizations from their competition. This applies both to producers of software and to those organizations that utilize software. Thus, our ability to efficiently produce high quality software is a crucial factor in determining the level of success of many organizations. Unfortunately, the track record of most software producers remains poor, with only a handful of notable exceptions worldwide. In addition to new and improved methods and technology, many experts have cited the need to overcome the difficulties associated with managing the software engineering process as solutions to this so-called `software crisis'. Measurement and analysis serve a fundamental role in the suggested solutions to the software engineering and management problems. Well suited to assessing the performance of complex, multiple-input, multiple-output production processes, Data Envelopment Analysis (DEA) methods are ideal for measuring the efficiency of software production processes. DEA methods have proven to be very valuable, in practice, as a management tool appropriate for both planning and control activities.

This thesis addresses several limitations of DEA techniques that can arise through their application to software engineering management. New theoretical contributions are made in three main areas. The first area is crucial to the performance measurement process. New and enhanced models of software production are presented which divide the software production process into multiple phases and are capable of evaluating data sets containing projects with varying degrees of new and modified code. Measuring overall efficiency and effectiveness is fundamental to the management control process. This is the motivation for the second area: researching the relationship between traditional economic production measures (and definitions) and DEA multiplier flexibility. A prescriptive framework for the application of DEA models to measure overall efficiency and effectiveness is presented, along with several new DEA models important to this framework. The third area is related to the application of DEA to software project planning and presents new tools for forecasting and trade-off analysis. Existing DEA techniques are adapted for the purpose of conducting general trade-off analysis. Inherent in this analysis process is the generation of efficient project forecasts. This document, while inspired by software engineering management, contains new theoretical contributions which are not limited in their application to this domain. In particular, this applies to the framework to measure overall efficiency and the new measures that it contains. The new methods of forecasting and trade-off analysis are also applicable to domains other than software engineering where such project management tools are appropriate.

Ranking by Consensus Using One-Sided DEA

John P. Ruggieri

The extension of Data Envelopment Analysis (DEA) developed in this thesis presents a new approach for calculating a common set of weights for use in the scoring and ranking of alternatives in a project selection problem setting. In contrast to the traditional approach, where expert opinion is interpreted to derive weights for aggregating individual criterion scores of an alternative, a new development is introduced to derive these weights from empirical data, based on a set of new One-Sided DEA (OSD) models and a new Ranking by Consensus (RC) methodology.

The RC methodology was applied to four relevant problems: 1) Power Plant site selection, 2) Shortlisting Research Grant Proposals for a large Government Granting Agency, 3) Capital City site selection, and 4) Two-dimensional simulated data for illustration. The results offer new insights into the possibilities of novel tools which can be helpful in the decision making process by providing a mathematical ranking of alternatives based on a common set of weights.

This set of weights has been shown to represent the consensus opinion of the very candidates being evaluated, with the understanding that the candidates were free to selfishly select weights, with full knowledge of all other candidates' criteria scores, so as to make themselves appear as attractive as possible in the respective selection problem setting. The interpretation and implications of this common set of weights are explored and recommendations for extending the methodology in practice are made.
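
The "selfish" weight selection can be sketched over a small discrete set of weight vectors (a simplification for illustration; the OSD models optimize over continuous weights via linear programming): each candidate picks the admissible weights that maximize its own aggregate score.

```python
# Each candidate chooses, from a (hypothetical) set of admissible weight
# vectors over two criteria, the one maximizing its own weighted score --
# the self-appraisal behaviour that the consensus weights summarize.

weight_grid = [(0.8, 0.2), (0.5, 0.5), (0.2, 0.8)]

def selfish_score(scores):
    """Best aggregate score a candidate can claim, and the weights it used."""
    return max((w1 * scores[0] + w2 * scores[1], (w1, w2))
               for w1, w2 in weight_grid)

candidates = {"P1": (0.9, 0.3), "P2": (0.4, 0.8)}
for name, crit in candidates.items():
    score, weights = selfish_score(crit)
    print(name, round(score, 3), weights)   # each favours its strong criterion
```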

Inverse and Negative DEA and their application to credit risk evaluation

Paul C. Simak

With the increasing emphasis on financial institutions managing their exposures and risks, there has been an explosion of companies that develop risk management tools and methodologies. Market risk is a significant component of the enterprise-wide risk management process, but credit risk still remains the most important concern for most financial institutions. The objective of this thesis is to develop a methodology, utilizing the capabilities of Data Envelopment Analysis, that provides an accurate means of evaluating the creditworthiness of corporations. Several DEA models were developed; the best ones were identified and their optimal cut-off points found. The out-of-sample classification accuracies of the models were calculated and the Type I and Type II error trade-off examined. The DEA models outperformed the Z-Score and Shumway models in the majority of cases and gave encouraging classification accuracy results. For one of the models, out-of-sample accuracy one year prior to failure was as high as 100% for bankrupt firms and 82% for non-bankrupt firms.

This thesis introduces the concept of Negative DEA, an approach that identifies worst performers by placing them on or near the efficient frontier. The relationship between Normal, Inverse, and Negative DEA was investigated and it was found that the combined use of all three can increase the efficiency of the failure prediction process. Several theoretical advances were made in this area, developments which can be utilized in many other applications. Using Negative DEA, a layering technique was developed that does not rely on an optimal cut-off point but provides accurate classification results. In addition, the development of a DEA Search Engine is presented, which is used to estimate the market value of private companies.
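
The layering idea can be sketched in a deliberately reduced one-input, one-output setting (illustrative only; the thesis's models use multiple inputs and outputs and full Negative DEA frontiers): units are peeled into layers starting from the worst performers, so classification relies on layer membership rather than a single cut-off score.

```python
# Illustrative layering: rank firms by output/input ratio, worst first
# (mimicking Negative DEA, which places the worst performers on the
# frontier), and assign them to successive risk layers.

def layer_by_ratio(units, layers=2):
    """Assign each (name, x, y) unit to a layer, worst ratios in layer 1."""
    ranked = sorted(units, key=lambda u: u[2] / u[1])   # ascending y/x
    size = -(-len(ranked) // layers)                     # ceiling division
    return {name: i // size + 1 for i, (name, _, _) in enumerate(ranked)}

firms = [("A", 10.0, 2.0), ("B", 10.0, 9.0), ("C", 10.0, 3.0), ("D", 10.0, 8.0)]
print(layer_by_ratio(firms))   # A and C land in layer 1 (highest risk)
```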

Establishing the Practical Frontier in DEA

Taraneh Sowlati

Data Envelopment Analysis (DEA) assigns a score to each production unit (DMU) considered in the analysis. This score indicates whether the unit is efficient or not. For inefficient units, it also identifies a hypothetical unit as the target and thus suggests improvements to their efficiency. However, for efficient units no further improvement can be indicated based on a DEA analysis. Nevertheless, it is important for management to set targets for their efficient units if the organization is to improve as a whole. If the inputs and outputs of efficient units can be varied within a specified range, then it is possible to find other combinations of inputs and outputs from which new, "artificial", DMUs can be created. These DMUs are constrained to be more efficient than the DEA-efficient unit from which they were created.

This thesis presents a linear programming model and a methodology for improving the efficiency of empirically efficient units by defining a new “Practical frontier” and utilizing management input. This new frontier allows the analyst to identify adjusted efficiency scores for DMUs which were on the frontier when only real DMUs were considered. The new frontier, formed mostly from the new, artificial DMUs, thus ranks the efficient units which will now have scores less than 1.0. Available bank branch data was used to illustrate the applicability of this theoretical development.
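
A one-input, one-output sketch with hypothetical numbers shows the mechanics: an artificial DMU built from an efficient unit's data, within a management-approved input reduction, pushes the frontier out and gives the formerly efficient unit a score below 1.0.

```python
# Practical-frontier sketch: rescore the observed units after adding an
# artificial DMU derived from the efficient unit's data.

def ccr_scores(inputs, outputs):
    """One-input, one-output CCR-style scores against the best observed ratio."""
    best = max(y / x for x, y in zip(inputs, outputs))
    return [(y / x) / best for x, y in zip(inputs, outputs)]

x = [100.0, 80.0]          # inputs for branches A and B (hypothetical)
y = [50.0, 48.0]           # outputs; B has the best ratio, so B is efficient
print(ccr_scores(x, y))    # B scores 1.0

# Artificial DMU: B's output produced with a management-sanctioned 10% less input.
x_aug = x + [72.0]
y_aug = y + [48.0]
print(ccr_scores(x_aug, y_aug))   # B now scores about 0.9
```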

The examination of allocative and overall efficiencies in DEA using shadow prices, and the introduction of an omni-oriented radial DEA model

Fai K. Tam

This research presents two important theoretical developments in the field of Data Envelopment Analysis (DEA). The original models developed by Charnes, Cooper and Rhodes [CHAR78], and Banker, Charnes and Cooper [BANK84b], were both radial models. These models, and their varied extensions, have remained the most popular DEA models in terms of utilization. Radial models presented in the DEA literature employ either an input- or an output-orientation. Furthermore, the benchmark targets they determine for inefficient units are mostly based on the notion of maintaining the same input and output mixes originally employed by the evaluated unit.

New DEA models were formulated here to extend the capabilities of DEA in both of these areas. The omni-oriented radial DEA model simultaneously considers both input reduction and output expansion, and provides a single efficiency measure based on both of these goals. A methodology was also developed to estimate allocative and overall efficiency in the absence of defined input and output prices. The benchmarks determined from models based on this methodology consider all possible input mixes, output mixes, or both. Both of these developments were illustrated on a model of the financial intermediary function of a bank branch network.
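
In a one-input, one-output setting under constant returns to scale, the omni-oriented idea has a closed form (a simplified sketch, not the thesis's general model): seek the largest β such that inputs contract to x(1−β) while outputs expand to y(1+β) and the point stays inside the production set, i.e. y(1+β) ≤ ρ*·x(1−β), where ρ* is the best observed output/input ratio. Solving gives β = (ρ*x − y)/(ρ*x + y), with β = 0 for frontier units.

```python
# Simultaneous input reduction and output expansion in one dimension each:
# the largest feasible beta under constant returns to scale.

def omni_beta(x, y, inputs, outputs):
    """Max beta with x*(1-beta), y*(1+beta) still inside the CRS production set."""
    rho = max(yj / xj for xj, yj in zip(inputs, outputs))
    return (rho * x - y) / (rho * x + y)

xs = [10.0, 8.0]
ys = [4.0, 8.0]                        # second unit defines the frontier
print(omni_beta(10.0, 4.0, xs, ys))    # inefficient unit: beta ~ 0.43
print(omni_beta(8.0, 8.0, xs, ys))     # frontier unit: beta = 0.0
```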

Bankruptcy Prediction of Companies in the Retail-Apparel Industry Using Data Envelopment Analysis

Angela Tran

Since 2008, the world has been in recession. As daily news outlets report, this crisis has prompted many small businesses and large corporations to file for bankruptcy, which has grave global social implications. Despite government intervention and incentives to stimulate the economy that have put nations in hundreds of billions of dollars of debt and have reduced prime rates to almost zero, efforts to combat the increase in the unemployment rate as well as the decrease in discretionary income have struggled. It is a vicious cycle: consumers are apprehensive about spending due to the instability of their jobs and ensuing personal financial problems; businesses are weary from the lack of revenue and are forced to tighten their operations, which likely translates to layoffs; and so on. The cautious movement of cash flows is rooted in and influenced by the psychology of the players (stakeholders) of the game (society). Understandably, the complexity of this economic fallout is the subject of much attention. And while the markets have recovered much of the lost ground as of late, there is still great opportunity to learn about all the possible factors of this recession, in anticipation of and bracing for one more downturn before we emerge from this crisis. In fact, there is no time more appropriate than today for research in bankruptcy prediction because of its relevance; and in an age where documentation is highly encouraged and often mandated by law, the amount and accessibility of data are unprecedented – an academic’s paradise!

Measuring Effects of Cultural Differences in Bank Branch Performance and Implications to the Branch Integration Process in Bank Mergers

Sandra Vela

The forces of globalization and technological advances have significantly altered the structure of the Canadian banking industry. The banks’ response has been more intense performance measurement and branch consolidation to reduce costs and improve productivity. Data Envelopment Analysis (DEA) has been a popular technique for bank efficiency studies; however, DEA requires that units operate in consistent “cultures” to produce fair and comparable results. The “culture” can be defined as the firm’s unique management infrastructure and operational environment. To address this limitation, a new methodology was developed that allows for comparison between units whose performances are affected by their different cultural environments.

This research presents a framework for assessing efficiency of bank branch networks that operate under different environmental conditions and for predicting potential efficiency gains when merging culturally diverse units. Two separate cultural indices are first developed, one to capture the nature of corporate strategies (corporate index) and the other to uncover the level of services provided to customers (service index). The new theory is then applied to the merged set of branch networks of a major Bank and the largest independent Trust company operating in Canada. The cultural indices are incorporated into the culturally-adjusted (CA) DEA models to measure the relative performance of branches under different operating conditions. These results indicate that CA-DEA models are able to adjust for the systematic bias created by cultural differences. The two indices are further utilized to merge the units of distinct cultures and predict the effect on efficiency of the final merged entity.
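
Purely as a hypothetical illustration of how such an index might enter the computation (the adjustment form below is an assumption, not the thesis's actual CA-DEA model), one could rescale a branch's outputs by its cultural index before efficiency is measured:

```python
# Hypothetical sketch: a cultural index rescales raw outputs so that a
# branch in a supportive environment is discounted and one in a harsh
# environment is boosted before efficiency is computed. The index values
# and the division-based adjustment are illustrative assumptions.

def adjust_outputs(outputs, index):
    """Scale raw outputs by a branch's cultural index (index = 1 is neutral)."""
    return [o / index for o in outputs]

raw = [500.0, 120.0]                      # e.g. deposits and loans generated
favourable = adjust_outputs(raw, 1.25)    # supportive environment: discounted
unfavourable = adjust_outputs(raw, 0.8)   # harsh environment: boosted
print(favourable, unfavourable)
```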

DEA Based Analysis of Software Project Efficiency

Zijiang Yang

Modern computers have become indispensable across industries. However, increasing software costs and delivery delays have encouraged Information Systems departments to install objective measurement programs for their software projects. The traditional methods used for this purpose have a number of problems and limitations; therefore, there is a continuing need to explore new methods to measure the efficiency of the software project production process. The multi-dimensionality of software development makes Data Envelopment Analysis an attractive solution. It has the ability to handle multiple inputs and outputs. Its strength also lies in the fact that no preassigned weights on the relative importance of the inputs and outputs are required and that it reduces the multiple measures to a single efficiency score.

The objective of this work is to validate the hypothesis that DEA is a superior technique for measuring software project efficiency in an actual production environment relative to commonly used techniques. Two DEA models are developed for this purpose. The results compared favorably to those of several popular ratio analyses. The key factors that affect performance are investigated using the DEA results. In addition, the projects are segmented into three categories and analysed further.