MASc Thesis Abstracts


Performance Analysis of Large Canadian Banks Over Time using DEA
Vanita Aggarwal

 

The Schedule I Canadian banks are experiencing immense competitive pressure from other financial institutions due to the globalization of markets, regulatory changes and the economic climate. To meet these challenges, banks need to analyze and improve their performance and productivity. Currently, equity researchers use a number of technical indicators to measure the performance of these banks. These indicators give incomplete and often contradictory information about the banks' performance, especially as it relates to their productivity and efficiency. Data Envelopment Analysis (DEA) is a relatively new technique that measures the relative efficiency of each bank by comparing it to similar, best-practice banks. One of the strengths of DEA is its ability to handle the multidimensional nature of banking operations. Although Canadian banks are among the world's leading banking institutions, their performance has not been researched thoroughly from a multidimensional perspective. This state of affairs thus provides an excellent research opportunity.
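For reference, DEA efficiency scores of this kind are typically obtained from a linear program solved once per bank; the following is the standard input-oriented CCR envelopment formulation, given here as a textbook form rather than the exact model used in this thesis.

```latex
% Input-oriented CCR envelopment model for bank (DMU) o among n banks,
% with m inputs x_{ij} and s outputs y_{rj}:
\begin{aligned}
\min_{\theta,\;\lambda}\quad & \theta \\
\text{s.t.}\quad & \sum_{j=1}^{n}\lambda_j x_{ij} \le \theta\, x_{io}, \qquad i = 1,\dots,m, \\
& \sum_{j=1}^{n}\lambda_j y_{rj} \ge y_{ro}, \qquad r = 1,\dots,s, \\
& \lambda_j \ge 0, \qquad j = 1,\dots,n .
\end{aligned}
```

A bank is rated efficient when the optimal theta equals 1 (with zero slacks); values below 1 indicate how far its inputs could be contracted while still producing its observed outputs.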

 

The first step was to identify the factors that influence the performance of the banks. Two banking models are developed for analyzing the cost efficiency and the managerial efficiency of the six Canadian Schedule I banks. The unusual combination of a small number of comparable banks and a large number of input and output variables posed a unique challenge to this research. A DEA window analysis approach is used to analyze the efficiency of these banks, as well as their performance trends over the period from 1981 to 1994. The banks are compared to themselves in different time periods, as well as relative to each other in the same period. The various sources of inefficiency, such as technical, allocative and scale inefficiencies, are also identified and analyzed. The relationship of the banks' performance with prevailing economic conditions is also analyzed.

 

The Malmquist Index is used to measure the change in productivity of the banks over time. The two components of this productivity index are also separated: the "catching up" effect (the change in the relative efficiency of each bank from one time period to another with respect to its own frontier) and the "technological change" (the shift in the efficient frontier from one time period to the next).
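In the usual notation, with D^t denoting the distance function measured against the period-t frontier, the decomposition described above can be written as follows (a standard textbook form; the thesis's own notation may differ).

```latex
M\!\left(x^{t+1},y^{t+1},x^{t},y^{t}\right) =
\underbrace{\frac{D^{t+1}\!\left(x^{t+1},y^{t+1}\right)}{D^{t}\!\left(x^{t},y^{t}\right)}}_{\text{catching up}}
\times
\underbrace{\left[\frac{D^{t}\!\left(x^{t+1},y^{t+1}\right)}{D^{t+1}\!\left(x^{t+1},y^{t+1}\right)}
\cdot \frac{D^{t}\!\left(x^{t},y^{t}\right)}{D^{t+1}\!\left(x^{t},y^{t}\right)}\right]^{1/2}}_{\text{technological change}}
```

Under the usual convention, index values above 1 are read as productivity growth.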

 

Future research may include a comparison of Canadian banks to other similar banks in the US and other countries and also a similar performance analysis of Canadian Schedule II banks.

Valuing Private Companies: A DEA Approach

Burcu Anadol

Traditionally, company valuation methods are based on discounted cash flows and liquidation values, but all have a number of known shortcomings associated with them. The application of Data Envelopment Analysis (DEA) for finding comparable firms and predicting the market values of companies is an extension of the market-based approach and a departure from the traditional use of DEA for assessing relative efficiencies.

A DEA model was developed. For the inefficient units, the analysis correctly classified the market ranges for 70% of the companies. An upper bound was predicted for the inefficient companies based on their efficient peers; of these, 75% were below their upper bound. For the efficient peers, DEA was able to find a lower bound for their market value, based on the companies compared against them, 50% of the time. Although some of the ranges specified by the peers are quite wide, the method still does a relatively good job of classifying market values.

A Process Control Simulation of the USD Wire Payment System at a Major Canadian Bank

Kyla Augustine

There is an interesting parallel between the complex continuous processes found in the chemical engineering industry and those in the financial services industry. This study applies the principles of chemical process engineering to model the USD wire payment system at a major Canadian bank using CADSIM, simulation software originally designed for the pulp and paper industry. The objectives were to build a highly visual and interactive simulation model that provides a transparent overview of the wire payment system and an opportunity to evaluate process control strategies. Two financial-system-specific modules were developed: an account 'tank' and a 'payment' queue. Historical payment data was used to test the optimum queue strategy and overdraft limit for minimizing Fedwire overdraft charges. The model successfully demonstrated the merit of flexible, adjustable controls, as the results show that monthly overdraft charges could be reduced by almost $40,000 for four of the major accounts without significantly delaying payments. For cash managers at the Canadian bank, the model could be useful as a training tool for new employees or as a prototyping opportunity before large-scale IT changes are implemented in the real system.
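The model itself was built in CADSIM; the sketch below is only a minimal Python rendering of the two modules described above (an account 'tank' and a payment queue), with an invented overdraft rate and invented cash flows, to illustrate how a release threshold trades off overdraft charges against payment delay.

```python
from collections import deque

OVERDRAFT_RATE_PER_DAY = 0.36 / 100 / 360   # invented daily overdraft rate, for illustration only

class AccountTank:
    """Account balance modelled like a tank: inflows fill it, payments drain it."""
    def __init__(self, opening_balance):
        self.balance = opening_balance
        self.overdraft_charges = 0.0

    def receive(self, amount):
        self.balance += amount

    def pay(self, amount):
        self.balance -= amount

    def accrue_overdraft(self):
        # Charge accrues only if the end-of-day balance is negative.
        if self.balance < 0:
            self.overdraft_charges += -self.balance * OVERDRAFT_RATE_PER_DAY

def run_day(opening_balance, inflows, outflows, release_threshold):
    """Queue outgoing payments and release them only while the balance stays
    above release_threshold; anything still queued is paid at end of day."""
    tank = AccountTank(opening_balance)
    queue = deque(outflows)
    for cash_in in inflows:                       # interleave inflows with queued payments
        tank.receive(cash_in)
        while queue and tank.balance - queue[0] >= release_threshold:
            tank.pay(queue.popleft())
    while queue:                                  # flush remaining payments at the cut-off
        tank.pay(queue.popleft())
    tank.accrue_overdraft()
    return tank.balance, tank.overdraft_charges

# Example day with made-up figures: vary release_threshold to see the trade-off.
print(run_day(5e6, [2e6, 3e6], [4e6, 4e6, 1e6], release_threshold=0.0))
```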

A Belief Network for Prediction of Extra Cash Withdrawals for Y2K

Roya Azad

The Y2K problem was one of the world's greatest technological challenges of the 1990s, especially for the developed nations. In addition to the technical challenges, the Y2K problem could have created psychological issues for the banking system. The public's lack of knowledge of Y2K might have resulted in a wrong perception of its consequences. As a result of this ignorance, there could have been emotionally driven, irrational behaviour from the public, resulting in a run on the banks. Since the banking system relies on public confidence, it is vulnerable to such unpredictable reactions. Belief Networks (BNs), expert systems for decision-making under uncertainty, make an ideal tool for predicting the public's reaction to the Y2K problem because they can address the inherent uncertainty and volatility of that reaction.

A belief network model was developed to predict the amount of cash the citizenry would require to deal with the real or imagined emergencies resulting from the Y2K problem as January 1, 2000 approached. The final (practical) BN model proved to be a good predictor of the amount of cash withdrawals Canadians might make to deal with Y2K. This technique might be used as a decision-making tool for future events affecting the banking system, or for any other situation where the public's reactions must be predicted in some way.
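As a rough illustration of the mechanics behind such a model, the toy network below (hypothetical nodes and probabilities, not those of the thesis) computes the marginal distribution of extra cash demand by enumerating over its parents' conditional probability tables.

```python
# Toy two-link belief network: MediaAlarm -> PublicConfidence -> ExtraWithdrawal.
# All structure and numbers are invented for illustration.
p_media = {"high": 0.4, "low": 0.6}
p_confidence_given_media = {           # P(PublicConfidence | MediaAlarm)
    "high": {"shaken": 0.7, "calm": 0.3},
    "low":  {"shaken": 0.2, "calm": 0.8},
}
p_withdraw_given_confidence = {        # P(ExtraWithdrawal | PublicConfidence)
    "shaken": {"large": 0.6, "small": 0.4},
    "calm":   {"large": 0.1, "small": 0.9},
}

def marginal_withdrawal():
    """Sum over all configurations of the parent nodes (inference by enumeration)."""
    result = {"large": 0.0, "small": 0.0}
    for m, pm in p_media.items():
        for c, pc in p_confidence_given_media[m].items():
            for w, pw in p_withdraw_given_confidence[c].items():
                result[w] += pm * pc * pw
    return result

print(marginal_withdrawal())   # {'large': 0.30, 'small': 0.70} for these invented numbers
```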

The Impact of Environmental Variables on Bank Branch Performance in a Merger

Andrea Chan

A well-functioning financial system is a catalyst for economic growth and an essential ingredient of a strong commercial economy. Currently, one of the most controversial propositions on the Canadian financial landscape is the prohibition of large bank mergers. Any reform would considerably impact the entire financial services sector, and in turn the Canadian population. Accordingly, bank merger efficiency forms the topic of this research. A two-stage estimation procedure is employed to evaluate branch efficiency before and after the merger of a large Canadian Bank and a large Trust company. In the first stage, Data Envelopment Analysis is used to evaluate the operational efficiency of the branch networks of the two pre-merger firms as well as the network of the merged firm. The second stage employs limited dependent variable regression techniques to relate the branch efficiency scores to the demographic characteristics of the customer base and the environmental characteristics of the branch location. Bootstrap algorithms were used to provide statistical inference for the regression coefficients. The results indicate that, overall, efficiency gains were achieved as a result of the merger. The post-merger branch network experienced a convergence and an increase in efficiency scores. Furthermore, environmental characteristics such as market type, age, and education level of the customer base were helpful in explaining both the variation in efficiency scores and the relative change in efficiency post-merger. This suggests that the environmental characteristics of the branches and their customers are key indicators and predictors of branch efficiency as well as of merger success.
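To make the second stage concrete, the sketch below bootstraps confidence intervals for the coefficients of a regression of branch efficiency scores on environmental variables. It uses ordinary least squares on synthetic data purely for illustration; the thesis employs limited dependent variable (censored) regression, so treat this as a simplified stand-in.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: efficiency scores regressed on two made-up environmental variables.
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
beta_true = np.array([0.7, 0.05, -0.03])
scores = np.clip(X @ beta_true + rng.normal(scale=0.08, size=n), 0, 1)

def ols(X, y):
    # Least-squares fit of y on the columns of X.
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Pairs bootstrap: resample (score, environment) rows with replacement and refit.
boot = np.array([ols(X[idx], scores[idx])
                 for idx in (rng.integers(0, n, n) for _ in range(2000))])
lower, upper = np.percentile(boot, [2.5, 97.5], axis=0)
print("point estimates:", ols(X, scores))
print("95% bootstrap CIs:", list(zip(lower.round(3), upper.round(3))))
```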

Mutual Fund Performance Evaluation using DEA

Ramez T. Chehade

Mutual funds have found their way into the lives of millions of people across North America, if not as an integral portion of their personal investment strategies, then at least as a hot topic of conversation. Their growth rate is unmatched by any other investment vehicle. The 1990s witnessed the bulk of this growth, with increases in diversity, in the number of funds, and in shareholder accounts measured in the hundreds of percent. Irrespective of this growth, however, studies during the past thirty years revealed that the majority of mutual funds in both the US and Canada did not perform any better than the indices against which they were measured. For the average investor planning to take advantage of a mutual fund as a "one-stop investment", it is imperative that a performance measure not only give an accurate measure of the fund's performance to date, but also an indication of its future performance. The traditional methods used for this purpose have a number of known problems and shortcomings associated with their use, further stressing the need to explore other methods of analyzing financial data on mutual funds.

The multidimensional nature of mutual fund performance makes it a very attractive application area for Data Envelopment Analysis. The strength of this technique lies in the following attributes: its ability to handle multiple inputs and outputs, the fact that it does not require the specification of a functional form for the input-output correspondence, and the fact that it gives a single measure of performance which takes into account the multiple dimensions of organizational activity.

The goal of this work is to validate the hypothesis that DEA may be used as a tool for creating better investment portfolios while serving as a management efficiency benchmark. Production models based on the DEA methodology, designed to capture the essence of fund performance, were developed to generate efficiency scores for the majority of Canadian mutual funds and to classify the funds into portfolios. Most of the DEA model portfolios fared the same as, or slightly worse than, the Sharpe Index model portfolios. Certain DEA model portfolios, however, did significantly outperform the Sharpe Index model portfolios.
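For reference, the Sharpe Index used as the benchmark ranks fund i by its excess return per unit of total risk, a single ratio that DEA generalizes to several input and output dimensions:

```latex
% Sharpe Index for fund i: mean return \bar{R}_i in excess of the
% risk-free rate R_f, per unit of return standard deviation \sigma_i.
S_i = \frac{\bar{R}_i - R_f}{\sigma_i}
```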

The results of the DEA techniques utilized in this work invite further research in the development of even better models for evaluating mutual fund performance.

 

An Object-Based Analysis and Model of On-line Auctions

Issam Darawish

On-line auctions represent a unique form of e-commerce that employs an arbitrated pricing method for the sale of goods. The challenges of designing such systems motivated the objective of this thesis: to create an on-line auction model that can be extended to meet different trading requirements.

The thesis presents an Object-Oriented Analysis of the on-line auction domain in terms of Information, State and Process. The information analysis identifies pertinent objects from which an Information Model of on-line auctions is developed. The state analysis abstracts the behaviour of these objects by defining their relevant states, events and transitions. From this analysis, an Object Communication Model portraying object hierarchy and control exhibited by on-line auctions is created. Finally, the process analysis identifies the processing units associated with the actions of each object. Using this analysis, an Object Access Model that provides a global view of processing in on-line auctions is constructed.

Bank Branch Intermediation Efficiency Evaluation Using Data Envelopment Analysis and Non-Discretionary Variables

Barak Edelstein

This research examines the intermediation efficiency of the retail branch network of one of Canada's five largest banks, using the deposit and loan levels generated by the branches and their levels of bad loans, while at the same time incorporating data on the business environment in which they operate. Three DEA models are proposed: the first aims to maximize loan quality by minimizing the level of bad loans produced; the second seeks to maximize the levels of deposits and loans generated; and the third also maximizes the level of potential business that could be generated, but with no regard to loan quality. The results obtained from the DEA models are further broken down into market segments and geographical regions. It is concluded from the results that a smaller number of DMUs will generally lead to a higher percentage of efficient DMUs and higher average efficiency scores.

A comparison is conducted between the Banker and Morey Exogenously Fixed Variables formulation and the Cooper et al. Non-Controllable Variables formulation, in essence comparing the EMS Version 1.3.0 and DEA-Solver Version 1.0 software packages, which are based on these formulations. The main differences between the two formulations were found to be the exclusion of the discretionary slacks from the objective function of the Cooper et al. formulation, and its transformation of the non-discretionary inequality constraints into equalities. The Banker and Morey formulation and the EMS software are deemed to be better for solving non-discretionary variable models.
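For readers unfamiliar with these formulations, the Banker and Morey exogenously fixed inputs model is sketched below in standard notation (D and ND denote the discretionary and non-discretionary input sets); this is the generic textbook form, and the software packages compared above may differ in detail. In the non-controllable variant, the ND constraints are written as equalities rather than inequalities, as described above.

```latex
\begin{aligned}
\min_{\theta,\;\lambda,\;s}\quad & \theta \;-\; \varepsilon\Big(\sum_{i \in D} s_i^{-} + \sum_{r=1}^{s} s_r^{+}\Big) \\
\text{s.t.}\quad & \sum_{j=1}^{n} \lambda_j x_{ij} + s_i^{-} = \theta\, x_{io}, \qquad i \in D, \\
& \sum_{j=1}^{n} \lambda_j x_{ij} \le x_{io}, \qquad i \in ND, \\
& \sum_{j=1}^{n} \lambda_j y_{rj} - s_r^{+} = y_{ro}, \qquad r = 1,\dots,s, \\
& \lambda_j,\; s_i^{-},\; s_r^{+} \ge 0 .
\end{aligned}
```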

The Enhancement of Credit Card Fraud Detection Systems using Machine Learning Methodology

Soheila Ehramikar

Along with the rise in credit card use, fraud is on the rise. In Canada, credit card fraud occurrences rose sharply in 1998, causing $147 million in losses. To address this problem, financial institutions (FIs) are employing preventive measures and fraud detection systems, one of which is called FDS. Although FDS has shown good results in reducing fraud, the majority of cases (approximately 90%) flagged by this system are false positives, resulting in substantial investigation costs and cardholder inconvenience.

The possibility of enhancing the current operation by introducing a post-processing system constitutes the objective of this research. The data used for the analysis was provided by one of the major Canadian banks. Based on several variations and combinations of features and training class distributions, more than seventy different models were developed to explore the influence of these parameters on the performance of the desired system. The results indicate that this approach has very good potential to improve on the existing system. However, further research is required, including the development of prototype systems, which should be enhanced by more extensive and informative data.
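One simple way to act on the idea of varying the training class distribution is to re-weight the rare fraud class when fitting a classifier. The sketch below does this with scikit-learn on synthetic, invented data; it illustrates the general approach only and is not the bank's system or the thesis's exact models.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(1)

# Synthetic, highly imbalanced transaction data (only a few percent "fraud").
n = 20000
X = rng.normal(size=(n, 5))
fraud = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=2.0, size=n) > 4.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, fraud, test_size=0.3, random_state=0)

# Compare an unweighted fit with one that re-weights the minority (fraud) class.
for weights in (None, "balanced"):
    clf = LogisticRegression(class_weight=weights, max_iter=1000).fit(X_tr, y_tr)
    tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
    print(f"class_weight={weights}: true positives={tp}, false positives={fp}")
```

The trade-off seen in the output (more fraud caught at the cost of more false positives, or vice versa) is exactly the balance the post-processing system is meant to tune.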

A Data Envelopment Analysis Approach for Measuring the Efficiency of Canadian Acute Care Hospitals

Tamas Fixler

Data Envelopment Analysis (DEA) is a methodology particularly well-suited to measuring the efficiency of hospitals because it is able to accommodate multiple heterogeneous inputs and outputs in order to model the complex relationships that exist within them. This thesis uses DEA to develop two hospital efficiency models in collaboration with the Canadian Institute for Health Information (CIHI). The models are intended to illustrate the utility of DEA as a hospital performance measurement tool for CIHI and to augment their current hospital performance indicators.

The first model measures the clinical efficiency of Canadian acute care hospitals in the treatment of patients with acute myocardial infarction (AMI). The model uses inputs measuring the intensity and duration of clinical resource utilization and includes survival rate as the key output. Factors such as AMI patient volumes and the proportion of patients under the primary care of a cardiologist are shown to positively affect performance. The second model measures the overall production efficiency of acute care hospitals and uses labour and capital inputs together with outputs measuring inpatient and outpatient activity. Both models include non-discretionary variables adjusting for case mix variations among the hospitals. The models are extensively validated and each is able to identify a set of highly referenced, efficient hospitals ideal for the establishment of best practices.

Forecasting DEA Scores

David Forshtendiker

The last decade has seen significant work examining the statistical properties of the measurements yielded by Data Envelopment Analysis (DEA). However, no attempt has been made to predict DEA scores for the individual decision-making units (DMUs) being analyzed. This paper develops an autoregressive model using the Box-Jenkins methodology to predict DEA scores and tests it on data sets from three different industries: Canadian banking, U.S. major oil companies, and Japanese power producers. In the process, a novel measure for DMU comparison is developed and a new density function is introduced to show "distances" between inefficient DMUs. The model is shown to provide good in-sample fits and out-of-sample predictions for all three industries.
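The thesis fits Box-Jenkins (ARIMA-type) models; purely to illustrate the idea of predicting a DMU's next score from its own history, here is a minimal AR(1) fit by least squares on an invented efficiency-score series.

```python
import numpy as np

# Invented annual DEA efficiency scores for one DMU (illustrative only).
scores = np.array([0.78, 0.81, 0.80, 0.84, 0.86, 0.83, 0.88, 0.90])

# Fit score_t = c + phi * score_{t-1} by ordinary least squares.
y, x = scores[1:], scores[:-1]
A = np.column_stack([np.ones_like(x), x])
c, phi = np.linalg.lstsq(A, y, rcond=None)[0]

forecast = c + phi * scores[-1]            # one-step-ahead prediction
print(f"phi={phi:.2f}, forecast of next score={forecast:.3f}")
```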

Machine Learning Applications in the Canadian Mutual Fund Industry

Steve French

Using a novel and recent dataset from the Canadian mutual fund market, this thesis examines the application of machine learning algorithms to better understand the behaviour and drivers of fund-level asset flows. Learning schemes from the data mining literature were shown to outperform traditional asset flow models by a wide margin. In particular, attribute selection techniques were introduced and shown not only to improve predictive accuracy, but also to reduce the dimensionality of the problem.

Previously documented characteristics of the "Performance-Flow" relationship were also reassessed through the machine learning algorithms. Attribute selection techniques confirmed that risk-adjusted measures of performance are a stronger factor in asset flows than raw measures; this is contrary to the findings of many past studies. We believe these findings are a result of the increased access retail investors have to sophisticated fund analytics through the internet.
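A minimal sketch of the attribute selection step, using invented fund attributes and synthetic flows: candidate predictors are ranked by a univariate score and only the strongest are passed to the flow model. This illustrates the general technique rather than the thesis's specific learning schemes.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_regression
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)

# Invented fund attributes; only the first two actually drive flows in this toy data.
names = ["risk_adjusted_return", "raw_return", "mer", "fund_age", "noise"]
X = rng.normal(size=(500, len(names)))
flows = 2.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=1.0, size=500)

# Keep the two attributes with the strongest univariate relationship to flows.
selector = SelectKBest(score_func=f_regression, k=2).fit(X, flows)
kept = [n for n, keep in zip(names, selector.get_support()) if keep]
model = LinearRegression().fit(selector.transform(X), flows)
print("selected attributes:", kept,
      "R^2:", round(model.score(selector.transform(X), flows), 2))
```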

Identifying Collateral Minimization Opportunities for a Canadian Bank in the Large Value Transfer Systems Using Process Control Techniques

Sukrit Ganguly

The Large Value Transfer System (LVTS) is one of the most important payment systems in Canada, through which fifteen members (including the Bank of Canada and the major Canadian bank being studied) send money to each other. The Canadian bank studied currently processes outgoing payments on a First-In First-Out basis and only employs manual controls to manage liquidity needs for large "Jumbo" transactions (those exceeding 100 million Canadian dollars). This time-consuming manual process does not optimize liquidity usage, which results in excess collateral being tied up in the LVTS at a significant cost to the Bank.

This work developed a real-time controller, similar to those in use in continuous process manufacturing systems, using an engineering software package, CadSim. The controller manages the Canadian Bank's payments process automatically to optimize its liquidity and collateral needs and to provide an overview of its role in the LVTS. The controller's efficiency was tested with the Bank's historical data, and it was found that collateral requirements can be reduced by 20% with only minor payment delays. The reduction in liquidity needs will free up valuable collateral for the Bank and hence reduce the opportunity cost of using the LVTS.
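The controller was implemented in CadSim; the fragment below is a simplified Python rendering of the control rule described above (release non-Jumbo payments FIFO, hold Jumbo payments until incoming liquidity can absorb them), with invented amounts.

```python
from collections import deque

JUMBO = 100e6        # "Jumbo" threshold from the abstract (CAD 100 million)

def settle(events, collateral):
    """events: time-ordered list of ('in', amount) receipts and ('out', amount)
    payment requests. Non-Jumbo payments go out FIFO; Jumbo payments are held
    until the running liquidity position can absorb them."""
    position = collateral
    held = deque()
    delays = 0
    for kind, amount in events:
        if kind == "in":
            position += amount
        elif amount < JUMBO or position - amount >= 0:
            position -= amount                     # send immediately
        else:
            held.append(amount)                    # hold the Jumbo payment
            delays += 1
        while held and position - held[0] >= 0:    # release held payments once funded
            position -= held.popleft()
    return position, list(held), delays

# Invented day of traffic: one Jumbo payment is briefly delayed, saving collateral.
events = [("out", 40e6), ("out", 150e6), ("in", 120e6), ("in", 60e6), ("out", 20e6)]
print(settle(events, collateral=80e6))
```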

Impact Study of Multimedia Information Technology on Customers Service Employees in a Video Banking Kiosk

Ani Ghazarian

It is expected that by the latter half of the 1990s, the concept of "staff-less" banking will be generally accepted as a delivery vehicle capable of providing additional non-cash services not currently delivered through ATMs or telephone banking. This add-on services market was the prime motivator for the development of the Royal Bank's "Video Banking Services" (VBS). The research for the VBS pilot, sponsored by Bell Canada (Stentor) and the Royal Bank of Canada, provided an analysis of the effect of multimedia information technology on service employees, or "Video Banking Representatives" (VBRs). Video recordings were taken of the VBR/workstation interaction during customer sessions.

We developed a methodological framework for taking measurements from the recordings and analyzed the measurements in four areas of this thesis:

1. We found the technology learning curve for a novice VBR using the VBS workstation was 90.3%, which is well within the typical range of learning rates (between 88% and 92%); a worked example of the learning-curve calculation follows this list.
2. We attempted to establish average session times for VBS service delivery and found some fluctuation between products, but both VBRs had their longest average times during Loan sessions, while their shortest average times were during General sessions.
3. The proxy chosen for service quality was the amount of time the VBR spent looking at the customer. An experienced VBR looked at the customer for more than 50% of the time.
4. We observed that the workstation was designed such that, on average, the experienced VBR's ratio of eye movements made away from the camera to total eye movements during a customer session appeared similar for all product types. The ratios indicated that just over half of the eye movements were required to deal with the workstation computer monitors.
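Regarding item 1, the standard log-linear learning-curve model relates the time for the n-th session handled to the time for the first, so a 90.3% learning rate means that each doubling of the number of sessions multiplies the session time by about 0.903:

```latex
T_n = T_1\, n^{b}, \qquad
b = \frac{\ln(\text{learning rate})}{\ln 2}
  = \frac{\ln 0.903}{\ln 2} \approx -0.147 .
```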

For the investigated areas, we recommend that more recordings be taken of the VBRs performing tasks on the workstation, to further strengthen the results. Further, we recommend that additional novice and experienced VBRs be studied to determine accurate measures for the areas investigated.

Web-Based Calculator for Residential Energy Conservation

Pulkit Gupta

A large Canadian financial services institution (FSI) is planning to develop a web-based application aimed at helping homeowners calculate the financial and environmental impacts of potential energy-conserving upgrades to their dwellings. The algorithm for this calculator, the questions to be posed to the homeowners, and the means by which homeowners can access some of the more scientific energy-related information are presented. The potential upgrades considered were: furnace efficiency, heat pump efficiency, programmable thermostats, window efficiency, building insulation, lighting efficiency, and refrigerator efficiency. The algorithm developed was used to demonstrate that changing just one of the input variables can, in certain cases, have a drastic effect on the resulting output: upgrades with a positive net present value (NPV) can drop to a negative NPV, and in certain cases CO2 emissions can increase as a result of the upgrade considered. The effects of future changes in fuel prices, and of the price levied on CO2 emissions, are also demonstrated.
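A minimal sketch of the kind of calculation involved (all costs, savings and rates below are invented, not the calculator's actual figures): the net present value of an upgrade can flip sign as a single input, here the assumed fuel price, changes.

```python
def npv(upfront_cost, annual_saving, years, discount_rate):
    """NPV of an upgrade: cost paid today plus discounted yearly energy savings."""
    return -upfront_cost + sum(annual_saving / (1 + discount_rate) ** t
                               for t in range(1, years + 1))

# Hypothetical furnace upgrade: $3,500 installed, 15-year life, 5% discount rate.
for fuel_price_multiplier in (1.0, 1.5):
    saving = 320 * fuel_price_multiplier          # assumed annual fuel saving in dollars
    print(f"fuel price x{fuel_price_multiplier}: NPV = "
          f"{npv(3500, saving, 15, 0.05):.2f}")   # negative at x1.0, positive at x1.5
```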

Efficiency in the Canadian Insurance Industry: A Data Envelopment Analysis Approach

Allison Hewlitt

The objective of this work is to provide an alternative means of measuring the productive efficiency of insurers in the property and casualty (P&C) insurance industry. Currently, there are several indicators used to measure insurer performance; however, there are several limitations associated with these indicators. Data Envelopment Analysis (DEA) is used to measure the relative efficiency of each insurer by comparing it to a similar "best practice" insurer. One of the strengths of DEA is its ability to handle multiple input and output scenarios, as is the case in the insurance industry.

Two models are developed to capture selected business processes of 120 insurers in the P&C industry. Average pure technical and scale inefficiencies in the industry are determined as 0.87 and 0.90, respectively, based on the operational performance of insurers. In terms of the investment performance of insurers, average technical efficiency scores under variable and constant returns to scale assumptions are determined as 1.93 and 2.22, respectively. The results of this research suggest that variable returns to scale exist in the industry. Insurers, grouped according to specific characteristics, are analyzed in an attempt to determine factors that may affect efficiency.

Bank Branch Profitability and Productivity: A Domestic and International Study using DEA

Denise Ho

Globalization presents corporations with the opportunity to expand beyond their traditional reach, but also increases the level of competition in their established markets. Retail banking markets have become more integrated as governments ease their regulations allowing foreign competition, and technological infrastructure facilitates information exchange within multi-national corporations. Clearly there is a need to analyze the performance of banks and banking systems internationally. While a number of institution-level studies have been published, only one branch-level international study has been published to date.

This research uses Data Envelopment Analysis to evaluate bank branch performance in seven national branch networks, within one geographical region, operated by a multi-national financial services corporation ("The Bank"). In agreement with the focus of other measurement techniques at The Bank, the analysis was conducted in two dimensions: profitability and productivity. The domestic profitability model evaluates branches within their own country; only best practice branches are included in a second, regional model in order to identify the best practice branches and countries within the region as a whole. The same methodology is applied to a productivity model.

This research is the first study to isolate the effects of a country's culture, operating environment, and regulatory policy on branch performance. This was a unique opportunity because all the branches are owned by The Bank, which imposes its management philosophy equally on all of its subsidiaries. Examining branches providing standardized products and services, operating under shared corporate values and goals, and using identical information systems, removes managerial and corporate disparity. It can therefore be deduced that differences in performance are due to either branch management or country effects.

These DEA models provide results applicable to multiple levels of management at The Bank. Branch management is provided with best practice branches within their own country as well as a list of their sub-optimized operations in order to improve domestic branch performance. In addition, top performing branches and countries are identified to senior management in order to derive additional information for strategic decisions.

Assessing Financial Risk Tolerance of Portfolio Investors Using DEA

Parisa Hosseini Ardehali

The task of assigning investors to the right risk tolerance category, and therefore suggesting the most suitable investment portfolios to them, is an essential and legally mandated part of the mutual fund providers' service. Properly executed, such a screening system should prevent unexpected or unbearable losses to the client. In fact, such safeguards would measurably improve the quality of the service. At the same time, if it is designed in the appropriate manner, it satisfies the requirements of the "know your client" rule. Although this problem is significant, not much has been done in the academic world to address the need.

We know that the three main steps of asset allocation are:

  • Analysing clients' needs and constraints to determine risk tolerance,
  • Determining the risk/return attributes of available investments,
  • Assigning the maximum return portfolio with a bearable level of risk to the client.

This work concerns the first step.

Risk tolerance is a multidimensional aspect of human behaviour. Grable and Lytton identified some of its dimensions as guaranteed versus probable gambles, general risk choice, choice between sure loss and sure gain, risk as related to experience and knowledge, risk as a level of comfort, speculative risk and prospect theory [GRAB99a]. While looking at each of these dimensions alone cannot lead to a fair assessment of how a person feels about taking financial risk, a wider view, which looks at all of the dimensions at the same time, can. That is why Data Envelopment Analysis was adopted for this purpose in this work.

DEA is a non-parametric LP optimization technique which is often used for measuring efficiency in service production units. What makes it appealing for risk tolerance assessment is its ability to handle multiple variables at the same time. Just as it measures efficiency based on several inputs and outputs, it can measure risk tolerance based on several psychological questions, each of them representing one dimension of risk tolerance. This is the first time DEA has been used for evaluating a psychologically oriented dataset, and the results have been quite encouraging.

Merger-Related Productivity Gains in the Canadian Banking Industry

Sarah Kim

In this study, merger-related productivity growth in a Canadian bank is investigated using four non-parametric frontier techniques. Employing Data Envelopment Analysis, efficiency improvements achieved through the merging of retail-banking branches are assessed by (i) calculating the pre-merger branch efficiency in 1999-2000, (ii) calculating the post-merger branch efficiency in 2002-2003, and (iii) comparing the pre- and post-merger results. Malmquist indices are calculated and decomposed to measure technical efficiency change and technological change. The results indicate that, on average, branches experienced technical efficiency gains after the merger. Also, in the majority of cases, branches that were closed and merged with surviving branches became more scale efficient. Moreover, branches that were directly affected by merger activities experienced greater efficiency and productivity gains than branches that were not directly affected. The results suggest that the newly merged bank has been able to capitalize on the opportunity to reduce operating costs, optimally re-allocate staff and realize synergies.

 

Software Size Measurement Issues in an Enterprise Systems Environment

Atin Kulkarni

A fundamental requirement for improving software development performance is to correctly measure it. From a project management point of view, it is important to know the size of the system to be developed, as accurately and early in the Project Life Cycle as possible. The Function Point (FP) metric was developed at IBM as an effective size measure. In this thesis we critically evaluate FP in a corporate systems environment.

A number of models were developed to understand the relationship between software size, measured in Unadjusted Function Points (UFPs), and development effort, using a large dataset from a Canadian Bank. Linear models were not found to satisfactorily describe the underlying relationship, whereas log-linear models appeared valid in almost all the cases. Overall, it was found that UFPs account for a substantial portion of effort, as much as 80% in one type of project. In one case, however, UFPs accounted for only 25% of the effort.
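The log-linear models referred to above have the general form below (notation illustrative), so effort grows as a power of size rather than in direct proportion to it:

```latex
\ln(\text{Effort}) = a + b\,\ln(\text{UFP})
\quad\Longleftrightarrow\quad
\text{Effort} = e^{a}\;\text{UFP}^{\,b}
```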

From an effort estimation point of view, we hypothesized that models derived by disaggregating UFPs might be more useful than those based on UFPs alone. Indeed, this was found to be the case. We also expected that the weights would not add useful information to the estimation models, and we confirmed this by comparing weighted and unweighted models. Variability in the manual FP counts due to the subjective decisions made by the analyst is an issue of serious concern. Automation of the FP counting process was found to be a promising direction for solving this problem. A number of conceptual and practical issues related to automation were identified in this research.

Object Oriented Development (OOD) is fast replacing the traditional procedural paradigm. Traditional metrics, including FP, do not appear to adequately represent the fundamental OO concepts. A number of new metrics have been proposed exclusively to measure OO software size. We developed an automated system to count a variant of traditional FPs, the Class Function Points (CFPs), as well as a new set of OO metrics, the C-K metrics. The C-K metrics were found to be much easier to count automatically and more accurate than the CFPs.

Evaluation of Bank Branch Growth Potential Using Data Envelopment Analysis

Alex E. LaPlante

The main objective of this study is to evaluate the branch growth potential of a major Canadian Bank using a series of novel Data Envelopment Analysis (DEA) models. The approach used investigates growth efficiency from four different perspectives using six unique models. The results of each model are analyzed using various segmentations and their functionality assessed. Along with a comprehensive review of the DEA literature and methodologies, a synopsis of the Canadian banking industry is presented and a general overview of the bank being studied is provided. Details pertaining to the more than 1000 branches evaluated, the regional and size segmentations used and the unique methodologies applied to each model are also discussed.

 

Software Production Efficiency without Size Parameters using DEA

Christopher McEachern

Software has facilitated unprecedented growth, innovation, technological advancement and globalization across essentially all industries and markets. In particular, it plays a pivotal role in the delivery of services and operations in the financial sector. Productivity improvement, however, continues to lag behind all other aspects of computing advances. Arguably, this is due to increasing complexity in the problems and domains to which software is being applied. While several productivity measures of the software process exist, more sophisticated analysis techniques are required to more completely represent the multidimensionality of the production function.

This research uses Data Envelopment Analysis (DEA) to evaluate twenty-six software projects completed at a Canadian bank. The analysis uses an internal risk and complexity assessment as a measure of software development output, in conjunction with customer satisfaction. Three models are included in this study, each focusing on a different facet of the software process: time-to-market, cost estimation, and cost minimization.

This research is a continuation of software productivity studies at the Centre for Management of Technology and Entrepreneurship, with a focus on including the dimensions of risk and complexity in DEA analysis of the software production function. The study is unique for several reasons. Firstly, very few DEA studies have accounted for risk/complexity in modeling software development, at least to the knowledge of this author. In addition, it uses budget variance as a proxy for measuring cost estimation and cost minimization efficiency. Finally, it is also unique because it does not rely upon an explicit measure of size to account for software development output.

The results provide the Bank with three views of their software production function. Each view defines a set of best practice projects, which less efficient units should strive to emulate. In addition, contextual data was used to evaluate differences in efficiency across the level of vendor contribution on the project, the level of business risk, the sponsoring stakeholder group and so forth.

Investigating the Payback of Information Technology Investments at a Large Canadian Financial Institution

Brent Gordon McGaw

There is a growing industry concern that investments in Information Technology are not "paying off." The objective of this thesis was to determine whether four project characteristics (Project Size, Business Unit (Group) initiating a project, Management Objective and Application Domain) could provide useful information for predicting the variability of different financial forecasts (Benefits, Operating Costs, Development Costs, and Depreciation Costs). Using a database of Information Technology investments, which contained forecast estimates prior to project development and new estimates upon completion, it was established that a portfolio approach to project management, together with knowledge of project characteristics, could indeed provide useful insights. The Business Unit and Application Domain characteristics produced important predictive information with respect to Benefits, Development Costs, and Operating Costs. Size and Management Objective only provided useful insights with respect to Benefits. None of the characteristics provided information with respect to Depreciation variability.

Evaluating Customer Service Representative Staff Allocation and Meeting Customer Satisfaction Benchmarks: DEA Bank Branch Analysis

Elizabeth Jeeyoung Min

The main objective of this study is to evaluate one of Canada's 'Big Five' banks' Customer Service Representative (CSR) allocation model for their branches by (1) evaluating the efficiency of their national branch network in the context of employment only and (2) evaluating the efficacy of branch operations in meeting the desired service time benchmarks. The study employed a non-parametric, fractional linear programming method, Data Envelopment Analysis (DEA), and in particular the non-controllable variables included in the Constant Returns to Scale (CRS) and Variable Returns to Scale (VRS) models.

Comparison of Long-Term Investments in Single-Family Housing with Stocks, and Fixed-Income Securities Markets

Susan Mohammadzadeh

The historical long-term volatility and return on investment in single-family dwellings was investigated and compared with investments in the equity, bond and T-bill markets. Total return indices for equity and fixed-income securities were obtained from available sources; however, a proper index for measuring long-term changes in house prices was unavailable. In an effort to measure house price changes, a relatively homogeneous pool of houses in the downtown Toronto area was selected and its price tracked over the study period of 44 years. Since inflation affects the returns of all these investments similarly, it was not considered in the calculations. Comparing the investment of cash in one's family home with other investment vehicles showed that the ratio of investment growth to its volatility for a single-family house exceeded the ratios for the other investments by a large margin. As future work, there is potential for developing a derivatives market on residential real estate; the house price index built in this study lays the foundation for that.

A Bayesian Belief Network for Corporate Credit Risk Assessment

Rinku Pershad

The traditional methods used for credit risk assessment are based on financial data and have a number of shortcomings associated with them. The application of Bayesian Belief Networks (BBNs) to decision-making under uncertainty is widespread, and the uncertainty and subjectivity inherent in assessing corporate risk make it an ideal application area.

A BBN was developed to assess the credit rating and credit grade for a group of companies operating in the U.S. retail industry. The revised BBN outperformed the original BBN by correctly classifying 34% of the credit ratings and 59% of the credit grades as compared to 26% of the credit ratings and 40% of the credit grades in the original BBN. Also, the results suggest that qualitative information on a company has much influence in the assessment of credit risk. The encouraging results of using BBNs in credit risk assessment invite research in the development of better models, using larger and more representative data sets for improved analysis.

Applications of Data Envelopment Analysis to Measure the Efficiency of Software Production in the Financial Services

David N. Reese

Canadian banks' Information Systems Departments spend hundreds of millions of dollars a year. One of the significant issues facing banks is how to measure and improve the efficiency and productivity of the people and processes used in the industry. Currently, many firms use a large assortment of performance ratios to assess the success of software projects. Data Envelopment Analysis is an alternate technique that uses comparisons of specific decision making units (DMUs) to best practice units to assess the relative efficiency of software project teams. One of the main strengths of DEA is its ability to handle the multi-dimensionality of software production processes.

Several DEA models were built to analyze the data collected from the Toronto Dominion Bank. The technical and overall efficiency of completed software projects was measured. Further, the different behaviours of project efficiencies were examined by comparing the results of different technical and overall efficiency models.

The ability to measure and then decompose overall efficiency into technical and allocative efficiency provides management with significant information corresponding to the different sources of inefficiency. As a result, such models can make a significant addition to the management control systems of both software production and other outputs of various service industry firms.

Performance Analysis for Engineering Teams at Bell Canada

Sandra Rehm

Since the long distance telecommunications market was deregulated in 1992, Access Network engineers at Bell Canada have had to change the paradigm within which they manage and deploy assets. This research, sponsored by Bell Canada, conducts an efficiency analysis of Access Network engineering teams at Bell using Data Envelopment Analysis. We developed two DEA models for measuring the efficiency of engineering teams: a Full Activity Model and an Economic Model. The measures in the models were formulated in collaboration with management at Bell. We collected 1994 data for each model for a total of 39 teams in Ontario and Quebec.

We also developed the Reduced Activity Model as a version of the Full Activity Model which excludes held orders as an input factor. The analysis of the Reduced model versus the Full model enabled the identification of teams that exhibit particularly low levels of held orders as well as those that exhibit very high levels. The results of the unconstrained Economic Model reveal how DEA selects, for each team, the factor weights that provide the highest possible technical efficiency score. In order to prevent the assignment of unrealistically high efficiency scores, the factor weights were bounded using managerial information regarding the relative importance of the input factors. This information revealed that managers were equally interested in minimizing held orders and costs. The efficiency scores obtained from the constrained model were found to be lower than those obtained using the unbounded model. We also analyzed the effect on efficiency scores of a shift in focus towards a higher emphasis on cost minimization over held order avoidance. We found that the average efficiency score increases from 0.65 under the actual managerial preferences to 0.76 under the hypothetical policy.

We also found that the peer groups obtained from the Economic Model do not align with the urban and non-urban team categories established by management. Also, on average, the urban teams were found to be more efficient than the non-urban teams. Finally, we recommend that teams 11, 13 and 19 be investigated by management to identify successful practices that could be implemented by the less efficient units.

Two Stage Evaluation of Bank Branch Efficiency Using DEA

S. Rouatt

With increased foreign and alternative-channel entrants in the Canadian banking industry, the domestic banks must be run as efficiently as possible in order to remain competitive. More detailed performance measurements are now required to identify underperforming areas and to help cut costs and improve efficiency. Data Envelopment Analysis (DEA) has been successfully applied in many efficiency evaluation programs in the banking industry; however, traditional DEA analyses have focused on measuring only one performance aspect of a banking system. Bank management requires a more comprehensive performance measure that gives them a more complete picture of their branch systems' performance.

This research presents a new two stage DEA model for evaluating the efficiency of a bank branch network in multiple dimensions. The first stage involves separate DEA analyses using three different branch performance models - productivity, profitability, and intermediation.

In the second stage, a new concept is introduced: the results of these initial models are combined to provide a measure of overall efficiency and to allow the individual ranking of the branches. The top branches, identified by the second-stage rankings, can then be used by management as benchmarks for the creation of "templates" of what new branches should be modelled after.
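To make the combined second stage concrete, the sketch below treats each branch's three stage-1 scores as the outputs of a CCR model with a single unit input and solves it with scipy. The branch scores are synthetic and the formulation is a generic illustration of the idea, not necessarily the thesis's exact combined model.

```python
import numpy as np
from scipy.optimize import linprog

# Stage-1 scores (productivity, profitability, intermediation) for five branches;
# synthetic numbers standing in for the first-stage DEA results.
Y = np.array([[0.92, 0.71, 0.85],
              [0.80, 0.95, 0.78],
              [0.65, 0.60, 0.70],
              [0.88, 0.90, 0.93],
              [0.75, 0.82, 0.66]])
n, s = Y.shape

def combined_score(o):
    """Input-oriented CCR score for branch o, single unit input, stage-1 scores as outputs."""
    c = np.r_[1.0, np.zeros(n)]                       # minimise theta
    A_ub = [np.r_[-1.0, np.ones(n)]]                  # sum(lambda) <= theta  (unit input)
    b_ub = [0.0]
    for r in range(s):                                # sum(lambda * y_r) >= y_ro
        A_ub.append(np.r_[0.0, -Y[:, r]])
        b_ub.append(-Y[o, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub), method="highs")
    return res.x[0]

for o in range(n):
    print(f"branch {o}: combined efficiency = {combined_score(o):.3f}")
```

The resulting scores can be sorted to produce the system-wide ranking of branches described below.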

This new two-stage model is then applied to the national retail branch network of one of the five largest Canadian banks. This novel approach is used to gauge the performance of the individual branches and to provide recommendations for improvement directly to the branch managers. The effects of market size and geographical region on branch performance are also examined. The stage-two results are then presented to senior management to provide a system-wide ranking of branches.

 

Determining Optimal Fibre-Optic Network Architecture Using Bandwidth Forecast, Competitive Market, and Infrastructure-Efficient Models to Study Last Mile Economics

Muhammad Osamah Saeed

Human communication has come a long way and is ever evolving. It changes not only the way people interact with one another, but also the world around them. The Internet, a virtual mesh that facilitates nearly every aspect of our daily lives, is maturing to the point where the existing telecommunications infrastructure can no longer support it. Besides depending on it for paying our bills and reading the daily news, we look to it for entertainment such as instantly streamed videos, songs and other bandwidth-heavy applications delivered straight to our personal computers and mobile devices. Consistent with other estimates, this thesis predicts that demand for bandwidth will exceed 100 Mbps by the year 2012 for high-end power users, and that demand for different bandwidth offerings is converging towards higher-speed offerings.

 

The Use of Data Envelopment Analysis in the Measurement of Software Development Team Performance: A Quality-Focused Approach

Dwight Schmidt

For much of its existence the software industry has been plagued by shortcomings in its ability to consistently develop effective products in an efficient manner. This has resulted in budget and schedule overruns, unmet user needs, and even unusable applications, contributing to dissatisfied clients and, most importantly, lost business. In order to improve on these shortcomings, effective means of measuring software development efficiency must be developed. This thesis uses a new approach to gauging productive efficiency, Data Envelopment Analysis (DEA), and applies it to the measurement of the software development process. DEA is able to incorporate the multiple inputs and outputs of a process to arrive at relative efficiency ratings for individual decision making units (DMUs).

New DEA models were constructed for the software process and applied to project data provided by a large Canadian Financial Institution (FI). One of the main differentiators between these models and most other DEA models examining the software process is the use of quality metrics, namely client satisfaction survey scores measuring software team performance. Uniquely, size metrics were left out of the analysis. Traditionally, software size has been the main variable in software performance measurements, but this thesis purposefully explores a quality-focused approach.

The technical and scale efficiencies of the software projects are analyzed using two separate DEA models. The characteristics of efficient projects are presented, and relationships among quality, efficiency, and other projects factors are investigated.

Credit Card Fraud Detection Via the Application of Neural Networks

Paul Seethaler

At the end of 1992, fraud losses on MasterCard and VISA cards in Canada amounted to over $100 million, and the rate of loss is increasing. Although the cardholder has some liability when a card is lost, it is limited to $50; the majority of the loss from fraud is carried by the financial institution (FI) which issued the card. As traditional methods lose their effectiveness, and raising the stakes through updated authorization procedures is exceedingly expensive, FIs are continually looking for new tools to catch fraud early in its brief cycle of 72 hours.

Different methodologies such as statistical analysis, expert systems, and Neural Networks can be utilized to address the problem of flagging potentially fraudulent activity in credit card accounts. In reality, FIs are using simplistic reporting algorithms to select accounts for investigation, and they rely heavily on cardholders contacting the FI, reporting lost or stolen cards and disputing transactions on the monthly statement. The initial tool employed to flag fraud is the foundation for success in actively fighting fraud; hence its flagging accuracy is of paramount importance.

Neural Networks were selected for this thesis to explore the merits of this methodology in fraud detection. The data used in this research comprised one month's worth of fraudulent and legitimate cardholder activity, extracted from the collaborating FI's computer system. Three standard back propagation neural network architectures were developed. A number of parameters were varied, such as the sizes of the three main network architectures and the weight decay values; the sections of the dataset were also rotated to produce training, validation and test sets. All of these variations were employed to explore how the parameters influence the performance of the network. The prototypes were implemented, trained and tested using the public domain Xerion Neural Network simulator.
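The experiments were run in the Xerion simulator; the sketch below reproduces the experimental knobs mentioned above (network size and weight decay) in modern terms, using scikit-learn on synthetic data, where the alpha parameter plays the role of weight decay. It illustrates the parameter sweep only and is not the thesis's networks or data.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)

# Synthetic transaction features with a rare "fraud" class (invented data).
X = rng.normal(size=(5000, 8))
y = ((X[:, 0] - X[:, 3] + rng.normal(scale=1.5, size=5000)) > 2.5).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Vary hidden-layer size and weight decay, as in the experiments described above.
for hidden in ((8,), (16,)):
    for decay in (1e-4, 1e-2):
        net = MLPClassifier(hidden_layer_sizes=hidden, alpha=decay,
                            max_iter=500, random_state=0).fit(X_tr, y_tr)
        print(hidden, decay, round(net.score(X_te, y_te), 3))
```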

The results show that Neural Networks can be employed to detect fraud earlier than other existing methods. It was also discovered, however, that further investigation would be required before a commercial installation can be contemplated. To advance this work to a commercial level, a more elaborate and rigorous data collection effort is required.

A Dynamic Pricing Service For The Multi-Channel Retail Bank With Applications In General E-Commerce

Michael Serbinis

This research is an investigation of new tools for use in the E-Commerce sales channel in the banking industry. The work specifically focuses on the pricing of retail banking products at the point of sale in any of the E-Commerce channels. The dynamic pricing problem is investigated considering the stated goals of a customer-centric bank: to maximize customer profitability across all sales channels. A dynamic pricing model is proposed and compared against traditional pricing models. It was hypothesized that a dynamic pricing model based on channel costs, customer profitability and the value of a new sale opportunity should always result in a more profitable customer base.

It was found that dynamic prices perform better from a profitability perspective than all other pricing models. It was also found that such pricing strategies based on future profitability perform better than the same methodology based on past profitability. This is especially true in a low profitability customer base where the gains realized by using a dynamic pricing method are very significant (+1,224%). In this case, the model outperformed the optimal flat discount strategy by nearly 100%. In the high profitability customer base (+21%), the effect was similar, only to a lesser degree, outperforming an optimal flat discount by 40%.

Hence, the hypothesis is proven and, in fact, it is shown that substantial competitive opportunities exist in exploiting this approach in a number of businesses such as Telecommunications/Internet Communications and general E-commerce. Furthermore, the mathematical model and simulation technique used in this research can be utilized, although more development is desirable, to determine the effect of dynamic prices on specific businesses using actual business data, as well as to develop production-ready dynamic pricing services.

Ratios in DEA and How These May Be Useful Predictions of the Future

Sanaz Sigaroudi

In standard Data Envelopment Analysis (DEA), the strong disposability and convexity axioms, along with the variable/constant returns to scale assumption, provide a good estimation of the production possibility set and the efficient frontier. However, when the data contains some or all measures represented by ratios, standard DEA fails to generate an accurate efficient frontier. This problem has been addressed by a number of researchers, and models have been proposed to solve it. This thesis proposes a "Maximized Slack Model" as a second stage to an existing model. The work implements a two-phase modified model in MATLAB (since no existing DEA software can handle ratios) and, with this new tool, compares the results of the proposed model against those of two other standard DEA models for a real example with ratio and non-ratio measures. Different approaches are then proposed to obtain a close approximation of the convex hull of the production possibility set, as well as of the frontier, when ratio variables are present on the side of the desired orientation.

DEA Based Analysis of Corporate Failure

Paul C. Simak

Anyone who is planning to invest in a company, whether in debt or equity, wants to make sure that the risk is acceptable for the returns expected and that the progress of the investee company continues as anticipated. The increasing number of corporate bankruptcies in the 1990s has re-emphasized the need for research in the area of identifying early warning indicators of corporate distress. The traditional methods used for this purpose have a number of known problems and shortcomings associated with them, and there is a continuing need to explore other methods of analyzing financial data. The multidimensional nature of corporate performance makes it a very attractive application area for Data Envelopment Analysis. The strength of this technique lies in its ability to handle multiple inputs and outputs, the fact that it does not require the specification of a functional form for the input-output correspondence, and the fact that it gives a single measure of performance which takes into account the multiple dimensions of corporate activity.

The goal of this work was to validate the hypothesis that DEA can be used as a tool for predicting future corporate distress. Production models based on the DEA methodology were developed and used to predict the financial viability of firms based on their historical financial data. The classification accuracies of the DEA models compared very favourably to those of the popular "Z score" model, and in most cases, the model accurately picked out companies which showed signs of distress as early as 3 years prior to failure.
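For reference, the "Z score" benchmark is usually quoted in its original Altman (1968) form, given here for context rather than taken from the thesis:

```latex
% Altman (1968) Z-score:
Z = 1.2\,X_1 + 1.4\,X_2 + 3.3\,X_3 + 0.6\,X_4 + 1.0\,X_5
% X_1 = working capital / total assets,   X_2 = retained earnings / total assets,
% X_3 = EBIT / total assets,              X_4 = market value of equity / total liabilities,
% X_5 = sales / total assets;  lower Z signals greater distress risk.
```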

The successful results of the DEA technique found in this work invite further research in the development of even better models for predicting corporate distress using larger data sets for specific industries.

 

Using Data Envelopment Analysis in Measuring Project Management Performance

Dalia Sherif

Measuring project management performance is a challenging yet critical task in optimizing the operations of any organization. The multidimensionality of the measurement process raises its complexity; hence the use of a mathematical programming technique, Data Envelopment Analysis (DEA), along with the Earned Value (EV) technique, is proposed here to address this issue.
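The Earned Value quantities are conventionally summarized by two indices (standard definitions, included here for context rather than taken from the thesis):

```latex
% EV = budgeted cost of work performed, AC = actual cost of work performed,
% PV = budgeted cost of work scheduled.
\text{CPI} = \frac{EV}{AC}, \qquad \text{SPI} = \frac{EV}{PV}
% Values below 1 indicate cost or schedule overruns, respectively.
```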

DEA was used to measure the relative efficiency of 161 IT projects from a large financial institution. Seven hypotheses, incorporating the main measures or factors contributing to project performance, were developed and tested using DEA. Higher DEA efficiency scores (≥ 0.6) were allocated to projects that aligned with their EV. Unexpectedly, project efficiency was not impacted by elevated complexity or the degree of senior management involvement. Projects with Capability Maturity Model Integration (CMMI) Level 4 designation and of the Mandatory project type received higher efficiency scores. Finally, Customer Satisfaction (CSAT) scores did not agree with DEA efficiency scores for many projects, as not all attributes of the CSAT scores were available for this investigation.

Ultimately, the major outcome of this methodology is the provision of a tool for senior managers to assess the collective achievements of their project management practices.

A Study of Relative Stock Market Pricing Efficiency in Several Industries, Using Data Envelopment Analysis

Fai Keung Tam

This research examines whether a Data Envelopment Analysis (DEA) model based on risk-return considerations can identify stocks that are mispriced relative to other stocks in the same industry. The risk variables employed are variables which had previously been found to help explain the covariance in security returns. Four industries were considered: Telecom, Telecom Equipment, Hardware, and Computer Software and Services. Each was studied at the year-end dates of 1997 and 1998.

DEA identifies an efficient frontier of those stocks which are efficiently priced relative to the other stocks considered. The Software industry was found to have the lowest level of overall pricing efficiency, and the Telecom industry the highest. Equally-weighted portfolios of the efficiently priced stocks in the Software industry were found to outperform portfolios of inefficiently priced and randomly chosen stocks. Results in the other industries were mixed.

DEA Based Bank Branch Productivity Comparison with Bank Methods

Niloofar Tochaie

The purpose of this study is to introduce a new staffing allocation model for a bank branch network, using Data Envelopment Analysis (DEA), and to compare the results with those from the Bank's own system.

The Bank's system methodology is based on complex index calculations and has sufficient complexity to be comparable to DEA; hence, the validation is of some real relevance to management. The Bank's approach is to identify an "efficiency band" of branches that are considered just busy enough, and, of course, it also identifies branches that are "overstaffed" and "understaffed". In effect, the most efficient branches, in a DEA sense, are "understaffed" and are working "too hard", so additional staff may be required. "Overstaffed" branches are identified in the usual way. An evaluation or validation of their system, as well as some improvements to it, seems desirable for the Bank.

DEA can provide valuable managerial insights when assessing the dual impacts of operating and business strategies. The DEA model proposed here monitors the efficiency of the Bank's branch network and identifies the best practices of efficient branches, classified into two size categories: "small" branches and "large" branches. Moreover, it enables management to manage human resources efficiently and effectively across the delivery network. Much can be learned by comparing and contrasting the results from the two methods. These results yield a practical methodology and some computational tools that the Bank's executives, managers, and analysts may use to monitor and enhance various aspects of their branch staff allocation strategy, as well as to make strategic decisions around the effective redeployment of staff, which could lead to potential savings. Detailed analysis combined with field study could realize these savings.

Two-Stage Financial Risk Tolerance Assessment using Data Envelopment Analysis

Angela Tran

Questionnaires administered by financial advisors to assess financial risk tolerance are embedded with stereotypes, have seemingly unscientific scoring approaches, and treat risk as a one-dimensional concept. In this work, a novel survey tool using Data Envelopment Analysis (DEA) was developed to assess relative risk tolerance. It is essentially a questionnaire that characterizes risk by five distinct elements: propensity, attitude, capacity, knowledge, and time horizon. Results from surveying over 250 individuals, analyzed with Slacks-Based Measure (SBM) type DEA efficiency models, show that the multidimensionality of risk must be considered for a complete assessment of risk tolerance, and they provide insight into the relationship between risk, its elements, and other variables. In particular, the perception of risk varies by gender. The tool could ultimately serve as a "risk calculator" which might provide legal compliance with the "Know Your Client" rule for financial institutions, their advisors, and any applicable Web-based investment transactions.
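
For reference, the Slacks-Based Measure underlying this type of model is, in its standard (Tone) formulation, shown below; the specific inputs and outputs chosen in the thesis are not detailed in this abstract:

    \rho \;=\; \min \; \frac{1 - \frac{1}{m}\sum_{i=1}^{m} s_i^{-}/x_{io}}{1 + \frac{1}{q}\sum_{r=1}^{q} s_r^{+}/y_{ro}}
    \quad \text{subject to} \quad x_o = X\lambda + s^{-}, \;\; y_o = Y\lambda - s^{+}, \;\; \lambda,\, s^{-},\, s^{+} \ge 0

where s^- and s^+ are the input and output slacks of the unit under evaluation; ρ equals one only when all slacks are zero.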

Modeling Hedge Fund Performance Using Neural Network Models

Marinos Tryphonas

Modeling hedge fund returns has been a challenging subject, particularly as hedge funds have seen unprecedented growth over the past two decades. Their complex, unregulated, and often undisclosed trading strategies, along with the lack of historical performance data until recently, have made hedge fund performance difficult to model. This thesis proposes a new way to model monthly hedge fund performance using a neural network approach. Specifically, a three-layer neural network with 1-8 hidden-layer neurons was constructed and trained using a resilient backpropagation algorithm.
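
As a rough illustration only, the sketch below builds a comparable three-layer network and trains it with resilient backpropagation (Rprop) in PyTorch; the number of factors, the number of hidden neurons, and the synthetic data are assumptions and do not reproduce the thesis model.

    # Minimal sketch: three-layer feedforward network trained with Rprop on synthetic data.
    import torch
    import torch.nn as nn

    torch.manual_seed(0)
    n_factors, n_hidden = 6, 4            # illustrative: 6 explanatory factors, 4 hidden neurons (within 1-8)
    X = torch.randn(120, n_factors)       # 120 synthetic "months" of factor data
    y = torch.randn(120, 1)               # synthetic monthly returns

    model = nn.Sequential(
        nn.Linear(n_factors, n_hidden),   # input layer -> hidden layer
        nn.Tanh(),
        nn.Linear(n_hidden, 1),           # hidden layer -> single return output
    )
    optimizer = torch.optim.Rprop(model.parameters(), lr=0.01)
    loss_fn = nn.MSELoss()

    for epoch in range(500):              # full-batch training, as Rprop expects
        optimizer.zero_grad()
        loss = loss_fn(model(X), y)
        loss.backward()
        optimizer.step()

    print(f"final in-sample MSE: {loss.item():.4f}")

Rprop adapts each weight's step size from the sign of its gradient rather than its magnitude, which is one reason it is a common choice for small, full-batch networks of this kind.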

Opportunity Index Development for Bank Branch Networks

Heather Vance

Technological progress over the past several years has provided the Financial Services Industry with the opportunity to develop a myriad of service channels from which customers can procure products. As these alternatives to traditional branch banking become more popular, it becomes vital for financial firms to monitor the cost and location issues associated with maintaining bank branch networks. The need for branch locations has not vanished; however, branch networks need to keep pace with changes in customer demographics and behavior. The need to evaluate these branch networks has prompted this research.

This thesis addresses the need to evaluate factors such as the number of competitors and the socioeconomic characteristics of clients within branch trade areas. A comprehensive overview of pertinent customer behaviour research, together with a review of possible methodologies for measuring trade-area composition, is presented. In addition, an innovative method of defining trade areas based on assigning customer areas to branches is employed. This method provides the foundation for the Opportunity Index methodology, which uses Multiplicative Competitive Interaction theory and Statistics Canada census information to assess the probable dollar opportunity available to individual bank branches within a trade region.
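
As an illustration of the Multiplicative Competitive Interaction idea underlying the index, the sketch below computes Huff-type patronage probabilities and an expected-dollar figure per branch; the attractiveness measure, the exponents, and all figures are illustrative assumptions rather than the thesis's actual specification.

    # Minimal sketch of a Huff-type Multiplicative Competitive Interaction choice model.
    import numpy as np

    attractiveness = np.array([5.0, 3.0, 8.0])          # e.g. branch size or service level, one per branch
    distance = np.array([[1.0, 2.5, 4.0],               # distance from each customer area (rows)
                         [3.0, 1.0, 2.0]])              # to each branch (columns)
    alpha, beta = 1.0, 2.0                              # attraction and distance-decay exponents

    utility = attractiveness ** alpha / distance ** beta
    prob = utility / utility.sum(axis=1, keepdims=True) # P(area i patronizes branch j)

    area_dollars = np.array([1_000_000.0, 750_000.0])   # spending potential of each customer area
    opportunity = prob.T @ area_dollars                 # expected dollars captured by each branch
    print(opportunity)

An index of this kind rises with a branch's relative attractiveness and falls with its distance from the customer areas it serves, which is the behaviour the Opportunity Index exploits.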

In this research, the Opportunity Index is employed to determine index values for over 1500 financial institution branches across the GTA and Niagara region. The Opportunity Index value provides a useful measure of the competitive and socioeconomic environment of a branch's trade area. The index values for the data set ranged from 0.0037 to 0.4365, while the average index value was 0.0991. Examination of the index values revealed a right-skewed distribution. This concentration of smaller index values implies that the selection of a cut-off point for investigating lower opportunity will vary with the analyst's interests.

The Opportunity Index methodology was successful in identifying regions of lower opportunity based only on customer residence. Conversely, the majority of branches with high index values were on the perimeter of the study area; these perimeter branches should be treated as anomalies in the evaluation. Individual firms were also isolated to measure the average performance of their branch networks. No significant difference between firms was found, except for the slightly lower average performance of the Laurentian Bank, which, coincidentally, had the lowest number of branches in the study region.

This research shows that the derived Opportunity Index is a useful tool for identifying branches with low opportunity. It can serve as a diagnostic in preliminary investigations or as a component of a more detailed, multi-faceted analysis of branch location.

Canadian Life and Health Insurance Productivity Evaluation using DEA

Sandra Vela

As the dual phenomena of ongoing consolidation and demutualization continue to proliferate throughout the Canadian Life and Health insurance carrier market, firms are forced to reconsider their operating strategies in order to improve their performance and remain competitive. In addition, intense pressures posed by the entry of non-traditional market participants, such as banks and mutual fund companies, have focused insurers' attention on efficiency and productivity measurement.

Data Envelopment Analysis (DEA), as a more sophisticated alternative to central-tendency analyses, provides valuable managerial insights when assessing the dual impacts of operating and business strategies. The DEA framework is suitable for evaluating efficiency and productivity within insurers’ complex service operations since it requires a minimal set of assumptions regarding technology and minimum extrapolations from the observed data.

The three-year averages of the technical efficiency scores reported in this work are 0.82 and 2.02, based on the insurers' productivity and investment performance respectively, equivalent to $10.2 billion and $20.2 billion in savings. Technical efficiency determined from an input-oriented production model ranges from zero to one, whereas technical efficiency in an output-oriented investment model extends from one to infinity, with unity representing the best performance in both cases. The evidence indicates that constant returns-to-scale technology operates in the Canadian insurance industry, with a significant number of increasing returns-to-scale participants. Furthermore, the results show that firm efficiency is statistically related to firm size. Based on productivity performance, the largest and the smallest firms exhibit the highest efficiency scores, with a declining trend in between, whereas in terms of investment ability efficiency increases with insurer size. In addition, the DEA efficiency scores are statistically related to conventional performance indicators.

The efficiencies of alternative organizational forms, ownership types, and distribution methods are investigated, with the conclusion that these insurer characteristics have no significant effect on DEA scores: stock firms are just as efficient as mutuals, as are Canadian versus foreign-owned companies and insurers who distribute their products either through brokers/agents or directly to clients. The assessment of changes in efficiency and productivity over the 1996-1998 period using the Malmquist Index methodology indicates that, based on their production performance, insurers exhibited overall technical efficiency losses, technological progress, and scale efficiency losses. However, only relatively minor improvements or deteriorations took place over the sample period in terms of the insurers' investment ability.
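
For reference, the Malmquist productivity index used for the 1996-1998 assessment is conventionally defined from distance functions D^t measured against the period-t frontier, and decomposes into an efficiency-change ("catch-up") term and a technological-change (frontier-shift) term:

    M(x^{t+1}, y^{t+1}, x^{t}, y^{t}) \;=\;
    \underbrace{\frac{D^{t+1}(x^{t+1}, y^{t+1})}{D^{t}(x^{t}, y^{t})}}_{\text{efficiency change}}
    \times
    \underbrace{\left[\frac{D^{t}(x^{t+1}, y^{t+1})}{D^{t+1}(x^{t+1}, y^{t+1})} \cdot
    \frac{D^{t}(x^{t}, y^{t})}{D^{t+1}(x^{t}, y^{t})}\right]^{1/2}}_{\text{technological change}}

with values greater than one conventionally indicating a productivity improvement between periods t and t+1.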

An Information Technology Justification Framework for the Dental Industry

James Yacyshyn

The need for an effective strategy for evaluating and justifying technology initiatives in the healthcare industry continues to grow. This work focuses on the challenges of information technology evaluation and justification in one healthcare sector: dentistry.

An overview of the challenges faced in the healthcare industry is presented, followed by a review of work on "technological justification methods" and on the use of "Data Envelopment Analysis" (DEA) in healthcare. The development of a framework focused on the dental industry is then presented. The framework includes a "process-oriented business model" of the dental industry. Additionally, an industry-specific technology implementation strategy is developed, along with an evaluation of various business-case economic measures for their effective utilization in a dental technology initiative. This thesis uses DEA in a novel way to determine clinical efficiency, to address potential sources of inefficiency, and to determine the impact of a technology initiative on the technical efficiency of a dental practice.

Bank Productivity and its Relationship to Stock Price Movement

Tracy Yang

This thesis examined the relationship between the efficiency found from two DEA models (production & intermediation) and the stock price movement of the five major Canadian banks over a 20-year time period. In order to accommodate the small sample size and the long time frame, DEA window analysis was used.

Stock market price variables were added individually to the basic models as outputs and new efficiency scores were compared and analyzed. The basic DEA model efficiency scores were also compared to the actual market data and traditional Key Performance Indicators in order to determine whether or not a relationship exists.

These various tests were unable to definitively prove that a relationship exists between the efficiency of a bank's operational or intermediary role and its stock price. That is not to say that no relationship exists, only that this research could not conclusively demonstrate one.

DEA Based Analysis of the Software Project Efficiency

Zijiang Yang

An output-oriented DEA model is used to determine the priority of Information Systems projects given a user evaluation based on eight criteria. The criteria are the inputs in the model, and the priority is the lone output. An artificial set of projects is created, each given a priority, and acts as the set of DMUs to be analyzed in the model. Projects to be prioritized are given a low priority score as a default. The output-oriented model maximizes the lone output, the priority score, and projects are then ranked by that score. The work discusses the design of the DEA model, the creation of the artificial set (which acts as a reference), and the interpretation of the DEA model output, which can aid both the model's designer and its user.
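
For reference, an output-oriented envelopment formulation of the kind described would take the following standard form for a project o under evaluation, with the eight criteria as inputs x and the priority score as the single output y; the returns-to-scale assumption actually used is not stated in this abstract:

    \max_{\phi,\,\lambda} \; \phi \quad \text{subject to} \quad
    \sum_{j} \lambda_j x_{ij} \le x_{io} \;\; (i = 1, \dots, 8), \qquad
    \sum_{j} \lambda_j y_{j} \ge \phi\, y_{o}, \qquad \lambda_j \ge 0

so a project's default low priority score is expanded by the factor φ to the level supported by the reference projects with comparable criterion values.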

Data Envelopment Analysis of Corporate Failure for Non-Manufacturing Firms using a Slacks-Based Model

D’Andre Wilson

The purpose of this work was to study the ability of the Slacks-Based Model of Data Envelopment Analysis to predict corporate failure of non-manufacturing companies, as compared to Altman's Z''-score model. DEA had been tested for corporate failure prediction before; however, the DEA model used was a BCC model, and it was tested against Altman's original Z-score model, which is an asset-dominated model. This research looks specifically at non-manufacturing firms and attempts to classify companies without relying on the asset size of the firm.
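
For reference, Altman's published Z''-score for non-manufacturers, the benchmark named above, drops the sales-to-total-assets term of the original model:

    Z'' = 6.56\,X_1 + 3.26\,X_2 + 6.72\,X_3 + 1.05\,X_4

where X_1 = working capital / total assets, X_2 = retained earnings / total assets, X_3 = earnings before interest and taxes / total assets, and X_4 = book value of equity / total liabilities.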