B.A.Sc. Thesis Abstracts - Year 1995

Keywords are searchable by using your browser's Find function, usually by pressing Ctrl + F.

Interactive Home Banking: A Framework for Bridging the Gap Between Technology and Financial Services

Edna Leung and Andrew Chak


The past several years have seen an explosion in the popularity and content of the information highway. Canadian organizations in telecommunications and cable television are fiercely competing for the opportunity to provide access to this world of abundant and instantaneous information. Apart from the attention that is being paid to the technical aspects, organizations are also competing to provide content for the information highway.

It is imperative that Canadians become comfortable with the information highway as a conduit for the delivery of services such as interactive home banking (IHB). The objective of this thesis was to develop a set of specifications for an IHB system in collaboration with the Royal Bank of Canada, the largest bank in the country.

The target market that the Royal Bank wishes to pursue is current university students who are likely to become the high net worth banking customers of the future. A survey was designed and administered to help formulate an understanding of this future customer group.

The specifications for the features and capabilities of the IHB service were based primarily on the different customer clusters identified from the analysis of the survey results. Clusters were used because they better identified the differences between groups of potential IHB customers. It is expected that two out of the three clusters (clusters 2 and 3) will be receptive to IHB, provided that it is properly delivered and packaged. The combined population of the two clusters makes up 77% of the university students who participated in the survey.

Of particular note is the packaging of the services offered on IHB. With so many financial products and services available to customers, IHB may be seen as a means to organize one's personal finances in an effective manner. The technology that will be used to deliver IHB should provide a flexible interface which can accommodate various packaging of the available services.

Initially, core banking services should be offered on IHB. Once customers are comfortable with utilizing these services, then other banking services such as investments and loans should be added. It should be noted that other information highway services will have an effect on the acceptance of IHB. The majority of respondents believe that "serious" services should be offered first on the information highway. These services include interest rate bulletins, economic forecasts, and investment news. Surprisingly, home shopping was not well received at this time; however, it was found that IHB users have a greater potential for accepting home shopping than those who do not wish to subscribe to IHB.

It is important that the price of IHB be kept low to entice university students. Students, in general, tend to have low incomes at this point in their lives. Moreover, students will not use IHB if the fees are higher than current in-branch or ATM banking fees. However, pricing must be considered in the context that students do not fully understand the value that IHB can provide, given that they have never used the service before.

With the overwhelming support for IHB, it is evident that students recognize the imminent arrival of this service. Therefore, the Royal Bank of Canada will be an industry leader by providing IHB services that reflect the needs of the students as revealed by the results of the survey. The provision of IHB will fulfill the promise of moving the distribution point of banking services into the convenience of the home.

Object-Oriented Databases and their Applicability to the Management of Retail Branch Banking

Leo Donatelli and Mike Kosic

Businesses can remain competitive and reach their goals of profit and growth by effectively utilizing information technology. The Toronto Dominion Bank has over 1,000 branches and millions of clients across Canada. Local databases in each branch can be used to deploy front-line applications to the branches cost-effectively.

The personal banker branch environment at the Bank is known as BANKMATE. It consists of an IBM OS/2 LAN server with IBM PC client workstations running OS/2. The application software is written in the object-oriented programming (OOP) language Smalltalk/V from Digitalk Inc.

The BANKMATE application has major performance problems. There are instances where queries on complex data take up to 40 seconds. Performance problems were identified in three major areas: data retrieval, object assembly, and query response display.

The work of this thesis was carried out to improve BANKMATE performance by investigating the applicability of using an object-oriented database management system (ODBMS) in place of the existing relational database management system (RDBMS), thereby eliminating the translation of data retrieved by SQL from relational tables into the object structures used by the application.

To test this, the data retrieval and object assembly times for both the existing RDBMS and a compatible ODBMS should be measured. For the purposes of this test, an ODBMS, called Tensegrity, has been developed. We recommend that the Bank apply this framework to both the current relational database environment and the Tensegrity object-oriented database in order to generate relevant performance benchmarks.
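To make the translation step concrete, the following sketch (in Python rather than BANKMATE's Smalltalk/V; the Client and Account classes and the row layout are hypothetical) shows the object-assembly phase that a relational back end forces on the application, together with a simple timing helper of the kind such a benchmark framework needs:

```python
import time

class Account:
    def __init__(self, number, balance):
        self.number, self.balance = number, balance

class Client:
    def __init__(self, name, accounts):
        self.name, self.accounts = name, accounts

def assemble_clients(rows):
    """Relational path: flat SQL result rows must be regrouped into an
    object graph before the application can use them. This is the
    translation step an ODBMS avoids by storing the objects directly."""
    by_client = {}
    for name, number, balance in rows:
        by_client.setdefault(name, []).append(Account(number, balance))
    return [Client(name, accounts) for name, accounts in by_client.items()]

def timed(fn, *args):
    """Measure one phase of the benchmark: returns (result, seconds)."""
    start = time.perf_counter()
    result = fn(*args)
    return result, time.perf_counter() - start
```

An ODBMS benchmark would time only the retrieval call, since the stored objects come back already assembled; the relational benchmark times retrieval plus `assemble_clients`.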


A Skill Inventory and Planning System for Information Technology Employees

Enrique Goldsmit and Salim N. Ladak

The Systems Research and Development (SR & D) department of the Toronto-Dominion Bank has been considering for some time a skills inventory and planning system for its information technology employees. Such a system would maximize the benefits of dollars spent on training and facilitate employee development. Analysis was divided into two phases: information gathering and data modeling. In Phase I, interviews were conducted with management and employees from throughout the organization and from the bank's education centre, Operations Training. A presentation was made to a subcommittee of the Managers Forum, the result of which was a final list of thirteen requirements. To summarize this list, data is required on employee project experience, employee skills, and employee training histories/plans.

In the next phase, a data model consisting of five facets was produced. The skill/role facet models how skills can be categorized and how different skills relate to different roles within the organization. The skill/course facet describes the relationship between skills and courses, and it provides course information, including that pertaining to vendors, course prerequisites and course schedules. The employee/skill facet describes the employee process of planning skill improvements by relating employees to skills. The employee/course facet describes the employee training plan/history by relating employees to course offerings. Finally, the employee/project facet describes the relationship between projects, employees and role types and aims to model the project management and employee evaluation processes.
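The interplay of the facets can be sketched as follows; the class and field names are illustrative only, not SR & D's actual model. The sketch derives a training plan (the employee/course facet) from the skill/role, skill/course, and employee/skill facets by finding courses that close an employee's skill gap for a target role:

```python
from dataclasses import dataclass, field

@dataclass
class Role:
    name: str
    required_skills: set = field(default_factory=set)  # skill/role facet

@dataclass
class Course:
    name: str
    skills_taught: set = field(default_factory=set)    # skill/course facet

@dataclass
class Employee:
    name: str
    skills: set = field(default_factory=set)           # employee/skill facet

def training_plan(emp, role, catalogue):
    """Courses that close the gap between an employee's current skills
    and a role's requirements (the employee/course facet, derived)."""
    gap = role.required_skills - emp.skills
    return [c.name for c in catalogue if c.skills_taught & gap]
```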

Upon receiving management approval for the data model, a software product from a third party vendor was evaluated to determine how well it fits the needs of the organization. The product was IBM's Skills Planning System (SPS) Version 1.1, currently in use at IBM and at one other major bank. While the package currently addresses only two of the five facets from Phase II, it complements the Supertrainer system currently in use by Operations Training. However, due to the limited functionality of SPS and the effort required to integrate SPS with Supertrainer, SPS is not recommended for SR & D. Additional vendor packages should be evaluated, and one should be selected only if customization is moderate, feasible, and cost-effective. If such a product cannot be found, in-house development would make more sense than purchasing and rewriting a packaged solution.

Impact Analysis of Cash Management Services at the Toronto Dominion Bank

Helen R. Kostiner

Cash Management Services (CMS) of the Toronto Dominion Bank requires a system that provides readily available information on the consequences of hardware, operating system, or application software failures. A Relational Database Management System (RDBMS) is proposed as a viable alternative to the set of tables currently utilized for this purpose. The existing tables correlate:

  1. Failing Applications to Impacted Applications
  2. Impacted Applications to Failing Systems

To best serve its customers, in as expedient and efficient a manner as possible, CMS needs answers to the following questions at the time of a data processing or telecommunications failure:

  • What customers will be impacted?
  • Which services will be affected?
  • Who should be notified?

This may be achieved by carefully defining the relationships that exist between five entities: Customers, Business Services (Business Software), Applications, Resources, and Functions. In so doing, three models, which together form the basis of an RDBMS design scheme, are presented:

  1. a Logical Data Model
  2. a Functional Data Model
  3. a Physical Data Model

The Logical Data Model groups data into entities and relationships, thus reflecting nothing more than the business operations that occur within CMS. The Functional Data Model is one step beyond the Logical Data Model, redefined to satisfy operational performance constraints. Finally, the information required for the Physical Data Model is drawn up in the context of the relational database methodology used with such commercially available software as Paradox for Windows. Both the Functional and Physical Data Models require sensitivity to the user's particular needs.
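As a sketch of how such a design scheme answers the three questions above, the following uses an in-memory SQLite database (the table and column names are illustrative, not CMS's actual schema) to walk the dependency chain from a failing Resource through Applications and Business Services to the Customers who must be notified:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE resource  (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE service   (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE customer  (id INTEGER PRIMARY KEY, name TEXT, contact TEXT);
-- relationships: an application depends on resources, a service runs on
-- applications, and a customer subscribes to services
CREATE TABLE app_resource (app_id INTEGER, res_id INTEGER);
CREATE TABLE svc_app      (svc_id INTEGER, app_id INTEGER);
CREATE TABLE cust_svc     (cust_id INTEGER, svc_id INTEGER);
""")

# Sample data: one customer subscribing to one service that ultimately
# depends on one resource (all names are made up for illustration).
con.execute("INSERT INTO resource VALUES (1, 'mainframe-link')")
con.execute("INSERT INTO service  VALUES (1, 'wire-payments')")
con.execute("INSERT INTO customer VALUES (1, 'Acme Corp', 'ops@acme.example')")
con.execute("INSERT INTO app_resource VALUES (1, 1)")
con.execute("INSERT INTO svc_app      VALUES (1, 1)")
con.execute("INSERT INTO cust_svc     VALUES (1, 1)")

def impacted_customers(failing_resource):
    """Who is impacted, which service is affected, and who to notify,
    given the name of a failing resource."""
    return con.execute("""
        SELECT DISTINCT c.name, c.contact, s.name
        FROM resource r
        JOIN app_resource ar ON ar.res_id = r.id
        JOIN svc_app sa      ON sa.app_id = ar.app_id
        JOIN cust_svc cs     ON cs.svc_id = sa.svc_id
        JOIN service s       ON s.id = sa.svc_id
        JOIN customer c      ON c.id = cs.cust_id
        WHERE r.name = ?
    """, (failing_resource,)).fetchall()
```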

The Toronto Dominion Bank's Rapidwire/e, which allows customers to send payments via on-line access to the bank's computers, is examined in particular, to validate the model presented.

The RDBMS that has been developed serves as a model of the type of Impact Analysis Database Management System that could find application within other departments of the Toronto Dominion Bank.


Record Compression/Decompression Strategies in Very Large Record Applications on Enterprise Systems

Paul Simak

Compression is a major concern for the client payroll department of the Royal Bank of Canada. The department is increasing its clientele at a rapid rate, which will continue to add to the already lengthy execution time of processing client payroll records. The current compression program, which uses up 90% of the CPU cycles attributed to the update part of the overall payroll process, will not allow for a substantial increase in clientele due to this time limitation, and therefore an improvement must be found. This thesis deals with the evaluation and improvement of the compression program used in payroll processing.

Based on the data analysis performed on sample employee records at the Bank, it was found that a better alternative for compressing and decompressing them can be achieved by implementing the null suppression compression algorithm. This algorithm was compared against the currently used bit mapping technique and gave promising results. The implementation of this new compression technique on sample data files resulted in a 25% improvement in the execution time of the overall compression/decompression cycle. Running the same tests on the mainframe on actual employee records resulted in a 25% decrease in decompression time and a 1% increase in compression time. The time improvement of the compression/decompression cycle was measured to be 5%, but is projected to be 9% or higher upon implementation in IBM assembler language. The new compression technique will increase the compression ratio of the employee records from 13.1 to 16.2, which will also result in additional time savings in the I/O part of the compression process.
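A minimal sketch of null suppression, assuming a reserved escape byte and a one-byte run length (both illustrative choices, not the Bank's actual record format): runs of the filler byte are replaced by a two-byte token.

```python
ESCAPE = 0xFF   # assumed not to occur in the raw data, for this sketch
MAX_RUN = 255   # run length must fit in one byte

def compress(data: bytes, filler: int = 0x00) -> bytes:
    """Replace each run of the filler byte with (ESCAPE, run_length)."""
    out = bytearray()
    i = 0
    while i < len(data):
        if data[i] == filler:
            run = 1
            while i + run < len(data) and data[i + run] == filler and run < MAX_RUN:
                run += 1
            out += bytes([ESCAPE, run])
            i += run
        else:
            out.append(data[i])
            i += 1
    return bytes(out)

def decompress(data: bytes, filler: int = 0x00) -> bytes:
    """Expand each (ESCAPE, run_length) token back into filler bytes."""
    out = bytearray()
    i = 0
    while i < len(data):
        if data[i] == ESCAPE:
            out += bytes([filler]) * data[i + 1]
            i += 2
        else:
            out.append(data[i])
            i += 1
    return bytes(out)
```

The technique pays off exactly on the kind of sparsely populated, fixed-length records typical of payroll files, where long runs of padding bytes collapse to two bytes each.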

Canadian Mutual Fund Performance Evaluation using Data Envelopment Analysis

Andrew K.J. Khoo

Mutual funds are investment vehicles that have experienced an incredible growth in popularity in the last couple of decades. The large number of funds and the total amount of dollars invested in the funds leads to a high level of competition between fund companies. From the standpoint of both the investor and the fund manager it is extremely desirable to develop a methodology to measure the performance of a mutual fund.

Current mutual fund ranking systems are based upon the past financial return of the fund. However, it is a well-accepted fact that future returns of a fund cannot be accurately determined by past returns alone. The incorporation of measures of risk, expense ratios, and return over several time periods may improve the usefulness of the rankings as predictors of future fund performance. Even so, the methods used to construct these rankings may not present a complete picture of fund performance, as many variables are not considered and the weighting of the variables is typically arbitrary.

Data Envelopment Analysis (DEA) is a non-parametric linear programming method that allows the analysis of multiple input and multiple output systems. It has been used to successfully measure the efficiency of several service sector industries. It is also well suited to handle the large amount of information involved with the typical analysis carried out in the financial industry.

The primary objective of this thesis was to determine the usefulness of DEA as a predictor of mutual fund performance. The analysis used data obtained from Financial Times Bell Charts. The subset of funds selected for analysis were Canadian Equity Mutual Funds which had been in existence for at least 3 years. The basic model used risk (standard deviation) as the input, and expense-adjusted returns as the outputs. The secondary objective was to determine what other information a DEA analysis of mutual funds might provide.
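In the special case of one input and one output (one risk measure in, one expense-adjusted return out), the DEA linear program collapses to comparing each fund's output-to-input ratio against the best observed ratio; the general multi-input, multi-output model requires an LP solver. The figures below are invented for illustration:

```python
def dea_efficiency(risks, returns):
    """CCR efficiency scores for the single-input (risk), single-output
    (return) special case. A fund is efficient (score 1.0) when no other
    fund achieves more return per unit of risk."""
    ratios = [ret / risk for risk, ret in zip(risks, returns)]
    best = max(ratios)
    return [round(r / best, 6) for r in ratios]

# Three hypothetical funds: risk (std. dev.) and expense-adjusted return.
scores = dea_efficiency([2.0, 4.0, 3.0], [8.0, 12.0, 6.0])
```

Fund 1 earns 4 units of return per unit of risk, the best in the set, so it scores 1.0; the others score 0.75 and 0.5. The peer-group information mentioned below falls out of the same analysis: the efficient funds against which an inefficient fund is measured are its peers.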

Due to time and resource limitations, sufficient financial data was not obtained. Thus, the ability to determine the usefulness of DEA as a long term predictor of fund performance was not established. The selection of Canadian equity funds also prevented the comparison of the rankings produced in this thesis with other systems, as the bulk of the commercial analysis has been done on U.S. based mutual funds.

Although the primary objective was not accomplished, it was determined that a DEA analysis of mutual funds can provide both the investor and the fund manager with a wealth of additional information about the fund such as a peer group.

Graphical User Interface Design for Data Envelopment Analysis

Mark Pustilnik

Data Envelopment Analysis (DEA) is a very useful linear programming technique that can be applied to perform quantitative efficiency analysis of any system. It is of particular value to service businesses and other organizations where efficiency and performance are hard to quantify. Using DEA can be very technical and complex, so a graphical user interface is needed to make it approachable by a wide variety of users.

The author of this paper created a high-level design and partial implementation of such a graphical user interface. This paper discusses the goals of the project, the high level design of the proposed user interface and the current state of the author's development efforts together with some low-level implementation details.

The appendix contains the current version of the program source code in C++.