B.A.Sc. Thesis Abstracts - Year 2012

Keywords are searchable by using your browser's Find function, usually by pressing Ctrl + F.

Developing a Hedging Calculator for Options on Commodities Futures

Ali Bashiri, Engineering Science

 

Derivatives pricing is of great importance to financial institutions because it affects production, trading, and risk management strategies. Commodities derivatives are especially important due to the volume of trades involving them. Over the past few decades, there have been major developments in modeling the term structure of commodities futures and financial derivatives. Through the work of Fischer Black and Myron Scholes, and later models proposed by the likes of Eduardo Schwartz, pricing and hedging strategies for derivatives on many natural resources and agricultural products have been developed.

Futures options in particular are important products that financial institutions offer to clients. Underwriting an option carries an inherent, undesired risk, which financial institutions seek to offset through various hedging processes.

This project aims to develop a hedging tool for commodities futures options. For this purpose, the commodities spot prices are first modeled. Then, the futures prices and futures options prices are computed. Next, a delta hedging process is implemented and the resulting P&L distribution of this hedging method is studied. Finally, a graphical user interface (GUI) is implemented to facilitate the use of this hedging tool.
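
As a rough illustration of the pricing and delta-hedging steps, the sketch below prices a European call on a futures contract with the Black (1976) formula and simulates the P&L of writing the option while rebalancing a futures hedge daily. The model choice, parameter values, and function names are illustrative assumptions rather than the thesis's actual implementation.

```python
import numpy as np
from scipy.stats import norm

def black76_call(F, K, sigma, tau, r):
    """Black (1976) price and delta of a European call on a futures contract."""
    d1 = (np.log(F / K) + 0.5 * sigma**2 * tau) / (sigma * np.sqrt(tau))
    d2 = d1 - sigma * np.sqrt(tau)
    price = np.exp(-r * tau) * (F * norm.cdf(d1) - K * norm.cdf(d2))
    delta = np.exp(-r * tau) * norm.cdf(d1)
    return price, delta

def delta_hedge_pnl(F0=100.0, K=100.0, sigma=0.3, T=0.25, r=0.02,
                    n_steps=63, n_paths=5000, seed=0):
    """Simulate the terminal P&L of writing a call and delta-hedging it daily."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    F = np.full(n_paths, F0)
    premium, delta0 = black76_call(F0, K, sigma, T, r)
    cash = np.full(n_paths, premium)      # premium received; futures need no upfront cash
    pos = np.full(n_paths, delta0)        # futures contracts held as the hedge
    for i in range(1, n_steps + 1):
        z = rng.standard_normal(n_paths)
        F_new = F * np.exp(-0.5 * sigma**2 * dt + sigma * np.sqrt(dt) * z)
        cash = cash * np.exp(r * dt) + pos * (F_new - F)   # daily futures settlement
        F = F_new
        tau = T - i * dt
        if tau > 0:
            _, pos = black76_call(F, K, sigma, tau, r)     # rebalance to the new delta
    return cash - np.maximum(F - K, 0.0)                   # hedged P&L at expiry

pnl = delta_hedge_pnl()
print(f"mean P&L: {pnl.mean():.3f}, std dev: {pnl.std():.3f}")
```

Under these assumptions the spread of the resulting P&L distribution narrows as rebalancing becomes more frequent, which is the behaviour a hedging calculator is meant to expose.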

Using Client Analytics to Identify Target Markets

Patricia Ko, Redda Ellaine Naval

 

This thesis was sponsored by a major Canadian financial institution whose wealth management group is interested in applying client analytics to grow its business portfolio. To gain a better understanding of the Bank’s current clients and to predict future clients of value, three of the Bank’s client datasets were matched to an external database of Canadian businesses called InfoCanada. InfoCanada contains 1.4 million records and provides several additional fields of information that are not present in the Bank’s datasets. The matching process was performed with the assistance of a free, open-source record-linkage tool, the Fine-Grained Records Integration and Linkage Tool (FRIL). After the matching process, low match percentages (10%-18%) were found across the Bank’s three datasets. Analysis comparing the types of businesses present in the Bank’s datasets with those present in InfoCanada revealed some variations in the data. A meeting with a representative from InfoCanada was also conducted, and it was found that the most recent match InfoCanada performed for the Bank achieved a match percentage of 16%. This confirmed the validity of the matching process and opened up discussion of future work in data standardization and collection. Some analysis was also performed on the matches found; however, given the quality of the Bank’s data, it was difficult to find good predictors of client value.
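
FRIL's own configuration is not reproduced here; purely as an illustration of the kind of fuzzy comparison involved in linking business records, the sketch below scores a candidate pair on a normalized company name and postal code. The field names, weights, and acceptance threshold are hypothetical.

```python
from difflib import SequenceMatcher

def normalize(name: str) -> str:
    """Lower-case, strip punctuation and common suffixes before comparison."""
    name = name.lower()
    for token in (" inc", " ltd", " corp", ".", ","):
        name = name.replace(token, "")
    return " ".join(name.split())

def match_score(a: dict, b: dict) -> float:
    """Blend name similarity with an exact postal-code check (weights are illustrative)."""
    name_sim = SequenceMatcher(None, normalize(a["name"]), normalize(b["name"])).ratio()
    postal_ok = 1.0 if a.get("postal") == b.get("postal") else 0.0
    return 0.7 * name_sim + 0.3 * postal_ok

bank_record = {"name": "Acme Widgets Inc.", "postal": "M5V 2T6"}
info_record = {"name": "ACME WIDGETS LTD", "postal": "M5V 2T6"}
print(match_score(bank_record, info_record))   # accept the pair if above a chosen threshold
```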

The Dodd-Frank Wall Street Reform and Consumer Protection Act: Impact on the Capital Markets Business Unit of a Major Canadian Financial Institution and its Implications for Information Technology Infrastructure

Dingran (Jim) Zhang, Cai Mike (Michael) Wang

 

In the wake of the 2008 global financial crisis, the United States Congress passed the Dodd-Frank Act ("DFA") to restore order and bring stability back to the financial system. One key area of reform the DFA focuses on is the regulation of the over-the-counter ("OTC") derivatives market, which is transformed into the swap market under the legal framework of the DFA. This study identifies and analyzes the DFA's impact on information technology for a financial institution engaged in the swap market. The study builds on existing knowledge of OTC derivatives and develops models for the trade flow of swap deals as mandated by the DFA. These DFA trade models serve as blueprints to identify technology changes in the processes, standards, and protocols along each step of a swap trade flow. As the reform of the swap market follows a phased-in schedule, the study focuses on the DFA's implementation schedule and on the established reporting and clearing requirements. The initial reporting requirement for interest rate swaps takes effect in July 2012; this demands an immediate compliance effort from financial institutions to optimize current data structures and to streamline current reporting processes. While the comprehensive clearing requirement is yet to be finalized, the proposed rules grant institutions only 90 days for compliance once the rules are finalized. This calls for financial institutions to work closely with clearing organizations in preparation. Although the subsequent compliance effort is complex and onerous, it also presents a unique opportunity for the financial institution to seek industry best practices and streamline its existing operations.

Application Lifecycle Management Implementation Opportunities

Jie (Jason) Hao and Arie Chernitsky

 

The successful management of an application throughout its entire lifecycle requires the implementation of application lifecycle management (ALM) techniques. Currently, the Bank does not formally implement these techniques consistently. As part of an initiative to improve its IT services performance and increase its competitive advantage, the Bank wants to assess its performance with regard to ALM approaches by comparison with other "best-in-class" organizations. As part of this effort, the Bank wants to know which ALM techniques it can use for a successful implementation. This thesis describes the theories behind application lifecycle management, analyzes the Bank's implementation of those theories, reviews the industry's best practices, and then compares the Bank's practices against them. Finally, it provides recommendations for improvement and a high-level plan for a successful implementation of ALM techniques within the IT department.

Effective Implementation of Intra-Corporation Networks and Social Media in Large Organizations

Morteza Soleymannezhad

 

This thesis expands on the types of intra-corporation networks and explains the perceived benefits as well as the best practices of social media campaigns. In addition, the methodology behind the successful implementation of social media is studied. Through a focus-group brainstorming session, the collaboration needs of the Law Group are analyzed. Moreover, a number of recommendations, including a sample “home-page” configuration and details of future brainstorming steps, are provided. The results of this thesis provide a deeper understanding of intra-corporation networks and of methods for improving collective collaboration at the Law Group.

 

 

Predictive Analytics with Large Datasets

Saksham Uppal

 

Large organizations are increasingly turning towards a data-driven approach to better understand the marketplace and consumer interactions. An obstacle, and potential opportunity, is presented by the growing amount of unstructured user-generated content posted on social media platforms about brands and organizations. This content, if assessed correctly, may offer performance-boosting insights. Specifically, there is growing interest in determining whether the sentiment associated with user-generated content on the internet has any correlation with performance indicators for organizations (e.g. stock price). Accordingly, the investigation focused on determining the relation between user sentiment and the price of the Bank's stock. The study first conducted sentiment analysis on thousands of online posts (a mix of user and news content) that pertained to the Bank. Regression analysis was then performed to determine the correlation between the average negative sentiment of the social media content on a given day and the abnormal returns of the Bank's stock. Analysis performed on a data set of social media content spanning 122 successive active business days suggested a possible correlation between user sentiment and stock price, and suggested that this effect decreases as time passes from when the content was initially posted. Using these findings, the Bank can investigate refinements to the attempted method and compare it with other parallel methods that employ different sentiment analysis and correlation techniques.
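
As a rough illustration of the regression step, the sketch below relates a hypothetical daily negative-sentiment series to abnormal returns at lags of zero and one day. The data is synthetic and the simple OLS setup is only one possible implementation, not the thesis's actual model.

```python
import numpy as np

# Hypothetical daily series: average negative sentiment score and the stock's
# abnormal return (actual return minus a market-model expectation).
rng = np.random.default_rng(1)
n_days = 122
neg_sentiment = rng.uniform(0.0, 1.0, n_days)
abnormal_return = -0.02 * neg_sentiment + rng.normal(0.0, 0.01, n_days)

def ols_slope(x, y):
    """Ordinary least squares slope, intercept, and R^2 for a single regressor."""
    X = np.column_stack([np.ones_like(x), x])
    beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
    y_hat = X @ beta
    r2 = 1.0 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
    return beta[1], beta[0], r2

# Same-day effect versus a one-day lag, to see how the relation decays over time.
for lag in (0, 1):
    slope, intercept, r2 = ols_slope(neg_sentiment[: n_days - lag],
                                     abnormal_return[lag:])
    print(f"lag {lag}: slope={slope:.4f}, R^2={r2:.3f}")
```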

Design of Choice Architecture Based On Principles of Behavioral Economics for Home Equity Financing

Stephanie Au and Lucy Huo

 

The Bank is interested in improving the performance of a home equity financing product (HP) using behavioral economics (BE), a field that draws on aspects of psychology and economics to understand the everyday decisions made by individuals. Presently, HP is a complex product that requires the customized setup of multiple mortgage and home equity line of credit (HELOC) segments; improvements that simplify its appearance can therefore increase product utilization and purchase by clients. To understand and adapt HP to client behavior, principles of BE are used to analyze client types and client interaction points (CIPs). This approach resulted in the creation of themed HP bundles to enhance choice architecture, that is, how a decision is influenced by the way its options are presented. These bundles are customized for three main client groups, each providing HP default values based on client characteristics. The three bundles are designed to be visually appealing, informative, and useful in simplifying the presentation of HP. They also provide opportunities for solution extensions by means of additional interactive tools, new bundle types, and potentially attractive marketing ideas.

A Study on Knowledge Management and Communication Efficiency

Mike Jiang

 

This project attempts to analyze the problems associated with corporate communications within the Bank, in order to make strategic and feasible recommendations that can be implemented. The main emphasis of the study is placed on analyzing email communication, which has become a productivity obstacle and is deemed inadequate for certain functions it currently serves. The goal is to identify alternative technologies or user practices that could reduce email overload, promote an information retention/discovery process, and contribute to knowledge repositories. A comprehensive literature review was conducted to benchmark industry best practices. A survey questionnaire was distributed among Bank staff to collect information on current practices at the Bank. Based on the literature review, feasibility analysis, and lessons from industry, technology and user practice gaps were identified and addressed. The study recommends that the Bank:

• invest in email training

• invest in SharePoint & IBM Connections training

• align optimal technologies to different business tasks

• promote a company culture of information and knowledge “sharing”.

Productivity and Social Media in the Workplace

Andy Hsu and Katrina Jin

 

The use of social media has become a daily routine for many Canadian Internet users. Companies are quickly adopting enterprise social media, a variant of public social media, to facilitate knowledge sharing and to create a sense of community within the organization. The client of this research project is interested in the strategies other companies have implemented to manage their enterprise social media platforms. A literature review and a survey study are performed to fulfill the company’s request. After summarizing the results from the literature review and the surveys, the research team provides a 10-point recommendation for the Bank to ensure a successful deployment and management of its enterprise social media platform.

Forecasting Internet Demand

Ferdinent S. Cheng

 

The Internet plays an important role in people’s everyday lives, and demand for it continues to grow. The question facing the carrier company is how to strike the right balance: allocating enough capital to serve existing demand while retaining enough resources to invest in network capacity for future use. This research proposes a new methodology for forecasting internet demand based on characterizing customers’ usage profiles and incorporating the impact of major influential factors. The methodology uses Excel VBA as a preliminary step for data characterization and trend observation, from which graphs depicting the usage spectrum are plotted and basic statistics are computed. Numerical values for demand and its various components are then deduced, followed by what-if analyses to incorporate the impact of yet-unidentified disruptive technologies and other influential factors. Rather than predicting an actual number, the major contribution of this research is a new methodology that brings together and organizes different approaches in a systematic way, standardizing a procedure and thereby increasing the methodology’s applicability.
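
The thesis implements its characterization step in Excel VBA; purely as an illustration of the trend-fitting and what-if stage, the Python sketch below fits an exponential growth trend to hypothetical average usage and projects it under a disruption multiplier. All data and parameter values are assumptions.

```python
import numpy as np

# Hypothetical monthly per-customer usage (GB) over two years of observations.
rng = np.random.default_rng(2)
months = np.arange(24)
avg_usage = 20 * np.exp(0.03 * months) * rng.lognormal(0.0, 0.02, 24)

# Fit an exponential growth trend to the historical average usage.
growth, log_base = np.polyfit(months, np.log(avg_usage), 1)

def forecast(month, disruption_factor=1.0):
    """Project average usage; disruption_factor is a what-if multiplier for
    unforeseen influences (e.g. a new bandwidth-heavy application)."""
    return np.exp(log_base + growth * month) * disruption_factor

for m in (24, 36):
    print(f"month {m}: baseline {forecast(m):.1f} GB, "
          f"with 20% disruption {forecast(m, 1.2):.1f} GB")
```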

Green Data Centre Design

Samantha Yang and Karen On

 

Client Co, a large telecommunications corporation, provides space in data centres to host its customers’ servers. Client Co is responsible for the continuous operation of the equipment and for ensuring that the temperature is appropriate at all times. Recently, Client Co built a new data centre in partnership with Company M, a district energy system that supplies power, heated/cooled water, and a backup power generator. Client Co would like to determine the environmental impacts of its partnership with Company M and to compare it against a hypothetical data centre with the same functionality but without the partnership. To analyze this partnership, the relevant components for comparison are determined, and models of the current scenario (referred to as the Company M scenario) and of a green-field data centre scenario (referred to as the Greenfield scenario) are designed for the years 2011, 2016, and 2021. The impact of each scenario is calculated and compared in terms of CO2 emissions and waste heat emissions. Provincial variations in fuel mix are also considered and compared to the Greenfield and Company M scenarios. After evaluating all scenarios, it is determined that the Company M scenarios have less environmental impact than the Greenfield scenarios in 2011, 2016, and 2021. Company M uses heat recovery both at its CHP plant and, in the future, at the data centre, reducing the amount of waste heat released into the atmosphere. The recovered heat is used to heat other buildings in the community, reducing the electricity otherwise needed to heat them and thus reducing CO2 emissions. From the provincial analysis, Manitoba, Quebec, Newfoundland, British Columbia, P.E.I., and Yukon are concluded to produce less CO2 and waste heat than the Company M and Greenfield scenarios.
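
Purely as an illustration of the emissions comparison, the back-of-the-envelope sketch below converts an assumed annual electricity load into CO2 under two emission factors, with a credit for recovered waste heat. All figures are placeholders, not the thesis's actual inputs or results.

```python
# Illustrative comparison of annual data-centre emissions under two scenarios.
# All figures below are placeholder assumptions, not the thesis's actual inputs.
GRID_FACTOR_KG_PER_KWH = 0.15        # assumed provincial grid emission factor
CHP_FACTOR_KG_PER_KWH = 0.10         # assumed factor for district-energy (CHP) supply
HEAT_RECOVERY_CREDIT = 0.20          # fraction of emissions offset by reused waste heat

annual_load_kwh = 5_000_000          # assumed IT plus cooling load

greenfield_co2 = annual_load_kwh * GRID_FACTOR_KG_PER_KWH
company_m_co2 = annual_load_kwh * CHP_FACTOR_KG_PER_KWH * (1 - HEAT_RECOVERY_CREDIT)

print(f"Greenfield scenario: {greenfield_co2 / 1000:.0f} t CO2/yr")
print(f"Company M scenario:  {company_m_co2 / 1000:.0f} t CO2/yr")
```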

Impact of Internet Congestion on Internet Users

Rankie Chun-Wang Wong

 

The ability to adequately predict both short-term and long-term network demand can help Internet Service Providers (ISPs) reduce the impacts of congestion and allow them to provide consistently high levels of service to their customers. However, this is not always easy; with the internet growing at extremely high rates, there is an urgent need for ISPs to better understand how end users experience and perceive congestion in the network [Telco10a]. This study first analyzes historical traffic data to determine usage patterns and the internet application preferences of users. Then a simulator, which combines the empirical findings from the historical data with qualitative factors that affect customer experience, is developed. The simulator outputs the average per-customer usage level, as well as “flags” indicating the likelihood of connected customers complaining about the effects of congestion. Analyses performed on various simulation trials indicate that the most important factors in determining the impact of congestion on end users are the customer usage growth rate, a breakdown of customer usage by type of internet application that is representative of a specific customer base, and the tolerance levels to reduced internet speeds for different applications. Using this information, ISPs can accurately anticipate demands on their networks and relieve congestion sufficiently ahead of time.
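
As a rough illustration of the simulator's logic, the sketch below draws per-customer demand for one network node, compares it with capacity, and flags periods where the application mix makes complaints likely. The application shares, tolerance values, and thresholds are hypothetical, not the thesis's calibrated inputs.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical application mix: share of traffic and tolerance to slow-down
# (0-1, higher means more tolerant). Illustrative values only.
APPS = {
    "streaming": (0.5, 0.2),
    "web":       (0.3, 0.6),
    "downloads": (0.2, 0.8),
}

def simulate_node(n_customers=200, capacity_mbps=1000, n_periods=24):
    """Return average usage and the count of periods flagged as likely to draw complaints."""
    flags = 0
    usages = []
    for _ in range(n_periods):
        demand = rng.gamma(shape=2.0, scale=3.0, size=n_customers)   # Mbps per customer
        total = demand.sum()
        usages.append(total / n_customers)
        if total > capacity_mbps:
            slowdown = 1 - capacity_mbps / total
            # Weight the slowdown by each application's intolerance to it.
            pain = sum(share * (1 - tol) for share, tol in APPS.values()) * slowdown
            if pain > 0.1:                 # illustrative complaint threshold
                flags += 1
    return np.mean(usages), flags

avg_usage, complaint_flags = simulate_node()
print(f"avg per-customer usage: {avg_usage:.1f} Mbps, flagged periods: {complaint_flags}/24")
```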

Validation of Energy Use Proxy Measurements and Greenhouse Gas Emissions

Yin Liu

 

The Bank intends to improve the accuracy of its measurement of energy consumption and related greenhouse gas (GHG) emissions from its office and branch facilities. The Bank has requested that the Centre for Management of Technology and Entrepreneurship (CMTE) at the University of Toronto assess the accuracy of the current proxy methods used in measuring its energy consumption. The Bank outsources the management of its facilities to a third-party Property Management Service Provider (PMSP) that collects periodic energy consumption data for some of the Bank’s facilities. For the remaining facilities, PMSP provides consumption estimates based on proxies. This thesis develops an energy-simulation approach based on EnergyPlus, whose core engines and reference commercial benchmark building models were developed by the U.S. Department of Energy in conjunction with three U.S. national laboratories. The actual meter readings of the Bank’s energy consumption and the EnergyPlus-simulated results both suggest a pattern of “carbon economies of scale,” where energy usage and associated GHG emission levels decrease as total building size increases. This thesis identifies that the proxies do not take into account the reduction in energy consumption and carbon footprint seen in larger facilities. Therefore, it is concluded that the proxy method cannot adequately meet the Bank’s needs for accurate management of energy usage and GHG emissions. To acquire realistic simulation parameters specific to the Bank’s individual facilities, a number of actionable steps are recommended to the Bank.
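
As a small illustration of the "carbon economies of scale" check, the sketch below regresses log energy intensity on log floor area; a negative slope indicates that larger buildings use less energy per square metre, which a flat per-area proxy would miss. The facility data is synthetic, not the Bank's metered or simulated data.

```python
import numpy as np

# Hypothetical facility data: floor area (m^2) and annual energy use (kWh).
area = np.array([300, 800, 1500, 4000, 9000, 20000], dtype=float)
energy = np.array([90e3, 210e3, 360e3, 860e3, 1.7e6, 3.4e6])

intensity = energy / area                       # kWh per m^2
slope, intercept = np.polyfit(np.log(area), np.log(intensity), 1)

# A negative slope indicates energy intensity falls as buildings get larger,
# i.e. the "carbon economies of scale" pattern a flat per-area proxy would miss.
print(f"elasticity of intensity w.r.t. size: {slope:.2f}")
```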

Estimating US and Canadian Retail Portfolio Asset Correlations

Yi Xue, Engineering Science

 

Two important parameter inputs into the capital requirements formula in the Basel II Accord are default probabilities (PDs) and correlations. The Basel Committee sets standard specifications for the asset correlations: 15% for residential real estate loans and 4% for credit card loans. In this paper, we estimate the default probabilities and asset correlations using data from US and Canadian commercial banks. We find that in stable periods, the asset correlation is much lower than the values assumed by the Basel Accord, but in periods of stress, such as during the US subprime mortgage crisis, a higher asset correlation may be warranted.
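
As a rough illustration of how asset correlation can be backed out of observed default rates, the sketch below applies a method-of-moments estimate under the one-factor Gaussian (Vasicek) model that underlies the Basel formula, choosing the correlation so that the model-implied variance of the default rate matches the sample variance. The default-rate series is synthetic, and this estimator is one common possibility rather than necessarily the one used in the thesis.

```python
import numpy as np
from scipy.stats import norm, multivariate_normal
from scipy.optimize import brentq

# Hypothetical annual default-rate series for a retail portfolio.
default_rates = np.array([0.018, 0.021, 0.025, 0.019, 0.034, 0.052, 0.041, 0.023])

pd_hat = default_rates.mean()
var_hat = default_rates.var(ddof=1)
k = norm.ppf(pd_hat)                    # default threshold implied by the average PD

def joint_default_prob(rho):
    """P(two obligors default together) under the one-factor Gaussian model."""
    return multivariate_normal.cdf([k, k], mean=[0, 0], cov=[[1, rho], [rho, 1]])

# Method of moments: pick rho so the implied default-rate variance,
# joint_default_prob(rho) - PD^2, matches the sample variance.
rho = brentq(lambda r: joint_default_prob(r) - pd_hat**2 - var_hat, 1e-6, 0.99)
print(f"PD = {pd_hat:.3%}, implied asset correlation = {rho:.3f}")
```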

Modeling Commodity Futures: Price Levels and Volatilities

Biyun Zhang, Engineering Science

 

In 2007, Leif Andersen developed a generic and practical framework for constructing Markov models for commodities futures. In his publication, Andersen derived a series of theoretical results about low-dimensional Markov representations of the dynamics of the term structure of commodities futures prices and volatilities. The objective of this thesis is to determine the seasonality component in the volatility of commodities futures under Andersen’s framework. In addition, further parameters of the futures price process based on the Markov diffusion model are assessed. First, the calibration of parameters is presented under both the risk-neutral and the real measure: we compare the simplified approach and the transformation approach described by Andersen for calibration under the risk-neutral measure, and we use the Kalman filter technique for calibration under the real measure. Second, we use the calibration results to simulate the futures price movements of Natural Gas (NG) and Crude Oil (CL) futures under the risk-neutral measure. Third, we simulate the futures price movements of NG and CL under the real measure. Our results confirm the existence of seasonal patterns in the volatility of Natural Gas futures prices, and we determine the parameters involved in modeling the futures price movements, paying special attention to the trend of the calibration results as the number of futures contracts considered increases. Finally, we determine that the Kalman filter is applicable for calibrating parameters under the real measure. Determining the parameters and the seasonality component in the volatility helps to model the price movements of commodities futures effectively.
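
The thesis calibrates Andersen's model; purely as an illustration of the filtering machinery used for calibration under the real measure, the sketch below evaluates a Kalman-filter log-likelihood for a much simpler one-factor mean-reverting log-price state observed through noisy log futures prices (a Schwartz-style simplification, not Andersen's full model). The drift term of the measurement equation is omitted, and all data and parameter values are hypothetical.

```python
import numpy as np

def kalman_loglik(y, taus, kappa, mu, sigma, meas_std, dt=1/52):
    """Log-likelihood of log futures prices y[t, j] (maturity taus[j]) under a
    one-factor Ornstein-Uhlenbeck log-spot state, via the Kalman filter."""
    n_obs, n_mat = y.shape
    a = np.exp(-kappa * dt)                          # state transition coefficient
    q = sigma**2 * (1 - a**2) / (2 * kappa)          # state noise variance
    H = np.exp(-kappa * taus)                        # loading of the state on each maturity
    R = np.eye(n_mat) * meas_std**2
    x, P, loglik = mu, 1.0, 0.0
    for t in range(n_obs):
        x_pred = mu + a * (x - mu)
        P_pred = a**2 * P + q
        v = y[t] - H * x_pred                        # innovation (drift terms omitted)
        S = np.outer(H, H) * P_pred + R
        S_inv = np.linalg.inv(S)
        K = P_pred * H @ S_inv                       # Kalman gain (row vector)
        x = x_pred + K @ v
        P = P_pred * (1 - K @ H)
        _, logdet = np.linalg.slogdet(S)
        loglik += -0.5 * (logdet + v @ S_inv @ v + n_mat * np.log(2 * np.pi))
    return loglik

# Hypothetical weekly log futures prices for three maturities (in years).
rng = np.random.default_rng(4)
taus = np.array([0.25, 0.5, 1.0])
y = 3.0 + 0.1 * rng.standard_normal((104, 3))
print(kalman_loglik(y, taus, kappa=1.5, mu=3.0, sigma=0.4, meas_std=0.05))
```

In a calibration, this log-likelihood would be maximized over the model parameters; seasonality could then be layered onto the volatility term.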

Operational Risk Measurement for High Impact Low Frequency Events

Ankit Arun Jain

 

The ability to measure operational risk (OR) for high-impact, low-frequency (HILF) losses is important for large financial institutions so that they can allocate their capital requirements for OR as accurately as possible. This measure enables banks to set aside the minimum capital required to cover losses and prevent bankruptcy in the event of extreme losses. However, it is very difficult to model and measure these HILF risks. This study investigates different approaches to measuring OR and focuses on extreme value theory (EVT) distributions, such as the generalized extreme value (GEV) distribution and the generalized Pareto distribution (GPD), for modelling HILF operational losses. Different goodness-of-fit (GoF) tests and parameter estimation techniques are used to model the operational losses and to assess their accuracy in estimating the regulatory capital for OR. Data from the Bank is modelled by a GEV and a GPD. At different thresholds, different GoF tests and parameter estimation techniques were used to identify the type of distribution, its parameters, and their accuracy. The results from the experiments show that the GoF tests and parameter estimation techniques are poor estimators and impractical for use given the amount of loss data currently available in the operational loss database. However, with an increased sample size, the accuracy of the estimates improves significantly, providing an encouraging outlook on the use of EVT for modelling OR for HILF losses.
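
As a rough illustration of the peaks-over-threshold approach with a GPD, the sketch below fits exceedances of synthetic loss data over a high threshold, checks the fit with a Kolmogorov-Smirnov test, and reads off a high quantile in the spirit of a capital estimate. The data, threshold choice, and quantile level are illustrative assumptions, not the Bank's loss data or the thesis's results.

```python
import numpy as np
from scipy import stats

# Synthetic heavy-tailed operational-loss sample; real loss data is confidential.
rng = np.random.default_rng(5)
losses = rng.lognormal(mean=10, sigma=2, size=500)

threshold = np.quantile(losses, 0.90)
exceedances = losses[losses > threshold] - threshold

# Fit a generalized Pareto distribution to the exceedances (peaks over threshold).
shape, loc, scale = stats.genpareto.fit(exceedances, floc=0)

# Goodness of fit via a Kolmogorov-Smirnov test against the fitted GPD.
ks_stat, p_value = stats.kstest(exceedances, "genpareto", args=(shape, loc, scale))

# A high quantile of the loss distribution, in the spirit of a regulatory
# capital estimate (e.g. the 99.9th percentile of a single loss).
p = 0.999
tail_prob = (1 - p) / (len(exceedances) / len(losses))
var_999 = threshold + stats.genpareto.ppf(1 - tail_prob, shape, loc, scale)
print(f"KS p-value: {p_value:.3f}, 99.9% single-loss quantile: {var_999:,.0f}")
```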