Abstract
The field of natural language processing (NLP) has evolved significantly in recent years. In this chapter we consider two leading and well-established methodologies, namely those due to Loughran and McDonald and to FinBERT, which we treat as benchmarks. We then contrast our approach with these two and compare our performance against them. We use S&P 500 market data for our investigations and describe the results obtained following our strategies; our main consideration is the earnings calls for the S&P 500 stocks. We vindicate our findings and present the performance of our trading and fund management strategy, which shows improved results.
Keywords— Natural Language Processing, Sentiment Analysis, Earnings Calls
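As context for the dictionary-based benchmark mentioned above, the sketch below shows how Loughran-McDonald-style sentiment scoring works in principle. The word lists and the function name `lm_sentiment` are illustrative assumptions, not the actual dictionary or the chapter's implementation.

```python
# Minimal sketch of dictionary-based sentiment scoring in the style of
# Loughran-McDonald. The word lists are tiny placeholders; the real
# dictionary contains thousands of finance-specific terms.
POSITIVE = {"achieve", "improve", "strong", "gain"}
NEGATIVE = {"loss", "decline", "impairment", "weak"}

def lm_sentiment(text: str) -> float:
    """Return (positive - negative) word count over total tokens."""
    tokens = [t.strip(".,;:!?()").lower() for t in text.split()]
    if not tokens:
        return 0.0
    pos = sum(t in POSITIVE for t in tokens)
    neg = sum(t in NEGATIVE for t in tokens)
    return (pos - neg) / len(tokens)

print(lm_sentiment("Revenue growth was strong despite a one-off impairment."))
```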
Abstract
The explosive development of electronic media has brought to market participants thousands of pieces of financial news, released on different platforms every day. Many news wires published online are editorially controlled and can be relied upon as factual summaries, as opposed to fake news or disinformation. These news items provide a rich source of textual information which, in a summative way, represents the sentiment of the market. These sentiments influence the price as well as the volatility of individual assets. In this study we have tested sentiment-enhanced daily trading strategies. Alexandria Technology has provided us with the news sentiment metadata used in this study. We have also resorted to 'asset filters', which we use to restrict the universe of assets chosen for daily trades. We have considered quantified news sentiment and its impact on the movement of asset prices as a second time series, which is used together with the asset price/return time series. Our asset allocation strategy uses Second Order Stochastic Dominance (SSD); see Roman et al. (2006, 2013). Following this modelling paradigm we compute daily trade schedules using a time series of historical equity price data. In contrast to the classical mean-variance method, this approach improves the tail risk as well as the upside of the return. In our recent research we have introduced news sentiment indicators such as the News RSI (NRSI) and Derived RSI (DRSI) filters. These filters restrict the choice of asset universe for trading. The consistent performance improvement achieved in back-testing vindicates our approach.
Keywords— Trading Strategy, Sentiment Analysis, News Meta Data, Asset Filter
Abstract
We explain the importance of market microstructure in the study of the financial markets, and then describe the market participants who collectively comprise the financial market. After a short history of capital markets, we describe the transition of trading activities from physical markets to computer-enabled automated exchanges. We discuss how the investment community is related to the trading community; these we categorize as the Mainstream Market Participants. The providers of information about the market have emerged as in many ways equally important market participants; we call them Tertiary Market Participants. Lately automated trading and algorithmic trading have become the mainstay of the market. We have studied the rise of 'Algo Trading' in equities and equity derivatives, as well as the growth of 'Algo Trading' across other asset classes. News sentiment and social media sentiment (micro-blogs) have gained importance in respect of how they affect the market. This, as well as the recently emerging Alternative Data as an influencer of market activities, is considered in the final section of this whitepaper.
Abstract
We investigate how "news sentiment" in general and the "impact of news" in particular can be utilised in designing equity trading strategies. A news item is an event that moves the market, in a small way or a big way. We have introduced a derived measure of news impact score which takes into consideration news flow and decay of sentiment. Since asset behaviour is characterised by return, volatility and liquidity, we first consider a predictive analytic model in which market data and impact scores are the inputs and also the independent variables of the model. We finally describe trading strategies which take into consideration the three important characteristics of an asset, namely return, volatility and liquidity. The minute-bar market data as well as the intraday news sentiment metadata have been provided by Thomson Reuters.
Abstract
We propose a method of incorporating macroeconomic news into a predictive model for forecasting prices of crude oil futures contracts. Since these futures contracts are more liquid than the underlying commodity itself, accurate forecasting of their prices is of great value to multiple categories of market participants. We utilize the Kalman filtering framework for forecasting arbitrage-free (futures) prices, and assume that the volatility of oil (futures) price is influenced by macroeconomic news. The impact of quantified news sentiment on the price volatility is modelled through a parametrized, non-linear functional map. This approach is motivated by the successful use of a similar model structure in our earlier work, for predicting individual stock volatility using stock-specific news. We claim the proposed model structure for incorporating macroeconomic news together with historical (market) data is novel and improves the accuracy of price prediction quite significantly. We report results of extensive numerical experiments which justify our claim.
Key Words: crude oil; macroeconomic news sentiment; Kalman filter; forecasting
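For readers unfamiliar with the filtering framework referenced above, the following is a minimal scalar sketch of the Kalman predict/update recursion; the paper's actual state-space model, and its news-driven volatility map, are richer than this illustration.

```python
# One predict/update step of a scalar Kalman filter for the linear
# Gaussian system  x_t = a * x_{t-1} + w_t,  y_t = c * x_t + v_t,
# with Var(w) = q and Var(v) = r. Illustrative sketch only.
def kalman_step(x, P, y, a, c, q, r):
    x_pred = a * x                      # predicted state
    P_pred = a * P * a + q              # predicted state variance
    S = c * P_pred * c + r              # innovation variance
    K = P_pred * c / S                  # Kalman gain
    x_new = x_pred + K * (y - c * x_pred)
    P_new = (1 - K * c) * P_pred
    return x_new, P_new
```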
Abstract
The rapid rise of social media communication has touched upon all aspects of our social and commercial life. In particular, the rise of social media as the most preferred way of connecting people online has led to new models of information communication amongst peers. Of these media, Twitter has emerged as a particularly strong platform, and in the financial domain tweets by market participants are of great interest and value. News in general, and commercial and financial news wires in particular, provide the market sentiment and in turn influence asset price behaviour in the financial markets. In a comparable way, micro-blogs and tweets generate sentiment and have an impact on market behaviour, that is, on the price as well as the volatility of stock prices.
In our recent research we have introduced news sentiment based filters such as the News RSI (NRSI) and Derived RSI (DRSI), which restrict the choice of asset universe for trading. In the present study, we have extended the same approach to StockTwits data. We use the filter approach of asset selection to restrict the available asset universe. We then apply our daily trading strategy using Second Order Stochastic Dominance (SSD) as the asset allocation model. Our trading model is instantiated by two time series, namely (i) historical market price data and (ii) StockTwits sentiment (scores) data. Instead of the NRSI we compute a Micro-blog RSI (MRSI), from which a DRSI is computed. The resulting combined filter (DRSI) leads to an enhancement of the SSD based trading and asset allocation strategy. Empirical experimental results of constructing portfolios are reported for the S&P 500 Index constituents.
Key Words: Trading Strategy, Sentiment Analysis, Micro blog data, Asset Filter
Abstract
Volatility prediction plays an important role in the financial domain. The GARCH family of prediction models is very popular and efficient in using past returns to forecast volatility. It has also been observed that news, scheduled and unscheduled, has an impact on the return volatility of assets. An enhanced GARCH model, called News Augmented GARCH (NA-GARCH), includes an additional component for news sentiment. With the rise in popularity of the world wide web and social media, these have become a rich source of opinions and sentiments. Twitter is one such platform: a micro-blogging site and a popular source of public views on different topics. StockTwits is a social media platform that started as an application built using Twitter's API. It has since grown into an independent financial social media platform for news and sentiment, and is a rich source of opinions from subject experts and analysts. This data provides the basis for a first systematic exploration of social media sentiment: it reflects the raw sentiments of traders, investors, media, public companies and investment professionals, as opposed to sentiments from curated news wires. This research attempts to determine if the sentiment on stocks from StockTwits micro-blogs can improve volatility prediction. The experiment is performed on 9 NASDAQ 100 stocks. The GARCH model with stock returns, and the NA-GARCH model with stock returns and micro-blog sentiment, are tuned and their prediction results evaluated. NA-GARCH, with the sentiment data from StockTwits, performed better than the GARCH model in 7 out of the 9 cases.
Key Words: Volatility prediction, GARCH, NA-GARCH, Sentiment, Micro-blog, StockTwits
Abstract
The recommendations of financial analysts play an important role in making investment decisions. The method of constructing a portfolio using such recommendations does not rely on quantitative models; instead it relies on the research of the analysts and their qualitative views. We explore paradigms of modelling whereby the qualitative research outputs of the analysts are introduced into quantitative models of portfolio construction. In this report, the quality of the analysts' research is modelled using exploratory data analysis techniques. The findings are then used to create a model of filters. This method narrows down the choice of the asset universe, thereby improving the results of the asset allocation investment model.
The market data used in this report is stock prices for 83 assets, which are or once were components of the NIFTY 50 stock index. The market data and the analyst recommendations are supplied by Thomson Reuters and the study covers the period from 1 January 2013 to 25 July 2018.
Key Words: Analyst Recommendations; Filters; NIFTY50 Index
Abstract
A look at the future direction of analytics by Professor Gautam Mitra.
Abstract
This white paper gives an overview of Hedge Funds, with a focus on risk management issues. We define and explain the general characteristics of Hedge Funds, their main investment strategies and the risk models employed, and address the problems in Hedge Fund modelling. We also survey current Hedge Funds available on the market and those that have been withdrawn, and briefly argue the cases supporting and opposing Hedge Fund usage. The purpose of this white paper is to provide an informed analysis of Hedge Funds. This informational analysis will be of value not only to finance professionals but also to academics and, more generally, to any person broadly interested in finance. This paper will be of interest to:
- Professionals working within the Financial Risk Management field, in particular:
  - Hedge Fund and Mutual Fund Managers
  - Quantitative Analysts
  - "Front" and "Middle" Office banking functions, e.g. Treasury Management
- Regulators concerned with Hedge Fund Financial Risk Management
- Private and Institutional Investors
- Academic Researchers in the area of Financial Risk Management
- The general finance community

A unique value of this whitepaper, compared to other Hedge Fund literature freely available on the internet, is that this review is fully sourced from academic references (such as peer-reviewed journals) and is thus a bona fide study.
Abstract
UIMP is a matrix-generator report-writer system designed to aid the realization (generation) of mathematical programming models and also the analysis-reporting of the solutions of such models. The data structure facility of the system allows the underlying structure of a user model to be captured and helps to define such models. This data-structure feature is not only a powerful modelling aid, it also finds use in the analysis of solutions and report generation. The experience of using the system, its shortcomings and possible extensions are also discussed.
Abstract
LP models are usually constructed using index sets and data tables which are closely related to the attributes and relations of relational database (RDB) systems. We extend the syntax of MPL, an existing LP modelling language, in order to connect it to a given RDB system. This approach reuses existing modelling and database software, provides a rich modelling environment and achieves model and data independence. This integrated software enables Mathematical Programming to be widely used as a decision support tool by unlocking the data residing in corporate databases.
Abstract
The introduction of new technologies and concepts has redefined the relative positioning of information systems (IS) and decision technologies in a corporate context. Corporate IS have been extended to include not only transaction processing databases but also analytical databases, often known as Data Warehouses. On-line analytical processing (OLAP), as introduced by Codd et al., is capable of capturing the structure of real world data in the form of multidimensional tables which are known as 'datacubes' by management information systems (MIS) and statistical systems specialists. Manipulation and presentation of such information through multidimensional views and graphical displays provide invaluable support for the decision-maker. We illustrate the natural coupling which exists between the data modelling, symbolic modelling and 'What if' analysis phases of a decision support system (DSS). In particular, we explore the power of the roll-up and drill-down features of OLAP and show how these translate into aggregation and disaggregation of the underlying decision models. Our approach sets out a paradigm for analysing the data, applying DSS tools and progressing through the information value chain to create organisational knowledge.
Abstract
This white paper introduces the Markowitz mean-variance model with a general overview and sets out to explain why and how the finance industry has fully embraced it as a method of choice for portfolio planning. The main focus of the white paper is to bring out the many aspects of the portfolio planning problem which are addressed by enhanced mean-variance models that meet the growing requirements of the finance industry. Portfolio analysis is a leading issue for fund managers, who apply such models in many situations such as index tracking, performance evaluation and historical data/backtesting. This white paper will be of interest to:
- Fund Managers
- Trading Desk Staff
- Back Office Staff
- Quantitative Analysts
who wish to know the general developments in the marketplace.
Abstract
Second order Stochastic Dominance (SSD) has a well-recognised importance in portfolio selection, since it provides a natural interpretation of the theory of risk-averse investor behaviour. Recently, SSD-based models of portfolio choice have been proposed; these assume that a reference distribution is available and a portfolio is constructed whose return distribution dominates the reference distribution with respect to SSD. We present an empirical study which analyses the effectiveness of such strategies in the context of enhanced indexation.
Abstract
We formulate a portfolio planning model that is based on second-order stochastic dominance as the choice criterion. This model is an enhanced version of the multi-objective model proposed by Roman et al.; the model compares the scaled values of the different objectives, representing tails at different confidence levels of the resulting distribution. The proposed model can be formulated as a risk minimization model where the objective function is a convex risk measure; we characterize this risk measure and the resulting optimization problem. Moreover, our formulation offers a natural generalization of the SSD-constrained model of Dentcheva and Ruszczyński. A cutting plane-based solution method for the proposed model is outlined. We present a computational study showing: (a) the effectiveness of the solution methods and (b) the improved modeling capabilities: the resulting portfolios have superior return distributions.
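For orientation, the tail-based SSD criterion that underlies this class of models can be stated compactly; the notation below is ours and is only a schematic restatement.

```latex
% With S equiprobable scenarios, let Tail_{s/S}(R) denote the sum of the
% s smallest outcomes of return distribution R. Then R_x dominates the
% reference distribution R* with respect to SSD if and only if
\mathrm{Tail}_{s/S}(R_x) \;\ge\; \mathrm{Tail}_{s/S}(R^{*}), \qquad s = 1, \dots, S,
% i.e. CVaR-type dominance holds at every confidence level s/S.
```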
Abstract
Second-order stochastic dominance (SSD) is widely recognised as an important decision criterion in portfolio selection. Unfortunately, stochastic dominance models are known to be very demanding from a computational point of view. In this paper we consider two classes of models which use SSD as a choice criterion. The first, proposed by Dentcheva and Ruszczyński (J Bank Finance 30:433-451, 2006), uses an SSD constraint, which can be expressed as integrated chance constraints (ICCs). The second, proposed by Roman et al. (Math Program, Ser B 108:541-569, 2006), uses SSD through a multi-objective formulation with CVaR objectives. Cutting plane representations and algorithms were proposed by Klein Haneveld and Van der Vlerk (Comput Manage Sci 3:245-269, 2006) for ICCs, and by Künzi-Bay and Mayer (Comput Manage Sci 3:3-27, 2006) for CVaR minimization. These concepts are taken into consideration to propose representations and solution methods for the above class of SSD based models. We describe a cutting plane based solution algorithm and outline implementation details. A computational study is presented, which demonstrates the effectiveness and the scale-up properties of the solution algorithm, as applied to the SSD model of Roman et al. (Math Program, Ser B 108:541-569, 2006).
Abstract
This paper considers long-short portfolio optimization in the presence of two risk measures (variance and conditional value-at-risk (CVaR)) and asset choice constraints regarding buying, selling and holding thresholds, and cardinality restrictions on the number of stocks to be held in the portfolio. The mean-variance-CVaR model is based on the mean-variance approach but has an additional constraint on CVaR. Our empirical investigations show that short-selling strategies lead to a superior choice of portfolios, with higher expected return and much lower risk exposures. In particular, the downside risk can be considerably reduced by introducing short selling. Our long-short extension to the mean-variance-CVaR model incorporates the practice of many financial institutions with regard to short decisions. Numerical experiments with the resulting model, which is a quadratic mixed integer program, are conducted on real data drawn from the FTSE 100 index.
Abstract
Modern Portfolio Theory (MPT) is based upon the classical Markowitz model, which uses variance as a risk measure. A generalisation of this approach leads to mean-risk models, in which a return distribution is characterised by the expected value of return (desired to be large) and a "risk" value (desired to be kept small). Portfolio choice is made by solving an optimization problem, in which the portfolio risk is minimised and a desired level of expected return is specified as a constraint. The need to penalise different undesirable aspects of the return distribution led to the proposal of alternative risk measures, notably those penalising only the downside part (adverse) and not the upside (potential). Downside risk considerations constitute the basis of Post Modern Portfolio Theory (PMPT). Examples of such risk measures are lower partial moments, Value-at-Risk (VaR) and Conditional Value-at-Risk (CVaR). We revisit these risk measures and the resulting mean-risk models. We discuss alternative models for portfolio selection, their choice criteria, and the evolution of MPT to PMPT, which incorporates utility maximisation and stochastic dominance.
Abstract
This paper proposes a model for portfolio optimization, in which distributions are characterized and compared on the basis of three statistics: the expected value, the variance and the CVaR at a specified confidence level. The problem is multi-objective and transformed into a single objective problem in which variance is minimized while constraints are imposed on the expected value and CVaR. In the case of discrete random variables, the problem is a quadratic program. The mean-variance (mean-CVaR) efficient solutions that are not dominated with respect to CVaR (variance) are particular efficient solutions of the proposed model. In addition, the model has efficient solutions that are discarded by both mean-variance and mean-CVaR models, although they may improve the return distribution. The model is tested on real data drawn from the FTSE 100 index. An analysis of the return distribution of the chosen portfolios is presented.
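As an illustration of how such a model can be processed for discrete scenario data, here is a sketch using the standard Rockafellar-Uryasev linearisation of CVaR; the synthetic data, parameter values and use of the cvxpy library are our assumptions, not the paper's set-up.

```python
import cvxpy as cp
import numpy as np

# Mean-variance-CVaR sketch: minimise variance subject to expected
# return and CVaR constraints over S return scenarios (synthetic data).
np.random.seed(0)
S, n = 500, 10
R = np.random.normal(0.0005, 0.01, (S, n))          # scenario returns
Sigma = np.cov(R, rowvar=False) + 1e-8 * np.eye(n)  # ridge keeps Sigma PSD
mu = R.mean(axis=0)
beta, r_min, cvar_cap = 0.95, 0.0004, 0.02

w = cp.Variable(n)
eta = cp.Variable()                                 # VaR auxiliary variable
cvar = eta + cp.sum(cp.pos(-R @ w - eta)) / ((1 - beta) * S)
prob = cp.Problem(
    cp.Minimize(cp.quad_form(w, Sigma)),
    [cp.sum(w) == 1, w >= 0, mu @ w >= r_min, cvar <= cvar_cap],
)
prob.solve()
print(w.value.round(4))
```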
Abstract
Mean-risk models have been widely used in portfolio optimization. However, such models may produce portfolios that are dominated with respect to second order stochastic dominance and therefore not optimal for rational and risk-averse investors. This paper considers the problem of constructing a portfolio which is non-dominated with respect to second order stochastic dominance and whose return distribution has specified desirable properties. The problem is multi-objective and is transformed into a single objective problem by using the reference point method, in which target levels, known as aspiration points, are specified for the objective functions. A model is proposed in which the aspiration points relate to ordered outcomes for the portfolio return. This concept is extended by additionally specifying reservation points, which act pre-emptively in the optimization model. The theoretical properties of the models are studied. The performance of the models on real data drawn from the Hang Seng index is also investigated.
Abstract
We consider the mean-variance (M-V) model of Markowitz and the construction of the risk-return efficient frontier. We examine the effects of applying buy-in thresholds, cardinality constraints and transaction roundlot restrictions to the portfolio selection problem. Such discrete constraints are of practical importance but make the efficient frontier discontinuous. The resulting quadratic mixed-integer (QMIP) problems are NP-hard and therefore computing the entire efficient frontier is computationally challenging. We propose alternative approaches for computing this frontier and provide insight into its discontinuous structure. Computational results are reported for a set of benchmark test problems.
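A compact statement of the discrete-constrained problem may help; the formulation below is a standard textbook QMIP with buy-in thresholds and a cardinality bound, written in our own notation rather than the paper's exact model.

```latex
% Binary z_i = 1 iff asset i is held; l_i, u_i are buy-in bounds,
% K the cardinality limit, rho the target expected return.
\begin{aligned}
\min_{w,\,z} \quad & w^{\top} \Sigma w \\
\text{s.t.} \quad & \mu^{\top} w \ge \rho, \qquad \mathbf{1}^{\top} w = 1, \\
& \ell_i z_i \le w_i \le u_i z_i, \qquad i = 1, \dots, n, \\
& \textstyle\sum_{i=1}^{n} z_i \le K, \qquad z_i \in \{0, 1\}.
\end{aligned}
```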
Abstract
This white paper sets out to explain an important financial planning model called asset liability management (ALM); in particular, it discusses why, in practice, optimum planning models are used. An integrated approach which combines liability models with asset allocation decisions has proved desirable and more efficient, in that it can lead to better ALM decisions. The role of uncertainty and the quantification of risk in these planning models are considered. This white paper will be of interest to corporate treasurers, to fund managers in the pension and insurance industries, and to analysts who support ALM models in different financial institutions.
Abstract
We describe the structure of Employees Provident Funds (EPFs); the EPF is the main retirement scheme for private sector employees in four countries: Singapore, Malaysia, India and Sri Lanka. In this paper we compare the EPF plans of these four countries and describe the similarities and differences in terms of contributions and accounts, dividends, withdrawals and annuities, minimum sum, health benefits, investment and performance. We also discuss the challenges faced by EPFs and possible ways to overcome them. The biggest challenge is the phenomenon of the ageing population, which is occurring even faster in Asia than in Western countries. EPFs are defined contribution pension schemes; thus the increase in life expectancy brings the risk that participants could outlive their savings. Lastly, we discuss the EPF framework from an ALM perspective.
Abstract
In contrast to decision models which use in-sample scenarios, we use out-of-sample scenarios to conduct simulation and decision evaluation, including backtesting. We propose two simulation methodologies and describe six decision evaluation techniques that can be applied to test the performance of liability-driven investment (LDI) models. A good practice in evaluating the performance of funds is to apply risk-adjusted performance measures; we have chosen two widely applied ratios: the Sortino ratio and the funding ratio. We perform statistical tests and establish the quality of the portfolios. We report our empirical findings by testing an asset and liability management model for pension funds, for which we propose one deterministic linear programming model and three stochastic programming models.
Abstract
Asset and Liability Management (ALM) models have recently been recast as Liability-Driven Investment (LDI) models for making integrated financial decisions in pension schemes investment: matching and outperforming a pension plan's liabilities. LDI has become extremely popular as the decision tool of choice for pension funds. Market developments and recent accounting and regulatory changes require a pension fund to adopt a new view of its asset allocation decision. We present a generic ALM problem cast as an LDI, which we represent through a family of four decision models: a deterministic linear programming model, a two-stage stochastic programming (SP) model incorporating uncertainty, a chance-constrained SP model and an integrated chance-constrained SP model. In the deterministic model, we study the relationship between PV01 matching and the required funding. In the SP model, we have two sources of randomness: liabilities and interest rates. We generate interest rate scenarios using the Cox, Ingersoll and Ross model, investigate the relationship to funding requirements, and minimize the absolute deviation of the present value matching of the assets and liabilities over time. In the chance-constrained programming model, we limit the number of future deficit events by introducing binary variables and a user-specified reliability level. The fourth model has integrated chance constraints, which limit not only the events of underfunding but also the amount of underfunding relative to the liabilities. All models employ a buy-and-hold strategy in a fixed-income portfolio, whereas recourse actions are only taken in borrowing and lending.
Abstract
Not available.
Abstract
Not available.
Abstract
Not available.
Abstract
This report presents evaluation and simulation techniques; the work not only verifies these techniques but also compares different mathematical programming approaches applied to a pension fund. The five approaches are discussed in detail in Schwaiger et al. (2007).
Abstract
This report uses risk-adjusted performance measures to identify the quality of alternative decision models. It uses the decision models introduced by Schwaiger et al. (2007) and the simulation and decision evaluation outcomes from Schwaiger et al. (2008). Risk-adjusted performance measures are used by fund managers to rank and compare their portfolio performance with peers. In this report we examine the performance of alternative decision models for pension funds and use two ratios, namely the Sortino ratio and the Solvency ratio, to measure their performance over time.
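For concreteness, a minimal sketch of the Sortino ratio computation follows; the exact variant (target rate, annualisation) used in the report is not specified here, so this is only the standard textbook form.

```python
import numpy as np

# Sortino ratio: mean excess return over a target, divided by the
# downside deviation (root mean square of returns below the target).
def sortino(returns, target=0.0):
    excess = np.asarray(returns) - target
    downside = np.minimum(excess, 0.0)
    dd = np.sqrt(np.mean(downside ** 2))
    return np.mean(excess) / dd if dd > 0 else float("inf")

print(sortino([0.02, -0.01, 0.015, -0.005, 0.01]))
```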
Abstract
Traditional Asset and Liability Management (ALM) models have been recently recast as Liability Driven Investment (LDI) models for making integrated financial decisions in pension schemes investment: matching and outperforming liabilities. LDI has become extremely popular as a decision tool of choice for pension funds. The last decade experienced a fall in the equity markets while bond yields reached low levels. New regulations were introduced whereby liabilities were hard to meet. In the case of a deficit, the pension fund trustees and employers have to agree on extra contributions to fill the deficit within 10 years' time. The UK Accounting standard FRS17 (since 2001, replacing SSAP24) requires the assets to be measured by their market value and liabilities measured by a projected unit method and a discount rate reflecting the market yields then available on AA rated corporate bonds of appropriate currency and term (see Accounting Standards Board Financial Reporting Standard 17). Furthermore, deficit or surplus has to be fully included on the balance sheet. In the Netherlands and the Nordic countries LDI models have become established; the UK, Italy and a few other European countries are close followers of this trend.

Traditionally, assets and liabilities were considered separate. In asset management the aim was to maximize return for a given risk level; the matching of the liabilities was not taken into consideration. The main argument was that assets should be made to grow faster than liabilities. The modern integrated approach to LDI considers the cash flow streams for invested assets, which can be fixed income portfolios enhanced by interest rate swaps and, in some cases, added swaptions.

We present an asset and liability management (ALM) problem for LDI, which we model using three approaches: a deterministic model, a stochastic model incorporating uncertainty and a chance-constrained stochastic model. In the deterministic model we look at the relationship between PV01 matching and the required funding. PV01 is the change of the net present value of a bond due to a 0.01% positive parallel shift in the yield curve. In the stochastic programming model we have two sources of randomness: liabilities and interest rates. We generate interest rate scenarios, look at the relationship between funding requirements and minimize the deviation of the PV (present value) matching of the assets and liabilities over time. In the chance-constrained programming model we limit the number of future deficit events by introducing binary variables and a user-specified reliability level. The last model has integrated chance constraints, which limit not only the events of underfunding but also the amount of underfunding relative to the liabilities. Furthermore, a fixed mix model is introduced for testing purposes only.
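Since PV01 is central to the deterministic model, a small sketch of its computation under the abstract's definition (NPV change for a +0.01% parallel yield shift) may be useful; the flat-curve discounting and the example bond are our simplifying assumptions.

```python
# PV01 sketch: change in net present value of a bond for a +1 basis
# point (0.01%) parallel shift of a flat yield curve.
def npv(cashflows, times, y):
    return sum(cf * (1 + y) ** -t for cf, t in zip(cashflows, times))

def pv01(cashflows, times, y, bp=1e-4):
    return npv(cashflows, times, y + bp) - npv(cashflows, times, y)

cfs = [5, 5, 105]            # 3-year bond, 5% annual coupon, face 100
ts = [1, 2, 3]
print(pv01(cfs, ts, 0.04))   # negative: price falls as yields rise
```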
Abstract
Today’s marketplace is becoming increasingly dynamic and volatile. As consumers become more sophisticated, they demand the right product at the right time, at the right price, and at the right place. Whereas quality was the competitive weapon of the 80s, customer responsiveness, or time-to-market, is the differentiator today. In many industries, hyper-competition is forcing many enterprises to fundamentally change the way business is conducted in order to survive. Given these challenges, traditional paradigms for business management are ineffective. At the same time, businesses face tremendous pressure from their stakeholders to increase ROA, profit contribution and customer responsiveness. Given the complexity of a typical supply chain, supply chain planning systems (also known as Advanced Planning Systems) enable companies to intelligently manage the activities of the supply chain. Every company must perform five basic activities or processes within a supply chain: buy, make, move, store and sell. Within each of these processes, there are short-term decisions (which product should be put on the truck?) and long-term decisions (do we need a new factory to meet demand?). Through the intelligent application of constraint-based principles, we can reduce system inertia by reducing capacity on non-constrained resources without a corresponding increase in system nervousness or instability. Multi-enterprise planning capabilities of an intelligent system should include support for the various command and control structures as well as organizational aspects of the supply chain. The ability to model multiple authority domains and support autonomy with interdependence among the various business functions within a supply chain can provide a great deal of flexibility in managing system inertia. Ideally, the distributed architecture of a decision support system should provide global visibility to the various business units or functions to make decisions that meet both local business objectives and the global objective of the entire supply chain. In this white paper we discuss the optimal allocation and utilisation of resources using mathematical optimization techniques. We explain in non-technical terms the success and growing importance of optimization techniques in efficiently processing complex supply chain problems in a cross-section of industry. We highlight the differences between strategic and tactical supply chain models. We set out the sequence of actions, the necessary system issues, the decision making issues and managerial aspects in supply chains. Through this paper the reader will:
1. appreciate the significance of using optimization techniques in making decisions on industrial problems,
2. be aware of real-life case studies highlighting the benefits of supply chain optimization,
3. have access to the relevant and the most recent articles and books on supply chains.
Abstract
We consider a strategic supply chain planning problem formulated as a two-stage stochastic integer programming (SIP) model. The strategic decisions include site locations, choices of production, packing and distribution lines, and the capacity increment or decrement policies. The SIP model provides a practical representation of real-world discrete resource allocation problems in the presence of future uncertainties which arise due to changes in the business and economic environment. Such models that consider the future scenarios (along with their respective probabilities) not only identify optimal plans for each scenario, but also determine a hedged strategy for all the scenarios. We (1) exploit the natural decomposable structure of the SIP problem through Benders decomposition, (2) approximate the probability distribution of the random variables using the generalized lambda distribution, and (3) through simulations, calculate the performance statistics and the risk measures for the two models, namely the expected-value and the here-and-now.
Abstract
Equity portfolio management problems require fund managers to make decisions about what portfolio to hold (ex-ante) without knowing what future equity returns will be. Though these returns are uncertain, market participants try to understand the nature of the uncertainty and make decisions based on their beliefs about the market environment. Traditionally, portfolio managers have used variants of Markowitz mean-variance analysis to determine the optimal portfolio to hold, and this is still fairly standard practice in industry. Mean-variance portfolio decision models fall into the more general group of mean-risk models, where portfolio risk and expected return are traded off when making asset choices. Variance and standard deviation both measure the spread of a distribution about its mean. Since the variance of a portfolio can be easily calculated from the covariances of the pairs of asset returns and the asset weights used in the portfolio, variance is predominantly used in portfolio formation.

In contrast to computing the asset variances and covariances directly using historical data, multifactor models provide an accurate and efficient way of obtaining these estimates. They decompose an asset's return into returns derived from exposure to common factors and an asset-specific component. The common factors can be understood as representing different risk (uncertainty) aspects, to which all the assets are exposed in varying degrees (factor sensitivities). By describing a group of asset returns through a set of key common factors, the size of the estimation problem is significantly reduced. The new problem faced is to estimate the covariance matrix of the common sources of risk, the variances of the specific returns, and each security's factor exposures. These models capture the natural intuition that firms with similar characteristics will behave similarly.

Active portfolio managers seek to incorporate their investment insight to beat the market. An accurate description of asset price uncertainty is key to the ability to outperform the market. Tetlock et al. (2008) develop a fundamental factor model that incorporates news as a factor. Investors' perceptions of the riskiness of an asset are determined by their knowledge about the company and its prospects, that is, by their information sets. They note that these are determined from three main sources: analysts' forecasts, quantifiable publicly disclosed accounting variables, and linguistic descriptions of the firm's current and future profit-generating activities. If the first two sources of information are incomplete or biased, the third may give us relevant information for equity prices. We seek to extract an improved understanding of equity price uncertainty using a quantified measure of market sentiment to update a traditional factor model. This may give us the tools to make improved portfolio (management) decisions.
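The dimension-reduction argument above rests on the standard multifactor decomposition, which can be stated compactly (our notation, not specific to Tetlock et al.):

```latex
% Asset i's return splits into factor-driven and specific components,
r_i \;=\; \alpha_i + \sum_{k=1}^{K} \beta_{ik} f_k + \varepsilon_i ,
% so the n x n asset covariance matrix collapses to
\Sigma \;=\; B F B^{\top} + D ,
% where B (n x K) holds factor exposures, F (K x K) the factor
% covariances, and D the diagonal matrix of specific variances.
```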
Abstract
We enhance the modelling and forecasting of sovereign bond spreads by taking into account quantitative information gained from macro-economic news sentiment. We investigate sovereign bonds spreads of five European countries and improve the prediction of spread changes by incorporating news sentiment from relevant entities and macro-economic topics. In particular, we create daily news sentiment series from sentiment scores as well as positive and negative news volume and investigate their effects on yield spreads and spread volatility. We conduct a correlation and rolling correlation analysis between sovereign bond spreads and accumulated sentiment series and analyse changing correlation patterns over time. Market regimes are detected through correlation series and the impact of news sentiment on sovereign bonds in different market circumstances is investigated. We find best-suited external variables for forecasts in an ARIMAX model set-up. Error measures for forecasts of spread changes and volatility proxies are improved when sentiment is considered. These findings are then utilised to monitor sovereign bonds from European countries and detect changing risks through time.
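To make the ARIMAX set-up concrete, here is a minimal sketch with statsmodels in which a sentiment series enters as the exogenous regressor; the synthetic data, the (1,0,1) order and the single regressor are our illustrative assumptions, not the paper's selections.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# ARIMAX sketch: spread changes with a news-sentiment series as the
# exogenous variable (synthetic placeholder data).
rng = np.random.default_rng(0)
sentiment = pd.Series(rng.normal(size=300))
spread_chg = 0.3 * sentiment.shift(1).fillna(0.0) + rng.normal(0, 0.5, 300)

model = ARIMA(spread_chg, exog=sentiment, order=(1, 0, 1))
result = model.fit()
print(result.summary())
```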
Abstract
Momentum strategy is one of the most popular strategies that market participants use to make investment decisions. In the past two decades, many researchers have shown that momentum strategies beat the market and provide attractive portfolio returns. In this study we investigate the Dow Jones Industrial Average (DJIA) index and include news data and social media sentiment data to improve the performance of a momentum strategy. In particular, we select StockTwits as the social media source. Four weekly momentum strategies are built and compared over a five-year back-testing period. The research starts by using market data to calculate a 5-day Relative Strength Indicator (RSI) that captures the momentum of price. A momentum strategy is constructed based on the overbought/oversold (70/30) signals of the RSI proposed by Wilder (1978). Furthermore, the news and social media sentiment data are applied separately to enhance the RSI selections of the momentum strategy. News impact scores are used to give more precise evaluations of news sentiment. Finally, news and social media sentiment data are applied as a double filter to enhance the momentum strategy. The results show that news sentiment and social media data improve the performance of the momentum strategy.
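For reference, the Wilder-style RSI on which the signals are based can be computed as below; the 5-day window follows the abstract, while the use of pandas exponential smoothing is our implementation choice.

```python
import pandas as pd

# Wilder RSI: ratio of smoothed gains to smoothed losses, mapped to
# [0, 100]; 70/30 are the overbought/oversold trigger levels.
def rsi(close: pd.Series, window: int = 5) -> pd.Series:
    delta = close.diff()
    gain = delta.clip(lower=0).ewm(alpha=1 / window, adjust=False).mean()
    loss = (-delta.clip(upper=0)).ewm(alpha=1 / window, adjust=False).mean()
    return 100 - 100 / (1 + gain / loss)
```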
Abstract
In this study, we introduce a new method of assessing the credit risk of corporate bonds, in which news sentiment data is used in addition to the historical market data. Typically, a higher yield spread is associated with higher credit risk. By predicting the upward/downward movement of yield and yield spread accurately, the credit risk associated with the bonds can be detected precisely. The corporate bonds studied were issued after 1 January 2007 by seven chosen companies listed in the Euro Stoxx 50 index. The time series of bond yields and news sentiment cover the period from 1 January 2007 to 15 May 2017. The modelling of the dynamics of corporate bond yields and credit spreads is based on ARIMA and ARIMAX models. In the ARIMAX model, macroeconomic and firm-specific news sentiment are used as the external explanatory variables. We examine the effect of several categories of macroeconomic news sentiment and firm-specific news sentiment on corporate bond yield spreads. Furthermore, we separate the positive and negative sentiment and investigate their impact on the forecast of corporate bond yields. It is found that negative country news sentiment and central bank news sentiment are effective during a recession period and positive country news sentiment is effective in the recovery period. Negative government and firm-specific news sentiment, in general, affect corporate bond yield spreads more than positive government and company news sentiment.
Abstract
Forecasting of stock return volatility plays an important role in the financial markets. GARCH model is one of the most common models used for predicting asset price volatility from the return time series. In this study, we have considered quantified news sentiment as a second source of information, which is used together with the GARCH model to predict the volatility of asset price returns. We call this NA-GARCH (news augmented GARCH) model. Our empirical investigation compares volatility prediction of returns of 12 different stocks (from two different stock markets), with 9 data sets for each stock. Our results clearly demonstrate that NA-GARCH provides a superior prediction of volatility than the “plain vanilla” GARCH model. These results vindicate some recent findings regarding the utility of news sentiment as a predictor of volatility, and also vindicate the utility of our novel model structure combining the proxies for past news sentiments and the past asset price returns.
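The NA-GARCH idea can be sketched as a one-line extension of the GARCH(1,1) variance recursion; the additive sentiment term and coefficient names below are illustrative assumptions, as the paper's exact functional form is not reproduced here.

```python
import numpy as np

# NA-GARCH(1,1) sketch: vanilla GARCH(1,1) variance recursion augmented
# with a lagged news-sentiment proxy s[t-1].
def na_garch_variance(eps, s, omega, alpha, beta, gamma):
    """eps: return innovations; s: news sentiment proxy (same length)."""
    sigma2 = np.empty(len(eps))
    sigma2[0] = np.var(eps)              # initialise at sample variance
    for t in range(1, len(eps)):
        sigma2[t] = (omega + alpha * eps[t - 1] ** 2
                     + beta * sigma2[t - 1] + gamma * s[t - 1])
    return sigma2
```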
Abstract
We describe a method for generating daily trading signals to construct trade portfolios of exchange traded securities. Our model uses Second Order Stochastic Dominance (SSD) as the choice criterion for both long and short positions. We control the dynamic risk of 'drawdown' by applying money management. The asset choices for long and short positions are influenced by market sentiment; the market sentiments are in turn acquired from news wires and micro-blogs. The solution method is challenging as it requires processing stochastic integer programming (SIP) models as well as computing the impact of market sentiment. The computation of SSD portfolios is well known to be computationally hard, as it involves processing large discrete MIP problems. The solution approach is based on our well-established solver system FortSP, which uses CPLEX as its embedded solver engine to process SIP models.
Abstract
In this study we investigate how the prediction of future volatility is improved by using news (meta)data. We use three input time series, namely: (i) market data, (ii) news sentiment impact scores, as explained by Yu (2014), and (iii) the news volume. We compare the results of predicting volatility by using a “vanilla” GARCH model, which uses market data only, and the news enhanced GARCH, as described above. Finally, the forecasted volatility is compared with the realized volatility, allowing an assessment of the robustness and precision of the model. RavenPack and Thomson Reuters provided news data and market data, respectively. The main findings are that the inclusion of scheduled news and the inclusion of news volume characterized by negative sentiment improve the forecasted volatility. The added value of scheduled news to volatility predictions is in line with Li and Engle (1998).
Abstract
Due to its significance, forecasting asset volatility has been an active area of research in recent decades. In this whitepaper we aim to take into account the stylised facts of volatility to improve the predictive power of a simple GARCH model. We investigate the power of three GARCH models (GARCH, EGARCH, GJR-GARCH) using implied volatility and news sentiment data as external regressors in order to enhance forecasts of stock return volatility. We also explore the impact of using fat-tailed and skewed distributions. The analysis is conducted on 5 constituents of the S&P 500. In terms of in-sample performance, the findings suggest that a GJR-GARCH(1,1) model incorporating a Student's t distribution, implied volatility and news sentiment data consistently outperforms a simple GARCH(1,1) with a normal distribution. When comparing out-of-sample forecast performance, the enhanced models were able to improve volatility predictions for four out of five stocks.
Abstract
We report an empirical study of a predictive analysis model for equities; the model uses high frequency (minute-bar) market data and quantified news sentiment data. The purpose of the study is to identify a predictive model which can be used in designing automated trading strategies. Given that trading strategies take into consideration three important characteristics of an asset, namely, return, volatility and liquidity, our model is designed to predict these three parameters for a collection of assets. The minute-bar market data as well as intraday news sentiment metadata have been provided by Thomson Reuters.
Abstract
Computer trading in financial markets is a rapidly developing field with a growing number of applications. Automated analysis of news and computation of market sentiment is a related applied research topic which impinges on the methods and models deployed in the former. In this review we first explore the asset classes which are best suited for computer trading. We present in summary form the essential aspects of market microstructure and the process of price formation as it takes place in trading. We critically analyse the role of different classes of traders and categorise alternative types of automated trading. We introduce alternative measures of liquidity which have been developed in the context of bid-ask price quotation and explore their connection to market microstructure and trading. We review the technology and the prevalent methods for news sentiment analysis, whereby qualitative textual news data is turned into market sentiment. The impact of news on liquidity and automated trading is critically examined. Finally, we explore the interaction between manual and automated trading.
Abstract
A review of news analytics and its applications in finance is given in this chapter. In particular, we review the multiple facets of current research and some of the major applications. It is widely recognised that news plays a key role in financial markets. The sources and volumes of news continue to grow. New technologies that enable automatic or semi-automatic news collection, extraction, aggregation and categorisation are emerging. Furthermore, machine learning techniques can be used to process the textual input of news stories to determine quantitative sentiment scores. We consider the various types of news available and how these can be processed to form inputs to financial models. We report applications of news for the prediction of abnormal returns, for trading strategies and for diagnostic applications, as well as the use of news for risk control.
Abstract
This white paper surveys stochastic programming scenario generation methods. We introduce the basic concepts relating to scenario generation and the main scenario generation methods - by sampling, simulation and statistical approaches. We also review new scenario generation methods such as "hybrid" methods; a review of basic stochastic programming theory can be found in the appendices. The purpose of this white paper is to provide an informed survey of stochastic programming scenario generation methods. This informational analysis will be of value not only to academics but also to modelling practitioners. This paper will be of interest to:
- quantitative analysts
- operational research analysts
- other professionals involved in the area of stochastic modelling and optimization.
Abstract
Many financial decision problems require scenarios for multivariate financial time series that capture their sequentially changing behaviour, including their extreme movements. We consider modelling financial time series by hidden Markov models (HMMs), which are regime-switching-type models. Estimating the parameters of an HMM is a difficult task and the multivariate case can pose serious implementation issues. After the parameter estimation, the calibrated model can be used as a scenario generator to describe the future realizations of asset prices. The scenario generator is tested in a single-period mean-conditional value-at-risk optimization problem for portfolio selection.
Abstract
Investment decisions are made ex ante, that is, based on parameters that are not known at the time of decision making. Scenario generators are used not only in models for (optimum) decision making under uncertainty; they are also used for the evaluation of decisions through simulation modelling. In this paper, we review those properties of scenario generators which are regarded as desirable; these are not sufficient to guarantee the 'goodness' of a scenario generator. We also review classical models for scenario generation of asset prices. In particular, we consider some recently reported methods which have been proposed for distributions with 'heavy tails'.
Abstract
Geometric Brownian motion (GBM) is a standard method for modelling financial time series. An important criticism of this method is that the parameters of the GBM are assumed to be constant; as a result, important features of the time series, such as extreme behaviour or volatility clustering, cannot be captured. We propose an approach by which the parameters of the GBM are able to switch between regimes; more precisely, they are governed by a hidden Markov chain. Thus, we model the financial time series via a hidden Markov model (HMM) with a GBM in each state. Using this approach, we generate scenarios for a financial portfolio optimization problem in which the portfolio CVaR is minimized. Numerical results are presented.
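A minimal regime-switching scenario generator in this spirit can be put together with the hmmlearn library; the two-state choice, synthetic returns and sampling horizon below are our assumptions for illustration only.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM

# Fit a 2-state Gaussian HMM to log-returns (each state playing the
# role of a GBM regime), then sample a scenario path of returns.
rng = np.random.default_rng(1)
log_returns = rng.normal(0.0003, 0.01, size=(1000, 1))  # placeholder data

hmm = GaussianHMM(n_components=2, covariance_type="diag", n_iter=200)
hmm.fit(log_returns)
scenario, states = hmm.sample(250)          # ~one year of daily returns
prices = 100 * np.exp(np.cumsum(scenario))  # price path from returns
```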
Abstract
Stochastic programming (SP) brings together models of optimum resource allocation and models of randomness and thereby creates a robust decision-making framework. The models of randomness with their finite, discrete realizations are known as scenario generators. In this report, we consider alternative approaches to scenario generation in a generic form which can be used to formulate (a) two-stage (static) and (b) multi-stage dynamic SP models. We also investigate the modelling structure and software issues of integrating a scenario generator with an optimization model to construct SP recourse problems. We consider how the expected value and SP decision model results can be evaluated within a descriptive modelling framework of simulation. Illustrative examples and computational results are given in support of our investigation.
Abstract
Market conditions change over time, leading to up-beat (bullish) or down-beat (bearish) market sentiments. The concept of bull and bear markets, also known as market regimes, is introduced to describe market status. Since the regimes of the total market are not observable, while returns can be calculated directly, the modelling paradigm of the hidden Markov model is introduced to capture the tendency of financial markets to change their behaviour abruptly.
In this project we analyse the FTSE 100 and Euro Stoxx 50 data series via the well-known Hidden Markov Model (HMM). Using this model, we are able to better capture stylized facts such as fat tails and volatility clustering compared with Geometric Brownian motion (GBM), and to find market signals for forecasting future market conditions.
Abstract
Convex quadratic programming (QP) as applied to portfolio planning is established and well understood. In this paper, presented in two parts, we highlight the importance of choosing an algorithm that processes a family of problems efficiently. In Part I in particular we describe an adaptation of the simplex method for QP. The method takes advantage of the sparse features of simplex, and the use of the duality property makes it ideally suited for processing discrete optimization models. Part II of the paper (to be published in issue 8/4) considers a family of discrete QP formulations of the portfolio problem, which capture threshold constraints and cardinality restrictions. We describe the adaptation of a novel method, 'branch, fix and relax', to process this class of models efficiently. Theory and computational results are presented.
Abstract
Convex quadratic programming as applied to portfolio planning is established and well understood. In this paper, presented in two parts, we highlight the importance of choosing an algorithm that processes a family of problems efficiently. In Part I (published in issue 8/3), in particular, we described an adaptation of the simplex method for quadratic programming (QP). The method not only takes advantage of the sparse features of simplex; the use of the duality property also makes it ideally suited for processing discrete optimization models. Part II of the paper considers a family of discrete QP formulations of the portfolio problem, which capture threshold constraints and cardinality restrictions. We describe the adaptation of a novel method, 'branch, fix and relax', to process this class of models efficiently. Theory and computational results are presented.
Abstract
We report a computational study of two-stage SP models on a large set of benchmark problems and consider the following methods: (i) solution of the deterministic equivalent problem by the simplex method and an interior point method, (ii) Benders decomposition (L-shaped method with aggregated cuts), (iii) regularised decomposition of Ruszczyński (Math Program 35:309-333, 1986), (iv) Benders decomposition with regularization of the expected recourse by the level method (Lemaréchal et al. in Math Program 69:111-147, 1995), (v) trust region (regularisation) method of Linderoth and Wright (Comput Optim Appl 24:207-250, 2003). In this study the three regularisation methods have been introduced within the computational structure of Benders decomposition. Thus second-stage infeasibility is controlled in the traditional manner, by imposing feasibility cuts. This approach allows extensions of the regularisation to feasibility issues, as in Fábián and Szőke (Comput Manag Sci 4:313-353, 2007). We report computational results for a wide range of benchmark problems from the POSTS and SLPTESTSET collections and a collection of difficult test problems compiled by us. Finally the scale-up properties and the performance profiles of the methods are presented.
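For orientation, the aggregated optimality cut at the heart of the L-shaped method has the standard form below (our notation; feasibility cuts are generated analogously when a scenario subproblem is infeasible):

```latex
% With first-stage decision x, scenario probabilities p_s and dual
% solutions pi_s of the scenario subproblems, the master problem's
% recourse approximation theta must satisfy
\theta \;\ge\; \sum_{s=1}^{S} p_s \, \pi_s^{\top} \left( h_s - T_s x \right).
```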
Abstract
This paper proposes a new algorithm to compute the residual risk of failure of an explosion protection system on an industrial process plant. A graph theoretic framework is used to model the process. Both of the main causes of failure are accounted for, viz. hardware failure and inadequate protection even when the protection hardware functions according to specification. The algorithm is shown to be both intuitive and simple to implement in practice. Its application is demonstrated with a realistic example of a protection system installation on a spray drier.
Abstract
The aim of this project is to forecast futures spreads of WTI Crude Oil. The motivation for this project springs from the fact that trading calendar futures spreads is much more advantageous than trading many other financial instruments. We make use of the fact that futures prices follow a mean-reverting process (the Ornstein-Uhlenbeck process, OU). We propose a new method which combines three linear Gaussian state space models, namely a one-factor model, a one-factor model with risk premium, and a one-factor model with seasonality. Thereafter, we directly model futures spreads. The Kalman filter and Maximum Likelihood Estimation (MLE) are used to estimate the model parameters. It is shown that this new approach, using the ratio between the nearest futures prices and spot prices as the latent variable and the calendar futures spreads vector as the observed variable, is more accurate and robust than the indirect forecasting method which takes spot prices and futures prices as the latent and observed variables respectively. Results on calibration and comparison for the three models and two methods, as well as out-of-sample forecasting results, are then presented and discussed.
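The mean-reverting building block referenced above has a well-known exact discretisation, which supplies the linear transition equation needed by the Kalman filter (our notation):

```latex
% Ornstein-Uhlenbeck process dx_t = kappa (theta - x_t) dt + sigma dW_t,
% sampled at interval Delta:
x_{t+\Delta} \;=\; \theta \left(1 - e^{-\kappa \Delta}\right) + e^{-\kappa \Delta} x_t + \eta_t,
\qquad
\eta_t \sim \mathcal{N}\!\left(0,\; \frac{\sigma^{2}}{2\kappa}\left(1 - e^{-2\kappa \Delta}\right)\right).
```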