Martin Scholl (University of Oxford): "Studying Market Ecology Using Agent-Based Models"
Martin Scholl, a researcher at the University of Oxford, studies market ecology using agent-based models. Unlike traditional approaches that rest on assumptions such as the efficient market hypothesis, Scholl takes issue with the rational expectations equilibrium commonly employed in neoclassical finance: it requires every participant to have a perfect understanding of the real world, which is unrealistic given the cognitive limitations of both retail investors and fund managers. Instead, he advocates applying tools from biology to real-world financial data, offering a fresh perspective on financial markets.
To explore market ecology, Scholl likens investment strategies to species in biology, with individual investors representing individuals of a given species. The aggregate wealth invested using a particular strategy is comparable to the abundance or total population size of that species. In a toy model of an investment game, Scholl introduces a simplified scenario where agents can choose to leave their wealth in a money market account or invest in a stock that pays dividends. This model allows for the examination of various investment strategies and objections to the neoclassical assumption of perfect rationality.
Scholl identifies several investment strategies employed in the agent-based models used to study market ecology. The first is a value-investing strategy derived from a perfectly rational benchmark: net asset value is divided between the stock and cash, and the value investor estimates the growth rate of the dividend to forecast the stock's future price. The second is trend following, in which investors analyze recent prices and extrapolate the trend. The third is noise trading: noise traders enter the market to meet liquidity needs and are not price-sensitive on short time scales, although their mean-reverting noise process is tied to the fundamental value over long horizons.
To simulate market mechanisms and study market ecology, Scholl and his team utilize agent-based models with the help of software packages. They ensure comparability between different runs of the model by fixing endowments and dividing the initial endowments among individuals of different species, keeping track of the relative share. The simulations run for a span of 200 years, enabling the observation of the mean annual return for each species. Interestingly, they find that each strategy has at least one region where it is the most profitable, regardless of its abundance.
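To make the setup concrete, here is a minimal sketch of such a toy market in Python. The demand rules, parameter values, and market-clearing convention (one share outstanding, wealth marked at the clearing price) are illustrative assumptions, not Scholl's actual specification:

```python
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(0)

# --- illustrative parameters, not Scholl's calibration ---
steps_per_year, years = 12, 50
T = steps_per_year * years
r   = 0.01 / steps_per_year      # money-market rate per step
g   = 0.02 / steps_per_year      # expected dividend growth per step
req = 0.05 / steps_per_year      # discount rate used for the fundamental value
div_vol = 0.02

def value_w(p, fundamental):     # overweight the stock when it looks cheap
    return np.clip(0.5 + 2.0 * (fundamental / p - 1.0), 0.0, 0.99)

def trend_w(recent):             # extrapolate the average recent return
    return np.clip(0.5 + 20.0 * np.mean(recent), 0.0, 0.99)

def noise_w(state):              # liquidity-driven, not price sensitive short term
    return np.clip(0.5 + state, 0.05, 0.95)

dividend, noise_state = 0.005, 0.0
price = dividend / (req - g)
cash   = np.array([1.0, 1.0, 1.0])     # value, trend, noise "species"
shares = np.zeros(3)
returns = [0.0] * steps_per_year

for t in range(T):
    dividend *= np.exp(g - 0.5 * div_vol**2 + div_vol * rng.standard_normal())
    fundamental = dividend / (req - g)
    noise_state = 0.9 * noise_state + 0.05 * rng.standard_normal()
    cash = cash * (1.0 + r) + shares * dividend      # interest and dividends accrue

    def weights(p):
        return np.array([value_w(p, fundamental),
                         trend_w(returns[-steps_per_year:]),
                         noise_w(noise_state)])

    def excess(p):   # one share outstanding: total stock demand must equal 1
        return float(weights(p) @ (cash + shares * p)) / p - 1.0

    new_price = brentq(excess, 1e-8, 1e8)
    w = weights(new_price)
    wealth = cash + shares * new_price               # mark to the clearing price
    shares, cash = w * wealth / new_price, (1.0 - w) * wealth
    returns.append(new_price / price - 1.0)
    price = new_price

wealth = cash + shares * price
print("wealth shares (value, trend, noise):", np.round(wealth / wealth.sum(), 3))
print("annualised growth per species:", np.round(wealth ** (1 / years) - 1, 3))
```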
In his experiments, Scholl examines the behavior of trend followers and the impact of reinvesting profits. He observes that the market spends most of its time in an unstable, chaotic region with large outliers, resulting in speckled noise. When investors reinvest their profits, trajectories fluctuate around an identified central point but do not entirely converge towards it. Increasing the concentration of trend followers leads to higher volatility in returns. Scholl attributes the quick movement away from trend followers to the rationality of investors and positive autocorrelation in the dividend process.
Scholl explains that agent-based models can be used to construct a financial community matrix, analogous to the predator-prey Lotka-Volterra equations in biology. The wealth invested in a strategy plays the role of population size, and the sensitivity of each strategy's return to changes in the population sizes forms the community matrix. In the financial market, competition between strategies arises when prices deviate from equilibrium points. Scholl emphasizes that financial markets exhibit strong density dependence, making interactions between species more complex than in most biological systems; this density dependence can produce bubble-like price increases, although he acknowledges that such situations are unrealistic.
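As a rough illustration of how such a community matrix could be estimated, the sketch below computes a finite-difference Jacobian of strategy returns with respect to wealth shares. The function standing in for a model run, and its coefficients, are hypothetical:

```python
import numpy as np

def mean_returns(shares):
    """Stand-in for one run of the agent-based market: maps each strategy's
    wealth share to that strategy's mean annual return.  The quadratic form
    below is purely illustrative, not a calibrated model."""
    A = np.array([[-0.8,  0.3,  0.1],
                  [ 0.4, -0.6,  0.2],
                  [ 0.1,  0.1, -0.2]])
    return A @ (shares - 1 / 3)

def community_matrix(f, shares, eps=1e-4):
    """Finite-difference Jacobian d r_i / d n_j: how the return of strategy i
    responds to a small change in the abundance (wealth share) of strategy j."""
    n = len(shares)
    J = np.zeros((n, n))
    for j in range(n):
        up, dn = shares.copy(), shares.copy()
        up[j] += eps
        dn[j] -= eps
        J[:, j] = (f(up) - f(dn)) / (2 * eps)
    return J

shares = np.array([1 / 3, 1 / 3, 1 / 3])   # value, trend, noise abundances
print(community_matrix(mean_returns, shares))
```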
In the context of market ecology, Scholl discusses the practical implications of his findings. He presents a linear model that uses species abundances to describe the predator-prey-like relationships between strategies and their effect on market outcomes. This approach highlights the multidimensional nature of investing and shows the importance of sizing strategies appropriately to avoid losses, or becoming prey, in highly density-dependent financial markets. It challenges the traditional view that stock prices reflect all available fundamental information and presents financial markets as complex systems shaped by varying conditions.
Scholl further elaborates on the use of this simple linear model within the agent-based framework. By analyzing holdings and the relative abundance of market activity, he found that it outperformed models that assume rationality and translate fundamentals into prices automatically. He acknowledges the limitations of his model, however, and emphasizes the need for further research to make it more realistic. One issue he addresses is the model's sensitivity to different recipes and definitions, particularly for trend following. Dividends play a significant role in his model, and incorporating more realistic features of real-world financial markets would require additional steps.
Regarding the adaptability of agents' beliefs in his model, Scholl points out that market operations often involve fund managers following strategies outlined in prospectuses for extended periods. This indicates a tendency toward mechanical asset allocation processes. As a result, Scholl leans towards modeling less adaptive behavior and less intelligence. However, he highlights that other researchers in his group at the University of Oxford are actively exploring the application of evolutionary algorithms to change parameters and even innovate new strategies.
Martin Scholl's research studies market ecology using agent-based models. He challenges traditional finance theories and assumptions by applying concepts from biology to better understand financial markets. By likening investment strategies to species, analyzing the strategies individually, and simulating market mechanisms, Scholl uncovers the complexity of financial markets and the interplay between strategies. His findings suggest that financial markets are highly density-dependent and that sizing investment strategies appropriately is crucial to avoid losses and to avoid becoming prey in this dynamic ecosystem. His work frames markets as complex systems, in contrast to the traditional view that stock prices simply reflect fundamental information.
Kevin Webster: "How Price Impact Distorts Accounting P&L"
Kevin Webster: "How Price Impact Distorts Accounting P&L"
In a YouTube video, Kevin Webster delves into the topic of how price impact can distort accounting profit and loss (P&L) statements. He emphasizes the significance of accurately modeling price impact to effectively manage risk and highlights the importance of managing liquidity risk to avoid being left with an illiquid position. Webster acknowledges that there are various price impact models available, but they generally agree on the majority of the data.
The talk begins by addressing the intersection between price impact and liquidity risk, noting in particular that the liquidity of major markets was often taken for granted before the financial crisis. Webster shares quotes illustrating how price impact creates an illusion of profit, leading to price dislocations away from fundamental values. The objective of the talk is to formalize this idea mathematically, providing a quantitative framework, based on estimating the market impact of liquidation, that removes the illusion of profit.
Webster explains price impact as a causal model for trading, where more aggressive trading pushes prices further and vice versa. Price impact models are widely used in transaction cost analysis and optimal execution, serving as pre-trade tools to estimate expected transaction costs and optimize execution strategies. He showcases a mock transaction cost analysis report that allows traders to evaluate how their algorithms are performing on a quarterly basis, with a focus on minimizing order slippage and considering both mechanical moves and alpha slippage.
The speaker discusses the guidelines published by the European Securities and Markets Authority (ESMA) on liquidity stress tests, which involve simulating asset liquidation during periods of market stress. Simulating the market's reaction, such as price dislocations, and employing hedging strategies are crucial to reducing risk exposure. Webster references the literature on liquidity stress tests and on the effect of price impact on accounting P&L, including work by Caccioli, Bouchaud, and Farmer, and guidance from regulators such as ESMA and the Basel Committee. He emphasizes that liquidity stress testing is necessary to mitigate situations that could hit accounting P&L and result in high liquidation costs.
The concept of a trading footprint is introduced: it measures the distorting effect of price impact on accounting P&L and ties together different definitions of P&L. Webster presents a simple fire-sale model to illustrate the main conclusions about accounting P&L drawn in the Caccioli-Bouchaud-Farmer paper. He explains how the number traders and portfolio managers observe on a daily basis overestimates their final P&L, which deflates once the trade is completed. This inflation property can, however, be measured and displayed in real time, giving traders actionable information. Webster notes that position-inflation losses are often temporary and depend on risk tolerance.
The issues involved in valuing a stock position and its effect on a firm's P&L are discussed. Webster highlights the ambiguity in choosing which prices to use for marking the position, and the difference between accounting P&L and the fundamental P&L used by trading algorithms. The trading footprint is defined as the difference between accounting P&L and fundamental P&L, and the ambiguity is resolved once the position is closed. The speaker explores position inflation and the assumptions under which this property holds. The impact model and its two cases, the original OW model and the variant studied by Fruehwirth and Bond, are also touched upon.
Webster explains that for the model to make sense, a no-arbitrage condition between lambda and beta needs to be satisfied, along with a self-financing equation condition. He delves into calculating expected P&L at closing time and how the trading footprint introduces bias into accounting P&L. The position inflation property causes the position to inflate during the position entering phase, remain during the holding phase, and eventually evaporate. All of these aspects can be observed in real-time on a trading screen, providing traders with valuable insights.
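A minimal numerical sketch of this inflation-and-evaporation pattern, assuming a stylized impact model with exponential decay and made-up parameters, is shown below; the footprint equals the position times the current impact, so it builds up while entering, decays while holding, and vanishes once the position is fully liquidated:

```python
import numpy as np

rng = np.random.default_rng(1)

# --- hypothetical parameters for an OW-style decaying impact model ---
T = 300                  # 100 steps entering, 100 holding, 100 exiting
lam, beta = 0.1, 0.05    # impact per unit traded and exponential decay rate
sigma = 0.02             # volatility of the unaffected price

trades = np.concatenate([np.full(100, +1.0),   # build the position
                         np.zeros(100),        # hold
                         np.full(100, -1.0)])  # liquidate

unaffected = np.cumsum(sigma * rng.standard_normal(T))  # fundamental price path
impact = np.zeros(T)
for t in range(1, T):
    # impact decays at rate beta and is pushed by the trade at time t
    impact[t] = impact[t - 1] * (1 - beta) + lam * trades[t]

observed = unaffected + impact
pos = np.cumsum(trades)

# accounting P&L marks the position at the observed (impacted) price,
# fundamental P&L marks it at the unaffected price; both net of cash paid
cash = -np.cumsum(trades * observed)
accounting_pnl = cash + pos * observed
fundamental_pnl = cash + pos * unaffected
footprint = accounting_pnl - fundamental_pnl   # = pos * impact

print("footprint after entry, after holding, after exit:",
      footprint[99].round(2), footprint[199].round(2), footprint[-1].round(2))
```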
Webster further explains the distortions in accounting P&L caused by price impact. He discusses how traders can make profitable trades even without alpha, but warns that these profits are short-lived due to transaction costs. Monitoring price dislocations early on is crucial to avoid losses. Additionally, Webster notes that portfolio managers prefer to view their portfolios as a whole and introduces the concept of a stationary portfolio, which controls the size and turnover of a portfolio in the mathematical finance world.
The concept of a stationary portfolio is then explored in relation to estimating running transaction costs. By understanding the time scale of the propagator, traders can estimate the extent to which their positions are inflated and the illusion of profit they may lose when liquidating their positions. Webster demonstrates the framework using empirical data, showcasing its applicability to real-world scenarios. He applies the framework to a fire sale model and explains the differences between accounting P&L and fundamental P&L, highlighting how they inform different objective functions based on a trader's risk aversion.
The speaker delves into the impact of fire sales or the trading activity of other market participants on a trader's P&L and position. Aggressive hedging can lead to crowding effects and position inflation, resulting in permanent losses. Accurately modeling price impact is crucial for effective risk management, and managing liquidity risk is emphasized to avoid ending up with illiquid positions.
Webster acknowledges that while there are many different price impact models available, they generally agree on the majority of the data. However, differences may arise in the amount and duration of the impact's persistence. Temporary dislocations can last from a couple of days to a month. From a risk management perspective, there is a clear course of action, whereas from a trader and performance perspective, effective communication becomes key. Understanding whether P&L is mechanical or not and removing the mechanical part allows traders to focus on actual alpha or edge in their trades.
The speaker explains the "no price manipulation" principle, highlighting that even if traders gain profits, they cannot maintain them as they will eventually evaporate. Position inflation leads to the deflation of trade value over time or immediate liquidation, resulting in zero or even negative P&L. Therefore, traders need to rely on other variables to generate sustainable profits. Webster further explores the correlation between the initial impact state, the impact caused by the rest of the market, and the impact from the trader's hedges and the rest of the market.
In conclusion, Kevin Webster provides a comprehensive understanding of how price impact can distort accounting P&L. He sheds light on the extra costs during high-volatility liquidity regimes and their correlation with the broader market, emphasizing their impact on bias. From a regulatory perspective, corporate bonds and insurance companies are likely to be more affected by this bias. While Webster admits that he lacks detailed answers for markets outside of equities, he provides a solid mathematical foundation for understanding price impact and its potential distortion of P&L.
Laura Leal (Princeton University) - "Learning a Functional Control for High-Frequency Finance"
Laura Leal, a researcher from Princeton University, delivered an informative presentation on the application of deep neural networks in high-frequency finance. She emphasized the limitations of conventional solutions and explored the advantages of utilizing neural networks in this domain. Leal highlighted their ability to adapt to complex factors like autocorrelation and intraday seasonality, which traditional models struggle with. By leveraging neural networks, traders can achieve optimal execution by minimizing market impact and trading smoothly.
To address concerns about the black box nature of neural networks, Leal introduced the concept of explainability. She discussed the projection of neural network control onto a lower-dimensional manifold, enabling a better understanding of the associated risks and the deviation from familiar risk sectors. The team evaluated the performance of the neural network control, comparing it with the classic closed-form PDE (partial differential equation) solution. They examined the value function, mark-to-market wealth, and relative errors in projections to assess the accuracy and effectiveness of the neural network approach.
Leal delved into the intricacies of training the neural network, emphasizing the importance of incorporating real-world data and accurate dynamics. She also proposed a multi-preference controller that allows traders to input their risk preferences, enabling quicker adaptation to new market conditions. By considering risk aversion parameters and incorporating a trader's preferences, the neural network can generate a solution to the stochastic optimization problem in high-frequency finance.
The presenter discussed the structure of the neural network used for risk control, highlighting its recurrent nature. While the network is not excessively deep, it employs a recurring structure at each time step, updating weights simultaneously. The inputs to the network include time and inventory, while the output is the control itself—determining the optimal amount of stocks to trade at each time step. To address the challenge of limited financial data availability, transfer learning is employed, simulating data using Monte Carlo methods.
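The sketch below illustrates the general idea of a weight-sharing control network rolled forward through the trading day, with time and inventory (plus a risk-aversion input) mapped to a trading rate. The layer sizes, activations, and inventory update are assumptions for illustration, not Leal's architecture:

```python
import torch
import torch.nn as nn

class ExecutionControl(nn.Module):
    """Small feed-forward block reused at every time step (shared weights),
    mapping (time, inventory, risk aversion) to a trading rate.
    Layer sizes are illustrative only."""
    def __init__(self, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, t, q, gamma):
        x = torch.stack([t, q, gamma], dim=-1)
        return self.net(x).squeeze(-1)           # trading rate at this step

def simulate_day(policy, q0=1.0, gamma=1.5, n_steps=50):
    """Roll the same network through the trading day, updating inventory."""
    q = torch.full((1,), q0)
    rates = []
    for k in range(n_steps):
        t = torch.full((1,), k / n_steps)
        rate = policy(t, q, torch.full((1,), gamma))
        q = q - rate / n_steps                   # inventory drops by what was traded
        rates.append(rate)
    return torch.stack(rates), q

policy = ExecutionControl()
rates, terminal_q = simulate_day(policy)
print(rates.shape, terminal_q.item())
```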
Leal outlined the process of projecting the neural network control onto a linear function space using linear regression. This projection technique facilitates a better understanding of the non-linear functions of the neural network and their alignment with closed-form control solutions. The results demonstrated the impact of incorporating seasonality and risk aversion parameters on the model's reaction to the market. Additionally, the presenter emphasized the significance of gamma, which is typically set to two in the literature but showed a non-linear solution when taken as three over two.
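A simple way to picture this projection step: evaluate the learned control on a grid of time-inventory points and regress it onto a small linear basis. The stand-in control function and the choice of basis below are hypothetical:

```python
import numpy as np

def nn_control(t, q):
    """Stand-in for a trained network's control; mildly non-linear on purpose."""
    return -q * (1.2 + 0.3 * np.sin(3 * t)) + 0.05 * q ** 3

# evaluate the control on a grid of (time, inventory) points
t, q = np.meshgrid(np.linspace(0, 1, 50), np.linspace(-1, 1, 50))
u = nn_control(t, q).ravel()

# project onto the linear function space spanned by {1, t, q, t*q}
X = np.column_stack([np.ones(t.size), t.ravel(), q.ravel(), (t * q).ravel()])
coef, *_ = np.linalg.lstsq(X, u, rcond=None)
residual = u - X @ coef
rel_error = np.linalg.norm(residual) / np.linalg.norm(u)

print("projected coefficients:", coef.round(3))
print("relative projection error:", round(rel_error, 4))
```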
The performance and accuracy of the neural network control in executing trades for high-frequency finance were thoroughly evaluated. Leal compared the value function, mark-to-market wealth, and relative errors in projections across different scenarios and gamma values. While the neural network exhibited superior performance, it executed trades in a non-linear manner, deviating from the known control solution. This raised questions about the decision to trade using the neural network and determining appropriate margin levels based on its divergence from the established solution.
Leal explored the benefits of the multi-preference controller approach, which lets traders input their risk aversion parameters and start trading immediately with a pre-trained model. While the neural network solution took longer to run than the PDE solution, it offered greater flexibility and adaptability to different risk preferences. To enhance explainability, Leal proposed the projection idea using linear regression, reducing the computational burden while retaining the multi-preference capability. She also highlighted broader applications of the neural-network approximation, suggesting its relevance to other financial problems such as hedging.
The training process for the neural network in high-frequency finance was discussed, emphasizing offline training to avoid latency issues associated with online reinforcement learning. The network takes time, inventory, and potentially risk aversion parameters as inputs and produces a rate as output. Leal also described the fine-tuning procedure in transfer learning, transitioning from simulated data to real data increments obtained from the Toronto Stock Exchange once the network has converged. The presenter underscored the importance of using real-world data and accurate dynamics during the training process, as it enhances the network's ability to capture the complexities of high-frequency finance.
In the subsequent section, Laura Leal provided insights into the inputs and objective function employed in the neural network for high-frequency finance. The neural network incorporates the inventory as a proportion of the average volume for a specific stock during a day, allowing for a normalized representation. The objective function is framed as a maximization problem, with the output serving as the control for optimal execution. The structure of the neural network is based on function approximation, utilizing two input nodes and four hidden layers to capture the underlying relationships.
Addressing a question about the discrepancy between two control solutions, Leal clarified that it could be interpreted as a reflection of the changing utility of the investor. By adjusting the gamma parameter, different utility functions can be employed, leading to variations in the control solutions. In their research, the team chose the gamma value of three halves based on empirical testing with actual traders, which resulted in satisfactory performance.
Leal further highlighted that the neural network's output is observable and analyzable. They can monitor the positions taken by the network and how they evolve throughout the trading day, providing transparency and insights into the decision-making process. This level of interpretability and understanding allows traders to gain confidence in the neural network's execution strategies.
The challenges associated with developing functional controls for high-frequency finance were also discussed by Leal. While an average control process can provide overall insights into trade execution, it may not accurately represent the behavior of individual trajectories. The dynamics of the market, such as the emergence of meme stocks, necessitate the adaptation of control methods to capture evolving conditions effectively.
In conclusion, Laura Leal's presentation shed light on the complexities of creating effective controls in the realm of high-frequency finance. By leveraging deep neural networks, researchers and traders can overcome the limitations of traditional models and adapt to the intricate dynamics of this domain. The incorporation of risk preferences, explainability measures, and real-world data contributes to the development of robust and adaptable control solutions. Through their work, Leal and her team offer valuable insights and solutions that pave the way for more efficient and informed decision-making in high-frequency finance.
Zihao Zhang (Oxford-Man Institute) - "Deep Learning for Market by Order Data"
Zihao Zhang, a postdoctoral researcher at the Oxford-Man Institute and part of the machine learning research group, presents his team's recent work on applying deep learning to market by order data. Their focus is on market microstructure data, particularly the limit order book, which provides valuable insights into the overall demand and supply dynamics for a specific financial instrument. By combining market by order and limit order book data, Zhang and his team have discovered that they can reduce signal variance and obtain better predictive signals. This application of their model holds potential for enhancing trade execution and market-making strategies.
Zhang begins his presentation by providing a brief introduction to market microstructure data, specifically emphasizing the significance of market by order data. This data source offers highly granular information, providing frequent updates and events compared to the limit order book data, which has received more attention in existing literature. He introduces their deep learning model, explaining the network architectures they have designed for analyzing market by order data. Zhang highlights that their work represents the first predictive model using market by order data for forecasting high-frequency movement, offering an alternative source of information that expands the possibilities for alpha discovery.
Next, Zhang delves into the concept of the limit order book, which serves as a comprehensive record of all outstanding limit orders for a financial instrument at a given point in time. He emphasizes that while chart data offers low-frequency information, the price of a stock is actually represented by the limit order book, which is a multivariate time series. Zhang explains how the limit order book is organized into different price levels based on submitted orders, with each price level consisting of numerous small orders segmented by different traders. He also discusses how the order book is updated when new messages arrive, which can introduce new positions, cancel existing orders, or modify current orders. Zhang points out that the derived data from the limit order book reveals the overall demand and supply relationship for a specific financial instrument, and his objective is to determine if utilizing market by order data, containing information on order placement and cancellation, can provide additional insights for making predictions.
Moving forward, Zhang explores how market by order data can be utilized in deep learning to predict market movements. Although the message strings in market order data possess lower dimensions compared to the limit order book, they offer additional information that can be leveraged for predictions. Zhang explains how past events can be transformed into 2D matrices, forming images that can be fed into a neural network for prediction. The resulting features from the convolutional layer can then be integrated into the recurrent neural layers to learn the structure and capture additional dependencies. The final layer produces predictions based on a classification setup using threshold returns.
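The following sketch shows the general CNN-then-LSTM pattern described here: convolutions collapse each snapshot of the event window into a feature vector, an LSTM reads the resulting sequence, and a softmax head classifies the next move via threshold returns. Shapes and layer sizes are illustrative, not the exact architecture from the talk:

```python
import torch
import torch.nn as nn

class LOBClassifier(nn.Module):
    """Sketch of the CNN + LSTM idea for order-flow windows.
    Input: (batch, 1, time, features); output: logits for down / flat / up."""
    def __init__(self, n_features=40, hidden=64, n_classes=3):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=(1, 2), stride=(1, 2)), nn.ReLU(),
            nn.Conv2d(16, 16, kernel_size=(1, 2), stride=(1, 2)), nn.ReLU(),
            nn.Conv2d(16, 16, kernel_size=(1, n_features // 4)), nn.ReLU(),
        )
        self.lstm = nn.LSTM(input_size=16, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):                      # x: (batch, 1, time, features)
        z = self.conv(x)                       # (batch, 16, time, 1)
        z = z.squeeze(-1).permute(0, 2, 1)     # (batch, time, 16)
        out, _ = self.lstm(z)
        return self.head(out[:, -1])           # classify from the last step

model = LOBClassifier()
window = torch.randn(8, 1, 100, 40)            # 8 samples, 100 events, 40 features
print(model(window).shape)                     # torch.Size([8, 3])
```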
Zhang proceeds to discuss how the network architecture is adapted for market by order data. In this case, the order-book inputs are replaced with messages from individual traders, and the convolutional layers are replaced with an LSTM layer or an attention layer. Zhang briefly explains the attention mechanism, which is used for single-point prediction and involves an encoder-decoder structure: the encoder extracts meaningful features from the input time series and summarizes them into a hidden state, while the decoder generates the prediction. Normalization against the mid-price is used to determine whether an order is a buy or a sell.
In the subsequent section, Zhang presents results for a model trained on a group of assets normalized to a similar scale and tested with different models, such as a simple linear model, a multilayer perceptron, an LSTM, and an attention model, using both limit order book data and pure MBO data. The results indicate that predictive signals from the MBO data are less correlated with signals from the limit order book, suggesting that combining the two sources can reduce signal variance, benefit from diversification, and yield better predictive signals. Accordingly, an ensemble model that averages the predictive signals from both data types performs best.
Zhang then discusses the potential benefits of incorporating market-by-order (MBO) data into predictions and the ability to perform feature engineering on this data. He presents results for prediction horizons ranging from two to 20 ticks ahead, noting similar behavior at 50 and 100 ticks ahead. Zhang also addresses audience questions, including the possibility of training a single model on all instruments for better generalization and the source of the MBO data, the London Stock Exchange. In response to a question about focusing on the F1 score rather than PnL, Zhang agrees that PnL is the more relevant measure of success.
Zhang further discusses the use of predictive signals and the various ways to define them, such as using the raw signal or setting a threshold on the softmax probabilities. He summarizes the key points of the paper, which proposes modeling market by order (MBO) data instead of limit order book data and tests deep learning models including LSTM and attention mechanisms. The results indicate that a combination of MBO and limit order book data yields the best performance. Zhang addresses audience questions about autocorrelation between market moves, filtering out noise trades, and the motivation for using CNN layers to model the limit order book as an image.
In the following section, Zhang explains how the order book can be treated as a spatial structure that can be effectively explored using convolutional neural networks (CNNs). Using a CNN to extract information from each price level has proven to be valuable for predictions. The long short-term memory (LSTM) layer is chosen over multilayer perceptrons as it maintains the temporal flow of data and summarizes past events for making predictions. Zhang notes that the benefits of using an attention mechanism are limited due to the nature of financial time series. The paper includes a detailed description of the hyperparameters employed in their model.
Zhang addresses the concern about the large number of parameters used in neural network methods and their effectiveness in predicting the stock market. He acknowledges that the abundance of parameters invites critique, but notes that his team fine-tuned only a few parameters specific to their model. They have not yet considered using the bid-ask spread as a criterion for success, but recognize its potential for further exploration. Zhang believes the model holds practical value for trade execution and market-making strategies; however, if one intends to cross the spread, downsampling the data may be necessary, since the frequent updates in order book data complicate trade execution. Finally, when modeling the limit order book they aggregate the total size at each price level rather than including information about individual order sizes.
In the concluding section, Zhang explains the differences between market by order and market by price data. Market by order data allows for tracking individual orders, which is not possible with market by price data. With proper feature engineering, market by order data can provide additional information and generate alpha. Zhang also discusses how his model treats modifications in the price of a specific limit order while keeping the size unchanged. Each new message with updated prices is treated as a new update, enriching the dataset.
Overall, Zihao Zhang's presentation showcases the application of deep learning to market by order data, highlighting its potential for extracting valuable insights from market microstructure data. By combining market by order and limit order book data, Zhang's team has demonstrated the reduction of signal variance and the generation of improved predictive signals. Their work holds promise for enhancing trade execution and market-making strategies, offering a valuable contribution to the field of financial market analysis.
Vineel Yellapantula (Cornell MFE '20): "Quantifying Text in SEC Filings"
Vineel Yellapantula presents his summer project, which involves the application of natural language processing (NLP) techniques to trade stocks based on textual information found in SEC filings, particularly focusing on the MD&A section. The goal of the project is to assign a score to each report of the 430 stocks present in the US market and analyze their performance by grouping them into five quantiles based on the score. Yellapantula utilizes traditional methods such as cosine and Jaccard similarity to determine the similarity score between texts, with Jaccard similarity proving to be more consistent over time. He also explores the creation of a sentiment analysis model using recurrent neural networks (RNNs) with Keras on a text dataset, achieving an impressive accuracy of 87.5% with his model.
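For reference, year-over-year similarity scores of this kind can be computed along the following lines; the two toy sentences and the bag-of-words tokenization are illustrative assumptions:

```python
import re
from collections import Counter
from math import sqrt

def tokens(text):
    return re.findall(r"[a-z]+", text.lower())

def jaccard(a, b):
    """Overlap of the two texts' word sets."""
    sa, sb = set(tokens(a)), set(tokens(b))
    return len(sa & sb) / len(sa | sb)

def cosine(a, b):
    """Cosine similarity of the two texts' word-count vectors."""
    ca, cb = Counter(tokens(a)), Counter(tokens(b))
    dot = sum(ca[w] * cb[w] for w in ca)
    na = sqrt(sum(v * v for v in ca.values()))
    nb = sqrt(sum(v * v for v in cb.values()))
    return dot / (na * nb)

this_year = "we expect revenue growth driven by new products"
last_year = "we expect modest revenue growth driven by existing products"
print(round(jaccard(this_year, last_year), 3), round(cosine(this_year, last_year), 3))
```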
During the presentation, Yellapantula emphasizes the importance of selecting the right method for each problem and incorporating additional data to improve results. He highlights the abundance of information available in text data, particularly 10-K filings, and notes that factors built from previous documents can be more effective than those relying on the present document alone. Yellapantula points out several deep learning options for text data, including GloVe, word2vec, BERT, and RNNs, and suggests incorporating further data sources, such as 8-K filings and news cycles, to improve predictive power. He acknowledges, however, the selection bias in his study, which focuses on well-performing stocks present in the index from 2007 to 2020.
In the section dedicated to sentiment analysis, Yellapantula explains the process of creating a model using RNNs with Keras. The steps involve tokenizing the text to understand its meaning, reducing dimensionality through embeddings, and employing an LSTM layer and a dense layer with a sigmoid function for sentiment classification. He demonstrates the application of this approach using IMDB reviews, restricting the review length to 500 words and padding shorter reviews with zeroes to maintain consistency. Through rigorous evaluation, Yellapantula achieves an accuracy rate of 87.5% with his sentiment analysis model.
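A minimal Keras version of the pipeline described here (embedding, LSTM, sigmoid output, IMDB reviews padded to 500 tokens) might look as follows; vocabulary size, layer widths, and the number of epochs are illustrative choices:

```python
from tensorflow.keras import layers, models
from tensorflow.keras.datasets import imdb
from tensorflow.keras.preprocessing.sequence import pad_sequences

vocab_size, max_len = 10_000, 500

# keep the 10,000 most frequent tokens and pad/truncate every review to 500 words
(x_train, y_train), (x_test, y_test) = imdb.load_data(num_words=vocab_size)
x_train = pad_sequences(x_train, maxlen=max_len)
x_test = pad_sequences(x_test, maxlen=max_len)

model = models.Sequential([
    layers.Embedding(vocab_size, 32),       # reduce dimensionality via embeddings
    layers.LSTM(32),                        # summarise the word sequence
    layers.Dense(1, activation="sigmoid"),  # positive / negative sentiment
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x_train, y_train, epochs=2, batch_size=128, validation_split=0.2)
print(model.evaluate(x_test, y_test, verbose=0))
```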
Furthermore, Yellapantula highlights the significance of information correlation in determining the effectiveness of factors and their consistency over time. He references a study that suggests companies with stable reporting tend to perform well, indicating it as a promising factor to explore. In conclusion, Yellapantula expresses gratitude to the audience for their interest and looks forward to further engagement in the future.
Vineel Yellapantula's project demonstrates the application of NLP techniques to extract valuable insights from textual information in SEC filings. By assigning scores to reports and analyzing their performance, his work contributes to the understanding of how language can influence stock trading. Moreover, his exploration of sentiment analysis using RNNs showcases the potential of deep learning in capturing sentiment from textual data. Through careful methodology selection and the incorporation of additional data sources, Yellapantula emphasizes the opportunity to enhance the accuracy and effectiveness of such models.
Peter Carr (NYU) "Stoptions" feat. Lorenzo Torricelli (University of Parma)
Peter Carr introduces a financial product called a "stoption," which combines features of futures contracts and put options. A stoption lets the owner avoid unfavorable price changes by incorporating a Bermudan put element. Carr explains the concept and gives the example of a three-day stoption with a different floor associated with each day. He then discusses the valuation of one-day and two-day stoptions, the latter having two floors and the flexibility to exercise on either the first or the second day.
Carr further explores stoption valuation for longer periods by delving into backward recursion, the valuation of a married put, and the use of pseudo-sums. He suggests utilizing the logistic distribution to represent price changes in married put options. The value of stoptions can be obtained using simple formulas for "at-the-money" options, and valuation and hedging can be done analytically.
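The key building block is the "married put" expectation E[max(a, b + X)]. For a logistic X with scale s this works out to the log-sum-exp pseudo-sum s·ln(e^{a/s} + e^{b/s}), which the sketch below checks by Monte Carlo and then folds backward over a set of floors. The payoff convention and parameters are stylized assumptions, not necessarily Carr's exact contract definition:

```python
import numpy as np

rng = np.random.default_rng(0)

def married_put_logistic(a, b, scale):
    """E[max(a, b + X)] for X ~ logistic(0, scale): the log-sum-exp 'pseudo-sum'."""
    return scale * np.logaddexp(a / scale, b / scale)

def stoption_value(floors, scale):
    """Backward recursion: fold the floors together with the pseudo-sum,
    starting from the last exercise day (a stylisation of the contract)."""
    value = floors[-1]
    for k in reversed(range(len(floors) - 1)):
        value = married_put_logistic(floors[k], value, scale)
    return value

# Monte Carlo check of the one-step married-put operator
a, b, s = 1.0, 0.8, 0.5
x = rng.logistic(0.0, s, size=500_000)
print(married_put_logistic(a, b, s), np.maximum(a, b + x).mean())

# three-day example with equal ("at-the-money") floors
print(stoption_value([1.0, 1.0, 1.0], scale=0.5))
```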
Carr concludes by discussing the challenges of getting such products adopted by the market. He highlights the importance of finding both a buyer and a seller and shares his conversations with potential counterparties. Carr acknowledges that the stoptions model is an alternative to existing models like Black-Scholes and Bachelier and may not fit every situation optimally; nonetheless, he emphasizes that the model aims to capture a multitude of binary operations of special significance in finance.
In a later section, Carr and Lorenzo Torricelli propose a "stoptions" model using a conjugate paradigm and logistic distribution. This model offers flexibility in the term structure with a single parameter, allowing accommodation of various term structures at one strike. However, it may not perfectly fit the market due to its downward-sloping implied volatility graph. The authors acknowledge the limitations of their model and recognize the countless binary operations in finance that their model aims to capture. They discuss optionality between a strike and a single option, as well as repeated optionality through pseudo summation. The section concludes with mutual appreciation and anticipation of attending each other's seminars.
Lorenzo Torricelli (University of Parma) - "Additive Logistic Processes in Option Pricing"
Lorenzo Torricelli of the University of Parma examines option pricing under the additive logistic model and its self-similar specification. In his presentation, he presents the formula for pricing vanilla options under these models and illustrates their application with a density comparison between the logistic pricing model and traditional normal models.
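As a rough numerical illustration of that comparison, the sketch below prices a vanilla call by integrating the payoff against a logistic terminal density and against a normal density with the same variance, and checks the logistic case against a softplus-style closed form. Discounting and the full additive (time-dependent) structure are omitted, and the numbers are made up:

```python
import numpy as np

F, K = 100.0, 105.0
sigma = 10.0                              # target standard deviation of S_T
s = sigma * np.sqrt(3) / np.pi            # logistic scale with the same variance

x = np.linspace(F - 8 * sigma, F + 8 * sigma, 20001)
dx = x[1] - x[0]

normal_pdf = np.exp(-(x - F) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))
logistic_pdf = np.exp(-(x - F) / s) / (s * (1 + np.exp(-(x - F) / s)) ** 2)

payoff = np.maximum(x - K, 0.0)
call_normal = np.sum(payoff * normal_pdf) * dx       # normal (Bachelier-style) price
call_logistic = np.sum(payoff * logistic_pdf) * dx   # logistic-terminal price
closed_form = s * np.log1p(np.exp((F - K) / s))      # softplus formula, logistic case

print(round(call_normal, 4), round(call_logistic, 4), round(closed_form, 4))
```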
Furthermore, Torricelli benchmarks the cumulative term structure of the logistic model against the linear evolution of the term structure in homogeneous models. His observations show that the logistic model offers significantly more flexibility in shaping the term structure, a noteworthy advantage over conventional approaches.
To provide a fuller picture, Torricelli also examines the volatility surfaces associated with these models. He notes a positive skew in the model stemming from the skewed distribution of log returns and the kurtosis of the logistic distribution, while the logistic distribution itself is symmetric and exhibits no skew. Torricelli further discusses the effect of the model parameters on the volatility term structure, acknowledging that the chosen parameterization could be improved.
In conclusion, Torricelli emphasizes that the option formulae derived from these models are explicit and well-known, facilitating their practical implementation. Notably, he commends the impressive speed demonstrated during the performance test. As a testament to transparency and academic collaboration, Torricelli plans to make the code associated with these models publicly accessible, benefiting researchers and practitioners alike.
Yumeng Ding (Cornell MFE '20) - "Interpreting Machine Learning Models"
Yumeng Ding examines how to interpret machine learning models for stock price prediction. In her analysis, she explores a range of interpretability methods, including partial dependence plots, permutation feature importance, H-statistics, and LIME, to shed light on the inner workings of these models. Using these methods, Ding aims to unravel the contributions of individual factors and their interaction effects in predicting stock prices.
Ding's study revolves around three types of factors: technical, quality, and value, which are utilized as inputs for various machine learning models such as classifiers and regressions. Leveraging the interpretability methods mentioned earlier, she unravels the intricate relationships between these factors and stock price predictions. Through rigorous backtesting, Ding discovers that non-linear models surpass linear models in terms of performance. Moreover, she observes that the effects of different factors exhibit temporal variations, highlighting the dynamic nature of stock price prediction. Ultimately, Ding identifies AdaBoost as the most suitable model for their specific scenario.
Importantly, Ding underscores the significance of interpretability methods for understanding machine learning models. She notes that while the H-statistic approach gives quick insight into which interactions are most predictive, it does not reveal the character of those interactions. Ding emphasizes the value of two-dimensional partial dependence plots for visualizing simpler interactions, and recommends line plots for digging into individual interactions and visualizing local effects, provided the data is sufficiently free of noise.
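For illustration, the sketch below runs two of these diagnostics, permutation importance and one- and two-way partial dependence, on an AdaBoost model fitted to synthetic "technical/quality/value" factors; the data and coefficients are invented for the example:

```python
import numpy as np
from sklearn.ensemble import AdaBoostRegressor
from sklearn.inspection import partial_dependence, permutation_importance

rng = np.random.default_rng(0)

# toy factor data: technical, quality, value (purely synthetic)
X = rng.standard_normal((500, 3))
y = 0.5 * X[:, 0] - 0.3 * X[:, 1] * X[:, 2] + 0.1 * rng.standard_normal(500)

model = AdaBoostRegressor(n_estimators=100, random_state=0).fit(X, y)

# permutation feature importance: drop in score when one factor is shuffled
imp = permutation_importance(model, X, y, n_repeats=10, random_state=0)
print("importances:", imp.importances_mean.round(3))

# one-way partial dependence on factor 0, and joint (two-way) dependence on 1 and 2
pd_1d = partial_dependence(model, X, features=[0], grid_resolution=20)
pd_2d = partial_dependence(model, X, features=[1, 2], grid_resolution=20)
print(pd_1d["average"].shape, pd_2d["average"].shape)
```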
Summing up her findings, Ding emphasizes two key takeaways from her project. Firstly, she confirms that machine learning models outperform linear naive regressions in the majority of scenarios due to their capacity to capture complex interaction effects. Secondly, she highlights the feasibility of interpreting machine learning models by leveraging a variety of interpretability methods. These techniques enable researchers to elucidate the individual contributions of factors and comprehend their interactive influences on predictions.
Silvia Ruiz (Cornell MFE '20): "How to Predict Stock Movements Using NLP Techniques"
Silvia Ruiz, a recent graduate of the Cornell MFE program, shares insights from her project focused on predicting stock prices using NLP (Natural Language Processing) techniques. The objective of her team's research was to explore the relationship between corporate filings, such as 10-K and 10-Q reports, and the subsequent impact on stock prices. To accomplish this, they collected a substantial dataset consisting of 1,095 reports from the EDGAR website, encompassing 50 companies across five sectors of the S&P 500.
Initially, Ruiz and her team experimented with dictionary-based models but found them of limited effectiveness. To address this, they incorporated more advanced methods such as word2vec and FinBERT, which proved crucial for capturing the contextual nuances in the corporate filings. They also used various sentiment measures, including word polarity and complexity, together with an XGBoost model, to predict stock price movements.
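A stripped-down sketch of this kind of pipeline is shown below: a stand-in feature extractor (in the actual project the features would include FinBERT sentiment, polarity, and complexity measures) feeds an XGBoost classifier trained on synthetic labelled filings:

```python
import numpy as np
from xgboost import XGBClassifier

def filing_features(text):
    """Stand-in feature extractor for a filing's MD&A section: a crude polarity
    count plus word-length and length features (hypothetical word lists)."""
    words = text.lower().split()
    positive = sum(w in {"growth", "improved", "strong"} for w in words)
    negative = sum(w in {"decline", "impairment", "risk"} for w in words)
    complexity = float(np.mean([len(w) for w in words]))
    return [positive - negative, complexity, len(words)]

# synthetic training set standing in for labelled filings (up = 1, down = 0)
texts = ["strong growth improved margins",
         "impairment and decline in revenue risk",
         "growth offset by risk",
         "improved outlook strong demand"] * 50
labels = np.array([1, 0, 0, 1] * 50)
X = np.array([filing_features(t) for t in texts])

clf = XGBClassifier(n_estimators=50, max_depth=3)
clf.fit(X, labels)
print(clf.predict(X[:4]))      # predicted direction for the first four filings
```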
The accuracy of their predictions was evaluated over two time frames: the model achieved 61% accuracy in the short term and 53% in the long term. Using these predictions as signals for investment decisions, the strategy outperformed an equally weighted portfolio. Ruiz highlights, however, the need for further research across diverse sectors to improve the precision and generalizability of the findings.
Silvia Ruiz concludes her discussion by generously offering her contact information and providing a link to her project's repository on Github. This gesture encourages follow-up inquiries and promotes collaboration in advancing the understanding and application of NLP techniques in the domain of stock price prediction.
Charles-Albert Lehalle: "An Attempt to Understand Natural Language Processing"
Charles-Albert Lehalle: "An Attempt to Understand Natural Language Processing"
In this video presentation, Charles-Albert Lehalle and his team delve into the applications of Natural Language Processing (NLP) in the finance domain. Their discussion revolves around three key areas: sentiment analysis, stock price prediction, and transaction cost modeling. They acknowledge the challenges associated with NLP, such as the risk of overfitting and bias in embeddings, and propose potential solutions, including multitasking learning and expanding lexicons. The team explores both the potential and limitations of NLP in the financial industry, emphasizing the importance of understanding context and language patterns within different sectors.
Lehalle and his team present their own experiments using NLP techniques, providing valuable insights on how NLP can compress information and offer informative indicators for financial analysts. They highlight the challenges of employing NLP in finance, including the requirement for domain-specific knowledge and the difficulty of extracting meaningful information from unstructured text data. Ethical concerns surrounding the use of NLP in finance, such as leveraging social media data for trading purposes, are also discussed.
Throughout the presentation, Charles-Albert Lehalle shares his expertise and knowledge on various NLP topics. He explains the use of lexicon-based and embedding-based NLP methods in finance, proposing a combination of both approaches to capture lexical and probabilistic features in text data. The challenges of distinguishing between synonyms and antonyms within embeddings are addressed, and Lehalle's team explores generative models to control the structure and sentiment of text. The importance of understanding embeddings and reference models, such as matrices representing joint word distributions, is emphasized.
Lehalle further explores the significance of context in NLP, discussing how embeddings can be biased for positive and negative words based on context. He explains the use of Markov chains to structure reference matrix models and presents experiments on identifying synonyms within embeddings. The limitations of NLP in capturing company names and their associated polarities are acknowledged, along with the suggestion of multitasking learning for supervised embeddings. The speakers also touch on the Loughran-McDonald Lexicon's imbalance of positive and negative words and the challenges of processing irony in financial texts.
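The synonym/antonym issue can be illustrated with a few toy word vectors: nearest neighbours by cosine similarity often pair words of opposite polarity because they occur in similar contexts. The vectors below are invented for illustration, not trained embeddings:

```python
import numpy as np

# toy embedding matrix standing in for trained word vectors; in practice these
# would come from word2vec/GloVe-style training on financial text
emb = {
    "profit":   np.array([0.90, 0.10,  0.30]),
    "gain":     np.array([0.80, 0.20,  0.35]),
    "loss":     np.array([0.85, 0.15, -0.30]),   # antonym with a similar context
    "increase": np.array([0.20, 0.90,  0.40]),
    "decrease": np.array([0.25, 0.85, -0.35]),
}

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

for word, vec in emb.items():
    sims = sorted(((cosine(vec, other), name)
                   for name, other in emb.items() if name != word), reverse=True)
    print(word, "->", [(name, round(s, 2)) for s, name in sims[:2]])
```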
The presentation concludes with an overview of a project by Silvia Ruiz, a recent Cornell Financial Engineering graduate. The project focuses on predicting stock prices using NLP techniques, specifically by scraping the management discussion sections of 10-K and 10-Q filings from 50 S&P 500 companies and analyzing sentiment to assess its impact on stock prices. Lehalle discusses the limitations of dictionary-based models and explains how the team expanded the dictionary, employed FinBERT to capture context, and used various features to measure sentiment. The resulting strategy outperformed an equally weighted portfolio over both short and long horizons.
In summary, Charles-Albert Lehalle and his team shed light on the potential and challenges of NLP in finance. They offer insights, experiments, and strategies for applying NLP techniques effectively, while emphasizing the importance of responsible use and a deep understanding of both the technology and the financial domain.