Disha Umeshaiah

Abstract

This research explores the application of time-series analysis within quantitative finance, specifically focusing on the implementation of the Dual-Moving Average Crossover (DMAC) algorithm. As financial markets increasingly shift toward high-frequency and algorithmic execution, understanding the mathematical foundations of trend-following indicators is vital. This paper evaluates the DMAC strategy using Python-based backtesting, analysing performance through the lens of the Sharpe Ratio, Maximum Drawdown, and computational efficiency. Results indicate that while the strategy effectively mitigates tail risk, its alpha generation is sensitive to the "lag" effect inherent in simple moving averages.


Chapter 1: Introduction

1.1 Overview of the Quantitative Paradigm

The landscape of global financial markets has undergone a radical transformation over the last four decades, transitioning from human-centric pit trading to a digital ecosystem dominated by silicon-based execution. At the heart of this evolution is Quantitative Finance, a field that treats market price movements not as a series of random events, but as a high-dimensional, non-stationary stochastic process. For a Computer Science practitioner, the stock market represents one of the most challenging environments for signal processing, as the "signal" (the tradable trend) is often buried under an immense volume of "noise" (random volatility and market micro-structure).

1.2 Problem Statement

The primary challenge in algorithmic trading is the extraction of deterministic patterns from stochastic data. Traditional investors often rely on fundamental analysis—evaluating a company’s balance sheet or management quality. However, these factors are difficult to quantify in real-time and are subject to human bias. This research addresses the problem of Systematic Trend Following. Specifically, it explores whether a Dual-Moving Average Crossover (DMAC) algorithm can effectively filter market noise to identify entry and exit points that result in superior risk-adjusted returns compared to a passive "Buy and Hold" strategy. This study therefore poses three technical questions:

  1. How does the window size ($n$) of a moving average affect the tradeoff between signal lag and predictive accuracy?
  2. Can computational vectorisation techniques improve the backtesting efficiency for large-scale financial datasets?
  3. Does the algorithm maintain its edge across different asset classes with varying levels of volatility?
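The DMAC logic described above can be sketched in a few lines of vectorised Pandas. This is a minimal illustration, not the full backtesting engine developed later: the window lengths (20/50) and the synthetic price series are arbitrary choices for demonstration, and the one-bar signal shift is included to avoid look-ahead bias.

```python
import numpy as np
import pandas as pd

def dmac_signals(close: pd.Series, fast: int = 20, slow: int = 50) -> pd.Series:
    """Return DMAC positions: 1 (long) when the fast SMA is above the
    slow SMA, 0 (flat) otherwise. The signal is shifted by one bar so
    that a crossover observed at the close of bar t is only traded on
    bar t+1 (no look-ahead bias)."""
    fast_sma = close.rolling(fast).mean()
    slow_sma = close.rolling(slow).mean()
    position = (fast_sma > slow_sma).astype(int)
    return position.shift(1).fillna(0)

# Synthetic example: a gently trending random walk
rng = np.random.default_rng(0)
prices = pd.Series(100 + np.cumsum(rng.normal(0.1, 1.0, 500)))
pos = dmac_signals(prices)
strategy_returns = prices.pct_change() * pos  # daily strategy P&L
```

Because the comparison `fast_sma > slow_sma` is evaluated element-wise over the whole series, no explicit Python loop is needed; this is the vectorisation theme developed in Section 1.3.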

1.3 Motivation: Why Computer Science?

While finance was once the domain of economics, the modern era belongs to the engineer. The ability to process "Tick Data" (millisecond-level price changes) requires an understanding of:

  • Algorithmic Complexity: Ensuring that backtesting engines operate at $O(n)$ or $O(n \log n)$ rather than $O(n^2)$.
  • Data Structures: Efficiently managing time-series data using vectorised arrays.
  • Digital Signal Processing (DSP): Applying low-pass filters (Moving Averages) to discrete-time signals to isolate low-frequency trends.

As a first-year Computer Science major, this research serves as a bridge between theoretical mathematics and practical software implementation, demonstrating how basic data structures can be leveraged to navigate complex global markets.
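The complexity argument above can be made concrete. A sketch, assuming NumPy arrays of closing prices: the naive SMA recomputes each window sum from scratch, costing $O(n \cdot w)$, whereas a prefix-sum (cumulative sum) formulation obtains every window sum as a difference of two running totals in $O(n)$.

```python
import numpy as np

def sma_naive(x: np.ndarray, window: int) -> np.ndarray:
    """O(n * window): the mean of each window is recomputed in full."""
    out = np.full(len(x), np.nan)
    for i in range(window - 1, len(x)):
        out[i] = x[i - window + 1 : i + 1].mean()
    return out

def sma_cumsum(x: np.ndarray, window: int) -> np.ndarray:
    """O(n): each window sum is the difference of two prefix sums."""
    out = np.full(len(x), np.nan)
    csum = np.cumsum(np.insert(x, 0, 0.0))  # prefix sums, length n+1
    out[window - 1:] = (csum[window:] - csum[:-window]) / window
    return out

x = np.random.default_rng(1).normal(size=10_000)
# Both formulations agree (ignoring the leading NaN warm-up region)
assert np.allclose(sma_naive(x, 50)[49:], sma_cumsum(x, 50)[49:])
```

In practice one would simply call `pandas.Series.rolling(window).mean()`, which applies the same streaming idea internally; the point here is only to show why vectorisation matters at scale.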

1.4 Scope of Research

This paper will not attempt to predict the "exact" future price of an asset—a task often deemed impossible by the Efficient Market Hypothesis (EMH). Instead, it focuses on Probability and Risk Management. The scope includes:

  • Developing a Python-based backtesting environment using the NumPy and Pandas libraries.
  • Implementing the Simple Moving Average (SMA) and Exponential Moving Average (EMA) mathematical models.
  • Evaluating performance using the Sharpe Ratio and Maximum Drawdown metrics across the S&P 500, Bitcoin, and Gold.
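The two evaluation metrics named in the scope can be sketched directly from their standard definitions. This is a minimal illustration, assuming daily data (hence the conventional 252 trading periods per year) and a risk-free rate of zero; the small equity curve at the bottom is fabricated purely to demonstrate the drawdown arithmetic.

```python
import numpy as np

def sharpe_ratio(returns: np.ndarray, periods_per_year: int = 252) -> float:
    """Annualised Sharpe ratio of per-period returns
    (risk-free rate assumed to be zero for simplicity)."""
    return np.sqrt(periods_per_year) * returns.mean() / returns.std(ddof=1)

def max_drawdown(equity: np.ndarray) -> float:
    """Largest peak-to-trough decline of an equity curve, expressed
    as a positive fraction of the preceding peak."""
    running_peak = np.maximum.accumulate(equity)
    drawdowns = (running_peak - equity) / running_peak
    return drawdowns.max()

# Worked example: the deepest fall is from the 120 peak down to 96
equity = np.array([100.0, 110.0, 104.5, 120.0, 96.0, 108.0])
assert np.isclose(max_drawdown(equity), 0.20)  # (120 - 96) / 120 = 20%
```

The Sharpe Ratio rewards consistency of returns, while Maximum Drawdown captures the tail-risk behaviour the Abstract refers to; Chapter 4 applies both to the S&P 500, Bitcoin, and Gold backtests.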

Chapter 2: System Architecture and Implementation

2.1 Functional Requirements and Modular Design

The development of a quantitative trading engine requires a modular architecture to ensure scalability and maintainability. A common pitfall in financial programming is "spaghetti code," where data ingestion and logic are tightly coupled. Our system is designed using a Decoupled Three-Tier Architecture:

  1. Data Ingestion Layer (The Source): Responsible for fetching raw OHLCV (Open, High, Low, Close, Volume) data.
  2. Processing & Vectorisation Layer (The Engine): Where mathematical transformations (SMAs, EMAs) are applied.