
Just watched a really interesting documentary on the Flash Crash of 2010:

Money & Speed: Inside the Black Box

Some stand-out points:
> the CFTC / SEC attribute the root cause to Waddell & Reed 'dumping' \$4.1bn of shares
> Eric Hunsader @ Nanex looks at the W&R trades [see video at 34m18s]
> the W&R trades don't look like dumping – they maximize the sell price during local up runs
> there are other trades that do look like aggressive dumping, i.e. rapid sequential bursts down

The Nanex explanation of the Flash Crash: FlashCrashAnalysis

### Price manipulation?

This raises an interesting question:

Is it possible for a black-box algorithm to use a rational probabilistic strategy to drive down a stock price in bursts like this, in order to later buy the stock back at an artificially deflated price?  You'd need a lot of stock to do this: is there a threshold of stock volume, say 5% of all stock, below which it's impossible to create this effect?

### Price Delay Arbitrage?

Another aspect of this is the possibility of doing 'diffusion arbitrage', for want of a better name:

If you can drive the market so quickly that derived instruments take seconds to reprice (due to the storm of new data), then you have that window to trade ahead of the market in options or indexes based on the underlying you have manipulated.

In this case the delay was a whopping 5 to 35 seconds: see for example Nanex's FlashCrashSummary, showing the delayed drop and recovery of the Dow.

Some amazing JavaScript libraries are coming out now that enable interactive math directly in the browser.

One of the most impressive is jsxGraph, from Bayreuth University, which is purpose-built on SVG for plotting mathematical functions.  See their wiki for superb demos.

On the jsxGraph blog, there is a video showing a construction being manipulated by touch on the iPhone – very cool.

Experimenting with their circles-on-circles web app, sliding the parameters around, I uncovered this delightful piece, which reminded me of those ornate calligraphy end-notes you see in olde books.

It's a lot of fun to experiment.  For the curious, the parameters of this specimen are: c1:0.51 f1:7 c2:0.32 f2:17

### SVG coolness

I think SVG is the right way to go, and a more natural approach than using canvas [as I did for my interactive sine generator last week].

There's no technical reason now why something as fully featured as GeoGebra or KGeo could not be implemented directly in the browser… it just has to be done!

GeoGebra is really nice, but I do prefer the simplicity of not having to download the Java applet and approve it for access.  Each extra step supposedly halves the audience, and I want math to be interactive and accessible.

Continuing on the same topic as my previous post, it's nice to be able to gather all the kth-order moments in a single pass.

Last time I mentioned the boost/accumulators example, but you will have noticed two issues if you use that.  Firstly, the moment<k> tag will give you the kth simple moment relative to zero, whereas we often want the kth central moment of a sequence, relative to the mean.  Secondly, although Boost's accumulator is well written, it does seem to take a while to compile [~12 seconds for code using moment<12>].

After some playing around I've got a faster, simpler approach, where the inner loop accumulates the kth powers of each element.  After you've run the sequence through, you can then easily extract the variance and all the kth central moments.  So in adding the more general case of kth moments, I've made the particular variance case simpler.  That often seems to happen in programming and math!

### algebra

First a bit of math, and then the code.  We want to express the kth central moment in terms of the basic moments up to order k.

First, let's define the jth basic moment as –

$\displaystyle M_{n}^{j}= \sum_{i=1}^n {x}_i^{j}$

We rearrange the binomial expansion –

$\displaystyle nv_{n}^{k}= \sum_{i=1}^n({x}_{i}-\mu_{n})^k$

$\displaystyle = \sum_{i=1}^n \sum_{j=0}^k \binom{k}{j} {x}_{i}^j(-\mu_{n})^{k-j}$

$\displaystyle = \sum_{j=0}^k \binom{k}{j} (-\mu_{n})^{k-j} \sum_{i=1}^n {x}_{i}^j$

So we have the kth central moment given as a weighted sum of the kth simple moments –

$\displaystyle v_{n}^{k} = \frac{1}{n}\sum_{j=0}^k \binom{k}{j} (-\mu_{n})^{k-j} M_{n}^{j}$

which shows that all we need to accumulate as we walk across the sequence are the simple powers ${x}_{i}^{j}$ for $j = 0 \ldots k$.

Notice the variance is now handled as the special case k=2.  Likewise, among the basic moments, k=0 corresponds to n, the element count, and k=1 is the sum of the elements.
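Expanding the k=2 case, for example, recovers the familiar 'mean of the squares minus the square of the mean' form of the variance:

$\displaystyle v_{n}^{2} = \frac{1}{n}\left(M_{n}^{2} - 2\mu_{n}M_{n}^{1} + \mu_{n}^{2}M_{n}^{0}\right) = \frac{M_{n}^{2}}{n} - \mu_{n}^{2}$

using $M_{n}^{1} = n\mu_{n}$ and $M_{n}^{0} = n$.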

### c++ impl

Here's a basic impl of the above expression –
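A minimal sketch along these lines (the class and member names here are my own, chosen for illustration; the maximum order K is fixed at compile time):

```cpp
#include <array>
#include <cmath>
#include <cstddef>

// Single-pass accumulator: the inner loop stores the basic power sums
// M^j = sum_i x_i^j for j = 0..K, from which any central moment up to
// order K can be extracted afterwards.
template <std::size_t K>
class PowerSums {
public:
    // inner loop: accumulate x^j for j = 0..K
    void operator()(double x) {
        double p = 1.0;
        for (std::size_t j = 0; j <= K; ++j) {
            m_[j] += p;   // m_[j] is the basic moment M^j
            p *= x;
        }
    }

    double count() const { return m_[0]; }          // j=0 : element count n
    double sum()   const { return m_[1]; }          // j=1 : sum of elements
    double mean()  const { return m_[1] / m_[0]; }

    // kth central moment : v^k = 1/n * sum_j C(k,j) * (-mu)^(k-j) * M^j
    double central(std::size_t k) const {
        const double n  = m_[0];
        const double mu = mean();
        double acc = 0.0;
        for (std::size_t j = 0; j <= k; ++j)
            acc += binom(k, j) * std::pow(-mu, double(k - j)) * m_[j];
        return acc / n;
    }

    double variance() const { return central(2); }  // the k=2 special case

private:
    // small binomial coefficient, computed directly in doubles
    static double binom(std::size_t k, std::size_t j) {
        double b = 1.0;
        for (std::size_t i = 0; i < j; ++i)
            b = b * double(k - i) / double(i + 1);
        return b;
    }

    std::array<double, K + 1> m_{};  // zero-initialised power sums
};
```

Note this gives the population (biased, 1/n) variance; and with no expression-template machinery there's nothing here to slow the compiler down.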

I found some very interesting (promo?) slides from Société Générale discussing WHY the Libor Market Model (LMM) [aka BGM/J, an instance of the HJM framework] has become so popular, despite its significant imperfections.

But are we fooling ourselves that we can still put “the wrong number in the wrong formula to get the right price”?

I think they miss the point: basically you need some parameters to make a good model fit the environment [to paraphrase Derman, a useful model is one that allows you to price something from other, somehow related, market observables].  Sure, you need several parameters to get enough flexibility to calibrate to the market.

Derman's autobiography 'My Life as a Quant' is a lovely nontechnical read.  It follows his path from childhood in South Africa, through his desire to study physics and later disillusionment with postdoc life, developing symbolic math software at Bell Labs, to his career as a rocket scientist on Wall St and top quant at Goldman Sachs.  He comes across as a refined and honest intellectual with a real passion for his craft, and the journey makes for a good story.

I just discovered an online video of a very punchy talk he gave at NYU on what's wrong with models [.mov format ~90Mb ~10mins].

This is awesome and he doesn't hold back.  His main point is that although Black Scholes is a great advance, an elegant and useful model, it's only gonna give you a ballpark figure…

Unlike QED or other physics theories, which are deeply descriptive of nature and can be accurate to 8 decimal places in predictive power, quantitative models interpolate.  He makes an analogy with models like Black Scholes: imagine you need to approximate the value of a Manhattan apartment when all you have for comparison is the market value per square foot of a Lower East Side apartment… ok, I got the NY areas mixed up, but it's clear we value one instrument in terms of another, making all sorts of reasonable ad hoc adjustments, such as higher comparative building maintenance service charges etc., and factoring all that into an 'implied' price per square foot.

It seems to me the time is ripe for Visual DSLs to appear in the financial quantitative domain – you can imagine the pure housekeeping involved in having 2500 trade strategies operating across 3500 tick feeds… hmmm.

So you might be asking: what the hell is a DSL?  A DSL, or Domain Specific Language, can mean many things, but usually falls into two classes – Visual and Textual.

Say I want to create a web app that generates some HTML page dynamically… regardless of whether I program in Ruby or Lisp, I really want a nicer way to open and close HTML tags than

print('<html>\n') ... print('</html>\n')

… a textual DSL gives you a language that is a much closer match to the problem… I want to just write something like