Forums  > Trading  > PnL optimisation on history data  
Page 1 of 1


Total Posts: 1
Joined: Apr 2019
Posted: 2019-04-13 10:50
Let's suppose we have data with the following columns: timestamp, bid price, ask price, alpha. The alpha column is a signal generated by a model, which tells us whether it's a good idea to buy now (the signal is strongly positive) or to sell (strongly negative). We want to evaluate the PnL (Profit and Loss) of a simple strategy: we can only be long 1 lot, short 1 lot, or flat (0 position).
We assume that the strategy will have a parameter that regulates how strong the signal should be for us to enter a position, i.e. a threshold for the alpha. There is a commission of 0.0011% of the asset price per 1 lot of traded asset.
Are there methods to find the threshold that optimises PnL? Or the Sharpe ratio?
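The setup above can be sketched as a simple threshold backtest plus grid search. The column layout, the entry rule (long when alpha > +thr, short when alpha < -thr, otherwise flat), and marking out the final position are all my assumptions for illustration:

```python
# Minimal sketch of the threshold backtest described above, with the
# stated 0.0011%-per-lot commission. Trading rule and mark-out are
# assumptions; adapt to your actual data and conventions.
import numpy as np

COMMISSION = 0.0011 / 100  # 0.0011% of price per lot traded

def backtest_pnl(bid, ask, alpha, thr):
    """PnL of the 1-lot long/short/flat strategy for one threshold."""
    pos = np.where(alpha > thr, 1, np.where(alpha < -thr, -1, 0))
    mid = (bid + ask) / 2
    pnl, prev = 0.0, 0
    for t in range(len(pos)):
        trade = pos[t] - prev            # lots traded at this step
        if trade > 0:                    # buying: pay the ask
            pnl -= trade * ask[t]
        elif trade < 0:                  # selling: receive the bid
            pnl += -trade * bid[t]
        pnl -= abs(trade) * mid[t] * COMMISSION
        prev = pos[t]
    # mark out any open position at the final quote (commission ignored here)
    pnl += prev * (bid[-1] if prev > 0 else ask[-1])
    return pnl

def grid_search(bid, ask, alpha, thresholds):
    """Try each candidate threshold, return the best one and its PnL."""
    pnls = [backtest_pnl(bid, ask, alpha, t) for t in thresholds]
    return thresholds[int(np.argmax(pnls))], max(pnls)
```

As the replies below note, the found threshold is fit to this one history, so treat it as a diagnostic rather than a forecast.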


Total Posts: 1167
Joined: Jun 2007
Posted: 2019-04-13 12:01
You are asking whether there is a way to set a threshold on a parameter (which determines whether you go long = 1, short = -1, or flat = 0) that optimizes your PnL on historical data?

Yes. There are tons of ways to do that, from a simple grid search (loop over the backtest several times, each time changing the threshold) to using a metaheuristic like simulated annealing or an evolutionary algorithm (google differential evolution, which is my workhorse for these things).
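The differential evolution route can lean on SciPy's implementation. The toy objective below is a placeholder (my assumption) for whatever threshold-to-PnL backtest function you already have:

```python
# Differential evolution over the threshold, via SciPy. `pnl_for_threshold`
# is a stand-in objective; swap in your real backtest.
from scipy.optimize import differential_evolution

def pnl_for_threshold(thr):
    # placeholder: pretend PnL peaks at thr = 0.3
    return -(thr - 0.3) ** 2

result = differential_evolution(
    lambda x: -pnl_for_threshold(x[0]),  # DE minimizes, so negate PnL
    bounds=[(0.0, 1.0)],                 # search range for the threshold
    seed=0,                              # reproducible runs
)
best_thr = result.x[0]
```

For a single scalar threshold this is overkill versus a grid search; DE earns its keep once you tune several parameters at once.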

So yes, you could do it. But please keep in mind: this is a horrible way to over-fit your strategy.

And sorry if I misunderstood your question and am telling you stuff you already know.

I came here, saw you and your people smiling, and said to myself: Maggette, screw the small talk, better let your fists do the talking...


Total Posts: 510
Joined: May 2006
Posted: 2019-04-15 10:32
I think the biggest fallacy here is assuming strong signal => high PnL per lot.

If you can filter out trades with low PnL per lot, you can also do much better than that - e.g. scale trades by signal strength, only enter trades with >99% probability of positive PnL, etc.

Like @maggette says, what you are proposing is overfitting. It won't work. Before you go there, spend a long time working out what exactly the relationship is between signal strength and PnL per lot.

"There is a SIX am?" -- Arthur


Total Posts: 265
Joined: Dec 2012
Posted: 2019-04-19 17:33
I'm not so sure that the overfit backtest is actually a waste of time. While I wouldn't take the results as indicative, the overfitting may reveal a fair bit about your signal.

Unless your data set is quite large, I think a simple grid search will suffice. Pick your favorite risk metric to optimize for; one of mine is something simple like weighting negative days twice as heavily, etc.
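The "weight negative days twice as heavily" metric is easy to make concrete; the factor of 2 and the daily-PnL input are my assumptions:

```python
# Downside-weighted mean daily PnL: losing days count `penalty` times.
# Maximize this over thresholds instead of raw PnL to penalize drawdowns.
import numpy as np

def downside_weighted_score(daily_pnl, penalty=2.0):
    """Mean daily PnL with losing days multiplied by `penalty`."""
    d = np.asarray(daily_pnl, dtype=float)
    return np.where(d < 0, penalty * d, d).mean()
```

Swapping this in for raw PnL in a grid search only changes the objective, not the search.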

There are no surprising facts, only models that are surprised by facts


Total Posts: 838
Joined: Jun 2005
Posted: 2019-04-20 00:17

alpha = predicted_price - (ask+bid)*0.5

or

alpha = invECDF(predicted_price - (ask+bid)*0.5)

why not...

We don't know the nature of alpha, so we can't answer precisely. For example, ronin perhaps has something different in mind. So: wrong question, irrelevant answer.
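One possible reading of the second formula (this interpretation is my assumption): pass the raw edge through its own empirical CDF, so alpha becomes a quantile in (0, 1] and the threshold a percentile cutoff rather than a price-unit cutoff:

```python
# Map the raw edge to its empirical-CDF value, so a threshold of e.g. 0.95
# means "top 5% of historical edges". One reading of ronin's formula, not
# a definitive one.
import numpy as np

def ecdf_alpha(predicted_price, bid, ask):
    edge = predicted_price - (ask + bid) * 0.5   # raw alpha in price units
    ranks = np.argsort(np.argsort(edge)) + 1     # 1..n rank of each edge
    return ranks / len(edge)                     # empirical CDF value
```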