<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	>
<channel>
	<title>Comments on: All Correlations Tend to One&#8230;</title>
	<atom:link href="http://keplerianfinance.com/2013/06/all-correlations-go-to-one/feed/" rel="self" type="application/rss+xml" />
	<link>http://keplerianfinance.com/2013/06/all-correlations-go-to-one/</link>
	<description>exploring the boundaries of quantitative finance</description>
	<lastBuildDate>Wed, 14 Aug 2013 02:24:36 +0000</lastBuildDate>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=4.1.41</generator>
	<item>
		<title>By: Robert J Frey</title>
		<link>http://keplerianfinance.com/2013/06/all-correlations-go-to-one/#comment-46</link>
		<dc:creator><![CDATA[Robert J Frey]]></dc:creator>
		<pubDate>Mon, 01 Jul 2013 06:22:49 +0000</pubDate>
		<guid isPermaLink="false">http://keplerianfinance.com/?p=155#comment-46</guid>
		<description><![CDATA[As a mathematician I think in terms of the components of the correlation matrix. The term &quot;r-squared&quot; is not one that I use myself. I&#039;ll double-check the terminology and edit the post if necessary; it may be easiest to simply take out the reference, as it is unnecessary to the argument being presented. In any event, I appreciate your careful reading.

Yes, the correlation makes most sense when talking about a linear relationship. I used the CAPM, which is linear, simply as an approximation of reality. There are economic arguments that this linear behavior dominates. Increasing the volatility of a factor that is common to two instruments will increase the tendency of those instruments to move together--whether positively or negatively.

The reason I left the $latex \sigma^2 $ in the expression for $latex \rho $ is to make the source of the relationship clearer. One could, of course, factor it out.

The fourth equation is correct. The covariance between two instruments in the CAPM is the product of their respective $latex \beta $ times the market variance. An instrument&#039;s variance is its $latex \beta $ squared times the market variance plus the error variance. The expression follows.
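
As a quick numerical sketch of that paragraph (the inputs are made up, purely for illustration; plain Python):

```python
import math

# Made-up inputs: market variance, betas, and idiosyncratic (error) variances.
sigma_m2 = 0.04
beta_i, beta_j = 1.2, 0.8
err_i, err_j = 0.02, 0.03

def capm_corr(scale):
    # Scale the market variance by `scale`. Under the CAPM:
    # covariance = beta_i * beta_j * market variance;
    # each variance = beta^2 * market variance + error variance.
    m = scale * sigma_m2
    cov_ij = beta_i * beta_j * m
    var_i = beta_i ** 2 * m + err_i
    var_j = beta_j ** 2 * m + err_j
    return cov_ij / math.sqrt(var_i * var_j)

rho = capm_corr(1)             # ordinary regime
rho_stressed = capm_corr(100)  # market variance blown up 100x
# As the common (market) variance grows, the correlation is driven toward one.
```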

The fact that two stocks have the same $latex \beta$ doesn&#039;t mean that their correlation is necessarily high. The effects of the error variances have to be accounted for.]]></description>
		<content:encoded><![CDATA[<p>As a mathematician I think in terms of the components of the correlation matrix. The term &#8220;r-squared&#8221; is not one that I use myself. I&#8217;ll double-check the terminology and edit the post if necessary; it may be easiest to simply take out the reference, as it is unnecessary to the argument being presented. In any event, I appreciate your careful reading.</p>
<p>Yes, the correlation makes most sense when talking about a linear relationship. I used the CAPM, which is linear, simply as an approximation of reality. There are economic arguments that this linear behavior dominates. Increasing the volatility of a factor that is common to two instruments will increase the tendency of those instruments to move together&#8211;whether positively or negatively.</p>
<p>The reason I left the <img src="//s0.wp.com/latex.php?latex=%5Csigma%5E2+&#038;bg=ffffff&#038;fg=000&#038;s=0" alt="&#92;sigma^2 " title="&#92;sigma^2 " class="latex" /> in the expression for <img src="//s0.wp.com/latex.php?latex=%5Crho+&#038;bg=ffffff&#038;fg=000&#038;s=0" alt="&#92;rho " title="&#92;rho " class="latex" /> is to make the source of the relationship clearer. One could, of course, factor it out.</p>
<p>The fourth equation is correct. The covariance between two instruments in the CAPM is the product of their respective <img src="//s0.wp.com/latex.php?latex=%5Cbeta+&#038;bg=ffffff&#038;fg=000&#038;s=0" alt="&#92;beta " title="&#92;beta " class="latex" /> times the market variance. An instrument&#8217;s variance is its <img src="//s0.wp.com/latex.php?latex=%5Cbeta+&#038;bg=ffffff&#038;fg=000&#038;s=0" alt="&#92;beta " title="&#92;beta " class="latex" /> squared times the market variance plus the error variance. The expression follows.</p>
<p>The fact that two stocks have the same <img src="//s0.wp.com/latex.php?latex=%5Cbeta&#038;bg=ffffff&#038;fg=000&#038;s=0" alt="&#92;beta" title="&#92;beta" class="latex" /> doesn&#8217;t mean that their correlation is necessarily high. The effects of the error variances have to be accounted for.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Algomind</title>
		<link>http://keplerianfinance.com/2013/06/all-correlations-go-to-one/#comment-45</link>
		<dc:creator><![CDATA[Algomind]]></dc:creator>
		<pubDate>Sat, 29 Jun 2013 13:51:47 +0000</pubDate>
		<guid isPermaLink="false">http://keplerianfinance.com/?p=155#comment-45</guid>
		<description><![CDATA[&gt;Thanks, for your kind words on my Twitter posts (as @financequant) and your feedback 

It&#039;s a great honor to be able to follow you and learn from your great experience.

&gt; &quot;(the covariance divided by the product of the standard deviations) is what most stat software would call r-squared&quot;

Some financial software might indeed use the terminology that way, but, imho, it would be misleading. Let me explain my view better, and please do advise if I am wrong.

&quot;Covariance divided by the product of the standard deviations&quot; is what we normally call &quot;r&quot;, while &quot;r-squared&quot; is, as the name suggests, the square of that (in the specific case of a linear model) or, more generally, a quadratic measure of fit (the coefficient of determination).

There is also an evident reason why something called &quot;r-squared&quot; cannot possibly be defined just as a &quot;normalized covariance&quot; (which is r). A covariance is &quot;signed&quot; (it can take *any* negative value), while &quot;r-squared&quot;, being &quot;squared&quot;, can only be nonnegative (0-1, or 0-100 if expressed as a percentage).

Thus, &quot;r-squared&quot; (in the linear-model case) has the square of the covariance in the numerator, and it cannot be defined, imho, as &quot;covariance divided by the product of the standard deviations&quot;.
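
To make this concrete, here is a minimal sketch in plain Python (the paired observations are made up for illustration):

```python
import math

# Made-up paired observations, purely for illustration.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [1.2, 1.9, 3.2, 4.1, 4.8]

n = len(x)
mx, my = sum(x) / n, sum(y) / n
cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
sx = math.sqrt(sum((a - mx) ** 2 for a in x) / n)
sy = math.sqrt(sum((b - my) ** 2 for b in y) / n)

r = cov / (sx * sy)   # "r": signed, lies in [-1, 1]
r_squared = r ** 2    # "r-squared": nonnegative, lies in [0, 1]
```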

See also, for instance, this reference (where the square of the covariance in the numerator is clearly shown): 
http://stats.stackexchange.com/questions/17050/explanation-for-r-squared-as-ratio-of-covariances-and-variances

So the 4th formula from the top would need some adjustments, imho.

The more conceptual point is that &quot;r-squared&quot; is, in general, a measure of goodness of fit (the coefficient of determination), or the proportion of response variation &quot;explained&quot; by the regressors in a model; in the linear case it just &quot;happens&quot; to be algebraically equal to the square of the correlation coefficient r (this is apparently the reason for the name): R^2 = r^2 = [cor(x,y)]^2
cf.: http://en.wikipedia.org/wiki/Coefficient_of_determination


&gt; &quot;equal unsystematic variances which are in turn equal to the market variance. 
Call that common value σ^2. To further simplify the exposition we will assume both stocks have betas of one.&quot;

Let me propose an interpretation, and, please, advise if I am missing the point.

Assuming the betas = 1, you are constraining both stocks to move in the &quot;same direction as, and about the same amount as, the movement of the benchmark&quot;
http://en.wikipedia.org/wiki/Beta_(finance)

This would cause, by transitivity, the correlation of the 2 stocks to be equal to +1.

What k^2 / ( k^2 + 1 ) -&gt; +1 seems to be saying, in intuitive terms, is that by &quot;neglecting&quot; (in relative terms) what is here called the &quot;idiosyncratic variance&quot; (k large), we let the correlation = +1 emerge, which was in fact assumed in the first place by setting equal systematic variances and betas = 1.

It seems to me the conceptual thesis here is essentially assumed, rather than actually derived or justified through &quot;the experiment&quot;.

From a more practical perspective, take, for instance, two instruments like &quot;fas&quot; and &quot;faz&quot;. It does not matter what volatility they or &quot;spy&quot; exhibit, or whatever other stress conditions hold in the market: &quot;faz&quot; will always have -1 &lt; r &lt; 0 with both &quot;fas&quot; and &quot;spy&quot;, no matter what, and such correlation will not converge anywhere, but rather fluctuate randomly in the negative interval. This is simply by construction.

However, even granting a more convincing mathematical argument involving &#124;r&#124; and a local approximation (which is actually conceivable, imho), a broader objection is that r captures, in any case, only the &quot;linear component&quot; of a relationship, while linearity is an elementary abstraction created by mathematicians, and, imho, the market could not care less about it unless we consider local approximations. In fact, the geometry of reality seems to be of a much more random and fractal nature, at least in its initial or simplest and most primitive expressions.

To make another practical example, take two instruments like &quot;spy&quot; and &quot;vxx&quot;. Imagine a short period with violent moves: say &quot;spy&quot; goes up and then turns down (or vice versa). Over the entire period, you would see an r closer to 0; looking at half the period, you would see an r closer to -1.

(To transcend the limitations of linearity, there are dependence metrics that may suit the analysis better.)]]></description>
		<content:encoded><![CDATA[<p>&gt;Thanks, for your kind words on my Twitter posts (as @financequant) and your feedback </p>
<p>It&#8217;s a great honor to be able to follow you and learn from your great experience.</p>
<p>&gt; &#8220;(the covariance divided by the product of the standard deviations) is what most stat software would call r-squared&#8221;</p>
<p>Some financial software might indeed use the terminology that way, but, imho, it would be misleading. Let me explain my view better, and please do advise if I am wrong.</p>
<p>&#8220;Covariance divided by the product of the standard deviations&#8221; is what we normally call &#8220;r&#8221;, while &#8220;r-squared&#8221; is, as the name suggests, the square of that (in the specific case of a linear model) or, more generally, a quadratic measure of fit (the coefficient of determination).</p>
<p>There is also an evident reason why something called &#8220;r-squared&#8221; cannot possibly be defined just as a &#8220;normalized covariance&#8221; (which is r). A covariance is &#8220;signed&#8221; (it can take *any* negative value), while &#8220;r-squared&#8221;, being &#8220;squared&#8221;, can only be nonnegative (0-1, or 0-100 if expressed as a percentage).</p>
<p>Thus, &#8220;r-squared&#8221; (in the linear-model case) has the square of the covariance in the numerator, and it cannot be defined, imho, as &#8220;covariance divided by the product of the standard deviations&#8221;.</p>
<p>See also, for instance, this reference (where the square of the covariance in the numerator is clearly shown):<br />
<a href="http://stats.stackexchange.com/questions/17050/explanation-for-r-squared-as-ratio-of-covariances-and-variances" rel="nofollow">http://stats.stackexchange.com/questions/17050/explanation-for-r-squared-as-ratio-of-covariances-and-variances</a></p>
<p>So the 4th formula from the top would need some adjustments, imho.</p>
<p>The more conceptual point is that &#8220;r-squared&#8221; is, in general, a measure of goodness of fit (the coefficient of determination), or the proportion of response variation &#8220;explained&#8221; by the regressors in a model; in the linear case it just &#8220;happens&#8221; to be algebraically equal to the square of the correlation coefficient r (this is apparently the reason for the name): R^2 = r^2 = [cor(x,y)]^2<br />
cf.: <a href="http://en.wikipedia.org/wiki/Coefficient_of_determination" rel="nofollow">http://en.wikipedia.org/wiki/Coefficient_of_determination</a></p>
<p>&gt; &#8220;equal unsystematic variances which are in turn equal to the market variance.<br />
Call that common value σ^2. To further simplify the exposition we will assume both stocks have betas of one.&#8221;</p>
<p>Let me propose an interpretation, and, please, advise if I am missing the point.</p>
<p>Assuming the betas = 1, you are constraining both stocks to move in the &#8220;same direction as, and about the same amount as, the movement of the benchmark&#8221;<br />
<a href="http://en.wikipedia.org/wiki/Beta_(finance)" rel="nofollow">http://en.wikipedia.org/wiki/Beta_(finance)</a></p>
<p>This would cause, by transitivity, the correlation of the 2 stocks to be equal to +1.</p>
<p>What k^2 / ( k^2 + 1 ) -&gt; +1 seems to be saying, in intuitive terms, is that by &#8220;neglecting&#8221; (in relative terms) what is here called the &#8220;idiosyncratic variance&#8221; (k large), we let the correlation = +1 emerge, which was in fact assumed in the first place by setting equal systematic variances and betas = 1.</p>
<p>It seems to me the conceptual thesis here is essentially assumed, rather than actually derived or justified through &#8220;the experiment&#8221;.</p>
<p>From a more practical perspective, take, for instance, two instruments like &#8220;fas&#8221; and &#8220;faz&#8221;. It does not matter what volatility they or &#8220;spy&#8221; exhibit, or whatever other stress conditions hold in the market: &#8220;faz&#8221; will always have -1 &lt; r &lt; 0 with both &#8220;fas&#8221; and &#8220;spy&#8221;, no matter what, and such correlation will not converge anywhere, but rather fluctuate randomly in the negative interval. This is simply by construction.</p>
<p>However, even granting a more convincing mathematical argument involving |r| and a local approximation (which is actually conceivable, imho), a broader objection is that r captures, in any case, only the &#8220;linear component&#8221; of a relationship, while linearity is an elementary abstraction created by mathematicians, and, imho, the market could not care less about it unless we consider local approximations. In fact, the geometry of reality seems to be of a much more random and fractal nature, at least in its initial or simplest and most primitive expressions.</p>
<p>To make another practical example, take two instruments like &#8220;spy&#8221; and &#8220;vxx&#8221;. Imagine a short period with violent moves: say &#8220;spy&#8221; goes up and then turns down (or vice versa). Over the entire period, you would see an r closer to 0; looking at half the period, you would see an r closer to -1.</p>
<p>(To transcend the limitations of linearity, there are dependence metrics that may suit the analysis better.)</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Robert J Frey</title>
		<link>http://keplerianfinance.com/2013/06/all-correlations-go-to-one/#comment-41</link>
		<dc:creator><![CDATA[Robert J Frey]]></dc:creator>
		<pubDate>Fri, 28 Jun 2013 05:30:24 +0000</pubDate>
		<guid isPermaLink="false">http://keplerianfinance.com/?p=155#comment-41</guid>
		<description><![CDATA[Thanks for your kind words on my Twitter posts (as @financequant) and your feedback on the typos. Regarding the r-squared, the reference is correct. What I had called the correlation (the covariance divided by the product of the standard deviations) is what most stat software would call r-squared. Finally, I did come across the fact that I could use LaTeX embedded in my posts and will do so in future ones.]]></description>
		<content:encoded><![CDATA[<p>Thanks for your kind words on my Twitter posts (as @financequant) and your feedback on the typos. Regarding the r-squared, the reference is correct. What I had called the correlation (the covariance divided by the product of the standard deviations) is what most stat software would call r-squared. Finally, I did come across the fact that I could use LaTeX embedded in my posts and will do so in future ones.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Algomind</title>
		<link>http://keplerianfinance.com/2013/06/all-correlations-go-to-one/#comment-40</link>
		<dc:creator><![CDATA[Algomind]]></dc:creator>
		<pubDate>Thu, 27 Jun 2013 20:06:39 +0000</pubDate>
		<guid isPermaLink="false">http://keplerianfinance.com/?p=155#comment-40</guid>
		<description><![CDATA[I have been following your great thoughts on Twitter, and came here via your link. 
You may wish to check a few typos I spotted while reading the article:

1.  There is an &quot;i&quot; in the denominator of the formula [ following the sentence: &quot;The correlation (in the sense here it is usually called r-squared)&quot; ] which is probably meant to be a &quot;j&quot; (I think).

2. &quot;alegbra&quot;  in &quot;some simple alegbra&quot;

3. The &quot;r-squared&quot; is probably meant to be simply &quot;r&quot; (as it is not squared now; maybe it was in a previous version of the article). 
Also, your final formula is, actually, simply: r = k^2 / ( k^2 + 1 ) also equal to
1 - 1 / ( k^2 + 1), where clearly the second term quickly vanishes as k diverges.
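
A tiny sanity check of that algebra, in plain Python:

```python
# The two forms of the final expression agree, and both tend to 1 as k grows.
def r_ratio(k):
    return k**2 / (k**2 + 1)

def r_complement(k):
    return 1 - 1 / (k**2 + 1)

for k in (1, 2, 10, 100):
    assert abs(r_ratio(k) - r_complement(k)) < 1e-12
# r(1) = 0.5; for large k the correction term 1/(k^2+1) quickly vanishes.
```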

If you like, I will later read the content more carefully and, in case it helps, provide some feedback on the conceptual part. Ah, btw, for formulas it may be very simple and convenient to use MathJax: http://www.mathjax.org/  (if your CMS allows it, of course).]]></description>
		<content:encoded><![CDATA[<p>I have been following your great thoughts on Twitter, and came here via your link.<br />
You may wish to check a few typos I spotted while reading the article:</p>
<p>1.  There is an &#8220;i&#8221; in the denominator of the formula [ following the sentence: &#8220;The correlation (in the sense here it is usually called r-squared)&#8221; ] which is probably meant to be a &#8220;j&#8221; (I think).</p>
<p>2. &#8220;alegbra&#8221;  in &#8220;some simple alegbra&#8221;</p>
<p>3. The &#8220;r-squared&#8221; is probably meant to be simply &#8220;r&#8221; (as it is not squared now; maybe it was in a previous version of the article).<br />
Also, your final formula is, actually, simply: r = k^2 / ( k^2 + 1 ) also equal to<br />
1 &#8211; 1 / ( k^2 + 1), where clearly the second term quickly vanishes as k diverges.</p>
<p>If you like, I will later read the content more carefully and, in case it helps, provide some feedback on the conceptual part. Ah, btw, for formulas it may be very simple and convenient to use MathJax: <a href="http://www.mathjax.org/" rel="nofollow">http://www.mathjax.org/</a>  (if your CMS allows it, of course).</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Robert J Frey</title>
		<link>http://keplerianfinance.com/2013/06/all-correlations-go-to-one/#comment-37</link>
		<dc:creator><![CDATA[Robert J Frey]]></dc:creator>
		<pubDate>Sat, 08 Jun 2013 03:27:18 +0000</pubDate>
		<guid isPermaLink="false">http://keplerianfinance.com/?p=155#comment-37</guid>
		<description><![CDATA[I was leaving out the \displaystyle{ ... }. This simple fix will make my life a lot easier. I&#039;m new to WordPress, and this sort of &quot;training&quot; is much appreciated!]]></description>
		<content:encoded><![CDATA[<p>I was leaving out the \displaystyle{ &#8230; }. This simple fix will make my life a lot easier. I&#8217;m new to WordPress, and this sort of &#8220;training&#8221; is much appreciated!</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Alex</title>
		<link>http://keplerianfinance.com/2013/06/all-correlations-go-to-one/#comment-36</link>
		<dc:creator><![CDATA[Alex]]></dc:creator>
		<pubDate>Sat, 08 Jun 2013 03:10:03 +0000</pubDate>
		<guid isPermaLink="false">http://keplerianfinance.com/?p=155#comment-36</guid>
		<description><![CDATA[Oh, my, the trick is quite simple. You simply write the &lt;code&gt;Slatex ...S&lt;/code&gt; as usual (I use a capital &quot;S&quot; instead of the dollar signs, just for demonstration). The first thing you do:
&lt;code&gt;Slatex \displaystyle{ ... }S&lt;/code&gt;

Wrapping everything inside a &lt;code&gt;\displaystyle{...}&lt;/code&gt; macro makes it quite pretty, like the latter integral. This is what happens inside equation environments in LaTeX. 

Neglecting it makes the LaTeX math render inline, like what happens in &lt;code&gt;\(...\)&lt;/code&gt;.

[My goodness, I hope that all renders correctly!]]]></description>
		<content:encoded><![CDATA[<p>Oh, my, the trick is quite simple. You simply write the <code>Slatex ...S</code> as usual (I use a capital &#8220;S&#8221; instead of the dollar signs, just for demonstration). The first thing you do:<br />
<code>Slatex \displaystyle{ ... }S</code></p>
<p>Wrapping everything inside a <code>\displaystyle{...}</code> macro makes it quite pretty, like the latter integral. This is what happens inside equation environments in LaTeX. </p>
<p>Neglecting it makes the LaTeX math render inline, like what happens in <code>\(...\)</code>.</p>
<p>[My goodness, I hope that all renders correctly!]</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Robert J Frey</title>
		<link>http://keplerianfinance.com/2013/06/all-correlations-go-to-one/#comment-35</link>
		<dc:creator><![CDATA[Robert J Frey]]></dc:creator>
		<pubDate>Sat, 08 Jun 2013 03:06:30 +0000</pubDate>
		<guid isPermaLink="false">http://keplerianfinance.com/?p=155#comment-35</guid>
		<description><![CDATA[Yes, it does, but I found it to be kind of kludgy--at least for the tools I usually use. My stuff doesn&#039;t seem to come out as nicely formatted as the integral in your comment. Could you send me the plaintext that you used to create it? Thanks.]]></description>
		<content:encoded><![CDATA[<p>Yes, it does, but I found it to be kind of kludgy&#8211;at least for the tools I usually use. My stuff doesn&#8217;t seem to come out as nicely formatted as the integral in your comment. Could you send me the plaintext that you used to create it? Thanks.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Robert J Frey</title>
		<link>http://keplerianfinance.com/2013/06/all-correlations-go-to-one/#comment-34</link>
		<dc:creator><![CDATA[Robert J Frey]]></dc:creator>
		<pubDate>Sat, 08 Jun 2013 03:01:51 +0000</pubDate>
		<guid isPermaLink="false">http://keplerianfinance.com/?p=155#comment-34</guid>
		<description><![CDATA[Thank you for reading the post carefully. You are, of course, correct, and I&#039;ve fixed the typo in the equation in both the text and plot.]]></description>
		<content:encoded><![CDATA[<p>Thank you for reading the post carefully. You are, of course, correct, and I&#8217;ve fixed the typo in the equation in both the text and plot.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Alex</title>
		<link>http://keplerianfinance.com/2013/06/all-correlations-go-to-one/#comment-33</link>
		<dc:creator><![CDATA[Alex]]></dc:creator>
		<pubDate>Fri, 07 Jun 2013 15:50:30 +0000</pubDate>
		<guid isPermaLink="false">http://keplerianfinance.com/?p=155#comment-33</guid>
		<description><![CDATA[(I hasten to add, $latex \displaystyle\mathcal{P}\exp\left(\int A_{\mu}(x)\,\mathrm{d}x^{\mu}\right)$ for display style equations, etc.)]]></description>
		<content:encoded><![CDATA[<p>(I hasten to add, <img src="//s0.wp.com/latex.php?latex=%5Cdisplaystyle%5Cmathcal%7BP%7D%5Cexp%5Cleft%28%5Cint+A_%7B%5Cmu%7D%28x%29%5C%2C%5Cmathrm%7Bd%7Dx%5E%7B%5Cmu%7D%5Cright%29&#038;bg=ffffff&#038;fg=000&#038;s=0" alt="&#92;displaystyle&#92;mathcal{P}&#92;exp&#92;left(&#92;int A_{&#92;mu}(x)&#92;,&#92;mathrm{d}x^{&#92;mu}&#92;right)" title="&#92;displaystyle&#92;mathcal{P}&#92;exp&#92;left(&#92;int A_{&#92;mu}(x)&#92;,&#92;mathrm{d}x^{&#92;mu}&#92;right)" class="latex" /> for display style equations, etc.)</p>
]]></content:encoded>
	</item>
</channel>
</rss>
