<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
		>
<channel>
	<title>Comments on: About Poisson Distribution&#8230;</title>
	<atom:link href="http://physics.bu.edu/sites/geneva-program/2010/02/04/about-poisson-distribution/feed/" rel="self" type="application/rss+xml" />
	<link>http://physics.bu.edu/sites/geneva-program/2010/02/04/about-poisson-distribution/</link>
	<description>Just another Boston University Physics weblog</description>
	<lastBuildDate>Mon, 19 Apr 2010 07:15:50 +0000</lastBuildDate>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=3.5</generator>
	<item>
		<title>By: arno</title>
		<link>http://physics.bu.edu/sites/geneva-program/2010/02/04/about-poisson-distribution/comment-page-1/#comment-29</link>
		<dc:creator>arno</dc:creator>
		<pubDate>Fri, 05 Feb 2010 15:49:56 +0000</pubDate>
		<guid isPermaLink="false">http://physics.bu.edu/sites/geneva-program/?p=361#comment-29</guid>
		<description><![CDATA[Here is more info about the relationship between the various distributions (&lt;a href=&quot;http://en.wikipedia.org/wiki/Poisson_distribution&quot; rel=&quot;nofollow&quot;&gt;Wikipedia&lt;/a&gt;):
[...]
	The Poisson distribution can be derived as a limiting case of the binomial distribution as the number of trials goes to infinity and the expected number of successes remains fixed — see law of rare events below. Therefore, it can be used as an approximation of the binomial distribution if n is sufficiently large and p is sufficiently small. There is a rule of thumb stating that the Poisson distribution is a good approximation of the binomial distribution if n is at least 20 and p is at most 0.05, and an excellent approximation if n ≥ 100 and np ≤ 10.[2]

	For sufficiently large values of λ (say λ&gt;1000), the normal distribution with mean λ and variance λ (standard deviation √λ) is an excellent approximation to the Poisson distribution. If λ is greater than about 10, then the normal distribution is a good approximation if an appropriate continuity correction is performed, i.e., P(X ≤ x), where (lower-case) x is a non-negative integer, is replaced by P(X ≤ x + 0.5).
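
The continuity correction above is easy to check numerically. A minimal Python sketch (my own illustration; λ = 50 and x = 55 are hypothetical values, not from the quote), comparing the exact Poisson CDF to the normal approximation with the +0.5 correction:

```python
import math

def poisson_cdf(x, lam):
    # P(X <= x) for X ~ Poisson(lam), summing the pmf term by term
    return sum(math.exp(-lam) * lam**k / math.factorial(k) for k in range(int(x) + 1))

def normal_cdf(x, mu, sigma):
    # Normal CDF via the error function
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

lam = 50.0   # hypothetical rate, comfortably above the λ > 10 rule of thumb
x = 55       # evaluate P(X <= 55)
exact = poisson_cdf(x, lam)
# continuity correction: P(X <= x) is replaced by P(N <= x + 0.5), N ~ N(λ, λ)
approx = normal_cdf(x + 0.5, lam, math.sqrt(lam))
print(f"exact = {exact:.4f}, normal with correction = {approx:.4f}")
```

The two values agree to within about a percent at this λ; dropping the +0.5 makes the approximation noticeably worse.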
[...]
]]></description>
		<content:encoded><![CDATA[<p>Here is more info about the relationship between the various distributions (<a href="http://en.wikipedia.org/wiki/Poisson_distribution" rel="nofollow">Wikipedia</a>):<br />
[...]<br />
	The Poisson distribution can be derived as a limiting case of the binomial distribution as the number of trials goes to infinity and the expected number of successes remains fixed — see law of rare events below. Therefore, it can be used as an approximation of the binomial distribution if n is sufficiently large and p is sufficiently small. There is a rule of thumb stating that the Poisson distribution is a good approximation of the binomial distribution if n is at least 20 and p is at most 0.05, and an excellent approximation if n ≥ 100 and np ≤ 10.[2]</p>
<p>	For sufficiently large values of λ (say λ&gt;1000), the normal distribution with mean λ and variance λ (standard deviation √λ) is an excellent approximation to the Poisson distribution. If λ is greater than about 10, then the normal distribution is a good approximation if an appropriate continuity correction is performed, i.e., P(X ≤ x), where (lower-case) x is a non-negative integer, is replaced by P(X ≤ x + 0.5).<br />
[...]</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: arno</title>
		<link>http://physics.bu.edu/sites/geneva-program/2010/02/04/about-poisson-distribution/comment-page-1/#comment-27</link>
		<dc:creator>arno</dc:creator>
		<pubDate>Thu, 04 Feb 2010 21:48:40 +0000</pubDate>
		<guid isPermaLink="false">http://physics.bu.edu/sites/geneva-program/?p=361#comment-27</guid>
		<description><![CDATA[To understand the Poisson distribution, it may help to understand how it differs from, and relates to, the Gaussian distribution.

Also from &lt;a href=&quot;http://en.wikipedia.org/wiki/Normal_distribution&quot; rel=&quot;nofollow&quot;&gt;Wikipedia&lt;/a&gt;: [...] In probability theory and statistics, the normal distribution or Gaussian distribution is a continuous probability distribution that describes data that cluster around the mean. The graph of the associated probability density function is bell-shaped, with a peak at the mean, and is known as the Gaussian function or bell curve. [...] By the central limit theorem, the sum of a large number of independent random variables is distributed approximately normally. [...] Another practical consequence of the central limit theorem is that certain other distributions can be approximated by the normal distribution, for example:

	-The binomial distribution B(n, p) is approximately normal N(np, np(1 − p)) for large n and for p not too close to zero or one.
	-The Poisson(λ) distribution is approximately normal N(λ, λ) for large values of λ.
	-The chi-squared distribution χ²(k) is approximately normal N(k, 2k) for large k.
	-The Student’s t-distribution t(ν) is approximately normal N(0, 1) when ν is large.

Whether these approximations are sufficiently accurate depends on the purpose for which they are needed, and the rate of convergence to the normal distribution. It is typically the case that such approximations are less accurate in the tails of the distribution.
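
The first approximation above can be checked directly. A quick Python sketch (my own illustration; n = 100 and p = 0.3 are arbitrary choices, not from the quote), comparing the exact binomial CDF with the normal N(np, np(1 − p)) approximation near the mean and further out in the tail:

```python
import math

def binom_cdf(x, n, p):
    # P(X <= x) for X ~ B(n, p), summing the pmf directly
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(x + 1))

def normal_cdf(x, mu, sigma):
    # Normal CDF via the error function
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

n, p = 100, 0.3                      # arbitrary illustrative values
mu, sigma = n * p, math.sqrt(n * p * (1 - p))
for x in (30, 38, 45):               # near the mean, then further into the tail
    exact = binom_cdf(x, n, p)
    approx = normal_cdf(x + 0.5, mu, sigma)   # with continuity correction
    print(x, round(exact, 5), round(approx, 5))
```

Near the mean the two agree closely; in the tail the absolute error is small but the relative error of the tail probability grows, which is the convergence caveat mentioned above.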

 [...]]]></description>
		<content:encoded><![CDATA[<p>To understand the Poisson distribution, it may help to understand how it differs from, and relates to, the Gaussian distribution.</p>
<p>Also from <a href="http://en.wikipedia.org/wiki/Normal_distribution" rel="nofollow">Wikipedia</a>: [...] In probability theory and statistics, the normal distribution or Gaussian distribution is a continuous probability distribution that describes data that cluster around the mean. The graph of the associated probability density function is bell-shaped, with a peak at the mean, and is known as the Gaussian function or bell curve. [...] By the central limit theorem, the sum of a large number of independent random variables is distributed approximately normally. [...] Another practical consequence of the central limit theorem is that certain other distributions can be approximated by the normal distribution, for example:</p>
<p>	-The binomial distribution B(n, p) is approximately normal N(np, np(1 − p)) for large n and for p not too close to zero or one.<br />
	-The Poisson(λ) distribution is approximately normal N(λ, λ) for large values of λ.<br />
	-The chi-squared distribution χ²(k) is approximately normal N(k, 2k) for large k.<br />
	-The Student’s t-distribution t(ν) is approximately normal N(0, 1) when ν is large.</p>
<p>Whether these approximations are sufficiently accurate depends on the purpose for which they are needed, and the rate of convergence to the normal distribution. It is typically the case that such approximations are less accurate in the tails of the distribution.</p>
<p> [...]</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Elim</title>
		<link>http://physics.bu.edu/sites/geneva-program/2010/02/04/about-poisson-distribution/comment-page-1/#comment-28</link>
		<dc:creator>Elim</dc:creator>
		<pubDate>Thu, 04 Feb 2010 21:44:13 +0000</pubDate>
		<guid isPermaLink="false">http://physics.bu.edu/sites/geneva-program/?p=361#comment-28</guid>
		<description><![CDATA[I don&#039;t think k is a parameter. Maybe I am wrong...
From what I understand just now, k is the actual number of occurrences, while the input parameter, λ, is the expected number of occurrences.
The following is my own thinking:
So, we have several bins in front of us and we are going to throw some balls into those bins. For each bin, we expect a certain number of balls; that is λ. When we actually throw the balls randomly, we have the actual results; that is k. And then we will see which λ gives the best results (that&#039;s the fitting).
Maybe this thinking doesn&#039;t make sense to anyone... and maybe I am completely wrong... I am not quite sure... please let me know your thoughts.

Because k is the number of events, which is always non-negative, there are no negative events. So the negative x-axis is always zero. What I meant by &quot;thrown out&quot; is not really thrown out, but zero... sorry if I confused you. And it makes sense from the graph too.
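
The bins-and-balls picture can be sketched in code (my own illustration; the sampler and λ = 4 are assumptions, not from the discussion): λ is the expected count per bin, each draw gives an actual observed k, and k is never negative:

```python
import math
import random

def poisson_sample(lam, rng):
    # Knuth's method: count uniform draws until the running product drops below e^-λ
    threshold = math.exp(-lam)
    k, prod = 0, 1.0
    while prod > threshold:
        k += 1
        prod *= rng.random()
    return k - 1

rng = random.Random(0)
lam = 4.0                                  # expected number of balls per bin (assumed)
counts = [poisson_sample(lam, rng) for _ in range(10_000)]
mean_k = sum(counts) / len(counts)
print(f"expected λ = {lam}, observed mean of k = {mean_k:.2f}, smallest k = {min(counts)}")
```

The observed mean of k hovers around λ, and the smallest k is 0 rather than anything negative, which is why the curve has nothing to the left of zero.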

Again, I may be completely wrong... those are just my thoughts...
Elim =D]]></description>
		<content:encoded><![CDATA[<p>I don&#8217;t think k is a parameter. Maybe I am wrong&#8230;<br />
From what I understand just now, k is the actual number of occurrences, while the input parameter, λ, is the expected number of occurrences.<br />
The following is my own thinking:<br />
So, we have several bins in front of us and we are going to throw some balls into those bins. For each bin, we expect a certain number of balls; that is λ. When we actually throw the balls randomly, we have the actual results; that is k. And then we will see which λ gives the best results (that&#8217;s the fitting).<br />
Maybe this thinking doesn&#8217;t make sense to anyone&#8230; and maybe I am completely wrong&#8230; I am not quite sure&#8230; please let me know your thoughts.</p>
<p>Because k is the number of events, which is always non-negative, there are no negative events. So the negative x-axis is always zero. What I meant by &#8220;thrown out&#8221; is not really thrown out, but zero&#8230; sorry if I confused you. And it makes sense from the graph too.</p>
<p>Again, I may be completely wrong&#8230; those are just my thoughts&#8230;<br />
Elim =D</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: chelsea</title>
		<link>http://physics.bu.edu/sites/geneva-program/2010/02/04/about-poisson-distribution/comment-page-1/#comment-26</link>
		<dc:creator>chelsea</dc:creator>
		<pubDate>Thu, 04 Feb 2010 21:23:09 +0000</pubDate>
		<guid isPermaLink="false">http://physics.bu.edu/sites/geneva-program/?p=361#comment-26</guid>
		<description><![CDATA[So is k, the frequency of events, not a parameter?  That seems strange to me.  And I don&#039;t see why that is a reason for the negative part to be thrown out...]]></description>
		<content:encoded><![CDATA[<p>So is k, the frequency of events, not a parameter?  That seems strange to me.  And I don&#8217;t see why that is a reason for the negative part to be thrown out&#8230;</p>
]]></content:encoded>
	</item>
</channel>
</rss>