Swordrock's Blog

Just another WordPress.com site

Random variables

Today I will tell you about random variables and how they are used.

I will cover this in two parts. In this first lesson I will go over: elements of probability, what it means to be a random variable, jointly distributed random variables, and some statistical properties such as variance and expectation.

We live in a random world: everything happens randomly, so we should learn how random events affect our daily lives. This is amazing.

Probability and random processes are hard to learn, because we are used to thinking deterministically, but the real world contains hardly any truly deterministic events; almost everything is stochastic.

I use MATLAB to simulate random variables; seeing how they behave when we try again and again makes the ideas much clearer.

My aim in this text is not to give every probability theorem; you can easily find those topics on the net:

- Elements of probability: sample spaces, outcomes, events, axioms of probability, conditional probability, Bayes' rule (this rule is very important for building probabilistic intuition), and dependent/independent events.

You can use this YouTube video as an introduction to probability and to what it means to be probabilistic:

You should learn these topics, because we will use all of them as we build up probability theory.

There is also an ebook about the elements of probability:


I will not repeat mathematical theory that you can find everywhere, but rather show how to use it.

For probability simulations: http://www.mathsonline.co.uk/nonmembers/resource/prob/

How can we relate these ideas to realistic problems?

Let's begin.

What is a Random Variable?

When an experiment is performed, we are often interested in numerical quantities determined by the result.

For example, a civil engineer may not be directly concerned with the daily rises and falls of the water level of a reservoir, but may care about the level at the end of the rainy season.

These quantities of interest, determined by the result of the experiment, are known as random variables.


Ex.: Let X denote the random variable defined as the sum of two fair dice. Then:

P(X=2) = P{(1,1)} = 1/36

P(X=3) = P{(1,2),(2,1)} = 2/36

P(X=12) = P{(6,6)} = 1/36

As you see, the random variable X takes integer values between 2 and 12.
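The dice example can be checked by simply enumerating all 36 equally likely outcomes. A minimal sketch in Python (rather than MATLAB), using exact fractions:

```python
from fractions import Fraction
from itertools import product

# Exact PMF of X = sum of two fair dice: count outcomes over the
# 36 equally likely pairs (d1, d2).
counts = {}
for d1, d2 in product(range(1, 7), repeat=2):
    s = d1 + d2
    counts[s] = counts.get(s, 0) + 1

pmf = {s: Fraction(c, 36) for s, c in counts.items()}

print(pmf[2])   # 1/36
print(pmf[3])   # 1/18 (= 2/36)
print(pmf[12])  # 1/36
```

Summing pmf over all values from 2 to 12 gives exactly 1, as any PMF must.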


In this example, the random variable takes on a finite number of possible values. Random variables whose set of possible values can be written either as a finite sequence or as an infinite sequence are said to be discrete,

so we call these variables "discrete random variables".

There are also random variables that take values on a continuum of possible values. These are known as "continuous random variables".

The Cumulative Distribution Function

Also called the distribution function, F of the random variable X is defined for any real number x by

F(x) = P{X <= x}

That is, F(x) is the probability that the random variable X takes on a value less than or equal to x.


Ex: A key step in manufacturing integrated circuits requires baking the chips in a special oven within a certain temperature range. Let T be a random variable modelling the oven temperature.

Show that the probability the oven temperature is in the range a < T <= b can be expressed as

P(a < T <= b) = P(T <= b) - P(T <= a) = F(b) - F(a)
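The identity P(a < T <= b) = F(b) - F(a) can be checked numerically. A sketch in Python under a made-up model (a uniform oven temperature on [150, 250] degrees is my assumption, not part of the original example):

```python
import random

# Hypothetical model: oven temperature T uniform on [150, 250] degrees.
random.seed(0)
a, b = 180.0, 220.0
samples = [random.uniform(150.0, 250.0) for _ in range(100_000)]

# Empirical estimate of P(a < T <= b) from the samples.
empirical = sum(a < t <= b for t in samples) / len(samples)

def F(x):
    """CDF of Uniform(150, 250)."""
    return min(max((x - 150.0) / 100.0, 0.0), 1.0)

print(empirical, F(b) - F(a))  # both close to 0.4
```

The empirical frequency agrees with F(b) - F(a) up to sampling error, as the identity predicts.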



Ex: A computer has three disk drives numbered 0, 1 and 2. When the computer is booted, it randomly selects a drive to store temporary files on.

If we model the selected drive number with the random variable X, show that the probability that drive 0 or drive 1 is selected is

P(X=0 or X=1) = P(X=0) + P(X=1)

Here "or" means union, and the two events are disjoint.
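Since the events {X=0} and {X=1} are disjoint, the probability of their union is just the sum. A tiny Python sketch, assuming (as the example suggests) that the drive is chosen uniformly:

```python
from fractions import Fraction

# Assumed model: X uniform on {0, 1, 2}, so each drive has probability 1/3.
pmf = {0: Fraction(1, 3), 1: Fraction(1, 3), 2: Fraction(1, 3)}

# {X = 0} and {X = 1} are disjoint events, so the probability of the
# union is the sum of the individual probabilities.
p_union = pmf[0] + pmf[1]
print(p_union)  # 2/3
```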

Ex: Suppose the random variable X has the distribution function

F(x) = 0 if x <= 0

F(x) = 1 - exp(-x^2) if x > 0

What is the probability that X exceeds 1?

P(X > 1) = 1 - F(1) = exp(-1) ≈ 0.368
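The arithmetic above can be reproduced directly from the given distribution function. A short Python check:

```python
import math

# CDF from the example: F(x) = 1 - exp(-x^2) for x > 0, and 0 otherwise.
def F(x):
    return 1.0 - math.exp(-x * x) if x > 0 else 0.0

# P(X > 1) is the complement of F(1).
p = 1.0 - F(1.0)
print(p)             # exp(-1) ≈ 0.368
print(math.exp(-1))  # same value
```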



Properties Of Random Variables

For a discrete random variable, we define the probability mass function (PMF) by

p(a) = P{X=a}

The PMF is positive for at most a countable set of values of a, and zero for all other values.


Ex: Consider a random variable X that is equal to 1, 2 or 3.

p(1) = 1/2, p(2) = 1/3, so what is p(3)?

As we remember, the probabilities over the sample space must sum to 1, so:

p(1) + p(2) + p(3) = 1

p(3) = 1/6

So the PMF at 3 is 1/6.
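The normalization argument above is just exact arithmetic; a two-line Python sketch with fractions:

```python
from fractions import Fraction

# PMF from the example: the probabilities must sum to 1, which pins down p(3).
p1, p2 = Fraction(1, 2), Fraction(1, 3)
p3 = 1 - p1 - p2
print(p3)  # 1/6
```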



When we have a continuous random variable, the probability mass function is replaced by the probability density function (PDF); the cumulative distribution function is then the integral of the PDF.

Jointly Distributed Random Variables

Sometimes we model real-world problems with more than one random variable, so we need to describe the relationship between two or more random variables.

For example, in a study of possible causes of cancer, we might be interested in the relationship between the average number of cigarettes smoked daily and the age at which an individual contracts cancer.

To specify such relationships, we use the joint cumulative distribution function:

F(x,y) = P{X <= x, Y <= y}

From this we can obtain, for example, the distribution of X alone:

F(x) = P{X <= x} = P{X <= x, Y < inf}



We can also write the joint probability mass function as

p(xi, yj) = P{X = xi, Y = yj}



X and Y are said to be independent if knowing the value of one does not affect the probabilities of the other:

P{X = xi, Y = yj} = p(xi) × p(yj)
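The factorization of a joint PMF into marginals is easy to verify for a concrete pair of independent variables. A Python sketch using two independent fair dice:

```python
from fractions import Fraction
from itertools import product

# Two independent fair dice: each pair (i, j) has probability 1/36,
# and each marginal value has probability 1/6.
marginal = {k: Fraction(1, 6) for k in range(1, 7)}
joint = {(i, j): Fraction(1, 36) for i, j in product(range(1, 7), repeat=2)}

# Independence: the joint PMF equals the product of the marginals everywhere.
factorizes = all(joint[(i, j)] == marginal[i] * marginal[j]
                 for i, j in joint)
print(factorizes)  # True
```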



Expectation

Expectation is an important concept in probability. If X is a discrete random variable taking values x1, x2, …, then

E[X] is the sum of xi × P{X = xi}.


That means: the expected value of X is a weighted average of the possible values that X can take on, each value being weighted by the probability that X assumes it.

For instance, if the probability mass function is given by

p(0) = 1/2 = p(1)

then

E[X] = 0 × 1/2 + 1 × 1/2 = 1/2

is a weighted average of the two possible values 0 and 1.


Let's roll a die!


Ex: The outcomes are 1, 2, 3, 4, 5, 6, each with probability 1/6.

E[X] = 1×1/6 + 2×1/6 + 3×1/6 + 4×1/6 + 5×1/6 + 6×1/6 = 7/2

7/2 does not mean that we expect a score of 3.5 from a single roll; it means that 3.5 is the average value of the die when we roll it many, many times!
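Both readings of E[X] — the exact weighted average and the long-run average — can be seen side by side in a short Python sketch:

```python
import random
from fractions import Fraction

# Exact expectation of a fair die: weighted average of 1..6 with weights 1/6.
exact = sum(Fraction(k, 6) for k in range(1, 7))
print(exact)  # 7/2

# Long-run average from simulation: approaches 3.5 as the number of
# rolls grows (law of large numbers).
random.seed(1)
rolls = [random.randint(1, 6) for _ in range(100_000)]
print(round(sum(rolls) / len(rolls), 1))  # ≈ 3.5
```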




It is useful to know some properties of probability mass functions, like expectation.

But expectation itself does not tell us anything about the variation, or spread, of the values.

For example,

W = 0 with probability 1

Z = -1 with probability 1/2, 1 with probability 1/2

have the same expected value, so expectation alone does not distinguish them.

Because we expect X to take on values around its mean E[X], a reasonable way of measuring the possible variation of X is to look at how far apart X is from its mean on average. One possible measure is the quantity E[|X − μ|], where μ = E[X] and |X − μ| denotes the absolute value of X − μ.

However, it turns out to be mathematically inconvenient to deal with this quantity and so a more tractable quantity is usually considered — namely, the expectation of the square of the difference between X and its mean.


Var(X) = E[(X − μ)^2], and an alternative formula is

Var(X) = E[X^2] − μ^2

Var(aX) = a^2 Var(X)
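Both variance formulas, and the scaling rule, can be verified exactly on the fair-die PMF from the expectation example. A Python sketch with fractions:

```python
from fractions import Fraction

# Fair die: check Var(X) = E[X^2] - mu^2 and Var(aX) = a^2 Var(X).
pmf = {k: Fraction(1, 6) for k in range(1, 7)}
mu = sum(k * p for k, p in pmf.items())               # 7/2
ex2 = sum(k * k * p for k, p in pmf.items())          # 91/6
var = sum((k - mu) ** 2 * p for k, p in pmf.items())  # 35/12

print(var == ex2 - mu ** 2)  # True

# Scaling: the PMF of aX puts the same weights on the values a*k.
a = 3
pmf_scaled = {a * k: p for k, p in pmf.items()}
mu_s = sum(k * p for k, p in pmf_scaled.items())
var_s = sum((k - mu_s) ** 2 * p for k, p in pmf_scaled.items())
print(var_s == a ** 2 * var)  # True
```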


I will share with you some statistical commands in MATLAB, such as rand, randn, mean, median, var and std.

We can use MATLAB to generate random variables using the Statistics Toolbox features.

For example, the rand(m,n) command will give us an m × n matrix of random numbers. Let's try it:


Ex :

>> rand (5,8)

ans =

0.8147    0.0975    0.1576    0.1419    0.6557    0.7577    0.7060    0.8235
0.9058    0.2785    0.9706    0.4218    0.0357    0.7431    0.0318    0.6948
0.1270    0.5469    0.9572    0.9157    0.8491    0.3922    0.2769    0.3171
0.9134    0.9575    0.4854    0.7922    0.9340    0.6555    0.0462    0.9502
0.6324    0.9649    0.8003    0.9595    0.6787    0.1712    0.0971    0.0344


randn() is a bit different from rand: it generates random numbers with zero mean and unit variance. Let's practice:


Ex :

>> x = randn (1,400 );
>> var (x) , mean (x) , median (x) , std (x)

ans =


ans =

-0.0418      % as you see, the mean is very close to zero

ans =


ans =


You can also find the mean and median of given data by using the mean and median commands, like this:


Ex :

>> X = [ 3 5 8 7 9 6 5 4 7] ;
>> mean (X)

ans =

     6

>> median (X)

ans =

     6


We also have commands for variance and standard deviation calculations, var(x) and std(x), as you saw in the randn example.
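For readers without MATLAB, here is a rough Python analogue of the sessions above using the standard-library statistics module (the random values will differ, since the generator and seed are not the same):

```python
import random
import statistics

# Like randn(1, 400): 400 samples from a standard normal distribution.
random.seed(0)
x = [random.gauss(0.0, 1.0) for _ in range(400)]

print(round(statistics.mean(x), 2))   # close to 0
print(round(statistics.stdev(x), 2))  # close to 1
print(round(statistics.median(x), 2))

# Like the mean/median example on a fixed data vector.
X = [3, 5, 8, 7, 9, 6, 5, 4, 7]
print(statistics.mean(X))    # the mean is 6
print(statistics.median(X))  # the median is 6
```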

Here is another link about expectation and variance:


That's all for this lesson, but I will also share some important, must-see videos from YouTube. See you!





November 19, 2010 | Posted in Matlab | Leave a comment

Electricity collected from the air

Such interesting news!

Today I will share a new source of electricity: electricity from the air.

Yes, they have managed to collect electricity from the air.


It is much like how solar cells capture sunlight and use it for lighting and other home appliances.

They say: "Our research could pave the way for turning electricity from the atmosphere into an alternative energy source for the future."

He goes on: "If we know how electricity builds up and spreads in the atmosphere, we can also prevent death and damage caused by lightning strikes."

As you may remember, capturing and using the electricity in the air was the dream of Nikola Tesla, the magnificent genius.

They call it "hygroelectricity", meaning humidity electricity.

Here is a useful link for hygroelectricity : http://www.worldchanging.com/archives/011530.html

Their aim now is to design panels, like solar cells, to use this new hygroelectricity in homes, and to prevent lightning strikes from killing people.


The details are here, in the full paper. Take care.



November 15, 2010 | Posted in Electronics | Leave a comment