Intuitively, a random variable X over a probability space (Omega, Sigma, P) is a variable with a probability distribution ascribed to its values. Consider a discrete rv first. Think of it as a random number that comes with, for every value it could take, the probability that it takes that value. For example, if it only takes the values 1 through 4, you could have a distribution like this: P(X=1)=0.2, P(X=2)=0.5, P(X=3)=0.2, P(X=4)=0.1. Even the notation is intuitive. For continuous rvs you instead assign a probability to every interval, namely the probability that the random real number X lands inside it, e.g. P(X in (2, 6))=0.4. So an rv is just a random number X with a probability specified for every little interval that might contain X, in other words a probability distribution over the real line. Defining X as a measurable function from Omega to R just makes this notion formal and enables rigorous analysis and proofs. Omega is there to specify the probabilities of intervals: the probability that an interval contains X is the probability of the preimage of that interval in Omega. You have to learn to think of an rv as both things at once: the intuitive random number, and a measurable function.
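To make the "measurable function" picture concrete, here is a minimal sketch (the names Omega, P, X and the choice of a ten-point sample space are my own illustration, not from any library): the discrete distribution above, realized as a function X on a finite Omega, where P(X=k) comes out as the probability of the preimage of k.

```python
from fractions import Fraction

# Omega: ten equally likely outcomes, each with probability 1/10.
Omega = range(10)
P = {w: Fraction(1, 10) for w in Omega}

# X maps outcomes to the values 1..4; the preimage sizes are chosen
# so that P(X=1)=0.2, P(X=2)=0.5, P(X=3)=0.2, P(X=4)=0.1.
def X(w):
    if w < 2:
        return 1
    if w < 7:
        return 2
    if w < 9:
        return 3
    return 4

# P(X = k) is the probability of the preimage {w in Omega : X(w) = k}.
def prob_X_equals(k):
    return sum(P[w] for w in Omega if X(w) == k)

for k in (1, 2, 3, 4):
    print(k, float(prob_X_equals(k)))
```

The same X could be defined on many different sample spaces; all that the distribution sees is how much probability mass each preimage carries.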
Omega really starts playing a role when you want several random variables that are mutually dependent (their joint probabilities P(X=k, Y=l, Z=m) are complicated). Having a common underlying probability space Omega for these rvs lets you define probabilities for all of them simultaneously.
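As a sketch of that last point (the two-coin Omega and the particular X and Y are my own toy example): two dependent rvs defined on the same sample space, with their joint probabilities read off from common preimages.

```python
from fractions import Fraction
from itertools import product

# Omega: outcomes of two fair coin flips, each outcome has probability 1/4.
Omega = list(product((0, 1), repeat=2))
P = {w: Fraction(1, 4) for w in Omega}

# Two rvs on the SAME space: X is the first flip, Y is the total
# number of heads. They are dependent, and Omega encodes how.
def X(w):
    return w[0]

def Y(w):
    return w[0] + w[1]

def joint(k, l):
    # P(X=k, Y=l) = probability of the common preimage in Omega.
    return sum(P[w] for w in Omega if X(w) == k and Y(w) == l)

print(float(joint(1, 2)))  # only (1,1) qualifies -> 0.25
print(float(joint(0, 2)))  # impossible combination -> 0.0
```

Without a shared Omega you could still state each marginal distribution separately, but you would have no way to compute P(X=k, Y=l); the common space is exactly what pins down the joint behavior.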