Program to show a randrange function.

Here is a straightforward implementation of randrange2:

#include <stdlib.h>

int randrange2(int m, int n)
{
    return rand() % (n - m + 1) + m;
}
Here is one using the suggested ``better way of reducing the range of the rand function'':
#include <stdlib.h>

int randrange2(int m, int n)
{
    return rand() / (RAND_MAX / (n - m + 1) + 1) + m;
}
Notice that I've replaced N in the suggested general form with the expression n - m + 1.
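
Either version can be exercised with a small test driver like the one sketched below. (The srand seeding and the loop count are just illustrative choices, not part of the exercise.)

 #include <stdio.h>
 #include <stdlib.h>
 #include <time.h>

 int randrange2(int m, int n);   /* either implementation above */

 int main(void)
 {
     int i;
     srand((unsigned)time(NULL));         /* seed once so each run differs */
     for (i = 0; i < 10; i++)
         printf("%d ", randrange2(1, 6)); /* each value should fall in 1..6 */
     printf("\n");
     return 0;
 }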

You could implement the simpler randrange function either as

int randrange(int n)
{
    return rand() % n + 1;
}
or, using the improvement,
int randrange(int n)
{
    return rand() / (RAND_MAX / n + 1) + 1;
}
or, by writing it ``on top of'' the more general randrange2 you already wrote,
int randrange(int n)
{
    return randrange2(1, n);
}

The various ``fudge factors'' in these expressions deserve some explanation. The first one is straightforward: the + 1 in (n - m + 1) simply gives us the number of numbers in the range m through n, including both m and n. (For example, the range 3 through 7 contains 7 - 3 + 1 = 5 numbers.) Leaving out the + 1 in this case is the classic example of a fencepost error, named after the old puzzle, ``How many pickets are there in a picket fence ten feet long, with the pickets one foot apart?''

The other + 1 is a bit trickier. First let's consider the second implementation of randrange. We want to divide rand's output by some number so that the results will come out in the range 0 to n - 1. (Then we'll add in 1 to get numbers in the range 1 through n.) Left to its own devices, rand returns numbers in the range 0 to RAND_MAX (where RAND_MAX is a constant defined for us in <stdlib.h>). The division, remember, is going to be integer division, which will truncate. So numbers which would have come out in the range 0.0 to 0.99999... (if the division were exact) will all truncate to 0, numbers which would have come out in the range 1.0 to 1.99999... will all truncate to 1, 2.0 to 2.99999... will all truncate to 2, and so on. If we were to divide rand's output by the quantity

 RAND_MAX / n
that is, if we were to write
 rand() / (RAND_MAX / n)
then when rand returned RAND_MAX, the division could yield exactly n, which is one too many. (This wouldn't happen too often--only when rand returned that one value, its maximum value--but it would be a bug, and a hard one to find, because it wouldn't show up very often.) So if we add one to the denominator, that is, divide by the quantity
 RAND_MAX / n + 1
then when rand returns RAND_MAX, the division will yield a number just shy of n, which will then be truncated to n - 1, which is just what we want. We add in 1, and we're done.
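
To see the arithmetic concretely, here is a small sketch that works through the worst case, assuming for illustration that RAND_MAX is 32767 and n is 6 (your library's RAND_MAX may well be different):

 #include <stdio.h>

 int main(void)
 {
     long randmax = 32767;            /* stand-in for RAND_MAX, for illustration */
     long n = 6;
     long divisor = randmax / n + 1;  /* 5461 + 1 = 5462 */
     printf("with the + 1:    %ld\n", randmax / divisor);       /* 5: truncates to n - 1 */
     printf("without the + 1: %ld\n", randmax / (randmax / n)); /* 6: one too many */
     return 0;
 }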

In the case of the more general randrange2, everything we've said applies, with n replaced by n - m + 1. Dividing by

 RAND_MAX / (n - m + 1)
would occasionally give us a number one too big (n + 1, after adding in m), so we divide by
 RAND_MAX / (n - m + 1) + 1
instead.

Finally, just two lines in the dice-rolling program would need to be changed to make use of the new function:

  d1 = randrange(6);
  d2 = randrange(6);
or
  d1 = randrange2(1, 6);
  d2 = randrange2(1, 6);
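
For context, a complete dice-rolling program along these lines might look like the sketch below; the surrounding main and the output message are assumptions, and only the two randrange calls come from the exercise.

 #include <stdio.h>
 #include <stdlib.h>
 #include <time.h>

 int randrange(int n);    /* any of the implementations above */

 int main(void)
 {
     int d1, d2;
     srand((unsigned)time(NULL));   /* seed the generator once */
     d1 = randrange(6);
     d2 = randrange(6);
     printf("You rolled %d and %d.\n", d1, d2);
     return 0;
 }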

The answer to the extra-credit portion of the exercise is that under some compilers, the output of the rand function is not quite as random as you might wish. In particular, it's not uncommon for the rand function to produce alternately even and odd numbers, such that if you repeatedly compute rand() % 2, you'll get the decidedly non-random sequence 0, 1, 0, 1, 0, 1, 0, 1... . It's for this reason that the slightly more elaborate range-reduction techniques involving the constant RAND_MAX are recommended.
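
If you are curious whether your own library's rand behaves this way, a quick experiment is to print both reductions side by side; on a generator with a weak low-order bit the first column may alternate 0, 1, 0, 1, ... while the RAND_MAX-based column should not show such an obvious pattern.

 #include <stdio.h>
 #include <stdlib.h>

 int main(void)
 {
     int i;
     for (i = 0; i < 16; i++) {
         int r = rand();
         /* low-order-bit reduction vs. RAND_MAX-based reduction */
         printf("%d  %d\n", r % 2, r / (RAND_MAX / 2 + 1));
     }
     return 0;
 }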

