Minima in permutations. Write a program that takes an integer command-line argument n, generates a random permutation of length n, prints the permutation, and prints the number of left-to-right minima in the permutation (the number of times an element is the smallest seen so far). Then write a program that takes two integer command-line arguments m and n, generates m random permutations of length n, and prints the average number of left-to-right minima in the permutations generated. Extra credit: Formulate a hypothesis about the number of left-to-right minima in a permutation of length n, as a function of n.
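A minimal sketch of the first program, assuming plain Java with java.util.Random (the class name Minima is hypothetical): it builds a uniformly random permutation of 0..n-1 with a Fisher-Yates shuffle, prints it, and counts the left-to-right minima in one pass.

import java.util.Random;

public class Minima {
    public static void main(String[] args) {
        int n = Integer.parseInt(args[0]);
        Random random = new Random();

        // Start with the identity permutation 0, 1, ..., n-1.
        int[] perm = new int[n];
        for (int i = 0; i < n; i++) perm[i] = i;

        // Fisher-Yates shuffle: swap each position with a uniformly
        // random position at or before it, giving a uniform permutation.
        for (int i = n - 1; i > 0; i--) {
            int j = random.nextInt(i + 1);
            int t = perm[i]; perm[i] = perm[j]; perm[j] = t;
        }

        // Print the permutation and count left-to-right minima:
        // an element counts if it is smaller than everything before it.
        int minima = 0;
        int smallest = Integer.MAX_VALUE;
        for (int i = 0; i < n; i++) {
            System.out.print(perm[i] + " ");
            if (perm[i] < smallest) {
                smallest = perm[i];
                minima++;
            }
        }
        System.out.println();
        System.out.println("left-to-right minima: " + minima);
    }
}

For the second program, one way is to factor the shuffle-and-count logic into a method that returns the count, call it m times with the two command-line arguments m and n, and print the running total divided by m; plotting that average against n is a natural way to form the extra-credit hypothesis.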