
DESIGN AND ANALYSIS OF ALGORITHMS

TIME SPACE TRADE OFF



The efficiency of an algorithm can be expressed in terms of its computing time and its memory space requirement. The space required depends upon the number and size of the inputs, such as the variables, arrays, and other complex data structures used by the algorithm. The time required depends upon the number of basic operations that must be performed. The amount of time required to run to completion is called the time complexity, and the amount of memory space required to run to completion is called the space complexity.

The space and time required by an algorithm are often inversely related: we may be able to reduce the execution time only by increasing the space available to the algorithm, and likewise we can reduce the memory space only by increasing the time taken by the algorithm. A space-time (or time-memory) tradeoff is a situation where memory use can be reduced at the cost of slower program execution or, vice versa, computation time can be reduced at the cost of increased memory use. As the relative costs of CPU cycles, RAM space, and hard drive space have for some time been getting cheaper at a much faster rate than other components of computers, the appropriate choices for space-time tradeoffs have changed radically. Often, by exploiting a space-time tradeoff, a program can be made to run much faster.

The most common situation is an algorithm involving a lookup table: an implementation can include the entire table, which reduces computing time but increases the amount of memory needed, or it can compute table entries as needed, increasing computing time but reducing the memory requirement. A space-time tradeoff can also be applied to the problem of data storage. If data is stored uncompressed, it takes more space but less time than if it were stored compressed (since compressing the data reduces the amount of space it takes, but it takes time to run the compression algorithm). Depending on the particular instance of the problem, either approach is practical.
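As a minimal sketch of the lookup-table situation, the following C program contrasts a precomputed table of per-byte bit counts with computing each count on demand. The bit-counting task and the table size of 256 are illustrative assumptions, not taken from the text above.

Program (illustrative sketch)

#include <stdio.h>

/* Space-for-time: precompute the number of set bits for every byte value. */
static unsigned char bit_count_table[256];

static void build_table(void)
{
    for (int v = 0; v < 256; v++) {
        int count = 0;
        for (int b = v; b != 0; b >>= 1)
            count += b & 1;
        bit_count_table[v] = (unsigned char)count;
    }
}

/* Time-for-space: recompute the count on demand, using no extra table. */
static int bit_count_on_demand(unsigned char v)
{
    int count = 0;
    for (; v != 0; v >>= 1)
        count += v & 1;
    return count;
}

int main(void)
{
    build_table();
    unsigned char x = 202;                                   /* example input       */
    printf("table lookup : %d\n", bit_count_table[x]);       /* O(1) time, 256 bytes */
    printf("recomputed   : %d\n", bit_count_on_demand(x));   /* O(bits) time, O(1) space */
    return 0;
}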

Another example is displaying mathematical formulae on primarily text-based websites. Storing only the LaTeX source and rendering it as an image every time the page is requested would be trading time for space: more time used, but less space. Rendering the image when the page is changed and storing the rendered images would be trading space for time: more space used, but less time. Note that there are also rare instances where it is possible to work directly with compressed data, such as compressed bitmap indices, where it is faster to work with the compression than without it.

A reduction of time increases the space required, and a reduction of space increases the time required; this is known as the time-space trade-off. Larger code size can also be traded for higher program speed by applying loop unrolling: this technique makes the code for each iteration of a loop longer, but saves the computation time spent jumping back to the beginning of the loop at the end of each iteration, as in the sketch below.
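A minimal sketch of loop unrolling, assuming an illustrative summation task and an unrolling factor of 4 (neither of which comes from the text above):

Program (illustrative sketch)

#include <stdio.h>

#define N 1000

/* Plain loop: one jump test is executed per element. */
long sum_plain(const int *a)
{
    long s = 0;
    for (int i = 0; i < N; i++)
        s += a[i];
    return s;
}

/* Unrolled by 4: the loop body is longer (more code space),
 * but only one jump test is executed per 4 elements. */
long sum_unrolled(const int *a)
{
    long s = 0;
    int i = 0;
    for (; i + 4 <= N; i += 4)
        s += a[i] + a[i + 1] + a[i + 2] + a[i + 3];
    for (; i < N; i++)          /* handle any leftover elements */
        s += a[i];
    return s;
}

int main(void)
{
    int a[N];
    for (int i = 0; i < N; i++)
        a[i] = i;
    printf("%ld %ld\n", sum_plain(a), sum_unrolled(a));   /* both print 499500 */
    return 0;
}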

Example
Input: An array of n numbers arranged in ascending order.
Output: An array of the same n numbers arranged in descending order.

Method I: (reversing the array elements using 2 arrays)
Algorithm
Step 1: Assign two arrays, one for the input and the other for the output.
Step 2: Read the elements of the first array in reverse linear order and place them in the second array linearly from the beginning.

Program
int arr1[n];
int arr2[n];
for (int i = 0; i < n; i++)
    arr2[i] = arr1[(n - 1) - i];

Analysis: Space and time complexity
o The size of the input array, arr1, is n.
o The size of the extra array, arr2, is n.
o The output is obtained by assigning the values of arr1 into arr2 in reverse order.
o Therefore the total space requirement of the algorithm is 2n.

o The time complexity is determined by the number of assignment statements.
o The time complexity is n units of time.
o This algorithm gives the option of increased space with reduced time.
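A self-contained, runnable version of Method I is sketched below; the fixed size n = 6 and the sample values are illustrative assumptions:

Program (illustrative sketch)

#include <stdio.h>

#define n 6

int main(void)
{
    int arr1[n] = {1, 2, 3, 4, 5, 6};   /* input, ascending order */
    int arr2[n];                        /* extra output array     */

    /* Read arr1 in reverse linear order, fill arr2 from the beginning. */
    for (int i = 0; i < n; i++)
        arr2[i] = arr1[(n - 1) - i];

    for (int i = 0; i < n; i++)
        printf("%d ", arr2[i]);         /* prints 6 5 4 3 2 1 */
    printf("\n");
    return 0;
}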

Method II: (reversing the array elements using the swap method)
Algorithm
Step 1: Assign one array for both the input and the output.
Step 2: Swap the first and last elements, then swap the next pair of elements, one from each end, and so on.
Step 3: Repeat the process in Step 2 until all the elements of the array have been swapped.

Program
int arr1[n];
int k = n / 2;                          /* floor(n/2) */
for (int i = 0; i < k; i++)
    Swap(&arr1[i], &arr1[(n - 1) - i]);

Swap function
void Swap(int *a, int *b)
{
    int temp = *a;
    *a = *b;
    *b = temp;
}

Analysis: Space complexity
o The size of the array, arr1, is n.
o One temporary variable, temp, is used.
o The output is obtained by swapping the values of arr1 in place.

o Therefore total space required is n+1.

Time complexity
o The time complexity is based on the number of assignment statements.
o Each swap requires 3 assignment statements.
o The number of swaps is at most half the size of the array, i.e. floor(n/2).
o Therefore the total time required is 3n/2 units of time.
o This algorithm gives the option of reduced space with increased time.
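A self-contained, runnable version of Method II is sketched below, under the same illustrative assumptions (n = 6 and sample values):

Program (illustrative sketch)

#include <stdio.h>

#define n 6

void Swap(int *a, int *b)
{
    int temp = *a;   /* the single extra variable: total space n + 1 */
    *a = *b;
    *b = temp;
}

int main(void)
{
    int arr1[n] = {1, 2, 3, 4, 5, 6};   /* input, ascending order */
    int k = n / 2;                      /* floor(n/2) swaps       */

    /* Swap matching elements from both ends, in place. */
    for (int i = 0; i < k; i++)
        Swap(&arr1[i], &arr1[(n - 1) - i]);

    for (int i = 0; i < n; i++)
        printf("%d ", arr1[i]);         /* prints 6 5 4 3 2 1 */
    printf("\n");
    return 0;
}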
Result: In both methods, any attempt to reduce the space leads to an increase in the time taken by the algorithm, and vice versa. This is an example of the time-space trade-off.

ALGORITHMS THAT MAKE USE OF SPACE-TIME TRADEOFFS

1) Baby-step giant-step algorithm for calculating discrete logarithms.
2) Rainbow tables in cryptography, where the adversary is trying to do better than the exponential time required for a brute-force attack. Rainbow tables use partially precomputed values in the hash space of a cryptographic hash function to crack passwords in minutes instead of weeks; decreasing the size of the rainbow table increases the time required to iterate over the hash space. The meet-in-the-middle attack uses a space-time tradeoff to find the cryptographic key in only 2^(n+1) encryptions (and O(2^n) space) versus the expected 2^(2n) encryptions (but only O(1) space) of the naive attack.
3) Dynamic programming, where the time complexity of a problem can be reduced significantly by using more memory, as in the sketch below.
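As an illustration of point 3, the sketch below uses the standard memoized Fibonacci computation (an example chosen here only for illustration, not taken from the text): storing the answers to subproblems costs O(n) extra memory but reduces the running time from roughly 2^n to O(n).

Program (illustrative sketch)

#include <stdio.h>

#define MAXN 90

long long memo[MAXN + 1];    /* extra memory: one entry per subproblem */

/* Naive recursion: O(1) extra space but roughly 2^n time. */
long long fib_naive(int i)
{
    if (i < 2) return i;
    return fib_naive(i - 1) + fib_naive(i - 2);
}

/* Memoized recursion: O(n) extra space but only O(n) time. */
long long fib_memo(int i)
{
    if (i < 2) return i;
    if (memo[i] != 0) return memo[i];
    memo[i] = fib_memo(i - 1) + fib_memo(i - 2);
    return memo[i];
}

int main(void)
{
    printf("fib(40) naive    : %lld\n", fib_naive(40));   /* noticeably slow  */
    printf("fib(40) memoized : %lld\n", fib_memo(40));    /* effectively instant */
    return 0;
}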

TIME SPACE TRADE OFF IN HASH FUNCTIONS AND ITS APPLICATIONS

A mechanism is provided for constructing log-n-wise-independent hash functions that can be evaluated in O(1) time. One important family of hash functions can be computed by a small O(1)-time program that accesses random words. An explicit algorithm for such a family, which achieves comparable performance for all practical purposes, is also given. A lower bound shows that such a program must take Ω(k/ε) time, and a probabilistic argument shows that programs can run in O(k²/ε²) time. An immediate consequence of these constructions is that double hashing using these universal functions has (constant-factor) optimal performance in time, for suitably moderate loads. Another consequence is that a T-time PRAM algorithm for n log n processors (and nk memory) can be emulated on an n-processor machine interconnected by an n log n Omega network with a multiplicative penalty for total work that, with high probability, is only O(1).
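The specific construction referred to above is not reproduced here. As a rough illustration only, the sketch below evaluates a textbook k-wise independent hash family: a random polynomial of degree k-1 over a prime field, evaluated by Horner's rule in O(k) time per key. The constants K and P, and the construction itself, are assumptions made for illustration and are not the family described in the result above.

Program (illustrative sketch)

#include <stdio.h>
#include <stdlib.h>

#define K 4                      /* degree of independence (illustrative)      */
#define P 2147483647ULL          /* a prime modulus, 2^31 - 1 (illustrative)   */

unsigned long long coeff[K];     /* random coefficients, chosen once           */

void init_hash(void)
{
    for (int i = 0; i < K; i++)
        coeff[i] = (((unsigned long long)rand() << 16) ^ rand()) % P;
}

/* Evaluate the degree-(K-1) polynomial at x by Horner's rule: O(K) time. */
unsigned long long hash_key(unsigned long long x)
{
    unsigned long long h = 0;
    x %= P;
    for (int i = K - 1; i >= 0; i--)
        h = (h * x + coeff[i]) % P;
    return h;
}

int main(void)
{
    srand(42);
    init_hash();
    for (unsigned long long key = 0; key < 5; key++)
        printf("h(%llu) = %llu\n", key, hash_key(key));
    return 0;
}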
