# Talk:UVa 10013

Temper3243 posted in the Explanation section:

Read the end of the problem statement carefully. I felt there was a little ambiguity: the last test case is not followed by a blank line. Usually it is `printf("%d", number); printf("\n"); if (n >= 1) printf("\n");`. This should do it, which means the last test case is followed by only one newline while the other test cases are followed by two.

I don't think there is any ambiguity. This problem statement, like many others, says "There is a blank line *between* output blocks." (emphasis mine) This means that between test outputs there should be a blank line, and nowhere else. This paragraph is being removed. Hubert 15:59, 9 Oct 2005 (EDT)

The offered methods (using arrays or BigNum) are bad because of their O(n) memory consumption. This task requires only O(1) additional memory (with the remark that an O(1) variable must be able to hold numbers in [0..n]).

How so? Larry 00:40, 8 Nov 2005 (EST)

Let $a_i$, $b_i$, $r_i$ be the digits of the two source numbers and of the result, with lower $i$ corresponding to higher-order digits (the order in which they appear). In the simple case $r_i = a_i + b_i$. If $a_i + b_i \ge 10$, then $r_{i-1}$ must be increased. So we need to remember $r_{i-1}$: after calculating $a_i + b_i$ we can output $r_{i-1}$ and remember the new digit in its place. The bad news: the remembered digit may be a 9, and increasing it means we must also increase $r_{i-2}$. So, finally, we need to remember the last $r_i$ that is not 9 and the number of 9s after it. If the next $a_i + b_i \ge 10$, print that last $r_i + 1$ followed by a '0' for each counted 9 and remember the new state; if $a_i + b_i < 9$, print the last $r_i$ followed by a '9' for each counted 9 and remember the new state; otherwise ($a_i + b_i = 9$) just count this 9.

I hope this is human-readable or at least understandable. :) green 09:33, 8 Nov 2005 (EST)

Neat solution :)

I believe Larry was asking why O(n) is considered "bad". In this case, it really doesn't hurt, and it doesn't affect run time. Hubert 12:09, 8 Nov 2005 (EST)

O(1) memory instead of O(n), without loss of speed, is better (excluding the case where the O(1) constant C > n :)). A `char a[1000000]` will occupy nearly 1 MB of memory, and if the number of digits were increased much further, you would probably have problems. Of course the array can be packed more tightly, but that complicates the input and output code.

PS. I don't mean that these O(n) methods are unacceptable as a solution in the online contest (I think he received "Accepted"). But it is like shooting a fly with a cannon (I don't know how that sounds in English). green 13:07, 8 Nov 2005 (EST)

PPS. All I want to say is that this is not a BigNum problem. Of course BigNum can solve it, but then, exhaustive search can solve many problems too. :) green 13:12, 8 Nov 2005 (EST)

I guess your philosophy about problem solving is kind of different from mine. In my opinion, optimizing to an O(1) memory solution is overkill. Yes, it's neat, and yes, if they give you 10 trillion digits, you're going to run out of memory. But why work harder than you have to? The O(n) solution is obvious and implementable in less than five minutes, which is what counts in ACM contests. Hubert 13:23, 8 Nov 2005 (EST)

I know. But if I set such a task at a contest, I would give them 10 trillion digits! :) green 14:25, 8 Nov 2005 (EST)

As long as you make sure you define the bounds of the input.. =) Nice solution, but yeah, I usually don't optimize prematurely... and I don't optimize after it gets AC, so =) Larry 15:30, 15 Nov 2005 (EST)