Example 1: Assume that the screen is buffered with a 50-byte buffer. When running the program below, how many seconds will pass before you can see something on the screen?
Your answer: 25
Example 2:
Assume that the screen is buffered with a 100-byte buffer. When running the program below, how many seconds will pass before you can see something on the screen?
Your answer: 99
I don't understand how these two programs work. Can you explain the process to me? Thank you.
Solution
while(1) is an infinite loop: the body keeps executing forever unless a break statement takes us out of it, and there is no break here. Because stdout is fully buffered, nothing reaches the screen until the stdio buffer fills up and is flushed. In the first case, each fputs() call puts "hi" (2 bytes) into the buffer, and the program then sleeps for 1 second. With a 50-byte buffer it takes 25 such writes to fill the buffer, so the program sleeps 25 times, and roughly 25 seconds pass before the buffered output is flushed and shows up on the screen.
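Based on the explanation above, the Example 1 program presumably looks something like the sketch below. The setvbuf() call used to give stdout a 50-byte fully buffered stream, and the exact string being printed, are assumptions made for illustration.

#include <stdio.h>
#include <unistd.h>

int main(void)
{
    /* Assumption: make stdout fully buffered with a 50-byte buffer,
       matching the scenario in the question. */
    static char buf[50];
    setvbuf(stdout, buf, _IOFBF, sizeof buf);

    while (1) {                  /* infinite loop, never breaks */
        fputs("hi", stdout);     /* adds 2 bytes to the buffer, no flush yet */
        sleep(1);                /* pause 1 second per iteration */
    }
    /* After 25 iterations the 50-byte buffer is full; stdio flushes it,
       so "hihihi..." appears on the screen roughly 25 seconds in. */
}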
The second example works the same way, except that here fputs() writes "0", which is only 1 byte, so it takes about 100 writes to fill the 100-byte buffer before anything appears on the screen.
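The Example 2 program would presumably differ only in the string it prints and the buffer size; a sketch under the same assumptions:

#include <stdio.h>
#include <unistd.h>

int main(void)
{
    /* Assumption: same setup as before, but with a 100-byte buffer. */
    static char buf[100];
    setvbuf(stdout, buf, _IOFBF, sizeof buf);

    while (1) {
        fputs("0", stdout);      /* adds only 1 byte to the buffer per iteration */
        sleep(1);
    }
    /* It now takes about 100 writes, and therefore roughly 99-100 seconds,
       before the buffer fills, is flushed, and "000..." shows on the screen. */
}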
