You'll never understand C programming because it doesn't make any sense. It isn't possible for anyone to understand it.
First I need to state categorically that the C programming language makes excellent sense, except for how it was implemented from the very first. The C language is based on the concept of functions, as known in mathematics. (Not quite true; C functions are not necessarily truly functional in mathematical terms, but the C concept is an expansion of the math concept.) C functions are deterministic mappings from some domain of input values to a range of output values, such as from the domain of N x N x N (ordered integer triplets) to rational numbers Q.
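As a concrete illustration, a C function can be written as exactly such a mapping. The function name and the particular formula below are my own invented examples, not anything from the text; any deterministic rule from integer triplets to rationals would serve.

```c
#include <assert.h>

/* A deterministic mapping from the domain N x N x N (ordered
 * integer triplets) into the rationals Q, here represented as a
 * double.  The formula (a + b) / c is arbitrary; the point is
 * that equal inputs always produce equal outputs. */
double triplet_to_rational(int a, int b, int c)
{
    assert(c != 0);  /* the mapping is undefined when c is zero */
    return (double)(a + b) / (double)c;
}
```

Called twice with the same triplet, it must return the same value; that determinism is all the mathematical definition demands.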
The domain and range sets may also be strings, which are encoded by integers but interpreted algorithmically, with the number of integers in the n-tuple being indeterminate until a specific value is given: the sequence 65 32 114 101 100 32 102 108 111 119 101 114 46 10 could be used for "A red flower." (The 10, in this case, is being used as a terminator.) The domain and range can be indeterminate at other levels, too, so the C function is a logical machine (like a Turing machine, for example), a broader category than mathematical functions.
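That sequence can be decoded mechanically. The decode helper below is my own illustrative sketch, and it honors the text's convention of 10 as the terminator (standard C strings actually terminate on the value 0; 10 is the newline character '\n').

```c
#include <stdio.h>

/* Interpret a sequence of integers as character codes, stopping at
 * the terminator value 10 (the convention used in the text).
 * Writes the characters into buf and returns how many there were. */
int decode(const int *codes, char *buf)
{
    int n = 0;
    while (codes[n] != 10) {
        buf[n] = (char)codes[n];
        n++;
    }
    buf[n] = '\0';
    return n;
}
```

Fed the sequence 65 32 114 101 100 32 102 108 111 119 101 114 46 10, it produces the thirteen characters of "A red flower."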
All well and good. Except that's not how C has ever been written. Here's a simple C program example; I found it at cprogramming.com, although you can find similar examples all over the place.
```c
#include <stdio.h>

int main()
{
    printf( "I am alive!\n" );
    getchar();
    return 0;
}
```
What this does is define the C function "main" which maps from the null set, no input domain whatever, into the integers (abbreviated by "int"). Already you see a confusion here, in that we've defined a mapping from nothing into something. There is only one empty set, so this might be understood as a mapping from that fixed input; in that case one would expect a constant output. (In fact, one could define all constants as being static functions of the empty set. I'm not sure what one would gain by that understanding.) The last instruction, the "return 0", does define the value of the function to be zero, so our explanation does work in this case: The entire function is equivalent to the constant zero. Pointless, but logically consistent.
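Under that reading, any constant can be written as a function with the empty input. A sketch of the idea, using a name of my own choosing:

```c
/* "Zero" expressed as a static function of the empty set: a mapping
 * with no input domain whose value is always the same.  The example
 * program's main, stripped of its side effects, is logically
 * equivalent to this. */
int zero(void)
{
    return 0;
}
```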
In that case, why have the program at all? Because the logic of the C function is being completely ignored, and the intended work of the program is being done outside the design of the language. See those other two instructions? They look like function calls, but they are being used like procedural language instructions. They actually invoke programs which are probably not written in C and which interact directly with the external environment. The 'printf' sends the string to a standard output device (which today is likely a display device, not a printer), and the 'getchar' waits for a character from the current input device (maybe a keyboard, or perhaps a touchscreen).
So the work the program is designed to do has nothing to do with the mapping from arguments to return values. The 'printf' function does use the argument value as input to the underlying program, and there is an integer returned. (The return value merely reports on the operation: the number of characters written, or a negative value if something goes wrong.) This little program doesn't even look at the value of the function. Why? Because the programmer doesn't care about the function; the programmer is merely trying to invoke that underlying program, which prints (so to speak) the string to a place external to the entire C environment.
Conversely, the underlying program invoked by 'getchar' is doing the task of waiting for a keystroke. ("Press any key to continue," as they used to say.) The function does return an integer which represents the key pressed, but in this case the value isn't used for anything. What the function call is doing, very indirectly, is causing the program to wait for a keypress in the external environment. The real input is not the argument to the function -- the null set -- but some other data from some other source.
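The ignored return values are real, and capturing them makes the point visible. This is a minimal sketch assuming nothing beyond the standard library: per the C standard, printf returns the number of characters it wrote (or a negative value on an output error), and getchar returns the next input character as an int (or EOF when input is exhausted).

```c
#include <stdio.h>

/* Run the example program's two "function calls" while actually
 * keeping their return values, which the original throws away. */
int run_example(void)
{
    int written = printf("I am alive!\n");  /* count of characters written */
    int key = getchar();                    /* character code, or EOF */

    /* The values exist; the original program simply never looks. */
    fprintf(stderr, "printf wrote %d chars; getchar returned %d\n",
            written, key);
    return 0;
}
```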
It's perfectly logical to write a program to display a string and then to wait for a user response. The only thing that's illogical is to do that in the C language, which is designed to do something completely different. But, in fact, almost the totality of C programming is just an expansion of this tiny example; virtually none of it makes use of the logic of the C language. And that is why you will never understand it.