I saw one example:

```c
int x = 10;
int y = 5;
bool isGreater = x > y;
printf("%d", isGreater);
```
But I could write this:

```c
int x = 10;
int y = 5;
printf("%d", x > y);
```
I am a complete beginner and I have no real sense of why I would or would not want to deal with boolean variables, but I want to understand their raison d’être.
Edit: typo city


Here’s a little secret: data types don’t actually exist. When your code is compiled down to machine code, your computer just operates on collections of bits. Types are a tool we use to make code more understandable to humans.
`int` and `unsigned int` tell us how numerical values are interpreted. `bool` tells you that only two values are going to be used and that the value will somehow relate to true/false. `char` tells us that the value is probably some sort of text. But again, everything is just stored as bits under the hood.

You can go even further and define new types, even though they might still just be numbers under the hood. You could, for instance, define a new type, e.g. `Age`, which is still just an integer, but it tells you, the reader, more about what the value is and what it might be used for. You might define a new type just for error codes, so at a glance you can see that a function doesn’t return an `int`, it returns an `OsError`, even if the errors are actually still just integer values.

C does, however, play somewhat loosely by these rules, and to try and ‘make your life easier’ it will often ‘coerce’ one type into another to make everything work. For instance, integer types can be coerced to booleans: `0` is `false`, everything else is `true` (even negative values). Later on you’ll find that arrays can ‘decay’ to pointers, and other fun surprises. Personally, I think this implicit coercion between types is one of C’s biggest mistakes. If types exist to help humans reason about code, what does it mean when the compiler has to silently change types behind the scenes to make things work? It means the humans were sloppy and the compiler can only guess at what they actually wanted to do (and that’s the best-case scenario! What happens when the compiler coerces types in places humans didn’t expect? Bugs!).

There exists a spectrum of behaviours in this regard: some languages are what we call ‘weakly typed’ (like JavaScript, or to a lesser extent C), while others are ‘strongly typed’. A weakly typed language will try to implicitly convert types to make things work. In JavaScript, if you try to add a number to text, `"Hello!" + 52`, the language will implicitly convert types until something makes sense; in this case it decides ‘oh, they probably want to append the text “52” onto the end’ and produces `"Hello!52"`. Sometimes this is handy, sometimes it introduces bugs. A strongly typed language will instead simply refuse to compile until you make the types line up.