The Big Mistake I Made When I Learned to Code
Programming languages have little in common with natural languages
Programming languages aren’t really languages, and when you read or write code, you shouldn’t be trying to parse it like a language.
You would be better off thinking of coding as lining up buckets of water in a row than as anything resembling writing in a natural language.
This is you programming:
Let me explain.
When you write code, you are writing instructions for the hardware to do what you want. How this works is usually covered in a good intro-to-computer-science class (not an intro-to-programming class), but in short: at the circuitry level, logic gates let you do formal logic. You feed in inputs and get an output of either True or False. Which combinations of inputs produce a True or a False is the subject of a formal logic or logic design class. How you build circuits that do this is the subject of a good electrical engineering class.
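To make the idea concrete, here is a toy sketch in Python. Real gates are transistor circuits, not functions, and the function names here are my own illustration; this just mirrors their truth tables.

```python
# Toy logic gates modeled as Python functions. Real gates are built
# from transistors; these just reproduce the same truth tables.
def AND(a, b):
    return a and b

def OR(a, b):
    return a or b

def NOT(a):
    return not a

# Any yes/no question about inputs reduces to combinations of gates.
# For single bits, "is a greater than b?" is true only when a=1, b=0:
def a_greater_than_b_1bit(a, b):
    return AND(a, NOT(b))
```

Everything else the machine does is, at bottom, stacks of questions like this one.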
Being able to get True or False from a set of inputs is enough to do basically all of arithmetic, because once you have this, you can ask questions like "Is there more stuff in input A than in input B?" That is enough to build addition and subtraction (if this seems like a leap, that's because it is a fairly complicated one), and once you can do those, you can build multiplication and division (a less complicated leap).
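As a sketch of that leap, here is the classic construction, again in toy Python rather than real circuitry: an XOR gate gives the sum bit of two bits, an AND gate gives the carry bit, and chaining these "adders" gives you multi-bit addition. The helper names are mine, not standard API.

```python
# A half adder built from gates: XOR produces the sum bit,
# AND produces the carry bit.
def XOR(a, b):
    return (a or b) and not (a and b)

def half_adder(a, b):
    return XOR(a, b), (a and b)   # (sum, carry)

# A full adder chains two half adders to also accept a carry-in.
def full_adder(a, b, carry_in):
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, (c1 or c2)

# Add two 4-bit numbers given as lists of bits, least significant first.
def add_4bit(x, y):
    carry = False
    out = []
    for a, b in zip(x, y):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out

# 3 (1100 little-endian) + 5 (1010) = 8 (0001)
add_4bit([True, True, False, False], [True, False, True, False])
```

Subtraction, multiplication, and division are all built on top of tricks like this one.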
Programming languages are just programmer-friendly rulebooks for talking to the machine, created by computer scientists and electrical engineers. Most working programmers do not actually have advanced knowledge of how programming languages are created or how exactly they work. Most programmers do not need this information to do their jobs.
Programming languages that are easy for the programmer to write and understand get translated into languages that are harder for the programmer to write and understand. A program in between, a compiler or an interpreter, does the translating. The people who write these translators and the lower-level languages (the ones closer to machine code) tend to be very good and are paid very well.
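You can peek at one layer of this translation yourself. Python's standard-library `dis` module prints the lower-level instructions (CPython bytecode, one step down from the friendly syntax, though still far above machine code) that the interpreter actually runs:

```python
import dis

# A friendly, human-readable function.
def add(a, b):
    return a + b

# dis.dis prints the interpreter-level instructions hiding behind
# the friendly syntax. The exact instruction names vary by Python
# version, but you will always see loads, an add, and a return.
dis.dis(add)
```

Seeing your one-line function expand into several low-level steps is a small taste of what the translator is doing for you all the time.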
So, what do you do when you write a program?
When you are coding, you are generally doing one of these:
Telling the machine to make a block of memory available for interaction
Going to a block of memory
Putting something into a block of memory
Removing something from a block of memory
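The four operations above can be sketched in Python using a `bytearray` as a crude stand-in for a raw block of memory: a fixed-size run of bytes you address by position. (Real memory management is done for you by the language runtime; this is an illustration, not how Python actually allocates.)

```python
# A bytearray as a stand-in for a block of memory.
memory = bytearray(8)   # make a block of memory available
memory[3] = 42          # put something into the block
value = memory[3]       # go to the block and read what is there
memory[3] = 0           # "remove" it by overwriting with zero
```

Every fancier operation in your programs bottoms out in moves like these.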
If this seems confusing, let me clarify what I mean. When you are writing code, the literal words you are writing are commands to:
Create a data structure
Access a data structure
Run an algorithm on numbers
Put a number somewhere
Remove a number from somewhere
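Here is that whole list in five lines of Python. The names (`scores`, `"alice"`, `"bob"`) are made up for the example:

```python
scores = {}                    # create a data structure
scores["alice"] = 10           # put a number somewhere
scores["bob"] = 7
alice_score = scores["alice"]  # access a data structure
total = sum(scores.values())   # run an algorithm on numbers
del scores["bob"]              # remove a number from somewhere
```

However elaborate your program gets, nearly every line you write is one of these five moves wearing a costume.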
In this way, you really are putting cups on the table, filling them with water, and moving them around. It seems like you are doing more because sometimes you are not just using a cup of water but a really fancy water pipe. Sometimes the cups have crazy slides in them. Your table ends up looking something like this:
This mental model should help you a lot. When you edit or write code, you should not imagine moving words around, even though that is what you are literally doing. You should imagine moving containers and the contents of those containers.
This also means you should spend much less time debugging. One of the myths of programming is that you should spend all of your time debugging, and that it is impossible to get your code right on the first pass. It may be hard to get your code completely bug-free, but especially when you are starting out, before your codebases become complicated and intertwined, you should not be spending more time debugging than coding.
Ideally, your programming journey will go something like this: