What can possibly go wrong

Programming involves talking to the computer in a relatively stripped-down way, using a limited set of terms it understands (like "say"), arranged in an order it recognises. The grammar used to order the basic terms is called the language's "syntax". For example, in our program we had to drag the words into the "say" block to get it to work. If, instead, we had dragged the "say" block onto the words, it wouldn't work; the computer won't recognise it the wrong way round. This is especially critical if the language needs to be typed rather than dragged together like Scratch.

When you learn a programming language (and someone might learn several from the hundreds available) you learn the syntax and the basic terms. Still being in the middle of learning these is the only real difference between beginners and more advanced coders. Beginners often think that making mistakes is something only terrible beginners do, and they can get disillusioned because they think they fall into this group – but that isn't the case. Mistakes are an inherent part of the process.

One of the problems with coding is you are talking to something far less clever than a human. This means you have to get the syntax exactly correct. While in English we can say:
say "Hello World"
or
"Hello World" – say this.
and someone would probably understand both the same way, a computer wouldn't understand the second because it wouldn't know what "this" referred to. Worse, it would probably complain that you've ended the line with a full stop it wasn't expecting.
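To make this concrete, here is a small sketch in Python (a typed language, so not the Scratch blocks used in this book). The hypothetical "wrong way round" line is rejected before the program even runs, just as described above:

```python
# The version the computer understands - exact syntax, exact order:
print("Hello World")

# The human-friendly reordering, '"Hello World" - say this.', has no
# meaning to Python. We can ask Python to check the line without
# running it, using the built-in compile() function:
bad_line = '"Hello World" print'
try:
    compile(bad_line, "<example>", "exec")
except SyntaxError as err:
    # Python refuses the line outright rather than guessing what we meant
    print("The computer rejected it:", err.msg)
```

The computer doesn't try to work out what you probably meant; it simply reports that the line doesn't fit the grammar it knows.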

Because we're human beings, with all the joy, capriciousness, ambiguity, and complexity of human beings, it's almost impossible to write perfect code first time. If you could, you wouldn't be a human being, you'd be a computer. Because of this, the first time people try to run one of their programs, it won't work. Beginners assume this is because they're no good – it isn't – it's because they're human beings. It happens to advanced coders as well.

In addition, you may get code that works, but doesn't do the job exactly as you expected. This is normal as well. Advanced coders are better at spotting simple problems before they come up, but they still hit the same number of problems, because they are doing trickier things.

In short, probably 50% to 80% of a coder's time is spent sorting out these "bugs" (problems) in the code – what's called "debugging". That doesn't change whether you're a beginner or advanced – that's what coding is: getting the syntax as close to correct as you can, and then getting the program working. All that changes is the nature of the issues. You know you're a coder when you enjoy solving the problems.


Now that the problems are (hopefully) clearer, we're in a position to think through talking to a computer in detail.
First though, a quick summary.