What can possibly go wrong

Programming involves talking to the computer in a relatively stripped-down way, using a limited set of terms it understands (like "say"), arranged in an order it recognises. The grammar used to order these basic terms is called the language's "syntax". For example, in our program we had to drag the words into the "say" block to get it to work. If, instead, we had dragged the "say" block onto the words, it wouldn't have worked; the computer won't recognise it the wrong way round. This is especially critical if the language has to be typed rather than dragged together, as it is in Scratch.

When you learn a programming language (and someone might learn several from the hundreds available) you learn the syntax and the basic terms. That, really, is the only thing separating beginners from more advanced coders. Beginners often think that making mistakes is something only terrible beginners do, and they can get disillusioned because they believe they fall into that group – but it isn't the case. Mistakes are an inherent part of the process.

One of the problems with coding is that you are talking to something far less clever than a human. This means you have to get the syntax exactly right. While in English we can say:
say "Hello World"
or
"Hello World" – say this.
and someone would probably understand both the same way, a computer wouldn't understand the second because it wouldn't know what "this" referred to. Worse, it would probably complain that you've ended the line with a full stop it wasn't expecting.
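To see just how fussy a typed language is, here is a tiny sketch in Python (used purely as an illustration – Scratch itself is dragged together rather than typed). The first line is correct and puts the words on the screen; the second says the same thing in a way any person would follow, but it is left as a comment because Python would refuse to run it:

print("Hello World")      # correct syntax: the words appear on the screen
# "Hello World" print.    # wrong order, plus a stray full stop: Python stops with a syntax error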

Because we're human beings, with all the joy, capriciousness, ambiguity, and complexity of human beings, it's almost impossible to write perfect code first time. If you could, you wouldn't be a human being, you'd be a computer. Because of this, the first time people try to run one of their programs it won't work. Beginners assume this is because they're no good – it isn't – it's because they're human beings. It happens to advanced coders as well.

In addition, you may get code that works, but doesn't do the job exactly as you expected. This is normal as well. Advanced coders are better at spotting simple problems before they come up, but they still hit just as many problems because they are attempting trickier things.

In short, probably 50% to 80% of a coder's time is spent sorting out these "bugs" (problems) in the code – what's called "debugging". That doesn't change whether you're a beginner or advanced – that's what coding is: getting the syntax as close to correct as you can, and then getting the program working. All that changes is the nature of the issues. You know you're a coder when you enjoy solving the problems.
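As a purely illustrative sketch of a bug that runs but gives the wrong answer, here is a little Python that is meant to work out the average of three test scores (the scores and variable names are made up for the example):

scores = [70, 80, 90]
average = sum(scores) / 2             # bug: divides by 2 instead of by the number of scores
print(average)                        # prints 120.0, which is clearly not the average

average = sum(scores) / len(scores)   # the fix: divide by how many scores there are
print(average)                        # prints 80.0, which is what we expected

Both versions run without complaint; only the second actually does the job. Spotting and fixing that sort of thing is what debugging means in practice.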

You can see from our "Hello World" program that there are set ways of doing things. The "say" block puts words on the screen. If you don't put a word in the space, it won't say it. If you put the "say" block on the "Hello World" words, it won't work – it's the wrong way round.

Computers need things to be done right. They aren't clever like people, so they can't work out what you mean unless it is right. To code, you learn how to put things the right way. We call the rules about the way to put the words the code's "syntax".

Because you are a person, not a computer, you will always put some things the wrong way. Don't worry about this, we all do it. You just have to fix the code you've made so it works. This is called "debugging" because wrong bits are called "bugs". It is a big part of coding and everyone does it.

When people start coding they worry they get bugs because they are just starting. That's not why. They get bugs because they are people, full of all the great craziness that people are full of. You wouldn't want to get rid of that – you're not a computer, and you wouldn't want to be one. It would be pretty horrible.

 
Now that the problems are (hopefully) clearer, we're in a position to think through talking to a computer in detail.
First, though, a quick summary.