In computer programming, a variable is a named storage location that holds a value which can change over the course of program execution. They are called variables because the information they represent can vary rather than remaining constant.

In most programming languages, variables are declared with a specific data type, which determines the kind and size of data they can store. For example, in C++ an int variable typically stores a whole number between -2147483648 and 2147483647 (the range of a 32-bit integer on most platforms), while a char variable stores a single character, such as a letter or symbol.
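
For instance, here is a short C++ sketch of such declarations (the variable names count and grade are just illustrative):

#include <iostream>

int main() {
    int count = 42;    // a whole number, within the range an int can hold
    char grade = 'A';  // a single character
    count = count + 1; // the stored value can change during execution
    std::cout << count << " " << grade << "\n"; // prints: 43 A
    return 0;
}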

Variables can be assigned different values at different times during program execution, which makes them useful for storing results that need to be used later. For example, if you were writing a program to calculate the average test score for a class of students, you could use a variable to store each student's score as you read it in from an input file, along with a variable holding a running total. Once all the scores have been read, you could calculate the average by dividing the total by the number of students.
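
Here is a minimal sketch of that program in C++, assuming the scores live in a plain-text file named scores.txt with one score per line (both the file name and the format are assumptions made for this example):

#include <fstream>
#include <iostream>

int main() {
    std::ifstream input("scores.txt"); // hypothetical input file, one score per line
    double score = 0.0;                // holds the score currently being read
    double total = 0.0;                // running total of all scores read so far
    int count = 0;                     // how many scores have been read
    while (input >> score) {           // read scores until the file runs out
        total += score;
        count++;
    }
    if (count > 0) {
        std::cout << "Average: " << total / count << "\n";
    }
    return 0;
}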

Variables are also often used as counters, which keep track of how many times something happens in a program. A counter usually starts with an initial value of 0 (or 1, depending on where you start counting) and has 1 added to it each time the event of interest occurs. For example, if you wanted to keep track of how many times someone guessed incorrectly while trying to guess a secret number, you could set up a counter variable in JavaScript like this:

let counter = 0;                   // number of incorrect guesses so far
const secretNumber = 7;            // the number being guessed (example value)
const guesses = [3, 9, 7];         // example guesses; in a real program these would come from the user
for (const guess of guesses) {
    if (guess !== secretNumber) {
        counter++;                 // add 1 for each incorrect guess
    }
}
console.log(counter);              // prints 2: the total number of incorrect guesses

At the end of the program, counter contains the total number of incorrect guesses; the same pattern works for counting any event of interest.