Machine language is the lowest-level programming language, the one a computer recognizes directly. It is written in binary code, 0s and 1s, which represent the off and on electrical states of the hardware. A group of such digits forms an instruction, which translates into a command that the central processing unit (CPU) understands.
More specifically, instructions are patterns of 0s and 1s of various lengths, such as 16, 24, 32, or 64 bits, each representing a specific task such as storing or transferring data. An instruction is made up of two parts: the opcode (operation code) and the operand. The first few bits form the opcode, which specifies the kind of operation to be performed. The remaining bits form the operand, which indicates the location where the operation is to be performed. For instance, a binary opcode such as 000001 could be an instruction to store the contents of the accumulator in a given memory address. The whole instruction could look like this: 00000100011100000000000100000010.
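Splitting such an instruction into its two fields can be sketched in Python. The 6-bit opcode width and the 32-bit pattern come from the example above; the variable names and the exact field split are illustrative assumptions, since real instruction formats vary by CPU:

```python
# Example 32-bit instruction from the text above.
# Assumption for this sketch: the first 6 bits are the opcode,
# the remaining 26 bits are the operand.
instruction = "00000100011100000000000100000010"

OPCODE_BITS = 6  # assumed opcode width

opcode = instruction[:OPCODE_BITS]    # "000001" — e.g. "store accumulator"
operand = instruction[OPCODE_BITS:]   # remaining bits, e.g. a memory address

print(opcode)            # prints 000001
print(int(operand, 2))   # operand read as an unsigned binary number
```

A real CPU performs this split in hardware as part of instruction decoding; the sketch only illustrates how the bit pattern divides into its two parts.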
Another example of binary machine language is the binary-coded decimal, where decimal numbers are encoded in binary form. Each decimal digit is coded as a four-digit binary number as follows:
- 0000 = 0
- 0001 = 1
- 0010 = 2
- 0011 = 3
- 0100 = 4
- 0101 = 5
- 0110 = 6
- 0111 = 7
- 1000 = 8
- 1001 = 9
For example, the decimal number 5,270 is represented by the binary code for 5, 2, 7, 0, which translates into 0101 0010 0111 0000.
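The digit-by-digit encoding described above can be expressed as a short Python function; the function name is illustrative:

```python
def decimal_to_bcd(number: int) -> str:
    """Encode each decimal digit of `number` as a four-bit binary group (BCD)."""
    return " ".join(format(int(digit), "04b") for digit in str(number))

print(decimal_to_bcd(5270))  # prints 0101 0010 0111 0000
```

Each decimal digit maps independently to its own four-bit group, which is what distinguishes BCD from encoding the whole number as a single binary value.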
The CPU can execute millions of instructions per second, which makes binary machine language efficient despite the volume of bits involved. It is worth noting that CPUs from different manufacturers use different machine languages.