Instruction set
The term instruction set refers to the set of low-level instructions that a computer executes.
Examples of these low-level instructions include the ADD instruction, which adds the contents of a memory location to the number already held in the computer's main accumulator register, before proceeding to the next instruction in the program. Conditional JUMP instructions change the address of the next instruction to execute, held in the program counter, depending on easily tested conditions, such as whether the accumulator contains all zeroes.
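The behaviour described above can be sketched with a toy accumulator machine. The instruction names (LOAD, ADD, JZ, HALT) and the machine's layout are invented for illustration; real instruction sets differ in detail.

```python
# A toy accumulator machine: a hypothetical sketch, not a real ISA.

def run(program, memory):
    acc = 0          # the accumulator register
    pc = 0           # program counter: address of the next instruction
    while True:
        op, arg = program[pc]
        pc += 1      # by default, proceed to the next instruction
        if op == "LOAD":     # copy a memory cell into the accumulator
            acc = memory[arg]
        elif op == "ADD":    # add a memory cell to the accumulator
            acc += memory[arg]
        elif op == "JZ":     # conditional jump, taken if accumulator is zero
            if acc == 0:
                pc = arg     # change the next-instruction address
        elif op == "STORE":  # copy the accumulator back to memory
            memory[arg] = acc
        elif op == "HALT":
            return acc

# Add memory[0] and memory[1], leaving the result in the accumulator.
memory = [2, 3]
program = [("LOAD", 0), ("ADD", 1), ("HALT", None)]
print(run(program, memory))  # prints 5
```

Note that the conditional jump does nothing more than overwrite the program counter; everything else about the fetch-and-execute loop stays the same.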
Programmers rarely write programs using these low-level instructions, instead using high-level programming languages. One of the first such languages was Fortran, which is now also rarely used but remains influential.
In the 1960s International Business Machines (IBM) introduced something new: the System/360 series of computers. The series was a range of machines that differed widely in processing speed but all executed the same instruction set. This was accomplished by giving each CPU its own hidden internal processor, which interpreted each instruction using hidden instructions of its own, called microcode. The operations carried out in microcode were even more primitive than the instructions earlier computers had processed, but, depending on the design, some of these operations could be performed in parallel.
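The idea of interpreting each visible instruction as a sequence of more primitive steps can be sketched as a table lookup. The micro-operation names and the machine state here are invented for illustration; real microcode is far more intricate.

```python
# A hypothetical sketch of microcoded execution: each visible
# instruction is looked up in a microcode table and carried out as a
# sequence of primitive micro-operations.

MICROCODE = {
    # visible instruction -> sequence of micro-operations
    "LOAD": ["fetch_operand", "write_accumulator"],
    "ADD":  ["fetch_operand", "alu_add", "write_accumulator"],
}

def execute(op, arg, state):
    for micro_op in MICROCODE[op]:
        if micro_op == "fetch_operand":
            state["mdr"] = state["memory"][arg]       # memory data register
        elif micro_op == "alu_add":
            state["mdr"] = state["acc"] + state["mdr"]
        elif micro_op == "write_accumulator":
            state["acc"] = state["mdr"]

state = {"acc": 0, "mdr": 0, "memory": [10, 32]}
execute("LOAD", 0, state)
execute("ADD", 1, state)
print(state["acc"])  # prints 42
```

Because every model interprets the same table of visible instructions, slow and fast machines can run identical programs; only the hidden micro-operation level differs.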
The faster processors in the 360 series used more silicon, which let them perform these micro-operations more quickly by carrying out more of them in parallel.
In the 1970s, as a reaction to computers that executed microcode, some academics introduced designs for what they called reduced instruction set computers (RISC). These computers would, they argued, process programs more quickly, because each instruction would take only one machine cycle, rather than relying on microcode-level parallelism. The instructions were simpler, resembling the instruction sets of the earliest computers. One drawback of this approach was that compiled programs were longer.
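The program-length drawback can be illustrated by expanding one complex instruction into simple ones. Both instruction sets below are hypothetical: a single memory-to-memory add stands in for a CISC instruction, and a load/store style stands in for RISC.

```python
# Hypothetical sketch: one complex memory-to-memory instruction
# expands into several simple RISC instructions, so the compiled
# program has more instructions overall.

def expand_to_risc(cisc_instruction):
    op, dst, src = cisc_instruction
    if op == "ADD_MEM":  # CISC: memory[dst] += memory[src], one instruction
        return [
            ("LOAD", "r1", dst),    # in RISC style, only loads and
            ("LOAD", "r2", src),    # stores touch memory; arithmetic
            ("ADD", "r1", "r2"),    # works on registers
            ("STORE", "r1", dst),
        ]
    raise ValueError(f"unknown instruction: {op}")

risc = expand_to_risc(("ADD_MEM", 100, 104))
print(len(risc))  # one CISC instruction becomes 4 RISC instructions
```

The RISC argument was that each of these four simple instructions could complete in a single cycle, so the longer program could still finish sooner than one slow, microcoded instruction.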
Commentators argued vociferously over whether reduced instruction set computers were superior to complex instruction set computers (CISC). Designers of later RISC computers faced a quandary, as Moore's law enabled both kinds of computer to use more transistors.