Generic programming is a popular style of computer programming in which code is written to be as efficient as possible while applying to as many situations as possible without requiring any changes to the original code itself. Ordinarily, once code is written, it can perform only the exact functions it was written for. By using generic programming to create code that works in a number of different situations, while still performing the same basic overall function, programmers can use a single piece of code in different programs without ever changing the original.
During the 1970s, generic programming made its debut in programming languages such as CLU and Ada. Later, other languages such as C++ and Java adopted generic programming to simplify code while allowing the same code to be used in multiple scenarios. Each programming language has its own way of providing this capability and its own terms to describe it: "generics," "templates," and "parameterized types" have all been used at one point or another to refer to instances of generic programming.
To understand this type of programming, it helps to know the basic concepts of how a programming language works. If, for example, Paul wanted to write a program that adds two numbers together, he would write code to add two values. He would then tell the computer that those two values are numbers and that the result should be a number as well.
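As a rough sketch of Paul's first attempt (the function name add and the variable names are illustrative, not from any particular program), a non-generic version in C++ might look like this:

    #include <iostream>

    // A non-generic function: it knows how to add two ints and nothing else.
    int add(int a, int b) {
        return a + b;
    }

    int main() {
        std::cout << add(2, 3) << '\n';   // prints 5
        // add("Hello, ", "world!");      // would be rejected: the function only accepts ints
        return 0;
    }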
While the program will work as long as Paul is adding two numbers, it will fail if he tries to combine anything else. If Paul decided to string sentences together to form a paragraph, the program would break because it would find letters rather than numbers. Paul could fix this problem, however, by using generic programming to write the original code so that it accepts more than one type of value, both numbers and letters, and thus the same program could build sentences or perform addition.
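A hedged sketch of the generic version, again in C++ and with illustrative names: the template parameter T is a placeholder that is filled in separately for each use, so the same add code works for any type that supports the + operation, whether that means arithmetic on numbers or joining pieces of text.

    #include <iostream>
    #include <string>

    // A generic (template) function: T stands in for whatever type is used.
    template <typename T>
    T add(T a, T b) {
        return a + b;
    }

    int main() {
        std::cout << add(2, 3) << '\n';                // T becomes int, prints 5
        std::cout << add(std::string("Hello, "),
                         std::string("world!")) << '\n'; // T becomes std::string, joins the text
        return 0;
    }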
Not all programming languages need generic programming in order to be effective; the languages that rely on it are statically typed languages. This means that every value's type is fixed and checked before the program runs and cannot change while the program is running. For this reason, if a programmer specified that input from the user would be in the form of letters and the user typed a number, the program could not use that input. Programmers therefore try to foresee all the data types a user could reasonably enter, be it numbers, letters, or symbols, and write generic code that can adjust accordingly.
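To illustrate with the standard C++ container std::vector (used here only as a familiar example of generic code), a statically typed language settles these type choices before the program ever runs, and generics let the very same container code be reused with whichever type the programmer declares up front:

    #include <string>
    #include <vector>

    int main() {
        // In a statically typed language, types are fixed before the program runs.
        std::vector<int> numbers;          // this container may hold only ints
        numbers.push_back(42);             // fine
        // numbers.push_back("hello");     // rejected at compile time, not while running

        // The same generic container code also works for text,
        // simply by choosing a different type parameter.
        std::vector<std::string> words;
        words.push_back("hello");          // fine
        return 0;
    }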