Granular computing is a problem-solving approach that blends precise information with more general details. It focuses on how to incorporate uncertainty and probability into computation. Originally devised in the 1970s, this branch of theoretical computer science has been incorporated into computer programming and artificial intelligence. The theory of fuzzy sets was developed in the 1960s for handling uncertainty, and both fuzzy set theory and probability theory are commonly used in granular computing. The approach is closely associated with fields such as rough set theory, data compression, and machine learning.
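To make the fuzzy-set idea concrete, here is a minimal Python sketch of a membership function. Unlike a crisp set, which an item is either in or out of, a fuzzy set assigns each item a degree of membership between 0 and 1. The function name and the height thresholds are purely illustrative assumptions, not a standard definition.

```python
def tall_membership(height_cm: float) -> float:
    """Degree (0.0 to 1.0) to which a height counts as "tall".

    Membership rises gradually instead of flipping from 0 to 1
    at a single cutoff; the 160/190 cm thresholds are invented
    for illustration only.
    """
    if height_cm <= 160:
        return 0.0
    if height_cm >= 190:
        return 1.0
    # Linear ramp between the two thresholds.
    return (height_cm - 160) / (190 - 160)

for h in (155, 170, 185, 195):
    print(f"{h} cm -> tall to degree {tall_membership(h):.2f}")
```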
Used as a way to structure problem solving and general thinking, granular computing has been modeled in different ways. It is often used for clustering data in large databases, and is sometimes used to abstract and generalize data in order to organize information. This matters for data mining because people rarely think about information in precise numerical terms. Computers can analyze language to determine how to interpret search terms, so granular computing often plays a part in how search results are produced.
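The sketch below illustrates this kind of abstraction and generalization: precise numeric values are collapsed into a few coarse, human-readable granules before the data is organized. The purchase amounts, category names, and boundary values are all hypothetical, chosen only to show the technique.

```python
from collections import defaultdict

# Hypothetical data: precise purchase amounts to be granulated
# into coarse categories before mining.
purchases = [3.50, 12.00, 45.99, 250.00, 7.25, 99.95, 1200.00]

def granule(amount: float) -> str:
    # Boundary values are illustrative assumptions, not a standard.
    if amount < 10:
        return "small"
    if amount < 100:
        return "medium"
    return "large"

clusters = defaultdict(list)
for amount in purchases:
    clusters[granule(amount)].append(amount)

for name, members in clusters.items():
    print(name, members)
```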
Data mining in a corporate network often involves granular computing, and search engines on the Internet typically rely on it as well; a general search term can therefore lead a person to a website with more detailed information on a subject. In a typical database, information is organized into classes, clusters, and subsets according to a number of variables. Corporate software can use this method of classifying data to organize large amounts of information, so employees can retrieve it when it is needed most.
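One simple way to form such classes and subsets, in the spirit of the rough set theory mentioned earlier, is to group together records that are indistinguishable on a chosen set of attributes. The sketch below does exactly that; the employee records and field names are invented for illustration.

```python
from collections import defaultdict

# Hypothetical employee records; field names are invented for illustration.
records = [
    {"dept": "sales",   "region": "east", "name": "Avery"},
    {"dept": "sales",   "region": "west", "name": "Blake"},
    {"dept": "support", "region": "east", "name": "Casey"},
    {"dept": "sales",   "region": "east", "name": "Drew"},
]

def granulate(rows, attributes):
    """Group rows that match on the chosen attributes, in the spirit
    of rough-set equivalence classes (information granules)."""
    classes = defaultdict(list)
    for row in rows:
        key = tuple(row[a] for a in attributes)
        classes[key].append(row["name"])
    return classes

for key, names in granulate(records, ("dept", "region")).items():
    print(key, "->", names)
```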
Humans generally don’t think like computers. Words are used to represent abstract ideas, often at the expense of precision. Substituting words and phrases for complex ideas is usually necessary; the brain doesn’t typically calculate details such as exact speed or distance, for example, while a sensor connected to a computer can. The brain can judge whether something tastes or feels good, but it generally cannot count a large number of items unless that information is already available.
Granular computing, therefore, helps computers work more like the thought processes in a person’s head, with numbers, programming-language constructs, and probability constraints bridging the two. The end result is a program that can interpret how people communicate with a computerized interface. Enabled by years of research in theoretical computer science, this concept is used in many corporate, medical, and security computer systems, and may also be applied to the Internet.
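As a closing illustration of interpreting vague human language, the sketch below maps the word "cheap" onto a graded numeric filter and ranks results by it. The hotel names, prices, and dollar thresholds are hypothetical, a minimal sketch of the idea rather than how any particular search engine works.

```python
# Hypothetical sketch: translating a vague search word into a
# graded numeric criterion, so a query like "cheap hotel" can rank results.
hotels = {"Alpha": 45.0, "Beta": 120.0, "Gamma": 80.0, "Delta": 300.0}

def cheap_degree(price: float) -> float:
    # Illustrative thresholds: fully "cheap" under $50,
    # not "cheap" at all above $200, graded in between.
    if price <= 50:
        return 1.0
    if price >= 200:
        return 0.0
    return (200 - price) / 150

ranked = sorted(hotels.items(), key=lambda kv: cheap_degree(kv[1]), reverse=True)
for name, price in ranked:
    print(f"{name}: ${price:.0f}, cheap to degree {cheap_degree(price):.2f}")
```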