The "real" versus "artificial" knowledge division mentioned earlier is itself a tad artificial. Without delving too much into empiricism, it's pretty clear that what we call "knowledge" is not a feature of the universe, it is something that has been created by humans for humans in their attempts to understand the universe. Coincident with any and all knowledge is, of course, the representation form and syntax that we humans use to describe, structure, and store that knowledge. The representational form and syntax are entirely conventional [1]. We can't have knowledge of any sort unless it is in some representational form. But we get to create and choose what that form is.
So what representational forms could we choose? The answer is: pretty much any one we want. But our ability to understand, manipulate, build on, and extend the represented knowledge is highly dependent on the representational form we choose. This form has to map onto our understanding and observation of the knowledge we are encoding and, perhaps more importantly, it should be in a structure and format that is:
(a) reasonably easy to understand
(b) usable in communicating with others
(c) extensible and mutable, to allow incorporation of additional knowledge, and...
(d) ideally, not incompatible with other related representational forms [2].
As a simple example: the Roman Numeral system for counting and performing simple additive arithmetic was based on counting on fingers and scratching marks on boards. In addition (sic) to having multiple number bases (1, 5, 10, 50,...), there was no symbol for zero. This does not mean that Romans were unaware that, when counting objects, there might be zero objects there, but they could not manipulate the zero concept since their representational form did not include it. The form also rendered some simple arithmetic functions difficult to compute. The division LXXII ÷ XII could be done by successive subtraction, but it would require considerable effort [3]. Once Arabic notation and the base 10 number system came into use, the equivalent form (72 ÷ 12) provided a much easier system for calculation. This is one of the many reasons why Roman number systems are no longer used for arithmetic [4].
Since thought is basically knowledge on the move, knowledge and thought are almost synonymous [5]. As noted earlier, there is evidence that much of our thinking, and hence our ability to create and process "knowledge", has its origins in the senses: what they are, what information they give us, how we process that information, and how we name and categorize that information.
In explaining the basis of mathematics (initially of arithmetic), Lakoff and Nunez [6] use the concept (q.v.) of "conceptual metaphor." They note that, while the word "metaphor" is generally used in the sense of a figure of speech, it is actually the basis of most understanding. These metaphors of understanding are used everywhere humans acquire knowledge, consider knowledge, and communicate knowledge.
And these metaphors are largely constructs of physical sensation, that is, acquired through the senses. Lakoff and Nunez assert that much of arithmetical reasoning is based on the relative location of things:
Position: this thing is near (far away, above, below, beside, behind, in front of,...) another thing [7]
Size: this thing is bigger (the same size, smaller, much bigger,...) than that thing
Containing: this thing is inside (outside) of another thing [8]
Most of our classification methods employ this metaphor: we associate similarity between objects based on their metaphoric closeness. One object on which we might sit has many similarities to another object on which we might sit, so we assign both items to the class "Chair" and label them accordingly.
In addition to location (either empirical or metaphorical nearness), we classify objects by other characteristics...
FOOTNOTES
[1] As in "by convention" (rather than intrinsic), meaning we get to choose.
[2] The Roman Numeral example below conforms to (a) and (b), somewhat supports (c) in a limited capacity, but is poor at requirement (d). Imagine trying to explain logarithms to someone who only knew the Roman numerical system.
[3] I used to challenge software engineers to design and code a program to calculate the result of LXXII ÷ XII using only Roman Numeral notation formats within the program. That is, no cheating by converting into decimal, hexadecimal, binary, or any other number system to simplify the coding. Nobody ever took me up on this.
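For the curious, here is a minimal sketch of what an answer might look like. It is my own illustration, not anyone's submitted solution: every value stays in Roman notation throughout, subtraction works by cancelling symbols and "breaking change" into smaller ones, and the quotient is built by tallying. All the names are hypothetical choices of mine.

```python
# LXXII ÷ XII by successive subtraction, with every value kept in Roman
# notation throughout (no conversion to decimal or any other base).
# EXPANSIONS, to_additive, subtract, tally, and divide are my own
# illustrative names, not a standard library.

SYMBOLS = "MDCLXVI"  # largest to smallest
# How one symbol "breaks" into smaller change during subtraction.
EXPANSIONS = {"M": "DD", "D": "CCCCC", "C": "LL",
              "L": "XXXXX", "X": "VV", "V": "IIIII"}

def to_additive(roman):
    """Rewrite subtractive pairs (IV, IX, XL, ...) in purely additive form."""
    for pair, additive in [("CM", "DCCCC"), ("CD", "CCCC"), ("XC", "LXXXX"),
                           ("XL", "XXXX"), ("IX", "VIIII"), ("IV", "IIII")]:
        roman = roman.replace(pair, additive)
    return roman

def subtract(a, b):
    """Return a - b (both additive), or None if b is larger than a."""
    a = list(a)
    for sym in b:
        while sym not in a:
            # Break the smallest symbol in `a` that is larger than `sym`.
            larger = [s for s in a if SYMBOLS.index(s) < SYMBOLS.index(sym)]
            if not larger:
                return None  # nothing left to break: b exceeds a
            broken = max(larger, key=SYMBOLS.index)
            a.remove(broken)
            a.extend(EXPANSIONS[broken])
        a.remove(sym)
    return "".join(sorted(a, key=SYMBOLS.index))

def tally(roman):
    """Add I to a Roman numeral, carrying IIIII -> V, VV -> X, and so on."""
    roman += "I"
    for run, sym in [("IIIII", "V"), ("VV", "X"), ("XXXXX", "L"),
                     ("LL", "C"), ("CCCCC", "D"), ("DD", "M")]:
        roman = roman.replace(run, sym)
    return roman

def divide(dividend, divisor):
    """Quotient and remainder by successive subtraction, all in Roman."""
    remainder, quotient = to_additive(dividend), ""  # "" is the unwritable zero
    while True:
        smaller = subtract(remainder, to_additive(divisor))
        if smaller is None:
            return quotient, remainder
        remainder, quotient = smaller, tally(quotient)

print(divide("LXXII", "XII"))  # ('VI', '') -- six, remainder "nothing"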
[4] Growing up in Britain in the 1950s and working in my father's shop, I had to master the British system of currency, which had a number of base units: four farthings in a penny, twelve pennies in a shilling, and twenty shillings in a pound. In addition to the basic farthing, penny, and shilling, and to further complicate things, there were also coins representing a half penny, three pence, six pence, two shillings (a "florin"), two shillings and six pence ("half a crown"), and five shillings (a "crown"), as well as notes of ten shillings. To cap it all, they also used a coin called a "guinea", worth one pound and one shilling (so, 21 shillings), mostly used for high-value items such as artwork sold at auction and as prizes in major horse races. Arithmetic using this system required some agility of mind.
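To make the required agility concrete, here is a minimal sketch (my own illustration; the function name and tuple layout are hypothetical) of adding two amounts in pounds, shillings, and pence. Unlike base-10 column addition, each column carries at a different base:

```python
# Adding two pre-decimal amounts (pounds, shillings, pence): 12 pence carry
# to a shilling, 20 shillings carry to a pound. Farthings, florins, crowns,
# and guineas are left out to keep the sketch short.

def add_lsd(a, b):
    pounds = a[0] + b[0]
    shillings = a[1] + b[1]
    pence = a[2] + b[2]
    shillings += pence // 12      # carry pence -> shillings (base 12)
    pence %= 12
    pounds += shillings // 20     # carry shillings -> pounds (base 20)
    shillings %= 20
    return pounds, shillings, pence

# £1 19s 9d + £0 3s 6d = £2 3s 3d
print(add_lsd((1, 19, 9), (0, 3, 6)))  # (2, 3, 3)
```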
[5] Except knowledge can be resident in something that doesn't think, such as DNA, a book, or a hardware machine. At the time of writing, the jury is out on whether knowledge resident in software can be considered to "think." Also, thought might not explicitly refer to something in the "real world", though it will invariably present as some analogue of such.
[6] In their seminal work, Where Mathematics Comes From, George Lakoff and Rafael Nunez make a very strong case for the basis of mathematics being a metaphorical extension of the senses. In addition to this revolutionary idea, the authors also explain the why of Euler's Identity: e^(iπ) + 1 = 0, arguably the most symmetrical and beautiful (and, on the surface, somewhat baffling) mathematical formula ever. [Basic Books, First Edition 2001. ISBN 0465037712].
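For readers who want the one-step version of that "why": the identity is Euler's formula evaluated at a particular angle (this is the standard derivation, not Lakoff and Nunez's metaphor-based account of it):

```latex
% Euler's formula relates the exponential to the circle:
e^{i\theta} = \cos\theta + i\sin\theta
% Setting \theta = \pi, where \cos\pi = -1 and \sin\pi = 0:
e^{i\pi} = -1 + 0i
\quad\Longrightarrow\quad
e^{i\pi} + 1 = 0
```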
While the book is about the discipline of mathematics as a generalized model, it is no great leap to consider the same origin for any cognitive model and hence the basic founding model of thought itself. Indeed, how would it be any different? Where else would thought originate?
[7] When things don't correspond to this sensory-metaphor modality, we often have difficulty understanding them. An example might be quantum entanglement, where particles separated by significant distance behave in correlated ways even though any interaction across that distance would have to be faster than light. This feature seems to call into question our very concept of physical locality (things being nearer to or further away from other things), at least at the quantum level. It is hard to understand (and was hard to accept) in part because it conflicts with this most basic thought model.
Another example is the classic wave/particle duality of light. We experience waves and particles as different entities with different characteristics. Therefore, encountering something that "is" both (or neither, come to that) generates confusion. Basically, electromagnetic radiation is neither a particle nor a wave; it is whatever it is. We, as humans, must use an experienced metaphor in order to "understand" and process it. This use of metaphor inevitably means our understanding is limited to the span of the metaphoric association, seems to generate conflicting models, and must be an incomplete mapping onto the thing itself.
[8] Arguably, this is simply an extension of the position metaphor.