Understanding is a complex phenomenon that has been studied extensively in the fields of psychology, cognitive science, and philosophy. While it is true that understanding may exist beyond the realm of human experience, the notion that plants or artificial intelligence possess a similar or comparable understanding to humans is simply not correct.
Plants are known to possess various mechanisms for sensing and responding to their environment, such as phototropism and gravitropism. These mechanisms involve chemical and molecular processes that allow plants to adapt and respond to their surroundings. However, while these processes are important for plant survival, they do not imply a form of understanding in the human sense. Understanding involves the ability to reason, perceive, and empathize with others, qualities not typically associated with plant life or AI.
Similarly, while artificial intelligence has made tremendous strides in recent years, current AI systems are still far from possessing a similar understanding to humans. AI systems typically rely on pre-defined rules and algorithms to process data and make decisions, which is a far cry from the complex cognition and decision-making processes that humans possess. While AI systems may be able to process vast amounts of data and produce outputs based on pre-defined rules, they lack the ability to reason and perceive in the same way that humans do. For example, while an AI system may be able to recognize and label objects in an image, it does not have the same understanding of the objects as a human would.
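To make the point about labeling concrete, here is a minimal sketch of what object "recognition" amounts to under the hood: mapping feature vectors to the nearest stored prototype. The function names, labels, and feature values are illustrative assumptions, not any real system's API; real classifiers are far more sophisticated, but the principle is the same: pattern matching, not semantic understanding.

```python
# Toy classifier sketch (hypothetical data): assigns the label whose
# stored prototype vector is closest to the input features.
# It "labels" without knowing what a cat or dog is.

def classify(features, prototypes):
    """Return the label whose prototype is closest (Euclidean distance)."""
    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(prototypes, key=lambda label: distance(features, prototypes[label]))

# Hypothetical "learned" prototypes: averaged feature vectors per label.
prototypes = {
    "cat": [0.9, 0.1, 0.2],
    "dog": [0.2, 0.8, 0.1],
}

print(classify([0.85, 0.15, 0.25], prototypes))  # → cat
```

The output label is just the result of a distance computation; nothing in the program grounds the word "cat" in anything beyond the numbers it was given.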
Moreover, research in cognitive science has shown that understanding in humans involves a complex interplay between various cognitive processes, such as perception, attention, memory, and reasoning; lacking such processes will inevitably produce lower-quality output. For example, studies have shown that our ability to reason about abstract concepts such as justice or freedom relies on our ability to make analogies and draw connections between different domains of knowledge. Similarly, our ability to understand language relies on our ability to process and integrate information from multiple sources, such as syntax, semantics, and pragmatics.
[CHATGPT's understanding of what I wrote; I asked it to draw a conclusion and it's plain wrong/useless, props to Jakob for being the first to notice]
In conclusion, while it is important to acknowledge that there may be different forms of understanding, the notion that plants or artificial intelligence possess an understanding similar or equivalent to that of humans is an impression you may get when reading a reply that simulates reasoning but is in fact only predictive pattern-matching. Understanding in humans involves a complex interplay between various cognitive processes, which are not typically associated with plant life or current AI systems.
I agree with your conclusion, but not with your reasoning; the matter is far more complicated than can be covered in a forum post. However, I conclude (just as you do) that AI is fine and can be helpful in many technical situations. Still, an AI response will never match the quality of a human response, at least for the time being, even with a perfectly fine-tuned dataset, due to the lack of the aforementioned qualities (context, intertwining concepts, reasoning from similar problems, connecting the dots, and so on: an effectively endless list of human qualities).
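To illustrate what "predictive patterns" means in the simplest possible terms, here is a toy bigram model: it continues text by picking the word that most often followed the current one in its training data. The corpus and names here are made-up assumptions for illustration; actual language models are vastly larger, but the distinction holds: the program predicts, it does not reason.

```python
# Toy next-word predictor: counts which word follows which in a corpus,
# then "continues" text by emitting the most frequent follower.
# Fluent-looking output, zero understanding of what the words mean.
from collections import Counter, defaultdict

def train_bigrams(text):
    """Build a table mapping each word to a Counter of its followers."""
    words = text.split()
    follows = defaultdict(Counter)
    for a, b in zip(words, words[1:]):
        follows[a][b] += 1
    return follows

def predict_next(follows, word):
    """Most frequent follower of `word`, or None if the word was never seen."""
    if word not in follows:
        return None
    return follows[word].most_common(1)[0][0]

corpus = "the cat sat on the mat the cat ran"  # hypothetical training text
model = train_bigrams(corpus)
print(predict_next(model, "the"))  # → cat
```

The model will happily produce plausible-sounding continuations of anything resembling its training data, and nothing at all for anything outside it; there is no reasoning step anywhere in the process.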
TL;DR AI is good enough and I agree with your conclusion, but I strongly disagree with your reasoning. Prediction models != reasoning or understanding, therefore output quality is inevitably lower.