For the moment, the main deficiency in AI (where philosophy is concerned) is its inability to formulate and argue a strong, original case. Presented with a philosophical question, its responses too often resemble a summary of discussion points. — alan1000
IMHO, these machines are still only very very fast GIGO, data-mining, calculators. — 180 Proof
As I've already said, I think AIs must also be embodied (i.e. have synthetic phenomenology that constitutes their "internal models").
So if it's not internal models that make them more than "very fast GIGO, data-mining, calculators", then what would, in your view? — flannel jesus
I'll be convinced of that when, unprompted and on its own, an AI asks and explores the implications of non-philosophical as well as philosophical questions, understands when and when not to question, and learns how to create novel, more probative questions. This is my point about what current AIs (e.g. LLMs) cannot do.
What evidence would you have to see about some future generation of AI that would lead you to say it's more than "very fast GIGO, data-mining, calculators"?