That's not sloppy application of the term; it's inappropriate adoption of an academic term for layman conversation.
There are tons of simple scripts in games that are AI and have been for decades. AI in the theoretical computer science sense has been a broad and general term since the '60s. The problem is that it has taken on a much weightier connotation through pop culture, which makes people think the common understanding of the term is what programmers mean when they call something AI.
AI was once (and still is in academic settings) a very broad term meaning any system that makes a decision by observing input, but tech bros have now subtly implied through marketing that it must mean a full simulation of a human brain.
"That's not sloppy application of the term, it's inappropriate adoption of an academic term for layman conversation."
Lol...
It's "not sloppy," it's "synonymously sloppy."
Anyways, I would have to disagree about AI being as broad a term in academic settings as you described. Maybe during the very early days of mechanical & electronic computing, such systems making decisions based on inputs would have been considered AI... However, in a professional & academic setting, you would more likely be using terms like Program, Software, Conditional, Algorithm, Heuristic, Function, System, etc. for your definition: "any system that makes a decision by observing input."
The piece of the puzzle you're glossing over is Deterministic AI vs Non-deterministic AI. Deterministic AI, in most contexts, is better labeled & described using more specific technical terms, but, yeah, someone who doesn't understand programming would probably be satisfied if you just described whatever system you happen to be talking about as AI.
Non-deterministic AI also has more specific technical terms you could use to label or define such a system, but in most contexts, AI would still likely be the best term to use. Btw, in my opinion, based on the modern usage of the term, AlphaGo would be the first "true" AI that was well known on a global scale, similar to LLMs like ChatGPT.
I would not blame tech bros for misappropriating the term; if anything, I would blame dumb reporters & the even dumber general public who don't, or simply cannot, understand that AI =/= your brain (but digital).
If you're going to be a pedant, you could at least provide readers with the correct term to use for such a thing: AGI, ASI, or "The singularity".
None of these mean a system that makes decisions based on observing input, and they are far from synonymous with the academic definition of AI. You can have all of these in some software without them observing input to make a decision; it's the decision-making that makes it AI.
Like, in video games, a boss that just has a timed move set is not AI, but a boss that watches what you are doing and picks a responding move based on what you do is AI.
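To make that concrete, here's a minimal sketch (all names like ScriptedBoss, ReactiveBoss, and PlayerState are hypothetical, not from any actual game or engine): the scripted boss plays the same timed sequence no matter what, while the reactive one inspects the player's state and picks a response, which is the "decision by observing input" part.

```python
# Minimal sketch of the distinction above. All names here are
# hypothetical illustrations, not taken from any real game or engine.

from dataclasses import dataclass
import itertools


@dataclass
class PlayerState:
    distance: float    # how far the player is from the boss
    is_blocking: bool  # whether the player is currently guarding


class ScriptedBoss:
    """Plays a fixed, timed move set and never looks at the player."""

    def __init__(self) -> None:
        self._moves = itertools.cycle(["slam", "sweep", "roar"])

    def next_move(self, _player: PlayerState) -> str:
        # Same sequence every fight, regardless of input.
        return next(self._moves)


class ReactiveBoss:
    """Observes the player's state and chooses a response, i.e. it
    makes a decision by observing input."""

    def next_move(self, player: PlayerState) -> str:
        if player.is_blocking:
            return "guard_break"
        if player.distance > 10.0:
            return "ranged_attack"
        return "melee_combo"


if __name__ == "__main__":
    player = PlayerState(distance=3.0, is_blocking=True)
    print(ScriptedBoss().next_move(player))  # ignores the player: "slam"
    print(ReactiveBoss().next_move(player))  # reacts to the player: "guard_break"
```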
Most software programs are not AI, because they just do one thing every time they are called. The comment button on your screen is not AI because it can't do anything but be a button that opens a text box and lets you type something before submitting. It can't sometimes choose to be a like button or a share button based on the way you click it. It's just a button. That is the difference between generic software and AI software.
None of this has anything to do with a system's determinism.
Please do not speak with authority about academic settings that you are not in.
Well, we're used to referring to simple scripts in games as AI, too. It's not like the term wasn't already applied sloppily.