Artificial Intelligence (AI) is the part of computer science that automates responses to perceptions of an environment. AI is a multifaceted, specialized field that encompasses techniques such as big data analytics, machine learning, predictive analytics, and cognitive computing. These four, among others, are in various stages of evolution and implementation. "Artificial General Intelligence (AGI)", also referred to as "True AI", is, as a complete concept, still in the future. AI in today's environment is software that relies on the computer programmer's ability to solve a known set of requirements and problems. In the future, AGI must be able to deliver self-adapting responses in an environment of unknown unknowns.
Here is a summary of some commonly used definitions.
Artificial General Intelligence (AGI), aka Strong or Full AI, is the idea that a single intelligent system can operate with the capability, understanding and learning ability of a human being. This is in contrast with Weak or Narrow AI, which is currently limited to software created to accomplish specific tasks such as facial recognition, data extraction from large data sets, statistical modeling, etc. Today AGI is the province of discussion by philosophers and of speculation in numerous books and movies (primarily science fiction), but it is also the technological goal of scientists around the world. How will we know if AGI occurs? There are many proposed tests, among them: a single intelligent machine (robot) is asked to enter an unknown, randomly selected house, make a cup of coffee, and add an appropriate amount of milk and sugar. Consider how many points of capability, understanding and learning it would require to accomplish this. When will AGI occur? The guesses range from several years to never; the consensus is 30 or 40 years.
Artificial Intelligence (AI) is technically defined as the simulation or imitation of intelligent human behavior in computers using one or a combination of the techniques defined below (i.e., it is often used as a generic cover term for these techniques). Presently, AI is best described as the ability of a programmed computer system to adapt and respond to the current environment based on past experience. As previously stated, the future vision for AI is AGI: the ability to adapt and respond to an environment of unknown unknowns, as humans are capable of doing.
Artificial Neural Networks are today a crude imitation of our brains, which is the very reason AI research is inspired by and interested in closely studying the human brain. Since the brain epitomizes intelligence, and AI's quest is to create artificial intelligence, it makes sense to study the operations of the biological brain and replicate the things it does well (recognizing patterns, learning through practice, mastering general-purpose learning and applying it to specific problems, etc.). The ultimate goal is to replicate the capabilities of a biological brain with an electronic brain, which will require computers with massively parallel architectures.
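As an illustration of the idea only, the following minimal sketch (assuming Python with NumPy) shows the core computation in an artificial neural network: layers of simple "neurons" that weight their inputs, sum them, and pass the result through a nonlinearity. The weights here are random, not trained.

```python
import numpy as np

def sigmoid(x):
    # Squashing nonlinearity applied by each artificial "neuron"
    return 1.0 / (1.0 + np.exp(-x))

# A tiny, randomly initialized network: 3 inputs -> 4 hidden neurons -> 1 output
rng = np.random.default_rng(0)
w_hidden = rng.normal(size=(3, 4))   # input-to-hidden weights
w_output = rng.normal(size=(4, 1))   # hidden-to-output weights

def forward(inputs):
    # Each layer computes a weighted sum of the previous layer, then a nonlinearity
    hidden = sigmoid(inputs @ w_hidden)
    return sigmoid(hidden @ w_output)

print(forward(np.array([0.2, 0.7, -1.0])))
```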
Automated Machine Learning (AutoML) is the automation of machine learning implementations. AutoML covers the steps involved in developing deployable machine learning models from raw domain information, and it reduces the expertise needed to implement machine learning models and techniques.
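A minimal sketch of the idea, using scikit-learn's grid search as a stand-in for a full AutoML tool (the dataset and parameter grid are illustrative only): the search over model configurations that a practitioner would otherwise perform by hand is run automatically.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.ensemble import RandomForestClassifier

X, y = load_iris(return_X_y=True)

# Candidate preprocessing + model pipeline; the search tries each configuration
pipeline = Pipeline([("scale", StandardScaler()),
                     ("model", RandomForestClassifier(random_state=0))])
param_grid = {"model__n_estimators": [50, 100],
              "model__max_depth": [3, None]}

search = GridSearchCV(pipeline, param_grid, cv=5)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```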
Augmented Analytics is the use of Machine Learning (ML) to enhance and speed up Big Data Analytics with the intention of deriving more value. This is generally done with the use of analytics software and integrated analytic tools to scrub and parse raw data gathered from a number of disparate data sources. ML enables augmented analytics tools to understand and interact with raw data, notice valuable or unusual trends, and return key data for analysis.
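A minimal sketch of the idea, assuming Python with pandas and scikit-learn; the column names and values are invented. Here ML (an isolation forest) flags records that do not fit the overall pattern so an analyst can focus on them.

```python
import pandas as pd
from sklearn.ensemble import IsolationForest

# Hypothetical raw records gathered from disparate sources (values are made up)
records = pd.DataFrame({
    "daily_orders":   [120, 115, 130, 125, 980, 118, 122],
    "avg_basket_usd": [35.0, 34.5, 36.1, 35.8, 12.3, 35.2, 34.9],
})

# ML flags rows that do not fit the overall pattern, surfacing them for analysts
model = IsolationForest(contamination=0.15, random_state=0)
records["flagged"] = model.fit_predict(records[["daily_orders", "avg_basket_usd"]]) == -1
print(records[records["flagged"]])
```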
Big Data describes a data set that is extremely large, and/or complex, and/or persistently changing. Processing such a data set by conventional means to extract value, while operating on a database with these characteristics, is extremely difficult if not impossible. Big data is becoming increasingly prevalent in diverse environments and is growing exponentially, particularly with the rapid introduction and expansion of Internet of Things (IoT) devices.
Big Data Analytics is the process used to examine a big data set, with the characteristics described above, to uncover hidden patterns, unknown correlations, trends, preferences, anomalies, etc., with the goal of making better decisions. To accomplish this goal, Big Data Analytics uses one or a combination of the techniques defined below.
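A minimal sketch of the pattern-discovery step, using pandas on a tiny stand-in table (values are illustrative only); an actual big data job would run the same kind of analysis on a distributed engine.

```python
import pandas as pd

# Stand-in for a much larger data set; figures are illustrative only
sales = pd.DataFrame({
    "ad_spend":   [10, 20, 30, 40, 50, 60],
    "web_visits": [105, 198, 310, 395, 510, 602],
    "returns":    [5, 4, 6, 5, 4, 6],
})

# Uncover correlations between variables that may not be obvious by inspection
print(sales.corr())
```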
Cognitive Computing (CC) does not have a single, fully recognized definition and is often used interchangeably with AI. CC is frequently considered to be a gradually and laboriously evolving computer technology focused on mimicking the way the human brain functions and approaches complex problem solving. To accomplish this goal, CC uses many of the AI techniques defined in this summary and, as the brain does: learns as information changes and requirements evolve; resolves ambiguity and tolerates unpredictability; uses Data Mining techniques and recognizes patterns; and weighs conflicting data to suggest situationally fitting answers rather than what is "correct" or "established".
Data Mining is an older term, coined in the 1960s, that was used to describe many of the techniques and concepts now encompassed in AI. Today it can be used as a way to categorize and define Data Analytics techniques and results to produce desired value. For example: Clustering - finding groups and subgroups; Anomaly Detection - finding data that does not fit normal patterns; Association - identifying related pieces of data; Text Mining - analysis of written words.
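A minimal clustering sketch, assuming Python with scikit-learn; the points are invented. The algorithm finds the groups on its own rather than being told what they are.

```python
import numpy as np
from sklearn.cluster import KMeans

# Toy two-dimensional data containing two obvious groups (values are illustrative)
points = np.array([[1.0, 1.1], [0.9, 1.0], [1.2, 0.8],
                   [8.0, 8.1], [7.9, 8.3], [8.2, 7.9]])

# Clustering: let the algorithm discover the groups rather than labeling them by hand
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(points)
print(labels)
```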
Deep Learning is a sub-category of ML that uses multi-layer algorithms (usually based on artificial neural networks) to progressively extract higher-level features from raw input. For example, if the raw input were pixels, after several layers the final product might be a machine-recognizable face. This process can be applied to voice recognition, computer vision, social network filtering, etc.
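A minimal sketch, assuming Python with scikit-learn; a small multi-layer network trained on 8x8-pixel digit images stands in for a true deep architecture. Each hidden layer transforms the raw pixels into progressively higher-level features.

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# 8x8 pixel images of handwritten digits; each row is raw pixel intensities
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Two hidden layers successively transform raw pixels into higher-level features
net = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
net.fit(X_train, y_train)
print("test accuracy:", round(net.score(X_test, y_test), 3))
```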
Expert System is an early form of AI designed to emulate a human subject matter expert. An expert system applies a set of if-then rules to solve complex problems, such as in automatic electronic test, measurement, and diagnostic equipment. In contrast with today's AI systems, Expert Systems lack the ability to learn from experience.
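A minimal sketch of the if-then approach; the measurements, rule names, and thresholds are hypothetical. Note that nothing is learned here: the rules are fixed by the human expert.

```python
# Hypothetical measurements from a unit under test (names and thresholds invented)
readings = {"supply_voltage": 4.2, "output_ripple": 0.35, "temperature_c": 71}

# The "knowledge base": if-then rules captured from a human subject matter expert
rules = [
    (lambda r: r["supply_voltage"] < 4.5, "Fault: supply voltage below tolerance"),
    (lambda r: r["output_ripple"] > 0.25, "Fault: excessive output ripple"),
    (lambda r: r["temperature_c"] > 85,   "Fault: over-temperature"),
]

# The inference step: apply every rule and report the ones that fire.
# Unlike modern ML systems, the rules never change based on experience.
diagnoses = [message for condition, message in rules if condition(readings)]
print(diagnoses or ["No fault detected"])
```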
Large Language Model (LLM) is a type of artificial intelligence that uses deep learning techniques to understand and generate human-like text based on the input it receives. These models are trained on vast amounts of text data from various sources, allowing them to learn the complexities of language, including grammar, context, and even some elements of reasoning and creativity.
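A minimal sketch, assuming the Hugging Face transformers package is installed; the small GPT-2 model is used purely for illustration and is downloaded on first run. The model continues a prompt by repeatedly predicting the next token.

```python
# Requires the Hugging Face "transformers" package; GPT-2 is an illustrative small model
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

# The model extends the prompt by repeatedly predicting the most likely next token
result = generator("An expert system applies if-then rules to", max_new_tokens=20)
print(result[0]["generated_text"])
```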
Machine Learning (ML) is essentially a computer system process that learns, using algorithms, statistical models, etc., from a training set of prior performance data. A computer system with ML capabilities relies on patterns and inference, not on explicit instructions, to respond appropriately to new data inputs, perform a specific task, make decisions, or make predictions.
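A minimal sketch, assuming Python with scikit-learn; the training data are hypothetical. No explicit rules are written; the model infers the pattern from prior data and then responds to new inputs.

```python
from sklearn.tree import DecisionTreeClassifier

# Prior performance data (hypothetical): [hours_of_use, error_count] -> failed?
X_train = [[100, 0], [250, 1], [900, 7], [1200, 9], [150, 0], [1100, 8]]
y_train = [0, 0, 1, 1, 0, 1]   # 1 = unit failed, 0 = unit healthy

# No explicit rules are coded; the model infers the pattern from the data
model = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

# Respond to new, previously unseen inputs
print(model.predict([[200, 1], [1000, 6]]))
```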
Predictive Analytics is the use of Data Analytics to extract data from historical Big Data sets, apply traditional statistical modeling and ML, determine patterns, and produce future insights and predictions. The results should be seen as forecasts, with an acceptable level of reliability, of what might happen in the future, including what-if and risk-assessment scenarios and reports.
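A minimal sketch, assuming Python with NumPy and scikit-learn; the historical figures are invented. A simple statistical model is fitted to the history and used to forecast the next few periods; the output is an estimate, not a guarantee.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Historical data (hypothetical): month index vs. units sold
months = np.arange(1, 13).reshape(-1, 1)
units_sold = np.array([50, 54, 58, 61, 67, 70, 75, 78, 84, 88, 92, 97])

# Fit a simple statistical model to the history, then forecast the next quarter
model = LinearRegression().fit(months, units_sold)
forecast = model.predict(np.array([[13], [14], [15]]))
print(np.round(forecast, 1))   # treat as a forecast with limited reliability
```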