Natural Language Processing
Definition of NLP
Natural Language Processing (NLP) is a field of Artificial Intelligence (AI) that focuses on enabling computers to understand, interpret, and generate human language. It's essentially about making machines capable of interacting with humans through natural language, whether spoken or written.
Computer Science
concerned with developing internal representations of data and efficient processing of these structures.
Cognitive Psychology
looks at language usage as a window into human cognitive processes, and has the goal of modeling the use of language in a psychologically plausible way.
Linguistics
focuses on formal, structural models of language and the discovery of language universals – the field of NLP was originally referred to as Computational Linguistics.
What is language?
a systematic means of communicating ideas or feelings by the use of conventionalized signs, sounds, gestures, or marks having understood meanings.
Formal language
Python, Java, C
Natural language
Human language
Phonetics
the study of speech sounds and their physiological production and acoustic qualities.
Phonology
Interpretation of speech sounds within and across words.
Morphology
Understanding distinct words according to their morphemes.
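The idea of splitting a word into its morphemes can be sketched as a toy affix-stripper. The prefix and suffix lists here are illustrative assumptions; real morphological analyzers use finite-state transducers and large lexicons, not a naive rule list.

```python
# Toy morphological analysis: strip one common English prefix and suffix.
# Affix lists are illustrative, not a real morphological lexicon.
PREFIXES = ["un", "re", "dis"]
SUFFIXES = ["ness", "ing", "ed", "s"]

def morphemes(word):
    """Split a word into its matching prefix, stem, and suffix."""
    prefix = next((p for p in PREFIXES if word.startswith(p)), "")
    rest = word[len(prefix):]
    suffix = next((s for s in SUFFIXES if rest.endswith(s)), "")
    stem = rest[:len(rest) - len(suffix)] if suffix else rest
    return [m for m in (prefix, stem, suffix) if m]

# Note the stem is approximate ("happi", not "happy") -- spelling
# changes at morpheme boundaries need real morphological rules.
print(morphemes("unhappiness"))  # ['un', 'happi', 'ness']
print(morphemes("walking"))      # ['walk', 'ing']
```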
Lexicon
Understanding everything about distinct words: their position in speech, their meanings, and their relations to other words.
non-standard English
neologisms
idioms
Syntax
Analyzing the words of a sentence to uncover its grammatical structure.
Semantics
The possible meanings of a sentence, focusing on the interactions among word-level meanings within the sentence.
Discourse Analysis
Properties of the text as a whole that convey meaning by making connections between component sentences.
Pragmatics
Explains how extra meaning is read into texts without actually being encoded in them.
History
1950 Turing's "Computing Machinery and Intelligence" - proposes the imitation game (Turing test)
1954 Georgetown experiment - early machine translation demonstration
1960s ELIZA (Weizenbaum) - simulated a psychotherapist
1970s Conceptual ontologies - structured representations of real-world knowledge
chatterbots - Jabberwacky (1997), A.L.I.C.E.
Until the 1980s - complex hand-written (programmed) rules; machine learning algorithms used thereafter
Machine learning
Advantages
uses statistics to handle unfamiliar input
improves with more data, not more programming
automatically focuses on the most common cases
Applications - clever searches
N-gram
An n-gram is a sequence of n adjacent items (like letters, syllables, or words) in a text or speech dataset.
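The definition above can be sketched in a few lines of Python; the `ngrams` function and the sample sentence are illustrative, not part of any particular library.

```python
def ngrams(tokens, n):
    """Return every n-gram: each tuple of n adjacent items in the sequence."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

# Word-level bigrams (n = 2) from a short sentence.
words = "the quick brown fox".split()
print(ngrams(words, 2))
# [('the', 'quick'), ('quick', 'brown'), ('brown', 'fox')]
```

The same function works on any sequence, so character-level n-grams come out of `ngrams(list("text"), 3)` with no changes.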
sentiment analysis
identify trends in public opinion
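A minimal sketch of lexicon-based sentiment scoring, one of the simplest approaches: count positive and negative words and take the difference. The word lists here are illustrative assumptions, not a real sentiment lexicon.

```python
# Toy lexicon-based sentiment scorer; the word sets are illustrative only.
POSITIVE = {"good", "great", "love", "excellent"}
NEGATIVE = {"bad", "terrible", "hate", "poor"}

def sentiment(text):
    """Score text as (positive word count) minus (negative word count)."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

print(sentiment("I love this great product"))    # 2
print(sentiment("this is terrible I hate it"))   # -2
```

Production systems go further -- handling negation, punctuation, and context with curated lexicons or trained classifiers -- but the counting idea is the same.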
Web Links
https://www.deeplearning.ai/resources/natural-language-processing/
https://www.britannica.com/science/phonetics
https://www.ibm.com/think/topics/machine-learning
https://www.mathworks.com/discovery/ngram.html