


Academic Year: 2022/23

8037 - Theoretical and Applied Linguistics - MA

32134 - Computational Semantics


Teaching Plan Information

Academic Course:
2022/23
Academic Center:
803 - Masters Centre of the Department of Translation and Language Sciences
Study:
8037 - Theoretical and Applied Linguistics - MA
Subject:
32134 - Computational Semantics
Ambit:
---
Credits:
5.0
Course:
1
Teaching languages:
Theory: Group 1: English
Teachers:
Gemma Boleda Torrent, Lucas Weber, Eleonora Gualdoni
Teaching Period:
First quarter
Schedule:

Presentation

This course provides the basics of how natural language meaning is modeled in Computational Linguistics / Natural Language Processing. We will introduce the relevant semantic phenomena as well as the main computational approaches to tackling them. Along the way, we will learn the basic methodology of Machine Learning and strengthen students' skills in using computational and quantitative tools (in class, we will use Python and Python-based toolkits such as spaCy and NLTK).
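
To give a flavour of the toolkits named above, here is a minimal illustrative sketch (not part of the official course materials; it assumes NLTK and the spaCy model en_core_web_sm are installed) that tokenizes and tags one sentence with both libraries:

    import nltk
    import spacy

    nltk.download("punkt", quiet=True)   # tokenizer data (newer NLTK versions may also need "punkt_tab")
    print(nltk.word_tokenize("Computational semantics models what words mean."))

    nlp = spacy.load("en_core_web_sm")   # assumes this small English model has been installed
    doc = nlp("Computational semantics models what words mean.")
    print([(token.text, token.pos_) for token in doc])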

Associated skills

  • Analytical skills (problem solving, data analysis, reasoning about semantic data).
  • Machine Learning methodology.
  • Familiarity with data annotation.
  • Ability to evaluate model results and do error analysis.
  • Basic programming (Python, NLTK).
  • Quantitative thinking in the domain of language.

Learning outcomes

The student will acquire:

  • a deeper understanding of semantics and how Computational Linguistics can contribute to its study;
  • knowledge of the basic methodology of Machine Learning, and associated basic skills to carry out Machine Learning experiments;
  • familiarity with quantitative and computational methods for semantic phenomena.

Sustainable Development Goals

SDG 4: Quality Education

SDG 8: Decent Work and Economic Growth

SDG 9: Industry, Innovation and Infrastructure

 

Prerequisites

Contents

Part 1: Introduction to Computational Semantics

After a general introduction (to the teaching methods, the topic, and ourselves), we dive right in by building a system for Sentiment Classification. In week 1 we use simple, hand-crafted rules; in week 2 we let the computer optimize our model using Logistic Regression. Throughout this part we also focus on model evaluation and analysis: How well do systems perform? When and why do they fail? What can a classifier tell us about language?
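
The two approaches, and the kind of evaluation against gold labels we will practise, can be illustrated with a minimal sketch on a toy dataset; the use of scikit-learn here is an assumption for illustration, since the course only specifies Python:

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression

    texts = ["a great and moving film", "boring plot and terrible acting",
             "great acting and a great script", "terrible, simply terrible"]
    labels = [1, 0, 1, 0]   # 1 = positive, 0 = negative (toy gold standard)

    # Week 1 style: hand-crafted rules over a small sentiment lexicon.
    POSITIVE, NEGATIVE = {"great", "moving"}, {"boring", "terrible"}
    def rule_based(text):
        words = text.lower().split()
        score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
        return 1 if score > 0 else 0

    # Week 2 style: let the computer optimize feature weights with Logistic Regression.
    vectorizer = CountVectorizer()
    X = vectorizer.fit_transform(texts)
    classifier = LogisticRegression().fit(X, labels)

    # Evaluation and error analysis: compare both systems' predictions to the gold labels.
    for text, gold in zip(texts, labels):
        predicted = classifier.predict(vectorizer.transform([text]))[0]
        print(f"{text!r}: rules={rule_based(text)}, logreg={predicted}, gold={gold}")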

Part 2: What words mean

An important distinction in linguistic semantics/pragmatics is between what words mean in general, and what speakers mean by them on a particular occasion. Part 2 of this course will be about modelling what words mean in general. We will cover two approaches: sense enumeration (WordNet) and distributional semantics (in particular Word2Vec).
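
The contrast between the two approaches can be made concrete with a short sketch; the choice of NLTK's WordNet interface and gensim, as well as the toy corpus, are illustrative assumptions rather than course-prescribed code:

    import nltk
    from nltk.corpus import wordnet as wn
    from gensim.models import Word2Vec

    nltk.download("wordnet", quiet=True)

    # Sense enumeration: WordNet lists a fixed inventory of senses for "bank".
    for synset in wn.synsets("bank")[:3]:
        print(synset.name(), "-", synset.definition())

    # Distributional semantics: meaning as a vector learned from co-occurrence patterns.
    toy_corpus = [["the", "bank", "approved", "the", "loan"],
                  ["she", "sat", "on", "the", "river", "bank"],
                  ["the", "bank", "raised", "interest", "rates"]]
    model = Word2Vec(toy_corpus, vector_size=50, window=2, min_count=1, epochs=100, seed=0)
    print(model.wv.most_similar("bank", topn=3))   # nearest neighbours in the learned space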

Part 3: What speakers mean by words

Whereas what a word means in general is necessarily an abstraction over many possible uses, what a given speaker means by it on a particular occasion is typically very specific. In this final part of the course we consider two case studies that deal with this contextualized specificity: coreference resolution and contextualized word embeddings. If time allows, we will introduce multi-modal approaches integrating visual and textual sources of information about meaning.
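
For the contextualized-embedding case study, a minimal sketch like the following shows how the same word receives different vectors in different contexts; the Hugging Face transformers library and the BERT model used here are illustrative assumptions, since the course does not prescribe a specific toolkit:

    import torch
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModel.from_pretrained("bert-base-uncased")

    def contextual_vector(sentence, word):
        # Return the contextual embedding of `word` (assumed to be a single BERT token).
        inputs = tokenizer(sentence, return_tensors="pt")
        with torch.no_grad():
            hidden = model(**inputs).last_hidden_state[0]          # (num_tokens, 768)
        tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
        return hidden[tokens.index(word)]

    v_money1 = contextual_vector("she deposited money at the bank", "bank")
    v_river  = contextual_vector("they walked along the river bank", "bank")
    v_money2 = contextual_vector("the bank approved the mortgage", "bank")

    cos = torch.nn.functional.cosine_similarity
    print("money vs. river:   ", cos(v_money1, v_river, dim=0).item())
    print("money vs. mortgage:", cos(v_money1, v_money2, dim=0).item())   # typically higher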

Note: The contents of the course may vary depending on the students' interests. 

Teaching Methods

The class will be based on lectures, readings, and practical exercises. Lectures will primarily be Q&A sessions about the weekly readings, and students are expected to submit three questions about each week's reading. Readings will mostly come from the textbook, though research articles can be added based on student interest. The practical exercises will focus on model construction, evaluation, and analysis for computational semantic tasks, as well as on data annotation.

Evaluation

The students will be evaluated via:

  1. Three written assignments that include programming, evaluation, and analysis of computational semantic tasks:
    • Assignment 1: 20%
    • Assignment 2: 20%
    • Assignment 3: 30%
  2. Questions submitted about the textbook: 20%
  3. Participation in class discussions (in class or online): 10%

Bibliography and information resources

Textbook:

  • Jurafsky, Daniel & Martin, James H. Speech and Language Processing: An Introduction to Natural Language Processing, Computational Linguistics, and Speech Recognition. 3rd edition (online draft). https://web.stanford.edu/~jurafsky/slp3

Further readings:

  • Word meaning:

Kilgarriff, Adam. I don't believe in word senses. Computers and the Humanities 31.2 (1997): 91-113.

Murphy, Gregory L. (2002). The big book of concepts. Cambridge, MA: MIT Press. Note: Really great book I recommend to everybody interested in word meaning. See especially Chapter 11.

  • Symbolic (formal semantics, DRT-based) system for the processing of free English text (not covered in the course):

Bos, Johan (2008). Wide-coverage semantic analysis with Boxer. Proceedings of the 2008 Conference on Semantics in Text Processing. Association for Computational Linguistics.

Bos, J., & Markert, K. (2005). Recognising textual entailment with logical inference. Proceedings of the conference on Human Language Technology and Empirical Methods in Natural Language Processing (pp. 628-635). Association for Computational Linguistics.

  • Distributional semantics, general:

Boleda, G. (2020). Distributional Semantics and Linguistic Theory. Annual Review of Linguistics, Vol. 6: 213-234. (Pre-print version). Note: survey article.

Stephen Clark. 2015. Vector Space Models of Lexical Meaning. In Handbook of Contemporary Semantic Theory, second edition, edited by Shalom Lappin and Chris Fox. Chapter 16, pp. 493-522. Wiley-Blackwell. (Pre-copyediting version). Note: survey article.

Katrin Erk. Vector space models of word meaning and phrase meaning: a survey. Language and Linguistics Compass 6(10), 635-653, October 2012. Note: survey article.

Alessandro Lenci. 2008. Distributional semantics in linguistic and cognitive research. Italian journal of linguistics, 20 (1), pp. 1-31. Note: survey article.

  • Building word vectors with neural networks:

Matthew E. Peters, Mark Neumann, Mohit Iyyer, Matt Gardner, Christopher Clark, Kenton Lee, Luke Zettlemoyer.  Deep contextualized word representations. Proceedings of NAACL 2018.

Jeffrey Pennington, Richard Socher, Christopher Manning. 2014. Glove: Global vectors for word representation. Proceedings of EMNLP.

Tomas Mikolov, Kai Chen, Greg Corrado, Jeffrey Dean. 2013. Efficient Estimation of Word Representations in Vector Space. arXiv:1301.3781v3.

M. Baroni, G. Dinu and G. Kruszewski. 2014. Don't count, predict! A systematic comparison of context-counting vs. context-predicting semantic vectors. Proceedings of ACL 2014 (52nd Annual Meeting of the Association for Computational Linguistics), East Stroudsburg PA: ACL, 238-247.

Mikolov, T., Yih, W., & Zweig, G. (2013). Linguistic Regularities in Continuous Space Word Representations. Proceedings of the 2013 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (pp. 746–751). Atlanta, Georgia: Association for Computational Linguistics.

  • General Machine Learning:

Turing, A. M. (1950). Computing machinery and intelligence. Mind, 59(236), 433-460. http://m.mind.oxfordjournals.org/content/LIX/236/433.full.pdf (if that fails: http://phil415.pbworks.com/f/TuringComputing.pdf)

Domingos, P. (2012). A few useful things to know about machine learning. Communications of the ACM, 55(10), 78. http://doi.org/10.1145/2347736.2347755

Parloff, R. (2016). The AI Revolution: Why Deep Learning Is Suddenly Changing Your Life. Fortune Magazine. Note: a well-written, thorough popular-science article about deep learning.

LeCun, Y., Bengio, Y., & Hinton, G. (2015). Deep learning. Nature, 521(7553), 436–444. http://doi.org/10.1038/nature14539.

 

