Courses on quantitative methods are offered in a wide variety of disciplines, from the social sciences to business to the natural sciences, and the same statistical methods are applied across all of them. It should therefore not be surprising that the tools you will learn in this course will benefit you in your future courses and careers, whether your interest lies in Finance, Accounting, Strategy, Management or Marketing. In this course, you will learn many useful methods for making sound decisions as the managers of the future. The first part of the course focuses on the principles of rational decision-making, supported by the theory of probability. The second part extends these techniques to computer-assisted decision-making, in particular Machine Learning (ML) using Neural Networks (NN). This provides the student with one of the main tools of the current paradigm of decision-making over big data. Basic programming skills are required, as the second part of the course applies ML techniques in a Python environment.
Titular Professors
Professors
Basic knowledge of statistics and probability. Basic knowledge of calculus.
The Learning Outcomes of this subject are: LO.01 - Know the terminology, notation and methods of quantitative research, in particular those related to inference. LO.02 - Be able to analyse and summarize information from lectures and materials provided by the teacher. LO.03 - Understand and be able to implement ML algorithms in Python.
These are the topics that will be covered during the course:

First part
1. Probability Review
2. Conditional Probability and Bayes' Theorem
3. Decision Trees
4. Multicriteria Decision Making

Second part
5. Multivariable Function Analysis
6. Perceptrons
7. Neural Networks
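As a taste of how the first-part material is later made computational, Bayes' Theorem (topic 2) can be evaluated in a few lines of Python. The numbers below are purely illustrative, not course data:

```python
# Illustrative Bayes' Theorem computation: P(disease | positive test)
# from a prior, a sensitivity and a false-positive rate (made-up figures).
prior = 0.01            # P(disease)
sensitivity = 0.95      # P(positive | disease)
false_positive = 0.05   # P(positive | no disease)

# Law of total probability: overall chance of a positive test
p_positive = sensitivity * prior + false_positive * (1 - prior)

# Bayes' Theorem: update the prior with the test evidence
posterior = sensitivity * prior / p_positive
print(round(posterior, 3))  # → 0.161
```

Note that even with a 95% sensitive test, the posterior stays low because the prior is small; this is exactly the kind of reasoning practised with cross tables and confusion matrices in sessions 2-3.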
| Session | Unit | Subjects | In-class assignment | Homework distributed | Homework deadline |
| 1 | 1. Overview, probability theory | Decision trees, Probability refresher | x | | |
| 2 | 2. Conditional probability and Bayes' Theorem | Cross tables, Bayes' Theorem, Likelihood | | Homework 1 | |
| 3 | | Confusion matrix, Probability trees | x | | Homework 1 |
| 4 | 3. Decision trees, Games | Expected utility. Games in normal form, games in extensive form | x | | |
| 5 | | Imperfect information, Perfect information | | Homework 2 | |
| 6 | 4. Multi-criterion Decision Making | Decisions without probability, SMART. Sensitivity analysis | x | | Homework 2 |
| 7 | 5. Computer-assisted decisions | Machine Learning overview. Application to Bayesian networks | x | | |
| MIDTERM | Written exam | Sessions 1-6. Bring pen, eraser and pocket calculator | | | |
| 8 | 6. Multivariate analysis | Multivariate optimisation. Gradient descent | | Homework 3 | |
| 9 | 7. Perceptron | The perceptron, learning, and the AI winter | x | | Homework 3 |
| La Salle Fest | Self-study, tbd | | | Final Project | |
| 10 | 8. Neural networks and learning | Deep neural networks. Fitting. Bootstrapping | x | | |
| 11 | | More on fitting neural networks. Leveraging large pretrained models through APIs | | | |
| 12 | 9. Application of neural networks | Current applications of neural networks | x | | |
| 13 | | Current applications of neural networks | | | Final Project |
| FINAL | Oral exam | Questions on Final Project + theory from sessions 7-12 | | | |
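Sessions 8-9 move from hand calculation to optimisation by machine. As an illustrative sketch of gradient descent (not course code), minimising a one-variable function in Python looks like this:

```python
# Minimal gradient descent on f(x) = (x - 3)^2, whose minimum is at x = 3.
def grad(x):
    return 2 * (x - 3)   # derivative of (x - 3)^2

x = 0.0                  # starting point
lr = 0.1                 # learning rate (step size)
for _ in range(100):
    x -= lr * grad(x)    # step downhill along the negative gradient

print(round(x, 4))       # → 3.0 (converges to the minimum)
```

The same update rule, applied to the many weights of a neural network at once, is what "learning" means in the second part of the course.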
Weekly teaching consists of one lecture session to explain basic concepts and one in-class group problem-solving session to apply that knowledge to practical situations. Programming sessions are devoted to problem-solving and to the final project.
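As an indication of the level of programming expected in the lab sessions, a minimal perceptron of the kind studied in the second part can be trained on the logical AND function in plain Python (an illustrative sketch, not course material):

```python
# Perceptron learning the AND function with the classic update rule.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]                       # weights
b = 0.0                              # bias
lr = 0.1                             # learning rate

for _ in range(20):                  # a few passes over the data suffice
    for (x1, x2), target in data:
        out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
        err = target - out
        w[0] += lr * err * x1        # perceptron update rule
        w[1] += lr * err * x2
        b += lr * err

preds = [1 if w[0] * x1 + w[1] * x2 + b > 0 else 0 for (x1, x2), _ in data]
print(preds)                         # → [0, 0, 0, 1]
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop finds a correct set of weights; the famous failure on XOR, and its resolution by multi-layer networks, is covered in the sessions on neural networks.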
| Subject | Methods |
| 1. Probability theory | Lecture, Problem solving, Group work, Computer lab |
| 2. Conditional probability and Bayes' Theorem | Lecture, Problem solving, Group work, Computer lab |
| 3. Decision trees, Games | Lecture, Problem solving, Group work, Computer lab |
| 4. Multi-criterion Decision Making | Lecture, Problem solving, Group work |
| 5. Computer assisted decisions | Lecture, Problem solving, Group work, Computer lab |
| 6. Multivariate analysis | Lecture, Problem solving, Computer lab |
| 7. Perceptron | Lecture |
| 8. Neural networks and learning | Lecture, Problem solving, Group work, Computer lab |
| 9. Application of neural networks | Lecture, Problem solving, Group work, Computer lab |
| Part | Weight | What | Note | Importance |
| Participation | 10% | Attendance, attitude, punctuality, class assignments | - | Moderate |
| Individual work | 20% | Homework 1-3 | At least 4/10 on average to pass | Moderate |
| Case study | 20% | Final project, individual or in groups | At least 4/10 to pass | High |
| Midterm | 25% | Written exam | At least 4/10 to pass | High |
| Final | 25% | Oral exam | At least 4/10 to pass | High |
Retake policy: there is no retake exam in this course.
--
The recommended textbook is:
1) "Decision Analysis for Management Judgment", Paul Goodwin and George Wright, Wiley, 2009.

Fortunately, the most important references for machine learning are available online:
2) "Neural Networks and Deep Learning", Michael Nielsen, available online.
3) "Practical Deep Learning for Coders", J. Howard and S. Gugger (fast.ai).

For a more formal exposition of the topic and perspectives on advanced ML techniques:
4) "Deep Learning", I. Goodfellow, Y. Bengio and A. Courville, https://www.deeplearningbook.org/
--