
## Revision as of 09:55, 25 November 2019

• A topic need not occupy the entire academic year, and we could plan to consider more than one topic.
• To get an account to edit this wiki send a message to Georges Monette.

## Seminar on Friday, September 20, 2019

At our meeting on September 13, we agreed to read and discuss the first two chapters of An Introduction to Statistical Learning, which can be downloaded from http://faculty.marshall.usc.edu/gareth-james/ISL/. We will then decide what to do next.

## Seminar on Friday, October 4, 2019

We will read and discuss Chapter 3 of An Introduction to Statistical Learning.
Data for the exercises are available by installing the ISLR package in R.
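As a minimal sketch of that setup step (assuming a working R installation with internet access; the Auto data set is one of those shipped with the ISLR package for the exercises):

```r
# One-time installation of the ISLR package from CRAN
install.packages("ISLR")

library(ISLR)   # attach the package so its data sets are available
data(Auto)      # load the Auto data set used in the exercises
head(Auto)      # inspect the first few rows (mpg, cylinders, displacement, ...)
```

After `library(ISLR)`, the other data sets referenced in the book (e.g. via `data()`) can be loaded the same way.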

## Seminar on Friday, October 25, 2019

We will read and discuss Chapter 4 of An Introduction to Statistical Learning.

## Seminar on Friday, November 8, 2019

We will read and discuss Chapter 5 of An Introduction to Statistical Learning.
• A More General Cross Validation Function for R: http://blackwell.math.yorku.ca/wiki_uploads/Cross-Validation.html (Rmd file: http://blackwell.math.yorku.ca/wiki_uploads/Cross-Validation.Rmd)

## Seminar on Friday, November 22, 2019

We will read and discuss Chapter 6 of An Introduction to Statistical Learning.
• 15 hours of videos by Tibshirani and Hastie on our textbook: https://www.r-bloggers.com/in-depth-introduction-to-machine-learning-in-15-hours-of-expert-videos/

## New candidate topics for 2019-2020

• Add suggestions here or send them to Georges Monette who can add them for you.
• "Statistical learning": We could read either An Introduction to Statistical Learning [1] or The Elements of Statistical Learning [2]; free PDFs are available for both.

"The books are based on the concept of “statistical learning,” a mashup of stats and machine learning. The field of machine learning is all about feeding huge amounts of data into algorithms to make accurate predictions. Statistics is concerned with predictions as well, says Tibshirani, but also with determining how confident we can be about the importance of certain inputs. This is important in areas like medicine, where a researcher doesn’t just want to know whether a medicine worked, but also why it worked. Statistical learning is meant to take the best ideas from machine learning and computer science, and explain how they can be used and interpreted through a statistician’s lens." See [3]

## Candidates from 2018-2019

• Reproducibility of research: a crisis in Statistics??: How should statistical practice be informed by current controversies about the replication crisis and countervailing moves toward open science and reproducibility? This was a big topic at the recent JSM 2018 conference.
• Roger Peng, Reproducible Research in Computational Science, Science, Vol. 334, Issue 6060, pp. 1226-1227 [4]
• RC: This is timely in a couple of ways. Chris Green is teaching a course on the topic, so he or one of his students might be able to come to a meeting to discuss their views. We also have a QM Forum speaker on the topic in the Winter.
• We can use articles in the special issue of the American Statistician devoted to current problems in statistical inference, particularly the use and interpretation of p-values and the concept of statistical significance. For an overview, see the editorial published in the special issue.
• Big Data problems: Another hot topic, but perhaps too broad. [MF: I don't know enough to specify it more clearly as a useful seminar topic.] [This would be interesting, but I would hope we could find materials that address the issue from a social science perspective]
• Some statistical methods topics, not yet clearly articulated:
• Clustering methods [RC: Could this fall under the Big Data label?]
• Robustness
• Mediation
• Meta analysis in medicine: how can you tell whether a lit review is complete with logistic regression?! [RC: I think Meta-Analysis could be a good topic, including M-L/J-A discussing their research]
• Consulting issues: Practical aspects of statistical consulting [MF: Perhaps this would be a better topic for the SCS staff meetings ??] [RC: That would make a good topic for a business/staff meeting, if there were no consulting cases to discuss]
• Survey Sampling: Elucidating the mystery of bootstrap weights and how to use them when analyzing survey data, e.g. from Statistics Canada.
• Evidence-based medicine: Ideas and implications
• Machine Learning, AI, Deep Learning: An Overview
• Disseminating technical information to non-technical audiences: in consulting and in teaching. [Could this be put together with 'consulting issues'?]
• Missing Data
• Statistical Paradoxes and Fallacies [RC: There are some good "summary" articles related to statistical paradoxes and Georges' examples are always helpful]