Commit bb86ffaf, authored 6 years ago by schmittu

Merge branch 'master' of sissource.ethz.ch:sispub/machinelearning-introduction-workshop

Parents: cef8ffaf, 3e89156c
Pipeline #400: canceled (stages: build, test)
Showing 1 changed file: content.md (+12 −13)
@@ -7,7 +7,7 @@
# Course structure

- Two-day workshop: 1.5 days of workshop material + 0.5 day working on own data / prepared data.
- Every part below includes a coding session using Jupyter notebooks.
- Coding sessions provide code frames which should be completed.
- We provide solutions.
@@ -16,7 +16,7 @@
## Part 0: Preparation

- Quick basics: matplotlib, numpy, pandas?

### Coding session
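The Part 0 basics above could be warmed up with a minimal sketch like this (the column names and values are made up for illustration):

```python
import numpy as np
import pandas as pd

# numpy: vectorized arithmetic on arrays
x = np.arange(5)             # array([0, 1, 2, 3, 4])
y = x ** 2                   # element-wise square

# pandas: tabular data with labeled columns (hypothetical beer-style data)
df = pd.DataFrame({"alcohol": [4.8, 5.2, 6.1], "bitterness": [20, 35, 60]})
print(df.describe())         # summary statistics per column
print(df["alcohol"].mean())  # column access works like a dict
```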
@@ -59,7 +59,7 @@
## Part 3: accuracy, F1, ROC, ...

Intention: accuracy is useful but has pitfalls

- how to measure accuracy?
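The accuracy pitfall above can be shown with a tiny hand-computed sketch (the labels are made up for illustration): on imbalanced data, a "classifier" that always predicts the majority class scores high accuracy but zero F1.

```python
# Accuracy pitfall on imbalanced data: always predicting the majority
# class (0) looks good by accuracy but is useless in practice.
# (Toy labels made up for illustration.)
y_true = [0] * 90 + [1] * 10   # 90% negatives, 10% positives
y_pred = [0] * 100             # always predict the majority class

accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
precision = tp / (tp + fp) if tp + fp else 0.0
recall = tp / (tp + fn) if tp + fn else 0.0
f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0

print(accuracy)  # 0.9 -- looks good
print(f1)        # 0.0 -- reveals that no positive is ever found
```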
@@ -83,9 +83,8 @@ classifiers / regressors have parameters / degrees of freedom.
- overfitting:
- features have actual noise, or not enough information: orchid example in 2d, elevate to 3d using another feature.
- polynomial of degree 5 to fit points on a line + noise
- points in a circle: draw very exact boundary line
- how to check underfitting / overfitting?
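The "polynomial of degree 5 on a noisy line" example above can be sketched with numpy alone (the data and noise level here are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 10)
y = 2 * x + 1 + rng.normal(scale=0.1, size=x.size)  # points on a line + noise

# Fit a line (the true model) and a degree-5 polynomial (overfits the noise).
line = np.polyval(np.polyfit(x, y, 1), x)
poly5 = np.polyval(np.polyfit(x, y, 5), x)

err_line = np.mean((y - line) ** 2)
err_poly5 = np.mean((y - poly5) ** 2)

# The degree-5 fit has a smaller *training* error -- it chases the noise --
# but it generalizes worse between and outside the training points.
print(err_line, err_poly5)
```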
@@ -97,14 +96,14 @@ classifiers / regressors have parameters / degrees of freedom.
### Coding session:

- How to do cross validation with scikit-learn
- run cross validation on classifier for beer data

## Part 5: pipelines / parameter tuning with scikit-learn

- Scikit-learn API incl. summary of what we have seen up to now.
- pipelines, preprocessing (scaler, PCA)
- cross validation
- Hyperparameter tuning: grid search / random search.

### Coding session
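The Part 5 workflow above (pipeline with scaler and PCA, cross-validated grid search) could be sketched like this; the built-in iris data stands in for the workshop's beer data set, which is not available here, and the parameter grid is arbitrary:

```python
# Sketch of the scikit-learn pipeline + grid search workflow.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

pipeline = Pipeline([
    ("scaler", StandardScaler()),   # preprocessing: zero mean, unit variance
    ("pca", PCA(n_components=2)),   # preprocessing: dimensionality reduction
    ("clf", SVC()),                 # final estimator
])

# Grid search tunes hyperparameters of any pipeline step via "step__param".
grid = GridSearchCV(
    pipeline,
    param_grid={"clf__C": [0.1, 1, 10], "clf__gamma": [0.01, 0.1, 1]},
    cv=5,                           # 5-fold cross validation
)
grid.fit(X, y)
print(grid.best_params_, round(grid.best_score_, 3))
```

Note how the same pipeline object is fitted and tuned as a single estimator, which keeps the preprocessing inside each cross-validation fold.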
@@ -116,7 +115,7 @@ classifiers / regressors have parameters / degrees of freedom.
## Part 6: Overview classifiers

- Nearest neighbours
- SVMs
- demo for RBF: influence of different parameters on the decision line
- Random forests
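The three classifier families listed above can be compared on a toy 2d problem; `make_moons` stands in here for the workshop's prepared 2d examples, and the hyperparameters are arbitrary defaults:

```python
# Cross-validated comparison of the classifiers from Part 6 on 2d data.
from sklearn.datasets import make_moons
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

X, y = make_moons(n_samples=200, noise=0.2, random_state=0)

classifiers = {
    "nearest neighbours": KNeighborsClassifier(n_neighbors=5),
    "SVM (RBF kernel)": SVC(kernel="rbf", C=1.0, gamma=1.0),
    "random forest": RandomForestClassifier(n_estimators=100, random_state=0),
}

scores = {}
for name, clf in classifiers.items():
    scores[name] = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: {scores[name]:.2f}")
```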
@@ -125,7 +124,7 @@ classifiers / regressors have parameters / degrees of freedom.
### Coding session

- Prepare examples for 2d classification problems incl. visualization of different decision surfaces.
- Play with different classifiers on beer data
@@ -146,7 +145,7 @@ Introduce movie data set, learn SVR or other regressor on this data set.
- Overview of the field
- Introduction to feed forward neural networks
- Demo Keras

### Coding Session
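The Keras demo lives in the workshop notebooks; the forward pass of a small feed forward network, which that part introduces, can be sketched in plain numpy (layer sizes and random weights are arbitrary here):

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(0.0, z)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))  # shift for numerical stability
    return e / e.sum(axis=1, keepdims=True)

# One hidden layer: 4 inputs -> 8 hidden units -> 3 output classes.
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 3)), np.zeros(3)

def forward(X):
    hidden = relu(X @ W1 + b1)        # affine transform + nonlinearity
    return softmax(hidden @ W2 + b2)  # rows are class probabilities

X = rng.normal(size=(5, 4))           # batch of 5 made-up samples
probs = forward(X)
print(probs.shape)                    # (5, 3)
```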
@@ -156,7 +155,7 @@ Introduce movie data set, learn SVR or other regressor on this data set.
## Workshop

- assist with setting up the workshop material on attendees' own computers.
- provide example problems if attendees don't bring own data.