
3.8 — Polynomial Regression

ECON 480 • Econometrics • Fall 2020

Ryan Safner
Assistant Professor of Economics
safner@hood.edu
ryansafner/metricsF20
metricsF20.classes.ryansafner.com

Linear Regression

  • OLS is commonly known as "linear regression" as it fits a straight line to data points

  • Often, data and relationships between variables may not be linear

  • Get rid of the outliers (>$60,000)

$\widehat{\text{Life Expectancy}}_i = \hat{\beta}_0 + \hat{\beta}_1 \, \text{GDP per capita}_i$

$\widehat{\text{Life Expectancy}}_i = \hat{\beta}_0 + \hat{\beta}_1 \, \text{GDP per capita}_i + \hat{\beta}_2 \, \text{GDP per capita}_i^2$

$\widehat{\text{Life Expectancy}}_i = \hat{\beta}_0 + \hat{\beta}_1 \ln(\text{GDP per capita}_i)$
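
A minimal sketch of how these three fits can be drawn in R over the gapminder data used throughout this lecture; the formula argument to stat_smooth() controls the shape of the fitted line (the colors are arbitrary):

library(gapminder)
library(ggplot2)

ggplot(data = gapminder)+
  aes(x = gdpPercap,
      y = lifeExp)+
  geom_point(color="blue", alpha=0.5)+
  stat_smooth(method = "lm", formula = y ~ x, color="red")+            # linear
  stat_smooth(method = "lm", formula = y ~ x + I(x^2), color="green")+ # quadratic
  stat_smooth(method = "lm", formula = y ~ log(x), color="orange")     # logarithmic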

Nonlinear Effects in Linear Regression

  • Despite being "linear regression", OLS can handle this with an easy fix

  • OLS requires all parameters (i.e. the β's) to be linear; the regressors (X's) can be nonlinear:

$Y_i=\beta_0+\beta_1 X_i^2$ ✓

$Y_i=\beta_0+\beta_1^2 X_i$ ✗

$Y_i=\beta_0+\beta_1 \sqrt{X_i}$ ✓

$Y_i=\beta_0+\sqrt{\beta_1}\, X_i$ ✗

$Y_i=\beta_0+\beta_1 (X_{1i} \times X_{2i})$ ✓

$Y_i=\beta_0+\beta_1 \ln(X_i)$ ✓

  • In the end, each X is always just a number in the data, so OLS can always estimate a parameter for it

  • Plotting the modelled points $(X_i, \hat{Y}_i)$ can result in a curve!
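
A rough sketch of that point in R (using the gapminder data introduced below): a transformed regressor is just another column of numbers, so lm() treats it like any other X:

library(gapminder)
library(dplyr)

gapminder %>%
  mutate(GDP_sq = gdpPercap^2,         # the squared term is just another column
         GDP_ln = log(gdpPercap)) %>%  # so is a logged term
  lm(lifeExp ~ gdpPercap + GDP_sq, data = .)  # ordinary multivariate OLS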

Sources of Nonlinearities

  • The effect of $X_1 \rightarrow Y$ might be nonlinear if:

  1. $X_1 \rightarrow Y$ is different for different levels of $X_1$

    • e.g. diminishing returns: $X_1$ increases $Y$ at a decreasing rate
    • e.g. increasing returns: $X_1$ increases $Y$ at an increasing rate

  2. $X_1 \rightarrow Y$ is different for different levels of $X_2$

    • e.g. interaction effects (last lesson)

Nonlinearities Alter Marginal Effects

  • Linear: $\hat{Y}=\hat{\beta}_0+\hat{\beta}_1 X$

  • Marginal effect (slope) $\hat{\beta}_1=\frac{\Delta Y}{\Delta X}$ is constant for all $X$

Nonlinearities Alter Marginal Effects

  • Polynomial: $\hat{Y}=\hat{\beta}_0+\hat{\beta}_1 X+\hat{\beta}_2 X^2$

  • Marginal effect ("slope") is no longer just $\hat{\beta}_1$; it depends on the value of $X$!

Sources of Nonlinearities III

  • Interaction Effect: $\hat{Y}=\hat{\beta}_0+\hat{\beta}_1 X_1+\hat{\beta}_2 X_2+\hat{\beta}_3 (X_1 \times X_2)$

  • Marginal effect ("slope") depends on the value of $X_2$!

  • Easy example: if $X_2$ is a dummy variable:

    • $X_2=0$ (control) vs. $X_2=1$ (treatment)

Polynomial Functions of X I

  • Linear

$\hat{Y}=\hat{\beta}_0+\hat{\beta}_1 X$

  • Quadratic

$\hat{Y}=\hat{\beta}_0+\hat{\beta}_1 X+\hat{\beta}_2 X^2$

  • Cubic

$\hat{Y}=\hat{\beta}_0+\hat{\beta}_1 X+\hat{\beta}_2 X^2+\hat{\beta}_3 X^3$

  • Quartic

$\hat{Y}=\hat{\beta}_0+\hat{\beta}_1 X+\hat{\beta}_2 X^2+\hat{\beta}_3 X^3+\hat{\beta}_4 X^4$

Polynomial Functions of X II

$\hat{Y}_i=\hat{\beta}_0+\hat{\beta}_1 X_i+\hat{\beta}_2 X_i^2+\cdots+\hat{\beta}_r X_i^r+u_i$

  • Where r is the highest power $X_i$ is raised to

    • quadratic: $r=2$
    • cubic: $r=3$

  • The graph of an r-th-degree polynomial function has $(r-1)$ bends

  • Just another multivariate OLS regression model!
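
A minimal sketch in R: an r-th-degree polynomial is estimated like any other multivariate regression; poly() with raw = TRUE is one shortcut for adding the higher-power terms:

library(gapminder)

# cubic (r = 3): adds X, X^2, X^3 as regressors, equivalent to writing
# lifeExp ~ gdpPercap + I(gdpPercap^2) + I(gdpPercap^3)
lm(lifeExp ~ poly(gdpPercap, 3, raw = TRUE), data = gapminder)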

The Quadratic Model

Quadratic Model

$\hat{Y}_i=\hat{\beta}_0+\hat{\beta}_1 X_i+\hat{\beta}_2 X_i^2$

  • The quadratic model has both an $X$ and an $X^2$ variable in it (yes, you need both!)

  • How to interpret the coefficients (betas)?

    • $\hat{\beta}_0$ as "intercept" and $\hat{\beta}_1$ as "slope" makes no sense 🧐
    • $\hat{\beta}_1$ as the effect of $X_i \rightarrow Y_i$ holding $X_i^2$ constant??

Note: this is not a perfect multicollinearity problem! Correlation only measures linear relationships!

  • Estimate marginal effects by calculating the predicted $\hat{Y}_i$ for different levels of $X_i$ (sketched below)
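
A minimal sketch of that idea in R; quad_reg here mirrors the reg1_alt regression estimated later in these slides:

library(gapminder)
library(dplyr)

gapminder <- gapminder %>% mutate(GDP_t = gdpPercap/1000)  # GDP per capita in $1,000s
quad_reg  <- lm(lifeExp ~ GDP_t + I(GDP_t^2), data = gapminder)

# predicted life expectancy at GDP per capita of $5k, $25k, and $50k
predict(quad_reg, newdata = data.frame(GDP_t = c(5, 25, 50)))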

Quadratic Model: Calculating Marginal Effects

$\hat{Y}_i=\hat{\beta}_0+\hat{\beta}_1 X_i+\hat{\beta}_2 X_i^2$

  • What is the marginal effect of $\Delta X_i \rightarrow \Delta Y_i$?

  • Take the derivative of $Y_i$ with respect to $X_i$: $\frac{\partial Y_i}{\partial X_i}=\hat{\beta}_1+2\hat{\beta}_2 X_i$

  • The marginal effect of a 1 unit change in $X_i$ is a $(\hat{\beta}_1+2\hat{\beta}_2 X_i)$ unit change in $Y$
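
A quick sketch of computing this from estimated coefficients; reg1 is the quadratic gapminder regression estimated in the example that follows:

b <- coef(reg1)                              # (Intercept), GDP_t, GDP_sq
marg_effect <- function(x) b[2] + 2*b[3]*x   # beta1_hat + 2*beta2_hat*X
marg_effect(c(5, 25, 50))                    # marginal effects at X = 5, 25, 50 ($ thousands)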

Quadratic Model: Example I

Example: $\widehat{\text{Life Expectancy}}_i=\hat{\beta}_0+\hat{\beta}_1 \, \text{GDP per capita}_i+\hat{\beta}_2 \, \text{GDP per capita}_i^2$

  • Use the gapminder package and data

library(gapminder)

Quadratic Model: Example II

  • These coefficients will be very large, so let's transform gdpPercap to be in $1,000's

gapminder <- gapminder %>%
  mutate(GDP_t = gdpPercap/1000)
gapminder %>% head() # look at it

  country     continent  year  lifeExp       pop  gdpPercap      GDP_t
  Afghanistan Asia       1952   28.801   8425333   779.4453  0.7794453
  Afghanistan Asia       1957   30.332   9240934   820.8530  0.8208530
  Afghanistan Asia       1962   31.997  10267083   853.1007  0.8531007
  Afghanistan Asia       1967   34.020  11537966   836.1971  0.8361971
  Afghanistan Asia       1972   36.088  13079460   739.9811  0.7399811
  Afghanistan Asia       1977   38.438  14880372   786.1134  0.7861134

Quadratic Model: Example III

  • Let's also create a squared term, GDP_sq

gapminder <- gapminder %>%
  mutate(GDP_sq = GDP_t^2)
gapminder %>% head() # look at it

  (Same first six rows as above, now with a GDP_sq column added to the data.)

Quadratic Model: Example IV

  • Can "manually" run a multivariate regression with GDP_t and GDP_sq

library(broom)
reg1 <- lm(lifeExp ~ GDP_t + GDP_sq, data = gapminder)
reg1 %>% tidy()

  term         estimate     std.error     statistic   p.value
  (Intercept)  50.52400578  0.2978134673  169.64984   0.000000e+00
  GDP_t         1.55099112  0.0373734945   41.49976   1.292863e-260
  GDP_sq       -0.01501927  0.0005794139  -25.92149   3.935809e-125

Quadratic Model: Example V

  • OR use GDP_t and add the "transform" command in the regression, I(GDP_t^2)

reg1_alt <- lm(lifeExp ~ GDP_t + I(GDP_t^2), data = gapminder)
reg1_alt %>% tidy()

  term         estimate     std.error     statistic   p.value
  (Intercept)  50.52400578  0.2978134673  169.64984   0.000000e+00
  GDP_t         1.55099112  0.0373734945   41.49976   1.292863e-260
  I(GDP_t^2)   -0.01501927  0.0005794139  -25.92149   3.935809e-125

Quadratic Model: Example VI

  term         estimate     std.error     statistic   p.value
  (Intercept)  50.52400578  0.2978134673  169.64984   0.000000e+00
  GDP_t         1.55099112  0.0373734945   41.49976   1.292863e-260
  GDP_sq       -0.01501927  0.0005794139  -25.92149   3.935809e-125

$\widehat{\text{Life Expectancy}}_i=50.52+1.55\,\text{GDP per capita}_i-0.02\,\text{GDP per capita}_i^2$

  • Positive effect $(\hat{\beta}_1>0)$, with diminishing returns $(\hat{\beta}_2<0)$

  • The effect on Life Expectancy of increasing GDP depends on the initial value of GDP!

Quadratic Model: Example VII

  term         estimate     std.error     statistic   p.value
  (Intercept)  50.52400578  0.2978134673  169.64984   0.000000e+00
  GDP_t         1.55099112  0.0373734945   41.49976   1.292863e-260
  GDP_sq       -0.01501927  0.0005794139  -25.92149   3.935809e-125

  • Marginal effect of GDP per capita on Life Expectancy:

$\frac{\partial Y}{\partial X}=\hat{\beta}_1+2\hat{\beta}_2 X_i$

$\frac{\partial \, \text{Life Expectancy}}{\partial \, \text{GDP}} \approx 1.55+2(-0.02)\,\text{GDP} \approx 1.55-0.04\,\text{GDP}$

Quadratic Model: Example VIII

$\frac{\partial \, \text{Life Expectancy}}{\partial \, \text{GDP}}=1.55-0.04\,\text{GDP}$

Marginal effect of GDP if GDP $=5$ ($ thousand):

$\frac{\partial \, \text{Life Expectancy}}{\partial \, \text{GDP}}=1.55-0.04(5)=1.55-0.20=1.35$

  • i.e. for every additional $1 (thousand) in GDP per capita, average life expectancy increases by 1.35 years

Quadratic Model: Example IX

$\frac{\partial \, \text{Life Expectancy}}{\partial \, \text{GDP}}=1.55-0.04\,\text{GDP}$

Marginal effect of GDP if GDP $=25$ ($ thousand):

$\frac{\partial \, \text{Life Expectancy}}{\partial \, \text{GDP}}=1.55-0.04(25)=1.55-1.00=0.55$

  • i.e. for every additional $1 (thousand) in GDP per capita, average life expectancy increases by 0.55 years

Quadratic Model: Example X

$\frac{\partial \, \text{Life Expectancy}}{\partial \, \text{GDP}}=1.55-0.04\,\text{GDP}$

Marginal effect of GDP if GDP $=50$ ($ thousand):

$\frac{\partial \, \text{Life Expectancy}}{\partial \, \text{GDP}}=1.55-0.04(50)=1.55-2.00=-0.45$

  • i.e. for every additional $1 (thousand) in GDP per capita, average life expectancy decreases by 0.45 years

Quadratic Model: Example XI

$\widehat{\text{Life Expectancy}}_i=50.52+1.55\,\text{GDP per capita}_i-0.02\,\text{GDP per capita}_i^2$

$\frac{\partial \, \text{Life Expectancy}}{\partial \, \text{GDP}}=1.55-0.04\,\text{GDP}$

  Initial GDP per capita   Marginal Effect
  $5,000                    1.35 years
  $25,000                   0.55 years
  $50,000                  -0.45 years

Marginal effect of +$1,000 GDP/capita on Life Expectancy.
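
The table can be reproduced in R from the fitted coefficients (the slides use the rounded 1.55 and -0.02; the unrounded estimates from reg1 shift the numbers somewhat):

b <- coef(reg1)
b[2] + 2*b[3]*c(5, 25, 50)     # marginal effects using the unrounded estimates
1.55 + 2*(-0.02)*c(5, 25, 50)  # the rounded version from the table: 1.35, 0.55, -0.45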

Quadratic Model: Example XII

ggplot(data = gapminder)+
  aes(x = GDP_t,
      y = lifeExp)+
  geom_point(color="blue", alpha=0.5)+
  stat_smooth(method = "lm",
              formula = y ~ x + I(x^2),
              color="green")+
  geom_vline(xintercept=c(5,25,50),
             linetype="dashed",
             color="red", size = 1)+
  scale_x_continuous(labels=scales::dollar,
                     breaks=seq(0,120,10))+
  scale_y_continuous(breaks=seq(0,100,10),
                     limits=c(0,100))+
  labs(x = "GDP per Capita (in Thousands)",
       y = "Life Expectancy (Years)")+
  ggthemes::theme_pander(base_family = "Fira Sans Condensed",
                         base_size=16)

The Quadratic Model: Maxima and Minima

Quadratic Model: Maxima and Minima I

  • For a polynomial model, we can also find the predicted maximum or minimum of $\hat{Y}_i$

  • A quadratic model has a single global maximum or minimum (1 bend)

  • By calculus, a minimum or maximum occurs where:

$\frac{\partial Y_i}{\partial X_i}=0$

$\beta_1+2\beta_2 X_i=0$

$2\beta_2 X_i=-\beta_1$

$X_i=-\frac{\beta_1}{2\beta_2}$

Quadratic Model: Maxima and Minima II

  term         estimate     std.error
  (Intercept)  50.52400578  0.2978134673
  GDP_t         1.55099112  0.0373734945
  GDP_sq       -0.01501927  0.0005794139

$\text{GDP}_i=-\frac{\beta_1}{2\beta_2}=-\frac{(1.55)}{2(-0.015)}\approx 51.67$
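
The same maximum can be computed directly from the (unrounded) fitted coefficients of reg1:

b <- coef(reg1)
-b[2] / (2*b[3])   # life-expectancy-maximizing GDP per capita, roughly $51.6 thousand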

Quadratic Model: Maxima and Minima III

ggplot(data = gapminder)+
  aes(x = GDP_t,
      y = lifeExp)+
  geom_point(color="blue", alpha=0.5)+
  stat_smooth(method = "lm",
              formula = y ~ x + I(x^2),
              color="green")+
  geom_vline(xintercept=51.67, linetype="dashed", color="red", size = 1)+
  geom_label(x=51.67, y=90, label="$51.67", color="red")+
  scale_x_continuous(labels=scales::dollar,
                     breaks=seq(0,120,10))+
  scale_y_continuous(breaks=seq(0,100,10),
                     limits=c(0,100))+
  labs(x = "GDP per Capita (in Thousands)",
       y = "Life Expectancy (Years)")+
  ggthemes::theme_pander(base_family = "Fira Sans Condensed",
                         base_size=16)

Are Polynomials Necessary?

Determining Polynomials are Necessary I

  term         estimate     std.error     statistic   p.value
  (Intercept)  50.52400578  0.2978134673  169.64984   0.000000e+00
  GDP_t         1.55099112  0.0373734945   41.49976   1.292863e-260
  GDP_sq       -0.01501927  0.0005794139  -25.92149   3.935809e-125

  • Is the quadratic term necessary?

  • Determine if $\hat{\beta}_2$ (on $X_i^2$) is statistically significant:

    • $H_0: \hat{\beta}_2=0$
    • $H_a: \hat{\beta}_2 \neq 0$

  • Statistically significant $\implies$ we should keep the quadratic model

    • If we only ran a linear model, it would be incorrect!

Determining Polynomials are Necessary II

  • Should we keep going up in polynomials?

$\widehat{\text{Life Expectancy}}_i=\hat{\beta}_0+\hat{\beta}_1 \text{GDP}_i+\hat{\beta}_2 \text{GDP}_i^2+\hat{\beta}_3 \text{GDP}_i^3$
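
A sketch of estimating that cubic specification in R (GDP_t and broom's tidy() are defined earlier in these slides; reg_cubic is a new name used here for illustration):

reg_cubic <- lm(lifeExp ~ GDP_t + I(GDP_t^2) + I(GDP_t^3), data = gapminder)
reg_cubic %>% tidy()  # check whether the coefficient on I(GDP_t^3) is significant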

Determining Polynomials are Necessary III

  • In general, you should have a compelling theoretical reason why data or relationships should “change direction” multiple times

  • Or clear data patterns that have multiple “bends”

A Second Polynomial Example I

Example: How does a school district's average income affect test scores?

$\widehat{\text{Test Score}}_i=\hat{\beta}_0+\hat{\beta}_1 \, \text{Income}_i$

$\widehat{\text{Test Score}}_i=\hat{\beta}_0+\hat{\beta}_1 \, \text{Income}_i+\hat{\beta}_2 \, \text{Income}_i^2$

A Second Polynomial Example II

  term          estimate       std.error    statistic
  (Intercept)   607.30173501   3.046219282  199.362449
  avginc          3.85099474   0.304261693   12.656850
  I(avginc^2)    -0.04230846   0.006260061   -6.758474

A Second Polynomial Example III

  term           estimate       std.error
  (Intercept)    6.000790e+02   5.8295880342
  avginc         5.018677e+00   0.8594537744
  I(avginc^2)   -9.580515e-02   0.0373591998
  I(avginc^3)    6.854842e-04   0.0004719549

  • Should we keep going? (One way to check is sketched below.)
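
A sketch of an F-test comparing the nested quadratic and cubic models via anova(); the data frame name ca_schools and the outcome variable testscr are placeholders for the test score data used in this example:

reg_quad  <- lm(testscr ~ avginc + I(avginc^2), data = ca_schools)
reg_cubic <- lm(testscr ~ avginc + I(avginc^2) + I(avginc^3), data = ca_schools)
anova(reg_quad, reg_cubic)  # F-test: does the cubic term add explanatory power?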

Strategy for Polynomial Model Specification

  1. Are there good theoretical reasons for relationships changing (e.g. increasing/decreasing returns)?

  2. Plot your data: does a straight line fit well enough?

  3. Specify a polynomial function of a higher power (start with 2) and estimate OLS regression

  4. Use t-test to determine if higher-power term is significant

  5. Interpret effect of change in X on Y

  6. Repeat steps 3-5 as necessary
