![Stat Courses](/img/default-banner.jpg)
- 53
- 2,624,908
Stat Courses
United States
Joined Jul 3, 2012
StatCourses is dedicated to teaching concepts in statistics and data science in a concise and relatable manner.
Indexing: R vs Python
In this video, we show how indexing works for both R vectors and Python's numpy arrays, covering the two languages side by side.
Indexing allows you to select a subset of elements in a given object containing a sequence of data elements. You can index a vector in R and a numpy array in Python by using square brackets [] and specifying the indices of the elements you want to extract.
While square brackets are used for both Python and R indexing, how you specify index values within square brackets differs between the two languages.
For example, x[1] accesses the first element of an R vector x (R uses one-based indexing), while y[0] accesses the first element of a numpy array y in Python (Python uses zero-based indexing).
In Python, negative indexing can be used to access the elements of a numpy array from the end. For example, the last element has an index of -1, the second-to-last element has an index of -2, and so on.
Negative indexing behaves differently in R: it excludes elements rather than counting from the end as in Python. For example, x[-1] excludes the first element of x and returns the rest, and x[-2] returns all elements of vector x except the second.
#rstats #Python #numpy
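The contrast described above can be sketched in a short, runnable example. The numpy part runs as written; the equivalent R statements are shown as comments (the array values are made up for illustration):

```python
import numpy as np

# Python/numpy: zero-based indexing; negative indices count from the end.
y = np.array([10, 20, 30, 40])
print(y[0])    # first element
print(y[-1])   # last element
print(y[-2])   # second-to-last element

# R equivalent (as comments): one-based indexing, and negative
# indices EXCLUDE elements instead of counting from the end.
#   x <- c(10, 20, 30, 40)
#   x[1]    # 10          (first element)
#   x[-1]   # 20 30 40    (everything except the first element)
#   x[-2]   # 10 30 40    (everything except the second element)
```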
300 views
Video
Knowledge Building - The Bayesian Way
1.1K views · 1 year ago
The Bayesian knowledge-building process explained using the popular puzzle game Wordle as an example. We, humans, operate as Bayesians! We build on our existing knowledge. What we know as of now gets updated when we receive more information or data. In this video we introduce the Bayesian knowledge-building process using an example: a Wordle game. Prior knowledge guides us in making decisions ...
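The update step this video describes can be illustrated with a toy calculation. All words and probabilities below are made-up numbers, not taken from the video: a prior over candidate words is multiplied by an assumed likelihood of the observed clue and renormalized.

```python
# Hypothetical prior beliefs over candidate answers.
priors = {"crane": 0.40, "crate": 0.35, "crazy": 0.25}

# Assumed likelihood P(clue | word); here the clue rules out "crazy".
likelihood = {"crane": 0.9, "crate": 0.6, "crazy": 0.0}

# Bayes' rule: posterior ∝ prior × likelihood, then normalize.
unnormalized = {w: priors[w] * likelihood[w] for w in priors}
z = sum(unnormalized.values())
posterior = {w: p / z for w, p in unnormalized.items()}

print(posterior)  # "crazy" is eliminated; belief shifts toward "crane"
```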
Covariance - Explained
2.1K views · 1 year ago
This educational video delves into how you quantify a linear statistical relationship between two variables using covariance! #statistics #probability #SoME2 This video gives a visual and intuitive introduction to covariance, one of the ways we measure a linear statistical relationship. Covariance values above zero indicate a positive linear relationship; negative covariance indicates a negative li...
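The definition of sample covariance can be checked numerically. The data below (hours studied vs. exam score) are hypothetical numbers chosen for illustration:

```python
import numpy as np

# Made-up data: hours studied and exam scores for five students.
hours = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
score = np.array([52.0, 60.0, 61.0, 75.0, 82.0])

# Sample covariance from the definition: sum of products of
# deviations from the means, divided by n - 1.
cov_manual = np.sum((hours - hours.mean()) * (score - score.mean())) / (len(hours) - 1)

# np.cov returns the 2x2 covariance matrix; the off-diagonal
# entry is Cov(hours, score).
cov_numpy = np.cov(hours, score)[0, 1]

print(cov_manual, cov_numpy)  # positive -> positive linear relationship
```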
R packages: installing, loading, using and updating R packages
17K views · 2 years ago
This video is all about R packages: what R packages are, installing R packages, loading R packages into your current R session, using R packages, updating installed R packages, and uninstalling R packages.
Joint Probability Density Functions
7K views · 2 years ago
This video gives an intuitive explanation of the joint probability density function of two continuous random variables. We will mainly focus on understanding what bivariate joint probability density functions are and then extend the concept to multivariate joint probability density functions. To aid the understanding of the concepts, we will plot three-dimensional histograms for a pair of rando...
Probability Density Functions
3K views · 2 years ago
Probability density functions explained in an intuitive way using an example! This video is an intuitive introduction to the probability density function of a univariate continuous random variable. For a continuous random variable, the #probability density function at a specific point can be thought of as the relative likelihood of observing values in a very small interval containing t...
Joint Probability Mass Function Example
2.9K views · 2 years ago
An example of the joint probability mass function (joint PMF) of two random variables. We learn about joint probability mass functions (joint PMFs) by exploring these two discrete random variables jointly: number of goals scored by the home team in a football/soccer game, and the number of goals scored by the away/traveling team. This lesson also showcases how you can use a three-dimensional pl...
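A joint PMF like the one this video explores can be represented as a small table of probabilities. The matrix below is a made-up joint PMF for (home goals, away goals), not the one from the video, chosen only to show how marginals and event probabilities fall out of it:

```python
import numpy as np

# Hypothetical joint PMF: rows = home goals (0-2), columns = away
# goals (0-2); all entries are nonnegative and sum to 1.
pmf = np.array([
    [0.10, 0.08, 0.04],
    [0.18, 0.15, 0.07],
    [0.16, 0.14, 0.08],
])

# Marginal PMFs: sum out the other variable.
p_home = pmf.sum(axis=1)  # P(home goals = 0), P(= 1), P(= 2)
p_away = pmf.sum(axis=0)  # P(away goals = 0), P(= 1), P(= 2)

# P(home team wins) = sum of cells where home goals > away goals.
p_home_win = sum(pmf[i, j] for i in range(3) for j in range(3) if i > j)

print(p_home, p_away, p_home_win)
```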
Download and Install R and RStudio for Windows
6K views · 3 years ago
Learn how to install R on a Windows computer. We will also install the open-source edition of RStudio, a powerful IDE (integrated development environment) for R and Python. Link for downloading R #RStat: download R from the CRAN server nearest to you using the link cloud.r-project.org Link for downloading #RStudio: www.rstudio.com/products/rstudio/download/#download About the #Rprogramming language ...
Joint Probability Distribution Of Discrete Random Variables
51K views · 3 years ago
Introductory video for joint probability distribution of two discrete random variables (and probability mass function of discrete random vectors in general). - We introduce joint distribution of two discrete random variables using examples (example 1 - 0:28 and example 2 - 5:08) - We then formally define the joint probability mass function - We lay out the conditions joint probability mass func...
Introduction to Multivariate Probability Distributions
20K views · 3 years ago
In this video, you learn why we study multivariate distributions. For example, we get more insight when we look at the number of reported COVID-19 cases and deaths by age (or another variable such as pre-existing medical conditions). In another example, we show the relationship between marital status and annual automobile insurance claim amount.
Chebyshev’s Inequality
6K views · 3 years ago
In this video you will learn about Chebyshev’s inequality using examples, prove Chebyshev’s inequality by utilizing Markov’s inequality, and learn three ways of writing Chebyshev’s inequality. In addition, we describe the scenario for which Chebyshev’s inequality is tight. Related video link: Markov’s inequality lesson: czcams.com/video/apLNpPQENus/video.html
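The bound described above, P(|X − μ| ≥ kσ) ≤ 1/k², can be checked empirically. This is a minimal sketch using an exponential distribution as an arbitrary test case (the distribution choice is mine, not the video's):

```python
import numpy as np

# Draw a large sample from an exponential distribution (mean 1, sd 1).
rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=100_000)

mu, sigma = x.mean(), x.std()
for k in (2, 3):
    # Empirical probability of landing at least k standard
    # deviations from the mean, vs. Chebyshev's bound 1/k^2.
    empirical = np.mean(np.abs(x - mu) >= k * sigma)
    bound = 1 / k**2
    print(f"k={k}: empirical={empirical:.4f}  Chebyshev bound={bound:.4f}")
```

For well-behaved distributions the empirical tail probability is typically far below the bound, which is why Chebyshev's inequality is described as loose except in the tight case the video discusses.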
Markov's Inequality - Intuitively and visually explained
8K views · 3 years ago
Markov's Inequality - Intuitively and visually explained
Poisson Process and Gamma Distribution
36K views · 6 years ago
Poisson Process and Gamma Distribution
Univariate transformation of a random variable
51K views · 10 years ago
Univariate transformation of a random variable
Lesson 19 Hypergeometric Distribution - Introduction
55K views · 11 years ago
Lesson 19 Hypergeometric Distribution - Introduction
Lesson 18: Negative Binomial distribution Part II
42K views · 11 years ago
Lesson 18: Negative Binomial distribution Part II
Lesson 17: Geometric Distribution part II
33K views · 11 years ago
Lesson 17: Geometric Distribution part II
Lesson 17: Geometric Distribution Part 1
61K views · 11 years ago
Lesson 17: Geometric Distribution Part 1
Lesson 18: Negative Binomial Distribution - Part 1
48K views · 11 years ago
Lesson 18: Negative Binomial Distribution - Part 1
Lesson 16 Binomial Distribution Part 2
27K views · 11 years ago
Lesson 16 Binomial Distribution Part 2
Lesson 16 Bernoulli and Binomial Distribution Part 1
51K views · 11 years ago
Lesson 16 Bernoulli and Binomial Distribution Part 1
Thank you ❤.
very nicely explained.
Ty
Thank you very much for your help!
ZAAAAAAAAAMN
HUGE SHOUTOUT TO YOU! HELPED ME ON THIS ONE!!!!!!!!!!!!!!!!!
Amazing video! Thank you.
ty sir
amazing video - this series has helped me so much!!🇿🇦
Thanks very much ✊🙏
finally someone taught me how to integrate the normal bell curve, really thanks with all the guided calculations
Thank you for sharing
For the probability of the home team winning with atleast one goal = 0.4755 And For the probability of the away team winning with atleast one goal = 0.3372 I definitely think I got the right one here💫
I love you
i got 0.2912 is that right?
no its 0.4756
wow very nice
thankyou. very clear explanation
super
so helpful, thank you 🙏🏼 😊
Is the reason that we can simply state that Q=1 rather than +-1 because e raised to a power never creates a negative number?
Thank you so much!!!!!
Tysm sir for this video, it was really helpful.
Vid: 🎶🎶🎶🎶🎶🎶 My brain: 📈📈📈
John has six 1-rand coins, three 2-rand coins, and two 5-rand coins, and wishes to buy an item that costs 10 rand from a vending machine. In how many distinct ways may he correctly insert a total of 10 rand into the vending machine to buy the item (given that the machine does not give change)?
BRILLIANT!
Good work, Sir 👏
Can't wrap my head around the overall volume being the area of one of the equally partitioned regions :)
Thank you for sharing
thank you sir
1. I used a longer way but I still want to share it On-time = 0.40 Satisfactory = 0.50 Neither = 0.25 Let On-time as A and Satisfactory as B and Neither as the complement of AUB ((AUB)'). We will use this Identity of the Sample Space: P(S) = P(AUB) + P((AUB)') By using the identity of the union of A and B, P(S) = P(A) + P(B) - P(A∩B) + P((AUB)') We can now plug-in the three given values, P(S) = 0.40 + 0,50 - P(A∩B) + 0.25 Since P(S) = 1, 1 = 0.40 + 0,50 - P(A∩B) + 0.25 Finally, P(A∩B) = 0.40 + 0,50 + 0.25 -1 P(A∩B) = 0.15
I found the same answer to the last question through a more convoluted way. P(CUB/F')=P(CUB)P(F'/CUB)/Law of total probability for F' well I figured C and B were disjoint so P(CUB)= P(C)+P(B), but the hard part was finding P(F'/CUB). P(F'/CUB) is the probability that the component didn't fail out of the class C or B components. given earlier P(CUB)= P(C)+P(B)= .12+.18= .3 If .9 of Components B worked and .82 of components C worked (given by P(F'/B)=1-P(F/B) and P(F'/C)=1-P(F/B)) and they are disjoint one could take the average of the working components in class B and C. Class C and B take up .3 of the sample space, and class B takes up .18 of that .3, while C takes up .12 of that .3. So class B is .6 from sample BUC (18/30) and class C is .4 of BUC (12/30). Taking a weighted average of (.9x.6)+(.82x.4) finds the average probability that Components didn't fail out of class C or B. P(F'/CUB)= (.9x.6)+(.82x.4) = .868. Plugging that back in to the original equation finds 0.2751
This is a true beauty. Thank you and may this video live from 0 to infinity!
i swear this is a stupid video
Great videos but i am getting annoying commercials every 2 minutes.
thankyou
Thank you very much
Great expectation 👍
Thanks! Very clear and concise. Really appreciate it!
but how can I install a package that I downloaded on my computer? I've got R version 4.2.2 and R studio and my teacher had send me the package "rmf" to install but I don't know how to install it
Thank you for starting with an example
LIFE SAVER
Tqs
Where are the answers to the last 2 exercise questions?
First time I understand covariance, thanks.
Awesome 😊🔥👌👍🔥❤️❤️
ggplot2 is not available for 4.2.2 version of r, so what we have to do?,plzz solve my problem 😢
I recommend u 4.3.1
👏👏
This is incredibly clear, thank you!
In the formula of P(A U B U C), it should be +2P(A∩B∩C) right?