  • 00:00

    Today, I'm going to talk to you about quantum computing applications and

  • 00:04

    machine learning. This is a very exciting area of quantum computing research, and lots of classical

  • 00:11

    machine learning developers are understandably excited about the potential applications within

  • 00:17

    their own field. So to get started, let's talk about a classical machine learning problem that

  • 00:23

    is a very common one: linear classification. So if we start with two sets of data that we want to

  • 00:31

    classify into two separate categories, let's draw them here. We're just going to have three dots

  • 00:38

    and three crosses, all on a single linear plane here. If the data is arranged like this, it can

  • 00:48

    be pretty easy to classify it into two discrete groups. We can draw a single line in the middle

  • 00:53

    here, and now we've classified them. But this can be a lot harder if our data is more complex.

  • 01:01

    For example, if our data is arranged like this, perhaps with the crosses in the middle, there

  • 01:08

    isn't a single line that we can draw on this plane to classify the data into two discrete groups.

  • 01:18

    So in order to solve this problem and classify this data, what we need to do is map

  • 01:23

    this data into a higher dimensional space, which we're going to call a feature space.

  • 01:36

    Say we've mapped the data, for example, like this.

  • 01:43

    We can now see that, because we've mapped this data into a higher dimensional space, there is now a

  • 01:49

    much easier way to classify it. So how do we do this step of transferring our data,

  • 01:57

    mapping it into a high dimensional feature space? To do this, we can use kernel functions.
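
The lifting step described above can be sketched with a small classical example. This is an illustration added alongside the transcript, not from the video itself; the data points and the feature map x → (x, x²) are chosen just to mirror the dots-and-crosses picture.

```python
# Toy sketch (not from the video): crosses sitting between the dots on a
# 1-D line cannot be split off by a single threshold, but lifting each
# point with the feature map x -> (x, x^2) makes them separable in 2-D.
dots = [-3.0, -2.5, 2.5, 3.0]   # one class
crosses = [-0.5, 0.0, 0.5]      # the other class, in the middle

def feature_map(x):
    """Map a 1-D point into a 2-D feature space."""
    return (x, x * x)

# The classes interleave on the original line, so no single cut works...
mixed = min(dots) < max(crosses) and max(dots) > min(crosses)

# ...but in feature space the second coordinate does the job:
# every cross has x^2 < 2 and every dot has x^2 > 2.
separable = (all(feature_map(x)[1] < 2 for x in crosses)
             and all(feature_map(x)[1] > 2 for x in dots))

print(mixed, separable)  # True True
```

In feature space the horizontal line y = 2 plays the role of the single separating line from the video's first example.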

  • 02:08

    Kernel functions work by taking some underlying features of the original dataset and using them to

  • 02:15

    map those data points into this high dimensional feature space. Kernel functions are incredibly

  • 02:21

    powerful and incredibly versatile, but they do face problems. Sometimes they just get poor

  • 02:27

    results, and the compute runtime can explode as the complexity of the dataset increases.
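
A small classical sketch (added here for illustration, not part of the video) of what a kernel function does: it returns the inner product of two points *after* the feature map, without ever constructing the high dimensional vectors. The quadratic map phi below is a standard textbook choice, not anything specific to the video.

```python
import math

def phi(x):
    # explicit quadratic feature map: 2-D -> 3-D
    return (x[0] ** 2, math.sqrt(2) * x[0] * x[1], x[1] ** 2)

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def kernel(x, y):
    # k(x, y) = (x . y)^2 equals dot(phi(x), phi(y)), but is computed
    # entirely in the original 2-D space -- the "kernel trick"
    return dot(x, y) ** 2

x, y = (1.0, 2.0), (3.0, 0.5)
print(math.isclose(kernel(x, y), dot(phi(x), phi(y))))  # True
```

This shortcut is also why runtime can explode for complex data: richer feature spaces need more expensive kernels evaluated over every pair of points.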

  • 02:36

    If you're an experienced machine learning developer, perhaps you've seen this already:

  • 02:41

    if you're dealing with data that has very strong correlations, or perhaps if you're

  • 02:45

    dealing with time series forecasting where the data is very complex and at a high frequency.

  • 02:53

    But quantum computers have the potential to provide an advantage in this space.

  • 03:01

    They can be useful because quantum computers can access much more complex

  • 03:08

    and higher dimensional feature spaces than their classical counterparts can.

  • 03:14

    And they can do this because, with quantum computers, we can encode our data into quantum circuits,

  • 03:20

    and the resulting kernel functions could be very difficult or even impossible to replicate on a

  • 03:27

    classical machine. As well as this, those kernel functions can also perform better.

  • 03:33

    In 2021, IBM researchers actually proved that quantum kernels can provide an exponential

  • 03:42

    speedup over their classical counterparts for certain classes of classification problems.

  • 03:55

    As well as this, there is a lot of research going into improving quantum kernels with structured

  • 04:01

    data and kernel alignment. So as you can see, this field is incredibly

  • 04:07

    exciting. There's a lot of research going on in this space. And you can use Qiskit Runtime

  • 04:25

    to easily build quantum machine learning algorithms with built-in tools such as the

  • 04:31

    Sampler primitive. Primitives are unique to IBM's Qiskit Runtime. These are essentially

  • 04:40

    predefined programs that help us to optimize workflows and execute them efficiently on

  • 04:47

    quantum systems. Let's take, for example, our linear classification problem. Let's say we

  • 04:54

    have our data and we've encoded it into a quantum circuit. We can then use the Sampler primitive

  • 05:08

    to obtain quasi-probabilities indicating the relationships between the different

  • 05:16

    data points, and these relationships can constitute our kernel matrix.

  • 05:24

    And that kernel matrix can then be evaluated and used even in a classical support vector machine

  • 05:32

    to predict new classification labels. So if you're ready to get started learning more about quantum

  • 05:40

    machine learning, you can check out the links in the description for more information about Qiskit

  • 05:45

    Runtime, as well as the quantum machine learning course that's available in the Qiskit textbook.

  • 05:51

    I hope you've enjoyed this content. Thank you very much for watching.
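
The kernel-matrix pipeline the video walks through can be sketched end to end with classical stand-ins. None of this code is from the video: the RBF kernel below substitutes for the quantum kernel the Sampler's quasi-probabilities would produce, and the mean-similarity `predict` is a deliberately simplified stand-in for the support vector machine step.

```python
import math

def kernel(a, b, gamma=0.5):
    # classical RBF kernel standing in for a quantum kernel
    return math.exp(-gamma * (a - b) ** 2)

train = [0.0, 0.5, 3.0, 3.5]                # training points
labels = ["cross", "cross", "dot", "dot"]   # their classes

# Pairwise kernel evaluations form the kernel (Gram) matrix that a
# classical SVM could consume as a precomputed kernel.
K = [[kernel(a, b) for b in train] for a in train]
assert all(K[i][i] == 1.0 for i in range(len(train)))               # unit diagonal
assert all(K[i][j] == K[j][i] for i in range(4) for j in range(4))  # symmetric

# Simplified stand-in for the SVM: label a new point by its mean
# kernel similarity to each class.
def predict(x):
    def mean_sim(cls):
        vals = [kernel(x, t) for t, l in zip(train, labels) if l == cls]
        return sum(vals) / len(vals)
    return max(("cross", "dot"), key=mean_sim)

print(predict(0.2), predict(3.2))  # cross dot
```

In the video's pipeline the only change is where K comes from: each entry would be estimated from quasi-probabilities returned by the Sampler primitive for a pair of encoded data points.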

Quantum Machine Learning Explained

4,743 views

Video Language:

  • English

Caption Language:

  • English (en)

Accent:

  • English (US)

Speech Time:

  • 86% (5:09 / 5:58)

Speech Rate:

  • 137 wpm - Conversational

Category:

  • Education
