This computer is learning to read your mind | DIY Neuroscience, a TED series

  • Published 1 Jun 2024
  • Modern technology lets neuroscientists peer into the human brain, but can it also read minds? Armed with the device known as an electroencephalogram, or EEG, and some computing wizardry, neuroscientist Greg Gage and his colleagues attempt to peer into a subject's thoughts.
    On DIY Neuroscience, a TED original series, watch cutting-edge neuroscience experiments on a shoestring budget. Check out more here: go.ted.com/diyneuroscience
    Subscribe to our channel: / ted
  • Entertainment

Comments • 215

  • @denissetiawan566
    @denissetiawan566 5 years ago +4

    1. add more test subjects
    2. add more difficult images, e.g. men's faces vs. women's faces (no houses, no scenery)
    3. focus on the occipital lobe; the subject can only use his/her dominant eye
    4. add a separator image (e.g. solid orange colour) for 1-2 secs after each face picture

  • @deadpanacea
    @deadpanacea 5 years ago +79

    WTF!! Age restricted??!
    Edit: thanks for removing the restriction :-)

  • @morn1415
    @morn1415 5 years ago +25

    You were explaining before that there is a visible dip after the P100 for faces. So machine learning is probably not needed for spotting that.
    I think it is a mistake to average out all the sensors. A lot of localisation information is lost. But I see great potential in applying machine learning here.
    There is amazing progress in deep neural networks learning to remove noise on images for example.
    Looking forward.

    • @paulsawford7827
      @paulsawford7827 5 years ago

      By localisation information, do you mean details? As in the subject is not looking at a face, but an eye for instance?

    • @morn1415
      @morn1415 5 years ago +3

      @@paulsawford7827 I mean in the sense of the algorithm theoretically being able to localize different areas in the brain. Recognizing a three-dimensional pattern in the brain, if you will. I think this is only possible if you provide the information from each individual sensor. If you average them all, a dimension is maybe lost. Just a thought.
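
      A minimal sketch of that per-channel idea, assuming the recording has already been cut into an epochs array of shape (trials, channels, samples) and using scikit-learn; all names, shapes, and the random data below are illustrative, not the actual setup from the video:

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      # epochs: (n_trials, n_channels, n_samples) EEG cut around stimulus onset
      # labels: (n_trials,) with 0 = face, 1 = scenery, 2 = animal
      rng = np.random.default_rng(0)
      epochs = rng.standard_normal((120, 8, 256))        # placeholder data
      labels = rng.integers(0, 3, size=120)

      # Averaging across channels throws the localisation away: one waveform per trial.
      X_averaged = epochs.mean(axis=1)                   # (n_trials, n_samples)

      # Keeping every channel as its own block of features preserves that information.
      X_per_channel = epochs.reshape(len(epochs), -1)    # (n_trials, n_channels * n_samples)

      clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
      for name, X in [("averaged", X_averaged), ("per-channel", X_per_channel)]:
          scores = cross_val_score(clf, X, labels, cv=5)
          print(f"{name}: mean accuracy {scores.mean():.2f}")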

  • @TheWarriorLP16
    @TheWarriorLP16 5 years ago +65

    But what will the computer do if I'm stoned?🤔🤔🤔

    • @iron60bitch62
      @iron60bitch62 5 years ago +4

      TheWarriorLP16 People who are under the influence of drugs are much easier to read than people who are sober

    • @Zahlenteufel1
      @Zahlenteufel1 5 years ago +6

      Not necessarily. If the recognition algorithm is finely tuned and especially if machine learning is used the influence of some drugs could completely ruin the measurements. AFAIK, a brain has radical drops of alpha waves and heightened levels of some other types on DMT.

    • @Amberfifu
      @Amberfifu 5 years ago

      💀

    • @iron60bitch62
      @iron60bitch62 5 years ago

      dellort tog uoy Who’s talking about drugs

    • @gammalight1312
      @gammalight1312 5 years ago +5

      Cannabis is a muscle relaxer, so in theory it will likely clarify the information, but on the flip side it may show chaotic graphing in some people. I've had EEGs and many types of brain scans while on and off marijuana. I think it depends on the individual's reaction to the substance more than anything.

  • @agustinasanchez7906
    @agustinasanchez7906 5 years ago +1

    This is so cool!! I can see a future in this

  • @tubeyoukonto
    @tubeyoukonto 5 years ago

    Awesome! More of these videos please!!

  • @The1sillygirly
    @The1sillygirly 5 years ago +56

    I can only imagine how helpful this would be for physically handicapped people.

    • @iron60bitch62
      @iron60bitch62 5 years ago

      Rae Seaweed 0.000123% of the population will set the standard for how we all live? That is insanity. And how can you verify what's being discovered in the brains of these handicapped people? So when the government comes out and tells you that these 50 people here, who are in a vegetative state in these beds, have asked to be euthanized, you'll say no problem. How scary is that?

    • @billybrinkley4575
      @billybrinkley4575 5 years ago

      As someone who is bipolar (not quite what you were mentioning) I was excited to see this for that reason

    • @HermanWillems
      @HermanWillems 5 years ago

      Why? This is a closed-loop training system. The neural network learns what pattern fits which image. It knows the image in the training stage. After that it has created a model. It's not something magical. Nor can it help any handicapped person.

    • @billybrinkley4575
      @billybrinkley4575 5 years ago +1

      It is the groundwork for more advanced programs that could help with those problems.

    • @lorie.8488
      @lorie.8488 5 years ago +2

      As the mother of a child who is intelligent but non-verbal, I think I would give my last breath to have her ACTUAL thoughts come out onto a computer screen

  • @LeonelMendesCogita
    @LeonelMendesCogita 5 years ago +13

    You tried to decode in an afternoon what her brain coded for itself through trial and error over her entire life. Of course it didn't work. Also, there should be constant noise activity in one's brain even while idle. Showing the same face twice may also cause different reactions and brain patterns, which may mean the only signal you were able to pick up was from a lower-level facial-recognition response.

  • @aaroncameron1494
    @aaroncameron1494 5 years ago +35

    The girl in the thumbnail pic looks like she is out of a painting

  • @liucijap.1618
    @liucijap.1618 5 years ago +4

    You know they say that people in comas think and have conscious thoughts but just can't say them or move. Imagine how helpful this would be to them if the computer could fully detect the people's thoughts. An idea for the future.

    • @metanumia
      @metanumia 5 years ago

      In the past two decades, fMRI and EEG advancements have prevented quite a few people in comas from being euthanized or having life support turned off prematurely. There's a fairly recently defined condition called Locked-In Syndrome; you're absolutely right about this helping identify and treat people who are in a conscious coma state after trauma/stroke etc...

  • @miyamuraizumi3312
    @miyamuraizumi3312 5 years ago +1

    This is awesome, but there's always more to find out and learn in the near future

  • @krraman666
    @krraman666 5 years ago

    This is so amazing!

  • @alan2here
    @alan2here 5 years ago

    This is great. TY TED :)

  • @GideonvanderMerwe
    @GideonvanderMerwe 5 years ago +9

    How long did you train the algorithm?

  • @smitdumore1064
    @smitdumore1064 5 years ago +1

    Let us take a moment to appreciate KRISTI’S dedication

  • @doreenwatson-read
    @doreenwatson-read 5 years ago

    Loved that.

  • @LeonidasGGG
    @LeonidasGGG 5 years ago +1

    This HAS been done. The computer could even reproduce the image the watcher was looking at.
    It was blurry and it required the watcher to look at the picture for longer, but they said that with faster computers and more brain sensors it WAS achievable... And it was a TED talk.

  • @parthbhavsar5452
    @parthbhavsar5452 5 years ago +2

    Great progress
    I can sometimes see the future in my eyes

    • @iron60bitch62
      @iron60bitch62 5 years ago

      Parth Bhavsar Progress is great, but this has the potential to be a disaster. Think about it: one day a government organization asks you to come in for an interview. During the interview they say that you have some anti-social or anti-government attitudes. You tell them that's ridiculous, that never happened, and they say no, this machine just read your mind, you are a threat to our society, you need to be re-educated. What will your response be? That this is great progress?

    • @parthbhavsar5452
      @parthbhavsar5452 5 years ago

      See, every technology has both sides, positive and negative.
      For example, AI

  • @theblackdaniels.official

    Why does the "picture ready" block in the bottom left corner flash differently for each category? I mean the black & white flashing used in the last test to "cheat"; it seems to me now that you told the computer the category with the flashing sequence, instead of it being just a "start monitoring" signal.

  • @pervendrans.g2580
    @pervendrans.g2580 5 years ago +2

    Fascinating... But for now they are categorising images into certain similar patterns like faces, scenery and so on... But what about the specificity of these images, like the type of face, the colour of a face, or the weather in the scenery?

  • @marcoroberts6518
    @marcoroberts6518 5 years ago

    @1:23 Surprise picture of young Pep Guardiola hahahahaha

  • @metanumia
    @metanumia 5 years ago +1

    Scientists will need far more detailed brain measurement/diagnostic devices and more efficient machine-learning algorithms for thoughts to be interpreted properly by a computer. If a neuroimaging device could image and measure every single neuron and every synapse, and measure the electrical signal at each of those in real time, and all of that information was digitized and capable of being stored in RAM, then a sophisticated machine-learning program should be able to find *way* *more* useful data about our thoughts.

    There is another TED talk from Summer 2018 in which a woman, who is a neuroscientist and engineer, talks about her team's new brain-imaging device. This device uses red-wavelength laser light combined with holography to allow non-intrusive imaging and measurement of *individual* neurons and synapses in real time, with extreme spatial resolution and incredible speed of measurement.

    Very soon we'll be living in a world where people's entire central nervous systems can be imaged, measured, and interpreted in great detail, if her team's devices are paired with machine-learning algorithms and deep-learning neural networks. This is simultaneously incredibly promising for the advancement of neuroscience, psychiatry, psychology, psychopharmacology, and medicine, and deeply frightening if one contemplates the less ethical applications of such technology.

  • @123ssima
    @123ssima 3 years ago

    What about parallel thoughts at the time of watching the images provided? Like, does thinking about going to the grocery store after the session affect it?

  • @RyanFavale
    @RyanFavale 5 years ago +1

    Nice. We've hit the tip of the iceberg. This is exciting! :D

  • @ketakiambadkar7523
    @ketakiambadkar7523 5 years ago +1

    This might be a stupid question, but do the waves change from person to person in the way they see an image, or do they remain similar?

    • @kerensahardesty9851
      @kerensahardesty9851 3 months ago

      Maybe so! The FFA is an area in the inferior temporal region responsible for recognizing faces. When a person sees a familiar face, the FFA tends to be more active. But such small differences in brain activity probably cannot be accurately detected by the EEG. The device they're using (the headband around her head) can only pick up brain waves from her scalp, so it probably isn't a very good representation of all the activity going on in the brain.

  • @ClaudaoTheBest
    @ClaudaoTheBest 4 years ago

    WoW! Nice! Near future !

  • @shirleyware9059
    @shirleyware9059 5 years ago

    Live and learn, that's what it's all about. Keep on testing; the only failure is when you quit and no longer try.

  • @okayokay1979
    @okayokay1979 5 years ago +13

    So the most important part of this project was machine learning...
    You should hire a good machine learning engineer; then you can classify even 100 categories.
    (btw I am available next month) 😉😁😄😂

  • @CaptiveReefSystems
    @CaptiveReefSystems 4 years ago

    Does anybody happen to know why I can't find any information or circuit schematics for the stacked boards used in the (multi-channel?) EEG recording pictured at 1:33? I am assuming that it is probably some variation of their 'Backyard Brains' designs, but they look fairly different in this video... 🤔 I am in desperate need of a cost-effective, multi-channel EEG recorder for my work. Any information would be *greatly* appreciated!

  • @bhaveekdesai
    @bhaveekdesai 5 years ago

    This is fascinating! At the University of Florida, we've been working on multiple brain-computer interfaces, including the world's first ever Brain-Drone Racing and playing musical instruments with the mind for people with disabilities.

    • @HermanWillems
      @HermanWillems 5 years ago +1

      How do you implement your actuator-to-neuron interface? Do you actually drill holes in the patient's head and place "outputs" from the brain there? Just curious.

    • @bhaveekdesai
      @bhaveekdesai 5 years ago

      We've used non-invasive EEG options for all our projects. This reduces risk and is suitable for many applications

  • @Valient6
    @Valient6 5 years ago

    You forgot to mention the fact that the human face has lines in it that reflect organic structures and can therefore be represented by patterns associated with beauty and relevance in scenery, and so on between the categories, resulting in multiple firings of the differing areas of recognition. There is also the fact that spontaneous memories will create thoughts in these different areas of the brain depending on the past of the individual and the image shown. It will be impossible to determine through the EEG waves alone.

    • @Markcus003
      @Markcus003 5 years ago

      That's the reason it's really hard to study the brain: you can't control what someone's thinking and just say we did it in a controlled environment

  • @danielchmiel7787
    @danielchmiel7787 5 years ago +1

    That crazy blinking square in the corner during the first experiment struck me... Planned failure, huh?

  • @queenmiiram970
    @queenmiiram970 5 years ago

    Amazing

  • @nioxic77
    @nioxic77 5 years ago +1

    On Top Gear, James May drives a small "car" that he controls with his mind. He thinks of a cat for turning left, and of punching a guy for turning right, etc. They "record" the waves as he thinks of those things, and the car then reacts when he thinks of those specific things.
    This was like... 10 years ago.

  • @Zahlenteufel1
    @Zahlenteufel1 5 years ago +4

    With quantum computers, there might be a possibility to take an EEG, or some more sophisticated version of it (with a lot more data points), and calculate back the currents that created it. It will probably also need AI/machine learning.
    I just made this up, but it seems to me like the very thing quantum computers are good at, similar to many-particle simulations, which I know they are very good at (or will be). There is also the holographic principle (yes, I watch SpaceTime), so it's not unheard of in nature for the information in a volume to be related to the surface area containing that volume (the brain).
    Also, I'm stoned, have a nice day.

    • @HermanWillems
      @HermanWillems 5 years ago

      Ah, you want to connect all the synapses of each neuron to a sensor? :) That's 80,000,000,000 × 7,000 connections you've got there. Good luck, boy.

    • @Zahlenteufel1
      @Zahlenteufel1 5 years ago

      @@HermanWillems No. The holographic principle states that all information about a volume can be inscribed on its surface (iiuc) therefore it should be possible to calculate back the state of the whole system using only information from its surface (from a helmet-like device). Quantum computers are thought to be very good at simulations where many particles interact and this seemed to me like a similar problem (and normal computers would suck at it for sure).

    • @HermanWillems
      @HermanWillems 5 years ago +2

      If you know the model at the exact time x. But the model constantly changes due to neuroplasticity. How are you going to cope with that? At every moment you would need to generate a new model of the neural pattern state. Hmmm, it still sounds to me like a very difficult problem. A resolution problem: the number of possible states of the brain is not simply whether a neuron is ON or OFF, but also how strong a certain synapse connection is. All signals are pulses, so the number of possible states of our brain is huge, based on travel time, length of synapses, strength of signal, and much more. I wonder if we are ever going to simulate something so complex with some kind of quantum computer. But who knows. Would be interesting. :)

    • @Zahlenteufel1
      @Zahlenteufel1 5 years ago +2

      I can't really follow you but it doesn't matter. I don't plan on doing research on this any time soon. I just thought that it might be worth looking into. You are right the brain is very complex with 86 billion neurons, each with 26 possible power states. I think it might be possible to track the signals running through the neurons because moving charge creates a measurable disturbance in the electromagnetic field (measured by an EEG or my proposed helmet device). Having this model of the disturbances measured in a shell around the brain (a sphere surface) I suspect it might be possible to calculate the origin of each tiny disturbance that contributed to the noisy mess we see on the EEG using physics. This would definitely require a vast amount of computational power but quantum computers are famous for turning unsolvably large problems into smaller problems that normal computers can deal with, maybe even in real time (i.e. similar to many-particle-simulation which deals with possibly even greater numbers). Therefore it would be interesting to see research in that direction.
      Notice all of the qualifying language. This is by no means a scientific hypothesis (more like a brain fart) but who knows? Maybe somebody will build on it...

    • @okayokay1979
      @okayokay1979 5 years ago +2

      @@Zahlenteufel1 yeah one day it will be built.... Can i connect with you brother!?

  • @davidtyson6869
    @davidtyson6869 4 years ago

    That is very interesting, sir. Perhaps we can all get together and exchange some knowledge. However, I am a scientist and I am in the process of feeding information into the brain. Thank you.

  • @rakeshmallick8040
    @rakeshmallick8040 5 years ago

    I have seen a better TED video on brain reading; everyone has forgotten about it, it seems.

  • @jackjohnson3024
    @jackjohnson3024 5 years ago +1

    Damn that's cool

  • @IIIIIllllIII
    @IIIIIllllIII 5 years ago +2

    It is not mind reading if you can retrieve stored thoughts that are tied to a signal. Mind reading means that you capture these thoughts yourself, and that is not what you deal with in your experiment.
    You are having a computer recall a stored signal, not recognizing it yourself.

  • @ddeakpeti
    @ddeakpeti 5 years ago +8

    What if she "cheats" and strongly focuses on let's say a face when a scenery is shown?

    • @metanumia
      @metanumia 5 years ago +5

      I've looked into the research of a scientist who is developing a genuine lie detection device based on the P300 signal. Apparently their EEG and interpretation software has >90% accuracy, even when someone does what you mentioned and cheats by trying to force other thoughts during the measurement while viewing some image or object. If they recognize that image or object, the P300 signal is seen, no matter how hard one tries to think of something else. Scary, right?
      Relevant studies:
      Meijer, E., Ben-Shakhar, G., Verschuere, B., & Donchin, E. (2012). A comment on Farwell (2012): Brain fingerprinting: A comprehensive tutorial review of detection of concealed information with event-related brain potentials. Cognitive Neurodynamics, 7, 155-158.
      www.ncbi.nlm.nih.gov/pmc/articles/PMC3704663/
      www.ncbi.nlm.nih.gov/pubmed/22091554
      www.ncbi.nlm.nih.gov/pubmed/21440013
      www.ncbi.nlm.nih.gov/pubmed/21965119

    • @Louie.Oxford
      @Louie.Oxford 5 years ago +1

      @@metanumia wow

  • @manish313
    @manish313 5 years ago

    Awesome

  • @gummisnoodable
    @gummisnoodable 5 years ago

    Even better, look up Yukiyasu Kamitani, a scientist who seems to have come up with AI code that does a surprisingly good job of relaying the image the subject is seeing. You can also search for one of the articles using the search: brain scan can read images. Good for humanity? That's another question altogether.

  • @darktealglasses
    @darktealglasses 5 years ago +5

    Scary enough 😳 imagine a century later, this tech would get to the full level (the machine becomes a mind reader) and someone uses it to read ours 🧠

    • @HermanWillems
      @HermanWillems 5 years ago

      Seems you have something to hide in your dirty little mind do you?

    • @darktealglasses
      @darktealglasses 5 years ago

      @@HermanWillems
      OR maybe yours is dirty so you think other's is

    • @HermanWillems
      @HermanWillems 5 years ago

      Just a joke dude. :) Don't take so seriously ok.

    • @darktealglasses
      @darktealglasses 5 years ago +1

      @@HermanWillems
      No tone or whatsoever as a hint 🤔

    • @angirgie729
      @angirgie729 1 year ago

      They already can read our minds

  • @GrantCelley
    @GrantCelley 5 years ago

    For the machine learning, it might be interesting to try LSTMs or GRUs and maybe go into a deep memory network. I am wondering how advanced their machine learning is.

    • @HermanWillems
      @HermanWillems 5 years ago +1

      I have seen what they can do with Google's TensorFlow, which is far more advanced. So I think it's probably just that... You can use it too. Just download TensorFlow and start writing some Python and some GUI. Also, you need a GPU to get any kind of performance.
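
      For what it's worth, a tiny sketch of that kind of setup in Keras, assuming epoched EEG stored as a (trials, channels, samples) array with integer category labels; the shapes, class count, and random data here are made up purely for illustration:

      import numpy as np
      import tensorflow as tf

      # Placeholder epoched EEG: 120 trials, 8 channels, 256 samples, 3 categories.
      rng = np.random.default_rng(0)
      x = rng.standard_normal((120, 8, 256)).astype("float32")
      y = rng.integers(0, 3, size=120)

      model = tf.keras.Sequential([
          tf.keras.layers.Input(shape=(8, 256)),
          tf.keras.layers.Flatten(),
          tf.keras.layers.Dense(64, activation="relu"),
          tf.keras.layers.Dropout(0.5),
          tf.keras.layers.Dense(3, activation="softmax"),   # one output per category
      ])
      model.compile(optimizer="adam",
                    loss="sparse_categorical_crossentropy",
                    metrics=["accuracy"])
      model.fit(x, y, epochs=20, validation_split=0.2, verbose=0)

      A GPU helps once the network and the number of trials grow, but a toy model like this trains fine on a CPU.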

  • @xXCHEETOBEANXx
    @xXCHEETOBEANXx 5 years ago +1

    This wouldn't work because there is too much information that she is taking in other than the visuals. Along with what she sees, she is also hearing things, and there are also her thoughts, what she remembers in her mind. So that data is constantly being mixed with other data. This could work if you used stronger imagery that evokes emotion, something like fear or love for example. With stronger data like that, the algorithm could potentially make better data records for stronger guesses.

    • @Markcus003
      @Markcus003 5 years ago

      Give it time humanity is still young

  • @ArielleVicMaxim
    @ArielleVicMaxim 5 years ago +7

    I thought our phones were already reading our minds. I always get ads in my apps or recommendations on subjects I had only thought of, not spoken of, and of course I always get the ones on things I had said

    • @tylera5598
      @tylera5598 5 years ago +2

      That's because of a bunch of data Google collects based on past searches and stuff.... Well, I guess that sort of is mind reading

    • @ArielleVicMaxim
      @ArielleVicMaxim 5 years ago

      Honudes Gai I never made the assumption of being "unique". All I said was this is nothing new, they've been doing it for a while, and I know perfectly well how they collect data from all of us, so your comment doesn't really make sense 😒

  • @jusore
    @jusore 5 years ago +1

    We can use it to read what a person is dreaming, and use it with people that cannot wake up.

  • @QuyNguyen-np2fq
    @QuyNguyen-np2fq 5 years ago

    0:14 I hear "machines in the future that can read our thoughts", but the translation says "machines that can read our thoughts", even on TED.com too. Can anybody help me explain this issue? Thanks.

  • @AmjadX
    @AmjadX 5 years ago +1

    This is what we do..
    Congratulations humans....🙉💟💟

  • @holdmybeer
    @holdmybeer 5 years ago +13

    not cheating, it's setting rules.

  • @rafael_augusto_20
    @rafael_augusto_20 1 year ago

    They're using some computer program on me that reads the person's mind, and they're changing my whole face and body. I'm looking for information on the internet about this. Please help me.

  • @DrAdnan
    @DrAdnan 5 years ago +1

    This + AI will be crazy

  • @mireusted499
    @mireusted499 5 years ago

    What would happen if they changed the subject? The waves would probably be different for every different human, wouldn't they?

  • @Spenceristired
    @Spenceristired 5 years ago

    I like Christy's face, I wonder what that would look like.

  • @roadsign289
    @roadsign289 5 years ago +2

    3 uploads at once

  • @neon32
    @neon32 5 years ago

    But we could see with our eyes the difference between the data from the face picture and the data from the scenery. It was a big difference.

  • @shadoku888
    @shadoku888 5 years ago +1

    I could code a game with my mind if this is perfected, I hope

  • @300Moritz
    @300Moritz 5 years ago +2

    interesting

  • @tony2534
    @tony2534 5 years ago

    Always interesting

  • @jusore
    @jusore 5 years ago +1

    Now you can make a computer that generates those signals when it sees something with its bionic eyes. A step closer to the digital mind.

    • @HermanWillems
      @HermanWillems 5 years ago

      One EEG sensor is VERY LOW resolution. It probably measures a massive area of your brain. You need single-neuron signals to even do something like you propose. We can probably do that in a few hundred years.

  • @danielhawkins3392
    @danielhawkins3392 5 years ago

    Really cool work but as you said I can tell if it's a face or something else just by that big dip... It's not really doing anything beyond what we could do before.

  • @me-yo8ej
    @me-yo8ej 5 years ago +1

    I'd like to see them try with my split personality, good luck

  • @dirkwalker9686
    @dirkwalker9686 5 years ago

    The limitation isn't with machine learning or AI, the limitation is clearly with the sensors. It's like taking a blurry picture and trying to zoom and enhance. There's only so much you can do if the data you started with sucks. We need a better way to detect neurons firing than outdated EEGs, if we're ever going to get anything useful out of the data.

    • @HermanWillems
      @HermanWillems 5 years ago

      The brain of a human is already extremely compact. Only some birds have more neurons per square inch. But if you want to connect a sensor to all the synapses of each neuron, you can calculate how many sensors you need: around 80,000,000,000 times 7,000 (the average number of synapse connections per neuron). Good luck! :) With finding space to place those sensors. This is why we cannot connect a "connector" to the human brain. We are not serial machines; each neuron is a processor of its own.

    • @dirkwalker9686
      @dirkwalker9686 5 years ago

      Never said it would be easy lol. Maybe if we create some self-replicating nanotech that can pull minerals out of the blood to build new nanites, which then attach themselves to neurons, it can be done. Either way we're a long way from having the technology or the moral inclination to do that kind of thing, but I wouldn't say it's impossible.

    • @HermanWillems
      @HermanWillems 5 years ago

      And these nanites transfer the information through deterministic WiFi. :) Who knows!!

  • @tiavor
    @tiavor 5 years ago

    "One day"... not with traditional EEG; each sensor covers such a huge area that you are never able to read each single neuron. It will always be this kind of guesswork.

  • @FAREANO1
    @FAREANO1 5 years ago

    Does the computer have a "time" activity control? I am pretty sure that the human brain knows what is going to be on the screen before it appears...

  • @parthpanchal8104
    @parthpanchal8104 5 years ago

    Cool

  • @tauqiralisayyad2341
    @tauqiralisayyad2341 5 years ago

    Now, just think of scenery and see whether it tells you correctly or not... that would in fact be actual mind reading

  • @anniechunawala6158
    @anniechunawala6158 5 years ago +1

    Cheers everyone age restriction is now removed!!!!! - ☺

  • @whyrare9911
    @whyrare9911 5 years ago +10

    The Detroit Become Human era isn't that far away now...

  • @rzxkp7none275
    @rzxkp7none275 3 years ago

    Wow this and voice to skull technology and FBI and other law enforcement agencies and volunteers thank you. Thumbs up

  • @kingchris5271
    @kingchris5271 5 years ago

    I have a suggestion: you're feeding her the images too fast. Give them 1 to 2 minutes for each one if you don't want to cheat next time. I like this experiment; try doing it again.

  • @KunwarPratapSingh41951

    Is it generalizing or just memorizing?

    • @Markcus003
      @Markcus003 5 years ago

      For now probably generalizing

  • @imie-nazwisko
    @imie-nazwisko 5 years ago +1

    Someday traffic signs Captcha, someday.

  • @thetrashmaster1352
    @thetrashmaster1352 5 years ago

    With 6 months of R&D I see this being extremely useful to police. It should eventually be able to tell the difference between a face you know and one you don't. Imagine how useful it would be if police could just say "If you've never seen this person before, take this test." Or even when trying to find a criminal, don't ask the person which picture it was. Just read their brain so it can't be wrong.

  • @copethedeez
    @copethedeez 5 years ago

    why is this age restricted

  • @scienceandmind3065
    @scienceandmind3065 5 years ago

    Interesting, I have been working for a while now on something similar but a bit deeper and more complex. I also use an EEG (OpenEEG Cython 8-channel) to monitor brain waves, but I plan to combine it with other biometric and sensory data (EKG, skin resistance, head/eye-tracking, environment audio and video) to improve my cognitive performance and, in the long run, to automate certain cognitive processes.

    • @HermanWillems
      @HermanWillems 5 years ago

      OK, so you are picking up data from the output of the body into software. In what way do you throw information back at the body? Sound? A monitor? (I mean, I see our brain as a black box which is nothing without the outside world.)

  • @ahmetmutlu348
    @ahmetmutlu348 3 years ago

    Well, you can't read the mind by just reading outer signals that reach the outer skin of the head.
    That's similar to trying to recognise images on a computer screen by looking at their reflection on the wall in the dark :P
    I guess it could be done through nano-robotics that catch and store signals, i.e. like nano-robotic modems, so you could easily put a nano modem on the required signal lines. That way lots of information can be gained. Well, it's partially already done for the deaf or the blind: electronic modules that translate audio or video.

  • @mohmedyzed8920
    @mohmedyzed8920 1 year ago

  • @HermanWillems
    @HermanWillems 5 years ago

    EEG says something, but the resolution is VERY LOW. If you want native resolution you need a connector to each connecting synapse. We have around 80 billion or more neurons. Each neuron has about 7,000 synapse connections, throughout the brain in 3D. And no, EEG will not give you such information about thoughts. But it can give us super-low-resolution information on which part of our brain is active. Using neural networks (technology borrowed from how our brain works) to reverse pattern match, it is possible, but only simple things can be done, like in this video. Converting EEG to a realtime "neural image" is.... far from possible.

  • @not_a_human_being
    @not_a_human_being 5 years ago

    put EEGs on Kaggle! "Nirvana"'s coming! :)

  • @jurgenkranenburg1309
    @jurgenkranenburg1309 5 years ago

    try colors

  • @Because_alex
    @Because_alex 5 years ago

    Did anyone else think that the girl in the thumbnail WAS the computer?

  • @0stre
    @0stre 5 years ago

    To think that not so long ago we were only hunters and gatherers. The future is increasingly terrifying and dehumanized, you materialistic, bloody consumer.

  • @jimpruitt5328
    @jimpruitt5328 5 years ago

    This is a classic neural network problem - a result and data connected to the result. Just set up the network and train it. Stop trying to preprocess the data. Let the network figure it out. This is trivial compared to identifying cancers from x-ray data.
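
    A rough sketch of that end-to-end framing, assuming the raw recordings are already cut into (trials, samples, channels) epochs with integer labels; a small 1-D convolutional network in Keras, with made-up shapes and random placeholder data, and no hand-crafted features:

    import numpy as np
    import tensorflow as tf

    # Raw epochs straight from the amplifier: 200 trials, 256 samples, 8 channels.
    rng = np.random.default_rng(1)
    x = rng.standard_normal((200, 256, 8)).astype("float32")
    y = rng.integers(0, 3, size=200)          # 0 = face, 1 = scenery, 2 = animal

    # Result connected to data: the conv layers learn their own features from the raw signal.
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(256, 8)),
        tf.keras.layers.Conv1D(16, kernel_size=9, activation="relu"),
        tf.keras.layers.MaxPooling1D(4),
        tf.keras.layers.Conv1D(32, kernel_size=9, activation="relu"),
        tf.keras.layers.GlobalAveragePooling1D(),
        tf.keras.layers.Dense(3, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(x, y, epochs=30, validation_split=0.2, verbose=0)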

  • @chunli99
    @chunli99 5 years ago

    “But daaaammn” I feel like this guy hangs out with Bill Nye. That or they should be friends.

  • @BelovedOfTMH
    @BelovedOfTMH 5 years ago

    You think this is how they're going to get the hive mind? Blanket that.

    • @Markcus003
      @Markcus003 5 years ago

      Are you talking about a forced hive mind or just a normal hive mind?

  • @sagunmdr
    @sagunmdr 5 years ago

    Sister Irrine?

  • @davidedorigo570
    @davidedorigo570 5 years ago

    WTF, I was thinking exactly the thing Mr. Gage says at the end!!!!!! Be careful. (We also need to see that we cannot stop the technology transition.)

  • @ziadahmedsamy
    @ziadahmedsamy 5 years ago

    Why am I not getting any recommended vids

    • @Markcus003
      @Markcus003 5 years ago

      Update your YouTube, I guess

    • @ziadahmedsamy
      @ziadahmedsamy 5 years ago

      @@Markcus003 oh this is weird lmao, when I watched it as soon as it got uploaded it wasn't monetized, nor were there any recommendations. I thought it was age restricted, but it's fixed now lol

    • @Markcus003
      @Markcus003 5 years ago

      @@ziadahmedsamy sometimes YouTube is broken

  • @Bell_City_Fights
    @Bell_City_Fights 2 years ago

    Someone is using this tech on me without my permission..
    How do I stop them?

  • @prixtront7240
    @prixtront7240 3 years ago

    wow, some people in this comment section really seem to think that our brains emit waves capable of traveling hundreds of meters after which they could still be picked up by a sensor and correctly interpreted by an artificial neural network.

  • @Bell_City_Fights
    @Bell_City_Fights 2 years ago

    Someone is using this tech on me without my permission, how do I stop them?

  • @UhMuthaFuckinVirgo
    @UhMuthaFuckinVirgo 5 years ago

    I could fool the machine!!!🤓

  • @afrosamurai6969
    @afrosamurai6969 5 years ago

    My mind is fucked up

  • @asaadmahmood7674
    @asaadmahmood7674 5 years ago

    Thought crime, here we come!

  • @najma3782
    @najma3782 4 years ago

    I thought the girl in the thumbnail was a robot

  • @leeluv96
    @leeluv96 5 years ago

    Her adorable little face is all smooshed in that cap. 😊

  • @tehbonehead
    @tehbonehead 5 years ago

    I say "Don't think about elephants..." what do you think about??

  • @sherouqsherouq8357
    @sherouqsherouq8357 5 years ago

    It is truly wonderful to see the progress that humanity has reached

  • @rewtnode
    @rewtnode 5 years ago

    A lot of effort to get about one bit per second channel capacity.