Beating Moore's Law: This photonic computer is 10X faster than NVIDIA GPUs using 90% less energy

  • Added 30 May 2024
  • Moore's Law is dead, right? Not if we can get working photonic computers.
    Lightmatter is building a photonic computer for the biggest growth area in computing right now, and according to CEO Nick Harris, it can be ordered now and will ship at the end of this year. It's already much faster than traditional electronic computers at neural nets, machine learning for language processing, and AI for self-driving cars.
    It's the world's first general-purpose photonic AI accelerator, and with light multiplexing -- using up to 64 different colors of light simultaneously -- there's a long path of speed improvements ahead (a rough throughput sketch follows the video details below).
    Links:
    TechFirst transcripts: johnkoetsier.com/category/tec...
    Forbes columns: www.forbes.com/sites/johnkoet...
    Subscribe to the podcast: anchor.fm/techfirst
    Keep in touch: / johnkoetsier
  • Science & Technology
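
  A rough throughput sketch of the 64-color multiplexing claim above. The channel count is from the interview; the per-channel data rate is an illustrative assumption, not a published Lightmatter spec:

      # Aggregate throughput of a wavelength-multiplexed link (illustrative).
      channels = 64            # distinct colors of light, per the interview
      per_channel_gbps = 25    # assumed per-wavelength rate, NOT a Lightmatter figure
      print(f"{channels} x {per_channel_gbps} Gb/s = {channels * per_channel_gbps} Gb/s aggregate")
      # -> 64 x 25 Gb/s = 1600 Gb/s over a single waveguide, before any
      #    per-channel speedup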

Comments • 2.7K

  • @WeeHee
    @WeeHee 2 years ago +5673

    Now we can literally say that RGB improves performance.

  • @jackwright7014
    @jackwright7014 2 years ago +2605

    I've learned that with a lot of startups it's mostly talk. I'll wait until they've actually got something.

    • @lanrebloom7030
      @lanrebloom7030 2 years ago +43

      I mean, this has nothing to do with that. It's the technology that should be your concern here.

    • @KanedaSyndrome
      @KanedaSyndrome 2 years ago +444

      Agreed. He says that you can "actually buy it," and yet when asked for the price, there is no answer, meaning that you can't actually buy it when it comes down to it.

    • @owenb6499
      @owenb6499 2 years ago +158

      @@KanedaSyndrome he even said it's not gonna be able to run Windows; definitely going to be a very specialized research/industry chip.

    • @owenb6499
      @owenb6499 2 years ago +72

      @XxTreXXockxX good one, Einstein. A consumer GPU is set up to run with Windows. You gotta make your own specialized version of Linux for a supercomputer; do the math, genius.

    • @mattizzle81
      @mattizzle81 2 years ago +205

      I concur. I've seen so much vaporware in my life it's not even funny. If I only had a dollar for every breakthrough that is hyped up and never sees the light of day.
      Then on the flip side, things that I've never heard about suddenly revolutionize everything.

  • @Justin_80
    @Justin_80 2 years ago +840

    Just when you thought regular GPUs were getting too expensive.

    • @pszyk.o.9074
      @pszyk.o.9074 2 years ago +23

      So now they're going to be replaced

    • @thehotdogman9317
      @thehotdogman9317 2 years ago +15

      Nvidia better start taking notice.

    • @ashflame6888
      @ashflame6888 2 years ago +51

      @@thehotdogman9317 LOL you people are funny..... This is still 10+ years out.

    • @SpenserRoger
      @SpenserRoger 2 years ago +13

      @@ashflame6888 who says

    • @ashflame6888
      @ashflame6888 2 years ago +20

      @@SpenserRoger Intel, AMD and anybody else who knocks on the door of this company with a billion dollars.... OR the simple fact that you are dumb enough to think tech will EVER actually move that fast.... Yeah, let's go from 8-core 5 GHz processors to 40-core 20 GHz lightspeed processors overnight! SURE THAT WILL HAPPEN!!! Turn your brain on before you open your mouth again.... I wouldn't be surprised if this NEVER happens, and I also wouldn't be surprised if, when it does happen, it's still 20 years out, or more. It's taken 20 years just to go from single-core 1 GHz to 8-core 5 GHz..... and you think we're just gonna quadruple it overnight? LOL yeah....

  • @anonanon5147
    @anonanon5147 2 years ago +496

    12:49 Just a small correction: Moore's law is not about the doubling of performance; it is about the doubling of the number of transistors. (A numeric sketch follows this thread.)

    • @sidharthghoshal
      @sidharthghoshal 2 years ago +55

      That used to mean doubling of performance: at one point, smaller transistors meant less power AND faster speeds AND more storage. Nowadays the only part of that which remains true is more storage.

    • @robertfarquhar2780
      @robertfarquhar2780 2 years ago +18

      @@sidharthghoshal Another reason chips get so hot is that they run at higher frequencies now. You can overclock Intel processors up to like 5.4 GHz and they're stable.

    • @sierra5065
      @sierra5065 2 years ago +9

      @@robertfarquhar2780 The 11900K can hit 5.6 GHz with a regular water cooler. Granted, it draws double the power compared to Ryzen, but it's still cool.

    • @Leo99929
      @Leo99929 2 years ago +26

      Also, CPUs have become roughly 100x more efficient over the last 15 years, and 22x more powerful. He is just straight up wrong about this imaginary stagnation. Though we are approaching a physical limit in minimum transistor size, so processors will start getting bigger, like the AMD Epyc range, or we won't be able to fit more transistors per unit area using electricity due to quantum tunneling issues when the tech reaches the

    • @edwardcoyle5425
      @edwardcoyle5425 2 years ago +2

      The concept of Moore's law has expanded to include processing power and then overall performance.
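
    A minimal numeric sketch of the correction above: Moore's law predicts transistor count, not speed. The 2000-era baseline below is the Pentium 4's roughly 42M transistors; treating performance as if it scaled the same way is exactly the error the thread points out:

        # Moore's law: transistor count doubles roughly every two years.
        t0 = 42e6                          # transistors in a 2000-era Pentium 4
        years = 20
        projected = t0 * 2 ** (years / 2)  # ten doublings
        print(f"~{projected / 1e9:.0f}B transistors projected after {years} years")
        # -> ~43B, roughly matching 2020-era large chips, even though
        #    single-thread performance grew nowhere near 2**10 = 1024x.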

  • @Dia1Up
    @Dia1Up 2 years ago +2125

    I feel like we missed an important question: But can it run Crysis?

    • @DivineWerezwolf
      @DivineWerezwolf 2 years ago +122

      That's outdated thinking! It's all about cryptocurrency now. Can it do the crypto mining?

    • @spr_
      @spr_ 2 years ago +180

      @@DivineWerezwolf NO I DONT THINK SO

    • @bigrat6030
      @bigrat6030 2 years ago +145

      @@DivineWerezwolf yeah no one cares

    • @kikiriski6017
      @kikiriski6017 2 years ago +74

      @@DivineWerezwolf
      ABSOLUTELY NO

    • @skmetal7
      @skmetal7 2 years ago +104

      It can run Crysis in Crysis, while compiling the code for Crysis.

  • @vkoskiv
    @vkoskiv 2 years ago +661

    'General purpose AI accelerator'
    Is that what we're calling matrix multiplication now?
    Still a big deal of course, but I find marketing speak funny.

    • @Apocalymon
      @Apocalymon 2 years ago +54

      That's how you market to C-suite folks

    • @aleksanteri_r
      @aleksanteri_r 2 years ago +94

      Matrix is so '90s. AI must be included, that's a standard, but when you add "general", it hints at AGI and the investors start throwing money at you.

    • @3moluvr
      @3moluvr 2 years ago +3

      @@aleksanteri_r LOL

    • @causalitymastery269
      @causalitymastery269 2 years ago +15

      @@aleksanteri_r ridiculously on-point.

    • @v1Broadcaster
      @v1Broadcaster 2 years ago +3

      Nah it’s been like that for a while lol

  • @leoshaju4259
    @leoshaju4259 2 years ago +168

    John: Interesting
    Ceo lightmatter: yeah so

    • @wackyroo
      @wackyroo 2 years ago +1

      Ceo lightmatter: *rapidly blinking* yeah so

    • @DROOBYDOO
      @DROOBYDOO 2 years ago +6

      Yeah. So, it literally changes the game and I'm totally stoked. OMG, it's fire bro.

    • @johnkoetsier
      @johnkoetsier 2 years ago +6

      :-)

  • @brigham1465
    @brigham1465 2 years ago +38

    Asks literally any question,
    Ceo: "yeah, so"

    • @procactus9109
      @procactus9109 2 years ago +3

      Don't forget the 'interesting' reply to an ambiguous answer.

  • @johnsavard7583
    @johnsavard7583 2 years ago +856

    My understanding about photonic computers is that they solve the low-k dielectric problem. Light beams that pass by each other side by side don't cause capacitance problems slowing each other down.

    • @N00B283
      @N00B283 2 years ago +23

      Never considered that

    • @RealNovgorod
      @RealNovgorod 2 years ago +232

      That's not really an issue, since photonic devices are still orders of magnitude larger than modern transistors, and getting anywhere close to the few-nm range can't really be done at optical wavelengths; it requires DUV or even XUV light, and no one has figured out the materials yet for photonics at these wavelengths. At the moment, the only advantage is the low energy loss for linear operations as well as potentially much higher clock speeds than silicon (but then again you're speed-limited by silicon I/O and memory). (See the size sketch after this thread.)
      The real breakthrough will come with actual photonic switching (that nonlinear stuff), but right now this requires phase-controlled few-cycle laser pulses (a table-sized laser source) and probably plasmonic nanostructures to get enough field enhancement for nonlinear switching. If that kind of stuff can ever be integrated on a chip, you've got yourself a few-100-THz transistor...

    • @dvl973
      @dvl973 2 years ago +40

      @@RealNovgorod the problem of computing nowadays isn't really the computing itself. It's the communication on the chip. If we're going to see any advancements, I think it's going to be light-based communication inside our computers.

    • @mba4677
      @mba4677 2 years ago +16

      @@RealNovgorod awesome comment, bro. how do you people know so much about this stuff lmao is it your work?

    • @RealNovgorod
      @RealNovgorod 2 years ago +38

      @@mba4677 When you're juggling electrons around with a laser, you might pick up a bit or two about what it all could be useful for "in future". Don't get your hopes up too high though - actual photonic computers (not even the quantum flavor) are as far away as fusion energy, maybe even further.
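
    A quick check of the size claim in this thread, assuming telecom-band light at 1550 nm (a common choice for integrated photonics; the wavelength Lightmatter actually uses is not stated here):

        # Diffraction-limited optical feature size vs. a modern transistor node.
        wavelength_nm = 1550                  # assumed telecom-band wavelength
        min_feature_nm = wavelength_nm / 2    # rough lambda/2 diffraction limit
        transistor_node_nm = 5                # leading-edge silicon, order of magnitude
        print(f"~{min_feature_nm:.0f} nm optics vs ~{transistor_node_nm} nm transistors: "
              f"~{min_feature_nm / transistor_node_nm:.0f}x larger")
        # -> ~775 nm vs ~5 nm: ~155x, i.e. orders of magnitude, as claimed above.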

  • @faustin289
    @faustin289 2 years ago +460

    "Linear algebra": the math Swiss Army knife of computer science

    • @muzzletov
      @muzzletov 2 years ago +4

      no? a swith army knife has mutiple tools built-in, linear algebra is a tool you can use in multiple ways. thats a completely different analogy. xD 204 updvotes know what there are voting for.

    • @creator-link
      @creator-link 2 years ago +11

      muzzletov ?????

    • @mr.blackcat6223
      @mr.blackcat6223 2 years ago +12

      @@muzzletov Learn how to spell please.

    • @davidmangus
      @davidmangus 2 years ago +9

      Linear algebra is necessary as it guarantees linear separability and thus can be parallelized. Everything uses it because it has to in order to scale.

    • @mutestingray
      @mutestingray 2 years ago +1

      @@muzzletov @Faustin yeah that’s right you’re wrong Faustin smh how embarrassing you should delete your comment maybe even your account

  • @spacefacts1681
    @spacefacts1681 2 years ago +233

    This definitely feels like news I'd expect to read about in the year 2021

    • @thefirstsin
      @thefirstsin 2 years ago +2

      Ah yes, it's getting closer and closer: light computers exceeding our devices thousands of times over.

    • @thefirstsin
      @thefirstsin 2 years ago +3

      Holy he just said 20ghz.. Already at this year.. Graphite is 400+.

    • @oricooper9525
      @oricooper9525 2 years ago

      @@thefirstsin graphene

    • @pingpong1727
      @pingpong1727 9 months ago

      @@thefirstsin where did you get 400 from?

  • @NanClaymore
    @NanClaymore 2 years ago +392

    I'm worried that photonic logic might only output the binary state "yeah so"

    • @dissidentjaws
      @dissidentjaws 2 years ago +15

      You read my mind so yeah lol

    • @RENO_K
      @RENO_K 2 years ago +14

      Probably runs binary.
      If it runs using wavelengths as signals, it'd just break everything 😂
      Transmitting 2 bits at the same time on the same cable is insane. One running UV, one running IR.

    • @demonfedor3748
      @demonfedor3748 2 years ago +15

      @@RENO_K UV doesn't look very power efficient to me. That sort of device would use tons of power. Using polarized FIR at multiple frequencies makes more sense here, I believe. I wouldn't go as far as microwaves, but anything is possible. I am pretty sure they can do 128 bits per clock on the same cable with 64 colors and 2 different polarizations, however I might be wrong. (See the channel-count sketch after this thread.)

    • @NeoMK
      @NeoMK 2 years ago +1

      The 64 colors can basically be tied to 64-PSK, which is used in SATCOM. However, in SATCOM there is not only atmospheric attenuation, but every time you double PSK your power decreases by 3 dB; it wouldn't in an optical chip, so it's kinda cool. Interesting... ok so....

    • @krishanSharma.69.69f
      @krishanSharma.69.69f 2 years ago +1

      Wow... are you an A.I.? 'Cause you just found a pattern. That is what A.I.s are for.
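
    The channel count from this thread, worked out explicitly (the 64 colors are from the interview; the two polarizations and one bit per lane are the commenter's assumptions):

        # Independent lanes on one waveguide: wavelengths x polarizations.
        colors = 64
        polarizations = 2
        bits_per_lane = 1     # simple on-off keying assumed per lane
        lanes = colors * polarizations
        print(f"{lanes} lanes -> {lanes * bits_per_lane} bits per clock")
        # -> 128 lanes -> 128 bits per clock, matching the estimate above.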

  • @jimmyhackers8980
    @jimmyhackers8980 3 years ago +734

    I'm waiting for a shadow-based processor.

    • @ThePC007
      @ThePC007 2 years ago +93

      There's a video game on Steam that is funnily enough also called "Lightmatter" and it's about a scientific experiment (that was performed by a company named "Lightmatter technologies") that has gone so wrong that the shadows have begun devouring every living thing they touch.
      Surely those shadows could be used as a computational resource somehow.

    • @ontoverse
      @ontoverse 2 years ago +27

      That's what interference-based approaches to optical processing do; there are even some experiments searchable here on youtube regarding that. Essentially you diffract the light on input and use interference at a fixed focal point as logic operations (since interference is a quantized phenomenon, it works better for logic). Not sure how that miniaturizes, since you'd need very high-frequency light to make the diffractors very small. Unfortunately that scales really poorly, since pretty soon you're in the X-ray diffraction realm!

    • @guai9632
      @guai9632 2 years ago +12

      I worship His Divine Shadow

    • @3nertia
      @3nertia 2 years ago +4

      @@ThePC007 Sounds like antimatter ...

    • @Synthetiks
      @Synthetiks 2 years ago +9

      I guess you watched the Vsauce video about shadows (darkness) being faster than light

  • @ioratv
    @ioratv 2 years ago +553

    John: * asks question *
    Lightmatter CEO: Yeah, so

    • @theonewhobullies
      @theonewhobullies 2 years ago +45

      INTERESTING

    • @joshuagavaghan224
      @joshuagavaghan224 2 years ago +9

      This is just how nerds talk, man. 😅 I noticed this too, but only because I respond to everything with "yeah, ....."

    • @gnaarW
      @gnaarW 2 years ago +7

      And then he mostly dodged the questions, ugggh

    • @Joshlul
      @Joshlul 2 years ago +10

      this is the Mark Zuckerberg school of CEO speak

    • @tommylyeah
      @tommylyeah 2 years ago +2

      @Steve Fox
      🍻🥴

  • @supersaiyancommenter
    @supersaiyancommenter 2 years ago +60

    So instead of thermal throttling, we’ll have darkness throttling?

  • @ezg8448
    @ezg8448 2 years ago +160

    "Will it stay on the cloud?" This question was asked about 10 minutes in.
    That means no one can buy a physical product; you can only rent processing time from this company.
    That's a very big red flag in my book!

    • @SprDrumio64
      @SprDrumio64 2 years ago +24

      Yep, I either completely own my tech or I don't buy it at all

    • @-wenschow907
      @-wenschow907 2 years ago +33

      @@SprDrumio64 this really doesn't sound like consumer tech either way even if it can be bought and owned. Looks like they're just casting their PR net far and wide for another funding round.

    • @oblivion_2852
      @oblivion_2852 2 years ago +8

      You don't need this amount of compute personally. The idea is to use this in the cloud to train networks (that can perform more efficient algorithms) and then use the resulting network on traditional chips. Something like Nvidia's resolution scaling could be optimized even more by AI. Imagine a future where we have AIs on each end encoding and decoding video streams, giving 4K resolution from 1 kb/s information streams.

    • @raptagames
      @raptagames 2 years ago +12

      "Stay in the cloud" doesn't mean they are renting out the computational power. It means their customer is Google and not a private person.

    • @aarongalicia3760
      @aarongalicia3760 2 years ago +5

      “We are not a general purpose computer. We are a general purpose AI accelerator. I am not going to run Windows for you”
      They talked about releasing hardware by the end of the year but it’s not gonna be available to general consumers

  • @Baleur
    @Baleur 2 years ago +321

    He needs to be really careful about answering every question with "Yeah, so..."
    Or he's gonna end up promising things he never intended.

    • @jyvben1520
      @jyvben1520 2 years ago +12

      and it is a bit irritating; hope his business meetings are better ...

    • @thetruthexperiment
      @thetruthexperiment 2 years ago +22

      Yeah, so, that's like how all young CEOs talk, and it's scary.

    • @thetruthexperiment
      @thetruthexperiment 2 years ago +16

      @@0s0sXD Saying "Yeah, so..." is a nervous space filler, such as "like" and "um". It exemplifies the rapid degradation of civilization itself. 😝

    • @therealb888
      @therealb888 2 years ago

      @UC3-EKrIcg6VtSjtEypjz64A he's not careful, that's the point, genius

    • @Hickeroar
      @Hickeroar 2 years ago +10

      I think the questions are tailored to be "yes questions." I expect they were predetermined. He's perfectly up front about what the limitations are, and isn't "over"-promising anything. It's a limited feature set for specific jobs.

  • @HA7DN
    @HA7DN 2 years ago +86

    One question I expected was "how could you do this before the tech giants?"

    • @thefirstsin
      @thefirstsin 2 years ago +3

      Idk man

    • @xponen
      @xponen 2 years ago +31

      IBM and Intel also have this. My guess is they are like a hen sitting on eggs: they have something that works, but not well, and they don't know how to sell it.

    • @tadficuscactus
      @tadficuscactus 2 years ago +11

      They will probably buy him out.

    • @TheZenytram
      @TheZenytram 2 years ago +30

      Remember Kodak: they didn't care about digital cameras.
      Remember Tesla: they made a functional electric car, and now every car company is running behind to deliver their own; before that, electric was just concept cars they didn't care about at all.
      Same with the beginning of Apple and the IBM of that time.
      Those giants have their market and product; they can only "innovate" around their goose that lays golden eggs.
      They can't compete with themselves with a new golden egg.

    • @xponen
      @xponen 2 years ago +12

      @@TheZenytram true, Kodak even had a patent for the digital camera, but they chose not to innovate on purpose. Their "golden egg" was the film camera.

  • @jamesjensen5000
    @jamesjensen5000 2 years ago +2

    I became familiar with light transfer and photonics channeling 25 years ago through work done by a company called Silicon Graphics ... it has progressed slowly since then.

  • @wolfram77
    @wolfram77 2 years ago +107

    I was waiting to hear some technical details, but this is a surface-level discussion 😭

    • @prakharmishra3000
      @prakharmishra3000 2 years ago +14

      He spoke a lot of BS. Didn't even show a second picture of the CPU, let alone a prototype. When asked about the price, he changed the topic.

    • @savifrank6323
      @savifrank6323 2 years ago +2

      Host lost me when he said modern computers have hundreds of gpus

    • @Fernando_Ribeiro
      @Fernando_Ribeiro 2 years ago +9

      @@savifrank6323 well, technically a GPU is composed of hundreds of cores

    • @daath144occultistmaster5
      @daath144occultistmaster5 2 years ago

      Ty for rhetoric

    • @Real_MisterSir
      @Real_MisterSir 2 years ago +13

      @@savifrank6323 he said modern supercomputers - which may very well have hundreds of GPUs in a network working on the same tasks - basically like render farming, but on a single massive computer rather than tons of separate computers.

  • @Sk4lli
    @Sk4lli 2 years ago +32

    Very interesting. And kudos to the CEO for not selling it as a thing that will change everything, as is so often done with new technologies. He clearly knows the strengths and isn't afraid to admit the current shortcomings.
    It's an additional thing to put in computers to speed up specific tasks, the ones that will become more important in the future.

  • @SwedishDeathLlama
    @SwedishDeathLlama 2 years ago +17

    At CES, probably 10 years ago, NVidia had a small demo of some of their own research into optical computing. Obviously no sales at that point, but speaking to the engineer about the possibilities was one of the most interesting things ever. So happy to see this tech coming to market!

    • @PravinDahal
      @PravinDahal 2 years ago +5

      It's coming to the market like Nvidia's was coming to the market 10 years ago. Don't hold your breath.

  • @JuanPretorius
    @JuanPretorius 2 years ago +24

    Your questions are top notch: they're leading and to the advantage of the audience. Well done

  • @Alm8hoorOW
    @Alm8hoorOW 2 years ago +174

    The fact that this firm hasn't been acquired by big names in the industry speaks volumes about how much experts trust the technology.

    • @trjozsef
      @trjozsef 2 years ago +11

      The CEO called electron leakage* "quantum tunneling," I wouldn't trust whatever he's selling.
      * From what I gather it is electromagnetic induction, but I'm not an expert either.

    • @S3thc0n
      @S3thc0n 2 years ago +137

      @@trjozsef At the scale of current-gen semiconductors, the mechanism of leakage is quantum tunneling.🤦‍♀️
      An expert you are truly not.

    • @suagy7492
      @suagy7492 2 years ago +45

      @@trjozsef Get your facts straight, the CEO didn't spew nonsense

    • @SandstormGT
      @SandstormGT 2 years ago +7

      Not everyone is interested in selling out to big tech...

    • @Restrictted
      @Restrictted 2 years ago +19

      Companies don't have to worry about people "buying" anything. I mean look at the iPhone. People buy it and it's years behind Android.

  • @ripper82
    @ripper82 2 years ago +17

    22 questions asked
    8 questions were ‘yes/no’
    20 answered with “Yeah…”

    • @johnkoetsier
      @johnkoetsier 2 years ago +2

      :-) how much time do you have? btw, chatting with him again in a couple weeks and will ask all the top questions from these comments

  • @humanbass
    @humanbass 2 years ago +368

    If it lives up to his claims, he will be richer than Bezos lol.

    • @sinephase
      @sinephase 2 years ago +39

      doubt it, I don't think even nvidia+arm is that rich

    • @KP3droflxp
      @KP3droflxp 2 years ago +64

      @@sinephase the dude would basically have a monopoly on the fastest GPU type processors

    • @sinephase
      @sinephase 2 years ago +47

      @@KP3droflxp OK but I don't think even AMD combined with Nvidia is even that rich though LOL

    • @MyContext
      @MyContext 2 years ago +6

      Unless he sells out to Bezos or someone else, which is common...

    • @owenb6499
      @owenb6499 2 years ago +11

      It seems like right now it's not consumer at all. Hopefully eventually we can have all components be photonic.

  • @mr_viss_
    @mr_viss_ 2 years ago +121

    I guess this is the next company that Nvidia is going to buy.

    • @flynifty3951
      @flynifty3951 2 years ago +9

      Or AMD

    • @mr_viss_
      @mr_viss_ 2 years ago +16

      @@flynifty3951 I don't know about AMD, but Intel might try to get a hand in the GPU market...

    • @XJoukov
      @XJoukov 2 years ago +3

      they should have bought this instead of ARM really --'...

    • @matrixace_8903
      @matrixace_8903 2 years ago +4

      And shut it down ⬇️

    • @Leo99929
      @Leo99929 2 years ago +10

      Nvidia, Intel, and AMD are too smart to fall for this bullshit. 30 seconds on Google shows the guy has no idea what he's talking about. His stats are regularly off from reality by factors of 100 or more.

  • @richardmunsch8127
    @richardmunsch8127 2 years ago +42

    Just another cold fusion until something actually reaches the market and proves not to be a load of natural organic fertilizer.

  • @ongeri
    @ongeri 3 years ago +646

    Nice to see smart people working on some of my random showertime brainwaves; makes me proud, guys.

  • @BrutusPalmeira
    @BrutusPalmeira 2 years ago +28

    A great first step would be to see one of these working even at 1990’s clock speeds. This could be another Theranos…

    • @AntonySimkin
      @AntonySimkin 2 years ago

      I hope it works at least like the beast GPUs now but with the promised 1/10th of the wattage needed, so I can build an android to be my waifu lol

    • @esuil
      @esuil 2 years ago +4

      Yup, I could not find any information about this from any real people who have their solution. There is literally zero information and no demos if you try searching for it.
      Everything looks great on paper. But the fact that I can't find any actual hardware demos anywhere is concerning.

    • @justinthejerkoff
      @justinthejerkoff 2 years ago +1

      @@esuil that's because it doesn't exist and it never will. The idea of "photonic" computing, or more accurately but much less cool-sounding, optical computing, has been around since like the 70s or 80s. And yet nobody has ever produced a chip that does anything near what's promised.
      It's all hot air.

    • @franklee663
      @franklee663 a year ago

      Light travels at... err, the speed of light, almost instantaneous. The only latency will probably be the DACs and ADCs for interfacing. Electrical signals also propagate at nearly the speed of light, but the difference is how the arithmetic is done. Just for the addition of digital signals, you need to go through multiple steps (each step driven by an internal clock signal). But for analog, you just join two wires together and the signals add up in one step. Yes, you can also do it with electrical analog computing, but then you have to deal with external influence from EMF from other components and such; light, once sealed, can't have signals added or subtracted by external sources.
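
    A small sketch of the step-count contrast described above: a digital ripple-carry adder spends one full-adder stage per bit, while an analog or optical sum superposes the signals in a single pass (the adder below is standard textbook logic; the one-step analog claim is the comment's premise, not a measured figure):

        # Digital addition: one full-adder stage per bit of width.
        def ripple_carry_add(a, b, bits=8):
            result, carry, steps = 0, 0, 0
            for i in range(bits):              # each stage waits on the previous carry
                ai, bi = (a >> i) & 1, (b >> i) & 1
                result |= (ai ^ bi ^ carry) << i
                carry = (ai & bi) | (carry & (ai ^ bi))
                steps += 1
            return result, steps

        print(ripple_carry_add(100, 55))   # -> (155, 8): right answer, 8 staged steps
        # An analog summer gets the same 155 in one step, at the cost of noise.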

  • @MrCarburator
    @MrCarburator 2 years ago

    Thanks John. I think your interview is right there at the forefront. Excellent! PS: I'm not 100% sure, but if it's a large-diaphragm mic you are using, it may be pointed the wrong way. You may want to check its manual and get better sound free of charge, so to speak. Cheers!

  • @EdgyShooter
    @EdgyShooter 2 years ago +20

    Glad I'm doing a PhD in Photonics!

  • @MassageWithKlay
    @MassageWithKlay 2 years ago +53

    "Computers with friken laser beams" .... looks like someone has been getting all Dr. Evil on us :)
    Besides that, the chipset sounds promising and should be really good; now we just have to see how the future looks for them.

  • @madras61
    @madras61 3 years ago +20

    Great interview. The focus of the company for now is more on deep learning for cloud computing in the data centers of the mega houses like AWS, Azure and Google.
    Once those are in use in their data centers, the experience will help Lightmatter scale.

  • @hollowmajin5146
    @hollowmajin5146 2 years ago +50

    That's the funny thing about the "laws" of nature: they aren't actually laws, they're simply descriptions of nature as we understand it. The laws don't change, only our perception of how the universe works.

    • @communist-hippie
      @communist-hippie 2 years ago +2

      Smartly said, never thought about it that way

    • @raptagames
      @raptagames 2 years ago +9

      Moore's law was never meant as a law; it's more a guideline.

    • @Drakeblood97
      @Drakeblood97 2 years ago +4

      Well no, they are in fact laws. Saying that a law is 'just a description, so it can't be a law' is like saying a theory is 'just an idea, so it can't be true.' That's exactly what a law of nature is by definition (Moore's law isn't a law of nature). Laws aren't meant to "change our perception of how the universe works," they're just meant to formalize our observations so we have something written down to reference.

    • @BlastinRope
      @BlastinRope 2 years ago

      @@Drakeblood97 Swing.... And a miss.

    • @hollowmajin5146
      @hollowmajin5146 2 years ago +1

      @@Drakeblood97 I believe there is a misunderstanding, as you have simply rephrased what I said.

  • @bananalord8575
    @bananalord8575 2 years ago

    John, this is the first video of yours I have watched, and I have to say that you are a very good interviewer. You smoothly asked new questions (the same ones I had), didn't interrupt him, etc. etc. ... good job

  • @Vermilicious
    @Vermilicious 2 years ago +36

    Sounds very impressive, but I can't help thinking that it has to interface with traditional hardware that can't keep up.

    • @Danger-Tater
      @Danger-Tater 2 years ago +6

      I don't think traditional hardware has to keep up; this is made for big supercomputers or AI learning centers. Unless they optimize it to run graphics programs like games and graphic design software, and to be a general-use computer, current hardware doesn't have to keep up. But who knows what will happen on the next page of history, which is what's exciting

    • @TheBestNameEverMade
      @TheBestNameEverMade 2 years ago +13

      It's kinda like how your graphics card is 100x faster than your computer. It doesn't matter, though, because all the computer needs to do is feed it the problems and take the output. The output could be generated at normal computer-readable speed, but the problems would take a CPU much longer to solve. (See the sketch after this thread.)
      In AI, for example, you are multiplying a huge matrix by another huge matrix.
      A traditional computer would take forever to multiply and add each element. With one of these you can take a large chunk, do all that work in the light processor, and just spit out the multiplied matrix segments at the computer's read rate.

    • @SinanAkkoyun
      @SinanAkkoyun 2 years ago +7

      Well, imagine it like this:
      When you interface with a very fast photonic AI device, you only need to send the relevant data; the device does the heavy lifting with it.
      It is like typing into a calculator. Sure, your fingers are way slower than the calculator could process, but doing highly sophisticated algebra and so on would take you wayyy longer yourself than typing it into a black box that does the work :)
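
    A toy version of the host/accelerator split described in this thread. Everything here runs on the CPU; photonic_matmul is a hypothetical stand-in for an offloaded device call, not Lightmatter's actual API:

        import numpy as np

        def photonic_matmul(a_blk, b_blk):
            # Hypothetical device call: the host only moves blocks in and out;
            # the multiply itself would happen inside the accelerator.
            return a_blk @ b_blk

        n, blk = 512, 128
        A = np.random.rand(n, n).astype(np.float32)
        B = np.random.rand(n, n).astype(np.float32)
        C = np.zeros((n, n), dtype=np.float32)
        for i in range(0, n, blk):            # host-side loop: feed and collect
            for j in range(0, n, blk):
                for k in range(0, n, blk):
                    C[i:i+blk, j:j+blk] += photonic_matmul(
                        A[i:i+blk, k:k+blk], B[k:k+blk, j:j+blk])
        assert np.allclose(C, A @ B, rtol=1e-3)   # same answer, different engine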

  • @ultraviolet.catastrophe
    @ultraviolet.catastrophe 2 years ago +8

    This is such an awesome interview, guys. Very good questions asked, great answers given, and an overall wonderful introduction to the technology. Thumbs up! +1!

  • @meetoptics
    @meetoptics 2 years ago

    So nice to see the industry developing and evolving. Let's see how this project progresses, but the idea seems promising. At MEETOPTICS we are proud to be part of the photonics community and to help engineers and researchers in their search for optical lenses through our site. We celebrate every step forward. Congratulations!

  • @jksharma7
    @jksharma7 2 years ago

    Wonderful, insightful information.... Thanks to both of you.

  • @smjure
    @smjure 3 years ago +5

    Good questions John, tnx

  • @big.atom37
    @big.atom37 2 years ago +15

    The term computer in the modern world usually refers to a piece of hardware that can perform all computational tasks. In other words, it's a machine with a general-purpose CPU inside. Hardware that is used to speed up some specific operation like matrix multiplication is usually called an accelerator.

    • @Kindlylisten3
      @Kindlylisten3 2 years ago

      Nonbeliever, why did you choose that name?

    • @big.atom37
      @big.atom37 2 years ago +1

      @@Kindlylisten3 What's wrong with it?

    • @Kindlylisten3
      @Kindlylisten3 2 years ago

      @@big.atom37 Because i don't think we become nonbeliever but as a choise.
      We still remain agnostic.
      And Because God wants us to believe in him. I am not christian. I'm muslim. So God is only one. And muhammad is his messenger. And our afterlife is based on our belief and actions otherwise failure forever.

    • @big.atom37
      @big.atom37 2 years ago +2

      @@Kindlylisten3 From where I stand, believing in gods is a weakness of human nature. The human brain is very limited in what it can process, so it needs some simplified picture of the world to serve as a basis, which for many is religion.

    • @Kindlylisten3
      @Kindlylisten3 2 years ago

      @@big.atom37 I'm talking about God not Gods. And that God said in his book that "He is the creator & expander of the universe". And also said he made the "best" in wombs of our mothers. Also, He made everything in "pairs". Also, The biggest sign is He himself challenging humans to write a verse like his Book. The Divine book i am referring to is Qur'an containing signs. Saved from a time About 650 A.D.
      Give it a thought!

  • @suntzu1409
    @suntzu1409 2 years ago +22

    "You can literally see it!"

  • @slowfudgeballs9517
    @slowfudgeballs9517 2 years ago +6

    I want a "Yeah so" counter in the top right.

  • @raymundhofmann7661
    @raymundhofmann7661 2 years ago +25

    What is not mentioned here is that the calculations performed this way are lossy, i.e. have a limited signal-to-noise ratio, because the data is first D/A-converted from electrical signals to optical radiation and then A/D-converted back to electrical signals. The optical "processing" in between is just a kind of "Fourier optics" thing, a kind of instant convolution/correlation but with imperfections, or something that can be projected onto it with reasonable benefit. There are applications for which the lossy and numerically unstable nature is acceptable, but for the majority of applications computers are used for, it is not. (A small noise simulation follows this thread.)
    As the whole AI and machine learning field is in any case a very fuzzy affair where no one expects deterministic results, this is consequently the market they say they bring some benefit to.

    • @nonconsensualopinion
      @nonconsensualopinion 2 years ago +2

      Why do you think they have any A/D conversion? Binary thresholded electrical signals are encoded into binary light pulses and then thresholded back to electrical.
      It's no different than fiber optics. Nobody thinks that sending your data over a fiber-optic cable yields reduced numerical accuracy on the receiving machine.

    • @raymundhofmann7661
      @raymundhofmann7661 2 years ago +8

      @@nonconsensualopinion Your lack of understanding lets me hope you don't work as an engineer.
      In fiber optics there is no intentional signal processing or alteration happening over the medium, if it is like multi mode reflections or parasitic reflections at couplers / connectors, then these are unwanted and corrected by digital error correction and digital / analog or even electro-optical transmission line equalizers.
      And these are relatively easy to correct, meaning using little processing and electrical power. This is by design and mainly a function of how much information has to be convoluted to check for transmission errors and correct them to very low accepted rates, which never are zero, btw.
      With this "photonic computing", whatever processing or function is going on in the medium, from electrical signal in to electrical signal out, which is a necessity simply because they advertise it can be integrated into an existing data center, more or less evades any error correction.
      Do you know Fourier optics? Do you know what a 4f correlator is?
      You can "optically calculate" a 2d Fourier transform blazingly fast, basically at the speed of light, but you would also have to convert your data to optical (D/A) and then convert it back to digital (A/D). This combined with the optical imperfections of the used elements basically determines the signal-to-noise you might achieve, how lossy your processing function is.
      And there is no way to completely get rid of this except than doing the same calculation which happened in the "photonic computer" again on a classic computer. As said, a huge convolution of information would have to be done on the electronic side, basically the optical calculation repeated, to get to the wanted bit error rate, making the optical calculation redundant.
      Of course there could be some clever ways to improve on that with viable measures on the electronic sides of the "photonic computer", meaning to improve on the signal to noise or give the noise some characteristics more suitable for the intended application, but these definitely will not get rid of the noise, they will just make this lossy computing more viable for certain applications, but they will never rival the low and cheap to get error rates of classic computers.
      A classic computer is also lossy, has a error rate or a signal to noise if you want to say, but this can be made so amazingly low that human factors like human lifetime easily make the dominant "error rate".

    • @yimoawanardo
      @yimoawanardo 2 years ago +5

      @@raymundhofmann7661 Everything you said is good; I just wish you were a little less salty when answering '^-^. Thanks for the info though

    • @Scholzey
      @Scholzey 2 years ago +1

      @@raymundhofmann7661 what node are these on? Surely you're not going to fit billions of optical-electrical transmitters and receivers on a chip.
      Initially I thought they were swapping copper traces for optical tubes. But if they are multiplexing, then obviously not.
      I didn't really hear how it actually works - what changes state, etc., to allow the light to do the work instead of just carrying information.

    • @tetraedri_1834
      @tetraedri_1834 2 years ago

      @@raymundhofmann7661 My first instinct to reduce the effect of noise would be to couple the data with an error-correcting code, so that the CPU can detect and fix errors due to noise later (assuming noise is below the threshold of the code used). Of course, this would require homomorphic error-correcting codes (i.e. codes that are preserved under evaluation of functions), and I don't know how much they have been studied. But if photonic computation becomes big, I bet research in homomorphic error-correcting codes will become a big field
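
    A small simulation of the noise argument in this thread: an exact multiply bracketed by quantizing converters, which is the lossy part. The converter bit widths below are illustrative assumptions, not Lightmatter specs:

        import numpy as np

        def quantize(x, bits, x_max=1.0):
            # Uniform quantizer standing in for the DAC/ADC stages.
            step = 2 * x_max / 2 ** bits
            return np.clip(np.round(x / step) * step, -x_max, x_max)

        rng = np.random.default_rng(0)
        W = rng.uniform(-1, 1, (64, 64))   # weights, assumed resident "on chip"
        x = rng.uniform(-1, 1, 64)
        exact = W @ x
        for bits in (6, 8, 10):
            y = quantize(W @ quantize(x, bits), bits, x_max=np.abs(exact).max())
            snr_db = 10 * np.log10(np.sum(exact**2) / np.sum((exact - y)**2))
            print(f"{bits}-bit converters: ~{snr_db:.0f} dB output SNR")
        # The multiply itself is exact here, yet output precision is already
        # finite and improves only ~6 dB per added converter bit.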

  • @P8qzxnxfP85xZ2H3wDRV
    @P8qzxnxfP85xZ2H3wDRV 2 years ago +57

    I'd love to see this as a PCIe expansion card that I can use in my computer to accelerate raytracing in games. And obviously machine learning.

    • @ashtentheplatypus
      @ashtentheplatypus 2 years ago +27

      So, you'd have light simulating light!

    • @fennadikketetten1990
      @fennadikketetten1990 2 years ago +8

      @Rusty Clark you mean pseudo-raytracing right

    • @wombatillo
      @wombatillo 2 years ago

      Drivers and driver support would be a nightmare. People have been talking about co-processors (FPGA, neural, etc.) for ages and nothing much has come of them.

    • @P8qzxnxfP85xZ2H3wDRV
      @P8qzxnxfP85xZ2H3wDRV 2 years ago +9

      @@wombatillo Isn't the GPU a co-processor?

    • @SianaGearz
      @SianaGearz 2 years ago +3

      I can't see raytracing as a viable task for this topology, because the data structures needed for that are insanely complex. The only operations that this accelerates are multiply-accumulate on straightforward data structures, or basically matrix multiplication, which is computationally intensive but structurally extremely simple. So neural networks are an excellent match.
      But you know what i bet is coming? Cryptocoin mining.
      Other uses this will potentially adapt to easily are uniform finite element analysis physics simulations, such as fluid dynamics and load bearing structures. The latter powers things such as generative design, where you specify load bearing surfaces, and have the CAD generate an optimised structure that solves a given mechanical task. Warning, if you have trypophobia (anxiety of holes), maybe don't google "generative design", and also don't google "trypophobia". Also this hardware might be useful for things like protein folding, for medical research.

  • @antonnym214
    @antonnym214 2 years ago +5

    Excellent interview! This is hyper-interesting technology, and certainly kudos for the spectacular advancements made at Lightmatter. A gentle note for Mr. Harris, which will help especially when relating and communicating with anyone over 40: Try not to start every answer with "Yeah so." I would recommend just not using it to begin ANY sentence. One need only view a single episode of Shark Tank, for example, to see the effect that can have on discourse. All good wishes!

  • @lstan444
    @lstan444 2 years ago +3

    Good interview; I like the way you structured the questions. Real journalism!!!

  • @ll_distribution
    @ll_distribution 2 years ago +44

    This is a 10! Great interview, John, and what a great communicator this CEO is. Nothing short of impressive: clear, precise, and articulate, besides obviously being a whiz in edge tech. Wow

  • @danfg7215
    @danfg7215 2 years ago +130

    Sounds like the overhyping of No Man’s Sky

    • @EversonBernardes
      @EversonBernardes 2 years ago +20

      The claims aren't so outlandish, actually. The vast majority of the inefficiency of current chips comes from moving data around inside the chips. As you miniaturize, resistance increases and you get more parasitic loss and quantum effects, meaning you have to ramp up voltages to achieve the same frequencies. Just by moving to light-based data transport through waveguides you are already making huge savings in power (so, less waste heat), which also allows you to ramp up frequencies.
      The main issue is steering the monumental ship that is semiconductor technology to adopt a completely new production paradigm.

    • @danfg7215
      @danfg7215 2 years ago +11

      @@EversonBernardes oh, I don't doubt the tech, just how nonchalantly the guy throws out multipliers and promises like nobody's business. I'll believe it when I see it. No Man's Sky eventually delivered, but it took years of actual hard work after the disastrous launch, and the lead designer shutting up about overhyped public statements.

    • @____uncompetative
      @____uncompetative 2 years ago +1

      @@danfg7215 What are you talking about?
      _No Man's Sky_ delivered on everything that was in its Steam product description at launch.

    • @danfg7215
      @danfg7215 2 years ago

      @@____uncompetative hmm, unless you're being sarcastic, you can learn more about the game's prerelease overhype in this random summary video: czcams.com/video/PvCg0Cz7n4I/video.html

    • @____uncompetative
      @____uncompetative 2 years ago +1

      @@danfg7215 I am not talking about anything said in early interviews. Games can change for aesthetic, technical, or business reasons up until the point of preorder. What they then promise they should satisfy. What Sean Murray put in the Steam product description was delivered by the game at launch. Games can be hyped up for years and then cancelled and not come out at all. _No Man's Sky_ released. Sean Murray never lied:
      czcams.com/video/wyV9XOtEh1A/video.html

  • @michaelsmith6818
    @michaelsmith6818 2 years ago +34

    New drinking game. Every time a CEO starts his answer with "Yeah, so..." take a drink. You'll be bombed by the end of the video.

    • @vondahe
      @vondahe 2 years ago +2

      In fact, every time people start a sentence with "so", I stop listening. Doing that generally indicates two things, both of which turn off my interest.

  • @Mr.Unbreakable83
    @Mr.Unbreakable83 2 years ago

    Hey John, just wanted to let you know that I subscribed to your channel because of this video. It's super interesting and I would love to learn more about photonic computing.

  • @anubisplays1421
    @anubisplays1421 2 years ago +22

    Surely you would want to show your tech off, but no demonstration? Sounds suspect to me.

    • @MonsterJuiced
      @MonsterJuiced 2 years ago +4

      Was gonna say this. Why is there no demonstration?

    • @HDSQ
      @HDSQ 2 years ago

      Considering that it's mostly for AI and number crunching, I doubt there's much that can really be shown at this point besides a wall of text on a screen, which isn't very interesting to most people.

    • @xSuspect-7
      @xSuspect-7 2 years ago

      @@HDSQ Making comparisons would do, but yeah, for the general public it wouldn't make sense

    • @J3R3MI6
      @J3R3MI6 2 years ago

      Seeing some graphics processing or rendering would be nice

    • @esuil
      @esuil 2 years ago

      @@HDSQ I was interested and could not find even that anywhere. So yeah...

  • @swifty1969
    @swifty1969 2 years ago +40

    So how come AMD, Intel, Nvidia, Amazon, and Google are not in a bidding frenzy over this company?

    • @vallorahn
      @vallorahn 2 years ago +2

      You forgot Apple and Tesla.

    • @DouglasWatt
      @DouglasWatt 2 years ago +30

      Intel and IBM have been doing minor research projects in optical computers for years, but there wasn't enough market for narrow-focus chips until neural networks became a thing, and the current state of optics requires designing narrow-focus chips, like tensor-math chips. Also, highly controlled nano-scale photon generation is a pretty recent invention, and that has prevented optical circuits from keeping pace with silicon improvements. But a lot of improvements have happened over the last 10 years, and IBM has been growing its photonics & optical computing division. If Lightmatter's product does what it says it does once it's in the cloud, they will definitely be a target for the major players. But being a target doesn't guarantee that Lightmatter agrees to a buyout either.

    • @trueneutral3092
      @trueneutral3092 2 years ago +3

      Maybe this is the company that's going to tell them to go get bent.

    • @xMilesxHighxClubx
      @xMilesxHighxClubx 2 years ago +1

      @@trueneutral3092 Lol

    • @simon4133
      @simon4133 2 years ago +1

      They will once it achieves a certain market share, at which point they will add their government approved spyware called Intel ME.

  • @trocha419
    @trocha419 2 years ago +1

    I did a research paper on this for my physics class in high school in '05. I couldn't figure out a solid way back then to translate the light to electrical signals

  • @ComputerHistoryArchivesProject

    Highly fascinating video, great detail in the explanation. Thanks for making and sharing this!

  • @WildEngineering
    @WildEngineering 2 years ago +5

    I'm interested in how phase, polarization, and wavelength can be optimized to squeeze out more performance

  • @CanaldoMiauOfficial
    @CanaldoMiauOfficial 2 years ago +91

    Cryptominers be like: Yeeeeeeeeeeeah boi.

    • @360addict70
      @360addict70 2 years ago +23

      not really. If anything it'll devalue crypto by making it easier to make; crypto is valuable because it's hard to make.

    • @CanaldoMiauOfficial
      @CanaldoMiauOfficial 2 years ago +5

      @@360addict70 Yes, you're right, but it solves the problem of the energy consumption caused by cryptocoins, so it's nice for people who want to sell NFT art. And probably for now only a few people will have this kind of "photonic computer", but I get you, you're right.

    • @Zephyrus0
      @Zephyrus0 2 years ago +7

      They will just create harder-to-crack hashing algorithms. Take Monero's RandomX, for example: it's made in such a way that GPUs and ASICs suck at solving it.

    • @JorgetePanete
      @JorgetePanete 2 years ago +1

      Remember that proof of stake is becoming common

    • @xSuspect-7
      @xSuspect-7 2 years ago +4

      @@360addict70 Not really, it would have the same effect we have today.
      Weaker GPUs can mine, but the value is not that high because we have stronger GPUs mining faster, dictating the price.
      If this became a reality, those with the new technology would dominate the market and everyone else with weaker GPUs would gain less.

  • @randfee
    @randfee 2 years ago +1

    nice interview; I was wondering when somebody would create a chip like this! FINALLY!
    Is there some kind of technical video on how the chip works fundamentally, how it does math, and about the architecture?
    I think this would be great to show some of my students. I'd be happy to promote it on LinkedIn and use it/link to it at conferences!
    Kudos to the entire team at Lightmatter and best of luck. I want to see real-time 120fps+ 8K raytracing for AR/VR applications running on these in 10 years ;-)
    ... a fellow photonics (lasers) physicist!

  • @aliuzel4211
    @aliuzel4211 2 years ago

    Super interesting. Enjoyed. Thank you.

  • @fergarram
    @fergarram 2 years ago +4

    This was a great interview! Very clear and easy to follow, thanks!

  • @mikimouse3001
    @mikimouse3001 2 years ago +142

    To me this smells like that " brand new revolutionary battery technology" all over again.

    • @ObliviouslyAware
      @ObliviouslyAware 2 years ago +12

      That's still a thing; the materials keep changing though, so R&D funding keeps getting shifted around. Right now the hot option is graphene ultracapacitors. You're still not wrong though.

    • @goku21youtub
      @goku21youtub 2 years ago +9

      it's sad that YouTube doesn't show downvotes on comments anymore... it would hurt your feelings for real lol

    • @IARRCSim
      @IARRCSim 2 years ago +14

      It smells like fraud to me. It reminds me of Theranos. It sounds too good to be true, and they're not claiming to deliver until the end of the year, so we'll see then whether it gets delayed, disappoints, or ever becomes available for outsiders to fairly compare with more conventional technology at all.

    • @gabrielvieira6529
      @gabrielvieira6529 2 years ago

      ???

    • @stevenarvizu3602
      @stevenarvizu3602 2 years ago +9

      The technology exists, it’s a matter of logistics. If you can’t find
      >investors
      >workers experienced in the field of “your new invention”
      >access to material goods through inexpensive or cost effective means
      You may as well not even make whatever you have until you do.
      This is why most “ground breaking” technology disappears and then comes back ten years later.
      When Pixar released movies with ray tracing it was amazing! They were like “once we figure out how to do this quicker it’s gonna be so mainstream!” And then no one ever really used it again.
      Then ten years later Nvidia figured out how to do it in real time and now it’s back again.
      These things take time, unfortunately

  • @patrick6110
    @patrick6110 2 years ago

    Excellent interview! Thank you for posting.

  • @localnyraccoon
    @localnyraccoon 2 years ago +2

    Excited to see where this is going; if it can actually make its way into personal computers, that would be amazing.

  • @quite1enough
    @quite1enough 2 years ago +45

    I like how Mr. Harris says a lot, yet absolutely nothing. I have my doubts, I'd say.

  • @ChernzObyl
    @ChernzObyl 2 years ago +3

    Correct me if I'm wrong but this should revolutionize blockchain applications

  • @emmanueloluga9770
    @emmanueloluga9770 2 years ago +1

    What a great video, even a year after first viewing it. I have followed Lightmatter for almost 4 years now, and I am pumped for all they have to accomplish. GG.
    14:42 is a great anecdote about what optical computers are not good at currently. This most definitely needs a fundamental rethinking of the approach to logic operators (maybe Fredkin gates?).

  • @xnadave
    @xnadave 2 years ago

    I took a class on photonic bandgap crystals in grad school. For our project, one of my classmates designed an entirely optical XOR gate. It's a really fascinating field. (Ha. Field. Get it?) Kudos to the interviewee for breaking down a really complex topic into easily digestible terms.

  • @jackburton5085
    @jackburton5085 2 years ago +11

    "Now there's actually one that you can order" ....

  • @joey286
    @joey286 2 years ago +7

    Great video... and yes, to answer your last question, it is 100% possible to have local AI built in. Also, I am going to create an optical computer that is better than this one. hehe. Good luck Nicholas! Cheers.

  • @jonnupe1645
    @jonnupe1645 2 years ago +2

    This is exciting! I can't wait to process local weather in real time on a (hopefully, eventually) smaller form factor with a near guarantee of needing less processing power

  • @zeropointclocker4228
    @zeropointclocker4228 2 years ago

    I had some options, until I scrolled down to see they were all covered lol. Liked and subbed as well. Good stuff.

  • @chicolofi
    @chicolofi 3 years ago +10

    I don't care. I want a personal photonic computer.

  • @seasong7655
    @seasong7655 3 years ago +6

    I've tried some machine learning before and wish I'd had this in my PC.

  • @shaun6828
    @shaun6828 2 years ago +1

    I wonder about the density and how they generate / receive the optical signals. Interesting ideas involved.

  • @El.Duder-ino
    @El.Duder-ino 6 months ago

    Very cool, thx for sharing this interview!👍 This was 2 years ago, where is this technology now?

  • @cesarmonroy-olivares4125
    @cesarmonroy-olivares4125 2 years ago +52

    You lost me when you said you won't run Windows... JK

  • @plekkchand
    @plekkchand 2 years ago +4

    The engineer answers every question, oddly, with "Yeah". This may be the new "So", which recently took the place of the direct acknowledgement of the questioner characteristic of traditional conversational etiquette, and was widely adopted by think-tank people partly to convey the implication that they are not in the same universe as the lowly interviewer. It's as if the two parts of the conversation were simply occurring simultaneously, with gaps provided.

  • @maciejajewski
    @maciejajewski 2 years ago

    John, you are very eloquent. One can easily tell you are a smart man from the quality of your questions. Definitely subscribed

  • @cipherxen2
    @cipherxen2 2 years ago +1

    Nicholas' reply header contains the magic string "YEAH"

  • @MrSomeDonkus
    @MrSomeDonkus 2 years ago +15

    The number of times I've watched this exact same video and the thing ends up being nothing is astounding.
    This type of video almost seems like a parody at this point.

  • @asdfasdf71865
    @asdfasdf71865 2 years ago +5

    If you can do matrix multiplication, you can simulate a quantum computer and implement traditional logic with the Toffoli gate.
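
    A minimal sketch of the Toffoli-gate half of that claim: the gate is an 8x8 permutation matrix over the three-bit states |abc>, so applying it is literally a matrix multiply, and it computes AND into its target bit:

        import numpy as np

        T = np.eye(8)
        T[[6, 7]] = T[[7, 6]]   # Toffoli: swap |110> and |111>

        def toffoli(a, b, c):
            state = np.zeros(8)
            state[(a << 2) | (b << 1) | c] = 1     # one-hot encode |abc>
            out = int(np.argmax(T @ state))        # logic as matrix multiplication
            return (out >> 2) & 1, (out >> 1) & 1, out & 1

        for a in (0, 1):
            for b in (0, 1):
                assert toffoli(a, b, 0)[2] == (a & b)   # target bit = a AND b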

  • @woodstoney
    @woodstoney 2 years ago +1

    Great video bringing new light to photonic computing! ;)

  • @steve222010
    @steve222010 2 years ago

    Why isn't this front-and-center news in the headlines? Major news networks haven't even given it 5 seconds of coverage.

  • @PASTRAMIKick
    @PASTRAMIKick 2 years ago +13

    They found a wrecked terminator in a factory and based on its technology are now developing this computer, nice.

    • @davestorm6718
      @davestorm6718 2 years ago

      😎

    • @luke144
      @luke144 2 years ago

      🦾

    • @richystar2001
      @richystar2001 2 years ago

      If you examine and study the evolution of the micro transistor chip... it's a very boring and slow read.... Everyone likes to think that alien tech or something was used to advance breakthroughs in science. But the evolution of technology is slow and very uninteresting.

    • @davestorm6718
      @davestorm6718 2 years ago

      @@richystar2001 - I wouldn't say it's really that boring, but indeed, people who don't know how, for example, integrated circuits got smaller and smaller need to look into photography/optics and photochemistry. These 2 fields initially had the most impact on IC construction and revolutionized the electronics industry the most (the application of microfilm tech, basically).
      The rest is quite ordinary incremental advancement (though with the "Law of Accelerating Returns" in play with regard to all the other technological and scientific advancements that would be impossible without the computer).

  • @johnkoetsier
    @johnkoetsier 2 years ago +47

    And ... here's the Q&A with Lightmatter CEO Nick Harris answering some of the best/top/funniest questions below: czcams.com/video/mF4QendzazQ/video.html

    • @KnowArt
      @KnowArt 2 years ago +6

      I'd be really really interested in how it _actually_ works. I have some understanding of transistors and logic gates, but I can't understand how you'd actually do linear algebra with light.

    • @DudeSoWin
      @DudeSoWin 2 years ago +3

      Will it run DOOM?

    • @maxwhite4732
      @maxwhite4732 2 years ago +1

      I think the top 2 things for me are gaming and rendering. He mentioned it's good at raytracing, which is perfect for both, so I would imagine this would be a huge selling point for a lot of people. Exciting stuff!

    • @2beJT
      @2beJT 2 years ago

      How can I invest in this?

    • @johnkoetsier
      @johnkoetsier 2 years ago +2

      Just completed the follow-up interview. I'll post the link here when it's ready to go

  • @ntrpnr
    @ntrpnr 2 years ago

    How do you load the model onto the chip? Is each chip custom made for its inference model?

  • @cumuIonimbus
    @cumuIonimbus a year ago +2

    This dude Nicholas seems to excel at 3 things at the same time: science, entrepreneurship, and life in general. You can rarely find a person who manages to combine even 2 of them.

  • @MohammedIqlasUddin
    @MohammedIqlasUddin 2 years ago +6

    Every answer starts with "Yeah so" on a photonic computer

    • @rRobertSmith
      @rRobertSmith 2 years ago

      So is that a voice coach problem or a lack of rehearsal (real honest answers!) problem then?

    • @Danger-Tater
      @Danger-Tater 2 years ago

      @@rRobertSmith yeah so... Not rehearsing questions is bad for your company's rep, because people might consider you incompetent and think you don't know what you're doing

  • @roger_isaksson
    @roger_isaksson 2 years ago +7

    Harris seems like a chill dude in charge of an operation. 🤔👍

    • @therealb888
      @therealb888 2 years ago +1

      They all seem like that until you work for them.

  • @ThePhotovoltaicMan
    @ThePhotovoltaicMan 2 years ago +1

    How many different colors could you run through a core? Every nm of wavelength? Or every 12 nm, 20 nm, 50 nm, etc.?
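
    A worked estimate for the question above, using the standard conversion between frequency spacing and wavelength spacing. The 100 GHz grid is a common telecom assumption, not a stated Lightmatter spec:

        # Channel spacing in nm for a dense-WDM grid: dL = L**2 * df / c
        c = 3.0e8                # speed of light, m/s
        L = 1550e-9              # assumed center wavelength, m
        df = 100e9               # assumed channel spacing, Hz (ITU 100 GHz grid)
        dl_nm = L**2 * df / c * 1e9
        print(f"~{dl_nm:.2f} nm between adjacent colors")
        # -> ~0.80 nm per channel, so 64 colors span roughly 51 nm of spectrum.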

  • @thesystemera
    @thesystemera 2 years ago

    Hey man..Great hit on this video man.

  • @wkcs007
    @wkcs007 2 years ago +13

    When I see a statement like "10 times more powerful than the high-end Nvidia...", it means better processing power for a cryptomining farm... And with it using a fraction of the electrical power, imagine the crypto profitability...

    • @XantheFIN
      @XantheFIN 2 years ago +2

      Hope crypto and all "currencies" drop all payments for miners. It's a waste of energy on any level.

    • @mineland8220
      @mineland8220 2 years ago +2

      Wouldn't that degrade the profitability of crypto if these processors were used for that? As they became more common, they'd degrade the value of crypto over time.

    • @Onion_Nite
      @Onion_Nite 2 years ago

      Depends on what you mine

    • @synthlordvr
      @synthlordvr 2 years ago +1

      @@mineland8220 Yes. This will compound crypto wealth dramatically. Only those who can afford to buy these at mass scale will take over mining. At least until they become consumer PC integrated.

    • @arirahikkala
      @arirahikkala 2 years ago

      It's an analog device. It might be used to sell you ads, or design nukes, or whatever, but you're not going to be mining crypto on it.

  • @The_Slavstralian
    @The_Slavstralian 2 years ago +12

    My only issue is that this is basically targeted at massive companies, to allow them to cut their costs (energy-use-wise) so they can better serve us the ads we all hate.....
    Thanks for that, mate

    • @joemerino3243
      @joemerino3243 2 years ago +1

      Regular computers also started off as government/corporate monstrosities.

  • @ihorkarpiuk4102
    @ihorkarpiuk4102 2 years ago

    It's so good to learn about such things right at the time they come out, and not way before, having to wait

  • @josephpadula2283
    @josephpadula2283 2 years ago

    In 1973-74 I worked in a scuba shop filling tanks while in 10th grade.
    The scuba club from Stevens Institute of Technology in Hoboken, NJ came to fill tanks and go on group dives.
    One student was always talking about how computers were really going to take off when photonics replaced electronics, according to his prof. The IBM 360 was the standard large computer at that time, and the PDP-7(?) the standard small one.
    I do not remember his name and never knew the prof's name, but I find this YouTube video interesting.

  • @Wilhuf1
    @Wilhuf1 2 years ago +12

    So, a fast matrix calculator integrated with traditional silicon.

  • @Daddelpalm
    @Daddelpalm 3 years ago +3

    Are the networks also trained on the photonic computer, or is it just used to deploy the trained network? Since training the networks takes a lot of time, it would be interesting to know.

    • @johnkoetsier
      @johnkoetsier 3 years ago +4

      I believe they are trained on the photonic computer, but I'm not certain.

    • @falconux7006
      @falconux7006 2 years ago +2

      No, he said they are focusing on inference, so the neural net is already trained.

  • @Chevifier
    @Chevifier 5 months ago

    I love how we're going full circle but with modern technology taken into account.

  • @ThinkHuman
    @ThinkHuman 2 years ago +2

    Wow.. this is just mindblowing.