There are a few facts from neuroscience that your model can't capture. Time is not discrete, and real neurons are sensitive to dynamics. There are periodic oscillations, phase-sensitive neurons, tonic firing modes with poor time resolution but great intensity resolution, and burst modes with the opposite trade-off. There are modes of inhibition and disinhibition that last for different periods of time. Fractional delays between neuron activations, i.e. delays shorter than a synaptic delay, are commonly used in basic computations, like motion detection in fruit flies. You just can't do that with a single bit of state per neuron that's volatile on the scale of a single discrete step. In particular, phase sensitivity and periodic behavior are suspected to be important to consciousness because they are highly sensitive to the state of consciousness (awake, asleep, under anesthesia, etc.).

You might be able to compute things in such a model, but you won't be able to capture the dynamics or central control of a real brain. I actually don't think we will ever have a truly general model of processing in the brain, because there doesn't appear to be any general principle that applies to every neural computation. Sometimes assemblies tell the story, but other functions are better described with topological principles, analog physical transformations, or Fourier transforms. The brain is the wild west. It was not designed with a plan. The best we can do is account for all of the individual degrees of freedom in our experience, because no single paradigm will describe it.
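The point about sub-synaptic delays can be shown with a minimal sketch (Python/NumPy; the spike times, bin size, and window length are made-up illustration values, not from any real model): two spike trains offset by a fraction of one discrete step become indistinguishable once each neuron keeps only one bit of state per step.

```python
import numpy as np

dt = 1.0  # one discrete step, i.e. one "synaptic delay" (assumed unit)
t_spikes_a = np.array([2.0, 5.0, 8.0])
t_spikes_b = t_spikes_a + 0.3  # fractional offset, smaller than one step

def binarize(spike_times, n_steps=10, dt=1.0):
    """One bit of state per step: did the neuron fire in this time bin?"""
    bins = np.zeros(n_steps, dtype=int)
    bins[(spike_times // dt).astype(int)] = 1
    return bins

# The 0.3-step offset (the kind of fractional delay used in fly motion
# detection) vanishes entirely in the one-bit-per-step encoding:
assert np.array_equal(binarize(t_spikes_a), binarize(t_spikes_b))
```

Any continuous-time readout (e.g. a coincidence detector with a sub-step time constant) could separate these two trains, but the discrete binary encoding cannot.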
What is computation? The best definition I have come up with, one that fits every conceivable example, is this: computation is a sequence of interdependent transformations on objects encoded with information. This covers everything.
Also, if you are inspired to look for analogies that can be worked into models for representing the brain, look into combinatory logic: the simplest language that can compute anything. Furthermore, look into the iota combinator, a relatively recent discovery showing that all combinators can be reduced to just one combinator, which, applied to itself enough times, can simulate any computation. It is one of the most astonishing discoveries in math and computer science. Just one! Something fundamental is happening here.
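A hedged sketch of that claim (Python, using curried lambdas as one possible encoding of combinators): S and K generate all of combinatory logic, and the iota combinator, defined by ι f = f S K, regenerates I, K, and S from nothing but itself.

```python
# Curried encodings of the classic S and K combinators (an assumed
# representation; any curried-function encoding works the same way).
S = lambda x: lambda y: lambda z: x(z)(y(z))
K = lambda x: lambda y: x

# The iota combinator: iota f = f S K
iota = lambda f: f(S)(K)

# Everything below is built from iota alone:
I_ = iota(iota)                      # identity:  (iota iota) x = x
K_ = iota(iota(iota(iota)))          # K rebuilt: iota (iota (iota iota))
S_ = iota(iota(iota(iota(iota))))    # S rebuilt: iota (iota (iota (iota iota)))

assert I_(42) == 42
assert K_('a')('b') == 'a'
assert S_(K)(K)('x') == 'x'          # S K K = I, the classic identity
```

Since S and K suffice to express any computable function, recovering both from iota alone is what makes the single combinator universal.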
Computation is by definition an abstraction from the substrate. You defined it. It means "change" to you. Then you say "weather computes." You literally use it to mean "a thing computes if it changes," which is this close to being a tautology. And absolutely useless.
Right. You know nothing about my intelligence. The fact that I am even watching this type of video might suggest something. That you would question another's intelligence actually speaks volumes about you!
Turn on subtitles. Btw, if this basic system can transcribe the introduction flawlessly but you feel the need to complain publicly, what that says about your intelligence level may mean that this talk is not for you…
Wonderful!
Thanks.
Spelling mistake in description: gerogia tech
It's misspelled in the corner of the video too. 🫣
Awesome!
How can I attend this summer school as an international master's student? Nice content.
On YouTube, it seems, accessible and free content is quite a level advantage.
Where is the link to these papers?
Geez. Guy introducing the talk is barely understandable.
You are so wrong.