/sci/ - Science & Math


Thread archived.



A couple of days ago an Anon posted here wondering whether you could just start throwing inputs at a neural network and it would become smart like a human.

Later in the thread another Anon said it was ridiculous because brains are analog (albeit each neuron uses action potentials to mimic a pseudo-digital system), so OP posited we should just use analog computers. Cue a flood of /sci/entists shouting that he's a brainlet.

But what if OP was on to something? There's an evolutionary reason brains evolved as they did, transferring analog signals as opposed to using organically created NAND gates, and there do exist computers that use analog circuits to solve problems. These types of computers are of course outdated, or at least they were until recently: some novel research has been diving back into analog computation to solve complex problems. Theoretically, you could model a neural network using these computers to produce something more similar to a human brain, and it could thus yield more interesting results.

Electronic circuits have the ability to modulate and amplify themselves under certain conditions, which could model a change in the weights of a neural network.
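A toy numerical sketch of what OP might mean (all names here are made up, and this is not a model of any real device): an analog element whose conductance-like weight drifts toward the applied signal, the way a continuously adapting circuit element could store a changing weight.

```python
# Toy simulation of an analog, conductance-like weight.
# The weight relaxes toward the applied signal at a rate set by
# the time constant `tau` -- a crude stand-in for an adaptive
# analog element, not any real device physics.

def evolve_weight(w, signal, tau=10.0, dt=0.1, steps=100):
    """Relax weight w toward `signal` with time constant `tau`."""
    for _ in range(steps):
        w += (signal - w) * dt / tau
    return w

w = evolve_weight(0.0, signal=1.0)
# After 100 steps the weight has drifted most of the way from 0 toward 1,
# without ever being "written" as a discrete value.
```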

What say you /sci/? Is such a thing plausible?
>>
https://en.m.wikipedia.org/wiki/Physical_neural_network
>>
>>10647243
The idea is popular amongst cranks; I haven't read any papers that give it legitimacy. You can effectively emulate analog on digital anyway.
>>
>>10647243
Brain signals are analog, but the neurons are triggered by spikes from other neurons' activations. In other words, it's actually a digital process: neurons only fire when there's a drastic change in the inputs. Artificial Neural Networks, by contrast, are constantly operating. In cycles, sure, but the frequency is so high that it may as well be continuous.

The problem with neural networks is that they're limited. They're just curve-fitters, and they can't extrapolate. It's amazing how many problems you can solve just with curve-fitting (e.g. image classification), but if you want to create actual intelligence you need way more than that.
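The "curve-fitter" claim can be shown with a toy example (a sketch, not a claim about any particular architecture): a one-hidden-layer network trained by plain gradient descent fits y = x² well on its training interval [-1, 1], but its prediction far outside that interval is way off, because tanh units saturate and the fitted curve just flattens out.

```python
import numpy as np

# Tiny one-hidden-layer network fit by full-batch gradient descent.
# It fits y = x^2 on [-1, 1] but fails to extrapolate to x = 3.
rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 64)[:, None]
y = x ** 2

H = 16  # hidden units
W1 = rng.normal(0, 1, (1, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 1, (H, 1)); b2 = np.zeros(1)

lr = 0.05
for _ in range(3000):
    h = np.tanh(x @ W1 + b1)            # hidden activations
    pred = h @ W2 + b2
    err = pred - y
    # Backpropagation through the two layers (mean-squared-error loss)
    gW2 = h.T @ err / len(x); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)    # tanh derivative
    gW1 = x.T @ dh / len(x); gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

mse = float(np.mean((np.tanh(x @ W1 + b1) @ W2 + b2 - y) ** 2))

def f(xv):
    return (np.tanh(np.array([[xv]]) @ W1 + b1) @ W2 + b2).item()

in_err = abs(f(0.5) - 0.25)   # inside the training range: small
out_err = abs(f(3.0) - 9.0)   # far outside: the fit flattens, error is huge
```

Interpolation is cheap; extrapolation is where the curve-fitting picture breaks down.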
>>
>>10647243
>organically created NAND gates
You literally don't understand what you are talking about.
Stop.
>>
>>10647243

I have a book here about VLSI neural nets and physical neural networks. It's definitely possible and it's being done for sure.
>>
>>10647243
https://en.wikipedia.org/wiki/All-or-none_law
>The all-or-none law is the principle that the strength by which a nerve or muscle fibre responds to a stimulus is independent of the strength of the stimulus. If that stimulus exceeds the threshold potential, the nerve or muscle fibre will give a complete response; otherwise, there is no response.
This:
>>10647291
>it's actually a digital process: neurons only work when there's a drastic change in the inputs.
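The all-or-none behaviour quoted above can be sketched with a minimal leaky integrate-and-fire neuron (an illustrative toy, not a biophysical model): the membrane potential integrates analog input continuously, but the output is binary, since a full spike fires only when the potential crosses threshold, regardless of how far it overshoots.

```python
# Minimal leaky integrate-and-fire neuron: analog integration,
# all-or-none output. A spike is emitted only when the membrane
# potential crosses `threshold`, after which it resets.

def simulate(inputs, threshold=1.0, leak=0.9):
    v, spikes = 0.0, []
    for x in inputs:
        v = v * leak + x          # leaky analog integration
        if v >= threshold:
            spikes.append(1)      # full spike, regardless of overshoot
            v = 0.0               # reset after firing
        else:
            spikes.append(0)
    return spikes

print(simulate([0.05] * 5))                  # [0, 0, 0, 0, 0] -- weak input never fires
print(simulate([0.2, 0.2, 1.5, 0.2, 0.2]))   # [0, 0, 1, 0, 0] -- one spike at the surge
```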
>>
>>10647300
brainlet
>>
>>10647300
You are aware that talking about things you do not understand is the only way to learn, are you not?

Rather than telling them they do not comprehend the situation, respectfully point out what it is they do not comprehend.
>>
>>10647400
This. It's good for onlookers like myself too.
Using knowledge to dab on people by going "haha I know and you don't fuck you for being curious" is pretty bad from the point of view of spreading information.
>>
>>10647300
based. this thread smells of first year
>>
People put too much weight, I think, on the action potential. The real information is in the pattern of efficacy across a neuron's connections. It also seems that the process of threshold activation is far more complicated than we thought. Neurons seem to be geared to fire in response to specific spatiotemporal sequences, and when they're in groups, oscillatory cycles seem to be far more extensive than the physiology of a single neuron in that group would suggest. It seems more complicated than just binary responses.
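The spatiotemporal-sequence point can be illustrated with a toy detector (purely illustrative, not a biophysical model): a unit that fires only when labelled input spikes arrive in a specific order within a short window, rather than whenever total input is large. Order and timing carry the information, not just amplitude.

```python
# Toy spatiotemporal pattern detector: fires only when the labelled
# spikes occur in `pattern` order AND within `window` ms of each other.

def detects(spike_times, pattern=("a", "b", "c"), window=5):
    """True if spikes occur in `pattern` order within `window` ms."""
    times = [spike_times.get(label) for label in pattern]
    if any(t is None for t in times):
        return False                    # a required input never spiked
    ordered = all(t1 < t2 for t1, t2 in zip(times, times[1:]))
    return ordered and (times[-1] - times[0]) <= window

print(detects({"a": 0, "b": 2, "c": 4}))    # True: right order, tight window
print(detects({"a": 4, "b": 2, "c": 0}))    # False: same spikes, wrong order
print(detects({"a": 0, "b": 10, "c": 20}))  # False: right order, too spread out
```

Note that the second and third cases deliver exactly the same total input as the first; only the timing differs.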

>>10647291
>curve fitting
explain
>>
>>10648233
If you don't know that artificial neural networks are curve-fitters, I don't know what you are even doing replying to this thread.
>>
>>10647243
What about quantum neural networks?
>>
>>10648422
Fitting to what curve, though?
>>
>>10648422
Tbh can't all types of intelligent behaviour be described as curve fitting?
>>
>>10647243
Good luck emulating neurotransmitters mechanically.



