Anyone ever play with neural networks?



W1GUH
03-19-2011, 04:09 PM
Wiki article (http://en.wikipedia.org/wiki/Neural_network)

I did some R&D on them in the early 90's and never did get them to work as advertised. But then, resources were limited & I spent most of the time making the software work -- they "couldn't afford" proper software.

I saw a presentation and read many articles about them claiming that the author(s) had developed a super whiz-bang box using them...but not one of them described how they arrived at the topology (number of layers, number of nodes) they were using. Got a strong impression that that was all done heuristically (read -- trial and error).

The one I was working with "seemed" to be going in the proper direction after millions and millions, maybe even billions, of training cycles, but never quite got there. Made some pretty pix, though, when I plotted the results in 3D. Actually looked like something organic and growing.

I put AI in the same category. A promising concept that turns out to be "not much" when put into practice. In both cases, the literature was very general and more of a tutorial on the state of the research than it was on practical usage.

ab1ga
03-19-2011, 06:01 PM
Now that's a blast from the past.

I had a pretty intense interest in both neural networks and fuzzy logic a bit later, but was disappointed by both areas for largely the same reason: analysis paralysis using inappropriate mathematical tools.

It was distressing: the whole point behind neural networks was that the brain could find workable solutions to complex problems using slow processors (nerve cells) in large quantities communicating via pulse frequency modulation. The fascinating and distinguishing characteristic was that high precision, high speed floating point processors were not used, and just as obviously not necessary. Nor did it seem likely that only one solution was possible, i.e. multiple connection configurations should exist which would provide a solution.

But when you looked at the papers, and later the physical coprocessors for PCs that sold for a fortune, they almost all used a precise mathematical model of a transfer function, with precisely specified transfer and feedback coefficients, and treated the data not as statistical quantities, but as purely deterministic. The only person who didn't do that, and the one who came up with the nicest results, was Carver Mead at CalTech when he built his prototype artificial retina.
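The model those papers and coprocessor boards standardized on is easy to sketch: each neuron takes a precise weighted sum of its inputs plus a bias, then pushes it through a deterministic logistic transfer function, all in full floating-point precision. A minimal illustration (generic textbook model, not any particular product's implementation; the weights here are arbitrary):

```python
import math

# Classic deterministic neuron: weighted sum plus bias,
# squashed by the logistic (sigmoid) transfer function.
def neuron(weights, bias, inputs):
    net = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-net))

# Example with arbitrary coefficients: net = 0.5*1.0 - 0.25*2.0 + 0.1 = 0.1
y = neuron([0.5, -0.25], 0.1, [1.0, 2.0])
```

Note there's nothing statistical or pulse-coded about it; every coefficient is specified exactly, which is precisely the contrast with the biological story being drawn above.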

I thought NNs would experience a resurgence when fuzzy logic hit the scene, since it provided an algorithmic approach with the same features that neural nets did for the computational mechanism: multiple solution paths, soft requirements on engine precision, etc. But the second time around, practicing engineers seemed to be almost offended by the term "fuzzy" and developed a mental block against the field. I haven't seen much on FL since then, with the exception of a rice cooker I saw online recently.

Your reference to number of layers and inputs goes straight to the major problem NN research faced, i.e., there seemed to be a lack of a "survey" of architectures, in the sense astronomers use the term. When astronomers get a new toy, they do a survey of the sky, basically a quick look at everything looking for promising targets for further research. Neural network research didn't have that, rather the opposite. When Minsky and Papert wrote their book Perceptrons and showed that such devices (two-layer neural networks) couldn't perform an exclusive-OR function, everyone just assumed they wouldn't work at all, and nothing happened for literally decades. Then someone decided to add a third layer and whoa, Nellie, the race was on again. I don't think anyone went much beyond four layers, although they did introduce some feedback between layers at one time.
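The XOR limitation is easy to see concretely: XOR's outputs aren't linearly separable, so no single threshold unit can compute it, but adding one hidden layer of two units does it exactly. A minimal sketch with hand-picked weights (an illustrative construction, not a trained network):

```python
# A single threshold unit (perceptron): fires if w.x + b > 0.
def unit(weights, bias, inputs):
    net = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if net > 0 else 0

# XOR via one hidden layer: OR and NAND feed an AND unit.
def xor_net(x1, x2):
    h1 = unit([1, 1], -0.5, [x1, x2])    # OR
    h2 = unit([-1, -1], 1.5, [x1, x2])   # NAND
    return unit([1, 1], -1.5, [h1, h2])  # AND of the two

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_net(a, b))
```

No clever training involved; the point is just that the hidden layer buys you a function the flat perceptron provably can't represent, which is what restarted the race.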

I've always wondered whether a statistical approach to analyzing neural networks would produce a useful result, but I didn't have the math chops for it then, and it's not going to happen any more now. Plus, I don't think anyone cares anymore; the increase in speed provided by incremental technology improvements over the years may have made the payoff of a NN-based system too small to be worth the bother.

73,

ad4mg
03-19-2011, 07:06 PM
OK, I'm dumb as a rock, and you guys rule. :dance:

I was just sitting here wondering why nobody uses the 'giggity' smilie anymore when I bumped into this thread ...

ab1ga
03-19-2011, 07:16 PM
OK, I'm dumb as a rock, and you guys rule. :dance:

I was just sitting here wondering why nobody uses the 'giggity' smilie anymore when I bumped into this thread ...

And I'm dumber than two rocks, and wondering what the hell the "giggity" smilie is and why it's used.

ad4mg
03-20-2011, 06:24 AM
And I'm dumber than two rocks, and wondering what the hell the "giggity" smilie is and why it's used.

A. :giggity: <- giggity smilie


B. Don't have a clue ... :lol:

WØTKX
03-20-2011, 09:22 AM
http://www.makhfi.com/tools.htm

I never got to play with NN stuff much... closest I got to it was Smalltalk and a lotta AutoLisp. But I've also studied and read a lot of interesting things about cognition and such... Douglas Hofstadter and a little Marvin Minsky.

Giggity? That's best explained this way:

http://www.youtube.com/watch?v=1AumfgzZBIk


ab1ga
03-20-2011, 11:46 AM
Okay, so I'm dumber than three rocks, but that doesn't mean I'm going to have sex with ANYBODY on this board!

73, but nothing more,

NQ6U
03-20-2011, 01:27 PM
Okay, so I'm dumber than three rocks, but that doesn't mean I'm going to have sex with ANYBODY on this board!

73, but nothing more,

And The Board thanks you for that! ;)

WØTKX
03-20-2011, 02:25 PM
Giggity Goo. http://www.simbrain.net/Screenshots/screenshots_main.html

ki4itv
03-20-2011, 05:20 PM
Okay, so I'm dumber than three rocks, but that doesn't mean I'm going to have sex with ANYBODY on this board!

73, but nothing more,

Uhhmm, didn't bother to read the IOMH EULA, didja?? :neener::rofl: