2007-2008 New Mexico Supercomputing Challenge
Challenge Team Interim Report



    Team Number: 047

    School Name: Cimarron High School

    Area of Science: Computer Science

    Project Title: Neural Networks


A neural network is a computer architecture modeled on the brain's interconnected system of neurons. Most neural networks are simulations run on conventional computers. They imitate the brain's ability to sort out patterns and to learn from trial and error. Neural networks are well suited to pattern recognition, foreign-language translation, process control, medical data interpretation, and parallel processing. Unlike standard programs, which compare values exactly, neural networks can learn to recognize near matches.

Neural networks can compute any computable function. They can do everything a normal digital computer can do, perhaps more, and they are especially useful for classification and mapping problems.
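To make the idea concrete, here is a minimal sketch (our own illustration, not the team's program) of a small feedforward network with one hidden layer, trained by backpropagation on the XOR function, a classic mapping that a single neuron cannot compute. The network size (4 hidden units), learning rate, and epoch count are arbitrary choices for the example.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# 2 inputs -> 4 hidden units -> 1 output; each row carries a trailing bias weight
n_in, n_hid = 2, 4
w1 = [[random.uniform(-1, 1) for _ in range(n_in + 1)] for _ in range(n_hid)]
w2 = [random.uniform(-1, 1) for _ in range(n_hid + 1)]

def forward(x):
    h = [sigmoid(sum(w * v for w, v in zip(row, x + [1.0]))) for row in w1]
    o = sigmoid(sum(w * v for w, v in zip(w2, h + [1.0])))
    return h, o

# XOR training set: output is 1 exactly when the inputs differ
data = [([0.0, 0.0], 0.0), ([0.0, 1.0], 1.0),
        ([1.0, 0.0], 1.0), ([1.0, 1.0], 0.0)]

def train(epochs=5000, lr=0.5):
    for _ in range(epochs):
        for x, t in data:
            h, o = forward(x)
            # backpropagate the squared-error gradient through the sigmoids
            d_o = (o - t) * o * (1 - o)
            d_h = [d_o * w2[j] * h[j] * (1 - h[j]) for j in range(n_hid)]
            for j in range(n_hid):
                w2[j] -= lr * d_o * h[j]
            w2[n_hid] -= lr * d_o          # output bias
            for j in range(n_hid):
                for i in range(n_in):
                    w1[j][i] -= lr * d_h[j] * x[i]
                w1[j][n_in] -= lr * d_h[j]  # hidden bias

loss_before = sum((forward(x)[1] - t) ** 2 for x, t in data)
train()
loss_after = sum((forward(x)[1] - t) ** 2 for x, t in data)
```

Training drives the squared error down from its random-initialization value, which is the trial-and-error learning described above.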

Often the purpose of using a neural network is to generalize. For a network to generalize well, three conditions are necessary. The first is that the inputs to the network contain sufficient information about the subject, so that there exists a mathematical function relating correct outputs to inputs with the desired accuracy. A network cannot learn a nonexistent function.

The second condition is that the function being learned be smooth: a small change in the inputs should produce a small change in the outputs. For continuous inputs and targets, smoothness of the function implies continuity and restricts its behavior over most of the input space. With realistic sample sizes, smoother classification boundaries make good generalization more likely.

The third condition for good generalization is that the training cases be a sufficiently large and representative subset of the possible inputs. This matters because there are two different types of generalization: interpolation and extrapolation. Interpolation applies to cases that are more or less surrounded by nearby training cases; everything else is extrapolation. Interpolation can often be done reliably, while extrapolation is often unreliable, so it is important to have sufficient training data to avoid the need for extrapolation.
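The interpolation/extrapolation distinction can be shown with any fitted model, not just a neural network. In this sketch (our own example), a straight line is fit by least squares to samples of y = x² taken on [0, 1]. Predicting at a point inside the training range incurs a small error, while predicting far outside it fails badly, because nothing in the training data constrains the model out there.

```python
# Training data: y = x^2 sampled on [0, 1]
xs = [i / 10 for i in range(11)]
ys = [x * x for x in xs]

# Ordinary least-squares fit of a straight line y = slope*x + intercept
n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
intercept = mean_y - slope * mean_x

def predict(x):
    return slope * x + intercept

# Interpolation: a query surrounded by training points
err_interp = abs(predict(0.55) - 0.55 ** 2)
# Extrapolation: a query far outside the training range
err_extrap = abs(predict(3.0) - 3.0 ** 2)
```

The interpolation error stays small while the extrapolation error is tens of times larger, which is why the training set must cover the region where the network will be asked to answer.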

We plan to modify an existing neural network program and train it to recognize the different tread patterns on the bottom soles of shoes from different brands. The test will be whether the network can identify which manufacturer an unseen sole pattern comes from. This problem has all the characteristics mentioned above of patterns that neural networks handle well.
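One plausible way to pose such a task (our own hypothetical encoding, not the team's design) is to rasterize each sole pattern into a small binary grid and compare it against known brand prototypes. The sketch below uses a nearest-prototype Hamming-distance matcher, a simpler stand-in for a trained network, to show how a slightly altered pattern can still be assigned to the right manufacturer; the two "brand" patterns are invented.

```python
# Hypothetical: each sole pattern is a 4x4 binary grid, flattened to 16 values.
brand_a = [1,0,1,0, 0,1,0,1, 1,0,1,0, 0,1,0,1]   # invented checkerboard tread
brand_b = [1,1,1,1, 0,0,0,0, 1,1,1,1, 0,0,0,0]   # invented horizontal-bar tread

def hamming(p, q):
    """Number of grid cells where two patterns disagree."""
    return sum(a != b for a, b in zip(p, q))

def classify(pattern, prototypes):
    # Nearest-prototype rule: pick the brand whose stored pattern
    # disagrees with the query in the fewest cells (a "near match").
    return min(prototypes, key=lambda name: hamming(pattern, prototypes[name]))

# An "unseen" sole: brand A's tread with two cells flipped by wear or noise
worn = list(brand_a)
worn[0] ^= 1
worn[5] ^= 1
label = classify(worn, {"A": brand_a, "B": brand_b})
```

A neural network trained on many such grids would, in effect, learn a more flexible version of this decision rule from examples rather than from stored prototypes.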


Team Members

Team Mail

Sponsoring Teachers

Project Advisor(s)

  • Jeffrey Raloff
  • Dean Bernadone
For questions about the Supercomputing Challenge, a 501(c)3 organization, contact us at: consult1516 @ supercomputingchallenge.org

New Mexico Supercomputing Challenge, Inc.
80 Cascabel Street
Los Alamos, New Mexico 87544
(505) 667-2864
