[Catalyst 2017]
training data) to modify the system until the differences between the expected output and the actual output values are minimized.[4] Training algorithms allow neural networks to be constructed with optimal weights, which lets a network make accurate predictions when presented with new data. One such training algorithm is backpropagation. In this design, the algorithm follows the gradient vector down the error surface until a minimum is found.[1] The difficult part of the backpropagation algorithm is choosing the step size. Larger steps can result in faster runtimes but can overstep the solution; smaller steps lead to much slower runtimes but are more likely to find a correct solution.[1]
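The step-size trade-off can be sketched with a minimal gradient-descent loop; the one-dimensional error surface and learning rates below are illustrative choices, not taken from the source:

```python
def gradient_descent(grad, x0, step, iters):
    """Follow the negative gradient from x0 for a fixed number of steps."""
    x = x0
    for _ in range(iters):
        x -= step * grad(x)
    return x

# Error surface E(x) = (x - 3)^2 with its minimum at x = 3; its gradient is 2(x - 3).
grad = lambda x: 2 * (x - 3)

small = gradient_descent(grad, 0.0, step=0.01, iters=50)  # slow: still well short of 3
good = gradient_descent(grad, 0.0, step=0.1, iters=50)    # converges to roughly 3
large = gradient_descent(grad, 0.0, step=1.1, iters=50)   # oversteps and diverges
```

With the small step the error shrinks by only 2% per iteration, while the overly large step multiplies the error by 1.2 each iteration and never settles.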
While feedforward neural network designs like the MLP are common, there are many other neural network designs. These include recurrent neural networks, which allow connections between neurons in the same layer, and self-organizing maps, in which neurons attain weights that retain characteristics of the input. All of these network types also have variations within their specific frameworks.[5] The Hopfield network and Boltzmann machine architectures, for example, both build on the recurrent neural network design.[5] While feedforward neural networks are the most common, each design is uniquely suited to solve specific problems.
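As a concrete illustration of the feedforward design discussed above, here is a minimal sketch of a forward pass through a 2-3-1 multilayer perceptron; the weights are hand-picked for illustration, not trained:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def feedforward(x, layers):
    """Propagate input x through a list of (weights, biases) layers,
    applying a sigmoid activation at every unit."""
    for weights, biases in layers:
        x = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
             for row, b in zip(weights, biases)]
    return x

# A 2-3-1 MLP: two inputs, three hidden neurons, one output neuron.
hidden = ([[0.5, -0.6], [0.1, 0.8], [-0.3, 0.2]], [0.0, 0.1, -0.1])
output = ([[1.0, -1.0, 0.5]], [0.0])
prediction = feedforward([1.0, 0.5], [hidden, output])  # one value in (0, 1)
```

Training would adjust the `hidden` and `output` weights, via backpropagation, until predictions on the training data match the expected outputs.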
DISADVANTAGES
One of the main problems with neural networks is that, for the most part, they have limited ability to identify causal relationships explicitly. Developers feed these networks large swathes of data and let the networks determine independently which input variables are most important.[10] However, it is difficult for a network to indicate to its developers which variables matter most in calculating the outputs. While some techniques exist to analyze the relative importance of each neuron in a neural network, they still do not expose causal relationships between variables as clearly as comparable data analysis methods such as logistic regression.[10]
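One widely used importance-analysis technique (not named in the source) is permutation importance: shuffle a single input variable and measure how much the model's error worsens. A minimal sketch with a toy model that depends on only one of its two inputs:

```python
import random

def permutation_importance(model, X, y, col, trials=30, seed=0):
    """Estimate one input column's importance by shuffling that column
    and averaging how much the model's squared error increases."""
    rng = random.Random(seed)

    def mse(rows):
        return sum((model(r) - t) ** 2 for r, t in zip(rows, y)) / len(y)

    base = mse(X)
    worsening = 0.0
    for _ in range(trials):
        shuffled = [row[:] for row in X]
        vals = [row[col] for row in shuffled]
        rng.shuffle(vals)
        for row, v in zip(shuffled, vals):
            row[col] = v
        worsening += mse(shuffled) - base
    return worsening / trials

# Toy model that truly depends only on the first input variable.
model = lambda row: 2.0 * row[0]
X = [[i * 0.1, random.Random(i).random()] for i in range(20)]
y = [2.0 * row[0] for row in X]
imp0 = permutation_importance(model, X, y, col=0)  # large: shuffling hurts
imp1 = permutation_importance(model, X, y, col=1)  # zero: column is ignored
```

The scores rank the inputs by influence, but, as noted above, they still do not establish a causal relationship the way a regression coefficient does.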
Another problem with neural networks is their tendency to overfit. Overfitting occurs when a data analysis model such as a neural network generates good predictions for the training data but worse ones for testing data.[10] It happens because the model accounts for irregularities and outliers in the training data that may not be present across actual data sets. Developers can mitigate overfitting in neural networks by penalizing large weights and limiting the number of neurons in hidden layers.[10] Reducing the number of neurons in hidden layers reduces overfitting but also limits the network's ability to model more complex, nonlinear relationships.[10]
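Penalizing large weights can be sketched as a gradient step on a loss augmented with a squared-weight (L2) penalty; the learning rate and penalty strength below are illustrative:

```python
def penalized_update(w, grad_loss, lr, lam):
    """One gradient step on loss + lam * sum(w_i**2). The penalty's
    gradient, 2 * lam * w_i, shrinks large weights toward zero."""
    return [wi - lr * (g + 2 * lam * wi) for wi, g in zip(w, grad_loss)]

# With the data-loss gradient at zero, the penalty alone decays the weights:
w = [5.0, -3.0]
for _ in range(100):
    w = penalized_update(w, [0.0, 0.0], lr=0.1, lam=0.5)
# each step multiplies every weight by 0.9, so both end up near zero
```

In practice the penalty competes with the data loss, so weights stay only as large as the training data justifies, which discourages the network from memorizing outliers.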
APPLICATIONS
Artificial neural networks allow for the processing of large amounts of data, making them useful tools in many fields of research. For example, the field of bioinformatics relies heavily on neural network pattern recognition to predict various proteins' secondary structures. One popular algorithm used for this purpose is Position Specific Iterated Basic Local Alignment Search Tool (PSI-BLAST) Secondary Structure Prediction (PSIPRED).[6] This algorithm uses a two-stage structure that consists of two three-layered
feedforward neural networks. The first stage of PSIPRED takes as input a scoring matrix generated by running the PSI-BLAST algorithm on a peptide sequence. PSIPRED then takes 15 positions from the scoring matrix and uses them to output three values representing the probabilities of forming the three protein secondary structures: helix, coil, and strand.[6] These probabilities are then fed into the second-stage neural network along with the 15 positions from the scoring matrix, and this second stage outputs three values representing more accurate probabilities of forming helix, coil, and strand secondary structures.[6]
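A heavily simplified sketch of that two-stage data flow follows. The real PSIPRED networks are far larger and consume a full per-position PSI-BLAST profile; this toy version stands in 15 scalar scores and random weights purely to show how stage two reuses both the window and stage one's three probabilities:

```python
import math
import random

def softmax(zs):
    exps = [math.exp(z) for z in zs]
    total = sum(exps)
    return [e / total for e in exps]

def stage(inputs, weights):
    """One feedforward stage: linear layer + softmax, so the three
    outputs are probabilities for helix, coil, and strand."""
    return softmax([sum(w * x for w, x in zip(row, inputs)) for row in weights])

def two_stage_sketch(window, w1, w2):
    """Stage 1 scores the 15-position window; stage 2 re-scores using
    those probabilities concatenated with the same window."""
    p1 = stage(window, w1)
    return stage(window + p1, w2)

rng = random.Random(0)
window = [rng.gauss(0, 1) for _ in range(15)]  # stand-in for PSI-BLAST scores
w1 = [[rng.gauss(0, 0.1) for _ in range(15)] for _ in range(3)]
w2 = [[rng.gauss(0, 0.1) for _ in range(18)] for _ in range(3)]
helix, coil, strand = two_stage_sketch(window, w1, w2)
```

Whatever the weights, the softmax guarantees the three outputs form a valid probability distribution over the secondary-structure classes.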
Neural networks are used not only to predict protein structures but also to analyze genes associated with the development and progression of cancer. More specifically, researchers and doctors use artificial neural networks to identify the type of cancer associated with certain tumors, an identification that is essential for the correct diagnosis and treatment of each specific cancer.[7] These networks enable researchers to match genomic characteristics from large datasets to specific types of cancer and to predict those types.[7]
In bioinformatic scenarios such as the two examples above, trained artificial neural networks quickly provide high-quality results for prediction tasks.[6] These characteristics are important for bioinformatics projects because the field generally involves large quantities of data that need to be interpreted both effectively and efficiently.[6]
The applications of artificial neural networks are also viable in fields outside the natural sciences, such as finance. These networks can be used to predict subtle trends such as variations in the stock market or when organizations will face bankruptcy.[8,9] Neural networks can provide more accurate predictions more efficiently than other prediction models.[9]
CONCLUSION
Over the past decade, artificial neural networks have become more refined and are being used in a wide variety of fields. They allow researchers to find patterns in the largest of datasets and use those patterns to predict potential outcomes. Artificial neural networks provide a new computational way to learn from and understand diverse assortments of data, allowing for a more accurate and effective grasp of the world.
WORKS CITED
[1] Ayodele, T. New Advances in Machine Learning; Zhang, Y., Ed.; InTech: 2010; pp. 31-36.
[2] Muller, B.; Reinhardt, J.; Strickland, M. T. Neural Networks: An Introduction; Leo van Hemmen, J., Domany, E., Schulten, K., Eds.; Springer-Verlag Berlin Heidelberg: Germany, 1995; pp. 3-7.
[3] Urbas, J. Neural Networks. Salem Press Enc. Sci. 2011, 1-5.
[4] Nielsen, M. Neural Networks and Deep Learning; Determination Press, 2015; Mehrotra, K.; Mohan, C.; Ranka, S. Elements of Artificial Neural Networks; MIT Press: Cambridge, Mass., 1997.
[5] Tu, J. J. of Clinical Epidemiology [Online] 1996, 49, 1225-1231. http://www.sciencedirect.com/science/article/pii/S0895435696000029?via%3Dihub.
[6] Chen, K.; Kurgan, L. Handbook of Natural Computing; Rozenberg, G., Nok, J., Back, T., Eds.; Springer-Verlag Berlin Heidelberg: Germany, 2012; 4, pp. 565-584.
[7] Oustimov, A.; Vu, V. Artificial Neural Networks in the Cancer Genomics Frontier. Trans. Cancer Research [Online] 2014, 3. http://tcr.amegroups.com/article/view/2647/html.
[8] Pal, N.; Aguan, K.; Sharma, A.; Amari, S. BMC Bioinformatics [Online] 2007, 8. http://bmcbioinformatics.biomedcentral.com/articles/10.1186/1471-2105-8-5.
[9] Ma, J.; Huang, S.; Kwok, S. An Enhanced Artificial Neural Network for Stock Predictions. Int. J. of Bus. and Econ. Development [Online] 2016, 4, 28-32. http://www.ijbed.org/admin/content/pdf/i-12_c-122.pdf.
[10] Mansouri, A.; Nazari, A.; Ramazani, M. Int. J. of Adv. Computer Research [Online] 2016, 6, 81-92. https://papers.ssrn.com/sol3/papers2.cfm?abstract_id=2787308.
DESIGN BY Jessica Lee
EDITED BY Jacob Mattia