M.Sc. IT Semester III ARTIFICIAL NEURAL NETWORKS (2014 – 2015)

Chapter 1 to 5
1. Explain the working and architecture of biological neurons.
2. Explain the evolution of artificial neural networks.
3. Explain the phenomenological properties of the human brain.
4. Discuss in brief the artificial intelligence and neural network approaches.
5. What is propagation of an action potential? Also explain the sequence of events in propagation.
6. Explain the process of synaptic transmission.
7. Explain the different signal functions employed in the hidden layer.
8. Explain the architecture of an artificial neural network.
9. Explain the different components of a neural network.
10. Explain the single-layer perceptron and the multilayer perceptron.
11. Give a comparison between feedforward and recurrent neural networks.
12. State and explain the properties and application domains of neural networks.
13. What is learning? Explain the process of learning with an algorithmic approach.
14. Explain supervised and unsupervised learning.
15. Write a note on recognition.
16. Explain the process of recognition.
17. Explain how the choice of Boolean function affects linear separability.
18. Give an example for the AND function.
19. Explain the implementation of linear separability.
20. Show that XOR is not linearly separable.
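For the linear-separability questions above, the following brute-force sketch (not from the syllabus; the weight grid and step size are arbitrary choices) contrasts AND, which a single threshold unit can realize, with XOR, which it cannot:

```python
import itertools

def separable(labels):
    """Brute-force search: does any threshold unit w1*x1 + w2*x2 + b > 0
    reproduce `labels` on the four binary input points?"""
    points = [(0, 0), (0, 1), (1, 0), (1, 1)]
    grid = [v / 2 for v in range(-8, 9)]  # weights/bias from -4.0 to 4.0 in steps of 0.5
    for w1, w2, b in itertools.product(grid, repeat=3):
        outputs = [1 if w1 * x1 + w2 * x2 + b > 0 else 0 for x1, x2 in points]
        if outputs == labels:
            return True
    return False

print(separable([0, 0, 0, 1]))  # AND: True, e.g. w1 = w2 = 1, b = -1.5
print(separable([0, 1, 1, 0]))  # XOR: False, no line in the grid works
```

The grid search is only a demonstration, not a proof; the formal argument (convexity of the half-planes a threshold unit defines) is what the exam question asks for.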
21. Explain multilayer networks.
22. Show that XOR classification is not equivalent to simple modulo-2 addition. (Hint: by condition.)
23. Solve the logical OR problem geometrically to find appropriate weights and thresholds for the neuron.
24. Discuss in brief the TLN (threshold logic neuron) network architecture.
25. How does a single-neuron design for the NAND function differ from that for the AND function?
26. Give the mathematical definition of a Boolean function.
27. Explain in brief convex sets, convex hulls, and linear separability.
28. Discuss the theorem that no single threshold logic neuron can solve the logical XOR classification problem x1 ⊕ x2.
29. Explain in brief multilayer networks.
30. Explain the theorem that no more than three layers of binary threshold logic feedforward networks are required to form arbitrarily complex decision regions.
31. Derive the equation for M(h, n) when a) n = 1, b) h <= n (Mirchandani's theorem).
32. Explain in brief the error-correction and gradient descent rules.
33. Explain the learning objective for the threshold logic neuron.
34. Explain the perceptron learning theorem.
35. Explain the perceptron convergence theorem.
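A minimal sketch of the perceptron learning rule (illustrative only; the AND data set, learning rate, and epoch limit are assumptions, not from the syllabus). Because AND is linearly separable, the convergence theorem guarantees the loop terminates:

```python
def perceptron_train(samples, epochs=20, lr=1.0):
    """Perceptron rule: w <- w + lr * (d - y) * x, with the bias folded in as x0 = 1."""
    w = [0.0, 0.0, 0.0]
    for _ in range(epochs):
        updated = False
        for (x1, x2), d in samples:
            x = (1.0, x1, x2)
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0
            if y != d:
                w = [wi + lr * (d - y) * xi for wi, xi in zip(w, x)]
                updated = True
        if not updated:  # no errors in a full pass: converged
            break
    return w

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = perceptron_train(AND)

def predict(x1, x2):
    return 1 if w[0] + w[1] * x1 + w[2] * x2 > 0 else 0

print([predict(x1, x2) for (x1, x2), _ in AND])  # [0, 0, 0, 1]
```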
36. Explain in brief the pocket algorithm.
37. Explain how α-LMS works with normalized training patterns.
38. Discuss the quasi-Newton approach to finding a solution.
39. Discuss the steepest descent search.
40. Explain in brief the squared-error performance function.
41. Give the derivation of the backpropagation algorithm.
42. Explain in brief about regularization.
43. Discuss in brief reinforcement learning networks.
44. Comment on the following:
- How long should a network be trained?
- How many hidden layers are necessary to approximate a continuous function?
45. Find the threshold logic neuron network that can correctly solve the two-class decision problem shown in the given diagram, whose region is bounded by the lines y = x, y = -x + 3, and y = 1.
46. Give an operational summary of the perceptron algorithm.
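For the α-LMS question above, a short sketch of the normalized update w ← w + α(d − w·x)x/‖x‖² (the data set, learning rate, and epoch count are invented for illustration; the bias term is omitted for brevity):

```python
def alpha_lms(samples, alpha=0.5, epochs=100):
    """alpha-LMS: w <- w + alpha * (d - w.x) * x / ||x||^2.
    Dividing by the squared pattern norm makes the step size
    insensitive to the scale of the training pattern."""
    w = [0.0] * len(samples[0][0])
    for _ in range(epochs):
        for x, d in samples:
            err = d - sum(wi * xi for wi, xi in zip(w, x))
            norm2 = sum(xi * xi for xi in x)
            w = [wi + alpha * err * xi / norm2 for wi, xi in zip(w, x)]
    return w

# fit the linear map d = 2*x1 - x2 (a made-up consistent target)
data = [((1.0, 0.0), 2.0), ((0.0, 1.0), -1.0), ((1.0, 1.0), 1.0), ((2.0, 1.0), 3.0)]
w = alpha_lms(data)
print(round(w[0], 2), round(w[1], 2))  # converges toward 2.0 and -1.0
```

For 0 < α < 2 each update reduces the error on the current pattern, which is the property the exam answer should emphasize.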
Chapter 6
47. Explain the multilayered architecture in brief.
48. Explain in brief the backpropagation learning algorithm. (Ans.: Table 6.2)
49. Give the outline of the learning procedure for the backpropagation algorithm.
50. Derive the backpropagation algorithm mathematically. (page 172)
51. Discuss the error criteria for termination of learning. (page 191)
52. Explain the process of inference and learning with an algorithmic approach.
53. Write a note on: i. Weights ii. Re-sampling and early stopping iii. Cross-validation
54. What is cross-validation? Explain its role in brief.
55. Explain in brief the cascade correlation algorithm.
56. Give the outline of the universal approximation theorem.
57. What is data preprocessing? Discuss the issues involved in data preprocessing.
58. Explain in brief the error-correction and gradient descent rules.
59. Explain the learning objective for the threshold logic neuron.
60. Explain the perceptron learning theorem.
61. Explain the perceptron convergence theorem.
62. What is reinforcement learning? Give its applications.
63. Comment: "How long should a network be trained?"
64. Comment: "How many hidden layers are necessary to approximate a continuous function?"
65. Give the derivation of the backpropagation algorithm.
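A useful check when studying the backpropagation derivation is to compare the chain-rule gradients against finite differences. The sketch below does this for a 2-2-1 sigmoid network with squared-error loss (the weight values and input are arbitrary; this is a study aid, not the textbook's derivation):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# 2-2-1 sigmoid network, weights flattened into one list:
# w[0:3] = hidden unit 1 (bias, x1, x2), w[3:6] = hidden unit 2, w[6:9] = output unit
def loss(w, x1, x2, d):
    h1 = sigmoid(w[0] + w[1] * x1 + w[2] * x2)
    h2 = sigmoid(w[3] + w[4] * x1 + w[5] * x2)
    y = sigmoid(w[6] + w[7] * h1 + w[8] * h2)
    return 0.5 * (d - y) ** 2

def backprop_grad(w, x1, x2, d):
    h1 = sigmoid(w[0] + w[1] * x1 + w[2] * x2)
    h2 = sigmoid(w[3] + w[4] * x1 + w[5] * x2)
    y = sigmoid(w[6] + w[7] * h1 + w[8] * h2)
    do = (y - d) * y * (1 - y)          # output-layer delta
    d1 = do * w[7] * h1 * (1 - h1)      # hidden deltas via the chain rule
    d2 = do * w[8] * h2 * (1 - h2)
    return [d1, d1 * x1, d1 * x2, d2, d2 * x1, d2 * x2, do, do * h1, do * h2]

def numeric_grad(w, x1, x2, d, eps=1e-6):
    g = []
    for i in range(len(w)):
        wp, wm = w[:], w[:]
        wp[i] += eps
        wm[i] -= eps
        g.append((loss(wp, x1, x2, d) - loss(wm, x1, x2, d)) / (2 * eps))
    return g

w = [0.1, -0.2, 0.3, 0.4, -0.5, 0.6, -0.1, 0.2, -0.3]
bp = backprop_grad(w, 1.0, 0.0, 1.0)
num = numeric_grad(w, 1.0, 0.0, 1.0)
print(max(abs(a - b) for a, b in zip(bp, num)) < 1e-6)  # True
```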
Chapter 7
66. What is Bayes' theorem? Discuss joint and conditional probability. (Page 216)
67. What is posterior probability? Derive the equation for the posterior probability. (page 217)
68. Discuss Bayes' theorem for continuous variables.
69. Consider the 2-D Gaussian distribution with mean
    X = (1, 0)^T
    and covariance matrix
    K = | 1    0.5 |
        | 0.5  1   |
    (Page 218)
    a. Compute the Mahalanobis distance for the point (0.5, 0.5).
    b. What is the probability density at this point?
70. Show the interpretation of neuron signals using Gaussian distributed data. (page 230)
71. Show the interpretation of neuron signals using Bernoulli distributed data. (page 230)
72. Discuss in brief network signals and class posteriors. (page 239)
73. Discuss the cross entropy for a two-class problem.
74. Discuss the cross entropy for a C-class problem.
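The 2-D Gaussian exercise above can be checked numerically. The sketch below evaluates the squared Mahalanobis distance (x − μ)ᵀK⁻¹(x − μ) and the 2-D Gaussian density exp(−m²/2)/(2π√|K|) for the mean and covariance stated in the question:

```python
import math

mu = (1.0, 0.0)
K = ((1.0, 0.5), (0.5, 1.0))
x = (0.5, 0.5)

det = K[0][0] * K[1][1] - K[0][1] * K[1][0]  # |K| = 0.75
Kinv = ((K[1][1] / det, -K[0][1] / det),     # 2x2 matrix inverse
        (-K[1][0] / det, K[0][0] / det))

d = (x[0] - mu[0], x[1] - mu[1])             # (-0.5, 0.5)
# squared Mahalanobis distance: d^T K^{-1} d
m2 = (d[0] * (Kinv[0][0] * d[0] + Kinv[0][1] * d[1]) +
      d[1] * (Kinv[1][0] * d[0] + Kinv[1][1] * d[1]))
# 2-D Gaussian density at x
p = math.exp(-m2 / 2) / (2 * math.pi * math.sqrt(det))

print(round(m2, 6))  # 1.0, so the Mahalanobis distance is 1
print(round(p, 4))   # 0.1115
```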
Chapter 8
75. Explain the analogy of the SVM in brief.
76. Explain the working of the SVM for 2 vectors.
77. Explain the working of the SVM for 3 vectors.
78. Explain the working of the SVM for 4 vectors.
79. Give the outline of the SVM in brief.
80. Give a comparison between the backpropagation learning algorithm and the SVM.
81. Discuss in brief Vapnik's key theorem. (page 255)
82. Discuss SVMs for regression. (page 281)
83. Explain in brief radial basis function networks.
84. Explain how the XOR problem is solved using the RBFN classifier. (Page 296)
85. Derive the Euler-Lagrange equations in brief. (page 300)
86. Give an outline summary of the k-means clustering algorithm. (Table 8.5)
87. Discuss in brief the hybrid learning algorithm. (Page 312)
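For the k-means summary above, a compact sketch of the two-step loop (assign each point to its nearest centre, then move each centre to its cluster mean). The data set and seed are invented for illustration:

```python
import random

def kmeans(points, k, iters=100, seed=1):
    rng = random.Random(seed)
    centers = rng.sample(points, k)  # initialize centres at k distinct data points
    for _ in range(iters):
        # assignment step: nearest centre by squared Euclidean distance
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        # update step: move each centre to the mean of its cluster
        new = [tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else centers[i]
               for i, cl in enumerate(clusters)]
        if new == centers:  # assignments stable: converged
            break
        centers = new
    return sorted(centers)

data = [(0.0, 0.0), (0.1, 0.2), (0.2, 0.1), (5.0, 5.0), (5.1, 4.9), (4.9, 5.1)]
print(kmeans(data, 2))  # two centres, near (0.1, 0.1) and (5.0, 5.0)
```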
Chapter 9
88. Discuss in brief about state equations. (page 328)
89. Explain in brief attractors and stability. (Page 329)
90. Explain in short autonomous linear systems and matrix operators. (page 329)
91. Explain the behavior of a nonlinear system by linearization. (page 335)
92. Discuss the analysis of a nonlinear differential system through linearization. (page 337)
93. Write a note on additive neuron dynamics. (page 344)
94. Derive the quadratic forms for Lyapunov functions. (page 342)
95. Discuss the Cohen-Grossberg theorem. (page 347)

Chapter 10
96. Explain in brief about associative learning.
97. Discuss the associative memory model in brief. (page 355)
98. Write a note on the Hebb association matrix. (Page 356)
99. Explain in brief linear associative memory. (Page 357)
100. Discuss the Hopfield autoassociative memory architecture. (Page 360)
101. Explain in brief stability analysis in continuous time. (page 363)
102. Explain in brief the Hopfield CAM algorithm. (Table 10.1)
103. What is a spurious attractor? Discuss the different classes of these attractors.
104. Write a note on simulated annealing.
105. Give the outline of the Boltzmann machine relaxation procedure.
106. Give the outline of the simulated annealing algorithm.
107. Exercise question 10.8.
108. Exercise question 10.9.
109. Exercise question 10.11.
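A minimal simulated-annealing sketch for the questions above (the energy function, cooling schedule, and all parameters are invented toy choices, not the textbook's): downhill moves are always accepted, uphill moves with probability exp(−ΔE/T), and the temperature is lowered geometrically.

```python
import math
import random

def simulated_annealing(f, x0, T0=10.0, cooling=0.95, steps=50, T_min=1e-3, seed=42):
    """Metropolis/annealing loop over a 1-D state with uniform proposals."""
    rng = random.Random(seed)
    x, T, best = x0, T0, x0
    while T > T_min:
        for _ in range(steps):
            cand = x + rng.uniform(-1.0, 1.0)
            dE = f(cand) - f(x)
            # accept downhill moves always, uphill with probability exp(-dE/T)
            if dE < 0 or rng.random() < math.exp(-dE / T):
                x = cand
                if f(x) < f(best):
                    best = x
        T *= cooling  # geometric cooling schedule
    return best

def energy(x):  # toy energy with its minimum at x = 2 (assumed for illustration)
    return (x - 2.0) ** 2

best = simulated_annealing(energy, x0=-5.0)
print(abs(best - 2.0) < 0.1)  # True: the walk settles near the minimum
```

On a convex energy like this one annealing is overkill; its real value, which the exam answer should stress, is the nonzero probability of escaping local minima at high temperature.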
Chapter 11
110. Discuss why additive activation models saturate while multiplicative models do not.
111. Explain the following concepts: a. Competition b. Resonance
112. Carefully analyze the STM and LTM dynamics of instars and outstars and bring out their operational differences.
113. How do ART systems solve the stability-plasticity dilemma?
114. Explain in brief the Weber law rule.
115. Explain how the choice of signal function affects the global asymptotic behavior of such networks.
116. How do bottom-up LTM weights ensure direct access to learned codes?
117. Write a note on the philosophy of adaptive resonance.
118. Explain the role of vigilance in the context of learning and recognition in ART systems.
Chapter 12
119. Explain in brief Oja's rule.
120. Explain in brief the generalized learning laws.
121. Explain the principle of competitive learning. (page 487)
122. Explain in brief vector quantization. (page 487)
123. Discuss a generalized competitive learning law.
124. Give the outline of the AVQ algorithm. (Table 12.3)
125. Write a note on supervised vector quantization. (page 494)
126. Write a note on the Mexican hat neural network. (page 496)
127. Give a general overview of the self-organizing feature map. (page 500)
128. Discuss in brief competitive Hebbian learning.
129. Explain the steps involved in the GNG algorithm.
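For the competitive learning and vector quantization questions above, a winner-take-all sketch (data, learning rate, and seed are invented): the prototype nearest to each input moves a fraction η toward it, so each prototype settles near the centroid of one cluster.

```python
import random

def competitive_vq(data, n_units=2, eta=0.1, epochs=200, seed=3):
    """Winner-take-all competitive learning: only the nearest prototype
    (the 'winner') is updated, w_win <- w_win + eta * (x - w_win)."""
    rng = random.Random(seed)
    protos = [list(p) for p in rng.sample(data, n_units)]
    for _ in range(epochs):
        for x in data:
            win = min(range(n_units),
                      key=lambda i: sum((a - b) ** 2 for a, b in zip(x, protos[i])))
            for j in range(len(x)):
                protos[win][j] += eta * (x[j] - protos[win][j])
    return sorted(tuple(p) for p in protos)

data = [(0.0, 0.0), (0.2, 0.1), (0.1, 0.2), (4.0, 4.0), (4.2, 3.9), (3.9, 4.1)]
protos = competitive_vq(data)
print(protos)  # one prototype near each cluster centre
```

The SOFM question differs mainly in that neighbours of the winner are also updated, which this sketch deliberately omits.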
Chapter 13
130. What is fuzziness? Discuss fuzzy uncertainty and the linguistic variable.
131. Discuss in brief membership functions.
132. Discuss in brief: a) fuzzy intersection b) fuzzy union c) fuzzy complementation.
133. Explain fuzzy rules for approximate reasoning.
134. Explain in brief about defuzzification.
135. Define fuzzy sets on input and output UODs (universes of discourse).
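The standard (min/max/complement) fuzzy set operators and centroid defuzzification asked about above can be sketched on a small discrete universe (all membership values here are invented toy numbers):

```python
# Standard (Zadeh) fuzzy set operators over a small discrete universe
A = {"x1": 0.2, "x2": 0.7, "x3": 1.0}
B = {"x1": 0.5, "x2": 0.4, "x3": 0.0}

union        = {u: max(A[u], B[u]) for u in A}        # fuzzy union: max
intersection = {u: min(A[u], B[u]) for u in A}        # fuzzy intersection: min
complement_A = {u: round(1.0 - A[u], 10) for u in A}  # fuzzy complement: 1 - membership

print(union)         # {'x1': 0.5, 'x2': 0.7, 'x3': 1.0}
print(intersection)  # {'x1': 0.2, 'x2': 0.4, 'x3': 0.0}

# Centroid defuzzification of a discrete output set (values -> membership):
out = {0.0: 0.1, 1.0: 0.6, 2.0: 0.3}
centroid = sum(v * m for v, m in out.items()) / sum(out.values())
print(round(centroid, 6))  # 1.2
```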
Chapter 14
136. Explain in brief about evolutionary algorithms.
137. Define evolutionary algorithms.
138. Discuss the streams of evolutionary algorithms.
139. Discuss the different operator selection mechanisms.
140. Explain the working of the genetic algorithm.
141. Discuss in brief crossover and mutation.
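The genetic algorithm, selection, crossover, and mutation questions above can all be illustrated with a small bitstring GA on the "one-max" toy objective (population size, rates, and seed are arbitrary study-aid choices):

```python
import random

rng = random.Random(7)
N_BITS, POP, GENS = 10, 30, 60

def fitness(bits):
    return sum(bits)  # "one-max": count the 1s, optimum is the all-ones string

def tournament(pop):
    a, b = rng.choice(pop), rng.choice(pop)  # binary tournament selection
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    cut = rng.randint(1, N_BITS - 1)  # single-point crossover
    return p1[:cut] + p2[cut:]

def mutate(bits, rate=0.02):
    return [1 - b if rng.random() < rate else b for b in bits]

pop = [[rng.randint(0, 1) for _ in range(N_BITS)] for _ in range(POP)]
for _ in range(GENS):
    pop = [mutate(crossover(tournament(pop), tournament(pop))) for _ in range(POP)]

best = max(fitness(ind) for ind in pop)
print(best)  # close to the all-ones optimum of 10
```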